
Top 100 Research Methodology Project Topics


Research methodology might sound like a fancy term, but it’s simply the way researchers go about investigating a question or problem. Think of it as a roadmap for your project, guiding you through the steps to find answers. It’s crucial to pick the right methodology because it determines how you collect and analyze data, which affects the reliability of your findings. So, let’s look at 100 research methodology project topics below.

Types of Research Methodologies


There are mainly three types of research methodologies: quantitative, qualitative, and mixed-methods.

Quantitative Research Methodology

Quantitative research focuses on collecting numerical data and analyzing it statistically. It’s great for measuring things objectively.

For instance, if you’re studying how many people prefer coffee over tea, quantitative research can provide concrete numbers.

Qualitative Research Methodology

Qualitative research, on the other hand, dives deep into understanding people’s experiences, feelings, and behaviors. It’s like peeling an onion layer by layer to reveal the underlying emotions and motivations.

For example, if you want to explore why some students struggle with math, qualitative research can uncover personal stories and perspectives.

Mixed-Methods Research

Sometimes, researchers use a combination of quantitative and qualitative methods, known as mixed-methods research.

This approach offers a more comprehensive understanding of a topic by blending numerical data with rich narratives. It’s like having the best of both worlds.

Factors Influencing Choice of Research Methodology

Several factors influence the choice of research methodology:

  • Nature of the research question: Is it about measuring something objectively or understanding complex human behaviors?
  • Availability of resources: Do you have access to the tools and expertise needed for a particular methodology?
  • Time constraints: How much time do you have to conduct the research?
  • Ethical considerations: Are there any ethical concerns related to your research methods?

Steps Involved in Research Methodology for Project Topics

Regardless of the chosen methodology, research typically follows these steps:

  • Problem Definition: Clearly define the research question or problem you want to address.
  • Literature Review: Explore existing research and theories related to your topic to build a solid foundation.
  • Selection of Research Design: Choose the appropriate methodology based on your research question and objectives.
  • Data Collection: Gather relevant data using surveys, interviews, observations, or experiments.
  • Data Analysis: Analyze the collected data using statistical tools (for quantitative research) or thematic analysis (for qualitative research); see the short example after this list.
  • Interpretation of Results: Draw conclusions based on your analysis and discuss their implications.
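To make the Data Analysis step above more concrete, here is a minimal sketch of a quantitative analysis in Python. The file name, column names and the choice of a Welch t-test are illustrative assumptions, not a prescription for any particular project.

```python
# Minimal quantitative analysis sketch.
# Assumptions: a CSV of survey responses with a 'group' column (e.g. "A"/"B")
# and a numeric 'score' column.
import pandas as pd
from scipy import stats

df = pd.read_csv("survey_responses.csv")  # hypothetical data file

# Descriptive statistics per group (counts, means, standard deviations)
print(df.groupby("group")["score"].describe())

# Compare the two groups with an independent-samples (Welch) t-test
a = df.loc[df["group"] == "A", "score"]
b = df.loc[df["group"] == "B", "score"]
t_stat, p_value = stats.ttest_ind(a, b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```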

Best Practices in Research Methodology for Project Topics

To ensure the quality and integrity of your research, follow these best practices:

  • Ensuring validity and reliability of data: Use reliable measurement tools and sampling techniques to minimize errors.
  • Ethical considerations in research: Obtain informed consent from participants, protect their privacy, and avoid any form of deception.
  • Proper documentation and citation: Keep detailed records of your research process and cite all sources properly to avoid plagiarism.
  • Peer review and feedback: Seek feedback from peers and experts in your field to improve the quality of your research.

100 Research Methodology Project Topics

  • The impact of online surveys on response rates and data quality.
  • Comparing the effectiveness of focus groups and individual interviews in marketing research.
  • Analyzing the ethical considerations of using social media data for research.
  • Exploring the potential of big data analytics in social science research.
  • Evaluating the reliability and validity of mixed-methods research approaches.
  • Examining the role of cultural sensitivity in international research projects.
  • Investigating the challenges and opportunities of conducting research in conflict zones.
  • Analyzing the effectiveness of different strategies for recruiting research participants.
  • Exploring the use of action research methodologies in addressing real-world problems.
  • Evaluating the impact of researcher bias on the research process and outcomes.
  • Investigating the potential of citizen science for collecting and analyzing data.
  • Exploring the use of virtual reality in conducting research studies.
  • Analyzing the ethical considerations of conducting research with vulnerable populations.
  • Evaluating the effectiveness of different strategies for disseminating research findings.
  • Examining the role of storytelling in qualitative research.
  • Investigating the use of visual methods in research, such as photography and video.
  • Analyzing the challenges and opportunities of conducting longitudinal research studies.
  • Exploring the use of case studies in research projects.
  • Evaluating the effectiveness of different strategies for coding and analyzing qualitative data.
  • Examining the role of theory in research design and analysis.
  • Investigating the use of discourse analysis methodologies in research.
  • Analyzing the strengths and limitations of quantitative research methods.
  • Exploring the use of experimental research designs in social science research.
  • Evaluating the effectiveness of different sampling techniques in research.
  • Examining the role of research ethics committees in ensuring the ethical conduct of research.
  • Investigating the challenges and opportunities of conducting research online.
  • Analyzing the impact of social media on public perceptions of research.
  • Exploring the use of gamification in research to increase participant engagement.
  • Evaluating the effectiveness of different strategies for data visualization.
  • Examining the role of open access in making research findings available to a wider audience.
  • Investigating the challenges and opportunities of interdisciplinary research collaborations.
  • Analyzing the impact of political and economic factors on research funding.
  • Exploring the use of participatory action research methodologies to empower communities.
  • Evaluating the effectiveness of different strategies for knowledge mobilization.
  • Examining the role of research in informing policy and practice.
  • Investigating the use of artificial intelligence in research methodologies.
  • Analyzing the ethical considerations of using facial recognition technology in research.
  • Exploring the potential of blockchain technology to improve data security and transparency in research.
  • Evaluating the effectiveness of different strategies for engaging with stakeholders in research projects.
  • Examining the role of reflexivity in qualitative research.
  • Investigating the use of narrative inquiry methodologies in research.
  • Analyzing the strengths and limitations of case studies as a research method.
  • Exploring the use of secondary data analysis in research projects.
  • Evaluating the effectiveness of different strategies for managing and storing research data.
  • Examining the role of research assistants in the research process.
  • Investigating the challenges and opportunities of conducting research in developing countries.
  • Analyzing the impact of climate change on research methodologies.
  • Exploring the use of citizen science for environmental monitoring.
  • Evaluating the effectiveness of different strategies for conducting research with indigenous communities.
  • Examining the role of research in promoting social justice.
  • Investigating the historical development of research methodologies.
  • Analyzing the impact of technological advancements on research practices.
  • Exploring the use of mixed methods research approaches in different disciplines.
  • Evaluating the effectiveness of different strategies for managing research projects.
  • Examining the role of research funders in shaping research agendas.
  • Investigating the challenges and opportunities of conducting research across different cultures.
  • Analyzing the impact of language barriers on research communication.
  • Exploring the use of collaborative online platforms for conducting research.
  • Evaluating the effectiveness of different strategies for promoting research skills development.
  • Examining the role of research misconduct in undermining public trust in research.
  • Investigating the challenges and opportunities of conducting research with children.
  • Analyzing the impact of research on mental health and well-being.
  • Exploring the use of arts-based research methodologies.
  • Evaluating the effectiveness of different strategies for recruiting and retaining research participants.
  • Examining the role of research networks in supporting researchers.
  • Investigating the challenges and opportunities of conducting research in the private sector.
  • Exploring the use of open science practices to promote research transparency and reproducibility.
  • Evaluating the effectiveness of different strategies for mentoring and supporting early-career researchers.
  • Examining the role of research misconduct in retracting scientific articles.
  • Investigating the challenges and opportunities of data sharing in research.
  • Analyzing the impact of open data initiatives on scientific progress.
  • Exploring the use of crowdsourcing in research to gather data and solve problems.
  • Evaluating the effectiveness of different strategies for promoting research impact.
  • Examining the role of alternative research metrics in evaluating the quality of research.
  • Investigating the use of bibliometrics to analyze research trends and identify emerging areas.
  • Analyzing the impact of research on public policy and decision-making.
  • Exploring the use of participatory research methodologies to empower communities.
  • Evaluating the effectiveness of different strategies for communicating research findings to the public.
  • Examining the role of social media in disseminating research findings.
  • Analyzing the impact of humanitarian aid on research practices in developing countries.
  • Exploring the use of research methodologies to address global challenges, such as climate change and poverty.
  • Evaluating the effectiveness of different strategies for building research capacity in developing countries.
  • Examining the role of international research collaborations in promoting global research excellence.
  • Investigating the challenges and opportunities of conducting research in the field of artificial intelligence.
  • Analyzing the ethical considerations of using autonomous robots in research.
  • Exploring the potential of artificial intelligence to automate research tasks.
  • Evaluating the effectiveness of different strategies for mitigating the risks of bias in artificial intelligence-powered research.
  • Examining the role of research in shaping the future of work.
  • Investigating the impact of automation on research jobs.
  • Exploring the use of new technologies to improve research efficiency and productivity.
  • Evaluating the effectiveness of different strategies for developing transferable skills for researchers.
  • Examining the role of lifelong learning in maintaining research expertise.
  • Investigating the impact of research funding cuts on research quality and innovation.
  • Exploring the use of alternative funding models, such as crowdfunding and philanthropy, to support research.
  • Evaluating the effectiveness of different strategies for advocating for increased research funding.
  • Examining the role of research universities in driving innovation and economic growth.
  • Investigating the impact of research on social and cultural change.
  • Exploring the future of research methodologies in an ever-changing world.

Examples of Research Methodology Project Topics

Here are some examples of project topics suited for different research methodologies:

Quantitative Research Topics

  • The impact of social media usage on mental health among teenagers.
  • Factors influencing customer satisfaction in the hospitality industry.

Qualitative Research Topics

  • Exploring the experiences of first-generation college students.
  • Understanding the challenges faced by small business owners during the COVID-19 pandemic.

Mixed-Methods Research Topics

  • Assessing the effectiveness of a school bullying prevention program.
  • Investigating the relationship between exercise habits and stress levels among working adults.

Research methodology is like a compass that guides you through the journey of inquiry. By understanding the different types of methodologies, the factors influencing their choice, and best practices, you can embark on your research project with confidence.

Remember, the key to successful research lies in asking the right questions and choosing the appropriate methodology to find the answers.


Seminar Topics on Research Methodology

Webinar on Research Methodology: A Guide For Beginners

10 February 2023, 17:00-18:00, Chennai, India

  • Dr Supaprawat Siripipatthanakul, Malaysia
  • Quvae Research and Publications, India (http://www.quvae.com/upcoming-webinar)
  • Address: Shanthi Colony, Anna Nagar, Chennai, 600040, India


Doctoral Seminar in Research Methods I (MIT OpenCourseWare)

  • Instructors: Prof. Jesper B. Sorensen and Prof. Lotte Bailyn
  • Department: Sloan School of Management
  • As taught in: Social Science


Free Webinar

Research Methodology 101

Get started with the basics of research methodology. In this free webinar, you’ll learn:

  • What research methodology is (and its purpose)
  • The 4 core components of a methodology
  • What assessors/markers are looking for
  • How to get started developing a methodology


What’s this all about?

The 45-minute webinar is an engaging online workshop covering the basics of research methodology, specifically within the context of dissertations and theses.

The webinar is tailored toward students undertaking research for Master’s and Doctoral-level degrees within the sciences (natural and social). That said, you’ll still benefit even if you don’t fit this description, as the requirements are largely consistent across disciplines.

Key Details:

  • When: On-demand (instant access)
  • Where: Online (video or audio)
  • Duration: 45 minutes
  • Cost: 100% free

Meet Your Host


Kerryn Warren (PhD) is one of our friendly coaches. Having lectured and tutored students for over 10 years in subjects ranging from biology (natural science) to archaeological heritage (social science), Kerryn brings a unique blend of skills and perspectives.


What you’ll get

The research methodology webinar is more than just a one-way presentation.


A-Z Workshop

The presentation will cover the basics of research methodology so that you can make informed decisions.


Engaging Chat

Join in on the conversation with students from around the world – learn and share throughout the webinar.


Free Resources

You’ll receive access to a host of free resources, including detailed guides, comprehensive templates and practical exercises.



Five Day Workshop on Research Methodology

May 8th to 12th, 2023

School of Research Methodology, Tata Institute of Social Sciences, Deonar, Mumbai

The workshop is organised by the School of Research Methodology.

About the Programme

The aim of the Five Day Workshop on 'Research Methodology' is to familiarise participants with both social science research approaches, qualitative and quantitative.

Workshop Objectives

The objective of this workshop is to provide a comprehensive overview of qualitative and quantitative research frameworks.

Course Content

  • Different approaches to social science research
  • Fundamentals of social science research
  • Tools and methods of data collection for qualitative and quantitative research
  • Research based on secondary data
  • Data analysis using statistical software
  • Reporting and presentation
  • Citation, reference management tools and plagiarism

Workshop Dynamics

A basic understanding of the concept of research methodology is necessary to get the most out of the programme. The workshop will therefore start with theoretical input sessions to help participants brush up their knowledge of basic research methods. Both theoretical and practical sessions will be arranged so that participants can understand, appreciate and meaningfully interpret the output.

Participant Selection

Participants will be selected on a first-come, first-served basis. Seats are limited; if all seats are filled before the last date of application, no subsequent applications will be accepted.

Eligibility:

Researchers, M.Phil/Ph.D research scholars, teachers, and NGO staff working in social science subjects in any sector will benefit from participating in this workshop.

Course Fees: The workshop course fee is Rs. 5,000.00 per person, and Rs. 4,000.00 for TISS alumni. The fee covers course materials, the workshop certificate, a group photograph, and tea and lunch during the workshop. Payment can be made through online registration via the link below. The number of participants for the workshop is restricted to 40. The last date for registering for the programme is 1st May 2023.

Registration link: https://support.tiss.edu/

Accommodation: A limited number of places on a twin-sharing basis will be available in the guest house at Rs. 1,500 per person per day if requested in advance. Participants will have to bear their own expenses for travel, boarding and lodging.

Resource Persons

Prof. Anil Sutar, Tata Institute of Social Sciences, Mumbai

Prof. R. B. Bhagat, Tata Institute of Social Sciences, Mumbai

Dr. D. P. Singh, Tata Institute of Social Sciences, Mumbai

Course co-ordinator

Prof. D. P. Singh

Tata Institute of Social Sciences, Mumbai

For any queries regarding the workshop, contact:

Mob: +91 9819177709

E-Mail: [email protected] , [email protected]


A tutorial on methodological studies: the what, when, how and why

Lawrence Mbuagbaw, Daeria O. Lawson, Livia Puljak, David B. Allison and Lehana Thabane

BMC Medical Research Methodology, volume 20, article number 226 (2020). Open access; published 7 September 2020.


Methodological studies – studies that evaluate the design, analysis or reporting of other research-related reports – play an important role in health research. They help to highlight issues in the conduct of research with the aim of improving health research methodology, and ultimately reducing research waste.

We provide an overview of some of the key aspects of methodological studies such as what they are, and when, how and why they are done. We adopt a “frequently asked questions” format to facilitate reading this paper and provide multiple examples to help guide researchers interested in conducting methodological studies. Some of the topics addressed include: is it necessary to publish a study protocol? How to select relevant research reports and databases for a methodological study? What approaches to data extraction and statistical analysis should be considered when conducting a methodological study? What are potential threats to validity and is there a way to appraise the quality of methodological studies?

Appropriate reflection and application of basic principles of epidemiology and biostatistics are required in the design and analysis of methodological studies. This paper provides an introduction for further discussion about the conduct of methodological studies.


The field of meta-research (or research-on-research) has proliferated in recent years in response to issues with research quality and conduct [ 1 , 2 , 3 ]. As the name suggests, this field targets issues with research design, conduct, analysis and reporting. Various types of research reports are often examined as the unit of analysis in these studies (e.g. abstracts, full manuscripts, trial registry entries). Like many other novel fields of research, meta-research has seen a proliferation of use before the development of reporting guidance. For example, this was the case with randomized trials for which risk of bias tools and reporting guidelines were only developed much later – after many trials had been published and noted to have limitations [ 4 , 5 ]; and for systematic reviews as well [ 6 , 7 , 8 ]. However, in the absence of formal guidance, studies that report on research differ substantially in how they are named, conducted and reported [ 9 , 10 ]. This creates challenges in identifying, summarizing and comparing them. In this tutorial paper, we will use the term methodological study to refer to any study that reports on the design, conduct, analysis or reporting of primary or secondary research-related reports (such as trial registry entries and conference abstracts).

In the past 10 years, there has been an increase in the use of terms related to methodological studies (based on records retrieved with a keyword search [in the title and abstract] for “methodological review” and “meta-epidemiological study” in PubMed up to December 2019), suggesting that these studies may be appearing more frequently in the literature. See Fig.  1 .

Figure 1. Trends in the number of studies that mention “methodological review” or “meta-epidemiological study” in PubMed.
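As a rough illustration of how such a trend could be tallied, the sketch below queries PubMed year by year for one of the phrases through NCBI E-utilities, using Biopython's Entrez module. The e-mail address is a placeholder, and the exact counts will depend on how the query is phrased.

```python
# Sketch: count PubMed records per year mentioning "methodological review"
# in the title or abstract. Assumes Biopython is installed.
from Bio import Entrez

Entrez.email = "your.name@example.org"  # placeholder; NCBI asks for a contact address

term = '"methodological review"[Title/Abstract]'
for year in range(2010, 2020):
    handle = Entrez.esearch(db="pubmed", term=term, datetype="pdat",
                            mindate=str(year), maxdate=str(year), retmax=0)
    record = Entrez.read(handle)
    handle.close()
    print(year, record["Count"])  # number of matching records in that year
```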

The methods used in many methodological studies have been borrowed from systematic and scoping reviews. This practice has influenced the direction of the field, with many methodological studies including searches of electronic databases, screening of records, duplicate data extraction and assessments of risk of bias in the included studies. However, the research questions posed in methodological studies do not always require the approaches listed above, and guidance is needed on when and how to apply these methods to a methodological study. Even though methodological studies can be conducted on qualitative or mixed methods research, this paper focuses on and draws examples exclusively from quantitative research.

The objectives of this paper are to provide some insights on how to conduct methodological studies so that there is greater consistency between the research questions posed, and the design, analysis and reporting of findings. We provide multiple examples to illustrate concepts and a proposed framework for categorizing methodological studies in quantitative research.

What is a methodological study?

Any study that describes or analyzes methods (design, conduct, analysis or reporting) in published (or unpublished) literature is a methodological study. Consequently, the scope of methodological studies is quite extensive and includes, but is not limited to, topics as diverse as: research question formulation [ 11 ]; adherence to reporting guidelines [ 12 , 13 , 14 ] and consistency in reporting [ 15 ]; approaches to study analysis [ 16 ]; investigating the credibility of analyses [ 17 ]; and studies that synthesize these methodological studies [ 18 ]. While the nomenclature of methodological studies is not uniform, the intents and purposes of these studies remain fairly consistent – to describe or analyze methods in primary or secondary studies. As such, methodological studies may also be classified as a subtype of observational studies.

Parallel to this are experimental studies that compare different methods. Even though they play an important role in informing optimal research methods, experimental methodological studies are beyond the scope of this paper. Examples of such studies include the randomized trials by Buscemi et al., comparing single data extraction to double data extraction [ 19 ], and Carrasco-Labra et al., comparing approaches to presenting findings in Grading of Recommendations, Assessment, Development and Evaluations (GRADE) summary of findings tables [ 20 ]. In these studies, the unit of analysis is the person or groups of individuals applying the methods. We also direct readers to the Studies Within a Trial (SWAT) and Studies Within a Review (SWAR) programme operated through the Hub for Trials Methodology Research, for further reading as a potential useful resource for these types of experimental studies [ 21 ]. Lastly, this paper is not meant to inform the conduct of research using computational simulation and mathematical modeling for which some guidance already exists [ 22 ], or studies on the development of methods using consensus-based approaches.

When should we conduct a methodological study?

Methodological studies occupy a unique niche in health research that allows them to inform methodological advances. Methodological studies should also be conducted as precursors to reporting guideline development, as they provide an opportunity to understand current practices, and help to identify the need for guidance and gaps in methodological or reporting quality. For example, the development of the popular Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines was preceded by methodological studies identifying poor reporting practices [ 23 , 24 ]. In these instances, after the reporting guidelines are published, methodological studies can also be used to monitor uptake of the guidelines.

These studies can also be conducted to inform the state of the art for design, analysis and reporting practices across different types of health research fields, with the aim of improving research practices, and preventing or reducing research waste. For example, Samaan et al. conducted a scoping review of adherence to different reporting guidelines in health care literature [ 18 ]. Methodological studies can also be used to determine the factors associated with reporting practices. For example, Abbade et al. investigated journal characteristics associated with the use of the Participants, Intervention, Comparison, Outcome, Timeframe (PICOT) format in framing research questions in trials of venous ulcer disease [ 11 ].

How often are methodological studies conducted?

There is no clear answer to this question. Based on a search of PubMed, the use of related terms (“methodological review” and “meta-epidemiological study”) – and therefore, the number of methodological studies – is on the rise. However, many other terms are used to describe methodological studies. There are also many studies that explore design, conduct, analysis or reporting of research reports, but that do not use any specific terms to describe or label their study design in terms of “methodology”. This diversity in nomenclature makes a census of methodological studies elusive. Appropriate terminology and key words for methodological studies are needed to facilitate improved accessibility for end-users.

Why do we conduct methodological studies?

Methodological studies provide information on the design, conduct, analysis or reporting of primary and secondary research and can be used to appraise quality, quantity, completeness, accuracy and consistency of health research. These issues can be explored in specific fields, journals, databases, geographical regions and time periods. For example, Areia et al. explored the quality of reporting of endoscopic diagnostic studies in gastroenterology [ 25 ]; Knol et al. investigated the reporting of p-values in baseline tables in randomized trials published in high impact journals [ 26 ]; Chen et al. described adherence to the Consolidated Standards of Reporting Trials (CONSORT) statement in Chinese journals [ 27 ]; and Hopewell et al. described the effect of editors’ implementation of CONSORT guidelines on reporting of abstracts over time [ 28 ]. Methodological studies provide useful information to researchers, clinicians, editors, publishers and users of health literature. As a result, these studies have been a cornerstone of important methodological developments in the past two decades and have informed the development of many health research guidelines including the highly cited CONSORT statement [ 5 ].

Where can we find methodological studies?

Methodological studies can be found in most common biomedical bibliographic databases (e.g. Embase, MEDLINE, PubMed, Web of Science). However, the biggest caveat is that methodological studies are hard to identify in the literature due to the wide variety of names used and the lack of comprehensive databases dedicated to them. A handful can be found in the Cochrane Library as “Cochrane Methodology Reviews”, but these studies only cover methodological issues related to systematic reviews. Previous attempts to catalogue all empirical studies of methods used in reviews were abandoned 10 years ago [ 29 ]. In other databases, a variety of search terms may be applied with different levels of sensitivity and specificity.

Some frequently asked questions about methodological studies

In this section, we have outlined responses to questions that might help inform the conduct of methodological studies.

Q: How should I select research reports for my methodological study?

A: Selection of research reports for a methodological study depends on the research question and eligibility criteria. Once a clear research question is set and the nature of literature one desires to review is known, one can then begin the selection process. Selection may begin with a broad search, especially if the eligibility criteria are not apparent. For example, a methodological study of Cochrane Reviews of HIV would not require a complex search as all eligible studies can easily be retrieved from the Cochrane Library after checking a few boxes [ 30 ]. On the other hand, a methodological study of subgroup analyses in trials of gastrointestinal oncology would require a search to find such trials, and further screening to identify trials that conducted a subgroup analysis [ 31 ].

The strategies used for identifying participants in observational studies can apply here. One may use a systematic search to identify all eligible studies. If the number of eligible studies is unmanageable, a random sample of articles can be expected to provide comparable results if it is sufficiently large [ 32 ]. For example, Wilson et al. used a random sample of trials from the Cochrane Stroke Group’s Trial Register to investigate completeness of reporting [ 33 ]. It is possible that a simple random sample would lead to underrepresentation of units (i.e. research reports) that are smaller in number. This is relevant if the investigators wish to compare multiple groups but have too few units in one group. In this case a stratified sample would help to create equal groups. For example, in a methodological study comparing Cochrane and non-Cochrane reviews, Kahale et al. drew random samples from both groups [ 34 ]. Alternatively, systematic or purposeful sampling strategies can be used and we encourage researchers to justify their selected approaches based on the study objective.
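The sampling strategies described above translate directly into a few lines of code. The sketch below uses pandas to draw a simple random sample from a sampling frame of eligible articles, and a stratified sample with equal numbers per group (here, Cochrane versus non-Cochrane reviews). The file and column names are assumptions made for illustration.

```python
# Sketch: simple random vs. stratified sampling of a frame of eligible articles.
# Assumption: a CSV with one row per eligible record and a 'review_type' column
# taking the values "Cochrane" or "non-Cochrane".
import pandas as pd

frame = pd.read_csv("eligible_articles.csv")  # hypothetical sampling frame

# Simple random sample of 200 research reports (fixed seed for reproducibility)
random_sample = frame.sample(n=200, random_state=2020)

# Stratified sample: 100 reports per group, so the smaller group is not underrepresented
stratified_sample = (
    frame.groupby("review_type", group_keys=False)
         .apply(lambda g: g.sample(n=100, random_state=2020))
)

print(len(random_sample), len(stratified_sample))
```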

Q: How many databases should I search?

A: The number of databases one should search would depend on the approach to sampling, which can include targeting the entire “population” of interest or a sample of that population. If you are interested in including the entire target population for your research question, or drawing a random or systematic sample from it, then a comprehensive and exhaustive search for relevant articles is required. In this case, we recommend using systematic approaches for searching electronic databases (i.e. at least 2 databases with a replicable and time stamped search strategy). The results of your search will constitute a sampling frame from which eligible studies can be drawn.

Alternatively, if your approach to sampling is purposeful, then we recommend targeting the database(s) or data sources (e.g. journals, registries) that include the information you need. For example, if you are conducting a methodological study of high impact journals in plastic surgery and they are all indexed in PubMed, you likely do not need to search any other databases. You may also have a comprehensive list of all journals of interest and can approach your search using the journal names in your database search (or by accessing the journal archives directly from the journal’s website). Even though one could also search journals’ web pages directly, using a database such as PubMed has multiple advantages, such as the use of filters, so the search can be narrowed down to a certain period, or study types of interest. Furthermore, individual journals’ web sites may have different search functionalities, which do not necessarily yield a consistent output.

Q: Should I publish a protocol for my methodological study?

A: A protocol is a description of intended research methods. Currently, only protocols for clinical trials require registration [ 35 ]. Protocols for systematic reviews are encouraged but no formal recommendation exists. The scientific community welcomes the publication of protocols because they help protect against selective outcome reporting, the use of post hoc methodologies to embellish results, and to help avoid duplication of efforts [ 36 ]. While the latter two risks exist in methodological research, the negative consequences may be substantially less than for clinical outcomes. In a sample of 31 methodological studies, 7 (22.6%) referenced a published protocol [ 9 ]. In the Cochrane Library, there are 15 protocols for methodological reviews (21 July 2020). This suggests that publishing protocols for methodological studies is not uncommon.

Authors can consider publishing their study protocol in a scholarly journal as a manuscript. Advantages of such publication include obtaining peer-review feedback about the planned study, and easy retrieval by searching databases such as PubMed. The disadvantages in trying to publish protocols includes delays associated with manuscript handling and peer review, as well as costs, as few journals publish study protocols, and those journals mostly charge article-processing fees [ 37 ]. Authors who would like to make their protocol publicly available without publishing it in scholarly journals, could deposit their study protocols in publicly available repositories, such as the Open Science Framework ( https://osf.io/ ).

Q: How to appraise the quality of a methodological study?

A: To date, there is no published tool for appraising the risk of bias in a methodological study, but in principle, a methodological study could be considered as a type of observational study. Therefore, during conduct or appraisal, care should be taken to avoid the biases common in observational studies [ 38 ]. These biases include selection bias, comparability of groups, and ascertainment of exposure or outcome. In other words, to generate a representative sample, a comprehensive reproducible search may be necessary to build a sampling frame. Additionally, random sampling may be necessary to ensure that all the included research reports have the same probability of being selected, and the screening and selection processes should be transparent and reproducible. To ensure that the groups compared are similar in all characteristics, matching, random sampling or stratified sampling can be used. Statistical adjustments for between-group differences can also be applied at the analysis stage. Finally, duplicate data extraction can reduce errors in assessment of exposures or outcomes.

Q: Should I justify a sample size?

A: In all instances where one is not using the target population (i.e. the group to which inferences from the research report are directed) [ 39 ], a sample size justification is good practice. The sample size justification may take the form of a description of what is expected to be achieved with the number of articles selected, or a formal sample size estimation that outlines the number of articles required to answer the research question with a certain precision and power. Sample size justifications in methodological studies are reasonable in the following instances:

Comparing two groups

Determining a proportion, mean or another quantifier

Determining factors associated with an outcome using regression-based analyses

For example, El Dib et al. computed a sample size requirement for a methodological study of diagnostic strategies in randomized trials, based on a confidence interval approach [ 40 ].
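For the common case of estimating a proportion (for example, the percentage of trials adhering to a guideline) with a given precision, a confidence-interval-based sample size can be computed from the standard formula n = z² × p(1 − p) / d². The sketch below is a generic illustration of that formula, not a reproduction of the calculation used by El Dib et al.

```python
# Sketch: sample size needed to estimate a proportion within a margin of error d.
# Assumed inputs: expected proportion p, margin of error d, 95% confidence (z = 1.96).
import math

def sample_size_for_proportion(p: float, d: float, z: float = 1.96) -> int:
    """Number of research reports needed to estimate a proportion p within +/- d."""
    return math.ceil(z ** 2 * p * (1 - p) / d ** 2)

# Expecting roughly 50% adherence and wanting a +/- 5% margin of error:
print(sample_size_for_proportion(p=0.5, d=0.05))  # 385
```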

Q: What should I call my study?

A: Other terms which have been used to describe/label methodological studies include “ methodological review ”, “methodological survey” , “meta-epidemiological study” , “systematic review” , “systematic survey”, “meta-research”, “research-on-research” and many others. We recommend that the study nomenclature be clear, unambiguous, informative and allow for appropriate indexing. Methodological study nomenclature that should be avoided includes “ systematic review” – as this will likely be confused with a systematic review of a clinical question. “ Systematic survey” may also lead to confusion about whether the survey was systematic (i.e. using a preplanned methodology) or a survey using “ systematic” sampling (i.e. a sampling approach using specific intervals to determine who is selected) [ 32 ]. Any of the above meanings of the words “ systematic” may be true for methodological studies and could be potentially misleading. “ Meta-epidemiological study” is ideal for indexing, but not very informative as it describes an entire field. The term “ review ” may point towards an appraisal or “review” of the design, conduct, analysis or reporting (or methodological components) of the targeted research reports, yet it has also been used to describe narrative reviews [ 41 , 42 ]. The term “ survey ” is also in line with the approaches used in many methodological studies [ 9 ], and would be indicative of the sampling procedures of this study design. However, in the absence of guidelines on nomenclature, the term “ methodological study ” is broad enough to capture most of the scenarios of such studies.

Q: Should I account for clustering in my methodological study?

A: Data from methodological studies are often clustered. For example, articles coming from a specific source may have different reporting standards (e.g. the Cochrane Library). Articles within the same journal may be similar due to editorial practices and policies, reporting requirements and endorsement of guidelines. There is emerging evidence that these are real concerns that should be accounted for in analyses [ 43 ]. Some cluster variables are described in the section: “ What variables are relevant to methodological studies?”

A variety of modelling approaches can be used to account for correlated data, including the use of marginal, fixed or mixed effects regression models with appropriate computation of standard errors [ 44 ]. For example, Kosa et al. used generalized estimation equations to account for correlation of articles within journals [ 15 ]. Not accounting for clustering could lead to incorrect p -values, unduly narrow confidence intervals, and biased estimates [ 45 ].
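To illustrate one of the modelling approaches mentioned above, the sketch below fits a generalized estimating equation with articles clustered within journals, using statsmodels. The outcome, covariates and data file are hypothetical; the point is simply that the grouping variable carries the clustering, as in the Kosa et al. example.

```python
# Sketch: GEE accounting for clustering of articles within journals.
# Assumed data: one row per article with a binary outcome 'adequate_reporting',
# covariates 'pub_year' and 'industry_funded', and a cluster id 'journal'.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

articles = pd.read_csv("extracted_articles.csv")  # hypothetical extraction sheet

model = smf.gee(
    "adequate_reporting ~ pub_year + industry_funded",
    groups="journal",                         # articles clustered within journals
    data=articles,
    family=sm.families.Binomial(),            # binary outcome
    cov_struct=sm.cov_struct.Exchangeable(),  # working correlation within clusters
)
result = model.fit()
print(result.summary())
```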

Q: Should I extract data in duplicate?

A: Yes. Duplicate data extraction takes more time but results in fewer errors [ 19 ]. Data extraction errors in turn affect the effect estimate [ 46 ], and therefore should be mitigated. Duplicate data extraction should be considered in the absence of other approaches to minimize extraction errors. Much like systematic reviews, this area will likely see rapid new advances with machine learning and natural language processing technologies to support researchers with screening and data extraction [ 47 , 48 ]. However, experience plays an important role in the quality of extracted data, and inexperienced extractors should be paired with experienced extractors [ 46 , 49 ].

Q: Should I assess the risk of bias of research reports included in my methodological study?

A : Risk of bias is most useful in determining the certainty that can be placed in the effect measure from a study. In methodological studies, risk of bias may not serve the purpose of determining the trustworthiness of results, as effect measures are often not the primary goal of methodological studies. Determining risk of bias in methodological studies is likely a practice borrowed from systematic review methodology, but whose intrinsic value is not obvious in methodological studies. When it is part of the research question, investigators often focus on one aspect of risk of bias. For example, Speich investigated how blinding was reported in surgical trials [ 50 ], and Abraha et al., investigated the application of intention-to-treat analyses in systematic reviews and trials [ 51 ].

Q: What variables are relevant to methodological studies?

A: There is empirical evidence that certain variables may inform the findings in a methodological study. We outline some of these and provide a brief overview below:

Country: Countries and regions differ in their research cultures, and the resources available to conduct research. Therefore, it is reasonable to believe that there may be differences in methodological features across countries. Methodological studies have reported loco-regional differences in reporting quality [ 52 , 53 ]. This may also be related to challenges non-English speakers face in publishing papers in English.

Authors’ expertise: The inclusion of authors with expertise in research methodology, biostatistics, and scientific writing is likely to influence the end-product. Oltean et al. found that among randomized trials in orthopaedic surgery, the use of analyses that accounted for clustering was more likely when specialists (e.g. statistician, epidemiologist or clinical trials methodologist) were included on the study team [ 54 ]. Fleming et al. found that including methodologists in the review team was associated with appropriate use of reporting guidelines [ 55 ].

Source of funding and conflicts of interest: Some studies have found that funded studies report better [ 56 , 57 ], while others do not [ 53 , 58 ]. The presence of funding would indicate the availability of resources deployed to ensure optimal design, conduct, analysis and reporting. However, the source of funding may introduce conflicts of interest and warrant assessment. For example, Kaiser et al. investigated the effect of industry funding on obesity or nutrition randomized trials and found that reporting quality was similar [ 59 ]. Thomas et al. looked at reporting quality of long-term weight loss trials and found that industry funded studies were better [ 60 ]. Kan et al. examined the association between industry funding and “positive trials” (trials reporting a significant intervention effect) and found that industry funding was highly predictive of a positive trial [ 61 ]. This finding is similar to that of a recent Cochrane Methodology Review by Hansen et al. [ 62 ]

Journal characteristics: Certain journals’ characteristics may influence the study design, analysis or reporting. Characteristics such as journal endorsement of guidelines [ 63 , 64 ], and Journal Impact Factor (JIF) have been shown to be associated with reporting [ 63 , 65 , 66 , 67 ].

Study size (sample size/number of sites): Some studies have shown that reporting is better in larger studies [ 53 , 56 , 58 ].

Year of publication: It is reasonable to assume that design, conduct, analysis and reporting of research will change over time. Many studies have demonstrated improvements in reporting over time or after the publication of reporting guidelines [ 68 , 69 ].

Type of intervention: In a methodological study of reporting quality of weight loss intervention studies, Thabane et al. found that trials of pharmacologic interventions were reported better than trials of non-pharmacologic interventions [ 70 ].

Interactions between variables: Complex interactions between the previously listed variables are possible. High income countries with more resources may be more likely to conduct larger studies and incorporate a variety of experts. Authors in certain countries may prefer certain journals, and journal endorsement of guidelines and editorial policies may change over time.

Q: Should I focus only on high impact journals?

A: Investigators may choose to investigate only high impact journals because they are more likely to influence practice and policy, or because they assume that methodological standards would be higher. However, the JIF may severely limit the scope of articles included and may skew the sample towards articles with positive findings. The generalizability and applicability of findings from a handful of journals must be examined carefully, especially since the JIF varies over time. Even among journals that are all “high impact”, variations exist in methodological standards.

Q: Can I conduct a methodological study of qualitative research?

A: Yes. Even though a lot of methodological research has been conducted in the quantitative research field, methodological studies of qualitative studies are feasible. Certain databases that catalogue qualitative research including the Cumulative Index to Nursing & Allied Health Literature (CINAHL) have defined subject headings that are specific to methodological research (e.g. “research methodology”). Alternatively, one could also conduct a qualitative methodological review; that is, use qualitative approaches to synthesize methodological issues in qualitative studies.

Q: What reporting guidelines should I use for my methodological study?

A: There is no guideline that covers the entire scope of methodological studies. One adaptation of the PRISMA guidelines has been published, which works well for studies that aim to use the entire target population of research reports [ 71 ]. However, it is not widely used (40 citations in 2 years as of 09 December 2019), and methodological studies that are designed as cross-sectional or before-after studies require a more fit-for-purpose guideline. A more encompassing reporting guideline for a broad range of methodological studies is currently under development [ 72 ]. However, in the absence of formal guidance, the requirements for scientific reporting should be respected, and authors of methodological studies should focus on transparency and reproducibility.

Q: What are the potential threats to validity and how can I avoid them?

A: Methodological studies may be compromised by a lack of internal or external validity. The main threats to internal validity in methodological studies are selection and confounding bias. Investigators must ensure that the methods used to select articles do not make them differ systematically from the set of articles to which they would like to make inferences. For example, attempting to make extrapolations to all journals after analyzing high-impact journals would be misleading.

Many factors (confounders) may distort the association between the exposure and outcome if the included research reports differ with respect to these factors [ 73 ]. For example, when examining the association between source of funding and completeness of reporting, it may be necessary to account for journals that endorse the guidelines. Confounding bias can be addressed by restriction, matching and statistical adjustment [ 73 ]. Restriction appears to be the method of choice for many investigators who choose to include only high impact journals or articles in a specific field. For example, Knol et al. examined the reporting of p -values in baseline tables of high impact journals [ 26 ]. Matching is also sometimes used. In the methodological study of non-randomized interventional studies of elective ventral hernia repair, Parker et al. matched prospective studies with retrospective studies and compared reporting standards [ 74 ]. Some other methodological studies use statistical adjustments. For example, Zhang et al. used regression techniques to determine the factors associated with missing participant data in trials [ 16 ].
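Statistical adjustment of the kind mentioned above often amounts to an ordinary regression model. The sketch below adjusts an association between funding and completeness of reporting for journal guideline endorsement and publication year using logistic regression; the variable names and data file are assumptions for illustration, not the model used by Zhang et al.

```python
# Sketch: adjusting for measured confounders with logistic regression.
# Assumed variables: 'complete_reporting' (0/1), 'funded' (0/1),
# 'journal_endorses_guideline' (0/1) and 'pub_year' (numeric).
import pandas as pd
import statsmodels.formula.api as smf

articles = pd.read_csv("extracted_articles.csv")  # hypothetical extraction sheet

adjusted = smf.logit(
    "complete_reporting ~ funded + journal_endorses_guideline + pub_year",
    data=articles,
).fit()
print(adjusted.summary())  # coefficient for 'funded' is now adjusted for the other covariates
```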

With regard to external validity, researchers interested in conducting methodological studies must consider how generalizable or applicable their findings are. This should tie in closely with the research question and should be explicit. For example, findings from methodological studies on trials published in high impact cardiology journals cannot be assumed to be applicable to trials in other fields. However, investigators must ensure that their sample truly represents the target sample either by a) conducting a comprehensive and exhaustive search, or b) using an appropriate and justified, randomly selected sample of research reports.

Even applicability to high impact journals may vary based on the investigators’ definition, and over time. For example, for high impact journals in the field of general medicine, Bouwmeester et al. included the Annals of Internal Medicine (AIM), BMJ, the Journal of the American Medical Association (JAMA), Lancet, the New England Journal of Medicine (NEJM), and PLoS Medicine ( n  = 6) [ 75 ]. In contrast, the high impact journals selected in the methodological study by Schiller et al. were BMJ, JAMA, Lancet, and NEJM ( n  = 4) [ 76 ]. Another methodological study by Kosa et al. included AIM, BMJ, JAMA, Lancet and NEJM ( n  = 5). In the methodological study by Thabut et al., journals with a JIF greater than 5 were considered to be high impact. Riado Minguez et al. used first quartile journals in the Journal Citation Reports (JCR) for a specific year to determine “high impact” [ 77 ]. Ultimately, the definition of high impact will be based on the number of journals the investigators are willing to include, the year of impact and the JIF cut-off [ 78 ]. We acknowledge that the term “generalizability” may apply differently for methodological studies, especially when in many instances it is possible to include the entire target population in the sample studied.

Finally, methodological studies are not exempt from information bias which may stem from discrepancies in the included research reports [ 79 ], errors in data extraction, or inappropriate interpretation of the information extracted. Likewise, publication bias may also be a concern in methodological studies, but such concepts have not yet been explored.

A proposed framework

In order to inform discussions about methodological studies and the development of guidance for what should be reported, we have outlined some key features of methodological studies that can be used to classify them. For each of the categories outlined below, we provide an example. In our experience, the choice of approach to completing a methodological study can be informed by asking the following four questions:

What is the aim?

Methodological studies that investigate bias

A methodological study may be focused on exploring sources of bias in primary or secondary studies (meta-bias), or how bias is analyzed. We have taken care to distinguish bias (i.e. systematic deviations from the truth irrespective of the source) from reporting quality or completeness (i.e. not adhering to a specific reporting guideline or norm). An example of where this distinction would be important is in the case of a randomized trial with no blinding. This study (depending on the nature of the intervention) would be at risk of performance bias. However, if the authors report that their study was not blinded, they would have reported adequately. In fact, some methodological studies attempt to capture both “quality of conduct” and “quality of reporting”, such as Richie et al., who reported on the risk of bias in randomized trials of pharmacy practice interventions [ 80 ]. Babic et al. investigated how risk of bias was used to inform sensitivity analyses in Cochrane reviews [ 81 ]. Further, biases related to choice of outcomes can also be explored. For example, Tan et al investigated differences in treatment effect size based on the outcome reported [ 82 ].

Methodological studies that investigate quality (or completeness) of reporting

Methodological studies may report quality of reporting against a reporting checklist (i.e. adherence to guidelines) or against expected norms. For example, Croituro et al. report on the quality of reporting in systematic reviews published in dermatology journals based on their adherence to the PRISMA statement [ 83 ], and Khan et al. described the quality of reporting of harms in randomized controlled trials published in high impact cardiovascular journals based on the CONSORT extension for harms [ 84 ]. Other methodological studies investigate reporting of certain features of interest that may not be part of formally published checklists or guidelines. For example, Mbuagbaw et al. described how often the implications for research are elaborated using the Evidence, Participants, Intervention, Comparison, Outcome, Timeframe (EPICOT) format [ 30 ].

Methodological studies that investigate the consistency of reporting

Sometimes investigators may be interested in how consistent reports of the same research are, as it is expected that there should be consistency between: conference abstracts and published manuscripts; manuscript abstracts and manuscript main text; and trial registration and published manuscript. For example, Rosmarakis et al. investigated consistency between conference abstracts and full text manuscripts [ 85 ].

Methodological studies that investigate factors associated with reporting

In addition to identifying issues with reporting in primary and secondary studies, authors of methodological studies may be interested in determining the factors that are associated with certain reporting practices. Many methodological studies incorporate this, albeit as a secondary outcome. For example, Farrokhyar et al. investigated the factors associated with reporting quality in randomized trials of coronary artery bypass grafting surgery [ 53 ].

Methodological studies that investigate methods

Methodological studies may also be used to describe methods, to compare methods, and to explore the factors associated with the methods used. For example, Mueller et al. described the methods used for systematic reviews and meta-analyses of observational studies [ 86 ].

Methodological studies that summarize other methodological studies

Some methodological studies synthesize results from other methodological studies. For example, Li et al. conducted a scoping review of methodological reviews that investigated consistency between full text and abstracts in primary biomedical research [ 87 ].

Methodological studies that investigate nomenclature and terminology

Some methodological studies may investigate the use of names and terms in health research. For example, Martinic et al. investigated the definitions of systematic reviews used in overviews of systematic reviews (OSRs), meta-epidemiological studies and epidemiology textbooks [ 88 ].

Other types of methodological studies

In addition to the previously mentioned experimental methodological studies, there may exist other types of methodological studies not captured here.

What is the design?

Methodological studies that are descriptive

Most methodological studies are purely descriptive and report their findings as counts (percent) and means (standard deviation) or medians (interquartile range). For example, Mbuagbaw et al. described the reporting of research recommendations in Cochrane HIV systematic reviews [ 30 ]. Gohari et al. described the quality of reporting of randomized trials in diabetes in Iran [ 12 ].
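
As a purely illustrative sketch of this kind of descriptive summary, the Python snippet below computes counts (percent), means (standard deviation) and medians (interquartile range) from invented extraction data; none of the values refer to the cited studies.

```python
# Illustrative only: summarizing hypothetical data extracted in a descriptive
# methodological study (the numbers are invented, not from any cited study).
from statistics import mean, stdev, quantiles

# 1 = review reports research recommendations, 0 = it does not (hypothetical)
reports_recommendations = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]
# Hypothetical count of subgroup analyses planned per review
subgroups_per_review = [0, 2, 5, 1, 3, 0, 7, 2, 4, 1]

n = len(reports_recommendations)
count = sum(reports_recommendations)
print(f"Reviews reporting recommendations: {count}/{n} ({100 * count / n:.0f}%)")

print(f"Mean (SD) subgroup analyses per review: "
      f"{mean(subgroups_per_review):.1f} ({stdev(subgroups_per_review):.1f})")

q1, q2, q3 = quantiles(subgroups_per_review, n=4)  # quartile cut points
print(f"Median (IQR) subgroup analyses per review: {q2:.1f} ({q1:.1f}-{q3:.1f})")
```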

Methodological studies that are analytical

Some methodological studies are analytical, wherein “analytical studies identify and quantify associations, test hypotheses, identify causes and determine whether an association exists between variables, such as between an exposure and a disease” [ 89 ]. In the case of methodological studies, all of these investigations are possible. For example, Kosa et al. investigated the association between agreement in the primary outcome from trial registry to published manuscript and study covariates. They found that larger and more recent studies were more likely to have agreement [ 15 ]. Tricco et al. compared the conclusion statements from Cochrane and non-Cochrane systematic reviews with a meta-analysis of the primary outcome and found that non-Cochrane reviews were more likely to report positive findings. These results are a test of the null hypothesis that the proportions of Cochrane and non-Cochrane reviews that report positive results are equal [ 90 ].
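
A comparison of proportions of this kind can be sketched as follows; the counts used below are invented and are not the figures reported by Tricco et al.

```python
# Illustrative only: a two-proportion z-test of the null hypothesis that the
# proportion of reviews with positive conclusions is the same in two groups.
# The counts are hypothetical, NOT the figures from Tricco et al.
from math import sqrt
from statistics import NormalDist

pos_a, n_a = 30, 100   # hypothetical: non-Cochrane reviews with positive conclusions
pos_b, n_b = 15, 100   # hypothetical: Cochrane reviews with positive conclusions

p_a, p_b = pos_a / n_a, pos_b / n_b
p_pool = (pos_a + pos_b) / (n_a + n_b)                  # pooled proportion under H0
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error under H0
z = (p_a - p_b) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))            # two-sided p-value

print(f"z = {z:.2f}, two-sided p = {p_value:.4f}")
```

A chi-squared test, or a logistic regression adjusting for review-level covariates, would be common alternatives to this simple test.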

What is the sampling strategy?

Methodological studies that include the target population

Methodological reviews with narrow research questions may be able to include the entire target population. For example, in the methodological study of Cochrane HIV systematic reviews, Mbuagbaw et al. included all of the available studies ( n  = 103) [ 30 ].

Methodological studies that include a sample of the target population

Many methodological studies use random samples of the target population [ 33 , 91 , 92 ]. Alternatively, purposeful sampling may be used, limiting the sample to a subset of research-related reports published within a certain time period, in journals with a certain ranking, or on a specific topic. Systematic sampling can also be used when random sampling is challenging to implement.
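
To illustrate two of these strategies, the sketch below draws a simple random sample and a systematic sample from a hypothetical sampling frame of report identifiers; the frame size, seed and sample size are assumptions made only for demonstration.

```python
# Illustrative only: simple random sampling and systematic sampling
# from a hypothetical sampling frame of report identifiers.
import random

frame = [f"report_{i:04d}" for i in range(1, 1001)]  # hypothetical frame of 1000 reports
sample_size = 100

# Simple random sampling (fixed seed so the sample can be reproduced)
random.seed(2020)
random_sample = random.sample(frame, sample_size)

# Systematic sampling: every k-th report after a random start
k = len(frame) // sample_size
start = random.randrange(k)
systematic_sample = frame[start::k][:sample_size]

print(len(random_sample), len(systematic_sample))
```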

What is the unit of analysis?

Methodological studies with a research report as the unit of analysis

Many methodological studies use a research report (e.g. full manuscript of study, abstract portion of the study) as the unit of analysis, and inferences can be made at the study-level. However, both published and unpublished research-related reports can be studied. These may include articles, conference abstracts, registry entries etc.

Methodological studies with a design, analysis or reporting item as the unit of analysis

Some methodological studies report on items which may occur more than once per article. For example, Paquette et al. report on subgroup analyses in Cochrane reviews of atrial fibrillation in which 17 systematic reviews planned 56 subgroup analyses [ 93 ].
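
To make the distinction concrete, the following hypothetical sketch analyzes the same extraction data at two units of analysis, the report (review) and the item (planned subgroup analysis); the review identifiers and variables are invented.

```python
# Illustrative only: the same hypothetical extraction data summarized at two
# units of analysis: the report (review) and the item (planned subgroup analysis).
from collections import Counter
from statistics import median

# Each row is one planned subgroup analysis (the item), nested within a review
items = [
    {"review": "R1", "variable": "age"},
    {"review": "R1", "variable": "sex"},
    {"review": "R2", "variable": "age"},
    {"review": "R3", "variable": "dose"},
    {"review": "R3", "variable": "sex"},
    {"review": "R3", "variable": "comorbidity"},
]

analyses_per_review = Counter(item["review"] for item in items)

print(f"Unit of analysis = item: {len(items)} planned subgroup analyses")
print(f"Unit of analysis = report: {len(analyses_per_review)} reviews, "
      f"median {median(analyses_per_review.values())} analyses per review")
```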

This framework is outlined in Fig.  2 .

Fig. 2 A proposed framework for methodological studies

Conclusions

Methodological studies have examined different aspects of reporting such as quality, completeness, consistency and adherence to reporting guidelines. As such, many of the methodological study examples cited in this tutorial are related to reporting. However, as an evolving field, the scope of research questions that can be addressed by methodological studies is expected to increase.

In this paper we have outlined the scope and purpose of methodological studies, along with examples of instances in which various approaches have been used. In the absence of formal guidance on the design, conduct, analysis and reporting of methodological studies, we have provided some advice to help make methodological studies consistent. This advice is grounded in good contemporary scientific practice. Generally, the research question should tie in with the sampling approach and planned analysis. We have also highlighted the variables that may inform findings from methodological studies. Lastly, we have provided suggestions for ways in which authors can categorize their methodological studies to inform their design and analysis.

Availability of data and materials

Data sharing is not applicable to this article as no new data were created or analyzed in this study.

Abbreviations

CONSORT: Consolidated Standards of Reporting Trials

EPICOT: Evidence, Participants, Intervention, Comparison, Outcome, Timeframe

GRADE: Grading of Recommendations, Assessment, Development and Evaluations

PICOT: Participants, Intervention, Comparison, Outcome, Timeframe

PRISMA: Preferred Reporting Items of Systematic reviews and Meta-Analyses

SWAR: Studies Within a Review

SWAT: Studies Within a Trial

Chalmers I, Glasziou P. Avoidable waste in the production and reporting of research evidence. Lancet. 2009;374(9683):86–9.

Chan AW, Song F, Vickers A, Jefferson T, Dickersin K, Gotzsche PC, Krumholz HM, Ghersi D, van der Worp HB. Increasing value and reducing waste: addressing inaccessible research. Lancet. 2014;383(9913):257–66.

Ioannidis JP, Greenland S, Hlatky MA, Khoury MJ, Macleod MR, Moher D, Schulz KF, Tibshirani R. Increasing value and reducing waste in research design, conduct, and analysis. Lancet. 2014;383(9912):166–75.

Higgins JP, Altman DG, Gotzsche PC, Juni P, Moher D, Oxman AD, Savovic J, Schulz KF, Weeks L, Sterne JA. The Cochrane Collaboration's tool for assessing risk of bias in randomised trials. BMJ. 2011;343:d5928.

Moher D, Schulz KF, Altman DG. The CONSORT statement: revised recommendations for improving the quality of reports of parallel-group randomised trials. Lancet. 2001;357.

Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gotzsche PC, Ioannidis JP, Clarke M, Devereaux PJ, Kleijnen J, Moher D. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med. 2009;6(7):e1000100.

Shea BJ, Hamel C, Wells GA, Bouter LM, Kristjansson E, Grimshaw J, Henry DA, Boers M. AMSTAR is a reliable and valid measurement tool to assess the methodological quality of systematic reviews. J Clin Epidemiol. 2009;62(10):1013–20.

Shea BJ, Reeves BC, Wells G, Thuku M, Hamel C, Moran J, Moher D, Tugwell P, Welch V, Kristjansson E, et al. AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ. 2017;358:j4008.

Lawson DO, Leenus A, Mbuagbaw L. Mapping the nomenclature, methodology, and reporting of studies that review methods: a pilot methodological review. Pilot Feasibility Studies. 2020;6(1):13.

Puljak L, Makaric ZL, Buljan I, Pieper D. What is a meta-epidemiological study? Analysis of published literature indicated heterogeneous study designs and definitions. J Comp Eff Res. 2020.

Abbade LPF, Wang M, Sriganesh K, Jin Y, Mbuagbaw L, Thabane L. The framing of research questions using the PICOT format in randomized controlled trials of venous ulcer disease is suboptimal: a systematic survey. Wound Repair Regen. 2017;25(5):892–900.

Gohari F, Baradaran HR, Tabatabaee M, Anijidani S, Mohammadpour Touserkani F, Atlasi R, Razmgir M. Quality of reporting randomized controlled trials (RCTs) in diabetes in Iran; a systematic review. J Diabetes Metab Disord. 2015;15(1):36.

Wang M, Jin Y, Hu ZJ, Thabane A, Dennis B, Gajic-Veljanoski O, Paul J, Thabane L. The reporting quality of abstracts of stepped wedge randomized trials is suboptimal: a systematic survey of the literature. Contemp Clin Trials Commun. 2017;8:1–10.

Shanthanna H, Kaushal A, Mbuagbaw L, Couban R, Busse J, Thabane L. A cross-sectional study of the reporting quality of pilot or feasibility trials in high-impact anesthesia journals. Can J Anaesth. 2018;65(11):1180–1195.

Kosa SD, Mbuagbaw L, Borg Debono V, Bhandari M, Dennis BB, Ene G, Leenus A, Shi D, Thabane M, Valvasori S, et al. Agreement in reporting between trial publications and current clinical trial registry in high impact journals: a methodological review. Contemporary Clinical Trials. 2018;65:144–50.

Zhang Y, Florez ID, Colunga Lozano LE, Aloweni FAB, Kennedy SA, Li A, Craigie S, Zhang S, Agarwal A, Lopes LC, et al. A systematic survey on reporting and methods for handling missing participant data for continuous outcomes in randomized controlled trials. J Clin Epidemiol. 2017;88:57–66.

Hernández AV, Boersma E, Murray GD, Habbema JD, Steyerberg EW. Subgroup analyses in therapeutic cardiovascular clinical trials: are most of them misleading? Am Heart J. 2006;151(2):257–64.

Samaan Z, Mbuagbaw L, Kosa D, Borg Debono V, Dillenburg R, Zhang S, Fruci V, Dennis B, Bawor M, Thabane L. A systematic scoping review of adherence to reporting guidelines in health care literature. J Multidiscip Healthc. 2013;6:169–88.

Buscemi N, Hartling L, Vandermeer B, Tjosvold L, Klassen TP. Single data extraction generated more errors than double data extraction in systematic reviews. J Clin Epidemiol. 2006;59(7):697–703.

Carrasco-Labra A, Brignardello-Petersen R, Santesso N, Neumann I, Mustafa RA, Mbuagbaw L, Etxeandia Ikobaltzeta I, De Stio C, McCullagh LJ, Alonso-Coello P. Improving GRADE evidence tables part 1: a randomized trial shows improved understanding of content in summary-of-findings tables with a new format. J Clin Epidemiol. 2016;74:7–18.

The Northern Ireland Hub for Trials Methodology Research: SWAT/SWAR Information [ https://www.qub.ac.uk/sites/TheNorthernIrelandNetworkforTrialsMethodologyResearch/SWATSWARInformation/ ]. Accessed 31 Aug 2020.

Chick S, Sánchez P, Ferrin D, Morrice D. How to conduct a successful simulation study. In: Proceedings of the 2003 winter simulation conference: 2003; 2003. p. 66–70.

Mulrow CD. The medical review article: state of the science. Ann Intern Med. 1987;106(3):485–8.

Sacks HS, Reitman D, Pagano D, Kupelnick B. Meta-analysis: an update. Mount Sinai J Med New York. 1996;63(3–4):216–24.

Areia M, Soares M, Dinis-Ribeiro M. Quality reporting of endoscopic diagnostic studies in gastrointestinal journals: where do we stand on the use of the STARD and CONSORT statements? Endoscopy. 2010;42(2):138–47.

Knol M, Groenwold R, Grobbee D. P-values in baseline tables of randomised controlled trials are inappropriate but still common in high impact journals. Eur J Prev Cardiol. 2012;19(2):231–2.

Chen M, Cui J, Zhang AL, Sze DM, Xue CC, May BH. Adherence to CONSORT items in randomized controlled trials of integrative medicine for colorectal Cancer published in Chinese journals. J Altern Complement Med. 2018;24(2):115–24.

Hopewell S, Ravaud P, Baron G, Boutron I. Effect of editors' implementation of CONSORT guidelines on the reporting of abstracts in high impact medical journals: interrupted time series analysis. BMJ. 2012;344:e4178.

The Cochrane Methodology Register Issue 2 2009 [ https://cmr.cochrane.org/help.htm ]. Accessed 31 Aug 2020.

Mbuagbaw L, Kredo T, Welch V, Mursleen S, Ross S, Zani B, Motaze NV, Quinlan L. Critical EPICOT items were absent in Cochrane human immunodeficiency virus systematic reviews: a bibliometric analysis. J Clin Epidemiol. 2016;74:66–72.

Barton S, Peckitt C, Sclafani F, Cunningham D, Chau I. The influence of industry sponsorship on the reporting of subgroup analyses within phase III randomised controlled trials in gastrointestinal oncology. Eur J Cancer. 2015;51(18):2732–9.

Setia MS. Methodology series module 5: sampling strategies. Indian J Dermatol. 2016;61(5):505–9.

Wilson B, Burnett P, Moher D, Altman DG, Al-Shahi Salman R. Completeness of reporting of randomised controlled trials including people with transient ischaemic attack or stroke: a systematic review. Eur Stroke J. 2018;3(4):337–46.

Kahale LA, Diab B, Brignardello-Petersen R, Agarwal A, Mustafa RA, Kwong J, Neumann I, Li L, Lopes LC, Briel M, et al. Systematic reviews do not adequately report or address missing outcome data in their analyses: a methodological survey. J Clin Epidemiol. 2018;99:14–23.

De Angelis CD, Drazen JM, Frizelle FA, Haug C, Hoey J, Horton R, Kotzin S, Laine C, Marusic A, Overbeke AJPM, et al. Is this clinical trial fully registered?: a statement from the International Committee of Medical Journal Editors*. Ann Intern Med. 2005;143(2):146–8.

Ohtake PJ, Childs JD. Why publish study protocols? Phys Ther. 2014;94(9):1208–9.

Rombey T, Allers K, Mathes T, Hoffmann F, Pieper D. A descriptive analysis of the characteristics and the peer review process of systematic review protocols published in an open peer review journal from 2012 to 2017. BMC Med Res Methodol. 2019;19(1):57.

Grimes DA, Schulz KF. Bias and causal associations in observational research. Lancet. 2002;359(9302):248–52.

Porta M, editor. A dictionary of epidemiology. 5th ed. Oxford: Oxford University Press; 2008.

El Dib R, Tikkinen KAO, Akl EA, Gomaa HA, Mustafa RA, Agarwal A, Carpenter CR, Zhang Y, Jorge EC, Almeida R, et al. Systematic survey of randomized trials evaluating the impact of alternative diagnostic strategies on patient-important outcomes. J Clin Epidemiol. 2017;84:61–9.

Helzer JE, Robins LN, Taibleson M, Woodruff RA Jr, Reich T, Wish ED. Reliability of psychiatric diagnosis. I. a methodological review. Arch Gen Psychiatry. 1977;34(2):129–33.

Chung ST, Chacko SK, Sunehag AL, Haymond MW. Measurements of gluconeogenesis and Glycogenolysis: a methodological review. Diabetes. 2015;64(12):3996–4010.

Sterne JA, Juni P, Schulz KF, Altman DG, Bartlett C, Egger M. Statistical methods for assessing the influence of study characteristics on treatment effects in 'meta-epidemiological' research. Stat Med. 2002;21(11):1513–24.

Moen EL, Fricano-Kugler CJ, Luikart BW, O’Malley AJ. Analyzing clustered data: why and how to account for multiple observations nested within a study participant? PLoS One. 2016;11(1):e0146721.

Zyzanski SJ, Flocke SA, Dickinson LM. On the nature and analysis of clustered data. Ann Fam Med. 2004;2(3):199–200.

Mathes T, Klassen P, Pieper D. Frequency of data extraction errors and methods to increase data extraction quality: a methodological review. BMC Med Res Methodol. 2017;17(1):152.

Bui DDA, Del Fiol G, Hurdle JF, Jonnalagadda S. Extractive text summarization system to aid data extraction from full text in systematic review development. J Biomed Inform. 2016;64:265–72.

Bui DD, Del Fiol G, Jonnalagadda S. PDF text classification to leverage information extraction from publication reports. J Biomed Inform. 2016;61:141–8.

Maticic K, Krnic Martinic M, Puljak L. Assessment of reporting quality of abstracts of systematic reviews with meta-analysis using PRISMA-A and discordance in assessments between raters without prior experience. BMC Med Res Methodol. 2019;19(1):32.

Speich B. Blinding in surgical randomized clinical trials in 2015. Ann Surg. 2017;266(1):21–2.

Abraha I, Cozzolino F, Orso M, Marchesi M, Germani A, Lombardo G, Eusebi P, De Florio R, Luchetta ML, Iorio A, et al. A systematic review found that deviations from intention-to-treat are common in randomized trials and systematic reviews. J Clin Epidemiol. 2017;84:37–46.

Zhong Y, Zhou W, Jiang H, Fan T, Diao X, Yang H, Min J, Wang G, Fu J, Mao B. Quality of reporting of two-group parallel randomized controlled clinical trials of multi-herb formulae: A survey of reports indexed in the Science Citation Index Expanded. Eur J Integrative Med. 2011;3(4):e309–16.

Farrokhyar F, Chu R, Whitlock R, Thabane L. A systematic review of the quality of publications reporting coronary artery bypass grafting trials. Can J Surg. 2007;50(4):266–77.

Oltean H, Gagnier JJ. Use of clustering analysis in randomized controlled trials in orthopaedic surgery. BMC Med Res Methodol. 2015;15:17.

Fleming PS, Koletsi D, Pandis N. Blinded by PRISMA: are systematic reviewers focusing on PRISMA and ignoring other guidelines? PLoS One. 2014;9(5):e96407.

Balasubramanian SP, Wiener M, Alshameeri Z, Tiruvoipati R, Elbourne D, Reed MW. Standards of reporting of randomized controlled trials in general surgery: can we do better? Ann Surg. 2006;244(5):663–7.

de Vries TW, van Roon EN. Low quality of reporting adverse drug reactions in paediatric randomised controlled trials. Arch Dis Child. 2010;95(12):1023–6.

Borg Debono V, Zhang S, Ye C, Paul J, Arya A, Hurlburt L, Murthy Y, Thabane L. The quality of reporting of RCTs used within a postoperative pain management meta-analysis, using the CONSORT statement. BMC Anesthesiol. 2012;12:13.

Kaiser KA, Cofield SS, Fontaine KR, Glasser SP, Thabane L, Chu R, Ambrale S, Dwary AD, Kumar A, Nayyar G, et al. Is funding source related to study reporting quality in obesity or nutrition randomized control trials in top-tier medical journals? Int J Obes. 2012;36(7):977–81.

Thomas O, Thabane L, Douketis J, Chu R, Westfall AO, Allison DB. Industry funding and the reporting quality of large long-term weight loss trials. Int J Obes. 2008;32(10):1531–6.

Khan NR, Saad H, Oravec CS, Rossi N, Nguyen V, Venable GT, Lillard JC, Patel P, Taylor DR, Vaughn BN, et al. A review of industry funding in randomized controlled trials published in the neurosurgical literature-the elephant in the room. Neurosurgery. 2018;83(5):890–7.

Hansen C, Lundh A, Rasmussen K, Hrobjartsson A. Financial conflicts of interest in systematic reviews: associations with results, conclusions, and methodological quality. Cochrane Database Syst Rev. 2019;8:Mr000047.

Kiehna EN, Starke RM, Pouratian N, Dumont AS. Standards for reporting randomized controlled trials in neurosurgery. J Neurosurg. 2011;114(2):280–5.

Liu LQ, Morris PJ, Pengel LH. Compliance to the CONSORT statement of randomized controlled trials in solid organ transplantation: a 3-year overview. Transpl Int. 2013;26(3):300–6.

Bala MM, Akl EA, Sun X, Bassler D, Mertz D, Mejza F, Vandvik PO, Malaga G, Johnston BC, Dahm P, et al. Randomized trials published in higher vs. lower impact journals differ in design, conduct, and analysis. J Clin Epidemiol. 2013;66(3):286–95.

Lee SY, Teoh PJ, Camm CF, Agha RA. Compliance of randomized controlled trials in trauma surgery with the CONSORT statement. J Trauma Acute Care Surg. 2013;75(4):562–72.

Ziogas DC, Zintzaras E. Analysis of the quality of reporting of randomized controlled trials in acute and chronic myeloid leukemia, and myelodysplastic syndromes as governed by the CONSORT statement. Ann Epidemiol. 2009;19(7):494–500.

Alvarez F, Meyer N, Gourraud PA, Paul C. CONSORT adoption and quality of reporting of randomized controlled trials: a systematic analysis in two dermatology journals. Br J Dermatol. 2009;161(5):1159–65.

Mbuagbaw L, Thabane M, Vanniyasingam T, Borg Debono V, Kosa S, Zhang S, Ye C, Parpia S, Dennis BB, Thabane L. Improvement in the quality of abstracts in major clinical journals since CONSORT extension for abstracts: a systematic review. Contemporary Clin trials. 2014;38(2):245–50.

Thabane L, Chu R, Cuddy K, Douketis J. What is the quality of reporting in weight loss intervention studies? A systematic review of randomized controlled trials. Int J Obes. 2007;31(10):1554–9.

Murad MH, Wang Z. Guidelines for reporting meta-epidemiological methodology research. Evidence Based Med. 2017;22(4):139.

METRIC - MEthodological sTudy ReportIng Checklist: guidelines for reporting methodological studies in health research [ http://www.equator-network.org/library/reporting-guidelines-under-development/reporting-guidelines-under-development-for-other-study-designs/#METRIC ]. Accessed 31 Aug 2020.

Jager KJ, Zoccali C, MacLeod A, Dekker FW. Confounding: what it is and how to deal with it. Kidney Int. 2008;73(3):256–60.

Parker SG, Halligan S, Erotocritou M, Wood CPJ, Boulton RW, Plumb AAO, Windsor ACJ, Mallett S. A systematic methodological review of non-randomised interventional studies of elective ventral hernia repair: clear definitions and a standardised minimum dataset are needed. Hernia. 2019.

Bouwmeester W, Zuithoff NPA, Mallett S, Geerlings MI, Vergouwe Y, Steyerberg EW, Altman DG, Moons KGM. Reporting and methods in clinical prediction research: a systematic review. PLoS Med. 2012;9(5):1–12.

Schiller P, Burchardi N, Niestroj M, Kieser M. Quality of reporting of clinical non-inferiority and equivalence randomised trials--update and extension. Trials. 2012;13:214.

Riado Minguez D, Kowalski M, Vallve Odena M, Longin Pontzen D, Jelicic Kadic A, Jeric M, Dosenovic S, Jakus D, Vrdoljak M, Poklepovic Pericic T, et al. Methodological and reporting quality of systematic reviews published in the highest ranking journals in the field of pain. Anesth Analg. 2017;125(4):1348–54.

Thabut G, Estellat C, Boutron I, Samama CM, Ravaud P. Methodological issues in trials assessing primary prophylaxis of venous thrombo-embolism. Eur Heart J. 2005;27(2):227–36.

Puljak L, Riva N, Parmelli E, González-Lorenzo M, Moja L, Pieper D. Data extraction methods: an analysis of internal reporting discrepancies in single manuscripts and practical advice. J Clin Epidemiol. 2020;117:158–64.

Ritchie A, Seubert L, Clifford R, Perry D, Bond C. Do randomised controlled trials relevant to pharmacy meet best practice standards for quality conduct and reporting? A systematic review. Int J Pharm Pract. 2019.

Babic A, Vuka I, Saric F, Proloscic I, Slapnicar E, Cavar J, Pericic TP, Pieper D, Puljak L. Overall bias methods and their use in sensitivity analysis of Cochrane reviews were not consistent. J Clin Epidemiol. 2019.

Tan A, Porcher R, Crequit P, Ravaud P, Dechartres A. Differences in treatment effect size between overall survival and progression-free survival in immunotherapy trials: a Meta-epidemiologic study of trials with results posted at ClinicalTrials.gov. J Clin Oncol. 2017;35(15):1686–94.

Croitoru D, Huang Y, Kurdina A, Chan AW, Drucker AM. Quality of reporting in systematic reviews published in dermatology journals. Br J Dermatol. 2020;182(6):1469–76.

Khan MS, Ochani RK, Shaikh A, Vaduganathan M, Khan SU, Fatima K, Yamani N, Mandrola J, Doukky R, Krasuski RA. Assessing the quality of reporting of harms in randomized controlled trials published in high impact cardiovascular journals. Eur Heart J Qual Care Clin Outcomes. 2019.

Rosmarakis ES, Soteriades ES, Vergidis PI, Kasiakou SK, Falagas ME. From conference abstract to full paper: differences between data presented in conferences and journals. FASEB J. 2005;19(7):673–80.

Mueller M, D’Addario M, Egger M, Cevallos M, Dekkers O, Mugglin C, Scott P. Methods to systematically review and meta-analyse observational studies: a systematic scoping review of recommendations. BMC Med Res Methodol. 2018;18(1):44.

Li G, Abbade LPF, Nwosu I, Jin Y, Leenus A, Maaz M, Wang M, Bhatt M, Zielinski L, Sanger N, et al. A scoping review of comparisons between abstracts and full reports in primary biomedical research. BMC Med Res Methodol. 2017;17(1):181.

Krnic Martinic M, Pieper D, Glatt A, Puljak L. Definition of a systematic review used in overviews of systematic reviews, meta-epidemiological studies and textbooks. BMC Med Res Methodol. 2019;19(1):203.

Analytical study [ https://medical-dictionary.thefreedictionary.com/analytical+study ]. Accessed 31 Aug 2020.

Tricco AC, Tetzlaff J, Pham B, Brehaut J, Moher D. Non-Cochrane vs. Cochrane reviews were twice as likely to have positive conclusion statements: cross-sectional study. J Clin Epidemiol. 2009;62(4):380–6 e381.

Schalken N, Rietbergen C. The reporting quality of systematic reviews and Meta-analyses in industrial and organizational psychology: a systematic review. Front Psychol. 2017;8:1395.

Ranker LR, Petersen JM, Fox MP. Awareness of and potential for dependent error in the observational epidemiologic literature: A review. Ann Epidemiol. 2019;36:15–9 e12.

Paquette M, Alotaibi AM, Nieuwlaat R, Santesso N, Mbuagbaw L. A meta-epidemiological study of subgroup analyses in cochrane systematic reviews of atrial fibrillation. Syst Rev. 2019;8(1):241.

Acknowledgements

This work did not receive any dedicated funding.

Author information

Authors and affiliations

Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada

Lawrence Mbuagbaw, Daeria O. Lawson & Lehana Thabane

Biostatistics Unit/FSORC, 50 Charlton Avenue East, St Joseph’s Healthcare—Hamilton, 3rd Floor Martha Wing, Room H321, Hamilton, Ontario, L8N 4A6, Canada

Lawrence Mbuagbaw & Lehana Thabane

Centre for the Development of Best Practices in Health, Yaoundé, Cameroon

Lawrence Mbuagbaw

Center for Evidence-Based Medicine and Health Care, Catholic University of Croatia, Ilica 242, 10000, Zagreb, Croatia

Livia Puljak

Department of Epidemiology and Biostatistics, School of Public Health – Bloomington, Indiana University, Bloomington, IN, 47405, USA

David B. Allison

Departments of Paediatrics and Anaesthesia, McMaster University, Hamilton, ON, Canada

Lehana Thabane

Centre for Evaluation of Medicine, St. Joseph’s Healthcare-Hamilton, Hamilton, ON, Canada

Population Health Research Institute, Hamilton Health Sciences, Hamilton, ON, Canada

Contributions

LM conceived the idea and drafted the outline and paper. DOL and LT commented on the idea and draft outline. LM, LP and DOL performed literature searches and data extraction. All authors (LM, DOL, LT, LP, DBA) reviewed several draft versions of the manuscript and approved the final manuscript.

Corresponding author

Correspondence to Lawrence Mbuagbaw .

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Competing interests

DOL, DBA, LM, LP and LT are involved in the development of a reporting guideline for methodological studies.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Mbuagbaw, L., Lawson, D.O., Puljak, L. et al. A tutorial on methodological studies: the what, when, how and why. BMC Med Res Methodol 20 , 226 (2020). https://doi.org/10.1186/s12874-020-01107-7

Received : 27 May 2020

Accepted : 27 August 2020

Published : 07 September 2020

DOI : https://doi.org/10.1186/s12874-020-01107-7

Keywords: Methodological study; Meta-epidemiology; Research methods; Research-on-research

What Is a Research Methodology? | Steps & Tips

Published on August 25, 2022 by Shona McCombes and Tegan George. Revised on November 20, 2023.

Your research methodology discusses and explains the data collection and analysis methods you used in your research. A key part of your thesis, dissertation , or research paper , the methodology chapter explains what you did and how you did it, allowing readers to evaluate the reliability and validity of your research and your dissertation topic .

It should include:

  • The type of research you conducted
  • How you collected and analyzed your data
  • Any tools or materials you used in the research
  • How you mitigated or avoided research biases
  • Why you chose these methods
Note that your methodology section should generally be written in the past tense. Academic style guides in your field may provide detailed guidelines on what to include for different types of studies, and your citation style might provide guidelines for your methodology section (e.g., an APA Style methods section).

Table of contents

  • How to write a research methodology
  • Why is a methods section important
  • Step 1: Explain your methodological approach
  • Step 2: Describe your data collection methods
  • Step 3: Describe your analysis method
  • Step 4: Evaluate and justify the methodological choices you made
  • Tips for writing a strong methodology chapter
  • Other interesting articles
  • Frequently asked questions about methodology

Your methods section is your opportunity to share how you conducted your research and why you chose the methods you chose. It’s also the place to show that your research was rigorously conducted and can be replicated .

It gives your research legitimacy and situates it within your field, and also gives your readers a place to refer to if they have any questions or critiques in other sections.

You can start by introducing your overall approach to your research. You have two options here.

Option 1: Start with your “what”

What research problem or question did you investigate?

  • Aim to describe the characteristics of something?
  • Explore an under-researched topic?
  • Establish a causal relationship?

And what type of data did you need to achieve this aim?

  • Quantitative data , qualitative data , or a mix of both?
  • Primary data collected yourself, or secondary data collected by someone else?
  • Experimental data gathered by controlling and manipulating variables, or descriptive data gathered via observations?

Option 2: Start with your “why”

Depending on your discipline, you can also start with a discussion of the rationale and assumptions underpinning your methodology. In other words, why did you choose these methods for your study?

  • Why is this the best way to answer your research question?
  • Is this a standard methodology in your field, or does it require justification?
  • Were there any ethical considerations involved in your choices?
  • What are the criteria for validity and reliability in this type of research ? How did you prevent bias from affecting your data?

Once you have introduced your reader to your methodological approach, you should share full details about your data collection methods .

Quantitative methods

In order to be considered generalizable, you should describe quantitative research methods in enough detail for another researcher to replicate your study.

Here, explain how you operationalized your concepts and measured your variables. Discuss your sampling method or inclusion and exclusion criteria , as well as any tools, procedures, and materials you used to gather your data.

Surveys Describe where, when, and how the survey was conducted.

  • How did you design the questionnaire?
  • What form did your questions take (e.g., multiple choice, Likert scale )?
  • Were your surveys conducted in-person or virtually?
  • What sampling method did you use to select participants?
  • What was your sample size and response rate?

Experiments Share full details of the tools, techniques, and procedures you used to conduct your experiment.

  • How did you design the experiment ?
  • How did you recruit participants?
  • How did you manipulate and measure the variables ?
  • What tools did you use?

Existing data Explain how you gathered and selected the material (such as datasets or archival data) that you used in your analysis.

  • Where did you source the material?
  • How was the data originally produced?
  • What criteria did you use to select material (e.g., date range)?

The survey consisted of 5 multiple-choice questions and 10 questions measured on a 7-point Likert scale.

The goal was to collect survey responses from 350 customers visiting the fitness apparel company’s brick-and-mortar location in Boston on July 4–8, 2022, between 11:00 and 15:00.

Here, a customer was defined as a person who had purchased a product from the company on the day they took the survey. Participants were given 5 minutes to fill in the survey anonymously. In total, 408 customers responded, but not all surveys were fully completed. Due to this, 371 survey results were included in the analysis.

  • Information bias
  • Omitted variable bias
  • Regression to the mean
  • Survivorship bias
  • Undercoverage bias
  • Sampling bias

Qualitative methods

In qualitative research , methods are often more flexible and subjective. For this reason, it’s crucial to robustly explain the methodology choices you made.

Be sure to discuss the criteria you used to select your data, the context in which your research was conducted, and the role you played in collecting your data (e.g., were you an active participant, or a passive observer?)

Interviews or focus groups Describe where, when, and how the interviews were conducted.

  • How did you find and select participants?
  • How many participants took part?
  • What form did the interviews take ( structured , semi-structured , or unstructured )?
  • How long were the interviews?
  • How were they recorded?

Participant observation Describe where, when, and how you conducted the observation or ethnography .

  • What group or community did you observe? How long did you spend there?
  • How did you gain access to this group? What role did you play in the community?
  • How long did you spend conducting the research? Where was it located?
  • How did you record your data (e.g., audiovisual recordings, note-taking)?

Existing data Explain how you selected case study materials for your analysis.

  • What type of materials did you analyze?
  • How did you select them?

In order to gain better insight into possibilities for future improvement of the fitness store’s product range, semi-structured interviews were conducted with 8 returning customers.

Here, a returning customer was defined as someone who usually bought products at least twice a week from the store.

Surveys were used to select participants. Interviews were conducted in a small office next to the cash register and lasted approximately 20 minutes each. Answers were recorded by note-taking, and seven interviews were also filmed with consent. One interviewee preferred not to be filmed.

  • The Hawthorne effect
  • Observer bias
  • The placebo effect
  • Response bias and Nonresponse bias
  • The Pygmalion effect
  • Recall bias
  • Social desirability bias
  • Self-selection bias

Mixed methods

Mixed methods research combines quantitative and qualitative approaches. If a standalone quantitative or qualitative study is insufficient to answer your research question, mixed methods may be a good fit for you.

Mixed methods are less common than standalone analyses, largely because they require a great deal of effort to pull off successfully. If you choose to pursue mixed methods, it’s especially important to robustly justify your methods.

Next, you should indicate how you processed and analyzed your data. Avoid going into too much detail: you should not start introducing or discussing any of your results at this stage.

In quantitative research , your analysis will be based on numbers. In your methods section, you can include:

  • How you prepared the data before analyzing it (e.g., checking for missing data , removing outliers , transforming variables)
  • Which software you used (e.g., SPSS, Stata or R)
  • Which statistical tests you used (e.g., two-tailed t test, simple linear regression); a brief illustration in code follows this list
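
The article names SPSS, Stata, and R; as one possible illustration only, the sketch below runs comparable steps in Python with invented data (the groups, scores, and variables are hypothetical).

```python
# Illustrative only: a minimal quantitative analysis in Python; SPSS, Stata,
# or R would work equally well. All data below are invented for demonstration.
from scipy import stats

# Hypothetical satisfaction scores (1-7 Likert) for two customer groups
group_a = [5, 6, 7, 4, 6, 5, 7, 6]
group_b = [4, 5, 3, 5, 4, 6, 4, 5]

# Step 1: simple data preparation - drop missing values coded as None
group_a = [x for x in group_a if x is not None]
group_b = [x for x in group_b if x is not None]

# Step 2: two-tailed independent-samples t test
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Step 3: simple linear regression (e.g., spend as a function of visits)
visits = [1, 2, 3, 4, 5, 6, 7, 8]
spend = [20, 25, 31, 34, 42, 44, 50, 55]
result = stats.linregress(visits, spend)
print(f"slope = {result.slope:.2f}, R^2 = {result.rvalue**2:.2f}")
```

Whichever software you use, report the exact tests, options, and software version so that the analysis can be reproduced.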

In qualitative research, your analysis will be based on language, images, and observations (often involving some form of textual analysis ).

Specific methods might include:

  • Content analysis : Categorizing and discussing the meaning of words, phrases and sentences
  • Thematic analysis : Coding and closely examining the data to identify broad themes and patterns
  • Discourse analysis : Studying communication and meaning in relation to their social context

Mixed methods combine the above two research methods, integrating both qualitative and quantitative approaches into one coherent analytical process.

Above all, your methodology section should clearly make the case for why you chose the methods you did. This is especially true if you did not take the most standard approach to your topic. In this case, discuss why other methods were not suitable for your objectives, and show how this approach contributes new knowledge or understanding.

In any case, it should be overwhelmingly clear to your reader that you set yourself up for success in terms of your methodology’s design. Show how your methods should lead to results that are valid and reliable, while leaving the analysis of the meaning, importance, and relevance of your results for your discussion section .

  • Quantitative: Lab-based experiments cannot always accurately simulate real-life situations and behaviors, but they are effective for testing causal relationships between variables .
  • Qualitative: Unstructured interviews usually produce results that cannot be generalized beyond the sample group , but they provide a more in-depth understanding of participants’ perceptions, motivations, and emotions.
  • Mixed methods: Despite issues systematically comparing differing types of data, a solely quantitative study would not sufficiently incorporate the lived experience of each participant, while a solely qualitative study would be insufficiently generalizable.

Remember that your aim is not just to describe your methods, but to show how and why you applied them. Again, it’s critical to demonstrate that your research was rigorously conducted and can be replicated.

1. Focus on your objectives and research questions

The methodology section should clearly show why your methods suit your objectives and convince the reader that you chose the best possible approach to answering your problem statement and research questions .

2. Cite relevant sources

Your methodology can be strengthened by referencing existing research in your field. This can help you to:

  • Show that you followed established practice for your type of research
  • Discuss how you decided on your approach by evaluating existing research
  • Present a novel methodological approach to address a gap in the literature

3. Write for your audience

Consider how much information you need to give, and avoid getting too lengthy. If you are using methods that are standard for your discipline, you probably don’t need to give a lot of background or justification.

Regardless, your methodology should be a clear, well-structured text that makes an argument for your approach, not just a list of technical details and procedures.

If you want to know more about statistics , methodology , or research bias , make sure to check out some of our other articles with explanations and examples.

Statistics

  • Normal distribution
  • Measures of central tendency
  • Chi square tests
  • Confidence interval
  • Quartiles & Quantiles

Methodology

  • Cluster sampling
  • Stratified sampling
  • Thematic analysis
  • Cohort study
  • Peer review
  • Ethnography

Research bias

  • Implicit bias
  • Cognitive bias
  • Conformity bias
  • Hawthorne effect
  • Availability heuristic
  • Attrition bias

Methodology refers to the overarching strategy and rationale of your research project . It involves studying the methods used in your field and the theories or principles behind them, in order to develop an approach that matches your objectives.

Methods are the specific tools and procedures you use to collect and analyze data (for example, experiments, surveys , and statistical tests ).

In shorter scientific papers, where the aim is to report the findings of a specific study, you might simply describe what you did in a methods section .

In a longer or more complex research project, such as a thesis or dissertation , you will probably include a methodology section , where you explain your approach to answering the research questions and cite relevant sources to support your choice of methods.

In a scientific paper, the methodology always comes after the introduction and before the results , discussion and conclusion . The same basic structure also applies to a thesis, dissertation , or research proposal .

Depending on the length and type of document, you might also include a literature review or theoretical framework before the methodology.

Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to systematically measure variables and test hypotheses . Qualitative methods allow you to explore concepts and experiences in more detail.

Reliability and validity are both about how well a method measures something:

  • Reliability refers to the  consistency of a measure (whether the results can be reproduced under the same conditions).
  • Validity   refers to the  accuracy of a measure (whether the results really do represent what they are supposed to measure).

If you are doing experimental research, you also have to consider the internal and external validity of your experiment.

A sample is a subset of individuals from a larger population . Sampling means selecting the group that you will actually collect data from in your research. For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.

In statistics, sampling allows you to test a hypothesis about the characteristics of a population.
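
As a small, hypothetical illustration of estimating a population characteristic from a sample (the numbers are invented), you could compute a sample proportion and its 95% confidence interval like this:

```python
# Illustrative only: estimating a population proportion from a sample.
# Hypothetical data: 100 students surveyed, 62 agree with a statement.
from math import sqrt

n = 100          # sample size
agree = 62       # hypothetical number who agree
p_hat = agree / n

# 95% confidence interval using the normal approximation
margin = 1.96 * sqrt(p_hat * (1 - p_hat) / n)
print(f"Sample proportion: {p_hat:.2f}, "
      f"95% CI: {p_hat - margin:.2f} to {p_hat + margin:.2f}")
```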

Cite this Scribbr article

McCombes, S. & George, T. (2023, November 20). What Is a Research Methodology? | Steps & Tips. Scribbr. Retrieved August 12, 2024, from https://www.scribbr.com/dissertation/methodology/

Perspect Clin Res. 2018 Apr-Jun;9(2).

Research methodology workshops: A small step towards practice of evidence-based medicine

Nithya Jaideep Gogtay

Department of Clinical Pharmacology, Seth GS Medical College and KEM Hospital, Mumbai, Maharashtra, India

A recent issue of a leading journal celebrated advances in the field of infectious diseases in the past 17 years. Some examples quoted were genome sequencing for real-time monitoring of the development of artemisinin–piperaquine resistance, the substantially improved outlook for people living with HIV-AIDS (including access to drugs), and vaccines to control the Ebola virus outbreak.[ 1 ] A recurring theme among all three examples is research, the quality of research, and methodology used therein. From the early controlled studies by James Lind for scurvy,[ 2 ] to the randomized study of the use of streptomycin for tuberculosis,[ 3 ] to the stepped wedge design (a study design that permits rigorous scientific evaluation albeit under logistic constraints),[ 4 ] or the basket and umbrella designs in cancer precision medicine,[ 5 ] the nature, repertoire, and complexity of scientific methodology has only grown. Given this, researchers and clinician researchers need to keep abreast of these changes to interpret and use evidence-based medicine (EBM) effectively. One way of doing this is through attending research methodology training programs.

In the current issue of the journal, Shrivastava et al. conducted a study that evaluated 153 heterogeneous participants who attended a 4-day basic research methodology workshop.[ 6 ] Participants filled out a pretest and a posttest questionnaire, with a significant improvement being seen in the latter scores. The authors concluded that research methodology workshops were a useful way to improve both knowledge and awareness about research. They also acknowledged that the assessment conducted by them was not long term.

A research methodology workshop can be likened to the more familiar continuing medical education (CME) program at one level. The goal of CMEs is to primarily plug gaps in knowledge over the duration that it is conducted.[ 7 ] Most countries worldwide also use it for recertification of practitioners. CMEs are expected to improve health-care outcomes and are based on a “felt need” of the participant to both maintain and improve clinical performance. Unlike the CME, a participant in a research methodology workshop may not truly feel the need to learn about the research process or even see value in it. This is likely true of at least some postgraduates who are required to mandatorily attend these workshops as part of fulfillment of requirements toward their degrees. However, the tremendous value that these workshops bring can be best understood by understanding the basis of EBM. In the era of EBM that we live in today, evidence gleaned through scientific method forges practice and even changes paradigms. The very basis of EBM is research, more research, and constant research! An altruistic reason for doing research is progress of science and society, but the process also brings great personal satisfaction and peer recognition and contribution to the nation's health needs. So then, should every one of us attend research methodology workshops? And if yes, then what after that? And what kind of training should these workshops really impart?

A good place to begin learning about the need for research and the process involved therein is at the undergraduate level. This helps sow the seed for potential researchers and clinician researchers. It also helps when moving to the postgraduate level, where a thesis is compulsory, rather than being faced with research for the first time at that stage. The Indian Council of Medical Research (ICMR) is at the forefront with short-term studentship (STS) programs for undergraduate medical students.[ 8 ] Dr. MG Deo conducted a series of research and laboratory method workshops between 2010 and 2011 and invited undergraduate students who were ICMR undergraduate STS awardees to participate in them. He noted a 3-fold increase in the proportion of students who “passed” the evaluation test after the workshop.[ 9 ] Several workshops of this nature are now conducted regularly in the country, both at the undergraduate (usually voluntary) and at the postgraduate level.

Will attending workshops beyond the postgraduate level help? Faculty in academic institutions themselves need to be trained so that they become good mentors. Several of us would be testimony to the often-repeated statement, “I only read the conclusion of the paper or I always skip the section on statistics or I only read the abstract” which occurs at several levels both in academia and in private practice. The latter who are usually quicker to implement EBM would find it useful to know why they do what they do. Policymakers need to have a grasp of EBM so as to implement key decisions. It can thus be argued that anyone who enters medicine or is currently practicing it needs to understand at least a little bit about the research process.

The second question of what after is slightly more difficult to address. One does not expect everyone who exits a research methodology workshop to carry out research. It is also likely that much of the knowledge imparted is lost over time. But what a research methodology does is that it sheds light on a path that was hitherto unknown and shows that it can be walked, and if not walked, at the least understood. Solomon et al. [ 10 ] evaluated the impact of National Institutes of Health-sponsored medical student research programs at two medical schools in the United States and looked at both short-term and long-term outcomes. The study showed an interest among the participants in pursuing academic careers. Many were currently engaged in research, presenting papers and publishing their work. The research programs thus appeared to foster the growth of physician - scientists.

Finally, where do we stand with respect to research as a country? The inadequate research output from the country has already been bemoaned,[ 11 ] as has the lack of good-quality public health research.[ 12 ] A recent study on clinical trials in India has shown them not to be commensurate with her health-care needs.[ 13 ] Against this backdrop, training workshops such as the ones conducted by Shrivastava et al. are useful. However, they need to (1) be structured, (2) range from basic to advanced, (3) be spread all over the country with the help of institutions like the ICMR and other major players in research, (4) include e-learning and use technology to reach a wider audience, (5) cater to diverse levels of participants, (6) link EBM to research, (7) include evaluation of long-term outcomes, and (8) motivate participants to learn research as a way to better practice medicine. This, it is hoped, will foster the growth of scientific method and physician-scientists in the country in a small, but significant way.

Training Course in Sexual and Reproductive Health Research 2017, Geneva Foundation for Medical Education and Research


Research methodology and other research related topics

Module coordinators: Tomas Allen, Moazzam Ali, Karim Abawi

The objective of this module is to provide knowledge and skills in scientific writing (research protocol, literature search and article) and to appraise the quality of scientific papers.

The contents of this module can be classified into four parts:

Literature search and referral to biomedical documents

The aim of a literature search is to identify the most relevant sources related to a study topic in order to provide evidence, give background information, place the research in a theoretical context and inform readers about similar research on the topic. A literature review is crucial for the credibility of a paper and serves as an update of professional knowledge in a particular field of expertise.

Referencing is a standardized way of citing the sources of information and ideas you have used that are not your own intellectual property. It is important for verifying citations and allows readers to follow up on what you have written and locate the cited authors’ work.

Epidemiologic studies

This section provides basic knowledge of the different epidemiologic studies used to conduct research projects. It covers methods for observational studies (analytic and descriptive), such as cross-sectional, screening, cohort and case-control studies, as well as experimental studies such as randomized trials.

Research protocol development

This section will show you how to develop a research protocol according to a WHO recommended format. This section also contains a special course on research ethics. Participants who take this course will receive a special research ethics certificate that is required for doing research in some countries.

Additional research related topics

This section contains topics related to research methodology and project implementation in the field of sexual and reproductive health, such as monitoring and evaluation, WHO guidelines and gender issues.

The core module on research methodology and related research topics is integrated with other specific modules.

The participants are required to read and consult the contents of this module for assignments and final research projects.

  • Research methodology and other research related topics - Course files
  • International Committee of Medical Journal Editors (ICMJE)
  • Research Reporting Guidelines and Initiatives: By Organization
  • Citing Medicine -- NCBI Bookshelf
  • CONSORT - Consolidated Standards of Reporting Trials
  • GRADE (Grading of Recommendations Assessment, Development and Evaluation) working group

swayam-logo

Research Methodology

--> --> --> --> --> --> --> --> --> --> --> --> --> --> --> --> --> -->

Note: This exam date is subjected to change based on seat availability. You can check final exam date on your hall ticket.

Instructor: Prof. Soumitro Banerjee


IEEE Communications Society Bangalore Chapter


Research Methodology and IEEE AuthorshipLab Workshop

IEEE ComSoc has taken this initiative to propagate and conduct the Research Methodology Workshop extensively, to help young researchers do quality research in their fields. The primary focus of this workshop is to improve research quality and to create an environment and culture for research. The contents are unique. The two-day workshop covers all the components required in the research cycle. It has been noticed that the quality of our research and published papers is, in many cases, not up to standard. We do not blame only young scholars who are new to this: in discussions with senior professionals, we have found that even experienced researchers often do not pay attention to this, and perhaps young researchers are not being guided properly. The workshop addresses all of these issues and is therefore useful for all. Normally, 3-4 workshops are planned every year, including outside Bangalore. Organizations and institutions that want to host the workshop are requested to contact Dr Navin. For industry and organizations, a one-day package is available; for academic institutions, two days are suggested. The contents are normally as follows:
You can also download the details here: ResearchMethodologyWorkshop_Details_FinalJuly2018
Workshop objectives: At the end of this workshop, the audience should be able to:

  • understand some basic concepts of research and its methodologies
  • identify appropriate research topics
  • select and define an appropriate research problem and parameters
  • prepare a project (thesis) proposal
  • organize and conduct research (advanced project) in a more appropriate manner
  • write a research report, article/paper, and thesis
  • write a research proposal (grants)
  • review an article and submit the report (peer review process)

The workshop is divided into 4 modules (four sessions, each of 3-3.5 hours):

  • Module 1 – Identifying a problem (Problem Definition)
  • Module 2 – Tackling and addressing the problem (Doing Research)
  • Module 3 – Reporting the research (report/thesis writing and publishing; includes IEEE AuthorshipLab)
  • Module 4 – Peer Review Process and Proposal Writing

Institutions and organizations interested in hosting the workshop are requested to contact Dr Navin Kumar at navinkumar@ieee.org. For industry, a one-day workshop is packaged with the main focus and emphasis on writing and publishing.

Tentative schedule:

MODULE 1 – Defining a Research Problem (Day 1, 09:00-13:00)

  • [1] Overview of research and its methodologies: concepts of research; the need for research; types of research; steps in conducting research
  • [2] Literature review: what is a literature review; why the need for a literature review; how to carry out a literature review
  • [3] Selecting and defining a research problem: problem formulation and why it is needed; criteria for selecting a problem; identifying variables; evaluating problems; functions of a hypothesis

MODULE 2 – Approach to Solve the Problem (Conducting Research) (Day 1, 14:00-17:30)

  • [4] Conducting the research: research activities; preparations before conducting your research
  • [5] Examples of research at the university: differences between postgraduate and undergraduate research; research at the postgraduate level (PhD and MSc); research at the undergraduate level (BSc); preparations for an undergraduate final-year project

MODULE 3 – Publishing/Reporting Research Work (includes writing a research report or thesis) (Day 2, 09:00-13:00)

  • While describing publication, copyright comes up, so copyright and patenting can be discussed, including related terms such as copyright transfer, patent vs. copyright, and including co-authors
  • [6] Writing research reports and theses, and writing a research paper: why the need to write papers and reports; writing a research report; writing a technical paper; IEEE AuthorshipLab; thesis writing (contents of a thesis; case study)

MODULE 4 – Reviewing an Article and Proposal Writing (Peer Review) (Day 2, 14:00-17:00)

  • Peer review: what and why (importance); review process; plagiarism; review submission

Contact – Dr Navin Kumar @ navinkumar@ieee.org

=====================

The 7th workshop was organized at Christ University, Lavasa, Pune, on 13-14 September 2019. Over 50 faculty members and research scholars attended the sessions. They appreciated the workshop contents, and the feedback was excellent.

6th edition (13-14 April 2019), sponsored by ComSoc and CMRIT, at CMRIT, ITPL Road, Kundanhalli, Bangalore 560037. The workshop concluded with regular attendance of over 75 faculty members and scholars. All of them attended all the modules and enjoyed the sessions. Some of the faculty members and professionals with several years of industry experience gave excellent feedback for this practical, example-based workshop.

The Research Methodology Workshop aims to improve research ability, paper quality, and paper-writing skills; to build awareness and adoption of professionalism; to give scholars and professionals the confidence and ability to search for and define problems; to help them understand the importance of peer review and write review comments; and more.

The IEEE ComSoc Bangalore Chapter came up with a Research Methodology Workshop Series event plan, which is much needed. At least 2-3 events per year will be conducted in and around Bangalore.

The July edition was jointly sponsored by the IEEE Bangalore Section and the IEEE ComSoc Bangalore Chapter, and was scheduled for 8-9 July 2017 at Amrita School of Engineering, Kasavanahalli, Bangalore 560035.

You can find some details here:

The IEEE ComSoc Chapter and IEEE Bangalore Section jointly organized a two-day (8-9 July 2017) Workshop on Research Methodology, with unique contents, 4 sessions, and 4 experts (like a tutorial session).
Limited seats! Register here: https://in.explara.com/e/research-methodology-workshop
Registration includes lunch, tea/coffee/biscuits/snacks, and a soft copy of the certificate. Details – download the schedule: ResearchMethodologyWorkshop_nkv3. Venue: Amrita School of Engineering, Kasavanahalli, Bangalore – 560035

================================================================

Previous Workshop

Two-day workshop on research methodology, 8-9 May 2017, at CMRIT, ITPL Road.

COMMENTS

  1. Top 100 Research Methodology Project Topics

    Explore the world of research methodology. Choose the right approach for your research methodology project topics.

  2. PDF Research Methodology Seminars

    Research Methodology Seminars Doctoral candidates complete four research methodology seminars. These seminars enable you to design, conduct, and evaluate business research. All candidates must complete a graduate elementary statistics course within three years before or one year after starting the doctoral program.

  3. Webinar on Research Methodology: A Guide For Beginners

    How to write the topic, research objective(s), and research question(s); why research methods are essential; what "research methodology" is; the data collection process; how to do data analysis; how to write the result and conclusion parts; how to write the discussion part; how to write limitations and recommendations.

  4. Research Methodologies

    This course focuses on research methodologies. In this vein, the focus will be placed on qualitative and quantitative research methodologies, sampling approaches, and primary and secondary data collection. The course begins with a discussion on qualitative research approaches, looking at focus groups, personal interviews, ethnography, case ...

  5. What Is Research Methodology? Definition + Examples

    Learn exactly what a research methodology is with Grad Coach's plain language, easy-to-understand explanation, including examples and videos.

  6. Doctoral Seminar in Research Methods I

    This course is designed to lay the foundations of good empirical research in the social sciences. It does not deal with specific techniques per se, but rather with the assumptions and the logic underlying social research. Students become acquainted with a variety of approaches to research design, and are helped to develop their own research projects and to evaluate the products of empirical ...

  7. Free Webinar: Research Methodology 101

    Learn the basics of research methodology. We explain what it is, the core components, and how to get started (with plenty of examples).

  8. PDF Research Seminar on Methodology

    The research question is the first step in establishing a focus on our area of research interest and addressing a specific issue of our topic with relevance in the real world and the academic domain through a well-defined question that will guide our research.

  9. Five Day Workshop on Research Methodology

    The aim of the Five Day Workshop on 'Research Methodology' is to familiarize the participants with both social science research approaches - qualitative and quantitative.

  10. PDF J380ResearchmethodssyllabusF13

    Describe and compare the major quantitative and qualitative research methods in mass communication research. Propose a research study and justify the theory as well as the methodological decisions, including sampling and measurement. Understand the importance of research ethics and integrate research ethics into the research process.

  11. Understanding Research Methods

    This MOOC is about demystifying research and research methods. It will outline the fundamentals of doing research, aimed primarily, but not exclusively, at the postgraduate level. It places the student experience at the centre of our endeavours by engaging learners in a range of robust and challenging discussions and exercises befitting SOAS ...

  12. A tutorial on methodological studies: the what, when, how and why

    Background Methodological studies - studies that evaluate the design, analysis or reporting of other research-related reports - play an important role in health research. They help to highlight issues in the conduct of research with the aim of improving health research methodology, and ultimately reducing research waste. Main body We provide an overview of some of the key aspects of ...

  13. What Is a Research Methodology?

    Your research methodology discusses and explains the data collection and analysis methods you used in your research. A key part of your thesis, dissertation, or research paper, the methodology chapter explains what you did and how you did it, allowing readers to evaluate the reliability and validity of your research and your dissertation topic.

  14. PDF A Handbook for Research Seminar Tutors

    ...research methods that Research Seminars are meant to offer. Research Seminars meet once per week as a group, in a single two-hour class meeting presided over by the course head. This meeting will be dedicated to the Seminar's topical content, as explored through shared readings of ...

  15. Research methodology workshops: A small step towards practice of evidence-based medicine

    Research methodology workshops: A small step towards practice of evidence-based medicine. A recent issue of a leading journal celebrated advances in the field of infectious diseases in the past 17 years. Some examples quoted were genome sequencing for real-time monitoring of the development of artemisinin-piperaquine resistance, the ...

  16. PDF A Doctoral Seminar in Qualitative Research Methods: Lessons Learned

    In the qualitative methods seminar for doctoral graduate students, the concept of paradigms continued to plague students' ability to absorb qualitative methods text references and articles. Within paradigms, the theme of "common language" was noted as a barrier to embracing philosophy as part of research.

  17. (PDF) Technical Seminar on Research Methodology

    PDF | On Jun 29, 2019, Shamim Akhter published Technical Seminar on Research Methodology | Find, read and cite all the research you need on ResearchGate

  18. Research Seminar Definition & Importance

    Learn about research seminars. Understand what a seminar is, learn the differences between seminars and workshops and see a comparison of...

  19. Research methodology and other research related topics

    Additional research related topics: This section contains topics related to research methodology and project implementation in the field of sexual and reproductive health, such as monitoring and evaluation, WHO guidelines and gender issues. The core module on research methodology and related research topics is integrated with other specific modules.

  20. Research Methodology

    About the Course: The course covers all the conceptual and methodological issues that go into the successful conduct of research. That includes the philosophy of science, methodological issues in measurement, proposing and testing hypotheses, scientific communication, and the ethical issues in the practice of science.

  21. Research Methodology and IEEE AuthorshipLab Workshop

    Workshop Objectives: At the end of this workshop, the audience should be able to: understand some basic concepts of research and its methodologies identify appropriate research topics select and define appropriate research problem and parameters prepare a project (thesis) proposal (to undertake a project/thesis) organize and conduct research (advanced project) in a more appropriate manner ...

  22. PDF Research Seminar Report

    This session focused on unpacking the scope, aims, conceptual framework, methodology, and preliminary research findings of each pilot research undertaken including the lessons learnt at national levels.

  23. Research Seminar "Data Processing and Research Methodology"

    Abstract The research seminar aims at: the formation of basic practical skills in working with scientific literature, bibliography, reference books, databases; processing of research results; writing a scientific text, preparing an oral presentation; carrying out career-oriented work among students, allowing them to choose the direction and topic of research as part of the course work in the ...

Course details (SWAYAM):

  • Course Status: Completed
  • Course Type: Elective
  • Duration: 12 weeks
  • Credit Points: 3
  • Level: Postgraduate
  • Start Date: 24 Jan 2022
  • End Date: 06 May 2022
  • Enrollment Ends: 07 Feb 2022
  • Exam Date: 23 Apr 2022 (IST)