SPH Writing Support Services: Critiquing Research Articles

Recommended readings:
  • Conducting an article critique for a quantitative research study: Perspectives for doctoral students and other novice readers (Vance et al.)
  • Critique Process (Boswell & Cannon)
  • The experience of critiquing published research: Learning from the student and researcher perspective (Knowles & Gray)
  • A guide to critiquing a research paper. Methodological appraisal of a paper on nurses in abortion care (Lipp & Fothergill)
  • Step-by-step guide to critiquing research. Part 1: Quantitative research (Coughlan et al.)
  • Step-by-step guide to critiquing research. Part 2: Qualitative research (Coughlan et al.)

Guidelines:

  • Critiquing Research Articles (Flinders University)
  • Framework for How to Read and Critique a Research Study (American Nurses Association)
  • How to Critique a Journal Article (UIS)
  • How to Critique a Research Paper (University of Michigan)
  • How to Write an Article Critique
  • Research Article Critique Form
  • Writing a Critique or Review of a Research Article (University of Calgary)

Presentations:

  • The Critique Process: Reviewing and Critiquing Research
  • Writing a Critique



Conducting an article critique for a quantitative research study: perspectives for doctoral students and other novice readers


Authors: Vance DE, Talley M, Azuero A, Pearce PF, Christian BJ

Received 29 January 2013

Accepted for publication 12 March 2013

Published 22 April 2013 in Nursing: Research and Reviews, Volume 2013:3, Pages 67–75

DOI: https://doi.org/10.2147/NRR.S43374

Checked for plagiarism: Yes

Review by: Single anonymous peer review

Peer reviewer comments: 2

David E Vance,1 Michele Talley,1 Andres Azuero,1 Patricia F Pearce,2 Becky J Christian1

1 School of Nursing, University of Alabama at Birmingham, Birmingham, AL, USA; 2 Loyola University School of Nursing, New Orleans, LA, USA

Abstract: The ability to critically evaluate the merits of a quantitative research article is a necessary skill for practitioners and researchers of all disciplines, including nursing, in order to judge the integrity and usefulness of the evidence and conclusions in an article. This skill is generally automatic for practitioners and researchers who already possess a good working knowledge of research methodology, including hypothesis development, sampling techniques, study design, testing procedures and instrumentation, data collection and data management, statistics, and interpretation of findings. For graduate students and junior faculty who have yet to master these skills, completing a formally written article critique can be a useful way to hone them; however, a fundamental knowledge of research methods is still needed to be successful. Because there are few published examples of such critiques, this article presents the practical points of conducting a formally written critique of a quantitative research article, along with a brief example to demonstrate the principles and form.

Keywords: quantitative article critique, statistics, methodology, graduate students

License: Creative Commons (CC BY-NC 3.0)


Critical Appraisal of Quantitative Research

Living reference work entry (a later version is available). First Online: 27 February 2018

Authors: Rocco Cavaleri, Sameer Bhole & Amit Arora

Critical appraisal skills are important for anyone wishing to make informed decisions or improve the quality of healthcare delivery. A good critical appraisal provides information regarding the believability and usefulness of a particular study. However, the appraisal process is often overlooked, and critically appraising quantitative research can be daunting for both researchers and clinicians. This chapter introduces the concept of critical appraisal and highlights its importance in evidence-based practice. Readers are then introduced to the most common quantitative study designs and key questions to ask when appraising each type of study. These studies include systematic reviews, experimental studies (randomized controlled trials and non-randomized controlled trials), and observational studies (cohort, case-control, and cross-sectional studies). This chapter also provides the tools most commonly used to appraise the methodological and reporting quality of quantitative studies. Overall, this chapter serves as a step-by-step guide to appraising quantitative research in healthcare settings.


Author information

Authors and affiliations

School of Science and Health, Western Sydney University, Sydney, NSW, Australia

Rocco Cavaleri & Amit Arora

Faculty of Dentistry, The University of Sydney, Surry Hills, NSW, Australia

Sameer Bhole

Discipline of Child and Adolescent Health, Sydney Medical School, Sydney, NSW, Australia

Oral Health Services, Sydney Local Health District and Sydney Dental Hospital, NSW Health, Sydney, NSW, Australia


Corresponding author

Correspondence to Rocco Cavaleri.

Editor information

Editors and affiliations

School of Science & Health, Western Sydney University, Locked Bag 1797, Penrith, New South Wales, Australia

Pranee Liamputtong


Copyright information

© 2018 Springer Nature Singapore Pte Ltd.

About this entry

Cite this entry: Cavaleri, R., Bhole, S., Arora, A. (2018). Critical Appraisal of Quantitative Research. In: Liamputtong, P. (eds) Handbook of Research Methods in Health Social Sciences. Springer, Singapore.

DOI: https://doi.org/10.1007/978-981-10-2779-6_120-1

Received: 20 January 2018
Accepted: 12 February 2018
Published: 27 February 2018
Publisher Name: Springer, Singapore
Print ISBN: 978-981-10-2779-6
Online ISBN: 978-981-10-2779-6


Chapter history

DOI: https://doi.org/10.1007/978-981-10-2779-6_120-2

DOI: https://doi.org/10.1007/978-981-10-2779-6_120-1



Step-by-step guide to critiquing research. Part 1: quantitative research (Coughlan et al.)

Affiliation: School of Nursing and Midwifery, University of Dublin, Trinity College, Dublin.

PMID: 17577184. DOI: 10.12968/bjon.2007.16.11.23681

When caring for patients, it is essential that nurses use current best practice. To determine what this is, nurses must be able to read research critically. But for many qualified and student nurses, the terminology used in research can be difficult to understand, making critical reading even more daunting. Because nursing care should have its foundations in sound research, all nurses need the ability to critically appraise research in order to identify what is best practice. This article is a step-by-step approach to critiquing quantitative research to help nurses demystify the process and decode the terminology.


Similar articles

  • Critiquing research for use in practice. Dale JC. J Pediatr Health Care. 2005 May-Jun;19(3):183-6. doi: 10.1016/j.pedhc.2005.02.004. PMID: 15867836.
  • Step-by-step guide to critiquing research. Part 2: Qualitative research. Ryan F, Coughlan M, Cronin P. Br J Nurs. 2007 Jun 28-Jul 11;16(12):738-44. doi: 10.12968/bjon.2007.16.12.23726. PMID: 17851363.
  • Presenting research to clinicians: strategies for writing about research findings. Oermann MH, Galvin EA, Floyd JA, Roop JC. Nurse Res. 2006;13(4):66-74. doi: 10.7748/nr2006.07.13.4.66.c5990. PMID: 16897941.
  • Undertaking a literature review: a step-by-step approach. Cronin P, Ryan F, Coughlan M. Br J Nurs. 2008 Jan 10-23;17(1):38-43. doi: 10.12968/bjon.2008.17.1.28059. PMID: 18399395.
  • Reflections on how to write and organise a research thesis. Hardy S, Ramjeet J. Nurse Res. 2005;13(2):27-39. doi: 10.7748/nr.13.2.27.s5. PMID: 16416978.


How to appraise quantitative research

Evidence-Based Nursing, Volume 21, Issue 4

This article has a correction. Please see:

  • Correction: How to appraise quantitative research - April 01, 2019


Xabi Cathala,1 Calvin Moorley2

1 Institute of Vocational Learning, School of Health and Social Care, London South Bank University, London, UK
2 Nursing Research and Diversity in Care, School of Health and Social Care, London South Bank University, London, UK

Correspondence to Mr Xabi Cathala, Institute of Vocational Learning, School of Health and Social Care, London South Bank University, London, UK; cathalax{at}lsbu.ac.uk and Dr Calvin Moorley, Nursing Research and Diversity in Care, School of Health and Social Care, London South Bank University, London SE1 0AA, UK; Moorleyc{at}lsbu.ac.uk

https://doi.org/10.1136/eb-2018-102996


Introduction

Some nurses feel that they lack the necessary skills to read a research paper and to then decide if they should implement the findings into their practice. This is particularly the case when considering the results of quantitative research, which often contains the results of statistical testing. However, nurses have a professional responsibility to critique research to improve their practice, care and patient safety. 1 This article provides a step-by-step guide on how to critically appraise a quantitative paper.

Title, keywords and the authors

The authors’ names may not mean much, but knowing the following will be helpful:

Their position, for example, academic, researcher or healthcare practitioner.

Their qualification, both professional, for example, a nurse or physiotherapist and academic (eg, degree, masters, doctorate).

This can indicate how the research has been conducted and the authors’ competence on the subject. Basically, do you want to read a paper on quantum physics written by a plumber?

The abstract is a summary of the article and should contain:

Introduction.

Research question/hypothesis.

Methods including sample design, tests used and the statistical analysis (of course! Remember we love numbers).

Main findings.

Conclusion.

The subheadings in the abstract will vary depending on the journal. An abstract should not usually be more than 300 words but this varies depending on specific journal requirements. If the above information is contained in the abstract, it can give you an idea about whether the study is relevant to your area of practice. However, before deciding if the results of a research paper are relevant to your practice, it is important to review the overall quality of the article. This can only be done by reading and critically appraising the entire article.

The introduction

The introduction should state the research question and, in quantitative studies, the hypothesis and null hypothesis being tested. Example: the effect of paracetamol on levels of pain.

My hypothesis is that A has an effect on B, for example, paracetamol has an effect on levels of pain.

My null hypothesis is that A has no effect on B, for example, paracetamol has no effect on pain.

My study will test the null hypothesis. If the null hypothesis is retained (not rejected), the hypothesis is not supported (A has no effect on B): paracetamol has no effect on the level of pain. If the null hypothesis is rejected, the hypothesis is supported (A has an effect on B): paracetamol has an effect on the level of pain.
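To make this decision rule concrete, below is a minimal sketch in Python (assuming SciPy is installed; the pain scores are invented for illustration and are not data from any study):

# Two-sample t-test of the null hypothesis "paracetamol has no effect on pain".
# Pain is scored 0-10; lower means less pain. Data are invented for illustration.
from scipy import stats

pain_paracetamol = [3, 4, 2, 5, 3, 4, 2, 3]  # treatment group
pain_placebo = [6, 5, 7, 5, 6, 4, 7, 6]      # control group

t_stat, p_value = stats.ttest_ind(pain_paracetamol, pain_placebo)

alpha = 0.05  # significance level, specified in advance
if p_value < alpha:
    print(f"p = {p_value:.4f}: reject the null hypothesis (significant difference)")
else:
    print(f"p = {p_value:.4f}: fail to reject the null hypothesis (no significant difference)")

Here, rejecting the null hypothesis corresponds to a statistically significant difference in pain between the groups; the same logic applies whatever the variables A and B happen to be.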

Background/literature review

The literature review should include reference to recent and relevant research in the area. It should summarise what is already known about the topic and why the research study is needed and state what the study will contribute to new knowledge. 5 The literature review should be up to date, usually 5–8 years, but it will depend on the topic and sometimes it is acceptable to include older (seminal) studies.

Methodology

In quantitative studies, the data analysis varies depending on the type of design used; descriptive, correlational and experimental studies all differ. A descriptive study will describe the pattern of a topic related to one or more variables. 6 A correlational study examines the link (correlation) between two variables 7 and focuses on how one variable reacts to a change in another. In experimental studies, the researchers manipulate variables and look at outcomes, 8 and the sample is commonly assigned into different groups (known as randomisation) to determine the effect (causal) of a condition (independent variable) on a certain outcome. This is a common method used in clinical trials.
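As a small illustration of a correlational analysis, the sketch below computes a Pearson correlation coefficient in Python (SciPy assumed; the data are invented for illustration):

# Correlational analysis: how strongly are two variables linearly related?
# Data are invented for illustration.
from scipy import stats

hours_exercise = [0, 1, 2, 3, 4, 5, 6]     # hours of exercise per week
resting_hr = [78, 76, 74, 71, 69, 66, 64]  # resting heart rate (bpm)

r, p_value = stats.pearsonr(hours_exercise, resting_hr)
print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")
# r near -1 or +1 indicates a strong negative or positive linear association;
# correlation alone does not establish causation.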

There should be sufficient detail provided in the methods section for you to replicate the study (should you want to). To enable you to do this, the following sections are normally included:

Overview and rationale for the methodology.

Participants or sample.

Data collection tools.

Methods of data analysis.

Ethical issues.

Data collection should be clearly explained and the article should discuss how this process was undertaken. Data collection should be systematic, objective, precise, repeatable, valid and reliable. Any tool (eg, a questionnaire) used for data collection should have been piloted (or pretested and/or adjusted) to ensure the quality, validity and reliability of the tool. 9 The participants (the sample) and any randomisation technique used should be identified. The sample size is central in quantitative research, as the findings should be generalisable to the wider population. 10 The data analysis can be done manually, or more complex analyses can be performed using computer software, sometimes with the advice of a statistician. From this analysis, results such as the mode, mean, median, p value and CI are presented in numerical format.
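For readers unfamiliar with these summary statistics, the following minimal sketch computes them using only Python's standard library (the ages are invented for illustration):

# Descriptive statistics with Python's standard library. Data are invented.
import statistics

ages = [21, 25, 25, 30, 34, 41, 25, 30]

print("mean:", statistics.mean(ages))      # arithmetic average
print("median:", statistics.median(ages))  # middle value when sorted
print("mode:", statistics.mode(ages))      # most frequent value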

The author(s) should present the results clearly, whether in graphs, charts or tables alongside some text. You should perform your own critique of the data analysis process; just because a paper has been published does not mean it is perfect, and your findings may differ from the authors'. Through critical analysis, the reader may find an error in the study process that the authors have not seen or highlighted. Such errors can change the result of a study, or turn a study you thought was strong into a weak one. To help you critique a quantitative research paper, some guidance on understanding statistical terminology is provided in table 1.

Table 1. Some basic guidance for understanding statistics (table not reproduced).

Quantitative studies examine the relationship between variables, and the p value illustrates this objectively. 11 If the p value is less than 0.05, the null hypothesis is rejected, the hypothesis is accepted and the study reports a statistically significant difference. If the p value is greater than 0.05, the null hypothesis is retained (strictly, we fail to reject it), the hypothesis is not supported and the study reports no significant difference. As a general rule: p < 0.05, the hypothesis is supported; p > 0.05, it is not.

The CI is usually expressed as a percentage (most often 95%), indicating the level of confidence the reader can have in the result. 12 The confidence level is chosen by the researchers in advance and is not derived from the p value; a 95% CI gives the range of values within which the true effect is likely to lie. The CI and the p value are linked: if a 95% CI for a difference between groups excludes the no-effect value (for example, a difference of zero), the result is statistically significant at the 5% level (p < 0.05); if it includes the no-effect value, the result is not statistically significant. Together, p values and CIs indicate the confidence and robustness of a result.
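As an illustration, the sketch below computes a 95% CI for the difference in means between two groups in Python (NumPy and SciPy assumed; it reuses the invented pain scores from the earlier example):

# 95% confidence interval for a difference in means. Data are invented.
import numpy as np
from scipy import stats

group_a = np.array([3, 4, 2, 5, 3, 4, 2, 3])
group_b = np.array([6, 5, 7, 5, 6, 4, 7, 6])

diff = group_a.mean() - group_b.mean()
se = np.sqrt(group_a.var(ddof=1) / len(group_a) + group_b.var(ddof=1) / len(group_b))
dof = len(group_a) + len(group_b) - 2  # simple approximation for degrees of freedom
t_crit = stats.t.ppf(0.975, dof)       # two-sided critical value at the 95% level

low, high = diff - t_crit * se, diff + t_crit * se
print(f"difference = {diff:.2f}, 95% CI = ({low:.2f}, {high:.2f})")
# If the interval excludes 0 (no difference), the result is significant at p < 0.05.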

Discussion, recommendations and conclusion

The final section of the paper is where the authors discuss their results and link them to other literature in the area (some of which may have been included in the literature review at the start of the paper). This reminds the reader of what is already known, what the study has found and what new information it adds. The discussion should demonstrate how the authors interpreted their results and how they contribute to new knowledge in the area. Implications for practice and future research should also be highlighted in this section of the paper.

A few other areas you may find helpful are:

Limitations of the study.

Conflicts of interest.

Table 2 provides a useful tool to help you apply the learning in this paper to the critiquing of quantitative research papers.

Table 2. Quantitative paper appraisal checklist (table not reproduced).

References

1. Nursing and Midwifery Council, 2015. The code: standard of conduct, performance and ethics for nurses and midwives. https://www.nmc.org.uk/globalassets/sitedocuments/nmc-publications/nmc-code.pdf (accessed 21 Aug 2018).

Competing interests None declared.

Patient consent Not required.

Provenance and peer review Commissioned; internally peer reviewed.

Correction notice This article has been updated since its original publication to update p values from 0.5 to 0.05 throughout.

Linked Articles

  • Correction: How to appraise quantitative research. BMJ Publishing Group Ltd and RCN Publishing Company Ltd. Evidence-Based Nursing 2019;22:62. Published Online First: 31 Jan 2019. doi: 10.1136/eb-2018-102996corr1


A guide for critique of research articles

Following is a list of criteria for evaluating (critiquing) a research article. Note that you should first summarize the paper and then evaluate its different parts.

Most of the evaluation section should be devoted to evaluating the internal validity of the conclusions. Add at the end a section entitled "Changes in the design/procedures if I were to replicate this study," and attach a copy of the original article to your paper.

The following list is a guide to help you organize your evaluation, and it is recommended that you follow this order. It is a long list of questions; you do not have to address every question, and some may not be relevant to your article, but you should address the highlighted questions.

Introduction

1.     Is there a statement of the problem?

2.     Is the problem “researchable”? That is, can it be investigated through the collection and analysis of data?

3.     Is background information on the problem presented?

4.     Is the educational significance of the problem discussed?

5.     Does the problem statement indicate the variables of interest and the specific relationship between those variables which are investigated? When necessary, are variables directly or operationally defined?

Review of Related Literature

1.     Is the review comprehensive?

2.     Are all cited references relevant to the problem under investigation?

3.     Are most of the sources primary, i.e., are there only a few or no secondary sources?

4.     Have the references been critically analyzed and the results of various studies compared and contrasted, i.e., is the review more than a series of abstracts or annotations?

5.     Does the review conclude with a brief summary of the literature and its implications for the problem investigated?

6.     Do the implications discussed form an empirical or theoretical rationale for the hypotheses which follow?

Hypotheses

1.     Are specific questions to be answered listed or specific hypotheses to be tested stated?

2.     Does each hypothesis state an expected relationship or difference?

3.     If necessary, are variables directly or operationally defined?

4.     Is each hypothesis testable?

Method

Subjects

1.     Are the size and major characteristics of the population studied described?

2.     If a sample was selected, is the method of selecting the sample clearly described?

3.      Is the method of sample selection described one that is likely to result in a representative, unbiased sample?

4.     Did the researcher avoid the use of volunteers?

5.     Are the size and major characteristics of the sample described?

6.     Does the sample size meet the suggested guideline for minimum sample size appropriate for the method of research represented?      

Instruments

1.     Is the rationale given for the selection of the instruments (or measurements) used?

2.     Is each instrument described in terms of purpose and content?

3.     Are the instruments appropriate for measuring the intended variables?

4.     Is evidence presented that indicates that each instrument is appropriate for the sample under study?

5.     Is instrument validity discussed and coefficients given if appropriate?

6.     Is reliability discussed in terms of type and size of reliability coefficients?

7.     If appropriate, are subtest reliabilities given?

8.     If an instrument was developed specifically for the study, are the procedures involved in its development and validation described?

9.     If an instrument was developed specifically for the study, are administration, scoring or tabulating, and interpretation procedures fully described?

Design and Procedure

1.     Is the design appropriate for answering the questions or testing the hypotheses of the   study?

2.     Are the procedures described in sufficient detail to permit them to be replicated by another researcher?

3.     If a pilot study was conducted, are its execution and results described as well as its impact on the subsequent study?

4.     Are the control procedures described?

5.     Did the researcher discuss or account for any potentially confounding variables that he or she was unable to control for?

Results

1.     Are appropriate descriptive or inferential statistics presented?

2.     Was the probability level, α, at which the results of the tests of significance were evaluated, specified in advance of the data analyses?

3.     If parametric tests were used, is there evidence that the researcher avoided violating the required assumptions for parametric tests?

4.     Are the tests of significance described appropriate, given the hypotheses and design of the study?

5.     Was every hypothesis tested?

6.     Are the tests of significance interpreted using the appropriate degrees of freedom?

7.     Are the results clearly presented?

8.     Are the tables and figures (if any) well organized and easy to understand?

9.     Are the data in each table and figure described in the text?

Discussion (Conclusions and Recommendations)

1.     Is each result discussed in terms of the original hypothesis to which it relates?

2.     Is each result discussed in terms of its agreement or disagreement with previous results obtained by other researchers in other studies?

3.     Are generalizations consistent with the results?

4.     Are the possible effects of uncontrolled variables on the results discussed?

5.     Are theoretical and practical implications of the findings discussed?

6.     Are recommendations for future action made?

7.     Are the suggestions for future action based on practical significance or on statistical significance only, i.e., has the author avoided confusing practical and statistical significance?

8.     Are recommendations for future research made?

Additional general questions to be answered in your critique.

1. What is (are) the research question(s) (or hypothesis)?

2. Describe the sample used in this study.

3. Describe the reliability and validity of all the instruments used.

4. What type of research is this?  Explain.

5. How were the data analyzed?

6. What is (are) the major finding(s)?


BMJ Global Health. 2019;4(Suppl 1).

Synthesising quantitative and qualitative evidence to inform guidelines on complex interventions: clarifying the purposes, designs and outlining some methods

Jane Noyes

1 School of Social Sciences, Bangor University, Wales, UK

Andrew Booth

2 School of Health and Related Research (ScHARR), University of Sheffield, Sheffield, UK

Graham Moore

3 School of Social Sciences, Cardiff University, Wales, UK

Kate Flemming

4 Department of Health Sciences, The University of York, York, UK

Özge Tunçalp

5 Department of Reproductive Health and Research including UNDP/UNFPA/UNICEF/WHO/World Bank Special Programme of Research, Development and Research Training in Human Reproduction (HRP), World Health Organization, Geneva, Switzerland

Elham Shakibazadeh

6 Department of Health Education and Promotion, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran

Associated Data

bmjgh-2018-000893supp001.pdf

bmjgh-2018-000893supp002.pdf

bmjgh-2018-000893supp003.pdf

bmjgh-2018-000893supp004.pdf

bmjgh-2018-000893supp005.pdf

Guideline developers are increasingly dealing with more difficult decisions concerning whether to recommend complex interventions in complex and highly variable health systems. There is greater recognition that both quantitative and qualitative evidence can be combined in a mixed-method synthesis and that this can be helpful in understanding how complexity impacts on interventions in specific contexts. This paper aims to clarify the different purposes, review designs, questions, synthesis methods and opportunities to combine quantitative and qualitative evidence to explore the complexity of complex interventions and health systems. Three case studies of guidelines developed by WHO, which incorporated quantitative and qualitative evidence, are used to illustrate possible uses of mixed-method reviews and evidence. Additional examples of methods that can be used or may have potential for use in a guideline process are outlined. Consideration is given to the opportunities for potential integration of quantitative and qualitative evidence at different stages of the review and guideline process. Encouragement is given to guideline commissioners and developers and review authors to consider including quantitative and qualitative evidence. Recommendations are made concerning the future development of methods to better address questions in systematic reviews and guidelines that adopt a complexity perspective.

Summary box

  • When combined in a mixed-method synthesis, quantitative and qualitative evidence can potentially contribute to understanding how complex interventions work and for whom, and how the complex health systems into which they are implemented respond and adapt.
  • The different purposes and designs for combining quantitative and qualitative evidence in a mixed-method synthesis for a guideline process are described.
  • Questions relevant to gaining an understanding of the complexity of complex interventions and the wider health systems within which they are implemented that can be addressed by mixed-method syntheses are presented.
  • The practical methodological guidance in this paper is intended to help guideline producers and review authors commission and conduct mixed-method syntheses where appropriate.
  • If more mixed-method syntheses are conducted, guideline developers will have greater opportunities to access this evidence to inform decision-making.

Introduction

Recognition has grown that while quantitative methods remain vital, they are usually insufficient to address complex health system related research questions. 1 Quantitative methods rely on an ability to anticipate what must be measured in advance. Introducing change into a complex health system gives rise to emergent reactions, which cannot be fully predicted in advance and can often only be understood by combining quantitative methods with a more flexible qualitative lens. 2 Adopting a more pluralist position gives the researcher a diverse range of research options, depending on the research question being investigated. 3–5 As a consequence, where a research study sits within the multitude of methods available is driven by the question being asked, rather than by any particular methodological or philosophical stance. 6

Publication of guidance on designing complex intervention process evaluations, and other works advocating mixed-methods approaches to intervention research, has stimulated better quality evidence for synthesis. 1 7–13 Methods for synthesising qualitative 14 and mixed-method evidence have been developed or are in development. Mixed-method research and review definitions are outlined in box 1.

Defining mixed-method research and reviews

Pluye and Hong 52 define mixed-methods research as “a research approach in which a researcher integrates (a) qualitative and quantitative research questions, (b) qualitative research methods* and quantitative research designs, (c) techniques for collecting and analyzing qualitative and quantitative evidence, and (d) qualitative findings and quantitative results”.A mixed-method synthesis can integrate quantitative, qualitative and mixed-method evidence or data from primary studies.† Mixed-method primary studies are usually disaggregated into quantitative and qualitative evidence and data for the purposes of synthesis. Thomas and Harden further define three ways in which reviews are mixed. 53

  • The types of studies included and hence the type of findings to be synthesised (ie, qualitative/textual and quantitative/numerical).
  • The types of synthesis method used (eg, statistical meta-analysis and qualitative synthesis).
  • The mode of analysis: theory testing AND theory building.

*A qualitative study is one that uses qualitative methods of data collection and analysis to produce a narrative understanding of the phenomena of interest. Qualitative methods of data collection may include, for example, interviews, focus groups, observations and analysis of documents.

†The Cochrane Qualitative and Implementation Methods group coined the term ‘qualitative evidence synthesis’ to mean that the synthesis could also include qualitative data. For example, qualitative data from case studies, grey literature reports and open-ended questions from surveys. ‘Evidence’ and ‘data’ are used interchangeably in this paper.

This paper is one of a series that aims to explore the implications of complexity for systematic reviews and guideline development, commissioned by WHO. This paper is concerned with the methodological implications of including quantitative and qualitative evidence in mixed-method systematic reviews and guideline development for complex interventions. The guidance was developed through a process of bringing together experts in the field, literature searching and consensus building with end users (guideline developers, clinicians and reviewers). We clarify the different purposes, review designs, questions and synthesis methods that may be applicable to combine quantitative and qualitative evidence to explore the complexity of complex interventions and health systems. Three case studies of WHO guidelines that incorporated quantitative and qualitative evidence are used to illustrate possible uses of mixed-method reviews and mechanisms of integration (table 1, online supplementary files 1–3). Additional examples of methods that can be used or may have potential for use in a guideline process are outlined. Opportunities for potential integration of quantitative and qualitative evidence at different stages of the review and guideline process are presented. Specific considerations when using an evidence to decision framework such as the Developing and Evaluating Communication strategies to support Informed Decisions and practice based on Evidence (DECIDE) framework 15 or the new WHO-INTEGRATE evidence to decision framework 16 at the review design and evidence to decision stage are outlined. See online supplementary file 4 for an example of a health systems DECIDE framework and Rehfuess et al 16 for the new WHO-INTEGRATE framework. Encouragement is given to guideline commissioners and developers and review authors to consider including quantitative and qualitative evidence in guidelines of complex interventions that take a complexity perspective and health systems focus.

Table 1. Designs and methods and their use or applicability in guidelines and systematic reviews taking a complexity perspective. For each example: the complexity-related questions of interest, the types of synthesis used, the mixed-method review design and integration mechanisms, and observations, concerns and considerations.

A. Mixed-method review designs used in WHO guideline development

Antenatal Care (ANC) guidelines

  • Questions: What do women in high-income, medium-income and low-income countries want and expect from antenatal care (ANC), based on their own accounts of their beliefs, views, expectations and experiences of pregnancy? What are the evidence-based practices during ANC that improve outcomes and lead to a positive pregnancy experience, and how should these practices be delivered? What factors influence the uptake of routine antenatal services by pregnant women? What are the views and experiences of maternity care providers?
  • Synthesis: Qualitative synthesis (framework synthesis, meta-ethnography) for women's and providers' views and for uptake factors; quantitative review of trials for effectiveness.
  • Design and integration: Quantitative and qualitative reviews undertaken separately (segregated). An initial scoping review of qualitative evidence established women's preferences and outcomes for ANC, which informed the design of the quantitative intervention review (contingent). A second qualitative evidence synthesis was undertaken to look at implementation factors (sequential). Integration: quantitative and qualitative findings were brought together in a series of DECIDE frameworks. Tools included psychological theory, the SURE framework conceptual framework for implementing policy options, and a conceptual framework for analysing integration of targeted health interventions into health systems (to analyse contextual health system factors).
  • Observations: An innovative approach to guideline development. No formal cross-study synthesis process and limited testing of theory. The hypothetical nature of meta-ethnography findings may be challenging for guideline panel members to process without additional training; see Flemming for considerations when selecting meta-ethnography.

Task shifting guidelines

  • Questions: What are the effects of lay health worker interventions in primary and community healthcare on maternal and child health and the management of infectious diseases? What factors affect the implementation of lay health worker programmes for maternal and child health?
  • Synthesis: Quantitative review of trials; qualitative evidence synthesis (framework synthesis).
  • Design and integration: Several published quantitative reviews were used (eg, a Cochrane review of lay health worker interventions), and additional new qualitative evidence syntheses were commissioned (segregated). Integration: quantitative and qualitative review findings on lay health workers were brought together in several DECIDE frameworks. Tools included an adapted SURE framework and a post hoc logic model.
  • Observations: An innovative approach to guideline development. The post hoc logic model was developed after the guideline was completed.

Risk communication guideline

  • Synthesis: Quantitative review of quantitative evidence (descriptive); qualitative synthesis using framework synthesis.
  • Design and integration: A knowledge map of studies was produced to identify the method, topic and geographical spread of evidence. Reviews first organised and synthesised evidence by method-specific streams and reported method-specific findings; similar findings across method-specific streams were then grouped and further developed using all the relevant evidence. Integration: where possible, quantitative and qualitative evidence for the same intervention and question was mapped against core DECIDE domains. Tools included a framework using a public health emergency model and disaster phases.
  • Observations: Very few trials were identified. Quantitative and qualitative evidence was used to construct a high-level view of what appeared to work and what happened when similar broad groups of interventions or strategies were implemented in different contexts. An example of a fully integrated mixed-method synthesis. Without evidence of effect, it was highly challenging to populate a DECIDE framework.

B. Mixed-method review designs that can be used in guideline development

Factors influencing children's optimal fruit and vegetable consumption

  • Questions: Potential to explore theoretical, intervention and implementation complexity issues; new questions of interest are developed and tested in a cross-study synthesis.
  • Synthesis: Mixed-methods synthesis; each review typically has three syntheses: statistical meta-analysis, qualitative thematic synthesis and cross-study synthesis.
  • Design and integration: The aim is to generate and test theory from a diverse body of literature. Integration: an integrative matrix based on programme theory.
  • Observations: Can be used in a guideline process, as it fits the current model of conducting method-specific reviews separately and then bringing the review products together.

C. Mixed-method review designs with the potential for use in guideline development

Interventions to promote smoke alarm ownership and function

  • Questions: Intervention effect and/or intervention implementation related questions within a system.
  • Synthesis: Narrative synthesis (specifically Popay's methodology).
  • Design and integration: A four-stage approach to integrate quantitative (trial) evidence with qualitative evidence. Integration: initial theory and a logic model were used to integrate evidence of effect with qualitative case summaries. Tools included tabulation; groupings and clusters; transforming data into a common rubric; vote-counting as a descriptive tool; moderator variables and subgroup analyses; idea webbing/conceptual mapping; qualitative case descriptions; and visual representation of the relationship between study characteristics and results.
  • Observations: Few published examples, with the exception of Rodgers, who reinterpreted a Cochrane review on the same topic with narrative synthesis methodology. The methodology is complex, and most subsequent examples have only partially operationalised it. An intervention effect review will still be required to feed into the guideline process.

Factors affecting childhood immunisation

  • Questions: What factors explain complexity and causal pathways?
  • Synthesis: Bayesian synthesis of qualitative and quantitative evidence.
  • Design and integration: The aim is theory-testing by fusing findings from qualitative and quantitative research; produces a set of weighted factors associated with, or predicting, the phenomenon under review.
  • Observations: Not yet used in a guideline context. Complex methodology, undergoing development and testing for a health context. The end product may not easily 'fit' into an evidence to decision framework, and an effect review will still be required.

Providing effective and preferred care closer to home: a realist review of intermediate care

  • Questions: Developing and testing theories of change underpinning complex policy interventions; what works, for whom, in what contexts and how?
  • Synthesis: Realist synthesis (NB other theory-informed synthesis methods follow similar processes).
  • Design and integration: Development of a theory from the literature; analysis of quantitative and qualitative evidence against the theory leads to development of context-mechanism-outcome chains that explain how outcomes come about. Integration: programme theory and assembling mixed-method evidence to create Context, Mechanism and Outcome (CMO) configurations.
  • Observations: May be useful where there are few trials. The hypothetical nature of findings may be challenging for guideline panel members to process without additional training. The end product may not easily 'fit' into an evidence to decision framework, and an effect review will still be required.

Use of morphine to treat cancer-related pain

  • Questions: Any aspect of complexity could potentially be explored; how does the context of morphine use affect the established effectiveness of morphine?
  • Synthesis: Critical interpretive synthesis.
  • Design and integration: Aims to generate theory from a large and diverse body of literature; segregated sequential design. Integration: integrative grid.
  • Observations: There are few examples and the methodology is complex. The hypothetical nature of findings may be challenging for guideline panel members to process without additional training. The end product would need to be designed to feed into an evidence to decision framework, and an intervention effect review will still be required.

Food sovereignty, food security and health equity

  • Questions: Examples have examined health system complexity. To understand the state of knowledge on relationships between health equity (ie, health inequalities that are socially produced) and food systems, where the concepts of 'food security' and 'food sovereignty' are prominent. Focused on eight pathways to health (in)equity through the food system: (1) multi-scalar environmental and social context; (2) occupational exposures; (3) environmental change; (4) traditional livelihoods and cultural continuity; (5) intake of contaminants; (6) nutrition; (7) social determinants of health; (8) political, economic and regulatory context.
  • Synthesis: Meta-narrative. The method's original aim was to review research on diffusion of innovation to inform healthcare policy, asking: which research (or epistemic) traditions have considered this broad topic area? How has each tradition conceptualised the topic (for example, including assumptions about the nature of reality, preferred study designs and ways of knowing)? What theoretical approaches and methods did they use? What are the main empirical findings? What insights can be drawn by combining and comparing findings from different traditions?
  • Design and integration: Analysis leads to production of a set of meta-narratives ('storylines of research').
  • Observations: Not yet used in a guideline context; the originators are calling for meta-narrative reviews to be used in a guideline process. Potential to provide a contextual overview within which to interpret other types of reviews in a guideline process. The meta-narrative review findings may require tailoring to 'fit' into an evidence to decision framework, and an intervention effect review will still be required. Few published examples, and the methodology is complex.


Taking a complexity perspective.

The first paper in this series 17 outlines aspects of complexity associated with complex interventions and health systems that can potentially be explored by different types of evidence, including syntheses of quantitative and qualitative evidence. Petticrew et al 17 distinguish between a complex interventions perspective and a complex systems perspective. A complex interventions perspective defines interventions as having “implicit conceptual boundaries, representing a flexible, but common set of practices, often linked by an explicit or implicit theory about how they work”. A complex systems perspective differs in that “complexity arises from the relationships and interactions between a system’s agents (eg, people, or groups that interact with each other and their environment), and its context. A system perspective conceives the intervention as being part of the system, and emphasises changes and interconnections within the system itself”. Aspects of complexity associated with implementation of complex interventions in health systems that could potentially be addressed with a synthesis of quantitative and qualitative evidence are summarised in table 2. Another paper in the series outlines criteria used in a new evidence to decision framework for making decisions about complex interventions implemented in complex systems, against which the need for quantitative and qualitative evidence can be mapped. 16 A further paper 18 that explores how context is dealt with in guidelines and reviews taking a complexity perspective also recommends using both quantitative and qualitative evidence to better understand context as a source of complexity. Mixed-method syntheses of quantitative and qualitative evidence can also help with understanding whether there has been theory failure and/or implementation failure. The Cochrane Qualitative and Implementation Methods Group provide additional guidance on exploring implementation and theory failure that can be adapted to address aspects of complexity of complex interventions when implemented in health systems. 19

Table 2: Health-system complexity-related questions that a synthesis of quantitative and qualitative evidence could address (derived from Petticrew et al 17)

Aspect of complexity: what ‘is’ the system? How can it be described?
Example questions: What are the main influences on the health problem? How are they created and maintained? How do these influences interconnect? Where might one intervene in the system?
Contributing studies or data: Quantitative: previous systematic reviews of the causes of the problem; epidemiological studies (eg, cohort studies examining risk factors of obesity); network analysis studies showing the nature of social and other systems. Qualitative: theoretical papers; policy documents.

Aspect of complexity: interactions of interventions with context and adaptation.
Contributing studies or data: Qualitative: qualitative studies; case studies. Quantitative: trials or other effectiveness studies from different contexts; multicentre trials with stratified reporting of findings; other quantitative studies that provide evidence of moderating effects of context.

Aspect of complexity: system adaptivity (how does the system change?).
Example questions: (How) does the system change when the intervention is introduced? Which aspects of the system are affected? Does this potentiate or dampen its effects?
Contributing studies or data: Quantitative: longitudinal data; possibly historical data; effectiveness studies providing evidence of differential effects across different contexts; system modelling (eg, agent-based modelling). Qualitative: qualitative studies; case studies.

Aspect of complexity: emergent properties.
Example questions: What are the effects (anticipated and unanticipated) that follow from this system change?
Contributing studies or data: Quantitative: prospective quantitative evaluations; retrospective studies (eg, case–control studies, surveys), which may also help identify less common effects; dose–response evaluations of impacts at aggregate level in individual studies or across studies included within systematic reviews. Qualitative: qualitative studies.

Aspect of complexity: positive (reinforcing) and negative (balancing) feedback loops.
Example questions: What explains change in the effectiveness of the intervention over time? Are the effects of an intervention damped or suppressed by other aspects of the system (eg, contextual influences)?
Contributing studies or data: Quantitative: studies of moderators of effectiveness; long-term longitudinal studies. Qualitative: studies of factors that enable or inhibit implementation of interventions.

Aspect of complexity: multiple (health and non-health) outcomes.
Example questions: What changes in processes and outcomes follow the introduction of this system change? At what levels in the system are they experienced?
Contributing studies or data: Quantitative: studies tracking change in the system over time. Qualitative: studies exploring effects of the change on individuals, families and communities (including equity considerations and factors that affect engagement and participation in change).

It may not be apparent which aspects of complexity or which elements of the complex intervention or health system can be explored in a guideline process, or whether combining qualitative and quantitative evidence in a mixed-method synthesis will be useful, until the available evidence is scoped and mapped. 17 20 A more extensive lead-in phase is typically required to scope the available evidence, engage with stakeholders and refine the review parameters and questions, which can then be mapped against potential review designs and methods of synthesis. 20 At the scoping stage, it is also common to decide on a theoretical perspective 21 or undertake further work to refine a theoretical perspective. 22 This is also the stage to begin articulating the programme theory of the complex intervention, which may be further developed to refine an understanding of complexity and show how the intervention is implemented in and impacts on the wider health system. 17 23 24 In practice, this process can be lengthy, iterative and fluid, with multiple revisions to the review scope, often developing and adapting a logic model 17 as the available evidence becomes known and the potential to incorporate different types of review designs and syntheses of quantitative and qualitative evidence becomes better understood. 25 Further questions, propositions or hypotheses may emerge as the reviews progress, and protocols therefore generally need to be developed iteratively over time rather than a priori.

Following a scoping exercise and definition of key questions, the next step in the guideline development process is to identify existing, or commission new, systematic reviews to locate and summarise the best available evidence in relation to each question. For example, case study 2, ‘Optimising health worker roles for maternal and newborn health through task shifting’, included quantitative reviews that did and did not take an additional complexity perspective, and qualitative evidence syntheses that were able to explain how specific elements of complexity impacted on intervention outcomes within the wider health system. Further understanding of health system complexity was facilitated through the conduct of additional country-level case studies that contributed to an overall understanding of what worked and what happened when lay health worker interventions were implemented. See table 1 and online supplementary file 2.

There are a few existing examples, which we draw on in this paper, but integrating quantitative and qualitative evidence in a mixed-method synthesis is relatively uncommon in a guideline process. Box 2 includes a set of key questions that guideline developers and review authors contemplating combining quantitative and qualitative evidence in a mixed-methods design might ask. Subsequent sections provide more information, and signposting to further reading, to help address these key questions.

Key questions that guideline developers and review authors contemplating combining quantitative and qualitative evidence in a mixed-methods design might ask

• WHAT: What types of question(s) are being asked?
  - Compound questions requiring both quantitative and qualitative evidence?
  - Questions requiring mixed-methods studies?
  - Separate quantitative and qualitative questions?

• WHAT: What types of studies and data could address the question(s)?
  - Separate quantitative and qualitative research studies?
  - Related quantitative and qualitative research studies?
  - Mixed-methods studies?
  - Quantitative unpublished data and/or qualitative unpublished data, eg, narrative survey data?

• WHEN: At which point(s) will the quantitative and qualitative evidence be integrated?
  - Throughout the review?
  - Following separate reviews?
  - At the question point?
  - At the synthesis point?
  - At the evidence to recommendations stage?
  - Or a combination?

• HOW: How will the quantitative and qualitative evidence be integrated?
  - Narrative synthesis or summary?
  - Quantitising approach, eg, frequency analysis (see the sketch after this box)?
  - Qualitising approach, eg, thematic synthesis?
  - Tabulation?
  - Logic model?
  - Conceptual model/framework?
  - Graphical approach?

• WHICH: Which mixed-method designs, methodologies and methods best fit into a guideline process to inform recommendations?
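
To make the ‘quantitising’ option above concrete, the following is a minimal Python sketch of a simple frequency analysis: coded qualitative themes are counted across included studies so they can be tabulated alongside quantitative results. The study labels and themes are invented for illustration, and in a real review the rules for what counts as a theme occurrence would need to be explicit and transparent.

```python
from collections import Counter

# Hypothetical coded findings: each included study is tagged with the
# qualitative themes its findings support (labels invented for illustration).
study_themes = {
    "Study 1 (qualitative)":   {"taste preferences", "parental modelling"},
    "Study 2 (qualitative)":   {"parental modelling", "school environment"},
    "Study 3 (mixed-methods)": {"taste preferences", "cost"},
    "Study 4 (qualitative)":   {"school environment"},
}

# Quantitising step: convert coded themes into frequency counts across studies.
# NB: frequency indicates recurrence across studies, not importance or effect size.
theme_frequencies = Counter(
    theme for themes in study_themes.values() for theme in themes
)

for theme, n in theme_frequencies.most_common():
    print(f"{theme}: reported in {n}/{len(study_themes)} studies")
```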

Complexity-related questions that a synthesis of quantitative and qualitative evidence can potentially address

Petticrew et al 17 define the different aspects of complexity, and give examples of complexity-related questions, that can potentially be explored in guidelines and systematic reviews taking a complexity perspective. Relevant aspects of complexity outlined by Petticrew et al 17 are summarised in table 2, together with the corresponding questions that could be addressed in a synthesis combining qualitative and quantitative evidence. Importantly, however, these aspects of complexity and their associated concepts of interest have yet to be fully translated into primary health research or systematic reviews. There are few known examples where selected complexity concepts have been used to analyse or reanalyse a primary intervention study. Most notable is Chandler et al, 26 who specifically set out to identify and translate a set of relevant complexity theory concepts for application in health systems research, and then reanalysed a trial process evaluation using selected complexity theory concepts to better understand the complex causal pathway in the health system that explains some of the aspects of complexity in table 2.

Rehfuess et al 16 also recommend upfront consideration of the WHO-INTEGRATE evidence to decision criteria when planning a guideline and formulating questions. The criteria reflect WHO norms and values and take account of a complexity perspective. The framework can be used by guideline development groups as a menu from which to decide which criteria to prioritise, and which study types and synthesis methods can be used to collect evidence for each criterion. Many of the criteria and their related questions can be addressed using a synthesis of quantitative and qualitative evidence: the balance of benefits and harms, human rights and sociocultural acceptability, health equity, societal implications and feasibility (see table 3). Similar aspects in the DECIDE framework 15 could also be addressed using a synthesis of qualitative and quantitative evidence.

Table 3: WHO-INTEGRATE evidence to decision framework criteria, example questions and types of studies that could potentially address these questions (derived from Rehfuess et al 16)

Domain: balance of benefits and harms.
Example questions: To what extent do patients/beneficiaries value different health outcomes?
Contributing studies: Qualitative: studies of views and experiences. Quantitative: questionnaire surveys.

Domain: human rights and sociocultural acceptability.
Example questions: Is the intervention acceptable to patients/beneficiaries as well as to those implementing it? To what extent do patients/beneficiaries value different non-health outcomes? How does the intervention affect an individual’s, population group’s or organisation’s autonomy, that is, their ability to make a competent, informed and voluntary decision?
Contributing studies: Qualitative: discourse analysis; qualitative studies (ideally longitudinal to examine changes over time). Quantitative: pro et contra analysis; discrete choice experiments; longitudinal quantitative studies (to examine changes over time); cross-sectional studies. Also mixed-method studies and case studies.

Domain: health equity, equality and non-discrimination.
Example questions: How affordable is the intervention for individuals, households or communities? How accessible—in terms of physical as well as informational access—is the intervention across different population groups?
Contributing studies: Qualitative: studies of views and experiences. Quantitative: cross-sectional or longitudinal observational studies; discrete choice experiments; health expenditure studies; health system barrier studies; ethical analysis; GIS-based studies.

Domain: societal implications.
Example questions: What is the social impact of the intervention: are there features of the intervention that increase or reduce stigma and that lead to social consequences? Does the intervention enhance or limit social goals, such as education, social cohesion and the attainment of various human rights beyond health? Does it change social norms at individual or population level? What is the environmental impact of the intervention? Does it contribute to or limit the achievement of goals to protect the environment and efforts to mitigate or adapt to climate change?
Contributing studies: Qualitative: studies of views and experiences. Quantitative: RCTs; quasi-experimental studies; comparative observational studies; longitudinal implementation studies; case studies; power analyses; environmental impact assessments; modelling studies.

Domain: feasibility and health system considerations.
Example questions: Are there any legal issues that impact on implementation of the intervention? How might path dependency, such as past decisions and strategic considerations, positively or negatively impact the implementation of the intervention? How does the intervention interact with the existing health system? Is it likely to fit well or not, and is it likely to impact on it in positive or negative ways? How does the intervention interact with the need for and usage of the existing health workforce, at national and subnational levels? How does the intervention interact with the need for and usage of the existing health system infrastructure, as well as other relevant infrastructure, at national and subnational levels?
Contributing studies: Non-research: policy and regulatory frameworks. Qualitative: studies of views and experiences. Mixed-method: health systems research; situation analysis; case studies. Quantitative: cross-sectional studies.

GIS, Geographical Information System; RCT, randomised controlled trial.

Questions as anchors or compasses

Questions can serve as an ‘anchor’ by articulating the specific aspects of complexity to be explored (eg, Is successful implementation of the intervention context dependent?). 27 Anchor questions such as “How does intervention x impact on socioeconomic inequalities in health behaviour/outcome x?” are the kind of health system question that requires a synthesis of both quantitative and qualitative evidence, and hence a mixed-method synthesis. Quantitative evidence can quantify the difference in effect, but does not answer the question of how. The ‘how’ question can be partly answered with quantitative and qualitative evidence. For example, quantitative evidence may reveal where socioeconomic status and inequality emerge in the health system (an emergent property) by exploring questions such as “Does patterning emerge during uptake because fewer people from certain groups come into contact with an intervention in the first place?” or “Are people from certain backgrounds more likely to drop out, or to maintain effects beyond an intervention differently?” Qualitative evidence may help understand the reasons behind all of these mechanisms. Alternatively, questions can act as ‘compasses’, where a question sets out a starting point from which to explore further and to potentially ask further questions or develop propositions or hypotheses to explore through a complexity perspective (eg, What factors enhance or hinder implementation?). 27 Other papers in this series provide further guidance on developing questions for qualitative evidence syntheses and on question formulation. 14 28

For anchor and compass questions, additional application of a theory (eg, complexity theory) can help focus evidence synthesis and presentation to explore and explain complexity issues. 17 21 Development of review-specific logic model(s) can help to further refine an initial understanding of any complexity-related issues of interest associated with a specific intervention and, where appropriate, the health system or section of the health system within which to contextualise the review question and analyse data. 17 23–25 Specific tools are available to help clarify context and complex interventions. 17 18

Even if a complexity perspective, and certain criteria within evidence to decision frameworks, are deemed relevant and desirable by guideline developers, it is only possible to pursue a complexity perspective if the evidence is available. Careful scoping using knowledge maps or scoping reviews will help inform the development of questions that are answerable with the available evidence. 20 If evidence of effect is not available, then a different approach to developing questions, leading to a more general narrative understanding of what happened when complex interventions were implemented in a health system, will be required (such as in case study 3—risk communication guideline). This does not mean that the original questions, for which no evidence was found when scoping the literature, were unimportant. An important function of creating a knowledge map is also to identify gaps to inform a future research agenda.

Table 1 and online supplementary files 1–3 outline examples of the questions in the three case studies, all of which were ‘compass’ questions for the qualitative evidence syntheses.

Types of integration and synthesis designs in mixed-method reviews

The shift towards integration of qualitative and quantitative evidence in primary research has, in recent years, begun to be mirrored within research synthesis. 29–31 The natural extension to undertaking quantitative or qualitative reviews has been the development of methods for integrating qualitative and quantitative evidence within reviews, and within the guideline process using evidence to decision frameworks. Advocating the integration of quantitative and qualitative evidence assumes a complementarity between research methodologies, and a need for both types of evidence to inform policy and practice. Below, we briefly outline the current designs for integrating qualitative and quantitative evidence within a mixed-method review or synthesis.

One of the early approaches to integrating qualitative and quantitative evidence detailed by Sandelowski et al 32 advocated three basic review designs: segregated, integrated and contingent designs, which have been further developed by Heyvaert et al 33 ( box 3 ).

Segregated, integrated and contingent designs 32 33

Segregated design

The conventional design, in which quantitative and qualitative approaches are kept separate, based on the assumption that they are different entities, can be distinguished from each other and should be treated separately, and that their findings warrant separate analyses and syntheses. Ultimately, the separate synthesis results can themselves be synthesised.

Integrated design

The methodological differences between qualitative and quantitative studies are minimised, as both are viewed as producing findings that can be readily synthesised into one another because they address the same research purposes and questions. Transformation involves either turning qualitative data into quantitative data (quantitising) or turning quantitative findings into qualitative form (qualitising) to facilitate their integration.

Contingent design

Takes a cyclical approach to synthesis, with the findings from one synthesis informing the focus of the next synthesis, until all the research objectives have been addressed. Studies are not necessarily grouped and categorised as qualitative or quantitative.

A recent review of more than 400 systematic reviews 34 combining quantitative and qualitative evidence identified two main synthesis designs—convergent and sequential. In a convergent design, qualitative and quantitative evidence is collated and analysed in a parallel or complementary manner, whereas in a sequential synthesis, the collation and analysis of quantitative and qualitative evidence takes place in a sequence, with one synthesis informing the other (box 4). 6 These designs can be seen to build on the work of Sandelowski et al, 32 35 particularly in relation to the transformation of data from qualitative to quantitative (and vice versa) and the sequential synthesis design, with a cyclical approach to reviewing that evokes Sandelowski’s contingent design.

Convergent and sequential synthesis designs 34

Convergent synthesis design

Qualitative and quantitative research is collected and analysed at the same time in a parallel or complementary manner. Integration can occur at three points:

a. Data-based convergent synthesis design

All included studies are analysed using the same methods and the results are presented together. As only one synthesis method is used, data transformation occurs (the data are qualitised or quantitised). Usually addresses one review question.

b. Results-based convergent synthesis design

Qualitative and quantitative data are analysed and presented separately but integrated using a further synthesis method, eg, narratively or using tables, matrices or reanalysis of the evidence. The results of both syntheses are combined in a third synthesis. Usually addresses an overall review question with subquestions.

c. Parallel-results convergent synthesis design

Qualitative and quantitative data are analysed and presented separately, with integration occurring in the interpretation of results in the discussion section. Usually addresses two or more complementary review questions.

Sequential synthesis design

A two-phase approach: data collection and analysis of one type of evidence (eg, qualitative) occurs after, and is informed by, the collection and analysis of the other type (eg, quantitative). Usually addresses an overall question with subquestions, with both syntheses complementing each other.
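
As a concrete illustration of the results-based convergent design described above, the following minimal Python sketch juxtaposes separately produced quantitative and qualitative findings in a simple cross-study matrix, ready for a third, cross-study synthesis. The interventions, effect estimates and themes are all invented for illustration.

```python
# Minimal sketch of one integration step in a results-based convergent design:
# findings from a quantitative synthesis and a qualitative synthesis, produced
# separately, are juxtaposed per intervention. All values are hypothetical.
quant_findings = {
    "Intervention A": "RR 0.80 (95% CI 0.70 to 0.92)",
    "Intervention B": "RR 0.98 (95% CI 0.85 to 1.13)",
}
qual_findings = {
    "Intervention A": ["acceptable to families", "fits routine care"],
    "Intervention B": ["staff workload barrier", "poor cultural fit"],
}

# Print a simple matrix that a review team could inspect for patterns, eg,
# whether implementation barriers co-occur with null effect estimates.
print(f"{'Intervention':<16}{'Effect estimate':<32}Qualitative themes")
for intervention in quant_findings:
    themes = "; ".join(qual_findings.get(intervention, ["no data"]))
    print(f"{intervention:<16}{quant_findings[intervention]:<32}{themes}")
```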

The three case studies (table 1, online supplementary files 1–3) illustrate the diverse combination of review designs and synthesis methods that were considered the most appropriate for specific guidelines.

Methods for conducting mixed-method reviews in the context of guidelines for complex interventions

In this section, we draw on examples where specific review designs and methods have been or can be used to explore selected aspects of complexity in guidelines or systematic reviews. We also identify other review methods that could potentially be used to explore aspects of complexity. Of particular note, we could not find any specific examples of systematic methods to synthesise highly diverse research designs as advocated by Petticrew et al 17 and summarised in tables 2 and 3 . For example, we could not find examples of methods to synthesise qualitative studies, case studies, quantitative longitudinal data, possibly historical data, effectiveness studies providing evidence of differential effects across different contexts, and system modelling studies (eg, agent-based modelling) to explore system adaptivity.

There are different ways that quantitative and qualitative evidence can be integrated into a review and then into a guideline development process. In practice, some methods enable integration of different types of evidence in a single synthesis, while in other methods, the single systematic review may include a series of stand-alone reviews or syntheses that are then combined in a cross-study synthesis. Table 1 provides an overview of the characteristics of different review designs and methods and guidance on their applicability for a guideline process. Designs and methods that have already been used in WHO guideline development are described in part A of the table. Part B outlines a design and method that can be used in a guideline process, and part C covers those that have the potential to integrate quantitative, qualitative and mixed-method evidence in a single review design (such as meta-narrative reviews and Bayesian syntheses), but their application in a guideline context has yet to be demonstrated.

Points of integration when integrating quantitative and qualitative evidence in guideline development

Depending on the review design (see boxes 3 and 4), integration can potentially take place at a review team and design level, and more commonly at several key points of the review or guideline process. The following sections outline potential points of integration and associated practical considerations when integrating quantitative and qualitative evidence in guideline development.

Review team level

In a guideline process, it is common for syntheses of quantitative and qualitative evidence to be conducted separately by different teams, with the evidence integrated afterwards. A practical consideration relates to the organisation, composition and expertise of the review teams and their ways of working. If the quantitative and qualitative reviews are conducted separately and then brought together by the same team members, who are equally comfortable operating within both paradigms, then a consistent approach across both paradigms becomes possible. If, however, a team is split between the quantitative and qualitative reviews, then the strengths of specialisation can be harnessed, for example, in quality assessment or synthesis. Optimally, at least one of the team members should be involved in both the quantitative and qualitative reviews to offer the possibility of making connections throughout the review and not simply at pre-agreed junctures. This mirrors O’Cathain’s conclusion that mixed-methods primary research tends to work only when there is a principal investigator who values integration and is able to oversee it. 9 10 While the above decisions have been articulated in the context of two types of evidence, quantitative and qualitative, they apply equally when considering how to handle studies reporting a mixed-method study design, where data are usually disaggregated into quantitative and qualitative components for the purposes of synthesis (see case study 3—risk communication in humanitarian disasters).

Question formulation

Clearly specified key question(s), derived from a scoping or consultation exercise, will make clear whether quantitative and qualitative evidence is required in a guideline development process and which aspects will be addressed by which types of evidence. For the remaining stages of the process, as documented below, a review team faces choices as to whether to handle each type of evidence separately, whether sequentially or in parallel, with a view to joining the two products on completion, or to attempt integration throughout the review process. In each case, the underlying choice is between efficiency and potential comparability on the one hand, and sensitivity to the underlying paradigm on the other.

Once key questions are clearly defined, the guideline development group typically needs to consider whether to conduct a single sensitive search to address all potential subtopics (lumping) or whether to conduct specific searches for each subtopic (splitting). 36 A related consideration is whether to search separately for qualitative, quantitative and mixed-method evidence ‘streams’ or whether to conduct a single search and then identify specific study types at the subsequent sifting stage. These two considerations often mean a trade-off between a single search process involving very large numbers of records or a more protracted search process retrieving smaller numbers of records. Both approaches have advantages and choice may depend on the respective availability of resources for searching and sifting.
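
To make the lumping-versus-splitting trade-off concrete, here is a minimal Python sketch that assembles one sensitive ‘lumped’ query and two design-specific ‘split’ queries from the same term sets. The topic terms and filters are invented for illustration; real strategies would use database-specific syntax and validated methodological filters.

```python
# Toy illustration of 'lumping' vs 'splitting' a search.
# Terms are invented; real strategies need database-specific syntax
# and validated study-design filters.
topic_terms = ["task shifting", "lay health worker*", "community health worker*"]
quant_filter = ["randomized controlled trial", "controlled before-after study"]
qual_filter = ["qualitative", "interview*", "focus group*"]

def or_block(terms):
    """Join a term set into a single OR'd query block."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

# Lumped: one sensitive search; study types are sorted out at the sifting stage.
lumped = or_block(topic_terms)

# Split: smaller, stream-specific retrievals for each evidence type.
split_quant = f"{or_block(topic_terms)} AND {or_block(quant_filter)}"
split_qual = f"{or_block(topic_terms)} AND {or_block(qual_filter)}"

print("Lumped search:      ", lumped)
print("Quantitative stream:", split_quant)
print("Qualitative stream: ", split_qual)
```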

Screening and selecting studies

Closely related to decisions around searching are considerations relating to screening and selecting studies for inclusion in a systematic review. An important consideration here is whether the review team will screen records for all review types, regardless of their subsequent involvement (‘altruistic sifting’), or specialise in screening for the study type with which they are most familiar. The risk of missing relevant reports might be minimised by whole-team screening for empirical reports in the first instance, followed by coding them as quantitative, qualitative or mixed-methods reports at a subsequent stage.

Assessment of methodological limitations in primary studies

Within a guideline process, review teams may be more limited in their choice of instruments to assess the methodological limitations of primary studies, as there are mandatory requirements to use the Cochrane risk of bias tool 37 to feed into Grading of Recommendations Assessment, Development and Evaluation (GRADE), 38 or to select from a small pool of qualitative appraisal instruments in order to apply GRADE-CERQual (Confidence in the Evidence from Reviews of Qualitative Research) 39 to assess the overall certainty or confidence in findings. The Cochrane Qualitative and Implementation Methods Group has recently issued guidance on the selection of appraisal instruments and core assessment criteria. 40 The Mixed-Methods Appraisal Tool, which is currently undergoing further development, offers a single quality assessment instrument for quantitative, qualitative and mixed-methods studies. 41 Other options include using corresponding instruments from within the same ‘stable’, for example, the different Critical Appraisal Skills Programme instruments. 42 While using instruments developed by the same team or organisation may achieve a degree of epistemological consonance, the benefits may come more from consistency of approach and reporting than from a shared view of quality. Alternatively, a more paradigm-sensitive approach would involve selecting the best instrument for each respective review while deferring the challenges arising from later heterogeneity of reporting.

Data extraction

The way in which data and evidence are extracted from primary research studies for review will be influenced by the type of integrated synthesis being undertaken and the review purpose. Initially, decisions need to be made regarding the nature and type of data and evidence that are to be extracted from the included studies. Method-specific reporting guidelines 43 44 provide a good template as to what quantitative and qualitative data it is potentially possible to extract from different types of method-specific study reports, although in practice reporting quality varies. Online supplementary file 5 provides a hypothetical example of the different types of studies from which quantitative and qualitative evidence could potentially be extracted for synthesis.

The decisions around what data or evidence to extract will be guided by how ‘integrated’ the mixed-method review will be. For those reviews where the quantitative and qualitative findings of studies are synthesised separately and integrated at the point of findings (eg, segregated or contingent approaches or sequential synthesis design), separate data extraction approaches will likely be used.

Where integration occurs during the process of the review (eg, integrated approach or convergent synthesis design), an integrated approach to data extraction may be considered, depending on the purpose of the review. This may involve the use of a data extraction framework, the choice of which needs to be congruent with the approach to synthesis chosen for the review. 40 45 The integrative or theoretical framework may be decided on a priori if a pre-developed theoretical or conceptual framework is available in the literature. 27 The development of a framework may alternatively arise from the reading of the included studies, in relation to the purpose of the review, early in the process. The Cochrane Qualitative and Implementation Methods Group provide further guidance on extraction of qualitative data, including use of software. 40
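
As one way of operationalising an integrated extraction framework, the following minimal Python sketch defines a single record type in which quantitative results and qualitative findings are coded against shared framework domains. The field names, domain labels and example entries are assumptions for illustration, not a prescribed template.

```python
from dataclasses import dataclass

# Hypothetical shared extraction record for a convergent (integrated)
# mixed-method review: both evidence types are coded against the same
# framework domains. Field and domain names are illustrative only.
@dataclass
class ExtractionRecord:
    study_id: str
    design: str                    # eg, "RCT", "qualitative", "mixed-methods"
    framework_domain: str          # eg, a programme-theory component
    quantitative_result: str = ""  # eg, an effect estimate, if reported
    qualitative_finding: str = ""  # eg, a translated descriptive theme

records = [
    ExtractionRecord("S01", "RCT", "uptake",
                     quantitative_result="RR 1.20 (95% CI 1.05 to 1.37)"),
    ExtractionRecord("S02", "qualitative", "uptake",
                     qualitative_finding="distance to clinic limits attendance"),
]

# Grouping by framework domain juxtaposes both types of evidence for synthesis.
for domain in sorted({r.framework_domain for r in records}):
    matching = [(r.study_id, r.design) for r in records
                if r.framework_domain == domain]
    print(domain, "->", matching)
```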

Synthesis and integration

Relatively few synthesis methods start off being integrated from the beginning, and these methods have generally been subject to less testing and evaluation particularly in a guideline context (see table 1 ). A review design that started off being integrated from the beginning may be suitable for some guideline contexts (such as in case study 3—risk communication in humanitarian disasters—where there was little evidence of effect), but in general if there are sufficient trials then a separate systematic review and meta-analysis will be required for a guideline. Other papers in this series offer guidance on methods for synthesising quantitative 46 and qualitative evidence 14 in reviews that take a complexity perspective. Further guidance on integrating quantitative and qualitative evidence in a systematic review is provided by the Cochrane Qualitative and Implementation Methods Group. 19 27 29 40 47

Types of findings produced by specific methods

It is highly likely (unless there are well-designed process evaluations) that the primary studies will not themselves seek to address the complexity-related questions required for a guideline process, in which case review authors will need to configure the available evidence and transform it through the synthesis process to produce explanations, propositions and hypotheses (ie, findings) that were not obvious at the primary study level. It is important that guideline commissioners, developers and review authors are aware that specific methods are intended to produce a particular type of finding with a specific purpose (such as developing new theory in the case of meta-ethnography). 48 Case study 1 (antenatal care guideline) provides an example of how a meta-ethnography was used to develop a new theory as an end product, 48 49 as well as a framework synthesis, which produced descriptive and explanatory findings that were more easily incorporated into the guideline process. 27 The definitions in box 5 may be helpful when distinguishing the different types of findings.

Different levels of findings

Descriptive findings—qualitative evidence-driven translated descriptive themes that do not move beyond the primary studies.

Explanatory findings—may be at either a descriptive or a theoretical level. At the descriptive level, qualitative evidence is used to explain phenomena observed in quantitative results, such as why implementation failed in specific circumstances. At the theoretical level, the transformed and interpreted findings that go beyond the primary studies can be used to explain the descriptive findings. The latter description is generally the accepted definition in the wider qualitative community.

Hypothetical or theoretical findings—qualitative evidence-driven transformed themes (or lines of argument) that go beyond the primary studies. Although similar, Thomas and Harden 56 distinguish between the purposes of two types of theoretical findings: analytical themes and third-order interpretations, the product of meta-ethnographies. 48

Analytical themes are a product of interrogating descriptive themes by placing the synthesis within an external theoretical framework (such as the review question and subquestions) and are considered more appropriate when a specific review question is being addressed (eg, in a guideline or to inform policy). 56

Third-order interpretations come from translating studies into one another while preserving the original context and are more appropriate when a body of literature is being explored in and of itself with broader or emergent review questions. 48

Bringing mixed-method evidence together in evidence to decision (EtD) frameworks

A critical element of guideline development is the formulation of recommendations by the Guideline Development Group, and EtD frameworks help to facilitate this process. 16 The EtD framework can also be used as a mechanism to integrate and display quantitative and qualitative evidence and findings mapped against the EtD framework domains, with hyperlinks to more detailed evidence summaries from contributing reviews (see table 1). It is commonly the EtD framework that enables the findings of the separate quantitative and qualitative reviews to be brought together in a guideline process. Specific challenges when populating the DECIDE evidence to decision framework 15 were noted in case study 3 (risk communication in humanitarian disasters), as there was an absence of intervention effect data and the interventions to communicate public health risks were context specific and varied. These problems would not, however, have been addressed by substituting the new WHO-INTEGRATE 16 evidence to decision framework for the DECIDE framework. A different type of EtD framework needs to be developed for reviews that do not include sufficient evidence of intervention effect.
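
As an illustration of the EtD framework acting as the integration point, the minimal Python sketch below maps method-specific findings to framework domains, with pointers back to detailed evidence summaries. The domain labels follow the WHO-INTEGRATE criteria discussed earlier; the findings and file names are invented.

```python
# Minimal sketch: mapping separately synthesised findings onto evidence to
# decision (EtD) framework domains for display to a guideline panel.
# Domains follow WHO-INTEGRATE; findings and links are hypothetical.
etd_table = {
    "Balance of benefits and harms": [
        ("quantitative", "Moderate-certainty evidence of reduced mortality",
         "sof-table-1.html"),
    ],
    "Human rights and sociocultural acceptability": [
        ("qualitative", "Broadly acceptable; privacy concerns in some settings",
         "qual-summary-2.html"),
    ],
    "Feasibility and health system considerations": [
        ("qualitative", "Implementation constrained by workforce shortages",
         "qual-summary-3.html"),
        ("quantitative", "Cross-sectional data on staffing levels",
         "survey-summary-1.html"),
    ],
}

# Render the framework with hyperlinks back to the contributing reviews.
for domain, findings in etd_table.items():
    print(domain)
    for evidence_type, finding, link in findings:
        print(f"  [{evidence_type}] {finding} (see {link})")
```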

Mixed-method review and synthesis methods are generally the least developed of all systematic review methods. It is acknowledged that methods for combining quantitative and qualitative evidence are generally poorly articulated. 29 50 There are, however, some fairly well-established methods for using qualitative evidence to explore aspects of complexity (such as contextual, implementation and outcome complexity), which can be combined with evidence of effect (see sections A and B of table 1). 14 There are good examples of systematic reviews that use these methods to combine quantitative and qualitative evidence, and examples of guideline recommendations that were informed by evidence from both quantitative and qualitative reviews (eg, case studies 1–3). With the exception of case study 3 (risk communication), the quantitative and qualitative reviews for these specific guidelines were conducted separately, and the findings were subsequently brought together in an EtD framework to inform recommendations.

Other mixed-method review designs have the potential to contribute to understanding of complex interventions and to explore aspects of wider health systems complexity, but they have not been sufficiently developed and tested for this specific purpose, or used in a guideline process (section C of table 1). Some methods, such as meta-narrative reviews, also explore different questions from those usually asked in a guideline process. Methods for processing (eg, quality appraisal) and synthesising the highly diverse evidence suggested in tables 2 and 3, which are required to explore specific aspects of health systems complexity (such as system adaptivity) and to populate some sections of the WHO-INTEGRATE EtD framework, remain underdeveloped.

In addition to the required methodological development mentioned above, there is no GRADE approach 38 for assessing confidence in findings developed from combined quantitative and qualitative evidence. Another paper in this series outlines how to deal with complexity when grading different types of quantitative evidence, 51 and the GRADE-CERQual approach for qualitative findings is described elsewhere, 39 but both of these approaches apply to method-specific and not mixed-method findings. An unofficial adaptation of GRADE was used in the risk communication guideline, which reported mixed-method findings. Nor is there a reporting guideline for mixed-method reviews, 47 so for now reports will need to conform to the relevant reporting requirements of the respective method-specific guideline. There is a need to further adapt and test DECIDE, 15 WHO-INTEGRATE 16 and other types of evidence to decision frameworks to accommodate evidence from mixed-method syntheses that do not set out to determine the statistical effects of interventions, and in circumstances where there are no trials.

When conducting quantitative and qualitative reviews that will subsequently be combined, there are specific considerations for managing and integrating the different types of evidence throughout the review process. We have summarised different options for combining qualitative and quantitative evidence in mixed-method syntheses that guideline developers and systematic reviewers can choose from, as well as outlining the opportunities to integrate evidence at different stages of the review and guideline development process.

Review commissioners, authors and guideline developers generally have less experience of combining qualitative and quantitative evidence in mixed-methods reviews. In particular, there is a relatively small group of reviewers who are skilled at undertaking fully integrated mixed-method reviews. Commissioning additional qualitative and mixed-method reviews creates additional cost, and large, complex mixed-method reviews generally take more time to complete. Careful consideration therefore needs to be given to which guidelines would benefit most from additional qualitative and mixed-method syntheses. More training is required to develop capacity, and processes need to be developed for preparing guideline panels to consider and use mixed-method evidence in their decision-making.

This paper has presented how qualitative and quantitative evidence, combined in mixed-method reviews, can help in understanding aspects of complex interventions and the systems within which they are implemented. There are further opportunities to use these methods, and to develop them further, to look more widely at additional aspects of complexity. There is a range of review designs and synthesis methods to choose from, depending on the question being asked or the questions that may emerge during the conduct of the synthesis. Additional methods need to be developed (or existing methods further adapted) in order to synthesise the full range of diverse evidence that is desirable for exploring complexity-related questions when complex interventions are implemented in health systems. We encourage review commissioners and authors, and guideline developers, to consider using mixed-method reviews and syntheses in guidelines and to report on their usefulness in the guideline development process.

Handling editor: Soumyadeep Bhaumik

Contributors: JN, AB, GM, KF, ÖT and ES drafted the manuscript. All authors contributed to paper development and writing and agreed the final manuscript. Anayda Portela and Susan Norris from WHO managed the series. Helen Smith was series Editor. We thank all those who provided feedback on various iterations.

Funding: Funding provided by the World Health Organization Department of Maternal, Newborn, Child and Adolescent Health through grants received from the United States Agency for International Development and the Norwegian Agency for Development Cooperation.

Disclaimer: ÖT is a staff member of WHO. The author alone is responsible for the views expressed in this publication and they do not necessarily represent the decisions or policies of WHO.

Competing interests: No financial interests declared. JN, AB and ÖT have an intellectual interest in GRADE CERQual; and JN has an intellectual interest in the iCAT_SR tool.

Patient consent: Not required.

Provenance and peer review: Not commissioned; externally peer reviewed.

Data sharing statement: No additional data are available.

Supplemental material: This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.
