Evidence-Based Practice
EBP process
The EBP process has 5 steps:
- Ask - Develop a relevant, answerable clinical question
- Acquire - Plan, search & find the best available evidence
- Appraise - Critically appraise articles for validity & applicability
- Apply - Integrate the evidence into practice
- Assess - Evaluate your clinical decision
Before you begin
- Clarify and state your question - see PICO
- Acquire the evidence - see Finding Evidence
Appraising the literature
Critical appraisal is a systematic analysis of research articles to determine the strength of the evidence in reference to your clinical question. Systematically review the different aspects (below) of each study to answer the following questions:
- What were the strengths of the study and the overall rigor of the study in terms of credibility?
  - Quality of evidence is based on the level and strength of the study.
  - Level of evidence: various scales have been developed to rank evidence; e.g., the PEDro scale.
  - Strength of evidence is based on the methodological limitations and/or threats to validity that may affect interpretation of findings and generalization of results.
- How are the results of this study relevant to your clinical question? How might the results influence clinical practice?
  - Applicability of evidence is based on its relevance to your question.
Study designs
What is the study design?
- Qualitative study design examples: phenomenology, ethnography, grounded theory, participatory action research.
- Quantitative study design examples: randomized controlled trial (RCT), cohort, single-case, before-and-after, case-control, cross-sectional, or case study.
Further readings on study design:
- Types of Designs - Research Methods Knowledge Base (Center for Social Research Methods)
- Understanding Research Study Designs - University of Minnesota LibGuide
- Study designs - Short article from CEBM describing different study designs
- Handbook of Research Methods in Abnormal and Clinical Psychology, Dean McKay (Editor), 2008. The Handbook presents a diverse range of areas critical to any researcher or student entering the field. See Chapter 22, "Single-case research designs," by Matthew K. Nock.
Setting & participants
Setting:
- Was the setting appropriate to answer the research question?
Participants/Sample:
- How were participants recruited?
- What were the inclusion/exclusion criteria?
- How many participants were enrolled? How many were lost to attrition?
- What were the participant demographics?
- If there was more than one group, were the groups similar?
Intervention & outcome measures
Intervention(s):
- Was the intervention clearly described?
- Who delivered the intervention and how often?
- Was there cross-contamination between interventions?
- Was there a break-in period?
Outcome measure(s):
- What instruments or methods were used to measure the variables? Examples include participant observation, interviews, focus groups, instruments, devices & questionnaires.
- Did the authors use measures with documented evidence of validity and reliability?
- Was the procedural reliability documented?
- How frequently were the participants measured?
Results & conclusions
Main results or key findings:
- What were the results?
- Was there a statistically significant difference? What was the effect size? (A sketch of one effect-size calculation follows this list.)
- How were the results analyzed and were the analysis methods appropriate?
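One widely reported effect-size statistic for comparing two group means is Cohen's d: the difference between the means divided by the pooled standard deviation. Below is a minimal sketch of that calculation; the function name and the sample scores are hypothetical, not drawn from any study cited here.

```python
# Minimal sketch: Cohen's d for two independent groups, using the
# pooled standard deviation. All values below are hypothetical.
import math

def cohens_d(group_a, group_b):
    n_a, n_b = len(group_a), len(group_b)
    mean_a, mean_b = sum(group_a) / n_a, sum(group_b) / n_b
    var_a = sum((x - mean_a) ** 2 for x in group_a) / (n_a - 1)
    var_b = sum((x - mean_b) ** 2 for x in group_b) / (n_b - 1)
    pooled_sd = math.sqrt(((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2))
    return (mean_a - mean_b) / pooled_sd

intervention = [24, 27, 30, 22, 28, 26]  # hypothetical outcome scores
control = [20, 22, 25, 19, 23, 21]
print(f"Cohen's d = {cohens_d(intervention, control):.2f}")
```

By common convention, d around 0.2 is considered a small effect, 0.5 medium, and 0.8 large, though these cutoffs are rules of thumb rather than fixed thresholds.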
Authors' interpretation/conclusion:
- What was the clinical relevance of the study and the results?
Appraisal worksheets and checklists
- Applying the Evidence Worksheet - View and download the PDF from Dartmouth Biomedical Libraries.
- CEBM Critical Appraisal Sheets - Assess the quality of systematic reviews, diagnostic studies, and RCTs.
- Strengthening the Reporting of Observational studies in Epidemiology (STROBE) - Checklists of what to include in articles of observational studies. Includes case-control, cohort, and cross-sectional studies.
- Synthesis Writing - Drew University Resources for Writers.
- Tips for learners of evidence-based medicine - Relative risk reduction, absolute risk reduction & number needed to treat (a worked example follows this list). CMAJ, August 17, 2004, vol. 171, no. 4. doi: 10.1503/cmaj.1021197
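For reference, the three statistics named in that CMAJ piece are simple arithmetic: absolute risk reduction (ARR) is the control event rate minus the experimental event rate, relative risk reduction (RRR) is ARR divided by the control event rate, and number needed to treat (NNT) is 1/ARR. A minimal sketch with hypothetical event rates:

```python
# Hypothetical example: the adverse outcome occurs in 20% of control
# patients and 15% of treated patients.
cer = 0.20  # control event rate
eer = 0.15  # experimental event rate

arr = cer - eer   # absolute risk reduction
rrr = arr / cer   # relative risk reduction
nnt = 1 / arr     # number needed to treat

print(f"ARR = {arr:.0%}, RRR = {rrr:.0%}, NNT = {nnt:.0f}")
# -> ARR = 5%, RRR = 25%, NNT = 20
```

Note how the same trial result looks larger expressed as an RRR (25%) than as an ARR (5 percentage points), which is why appraisers are encouraged to compute all three.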
Critical appraisal
- What are the strengths and weaknesses of the study?
- Were interventions delivered and data collected systematically, objectively and with fidelity?
- What were the potential threats to internal validity?
  - Examples: history, maturation, testing, instrumentation, statistical regression, selection, mortality, interactions with selection, ambiguity about causal influence, and diffusion of intervention.
- What were the potential threats to external validity?
  - Examples: interaction of testing and treatment, interaction of selection and treatment, interaction of setting and treatment, interaction of history and treatment, and multiple-treatment interference.
  - See also: Description of external validity.
- Did the study have limitations that affect its ability to answer your clinical question?
- Rate the study quality; this is often done on a scale of 1-5 or 1-3.
  - Level of evidence is based on the study design(s).
  - Quality of evidence is based on the methodological strengths and weaknesses.
  - Applicability of evidence is based on its relevance to your question.
  - Various scales have been developed for ranking studies; e.g., the PEDro scale (a tallying sketch follows this list).
- Your overall summary and interpretation of the study:
  - What were the strengths of the study and the overall rigor of the study in terms of credibility?
  - How are the results of this study relevant to your clinical question? How might the results influence clinical practice?
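As one concrete example of such a scale: the PEDro scale scores randomized trials on 11 yes/no items, with the first item (eligibility criteria) recorded but not counted, giving a total out of 10. The sketch below tallies a hypothetical set of answers; it is an illustration of the tallying logic, not the official scoring tool, and the item wordings are abbreviated.

```python
# PEDro-style tally: 11 yes/no items; item 1 is recorded but not
# scored, so the total ranges 0-10. All answers below are hypothetical.
pedro_items = [
    ("eligibility criteria specified", True),   # item 1: not scored
    ("random allocation", True),
    ("concealed allocation", False),
    ("baseline comparability", True),
    ("blinding of subjects", False),
    ("blinding of therapists", False),
    ("blinding of assessors", True),
    ("adequate follow-up (>85%)", True),
    ("intention-to-treat analysis", False),
    ("between-group statistical comparisons", True),
    ("point measures and variability reported", True),
]

score = sum(answer for _, answer in pedro_items[1:])  # skip unscored item 1
print(f"PEDro score: {score}/10")  # -> PEDro score: 6/10
```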
Synthesizing the literature
Synthesis involves combining ideas or results from two or more sources in a meaningful way. In EBP the synthesis is focused on the clinical question. You may combine the details from the article appraisals into themes to organize the ideas. The writing must remain objective and accurately report the information from the original sources.
- Strength of evidence is based on the quantity, quality, and consistency (of results) of a body of literature.
  - Quantity: the number of studies available.
  - Quality: the level and strength of the evidence available.
  - Consistency of results: the consistency of findings across studies.
- Applicability of evidence: determined by the evidence's ability to answer your clinical question.
Discuss implications for practice, education, or research. The discussion may include suggestions or recommendations for changes to practice, education or research as well as confirmation of current practice. A table may be used to display the information collected from the articles under discussion.
Synthesis table
| Topic | Article 1 [1st Author and Year] | Article 2 [1st Author and Year] | Article 3 [1st Author and Year] | Article 4 [1st Author and Year] |
| --- | --- | --- | --- | --- |
| Population | | | | |
| Setting | | | | |
| Outcome Measure(s) | | | | |
| Study Design | | | | |
| Intervention | | | | |
| Key Findings | | | | |
| Critical Appraisal | | | | |
| Study Quality | | | | |