• Medical education · Aug 2014

    Meta-Analysis

    How much evidence does it take? A cumulative meta-analysis of outcomes of simulation-based education.

    • David A Cook.
    • Division of General Internal Medicine, Mayo Clinic College of Medicine, Rochester, Minnesota, USA; Center for Online Learning, Mayo Clinic College of Medicine, Rochester, Minnesota, USA; Mayo Multidisciplinary Simulation Center, Mayo Clinic College of Medicine, Rochester, Minnesota, USA.
    • Med Educ. 2014 Aug 1;48(8):750-60.

    Context: Studies that investigate research questions that have already been resolved represent a waste of resources. However, the failure to collect sufficient evidence to resolve a given question results in ambiguity.

    Objectives: The present study was conducted to reanalyse the results of a meta-analysis of simulation-based education (SBE) to determine: (i) whether researchers continue to replicate research studies after the answer to a research question has become known, and (ii) whether researchers perform enough replications to definitively answer important questions.

    Methods: A systematic search of multiple databases to May 2011 was conducted to identify original research evaluating SBE for health professionals in comparison with no intervention or any active intervention, using skill outcomes. Data were extracted by reviewers working in duplicate. Data synthesis involved a cumulative meta-analysis to illuminate patterns of evidence by sequentially adding studies according to a variable of interest (e.g. publication year) and re-calculating the pooled effect size with each addition. Cumulative meta-analysis by publication year was applied to 592 comparative studies using several thresholds of 'sufficiency', including: statistical significance; stable effect size classification and magnitude (Hedges' g ± 0.1); and precise estimates (confidence intervals of less than ± 0.2).

    Results: Among studies that compared the outcomes of SBE with those of no intervention, evidence supporting a favourable effect of SBE on skills existed as early as 1973 (one publication) and further evidence confirmed a quantitatively large effect of SBE by 1997 (28 studies). Since then, a further 404 studies were published. Among studies comparing SBE with non-simulation instruction, the effect initially favoured non-simulation training, but the addition of a third study in 1997 brought the pooled effect to slightly favour simulation, and by 2004 (14 studies) this effect was statistically significant (p < 0.05) and the magnitude had stabilised (small effect). A further 37 studies were published after 2004. By contrast, evidence from studies evaluating repetition continued to show borderline statistical significance and wide confidence intervals in 2011.

    Conclusions: Some replication is necessary to obtain stable estimates of effect and to explore different contexts, but the number of studies of SBE often exceeds the minimum number of replications required.

    © 2014 John Wiley & Sons Ltd.
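    To make the cumulative procedure concrete, the sketch below is an illustrative re-creation (not the authors' analysis code) of how studies can be added in publication order and the pooled effect re-estimated at each step. It assumes each study contributes a Hedges' g and its standard error, uses a simple fixed-effect inverse-variance model, and checks simplified versions of the three sufficiency thresholds described above (statistical significance, a pooled estimate stable within ± 0.1 of the previous step, and a 95% confidence interval narrower than ± 0.2). The example data and the function name cumulative_meta_analysis are hypothetical.

```python
# Illustrative sketch of a cumulative meta-analysis (fixed-effect,
# inverse-variance pooling). This is NOT the published analysis code;
# the data and thresholds below are simplified for demonstration.

import math
from typing import List, Tuple


def cumulative_meta_analysis(studies: List[Tuple[int, float, float]]):
    """studies: (publication_year, hedges_g, standard_error), any order."""
    history = []
    weight_sum = 0.0
    weighted_g_sum = 0.0
    prev_g = None
    # Add studies one at a time in publication order and re-pool.
    for year, g, se in sorted(studies):
        w = 1.0 / (se ** 2)                 # inverse-variance weight
        weight_sum += w
        weighted_g_sum += w * g
        pooled_g = weighted_g_sum / weight_sum
        pooled_se = math.sqrt(1.0 / weight_sum)
        ci_half_width = 1.96 * pooled_se    # 95% CI half-width
        sufficiency = {
            # statistically significant at p < 0.05 (95% CI excludes 0)
            "significant": abs(pooled_g) > ci_half_width,
            # pooled estimate changed by no more than +/- 0.1 from last step
            "stable": prev_g is not None and abs(pooled_g - prev_g) <= 0.1,
            # precise: 95% CI half-width below 0.2
            "precise": ci_half_width < 0.2,
        }
        history.append((year, pooled_g, ci_half_width, sufficiency))
        prev_g = pooled_g
    return history


# Hypothetical example data: (year, Hedges' g, SE) for three studies.
example = [(1973, 0.9, 0.40), (1985, 0.7, 0.25), (1997, 0.8, 0.15)]
for year, g, ci, checks in cumulative_meta_analysis(example):
    print(f"{year}: pooled g = {g:.2f} (95% CI +/- {ci:.2f}) {checks}")
```

    In the paper the same idea is applied with the studies' actual effect sizes, grouped by comparison type (SBE versus no intervention, versus non-simulation instruction, and so on), to show in which year each sufficiency threshold was first reached.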
