The Western Journal of Emergency Medicine
-
Multicenter Study · Observational Study
Direct Observation Assessment of Milestones: Problems with Reliability.
Emergency medicine (EM) milestones are used to assess residents' progress. Although some validity evidence for the milestones exists, standardized tools to reliably assess residents are lacking, which raises the concern that we may not be measuring what we intend to assess. The purpose of this study was to design a direct observation milestone assessment instrument supported by validity and reliability evidence. Such a tool would also lend further validity evidence to the EM milestones by demonstrating their accurate measurement. ⋯ The validity and reliability of the current EM milestone assessment tools have yet to be determined. This study was a rigorous attempt to collect validity evidence in the development of a direct observation assessment instrument. However, despite strict attention to validity evidence, inter-rater reliability was low. The potential sources of reducible variance include rater- and instrument-based error. Based on this study, there may be concerns about the reliability of other EM milestone assessment tools currently in use.
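Inter-rater reliability for categorical milestone levels is commonly summarized with a chance-corrected agreement statistic. The abstract does not specify which statistic the authors used, so the following is only an illustrative sketch of Cohen's kappa for two hypothetical faculty raters; all ratings are invented.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same residents.

    kappa = (p_observed - p_expected) / (1 - p_expected)
    """
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    # Expected agreement if each rater assigned levels independently
    # at their own marginal rates
    p_exp = sum((count_a[c] / n) * (count_b[c] / n)
                for c in set(count_a) | set(count_b))
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical milestone levels assigned by two raters to eight residents
rater_a = [1, 2, 2, 3, 1, 2, 3, 3]
rater_b = [1, 2, 3, 3, 2, 2, 3, 1]
```

Values near 0 indicate agreement no better than chance; by common convention, values below roughly 0.4 are considered poor to fair, consistent with the low inter-rater reliability the authors report.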
-
Emergency medicine (EM) education is becoming increasingly challenging as a result of changes to North American medical education and the growing complexity of EM practice. Education scholarship (ES) provides a process to develop solutions to these challenges. ES includes both research and innovation. ⋯ Digital technologies have improved the discovery of work that informs ES, broadened the scope and timing of peer review, and provided new platforms for the dissemination and archiving of innovations. This editorial reviews key steps in raising an education innovation to the level of scholarship. It also discusses important areas for EM education scholars to address, which include the following: the delivery of competency-based medical education programs, the impact of social media on learning, and the redesign of continuing professional development.
-
Comparative Study · Observational Study
Emergency Medicine Residents Consistently Rate Themselves Higher than Attending Assessments on ACGME Milestones.
In 2012 the Accreditation Council for Graduate Medical Education (ACGME) introduced the Next Accreditation System (NAS), which implemented milestones to assess the competency of residents and fellows. While attending evaluation and feedback are crucial for resident development, a resident's self-assessment is perhaps equally important: if a resident does not accurately self-assess, clinical and professional progress may be compromised. The objective of our study was to compare emergency medicine (EM) residents' milestone evaluations by EM faculty with the same residents' self-assessments. ⋯ Residents overestimated their abilities in every sub-competency assessed, which underscores the importance of feedback and assessment transparency. More attention should be paid to methods by which residency leadership can make residents' self-perception of their clinical ability more congruent with that of their teachers and evaluators. The major limitation of our study is the small sample size of both residents and attendings.
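The gap the authors describe can be quantified per sub-competency as the mean difference between self and attending ratings. A minimal sketch with hypothetical data (the sub-competency labels and scores are invented, not taken from the study):

```python
from collections import defaultdict

def self_overrating_by_subcompetency(paired_ratings):
    """Mean (self minus attending) milestone score per sub-competency.

    paired_ratings: iterable of (subcompetency, self_score, attending_score).
    A positive mean indicates residents rate themselves above their faculty.
    """
    totals = defaultdict(lambda: [0.0, 0])
    for sub, self_score, attending_score in paired_ratings:
        totals[sub][0] += self_score - attending_score
        totals[sub][1] += 1
    return {sub: diff_sum / n for sub, (diff_sum, n) in totals.items()}

# Hypothetical paired evaluations for two sub-competencies
ratings = [
    ("PC1", 3.0, 2.5),
    ("PC1", 4.0, 3.0),
    ("MK",  3.5, 3.5),
]
```

A consistently positive mean across every sub-competency would reproduce the pattern the study reports.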
-
Multicenter Study · Observational Study
Correlation of the National Board of Medical Examiners Emergency Medicine Advanced Clinical Examination Given in July to Intern American Board of Emergency Medicine In-Training Examination Scores: A Predictor of Performance?
There is great variation in the knowledge base of Emergency Medicine (EM) interns in July. The first objective knowledge assessment during residency does not occur until eight months later, in February, when the American Board of EM (ABEM) administers the in-training examination (ITE). In 2013, the National Board of Medical Examiners (NBME) released the EM Advanced Clinical Examination (EM-ACE), an assessment intended for fourth-year medical students. Administering the EM-ACE to interns at the start of residency may provide an earlier opportunity to assess new EM residents' knowledge base. The primary objective of this study was to determine the correlation of the NBME EM-ACE, given early in residency, with the EM ITE. Secondary objectives included determining the correlation of United States Medical Licensing Examination (USMLE) Step 1 or 2 scores with early intern EM-ACE and ITE scores and the effect, if any, of clinical EM experience on examination correlation. ⋯ Given early in the intern year, the EM-ACE score showed a positive correlation with the ITE score. Clinical EM experience prior to the in-training exam did not affect the correlation.
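The score correlations described above are presumably standard product-moment correlations; the abstract does not show the analysis, so as an illustrative sketch only, a Pearson correlation between two sets of exam scores can be computed as:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / math.sqrt(var_x * var_y)

# Hypothetical EM-ACE and ITE scores for five interns (invented data)
em_ace = [60, 65, 70, 75, 80]
ite    = [62, 64, 71, 73, 82]
```

A value near +1 would correspond to the positive EM-ACE/ITE correlation the authors report; a value near 0 would indicate the July exam carries no predictive signal.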
-
Comparative Study
Combined Versus Detailed Evaluation Components in Medical Student Global Rating Indexes.
To determine whether any of the 10 individual components of a global rating index on an emergency medicine (EM) student clerkship evaluation form are correlated with one another and, if so, whether a weighted average of highly correlated components loses predictive value for the final clerkship grade. ⋯ This study revealed that several components of the evaluation card were highly correlated. Combining the correlated items, a reduced model containing four items (clinical skills, interpersonal skills, procedural skills, and documentation) was as predictive of the student's clinical grade as the full 10-item evaluation. Clerkship directors should be aware of the performance of their individual global rating scales when assessing medical student performance, especially if attempting to measure more than four components.
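The reduction described above amounts to averaging correlated items into composites and checking that the composites predict the grade about as well as the full item set. A minimal sketch, with an entirely hypothetical grouping of the 10 items standing in for the study's four retained components:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / math.sqrt(sum((a - mx) ** 2 for a in x) *
                           sum((b - my) ** 2 for b in y))

def reduce_items(item_scores, groups):
    """Collapse one student's 10-item rating into fewer composites by
    averaging the items within each group of correlated components.

    groups: lists of item indices; this 4-group split is hypothetical,
    standing in for the study's clinical skills, interpersonal skills,
    procedural skills, and documentation items.
    """
    return [sum(item_scores[i] for i in g) / len(g) for g in groups]

GROUPS = [[0, 1, 2], [3, 4], [5, 6, 7], [8, 9]]
```

One could then compare `pearson_r` of the full-model and reduced-model summary scores against the final clerkship grades to check that the four-composite model loses little predictive value.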