- Jessica V Rich, Sue Fostaty Young, Catherine Donnelly, Andrew K Hall, J Damon Dagnone, Kristen Weersink, Jaelyn Caudle, Elaine Van Melle, and Don A Klinger.
- Department of Emergency Medicine, Queen's University, Kingston, Ontario, Canada.
- Faculty of Education, Queen's University, Kingston, Ontario, Canada.
- J Eval Clin Pract. 2020 Aug 1; 26 (4): 1087-1095.
Rationale, Aims, and Objectives: Programmatic assessment has been identified as a system-oriented approach to achieving the multiple purposes for assessment within competency-based medical education (CBME; i.e., formative, summative, and program improvement). While there are well-established principles for designing and evaluating programs of assessment, few studies illustrate, and critically interpret, what a system of programmatic assessment looks like in practice. This study aims to use systems thinking and the 'two communities' metaphor to interpret a model of programmatic assessment and to identify challenges and opportunities with its operationalization.

Method: An interpretive case study was used to investigate how programmatic assessment is being operationalized within one competency-based residency program at a Canadian university. Qualitative data were collected from residents, faculty, and program leadership via semi-structured group and individual interviews conducted nine months post-CBME implementation. Data were analyzed using a combination of data-based inductive analysis and theory-derived deductive analysis.

Results: In this model, Academic Advisors had a central role in brokering assessment data between the communities responsible for producing and using residents' performance information for decision making (i.e., formative, summative/evaluative, and program improvement). As system intermediaries, Academic Advisors were in a privileged position to see how the parts of the assessment system contributed to the functioning of the whole, and could identify which system components were not functioning as intended. Challenges were identified with the documentation of residents' performance information (i.e., system inputs); the use of low-stakes formative assessments to inform high-stakes evaluative judgments about the achievement of competence standards; and gaps in the feedback mechanisms for closing learning loops.

Conclusions: The findings of this research suggest that program stakeholders can benefit from a systems perspective on how their assessment practices contribute to the efficacy of the system as a whole. Academic Advisors are well positioned to support educational development efforts focused on overcoming challenges with operationalizing programmatic assessment.

© 2019 John Wiley & Sons, Ltd.