- Simon M Smith, Peyton Davis, and Llion Davies.
- Oxford Deanery, Health Education Thames Valley; Emergency Department, Oxford University Hospitals NHS Trust, Oxford; Department of General Surgery, Royal Glamorgan Hospital, Llantrisant, UK.
- Eur J Emerg Med. 2015 Dec 1; 22 (6): 436-9.
Abstract

The most common method of assessing the quality of medical education is through a selection of qualitative assessments, usually as part of a programme evaluation. These commonly include measures of students' and teachers' participation, outcome measures such as assessment results, and qualitative methods such as interviews and questionnaires of students and teachers. Programme evaluation can therefore be a process that is both laborious and open to accusations of a lack of objectivity. As a result, a quantitative tool that could be used alongside a programme evaluation may be both useful and desirable. A pragmatic scoring system using routinely collected quantitative data, termed the Quality Assessment Tool, was developed during the 2013 academic year within an Emergency Medicine training programme in the UK. This tool was tested against the standard assessment currently used for the programme to establish whether the quantitative tool correlated with the programme evaluation. Second, the individual items within the tool were investigated to identify any correlations with the current assessment of quality established by the programme evaluation. The Quality Assessment Tool appears to correlate with the quality of training delivered at individual training sites in a single specialty. It reliably identifies the centres delivering the highest quality of training, and also those whose training is consistently of a lower standard. The tool is less accurate at ranking training centres whose training is merely 'satisfactory'; whether this reflects imprecision in the tool itself or the subjective nature of the current assessment (i.e. whether the current evaluation system lacks validity) cannot be stated.
In summary, it appears to be possible to use a single quantitative tool to reliably, and with validity, measure the quality of training delivered at a postgraduate medical training centre. Although it is not envisaged that this would, or should, replace ongoing quality assurance systems such as programme evaluations, a validated tool can be derived for a given setting to usefully inform and augment current quality management systems in postgraduate medical education.
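The abstract describes checking whether a quantitative tool's scores agree with an existing programme-evaluation ranking of training centres. The paper does not state which statistic was used; a standard way to quantify agreement between two rankings is Spearman's rank correlation, sketched below in pure Python. The centre scores are invented illustrative numbers, not data from the study.

```python
def rank(values):
    """Return ranks (1 = highest value), averaging ranks over ties."""
    order = sorted(range(len(values)), key=lambda i: -values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over any group of tied values.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank for the tied group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks


def spearman(x, y):
    """Spearman's rho: the Pearson correlation of the two rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)


# Invented example: tool scores vs. evaluation scores for five centres.
tool_scores = [82, 74, 91, 60, 70]
evaluation_scores = [80, 70, 95, 55, 72]
print(round(spearman(tool_scores, evaluation_scores), 3))  # → 0.9
```

A rho near 1 would indicate that the quantitative tool ranks centres in much the same order as the programme evaluation; the abstract's finding that agreement is strongest at the extremes suggests the middle ranks would contribute most of the disagreement.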