Recommendations for specialty trainee assessment and review
In 2012 we piloted revisions to the WPBA system to test their acceptability and feasibility. As a result of the pilot, we introduced the April 2014 recommendations for specialty trainee assessment and review (PDF), effective from 6 August 2014 for all trainees.
The revisions to the WPBA were undertaken for two reasons:
1. Widespread comments from trainees and supervisors that the WPBA system was a burdensome process, regarded by many as a tick-box exercise incapable of detecting poorly performing trainees. Incorrect use to determine trainee progression was impairing the correct use of WPBAs as aids to learning.
2. In November 2011 the GMC publication Learning and assessment in the clinical environment: the way forward proposed two purposes of workplace-based assessment:
- Assessment for learning (formative assessment) - supervised learning events (SLEs)
- Assessment of learning (summative assessment) - assessments of performance (AoPs)
The pilot was designed to test the GMC's recommendations and added the premise that trainees needed fewer assessments with greater emphasis on feedback and trainee reflection, and carried out at appropriate intervals throughout the training year.
The lessons learned from the pilot - together with feedback from trainee and trainer surveys, specialist advisory committees (SACs) and heads of schools of medicine - contributed to a series of recommendations for the system of assessment and review of training to be implemented in August 2014.
The key recommendations were:
- AoPs did not function well as summative assessments and will not be part of our assessment strategy.
- SLEs will continue to use the established set of tools of mini-clinical evaluation exercise (mini-CEX), acute care assessment tool (ACAT) and case-based discussion (CbD) and will focus on constructive feedback and action plans.
- The educational supervisor's report is pivotal to the annual review of competence progression (ARCP). The educational supervisor should report on the trainee's engagement with the curriculum and learning demonstrated through SLEs and other evidence. The report must also include a summary of feedback received via the multiple consultant report (MCR) and multi-source feedback (MSF).
- The minimum number and type of SLEs will be clearly defined for each specialty by the SACs and will be documented in the ARCP decision aids.
- The number of links to curriculum competencies for different SLEs will be limited and clearly defined.
- Trainees may link SLEs and other evidence to curriculum competencies in order to demonstrate engagement with and exploration of the curriculum. It is the trainee's responsibility to judge when a competence has been adequately explored and to link the evidence accordingly.
- Supervisors should sample the evidence linked to competencies in the ePortfolio. It is not necessary to examine evidence for every competency in order to determine a trainee's engagement with the curriculum, verify the accuracy of their statements and make a judgement on their progress.
- A number of medical specialty curricula have large numbers of competencies (in some cases more than 100). We are developing checklists for core medical training (CMT) and acute internal medicine (AIM) to guide trainees and supervisors during specialty placements on the potential for group sign off of competencies, on the basis of the supervisor's deep knowledge of the trainee's performance, rather than portfolio linkages.
- Ten of the common competencies will not require linked evidence in the ePortfolio.
Trainee and trainer guidance
We have produced trainee and trainer guidance to help you to understand the new process.
These documents should be referred to along with the recommendations report and relevant ARCP decision aid.
- 2014 Changes to specialty trainee assessment and review - guidance for trainees.pdf
- 2014 Changes to specialty trainee assessment and review - guidance for supervisors.pdf
- 2014 Changes to specialty trainee assessment and review - guidance for programme directors.pdf
- 2014 Changes to specialty trainee assessment and review - guidance for programme directors in CMT, GIM and AIM.pdf
If you have any queries regarding the recommendations please contact email@example.com
The various workplace-based assessment methods are described below. These methods will be used in different ways by different specialties, and not all specialties will use all methods. Please refer to the relevant curriculum for details. These assessment tools are available online in the ePortfolio. Printable versions of some of the forms can be found in the document library on this website.
Acute Care Assessment Tool (ACAT)
The ACAT is designed to be used for supervised learning events on the acute medical take, but may also be used on a ward round or to cover a day's management of admissions and ward work. The ACAT looks at clinical assessment and management, decision making, team working, time management, record keeping and handover for the whole time period and multiple patients. There should be a minimum of five cases for an ACAT assessment.
Audit Assessment (AA)
The Audit Assessment tool is designed to assess a trainee's competence in completing an audit. The Audit Assessment can be based on review of audit documentation or on a presentation of the audit at a meeting. If possible the trainee should be assessed on the same audit by more than one assessor.
Case-based Discussion (CbD)
The CbD is a tool for supervised learning events based on a trainee's management of a patient and provides feedback on clinical reasoning, decision-making and application of medical knowledge in relation to patient care. It also serves as a method to document conversations about, and presentations of, cases by trainees. The CbD should focus on a written record (such as written case notes, out-patient letter, discharge summary). A typical encounter might be when presenting newly referred patients in the out-patient department.
Direct Observation of Procedural Skills (DOPS)
A DOPS is an assessment tool designed to evaluate the performance of a trainee in undertaking a practical procedure, against a structured checklist. The trainee receives immediate feedback to identify strengths and areas for development. DOPS have been separated into two categories for routine and life-threatening procedures, with a clear differentiation of formative and summative sign off. Formative DOPS for routine and potentially life-threatening procedures should be undertaken before a summative DOPS and can be undertaken as many times as the trainee and their supervisor feel is necessary. Summative DOPS should be undertaken as follows:
- summative sign off for routine procedures to be undertaken on one occasion with one assessor to confirm clinical independence.
- summative sign off for potentially life-threatening procedures to be undertaken on two occasions with two different assessors (one assessor per occasion).
Please refer to the relevant specialty ARCP decision aid and see below for a list of potentially life-threatening procedures in CMT, AIM, GIM and Palliative Medicine.
mini-Clinical Evaluation Exercise (mini-CEX)
This supervised learning event tool evaluates a clinical encounter with a patient to provide feedback on skills essential for good clinical care such as history taking, examination and clinical reasoning. The trainee receives immediate feedback to aid learning. It can be used at any time and in any setting when there is a trainee and patient interaction and an assessor is available.
Multi-source feedback (MSF)
This tool is a method of assessing generic skills such as communication, leadership, team working and reliability across the domains of Good Medical Practice. It provides objective, systematic collection and feedback of performance data on a trainee, derived from a number of colleagues. 'Raters' are individuals with whom the trainee works, and include doctors, administration staff and other allied professionals. The trainee will not see the individual responses by raters; feedback is given to the trainee by the educational supervisor.
Patient Survey (PS)
The Patient Survey addresses issues, including the behaviour of the doctor and the effectiveness of the consultation, which are important to patients. It is intended to assess the trainee's performance in areas such as interpersonal skills, communication skills and professionalism by concentrating solely on their performance during one consultation. Patient survey forms are not currently available on the ePortfolio and should be downloaded from the website (see below):
- 2014 Patient Survey Form.pdf
- 2014 Patient Survey cover letter.pdf
- 2014 Patient Survey Summary Form.docx
- 2014 Patient Survey Guidance for Trainees.pdf
Quality improvement project assessment tool (QIPAT)
The QIP Assessment tool is designed to assess a trainee's competence in completing a quality improvement project. The trainee should be given immediate feedback to identify strengths and areas for development. All workplace-based assessments are intended primarily to support learning, so this feedback is very important.
The QIP Assessment can be based on review of QIP documentation OR on a presentation of the QIP at a meeting. If possible the trainee should be assessed on the same QIP by more than one assessor. Assessors can be any doctor with suitable experience - for trainees in higher specialty training this is likely to be consultants. Some curricula may have specific requirements for numbers of consultant assessments.
The QIPAT will primarily be used in Core Medical Training (CMT). It is also specified in the Neurology curriculum from August 2014. All specialty trainees may use the QIPAT as an alternative to the Audit Assessment (AA) on a voluntary basis.
Teaching Observation (TO)
The Teaching Observation is designed to provide structured, formative feedback to trainees on their competence at teaching. The Teaching Observation can be based on any instance of formalised teaching by the trainee which has been observed by the assessor. The process should be trainee-led (identifying appropriate teaching sessions and assessors).
Multiple Consultant Report
The Multiple Consultant Report (MCR) was introduced in 2013 and is designed to capture the views of consultant supervisors on a trainee's clinical performance. The MCR must be completed by consultants or associate specialists/specialty doctors (not trainees) who are able to provide feedback on a trainee's clinical performance. Educational supervisors should not be asked to complete an MCR for their own trainees as they will complete the ES report.
Each MCR form is completed by a single consultant. Therefore if four MCRs are required, four consultants should each complete a form, resulting in four MCR forms. The MCRs will be automatically collated and summarised in the MCR Year Summary Sheet, which will inform the educational supervisor report at the end of the training year. The MCR requests feedback on clinical performance and must be completed in addition to the Multi Source Feedback (MSF) tool. The same consultant may be approached to complete both forms.
Guidance on the minimum number of MCRs required by specialty is provided below. Trainees who are less than full time should complete the number of MCRs pro rata following discussion with their educational supervisor.