The various workplace based assessment methods are described below. These methods will be used in different ways by different specialties and not all specialties will use all methods. Please refer to the relevant curriculum for details. These assessment tools are available online in the ePortfolio. Printable versions of some of the forms can be found in this website's document library by using the search function.
Acute care assessment tool (ACAT)
The ACAT is designed to be used for supervised learning events (SLEs) on the acute medical take, but may also be used on a ward round or to cover a day's management of admissions and ward work. The ACAT looks at clinical assessment and management, decision making, team working, time management, record keeping and handover across the whole time period and multiple patients. An ACAT assessment should include a minimum of five cases.
SLEs may be linked to curriculum competencies in the ePortfolio as evidence of engagement with, and exploration of, the curriculum. However, it is not appropriate for an SLE to be linked to large numbers of competencies and for this reason the number of links for an ACAT should be limited to eight curriculum competencies.
Audit assessment (AA)
The Audit Assessment tool is designed to assess a trainee's competence in completing an audit. The Audit Assessment can be based on a review of audit documentation or on a presentation of the audit at a meeting. If possible, the trainee should be assessed on the same audit by more than one assessor.
Case-based discussion (CbD)
The CbD is a tool for supervised learning events (SLEs) based on a trainee's management of a patient and provides feedback on clinical reasoning, decision-making and application of medical knowledge in relation to patient care. It also serves as a method to document conversations about, and presentations of, cases by trainees. The CbD should focus on a written record (such as written case notes, out-patient letter, discharge summary). A typical encounter might be when presenting newly referred patients in the out-patient department.
SLEs may be linked to curriculum competencies in the ePortfolio as evidence of engagement with, and exploration of, the curriculum. However, it is not appropriate for an SLE to be linked to large numbers of competencies and for this reason the number of links for CbDs should be limited to two competencies in the curriculum.
Direct observation of procedural skills (DOPS)
A DOPS is an assessment tool designed to evaluate the performance of a trainee in undertaking a practical procedure, against a structured checklist. The trainee receives immediate feedback to identify strengths and areas for development. DOPS have been separated into two categories, for routine and life-threatening procedures, with a clear differentiation between formative and summative sign-off. Formative DOPS for routine and potentially life-threatening procedures should be undertaken before doing a summative DOPS, and can be undertaken as many times as the trainee and their supervisor feel is necessary.
Mini-clinical evaluation exercise (mini-CEX)
This supervised learning event (SLE) tool evaluates a clinical encounter with a patient to provide feedback on skills essential for good clinical care such as history taking, examination and clinical reasoning. The trainee receives immediate feedback to aid learning. It can be used at any time and in any setting when there is a trainee and patient interaction and an assessor is available.
SLEs may be linked to curriculum competencies in the ePortfolio as evidence of engagement with, and exploration of, the curriculum. However, it is not appropriate for an SLE to be linked to large numbers of competencies and for this reason the number of links for mini-CEX should be limited to two competencies in the curriculum.
Multiple consultant report (MCR)
The Multiple consultant report (MCR) is designed to capture the views of consultant supervisors on a trainee's clinical performance. It must be completed by consultants or associate specialists/specialty doctors (not trainees) who are able to provide feedback on a trainee's clinical performance. Educational supervisors should not be asked to complete an MCR for their own trainees as they will complete the ES report.
Each MCR form is completed by a single consultant. Therefore, if four MCRs are required, four consultants should each complete a form, resulting in four MCR forms. The MCRs will be automatically collated and summarised in the MCR Year Summary Sheet, which will inform the educational supervisor report at the end of the training year. The MCR requests feedback on clinical performance and must be completed in addition to the Multi-source feedback (MSF) tool. The same consultant may be approached to complete both forms.
Guidance on the minimum number of MCRs required by specialty is provided below. Trainees who are less than full time should complete the number of MCRs pro rata, following discussion with their educational supervisor. The MCR can be found in this website's document library by using the search function.
Multi-source feedback (MSF)
This tool is a method of assessing generic skills such as communication, leadership, team working and reliability across the domains of Good Medical Practice. It provides objective, systematic collection of performance data on a trainee, and feedback on that data, derived from a number of colleagues. 'Raters' are individuals with whom the trainee works, and include doctors, administration staff and other allied professionals. The trainee will not see the individual responses by raters; feedback is given to the trainee by the educational supervisor.
Patient survey (PS)
The patient survey addresses issues that are important to patients, including the behaviour of the doctor and the effectiveness of the consultation. It is intended to assess the trainee's performance in areas such as interpersonal skills, communication skills and professionalism by concentrating solely on their performance during one consultation. Patient survey forms are not currently available on the ePortfolio but can be found in this website's document library by using the search function.
Quality improvement project assessment tool (QIPAT)
The QIP Assessment tool is designed to assess a trainee's competence in completing a quality improvement project. The trainee should be given immediate feedback to identify strengths and areas for development. All workplace-based assessments are intended primarily to support learning, so this feedback is very important.
The QIP Assessment can be based on a review of QIP documentation or on a presentation of the QIP at a meeting. If possible, the trainee should be assessed on the same QIP by more than one assessor. Assessors can be any doctor with suitable experience; for trainees in higher specialty training this is likely to be consultants. Some curricula may have specific requirements for the number of consultant assessments.
The QIPAT will primarily be used in Core Medical Training (CMT). It is also specified in the Neurology curriculum from August 2014. All specialty trainees may use the QIPAT as an alternative to the Audit Assessment (AA) if agreed with their educational supervisor.
Teaching observation (TO)
The Teaching Observation is designed to provide structured, formative feedback to trainees on their competence at teaching. The Teaching Observation can be based on any instance of formalised teaching by the trainee that has been observed by the assessor. The process should be trainee-led, with the trainee identifying appropriate teaching sessions and assessors.
Recommendations for specialty trainee assessment and review
In 2014 we introduced recommendations for specialty trainee assessment and review. We also produced trainee and trainer guidance which can be found in this website's document library by using the search function above.