Workplace Based Assessment (WPBA)
In 2012 the JRCPTB piloted revisions to the WPBA system to test their acceptability and feasibility. As a result of the pilot, the JRCPTB has developed recommendations for specialty trainee assessment and review to be implemented from 6th August 2014 for all trainees.
The revisions to the WPBA were undertaken for two reasons:
1. Widespread feedback from trainees and supervisors that the WPBA system was burdensome, regarded by many as a tick-box exercise incapable of detecting poorly performing trainees. Its incorrect use to determine trainee progression was also impairing its proper use as an aid to learning
2. In November 2011 the GMC publication Learning and assessment in the clinical environment: the way forward proposed two purposes of workplace-based assessment: formative supervised learning events (SLEs) and summative assessments of performance (AoPs)
The pilot was designed to test the GMC's recommendations and added the premise that trainees needed fewer assessments, with greater emphasis on feedback and trainee reflection, carried out at appropriate intervals throughout the training year.
The lessons learned from the pilot - together with feedback from trainee and trainer surveys, specialist advisory committees (SACs) and heads of schools of medicine - contributed to a series of recommendations for the system of assessment and review of training to be implemented in August 2014.
The key recommendations are:
- AoPs did not function well as summative assessments and will not be part of our assessment strategy.
- SLEs will continue to use the established set of tools of mini-clinical evaluation exercise (mini-CEX), acute care assessment tool (ACAT) and case-based discussion (CbD) and will focus on constructive feedback and action plans.
- The educational supervisor's report is pivotal to the annual review of competence progression (ARCP). The educational supervisor should report on the trainee's engagement with the curriculum and learning demonstrated through SLEs and other evidence. The report must also include a summary of feedback received via the multiple consultant report (MCR) and multi-source feedback (MSF).
- The minimum number and type of SLEs will be clearly defined for each specialty by the SACs and will be documented in the ARCP decision aids.
- The number of links to curriculum competencies for different SLEs will be limited and clearly defined.
- Trainees may link SLEs and other evidence to curriculum competencies in order to demonstrate engagement with and exploration of the curriculum. The trainee has to make a judgement as to when a competence has been adequately explored and to link the evidence.
- Supervisors should sample the evidence linked to competencies in the ePortfolio. It is not necessary to examine all the competencies in order to determine a trainee's engagement with the curriculum, verify the accuracy of their statements and make a judgement on their progress.
- A number of medical specialty curricula have large numbers of competencies (in some cases more than 100). We are developing checklists for core medical training (CMT) and acute internal medicine (AIM) to guide trainees and supervisors during specialty placements on the potential for group sign off of competencies, on the basis of the supervisor's deep knowledge of the trainee's performance, rather than portfolio linkages.
- Ten of the common competencies will not require linked evidence in the ePortfolio.
We have produced trainee and supervisor guidance to help you to understand the new process.
These documents should be referred to along with the recommendations report and relevant ARCP decision aid.
The WPBA evaluation report, completed by the Royal College of Physicians, London on behalf of JRCPTB, is available to download below:
Further information on the WPBA pilot is available on the WPBA pilot page.
If you have any queries regarding the recommendations please contact firstname.lastname@example.org
For trainees who started in CMT or specialty training from 2007 onwards, the selection and numbers of assessments required are documented in the appropriate ARCP Decision Aid which can be found on the relevant specialty page of this website.
The following paragraphs briefly describe the assessment methods to be used. These methods will be used in different ways by different specialties, and not all specialties will use all methods.
Multi-source feedback (MSF)
This tool is a method of assessing generic skills such as communication, leadership, team working and reliability across the domains of Good Medical Practice. It provides objective, systematic collection and feedback of performance data on a trainee, derived from a number of colleagues. 'Raters' are individuals with whom the trainee works, and include doctors, administrative staff and other allied professionals. The trainee will not see the individual responses by raters; feedback is given to the trainee by the Educational Supervisor.
mini-Clinical Evaluation Exercise (mini-CEX)
This tool evaluates a clinical encounter with a patient to provide an indication of competence in skills essential for good clinical care such as history taking, examination and clinical reasoning. The trainee receives immediate feedback to aid learning. It can be used at any time and in any setting when there is a trainee and patient interaction and an assessor is available.
Direct Observation of Procedural Skills (DOPS)
A DOPS is an assessment tool designed to evaluate the performance of a trainee in undertaking a practical procedure, against a structured checklist. The trainee receives immediate feedback to identify strengths and areas for development.
Procedures have been separated into two categories, routine and potentially life-threatening, with a clear differentiation between formative and summative sign off. A formative DOPS for routine and potentially life-threatening procedures should be undertaken before a summative DOPS and can be undertaken as many times as the trainee and their supervisor feel necessary. Summative DOPS should be undertaken as follows:
- summative sign off for routine procedures to be undertaken on one occasion with one assessor to confirm clinical independence.
- summative sign off for potentially life threatening procedures to be undertaken on two occasions with two different assessors (one assessor per occasion).
A list of all potentially life threatening procedures for CMT, GIM, AIM and palliative medicine can be viewed here.
Case-based Discussion (CbD)
The CbD assesses the performance of a trainee in their management of a patient to provide an indication of competence in areas such as clinical reasoning, decision-making and application of medical knowledge in relation to patient care. It also serves as a method to document conversations about, and presentations of, cases by trainees. The CbD should focus on a written record (such as written case notes, out-patient letter, discharge summary). A typical encounter might be when presenting newly referred patients in the out-patient department.
Acute Care Assessment Tool (ACAT)
The ACAT is designed to assess and facilitate feedback on a doctor's performance during their practice on the Acute Medical Take. Any doctor who has been responsible for the supervision of the Acute Medical Take can be the assessor for an ACAT.
Patient Survey (PS)
The patient survey addresses issues that are important to patients, including the behaviour of the doctor and the effectiveness of the consultation. It is intended to assess the trainee's performance in areas such as interpersonal skills, communication skills and professionalism by concentrating solely on their performance during one consultation.
Patient survey forms (which are not currently available on the ePortfolio) are available here:
Audit Assessment (AA)
The Audit Assessment tool is designed to assess a trainee's competence in completing an audit. The Audit Assessment can be based on a review of audit documentation OR on a presentation of the audit at a meeting. If possible the trainee should be assessed on the same audit by more than one assessor. The Audit Assessment tool is now available in the ePortfolio.
Teaching Observation (TO)
The Teaching Observation is designed to provide structured, formative feedback to trainees on their competence at teaching. The Teaching Observation can be based on any instance of formalised teaching by the trainee which has been observed by the assessor. The process should be trainee-led (identifying appropriate teaching sessions and assessors). The Teaching Observation is also now available in the ePortfolio.
Quality improvement project assessment tool (QIPAT)
The QIP Assessment tool is designed to assess a trainee's competence in completing a quality improvement project. The trainee should be given immediate feedback to identify strengths and areas for development. All workplace-based assessments are intended primarily to support learning, so this feedback is very important.
The QIP Assessment can be based on review of QIP documentation OR on a presentation of the QIP at a meeting. If possible the trainee should be assessed on the same QIP by more than one assessor. Assessors can be any doctor with suitable experience - for trainees in higher specialty training this is likely to be consultants. Some curricula may have specific requirements for numbers of consultant assessments.
The QIPAT will primarily be used in Core Medical Training (CMT). It is also specified in the Neurology curriculum from August 2014. All specialty trainees may use the QIPAT as an alternative to the Audit Assessment (AA) on a voluntary basis.
Multiple Consultant Report
The Multiple Consultant Report (MCR) was introduced in October 2013 as part of JRCPTB's programme to improve workplace-based assessment. The MCR effectively replaces the clinical supervisor's report previously found in the ePortfolio.
This form is additional to the multi-source feedback (MSF) tool. It should only be completed by consultants and is intended to focus specifically on clinical performance. The responses given will contribute to the Educational Supervisor's report, and assessors should try to give an accurate description of the trainee's abilities.
You can read more about the MCR here.
Workplace-based assessment requirements for Specialist Registrars (SpRs) still following pre-2007 curricula can be accessed here.
StRs who are using the JRCPTB ePortfolio should use the online forms in the ePortfolio. SpRs should use the paper forms available here.
Evaluation for teaching and presentations
Trainees may also wish to seek feedback on teaching and presentations. Please use the Evaluation Form for teaching and presentations, handing it out in advance to the audience and those you are teaching.
Training for Assessors
The JRCPTB considers it essential that assessors are trained in the use of all WPBA tools before they undertake the formal assessment of their trainees.
Page updated 17 July 2014