Workplace Based Assessment (WPBA)

Information about Workplace Based Assessments/Supervised Learning Events 

Workplace-based assessments (WPBA) and Supervised Learning Events (SLEs) will take place throughout the training programme to allow trainees to continually gather evidence of learning and to provide formative feedback. They are not individually summative (i.e. not "pass/fail") but overall outcomes from a number of such assessments provide evidence towards decision making. The WPBAs / SLEs must be spread evenly throughout the training year and not left until just before the ARCP process.

The Royal Colleges of Physicians have promoted the use of workplace-based assessments (DOPS, case-based discussion, mini-CEX and multi-source feedback) for trainees, having researched and piloted these techniques. In 2007 all specialties of medicine were required by the then Postgraduate Medical Education and Training Board (PMETB) to define assessment strategies to be followed by all trainees entering the new run-through training programmes. Each specialty now has an integrated assessment system which identifies the appropriate methods to be used to assess curriculum competencies. These systems combine workplace-based assessments, including "new" methods in addition to those mentioned above.

Updates for WPBAs/SLEs for August 2012

The JRCPTB has been reviewing the structure and process of our WPBAs following the AoMRC, COPMeD and GMC report on WPBAs, which recommended dropping the word "assessment" and emphasising the formative nature of WPBAs. The JRCPTB established a working group in January 2011 to examine the recommendations and to advise on the need for changes to the system of WPBAs in the medical specialties.

The working group made a number of key recommendations for a revised assessment system. Some of the changes, such as the trialling of Assessments of Performance (AoPs), the introduction of private SLEs (i.e. not seen by the ARCP panel) and the use of a formal consultant feedback report, are being tested in a pilot study. The pilot is currently being undertaken in three deaneries and involves ten specialties; for further details please see the WPBA 2012 pilot page.

Other recommendations were implemented across the board and came into effect on 1st August 2012 for all trainers and trainees. These are:

  • Assessments should take place throughout the training year. Clustering of assessments just prior to an ARCP panel will not be acceptable.
  • Introduction of specific assessment forms for core medical trainees (CMT) and higher specialty trainees (HST), i.e. there is a mini-CEX form for CMT and a separate mini-CEX form for HST, using different anchor statements in the rating scales.
  • The 'radio buttons' previously used on WPBA forms have been removed and replaced with free-text boxes.
  • CbDs, mini-CEXs and ACATs are now categorised as SLEs in the ePortfolio.
  • Directly observed procedural skills (DOPS) have been separated into two categories, routine and potentially life-threatening procedures, with a clear differentiation of formative and summative sign-off. Formative DOPS for routine and potentially life-threatening procedures should be undertaken before doing a summative DOPS and can be undertaken as many times as the trainee and their supervisor feel is necessary. Summative DOPS should be undertaken as follows:
    • Summative sign-off for routine procedures is to be undertaken on one occasion with one assessor to confirm clinical independence.
    • Summative sign-off for potentially life-threatening procedures is to be undertaken on two occasions with two different assessors (one assessor per occasion).

A list of all potentially life-threatening procedures for CMT, GIM, AIM and palliative medicine can be viewed here.

WPBA Requirements for Specialty Registrars (StRs)

For trainees who started in CMT or specialty training from 2007 onwards, the selection and numbers of assessments required are documented in the appropriate ARCP Decision Aid which can be found on the relevant specialty page of this website.

Assessment Methods

The following paragraphs briefly describe the assessment methods to be used. These methods will be used in different ways by different specialties, and not all specialties will use all methods.

Multi-source feedback (MSF)

This tool is a method of assessing generic skills, such as communication, leadership, team working and reliability, across the domains of Good Medical Practice. It provides objective, systematic collection and feedback of performance data on a trainee, derived from a number of colleagues. 'Raters' are individuals with whom the trainee works, and include doctors, administrative staff and other allied health professionals. The trainee does not see the individual responses from raters; feedback is given to the trainee by the Educational Supervisor.

mini-Clinical Evaluation Exercise (mini-CEX)

This tool evaluates a clinical encounter with a patient to provide an indication of competence in skills essential for good clinical care such as history taking, examination and clinical reasoning. The trainee receives immediate feedback to aid learning. It can be used at any time and in any setting when there is a trainee and patient interaction and an assessor is available.

Direct Observation of Procedural Skills (DOPS)

A DOPS is an assessment tool designed to evaluate the performance of a trainee in undertaking a practical procedure, against a structured checklist. The trainee receives immediate feedback to identify strengths and areas for development.

Case-based Discussion (CbD)

The CbD assesses the performance of a trainee in their management of a patient to provide an indication of competence in areas such as clinical reasoning, decision-making and application of medical knowledge in relation to patient care. It also serves as a method to document conversations about, and presentations of, cases by trainees. The CbD should focus on a written record (such as written case notes, out-patient letter, discharge summary). A typical encounter might be when presenting newly referred patients in the out-patient department.

Acute Care Assessment Tool (ACAT)

The ACAT is designed to assess and facilitate feedback on a doctor's performance during their practice on the Acute Medical Take. Any doctor who has been responsible for the supervision of the Acute Medical Take can be the assessor for an ACAT.

Patient Survey (PS)

The Patient Survey addresses issues, including the behaviour of the doctor and the effectiveness of the consultation, which are important to patients. It is intended to assess the trainee's performance in areas such as interpersonal skills, communication skills and professionalism by concentrating solely on their performance during one consultation.

Patient survey forms (which are not currently available on the ePortfolio) are available here.

Audit Assessment (AA)

The Audit Assessment tool is designed to assess a trainee's competence in completing an audit. The Audit Assessment can be based on a review of audit documentation OR on a presentation of the audit at a meeting. If possible, the trainee should be assessed on the same audit by more than one assessor. The Audit Assessment tool is now available in the ePortfolio.

Teaching Observation (TO)

The Teaching Observation is designed to provide structured, formative feedback to trainees on their competence at teaching. The Teaching Observation can be based on any instance of formalised teaching by the trainee which has been observed by the assessor. The process should be trainee-led (identifying appropriate teaching sessions and assessors). The Teaching Observation is also now available in the ePortfolio.

Quality improvement project assessment tool (QIPAT) - CMT, GIM and AIM only

The QIP Assessment tool is designed to assess a trainee's competence in completing a quality improvement project. The trainee should be given immediate feedback to identify strengths and areas for development. All workplace-based assessments are intended primarily to support learning, so this feedback is very important.

The QIP Assessment can be based on a review of QIP documentation OR on a presentation of the QIP at a meeting. If possible, the trainee should be assessed on the same QIP by more than one assessor.

Assessors can be any doctor with suitable experience; for trainees in higher specialty training this is likely to mean consultants. Some curricula may have specific requirements for the number of consultant assessments.

Multiple Consultant Report

The Multiple Consultant Report (MCR) was introduced in October 2013 as part of JRCPTB's programme to improve workplace-based assessment. The MCR effectively replaces the clinical supervisor's report previously found in the ePortfolio.

This form is additional to the Multi Source Feedback tool (MSF). It should only be completed by consultants and is intended to focus specifically on clinical performance. The responses given will contribute to the Educational Supervisor's report, so you should aim to give an accurate description of the trainee's abilities.

You can read more about the introduction of the MCR here.

WPBA Requirements for Specialist Registrars (SpRs) 

The JRCPTB has recently become aware of confusion about the use of workplace-based assessments (WPBA) and knowledge-based assessments (KBA) for SpRs.    

It was never the intention of the Board to mandate the use of WPBA for SpRs, although it was hoped that most would use them on a voluntary basis. In a statement dated 9th January 2009, published on this website, the Board stated its expectation that most SpRs would avail themselves of these useful tools in the run-up to their RITA episodes.

Similarly, KBA (in the form of specialty-specific examinations (SCEs)) can only be mandated for those entering training against a curriculum that requires such an assessment. In effect, this means that only those entering training from August 2007, when the new StR curricula came into effect, will have to pass the stipulated KBA for their specialty. It must be emphasised that SpRs cannot be required to sit or pass the KBA as a pre-condition for the award of CCT.

The Board has issued this updated statement.

SpRs who are following pre-2007 curricula and who choose to use workplace-based assessments should use the table below as guidance when gathering evidence:

|          | Frequency of assessments | Number of assessors      | Number of assessments per assessor | Indicative time requirements |
|----------|--------------------------|--------------------------|------------------------------------|------------------------------|
| mini-CEX | 4 per year               | 2                        | 2                                  | 4 hours per year per SpR     |
| DOPS*    | 6 over 4 years           | 2                        | 3                                  | 1 hour per year per SpR      |
| MSF      | 2 in 4 years             | 12-20 raters, 1 collator | 1 or 2                             | 1 hour per year per SpR      |

*Depends on the specialty requirements for procedure-specific DOPS.

Assessment Forms

StRs who are using the JRCPTB ePortfolio should use the online forms in the ePortfolio. SpRs should use the paper forms available here.

Evaluation for teaching and presentations

Trainees may also wish to seek feedback on their teaching and presentations. Please use the Evaluation Form for teaching and presentations, handing it out in advance to the audience and those you are teaching.

Training Events for Assessors

The JRCPTB considers it essential that assessors are trained in the use of all WPBAs before they undertake formal assessment of their trainees. Training is widely available in deaneries, and the Education Department at the RCP London has run tailored events for various specialist groups and in a number of deaneries.

 

Page updated 3 October 2013