ICBSE: February 2009
Intercollegiate Examiners’ Newsletter
Welcome to the first edition of the intercollegiate examiners’ newsletter!
Its purpose is to keep you informed about changes in the MRCS and DO-HNS examinations and to help create a sense of intercollegiality.
I was appointed chairman of the Intercollegiate Committee for Basic Surgical Examinations (ICBSE) in July 2007, succeeding Mr David Ward. My term of office is three years. In my daytime job I am a Consultant Trauma Orthopaedic Surgeon at The Royal Infirmary of Edinburgh. However, I was born in London and trained at
University College Hospital, London. My postgraduate training was in London,
Yorkshire, Oswestry, Stoke-on-Trent, Seattle and Oxford. I have both the English
and Edinburgh FRCS and an Edinburgh FRCP. I was co-convener of Examinations
of the Royal College of Surgeons of Edinburgh and am a member of its Council.
As chairman of ICBSE I lead the committee that governs the operation, regulation
and development of the intercollegiate MRCS and DO-HNS and am responsible for
the following sub-committees: OSCE; Syllabus; MCQ Paper Panel & Question
Quality; DO-HNS; Internal Quality Assurance; Clinical; Oral and Communications
Skills. I represent ICBSE at the Joint Committee on Surgical Training (JCST), Joint
Surgical Colleges Meeting (JSCM), The Senate of Surgery, Curriculum Development
and Assessment Sub-Group of ISCP, Joint Surgical Colleges Planning and Review
Committee and Joint Committee of Intercollegiate Exams (JCIE).
There has been significant change in surgical training and assessment in 2008 and I
hope we can now have a period of stability and collaboration to build an examination
that is robust, reliable and fit for purpose. A tremendous amount of work has been
performed by ICBSE in the construction and implementation of the new Objective
Structured Clinical Examination (OSCE) which replaces the Part 3 oral, clinical and
communication skills components for all new trainees in the UK. In 2009 we hope to
appoint an OSCE question bank editor to help manage the complex scenario writing
process. The three-part MRCS is end-dated in the UK in 2010 but will still run
overseas. We need to train new examiners and continue to sustain that exam as we
improve the new one.
I hope this newsletter will be a forum for disseminating wisdom relating to the MRCS
and DO-HNS examinations. We welcome your input, comments and feedback.
Mr Chris Oliver
Chairman ICBSE
http://www.intercollegiatemrcs.org.uk/
cwoliver@btopenworld.com
http://www.rcsed.ac.uk/fellows/cwoliver
For those who will mourn
Some of you have now examined in
the new OSCE. Others will do so soon.
The new-format MRCS provides an examination that fits the MMC career structure and complements workplace-based assessments. Crucially, it provides an opportunity to test the validity of those assessments.
It has been designed to maintain the
standards expected by the Colleges
and to provide a more objective and
reproducible test. Whilst familiar
elements of the oral, clinical and
communication skills exams are
retained, we now assess new areas
such as manual surgical skills, patient
safety and examining a patient with an
acute problem.
The new exam is longer than the
present orals, clinicals and
communication skills combined. It is a
much more sophisticated version of
the OSCE that you may be familiar
with from examining undergraduates.
The marking system is more complex than the tick-box of a conventional OSCE, so that we can ensure candidates are competent in the required content areas and domains, i.e. they know lots, communicate well, and have manual skills and judgement. It
requires a high level of examiner
concentration but you, the examiners,
decide whether a candidate passes or
fails a station. The overall standard of
the examination rests with your
judgement.
The development period was compressed: the first meeting of the Intercollegiate group was held in October 2007, the pilot ran in April 2008, PMETB approval came in June 2008 and the first diet took place in October 2008. It involved a lot of work
over the summer by numerous
individuals - both examiners and staff.
Much was learnt. Systems are now
settling down and the process will be
even smoother in the future.
The first diet produced a 62 per cent
pass rate. Analysis suggests that the
candidates were an “above average”
group. The results correlated well with the stage of training, with higher pass rates in ST2 and much lower rates in FY2, which is very encouraging. As
numbers increase we will have a much
better view of how it is working.
Any new process is unlikely to be
perfect straight away and there are
certainly those who will mourn the
passing of the more traditional
examinations and argue for the
inclusion of more anatomy, more
clinicals etc. The plan is to not alter the
examination for the first three diets.
When we have examined about a
thousand candidates, we will have a
thorough review. Your opinions are
going to be vital in this.
I would also urge you all to become
involved in the writing and testing of
stations. The format is very flexible
and we can test candidates on the
whole breadth and depth of the
syllabus. It is just up to the ingenuity of
you, the examiners, to produce
realistic OSCE stations that will be
enjoyable to examine and fairly test
the candidate’s ability.
Christopher M Butler MS, FRCS,
Chairman ICBSE OSCE Sub-Group
What you don’t see
Setting up and delivering a new form
of an examination – especially one as
complex and innovative as the MRCS
OSCE – is always going to be an
administrative challenge. Despite the
good intentions of all involved, there
was a great deal of last-minute review,
modification and sourcing of the
various materials for the first diet. The
following gives just a flavour of what
was involved.
The equipment, patient and actor
needs were extensive: articulated and
disarticulated skeletons, anatomy
specimens, pins, flags, suture pads,
false arm, artificial blood, beds and
linen, couches, chairs, real and
simulated patients, actors, screens -
and all this before we start on the
paperwork! All had to be sourced,
purchased and trialled before use.
With a 200-minute OSCE, breaks were needed in the circuits in addition to the two “rest stations” – these would give the candidates some relief but not the examiners. The layout within RCSEd facilitated two 10-station circuits – groups of candidates circulated within each circuit and had a 20-minute break after the first 10 stations. The challenge was keeping the groups apart during the break and during the transfer to the second circuit. We also had to
keep the morning and afternoon
candidates separate - the decision to
offer them a light lunch whilst
quarantined in a separate room
seemed to placate those who were
hoping for a quick escape.
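A rough check of the timetable arithmetic, as my own illustration only: the newsletter does not state the station length, and I am assuming the 20-minute break falls within the 200 minutes.

```python
# Illustrative arithmetic only: the station length is inferred, not quoted in
# the newsletter, and the break may in fact sit outside the 200 minutes.
TOTAL_MINUTES = 200   # overall OSCE length mentioned in the text
STATIONS = 20         # two circuits of 10 stations, including the two rest stations
BREAK_MINUTES = 20    # break between the two circuits

minutes_per_station = (TOTAL_MINUTES - BREAK_MINUTES) / STATIONS
print(f"Implied station length: {minutes_per_station:.0f} minutes")  # 9
```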
What became obvious during the planning was that the OSCE would take up a great deal of space – our new Quincentenary Hall, incorporating examination rooms and a clinical skills laboratory, provided the ideal venue.
Equipment apart, this OSCE runs on
paper. ICBSE supplied a master copy
of questions and mark sheets (in this
first diet sometimes modified at the
eleventh hour). This paperwork
contained information for examiners,
actors and patients, and listed
equipment. The administrative staff
had to extract the information required
for the actors and patients; separate the mark sheets, photocopy them (numerous times) and ensure that each had a candidate number; prepare a question
book for each examiner and laminate
the questions. Candidates were also
given a badge with their candidate
number, main specialty and subspecialty – which was very helpful for
the candidates who had forgotten the
specialty that they had selected. On
the day of the exam, all the marks were double-entered and cross-checked in order to eliminate errors. This was very labour-intensive and required maximum concentration, with up to 77 individual scores to be entered for each candidate.
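Double entry lends itself to a simple automated comparison. A minimal sketch, with invented scores and no claim to reflect the Colleges' actual system:

```python
def cross_check(first_entry, second_entry):
    """Compare two independently keyed lists of station scores for one
    candidate and return the positions where they disagree.

    Both lists are assumed to hold the same number of scores
    (up to 77 per candidate in the first diet)."""
    if len(first_entry) != len(second_entry):
        raise ValueError("Mismatched number of scores entered")
    return [i for i, (a, b) in enumerate(zip(first_entry, second_entry)) if a != b]

# Example: a discrepancy at position 2 is flagged for manual review.
discrepancies = cross_check([3, 4, 2, 4], [3, 4, 3, 4])
print(discrepancies)  # [2]
```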
On the day(s) it all went extremely well
- preparation, together with the determination and professionalism of the staff,
delivered the framework within which
the examiners were able to perform
their duties without incident.
Susan M Grant
Head of Surgical Examinations RCS Ed
A Lay Examiner’s Experience
One important change introduced in
the MRCS OSCE last October was the
inclusion of lay examiners in the
communication skills area – a clear
recognition of the advantages of
involving appropriately trained non-clinicians in assessment.
Communication skills are an important
dimension of any professional’s
performance. The OSCE
communication skills station involved
an actor in a role play with a candidate
being observed by both lay and
clinician examiners.
All lay examiners were selected
through an interview process and had
to undergo a full day’s specific
communication skills training as well
as the general OSCE examiner
training. For my own part I found the
training very professionally run. The
general examiner training broke down
any barriers which could have existed
between lay and clinician examiners,
making the task of approaching the
first OSCE a real team challenge.
The MRCS is an important stage in a young doctor’s career, so some important safety checks were introduced around the quality of our marking. Although the lay examiner evaluated specific communication skills alongside the clinician, when a single global performance grade was given for a candidate on the station overall, the clinician’s assessment took precedence in the event of a difference of opinion.
Quality assurance checks were also set up to record differences between lay and clinician examiner marking and to compare them with bays where two clinicians carried out the assessments. Interestingly, these showed almost identical results.
The first impression I had was one of
noise and bustle. Clear and energetic
bell ringing marked the start and end
of each bay session and then we had
the sounds of various interactive bays
echoing around the hall. I am sure that
even the most experienced examiner
feels some nervousness with the
earlier candidates. Given the concentration required and the necessary time pressures on candidates and examiners, this very quickly disappears.
I was surprised that examination
sessions were so physically tiring.
From my perspective I found it helpful
to have experienced people around.
It is difficult to overstate the size of the
logistical job which the Examination
Departments in Colleges undertake in
setting up a new exam and enabling
the complex process of fair evaluation
to take place.
Overall - enjoyable and worthwhile.
Colin Slatter
Lay Examiner
The need to know...
The MRCS exists to determine that
trainees have acquired the knowledge,
skills and attributes required for
completion of core surgical training.
The OSCE has been developed to test
a wide range of professional skills to a
given level of competence compatible
with the objective of the examination:
clinical skills of history taking and
physical examination; interpretation of
data and subsequent clinical
management including critical care;
some surgical techniques; oral and
written communication skills. There
are also tests of underpinning
knowledge in anatomy and surgical
pathology. For most of us used to the
old style examinations, this represents
a considerable change.
Professional practice can be divided
into areas or ‘domains’ for assessment
purposes. For the OSCE, six domains
are recognised, namely:
• Clinical knowledge
• Clinical skill
• Technical skill
• Communication
• Decision-making, problem-solving,
situation awareness and
judgement
• Organisation and planning.
Each of these is tested across a
number of the professional skills as
listed above. For example,
communication skills may be
examined, not only in the dedicated
communication skills stations, but also
whilst taking a history or performing a
clinical examination. Up to four
domains can be assessed in each
station.
Examiners need to know which
domains are to be assessed in their
station and how each is marked using
the descriptors provided. They also
need to make a professional
judgement about how well the
candidate did in the station overall by deciding whether the performance is a fail, borderline fail, borderline pass, or pass.
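As a purely illustrative way of picturing what an examiner records at a station, here is a minimal sketch of a station mark sheet; the field names, the example station and the numeric marks are my assumptions, not the actual ICBSE paperwork.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a single OSCE station mark sheet. Field names, the
# example station and the numeric marks are illustrative, not ICBSE documents.

DOMAINS = (
    "Clinical knowledge", "Clinical skill", "Technical skill", "Communication",
    "Decision-making, problem-solving, situation awareness and judgement",
    "Organisation and planning",
)
GLOBAL_GRADES = ("fail", "borderline fail", "borderline pass", "pass")

@dataclass
class StationMarkSheet:
    station: str
    domain_marks: dict = field(default_factory=dict)  # domain -> mark against the descriptors
    global_grade: str = "borderline pass"             # examiner's overall judgement

    def validate(self):
        assert 1 <= len(self.domain_marks) <= 4, "up to four domains per station"
        assert all(d in DOMAINS for d in self.domain_marks)
        assert self.global_grade in GLOBAL_GRADES

sheet = StationMarkSheet(
    station="History taking (hypothetical example)",
    domain_marks={"Clinical knowledge": 3, "Communication": 4},
    global_grade="pass",
)
sheet.validate()
```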
Examiners must therefore be familiar
with the subject matter of the five
professional skill areas, the domains
which are to be assessed within each
and the criteria for the award of marks.
In practice most examiners have no
difficulty with ‘fail’ or ‘pass’. The
greatest area of uncertainty is between
‘borderline fail’ and ‘borderline pass’.
This is the area that requires the most expertise, and examiner training and feedback are essential.
Feedback from the examiners has been very favourable, both on their training and on their experience in the examination. One area of concern is the use of the domain descriptors, and
this will be tackled in the training prior
to the February diet and in subsequent
training for new examiners, both
professional and lay.
An excellent curriculum has been produced, with the MRCS based
around it. With very little refinement,
and good examiner training, the
examination will be fully fit for purpose
to determine as objectively as possible
that trainees have reached a
significant milestone in their
professional development.
Rodney Peyton
Who was Cronbach and
should we care?
Reliability is one of the big issues in
assessment. In examinations, we try to
measure mental attributes, ideally with
the same precision with which we can
measure objects in the physical world.
Unfortunately, human beings change.
Attempts to estimate reliability by testing
candidates on two occasions in the
expectation of achieving the same
results are doomed to failure: the
candidates learn (or more likely forget),
even if the interval between the tests is
short; they become fatigued, bored ...
There are threats to reliability in most
forms of assessment. In the essay
examination two examiners may award
differing marks to the same script, or
one examiner may give differing marks
to the same script marked on different
occasions.
The attraction of the MCQ, from which
the human element is largely excluded,
is therefore apparent: the results are
likely to be more reliable. However, one
particular problem remains: sampling.
In quiz formats, contestants may feel
they would have done better if they had
been asked different questions (usually,
those asked of the other contestants).
We can therefore think of a test as a
sample of questions from a population
of all possible questions. Reliability in
these terms is a matter of possible error
or bias in sampling. We have set the
candidates one particular sample of
questions to answer; how confident can
we be that we would have achieved the
same results if we had used a different
sample of questions? Thus, statistics
are called upon to perform their familiar
inferential function: with what confidence can we generalise from a sample to a population? The answer is termed a "reliability coefficient".
In the 1930s the problem was
addressed by Kuder and Richardson,
and their formula 20 (known as KR-20)
solved the problem for the particular
case where multiple-choice questions
are used and are scored either 0 or 1.
Later, a more general solution was proposed by Cronbach, and his alpha formula (Cronbach's alpha for short) is now the most widely used means of estimating test reliability.
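For reference, the standard textbook forms of the two coefficients are shown below; the newsletter itself gives no formulas.

```latex
% Standard textbook forms (not quoted in the newsletter).
% k = number of items; p_i = proportion answering item i correctly; q_i = 1 - p_i;
% sigma_{Y_i}^2 = variance of scores on item i; sigma_X^2 = variance of total scores.
\mathrm{KR\text{-}20} = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} p_i q_i}{\sigma_X^2}\right),
\qquad
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_{Y_i}^2}{\sigma_X^2}\right)
```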
The coefficient has values between 0
and 1, where 1 would mean a perfectly
reliable test. This is no more than a
theoretical possibility, but it can be
approached by tests which take a broad
sample of knowledge and are carefully
constructed. The minimum acceptable
value of the coefficient is often taken to
be 0.80, but some require a minimum of
0.90 for "high-stakes" examinations.
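To make the calculation concrete, here is a minimal sketch of Cronbach's alpha computed from a candidates-by-items score matrix; the data are invented toy numbers, not real examination results. With 0/1 scoring the same calculation gives KR-20.

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a list of candidates' item-score lists.

    scores[c][i] is candidate c's score on item i. With 0/1 scoring this
    is equivalent to KR-20."""
    n_candidates = len(scores)
    n_items = len(scores[0])

    def variance(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [variance([scores[c][i] for c in range(n_candidates)])
                 for i in range(n_items)]
    total_var = variance([sum(row) for row in scores])
    return (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)

# Toy 0/1 data for four candidates on five items (not real MRCS results).
toy = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
]
print(round(cronbach_alpha(toy), 2))  # 0.81 for this toy matrix
```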
The MCQ which forms Part A of the
intercollegiate MRCS (in its latest
incarnation) regularly achieves a
reliability of 0.93. The OSCE still has
some way to go to match that.
John Foulkes
If you have any comments on this
newsletter, or would like to
contribute to the next edition, please
email: awoodthorpe@rcseng.ac.uk