Editorial
Karnataka J Surg. 2025;2(1):4-5
doi: 10.25259/KJS_1_2025

Is OSCE the Answer to Proper Clinical Skills Assessment?

Department of Surgery, Shanthi Hospital and Research Centre, Bengaluru, India

*Corresponding author: Krishnaswamy Lakshman, Department of Surgery, Shanthi Hospital and Research Centre, Bengaluru, India, klakshman58@gmail.com

Licence
This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial-Share Alike 4.0 License, which allows others to remix, transform, and build upon the work non-commercially, as long as the author is credited and the new creations are licensed under the identical terms.

How to cite this article: Lakshman K. Is OSCE the Answer to Proper Clinical Skills Assessment? Karnataka J Surg. 2025;2:4–5. doi: 10.25259/KJS_1_2025

Surgery is a skills-based science. The purpose of all surgical training is to produce a safe and ethical surgeon who will be of real service to the community. The trainee surgeon has to learn several aspects of surgery, including analytical thinking, clinical examination, communication, requesting and interpreting investigations, and choosing the correct operation and performing it efficiently, all while keeping in mind the socio-economic and ethical aspects of surgical practice. An important aspect of surgical training is the proper assessment of all these skills.[1]

The conventional assessment of clinical skills is done through the discussion of long and short cases and a viva voce on instruments and specimens. Experts in medical education have highlighted several limitations of this method of assessment, as listed below.

- The scope of assessment is limited: the wide range of skills, such as communication, procedural details, and interpretation of results, in addition to history taking and physical examination, cannot be properly assessed by the conventional method.
- There is significant subjectivity in the assessment, with considerable observer bias. Examiners differ in the way they conduct the examination, and this hinders the examinee's ability to answer questions.
- The conventional method does not assess the higher levels of the so-called Miller's Pyramid of assessment. The pyramid has four levels, namely, knows, knows how, shows how, and does. Conventional assessment covers only the first two and neglects the higher levels of 'shows how' and 'does', which must be assessed for a proper evaluation of professional competencies.[2]
- The conventional method is static; it has continued for decades with little adaptation to the current real-world clinical situation.
- The conventional method offers no scope for feedback, either to the students or to the examiners.

To counter some of these limitations, Ronald Harden from Dundee devised the method of assessment called the 'Objective Structured Clinical Examination' (OSCE) in the 1970s.[3] In OSCE, two of the three variables in an assessment exercise, namely the patient and the examiner, are controlled; the only variable is the examinee. The examinee goes through several 'stations', each examining a particular skill: it may be history taking, communicating bad news to the family, performing a particular physical examination, reporting about a patient to a consultant surgeon, interpreting a laboratory result, an X-ray, a magnetic resonance imaging (MRI) scan, or a computed tomography (CT) scan, or even performing a given skill, such as suturing, in a simulated environment. All this is done in a standardised manner, with each examinee given the same time to perform the given focussed task, and the scoring is done through a checklist of what is expected. As we can see, the scope of assessment is very wide, and several aspects of professional competence can be assessed. Importantly, subjectivity is removed from the scoring process.

The effectiveness of OSCE in assessing professional competence is well established.[4,5] OSCEs offer several advantages over the conventional method. Standardised patients and scenarios, with controlled scoring, lead to objectivity. With many stations in play, the assessment is comprehensive. Standardised patients, often trained actors, add realism to the assessment. The stress is on applied knowledge; rote learning and recall alone will not be adequate. OSCE offers flexibility in that the examiner can adapt the same scenario to different levels of competence, such as the undergraduate or postgraduate level. Several candidates can be rotated through the various stations, saving time and increasing efficiency.[6]

What do students and teachers think about OSCE? It is generally well received by both. Students express positive sentiments regarding the fairness, validity, relevance, and conduct of the examination, and the majority of examiners express satisfaction with how an OSCE examination is administered.[7]

Some of the downsides of OSCE are as follows:

- Students feel pressured by the time constraints imposed at each station and report higher anxiety levels.
- Designing and running the stations takes considerable time and effort, and the scoring sheets require careful thought.
- For examiners, the standardisation can make the assessment feel mechanical.

In conclusion, OSCE has proved to be a worthy tool in the assessment of professional competencies. It offers several advantages over the conventional method. The design of stations can be time-consuming and challenging, but once it is done, the assessment is objective and unbiased. While some teething troubles may occur when OSCE is first introduced, these can be countered by training teachers to design the stations well and by giving students practice sessions before taking the examination. The general experience is that OSCE is well received.

References

  1. Shumway JM, Harden RM. AMEE Guide No. 25: The Assessment of Learning Outcomes for the Competent and Reflective Physician. Med Teach. 2003;25:569-84.
  2. Gupta P, Dewan P, Singh T. Objective Structured Clinical Examination (OSCE) Revisited. Indian Pediatr. 2010;47:911-20.
  3. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of Clinical Competence Using Objective Structured Examination. Br Med J. 1975;1:447-51.
  4. Gormley G. Summative OSCEs in Undergraduate Medical Education. Ulster Med J. 2011;80:127-32.
  5. , , , . Comparison of the Objective Structured Clinical Examination With the Performance of Third-Year Medical Students in Surgery. Am J Surg. 2000;179:286-88.
  6. Khan KZ, Ramachandran S, Gaunt K, Pushkar P. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part I: An Historical and Theoretical Perspective. Med Teach. 2013;35:e1437-46.
  7. , , , , , . An Evaluative Study of Objective Structured Clinical Examination (OSCE): Students and Examiners Perspectives. Adv Med Educ Pract. 2019;10:387-97.
