Final-year medical students’ self-assessment of their competence to perform 123 clinical skills: A cross-sectional study in Greece

RESEARCH ARTICLE

Hippokratia 2024, 28(3): 109-114

Savvidou E, Evangelidis N, Evangelidis P, Avramidou E, Nteli M, Nteli D, Nastas T, Sitmalidou M, Vitoris I, Smyrnakis E
Laboratory of Primary Health Care, General Practice and Health Services Research, School of Medicine, Aristotle University of Thessaloniki, Greece

Abstract

Background: Clinical skills teaching constitutes a fundamental part of the medical education curriculum. The School of Medicine of the Aristotle University of Thessaloniki in Greece has developed a six-year logbook that defines 123 basic clinical skills that every graduating student is expected to be competent to perform upon completion of undergraduate studies. The present study examines the self-assessed competence of final-year medical students to perform those 123 skills and the students’ suggestions for improving clinical skills teaching.

Methods: We conducted a cross-sectional study that collected data through an anonymous questionnaire distributed at the end of the academic years 2021-2022, 2022-2023, and 2023-2024 to the final-year medical students. We set three competency levels so participating final-year medical students could evaluate their ability to perform the clinical skills: can perform the skill i) independently, ii) with assistance, and iii) cannot perform the skill.

Results: Three hundred sixteen final-year medical students completed the survey (response rate: 31.01 %). Of the 123 skills, the majority of the respondents stated that they could perform 72.36 % on their own and 22.76 % with assistance (89 and 28 skills, respectively). For six of the 123 skills, the majority of students answered that they could not perform them.

Conclusions: This study highlights that most final-year medical students feel competent enough to perform most of the skills upon completing their undergraduate studies. Objective clinical examination is important to provide a more holistic view of the student’s clinical competency. Medical schools should invest in systematic training and upgrade the clinical skills teaching, especially those skills crucial for young doctors’ first steps as licensed physicians. HIPPOKRATIA 2024, 28 (3):109-114.

Keywords: Basic clinical skills, clinical competence, medical education, medical students, self-assessment 

Corresponding author: Professor Emmanouil Smyrnakis, Laboratory of Primary Health Care, General Practice and Health Services Research, Faculty of Health Sciences, School of Medicine, Aristotle University of Thessaloniki, 54636 Thessaloniki, Greece, tel: +302310999147, e-mail: smyrnak@auth.gr

Introduction

Undergraduate medical education aims to equip medical students with theoretical and foundational knowledge and to contribute to developing their clinical competency through clinical skills training1. Clinical skills are defined as the knowledge and practices related to safe and quality patient care that each physician should possess2. Teaching clinical skills has been integrated into medical education programs over the last decades3. However, studies have shown that even the most commonly used clinical skills (such as hand hygiene) are not performed successfully in everyday clinical practice, not only by senior medical students but by healthcare professionals as well4,5. Therefore, it is crucial that systematic and organized clinical skills training be integrated into all medical education curricula6. Clinical skills teaching should be initiated early in medical studies and continue until the final years7. Moreover, clinical skills training can be performed in both simulation and real-life clinical settings8. The development of clinical competency is a multi-level process, best described in Miller’s pyramid of clinical competence. Miller’s model aims to shape students’ knowledge, skills, and attitudes and focuses on what learners can do at the end of each educational intervention, which is not necessarily identical to what they have been taught; it is mainly used for the external, objective evaluation of trainees by trainers, according to the evaluation strategies proposed at each level9. Miller’s pyramid consists of four levels: “knows” (level one), “knows how” (level two), “shows how” (level three), and “does” (level four)10. An adapted version of the pyramid includes a baseline level that reflects personal contributing factors, which are assessed with self-reported measures11. Self-assessment is an invaluable tool in clinical education, contributing positively to the development of students’ awareness12. 
The assessment of self-reported measures helps educators detect knowledge and skills deficits13.

The School of Medicine of Aristotle University of Thessaloniki (AUTh) introduced a revised undergraduate medical curriculum in the academic year 2019-2020, which emphasized clinical skills teaching and early exposure of medical students to health care by introducing two compulsory clinical skills courses during the preclinical years, part of which comprises mandatory training in primary healthcare units and hospital emergency departments. A process for creating a six-year clinical skills logbook was implemented shortly after, in 2021-2022, and the new logbook (Logbook edition 1.1) was initially used by first-year medical students in the academic year 2022-2023. The designated levels for developing clinical skills competence were based on the Observable and Entrustable Professional Activities model (OPA and EPA, respectively). As part of the logbook, we defined a list of 123 clinical skills that graduating medical students are anticipated to be competent in performing independently in real-life clinical settings (EPAs). Those skills are preventive, diagnostic, therapeutic, and communicational, performed in adult and pediatric populations, and are correlated to compulsory courses taught across the twelve semesters of the syllabus. A multi-level, extensive process took place to ensure the development of a logbook that is comprehensive and responsive to the students’ educational needs and the courses’ learning objectives. The process equally involved faculty members and senior medical students.

In order to evaluate the impact and efficacy of the revised medical curriculum, the new clinical skills logbook, and the new methods of clinical skills training, it is important to have a control database with sufficient data from the previous, gradually substituted medical curriculum and educational practices. This study therefore investigated the self-assessed competence to perform the 123 clinical skills defined in the new logbook among final-year medical students of the School of Medicine, AUTh, who had followed the former undergraduate medical curriculum, which the reformed curriculum is gradually substituting. 

Methods

Study design and sampling method

We conducted a cross-sectional study based on data collection through an anonymous online questionnaire, as described in the following section. The study population was the final-year medical students of the School of Medicine, AUTh, in the academic years 2021-2022, 2022-2023, and 2023-2024. The questionnaire was emailed to all final-year medical students at their institutional addresses. The data collection took place in June and July of 2022, 2023, and 2024, respectively. The study was approved by the Bioethics Committee of the School of Medicine, AUTh (protocol No 51/2022).

Data collection

An anonymous questionnaire was developed for the aims of this study using the Google Forms platform (Google, Mountain View, CA, USA). The questionnaire included an initial declaration of the students’ informed consent to participate in this research, followed by 127 closed-type questions divided into six sections. The first section collected demographic data. The remaining sections included questions about clinical skills practiced in adult and pediatric patients, rated on a three-point Likert-type scale: “i) I can perform the skill on my own”, “ii) I can perform the skill with assistance”, and “iii) I cannot perform the skill”. Throughout the entire process, the privacy and anonymity of the participants were secured, and respondents were allowed to withdraw from the survey at any time before finalizing the submission of the questionnaire.

Statistical analysis

We performed all statistical analyses using IBM SPSS Statistics for Windows, version 28.0 (IBM Corp., Armonk, NY, USA). We calculated descriptive statistics for the demographic data and the responses to the questionnaire’s close-ended questions, presenting categorical variables as frequencies (%) and modes. We calculated Cronbach’s alpha to assess the questionnaire’s internal consistency. We considered a p-value of less than 0.05 statistically significant.
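As an illustration of the descriptive approach described above, the following pure-Python sketch computes the per-skill mode and Cronbach’s alpha from a small matrix of Likert responses coded 1 = “on my own”, 2 = “with assistance”, 3 = “cannot perform”. The data and function names are invented for demonstration; the study’s actual analysis was performed in SPSS.

```python
# Hypothetical sketch of the descriptive analysis: rows are students,
# columns are skills; ratings are coded 1, 2, or 3 (see lead-in).
from statistics import mode, variance

def per_skill_modes(responses):
    """Return the modal rating for each skill (column)."""
    return [mode(col) for col in zip(*responses)]

def cronbach_alpha(responses):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(responses[0])                      # number of items (skills)
    items = list(zip(*responses))              # columns = items
    item_vars = sum(variance(item) for item in items)
    totals = [sum(row) for row in responses]   # each student's total score
    return k / (k - 1) * (1 - item_vars / variance(totals))

# Invented responses from five students on four skills
data = [
    [1, 1, 2, 3],
    [1, 2, 2, 3],
    [1, 1, 2, 2],
    [2, 1, 3, 3],
    [1, 1, 2, 3],
]
print(per_skill_modes(data))              # modal rating per skill
print(round(cronbach_alpha(data), 3))     # internal-consistency estimate
```

A skill whose modal rating is 1 would correspond to “mode 1” in the Results below; in the study itself the same logic was applied to the 123-skill questionnaire.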

Results

Participants

The sample size of the study was 1,019 final-year medical students of the academic years 2021-2022, 2022-2023, and 2023-2024 (334 students in the academic year 2021-2022, 343 in 2022-2023, and 342 in 2023-2024), of whom 316 completed the questionnaire (response rate: 31.01 %). Specifically, 93 students (29.43 %) were mainly trained at the university hospital “AHEPA”, 114 (36.08 %) at the “Papageorgiou” hospital, 108 (34.18 %) at the “Ippokrateio” hospital, and one (0.32 %) preferred not to state.

Self-reported assessment of students’ clinical skills competence

The overall Cronbach’s alpha for the questionnaire was 0.965, indicating excellent internal consistency. The analysis included the records of all the basic clinical skills included in the questionnaire, and the mode for each clinical skill was calculated. Of the 123 skills, 89 (72.36 %) were classified in “mode 1” (i.e., the majority of the students stated that they could perform the skill on their own), 28 (22.76 %) in “mode 2” (the majority stated that they could perform the skill with assistance), and six (4.88 %) in “mode 3” (the majority stated that they could not perform the skill). The clinical skills categorized in “mode 2” and “mode 3” are presented, per skill category, in Table 1 and Table 2.

Further analysis calculated, for each clinical skill, the percentage of responders who selected the answer option “i) I can perform the skill on my own”. The optimum for each clinical skill would be for this option to be selected by all participating students; a percentage of 100 % would indicate that all individuals can perform the skill at the maximum level, which is the ultimate goal of the educational process. These data are presented in Table 3, while the acknowledgment provides a link to a supplementary file demonstrating the percentage for each of the 123 basic clinical skills that every graduating student is anticipated to be competent in upon completion of her/his undergraduate studies.

The distribution of the modes across the three clinical practice hospitals (AHEPA, Papageorgiou, and Ippokrateio) was tested (independent-samples Kruskal-Wallis test, adjusted by the Bonferroni correction for multiple tests) and showed no significant difference (p = 0.592).
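For readers who wish to reproduce this kind of between-hospital comparison, the following pure-Python sketch implements the tie-corrected Kruskal-Wallis H statistic on hypothetical ratings grouped by training hospital. The group data are invented for illustration; the reported analysis, including the Bonferroni adjustment and the p-value, was performed in SPSS.

```python
# Hypothetical sketch: Kruskal-Wallis H statistic (with tie correction)
# comparing rating distributions across three groups.

def kruskal_wallis_h(*groups):
    """Return the tie-corrected Kruskal-Wallis H statistic."""
    pooled = sorted(v for g in groups for v in g)
    n = len(pooled)
    # Assign each distinct value its midrank (average rank across ties)
    ranks = {}
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2   # midrank of positions i+1..j
        i = j
    # H = 12/(n(n+1)) * sum(R_i^2 / n_i) - 3(n+1)
    h = sum(sum(ranks[v] for v in g) ** 2 / len(g) for g in groups)
    h = 12 / (n * (n + 1)) * h - 3 * (n + 1)
    # Divide by the tie-correction factor
    counts = [pooled.count(v) for v in set(pooled)]
    correction = 1 - sum(t ** 3 - t for t in counts) / (n ** 3 - n)
    return h / correction

# Invented per-student ratings, one list per training hospital
ahepa        = [1, 1, 2, 1, 3, 2, 1]
papageorgiou = [2, 1, 1, 2, 1, 3, 1]
ippokrateio  = [1, 2, 1, 1, 2, 2, 3]
print(round(kruskal_wallis_h(ahepa, papageorgiou, ippokrateio), 3))
```

The resulting H would be compared against a chi-squared distribution with (number of groups − 1) degrees of freedom to obtain the p-value.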

Discussion

In our cohort, the majority of the respondents reported that they were confident in practicing most of the skills: in 72.36 % of the questions concerning the self-reported performance of clinical skills, most students chose the option “I can perform the skill on my own”. However, reinforcement and systematization of clinical skills teaching are crucial to ensure that the vast majority of medical school graduates will be able to perform the essential skills in real-life clinical settings14.

Numerous medical schools and associations worldwide have already developed basic clinical skills lists, including the Association of American Medical Colleges (AAMC), the General Medical Council (GMC) of the United Kingdom, and the Stanford Medicine MD program15-17. The implementation of a clinical skills logbook in everyday medical education is crucial to ensure that the students will practice the majority of the essential skills. For this aim, electronic logbooks, information sessions about their use, and regular feedback between students and teaching staff can be helpful18. Electronic logbooks have been used even in low-resource educational environments19.

According to the above organizations, some skills are essential for every medical school graduate. While 98.1 % of the students reported that they could perform an electrocardiogram (ECG), only 51.27 % could perform a basic interpretation. Vishnevsky et al studied competency and confidence in ECG interpretation, and their findings suggested that 45.1 % of students had baseline knowledge of ECG interpretation20. According to the GMC (GMC Practical Skills and Procedures)17, wound closure and dressing and the use of local anesthetic are skills each young doctor should have developed. In our study, 74.05 % of the students answered that they could apply a local anesthetic without assistance and 70.57 % could perform wound closure and dressing. Taking into consideration the increasing number of patients who need wound care, the development of organized wound care courses in medical schools is crucial, a suggestion strongly supported by medical students21. Regarding the basic interpretation of chest X-rays, our results showed that a substantial proportion (72.78 %) of students felt confident enough to practice this skill by themselves. However, considering the vital importance of this knowledge, medical schools must adopt novel methods for teaching this skill22. A significant deficit of knowledge of basic computed tomography (CT) interpretation was observed in our study (only 22.78 % of the students chose option 1), a result that agrees with the study of Nguyen et al23. Interactive web-based learning programs and high-fidelity simulation education have been implemented to teach CT interpretation; these methods can help achieve better outcomes regarding this skill24,25. 
In this study, we identified that the level of competence is related to the level of difficulty of the skill [98.73 % for infection prevention and use of a surgical mask vs 4.43 % for performing a focused assessment with sonography in trauma (FAST) ultrasound to detect intra-abdominal bleeding and abdominal emergencies]. This variation is expected because “more difficult” skills require more systematic education, which is not widely offered in medical school curricula. Nevertheless, this pattern cannot be generalized, as low competence was also observed for some skills that are not particularly difficult to acquire.

Furthermore, it was observed that students did not report maximum competence in all skills, not even in those most vital for everyday clinical practice. However, the study prompts critical questions about the necessity of certain skills in the medical curriculum, exemplified by the challenges posed by skills requiring substantial and expensive equipment, such as FAST ultrasonography. Some skills are vital and essential for each graduate, while others are “too specialized”, and it is quite disputable which skills schools should consider “basic”. A “basic” skill is one essential for each young physician, such as ECG interpretation26. Medical schools should ensure that all graduates are competent enough to perform the “basic” clinical skills. Continuous feedback between faculty and students, evaluation, and updates of the clinical skills included in the logbook are fundamental. Students could not perform all the skills integrated into the developed list, which can be rationalized either by inadequate teaching of the specific skills or by the inclusion of skills in the list that exceed the scope of basic medical studies. To achieve better learning outcomes, medical schools can include small-group teaching methods in the teaching procedure27. Small-group teaching provides better learning outcomes for both clinical and communication skills but requires more faculty members; thus, the recruitment of additional staff could be beneficial28. The peer-teaching method can also be utilized in small-group teaching, supported by data showing that peer teaching is as effective as faculty teaching in the acquisition of clinical skills29. To develop the maximum level of competence in basic clinical skills, self-assessment questionnaires should be administered, and a systematic assessment method should be integrated into medical school curricula30. 
To secure clinical competence for all students, specific types of assessment can be adopted, such as the Objective Structured Clinical Examination (OSCE) and the mini-Clinical Evaluation Exercise (mini-CEX). The OSCE, introduced in the 1970s, is considered a very effective method of assessment31. The mini-CEX is also a helpful tool for the evaluation of learners’ skills32.

The present study included 316 final-year medical students from three consecutive cohorts at the School of Medicine, AUTh. We observed differences in the level of competence across several clinical skills. Medical schools worldwide can develop similar self-assessment questionnaires to explore their graduates’ self-reported knowledge and skill levels, and multicenter collaboration is needed in order to adopt the optimal teaching methods. Several limitations of our study should be recognized. One limitation is that the data were retrieved from self-assessment questionnaires, which the students answered using their subjective judgment; the main drawback of self-report questionnaires is the possibility of non-valid answers. On the other hand, the self-assessment method allows respondents to navigate their knowledge and experiences in depth and thus answer the questionnaires more accurately33. An objective way to evaluate the performance of these skills could be the implementation of OSCE exams. The self-assessment needs to be repeated at two key milestones: the completion of studies of the first class of students following the revised undergraduate medical curriculum (academic year 2024/25) and the completion of studies of the first class of students following both the revised curriculum and the clinical skills logbook (academic year 2027/28). Moreover, continuing the self-assessment process at the end of each academic year can provide a more accurate picture of the educational intervention and the need for modifications. Another limitation is that the results were not checked for intraobserver reliability; in prospective student cohorts, medical students can answer the questionnaire at different time points to check how stable the responses are.

In this cross-sectional study, we demonstrated that the majority of the final-year medical students reported that they could perform most of the clinical skills included in the logbook. This is the first systematic, class-wide attempt to evaluate the self-reported clinical competence of medical students at the School of Medicine, AUTh. It is important to strengthen the educational effort and systematize the teaching of clinical skills so that all students achieve the ultimate educational goal: being competent to perform all the clinical skills on their own upon completion of their studies. The teaching of clinical skills essential to all medical graduates in their first steps as physicians should be enhanced. In order to form a more holistic view of students’ clinical competency and the impact of the six-year educational process, a mixed-methods evaluation strategy should be adopted, incorporating both objective, structured examination methods and self-reported evaluation techniques, and should be repeated periodically.

Conflict of interest

Authors declare no conflicts of interest.

Acknowledgements

A supplementary file demonstrating the percentage for each of the 123 basic clinical skills that every graduating student is anticipated to be competent in upon completion of her/his undergraduate studies is provided in the electronic version of the paper on the journal’s website.

References

  1. Lockyer J, Carraccio C, Chan MK, Hart D, Smee S, Touchie C, et al. Core principles of assessment in competency-based medical education. Med Teach. 2017; 39: 609-616.
  2. Vogel D, Harendza S. Basic practical skills teaching and learning in undergraduate medical education – a review on methodological evidence. GMS J Med Educ. 2016; 33: Doc64.
  3. Hale JF, Cahan MA, Zanetti ML. Integration of basic clinical skills training in medical education: an interprofessional simulated teaching experience. Teach Learn Med. 2011; 23: 278-284.
  4. Saunders H, Gallagher-Ford L, Kvist T, Vehviläinen-Julkunen K. Practicing Healthcare Professionals’ Evidence-Based Practice Competencies: An Overview of Systematic Reviews. Worldviews Evid Based Nurs. 2019; 16: 176-185.
  5. Seitz T, Raschauer B, Längle AS, Löffler-Stastka H. Competency in medical history taking-the training physicians’ view. Wien Klin Wochenschr. 2019; 131: 17-22.
  6. AlKhateeb NE, Salih JH, Shabela N, Shabila NP. The perspectives of final year medical students for one year training experience in clinical skills laboratory in Erbil. Zanco J Med Sci. 2015; 19: 972-979.
  7. Duban S, Mennin S, Waterman R, Lucero S, Stubbs A, Vanderwagen C, Kaufman A. Teaching clinical skills to pre-clinical medical students: integration with basic science learning. Med Educ. 1982; 16: 183-187.
  8. Favarato MH, Sarno MM, Carneiro Peres LV, Teles Arruda F. Teaching-learning process of clinical skills using simulations – report of experience. MedEdPublish (2016). 2019; 8: 86.
  9. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990; 65: S63-S67.
  10. Albino JE, Young SK, Neumann LM, Kramer GA, Andrieu SC, Henson L, et al. Assessing dental students’ competence: best practice recommendations in the performance assessment literature and investigation of current practices in predoctoral dental education. J Dent Educ. 2008; 72: 1405-1435.
  11. Kohrt BA, Mutamba BB, Luitel NP, Gwaikolo W, Onyango Mangen P, et al. How competent are non-specialists trained to integrate mental health services in primary care? Global health perspectives from Uganda, Liberia, and Nepal. Int Rev Psychiatry. 2018; 30: 182-198.
  12. Taylor I, Bing-Jonsson P, Wangensteen S, Finnbakk E, Sandvik L, McCormack B, et al. The self-assessment of clinical competence and the need for further training: A cross-sectional survey of advanced practice nursing students. J Clin Nurs. 2020; 29: 545-555.
  13. Evans AW, McKenna C, Oliver M. Self-assessment in medical practice. J R Soc Med. 2002; 95: 511-513.
  14. Moirasgenti M, Smyrnakis E, Tsoulfas G, Grosomanidis V, Toufas K, Benos A. Making clinical skills education real – Transition from simulation to ward. Aristotle University Medical Journal. 2013; 40: 8-12.
  15. Proceedings of the AAMC’S Consensus Conference on the Use of Standardized Patients in the Teaching and Evaluation of Clinical Skills. Washington, D.C., December 3-4, 1992. Acad Med. 1993; 68: 437-483.
  16. Nelson MS, Traub S. Clinical skills training of U.S. medical students. Acad Med. 1993; 68: 926-928.
  17. General Medical Council. Outcomes for graduates – Practical skills and procedures. Available at: https://www.gmc-uk.org/education/standards-guidance-and-curricula/standards-and-outcomes/outcomes-for-graduates/outcomes-for-graduates—practical-skills-and-procedures, date accessed: 30/03/2024.
  18. Schüttpelz-Brauns K, Narciss E, Schneyinck C, Böhme K, Brüstle P, Mau-Holzmann U, et al. Twelve tips for successfully implementing logbooks in clinical training. Med Teach. 2016; 38: 564-569.
  19. Barteit S, Schmidt J, Kakusa M, Syakantu G, Shanzi A, Ahmed Y, et al. Electronic logbooks (e-logbooks) for the continuous assessment of medical licentiates and their medical skill development in the low-resource context of Zambia: A mixed-methods study. Front Med (Lausanne). 2022; 9: 943971.
  20. Vishnevsky G, Cohen T, Elitzur Y, Reis S. Competency and confidence in ECG interpretation among medical students. Int J Med Educ. 2022; 13: 315-321.
  21. Poacher AT, Bhachoo H, Jones A, Weston J, Powell K, Myaat P, et al. A cross-sectional evaluation of the current state of wound healing education in the United Kingdom’s undergraduate medical curriculum. Int Wound J. 2023; 20: 3939-3944.
  22. Al Elq A, Alfayez AA, AlQahtani MI, Alshahrani RS, Alotaibi GA, Aldakheel AA, et al. The Effects of Various Teaching Methods on Chest X-ray Interpretation Skills Among Medical Students and Interns: A Systematic Review. Cureus. 2023; 15: e44399.
  23. Nguyen B, Werth B, Brewer N, Ward JG, Nold RJ, Haan JM. Comparisons of Medical Student Knowledge Regarding Life-Threatening CT Images Before and After Clinical Experience. Kans J Med. 2017; 10: 1-12.
  24. Lee K, Baird M, Lewis S, McInerney J, Dimmock M. Computed tomography learning via high-fidelity simulation for undergraduate radiography students. Radiography (Lond). 2020; 26: 49-56.
  25. Aliaga L, Clarke SO. Rethinking Radiology: An Active Learning Curriculum for Head Computed Tomography Interpretation. West J Emerg Med. 2022; 23: 47-51.
  26. Michels ME, Evans DE, Blok GA. What is a clinical skill? Searching for order in chaos through a modified Delphi process. Med Teach. 2012; 34: e573-e581.
  27. Edmunds S, Brown G. Effective small group learning: AMEE Guide No. 48. Med Teach. 2010; 32: 715-726.
  28. Burgess A, van Diggele C, Roberts C, Mellis C. Facilitating small group learning in the health professions. BMC Med Educ. 2020; 20: 457.
  29. Rees EL, Hawarden AW, Dent G, Hays R, Bates J, Hassell AB. Evidence regarding the utility of multiple mini-interview (MMI) for selection to undergraduate health programs: A BEME systematic review: BEME Guide No. 37. Med Teach. 2016; 38: 443-455.
  30. Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. The role of assessment in competency-based medical education. Med Teach. 2010; 32: 676-682.
  31. Tervo RC, Dimitrievich E, Trujillo AL, Whittle K, Redinius P, Wellman L. The Objective Structured Clinical Examination (OSCE) in the clinical clerkship: an overview. S D J Med. 1997; 50: 153-156.
  32. He Y, Wen S, Zhou M, Li X, Gong M, Zhou L. A Pilot Study of Modified Mini-Clinical Evaluation Exercises (Mini-CEX) in Rotation Students in the Department of Endocrinology. Diabetes Metab Syndr Obes. 2022; 15: 2031-2038.
  33. Yates N, Gough S, Brazil V. Self-assessment: With all its limitations, why are we still measuring and teaching it? Lessons from a scoping review. Med Teach. 2022; 44: 1296-1302.