Hippokratia 2013, 17(1):34-37

Lavranos G, Koliaki C, Briasoulis A, Nikolaou A, Stefanadis C
1st Department of Cardiology, Medical School, Athens University, Greece


Introduction: The aim of this study was to assess reported changes in medical students’ capacity to attain five basic cardiological clinical skills following a one-month intensive cardiology course provided in the core curriculum.
Materials and Methods: An anonymous questionnaire comprising self-reported performance in five skills, namely 1) arterial blood pressure measurement, 2) cardiac auscultation, 3) electrocardiogram (ECG) execution, 4) ECG interpretation and 5) defibrillation, was distributed to 177 fifth-year students of the Athens Medical School: 118 upon initiating the cardiology course (pre-training group) and 59 students matched for sex, age, year of study and training centre, following completion of the course (post-training group). Pre- and post-training performance was compared using the χ2 test.
Results: No change was noted with regard to blood pressure measurement, cardiac auscultation or defibrillation. By contrast, a statistically significant improvement was reported for ECG execution (from 54.2% to 81.4%; p<0.001) and interpretation (from 33.1% to 89.8%; p<0.001).
Conclusion: Improvement in the execution and interpretation of ECGs seems to be among the strengths of the cardiology training program. Further studies including larger samples from multiple medical schools and objective assessment of skill execution might facilitate accurate training evaluation and define opportunities for improvement.


Keywords: Assessment, clinical skills, curriculum, medical education, training

Corresponding Author: Giagkos Lavranos, Falireos 34, 18547, Neo Faliro, Piraeus, Greece, tel: +302104811942

Introduction
Medical education is nowadays justifiably considered one of the most challenging fields of scientific research. Innovative teaching techniques and instructional methods are gradually being implemented in university medical schools all over the world. A doctor’s ability to cope successfully with urgent medical conditions and to carry out practical procedures effectively is believed to depend to a great extent on the practical orientation of his or her basic medical training. Thus, health professionals and education specialists are currently familiarizing themselves with new teaching protocols that depart from conventional methodologies, such as purely theoretical lectures with no practical component, and rely on digital, mainly computer-assisted, forms of educational technology. The extent to which the implementation of these tools in the Hellenic medical education system has improved outcomes in terms of students’ clinical skill attainment, however, remains unclear.

According to the regional compulsory medical education curriculum, undergraduate cardiology training is limited to a one-month period during the fifth year of medical studies (the total duration of medical studies is six years). The schedule combines tutorials, case discussions and clinical practice from 8:00 am to 1:30 pm three times a week. During this period, students rotate through several sub-departments of the clinic, such as the echocardiography department, the outpatient clinics, the catheterization laboratory and the coronary intensive care unit.

The objective of this study is to assess whether attending a one-month practical cardiology course can actually improve fifth-year undergraduate medical students’ self-reported performance in five basic cardiological clinical skills.

Materials and Methods
The study is based on an anonymous questionnaire developed by medical students of Athens University in collaboration with the 1st Department of Cardiology. Following an initial pilot phase, during which the questionnaire was tested on a small number of students trained in the 1st Department of Cardiology, a final version was developed, taking into account reported problems encountered during distribution and analysis of the original.

The revised questionnaire was distributed to 177 randomly selected fifth-year undergraduate medical students during a fifteen-day period per term. Participants were asked to assess their competence in five specific skills of cardiological interest: 1) arterial blood pressure measurement, 2) defibrillation, 3) ECG execution, 4) ECG analysis and interpretation and 5) cardiac auscultation and differential diagnosis of cardiac sounds. No partial completion of questionnaires was noted.

Of the 177 participating students, 59 had already attended the cardiology course (post-training group), while the other 118 were randomly chosen from those who had not attended (pre-training group), so that two students corresponded to each member of the post-training group in terms of sex (equal distribution), age (24 ± 1 years), year of study (5th) and year of entering Medical School.

Exclusion criteria included: previous academic study in a health-related field, pre-existing professional or systematic extracurricular exposure to clinical practice, participation in research protocols related to Cardiology and participation in certified life support workshops.

The enrollment and randomization process was conducted per term as follows. The research group included only those students whose Registry Number began with 2001, indicating that all participants had entered the Medical School in the same year. The study group enrolled every third student consecutively (systematic selection), reaching a total sample size of 30 participants (one third of the total number) with equal representation of male and female students (15 male / 15 female), a ratio also observed in the general medical student population. Only one selected male student was reluctant to participate, corresponding to a refusal rate of 1.66%. Once the post-training group was formed, the research team matched every student in it to two students who had not yet been trained in cardiology, maintaining comparable age, sex, year of study and year of entry.
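The selection and matching procedure described above can be sketched in code. The field names (`sex`, `age`, `entry_year`) and helper functions below are hypothetical illustrations, not part of the study’s actual data handling:

```python
def select_post_training(students, step=3, target_per_sex=15):
    """Systematic selection: take every `step`-th student from the
    registry list, capped at `target_per_sex` students per sex."""
    picked = {"M": [], "F": []}
    for s in students[::step]:
        if len(picked[s["sex"]]) < target_per_sex:
            picked[s["sex"]].append(s)
    return picked["M"] + picked["F"]

def match_controls(post_group, untrained, k=2):
    """Match each post-training student to k untrained controls
    sharing the same sex, age and year of entry (1:2 matching)."""
    pool = list(untrained)
    matched = []
    for s in post_group:
        controls = [c for c in pool
                    if (c["sex"], c["age"], c["entry_year"]) ==
                       (s["sex"], s["age"], s["entry_year"])][:k]
        matched.extend(controls)
        for c in controls:
            pool.remove(c)
    return matched
```

With 30 selected students per term and 1:2 matching, this yields 60 pre-training controls per term, mirroring the overall 59:118 ratio of the study.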

Potential confounding factors, such as the socioeconomic status of the participants and their overall performance in medical training, were also taken into consideration. In particular, all students in the study were admitted to the Medical School via highly competitive state examinations requiring a mean grade of over 96%, ranking them in the top 3% of the national student population. A review of archival data from the academic Registry revealed that the large majority of students have parents employed in either public services or healthcare, implying reasonable comparability in terms of educational background and family income, with a roughly symmetrical distribution around a medium-high income peak.

Data were collected in anonymous collective tables and analyzed with SPSS software (version 18.0, IBM SPSS Inc, Chicago, IL, USA). Statistical evaluation was performed by univariate analysis, applying the chi-square test to each of the five examined cardiological skills. The minimum statistical significance level was set at p<0.05. Because the data were categorical and non-ordinal (binary response options), normality testing and correlation coefficients were not applicable.

Results
Table 1 displays the distribution of self-reported capacity in the examined cardiological skills for the 118 pre-training and the 59 post-training students.

The post-training students reported higher performance than the pre-training ones in the questions concerning the ECG. More specifically, 81.4% of the post-training group claimed to be capable of carrying out an ECG and 89.8% stated that they were able to interpret one; the corresponding rates among the pre-training students were 54.2% and 33.1%, respectively. In both cases the observed differences were highly statistically significant (p<0.001).
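These figures can be checked with a standard Pearson chi-square computation for a 2x2 table. The counts below are inferred from the reported percentages (e.g. 54.2% of 118 ≈ 64 students) and are therefore approximate; this is an illustrative sketch, not the original SPSS analysis:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 contingency table
    [[a, b], [c, d]] (rows: pre/post group; columns: yes/no)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# ECG execution: pre 64/118 (54.2%) vs post 48/59 (81.4%)
chi_exec = chi_square_2x2(64, 54, 48, 11)    # about 12.4
# ECG interpretation: pre 39/118 (33.1%) vs post 53/59 (89.8%)
chi_interp = chi_square_2x2(39, 79, 53, 6)   # about 50.8

# The critical value at 1 degree of freedom for p = 0.001 is 10.83,
# so both statistics are consistent with the reported p < 0.001.
print(chi_exec > 10.83, chi_interp > 10.83)  # prints: True True
```

Both reconstructed statistics comfortably exceed the p = 0.001 threshold, in agreement with the reported significance levels.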

As far as defibrillation and cardiac auscultation are concerned, although the post-training students presented higher percentages of positive answers than the pre-training group, the differences found were not statistically significant.

Finally, no significant difference was observed between the two subgroups of the study concerning the skill of measuring arterial blood pressure.

Discussion
The choice of the five cardiological skills included in the study was based on a hierarchical classification of clinical actions necessary for general practice and acute care, according to the relevant training curricula1.

An overall estimation of the study outcome reveals a considerable lack of training for almost all the skills examined. Blood pressure measurement appears to be the most widely mastered skill, while only about half the students feel capable of performing cardiac auscultation correctly. In the pre- versus post-training comparison, a statistically significant difference is observed only for the ECG-associated skills.

These findings are consistent with the study hypothesis, namely that students’ self-reported capacity is directly related to the education they receive. For instance, in terms of blood pressure measurement, all students have received a combination of theoretical teaching and practical application under the supervision of a health professional throughout their undergraduate clinical practice; competency rates as high as those reported are therefore to be expected, regardless of completion of the cardiology course. ECG execution is practised on patients daily, in cooperation with the nursing personnel, during cardiology training, which for the majority of trainees is the first opportunity to acquire this skill. With regard to ECG interpretation, cardiology clinics place particular emphasis on detecting basic heart disorders, and this is specifically evaluated in theory and practice before successful completion of the course, a process that appears to improve performance, although not to the desired level. In terms of defibrillation, there seems to be a considerable deficit in medical students’ familiarization. Finally, as far as cardiac auscultation is concerned, students are encouraged to use their stethoscope while examining patients at all stages of their clinical education. This achieves some educational goals; however, a large proportion of medical students still do not feel competent, possibly because of the unfavourable student-to-patient and student-to-trainer ratios observed in some cases.

All the cardiological skills examined demand critical thinking and diagnostically oriented methodologies2,3. Effective training also requires a (real or simulated) patient to take part in the educational process, although recent research findings demonstrate that a comparable effect can be accomplished through digitally simulated experiences by means of computer technology and virtual classrooms1-3. This is particularly relevant for emergency skills such as defibrillation. The training difficulty often described as an obstacle in this process may no longer be valid, since current guidelines propose the use of semiautomatic and automated external defibrillators even by the general public4-6. In some studies, simulators have been available in combination with ALS (Advanced Life Support) training software7-9. However, despite the recognized significance of these skills in daily practice, concern about student training outcomes remains high worldwide10,11.

Simulator training can allow the student to learn from errors. Small-group discussion and repetitive auscultation of simulated heart sounds have also been reported to improve cardiac auscultatory proficiency12. Moreover, cardiology patient simulators13 and computer-assisted instruction systems14 enable teaching and assessment of cardiac pathophysiology. Finally, electronic stethoscopes linked to a laptop computer with software created to visualise auscultatory findings facilitate the understanding and identification of cardiac murmurs15. All these practices are only partially applied in Greek medical education and, therefore, their efficacy remains to be evaluated in the future.

Self-assessment is widely considered by medical educators to be a vital part of the training process for future and current physicians16-18. However, there is considerable debate as to the accuracy of students’ assessments of their own competence, and the results obtained should subsequently be validated by objective evaluation17-20.

The SKILLS study is the first questionnaire-based self-assessment study of medical students in Greek universities, although similar projects have already been implemented on a pilot basis abroad, including in the UK17-21.

Limitations of the study

The design of the study faced two major limitations, discussed below.

The first limitation is that the statistical sample does not consist of the same students observed prospectively. However, such a design was not feasible, since medical students attend different courses every month on a rotation schedule. This difficulty is partly mitigated by the selection of comparable students (age, sex, year of entry and study, training department and hospital, overall skills score).

The second limitation is that students’ proficiency in the skills was not objectively tested with a specific examination method but relied on self-report and self-evaluation. However, results obtained via the questionnaire were compared with data from pilot objective structured clinical examinations (OSCEs) on the same skills, performed by a limited number of students under the supervision of a trainer. Differences between reported and objectively observed capacity to perform each skill were not statistically significant, supporting the use of the questionnaire as a reasonably accurate means of estimating training efficiency18-21. Nevertheless, only the extensive implementation of objective evaluation methods will allow safe conclusions regarding the correlation between reported and observed data.

Conclusion
The study findings underscore the potential of the current system to deliver adequate clinical training for future doctors, if not in general, at least in the specific skills tested, which constitute core elements of almost all residency programs. Some measurable gains are obtained even after short-term exposure to clinical scenarios, raising the question of what level of expertise could be achieved by further expanding and consolidating these positive results. On the other hand, the study also highlights significant gaps in medical training, stressing the need for continuous evaluation and improvement of the courses provided through additional educational activities. In this respect, a self-reported evaluation based on structured questionnaires administered before and after a specific training intervention appears to provide reliable data for an initial assessment of the clinical goals mastered and for detecting areas of limited response. However, the combined implementation of subjective and objective evaluation methods in large populations nationwide, and across a wider range of clinical skills, is required to guide the development of more effective teaching schemes. The inclusion of trainer-reported rather than self-reported skill performance outcomes, using a uniform assessment tool, might also increase objectivity and reproducibility and facilitate future data comparison across time and place.

Conflict of Interest

The authors report no conflict of interest.

Acknowledgements
The authors acknowledge the contribution of Professor Eleni Petridou, Department of Hygiene and Epidemiology, Medical School, Athens University, in the critical appraisal of the paper prior to submission.

References
1. Barrett MJ, Kuzma MA, Seto TC, Richards P, Mason D, Barrett DM, et al. The power of repetition in mastering cardiac auscultation. Am J Med. 2006; 119: 73-75.
2. Mangione S, Nieman LZ, Gracely E, Kaye D. The teaching and practice of cardiac auscultation during internal medicine and cardiology training. A nationwide survey. Ann Intern Med. 1993; 119: 47-54.
3. Fasce E, Ibáñez P. [Long-term results of an independent study program of electrocardiography applied to medical students]. Rev Med Chil. 1994; 122: 133-140.
4. Moule P, Albarran JW. Automated external defibrillation as part of BLS: implications for education and practice. Resuscitation. 2002; 54: 223-230.
5. Beckers S, Fries M, Bickenbach J, Derwall M, Kuhlen R, Rossaint R. Minimal instructions improve the performance of laypersons in the use of semiautomatic and automatic external defibrillators. Critical Care. 2005; 9: R110-R116.
6. Beckers SK, Fries M, Bickenbach J, Skorning MH, Derwall M, Kuhlen R, et al. Retention of skills in medical students following minimal theoretical instructions on semi and fully automated external defibrillators. Resuscitation. 2007; 72: 444-450.
7. Bergeron BP, Greenes RA. Clinical skill-building simulations in cardiology: HeartLab and EkgLab. Comput Methods Programs Biomed. 1989; 30: 111-126.
8. Bourlas P, Giakoumakis E, Koutsouris D, Papakonstantinou G, Tsanakas P. The CARDIO-LOGOS system for ECG training and diagnosis. Technol Health Care. 1996; 3: 279-285.
9. Christensen UJ, Heffernan D, Andersen SF, Jensen PF. ResusSim 98--a PC advanced life support trainer. Resuscitation. 1998; 39: 81-84.
10. Favrat B, Pécoud A, Jaussi A. Teaching cardiac auscultation to trainees in internal medicine and family practice: does it work? BMC Med Educ. 2004; 4: 5.
11. Ahmed Mel-BK. What is happening to bedside clinical teaching? Med Educ. 2002; 36: 1185-1188.
12. Horiszny JA. Teaching cardiac auscultation using simulated heart sounds and small-group discussion. Fam Med. 2001; 33: 39-44.
13. Woywodt A, Herrmann A, Kielstein JT, Haller H, Haubitz M, Purnhagen H. A novel multimedia tool to improve bedside teaching of cardiac auscultation. Postgrad Med J. 2004; 80: 355-357.
14. Sajid AW, Ewy GA, Felner JM, Gessner I, Gordon MS, Mayer JW, et al. Cardiology patient simulator and computer-assisted instruction technologies in bedside teaching. Med Educ. 1990; 24: 512-517.
15. Klar R, Bayer U. Computer-assisted teaching and learning in medicine. Int J Biomed Comput. 1990; 26: 7-27.
16. Woolliscroft JO, TenHaken J, Smith J, Calhoun JG. Medical students’ clinical self-assessments: comparisons with external measures of performance and the students’ self-assessments of overall performance and effort. Acad Med. 1993; 68: 285-294.
17. Reiter HI, Eva KW, Hatala RM, Norman GR. Self and peer assessment in tutorials: application of a relative-ranking model. Acad Med. 2002; 77: 1134-1139.
18. Ward M, Gruppen L, Regehr G. Measuring self-assessment: current state of the art. Adv Health Sci Educ Theory Pract. 2002; 7: 63–68.
19. Probert CS, Cahill DJ, McCann GL, Ben-Shlomo Y. Traditional finals and OSCEs in predicting consultant and self-reported clinical skills of PRHOs: a pilot study. Med Educ. 2003; 37: 597-602.
20. Mattheos N, Nattestad A, Falk-Nilsson E, Attström R. The interactive examination: assessing students’ self-assessment ability. Med Educ. 2004; 38: 378-389.
21. Vivekananda-Schmidt P, Lewis M, Hassell AB, Coady D, Walker D, Kay L, et al. Validation of MSAT: an instrument to measure medical students’ self-assessed confidence in musculoskeletal examination. Med Educ. 2007; 41: 402-410.