Feasibility and Effectiveness of Mini-Clinical Evaluation Exercise (Mini-CEX) in an Undergraduate Medical Program: A Study From Pakistan

Background In clinical settings, direct observation (DO) with feedback is an effective method to assess and improve learner performance. One tool used for DO is the mini-clinical evaluation exercise (Mini-CEX). We conducted a study to assess the effectiveness and feasibility of the Mini-CEX for medical students at Aga Khan University, Karachi. Methods Utilizing a purposive sampling technique, a total of 199 students in six core clerkships of Years 3 and 4 were selected for this study. Participating faculty underwent training workshops on the use of the Mini-CEX and feedback strategies. Each student was assessed twice by one faculty member, using a modified version of the Mini-CEX, which assessed four domains of clinical skills: Data Gathering, Communication, Diagnosis/Differential, and Management Plan and Organization. Feedback was given after each encounter. Faculty and students also provided detailed feedback regarding the process of DO. Data were analyzed using Statistical Package for Social Sciences (SPSS) version 26 (IBM Corp., Armonk, NY, USA), with categorical variables arranged as frequencies and percentages. The Chi-squared test was used for further statistical analyses, and a P-value of < 0.05 was considered statistically significant. Effectiveness was assessed via the change in student performance between the first and second Mini-CEX, and feasibility was assessed via qualitative feedback. Results We obtained three sets of results: Mini-CEX forms (523), of which 350 evaluations were included for analysis (216 from Year 3 and 134 from Year 4), and feedback on DO from students (70) and faculty (18). Year 3 students performed significantly better in all foci of the Mini-CEX between the first and second assessment (P ≤ 0.001), whereas in Year 4, significant improvement was limited to only two domains of the Mini-CEX [Communication of History/Physical Examination (P = 0.040) and Diagnosis/Differential and Management Plan (P < 0.001)]. Students (65.7%) and faculty (94.4%) felt this exercise improved their interaction. 83.3% of faculty recommended its formal implementation compared to 27.1% of students, who reported challenges in the implementation of the Mini-CEX such as time constraints, logistics, the subjectivity of assessment, and varying interest by faculty. Conclusion Direct observation using the Mini-CEX is effective in improving the clinical and diagnostic skills of medical students and strengthens student-faculty interaction. While challenges exist in its implementation, the strategic placement of the Mini-CEX may enhance its utility in measuring student competency.


Introduction
Medical education has undergone fundamental changes over the past few decades. There is a pedagogical move towards competency-based teaching/learning to achieve specific learning outcomes [1]. In clinical settings, direct observation (DO) of medical student performance is a robust method of formative and summative assessment and a major assessment tool for outcomes in competency-based medical education [2]; for example, it has been shown to provide an opportunity to gauge knowledge, skills, and clinical reasoning during patient care [3]. Feedback from this interaction influences and improves students' performance and skills.
The Mini-Clinical Evaluation Exercise (Mini-CEX) has been used as an assessment tool to determine clinical competency through DO. It was first introduced to evaluate internal medicine residents as an alternative to other DO-based examination methods [4,5]. Multiple studies have argued for the use of the Mini-CEX amongst medical students as a method of assessment. The Mini-CEX was found to be a feasible assessment tool during busy clerkships such as Inpatient Psychiatry, where it helped improve students' counseling skills [6]. In another study, students and faculty reported one-on-one interaction as the most important advantage of the Mini-CEX [7]. The Mini-CEX does have limitations: time constraints secondary to high patient volumes, limited time for patient interaction, and health system restrictions [7,8]. Despite these challenges, the Mini-CEX has proven its potential as a cost-effective way of giving students constructive feedback in a structured manner [9].
The Aga Khan University (AKU), Karachi, a leading institution in the country, has a five-year Undergraduate Medical Education (UGME) program. Ongoing assessment in clinical clerkships is done via a form that evaluates 13 attributes of knowledge, skills, and professionalism. Key stakeholders, i.e., students and faculty, however, report that opportunities for DO with feedback are limited [3]. UGME program reviews have also recommended strengthening opportunities for DO during patient care.
We, therefore, conducted this study to assess the effectiveness and feasibility of using the Mini-CEX for the DO of learners in the UGME program at AKU. Furthermore, we examined its role in student-faculty interaction and its contribution to competency-based learning.

Materials And Methods
Our study was conducted during the 2018-2019 academic year in the UGME program of Aga Khan University in Karachi, Pakistan. Approval was obtained from the Institutional Ethics Review Committee (ERC #2018-0516-662), and an initial survey was conducted amongst 55 local students (approximately half each from Years 3 and 4), who reported a substantial lack of DO opportunities during their clinical clerkships (Table 1). All 199 students rotating in these clerkships were oriented to the Mini-CEX pilot program at the beginning of the academic year, and the Mini-CEX was implemented as a formative assessment tool within existing clinical interactions, with real patients. Clerkship coordinators and department chairs were asked to identify four to six faculty members to participate in this study. These faculty members attended mandatory training workshops to standardize DO skills and feedback strategies; the workshops were conducted by medical educationists who had prior experience with the Mini-CEX. We also engaged clinic administration during the planning meetings to ensure their understanding and support. We obtained written informed consent from all participating faculty members and students. We used the Mini-CEX in both inpatient and outpatient settings. In clinics, DO was completed within the first hour of the clinic to ensure maximum opportunity for student-faculty interaction and minimal disruption of patient flow (done with the help of clinic staff). Each student had two Mini-CEXs by the same clinical faculty member within one clerkship to assess the tool's effect on student progress.

Clerkships
We utilized an internally validated, modified version of the Mini-CEX for DO. Four domains of clinical skills were included: Data Gathering (history and physical examination), Communication (of history/physical examination), Diagnosis/Differential (with diagnostic reasoning), and Management Plan and Organization. These were assessed on a 3-point Likert scale (Needs Improvement, Improving, and Satisfactory or Above), with justification required for each rating and written feedback on how to improve that skill (Figure 1); effectiveness was assessed via the change in student performance between the first and second Mini-CEX. We also obtained feedback from students and faculty at the end of the study through specially designed and internally validated forms, and we assessed the feasibility of the tool qualitatively via comments from students and faculty about ease and time of completion in real time [10].
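For illustration only, the record implied by this form could be modeled as in the sketch below; the class and field names (e.g., MiniCexEncounter, Rating) are hypothetical and not part of the actual study instrument, which was a paper-based form scored on the scale described above.

```python
# Illustrative sketch of one Mini-CEX encounter record, mirroring the four
# assessed domains and the 3-point rating scale. Names are assumptions.
from dataclasses import dataclass
from enum import Enum

class Rating(Enum):
    NEEDS_IMPROVEMENT = 1
    IMPROVING = 2
    SATISFACTORY_OR_ABOVE = 3

@dataclass
class MiniCexEncounter:
    student_code: str              # forms were coded to maintain confidentiality
    clerkship: str
    encounter_number: int          # 1 or 2 (each student was observed twice)
    data_gathering: Rating         # history and physical examination
    communication: Rating          # communication of history/physical examination
    diagnosis_differential: Rating # with diagnostic reasoning
    management_and_organization: Rating
    justification: str             # required for each rating
    written_feedback: str          # feedback on how to improve that skill
```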

Analysis
Data were analyzed using Statistical Package for Social Sciences (SPSS) version 26 (IBM Corp., Armonk, NY, USA). All Mini-CEX forms were coded to maintain confidentiality. We arranged categorical variables as frequencies and percentages. The Chi-squared test was used for further statistical analyses. Forms with missing data were excluded, and a P-value of < 0.05 was considered statistically significant. We compiled feedback questionnaires and categorized responses according to the Likert scale. Open-ended questions were reviewed, and comments were collated and summarized into two themes, 'Strengths' and 'Weaknesses' of the Mini-CEX for DO, as perceived by students and faculty. We assessed the effectiveness of the Mini-CEX separately for Years 3 and 4.
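As an illustration of the comparison described above, the sketch below reproduces the same kind of Chi-squared analysis with Python and SciPy rather than SPSS; the counts in the contingency table are hypothetical and serve only to show the calculation for a single Mini-CEX domain.

```python
# Minimal sketch, assuming hypothetical rating counts for one domain.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: first vs. second Mini-CEX encounter.
# Columns: "Needs Improvement", "Improving", "Satisfactory or Above".
table = np.array([
    [20, 45, 43],   # first Mini-CEX (hypothetical counts)
    [ 5, 30, 73],   # second Mini-CEX (hypothetical counts)
])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
# A P-value < 0.05 would be read as a significant shift in the rating
# distribution between the first and second encounter for that domain.
```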

Results
In Year 3, we found a statistically significant improvement between the first and second Mini-CEX in all foci of assessment (P ≤ 0.001).

Faculty feedback
All faculty members provided feedback (Figure 3). Almost all (94.4%) faculty reported that this exercise improved faculty/student interaction, and 83.3% formally recommended the implementation of this project across the UGME curriculum.

FIGURE 3: Faculty feedback about the implementation of mini-clinical evaluation exercise (Mini-CEX) for direct observation (DO).
We combined the comments received from students and faculty into themes of strengths and weaknesses of the Mini-CEX form (Table 5). The opportunity for real-time feedback to students was identified as a strength of the Mini-CEX, while limited time due to busy clinics was considered a weakness by students and faculty alike.

Strengths Identified
Improved student/faculty interaction: 4, 4
Opportunity to observe students' performance: 6, 10
Opportunity to give real-time feedback: 3, 32

Discussion
Our study assessed the effectiveness and feasibility of the Mini-CEX for the direct observation of students in the undergraduate medical program at AKU. We also explored the effects of this exercise on student-faculty interaction and obtained stakeholder feedback on strengths and challenges. A recent study from the Norwegian University of Science and Technology evaluated the use of the Mini-CEX in their undergraduate program, concluding with positive recommendations for the implementation of this tool for formative assessment [11]. In our region, there is a paucity of data regarding formative assessments in UGME, with only one study from 2015 [12]. Therefore, we hoped to add to the existing literature a way forward for planning and implementing the Mini-CEX in busy clinical programs.
In our experience, the implementation of this project formalized and improved opportunities for the DO of students. We found some similarities and differences from previous studies [4]. We believe this is likely due to a higher level of learners, i.e., postgraduate trainees, and possibly an increased number of observations. In our understanding, both these results reflect the level of the learner and may help in planning the implementation of the Mini-CEX for undergraduate learners. We advocate repeated use of the Mini-CEX to allow students to work on their skills [13], and shifting the focus of the Mini-CEX to its different domains over time based on the academic progression of the student throughout medical school.
Most of the completed Mini-CEXs came from the Family Medicine clerkship and the fewest from the General Medicine clerkship. We believe this difference is arbitrary, as another study cited the highest number of responses from General Medicine [14]. In our context, the greater complexity of patients in Medicine clinics may have limited Mini-CEX opportunities for third-year students.
The key strength of DO in our study lay in the increased student-faculty interaction, making the Mini-CEX a valuable tool to aid such contact in busy clinical settings [15]. As stated by a student, "Two spaced-out evaluations helped me get a better understanding of my progress in the clinical rotation". Proximate feedback that resulted from this interaction was another strength of this project. As reported previously [16-18], formative assessments and feedback are considered effective tools for improving skills [19], thereby lending support to the implementation of DO via the Mini-CEX. However, despite the value of feedback in improving learner performance, DO in undergraduate medical education remains suboptimal [20]. Institutions planning to implement DO must seek to identify individualized ways to optimize its utilization.
The quality of feedback in DO also matters; learners benefit more from specific feedback than from numerical scores [21]. Timely and specific feedback is often difficult in unobserved student-patient encounters. The Mini-CEX provides an opportunity for consultants to give not only immediate but also specific feedback to improve the clinical judgment of learners [22], as also seen in our study. As quoted by a student, "In its true sense, this gave us the chance to interact with faculty one on one. I always wanted my consultants to look at my examination technique and the way I interacted with patients so they could identify my mistakes/strong points". Feedback from faculty has been shown to trigger positive emotional reactions and self-reflection among students, encouraging them to be more motivated toward learning [23].
The formative nature of this interaction was appreciated by both students and faculty [24]. The resulting feedback allowed better case correlation and understanding and encouraged more active student participation, improved communication, and clarification of Urdu terminologies [25,26]. As one student stated, "It's not graded, so there's less pressure to perform and therefore we do what we usually do in the clinic and learn from our mistakes".
All levels of faculty members participated, although their numbers were small. We believe offering more workshops that enhance understanding of the Mini-CEX tool might engage a greater number of clinical faculty in conducting DOs [27]. Faculty members saw a clear benefit in using the Mini-CEX for learning and feedback and believed it also helped improve their clinical teaching ability. Faculty support for the Mini-CEX is found in prior studies, with another faculty cohort citing the experience as a self-learning exercise [1].
While the students found the interaction during DO beneficial, some found it daunting. As a student described, "Often felt uncomfortable with the consultant observing". The sense of intimidation reported in our study is similar to previous reports in which students and residents were uncomfortable during the Mini-CEX and/or feedback [4,21]. This could be because of a natural fear of criticism, the nature of the faculty, or past interactions. Surprisingly, in our study the junior students felt more comfortable with faculty observing them; whether this was also linked to specific clerkships was not explored. We believe that the intimidation factor contributed markedly to why a majority of students did not support the formal implementation of DO in the UGME curriculum. We suggest an early start and regular use of the Mini-CEX for direct observation and feedback as a way to overcome student discomfort [28].
We found time limitation to be the major barrier to the implementation of the Mini-CEX for DO and a constraint on its feasibility [29]. As reported, "It was hard to manage to get Mini-CEX forms filled due to busy clinics and lack of time". Other factors influencing feasibility were clinic volumes, space constraints, and placing ownership of the process largely on faculty members [3,30]. We propose that in busy clinical settings, the way to overcome these challenges is to create multiple opportunities for the Mini-CEX and allow learners to self-identify opportunities to engage faculty. Faculty members formally engaged in education may also be tasked to perform a set number of Mini-CEXs per clerkship. Sessions to demonstrate the utility of the Mini-CEX and constant reinforcement may also improve its completion rate and, thus, feasibility [31]. Furthermore, to improve feasibility, more clerkships with a wider range of patient problems should be engaged, increasing the number of opportunities for the Mini-CEX to gauge clinical competence.
Another weakness in this process identified by students was low enthusiasm and a lack of uniform and constructive feedback from some faculty members. A student reported, "The expectations of every consultant are very different, thus grading was very subjective". Consultant expectations, and their internal state during the Mini-CEX, could influence the assessment and feedback, causing a lack of uniformity in evaluating student performance [32]. This necessitates ongoing faculty training.
Patient complexity was an additional challenge, as highlighted by a student: "Sometimes, the patients were very complex, which didn't help us perform or learn appropriately. Also, sometimes the patient started addressing the consultant directly, and it got hard to maintain communication with the patient". The solution to this lies in more frequent observations, thereby increasing exposure to varied patient complexity.
Our study has some limitations. It was a single-institution study; therefore, external validity may be affected. Another limitation was the small number of faculty engaged in the process. A self-selection bias in faculty members with an inherent interest in education may have influenced their affirmation of the DO process. We may have seen different results with a larger faculty cohort.

Conclusions
The pandemic has had a transformative effect on health education. Medical students being brought to the frontlines of care and the broader use of their communication skills for the implementation of effective telemedicine make the adoption of competency-based education even more compelling. DO is an important tool to assess competency and should be strategically implemented in the assessment program of undergraduate medical education. The Mini-CEX is a valid tool for formative assessment of students' clinical skills and communication and should be incorporated purposefully in curricula that aim for competency attainment. However, factors such as time constraints and clinic patient influx limit its completion rate, and by extension, the feasibility of the tool itself.
Therefore, we believe that the way forward is to incorporate planned opportunities for multiple Mini-CEXs into the curriculum and to ensure that faculty engaged in clinical education perform a set number of Mini-CEXs, with executional responsibility shared by faculty and learners. Furthermore, we recommend the early introduction of workplace-based assessments [Mini-CEX, Direct Observation of Procedural Skills (DOPS), Case-Based Discussions (CBD)] within the UGME to allow students to overcome the sense of intimidation, with the identification and training of clinical faculty for DO and feedback as the best way to ensure quality and optimum utilization.

Additional Information
Disclosures
Human subjects: Consent was obtained or waived by all participants in this study. Ethics Review Committee, The Aga Khan University issued approval 2018-0516-662. Animal subjects: All authors have confirmed that this study did not involve animal subjects or tissue. Conflicts of interest: In compliance with the ICMJE uniform disclosure form, all authors declare the following: Payment/services info: All authors have declared that no financial support was received from any organization for the submitted work. Financial relationships: All authors have declared that they have no financial relationships at present or within the previous three years with any organizations that might have an interest in the submitted work.
Other relationships: All authors have declared that there are no other relationships or activities that could appear to have influenced the submitted work.