Competency-Based Medical Education: Are Canadian Pediatric Anesthesiologists Ready?

Background With the introduction of Competency-Based Medical Education (CBME), the Canadian Pediatric Anesthesia Society (CPAS) surveyed its members to assess their awareness of and prior experience with CBME concepts and evaluation tools, and identify methods for faculty development of CBME teaching strategies for pediatric anesthesia residents and fellows. Methods An online survey was sent to CPAS members. Outcomes included respondents’ previous exposure to CBME and the educational support they had received in anticipation of the curriculum. Questions used multi-item Likert scales and a general feedback question. Results The response rate was 39% (60/155). Eighty-eight percent of respondents spent ≥50% of their time practicing pediatric anesthesia; 78% and 45% spent at least a quarter of their time teaching residents and fellows respectively. Eighty-three percent were familiar with CBME concepts, and 58% were familiar with Milestones, Competencies, and Entrustable Professional Activities (EPAs). However, 64% had not received any formal training and 52% had not used any CBME evaluation tools. Learning preferences included small group discussions (72%), lectures with questions and answers (Q&A) (62%), seminars (50%), and workshops (50%). Conclusions Despite widespread awareness of CBME concepts, there is a need to educate Canadian pediatric anesthesiologists regarding CBME evaluation tools. Faculty development support will increase the utilization of these tools in teaching practice.


Introduction
There is a shift occurring in medical education from a time-based model to a competency-based model [1,2]. Moving from the traditional pediatric anesthesia fellowship to a competency-based model may be required to better align medical education along a continuum of competence. The Royal College of Physicians and Surgeons of Canada is currently introducing Competence by Design (CBD), a multi-year initiative to implement a competency-based medical education (CBME) approach to residency education and specialty practice in Canada and to align Royal College policies and processes with a CBME approach [3].

Study design and approval
An online survey tool was developed by the authors using Fluidsurveys (http://fluidsurveys.com), a secure online electronic data capture tool hosted in Canada. The survey was reviewed by the Education Committee of the Canadian Pediatric Anesthesia Society (CPAS) prior to obtaining ethical approval from the University of British Columbia Children's & Women's Research Ethics Board (approval number H15-02624). In keeping with previous surveys [7,8], practicing pediatric anesthesiologists were identified as eligible study participants by their membership in CPAS. The majority of CPAS members were working at academic institutions that provide pediatric anesthesia fellowship training. The survey questionnaire was pilot-tested using a convenience sample of five pediatric anesthesia fellows and four pediatric anesthesiologists.

Study participants
The online survey was distributed to 155 out of 160 CPAS members who had a valid email address in the membership database. The survey remained open for 60 days after the initial email was sent, and a single follow-up reminder email was sent to all eligible study participants two weeks after the initial invitation to participate in the survey. No identifying data was collected and consent was implied by participation.

Survey instrument
Participants were asked to describe their pediatric anesthesia experience according to four intervals (less than 5 years, 5-10 years, 11-20 years, and greater than 20 years) and their teaching activities according to what proportion of their time (None, <25%, 25-50%, 50-75%, >75%) was spent teaching fellows, residents, and medical students.
Participants were also asked to rate their familiarity with CBME, in particular the major concepts of Milestones, Competencies, and Entrustable Professional Activities (EPAs). Participants were asked to rate their experience with these concepts in their own training and in their teaching activities, each according to a 5-point Likert scale (Strongly disagree, Disagree, Neither agree nor disagree, Agree, Strongly agree).
In addition, participants were asked about their experience with specific CBME evaluation tools (DOPS, A-CEX/Mini-CEX, ANTS/ALMATS, MSF, CBD, and simulation), how easy each was to apply in an anesthesia setting, and how accurately each assessed trainees' ability. Finally, participants were asked to indicate their opinion on the value of the CBME approach to pediatric anesthesia training and their preferred formats for acquiring the necessary skills to employ CBME approaches in their teaching (lectures without question and answer sessions, lectures with question and answer sessions, seminars, simulation cases, workshops, journal club, small group discussions, online podcasts, operating room case-based learning, one-on-one consultations with colleagues and/or peer experts and leaders). Respondents were instructed to select as many choices as they felt applicable.
The median length of pediatric anesthesia practice among respondents was 11-20 years. Almost three-quarters of respondents (44/60, 73%) stated that the majority of their time (greater than 75%) was spent practicing pediatric anesthesia, and a further 9/60 (15%) stated that pediatric anesthesia comprised 50-75% of their practice.
The majority of respondents (47/60, 78%) spent greater than 25% of their time teaching residents; only one respondent reported spending no time teaching anesthesia residents. Less than half of respondents (27/60, 45%) reported spending more than 25% of their time teaching fellows, with 10/60 (17%) reporting no teaching activities with fellows. The majority of respondents (51/60, 85%) reported spending less than 25% of their time teaching medical students.

Familiarity with CBME concepts and evaluation tools
Fifty out of 60 (83%) respondents replied that they were familiar with the concept of CBME, with a mean response of 4.13 (Agree) (Table 2). The majority of respondents (35/60, 58%) replied that they were familiar with the major concepts of Milestones, Competencies, and EPAs. Overall, 38/59 (64%) had not received any formal training in using CBME evaluation tools, and 31/60 (52%) were not currently using, and had not previously used, any CBME evaluation tools. CBD was the most commonly used tool for teaching (49/54, 91%) and evaluation of trainees (40/58, 69%) (Table 2). Simulation was actively used by 22/42 (52%) of respondents for teaching, but was incorporated less often as an evaluation tool (10/40, 25%).

Respondents' preferred learning modes for the use of competency-based tools
While very few respondents had been exposed to CBME during their own training (7/58, 12%) (Table 2), half of the respondents reported that competency-based training and evaluation would be a useful change to implement in pediatric anesthesia fellowship training. There was a strong learning preference for small group discussions (43/60, 72%), lectures with question and answer sessions (37/60, 62%), seminars (30/60, 50%), and workshops (30/60, 50%) (Table 4). Interestingly, simulation cases, one of the CBME learning/evaluation tools, were not among the top three preferences selected (21/60, 35%).

TABLE 4: Respondents' perception of evaluation tools
Total number of participants (N) = 60; n = number of participants; data presented as n (%; 95% CI)

Discussion
The pending implementation of CBME by the Royal College of Physicians and Surgeons of Canada in Canadian anesthesia residency training represents a significant paradigm shift in the teaching and evaluation of pediatric anesthesia. The results of this survey suggest that the Canadian pediatric anesthesia community is familiar with some of the key concepts of CBME: Milestones, Competencies, and EPAs. However, major gaps exist in the knowledge and current use of the full range of CBME evaluation tools for teaching and evaluating pediatric anesthesia trainees. Faculty identified small group sessions as key to gaining information on this new curriculum.
Competency-based pediatric anesthesia training presents opportunities for a meaningful assessment using evaluation tools that are based on a detailed rubric (the milestones), and which are available to both the trainer and trainee. Outcome-based evaluation allows learners to focus on the key concepts and skills to direct their learning. This is in contrast to traditional methods of medical education, which have emphasized a time-based evaluation. Curricula are designed around competency-based milestones that must be clearly defined and agreed upon, including specific milestones for an individual subspecialty, such as has recently been published for pediatric cardiac anesthesiology [10].
Our findings demonstrate that the majority of pediatric anesthesiologists are not familiar with the range of CBME tools and do not have experience with their use. Educating anesthesiologists regarding the best choice of tools to measure specific learning outcomes is vital in order to facilitate constructive feedback to trainees, allowing them to improve their learning experience and achieve the required competencies. Educators will need to assist faculty with professional development in the use of DOPS, A-CEX/Mini-CEX, ANTS/ALMATS, MSF, CBD, and simulation for assessment. Hodges et al. have previously argued that checklist-type tools offer poor discrimination of expertise, but can be used in combination with global rating scales [11]. The new CBME curriculum will also include the assessment of EPAs and milestones. Faculty across Canada will require training on the use of these tools for an effective transition to the CBME model.
Reassuringly, among respondents familiar with these evaluation tools, the consensus was that the tools were easy to apply and were perceived to reasonably assess the milestone, competency, or EPA they were designed to evaluate. Recent studies [12,13] address the primary concerns and challenges of CBME implementation and how to support faculty development in order to "train the trainers". Boet et al. [14] identified concerns from Canadian program directors regarding the administration of CBME, including challenging scheduling and the need for 'buy-in' from faculty. They did not, however, report on approaches to securing buy-in or to faculty development. A recent study of the implementation and evaluation of an online feedback tool for anesthesia residents, which facilitated mapping of feedback to milestones, demonstrated some perceived benefits for trainees but also highlighted issues with uptake of the tool by staff [15].
Our study reports that respondents prefer small group discussions and panel discussions as methods for acquiring training on the CBME process and the use of assessment tools. The inclusion of CBME-focused small group discussions and panel discussions at pediatric anesthesia meetings may assist in addressing this important need during this time of transition. The use of small group sessions will likely be costly. Faculty developers will need to consider the impact of these challenges on delaying implementation, and may consider starting with faculty who regularly teach residents. At a single Canadian pilot site, faculty training was met with such challenges, and these will need to be addressed country-wide with local considerations [16]. Further, the development and validation of pediatric anesthesia EPAs and global rating scales will assist in providing a standardized assessment of fellows within Canada and in other jurisdictions [17].

Study limitations
There are several limitations to our study. This is a survey of pediatric anesthesiologists practicing in Canada, and the results may not be generalizable to other jurisdictions. However, pediatric anesthesia fellowship training is anecdotally similar across the globe, so many of the tools and methods we reviewed may warrant consideration elsewhere. Another limitation is that the survey reflects practice at a single point in time. Further surveys or mixed-methods studies may identify other barriers or opportunities for implementing competency-based medical education in pediatric anesthesia fellowship training.

Conclusions
The results of this survey suggest that, despite widespread general awareness of the CBME concept and its major constructs, there is a need to educate Canadian pediatric anesthesiologists in the use of specific CBME evaluation tools. Further faculty development is required as one of many factors to increase the utilization of these tools in teaching practice.

Additional Information Disclosures
Human subjects: Consent was obtained or waived by all participants in this study. University of British Columbia Children's & Women's Research Ethics Board issued approval H15-02624. Animal subjects: All authors have confirmed that this study did not involve animal subjects or tissue. Conflicts of interest: In compliance with the ICMJE uniform disclosure form, all authors declare the following: Payment/services info: All authors have declared that no financial support was received from any organization for the submitted work. Financial relationships: All authors have declared that they have no financial relationships at present or within the previous three years with any organizations that might have an interest in the submitted work. Other relationships: All authors have declared that there are no other relationships or activities that could appear to have influenced the submitted work.