"Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed, it is the only thing that ever has."

Margaret Mead
Original article
peer-reviewed

Expert Facilitated Development of an Objective Assessment Tool for Point-of-Care Ultrasound Performance in Undergraduate Medical Education



Abstract

Background: With the various applications of point-of-care ultrasound (PoCUS) steadily increasing, many medical schools across North America are incorporating PoCUS training into their undergraduate curricula. The Faculty of Medicine at Memorial University also intends to introduce PoCUS training into its own undergraduate medical program. The proposed approach is to introduce a PoCUS curriculum focusing on anatomy and physiology while developing cognitive and psychomotor skills that are later transferred into clinical applications. This has been the common approach taken by most undergraduate ultrasound programs in the United States. This article describes the development of an objective assessment tool that meets the unique needs of this proposed undergraduate ultrasound curriculum, along with the challenges involved in creating it.

Methods: After a thorough review of the existing literature and input from experts in PoCUS, a prototype global rating scale (GRS) and three exam-specific checklists were created by the researchers. The exam-specific checklists cover the aorta exam, the subxiphoid cardiac exam, and the focused abdominal exam. A panel of 18 emergency room physicians certified in PoCUS was recruited to evaluate the GRS and the three checklists using a modified Delphi technique. The items were rated on a 5-point Likert scale. If an item received a mean score of less than 4, it was deemed unimportant for the assessment of PoCUS performance in undergraduate medical learners and was excluded. Experts were also encouraged to provide comments and suggest further items to be added to the GRS or checklists. Items were modified according to these comments, and all edits were then returned to the experts for review in subsequent rounds.

Results: A consensus was achieved after three rounds of surveys, with the final GRS containing nine items. The final aorta checklist contained nine items, and the subxiphoid cardiac and focused abdominal checklists each contained 11 items.

Conclusion: By using a modified Delphi technique, we developed a single GRS and three checklists. A panel of independent PoCUS practitioners supports the content validity of these tools. Research is currently ongoing to evaluate their validity for assessing PoCUS competency in undergraduate medical students.

Introduction

In the last two decades, the medical community has seen a steady increase in the use of point-of-care ultrasound (PoCUS) [1]. PoCUS, the use of ultrasound at a patient’s bedside, is now part of the daily practice of many physicians in various specialties. The wide acceptance of PoCUS is largely attributable to the versatility of ultrasound imaging and the advent of smaller, cheaper ultrasound machines that provide high-quality images [1-2]. PoCUS has become both an adjunct to the physical exam and an important tool used in many procedures [1, 3-7]. Visualization of vital internal structures using ultrasound can narrow the clinician’s differential diagnosis and improve diagnostic accuracy [8]. Ultrasound-guided procedures include, but are not limited to, central venous catheter placement, regional anesthesia, thoracentesis, and paracentesis [1, 9]. A review of the literature shows that the body of research regarding PoCUS is growing rapidly, expanding the list of PoCUS-related procedures and exams. When used appropriately, these procedures and exams can ultimately lead to superior patient care and safety [1, 10].

In addition to its versatility, safety, and overall patient benefits, PoCUS has been shown to be safely teachable to medical students and novice examiners [8, 11-15]. It is therefore not surprising that a number of medical programs are piloting and incorporating PoCUS into their curricula [5, 16-21]. Medical schools need to develop PoCUS programs if they are to ensure their students remain current with modern medical advances and are adequately prepared to face the changing demands of clinical practice [22-23]. Memorial University’s Faculty of Medicine recognizes this need and endeavors to train competent and knowledgeable generalists who are able to meet these demands [Metcalfe et al., 2014; http://www.wcume.org/wp-content/uploads/2015/09/v6-OHSU_14_01-WC-Ultrasound-Conference_Guide-Book_V6.pdf].

Our approach is to develop and introduce an undergraduate PoCUS curriculum focusing on anatomy and physiology while developing cognitive and psychomotor skills that can later be transferred into clinical applications. This has been the common approach taken by most undergraduate ultrasound programs in the United States [24]. Figure 1 demonstrates the educational theory behind Memorial University’s approach to developing and introducing an undergraduate PoCUS curriculum. This educational theory is based on Bloom’s three domains of learning [25]. It presumes that undergraduate PoCUS learners first acquire the attitudes and physical proficiencies to perform ultrasound, followed later in their medical school career by the ability to “put it all together” when making clinical decisions.

Like any clinical skill, the addition of PoCUS to medical school curricula necessitates evaluation. We must be able to assess a practitioner’s level of competence, since the inappropriate use of PoCUS can be dangerous [26]. Traditional assessment of clinicians participating in PoCUS training courses includes observation with subsequent written and visual exams. Assessing medical students in the same manner as experienced clinicians may not be appropriate [27]. Accordingly, undergraduate medical education requires its own unique form of assessment. Although assessment tools have been designed for practicing clinicians [28], and milestones for curricula development have been suggested [24], we are unaware of any existing assessment tool created specifically for undergraduate learners. We, therefore, set out to create an assessment tool that would fit within our educational framework and also meet the unique needs of our proposed undergraduate program. Our tool is designed with the intention of observing and assessing medical students over time, both in pre-clerkship, when they have minimal knowledge of PoCUS and its applications, and in clerkship, when they should be better able to understand and appreciate the many clinical applications of PoCUS.

Materials & Methods

A modified Delphi technique was used to obtain expert consensus on items to be included in the assessment tool. The Delphi technique requires a panel of experts to complete several rounds of an opinion-eliciting survey. The responses to the survey are then collected, analyzed, summarized, and redistributed to the experts in the form of a new survey. Multiple iterations are used to achieve consensus [29]. This method encourages debate while maintaining anonymity, lessening the impact of strong opinions and personalities on final consensus formation [30].

A non-probability sampling technique known as purposive sampling was used to select the expert panel to which the survey would be distributed. We initially administered the survey to a group of highly trained specialists practicing in a variety of areas including emergency medicine, anesthesia, intensive care, obstetrics/gynecology, otolaryngology, and radiology. The initial administration of the survey was used to analyze the feasibility of the study, specifically as it related to participant recruitment/survey administration and sample/panelist selection. Some of these participants indicated that their clinical use of ultrasound was highly specialized and, as a result, did not feel they could comment on how to assess learners who would practice PoCUS as generalists. For example, an intensivist said, “I’m not sure I am best placed to respond to this survey. My skill set is not as comprehensive as ER docs.”

Based on this feedback, the participant inclusion criteria were revised, limiting participants to local emergency physicians, as they were thought to be best equipped to aid in the development of a tool for learners who will be trained as generalists in PoCUS. These participants were certified with the Canadian Emergency Ultrasound Society as Independent Practitioners and Master Instructors.

The survey was then redistributed to these experts, who were asked to give their opinion on a “Point of Care Ultrasound Assessment Tool for Undergraduate Medical Education” comprising a Global Rating Scale (GRS) and three anatomically specific checklists (aorta, focused abdominal exam, and cardiac). Systematic reviews have shown that checklists and GRSs have differing strengths and weaknesses [31], making the simultaneous use of a GRS and checklists valuable. The GRS was modified from the original Objective Structured Assessment of Technical Skills (OSATS) to include skills specific to ultrasound. Both the GRS [32] and the OSATS method of assessment have been found to be valid and reliable [32-36].
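
To make this two-part structure concrete, the following is a minimal sketch, in Python, of how a combined GRS-plus-checklist assessment might be represented. The class and field names are illustrative assumptions, not the authors' implementation, and only a few example items mentioned in the text are shown rather than the full published instrument.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ChecklistItem:
    """One binary, exam-specific observation."""
    description: str
    observed: bool = False

@dataclass
class GRSItem:
    """One globally rated skill; OSATS-style scales are commonly anchored 1-5."""
    description: str
    score: int = 1

@dataclass
class PoCUSAssessment:
    """Combined tool: one GRS plus the checklist matching the exam performed."""
    exam: str  # "aorta", "subxiphoid cardiac", or "focused abdominal"
    grs: List[GRSItem] = field(default_factory=list)
    checklist: List[ChecklistItem] = field(default_factory=list)

# Example items drawn from those named later in the Results section
assessment = PoCUSAssessment(
    exam="aorta",
    grs=[GRSItem("image interpretation"), GRSItem("probe technique")],
    checklist=[ChecklistItem("scans aorta from diaphragm to bifurcation")],
)
```

Keeping the binary checklist separate from the globally rated GRS mirrors the complementary strengths noted in [31]: checklists capture whether discrete steps were performed, while the GRS captures the overall quality of performance.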

Each expert was asked to score each GRS item and each checklist item as “yes” for inclusion or “no” for exclusion. The experts were then asked to rate these same items on a 5-point Likert scale. As in other published studies, items with a mean score greater than 4 and a standard deviation (SD) ≤ .5 were deemed important for the assessment of PoCUS performance in undergraduate medical learners [37]. If an item received a mean score of less than 4, it was deemed unimportant for this assessment and was excluded. In addition, we encouraged the experts to provide comments and suggest additional items that might be included, and items were modified accordingly. Items that met the inclusion criteria, but could be improved with adaptation according to experts’ comments, were also modified. If no comments were made but an item required revision, key informants were consulted. All changes and additions were analyzed, summarized, and then returned to the experts in subsequent rounds of the survey. When the participants suggested no additional items or significant changes, the results were compiled, and the final GRS and checklists were created.
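
As a rough illustration of this decision rule (an assumed Python rendering for clarity, not the authors' actual analysis code), the function below classifies one item's Likert ratings into accept, exclude, or revise.

```python
import statistics

def classify_item(ratings, threshold=4.0, max_sd=0.5):
    """Apply the stated Delphi criteria to one item's 1-5 Likert ratings:
    mean > 4 with SD <= .5 -> accept; mean < 4 -> exclude; anything in
    between is revised per expert comments and re-surveyed next round."""
    mean = statistics.mean(ratings)
    sd = statistics.stdev(ratings)  # sample standard deviation
    if mean > threshold and sd <= max_sd:
        return "accept"
    if mean < threshold:
        return "exclude"
    return "revise"

# Hypothetical round: 12 experts rate an item 5, and six rate it 4
print(classify_item([5] * 12 + [4] * 6))  # mean ~4.67, SD ~0.49 -> "accept"
```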

Results

The final GRS and checklists were developed after three rounds of surveys.

All 18 invited reviewers completed the first round of the survey. Based on the Delphi method criteria, only three of the 11 initial GRS items were accepted by the reviewers: image interpretation, knowledge of procedure (if applicable), and overall performance. The other eight items were either rejected or modified based on the reviewers’ comments. The expert panel also suggested two new items be added to the GRS: documentation of ultrasound image, and demonstrates understanding of personal and technical limitations.

Similarly, based on the Delphi method criteria, 11 of the original 23 checklist items were accepted into their respective final checklists. The item 'landmarks xiphoid process' was rejected from the focused abdominal ultrasound checklist based on the Delphi criteria. The researchers revised the remaining 11 checklist items for the next round of the survey based on the reviewers’ comments. The new items suggested by the reviewers were: differentiates aorta from IVC, identifies left and right ventricle, scans pelvis, identifies Pouch of Douglas or recto-vesical pouch and presence of fluid, and demonstrates techniques for dealing with rib shadows.

All 18 reviewers completed the second iteration of the survey. In round two, they reviewed the six updated and the two new GRS items. Two items were accepted for inclusion in the final version of the GRS: image optimization (probe choice, optimization of gain and depth) and probe technique. The panel rejected two items: documentation of ultrasound image, and demonstrates understanding of personal and technical limitations. The remaining four items were sent back to the researchers to be revised.

In round two, the expert panel also reviewed a combined total of 15 revised and new checklist items. Five of the revised checklist items were accepted into their respective final checklists (obtains verbal consent for bedside ultrasound when possible, ensures patient is in a comfortable position and is draped appropriately, scans aorta from diaphragm to bifurcation, accurately measures aorta by measuring outside wall to outside wall, identifies the apex and septum). The reviewers rejected three items (scans pelvis, identifies Pouch of Douglas or recto-vesical pouch, and presence of fluid). The remaining checklist items were sent back to the researchers for further revision based on the reviewers’ comments.

Nine of the invited reviewers completed the third and final iteration of the survey, which was conducted in real time, face-to-face. They reviewed four GRS items (preparation for procedure: machine, machine placement, gel, towels; patient interaction: rapport, patient comfort; use of sterile technique, if applicable; troubleshooting: adjusts approach as necessary) and eight checklist items (washes hands before performing ultrasound, demonstrates appropriate starting position by identifying cardiac activity, landmarks spine and spine shadow, differentiates aorta from IVC and other vascular structures, landmarks the xiphoid process to begin the subxiphoid cardiac exam, identifies left and right ventricle of the heart, scans and sweeps splenodiaphragmatic interface, demonstrates techniques for dealing with rib shadows) from the previous round. The reviewers’ comments were used to modify items in real time, and the reviewers accepted all of the items once they were modified accordingly. For clarity, the number of accepted, excluded, revised, and suggested items in each round is shown in Figure 2. The final GRS, aorta, subxiphoid cardiac, and focused abdominal ultrasound checklists are shown below (Figures 3-6).

Discussion

The purpose of this project was to create an objective assessment tool that will meet the needs of Memorial University Faculty of Medicine’s proposed PoCUS curriculum. The evaluation of ultrasound skills requires objective, reliable, and validated assessment tools to ensure that a standard, general level of competence is attained [1]. Given that some undergraduate learning objectives may differ from those of postgraduate and clinician training courses, this proposed PoCUS undergraduate program requires a unique form of assessment.

The Faculty of Medicine at Memorial University trains undergraduate students with the objective of preparing them as generalists, ready to enter whichever residency program they choose. As a result, any newly introduced PoCUS program must also align with this key principle. The knowledge base, clinical competency, and needs of postgraduate learners differ from those in undergraduate medicine. Undergraduate medical students cannot be expected to study and be assessed using a PoCUS curriculum that is ultimately aimed at a different level and type of learner. Interestingly, this became evident when our survey was first administered to a variety of specialists. While we initially thought that recruiting specialists for the study would result in the development of a more comprehensive assessment tool, the specialists felt uncomfortable developing a tool that would be used to assess generalists. Given the broad scope of practice and generalist approach used in emergency medicine, we decided that emergency medicine physicians would be best suited to comment on a tool meant to assess learners who would be trained as generalists.

The proposed PoCUS program at Memorial University envisions undergraduate students learning anatomy/physiology and physical exam skills while concurrently gaining the skills necessary to perform PoCUS. Initially, students will learn the general principles of PoCUS, such as ultrasound physics and probe technique, while simultaneously gaining anatomical knowledge and clinical skills. By introducing these skills early, students will have more opportunity to practice their PoCUS skills, and faculty will have more time to assess them, helping to ensure they are competent and qualified to practice by the end of medical school. This proposed curriculum and accompanying assessment tool strengthen the assessment of PoCUS skills in two ways. First, assessment is more effective when done over time by different faculty [38]. Second, more consistent feedback is gained when using a tool designed specifically for a skill like PoCUS [38]. Learners in this proposed undergraduate curriculum will have far more time for structured and consistent assessment than their postgraduate counterparts. It is our belief that better assessment leads to better skills.

Objective tools like the one developed here enhance the learning process by facilitating constructive feedback on each specific item and by marking progression over time [39]. Some may argue that this integrated approach is impractical and could make learning, and the subsequent assessment of, anatomy/physiology and PoCUS more difficult. However, in other domains of knowledge acquisition, evidence suggests that incorporating a skill such as critical thinking into existing subjects may be advantageous compared to teaching it in a separate course [40]. Although further research is required to determine whether the same principle applies to teaching ultrasound skills across the curriculum and existing courses, at this stage we speculate that the approach is, at worst, not harmful even if it proves not to be beneficial. Furthermore, this tool was modified from previously existing assessment tools that have been shown to be valid and reliable [32-34]. As well, it can be used in both the simulation lab and the clinical area [39, 41], making this form of assessment both effective and efficient.

Our research team acknowledges several limitations. This project was potentially impacted by both sample size and composition. Despite the growing popularity of PoCUS, there are a limited number of certified, independent practitioners at our site. Furthermore, these participants have, for the most part, undergone similar training by a limited number of instructors. A larger, more diverse sample may have provided more varied responses. Moreover, because of the small sample size, standard deviations of .51 were rounded down, allowing those items to be accepted. Without this relaxation, seemingly important items that received only “important” or “very important” ratings on the Likert scale would not have been accepted for inclusion in the final assessment tool.
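
To illustrate the arithmetic behind this relaxation with a hypothetical example (not data from the study): if 18 raters split evenly between “important” (4) and “very important” (5), the sample standard deviation lands just above the strict cutoff, and rounding it to one decimal place admits the item.

```python
import statistics

# Hypothetical: 9 raters score an item 4 ("important"), 9 score it 5 ("very important")
ratings = [4] * 9 + [5] * 9
mean = statistics.mean(ratings)              # 4.5
sd = statistics.stdev(ratings)               # ~0.5145, just over the .5 cutoff
strict = mean > 4 and sd <= 0.5              # False under the original criterion
relaxed = mean > 4 and round(sd, 1) <= 0.5   # True once 0.51 is rounded down to 0.5
print(f"mean={mean}, sd={sd:.4f}, strict={strict}, relaxed={relaxed}")
```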

Additionally, the Delphi technique was modified so that participants fully understood the primary objective of the project, which was to create an assessment for use in an undergraduate curriculum. Educational methods used for novices can be ineffective in training experts, a phenomenon known as the expertise reversal effect [42]. Furthermore, not all assessment methods are appropriate for every level of learner [27]. As a result, we thought it imperative that the experts helping to create the tool considered the assessment of PoCUS at the undergraduate level and did not focus solely on how they themselves, as experts, would be assessed. Accordingly, as a means of stressing this point, the third round of the survey was conducted face-to-face rather than online. Every effort was made to avoid biasing the panel. Lastly, caution must be taken when using an opinion-based assessment tool [36]; however, we believe this issue can be addressed in future projects by testing the validity and reliability of this tool using independent raters.

Conclusions

Using a modified Delphi technique, we were able to create an objective assessment tool for undergraduate PoCUS learners at Memorial University. A panel of expert PoCUS practitioners supported the content validity of this tool. Research is currently ongoing to evaluate this tool’s validity in assessing PoCUS competency in undergraduate medical students. At a time when many medical schools are changing their curricula to coincide with changing national exams, this technique can be modified and used to create further objective assessment tools.


References

  1. Moore CL, Copel JA: Point-of-care ultrasonography. N Engl J Med. 2011, 364:749–757. 10.1056/NEJMra0909487
  2. Halm BM: Reducing the time in making the diagnosis and improving workflow with point-of-care ultrasound. Pediatr Emerg Care. 2013, 29:218–221. 10.1097/PEC.0b013e318280d698
  3. Milas M, Stephen A, Berber E, Wagner K, Miskulin J, Siperstein A: Ultrasonography for the endocrine surgeon: a valuable clinical tool that enhances diagnostic and therapeutic outcomes. Surgery. 2005, 138:1193–1200. 10.1016/j.surg.2005.08.032
  4. Butter J, Grant TH, Egan M, et al.: Does ultrasound training boost Year 1 medical student competence and confidence when learning abdominal examination?. Med Educ. 2007, 41:843–848. 10.1111/j.1365-2923.2007.02848.x
  5. Milling TJ Jr, Rose J, Briggs WM, et al.: Randomized controlled clinical trial of point-of-care limited ultrasonography assistance of central venous cannulation: The third sonography outcomes assessment program (SOAP-3) trial. Crit Care Med. 2005, 33:1764–1769.
  6. Ollerton JE, Sugrue M, Balogh Z, D'Amours SK, Giles A, Wyllie P: Prospective study to evaluate the influence of FAST on trauma patient management. J Trauma. 2006, 60:785–791. 10.1097/01.ta.0000214583.21492.e8
  7. Swamy M, Searle RF: Anatomy teaching with portable ultrasound to medical students. BMC Med Educ. 2012, 12:99. 10.1186/1472-6920-12-99
  8. Parks AR, Atkinson P, Verheul G, Leblanc-Duchin D: Can medical learners achieve point-of-care ultrasound competency using a high-fidelity ultrasound simulator?: a pilot study. Crit Ultrasound J. 2013, 5:9. 10.1186/2036-7902-5-9
  9. Nicolaou S, Talsky A, Khashoggi K, Venu V: Ultrasound-guided interventional radiology in critical care. Crit Care Med. 2007, 35:S186–S197. 10.1097/01.CCM.0000260630.68855.DF
  10. Shah NB, Platt SL: ALARA: is there a cause for alarm? Reducing radiation risks from computed tomography scanning in children. Curr Opin Pediatr. 2008, 20:243–247. 10.1097/MOP.0b013e3282ffafd2
  11. Bonnafy T, Lacroix P, Desormais I, et al.: Reliability of the measurement of the abdominal aortic diameter by novice operators using a pocket-sized ultrasound system. Arch Cardiovasc Dis. 2013, 106:644–650. 10.1016/j.acvd.2013.08.004
  12. Brennan JM, Blair JE, Goonewardena S, et al.: A comparison by medicine residents of physical examination versus hand-carried ultrasound for estimation of right atrial pressure. Am J Cardiol. 2007, 99:1614–1616. 10.1016/j.amjcard.2007.01.037
  13. Frederiksen CA, Juhl-Olsen P, Andersen NH, Sloth E: Assessment of cardiac pathology by point-of-care ultrasonography performed by a novice examiner is comparable to the gold standard. Scand J Trauma Resusc Emerg Med. 2013, 21:87. 10.1186/1757-7241-21-87
  14. Mouratev G, Howe D, Hoppmann R, et al.: Teaching medical students ultrasound to measure liver size: Comparison with experienced clinicians using physical examination alone. Teach Learn Med. 2013, 25:84–88. 10.1080/10401334.2012.741535
  15. Panoulas VF, Daigeler AL, Malaweera AS, et al.: Pocket-size hand-held cardiac ultrasound as an adjunct to clinical examination in the hands of medical students and junior doctors. Eur Heart J Cardiovasc Imaging. 2013, 14:323–330. 10.1093/ehjci/jes140
  16. Hoppmann R, Cook T, Hunt P, et al.: Ultrasound in medical education: A vertical curriculum at the University of South Carolina School of Medicine. J S C Med Assoc. 2006, 102:330–334.
  17. Fox JC, Schlang JR, Maldonado G, Lotfipour S, Clayman RV: Proactive medicine: the "UCI 30," an ultrasound-based clinical initiative from the University of California, Irvine. Acad Med. 2014, 89:984–989. 10.1097/ACM.0000000000000292
  18. Mollenkopf M, Tait N: Is it time to include point-of-care ultrasound in general surgery training? A review to stimulate discussion. ANZ J Surg. 2013, 83:908–911. 10.1111/ans.12363
  19. Rao S, van Holsbeeck L, Musial JL, et al.: A pilot study of comprehensive ultrasound education at the Wayne State University School of Medicine: a pioneer year in review. J Ultrasound Med. 2008, 27:745–749.
  20. Solomon SD, Saldana F: Point-of-care ultrasound in medical education-stop listening and look. N Engl J Med. 2014, 370:1083–1085. 10.1056/NEJMp1311944
  21. Bahner DP, Adkins EJ, Hughes D, Barrie M, Boulger CT, Royall NA: Integrated medical school ultrasound: development of an ultrasound vertical curriculum. Crit Ultrasound J. 2013, 5:6. 10.1186/2036-7902-5-6
  22. Blickendorf JM, Adkins EJ, Boulger C, Bahner DP: Trained simulated ultrasound patients: Medical students as models, learners, and teachers. J Ultrasound Med. 2014, 33:35–38. 10.7863/ultra.33.1.35
  23. Royse CF, Canty DJ, Faris J, Haji DL, Veltman M, Royse A: Core review: physician-performed ultrasound: the time has come for routine use in acute care medicine. Anesth Analg. 2012, 115:1007–1028. 10.1213/ANE.0b013e31826a79c1
  24. Dinh VA, Lakoff D, Hess J, et al.: Medical student core clinical ultrasound milestones: a consensus among directors in the United States. J Ultrasound Med. 2016, 35:421-434. 10.7863/ultra.15.07080
  25. Bloom BS: Taxonomy of Educational Objectives: The Classification of Educational Goals: By a Committee of College and University Examiners. Handbook 1. David McKay Co, New York, NY; 1969.
  26. Canadian Association of Radiologists: Position statement on the use of point of care ultrasound. Canadian Association of Radiologists. 2013, 1-10. Accessed: May 10, 2016: http://www.car.ca/uploads/standards%20guidelines/point_of_care_ultrasound_position_statement_20140527.pdf.
  27. Kalyuga S: Rapid cognitive assessment of learners' knowledge structures. Learn Instr. 2006, 16:1–11. 10.1016/j.learninstruc.2005.12.002
  28. Tolsgaard MG, Todsen T, Sorensen JL, et al.: International multispecialty consensus on how to evaluate ultrasound competence: a Delphi consensus survey. PloS One. 2013, 8:e57687. 10.1371/journal.pone.0057687
  29. Polit DF, Beck CT: Quantitative research for various purposes. Nursing Research: Principles and Methods. 7th edition. Polit DF, Beck CT (ed): Lippincott Williams & Wilkins, Philadelphia, PA; 2004. 223-244.
  30. Clayton MJ: Delphi: a technique to harness expert opinion for critical decision‐making tasks in education. Educational Psychology. 1997, 17:373–386. 10.1080/0144341970170401
  31. Ilgen JS, Ma IW, Hatala R, Cook DA: A systematic review of validity evidence for checklists versus global rating scales in simulation‐based assessment. Med Educ. 2015, 49:161–173. 10.1111/medu.12621
  32. Doyle JD, Webber EM, Sidhu RS: A universal global rating scale for the evaluation of technical skills in the operating room. Am J Surg. 2007, 193:551–555. 10.1016/j.amjsurg.2007.02.003
  33. Martin JA, Regehr G, Reznick R, et al.: Objective Structured Assessment of Technical Skills (OSATS) for surgical residents. Br J Surg. 1997, 84:273–278. 10.1046/j.1365-2168.1997.02502.x
  34. Niitsu H, Hirabayashi N, Yoshimitsu M, et al.: Using the Objective Structured Assessment of Technical Skills (OSATS) global rating scale to evaluate the skills of surgical trainees in the operating room. Surg Today. 2013, 43:271–275. 10.1007/s00595-012-0313-7
  35. Jabbour N, Reihsen T, Payne NR, Finkelstein M, Sweet RM, Sidman JD: Validated Assessment Tools for Pediatric Airway Endoscopy Simulation. Otolaryngol Head Neck Surg. 2012, 147:1131–1135. 10.1177/0194599812459703
  36. Hopmans CJ, den Hoed PT, van der Laan L, et al.: Assessment of surgery residents' operative skills in the operating theater using a modified Objective Structured Assessment of Technical Skills (OSATS): A prospective multicenter study. Surgery. 2014, 156:1078–1088. 10.1016/j.surg.2014.04.052
  37. Cheung JJ, Chen EW, Darani R, McCartney CJ, Dubrowski A, Awad IT: The creation of an objective assessment tool for ultrasound-guided regional anesthesia using the Delphi method. Reg Anesth Pain Med. 2012, 37:329–333. 10.1097/AAP.0b013e318246f63c
  38. Dougherty P, Kasten SJ, Reynolds RK, Prince ME, Lypson ML: Intraoperative assessment of residents. J Grad Med Educ. 2013, 5:333–334. 10.4300/JGME-D-13-00074.1
  39. Hiemstra E, Kolkman W, Wolterbeek R, Trimbos B, Jansen FW: Value of an objective assessment tool in the operating room. Can J Surg. 2011, 54:116–122. 10.1503/cjs.032909
  40. Ennis R: Critical thinking across the curriculum: the wisdom CTAC program. Inquiry: Critical Thinking across the Disciplines. 2013, 28:25–45. 10.5840/inquiryct20132828
  41. Reznick R, Regehr G, MacRae H, Martin J, McCulloch W: Testing technical skill via an innovative "bench station" examination. Am J Surg. 1997, 173:226–230. 10.1016/S0002-9610(97)89597-9
  42. Kalyuga S, Sweller J: Measuring knowledge to optimize cognitive load factors during instruction. J Educ Psy. 2004, 96:558–568. 10.1037/0022-0663.96.3.558


Author Information

Holly Black

Emergency Medicine, Memorial University of Newfoundland

Gillian Sheppard

Faculty of Medicine, Memorial University of Newfoundland, St. John's, CAN

Brian Metcalfe

Faculty of Medicine, Memorial University of Newfoundland

Jordan Stone-McLean

Faculty of Medicine, Memorial University of Newfoundland

Heather McCarthy

Faculty of Medicine, Memorial University of Newfoundland

Adam Dubrowski Corresponding Author

Health Sciences, University of Ontario Institute of Technology, Oshawa, CAN

Emergency Medicine, Memorial University, St. John's, CAN


Ethics Statement and Conflict of Interest Disclosures

Human subjects: All authors have confirmed that this study did not involve human participants or tissue. Animal subjects: All authors have confirmed that this study did not involve animal subjects or tissue. Conflicts of interest: In compliance with the ICMJE uniform disclosure form, all authors declare the following: Payment/services info: All authors have declared that no financial support was received from any organization for the submitted work. Financial relationships: All authors have declared that they have no financial relationships at present or within the previous three years with any organizations that might have an interest in the submitted work. Other relationships: All authors have declared that there are no other relationships or activities that could appear to have influenced the submitted work.

Acknowledgements

Thank you to Dr. Peter Rogers for connecting me with this project and team. Thank you to Dr. Tia Renouf and Sabrina Alani for all the help and encouragement.

