Combined Internal Medicine/Pediatrics (Med/Peds) residencies rely on categorical program data to predict pass rates for the American Board of Internal Medicine Certifying Exam (ABIM-CE) and the American Board of Pediatrics Certifying Exam (ABP-CE). There is insufficient literature describing what best predicts a Med/Peds resident passing board exams. In this study, we aimed to determine how standardized test scores predict performance on ABIM-CE and ABP-CE for Med/Peds residents.
We analyzed prior exam scores for 91/96 (95%) residents in a Med/Peds program from 2008 to 2017. Scores from the United States Medical Licensing Examination (USMLE) Steps 1 and 2 Clinical Knowledge (CK) and In-Training Exams in Internal Medicine (ITE-IM) and Pediatrics (ITE-P) were analyzed with the corresponding ABIM-CE and ABP-CE first-time scores. Linear and logistic regression were applied to predict board scores/passage.
USMLE Steps 1 and 2 CK, ITE-IM, and ITE-P scores had a linear relationship with both ABIM-CE and ABP-CE scores. In the linear regression, adjusted R2 values showed low-to-moderate predictive ability (R2 = 0.11-0.35), with the strongest predictors of the ABIM-CE and ABP-CE being USMLE Step 1 (0.35) and the Postgraduate Year 1 (PGY-1) ITE-IM (0.33), respectively. Logistic regression showed odds ratios for passing board certification ranging from 1.05 to 1.53 per one-point increase on the prior exam score. The PGY-3 ITE-IM was the best predictor of passing both certifying exams.
In one Med/Peds program, USMLE Steps 1 and 2 and all ITE-IM and ITE-P scores predicted certifying exam scores and passage. This provides Med/Peds-specific data to allow individualized resident counseling and guide programmatic improvements targeted to board performance.
Combined Internal Medicine and Pediatrics residencies (Med/Peds) have maintained a prominent presence in graduate medical education for over 40 years. After four years of training, graduates of Med/Peds residencies are eligible to sit for the certifying exams in both Internal Medicine (American Board of Internal Medicine Certifying Exam, ABIM-CE) and Pediatrics (American Board of Pediatrics Certifying Exam, ABP-CE). As of 2013, the ABP-CE is scored against a criterion-referenced passing score set independently of other examinees. The ABIM-CE is scored using a standard-setting procedure, with a passing score determined by content experts. Program accreditation requires maintaining a first-time pass rate on the ABIM-CE and ABP-CE that is above the 5th percentile of programs or at least 80%. In the categorical internal medicine and pediatrics literature, considerable attention has been devoted to identifying the factors that correlate positively or negatively with passing each of these exams.
Multiple studies in internal medicine have found a positive correlation between board pass rates and In-Training Exam (ITE) scores, while other studies have found correlations with United States Medical Licensing Examination (USMLE) Step 1 scores and with the number of overnight in-house calls in the final six months of training [5-8]. One study of program characteristics determined that faculty-to-resident ratio, preliminary positions, a formal mentoring program, and on-site child care contributed positively to pass rate, while the number of Doctors of Osteopathic Medicine (DO) in the program contributed negatively. A recent study showed that research experience and awards such as Alpha Omega Alpha had no correlation with test scores or patient ratings; Step 2 Clinical Knowledge (CK) scores significantly correlated with patient ratings, ABIM-CE passage, and ITE scores, while Step 1 scores predicted ITE scores.
Pediatric literature has shown that ABP-CE passage correlates positively with USMLE Step 1, USMLE Step 2 CK, combinations of Steps 1 and 2, and the ITE [11-13]. Program characteristics such as faculty-to-resident ratio, average hours per week of lectures, and percentage of US Allopathic Doctors of Medicine (MDs) are associated with higher ABP-CE pass rates. Chase et al. examined didactic and clinical factors related to ABP-CE passage and found that only the number of pediatric admissions correlated with improvement in ITE scores, without changing ABP-CE scores.
To our knowledge, no study has been conducted that looks solely at a Med/Peds program; therefore, Med/Peds programs have relied on categorical data to determine pass rate predictions and targets for early interventions with residents. In our study, we compiled all available standardized test data over a 10-year period from over 90 Med/Peds residents at the University of Tennessee Health Science Center (UTHSC) to determine which factors most accurately predicted performance on both ABIM-CE and ABP-CE in a Med/Peds residency.
Materials & Methods
We compiled test scores from graduates of the Med/Peds residency program at UTHSC from 2008 to 2017. Variables analyzed were USMLE Step 1, USMLE Step 2 CK, ITE data from all four years for both internal medicine and pediatrics, and certifying examination results (pass/fail plus numeric score when available). Board and ITE scores are represented as percent correct.
Univariate logistic and univariate linear regression methods were applied to data from 10 prior exams to predict performance on the ABIM-CE and ABP-CE for Med/Peds residents graduating from UTHSC in the years 2008 through 2017. Logistic regression was applied to predict exam passage (pass/fail), and linear regression was used to predict quantitative scores on the board exams. The primary outcome in the logistic regression model was passing the ABIM-CE and ABP-CE; the primary outcome in the linear regression model was the correlation of scores between pre-board standardized testing and ABIM-CE and ABP-CE scores. While multivariable models using backward stepwise regression were developed in each case, each model reduced to one predictive variable due to the high correlation among the predictive exams. Demographics such as age, race, and gender were not included in the analysis, as the goal was to develop standard prediction models that could be applied to any resident.
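The univariate linear step can be illustrated in code. This is a minimal sketch, not the authors' analysis pipeline, and the paired scores below are hypothetical values invented for illustration:

```python
def univariate_linear(x, y):
    """Univariate least-squares fit y = a + b*x, returning the intercept,
    slope, and adjusted R^2 for one predictor (k = 1)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Slope = covariance(x, y) / variance(x); intercept from the means
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    r2 = 1.0 - ss_res / ss_tot
    adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - 2)  # penalize for the single predictor
    return a, b, adj_r2

# Hypothetical paired percent-correct scores: prior exam vs. certifying exam
prior = [58, 62, 65, 70, 71, 74, 78, 80, 83, 88]
board = [60, 61, 66, 68, 73, 72, 79, 78, 85, 86]
intercept, slope, adj_r2 = univariate_linear(prior, board)
```

In the study's models, the predictor is a prior exam score and the outcome the certifying exam percent correct; the adjusted R2 returned here corresponds to the values reported in Table 2.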
Overall, 91 of 96 (95%) total subjects had pass/fail data for the ABIM-CE and 71 (74%) for the ABP-CE. Sixty-five subjects (68%) had quantitative scores for the ABIM-CE and 71 (74%) for the ABP-CE. Not all Med/Peds residents go on to take both certifying examinations, and some graduating in later years had not completed the examination at the time of the analysis. Residents with missing values only had the paired predictive/outcome variables for the missing results excluded from the analysis. Odds ratios with 95% confidence intervals and c-statistics were estimated for each prior exam with the certifying exams. Likewise, linear intercepts and coefficients, each with 95% confidence intervals, and adjusted R2 values were estimated for each prior exam with the two certifying exams.
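The c-statistic used to compare the logistic models has a direct pair-counting interpretation: the probability that a randomly chosen resident who passed outscored a randomly chosen resident who failed on the prior exam. A minimal sketch, using hypothetical scores rather than the study data:

```python
def c_statistic(scores, passed):
    """Concordance index: fraction of (passer, failer) pairs in which the
    passer had the higher prior-exam score; ties count as half."""
    pass_scores = [s for s, p in zip(scores, passed) if p]
    fail_scores = [s for s, p in zip(scores, passed) if not p]
    pairs = len(pass_scores) * len(fail_scores)
    concordant = sum(sp > sf for sp in pass_scores for sf in fail_scores)
    ties = sum(sp == sf for sp in pass_scores for sf in fail_scores)
    return (concordant + 0.5 * ties) / pairs

# Hypothetical ITE percent-correct scores and first-time pass results
scores = [55, 60, 62, 64, 68, 70, 73, 75]
passed = [False, False, True, False, True, True, True, True]
c = c_statistic(scores, passed)  # 14 of 15 pairs concordant, c ≈ 0.93
```

A c-statistic of 0.5 indicates no discrimination and 1.0 perfect discrimination, which is why the PGY-3 ITE-IM values of 0.86 and 0.94 reported below indicate strong predictors.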
In the linear regression models, we make the assumption that the relationship between each predictor and each predicted exam score is linear. In the logistic regression models, we assume that the logit is linear in the predictor exam scores. We have modeled the performance on the ABIM-CE and ABP-CE exams with the assumption that the meaningful step size is one point on any predictor exam. For example, for each increase of one point on a predictor exam scale, we expect the board exam score to increase by the amount reflected in the estimated linear regression coefficient. Or, we expect the odds of passing to increase by the odds ratio shown, per point change on the predictor exam.
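Because the logistic models are parameterized per one-point step, a multi-point score change compounds multiplicatively on the odds scale. A small worked example (the per-point odds ratio of 1.35 is an assumed value drawn from within the reported range, not a specific model estimate):

```python
def compounded_or(or_per_point, delta_points):
    """Multiplier on the odds of passing for a delta-point score change,
    given a per-point odds ratio from a logistic model."""
    return or_per_point ** delta_points

# An assumed per-point odds ratio of 1.35, compounded over a 5-point gain:
multiplier = compounded_or(1.35, 5)  # ~4.48: roughly 4.5x the odds of passing
```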
The UTHSC Institutional Review Board approved the project with Exempt status.
The mean USMLE Step 1 score for entering residents in our study was 216.3, with a median of 216. The mean Step 2 CK score was 227.3, with a median of 227. The percentage of US allopathic graduates ranged from a low of 64% in 2016 to a high of 100% in the classes of 2009, 2010, 2011, and 2015. Classes ranged from 27% to 80% female with an overall average of 51% female. Our ABIM-CE pass rate during the years of 2008-2017 was 89%, and our ABP-CE pass rate was 83%.
The findings of logistic and linear regressions of prior exams with board exams are summarized in Tables 1, 2. Linear regression showed low-to-moderate predictive ability for each prior exam with relation to each certifying exam score. Logistic regression showed that an increase in score in each earlier exam increased the odds of passing each certifying exam.
For the logistic regression, odds ratio estimates on the prior exams range from 1.05 to 1.35 for the ABIM-CE and from 1.07 to 1.53 for the ABP-CE. Using the c-statistic for comparison, the PGY-3 ITE in internal medicine is the best predictor of passing both the ABIM-CE and the ABP-CE, with c-statistics of 0.86 and 0.94, respectively (see Table 1).
Among the linear regressions, adjusted R2 values for the ABIM-CE range from 0.18 to 0.35, and for the ABP-CE from 0.11 to 0.33 (see Table 2), showing low-to-moderate predictive ability. According to these R2 values, USMLE Step 1 is the most effective predictor of the quantitative score on the ABIM-CE, while the PGY-1 ITE-IM is the most effective predictor of the ABP-CE score. See Figure 1 for the linear regression plot of USMLE Step 1 versus ABIM-CE percent correct, and Figure 2 for the linear regression plot of the PGY-1 ITE-IM versus ABP-CE percent correct.
Our results show that USMLE Step 1, Step 2 CK, and all years of both internal medicine and pediatrics ITE scores predict, with varying degrees of reliability, numeric ABIM-CE and ABP-CE scores and the odds of passing the certifying examinations. Though the knowledge bases for internal medicine and pediatrics presumably differ, internal medicine ITE scores still correlated with ABP-CE scores, as did pediatrics ITE scores with ABIM-CE scores. Correlation of scores across specialties implies that generalizable factors such as knowledge base, study skills, or test-taking skills significantly affect results. Ours is the first study to compare test scores on exams from the two specialties, and the first Med/Peds program-specific study of test scores.
Linear regression analysis showed low-to-moderate predictive ability for all tests analyzed with respect to predicting numeric certifying exam scores. Previous internal medicine studies, including Brateanu et al., also found a linear correlation between ITE scores and ABIM-CE scores, though their data showed a higher correlation of the PGY-2 and PGY-3 ITEs with the ABIM-CE. Kay et al. found a similarly moderate correlation between USMLE Step 1 and the ABIM-CE, as well as between ITE-IM exams and the ABIM-CE. McDonald et al. and Perez and Greer also showed correlation between USMLE and ITE scores in internal medicine [15,16], but did not extend their studies to certifying examination scores. Work in other specialties has related USMLE scores to in-training scores in emergency medicine and dermatology [17,18], and in-training scores in obstetrics and gynecology, psychiatry, emergency medicine, and family medicine have been shown to correlate with passage of the corresponding certifying examinations [19-23].
Unexpectedly, earlier tests such as USMLE Step 1 (taken during medical school) and the PGY-1 internal medicine ITE (taken in September of the PGY-1 year) correlated more highly with certifying exam scores than later ITE scores did. This implies that factors not accounted for by our study, such as fatigue or test-taking skills, have a significant impact.
Limitations include that our study was performed in only one residency program, so local systems factors may have influenced our trainees' test scores. In addition, some scores were unavailable, limiting the data set and reducing statistical power. Our statistical analysis attempted to correct for year taken, but the significant change in scoring of the ABP-CE still may have affected the results. The change of USMLE Step 1 to pass/fail in 2022 will limit future use of Step 1 scores in predicting certification exam passage. Finally, percent correct was used for the ITE and certifying exams, and the percent correct that constitutes a passing score on the certifying examinations varies.
Our data can be used to target board study or test-taking skills interventions to individuals at higher risk of board failure within a Med/Peds program. Weekly board study emails and mandatory reading and board study group sessions have been shown to correlate significantly with passage of the ABP certifying exam, with a greater effect among residents with lower pediatrics ITE scores [24,25]. In another study, a higher number of pediatric admissions was found to improve ITE scores, suggesting that optimal clinical exposure contributes to improving medical knowledge scores.
In the future, we plan to strengthen our score predictions with upcoming graduates’ data. We hope to involve other programs to broaden our source of data and improve the accuracy of our predictions. The program will use the prediction tool to counsel residents at risk, and to recommend a more intensive board review strategy similar to the studies mentioned above for those residents to improve their odds of passing their board exams.
Our study confirms that in combined Med/Peds programs, USMLE scores and ITE scores show low-to-moderate correlation with future board certification exam scores. Furthermore, there is low-to-moderate correlation even between internal medicine ITE scores and pediatrics certifying examination scores, and vice versa. USMLE and ITE scores can help predict future certification exam passage and scores, with varying degrees of accuracy. Finally, the R2 values for prior tests compared to ABIM-CE were generally higher than those compared with ABP, indicating more variability in ABP-CE scores compared to ABIM-CE scores.
- Falcone JL: Residencies with dual internal medicine and pediatrics programs outperform others on the American Board of Pediatrics Certifying Examination. Clin Pediatr (Phila). 2014, 53:854-857. 10.1177/0009922814533407
- American Board of Pediatrics: scoring FAQs. (2021). Accessed: May 7, 2019: https://www.abp.org/content/scoring-faqs.
- American Board of Internal Medicine: how exams are developed. (2021). Accessed: February 18, 2021: https://www.abim.org/about/exam-information/exam-development.aspx.
- Accreditation Council for Graduate Medical Education: ACGME common program requirements (Residency). (2018). Accessed: February 18, 2021: https://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/CPRResidency2019.pdf.
- Rollins LK, Martindale JR, Edmond M, Manser T, Scheld WM: Predicting pass rates on the American Board of Internal Medicine certifying examination. J Gen Intern Med. 1998, 13:414-416. 10.1046/j.1525-1497.1998.00122.x
- Brateanu A, Yu C, Kattan MW, Olender J, Nielsen C: A nomogram to predict the probability of passing the American Board of Internal Medicine examination. Med Educ Online. 2012, 17:18810. 10.3402/meo.v17i0.18810
- Kay C, Jackson JL, Frank M: The relationship between internal medicine residency graduate performance on the ABIM Certifying Examination, yearly in-service training examinations, and the USMLE Step 1 examination. Acad Med. 2015, 90:100-104. 10.1097/ACM.0000000000000500
- Rayamajhi S, Dhakal P, Wang L, Rai MP, Shrotriya S: Do USMLE steps, and ITE score predict the American Board of Internal Medicine Certifying Exam results?. BMC Med Educ. 2020, 20:79. 10.1186/s12909-020-1974-3
- Atsawarungruangkit A: Relationship of residency program characteristics with pass rate of the American Board of Internal Medicine certifying exam. Med Educ Online. 2015, 20:28631. 10.3402/meo.v20.28631
- Sharma A, Schauer DP, Kelleher M, Kinnear B, Sall D, Warm E: USMLE Step 2 CK: best predictor of multimodal performance in an internal medicine residency. J Grad Med Educ. 2019, 11:412-419. 10.4300/JGME-D-19-00099.1
- McCaskill QE, Kirk JJ, Barata DM, Wludyka PS, Zenni EA, Chiu TT: USMLE Step 1 scores as a significant predictor of future board passage in pediatrics. Ambul Pediatr. 2007, 7:192-195. 10.1016/j.ambp.2007.01.002
- Welch TR, Olson BG, Nelsen E, Beck Dallaghan GL, Kennedy GA, Botash A: United States Medical Licensing Examination and American Board of Pediatrics certification examination results: does the residency program contribute to trainee achievement. J Pediatr. 2017, 188:270-274. 10.1016/j.jpeds.2017.05.057
- Althouse LA, McGuinness GA: The in-training examination: an analysis of its predictive value on performance on the general pediatrics certification examination. J Pediatr. 2008, 153:425-428. 10.1016/j.jpeds.2008.03.012
- Chase LH, Highbaugh-Battle AP, Buchter S: Residency factors that influence pediatric in-training examination score improvement. Hosp Pediatr. 2012, 2:210-214. 10.1542/hpeds.2011-0028
- McDonald FS, Zeger SL, Kolars JC: Associations Between United States Medical Licensing Examination (USMLE) and Internal Medicine In-Training Examination (IM-ITE) scores. J Gen Intern Med. 2008, 23:1016-1019. 10.1007/s11606-008-0641-x
- Perez JA, Greer S: Correlation of United States Medical Licensing Examination and Internal Medicine In-Training Examination performance. Adv Health Sci Educ Theory Pract. 2009, 14:753-758. 10.1007/s10459-009-9158-2
- Thundiyil JG, Modica RF, Silvestri S, Papa L: Do United States Medical Licensing Examination (USMLE) scores predict in-training test performance for emergency medicine residents?. J Emerg Med. 2010, 38:65-69. 10.1016/J.JEMERMED.2008.04.010
- Fening K, Vander Horst A, Zirwas M: Correlation of USMLE Step 1 scores with performance on dermatology in-training examinations. J Am Acad Dermatol. 2011, 64:102-106. 10.1016/j.jaad.2009.12.051
- Withiam-Leitch M, Olawaiye A: Resident performance on the in-training and board examinations in obstetrics and gynecology: implications for the ACGME outcome project. Teach Learn Med. 2008, 20:136-142. 10.1080/10401330801991642
- Lingenfelter BM, Jiang X, Schnatz PF, O’Sullivan DM, Minassian SS, Forstein DA: CREOG in-training examination results: contemporary use to predict ABOG written examination outcomes. J Grad Med Educ. 2016, 8:353-357. 10.4300/JGME-D-15-00408.1
- Juul D, Schneidman BS, Sexson SB, et al.: Relationship between resident-in-training examination in psychiatry and subsequent certification examination performances. Acad Psychiatry. 2009, 33:404-406. 10.1176/appi.ap.33.5.404
- Harmouche E, Goyal N, Pinawin A, Nagarwala J, Bhat R: USMLE scores predict success in ABEM initial certification: a multicenter study. West J Emerg Med. 2017, 18:544-549. 10.5811/westjem.2016.12.32478
- O’Neill TR, Zijia L, Peabody MR, Lybarger M, Kenneth R, Puffer JC: The predictive validity of the ABFM’s in-training examination. Fam Med. 2016, 47:349-356.
- Aeder L, Fogel J, Schaeffer H: Pediatric board review course for residents "at risk". Clin Pediatr (Phila). 2010, 49:450-456. 10.1177/0009922809352679
- Langenau EE, Fogel J, Schaeffer HA: Correlation between an email based board review program and American Board of Pediatrics General Pediatrics Certifying Examination scores. Med Educ Online. 2009, 14:18. 10.3885/meo.2009.Res00321
Relationship Between Standardized Test Scores and Board Certification Exams in a Combined Internal Medicine/Pediatrics Residency Program
Ethics Statement and Conflict of Interest Disclosures
Human subjects: Consent was obtained or waived by all participants in this study. UTHSC Institutional Review Board issued approval N/A. The UTHSC Institutional Review Board approved the project with Exempt status. Animal subjects: All authors have confirmed that this study did not involve animal subjects or tissue. Conflicts of interest: In compliance with the ICMJE uniform disclosure form, all authors declare the following: Payment/services info: All authors have declared that no financial support was received from any organization for the submitted work. Financial relationships: All authors have declared that they have no financial relationships at present or within the previous three years with any organizations that might have an interest in the submitted work. Other relationships: All authors have declared that there are no other relationships or activities that could appear to have influenced the submitted work.
Special thanks to program coordinator Melissa Hayes for her assistance with this project.
Cite this article as:
Ost S R, Wells D, Goedecke P J, et al. (February 26, 2021) Relationship Between Standardized Test Scores and Board Certification Exams in a Combined Internal Medicine/Pediatrics Residency Program. Cureus 13(2): e13567. doi:10.7759/cureus.13567
Received by Cureus: August 10, 2020
Peer review began: September 15, 2020
Peer review concluded: February 19, 2021
Published: February 26, 2021
© Copyright 2021
Ost et al. This is an open access article distributed under the terms of the Creative Commons Attribution License CC-BY 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.