Approximately 90% of Americans have access to the internet, and the majority search online for medical information pertaining to their own health or the health of loved ones. The public relies heavily on online health information to make decisions related to their healthcare. The American Medical Association (AMA) and the National Institutes of Health (NIH) recommend that publicly available health-related information be written at a sixth- to seventh-grade reading level.
Materials and methods
Patient education materials available to the public on Annals.org, a website sponsored by the American College of Physicians, were collected. All 89 patient education articles were downloaded from the website and analyzed for ease of readability using readability software that generates five quantitative readability scores: Flesch Reading Ease (FRE), Flesch-Kincaid Grade Level (FKGL), Gunning Fog Index (GFI), Coleman-Liau Index (CLI), and Simple Measure of Gobbledygook (SMOG). All scores, with the exception of FRE, produce a grade level corresponding to the school grade required to read the information adequately.
Eighty-nine articles were analyzed, yielding the following average scores: FRE 62.8, FKGL 7.0, GFI 8.6, CLI 9.6, and SMOG 9.8. Overall, 87.6% of the articles were written above the seventh-grade level recommended by the AMA and NIH.
In an era of increased reliance on the internet for medical information pertaining to patients’ health, materials written at a higher grade level than recommended have the potential to negatively impact patients’ well-being, with substantial ramifications for the healthcare system. Redrafting these articles could benefit patients who rely on these resources when making healthcare-related decisions.
In an era of widespread internet availability, it is estimated that approximately 90% of Americans have access to the internet, with more than 80% of internet users searching online for medical information. This translates to more than eight million Americans searching the internet for health-related information on any given day [1-2]. Only a third of these users discuss this health information with their healthcare providers. Moreover, 53% of health seekers report that the information obtained had an impact on how they take care of themselves or someone else. This is in part because of the ease with which information can be found online and the fact that more than three-quarters of Americans own a smartphone and thus have the internet at their fingertips.
Unfortunately, the 2003 National Assessment of Adult Literacy (NAAL) showed that 14% of Americans could not read or understand text written in English and were only able to comprehend very basic, simple text. Furthermore, almost half of Americans lack the literacy required to appropriately comprehend and implement medical treatment and preventive health care, with grave economic consequences [4-6]. According to the NAAL report, uninsured adults have lower health literacy than insured adults. In addition, limited health literacy is prevalent and associated with lower socioeconomic status, comorbidities, and poor healthcare access, which suggests that limited health literacy can be considered an independent risk factor for the health disparities faced by the older population. Studies have shown that adults older than 65 years with lower health literacy were more likely to utilize the emergency department and to incur higher costs during those visits.
Therefore, it is of utmost importance to define health literacy and eHealth literacy in this era. The Institute of Medicine defines health literacy as “the degree to which individuals have the capacity to obtain, process, and understand basic health information and services needed to make appropriate health decisions”. Multiple other definitions have been proposed by various entities, with varying degrees of emphasis on eHealth literacy, defined as “a set of skills and knowledge that are essential for productive interactions with technology-based health tools”. One crucial component of health literacy is the readability of health literature directed towards laypeople, with readability defined as “the ease with which written materials are read”. Readability is paramount, as increased readability correlates with increased comprehension. Multiple readability assessment tools exist to help authors produce materials that are comprehensible and more easily understood by readers. Yet online patient information is often written at a level beyond the comprehension of the majority of the population [6,11-13]. Internal medicine ailments encompass a wide variety of illnesses, especially in older generations; hence the need for easily readable materials to facilitate understanding and potentially improve adherence and outcomes.
In an effort to address these inadequacies in health literacy, the US Department of Health and Human Services (USDHHS), the American Medical Association, and the National Institutes of Health have all published guidelines on the readability of patient-directed information (e.g., hand-outs, consent forms, health education materials). These organizations recommend that health information directed towards patients be written at a sixth- to seventh-grade reading level, corresponding to a reading level associated with ages 11-13 years [5,10,14]. Per the USDHHS, a sixth-grade reading level is categorized as “easy to read”, material between the seventh- and ninth-grade levels as “average difficulty”, and any literature written beyond that level as “difficult”. The purpose of this study was to evaluate the readability of the patient education materials available on the “Annals of Internal Medicine: Patient Information” website, assessing the difficulty of the information and the complexity with which the articles were written.
Materials & Methods
Online patient education materials available to the public from the “Annals of Internal Medicine: Patient Information” website (http://annals.org/aim/pages/patient-information) were retrieved in September 2018. This website, sponsored by the American College of Physicians (ACP), offers healthcare-related information in the form of brief summaries of studies and clinical guidelines published in the Annals of Internal Medicine, targeting patients and interested laypeople.
A total of 93 hyperlinks were found on the website: one was a duplicate link to information on colon cancer screening, and three yielded unavailable pages. Duplicate and unavailable links were excluded from analysis, as were any articles targeting physicians or practitioners; the remaining hyperlinks all led to patient-related information. The text from the 89 remaining articles was copied and pasted as plain text into individual documents using Microsoft® Word® (Microsoft, Redmond, Washington, USA) and reviewed by the authors independently. During the review, all medical terms followed by an explanation were removed from the text prior to analysis; for example, where the text read “sputum; mucus brought up with coughing”, the word “sputum” was removed. Likewise, where a medical procedure was explained, as in “endoscopic retrograde cholangiopancreatography, which examines the pancreas through a tube inserted down the throat into the stomach and pancreas”, the words “endoscopic retrograde cholangiopancreatography” were removed. Medical terms or procedures not explained in the text were retained. In addition, all hyperlinks, tables, advertisements, figures, and images were removed, and the remaining text was further edited to expunge headings, bullet points, and decimals. References were also expunged, as were author names, websites, and the names of medications, whether brand or generic.
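The cleaning steps above were performed manually by the authors. Purely for illustration, an analogous automated pass over the plain text could look like the following sketch; the `clean_text` helper and its regular expressions are hypothetical and were not part of the study's protocol:

```python
import re

def clean_text(text: str) -> str:
    """Illustrative automation of the manual cleaning protocol:
    strip hyperlinks, bracketed references, bullet points, and decimals."""
    text = re.sub(r"https?://\S+", "", text)                    # hyperlinks
    text = re.sub(r"\[\d+(-\d+)?\]", "", text)                  # bracketed references
    text = re.sub(r"^\s*[-*\u2022]\s*", "", text, flags=re.M)   # bullet points
    text = re.sub(r"\b\d+\.\d+\b", "", text)                    # decimals
    return re.sub(r"[ \t]{2,}", " ", text).strip()              # tidy whitespace
```

Removing explained medical terms (e.g., "sputum") still requires human judgment, which is presumably why the study relied on independent manual review rather than automation.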
The articles were then analyzed for their readability levels using the readability website https://readable.com. Five validated scales were used to quantitatively analyze the articles (Table 1): Flesch Reading Ease (FRE), Flesch-Kincaid Grade Level (FKGL), Gunning Fog Index (GFI), Coleman-Liau Index (CLI), and Simple Measure of Gobbledygook (SMOG) [16-20]. The SMOG scale is the preferred assessment for healthcare literature. All the scales generate a readability grade that corresponds to a typical school grade level, with 0-12 spanning kindergarten through 12th grade and scores higher than 12 corresponding to the equivalent college level. The only exception is FRE, which generates a score out of 100: 0-30, very difficult to read, written at the level of college graduates; 30-50, difficult to read, written at the college level; 50-60, fairly difficult to read, written at the 10th- to 12th-grade level; 60-70, plain English, written at the 8th- to 9th-grade level; 70-80, fairly easy to read, written at the 7th-grade level; 80-90, easy to read, written at the 6th-grade level; and 90-100, very easy to read, easily understood by an average 11-year-old student.
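The study used a commercial tool (readable.com) to compute these scores, but all five scales are published formulas computable from word, sentence, syllable, letter, and polysyllabic-word counts. The following is a minimal sketch of the published formulas [16-20], written by us for illustration; it is not the implementation used in the study:

```python
import math

def flesch_reading_ease(words, sentences, syllables):
    # Flesch (1948): higher scores mean easier text (0-100 scale)
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words, sentences, syllables):
    # Kincaid et al. (1975): result is a US school grade level
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def gunning_fog(words, sentences, complex_words):
    # Gunning (1952): "complex" words have three or more syllables
    return 0.4 * ((words / sentences) + 100 * (complex_words / words))

def coleman_liau(letters, words, sentences):
    # Coleman & Liau (1975): L and S are averages per 100 words
    L = letters / words * 100
    S = sentences / words * 100
    return 0.0588 * L - 0.296 * S - 15.8

def smog(polysyllables, sentences):
    # McLaughlin (1969): designed for samples of 30 sentences
    return 1.0430 * math.sqrt(polysyllables * (30 / sentences)) + 3.1291
```

For a hypothetical 100-word, 5-sentence sample containing 150 syllables, these formulas give FRE ≈ 59.6 (fairly difficult) and FKGL ≈ 9.9, illustrating how the two Flesch-based scales can diverge from the grade-level recommendation.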
The readability grades for the 89 articles obtained from the Annals.org online patient information pages were analyzed, each article individually with all five readability scales. The average FRE score across the articles was 62.8, indicating readability of average difficulty per the USDHHS and a reading level higher than that recommended by the USDHHS, NIH, and AMA [5,10,14]. Table 2 presents the readability scores for the 89 individual articles, with the final column displaying the average grade level for each article.
The FRE scores averaged 62.8 and ranged between 29.1 and 89.6, with less than a third of the articles attaining a score above 70.0, the threshold for text easily understood at the seventh-grade level. The FKGL scale yielded an average readability grade level of 7.0, with a range between 2.9 and 8.2. The GFI scale produced an average reading level of 8.6, with scores ranging from 5.2 to 13.2 and more than 80% of the articles written above the recommended readability grade level. Both the CLI and SMOG showed higher average readability grade levels, of 9.6 and 9.8, respectively, as seen in Figure 1. The maximum CLI score was 15, with 77 of the articles written above the recommended level. SMOG analysis showed a minimum readability of 6.4 and a maximum of 12.6, with only two articles written at or below the recommended level. The SMOG score intrinsically yields higher grades because it targets 100% comprehension during analysis, yet it has been recommended for healthcare literature, with a comprehensibility correlation of 0.88 [20-21]. The number of articles written at each readability grade level under each of the scores utilized is shown in Figures 2A-2E.
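The headline 87.6% figure corresponds to 78 of the 89 articles exceeding the seventh-grade threshold (100 × 78 / 89 ≈ 87.6). A trivial helper (hypothetical, not from the study) makes the arithmetic explicit:

```python
def share_above(grades, threshold=7.0):
    """Percentage of articles whose average grade level exceeds the threshold."""
    above = sum(1 for g in grades if g > threshold)
    return 100 * above / len(grades)

# Illustrative only: 78 articles above grade 7 and 11 at or below it
example_grades = [8.0] * 78 + [6.0] * 11
```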
Patient reliance on online education materials to maintain well-being and to determine when to visit a physician has increased exponentially over the past decade, with more than 50% of internet users reporting that information found on the internet affected their decisions in treating a medical condition. While the online education materials found on the patient information site on Annals.org provide patients with a valuable, evidence-based resource for maintaining their well-being, the readability of these materials is variable, with 87.6% of articles written above the seventh-grade level.
Health literacy is complicated, with multiple variables contributing to the overall comprehensibility of educational materials. Individuals’ level of education and familiarity with medical terminology are fundamental to the level of understanding they attain when reading online education materials. Health literacy is heavily dependent on patients’ education, and lower health literacy is linked to poorer health outcomes, including increased hospitalizations, poorer health status, and higher mortality. Studies have shown lower mammography screening rates and lower influenza immunization rates among patients with lower health literacy. On a larger scale, lower health literacy is associated with tremendous economic cost to the US economy, with estimates ranging between $70 billion and $230 billion annually [5-6].
Comparing the readability of the online education materials on Annals.org to those of other major medical societies shows better overall readability, the latter averaging a grade level of 8.8 ± 1.8 (SD) [6,10-13]. Using readability scales to assess the comprehensibility of education materials carries inherent flaws and limitations, since essentially all of the algorithms rely on word length and syllable counts. For example, short but difficult medical words such as “lipid” or “ketone” can be scored as more readable than longer words such as “hospitalization”, which is better understood by the general population. Nonetheless, it remains true that the materials analyzed are written beyond the recommended readability level for the American population. Revisions could effectively increase readability, improving comprehension among readers with limited health literacy. Graphics and videos can also increase comprehension of complex health information that is difficult to convey in text. Attaining near-universal comprehension could improve health outcomes and potentially minimize the associated financial burden.
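The length-based bias described above is easy to demonstrate with the kind of naive vowel-group syllable counter these formulas depend on (a simplified sketch, not the counting method used by readable.com): “lipid” counts two syllables and escapes the GFI/SMOG complex-word penalty, while the more familiar “hospitalization” counts six and is penalized.

```python
import re

def count_syllables(word: str) -> int:
    """Naive syllable estimate: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def is_complex(word: str) -> bool:
    """GFI and SMOG treat words of three or more syllables as complex."""
    return count_syllables(word) >= 3
```

Under this scheme, a sentence full of short jargon can score as highly readable even when its vocabulary is opaque to lay readers, which is precisely the limitation the paragraph above describes.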
The online patient education materials found on the Annals.org website provide patients with an excellent source of information to enable them to care for themselves and loved ones. Nevertheless, the majority of the materials are written at a readability level higher than recommended by the AMA and NIH. Given the integral role that online patient education materials play in the decision-making process of patients and follow-up care with physicians, greater emphasis should be placed on the readability and comprehensibility of online educational materials.
- Online health search 2006. (2006). Accessed: November 1, 2018: http://www.pewinternet.org/Reports/2006/Online-Health-Search-2006.aspx.
- 11% of Americans don’t use the internet. Who are they? (2018). Accessed: September 25, 2018: http://www.pewresearch.org/fact-tank/2018/03/05/some-americans-dont-use-the-internet-who-are-they.
- Record shares of Americans now own smartphones, have home broadband. (2017). Accessed: September 23, 2018: http://www.pewresearch.org/fact-tank/2017/01/12/evolution-of-technology.
- Kutner M, Greenberg E, Jin Y, Paulsen C: The Health Literacy of America’s Adults: Results from the 2003 National Assessment of Adult Literacy. U.S. Department of Education, Washington, DC; 2006.
- Weiss BD: Health Literacy A Manual for Clinicians. American Medical Association, Chicago, IL; 2003.
- Agarwal N, Hansberry DR, Sabourin V, Tomei KL, Prestigiacomo CJ: A comparative analysis of the quality of patient education materials from medical specialties. JAMA Intern Med. 2013, 173:1257-1259. 10.1001/jamainternmed.2013.6060
- Herndon JB, Chaney M, Carden D: Health literacy and emergency department outcomes: a systematic review. Ann Emerg Med. 2011, 57:334-345. 10.1016/j.annemergmed.2010.08.035
- Institute of Medicine (US) Committee on Health Literacy: Health Literacy: A Prescription to End Confusion. Nielsen-Bohlman L, Panzer AM, Kindig DA (ed): National Academies Press (US), Washington D.C; 2004. 10.17226/10883
- Chan CV, Kaufman DR: A framework for characterizing eHealth literacy demands and barriers. J Med Internet Res. 2011, 13:94. 10.2196/jmir.1750
- Edmunds M, Barry R, Denniston A: Readability assessment of online ophthalmic patient information. JAMA Ophthalmol. 2013, 131:1610-1616. 10.1001/jamaophthalmol.2013.5521
- Misra P, Agarwal N, Kasabwala K, Hansberry DR, Setzen M, Eloy JA: Readability analysis of healthcare-oriented education resources from the American Academy of Facial Plastic and Reconstructive Surgery. Laryngoscope. 2013, 123:90-96. 10.1002/lary.23574
- Hansberry DR, John A, John E, Agarwal N, Gonzales SF, Baker SR: A critical review of the readability of online patient education resources from radiologyinfo.org. Am J Roentgenol. 2014, 202:566-575. 10.2214/AJR.13.11223
- Hansberry DR, Patel SR, Agarwal P, Agarwal N, John ES, John AM, Reynolds JC: A quantitative readability analysis of patient education resources from Gastroenterology Society websites. Int J Colorectal Dis. 2017, 32:917-920. 10.1007/s00384-016-2730-3
- How to write easy-to-read health materials. (2012). Accessed: October 12, 2018: http://www.nlm.nih.gov/medlineplus/etr.html.
- Annals of internal medicine. (2018). Accessed: September 20, 2018: http://annals.org/aim/pages/patient-information.
- Flesch R: A new readability yardstick. J Appl Psychol. 1948, 32:221-233.
- Kincaid JP, Fishburne Jr RP, Rogers RL, Chissom BS: Derivation of New Readability Formulas (Automated Readability Index, Fog Count and Flesch Reading Ease Formula) for Navy Enlisted Personnel. Institute for Simulation and Training, University of Central Florida, Millington, TN; 1975.
- Gunning R: The Technique of Clear Writing. McGraw-Hill, New York, NY; 1952.
- Coleman M, Liau TL: A computer readability formula designed for machine scoring. J Appl Psychol. 1975, 60:283-284. 10.1037/h0076540
- McLaughlin GH: SMOG grading: a new readability formula. J Read. 1969, 12:639-646.
- Doak CC, Doak LG, Root JH: Teaching Patients with Low Literacy Skills. J.P. Lippincott Company, Philadelphia, PA; 1996.
- Baker DW: The meaning and the measure of health literacy. J Gen Intern Med. 2006, 21:878-883. 10.1111/j.1525-1497.2006.00540.x
- Berkman ND, Sheridan SL, Donahue KE, Halpern DJ, Crotty K: Low health literacy and health outcomes: an updated systematic review. Ann Intern Med. 2011, 155:97-107. 10.7326/0003-4819-155-2-201107190-00005
Quantitative Readability Assessment of the Internal Medicine Online Patient Information on Annals.org
Ethics Statement and Conflict of Interest Disclosures
Human subjects: All authors have confirmed that this study did not involve human participants or tissue. Animal subjects: All authors have confirmed that this study did not involve animal subjects or tissue. Conflicts of interest: In compliance with the ICMJE uniform disclosure form, all authors declare the following: Payment/services info: All authors have declared that no financial support was received from any organization for the submitted work. Financial relationships: All authors have declared that they have no financial relationships at present or within the previous three years with any organizations that might have an interest in the submitted work. Other relationships: All authors have declared that there are no other relationships or activities that could appear to have influenced the submitted work.
Cite this article as:
Abu-Heija A A, Shatta M, Ajam M, et al. (March 06, 2019) Quantitative Readability Assessment of the Internal Medicine Online Patient Information on Annals.org. Cureus 11(3): e4184. doi:10.7759/cureus.4184
Received by Cureus: February 19, 2019
Peer review began: February 26, 2019
Peer review concluded: February 28, 2019
Published: March 06, 2019
© Copyright 2019
Abu-Heija et al. This is an open access article distributed under the terms of the Creative Commons Attribution License CC-BY 3.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.