Evaluating on-line health information for patients with polymyalgia rheumatica: a descriptive study
BMC Musculoskeletal Disorders, volume 18, Article number: 43 (2017)
The Internet is increasingly used to access health information, although the quality of that information varies. The aim of this study was to evaluate the readability and quality of websites about polymyalgia rheumatica (PMR).
Three UK search engines (Google, Yahoo and Bing) were searched for the term ‘polymyalgia rheumatica’. After deleting duplicates, the first 50 eligible websites from each were evaluated. Readability was assessed using the Flesch Reading Ease and ‘Simple Measure of Gobbledygook (SMOG) Readability’ indicators. Credibility was assessed using a previously published Credibility Indicator.
Of the 52 unique websites identified, the mean (standard deviation, SD) Flesch Reading Ease and SMOG Readability scores were 48 (15) and 10 (2), respectively. The mean (SD) Credibility Indicator was 2 (1). Fifty (96%) of the websites were accurate. Website design and content were good, with an average of 68% and 64%, respectively, of the assessed criteria being met.
Most websites about PMR require a higher reading age than is recommended. Thus, whilst websites are often well designed and accurate, this study suggests that their content could be refined and simplified to maximise patient benefit.
‘Health literacy’ refers to the ability to perform basic reading and numerical tasks required to function effectively in the healthcare environment. This includes the ability to read, understand and interpret information (print literacy), perform quantitative tasks such as following treatment regimens (numeracy) and to speak and listen effectively (oral literacy). Low levels of health literacy are associated with poorer disease control, increased health care costs and increased mortality.
Therefore, provision of good quality yet accessible health information is increasingly important, especially in the management of long-term conditions. A longitudinal study found that one third of respondents aged 63 to 66 years had searched for on-line information about their health. The unrestricted nature of the internet means that evaluating sites for quality and content is increasingly important.
However, health information is only useful if patients can read, understand and apply it to their own circumstances. Studies show that low literacy levels are common, leading to guidelines recommending that patient information be written at sixth-grade level (age 11 to 12 years) [6, 7] to maximise accessibility. Quantitative readability measures such as the Flesch Reading Ease and Simple Measure of Gobbledygook (SMOG) Readability tools have been developed to evaluate the appropriateness of written health information. Furthermore, in addition to being readable and understandable, information should be both accurate and accessible. Studies suggest that appropriate presentation of information on the internet (e.g. by using bulleted lists rather than large passages of text) can enhance its usability, and as such, specific usability and credibility indicators have been developed.
There is growing interest in the importance of patient health literacy in long-term conditions. Studies evaluating patient education materials for rheumatological conditions, including osteoarthritis, rheumatoid arthritis, systemic lupus erythematosus and vasculitis, found many to be written above the recommended sixth-grade reading level.
Given the chronic nature of polymyalgia rheumatica (PMR), patients and their carers may be more likely to seek additional health information via the internet. To date, studies have not evaluated internet website resources for patients with PMR, in particular whether they are written at appropriate readability levels and are therefore likely to be understood by users. Therefore, the aim of this study was to evaluate the readability, credibility and usability, design and content of websites about PMR.
Identification of websites
The three most commonly used UK search engines (Google, Yahoo and Bing) were searched for the term ‘polymyalgia rheumatica’ on 31 July 2013. The search page results from each search engine were saved in PDF format, with a hyperlink for each search page and website, to ensure that the same pages found in this original search could be accessed again. Starting with the highest-ranking website, the first 50 eligible websites from each search engine were evaluated. Websites were excluded if they were videos, chat forums or product advertisements, or if they clearly stated that they were intended solely as professional resources. Websites identified by more than one search engine were evaluated only once. Excluded websites were replaced by the next eligible website found by the search engine, leaving 50 sites from each search engine. No human data were used and therefore ethical approval was not required.
Readability was measured using the Flesch Reading Ease and SMOG Readability tools. The Flesch Reading Ease tool measures readability using a formula that assesses word and sentence length. It rates text on a 100-point scale; the higher the score, the easier the text is to understand. A Flesch readability score of 60 or above is considered easy to follow. This tool correlates highly with other readability scales, has excellent reproducibility and has been used in numerous studies. The SMOG Readability tool also measures readability using a formula: in a sample of 30 sentences (10 from the start, 10 from the middle and 10 from the end of the text of interest), the number of words containing three or more syllables is counted. The SMOG score is the square root of this polysyllabic word count plus 3. A score of 3–8, 9–12 or 13 or more indicates that completion of primary, secondary or tertiary education, respectively, is needed to comprehend the information. This tool is simple to use, repeatable and accurate in determining reading level.
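To illustrate how these two scores are derived (the study itself used an on-line calculator, not this code), the sketch below implements the published Flesch Reading Ease formula and McLaughlin's general SMOG formula, which scales the polysyllable count to a 30-sentence sample rather than manually selecting 30 sentences. The syllable counter is a rough vowel-group heuristic, an assumption made for brevity; real calculators use more careful syllabification.

```python
import math
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count runs of consecutive vowels.
    An assumption for illustration, not a dictionary-accurate count."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Flesch Reading Ease:
    206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    Higher scores mean easier text; 60+ is considered easy to follow."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

def smog_grade(text: str) -> float:
    """SMOG grade via McLaughlin's general formula, normalising the
    polysyllabic word count to a 30-sentence sample."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    polysyllables = sum(1 for w in words if count_syllables(w) >= 3)
    return 1.0430 * math.sqrt(polysyllables * (30 / len(sentences))) + 3.1291

sample = "The cat sat. The dog ran. It is fun."
print(f"Flesch: {flesch_reading_ease(sample):.1f}")  # short monosyllabic text scores very high
print(f"SMOG grade: {smog_grade(sample):.1f}")       # no polysyllables, so near the formula floor
```

Note the trade-off: the Flesch score rises as text gets easier, whereas the SMOG grade rises as text gets harder, which is why the study reports high Flesch scores and low SMOG scores as desirable.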
For the readability assessment, the first 600 words of each website's content were copied and pasted into a free on-line text readability calculator to obtain the Flesch Reading Ease and SMOG Readability scores. Titles, subtitles, references, web links and advertising text were excluded from the readability analysis, with only body text and bullet-point text included. Although only the first 600 words of each website were assessed for readability, the rest of the site was examined for credibility, usability and content, as described below.
Website credibility and quality
Website credibility and usability were assessed using a previously published set of 22 variables, designed to be easily identifiable and interpretable by people irrespective of their level of education. From this we calculated the previously described 8-item Credibility Indicator (incorporating authorship, affiliation, editorial team, date of creation, date of update, backing, accreditation and financing).
Information contained on each site was summarised into clinical domains of PMR (e.g. symptoms and signs, management, prognosis), and its accuracy was assessed by clinicians, including a consultant rheumatologist (SH) and a general practitioner (JP).
The mean and standard deviation (SD) of the readability scores (Flesch Reading Ease and SMOG Readability) were calculated across all websites assessed. For the other domains, the proportion of websites scoring positively on each item was calculated, and the 8-item Credibility Indicator was derived. Results are presented as mean (SD) unless otherwise stated.
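The summary statistics described above are straightforward; as a sketch (using hypothetical scores, not the study's data), they could be computed as follows:

```python
import statistics

# Hypothetical Flesch Reading Ease scores for five sites
# (illustrative values only, not the study's data).
flesch_scores = [48, 62, 35, 55, 40]

mean_score = statistics.mean(flesch_scores)
sd_score = statistics.stdev(flesch_scores)  # sample SD (n - 1 denominator)

# Proportion scoring positively on a binary item, e.g. naming an author.
named_author = [True, False, False, True, False]
proportion = sum(named_author) / len(named_author)

print(f"Flesch: mean {mean_score:.0f} (SD {sd_score:.0f})")
print(f"Named author: {proportion:.0%}")
```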
Figure 1 shows the website selection process from the three search engines. After removing duplicate sites and those aimed purely at healthcare professionals, 52 sites remained for evaluation.
The mean (SD) ‘Flesch Reading Ease’ and ‘SMOG Readability’ scores of the websites were 48 (15) and 10 (2), respectively.
Credibility and usability
Table 1 details the credibility and quality results. The mean (SD) Credibility Indicator was 2 (1). Thirty-six (69%) websites included contact details and 42 (81%) had a built-in search facility. Information regarding financial support, date of creation and named authors was included in only 1 (2%), 8 (15%) and 11 (21%) of websites, respectively. Moreover, appropriate bodies accredited only 11 (21%) of the websites, and only 8 (15%) included a ‘help’ option. The ability to change the font size was available on 14 (27%) of websites.
Design of the websites was generally good. All used consistent designs, font sizes and styles, with 51 (98%) using a font size of at least 12 points and 44 (85%) ‘chunking’ information into meaningful sections with clear headings. Moreover, 47 (90%) used short sentences and an active voice, with 45 (87%) avoiding jargon or technical language. However, only 5 (10%) included video or audio illustration and only 8 (15%) supplemented the text with illustrations.
Evaluation of all 52 websites found variation in the type of content (Table 2). Whilst 49 (94%) provided information regarding symptoms and all included aspects of management, fewer provided information on prognosis (n = 16, 31%). Most sites were accurate (n = 50, 96%), although there were some important inaccuracies, including recommending herbal supplements as a treatment for PMR and suggesting that statins cause PMR. Furthermore, although many websites highlighted the link between PMR and giant cell arteritis (GCA), only 44% gave appropriate advice to seek urgent medical attention if visual symptoms develop, and only 25% contained appropriate advice on what to do about steroids when unwell, suggesting that some key patient messages are not universally highlighted.
With the increasing number of people using the internet to access health-related information, it is essential that the information on websites is readable, accurate, credible and user-friendly. Whilst the accuracy and design of websites providing information on PMR are generally good, albeit with some key omissions, this study highlights that the readability of these sites is poor. The majority of the websites have a reading age of at least 16 years, significantly higher than the United States Department of Health and Human Services (USDHHS) recommended reading age of 10–12 years for patient information, suggesting that they could be improved to ensure they are widely accessible. These findings are in line with those reported previously, suggesting that it is common for patient health information to require higher than recommended reading ages. Given that health literacy levels are known to be lower in older people than in the general population, this may be a particular issue for patients with PMR, and significant revisions may be needed to ensure that information is accessible.
These findings support the proposal by Fitzsimmons et al. that website editors consider introducing a minimum readability policy based on the USDHHS guidelines, using a validated readability measure such as the SMOG Readability measure, which is easy to use and for which on-line calculators are available, to improve the comprehension of patient information.
In addition to the poor readability of the PMR websites, this study found that the credibility and usability of most patient-orientated PMR websites could be improved. Many did not state the date of creation or accreditation, or name their authors, making it difficult to assess how up to date these sites are or who wrote them. Moreover, the majority did not offer a ‘help’ option or the ability to change the size of the text, suggesting that these websites have not taken into account visually impaired patients, which may be a particular problem for older adults.
A key strength of this study is that, because it evaluated the first 50 websites from three of the most commonly used search engines in the UK, it is likely to have assessed the websites that patients actually read. Moreover, this study evaluated not only the readability, credibility and usability of the websites, but also their design and content. This contrasts with most other studies of on-line health information for a particular condition, which have tended to focus on evaluating one specific aspect of website quality [11, 15].
In summary, although there is a wide range of PMR websites, this study suggests that many require a higher reading age than recommended. The readability, credibility and usability of PMR websites should therefore be reconsidered to maximise their likely patient benefit.
SMOG: Simple Measure of Gobbledygook
USDHHS: United States Department of Health and Human Services
Baker DW. The meaning and the measure of health literacy. J Gen Intern Med. 2006;21:878–83.
Berkman ND, Sheridan SL, Donahue KE, Halpern DJ, Crotty K. Low health literacy and health outcomes: an updated systematic review. Ann Intern Med. 2011;155:97–107.
Pincus T, Keysor J, Sokka T, Krishnan E, Callahan LF. Patient questionnaires and formal education level as prospective predictors of mortality over 10 years in 97% of 1416 patients with rheumatoid arthritis from 15 United States private practices. J Rheumatol. 2004;31:229–34.
Flynn KE, Smith MA, Freese J. When do older adults turn to the internet for health information? Findings from the Wisconsin Longitudinal Study. J Gen Intern Med. 2006;21:1295–301.
Rowlands G, Protheroe J, Winkley J, Richardson M, Seed PT, Rudd R. A mismatch between population health literacy and the complexity of health information: an observational study. Br J Gen Pract. 2015;65:379–86.
US Department of Health and Human Services. Saying it clearly. Washington, DC: US Government Printing Office; 2000.
Davis TC, Crouch MA, Wills G, Miller S, Abdehou DM. The gap between patient reading comprehension and the readability of patient education materials. J Fam Pract. 1990;31:533–8.
Farr JN, Jenkins JJ, Paterson DG. Simplification of Flesch Reading Ease Formula. J Appl Psychol. 1951;35:333–7.
McLaughlin GH. SMOG grading: a new readability formula. J Read Res. 1969;12:639–46.
Rhee RL, Von Feldt JM, Schumacher HR, Merkel PA. Readability and suitability assessment of patient education materials in rheumatic diseases. Arthritis Care Res (Hoboken). 2013;65:1702–6.
Guardiola-Wanden-Berghe R, Gil-Pérez JD, Sanz-Valero J, Wanden-Berghe C. Evaluating the quality of websites relating to diet and eating disorders. Health Info Libr J. 2011;28:294–301.
NIACE. Readability: How to produce clear written materials for a range of readers. 2005.
Stiles L. e-Health Literacy: what it means and why it matters. 2009.
Kobayashi LC, Wardle J, Wolf MS, von Wagner C. Aging and Functional Health Literacy: A Systematic Review and Meta-Analysis. J Gerontol B Psychol Sci Soc Sci. 2016;71(3):445–57.
Fitzsimmons PR, Michael BD, Hulley JL, Scott GO. A readability assessment of online Parkinson's disease information. J R Coll Physicians Edinb. 2010;40:292–6.
AV was funded by an INSPIRE award. SM is funded by the National Institute for Health Research School for Primary Care Research.
This article presents independent research funded by the National Institute for Health Research (NIHR). The views expressed are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health. The study funders had no role in the study design; data collection, analysis, or interpretation; in the writing of the paper; or in the decision to submit the paper for publication.
Availability of data and materials
All data are presented in the main paper.
SH, JP, SM made substantial contributions to the study conception and design. AV made substantial contributions to the acquisition of data. All authors made substantial contributions to the analysis and interpretation of data, drafting of the article/revising it critically for important intellectual content and approved the final manuscript.
AV (MPhil), Foundation Year Doctor.
JP (MB ChB, MRes, PhD, FRCGP), Senior Lecturer in General Practice and GP.
SM (BSc, MSc, PhD), Research Fellow: Inflammatory Arthritis.
SH (BMedSci, BM BS, MSc, PhD, FRCP), Senior Lecturer & Honorary Consultant Rheumatologist.
The authors declare that they have no competing interests.
Consent for publication
Ethics approval and consent to participate