"Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed, it is the only thing that ever has."

Margaret Mead

Review article
peer-reviewed

Reliability and Validity of Non-radiographic Methods of Forward Head Posture Measurement: A Systematic Review



Abstract

Forward head posture measurement can be conducted using various methods and instruments, and selecting the appropriate method requires consideration of validity and reliability. This systematic review reports on the reliability and validity of non-radiographic methods of measuring forward head posture. Relevant studies were identified through a systematic search of electronic databases and assessed for quality by two independent reviewers using a critical appraisal tool. The studies’ data were extracted and assessed, and the results were synthesized qualitatively using a level of evidence approach. Twenty-one studies met the eligibility criteria and were included in the review. Five studies investigated both reliability and validity, whereas the remaining 16 investigated reliability only. In total, 11 methods of forward head posture measurement were evaluated in the retrieved studies. The validity of the methods ranged from low to very high, and their reliability ranged from moderate to excellent. The strongest levels of evidence for reliability support the use of classic photogrammetry, whereas the evidence for validity is not conclusive. Further studies are required to strengthen the level of evidence on the reliability and validity of the remaining methods, and it is recommended that this point be addressed in future research.

Introduction & Background

Neck pain is highly prevalent [1,2]. According to the Bone and Joint Decade 2000-2010 Task Force on Neck Pain and Its Associated Disorders, most people experience some neck pain in their lifetimes [3]; for most, however, the pain does not seriously hinder everyday activities. At least one in three adults in Europe and North America experiences neck pain at some point, and about 5-10% of these cases involve severe neck pain. The prevalence of neck pain is higher in women, and it increases with age [1,4,5]. Additional risk factors are lack of physical activity, increased body mass index (BMI), low kinaesthesia, and incorrect movement patterns [6-9]. Neck pain has also been associated with poor general health, previous neck injuries, and other risk factors, including occupation, smoking, obesity, and poor posture [3,10-12].

The most common pathological postural adaptation associated with the development of neck pain is forward head posture (FHP) [8,13]. FHP increases the load on the cervical spine, promoting pathological myofascial adaptations and muscle imbalances. The muscles that FHP weakens include the deep neck flexors and the scapular stabilizers and retractors, whereas the deep upper cervical extensors and the shoulder protractors and elevators become shortened and overactive. These muscle imbalances can cause cervical and thoracic instability, resulting in decreased respiratory function, proprioceptive alterations, increased muscle tone, and cervical pain [14,15].

The anterior displacement of the head is mainly assessed by examining the craniovertebral angle (CVA), as defined by Wickens and Kiphuth [16]. CVA measurement is an essential part of the musculoskeletal assessment, helping clinical therapists screen for excessive anterior head displacement and develop appropriate therapeutic strategies for this pathological condition.
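
For orientation, the CVA is conventionally measured on a lateral view as the angle at C7 between a horizontal line and the line joining the C7 spinous process to the tragus of the ear, with smaller angles indicating greater forward head displacement. The short sketch below illustrates this calculation from two digitized landmark coordinates; it is a minimal, hypothetical example of the underlying geometry and does not reproduce the algorithm of any specific tool reviewed here.

```python
import math

def craniovertebral_angle(c7, tragus):
    """Angle (degrees) at C7 between the horizontal and the C7-tragus line.
    `c7` and `tragus` are (x, y) coordinates digitized from a lateral-view
    photograph, with x increasing toward the front of the subject and y upward."""
    dx = tragus[0] - c7[0]
    dy = tragus[1] - c7[1]
    return math.degrees(math.atan2(dy, dx))

# Hypothetical example: tragus 6 cm anterior to and 9 cm above C7
print(round(craniovertebral_angle((0.0, 0.0), (6.0, 9.0)), 1))  # 56.3
```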

The current gold standard for the quantitative determination of the cervical angle is the lateral x-ray; however, it has significant practical limitations, such as the high cost of examination and the exposure of patients to doses of potentially harmful radiation. Alternatively, several non-invasive examination methods have been adopted for clinical use, including photographic imaging, goniometry, and three-dimensional (3D) motion devices [17-21]. Guidelines for selecting assessment tools in clinical and laboratory testing settings identify validity and reliability as key properties of measurement tools [22-24]. Validity refers to the truth of a set of statements [25,26] and examines whether a study instrument measures the variable it intends to measure [22,26]. Reliability, in contrast, refers to the reproducibility of results across repeated trials [26,27].

Since several studies have been published on the validity and reliability of non-invasive CVA screening tools, a literature review is needed to draw conclusions and provide clinically useful guidance. Therefore, the purpose of this systematic review was to report on the reliability and validity of non-radiographic methods of measuring FHP.

Review

Methods

Search Strategy

The primary investigator conducted a systematic search from April 1 to May 1, 2022. Databases included PubMed, MEDLINE (Medical Literature Analysis and Retrieval System Online), EBSCO (Elton B. Stephens Company), Google Scholar, and Science Direct. The keywords used in different combinations were: forward head posture, craniovertebral angle, test, measurement, validity, reliability, cervical photogrammetry and radiography.

After the initial search, duplicate articles were removed, and the remaining studies were screened by title and abstract. When an article appeared to meet the inclusion criteria, its full text was retrieved and analyzed. A full reading of the articles was then conducted to confirm relevance, and seven articles were removed. The reference lists of the included articles were also searched for additional studies, but none were identified as relevant.

Eligibility Criteria

The eligibility criteria were agreed upon during a meeting between the two reviewers. The inclusion criteria were as follows: 1) articles available in full text, 2) articles available in English or Greek, 3) FHP recorded with non-invasive techniques, 4) measurement of validity and/or reliability included, and 5) human participants, with no restrictions on their physical and somatic characteristics. The exclusion criteria were as follows: 1) radiographic measurement techniques only, and 2) no intraclass correlation coefficient (ICC) calculated for the measurement of reliability. When the final list of articles was drafted, the secondary reviewer checked it against the eligibility criteria. No disagreements occurred between the reviewers regarding the eligibility of the chosen articles.

Quality Assessment

The reviewers used the checklist by Brink and Louw (2011), a critical appraisal tool [28] designed to assess the methodological quality of studies that test the validity and reliability of objective clinical tools. The checklist comprises 13 questions that qualitatively assess study methodology by combining the Quality Assessment of Diagnostic Accuracy Studies (QUADAS) [29] and the Quality Appraisal of Diagnostic Reliability Studies (QAREL) [30]. Response options for the 13 questions are ‘yes’, ‘no’, or ‘N/A’ (not applicable). This checklist was also used in a systematic review of non-radiographic thoracic kyphosis measurements by Barrett et al., which likewise included articles assessing both reliability and validity [31]. The checklist was therefore deemed more convenient than using the QUADAS or QAREL separately. A study was rated as high quality if a positive score (‘yes’) was given to 60% or more of the questions; the same scheme was used previously by van der Wurff et al. [32], May et al. [33], and Adhia et al. [34].
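
As an illustration of this scoring rule, the sketch below classifies a study from its 13 checklist responses. It is a minimal example only; the review does not state whether items marked ‘N/A’ were counted in the denominator, so excluding them here is an assumption.

```python
def is_high_quality(responses, threshold=0.60):
    """Return True if 'yes' answers make up at least `threshold` of the
    13 Brink and Louw checklist items. Items answered 'n/a' are excluded
    from the denominator (an assumption; the review does not specify this)."""
    answered = [r.lower() for r in responses if r.lower() != "n/a"]
    return bool(answered) and answered.count("yes") / len(answered) >= threshold

# e.g. the responses recorded for Garrett et al., 1993 in Table 2
garrett_1993 = ["Yes", "Yes", "No", "Yes", "Yes", "Yes", "N/A",
                "Yes", "N/A", "Yes", "N/A", "No", "Yes"]
print(is_high_quality(garrett_1993))  # True (8 of 10 applicable items)
```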

Quality assessment was first performed by the primary reviewer, and the secondary reviewer then checked the primary reviewer’s ratings. The few differences that arose were resolved by discussing the proper interpretation of each question and which response more accurately reflected the study. Because diverging views were rare and consensus was reached quickly, no kappa score was recorded.

Data Analysis

The collected studies showed large heterogeneity in study populations and measurement tests; therefore, neither meta-analysis nor subgroup analysis was deemed feasible. Consequently, the reviewers performed a descriptive analysis, synthesising the data using a level of evidence approach [35], as shown in Table 1. A schematic sketch of how these criteria map onto study counts follows the table.

Level of evidence: Criteria
Strong: Consistent findings from three high-quality studies
Moderate: Consistent findings from at least one high-quality and one or more low-quality studies
Limited: Consistent findings in one low-quality study or only one study available
Conflicting: Inconsistent evidence in multiple studies, irrespective of study quality
No evidence: No studies found
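
Schematically, these criteria map the number, quality, and consistency of the available studies onto an evidence level. The sketch below is one hedged rendering of that mapping; cases not explicitly covered by Table 1 (for example, two consistent high-quality studies and no low-quality ones) are resolved here by defaulting to 'limited', and the table wording remains authoritative.

```python
def level_of_evidence(n_high, n_low, consistent=True):
    """Map study counts to the evidence levels of Table 1 (a sketch only).
    n_high / n_low: numbers of high- and low-quality studies reporting on a
    method; consistent: whether their findings agree."""
    total = n_high + n_low
    if total == 0:
        return "no evidence"
    if total > 1 and not consistent:
        return "conflicting"
    if n_high >= 3:
        return "strong"
    if n_high >= 1 and n_low >= 1:
        return "moderate"
    return "limited"  # a single study, or low-quality studies only

print(level_of_evidence(3, 0))         # strong
print(level_of_evidence(1, 2))         # moderate
print(level_of_evidence(0, 1))         # limited
print(level_of_evidence(2, 2, False))  # conflicting
```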

The ICC and Pearson’s correlation coefficient were interpreted according to Munro and Visintainer [36] as follows: (i) very low correlation: 0.00-0.29; (ii) low correlation: 0.30-0.49; (iii) moderate correlation: 0.50-0.69; (iv) high correlation: 0.70-0.89; (v) very high correlation: 0.90-1.00.
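
For reference, the following sketch is a plain transcription of these interpretation bands applied to a coefficient value; it is illustrative only and not part of the review’s methodology.

```python
def interpret_correlation(r):
    """Classify an ICC or Pearson's r using the Munro bands quoted above."""
    r = abs(r)
    if r < 0.30:
        return "very low"
    if r < 0.50:
        return "low"
    if r < 0.70:
        return "moderate"
    if r < 0.90:
        return "high"
    return "very high"

print(interpret_correlation(0.94))  # very high (e.g. the 0.94 validity coefficient in Table 4 [41])
print(interpret_correlation(0.45))  # low
```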

Results

Selection of Studies

In total, 21 articles were reviewed based on the selection criteria described above. Of these, five studies examined both validity and reliability, and the remaining 16 evaluated only reliability. All 21 studies assessed reliability: 15 investigated both intra- and inter-rater reliability, five investigated only intra-rater reliability, and one investigated only inter-rater reliability. Figure 1 presents the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) diagram describing the article selection process [37].

Methodological Quality

Fourteen of the 21 studies were deemed to be of high quality. The full scoring is given in Table 2. The two reviewers agreed on the scores after discussion and deliberation. All five studies that examined validity were of high quality [38-42].

Study 1 2 3 4 5 6 7 8 9 10 11 12 13 High quality?
Gadotti et al., 2013 [17] Yes Yes Yes Yes Yes Yes Yes Yes No Yes Yes N/A Yes Yes
Garrett et al., 1993 [18] Yes Yes No Yes Yes Yes N/A Yes N/A Yes N/A No Yes Yes
Hickey et al., 2000 [20] Yes Yes Yes Yes Yes Yes Yes Yes Yes Yes Yes No Yes Yes
Gallego-Izquierdo et al., 2020 [38] Yes Yes N/A Yes Yes Yes N/A Yes N/A Yes N/A No Yes Yes
Hopkins et al., 2019 [39] Yes Yes Yes N/A N/A Yes Yes Yes Yes Yes Yes No Yes Yes
Lau et al., 2010 [40] Yes Yes Yes Yes Yes Yes Yes Yes Yes Yes Yes No Yes Yes
Ruivo et al., 2013 [41] N/A Yes Yes Yes Yes Yes N/A Yes Yes Yes Yes No Yes Yes
van Niekerk et al., 2008 [42] Yes Yes Yes N/A N/A No Yes Yes No Yes Yes N/A Yes Yes
Dimitriadis et al., 2015 [43] Yes Yes N/A Yes No Yes N/A Yes N/A Yes N/A N/A Yes No
Dunk et al., 2005 [44] Yes No N/A N/A Yes N/A Yes N/A N/A Yes N/A N/A Yes No
Gadotti et al., 2010 [45] Yes Yes Yes N/A No No No No Yes Yes Yes N/A Yes No
Moradi et al., 2014 [46] Yes Yes N/A Yes Yes Yes No Yes No Yes No No Yes Yes
Nam et al., 2013 [47] Yes Yes N/A Yes Yes Yes N/A Yes N/A Yes No No Yes Yes
Salahzadeh et al., 2014 [48] Yes Yes N/A N/A N/A Yes Yes Yes N/A Yes N/A No Yes No
Ferreira et al., 2010 [49] No Yes Yes Yes Yes Yes Yes Yes Yes Yes Yes N/A Yes Yes
Ruivo et al., 2015 [50] Yes Yes Yes Yes Yes Yes Yes Yes N/A Yes No Yes Yes Yes
Souza et al., 2011 [51] Yes No N/A N/A Yes Yes N/A Yes N/A Yes N/A No Yes No
Weber et al., 2012 [52] Yes Yes Yes N/A N/A Yes Yes Yes Yes Yes Yes N/A Yes Yes
Cote et al., 2021 [53] Yes Yes N/A Yes Yes No N/A Yes N/A Yes N/A N/A Yes No
Gray et al., 2020 [54] Yes Yes No Yes Yes Yes No Yes N/A Yes No No Yes Yes
Lee et al., 2017 [55] Yes No N/A No No Yes N/A Yes N/A Yes N/A N/A Yes No

Study Characteristics

Eleven different methods of measuring FHP were identified within the reviewed articles. The most studied were photogrammetry and Postural Assessment Software (SAPO version 0.69). Classic photogrammetry was not tested for validity. SAPO’s validity was examined in only one study [41]. The full list of methods is given in Table 3.

1. Photogrammetry: Seven articles [17, 43-48]  
2. Postural Assessment Software (SAPO): Five articles [41,49-52]  
3. Photogrammetry through Video Conferencing Platform: One article [53]  
4. Digital Photogrammetry with PostureScreen Mobile App (PostureCo, Inc., Florida, United States): One article [39]  
5. FHP App (Pyeongtaek, South Korea) Mobile Phone Application: One article [38]  
6. Goniometer, CROM Instrument: One article [18]  
7. Goniometer, Posture Measuring Device (PMD): One article [54]  
8. Goniometer, CROM Device and Plumb-Line Techniques: One article [20]  
9. Goniometer, SmartTool Angle Finder (M-D Building Products, Inc., Oklahoma City, United States): One article [40]  
10. 3D Motion Capture System: One article [55]  
11. Photographic Posture Analysis Method (PPAM): One article [42]

Types of Participants

A healthy population was included in 12 of 21 studies [20,38,39,42,43-48,50,54]. A mixed population of healthy and neck-pain participants was included in four of 21 studies [18,40,51,53]. Five of 21 studies did not report on the health condition of participants [41,45,49,52,55]. The participants’ BMI was reported in 10 out of 21 studies [17,18,38,39,43,44-46,53,55].

The population age varied between studies. Adolescent populations aged 15-16 and 16-17 years took part in two studies [42,50]. Two studies’ populations were young adults aged 18-28 and 17-27 years [17,45]. The population of one study was between the ages of 19 and 35 years [52]. Seven studies included participants who were predominantly in their early twenties, within the age group of 18-27 years [38,39,43,44,47,48,55]. Three studies included populations aged 25-26 years [20,51,53]. Three studies included older populations of 33 ± 8.03 years [45], 46.7 ± 9.5 years [40], and 50 ± 15.7 years [18]. Finally, three studies did not report the population age [41,49,54].

Reliability and Validity

The validity of the methods ranged from low to very high. However, only five of 21 studies assessed validity. Reliability showed varying results, given that not all studies investigated both inter- and intra-rater reliability. More detailed results are shown in Table 4.

Reference High quality? Reliability (ICC/Cronbach’s alpha) SEM Validity (correlation coefficient)
Gadotti et al., 2013 [17] Yes 0.99 (inter), 0.99 (intra) 0.04 (inter), 0.01 (intra) N/A
Garrett et al., 1993 [18] Yes 0.93 (intra), 0.83 (inter) N/A N/A
Hickey et al., 2000 [20] Yes CROM 0.77 (intra), 0.69 (inter); plumb line 0.83 (intra), 0.75 (inter) N/A N/A
Gallego-Izquierdo et al., 2020 [38] Yes 0.86 (intra), 0.88 (inter) 1.96 (intra), 1.795 (inter) 0.86 (correlation coefficient)
Hopkins et al., 2019 [39] Yes 1.00 ± 0.09 (intra) N/A -0.14 ± 0.06 (-0.32; 0.04) bias (99.75% credible interval)
Lau et al., 2010 [40] Yes 0.99 (inter), 0.99 (intra) N/A 0.72 (correlation coefficient)
Ruivo et al., 2013 [41] Yes 0.99 (inter), 0.99 (intra) N/A 0.94 (correlation coefficient)
van Niekerk et al., 2008 [42] Yes 0.98 (inter) 8.06 0.89 (correlation coefficient)
Dimitriadis et al., 2015 [43] No Sitting 0.86, standing 0.82 (inter); sitting 0.86, standing 0.88 (intra) Sitting 1.7, standing 1.94 (inter); sitting 2.08, standing 1.75 (intra) N/A
Dunk et al., 2005 [44] No 0.74–0.83 (intra) N/A N/A
Gadotti et al., 2010 [45] No 0.85 (intra) N/A N/A
Moradi et al., 2014 [46] Yes 0.95 (intra), 0.89 (inter) 0.74 (intra), 1.5 (inter) N/A
Nam et al., 2013 [47] Yes 0.75 (inter), 0.91 (intra) 0.13 (inter), 0.16 (intra) N/A
Salahzadeh et al., 2014 [48] No 0.90 (inter), 0.92 (intra) 1.94 (inter), 1.74 (intra) N/A
Ferreira et al., 2010 [49] Yes 0.69 (inter), 0.85 (intra) 1.77 (inter), 1.33 (intra) N/A
Ruivo et al., 2015 [50] Yes 0.88 (inter), 0.83 (intra) 2.35 (inter), 2.72 (intra) N/A
Souza et al., 2011 [51] No 0.99 (ANOVA-p-intra), 0.98 (inter) N/A N/A
Weber et al., 2012 [52] Yes 0.99 (intra) N/A N/A
Cote et al., 2021 [53] No 0.88 (inter), 0.91 (intra) 1.87 (inter), 1.89 (intra) N/A
Gray et al., 2020 [54] Yes 0.82 (intra), 0.87 (inter) N/A N/A
Lee et al., 2017 [55] No Sitting 0.92, standing 0.96 (intra) N/A N/A

Level of Evidence

Table 5 details the accumulated levels of evidence for all examined methods. For most methods, the level of evidence for reliability and validity was limited or conflicting; strong and moderate levels of evidence were found for only a small selection of methods.

Level of evidence Method Reliability Validity
Strong Photogrammetry Good to excellent intra-rater and inter-rater reliability  
  SAPO software Good to excellent intra-rater and inter-rater reliability  
  FHP app (Pyeongtaek, South Korea) mobile phone application Very good intra- and inter-rater reliability Very high validity
  Goniometer, CROM instrument Very good intra- and inter-rater reliability  
  Goniometer, CROM device and plumb-line techniques Moderate intra- and inter-rater reliability  
  PPAM Excellent intra-rater reliability Very high validity
  Goniometer, SmartTool Angle Finder (M-D Building Products, Inc., Oklahoma City, United States) Excellent intra- and inter-rater reliability Moderate validity
Moderate Photogrammetry Good to excellent intra-rater and inter-rater reliability  
  Goniometer, PMD Very good intra- and inter-rater reliability  
  Digital photogrammetry with PostureScreen Mobile App (PostureCo, Inc., Trinity, FL, United States) Excellent intra-rater reliability High validity
  3D Motion Capture System Excellent intra-rater reliability  
  SAPO software Excellent intra-rater and inter-rater reliability  
Limited SAPO software Excellent intra-rater and inter-rater reliability Very high validity

Discussion

Main Findings

This systematic review examined 11 methods for the non-invasive measurement of FHP, excluding radiography. These ranged from classic photogrammetric measurement of the CVA to digital postural assessment tools and mobile applications. Levels of reliability varied significantly between methods; however, it is not feasible to draw firm conclusions because of the limited number of studies per method. Only classic photogrammetry and postural assessment software were examined in an adequate number of studies (seven and five, respectively), and both showed good to excellent intra-rater and inter-rater reliability. The reliability of digital photogrammetry was studied in only two articles, each using a different tool: a video conferencing platform [53] and a mobile application [39]. Goniometry as a method was studied in five articles using five different instruments. Upon reviewing the data, classic photogrammetry and postural assessment software appear to be equally reliable.

The validity of measurement methods has been less commonly studied. However, photogrammetry as a method is generally accepted as valid.

Validity

Spinal x-rays are considered the gold standard method for assessing spinal deformities, including postural alterations such as FHP. However, radiographs impose accessibility and ethical obstacles in different populations [56]. This is why only one study examined photogrammetry against radiography [42], and only one other study examined goniometry against radiography [40]. In addition, the placement of surface landmarks used to locate the tragus and C7 cannot be considered as accurate as locating those points on radiographic images.

Gallego-Izquierdo et al. [38] found high criterion validity of the FHP app against photogrammetry performed with a software program (Kinovea) that automatically calculates the CVA [39]. The study's very good intra- and inter-rater reliability also supported its internal validity. Lau et al. [40] showed good criterion validity of the SmartTool Angle Finder (M-D Building Products, Inc., Oklahoma City, United States) goniometer against x-rays. That study's excellent intra- and inter-rater reliability also supported its internal validity.

Van Niekerk et al. showed good criterion validity of the PPAM computerised photogrammetry method against radiography using the LODOX system (Lodox Systems (Pty) Ltd, Sandton, South Africa) [42]. Ruivo et al. showed high validity of the SAPO method against classic goniometry [41]. However, the level of evidence was limited; therefore, validation against goniometry is not considered solid evidence of validity for FHP measurement. The excellent intra- and inter-rater reliability of the study supported its internal validity.

Hopkins et al. [39] performed the first study to evaluate the validity of digital photogrammetry with the PostureScreen Mobile App (PostureCo, Inc., Trinity, FL, United States), but the results were uncertain. In addition, the level of evidence of the study was moderate. 

Based on the above, no definitive conclusions can be drawn regarding the validity of the assessment of FHP by non-invasive techniques other than radiographic assessment. Although there are indications that photogrammetry can produce valid results in assessing FHP, this should be confirmed by future validity studies. Until then, the radiographic evaluation will remain essential in the clinical evaluation of FHP.

Reliability

The reliability of measurement methods depends heavily on eliminating the uncertainty caused by postural discrepancies. The vast majority of researchers addressed this point by varying the testing order (all but four studies [42,44,45,53]) and by taking measurements in repeated sessions (all but two studies [44,45]).

Measurement of the CVA requires accurately locating the relevant anatomical points, such as C7 and the tragus for photogrammetry, and in some cases placing surface landmarks; reliability therefore depends highly on accuracy. Twenty studies described in detail the procedure followed to improve accuracy; the one study that did not was that of Gadotti et al. [45]. In addition, 10 studies included experienced testers [18,20,38,40,45,47,48,50,53,54]. Moreover, only two studies took measurements in both sitting and standing positions [43,55].

Based on the above, it can be concluded that non-invasive evaluation techniques for FHP can produce reliable results if the measurement process is standardized. However, the significant methodological differences between the articles (high heterogeneity of study populations, measurement methods, and raters) make it difficult to generalize these conclusions. Despite this, there were a sufficient number of reports documenting the reliability of these techniques for assessing FHP, particularly classic photogrammetry and postural assessment software (PAS/SAPO).

Methodological Considerations

The methodological limitations of the reviewed studies involved the general health and condition of the studied populations. Most included a healthy population sample with a mean age of between 20 and 65 years. The BMI was unreported in 11 of the 21 studies. These characteristics do not accurately and inclusively reflect the patients who will receive FHP measurements in clinical practice [29]. Therefore, the results for diagnostic accuracy may have limited clinical applicability (generalisability).

Raters were described in only three of the five SAPO-related studies, calling into question the generalisability of results for this method. Moreover, some studies did not perform inter-rater blinding [39,42,44,45,48,51,52,55] or intra-rater blinding [39,42,43,45,48,52,55].

Limitations of the Review

The present review was conducted in a systematic manner, incorporating the PRISMA guidelines for the study search and the QUADAS and QAREL for qualitatively assessing methodology. In addition, two reviewers were engaged, and studies of all available populations were considered. However, the review was limited in that it included only articles in the English and Greek languages. Moreover, the two reviewers had knowledge of the results of the studies before assessing their methodological quality; the critical appraisal tool was applied with the strictest criteria to limit the possibility of reviewer bias [57]. Finally, the high heterogeneity of study populations, measurement methods, and raters suggests that the external validity of this review is low.

Clinical and Research Implications

This review shows that therapists can choose from a limited number of approaches when assessing FHP. The most widespread methods are radiography, classic photogrammetry, and goniometry.

Photogrammetry can be recommended as a reliable and valid method that avoids the disadvantages of radiography. Digital photogrammetry is gaining popularity, and different software packages and mobile applications have been tested, although the data remain limited so far. For specific populations, it may be useful to employ video-based telehealth platforms, as examined in the study by Cote et al. [53]. Goniometry is a widespread approach, but it is performed with different instruments, leaving therapists to decide which is most suitable for them; thus, there are no conclusive outcomes regarding the reliability and validity of each instrument. Experience in the use of each goniometer is also a determining factor for measurement accuracy. Further research could inform the selection of the appropriate goniometer.

It can be stated from this review that therapists need to consider population characteristics when deciding on the appropriate FHP assessment method. In addition, future research should include more representative samples of populations, ensure rater blinding and focus on appropriate statistical analyses.

Conclusions

This systematic review examined various FHP measurement methods, including photogrammetry, postural assessment software, mobile applications, goniometer measurements, and 3D motion capture systems. However, except for photogrammetry, the number of studies examining each method was limited. Overall, the reliability data were positive, although such data remain scarce and, in some cases, present significant limitations. Photogrammetry consistently delivers reliable results. In contrast, the variety of goniometers used in goniometry studies does not allow a definite conclusion regarding the method’s overall reliability. Validity data are very limited across the methods, although photogrammetry appears to be considered valid. Ultimately, further research is needed to evaluate the reliability and validity of goniometry, solidify the validity of photogrammetry, and provide data on the other reviewed methods.


References

  1. Andersson HI, Ejlertsson G, Leden I, Rosenberg C: Chronic pain in a geographically defined general population: studies of differences in age, gender, social class, and pain localization. Clin J Pain. 1993, 9:174-82. 10.1097/00002508-199309000-00004
  2. Hoy DG, Protani M, De R, Buchbinder R: The epidemiology of neck pain. Best Pract Res Clin Rheumatol. 2010, 24:783-92. 10.1016/j.berh.2011.01.019
  3. Haldeman S, Carroll L, Cassidy JD, Schubert J, Nygren Å: The bone and joint decade 2000-2010 task force on neck pain and its associated disorders. Eur Spine J. 2008, 17:5-7. 10.1007/s00586-008-0619-8
  4. Côté P, Cassidy JD, Carroll L: The Saskatchewan Health and Back Pain Survey. The prevalence of neck pain and related disability in Saskatchewan adults. Spine (Phila Pa 1976). 1998, 23:1689-98. 10.1097/00007632-199808010-00015
  5. Fejer R, Kyvik KO, Hartvigsen J: The prevalence of neck pain in the world population: a systematic critical review of the literature. Eur Spine J. 2006, 15:834-48. 10.1007/s00586-004-0864-4
  6. Binder AI: Cervical pain syndromes. Oxford Textbook of Rheumatology. Isenberg DA, Maddison PJ, Woo P, Glass DN, Breedveld FC (ed): Oxford University Press, Oxford; New York; 2004. 1185-95.
  7. Trask C, Mathiassen SE, Rostami M: Partly visible periods in posture observation from video: prevalence and effect on summary estimates of postures in the job. Appl Ergon. 2015, 49:63-9. 10.1016/j.apergo.2015.02.001
  8. Schwanke NL, Pohl HH, Reuter CP, Borges TS, de Souza S, Burgos MS: Differences in body posture, strength and flexibility in schoolchildren with overweight and obesity: a quasi-experimental study. Man Ther. 2016, 22:138-44. 10.1016/j.math.2015.11.004
  9. Kang DY: Deep cervical flexor training with a pressure biofeedback unit is an effective method for maintaining neck mobility and muscular endurance in college students with forward head posture. J Phys Ther Sci. 2015, 27:3207-10. 10.1589/jpts.27.3207
  10. Ariëns GA, Borghouts JA, Koes BW: Neck pain. Epidemiology of Pain: A Report of the Task Force on Epidemiology of the International Association for the Study of Pain. Crombie IK, Croft PR, Linton SJ, LeResche L, Korff M (ed): IASP Press, Seattle, WA; 1999.
  11. Cohen SP: Epidemiology, diagnosis, and treatment of neck pain. Mayo Clin Proc. 2015, 90:284-99. 10.1016/j.mayocp.2014.09.008
  12. Mäkelä M, Heliövaara M, Sievers K, Impivaara O, Knekt P, Aromaa A: Prevalence, determinants, and consequences of chronic neck pain in Finland. Am J Epidemiol. 1991, 134:1356-67. 10.1093/oxfordjournals.aje.a116038
  13. Kim MH, Yi CH, Kwon OY, Cho SH, Yoo WG: Changes in neck muscle electromyography and forward head posture of children when carrying schoolbags. Ergonomics. 2008, 51:890-901. 10.1080/00140130701852747
  14. McDonnell MK, Sahrmann SA, Van Dillen L: A specific exercise program and modification of postural alignment for treatment of cervicogenic headache: a case report. J Orthop Sports Phys Ther. 2005, 35:3-15. 10.2519/jospt.2005.35.1.3
  15. Mallin G, Murphy S: The effectiveness of a 6-week Pilates programme on outcome measures in a population of chronic neck pain patients: a pilot study. J Bodyw Mov Ther. 2013, 17:376-84. 10.1016/j.jbmt.2013.03.003
  16. Grimmer KA, Williams MT, Gill TK: The associations between adolescent head-on-neck posture, backpack weight, and anthropometric features. Spine (Phila Pa 1976). 1999, 24:2262-7. 10.1097/00007632-199911010-00015
  17. Gadotti IC, Armijo-Olivo S, Silveira A, Magee D: Reliability of the craniocervical posture assessment: visual and angular measurements using photographs and radiographs. J Manipulative Physiol Ther. 2013, 36:619-25. 10.1016/j.jmpt.2013.09.002
  18. Garrett TR, Youdas JW, Madson TJ: Reliability of measuring forward head posture in a clinical setting. J Orthop Sports Phys Ther. 1993, 17:155-60. 10.2519/jospt.1993.17.3.155
  19. Grimmer-Somers K, Milanese S, Louw Q: Measurement of cervical posture in the sagittal plane. J Manipulative Physiol Ther. 2008, 31:509-17. 10.1016/j.jmpt.2008.08.005
  20. Hickey ER, Rondeau MJ, Corrente JR, et al.: Reliability of the cervical range of motion (CROM) device and plumb-line techniques in measuring resting head posture (RHP). J Man Manip Ther. 2000, 8:10-17. 10.1179/106698100790811346
  21. Kandasamy G, Bettany-Saltikov J, Schaik P: Posture and back shape measurement tools: a narrative literature review. Spinal Deformities in Adolescents, Adults and Older Adults. Bettany-Saltikov J, Kandasam G (ed): IntechOpen, London; 2020. 10.5772/intechopen.91803
  22. Galanis P: Validity and reliability of questionnaires in epidemiological studies (Article in Greek). Arch Hell Med. 2013, 30:97-110.
  23. Aaronson N, Alonso J, Burnam A, Lohr KN, Patrick DL, Perrin E, Stein RE: Assessing health status and quality-of-life instruments: attributes and review criteria. Qual Life Res. 2002, 11:193-205. 10.1023/a:1015291021312
  24. Terwee CB, Bot SD, de Boer MR, et al.: Quality criteria were proposed for measurement properties of health status questionnaires. J Clin Epidemiol. 2007, 60:34-42. 10.1016/j.jclinepi.2006.03.012
  25. Nunnally JC, Bernstein IH: Psychometric theory. McGraw-Hill, New York; 1994.
  26. Higgins PA, Straub AJ: Understanding the error of our ways: mapping the concepts of validity and reliability. Nurs Outlook. 2006, 54:23-9. 10.1016/j.outlook.2004.12.004
  27. Carmines EG, Zeller RA: Reliability and Validity. Sage Publications, Newbury Park, CA; 1979. 10.4135/9781412985642
  28. Brink Y, Louw QA: Clinical instruments: reliability and validity critical appraisal. J Eval Clin Pract. 2012, 18:1126-32. 10.1111/j.1365-2753.2011.01707.x
  29. Whiting PF, Rutjes AW, Westwood ME, et al.: QUADAS-2: a revised tool for the quality assessment of diagnostic accuracy studies. Ann Intern Med. 2011, 155:529-36. 10.7326/0003-4819-155-8-201110180-00009
  30. Lucas NP, Macaskill P, Irwig L, Bogduk N: The development of a quality appraisal tool for studies of diagnostic reliability (QAREL). J Clin Epidemiol. 2010, 63:854-61. 10.1016/j.jclinepi.2009.10.002
  31. Barrett E, McCreesh K, Lewis J: Reliability and validity of non-radiographic methods of thoracic kyphosis measurement: a systematic review. Man Ther. 2014, 19:10-7. 10.1016/j.math.2013.09.003
  32. van der Wurff P, Hagmeijer RH, Meyne W: Clinical tests of the sacroiliac joint. A systematic methodological review. Part 1: Reliability. Man Ther. 2000, 5:30-6. 10.1054/math.1999.0228
  33. May S, Chance-Larsen K, Littlewood C, Lomas D, Saad M: Reliability of physical examination tests used in the assessment of patients with shoulder problems: a systematic review. Physiotherapy. 2010, 96:179-90. 10.1016/j.physio.2009.12.002
  34. Adhia DB, Bussey MD, Ribeiro DC, Tumilty S, Milosavljevic S: Validity and reliability of palpation-digitization for non-invasive kinematic measurement - a systematic review. Man Ther. 2013, 18:26-34. 10.1016/j.math.2012.06.004
  35. van Tulder M, Furlan A, Bombardier C, Bouter L: Updated method guidelines for systematic reviews in the cochrane collaboration back review group. Spine (Phila Pa 1976). 2003, 28:1290-9. 10.1097/01.BRS.0000065484.95996.AF
  36. Munro BH: Statistical Methods for Health Care Research. Lippincott Williams and Wilkins, Philadelphia; 2005.
  37. Liberati A, Altman DG, Tetzlaff J, et al.: The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: explanation and elaboration. BMJ. 2009, 339:b2700. 10.1136/bmj.b2700
  38. Gallego-Izquierdo T, Arroba-Díaz E, García-Ascoz G, Val-Cano MD, Pecos-Martin D, Cano-de-la-Cuerda R: Psychometric proprieties of a mobile application to measure the craniovertebral angle a validation and reliability study. Int J Environ Res Public Health. 2020, 17:6521. 10.3390/ijerph17186521
  39. Hopkins BB, Vehrs PR, Fellingham GW, George JD, Hager R, Ridge ST: Validity and reliability of standing posture measurements using a mobile application. J Manipulative Physiol Ther. 2019, 42:132-40. 10.1016/j.jmpt.2019.02.003
  40. Lau HM, Chiu TT, Lam TH: Measurement of craniovertebral angle with electronic head posture instrument: criterion validity. J Rehabil Res Dev. 2010, 47:911-8. 10.1682/jrrd.2010.01.0001
  41. Ruivo RM, Pezarat-Correia P, Carita AI, Vaz JR: Reliability and validity of angular measures through the software for postural assessment: postural assessment software (Article in Spanish). Rehabilitacion (Madr). 2013, 47:223-8. 10.1016/j.rh.2013.07.002
  42. van Niekerk SM, Louw Q, Vaughan C, Grimmer-Somers K, Schreve K: Photographic measurement of upper-body sitting posture of high school students: a reliability and validity study. BMC Musculoskelet Disord. 2008, 9:113. 10.1186/1471-2474-9-113
  43. Dimitriadis Z, Podogyros G, Polyviou D, Tasopoulos I, Passa K: The reliability of lateral photography for the assessment of the forward head posture through four different angle-based analysis methods in healthy individuals. Musculoskeletal Care. 2015, 13:179-86. 10.1002/msc.1095
  44. Dunk NM, Lalonde J, Callaghan JP: Implications for the use of postural analysis as a clinical diagnostic tool: reliability of quantifying upright standing spinal postures from photographic images. J Manipulative Physiol Ther. 2005, 28:386-92. 10.1016/j.jmpt.2005.06.006
  45. Gadotti IC, Biasotto-Gonzalez DA: Sensitivity of clinical assessments of sagittal head posture. J Eval Clin Pract. 2010, 16:141-4. 10.1111/j.1365-2753.2009.01137.x
  46. Moradi N, Maroufi N, Bijankhan M, et al.: Intrarater and interrater reliability of sagittal head posture: a novel technique performed by a physiotherapist and a speech and language pathologist. J Voice. 2014, 28:842.e11-6. 10.1016/j.jvoice.2014.02.014
  47. Nam SH, Son SM, Kwon JW, Lee NK: The intra- and inter-rater reliabilities of the forward head posture assessment of normal healthy subjects. J Phys Ther Sci. 2013, 25:737-9. 10.1589/jpts.25.737
  48. Salahzadeh Z, Maroufi N, Ahmadi A, Behtash H, Razmjoo A, Gohari M, Parnianpour M: Assessment of forward head posture in females: observational and photogrammetry methods. J Back Musculoskelet Rehabil. 2014, 27:131-9. 10.3233/BMR-130426
  49. Ferreira EA, Duarte M, Maldonado EP, Burke TN, Marques AP: Postural assessment software (PAS/SAPO): validation and reliability. Clinics (Sao Paulo). 2010, 65:675-81. 10.1590/S1807-59322010000700005
  50. Ruivo RM, Pezarat-Correia P, Carita AI: Intrarater and interrater reliability of photographic measurement of upper-body standing posture of adolescents. J Manipulative Physiol Ther. 2015, 38:74-80. 10.1016/j.jmpt.2014.10.009
  51. Souza JA, Pasinato F, Basso D, Corrêa EC, Silva AM: Biophotogrammetry: reliability of measurements obtained with a posture assessment software (SAPO) (Article in Portuguese). Rev Bras Cineantropom Desempenho Hum. 2011, 13:299-305.
  52. Weber P, Correa EC, Milanesi JM, Soares JC, Trevisan ME: Craniocervical posture: cephalometric and biophotogrammetric analysis. Braz J Oral Sci. 2012, 11:416-21.
  53. Cote R, Vietas C, Kolakowski M, Lombardo K, Prete J, Dashottar A: Inter and intra-rater reliability of measuring photometric craniovertebral angle using a cloud-based video communication platform. Int J Telerehabil. 2021, 13:e6346. 10.5195/ijt.2021.6346
  54. Gray K, Dalal R, Davis Weaver J, Randolph S: Quantitative measurements of forward head in college-aged students: a conformational study of intra-rater and inter-rater reliability of a novel posture measuring device. J Bodyw Mov Ther. 2021, 26:233-7. 10.1016/j.jbmt.2020.12.008
  55. Lee CH, Lee S, Shin G: Reliability of forward head posture evaluation while sitting, standing, walking and running. Hum Mov Sci. 2017, 55:81-6. 10.1016/j.humov.2017.07.008
  56. Greendale GA, Nili NS, Huang MH, Seeger L, Karlamangla AS: The reliability and validity of three non-radiological measures of thoracic kyphosis and their relations to the standing radiological Cobb angle. Osteoporos Int. 2011, 22:1897-905. 10.1007/s00198-010-1422-z
  57. Stochkendahl MJ, Christensen HW, Hartvigsen J, et al.: Manual examination of the spine: a systematic critical literature review of reproducibility. J Manipulative Physiol Ther. 2006, 29:475-85, 485.e1-10. 10.1016/j.jmpt.2006.06.011



Author Information

Konstantinos Mylonas

Department of Physiotherapy, Laboratory of Therapeutic Exercise and Sports Rehabilitation, University of Patras, Aigio, GRC

Maria Tsekoura Corresponding Author

Department of Physiotherapy, Laboratory of Therapeutic Exercise and Sports Rehabilitation, University of Patras, Aigio, GRC

Evdokia Billis

Department of Physiotherapy, Laboratory of Therapeutic Exercise and Sports Rehabilitation, University of Patras, Aigio, GRC

Pavlos Aggelopoulos

Department of Physiotherapy, Laboratory of Therapeutic Exercise and Sports Rehabilitation, University of Patras, Aigio, GRC

Elias Tsepis

Department of Physiotherapy, Laboratory of Therapeutic Exercise and Sports Rehabilitation, University of Patras, Aigio, GRC

Konstantinos Fousekis

Department of Physiotherapy, Laboratory of Therapeutic Exercise and Sports Rehabilitation, University of Patras, Aigio, GRC


Ethics Statement and Conflict of Interest Disclosures

Conflicts of interest: In compliance with the ICMJE uniform disclosure form, all authors declare the following: Payment/services info: All authors have declared that no financial support was received from any organization for the submitted work. Financial relationships: All authors have declared that they have no financial relationships at present or within the previous three years with any organizations that might have an interest in the submitted work. Other relationships: All authors have declared that there are no other relationships or activities that could appear to have influenced the submitted work.


