Frequency of Testing to Detect Visual Field Progression Derived Using a Longitudinal Cohort of Glaucoma Patients.
Summary
This study provides information on the time required to detect progression using MD trend analysis in glaucoma eyes when different testing frequencies are used.
Abstract
PURPOSE
To determine the time required to detect statistically significant progression for different rates of visual field loss using standard automated perimetry (SAP) when considering different frequencies of testing using a follow-up scheme that resembles clinical practice.
DESIGN
Observational cohort study.
PARTICIPANTS
One thousand seventy-two eyes of 665 patients with glaucoma followed up over an average of 4.3±0.9 years.
METHODS
Participants with 5 or more visual field tests over a 2- to 5-year period were included to derive the longitudinal measurement variability of SAP mean deviation (MD) using linear regressions. Estimates of variability then were used to reconstruct real-world visual field data by computer simulation to evaluate the time required to detect progression for various rates of visual field loss and different frequencies of testing. The evaluation was performed using a follow-up scheme that resembled clinical practice by requiring a set of 2 baseline tests and a confirmatory test to identify progression.
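As an illustration of the simulation approach described above, the sketch below generates noisy MD series for a given true rate of loss and testing frequency, and flags progression when an ordinary least-squares trend shows a significant negative slope. The Gaussian noise model, fixed standard deviation, significance level, and the helper names (`simulate_md_series`, `significant_negative_slope`) are illustrative assumptions, not the study's actual variability estimates or criteria.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def simulate_md_series(true_slope, years, tests_per_year, baseline_md=-2.0, sd=1.0):
    """Simulate a noisy MD series: 2 baseline tests at time 0, then evenly spaced follow-up visits."""
    # Assumed Gaussian test-retest noise with a fixed SD (illustrative only;
    # the study derived variability from its longitudinal cohort).
    times = np.concatenate(([0.0, 0.0],
                            np.arange(1, int(years * tests_per_year) + 1) / tests_per_year))
    true_md = baseline_md + true_slope * times
    return times, true_md + rng.normal(0.0, sd, size=times.size)

def significant_negative_slope(times, md, alpha=0.05):
    """MD trend analysis: one-sided test for a worsening (negative) OLS slope."""
    fit = stats.linregress(times, md)
    return fit.slope < 0 and fit.pvalue / 2 < alpha
```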
MAIN OUTCOME MEASURES
Time (in years) required to detect progression.
RESULTS
The time required to detect a statistically significant negative MD slope decreased as the frequency of testing increased, albeit not proportionally. For example, progression in eyes with an MD loss of -2 dB/year would be detected with 80% power after 3.3, 2.4, and 2.1 years when testing is performed once, twice, and thrice per year, respectively. For eyes with an MD loss of -0.5 dB/year, progression would be detected with 80% power after 7.3, 5.7, and 5.0 years, respectively.
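To show how such detection times could be estimated, the sketch below builds on the helpers above: it simulates many eyes, requires the significant negative slope to be confirmed when the next test is added, and takes the 80th percentile of the detection time for each testing frequency. The simulation size, maximum follow-up, noise level, example rate of loss, and the hypothetical `time_to_detect` function are assumptions for illustration; the output will not reproduce the study's results, which were based on variability derived from the real cohort.

```python
def time_to_detect(true_slope, tests_per_year, n_sim=2000, max_years=10,
                   sd=1.0, power=0.8):
    """Follow-up time by which `power` of simulated eyes show a confirmed
    significant negative MD slope (np.inf if never detected within max_years)."""
    detect_times = np.full(n_sim, np.inf)
    for i in range(n_sim):
        times, md = simulate_md_series(true_slope, max_years, tests_per_year, sd=sd)
        for k in range(4, times.size):
            # Significant negative slope at visit k, confirmed once the next test is included.
            if (significant_negative_slope(times[:k], md[:k]) and
                    significant_negative_slope(times[:k + 1], md[:k + 1])):
                detect_times[i] = times[k]  # time of the confirmatory visit
                break
    return float(np.quantile(detect_times, power))

for freq in (1, 2, 3):  # tests per year
    print(f"{freq}x/year: ~{time_to_detect(-1.0, freq):.1f} years")
```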
CONCLUSIONS
This study provides information on the time required to detect progression using MD trend analysis in glaucoma eyes when different testing frequencies are used. The smaller gain in time to detect progression when testing is increased from twice to thrice per year suggests that obtaining 2 reliable baseline tests followed by semiannual testing, with confirmation of progression through repeat testing in the initial years of follow-up, may provide a good compromise for detecting progression while minimizing the burden on health care resources in clinical practice.