Optic Nerve Atrophy Conditions Associated With 3D Unsegmented Optical Coherence Tomography Volumes Using Deep Learning.
David Szanto, Jui-Kai Wang, Brian Woods, Asala Erekat, Mona Garvin, Randy Kardon, Mark J Kupersmith
Summary
Deep learning-based analysis of unsegmented OCT scans reliably distinguished different forms of optic nerve atrophy, suggesting that the scans contain subtle, disease-specific structural patterns.
Abstract
IMPORTANCE
Accurate differentiation of optic nerve head (ONH) atrophy is vital for guiding diagnosis and treatment of conditions such as glaucoma, nonarteritic anterior ischemic optic neuropathy (NAION), and optic neuritis. Traditional 2-dimensional assessments may overlook subtle, volumetric changes.
OBJECTIVE
To determine whether a 3-dimensional (3D) deep learning model trained on unsegmented ONH optical coherence tomography (OCT) scans can reliably distinguish optic atrophy in glaucoma, NAION, optic neuritis, and healthy eyes.
DESIGN, SETTING, AND PARTICIPANTS
This cross-sectional study used data from multiple clinical trials and referral centers (2008-2025), including randomized trials, longitudinal studies, and referral clinics. Participants included patients with glaucoma, NAION, or optic neuritis and healthy controls.
EXPOSURES
Three ResNet-3D-18 models were trained using 5-fold stratified cross-validation. One assessed the full OCT volume, another focused only on the peripapillary region (PPR), and the third considered only the ONH. Identical data splits were used to allow direct performance comparison.
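The cross-validation setup described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the fold seed, crop windows, and array sizes are assumptions, and toy random arrays stand in for the real OCT volumes. The key point it demonstrates is that a single seeded `StratifiedKFold` produces identical train/validation splits for all three models, so performance differences reflect the input region rather than the split.

```python
# Illustrative sketch (not the authors' code): identical stratified 5-fold
# splits shared by the full-volume, PPR-only, and ONH-only models.
import numpy as np
from sklearn.model_selection import StratifiedKFold

rng = np.random.default_rng(0)

# Toy stand-ins: 100 "eyes", each a small OCT volume (depth, height, width),
# with one of four diagnoses per eye.
labels = rng.integers(0, 4, size=100)    # 0=glaucoma, 1=NAION, 2=ON, 3=healthy
volumes = rng.normal(size=(100, 16, 64, 64))

def crop_onh(vol):
    """Keep only a central optic-nerve-head window (illustrative crop)."""
    return vol[..., 16:48, 16:48]

def crop_ppr(vol):
    """Zero out the central ONH window, leaving the peripapillary region."""
    ppr = vol.copy()
    ppr[..., 16:48, 16:48] = 0.0
    return ppr

# One StratifiedKFold with a fixed seed: every model sees the same
# train/validation eyes per fold, preserving class balance in each fold.
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
folds = list(skf.split(np.zeros(len(labels)), labels))

for fold_id, (train_idx, val_idx) in enumerate(folds):
    full_train = volumes[train_idx]             # entire-volume model input
    ppr_train = crop_ppr(volumes[train_idx])    # PPR-only model input
    onh_train = crop_onh(volumes[train_idx])    # ONH-only model input
    # ...train one ResNet-3D-18 per input variant on these arrays...
```

Because `random_state` is fixed, re-instantiating the splitter reproduces the exact same folds, which is what makes the three models directly comparable.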
MAIN OUTCOMES AND MEASURES
Classification accuracy, macro area under the receiver operating characteristic curve (AUC-ROC), precision, recall, and F1 scores, aggregated across all validation folds. Confusion matrices were generated to characterize misclassifications.
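As a rough illustration of how these outcome measures might be computed from pooled validation-fold predictions, the sketch below uses scikit-learn on synthetic data; none of the numbers, and no implementation detail beyond the named metrics, come from the paper.

```python
# Illustrative sketch (assumed implementation, not the authors' code):
# computing the reported metrics from pooled validation predictions.
import numpy as np
from sklearn.metrics import (accuracy_score, confusion_matrix, f1_score,
                             precision_score, recall_score, roc_auc_score)

rng = np.random.default_rng(1)

# Toy pooled predictions over all validation folds: true labels and
# per-class predicted probabilities for the four diagnoses.
y_true = rng.integers(0, 4, size=400)
probs = rng.dirichlet(np.ones(4), size=400)
# Bias probabilities toward the true class so the toy metrics are informative.
probs[np.arange(400), y_true] += 0.5
probs /= probs.sum(axis=1, keepdims=True)
y_pred = probs.argmax(axis=1)

metrics = {
    "accuracy": accuracy_score(y_true, y_pred),
    # Macro AUC-ROC: one-vs-rest AUC averaged over the four classes.
    "macro_auc": roc_auc_score(y_true, probs, multi_class="ovr", average="macro"),
    "precision": precision_score(y_true, y_pred, average=None, zero_division=0),
    "recall": recall_score(y_true, y_pred, average=None, zero_division=0),
    "f1": f1_score(y_true, y_pred, average=None, zero_division=0),
}
# Rows = true class, columns = predicted class; off-diagonal entries
# characterize misclassifications (e.g., optic neuritis predicted as NAION).
cm = confusion_matrix(y_true, y_pred)
```

With `average=None`, precision, recall, and F1 come back as one value per class, matching the per-disease F1 scores reported in the results.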
RESULTS
A total of 7014 Cirrus ONH OCT scans from 1382 eyes of patients with glaucoma (n = 113), NAION (n = 311), or optic neuritis (n = 163) and of healthy controls (n = 715) were analyzed. The mean (SD) age was 54.2 (16.9) years; there were 733 (65%) male patients and 402 (35%) female patients. The entire-volume model achieved 88.9% accuracy (macro AUC-ROC, 0.977; 95% CI, 0.974-0.979) and F1 scores of 0.94, 0.87, 0.78, and 0.91 for glaucoma, NAION, optic neuritis, and healthy eyes, respectively. The PPR-only model reached 85.9% accuracy (AUC-ROC, 0.970; 95% CI, 0.967-0.972), while the ONH-only model attained 87.0% accuracy (AUC-ROC, 0.972; 95% CI, 0.970-0.975); both achieved F1 scores ranging from 0.71 to 0.94. Optic neuritis presented the greatest classification challenge, being misclassified as NAION or as healthy when axonal loss was severe or minimal, respectively. Activation maps revealed disease-specific regions of interest in the retina, including the retinal nerve fiber layer, ganglion cell layer, and retinal pigment epithelium.
CONCLUSIONS AND RELEVANCE
Deep learning-based analysis of unsegmented OCT scans reliably distinguished different forms of optic nerve atrophy, suggesting that the scans contain subtle, disease-specific structural patterns. This automated approach may support diagnostic efforts, guide clinical management of optic neuropathies, and complement less standardized imaging modalities and subjective clinical impressions.