Tehran University of Medical Sciences

Science Communicator Platform

Classification of Ocular Surface Diseases: Deep Learning for Distinguishing Ocular Surface Squamous Neoplasia From Pterygium



Ramezani F1, 2 ; Azimi H3 ; Delfanian B3 ; Amanollahi M2 ; Saeidian J3 ; Masoumi A2 ; Farrokhpour H2 ; Khalili Pour E2, 4 ; Khodaparast M2
Affiliations
  1. Clinical Research Development Center, Imam Khomeini, Mohammad Kermanshahi and Farabi Hospitals, Kermanshah University of Medical Sciences, Kermanshah, Iran
  2. Translational Ophthalmology Research Center, Farabi Eye Hospital, Tehran University of Medical Sciences, Tehran, Iran
  3. Faculty of Mathematical Sciences and Computer, Kharazmi University, No. 50, Taleghani Avenue, Tehran, Iran
  4. Retina Service, Farabi Eye Hospital, Tehran University of Medical Sciences, South Kargar Street, Qazvin Square, Qazvin Street, Tehran, Iran

Source: Graefe's Archive for Clinical and Experimental Ophthalmology. Published: 2025


Abstract

Purpose: Given the significance and potential risks associated with ocular surface squamous neoplasia (OSSN) and the importance of differentiating it from other conditions, we aimed to develop a deep learning (DL) model to distinguish OSSN from pterygium (PTG) using slit-lamp photographs.

Methods: A dataset of slit-lamp photographs from 162 patients was assembled, comprising 77 images of OSSN and 85 images of PTG. After manual segmentation of the images, a Python-based transfer learning approach using the EfficientNet-B7 network was employed for automated image segmentation. GoogLeNet, a pre-trained neural network, was then used to classify the images as OSSN or PTG. Model performance was evaluated with 10-fold cross-validation, and several performance metrics were measured.

Results: Mean age differed significantly between the OSSN group (63.23 ± 13.74 years) and the PTG group (47.18 ± 11.53 years) (P < .001). Furthermore, 84.41% of patients in the OSSN group and 80.00% of those in the PTG group were male. The classification model, trained on automatically segmented images, distinguished OSSN from PTG reliably, with an area under the curve (AUC) of 98%; sensitivity, F1 score, and accuracy of 94% each; and a Matthews correlation coefficient (MCC) of 88%.

Conclusions: This study presents a novel DL model that segments images and classifies OSSN versus PTG with relatively high accuracy. Beyond clinical use, the model could potentially serve as a telemedicine application. © The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2025.
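The evaluation protocol described in the abstract (10-fold cross-validation with sensitivity, F1 score, accuracy, MCC, and AUC) can be sketched as follows. This is a minimal illustrative sketch, not the authors' code: a logistic-regression stand-in replaces the GoogLeNet classifier, and synthetic features stand in for CNN image embeddings; only the cohort size (162 patients, 77 OSSN vs. 85 PTG) and the metric set are taken from the abstract.

```python
# Hypothetical sketch of stratified 10-fold cross-validation for a binary
# OSSN-vs-PTG classifier, reporting the metrics named in the abstract.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import (accuracy_score, f1_score, matthews_corrcoef,
                             recall_score, roc_auc_score)

# Synthetic stand-in dataset mirroring the cohort size (77 OSSN, 85 PTG).
X, y = make_classification(n_samples=162, n_features=32,
                           weights=[85 / 162], random_state=0)

skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
metrics = {"accuracy": [], "sensitivity": [], "f1": [], "mcc": [], "auc": []}
for train_idx, test_idx in skf.split(X, y):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    pred = clf.predict(X[test_idx])
    prob = clf.predict_proba(X[test_idx])[:, 1]
    metrics["accuracy"].append(accuracy_score(y[test_idx], pred))
    # Sensitivity is recall for the positive class.
    metrics["sensitivity"].append(recall_score(y[test_idx], pred))
    metrics["f1"].append(f1_score(y[test_idx], pred))
    metrics["mcc"].append(matthews_corrcoef(y[test_idx], pred))
    metrics["auc"].append(roc_auc_score(y[test_idx], prob))

# Average each metric across the 10 folds.
summary = {name: float(np.mean(vals)) for name, vals in metrics.items()}
print(summary)
```

Stratified folds keep the OSSN/PTG class ratio roughly constant in each of the 10 test splits, which matters for a small, mildly imbalanced cohort like this one; the per-fold metrics are then averaged into a single summary, as in the reported results.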