Abstract
Glaucoma is a class of eye disease, often associated with increased intraocular pressure, that damages the eye by destroying nerve cells in the retina as well as the optic nerve. The accuracy of a glaucoma diagnosis depends on the knowledge and experience of the ophthalmologist; therefore, automatic feature extraction from retinal images is crucial for diagnosing the disease. The current article introduces a multi-layer deep neural network model with convolution and classification layers. The originality of the current research lies in the proposed CNN model and in examining the influence of epoch values and the number of filters on the classification results in glaucoma diagnosis. The proposed approach was executed on images from the ACRIMA dataset. The results showed that epoch values of 50 and 100 have a strong influence on the classification results for all filters used, while 32 filters recorded the highest TP values and the lowest FP values for all epoch values used. Furthermore, recall, precision, and accuracy remained approximately stable and uniform across all filters and epoch values, except for an epoch value of 5, which produced variable results for the adopted metrics. Across the filters used, the F1-score can be adopted as the best metric in the evaluation process. Further studies should test and evaluate other eye diseases to help physicians diagnose such cases.
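The abstract compares recall, precision, and the F1-score computed from TP/FP counts. A minimal sketch of how these metrics follow from confusion-matrix counts is shown below; the counts used are illustrative placeholders, since the paper's actual values are not given here.

```python
# Standard metric definitions for a binary (glaucoma/normal) classifier.
# tp/fp/fn values below are illustrative, not results from the paper.

def precision(tp: int, fp: int) -> float:
    # Fraction of positive predictions that are correct.
    return tp / (tp + fp) if tp + fp else 0.0

def recall(tp: int, fn: int) -> float:
    # Fraction of actual positives that are detected.
    return tp / (tp + fn) if tp + fn else 0.0

def f1_score(tp: int, fp: int, fn: int) -> float:
    # Harmonic mean of precision and recall.
    p, r = precision(tp, fp), recall(tp, fn)
    return 2 * p * r / (p + r) if p + r else 0.0

# Hypothetical counts for one classifier configuration.
tp, fp, fn = 90, 10, 5
print(round(precision(tp, fp), 3))    # 0.9
print(round(recall(tp, fn), 3))       # 0.947
print(round(f1_score(tp, fp, fn), 3)) # 0.923
```

Because the F1-score balances precision and recall in a single number, it is a natural summary metric when the two vary independently across epoch and filter settings, which is consistent with the abstract's conclusion.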
Keywords
Epoch, F1-score, Glaucoma, Precision, Recall
Subject Area
Physics
Article Type
Article
First Page
4130
Last Page
4141
Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 International License.
How to Cite this Article
Rahim, Maysoon Jaaffar; Al-Zuky, Ali Abid Dawood; and Al-Obaidi, Fatin Ezzat Muhy Al-Dean (2025) "Classifying Glaucoma Disease Images via Deep Neural Network Technique," Baghdad Science Journal: Vol. 22: Iss. 12, Article 18.
DOI: https://doi.org/10.21123/2411-7986.5169
