e-ISSN: 0976-5166
p-ISSN: 2231-3850


INDIAN JOURNAL OF COMPUTER SCIENCE AND ENGINEERING


ABSTRACT

Title : A Novel Method Using Deep Learning Technique for Automatic Grading and Classification of Interferometry Video Frames for Dry Eye Analysis
Authors : Spoorthi K, Dr. Deepthi K Prasad, Dr. Sourav Pal, Venkatakrishnan S, Meghna S Kulkarni, Madhura Prakash M
Keywords : Lipid layer, interferometry, dry eye, convolutional neural network
Issue Date : Mar-Apr 2023
Abstract :
Maintaining good ocular health is vital to overall health and well-being. Dry eye is a syndrome that can affect people of all ages when the blink rate is inadequate; a normal mean blink rate of up to 22 blinks/min has been observed under relaxed conditions. Tears protect the eyes from irritants and infections and keep them moist: a thin layer of tear film is spread across the outer surface of the eye with each blink. Dry eye occurs when the eyes do not produce enough tears, or tears of sufficient quality. Several diagnostic techniques are available to detect dry eye syndrome. One of them is tear film interferometry, a non-invasive technique used to evaluate the quality of the tear film; it is based on the interferometric patterns that develop under an extended white light source over the outermost layer of the tear film, called the lipid layer. The interferometric patterns that develop over the lipid layer can be classified into five dry eye grades (grade 1 to grade 5) based on the colour and distribution of the fringes. This work proposes a novel method for the automatic classification of interferometry video frames to detect the dry eye condition, using a combination of computer vision and a deep learning technique based on a sequential convolutional neural network architecture. The proposed technique has achieved an overall accuracy of 86.25%, and the results look promising.
Page(s) : 336-347
ISSN : 0976-5166
Source : Vol. 14, No. 2
PDF : Download
DOI : 10.21817/indjcse/2023/v14i2/221302092
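Illustrative sketch: the abstract describes a sequential convolutional neural network that classifies interferometry video frames into five dry eye grades. The code below is a minimal, hypothetical example of such a five-class sequential CNN in Keras; the input shape, layer widths, and training settings are assumptions made for illustration and are not the architecture or hyperparameters used in the paper.

# Hypothetical sketch of a sequential CNN for five-grade classification
# of interferometry frames. Layer sizes, input shape, and optimizer are
# assumptions; they are not taken from the paper.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_GRADES = 5               # grade 1 .. grade 5
INPUT_SHAPE = (224, 224, 3)  # assumed frame size after preprocessing

def build_grading_cnn():
    model = models.Sequential([
        layers.Input(shape=INPUT_SHAPE),
        layers.Rescaling(1.0 / 255),             # scale pixel values to [0, 1]
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(NUM_GRADES, activation="softmax"),  # one output per grade
    ])
    model.compile(
        optimizer="adam",
        loss="sparse_categorical_crossentropy",   # integer grade labels 0..4
        metrics=["accuracy"],
    )
    return model

if __name__ == "__main__":
    model = build_grading_cnn()
    model.summary()

In practice the model would be trained on labelled video frames (e.g. with model.fit on a frame dataset) and per-frame predictions could then be aggregated to grade a whole interferometry video.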