DRIONS-DB: Digital Retinal Images for Optic Nerve Segmentation Database
This is a public database for benchmarking optic nerve head segmentation in digital retinal images. The research community is invited to test their algorithms on it and to compare and share results with other researchers through this web site. For more information, refer to the documentation.
Julián García Feijoo(a)
José María Martínez de la Casa(a)
Hospital Miguel Servet(b)
Mariano Rincón Zamorano(c)
Margarita Bachiller Mayoral(c)
Enrique J. Carmona Suárez(c)
(a)Departamento de Glaucoma, Servicio de Oftalmología, Hospital Clínico San Carlos, Instituto de Investigaciones Ramón Castroviejo, Universidad Complutense, Madrid, Spain.
(b)Servicio de Oftalmología, Hospital Miguel Servet, Zaragoza, Spain.
(c)Departamento de Inteligencia Artificial, Escuela Técnica Superior de Ingeniería Informática, Universidad Nacional de Educación a Distancia (UNED), C/Juan del Rosal 16, 28040 Madrid, Spain.
DRIONS-DB is Copyright © 2008 by Julián García Feijoo, Jose María Martínez de la Casa, Hospital Miguel Servet, Mariano Rincón Zamorano, Margarita Bachiller and Enrique J. Carmona Suárez.
The data included in this database can be used, free of charge, for research and educational purposes. Modifying the data is not allowed. Copying, redistribution, and any unauthorized commercial use are prohibited. Inclusion of this database, or any part of it, in a proprietary program is not allowed without written permission from the copyright owners. Use of this database is restricted to individuals or organizations that obtained it directly from this web site.
Any researcher reporting results that use this database must acknowledge DRIONS-DB. We ask that you do so by citing the following publication (if you need a copy of this paper, please contact E.J. Carmona):
E.J. Carmona, M. Rincón, J. García-Feijoo and J. M. Martínez-de-la-Casa (2008). Identification of the optic nerve head with genetic algorithms. Artificial Intelligence in Medicine, Vol. 43(3), pp. 243-259. [DOI] [Results and Demos].
The database consists of 110 colour digital retinal images. Initially, 124 eye fundus images were randomly selected from an eye fundus image base belonging to the Ophthalmology Service at Miguel Servet Hospital, Saragossa (Spain). From this initial set, all eye images showing some type of cataract (severe or moderate), 14 in total, were eliminated, yielding the final image base of 110 images.
Table 1 shows, for the 110 selected images, the visual characteristics related to potential problems that may distort detection of the papillary contour. The mean age of the patients was 53.0 years (S.D. 13.05); 46.2% were male and 53.8% female, and all were of Caucasian ethnicity. 23.1% of the patients had chronic simple glaucoma and 76.9% ocular hypertension. The images were acquired with an analogue colour fundus camera, approximately centred on the ONH, and stored in slide format. To obtain the images in digital format, the slides were digitised with an HP PhotoSmart S20 high-resolution scanner, in RGB format, at a resolution of 600x400 and 8 bits/pixel.
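As a quick sanity check on the acquisition specs above, the raw (uncompressed) size of the digitised image base can be estimated. The sketch below assumes the stated 8 bits apply to each of the three RGB channels; that per-channel interpretation is an assumption, not stated explicitly in the text.

```python
# Estimate raw storage for the digitised image base, from the stated specs:
# 110 images, 600x400 pixels, RGB (3 channels), assumed 8 bits per channel.
width, height, channels, bits_per_channel = 600, 400, 3, 8
n_images = 110

bytes_per_image = width * height * channels * bits_per_channel // 8
total_mib = n_images * bytes_per_image / 2**20

print(bytes_per_image)      # 720000 bytes per uncompressed image
print(round(total_mib, 1))  # about 75.5 MiB for the full uncompressed set
```

In practice the distributed images are compressed, so the actual download is smaller; the figure above is only an upper bound on in-memory size when all images are loaded at once.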
For more background information on the database, consult the publications listed at the end of this page.
|Visual characteristic|Number of images|
|---|---|
|Cataract (severe or moderate)| |
|Part of the rim blurred or missing| |
|Moderate peripapillary atrophy| |
|Concentric peripapillary atrophy/artifacts| |
|Strong pallor distractor| |

Table 1. Image base visual characteristics
Eye fundus image examples:
Image002 Image025 Image082 Image092
Independent contours were collected from two medical experts using a software tool provided for image annotation. An expert was defined as a person with medical training and solid experience in ophthalmology. In each image, each expert traced the contour by selecting the most significant papillary contour points, and the annotation tool automatically connected adjacent points with a curve. As expected, there was inter-observer variability between the two experts tracing the contour of a papilla. For this reason, in our work we used as gold standard, for each image, the average of the contours traced by the two experts. The ground truth of each expert can be downloaded here. For more background information on the database, consult the publications listed at the end of this page.
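One simple way to average two expert contours, as described above, is to resample each closed contour to the same number of points by arc length and then average pointwise. The NumPy sketch below illustrates the idea on toy circular contours; the function names and the toy data are illustrative only and are not part of the database's annotation tooling, and it assumes both contours start near the same angular position.

```python
import numpy as np

def resample_closed(points, n=200):
    """Resample a closed 2-D contour to n points evenly spaced by arc length."""
    pts = np.vstack([points, points[:1]])           # close the loop
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])     # cumulative arc length
    t = np.linspace(0.0, s[-1], n, endpoint=False)  # even arc-length samples
    x = np.interp(t, s, pts[:, 0])
    y = np.interp(t, s, pts[:, 1])
    return np.column_stack([x, y])

def average_contours(c1, c2, n=200):
    """Pointwise average of two resampled contours (gold-standard proxy)."""
    return 0.5 * (resample_closed(c1, n) + resample_closed(c2, n))

# Toy example: two concentric circles of radius 10 and 12; their average
# should approximate a circle of radius 11.
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
c1 = np.column_stack([10 * np.cos(theta), 10 * np.sin(theta)])
c2 = np.column_stack([12 * np.cos(theta), 12 * np.sin(theta)])
avg = average_contours(c1, c2)
radii = np.linalg.norm(avg, axis=1)
print(round(radii.mean(), 1))  # 11.0
```

Pointwise averaging after arc-length resampling is only one reasonable choice; the published work should be consulted for the exact averaging procedure used to build the gold standard.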
Eye fundus image examples with optic nerve contours traced by the two experts (only the papilla area is shown):
Image002 Image025 Image082 Image092
You can download the DRIONS database from the following link.
- M.S. Haleem, L. Han, J. van Hemert and B. Li (2013). Automatic Extraction of Retinal Features from Color Retinal Images for Glaucoma Diagnosis: A Review. Computerized Medical Imaging and Graphics, Vol. 37(7-8), pp. 581-596.
- S. Morales, V. Naranjo, J. Angulo and M. Alcañiz (2013). Automatic Detection of Optic Disc Based on PCA and Mathematical Morphology. IEEE Transactions on Medical Imaging, Vol. 32, pp. 786-796.
- J.M. Molina and E.J. Carmona (2013). Pattern Detection in Images Using LBP-Based Relational Operators. Natural and Artificial Computation in Engineering and Medical Applications, J.M. Ferrández et al (eds.), LNCS-7931, Springer-Verlag, pp. 11-20.
- S. Morales, V. Naranjo, D. Pérez, A. Navea and M. Alcañiz (2012). Automatic detection of optic disc based on PCA and stochastic watershed. Proceedings of the 20th European Signal Processing Conference (EUSIPCO 2012), pp. 2605-2609.
- J.M. Molina and E.J. Carmona (2011). Localization and Segmentation of the Optic Nerve Head in Eye Fundus Images using Pyramid Representation and Genetic Algorithms. In Proceedings of IWINAC 2011: J.M. Ferrández et al (eds.), Bioinspired Applications in Artificial and Natural Computation, LNCS-6686, Springer-Verlag, pp. 431-440.
- E.J. Carmona, M. Rincón, J. García-Feijoo and J. M. Martínez-de-la-Casa (2008). Identification of the optic nerve head with genetic algorithms. Artificial Intelligence in Medicine, Vol. 43(3), pp. 243-259. [ Results and Demos]
- M. Rincón, M. Bachiller, J. Mira (2005). Knowledge modeling for the image understanding task as a design task. Expert Systems with Applications. Vol. 29(1), pp. 207-217.