Neural-based quality measurement of fingerprint images in contactless biometric systems / R. Donida Labati, V. Piuri, F. Scotti (PROCEEDINGS OF ... INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS). - In: Neural Networks (IJCNN), The 2010 International Joint Conference on. - [s.l.]: Institute of Electrical and Electronics Engineers (IEEE), Jul. 2010. - ISBN 9781424469161. - pp. 1-8 ((conference: WCCI World Congress on Computational Intelligence, held in Barcelona, 2010)) [10.1109/IJCNN.2010.5596694].

Neural-based quality measurement of fingerprint images in contactless biometric systems

R. Donida Labati (first author); V. Piuri (second author); F. Scotti (last author)
2010

Abstract

Traditional fingerprint biometric systems capture the user's fingerprint images with a contact-based sensor. In contrast, contactless systems aim to capture fingerprint images with a vision-based approach that does not require any contact between the user and the sensor. The user's finger is placed in front of a dedicated CCD-based system that captures the pattern of ridges and valleys of the fingertip. This approach is less constrained from the user's point of view, but it requires a greater capability of the system to deal with focusing on a moving target, illumination problems, and the complexity of the background in the captured image. During the acquisition procedure, the quality of each frame must be carefully evaluated in order to extract from the sequence only the correct frames containing valuable biometric information. In this paper, we present a neural-based approach for estimating the quality of contactless fingertip images. The application of neural classification models allows for a considerable reduction in computational complexity, permitting real-time operation. Experimental results show that the proposed method achieves adequate accuracy and can capture fingerprints at distances of up to 0.2 meters.
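The abstract only outlines the idea of screening acquisition frames with a neural classifier; the paper's actual features and network architecture are not reported here. The following minimal sketch illustrates the general technique under stated assumptions: the per-frame quality features (gradient-energy focus measure, contrast, exposure level), the use of scikit-learn's MLPClassifier, and the placeholder training data are all hypothetical choices for illustration, not the authors' method.

```python
# Sketch of neural-based frame-quality screening (assumptions, not the paper's pipeline):
# simple quality features are computed per frame and a small MLP labels the frame
# as usable (1) or not (0); only usable frames are kept for fingerprint processing.
import numpy as np
from sklearn.neural_network import MLPClassifier

def frame_features(gray):
    """Compute simple quality features from a grayscale frame (2-D array)."""
    gx, gy = np.gradient(gray.astype(float))
    focus = np.mean(gx ** 2 + gy ** 2)   # gradient energy as a focus/sharpness measure
    contrast = gray.std()                # global contrast
    exposure = gray.mean()               # average brightness
    return np.array([focus, contrast, exposure])

# Hypothetical training set: feature vectors of frames hand-labelled good/bad.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 3))      # placeholder features
y_train = (X_train[:, 0] > 0).astype(int)  # placeholder labels

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)

# At acquisition time, keep only the frames the classifier judges as good quality.
frame = rng.random((480, 640))           # stand-in for a captured frame
keep = clf.predict(frame_features(frame).reshape(1, -1))[0] == 1
```

In a real system the classifier would be trained on features extracted from frames labelled by quality inspection, and the lightweight feature-plus-small-network design is what makes per-frame evaluation feasible in real time.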
recognition
Settore INF/01 - Informatica (Computer Science)
Jul 2010
Institute of Electrical and Electronics Engineers (IEEE)
Book Part (author)
Files in this record:
File: Neural-based_quality_measurement_of_fingerprint_images_in_contactless_biometric_systems.pdf
  Access: restricted (copy available on request)
  Type: Publisher's version/PDF
  Size: 912.17 kB
  Format: Adobe PDF
File: IJCNN2010_web.pdf
  Access: open access
  Type: Post-print, accepted manuscript (version accepted by the publisher)
  Size: 1.1 MB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/148698
Citations
  • PMC: not available
  • Scopus: 29
  • Web of Science (ISI): 3