
A novel pore extraction method for heterogeneous fingerprint images using Convolutional Neural Networks / R. Donida Labati, A. Genovese, E. Muñoz Ballester, V. Piuri, F. Scotti. - In: Pattern Recognition Letters. - ISSN 0167-8655. - 113:1 (Oct 2018), pp. 58-66.

A novel pore extraction method for heterogeneous fingerprint images using Convolutional Neural Networks

R. Donida Labati; A. Genovese; E. Muñoz Ballester; V. Piuri; F. Scotti
2018

Abstract

Most fingerprint recognition systems use Level 1 characteristics (ridge flow, orientation, and frequency) and Level 2 features (minutiae points) to recognize individuals. Level 3 features (sweat pores, incipient ridges, and other ultra-thin ridge characteristics) are adopted less frequently because they can be extracted only from high-resolution images, but they have the potential to improve every step of the biometric recognition process. In particular, sweat pores can be used for quality assessment, liveness detection, biometric matching in live applications, and matching of partial latent fingerprints in forensic applications. Currently, each type of fingerprint acquisition technique (touch-based, touchless, or latent) requires a different algorithm for pore extraction. In this paper, we propose the first method in the literature able to extract the coordinates of pores from touch-based, touchless, and latent fingerprint images. Our method uses specifically designed and trained Convolutional Neural Networks (CNNs) to estimate and refine the centroid of each pore. Results show that our method is feasible and achieves satisfactory accuracy on all the evaluated image types, outperforming the compared state-of-the-art methods.
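
As a rough illustration of the two-stage idea summarized above (a CNN produces a pore-likelihood map from which candidate centroids are extracted and then refined), the sketch below shows one possible realization in PyTorch. The architecture, layer sizes, threshold, and the local-maximum step are illustrative assumptions, not the networks designed in the paper.

    # Illustrative sketch only: the network shape and parameters are
    # assumptions, NOT the CNNs described in the paper.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class PoreMapCNN(nn.Module):
        # Fully convolutional net: grayscale fingerprint -> pore-likelihood map.
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 1, 1),  # per-pixel pore logit
            )

        def forward(self, x):
            return torch.sigmoid(self.net(x))

    def candidate_centroids(prob_map, thr=0.5, k=5):
        # Keep local maxima above thr as candidate pore centroids (row, col).
        pooled = F.max_pool2d(prob_map, k, stride=1, padding=k // 2)
        peaks = (prob_map == pooled) & (prob_map > thr)
        return peaks.nonzero()[:, 2:]  # drop batch and channel indices

    # Hypothetical usage: one 128x128 grayscale patch with values in [0, 1].
    model = PoreMapCNN()
    patch = torch.rand(1, 1, 128, 128)
    coords = candidate_centroids(model(patch))  # N x 2 tensor of (row, col)

In the paper the refinement of each centroid is performed by a second trained CNN; the local-maximum selection above merely stands in for that step in this sketch.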
Keywords: Fingerprint; Pores; Level 3; Deep Learning; Latent; CNN
Settore INF/01 - Informatica (Computer Science)
Settore ING-INF/05 - Sistemi di Elaborazione delle Informazioni (Information Processing Systems)
Funding:
- ABC GATES FOR EUROPE (ABC4EU), European Commission, FP7, grant 312797
- Enforceable Security in the Cloud to Uphold Data Ownership (ESCUDO CLOUD), European Commission, H2020, grant 644579
- COntactlesS Multibiometric mObile System in the wild: COSMOS, Ministero dell'Istruzione e del Merito, grant 201548C5NT_004
Oct 2018
3 Apr 2017
Article (author)
Files in this item:

prl_2017_web.pdf
  Access: open access
  Description: main article
  Type: post-print, accepted manuscript (version accepted by the publisher)
  Size: 811.35 kB
  Format: Adobe PDF

1-s2.0-S0167865517301058-main.pdf
  Access: restricted (copy available on request)
  Type: publisher's version/PDF
  Size: 2.33 MB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/489894
Citations:
  • PMC: not available
  • Scopus: 78
  • Web of Science: 41