
Low-cost volume estimation by two-view acquisitions: a computational intelligence approach / R. Donida Labati, A. Genovese, V. Piuri, F. Scotti. - In: Proceedings of the International Joint Conference on Neural Networks (IJCNN). - Piscataway : Institute of Electrical and Electronics Engineers (IEEE), Jun. 2012. - ISBN 9781467314886. - pp. 1-8. (International Joint Conference on Neural Networks, July 12th-15th, held in Brisbane in 2012) [10.1109/IJCNN.2012.6252515].

Low-cost volume estimation by two-view acquisitions: a computational intelligence approach

R. Donida Labati; A. Genovese; V. Piuri; F. Scotti
2012

Abstract

The estimation of the volume occupied by an object is an important task in the fields of granulometry, quality control, and archaeology. An accurate and well-known technique for volume measurement is based on Archimedes' principle. However, in many applications this technique cannot be used, and faster contact-less techniques based on image processing or laser scanning must be adopted. In this work, we propose a low-cost approach to the volume estimation of different kinds of objects using a two-view vision approach. The method first computes a reduced three-dimensional model from a single pair of images, then extracts a series of features from the obtained model. Lastly, the features are processed using a computational intelligence approach, which learns the relation between the features and the volume of the captured object in order to estimate the volume independently of its position and angle, and without computing a full three-dimensional model. Results show that the approach is feasible and can produce an accurate volume estimation. Compared to the direct computation of the volume from the three-dimensional models, the approach is more accurate and less dependent on the position and angle of the measured objects with respect to the cameras.
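The following Python sketch illustrates the pipeline the abstract describes, under stated assumptions: it is not the paper's code. A synthetic ellipsoid point cloud stands in for the reduced three-dimensional model that would come from a calibrated stereo pair, simple geometric statistics stand in for the paper's features, and a small multilayer perceptron stands in for the computational intelligence stage. All function names, features, and data below are illustrative assumptions.

```python
# Hypothetical sketch of the described pipeline:
# (1) reduced 3-D point cloud, (2) feature extraction,
# (3) learned feature -> volume regression. Illustrative only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def extract_features(points: np.ndarray) -> np.ndarray:
    """Summarize an (N, 3) point cloud with simple statistics that are
    tolerant to position and angle (assumed features, not the paper's)."""
    extents = points.max(axis=0) - points.min(axis=0)      # bounding-box sides
    centered = points - points.mean(axis=0)
    eigvals = np.linalg.eigvalsh(np.cov(centered.T))       # shape spread (ascending)
    return np.concatenate([extents, eigvals[::-1], [points.shape[0]]])

# Synthetic training data: ellipsoid surface clouds with known true volume.
# In the paper's setting, the cloud would come from the two-view reconstruction.
rng = np.random.default_rng(0)
X, y = [], []
for _ in range(500):
    semi_axes = rng.uniform(1.0, 5.0, size=3)
    dirs = rng.normal(size=(400, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)    # unit sphere directions
    pts = dirs * semi_axes                                 # scale to ellipsoid surface
    X.append(extract_features(pts))
    y.append(4.0 / 3.0 * np.pi * np.prod(semi_axes))       # true ellipsoid volume
X, y = np.asarray(X), np.asarray(y)

# A small neural network stands in for the "computational intelligence" stage.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
)
model.fit(X[:400], y[:400])
print("held-out R^2: %.3f" % model.score(X[400:], y[400:]))
```

The point of the sketch is the structure, not the numbers: once a regressor has learned the feature-to-volume mapping, the volume can be estimated from a partial two-view model without ever building a full three-dimensional reconstruction.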
Keywords: silhouette
Settore INF/01 - Informatica (Computer Science)
Settore ING-INF/05 - Sistemi di Elaborazione delle Informazioni (Information Processing Systems)
Jun-2012
Institute of Electrical and Electronics Engineers (IEEE)
Book Part (author)
Files in this product:

File: ijcnn2012_volume.pdf
Access: open access
Type: Post-print, accepted manuscript, etc. (version accepted by the publisher)
Size: 525.4 kB
Format: Adobe PDF

File: Low-cost_volume_estimation_by_two-view_acquisitions_A_computational_intelligence_approach.pdf
Access: restricted access
Type: Publisher's version/PDF
Size: 1.12 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/197153
Citations
  • PMC: ND
  • Scopus: 8
  • Web of Science: 0