Texture Defragmentation for Photo-Reconstructed 3D Models / A. Maggiordomo, P. Cignoni, M. Tarini. In: COMPUTER GRAPHICS FORUM. ISSN 0167-7055. 40:2 (4 Jun 2021), pp. 65-78. Paper presented at the 42nd EUROGRAPHICS conference (Annual Conference of the European Association for Computer Graphics), held in Vienna, Austria, 2021 [10.1111/cgf.142615].

Texture Defragmentation for Photo-Reconstructed 3D Models

A. Maggiordomo (first author); M. Tarini (last author)
2021

Abstract

We propose a method to improve an existing parametrization (UV-map layout) of a textured 3D model, explicitly targeted at alleviating the typical defects afflicting models generated with automatic photo-reconstruction tools from real-world objects. This class of 3D data is becoming increasingly important thanks to the growing popularity of reliable, ready-to-use photogrammetry software packages. The resulting textured models are richly detailed, but their underlying parametrization typically falls short of many practical requirements, in particular exhibiting excessive fragmentation and the problems that follow from it. Producing a completely new UV-map with standard parametrization techniques and then resampling a new texture image is often neither practical nor desirable, for at least two reasons: first, these models have characteristics (such as inconsistencies and high resolution) that make them unfit for automatic or manual parametrization; second, the required resampling leads to unnecessary signal degradation, because this process is unaware of the original texel densities. In contrast, our method improves the existing UV-map instead of replacing it, balancing the reduction of map fragmentation against the signal degradation caused by resampling, while also avoiding oversampling of the original signal. The proposed approach is fully automatic and extensively tested on a large benchmark of photo-reconstructed models; a quantitative evaluation shows a drastic and consistent improvement of the mappings.
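For intuition only, here is a minimal sketch of the kind of trade-off the abstract describes: a greedy loop that merges adjacent UV charts while the seam length a merge removes outweighs an estimated distortion cost. This is not the paper's actual algorithm (which jointly re-parametrizes charts and handles the texture signal); the Chart structure, the cost model, and the trade_off parameter are all assumptions made for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Chart:
    boundary_len: float                                          # UV-space seam length of this chart
    merge_cost: dict[int, float] = field(default_factory=dict)   # neighbor id -> estimated distortion if merged

def defragment(charts: dict[int, Chart], trade_off: float = 0.5) -> list[tuple[int, int]]:
    """Greedily accept a merge while the seam length it removes exceeds
    trade_off times the distortion it is estimated to add. Hypothetical
    cost model; a real method would also re-flatten and repack the atlas."""
    merges = []
    improved = True
    while improved:
        improved = False
        for a_id in list(charts):
            if a_id not in charts:                # already absorbed by an earlier merge
                continue
            a = charts[a_id]
            for b_id, cost in list(a.merge_cost.items()):
                if b_id not in charts:
                    continue
                seam_gain = min(a.boundary_len, charts[b_id].boundary_len)
                if seam_gain <= trade_off * cost:
                    continue                      # merge would cost more than it saves
                b = charts.pop(b_id)              # merge b into a
                a.boundary_len += b.boundary_len - 2.0 * seam_gain
                a.merge_cost.pop(b_id, None)
                a.merge_cost.update(              # inherit b's remaining neighbors
                    {k: v for k, v in b.merge_cost.items() if k != a_id})
                merges.append((a_id, b_id))
                improved = True
    return merges

# Toy usage: merging charts 0 and 1 is cheap, merging with chart 2 is costly.
charts = {
    0: Chart(4.0, {1: 1.0, 2: 9.0}),
    1: Chart(3.0, {0: 1.0}),
    2: Chart(5.0, {0: 9.0}),
}
print(defragment(charts))   # -> [(0, 1)]
```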
CCS Concepts: • Computing methodologies → Mesh models; Texturing
Academic field: INF/01 - Informatica (Computer Science)
4 Jun 2021
https://onlinelibrary.wiley.com/doi/10.1111/cgf.142615
Article (author)
Files in this record:
File: 2021_texture_defrag_cgf.142615_COMPRESSED.pdf
Access: open access
Description: aggressively compressed file (due to anachronistic CINECA limits)
Type: Publisher's version/PDF
Format: Adobe PDF
Size: 1.96 MB

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/852562
Citations
  • PMC (PubMed Central): ND
  • Scopus: 7
  • Web of Science: 6