Texture Inpainting for Photogrammetric Models / A. Maggiordomo, P. Cignoni, M. Tarini. - In: COMPUTER GRAPHICS FORUM. - ISSN 0167-7055. - (2023), pp. 1-16. [Epub ahead of print] [10.1111/cgf.14735]

Texture Inpainting for Photogrammetric Models

A. Maggiordomo (first author); M. Tarini (last author)
2023

Abstract

We devise a technique designed to remove the texturing artefacts that are typical of 3D models representing real-world objects acquired by photogrammetric techniques. Our technique leverages recent advancements in the inpainting of natural colour images, adapting them to this specific context. A neural network, modified and trained for our purposes, replaces the texture areas containing the defects with new plausible patches of texels reconstructed from the surrounding surface texture. We train and apply the network model on locally reparametrized texture patches, so as to provide an input that simplifies the learning process by avoiding texture seams, unused texture areas, background, depth jumps and so on. We automatically extract appropriate training data from real-world datasets. We show two applications of the resulting method: one, a fully automatic tool that addresses all problems detectable by analysing the UV-map of the input model; the other, an interactive semi-automatic tool, presented to the user as a 3D ‘fixing’ brush that removes artefacts from any zone the user paints on. We demonstrate our method on a variety of real-world inputs and provide a usable reference implementation.
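The record contains no implementation details beyond the abstract. As a rough illustration of the core step the abstract describes, masking the defective texels of a locally reparametrized patch and letting an image-inpainting network reconstruct them from the surrounding valid texels, here is a minimal Python sketch. All names (`fill_defective_texels`, `inpaint_net`, the toy mean-fill stand-in) are hypothetical placeholders, not the authors' code or API.

```python
import numpy as np

def fill_defective_texels(patch, defect_mask, inpaint_net):
    """Replace the texels flagged in defect_mask with content predicted by
    an image-inpainting model, keeping every valid texel untouched.

    patch       : (H, W, 3) float array, a locally reparametrized texture patch
    defect_mask : (H, W) float array, 1.0 where the texture is defective
    inpaint_net : callable (masked_patch, mask) -> (H, W, 3) prediction;
                  stands in for a pretrained inpainting network (placeholder)
    """
    mask3 = defect_mask[..., None]            # broadcast the mask over RGB
    masked_input = patch * (1.0 - mask3)      # hide the defective texels
    prediction = inpaint_net(masked_input, defect_mask)
    # Composite: original texels where valid, predicted texels where defective.
    return patch * (1.0 - mask3) + prediction * mask3


# Toy usage: a trivial stand-in "network" that fills the hole with the mean
# colour of the valid texels, just so the sketch runs end to end.
if __name__ == "__main__":
    patch = np.random.rand(64, 64, 3)
    mask = np.zeros((64, 64))
    mask[20:40, 20:40] = 1.0
    mean_fill = lambda p, m: np.ones_like(p) * p[m == 0].mean(axis=0)
    fixed = fill_defective_texels(patch, mask, mean_fill)
    print(fixed.shape)  # (64, 64, 3)
```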
modelling; rendering; surface parameterization; texture mapping; texture synthesis
Academic sector: INF/01 - Informatica (Computer Science)
Funding: Social and hUman ceNtered XR (SUN), European Commission, Horizon Europe Framework Programme, grant no. 101092612
2023 (online: February 2023)
https://onlinelibrary.wiley.com/doi/full/10.1111/cgf.14735
Article (author)
Files in this record:
2023_Texture_Inpainting_compressed.pdf (open access)
Description: Article
Type: Publisher's version/PDF
Size: 714.33 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/969303
Citations
  • PubMed Central: not available
  • Scopus: 4
  • ISI Web of Science: 4