Environment Maps Editing using Inverse Rendering and Adversarial Implicit Functions

Antonio D'Orazio (first); Davide Sforza (second); Iacopo Masi (last)
2024

Abstract

Editing High Dynamic Range (HDR) environment maps using an inverse differentiable rendering architecture is a complex inverse problem due to the sparsity of relevant pixels and the challenges in balancing light sources and background. The pixels illuminating the objects are a small fraction of the total image, leading to noise and convergence issues when the optimization directly involves pixel values. HDR images, whose pixel values extend beyond the typical Standard Dynamic Range (SDR), pose additional challenges: higher learning rates corrupt the background during optimization, while lower learning rates fail to manipulate the light sources. Our work introduces a novel method for editing HDR environment maps using differentiable rendering that addresses both the sparsity of relevant pixels and the large variance between pixel values. Instead of introducing strong priors that extract the relevant HDR pixels and separate the light sources, or resorting to tricks such as optimizing the HDR image in log space, we propose to model the optimized environment map with a new variant of implicit neural representations able to handle HDR images. The neural representation is trained with adversarial perturbations over its weights to ensure smooth changes in the output when it receives gradients from the inverse rendering. In this way, we obtain novel and inexpensive environment maps without relying on the latent spaces of expensive generative models, while maintaining the original visual consistency. Experimental results demonstrate the method's effectiveness in reconstructing the desired lighting effects while preserving the fidelity of the map and of the reflections on objects in the scene. Our approach can pave the way to interesting tasks, such as estimating a new environment map given a rendering with novel light sources, maintaining the initial perceptual features, and enabling brush stroke-based editing of existing environment maps. Our code is publicly available at https://github.com/OmnAI-Lab/R-SIREN.
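The abstract sketches two ingredients: an implicit neural representation of the environment map that can output HDR values, and adversarial perturbations of its weights while it is optimized with gradients flowing back from a differentiable renderer. Below is a minimal, hypothetical PyTorch sketch of that idea, not the authors' implementation: it assumes a SIREN-style network with an exponential output head for HDR radiance and a SAM-like weight perturbation; the names HDRSiren and adversarial_step, the radius rho, and the plain L2 fitting loss are illustrative stand-ins, and the differentiable-rendering loss is omitted.

# Minimal sketch (not the authors' code): a SIREN-style implicit network mapping
# 2D environment-map coordinates to HDR radiance, fitted with a SAM-like
# adversarial perturbation of the weights. The target and the L2 loss below are
# placeholders for the gradients that would come from a differentiable renderer.
import torch
import torch.nn as nn


class SineLayer(nn.Module):
    def __init__(self, in_f, out_f, omega0=30.0, is_first=False):
        super().__init__()
        self.omega0 = omega0
        self.linear = nn.Linear(in_f, out_f)
        with torch.no_grad():
            bound = 1.0 / in_f if is_first else (6.0 / in_f) ** 0.5 / omega0
            self.linear.weight.uniform_(-bound, bound)

    def forward(self, x):
        return torch.sin(self.omega0 * self.linear(x))


class HDRSiren(nn.Module):
    """Maps (u, v) in [-1, 1]^2 to RGB radiance; exp() keeps outputs positive and unbounded (HDR)."""

    def __init__(self, hidden=256, depth=3):
        super().__init__()
        layers = [SineLayer(2, hidden, is_first=True)]
        layers += [SineLayer(hidden, hidden) for _ in range(depth)]
        self.body = nn.Sequential(*layers)
        self.head = nn.Linear(hidden, 3)

    def forward(self, uv):
        return torch.exp(self.head(self.body(uv)))


def adversarial_step(model, loss_fn, rho=0.05):
    """SAM-like step: move the weights toward higher loss, compute the gradient there, restore."""
    loss = loss_fn(model)
    loss.backward()
    grads = [p.grad.detach().clone() for p in model.parameters()]
    scale = rho / (torch.sqrt(sum((g ** 2).sum() for g in grads)) + 1e-12)
    eps = [g * scale for g in grads]
    with torch.no_grad():
        for p, e in zip(model.parameters(), eps):
            p.add_(e)                     # climb to the adversarial weights
    model.zero_grad()
    loss_fn(model).backward()             # gradient evaluated at the perturbed weights
    with torch.no_grad():
        for p, e in zip(model.parameters(), eps):
            p.sub_(e)                     # restore the original weights
    return loss.item()


if __name__ == "__main__":
    model = HDRSiren()
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    uv = torch.rand(4096, 2) * 2.0 - 1.0         # random env-map coordinates
    target = torch.rand(4096, 3) * 50.0          # fake HDR radiance for the sketch
    for step in range(200):
        opt.zero_grad()
        loss = adversarial_step(model, lambda m: ((m(uv) - target) ** 2).mean())
        opt.step()                               # Adam step using the perturbed-weight gradient
        if step % 50 == 0:
            print(f"step {step:4d}  loss {loss:.4f}")

In an actual editing loop, the L2 target above would be replaced by the loss produced by the differentiable renderer, so the implicit representation receives rendering gradients while the adversarial perturbation keeps its output changing smoothly.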
Year: 2024
Conference: 2024 Eurographics Italian Chapter Conference on Smart Tools and Applications in Graphics, STAG 2024
Keywords: computer graphics; deep learning; inverse problems; inverse rendering
Publication type: 04 Conference proceedings::04b Conference paper published in a volume
Environment Maps Editing using Inverse Rendering and Adversarial Implicit Functions / D'Orazio, Antonio; Sforza, Davide; Pellacini, Fabio; Masi, Iacopo. - (2024). ( 2024 Eurographics Italian Chapter Conference on Smart Tools and Applications in Graphics, STAG 2024 Verona ) [10.2312/stag.20241339].
Files attached to this item
File: DOrazio_Engironment-Maps_2024.pdf
Access: open access
Note: DOI 10.2312/stag.20241339
Type: Publisher's version (published version with the publisher's layout)
License: Creative Commons
Size: 9.72 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1723590
Citations
  • Scopus: 0