Casual Indoor HDR Radiance Capture from Omnidirectional Images
Pulkit Gera, Mohammad Reza Karimi Dastjerdi, Charles Renaud, P. J. Narayanan, Jean-François Lalonde
August 16, 2022
Keywords: Material/Lighting Estimation, Large-Scale Scenes
Venue: BMVC 2022
Bibtex:
@article{gera2022panohdrnerf,
  author = {Pulkit Gera and Mohammad Reza Karimi Dastjerdi and Charles Renaud and P. J. Narayanan and Jean-Fran\c{c}ois Lalonde},
  title = {Casual Indoor HDR Radiance Capture from Omnidirectional Images},
  year = {2022},
  month = aug,
  url = {http://arxiv.org/abs/2208.07903v2}
}
Abstract
We present PanoHDR-NeRF, a neural representation of the full HDR radiance field of an indoor scene, and a pipeline to capture it casually, without elaborate setups or complex capture protocols. First, a user captures a low dynamic range (LDR) omnidirectional video of the scene by freely waving an off-the-shelf camera around the scene. Then, an LDR2HDR network uplifts the captured LDR frames to HDR, which are used to train a tailored NeRF++ model. The resulting PanoHDR-NeRF can render full HDR images from any location of the scene. Through experiments on a novel test dataset of real scenes with the ground truth HDR radiance captured at locations not seen during training, we show that PanoHDR-NeRF predicts plausible HDR radiance from any scene point. We also show that the predicted radiance can synthesize correct lighting effects, enabling the augmentation of indoor scenes with synthetic objects that are lit correctly. Datasets and code are available at https://lvsn.github.io/PanoHDR-NeRF/.
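The first stage of the pipeline uplifts captured LDR panorama frames to HDR before NeRF++ training. The paper's LDR2HDR module is a learned network; as a purely illustrative stand-in (all function names and constants here are hypothetical, not from the paper), a naive inverse-tone-mapping sketch linearizes the gamma-encoded LDR values and boosts near-saturated pixels beyond 1.0 to mimic recovered highlights:

```python
import numpy as np

def naive_ldr2hdr(ldr, gamma=2.2, highlight_scale=8.0):
    """Naive stand-in for a learned LDR2HDR network (illustration only).

    Linearizes gamma-encoded LDR values, then pushes pixels near
    saturation above 1.0 to fake recovered highlight energy.
    """
    ldr = np.clip(np.asarray(ldr, dtype=np.float64), 0.0, 1.0)
    linear = ldr ** gamma  # undo display gamma (approximate linearization)
    # Ramp a multiplicative boost from 1x to highlight_scale over ldr in [0.95, 1.0].
    boost = 1.0 + (highlight_scale - 1.0) * np.clip(ldr - 0.95, 0.0, 0.05) / 0.05
    return linear * boost

# A captured LDR panorama frame (H x W x 3), random here for illustration.
frame = np.random.rand(64, 128, 3)
hdr = naive_ldr2hdr(frame)
```

The resulting per-frame HDR panoramas would then serve as training images for the radiance-field model; the real network, of course, hallucinates plausible highlight content rather than applying a fixed analytic curve.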
Citation Graph
(Double click on nodes to open corresponding papers' pages)
* Showing citation graph for papers within our database. Data retrieved from Semantic Scholar. For full citation graphs, visit ConnectedPapers.