Photorealistic Monocular 3D Reconstruction of Humans Wearing Clothing

Thiemo Alldieck, Mihai Zanfir, Cristian Sminchisescu

4/19/2022

Keywords: Human (Body), Generalization, Supervision by Gradient (PDE), Positional Encoding

Venue: CVPR 2022

Bibtex:

@article{alldieck2022phorhum,
  author = {Thiemo Alldieck and Mihai Zanfir and Cristian Sminchisescu},
  title  = {Photorealistic Monocular 3D Reconstruction of Humans Wearing Clothing},
  year   = {2022},
  month  = {Apr},
  url    = {http://arxiv.org/abs/2204.08906v1}
}

Abstract

We present PHORHUM, a novel, end-to-end trainable, deep neural network methodology for photorealistic 3D human reconstruction given just a monocular RGB image. Our pixel-aligned method estimates detailed 3D geometry and, for the first time, the unshaded surface color together with the scene illumination. Observing that 3D supervision alone is not sufficient for high-fidelity color reconstruction, we introduce patch-based rendering losses that enable reliable color reconstruction on visible parts of the human, and detailed and plausible color estimation for the non-visible parts. Moreover, our method specifically addresses methodological and practical limitations of prior work in terms of representing geometry, albedo, and illumination effects, in an end-to-end model where these factors can be effectively disentangled. In extensive experiments, we demonstrate the versatility and robustness of our approach. Our state-of-the-art results validate the method qualitatively and quantitatively across multiple metrics, for both geometric and color reconstruction.
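The core idea described in the abstract, namely pixel-aligned prediction of a signed distance and an unshaded (albedo) color for arbitrary 3D query points, can be sketched as follows. This is a minimal, hypothetical illustration and not the authors' implementation: the class name `PixelAlignedSDF`, the encoder architecture, and the `project` callable are all assumptions introduced here for clarity.

```python
# Minimal sketch (not the authors' code) of a pixel-aligned implicit network
# in the spirit of PHORHUM: an image encoder produces a feature map, each 3D
# query point is projected into the image to sample a pixel-aligned feature,
# and an MLP predicts a signed distance plus an unshaded albedo color.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PixelAlignedSDF(nn.Module):
    def __init__(self, feat_dim=256, hidden=256):
        super().__init__()
        # Hypothetical stand-in for the paper's image feature extractor.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
        )
        # MLP maps (point, pixel-aligned feature) -> (signed distance, albedo).
        self.mlp = nn.Sequential(
            nn.Linear(3 + feat_dim, hidden), nn.Softplus(beta=100),
            nn.Linear(hidden, hidden), nn.Softplus(beta=100),
            nn.Linear(hidden, 1 + 3),  # 1 distance channel + 3 albedo channels
        )

    def forward(self, image, points, project):
        """image: (B,3,H,W); points: (B,N,3) in camera space;
        project: callable mapping 3D points to normalized image coords (B,N,2)."""
        feat_map = self.encoder(image)                         # (B,C,h,w)
        uv = project(points)                                   # (B,N,2) in [-1,1]
        # Bilinearly sample the feature map at the projected pixel locations.
        feats = F.grid_sample(feat_map, uv.unsqueeze(2),
                              align_corners=True).squeeze(-1)  # (B,C,N)
        feats = feats.permute(0, 2, 1)                         # (B,N,C)
        out = self.mlp(torch.cat([points, feats], dim=-1))     # (B,N,4)
        sdf, albedo = out[..., :1], torch.sigmoid(out[..., 1:])
        return sdf, albedo
```

In such a setup, surface normals can be obtained by differentiating the signed distance with respect to the query points (consistent with the "Supervision by Gradient (PDE)" keyword above), and the predicted albedo would be combined with an estimated scene illumination for shaded rendering.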
