Intrinsic Neural Fields: Learning Functions on Manifolds
Lukas Koestler, Daniel Grittner, Michael Moeller, Daniel Cremers, Zorah Lähner
3/15/2022
Keywords: Fundamentals, Positional Encoding
Venue: ARXIV 2022
Bibtex:
@article{koestler2022intrinsicneuralfields,
  author  = {Lukas Koestler and Daniel Grittner and Michael Moeller and Daniel Cremers and Zorah L{\"a}hner},
  title   = {Intrinsic Neural Fields: Learning Functions on Manifolds},
  journal = {arXiv preprint arXiv:2203.07967},
  year    = {2022},
  month   = {Mar},
  url     = {http://arxiv.org/abs/2203.07967v3}
}
Abstract
Neural fields have gained significant attention in the computer vision community due to their excellent performance in novel view synthesis, geometry reconstruction, and generative modeling. Among their advantages are a sound theoretical foundation and an easy implementation in current deep learning frameworks. While neural fields have been applied to signals on manifolds, e.g., for texture reconstruction, their representation has been limited to extrinsically embedding the shape into Euclidean space. The extrinsic embedding ignores known intrinsic manifold properties and is inflexible with respect to transfer of the learned function. To overcome these limitations, this work introduces intrinsic neural fields, a novel and versatile representation for neural fields on manifolds. Intrinsic neural fields combine the advantages of neural fields with the spectral properties of the Laplace-Beltrami operator. We show theoretically that intrinsic neural fields inherit many desirable properties of the extrinsic neural field framework but exhibit additional intrinsic qualities, like isometry invariance. In experiments, we show that intrinsic neural fields can reconstruct high-fidelity textures from images with state-of-the-art quality and are robust to the discretization of the underlying manifold. We demonstrate the versatility of intrinsic neural fields by tackling various applications: texture transfer between deformed shapes and between different shapes, texture reconstruction from real-world images with view dependence, and discretization-agnostic learning on meshes and point clouds.
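The core idea is to replace the Euclidean Fourier-feature positional encoding of extrinsic neural fields with the eigenfunctions of the Laplace-Beltrami operator, which are defined intrinsically on the surface. Below is a minimal sketch of this construction, not the authors' implementation: it assumes a triangle-mesh input and uses the third-party robust-laplacian package plus PyTorch; the names `lbo_eigenfunctions` and `IntrinsicNeuralField` are hypothetical and chosen for illustration.

```python
# Sketch only (assumptions: triangle-mesh input, robust-laplacian + PyTorch).
# Laplace-Beltrami eigenfunctions serve as an intrinsic positional encoding
# that is fed to a small MLP, in place of Euclidean Fourier features.
import numpy as np
import scipy.sparse.linalg as sla
import robust_laplacian
import torch
import torch.nn as nn

def lbo_eigenfunctions(verts, faces, k=64):
    """First k Laplace-Beltrami eigenfunctions of a triangle mesh.

    Solves the generalized eigenproblem L phi = lambda M phi, where L is
    the (weak) cotangent Laplacian and M is the lumped mass matrix.
    """
    L, M = robust_laplacian.mesh_laplacian(verts, faces)
    # Smallest eigenvalues; a tiny shift avoids the singularity at 0.
    evals, evecs = sla.eigsh(L, k, M, sigma=1e-8)
    return evecs  # (num_verts, k): intrinsic encoding per vertex

class IntrinsicNeuralField(nn.Module):
    """MLP mapping the eigenfunction encoding of a surface point to, e.g., RGB."""
    def __init__(self, k=64, hidden=256, out_dim=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(k, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim), nn.Sigmoid(),  # colors in [0, 1]
        )

    def forward(self, enc):  # enc: (batch, k) eigenfunction values
        return self.net(enc)
```

Because the encoding depends only on intrinsic geometry, it is invariant under isometric deformations, and a field trained on one shape can, in principle, be evaluated on another by transporting the eigenfunctions through a shape correspondence; this is what underlies the texture-transfer applications mentioned in the abstract.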