Learning Smooth Neural Functions via Lipschitz Regularization
Hsueh-Ti Derek Liu, Francis Williams, Alec Jacobson, Sanja Fidler, Or Litany
2/16/2022
Keywords: Geometry Only, Fundamentals, Generalization, Global Conditioning, Supervision by Gradient (PDE), Regularization
Venue: ARXIV 2022
Bibtex:
@article{liu2022learning,
  author = {Hsueh-Ti Derek Liu and Francis Williams and Alec Jacobson and Sanja Fidler and Or Litany},
  title = {Learning Smooth Neural Functions via Lipschitz Regularization},
  year = {2022},
  month = {Feb},
  url = {http://arxiv.org/abs/2202.08345v1}
}
Abstract
Neural implicit fields have recently emerged as a useful representation for 3D shapes. These fields are commonly represented as neural networks which map latent descriptors and 3D coordinates to implicit function values. The latent descriptor of a neural field acts as a deformation handle for the 3D shape it represents. Thus, smoothness with respect to this descriptor is paramount for performing shape-editing operations. In this work, we introduce a novel regularization designed to encourage smooth latent spaces in neural fields by penalizing the upper bound on the field's Lipschitz constant. Compared with prior Lipschitz-regularized networks, ours is computationally fast, can be implemented in four lines of code, and requires minimal hyperparameter tuning for geometric applications. We demonstrate the effectiveness of our approach on shape interpolation and extrapolation as well as partial shape reconstruction from 3D point clouds, showing both qualitative and quantitative improvements over existing state-of-the-art and non-regularized baselines.
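Method Sketch
The abstract describes penalizing an upper bound on the field's Lipschitz constant, implementable in a few lines of code. Below is a minimal PyTorch sketch of one plausible realization: each linear layer carries a trainable per-layer bound softplus(c), weight rows are rescaled whenever their absolute row sum exceeds that bound (bounding the layer's Lipschitz constant in the l-infinity norm), and the product of the per-layer bounds is added to the training loss. Class names, layer sizes, and the regularization weight alpha are illustrative assumptions, not the authors' released implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class LipschitzLinear(nn.Module):
    """Linear layer with a trainable Lipschitz bound softplus(c).

    Weight rows whose absolute sum exceeds the bound are rescaled, so the
    layer is softplus(c)-Lipschitz in the l-infinity norm. (Hypothetical
    sketch; names and initialization details are assumptions.)
    """
    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # Start c at the inverse softplus of the largest initial row sum,
        # so the normalization is initially a no-op.
        max_row_sum = self.linear.weight.detach().abs().sum(dim=1).max()
        self.c = nn.Parameter(torch.log(torch.expm1(max_row_sum)))

    def lipschitz_bound(self):
        return F.softplus(self.c)

    def forward(self, x):
        bound = F.softplus(self.c)
        row_sums = self.linear.weight.abs().sum(dim=1, keepdim=True)
        scale = torch.clamp(bound / row_sums, max=1.0)  # shrink rows, never grow
        return F.linear(x, self.linear.weight * scale, self.linear.bias)

# Toy usage: a small field MLP on concatenated (latent code, coordinate) inputs.
mlp = nn.ModuleList([
    LipschitzLinear(64, 256),   # hypothetical input width: latent + coords
    LipschitzLinear(256, 256),
    LipschitzLinear(256, 1),
])

def field(x):
    for layer in mlp[:-1]:
        x = torch.tanh(layer(x))  # tanh is 1-Lipschitz, preserving the bound
    return mlp[-1](x)

x, target = torch.randn(8, 64), torch.randn(8, 1)
# With 1-Lipschitz activations, the product of per-layer bounds upper-bounds
# the network's Lipschitz constant; penalizing it encourages a smooth field.
lip_bound = torch.stack([layer.lipschitz_bound() for layer in mlp]).prod()
alpha = 1e-6  # assumed regularization weight; tune per task
loss = F.mse_loss(field(x), target) + alpha * lip_bound
loss.backward()

Because only rows that violate the learned bound are rescaled, a single hyperparameter alpha trades off fitting accuracy against the network's overall smoothness.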