Sobolev Training for Implicit Neural Representations with Approximated Image Derivatives

Wentao Yuan, Qingtian Zhu, Xiangyue Liu, Yikang Ding, Haotian Zhang, Chi Zhang

07/21/2022

Keywords: 2D Image Neural Fields, Regularization, Supervision by Gradient (PDE), Image-Based Rendering

Venue: ECCV 2022

Bibtex:
@article{yuan2022sobolev_inrs,
  author = {Wentao Yuan and Qingtian Zhu and Xiangyue Liu and Yikang Ding and Haotian Zhang and Chi Zhang},
  title = {Sobolev Training for Implicit Neural Representations with Approximated Image Derivatives},
  year = {2022},
  month = {Jul},
  url = {http://arxiv.org/abs/2207.10395v1}
}

Abstract

Recently, Implicit Neural Representations (INRs) parameterized by neural networks have emerged as a powerful and promising tool for representing different kinds of signals, owing to their continuous and differentiable properties, which give them advantages over classical discretized representations. However, the training of neural networks for INRs utilizes only input-output pairs; the derivatives of the target output with respect to the input, which are accessible in some cases, are usually ignored. In this paper, we propose a training paradigm for INRs whose target output is image pixels, encoding image derivatives in addition to image values in the neural network. Specifically, we use finite differences to approximate image derivatives. We show how this training paradigm can be leveraged to solve typical INR problems, i.e., image regression and inverse rendering, and demonstrate that it improves the data efficiency and generalization capabilities of INRs. The code of our method is available at https://github.com/megvii-research/Sobolev_INRs.
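To make the idea concrete, below is a minimal PyTorch sketch of Sobolev training for a 2D image INR, not the authors' released implementation: a coordinate MLP is fit to pixel values while its analytic spatial derivatives (via autograd) are also supervised against finite-difference approximations of the image gradient. The SIREN-style sine activation, the names Sine, make_inr, and sobolev_step, and the hyperparameters (frequency 30, lam) are illustrative assumptions.

```python
# Illustrative sketch of Sobolev training for an image INR; assumptions
# (architecture, names, hyperparameters) are noted in the text above.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Sine(nn.Module):
    """Sine activation (SIREN-style) so the INR has informative derivatives."""
    def forward(self, x):
        return torch.sin(30.0 * x)

def make_inr(hidden=256, depth=4):
    """Coordinate MLP: (x, y) -> grayscale intensity."""
    layers, d_in = [], 2
    for _ in range(depth):
        layers += [nn.Linear(d_in, hidden), Sine()]
        d_in = hidden
    layers.append(nn.Linear(hidden, 1))
    return nn.Sequential(*layers)

def sobolev_step(model, opt, coords, vals, grads, lam=0.1):
    """One training step.
    coords: (N, 2) pixel coordinates; vals: (N, 1) pixel intensities;
    grads:  (N, 2) finite-difference image gradients at those pixels.
    Note: if coords are normalized (e.g. to [-1, 1]), the finite-difference
    gradients must be rescaled to the same coordinate units.
    """
    coords = coords.clone().requires_grad_(True)
    pred = model(coords)
    # Analytic derivative of the (scalar) network output w.r.t. the input
    # coordinates; samples are independent, so summing is safe here.
    pred_grad = torch.autograd.grad(pred.sum(), coords, create_graph=True)[0]
    loss = F.mse_loss(pred, vals) + lam * F.mse_loss(pred_grad, grads)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Finite-difference derivative targets from a discrete (H, W) image:
# img = torch.rand(64, 64)         # stand-in image
# gy, gx = torch.gradient(img)     # central differences along rows/columns
```

The derivative supervision acts as a regularizer: the network must match not only pixel values but also the local structure of the image, which is what drives the data-efficiency and generalization gains the abstract describes.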
