A-SDF: Learning Disentangled Signed Distance Functions for Articulated Shape Representation
Jiteng Mu, Weichao Qiu, Adam Kortylewski, Alan Yuille, Nuno Vasconcelos, Xiaolong Wang
4/15/2021
Keywords: Editable
Venue: ARXIV 2021
Bibtex:
@article{mu2021asdf,
  title = {A-SDF: Learning Disentangled Signed Distance Functions for Articulated Shape Representation},
  author = {Jiteng Mu and Weichao Qiu and Adam Kortylewski and Alan Yuille and Nuno Vasconcelos and Xiaolong Wang},
  journal = {arXiv preprint arXiv:2104.07645},
  year = {2021},
  url = {http://arxiv.org/abs/2104.07645v1}
}
Abstract
Recent work has made significant progress on using implicit functions as a continuous representation for 3D rigid object shape reconstruction. However, much less effort has been devoted to modeling general articulated objects. Compared to rigid objects, articulated objects have higher degrees of freedom, which makes them hard to generalize to unseen shapes. To deal with the large shape variance, we introduce Articulated Signed Distance Functions (A-SDF) to represent articulated shapes with a disentangled latent space, where separate codes encode shape and articulation. We assume no prior knowledge of part geometry, articulation status, joint type, joint axis, or joint location. With this disentangled continuous representation, we demonstrate that we can control the articulation input and animate unseen instances with unseen joint angles. Furthermore, we propose a Test-Time Adaptation inference algorithm to adjust our model during inference. We demonstrate that our model generalizes well to out-of-distribution and unseen data, e.g., partial point clouds and real-world depth images.
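The abstract describes two core ideas: a decoder conditioned on separate shape and articulation codes, and test-time optimization of those codes against observed data. The sketch below illustrates both in PyTorch; it is a minimal sketch under assumed choices (a plain MLP decoder, illustrative code dimensions, layer widths, loss, and hyperparameters), not the paper's exact architecture or training/inference procedure.

```python
import torch
import torch.nn as nn

class ArticulatedSDFDecoder(nn.Module):
    """Maps a 3D query point plus disentangled shape and articulation codes
    to a signed distance value (illustrative MLP, not the paper's exact net)."""
    def __init__(self, shape_code_dim=256, art_code_dim=1, hidden_dim=512):
        super().__init__()
        in_dim = 3 + shape_code_dim + art_code_dim
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 1),  # signed distance value
        )

    def forward(self, xyz, shape_code, art_code):
        # xyz: (N, 3) query points; shape_code: (N, D_s); art_code: (N, D_a)
        return self.net(torch.cat([xyz, shape_code, art_code], dim=-1))


def test_time_adapt(decoder, xyz, sdf_gt, steps=200, lr=1e-3,
                    shape_code_dim=256, art_code_dim=1):
    """Hypothetical test-time adaptation loop: keep the decoder fixed and
    optimize latent codes against SDF samples from an observed (possibly
    partial) point cloud. Names and hyperparameters are assumptions."""
    shape_code = torch.zeros(1, shape_code_dim, requires_grad=True)
    art_code = torch.zeros(1, art_code_dim, requires_grad=True)
    opt = torch.optim.Adam([shape_code, art_code], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        pred = decoder(xyz,
                       shape_code.expand(xyz.shape[0], -1),
                       art_code.expand(xyz.shape[0], -1))
        loss = torch.nn.functional.l1_loss(pred, sdf_gt)  # sdf_gt: (N, 1)
        loss.backward()
        opt.step()
    return shape_code.detach(), art_code.detach()
```

Because articulation is a separate input code, re-animating an instance amounts to keeping the recovered shape code fixed and sweeping the articulation code (e.g., a joint angle) at inference time.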