ObjectFolder: A Dataset of Objects with Implicit Visual, Auditory, and Tactile Representations

Ruohan Gao, Yen-Yu Chang, Shivani Mall, Li Fei-Fei, Jiajun Wu

9/16/2021

Keywords: Compression, Science & Engineering, Robotics, Audio

Venue: CoRL 2021

Paper: http://arxiv.org/abs/2109.07991v2. Code and data: coming soon.
Bibtex:
@inproceedings{gao2021objectfolder,
  title     = {ObjectFolder: A Dataset of Objects with Implicit Visual, Auditory, and Tactile Representations},
  author    = {Ruohan Gao and Yen-Yu Chang and Shivani Mall and Li Fei-Fei and Jiajun Wu},
  booktitle = {Proceedings of the Conference on Robot Learning (CoRL)},
  year      = {2021},
  url       = {http://arxiv.org/abs/2109.07991v2}
}

Abstract

Multisensory object-centric perception, reasoning, and interaction have been a key research topic in recent years. However, the progress in these directions is limited by the small set of objects available -- synthetic objects are not realistic enough and are mostly centered around geometry, while real object datasets such as YCB are often practically challenging and unstable to acquire due to international shipping, inventory, and financial cost. We present ObjectFolder, a dataset of 100 virtualized objects that addresses both challenges with two key innovations. First, ObjectFolder encodes the visual, auditory, and tactile sensory data for all objects, enabling a number of multisensory object recognition tasks, beyond existing datasets that focus purely on object geometry. Second, ObjectFolder employs a uniform, object-centric, and implicit representation for each object's visual textures, acoustic simulations, and tactile readings, making the dataset flexible to use and easy to share. We demonstrate the usefulness of our dataset as a testbed for multisensory perception and control by evaluating it on a variety of benchmark tasks, including instance recognition, cross-sensory retrieval, 3D reconstruction, and robotic grasping.
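To make the "implicit representation" idea concrete, below is a minimal, hypothetical sketch of how an object-centric neural field can be queried: one small coordinate-based MLP per sensory modality, taking continuous query coordinates (camera ray samples, impact locations, contact points) and returning the corresponding sensory signal. All class names, input dimensions, and outputs here are illustrative assumptions and do not reflect ObjectFolder's actual interface.

```python
# Hypothetical sketch: one coordinate-based MLP per modality for a single
# virtualized object. Names, dimensions, and outputs are illustrative only,
# not ObjectFolder's actual "Object File" API.
import torch
import torch.nn as nn

class ImplicitField(nn.Module):
    """Maps continuous query coordinates to a sensory output vector."""
    def __init__(self, in_dim: int, out_dim: int, hidden: int = 256, depth: int = 4):
        super().__init__()
        layers, d = [], in_dim
        for _ in range(depth):
            layers += [nn.Linear(d, hidden), nn.ReLU()]
            d = hidden
        layers.append(nn.Linear(d, out_dim))
        self.net = nn.Sequential(*layers)

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        return self.net(coords)

# Illustrative per-modality networks for one object:
vision_net = ImplicitField(in_dim=6, out_dim=3)  # (3D point, view direction) -> RGB
audio_net  = ImplicitField(in_dim=4, out_dim=2)  # (impact location, frequency) -> spectral coefficients
touch_net  = ImplicitField(in_dim=3, out_dim=3)  # contact location -> tactile (GelSight-style) reading

rgb     = vision_net(torch.rand(1024, 6))  # samples along camera rays
spec    = audio_net(torch.rand(1024, 4))   # impact-sound spectrum at queried frequencies
tactile = touch_net(torch.rand(1024, 3))   # local tactile reading at a contact point
```

Under this kind of scheme, each object reduces to a set of network weights rather than meshes, texture maps, and recorded audio/tactile files, which is consistent with the abstract's claim that the representation makes the dataset flexible to use and easy to share.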
