Layered Neural Atlases for Consistent Video Editing
Yoni Kasten, Dolev Ofri, Oliver Wang, Tali Dekel
9/23/2021
Keywords: Dynamic/Temporal, 2D Image Neural Fields, Editable
Venue: SIGGRAPH Asia 2021 (ACM Transactions on Graphics)
Bibtex:
@article{kasten2021layered,
  author    = {Yoni Kasten and Dolev Ofri and Oliver Wang and Tali Dekel},
  title     = {Layered Neural Atlases for Consistent Video Editing},
  journal   = {ACM Transactions on Graphics (TOG)},
  publisher = {Association for Computing Machinery},
  year      = {2021},
  url       = {http://arxiv.org/abs/2109.11418v1}
}
Abstract
We present a method that decomposes, or "unwraps", an input video into a set of layered 2D atlases, each providing a unified representation of the appearance of an object (or background) over the video. For each pixel in the video, our method estimates its corresponding 2D coordinate in each of the atlases, giving us a consistent parameterization of the video, along with an associated alpha (opacity) value. Importantly, we design our atlases to be interpretable and semantic, which facilitates easy and intuitive editing in the atlas domain, with minimal manual work required. Edits applied to a single 2D atlas (or input video frame) are automatically and consistently mapped back to the original video frames, while preserving occlusions, deformation, and other complex scene effects such as shadows and reflections. Our method employs a coordinate-based Multilayer Perceptron (MLP) representation for mappings, atlases, and alphas, which are jointly optimized on a per-video basis, using a combination of video reconstruction and regularization losses. By operating purely in 2D, our method does not require any prior 3D knowledge about scene geometry or camera poses, and can handle complex dynamic real-world videos. We demonstrate various video editing applications, including texture mapping, video style transfer, image-to-video texture transfer, and segmentation/labeling propagation, all automatically produced by editing a single 2D atlas image.
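The per-video optimization described in the abstract is straightforward to prototype. Below is a minimal PyTorch sketch (not the authors' code) of the core idea, assuming a single foreground layer over a background: coordinate MLPs map each video pixel (x, y, t) to 2D atlas coordinates and an alpha value, per-layer atlas MLPs map atlas coordinates to RGB, and all networks are trained jointly with a video-reconstruction loss. The names (CoordMLP, mapping_fg, alpha_net, ...) and hyperparameters are illustrative, and the paper's positional encoding and regularization losses are omitted for brevity.

# Minimal sketch of the layered-neural-atlas idea. All names and
# hyperparameters are illustrative, not the authors' implementation.
import torch
import torch.nn as nn

class CoordMLP(nn.Module):
    """Simple coordinate-based MLP: R^in_dim -> R^out_dim."""
    def __init__(self, in_dim, out_dim, hidden=256, depth=4):
        super().__init__()
        layers, d = [], in_dim
        for _ in range(depth):
            layers += [nn.Linear(d, hidden), nn.ReLU()]
            d = hidden
        layers.append(nn.Linear(d, out_dim))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

# Two layers (foreground + background), as in the paper's basic setup.
mapping_fg = CoordMLP(3, 2)   # (x, y, t) -> foreground atlas UV
mapping_bg = CoordMLP(3, 2)   # (x, y, t) -> background atlas UV
alpha_net  = CoordMLP(3, 1)   # (x, y, t) -> foreground opacity
atlas_fg   = CoordMLP(2, 3)   # foreground atlas UV -> RGB
atlas_bg   = CoordMLP(2, 3)   # background atlas UV -> RGB

params = (list(mapping_fg.parameters()) + list(mapping_bg.parameters()) +
          list(alpha_net.parameters()) + list(atlas_fg.parameters()) +
          list(atlas_bg.parameters()))
opt = torch.optim.Adam(params, lr=1e-4)

def reconstruct(xyt):
    """Alpha-composite the per-layer atlas colors at video coordinates xyt."""
    alpha  = torch.sigmoid(alpha_net(xyt))                        # (N, 1)
    rgb_fg = torch.sigmoid(atlas_fg(torch.tanh(mapping_fg(xyt))))  # (N, 3)
    rgb_bg = torch.sigmoid(atlas_bg(torch.tanh(mapping_bg(xyt))))  # (N, 3)
    return alpha * rgb_fg + (1.0 - alpha) * rgb_bg

# One optimization step on a random batch of pixels from a toy "video";
# in practice xyt and target_rgb are sampled from the input frames.
xyt        = torch.rand(1024, 3) * 2 - 1   # normalized (x, y, t)
target_rgb = torch.rand(1024, 3)           # ground-truth pixel colors
loss = ((reconstruct(xyt) - target_rgb) ** 2).mean()
opt.zero_grad(); loss.backward(); opt.step()

After optimization, the atlas MLPs can be rasterized into editable 2D images; an edit made in atlas space is pulled back to every frame through the fixed mapping networks, which is what makes the edits temporally consistent. The paper's additional regularizers (e.g., encouraging locally rigid mappings and sparse alpha) are needed in practice to make the atlases interpretable.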