¹University of California, San Diego   ²Adobe Research   ³Fyusion Inc.
Abstract
Recently, neural volumetric representations such as neural reflectance fields have been
widely applied to faithfully reproduce the appearance of real-world objects and scenes
under novel viewpoints and lighting conditions. However, it remains challenging and
time-consuming to render such representations under complex lighting such as environment
maps, which requires individual ray marching towards each light to calculate
the transmittance at every sampled point. In this paper, we propose a novel method based
on precomputed Neural Transmittance Functions to accelerate the rendering of neural
reflectance fields. Our neural transmittance functions enable us to efficiently query the
transmittance at an arbitrary point in space along an arbitrary ray without tedious ray
marching, which effectively reduces the time complexity of rendering. We propose
a novel formulation for the neural transmittance function and train it jointly with the
neural reflectance field on images captured under a collocated camera and light, while
enforcing monotonicity. Results on real and synthetic scenes demonstrate an almost two-orders-of-magnitude
speedup for renderings under environment maps with minimal accuracy loss.
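To make the idea concrete, below is a minimal sketch (not the authors' code) of how a precomputed neural transmittance function can replace per-light ray marching when shading a point under an environment map: transmittance toward each light direction is obtained with a single network query instead of accumulating density along the shadow ray. The network name (TransmittanceMLP), its input parameterization, the monotonicity construction via a non-negative predicted optical depth, and the Lambertian shading term are illustrative assumptions, not the paper's exact formulation.

# Sketch only: assumes PyTorch; names and architecture are hypothetical.
import torch
import torch.nn as nn

class TransmittanceMLP(nn.Module):
    """Maps (point x, light direction d, distance t) -> transmittance in [0, 1].

    Here monotonicity in t is encouraged by predicting a non-negative
    optical depth and mapping it through exp(-tau); the paper instead
    enforces monotonicity while training jointly with the reflectance field.
    """
    def __init__(self, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 + 3 + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Softplus(),  # non-negative optical depth
        )

    def forward(self, x, d, t):
        tau = self.net(torch.cat([x, d, t], dim=-1))
        return torch.exp(-tau)  # transmittance T = exp(-optical depth)

def shade_point(x, light_dirs, light_radiance, transmittance_net, albedo):
    """Single-bounce shading of one surface point under an environment map.

    Instead of ray marching toward each of the L light directions (one
    density query per sample per shadow ray), visibility toward each light
    is read off with one transmittance-network evaluation per direction.
    """
    L = light_dirs.shape[0]
    xs = x.expand(L, 3)
    t_far = torch.full((L, 1), 1e3)                        # distance to "infinity"
    T = transmittance_net(xs, light_dirs, t_far)           # (L, 1)
    # Simple Lambertian term as a stand-in for the learned reflectance model.
    n = torch.tensor([0.0, 0.0, 1.0]).expand(L, 3)
    cos = (light_dirs * n).sum(-1, keepdim=True).clamp(min=0.0)
    return (albedo * light_radiance * T * cos).mean(dim=0)  # (3,) RGB

if __name__ == "__main__":
    net = TransmittanceMLP()
    dirs = torch.randn(64, 3)
    dirs = dirs / dirs.norm(dim=-1, keepdim=True)
    radiance = torch.rand(64, 3)
    rgb = shade_point(torch.zeros(1, 3), dirs, radiance, net, albedo=torch.rand(3))
    print(rgb.shape)  # torch.Size([3])

With per-light ray marching, the cost of shading one point scales with the number of lights times the number of samples per shadow ray; with a precomputed transmittance query it scales only with the number of lights, which is the source of the reported speedup.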
@article{shafiei2021NeuralTransmittance,
  title   = {Learning Neural Transmittance for Efficient Rendering of Reflectance Fields},
  author  = {Shafiei, Mohammad and Bi, Sai and Li, Zhengqin and Liaudanskas, Aidas and Ortiz-Cayon, Rodrigo and Ramamoorthi, Ravi},
  journal = {British Machine Vision Conference (BMVC)},
  year    = {2021},
}