MEIL-NeRF: Memory-Efficient Incremental Learning of Neural Radiance Fields

Dec 31, 2022
Jaeyoung Chung, Kanggeon Lee, Sungyong Baik, Kyoung Mu Lee

Hinging on the representation power of neural networks, neural radiance fields (NeRF) have recently emerged as one of the most promising and widely applicable methods for 3D object and scene representation. However, NeRF faces challenges in practical applications such as large-scale scenes and edge devices with limited memory, where data must be processed sequentially. Under such incremental learning scenarios, neural networks are known to suffer from catastrophic forgetting: they easily forget previously seen data after training on new data. We observe that previous incremental learning algorithms are limited by either low performance or memory scalability issues. We therefore develop a Memory-Efficient Incremental Learning algorithm for NeRF (MEIL-NeRF). MEIL-NeRF takes inspiration from NeRF itself: a neural network can serve as a memory that returns pixel RGB values when given rays as queries. Based on this motivation, our framework learns which rays to query NeRF in order to extract previous pixel values. The extracted pixel values are then used to train NeRF in a self-distillation manner to prevent catastrophic forgetting. As a result, MEIL-NeRF achieves constant memory consumption and competitive performance.
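The core idea in the abstract, using a frozen snapshot of the network itself as a "memory" that is queried with old rays to produce pseudo ground-truth pixels for self-distillation, can be sketched in a toy setting. This is not the authors' code: the NeRF is stood in for by a tiny linear map from 4-D ray features to RGB, the ray sets and RGB targets are random, and all names (`render`, `fit`, `W_teacher`) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def render(weights, rays):
    """Toy stand-in for NeRF rendering: ray features (N, 4) -> RGB (N, 3)."""
    return rays @ weights

def fit(weights, rays, rgb, steps=500, lr=0.1, distill=None):
    """Gradient descent on MSE; optionally add a self-distillation term."""
    for _ in range(steps):
        grad = 2 * rays.T @ (render(weights, rays) - rgb) / len(rays)
        if distill is not None:
            d_rays, d_rgb = distill  # old rays and pseudo ground-truth RGB
            grad += 2 * d_rays.T @ (render(weights, d_rays) - d_rgb) / len(d_rays)
        weights = weights - lr * grad
    return weights

# Task 1: train on the first chunk of rays, then freeze a snapshot.
rays_old, rgb_old = rng.normal(size=(64, 4)), rng.uniform(size=(64, 3))
W = fit(rng.normal(scale=0.1, size=(4, 3)), rays_old, rgb_old)
W_teacher = W.copy()  # frozen network acting as the memory of old pixels

# Task 2: new rays arrive; the old images themselves are discarded.
rays_new, rgb_new = rng.normal(size=(64, 4)), rng.uniform(size=(64, 3))

# Naive incremental training: fit only the new data (forgets the old task).
W_naive = fit(W.copy(), rays_new, rgb_new)

# Self-distillation: query the frozen snapshot with the old rays to extract
# pseudo ground-truth RGB, then train on new data and distilled data jointly.
pseudo_rgb = render(W_teacher, rays_old)
W_distill = fit(W.copy(), rays_new, rgb_new, distill=(rays_old, pseudo_rgb))

def old_error(weights):
    """MSE on the original (no longer stored) training targets."""
    return float(np.mean((render(weights, rays_old) - rgb_old) ** 2))

print("old-task MSE, naive:    ", old_error(W_naive))
print("old-task MSE, distilled:", old_error(W_distill))
```

In this sketch the distilled model retains a lower error on the old rays than the naively updated one, while memory stays constant: only the network snapshot is kept, never the old images. The actual method additionally learns *which* rays to query, which the sketch omits by reusing the old ray set directly.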

* 18 pages. For the project page, see https://robot0321.github.io/meil-nerf/index.html 