3D Scene Reconstruction with Neural Radiance Fields (NeRF) Considering Dynamic Illumination Conditions

Date

2023

Publisher

Anhalt University of Applied Sciences

Abstract

This paper addresses the problem of novel view synthesis using Neural Radiance Fields (NeRF) for scenes with dynamic illumination. NeRF training relies on a photometric consistency loss, i.e., pixel-wise consistency between a set of scene images and the intensity values rendered by NeRF. For reflective surfaces, image intensity depends on the viewing angle, and this effect is taken into account by using the ray direction as a NeRF input. For scenes with dynamic illumination, image intensity depends not only on position and viewing direction but also on time. We show that this factor disrupts NeRF training with the standard photometric loss function, effectively decreasing the quality of both image and depth rendering. To cope with this problem, we propose adding time as an additional NeRF input. Experiments on the ScanNet dataset demonstrate that NeRF with the modified input outperforms the original model and renders more consistent 3D structures. The results of this study could be used to improve the quality of training data augmentation for depth prediction models (e.g., depth-from-stereo models) for scenes with non-static illumination.
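For illustration, below is a minimal sketch of the idea described in the abstract: a NeRF-style MLP that takes time as an additional input alongside position and ray direction, trained with the standard pixel-wise photometric loss. This is not the authors' exact architecture; the layer widths, encoding frequencies, the `TimeConditionedNeRF` name, and the choice to feed time only to the color head (mirroring how ray direction is handled) are illustrative assumptions based on the abstract.

```python
import torch
import torch.nn as nn

def positional_encoding(x, num_freqs):
    # Map each coordinate through sin/cos at geometrically spaced
    # frequencies, as in the original NeRF formulation.
    freqs = 2.0 ** torch.arange(num_freqs, dtype=x.dtype, device=x.device)
    angles = x[..., None] * freqs                               # (..., D, F)
    enc = torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)
    return enc.flatten(start_dim=-2)                            # (..., D*2F)

class TimeConditionedNeRF(nn.Module):
    # NeRF MLP taking a 3D position, a view direction, and a scalar time.
    def __init__(self, pos_freqs=10, dir_freqs=4, time_freqs=4, width=256):
        super().__init__()
        self.pos_freqs, self.dir_freqs, self.time_freqs = pos_freqs, dir_freqs, time_freqs
        pos_dim, dir_dim, time_dim = 3 * 2 * pos_freqs, 3 * 2 * dir_freqs, 1 * 2 * time_freqs
        self.trunk = nn.Sequential(
            nn.Linear(pos_dim, width), nn.ReLU(),
            nn.Linear(width, width), nn.ReLU(),
        )
        self.sigma_head = nn.Linear(width, 1)  # density: position-dependent only
        # Color is conditioned on view direction AND time, so appearance can
        # follow the changing illumination while the geometry stays fixed.
        self.color_head = nn.Sequential(
            nn.Linear(width + dir_dim + time_dim, width // 2), nn.ReLU(),
            nn.Linear(width // 2, 3), nn.Sigmoid(),
        )

    def forward(self, xyz, view_dir, t):
        h = self.trunk(positional_encoding(xyz, self.pos_freqs))
        sigma = torch.relu(self.sigma_head(h))                  # (N, 1)
        cond = torch.cat([h,
                          positional_encoding(view_dir, self.dir_freqs),
                          positional_encoding(t, self.time_freqs)], dim=-1)
        rgb = self.color_head(cond)                             # (N, 3)
        return rgb, sigma

def photometric_loss(rendered_rgb, target_rgb):
    # Standard pixel-wise photometric consistency loss (MSE).
    return torch.mean((rendered_rgb - target_rgb) ** 2)

# Usage with a batch of 1024 sampled points:
model = TimeConditionedNeRF()
rgb, sigma = model(torch.rand(1024, 3),    # positions
                   torch.rand(1024, 3),    # ray directions
                   torch.rand(1024, 1))    # per-frame time stamps
```

In this sketch the density branch ignores time, so the recovered geometry (and hence rendered depth) stays time-invariant while only the emitted color varies with illumination; the abstract does not state whether the paper also conditions density on time.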

Keywords

Computer Vision, Neural Radiance Fields, Dynamic Illumination, View Synthesis, 3D Scene Reconstruction

Bibliographic citation

3D Scene Reconstruction with Neural Radiance Fields (NeRF) Considering Dynamic Illumination Conditions / Olena Kolodiazhna, Volodymyr Savin, Mykhailo Uss, Nataliia Kussul // Proceedings of the International Conference on Applied Innovation in IT (ICAIIT). – 2023. – Pp. 233-238. – Bibliogr.: 19 ref.