MonoNeRF-DDP: Neural radiance fields from monocular endoscopic images with dense depth priors
Authors: Liu, J., Huang, D., Shi, Y. and Qu, J.
Journal: Computers and Graphics
Volume: 133
ISSN: 0097-8493
eISSN: 1873-7684
DOI: 10.1016/j.cag.2025.104487
Abstract: Synthesizing novel views from monocular endoscopic images is challenging due to sparse input views, occlusion of invalid regions, and soft tissue deformation. To tackle these challenges, we propose neural radiance fields from monocular endoscopic images with dense depth priors, called MonoNeRF-DDP. The algorithm consists of two parts: preprocessing and normative depth-assisted reconstruction. In the preprocessing part, we use labelme to obtain mask images of invalid regions in the endoscopic images, preventing those regions from being reconstructed. Then, to address view sparsity, we fine-tune a monocular depth estimation network to predict dense depth maps, enabling scene depth to be recovered from sparse views while optimizing the neural radiance fields. In the normative depth-assisted reconstruction, to handle soft tissue deformation and inaccurate depth information, we adopt neural radiance fields for dynamic scenes that take the mask images and dense depth maps as additional inputs, and we use the proposed adaptive loss function to achieve self-supervised training. Experimental results show that MonoNeRF-DDP outperforms the best average results of competing algorithms on the real monocular endoscopic image dataset GastroSynth. MonoNeRF-DDP can reconstruct structurally accurate shapes, fine details, and highly realistic textures from only about 15 input images. Furthermore, a study involving 14 participants with medical backgrounds indicates that MonoNeRF-DDP enables more accurate observation of disease-site details and supports more reliable preoperative diagnoses.
Source: Scopus; Web of Science (Lite)
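Note on the method described in the abstract: the adaptive, depth-assisted loss is not specified in this record. The sketch below is a minimal illustration of a masked, depth-prior-supervised objective for a NeRF-style renderer; the function, tensor names, weighting, and the Huber depth term are assumptions for illustration, not the authors' actual MonoNeRF-DDP loss.

    # Illustrative sketch only: a masked, depth-prior-supervised NeRF loss in PyTorch.
    # Names, weights, and the Huber depth term are assumptions, not MonoNeRF-DDP's loss.
    import torch
    import torch.nn.functional as F

    def masked_depth_supervised_loss(pred_rgb, gt_rgb, pred_depth, prior_depth,
                                     valid_mask, depth_weight=0.1):
        # pred_rgb, gt_rgb: (N_rays, 3) rendered vs. observed colors
        # pred_depth, prior_depth: (N_rays,) rendered depth vs. dense depth prior
        # valid_mask: (N_rays,) bool, False where the mask marks invalid regions
        m = valid_mask.float()
        denom = m.sum().clamp(min=1.0)

        # Photometric term, restricted to rays inside valid tissue regions.
        rgb_loss = (m.unsqueeze(-1) * (pred_rgb - gt_rgb) ** 2).sum() / denom

        # Depth term: pull rendered depth toward the dense monocular prior,
        # with a Huber penalty so outlier prior values are down-weighted.
        depth_loss = (m * F.huber_loss(pred_depth, prior_depth,
                                       reduction="none")).sum() / denom

        return rgb_loss + depth_weight * depth_loss

In this sketch the mask both excludes invalid-region rays from the photometric term and gates the depth prior; an adaptive variant in the spirit of the abstract could further weight the depth term per ray by an estimate of the prior's reliability.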