Poster
Variational Multi-scale Representation for Estimating Uncertainty in 3D Gaussian Splatting
Ruiqi Li · Yiu-ming Cheung
3D Gaussian Splatting (3DGS) has become a popular method for constructing dense 3D representations of appearance and geometry. However, the 3DGS learning pipeline inherently lacks the ability to quantify uncertainty, which is important in applications such as robotic mapping and navigation. In this work, we propose an uncertainty estimation method built upon a Bayesian inference framework. Specifically, we construct variational multi-scale 3D Gaussians, leveraging the explicit scale information in 3DGS parameters to build diversified samples in parameter space. We develop an offset-table technique that represents local multi-scale samples efficiently by offsetting a subset of parameters while sharing the rest; the offset table is then learned via variational inference under a multi-scale prior. The resulting model-space samples can be used in the forward pass to infer predictive uncertainty, and can further estimate the uncertainty of each individual Gaussian component. Extensive experiments on various benchmark datasets show that our method achieves state-of-the-art calibration on uncertainty quantification and significantly better rendering quality than previous methods that pair uncertainty quantification with view synthesis. We also show that, by leveraging the model-parameter uncertainty estimated by our method, noisy Gaussians can be removed automatically to obtain a high-fidelity portion of the reconstructed scene, substantially improving visual quality.
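The abstract's Monte Carlo idea (sample parameter offsets from a learned variational posterior, run forward passes, and read predictive uncertainty off the sample variance) can be sketched as follows. This is a minimal illustration, not the paper's implementation: `render` is a hypothetical stand-in for a 3DGS rasterizer, and the diagonal-Gaussian posterior over offsets (`offset_mu`, `offset_sigma`) is an assumption standing in for the learned offset table.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for learned base 3DGS parameters
# (positions, scales, opacities flattened into one vector).
base_params = rng.normal(size=128)

# Assumed variational posterior over the offset table: additive offsets
# drawn from a learned diagonal Gaussian q(offset) = N(mu, sigma^2);
# non-offset parameters are shared across samples.
offset_mu = np.zeros(128)
offset_sigma = 0.05 * np.ones(128)

# Fixed linear map used as a toy "renderer" below.
proj_matrix = np.random.default_rng(1).normal(size=(64, 128)) * 0.1

def render(params, H=8, W=8):
    """Hypothetical renderer stub mapping parameters to an image.
    A real pipeline would splat the 3D Gaussians here."""
    return (proj_matrix @ params).reshape(H, W)

# Monte Carlo predictive uncertainty: sample offsets, render each
# model-space sample, and take the per-pixel mean and variance.
n_samples = 32
renders = np.stack([
    render(base_params + offset_mu + offset_sigma * rng.normal(size=128))
    for _ in range(n_samples)
])

pred_mean = renders.mean(axis=0)  # expected rendering
pred_var = renders.var(axis=0)    # per-pixel predictive uncertainty
```

Per-Gaussian uncertainty follows the same pattern: instead of pixel variance, one would aggregate the variation of each Gaussian's own parameters across samples, and prune components whose uncertainty exceeds a threshold.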