Poster
3D Gaussian Can Be Sparser Than You Thought: Efficient Rendering via Learned Fragment Pruning
Zhifan Ye · Chenxi Wan · Chaojian Li · Jihoon Hong · Sixu Li · Leshu Li · Yongan Zhang · Celine Lin
Wed 11 Dec, 11 a.m. – 2 p.m. PST
Abstract:
3D Gaussian splatting has recently emerged as a promising technique for novel view synthesis from sparse image sets, yet it comes at the cost of requiring millions of 3D Gaussian primitives to reconstruct each 3D scene. This largely limits its use on resource-constrained devices. Despite advances in Gaussian pruning techniques that target individual 3D Gaussian primitives, the significant reduction in primitives often fails to translate into commensurate increases in rendering speed, impeding efficiency and practical deployment. We identify that this discrepancy arises from the overlooked impact of the number of fragments produced when 3D Gaussians are projected onto the 2D image plane. To bridge this gap and meet the growing demand for efficient on-device 3D Gaussian rendering, we propose \textit{fragment} pruning, an enhancement orthogonal to existing pruning methods that can significantly accelerate rendering by selectively pruning overlapping fragments. Our adaptive pruning framework dynamically optimizes the pruning threshold for each Gaussian fragment, markedly improving rendering speed and quality. Extensive experiments on both static and dynamic scenes validate our approach. For instance, by integrating our fragment pruning technique (which sharpens the output of pruned 3D Gaussians) with state-of-the-art Gaussian pruning methods, we achieve up to a 1.71$\times$ speedup in real-world scenes and improve rendering quality by an average of 0.16 dB PSNR on the Tanks\&Temples dataset. The source code is open-sourced for review.
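To make the idea concrete, below is a minimal sketch of learned, per-Gaussian fragment pruning. Here a "fragment" is one (Gaussian, pixel) contribution after projection onto the image plane, and fragments whose alpha falls below a learned per-Gaussian threshold are suppressed. The function names (`fragment_alphas`, `pruned_alphas`), the sigmoid relaxation with temperature `tau`, and the toy per-pixel loop are illustrative assumptions, not the paper's actual API; the real method operates inside a tile-based CUDA rasterizer.

```python
# Hypothetical sketch of per-Gaussian learned fragment pruning
# (illustrative names; the paper's implementation lives in a CUDA rasterizer).
import torch

def fragment_alphas(means2d, inv_cov2d, opacity, pixels):
    """Per-(Gaussian, pixel) alpha: opacity * exp(-0.5 * d^T Sigma^{-1} d)."""
    d = pixels[None, :, :] - means2d[:, None, :]          # [G, P, 2]
    # Mahalanobis distance under each Gaussian's projected 2D covariance.
    q = torch.einsum('gpi,gij,gpj->gp', d, inv_cov2d, d)  # [G, P]
    return opacity[:, None] * torch.exp(-0.5 * q)         # [G, P]

def pruned_alphas(alphas, thresh_logit, tau=50.0):
    """Soft-prune fragments below a learned per-Gaussian threshold."""
    thresh = torch.sigmoid(thresh_logit)[:, None]         # [G, 1]
    keep = torch.sigmoid(tau * (alphas - thresh))         # soft keep mask
    return alphas * keep

# Toy usage: 3 Gaussians, a 4x4 pixel grid, one learnable threshold each.
G, H, W = 3, 4, 4
means2d = torch.rand(G, 2) * 4
inv_cov2d = torch.eye(2).expand(G, 2, 2).clone()
opacity = torch.rand(G)
thresh_logit = torch.zeros(G, requires_grad=True)  # optimized with the scene

ys, xs = torch.meshgrid(torch.arange(H, dtype=torch.float32),
                        torch.arange(W, dtype=torch.float32), indexing='ij')
pixels = torch.stack([xs.flatten(), ys.flatten()], dim=-1)  # [P, 2]

alphas = pruned_alphas(fragment_alphas(means2d, inv_cov2d, opacity, pixels),
                       thresh_logit)
alphas.sum().backward()  # thresholds receive gradients through the soft mask
print(thresh_logit.grad)
```

Under these assumptions, the soft mask lets each threshold be trained jointly with the scene; at inference, replacing it with a hard test (skip the fragment when alpha is below the threshold) would remove that fragment's blending work entirely, which is where the rendering speedup comes from.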