DeSplat: Decomposed Gaussian Splatting for Distractor-Free Rendering

Gaussian splatting enables fast novel view synthesis in static 3D environments. However, reconstructing real-world environments remains challenging because distractors or occluders break the multi-view consistency assumption required for accurate 3D reconstruction. Most existing methods rely on external semantic information from pre-trained models, which introduces additional computational overhead either as a pre-processing step or during optimization. In this work, we propose a novel method, DeSplat, that directly separates distractors from static scene elements purely based on volume rendering of Gaussian primitives. We initialize Gaussians within each camera view to reconstruct the view-specific distractors, so that the static 3D scene and the distractors are modeled separately in the alpha compositing stage. DeSplat yields an explicit scene separation of static elements and distractors, achieving results comparable to prior distractor-free approaches without sacrificing rendering speed. We demonstrate DeSplat's effectiveness on three benchmark datasets for distractor-free novel view synthesis.
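The separation idea can be illustrated with a minimal sketch. Assuming the per-view distractor Gaussians and the shared static-scene Gaussians have each been rasterized into per-pixel color and accumulated-opacity maps, a standard front-to-back "over" composite blends them, with the distractor layer occluding the static scene. The function name `composite_decomposed`, the tensor shapes, and the treatment of the static layer as a fully composited background are illustrative assumptions, not the paper's exact formulation:

```python
import torch


def composite_decomposed(
    distractor_rgb: torch.Tensor,    # (H, W, 3) color rendered from view-specific distractor Gaussians
    distractor_alpha: torch.Tensor,  # (H, W, 1) accumulated opacity of the distractor layer
    static_rgb: torch.Tensor,        # (H, W, 3) color rendered from the shared static-scene Gaussians
) -> torch.Tensor:
    """Front-to-back alpha compositing of the two decomposed renderings.

    A generic "over" operation as a stand-in for the blending DeSplat
    performs inside the rasterizer's alpha compositing stage: wherever the
    distractor layer is opaque it dominates; elsewhere the static scene
    shows through.
    """
    return distractor_rgb * distractor_alpha + static_rgb * (1.0 - distractor_alpha)


if __name__ == "__main__":
    H, W = 4, 4
    d_rgb = torch.rand(H, W, 3)
    d_alpha = torch.rand(H, W, 1)
    s_rgb = torch.rand(H, W, 3)
    out = composite_decomposed(d_rgb, d_alpha, s_rgb)
    print(out.shape)  # torch.Size([4, 4, 3])
```

Under this reading, rendering only the static branch yields a distractor-free novel view, while compositing both branches reproduces the captured image for photometric supervision, which is consistent with the explicit scene separation the abstract describes.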
