2412.01745.md

File metadata and controls

8 lines (5 loc) · 1.87 KB

Horizon-GS: Unified 3D Gaussian Splatting for Large-Scale Aerial-to-Ground Scenes

Seamless integration of aerial and street-view images remains a significant challenge in neural scene reconstruction and rendering. Existing methods predominantly focus on a single domain, limiting their applicability in immersive environments, which demand extensive free-view exploration with large viewpoint changes both horizontally and vertically. We introduce Horizon-GS, a novel approach built upon Gaussian Splatting techniques that tackles unified reconstruction and rendering for aerial and street views. Our method addresses the key challenges of combining these perspectives with a new training strategy, overcoming viewpoint discrepancies to generate high-fidelity scenes. We also curate a high-quality aerial-to-ground dataset encompassing both synthetic and real-world scenes to advance further research. Experiments across diverse urban scene datasets confirm the effectiveness of our method.
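For context on the rendering backbone the abstract refers to: Gaussian Splatting methods composite depth-sorted 3D Gaussians into each pixel via front-to-back alpha blending. The sketch below illustrates only that standard per-pixel compositing step, not Horizon-GS's training strategy; the function name and inputs are illustrative assumptions.

```python
import numpy as np

def splat_pixel(colors, alphas):
    """Front-to-back alpha compositing of depth-sorted Gaussians at one pixel.

    colors: per-Gaussian RGB contributions, nearest first.
    alphas: per-Gaussian opacities in [0, 1], same order.
    """
    out = np.zeros(3)
    transmittance = 1.0  # fraction of light not yet absorbed
    for color, alpha in zip(colors, alphas):
        out += transmittance * alpha * np.asarray(color, dtype=float)
        transmittance *= 1.0 - alpha  # nearer Gaussians occlude farther ones
    return out
```

A nearer Gaussian at opacity 0.5 halves the contribution of everything behind it, which is why the sort order (and, in aerial-to-ground settings, the large viewpoint change) matters for the blended result.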