Novel-view synthesis techniques based on Neural Radiance Fields [Mildenhall et al. 2020], Plenoxels [Fridovich-Keil et al. 2022], or, most recently and best known, 3D Gaussian Splatting [Kerbl et al. 2023, Liu et al. 2024] enable the visually high-fidelity representation of surfaces that are hard or even almost impossible to reconstruct using classic photogrammetric approaches. Examples of such surfaces include fur, vegetation, transparent or translucent objects, and thin structures in general. These novel-view synthesis approaches faithfully interpolate the color information contained in a set of high-quality input images. Novel views can be rendered in real time, provided one has access to reasonably powerful graphics hardware.

In a previous project, we explored the Gaussian Splatting literature and optimized an existing Unity-based rendering plugin for efficient rendering of Gaussian-based scenes on desktop graphics hardware. We also identified several challenges in rendering these models on mobile devices. In this project, we aim to build on these insights and optimize Gaussian Splatting algorithms for mobile mixed reality (MR) devices such as the Meta Quest 3, or other mobile devices such as tablets. We will research, implement, and evaluate promising techniques in areas such as visibility culling, output-sensitive rendering, data compression, and hybrid representations. Our goal is to fully leverage mobile hardware for real-time rendering at appropriate quality levels. Beyond the challenge of efficiently rendering on low-power MR hardware, we want to address related research questions with part of the project team, such as how to interact with scene elements consisting of hundreds of thousands of unstructured Gaussian-based primitives, or how to convincingly blend Gaussian-Splatting-based scenes with the camera streams of mixed-reality devices.
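To give a flavor of one of the technique areas mentioned above, the following is a minimal, CPU-side sketch of visibility culling combined with the back-to-front depth ordering that Gaussian Splatting rasterizers rely on for correct alpha blending. All function and variable names are illustrative assumptions for this sketch; they do not come from the project's codebase, which performs this work on the GPU.

```python
# Illustrative sketch (not project code): cull Gaussian splat centers
# against the view frustum and sort survivors back-to-front by depth.
import math

def perspective(fov_y_deg, aspect, near, far):
    """Standard OpenGL-style perspective projection matrix (row-major)."""
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)
    return [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ]

def cull_and_sort(centers, proj):
    """Return indices of splat centers inside the clip volume, sorted
    back-to-front by depth. Centers are assumed to be in view space,
    with the camera at the origin looking down the -z axis."""
    visible = []
    for i, (x, y, z) in enumerate(centers):
        # Project to clip space: clip = proj @ [x, y, z, 1]
        clip = [sum(row[j] * v for j, v in enumerate((x, y, z, 1.0)))
                for row in proj]
        w = clip[3]
        if w <= 0.0:
            continue  # behind the camera
        if any(abs(c / w) > 1.0 for c in clip[:3]):
            continue  # outside the normalized device cube
        visible.append((-z, i))  # -z is the distance in front of the camera
    visible.sort(reverse=True)   # farthest first => back-to-front
    return [i for _, i in visible]
```

For example, with `proj = perspective(90.0, 1.0, 0.1, 100.0)` and centers at `(0, 0, -5)`, `(0, 0, -2)`, `(0, 0, 5)` (behind the camera), and `(50, 0, -5)` (outside the frustum), only the first two survive, ordered farthest first. A real implementation would do this per frame on the GPU (e.g. with a parallel radix sort), which is exactly where mobile hardware constraints make the techniques listed above interesting.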
If you are experienced or interested in real-time computer graphics and/or topics in the field of mixed reality, we would be excited to welcome you to our project! We will provide you with a Quest 3 for the duration of the project and will address the challenges of rendering photorealistic real-world datasets on low-power MR hardware.