This paper extends 3D Gaussian Splatting (3DGS) to handle multispectral data by representing each Gaussian with per-band spherical harmonics and optimizing under a dual loss that combines RGB and multispectral supervision. By performing spectral-to-RGB conversion at the pixel level, the method preserves richer spectral cues during optimization, leading to improved rendering fidelity, especially in scenes with translucent materials and anisotropic reflections. Experiments on real-world datasets demonstrate consistent improvements in image quality and spectral consistency compared to RGB-only 3DGS.
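The dual supervision described above can be sketched as a weighted sum of an RGB photometric term and a multispectral term. The sketch below assumes simple L1 losses and a weighting factor `lam`; the paper's actual loss terms and weighting may differ.

```python
import numpy as np

def dual_loss(pred_ms: np.ndarray, gt_ms: np.ndarray,
              pred_rgb: np.ndarray, gt_rgb: np.ndarray,
              lam: float = 0.5) -> float:
    """Combine an RGB and a multispectral L1 loss (illustrative only).

    pred_ms / gt_ms:   (H, W, B) rendered and ground-truth spectral images
    pred_rgb / gt_rgb: (H, W, 3) rendered and ground-truth RGB images
    lam:               hypothetical weight balancing the two terms
    """
    l_rgb = np.abs(pred_rgb - gt_rgb).mean()
    l_ms = np.abs(pred_ms - gt_ms).mean()
    return float(l_rgb + lam * l_ms)

# Perfect reconstruction in both domains yields zero loss.
ms = np.ones((2, 2, 8), dtype=np.float32)
rgb = np.ones((2, 2, 3), dtype=np.float32)
print(dual_loss(ms, ms, rgb, rgb))  # 0.0
```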
Multispectral 3D Gaussian Splatting unlocks more realistic rendering of translucent and reflective objects, outperforming RGB-only methods by directly modeling spectral radiance.
We present a multispectral extension to 3D Gaussian Splatting (3DGS) for wavelength-aware view synthesis. Each Gaussian is augmented with spectral radiance, represented via per-band spherical harmonics, and optimized under a dual-loss supervision scheme combining RGB and multispectral signals. To improve rendering fidelity, we perform spectral-to-RGB conversion at the pixel level, allowing richer spectral cues to be retained during optimization. Our method is evaluated on both public and self-captured real-world datasets, demonstrating consistent improvements over the RGB-only 3DGS baseline in terms of image quality and spectral consistency. Notably, it excels in challenging scenes involving translucent materials and anisotropic reflections. The proposed approach maintains the compactness and real-time efficiency of 3DGS while laying the foundation for future integration with physically based shading models.
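The pixel-level spectral-to-RGB conversion mentioned in the abstract can be illustrated as a per-pixel linear projection of the rendered multispectral image onto RGB. The sketch below is a minimal assumption-laden version: the `band_weights` matrix (e.g. derived from a camera spectral response or color-matching functions) and all names are illustrative, not taken from the paper.

```python
import numpy as np

def spectral_to_rgb(spectral_img: np.ndarray, band_weights: np.ndarray) -> np.ndarray:
    """Project an (H, W, B) spectral image to (H, W, 3) RGB per pixel.

    band_weights is a hypothetical (B, 3) matrix mapping each spectral band
    to its contribution in the R, G, and B channels.
    """
    return np.clip(spectral_img @ band_weights, 0.0, 1.0)

H, W, B = 4, 4, 8
rng = np.random.default_rng(0)
spectral = rng.random((H, W, B), dtype=np.float32)
weights = rng.random((B, 3), dtype=np.float32)
weights /= weights.sum(axis=0, keepdims=True)  # each RGB channel sums to 1
rgb = spectral_to_rgb(spectral, weights)
print(rgb.shape)  # (4, 4, 3)
```

Because the projection happens per pixel after rasterization, gradients from an RGB loss still flow back through every spectral band, which is what lets the optimization retain the richer spectral cues.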