A playable demo renders game environments using Gaussian Splatting, replacing hand-authored meshes with real-world photogrammetric capture.
Key Takeaways
Gaussian Splats serve as the full environment geometry, with real-world scenes scanned and rendered in a playable game loop.
Runs smoothly on Apple Silicon (M4 Max confirmed); production readiness for complex, large-scale scenes is still limited.
No dynamic lighting or real-time effects are implemented; the visual output resembles the pre-baked lightmaps of mid-2000s engines.
Characters are rendered as traditional meshes layered over the splat environment, creating a visible style mismatch.
Hybrid use cases are plausible: splats for static natural elements like grass and shrubbery, meshes for anything dynamic or interactive.
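The hybrid approach above can be sketched as a simple frame plan: draw dynamic triangle meshes first (depth-tested, writing Z), then alpha-blend the static splat scenery back-to-front against that depth buffer. This is a minimal illustrative sketch, not the demo's actual renderer; the `SceneObject` type and `plan_hybrid_frame` function are hypothetical names invented for this example.

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    dynamic: bool   # animated / interactive -> triangle mesh
    depth: float    # view-space distance, used only to sort splats

def plan_hybrid_frame(objects):
    """Return a draw-call plan for one frame of a hypothetical hybrid renderer.

    Dynamic objects stay as traditional meshes and draw first; static
    scenery (grass, shrubbery) is rendered as Gaussian splats, blended
    back-to-front so alpha compositing against the mesh depth buffer
    works correctly.
    """
    meshes = [o for o in objects if o.dynamic]
    splats = sorted((o for o in objects if not o.dynamic),
                    key=lambda o: o.depth, reverse=True)  # far-to-near
    plan = [("mesh_pass", o.name) for o in meshes]
    plan += [("splat_blend", o.name) for o in splats]
    return plan
```

Splitting the scene this way also sidesteps the style mismatch in part: the seam between meshes and splats is unavoidable, but keeping interactive objects on the mesh path preserves conventional animation and physics pipelines.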
Hacker News Comment Review
Memory scaling is a hard wall for builders: commenters report running out of VRAM on an RTX 6000 with relatively small environments, and no practical tips have surfaced yet.
The visual fidelity argument is contested – critics note the lack of dynamic lighting makes it look like a 2006 title; supporters frame the photorealism and scene capture pipeline as genuinely novel.
Open infrastructure questions dominate the thread: per-frame cost versus triangle meshes, browser delivery viability, and whether consumer capture hardware like Insta360 can feed an open-source splat pipeline are all unresolved.
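The VRAM wall is easy to reproduce with a back-of-envelope calculation. In the reference 3D Gaussian Splatting representation, each Gaussian stores 3 position, 3 scale, 4 rotation (quaternion), and 1 opacity float, plus 3 × (sh_degree + 1)² spherical-harmonic color coefficients: 59 float32 values (236 bytes) at the usual SH degree 3. The sketch below estimates raw parameter storage only; training-time gradients and render-time sorting buffers add substantially more, and the function name is an invented convenience, not an API from any splatting library.

```python
def splat_vram_bytes(num_gaussians, sh_degree=3, bytes_per_float=4):
    """Raw storage for uncompressed 3D Gaussian Splatting parameters.

    Per-Gaussian layout: 3 position + 3 scale + 4 rotation (quaternion)
    + 1 opacity, plus 3 * (sh_degree + 1)**2 SH color coefficients
    (48 floats at degree 3, for 59 floats / 236 bytes per Gaussian).
    """
    sh_coeffs = 3 * (sh_degree + 1) ** 2
    floats_per_gaussian = 3 + 3 + 4 + 1 + sh_coeffs
    return num_gaussians * floats_per_gaussian * bytes_per_float
```

A 30-million-Gaussian scene works out to roughly 7 GB of parameters alone, which makes both the reported RTX 6000 exhaustion and the file-size concern for browser delivery unsurprising without aggressive compression or SH truncation.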
Notable Comments
@marlburrow: asks for per-frame cost data versus triangle-mesh equivalents and flags browser support and file size as “the two big walls” blocking splats from becoming default web-delivered 3D.
@swiftcoder: flags meshed characters layered over photorealistic splat scenery as “sort of unfortunate” – the hybrid seam is visually obvious.
@fho: notes Gaussian Splatting makes it trivially easy to scan and recreate real physical spaces, with obvious dual-use implications.