Develop an asset pipeline system that allows users to speak 3D objects into existence or take a snapshot of reality
The application was built in only three days and is loosely based on the Star Trek Holodeck. We aimed for a natural-language UX, generative 3D asset creation, and reality-to-3D conversion.
Year
2025
Responsibilities
UX Design, Interactions, Unity implementation
Partners
David Finsterwalder
Eleni Kofekidou
Justin Loose
Michael Deering
Process
This project was built on the idea of liberation through technology. We wanted to create a shared space where people could explore the potential of AI and MR and feel unbound.
Challenges Faced
During our brief development window, we encountered several technical hurdles. On day one, we built a fast, intuitive pipeline for object creation and manipulation. However, a last-minute platform constraint requiring standalone Quest 3 support forced us to pivot away from our initial solution. The rollback cost us significant progress and left us relying on slower, costlier, and less interactive API calls to keep the core features working.
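To give a sense of what that pivot looked like in practice, the sketch below shows how a text prompt might be sent from Unity to a remote generation service with UnityWebRequest. The endpoint URL, payload shape, and response handling are illustrative assumptions, not the project's actual API.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Hypothetical sketch: sending a generation prompt to a remote service,
// since on-device inference was not feasible on a standalone Quest 3.
public class RemoteAssetRequester : MonoBehaviour
{
    [SerializeField] private string endpoint = "https://example.com/generate"; // placeholder URL

    public void RequestAsset(string prompt)
    {
        StartCoroutine(SendPrompt(prompt));
    }

    private IEnumerator SendPrompt(string prompt)
    {
        string body = JsonUtility.ToJson(new PromptPayload { prompt = prompt });

        using (UnityWebRequest req = new UnityWebRequest(endpoint, UnityWebRequest.kHttpVerbPOST))
        {
            req.uploadHandler = new UploadHandlerRaw(System.Text.Encoding.UTF8.GetBytes(body));
            req.downloadHandler = new DownloadHandlerBuffer();
            req.SetRequestHeader("Content-Type", "application/json");

            // The round trip to the server is where the added latency and
            // per-request cost mentioned above come from.
            yield return req.SendWebRequest();

            if (req.result != UnityWebRequest.Result.Success)
            {
                Debug.LogWarning($"Generation request failed: {req.error}");
                yield break;
            }

            // In a real pipeline the response would carry a mesh (e.g. a .glb)
            // or a URL to one, handed off to a runtime importer.
            Debug.Log($"Received {req.downloadHandler.data.Length} bytes of asset data");
        }
    }

    [System.Serializable]
    private class PromptPayload
    {
        public string prompt;
    }
}
```

Running generation remotely keeps the standalone headset responsive, but at the cost of the latency and per-call expense described above.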
Final Push & Achievements Despite the setbacks and running on little to no sleep, our team was able to recover much of the lost functionality and bring the project close to our original vision. We achieved a smooth, voice-driven user experience that allowed intuitive interaction without the need for complex UI. One of the standout features was our Reality-to-3D capture functionality, which enabled users to convert real-world objects into usable 3D assets. Additionally, we implemented a fully creative asset generation pipeline, allowing for dynamic and personalized content creation within the MR space. These core features came together to form a compelling, interactive experience despite the technical limitations.