Noodle
Spatial AI in collaborative AR (2026)
2x Winner of the 2026 MIT Reality Hack




- Spectacles: Best Use of Spatial AI
- Founders Lab Track
Identified problem space:
Designers constantly switch between various platforms during their process.
Every tool switch breaks the flow state, and co-creation lives in a separate workflow.
Why can't we just have one infinite canvas that does it all?
Project video:

Project links:
Development Team:
SNAK studios:
- Stacey Cho (UI/UX Designer, Frontend Developer, Branding)
- Neha Sajja (UI/UX Designer, Branding)
- Ash Shah (Backend Developer)
- Kavin Kumar Balamurugan (Full-stack Developer)



Development Process:
Our workflow: ideation - wireframing - prototyping - testing - iteration
Tools: Snap Spectacles, Lens Studio, Gemini API, Snap3D API, Figma, After Effects, Premiere Pro
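
To give a sense of how the Gemini piece fits in, here's a minimal sketch of sending a prompt to Gemini's REST API and reading back the generated text. The endpoint, model name, and request shape follow Google's public API docs, but the askGemini helper and the plain TypeScript/fetch wiring are illustrative assumptions for this page, not our actual Lens Studio integration code.

```typescript
// Illustrative sketch only: sends a text prompt to the Gemini REST API.
// Endpoint and request body follow Google's public docs; the helper name
// and how it would plug into the lens are assumptions.
const GEMINI_URL =
  "https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent";

async function askGemini(prompt: string, apiKey: string): Promise<string> {
  const res = await fetch(`${GEMINI_URL}?key=${apiKey}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      contents: [{ parts: [{ text: prompt }] }],
    }),
  });
  if (!res.ok) throw new Error(`Gemini request failed: ${res.status}`);
  const data = await res.json();
  // Return the first candidate's text, or an empty string if none came back.
  return data?.candidates?.[0]?.content?.parts?.[0]?.text ?? "";
}

// Example: generate a quick brainstorm note to pin onto the canvas.
askGemini("Suggest three layout ideas for an ocean-themed moodboard", "<YOUR_API_KEY>")
  .then((text) => console.log(text))
  .catch((err) => console.error(err));
```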
Reflections:
This was my first MIT Reality Hack, and I'm incredibly grateful for how well our team worked together, even through the raging snowstorm that hit during the hackathon.
I can't wait to work more with AR glasses. Next steps are to enable full API connections across various platforms and open up more possibilities for users!
Connect with me: staceycho@mde.harvard.edu