Noodle

Spatial AI in collaborative AR (2026)

2x Winner of the 2026 MIT Reality Hack

Spectacles: Best Use of Spatial AI

Founders Lab Track

Identified problem space:

Designers constantly switch between platforms throughout their process. Every tool switch breaks the flow state, and co-creation lives in a separate workflow.

Why can't we just have one infinite canvas that does it all?

Project video: 

Development Team: 

SNAK Studios:
- Stacey Cho (UI/UX designer, frontend developer, branding)
- Neha Sajja (UI/UX designer, branding)
- Ash Shah (backend developer)
- Kavin Kumar Balamurugan (full-stack developer)


Development Process: 

Our workflow: ideation → wireframing → prototyping → testing → iteration

Tools: Snap Spectacles, Lens Studio, Gemini API, Snap3D API, Figma, After Effects, Premiere Pro

Reflections: 

This was my first MIT Reality Hack, and I'm incredibly grateful for my team's dynamic, even through the raging snowstorm that hit during the hackathon.

Can't wait to work more with AR glasses! Next steps are to enable full API connections across platforms, opening up more empowered user possibilities.

Connect with me: staceycho@mde.harvard.edu
