Microbial Memories

Machine Learning, Brain-Computer Interface (2025)

Identified problem space: 

Can personal memories encoded into living algae be interpreted by real-time, reactive AI to retell narratives? What language bridges the gap between the natural and the artificial?

Microbial Memories works to inscribe cognition into living matter, ultimately transforming internal human memories into natural and artificial morphologies. It opens new possibilities for creating biological-autographic interfaces.

Development Team: 

- Stacey Cho (Design Engineer, Full-stack + AI Engine Developer, Digital Fabrication)
- Xixi Li (Design Engineer, Frontend Developer, Hardware)
- Manini Banerjee (Design Engineer, Bio-designer)

Development Process: 

Our workflow: ideation - prototyping - development - iteration - installation

Tools: ComfyUI, PyTorch, Hugging Face, Muse EEG headset, TouchDesigner, Figma, laser cutter
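The page does not document how EEG data drove the AI engine, but one common first step in a Muse-based pipeline is extracting band power from a short EEG window and normalizing it into a control parameter (e.g., to drive TouchDesigner visuals or condition generation). The sketch below is a minimal, hypothetical illustration of that step using a synthetic signal; the function and parameter names are my own assumptions, not the project's actual code, and live Muse streaming (e.g., via LSL) is omitted.

```python
import math

def band_power(samples, fs, lo, hi):
    """Signal power in the frequency band [lo, hi) Hz, via a naive DFT.
    Fine for short analysis windows; a real pipeline would use an FFT."""
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n  # frequency of DFT bin k
        if lo <= freq < hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            power += (re * re + im * im) / n
    return power

# Synthetic 1-second window: a pure 10 Hz "alpha" sine sampled at 256 Hz,
# standing in for one channel of Muse EEG data.
fs = 256
window = [math.sin(2 * math.pi * 10 * i / fs) for i in range(fs)]

alpha = band_power(window, fs, 8, 12)    # alpha band (8-12 Hz) dominates here
beta = band_power(window, fs, 13, 30)    # beta band (13-30 Hz), near zero here

# Normalize to a 0..1 "calm" parameter that could drive visuals or generation.
calm = alpha / (alpha + beta + 1e-9)
print(round(calm, 3))  # near 1.0 for this pure alpha-band signal
```

In an installation, a value like `calm` would typically be recomputed each window and sent onward (e.g., over OSC to TouchDesigner), rather than printed.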

Reflections: 

This was my first time working with brain-computer interfaces (BCI) and developing a complete local AI engine integrated with physical devices. I also learned a lot about what it means to work with living matter (algae), as we constantly conducted lab work and maintained conditions to keep it alive. It also felt good to get back into physical prototyping and digital fabrication, and the whole project truly felt like an adventure.

I am deeply grateful to my team for collaborating constantly and iterating flexibly alongside me. Microbial Memories could not have been completed without our teamwork, and I also thank the participants for being open to sharing their personal memories and narratives with us.

Connect with me: staceycho@mde.harvard.edu

bottom of page