Lumen is an experimental Mixed Reality storytelling platform that immerses people in alternate realities within their natural space, using machine learning and projection mapping. It explores a new kind of media that takes advantage of the physical world by overlaying a layer of digital fiction on top of it.

The main inspiration behind Lumen was imagining a future without screens, and asking how people would interact with the environment around them.

This led to the design question: “How might we immerse people in alternate reality experiences without isolating them in headsets?” Popular immersive AR/VR technologies all rely on headsets or screens. Lumen challenges this constraint and explores how people can feel immersed in their natural space by merging bits with atoms.

The Technology

Lumen uses the YOLO object detector, running on the Darknet framework, to classify objects in the scene; an onboard algorithm then generates stories on top of those classified objects. Narratives for the platform are designed by storytellers and game designers, who have access to a backend graphical interface that serves as the story builder and dashboard.
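The detection-to-story step can be sketched as follows. This is a minimal illustration, not Lumen's actual code: detections are assumed to arrive as (label, confidence) pairs from YOLO, and `STORY_EVENTS` is a hypothetical stand-in for the mapping that storytellers would author in the story-builder interface.

```python
# Minimal sketch of the detection-to-narrative pipeline (illustrative only).
# Assumes the classifier emits (label, confidence) pairs; STORY_EVENTS is a
# hypothetical stand-in for the designer-authored mapping from the story builder.

CONFIDENCE_THRESHOLD = 0.5

# Hypothetical mapping from detected object classes to story events.
STORY_EVENTS = {
    "book": "The book's pages begin to glow with forgotten runes.",
    "cup": "Steam from the cup swirls into a tiny storm cloud.",
    "chair": "The chair creaks, remembering everyone who ever sat in it.",
}

def generate_story(detections):
    """Map confident detections to story overlays to be projected."""
    overlays = []
    for label, confidence in detections:
        if confidence < CONFIDENCE_THRESHOLD:
            continue  # ignore uncertain detections
        event = STORY_EVENTS.get(label)
        if event is not None:  # only objects the designer wrote events for
            overlays.append((label, event))
    return overlays

# Example: raw detections from the object classifier.
detections = [("book", 0.92), ("cup", 0.81), ("plant", 0.77), ("chair", 0.31)]
print(generate_story(detections))
```

In a real pipeline the overlays would also carry the bounding boxes of the detected objects, so the projector can render each story element at the object's physical location.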

The hardware consists of a laser projector combined with a camera and depth sensors, which together align the projected imagery with the physical scene to create the projection mapping experience for the user.
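Aligning projector output with what the camera sees typically relies on a calibrated homography between the two image planes. The sketch below shows that alignment step under stated assumptions: the 3x3 matrix `H` would come from a calibration routine (the values here are placeholders, not Lumen's), and the function warps a camera-space point into projector space.

```python
# Sketch: warping a camera-space point into projector space with a homography.
# In practice H comes from calibration (e.g. projecting a known pattern and
# matching it in the camera image); the matrix below is a placeholder.

def apply_homography(H, x, y):
    """Map camera pixel (x, y) to a projector pixel using homography H."""
    # Homogeneous coordinates: [x', y', w'] = H @ [x, y, 1]
    xp = H[0][0] * x + H[0][1] * y + H[0][2]
    yp = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return (xp / w, yp / w)  # perspective divide

# Placeholder calibration: projector offset 10 px right, 5 px up of the camera.
H = [[1.0, 0.0, 10.0],
     [0.0, 1.0, -5.0],
     [0.0, 0.0,  1.0]]

print(apply_homography(H, 320.0, 240.0))  # → (330.0, 235.0)
```

Depth sensing extends this idea to non-planar surfaces, where a single homography is no longer enough and the projection must be warped per-surface.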

For further documentation: project page.