Objectifier

Objectifier empowers people to train objects in their daily environment to respond to their unique behaviours. It offers the experience of training an artificial intelligence, shifting people from passive consumers to active, playful directors of domestic technology. Interacting with Objectifier is much like training a dog – you teach it only what you want it to care about. Just like a dog, it sees and understands its environment.

Using computer vision and a neural network, complex behaviours are associated with your commands. For example, you might want to turn on your radio with your favourite dance move: connect the radio to the Objectifier and use the training app to show it when the radio should turn on. In this way, people can experience new interactive ways to control objects, building a creative relationship with technology without any programming knowledge.
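The train-by-example idea can be illustrated with a minimal sketch. This is not the Objectifier's actual code: the feature vectors, labels and the nearest-neighbour classifier (a stand-in for the real neural network) are all assumptions for illustration.

```python
# Hypothetical sketch: associate pose features (e.g. coordinates derived
# from computer vision) with an on/off command, by example.
import math

def train(examples):
    """examples: list of (feature_vector, label) pairs the user demonstrates."""
    return list(examples)  # a 1-nearest-neighbour "model" is just the stored examples

def predict(model, features):
    """Return the label of the stored example closest to the live features."""
    _, label = min(model, key=lambda ex: math.dist(ex[0], features))
    return label

# Teach it: an arms-up pose means "on", arms-down means "off".
model = train([
    ((0.9, 0.9), "on"),   # both hands raised
    ((0.1, 0.1), "off"),  # both hands lowered
])
print(predict(model, (0.8, 0.95)))  # → on
```

The point of the sketch is that no programming is needed at use time: the user only supplies demonstrations, and the classifier generalises to nearby poses.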

Prototype 1 – Pupil
A physical interface for the machine learning program “Wekinator”. It served as a remote control to explore different ideas: pressing the red or white button records training data, and the blue button toggles the neural network between processing the recorded data and running live feedback. Later it became a prop for talking to dog trainers about the physical manifestation of machine learning.

Prototype 2 – Trainee v1
A prototyping tool that lets makers train any input sensor and connect it to an output without writing code. Trainee can combine and cross multiple output pins to create more complex training results.
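The idea of combining and crossing output pins can be sketched as a simple mapping. The class names and pin numbers below are hypothetical, not taken from the Trainee hardware.

```python
# Hypothetical sketch of Trainee-style pin combination: each trained
# class maps to a set of output pins, and detected classes can be
# combined to drive several pins at once.
PIN_MAP = {            # assumed class-to-pin wiring, for illustration only
    "wave": {2},
    "clap": {3},
    "both": {2, 3},    # a combined class crosses two output pins
}

def pins_for(detected_classes):
    """Union of output pins for every class the sensor detected."""
    active = set()
    for cls in detected_classes:
        active |= PIN_MAP.get(cls, set())
    return sorted(active)

print(pins_for(["wave", "clap"]))  # → [2, 3]
```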

Prototype 3 – Trainee v2
A refined version of the Trainee v1, released as an open-source PCB design for building your own Trainee board. This build is based on a small Teensy microcontroller and comes with a digital interface called Coach.

Prototype 4 – Intern
An extension for the Trainee v1 that controls devices as the output pin. The Intern has a power outlet with a relay, so the Trainee v1 can train objects running on 230 V mains power. Its purpose was to invite non-makers and average consumers to manipulate objects they can relate to, and to inspire custom problem-solving in their own contexts.

Prototype 5 – Apprentice
Designed to combine all the learnings from the previous prototypes in one device. Apprentice uses computer vision as its sensor input and can be controlled wirelessly from a mobile app, where feedback is given. With a Raspberry Pi 3 as its brain, it runs a custom server to connect the app and the neural network.
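How such a server might route app messages between recording, training and running can be sketched as a small command dispatcher. Everything here is an assumption: the command names, the in-memory example store, and the nearest-neighbour stand-in for the real neural network.

```python
# Hypothetical sketch of an Apprentice-style server loop: the mobile app
# sends simple commands, and the server switches between recording
# examples, training, and classifying live input.
import math

class TrainingServer:
    def __init__(self):
        self.examples = []   # (features, label) pairs sent by the app
        self.running = False

    def handle(self, command, payload=None):
        if command == "record":            # app sends a labelled sample
            self.examples.append(payload)
            return f"recorded {len(self.examples)} samples"
        if command == "train":             # freeze the data set, start running
            self.running = True
            return "running"
        if command == "predict" and self.running:
            # nearest stored example decides the label (stand-in for a real net)
            _, label = min(self.examples,
                           key=lambda ex: math.dist(ex[0], payload))
            return label
        return "ignored"

server = TrainingServer()
server.handle("record", ((1.0, 1.0), "on"))
server.handle("record", ((0.0, 0.0), "off"))
server.handle("train")
print(server.handle("predict", (0.9, 0.8)))  # → on
```

In the real device this dispatcher would sit behind a wireless transport so the app can drive it remotely; the sketch keeps only the control flow.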

Prototype 6 – Objectifier
A smaller, friendlier and smarter version of the Apprentice. It offers the experience of training an intelligence to control other domestic objects. The system can adapt to any behaviour or gesture, and through the training app the Objectifier learns when it should turn another object on or off.

More about this project at bjoernkarmann.dk

 
