I, Drawing Robot

Inspired by the idea of enabling people with motor disabilities to draw, the team designed an experience for visual expression through eye movement. At the end of the experience, participants can take home the drawing made by the robot as a memento.

During the brainstorming phase of the project, the team asked themselves who might use a drawing machine and why. Originally conceived as a serious tool to help people with severe motor impairments express themselves, the project grew into a broader, more inclusive experience with a more general purpose, without compromising usability for its initial target group.

Regular feedback and iteration were crucial to the evolution of this project. Based on learnings from the previous week, the team coded and fabricated in parallel to optimize their workflow. User testing revealed that users needed to know what their eye movements were doing even when they could not see the robot, which led to the creation of the feedback lights.

The team used OpenCV within Processing to track users’ eye movements. Holding one’s gaze in a direction for a short period maps it to a cardinal direction within the Processing sketch. Processing then sends these directional commands to two Arduinos: one controls the feedback LEDs, and the other ‘steers’ the drawing robot across its paper canvas, creating abstract patterns.
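The team's actual sketch ran in Processing with OpenCV; as a rough illustration of the dwell idea described above, here is a minimal plain-Java sketch. The class name, the 0.15 center dead zone, and the frame threshold are all hypothetical values, not the team's code:

```java
import java.util.Optional;

// Hypothetical dwell-based direction detector: a direction is only
// emitted after the gaze has stayed on the same side for enough
// consecutive frames, filtering out quick glances.
public class DwellDetector {
    public enum Direction { UP, DOWN, LEFT, RIGHT, CENTER }

    private final int dwellFrames;           // frames the gaze must hold
    private Direction candidate = Direction.CENTER;
    private int held = 0;

    public DwellDetector(int dwellFrames) {
        this.dwellFrames = dwellFrames;
    }

    // Classify a normalized gaze point (0..1 on both axes) into a
    // cardinal direction relative to the center of the frame.
    static Direction classify(double x, double y) {
        double dx = x - 0.5, dy = y - 0.5;
        if (Math.abs(dx) < 0.15 && Math.abs(dy) < 0.15) return Direction.CENTER;
        if (Math.abs(dx) > Math.abs(dy)) return dx > 0 ? Direction.RIGHT : Direction.LEFT;
        return dy > 0 ? Direction.DOWN : Direction.UP;
    }

    // Feed one gaze sample per frame; returns a direction exactly once
    // when the dwell threshold is reached, otherwise empty.
    public Optional<Direction> update(double x, double y) {
        Direction d = classify(x, y);
        if (d == candidate) {
            held++;
        } else {
            candidate = d;
            held = 1;
        }
        if (d != Direction.CENTER && held == dwellFrames) {
            return Optional.of(d);
        }
        return Optional.empty();
    }
}
```

In a real Processing sketch, `update` would be called once per frame with the tracked pupil position, and the returned direction would be written to the serial ports feeding the two Arduinos.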

An important learning was that Processing sends messages at a much faster rate than the robot can respond to them. Upon observing this, the team calibrated the rate at which Processing relays messages to the Arduino, creating an immediate correspondence between the user’s eye movements and the robot’s drawing.
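One way such calibration can be sketched is a simple throttle that lets a command through only after the previous one has had time to execute. This plain-Java sketch is an assumption about the approach, not the team's actual code; the `CommandThrottle` name and the per-command interval are hypothetical:

```java
// Hypothetical throttle: the Processing sketch can produce a
// gaze-derived command every frame (e.g. ~60 Hz), but the robot needs
// time to act on each one. This gate only passes a command once the
// calibrated interval has elapsed, discarding the rest.
public class CommandThrottle {
    private final long intervalMillis;   // assumed time the robot needs per command
    private long lastSent;
    private boolean sentAny = false;

    public CommandThrottle(long intervalMillis) {
        this.intervalMillis = intervalMillis;
    }

    // Returns true if a command should be sent now. The current time is
    // passed in (rather than read from the clock) so the logic is
    // deterministic and easy to test.
    public boolean trySend(long nowMillis) {
        if (!sentAny || nowMillis - lastSent >= intervalMillis) {
            sentAny = true;
            lastSent = nowMillis;
            return true;
        }
        return false;
    }
}
```

Dropping the surplus commands instead of queueing them is what keeps the robot's motion aligned with the user's current gaze rather than a backlog of stale instructions.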