Human Touch


Human Touch is an experiment exploring how we might feel a person's face across screens. The project was inspired by the potential of haptic technologies to heighten the sensation of being 'remotely in touch' with people.

Companies like Senseg are developing systems that promise high-fidelity tactile feedback on our mobile devices. At the same time, the miniaturization of depth cameras (like the one found in the Kinect) will soon bring 3D image data to our phones. The University of Ottawa is already developing a haptic video chat system that aims to let the person at the other end feel a hug, a handshake, or a pat on the back through the interface.


In this spirit, we took the first steps towards this haptic vision: a basic prototype that lets the user stroke another person's face. The system relies on two photos: a side view and a front view of a face. The side view is translated into a relief profile that drives a motorized slider, applying force to the user's hand. By moving the slider up and down, a user can "feel" the relief of the face. The slider is embedded in a case with a screen that displays the face and indicates which part of it is currently being felt.
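The photo-to-relief step described above can be sketched in a few lines. This is only an illustrative reconstruction, not the project's actual code: it assumes the side view has been segmented into a binary silhouette (face pixels facing left), takes the leftmost face pixel in each row as the surface depth, and maps a normalized slider position to a target extension for the motorized slider.

```python
import numpy as np

def relief_profile(silhouette):
    """Turn a binary side-view silhouette into a normalized relief profile.

    silhouette: 2D bool array, True where the face is, profile facing left.
    Returns one value per row: 1.0 = most protruding point, 0.0 = deepest.
    """
    rows, cols = silhouette.shape
    profile = np.full(rows, float(cols))
    for y in range(rows):
        xs = np.nonzero(silhouette[y])[0]
        if xs.size:
            profile[y] = xs[0]          # leftmost face pixel = surface
    span = max(profile.max() - profile.min(), 1.0)
    return (profile.max() - profile) / span

def slider_target(profile, position):
    """Map a slider position in [0, 1] along the face to a relief value."""
    idx = min(int(position * (len(profile) - 1)), len(profile) - 1)
    return profile[idx]

# Toy silhouette: row 1 protrudes furthest (the "nose").
silhouette = np.zeros((4, 10), dtype=bool)
silhouette[0, 5:] = True
silhouette[1, 2:] = True
silhouette[2, 4:] = True
silhouette[3, 6:] = True
```

Moving the physical slider would then repeatedly call `slider_target` with the current position and command the motor towards the returned value.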

The first prototype is based on static images. In the second iteration, the relief profile is derived from the Kinect's live depth map. While the data is less fine-grained, it embodies the whole concept in its most basic form: live footage being felt remotely, in real time.
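The live version replaces the photo with a per-frame depth reading. A minimal sketch of that step, again assuming rather than reproducing the project's implementation: take a Kinect-style depth frame (millimetres, 0 for dropouts), extract the column under the face, invert it so that nearer surfaces mean higher relief, and smooth it against sensor noise. The `near`/`far` clipping range is an assumed working volume.

```python
import numpy as np

def depth_column_to_relief(depth_frame, column, near=500, far=1500):
    """Convert one column of a live depth frame into a relief profile.

    depth_frame: 2D array of depth values in mm (Kinect-style), 0 = no reading.
    Returns a normalized profile: 1.0 = nearest surface, 0.0 = background.
    """
    col = depth_frame[:, column].astype(float)
    col[col == 0] = far                      # treat dropouts as background
    col = np.clip(col, near, far)
    relief = (far - col) / (far - near)      # closer surface -> higher relief
    kernel = np.ones(3) / 3                  # moving average vs. sensor noise
    return np.convolve(relief, kernel, mode="same")

# Simulated frame: flat background with one near point in column 1.
frame = np.full((5, 4), 1500)
frame[2, 1] = 500
```

Run per frame, this yields a relief profile the slider can track, which is what makes the "live footage being felt remotely" loop possible.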

[Video demo coming soon]