Dates: 02.06.2014 – 06.06.2014

Faculty: Jacob Sikker Remin + Dennis P Paul

Key Words: sensors, effectors, spatial interaction, post-digital, prototypes, simulations, scenarios, speculations, competition

## General Introduction

Since the advent of science, we have had an almost uncanny fascination with *measuring*, thus turning things and events into numbers, signals, and data. This has allowed us to draw conclusions, make informed decisions, and generally expand our cognitive capacities.

As we became machine builders, we soon transferred and extended the idea of measuring into contraptions we called *sensors*. These sensors allowed us to expand our senses beyond our natural limitations while at the same time allowing our machines to act independently on our behalf.

Over time, our machines evolved from mechanical into digital ones, splitting into a threefold structure of *sensor > computer > effector*: *sensors* act as the conduit between the physical and the digital; *computers* store and algorithmically process data; *effectors*, controlled by algorithms, manipulate the physical world.
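This threefold structure can be sketched as a minimal control loop. The example below is purely illustrative: `read_sensor`, `process`, and `drive_effector` are hypothetical names standing in for a motion-triggered lamp, not any specific hardware API.

```python
import random

def read_sensor():
    """Sensor: the conduit from physical to digital.
    Here a hypothetical ambient-light reading (0-1023)."""
    return random.randint(0, 1023)

def process(reading, threshold=300):
    """Computer: algorithmically process the data.
    Decide whether it is dark enough to act."""
    return reading < threshold

def drive_effector(switch_on):
    """Effector: manipulate the physical world.
    Here a hypothetical lamp switch."""
    return "lamp on" if switch_on else "lamp off"

# One pass through the sensor > computer > effector chain.
print(drive_effector(process(read_sensor())))
```

In a real installation the loop would run continuously, with the sensor polled at some interval and the effector updated in response.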

As digital technologies become more and more a part of our everyday lives, so do sensors. Surrounded by such visible and increasingly invisible digital devices, we are living in a world of a billion sensors. But for what reason? To what end? To solve which problem? To tell what story? Or, more generally asked: how will we make sense of it all?

In this 5-day class we want to address this question by developing practical design projects, utilizing *digital technology* as a connection between the physical and the virtual, transforming spaces into interactive spaces, and *facilitating* spatial interaction through sensors, computers, and effectors.

This class will be a response to a brief by the *Microsoft 2014 Design Challenge*, which asks that exact same question: In a **world with a billion sensors**, how will we make sense of it all?

### Microsoft 2014 Design Challenge (Brief):

In a **world with a billion sensors**, how will we make sense of it all? In our daily lives we encounter sensors all the time, like when a motion sensor turns a light on in a dark place, or when a carbon monoxide detector tells us that the air is becoming hazardous. Sensors extend our abilities to see, hear, and feel far beyond what we ourselves can take in – from arrays of telescopes sensing the edges of the universe to nano-scale biological sensors amplifying our own sense of smell.

How will sensors change the way we perceive not only our environment but ourselves and others? How will sensors change the way we live and work? What interfaces, services, devices, and experiences will be necessary to make sense of it all and avoid sensory overload? What key problems can this data help solve, and what new troubles can we anticipate it creating?

## Structure

This class was project-based, i.e. students were asked to develop their own projects in response to the brief. However, students were not required (though also not discouraged) to deliver fully elaborated, highly polished projects, but rather meaningful, strong, and, most importantly, *presentable* results (e.g. sketches, mock-ups, (functional) models, simulations). In case of a successful advancement in the challenge, participants will have extra time to finish their projects.

Due to the open structure and topic of this class, students were encouraged to understand it as a combination of all preceding classes. The class was divided into three parts: incubation, development, and presentation.