This project began as an exploration into sensory substitution, emotion hacking, and the possibility of merging two very different fields: interactive interfaces and human emotions. Could everyday objects convey information through emotions? Could we receive information about the state of an object or environment through emotions, and in this way internalize and relate to that information better? What if we could seamlessly map digital information onto emotions via a physical, tangible device that conveys a range of emotions through facial expressions, and then monitor the dynamics of that information through the dynamics of the corresponding emotions? And could we do the opposite: control external devices or the state of an environment through changes in emotions and facial expressions? We developed a device that makes these scenarios possible, and we called it Emobject.

Emobject Design Exploded View


Emobject is a novel, interactive, tangible interface that bridges the gap between the worlds of bits, atoms, and emotions. It is a physical manifestation of a digital emoji with rich input, output, and control capabilities. It allows mapping of physical and digital information onto emotional expressions, as well as control of external devices through changes in the device's facial expression. Moreover, Emobject can detect people's faces and emotions by leveraging the latest emotion recognition algorithms developed at the MIT Media Lab, and then adapt accordingly, mirror the emotions in real time, or control another device in response to the emotional input. Emobject is a universal emotion-based I/O device enabling a host of novel user experiences, which are described in the applications section.


Face perception is a rapid, subconscious process wired into our brains even before birth. We are in fact so good at recognizing faces and expressions that we often perceive them in the most mundane objects: house facades, power outlets, landscapes, and even grilled toast. This phenomenon is called facial pareidolia, and it is an important survival mechanism. Moreover, we are much faster at perceiving faces or face-like objects (165 ms after stimulus onset) than at perceiving other objects [1][2], and we can perceive faces even in our peripheral vision. But could we hack this neurological phenomenon and exploit the highly efficient face-processing parts of the brain for tasks they were never intended for? Could we perceive other critical information as rapidly as we perceive faces?

Prior research into sensory substitution has shown that when a new information stream is mapped onto haptic stimuli through a vibrotactile device, users wearing the device develop subconscious awareness of that stream over time. Examples include a vibrotactile vest [3] that enables deaf people to perceive speech mapped into patterns of vibrotactile stimulation on their back, and a vibrotactile belt that gives users a subconscious awareness of the direction of north [4]. But could similar subconscious awareness of an information stream be achieved without requiring users to wear any device at all, by using visual stimuli in the form of facial expressions rather than haptic stimuli?

Social Mood Monitor and Display

Emobject analyzes the emotions of multiple faces simultaneously and displays the average emotion. As a social mood monitor and display, Emobject could indicate to a person approaching a group whether joining the conversation would be welcome.
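
The averaging step is simple; here is a minimal sketch in Python, assuming a fixed emotion vocabulary and per-face probability vectors coming from an emotion recognition pipeline (the names below are illustrative, not the device's actual API):

```python
import numpy as np

EMOTIONS = ["happy", "sad", "angry", "surprised", "neutral"]

def average_emotion(face_vectors):
    """Average per-face emotion probability vectors and return the
    dominant emotion label, or None if no faces were detected."""
    if not face_vectors:
        return None
    mean = np.mean(face_vectors, axis=0)   # element-wise average over faces
    return EMOTIONS[int(np.argmax(mean))]  # strongest average emotion

# Example: three faces in view, mostly happy overall
faces = [
    [0.7, 0.1, 0.0, 0.1, 0.1],
    [0.5, 0.2, 0.1, 0.1, 0.1],
    [0.2, 0.4, 0.2, 0.1, 0.1],
]
print(average_emotion(faces))  # -> "happy"
```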


Instant Emotional Feedback Mirror

Emobject can provide users with immediate, tangible feedback about their facial expression. When made aware of their momentary expression, users may choose to smile more frequently, which has been shown to elicit happiness.


Notification Interface

Users can deploy a number of Emobjects for notification and control applications. A finished laundry cycle, lights left on without need, or a thermostat set too high or too low are all examples of informational alerts that could be conveyed through facial expressions on different Emobjects. Since we are so much better and faster at perceiving faces, alerts delivered in the form of facial expressions may be far more likely to be noticed and acted upon.
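
A sketch of how such an event-to-expression mapping might look; the event names and the `show` display call are assumptions for illustration:

```python
# Hypothetical mapping from home events to the expression an
# Emobject should show; event names are illustrative only.
ALERT_EXPRESSIONS = {
    "laundry_done":    "happy",      # chore finished
    "lights_left_on":  "sad",        # wasted energy
    "thermostat_high": "surprised",  # setting out of range
    "thermostat_low":  "surprised",
}

def on_event(emobject, event):
    """Show the expression mapped to an event, if any."""
    expression = ALERT_EXPRESSIONS.get(event)
    if expression is not None:
        emobject.show(expression)  # assumed display API on the device
```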

Control Interface

Emobject could be used as a home automation device. A user might associate one emotion with a particular set of lighting conditions, temperature, and music, and another emotion with a different set. If these devices are automated, the user can simply change the emotion on the Emobject to apply the corresponding parameters. We have used Emobject to control the lights in the Media Lab's 3rd-floor atrium.
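
One plausible wiring for such a control path, sketched here over MQTT; the topic names, scene presets, and broker address are assumptions, not the setup actually used for the atrium lights:

```python
import json
import paho.mqtt.publish as publish  # assumed transport; any IoT bus would work

# Hypothetical scene presets keyed by the expression shown on the device.
SCENES = {
    "happy":   {"lights": "warm_bright", "temp_c": 22, "music": "upbeat"},
    "relaxed": {"lights": "dim_amber",   "temp_c": 24, "music": "ambient"},
}

def apply_scene(emotion, broker="broker.local"):
    """Publish the preset for the current emotion to a home-automation bus."""
    scene = SCENES.get(emotion)
    if scene is None:
        return
    for device, setting in scene.items():
        publish.single(f"home/{device}/set", json.dumps(setting),
                       hostname=broker)
```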

Emotion Recording and Playback Device

What range of facial expressions and emotions did you go through today? What if you could play back your facial expressions in time-lapse; would you be surprised at what you see? Would it provoke you to take action or change your behavior?
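
A minimal sketch of the record-and-replay idea, assuming a `show` display method on the device; `speedup` compresses the recorded intervals so a day of expressions plays back in minutes:

```python
import time

class EmotionRecorder:
    """Log (timestamp, expression) pairs and replay them sped up."""

    def __init__(self):
        self.log = []

    def record(self, expression):
        self.log.append((time.time(), expression))

    def playback(self, emobject, speedup=600):
        """Replay the log in time-lapse; at 600x, ten real minutes
        between expressions become one second of playback."""
        for (t0, expr), (t1, _) in zip(self.log, self.log[1:]):
            emobject.show(expr)               # assumed display API
            time.sleep((t1 - t0) / speedup)   # compressed real interval
        if self.log:
            emobject.show(self.log[-1][1])    # hold the final expression
```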

Average Personal Expression Reflector

Emobject can be attached to a bathroom mirror to monitor your emotional state every time you face the mirror, and then display both your average emotional expression over the past week and how your emotions varied over time. Presented with this information, users may be motivated to smile more often.
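
The weekly summary could be computed roughly as follows; the sampling interface is an assumption, and "variation" is taken here as the per-emotion standard deviation:

```python
import numpy as np

def weekly_summary(samples, now, week=7 * 24 * 3600):
    """Summarize emotion vectors sampled at the mirror over the past week.

    `samples` is a list of (timestamp, emotion_vector) pairs; returns the
    mean vector (the "average expression") and the per-emotion standard
    deviation (the variation over time), or None if the window is empty.
    """
    recent = [v for t, v in samples if now - t <= week]
    if not recent:
        return None
    recent = np.array(recent)
    return recent.mean(axis=0), recent.std(axis=0)
```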

Habit Changer

Emobject could encourage good habits such as hand washing by monitoring how long people spend washing their hands. Users would be shown a smiling face only if they wash for at least 30 seconds, and an angry face if they finish earlier.
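
The scoring rule is a simple threshold; a sketch, with the faucet start/stop sensing assumed:

```python
import time

WASH_THRESHOLD_S = 30  # minimum wash time that earns a smile

def score_handwash(start, stop):
    """Return the expression to display when washing ends."""
    return "happy" if (stop - start) >= WASH_THRESHOLD_S else "angry"

# Example: a (hypothetical) faucet sensor reports start/stop times.
t0 = time.time()
t1 = t0 + 12                   # stopped after only 12 seconds
print(score_handwash(t0, t1))  # -> "angry"
```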

Development Platform for Creative Expressions

Users can map any input data stream to any facial expression, and they can control any IoT device through facial expressions. Moreover, they can even define entirely new kinds of facial expressions.
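
A rough sketch of what such a mapping layer could look like; the `Emobject` class and its methods here are hypothetical stand-ins for the platform's actual API:

```python
class Emobject:
    """Minimal sketch of the mapping layer: users register a function
    that turns an arbitrary input sample into an expression name."""

    def __init__(self):
        self.mapper = None

    def map_stream(self, fn):
        self.mapper = fn

    def update(self, sample):
        if self.mapper:
            self.show(self.mapper(sample))

    def show(self, expression):
        # On the real device this would drive the face display;
        # here we just print the chosen expression.
        print(f"showing: {expression}")

# Example: map CO2 readings (assumed sensor) onto expressions.
device = Emobject()
device.map_stream(lambda ppm: "happy" if ppm < 1000 else "sad")
device.update(850)   # -> showing: happy
device.update(1400)  # -> showing: sad
```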