Cameras are everywhere, compromising our privacy, usually in the name of public safety and accountability. But wherever you stand on privacy, let’s face it: our world of ubiquitous surveillance is here to stay. If anything, there are more and more ways to watch and detect your every move. But what about your every flicker of emotion?
Want to know someone’s deepest, darkest feelings? Thanks to a company called Affectiva, all you need is a camera. The Waltham-based company has developed a technology that can read facial expressions and map them to a number of emotional states.
Rana el-Kaliouby is the co-founder of Affectiva, a spinoff venture out of MIT’s Media Lab. Sensing and analyzing our emotions is at the heart of what the company does.
“It basically uses a lot of data, like tens of thousands of examples of people smiling and people furrowing their eyebrows, and by using computer vision and machine learning, we’re able to train our algorithms to read those different facial expressions,” el-Kaliouby said.
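Affectiva hasn’t published its training pipeline, but the supervised-learning idea el-Kaliouby describes can be sketched in a few lines. The sketch below is purely illustrative: the feature vectors and emotion labels are random stand-ins for the many human-coded video frames a real system would train on.

```python
# Minimal sketch of the general approach: learn to map facial measurements to
# expression labels from many labeled examples. This is NOT Affectiva's
# pipeline; the features and labels here are synthetic stand-ins.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Pretend each face has been reduced to a small feature vector
# (e.g. mouth-corner lift, brow distance, lip stretch, ...).
n_examples, n_features = 10_000, 16
X = rng.normal(size=(n_examples, n_features))

# Stand-in labels: 0 = neutral, 1 = smile/joy, 2 = brow furrow.
# In a real system these would come from human-annotated video frames.
y = rng.integers(0, 3, size=n_examples)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```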
And with a camera on pretty much every computing device you can think of (your phone, laptop, tablet, smart TV), it’s become a marketer’s dream.
“The value proposition is really clear. Marketers struggle with really understanding the emotions of consumers,” el-Kaliouby said.
Affectiva works with over 1,400 brands around the world. CBS uses its technology to measure which characters viewers love or hate on its TV shows. The film industry uses it to figure out which parts of a movie to use in a trailer. So I thought I’d test out Affectiva myself.
Using the laptop’s camera, I watched my face mirrored back at me on the screen, a square locator hovering around it and tracking my every move. The technology was remarkably sensitive and accurate. When I cringed, the “disgust” meter spiked to red. When I laughed, the “joy” meter instantly jumped. There were other meters tracking sadness and rage, as well as a furrowed brow.
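For readers curious what that kind of demo loop looks like in code, here is a rough, hypothetical sketch using OpenCV’s stock face detector to draw the “square locator” and print per-emotion meters. The emotion scores below are random placeholders; Affectiva’s actual expression models are proprietary and not part of this example.

```python
# Rough sketch of a face-tracking demo loop: detect the face, draw a box
# around it, and overlay per-emotion "meters". Scores are placeholders only.
import cv2
import numpy as np

EMOTIONS = ["joy", "disgust", "sadness", "anger", "brow furrow"]

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        # Placeholder meters: a real system would run an expression model here.
        scores = np.random.rand(len(EMOTIONS))
        for i, (name, s) in enumerate(zip(EMOTIONS, scores)):
            cv2.putText(frame, f"{name}: {s:.2f}", (x, y - 10 - 15 * i),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
    cv2.imshow("expression demo", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```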
Fortunately, all my silly faces aren’t in some mug shot repository. Facial expressions are immediately converted to data.
“So, emotions are very personal and we recognize that,” el-Kaliouby said. “Everything we’ve done so far is on an opt-in basis, you have to opt in before we can start capturing this kind of data, but the other piece of it is value. We’re already sharing a ton of data about ourselves, but we do it because we’re getting enough value in return. If you start getting value in return for turning the service on, I think you’ll see people using it.”
Right now Affectiva’s facial expression technology is being used in a mindful meditation app and a live-streaming chat app. El-Kaliouby says this isn’t just for marketing; there is also potential for tracking emotional well-being. Think of a Fitbit, but for how happy or sad you are throughout the day. El-Kaliouby is also excited about applying Affectiva’s technology to autism therapy.
“Autism is another area where our technology has a lot of value. In assessment, you are able to assess if non-verbal cues are atypical or more typical,” el-Kaliouby said. “There is also a lot of opportunity to help build therapeutic devices.”
Now that could be something to smile about.