The Walt Disney Company, commonly known as Disney, is using machine learning techniques to track the facial expressions of movie audiences and gauge whether they are enjoying the film.

How does emotion recognition work?

Automatic facial expression analysis has attracted increasing attention in the research community for more than two decades, and it is useful in many applications, such as face animation, customer satisfaction studies, human-computer interaction, and video conferencing. Emotive analytics is an interesting blend of psychology and technology. With facial emotion detection, algorithms detect faces within a photo or video and sense micro-expressions by analyzing the relationships between points on the face, drawing on curated databases compiled in academic environments.
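As a rough illustration of the landmark-based approach, here is a minimal sketch using the open-source dlib library and its widely distributed 68-point landmark model. The file names and the smile heuristic at the end are assumptions for illustration only, not part of any production emotion-recognition system:

```python
import dlib

# Off-the-shelf face detector plus the standard 68-point landmark model
# (shape_predictor_68_face_landmarks.dat is distributed separately by dlib).
detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

img = dlib.load_rgb_image("audience_frame.jpg")  # hypothetical input frame
for face in detector(img):
    shape = predictor(img, face)
    pts = [(shape.part(i).x, shape.part(i).y) for i in range(shape.num_parts)]

    # Crude illustrative feature: mouth width relative to jaw width.
    # In the 68-point layout, 48/54 are the mouth corners and 0/16 the jaw ends.
    mouth = abs(pts[54][0] - pts[48][0])
    jaw = abs(pts[16][0] - pts[0][0])
    print(f"mouth/jaw ratio: {mouth / jaw:.2f}")  # tends to rise with a smile
```

Real systems feed features like these landmark relationships into classifiers trained on labeled expression databases rather than relying on a single hand-picked ratio.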

Facial recognition ft. Disney

The technique was just unveiled at IEEE's Computer Vision and Pattern Recognition (CVPR) conference in Hawaii, where the Disney Research team explained how it tracks the expressions of people watching movies.

The new algorithm, which Disney calls “factorized variational autoencoders” (FVAEs), is powerful enough that after watching an audience member's face for just 10 minutes, it can predict their expressions for the rest of the movie.
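The core idea pairs a variational autoencoder with a matrix-factorization structure on its latent space: each viewer and each moment of the film gets its own embedding, and their combination determines the latent code. The toy PyTorch sketch below is one simple reading of that idea; the dimensions, the elementwise-product factorization, and the shared variance are simplifying assumptions, not Disney's exact architecture:

```python
import torch
import torch.nn as nn

class FactorizedVAE(nn.Module):
    """Toy FVAE: the latent code for (viewer i, time t) is the elementwise
    product of a per-viewer embedding U_i and a per-timestep embedding V_t."""

    def __init__(self, n_viewers, n_steps, latent_dim=16, landmark_dim=136):
        super().__init__()
        self.viewer_emb = nn.Embedding(n_viewers, latent_dim)  # rows of U
        self.time_emb = nn.Embedding(n_steps, latent_dim)      # rows of V
        self.logvar = nn.Parameter(torch.zeros(latent_dim))    # shared variance
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(),
            nn.Linear(64, landmark_dim),   # e.g. 68 landmarks as (x, y) pairs
        )

    def forward(self, viewer_idx, time_idx):
        mu = self.viewer_emb(viewer_idx) * self.time_emb(time_idx)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * self.logvar)  # reparam.
        return self.decoder(z), mu, self.logvar

def vae_loss(recon, target, mu, logvar):
    """Standard VAE objective: reconstruction error plus KL to a unit Gaussian."""
    rec = ((recon - target) ** 2).sum(-1).mean()
    kl = -0.5 * (1 + logvar - mu ** 2 - logvar.exp()).sum(-1).mean()
    return rec + kl

# One embedding row per audience member and per movie frame (sizes assumed).
model = FactorizedVAE(n_viewers=3179, n_steps=7200)
```

The factorization is what makes prediction possible: once the per-timestep embeddings are learned from past audiences, a new viewer only contributes one small unknown, their own embedding.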

To collect a large set of face data, the Disney Research team used a 400-seat theatre equipped with four infrared cameras to film audiences during 150 showings of nine mainstream movies, including Star Wars: The Force Awakens, Zootopia, The Jungle Book, Inside Out, and Big Hero 6.

The resulting dataset, containing 16 million facial landmarks from 3,179 audience members, was fed to the neural network. After tracking an audience member's facial reactions for a few minutes, the algorithm could predict when they would smile or laugh; it was even able to pick up expressions associated with other emotions, such as fear and sadness.
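Continuing the toy sketch above, prediction for a new viewer could work by freezing the learned per-timestep embeddings and decoder, fitting only the new viewer's embedding on the landmarks observed during the first few minutes, and then decoding future timesteps. The tensor names and sizes below are placeholders, not real data:

```python
import torch

# Placeholders: frame indices for the first ~10 minutes, the landmarks
# observed for the new viewer over those frames, and the remaining frames.
observed_steps = torch.arange(0, 600)           # e.g. 1 frame/sec for 10 min
observed_landmarks = torch.randn(600, 136)      # stand-in for real landmark data
future_steps = torch.arange(600, 7200)          # the rest of the movie

model.requires_grad_(False)                     # freeze V and the decoder
u_new = torch.zeros(1, 16, requires_grad=True)  # the new viewer's embedding
opt = torch.optim.Adam([u_new], lr=1e-2)

for _ in range(200):
    recon = model.decoder(u_new * model.time_emb(observed_steps))
    loss = ((recon - observed_landmarks) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

# Predicted landmark trajectories for the unseen remainder of the film.
future = model.decoder(u_new * model.time_emb(future_steps))
```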

Disney's not alone

Disney isn’t the only company that wants to study how viewers react to movies. Dolby Laboratories, known for its proprietary high-dynamic-range technology, has been studying audiences at the neurophysiological level, strapping biosensors onto volunteers to get a sense of how viewers engage with media.

Netflix also used eye tracking to help it redesign its interface in 2014.

We have only seen the tip of the iceberg when it comes to human-machine interaction, but cognitive computing technologies like these are exciting steps toward creating real machine emotional intelligence.