An application that can read people's facial expressions would be highly useful in market research and political polling. Pollsters could gauge the attitudes of people as they watch television, browse the web, attend a public meeting, or simply walk down a street within earshot of a public speech. A team of researchers at MIT's Media Lab is currently developing technologies that interpret human expressions from faces, with the goal of building a system that can reveal genuine audience responses to advertising and political campaigns.
According to Rana el Kaliouby, a Media Lab researcher, the technology offers a non-verbal channel for gathering public opinion. Kaliouby and her team developed a technique dubbed MindReader that interprets human expressions from short video clips. MindReader tracks twenty-two points around the eyes, nose, and mouth of the person in the video to gauge his or her attitude toward an advertisement, TV program, or political speech. The software also analyzes the texture, shape, color, and movement of the face.
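The landmark-tracking idea can be illustrated with a minimal sketch. This is not MindReader's actual code; it simply assumes that some upstream detector delivers the twenty-two (x, y) landmark points per video frame, and summarizes facial movement as the average distance each point travels between consecutive frames:

```python
# Hypothetical sketch of per-frame landmark tracking, not MindReader's code.
# Assumes an upstream face detector yields 22 (x, y) points per frame,
# covering the eyes, nose, and mouth.

NUM_LANDMARKS = 22

def mean_displacement(prev_points, curr_points):
    """Average Euclidean distance moved by each landmark between two frames."""
    assert len(prev_points) == len(curr_points) == NUM_LANDMARKS
    total = 0.0
    for (x0, y0), (x1, y1) in zip(prev_points, curr_points):
        total += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return total / NUM_LANDMARKS

def movement_signal(frames):
    """Per-frame movement scores for a clip (a list of landmark sets)."""
    return [mean_displacement(a, b) for a, b in zip(frames, frames[1:])]

# Synthetic two-frame clip: every landmark shifts 3 px right and 4 px down.
frame_a = [(float(i), float(i)) for i in range(NUM_LANDMARKS)]
frame_b = [(x + 3.0, y + 4.0) for x, y in frame_a]
print(movement_signal([frame_a, frame_b]))  # [5.0]
```

A real system would feed a time series like this, along with texture and shape features, into a trained model rather than reading it directly.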
Like most similar technologies, MindReader uses machine-learning methods to distinguish between human expressions: sadness from happiness, interest from boredom, contempt from disgust, and so on. In a recent evaluation published in the IEEE Transactions on Affective Computing, the software predicted human expressions more accurately than human testers did.
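To make the classification step concrete, here is a toy nearest-centroid classifier. It stands in for the article's unspecified machine-learning model; the two-dimensional "features" (mouth-corner lift, brow lowering) and the labels are invented for illustration only:

```python
# Illustrative nearest-centroid classifier; a stand-in for whatever model
# MindReader actually uses. Feature vectors and labels are made up.

def train_centroids(labeled_examples):
    """labeled_examples: {label: [feature_vector, ...]} -> {label: centroid}"""
    centroids = {}
    for label, vectors in labeled_examples.items():
        dims = len(vectors[0])
        centroids[label] = [
            sum(v[d] for v in vectors) / len(vectors) for d in range(dims)
        ]
    return centroids

def classify(centroids, vector):
    """Return the label whose centroid lies closest to the feature vector."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(centroids[label], vector))

# Toy features: [mouth-corner lift, brow lowering]
examples = {
    "happiness": [[0.9, 0.1], [0.8, 0.2]],
    "sadness":   [[0.1, 0.7], [0.2, 0.9]],
}
model = train_centroids(examples)
print(classify(model, [0.85, 0.15]))  # happiness
```

Production systems would use far richer features and a properly trained statistical model, but the principle of mapping facial measurements to discrete expression labels is the same.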
A commercial version of the software, called Affdex, is already in use to gauge people's attitudes toward advertising: it tracks the facial expressions of TV audiences in real time as ads appear on screen. Audience privacy, however, remains a major concern for such expression-tracking technologies. Many systems address it by analyzing expressions on the fly, without recording video or images of the viewers.