For years, marketers have used focus groups and consumer surveys to find out what people think of campaigns and advertising, but a growing number of agencies say that those answers are rarely, if ever, accurate.
One possible solution is facial-recognition tech. Omnicom’s data group, Annalect, spent Super Bowl weekend with a group of 134 people who watched Super Bowl television ads in a lab in downtown Manhattan. Prototype software randomized the commercials, and a camera took pictures of viewers’ faces every three seconds. “Feature extraction” run on the photos gave the researchers estimates of each viewer’s age, gender and mood.
So, for example, during ads judged “happy,” viewers’ eyelids tightened as their cheeks rose. During “off-putting” ads, brows narrowed and tongues visibly protruded.
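The mood-labeling step could be sketched as a simple rule over detected facial movements. This is a hypothetical illustration only: the feature names and thresholds below are invented for the example and are not Annalect's actual model.

```python
# Hypothetical sketch of mapping facial "action" intensities (0.0-1.0)
# to a mood label, following the cues described in the article.
# Feature names and thresholds are illustrative assumptions.

def classify_mood(features):
    """Guess a mood label from detected facial-movement intensities."""
    cheek_raise = features.get("cheek_raiser", 0.0)
    lid_tighten = features.get("lid_tightener", 0.0)
    brow_lower = features.get("brow_lowerer", 0.0)
    tongue_show = features.get("tongue_show", 0.0)

    # "Happy": eyelids tighten as the cheeks rise
    if cheek_raise > 0.5 and lid_tighten > 0.3:
        return "happy"
    # "Off-putting": narrowed brows or a visible tongue protrusion
    if brow_lower > 0.5 or tongue_show > 0.5:
        return "off-putting"
    return "neutral"

print(classify_mood({"cheek_raiser": 0.8, "lid_tightener": 0.6}))
```

A production system would use a trained classifier rather than hand-set thresholds, but the structure, facial features in, mood label out, is the same.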
It’s a next step into behavioral data, say Annalect researchers. “We’ve spent several years getting access to granular behavioral data so we can see who is impacting rational and emotional decision-making,” said Slavi Samardzija, chief analytics officer at Annalect.
Using emotional reactions to gauge ad performance is an idea that resurfaces every so often, and other agencies have made moves in this direction.