This is our expression capture tool in action.

It uses a respondent’s camera to capture their expressions whilst they watch a video – usually a TV commercial, but it can be any video of any length.

As you can see, it uses AI to capture an expression in real time, with an accuracy score given to each reading. Each reading is time-stamped and aggregated to understand how expressions vary across the duration of an ad.
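As a rough illustration of what that data might look like, here is a minimal TypeScript sketch of time-stamped readings, each with its own accuracy score, being aggregated into one-second buckets. The type names and fields are assumptions for the example, not the tool’s actual schema.

```typescript
// Illustrative sketch only: the reading shape and bucket logic are
// assumptions, not the Ad Tester's actual implementation.

type Expression = "happy" | "sad" | "surprised" | "passive";

interface ExpressionReading {
  timeMs: number;       // position in the video when the frame was read
  expression: Expression;
  confidence: number;   // accuracy score for this reading, 0..1
}

// Group readings into fixed buckets (e.g. one per second) and total
// each expression, weighted by the reading's confidence score.
function aggregate(readings: ExpressionReading[], bucketMs = 1000) {
  const buckets = new Map<number, Record<Expression, number>>();
  for (const r of readings) {
    const key = Math.floor(r.timeMs / bucketMs) * bucketMs;
    const bucket =
      buckets.get(key) ?? { happy: 0, sad: 0, surprised: 0, passive: 0 };
    bucket[r.expression] += r.confidence;
    buckets.set(key, bucket);
  }
  return buckets;
}
```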

So it’s not a reading of emotion, it’s a reading of expression. What’s the difference? Well, no matter how intelligent AI is, it can’t know what you are feeling by reading the points on your face. Just as if I were to study your face whilst you watched an ad, I couldn’t tell what you were feeling. But I might have some hypotheses based on how your face changed throughout the ad. I would then ask you what you were feeling whilst you watched it, together with a raft of rational follow-up questions, and would use this combination to form a view of how effective an ad was likely to be.

Understanding how effective an ad is, is a mixture of art and science – much like creating advertising. When it comes to using technology to measure advertising effectiveness, particularly an automated tool like Research by Bot’s Ad Tester, the question is what role expression analysis plays. I see it as one of many clues to look at. You may hypothesise (as many do) that changes in expression are an indication of emotional engagement. So it’s not so interesting to look at the average emotional measure in an ad, but rather at the points of the ad when emotion shifts – where happiness jumps up, or sadness, or surprise. Conversely, any moment when a passive expression drops away.
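To make that concrete, here is an illustrative sketch of flagging the moments when an expression shifts, rather than reporting an overall average. The threshold and the shape of the series are assumptions for the example, not the Ad Tester’s actual method.

```typescript
// Illustrative sketch only: shift detection by comparing each point
// with the previous one, using an assumed threshold.

interface TimePoint {
  timeMs: number;
  score: number; // aggregated share of respondents showing the expression
}

// Return the timestamps where the score jumps (or drops) by more than
// `threshold` compared with the previous point.
function findShifts(series: TimePoint[], threshold = 0.15): number[] {
  const shifts: number[] = [];
  for (let i = 1; i < series.length; i++) {
    const delta = series[i].score - series[i - 1].score;
    if (Math.abs(delta) >= threshold) shifts.push(series[i].timeMs);
  }
  return shifts;
}
```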

A word of warning. I have been conducting creative development research face-to-face for decades, and with a handful of exceptions – like this HBA crocodile ad, which really did have people laughing out loud, and this WorkSafe ad, which brought people to tears – people’s faces aren’t usually that expressive when they are watching an ad. Both of these ads were initially tested as narrative scripts, stood out from the pile of other scripts they were tested against based on their ability to elicit an emotional response, and were then fine-tuned in terms of messaging.

So what can expression analysis tell us? Not much on its own. But together with other features of our ad testing tool, like measures of sentiment and magnitude in response to System 1 emotion questions, it can give important clues, generating an understanding not just of how effective or ineffective an ad is, but of why. For example, you can look at scene-by-scene changes in emotion, or how people of different genders or age groups respond, or how customers and non-customers respond.
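As an illustration of that kind of breakdown, the sketch below averages a per-scene expression score within each respondent segment so that, say, customers and non-customers can be compared scene by scene. The Respondent shape and field names are assumptions for the example, not the tool’s data model.

```typescript
// Illustrative sketch only: segment-level scene averages with an
// assumed per-respondent data shape.

interface Respondent {
  id: string;
  segment: string;        // e.g. "customer", "non-customer", or an age band
  happyByScene: number[]; // one aggregated score per scene of the ad
}

// Average each scene's score within each segment.
function sceneAveragesBySegment(respondents: Respondent[]) {
  const totals = new Map<string, number[]>();
  const counts = new Map<string, number>();
  for (const r of respondents) {
    const sums =
      totals.get(r.segment) ?? new Array(r.happyByScene.length).fill(0);
    r.happyByScene.forEach((score, i) => (sums[i] += score));
    totals.set(r.segment, sums);
    counts.set(r.segment, (counts.get(r.segment) ?? 0) + 1);
  }
  for (const [segment, sums] of totals) {
    const n = counts.get(segment)!;
    totals.set(segment, sums.map((s) => s / n));
  }
  return totals;
}
```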

People are rightly wary of this sort of technology. In our experience it is extremely difficult to distinguish between passive and sad, for example (especially when you suffer from resting bitch face, as some of us do, and with older faces), and glasses and beards can also make it difficult to get an ‘accurate’ read. Which is why we advise looking at how expressions change across the ad, rather than at the overall results.

Another concern is privacy. We always seek permission from the respondent to turn on the camera, and we turn it off when the ad finishes playing. Crucially, we extract the data, not the video. We don’t record the video and we don’t need to store the video. This reduces the risk of a privacy breach and helps people feel comfortable opting in. The technology itself works in the browser on any device without calibration, so it doesn’t take up processing power or slow down the device. To find out more about our ad tester, download the information pack.
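To illustrate that approach, here is a simplified browser sketch: ask permission, analyse camera frames locally whilst the ad plays, keep only the derived readings, and stop the camera when the ad ends. The `analyseFrame` function stands in for an in-browser expression model and is purely hypothetical; nothing here is the tool’s actual code.

```typescript
// Illustrative sketch only: camera permission, in-browser frame
// analysis, no video recorded or uploaded.

async function captureExpressions(ad: HTMLVideoElement) {
  // Ask the respondent's permission to use the camera.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });

  const camera = document.createElement("video");
  camera.srcObject = stream;
  await camera.play();

  const canvas = document.createElement("canvas");
  const ctx = canvas.getContext("2d")!;
  const readings: { timeMs: number; expression: string; confidence: number }[] = [];

  // Sample a frame a few times per second while the ad is playing.
  const timer = setInterval(() => {
    canvas.width = camera.videoWidth;
    canvas.height = camera.videoHeight;
    ctx.drawImage(camera, 0, 0);
    const frame = ctx.getImageData(0, 0, canvas.width, canvas.height);
    readings.push({ timeMs: ad.currentTime * 1000, ...analyseFrame(frame) });
    // The pixel data is discarded once the reading has been extracted.
  }, 250);

  // Turn the camera off as soon as the ad finishes.
  ad.addEventListener("ended", () => {
    clearInterval(timer);
    stream.getTracks().forEach((track) => track.stop());
  });

  return readings;
}

// Hypothetical stand-in for an in-browser expression model.
declare function analyseFrame(frame: ImageData): { expression: string; confidence: number };
```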
