Advertisers Look With Empathy Into Your Front Room

(Adam Berry/Getty Images)

Technology is under development to enable advertisers to target products not just at a broad group of people who might be watching a certain type of programme, but at specific households and even individuals.

By gathering information about what you watch and do in your front room, they can decide which adverts will have the most traction. It’s an advertiser’s dream to broadcast to an audience of one, and we are ever closer to this becoming reality.

In the UK, the charge is being led by Sky, which says that Tesco, the Royal Bank of Scotland, Littlewoods, American Airlines, Audi, Citroen and Dial-A-Flight will all use its AdSmart services.

A window on your world

Their efforts are, in part, made possible by the fact that there are now many more ways to watch televisual content than on televisions, such as online via Netflix or catch-up services. The cookies we leave behind when we watch help those providing the programmes build up a picture of what we like and what we watch. That’s pure gold for advertisers.

But we also connect our TVs to different devices, such as Xboxes, which will offer a window into our world like never before. An Xbox enables advertisers to target content at you based on your subscriber information, such as your age, gender and address.

These devices can also see into your living room. Sensors in an Xbox make it possible to play games without a controller in your hand because they can scan the content of the room and identify how many people are in it. That, in turn, means advertisers can gather the information and deliver even more refined advertising. Spooked yet? Well, there’s more.

Verizon’s big play

If you have time to spare, it can be illuminating to search for the patent applications of technology companies. I did, and found a fascinating application by telecommunications company Verizon to the US patent office, titled Methods and Systems for Presenting an Advertisement Associated with an Ambient Action of a User. It shows that the company has applied to patent a suite of technologies that would trigger tailored advertisements based on whether viewers are eating, playing, cuddling, laughing, singing, fighting, talking or gesturing in front of their sets. Cuddling, for example, might trigger advertisements for a trip to Paris.
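At its core, the idea the patent describes is little more than a lookup: a detected ambient action selects an advertising category. The sketch below illustrates that logic in Python; every action name and ad pairing here is invented for illustration and is not taken from Verizon’s filing.

```python
# Hypothetical sketch of an ambient-action-to-advert mapping.
# The actions and adverts below are illustrative examples only,
# not the contents of Verizon's actual patent application.

AD_RULES = {
    "cuddling": "romantic getaway to Paris",
    "eating": "food delivery service",
    "singing": "music streaming subscription",
    "fighting": "couples counselling service",
}

def select_ad(detected_action: str) -> str:
    """Return a tailored advert for the detected ambient action,
    falling back to a generic spot when no rule matches."""
    return AD_RULES.get(detected_action, "generic brand advertisement")

print(select_ad("cuddling"))  # -> romantic getaway to Paris
print(select_ad("reading"))   # -> generic brand advertisement
```

The commercial value, of course, lies not in the lookup but in the sensing pipeline that decides which action is happening in the first place.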

It might seem like an incredible leap but the ability to make assessments about you comes from technologies that we are gradually becoming accustomed to already. Motion capture and analysis technologies, gesture recognition technologies, facial recognition technologies, voice recognition technologies and acoustic source localisation technologies all combine to build an accurate picture of who you are and what you are doing.

It’s not just the fact that you are moving or speaking but the gestures you make and the mannerisms you display. It’s your physical attributes and the tone and inflection of your voice. It’s the language you speak or the accent you have and it’s your proximity to others.

And it’s not just people. Pets, products, brands, decorative style and objects such as pictures and photographs can also be scanned and profiled. If your TV picks up a small furry shape moving around your feet, it may start pushing cat food at you.

Verizon’s set-top box will determine which people present are adults by linking what information it has in its user profiles with variables such as physical appearance and voice attributes. Once it knows the tone of your voice and your mannerisms it could even guess your mood. This is yet more valuable information for advertisers, who can suggest a holiday when you’re feeling down or a DIY store when you’re full of energy.

If you’re humming a tune, Verizon plans to use a signal processing heuristic to identify the name and genre of the song. Very handy when music labels ask it to sell their latest releases.

And of course, if your TV can gather all this information about you before it shows you an ad, it can monitor your response to that ad once you’ve seen it.

Then it can communicate with the mobile device or tablet you are using as you watch, sending ads to you while you browse and taking cookies back for future reference.

Empathy: the future of advertising

This all takes the idea of behavioural advertising quite literally. What’s more, the TV, the advertising and the devices could even be said to be displaying a whole new characteristic: empathy.

And if they can empathise, we might have to revise how we think about the media.

It’s important to note that empathy is quite distinct from sympathy. Empathy is a form of reading social cues and responding appropriately. It’s about understanding others by watching, listening, sensing and making inferences about actions or inaction.

We do this without thinking as we assess how others are getting on, check people’s current behaviour against past behaviour, and revise our actions toward others if we get an unwanted reaction. These are all actions being performed by this type of advertising system.

The question is not one of intelligence, or whether machines might act like people; the point is that people and machines (arguably along with animals) empathise in similar ways, through responses to public behaviour. After all, when we humans empathise, we do not literally feel into the skulls of others; we read and respond to public signals. That’s what empathic media is increasingly doing too.

As this technology edges towards the market, what people say, do and feel in the domestic sphere becomes increasingly public. The boundaries are being tested, the meaning of personal privacy is being redefined and, at least for now, it’s the advertisers who are calling the shots.

ANDREW MCSTAY lectures at Bangor University and is author of Digital Advertising; The Mood of Information: A Critique of Online Behavioural Advertising; Creativity and Advertising: Affect, Events and Process; and the forthcoming book Privacy and Philosophy: New Media and Affective Norms.

This article was originally published on The Conversation. Read the original article.