Consumers have long theorized that their devices are eavesdropping on them. From Apple transcribing audio recordings of Siri users to Amazon Alexa recording and storing what users say, whether it's directed at the digital assistant or not, there is some truth to the speculation.
It’s one thing when regular consumers get tripped up by stuff like “my phone is listening to me,” but it's something else entirely when we see industry insiders also scratching their heads.
Let us give you an example. Brian O'Kelley has been around since the early days of digital advertising. He was the CTO of early online ad player Right Media and co-founder of the demand-side platform AppNexus, which was acquired by AT&T in 2018. In a recent blog post, he described how he was intrusively and confusingly targeted by ads after researching a cancer drug prescribed to a relative. When he visited the drugmaker's site, he was immediately tracked by a long list of cookies, profiled, and targeted, with little respect for privacy or for the emotional weight of needing to research a cancer drug. The experience was jarring, perplexing, and devoid of the empathy such a delicate scenario warrants; to make matters more confusing, he uses Safari, which is supposed to block third-party cookies.
"The fact that I am still getting tracked and targeted is problematic. The fact that I don't know why, or by whom - even as a 15-year veteran of ad tech - is unbelievable."
If a guy who helped shape the digital advertising industry can’t understand how and where his data was sold and resold or why he’s seeing a certain ad if his browser is supposed to block third-party cookies, what are regular consumers supposed to do?
A black box
Platforms intentionally operate as black boxes, meaning they can make claims about artificial intelligence, machine learning, and algorithms without needing to show proof, thereby obfuscating why one platform may be better than another. A complex system with many players makes it impossible to determine who exactly is “listening” or why a particular ad is targeting a specific user. What is certain is that a set of business rules and training data is behind every algorithm.
Consumers have an outdated perception of how advertising works, believing erroneously that companies they care about use advertising to tell them about their products. Instead, any company that puts you in the “interested in widgets” audience segment can bombard you, your family, your friends, anyone on your Wi-Fi network or in close proximity, and even those in your social networking circle with ads. With no frequency cap or other delivery controls in place, welcome to widgets 24/7.
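To make the mechanics concrete, here is a minimal sketch of what a frequency cap could look like on the delivery side. The segment name, the cap value, and the function are our own invention for illustration, not any platform's actual implementation:

```python
from collections import defaultdict

# Hypothetical illustration of a per-user frequency cap for one audience
# segment. The segment name, the cap value, and this logic are our own sketch,
# not any ad platform's real implementation.
MAX_WIDGET_ADS_PER_DAY = 3

impressions_today = defaultdict(int)  # user_id -> widget ads shown today

def should_serve_widget_ad(user_id: str, user_segments: set) -> bool:
    """Serve the ad only if the user is in the segment AND still under the cap."""
    if "interested_in_widgets" not in user_segments:
        return False
    if impressions_today[user_id] >= MAX_WIDGET_ADS_PER_DAY:
        return False  # remove this check and it's widgets 24/7
    impressions_today[user_id] += 1
    return True

# Example: the fourth request of the day gets suppressed by the cap.
for _ in range(4):
    print(should_serve_widget_ad("user-123", {"interested_in_widgets"}))
```

Without that cap check, there is nothing in the delivery logic to stop the same segment from being hit with the same ad indefinitely.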
The way digital advertising is sold today is, technically speaking, so far beyond the comprehension of the average consumer that “my phone is listening to me” feels more plausible than the questionable practices that actually enable personalized advertising. The ads that seem to appear right after a conversation are more likely the result of targeting based on a shared IP address or location clustering. What consumers should really be concerned about are the massive troves of data they produce and unknowingly share with countless companies and platforms. These data sets are sold, married with other seemingly unrelated data sets, and resold again in order to show consumers more relevant ads. But that relevance may come at a cost. Sure, seeing an ad for something you just discussed is creepy, but what if this data set is also shared with a company that decides your credit score, which affects your ability to get a better mortgage rate? What connections can be made when alternative data sets are combined with little oversight? Should someone's random browsing behavior somehow cost them a good interest rate on a loan, or a loan at all?
Every additional signal creates another link. If you regularly communicate with someone via WhatsApp or email, you'll show up in their social circle, and both of you will likely see ads based on each other's interests. The same goes for a Facebook relationship status, shared Wi-Fi credentials, multiple tagged photos, or a funny social quiz you'd share only with close friends (think Instagram's Close Friends feature).
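Here is a similarly hypothetical sketch of how shared-IP and communication-graph clustering can propagate an interest segment from one person to everyone around them; the users, segments, and field names are invented for illustration:

```python
from collections import defaultdict

# Hypothetical illustration: users who share an IP address (a household or
# office) or a strong communication link inherit each other's interest
# segments. All names, segments, and fields here are invented for this sketch.
users = {
    "alice": {"ip": "203.0.113.7", "segments": {"cancer_treatment_research"}},
    "bob":   {"ip": "203.0.113.7", "segments": {"running_shoes"}},
}
contacts = [("alice", "bob")]  # e.g. frequent WhatsApp or email exchanges

def cluster_and_propagate(users, contacts):
    clusters = defaultdict(set)
    for name, profile in users.items():       # group users by shared IP address
        clusters[profile["ip"]].add(name)
    for a, b in contacts:                      # add communication-graph links
        clusters[f"link:{a}:{b}"].update((a, b))
    for members in clusters.values():          # everyone inherits the union
        shared = set().union(*(users[m]["segments"] for m in members))
        for m in members:
            users[m]["segments"] = shared
    return users

# After clustering, Bob is also tagged "cancer_treatment_research"; he never
# researched it, he just shares a Wi-Fi network and a chat history with Alice.
cluster_and_propagate(users, contacts)
```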
What happens when you unexpectedly start receiving ads for divorce attorneys, botox treatments, and the Ashley Madison dating service after spending the weekend with friends? Someone close to you is likely going through something, and the mighty algorithm decided that you should also be bundled into this data set. After all, we are the company we keep, aren’t we?
This is what consumers should be worried about — not the possibility that their phones are listening to them.
Consumers have practically no way to remove themselves from any targeting category. They can visit individual platforms such as Google, Facebook, or Twitter to see which interest groups they've been lumped into and request to be removed, but there is no unified “delete me” button. No one has any idea what happens to that data once it leaves the platforms or how it's packaged and repackaged. Even if you can remove yourself from a category a platform placed you in, by the time you've gone through the motions, your data points have probably been sold and resold. The company that bought the data set can keep you in any given segment indefinitely. That's why you're endlessly followed around the internet by those shoes you bought a week ago.
We wrote about the issue of data breaches beyond the realm of advertising and how marketers can create guidelines to mitigate the risk of data collected for advertising purposes being used in other contexts.
In the West, we've also looked down on China's social credit concepts, in which the Communist Party monitors citizens and ranks them based on behavior such as driving habits, spending patterns, and financial credit history. These practices are genuinely creepy, yet they rely on much the same data practices as digital advertising, only with even less transparency.
Consumers readily share their data for advertising purposes, especially now with all those annoying pop-ups asking them to accept cookies in order to access the content they want to read or watch. Maybe consumers assume that sharing their personal information will only affect the ads they see immediately afterward. Remember the social challenge of posting photos of yourself taken 10 years apart? It could have been meaningless fun, or a data-collection ploy to train Facebook's facial recognition algorithm.
But when people get the sense that their phone is listening to them, it triggers an emotional reaction: this is not information they shared voluntarily, but something obtained without their explicit permission. In most cases, that's true.
Somehow, advertising is a consumer-facing industry that has completely cut the consumer out of the experience.
Not that the typical consumer cares about the inner workings of advertising, just as the typical advertising professional rarely gives a hoot about the internal mechanics of other industries. But that's beside the point: even if most consumers never think about the nuts and bolts of how advertising works, marketers and everyone else in the digital advertising supply chain still need to think about the consumer.
Yet no one does. There is no one to represent the consumer, who largely has no recourse. We’ve seen massive data breaches with little meaningful relief. Everyone just shrugs.
As a result, why would companies make consumers a priority? From the advertising perspective, the “my phone is listening to me” bogeyman might hold the key to changing that.
A path forward
So, what’s the solution? We’d imagine that the companies whose ads show up after conversations are not faring well among consumers, but at the end of the day, consumers don’t think of Facebook or Twitter negatively, even though those are the companies most likely to be listening to them through their phones.
Last week we wrote about the changing role of the CMO and the need for a C-level executive to own the customer experience. If all brand advertising were created and delivered based on the customer experience, any creepiness factor would disappear.
Consumers also need a voice in how they are classified. For example, they should not only be able to opt out of personalized advertising, but also have the power to take themselves out of targeting segments so that they aren’t followed around the internet by that pair of shoes they’ve already purchased and worn a few times.
Additionally, since some of this data is incredibly sensitive (think health or financial data), regulatory action is likely needed. As O'Kelley argued in his blog post, sensitive data, especially information about medical history or interests, should face greater restrictions on its use. He suggests requiring consent before any entity can access this type of sensitive data, prohibiting sensitive data from being used for marketing or advertising purposes, and prohibiting sensitive data from being sold or transferred to or by data brokers. The EU's General Data Protection Regulation (GDPR) is a great start in giving consumers more control over their data. Under the law, Europeans have the right to learn how their personal data is being collected and used, correct inaccurate data, and, in some cases, have their data erased. However, we doubt that most consumers have ever read a single cookie consent popup. Instead, they likely just click “accept all cookies” and move on to the content they want to consume without a second thought.
One question
Given how challenging and fragmented opting out is, what kind of solution could give consumers the control they need to mute mispersonalized advertising and remove themselves from a specific category or inferred group?
Dig deeper
Thanks for reading,
Ana, Maja, and the Sparrow team
Enjoyed this piece? Share it, like it, and send us comments (you can reply to this email).
Who we are: Sparrow Advisers
We’re a results-oriented management consultancy bringing deep operational expertise to meet the strategic and tactical objectives of companies in and around the ad tech and mar tech space.
Our unique perspective rooted deeply in AdTech, MarTech, SaaS, media, entertainment, commerce, software, technology, and services allows us to accelerate your business from strategy to day-to-day execution.
Sparrow was founded in 2015 by Ana and Maja Milicevic, principals and industry veterans who combined their product, strategy, sales, marketing, and company-scaling chops to build the type of consultancy they wished existed when they were in operational roles at industry-leading adtech, martech, and software companies. Now a global team, Sparrow Advisers helps solve the most pressing commercial challenges and connects all the necessary dots across people, process, and technology to simplify paths to revenue, from strategic vision down to execution. We believe that expertise with fast-changing, emerging technologies at the crossroads of media, technology, creativity, innovation, and commerce is a differentiator, and that every company should have access to wise Sherpas who’ve solved complex cross-sectional problems before. Contact us here.