We're having the wrong conversation on privacy
IDFA mania & what we should be talking about instead
When Apple announced sweeping changes at WWDC this June to how IDFAs are shared, the industry was instantly thrown into a tizzy. The proposed switch to explicit opt-in for IDFA sharing was expected to reduce the number of folks opting in by ~80% (if you’re on the optimistic side), thus severely impacting regular advertising-related operations like frequency capping, fraud detection/prevention, measurement and attribution, and - of course - ad targeting itself. As rationale for breaking the app advertising ecosystem, Apple invoked the ever-popular topic of privacy (mentioned 12 times on the iOS 14 preview page).
There was an audible sigh of collective relief when Apple announced last week that they’re postponing the IDFA-related changes to ‘early next year’. While developers work on stop-gap measures and figure out how to use SKAdNetwork for attribution (sketched below), there’s a larger and extremely timely conversation to be had around privacy.
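For the technically curious, here’s roughly what that stop-gap involves on the app side: a minimal Swift sketch of the two SKAdNetwork calls an advertised app makes so installs can be attributed without an IDFA. The conversion value shown is an arbitrary example, and Apple -- not the app or the ad network -- controls when the anonymized, delayed attribution postback fires.

```swift
import StoreKit

@available(iOS 14, *)
func reportInstallAndConversion() {
    // Register the install with SKAdNetwork (call once, early in app launch).
    SKAdNetwork.registerAppForAdNetworkAttribution()

    // Optionally attach a coarse conversion value (an Int from 0 to 63),
    // e.g. "completed onboarding" -- this is the only post-install signal
    // the ad network ever gets back.
    SKAdNetwork.updateConversionValue(7)
}
```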
What is an IDFA anyway? It’s an alphanumeric, unique, and persistent string that gets assigned to each iOS-running device (iPhone, iPad, Apple TV). If you want to find out what yours is, you’ll need an app like this one courtesy of AppsFlyer (if you have an Android device you can look up your Android Advertising Identifier -- the Android equivalent of the IDFA -- by tapping Settings → Google → Ads). That’s the ID that gets passed to individual apps and makes it possible to understand which ad exposure leads to conversions and similar activities. If you’re looking for a terrible analogy, it’s akin to the mobile world’s version of a third-party cookie: a token that multiple ecosystem participants can interpret and that serves to tie together user behavior for various commercial purposes (most of which are advertising-related).
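For a sense of what this looks like in code, here’s a minimal Swift sketch of how an app reads the IDFA using Apple’s AdSupport and AppTrackingTransparency frameworks. Under the iOS 14 changes, the identifier comes back as all zeros unless the user explicitly opts in (and the app must declare an NSUserTrackingUsageDescription in its Info.plist for the prompt to even appear).

```swift
import AdSupport
import AppTrackingTransparency

func fetchIDFA(completion: @escaping (String?) -> Void) {
    if #available(iOS 14, *) {
        // Explicit opt-in: presents the system tracking prompt on first call.
        ATTrackingManager.requestTrackingAuthorization { status in
            guard status == .authorized else {
                completion(nil) // user declined; the IDFA reads as all zeros
                return
            }
            completion(ASIdentifierManager.shared().advertisingIdentifier.uuidString)
        }
    } else {
        // Pre-iOS 14 behavior: honor the Limit Ad Tracking setting instead.
        guard ASIdentifierManager.shared().isAdvertisingTrackingEnabled else {
            completion(nil)
            return
        }
        completion(ASIdentifierManager.shared().advertisingIdentifier.uuidString)
    }
}
```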
Privacy and tracking often come up in the context of digital advertising and marketing. Apple defines privacy as a human right in its iOS 14 announcement. Regulators worldwide have also taken notice: the General Data Protection Regulation (GDPR) went into effect across the EU in May 2018 and established fines and enforcement for companies found to be in breach. California real estate developer Alastair Mactaggart spearheaded the charge in the US, frustrated by the amount of information technology companies collected and stored on their users (us!) with little transparency or recourse on how that data is used. Mr. Mactaggart outlined and worked to pass what is now the California Consumer Privacy Act (CCPA), built on three tenets we’ll return to below: transparency, control, and accountability.
We’ve previously explored how today’s Big Tech should really be thought of as Big Advertising, since these platforms owe significant portions of their revenue to advertising.
In the context of privacy, advertising and marketing use cases get the most scrutiny because those are the ones regular humans (i.e. internet users) can spot most easily.
Hey, there’s that pair of shoes you bought last week; and here’s that gift you were researching for your partner, now seemingly on every website you visit. While these may be the most annoying and visible examples of data use, they are often the most innocuous. Data use also didn’t originate on the internet: an obvious non-digital example is the DMV, which routinely sells personally identifiable information from license and ID applications to various offline data aggregators without giving consumers (residents of the US) the ability to opt out or control how this data is used down the road.
Consider how rapidly digital data sets are growing. The research firm IDC predicts that by 2025 we’ll be dealing with 175 zettabytes of data globally: an absurd number that is difficult if not futile to visualize. It’s perhaps better illustrated by noting that we’re simply not disposing of data anymore; we just keep generating more and more of it. Contrast that with some of the legacy infrastructure we’ve inherited from the pre-digital world: in the US, our Social Security numbers (SSNs) serve as a de facto unique identifier of every single person residing in the country. Introduced in 1936, they’ve gone well beyond their initial use of signing up for select government services; most important interactions involving your financial life require an SSN. Since they’re handy, unique, and persistent (they cannot be changed), they’ve crept into other spheres of life. That doctor’s office you went to that one time 15 years ago for a twisted ankle probably still has yours on file.

Our email addresses and mobile phone numbers have become de facto secondary persistent IDs: we rarely think twice before signing up for a merchant’s newsletter with our email or entering our mobile phone number, supposedly for order tracking purposes. This seems like a low-risk activity in and of itself -- someone gaining access to just your email address doesn’t sound like a big deal. But when that address is used across different systems, different vendors, and different companies to identify you, sooner rather than later someone in that chain may hold more impactful personal information and expose you to the risks of identity theft, financial fraud, and similar nightmares. High-profile hacks like the 2017 Equifax data breach illustrate how little recourse an affected consumer really has: it took 2 years to hammer out a settlement, which included a laughably low one-time payment of $125 (if you were lucky enough to file quickly, before the earmarked funds ran out) and an even more laughable credit protection service from the very same company that leaked your data in the first place.
Why should you care:
Most of our conversations around privacy today are wrong thanks to two main factors:
As the recent IDFA kerfuffle demonstrated, we approach privacy tactically: as something to be addressed within the context of a single app or a single channel (e.g. mobile) instead of holistically, at the level of a person (aka internet user)
Privacy as a term isn’t well defined and means vastly different things to different people. As a result of this amorphous definition, it’s predominantly approached as a nice-to-have, optional feature rather than a fundamental, core aspect of service design
For an illustration, we don’t have to go very far. Look through your ‘interests’ settings in Google or a similar advertising platform and note how many of the things listed there are your actual interests vs. stuff randomly classified as an interest -- Ana’s included helpful ones like ‘Microsoft PowerPoint’ (ok, that one kind of makes sense), ‘Populace’ (?!!), ‘The Sopranos’ (no comment), ‘Portugal national football team’ (excuse me, what?), and ‘Knitting’ (finally one that makes sense). Funny, but where’s the harm? Say you research a particular type of cancer because someone you know has just been diagnosed. A site you visit during your research classifies you into a ‘likely to have this type of cancer’ segment based on your browsing behavior, entirely unbeknownst to you. It then packages and resells that data to a variety of pharmaceutical advertisers, but also sells the raw data to a life insurer who flags you for denial of coverage. That got dark really quick.
As consumers and people on the internet, we have little control and practically no visibility into how the data we generate is subsequently used.
(N.B. We explored the real impact of data breaches on consumer trust, and ways to ensure marketing and advertising companies and professionals become better custodians of customers’ data, in this piece for AdExchanger.)
Therein lies our solution, too. Instead of fluffy privacy talk, we need to reframe these conversations in the context of data usage rights. Mr. Mactaggart and the original intent of CCPA come readily to mind here: we need transparency (what is being collected, where from, how long it’s stored, etc.), control (what is being done with my data and what can I do to curtail this), and accountability (if something does go wrong and there’s a breach, I want to know that I have real recourse). On the transparency front, the IAB Tech Lab rolled out an interesting proposal in 2018 to create a data version of the food nutrition labels that detail ingredients and other pertinent nutritional information. It’s an interesting concept (at least visually) that could help buyers quickly understand a data set’s origin, quality, and certainly its ‘use by’ date.
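To make the label idea concrete, here’s a hypothetical sketch of the fields a machine-readable data label might carry. This is our own illustration, loosely inspired by the Tech Lab proposal; the structure and field names are invented, not part of any published spec.

```swift
import Foundation

// A hypothetical, machine-readable "data nutrition label".
// Field names are illustrative only, not from the IAB Tech Lab spec.
struct DataLabel: Codable {
    let source: String            // where the data originated ("where from")
    let collectionMethod: String  // e.g. "declared", "observed", "inferred"
    let collectedAt: Date         // when it was gathered
    let retainedUntil: Date       // the "use by" date
    let permittedUses: [String]   // e.g. ["measurement", "frequency capping"]
}
```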
For control: as a consumer I want to be able to stop companies from transacting on my data without my knowledge (and without benefit to me). The challenge here, aside from data literacy, is one of scale: while data in aggregate is worth billions (~$20B in the US alone per the IAB & Winterberry Group State of Data report), an individual’s data footprint is practically worthless today. It doesn’t have to be: imagine if, instead of a company printing and sending a pricey catalog to your home because it thinks you might be in market for furniture, you as a consumer could indicate that you are in fact in market and then entertain (commercial) offers for your attention (e.g. $10 to my charity of choice to send an offer to my inbox; if your targeting is sound, that’s not a bad cost of acquisition in many verticals). For this we’d need an entirely different system and setup: a privacy-and-control-by-design ecosystem built from the ground up which, perhaps paradoxically, may bring us to the first practical application of something like blockchain technology in advertising-related waters.
This leaves accountability: in countries that have prioritized regulation (like the EU with GDPR) we’re starting to see the first major enforcement efforts. Faced with a class action suit alleging damages to the tune of €10BN, enterprise vendor Oracle has announced the shutdown of its third-party data exchange (in Europe, that is; what about in other territories where such regulation doesn’t yet exist?).
Today the data economy is very lucrative but purely extractive: commercial aggregators reap the most benefit with little scrutiny and even less consideration for the needs of consumers. Given the amount of data exhaust created every day, this type of ecosystem is unsustainable and only prone to more dangerous gaffes and breaches. If a mom-and-pop shop you visited once is the weakest link in a complex data security chain, it’s time to build an entirely new system.
One question:
The data industry is opaque today, and the speed of industry innovation often outpaces what users and regulators can protect against. What type of data literacy do we expect from consumers, and is that expectation realistic given how quickly the industry is evolving?
Dig deeper:
Alastair Mactaggart’s keynote at IAPP outlining the genesis of CCPA (great 20 min watch)
Major GDPR class action lawsuits filed against Oracle and Salesforce
IAB Tech Lab’s data transparency initiative (aka the data label)
Apple announces IDFA changes in June and then delays IDFA changes in September
Enjoyed this piece? Share it, like it, and send us comments (you can reply to this email).
Who we are: Sparrow Advisers
We’re a results-oriented management consultancy bringing deep operational expertise to solve the strategic and tactical challenges of companies in and around the ad tech and mar tech space.
Our unique perspective, rooted deeply in AdTech, MarTech, SaaS, media, entertainment, commerce, software, technology, and services, allows us to accelerate your business from strategy to day-to-day execution.
Sparrow Advisers was founded in 2015 by Ana and Maja Milicevic, principals & industry veterans who combined their product, strategy, sales, marketing, and company-scaling chops to build the type of consultancy they wish had existed when they were in operational roles at industry-leading adtech, martech, and software companies. Now a global team, Sparrow Advisers helps solve the most pressing commercial challenges and connects all the necessary dots across people, process, and technology to simplify paths to revenue, from strategic vision down to execution. We believe that expertise with fast-changing, emerging technologies at the crossroads of media, technology, creativity, innovation, and commerce is a differentiator, and that every company should have access to wise Sherpas who’ve solved complex cross-sectional problems before. Contact us here.