A few years ago, Facebook got in serious trouble for allowing housing adverts to screen out ethnic minorities from their audience. Finally, this summer, the platform responded to these accusations by removing a number of its ad targeting options and introducing a new certification for US advertisers.
Many have celebrated this move as evidence of tech titans being held accountable to the public at large. But this kind of tokenistic corporate self-regulation will do nothing to tackle the far deeper problems at the core of the political economy of information.
Reducing the potential for targeted ads to discriminate is a good thing. But should targeted advertising even exist in the first place? For a while, critics have argued that “the surveillance economy should die” and that ending personalised advertising could remove one of the primary reasons why so much data is collected about us. This form of advertising doesn’t benefit the public, and it’s not even clear it serves advertisers’ interests effectively.
What it does do is result in those able to hoard the most data becoming incredibly powerful. As tech expert and ex-software developer Wendy Liu highlights, “In the battle over digital platforms, there’s more at stake than just data; this is a battle over technological development more broadly, and who gets to control and deploy that.”
This week saw one of the biggest legal challenges yet to the data collection upon which targeted internet advertising depends. Simultaneous complaints were filed with various European data protection authorities, including the UK’s Information Commissioner’s Office, against Google and other ad tech firms.
The complaint informs European regulators of a massive and ongoing data breach that affects virtually every user on the web. Every time you click on a link, your personal data is broadcast to thousands of companies without your knowledge in order to auction and place ads. The original filer of the complaint says this clearly violates GDPR requirements. The failure of these companies to comply exposes serious limitations to the regulation.
Billions of bids for your attention
In his book on advertising and the wider attention economy, The Attention Merchants, Tim Wu describes a new kind of corporate domination: the industry that monopolises attention. He traces this back to the first world war, when the British government embarked on the first systematic propaganda campaign in history, with 50 million big, colourful recruitment posters as well as parades and vans carrying film projectors.
And thus the modern advertising industry was born. It continued into the early days of the internet where ads were placed no differently from how they were placed offline. Websites would make space available and agencies and others would pay to have their clients’ messages posted. But since 2010, companies like Google and Facebook have developed a far more advanced infrastructure for extracting profit from users’ attention, with adverts supposedly tailored to our personal interests and inner desires.
Today, when you click on a link, something called a ‘bid request’ is sent to one of the two main ad tech channels, OpenRTB and Authorized Buyers (the latter run by Google). The request broadcasts as much information about you as possible: the website you are visiting, your IP address (from which your location can be inferred), details of your device, and various identifiers that allow a more detailed profile of you to be looked up.
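To make this concrete, here is a minimal sketch of the kind of fields such a bid request carries, written as a Python dictionary. The field names follow the general shape of the OpenRTB convention, but the values (and the tracking identifier) are invented for illustration:

```python
# Illustrative sketch of an OpenRTB-style bid request.
# Field names follow the general OpenRTB shape; all values are invented.
bid_request = {
    "id": "auction-12345",                        # unique ID for this auction
    "site": {"page": "https://example.com/news"}, # the page you are reading
    "device": {
        "ip": "203.0.113.7",                      # IP address -> rough location
        "ua": "Mozilla/5.0 (X11; Linux x86_64)",  # browser and device details
        "geo": {"lat": 51.5, "lon": -0.1},        # inferred coordinates
    },
    # An identifier that lets ad tech firms link this request
    # to a richer behavioural profile held elsewhere.
    "user": {"id": "abc-tracking-id"},
}
```

Even this stripped-down version shows how much can be inferred from a single page view once the identifier is matched against a stored profile.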
This information is then used by ad tech companies to bid in an auction for the right to show you a particular advert. The winning bidder pays the price offered by the runner-up, a mechanism known as a second-price auction. This process repeats as we all surf the web, billions of times a day, without most of us being aware of it.
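The second-price rule described above can be sketched in a few lines of Python. This is a toy model of the auction logic, not real ad-exchange code; the bidder names and prices are invented:

```python
def run_second_price_auction(bids):
    """Pick the highest bidder, who pays the runner-up's price.

    `bids` maps a bidder name to its bid for the impression.
    A toy sketch of the second-price rule, with invented inputs.
    """
    if len(bids) < 2:
        raise ValueError("a second-price auction needs at least two bids")
    # Rank bidders from highest to lowest offer.
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    clearing_price = ranked[1][1]  # the runner-up's bid sets the price paid
    return winner, clearing_price

winner, price = run_second_price_auction({"adco": 120, "trackr": 95, "brandz": 80})
# "adco" wins the impression but pays 95, the runner-up's bid
```

The point of the rule is that each bidder can offer its true valuation of your attention without overpaying, which is part of why the format became standard in real-time bidding.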
Data breach of the century?
There are many problems with the way advertising works on the internet. Some would like to see the whole targeted advertising system rejected, while others question its efficacy, noting that as much as 25% of ad spend is lost to fraud and 50% of adverts will never be seen by a human. The fraud mostly takes the form of botnets (collections of computers controlled by malicious code) automatically clicking on adverts. There are also serious ethical concerns surrounding data-driven social control, “with Big Data used to discriminate against groups, steer vulnerable people into financial scams, and meddle in elections”.
The legal cases brought to data protection authorities around Europe this week have shown that the ad tech industry is potentially exposing every person who uses the internet to the non-consensual, and often unwitting, sharing of their data with thousands of companies who are all able to copy, share and sell the data. The now infamous Cambridge Analytica used to be one of many companies that had access to this stream of personal user data.
This week’s legal complaint was originally raised by Brave, a privacy-focused web browser set up by a co-founder of Mozilla, the organisation behind the open-source Firefox browser. Brave combines a private browser with a built-in ad blocker, giving it deep insight into the way the targeted ad industry harvests data about people’s online behaviour.
They highlight how the ‘bid request’ sent during the auction process fails to protect personal data against unauthorised access. Requests broadcast far more data than could be justified for advertising purposes, including sensitive information such as sexuality, ethnicity or political opinions. A recent study by DotEveryone revealed that 45% of users had no idea this data was being used to target ads.
As Brave notes, “there is a massive and systematic data breach at the heart of the behavioral advertising industry. Despite the two year lead-in period before the GDPR, ad tech companies have failed to comply. Ads can be useful and relevant without broadcasting intimate personal data.”
Brave’s complaint shows that GDPR has given us a framework to challenge the unauthorised sharing of personal data in this sector at such a massive scale. However, the complacent attitude of the firms involved, who have repeatedly failed to engage with the accusations, is worrying and suggests that GDPR may not have the bite originally intended. Crucially, if we are to dismantle the structures that create tech monopolies and pose a major threat to society, we should not limit our judgements about data practices to whether or not they are GDPR-compliant.
While the focus of this complaint is on legality and privacy rather than broader social justice, this kind of challenge marks a watershed moment for the surveillance economy. Too often, debates around socio-technical systems focus on reforming digital tools to reduce bias or error rates and increase transparency. But, as legal scholar Frank Pasquale highlights, accountability in the digital economy should question whether these tools should be developed at all and, at the very least, what limits should be placed on their use and commercialisation.