Tick, Click and Hope for the Best No Longer Cuts It for Consumers Who Want Tech on Their Terms

The storm over Uber's consumer privacy settings is just the latest in a growing list of concerns about the tech industry's handling of our data. From general irritation about targeted ads, to deep unease about the security of our personal data, to fears over the erosion of civil liberties, there is concern about who has access to data about us and what they are doing with it.

Tick, click and hope for the best no longer cuts it for consumers who want tech on their terms.

The next big step for consumers in the digital age could well be one that puts consent to share data on the terms of the individual, not the service provider.

In the US, 86% of consumers have tried to use the internet in ways that minimise the visibility of their digital footprint. Across Europe, 55% of consumers fear becoming a victim of fraud when disclosing personal data in online transactions, while 68% of UK consumers find the way that brands use the information held on them 'creepy'.

This unease is exacerbated by the lack of transparency over who is obtaining our data, who they are sharing it with, how they are using it, and to what ends.

Take Acxiom, one of the world's largest data brokers. Largely unknown to the public, it is reported to hold 1,500 pieces of information on more than 500 million people around the world, giving it the ability to predict 3,000 possible reactions to brands and marketing techniques.

Such data brokerage firms - part of a multi-billion dollar industry that has emerged to meet growing demand - are harvesting data about us from multiple sources online (and offline) and combining it into rich, if incomplete and context-less, profiles of individuals, segmented to meet the needs of their clients.

Such data is being used to sell us stuff in ever more extraordinary ways.

Personalisation

In 2012, for instance, US retail giant Target sought to outdo its competitors in reaching the lucrative 'new parent' demographic by developing an algorithm that used purchasing history data to predict which of its female customers were pregnant. It would then send tailored discount vouchers for maternity and baby items to women it predicted were in their second trimester. The results are now data segmentation folklore.
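
In broad strokes, that kind of prediction is simply a classifier scored over purchase-history features. The sketch below is a minimal illustration under assumed inputs: the category features, training rows and voucher threshold are all invented, and it shows the general technique rather than Target's actual model.

```python
# Illustrative only: a generic purchase-history classifier of the kind described
# above. Feature names, training rows and the threshold are all invented; this
# is not Target's actual model or data.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features: recent purchase counts in three product categories
# (e.g. unscented lotion, mineral supplements, cotton wool).
X_train = np.array([
    [3, 2, 1],   # shopper later known to be expecting
    [0, 0, 0],   # shopper with none of these purchases
    [2, 3, 2],
    [1, 0, 0],
])
y_train = np.array([1, 0, 1, 0])  # 1 = expecting, 0 = not

model = LogisticRegression()
model.fit(X_train, y_train)

# Score a new shopper's basket and decide whether to send maternity vouchers.
new_basket = np.array([[2, 1, 1]])
score = model.predict_proba(new_basket)[0, 1]
if score > 0.7:  # arbitrary cut-off for the illustration
    print(f"Send tailored vouchers (predicted likelihood {score:.2f})")
```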

Authorities in London last year had to stop a company's roll-out of 'smart' litter bins that were connecting with pedestrians' phones and serving up targeted ads based on places the passer-by had previously visited.

UK retail giant Tesco is installing facial recognition technology that will see screens target ads at customers based on age and gender.

Marketing innovators such as Ditto are using digital photo recognition software to trawl social media and analyse how brands are being contextualised in images people share online.

This is just a taste of how the arms race to create personalised marketing campaigns is already well underway, a race that is only likely to accelerate as we take the next digital leap into the internet of things.

The submission

It is well established that terms of use, End User Licence Agreements and privacy policies - the mechanisms by which we 'consent' to the harvesting of our data - are too long, too complex and too inflexible. Ironically, in light of the targeted advertising they fuel, they are distinctly impersonal. Analysis undertaken in 2008 calculated that it would take 76 working days to read every privacy policy an internet user encounters in the course of a year.

No surprise, then, that research shows the median time users spend on licence agreements is only six seconds; that 70% of users spend less than 12 seconds on the licence page; and that no more than 8% of users read the licence agreement in full.

Yet despite the growing unease and risk, most individuals still tick the 'I agree' box and 'consent' to giving this data. But is it given either knowingly or willingly? I think we can safely say the answer is no.

Given a binary 'take it or leave it' choice and no opportunity to set their own preferences, consumers find that current T&Cs make consent look more like submission. We are left having to tick, click and hope for the best.

This has led the World Economic Forum to warn of a developing 'crisis of trust', stemming from the use of personal data in ways that are inconsistent with individuals' preferences or expectations.

Finding a more meaningful solution to this problem requires mechanisms that enable consumers to express their own terms in a simple and accessible way, rather than a one-sided, one-size-fits-all model of consent.

Encouragingly, there are growing indications that change may be on the horizon.

The blowback

Earlier this year the US Federal Trade Commission's own study of the data broker industry found that "data brokers operate with a fundamental lack of transparency".

GlobalWebIndex research found that more than a quarter of the world's online population are using tools to disguise their identity or location.

In March, Tim Berners-Lee, the father of the web, called for an online Magna Carta: a bill of rights that would guarantee the independence of the internet and ensure people's privacy.

And even the tech giants have begun to make a virtue of privacy. Microsoft, for example, has run a global ad campaign asserting 'Your Privacy is Our Concern', while Apple's CEO felt obliged to publish an open letter to customers stating that "your trust means everything to us" and outlining the company's 'strict' data handling policies (just as Apple gears up for a big push into health and financial services, two of the most sensitive categories of consumer data).

Analysts are predicting that privacy is set to become a competitive differentiator and the driving force behind the next 'killer app'. The pressure for something different, something better, is now building to the point where change looks inevitable.

A new breed of tech company is already taking the lead in developing tools that enable consumers to start taking back control.

For example, 40 million people are using Ghostery, a browser extension that lets users see and block the companies that track them when they visit a website. Personal data vault services are emerging that allow consumers to securely gather, store, control and release their data on their own terms. And 'sticky' data policies are being developed that bind a consumer's permissions to their data "as it travels across multiple parties, enabling users to improve control over their personal information".
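
One way to picture a 'sticky' policy is as a record whose permissions travel with the data and are checked at every hand-off. The sketch below is a minimal illustration under assumed names and rules (StickyRecord, release); real schemes also lean on encryption and trusted infrastructure so the policy is enforced rather than merely advisory.

```python
# Minimal sketch of the 'sticky policy' idea: the user's permissions are bound
# to the data record and checked before each release. Names and rules are
# invented for illustration.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class StickyRecord:
    payload: dict          # the personal data itself
    allowed_parties: set   # organisations the user has approved
    allowed_purposes: set  # uses the user has approved, e.g. {"billing"}
    expires: datetime      # consent lapses after this point

    def release(self, party: str, purpose: str) -> dict:
        """Hand over the payload only if the request matches the user's terms."""
        if datetime.now(timezone.utc) > self.expires:
            raise PermissionError("consent has expired")
        if party not in self.allowed_parties or purpose not in self.allowed_purposes:
            raise PermissionError(f"{party} may not use this data for {purpose}")
        return self.payload

record = StickyRecord(
    payload={"email": "alice@example.com"},
    allowed_parties={"AcmeRetail"},
    allowed_purposes={"billing"},
    expires=datetime.now(timezone.utc) + timedelta(days=90),
)
record.release("AcmeRetail", "billing")      # permitted under the user's terms
# record.release("AdBroker", "marketing")    # would raise PermissionError
```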

Of course, to enable effective permission-setting, consumers need to understand the permissions they are granting. This too is prompting new initiatives in how to present potentially complex contracts and preferences in a 'human readable', engaging form.
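
As an illustration of the 'human readable' idea, a machine-readable preference record could be rendered as plain-language statements a user can scan in seconds. The permission keys and wording below are invented for the example and do not follow any particular standard.

```python
# Sketch of rendering machine-readable permissions as plain language.
# The permission keys and phrasing are invented for this example.
PLAIN_LANGUAGE = {
    "location": "We can see roughly where you are",
    "contacts": "We can read your address book",
    "purchase_history": "We can see what you have bought from us",
    "third_party_sharing": "We may pass your details to other companies",
}

def summarise(permissions: dict) -> str:
    """Turn {"location": True, ...} into a short, scannable summary."""
    lines = []
    for key, granted in permissions.items():
        sentence = PLAIN_LANGUAGE.get(key, key.replace("_", " "))
        lines.append(f"{'YES' if granted else 'NO '} - {sentence}")
    return "\n".join(lines)

print(summarise({
    "location": True,
    "contacts": False,
    "third_party_sharing": False,
}))
```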

The pressure is mounting for a better deal on data and privacy for consumers. It's coming from a range of actors: governments and regulators, tech titans, internet visionaries, consumer bodies and, crucially, it's coming more and more from consumers themselves.

Some entrenched parties will try to resist it, but those genuinely working in the consumer interest must embrace this eagerness for change. So let's move towards a digital future set on terms that put the consumer first.
