The Facebook/Admiral Scandal Shows The Limits And Dangers Of Big Data Capitalism

The insurance company Admiral planned to offer first-time car owners premiums based on an analysis of their Facebook data. The basic idea was to observe users' online behaviour in order to assess whether or not they are conscientious drivers. The data to be analysed was to include users' writing style, their likes, and the way they plan meetings. Facebook pulled out of the deal with the insurance company at the last moment before the launch.
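
To make the scheme concrete, here is a minimal, purely illustrative sketch of how such a score might have been computed. Everything in it is an assumption: the feature names, the weights and the discount cap are invented for illustration, since neither Admiral nor Facebook published the actual model.

```python
# Hypothetical sketch of a "conscientiousness" score computed from
# Facebook-style signals. All features, weights and thresholds are
# invented for illustration; the real model was never made public.

def conscientiousness_score(profile: dict) -> float:
    """Combine toy behavioural signals into a score in [0, 1]."""
    score = 0.0
    # Writing style: a high share of short, concrete sentences.
    score += 0.4 * profile.get("short_sentence_ratio", 0.0)
    # Meeting planning: arranging a definite time and place.
    score += 0.4 * profile.get("exact_meeting_plans_ratio", 0.0)
    # Penalise signals read as overconfidence, e.g. exclamation marks.
    score -= 0.2 * profile.get("exclamation_ratio", 0.0)
    return max(0.0, min(1.0, score))

def quoted_premium(score: float, base_premium: float) -> float:
    """Map the score to a discounted premium (the cap is an assumption)."""
    max_discount = 0.15  # assumed maximum discount of 15%
    return base_premium * (1.0 - max_discount * score)

profile = {"short_sentence_ratio": 0.7,
           "exact_meeting_plans_ratio": 0.6,
           "exclamation_ratio": 0.3}
s = conscientiousness_score(profile)
print(f"score = {s:.2f}, premium = £{quoted_premium(s, 1200.0):.2f}")
```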

Risk is the insurance industry's commodity: humans look for certainty in an uncertain world, and insurance companies promise them means of managing risk. Insurance companies' capital accumulation model is to seek ways of minimising the number of actual insurance events in order to maximise profits. They try to minimise their financial risk by managing and assessing their clients' risk factors. Traditionally, car insurance companies have assessed this risk by observing their clients' actual behaviour in the social world: if I cause a car accident, my premium goes up; if I am involved in no accidents, it goes down over time, at least as long as premium levels are not raised across the board. In the world of big data capitalism, the situation is changing: human decision-making is increasingly being automated. Algorithms take over and colonise the social world.
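
The traditional, experience-based model this paragraph describes can be captured in a few lines. The following sketch is illustrative only: the 30% loading and 5% no-claims reduction are assumed figures, not any insurer's actual rates.

```python
# Toy no-claims-style premium update: the premium rises after an
# at-fault claim and drifts down over claim-free years.
# Both multipliers are assumptions chosen for illustration.

def update_premium(premium: float, at_fault_claim: bool) -> float:
    if at_fault_claim:
        return premium * 1.30  # assumed 30% loading after a claim
    return premium * 0.95      # assumed 5% claim-free reduction

premium = 1000.0
for year, claim in enumerate([False, False, True, False], start=1):
    premium = update_premium(premium, claim)
    print(f"year {year}: £{premium:.2f}")
```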

The financial world of insurance, banks, investment funds, stocks, derivatives, etc. constitutes the capitalist economy's largest sector. Finance is a world of high risk and high potential gains. The Admiral Group describes itself as "one of the UK's largest and most profitable car insurance providers, with over 11% market share and market-leading financial results". Its annual profit after tax increased by 3.3% to £292.2 million in 2015. Its profit for the first six months of 2016 increased by 3.8% to £193 million compared with the first half of 2015. So Admiral is certainly not facing an existential crisis. The example rather shows that the constant drive to increase profits tends to subject ever more aspects of our private lives to corporate and administrative control, capture and enclosure.

An argument often heard about personal data control is that one should not worry because such schemes are voluntary and the data collection and analysis are of limited scope. The problem with this logic is that surveillance tends to be incremental and expansive. It is also shaped by power dynamics that make it difficult for vulnerable and weak groups to say "no". They often have no option other than to agree and opt in to data capture.

The problem with corporations' and state institutions' use of big data analytics is not just that it can involve the analysis and use of sensitive personal data such as ethnicity, religion, health status, sexuality, political affiliation and worldview. Algorithms also tend to socially sort human subjects into statistical groups and to discriminate, i.e. to treat them differently. This can easily result in discriminatory practices such as racial profiling, as the sketch below illustrates.
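
This sorting effect does not even require a model to see protected attributes directly. The sketch below, with entirely fabricated data and thresholds, shows how a risk score based on a proxy variable (a postcode band that happens to correlate with group membership) produces different approval rates for two groups.

```python
# Fabricated illustration of proxy discrimination: the model never
# sees group membership, yet the correlated "postcode band" feature
# produces a disparate outcome between the two groups.

import random

random.seed(0)

def risk_score(postcode_band: int) -> float:
    return 0.20 + 0.15 * postcode_band  # higher band -> higher "risk"

# Assume group membership correlates with postcode band in the data.
population = ([("group_a", random.choice([0, 1])) for _ in range(500)]
              + [("group_b", random.choice([1, 2])) for _ in range(500)])

threshold = 0.40  # scores at or below this get the low-risk premium
approved = {"group_a": 0, "group_b": 0}
totals = {"group_a": 0, "group_b": 0}
for group, band in population:
    totals[group] += 1
    if risk_score(band) <= threshold:
        approved[group] += 1

for g in sorted(approved):
    print(f"{g}: {approved[g] / totals[g]:.0%} get the low-risk rate")
```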

Algorithmic analysis of behaviour and personality installs a regime of categorical suspicion, in which everyone is first and foremost seen as a potential offender who may break the rules and become a source of risk. Big data analytics negatively impacts the trust that is needed as a cohesive force in social relations and society. Algorithmic de-humanisation is the consequence: in the world of algorithmic surveillance, we are not considered first and foremost as human beings, but as potential sources of risk, crime, terrorism, trouble, costs, etc. Data mining can also easily intensify the disadvantages and inequalities that structurally disadvantaged groups face.

Think the logic of big data analytics through to its end and you find a world of totalitarian control: sensors reach into our thoughts, dreams and fantasies and police us based on the predictions they make. You wake up one day and receive the following message: "Based on a predictive analysis of your dreams, our algorithm has determined that you are a potential threat to the political and economic security of the nation. Your employer has decided to terminate your contract. Your insurance has raised all your premiums to high-risk status. Your credit score has dropped massively, and your bank has therefore cancelled your mortgage and all your credit cards. You are from now on considered a potential criminal and terrorist. As a security measure, you will therefore be placed in precautionary confinement with no pre-determined time limit." What may sound like a particularly scary episode of Black Mirror is in reality just the dystopian, totalitarian society you get when this logic is put fully into practice.

Algorithms do not have feelings, morals, or ethics. They do not understand jokes, sarcasm, humour, love, care, or empathy. They try to make a complex world one-dimensional. The trouble with big data analytics is that it approaches and assesses a contradictory world with statistical and mathematical models that are blind to the complexity and dialectics of society and human behaviour. A complex world shaped by contradictions cannot be fully planned and predicted. Computational analytics is a form of techno-fetishism that promises easy technological fixes to society's problems. There are no purely technological solutions to society's key challenges. When algorithms displace human decisions, we end up in a highly instrumental world.

Since 2008, I have led a number of research projects studying how users think about the world of social media and big data. The project Social Networking Sites in the Surveillance Society showed that users tend to be highly sceptical of online surveillance and to desire a different online world. Users desire a world beyond big data capitalism.

The case of Admiral and Facebook illustrates the limits and dangers of big data, big capital, and big bureaucracy. When these trends converge, we may very well end up with Big Brother 2.0, a world dominated by a surveillance-industrial complex. The only alternative is to re-think big data capitalism and high-velocity society. It is not yet too late to create an alternative world based on human trust, human communication and solidarity, in which humans are in control of the systems that shape their life chances.

Image: By Camelia.boban (Own work) [CC BY-SA 3.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons