Social Media Companies Are Finally Held To Account For Their Inaction On Removing Illegal, Extremist And Hate Material


As the founder of the anti-Muslim hate monitoring and support service Tell MAMA, I find the Home Affairs Select Committee's report on social media companies and their failure to remove extremist and illegal material a breath of fresh air, and a vindication for many of us who for years have been brushed aside by the arrogance of social media corporations and their public relations teams. For years we have highlighted how their platforms are used to humiliate people because of their identities and to promote extremism, anti-Semitism and anti-Muslim hatred. The response from social media companies, Google, Twitter and YouTube in particular, has been bluster, public relations spin and media lines that have done nothing to help victims who have suffered so much on their platforms.

The Home Affairs Select Committee today publishes its long-awaited report, 'Hate Crime: abuse, hate and extremism online'. If ever social media companies should feel humility, it is now, on reading this report. I am reminded of Hans Christian Andersen's tale of 'The Emperor's New Clothes'. The difference here, though, is that social media companies have done everything in their power to construct a narrative that they have been acting on hate speech and within the laws of the country. This false construct has been their defence and their clothing; their public relations staff have even bought into it, parading and strutting their wares, while those of us working at the coal face of supporting victims of hate crimes could see that they had no clothes on. What they were doing was spinning a well-rehearsed ruse.

In a press release accompanying the report, the Home Affairs Select Committee (HASC) states in its opening title that the 'biggest, richest social media companies are shamefully far from tackling illegal and dangerous content'. Yvette Cooper, the chair of HASC, said what many of us had been saying for years, though to large corporations like Google, YouTube and Twitter in particular we were just aberrations: they listened and carried on regardless. People's lives, their reputations and their emotional and psychological well-being mattered little to well-paid public relations staff. However, they cannot carry on as normal after this statement from the chair:

"Social media companies' failure to deal with illegal and dangerous material online is a disgrace. They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse. Yet repeatedly they have failed to do so. It is shameful. These are among the biggest, richest and cleverest companies in the world, and their services have become a crucial part of people's lives. This isn't beyond them to solve, yet they are failing to do so. They continue to operate as platforms for hatred and extremism without even taking basic steps to make sure they can quickly stop illegal material, properly enforce their own community standards, or keep people safe."

Racist Material and Inaction from Google

I can tell you from personal experience that corporations like Google have simply obfuscated and done everything in their power to avoid delinking their search engine from racist and illegal material. Twitter, for its part, has failed to act on extremist accounts for years, even when colleagues at Tell MAMA asked it to remove extremist far-right accounts. The HASC report mentions the Pakemon campaign of November 2016, which targeted British Muslims of Pakistani heritage. The racist campaign was circulated through Twitter, and Google's search facility continued to point to racist sites running and promoting this abhorrent campaign even after Google and Twitter were notified that the material was illegal. I was, in fact, one of the individuals targeted in the campaign, along with the Mayor of London, because of my work challenging anti-Muslim hatred. Yet when I notified Google in January 2017 of the need to delink from such sites, its response was that because I was a 'public person' it could not delink its search facility from them. In other words, because I founded a national hate crime campaign countering racism, intolerance and prejudice, I should put up with it. That was their response, along with the further advice that I should look elsewhere for a remedy. It is precisely this callous disregard for the law and for individual rights that the HASC report highlights as the standard position adopted by social media companies such as Google, YouTube and Twitter.

Or take the fact that the HASC report found YouTube to be the 'vehicle of choice' for spreading terrorist propaganda, where videos from proscribed jihadist groups such as ISIS, Jabhat al-Nusra and Jund al-Aqsa could be found alongside extremist material from neo-Nazi groups such as Combat 18, the North West Infidels and National Action (now a proscribed organisation). This once again tallies with our experience of reporting extremist far-right hate material to Google, with one video in particular remaining online for five months. HASC's statement on this should thoroughly embarrass Google staff: "It is shocking that Google failed to perform basic due diligence regarding advertising on YouTube paid for by reputable companies and organisations which appeared alongside videos containing inappropriate and unacceptable content, some of which were created by terrorist organisations. We believe it to be a reflection of the laissez-faire approach that many social media companies have taken to moderating extremist content on their platforms."

Successes

There are some successes in this monumental struggle with corporate social media companies. One of the suggestions I made when I presented to the HASC in December 2016 has been taken on board. Since founding Tell MAMA in 2011, it had become clear to me that the cost of social media platforms to the State and to civil society groups was running into the tens of millions. Policing costs, evidence collection, work by civil society groups and the personal and economic impacts on victims all meant that there were financial, reputational, emotional and physical costs to society. Yet none of the social media platforms was investing hard cash in the problems that were percolating through them.

I made clear to the committee that this could not continue and that social media companies had to set aside a fund to support work on countering hatred, or to ensure that policing costs were reimbursed. No longer could such large profit-making corporations pass the costs onto the public purse or onto the public. The committee made the following recommendation: "Social media companies that fail to proactively search for and remove illegal material should pay towards costs of the police doing so instead." It went on to add: "Social media companies currently face almost no penalties for failing to remove illegal content. There are too many examples of social media companies being made aware of illegal material yet failing to remove it, or to do so in a timely way. We recommend that the Government consult on a system of escalating sanctions to include meaningful fines for social media companies which fail to remove illegal content within a strict timeframe." It is clear, after the recent Times articles on Google and Facebook, that the best way to hit social media companies is in their pockets.

Twitter's Laissez-Faire Approach

The report also rightly rounds on Twitter, probably one of the worst offenders when it comes to inaction. Over four years, Tell MAMA reported specific anti-Muslim hate accounts that pumped out hundreds of anti-Muslim and anti-Islam tweets a day. It took four years for action on one account; on another, it took 18 months before the account was closed, despite our reporting it through the normal channels, including writing on numerous occasions to Twitter's representative, Nick Pickles. The second account was suspended just before Pickles gave evidence, and it is this relaxed, careless attitude to the lives of victims that has led to the public humiliation of these platforms by the HASC. The case is highlighted in the HASC report published today.

Additionally, beyond its wilful refusal to take immediate action on illegal content, Twitter's platform has a fundamental failure within its operating model, something it has tried to keep quiet but which continues to affect the lives of victims of online harassment. Even if accounts are shut down, Twitter cannot stop the same individual from opening another account and continuing their hateful activity. We have seen this time and time again; what we had, in the form of these social media companies, were individuals and entrepreneurs who developed a platform with little thought for the problems that might arise. Twitter has recently stated that it is trying to find a solution, some eight years after the platform started operating. Shambolic is the term that comes to mind.

Transparency

The HASC report also berated social media companies for their lack of transparency. Social media companies have told us that their platforms will 'increase transparency, ensure probity and widen greater understanding'. Yet the report specifically cites their 'secretive' nature, in particular regarding the "level of resources that they devote to monitoring and removing inappropriate content". The committee went on to add: "Social media companies are highly secretive about the number of staff and the level of resources that they devote to monitoring and removing inappropriate content. Google, Facebook and Twitter all refused to tell us the number of staff that they employed for such purposes". So to sum up, whilst social media companies expect their platforms to increase transparency and probity, they are simply not willing to be honest with the public about their own affairs.

Anyone who has used social media and been targeted with hatred, intolerance and prejudice should warmly welcome the HASC report out today. Yvette Cooper MP and her colleagues have done us all proud, and social media companies are on notice: change, or people power will in future ensure that you are hit in your pockets, which seems to be the only thing that makes you remove illegal content.

Fiyaz Mughal is the Founder of Faith Matters and its Director. He is also the Founder of Tell MAMA, which supports victims of anti-Muslim hatred and maps, measures and monitors it across the UK.
