With every new year comes change, and Germany is attempting a big one. The country has put in place a law requiring social networks with more than two million registered users to remove “obviously illegal” fake news and hate speech within 24 hours of being notified. If they don’t, they could be fined up to €50m (£44.3m).
The law – titled Netzwerkdurchsetzungsgesetz (or the Network Enforcement Act) – will affect the likes of Facebook, Twitter, YouTube, Instagram, Snapchat, Tumblr and more, ensuring they act faster than they have previously in removing illegal content. And rightly so. In the UK, these social networks have regularly been criticised for failing to remove damaging content fast enough. In September 2017, Theresa May proposed that internet companies should remove online extremist content within two hours of it being posted, and more recently security minister Ben Wallace said these companies should face a tax punishment for not dealing with radical online content in the UK.
Some social media companies have taken proactive measures themselves to reassure people they’re committed to fighting harmful content. In December, YouTube CEO Susan Wojcicki said the company would increase staff numbers to 10,000 in 2018 to help better moderate video content, among other things, after being hit with scandals involving violent content embedded in children’s cartoons and disturbing suggested search terms. However, YouTuber Logan Paul undermined that promise at the start of 2018, posting a video – to widespread criticism from viewers – of a man who had committed suicide in a Japanese forest. The video remained available to all for hours after being posted, until Mr Paul himself removed it.
These laws and company changes pose some questions: will Germany’s Network Enforcement Act be enough to eradicate harmful content on social media? And is removal within 24 hours fast enough?
It’s a step in the right direction, but it’s not without its flaws. While for many people the £44.3m fine seems hefty, for many social media companies it’s a drop in the ocean. Even when a large amount of content is reported, and therefore a lot of money is at stake, many social media companies already struggle to have their content moderators sift through reported content fast enough. If the tech giants can’t do it, how will the German government be able to monitor them, and ensure companies are fined accordingly?
Equally, in the case of Logan Paul, public pressure led him to remove his own video within a few hours, but the controversial nature of the video saw many people replicate and re-post it on YouTube, meaning it can still be easily found. So while the German law requires that the original content be removed within 24 hours, there’s no guarantee that other accounts aren’t spreading copies of it further. Is 24 hours fast enough, then? Not a chance.
The other flaw in the law is its focus on “obviously illegal” content. While this may help prioritise what needs to be removed fastest, it’s a low bar, and it leaves a grey area in which plenty of harmful content will stay online. And how is “obviously illegal” defined in the first place?
Cyber bullying, while not obviously illegal, can be incredibly damaging, and has led to suicides among children and teenagers across the world – so why isn’t that a priority now? Even some forms of radicalisation may not be deemed “obviously illegal”. Where do we draw the line?
Ultimately, the Network Enforcement Act is a big step in the right direction with all the right intent, but it’s a blunt instrument with dubious workability and many potential unintended consequences. Solutions that curb this content before it can go live on social networks are needed. Once this harmful content is seen, the damage is done; it can be shared and replicated exponentially before moderators have a chance to take it down – and fining a social network 24 hours later won’t stop an eight-year-old girl remembering the time she saw a slaughterhouse video embedded in Peppa Pig.