Social Media Platforms Face Government Regulation And Fines Over Online Harms, Under New Proposals

Tech firms will have responsibilities outlined in a mandatory 'duty of care'.

Tech bosses have been told they must develop new ways to detect illegal content, and could face fines for harmful content, under new proposals being set out by the government.

Home Secretary Sajid Javid said the internet is a “hunting ground for monsters” as he unveiled a government white paper on online harms on Monday.

Alongside Jeremy Wright, secretary of state for digital, culture, media and sport, Javid told the audience that UK users had trusted tech giants with “their children” but the companies failed to “repay that trust”. As a result they will now be subject to government regulation.

Javid explained: “They [tech companies] have failed to take responsibility for the content posted by others online, but they are quite happy to profit from that very same content. They have made their choice and I’ve made mine.”

Sajid Javid. (Associated Press)

In September 2018 Javid told Google, Facebook and Microsoft they needed to act to protect vulnerable users against abuse but said they had “failed” to step up in the last six months. “To be a bystander is to be complicit and I’m not prepared to let them stand by any longer - if you run any business you have a duty to protect your customers.”

At the event in central London, Jeremy Wright told the audience technology was “one of the great policy challenges of our age”.

The “world first” internet safety laws will impose a legal duty of care on social media companies to protect their users. Companies that fail to comply could face penalties and fines, or even be blocked in the UK.

The joint proposal on online harms from the Home Office and the Department for Digital, Culture, Media and Sport (DCMS) says a regulator will be appointed in due course to ensure internet firms meet their responsibilities.

The government is currently consulting on whether to create a new regulator or use an existing one, such as Ofcom, to enforce the new rules.

As well as financial penalties, the regulatory body will have the potential to bring criminal charges against individual senior staff members and management.

The white paper is now open for consultation until 1 July 2019, at which point Javid and Wright say they hope parliament will begin to legislate.

The consultation will address ongoing questions about how the regulator will manage ‘private channels’. Although the white paper states private communications will not be scanned or monitored, it later goes on to explain that the precise definition of a ‘private channel’ has not yet been established.

The proposed measures are part of a government plan to make the UK one of the safest places in the world to be online. They come in response to concerns over the growth of violent content, material encouraging suicide, disinformation, and the exposure of children to cyberbullying and other inappropriate material online.

A number of charities and campaigners have called for greater regulation to be introduced, while several reports from MPs and other groups published this year have also supported the calls for a duty of care to be implemented.

Prime Minister Theresa May said the proposals were a sign the age of self-regulation for internet companies was over.

“The internet can be brilliant at connecting people across the world - but for too long these companies have not done enough to protect users, especially children and young people, from harmful content,” she said.

“That is not good enough, and it is time to do things differently. We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe.

“Online companies must start taking responsibility for their platforms, and help restore public trust in this technology.”

The proposed new laws will apply to any company that allows users to share or discover user-generated content, or to interact with each other online, the government said. That covers companies of all sizes, from social media platforms to file hosting sites, forums, messaging services and search engines.

It also calls for powers to be given to a regulator to force internet firms to publish annual transparency reports on the harmful content on their platforms and how they are addressing it.

Companies including Facebook and Twitter already publish reports of this nature.

Last week, Facebook boss Mark Zuckerberg told politicians in Ireland that the company would work with governments to establish new policies in a bid to regulate social media.

Javid added that tech firms had a “moral duty” to protect the young people they “profit from”.

“Despite our repeated calls to action, harmful and illegal content - including child abuse and terrorism - is still too readily available online,” he said.

“That is why we are forcing these firms to clean up their act once and for all. I made it my mission to protect our young people - and we are now delivering on that promise.”

A 12-week consultation on the proposals will now take place before the government publishes its final proposals for legislation.

The government said the proposed regulator would have a legal duty to pay due regard to innovation, as well as to protect users’ rights online.

Peter Wanless, chief executive of children’s charity the NSPCC - which has campaigned for regulation for the past two years - said the proposals would make the UK a “world pioneer” in protecting children online.

However, former culture secretary John Whittingdale warned ministers risked dragging people into a “draconian censorship regime” in their attempts to regulate internet firms.

Writing in the Mail on Sunday, he said he feared the plans could “give succour to Britain’s enemies”, giving them an excuse to further censor their own people.

Responding to the proposals, Facebook’s UK head of public policy Rebecca Stimson said: “The internet has transformed how billions of people live, work and connect with each other, but new forms of communication also bring huge challenges.

“We have responsibilities to keep people safe on our services and we share the government’s commitment to tackling harmful content online. As Mark Zuckerberg (Facebook’s founder) said last month, new regulations are needed so that we have a standardised approach across platforms, and private companies aren’t making so many important decisions alone.”

She added that while Facebook had tripled the number of people it employs to identify harmful content and continued to review its policies, “we know there is much more to do”.

“New rules for the internet should protect society from harm while also supporting innovation, the digital economy and freedom of speech. These are complex issues to get right and we look forward to working with the government and parliament to ensure new regulations are effective,” she added.
