Over the weekend the Digital Secretary Matt Hancock gave an interview to The Times, in which he voiced his fears over children’s use of social media.
In the conversation he outlined a whole gamut of ideas to protect children from the perils of social media. Some of these, such as age limits, appear in the General Data Protection Regulation (GDPR); others have been touted in the UK’s Digital Charter and the Internet Safety Strategy green paper, both dedicated to making “Britain the safest place in the world to be online”.
What was new, however, was Mr Hancock’s idea of mandatory limits on the amount of time children spend on certain sites. He is quoted as saying he wouldn’t want to restrict adults, but that for children “it might be right to have different time cut-offs”.
At first sight, many parents reading the interview might have nodded and thought what a great idea: anything that gets their child to put down the phone or step away from the laptop is welcome. Many parents fret about addiction to screens and are at a loss as to how best to curb access to the internet and social media. It is a serious concern and one which needs proper consideration.
But is a Government- or company-enforced time cut-off the solution, and how exactly does Mr Hancock think such a proposal would work? He offered no concrete ideas, merely indicating that a system might be similar to the age verification for online pornography passed in 2017. It was a curious example to choose, as Mr Hancock’s own department, the Department for Digital, Culture, Media and Sport (DCMS), put those plans on hold a couple of days later, allegedly because no workable solution had been found.
The delay by DCMS on age verification is telling: like many of the technical or data-driven solutions passed by Parliament, such schemes sound possible on paper but struggle to materialise in reality, with technical hitches, impacts on privacy and question marks over security getting in the way.
Age verification aside, the idea that a time cut-off will be technically and morally easy to enforce seems far-fetched and fraught with complexity, not least because it would mean people’s internet activity being constantly watched. Not all children have their own devices, which means family members who share the same laptop, tablet or computer may find their activity monitored too. More fundamentally, the assumption that people will willingly allow their internet activity to be surveilled in order to be protected from themselves could be a step too far.
Whether officials believe it or not, people are uncomfortable about the extent to which they are monitored and tracked online. Those in power like to remind us that we allow companies to monitor our lives all the time with little complaint or concern, but many people either have little knowledge of the extent to which this happens, or are uneasy about it when they do. Research from doteveryone published earlier this month shows just how unclear people are about the extent to which their data is monitored, shared and used.
The Government may make us feel guilty, recommending that we allow our children to be tracked as a mark of good parenting, because it will supposedly be the only way to keep them safe and healthy amid the perils of online addiction. But is permitting children to be watched in everything they do the right approach for society and individual parents to take, or would we be handing over parenting responsibilities to the state and big business?
Furthermore, isn’t outside influence dictating what we do, when we do it and for how long something most of us rally against? Imagine if the state determined how many books children could read, which toys they could play with, how much television they could watch or how much radio they could listen to. Such restrictions would seem absurd at best, downright authoritarian at worst.
Whilst it is welcome that the UK Government is putting some thought into how we build a safe and sustainable connected world, it is far from reassuring that ideas like this are being presented in a random, ad hoc fashion, with little deep thought or analysis of how they would work and what the unintended consequences might be.
There are a great many people working to consider these ideas: campaigners such as myself, technologists, ethicists, data experts, lawyers and academics, to name but a few. Ideas are being explored collaboratively, with an understanding that it may not be solutions we are looking for, but ways of learning, educating, guiding and promoting best practice as a starting point.
Society knows that connectivity is here to stay, and that regulation may be necessary going forward. What is essential now is that we ensure regulation supports us, not controls us.