Black Mental Health Online Is The Next Pandemic

Social media companies should have an explicit duty of care to ensure all users are protected from traumatic content, write Seyi Akiwowo and Lauren Pemberton-Nelson.
Protesters demonstrate outside the Nigeria High Commission in central London, over the Nigerian federal Special Anti-Robbery Squad (Sars)
PA

Months after a surge of organisations and institutions committed to supporting Black people and turning the tide on anti-Blackness, social media companies are still yet to truly play their part.

And with the rise in devastating content of police brutality against End SARS protestors in Nigeria, social media companies can no longer waver on this issue.

In May and June, there were waves of dedication to #BlackLivesMatter by all kinds of organisations and institutions. For many Black people, the sudden “white urgency” and spotlight made for an overwhelming and intense period.

This was due in no small part to a continuous stream of graphic videos on social media, a further sign that institutionally, both online and off, Black lives did not and do not matter.

A few months later, after hundreds – if not thousands – of commitments from these organisations and institutions, what progress has been made? At a time when protestors in Nigeria are fighting for their lives against police brutality, and people are using online mediums more than ever before, what are these institutions doing to show that Black Lives Matter online too?

Uploading or sharing a piece of bystander content that shows violent, traumatic and triggering racial injustice, with the intention of sparking change, has become a common tactic across our socials.

In March, we saw that with Desmond Ziggy Mombeyarara; in May, we saw it with George Floyd; and we’re currently seeing it with the #ENDSARS movement in Nigeria.

There is an argument that, with the right intentions, this can spark offline action and move those with privilege and power to do more. It’s incredibly important to raise awareness of the injustices that Black people face worldwide. Without people posting about the Lekki Massacre, the world may not have known about it.

“Graphic content of police brutality has an impact on the mental health of Black people.”

But even the purest of intentions can cause unintended consequences and harm to Black communities – many of us want to be informed about what is happening and how to help without seeing public executions.

For Black people, the trauma contained in these videos translates offline – knowing that it could be us in a similar situation struggling for our lives, or knowing that it is our friends and family currently experiencing this trauma.

Graphic content of police brutality has an impact on the mental health of Black people. Ultimately, social media companies should have an explicit duty of care to ensure that all users are protected when accessing their platforms, in the same way they do with other graphic content, such as animal cruelty.

Platforms must make a commitment to shedding light on injustices as well as a commitment to digital citizenship so viral content on their platform doesn’t grow and generate profit at the expense of already marginalised communities.

Google, Facebook, Twitter and TikTok could – and should – ensure that those engaging with their platforms are given more autonomy with what they choose to engage in, through blurring violent imagery, trigger warning labels and filtering content.

It’s not easy to add trigger warnings when live streaming or documenting what is happening on the ground. What we all must avoid is publicly reprimanding brave activists and potentially triggering a pile-on.

This is where tech companies can help – giving all users more autonomy to upload content responsibly in the first place, as well as adding trigger warnings on trending topics and hashtags when they are aware there is graphic content. This already applies to some topics, but is not always consistent.

Thousands of videos of police brutality and racial injustices worldwide have circulated on social media, and will continue to until there is systemic change.

Until we reach a point where videos like this no longer exist, and no longer need to be circulated, social media platforms must invest in and facilitate more appropriate ways to campaign online – ways that centre the dignity, humanity, mental health and wellbeing of Black users.

Seyi Akiwowo is founder of Glitch UK and Lauren Pemberton-Nelson is the organisation’s senior communications coordinator.
