Reading the Guardian's Facebook Files reporting on Facebook's secret moderation rules, I was left with one question: why is the tech giant so reluctant to censor? Their rules are so complicated that they betray a lack of commitment to removing offensive content. That is not to say they are not trying, but their heart is clearly not in it.
Each rule comes with so many caveats and exceptions that one is no longer surprised the moderators must work at such a pace, with only ten seconds to apply these rules to each post. For example, graphic violence against animals can be ignored if it is a photograph, but not if it is a video; a video will be marked as disturbing. If the photo counts as animal mutilation, it too can be marked as disturbing, but not deleted. My inner communist dictator screams out, "Why don't you just delete the whole lot?!"
"Freedom of expression" is the obvious answer to this question, but just because it is obvious doesn't mean it is true. Freedom of speech and expression are about our right to make a statement, but a video of an individual mutilating an animal is not a statement, it is them relishing and broadcasting their sadism. A statement has to have political intent, like a charity sharing the video as part of a campaign to make the public take abuse more seriously.
However, Facebook's rule is not about the intention of the user, but about the potential effects the content might have. The rule is "We allow photos and videos documenting animal abuse for awareness", not "If the user is intending to raise awareness, we do not delete it". The importance of this emphasis cannot be overstated: Facebook are not protecting their users' right to raise awareness, they are protecting their content's right to raise awareness. They seem to believe that, regardless of the original intent, Facebook content might do good for society, and therefore it should at all costs be left alone.
You might accuse me of reading too much into a leaked phrase from an internal handbook, but this falls directly within the philosophy of technology offered by the 20th-century German philosopher Martin Heidegger. In his 1954 essay "The Question Concerning Technology", Heidegger argues that the problem of technology is not about which devices are good for us and which are bad; the problem is the cultural mindset that produces them: everything in the world should be catalogued and made available and accessible to everyone as a resource.
Even though Heidegger was writing in the fifties, his picture of technology is most applicable to the digital technologies of our time. Consider your smartphone. Its purpose is not really talking to people, but making information and people available to you, and you available to them. Anyone who has been enjoying a quiet moment only to be dragged out of it by a batch of emails and a phone call has experienced this: technology yanking you from where you are to make you available somewhere else. The task of the smartphone is to turn you into a resource for others and to allow you to use others as a resource.
To return to Facebook, consider their mission statement:
Founded in 2004, Facebook's mission is to give people the power to share and make the world more open and connected. People use Facebook to stay connected with friends and family, to discover what's going on in the world, and to share and express what matters to them.
What most of us would consider to be Facebook's prime purpose, speaking to our friends and families, is almost an afterthought. Facebook's task is really about making the world "more open and connected" by giving "people the power to share": in other words, by getting humans to produce Facebook content, the world gains more information as a resource. If we ask why this is a good thing, the only possible answer is "because more information, more connectedness, and more availability of resources is a good thing".
The purpose of Facebook is to capture content and distribute it, regardless of what that content is, because the mindset of technology says that we must make everything as available and usable as possible. While freedom of expression has very clear exceptions when that expression causes harm, if you believe that available content is a self-demonstrating good, you will not want to censor even the most harmful of content. If Heidegger is right, Facebook's reluctance stems not from a commitment to the classical liberal value of freedom, but from the fact that censorship runs directly against the essence of technology, and against the mission of their organisation.