It happened one night in 2014. Boko Haram stormed a boarding school in the town of Chibok and kidnapped nearly 300 schoolgirls. For simplicity, let us focus on the two narratives that then went viral, both worldwide and in Nigeria, where the crime happened. One laid out the facts of the incident and asked a global audience to keep pressuring the government to rescue the captives -- better known as the #BringBackOurGirls campaign. The other claimed the Chibok kidnapping was a hoax.
In a TED Talk entitled "How fake news does real harm," CNN journalist Stephanie Busari recounts her encounters while reporting from the field. Since the kidnapping, she has met with some of the captive girls' families, as well as girls who escaped by jumping off the truck that carried them away that night. Despite these firsthand accounts, the alternative fact that the crime never happened persists in the country.
Busari believes the hoax narrative significantly delayed rescue operations. But while it is the job of journalists like her -- and of people at companies like Google and Facebook -- to ensure fake news does not spread, ordinary citizens like you and me also have a role to play.
"We are the ones who share the content. We are the ones who share the stories online. In this day and age, we're all publishers, and we have responsibility," says Busari.
Since we are stakeholders in this information exchange, we have to treat it as a process in which we are not merely reactors but also actors. As actors, it matters that we remain sceptical. Scepticism, in the sense of extending our suspension of judgment, is a virtue -- one we exhibit by asking tough questions, just as Busari encourages.
Take lending as a comparison. No loan provider is supposed to approve an application without due diligence. The assumption that credentials can be faked underlies the procedures of banks and other financial institutions.
But how do we translate due diligence or enact scepticism in our daily lives? Are we not supposed to just "click and go," particularly on Facebook and Twitter?
While it may seem unnecessary to spare mental space for things we merely scan on our feeds, it is precisely this habit of instant gratification that raises the question of why we should filter what goes into our heads. Studies already document the negative impact of online reading on our brains: consuming digital content may help us gather information faster, but it does not promote deep, analytical thinking. In other words, we may be piling up mostly junk information as we browse our social accounts.
Sharing useless data with another person is the inevitable consequence of this unguarded participation in the communication cycle. This is not to say we should stop posting and talking about stories that resonate with us (from feline photos to Donald Trump). But it is a reminder to stay open to the possibility that we endanger other people when we do not think before we click.
Further, our brains deserve some challenge -- the challenge of asking tough questions. Perhaps we can borrow Busari's line of questioning: in her talk, she shares three questions to ask of any story we discover online, in whatever form.
It is important that we ask these questions. Countering misinformation is the responsibility not of a select few -- journalists and technocrats -- but of every citizen, or netizen if you will. The fabric of social media is made up of individual threads, of individual minds. It is about time we treated the digital realm as an extension of our society, not just a virtual place for escape. We have roles to fulfil here, just as we do in the physical world, because the fact of the matter is that there are real, and potentially dangerous, consequences if we keep ignoring them.
So, when we encounter alternative facts or fake news stories, we have to make sure their reach stops with us.