Sri Lanka shuts down Facebook. Can we stop the spread of hate permanently?
After the bombings, Sri Lanka’s government shut down Facebook, WhatsApp, Instagram and another social application, Viber. Apparently these applications are toxic in crises. (Also at all other times, but especially in crises.)
According to government spokesman Harindra B. Dassanayake, as quoted in the New York Times, “These platforms are banned because they were spreading hate speeches and amplifying them.” The government was worried about riots and further violence. “Some attacks that have actually not taken place are being reported. It spreads that we are being attacked and we have to respond,” said Dassanayake. People were also sharing information about making bombs.
Remember, these are the same social networks that became an essential form of communication for rebels during the Arab Spring. Shutting them down is a free speech issue.
Tech thinker Kara Swisher is conflicted, but she regretfully recognizes the problem. Her commentary included this:
[W]hen the Sri Lankan government temporarily shut down access to American social media services like Facebook and Google’s YouTube after the bombings there on Easter morning, my first thought was “good.”
Good, because it could save lives. Good, because the companies that run these platforms seem incapable of controlling the powerful global tools they have built. Good, because the toxic digital waste of misinformation that floods these platforms has overwhelmed what was once so very good about them. And indeed, by Sunday morning so many false reports about the carnage were already circulating online that the Sri Lankan government worried more violence would follow.
Time to fix this
Here’s the deal: Facebook is effective at spreading hate and falsehoods all day, every day, everywhere.
Remember, point 15 of “The fundamental laws of Facebook” is “The algorithm loves lies and fakes. Minor, cosmetic changes to the algorithm, loudly proclaimed by management, don’t change that.”
We put up with this because the usual effect of spreading hate and falsehoods is just to make hateful people feel more hateful. Most of the time, it doesn’t lead to actual violence.
Right now in Sri Lanka, though, it is spurring violence. So the government shut off the hate machine.
Facebook will not change voluntarily. We must force it to change.
The change must start with regulation in the US and EU, Facebook’s largest markets. Other nations will follow.
US and EU regulators must immediately put Facebook on notice that, within a year, it must mount a crash program to build an algorithm that detects and prevents the spread of hate speech and false news reports. Such a program would have the following characteristics (a rough sketch follows the list):
- Identifies false news based on sources and corroborating evidence.
- Identifies hate speech and prejudice based on algorithmic analysis of language.
- Completes identification of such content within minutes of its posting.
- Prevents others from sharing such content once it has been identified.
- Exposes a public test harness so that government agencies and other outsiders can test the effectiveness of the solution and monitor it continually.
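To make the proposal concrete, here is a minimal Python sketch of the kind of screening pipeline that list describes: every post passes through hate-speech and false-news checks, flagged posts have their sharing disabled, and a simple test harness lets outside auditors run labeled examples through the same pipeline. Everything in it is an assumption for illustration — the classifier functions, thresholds, keyword lists, and names (`Post`, `screen_post`, `test_harness`, and so on) are invented stand-ins, not anything Facebook actually exposes.

```python
"""Illustrative sketch of the screening pipeline described above.

Everything here is hypothetical: the classifiers are crude keyword
stand-ins for real trained models, and all names are invented.
"""

from dataclasses import dataclass
import time


@dataclass
class Post:
    post_id: str
    text: str
    source_domain: str        # where a linked story comes from, if any
    sharing_enabled: bool = True


# Hypothetical stand-ins for real models and source databases.
KNOWN_UNRELIABLE_SOURCES = {"example-fake-news.test"}
HATE_TERMS = {"exterminate", "vermin"}   # toy lexicon, illustration only


def score_hate_speech(text: str) -> float:
    """Return a 0..1 hate score; here, a crude keyword heuristic."""
    words = set(text.lower().split())
    return 1.0 if words & HATE_TERMS else 0.0


def score_false_news(text: str, source_domain: str) -> float:
    """Return a 0..1 falsehood score based on the source; a real system
    would also check corroborating reports from independent outlets."""
    return 1.0 if source_domain in KNOWN_UNRELIABLE_SOURCES else 0.0


@dataclass
class ScreeningResult:
    post_id: str
    hate_score: float
    falsehood_score: float
    blocked: bool
    elapsed_seconds: float


def screen_post(post: Post, threshold: float = 0.8) -> ScreeningResult:
    """Screen one post; if it crosses the threshold, disable sharing."""
    start = time.monotonic()
    hate = score_hate_speech(post.text)
    falsehood = score_false_news(post.text, post.source_domain)
    blocked = max(hate, falsehood) >= threshold
    if blocked:
        post.sharing_enabled = False   # "prevents others from sharing it"
    return ScreeningResult(post.post_id, hate, falsehood, blocked,
                           time.monotonic() - start)


def test_harness(labeled_examples: list[tuple[Post, bool]]) -> float:
    """Public test harness: outsiders feed labeled posts through the same
    pipeline and get back the fraction the screen classified correctly."""
    correct = sum(screen_post(post).blocked == should_block
                  for post, should_block in labeled_examples)
    return correct / len(labeled_examples)


if __name__ == "__main__":
    sample = Post("p1", "They are vermin and must be stopped",
                  "example-fake-news.test")
    print(screen_post(sample))
    print("sharing enabled?", sample.sharing_enabled)
```

The point of the sketch is the shape, not the heuristics: detection happens at posting time within a bounded delay, blocking is automatic rather than dependent on user reports, and the same code path is exposed to auditors so effectiveness claims can be verified from outside.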
Right now, Facebook is optimized to help such content spread. It must do the opposite. (And this applies equally to all other social networks, whether owned by Facebook or by others.)
If Facebook is smart, it will not only develop this AI-based improvement to its algorithm, it will release it as open source. That would allow other social networks to use the same algorithm, and other coders to suggest improvements.
Congress and EU regulators must pass rules specifying that, in the absence of a testable program like this, they will shut down Facebook entirely, or at least require it to disable all of its sharing features.
I hate the idea of regulating speech. But these social platforms have weaponized hate and accelerated conflict. Unless they change their algorithms — and only government pressure can accomplish that — they will cause the worst of us to burn down the world.