On Monday, a United Nations panel recommended that Myanmar’s top army commander and other senior military officials be brought before the International Criminal Court (ICC) or a special tribunal to face charges of genocide. It’s not the first time Myanmar’s leaders have been accused of orchestrating a brutal campaign of murder, rape, and ethnic cleansing against the country’s Muslim Rohingya minority. But the UN finding is significant, because it’s the first step toward formal prosecution of those responsible for displacing an estimated 700,000 people and killing at least 10,000 more over the past year.
The UN used similar tribunals to seek justice after war crimes that took place during the civil war in the former Yugoslavia and the Rwandan genocide in the 1990s. In this case, though, even delayed justice may be difficult. China, which doesn’t like outside powers meddling in Asia, is likely to block any attempt by the UN to begin formal proceedings.
Perhaps more significant than naming the generals, the UN report also singled out Facebook. It said that the social network, which has become an important source of news in a country where just 1 percent of the population had phones eight years ago, had also been a “useful instrument for those seeking to spread hate” during the campaign against the Rohingya. Nor is it the first time such accusations have been leveled against the social network. But the fact that they’re being made in a UN report that accuses a government of mass atrocities is significant. It shows Facebook’s problems go well beyond online bullying or letting Russian trolls mess with the democratic process. We are now talking about genocide.
Twentieth-century genocides were centrally directed affairs, where hate spread from the top down, through government propaganda, TV, and radio. Today, anyone can post hateful comments that incite violence and watch them go viral over social media. Until recently, the tech industry had been reluctant to play policeman. In 2015, Facebook had only two Burmese-speaking content monitors. As of June, long after its problems in Myanmar became public, it was using only a few dozen contractors to monitor posts made by some 18 million users in the country, according to Reuters.
On Monday, Facebook admitted it had been too slow to respond to the crisis. It also banned the country’s top general and dozens of other military-linked accounts, effectively depriving them of their most important megaphone for reaching the public. It’s a small step, and much more will be required to make social media safe for the Rohingya and other ethnic minorities around the world who aren’t protected by their governments. Even if the UN report fails to bring the perpetrators of this atrocity to justice, by turning up the moral pressure on social media companies, it may spur the industry to work even harder on the changes needed to avoid the next one.