Facebook Is ‘a Toilet,’ Fails to Contain Hate Speech in Myanmar | Social Media
“Facebook has been expanding aggressively abroad,” Oliver said of the social media site, which he described as “the worst place to wish someone ‘happy birthday’ other than a funeral.” He added, “More than half its revenue comes from outside the U.S., and more than 80 percent of its users come from foreign countries … The company has made some hugely consequential mistakes overseas.”
Internet usage has exploded in Myanmar, also known as Burma, in recent years as the country has transitioned from military dictatorship to a “quasi-civilian government.” In 2013, only 1.2 percent of citizens had Internet access, but 18 million people now use Facebook – partly because it comes pre-installed on many mobile phones. In fact, Oliver notes that many Burmese people use the terms “Facebook” and “Internet” interchangeably.
But that widespread Facebook usage has led to rampant hate speech, particularly toward the Rohingya, a Muslim minority group who were the victims of a wave of “ethnic cleansing” in 2017 that resulted in at least 10,000 deaths.
“A report on Myanmar prepared by independent U.N. investigators said, ‘Facebook has been a useful instrument for those seeking to spread hate,’” Oliver said. “And it is weird to hear something that started out as frivolous be described like that. It’s like if five years from now, U.N. investigators called bubble tea ‘an aggressive threat to human rights.’”
Much of that hate speech has flowed through military leaders, politicians and Buddhist monks like Ashin Wirathu, whom Time dubbed the “Burmese bin Laden” in a 2013 cover story. Despite numerous warnings, Facebook didn’t ban Wirathu until early 2018 – a delay all the more striking given Facebook’s ridiculously granular content rules, which even regulate the types of anuses that can be photoshopped onto people’s faces.
Facebook failed Myanmar with its spotty efforts to detect hate speech – partly because its technology is not compatible with Myanmar’s various language fonts. The company relies on Burmese users to flag content, but its reporting tools were available only in English until late 2015. And content that was flagged often wasn’t taken down, because Facebook had almost no Burmese-speaking content reviewers: just one in 2014, two as of 2015, and later four through outsourcing.
The company’s CEO, Mark Zuckerberg, recently hired 60 more reviewers to help minimize the problem. But Facebook is clearly not doing enough: In a mid-August report, Reuters found over 1,000 Facebook posts attacking the Rohingya.
Oliver closed the segment with a satirical Facebook ad that emphasized, “Somewhere between 80 and 100 percent of what’s on our site is bullshit – complete bullshit.” The host added, “It is painfully obvious that everyone should be treating anything on their site with extreme skepticism and see Facebook for what it actually is: a fetid swamp of mistruths and outright lies interspersed with the occasional reminder of a dead pet.”