The class-action suit against Meta, Facebook's parent company, was brought in California by a Rohingya woman living in Illinois on behalf of the 10,000-plus Rohingya refugees who have resettled in the United States since 2012. It alleges that Facebook's algorithm amplified hate speech and that the company neglected to remove inflammatory content despite repeated warnings that such posts could foment ethnic violence.
A similar complaint against the tech giant is expected to be filed in a British court next year, the BBC reported. Facebook declined to comment Tuesday on the lawsuits.
Lawyers representing the plaintiff in the California case argued in their complaint that Facebook's entrance into Myanmar a decade ago marked "a key inflection point" for the Rohingya people, who have long been discriminated against in the Buddhist-majority Southeast Asian country.
Myanmar's military launched a "scorched-earth campaign" in 2017 to push Rohingya residents, who are mostly Muslim, out of Rakhine state. Some 750,000 Muslim men, women and children were driven out in a campaign of rape, murder and razed villages that a top United Nations official called a "textbook example of ethnic cleansing." That year, Doctors Without Borders estimated that at least 6,700 Rohingya people had been killed as a result of the attacks.
Around the same time, influential figures such as nationalist monks and top government officials posted or recirculated slurs against the Rohingya, while spreading falsehoods and doctored images that suggested some Rohingya burned their own villages and then blamed it on Myanmar security forces.
Myanmar has denied the genocide accusations and has justified some actions on counterterrorism grounds.
After a searing U.N. report connected Facebook to the atrocities against the Rohingya people, the region became a priority for the company, which began flooding it with resources in 2018, two former employees told The Washington Post.
Facebook in August 2018 began deleting and banning accounts of key individuals and organizations in Myanmar, acknowledging that its platform was used to "foment division and incite offline violence," which the U.N. mission found to be colossal in scale. The platform said that in the third quarter of 2018, it removed some 64,000 pieces of content in Myanmar that violated its policies against hate speech.
"Not until 2018, after the damage had been done, did Facebook executives . . . meekly admit that Facebook should and could have done more," the lawsuit alleges. "Facebook is like a robot programmed with a singular mission: to grow. And the undeniable reality is that Facebook's growth, fueled by hate, division, and misinformation, has left hundreds of thousands of devastated Rohingya lives in its wake."
Even after pledging more resources to regulate the platform, Facebook found in a 2020 internal audit that its algorithm still could not detect covid-related posts written in local Myanmar languages, a gap that could weaken the company's attempts to weed out false information on the platform.
The legal actions in the United States and Britain are part of a growing number of moves to hold alleged perpetrators of genocide responsible. The tiny African nation of Gambia filed a lawsuit against Myanmar at the International Court of Justice in 2019, requesting that the court issue an injunction to stop the Myanmar government from committing "atrocities and genocide against its own Rohingya people."
Backed by the Organization of Islamic Cooperation, Gambia asked a U.S. court to force Facebook to turn over data related to accounts it deleted in 2018 that fueled atrocities in Myanmar. After some legal wrangling, a federal judge in Washington eventually shot down the request this week.
Published: December 8, 2021
By: The Washington Post