Dozens of Rohingya refugees in the UK and US have sued Facebook, accusing the social media giant of allowing hate speech against them to spread.
They are demanding more than $150 billion (BRL 850 billion) in compensation, alleging that Facebook's platforms promoted violence against the persecuted minority.
An estimated 10,000 Rohingya Muslims were killed during a 2017 military crackdown in Myanmar, a mostly Buddhist country.
Facebook — which was renamed Meta — did not immediately respond to the allegations.
The company is accused of allowing “the spread of heinous and dangerous disinformation to continue for years.”
In the UK, a British law firm representing some of the refugees wrote a letter to Facebook, seen by the BBC, alleging the following:
- Facebook’s algorithms “amplified hate speech against the Rohingya people”
- The company “failed to invest” in moderators and fact-checkers who knew about the political situation in Myanmar
- The company failed to remove posts or delete accounts that incited violence against the Rohingya people
- The company has failed to “take adequate and timely action” despite warnings from charities and the media
In the United States, lawyers have filed a legal complaint against Facebook in San Francisco, accusing it of “being willing to trade the lives of the Rohingya people for better market penetration in a small Southeast Asian country.”
They cite Facebook posts that appeared in an investigation by the Reuters news agency, including one in 2013 that said: “We must fight them the same way Hitler did the Jews.”
Another post read: “Pour fuel and set fire so they can find Allah faster.”
Facebook has over 20 million users in Myanmar. For many, the social media site is the main or only way to get and share news.
Facebook admitted in 2018 that it had not done enough to prevent incitement to violence and hate speech against the Rohingya.
This came after an independent report, commissioned by Facebook, which said the platform had created an “enabling environment” for the proliferation of human rights abuses.
Analysis by James Clayton, North America Technology Reporter
What happened in Myanmar is one of the first warning signs for Facebook about the platform’s problems.
Social networking was hugely popular there — but the company didn't fully understand what was happening on its own platform. It wasn't actively moderating content in local languages such as Burmese and Rakhine.
If it had, it would have seen anti-Muslim hate speech and disinformation about Rohingya terrorist plots. Critics say this content helped fuel ethnic tensions that escalated into brutal violence.
Mark Zuckerberg has personally admitted that the company made mistakes as widespread violence escalated there.
That's what makes this lawsuit particularly interesting: Facebook doesn't deny that it could have done more.
Whether that means it is legally liable is a very different question. Could this lawsuit succeed? It's possible, though unlikely.
But as its parent, Meta, tries to shift its focus away from Facebook, it still finds itself haunted by past mistakes.
The Rohingya are seen as illegal migrants across Myanmar and have been discriminated against by the government and the public for decades.
In 2017, Myanmar’s military launched a violent crackdown in Rakhine state after Rohingya militants carried out deadly attacks on police posts.
Thousands of people died and more than 700,000 Rohingya fled to neighbouring Bangladesh. There have also been widespread allegations of human rights abuses, including arbitrary killings, rape and the burning of land.
In 2018, the UN accused Facebook of being “slow and ineffective” in its response to the spread of online hatred.
Under US law, Facebook is largely protected from liability for the content posted by its users. But the new lawsuit argues that Myanmar law — which has no such protections — should prevail in this case.
The BBC approached Meta to comment on the case but got no response.