Lawyers representing Rohingya refugees brought a class-action lawsuit against Facebook owner Meta this week, seeking more than $150 billion in damages. The suit claims that the company bears responsibility for the violent and racist messaging on its platform that U.N. investigators have said contributed to a potential genocide in Myanmar.

Speaking to Rest of World, Richard Fields, founding partner of Fields PLLC, one of the two law firms leading the suit, said the case would challenge tech companies’ longstanding defense, under Section 230 of the Communications Decency Act, that they are not responsible for content published on their platforms. That protection is weakening, he said, as lawmakers and regulators reexamine the role social media platforms play in society.

Fields told Rest of World that the “ground has been shifting” on Section 230. “Social media platforms … basically have no guardrails, anything goes,” he said. “I don’t think anybody with a right mind believes that Congress intended for these kinds of things to be happening when 230 was written.”

The class action, brought by Fields and Edelson PC after consultation with hundreds of Rohingya families in the U.S., accuses Facebook of failing to take adequate steps to prevent its platform — offered to hundreds of thousands of Burmese users in subsidized form through the company’s Free Basics product — from hosting and amplifying hate speech targeting the vulnerable Rohingya minority. Messages from military-linked groups in Myanmar vilified and dehumanized the Rohingya, calling them “invaders” and comparing them to animals. The online hate presaged attacks on towns and villages that killed at least 6,700 people and displaced hundreds of thousands more.


The suit alleges that in failing to stop this messaging from proliferating, Meta prioritized user engagement and growth over safety.

“At the core of this complaint is the realization that Facebook was willing to trade the lives of the Rohingya people for better market penetration in a small country in Southeast Asia,” it reads.

Enacted in 1996, Section 230 states that providers of “an interactive computer service” can’t be treated as publishers of content provided by third parties. This defense allows social media companies to claim they aren’t responsible for what their users post, whether they’re the Myanmar military or U.S. election conspiracy theorists. In 2019, a U.S. Court of Appeals ruled that Facebook was not liable for an attack on U.S. citizens by the Palestinian militant group Hamas, citing the immunity granted by Section 230.

Fields told Rest of World that he believes that protection could be weakening. He pointed to comments made by U.S. Supreme Court Justice Clarence Thomas, who in April questioned the scope of the act, saying that “applying old doctrines to new digital platforms is rarely straightforward.”

Lawmakers from both the Republican and Democratic parties have called for reforms to Section 230. On the right, challenges to Twitter’s banning of former President Donald Trump have focused attention on the companies’ power to decide who gets to use their platforms, while on the left, concerns over political and public health misinformation have led to calls for tighter regulation.

Fields added that the companies’ algorithms, which give them power over what content users see, make it hard to argue that they’re neutral platforms. “We believe that through the use of their technology, and their algorithms, specifically, that they have moved over from the hosting and distribution of content to content creation,” he said.

The complaint also quotes liberally from documents leaked by Facebook whistleblower Frances Haugen, which give insights into the company’s approach to managing risks on its platforms in volatile environments, including Myanmar. 

Facebook was heavily criticized for having insufficient moderation capacity in the Burmese language during the worst of the violence against Rohingya populations in 2017 and 2018. Internal documents leaked in October show that the company was still struggling to identify problematic content in Myanmar and other “at-risk countries,” including Ethiopia, where an ethnic conflict has raised concerns of another impending genocide.

A Meta spokesperson said in an email to Rest of World that the company is “appalled by the crimes committed against the Rohingya people in Myanmar.” The company has banned the Myanmar military, known as the Tatmadaw, from its platforms, disrupted misinformation networks, and invested in Burmese-language technology, the spokesperson said.

Nasir Zakaria, a refugee who heads the Rohingya Cultural Center in Chicago, is consulting with Fields on the case. He told Rest of World that there was widespread support for the action among the community.

Zakaria, whose parents and brother are still in refugee camps in Bangladesh, said he hopes the suit will force Meta to deal with hate speech on its platforms and prevent a repeat of the tragedy, in Myanmar or elsewhere in the world.

“Many Rohingya people, families, lost children, wives, brothers, sisters. The loss is so painful,” he said. “So we don’t want, around the world, this to happen again.”