Senate Select Committee on Intelligence Chairman Mark R. Warner (D-VA) sent a letter to Meta CEO Mark Zuckerberg, pressing the company on its efforts to combat the spread of misinformation, hate speech, and incitement content around the world. Reporting indicates that Facebook devotes 84 percent of its misinformation budget to the United States, where only ten percent of its users reside.
“In its pursuit of growth and dominance in new markets, I worry that Meta has not adequately invested in the technical, organizational, and human safeguards necessary to ensuring that your platform is not used to incite violence and real-world harm,” wrote Sen. Warner, pointing to evidence, acknowledged by Meta, that the platform was used to foment genocide in Myanmar. “I am concerned that Meta is not taking seriously the responsibility it has to ensure that Facebook and its other platforms do not inspire similar events in other nations around the world.”
In his letter, Sen. Warner noted that Facebook supported more than 110 languages on its platform as of October 2021, and that users and advertisers posted on the platform in over 160 languages. However, Facebook’s community standards, the policies that outline what is and isn’t allowed on the platform, were available in fewer than half of the languages that Facebook offered at that time. Facebook has previously said that it uses artificial intelligence to proactively identify hate speech in more than 50 languages and that it has native speakers reviewing content in more than 70 languages.
“Setting aside the efficacy of Facebook’s AI solutions to detect hate speech and violent rhetoric in all of the languages that it offers, the fact that Facebook does not employ native speakers in dozens of languages officially welcomed on its platform is troubling – indicating that Facebook has prioritized growth over the safety of its users and the communities Facebook operates in,” Sen. Warner wrote, citing documents provided by Facebook whistleblower Frances Haugen. “Of particular concern is the lack of resources dedicated to what Facebook itself calls ‘at-risk countries’ – nations that are especially vulnerable to misinformation, hate speech, and incitement to violence.”
Sen. Warner noted that in Ethiopia, Facebook reportedly did not have automated systems capable of flagging harmful posts in Amharic and Oromo, the country’s two most widely spoken languages. A March 2021 internal report said that armed groups within Ethiopia were using Facebook to incite violence against ethnic minorities, recruit members, and fundraise.
“In the wake of Facebook’s role in the genocide of the Rohingya in Myanmar – where UN investigators explicitly described Facebook as playing a ‘determining role’ in the atrocities – one would imagine more resources would be dedicated to places like Ethiopia. Even in languages where Meta does have experience, the systems in place appear woefully inadequate at preventing violent hate speech from appearing on Facebook,” observed Sen. Warner, citing an investigation conducted by the non-profit Global Witness, which was able to post ads in Swahili and English ahead of the 2022 general elections in Kenya that violated Facebook’s stated Community Standards for hate speech and ethnic-based calls to violence.
“Unfortunately, these are not isolated cases – or new revelations. For nearly six years, Facebook’s role in fueling, amplifying, and accelerating racial, religious, and ethnic violence has been documented across the globe – including in Bangladesh, Indonesia, South Sudan, and Sri Lanka. In other developing countries – such as Cambodia, Vietnam and the Philippines – Facebook has reportedly courted autocratic parties and leaders in order to ensure its continued penetration of those markets,” wrote Sen. Warner. “Across many of these cases, Facebook’s global success – an outgrowth of its business strategy to cultivate high levels of global dependence through efforts like Facebook Free Basics and Internet.org – has heightened the effects of its misuse. In many developing countries, Facebook, in effect, constitutes the internet for millions of people, and serves as the infrastructure for significant social, political, and economic activity.”
“Ultimately, the destabilizing impacts of your platform on fragile societies across the globe poses a set of regional – if not global – security risks,” concluded Sen. Warner, posing a series of questions to Zuckerberg about the company’s investments in foreign-language content moderation and requesting a response by March 15, 2023.