Internal company documents viewed by Reuters show Facebook has known that it hasn't hired enough workers who possess both the language skills and knowledge of local events needed to identify objectionable posts from users in a number of developing countries. The documents also showed that the artificial intelligence systems Facebook employs to root out such content frequently aren't up to the task, either, and that the company hasn't made it easy for its global users themselves to flag posts that violate the site's rules.

Those shortcomings, employees warned in the documents, could limit the company's ability to make good on its promise to block hate speech and other rule-breaking posts in places from Afghanistan to Yemen.

In a review posted to Facebook's internal message board last year regarding ways the company identifies abuses on its site, one employee reported "significant gaps" in certain countries at risk of real-world violence, especially Myanmar and Ethiopia.

The documents are among a cache of disclosures made to the U.S. Securities and Exchange Commission and Congress by Facebook whistleblower Frances Haugen, a former Facebook product manager who left the company in May. Reuters was among a group of news organizations able to view the documents, which include presentations, reports and posts shared on the company's internal message board. Their existence was first reported by The Wall Street Journal.

Facebook spokesperson Mavis Jones said in a statement that the company has native speakers worldwide reviewing content in more than 70 languages, as well as experts in humanitarian and human rights issues. She said these teams are working to stop abuse on Facebook's platform in places where there is a heightened risk of conflict and violence.

"We know these challenges are real and we are proud of the work we've done to date," Jones said.

Still, the cache of internal Facebook documents offers detailed snapshots of how employees in recent years have sounded alarms about problems with the company's tools - both human and technological - aimed at rooting out or blocking speech that violated its own standards. The material expands upon Reuters' previous reporting on Myanmar and other countries, where the world's largest social network has failed repeatedly to protect users from problems on its own platform and has struggled to monitor content across languages.

Among the weaknesses cited was a lack of screening algorithms for languages used in some of the countries Facebook has deemed most "at-risk" for potential real-world harm and violence stemming from abuses on its site.

The company designates countries "at-risk" based on variables including unrest, ethnic violence, the number of users and existing laws, two former staffers told Reuters. The system aims to steer resources to places where abuses on its site could have the most severe impact, the people said.

Facebook reviews and prioritizes these countries every six months in line with United Nations guidelines aimed at helping companies prevent and remedy human rights abuses in their business operations, spokesperson Jones said.

In 2018, United Nations experts investigating a brutal campaign of killings and expulsions against Myanmar's Rohingya Muslim minority said Facebook was widely used to spread hate speech toward them.