Facebook admitted that its censors made the wrong call on nearly half of a set of 49 situations addressed in a new study.
ProPublica reported that it reviewed the company's handling of violent images and offensive statements, and in roughly half the cases even the company admitted it should have done better.
For example, it declined to take down a post with an image reflecting graphic violence – the body of a man on top of ground that was soaked in blood, with a message stating, “the only good Muslim is a ——- dead one.”
But the single line "Death to the Muslims," which had no image at all, was taken down.
“We asked Facebook to explain its decisions on a sample of 49 items, sent in by people who maintained that content reviewers had erred, mostly by leaving hate speech up, or in a few instances by deleting legitimate expression,” ProPublica’s report said.
“In 22 cases, Facebook said its reviewers had made a mistake. In 19, it defended the rulings. In six cases, Facebook said the content did violate its rules but its reviewers had not actually judged it one way or the other because users had not flagged it correctly, or the author had deleted it. In the other two cases, it said it didn’t have enough information to respond.”
Inconsistencies are common, ProPublica said, having “found in an analysis of more than 900 posts submitted to us as part of a crowd-sourced investigation into how the world’s largest social network implements its hate-speech rules.”
“Its content reviewers often make different calls on items with similar content, and don’t always abide by the company’s complex guidelines. Even when they do follow the rules, racist or sexist language may survive scrutiny because it is not sufficiently derogatory or violent to meet Facebook’s definition of hate speech.”
“We’re sorry for the mistakes we have made — they do not reflect the community we want to help build,” Facebook Vice President Justin Osofsky told ProPublica in a statement. “We must do better.”
He said the company will expand its safety and security teams in 2018.
The operation deletes some 66,000 offending passages each week, officials said.
“Our policies allow content that may be controversial and at times even distasteful, but it does not cross the line into hate speech,” he said. “This may include criticism of public figures, religions, professions, and political ideologies.”
But ProPublica said, “In several instances, Facebook ignored repeated requests by users to delete hateful content that violated its guidelines. At least a dozen people, as well as the Anti-Defamation League in 2012, lodged protests with Facebook to no avail about a page called Jewish Ritual Murder. However, after ProPublica asked Facebook about the page, it was taken down.”
The guidelines permit bias and sarcasm, but not explicitly hostile or demeaning language.
“How Facebook handles such speech is important because hate groups use the world’s largest social network to attract followers and organize demonstrations,” the report said. “After the white supremacist rally in Charlottesville, Virginia, this summer, CEO Mark Zuckerberg pledged to step up monitoring of posts celebrating ‘hate crimes or acts of terrorism.’
“Yet some activists for civil rights and women’s rights end up in ‘Facebook jail,’ while pages run by groups listed as hateful by the Southern Poverty Law Center are decked out with verification checkmarks and donation buttons,” the report said.
The SPLC itself has also been linked to domestic terrorism: one of its followers used its online resources to identify the Family Research Council in Washington, D.C., as a target for a potential mass homicide attack.
“Facebook defines seven types of ‘attacks’ that it considers hate speech: calls for exclusion, calls for violence, calls for segregation, degrading generalization, dismissing, cursing and slurs,” the report said. “For users who want to contest Facebook’s rulings, the company offers little recourse. Users can provide feedback on decisions they don’t like, but there is no formal appeals process.”