Facebook claims that censoring a Bible verse from a Christian writer’s page was a mistake, and it has since apologized.
But to whom?
Julio Severo, an activist whose Facebook pages have been attacked at least three times in just the last six months by Facebook itself, says he has not heard anything from the social-media giant.
The verse in question, Leviticus 18:22, is the one in which God instructs, “Do not lie with a man as one lies with a woman; that is detestable.”
Some translations use the word “abomination.”
The verse is a favorite target for actor Ian McKellen, who has admitted that when he stays in hotels and motels, he takes out the Bibles from nightstands and “rips out pages that contain a certain passage from Leviticus.”
Facebook had sent a message to Severo that said: “We Removed Something You Posted. It looks like something you posted doesn’t follow our Community Standards. We remove posts that attack people based on their race, ethnicity, national origin, religious affiliation, sexual orientation, gender or disability. Levítico 18.22: Não de deitarás com homem, como se fosse mulher; abominação é.” The Portuguese text is the Leviticus 18:22 verse: “You shall not lie with a man as with a woman; it is an abomination.”
WND’s initial request to Facebook for comment brought only an automated response.
Days later, Severo’s page was active again, and a Facebook statement to WND, released on condition that it be considered “on background,” said the post “was mistakenly removed by a member of our review team after we received reports that content in the post violated our Community Standards.”
“As soon as we were notified of the problem, we began to investigate and restored the content as soon as we were able to identify the mistake,” Facebook said. “The content has been restored since it didn’t violate our standards. We’ve informed Mr. Severo of the restoration and apologized for the error.”
But Severo told WND there was no apology. And he said there’s a longtime pattern of Facebook attacking him.
He said he was put on a 30-day suspension on Jan. 28.
It was while he was on suspension that the Bible verse issue arose.
And last summer, Facebook gave him another 30-day suspension for posting an article about homosexuality in Brazil.
At that time the company also admitted the suspension was a mistake but still left it in place, Severo said.
The linked article reported that neighbors were fined $4,500 for calling a homosexual a name, but Severo said his posting “made it abundantly clear that I oppose name-calling and foul language.”
The company appears to be targeting Christians, Severo said, noting “pictures of the communist criminal Che Guevara, who murdered people, including gay men, remain unshakingly throughout Facebook’s social network, as if his filthy image did not deserve banishment for his crimes.”
WND reported in December that an independent review of Facebook’s practices regarding violent images and offensive statements found that in roughly half of the cases examined, the company admitted it should have done better.
For example, Facebook allowed an image of the body of a man soaked in blood with a message stating “the only good Muslim is a ——- dead one.”
But Facebook removed the single line “Death to the Muslims,” which did not have an accompanying image.
“We asked Facebook to explain its decisions on a sample of 49 items, sent in by people who maintained that content reviewers had erred, mostly by leaving hate speech up, or in a few instances by deleting legitimate expression,” ProPublica’s report said.
“In 22 cases, Facebook said its reviewers had made a mistake. In 19, it defended the rulings. In six cases, Facebook said the content did violate its rules but its reviewers had not actually judged it one way or the other because users had not flagged it correctly, or the author had deleted it. In the other two cases, it said it didn’t have enough information to respond.”
Inconsistencies are common, ProPublica said, based on what it “found in an analysis of more than 900 posts submitted to us as part of a crowd-sourced investigation into how the world’s largest social network implements its hate-speech rules.”
“Its content reviewers often make different calls on items with similar content, and don’t always abide by the company’s complex guidelines. Even when they do follow the rules, racist or sexist language may survive scrutiny because it is not sufficiently derogatory or violent to meet Facebook’s definition of hate speech.”
In the most recent disagreement, Facebook declined to respond at all to WND’s question about whether statements in religions other than Christianity are also censored.