(Columbia Journalism Review) In December 2016, after a firestorm of criticism over online disinformation during the presidential election, Facebook announced its Third Party Fact-Checking project. Independent organizations would debunk false news stories, and Facebook would make the findings visible to users while down-ranking the relevant posts in its News Feed. The project now includes more than 50 partner organizations around the world, operating in 42 languages, yet how effective the program is at stopping the spread of misinformation remains very much an open question.
Recently, Full Fact, a UK-based non-profit partner in Facebook's project, published an in-depth report on the first six months of its involvement in the program. Overall, the group says, third-party fact-checking is worthwhile, but the report levels a number of criticisms at the way the project works. For example, Full Fact argues that the way Facebook rates misinformation needs to change, because the terminology and categories it applies aren't granular enough to be useful. The report also says Facebook has so far failed to flag suspect content and respond to fact checks quickly enough.