Facebook only moderates 3-5% of hateful content, claims to be 'best in the world at it'

Published by Donovan Erskine

Facebook has built quite the reputation as a toxic platform filled to the brim with misinformation and hateful content, and those concerns are only made worse by the company constantly finding itself at the center of one scandal after another. Most recently, Frances Haugen, a former Facebook employee, shared that she had gotten hold of internal company documents confirming Facebook's awareness of a multitude of issues with the platform, including the fact that it moderates only a very small percentage of the hateful content posted there.

It was roughly a month ago that the Facebook whistleblower filed legal papers with the SEC against the social media company. Now, that whistleblower has been revealed as Frances Haugen, a former product manager at the company. Haugen recently sat down for an interview with 60 Minutes on CBS, where she shared that she had obtained internal company documents before her exit from Facebook.

“We estimate that we may action as little as 3-5% of hate…and 0.6% of V&I [violence and incitement] on Facebook...” the documents read. Mark Zuckerberg and company have long touted Facebook as a platform that cracks down on hateful content that violates its terms of service, but these documents suggest otherwise. There were already strong concerns about how much inappropriate content was actually being moderated on Facebook, but it's all the more damning to hear it from the company itself.

It remains to be seen what the actual ramifications of the newly revealed information will be, but the situation surrounding Facebook likely has a long way to go. Haugen also said that she had filed several complaints with the SEC, and that Facebook relaxed its misinformation filters after the 2020 presidential election.