Meta's latest cuts include security & moderation layoffs, as well as nixing a fact-checking project

Employees on Meta's site security, privacy, and integrity teams fell on the chopping block, and a fact-checking tool that was in testing earlier this year was dissolved.

Image via Meta

Facebook and Meta have been targets of major scrutiny and criticism regarding their moderation and handling of hate speech, especially through Donald Trump's presidency from 2017 to 2021. However, it looks like the cost cuts Meta has been making to staff and programs throughout 2023 have put security and moderation on the chopping block again. Not only are teams related to security, privacy, and integrity facing layoffs, but a fact-checking tool project was completely dissolved in Meta's latest cuts.

Meta’s latest cuts to staff were reported by Reuters, as part of plans announced earlier this year to cut about 10,000 employees in what Mark Zuckerberg is calling the “Year of Efficiency” for the company. Meanwhile, CNBC reported on the cutting of a fact-checking project that was supposed to provide a tool through which third-party organizations such as the Associated Press and Reuters could provide further context and accurate information on questionable articles posted on Facebook.

Mark Zuckerberg
Mark Zuckerberg has called 2023 "The Year of Efficiency" for Meta, which apparently includes cutting safety and moderation staff in relation to Facebook.
Source: Kenzo Tribouillard/AFP/Getty Images

Reportedly, the fact-checking tool had received initial buy-in from executives and had even reached a testing phase earlier in 2023. That said, when the latest round of cuts was announced within Meta, the project itself was completely terminated, according to anonymous sources close to it. A Meta spokesperson could not answer questions about the project or the apparent cutting of teams related to the moderation, security, and integrity of Facebook, but maintained that the company is still investing in security and content moderation.

“We remain focused on advancing our industry-leading integrity efforts and continue to invest in teams and technologies to protect our community,” Meta said in a statement.

Some may recall in 2021 when whistleblower Frances Haugen released the Facebook Papers, revealing that Meta had made several decisions to compromise user safety and moderation of hate speech and misinformation, in favor of pushing for greater profits and user activity on the platform. Even in 2023, Meta has been in hot water with the FTC for monetizing the data from Facebook users under the age of 18, which the commission is now seeking to bar Facebook from doing.

With these cuts to safety and moderation teams, it's concerning to consider what effect they will have, especially as we enter an election year in 2024 (election cycles being a major catalyst of the hate speech that spurred the Facebook Papers). Stay tuned as we continue to follow this story for further updates.

Senior News Editor

TJ Denzer is a player and writer with a passion for games that has dominated a lifetime. He found his way to the Shacknews roster in late 2019 and has since worked his way to Senior News Editor. Between news coverage, he also contributes notably to livestream projects like the indie game-focused Indie-licious, the Shacknews Stimulus Games, and the Shacknews Dump. You can reach him at tj.denzer@shacknews.com and also find him on Twitter @JohnnyChugs.
