Apple delays iPhone image scan & child exploitation security system beyond 2021

After major criticism regarding customer privacy, Apple has delayed a system that would allow it to scan iPhone image libraries for child exploitation.


Apple caught a lot of unwanted attention recently when it announced an upcoming system that would allow it to scan images and determine whether they depicted child sexual abuse and exploitation. Unsurprisingly, for many, this raised privacy concerns. First off, how would the system recognize what counts as child exploitation? How accurate would it be? Would it flag anything it shouldn't? Second, many were alarmed at the idea of Apple accessing their photo libraries without permission. Due to these major concerns, Apple has pushed the security system back to at least 2022.

Apple updated its plans for the release of this security system in a press release on September 3, 2021. Originally announced on August 5, 2021, the system was designed to scan iPhone image libraries automatically and determine whether or not images included instances of child sexual abuse and exploitation. The announcement of this plan brought on a wave of criticism throughout August about invasion of privacy and quality of the system. With that in mind, Apple has delayed the launch of said system to improve it “over the coming months.”

Many saw Apple's upcoming image scan and child exploitation detection system as an invasive breach of customer privacy, forcing Apple to delay the system.

At first glance, Apple's motives may seem admirable, but there is a lot to consider in the launch of such a system, as even Apple has admitted.

“Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material,” Apple wrote in its updated statement. “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

It seems Apple will still eventually go ahead with this system despite the criticism. The delay is meant to gather feedback and ensure the feature works appropriately.

Apple has been caught up in a lot of concerning conversation as of late. The launch of the #AppleToo website promised to surface stories of harassment and discrimination within the company stemming from the failures of Apple HR. Meanwhile, Apple was also alleged to have shut down an in-company Slack channel about pay equality, citing inconsistent enforcement of company Slack rules as the grounds.

With these problems buzzing around the company right now, it's still quite concerning that Apple is developing such a seemingly invasive system, even if the goal sounds admirable. At least for now, Apple's image scan security features have been pushed back through the remainder of 2021. Stay tuned as we continue to follow this story and further Apple news as it becomes available.

Senior News Editor

TJ Denzer is a player and writer with a passion for games that has dominated a lifetime. He found his way to the Shacknews roster in late 2019 and has worked his way to Senior News Editor since. Between news coverage, he also notably aids in livestream projects like the indie game-focused Indie-licious, the Shacknews Stimulus Games, and the Shacknews Dump. You can reach him at and also find him on Twitter @JohnnyChugs.
