The company now plans to take a few more months to collect input and make improvements before releasing the features, which have drawn fire from privacy advocates.

In a surprise Friday announcement, Apple said it will take more time to improve its controversial child safety tools before it introduces them.

More feedback sought

The company says it plans to gather more feedback and improve the system, which has three key components: iCloud Photos scanning for CSAM, on-device scanning in Messages to protect children, and search suggestions designed to keep children safe.

Ever since Apple announced the tools, it has faced a barrage of criticism from concerned individuals and rights groups across the world. The argument the company struggled most to address was the potential for repressive governments to force Apple to monitor for more than CSAM.

Who watches the watchmen?

Edward Snowden, who faces US charges over leaking intelligence documents and is now a privacy advocate, warned on Twitter: “Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.”

Critics said these tools could be exploited or extended to support censorship of ideas or otherwise threaten free thought. Apple’s response — that it would not extend the system — was seen as a little naïve. The company said:

“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.”

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content.
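A deliberately simplified sketch makes the point. The snippet below is not Apple's NeuralHash or its private matching protocol, and the file name and helper functions are purely illustrative; it only shows that in a hash-matching design the scanning code stays fixed while the list of target hashes alone decides what gets flagged.

```python
# Illustrative sketch only: NOT Apple's NeuralHash or its on-device matching
# protocol. It shows why critics focus on the target database; the scanner
# itself never changes, only the hashes it is handed.
import hashlib
from pathlib import Path

def content_hash(path: Path) -> str:
    """Stand-in for a perceptual hash; here a plain SHA-256 of the file bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def load_hash_database(path: Path) -> set[str]:
    """One hex-encoded hash per line; this file alone defines the scanner's scope."""
    return {line.strip() for line in path.read_text().splitlines() if line.strip()}

def scan_library(photo_dir: Path, target_hashes: set[str]) -> list[Path]:
    """Flag every photo whose hash appears in the supplied target set."""
    return [p for p in photo_dir.glob("*.jpg") if content_hash(p) in target_hashes]

if __name__ == "__main__":
    # Swapping in a broader hash list widens what gets reported without
    # touching a single line of the scanning logic.
    targets = load_hash_database(Path("known_csam_hashes.txt"))  # hypothetical file
    flagged = scan_library(Path.home() / "Pictures", targets)
    print(f"{len(flagged)} photos matched the target database")
```

Seen this way, widening detection is a data change rather than an engineering project, which is precisely the concern critics raised.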

Nikoleta Yanakieva, Editor at DevStyleR International