Apple announced on Friday that it is postponing the launch of its controversial new tools for detecting child sexual abuse images, which some have accused of undermining the privacy of its devices and services.
The Silicon Valley giant said last month that iPhones and iPads would soon start detecting images of child sexual abuse and reporting them when uploaded to its online storage in the United States.
However, digital rights organizations quickly realized that the tweaks to Apple’s operating systems created a potential “back door” in gadgets that governments or other groups could exploit.
Apple cited feedback from customers, stakeholders, researchers, and others in announcing the delay.
“We decided to take additional time in the coming months to gather input and make improvements before we release these critically important child safety features,” the company said in a statement.
The new technology enables the software that powers Apple's mobile devices to match photos on a user's phone against a database of known child sexual abuse images provided by safety organizations, and then flag those images as they are uploaded to Apple's iCloud online storage, the company said.
If and when the system went into operation, it would be "powered by cryptographic technology" to determine "if there is a match without revealing the result" unless the image contains depictions of child sexual abuse.
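The matching step described above can be illustrated in miniature: an image is reduced to a digest and checked against a set of known digests before upload. The sketch below uses an ordinary cryptographic hash purely for illustration; Apple's actual design relies on an undisclosed perceptual-hashing scheme and a private matching protocol, and the database contents and function names here are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical set of digests of known abuse images supplied by safety
# organizations. A real deployment would use perceptual hashes and a
# cryptographic matching protocol, not plain SHA-256 of file bytes.
KNOWN_IMAGE_DIGESTS = {
    "placeholder_digest_provided_by_safety_organization",
}


def digest_image(path: Path) -> str:
    """Return a hex digest of the image file's raw bytes (illustrative only)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def should_flag_before_upload(path: Path) -> bool:
    """Flag an image only if its digest matches a known entry in the database."""
    return digest_image(path) in KNOWN_IMAGE_DIGESTS
```

In this simplified form, a non-matching photo reveals nothing beyond its digest; the privacy dispute centers on whether such on-device scanning, however constrained, could later be repurposed for other kinds of content.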