Apple said Friday it would take more time to gather feedback and improve proposed child safety features after the system was criticized for privacy and other reasons both inside and outside the company.
Apple’s plan, announced last month, to check US customers’ phones and computers for images of child sexual abuse sparked backlash from a variety of rights groups around the world, and employees also criticized the plan internally.
Critics argued that the feature could be exploited by repressive governments seeking other material for censorship or arrests, and that it would be impossible for outside researchers to verify whether Apple was checking only a small set of on-device content.
Apple countered that it would allow security researchers to verify its claims, but the company said Friday it would take more time to make changes to the system.
“Based on feedback from customers, stakeholders, researchers and others, we have decided to take additional time in the coming months to gather input and make improvements before we release these critically important child safety features,” the company said in a statement on Friday.
Matthew Green, a cybersecurity researcher at Johns Hopkins University who had criticized Apple’s plan, called the delay “promising.”
Green said on Twitter that Apple “should be clear about why you’re scanning and what you’re scanning. Going from scanning nothing (except email attachments) to scanning everyone’s private photo library is an enormous delta. You need to justify escalations like this.”
Apple had defended the plan for weeks, releasing a series of statements and documents intended to show that the risk of false positives was low.
The features were set to roll out in the United States later this year through software updates for iPhones, iPads, and Macs.