Apple Delays Launch Of Child Abuse Detection Tools
The iPhone maker announced the new feature last month, but privacy campaigners have raised concerns over the detection tools.

Martyn Landi, PA Technology Correspondent

Apple is to delay the launch of new tools designed to detect child sexual abuse material (CSAM), saying it wants to take more time to “make improvements” after privacy concerns were raised.

The iPhone maker had announced plans to introduce new systems which would detect child sexual abuse imagery when someone tried to upload it to iCloud, and report it to authorities.

Apple said the process would be done securely and would not involve regularly scanning a user’s camera roll. However, privacy campaigners raised concerns over the plans, with some suggesting the technology could be hijacked by authoritarian governments to look for other types of imagery, something Apple said it would not allow.

But the tech giant has now confirmed it is delaying the rollout following feedback from a number of groups.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material,” the company said in a statement.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

The system works by looking for image matches based on a database of “hashes” – a type of digital fingerprint – of known CSAM images provided by child safety organisations.

This process takes place securely on a device when a user attempts to upload images to their iCloud photo library.
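As a rough illustration of the matching step described above, the sketch below fingerprints an image and checks it against a set of known hashes. It is not Apple’s implementation: the real system uses an on-device perceptual hash (NeuralHash) with additional cryptographic protections, whereas this example substitutes a plain SHA-256 lookup, and the function names and sample data are hypothetical.

import hashlib

# Hedged sketch only: SHA-256 stands in for Apple's perceptual hash purely
# to illustrate the matching step the article describes — fingerprint the
# image, then compare the fingerprint against a database of known hashes.

def image_fingerprint(image_bytes: bytes) -> str:
    """Return a digital fingerprint ("hash") of the image contents."""
    return hashlib.sha256(image_bytes).hexdigest()

# Database of fingerprints of known images, as would be supplied by
# child safety organisations (placeholder data for illustration only).
known_hashes = {image_fingerprint(b"known-image-bytes")}

def check_before_upload(image_bytes: bytes) -> bool:
    """Flag an image if its fingerprint matches the known database."""
    return image_fingerprint(image_bytes) in known_hashes

print(check_before_upload(b"known-image-bytes"))    # True  (match flagged)
print(check_before_upload(b"holiday-photo-bytes"))  # False (no match)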

It was to be joined by another new feature in the Messages app, which would warn children and their parents on linked family accounts when sexually explicit photos were sent or received, blocking the images from view and showing on-screen alerts, as well as new guidance in Siri and Search pointing users to helpful resources when they search for topics related to CSAM.

Apple said the two features are not the same and do not use the same technology, adding that it will “never” gain access to communications as a result of the improvements to Messages.

Andy Burrows, head of child safety online policy at children’s charity the NSPCC, said the delay was “incredibly disappointing”.

“Apple were on track to roll out really significant technological solutions that would undeniably make a big difference in keeping children safe from abuse online and could have set an industry standard,” he said.

“They sought to adopt a proportionate approach that scanned for child abuse images in a privacy-preserving way, and that balanced user safety and privacy.

“We hope Apple will consider standing their ground instead of delaying important child protection measures in the face of criticism.”
