Apple’s child abuse detection software may be vulnerable to attack


Close up of digital data and binary code in network.

Apple plans to detect images of child sexual abuse on some of its devices

Yuichiro Chino/Getty Images

Apple’s soon-to-be-launched algorithm to detect images of child sexual abuse on iPhones and iPads could incorrectly flag people as being in possession of illegal images, researchers warn.

NeuralHash will be launched in the US with an update to iOS and iPadOS later this year. The tool will compare a hash – a unique string of characters created by an algorithm – of every image uploaded to the cloud against a database of hashes for known images …
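The matching flow described above can be sketched in a few lines. Note this is an illustrative sketch only: Apple's NeuralHash is a perceptual hash derived from image content, and its exact algorithm is not public. The code below uses a cryptographic hash (SHA-256) of raw bytes purely to show the compare-against-a-database step; the database contents and function names are hypothetical.

```python
import hashlib

# Hypothetical database of hex-digest hashes for known abuse images.
# In the real system this would be supplied by child-safety organisations.
known_hashes = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def hash_image(image_bytes: bytes) -> str:
    """Return a hex digest standing in for the image's hash."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    """Flag an upload if its hash appears in the known-image database."""
    return hash_image(image_bytes) in known_hashes

print(is_flagged(b"known-image-bytes"))    # matches the database entry
print(is_flagged(b"holiday-photo-bytes"))  # no match
```

Unlike SHA-256, a perceptual hash is designed so that visually similar images produce the same or similar hashes, which is precisely why researchers worry that adversarially crafted images could collide with database entries and incorrectly flag innocent users.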
