Apple to search iCloud uploads for child sexual abuse content with ‘NeuralHash’
Among the tools is a new technology that will allow Apple to detect known child sexual abuse images stored in users’ iCloud Photos accounts and report them to law enforcement. The detection process will not involve indiscriminate manual inspection of users’ iCloud content. Instead, it will use a new tool called “NeuralHash”, which compares images against a database of hashes – digital fingerprints that allow a piece of content to be identified but not reconstructed – representing known abuse images supplied by child safety organisations; the hashes will be stored locally on the device. Other major tech companies, including Facebook, Microsoft and Google, already use the same database to detect child sex abuse content on their own platforms. The tool also allows edited images that are similar to the originals to be identified.
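For readers curious how hash-based matching of this kind works in general, the sketch below illustrates the underlying idea with a simple “average hash” and a Hamming-distance comparison: lightly edited copies of an image produce hashes that differ by only a few bits from the original, so they can still be flagged against a list of known hashes. This is not Apple’s NeuralHash, which is derived from a neural network and matched on-device through a cryptographic protocol; the function and variable names here (average_hash, known_hashes, matches_known) are illustrative only.

```python
# Minimal sketch of perceptual-hash matching (illustrative, not Apple's NeuralHash).
from PIL import Image  # requires Pillow


def average_hash(path: str, hash_size: int = 8) -> int:
    """Compute a 64-bit average hash: shrink, grayscale, threshold at the mean brightness."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size), Image.LANCZOS)
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel >= mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical list of hashes of known images. In the real system the list is
# supplied in pre-hashed form by child safety organisations; the images
# themselves are never distributed.
known_hashes = {0x8F3C2A910B7E44D1}


def matches_known(path: str, threshold: int = 5) -> bool:
    """Flag an image whose hash is within `threshold` bits of any known hash."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)
```

Because the hash is computed from coarse brightness structure rather than raw bytes, cropping, recompression or recolouring typically changes only a handful of bits, which is why near-duplicates of a known image can still be detected while unrelated photos produce entirely different hashes.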