Apple capable of client-side photo hashing to pick out child abuse images

Apple has not officially commented on the use of this technology


Apple promotes the iPhone as private and secure, and that is largely true. But according to a recent tweet by Matthew Green, a cryptographer who teaches at Johns Hopkins University, the company could soon roll out a photo hashing feature designed to detect images of child abuse.

How would this work? Apple would not upload users' photo libraries to a server to check them against a database. Instead, it would use client-side photo hashing: the device downloads a set of "fingerprints" of known abuse imagery, and those fingerprints are compared against the photos in the camera roll, all on the device itself.
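To make the fingerprint comparison concrete, here is a minimal sketch of the general technique in Python. Everything in it is hypothetical: the tiny 4x4 "photo", the `average_hash` function, and the matching threshold are all illustrative stand-ins. Real systems use far more robust perceptual hashes, but the basic idea is the same: hash each local photo, then check whether the hash is close to any downloaded fingerprint.

```python
# Illustrative sketch of client-side perceptual-hash matching.
# All names and data here are hypothetical; production systems use
# much more sophisticated hashes resistant to cropping and re-encoding.

def average_hash(pixels):
    """Simple 'average hash': one bit per pixel, set to 1 if the
    pixel is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming_distance(a, b):
    """Number of bit positions where the two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

def matches_fingerprint(photo_hash, fingerprints, threshold=2):
    """Flag the photo if its hash is within `threshold` bits of any
    known fingerprint (perceptual hashes tolerate small edits)."""
    return any(hamming_distance(photo_hash, f) <= threshold
               for f in fingerprints)

# A tiny 4x4 grayscale "photo" (brightness values 0-255)
photo = [[10, 20, 200, 210],
         [15, 25, 205, 215],
         [12, 22, 202, 212],
         [11, 21, 201, 211]]

# Pretend this fingerprint set was downloaded to the device
fingerprints = {average_hash(photo)}

print(matches_fingerprint(average_hash(photo), fingerprints))  # True
```

The point of using a perceptual hash rather than a cryptographic one is that a slightly edited copy of a known image still lands within a few bits of the original fingerprint, so near-duplicates can be caught without the device ever uploading the photo itself.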

If the comparison finds a match, the photo would be flagged for review by a human moderator. Apple has not yet made an official announcement, so the details of how the system would work, its implications, and how users will respond remain unknown.

Green also pointed out complications that could arise, such as government control over the "fingerprints": a government that controlled the database could use the same technology to suppress activism and target political opponents.

It is worth noting that iCloud Photos is not end-to-end encrypted, so Apple can already decrypt photos stored on its servers; for cloud-synced photos, on-device scanning exposes little that Apple could not already access. If the technology is used only for good purposes, it need not harm society, but once it exists, matters will become far more complicated.