Apple’s new child safety feature, CSAM scanning, will roll out on a per-country basis

The head of WhatsApp has also lashed out at Apple's new feature, calling it a "surveillance system"

Apple has long built its reputation on privacy, but some of its recent announcements have made people think the company is loosening its grip on that commitment. Apple recently announced a child safety feature that identifies illegal content stored in a user’s iCloud account. Some critics argue that this crackdown on Child Sexual Abuse Material (CSAM) could be misused by governments. Apple has, however, clarified that this won’t happen, as it will roll out CSAM detection on a country-by-country basis, evaluating each market individually.

Many critics and other companies have labelled the feature a surveillance method because it seemingly violates a person’s privacy. The concern centres on how CSAM is detected in images: the system relies on AI and machine learning rather than manual scanning. While Apple sees this as a protective feature, critics argue it could be turned to other uses. They point out that the detection system can be fed data that, accidentally or intentionally, makes it flag content unrelated to CSAM, and some governments could misuse that to track down opposition voices, for example.
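To make the critics' argument concrete, the short Python sketch below illustrates the general idea of matching images against a database of known fingerprints. It is purely illustrative and not Apple's implementation: Apple's system reportedly uses a perceptual "NeuralHash" with cryptographic protections, whereas this sketch uses a plain SHA-256 hash, and the database name and values here are hypothetical.

# Conceptual sketch only: matching images against a database of known fingerprints.
# This is NOT Apple's NeuralHash system; real deployments use perceptual hashes,
# blinded databases, and match thresholds before anything is reported.
import hashlib

# Hypothetical database of fingerprints for known illegal images,
# as would be supplied by a child-safety organisation.
known_csam_fingerprints = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a fingerprint for an image (SHA-256 here, for simplicity)."""
    return hashlib.sha256(image_bytes).hexdigest()

def flag_if_known(image_bytes: bytes) -> bool:
    """Flag the image only if its fingerprint is already in the database."""
    return fingerprint(image_bytes) in known_csam_fingerprints

# The critics' concern in a nutshell: code like this only knows what is in
# known_csam_fingerprints. Whoever controls that database controls what gets
# flagged, which is why the per-country rollout is under such scrutiny.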

Apple is also planning to extend the feature to iPhones and iPads, which has caused an even bigger commotion. The company has clarified that it will consider the laws of each market before rolling the feature out there, starting with the US. Countries such as China and Russia have strict censorship laws, which Apple would weigh before any launch, so users in those markets are unlikely to be much affected by these privacy issues. That is not enough for privacy advocates, however, who see the feature as a backdoor open to abuse.