New Features Introduced
Apple is simultaneously introducing three new measures that will scan messages and images for harmful content. The new features are expected to arrive "later this year," and we believe Apple will ship them in its upcoming updates to iOS 15, iPadOS 15, macOS, and watchOS 8.
At the moment, Apple intends to introduce these features in the US only.
Looking at each of the three features in detail, the first will use Apple's machine learning algorithm to scan children's messages and photos for content that might be considered sexually explicit. According to Apple, the scanning will be performed entirely on the device, and the messages will not be shared with anyone, not even Apple.
If the algorithm does find sexually explicit content (images or messages), that content will be blurred and the child will be warned about its sensitive nature. The child will also be given the option to block the contact. If the child still chooses to view the content, they will be warned that their parents will be notified, and that notification will then be sent.
The same applies to content (images, messages) sent by children, in which case the child will be warned about the sensitive nature of the content. Parents will also be able to set up alert notifications for any sensitive image sent by their children.
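Apple has not published implementation details, but the flow described above can be sketched roughly as follows. Everything here is a hypothetical stand-in: the classifier score, the confidence threshold, and the notification logic are assumptions for illustration, not Apple's actual design:

```python
from dataclasses import dataclass

# Hypothetical confidence cutoff for the on-device classifier.
EXPLICIT_THRESHOLD = 0.9

@dataclass
class ScanResult:
    blurred: bool
    child_warned: bool
    parents_notified: bool

def handle_incoming_image(explicit_score: float,
                          child_views_anyway: bool,
                          parent_alerts_enabled: bool) -> ScanResult:
    """Sketch of the described flow: blur flagged content, warn the
    child, and notify parents only if the child proceeds to view it
    and the parents have opted in to alerts."""
    if explicit_score < EXPLICIT_THRESHOLD:
        # Nothing flagged: deliver the image normally.
        return ScanResult(blurred=False, child_warned=False,
                          parents_notified=False)
    # Content flagged: blur it and warn the child before display.
    notified = child_views_anyway and parent_alerts_enabled
    return ScanResult(blurred=True, child_warned=True,
                      parents_notified=notified)
```

Note that in this sketch, as in Apple's description, the parent notification is triggered only after the child's explicit choice to view the flagged content.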
While the first feature seems somewhat helpful in limiting children's exposure to sexually explicit content, the second feature is more of a concern for privacy advocates. It will scan users' images for possible Child Sexual Abuse Material (CSAM), with access to the iCloud Photos library. While users and privacy advocates are genuinely concerned about the feature, Apple claims that the scanning will be performed with user privacy in mind.
Just like the first feature, scanning for the second will first be completed on the device itself. The algorithm employed by iOS will check the device's photos against a database of known abusive images provided by child safety organizations. If the algorithm identifies photos that match those in the database, the matches will be revealed to Apple, which will then manually review the content to confirm them. If the company verifies child-abuse content in its manual review, it will disable the account and report it to the authorities.
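The match-then-review pipeline described above can be sketched in simplified form. This is a loose illustration only: the hash function below is a cryptographic placeholder (real systems use perceptual hashes so that near-duplicates also match, and Apple's actual scheme is more elaborate), and the match threshold is invented:

```python
import hashlib

# Placeholder database of fingerprints of known abusive images,
# standing in for the database supplied by child safety organizations.
KNOWN_HASHES = {hashlib.sha256(b"known-abusive-image").hexdigest()}

# Hypothetical number of matches required before the account is
# escalated for manual human review.
MATCH_THRESHOLD = 2

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for an on-device image fingerprint. A real system
    would use a perceptual hash, not an exact cryptographic one."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_escalate(images: list[bytes]) -> bool:
    """Return True if enough photos match the database that the
    account should be flagged for manual review."""
    matches = sum(fingerprint(img) in KNOWN_HASHES for img in images)
    return matches >= MATCH_THRESHOLD
```

The key property this models is that matching happens on the device and only accounts crossing a threshold of matches are surfaced for human confirmation; no classifier judges the images themselves.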
According to Apple, both of these features are designed with users' privacy in mind. To support that claim, the company also published technical assessments carried out by independent academics.
However, even with all the assurances from Apple, the features have caused much controversy. Matthew Green, a cryptographer, revealed on Wednesday that the company had been working on these features for a long time, and that implementing them will mark Apple's departure from its long-held commitment to user privacy.
According to Professor Green of Johns Hopkins, while the new features may be better than traditional tools, they are still tools for mass surveillance, and they could be used to search for any picture, not just the child-abuse images for which they are being introduced. He also warned that there will be no way to know whether the system is being abused.
He further said that implementing such features amounts to asking users to trust Apple blindly that it will only use the system to scan genuinely abusive images, with no way to verify that claim.
Alan Woodward is another computing expert who echoed Professor Green's concerns. According to Woodward, the new features are a double-edged sword. He likened them to a "road to hell paved with good intentions" and stressed that Apple should hold public discussions and prove the system before launching it.
Lastly, the third new feature, which is also the most straightforward of the three, gives Siri more information to help parents and children deal with child abuse. The new system will enable Siri to respond to more specific questions about child abuse, so that parents and children can learn their legal rights and the protections available to them.