Apple said it will launch new software later this year that will analyze photos stored in users’ iCloud Photos accounts for known child sexual abuse imagery and report any findings to the relevant authorities.
As part of the new child-safety measures, the company also announced a feature that will analyze photos sent to or received by children in the Messages app to determine whether they are sexually explicit. Apple is also adding features to its Siri digital voice assistant to intervene when users search for child abuse material. The Cupertino, California-based tech giant previewed the three new features on Thursday and said they will roll out later in 2021.
If the number of known child sexual abuse images detected in a user’s account crosses a threshold, the company will manually review those instances and report them to the National Center for Missing and Exploited Children (NCMEC), which works with law enforcement agencies. Apple said the images will be analyzed on the iPhones and iPads of US users before they are uploaded to iCloud.
Apple said it will detect abusive images by comparing photos against a database of known child sexual abuse material (CSAM) provided by NCMEC. The company uses a technology called NeuralHash to analyze images and convert them into hashes, unique strings of numbers, and then uses cryptographic techniques to compare those hashes against the database. Apple said the process ensures it cannot learn anything about images that do not match the database.
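At a very high level, the matching described above can be pictured as looking up an image’s hash in a set of known-CSAM hashes and counting matches against a threshold. The sketch below is an illustration only: the `neuralHash(of:)` function, the in-memory hash set, and the `CSAMMatcher` type are assumptions made for clarity, and Apple’s actual system performs the comparison with on-device cryptography so the device never sees the database or the match results in the clear.

```swift
import Foundation

// Hypothetical perceptual-hash type; Apple's real NeuralHash output format is not public.
typealias PerceptualHash = String

// Placeholder for the on-device hashing step (an assumption for illustration).
func neuralHash(of imageData: Data) -> PerceptualHash {
    // A real perceptual hash is derived from image features, not raw bytes;
    // a plain hash is used here only to keep the sketch self-contained.
    return String(imageData.hashValue)
}

struct CSAMMatcher {
    let knownHashes: Set<PerceptualHash>   // database of known-CSAM hashes
    let reportingThreshold: Int            // matches required before human review

    // Returns true when the number of matching photos crosses the threshold.
    func exceedsThreshold(photos: [Data]) -> Bool {
        let matches = photos.filter { knownHashes.contains(neuralHash(of: $0)) }.count
        return matches >= reportingThreshold
    }
}
```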
Apple said its system has an error rate of “less than one in one trillion” per year and that it protects user privacy. “Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account,” the company said in a statement. “Even in these cases, Apple only learns about images that match known CSAM.”
The company said that anyone who believes their account has been mistakenly flagged can appeal.
In response to privacy concerns about the feature, Apple released a white paper detailing the technology, along with third-party analyses of the protocol from multiple researchers.
NCMEC President and CEO John Clark praised Apple’s new features.
Clark said in a statement provided by Apple: “These new safety measures have lifesaving potential for children who are enticed online and whose horrific images are circulated in child sexual abuse material.”
The Messages feature is opt-in, and parents can enable it on a device used by a child. The system checks photos the child receives, and photos the child is about to send, for sexually explicit material. If a child receives an explicit image, it is blurred, and the child must tap through an additional prompt to view it. If they do view the image, their parents are notified. Similarly, if a child tries to send an explicit image, they are warned and their parents are notified.
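The described flow for received images amounts to a simple sequence: classify the photo on the device, blur it if it is flagged, require explicit confirmation from the child, and notify the parent only after viewing. The code below is a minimal sketch of that sequence under stated assumptions; the classifier, the notification mechanism, and the function names (`isSexuallyExplicit`, `handleReceivedPhoto`) are hypothetical and are not Apple’s API.

```swift
import Foundation

// Hypothetical on-device classifier; Apple has not published its model or API.
func isSexuallyExplicit(_ imageData: Data) -> Bool {
    // Placeholder: a real implementation would run an on-device ML model.
    return false
}

enum MessagesPhotoAction {
    case showNormally
    case showBlurred(requiresConfirmation: Bool)
}

// Decide how to present a received photo on a child's device with the feature enabled.
func handleReceivedPhoto(_ imageData: Data, parentalControlsEnabled: Bool) -> MessagesPhotoAction {
    guard parentalControlsEnabled, isSexuallyExplicit(imageData) else {
        return .showNormally
    }
    // The image is blurred; the child must tap through a warning to view it.
    return .showBlurred(requiresConfirmation: true)
}

// Called only if the child confirms viewing a flagged image.
func childConfirmedViewing(notifyParent: () -> Void) {
    // Per the described behavior, the parent is notified after the child views the image.
    notifyParent()
}
```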
Apple said the Messages feature uses on-device analysis and that the company cannot see the content of messages. The feature applies to Apple’s iMessage service as well as other protocols, such as the multimedia messaging service (MMS).
The company also introduced two features related to Siri and search. The system will be able to answer questions about reporting child exploitation and abuse imagery and provide information on how users can submit reports. The second feature warns users who search for child abuse material. The company said the Messages and Siri features will come to iPhone, iPad, Mac, and Apple Watch.