Apple will begin automatically scanning photos uploaded to iCloud, the company's cloud storage service, for images of child sexual abuse this year.
The new security tool, which will arrive with iOS 15, is called NeuralHash and includes several safeguards to protect users' privacy, although some researchers and security experts are concerned about the implications of its implementation.
Initially, the checks will only be performed on US user accounts. The fully automated system will not have direct access to the content of the images users upload to iCloud, only to their "fingerprint" (hash).
This fingerprint, which is essentially a string of characters, is computed from the information present in the image and is then compared against a database of more than 200,000 fingerprints of images preselected by the National Center for Missing & Exploited Children (NCMEC), the US organization responsible for stopping the distribution of sexually explicit images of minors online.
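To illustrate the idea at a very high level, the sketch below (in Swift) shows what comparing a fingerprint against a list of known fingerprints looks like. It is not Apple's code: the function names, the placeholder hash values and the hashing step itself are assumptions made purely for illustration.

```swift
import Foundation

// Minimal illustrative sketch of fingerprint matching, NOT Apple's actual
// NeuralHash algorithm. All names and values here are assumptions.

// Hypothetical set of fingerprints supplied by NCMEC.
let knownFingerprints: Set<String> = [
    "placeholder-hash-1",
    "placeholder-hash-2"
]

// Stand-in for the perceptual hashing step. The real system uses a
// neural-network-based hash designed so that visually similar images
// (for example, a cropped copy) produce the same or a very close value.
func fingerprint(of imageData: Data) -> String {
    String(imageData.base64EncodedString().prefix(16))
}

// The matching step only compares fingerprints; the image content itself
// is never inspected here.
func matchesKnownMaterial(_ imageData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}
```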
The system designed by Apple is sophisticated enough to also detect matches in images that have been slightly modified (for example, cropped). Detection takes place on the user's own phone before the image is uploaded to iCloud. If a photo is found that may match one NCMEC has collected and flagged as sexually explicit content involving minors, a record is generated and associated with the user's account.
If an account accumulates more of these records than a certain threshold (which Apple has not specified), the company will manually review the content of the flagged images on iCloud and alert NCMEC and the authorities if it confirms sexually explicit content involving minors.
"The threshold is established to provide an extremely high level of precision and the possibility of incorrectly marking a certain account is one in a billion," they explain from the company.
A CONTROVERSIAL TOOL
These kinds of automated systems for detecting sexually explicit images of minors are common in cloud storage services such as Google Drive or Dropbox. Apple also performs these checks on files stored on its servers, but running them on the iPhone itself sets a precedent that some researchers and security experts consider worrying.
It is, after all, a system that automatically analyzes users' images, even if it does not do so directly, and many users have iCloud photo uploads enabled by default, which covers every photo stored in the camera roll.
"Whoever controls the fingerprint list can search for whatever content they want on your phone, and you really have no way of knowing what's on that list because it's invisible to you," explains Matthew Green, professor of security and encryption at Johns Hopkins University.
Green and other security experts fear that, in the future, a government could decide to use this already deployed tool to search for, say, images of a banner from a protest or other content that puts citizens at risk, especially in countries with authoritarian regimes.
Several researchers have also shown that there are ways to fool the algorithms that create these unique fingerprints, something that could be used to generate false positives with seemingly innocent photos and thus give law enforcement an excuse to access the information present on a phone.
In recent years, various law enforcement agencies and governments have asked Apple to create back doors that allow them to access messages and images sent or stored on phones to aid in investigations of child abuse or terrorism.
Apple has so far refused, arguing that a back door could also be used by authoritarian governments to violate people's rights and that, once created, such a door could also fall into the hands of cybercriminals.
NeuralHash is, in a way, a compromise solution that allows explicit images of child abuse to be detected without a back door that would lower the overall security or privacy of the platform.
NOTICE TO PARENTS
In addition to this tool, iOS 15 will include new protections for minors' accounts intended to stop the exchange of sexually explicit photos.
If an underage user receives a photo with sexual content, it will appear blurred on the screen and the child will be warned that the photo contains potentially harmful content.
If the child still decides to view it, they will be able to do so, but a notice will be sent to their parents. Multiple confirmation screens will explain why sharing these kinds of images can be harmful and what counts as a sexually explicit image (one showing parts of the body that are normally covered by a bathing suit).
Similar protections are available if a child tries to send sexually explicit photos. The child will be warned before the photo is sent, and parents will receive a message if the child decides to send it.
The detection system uses on-device machine learning to analyze image attachments and determine whether a photo is sexually explicit. The feature is designed so that Apple does not have access to the messages (all analysis happens on the device itself), and it only works with Apple's own Messages app.
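As a rough outline of the flow described in this section, the sketch below shows how an incoming attachment on a child's account could be routed to a blurred preview with a warning. The classifier is a placeholder: Apple's on-device model is not a public API, and none of these names correspond to real Apple interfaces.

```swift
import Foundation

// Hypothetical outline of the Messages check described above; everything
// here is an assumption for illustration, not Apple's implementation.

// Stand-in for the on-device machine-learning classifier. A real
// implementation would run a local model; nothing leaves the device.
func looksSexuallyExplicit(_ imageData: Data) -> Bool {
    return false
}

enum DisplayDecision {
    case showNormally
    case blurAndWarn  // blurred preview plus a warning for the child
}

func handleIncomingAttachment(_ imageData: Data, accountIsChild: Bool) -> DisplayDecision {
    guard accountIsChild, looksSexuallyExplicit(imageData) else {
        return .showNormally
    }
    return .blurAndWarn
}
```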