This is how Apple’s photo-hashing system would work
Child abuse is undoubtedly a global problem that must be fought with every available tool, and Apple wants to contribute by building tools to detect child sexual abuse material. The plan has been made public through various reports, all of them unofficial, although everything indicates it will finally be announced in the coming weeks. The system would run client-side on the device itself: a set of fingerprints (hashes) representing known illegal content would be used to check each photograph in the gallery. Any match would then go through a manual review, so that such an important decision is not left in the hands of an algorithm alone.
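The reported flow can be sketched in a few lines. This is a deliberately simplified illustration, not Apple's actual implementation: the real system is reported to use a perceptual hash, while here a plain SHA-256 stands in for the fingerprint function, and the fingerprint database and gallery contents are placeholders.

```python
import hashlib

# Hypothetical database of fingerprints of known illegal images.
# In the reported design this would ship with the device; these
# values are placeholders for illustration only.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"known-image-bytes-1").hexdigest(),
    hashlib.sha256(b"known-image-bytes-2").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-size hash (simplified stand-in)."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_gallery(gallery: list[bytes]) -> list[int]:
    """Return indices of photos whose fingerprint is in the database.
    Matches would then be sent to manual review, not acted on
    automatically."""
    return [i for i, img in enumerate(gallery)
            if fingerprint(img) in KNOWN_FINGERPRINTS]

gallery = [b"holiday-photo", b"known-image-bytes-1", b"cat-photo"]
print(scan_gallery(gallery))  # → [1]
```

The key point the sketch captures is that only fingerprints are compared, never the images themselves, and that a match is a flag for human review rather than a verdict.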
It is not an infallible or perfect system
Security experts have weighed in since these reports became known, stating that the system is not infallible. Hashing algorithms can produce false positives: reliably classifying an image as child sexual abuse material is complex given the enormous diversity of images that may exist. The geopolitical implications of such a system should also be studied. The match information would be made available to governments, which means it could be repurposed for other tasks, such as suppressing political activism in countries that are not full democracies.
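The false-positive concern follows from the nature of perceptual hashing: many different images must map to the same (or a nearby) hash. The toy average-hash below is far cruder than anything Apple would reportedly use, and the "images" are made-up pixel grids, but it shows two clearly different inputs colliding to an identical hash.

```python
def average_hash(pixels: list[int]) -> int:
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    above the image's mean brightness. Real systems are far more
    sophisticated, but share this many-to-one property."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Two clearly different 4x4 "images" (flattened grayscale values)
# that nevertheless produce the exact same hash: a false positive.
img_a = [200, 10, 200, 10] * 4   # high-contrast stripes
img_b = [90, 30, 90, 30] * 4     # low-contrast stripes

print(hamming(average_hash(img_a), average_hash(img_b)))  # → 0
```

Any image with the same above-or-below-mean pattern collides here, which is exactly why a manual review step matters before any consequence follows from a match.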
It should also be noted that photos uploaded to iCloud for backup are not end-to-end encrypted: they are stored encrypted, but Apple holds the keys. Apple could therefore hand a government the keys to decrypt this content and view a specific user’s entire library, although the same is true of virtually every cloud service. As mentioned above, everything known so far comes from early reports, which means the system could change radically by the time it is officially announced.