WhatsApp has categorically stated it will not adopt the kind of on-device scanning Apple plans to roll out on iPhones to detect potential instances of child abuse. Will Cathcart, who heads the Facebook-owned messaging service, said WhatsApp will never resort to such tactics given the legal and privacy issues they can lead to.
Cathcart argued this may not be the right way to deal with the issue, as it directly impinges on iPhone users' privacy. He also warned that the tool could produce errors, which is when things would become all the more complicated. The concerns are widespread: Matthew Green, an associate professor at the Johns Hopkins Information Security Institute, has likewise said such a tool could mark the beginning of bringing encrypted messaging systems under surveillance, which would be a bad precedent, to say the least.
Apple had earlier stated it intends to release the update sometime late in 2021. The iPhone maker has justified the controversial tool by claiming it is aimed at protecting children from sexual predators. The tool is based on a neural matching technique referred to as NeuralHash. It works by scanning the images on a user's iPhone and checking whether they match the fingerprints of known child sexual abuse material (CSAM).
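The general idea of fingerprint matching can be illustrated with a toy sketch. This is NOT Apple's actual NeuralHash, which derives fingerprints from a neural network; here a simple "average hash" stands in for it, and the fingerprint database values are made up for illustration:

```python
# Conceptual sketch of fingerprint matching, not Apple's real NeuralHash.
# An image is reduced to a short bit string (its "fingerprint"), which is
# then looked up in a set of fingerprints of known material.

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

# Hypothetical fingerprint database (illustrative values only).
known_fingerprints = {"1100", "0110"}

image = [[200, 180],   # a 2x2 grayscale "image"
         [10, 20]]

fingerprint = average_hash(image)        # -> "1100"
is_match = fingerprint in known_fingerprints
print(fingerprint, is_match)
```

A perceptual hash like this differs from a cryptographic hash in that visually similar images are meant to produce the same or nearby fingerprints, which is what makes matching against a database of known images feasible.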
The Cupertino giant has also ruled out any privacy implications, arguing that no scanning takes place in the manner the term might imply: the tool never produces a readable result on its own. Instead, the outcome of each comparison is stored in a cryptographic safety voucher, a placeholder whose contents cannot be interpreted by Apple unless an account crosses a threshold of matches.
In other words, Apple will not retain any information about a user's images after the scan unless the account accumulates a significant number of CSAM matches. Apple also said the chance of an error creeping in, where a user is mistakenly flagged as a sexual offender, is less than one in a trillion. This points to the tool being robust and well defined in its functioning, though it remains to be seen how things shape up once it is released.
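The threshold behavior described above can be sketched in a few lines. The threshold value below is hypothetical, since Apple has not published the real number, and the real system enforces this cryptographically rather than with a simple counter:

```python
# Illustrative threshold check, loosely modelled on Apple's stated design:
# an account is surfaced for human review only after its number of
# fingerprint matches exceeds a threshold. The value 30 is an assumption
# for illustration, not a figure confirmed by the article.

MATCH_THRESHOLD = 30  # hypothetical

def account_flagged(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    """Return True only once the match count exceeds the threshold."""
    return match_count > threshold

print(account_flagged(5))    # below threshold: nothing happens
print(account_flagged(31))   # above threshold: account is flagged
```

The point of the threshold is that isolated matches, including false positives, reveal nothing; only a sustained pattern of matches makes the vouchers readable.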