Why WhatsApp Said No To Apple

A new feature of Apple’s operating systems, with a very noble purpose, is causing an uproar over the methods announced: the CEO of WhatsApp says no to “CSAM detection.”

For a few days now, the tech world has been talking of nothing but Apple’s recent announcement: with iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, the company will launch its fight against “Child Sexual Abuse Material” (CSAM), that is, in a nutshell, child pornography photos and videos. Apple has stated that it will introduce new features to limit the exchange of material of this type through iPhones and iPads, and no one objects to that goal. It is Apple’s announced method for doing so that is receiving heavy criticism, including from Will Cathcart, the CEO of WhatsApp.

The matter is complex because Apple intends to introduce a two-step process to analyze potentially illegal photos and videos. The first step is handled by artificial intelligence; the second involves a moderation team made up of real people. One might think that anyone who does not download or share photos and videos depicting minors in sexual situations (or worse) has nothing to fear from Apple’s new verification system and, at least in theory, that is right. In practice, however, things could go very differently, and the privacy implications could be serious. To understand why, it is first necessary to see how Apple’s new control against CSAM, the so-called “CSAM detection,” will work.

Apple CSAM Detection: How It Works

As already mentioned, the recognition of “Child Sexual Abuse Material” will take place in two phases. The first phase takes place entirely inside the device: the iPhone or iPad compares the photos and videos present in its memory (or being uploaded to the user’s iCloud account) with those in a database managed by NCMEC, the American National Center for Missing and Exploited Children.

This database contains millions of illegal files previously discovered on the devices of pedophiles and on the numerous forums and sites they frequent. An artificial intelligence algorithm compares, directly on the device, all photos and videos in search of similarities with the material in the database (which is constantly updated).
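To make the mechanism concrete, here is a minimal sketch in Swift of how on-device matching against a database of known image hashes could work. Everything here (the names, the 64-bit hash type, the distance threshold) is a hypothetical illustration, not Apple’s actual API: the real system uses a proprietary perceptual hash called NeuralHash plus cryptographic blinding, so a device never learns which of its photos matched.

```swift
import Foundation

// Hypothetical sketch only: these names are not Apple's real API.
// A 64-bit perceptual hash stands in for Apple's proprietary NeuralHash.
typealias PerceptualHash = UInt64

// Hashes of known illegal images. In the real system NCMEC supplies
// them, and they ship inside the OS in cryptographically blinded form.
func loadKnownHashDatabase() -> Set<PerceptualHash> {
    return []  // stub: populated by the vendor in a real deployment
}

// Hamming distance: the number of differing bits between two hashes.
// Perceptual hashes of visually similar images differ in few bits.
func hammingDistance(_ a: PerceptualHash, _ b: PerceptualHash) -> Int {
    (a ^ b).nonzeroBitCount
}

// A photo is a candidate match if its hash lies within a small
// distance of any known hash (distance 0 would mean exact duplicates).
func isSuspect(_ photoHash: PerceptualHash,
               database: Set<PerceptualHash>,
               maxDistance: Int = 4) -> Bool {
    database.contains { hammingDistance($0, photoHash) <= maxDistance }
}
```

The key point for the privacy debate is that this comparison runs on the user’s own device, before anything reaches Apple’s servers.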

If material too similar to that in the database is found, the offending photo or video is sent in a report to Apple’s review team. At this point, as Apple explains from Cupertino, “Apple manually reviews each report to confirm if there is a match, to disable the user’s account, and to make a report to NCMEC.”
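Continuing the sketch above, the hand-off to human review might look like the snippet below. Again, every name is hypothetical: in Apple’s actual design the device uploads cryptographic “safety vouchers,” and only after an account crosses a match threshold (reportedly on the order of 30 matches) can Apple decrypt and review the flagged material.

```swift
// Hypothetical continuation: names are illustrative, not Apple's API.
enum ReviewOutcome {
    case confirmedMatch   // reviewer confirms: account disabled, NCMEC notified
    case falsePositive    // flagged material was innocent after all
}

// A report reaches a human only once an account's match count crosses
// the threshold; the human reviewer then makes the final call.
func handleAccount(matchCount: Int,
                   threshold: Int = 30,
                   reviewerConfirms: Bool) -> ReviewOutcome? {
    guard matchCount >= threshold else { return nil }  // no report yet
    return reviewerConfirms ? .confirmedMatch : .falsePositive
}
```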

The problem with this system comes down to two words: false positives. Artificial intelligence algorithms are far from perfect, and sometimes they get it wrong; some of them get it wrong often. In the event of an error, then, a user’s photos and videos would be sent to Apple and seen by at least one person for a final judgment, even if they contain nothing illegal at all.
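A back-of-the-envelope calculation shows why false positives matter at Apple’s scale. The numbers below are invented purely for illustration; Apple itself has claimed an error rate of about one in a trillion per account per year, a figure that cannot be verified independently.

```swift
// Invented numbers, for illustration only.
let photosScannedPerYear = 1.5e12     // hypothetical: 1.5 trillion photos across all users
let falsePositiveRatePerPhoto = 1e-6  // hypothetical: one error per million photos

let wronglyFlagged = photosScannedPerYear * falsePositiveRatePerPhoto
print("Expected wrongly flagged photos per year: \(Int(wronglyFlagged))")
// Prints: Expected wrongly flagged photos per year: 1500000
```

Even if the per-photo error rate were a thousand times lower, some innocent photos would still end up in front of a human reviewer; the question is only how many.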

Such photos or videos would, in all likelihood, have at least one minor as their subject. That means photos and videos of your children, grandchildren, or any other child in your household could be reported to Apple and viewed by a reviewer. Images that, perhaps, the user wanted to keep private and had no intention of sharing with anyone, publicly or privately, or wanted to share only with a small circle of friends and relatives, or even with just one of them.

WhatsApp Says No

In light of all this, it is quite clear why the CEO of WhatsApp weighed in on this system: if there is one app people use to share photos of their children, it is WhatsApp. Many Apple users have wondered whether the photos and videos exchanged via WhatsApp will be subjected to this treatment, and CEO Will Cathcart could not help but answer.

“I read the information released yesterday by Apple, and I am concerned,” wrote Cathcart on Twitter. “I think this is the wrong approach and a setback for the privacy of people around the world. People have asked if we will adopt this system for WhatsApp. The answer is no.”
