Friday, December 3

Will Apple’s Image Scanning Plan Protect Kids or Just Threaten Privacy? | John Naughton


A while ago, updates to computer operating systems were of interest only to geeks. Not any more, at least where Apple’s iOS and macOS operating systems are concerned. You may recall how iOS 14.5, which required apps to ask users to opt in to tracking, rattled the online advertising racketeers, with their ally Facebook stepping up as their champion. Now the next version of iOS has libertarians, privacy campaigners and “thin end of the wedge” worriers in a spin.

It also has busy mainstream journalists scrambling for headline-friendly summaries of what Apple has in store for us. “Apple is snooping on iPhones to find sexual predators, but privacy activists worry governments could weaponise the feature” was how the venerable Washington Post initially reported it. This was, to put it politely, a bit misleading, and the first three paragraphs below the headline were, as John Gruber sharply pointed out, completely wrong.

To be fair to the Post, however, we must acknowledge that there is no one-sentence formulation that accurately captures the scope of what Apple has in mind. The truth is that it’s complicated; worse still, it involves cryptography, a subject guaranteed to send most readers looking for the nearest exit. And it concerns child sexual abuse images, which are (rightly) one of the most fraught topics in the online world.

A good place to start, therefore, is with Apple’s explanation of what it is trying to do. Basically, three things. The first is to provide tools to help parents manage their children’s messaging activity. (Yes, there are families affluent enough to give everyone an iPhone!) The iMessage app on children’s phones will use its built-in machine learning to warn of sexually explicit content and alert their parents. Second, the updated operating systems will use cryptographic tools to limit the spread of CSAM (child sexual abuse material) on Apple’s iCloud storage service while preserving user privacy. (If this sounds like squaring a circle, stay tuned.) And third, Apple is updating Siri and Search to help parents and children if they encounter unsafe material. This third change seems relatively uncontroversial. It is the other two that have generated the most heat.

The first change is controversial because it involves things happening on people’s iPhones; well, strictly, on phones used by children on a shared family account. If the machine-learning algorithm detects a sexually explicit image, the photo will be blurred and accompanied by a message warning the user that if they view it, their parents will be notified. The same applies if the child tries to send a sexually explicit photograph.
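To make that flow concrete, here is a minimal sketch in Python of the screening logic as described above. Apple has not published its implementation, so everything here (the Photo type, the classify_explicit stand-in and the screen_incoming helper) is hypothetical, invented purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Photo:
    data: bytes
    blurred: bool = False  # set before the image is displayed

def classify_explicit(photo: Photo) -> bool:
    """Toy stand-in for Apple's on-device ML classifier (not the real model)."""
    return photo.data.startswith(b"EXPLICIT")  # trivial heuristic for the demo

def screen_incoming(photo: Photo, child_account: bool) -> list[str]:
    """Return the actions the flow described in the column would take."""
    actions = []
    if child_account and classify_explicit(photo):
        photo.blurred = True  # blur the photo before it is shown
        actions.append("warn: viewing this image will notify your parents")
        actions.append("notify parents if the child chooses to view it")
    return actions

# Demo: a flagged image on a child's account triggers both actions.
print(screen_incoming(Photo(b"EXPLICIT-demo-bytes"), child_account=True))
```

The point of the sketch is simply that the decision happens on the device, before the image is displayed; nothing leaves the phone at this stage.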

But how does the system know whether an image is sexually explicit? It appears to do so by checking whether it matches images in a database maintained by the US National Center for Missing & Exploited Children (NCMEC). Every image in that grim database has a unique cryptographic signature – a long, incomprehensible number – in other words, the kind of thing computers are exceptionally good at reading. This is how photos uploaded to iCloud will be scanned: not by analysing the image itself, but simply by checking its signature. Apple’s innovation is to do that checking “client-side” (as the tech jargon puts it), that is, on the device as well as in the cloud.
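Here is a toy Python illustration of that matching step. For simplicity it uses an exact cryptographic hash (SHA-256) and an invented placeholder database; Apple’s actual system reportedly uses a perceptual hash called NeuralHash, plus further cryptographic machinery (threshold secret sharing, private set intersection) that this sketch omits.

```python
import hashlib

def signature(image_bytes: bytes) -> str:
    """Compute a fixed-length signature for an image file."""
    return hashlib.sha256(image_bytes).hexdigest()

# Stand-in for the NCMEC-derived database of known signatures
# (placeholder values only; the real database is not public).
known_bad_signatures = {
    signature(b"placeholder-known-image-1"),
    signature(b"placeholder-known-image-2"),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """Client-side check: compare signatures, never inspect the image itself."""
    return signature(image_bytes) in known_bad_signatures

print(matches_known_database(b"placeholder-known-image-1"))  # True
print(matches_known_database(b"holiday-snap"))               # False
```

One reason an exact hash would not suffice in practice: changing a single pixel changes a SHA-256 signature completely, which is why a perceptual hash that tolerates small edits is used instead.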

It is this innovation that has set off the loudest alarms among those concerned about privacy and civil rights, who see it as undermining what has so far been an impressive feature of iMessage: its end-to-end encryption. The Electronic Frontier Foundation, for example, sees it as a possible “backdoor”. “It is impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children,” it warns. “As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption and open the door to broader abuses. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”

Before you get too steamed up, here are a few things worth bearing in mind. You don’t have to use iCloud for photos. And while Apple will doubtless try to claim the moral high ground, as usual, it’s worth noting that to date it has seemed relatively relaxed about what sits on iCloud. The NCMEC reports, for example, that in 2020 Facebook reported 20.3m images to it, while Apple reported only 265. So could this brave new update simply be about catching up? Or a pre-emptive strike against impending reporting requirements in the UK and EU? As the Bible might put it, corporations move in mysterious ways, their wonders to perform.

What I’ve been reading

Stunted growth
Vaclav Smil: we must leave growth behind is the transcript of an interview by David Wallace-Wells, recorded after the publication of Smil’s magisterial book on growth.

Fallen idol
Surely we can do better than Elon Musk is a fabulous long read by Nathan J. Robinson on the Current Affairs site.

Hanging
Teenage loneliness and the smartphone is a sombre New York Times essay by Jonathan Haidt and Jean Twenge, who have spent years studying the effect of smartphones and social media on our daily lives and mental health.


www.theguardian.com
