For centuries, cryptography was the exclusive preserve of the state. Then, in 1976, Whitfield Diffie and Martin Hellman devised a practical method of establishing a shared secret key over an authenticated (but not confidential) communication channel without using a prior shared secret. The following year, three MIT scholars – Ron Rivest, Adi Shamir and Leonard Adleman – came up with the RSA algorithm (named after their initials) to implement it. It was the beginning of public-key cryptography – at least in the public domain.
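The Diffie-Hellman idea can be shown in a few lines. The sketch below uses deliberately tiny textbook parameters (p = 23, g = 5) purely for illustration; any real deployment uses standardized 2,048-bit-plus groups or elliptic curves, and the variable names are my own.

```python
import secrets

# Textbook Diffie-Hellman key exchange with toy parameters.
# WARNING: p is far too small for real security; this only
# illustrates the mechanism of agreeing on a secret in public.
p, g = 23, 5                       # public prime modulus and generator

a = secrets.randbelow(p - 2) + 1   # Alice's private exponent (kept secret)
b = secrets.randbelow(p - 2) + 1   # Bob's private exponent (kept secret)

A = pow(g, a, p)                   # Alice sends A over the open channel
B = pow(g, b, p)                   # Bob sends B over the open channel

alice_secret = pow(B, a, p)        # Alice computes (g^b)^a mod p
bob_secret = pow(A, b, p)          # Bob computes (g^a)^b mod p

# Both sides arrive at the same shared key, yet an eavesdropper
# who saw only p, g, A and B cannot feasibly recover it.
assert alice_secret == bob_secret
```

The point of the construction is exactly what the column describes: the channel can be public, as long as it is authenticated.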
From the beginning, state authorities were not amused by this development. They were even less amused when, in 1991, Phil Zimmermann created Pretty Good Privacy (PGP) to sign, encrypt and decrypt texts, emails, files and more. PGP raised the specter that ordinary citizens – or at least the geekiest of them – could wrap their electronic communications in an envelope that not even the most powerful state could open. In fact, the US government was so enraged by Zimmermann's work that it classified PGP as a munition, meaning it was a crime to export it to Warsaw Pact countries. (The cold war was still relatively hot back then.)
In the four decades since then, there has been a running conflict between citizens' desire for communications that the state and its agencies cannot read and those agencies' desire to be able to read them. The aftermath of September 11, which gave states carte blanche to spy on everything people did online, and the explosion of online communication via the internet and (since 2007) smartphones, intensified that conflict. During the Clinton years, US authorities tried (and failed) to ensure that all electronic devices had a secret back door, while the Snowden revelations of 2013 pressured internet companies to offer end-to-end encryption for their users' communications, rendering them unreadable either to security services or to the technology companies themselves. The result was something of a standoff: tech companies facilitating unreadable communications on one side, and law enforcement and security agencies unable to access evidence to which they have a legitimate right on the other.
In August, Apple opened a crack in the industry's armor, announcing that it would add to its iOS operating system new features designed to combat the sexual exploitation of children and the distribution of abusive images. The most controversial measure scans photos on an iPhone, compares them with a database of known child sexual abuse material (CSAM) and notifies Apple if a match is found. The technology is known as client-side scanning, or CSS.
Powerful forces in government and the tech industry are now pushing hard for CSS to be made mandatory on all smartphones. Their argument is that, instead of weakening encryption or providing law enforcement with backdoor keys, CSS would allow data to be analyzed on the device in the clear, ie before an app such as WhatsApp or iMessage encrypts it. If targeted material were detected, its existence, and potentially its source, would be disclosed to the agencies; otherwise, little or no information would leave the client device.
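To make the mechanism concrete, here is a minimal, hypothetical sketch of the on-device matching step, reduced to an exact SHA-256 comparison. Real systems such as Apple's use a perceptual hash (NeuralHash) so that re-encoded images still match, plus cryptographic blinding and threshold schemes so the device never learns the database contents; none of that is shown here, and all names below are illustrative.

```python
import hashlib

# Hypothetical database of hashes of known prohibited content,
# as it might be shipped to the device (here just one sample entry).
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example-forbidden-content").hexdigest(),
}

def scan_before_send(plaintext: bytes) -> bool:
    """Return True if the content matches the database (would be reported).

    The crucial point the column makes: this check runs on the device,
    on the plaintext, *before* end-to-end encryption is applied.
    """
    digest = hashlib.sha256(plaintext).hexdigest()
    return digest in KNOWN_BAD_HASHES

assert scan_before_send(b"example-forbidden-content") is True
assert scan_before_send(b"an ordinary holiday photo") is False
```

The design choice the critics object to is visible even in this toy: whoever controls `KNOWN_BAD_HASHES` controls what gets flagged, and nothing in the mechanism restricts it to images.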
CSS evangelists claim it is a win-win proposition: a resolution of the encryption-versus-public-safety debate that offers both privacy (unimpeded end-to-end encryption) and the ability to investigate serious crime successfully. What's not to like? A lot, says an academic paper by some of the world's leading computer security experts, published last week.
The impetus behind the CSS lobbying is that the scanning software would be installed on everybody's smartphones, rather than covertly installed on the devices of suspects, or by court order on those of ex-offenders. Such universal deployment would threaten the security of law-abiding citizens as well as lawbreakers. And although CSS still permits end-to-end encryption, that is moot if the message has already been scanned for targeted content before being sent. Similarly, while Apple's implementation of the technology scans only images, it doesn't take much to imagine political regimes scanning text for names, memes, political opinions and so on.
In reality, CSS is a technology for what the security world calls “bulk interception”. Because it would give government agencies access to private content, it should really be treated like wiretapping and regulated accordingly. And in jurisdictions where bulk interception is already prohibited, bulk CSS should be prohibited as well.
Yet in the broader perspective of the evolution of digital technology, CSS is just the latest step in the inexorable intrusion of surveillance into our lives. The trend began with the reading of our emails, moved on to recording our searches and browsing clickstreams, then to mining our online activity to build profiles for targeting advertising at us, and to using facial recognition to let us into our offices. It now extends through the home with “smart” devices that relay everything to motherships in the “cloud” and, if CSS is mandated, will reach directly into our pockets, handbags and purses. That leaves only one remaining barrier: the human skull. But rest assured, Elon Musk undoubtedly has a plan for that too.
What I’ve been reading
Wheels within wheels
I’m not an indoor cyclist, but if I were one, The Counterintuitive Mechanics of Peloton Addiction, a confessional blog post by Anne Helen Petersen, might give me pause.
Get out of here
The Last Days of Intervention is a long and reflective essay in Foreign Affairs by Rory Stewart, one of the few British politicians who consistently talked sense about Afghanistan.
Blowing the whistle on Facebook is just the first step is a bracing piece by Maria Farrell on the Conversationalist about the Facebook whistleblower.
George is Digismak’s reporter-cum-editor, with 13 years of experience in journalism.