The end is good, but the means are not. More than 90 human rights and internet advocacy groups from around the world sent an open letter this Thursday to Apple CEO Tim Cook, urging him to abandon his plans to scan messages sent by minors through its iMessage system and the photos on adults' phones in order to detect images of child sexual abuse and combat child exploitation.
The letter comes almost 15 days after the company became mired in controversy. On August 5, Apple announced that it would install surveillance software designed to scan users' devices, specifically the messaging app (iMessage) and their photographs. While the measure is intended to protect children and reduce the spread of child sexual abuse material, critics fear that governments and police forces around the world will exploit the new functions to censor free expression or persecute dissidents.
“While these features are intended to protect children and reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the groups say in the letter.
This is the largest campaign to date over encryption aimed at a single company. It was organized and driven by the Center for Democracy and Technology (CDT), a non-profit organization based in the United States. “The breadth of the international coalition joining the letter shows the extent to which Apple’s plans open the door to threats to human rights around the world,” notes the CDT. Signatories include PEN America, the Electronic Frontier Foundation (EFF), Access Now, Privacy International, Derechos Digitales, Global Voices and Global Partners Digital, along with groups from Europe, Latin America, Africa, Asia and Australia.
Persecuted youth and opportunistic governments
The organizations focus their concerns on two areas. The first is the scan-and-alert feature in iMessage meant to notify the parents of minors. These alerts could threaten the safety and well-being of some young people, especially “LGBTQ+ young people with unsympathetic parents, because with these alerts they are particularly at risk,” CDT explains. The second, and the most worrying for the signatories, is that once photo scanning is built into Apple products, the company would face enormous pressure, and possibly legal requirements, from countries around the world to search for other types of images that each government deems objectionable.
The new measures, these organizations argue, would give governments a free hand to exploit the surveillance capability that Apple is incorporating into iPhones, iPads and computers for undemocratic ends. “They will demand that Apple find and block images of human rights abuses, political protests and other content that should be protected as free expression, which forms the backbone of a free and democratic society,” predicts Sharon Bradford Franklin, co-director of CDT’s Security and Surveillance Project.
The organizations have reiterated that they support all efforts to protect children and strongly oppose the proliferation of child sexual abuse material. But they insist that the changes Apple has announced “put children and other users at risk, both now and in the future.” They urge Apple to abandon these changes immediately and to reaffirm the company’s commitment to protecting its users “with end-to-end encryption” [a feature that guarantees that only the sender and the recipient can read or listen to what is sent, and that no one else, not even the platforms, can do so].
“We also urge Apple to consult more regularly with civil society groups and vulnerable communities that may be disproportionately affected by changes to its products and services,” they advise.
Eddie is an Australian news reporter with over nine years in the industry who has published on Forbes and TechCrunch.