
Mass facial recognition is the apparatus of police states and must be regulated


In April, the EU will become the first world power to draw up detailed plans to regulate artificial intelligence, including rules on the use of biometric surveillance technologies such as facial recognition in public spaces.

Biometric technologies collect unique data from our bodies and behaviors, revealing confidential information about who we are. The advocacy group European Digital Rights (EDRi) has found that more than half of EU countries use facial recognition and similar tools in ways that conflict with their own human rights rules.

Despite being promoted as an easy solution to crime, these invasive technologies often rest on unscientific and discriminatory foundations. They are deployed by powerful actors trying to play God, and they benefit only the tech companies that sell them.

Naively at best, our governments are creating the perfect conditions for anti-democratic mass surveillance of entire populations. Now Europeans are fighting back: the Reclaim Your Face movement has launched a pan-European petition to pressure the EU institutions to ban the harmful use of these technologies.

While facial recognition technology has been around for decades, recent years have seen a growing desire among cash-strapped city councils and police forces to automate the analysis of people's facial and vocal expressions, body movements, and behaviors in order to predict whether a person is "suspicious".

Earlier this month, the EU’s highest court heard a case about controversial AI “lie detector” tests run through a project called iBorderCTRL. MEP Patrick Breyer is suing the European Commission over this EU-funded pilot, which ran between 2016 and 2019. According to the Commission, the project aimed to speed up border checks for non-EU citizens traveling to the EU. In reality, it used invasive biometric analysis of emotions, which some scientists have said does not work, to help determine whether or not someone should be granted the freedom to travel.

iBorderCTRL and similar EU-funded projects have been criticized for the secrecy in which these allegedly rights-violating experiments were carried out, for being part of increasingly inhumane migration control strategies, and for infringing on “cognitive freedom”. Another example can be seen in the Netherlands, where authorities are using biometric technologies to predict people's aggressiveness.

We are witnessing huge sums of money being spent by authorities on data-driven biometric technology as an easy shortcut to complex social problems. They are doing this rather than investing in education, welfare, or building trust with communities, which are more effective ways to reduce crime and improve equitable access to opportunity.

Equipping our streets, supermarkets, and parks with these technologies will amplify existing discrimination against people of color, people with disabilities, and other underserved groups. Rather than being open to debate, these practices will be hidden under a veil of false scientific authority and deliberate opacity.

London police, for example, claim that facial recognition is never the sole basis for a decision. Yet research from the University of Essex shows that humans are psychologically discouraged from challenging AI-based decisions because of the heavy burden of disproving them. In short: “The computer says no.”

By claiming to know what people are thinking or what they will do next, the latest generation of biometric systems propels us toward a technologically enabled thought police. Through “mind-reading” experiments like iBorderCTRL, the EU is legitimizing flimsy pseudoscience and opening the door to increasingly invasive biometric predictions. The technology ignores the fact that expressions and emotions vary widely across cultures and individuals; biometric suspicion detection instead forces everyone to conform to an arbitrary technological standard.

Governments are opening a Pandora’s box in which state control extends not only to everything we do but also to our most intimate thoughts, applied in contexts where our innocence or guilt is predicted from our biometric data. These tools and practices are the dream of any quintessential police state and the antithesis of the rule of law. But we have the opportunity to stop them before they become widespread in our societies.

The European Reclaim Your Face campaign, backed by EDRi, calls for a permanent end to biometric mass surveillance practices like iBorderCTRL. The petition, which launches today and hopes to gather one million signatures, aims to promote a future in which people are not segmented based on their appearance, religion, sexuality, disability, or the other factors that make up our diverse identities. History could not give us a clearer warning about what happens when societies divide and turn against each other on the basis of who is construed as suspicious, abnormal, or aberrant.

EU countries will decide whether or not the development and use of AI will be legally controlled. As the first continent to consider regulating these fast-moving technologies, Europe has the power to set a global example, for instance by banning biometric mass surveillance practices.

This would ensure that we can all enjoy the presumption of innocence, free from the judgment of machines that strip us of our dignity, violate our physical and psychological integrity, and threaten our democratic freedoms. We urge you to #ReclaimYourFace with us.

