
Smile for the Camera: The Dark Side of China’s Emotion Recognition Technology


“Ordinary people here in China are not happy with this technology, but they have no choice. If the police say there have to be cameras in a community, people will just have to live with it. There is always that demand, and we are here to meet it.”

So says Chen Wei of Taigusys, a company specializing in emotion recognition technology, the latest evolution in the wider world of surveillance systems that play a role in almost every aspect of Chinese society.

Emotion recognition technologies track facial expressions of anger, sadness, happiness and boredom, along with other biometric data, and can supposedly infer a person’s feelings from signals such as facial-muscle movements, vocal tone and body movements. They go beyond facial recognition technologies, which simply compare faces to determine a match.

But like facial recognition, they involve the mass collection of sensitive personal data to track, monitor and profile people, and they use machine learning to analyze expressions and other cues.
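To make that distinction concrete, here is a minimal sketch in Python (NumPy only). Everything in it is invented for illustration: the four emotion labels come from the article, but the feature vector, the linear classifier and the matching threshold do not reflect any vendor’s actual model.

# Purely illustrative sketch (not any vendor's system): emotion recognition
# maps biometric features to an emotion label, while facial recognition
# merely compares two faces for identity.
import numpy as np

EMOTIONS = ["anger", "sadness", "happiness", "boredom"]  # labels named in the article

def classify_emotion(features: np.ndarray, weights: np.ndarray) -> str:
    """Score each emotion from facial-muscle/vocal/body features.
    A linear model stands in for whatever classifier a vendor trains."""
    logits = weights @ features
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                     # softmax over emotion labels
    return EMOTIONS[int(np.argmax(probs))]

def same_person(face_a: np.ndarray, face_b: np.ndarray, threshold: float = 0.8) -> bool:
    """Facial recognition, by contrast: compare two face embeddings
    and decide whether they belong to the same person."""
    cos = face_a @ face_b / (np.linalg.norm(face_a) * np.linalg.norm(face_b))
    return bool(cos > threshold)

# Hypothetical inputs: a 5-dimensional feature vector and random weights.
rng = np.random.default_rng(0)
features = rng.normal(size=5)
weights = rng.normal(size=(len(EMOTIONS), 5))
print(classify_emotion(features, weights))   # prints one of the four labels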

The industry is booming in China, where since at least 2012, figures like President Xi Jinping have emphasized creating “positive energy” as part of an ideological campaign to promote certain types of expression and limit others.

Critics say the technology is based on a pseudoscience of stereotypes, and a growing number of researchers, lawyers and rights activists believe it has serious implications for human rights, privacy and freedom of expression. With the global industry forecast to be worth nearly $36 billion by 2023, growing at almost 30% a year, rights groups say action must be taken now.

‘Intimidation and censorship’

Taigusys’s main office is tucked away behind a few low-rise office buildings in Shenzhen. Visitors are greeted at the door by a series of cameras that capture their images on a large screen displaying body temperature, along with age estimates and other statistics. Chen, the company’s general manager, says the system at the door is the company’s bestseller at the moment because of high demand during the coronavirus pandemic.

Chen hails emotion recognition as a way to predict dangerous behavior by prisoners, spot potential criminals at police checkpoints, identify troubled students in schools, and detect older people suffering from dementia in nursing homes.

Visitors to Taigusys in Shenzhen are greeted by cameras that capture their images on a large screen displaying body temperature, estimated age and other statistics. Photograph: Michael Standaert/The Guardian

Taigusys systems are installed in some 300 prisons and detention centers across China, connecting 60,000 cameras.

“Violence and suicide are very common in detention centers,” says Chen. “Even if police nowadays don’t beat prisoners, they often try to wear them down by not allowing them to fall asleep. As a result, some prisoners will suffer a mental breakdown and seek to kill themselves. And our system will help prevent that from happening.”

Chen says that because the prisoners know they are monitored by this system 24 hours a day, in real time, they become more docile, which for the authorities is positive on many fronts. “Since they know what the system does, they won’t consciously try to violate certain rules,” he says.

Q&A: What is the Digital Citizens series?

As part of our Rights and Freedoms project, we investigated how rapid advances in data-intensive technologies are affecting human rights around the world.

Under the cover of the pandemic, many governments have used digital technologies to track and analyze the movements of citizens, stifle dissent and restrict freedom of expression, while digital platforms have manipulated the truth and spread misinformation.

But technology can also be a powerful force for hope and justice, helping to preserve rights and freedoms in the face of growing authoritarianism.

In addition to prisons and police checkpoints, Taigusys has deployed its systems in schools to monitor teachers, students and staff, in nursing homes to detect falls and changes in the emotional state of residents, and in shopping centers and parking lots.

While the use of emotion recognition technology in schools in China has drawn some criticism, there has been very little discussion of its use by authorities on citizens.

Chen, while aware of the concerns, touts the system’s potential to stop violent incidents. He cites a case last June in which a security guard stabbed some 41 people in southern China’s Guangxi province, claiming the attack could have been prevented with this kind of technology.

Vidushi Marda is a digital program manager at the British human rights organization Article 19 and a lawyer specializing in the socio-legal implications of emerging technologies. She questions Chen’s opinion on the Guangxi stabbing.

An ‘Intelligent AI Outbreak Prevention’ system by SenseTime in Shenzhen can detect whether people have a fever and identify faces even behind a mask. Photograph: Alex Plavevski/EPA

“This is a familiar and somewhat frustrating narrative that we often see when newer and ‘shinier’ technologies are introduced under the security umbrella, but in reality video surveillance has little nexus to security, and I’m not sure how he thought real-time feedback would fix the violence,” Marda told the Guardian.

“I think a lot of biometric surveillance is closely tied to intimidation and censorship, and I suppose [emotion recognition] is an example of that.”

Biometrics 3.0

A recent Article 19 report on the development of these surveillance technologies, which one Chinese company describes as “biometrics 3.0”, surveyed 27 companies in China and found that their growth without safeguards or public deliberation was especially troubling, particularly in the public security and education sectors.

Ultimately, groups like Article 19 say the technology must be banned before widespread adoption globally makes the ramifications too difficult to contain.

The Guardian contacted a number of companies included in the report. Only Taigusys responded to an interview request.

Another problem is that recognition systems are generally trained on actors posing in what they believe to be happy, sad, angry and other emotional states, rather than on genuine expressions of those emotions. Facial expressions also vary widely across cultures, leading to further ethnic bias and inaccuracy.

A Taigusys system used by police in China, as well as security services in Thailand and some African countries, includes identifiers such as “yellow, white, black” and even “Uighur.”

Children walk past cameras in Akto, near Kashgar, Xinjiang, where Uighurs in China face intense surveillance. Cameras can distinguish between Uighurs and Han Chinese. Photograph: Greg Baker/AFP/Getty

“Populations in these countries are more racially diverse than in China, and in China it is also used to distinguish Uighurs from Han Chinese,” says Chen, referring to the country’s dominant ethnicity. “If a Uighur appears, they will be tagged, but Han Chinese will not be tagged.”

Potential for misuse

Asked whether he is concerned about the authorities misusing these features, Chen says he is not worried about the software being used by police, implying that such institutions should automatically be trusted.

“I’m not worried because it’s not our technology that’s the problem,” says Chen. “There are demands for this technology in certain settings and locations, and we will do our best to meet those demands.”

For Shazeda Ahmed, a visiting researcher at New York University’s AI Now Institute who contributed to the Article 19 report, these are all “terrible reasons.”

“It is really worrying that Chinese conceptions of race are incorporated into technology and exported to other parts of the world, especially since there is not the kind of critical discourse [about racism and ethnicity in China] that we are having in the United States,” she tells the Guardian.

“In any event, research and investigative reporting in recent years have shown that sensitive personal information is particularly dangerous when in the hands of state entities, especially given the wide scope of its possible use by state actors.”

One driver of China’s emotion recognition sector is the country’s lack of strict privacy laws. There are essentially no rules restricting the authorities’ access to biometric data on grounds of national or public security, which gives companies like Taigusys complete freedom to develop and launch these products in ways that similar companies in the US, Japan or Europe cannot, Chen says.

“So we have the opportunity to collect as much information as possible and find the best scenarios to make use of that data,” he says.


www.theguardian.com
