China's Emotion Recognition Tech Receives Criticism

Published on 02 Mar 2021

China's emotion recognition technology has drawn criticism from human rights groups, which are calling for a ban on technology they say is racially biased and inaccurate.

What is emotion recognition technology?

The technology uses cameras and machine learning to identify facial expressions such as anger, sadness, happiness, and boredom. The system is supposed to help authorities infer a person's feelings from facial muscle movements, body language, vocal tone, and other biometric signals. Unlike facial-recognition technology, which matches faces against a database, emotion recognition makes inferences about people's behavior. Though everyday citizens in the country are not happy about the technology, they have little choice in the matter.

But like facial recognition, it involves the mass collection of sensitive personal data to track, monitor, and profile people, and it uses machine learning to analyze expressions and other cues.
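At a high level, such a system pairs a face detector with an expression classifier that assigns emotion labels to each detected face. The sketch below is a minimal illustration of that pipeline only, assuming OpenCV's stock Haar-cascade face detector and a stub classifier standing in for a trained model; it is not Taigusys's system or any real deployment.

```python
# Illustrative sketch of an emotion-recognition pipeline (assumption,
# not any vendor's actual system): detect faces, then label each crop.
import cv2
import numpy as np

EMOTIONS = ["anger", "sadness", "happiness", "boredom"]

# OpenCV's bundled Haar cascade for frontal face detection.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(face_crop):
    """Stand-in for a trained expression classifier (hypothetical stub)."""
    # A real system would run a trained model here; random scores are
    # used only to show the shape of the inference step.
    scores = np.random.rand(len(EMOTIONS))
    return EMOTIONS[int(np.argmax(scores))]

def analyze_frame(frame):
    """Detect faces in a camera frame and attach an inferred emotion label."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        crop = gray[y:y + h, x:x + w]
        results.append(((x, y, w, h), classify_emotion(crop)))
    return results
```

In a deployment like those described in this article, a loop like this would run continuously over feeds from thousands of cameras, which is why critics frame it as mass biometric surveillance rather than a one-off analysis.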


Taigusys and its view of the technology

Taigusys is one of the companies specializing in this technology. Chen Wei, a representative for the company, says emotion recognition is a way to predict dangerous behavior by prisoners, identify potential criminals at checkpoints, flag students with problems in schools, and detect dementia among elderly people in care homes. The company's systems are installed in 300 prisons and detention centers and connect over 60,000 cameras.
“Violence and suicide are very common in detention centers. Even if police nowadays don’t beat prisoners, they often try to wear them down by not allowing them to fall asleep. As a result, some prisoners will have a mental breakdown and seek to kill themselves. And our system will help prevent that from happening,” says Chen.
Taigusys's systems have also been installed in schools to monitor students, staff, and teachers. The use of this technology in schools has drawn some criticism, but little has been said about authorities' use of emotion recognition on ordinary citizens.


Criticism of the technology

Critics of the technology question its validity, saying it is based on pseudo-science and relies heavily on stereotypes. Many human rights activists say the technology could have serious negative impacts on freedom of expression, privacy, and human rights. Vidushi Marda, a lawyer focusing on the socio-legal impact of emerging technologies and a digital program manager at the British human rights organization Article 19, says, “A lot of biometric surveillance, I think, is closely tied to intimidation and censorship, and I suppose [emotion recognition] is one example of just that.”
