Every second, the surveillance cameras installed in each classroom at Niulanshan First Secondary School in Beijing snap a photo. The images are then fed into the Classroom Care System, an “emotion recognition” program developed by Hanwang Technology. It identifies each student’s face and analyzes their behavior: a student rifling through their desk might be labeled “distracted,” while another looking at the board would be labeled “focused.” Other behavioral categories include answering questions, interacting with other students, writing, and sleeping. Teachers and parents receive a weekly report through a mobile app, which can be unsparing: In one, a student who had answered just a single question in his English class was called out for low participation — despite the app recording him as “focused” 94% of the time.
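Hanwang has not published how the Classroom Care System works internally. As a rough, purely illustrative sketch of the kind of pipeline the article describes — one label per student per captured frame, rolled up into a periodic report — the snippet below uses invented names (classify_frame, summarize) and toy data in place of any real face recognition or behavior model.

```python
# Illustrative sketch only; Hanwang's actual implementation is not public.
# All identifiers and data here are hypothetical.
from collections import Counter

BEHAVIOR_LABELS = ("focused", "distracted", "answering questions",
                   "interacting", "writing", "sleeping")

def classify_frame(frame):
    """Stand-in for the real pipeline: in practice this step would run face
    detection/recognition and a trained behavior classifier on the image.
    Here it simply echoes pre-labeled toy data."""
    return frame

def summarize(frames):
    """Tally per-student behavior labels across all captured frames."""
    tallies = {}
    for frame in frames:
        for student_id, label in classify_frame(frame).items():
            tallies.setdefault(student_id, Counter())[label] += 1
    return tallies

# Toy data: three one-second frames, two students.
frames = [
    {"stu_01": "focused", "stu_02": "distracted"},
    {"stu_01": "focused", "stu_02": "writing"},
    {"stu_01": "answering questions", "stu_02": "focused"},
]

for student, counts in summarize(frames).items():
    total = sum(counts.values())
    focus_pct = 100 * counts["focused"] / total
    print(f"{student}: {focus_pct:.0f}% focused, labels={dict(counts)}")
```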
The Beijing program, first described by journalist Yujie Xue in 2019, has attracted fresh scrutiny in a sweeping new report on emotion recognition technology in China, published Monday by Article 19. The British human rights organization found that the dubious tech, while not yet widespread, is being promoted by dozens of Chinese corporations and academic researchers for a wide range of applications, including border screening, prison surveillance, and assessing student behavior and performance.
Emotion recognition technology is based on a fundamentally flawed idea: that an algorithm can analyze a person’s facial expressions and accurately infer their inner state or mood. In reality, studies have found that when people experience emotions like joy, worry, or disgust, they don’t necessarily express them in consistent, universal ways. Many people may frown when they feel sad, but that reaction also depends on factors such as culture and context.
A 2019 meta-analysis that looked at over 1,000 studies on emotion recognition found that it’s “not possible to confidently infer happiness from a smile, anger from a scowl, or sadness from a frown, as much of current technology tries to do when applying what are mistakenly believed to be the scientific facts.” In other words, using facial expressions to determine someone’s attention level, motivation, or trustworthiness — all things emotion recognition companies purport to do — simply isn’t achievable.
These findings haven’t stopped tech companies like Amazon, Microsoft, and Google from offering emotion recognition to their customers (though Amazon and Microsoft note their tools can’t make “a determination of the person’s internal emotional state” and that “facial expressions alone may not necessarily represent the internal states of people”). Other startups have tried applying emotion recognition to sensitive tasks, including screening job applicants. Overall, the global market for the tech will be worth more than $33 billion by 2023, according to one estimate. “New technologies proliferate in societies not necessarily because they work or have demonstrated impact,” said Vidushi Marda, senior program officer at Article 19 and a co-author of the report, “but because the actors and institutions that build, sell, and use these technologies claim that it works.”
In China, according to the report, some firms describe emotion recognition as an evolution of facial recognition, even though the technologies have disparate functions. One company, for example, called emotion recognition “biometrics 3.0.” While researchers have also found many facial recognition programs to be flawed, the tech is only designed to identify faces, rather than discern what a person may be feeling or thinking.
The authors of the Article 19 report recommend that China and other countries prohibit the sale and use of emotion recognition technology, and not only because it’s often based on junk science. They worry the tech has the potential to erode privacy and human rights, especially for minorities and other vulnerable populations. Shazeda Ahmed, a co-author of the report and a Ph.D. candidate at the University of California, Berkeley, said many of the methods tech companies are using “reproduce racist, culturally biased assumptions about how humans express emotions.”
Two years ago, the AI Now Institute at New York University called for emotion recognition technology to be banned from use for “important decisions that impact people’s lives and access to opportunities,” including evaluating “student performance in school.” But there are still major incentives for companies in China and other countries to continue bringing the technology into classrooms.
“In the competitive Chinese educational environment, it’s easy for companies to pander to parents’ anxieties about their children’s success,” said Ahmed. School administrators may also see the technology as a way to attract state funding and produce educational improvements overnight. In places like the United States and India, facial and emotion recognition tools have been used in schools for safety and to boost attendance.
The Article 19 report documents a range of Chinese companies that offer emotion recognition for education, including tech giants like Lenovo. One firm claimed to have built an interface for teachers that displays “problem student warnings,” which flag emotions like sadness or fear. The program combines emotion recognition with academic performance to categorize students according to different archetypes. The “falsely earnest type,” for instance, is someone who listens in lectures but gets bad grades.
Some startups have incorporated emotion recognition tools into online learning platforms, which have exploded in popularity in China during the pandemic. In the U.S., the switch to remote learning has led schools and universities to adopt AI systems that purport to detect behavior like cheating, provoking criticism from students and administrators alike.
Yong Zhao, an education professor at the University of Kansas, cautioned that not only can these technologies amplify students’ anxieties, but they’re also highly fallible. “We don’t know yet how good the algorithm is,” said Zhao. “Can you really capture all students’ emotional patterns? What does it really mean to be paying attention?”
Not everyone in China is in favor of using emotion recognition in schools. In an infamous 2018 incident, Hangzhou No. 11 Middle School, in eastern China’s Zhejiang Province, implemented a system developed by surveillance giant Hikvision that scanned students’ faces every 30 seconds to identify seven types of emotions and six types of behavior. The program attracted local and international media attention, and following backlash from students and parents, it was reportedly paused soon after.
But the Article 19 report found that positive media coverage of emotion recognition still prevails in China over accounts documenting the downsides of using it in schools. Ahmed said she wanted to believe the Hangzhou incident would deter other companies, “but many of the additional examples we found were launched after that trial.”
One of the overarching problems with emotion recognition is that it’s often unclear how educators should respond to the data. If the algorithm indicates that students look more unhappy than usual, there is no obvious guidance on how a teacher should adjust the lesson plan. “Education is about the development of human beings. For that purpose, I don’t think AI or emotion recognition technology can be of much help,” said Zhao. “Human beings need a lot of different experiences to grow, not only the knowledge they get through instruction.”