Online games show the dangers of emotion recognition technology powered by artificial intelligence

in CryptoDog · 6 months ago

Technology designed to identify human emotions using machine learning algorithms is a huge industry, and its proponents claim it offers great value, from improving road safety to analyzing market research.

But critics say the technology not only raises privacy concerns but is also imprecise and racially biased.

A report by the Guardian stated that a team of researchers has developed a website, emojify.info, where the public can try out emotion recognition systems using their computer cameras.

One game challenges users to pull faces to fool the technology, while another explores how these systems can struggle to read facial expressions in context.

The researchers say their hope is to raise awareness of the technology and foster conversations about its use.


"It's a form of facial recognition, but it goes further because instead of just identifying people, it claims to read our emotions and our inner feelings from our faces," said Dr. Alexa Hagerty, project leader and researcher at the University of Cambridge's Leverhulme Centre for the Future of Intelligence.

Facial recognition technology, which is often used to identify people, has come under intense scrutiny in recent years. Last year, the Equality and Human Rights Commission called for a halt to its use in mass screening, saying it could increase police discrimination and harm freedom of expression.

But Hagerty said many people were unaware of how common emotion recognition systems had become, noting that they were already in use in situations ranging from job hiring and customer insight work to airport security, and even in education to see whether students are engaged or doing their homework.

She said the technology is used all over the world, from Europe to the United States and China.

Fears

Taigusys, a company specializing in emotion recognition systems and headquartered in Shenzhen, China, says its technology has been deployed in settings ranging from care homes to prisons. Meanwhile, according to reports earlier this year, the Indian city of Lucknow is planning to use the technology to detect distress in women resulting from harassment, a move that has drawn criticism from digital rights organizations and others.

While Hagerty said that emotion recognition technology may have some potential benefits, these must be balanced against concerns about accuracy and racial bias, as well as whether the technology is the right tool for a particular job.

"We need to have a broader public conversation and deliberation about these technologies," she added.

The new project allows users to experiment with emotion recognition technology. The site notes that "no personal data is collected and all images are stored on your device."

In one game, users are invited to pull a series of faces to fake an emotion and see whether the system is fooled.

"The people who develop this technology say it reads emotions ... but in fact the system reads facial movement and then combines that with the assumption that those movements relate to emotions, for example that a smile means someone is happy," Hagerty said.

She adds that the matter is more complicated than that, as human experience shows it is possible to fake a smile. "This is what the game we created is trying to prove, to show you that you did not change your inner state and feelings six times; you only changed the way your face looks."
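Hagerty's point can be made concrete with a toy sketch. This is not any vendor's actual pipeline; the movement labels and the movement-to-emotion mapping below are invented purely for illustration. The key idea is that such a system only ever sees a detected movement and a fixed assumption about what that movement means, never the person's inner state:

```python
# Toy illustration of the inference step Hagerty describes:
# detect a facial movement, then map it to an emotion via a
# fixed assumption (e.g. smile -> happiness). All labels and
# mappings here are invented for this sketch.

# Assumed mapping from detected facial movement to inferred emotion
MOVEMENT_TO_EMOTION = {
    "smile": "happiness",
    "frown": "sadness",
    "raised_brows": "surprise",
    "furrowed_brows": "anger",
}

def infer_emotion(detected_movement: str) -> str:
    """Return the emotion the system *assumes* the movement expresses."""
    return MOVEMENT_TO_EMOTION.get(detected_movement, "neutral")

# The weakness the game exposes: a deliberately faked smile and a
# genuine one produce the same detected movement, so the system
# infers "happiness" either way.
print(infer_emotion("smile"))
```

Because the mapping is fixed, pulling six different faces changes only the detected movement, not anything about the person's actual feelings, which is exactly what the game demonstrates.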

Vidushi Marda, a senior program officer at a human rights organization, said pressure was needed to stem the growing market for emotion recognition systems.

She added, "The use of emotion recognition technologies is extremely worrying because these systems are not only based on discriminatory and discredited science, but their use is fundamentally inconsistent with human rights."

Special thanks to

@booming01
@booming02
@booming03
@booming04

Your support is the key to my continuing to develop and move forward.