
Surveillance at work – emotion detection in video chat

Since the beginning of the pandemic, video calls have become part of everyday work for many people. This popularity has made the providers of meeting software very wealthy, but it has also created intense competitive pressure in the growing market. Accordingly, manufacturers are trying to find unique selling points for their programs. That includes market leader “Zoom”, which, according to a report in the magazine “Protocol”, plans to analyze participants’ emotions in the future.

Human rights organizations warn of the potential for misuse of such technologies, among them Daniel Leufer of “Access Now”. He considers the promise itself to be false: while artificial intelligence can read lips, accurately recognizing emotions has so far been impossible.

The problems with AI start with training

According to Leufer, the companies are selling a technology that promises, for example, deep insight into candidates during job interviews, which is cheaper than entrusting qualified people with analyzing them. But the impact on job seekers is not the biggest problem Leufer sees.

The artist and net activist Padeluun also gives the all-clear: he doubts that facial recognition software already works as well as some manufacturers claim.

There are already companies that offer this kind of monitoring for students in distance learning. It is intended to give teachers feedback on how the class responded to the teaching material. But this data could also flow into the assessment of those being observed, according to Leufer.

“We have seen in recent years that when AI systems were introduced in the context of teaching, they had a very negative impact,” says the digital expert. “They were used to assess teacher performance. Basically, I would say: The more complex the task that the AI is supposed to do, the less likely it is that it can actually do it.”

More caution is required

The systems already on the market promise to detect between seven and nine emotions. But the problem begins with the data used to train the AI: it often comes from actors with exaggerated, unnatural facial expressions.
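
To make this design concrete, here is a minimal, purely hypothetical sketch in Python of the assumption such products share: a classifier that maps every face onto a small, fixed list of “basic emotions”. The seven labels, the 128-dimensional face embedding and the random weights standing in for a trained model are all illustrative assumptions, not the code of any real product.

```python
import numpy as np

# The fixed label set is the core simplification the critics point to:
# every face must land somewhere in these seven buckets.
EMOTIONS = ["anger", "disgust", "fear", "happiness",
            "sadness", "surprise", "neutral"]

rng = np.random.default_rng(0)

# Stand-in for a trained model: random weights projecting a
# 128-dimensional face embedding onto the seven labels.
W = rng.normal(size=(128, len(EMOTIONS)))

def classify(face_embedding):
    # Softmax over the logits: the output is always a tidy
    # probability per emotion, however ambiguous the face.
    logits = face_embedding @ W
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return dict(zip(EMOTIONS, probs.round(3)))

# Even a random "face" yields a confident-looking distribution.
print(classify(rng.normal(size=128)))
```

Whatever goes in, a tidy probability for each of the seven labels comes out, whether or not the expression reflects what the person actually feels.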

In addition, collecting data on emotions raises further scientific and philosophical problems: “We have all been in situations in which we pretended to be happy about a birthday present that we did not like,” says Leufer. “So we know that we can also lie with emotions.”

There is a relationship between facial expressions and emotional state, Leufer concedes. “But it’s not that simple, it’s not a one-to-one relationship, and the theories these emotion-recognition systems are built on are extremely simple.”

Drive for innovation at all costs

Leufer sees such products as a kind of technology fetish among Silicon Valley venture capitalists: always on the lookout for the next task artificial intelligence could take over, they try to fit it into every corner of society in the name of productivity. This drive to innovate at any cost, he argues, often produces harmful technology that would not exist if we were more careful.
(hte)

