
Emotion Tracking AI in Video Conferencing


The pandemic has made video conferencing far more common around the world. Unable to read body language through a screen, salespeople struggle to gauge how receptive potential customers are to their products and services. Companies have started using technology that analyzes people's moods during calls, and according to Protocol, Zoom plans to offer the same kind of emotion tracking AI in its video conferencing platform.


But emotion tracking AI in video conferencing also has serious drawbacks. Critics argue that the technology cannot make accurate predictions and treats people unfairly. They complain loudly that it can be "discriminatory, manipulative, potentially dangerous, and based on the assumption that all people use the same facial expressions, voice patterns, and body language." Following reports that Zoom is considering incorporating artificial intelligence into its virtual meeting software to detect and analyze people's moods and emotions, human rights organizations recently announced that they had sent a joint letter to the company, claiming that the technology is flawed and could threaten fundamental rights.


Opposing Views on Emotion Tracking AI


Nonprofit digital rights and human rights organizations wrote an open letter to Zoom asking the company to stop exploring emotion tracking AI that could analyze emotions on its video conferencing platform. The groups wrote the letter in response to a Protocol report saying that Zoom is actively exploring how to incorporate emotion tracking AI into its video conferences in the future. The effort is widely believed to be part of a larger trend of companies starting to use artificial intelligence to detect a potential customer's emotional state during sales calls.


The groups also pointed out that the technology, like facial recognition, is inherently biased and racist. They said that by including the feature, Zoom would discriminate against certain ethnicities and against people with disabilities. The feature could also be used to punish students or employees who display the "wrong" emotion. In 2021, a project led by Cambridge University researcher Alexa Hagerty demonstrated the limits of emotion recognition AIs and how easy it is to fool them. Previous studies have also shown that emotion recognition programs exhibit racial bias and have trouble reading Black faces.
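To make that criticism concrete, here is a minimal, purely illustrative Python sketch of the kind of fixed feature-to-emotion mapping such systems ultimately encode. The FaceFeatures fields, thresholds, and rules below are hypothetical assumptions invented for illustration, not any vendor's actual method; they show how a hard-coded mapping bakes in the assumption that everyone expresses emotion with the same facial movements, and how easily a posed expression fools it.

```python
# Hypothetical sketch: a fixed lookup from measured facial features to an
# emotion label. All feature names and thresholds are illustrative only.

from dataclasses import dataclass


@dataclass
class FaceFeatures:
    mouth_curvature: float  # > 0 means mouth corners raised ("smile")
    brow_raise: float       # > 0 means eyebrows lifted
    eye_openness: float     # 0.0 (closed) to 1.0 (wide open)


def classify_emotion(f: FaceFeatures) -> str:
    """Map facial geometry to an emotion label using fixed thresholds.

    The hard-coded rules assume every person expresses emotion with the
    same facial movements -- precisely the assumption the open letter to
    Zoom calls discriminatory and unreliable.
    """
    if f.mouth_curvature > 0.3:
        return "happy"
    if f.brow_raise > 0.5 and f.eye_openness > 0.8:
        return "surprised"
    if f.mouth_curvature < -0.3:
        return "sad"
    return "neutral"


# A deliberately posed face fools the classifier: smiling while unhappy
# is still scored as "happy", illustrating how easily such systems are misled.
print(classify_emotion(FaceFeatures(mouth_curvature=0.6, brow_raise=0.0, eye_openness=0.5)))
```

A real product would replace these rules with a trained classifier, but the basic design, a fixed mapping from facial appearance to an emotion label, and its vulnerability to posed or culturally varied expressions, is the same.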


The groups concluded the letter by citing Zoom's earlier decision to cancel the rollout of face-tracking features, describing this as another opportunity for the company to do the right thing by its users. The human rights organizations hope their call will pressure Zoom to abandon its plans.
