Alibaba's AI knows what you FEEL! Emotional AI with R1-Omni

- First AI with multimodal emotion recognition
- Improved emotional intelligence
- Uses reinforcement learning
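The "multimodal" part of the announcement above refers to combining signals such as audio and video when classifying emotions. As a purely illustrative sketch (not R1-Omni's actual method; all labels, weights, and scores below are made up for demonstration), a simple late-fusion approach averages per-modality probability vectors and picks the highest-scoring emotion:

```python
# Hypothetical late-fusion sketch: combine per-modality emotion scores
# into a single prediction. Labels and weights are illustrative only.

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def fuse_predictions(audio_scores, video_scores, audio_weight=0.4):
    """Weighted average of two per-modality probability vectors."""
    video_weight = 1.0 - audio_weight
    return [audio_weight * a + video_weight * v
            for a, v in zip(audio_scores, video_scores)]

def predict_emotion(audio_scores, video_scores):
    """Return the emotion label with the highest fused score."""
    fused = fuse_predictions(audio_scores, video_scores)
    return EMOTIONS[max(range(len(fused)), key=fused.__getitem__)]

if __name__ == "__main__":
    # Both modality models lean toward "sad" in this toy example:
    audio = [0.1, 0.6, 0.2, 0.1]
    video = [0.2, 0.5, 0.1, 0.2]
    print(predict_emotion(audio, video))  # sad
```

Real systems replace the fixed weights with learned fusion layers; the reinforcement-learning angle mentioned in the post (RLVR) concerns how such a model is trained, not this inference step.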

#ai #ki #artificialintelligence #kuenstlicheintelligenz #Alibaba #EmotionRecognition #ReinforcementLearning #r1omni

Read now and follow!

https://kinews24.de/alibaba-r1-omni/

Alibaba R1-Omni: A breakthrough in emotional AI

Alibaba R1-Omni: A breakthrough in emotional AI! Revolutionary multimodal emotion recognition with RLVR. Application areas, benchmarks & future perspectives.

KINEWS24.de

The EU’s #AIAct prohibitions are now in effect! But gaps remain. Learn more: https://algorithmwatch.org/en/ai-act-prohibitions-february-2025/

🚫 Now banned in the EU: #ManipulativeAI, AI that exploits people's vulnerabilities, #SocialScoring, #Scraping of facial images on the internet, Live #FaceRecognition in Public Spaces. Others are partially banned, like #PredictivePolicing, #EmotionRecognition, and more.

As of February 2025: Harmful AI applications prohibited in the EU - AlgorithmWatch

Bans under the EU AI Act become applicable now. Certain risky AI systems which have already been trialed or used in everyday life are from now on – at least partially – prohibited.

AlgorithmWatch

"On December 17th, EPIC filed comments with the Dutch data protection authority, Autoriteit Persoonsgegevens, regarding use of and prohibitions on emotion recognition surveillance. The EU AI Act prohibits the development, deployment, and placement on the EU market of emotion recognition systems intended for use in the workplace and in educational institutions, with limited exceptions where the algorithm is intended for certain medical or safety reasons. Autoriteit Persoonsgegevens opened a consultation requesting feedback on the implementation of this prohibition.

EPIC’s comments discuss some of the common types of emotion recognition, the harms of emotion recognition systems and their inefficacy, common uses and risks in the education and workplace settings, and recommendations. EPIC urges Autoriteit Persoonsgegevens to define emotion recognition systems broadly and either allow for no exemptions or construe the medical and safety exemption narrowly. This recommendation is based on the complete lack of scientific evidence that these systems work and the many ways they violate the rights to privacy, data protection, freedom from discrimination, and various other rights enshrined in the EU Charter of Fundamental Rights and other EU regulations."

https://epic.org/epic-urges-dutch-data-protection-authority-to-protect-students-and-employees-from-the-harms-of-emotion-recognition/

#EU #Netherlands #Surveillance #DataProtection #Biometrics #EmotionRecognition

#EmotionRecognition #AI #PseudoScience: "For his part, Keyes is convinced technologists will never get that far. Developing an AI capable of parsing all the many nuances of human emotion, they say, would effectively mean cracking the problem of general AI, probably just after humanity has developed faster-than-light travel and begun settling distant solar systems.

Instead, in Keyes’ view, we’ve been left with a middling technology: one that demonstrates enough capability in applications with low-enough stakes to convince the right people to invest in further development.

It is this misunderstanding that seems to lie at the root of our inflated expectations of emotion recognition. “It works just well enough to be plausible, just well enough to be given an extra length of rope,” says Keyes, “and just poorly enough that it will hang us with that length of rope.”"

https://techmonitor.ai/technology/emerging-technology/emotion-recognition

UK train stations trial Amazon emotion recognition on passengers

Amazon-powered AI cameras are now being used to monitor and analyze passengers' emotions via a network of smart CCTV cameras.

BiometricUpdate.com

Writing his #MastersThesis at the Massachusetts Institute of Technology (MIT) was not a given for our Master's student Nektarios Totikos. Learn more about what got him so far and his research on #EmotionRecognition: http://go.tum.de/325709

#careleaver #AIresearch


"Education can work wonders"

Studying at TUM and writing his Master's thesis at MIT was not a given for Nektarios Totikos, who spent part of his childhood and youth in youth welfare facilities. What got him so far? Intelligence, hard work and a great deal of perseverance.

#EU #AI #AIAct #Biometrics #EmotionRecognition: "Despite these concerns, emotion recognition has not been included in Article 5 of the AI Act which defines Prohibited Artificial Intelligence Practices. Instead, emotion recognition is defined as a high-risk AI system and its use is prohibited at the workplace and in educational institutions, with exceptions for safety and medical reasons. Deployers of emotion recognition systems are not required to inform people about the operation of such systems if the system is used to “detect, prevent and investigate criminal offenses,” according to the legislation.

The language of the AI Act adds to the confusion on the legality of emotion recognition: The law states that the fact that an AI system is classified as a high-risk AI system “should not be interpreted as indicating that the use of the system is lawful under other acts of Union law or under national law compatible with Union law.”

At the beginning of February, European lawmakers reached an agreement on the technical details of the AI Act, opening the path to European Parliament committees’ approval in April. Although significant opposition is not expected, lawmakers may still introduce changes that could slow down the timeline of its implementation."

https://www.biometricupdate.com/202402/is-the-eu-ai-act-leaving-a-backdoor-for-emotion-recognition

Is the EU AI Act leaving a backdoor for emotion recognition?

The AI Act still leaves the door open for its use by law enforcement and migration officers, which could lead to potential rights abuses.

BiometricUpdate.com