Automatically detecting emotions from #EEGs is expected to become a major task of #BCIs. However, inaccuracies, high error rates and a lack of stability still preoccupy #research. A research group has now succeeded in using deep convolutional neural networks #DCNNs to classify positive and negative #emotions from EEG signals with over 98% accuracy by having volunteers listen to different pieces of music.
#Bioelectronics

https://mdpi.com/2079-9292/12/10/2216

Automatic Emotion Recognition from EEG Signals Using a Combination of Type-2 Fuzzy and Deep Convolutional Networks

Emotions are an inextricably linked component of human life. Automatic emotion recognition can be widely used in brain–computer interfaces. This study presents a new model for automatic emotion recognition from electroencephalography signals based on a combination of deep learning and fuzzy networks, which can recognize two different emotions: positive and negative. To accomplish this, a standard database based on musical stimulation using EEG signals was compiled. Then, to deal with the phenomenon of overfitting, generative adversarial networks were used to augment the data. The generative adversarial network output is fed into the proposed model, which is based on improved deep convolutional networks with type-2 fuzzy activation functions. Finally, two positive and two negative emotions were classified into two separate classes. In the classification of the two classes, the proposed model achieved an accuracy of more than 98%. In addition, when compared to previous studies, the proposed model performed well and can be used in future brain–computer interface applications.

MDPI
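
The core novelty named in the abstract is replacing a crisp activation function with an interval type-2 fuzzy one, whose lower and upper membership functions bound a "footprint of uncertainty". Below is a minimal, illustrative sketch of that idea in plain NumPy; the sigmoid form, the slope parameters, and the midpoint type reduction are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def sigmoid(x, slope):
    """Crisp sigmoid membership function with a given slope."""
    return 1.0 / (1.0 + np.exp(-slope * x))

def it2_fuzzy_activation(x, slope_lower=0.5, slope_upper=2.0):
    """Interval type-2 fuzzy activation (illustrative sketch).

    Two sigmoids with different slopes act as the lower and upper
    membership functions; their midpoint is a simplified type
    reduction of the interval back to a single crisp output.
    """
    lower = sigmoid(x, slope_lower)   # lower membership bound
    upper = sigmoid(x, slope_upper)   # upper membership bound
    return 0.5 * (lower + upper)      # midpoint type reduction

# The output stays in (0, 1) and passes through 0.5 at x = 0,
# so it can drop into a conv net wherever a sigmoid would go.
out = it2_fuzzy_activation(np.array([-3.0, 0.0, 3.0]))
```

The spread between the two slopes models the uncertainty the fuzzy layer is meant to absorb; collapsing them to one value recovers an ordinary sigmoid.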

Deep #convolutional #neural networks (#DCNNs) don't see objects the way #humans do, via configural shape perception, and that could be dangerous in real-world #AI applications
#ArtificialIntelligence #Neuroscience #sflorg
https://www.sflorg.com/2022/09/ai09172201.html
Even the smartest AI models don’t match human visual processing

York University study highlights how deep-network models take potentially dangerous ‘shortcuts’ in solving complex recognition tasks