These brain implants speak your mind — even when you don't want to.

Surgically implanted devices that allow paralyzed people to speak can also eavesdrop on their inner monologue.

That's the conclusion of a study of brain-computer interfaces (BCIs) in the journal Cell.

https://www.npr.org/sections/shots-health-news/2025/08/20/nx-s1-5506334/brain-computer-implant-speak-inner-speech-mind?utm_source=globalmuseum #globalmuseum #brain #BCIs

Brain-Computer Interfaces #BCIs to boost human + #AI synergy 🤖

💰 Market 🔛 $2B in 2023 → $10B by 2030 (25%+ annual growth)

🩺 2024 🔛 Restoring speech & mobility

🧠 2040 🔛 Memory backups & cognitive boosts

BCIs + AI = unlocking human potential 🚀 v/ @MikeQuindazzi #CES2025
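The growth figures in the post above can be sanity-checked with a quick compound-annual-growth-rate (CAGR) calculation; the formula is standard, and the dollar figures are taken from the post itself:

```python
# Implied CAGR for the quoted figures: $2B (2023) growing to
# $10B (2030), i.e. a 5x increase over 7 years.
start, end, years = 2.0, 10.0, 7
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 26% per year, consistent with "25%+"
```

So the "25%+ annual growth" claim is arithmetically consistent with the 2023 and 2030 market-size estimates.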

New publication out in #CLSR: I delve into consumer #neurotechnologies and #BCIs within EU product safety law. The paper examines the sector-specific Medical Devices Regulation (#MDR) and its amendments, complemented by considerations on the General Product Safety Regulation (#GPSR). Overall, the findings indicate that EU legislators have been diligent in adapting and modernising the relevant EU legal frameworks within the context of EU product safety law. https://doi.org/10.1016/j.clsr.2024.105945

I'm happy to have signed this, and even more happy that the BBC has decided I'm an expert :D

https://www.bbc.co.uk/news/technology-66218709

#AI #artificialintelligence #BCIS #bbc #bbcnews

More than 1,300 experts call AI a force for good

An open letter organised by the UK professional body for IT says AI is not a threat to humanity.

BBC News

Automatically detecting emotions from #EEGs is expected to become a major task of #BCIs. However, inaccuracies, high error rates and a lack of stability still challenge #research. A research group has now succeeded in using Deep Convolutional Neural Networks #DCNNs to classify positive, neutral and negative #emotions from EEG signals with 96% accuracy by having volunteers listen to different pieces of music.
#Bioelectronics

https://www.mdpi.com/2079-9292/12/10/2216

Automatic Emotion Recognition from EEG Signals Using a Combination of Type-2 Fuzzy and Deep Convolutional Networks

Emotions are an inextricably linked component of human life. Automatic emotion recognition can be widely used in brain–computer interfaces. This study presents a new model for automatic emotion recognition from electroencephalography signals based on a combination of deep learning and fuzzy networks, which can recognize two different emotions: positive and negative. To accomplish this, a standard database based on musical stimulation using EEG signals was compiled. Then, to deal with the phenomenon of overfitting, generative adversarial networks were used to augment the data. The generative adversarial network output is fed into the proposed model, which is based on improved deep convolutional networks with type-2 fuzzy activation functions. Finally, the two separate classes of positive and negative emotions were classified. In the classification of the two classes, the proposed model achieved an accuracy of more than 98%. In addition, when compared to previous studies, the proposed model performed well and can be used in future brain–computer interface applications.

MDPI
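The type-2 fuzzy activation mentioned in the abstract can be sketched in a few lines. This is an illustrative interval type-2 Gaussian membership with made-up parameters and a simple average as the type-reduction step, not the paper's actual implementation:

```python
import numpy as np

def it2_fuzzy_activation(x, center=0.0, sigma_lower=0.5, sigma_upper=1.5):
    # Interval type-2 Gaussian membership: two type-1 Gaussians with
    # different widths bound the "footprint of uncertainty".
    lower = np.exp(-((x - center) ** 2) / (2 * sigma_lower ** 2))
    upper = np.exp(-((x - center) ** 2) / (2 * sigma_upper ** 2))
    # Simple type reduction: average the lower and upper memberships.
    return (lower + upper) / 2.0

# A feature exactly at the membership center activates fully;
# activation decays smoothly with distance from the center.
features = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
print(it2_fuzzy_activation(features))
```

The appeal of such activations is that the gap between the upper and lower membership functions models uncertainty in the input, which is one motivation the abstract gives for combining fuzzy logic with deep convolutional networks.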
