DATE: September 26, 2025 at 03:54PM
SOURCE: HEALTHCARE INFO SECURITY

Direct article link at end of text block below.

Senate Bill Seeks #Privacy Protection for Brain Wave Data: #MINDAct Asks @FTC to Study Exploitation Risks for #NeuralData https://t.co/tDBPHof7nH @SenSchumer @MariaCantwell @EdMarkey

Here are the URLs found in the article text:

https://t.co/tDBPHof7nH

Articles can be found by scrolling down the page at https://www.healthcareinfosecurity.com/ under the title "Latest"

-------------------------------------------------

Private, vetted email list for mental health professionals: https://www.clinicians-exchange.org

Healthcare security & privacy posts not related to IT or infosec are at @HIPAABot. Even so, they mix in some infosec with the legal & regulatory information.

-------------------------------------------------

#security #healthcare #doctors #itsecurity #hacking #doxxing #psychotherapy #securitynews #psychotherapist #mentalhealth #psychiatry #hospital #socialwork #datasecurity #webbeacons #cookies #HIPAA #privacy #datanalytics #healthcaresecurity #healthitsecurity #patientrecords @infosec #telehealth #netneutrality #socialengineering

A new law in California protects consumers’ brain data. Some think it doesn’t go far enough

The bill defines neural data as “information that is generated by measuring the activity of a consumer’s central or peripheral nervous system, and that is not inferred from nonneural information.” In other words, it covers data collected directly from a person’s brain or nerves, rather than information inferred from other kinds of data.

#CCPA #BrainData #MentalPrivacy #NeuralData #privacy #surveillance #data #bigdata #biometrics #technology #tech

https://www.technologyreview.com/2024/10/04/1104972/law-california-protects-brain-data-doesnt-go-far-enough/

A new law in California protects consumers’ brain data. Some think it doesn’t go far enough.

Tech companies collect brain data that could be used to infer our thoughts—so it’s vital we get legal protections right.

MIT Technology Review

#California Passes Law Protecting Consumer #Brain Data

The state extended its current personal #privacy law to include the neural data increasingly coveted by technology companies.
#neuraldata

https://www.nytimes.com/2024/09/29/science/california-neurorights-tech-law.html

California Passes Law Protecting Consumer Brain Data

The state extended its current personal privacy law to include the neural data increasingly coveted by technology companies.

The New York Times

Your Brain Waves Are Up for Sale. A New Law Wants to Change That.

In a first, a #Colorado law extends #privacy rights to the #NeuralData increasingly coveted by technology companies.

#Brain #Medicine

Gift article: https://www.nytimes.com/2024/04/17/science/colorado-brain-data-privacy.html?unlocked_article_code=1.lE0._Fwh.Tt-a8B1EVTS1

Colorado Bill Aims to Protect Consumer Brain Data

In a first, a Colorado law extends privacy rights to the neural data increasingly coveted by technology companies.

The New York Times

Endless feedback - by Rob Horning - Internal exile https://robhorning.substack.com/p/endless-feedback
> Ideally one would be compelled to wear some sort of #brainwave monitor that would broadcast #neuraldata directly to #corporateservers somewhere, as suggested by #MarkZuckerberg’s enthusiasm for brain-reading machines.

#surveillance #surveillancecapitalism

--
No wonder the #fediverse is against federating with #Meta; this is an essential reminder of what they are about.

Endless feedback

In 2013, Facebook data scientist Adam Kramer and intern Sauvik Das published “Self-Censorship on Facebook,” a paper that went on to spark some controversy. Not only did the authors repeatedly make the confounding assertion that “the act of preventing oneself from speaking” — i.e., starting to write a Facebook post but then not posting it — was “censorship,” a kind of self-imposed masochistic tyranny rather than privacy, reticence, or good sense; their methodology also revealed that Facebook retained the data that users typed but never posted. Anything entered into a box on Facebook was fair game for the company to do whatever it wanted with, regardless of whether the user ultimately intended to share it with others. Thus while one’s privacy settings might apply to other users, they didn’t apply to the platform itself, which tracked and stored whatever user behavior it could, wherever it could. Nothing a user did could get anything removed from that database, and their intuitions about what it contained were likely to be incomplete.

Internal exile
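
The excerpt above turns on a technical point: text typed into a composer box can be transmitted to a platform's servers as it is typed, whether or not the user ever clicks "Post." Below is a minimal sketch, in browser TypeScript, of how a page could do this. The element ID, telemetry endpoint, and timing are hypothetical assumptions for illustration, not Facebook's actual implementation.

```typescript
// Hypothetical sketch: capture draft text as the user types, even if never posted.
// "#post-composer" and the endpoint URL are made up for this example.
const composer = document.querySelector<HTMLTextAreaElement>("#post-composer");
const TELEMETRY_ENDPOINT = "https://example.com/telemetry/draft"; // hypothetical

if (composer) {
  let debounceTimer: number | undefined;

  composer.addEventListener("input", () => {
    // Debounce so a snapshot is sent shortly after the user pauses typing,
    // regardless of whether they ever submit the post.
    window.clearTimeout(debounceTimer);
    debounceTimer = window.setTimeout(() => {
      // sendBeacon queues the payload even if the user navigates away,
      // so abandoning the draft does not prevent it from being captured.
      navigator.sendBeacon(
        TELEMETRY_ENDPOINT,
        JSON.stringify({ draft: composer.value, capturedAt: Date.now() })
      );
    }, 1500);
  });
}
```

Because the snapshot is sent from the input handler rather than the submit handler, deleting the draft or closing the tab afterward changes nothing server-side, which is why a user's intuitions about what the platform holds tend to be incomplete.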