Andreas Horn

473 Followers
173 Following
703 Posts
Neuroscientist at Harvard Medical School, BWH, MGH & Charité working on network neuromodulation. PI of the Netstim Lab. Author of @leaddbs and Stimulating Brains
Website: http://www.andreas-horn.de
Scholar: https://scholar.google.com/citations?user=1jF_5-0AAAAJ&hl=en
ResearchGate: https://www.researchgate.net/profile/Andreas-Horn
RT @RebeccaRHelm
OMG it literally took someone SWIMMING FROM HAWAII TO CALIFORNIA to discover this, but wow did we find something shocking in the Great Pacific Garbage Patch... [a thread 🧵]…
New study: https://plos.io/3LXY3CC
High concentrations of floating neustonic life in the plastic-rich North Pacific Garbage Patch

Floating life (neuston) is a core component of the ocean surface food web, but the Sargasso Sea in the North Atlantic is the only known region of high neustonic abundance. This study reveals high densities of floating life in the plastic-rich Great Pacific Garbage Patch, suggesting that this area could be an important marine habitat.

RT @boeslab
We see some strange and sometimes paradoxical effects of lesions to the default mode network.

A thread of some of the curious observations….

RT @alex_ander
In the latest paper from my lab, @jerryptang showed that we can decode language that a person is hearing (or even just thinking) from fMRI responses. https://www.nature.com/articles/s41593-023-01304-9
Semantic reconstruction of continuous language from non-invasive brain recordings | Nature Neuroscience

A brain–computer interface that decodes continuous language from non-invasive recordings would have many scientific and practical applications. Currently, however, non-invasive language decoders can only identify stimuli from among a small set of words or phrases. Here we introduce a non-invasive decoder that reconstructs continuous language from cortical semantic representations recorded using functional magnetic resonance imaging (fMRI). Given novel brain recordings, this decoder generates intelligible word sequences that recover the meaning of perceived speech, imagined speech and even silent videos, demonstrating that a single decoder can be applied to a range of tasks. We tested the decoder across cortex and found that continuous language can be separately decoded from multiple regions. As brain–computer interfaces should respect mental privacy, we tested whether successful decoding requires subject cooperation and found that subject cooperation is required both to train and to apply the decoder. Our findings demonstrate the viability of non-invasive language brain–computer interfaces. Tang et al. show that continuous language can be decoded from functional MRI recordings to recover the meaning of perceived and imagined speech stimuli and silent videos and that this language decoding requires subject cooperation.

RT @zixiao_yin
Excited to share our new publication in #NeurobiologyofDisease, where we looked at pallidal activity during sleep and its utility for sleep decoding in patients with dystonia, Huntington's, and Parkinson's disease.🫡🫡🫡
https://doi.org/10.1016/j.nbd.2023.106143
RT @BWHNeurology
Congratulations to @andreashorn_ et al. on the publication, “Insights and opportunities for deep brain stimulation as a brain circuit intervention,” in @TrendsNeuro #DeepBrainStimulation #DBS @Brain_Circuits
🧠🔗🔗👇
https://www.cell.com/trends/neurosciences/fulltext/S0166-2236(23)00083-8?_returnURL=https%3A%2F%2Flinkinghub.elsevier.com%2Fretrieve%2Fpii%2FS0166223623000838%3Fshowall%3Dtrue
RT @jmacshine
As we wander through our daily lives, our brains process billions of terabytes of information, much of which is highly ambiguous. Ever wonder how our brains resolve this uncertainty?
RT @mbeisen
If the head of one of the most prominent publishers in the world treats the idea that the public would be interested in reading research articles with such disdain, is it any wonder that articles aren't written for the public and the public doesn't read them regularly?

RT @GregoryZipfel
Neuromodulation for #depression and #blindness?? It’s coming…

Incredible William Coxe, MD Lecture by @drpouratian explaining where the field of neuromodulation is going in the years to come.

@UTSWNeurosurg
@WashUNeurosurg

RT @drpouratian
Thanks @GregoryZipfel for the invite to @WashUNeurosurg. Amazing #neurosurgery #team and residents. Looking forward to continued #collaboration. https://twitter.com/gregoryzipfel/status/1653744826356051969

RT @TrackingActions
🦓 Self-supervised multimodal ML is promising the next AI breakthrough - in our new work published in @Nature, we debut @cebraAI: for self-supervised hypothesis- and discovery-driven science.

📝 https://doi.org/10.1038/s41586-023-06031-6
💻 https://github.com/AdaptiveMotorControlLab/CEBRA
🦓 https://cebra.ai/
🧵⬇️