https://medicalxpress.com/news/2025-02-brain-imaging-text-mindllm.html

…people have different brain structures that never quite match when aligned to a standardized #brainatlas…different input dimensions are required for each subject.

…neuroscience-informed activity mapping within the #fMRI encoder…allows the system to accommodate these varying input shapes across subjects.

By separating a voxel's functional information from its raw fMRI value, the model can leverage pre-existing knowledge from #neuroscience #research.
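A minimal sketch of how a fixed set of learned queries can pool a variable number of voxel vectors into a fixed-size representation, which is one generic way to accommodate different input dimensions per subject. This is an illustration only, not MindLLM's actual architecture: the dimensions, the `queries` parameter, and the `encode` function are all hypothetical, and real implementations would use learned weights in a deep-learning framework.

```python
import math
import random

random.seed(0)
D = 4       # per-voxel feature dimension (hypothetical)
NUM_Q = 2   # fixed number of learned queries -> fixed output shape

def rand_vec(d):
    return [random.uniform(-1, 1) for _ in range(d)]

# Stand-ins for learned query vectors, shared across subjects.
queries = [rand_vec(D) for _ in range(NUM_Q)]

def encode(voxel_feats):
    """Cross-attention pooling: any number of voxel vectors -> NUM_Q x D.

    Each query computes softmax attention weights over all voxels,
    so the output shape never depends on how many voxels a subject has.
    """
    out = []
    for q in queries:
        # Scaled dot-product scores between this query and every voxel.
        scores = [sum(qi * vi for qi, vi in zip(q, v)) / math.sqrt(D)
                  for v in voxel_feats]
        # Numerically stable softmax over the voxel axis.
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        # Weighted average of voxel features for this query.
        out.append([sum(w * v[i] for w, v in zip(weights, voxel_feats))
                    for i in range(D)])
    return out

# Two "subjects" with different voxel counts map to the same output shape.
subj_a = [rand_vec(D) for _ in range(5)]   # 5 voxels
subj_b = [rand_vec(D) for _ in range(9)]   # 9 voxels
print(len(encode(subj_a)), len(encode(subj_a)[0]))  # 2 4
print(len(encode(subj_b)), len(encode(subj_b)[0]))  # 2 4
```

The point of the sketch: because the queries, not the voxels, determine the output shape, the same encoder can ingest subjects whose brains align differently to a standard atlas.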

#MindLLM

Direct translation of brain imaging to text with MindLLM

Researchers at Yale University, Dartmouth College, and the University of Cambridge have developed MindLLM, a subject-agnostic model for decoding functional magnetic resonance imaging (fMRI) signals into text.

Medical Xpress