Human #echolocation works step by step www.sciencenews.org/article/huma... by @[email protected], "A study reveals how individual tongue clicks and their echoes contribute to object sensing"

Human echolocation works step by step

Experts in echolocation use multiple clicks and echoes to sense objects, offering insight into how the brain builds perception.

Science News
Neural and behavioral correlates of evidence accumulation in human click-based echolocation www.eneuro.org/content/earl... by @[email protected] @[email protected]

Neural and behavioral correlates of evidence accumulation in human click-based echolocation

Echolocation enables blind individuals to perceive and navigate their environment by emitting clicks and interpreting their returning echoes. While expert blind echolocators demonstrate remarkable spatial accuracy, the behavioral and neural mechanisms by which spatial echoacoustic cues are combined across repeated samples remain less explored. Here, we investigated the temporal dynamics of spatial information processing in human click-based echolocation using EEG. Blind expert echolocators (n=4, all males) and novice sighted participants (n=21, 12 males) localized virtual spatialized echoes derived from realistic synthesized mouth clicks, presented in trains of 2–11 clicks. Behavioral results showed that blind expert echolocators significantly outperformed sighted controls in spatial localization. For these experts, localization thresholds decreased as the number of clicks increased, a pattern consistent with cumulative integration of spatial information across repeated samples. EEG decoding analyses revealed reliable neural discrimination of echo laterality from the first click that correlated with overall spatial localization performance. Across successive clicks, neural responses evolved systematically, reflecting sequence-position–dependent changes in neural dynamics. EEG trial-level modeling further allowed us to distinguish accumulation-consistent decision readout policies from alternative repetition-based accounts, revealing individual differences in decision policies among expert echolocators. These findings provide, to our knowledge, the first fine-grained account of the temporal neural dynamics supporting human click-based echolocation, directly linked to behavioral performance across multiple samples. They reveal how, in expert echolocators who successfully performed the task, successive echoes are progressively integrated into coherent spatial representations, demonstrating adaptive sensory processing in the absence of vision.

Significance Statement

Remarkably, some blind individuals navigate the world using echolocation, producing mouth clicks and interpreting returning echoes to perceive their surroundings. Yet how the brain combines successive echoes to build spatial representations remains poorly understood. Here, we show that expert blind echolocators localized echoes more accurately than sighted novices and that their performance improved as additional clicks provided more spatial information over time. Brain recordings revealed that neural activity distinguished sound location from the earliest echoes and evolved systematically across click sequences in parallel with behavioral improvements. These findings provide a detailed account of how the human brain transforms repeated acoustic information into stable spatial representations, supporting navigation in the absence of vision.

eNeuro
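
For intuition only, not from the paper: the abstract's finding that localization thresholds fall as clicks accumulate is what ideal cumulative integration predicts, since averaging N independent noisy laterality estimates shrinks the effective noise roughly as 1/sqrt(N). A minimal Python sketch of that prediction, with every value (azimuth, noise level, trial count) an illustrative assumption:

import numpy as np

# Toy model (not the paper's analysis): each click yields an independent,
# noisy estimate of echo laterality; an idealized observer averages the
# estimates before judging left vs. right, so accuracy climbs with clicks.
rng = np.random.default_rng(0)
true_azimuth_deg = 5.0        # assumed offset of the echo from the midline
per_click_noise_deg = 10.0    # assumed noise on a single-click estimate
n_trials = 100_000

for n_clicks in (2, 5, 11):  # train lengths spanning the study's 2-11 range
    samples = true_azimuth_deg + per_click_noise_deg * rng.standard_normal((n_trials, n_clicks))
    pooled = samples.mean(axis=1)       # cumulative integration across clicks
    accuracy = (pooled > 0).mean()      # correct = judged toward the true side
    print(f"{n_clicks:2d} clicks: accuracy {accuracy:.3f}, "
          f"effective noise ~{per_click_noise_deg / np.sqrt(n_clicks):.1f} deg")

Under this idealized readout, going from 2 to 11 clicks cuts the effective noise by more than half; the paper's trial-level modeling is what separates such accumulation from the repetition-based alternatives it mentions.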
The brain stacks sound information to navigate the dark neurosciencenews.com/human-echolo... Each #echolocation "click acts like a brushstroke, building a high-resolution mental representation of the surroundings in real-time."

The Brain Stacks Sound Information to Navigate the Dark - Neuroscience News

Neuroscience News provides research news for neuroscience, neurology, psychology, AI, brain science, mental health, robotics and cognitive sciences.

Neuroscience News
(YouTube) Might the new AI depth view option of The vOICe for Android bring it closer to a kind of "visual echolocation" that is far easier to master than interpreting the regular camera soundscapes? www.youtube.com/watch?v=gs8I...

vOICe Depth test run: live depth mapping for The vOICe vision BCI for the blind

YouTube
Science News highlights echolocation study by Santani Teng and Haydée García-Lázaro www.ski.org/news/science... by @[email protected]

Science News Highlights Echolocation Study by Santani Teng and Haydée García-Lázaro | Smith-Kettlewell Eye Research

Smith-Kettlewell Eye Research
New research reveals the superpower behind how blind 'echolocators' navigate using sound thedebrief.org/new-research... by @[email protected] on #echolocation

New Research Reveals the Superpower Behind How Blind 'Echolocators' Navigate Using Sound

Science, Tech and Defense for the Rebelliously Curious.

The Debrief
Keep in mind that The vOICe vision BCI is not about echolocation, as it conveys truly visual information from a camera through sound www.seeingwithsound.com/webvoice/web...

The vOICe web app

Platform-independent progressive web app version of The vOICe vision BCI.

The vOICe vision BCI
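
To make the contrast with echolocation concrete: The vOICe's documented image-to-sound mapping scans the camera image left to right, rendering vertical position as pitch and brightness as loudness. Below is a toy column-scan sonification in that spirit, not the app's actual implementation; the grid size, frequency range, and sample rate are arbitrary assumptions.

import numpy as np

# Toy column-scan sonification in the spirit of The vOICe's documented
# mapping (left-to-right scan, higher rows = higher pitch, brighter
# pixels = louder). All constants here are arbitrary assumptions.
SAMPLE_RATE = 22_050
COLUMN_SECONDS = 0.02  # assumed duration of one scanned image column

def sonify(image):
    """image: 2D float array in [0, 1]; row 0 is the top of the picture."""
    n_rows, n_cols = image.shape
    freqs = np.geomspace(4000.0, 400.0, n_rows)     # top row = highest pitch
    t = np.arange(int(SAMPLE_RATE * COLUMN_SECONDS)) / SAMPLE_RATE
    tones = np.sin(2 * np.pi * np.outer(freqs, t))  # one sine per image row
    cols = [(image[:, c, None] * tones).sum(axis=0) / n_rows
            for c in range(n_cols)]                 # brightness sets loudness
    return np.concatenate(cols)

audio = sonify(np.eye(16))  # a bright diagonal sweeps from high to low pitch

Echolocation extracts spatial structure from echoes of self-generated sound; a mapping like this instead carries camera pixels through sound directly, which is the distinction the post is drawing.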