Gustav Fechner - Wikipedia

I didn't know that October 22 is "Fechner Day", celebrating Gustav Fechner, the founder of psychophysics

If you don't know who Gustav Fechner is, you can read more about him here: https://en.wikipedia.org/wiki/Gustav_Fechner

#VisionScience #psychophysics


If you're near Besançon and interested in auditory science or psychophysics, check out this upcoming series of talks at the FEMTO-ST Institute https://neuro-team-femto.github.io/revcor25/
I'll be giving one of them, looking forward to great discussions!
#AuditoryScience #Psychophysics
Mini-workshop on reverse correlation

Recent advances in auditory reverse correlation On the occasion of the PhD Defense of Aynaz Adl Zarrabi, the FEMTO Neuro group is hosting...

Neuro group at the Dept of Automation and Robotics, FEMTO-ST Institute

Worked on improving camera recordings in PsychoPy today. Some of the optimizations I made to the movie player are now being applied to the camera interface, with excellent results

#PsychoPy #psychology #psychophysics #neuroscience

I finally got to work on a prototype for the audio/visual synchronization tester I designed a few years back

This device is used to measure the lag between the presentation of a visual stimulus on screen and a sound that's associated with it

The design uses widely available components and will be open-sourced

#electronics #psychophysics #psychology #audio
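A lag of this kind can be estimated offline by comparing onset times in two simultaneously recorded channels, e.g., a photodiode taped to the screen and a microphone. A minimal sketch, assuming two NumPy arrays sampled at the same rate; the threshold-based onset detection is purely illustrative, not the device's actual method:

```python
import numpy as np

def onset_sample(signal, threshold_ratio=0.5):
    """Index of the first sample exceeding a fraction of the peak amplitude."""
    threshold = threshold_ratio * np.max(np.abs(signal))
    above = np.nonzero(np.abs(signal) >= threshold)[0]
    return int(above[0]) if above.size else None

def av_lag_ms(photodiode, microphone, fs):
    """Estimated audio-visual lag in milliseconds (positive = audio late)."""
    v = onset_sample(photodiode)
    a = onset_sample(microphone)
    return (a - v) * 1000.0 / fs

# Synthetic example: visual onset at sample 1000, audio at sample 1480 (48 kHz).
fs = 48000
photodiode = np.zeros(fs); photodiode[1000:] = 1.0
microphone = np.zeros(fs); microphone[1480:] = 1.0
print(av_lag_ms(photodiode, microphone, fs))  # → 10.0
```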

I bought loads of capacitors for work and I still don't have the right value for this project I'm starting :/

The project is a tone detector module for a device that measures audio/visual/input lag on computers used for psychophysics experiments. It also lets us benchmark software techniques that correct for synchronization errors

#psychophysics #electronics #psychology #science
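In software, a single-frequency tone detector is often implemented with the Goertzel algorithm, which measures signal energy at one target frequency far more cheaply than a full FFT. A minimal sketch for illustration only (not this module's actual circuit or firmware):

```python
import math

def goertzel_power(samples, fs, target_freq):
    """Energy at target_freq computed with the Goertzel algorithm."""
    n = len(samples)
    k = round(n * target_freq / fs)      # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:                    # second-order IIR recursion
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

# Detect a 1 kHz tone in a 10 ms buffer sampled at 48 kHz.
fs, f = 48000, 1000.0
tone = [math.sin(2 * math.pi * f * i / fs) for i in range(480)]
silence = [0.0] * 480
print(goertzel_power(tone, fs, f) > goertzel_power(silence, fs, f))  # → True
```

A hardware equivalent would typically be a bandpass filter followed by an envelope detector and comparator, which is where component values like those capacitors come in.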

Is there anything that neuroscientists can say (preferably do, study) to know/boost what the human brain can *learn* to do with sound-guided mental imagery as in https://youtube.com/watch?v=7jFJ_IXzwKI ? From #psychophysics to #neuroscience and back; mental imagery training in #BCI #NeuroTech
Can you visualize the soundscapes before their source image appears? Screen recording of The vOICe

Important: use stereo headphones! This sensory substitution training video is aimed at normally sighted (but perhaps also low vision) people, who have eyesigh...

YouTube
We are the *Laboratoire des Systèmes Perceptifs*, a research unit located at the Ecole Normale Supérieure in Paris and attached to @cnrs. We are interested in #visual and #auditory perception, from behavioural, computational, and neural perspectives. #intro #introduction
@psychology #psychophysics
Crossmodal correspondence of elevation/pitch and size/pitch is driven by real-world features https://link.springer.com/article/10.3758/s13414-024-02975-7 The vOICe sensory substitution uses pitch (tone frequency) to encode elevation; #crossmodal #multisensory #perception #psychophysics
Crossmodal correspondence of elevation/pitch and size/pitch is driven by real-world features - Attention, Perception, & Psychophysics

Crossmodal correspondences are consistent associations between sensory features from different modalities, with some theories suggesting they may either reflect environmental correlations or stem from innate neural structures. This study investigates this question by examining whether retinotopic or representational features of stimuli induce crossmodal congruency effects. Participants completed an auditory pitch discrimination task paired with visual stimuli varying in their sensory (retinotopic) or representational (scene integrated) nature, for both the elevation/pitch and size/pitch correspondences. Results show that only representational visual stimuli produced crossmodal congruency effects on pitch discrimination. These results support an environmental statistics hypothesis, suggesting crossmodal correspondences rely on real-world features rather than on sensory representations.

SpringerLink
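As noted above, The vOICe encodes elevation as tone frequency. The general idea of such a column-wise image-to-sound mapping can be sketched as follows; the frequency range and exponential spacing here are my own illustrative assumptions, not The vOICe's actual parameters:

```python
import math

def column_to_sound(column, fs=48000, duration=0.05, f_lo=500.0, f_hi=5000.0):
    """Encode one image column as a sound: row (elevation) -> tone frequency,
    pixel brightness (0..1) -> tone amplitude. Top row maps to highest pitch."""
    n_rows = len(column)
    n = int(fs * duration)
    out = [0.0] * n
    for row, brightness in enumerate(column):
        # Exponential frequency spacing from high (top row) to low (bottom row).
        frac = row / max(n_rows - 1, 1)
        freq = f_hi * (f_lo / f_hi) ** frac
        for i in range(n):
            out[i] += brightness * math.sin(2 * math.pi * freq * i / fs)
    return out

# A single bright pixel at the top of the column yields one high-pitched tone.
samples = column_to_sound([1.0, 0.0, 0.0, 0.0])
print(len(samples))  # → 2400
```

Scanning an image left to right, one column per time slice, then turns the whole image into a time-varying soundscape.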