Crossmodal interaction of flashes and beeps across time and number follows Bayesian causal inference https://link.springer.com/article/10.3758/s13423-026-02857-z #SiFI #multisensory
Crossmodal interaction of flashes and beeps across time and number follows Bayesian causal inference - Psychonomic Bulletin & Review

Multisensory perception requires the brain to dynamically infer causal relationships between sensory inputs across various dimensions, such as temporal and spatial attributes. Bayesian causal inference (BCI) models have provided a robust framework for understanding sensory processing in unidimensional settings, where stimuli across sensory modalities vary along a single dimension such as spatial location or numerosity (Samad et al., PLoS ONE, 10(2), e0117178, 2015). However, real-world sensory processing involves multidimensional cues, where the alignment of information across multiple dimensions influences whether the brain perceives a unified or a segregated source. To investigate sensory processing under more realistic conditions, this study introduces an expanded BCI model that incorporates multidimensional information, specifically numerosity and temporal discrepancies. Using a modified sound-induced flash illusion (SiFI) paradigm with manipulated audiovisual disparities, we tested the performance of the enhanced BCI model. Results showed that integration probability decreased with increasing temporal discrepancy, and the proposed multidimensional BCI model accurately predicted multisensory perception outcomes across the full range of stimulus conditions. This multidimensional framework extends the BCI model's applicability, providing deeper insight into the computational mechanisms underlying multisensory processing and a foundation for future quantitative studies of naturalistic sensory processing.
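The core computation the abstract builds on can be sketched as follows. This is the standard unidimensional BCI model (Körding et al., 2007, which the Samad et al. work extends), not the paper's multidimensional version, and all parameter values here are hypothetical placeholders; it shows how the posterior probability of a common cause, and hence integration, falls as the discrepancy between two cues grows.

```python
import math

def bci_posterior_common_cause(x_a, x_v, sigma_a=1.0, sigma_v=1.0,
                               sigma_p=10.0, p_common=0.5):
    """Posterior probability that auditory and visual cues share one cause.

    Illustrative sketch of standard Bayesian causal inference with
    hypothetical parameters. Cues x_a, x_v are noisy Gaussian measurements
    of the latent source(s); the prior over each source is N(0, sigma_p^2).
    """
    va, vv, vp = sigma_a**2, sigma_v**2, sigma_p**2
    # Likelihood of both cues under a single common cause (C = 1),
    # with the latent source integrated out analytically.
    var1 = va * vv + va * vp + vv * vp
    like1 = math.exp(-((x_a - x_v)**2 * vp + x_a**2 * vv + x_v**2 * va)
                     / (2 * var1)) / (2 * math.pi * math.sqrt(var1))
    # Likelihood under two independent causes (C = 2): product of two
    # marginal Gaussians with the prior variance added to each cue's noise.
    var_a, var_v = va + vp, vv + vp
    like2 = (math.exp(-x_a**2 / (2 * var_a)) / math.sqrt(2 * math.pi * var_a)
             * math.exp(-x_v**2 / (2 * var_v)) / math.sqrt(2 * math.pi * var_v))
    # Bayes' rule over the binary causal structure variable C.
    return like1 * p_common / (like1 * p_common + like2 * (1 - p_common))
```

With these toy parameters, coincident cues yield a high common-cause posterior and widely separated cues a low one, mirroring the abstract's finding that integration probability drops as temporal discrepancy increases; the paper's contribution is to make the discrepancy term multidimensional (number and timing jointly) rather than scalar.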