Good morning gamers.
Unfortunately I am skipping this year's #FA2023, and I hope everyone attending First Attack PR this weekend has a blast.
Especially DaniPhantomPR, No Sweat and JustAKidd55: have a great DNF Duel, FPS and SF6 tournament respectively!
See you in 2024!
Exciting news! Our conference paper titled "Auditory reverse correlation applied to the study of place and voicing: four new phoneme-discrimination tasks" has been accepted for presentation at #ForumAcusticum2023 #FA2023! This is the foundation stone for a bigger study to be published next year, and also a summary of our overall scientific aim in the team. https://hal.science/hal-04130939 #psycholinguistics @[email protected]
Attached: 1 image In this conference paper for #ForumAcusticum #FA2023 we replicate Ahumada's seminal experiment from 1975 on tone-in-noise perception using our own #OpenSource fastACI toolbox (https://github.com/aosses-tue/fastACI), and we analyze the data obtained on the task by an artificial listener from the (also #OpenSource) Auditory Modeling Toolbox. It's very satisfying to have a network of interconnected toolboxes working together... And of course all analyses can be reproduced easily using the commands listed in the article. You can even run the experiment on yourself to replicate Ahumada's original results! #replication #OpenCode #Auditory #Psychoacoustics https://hal.science/hal-04186363v1/document
Yesterday I was visiting the museum of cinema in #Torino. I was amazed to discover that Camille Flammarion was using the magic lantern during his scientific conferences:
“In 1866 I started using limelight projection devices in my astronomy lectures: their bright images were an effective complement to illustrate astronomical principles… We started by projecting the 30 images from my publication The Wonders of the Heavens, which were the entire content of my illustrated conferences.”
… the first PowerPoint presentation in human history!
Anybody here going to Forum Acusticum #FA2023 in Torino next week? My team will be there in force with 3 oral presentations:
fastACI toolbox: the MATLAB toolbox for investigating auditory perception using reverse correlation. GitHub: aosses-tue/fastACI
Auditory reverse correlation (revcorr) is an experimental paradigm that reveals the acoustic cues used by listeners in any auditory task. It has previously been used to explore the categorisation of /aba/ and /ada/ sounds in noise. Here, we extend the paradigm to new phonemic contrasts. In a typical revcorr experiment, one introduces random fluctuations into the stimuli in order to measure how they affect the behavioural responses of the participant on a trial-by-trial basis. The outcome is called an auditory classification image (ACI): a time-frequency map of the acoustic cues used by a participant, revealing their individual listening strategy in a given task. Here, we use the "fastACI toolbox" [Osses & Varnet, 2021] to apply the paradigm to new phonemic contrasts: /aba/-/apa/ ; /ada/-/aga/ ; /ada/-/ata/ ; /apa/-/ata/. This allows us to study the perception of two phonetic traits: place of articulation and voicing. We present the results of two participants for each contrast. The results are consistent with the main auditory cues already identified in the psycholinguistic literature, but they also reveal unexpected secondary cues.
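The trial-by-trial logic described above can be sketched in a few lines. This is a minimal, illustrative simulation in Python with NumPy, not the fastACI toolbox itself (which is MATLAB): a simulated listener bases its responses on a small set of "cue" bins in the stimulus, and averaging the noise fields separately for each response recovers a classification image that highlights those bins. All names and parameters here are assumptions for the sketch.

```python
# Minimal reverse-correlation (classification image) sketch, NOT the fastACI API.
import numpy as np

rng = np.random.default_rng(0)

n_trials, n_bins = 5000, 64          # trials and (flattened) time-frequency bins
template = np.zeros(n_bins)
template[20:24] = 1.0                # the cue region the simulated listener relies on

# Random fluctuations added to the stimulus on every trial.
noise = rng.normal(size=(n_trials, n_bins))

# Simulated listener: project each noise field onto the cue template,
# add internal noise, and respond "A" (1) or "B" (0).
decision_var = noise @ template + rng.normal(scale=0.5, size=n_trials)
responses = (decision_var > 0).astype(int)

# Classification image: mean noise on "A" trials minus mean noise on "B" trials.
ci = noise[responses == 1].mean(axis=0) - noise[responses == 0].mean(axis=0)

# The recovered image should peak inside the cue region (bins 20-23).
print(int(np.argmax(ci)))
```

In a real experiment the responses come from a human participant rather than a template observer, and the classification image is usually estimated with regularised regression (e.g. GLMs with smoothness priors) rather than a raw difference of means, but the trial-by-trial principle is the same.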
It's sadly official: I'm not going to attend First Attack #FA2023, the FGC Puerto Rico event, this year.
Some priorities and obligations will limit the free time I'd need to practice properly, and I'm not in the mood (or the economic position) to tolerate losing a Tekken 7 tourney.