The amygdala plays a crucial role in emotional processing, particularly in detecting threat-related stimuli and regulating responses to them. Fear processing is a vital function emerging during the latter half of the first postnatal year and becomes progressively more regulated and context-dependent with maturation across early childhood. However, the neural underpinnings of early-emerging individual differences in fear processing remain underexplored.

In our previous studies, we examined how readily 8-month-old infants avert their gaze from fearful faces relative to non-fearful faces. In general, infants of this age are less likely to disengage from fearful faces than from non-fearful ones, a phenomenon called fear bias. In our previous study, however, we found that a smaller left amygdala volume shortly after birth was associated with a greater likelihood of averting gaze from fearful faces at 8 months of age.

Our latest study builds on this by extending the analysis longitudinally. We investigated whether neonatal amygdala volume and microstructural properties, indexed by mean diffusivity, are associated with attentional biases toward fearful faces at 30 and 60 months. Neonatal imaging was acquired at 2–8 weeks of age on a 3T MRI scanner. The same cohort completed eye-tracking at follow-ups (n = 57 at 30 months; n = 54 at 60 months).

Our results show that larger newborn left amygdala volume was associated with decreased disengagement from fearful (vs. non-fearful) faces at 30 months (p = .041), but not at 60 months (p = .553). Moreover, sex-specific analyses indicated that higher mean diffusivity in the left amygdala was associated with lower fear bias at 60 months in boys (p = .046).

These findings highlight the dynamic nature of amygdala-related fear processing across early development. Associations between neonatal amygdala characteristics and fear bias appeared age-dependent and sex-specific, consistent with developmental changes in fear processing, with fear bias typically elevated in infancy and becoming less pronounced by around five years of age.

https://doi.org/10.1007/s00787-026-03041-3

#amygdala #EyeTracking #MRI #EmotionalProcessing #FearProcessing

Neonatal amygdala and fear processing across early childhood - European Child & Adolescent Psychiatry


In our latest study, we examined how Finnish children read and integrate information across multiple expository texts when given an inquiry task. We were interested in how the task-relevance of text information affects readers' eye movements and whether those eye movements are connected to the quality of an essay written after reading. We were also interested in differentiating between the effects of technical reading skill and reading comprehension with respect to these processes.

In total, 24 fifth and sixth grade Finnish native-speakers completed the experiment. Prior to testing, the participants were told that at the end of the testing session, they would have to complete an inquiry task (e.g., “What's the difference between human and dog hearing?”). During an eye tracking experiment, the participants read two science texts on the topic of the inquiry task. The texts contained both task-relevant and task-irrelevant text segments. After the reading task, the children wrote an essay to complete the inquiry task. Furthermore, participants' technical reading skill and reading comprehension were measured with an independent classroom test.

Task-relevant segments were read for longer than task-irrelevant segments during first-pass reading. Moreover, reading skills modulated the relevance effect: weaker comprehenders were less likely to regress within an irrelevant segment, and the relevance effect on look-backs was more pronounced for better technical readers. No reliable effects were found for the essay-writing task.

The results imply that the participants were able to detect which parts of the text were relevant and adjusted their reading accordingly, based on their reading skills. However, they did not seem to form a coherent memory representation of the relevant text contents in order to perform well in the essay writing task.

https://onlinelibrary.wiley.com/doi/10.1111/sjop.70099

#reading #MultipleTextComprehension #EyeTracking #relevance

Understanding verbal irony involves detecting that the speaker’s intended meaning contrasts with the literal meaning. This is challenging for children as the underlying skills required to understand irony may not be fully developed.

In our new study, we investigated how 10-year-olds’ working memory, empathy skills, and gender were related to their processing and comprehension of written irony. Data from two previous eye-tracking experiments with 97 children (46 girls and 51 boys) were analysed.

Results showed that children with stronger empathy skills had higher irony comprehension accuracy and were less likely to reread ironic phrases. Higher working memory was linked to faster processing of irony but not to higher comprehension accuracy; in fact, lower working memory was associated with more accurate irony comprehension. Child gender was not related to irony comprehension.

These results imply that working memory and emotional perspective-taking are important for children’s irony comprehension, underscoring theories that take individual differences into account.

https://doi.org/10.1017/S0305000926100543

#LanguageDevelopment #irony #EyeMovements #EyeTracking #WorkingMemory #empathy

Project Aria @Meta (@meta_aria)

Introducing the Hoi! dataset, which combines Meta's Project Aria with custom force-sensing grippers. It is a multimodal robotics/vision dataset that uses the egocentric perspective, eye tracking, and real-world interactions of the Aria glasses to bridge the gap between seeing, acting, and feeling.

https://x.com/meta_aria/status/2044829498156478579

#metaaria #dataset #multimodal #robotics #eyetracking


Summer School: “Advanced Methods in Eye Tracking”

From June 22 to 23, 2026, the summer school “Advanced Methods in Eye Tracking” will take place at the University of East Anglia, UK.

See the poster for details. The announcement reads:

This interdisciplinary summer school will offer PhD students and other early career researchers from psychology and across the cognitive and social sciences advanced training in all aspects of eye tracking, and a clear interdisciplinary understanding of the range of research questions that can be addressed by eye tracking. It will be conducted over two days, with the first day consisting of research talks and the second day consisting of hands-on lab work and skill building. The first day is offered as a hybrid event, with talks streamed live for students wanting to attend online only. The second day is "in person" only.

#CognitiveScience #EyeTracking #Psychology #SocialScience

We are excited to share that two of our papers have been accepted to ETRA 2026!

1. QualitEye: Public and Privacy-preserving Gaze Data Quality Verification
Mayar Elfares, Pascal Reisert, Ralf Küsters, Andreas Bulling

2. Learning Alignments of Human Gaze and Fine-grained Task Descriptions
Takumi Nishiyasu, Zhiming Hu, Andreas Bulling, Yoichi Sato

Congratulations to all authors!

For preprints and updates, feel free to visit our website: https://www.collaborative-ai.org/

#ETRA2026 #EyeTracking #HCI

Collaborative Artificial Intelligence

Our group conducts fundamental research towards collaborative artificial intelligence (CAI) at the intersection of multimodal machine learning, computational cognitive modelling, computer vision, and human-machine interaction.

FYI: Lumen Research brings attention measurement to Netflix ads in five European markets: Lumen Research partners with Netflix to deliver eye-tracking attention measurement for CTV, desktop, and mobile ads in the UK, Germany, France, Italy, and Spain. https://ppc.land/lumen-research-brings-attention-measurement-to-netflix-ads-in-five-european-markets/ #AttentionMeasurement #NetflixAds #DigitalMarketing #CTV #EyeTracking
Eye Tracking Is The Missing Piece In Mark Zuckerberg's VR Strategy

Why did Meta take a pause on eye tracking after the Quest Pro?

UploadVR

Experts explain why the 'Infinity Tracing Technique' can be a game changer for people with insomnia

https://fed.brid.gy/r/https://www.upworthy.com/infinity-tracing-technique-for-sleep