This year’s speakers include Kevin Miller, @TimKietzmann, Tanja Schultz, Lisa-Marie Vortmann, and others https://www.iiccsss.org
| Lab Webpage | https://www.kietzmannlab.org |
| Google Scholar | https://scholar.google.com/citations?user=JXcWFkgAAAAJ&hl=en |
🚨ERC Job Alert 🚨
Are you looking for a PhD position? Are you into AI and/or computational/cognitive neuroscience? Then consider becoming an ERC-funded graduate student with us:
https://www.ikw.uni-osnabrueck.de/fileadmin/user_upload/jobs/68_IKW_Research_Assistant_E_13_65_.pdf
I am biased, but I think this is a great opportunity.
First, the lab. I could not wish for a better team. People collaborate a ton, are helpful, constructive, and fun. The hallway is filled with chatter about new ideas, directions, and excitement. Feel free to contact current members to find out more.
Second, the context of the ERC project. TIME bridges the fields of deep learning and cognitive computational neuroscience to establish when, where and how visual semantic understanding emerges in the brain, as it actively samples and integrates information. Exciting questions.
Third, the institute and country. Germany is a great place to live and do science. The institute was among the first to establish a distinct cognitive science program, and you will find yourself among highly motivated colleagues who strive to do excellent science together.
Did I mention 30 days of paid vacation, great health insurance, and free daycare? This is a very family-friendly lab and city.
Importantly, we are striving to create a better gender balance in the lab, so please share this opportunity far and wide.
Please see our lab webpage and publications for further information on the work we do and get in touch with me if you have any questions.
Ensuring that experimental stimuli were not part of model training gets harder with closed/larger/industry models.
Case in point: CLIP and NSD. CLIP was trained on part of MS COCO, making it impossible to cleanly estimate its predictive performance on NSD neuroimaging data, which was collected while participants viewed COCO images as stimuli.
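In principle, the contamination check reduces to an overlap test: do any of the stimulus image IDs also appear in the model's training set? A minimal sketch, using made-up placeholder ID lists rather than the real NSD stimulus set or CLIP's training data:

```python
# Sketch of a train/test contamination check. The ID lists below are
# hypothetical placeholders, NOT the actual NSD or CLIP COCO image IDs.

def contaminated_stimuli(stimulus_ids, training_ids):
    """Return the stimulus IDs that also appear in the training set."""
    return sorted(set(stimulus_ids) & set(training_ids))

nsd_stimulus_ids = [101, 202, 303, 404]   # placeholder stimulus image IDs
model_training_ids = [303, 404, 505]      # placeholder training-set image IDs

overlap = contaminated_stimuli(nsd_stimulus_ids, model_training_ids)
print(overlap)  # → [303, 404]
```

The catch the post points at: for closed or industry models, the training-set ID list on the right-hand side of this check simply isn't available, so the overlap can't be computed at all.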
Our incoming H100 node is heavy compute.
Literally.
It's 108kg...