This year’s speakers include Kevin Miller, @TimKietzmann, Tanja Schultz, Lisa-Marie Vortmann, and others https://www.iiccsss.org
| Lab Webpage | https://www.kietzmannlab.org |
| Google Scholar | https://scholar.google.com/citations?user=JXcWFkgAAAAJ&hl=en |
🚨 ERC Job Alert 🚨
Are you looking for a PhD position? Are you into AI and/or computational/cognitive neuroscience? Then consider becoming an ERC-funded graduate student with us:
https://www.ikw.uni-osnabrueck.de/fileadmin/user_upload/jobs/68_IKW_Research_Assistant_E_13_65_.pdf
I am biased, but I think this is a great opportunity.
First, the lab. I could not wish for a better team. People collaborate a ton, are helpful, constructive, and fun. The hallway is filled with chatter about new ideas, directions, and excitement. Feel free to contact current members to find out more.
Second, the context of the ERC project. TIME bridges the fields of deep learning and cognitive computational neuroscience to establish when, where and how visual semantic understanding emerges in the brain, as it actively samples and integrates information. Exciting questions.
Third, the institute and country. Germany is a great place to live and do science. The institute was among the first to establish a distinct cognitive science program, and you will find yourself among highly motivated colleagues who strive to do excellent science together.
Did I mention 30 days of paid vacation, great health insurance, and free daycare? This is a very family friendly lab and city.
Importantly, we are striving to create a better gender balance in the lab, so please share this opportunity far and wide.
Please see our lab webpage and publications for further information on the work we do and get in touch with me if you have any questions.
Ensuring that experimental stimuli were not part of model training gets harder with closed/larger/industry models.
Case in point: CLIP and NSD. CLIP is trained on part of MS COCO, making it impossible to cleanly estimate its predictive performance on NSD neuroimaging data, which was collected while participants viewed COCO images as stimuli.
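As a rough illustration of the kind of contamination check this requires, here is a minimal Python sketch that tests for overlap between stimulus identifiers and a model's training images. The file names and the availability of a training-image ID list are assumptions for illustration; CLIP's training set is not fully public, which is exactly the problem.

```python
# Minimal sketch: check whether any experimental stimuli appear in a model's
# training set, assuming both can be reduced to comparable image identifiers.
# File names (nsd_coco_ids.txt, model_training_ids.txt) are hypothetical.

def load_ids(path):
    """Read one image identifier (e.g. a COCO image ID) per line."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

nsd_ids = load_ids("nsd_coco_ids.txt")          # stimuli shown to participants
train_ids = load_ids("model_training_ids.txt")  # images seen during model training

overlap = nsd_ids & train_ids
print(f"{len(overlap)} of {len(nsd_ids)} stimuli were in the training set")

# Any non-empty overlap means predictivity estimates on those stimuli are
# confounded by prior exposure, so they should be excluded or reported.
```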
Our incoming H100 node is heavy compute.
Literally.
It's 108kg...
Just finished my course “Machine Learning for Cognitive Computational Neuroscience”. Across 12 lectures (90 minutes each) and 10 workgroup sessions, we covered >100 papers (46% published in the past two years). The students (and I) learned a lot. Here is what we covered: 1/
We are excited to announce that submissions for Cognitive Computational Neuroscience (CCN) 2023 are now open! The submission deadline for Abstracts, Generative Adversarial Collaborations (GACs), and Keynotes & Tutorials will be **31 March 2023** (earlier than in previous years!).
Abstract submission will be in the form of 2-page papers. More information can be found in our Author Kit https://2023.ccneuro.org/papers/author_kit.php. To submit a paper, visit https://2023.ccneuro.org/papers.php.
Generative Adversarial Collaborations (GACs) provide an extended workshop format to discuss the latest challenges and controversies in Cognitive Computational Neuroscience. For further details of the format, and to submit your GAC, visit https://2023.ccneuro.org/gac.php.
Keynotes with Tutorials (K+Ts) give a unique opportunity to deliver a keynote on your lab’s work, backed up by an extended tutorial (typically led by your lab’s postdocs/graduate students) that gives attendees a chance to work with the code, data, and models behind your talk. To submit your K+T proposal, visit https://2023.ccneuro.org/keynotes_tutorials.php.
Registration for CCN 2023 is now open and can be completed here:
https://www.oxforduniversitystores.co.uk/conferences-and-events/experimental-psychology/events/computational-cognitive-neuroscience-society-meeting-2023
For the most up-to-date information, including reminders about deadlines, join our mailing list (https://mail.securecms.com/mailman/listinfo/ccneuro-announce) and follow us here on Mastodon or Twitter.