[preprint alert 🚨 link below] How do we search for elements within a cognitive map? How does the brain perform this search: are elements reactivated in sequence or simultaneously? We set out to answer this question using MEG and decoding. (1/7)

We had participants learn associations that were embedded in a hidden graph structure. After a short consolidation period (8 min.), we asked them to retrieve triplets from the hidden graph. (2/7)
We trained machine-learning classifiers, separately per participant, to extract representations of the individual items from #MEG recordings. Using these classifiers, we assessed whether, during retrieval, items were replayed in short sequences or reactivated simultaneously. (3/7)
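The decoding step can be sketched roughly like this. This is not the paper's exact pipeline; the data shapes, item count, and the choice of logistic regression are illustrative assumptions, with random numbers standing in for real MEG epochs:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 200 encoding trials x 272 MEG sensors,
# each trial labelled with one of 10 items (real data would be epochs).
n_trials, n_sensors, n_items = 200, 272, 10
X = rng.standard_normal((n_trials, n_sensors))
y = rng.integers(0, n_items, n_trials)

# One classifier per participant, cross-validated on the encoding data.
clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, X, y, cv=5).mean()

# Applied to retrieval data, predict_proba yields a per-item
# "reactivation strength" for every time sample.
clf.fit(X, y)
retrieval = rng.standard_normal((50, n_sensors))
reactivation = clf.predict_proba(retrieval)   # shape (50, n_items)
```

The per-item probability time series from `predict_proba` is what downstream analyses (reactivation strength, sequenceness) would operate on.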

We confirmed that nearby items are reactivated more strongly than items further away. Looking at this relationship in detail revealed a grading of items on the graph by their decoded reactivation strength, present only for correct answers. (4/7)
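The "grading by graph distance" idea is easy to state in code: with shortest-path distances on the hidden graph, decoded reactivation strength should fall off with distance from the cued item. A toy numpy sketch, where the graph, cue, and strength values are all made up for illustration:

```python
import numpy as np
from collections import deque

# Hypothetical hidden graph over 6 items, as an adjacency list.
graph = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3, 5], 5: [4]}

def bfs_distances(graph, start):
    """Shortest-path length from `start` to every reachable node (BFS)."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nb in graph[node]:
            if nb not in dist:
                dist[nb] = dist[node] + 1
                queue.append(nb)
    return dist

cue = 0
dist = bfs_distances(graph, cue)

# Made-up decoded reactivation strength per item while retrieving `cue`.
strength = np.array([0.9, 0.6, 0.55, 0.4, 0.3, 0.2])

# A graded map predicts a negative relationship: the further an item
# is from the cue on the graph, the weaker its reactivation.
d = np.array([dist[i] for i in range(6)])
r = np.corrcoef(d, strength)[0, 1]
```

Here `r` comes out strongly negative, the signature of a distance-graded reactivation profile.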

We also looked into sequential replay using #TDLM. Overall, no significant sequential replay was detected. (5/7)
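TDLM (Temporally Delayed Linear Modelling) quantifies "sequenceness": how well the decoded state time series at lag Δ is predicted by the graph's transition structure. Below is a simplified toy version of the two-level GLM on simulated data, not the paper's implementation, with a deliberately injected forward-replay lag of 3 samples:

```python
import numpy as np

def tdlm_sequenceness(X, T_f, max_lag=10):
    """Simplified two-level TDLM-style GLM.

    X   : (time, states) decoded reactivation probabilities
    T_f : (states, states) forward transition matrix of the learned graph
    Returns forward-minus-backward sequenceness for lags 1..max_lag."""
    n = X.shape[1]
    T_b = T_f.T
    seq = np.zeros(max_lag + 1)
    for lag in range(1, max_lag + 1):
        # First level: regress the lag-shifted time series on itself,
        # giving an empirical (n, n) lagged-influence matrix B.
        B = np.linalg.lstsq(X[:-lag], X[lag:], rcond=None)[0]
        # Second level: project B onto forward, backward, self-transition
        # and constant predictors; contrast forward vs backward betas.
        preds = np.column_stack([T_f.ravel(), T_b.ravel(),
                                 np.eye(n).ravel(), np.ones(n * n)])
        betas = np.linalg.lstsq(preds, B.ravel(), rcond=None)[0]
        seq[lag] = betas[0] - betas[1]
    return seq

# Simulated data: 4 states on a ring (0->1->2->3->0); forward replay
# events injected at a lag of 3 samples on top of low noise.
rng = np.random.default_rng(1)
n_states, n_time = 4, 2000
T_f = np.roll(np.eye(n_states), 1, axis=1)
X = rng.random((n_time, n_states)) * 0.1
for t in range(0, n_time - 4, 20):
    i = rng.integers(0, n_states)
    X[t, i] += 1.0                        # state i reactivates ...
    X[t + 3, (i + 1) % n_states] += 1.0   # ... its successor 3 samples later
seq = tdlm_sequenceness(X, T_f)
```

With this simulated replay, `seq` peaks at the injected lag of 3; in real data the lag profile is compared against permuted transition matrices for significance.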

However, there was a significant correlation between replay and memory performance, with low-performing participants relying more on sequential replay. This is in line with previous results from @gelliott_wimmer. (6/7)

If you want to know more about the study and read our conclusions without waiting for the final publication, here is the preprint: https://doi.org/10.1101/2023.07.31.551234 (7/7)
biorxiv.org: Reactivation strength during cued recall is modulated by graph distance within cognitive maps

Thanks to all co-authors (Juli Nagel, Fungi Gerchen, Cagatay Guersoy @caggursoy, Andreas Meyer-Lindenberg, Peter Kirsch, Ray Dolan, Steffen Gais) and especially my supervisor Gordon (@GordonFeld), @studienstiftung & @DGSchlafmedizin for funding, and @zi_mannheim for hosting me
