Math in comp neuro summer camp!!
Apply now!!!
10–28 July in lovely Norway (amazing, middle of nowhere). No tuition, and room, board, and (good!) food are covered. Great experience, speakers, students, and activities!
“fire together, wire together”
Do it!
compneuronrsn.org
Job title: 2 PhD positions in the project "IMod: An interdisciplinary approach to data-based modelling" (237922)
Employer: NTNU - Norwegian University of Science and Technology
Deadline: Tuesday, January 31, 2023
New in TiNS: The tricky business of defining brain functions
“Observations lead to interpretations. Interpretations become concepts. And concepts may become dogmas that feel so intuitive, so natural, that they are accepted without question. We should, from time to time, re-evaluate the core beliefs of our fields of study.”
https://www.cell.com/trends/neurosciences/fulltext/S0166-2236(22)00213-2
I really wish there was wider awareness of this issue, especially outside of academia
A latent variable model with parametric tuning curves that assumes a common shape shared by neurons in an ensemble, where the ensembles themselves are defined parametrically. It uses ML tricks to make inference fast (e.g., a VAE-style encoder) and seems to work on real data (head direction and grid cells).
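To make "a common shape shared by neurons in an ensemble" concrete, here is a minimal NumPy sketch (not the authors' implementation): von Mises tuning curves for a circular latent (e.g., head direction), where each neuron gets its own preferred direction but the shape parameters (`kappa`, `amplitude`, `baseline`) are shared across the ensemble. All function and parameter names here are illustrative assumptions.

```python
import numpy as np

def shared_von_mises_tuning(theta, preferred, kappa, amplitude, baseline):
    """Von Mises tuning curves with a shared shape.

    theta:     (T,) latent head direction over time
    preferred: (N,) per-neuron preferred direction (the only per-neuron parameter)
    kappa, amplitude, baseline: scalars shared by the whole ensemble
    Returns firing rates of shape (T, N).
    """
    return baseline + amplitude * np.exp(
        kappa * (np.cos(theta[:, None] - preferred[None, :]) - 1.0)
    )

# 100 time points on the circle, 8 neurons with evenly spaced preferences
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
preferred = np.linspace(0, 2 * np.pi, 8, endpoint=False)
rates = shared_von_mises_tuning(theta, preferred,
                                kappa=4.0, amplitude=10.0, baseline=0.5)
print(rates.shape)  # (100, 8)
```

Sharing the shape parameters means only the preferred directions grow with the number of neurons, which is one way to get the better-behaved optimization the paper reports.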
Here is the article: https://arxiv.org/abs/2210.03155
Authors: Martin Bjerke, Lukas Schott, Kristopher T. Jensen, Claudia Battistin, David Klindt (@dak), and Benjamin Dunn (@benjamin_dunn)
Note: here it was assumed that cell types are a thing and that brain areas aren’t a continuous mush of selectivity.
Systems neuroscience relies on two complementary views of neural data, characterized by single neuron tuning curves and analysis of population activity. These two perspectives combine elegantly in neural latent variable models that constrain the relationship between latent variables and neural activity, modeled by simple tuning curve functions. This has recently been demonstrated using Gaussian processes, with applications to realistic and topologically relevant latent manifolds. Those and previous models, however, missed crucial shared coding properties of neural populations. We propose feature sharing across neural tuning curves, which significantly improves performance and leads to better-behaved optimization. We also propose a solution to the problem of ensemble detection, whereby different groups of neurons, i.e., ensembles, can be modulated by different latent manifolds. This is achieved through a soft clustering of neurons during training, thus allowing for the separation of mixed neural populations in an unsupervised manner. These innovations lead to more interpretable models of neural population activity that train well and perform better even on mixtures of complex latent manifolds. Finally, we apply our method on a recently published grid cell dataset, recovering distinct ensembles, inferring toroidal latents and predicting neural tuning curves all in a single integrated modeling framework.
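The "soft clustering of neurons during training" can be sketched as follows: each neuron carries learnable logits over candidate ensembles, a softmax turns them into responsibilities, and the neuron's predicted rate is a responsibility-weighted mixture of the rates predicted by each ensemble's latent manifold. This is a hedged illustration of the idea only; the shapes, names, and random stand-ins for the per-ensemble predictions are all assumptions, not the paper's code.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical setup: N neurons, K candidate ensembles, T time points.
N, K, T = 6, 2, 50
rng = np.random.default_rng(0)

# Per-neuron ensemble logits (these would be learned during training).
logits = rng.normal(size=(N, K))
responsibilities = softmax(logits, axis=1)  # (N, K), each row sums to 1

# Rates predicted for every neuron by each ensemble's latent manifold
# (random stand-ins here; in the model these come from tuning curves
# evaluated on the inferred latents).
rates_per_ensemble = rng.random(size=(K, T, N))

# Each neuron's predicted rate is a soft mixture over ensembles.
mixed = np.einsum('nk,ktn->tn', responsibilities, rates_per_ensemble)
print(mixed.shape)  # (50, 6)
```

Because the assignment is soft and differentiable, it can be trained jointly with the latents and tuning curves, and neurons typically commit to one ensemble as the responsibilities sharpen.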