Petition to Reverse the NIH Indirect Cost Cap
| Website | https://parralab.org |
| Publications | https://scholar.google.com/citations?user=-4BM5pwAAAAJ&hl |
Stand up for Science ... because science is for everyone.
March 7, In NY City and Nationwide.
https://standupforscience2025.org
Please distribute widely!
A bit of causal inference for everyone: Granger analysis when there are internal dynamics and external inputs. This happens everywhere. New method with examples from neuroscience, physiology, sociology and econometrics.
Code: MATLAB, Python, R
Complex systems, such as brains, markets, and societies, exhibit internal dynamics influenced by external factors. Disentangling delayed external effects from internal dynamics within these systems is often difficult. We propose using a Vector Autoregressive model with eXogenous input (VARX) to capture delayed interactions between internal and external variables. Although this model aligns with Granger’s statistical formalism for testing “causal relations”, the connection between the two is not widely understood. Here, we bridge this gap by providing fundamental equations, user-friendly code, and demonstrations using simulated and real-world data from neuroscience, physiology, sociology, and economics. Our examples illustrate how the model avoids spurious correlation by factoring out external influences from internal dynamics, leading to more parsimonious explanations of these systems. For instance, in neural recordings we find that the prolonged response of the brain can be explained as a short exogenous effect followed by prolonged internal recurrent activity. In recordings of human physiology, we find that the model recovers established effects such as eye movements affecting pupil size and a bidirectional interaction between respiration and heart rate. We also provide methods for enhancing model efficiency, such as L2 regularization for limited data and basis functions to cope with extended delays. Additionally, we analyze model performance under various scenarios where model assumptions are violated. MATLAB, Python, and R code are provided for easy adoption: https://github.com/lcparra/varx.
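To make the idea concrete, here is a minimal sketch of the VARX model in plain NumPy (not the authors' varx toolbox linked above, and without its statistics, regularization, or basis functions): simulate a 2-channel system with internal dynamics A and a delayed external input B, then recover both by ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a VARX(1,1) system: y[t] = A @ y[t-1] + B @ x[t-1] + noise,
# where A captures internal (recurrent) dynamics and B the exogenous input.
A = np.array([[0.8, 0.1],
              [0.0, 0.7]])   # stable internal dynamics
B = np.array([[0.5],
              [0.0]])        # only channel 0 receives the external input
T = 5000
x = rng.standard_normal((T, 1))
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + B @ x[t - 1] + 0.1 * rng.standard_normal(2)

# Fit by OLS: regress y[t] on the stacked regressors [y[t-1], x[t-1]].
Z = np.hstack([y[:-1], x[:-1]])               # (T-1, 3)
coef, *_ = np.linalg.lstsq(Z, y[1:], rcond=None)
A_hat = coef[:2].T                            # estimated internal dynamics
B_hat = coef[2:].T                            # estimated exogenous effect

print(np.round(A_hat, 2))
print(np.round(B_hat, 2))
```

Because the input x is regressed out jointly with the lagged state, the estimated internal connectivity A_hat is not contaminated by the shared external drive, which is exactly the "factoring out" the abstract describes.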
Backprop is the workhorse of AI, but unrealistic in biology, and way too inefficient in recurrent networks (BPTT).
One can also propagate sensitivity forward in time. This can be done efficiently and exactly for a large class of recurrent networks.
💥 Problem solved! 😁
Conventional computer vision models rely on very deep, feedforward networks processing whole images and trained offline with extensive labeled data. In contrast, biological vision relies on comparatively shallow, recurrent networks that analyze sequences of fixated image patches, learning continuously in real time without explicit supervision. This work introduces a vision network inspired by these biological principles. Specifically, it leverages a joint embedding predictive architecture incorporating recurrent gated circuits. The network learns by predicting the representation of the next image patch (fixation) based on the sequence of past fixations, a form of self-supervised learning. We show mathematically and empirically that the training algorithm avoids the problem of representational collapse. We also introduce Recurrent-Forward Propagation, a learning algorithm that avoids biologically unrealistic backpropagation through time and memory-inefficient real-time recurrent learning. We show mathematically that the algorithm implements exact gradient descent for a large class of recurrent architectures, and confirm empirically that it learns efficiently. This paper focuses on these theoretical innovations and leaves empirical evaluation of performance on downstream tasks, and analysis of representational similarity with biological vision, for future work.
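To illustrate the forward-in-time idea (not the paper's Recurrent-Forward Propagation itself, which handles gated recurrent circuits), here is a toy sketch for a scalar recurrent unit where forward sensitivity propagation is cheap and exact: carry the derivatives of the state with respect to each parameter forward alongside the state, and accumulate the loss gradient online, with no backward pass over the sequence.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy scalar recurrence: h[t] = a*h[t-1] + w*x[t], loss L = sum_t (h[t]-y[t])^2.
T = 50
x = rng.standard_normal(T)
y = rng.standard_normal(T)

def loss_and_grads(a, w):
    """Forward sensitivity propagation: s_a = dh/da, s_w = dh/dw."""
    h = s_a = s_w = 0.0
    L = g_a = g_w = 0.0
    for t in range(T):
        s_a = a * s_a + h          # uses h[t-1], so update before h
        s_w = a * s_w + x[t]
        h = a * h + w * x[t]
        e = h - y[t]
        L += e * e
        g_a += 2 * e * s_a         # gradient accumulated online, O(1) memory
        g_w += 2 * e * s_w
    return L, g_a, g_w

a, w = 0.5, 0.3
L, g_a, g_w = loss_and_grads(a, w)

# Verify exactness against central finite differences of the loss.
eps = 1e-6
fd_a = (loss_and_grads(a + eps, w)[0] - loss_and_grads(a - eps, w)[0]) / (2 * eps)
fd_w = (loss_and_grads(a, w + eps)[0] - loss_and_grads(a, w - eps)[0]) / (2 * eps)
```

In the general case this is real-time recurrent learning, whose sensitivity tensor is what makes it memory-inefficient; the post's claim is that for a large class of recurrent architectures the forward propagation can be done efficiently as well as exactly.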
If you're on BlueSky and want to bridge to Mastodon, follow @ap.brid.gy
That's it. Nothing to install, no terms of service to sign, no complicated garbage. If you want to stop, just block @ap.brid.gy
Details here: https://fed.brid.gy/docs
If you want to bridge your account to BlueSky, simply follow this account: @bsky.brid.gy
Why am I encouraging this? Because when BlueSky inevitably goes bad, people there will have friends in the Fediverse to help them move here.
Biological life is persistent yet constantly turns over not just energy but matter. Thermodynamics explains the order of life as the result of the flow of energy from sun to space through Earth, but says nothing about the constant turnover of matter. Interesting podcast on this gap in the theory:
https://open.spotify.com/episode/5TjhHFU6XTHkJWVMSzxkBQ?si=BOcSzVHwRt2a5SR9VtsgDA
Time to rebuild my follower base from Twitter on Mastodon!
I’m Vidar, founder of Street Art Utopia, curating the most inspiring street art from around the world.
Every new follower who comments here will receive a high-quality street art photo as a thank-you reply! Feel free to request a specific theme for your photo!
Binary vector embeddings are so cool
https://emschwartz.me/binary-vector-embeddings-are-so-cool/
Discussions: https://discu.eu/q/https://emschwartz.me/binary-vector-embeddings-are-so-cool/
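The core trick from the linked post, sketched in a few lines of NumPy (a toy with random vectors, not the post's benchmark): keep only the sign bit of each embedding dimension, pack the bits into bytes, and rank candidates by Hamming distance, computed as the popcount of an XOR.

```python
import numpy as np

rng = np.random.default_rng(2)

# 1000 fake float32 embeddings; the query is a noisy copy of embedding 0.
emb = rng.standard_normal((1000, 256)).astype(np.float32)
query = emb[0] + 0.1 * rng.standard_normal(256).astype(np.float32)

# Binarize: 256 floats (1024 bytes) -> 32 bytes per vector, a 32x compression.
bits = np.packbits(emb > 0, axis=1)
qbits = np.packbits(query > 0)

# Hamming distance = popcount of XOR, via a 256-entry per-byte lookup table.
popcount = np.array([bin(i).count("1") for i in range(256)], dtype=np.uint8)
ham = popcount[bits ^ qbits].sum(axis=1)

print(ham.argmin())  # nearest neighbor under Hamming distance
```

The surprising part, per the post, is how much retrieval accuracy survives this 32x compression; here the noisy copy of embedding 0 still wins easily because small perturbations flip few sign bits.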
Tired of zoom, meet, w/e video conferencing software collecting your data?
Signal's got you❤️
NEW: call links let you start a video call with your fave Signal users easily, no group needed. Announcing these, and other improvements to calling here👇

If you love group calls on Signal, but don’t want to create a group chat for every combination of your friends or colleagues, you’re in luck. Today we’re launching call links: Share a link with anyone on Signal and in just a tap or click they can join the call. No group chat required.