This month, CTCS (IIT Madras) & @PIK_climate present a webinar:
📢: Performance-dependent network evolution for enhanced computational capacity
🎙️: Prof. Sudeshna Sinha, IISER Mohali, India
📅: February 23 | 20:30 IST | 16:00 CET | 10:00 EST
🔗: https://us06web.zoom.us/webinar/register/WN__ioNrqy2S9Kx74NE3Xuj7Q
#Complexsystems #phd #NetworkEvolution #ReservoirComputing #MachineLearning

Representation learning often emphasizes metric preservation. We instead build symplectic structural invariance directly into the representation.

https://arxiv.org/abs/2512.19409

We embed Hamiltonian/symplectic geometry by making the RNN state dynamics a symplectomorphism, which preserves Legendre duality (information geometry) through time. This yields structure-preserving representations enforced by the latent dynamics, rather than imposed indirectly via the output.
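The invariance can be checked numerically. A minimal sketch (not the paper's construction: A is an assumed invertible linear base map, and the block matrix is its cotangent lift q ↦ Aq, p ↦ A⁻ᵀp) verifies that such an update preserves the canonical symplectic form:

```python
import numpy as np

n = 3
rng = np.random.default_rng(1)
A = rng.normal(size=(n, n)) + 3 * np.eye(n)             # assumed invertible base map
M = np.block([[A, np.zeros((n, n))],                    # cotangent lift: q -> A q,
              [np.zeros((n, n)), np.linalg.inv(A).T]])  #                 p -> A^{-T} p
J = np.block([[np.zeros((n, n)), np.eye(n)],            # canonical symplectic form
              [-np.eye(n), np.zeros((n, n))]])
assert np.allclose(M.T @ J @ M, J)                      # M is a symplectomorphism
```

Any map of this block form passes the check, which is the "cotangent lift" half of the normal form in the abstract below.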

#ReservoirComputing #RepresentationLearning #InformationGeometry #SymplecticGeometry #HamiltonianDynamics #GeometricDeepLearning #DynamicalSystems #PhysicsInformedML

Symplectic Reservoir Representation of Legendre Dynamics

Modern learning systems act on internal representations of data, yet how these representations encode underlying physical or statistical structure is often left implicit. In physics, conservation laws of Hamiltonian systems such as symplecticity guarantee long-term stability, and recent work has begun to hard-wire such constraints into learning models at the loss or output level. Here we ask a different question: what would it mean for the representation itself to obey a symplectic conservation law in the sense of Hamiltonian mechanics? We express this symplectic constraint through Legendre duality: the pairing between primal and dual parameters, which becomes the structure that the representation must preserve. We formalize Legendre dynamics as stochastic processes whose trajectories remain on Legendre graphs, so that the evolving primal-dual parameters stay Legendre dual. We show that this class includes linear time-invariant Gaussian process regression and Ornstein-Uhlenbeck dynamics. Geometrically, we prove that the maps that preserve all Legendre graphs are exactly symplectomorphisms of cotangent bundles of the form "cotangent lift of a base diffeomorphism followed by an exact fibre translation". Dynamically, this characterization leads to the design of a Symplectic Reservoir (SR), a reservoir-computing architecture that is a special case of a recurrent neural network and whose recurrent core is generated by Hamiltonian systems that are at most linear in the momentum. Our main theorem shows that every SR update has this normal form and therefore transports Legendre graphs to Legendre graphs, preserving Legendre duality at each time step. Overall, SR implements a geometrically constrained, Legendre-preserving representation map, injecting symplectic geometry and Hamiltonian mechanics directly at the representational level.

arXiv.org
For #ReservoirComputing lovers, I found a challenging #attractor, the Thomas attractor:
import numpy as np
b, dt = 0.208186, 0.05  # assumed: classic chaotic b, Euler step size
x, y, z = 1.0, 1.0, 0.0
dx = np.sin(y) - b * x
dy = np.sin(z) - b * y
dz = np.sin(x) - b * z
x += dx * dt
y += dy * dt
z += dz * dt
3k neurons, yet only 67% correlation. Here is the code for you: https://github.com/alecrimi/magic_reservoir/blob/main/thomas_attractor_prediction.py
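Looped for a few thousand Euler steps (a standalone sketch; b = 0.208186 and dt = 0.05 are assumed values, not taken from the linked script), the orbit stays on the bounded attractor: since |sin(·)| ≤ 1, each coordinate is attracted into roughly |·| ≤ 1/b:

```python
import numpy as np

b, dt = 0.208186, 0.05            # assumed: chaotic regime, Euler step size
x, y, z = 1.0, 1.0, 0.0
traj = []
for _ in range(20000):
    dx = np.sin(y) - b * x        # cyclically symmetric Thomas system
    dy = np.sin(z) - b * y
    dz = np.sin(x) - b * z
    x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
    traj.append((x, y, z))
traj = np.asarray(traj)
# bounded forcing + linear damping confine each coordinate to |.| <= 1/b ~ 4.8
assert np.all(np.abs(traj) < 6)
```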
The Li #attractor has chaotic vibes like Lorenz/Chen: use its wild dynamics for #ReservoirComputing to process time series.
The truth: just use it because it looks cool⚔🏴‍☠️🌊
def dequan_li(x, y, z, a, b, c):  # a, b, c: system parameters
    dx = a * (y - x) + y * z
    dy = b * x - x * z + y
    dz = c * z + x * y / 3
    return dx, dy, dz
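A quick usage sketch (self-contained; the parameter values in the Euler run are placeholders, not the system's canonical constants):

```python
import numpy as np

def dequan_li(x, y, z, a, b, c):
    dx = a * (y - x) + y * z
    dy = b * x - x * z + y
    dz = c * z + x * y / 3
    return dx, dy, dz

# derivative at a simple point: a*(y-x)=0, so dx = y*z = 1, dy = b*x - x*z + y = 1
dx, dy, dz = dequan_li(1.0, 1.0, 1.0, a=1.0, b=1.0, c=3.0)
assert (dx, dy, dz) == (1.0, 1.0, 3.0 + 1.0 / 3.0)

# a short explicit-Euler run with placeholder parameters
state = np.array([0.1, 0.3, -0.6])
for _ in range(100):
    state = state + 1e-4 * np.array(dequan_li(*state, a=40.0, b=55.0, c=1.8))
assert np.all(np.isfinite(state))
```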

Summer ☀️ read on Computo: a new publication on reservoir computing in R!

Reservoir computing is a machine learning approach that relies on mapping inputs to higher-dimensional spaces through a non-linear dynamical system (the reservoir), for example a recurrent neural network with fixed random weights, and training only its final read-out layer.
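A minimal sketch of the idea (in Python/numpy for illustration; the sine task and all parameters are assumptions, not from the paper or the package): drive a fixed random recurrent reservoir with an input sequence, then train only a linear read-out by ridge regression:

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 200, 1000
W = rng.normal(size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # spectral radius 0.9: echo-state regime
W_in = rng.normal(size=N)

u = np.sin(0.1 * np.arange(T + 1))          # toy input: a sine wave
x = np.zeros(N)
states = []
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])        # fixed, untrained reservoir dynamics
    states.append(x)
X = np.array(states[100:])                  # discard the initial transient
y = u[101:]                                 # one-step-ahead prediction target
# train only the final (read-out) layer, via ridge regression
w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)
mse = np.mean((X @ w_out - y) ** 2)
assert mse < 1e-3                           # a linear read-out suffices here
```

The reservoirnet package wraps this kind of pipeline for R; the sketch above is only meant to make the "train only the last layer" idea concrete.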

In this new publication, Thomas Ferté and co-authors Kalidou Ba, Dan Dutartre, Pierrick Legrand, Vianney Jouhet, Rodolphe Thiébaut, Xavier Hinaut and Boris P. Hejblum present the reservoirnet package, the first implementation of reservoir computing in R (complementing the existing Python and Julia implementations).

The article also serves as an introduction to reservoir computing and illustrates both the method and the package on several real-world applications, including forecasting the number of COVID-19 hospitalizations at Bordeaux University Hospital from public data (epidemiological statistics, weather data) and hospital-level data (hospitalizations, ICU admissions, ambulance service and ER notes). In this example, the authors also show how the weights of the connections between the input and output layers can be used to compute feature importances.

The paper and accompanying R code are available at https://doi.org/10.57750/arxn-6z34

reservoirnet is available at https://cran.r-project.org/package=reservoirnet

#machineLearning #reservoirComputing #Rstats #openScience #openSource #openAccess

A Minimal Genetic Circuit for Cellular Anticipation

Living systems have evolved cognitive complexity to reduce environmental uncertainty, enabling them to predict and prepare for future conditions. Anticipation, distinct from simple prediction, involves active adaptation before an event occurs and is a key feature of both neural and non-neural biological agents. Recent work by Steven Frank proposed a minimal anticipatory mechanism based on the moving average convergence-divergence principle from financial markets. Here, we implement this principle using synthetic biology to design and evaluate minimal genetic circuits capable of anticipating environmental trends. Through deterministic and stochastic analyses, we demonstrate that these motifs achieve robust anticipatory responses under a wide range of conditions. Our findings suggest that simple genetic circuits could be naturally exploited by cells to prepare for future events, providing a foundation for engineering predictive biological systems.
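The moving-average-convergence-divergence principle is easy to sketch (a toy illustration; the smoothing constants and the sinusoidal "environment" are my assumptions, not from the paper): the difference between a fast and a slow exponential moving average is positive while the signal trends up and negative while it trends down, so thresholding it yields an anticipatory switch:

```python
import numpy as np

def ema(x, alpha):
    """Exponential moving average, the analog of a first-order molecular filter."""
    out = np.empty_like(x)
    out[0] = x[0]
    for i in range(1, len(x)):
        out[i] = alpha * x[i] + (1 - alpha) * out[i - 1]
    return out

env = np.sin(np.linspace(0, 4 * np.pi, 400))   # assumed slowly varying environment
macd = ema(env, 0.2) - ema(env, 0.05)          # fast minus slow average
# positive while the environment is rising (index 40), negative while falling (index 120)
assert macd[40] > 0 > macd[120]
```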

bioRxiv

At the Spring Meeting of its #KondensierteMaterie (condensed matter) section, the #DeutschePhysikalischeGesellschaft honored three physicists with research prizes for their scientific work at #UniMainz.

🎉 Congratulations to Dr. Libor Šmejkal on the Walter-Schottky-Preis 2025, to Dr. Robin R. Neumann on the INNOMAG Dissertationspreis 2025, and to Grischa Beneke on the INNOMAG Master-Preis 2025 👉 https://www.phmi.uni-mainz.de/auszeichnungen-auf-dem-gebiet-der-physik-der-kondensierten-materie-fuer-drei-jgu-physiker/

#Physik #Altermagnetismus #Magnonen #ReservoirComputing

Awards in condensed matter physics for three JGU physicists | FB 08 - Physik, Mathematik und Informatik

Johannes Gutenberg-Universität Mainz
Updates on the 🧠#brain effective connectivity library.
1. Added #Reservoircomputing-based causality as a nonlinear #Granger test:
https://github.com/alecrimi/effconnpy/blob/main/effconnpy/nonlinearGC.py
2. Node labels are now shown on the graph
3. Added directionality, shown with a blue-to-white color gradient (white at the arrowheads)
https://github.com/alecrimi/effconnpy/blob/main/effconnpy/vis_effconn_tracts.py
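The idea behind reservoir-based nonlinear Granger causality can be sketched as follows (a toy illustration, not the effconnpy API; the coupled series, reservoir size, and ridge penalty are all assumptions): if adding x's history to a reservoir model of y lowers the prediction error, x Granger-causes y:

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 2000, 100
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):                      # x drives y with a one-step lag
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

def reservoir_mse(inputs, target):
    """Ridge read-out error of a fixed random reservoir driven by `inputs`."""
    W = rng.normal(size=(N, N))
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))
    W_in = rng.normal(size=(N, inputs.shape[1]))
    s, states = np.zeros(N), []
    for t in range(len(inputs)):
        s = np.tanh(W @ s + W_in @ inputs[t])
        states.append(s)
    S = np.array(states[100:-1])           # states up to t predict target at t+1
    z = target[101:]
    w = np.linalg.solve(S.T @ S + 1e-6 * np.eye(N), S.T @ z)
    return np.mean((S @ w - z) ** 2)

mse_restricted = reservoir_mse(y[:, None], y)          # y's past only
mse_full = reservoir_mse(np.column_stack([y, x]), y)   # y's and x's past
assert mse_full < mse_restricted                       # x Granger-causes y
```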

A package for causal inference and statistical modeling in brain time series - alecrimi/effconnpy

GitHub

#simplicialcomplex + #Causality + #Reservoircomputing:
"Higher-order Granger reservoir computing: simultaneously achieving scalable complex structures inference and accurate dynamics prediction" https://www.nature.com/articles/s41467-024-46852-1

#dynamicalsystem #ML #AI

Higher-order Granger reservoir computing: simultaneously achieving scalable complex structures inference and accurate dynamics prediction - Nature Communications

For reservoir computing, improving prediction accuracy while maintaining low computing complexity remains a challenge. Inspired by the Granger causality, Li et al. design a data-driven and model-free framework by integrating the inference process and the inferred results on high-order structures.

Nature
Energy-saving computing with magnetic whirls: Brownian #ReservoirComputing makes it possible to detect human hand gestures based on the diffusion and displacement of #skyrmions // #physics #spintronics #magnetism #computing #TopDyn
@NatureComms @uni_mainz_eng
https://nachrichten.idw-online.de/2024/09/16/energy-saving-computing-with-magnetic-whirls