Representation learning often emphasizes metric preservation. We instead build symplectic structural invariance directly into the representation.

https://arxiv.org/abs/2512.19409

We embed Hamiltonian/symplectic geometry by making the RNN state dynamics a symplectomorphism, which preserves Legendre duality (information geometry) through time. This yields structure-preserving representations enforced by the latent dynamics, rather than imposed indirectly via the output.

#ReservoirComputing #RepresentationLearning #InformationGeometry #SymplecticGeometry #HamiltonianDynamics #GeometricDeepLearning #DynamicalSystems #PhysicsInformedML

Symplectic Reservoir Representation of Legendre Dynamics

Modern learning systems act on internal representations of data, yet how these representations encode underlying physical or statistical structure is often left implicit. In physics, conservation laws of Hamiltonian systems, such as symplecticity, guarantee long-term stability, and recent work has begun to hard-wire such constraints into learning models at the loss or output level. Here we ask a different question: what would it mean for the representation itself to obey a symplectic conservation law in the sense of Hamiltonian mechanics? We express this symplectic constraint through Legendre duality: the pairing between primal and dual parameters, which becomes the structure that the representation must preserve. We formalize Legendre dynamics as stochastic processes whose trajectories remain on Legendre graphs, so that the evolving primal-dual parameters stay Legendre dual. We show that this class includes linear time-invariant Gaussian process regression and Ornstein-Uhlenbeck dynamics. Geometrically, we prove that the maps that preserve all Legendre graphs are exactly symplectomorphisms of cotangent bundles of the form "cotangent lift of a base diffeomorphism followed by an exact fibre translation". Dynamically, this characterization leads to the design of a Symplectic Reservoir (SR), a reservoir-computing architecture that is a special case of a recurrent neural network and whose recurrent core is generated by Hamiltonian systems that are at most linear in the momentum. Our main theorem shows that every SR update has this normal form and therefore transports Legendre graphs to Legendre graphs, preserving Legendre duality at each time step. Overall, SR implements a geometrically constrained, Legendre-preserving representation map, injecting symplectic geometry and Hamiltonian mechanics directly at the representational level.
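The normal form described in the abstract ("cotangent lift of a base diffeomorphism followed by an exact fibre translation") can be sanity-checked numerically. Here is a minimal one-degree-of-freedom sketch; the base map `phi` and the potential behind `df` are illustrative choices of mine, not taken from the paper. It verifies that the composed update is a symplectomorphism by testing the standard condition M^T J M = J on a finite-difference Jacobian:

```python
import numpy as np

# Illustrative 1-D ingredients (hypothetical, not from the paper):
phi  = lambda q: q + 0.3 * np.sin(q)     # base diffeomorphism (invertible since |0.3| < 1)
dphi = lambda q: 1.0 + 0.3 * np.cos(q)   # derivative of phi, nonvanishing
df   = lambda Q: np.cos(Q)               # dF/dQ for F(Q) = sin(Q): an exact one-form

def sr_update(q, p):
    """Cotangent lift of phi, followed by an exact fibre translation."""
    Q = phi(q)
    P = p / dphi(q)          # cotangent lift: P = (Dphi)^{-T} p
    return Q, P + df(Q)      # translate the fibre by the exact one-form dF

def jacobian(fn, x, eps=1e-6):
    """Central-difference Jacobian of fn: R^2 -> R^2 at the point x."""
    n = len(x)
    M = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n); e[i] = eps
        M[:, i] = (np.array(fn(*(x + e))) - np.array(fn(*(x - e)))) / (2 * eps)
    return M

# Symplecticity test: M^T J M = J, with J the standard symplectic matrix.
J = np.array([[0.0, 1.0], [-1.0, 0.0]])
M = jacobian(sr_update, np.array([0.7, -0.2]))
assert np.allclose(M.T @ J @ M, J, atol=1e-6)
```

In one degree of freedom this is equivalent to det M = 1, i.e. the update is area-preserving in the (q, p) plane regardless of which base diffeomorphism and exact one-form are plugged in, which is the point of the normal form.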


Discover topoteretes & cognee, revolutionary AI/ML concepts #AI #MachineLearning #CognitiveComputing

The article discusses topoteretes and cognee, emerging concepts in AI/ML research, focusing on spatial reasoning and cognitive architectures. Topoteretes, a term derived from topology and teretes, implies a new approach to geometric deep learning, while cognee represents a paradigm shift in cognitive computing. Although...

#topoteretes #cognee #geometricdeeplearning #cognitivecomputing

Discover topoteretes & cognee, the mysterious AI trends #AITrends #MachineLearning #Cognee

The article discusses the emerging trend of topoteretes and cognee, although it lacks specific details on their applications or technological advancements. Topoteretes, a term not widely recognized in the AI/ML community, may refer to a novel approach in geometric deep learning or a specialized neural network architecture....

#topoteretes #cognee #geometricdeeplearning #cognitivearchitectures

Really looking forward to this workshop! It will be cool to discuss what equivariance can do for robotics and what the geometric DL field can learn from robotics.

https://mobile.twitter.com/DianWang1007/status/1640423772426768384

#GeometricDeepLearning #Robotics #RSS2023

Dian Wang on Twitter

“Excited to announce the #RSS2023 Workshop on Symmetries in Robot Learning! Submit your work and join us to explore how symmetries can help in robotics with our incredible speakers @andyzengtweets @HughWang19 @haosu_twitr @tesssmidt @erikjbekkers. Website: https://t.co/4oV1uXZIhb.”


Our #ICLR2023 workshop on Physics4ML is open for submissions. Deadline: 3rd February.

Submit your work on physics-based ML, equivariance, etc here: https://openreview.net/group?id=ICLR.cc/2023/Workshop/Physics4ML

More info:
https://physics4ml.github.io/

https://mobile.twitter.com/tk_rusch/status/1610305901558210563

#Physics4ML #AI4Science #GeometricDeepLearning

ICLR 2023 Workshop Physics4ML


We're organising a workshop on Physics for ML at #ICLR2023.

Submit your work on physics-based ML, equivariance, etc.

Site: https://physics4ml.github.io
OpenReview: https://openreview.net/group?id=ICLR.cc/2023/Workshop/Physics4ML

Deadline 3rd Feb.

https://twitter.com/tk_rusch/status/1603791044398702595

#Physics4ML #AI4Science #GeometricDeepLearning

Quick intro: I'm a Senior Lecturer (Assoc Prof) at King's College London. I work on developing tools for analysing cortical signals from MRI and electrophysiology. I'm involved with image processing for the Human Connectome Project, Developing Human Connectome Project and UK Biobank. #GeometricDeepLearning #AI #opendata
On my way to #aiche 2022 in #phoenix to talk about #GeometricDeepLearning! We'll see if I can make it out of #yyz today or if my flights will continue to be delayed and cancelled 🥲