Glad to share the publication of our #newpaper:
A Predictive Approach to Enhance Time-Series Forecasting
By Skye Gunasekaran, Assel Kembay, Hugo Ladret, Rui-Jie Zhu, myself, Omid Kavehei and Jason Eshraghian
The senior author, Jason Eshraghian, puts it most clearly:
For the amount of compute they burn, transformers are pretty bad at time-series analysis. That's unsurprising if your objective is to predict the next token, one step at a time.
Brains, on the other hand, are predictive machines. Think of your daily commute to work. On Day 1, your brain was in overdrive to make sure you weren't late, taking in everything around you. By Day 1000, you're on full autopilot, barely burning any mental energy unless something unexpected, like a major accident, forces you to adjust.
That's predictive coding in action: the brain continuously compares its expectations (no traffic) to reality (a flipped car, damn), then updates only when surprised.
Skye Gunasekaran has spent the past couple of years integrating this principle into Future-Guided Learning, where a "future" model guides a "past" forecasting model, dynamically minimizing surprise when reality deviates from predictions.
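If you want to see the shape of the idea in code, here is a minimal PyTorch sketch (my own illustration, not the paper's implementation). It assumes a distillation-style setup: a "future" model, which gets to see a later window of the series, produces a belief that serves as a soft target for the "past" forecaster, whose loss adds a KL-divergence "surprise" term to the usual task loss. The names Forecaster, future_guided_loss, and the alpha weight are mine:

```python
# Sketch of future-guided learning: a "future" teacher guides a "past"
# forecaster via a "surprise" (KL divergence) term. Illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Forecaster(nn.Module):
    def __init__(self, window, n_classes):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(window, 64), nn.ReLU(), nn.Linear(64, n_classes)
        )

    def forward(self, x):
        return self.net(x)  # logits over forecast outcomes

past_model = Forecaster(window=32, n_classes=2)    # sees the early window
future_model = Forecaster(window=32, n_classes=2)  # sees a later window
opt = torch.optim.Adam(past_model.parameters(), lr=1e-3)

def future_guided_loss(past_logits, future_logits, target, alpha=0.5):
    # Standard task loss: did the past model predict the label?
    task = F.cross_entropy(past_logits, target)
    # "Surprise": divergence between the past model's belief and the
    # better-informed future model's belief; large when reality deviates
    # from the forecast. detach() so only the past model gets gradients.
    surprise = F.kl_div(
        F.log_softmax(past_logits, dim=-1),
        F.softmax(future_logits, dim=-1).detach(),
        reduction="batchmean",
    )
    return task + alpha * surprise

# One toy training step: x_past is the early window, x_future a later one.
x_past, x_future = torch.randn(8, 32), torch.randn(8, 32)
y = torch.randint(0, 2, (8,))
loss = future_guided_loss(past_model(x_past), future_model(x_future), y)
opt.zero_grad()
loss.backward()
opt.step()
```

In a real run the future model would be trained on data closer to the event it labels; the toy MLPs and random tensors here are only there to show how the loss is wired.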
In our paper, we show how drawing on neuroscience-inspired ideas genuinely helps time-series forecasting with deep learning. Efficiency isn't the only win from the brain; it's also pretty damn good at organizing long-range time-series information.
https://www.linkedin.com/feed/update/urn:li:activity:7378797683425296385/
https://www.nature.com/articles/s41467-025-63786-4
https://laurentperrinet.github.io/publication/gunasekaran-25/