This doesn’t mean curvature isn’t interesting, though. Agustinus Kristiadi (impersonated imperfectly by myself) explains, in lecture 13, the beautiful connection between curvature and uncertainty, and introduces linearized Laplace approximations.

https://youtu.be/LssjrrOMlIg

Numerics of ML 13 -- Uncertainty in Deep Learning -- Agustinus Kristiadi

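In one line (my paraphrase of the idea, not the lecture's notation): a Laplace approximation reads the curvature of the training loss at the trained weights as a posterior precision, and linearizing the network around those weights turns it into a Gaussian process:

```latex
% Laplace approximation: the curvature of the regularized training loss L at
% the trained weights \theta_* acts as the posterior precision
p(\theta \mid \mathcal{D}) \approx \mathcal{N}\big(\theta_*,\, H^{-1}\big),
\qquad H = \nabla^2_\theta L(\theta_*).
% Linearized Laplace: expand the network to first order around \theta_*,
f(x; \theta) \approx f(x; \theta_*) + J_{\theta_*}(x)\,(\theta - \theta_*),
% which yields a Gaussian process over functions with predictive covariance
\operatorname{cov}[f(x), f(x')] \approx J_{\theta_*}(x)\, H^{-1}\, J_{\theta_*}(x')^{\top}.
% (In practice H is usually the generalized Gauss-Newton plus the prior precision.)
```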

People like @emtiyaz, Alex Immer, Erik Daxberger, Matthias Bauer, Runa Eschenhagen and Agustinus himself have built a beautifully comprehensive framework to turn deep models approximately into Gaussian processes, and thus transfer all the clean mechanisms associated with GPs to deep learning:

Calibrated, learnable uncertainty; out-of-distribution robustness; architecture optimization by evidence maximization; multi-task and lifelong learning; and the list goes on.
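To make that concrete, here is a minimal, self-contained sketch of the simplest instance, a last-layer Laplace approximation for a regression network in plain PyTorch. It is my own illustration, not code from the course or from the group's library; the prior precision `alpha` and noise level `sigma` are assumed values (in the full framework they would be learned, e.g. by evidence maximization).

```python
import torch

torch.manual_seed(0)

# Toy 1D regression data: y = sin(x) + noise.
x = torch.linspace(-3, 3, 100).unsqueeze(-1)
y = torch.sin(x) + 0.1 * torch.randn_like(x)

# Small MLP; only the last (linear) layer is treated probabilistically.
feats = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh())
head = torch.nn.Linear(32, 1)
model = torch.nn.Sequential(feats, head)

# Ordinary MAP training (weight decay plays the role of a Gaussian prior).
opt = torch.optim.Adam(model.parameters(), lr=1e-2, weight_decay=1e-4)
for _ in range(2000):
    opt.zero_grad()
    torch.nn.functional.mse_loss(model(x), y).backward()
    opt.step()

alpha, sigma = 1.0, 0.1  # assumed prior precision and observation noise std
with torch.no_grad():
    # For a linear head the "linearization" is exact: the Jacobian of the
    # output w.r.t. the head's weights (and bias) is just the feature map.
    phi = torch.cat([feats(x), torch.ones(len(x), 1)], dim=1)          # (N, 33)
    # Curvature (here the exact Hessian / GGN) plus prior = posterior precision.
    H = phi.T @ phi / sigma**2 + alpha * torch.eye(phi.shape[1])
    Sigma = torch.linalg.inv(H)                                        # weight covariance

    # GP-style predictive: mean from the trained net, variance = J Sigma J^T + noise.
    x_test = torch.linspace(-5, 5, 200).unsqueeze(-1)
    phi_t = torch.cat([feats(x_test), torch.ones(len(x_test), 1)], dim=1)
    mean = model(x_test).squeeze(-1)
    var = (phi_t @ Sigma * phi_t).sum(-1) + sigma**2
    print(mean[:3], var.sqrt()[:3])  # predictive mean and standard deviation
```

The same J Σ Jᵀ predictive-covariance formula is what carries over to the full-network, genuinely linearized case, with Kronecker-factored or low-rank curvature standing in for the exact Hessian.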

@emtiyaz At the end, I sneak myself on stage one final time to summarize: https://youtu.be/t2CSqdfmKGA

Long story short:

Computation is inference. Whether information is loaded from disk or prepared directly on the GPU is not a fundamental distinction. If we quantify uncertainty everywhere, we can actively control, guide, manage and monitor the use of empirical as well as computational data.

Numerics of ML 14 -- Conclusion -- Philipp Hennig


@emtiyaz This course, and its Open Educational Resources, were brought to you by
Nathanael Bosch, Julia Grosse, Agustinus Kristiadi, Marvin Pförtner, Jonathan Schmidt, Frank Schneider, Lukas Tatzel, Jonathan Wenger, and yours truly.

-- The End (of Term) --