Elizabeth Hazen "Approximations" — THE SHORE

'Laplace Meets Moreau: Smooth Approximation to Infimal Convolutions Using Laplace's Method', by Ryan J. Tibshirani, Samy Wu Fung, Howard Heaton, Stanley Osher.

http://jmlr.org/papers/v26/24-0944.html

#convolutions #laplace #approximations

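For context, the standard definitions behind the title (general background, not quoted from the paper): the infimal convolution and its Moreau-envelope special case are

    (f \square g)(x) = \inf_y \{ f(y) + g(x - y) \},
    M_t f(x) = \inf_y \{ f(y) + \tfrac{1}{2t} \|x - y\|^2 \},

and Laplace's method motivates the smooth soft-min surrogate

    (f \square g)_\varepsilon(x) = -\varepsilon \log \int e^{-(f(y) + g(x - y)) / \varepsilon} \, dy,

which recovers the exact infimal convolution in the limit $\varepsilon \to 0$.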

France24 speaks French like a Spanish cow (that is, badly).

The guys are journalists.... (or is it #ia_generative at work?)

#journalisme #anglicismes #approximations #merdification

'Correction to "Wasserstein distance estimates for the distributions of numerical approximations to ergodic stochastic differential equations"', by Daniel Paulin, Peter A. Whalley.

http://jmlr.org/papers/v25/24-0895.html

#ergodic #wasserstein #approximations

Correction to "Wasserstein distance estimates for the distributions of numerical approximations to ergodic stochastic differential equations"

'Low-rank Variational Bayes correction to the Laplace method', by Janet van Niekerk, Haavard Rue.

http://jmlr.org/papers/v25/21-1405.html

#variational #hyperparameters #approximations

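For context, the classical method being corrected (textbook background, not the paper's contribution): the Laplace method approximates a posterior by a Gaussian centred at the mode,

    p(\theta \mid y) \approx \mathcal{N}(\hat{\theta}, H^{-1}), \qquad \hat{\theta} = \arg\max_\theta \log p(\theta \mid y), \qquad H = -\nabla^2 \log p(\theta \mid y) \big|_{\theta = \hat{\theta}},

and the paper adds a low-rank variational Bayes correction on top of this Gaussian.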

Fast Kernel Methods for Generic Lipschitz Losses via $p$-Sparsified Sketches

Tamim El Ahmad, Pierre Laforgue, Florence d'Alché-Buc

Action editor: Makoto Yamada.

https://openreview.net/forum?id=ry2qgRqTOw

#sparse #kernel #approximations

Kernel methods are learning algorithms that enjoy solid theoretical foundations while suffering from important computational limitations. Sketching, which consists in looking for solutions among a...

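A rough numpy sketch of sketched kernel ridge regression, the general idea the abstract gestures at. The Bernoulli-times-Rademacher sketch matrix below is one plausible reading of "$p$-sparsified", not necessarily the paper's exact construction, and every size and helper name here is invented for illustration:

    import numpy as np

    def gaussian_kernel(X, Y, gamma=1.0):
        # Pairwise RBF kernel matrix between rows of X and Y.
        sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)

    def p_sparsified_sketch(m, n, p, rng):
        # Each entry is 0 with probability 1 - p, otherwise a scaled
        # Rademacher sign; the scaling gives E[S^T S] = I_n.
        signs = rng.choice([-1.0, 1.0], size=(m, n))
        mask = rng.random((m, n)) < p
        return signs * mask / np.sqrt(m * p)

    rng = np.random.default_rng(0)
    n, m, p = 500, 50, 0.1
    X = rng.standard_normal((n, 3))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

    K = gaussian_kernel(X, X)   # the expensive n x n object
    S = p_sparsified_sketch(m, n, p, rng)
    KS = K @ S.T                # n x m: sketched kernel columns

    # Restrict the KRR solution to alpha = S^T beta with beta in R^m:
    # minimize ||y - K S^T beta||^2 + lam * beta^T (S K S^T) beta.
    lam = 1e-2
    A = KS.T @ KS + lam * (S @ KS)   # S K^2 S^T + lam * S K S^T
    beta = np.linalg.solve(A, KS.T @ y)
    y_hat = KS @ beta           # in-sample predictions

The point of sketching is that the linear solve is m x m instead of n x n.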

Non-asymptotic approximations of Gaussian neural networks via second-order Poincaré inequalities

https://openreview.net/forum?id=OUn4ezaMem

#gaussian #asymptotic #approximations

There is a growing interest on large-width asymptotic and non-asymptotic properties of deep Gaussian neural networks (NNs), namely NNs with weights initialized as Gaussian distributions. For a...

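For background, the large-width picture the paper quantifies (stated from general knowledge, not the abstract): a one-hidden-layer network with i.i.d. Gaussian weights,

    f(x) = \frac{1}{\sqrt{n}} \sum_{j=1}^{n} v_j \, \phi(\langle w_j, x \rangle), \qquad v_j, w_{jk} \sim \mathcal{N}(0, 1),

converges to a Gaussian process as the width $n \to \infty$ by a central-limit argument; second-order Poincaré inequalities are one tool for turning that qualitative limit into explicit finite-$n$ bounds on the distance to the Gaussian.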

Non-asymptotic approximations of Gaussian neural networks via second-order Poincaré inequalities

https://openreview.net/forum?id=BKtxHvwnut

#gaussian #approximations #approximation

There is a recent and growing interest on large-width asymptotic properties of Gaussian neural networks (NNs), namely NNs whose weights are initialized according to Gaussian distributions. A...

@_thegeoff @static

The real problem there is when #QFT (for example) is no better overall at explaining. If the leading-edge theories were sufficiently developed, there would be easy-to-follow modelling.

Also, the notion that #maths can do this without proper models & explanations leads to false impressions. ('Shut up & calculate' exemplifies this.)

I think appeals to authority are part of it: when anyone who is 'supposed to know' can't actually explain without resorting to contradictory analogies, they just fall back on the pressure of 'this is the version that will earn points on the test' for practical purposes, sprinkling in enough 'mystery & #paradox' to keep the emotional upper hand.

This creates a #society educated into false consensus, with little trust in pursuing #reason or truth, one that instead places the most value on marching forward without such clarity. The #information trickle-down effect places teachers right in the middle of this dilemma.

Note that this is a necessary element we've evolved with; we won't eliminate it. We can, however, spend more time #teaching about the shortcomings of the #approximations students are expected to learn, and encourage them to contribute to the next level of understanding rather than to #fear trying & failing.

Solving #brain dynamics gives rise to flexible machine learning models | #MIT CSAIL

Liquid neural networks made an order of magnitude faster and more scalable by the use of closed-form #approximations, that is, "closed-form continuous-time" (CfC) neural networks.

#LiquidNeuralNetworks #DeepLearning #ContinuousTime #ODE #AI #MachineLearning

https://www.csail.mit.edu/news/solving-brain-dynamics-gives-rise-flexible-machine-learning-models

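A minimal numpy sketch of the gating idea behind CfC cells, assuming the published closed form $x(t) = \sigma(-f t) \odot g + (1 - \sigma(-f t)) \odot h$ with $f$, $g$, $h$ as small learned heads of the state and input; the head shapes and names here are invented for illustration, and real CfC models are trained networks:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def head(params, z):
        # Tiny one-layer head standing in for a learned sub-network.
        W, b = params
        return np.tanh(z @ W + b)

    def cfc_cell(x, u, t, theta_f, theta_g, theta_h):
        # Closed-form continuous-time update: a time-dependent gate
        # sigma(-f * t) blends two head outputs, so no ODE solver runs.
        z = np.concatenate([x, u])
        gate = sigmoid(-head(theta_f, z) * t)
        return gate * head(theta_g, z) + (1.0 - gate) * head(theta_h, z)

    rng = np.random.default_rng(0)
    d_x, d_u = 8, 3
    init = lambda: (0.1 * rng.standard_normal((d_x + d_u, d_x)), np.zeros(d_x))
    theta_f, theta_g, theta_h = init(), init(), init()

    x = np.zeros(d_x)
    for t, u in zip([0.1, 0.2, 0.5], rng.standard_normal((3, d_u))):
        x = cfc_cell(x, u, t, theta_f, theta_g, theta_h)

Evaluating the state at an arbitrary time t is a single closed-form expression here, which is where the speedup over solver-based liquid networks comes from.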