'Faster Randomized Methods for Orthogonality Constrained Problems', by Boris Shustin, Haim Avron.
http://jmlr.org/papers/v25/21-1022.html
#preconditioning #randomization #randomized
Preconditioning for Physics-Informed Neural Networks
Songming Liu, Chang Su, Jiachen Yao, Zhongkai Hao, Hang Su, Youjia Wu, Jun Zhu
abs: http://arxiv.org/abs/2402.00531
pdf: https://arxiv.org/pdf/2402.00531.pdf
I have been on a preconditioning kick, and I'm seeing -- occasionally! -- more papers on its use outside of numerical analysis. Yeah 🎉
Physics-informed neural networks (PINNs) have shown promise in solving various partial differential equations (PDEs). However, training pathologies have negatively affected the convergence and prediction accuracy of PINNs, which further limits their practical applications. In this paper, we propose to use condition number as a metric to diagnose and mitigate the pathologies in PINNs. Inspired by classical numerical analysis, where the condition number measures sensitivity and stability, we highlight its pivotal role in the training dynamics of PINNs. We prove theorems to reveal how condition number is related to both the error control and convergence of PINNs. Subsequently, we present an algorithm that leverages preconditioning to improve the condition number. Evaluations of 18 PDE problems showcase the superior performance of our method. Significantly, in 7 of these problems, our method reduces errors by an order of magnitude. These empirical findings verify the critical role of the condition number in PINNs' training.
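The abstract's core idea, that a smaller condition number means better error control and faster convergence, is the same principle behind classical preconditioning of linear systems. A minimal NumPy sketch of that principle (my own toy illustration, not the paper's PINN algorithm; the matrices `B`, `S`, and the scaling choices are made up for the demo):

```python
import numpy as np

# Toy sketch: a badly scaled system has a huge condition number;
# symmetric diagonal (Jacobi-style) preconditioning removes the
# scaling and brings the condition number down by orders of magnitude.
rng = np.random.default_rng(0)
n = 50
B = np.eye(n) + 0.1 * rng.standard_normal((n, n))  # well-conditioned core
S = np.diag(np.logspace(0, 4, n))                  # severe row/column scaling
A = S @ B @ S                                      # ill-conditioned system

# Precondition: D^{-1/2} A D^{-1/2} with D = diag(A).
d = np.sqrt(np.diag(A))
A_pre = A / np.outer(d, d)

cond_before = np.linalg.cond(A)
cond_after = np.linalg.cond(A_pre)
print(f"cond(A) = {cond_before:.2e}, preconditioned = {cond_after:.2e}")
```

The same diagnostic, measure the condition number before and after preconditioning, is what the paper lifts from numerical analysis into PINN training.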
Electric car drivers can make use of 'magic' feature to avoid windscreen frost and fines
https://www.msn.com/en-gb/cars/news/electric-car-drivers-can-make-use-of-magic-feature-to-avoid-windscreen-frost-and-fines/ar-AA16NgoF?cvid=4bbe2508708a4725ab78b031b73b992f&ocid=winp2fptaskbarhover&ei=33