Curvature-Informed SGD via General Purpose Lie-Group Preconditioners
Authors: Omead Pooladzandi, Xi-Lin Li
abs: https://arxiv.org/abs/2402.04553
pdf: https://arxiv.org/pdf/2402.04553.pdf
code1: https://github.com/lixilinx/psgd_torch
code2: https://github.com/opooladz/Preconditioned-Stochastic-Gradient-Descent
We present a novel approach to accelerating stochastic gradient descent (SGD) by utilizing curvature information obtained from Hessian-vector products or from finite differences of parameters and gradients, similar to the BFGS algorithm. Our approach involves two preconditioners: a matrix-free preconditioner and a low-rank approximation preconditioner. We update both preconditioners online using a criterion that is robust to stochastic gradient noise and requires neither line search nor damping. To preserve the corresponding symmetry or invariance, our preconditioners are constrained to certain connected Lie groups. The Lie group's equivariance property simplifies the preconditioner fitting process, while its invariance property eliminates the need for the damping commonly required in second-order optimizers. As a result, the learning rate for parameter updating and the step size for preconditioner fitting are naturally normalized, and their default values work well in most scenarios. Our approach offers a promising direction for improving the convergence of SGD with low computational overhead. We demonstrate that Preconditioned SGD (PSGD) outperforms state-of-the-art optimizers on vision, NLP, and RL tasks across multiple modern deep-learning architectures. We provide code for reproducing both the toy and large-scale experiments in this paper.
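To make the core idea concrete, here is a minimal toy sketch (not the linked libraries' API) of fitting a diagonal preconditioner online from finite differences of parameters and gradients, in the spirit of the matrix-free case described above. The per-coordinate closed form `|dx|/|dg|` as the minimizer of the fitting criterion, the probe scale, and the EMA smoothing are all illustrative assumptions; the paper's preconditioners live on richer Lie groups.

```python
import numpy as np

rng = np.random.default_rng(0)
h = np.array([1.0, 100.0, 1e4])         # ill-conditioned diagonal Hessian (toy problem)
x = rng.standard_normal(3)

def grad(x):
    return h * x                        # gradient of f(x) = 0.5 * x^T diag(h) x

p = None                                # diagonal preconditioner estimate
lr, beta = 0.5, 0.9                     # illustrative defaults, not the paper's
for _ in range(100):
    g = grad(x)
    dx = 1e-4 * rng.standard_normal(3)  # small random parameter perturbation
    dg = grad(x + dx) - g               # finite difference of gradients: a curvature pair
    # For a diagonal P, the fitting criterion is minimized per coordinate by |dx|/|dg|,
    # which recovers the inverse curvature 1/h on this quadratic.
    p_new = np.abs(dx) / (np.abs(dg) + 1e-12)
    p = p_new if p is None else beta * p + (1 - beta) * p_new
    x = x - lr * p * g                  # preconditioned SGD step

loss = 0.5 * np.sum(h * x * x)
print(loss)
```

Because `p` approaches `1/h`, every coordinate contracts at the same rate regardless of its curvature, which is why the step size stays naturally normalized.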