https://rohangautam.github.io/blog/chebyshev_gauss/ #GaussianIntegration #ChebyshevGauss #NumericalAnalysis #MathIsCool #PartyWithMath #HackerNews #ngated
Gaussian Integration Is Cool
https://rohangautam.github.io/blog/chebyshev_gauss/
#HackerNews #Gaussian #Integration #Cool #Mathematics #Algorithms #NumericalAnalysis
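The linked post is about Chebyshev–Gauss quadrature. Part of what makes it cool is that for the weight 1/√(1−x²) the nodes have a closed form and all weights are equal — a standard textbook rule, sketched here rather than taken from the post:

```python
import math

def chebyshev_gauss(f, n):
    """Approximate the integral of f(x)/sqrt(1 - x^2) over [-1, 1].

    Nodes are the Chebyshev points x_k = cos((2k-1)*pi/(2n)) and every
    weight equals pi/n, so no node/weight tables are needed.
    """
    return (math.pi / n) * sum(
        f(math.cos((2 * k - 1) * math.pi / (2 * n))) for k in range(1, n + 1)
    )
```

Like other Gaussian rules, it is exact for polynomial f up to degree 2n−1; for example, with f(x) = x² the exact value is π/2.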
New publication https://doi.org/10.1103/PhysRevB.111.205143
New algorithm for the #inverseproblem of Kohn-Sham #densityfunctionaltheory (#dft), i.e. to find the #potential from the #density.
Outcome of a fun collaboration of @herbst with the group of Andre Laestadius at #oslomet to derive the first mathematical error bounds for this problem.
#condensedmatter #planewave #numericalanalysis #convexanalysis #dftk
That first implementation didn't even support the multi-GPU and multi-node features of #GPUSPH (it could only run on a single GPU), but it paved the way for the full version, which took advantage of the whole GPUSPH infrastructure in multiple ways.
First of all, we didn't have to worry about how to encode the matrix and its sparseness, because we could compute the coefficients on the fly and operate with the same neighbor-list traversal logic used in the rest of the code; this allowed us to minimize memory use and increase code reuse.
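The matrix-free idea described here can be sketched in a few lines: never store A, just recompute each coefficient while walking the neighbor list, as one would inside an SPH neighbor loop. This is a toy illustration (the `coeff` callback standing in for kernel-derived coefficients is hypothetical), not GPUSPH code:

```python
def matvec(x, neighbors, coeff):
    """Compute y = A @ x without ever materializing A.

    neighbors[i] lists the particles interacting with particle i;
    coeff(i, j) recomputes the off-diagonal entry A[i][j] on the fly.
    The diagonal is accumulated so that each row sums to zero, as in a
    diffusion-like operator.
    """
    y = [0.0] * len(x)
    for i, nbrs in enumerate(neighbors):
        diag = 0.0
        for j in nbrs:
            w = coeff(i, j)       # coefficient computed on the fly
            y[i] += w * x[j]
            diag -= w
        y[i] += diag * x[i]
    return y
```

For a 1-D chain with unit coefficients this reproduces a (negated) graph Laplacian, which annihilates constant vectors — a quick sanity check.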
Secondly, we gained control over the accuracy of intermediate operations, allowing us to use compensated sums wherever needed.
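The "compensating sums" mentioned above refer to compensated summation. The post doesn't say which variant GPUSPH uses; this sketch is the Kahan–Babuška (Neumaier) form, which also handles terms larger than the running sum:

```python
def compensated_sum(values):
    """Kahan-Babuska (Neumaier) compensated summation.

    Tracks the rounding error of each addition in a separate
    correction term c, recovering digits a plain sum throws away.
    """
    s = 0.0
    c = 0.0                        # running compensation for lost low-order bits
    for x in values:
        t = s + x
        if abs(s) >= abs(x):
            c += (s - t) + x       # low-order bits of x were lost in s + x
        else:
            c += (x - t) + s       # low-order bits of s were lost in s + x
        s = t
    return s + c
```

A classic stress test: summing [1.0, 1e100, 1.0, -1e100] naively gives 0.0, while the compensated sum returns the correct 2.0.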
Thirdly, we could leverage the multi-GPU and multi-node capabilities already present in GPUSPH to distribute computations across all available devices.
And last but not least, we actually found ways to improve the classic #CG and #BiCGSTAB linear solvers to achieve excellent accuracy and convergence even without preconditioners, while making the algorithms themselves more parallel-friendly:
https://doi.org/10.1016/j.jcp.2022.111413
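For reference, here is the textbook unpreconditioned CG iteration that the paper's improved, parallel-friendly variants build on (the specific modifications are in the linked DOI, not reproduced here):

```python
def cg(A, b, tol=1e-12, maxiter=200):
    """Textbook conjugate gradient for a symmetric positive definite A.

    A is a dense list-of-rows matrix here for simplicity; in a real
    solver the A @ p product would be a matrix-free kernel.
    """
    n = len(b)
    x = [0.0] * n
    r = b[:]                       # residual r = b - A x, with x = 0
    p = r[:]                       # initial search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(maxiter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new ** 0.5 < tol:
            break
        p = [r[i] + (rs_new / rs) * p[i] for i in range(n)]
        rs = rs_new
    return x
```

On the classic 2×2 example A = [[4, 1], [1, 3]], b = [1, 2], CG reaches the exact solution (1/11, 7/11) in two steps.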
4/n
People in the market for a postdoc in numerical linear algebra should look at this advert for a position in Edinburgh "devoted to research on Randomized Numerical Linear Algebra for Optimization and Control of Partial Differential Equations."
The mentors are John Pearson (Edinburgh) and Stefan Güttel (Manchester), both excellent people, and the topic is fascinating. I even fantasised about leaving my permanent job and doing this instead ...
More info: https://www.jobs.ac.uk/job/DNA984/postdoctoral-research-associate
#NumericalAnalysis #optimization #PartialDifferentialEquations #postdoc
Thanks to the Manchester NA group for organizing a seminar by David Watkins, one of the foremost experts on matrix eigenvalue algorithms. I often find numerical linear algebra talks too technical, but I could follow David's talk quite well even though I did not catch everything, so thanks for that.
David spoke about the standard eigenvalue algorithm, usually called the QR algorithm. He does not like that name, because the QR decomposition is not actually important in practice, and he calls it the Francis algorithm instead (after John Francis, who developed it). It is better to think of the algorithm as an iterative process that reduces the matrix to triangular form in the limit.
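The "iterative process" view can be demonstrated with the naive, unshifted iteration A ← RQ (not Francis's implicitly shifted bulge-chasing version, which avoids explicit QR factorizations entirely). Each step is a similarity transform, so eigenvalues are preserved while the off-diagonal decays:

```python
def qr(A):
    """QR factorization of a small square matrix via modified Gram-Schmidt."""
    n = len(A)
    Q = [[0.0] * n for _ in range(n)]
    R = [[0.0] * n for _ in range(n)]
    for j in range(n):
        v = [A[i][j] for i in range(n)]           # j-th column of A
        for k in range(j):
            R[k][j] = sum(Q[i][k] * v[i] for i in range(n))
            v = [v[i] - R[k][j] * Q[i][k] for i in range(n)]
        R[j][j] = sum(vi * vi for vi in v) ** 0.5
        for i in range(n):
            Q[i][j] = v[i] / R[j][j]
    return Q, R

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Iterate A <- R Q (= Q^T A Q, a similarity transform): the diagonal
# converges to the eigenvalues. Toy symmetric matrix with eigenvalues 3 and 1.
A = [[2.0, 1.0], [1.0, 2.0]]
for _ in range(60):
    Q, R = qr(A)
    A = matmul(R, Q)
```

After the loop the iterate is essentially diag(3, 1); the off-diagonal shrinks roughly like (λ₂/λ₁)ᵏ, which is why practical implementations add shifts to accelerate convergence.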
Newton's method:
#newton #mathematics #numericalanalysis #optimization #research
📉
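Since the post above is just a pointer, here is the textbook iteration it refers to, x_{k+1} = x_k − f(x_k)/f′(x_k), as a minimal sketch:

```python
def newton(f, df, x0, tol=1e-14, maxiter=50):
    """Newton's method for f(x) = 0, given the derivative df."""
    x = x0
    for _ in range(maxiter):
        step = f(x) / df(x)       # Newton step: f(x_k) / f'(x_k)
        x -= step
        if abs(step) < tol:       # stop once the update is negligible
            break
    return x
```

With f(x) = x² − 2 and x₀ = 1, the quadratic convergence delivers √2 to machine precision in a handful of iterations.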
SUperman: Efficient Permanent Computation on GPUs
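For context on what such GPU codes compute: the classical sequential baseline for the matrix permanent is Ryser's formula, an inclusion–exclusion over column subsets. This plain O(2ⁿ·n²) version is a standard result, not taken from the paper (which optimizes far beyond it):

```python
from itertools import combinations

def permanent(A):
    """Matrix permanent via Ryser's inclusion-exclusion formula.

    perm(A) = (-1)^n * sum over nonempty column subsets S of
              (-1)^|S| * prod_i sum_{j in S} A[i][j]
    """
    n = len(A)
    total = 0.0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1.0
            for row in A:
                prod *= sum(row[j] for j in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total
```

Sanity checks: the permanent of [[1, 2], [3, 4]] is 1·4 + 2·3 = 10, and the permanent of the 3×3 all-ones matrix is 3! = 6.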
Apparently we weren't having enough context-collapse issues with #SPH as an acronym for #SmoothedParticleHydrodynamics, since I'm now seeing #STI as an acronym for #SymplecticTimeIntegrator. And of course these articles are more often than not written in #LaTeX.
(No, Mastodon, I really do not want you to normalize the case of *that* tag.)
One of these days I'm going to create a quiz game: #kink, #fetish, or #numericalAnalysis?