New publication https://doi.org/10.1038/s41524-025-01880-3

Our work on AD-DFPT, a unification of #automaticdifferentiation with linear response for #densityfunctionaltheory, has been published in npj Computational Materials. We show examples for #property prediction, #uncertainty propagation, the design of #materials and the #machinelearning of new #dft models. #condensedmatter #dftk

#SciML fact of the day: automatic differentiation fails to give the correct derivative on a lot of very simple functions 😱 😱 😱. #julialang #automaticdifferentiation

https://youtube.com/shorts/KTguZpL9Zz8

Automatic differentiation is incorrect on very simple functions??? 😱 😱 😱

The Numerical Analysis of Differentiable Simulation: Automatic Differentiation Can Be Incorrect - Stochastic Lifestyle

ISCL Seminar Series: The Numerical Analysis of Differentiable Simulation: How Automatic Differentiation of Physics Can Give Incorrect Derivatives. Scientific machine learning (SciML) relies heavily on automatic differentiation (AD): the construction of gradients through mechanistic models with integrated machine-learning components, for the purpose of gradient-based optimization. While differentiable programming approaches pitch the idea of "simply put the simulator into a loss function and use AD", in practice there are many more subtle details to consider. In this talk we dive into the numerical analysis of differentiable simulation and ask the question: how numerically stable and robust is AD? We use examples from the Python-based JAX (diffrax) and PyTorch (torchdiffeq) libraries to demonstrate how canonical formulations ...
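One flavor of this pitfall can be sketched in a few lines of pure Python (my own toy illustration, not an example from the talk): AD differentiates the program you wrote, not the mathematical function it implements. A value-based branch that special-cases an input, as numerical code often does for norms and the like, hides the dependence on the input, so forward-mode AD reports a zero derivative where the true derivative is one.

```python
class Dual:
    """Minimal forward-mode AD value: tracks a value and its derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __mul__(self, other):
        o = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__
    def __eq__(self, other):  # comparisons only see the value, not the derivative
        return self.val == (other.val if isinstance(other, Dual) else other)

def f(x):
    # Special-casing x == 0 disconnects the output from x on that branch,
    # so AD sees a constant there even though mathematically f(x) = x.
    if x == 0:
        return Dual(0.0) if isinstance(x, Dual) else 0.0
    return 1.0 * x  # f(x) = x, hence f'(x) = 1 everywhere, including x = 0

ad_deriv = f(Dual(0.0, 1.0)).dot        # forward-mode AD at x = 0 -> 0.0
fd_deriv = (f(1e-6) - f(-1e-6)) / 2e-6  # central finite difference -> ~1.0
print(ad_deriv, fd_deriv)
```

The finite-difference check exposes the discrepancy: the program and the function it implements disagree about the derivative exactly at the special-cased point.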


New preprint: https://arxiv.org/abs/2509.07785

We present an implementation of AD-DFPT, a unification of #automaticdifferentiation with classical #dfpt response techniques for #densityfunctionaltheory (#dft). We demonstrate its use for #property prediction, #uncertainty propagation, the design of new #materials as well as the #machinelearning of new #dft models.

#condensedmatter #planewave #response #physics #simulation #computation

Algorithmic differentiation for plane-wave DFT: materials design, error control and learning model parameters

We present a differentiation framework for plane-wave density-functional theory (DFT) that combines the strengths of algorithmic differentiation (AD) and density-functional perturbation theory (DFPT). In the resulting AD-DFPT framework derivatives of any DFT output quantity with respect to any input parameter (e.g. geometry, density functional or pseudopotential) can be computed accurately without deriving gradient expressions by hand. We implement AD-DFPT in the Density-Functional ToolKit (DFTK) and show its broad applicability. Among other examples, we consider the inverse design of a semiconductor band gap, the learning of exchange-correlation functional parameters, and the propagation of DFT parameter uncertainties to relaxed structures. These examples demonstrate a number of promising research avenues opened by gradient-driven workflows in first-principles materials modeling.
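The inverse-design idea from the abstract can be sketched with a toy differentiable model (everything below is made up for illustration; the paper does this with a full plane-wave DFT calculation in DFTK, where AD-DFPT supplies the derivative): pick an input parameter, define a target band gap, and follow the gradient of a squared-error loss.

```python
import math

# Toy stand-in for a differentiable simulation: gap(p) is NOT a real DFT
# model, just a smooth function playing the role of "DFT output vs input".
def gap(p):
    return 2.0 * math.tanh(p)          # hypothetical gap model (eV)

def dgap_dp(p):
    # Exact derivative of the toy model; for real DFT outputs this is the
    # quantity AD-DFPT computes without hand-derived gradient expressions.
    return 2.0 / math.cosh(p) ** 2

target = 1.5                           # desired band gap (eV)
p = 0.1                                # initial parameter guess
for _ in range(200):                   # gradient descent on (gap(p) - target)^2
    loss_grad = 2.0 * (gap(p) - target) * dgap_dp(p)
    p -= 0.1 * loss_grad

print(round(gap(p), 4))                # -> close to 1.5
```

The point is structural: once every output is differentiable with respect to every input, design problems reduce to standard gradient-based optimization loops like this one.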


This week the @MatMat group takes part in the #psik conference (https://www.psik2025.net/) at #epfl with plenty of cutting-edge talks on #materials #modeling and simulations of #condensedmatter.

My contribution was a short talk on #error quantification and propagation in #densityfunctionaltheory simulations, leveraging the built-in #automaticdifferentiation framework of the #dftk code for automatic gradient computation.

Slides: https://michael-herbst.com/talks/2025.08.25_Psik.pdf

Psi-k conference

SwissTech Convention Center, EPFL, Lausanne (Switzerland)

As part of the #cecam workshop on perspectives of the atomistic simulation environment (#ase) I delivered a talk on our #materials #modeling ecosystem juliamolsim.org, written in the #julialang programming language, and showed some examples: #automaticdifferentiation through the simulation pipeline, seamless #gpu usage, #error propagation and more.

Slides: https://michael-herbst.com/talks/2025.06.23_ASE_perspectives.pdf
#julialang demo: https://michael-herbst.com/talks/2025.06.23_ASE_perspectives_demo.tar.gz

#dftk #densityfunctionaltheory #condensedmatter #planewave #simulation

An Illustrated Guide to Automatic Sparse Differentiation | ICLR Blogposts 2025

In numerous applications of machine learning, Hessians and Jacobians exhibit sparsity, a property that can be leveraged to vastly accelerate their computation. While the usage of automatic differentiation in machine learning is ubiquitous, automatic sparse differentiation (ASD) remains largely unknown. This post introduces ASD, explaining its key components and their roles in the computation of both sparse Jacobians and Hessians. We conclude with a practical demonstration showcasing the performance benefits of ASD.
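The core trick of ASD can be sketched in pure Python (finite differences stand in for the AD engine here; real ASD tools seed dual numbers or tangent vectors the same way): if the Jacobian is known to be tridiagonal, columns {0, 3, ...}, {1, 4, ...} and {2, 5, ...} never touch the same row, so three seeded evaluations recover all n columns instead of n evaluations.

```python
def f(x):
    # Map with a tridiagonal Jacobian: y[i] depends on x[i-1], x[i], x[i+1].
    n = len(x)
    return [x[max(i - 1, 0)] * x[i] + x[min(i + 1, n - 1)] ** 2
            for i in range(n)]

def jacobian_colored(f, x, h=1e-7):
    n = len(x)
    y0 = f(x)
    J = [[0.0] * n for _ in range(n)]
    # Color columns mod 3: structurally orthogonal for a tridiagonal pattern,
    # i.e. no two columns of the same color share a nonzero row.
    for color in range(3):
        seed = [h if j % 3 == color else 0.0 for j in range(n)]
        yd = f([xj + sj for xj, sj in zip(x, seed)])
        diff = [(a - b) / h for a, b in zip(yd, y0)]
        for j in range(color, n, 3):
            for i in range(max(j - 1, 0), min(j + 2, n)):  # rows hit by col j
                J[i][j] = diff[i]
    return J  # 3 evaluations of f instead of n

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
J = jacobian_colored(f, x)
```

The compression ratio grows with n: for a banded Jacobian the number of seeded evaluations depends only on the bandwidth, not the dimension, which is where the "vastly accelerate" in the blog post comes from.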

Have you ever thought 💡 of using JAX as 🧮 #automaticdifferentiation engine in 💻 finite element simulations? Boost the performance 🐇 of computationally expensive hyperelastic material models with #jit in 🔍 FElupe! 🚀 🚀

https://github.com/adtzlr/felupe

#python #jax #finiteelementmethod #scientificcomputing #computationalmechanics #fea #fem #hyperelasticity
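The appeal of AD for hyperelasticity is that you only write the scalar strain-energy function and get the stress by differentiation. A one-dimensional toy version (my own sketch, not FElupe code; the complex-step trick stands in for JAX's `jax.grad`, and the material constant is made up):

```python
C1 = 0.5  # toy neo-Hookean material constant (MPa), chosen arbitrarily

def psi(stretch):
    # 1D incompressible neo-Hookean strain energy under uniaxial stretch l:
    # psi(l) = C1 * (l^2 + 2/l - 3)
    return C1 * (stretch ** 2 + 2.0 / stretch - 3.0)

def stress(stretch, h=1e-20):
    # Stress P = d(psi)/d(stretch), obtained by differentiating the energy.
    # Complex-step differentiation: exact to machine precision, no symbolic
    # derivation needed -- the same workflow AD engines automate for tensors.
    return psi(stretch + 1j * h).imag / h

lam = 1.3
print(stress(lam))                      # derivative of the energy
print(C1 * (2 * lam - 2 / lam ** 2))    # analytic check: same value
```

In 3D the same pattern applies with the deformation gradient in place of the scalar stretch, which is exactly what makes "just write the energy" libraries attractive.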

GitHub - adtzlr/felupe: 🔍 finite element analysis for continuum mechanics of solid bodies

This paper from @jenseisert and colleagues sounds interesting!

"The incorporation of automatic differentiation in tensor networks algorithms has ultimately enabled a new, flexible way for variational simulation of ground states and excited states. In this work, we review the state of the art of the variational iPEPS framework. We present and explain the functioning of an efficient, comprehensive and general tensor network library for the simulation of infinite two-dimensional systems using iPEPS, with support for flexible unit cells and different lattice geometries."

https://scirate.com/arxiv/2308.12358

#quantum #TensorNetwork #computational #physics #AutomaticDifferentiation #iPEPS

variPEPS -- a versatile tensor network library for variational ground state simulations in two spatial dimensions

Tensor networks capture large classes of ground states of phases of quantum matter faithfully and efficiently. Their manipulation and contraction has remained a challenge over the years, however. For most of the history, ground state simulations of two-dimensional quantum lattice systems using (infinite) projected entangled pair states have relied on what is called a time-evolving block decimation. In recent years, multiple proposals for the variational optimization of the quantum state have been put forward, overcoming accuracy and convergence problems of previously known methods. The incorporation of automatic differentiation in tensor networks algorithms has ultimately enabled a new, flexible way for variational simulation of ground states and excited states. In this work, we review the state of the art of the variational iPEPS framework. We present and explain the functioning of an efficient, comprehensive and general tensor network library for the simulation of infinite two-dimensional systems using iPEPS, with support for flexible unit cells and different lattice geometries.


I really enjoyed the talk by Manuel Drehwald at #RustSciComp23, who sketched an exciting future for #AutomaticDifferentiation in #Rust with #LLVM #Enzyme, which should be directly integrated into the compiler within a horizon of a couple of months.

If I understood correctly, the idea is to differentiate code at the LLVM IR level, *after optimization* (and then run another optimization pass on the result). This can produce faster code than AD engines that operate at the source-code level.