JAX
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
JAX is Autograd and XLA, brought together for high-performance machine learning research.
With its updated version of Autograd, JAX can automatically differentiate native Python and NumPy functions. It can differentiate through loops, branches, recursion, and closures, and it can take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation) via grad as well as forward-mode differentiation, and the two can be composed arbitrarily to any order.
https://github.com/google/jax
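
As a small illustration of the composition described above, the sketch below (not taken from the JAX repository; the function f and variable names are made up for demonstration) nests jax.grad to get a second derivative and mixes forward-mode jax.jacfwd with reverse-mode jax.grad:

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x) * x**2       # f(x) = x^2 * sin(x)

df = jax.grad(f)                   # reverse-mode first derivative
d2f = jax.grad(jax.grad(f))        # second derivative by composing grad with itself

# forward-mode (jacfwd) composes with reverse-mode grad just as freely
d2f_fwd = jax.jacfwd(jax.grad(f))

print(df(1.0), d2f(1.0), d2f_fwd(1.0))
```

The same composability extends to jax.jit and jax.vmap, so any of these derivatives can be JIT-compiled or vectorized in the same way.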
Examples
Computing second derivatives with JAX's automatic differentiation - Qiita
Tutorial
What is JAX? - YouTube