Investigation of chemical reactivity by machine learning techniques

PN 6 A-1

Project description

The prediction of chemical processes and materials properties requires accurate knowledge of potential energy surfaces, which map atomic positions and nuclear charges to electronic energies. Such potential energy surfaces can be obtained point-wise from electronic structure calculations. Based on these data, machine-learned models can be trained, provided that appropriate descriptors of atomic structures are available. These descriptors have to fulfill physical constraints, such as invariance with respect to rotation, translation, and reflection of the whole molecule, as well as invariance with respect to exchange of atoms of the same type. Such descriptors are to be developed in this project and will be applied in machine learning approaches such as feed-forward neural networks and kernel ridge regression. Furthermore, only a single neural network with a square augmented input layer will be used for all atomic species, in contrast to most available open-source packages. For the kernel ridge regression model, an atomic selectivity of the kernel will be introduced. Along with standard kernels, such as the squared-exponential kernel, different graph kernels will be used.
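The two ingredients above can be illustrated with a minimal sketch: a toy invariant descriptor (the sorted list of interatomic distances, chosen here purely for illustration, not the descriptor developed in this project) and kernel ridge regression with a squared-exponential kernel on one-dimensional surrogate data. All names, sizes, and the surrogate "energy" function are assumptions for demonstration purposes.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- 1. A toy invariant descriptor: sorted interatomic distances ----------
def descriptor(pos):
    """Sorted pairwise distances: invariant under rotation, translation,
    reflection, and permutation of atoms (illustrative only)."""
    diff = pos[:, None, :] - pos[None, :, :]
    dists = np.linalg.norm(diff, axis=-1)
    iu = np.triu_indices(len(pos), k=1)  # upper triangle, no diagonal
    return np.sort(dists[iu])

pos = rng.normal(size=(4, 3))                      # 4 atoms in 3D
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])  # rotation about z
pos_rot = pos @ R.T + np.array([1.0, -2.0, 0.5])   # rotate + translate
assert np.allclose(descriptor(pos), descriptor(pos_rot))

# --- 2. Kernel ridge regression with a squared-exponential kernel ---------
def se_kernel(A, B, ls=0.5):
    """Squared-exponential kernel between rows of A and B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-sq / (2.0 * ls**2))

X = np.linspace(-1.0, 1.0, 40)[:, None]   # toy 1D "descriptors"
y = np.sin(3.0 * X[:, 0])                 # surrogate "energy" function
lam = 1e-6                                # ridge regularization
alpha = np.linalg.solve(se_kernel(X, X) + lam * np.eye(len(X)), y)

X_new = rng.uniform(-1.0, 1.0, size=(5, 1))
y_pred = se_kernel(X_new, X) @ alpha
print(np.max(np.abs(y_pred - np.sin(3.0 * X_new[:, 0]))))  # small residual
```

In a real interatomic potential the descriptor would encode each local atomic environment, and the training targets would be electronic-structure energies; the kernel and regularization strength would then be tuned to the data.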

Project information

Project title Investigation of chemical reactivity by machine learning techniques
Project Leader Johannes Kästner
Project duration January 2020 - December 2023
Project number PN 6 A-1
Project Partners The project is funded by the Studienstiftung des deutschen Volkes

Publications PN 6 A-1

  1. 2023

    1. D. Holzmüller, V. Zaverkin, J. Kästner, and I. Steinwart, “A Framework and Benchmark for Deep Batch Active Learning for Regression,” Journal of Machine Learning Research, vol. 24, no. 164, Art. no. 164, 2023, [Online]. Available: http://jmlr.org/papers/v24/22-0937.html
    2. V. Zaverkin, D. Holzmüller, L. Bonfirraro, and J. Kästner, “Transfer learning for chemically accurate interatomic neural network potentials,” Physical Chemistry Chemical Physics, vol. 25, no. 7, Art. no. 7, 2023, doi: 10.1039/D2CP05793J.
    3. K. Gubaev, V. Zaverkin, P. Srinivasan, A. I. Duff, J. Kästner, and B. Grabowski, “Performance of two complementary machine-learned potentials in modelling chemically complex systems,” NPJ Computational Materials, vol. 9, p. 129, 2023, doi: 10.1038/s41524-023-01073-w.
  2. 2022

    1. V. Zaverkin, D. Holzmüller, I. Steinwart, and J. Kästner, “Exploring chemical and conformational spaces by batch mode deep active learning,” Digital Discovery, vol. 1, pp. 605–620, 2022, doi: 10.1039/D2DD00034B.
    2. V. Zaverkin, D. Holzmüller, R. Schuldt, and J. Kästner, “Predicting properties of periodic systems from cluster data: A case study of liquid water,” The Journal of Chemical Physics, vol. 156, no. 11, Art. no. 11, 2022, doi: 10.1063/5.0078983.
    3. V. Zaverkin, J. Netz, F. Zills, A. Köhn, and J. Kästner, “Thermally Averaged Magnetic Anisotropy Tensors via Machine Learning Based on Gaussian Moments,” Journal of Chemical Theory and Computation, vol. 18, pp. 1–12, 2022, doi: 10.1021/acs.jctc.1c00853.
  3. 2021

    1. V. Zaverkin, D. Holzmüller, I. Steinwart, and J. Kästner, “Fast and Sample-Efficient Interatomic Neural Network Potentials for Molecules and Materials Based on Gaussian Moments,” Journal of Chemical Theory and Computation, vol. 17, no. 10, Art. no. 10, 2021, doi: 10.1021/acs.jctc.1c00527.
    2. V. Zaverkin and J. Kästner, “Exploration of transferable and uniformly accurate neural network interatomic potentials using optimal experimental design,” Machine Learning: Science and Technology, vol. 2, no. 3, Art. no. 3, 2021.
  4. 2020

    1. G. Molpeceres, V. Zaverkin, and J. Kästner, “Neural-network assisted study of nitrogen atom dynamics on amorphous solid water – I. adsorption and desorption,” Mon. Not. R. Astron. Soc., vol. 499, pp. 1373–1384, 2020, doi: 10.1093/mnras/staa2891.
    2. V. Zaverkin and J. Kästner, “Gaussian Moments as Physically Inspired Molecular Descriptors for Accurate and Scalable Machine Learning Potentials,” Journal of Chemical Theory and Computation, vol. 16, pp. 5410–5421, 2020, doi: 10.1021/acs.jctc.0c00347.