Understanding physical constraints in machine learning for simulation

PN 6-3

Project description

We investigate how simulation-specific (physical) constraints can be incorporated into a large family of machine learning methods that includes, for example, (hierarchical) kernel-based methods and deep neural networks. These constraints can either be hard, that is, rigorously guaranteed before the learning process, or soft, in the sense that constraint violations are penalized during the training phase. Our research goals comprise identifying constraints that are well suited to particular members of this family, better understanding the learning process, and steering the learning process towards better solutions. For the latter two goals, we will develop novel visualization techniques that make it possible to derive new hypotheses about the highly complex learning process. In turn, these new hypotheses can further refine the visualization techniques, creating an interplay between visualization and improved understanding. Finally, the collected insights will be used to develop new visualization techniques that help non-experts better understand and control the learning process. The physical systems considered as examples will be chosen in close collaboration with PN 1-5.
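To illustrate the soft-constraint idea described above, the following is a minimal, self-contained sketch (not the project's actual code): a small polynomial model is fitted to data while violations of a physical constraint are penalized in the training loss. Here the constraint is the symmetry f(x) = f(-x), chosen purely as a stand-in for a simulation-specific constraint; the penalty weight `lam`, the model, and the data are all illustrative assumptions.

```python
import numpy as np

# Symmetric ground truth with noise: y = x^2 + noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 200)
y = x**2 + 0.05 * rng.standard_normal(200)

def model(w, x):
    # Cubic polynomial: w0 + w1*x + w2*x^2 + w3*x^3.
    return w[0] + w[1] * x + w[2] * x**2 + w[3] * x**3

def loss(w, lam):
    # Data-fit term plus a soft-constraint penalty: violations of
    # the symmetry f(x) = f(-x) are penalized, not forbidden.
    data_fit = np.mean((model(w, x) - y) ** 2)
    penalty = np.mean((model(w, x) - model(w, -x)) ** 2)
    return data_fit + lam * penalty

def grad(w, lam, eps=1e-6):
    # Central finite-difference gradient keeps the sketch dependency-free.
    g = np.zeros_like(w)
    for i in range(len(w)):
        e = np.zeros_like(w)
        e[i] = eps
        g[i] = (loss(w + e, lam) - loss(w - e, lam)) / (2 * eps)
    return g

def train(lam, steps=2000, lr=0.1):
    # Plain gradient descent on the penalized objective.
    w = np.zeros(4)
    for _ in range(steps):
        w = w - lr * grad(w, lam)
    return w

w_soft = train(lam=1.0)
# The penalty drives the odd (symmetry-violating) coefficients w1 and w3
# towards zero, so the learned model becomes approximately symmetric.
```

A hard constraint would instead be built into the model itself, e.g. by restricting the hypothesis space to even polynomials before training, so that the symmetry holds exactly rather than approximately.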

Project information

Project title Understanding physical constraints in machine learning for simulation
Project leaders Ingo Steinwart (Daniel Weiskopf)
Project duration January 2020 - June 2023
Project number PN 6-3

Publications PN 6-3 and PN 6-3 (II)

  1. 2023

    1. D. Holzmüller, “Regression from linear models to neural networks: double descent, active learning, and sampling,” PhD thesis, University of Stuttgart, 2023.
    2. D. Holzmüller, V. Zaverkin, J. Kästner, and I. Steinwart, “A Framework and Benchmark for Deep Batch Active Learning for Regression,” Journal of Machine Learning Research, vol. 24, no. 164, 2023, [Online]. Available: http://jmlr.org/papers/v24/22-0937.html
    3. V. Zaverkin, D. Holzmüller, L. Bonfirraro, and J. Kästner, “Transfer learning for chemically accurate interatomic neural network potentials,” Physical Chemistry Chemical Physics, vol. 25, no. 7, 2023, doi: 10.1039/D2CP05793J.
  2. 2022

    1. V. Zaverkin, D. Holzmüller, I. Steinwart, and J. Kästner, “Exploring chemical and conformational spaces by batch mode deep active learning,” Digital Discovery, vol. 1, pp. 605–620, 2022, doi: 10.1039/D2DD00034B.
    2. D. Holzmüller and I. Steinwart, “Training two-layer ReLU networks with gradient descent is inconsistent,” Journal of Machine Learning Research, vol. 23, no. 181, 2022, [Online]. Available: http://jmlr.org/papers/v23/20-830.html
    3. V. Zaverkin, D. Holzmüller, R. Schuldt, and J. Kästner, “Predicting properties of periodic systems from cluster data: A case study of liquid water,” The Journal of Chemical Physics, vol. 156, no. 11, 2022, doi: 10.1063/5.0078983.
  3. 2021

    1. D. Holzmüller and D. Pflüger, “Fast Sparse Grid Operations Using the Unidirectional Principle: A Generalized and Unified Framework,” in Sparse Grids and Applications - Munich 2018, H.-J. Bungartz, J. Garcke, and D. Pflüger, Eds. Cham: Springer International Publishing, 2021, pp. 69–100.
    2. D. Holzmüller, “On the Universality of the Double Descent Peak in Ridgeless Regression,” in International Conference on Learning Representations, 2021. [Online]. Available: https://openreview.net/forum?id=0IO5VdnSAaH
    3. V. Zaverkin, D. Holzmüller, I. Steinwart, and J. Kästner, “Fast and Sample-Efficient Interatomic Neural Network Potentials for Molecules and Materials Based on Gaussian Moments,” Journal of Chemical Theory and Computation, vol. 17, no. 10, 2021, doi: 10.1021/acs.jctc.1c00527.
  4. 2020

    1. D. F. B. Häufle, I. Wochner, D. Holzmüller, D. Drieß, M. Günther, and S. Schmitt, “Muscles reduce neuronal information load: quantification of control effort in biological vs. robotic pointing and walking,” Frontiers in Robotics and AI, vol. 7, p. 77, 2020, doi: 10.3389/frobt.2020.00077.