Gaussian process techniques for differential equations

PN 6-3 (II)

Project description

Gaussian Processes for Machine Learning (GP4ML) are an established kernel-based Bayesian machine learning method that has also been considered in the context of differential equations. From a mathematical point of view, however, the application of GP4ML to differential equations is not sufficiently justified, in particular when it comes to infinite-rank conditioning. Such conditioning, however, arises naturally, either as an idealized limit of existing methods or as a methodologically independent approach in its own right. The goal of this project is to address these shortcomings by establishing a rigorous and general conditioning theory for GP4ML and to apply this theory in the context of differential equations. One particular focus lies on approximations by finite-rank conditionings, as sketched below. In cooperation with PN 5-6 we will further apply our insights to specific applications and evaluate the approaches numerically.
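To illustrate the finite-rank conditioning idea, here is a minimal Python (NumPy) sketch, not the project's actual method: a GP prior with a squared-exponential kernel is conditioned on finitely many linear functionals of the ODE u'(x) = cos(x) with u(0) = 0, whose solution is sin(x). Because differentiation is linear, the derivative observations remain jointly Gaussian with the prior and the standard conditioning formulas apply; the length scale and grid sizes below are illustrative assumptions.

    import numpy as np

    # Assumed illustration value, not taken from the project:
    ell = 0.5  # squared-exponential length scale

    def k(x, y):
        """Squared-exponential kernel k(x, y) = exp(-(x - y)^2 / (2 ell^2))."""
        d = x[:, None] - y[None, :]
        return np.exp(-d**2 / (2 * ell**2))

    def k_dy(x, y):
        """d/dy k(x, y): cross-covariance between u(x) and u'(y)."""
        d = x[:, None] - y[None, :]
        return d / ell**2 * np.exp(-d**2 / (2 * ell**2))

    def k_dxdy(x, y):
        """d^2/(dx dy) k(x, y): covariance between u'(x) and u'(y)."""
        d = x[:, None] - y[None, :]
        return (1 / ell**2 - d**2 / ell**4) * np.exp(-d**2 / (2 * ell**2))

    # Finite-rank conditioning set: the point evaluation u(0) = 0 and the
    # derivative evaluations u'(x_i) = cos(x_i), i.e. finitely many linear
    # functionals applied to the GP prior.
    x0 = np.array([0.0])
    xd = np.linspace(0.0, 2 * np.pi, 25)
    y = np.concatenate([[0.0], np.cos(xd)])

    # Gram matrix of the conditioning functionals (value/derivative blocks).
    K = np.block([
        [k(x0, x0),      k_dy(x0, xd)],
        [k_dy(x0, xd).T, k_dxdy(xd, xd)],
    ])

    # Posterior mean of u at test points via the standard conditioning
    # formula, with a small jitter for numerical stability.
    xs = np.linspace(0.0, 2 * np.pi, 200)
    K_star = np.hstack([k(xs, x0), k_dy(xs, xd)])
    alpha = np.linalg.solve(K + 1e-10 * np.eye(len(y)), y)
    mean = K_star @ alpha

    # The posterior mean should closely track the true solution sin(x).
    print("max |mean - sin| =", np.max(np.abs(mean - np.sin(xs))))

Infinite-rank conditioning would, by contrast, impose the differential equation at all points simultaneously; the finite grid above is the kind of finite-rank approximation whose relation to that idealized limit the project investigates.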

Project information

Project title Gaussian process techniques for differential equations
Project leaders Ingo Steinwart (Wolfgang Nowak)
Project staff Aleksandar Arsenijevic, doctoral researcher
Project duration July 2022 - December 2025
Project number PN 6-3 (II)

Publications PN 6-3 and PN 6-3 (II)

  1. 2023

    1. D. Holzmüller, “Regression from linear models to neural networks: double descent, active learning, and sampling,” University of Stuttgart, 2023.
    2. D. Holzmüller, V. Zaverkin, J. Kästner, and I. Steinwart, “A Framework and Benchmark for Deep Batch Active Learning for Regression,” Journal of Machine Learning Research, vol. 24, no. 164, Art. no. 164, 2023, [Online]. Available: http://jmlr.org/papers/v24/22-0937.html
    3. V. Zaverkin, D. Holzmüller, L. Bonfirraro, and J. Kästner, “Transfer learning for chemically accurate interatomic neural network potentials,” Physical Chemistry Chemical Physics, vol. 25, no. 7, Art. no. 7, 2023, doi: 10.1039/D2CP05793J.
  2. 2022

    1. V. Zaverkin, D. Holzmüller, I. Steinwart, and J. Kästner, “Exploring chemical and conformational spaces by batch mode deep active learning,” Digital Discovery, vol. 1, pp. 605–620, 2022, doi: 10.1039/D2DD00034B.
    2. D. Holzmüller and I. Steinwart, “Training two-layer ReLU networks with gradient descent is inconsistent,” Journal of Machine Learning Research, vol. 23, no. 181, Art. no. 181, 2022, [Online]. Available: http://jmlr.org/papers/v23/20-830.html
    3. V. Zaverkin, D. Holzmüller, R. Schuldt, and J. Kästner, “Predicting properties of periodic systems from cluster data: A case study of liquid water,” The Journal of Chemical Physics, vol. 156, no. 11, Art. no. 11, 2022, doi: 10.1063/5.0078983.
  3. 2021

    1. D. Holzmüller and D. Pflüger, “Fast Sparse Grid Operations Using the Unidirectional Principle: A Generalized and Unified Framework,” in Sparse Grids and Applications - Munich 2018, H.-J. Bungartz, J. Garcke, and D. Pflüger, Eds. Cham: Springer International Publishing, 2021, pp. 69–100.
    2. D. Holzmüller, “On the Universality of the Double Descent Peak in Ridgeless Regression,” in International Conference on Learning Representations, 2021. [Online]. Available: https://openreview.net/forum?id=0IO5VdnSAaH
    3. V. Zaverkin, D. Holzmüller, I. Steinwart, and J. Kästner, “Fast and Sample-Efficient Interatomic Neural Network Potentials for Molecules and Materials Based on Gaussian Moments,” Journal of Chemical Theory and Computation, vol. 17, no. 10, Art. no. 10, 2021, doi: 10.1021/acs.jctc.1c00527.
  4. 2020

    1. D. F. B. Häufle, I. Wochner, D. Holzmüller, D. Drieß, M. Günther, and S. Schmitt, “Muscles reduce neuronal information load: quantification of control effort in biological vs. robotic pointing and walking,” Frontiers in Robotics and AI, vol. 7, p. 77, 2020, doi: 10.3389/frobt.2020.00077.