Schedule & Talks

Numerik im Ländle 2025

true" ? copyright : '' }
| Time | Monday, 5 May | Tuesday, 6 May |
| --- | --- | --- |
| 08:45 am – 09:15 am | | Nonlinear compressive reduced basis (Prud'homme, SB) |
| 09:15 am – 09:35 am | Registration/Coffee | Registration/Coffee |
| 09:35 am – 10:00 am | Geodesic Programming (Herzog, HD) | |
| 10:00 am – 10:25 am | Statistical Learning Theory for Neural Operators (Zech, HD) | Research Data Management for Applied Mathematics (Schembera, S) |
| 10:25 am – 10:50 am | Coffee Break | Coffee Break |
| 10:50 am – 11:15 am | High-frequency wave propagation - analysis and numerics (Jahnke, KA) | Optimal switching Dirac control for stabilization of time-varying linear parabolic equations (Azmi, KO) |
| 11:15 am – 11:40 am | Constrained Total Variation Minimization: Nonconforming Discretization and Adaptivity (Jackisch, FR) | Multilevel Stochastic Gradient Descent Applied to Optimal Control Under Uncertainty (Baumgarten, HD) |
| 11:40 am – 12:05 pm | Deep learning methods for stochastic Galerkin approximations of elliptic random PDEs (Musco, S) | On a generalized Riemann solver for a hyperbolic model of two-layer thin film flow (Barthwal, S) |
| 12:05 pm – 01:30 pm | Lunch Break | Lunch Break |
| 01:30 pm – 01:55 pm | Numerical treatment of generalized Gamow problems (Striet, FR) | Nonlocal Traffic Models (Göttlich, MA) |
| 01:55 pm – 02:20 pm | Positivity preserving FEM for the Gross-Pitaevskii ground state (Hauck, KA) | |
| 02:20 pm – 02:45 pm | Generalized Bayesian Inversion (König, S) | A multiscale approach to the stationary Ginzburg-Landau equations of superconductivity (Dörich, KA) |
| 02:45 pm – 03:10 pm | A Posteriori Error Bounds for Kohn-Sham Systems with Convex Exchange Correlation Functionals (Lainez Reyes, S) | Filtered finite difference methods for nonlinear Schrödinger equations in semiclassical scaling (Shi, TÜ) |
| 03:10 pm – 03:35 pm | Dirichlet-Neumann Averaging: The DNA of Efficient Gaussian Process Simulation (Scheichl, HD) | A symmetry-preserving and transferable representation for learning the Kohn-Sham density matrix (Nottoli, S) |
| 03:35 pm – 04:00 pm | Coffee Break | Discussion/Closing |
| 04:00 pm – 04:25 pm | Samplets: Wavelet concepts for scattered data (Harbrecht, BS) | |
| 04:25 pm – 04:50 pm | | |
| 04:50 pm – 05:15 pm | A posteriori error control of PINNs for solving PDEs (Urban, U) | |
| 05:15 pm – 05:40 pm | Discussion | |
| 07:00 pm – 09:00 pm | Dinner | |

Invited Talks

Nonlocal Traffic Models (Prof. Dr. Simone Göttlich)

Based on a Godunov-type numerical scheme for a class of scalar conservation laws with nonlocal flux, arising for example in traffic flow models, we first show the well-posedness of the considered class of scalar conservation laws. A natural extension of the nonlocal traffic model is the investigation of a stochastic velocity function. A numerical analysis offers insights into how the stochasticity affects the evolution of densities. Finally, we present recent results on how learning techniques may affect the nonlocal flow dynamics.
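As a rough illustration of the scheme class the talk builds on, the sketch below implements an upwind (Godunov-type) discretization of a nonlocal LWR-type traffic model on a periodic road. The look-ahead kernel, the Greenshields velocity law, and all parameter values are illustrative assumptions, not the specific model of the talk.

```python
import numpy as np

def nonlocal_upwind(rho0, dx, dt, n_steps, eta=0.1, v_max=1.0, rho_max=1.0):
    """Godunov-type (upwind) scheme for a nonlocal LWR-type model
        rho_t + (rho * v(omega_eta * rho))_x = 0
    on a periodic road; omega_eta is a downstream look-ahead kernel
    supported on [0, eta]. The CFL condition dt <= dx / v_max must hold."""
    n_k = max(1, round(eta / dx))
    w = np.arange(n_k, 0, -1, dtype=float)     # linearly decreasing weights
    w /= w.sum()                               # normalize the kernel
    v = lambda r: v_max * (1.0 - r / rho_max)  # Greenshields velocity law

    rho = rho0.astype(float).copy()
    n = len(rho)
    for _ in range(n_steps):
        # nonlocal density seen ahead of cell j (periodic wrap-around)
        W = np.array([w @ np.take(rho, np.arange(j + 1, j + 1 + n_k), mode='wrap')
                      for j in range(n)])
        flux = rho * v(W)                      # upwind flux, since v(W) >= 0
        rho -= dt / dx * (flux - np.roll(flux, 1))
    return rho

# usage: evolve a congestion bump on [0, 1) with 200 cells
x = np.linspace(0.0, 1.0, 200, endpoint=False)
rho_T = nonlocal_upwind(0.8 * np.exp(-200 * (x - 0.5) ** 2),
                        dx=1 / 200, dt=0.002, n_steps=100)
```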

Samplets: Wavelet concepts for scattered data (Prof. Dr. Helmut Harbrecht)

This joint talk with Michael Multerer, Olaf Schenk, and Christoph Schwab is dedicated to recent developments in the field of wavelet analysis for scattered data. We introduce the concept of samplets, which are signed measures of wavelet type that may be defined on sets of arbitrarily distributed data sites in possibly high dimension. By employing samplets, we transfer concepts well known from wavelet analysis, namely the fast basis transform, data compression, operator compression, and operator arithmetic, to scattered data problems. In particular, samplet matrix compression facilitates the rapid solution of scattered data interpolation problems, even for kernel functions with nonlocal support. Finally, we demonstrate that sparsity constraints for scattered data approximation problems become meaningful and can be solved efficiently in samplet coordinates.
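The wavelet-like property behind this is that a samplet, as a signed measure sum_i w_i * delta_{x_i} on the data sites, has vanishing polynomial moments up to some order q. The minimal sketch below computes such weight vectors for a single cluster of 1D sites via an SVD of the moment matrix; the full samplet construction additionally organizes the sites in a hierarchical cluster tree, which is omitted here.

```python
import numpy as np

def vanishing_moment_weights(x, q=1):
    """Weights w of signed measures sum_i w_i * delta_{x_i} on the 1D data
    sites x whose polynomial moments up to degree q vanish, i.e.
    sum_i w_i * x_i**k = 0 for k = 0, ..., q. Each row of the returned
    array is one such weight vector (an orthonormal null-space basis)."""
    M = np.vander(x, N=q + 1, increasing=True).T  # (q+1) x n moment matrix
    _, _, Vt = np.linalg.svd(M)                   # full SVD exposes the null space
    return Vt[q + 1:]                             # rows satisfy M @ w = 0

x = np.sort(np.random.rand(8))                    # scattered data sites
W = vanishing_moment_weights(x, q=1)
print(np.allclose(W @ np.vander(x, 2, increasing=True), 0.0))  # True
```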

Statistical Learning Theory for Neural Operators (Jun.-Prof. Dr. Jakob Zech)

In this talk, we present new results on the sample size required to learn surrogates of nonlinear mappings between infinite-dimensional Hilbert spaces. Such surrogate models have a wide range of applications and can be used in uncertainty quantification and parameter estimation problems in fields such as classical mechanics, fluid mechanics, electrodynamics, and the earth sciences. Here, the operator input determines the problem configuration, such as initial conditions, material properties, or forcing terms of a partial differential equation (PDE) governing the underlying physics. The operator output corresponds to the PDE solution. Our analysis shows that, for certain neural network architectures, empirical risk minimization can overcome the curse of dimensionality. Specifically, we show that both the number of network parameters and the quantity of input-output data pairs required for training remain manageable, with the error converging at an algebraic rate.
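As a schematic rendering of this setting (the notation is ours, not the talk's): given m input-output pairs of a ground-truth operator S between Hilbert spaces X and Y, one trains by empirical risk minimization over a class N of neural operators, and the claimed algebraic rate takes the following form.

```latex
% Schematic: empirical risk minimization for operator learning.
% \mathcal{S}\colon X \to Y is the ground-truth operator (e.g. a PDE
% solution map), \mathcal{N} a class of neural operators, and
% (u_i, \mathcal{S}(u_i))_{i=1}^m the training data.
\hat{\mathcal{G}} \in \operatorname*{arg\,min}_{\mathcal{G} \in \mathcal{N}}
  \frac{1}{m} \sum_{i=1}^{m}
  \bigl\| \mathcal{G}(u_i) - \mathcal{S}(u_i) \bigr\|_Y^2 ,
\qquad
\mathbb{E}\,\bigl\| \hat{\mathcal{G}}(u) - \mathcal{S}(u) \bigr\|_Y^2
  \lesssim m^{-\alpha}
\quad \text{for some } \alpha > 0 .
```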

Nonlinear compressive reduced basis (Prof. Dr. Christophe Prud'homme)

tba
