2023
- V. Zaverkin, D. Holzmüller, L. Bonfirraro, and J. Kästner, “Transfer learning for chemically accurate interatomic neural network potentials,” Phys. Chem. Chem. Phys., vol. 25, no. 7, 2023, doi: 10.1039/D2CP05793J.
- D. Holzmüller and F. Bach, “Convergence rates for non-log-concave sampling and log-partition estimation,” arXiv:2303.03237, 2023.
- D. Holzmüller, V. Zaverkin, J. Kästner, and I. Steinwart, “A Framework and Benchmark for Deep Batch Active Learning for Regression,” Journal of Machine Learning Research, vol. 24, no. 164, 2023, [Online]. Available: http://jmlr.org/papers/v24/22-0937.html
2022
- T. Wenzel, M. Kurz, A. Beck, G. Santin, and B. Haasdonk, “Structured Deep Kernel Networks for Data-Driven Closure Terms of Turbulent Flows,” in Large-Scale Scientific Computing, I. Lirkov and S. Margenov, Eds. Cham: Springer International Publishing, 2022, pp. 410–418.
- P. Gavrilenko et al., “A Full Order, Reduced Order and Machine Learning Model Pipeline for Efficient Prediction of Reactive Flows,” in Large-Scale Scientific Computing, I. Lirkov and S. Margenov, Eds. Cham: Springer International Publishing, 2022, pp. 378–386.
- V. Zaverkin, J. Netz, F. Zills, A. Köhn, and J. Kästner, “Thermally Averaged Magnetic Anisotropy Tensors via Machine Learning Based on Gaussian Moments,” J. Chem. Theory Comput., vol. 18, pp. 1–12, 2022, doi: 10.1021/acs.jctc.1c00853.
- R. Leiteritz, P. Buchfink, B. Haasdonk, and D. Pflüger, “Surrogate-data-enriched Physics-Aware Neural Networks,” in Proceedings of the Northern Lights Deep Learning Workshop 2022, vol. 3, Mar. 2022. doi: 10.7557/18.6268.
- K. Pluhackova, V. Schittny, P.-C. Bürkner, C. Siligan, and A. Horner, “Multiple pore lining residues modulate water permeability of GlpF,” Protein Science, vol. 31, no. 10, 2022, doi: 10.1002/pro.4431.
- V. Zaverkin, G. Molpeceres, and J. Kästner, “Neural-network assisted study of nitrogen atom dynamics on amorphous solid water – II. Diffusion,” Mon. Not. R. Astron. Soc., vol. 510, no. 2, 2022, doi: 10.1093/mnras/stab3631.
- V. Zaverkin, D. Holzmüller, R. Schuldt, and J. Kästner, “Predicting properties of periodic systems from cluster data: A case study of liquid water,” The Journal of Chemical Physics, vol. 156, no. 11, 2022, doi: 10.1063/5.0078983.
- T. Wenzel, G. Santin, and B. Haasdonk, “Stability of convergence rates: Kernel interpolation on non-Lipschitz domains,” arXiv:2203.12532, 2022, doi: 10.48550/ARXIV.2203.12532.
- T. Wenzel, G. Santin, and B. Haasdonk, “Analysis of Target Data-Dependent Greedy Kernel Algorithms: Convergence Rates for $f$-, $f \cdot P$- and $f/P$-Greedy,” Constructive Approximation, Oct. 2022, doi: 10.1007/s00365-022-09592-3.
- V. Zaverkin, D. Holzmüller, I. Steinwart, and J. Kästner, “Exploring chemical and conformational spaces by batch mode deep active learning,” Digital Discovery, vol. 1, pp. 605–620, 2022, doi: 10.1039/D2DD00034B.
2021
- B. Haasdonk, M. Ohlberger, and F. Schindler, “An adaptive model hierarchy for data-augmented training of kernel models for reactive flow.” 2021.
- G. Molpeceres, V. Zaverkin, N. Watanabe, and J. Kästner, “Binding energies and sticking coefficients of H2 on crystalline and amorphous CO ice,” A&A, vol. 648, p. A84, 2021, doi: 10.1051/0004-6361/202040023.
- V. Zaverkin, D. Holzmüller, I. Steinwart, and J. Kästner, “Fast and Sample-Efficient Interatomic Neural Network Potentials for Molecules and Materials Based on Gaussian Moments,” Journal of Chemical Theory and Computation, vol. 17, no. 10, 2021, doi: 10.1021/acs.jctc.1c00527.
- G. Tkachev, S. Frey, and T. Ertl, “S4: Self-Supervised Learning of Spatiotemporal Similarity,” IEEE Transactions on Visualization and Computer Graphics, 2021.
- D. Holzmüller and D. Pflüger, “Fast Sparse Grid Operations Using the Unidirectional Principle: A Generalized and Unified Framework,” in Sparse Grids and Applications - Munich 2018, H.-J. Bungartz, J. Garcke, and D. Pflüger, Eds. Cham: Springer International Publishing, 2021, pp. 69–100.
- D. Holzmüller, “On the Universality of the Double Descent Peak in Ridgeless Regression,” in International Conference on Learning Representations, 2021. [Online]. Available: https://openreview.net/forum?id=0IO5VdnSAaH
- A. M. Miksch, T. Morawietz, J. Kästner, A. Urban, and N. Artrith, “Strategies for the construction of machine-learning potentials for accurate and efficient atomic-scale simulations,” Mach. Learn.: Sci. Technol., vol. 2, p. 031001, 2021, doi: 10.1088/2632-2153/abfd96.
- R. Leiteritz, M. Hurler, and D. Pflüger, “Learning Free-Surface Flow with Physics-Informed Neural Networks,” in 2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA), 2021, pp. 1664–1669. doi: 10.1109/ICMLA52953.2021.00266.
- R. Garcia, T. Munz, and D. Weiskopf, “Visual analytics tool for the interpretation of hidden states in recurrent neural networks,” Visual Computing for Industry, Biomedicine, and Art, vol. 4, Art. no. 24, 2021, doi: 10.1186/s42492-021-00090-0.
- B. Haasdonk, T. Wenzel, G. Santin, and S. Schmitt, “Biomechanical surrogate modelling using stabilized vectorial greedy kernel methods,” in Numerical Mathematics and Advanced Applications ENUMATH 2019. Springer, 2021, pp. 499–508. doi: 10.1007/978-3-030-55874-1_49.
- V. Zaverkin and J. Kästner, “Exploration of transferable and uniformly accurate neural network interatomic potentials using optimal experimental design,” Mach. Learn.: Sci. Technol., vol. 2, p. 035009, 2021, doi: 10.1088/2632-2153/abe294.
- M. Scholz and R. Torkar, “An empirical study of Linespots: A novel past-fault algorithm,” Software Testing, Verification and Reliability, 2021, doi: 10.1002/stvr.1787.
- M. Heinemann, S. Frey, G. Tkachev, A. Straub, F. Sadlo, and T. Ertl, “Visual analysis of droplet dynamics in large-scale multiphase spray simulations,” Journal of Visualization, vol. 24, no. 5, May 2021, doi: 10.1007/s12650-021-00750-6.
- G. Tkachev, S. Frey, and T. Ertl, “Local Prediction Models for Spatiotemporal Volume Visualization,” IEEE Transactions on Visualization and Computer Graphics, vol. 27, no. 7, Jul. 2021, doi: 10.1109/TVCG.2019.2961893.
- T. Wenzel, G. Santin, and B. Haasdonk, “A novel class of stabilized greedy kernel approximation algorithms: Convergence, stability and uniform point distribution,” Journal of Approximation Theory, vol. 262, p. 105508, Feb. 2021, doi: 10.1016/j.jat.2020.105508.
- T. Munz, D. Väth, P. Kuznecov, N. T. Vu, and D. Weiskopf, “NMTVis - Trained Models for our Visual Analytics System.” DaRUS, 2021. doi: 10.18419/DARUS-1850.
- T. Munz, D. Väth, P. Kuznecov, T. Vu, and D. Weiskopf, “Visual-Interactive Neural Machine Translation,” in Graphics Interface 2021, 2021. [Online]. Available: https://openreview.net/forum?id=DQHaCvN9xd
2020
- F. Heyen et al., “ClaVis: An Interactive Visual Comparison System for Classifiers,” in Proceedings of the International Conference on Advanced Visual Interfaces, Salerno, Italy: Association for Computing Machinery, 2020, pp. 1–9. doi: 10.1145/3399715.3399814.
- D. Holzmüller and I. Steinwart, “Training two-layer ReLU networks with gradient descent is inconsistent,” arXiv:2002.04861, 2020, [Online]. Available: https://arxiv.org/abs/2002.04861
- T. Munz, N. Schaefer, T. Blascheck, K. Kurzhals, E. Zhang, and D. Weiskopf, “Demo of a Visual Gaze Analysis System for Virtual Board Games,” in ACM Symposium on Eye Tracking Research and Applications, Stuttgart, Germany: Association for Computing Machinery, 2020. doi: 10.1145/3379157.3391985.
- V. Zaverkin and J. Kästner, “Gaussian Moments as Physically Inspired Molecular Descriptors for Accurate and Scalable Machine Learning Potentials,” J. Chem. Theory Comput., vol. 16, pp. 5410–5421, 2020, doi: 10.1021/acs.jctc.0c00347.
- G. Molpeceres, V. Zaverkin, and J. Kästner, “Neural-network assisted study of nitrogen atom dynamics on amorphous solid water – I. adsorption and desorption,” Mon. Not. R. Astron. Soc., vol. 499, pp. 1373–1384, 2020, doi: 10.1093/mnras/staa2891.
- T. Munz, N. Schäfer, T. Blascheck, K. Kurzhals, E. Zhang, and D. Weiskopf, “Supplemental Material for Comparative Visual Gaze Analysis for Virtual Board Games.” DaRUS, 2020. doi: 10.18419/DARUS-1130.
- T. Munz, N. Schäfer, T. Blascheck, K. Kurzhals, E. Zhang, and D. Weiskopf, “Comparative visual gaze analysis for virtual board games,” in Proceedings of the 13th International Symposium on Visual Information Communication and Interaction, Eindhoven, Netherlands: Association for Computing Machinery, Dec. 2020, pp. 1–8. doi: 10.1145/3430036.3430038.
- E. Sood, S. Tannert, D. Frassinelli, A. Bulling, and N. T. Vu, “Interpreting attention models with human visual attention in machine reading comprehension,” arXiv preprint arXiv:2010.06396, 2020, [Online]. Available: https://arxiv.org/abs/2010.06396
- D. F. B. Häufle, I. Wochner, D. Holzmüller, D. Driess, M. Günther, and S. Schmitt, “Muscles Reduce Neuronal Information Load: Quantification of Control Effort in Biological vs. Robotic Pointing and Walking,” Frontiers in Robotics and AI, vol. 7, p. 77, 2020, doi: 10.3389/frobt.2020.00077.
2019
- T. Munz, M. Burch, T. van Benthem, Y. Poels, F. Beck, and D. Weiskopf, “Overlap-Free Drawing of Generalized Pythagoras Trees for Hierarchy Visualization,” in 2019 IEEE Visualization Conference (VIS), Oct. 2019, pp. 251–255. doi: 10.1109/VISUAL.2019.8933606.
- T. Munz, L. L. Chuang, S. Pannasch, and D. Weiskopf, “VisME: Visual microsaccades explorer,” Journal of Eye Movement Research, vol. 12, no. 6, Dec. 2019, doi: 10.16910/jemr.12.6.5.
Project Network Coordinators

Dirk Pflüger
Prof. Dr. rer. nat., PI, Vice-Head of GS SimTech

Ingo Steinwart
Univ.-Prof. Dr. rer. nat.