Publications of PN 6

  1. 2022

    1. V. Zaverkin, G. Molpeceres, and J. Kästner, “Neural-network assisted study of nitrogen atom dynamics on amorphous solid water – II. Diffusion,” Mon. Not. R. Astron. Soc., vol. 510, no. 2, Art. no. 2, 2022, doi: 10.1093/mnras/stab3631.
    2. T. Wenzel, M. Kurz, A. Beck, G. Santin, and B. Haasdonk, “Structured Deep Kernel Networks for Data-Driven Closure Terms of Turbulent Flows,” in Large-Scale Scientific Computing, Cham, 2022, pp. 410–418.
    3. T. Wenzel, G. Santin, and B. Haasdonk, “Stability of convergence rates: Kernel interpolation on non-Lipschitz domains.” arXiv, 2022. doi: 10.48550/ARXIV.2203.12532.
    4. P. Gavrilenko et al., “A Full Order, Reduced Order and Machine Learning Model Pipeline for Efficient Prediction of Reactive Flows,” in Large-Scale Scientific Computing, Cham, 2022, pp. 378–386.
    5. V. Zaverkin, D. Holzmüller, R. Schuldt, and J. Kästner, “Predicting properties of periodic systems from cluster data: A case study of liquid water,” The Journal of Chemical Physics, vol. 156, no. 11, Art. no. 11, 2022, doi: 10.1063/5.0078983.
    6. D. Holzmüller, V. Zaverkin, J. Kästner, and I. Steinwart, “A Framework and Benchmark for Deep Batch Active Learning for Regression,” arXiv:2203.09410, 2022.
    7. V. Zaverkin, J. Netz, F. Zills, A. Köhn, and J. Kästner, “Thermally Averaged Magnetic Anisotropy Tensors via Machine Learning Based on Gaussian Moments,” J. Chem. Theory Comput., vol. 18, pp. 1–12, 2022, doi: 10.1021/acs.jctc.1c00853.
  2. 2021

    1. T. Wenzel, G. Santin, and B. Haasdonk, “Analysis of target data-dependent greedy kernel algorithms: Convergence rates for $f$-, $f \cdot P$- and $f/P$-greedy.” arXiv, 2021. doi: 10.48550/ARXIV.2105.07411.
    2. D. Holzmüller and D. Pflüger, “Fast Sparse Grid Operations Using the Unidirectional Principle: A Generalized and Unified Framework,” in Sparse Grids and Applications - Munich 2018, Cham, 2021, pp. 69–100.
    3. M. Heinemann, S. Frey, G. Tkachev, A. Straub, F. Sadlo, and T. Ertl, “Visual analysis of droplet dynamics in large-scale multiphase spray simulations,” Journal of Visualization, vol. 24, no. 5, Art. no. 5, 2021, doi: 10.1007/s12650-021-00750-6.
    4. R. Leiteritz, M. Hurler, and D. Pflüger, “Learning Free-Surface Flow with Physics-Informed Neural Networks,” in 2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA), 2021, pp. 1664–1669. doi: 10.1109/ICMLA52953.2021.00266.
    5. B. Haasdonk, M. Ohlberger, and F. Schindler, “An adaptive model hierarchy for data-augmented training of kernel models for reactive flow.” 2021.
    6. R. Leiteritz, P. Buchfink, B. Haasdonk, and D. Pflüger, “Surrogate-data-enriched Physics-Aware Neural Networks.” 2021.
    7. B. Haasdonk, T. Wenzel, G. Santin, and S. Schmitt, “Biomechanical surrogate modelling using stabilized vectorial greedy kernel methods,” in Numerical Mathematics and Advanced Applications ENUMATH 2019, Springer, 2021, pp. 499–508. doi: 10.1007/978-3-030-55874-1_49.
    8. R. Garcia, T. Munz, and D. Weiskopf, “Visual analytics tool for the interpretation of hidden states in recurrent neural networks,” Visual Computing for Industry, Biomedicine, and Art, vol. 4, no. 24, Art. no. 24, Sep. 2021, doi: 10.1186/s42492-021-00090-0.
    9. V. Zaverkin and J. Kästner, “Exploration of transferable and uniformly accurate neural network interatomic potentials using optimal experimental design,” Machine Learning: Science and Technology, vol. 2, no. 3, p. 035009, 2021, doi: 10.1088/2632-2153/abe294.
    10. T. Munz, D. Väth, P. Kuznecov, N. T. Vu, and D. Weiskopf, “NMTVis - Trained Models for our Visual Analytics System.” DaRUS, 2021. doi: 10.18419/DARUS-1850.
    11. G. Tkachev, S. Frey, and T. Ertl, “S4: Self-Supervised learning of Spatiotemporal Similarity,” IEEE Transactions on Visualization and Computer Graphics, 2021.
    12. G. Molpeceres, V. Zaverkin, N. Watanabe, and J. Kästner, “Binding energies and sticking coefficients of H2 on crystalline and amorphous CO ice,” A&A, vol. 648, p. A84, 2021, doi: 10.1051/0004-6361/202040023.
    13. V. Zaverkin, D. Holzmüller, I. Steinwart, and J. Kästner, “Fast and Sample-Efficient Interatomic Neural Network Potentials for Molecules and Materials Based on Gaussian Moments,” Journal of Chemical Theory and Computation, vol. 17, no. 10, Art. no. 10, 2021, doi: 10.1021/acs.jctc.1c00527.
    14. D. Holzmüller, “On the Universality of the Double Descent Peak in Ridgeless Regression,” 2021. [Online]. Available: https://openreview.net/forum?id=0IO5VdnSAaH
    15. T. Wenzel, G. Santin, and B. Haasdonk, “A novel class of stabilized greedy kernel approximation algorithms: Convergence, stability and uniform point distribution,” Journal of Approximation Theory, vol. 262, p. 105508, Feb. 2021, doi: 10.1016/j.jat.2020.105508.
    16. T. Munz, D. Väth, P. Kuznecov, T. Vu, and D. Weiskopf, “Visual-Interactive Neural Machine Translation,” 2021. [Online]. Available: https://openreview.net/forum?id=DQHaCvN9xd
    17. M. Scholz and R. Torkar, “An empirical study of Linespots: A novel past-fault algorithm,” Software Testing, Verification and Reliability, 2021, doi: 10.1002/stvr.1787.
    18. G. Tkachev, S. Frey, and T. Ertl, “Local Prediction Models for Spatiotemporal Volume Visualization,” IEEE Transactions on Visualization and Computer Graphics, vol. 27, no. 7, Art. no. 7, Jul. 2021, doi: 10.1109/TVCG.2019.2961893.
    19. T. Wenzel, M. Kurz, A. Beck, G. Santin, and B. Haasdonk, “Structured Deep Kernel Networks for Data-Driven Closure Terms of Turbulent Flows,” 2021. [Online]. Available: https://arxiv.org/pdf/2103.13655.pdf
    20. A. M. Miksch, T. Morawietz, J. Kästner, A. Urban, and N. Artrith, “Strategies for the construction of machine-learning potentials for accurate and efficient atomic-scale simulations,” Mach. Learn.: Sci. Technol., vol. 2, p. 031001, 2021, doi: 10.1088/2632-2153/abfd96.
  3. 2020

    1. D. Holzmüller and I. Steinwart, “Training two-layer ReLU networks with gradient descent is inconsistent,” arXiv preprint arXiv:2002.04861, 2020, [Online]. Available: https://arxiv.org/abs/2002.04861
    2. T. Munz, N. Schäfer, T. Blascheck, K. Kurzhals, E. Zhang, and D. Weiskopf, “Comparative visual gaze analysis for virtual board games,” in Proceedings of the 13th International Symposium on Visual Information Communication and Interaction, Eindhoven, Netherlands, Dec. 2020, pp. 1–8. doi: 10.1145/3430036.3430038.
    3. E. Sood, S. Tannert, D. Frassinelli, A. Bulling, and N. T. Vu, “Interpreting attention models with human visual attention in machine reading comprehension,” arXiv preprint arXiv:2010.06396, 2020, [Online]. Available: https://arxiv.org/abs/2010.06396
    4. T. Munz, N. Schäfer, T. Blascheck, K. Kurzhals, E. Zhang, and D. Weiskopf, “Supplemental Material for Comparative Visual Gaze Analysis for Virtual Board Games.” DaRUS, 2020. doi: 10.18419/DARUS-1130.
    5. D. Holzmüller, “On the Universality of the Double Descent Peak in Ridgeless Regression,” arXiv preprint arXiv:2010.01851, 2020, [Online]. Available: https://openreview.net/pdf?id=0IO5VdnSAaH
    6. F. Heyen et al., “ClaVis: An Interactive Visual Comparison System for Classifiers,” in Proceedings of the International Conference on Advanced Visual Interfaces, Salerno, Italy, 2020, no. 9, pp. 1–9. doi: 10.1145/3399715.3399814.
    7. T. Munz, N. Schäfer, T. Blascheck, K. Kurzhals, E. Zhang, and D. Weiskopf, “Demo of a Visual Gaze Analysis System for Virtual Board Games,” Stuttgart, Germany, 2020. doi: 10.1145/3379157.3391985.
    8. V. Zaverkin and J. Kästner, “Gaussian Moments as Physically Inspired Molecular Descriptors for Accurate and Scalable Machine Learning Potentials,” Journal of Chemical Theory and Computation, vol. 16, no. 8, pp. 5410–5421, 2020, doi: 10.1021/acs.jctc.0c00347.
    9. G. Molpeceres, V. Zaverkin, and J. Kästner, “Neural-network assisted study of nitrogen atom dynamics on amorphous solid water – I. Adsorption and desorption,” Mon. Not. R. Astron. Soc., vol. 499, no. 1, pp. 1373–1384, 2020, doi: 10.1093/mnras/staa2891.
  4. 2019

    1. T. Munz, L. L. Chuang, S. Pannasch, and D. Weiskopf, “VisME: Visual microsaccades explorer,” Journal of Eye Movement Research, vol. 12, no. 6, Art. no. 6, Dec. 2019, doi: 10.16910/jemr.12.6.5.
    2. T. Munz, M. Burch, T. van Benthem, Y. Poels, F. Beck, and D. Weiskopf, “Overlap-Free Drawing of Generalized Pythagoras Trees for Hierarchy Visualization,” in 2019 IEEE Visualization Conference (VIS), Oct. 2019, pp. 251–255. doi: 10.1109/VISUAL.2019.8933606.
    3. G. Tkachev, S. Frey, and T. Ertl, “Local Prediction Models for Spatiotemporal Volume Visualization,” IEEE Transactions on Visualization and Computer Graphics, 2019, doi: 10.1109/TVCG.2019.2961893.

Project Network Coordinators


Ingo Steinwart

Univ.-Prof. Dr. rer. nat.


Daniel Weiskopf

Prof. Dr.