Project description
A Digital Human Model is the vision of a simulated human that can be used to address medical and scientific questions. Clearly, such a simulation requires models of cognitive and perceptual processes in the brain. To gain the fundamental knowledge needed to inform such models, we commonly measure brain responses using electroencephalography (EEG). However, for a Digital Human Model to exhibit natural behavior in a natural world, we first need to investigate real human behavior and brain responses in natural environments. Currently, such experiments are rarely possible because the necessary analysis tools are missing. Here, we develop these much-needed tools using a new unified statistical framework based on deconvolution and Generalized Additive Models. The resulting models represent important steps towards multi-x modelling (FC2) in the Digital Human (FC1) and address challenges of ML4Sim (PN6).
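The core idea behind the deconvolution part of such a framework can be illustrated briefly: in free viewing, brain responses to successive events overlap in time, and linear deconvolution disentangles them by regressing the continuous EEG signal on a time-expanded (FIR) design matrix. The following is a minimal sketch of that idea in Python/NumPy; all signals, kernels, and event times are simulated for illustration and are not from the project itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated continuous signal: two event types whose responses overlap in time.
n = 2000                          # samples of "continuous EEG"
k = 30                            # response-kernel length in samples
onsets_a = np.sort(rng.choice(np.arange(n - k), 40, replace=False))
onsets_b = np.sort(rng.choice(np.arange(n - k), 40, replace=False))

t = np.arange(k) / 100.0          # hypothetical 100 Hz sampling
kern_a = np.sin(2 * np.pi * 5 * t) * np.exp(-8 * t)   # ground-truth response A
kern_b = np.exp(-((t - 0.15) ** 2) / 0.002)           # ground-truth response B

# Superimpose the (overlapping) responses plus noise.
y = np.zeros(n)
for o in onsets_a:
    y[o:o + k] += kern_a
for o in onsets_b:
    y[o:o + k] += kern_b
y += rng.normal(0, 0.1, n)

# Time-expanded design matrix: one column per event type and latency.
X = np.zeros((n, 2 * k))
for o in onsets_a:
    X[o:o + k, :k] += np.eye(k)
for o in onsets_b:
    X[o:o + k, k:] += np.eye(k)

# Deconvolution = ordinary least squares on the time-expanded design;
# the fitted coefficients are the overlap-corrected response estimates.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
est_a, est_b = beta[:k], beta[k:]
```

In the project's actual framework, the per-latency coefficients would additionally be modelled with smooth (spline) bases over continuous covariates, which is the Generalized Additive Model part that this sketch omits.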
Within the project network, PN 7.5 investigates connecting data-driven models based on eye-gaze data with modular simulations of cognition that include attention, perception, and decision-making modules. These models will be able to mimic behavior, but they are uninformed by actual brain processes. We will bridge this gap by combining measures of brain activity with statistical and cognitive models. In addition, we will connect with PN 7.5 by focusing on gaze-evoked brain activity. In broader terms, we want to build a foundation for investigating the human brain detached from experimental control, under fully natural conditions. The outcomes of this project can be used to build parts of a Digital Human Model applicable to natural experiments that require unrestricted gaze and concurrent analysis of brain activity: for example, revealing fatigue at demanding workplaces, detecting omissions of important information in visualizations or medical imaging, or improving human-computer interaction.
Project information
| Project title | Statistical modelling of combined eye movements and EEG |
| Project leaders | Benedikt Ehinger (Andreas Bulling) |
| Project staff | René Skukies, doctoral researcher |
| Project duration | November 2020 - April 2024 |
| Project number | PN 7-7 |
Publications PN 7-7
2023
- H. Bonasch and B. V. Ehinger, “Decoding accuracies as well as ERP amplitudes do not show between-task correlations,” Conference on Cognitive Computational Neuroscience, 2023, doi: 10.32470/CCN.2023.1029-0.
- R. S. Skukies and B. Ehinger, “The effect of estimation time window length on overlap correction in EEG data,” Conference on Cognitive Computational Neuroscience, 2023, doi: 10.32470/CCN.2023.1229-0.
- R. Frömer, M. R. Nassar, B. V. Ehinger, and A. Shenhav, “Common neural choice signals emerge artifactually amidst multiple distinct value signals,” bioRxiv, 2023, doi: 10.1101/2022.08.02.502393.
- A. R. Nikolaev, B. V. Ehinger, R. N. Meghanathan, and C. van Leeuwen, “Planning to revisit: Neural activity in refixation precursors,” Journal of Vision, vol. 23, no. 7, Art. no. 7, Jul. 2023, doi: 10.1167/jov.23.7.2.
- C. Yan, B. Ehinger, A. Pérez-Bellido, M. V. Peelen, and F. P. de Lange, “Humans predict the forest, not the trees: statistical learning of spatiotemporal structure in visual scenes,” Cerebral Cortex, vol. 33, no. 13, Art. no. 13, 2023, doi: 10.1093/cercor/bhad115.
2022
- A. L. Gert, B. V. Ehinger, S. Timm, T. C. Kietzmann, and P. König, “WildLab: A naturalistic free viewing experiment reveals previously unknown electroencephalography signatures of face processing,” European Journal of Neuroscience, vol. 56, no. 11, Art. no. 11, Nov. 2022, doi: 10.1111/ejn.15824.
2021
- A. Czeszumski et al., “Coordinating With a Robot Partner Affects Neural Processing Related to Action Monitoring,” Frontiers in Neurorobotics, vol. 15, Aug. 2021, doi: 10.3389/fnbot.2021.686010.