The ability to autonomously perform activities of daily living (ADL), such as cutting food or filling a glass from a bottle during a meal, is crucial for self-determination and quality of life in patients with neurodegenerative diseases such as Parkinson’s disease, multiple sclerosis, or cerebellar ataxia. Although these patients can plan motor actions, tremor or overshooting movements disrupt their execution. Assistive devices that proactively suppress such dysfunctional movement components could dramatically improve patients’ motor abilities. However, predicting planned movements during ADL remains profoundly challenging.
The goal of this research project is to pioneer the first non-invasive approach that suppresses unwanted (dysfunctional) movements while permitting intended ones. This requires computational methods that predict human arm movement intention and execution from multi-modal bio-signals, drawing on both real and synthetic data recorded during everyday tasks. To this end, the project aims to combine eye movement recordings, multi-modal arm motion recordings, and novel machine learning techniques with a biophysical neuro-musculoskeletal human model to predict the next most likely intended user action. The proposed project will thus not only contribute new methods for generating control signals, based on live data integration, for the host group’s neuro-musculoskeletal computer simulations; it will also complement ongoing efforts in the collaborative host group, partly funded by an ERC Starting Grant, by studying intent prediction during motor actions performed as part of ADL.