Small mobile devices such as fitness trackers, smartwatches, or mobile phones allow people to interact with real-time visualizations of data collected from physiological sensors anywhere and anytime, for example, while sitting on the bus, while walking or running, at home, or in the supermarket. This context differs from desktop usage and, combined with the minimal rendering space inherent to mobile devices such as smartwatches and fitness trackers, leads to interesting new challenges for visualizations. Therefore, the aim of this project is to study such small (interactive) data visualizations – micro visualizations – on mobile devices.
This project on micro visualizations for pervasive and mobile data exploration studies a broad range of challenges. First, the minimal rendering space requires studying fundamental components (i.e., visual variables) of visualizations at micro scale, both in the lab and in realistic settings, for example, while a person is moving. This second aspect also requires considering how long a person can focus on a visualization depicted on a mobile device – if a person is running, looking at the heart rate is no longer the primary task, and the visualization needs to convey information “at-a-glance.” A third challenge is to find visual representations that provide micro and macro readings within a single micro visualization for more complex data representations. Last, interaction with visualizations makes it possible to display larger and more complex visualizations by offering options to change view parameters.
In the context of pervasive simulations, mobile devices can enable interaction with visualizations of complex biomechanical simulations, for example, predictive simulations of the muscle behavior of a human arm. The goals and challenges described above can therefore be tested within a real-world scenario. This adds another dimension to the project – a multi-display connection between the simulation model, an interactive visualization in augmented reality (AR), and a micro visualization. In this scenario, we consider a smartwatch as the device that collects data from physiological sensors for use in the simulation, which requires porting and scaling existing, complex simulations to such a small device. A smartwatch also allows interaction with the AR visualization using micro visualizations as described above, as well as giving haptic feedback (e.g., using vibration) to a wearer. For example, a smartwatch can monitor whether the arm is bent far enough during a physical therapy session and guide a person performing arm movements. Therefore, micro visualizations on mobile devices will allow active engagement of a person in human-in-the-loop pervasive simulations.
Alternative Project Number: DFG ER 272/14-1
Project Name: Micro Visualizations for Pervasive and Mobile Data Exploration
Project Duration: March 2019 – December 2022
Project Leader: Dr. Tanja Blascheck
Project Members: Fairouz Grioui, PhD researcher