Project description
Small mobile devices such as fitness trackers, smartwatches, or mobile phones allow people to interact with real-time visualizations of data collected from physiological sensors anywhere and anytime, for example, while sitting on the bus, while walking or running, at home, or in the supermarket. This context differs from desktop usage and, combined with the minimal rendering space for data representations inherent to mobile devices such as smartwatches and fitness trackers, leads to interesting new challenges for visualizations. Therefore, the aim of this project is to study such small (interactive) data visualizations – micro visualizations – on mobile devices.
This project on micro visualizations for pervasive and mobile data exploration studies a broad range of challenges. First, the minimal rendering space requires studying the fundamental components (i.e., visual variables) of visualizations at micro scale, both in the lab and in realistic settings, for example, while a person is moving. This second aspect also needs to consider the time spent focusing on a visualization depicted on a mobile device – if a person is running, looking at the heart rate is no longer the primary task, and the visualization needs to convey information “at a glance.” A third challenge is to find visual representations that provide both micro and macro readings of more complex data within a single micro visualization. Last, interacting with visualizations makes it possible to display larger and more complex visualizations by offering options to change view parameters.
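To make the at-a-glance and micro/macro reading challenges more concrete, the following is a minimal sketch – not the project's implementation – of a word-sized heart-rate sparkline for a smartwatch, assuming an Android/Wear OS custom view and placeholder data. The polyline conveys the macro reading (recent trend) while the highlighted last sample conveys the micro reading (current value).

```kotlin
import android.content.Context
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint
import android.view.View

// Illustrative sketch of a tiny heart-rate sparkline; data and styling are placeholders.
class HeartRateSparkline(context: Context) : View(context) {

    // Recent heart-rate samples in beats per minute (placeholder data).
    var samples: List<Float> = listOf(72f, 75f, 90f, 110f, 121f, 118f, 102f, 95f)
        set(value) {
            field = value
            invalidate() // redraw when new samples arrive
        }

    private val linePaint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
        color = Color.WHITE
        strokeWidth = 2f
        style = Paint.Style.STROKE
    }
    private val dotPaint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
        color = Color.RED
        style = Paint.Style.FILL
    }

    override fun onDraw(canvas: Canvas) {
        super.onDraw(canvas)
        if (samples.size < 2) return
        val min = samples.minOrNull() ?: return
        val max = samples.maxOrNull() ?: return
        val range = (max - min).takeIf { it > 0f } ?: 1f
        val stepX = width.toFloat() / (samples.size - 1)

        // Macro reading: the overall trend as a polyline.
        var prevX = 0f
        var prevY = height - (samples[0] - min) / range * height
        for (i in 1 until samples.size) {
            val x = i * stepX
            val y = height - (samples[i] - min) / range * height
            canvas.drawLine(prevX, prevY, x, y, linePaint)
            prevX = x
            prevY = y
        }

        // Micro reading: emphasize the most recent value with a dot.
        canvas.drawCircle(prevX, prevY, 4f, dotPaint)
    }
}
```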
In the context of pervasive simulations, mobile devices can enable interaction with visualizations of complex biomechanical simulations, for example, predictive simulations of the muscle behavior of a human arm. The goals and challenges described above can therefore be tested within a real-world scenario. This adds another dimension to the project – a multi-display connection between the simulation model, an interactive visualization in augmented reality (AR), and the micro visualization. In this scenario, we consider a smartwatch as the device that collects data from physiological sensors to be used in the simulation, which requires porting and scaling existing, complex simulations to such a small device. A smartwatch also allows interaction with the AR visualization using micro visualizations as described above, and it can give haptic feedback (e.g., vibration) to the wearer, for example, to monitor whether the arm is bent far enough during a physical therapy session and to guide a person when performing arm movements. Therefore, micro visualizations on mobile devices will enable the active engagement of a person in human-in-the-loop pervasive simulations.
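As an illustration of the sensing and haptic-feedback loop, the following is a minimal sketch assuming a Wear OS smartwatch, Android's SensorManager API, and an invented intensity threshold: it reads the heart-rate sensor and triggers a short vibration when the threshold is exceeded. In the actual project, such sensor data would feed the biomechanical simulation, which is omitted here.

```kotlin
import android.app.Activity
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.os.Bundle
import android.os.VibrationEffect
import android.os.Vibrator

// Illustrative sketch only; requires the BODY_SENSORS permission on Wear OS.
class TherapyFeedbackActivity : Activity(), SensorEventListener {

    private lateinit var sensorManager: SensorManager
    private var vibrator: Vibrator? = null

    // Made-up threshold; a real application would derive it from the simulation.
    private val maxHeartRateBpm = 140f

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        sensorManager = getSystemService(SensorManager::class.java)
        vibrator = getSystemService(Vibrator::class.java)
    }

    override fun onResume() {
        super.onResume()
        // Subscribe to the built-in heart-rate sensor, if present.
        sensorManager.getDefaultSensor(Sensor.TYPE_HEART_RATE)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    override fun onPause() {
        super.onPause()
        sensorManager.unregisterListener(this)
    }

    override fun onSensorChanged(event: SensorEvent) {
        val bpm = event.values.firstOrNull() ?: return
        // Here the value could be forwarded to the simulation and the micro visualization.
        if (bpm > maxHeartRateBpm) {
            // Short vibration as at-a-glance feedback to slow down.
            vibrator?.vibrate(
                VibrationEffect.createOneShot(200L, VibrationEffect.DEFAULT_AMPLITUDE)
            )
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {
        // Not needed for this sketch.
    }
}
```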
Project information
Project title | Micro visualizations for pervasive and mobile data exploration |
Project leader | Tanja Blascheck |
Project duration | March 2019 - December 2022 |
Project number | PN7A-1 |
Alternative project number | DFG ER 272/14-1 |
Further information | https://www.vis.uni-stuttgart.de/en/projects/dfg-anr-micro-vis/ |
Publications PN7A-1
2023
- F. Grioui and T. Blascheck, “Heart Rate Visualizations on a Virtual Smartwatch to Monitor Physical Activity Intensity,” in Proceedings of the 18th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications. SCITEPRESS - Science and Technology Publications, 2023. doi: 10.5220/0011665500003417.
2021
- F. Grioui and T. Blascheck, “Study of Heart Rate Visualizations on a Virtual Smartwatch,” in Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology. ACM, 2021. doi: 10.1145/3489849.3489913.