PEARL: Physical Environment based Augmented Reality Lenses for In-Situ Human Movement Analysis
Abstract
This paper presents Pearl, a mixed-reality approach to the analysis of human movement data in situ. As the physical environment shapes human motion and behavior, the analysis of such motion can benefit from the direct inclusion of the environment in the analytical process. We present methods for exploring movement data in relation to surrounding regions of interest, such as objects, furniture, and architectural elements. We introduce concepts for selecting and filtering data through direct interaction with the environment, and a suite of visualizations for revealing aggregated and emergent spatial and temporal relations. More sophisticated analysis is supported through complex queries comprising multiple regions of interest. To illustrate the potential of Pearl, we develop an Augmented Reality-based prototype and conduct expert review sessions and scenario walkthroughs in a simulated exhibition. Our contribution lays the foundation for leveraging the physical environment in the in-situ analysis of movement data.
Media: Videos, Slides, and Supplemental Material
Recorded Talk @ ACM CHI ’23
Coming soon!
Accompanying Video
Coming soon!
Download the Author Version of the Paper:
Coming soon!
Download the Appendix:
Coming soon!
Supplemental Material
- Coming soon!
Related Publication
@inproceedings{luo2023pearl,
author = {Weizhou Luo and Zhongyuan Yu and Rufat Rzayev and Marc Satkowski and Stefan Gumhold and Matthew McGinity and Raimund Dachselt},
title = {PEARL: Physical Environment based Augmented Reality Lenses for In-Situ Human Movement Analysis},
booktitle = {Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems},
series = {CHI '23},
year = {2023},
month = {04},
location = {Hamburg, Germany},
doi = {10.1145/3544548.3580715},
publisher = {ACM},
address = {New York, NY, USA}
}
Acknowledgments
We thank Wolfgang Büschel and Annett Mitschick for their support with this paper, our study participants, and the anonymous reviewers for their constructive feedback and suggestions. This work was supported by the German Research Foundation (DFG, Deutsche Forschungsgemeinschaft) under Germany’s Excellence Strategy – EXC-2068 – 390729961 – Cluster of Excellence “Physics of Life” and EXC 2050/1 – Project ID 390696704 – Cluster of Excellence “Centre for Tactile Internet with Human-in-the-Loop” (CeTI) of TU Dresden, DFG grant 389792660 as part of TRR 248 (see https://perspicuous-computing.science), by the German Federal Ministry of Education and Research (BMBF, SCADS22B), and by the Saxon State Ministry for Science, Culture and Tourism (SMWK) through funding of the competence center for Big Data and AI “ScaDS.AI Dresden/Leipzig”.