Spatio-Temporal Analysis and Inference Toolkit: Standardizing the organization and analysis of eye, head, and body timeseries data for research using mixed-reality devices
Poster Presentation: Tuesday, May 20, 2025, 8:30 am – 12:30 pm, Pavilion
Session: 3D Processing: Space, coordinate frames, virtual environments
Russell Cohen Hoffing1, Michael Nonte2, Thomas Dang2, Jonathan Touryan1; 1U.S. Army DEVCOM Army Research Laboratory, 2DCS Corporation
Spatial computing technologies have enabled a new approach to understanding human behavior and cognition in real-world environments. Mixed-reality head-mounted displays in particular give researchers access to a rich set of perspective, position, orientation, and world data during naturalistic behavior. However, a lack of standardization of spatio-temporal data has stymied efforts to analyze, quantify, and compare behavior across different experimental settings and devices. Here we present a standardized schema along with the Spatio-Temporal Analysis and Inference Toolkit (STAIT) to overcome some of these obstacles. This standard implements a homogeneous data format with tools to synchronize and analyze data across spatio-temporal sensors, as well as augmented, virtual, and mixed reality devices. The implementation of this toolbox is intended to support multi-agent, mixed-reality contexts where multiple individuals and robots, with different sensors and effectors, exist in the same space and time. We will present the framework of the toolbox and sample analyses demonstrating its utility.
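The abstract does not specify the schema itself. The following minimal sketch in Python, using hypothetical names (SpatioTemporalStream, resample_to) that are not part of the actual STAIT API, illustrates one way a homogeneous timeseries record and timestamp-based synchronization across sensors could be organized under those assumptions.

    # Hypothetical sketch only -- not the actual STAIT schema or API.
    # Illustrates a homogeneous record for one spatio-temporal sensor stream
    # and synchronization of streams onto a shared time base by interpolation.
    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class SpatioTemporalStream:
        """Timeseries of 3D samples from one sensor (e.g., head, eye, body)."""
        agent_id: str           # which person or robot produced the data
        sensor: str             # e.g., "head_pose", "gaze", "hand"
        timestamps: np.ndarray  # seconds, monotonically increasing
        positions: np.ndarray   # shape (N, 3), world coordinates

    def resample_to(stream: SpatioTemporalStream, common_t: np.ndarray) -> np.ndarray:
        """Linearly interpolate a stream's positions onto a shared time base."""
        return np.stack(
            [np.interp(common_t, stream.timestamps, stream.positions[:, k]) for k in range(3)],
            axis=1,
        )

    # Example: align a 60 Hz head-pose stream and a 120 Hz gaze stream at 100 Hz.
    head = SpatioTemporalStream("p01", "head_pose", np.linspace(0, 1, 60), np.random.rand(60, 3))
    gaze = SpatioTemporalStream("p01", "gaze", np.linspace(0, 1, 120), np.random.rand(120, 3))
    common_t = np.arange(0.0, 1.0, 0.01)
    head_sync, gaze_sync = resample_to(head, common_t), resample_to(gaze, common_t)

Resampling onto a common clock is one plausible synchronization strategy for sensors with different sampling rates; the toolkit's actual data format and alignment method may differ.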