Investigating finger-tapping and pupillometry as potential indicators of presence in VR

Poster Presentation 53.467: Tuesday, May 21, 2024, 8:30 am – 12:30 pm, Pavilion
Session: 3D Perception: Virtual and augmented reality

Sean Hinkle1, Shayna Soares1, Robert Bacon1, Corey Bohil2; 1University of Central Florida, 2Lawrence Technological University

Spatial presence in VR, the feeling of ‘being there,’ is linked to outcomes in clinical, training, education, and entertainment applications. Overreliance on survey measures has hampered progress, leaving the field with few generalizable alternatives. Researchers have tested physiological, neuroimaging, and behavioral measures in search of continuous, objective indicators of presence. In two studies we evaluated finger-tapping and pupillometry as potential indicators of presence. We predicted that variance in inter-tap intervals (ITIs) and pupil size would predict presence, and that a neural-net classifier could distinguish high- from low-presence conditions at the individual-subject level. In Experiment 1, participants walked a “virtual plank” while tapping to a rhythm, either at heights or on the ground, to manipulate presence. Surveys confirmed that the heights manipulation affected presence (p = .04). ITI variance did not follow this pattern (p = .375). A feedforward neural-net classifier was trained on tapping and pupillometry data at the individual level. For finger-tapping, the classifier identified the condition of four-second windows of finger-position data with 77% accuracy. Pupillometry data yielded 70% accuracy, but a lighting confound weakened our conclusions. In Experiment 2, participants watched two 360-degree videos twice, with and without sound, to manipulate presence while controlling global luminance. Each video was analyzed separately. Surveys confirmed that sound increased presence for both videos (all ps < .05), but pupil variance did not follow this pattern. The neural-net classifier failed to replicate the high accuracy of Experiment 1, reaching only 57% and 55%. Our results demonstrate that finger-tapping is a promising indicator of presence in VR and is especially sensitive when analyzed via neural-net classifier. While results for pupillometry are mixed, we believe that pupillometry and other eye-tracking metrics merit further investigation with more refined machine-learning methods, potentially in combination with finger-tapping.
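The preprocessing steps implied above — computing ITI variance from tap timestamps and segmenting finger-position traces into four-second windows for classification — can be sketched as follows. This is a minimal illustrative sketch, not the authors' analysis code; the function names, sampling rate, and non-overlapping windowing scheme are assumptions, as the abstract specifies only the four-second window length.

```python
import numpy as np

def inter_tap_intervals(tap_times):
    """Inter-tap intervals (ITIs): differences between successive
    tap timestamps, in seconds."""
    return np.diff(np.asarray(tap_times, dtype=float))

def iti_variance(tap_times):
    """Variance of the ITIs for one trial — the tapping-regularity
    measure tested as a presence indicator."""
    return float(np.var(inter_tap_intervals(tap_times)))

def window_signal(signal, sample_rate_hz, window_s=4.0):
    """Split a 1-D finger-position trace into non-overlapping
    windows (the classifier operated on four-second windows).
    Trailing samples that do not fill a window are dropped."""
    n = int(sample_rate_hz * window_s)
    usable = (len(signal) // n) * n
    return np.asarray(signal[:usable], dtype=float).reshape(-1, n)
```

Each window produced this way would then be labeled with its condition (heights vs. ground, or sound vs. no sound) and fed to a classifier trained per participant.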