Integrating Haptic Cues Distinctively Improved Judgement of Anger and Disgust Facial Expressions
Poster Presentation 16.314: Friday, May 15, 2026, 3:45 – 6:00 pm, Banyan Breezeway
Session: Multisensory Processing: Visual-tactile
Cameron Mavericks Choo1, Hong Xu2; 1Interdisciplinary Graduate Programme (Neuroscience), Nanyang Technological University, 2School of Social Sciences, Nanyang Technological University
Anger and disgust are often confused in facial-expression recognition tasks because their visual cues overlap. Integrating information from different sensory modalities (multisensory integration) may yield more reliable and accurate affective cues. In this study, we investigated whether visual-haptic integration could improve performance in differentiating these emotions. Forty-seven participants completed a 2 (emotion: anger, disgust) x 3 (modality: visual, haptic, visual-haptic) within-subjects emotion categorization task incorporating a "none of the above" option to mitigate possible forced-choice artifacts. Accuracy, response latency, and judgement confidence were also measured. Stimuli consisted of facial expressions from the NTU Asian Face Database and affective human haptic gestures developed and validated in our lab. Trial-level responses were analyzed to assess perceptual performance across conditions, and Poisson regression with model comparisons was used to assess classification patterns. We found that modality played a significant role in response accuracy (χ²(12) = 566.08, p < .001). Post-hoc tests with Bonferroni correction indicated that visual-haptic integration significantly increased correct identification frequencies for both emotions relative to visual trials (p’s < .001). Correspondingly, misclassification rates (i.e., wrong choice of emotion) were reduced for anger (p < .001) but not for disgust (p = .098), suggesting that uncertainties in identifying disgust (i.e., ‘none of the above’ responses) were mitigated while misclassifications of disgust as anger persisted even with visual-haptic integration. Mixed-effects models further revealed that this multisensory enhancement extended beyond simple error correction: visual-haptic conditions yielded higher accuracy probabilities, shorter response latencies, and higher confidence levels than unimodal conditions (p’s < .001).
These findings demonstrate that haptic cues effectively disambiguate facial expressions via emotion-specific mechanisms: reducing false perceptions of anger while enhancing the signal salience of disgust. Furthermore, they suggest that multisensory integration provides distinct functional benefits across different affective categories in emotion perception and judgement.
Acknowledgements: Neuroscience, Interdisciplinary Graduate Programme, Nanyang Technological University, Singapore