Context Modulates Perceived Empathy in Human-Voice Agent Interactions

Poster No:

747 

Submission Type:

Abstract Submission 

Authors:

Hurshitha Vasudevan1, Bhoomika Kar2

Institutions:

1Center of Behavioural and Cognitive Sciences, University of Allahabad, Prayagraj, Uttar Pradesh, India, 2Center of Behavioural and Cognitive Sciences, University of Allahabad, Prayagraj, Uttar Pradesh, India

First Author:

Hurshitha Vasudevan  
Center of Behavioural and Cognitive Sciences, University of Allahabad
Prayagraj, Uttar Pradesh

Co-Author:

Bhoomika Kar  
Center of Behavioural and Cognitive Sciences, University of Allahabad
Prayagraj, Uttar Pradesh

Introduction:

Perceived empathy refers to the ability of humans to form an embodied representation of another person's emotional state while simultaneously being aware of the causal mechanism that induced that state [1]. The present study investigated perceived empathy in human-voice agent interactions using the 'computers as social actors' (CASA) paradigm, in which computers are treated as social agents by humans [2]; the CASA paradigm also extends to voice agents [3]. We hypothesized that empathy shown by a voice agent and by a human is perceived similarly in a human-voice agent interaction and elicits similar activation in empathy-related brain regions. Considering that human-voice agent interactions occur across different social contexts, we further hypothesized that context would modulate the level of perceived empathy in an empathetic human-voice agent interaction, as reflected both in participants' behavioral responses and in brain activation while they passively viewed the videos.

Methods:

Sixteen video stimuli, each 1-4 minutes long, were developed across four contexts (Figure 1). The videos were rated by 31 participants, and the two contexts with the highest mean ratings for dialogue/context comprehension ('visit to a nutritionist' and 'plan a trip') were selected for the fMRI experiment.
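As an illustration of this selection step, a minimal Python sketch might look as follows; the file name and column names are hypothetical placeholders, not taken from the study.

    import pandas as pd

    # Hypothetical ratings table: one row per rater x video, with the
    # context each video belongs to and its comprehension rating.
    ratings = pd.read_csv("video_ratings.csv")  # rater, context, comprehension

    # Mean dialogue/context comprehension rating per context.
    context_means = (
        ratings.groupby("context")["comprehension"]
        .mean()
        .sort_values(ascending=False)
    )

    # Keep the two best-comprehended contexts for the fMRI experiment.
    selected = context_means.head(2).index.tolist()
    print(context_means)
    print("Selected contexts:", selected)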
Twenty-three participants were scanned in a 3 Tesla MRI scanner (TR = 2780 ms, TE = 22 ms, FOV = 250 mm, 52 slices, voxel size = 2.5 x 2.5 x 3.0 mm³) while they viewed eight videos from the 'visit to a nutritionist' and 'plan a trip' contexts across four empathy conditions. After every video, participants rated perceived empathy in the human-voice agent interaction on a 4-point Likert scale.
Supporting Image: Figure1VideoStimulusDevelopment.jpg
 

Results:

A 2 (context) x 4 (empathy condition) repeated-measures ANOVA on participants' ratings revealed a significant effect of empathy condition, with perceived empathy differing according to whether the human, the voice agent, or both agents showed empathy (p < .05), and a significant effect of context when the voice agent or both agents showed empathy (p < .05).

The BOLD response acquired during video viewing was analyzed using the general linear model in SPM12 (p < 0.05, FWE-corrected). The effect of empathy showed significant activation in the superior temporal gyrus (STG), a region associated with cognitive empathy and perspective taking. The interaction between context and empathy, for both the human and the voice agent, showed activation in the left STG. The contrast 'human showing empathy > voice agent showing empathy' showed activation in the posterior insula, associated with affective empathy, in the 'visit to a nutritionist' context, and in the left cingulate gyrus, associated with cognitive empathy, in the 'plan a trip' context.

Region-of-interest (ROI) based functional connectivity analysis in CONN (p < 0.05, uncorrected; Figure 2) showed that the right STG and the anterior cingulate cortex, both associated with cognitive empathy, were positively correlated. Activity in the STG and anterior insula was positively correlated when the human showed empathy and negatively correlated when the voice agent showed empathy in the 'visit to a nutritionist' context. Increased connectivity between the anterior insula and the STG was observed when the voice agent showed empathy in the 'plan a trip' context, an effect that was not significant for the human actor. Thus, the neural correlates coding for empathy varied as a function of context for both agents. The increased positive correlation between the right insula and the STG for both agents suggests an interaction between affective and cognitive empathy.
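As a sketch of the behavioral analysis above, the 2 x 4 repeated-measures ANOVA could be run in Python with the pingouin package as follows; the file and column names are hypothetical, not from the study.

    import pandas as pd
    import pingouin as pg

    # Hypothetical long-format table: one row per subject x video, with
    # columns subject, context (2 levels), empathy (4 levels), rating (1-4).
    df = pd.read_csv("empathy_ratings_long.csv")

    # Two-way repeated-measures ANOVA: context (2) x empathy condition (4).
    aov = pg.rm_anova(data=df, dv="rating", within=["context", "empathy"],
                      subject="subject", detailed=True)
    print(aov[["Source", "F", "p-unc", "ng2"]])

    # Bonferroni-corrected pairwise comparisons of empathy conditions
    # within each context.
    posthoc = pg.pairwise_tests(data=df, dv="rating",
                                within=["context", "empathy"],
                                subject="subject", padjust="bonf")
    print(posthoc)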
Supporting Image: ColorfulIllustrativeYouthSportsA3LandscapePoster.jpg
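The ROI-to-ROI connectivity measure reported above can be approximated outside CONN with a short nilearn sketch; the seed coordinates and file name below are hypothetical placeholders, not the study's actual ROIs.

    import numpy as np
    from nilearn.maskers import NiftiSpheresMasker

    # Hypothetical MNI seed coordinates (mm): right STG and right anterior insula.
    seeds = [(60, -24, 6), (38, 20, -4)]
    masker = NiftiSpheresMasker(seeds=seeds, radius=6.0,
                                standardize=True, t_r=2.78)

    # Extract one mean time series per ROI from a preprocessed 4D run
    # (hypothetical file name) for a given condition.
    ts = masker.fit_transform("sub-01_task-empathy_desc-preproc_bold.nii.gz")

    # ROI-to-ROI functional connectivity as a Pearson correlation,
    # Fisher z-transformed for group-level comparison.
    r = np.corrcoef(ts.T)[0, 1]
    z = np.arctanh(r)
    print(f"STG-insula correlation: r = {r:.2f}, z = {z:.2f}")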
 

Conclusions:

Behavioral and fMRI results support the perception of empathy from a voice agent in a human-voice agent interaction. The fMRI results highlight the role of context in modulating perceived empathy in human-voice agent interactions, coded by regions associated with cognitive and affective empathy, namely the superior temporal gyrus, insula, and cingulate cortex, and reflected in condition-specific contrasts and connectivity results. These findings underline the importance of explicit empathetic cues in the voice and in the scripts for effective human-voice agent interaction.

Emotion, Motivation and Social Neuroscience:

Emotional Perception 1
Social Cognition
Social Interaction 2
Social Neuroscience Other

Novel Imaging Acquisition Methods:

BOLD fMRI

Keywords:

Cognition
Design and Analysis
Emotions
fMRI CONTRAST MECHANISMS
FUNCTIONAL MRI
Limbic Systems
Social Interactions
Other - Perceived Empathy; Computers as Social Actors; Social Contexts; Conversational Agent

1|2 indicates the priority used for review

References:

[1] Gonzalez-Liencres, C., Shamay-Tsoory, S. G., & Brüne, M. (2013). Towards a neuroscience of empathy: Ontogeny, phylogeny, brain mechanisms, context, and psychopathology. Neuroscience & Biobehavioral Reviews, 37(8), 1537-1548.

[2] Nass, C., Steuer, J., & Siminoff, E. (1994). Computers are social actors. Proceedings of the Conference on Human Factors in Computing Systems, 204.

[3] Gambino, A., Fox, J., & Ratan, R. A. (2020). Building a stronger CASA: Extending the computers are social actors paradigm. Human-Machine Communication, 1, 71–85.