Poster No:
2548
Submission Type:
Abstract Submission
Authors:
Xiao Fu1, Etienne Roesch1, Katie Gray1
Institutions:
1University of Reading, Reading, Berkshire
First Author:
Xiao Fu
University of Reading
Reading, Berkshire
Co-Author(s):
Katie Gray
University of Reading
Reading, Berkshire
Introduction:
Social perception research has tended to investigate the processing of individuals; however, people are often viewed interacting with each other. These social interactions are thought to be detected quickly, and may also recruit holistic processing (Vestner et al., 2020), whereby multiple features are integrated into a non-decomposable whole. The current study used steady-state visually evoked potentials (SSVEPs) to investigate whether facing dyads (interactions viewed from a third-person perspective) are processed holistically, and whether this effect is specific to social interactions or also generalises to objects.
Methods:
Thirty-two participants viewed pairs of images presented together as a dyad. The two images were flickered at different rates, one at 5.45 Hz (F1) and the other at 7.5 Hz (F2), whilst participants' brain activity was recorded with EEG. The dyads were presented in two configurations (facing and non-facing), and there were three image types (bodies, chairs, and lamps). Because the two images in a dyad appeared on the screen simultaneously, responses could emerge at intermodulation frequencies (nF1 ± mF2); responses at these frequencies are considered to indicate the existence and degree of neural integration (Gordon et al., 2019). Signals were decomposed using the Fast Fourier Transform in Python, with a frequency resolution of 0.0143 Hz. Responses were calculated as the power at the frequency of interest divided by the average power of its forty neighbouring frequency bins.
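The response measure described above can be sketched as follows. This is an illustrative reconstruction, not the authors' analysis code: all parameters (sampling rate, duration, noise level, the multiplicative term used to simulate integration) are assumptions chosen so that F1, F2, and F1+F2 fall on exact FFT bins in a short simulated recording.

```python
import numpy as np

def snr_at(signal, fs, freq, n_neighbours=40):
    """Response at a frequency of interest: power in that FFT bin
    divided by the mean power of its n_neighbours surrounding bins
    (half on each side), as described in the Methods."""
    n = len(signal)
    power = np.abs(np.fft.rfft(signal)) ** 2
    k = int(round(freq * n / fs))            # bin index of the target frequency
    half = n_neighbours // 2
    neighbours = np.r_[power[k - half:k], power[k + 1:k + half + 1]]
    return power[k] / neighbours.mean()

# Toy simulation (illustrative values; 20 s so that F1, F2 and F1+F2
# all land on exact FFT bins and no leakage correction is needed).
fs, dur, f1, f2 = 256, 20.0, 5.45, 7.5
t = np.arange(0, dur, 1.0 / fs)
s1, s2 = np.sin(2 * np.pi * f1 * t), np.sin(2 * np.pi * f2 * t)
noise = 0.5 * np.random.default_rng(0).standard_normal(t.size)

linear = s1 + s2 + noise                      # no integration: no IM response
nonlinear = s1 + s2 + 0.2 * s1 * s2 + noise   # multiplicative term creates F1±F2

print(snr_at(linear, fs, f1))            # large: strong fundamental response
print(snr_at(linear, fs, f1 + f2))       # near 1: noise level at the IM frequency
print(snr_at(nonlinear, fs, f1 + f2))    # well above 1: integration signature
```

The key point the simulation makes is that a response at F1+F2 only arises when the two inputs interact nonlinearly, which is why the abstract treats intermodulation responses as evidence of neural integration.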
Results:
We first tested whether activation at the frequencies of interest was significantly higher than baseline using one-sample t-tests. As expected, we found significant responses at the fundamental frequencies (F1, F2), and up to the third harmonic, showing that we were able to elicit SSVEPs in response to our stimuli. Importantly, we also found significant responses in the facing-body and facing-lamp conditions at an intermodulation frequency (F1+F2), indicating that facing bodies and lamps were integrated into a representation that was different from the sum of their parts. To explore the effects of dyad configuration and image type further, repeated-measures ANOVAs were conducted on each frequency of interest separately. Responses at the fundamental frequencies (F1, F2) differed significantly between image types, but not between configurations. There was no significant effect of image type or configuration on responses at the second-order intermodulation frequencies.
Conclusions:
The findings indicate that facing body dyads show some evidence of recruiting holistic processing, whereas the same was not found for non-facing body dyads. However, this effect may not be specific to social interactions, as the same effect was found for facing lamp stimuli. This raises the possibility that domain-general attentional cues are responsible for the effects.
Emotion, Motivation and Social Neuroscience:
Social Interaction 2
Perception, Attention and Motor Behavior:
Perception: Visual 1
Keywords:
Electroencephalography (EEG)
Social Interactions
Vision
1|2 Indicates the priority used for review
References:
Vestner, T., Gray, K. L. H., & Cook, R. (2020). Why are social interactions found quickly in visual search tasks? Cognition, 200, 104270. https://doi.org/10.1016/j.cognition.2020.104270
Gordon, N., Hohwy, J., Davidson, M. J., van Boxtel, J. J. A., & Tsuchiya, N. (2019). From intermodulation components to visual perception and cognition - a review. NeuroImage, 199, 480–494. https://doi.org/10.1016/j.neuroimage.2019.06.008