Extracting EEG Features of Dyadic Cooperation Skill Using an Explainable Deep Learning Approach

Poster No:

1626 

Submission Type:

Abstract Submission 

Authors:

Kazumasa Uehara1,2,3, Kenta Matsuoka2, Keiichi Kitajo2,4

Institutions:

1Toyohashi University of Technology, Toyohashi, Aichi, 2National Institute for Physiological Sciences, Okazaki, Japan, 3JST PRESTO, Tokyo, Japan, 4The Graduate Institute for Advanced Studies, SOKENDAI, Okazaki, Japan

First Author:

Kazumasa Uehara  
Toyohashi University of Technology|National Institute for Physiological Sciences|JST PRESTO
Toyohashi, Aichi|Okazaki, Japan|Tokyo, Japan

Co-Author(s):

Kenta Matsuoka  
National Institute for Physiological Sciences
Okazaki, Japan
Keiichi Kitajo  
National Institute for Physiological Sciences|The Graduate Institute for Advanced Studies, SOKENDAI
Okazaki, Japan|Okazaki, Japan

Introduction:

Cooperative cognitive and motor actions between individuals play crucial roles not only in social interactions but also in team sports and musical performance. In the past decade, the development of hyperscanning of brain activity has allowed us to explore the neural mechanisms underlying dyadic cooperation (Kawasaki et al. 2018; Abe et al. 2019). While our understanding of some of these mechanisms has advanced, a strongly hypothesis-driven approach and the excessive data preprocessing that is often taken for granted may overlook hidden neural mechanisms. To overcome these potential limitations, we adopted a data-driven approach based on deep learning to classify the success and failure of cooperative actions from hyperscanning EEG data. We then used a technique for producing visual explanations to extract the spatial and temporal EEG features that have the greatest impact on the classification decisions. A further open question is whether such a data-driven approach can detect individual differences in neural features. To explore this, the present study recruited musicians and non-musicians, because musicians exhibit a higher level of behavioral synchronization with others when playing their instruments than non-musicians do.
We aimed to assess the potential of a data-driven deep learning approach in musicians and non-musicians, who may differ in dyadic cooperation skill. In addition, we examined how visualization of the classification decision can help elucidate functional relationships between previously unseen neural features and behavioral performance.

Methods:

A total of twenty-three dyads were formed. Twenty-four of these individuals had daily musical experience; the remaining individuals had no intensive musical experience throughout their lives. During the experimental task, hyperscanning EEG was recorded using two 32-channel EEG acquisition systems. The experimental task was a dyadic visuomotor cursor-tracking task in which each participant used a joystick controller. Each dyad performed a total of 100 trials. For the behavioral data analysis, we computed the displacement between the cursor and the target throughout the target trajectory for each trial and then annotated every trial as a success or a failure of the cooperative action. For the EEG data analysis, band-pass-filtered EEG data (1–50 Hz) were fed into a convolutional neural network (CNN) together with the success/failure annotations. In this study, we were more interested in the individual brain activity that generates cooperative actions than in person-to-person brain connectivity; we therefore used intra-individual brain activity for the CNN classification. For the CNN analysis, using the EEGNet architecture (Lawhern et al. 2018), we addressed the binary classification problem of assigning success or failure to the cooperative actions. To explore which neural features have a strong impact on the model's classification decision, we applied gradient-weighted class activation mapping (Grad-CAM) as a visualization approach (Selvaraju et al. 2017).
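For illustration, the classification step could be set up along the lines of the following sketch. This is not the authors' code: the sampling rate, epoch length, hyperparameters, the stand-in data, and the use of a compact EEGNet-style Keras model (rather than the original EEGNet implementation) are assumptions made for the example.

```python
# Sketch only: band-pass filtering and an EEGNet-style CNN for binary
# success/failure classification of intra-individual EEG epochs.
# Sampling rate, epoch length and hyperparameters are placeholders.
import numpy as np
from scipy.signal import butter, filtfilt
from tensorflow.keras import layers, models, constraints

FS = 250             # assumed sampling rate (Hz)
N_CHANS = 32         # 32-channel EEG per participant
N_SAMPLES = 2 * FS   # assumed 2-s epoch per trial

def bandpass(x, low=1.0, high=50.0, fs=FS, order=4):
    """Zero-phase 1-50 Hz band-pass filter along the time axis."""
    b, a = butter(order, [low, high], btype="bandpass", fs=fs)
    return filtfilt(b, a, x, axis=-1)

def build_eegnet(n_chans=N_CHANS, n_samples=N_SAMPLES,
                 f1=8, d=2, f2=16, kern_len=64, dropout=0.5):
    """Compact EEGNet-style model (after Lawhern et al. 2018)."""
    inp = layers.Input(shape=(n_chans, n_samples, 1))
    # Temporal convolution: learns frequency-selective filters.
    x = layers.Conv2D(f1, (1, kern_len), padding="same", use_bias=False)(inp)
    x = layers.BatchNormalization()(x)
    # Depthwise spatial convolution: learns channel weights per temporal filter.
    x = layers.DepthwiseConv2D((n_chans, 1), depth_multiplier=d, use_bias=False,
                               depthwise_constraint=constraints.max_norm(1.0))(x)
    x = layers.BatchNormalization()(x)
    x = layers.Activation("elu")(x)
    x = layers.AveragePooling2D((1, 4))(x)
    x = layers.Dropout(dropout)(x)
    # Separable convolution: mixes the resulting feature maps.
    x = layers.SeparableConv2D(f2, (1, 16), padding="same", use_bias=False)(x)
    x = layers.BatchNormalization()(x)
    x = layers.Activation("elu")(x)
    x = layers.AveragePooling2D((1, 8))(x)
    x = layers.Dropout(dropout)(x)
    x = layers.Flatten()(x)
    out = layers.Dense(2, activation="softmax",
                       kernel_constraint=constraints.max_norm(0.25))(x)
    return models.Model(inp, out)

# X: (n_trials, n_chans, n_samples) EEG epochs; y: 0 = failure, 1 = success,
# derived from the cursor-target displacement on each trial (stand-in data here).
X = bandpass(np.random.randn(100, N_CHANS, N_SAMPLES))
y = np.random.randint(0, 2, size=100)
model = build_eegnet()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X[..., None], y, epochs=50, batch_size=16, validation_split=0.2)
```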

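The visualization step can be sketched in the same spirit. The Grad-CAM routine below is a generic implementation following Selvaraju et al. (2017), not the authors' code; the layer name, input shape, and normalization to [0, 1] are choices made for the example, and the resolution of the resulting relevance map (channels x time versus time only) depends on which convolutional layer is inspected.

```python
# Sketch only: Grad-CAM (Selvaraju et al. 2017) applied to the trained model
# above, to see which parts of an epoch drive the success/failure decision.
import tensorflow as tf

def grad_cam(model, epoch, class_idx, layer_name):
    """Relevance map of one EEG epoch for the given class.

    With the first temporal convolution layer the map keeps channel x time
    resolution; deeper layers yield time-only maps because the spatial
    dimension has already been collapsed by the depthwise convolution.
    """
    # Sub-model exposing both the chosen feature maps and the final output.
    grad_model = tf.keras.models.Model(
        model.inputs, [model.get_layer(layer_name).output, model.output])
    x = tf.convert_to_tensor(epoch[None, ..., None], dtype=tf.float32)
    with tf.GradientTape() as tape:
        fmaps, preds = grad_model(x)
        score = preds[:, class_idx]
    grads = tape.gradient(score, fmaps)              # d(score) / d(feature maps)
    weights = tf.reduce_mean(grads, axis=(0, 1, 2))  # average gradient per map
    cam = tf.reduce_sum(weights * fmaps[0], axis=-1) # weighted sum of maps
    cam = tf.nn.relu(cam).numpy()                    # keep positive evidence only
    return cam / (cam.max() + 1e-8)                  # normalise to [0, 1]

# Hypothetical usage: relevance map of the first trial for the "success" class;
# pick an actual convolutional layer name from model.summary().
# cam = grad_cam(model, X[0], class_idx=1, layer_name="conv2d")
```
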
Results:

The classification accuracy was 75.6% for the musician group and 65.5% for the non-musician group. Furthermore, Grad-CAM visualization of the classification decision revealed that, in musicians, preparatory EEG signals in frontal and occipital brain areas had the greatest impact on the classification decision, whereas in non-musicians, EEG signals in frontal and occipital brain areas immediately before and after the onset of the cooperative action contributed most to the decision.

Conclusions:

Our findings demonstrate the potential of a fusion of neurophysiology and artificial intelligence. These results may add an entirely new dimension to the validation and further development of cognitive and motor control theories and traditional neurophysiological studies.

Emotion, Motivation and Social Neuroscience:

Social Interaction

Higher Cognitive Functions:

Music

Modeling and Analysis Methods:

EEG/MEG Modeling and Analysis 1

Novel Imaging Acquisition Methods:

EEG 2

Keywords:

Cognition
Electroencephalography (EEG)
Machine Learning
Modeling
Social Interactions
Other - Deep learning

1|2 Indicates the priority used for review

References:

1. Abe M, Koike T, Okazaki S, Takahashi K, Watanabe K, Sadato N (2019) Neural correlates of online cooperation during joint force production. NeuroImage, 150–161.
2. Kawasaki M, Kitajo K, Yamaguchi Y (2018) Sensory-motor synchronization in the brain corresponds to behavioral synchronization between individuals. Neuropsychologia 119:59–67.
3. Lawhern VJ, Solon AJ, Waytowich NR, Gordon SM, Hung CP, Lance BJ (2018) EEGNet: a compact convolutional neural network for EEG-based brain-computer interfaces. J Neural Eng 15(5).
4. Selvaraju RR, Cogswell M, Das A, Vedantam R (2017) Grad-CAM: visual explanations from deep networks via gradient-based localization. Proceedings of the IEEE International Conference on Computer Vision (ICCV).