Intuitive avatar gait control through multimodal EEG and eye-gaze brain-machine interface

Poster No:

2046 

Submission Type:

Abstract Submission 

Authors:

Taiga Seri1, Seitaro Iwama2, Junichi Ushiba2

Institutions:

1Graduate School of Science and Technology, Keio University, Kanagawa, Japan, 2Department of Biosciences and Informatics, Faculty of Science and Technology, Keio University, Kanagawa, Japan

First Author:

Taiga Seri  
Graduate School of Science and Technology, Keio University
Kanagawa, Japan

Co-Author(s):

Seitaro Iwama  
Department of Biosciences and Informatics, Faculty of Science and Technology, Keio University
Kanagawa, Japan
Junichi Ushiba  
Department of Biosciences and Informatics, Faculty of Science and Technology, Keio University
Kanagawa, Japan

Introduction:

Brain-machine interfaces (BMIs) provide a direct pathway between the brain and external devices, with user intention inferred from spectral power changes in the scalp electroencephalogram (EEG) [1]. Event-related desynchronization (ERD) of the sensorimotor rhythm (SMR) has traditionally been used in BMI paradigms to decode human motor intention in simple tasks [2]. However, the decoded contents have been limited to flexion or extension of the shoulder, wrist, or fingers. Here, to improve the intuitiveness of the user experience by estimating complex intentions beyond single hand-related actions, this study proposes a novel BMI paradigm in which the gait of an avatar in a virtual environment is controlled by motor imagery (MI) of walking.

Methods:

Ten healthy, right-handed subjects (22.5±1.0 years) participated in this study; their EEG and gaze directions were processed in real time to control an avatar navigating a circular course in a virtual reality space. During the experiment, EEG signals were recorded at a sampling rate of 1000 Hz using a 128-channel scalp EEG cap (HydroCel Geodesic Sensor Net, Electrical Geodesics, Inc.), and eye-gaze signals were recorded binocularly at a sampling rate of 1200 Hz using a screen-based eye tracker (Tobii Pro Spectrum, Tobii AB). In our BMI protocol (Fig. 1-A), the presence of movement intention (GO/STOP) was decoded from the SMR-ERD related to motor imagery, and the preferred movement direction was estimated from conscious eye movement. For movement direction, we implemented an eye-gaze control architecture supported by real-time semantic segmentation, which recognizes clusters of pixels forming distinctive categories in the external virtual environment. All participants controlled an avatar configured to move forward based on MI of walking (Foot-model) or of unclenching the right hand (Hand-model), following the experimental procedure shown in Fig. 1-Ba. After MI training (Fig. 1-Bb), the avatar control session (Fig. 1-Bc) was conducted twice, with each session followed by a questionnaire. For the evaluation of a BMI that realizes intuitive gait control of an avatar, we posit that the sense of ownership (SoO) [3] and the sense of agency (SoA) [4] are as important as EEG quality or control success rate. Originating from the rubber hand illusion [5], the use of virtual reality reframes SoO and SoA as questions of experiencing a virtual body representation as one's own. Hence, the intuitiveness of the user experience is clarified by whether users perceive the avatar as an extension of themselves or merely as a tool.
Supporting Image: ohbm_1.png
   ·Fig. 1. BMI protocol and experimental procedures
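The GO/STOP decoding step can be sketched in a few lines. The sketch below is a minimal illustration, not the study's implementation: it assumes a simple periodogram band-power ERD computed against a resting baseline and a linear mapping onto the 1-20 gauge; the band edges, window length, linear mapping, and synthetic signals are illustrative assumptions, while the 1-20 gauge range and the GO threshold of 10 come from the protocol described here.

```python
import numpy as np

def band_power(x, fs, f_lo, f_hi):
    """Mean periodogram power of a 1-D EEG window within [f_lo, f_hi] Hz."""
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[band].mean()

def erd_percent(task_power, rest_power):
    """ERD as percent power change from rest; negative values mean desynchronization."""
    return 100.0 * (task_power - rest_power) / rest_power

def erd_to_gauge(erd):
    """Map ERD (0 % .. -100 %) linearly onto the 1-20 gauge; stronger ERD -> higher gauge."""
    frac = np.clip(-erd, 0.0, 100.0) / 100.0
    return 1.0 + 19.0 * frac

def decode_go(gauge, threshold=10.0):
    """GO while the gauge exceeds the threshold (10 in this protocol)."""
    return gauge > threshold

# Synthetic 1-s windows at 1000 Hz: a 10 Hz mu rhythm at rest,
# attenuated during motor imagery (illustrative signals, not recorded data).
fs = 1000
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
rest = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(fs)
imagery = 0.3 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(fs)

erd = erd_percent(band_power(imagery, fs, 8, 13), band_power(rest, fs, 8, 13))
gauge = erd_to_gauge(erd)
print(erd, gauge, decode_go(gauge))  # strong mu suppression -> gauge above 10 -> GO
```

In a real-time setting this computation would run on a sliding window per decision cycle; the choice of reference (resting baseline vs. adaptive) and of spectral estimator is a design decision the abstract does not specify.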
 

Results:

During the sessions, a virtual gauge ranging from 1 to 20 represented the time course of SMR-ERD and the decoded presence of movement intention. The gauge tracked the time-frequency map on the same timeline, and the avatar moved straight while the gauge exceeded 10. ERD in both the beta band (18-23 Hz) and the mu band (8-13 Hz) was observed in the Foot-model (Fig. 2-Aa), whereas only mu-band ERD was observed in the Hand-model (Fig. 2-Ab). Topographic representations of SMR-ERD appeared over the primary sensorimotor cortices: bilaterally in the Foot-model (Fig. 2-Ba) and only over the left hemisphere in the Hand-model (Fig. 2-Bb). The overall success rate of completing the course was comparable between the Foot-model (77%) and the Hand-model (73%). On the other hand, ERD durability, defined as the proportion of time the gauge exceeded the threshold, was significantly higher (p<0.05) in the Hand-model (62.61%) than in the Foot-model (51.88%). Meanwhile, although both models induced significant SoO (p<0.001) and SoA (p<0.001), SoO was significantly higher in the Foot-model (p<0.01, Wilcoxon signed-rank test; Fig. 2-C).
Supporting Image: ohbm_2.png
   ·Fig. 2. Time-frequency map and ERD gauge, topography map, and questionnaire ratings
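The ERD durability metric reported above is simple to compute. The sketch below is illustrative only: the gauge traces are invented toy data, and the 10-point threshold follows the protocol; only the metric's definition (proportion of samples above threshold) comes from the abstract.

```python
import numpy as np

def erd_durability(gauge_trace, threshold=10.0):
    """Proportion of control-period samples where the ERD gauge exceeded the threshold."""
    gauge_trace = np.asarray(gauge_trace, dtype=float)
    return float(np.mean(gauge_trace > threshold))

# Toy gauge traces (not the study's data): the "hand" trace stays
# above threshold for a larger share of the trial than the "foot" trace.
hand = [12, 14, 9, 15, 13, 8, 16, 12, 11, 14]
foot = [12, 9, 8, 15, 13, 7, 16, 9, 11, 8]
print(erd_durability(hand))  # 0.8
print(erd_durability(foot))  # 0.5
```

With one durability value per participant and condition, a paired nonparametric comparison such as the Wilcoxon signed-rank test (as used here for the questionnaire ratings) yields the reported between-model significance.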
 

Conclusions:

Together, these results show that the difficulty of avatar control does not depend solely on EEG quality. Furthermore, the significant differences in ERD durability and SoO suggest that synchronization between the MI task and the avatar's movement enhances intuitiveness, improving the operability of eye-gaze control.

Motor Behavior:

Brain Machine Interface 1
Motor Planning and Execution 2

Keywords:

Electroencephalography (EEG)
Motor
Vision
Other - Brain-Machine Interface; Brain-Computer Interface

1|2 indicates the priority used for review

References:

1. Wolpaw, J.R. (2000), ‘Brain-computer interface technology: a review of the first international meeting’, IEEE Transactions on Rehabilitation Engineering, vol. 8, no. 2, pp. 164–173.
2. Yuan, H. (2010), ‘Negative covariation between task-related responses in alpha/beta-band activity and BOLD in human sensorimotor cortex: An EEG and fMRI study of motor imagery and movements’, NeuroImage, vol. 49, no. 3, pp. 2596–2606.
3. Tsakiris, M. (2007), ‘On agency and body-ownership: Phenomenological and neurocognitive reflections’, Consciousness and Cognition, vol. 16, no. 3, pp. 645–660.
4. David, N. (2008), ‘The “sense of agency” and its underlying cognitive and neural mechanisms’, Consciousness and Cognition, vol. 17, no. 2, pp. 523–534.
5. Botvinick, M. (1998), ‘Rubber hands ‘feel’ touch that eyes see’, Nature, vol. 391, p. 756.