In IEEE Journal of Biomedical and Health Informatics

While many voluntary movements involve bimanual coordination, few attempts have been made to simultaneously decode the trajectories of bimanual movements from electroencephalogram (EEG) signals. In this study, we proposed a novel bimanual brain-computer interface (BCI) paradigm to reconstruct the continuous trajectories of both hands during coordinated movements from EEG. The protocol required human subjects to complete a bimanual reaching task toward the left, middle, or right target while EEG data were collected. A multi-task deep learning model combining EEGNet and a long short-term memory (LSTM) network was proposed to decode bimanual trajectories, including both position and velocity. Decoding performance was evaluated in terms of the correlation coefficient (CC) and normalized root mean square error (NRMSE) between decoded and real trajectories. Experimental results from 13 human subjects showed that the grand-averaged combined CC values reached 0.54 and 0.42 for position and velocity decoding, respectively. The corresponding combined NRMSE values were 0.22 and 0.23. Both CC and NRMSE were significantly better than chance level (p<0.05). Comparative experiments also indicated that the proposed model significantly outperformed several commonly used methods in terms of CC and NRMSE for continuous trajectory decoding. These findings demonstrate the feasibility of simultaneously decoding bimanual trajectories from EEG, indicating the potential of bimanual BCI control for coordinated tasks.
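As a minimal sketch of how the two reported evaluation metrics can be computed, the snippet below evaluates a decoded 1-D trajectory against the real one using Pearson's correlation coefficient (CC) and an NRMSE. Note the paper does not specify the NRMSE normalization; here it is assumed to be the range of the real trajectory, and the toy trajectories are illustrative only, not the paper's data.

```python
import numpy as np

def correlation_coefficient(decoded, real):
    # Pearson correlation between decoded and real 1-D trajectories
    return np.corrcoef(decoded, real)[0, 1]

def nrmse(decoded, real):
    # RMSE normalized by the range of the real trajectory
    # (assumed normalization; the paper does not state which variant is used)
    rmse = np.sqrt(np.mean((decoded - real) ** 2))
    return rmse / (real.max() - real.min())

# Toy example: a smooth "real" hand trajectory and a noisy decoded estimate
t = np.linspace(0.0, 1.0, 200)
real = np.sin(2 * np.pi * t)
decoded = real + 0.1 * np.random.default_rng(0).normal(size=t.size)

cc = correlation_coefficient(decoded, real)
err = nrmse(decoded, real)
```

In a bimanual setting such as this paper's, these metrics would be computed per hand and per output (position and velocity) and then averaged, which is presumably what the "combined" CC and NRMSE values refer to.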

Chen Yi-Feng, Fu Ruiqi, Wu Junde, Song Jongbin, Ma Rui, Jiang Yi-Chuan, Zhang Mingming

2022-Nov-24