%0 Generic %A Ge, Sheng %A Wang, Peng %A Liu, Hui %A Lin, Pan %A Gao, Junfeng %A Wang, Ruimin %A Iramina, Keiji %A Zhang, Quan %A Zheng, Wenming %D 2019 %T Data_Sheet_1_Neural Activity and Decoding of Action Observation Using Combined EEG and fNIRS Measurement.pdf %U https://frontiersin.figshare.com/articles/dataset/Data_Sheet_1_Neural_Activity_and_Decoding_of_Action_Observation_Using_Combined_EEG_and_fNIRS_Measurement_pdf/9979421 %R 10.3389/fnhum.2019.00357.s001 %2 https://frontiersin.figshare.com/ndownloader/files/18003125 %K action observation %K mirror neuron system %K theory of mind %K complex brain network %K EEG %K fNIRS %X In a social world, observing the actions of others is fundamental to understanding what they are doing, as well as their intentions and feelings. Studies of the neural basis and decoding of action observation are important for understanding action-related processes and have implications for cognitive neuroscience, social neuroscience, and human-machine interaction (HMI). In the current study, we first investigated temporal-spatial dynamics during action observation using a combined 64-channel electroencephalography (EEG) and 48-channel functional near-infrared spectroscopy (fNIRS) system. We measured brain activation while 16 healthy participants observed three action tasks: (1) grasping a cup with the intention of drinking; (2) grasping a cup with the intention of moving it; and (3) touching a cup with an unclear intention. The EEG and fNIRS source analysis results revealed the dynamic involvement of both the mirror neuron system (MNS) and the theory of mind (ToM)/mentalizing network during action observation. The source analysis results suggested that the extent to which these two systems were engaged was determined by the clarity of the intention of the observed action. Based on the differences in neural activity observed among the action-observation tasks in the first experiment, we conducted a second experiment to classify the neural processes underlying action observation using a feature classification method. We constructed complex brain networks based on the EEG and fNIRS data. Fusing features from both the EEG and fNIRS complex brain networks resulted in a classification accuracy of 72.7% for the three action-observation tasks. This study provides a theoretical and empirical basis for elucidating the neural mechanisms of action observation and intention understanding, and a feasible method for decoding the underlying neural processes.

%I Frontiers