In recent years, with the continuous development of artificial intelligence and brain-computer interfaces, emotion recognition based on electroencephalogram (EEG) signals has become a flourishing research direction. Motivated by saliency in brain cognition, we construct a new spatio-temporal convolutional attention network for emotion recognition, named BiTCAN. First, the raw EEG signals are baseline-corrected, and a two-dimensional mapping matrix sequence is constructed from them according to the electrode positions. Second, from this matrix sequence, saliency features of brain cognition are extracted by a bi-hemisphere discrepancy module, and spatio-temporal features of the EEG signals are captured by a 3-D convolution module. Finally, the saliency and spatio-temporal features are fused in an attention module to further capture the internal spatial relationships between brain regions, and the fused features are fed into a classifier for emotion recognition. Extensive experiments on two public datasets, DEAP and SEED, show that the proposed algorithm achieves accuracies above 97% on both, outperforming most existing emotion recognition algorithms.
Citation: Yanling An, Shaohai Hu, Shuaiqi Liu, Bing Li. BiTCAN: An emotion recognition network based on saliency in brain cognition[J]. Mathematical Biosciences and Engineering, 2023, 20(12): 21537-21562. doi: 10.3934/mbe.2023953
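As a rough illustration of the preprocessing described in the abstract, the NumPy sketch below removes a pre-stimulus baseline per channel and scatters each time step's channel values onto a 2-D scalp grid, yielding the kind of two-dimensional mapping matrix sequence the method builds on. The 9×9 grid, the electrode coordinates, and the helper names (`remove_baseline`, `to_2d_sequence`) are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch (NumPy only): baseline removal, then mapping each EEG
# channel onto a 2-D grid mirroring electrode positions on the scalp.
# The 9x9 grid and the coordinates below are illustrative assumptions.
import numpy as np

# Hypothetical scalp coordinates (row, col) for a few 10-20 electrodes.
ELECTRODE_POS = {"Fp1": (0, 3), "Fp2": (0, 5), "F3": (2, 2), "F4": (2, 6),
                 "C3": (4, 2), "Cz": (4, 4), "C4": (4, 6),
                 "P3": (6, 2), "P4": (6, 6), "O1": (8, 3), "O2": (8, 5)}

def remove_baseline(trial, baseline):
    """Subtract the per-channel mean of a pre-stimulus baseline segment.

    trial:    (channels, samples) stimulus-period EEG
    baseline: (channels, samples) pre-stimulus EEG
    """
    return trial - baseline.mean(axis=1, keepdims=True)

def to_2d_sequence(trial, channel_names, grid=(9, 9)):
    """Scatter each time step's channel values onto a 2-D scalp grid,
    producing a (samples, height, width) matrix sequence."""
    frames = np.zeros((trial.shape[1], *grid), dtype=trial.dtype)
    for ch, name in enumerate(channel_names):
        r, c = ELECTRODE_POS[name]
        frames[:, r, c] = trial[ch]
    return frames

# Usage with synthetic data: 11 channels, 128 Hz, 3 s baseline, 60 s trial.
names = list(ELECTRODE_POS)
rng = np.random.default_rng(0)
baseline = rng.standard_normal((len(names), 3 * 128))
trial = rng.standard_normal((len(names), 60 * 128))
frames = to_2d_sequence(remove_baseline(trial, baseline), names)
print(frames.shape)  # (7680, 9, 9)
```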
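The bi-hemisphere discrepancy module exploits the asymmetry of emotional processing between the two hemispheres. The abstract does not spell the operation out, but a common realization in the bi-hemisphere discrepancy literature is a pairwise subtraction between mirrored left/right positions; the hypothetical `hemisphere_discrepancy` helper below, operating on the frames from the previous snippet, illustrates only that principle, not the paper's exact module.

```python
# Hedged sketch of a bi-hemisphere discrepancy operation on 2-D scalp frames.
# Subtracting each frame from its left-right mirror cancels symmetric
# activity, so hemispheric differences (the "saliency" cue) stand out.
import numpy as np

def hemisphere_discrepancy(frames):
    """frames: (samples, height, width) scalp-grid sequence.
    Returns the difference between each frame and its mirror about the
    sagittal (left-right) axis; midline electrodes map to zero."""
    return frames - frames[:, :, ::-1]
```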