Deep recurrent–convolutional neural network for classification of simultaneous EEG–fNIRS signals

A brain–computer interface (BCI) is a powerful system for communication between the brain and the outside world. Traditional BCI systems operate on electroencephalogram (EEG) signals alone. Recently, researchers have combined EEG with other signal modalities to improve BCI performance; among these combinations, EEG with functional near-infrared spectroscopy (fNIRS) has achieved favourable results. In most studies, however, EEG or fNIRS recordings are treated as chain-like sequences, ignoring the complex correlations between adjacent signals in both time and channel location. In this study, a deep neural network model is introduced that decodes the intended tasks of the human brain by exploiting both temporal and spatial features. The proposed model incorporates the spatial relationship between EEG and fNIRS signals by transforming these chain-like signal sequences into hierarchical rank-3 tensors. Experiments show that the proposed model achieves a precision of 99.6%.
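The core preprocessing idea described above — turning a chain-like list of per-channel signals into a rank-3 tensor that preserves the electrodes' 2-D scalp layout over time — can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the channel names and the 3×3 grid layout are hypothetical assumptions, and a real montage (and the fNIRS optode layout) would differ.

```python
import numpy as np

# Hypothetical 3x3 scalp grid; a real EEG/fNIRS montage would be larger
# and sparser, with empty grid cells where no sensor sits.
GRID = [
    ["F3", "Fz", "F4"],
    ["C3", "Cz", "C4"],
    ["P3", "Pz", "P4"],
]

def to_spatial_tensor(chains):
    """Map chain-like per-channel time series (dict: name -> array of
    length T) onto a rank-3 tensor of shape (T, rows, cols), so that
    spatially adjacent channels end up in adjacent tensor cells."""
    T = len(next(iter(chains.values())))
    tensor = np.zeros((T, len(GRID), len(GRID[0])))
    for r, row in enumerate(GRID):
        for c, name in enumerate(row):
            tensor[:, r, c] = chains[name]
    return tensor

# Example: 100 time steps of synthetic data per channel.
rng = np.random.default_rng(0)
chains = {name: rng.standard_normal(100) for row in GRID for name in row}
X = to_spatial_tensor(chains)
print(X.shape)  # (100, 3, 3)
```

Each time slice `X[t]` is then a small 2-D "image" suitable for convolutional layers, while the time axis can feed a recurrent stage — the recurrent–convolutional pairing the title refers to.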

Inspec keywords: brain-computer interfaces; electroencephalography; infrared spectroscopy; recurrent neural nets; signal classification; medical signal processing; convolutional neural nets

Other keywords: human brain; EEG signals; near-infrared spectroscopy; simultaneous EEG–fNIRS signal classification; complex correlations; deep recurrent-convolutional neural network; deep neural network model; traditional BCI systems; spatial features; adjacent signals; temporal features; brain–computer interface

Subjects: Neural computing techniques; User interfaces; Signal processing and detection; Electrical activity in neurophysiological processes; Bioelectric signals; Biology and medical computing; Digital signal processing; Electrodiagnostics and other electrical measurement techniques

http://iet.metastore.ingenta.com/content/journals/10.1049/iet-spr.2019.0297