
Classification of emotions from EEG signals using time-order representation based on the S-transform and convolutional neural network

Emotions are a powerful information source for studying the cognition, behaviour, and medical condition of a person. Accurate identification of emotions supports the development of affective computing, brain–computer interfaces, medical diagnosis systems, and related applications. Electroencephalogram (EEG) signals are one such source for capturing and studying human emotions. In this Letter, a novel time-order representation (TOR) based on the S-transform combined with a convolutional neural network (CNN) is proposed for the identification of human emotions. EEG signals are first transformed into a TOR based on the S-transform; this TOR is then given as input to a CNN, which automatically extracts deep features and classifies them. The emotional states of happiness, fear, sadness, and relaxation are classified with an accuracy of 94.58%. The effectiveness of the method is assessed by evaluating four performance parameters and comparing it with existing state-of-the-art methods on the same dataset.
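The S-transform at the core of the proposed representation is a time-frequency transform with a frequency-dependent Gaussian window (Stockwell et al., 1996). The Letter does not give implementation details, so the following is only a minimal sketch of the standard frequency-domain algorithm for the discrete S-transform; the function name and the test signal are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def stockwell_transform(x):
    """Discrete S-transform of a real 1-D signal via the
    frequency-domain algorithm (Stockwell et al., 1996).

    Returns an (N//2, N) complex matrix: row n is the signal's
    content around frequency index n, column j is time sample j.
    """
    x = np.asarray(x, dtype=float)
    N = len(x)
    X = np.fft.fft(x)
    # Duplicate the spectrum so the shifted slice X[n : n + N]
    # wraps around correctly for every frequency index n.
    X2 = np.concatenate([X, X])

    S = np.zeros((N // 2, N), dtype=complex)
    # Zero-frequency row: by convention, the signal mean.
    S[0, :] = np.mean(x)

    m = np.arange(N)
    for n in range(1, N // 2):
        # Gaussian localizing window in the frequency domain,
        # width proportional to frequency; the second term handles
        # circular wrap-around of the window.
        G = (np.exp(-2.0 * np.pi**2 * m**2 / n**2)
             + np.exp(-2.0 * np.pi**2 * (m - N)**2 / n**2))
        S[n, :] = np.fft.ifft(X2[n:n + N] * G)
    return S

# Illustrative check: for a pure cosine at frequency index 8,
# the row with the most energy should be row 8.
N = 128
t = np.arange(N)
x = np.cos(2 * np.pi * 8 * t / N)
S = stockwell_transform(x)
row_energy = np.abs(S).sum(axis=1)
```

The magnitude of the resulting matrix is the time-frequency image from which a TOR can be built and fed to a CNN; how the Letter orders and normalises these coefficients is specific to the proposed method and is not reproduced here.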



