Feedforward artificial neural networks for event-related potential detection

In: Signal Processing and Machine Learning for Brain-Machine Interfaces

The detection of brain responses, such as event-related potentials (ERPs), at the single-trial level in the electroencephalogram (EEG) is a difficult problem: the input signal is noisy, the brain responses can vary over time, and several processing steps are required to extract relevant discriminant features. In brain-computer interfaces, single-trial detection is primarily applied to detect the presence of large ERP components such as the P300. Because the characteristics of the P300 depend on the parameters of the oddball paradigm and the type of stimuli, and because they vary across subjects and over time during experiments, a reliable classifier must take this variability into account. While most signal-processing and classification techniques for the detection of brain responses are based on linear algebra, pattern recognition techniques such as the convolutional neural network (CNN), a type of deep learning, have attracted interest because they can process the signal with limited preprocessing. In this chapter, we investigate the performance of different feedforward neural networks in relation to their architecture and to how they are evaluated: a single system for each subject or one system for all subjects. In particular, we address the change in performance observed between tailoring a neural network to a single subject and training a neural network on a group of subjects, taking advantage of a larger number of trials from different subjects. The results support the conclusion that a CNN trained on different subjects can reach an AUC above 0.9 with an appropriate architecture that uses spatial filtering and shift-invariant layers.
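The pipeline the abstract describes (spatial filtering, then shift-invariant temporal processing, then a classification output) can be sketched as a single forward pass in NumPy. This is a minimal illustration of the data flow only, not the chapter's actual architecture: the layer sizes, the random weights, and the use of max pooling for shift invariance are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for a single EEG trial (assumed for illustration)
n_channels, n_samples = 64, 128          # electrodes x time points
n_spatial, kernel_len, pool = 8, 16, 4   # spatial filters, temporal kernel, pool width

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cnn_forward(trial, W_spatial, W_temporal, w_out, b_out):
    """One forward pass of a P300-style detector:
    spatial filtering -> temporal convolution -> max pooling -> logistic unit."""
    # 1. Spatial filtering: linearly mix the electrodes into virtual channels.
    virtual = W_spatial @ trial                              # (n_spatial, n_samples)
    # 2. Temporal convolution: one shared kernel per virtual channel,
    #    applied at every time step (this is what makes the layer shift invariant).
    conv = np.stack([np.convolve(v, k, mode="valid")
                     for v, k in zip(virtual, W_temporal)])  # (n_spatial, n_samples-kernel_len+1)
    # 3. Max pooling over time: tolerates latency jitter of the ERP component.
    t = conv.shape[1] - conv.shape[1] % pool
    pooled = conv[:, :t].reshape(len(conv), -1, pool).max(axis=2)
    # 4. Logistic output: probability that the trial contains the target ERP.
    return sigmoid(pooled.ravel() @ w_out + b_out)

# Random weights: this demonstrates shapes and data flow, not a trained model.
W_spatial = rng.standard_normal((n_spatial, n_channels)) * 0.1
W_temporal = rng.standard_normal((n_spatial, kernel_len)) * 0.1
pooled_len = (n_samples - kernel_len + 1) // pool
w_out = rng.standard_normal(n_spatial * pooled_len) * 0.1
b_out = 0.0

trial = rng.standard_normal((n_channels, n_samples))  # stand-in for one EEG epoch
p = cnn_forward(trial, W_spatial, W_temporal, w_out, b_out)
```

In a per-subject evaluation the weights would be fit on that subject's trials only; in the cross-subject setting the same network is trained on pooled trials from the whole group, which is the comparison the chapter examines.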

Chapter Contents:

  • Abstract
  • 9.1 Introduction
  • 9.2 Event-related potentials
  • 9.3 Feedforward neural networks
  • 9.3.1 Activation functions
  • 9.3.2 Error evaluation
  • 9.3.3 Architectures
  • 9.4 Methods
  • 9.5 Experimental protocol
  • 9.5.1 Conv nets
  • 9.5.2 Performance evaluation
  • 9.6 Results
  • 9.7 Discussion
  • 9.8 Conclusion
  • References

Inspec keywords: feature extraction; spatial filters; convolution; medical signal processing; learning (artificial intelligence); signal classification; brain-computer interfaces; bioelectric potentials; linear algebra; feedforward neural nets; electroencephalography

Other keywords: oddball paradigm; spatial filtering; shift invariant layers; event-related potential detection; deep learning technique; convolutional neural network; EEG; feature extraction; CNN; electroencephalogram; AUC; linear algebra; single-trial detection; signal classification techniques; ERP components; brain-computer interface; brain responses detection; pattern recognition techniques; feedforward artificial neural networks

Subjects: Biology and medical computing; Bioelectric signals; Neural computing techniques; Knowledge engineering techniques; Algebra; Digital signal processing; Signal processing and detection

