Accelerated training of backpropagation networks by using adaptive momentum step


Considerable research into the training of neural networks by the backpropagation technique has been undertaken in recent years. The introduction of a momentum term into the training equation can accelerate the training process. In this Letter, a new momentum step and a scheme for dynamically selecting the momentum rate are described. These are shown to give improved acceleration of training and strong global convergence characteristics. Results are presented for four benchmark training tasks.
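To make the general idea concrete, the sketch below shows gradient descent with a momentum term whose rate is adjusted as training proceeds. The adaptation rule used here (raising the momentum rate while the training error keeps falling and cancelling the momentum when it rises) and all function and parameter names are illustrative assumptions, not the specific scheme proposed in the Letter.

# Minimal sketch of momentum-based training with a heuristically adapted
# momentum rate. grad_fn(w) and loss_fn(w) are assumed to compute the error
# gradient and the training error for weights w over a fixed training set.
import numpy as np

def train_with_adaptive_momentum(grad_fn, loss_fn, w, lr=0.1,
                                 mu=0.5, mu_max=0.95, mu_step=0.05,
                                 epochs=1000):
    velocity = np.zeros_like(w)
    prev_loss = loss_fn(w)
    for _ in range(epochs):
        # standard momentum step: blend previous update with current gradient
        velocity = mu * velocity - lr * grad_fn(w)
        w = w + velocity
        loss = loss_fn(w)
        if loss < prev_loss:
            # error is falling: cautiously increase the momentum rate
            mu = min(mu + mu_step, mu_max)
        else:
            # error rose: cancel the momentum contribution and restart it
            mu = 0.0
            velocity = np.zeros_like(w)
        prev_loss = loss
    return w

In practice grad_fn and loss_fn would be closures over the network and training set; the heuristic above merely illustrates how a dynamically selected momentum rate can accelerate descent while guarding against divergence.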
