End-to-end training algorithm for conceptor-based neural networks



Since a conceptor, which achieves direction-selective damping of high-dimensional network signals, usually takes the form of a projection matrix and is derived analytically, a conceptor-based neural network has been thought to be untrainable end to end with backpropagation and gradient-descent algorithms. This limits the application of conceptors. To address this issue, an algorithm is proposed to train conceptor-based neural networks end to end with gradient descent. To the best of the authors' knowledge, this is the first such end-to-end training algorithm. To develop it, a softmax-like loss function involving conceptors is constructed empirically. From this loss function, the corresponding gradients are derived using the backpropagation method, making it possible to train conceptor neural networks end to end with a gradient-descent algorithm. Several experiments demonstrate the feasibility and effectiveness of the proposed training algorithm.
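To make the idea concrete, here is a minimal PyTorch sketch. The conceptor function uses Jaeger's standard analytic formula C = R(R + alpha^-2 I)^-1, where R is the correlation matrix of network states and alpha the aperture; ConceptorSoftmaxLoss is one plausible reading of the letter's "softmax-like loss function involving conceptors" (class logits from conceptor-retained signal energy, followed by cross-entropy), not the authors' exact formulation, and all names and dimensions are illustrative. Because every operation in the loss is a differentiable matrix product, gradients flow through the scores into the feature extractor, which is the property that enables end-to-end training.

    import torch
    import torch.nn as nn

    def conceptor(states, aperture=10.0):
        # Analytic conceptor C = R (R + aperture^-2 I)^-1, with R the
        # correlation matrix of the states (Jaeger's formulation).
        n, d = states.shape
        R = states.T @ states / n
        return R @ torch.linalg.inv(R + aperture ** -2 * torch.eye(d))

    class ConceptorSoftmaxLoss(nn.Module):
        # Hypothetical "softmax-like" loss: score a feature vector z
        # against each class conceptor C_k by the retained energy
        # ||C_k z||^2 / ||z||^2, then apply cross-entropy to the scores.
        def __init__(self, conceptors):
            super().__init__()
            self.C = torch.stack(conceptors)  # (num_classes, d, d)

        def forward(self, z, labels):
            proj = torch.einsum('kij,bj->bki', self.C, z)  # C_k z per class
            scores = proj.pow(2).sum(-1) / z.pow(2).sum(-1, keepdim=True)
            return nn.functional.cross_entropy(scores, labels)

    # Toy usage: the conceptors are computed analytically from (here random,
    # in practice per-class) features, but the loss itself is built from
    # differentiable operations, so loss.backward() reaches the weights of
    # the feature extractor.
    net = nn.Sequential(nn.Linear(32, 16), nn.Tanh())
    class_feats = [net(torch.randn(64, 32)).detach() for _ in range(3)]
    loss_fn = ConceptorSoftmaxLoss([conceptor(f) for f in class_feats])
    x, y = torch.randn(8, 32), torch.randint(0, 3, (8,))
    loss = loss_fn(net(x), y)
    loss.backward()  # a gradient-descent step on net would follow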
