%0 Electronic Article
%A Guangwu Qian
%A Lei Zhang
%A Qianjun Zhang
%K end-to-end training algorithm
%K softmax-like loss function
%K conceptor-based neural networks
%K high-dimensional network signals
%K gradient-descent algorithm
%K direction-selective damping
%K recurrent neural networks
%K projection matrix
%K backpropagation algorithm
%X Because a conceptor, which achieves direction-selective damping of high-dimensional network signals, usually takes the form of a projection matrix and is derived analytically, conceptor-based neural networks have been thought untrainable end to end with backpropagation and gradient-descent algorithms. This limits the application of conceptors. To address this issue, an algorithm is proposed to train conceptor-based neural networks end to end with gradient descent. To the best of the authors' knowledge, this is the first such end-to-end training algorithm. To develop it, a softmax-like loss function involving conceptors is constructed empirically. Based on this loss function, the corresponding gradients are derived using the backpropagation method, making it possible to train conceptor-based neural networks end to end with a gradient-descent algorithm. Several experiments demonstrate the feasibility and effectiveness of the proposed training algorithm.
%@ 0013-5194
%T End-to-end training algorithm for conceptor-based neural networks
%B Electronics Letters
%D July 2018
%V 54
%N 15
%P 924-926
%I Institution of Engineering and Technology
%U https://digital-library.theiet.org/content/journals/10.1049/el.2018.0033
%G EN