A1 Guangwu Qian

A1 Lei Zhang

A1 Qianjun Zhang

PB IET

T1 End-to-end training algorithm for conceptor-based neural networks

JN Electronics Letters

VO 54

IS 15

SP 924

OP 926

AB Since a conceptor, achieving direction-selective damping of high-dimensional network signals, usually takes the form of a projection matrix and is deduced analytically, a conceptor-based neural network is thought to be untrainable with backpropagation and gradient-descent algorithms from end to end. This limits the application of conceptors. To address this issue, an algorithm is proposed to train conceptor-based neural networks from end to end with gradient-descent algorithms. To the best of the authors' knowledge, this is the first such end-to-end training algorithm. To develop this algorithm, a softmax-like loss function involving conceptors is constructed empirically. Based on this loss function, the corresponding gradients are deduced using the backpropagation method, making it possible to train conceptor neural networks from end to end with a gradient-descent algorithm. Several experiments are conducted to show the feasibility and effectiveness of the proposed training algorithm.

K1 end-to-end training algorithm

K1 softmax-like loss function

K1 conceptor-based neural networks

K1 high-dimensional network signals

K1 gradient-descent algorithm

K1 direction-selective damping

K1 recurrent neural networks

K1 projection matrix

K1 backpropagation algorithm

DO https://doi.org/10.1049/el.2018.0033

UL https://digital-library.theiet.org/content/journals/10.1049/el.2018.0033

LA English

SN 0013-5194

YR 2018

OL EN