Learning to forget: continual prediction with LSTM
- Author(s): F.A. Gers ; J. Schmidhuber ; F. Cummins
- DOI: 10.1049/cp:19991218
- Source: 9th International Conference on Artificial Neural Networks: ICANN '99, 1999, pp. 850–855
- Conference: 9th International Conference on Artificial Neural Networks: ICANN '99
- DOI: 10.1049/cp:19991218
- ISBN: 0 85296 721 7
- Location: Edinburgh, UK
- Conference date: 7-10 Sept. 1999
- Format: PDF
Long short-term memory (LSTM) can solve many tasks not solvable by previous learning algorithms for recurrent neural networks (RNNs). We identify a weakness of LSTM networks processing continual input streams without explicitly marked sequence ends. Without resets, the internal state values may grow indefinitely and eventually cause the network to break down. Our remedy is an adaptive “forget gate” that enables an LSTM cell to learn to reset itself at appropriate times, thus releasing internal resources. We review an illustrative benchmark problem on which standard LSTM outperforms other RNN algorithms. All algorithms (including LSTM) fail to solve a continual version of that problem. LSTM with forget gates, however, easily solves it in an elegant way.
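To make the mechanism concrete, the cell-state update with a forget gate can be sketched as below. This is a minimal illustration in modern LSTM notation rather than the paper's original formulation (the paper writes the state as s_c(t) = y_φ(t)·s_c(t−1) + ...); all variable names are illustrative. The key point matches the abstract: when the learned forget activation drops below 1, the old state decays, so the network can reset itself and the internal state cannot grow without bound.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step with a forget gate (illustrative modern notation).

    W: (4*hidden, input), U: (4*hidden, hidden), b: (4*hidden,)
    hold stacked parameters for the forget (f), input (i),
    candidate (g), and output (o) transforms.
    """
    z = W @ x + U @ h_prev + b      # stacked pre-activations
    f, i, g, o = np.split(z, 4)
    f = sigmoid(f)                  # forget gate: ~0 resets, ~1 retains
    i = sigmoid(i)
    o = sigmoid(o)
    g = np.tanh(g)
    c = f * c_prev + i * g          # without f (i.e. f fixed at 1, as in
                                    # standard LSTM), c can drift upward
                                    # indefinitely on continual input
    h = o * np.tanh(c)
    return h, c
```

In standard LSTM the previous state carries over with an implicit weight of 1, which is what lets it grow indefinitely on unsegmented streams; making that weight an adaptive, input-dependent gate is the paper's remedy.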
Inspec keywords: learning (artificial intelligence); resource allocation; content-addressable storage; recurrent neural nets
Subjects: Neural nets (theory); Learning in AI (theory)