MOS circuit for nonlinear Hebbian learning

Electronics Letters

An analogue MOS circuit that implements a nonlinear Hebbian learning rule is presented. The circuit has two differential voltage inputs and yields an output current that approximates the product of a cubic function of one input voltage and a hyperbolic tangent function of the other. The circuit has been successfully incorporated into an analogue integrated-circuit implementation of the Herault-Jutten neuromorphic network.
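
For a behavioural feel of the rule described above, the short Python sketch below models the output current as the cube of one differential input multiplied by the hyperbolic tangent of the other, which is the form the abstract describes. It is not the authors' circuit equations: the constants K_CUBIC, V_SCALE and I_SCALE, the helper names, and the weight-update sign convention are all illustrative assumptions.

    import numpy as np

    # Behavioural sketch of the nonlinearity described in the abstract:
    # output current ~ (cubic of input 1) x (tanh of input 2).
    # All constants below are illustrative assumptions, not circuit values.
    K_CUBIC = 1.0      # gain of the cubic branch (assumed)
    V_SCALE = 0.05     # voltage scale of the tanh branch, in volts (assumed)
    I_SCALE = 1e-6     # output current scale, in amperes (assumed)

    def hebbian_nonlinearity(v1, v2):
        """Approximate output current for differential inputs v1, v2 (volts)."""
        return I_SCALE * K_CUBIC * v1**3 * np.tanh(v2 / V_SCALE)

    # In a Herault-Jutten style network, adaptation of a coefficient c_ij is
    # driven by such a product of nonlinear functions of two output signals;
    # the sign and learning rate here are placeholders.
    def update_weight(c_ij, y_i, y_j, eta=1e-3):
        return c_ij + eta * hebbian_nonlinearity(y_i, y_j)

Separating the update into a cubic term and a tanh term mirrors a common choice of odd nonlinear functions, f(y) = y^3 and g(y) = tanh(y), in this class of learning rules.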

Inspec keywords: MOS integrated circuits; linear integrated circuits; neural nets; learning systems

Other keywords: nonlinear Hebbian learning; analogue MOS circuit; differential inputs; Herault-Jutten neuromorphic network; cubic functions; hyperbolic tangent functions

Subjects: Neural nets (circuit implementations); Other MOS integrated circuits; Neural nets (theory); Neural net devices

References

1. C. Jutten and J. Herault, "Blind separation of sources, part I: an adaptive algorithm based on neuromimetic architecture", Signal Processing, pp. 1-10.
2. C. Mead, Analog VLSI and Neural Systems, 1989.
3. T. Delbrück, "'Bump' circuits for computing similarity and dissimilarity of analog voltages", Proc. Int. Joint Conf. on Neural Networks, Seattle, WA, 8th-12th July 1991, pp. 475-479.
4. F. Palmieri and J. Zhu, "Linear neural networks which minimize the output variance", Proc. Int. Joint Conf. on Neural Networks, Seattle, WA, 8th-12th July 1991, pp. 791-797.
5. P. Foldiak, "Adaptive network for optimal linear feature extraction", Proc. Int. Joint Conf. on Neural Networks, Washington, DC, 18th-22nd June 1989, pp. 401-405.
6. M.H. Cohen, "Analog VLSI implementation of an auto-adaptive synthetic neural network for real-time separation of independent signal sources", MSE Thesis, The Johns Hopkins University, Baltimore, MD, 1991.
7. S.J. Orfanidis, "Gram-Schmidt neural nets", Neural Computation, pp. 116-126.
8. Y. Tsividis and S. Satyanarayana, "Analogue circuits for variable-synapse electronic neural networks", Electron. Lett., pp. 1313-1314.
9. E. Vittoz and J. Fellrath, "CMOS analog integrated circuits based on weak inversion operation", IEEE J. Solid-State Circuits, pp. 224-231.
10. D.O. Hebb, The Organization of Behaviour, 1949.