Compact building blocks for artificial neural networks


A four-quadrant analogue multiplier, an activation-function circuit and a differentiator, intended for the implementation of backpropagation algorithms, are proposed. The design priorities are low power, low voltage and minimum silicon area. Breadboard results, showing good agreement with theory, are reported.
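The three proposed blocks map directly onto the operations backpropagation requires: multiplication (weighted sums and gradient products), the activation function itself, and its derivative. As a hedged numerical sketch only, the following shows where each block appears in one backpropagation step for a single neuron; the tanh activation and the learning rate are illustrative assumptions, not details taken from the letter.

```python
import math

def neuron_forward(w, x):
    # Multiplier block: forms the products w_i * x_i for the weighted sum
    s = sum(wi * xi for wi, xi in zip(w, x))
    # Activation-function block (tanh assumed here for illustration)
    return s, math.tanh(s)

def backprop_step(w, x, target, lr=0.1):
    s, y = neuron_forward(w, x)
    # Differentiator block: derivative of the activation, d(tanh s)/ds = 1 - tanh(s)^2
    dy_ds = 1.0 - y * y
    err = y - target
    # Weight update again relies on the multiplier block (err * dy_ds * x_i)
    return [wi - lr * err * dy_ds * xi for wi, xi in zip(w, x)]

w = [0.5, -0.3]          # illustrative initial weights
x = [1.0, 2.0]           # illustrative input
w = backprop_step(w, x, target=1.0)
```

One such step moves the neuron's output towards the target; in hardware, each arithmetic operation above would be realised by one of the proposed analogue blocks.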

