Neural networks and conditional association networks: common properties and differences

IEE Proceedings E (Computers and Digital Techniques)


First, it is shown how large networks can be represented in an illustrative way. The special aspects of neural networks are then discussed. When conditional association arrays and semantic memories are plotted as networks, surprising similarities emerge, but important characteristic differences can also be seen. First experiments suggest that substantial progress in text processing may be achieved faster with conditional association networks than with neural networks.
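One simple way to read "conditional association network" is as a graph whose weighted edges record how often one word follows another in a text corpus. The sketch below is an illustration of that idea only; the function name and construction are assumptions for demonstration, not the paper's actual method.

```python
# Hedged sketch: build a directed, weighted word-association graph from text.
# Edge weight (a, b) counts how often word b immediately follows word a,
# i.e. a crude conditional association between adjacent words.
from collections import defaultdict

def build_association_network(text):
    """Return a dict mapping (word_a, word_b) -> transition count."""
    words = text.lower().split()
    edges = defaultdict(int)
    for a, b in zip(words, words[1:]):
        edges[(a, b)] += 1
    return dict(edges)

network = build_association_network("the cat sat on the mat the cat ran")
print(network[("the", "cat")])  # -> 2: "cat" followed "the" twice
```

Unlike a neural network, such a structure stores associations explicitly and addressably rather than distributing them across trained weights, which is one way the contrast drawn in the abstract can be made concrete.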

Inspec keywords: parallel processing; content-addressable storage; neural nets

Other keywords: neural networks; conditional association networks; common properties; semantic memories; text processing

Subjects: Memory circuits; Systems theory applications in biology and medicine; Other digital storage; Artificial intelligence (theory)

http://iet.metastore.ingenta.com/content/journals/10.1049/ip-e.1989.0046