Prikladnaya Diskretnaya Matematika
Prikladnaya Diskretnaya Matematika, 2017, Number 37, Pages 107–113
DOI: https://doi.org/10.17223/20710410/37/9
(Mi pdm596)
 

This article is cited in 1 scientific paper (total in 1 paper)

Mathematical Foundations of Informatics and Programming

Reduction of synapses in the Hopfield autoassociative memory

M. S. Tarkov

Rzhanov Institute of Semiconductor Physics SB RAS, Novosibirsk, Russia
Abstract: The auto-associative Hopfield network is a set of neurons in which the output of each neuron is fed to the input of every other neuron, i.e., the graph of inter-neuronal connections is complete. The large number of connections is one of the obstacles to hardware implementation of Hopfield networks. A solution is the reduction (exclusion) of insignificant connections. In this paper, drawing on an analogy with oscillator networks, the effect of reducing the number of connections on the behavior of the auto-associative Hopfield network is investigated. It is shown that excluding the connections whose weights are strictly smaller in absolute value than the maximum for a given neuron substantially improves the operation quality of a Hopfield network trained by Hebb's rule: as the dimension of the stored vectors grows, chimeras disappear and the permissible noise level of the input data increases, while the number of network connections is reduced by a factor of 13–17. In contrast, reducing the connections of a Hopfield network trained by the projection method worsens its performance: the network output is distorted even when the reference vectors themselves are presented. As the dimension of the stored vectors grows, the permissible noise level of the reduced Hopfield–Hebb networks approaches that of the Hopfield projection networks. Thus, given their much smaller number of connections, reduced Hopfield–Hebb networks can successfully compete with Hopfield projection networks for sufficiently large stored-vector dimensions.
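The reduction rule described in the abstract can be illustrated with a minimal Python/NumPy sketch. This is not the author's code: the function names and the synchronous-update recall loop are illustrative assumptions; the sketch only shows Hebbian training of a Hopfield network and the pruning of every weight whose absolute value is strictly below the row (per-neuron) maximum.

```python
import numpy as np

def hebb_train(patterns):
    # Hebb's rule: W = (1/n) * sum of outer products of the stored +/-1
    # patterns; the diagonal (self-connections) is zeroed.
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def reduce_synapses(W):
    # For each neuron (row), keep only the weights whose absolute value
    # equals the row maximum; all strictly smaller weights are excluded.
    row_max = np.abs(W).max(axis=1, keepdims=True)
    return np.where(np.abs(W) >= row_max, W, 0.0)

def recall(W, x, iters=50):
    # Synchronous recall: iterate x <- sign(W x) until a fixed point.
    for _ in range(iters):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1  # break ties toward +1
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

# Example with two orthogonal 8-dimensional patterns.
p1 = np.array([1, 1, 1, 1, -1, -1, -1, -1], dtype=float)
p2 = np.array([1, 1, -1, -1, 1, 1, -1, -1], dtype=float)
W = hebb_train(np.vstack([p1, p2]))
Wr = reduce_synapses(W)
noisy = p1.copy()
noisy[0] = -noisy[0]          # flip one bit
restored = recall(Wr, noisy)  # the reduced network corrects the bit
```

At the small dimension of this toy example the reduction removes few weights; the 13–17-fold reduction reported in the paper refers to stored vectors of much higher dimension.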
Keywords: auto-associative Hopfield memory, Hebb's rule, projection method, oscillatory network, reduction of connections.
Document Type: Article
UDC: 004.8
Language: Russian
Citation: M. S. Tarkov, “Reduction of synapses in the Hopfield autoassociative memory”, Prikl. Diskr. Mat., 2017, no. 37, 107–113
Citation in format AMSBIB
\Bibitem{Tar17}
\by M.~S.~Tarkov
\paper Reduction of synapses in the Hopfield autoassociative memory
\jour Prikl. Diskr. Mat.
\yr 2017
\issue 37
\pages 107--113
\mathnet{http://mi.mathnet.ru/pdm596}
\crossref{https://doi.org/10.17223/20710410/37/9}
Linking options:
  • https://www.mathnet.ru/eng/pdm596
  • https://www.mathnet.ru/eng/pdm/y2017/i3/p107
     