Informatsionnye Tekhnologii i Vychislitel'nye Sistemy
Informatsionnye Tekhnologii i Vychislitel'nye Sistemy, 2023, Issue 3, Pages 46–54
DOI: https://doi.org/10.14357/20718632230305
(Mi itvs820)
 

INTELLIGENT SYSTEMS AND TECHNOLOGIES

Layer-wise knowledge distillation for simplified bipolar morphological neural networks

M. V. Zingerenko^{a,b}, E. E. Limonova^{b,c}

a Moscow Institute of Physics and Technology (National Research University), Dolgoprudny, Moscow Region, Russia
b Smart Engines Service LLC, Moscow, Russia
c Federal Research Center "Computer Science and Control" of Russian Academy of Sciences, Moscow, Russia
Abstract: Various neuron approximations can be used to reduce the computational complexity of neural networks. One such approximation, based on summation and maximum operations, is the bipolar morphological neuron. This paper presents an improved structure of the bipolar morphological neuron that enhances its computational efficiency, together with a new training approach based on continuous approximations of the maximum and on knowledge distillation. Experiments were conducted on the MNIST dataset with a LeNet-like neural network architecture and on the CIFAR10 dataset with a ResNet-22 model architecture. The proposed training method achieves 99.45% classification accuracy with the LeNet-like model, matching the accuracy of the classical network, and 86.69% accuracy with the ResNet-22 model, compared to 86.43% for the classical model. The results show that the proposed method, with a log-sum-exp (LSE) approximation of the maximum and layer-by-layer knowledge distillation, yields a simplified bipolar morphological network that is not inferior to classical networks.
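To make the construction concrete, here is a minimal PyTorch sketch of a bipolar morphological layer with the log-sum-exp (LSE) approximation of the maximum. It assumes the standard BM formulation from earlier work on bipolar morphological networks (products replaced by sums in the logarithmic domain, the sum over inputs replaced by a maximum taken over four sign branches); the names BMLinear and lse_max and the values of alpha and eps are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


def lse_max(a, alpha, dim=-1):
    # Smooth LSE approximation of the maximum:
    # (1/alpha) * log(sum_k exp(alpha * a_k)); tends to max(a) as alpha -> inf.
    return torch.logsumexp(alpha * a, dim=dim) / alpha


class BMLinear(nn.Module):
    # Hypothetical BM analogue of a fully connected layer: multiplication is
    # replaced by addition in the log domain, and summation over inputs by a
    # (smoothed) maximum; four branches cover the sign combinations of
    # inputs and weights.
    def __init__(self, in_features, out_features, alpha=10.0, eps=1e-6):
        super().__init__()
        # Log-magnitude weights, one set per sign branch.
        self.v = nn.Parameter(torch.randn(4, out_features, in_features))
        self.alpha = alpha
        self.eps = eps

    def forward(self, x):
        # Log-magnitudes of the positive and negative parts of the input.
        xp = torch.log(torch.clamp(x, min=0) + self.eps)
        xn = torch.log(torch.clamp(-x, min=0) + self.eps)

        def branch(logx, v):
            # exp(max_k(ln|x_k| + v_k)), with LSE standing in for the max;
            # broadcasting gives shape (batch, out_features, in_features)
            # before reduction over the input dimension.
            return torch.exp(lse_max(logx.unsqueeze(1) + v, self.alpha))

        return (branch(xp, self.v[0]) - branch(xn, self.v[1])
                - branch(xp, self.v[2]) + branch(xn, self.v[3]))
```

A larger alpha brings lse_max closer to the exact maximum but makes it less smooth, so gradually increasing alpha over the course of training is a natural choice for a continuous-approximation scheme of this kind.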
Keywords: bipolar morphological networks, approximations, artificial neural networks, computational efficiency.
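The layer-by-layer knowledge distillation could proceed roughly as sketched below, assuming each BM layer is fitted to the activations of the corresponding layer of a pretrained classical (teacher) network under an MSE loss; the function distill_layerwise, the Adam optimizer, and the epoch count are illustrative assumptions rather than the procedure reported in the paper.

```python
import torch
import torch.nn as nn


def distill_layerwise(teacher_layers, student_layers, loader,
                      epochs=5, lr=1e-3):
    # Train each student (BM) layer, in order, to reproduce the activations
    # of the matching teacher layer. The student layer always sees the
    # teacher's previous-layer output, so approximation errors from earlier
    # BM layers do not compound during training.
    for k, (t_k, s_k) in enumerate(zip(teacher_layers, student_layers)):
        opt = torch.optim.Adam(s_k.parameters(), lr=lr)
        for _ in range(epochs):
            for x, _ in loader:
                with torch.no_grad():
                    h = x
                    for t in teacher_layers[:k]:
                        h = t(h)          # teacher prefix up to layer k-1
                    target = t_k(h)       # teacher's layer-k activations
                loss = nn.functional.mse_loss(s_k(h), target)
                opt.zero_grad()
                loss.backward()
                opt.step()
```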
Document Type: Article
Language: Russian
Citation: M. V. Zingerenko, E. E. Limonova, “Layer-wise knowledge distillation for simplified bipolar morphological neural networks”, Informatsionnye Tekhnologii i Vychislitel'nye Sistemy, 2023, no. 3, 46–54
Citation in AMSBIB format
\Bibitem{ZinLim23}
\by M.~V.~Zingerenko, E.~E.~Limonova
\paper Layer-wise knowledge distillation for simplified bipolar morphological neural networks
\jour Informatsionnye Tekhnologii i Vychislitel'nye Sistemy
\yr 2023
\issue 3
\pages 46--54
\mathnet{http://mi.mathnet.ru/itvs820}
\crossref{https://doi.org/10.14357/20718632230305}
\elib{https://elibrary.ru/item.asp?id=54676455}
Linking options:
  • https://www.mathnet.ru/eng/itvs820
  • https://www.mathnet.ru/eng/itvs/y2023/i3/p46