INTELLIGENT SYSTEMS AND TECHNOLOGIES
Layer-wise knowledge distillation for simplified bipolar morphological neural networks
M. V. Zingerenko (a,b), E. E. Limonova (b,c)
a Moscow Institute of Physics and Technology (National Research University), Dolgoprudny, Moscow Region, Russia
b Smart Engines Service LLC, Moscow, Russia
c Federal Research Center "Computer Science and Control" of Russian Academy of Sciences, Moscow, Russia
Abstract:
Various neuron approximations can be used to reduce the computational complexity of neural networks. One such approximation, based on summation and maximum operations, is the bipolar morphological neuron. This paper presents an improved structure of the bipolar morphological neuron that enhances its computational efficiency, together with a new training approach based on continuous approximations of the maximum and knowledge distillation. Experiments were conducted on the MNIST dataset using a LeNet-like neural network architecture and on the CIFAR10 dataset using a ResNet-22 model architecture. The proposed training method achieves 99.45% classification accuracy with the LeNet-like model, matching the accuracy of the classical network, and 86.69% accuracy with the ResNet-22 model, compared to 86.43% for the classical model. The results show that the proposed method, with a log-sum-exp (LSE) approximation of the maximum and layer-by-layer knowledge distillation, yields a simplified bipolar morphological network that is not inferior to classical networks.
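For illustration, the sketch below shows the two ingredients named in the abstract: the log-sum-exp (LSE) smooth approximation of the maximum, which tends to the hard maximum as its scale parameter grows, and a generic layer-by-layer distillation loss that matches student-layer activations to the corresponding teacher-layer activations. This is a minimal sketch of the general ideas, not the authors' exact formulation; the scale parameter t, the MSE form of the per-layer loss, and all function names are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's implementation) of:
#  1) the log-sum-exp (LSE) smooth approximation of the maximum, and
#  2) a layer-wise knowledge distillation loss over matched layer outputs.
import torch
import torch.nn.functional as F


def lse_max(x: torch.Tensor, t: float = 10.0, dim: int = -1) -> torch.Tensor:
    """Smooth maximum: (1/t) * log(sum(exp(t * x))) -> max(x) as t -> inf."""
    return torch.logsumexp(t * x, dim=dim) / t


def layerwise_distillation_loss(student_feats, teacher_feats):
    """Sum of per-layer MSE terms between matched student/teacher activations.

    Teacher activations are detached so gradients flow only into the student.
    """
    return sum(F.mse_loss(s, t.detach()) for s, t in zip(student_feats, teacher_feats))


if __name__ == "__main__":
    # LSE approaches the hard maximum as t grows.
    x = torch.randn(4, 16)
    print("hard max:", x.max(dim=-1).values)
    print("LSE max: ", lse_max(x, t=10.0))

    # Dummy activations from three matched student/teacher layers.
    student = [torch.randn(4, 32, requires_grad=True) for _ in range(3)]
    teacher = [torch.randn(4, 32) for _ in range(3)]
    loss = layerwise_distillation_loss(student, teacher)
    loss.backward()
    print("layer-wise distillation loss:", loss.item())
```

Because lse_max is differentiable everywhere, it can stand in for the hard maximum during training and be replaced by the exact maximum at inference time; the per-layer loss terms can be minimized layer by layer or summed into a single objective.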
Keywords:
bipolar morphological networks, approximations, artificial neural networks, computational efficiency.
Citation:
M. V. Zingerenko, E. E. Limonova, “Layer-wise knowledge distillation for simplified bipolar morphological neural networks”, Informatsionnye Tekhnologii i Vychislitel'nye Sistemy, 2023, no. 3, 46–54
Linking options:
https://www.mathnet.ru/eng/itvs820
https://www.mathnet.ru/eng/itvs/y2023/i3/p46