Topical issue
Distilling face recognition models trained using margin-based softmax function
D. V. Svitov (a, b), S. A. Alyamkin (a)
(a) Expasoft LLC, Novosibirsk, 630090 Russia
(b) Institute of Automation and Electrometry of the Siberian Branch of the Russian Academy of Sciences, Novosibirsk, 630090 Russia
Abstract:
Convolutional neural networks trained with a margin-based softmax function achieve the highest accuracy in the face recognition problem. The development of embedded systems such as smart intercoms has increased interest in lightweight neural networks; consequently, lightweight models trained with the margin-based softmax function have been proposed for the face identification problem. In the present paper, we propose a distillation method that achieves higher accuracy than other methods for the face recognition problem on the LFW, AgeDB-30, and MegaFace datasets. The main idea of our approach is to use the class centers of the teacher network to initialize the student network. The student network is then trained to produce biometric vectors whose angles to the class centers equal the corresponding angles in the teacher network.
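The angle-matching objective described above can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the authors' implementation: the function name `angular_distillation_loss` and the choice of a mean-squared penalty on cosine similarities are hypothetical; the sketch only captures the stated idea of making the student's angles to the (teacher-derived) class centers match the teacher's.

```python
import numpy as np

def l2_normalize(x, axis=-1):
    """Project rows onto the unit hypersphere, as in margin-based softmax setups."""
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def angular_distillation_loss(student_emb, teacher_emb, class_centers):
    """Illustrative loss (hypothetical form): mean squared difference between
    the student's and the teacher's cosine similarities to the class centers.

    student_emb, teacher_emb : (batch, dim) biometric vectors
    class_centers            : (num_classes, dim) centers taken from the teacher
    """
    s = l2_normalize(student_emb)
    t = l2_normalize(teacher_emb)
    c = l2_normalize(class_centers)
    cos_s = s @ c.T  # (batch, num_classes) cosines of student-to-center angles
    cos_t = t @ c.T  # same angles as seen by the teacher
    return float(np.mean((cos_s - cos_t) ** 2))
```

Matching cosines is equivalent to matching the angles themselves on the unit sphere; the loss is zero exactly when the student reproduces the teacher's angular geometry with respect to the class centers.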
Keywords:
convolutional neural network, distillation, bioidentification.
Citation:
D. V. Svitov, S. A. Alyamkin, “Distilling face recognition models trained using margin-based softmax function”, Avtomat. i Telemekh., 2022, no. 10, 35–46; Autom. Remote Control, 83:10 (2022), 1517–1526
Linking options:
https://www.mathnet.ru/eng/at16049
https://www.mathnet.ru/eng/at/y2022/i10/p35