Informatics and Automation

Informatics and Automation, 2022, Volume 21, Issue 1, Pages 161–180
DOI: https://doi.org/10.15622/ia.2022.21.6
(Mi trspy1187)
 

This article is cited in 2 scientific papers

Artificial Intelligence, Knowledge and Data Engineering

Polynomial approximations for several neural network activation functions

G. B. Marshalko, J. A. Trufanova

Technical committee for standardization "Cryptography and security mechanisms"
Abstract: The active deployment of machine learning systems raises the problem of protecting them against attacks that threaten the confidentiality, integrity and availability of both the processed data and the trained models. One promising direction for such protection is privacy-preserving machine learning, in which homomorphic encryption schemes protect the data and the models. However, such schemes can evaluate only polynomial functions, so the nonlinear activation functions used in neural models must be replaced with polynomial approximations. The goal of this paper is to construct precise approximations of several widely used neural network activation functions under a bound on the degree of the approximating polynomials, and to evaluate how the approximation precision affects the output of the whole neural network. In contrast to previous publications, we study and compare different methods of constructing polynomial approximations, introduce precision metrics, and present exact formulas for the approximating polynomials together with the exact values of the corresponding precisions. We compare our results with previously published ones. Finally, for a simple convolutional network we experimentally evaluate how the approximation precision affects the deviation of the network's output neuron values from the original ones. Our results show that the best approximation of ReLU is obtained with a numeric method, while for the sigmoid and hyperbolic tangent the best results are given by Chebyshev polynomials; among the three functions, ReLU admits the most precise approximation. The results can be used to construct polynomial approximations of activation functions in privacy-preserving machine learning systems.
Keywords: activation function, ReLU, tanh, sigmoid, homomorphic encryption, BGV, CKKS, neural network, polynomial approximation, privacy-preserving machine learning.
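The two construction methods named in the abstract can be illustrated in a few lines. The sketch below is not the authors' exact procedure (the paper's interval, degree bound, and precision metrics are its own); it assumes an interval of [-4, 4] and degree 7 for illustration, approximating the sigmoid and tanh by Chebyshev interpolation and ReLU by a numeric least-squares fit, then reporting the max-norm error of each approximation.

```python
import numpy as np
from numpy.polynomial import Polynomial
from numpy.polynomial import chebyshev as C

# Approximation interval and degree bound (assumed for illustration;
# the paper fixes its own interval and degree limits).
a, b = -4.0, 4.0
deg = 7

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(x, 0.0)

x = np.linspace(a, b, 2001)

# Chebyshev interpolation for the smooth activations.
cheb_sig = C.Chebyshev.interpolate(sigmoid, deg, domain=[a, b])
cheb_tanh = C.Chebyshev.interpolate(np.tanh, deg, domain=[a, b])

# Numeric (least-squares) fit for ReLU, whose kink at 0 limits
# the accuracy of interpolation at Chebyshev nodes.
ls_relu = Polynomial.fit(x, relu(x), deg)

for name, f, p in [("sigmoid", sigmoid, cheb_sig),
                   ("tanh", np.tanh, cheb_tanh),
                   ("ReLU", relu, ls_relu)]:
    err = np.max(np.abs(f(x) - p(x)))  # max-norm precision metric
    print(f"{name}: max |f - p| on [{a}, {b}] = {err:.4f}")
```

The resulting polynomial can then be evaluated coefficient-by-coefficient under a homomorphic scheme such as BGV or CKKS, which is what makes the degree bound matter: each extra degree costs multiplicative depth in the encrypted circuit.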
Received: 07.10.2021
Document Type: Article
UDC: 004.032.26
Language: Russian
Citation: G. B. Marshalko, J. A. Trufanova, “Polynomial approximations for several neural network activation functions”, Informatics and Automation, 21:1 (2022), 161–180
Citation in format AMSBIB
\Bibitem{MarTru22}
\by G.~B.~Marshalko, J.~A.~Trufanova
\paper Polynomial approximations for several neural network activation functions
\jour Informatics and Automation
\yr 2022
\vol 21
\issue 1
\pages 161--180
\mathnet{http://mi.mathnet.ru/trspy1187}
\crossref{https://doi.org/10.15622/ia.2022.21.6}
Linking options:
  • https://www.mathnet.ru/eng/trspy1187
  • https://www.mathnet.ru/eng/trspy/v21/i1/p161