MATHEMATICS
On the convergence rate of the subgradient method with metric variation and its applications in neural network approximation schemes
V. N. Krutikov, N. S. Samoilenko (Kemerovo State University)
Abstract:
This paper studies the relaxation subgradient method with rank-two correction of metric matrices. It is proven that, on strongly convex functions, whenever there exists a linear coordinate transformation that reduces the degree of ill-conditioning of the problem, the method has a linear convergence rate corresponding to that degree of ill-conditioning. The paper also offers a new efficient tool for choosing the initial approximation of an artificial neural network. The use of regularization makes it possible to eliminate the overfitting effect and to efficiently remove low-significance neurons and inter-neuron connections; the ability to solve such problems efficiently is ensured by the subgradient method with rank-two correction of metric matrices. It is shown experimentally that, on smooth functions, the convergence rate of the method under study is virtually the same as that of the quasi-Newton method, and the method retains a high convergence rate on nonsmooth functions as well. These computational capabilities of the method are used to build efficient neural network training algorithms. The paper describes an artificial neural network training algorithm which, together with suppression of redundant neurons, yields reliable approximations in a single run.
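To illustrate the core idea of descending along directions defined by a variable metric, below is a minimal Python sketch of a subgradient method whose metric matrix receives a rank-two correction at each step. This is not the authors' relaxation algorithm: the function names, the diminishing step rule, and the BFGS-form rank-two update are illustrative assumptions standing in for the paper's specific correction formula.

import numpy as np

def metric_subgradient_min(subgrad, x0, iters=200, step0=1.0, tol=1e-10):
    # Sketch only: subgradient descent along d = -H g, where the metric
    # matrix H gets a rank-two (BFGS-type) correction each step. This is
    # a stand-in for the paper's relaxation method, not its exact update.
    x = np.asarray(x0, dtype=float)
    H = np.eye(x.size)                 # initial metric: identity
    g = subgrad(x)
    for k in range(iters):
        if np.linalg.norm(g) < tol:    # (sub)gradient near zero: stop
            break
        d = -H @ g                     # descent direction in current metric
        s = (step0 / (k + 1)) * d / (np.linalg.norm(d) + 1e-16)  # diminishing step
        x = x + s
        g_new = subgrad(x)
        y = g_new - g                  # change of subgradients along the step
        sy = s @ y
        if sy > 1e-12:                 # curvature guard keeps H positive definite
            Hy = H @ y
            # rank-two correction (BFGS form for the inverse metric)
            H += (1.0 + (y @ Hy) / sy) * np.outer(s, s) / sy
            H -= (np.outer(s, Hy) + np.outer(Hy, s)) / sy
        g = g_new
    return x

# Usage on a simple nonsmooth convex function f(x) = |x1| + 2|x2|
sg = lambda x: np.sign(x) * np.array([1.0, 2.0])
print(metric_subgradient_min(sg, np.array([3.0, -4.0])))

The curvature guard sᵀy > 0 is the standard condition under which a BFGS-type rank-two update preserves positive definiteness of the metric; on nonsmooth functions it simply skips the update when the subgradient change carries no usable curvature information.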
Keywords:
subgradient method, minimization, rate of convergence, neural networks, regularization.
Received: 31.03.2018
Citation:
V. N. Krutikov, N. S. Samoilenko, “On the convergence rate of the subgradient method with metric variation and its applications in neural network approximation schemes”, Vestn. Tomsk. Gos. Univ. Mat. Mekh., 2018, no. 55, 22–37
Linking options:
https://www.mathnet.ru/eng/vtgu668
https://www.mathnet.ru/eng/vtgu/y2018/i55/p22