Machine learning of the well-known things
V. V. Dolotin (a,b,c), A. Yu. Morozov (a,b,c), A. V. Popolitov (a,b,c)
a Moscow Institute of Physics and Technology (National Research University), Dolgoprudny, Moscow Region, Russia
b Alikhanov Institute for Theoretical and Experimental Physics, National Research Centre "Kurchatov Institute", Moscow, Russia
c Kharkevich Institute for Information Transmission Problems of the Russian Academy of Sciences, Moscow, Russia
Abstract:
Machine learning (ML) in its current form implies that the answer to any problem can be well approximated by a function of a very peculiar form: a specially adjusted iteration of Heaviside theta-functions. It is natural to ask whether the answers to questions that we already know can be naturally represented in this form. We provide elementary yet non-evident examples showing that this is indeed possible, and suggest looking for a systematic reformulation of existing knowledge in an ML-consistent way. The success or failure of these attempts can shed light on a variety of problems, both scientific and epistemological.
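As a minimal illustration of the idea stated in the abstract (this sketch is not from the paper itself): any reasonably smooth function can be approximated by a "staircase" combination of Heaviside theta-functions, f(x) ≈ f(a) + Σ_i c_i θ(x − a_i), where the jump heights c_i are increments of f on a grid. The function names below (`theta`, `staircase_approx`) are illustrative, not from the source.

```python
import math

def theta(x):
    """Heaviside step function: 1 for x >= 0, else 0."""
    return 1.0 if x >= 0 else 0.0

def staircase_approx(f, a, b, n):
    """Approximate f on [a, b] by a sum of n theta-function steps."""
    xs = [a + (b - a) * i / n for i in range(n + 1)]
    # jump heights: increments of f between consecutive grid points
    jumps = [(xs[i], f(xs[i]) - f(xs[i - 1])) for i in range(1, n + 1)]

    def g(x):
        # the sum telescopes to f evaluated at the nearest grid point below x
        return f(a) + sum(c * theta(x - t) for t, c in jumps)

    return g

approx = staircase_approx(math.sin, 0.0, math.pi, 200)
print(abs(approx(1.0) - math.sin(1.0)) < 0.02)  # coarse pointwise check
```

With 200 steps the error at x = 1.0 is of order the grid spacing (π/200 ≈ 0.016), consistent with a first-order staircase approximation; refining the grid improves the accuracy accordingly.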
Keywords:
exact approaches to QFT, nonlinear algebra, machine learning, steepest descent method.
Received: 02.12.2022 Revised: 06.12.2022
Citation:
V. V. Dolotin, A. Yu. Morozov, A. V. Popolitov, “Machine learning of the well-known things”, TMF, 214:3 (2023), 517–528; Theoret. and Math. Phys., 214:3 (2023), 446–455
Linking options:
https://www.mathnet.ru/eng/tmf10418
https://doi.org/10.4213/tmf10418
https://www.mathnet.ru/eng/tmf/v214/i3/p517