Doklady Rossijskoj Akademii Nauk. Mathematika, Informatika, Processy Upravlenia

Doklady Rossijskoj Akademii Nauk. Mathematika, Informatika, Processy Upravlenia, 2023, Volume 514, Number 2, Pages 343–354
DOI: https://doi.org/10.31857/S2686954323601665
(Mi danma478)
 

SPECIAL ISSUE: ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING TECHNOLOGIES

Optimal data splitting in distributed optimization for machine learning

D. Medyakov, G. Molodtsov, A. Beznosikov, A. Gasnikov

Moscow Institute of Physics and Technology, Dolgoprudny, Russian Federation
Abstract: The distributed optimization problem has become increasingly relevant in recent years. Distributed methods offer clear advantages, such as processing large amounts of data in less time than non-distributed approaches. However, most of them suffer from a significant bottleneck: the cost of communication. A large body of recent research has therefore been directed at this problem. One approach exploits local data similarity; in particular, there exists an algorithm that provably makes optimal use of the similarity property. However, this result, like those of other works, addresses the communication bottleneck solely under the assumption that communication is much more expensive than local computation, and does not account for the varying capacities of network devices or the different ratios between communication time and local computation cost. We consider this setup; the objective of this study is to achieve an optimal split of the data between the server and the local machines for arbitrary communication and local computation costs. The running time of the network under the uniform data split is compared with that under the optimal split. The superior theoretical performance of our solution is validated experimentally.
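The idea of cost-aware data splitting can be illustrated with a minimal sketch, assuming a simple linear per-round cost model (an illustration only, not the paper's exact model or analysis): device i spends comm[i] seconds on communication plus n_i * comp[i] seconds on local computation per round, and the split n_0, ..., n_{m-1} of N samples is chosen to equalize finishing times. All names and the cost parameters below are hypothetical.

```python
def round_time(split, comm, comp):
    """Wall-clock time of one round: the slowest device dominates."""
    return max(c + n * k for n, c, k in zip(split, comm, comp))

def optimal_split(N, comm, comp):
    """Balance comm[i] + n_i * comp[i] across devices (continuous relaxation).
    Solving comm[i] + n_i * comp[i] = T for all i with sum(n_i) = N gives
    T = (N + sum(comm[i] / comp[i])) / sum(1 / comp[i])."""
    T = (N + sum(c / k for c, k in zip(comm, comp))) / sum(1.0 / k for k in comp)
    return [max(0.0, (T - c) / k) for c, k in zip(comm, comp)]

# Hypothetical setup: a server (index 0, no communication cost, slower compute)
# and three identical workers that each pay a fixed communication cost.
N = 10_000
comm = [0.0, 2.0, 2.0, 2.0]      # seconds of communication per round
comp = [1e-3, 4e-4, 4e-4, 4e-4]  # seconds of computation per sample

uniform = [N / len(comm)] * len(comm)
balanced = optimal_split(N, comm, comp)
print(round_time(uniform, comm, comp))   # uniform split
print(round_time(balanced, comm, comp))  # balanced split is never slower
```

Under this toy model the balanced split shifts samples toward devices with spare time (here, the communication-free server), so the per-round makespan can only decrease relative to the uniform split.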
Funding agency Grant number
Russian Science Foundation 23-11-00229
The research of A. Beznosikov was supported by Russian Science Foundation (project no. 23-11-00229).
Presented: A. L. Semenov
Received: 02.09.2023
Revised: 15.09.2023
Accepted: 15.10.2023
English version:
Doklady Mathematics, 2023, Volume 108, Issue suppl. 2, Pages S465–S475
DOI: https://doi.org/10.1134/S1064562423701600
Bibliographic databases:
Document Type: Article
UDC: 004.8
Language: Russian
Citation: D. Medyakov, G. Molodtsov, A. Beznosikov, A. Gasnikov, “Optimal data splitting in distributed optimization for machine learning”, Dokl. RAN. Math. Inf. Proc. Upr., 514:2 (2023), 343–354; Dokl. Math., 108:suppl. 2 (2023), S465–S475
Citation in format AMSBIB
\Bibitem{MedMolBez23}
\by D.~Medyakov, G.~Molodtsov, A.~Beznosikov, A.~Gasnikov
\paper Optimal data splitting in distributed optimization for machine learning
\jour Dokl. RAN. Math. Inf. Proc. Upr.
\yr 2023
\vol 514
\issue 2
\pages 343--354
\mathnet{http://mi.mathnet.ru/danma478}
\crossref{https://doi.org/10.31857/S2686954323601665}
\elib{https://elibrary.ru/item.asp?id=56717852}
\transl
\jour Dokl. Math.
\yr 2023
\vol 108
\issue suppl. 2
\pages S465--S475
\crossref{https://doi.org/10.1134/S1064562423701600}
Linking options:
  • https://www.mathnet.ru/eng/danma478
  • https://www.mathnet.ru/eng/danma/v514/i2/p343