All-Russian Optimization Seminar named after B. T. Polyak
June 30, 2021, 17:30, Moscow, Online
Александр Безносиков - On Communication Bottleneck in Distributed Optimization
Zengyanxing (Huawei) - Distributed optimization algorithm scalability for large-scale ML scenarios based on cluster computing
|
Abstract:
First talk:
Communication is the bottleneck for most distributed algorithms. This presentation is a short survey of modern techniques for reducing the communication cost of distributed optimization. In particular, we discuss
1) quantization and compression operators in optimization methods;
2) local steps technique;
3) methods that exploit data similarity across computing devices.
At the end of the presentation, we compare decentralized and centralized approaches in terms of efficient use of computing resources and communications.
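To make point 1) concrete, below is a minimal sketch of one standard compression operator, random-k sparsification: only k of the d coordinates are transmitted, scaled by d/k so the operator is unbiased. This is an illustrative example of the general technique, not code from the talk; the function name and parameters are our own.

```python
import numpy as np

def rand_k_compress(x, k, rng):
    """Unbiased random-k sparsification (illustrative, not from the talk).

    Keeps k of the d coordinates of x, chosen uniformly at random,
    and scales them by d/k so that E[C(x)] = x.
    """
    d = x.size
    idx = rng.choice(d, size=k, replace=False)  # coordinates to keep
    out = np.zeros_like(x)
    out[idx] = (d / k) * x[idx]                 # rescale for unbiasedness
    return out

# Empirical check of unbiasedness: the average of many compressed
# copies of x should be close to x itself.
rng = np.random.default_rng(0)
x = rng.standard_normal(100)
avg = np.mean([rand_k_compress(x, 10, rng) for _ in range(20000)], axis=0)
```

In a distributed SGD loop, each worker would apply such an operator to its gradient before sending it, cutting per-round traffic by a factor of d/k at the price of extra variance.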
Second talk:
We propose to use the fundamental isoefficiency function to evaluate the scalability of distributed optimization algorithms. The commercial measure of a computing product is its effective computing power, and a distributed optimization algorithm that approaches the lower bound of the isoefficiency function best releases that effective computing power, giving the computing product strong cost competitiveness. Based on a deep dive into state-of-the-art methodology from both industrial and academic points of view, and an analysis of its open problems, we issue a call for proposals to design a general framework for distributed optimization algorithms that can approach the theoretical lower bound of the isoefficiency function, O(p). The expected deliverables will benefit computing products, HPC, AI clusters, and the cloud.
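To illustrate the isoefficiency idea, here is a toy model (our own assumption, not from the talk): parallel time is perfectly divisible work W/p plus a logarithmic communication term c·log2(p). Efficiency is then E = T1/(p·Tp) = W/(W + c·p·log2 p), so keeping E constant requires the problem size W to grow like p·log2 p, which is this model's isoefficiency function.

```python
import math

# Assumed cost model: W units of parallelizable work on p workers,
# plus a log-depth reduction costing c * log2(p). Illustrative only.
def parallel_time(W, p, c=1.0):
    return W / p + c * math.log2(p)

def efficiency(W, p, c=1.0):
    """E = T1 / (p * Tp) with T1 = W under the model above."""
    return W / (p * parallel_time(W, p, c))

# Growing W proportionally to p * log2(p) holds efficiency constant:
# with W = 9 * p * log2(p) and c = 1, E = 9/(9+1) = 0.9 for every p > 1.
for p in (2, 8, 64):
    W = 9 * p * math.log2(p)
    print(p, efficiency(W, p))
```

An algorithm whose required W grows only linearly in p (the O(p) lower bound mentioned in the abstract) would scale even more gracefully than this log-cost model.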