| Research problem title | Sign Operator for Coping with Heavy-Tailed Noise |
| --- | --- |
| Type of research | M1P |
| Author | Mark Igorevich Ikonnikov |
| Scientific supervisor | Aleksandr Nikolaevich Beznosikov, PhD |
| Scientific consultant (if any) | Nikita Maksimovich Kornilov, MS student |
In Machine Learning, the non-smoothness of optimization problems, the high cost of communicating gradients between workers, and severely corrupted data during training necessitate generalized optimization approaches. This paper explores the efficacy of sign-based methods, which address slow transmission by communicating only the sign of each minibatch stochastic gradient. We investigate these methods under generalized smoothness assumptions and heavy-tailed stochastic gradient noise.
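A minimal sketch of the core idea in plain NumPy (hypothetical function names, not the paper's actual implementation): each coordinate of the update needs only the sign of the stochastic gradient, i.e. one bit per coordinate instead of a full-precision float.

```python
import numpy as np

def sign_sgd_step(x, stoch_grad, lr):
    """One SignSGD step: move along the sign of the stochastic gradient.
    Only sign(g) (1 bit per coordinate) needs to be communicated."""
    return x - lr * np.sign(stoch_grad)

def majority_vote_step(x, worker_grads, lr):
    """Distributed variant: each worker sends sign(g_i); the server
    aggregates the signs by a coordinate-wise majority vote."""
    votes = np.sum([np.sign(g) for g in worker_grads], axis=0)
    return x - lr * np.sign(votes)
```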
The object of this research is the stochastic optimization of a smooth, non-convex function. Traditional analyses often assume standard $L$-smoothness of the objective and light-tailed (bounded-variance) gradient noise; we relax these to $(L_0, L_1)$-smoothness and heavy-tailed noise with a bounded $\kappa$-th moment for some $\kappa \in (1, 2]$.
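Concretely, these generalized conditions are usually formalized as follows (a standard formulation; the exact constants and ranges may differ slightly from the paper):

```latex
% (L_0, L_1)-smoothness: the gradient's Lipschitz constant may grow
% with the gradient norm itself (covers, e.g., polynomial and
% exponential objectives that are not globally L-smooth):
\|\nabla f(x) - \nabla f(y)\|
  \le \bigl(L_0 + L_1 \|\nabla f(x)\|\bigr)\,\|x - y\|
  \quad \text{for } \|x - y\| \le \tfrac{1}{L_1}.

% Heavy-tailed noise: only the \kappa-th moment of the stochastic
% gradient noise is bounded, \kappa \in (1, 2]; the variance may be
% infinite (the \kappa = 2 case recovers bounded variance):
\mathbb{E}\bigl[\|\nabla f(x;\xi) - \nabla f(x)\|^{\kappa}\bigr] \le \sigma^{\kappa}.
```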
In this paper we:
- investigate sign-based methods for communication-efficient distributed optimization under the assumptions above;
- develop high-probability convergence guarantees that account for these generalized conditions.
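Schematically, a high-probability guarantee takes the following form (a generic template under the assumptions above, not the paper's specific bound):

```latex
% With probability at least 1 - \delta, after T = T(\varepsilon, \delta) iterations:
\mathbb{P}\Bigl( \min_{t \le T} \bigl\|\nabla f(x_t)\bigr\| \le \varepsilon \Bigr)
  \ge 1 - \delta,
\qquad
T = \mathrm{poly}\bigl(\varepsilon^{-1}\bigr)\cdot \mathrm{polylog}\bigl(1/\delta\bigr).
```

The logarithmic dependence on $1/\delta$ is the key point: under heavy-tailed noise, converting an in-expectation bound via Markov's inequality would instead yield a polynomial $1/\delta$ factor.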
The experimental goals are to validate convergence under heavy-tailed gradient noise and the generalized smoothness conditions described above.
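A minimal sketch of such an experiment, assuming a synthetic quadratic objective and Student-t gradient noise (a hypothetical setup for illustration, not the paper's exact benchmark):

```python
import numpy as np

rng = np.random.default_rng(0)
d, T, lr = 10, 2000, 1e-2
x = rng.normal(size=d)  # iterate for f(x) = 0.5 * ||x||^2, so grad f(x) = x

grad_norms = []
for t in range(T):
    # Student-t noise with df < 2 has infinite variance: a simple
    # stand-in for heavy-tailed stochastic gradients (kappa < 1.5 here).
    noise = rng.standard_t(df=1.5, size=d)
    stoch_grad = x + noise
    x = x - lr * np.sign(stoch_grad)  # SignSGD update
    grad_norms.append(np.linalg.norm(x))  # ||grad f(x)|| = ||x||

print(f"final ||grad f(x)||: {grad_norms[-1]:.4f}")
```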
The paper is part of a larger collective research effort. The findings have been submitted to the NeurIPS 2025 conference.
The paper is mainly theoretical, so the software only includes the code for the computational experiments on the convergence rate. The code with all experiment visualizations is available here.