Kazan (Volga Region) Federal University, KFU
USING NON-LIPSCHITZ SIGNUM-BASED FUNCTIONS FOR DISTRIBUTED OPTIMIZATION AND MACHINE LEARNING: TRADE-OFF BETWEEN CONVERGENCE RATE AND OPTIMALITY GAP
Publication type: Article in a foreign journal or collection
Year of publication: 2025
Language: English
  • Zulfiya Ravilevna Gabidullina, author
  • Alireza Aghasi, author
  • Mohammadreza Doostmohammadian, author
  • Amir Ahmad Ghods, author
  • Hamid R. Rabiee, author
  • Bibliographic description in the original language: Mohammadreza Doostmohammadian, Amir Ahmad Ghods, Alireza Aghasi, Zulfiya R. Gabidullina, Hamid R. Rabiee. Using Non-Lipschitz Signum-Based Functions for Distributed Optimization and Machine Learning: Trade-Off Between Convergence Rate and Optimality Gap. Mathematical and Computational Applications, 2025, 30(5), 108; https://doi.org/10.3390/mca30050108
    Abstract: In recent years, the prevalence of large-scale datasets and the demand for sophisticated learning models have necessitated the development of efficient distributed machine learning (ML) solutions. Convergence speed is a critical factor influencing the practicality and effectiveness of these distributed frameworks. Recently, non-Lipschitz continuous optimization algorithms have been proposed to improve the slow convergence rate of existing linear solutions. Signum-based functions were previously considered in the consensus and control literature to reach fast convergence in prescribed time and to provide algorithms robust to noisy/outlier data. However, as shown in this work, these algorithms lead to an optimality gap and a steady-state residual of the objective function in the discrete-time setup. This motivates us to investigate distributed optimization and ML algorithms in terms of the trade-off between convergence rate and optimality gap. In this direction, we specifically consider the distributed regression problem and examine its convergence rate when applying both linear and non-Lipschitz signum-based functions. We evaluate our distributed regression approach through extensive simulations. Our results show that although adopting signum-based functions may yield faster convergence, it results in large optimality gaps. The findings presented in this paper may contribute to and advance the ongoing discourse on similar distributed algorithms, e.g., for distributed constrained optimization and distributed estimation. (An illustrative single-node sketch of the linear versus signum-based updates appears after this record.)
    Keywords: linear regression, distributed optimization, network and graph theory, Lipschitz continuity
    Journal: MATHEMATICAL AND COMPUTATIONAL APPLICATIONS
    Please use this identifier to cite or link to this record: https://repository.kpfu.ru/?p_id=320950
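To make the trade-off described in the abstract concrete, below is a minimal single-node sketch in Python. It is not the paper's distributed algorithm: the design matrix A, the step size alpha, the iteration count, and the noiseless least-squares setup are all illustrative assumptions. It only contrasts a plain (linear, Lipschitz) gradient step with a non-Lipschitz signum-based step and exposes the discrete-time steady-state residual (optimality gap) of the latter.

# Toy single-node illustration (not the paper's distributed algorithm) of the
# trade-off described in the abstract: a plain gradient step versus a
# non-Lipschitz signum-based step on a least-squares regression objective.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5))        # toy design matrix (illustrative assumption)
x_true = rng.standard_normal(5)         # ground-truth regression weights
b = A @ x_true                          # noiseless targets, for simplicity

def loss(x):
    # least-squares objective f(x) = 0.5 * ||A x - b||^2
    return 0.5 * float(np.sum((A @ x - b) ** 2))

def grad(x):
    # gradient of the least-squares objective
    return A.T @ (A @ x - b)

alpha = 1e-3            # step size (arbitrary choice for the sketch)
x_lin = np.zeros(5)     # iterate driven by the linear (plain gradient) update
x_sgn = np.zeros(5)     # iterate driven by the signum-based update

for _ in range(2000):
    x_lin = x_lin - alpha * grad(x_lin)            # Lipschitz-continuous dynamics
    x_sgn = x_sgn - alpha * np.sign(grad(x_sgn))   # non-Lipschitz signum dynamics

# In discrete time the signum-based iterate cannot settle exactly: near the
# optimum it keeps stepping by +/- alpha per coordinate and chatters, leaving a
# steady-state residual (optimality gap), while the plain gradient update keeps
# driving the loss toward zero.
print(f"final loss, linear update: {loss(x_lin):.3e}")
print(f"final loss, signum update: {loss(x_sgn):.3e}")

In the article's networked setting the same qualitative behaviour is reported across agents; this sketch only isolates the discrete-time optimality gap that motivates the convergence-rate versus optimality-gap trade-off studied in the paper.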
