Nonconvex Distributed Optimization via Lasalle and Singular Perturbations / Carnevale, G.; Notarstefano, G. - In: IEEE CONTROL SYSTEMS LETTERS. - ISSN 2475-1456. - Electronic. - 7:(2023), pp. 301-306. [10.1109/LCSYS.2022.3187918]

Nonconvex Distributed Optimization via Lasalle and Singular Perturbations

Carnevale, G. (first author); Notarstefano, G. (second author)
2023

Abstract

In this letter we address nonconvex distributed consensus optimization, a popular framework for distributed big-data analytics and learning. We consider the Gradient Tracking algorithm and, by resorting to an elegant system-theoretic analysis, we show that the agents' estimates asymptotically reach consensus at a stationary point. A suitable change of coordinates lets us write the Gradient Tracking as the interconnection of a fast dynamics and a slow one. To apply a singular perturbation analysis, we separately study two auxiliary subsystems, namely the boundary-layer system and the reduced system. We provide a Lyapunov function for the boundary-layer system and use LaSalle-based arguments to show that the trajectories of the reduced system converge to the set of stationary points. Finally, we prove a customized version of LaSalle's Invariance Principle for singularly perturbed systems and use it to establish the convergence properties of the Gradient Tracking.
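
The abstract refers to the (by now standard) Gradient Tracking updates. As a purely illustrative aid, the following minimal Python sketch runs those updates on a small nonconvex problem; the ring graph, Metropolis-like weights, local costs f_i(x) = log(1 + ||x - b_i||^2), and stepsize are assumptions chosen for the example, not taken from the paper or its code.

```python
import numpy as np

# Minimal sketch of the Gradient Tracking iteration (illustrative, not the authors' code).
# Each agent i keeps an estimate x_i of the decision variable and a tracker s_i of the
# average gradient. With a doubly stochastic weight matrix W and stepsize alpha:
#   x_i^{k+1} = sum_j W_ij x_j^k - alpha * s_i^k
#   s_i^{k+1} = sum_j W_ij s_j^k + grad f_i(x_i^{k+1}) - grad f_i(x_i^k)
# The stepsize alpha plays the role of the small singular-perturbation parameter.

rng = np.random.default_rng(0)
N, d = 5, 2            # number of agents, dimension of the decision variable
alpha = 0.01           # stepsize (assumed, small)

# Doubly stochastic weights for a ring graph (Metropolis-like choice).
W = np.zeros((N, N))
for i in range(N):
    W[i, i] = 0.5
    W[i, (i + 1) % N] = 0.25
    W[i, (i - 1) % N] = 0.25

# Local smooth nonconvex costs f_i(x) = log(1 + ||x - b_i||^2) with gradients below.
b = rng.normal(size=(N, d))

def grad_f(i, x):
    diff = x - b[i]
    return 2.0 * diff / (1.0 + diff @ diff)

# Initialization: arbitrary estimates, trackers start at the local gradients.
x = rng.normal(size=(N, d))
s = np.array([grad_f(i, x[i]) for i in range(N)])

for k in range(5000):
    x_new = W @ x - alpha * s                                   # consensus + descent step
    s = W @ s + np.array([grad_f(i, x_new[i]) - grad_f(i, x[i]) for i in range(N)])
    x = x_new

# Consensus error and stationarity measure at the average estimate.
x_bar = x.mean(axis=0)
print("consensus error:", np.linalg.norm(x - x_bar))
print("gradient norm at average:", np.linalg.norm(sum(grad_f(i, x_bar) for i in range(N))))
```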
Files in this record:

File: Nonconvex_Distributed_Optimization_via_Lasalle_and_Singular_Perturbations.pdf
Access: open access
Type: Publisher's version (PDF)
License: Creative Commons
Size: 459.95 kB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/900885
Citations
  • PMC: ND
  • Scopus: 3
  • Web of Science (ISI): 2