
Distributed Big-Data Optimization via Block-wise Gradient Tracking / Notarnicola, Ivano; Sun, Ying; Scutari, Gesualdo; Notarstefano, Giuseppe. - In: IEEE TRANSACTIONS ON AUTOMATIC CONTROL. - ISSN 0018-9286. - Electronic. - 66:5 (2021), pp. 2045-2060. [DOI: 10.1109/TAC.2020.3008713]

Distributed Big-Data Optimization via Block-wise Gradient Tracking

Notarnicola, Ivano; Notarstefano, Giuseppe
2021

Abstract

We study distributed big-data nonconvex optimization in multi-agent networks. We consider the (constrained) minimization of the sum of a smooth (possibly) nonconvex function and a convex (possibly) nonsmooth regularizer. Our interest is in big-data problems in which the number of optimization variables is large. If treated with standard distributed optimization algorithms, such large-scale problems may be intractable due to the prohibitive local computation and communication burden at each node. We propose a novel distributed solution method in which, at each iteration, agents update in an uncoordinated fashion only one block of the entire decision vector. To deal with the nonconvexity of the cost, the scheme hinges on Successive Convex Approximation (SCA) techniques combined with a novel block-wise perturbed push-sum consensus protocol, which is instrumental for local block-averaging operations and for tracking gradient averages. Asymptotic convergence to stationary solutions of the nonconvex problem is proved. Numerical results show the effectiveness of the proposed algorithm and highlight how the block dimension impacts the communication overhead and the practical convergence speed.
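The abstract builds on push-sum (ratio) consensus, the standard primitive for averaging over directed networks that the paper extends with block-wise updates and gradient tracking. As a toy illustration only (not the authors' algorithm, and without the block-wise perturbation), plain push-sum can be sketched as follows: each agent keeps a numerator state and a weight state, splits both equally among its out-neighbors, and the ratio of the two converges to the network-wide average.

```python
def push_sum_average(values, out_neighbors, iters=200):
    """Push-sum (ratio) consensus: each node's estimate x/w converges
    to the average of the initial values, even on a directed graph,
    because the update matrix is column-stochastic."""
    n = len(values)
    x = list(values)      # numerator states, initialized to local values
    w = [1.0] * n         # weight states, initialized to 1
    for _ in range(iters):
        new_x = [0.0] * n
        new_w = [0.0] * n
        for i in range(n):
            dests = out_neighbors[i] + [i]   # keep a self-loop
            share = 1.0 / len(dests)         # equal column-stochastic split
            for j in dests:
                new_x[j] += share * x[i]
                new_w[j] += share * w[i]
        x, w = new_x, new_w
    return [xi / wi for xi, wi in zip(x, w)]

# Directed ring on 4 nodes: each node forwards only to its successor,
# yet every estimate approaches the true average 2.5.
est = push_sum_average([1.0, 2.0, 3.0, 4.0], [[1], [2], [3], [0]])
```

In the paper this averaging is performed block-wise (each agent exchanges only one block of the decision vector per iteration) and is "perturbed" to simultaneously track gradient averages, which is what keeps per-iteration communication low on big-data problems.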
Notarnicola, Ivano; Sun, Ying; Scutari, Gesualdo; Notarstefano, Giuseppe
Files in this record:
post_print_TAC2020.pdf
  Open access
  Type: Postprint
  License: Free open-access license
  Size: 1.93 MB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/790157
Citations
  • Scopus: 17
  • Web of Science: 11