A randomized primal distributed algorithm for partitioned and big-data non-convex optimization

Notarnicola Ivano
;
Notarstefano Giuseppe
2016

Abstract

In this paper we consider a distributed optimization scenario in which the aggregate objective function to minimize is partitioned, big-data and possibly non-convex. Specifically, we focus on a set-up in which the dimension of the decision variable depends on the network size as well as on the number of local functions, but each local function handled by a node depends only on a (small) portion of the entire optimization variable. This problem set-up has been shown to appear in many interesting network application scenarios. As the main contribution of the paper, we develop a simple, primal distributed algorithm to solve the optimization problem, based on a randomized descent approach that works under asynchronous gossip communication. We prove that the proposed asynchronous algorithm is a proper, ad-hoc version of a coordinate descent method and thus converges to a stationary point. To show the effectiveness of the proposed algorithm, we also present numerical simulations on a non-convex quadratic program, which confirm the theoretical results.
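The abstract's description of a randomized, block-wise update under asynchronous gossip can be illustrated with a small sketch. The following is a minimal, purely illustrative Python example of randomized block coordinate (projected) gradient descent on a toy partitioned, possibly non-convex quadratic: at each tick one node wakes up at random and updates only its own block, using only the local functions that depend on that block. The graph, problem data, step size, box projection, and all names are assumptions made for illustration; they are not taken from the paper's exact algorithm.

```python
import numpy as np

# Illustrative sketch only (not the paper's algorithm): randomized block
# coordinate projected-gradient descent on a toy partitioned objective
# f(x) = sum_i f_i(x_{N_i}), where node i owns block x_i and f_i depends
# only on the blocks of node i and its neighbors. All data are synthetic.

np.random.seed(0)

n_nodes = 5      # one block per node
block_dim = 3    # dimension of each block

# Ring graph: f_i couples node i's block with those of its two neighbors.
neighbors = {i: [(i - 1) % n_nodes, i, (i + 1) % n_nodes] for i in range(n_nodes)}

# Toy quadratic pieces f_i(z) = 0.5 z'Q_i z + q_i'z with symmetric,
# possibly indefinite Q_i (hence a non-convex quadratic program).
Q, q = {}, {}
for i in range(n_nodes):
    d = len(neighbors[i]) * block_dim
    A = np.random.randn(d, d)
    Q[i] = 0.5 * (A + A.T)
    q[i] = np.random.randn(d)

def stacked(i, x):
    """Stack the blocks that f_i depends on, in the order of neighbors[i]."""
    return np.concatenate([x[j] for j in neighbors[i]])

def grad_fi(i, x):
    """Gradient of f_i with respect to its stacked argument."""
    return Q[i] @ stacked(i, x) + q[i]

def objective(x):
    return sum(0.5 * stacked(i, x) @ Q[i] @ stacked(i, x) + q[i] @ stacked(i, x)
               for i in range(n_nodes))

x = {i: np.zeros(block_dim) for i in range(n_nodes)}
step, box = 0.01, 5.0   # illustrative step size and box constraint [-box, box]

for _ in range(5000):
    # Asynchronous-gossip abstraction: one node wakes up uniformly at random
    # and updates only its own block.
    i = np.random.randint(n_nodes)
    g = np.zeros(block_dim)
    for j in range(n_nodes):
        if i in neighbors[j]:
            pos = neighbors[j].index(i)          # where x_i sits in f_j's argument
            g += grad_fi(j, x)[pos * block_dim:(pos + 1) * block_dim]
    # Projected gradient step on block i (the box keeps the non-convex toy bounded).
    x[i] = np.clip(x[i] - step * g, -box, box)

print("final objective:", objective(x))
```

On this toy instance the update reflects the coordinate-descent reading of the abstract: only the awakened node's variables change at each iteration, and the update uses only information from the local functions that actually depend on that block.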
Published in: 2016 IEEE 55th Conference on Decision and Control
Pages: 153-158
Notarnicola, Ivano; Notarstefano, Giuseppe (2016). A randomized primal distributed algorithm for partitioned and big-data non-convex optimization. USA: IEEE. DOI: 10.1109/CDC.2016.7798262.
Files in this item:

File: A randomized primal_post_reviewed.pdf
Access: open access
Type: Postprint
License: License for free, open access
Size: 572.61 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/674767
Citations
  • PMC: not available
  • Scopus: 7
  • Web of Science: 8