
Fast and scalable Lasso via stochastic Frank–Wolfe methods with a convergence guarantee

2016

Abstract

Frank–Wolfe (FW) algorithms have often been proposed over the last few years as efficient solvers for a variety of optimization problems arising in machine learning. The ability to work with cheap, projection-free iterations and the incremental nature of the method make FW a very effective choice for many large-scale problems where computing a sparse model is desirable. In this paper, we present a high-performance implementation of the FW method tailored to solving large-scale Lasso regression problems, based on a randomized iteration, and prove that the convergence guarantees of the standard FW method are preserved in the stochastic setting. We show experimentally that our algorithm outperforms several existing state-of-the-art methods, including the Coordinate Descent algorithm by Friedman et al. (one of the fastest known Lasso solvers), on several benchmark datasets with a very large number of features, without sacrificing model accuracy. Our results illustrate that the algorithm can generate the complete regularization path on problems with up to four million variables in less than one minute. © 2016, The Author(s).
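To make the idea in the abstract concrete, the following is a minimal sketch of a randomized Frank–Wolfe iteration for the constrained Lasso formulation min ||Ax − b||² s.t. ||x||₁ ≤ δ. It is an illustrative reconstruction, not the authors' implementation: the function name, parameters, and the uniform coordinate-sampling scheme are assumptions; only the general mechanism (a linear minimization oracle over the ℓ₁-ball, evaluated on a random subset of coordinates, with the classic 2/(k+2) step size) follows the standard FW template described in the abstract.

```python
import numpy as np

def stochastic_fw_lasso(A, b, delta=1.0, n_iter=200, sample_size=50, seed=0):
    """Illustrative randomized Frank-Wolfe sketch for
    min 0.5*||Ax - b||^2  subject to  ||x||_1 <= delta.
    The linear minimization oracle is evaluated only on a random
    subset of coordinates at each step (hypothetical parameters)."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    for k in range(n_iter):
        grad = A.T @ (A @ x - b)  # gradient of 0.5*||Ax - b||^2
        # Randomized LMO: look for the largest-magnitude gradient
        # entry only within a random sample of coordinates.
        idx = rng.choice(n, size=min(sample_size, n), replace=False)
        i = idx[np.argmax(np.abs(grad[idx]))]
        # Vertex of the l1-ball minimizing <grad, s> over the sample.
        s = np.zeros(n)
        s[i] = -delta * np.sign(grad[i])
        gamma = 2.0 / (k + 2.0)  # standard FW step size
        # Convex update: keeps x feasible and touches one coordinate
        # per iteration, so iterates stay sparse.
        x = (1.0 - gamma) * x + gamma * s
    return x
```

Because each update is a convex combination of the current iterate and a single ℓ₁-ball vertex, every iterate is feasible by construction and has at most k nonzero entries after k steps, which is the source of the sparsity and cheap per-iteration cost highlighted in the abstract.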
Frandi, E., Ñanculef, R., Lodi, S., Sartori, C., Suykens, J.A.K. (2016). Fast and scalable Lasso via stochastic Frank–Wolfe methods with a convergence guarantee. MACHINE LEARNING, 104(2-3), 195-221 [10.1007/s10994-016-5578-4].
Frandi, Emanuele; Ñanculef, Ricardo; Lodi, Stefano; Sartori, Claudio; Suykens, Johan A. K.
Files in this item:

LASSO_paper_Main.pdf
Open access
Type: Postprint
License: Free open-access license
Size: 1.06 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/585648
Citations
  • PMC: not available
  • Scopus: 9
  • Web of Science: 7