

Training Support Vector Machines Using Frank-Wolfe Optimization Methods

E. Frandi; R. Ñanculef; M. G. Gasparo; S. Lodi; C. Sartori
2013

Abstract

Training a support vector machine (SVM) requires the solution of a quadratic programming (QP) problem whose computational cost becomes prohibitive for large-scale datasets. Traditional optimization methods cannot be directly applied in these cases, mainly due to memory restrictions. By adopting a slightly different objective function, and under mild conditions on the kernel used within the model, efficient algorithms to train SVMs have been devised under the name of core vector machines (CVMs). This framework exploits the equivalence of the resulting learning problem with the problem of computing the minimal enclosing ball (MEB) of the data in a feature space, into which the data are implicitly embedded by a kernel function. In this paper, we improve on the CVM approach by proposing two novel methods to build SVMs based on the Frank-Wolfe algorithm, recently revisited as a fast approximate optimization method. Experiments on a large collection of datasets show that our methods scale better than CVMs in most cases, sometimes at the price of a slightly lower accuracy. Like CVMs, the proposed methods can be easily extended to machine learning problems other than binary classification. Unlike CVMs, however, they also yield effective classifiers with kernels that do not satisfy the condition required by CVMs, and can thus be applied to a wider set of problems.
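To make the MEB connection concrete, below is a minimal sketch (not the paper's exact algorithms, which are Frank-Wolfe variants with their own stopping rules and caching) of the classical Frank-Wolfe iteration on the MEB dual. When the kernel has a constant diagonal, k(x, x) = const (the condition CVMs require, satisfied e.g. by the Gaussian kernel), the dual reduces to minimizing a^T K a over the unit simplex; each iteration moves toward the simplex vertex with the smallest gradient component, using an exact line search. Function names such as frank_wolfe_meb are illustrative, not from the paper.

import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian Gram matrix; k(x, x) = 1 for all x, so the CVM
    # condition (constant kernel diagonal) holds.
    sq = np.sum(X**2, axis=1)
    D = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * D)

def frank_wolfe_meb(K, n_iter=500, tol=1e-8):
    # Minimize f(a) = a^T K a over the unit simplex (the MEB dual
    # when the kernel diagonal is constant). Gradient: 2 K a.
    n = K.shape[0]
    a = np.full(n, 1.0 / n)           # start at the simplex barycenter
    Ka = K @ a                        # maintained incrementally below
    for _ in range(n_iter):
        i = int(np.argmin(Ka))        # FW vertex: direction d = e_i - a
        aKa = a @ Ka
        gap = 2.0 * (aKa - Ka[i])     # Frank-Wolfe duality gap
        if gap <= tol:
            break
        dKd = K[i, i] - 2.0 * Ka[i] + aKa        # d^T K d
        step = min(1.0, (aKa - Ka[i]) / dKd)     # exact line search
        a *= 1.0 - step
        a[i] += step
        Ka = (1.0 - step) * Ka + step * K[:, i]  # update K @ a cheaply
    return a                          # nonzeros play the role of a core set

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
alpha = frank_wolfe_meb(rbf_kernel(X))
print("core-set size:", int(np.sum(alpha > 1e-6)))

The points with nonzero weight approximate the core set that determines the ball (and hence the SVM solution); each iteration touches only one kernel column, which is what makes this family of methods attractive at scale.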
E. Frandi, R. Ñanculef, M. G. Gasparo, S. Lodi, C. Sartori (2013). Training support vector machines using Frank-Wolfe optimization methods. International Journal of Pattern Recognition and Artificial Intelligence, 27(3), 1360003-1-1360003-40. DOI: 10.1142/S0218001413600033.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/142579

Citations
  • Scopus: 8
  • Web of Science: 6