A New Algorithm for Training SVMs using Approximate Minimal Enclosing Balls / Emanuele Frandi; Maria Grazia Gasparo; Stefano Lodi; Ricardo Ñanculef; Claudio Sartori. - Print. - LNCS 6419 (2010), pp. 87-95. (Paper presented at the 15th Iberoamerican Congress on Pattern Recognition, CIARP 2010, held in São Paulo, Brazil, November 8-11, 2010).
A New Algorithm for Training SVMs using Approximate Minimal Enclosing Balls
LODI, STEFANO; SARTORI, CLAUDIO
2010
Abstract
It has been shown that many kernel methods can be equivalently formulated as minimal enclosing ball (MEB) problems in a certain feature space. Exploiting this reduction, efficient algorithms to scale up Support Vector Machines (SVMs) and other kernel methods have been introduced under the name of Core Vector Machines (CVMs). In this paper, we study a new algorithm to train SVMs based on an instance of the Frank-Wolfe optimization method recently proposed to approximate the solution of the MEB problem. We show that, specialized to SVM training, this algorithm can scale better than CVMs at the price of a slightly lower accuracy.
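For context, the Frank-Wolfe approach to the MEB problem that the abstract refers to can be illustrated in the input space: at each iteration, the point farthest from the current center is selected (the linear-optimization step), and the center moves toward it with a diminishing step size. The sketch below is an illustrative instance of this general scheme (the function name, step rule, and iteration budget are choices made here, not details taken from the paper):

```python
import math

def approx_meb(points, iterations=2000):
    """Approximate the minimal enclosing ball of a finite point set with a
    Frank-Wolfe-style update: repeatedly move the center toward the
    farthest point using the diminishing step size 1/(t+1)."""
    center = list(points[0])
    for t in range(1, iterations + 1):
        # Linear-optimization step: the farthest point maximizes the
        # linearized squared-distance objective at the current center.
        farthest = max(
            points,
            key=lambda p: sum((a - b) ** 2 for a, b in zip(p, center)),
        )
        step = 1.0 / (t + 1)
        # Convex-combination update keeps the center inside the hull.
        center = [c + step * (f - c) for c, f in zip(center, farthest)]
    radius = max(math.dist(p, center) for p in points)
    return center, radius
```

For example, on the four corners of the unit square the returned ball is close to the exact one (center (0.5, 0.5), radius √0.5). In the kernelized setting used by CVMs, the same scheme operates on feature-space distances computed through the kernel rather than on explicit coordinates.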