Title: | A New Algorithm for Training SVMs using Approximate Minimal Enclosing Balls |
Author(s): | Emanuele Frandi; Maria Grazia Gasparo; Stefano Lodi; Ricardo Ñanculef; Claudio Sartori |
Unibo Author(s): | |
Year: | 2010 |
Book title: | Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications |
First page: | 87 |
Last page: | 95 |
Abstract: | It has been shown that many kernel methods can be equivalently formulated as minimal enclosing ball (MEB) problems in a certain feature space. Exploiting this reduction, efficient algorithms to scale up Support Vector Machines (SVMs) and other kernel methods have been introduced under the name of Core Vector Machines (CVMs). In this paper, we study a new algorithm to train SVMs based on an instance of the Frank-Wolfe optimization method recently proposed to approximate the solution of the MEB problem. We show that, specialized to SVM training, this algorithm can scale better than CVMs at the price of a slightly lower accuracy. |
Date of final version in UGOV: | 18-Nov-2010 |
Appears in types: | 4.01 Contribution in Conference Proceedings |
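
As context for the abstract's reference to approximating the minimal enclosing ball (MEB), the following is a minimal illustrative sketch, assuming plain Euclidean input points and the classical Badoiu-Clarkson update, which can be read as a simple Frank-Wolfe iteration with step size 1/(k+1). The function approx_meb, its parameters, and the iteration bound are illustrative assumptions only; they do not reproduce the paper's algorithm, which operates in a kernel-induced feature space and is specialized to SVM training.

# Illustrative sketch only: a generic Badoiu-Clarkson / Frank-Wolfe-style
# iteration that approximates the minimal enclosing ball (MEB) of a point
# cloud in Euclidean space. This is NOT the paper's algorithm, which works
# in a kernel feature space and is specialized to SVM training.
import numpy as np

def approx_meb(points, epsilon=0.01):
    """Return an approximate MEB center and radius for `points` (n x d array).

    Runs on the order of 1/epsilon^2 iterations; each step moves the current
    center toward the farthest point with a diminishing step size 1/(k+1),
    a classical Frank-Wolfe-style update for the MEB objective.
    """
    points = np.asarray(points, dtype=float)
    center = points[0].copy()                # arbitrary starting center
    n_iter = int(np.ceil(1.0 / epsilon**2))  # iteration bound from the standard analysis
    for k in range(1, n_iter + 1):
        dists = np.linalg.norm(points - center, axis=1)
        farthest = points[np.argmax(dists)]  # "linear oracle": farthest point from the center
        center += (farthest - center) / (k + 1)
    radius = np.linalg.norm(points - center, axis=1).max()
    return center, radius

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.normal(size=(500, 2))
    c, r = approx_meb(pts, epsilon=0.05)
    print("center:", c, "radius:", r)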