
A Convergent Hybrid Decomposition Algorithm Model for SVM Training

Lucidi, Stefano; Palagi, Laura; Risi, Arnaldo; Sciandrone, M.
2009

Abstract

Training of support vector machines (SVMs) requires solving a linearly constrained convex quadratic problem. In real applications, the number of training data may be very large and the Hessian matrix cannot be stored. To address this issue, a common strategy is to use decomposition algorithms, which at each iteration operate only on a small subset of variables, usually referred to as the working set. Training time can be significantly reduced by a caching technique that allocates some memory to store the columns of the Hessian matrix corresponding to the recently updated variables. The convergence properties of a decomposition method can be guaranteed by a suitable selection of the working set, and this can limit the possibility of exploiting the information stored in the cache. We propose a general hybrid algorithm model which combines the capability of producing a globally convergent sequence of points with a flexible use of the information in the cache. As an example of a specific realization of the general hybrid model, we describe an algorithm based on a particular strategy for exploiting the information deriving from the caching technique. We report the results of computational experiments performed with simple implementations of this algorithm. The numerical results highlight the potential of the approach.
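The abstract describes decomposition training only in general terms; the short sketch below illustrates the kind of iteration involved: a plain SMO-style decomposition step with maximum-violating-pair working set selection and a small LRU cache of kernel (Hessian) columns. This is not the hybrid algorithm model proposed in the paper, only a minimal baseline written under assumptions: the RBF kernel and illustrative names such as rbf_column, train_svm_decomposition, max_cache_cols, and tol are hypothetical, not from the source.

import numpy as np
from collections import OrderedDict

def rbf_column(X, i, gamma=0.5):
    # Column i of the RBF kernel matrix, computed on demand (hypothetical kernel choice).
    d = np.sum((X - X[i]) ** 2, axis=1)
    return np.exp(-gamma * d)

def train_svm_decomposition(X, y, C=1.0, tol=1e-3, max_iter=10000, max_cache_cols=64):
    # Solve min 0.5*a'Qa - e'a  s.t.  y'a = 0, 0 <= a <= C, with Q_ij = y_i*y_j*K(x_i, x_j).
    n = X.shape[0]
    alpha = np.zeros(n)
    G = -np.ones(n)               # gradient of the objective at alpha = 0
    cache = OrderedDict()         # index -> kernel column (simple LRU cache)

    def kcol(i):
        if i in cache:
            cache.move_to_end(i)  # reuse a cached column whenever possible
        else:
            if len(cache) >= max_cache_cols:
                cache.popitem(last=False)
            cache[i] = rbf_column(X, i)
        return cache[i]

    for _ in range(max_iter):
        # Maximum violating pair: the pair (i, j) that most violates the KKT conditions.
        up = ((y > 0) & (alpha < C)) | ((y < 0) & (alpha > 0))
        low = ((y > 0) & (alpha > 0)) | ((y < 0) & (alpha < C))
        viol = -y * G
        i = np.where(up)[0][np.argmax(viol[up])]
        j = np.where(low)[0][np.argmin(viol[low])]
        if viol[i] - viol[j] <= tol:
            break                 # KKT violation below tolerance: stop

        Ki, Kj = kcol(i), kcol(j)
        quad = Ki[i] + Kj[j] - 2.0 * Ki[j]
        t = (viol[i] - viol[j]) / max(quad, 1e-12)    # unconstrained step length
        # Clip the step so that both updated variables stay inside [0, C].
        bound_i = C - alpha[i] if y[i] > 0 else alpha[i]
        bound_j = alpha[j] if y[j] > 0 else C - alpha[j]
        t = min(t, bound_i, bound_j)

        d_ai, d_aj = y[i] * t, -y[j] * t
        alpha[i] += d_ai
        alpha[j] += d_aj
        # The full gradient update needs only the two kernel columns held in the cache.
        G += (y * y[i] * Ki) * d_ai + (y * y[j] * Kj) * d_aj
    return alpha

Given a feature matrix X of shape (n, d) and labels y in {-1, +1}, train_svm_decomposition(X, y) returns the dual variables; bias recovery, shrinking, and the flexible cache-exploitation strategy of the hybrid model are all omitted for brevity.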
caching technique; convergent hybrid schemes; large-scale quadratic optimization; maximum violating pair; support vector machines (SVMs); working set selection; working set selection rule
01 Journal publication::01a Journal article
A Convergent Hybrid Decomposition Algorithm Model for SVM Training / Lucidi, Stefano; Palagi, Laura; Risi, Arnaldo; Sciandrone, M. - In: IEEE TRANSACTIONS ON NEURAL NETWORKS. - ISSN 1045-9227. - PRINT. - 20:6 (2009), pp. 1055-1060. [10.1109/tnn.2009.2020908]
Files attached to this item
File: VE_2009_11573-44387.pdf (restricted to archive administrators; contact the author for access)
Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 255.3 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/44387

Citations
  • PubMed Central: 0
  • Scopus: 21
  • ISI Web of Science: 17