Galvan, G.; Lapucci, M.; Lin, C.-J.; Sciandrone, M. A Two-Level Decomposition Framework Exploiting First and Second Order Information for SVM Training Problems. Journal of Machine Learning Research (ISSN 1532-4435), 22(23):1-38, 2021.

A Two-Level Decomposition Framework Exploiting First and Second Order Information for SVM Training Problems

Galvan, G.; Lapucci, M.; Lin, C.-J.; Sciandrone, M.
2021

Abstract

In this work we present a novel way to solve the sub-problems that arise when using decomposition algorithms to train Support Vector Machines (SVMs). State-of-the-art Sequential Minimal Optimization (SMO) solvers reduce the original problem to a sequence of two-variable sub-problems whose solution is analytical. Although considering more than two variables at a time usually results in fewer iterations being needed to train an SVM model, solving the sub-problem becomes much harder and the overall computational gains are limited, if any. We propose to apply the two-variable decomposition method to solve the sub-problems themselves, and we show experimentally that this is a viable and efficient way to deal with sub-problems of up to 50 variables. As a second contribution, we explore different ways to select the working set and its size, combining first-order and second-order working set selection rules with a strategy for exploiting cached elements of the Hessian matrix. An extensive numerical comparison shows that the method performs considerably better than state-of-the-art software.
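To make the two-level idea concrete, below is a minimal NumPy sketch (not the authors' implementation): an outer decomposition loop selects a working set of q indices with a simple first-order (maximal-violation) rule, and the restricted sub-problem on that set is itself solved by repeated two-variable SMO steps with the analytical update. Function names, the working-set rule, and the tolerances are illustrative assumptions; the paper's actual framework combines first-order and second-order selection rules and a strategy for reusing cached Hessian columns, which this sketch omits.

```python
import numpy as np

TAU = 1e-12  # guard against non-positive curvature in the analytical step


def violation_scores(alpha, G, y, C):
    """Return the KKT scores -y_t * G_t together with the I_up / I_low masks
    that mark which indices can still move in each direction."""
    up = ((y == 1) & (alpha < C)) | ((y == -1) & (alpha > 0))
    low = ((y == 1) & (alpha > 0)) | ((y == -1) & (alpha < C))
    return -y * G, up, low


def smo_on_working_set(Q, y, C, alpha, G, B, tol=1e-5, max_iter=1000):
    """Inner level: two-variable SMO steps restricted to the index set B.
    alpha and the full dual gradient G = Q @ alpha - 1 are updated in place."""
    for _ in range(max_iter):
        score, up, low = violation_scores(alpha, G, y, C)
        up_B = [t for t in B if up[t]]
        low_B = [t for t in B if low[t]]
        if not up_B or not low_B:
            break
        i = max(up_B, key=lambda t: score[t])   # most violating index in I_up
        j = min(low_B, key=lambda t: score[t])  # most violating index in I_low
        if score[i] - score[j] <= tol:
            break
        # Analytical step along the direction d_i = y_i, d_j = -y_j,
        # which keeps the equality constraint y' alpha = 0 satisfied.
        a = Q[i, i] + Q[j, j] - 2.0 * y[i] * y[j] * Q[i, j]
        t_star = (score[i] - score[j]) / max(a, TAU)
        # Largest step that keeps both alpha_i and alpha_j inside [0, C].
        room_i = C - alpha[i] if y[i] == 1 else alpha[i]
        room_j = alpha[j] if y[j] == 1 else C - alpha[j]
        t = min(t_star, room_i, room_j)
        alpha[i] += t * y[i]
        alpha[j] -= t * y[j]
        G += t * (y[i] * Q[:, i] - y[j] * Q[:, j])


def two_level_train(K, y, C=1.0, q=10, tol=1e-3, max_outer=100000):
    """Outer level: pick up to q indices by largest KKT violation and hand the
    restricted sub-problem to the inner SMO loop."""
    n = len(y)
    Q = (y[:, None] * y[None, :]) * K    # Q_ij = y_i * y_j * K(x_i, x_j)
    alpha = np.zeros(n)
    G = -np.ones(n)                      # gradient of the dual at alpha = 0
    for _ in range(max_outer):
        score, up, low = violation_scores(alpha, G, y, C)
        m = score[up].max() if up.any() else -np.inf
        M = score[low].min() if low.any() else np.inf
        if m - M <= tol:                 # approximate KKT conditions hold
            break
        # Working set: q/2 most violating indices from I_up and from I_low.
        ups = sorted(np.where(up)[0], key=lambda t: -score[t])[: q // 2]
        lows = sorted(np.where(low)[0], key=lambda t: score[t])[: q // 2]
        B = list(dict.fromkeys([*ups, *lows]))
        smo_on_working_set(Q, y, C, alpha, G, B)
    return alpha
```

With q = 2 the outer loop reduces to standard maximal-violating-pair SMO; the abstract reports that solving the sub-problems by inner SMO remains viable and efficient for working sets of up to 50 variables.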
SVM; Support Vector Machines; Decomposition Method; Working Set Selection; Sequential Minimal Optimization
01 Journal publication::01a Journal article
Files attached to this record

File: Galvan_A-Two-Level_2021.pdf
Access: open access
Type: Publisher's version (published version with the publisher's layout)
License: Creative Commons
Size: 1.13 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1625411
Citations
  • PubMed Central: ND
  • Scopus: 4
  • Web of Science: 2