TU Berlin
Machine Learning

An SMO algorithm for the Potential Support Vector Machine
Citation key Knebel2008
Author Knebel, T. and Hochreiter, S. and Obermayer, K.
Pages 271–287
Year 2008
DOI 10.1162/neco.2008.20.1.271
Journal Neural Comput.
Volume 20
Number 1
Publisher MIT Press
Abstract We describe a fast Sequential Minimal Optimization (SMO) procedure for solving the dual optimization problem of the recently proposed Potential Support Vector Machine (P-SVM). The new SMO consists of a sequence of iteration steps in which the Lagrangian is optimized with respect to either one (single SMO) or two (dual SMO) of the Lagrange multipliers while keeping the other variables fixed. An efficient selection procedure for Lagrange multipliers is given, and two heuristics for improving the SMO procedure are described: block optimization and annealing of the regularization parameter ε. A comparison between the different variants shows that the dual SMO, including block optimization and annealing, performs most efficiently in terms of computation time. In contrast to standard Support Vector Machines (SVMs), the P-SVM is applicable to arbitrary dyadic datasets, but benchmarks are provided against libSVM's ε-SVR and C-SVC implementations for problems which are also solvable by standard SVM methods. For those problems, the computation time of the P-SVM is comparable to or somewhat higher than that of the standard SVM. The number of support vectors found by the P-SVM is usually much smaller for the same generalization performance.
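The "single SMO" idea summarized in the abstract — optimizing the dual objective with respect to one Lagrange multiplier at a time while all others stay fixed — can be illustrated with a generic box-constrained quadratic dual. The sketch below is an assumption-laden toy, not the paper's P-SVM algorithm: the objective, the bound `C`, and the gradient-based multiplier selection are generic stand-ins, and the ε-annealing and block-optimization heuristics are omitted.

```python
import numpy as np

def smo_single(Q, b, C, tol=1e-8, max_iter=10000):
    """Illustrative single-coordinate SMO-style solver (a sketch, not the
    P-SVM dual from the paper): minimizes 0.5*a^T Q a - b^T a subject to
    0 <= a <= C by updating one Lagrange multiplier at a time."""
    n = len(b)
    a = np.zeros(n)
    g = -b.astype(float).copy()        # gradient Q a - b at a = 0
    for _ in range(max_iter):
        # projected gradient: zero where a bound blocks further progress
        pg = np.where((a <= 0.0) & (g > 0.0), 0.0,
             np.where((a >= C) & (g < 0.0), 0.0, g))
        i = int(np.argmax(np.abs(pg)))
        if abs(pg[i]) < tol:           # KKT conditions met to tolerance
            break
        # unconstrained minimizer along coordinate i, clipped to the box
        new_ai = np.clip(a[i] - g[i] / Q[i, i], 0.0, C)
        g += Q[:, i] * (new_ai - a[i]) # incremental gradient maintenance
        a[i] = new_ai
    return a
```

A "dual SMO" step would instead pick a pair of multipliers and solve the resulting two-variable subproblem in closed form; the selection heuristic (here, the largest projected-gradient component) is what the paper's efficient selection procedure refines.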
Link to original publication: https://doi.org/10.1162/neco.2008.20.1.271
