Dynamic Hyperparameter Scaling Method for LVQ Algorithms
Citation key Seo2006
Author Seo, S. and Obermayer, K.
Title of Book IJCNN 2006 Conference Proceedings
Pages 3196 – 3203
Year 2006
ISBN 0-7803-9490-9
ISSN 2161-4393
DOI 10.1109/IJCNN.2006.247304
Publisher IEEE
Abstract We propose a new annealing method for the hyperparameters of several recent Learning Vector Quantization algorithms. We first analyze the relationship between the values assigned to the hyperparameters, the on-line learning process, and the structure of the resulting classifier. Motivated by these results, we then suggest an annealing method in which each hyperparameter is initially set to a large value and is then slowly decreased during learning. We apply the annealing method to the LVQ 2.1, SLVQ-LR, and RSLVQ methods, and we compare the generalization performance achieved with the new annealing method against a standard hyperparameter selection using 10-fold cross validation. Benchmark results are provided for the letter and pendigits datasets from the UCI Machine Learning Repository. The new selection method provides equally good or, for some datasets, even superior results compared to standard selection methods. More importantly, however, the number of learning trials needed for different hyperparameter values is drastically reduced. The results are insensitive to the form and parameters of the annealing schedule.
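The annealing idea described in the abstract (initialize each hyperparameter to a large value, then decrease it slowly over the course of learning) can be sketched as below. The exponential schedule, function name, and constants are illustrative assumptions, not the paper's exact formulation; the abstract itself notes that results are insensitive to the schedule's form and parameters.

```python
import math

def annealed_hyperparameter(t, initial=10.0, final=0.1, decay=1e-3):
    """Illustrative annealing schedule (assumed exponential form):
    starts at `initial` at step t=0 and decays toward `final`."""
    return final + (initial - final) * math.exp(-decay * t)

# Example: the hyperparameter shrinks monotonically during on-line learning.
for t in (0, 1000, 5000, 20000):
    print(t, round(annealed_hyperparameter(t), 4))
```

In an on-line LVQ-style training loop, `annealed_hyperparameter(t)` would be evaluated at each update step `t` and plugged in wherever the algorithm uses the hyperparameter (e.g. the softness parameter in soft assignments), replacing the single fixed value that cross-validation would otherwise have to select.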