TU Berlin

Neural Information Processing




Soft Nearest Prototype Classification
Citation key Seo2003a
Author Seo, S. and Bode, M. and Obermayer, K.
Pages 390–398
Year 2003
DOI 10.1109/TNN.2003.809407
Journal IEEE Transactions on Neural Networks
Volume 14
Publisher IEEE
Abstract We propose a new method for the construction of nearest prototype classifiers which is based on a Gaussian mixture ansatz and which can be interpreted as an annealed version of Learning Vector Quantization. The algorithm performs gradient descent on a cost function that minimizes the classification error on the training set. We investigate the properties of the algorithm and assess its performance for several toy data sets and for an optical letter classification task. Results show (i) that annealing in the dispersion parameter of the Gaussian kernels improves classification accuracy; (ii) that classification results are better than those obtained with standard Learning Vector Quantization (LVQ 2.1, LVQ 3) for equal numbers of prototypes; and (iii) that annealing of the width parameter improves the classification capability. Additionally, the principled approach provides an explanation of a number of features of the (heuristic) LVQ methods.
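The training scheme described in the abstract — soft Gaussian assignments of samples to prototypes, gradient descent on a misclassification cost, and annealing of the kernel width — can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the geometric annealing schedule, learning rate, prototype initialization, and all function names are assumptions.

```python
import numpy as np

def soft_nearest_prototype_fit(X, y, prototypes, proto_labels,
                               sigma_start=2.0, sigma_end=0.2,
                               epochs=50, lr=0.05, seed=0):
    """Illustrative annealed soft nearest-prototype training.

    Each sample is softly assigned to all prototypes via Gaussian
    kernels; a gradient step reduces the expected misclassification,
    while the kernel width sigma is annealed from sigma_start down to
    sigma_end (geometric schedule chosen here for illustration).
    """
    rng = np.random.default_rng(seed)
    W = prototypes.astype(float).copy()
    for sigma in np.geomspace(sigma_start, sigma_end, epochs):
        for i in rng.permutation(len(X)):
            x = X[i]
            d2 = np.sum((W - x) ** 2, axis=1)
            # Soft assignment probabilities P(j|x); subtracting the
            # minimum distance stabilizes the exponentials.
            p = np.exp(-(d2 - d2.min()) / (2 * sigma ** 2))
            p /= p.sum()
            wrong = proto_labels != y[i]
            lc = p[wrong].sum()  # local misclassification cost
            # Gradient step on the cost: wrong-class prototypes are
            # pushed away from x, correct-class ones pulled toward it.
            coef = np.where(wrong, p * (1.0 - lc), -p * lc) / sigma ** 2
            W += lr * coef[:, None] * (W - x)
    return W

def nearest_prototype_predict(X, W, proto_labels):
    """Hard nearest-prototype classification after training."""
    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=-1)
    return proto_labels[d2.argmin(axis=1)]
```

After training, classification is by the single nearest prototype, as in standard LVQ; the soft assignments are used only during learning.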

