
TU Berlin


All Publications

Optimal Kernels for Unsupervised Learning
Citation key: Hochreiter2005b
Author: Hochreiter, S. and Obermayer, K.
Title of Book: Proceedings of the International Joint Conference on Neural Networks
Volume: 3
Pages: 1895–1899
Year: 2005
ISBN: 0-7803-9048-2
ISSN: 2161-4393
DOI: 10.1109/IJCNN.2005.1556169
Abstract: We investigate the optimal kernel for sample-based model selection in unsupervised learning when maximum likelihood approaches are intractable. Given a set of training data and a set of data generated by the model, two kernel density estimators are constructed. A model is selected through gradient descent w.r.t. the model parameters on the integrated squared difference between the density estimators. First, we prove that convergence is optimal, i.e. that the cost function has only one global minimum w.r.t. the locations of the model samples, if and only if the kernel in the reparametrized cost function is a Coulomb kernel. As a consequence, the Gaussian kernels commonly used for density estimators are suboptimal. Second, we show that the absolute value of the difference between model and reference density converges at least as fast as 1/t. Finally, we apply the new methods to distribution-free ICA and to nonlinear ICA.
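The selection procedure in the abstract can be sketched in one dimension: build a kernel density estimate from fixed training data and another from movable model samples, then run gradient descent on the model sample locations to shrink the integrated squared difference (ISE) between the two estimates. This is a minimal illustrative sketch, not the paper's implementation; it deliberately uses the common Gaussian kernel that the abstract identifies as suboptimal (rather than the Coulomb kernel), and the data distributions, bandwidth `h`, learning rate, and step count are all assumed values chosen for the demo.

```python
import numpy as np

SQRT2PI = np.sqrt(2.0 * np.pi)

def kde(x, samples, h):
    """Gaussian kernel density estimate of `samples`, evaluated at points `x`."""
    d = x[:, None] - samples[None, :]
    return (np.exp(-0.5 * (d / h) ** 2) / (h * SQRT2PI)).mean(axis=1)

rng = np.random.default_rng(0)
train = rng.normal(2.0, 0.5, size=200)   # fixed reference (training) data
model = rng.normal(0.0, 1.0, size=50)    # model samples: the parameters we optimize

grid = np.linspace(-6.0, 8.0, 600)       # integration grid for the ISE
dx = grid[1] - grid[0]
h = 0.3                                  # kernel bandwidth (assumed value)

def ise(model):
    """Integrated squared difference between the two density estimates."""
    diff = kde(grid, train, h) - kde(grid, model, h)
    return np.sum(diff ** 2) * dx

def ise_grad(model):
    """Gradient of the ISE w.r.t. each model sample location m_j."""
    diff = kde(grid, model, h) - kde(grid, train, h)   # q - p on the grid
    d = grid[:, None] - model[None, :]
    K = np.exp(-0.5 * (d / h) ** 2) / (h * SQRT2PI)
    # d q(x) / d m_j = K(x - m_j) * (x - m_j) / (h^2 * N)
    return 2.0 * np.sum(diff[:, None] * K * d, axis=0) * dx / (h ** 2 * model.size)

ise_start = ise(model)
lr = 1.0
for _ in range(300):                     # plain gradient descent on sample locations
    model = model - lr * ise_grad(model)
ise_end = ise(model)
# The ISE shrinks as the model samples migrate toward the training data.
```

With a Gaussian kernel the cost surface has local minima (the paper's point), so this sketch can stall; the Coulomb kernel result guarantees a single global minimum w.r.t. the model sample locations.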
