TU Berlin

Neural Information Processing: Learning on Structured Representations


Learning on Structured Representations


Learning from examples in order to predict is one of the standard tasks in machine learning. Many techniques have been developed to solve classification and regression problems, but by far most of them were specifically designed for vectorial data. Vectorial data are very convenient because of the structure imposed by the Euclidean metric. For many data sets (protein sequences, text, images, videos, chemical formulas, etc.), however, a vector-based description is not only inconvenient but may simply be wrong, and representations that consider relationships between objects, or that embed objects in spaces with non-Euclidean structure, are often more appropriate.

Here we follow different approaches to extend learning from examples to non-vectorial data. One approach focuses on an extension of kernel methods, leading to learning algorithms specifically designed for relational data representations of a general form. A second approach, designed for objects which are naturally represented in terms of finite combinatorial structures, explores embeddings into quotient spaces of a Euclidean vector space ("structure spaces"). A third approach considers representations of data in spaces with data-adapted geometries, i.e. it uses Riemannian manifolds as models for data spaces. In this context we are also interested in active learning schemes based on geometrical concepts. The developed algorithms have been applied to various application domains, including bio- and chemoinformatics (cf. "Research" page "Applications to Problems in Bio- and Chemoinformatics") and the analysis of multimodal neural data (cf. "Research" page "MRI, EM, Autoradiography, and Multi-modal Data").
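Kernel methods suit relational data because they only need pairwise similarities between objects, never explicit feature vectors. The sketch below illustrates this idea with scikit-learn's precomputed-kernel interface; it is not the P-SVM itself, and the similarity matrix is simulated from hidden vectors purely to keep the example self-contained (in practice it might come from, e.g., alignment scores between protein sequences).

```python
import numpy as np
from sklearn.svm import SVC

# Simulate "relational" data: only pairwise similarities are observed.
# The hidden vectors stand in for objects (sequences, graphs, ...) that
# have no natural vectorial description.
rng = np.random.default_rng(0)
X_hidden = rng.normal(size=(40, 5))
labels = (X_hidden[:, 0] > 0).astype(int)

def similarity(A, B, gamma=0.5):
    # Gram matrix K[i, j] = exp(-gamma * ||a_i - b_j||^2);
    # any symmetric positive semidefinite similarity would do.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Train on the similarities among the first 30 objects, then classify
# the remaining 10 via their similarities to the training objects.
K_train = similarity(X_hidden[:30], X_hidden[:30])   # shape (30, 30)
K_test = similarity(X_hidden[30:], X_hidden[:30])    # shape (10, 30)

clf = SVC(kernel="precomputed").fit(K_train, labels[:30])
acc = (clf.predict(K_test) == labels[30:]).mean()
```

Note that prediction only requires similarities between test and training objects, which is what makes purely relational representations workable.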



Acknowledgement: This work was funded by the BMWA and by the Technical University of Berlin.

Software:

The Potential Support Vector Machine (P-SVM)

Selected Publications:

Cakan, C. and Obermayer, K. (2020). Biophysically grounded mean-field models of neural populations under electrical stimulation. PLOS Computational Biology.



Trowitzsch, I., Schymura, C., Kolossa, D. and Obermayer, K. (2019). Joining Sound Event Detection and Localization Through Spatial Segregation. IEEE Trans. Audio Speech Language Proc.


Oschmann, F., Berry, H., Obermayer, K. and Lenk, K. (2018). From in Silico Astrocyte Cell Models to Neuron-astrocyte Network Models: A Review. Brain Research Bulletin


Aspart, F., Remme, M. and Obermayer, K. (2018). Differential Polarization of Cortical Pyramidal Neuron Dendrites through Weak Extracellular Fields. PLoS Computational Biology


Meyer, R., Ladenbauer, J. and Obermayer, K. (2017). Influence of Mexican Hat Recurrent Connectivity on Noise Correlations and Stimulus Encoding. Frontiers in Computational Neuroscience


Trowitzsch, I., Mohr, J., Kashef, Y. and Obermayer, K. (2017). Robust Detection of Environmental Sounds in Binaural Auditory Scenes. IEEE Transactions on Audio Speech and Language Processing, 25, 1344-1356.


Oschmann, F., Mergenthaler, K., Jungnickel, E. and Obermayer, K. (2017). Spatial Separation of Two Different Pathways Accounting for the Generation of Calcium Signals in Astrocytes. PLoS Computational Biology, 13


Augustin, M., Ladenbauer, J., Baumann, F. and Obermayer, K. (2017). Low-dimensional spike rate models derived from networks of adaptive integrate-and-fire neurons: Comparison and implementation. PLoS Computational Biology, 13


Guo, R., Böhmer, W., Hebart, M., Chien, S., Sommer, T., Obermayer, K. and Gläscher, J. (2016). Interaction of Instrumental and Goal-directed Learning Modulates Prediction Error Representations in the Ventral Striatum. Journal of Neuroscience, 36, 12650-12660.


Meyer, R. and Obermayer, K. (2016). pypet: A Python Toolkit for Data Management of Parameter Explorations. Frontiers Neuroinformatics, 10


Aspart, F., Ladenbauer, J. and Obermayer, K. (2016). Extending Integrate-and-fire Model Neurons to Account for the Effects of Weak Electric Fields and Input Filtering Mediated by the Dendrite. PLOS Computational Biology, 12, e1005206.


Donner, C., Obermayer, K. and Shimazaki, H. (2016). Approximate Inference for Time-varying Interactions and Macroscopic Dynamics of Neural Populations. PLoS Computational Biology, 13, 1–27.


Jain, B. and Obermayer, K. (2010). Large Sample Statistics in the Domain of Graphs. Structural, Syntactic, and Statistical Pattern Recognition. Springer Berlin Heidelberg, 690–697. doi:10.1007/978-3-642-14980-1_10


Jain, B. and Obermayer, K. (2009). Algorithms for the Sample Mean of Graphs. Lecture Notes in Computer Science, 351–359.


Jain, B. and Obermayer, K. (2009). Structure Spaces. Journal of Machine Learning Research, 10, 2667–2714.


Knebel, T., Hochreiter, S. and Obermayer, K. (2008). An SMO algorithm for the Potential Support Vector Machine. Neural Comput., 20, 271–287.


Henrich, F. and Obermayer, K. (2008). Active Learning by Spherical Subdivision. Journal of Machine Learning Research, 9, 105–130.


Hochreiter, S. and Obermayer, K. (2006). Support Vector Machines for Dyadic Data. Neural Comput., 18, 1472–1510.


Hochreiter, S. and Obermayer, K. (2006). Nonlinear Feature Selection with the Potential Support Vector Machine. Feature Extraction: Foundations and Applications. Springer Berlin Heidelberg, 419–438. doi:10.1007/978-3-540-35488-8_20

