Geometric Representation Learning

Geometric methods for representation learning on structured data such as graphs, trees, and text.

(Image by Mercatp)

Knowledge Graph Embeddings

Statistical machine learning on relational knowledge representations.

(Image by Anders Sandberg)

Selected Publications

We are concerned with the discovery of hierarchical relationships from large-scale unstructured similarity scores. For this purpose, we study different models of hyperbolic space and find that learning embeddings in the Lorentz model is substantially more efficient than in the Poincaré-ball model. We show that the proposed approach allows us to learn high-quality embeddings of large taxonomies which yield improvements over Poincaré embeddings, especially in low dimensions. Lastly, we apply our model to discover hierarchies in two real-world datasets: we show that an embedding in hyperbolic space can reveal important aspects of a company’s organizational structure as well as reveal historical relationships between language families.
ICML, 2018
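The efficiency of the Lorentz model noted above comes from its closed-form geodesic distance: arcosh of the negative Lorentzian inner product between two points on the hyperboloid. A minimal NumPy sketch (illustrative helper names, not the paper's implementation):

```python
import numpy as np

def lorentz_inner(u, v):
    """Lorentzian inner product: -u0*v0 + sum_i ui*vi."""
    return -u[0] * v[0] + np.dot(u[1:], v[1:])

def lorentz_distance(u, v):
    """Geodesic distance on the hyperboloid: arcosh(-<u, v>_L)."""
    inner = -lorentz_inner(u, v)
    # Clamp to the domain of arcosh to guard against rounding error.
    return np.arccosh(np.maximum(inner, 1.0))

def lift_to_hyperboloid(x):
    """Lift x in R^n onto the hyperboloid by solving
    -x0^2 + ||x||^2 = -1 for the time-like coordinate x0."""
    x0 = np.sqrt(1.0 + np.dot(x, x))
    return np.concatenate(([x0], x))

u = lift_to_hyperboloid(np.array([0.3, -0.1]))
v = lift_to_hyperboloid(np.array([-0.2, 0.4]))
print(lorentz_distance(u, v))  # ≈ 0.69
```

Because the distance and its gradient avoid the fraction-of-norms expression used in the Poincaré ball, optimization in this model is numerically better behaved, which is one source of the efficiency gains reported above.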

Representation learning has become an invaluable approach for learning from symbolic data such as text and graphs. However, state-of-the-art embedding methods typically do not account for latent hierarchical structures which are characteristic of many complex symbolic datasets. In this work, we introduce a new approach for learning hierarchical representations of symbolic data by embedding them into hyperbolic space – or more precisely into an n-dimensional Poincaré ball. Due to the underlying hyperbolic geometry, this allows us to learn parsimonious representations of symbolic data by simultaneously capturing hierarchy and similarity. We present an efficient algorithm to learn the embeddings based on Riemannian optimization and show experimentally that Poincaré embeddings can outperform Euclidean embeddings significantly on data with latent hierarchies, both in terms of representation capacity and in terms of generalization ability.
NIPS, 2017
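The two ingredients described above, the Poincaré-ball distance and a Riemannian SGD step that rescales the Euclidean gradient by the inverse metric tensor, can be sketched as follows (a simplified illustration, not the released code; the small constant `eps` is a retraction margin keeping parameters inside the open unit ball):

```python
import numpy as np

def poincare_distance(u, v):
    """Distance in the Poincaré ball:
    arcosh(1 + 2||u - v||^2 / ((1 - ||u||^2)(1 - ||v||^2)))."""
    sq = np.sum((u - v) ** 2)
    alpha = 1.0 - np.sum(u ** 2)
    beta = 1.0 - np.sum(v ** 2)
    return np.arccosh(1.0 + 2.0 * sq / (alpha * beta))

def riemannian_update(theta, euclidean_grad, lr=0.01, eps=1e-5):
    """One Riemannian SGD step: rescale the Euclidean gradient by the
    inverse Poincaré metric, (1 - ||theta||^2)^2 / 4, then project the
    result back into the open unit ball if it escaped."""
    scale = (1.0 - np.sum(theta ** 2)) ** 2 / 4.0
    theta = theta - lr * scale * euclidean_grad
    norm = np.linalg.norm(theta)
    if norm >= 1.0:
        theta = theta / norm - eps  # retract just inside the boundary
    return theta

origin = np.zeros(2)
print(poincare_distance(origin, np.array([0.5, 0.0])))  # ln 3 ≈ 1.0986
```

The metric rescaling is what makes the optimization hierarchy-aware: near the boundary of the ball the rescaling factor shrinks, so points deep in the hierarchy move in ever finer steps.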

Relational learning is becoming increasingly important in many areas of application. Here, we present a novel approach to relational learning based on the factorization of a three-way tensor. We show that unlike other tensor approaches, our method is able to perform collective learning via the latent components of the model and provide an efficient algorithm to compute the factorization. We substantiate our theoretical considerations regarding the collective learning capabilities of our model by means of experiments on both a new dataset and a dataset commonly used in entity resolution. Furthermore, we show on common benchmark datasets that our approach achieves better or on-par results compared to current state-of-the-art relational learning solutions, while being significantly faster to compute.
ICML, 2011
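The model above factorizes each relation's adjacency matrix as X_k ≈ A R_k Aᵀ, with a shared entity matrix A that enables collective learning across relations. The paper derives an efficient alternating least-squares algorithm; the plain gradient-descent sketch below (illustrative only, with hypothetical names) conveys the structure of the bilinear model:

```python
import numpy as np

def rescal_gd(X, rank, steps=500, lr=0.01, reg=0.01, seed=0):
    """Fit X_k ≈ A @ R_k @ A.T by plain gradient descent.
    X: array of shape (n_relations, n_entities, n_entities)."""
    rng = np.random.default_rng(seed)
    m, n, _ = X.shape
    A = rng.normal(scale=0.1, size=(n, rank))       # shared entity factors
    R = rng.normal(scale=0.1, size=(m, rank, rank)) # per-relation cores
    for _ in range(steps):
        grad_A = np.zeros_like(A)
        for k in range(m):
            E = A @ R[k] @ A.T - X[k]  # residual for relation k
            # d/dA of 0.5*||E||^2, accounting for A on both sides:
            grad_A += E @ A @ R[k].T + E.T @ A @ R[k]
            R[k] -= lr * (A.T @ E @ A + reg * R[k])
        A -= lr * (grad_A + reg * A)
    return A, R
```

Because A appears on both sides of every relation's factorization, evidence from one relation updates the entity representations used by all others; this is the mechanism behind the collective learning the abstract refers to.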

Recent Workshops & Symposia

Understanding intelligence and the brain requires theories at different levels, ranging from the biophysics of single neurons to algorithms, computations, and a theory of learning. In this symposium, we aim to bring together researchers from machine learning, artificial intelligence, neuroscience, and cognitive science to present and discuss state-of-the-art research that is focused on understanding intelligence at these different levels.

Recent Publications

More Publications

Hearst Patterns Revisited: Automatic Hypernym Detection from Large Text Corpora. ACL, 2018.

PDF Code

Learning visually grounded sentence representations. NAACL, 2018.

PDF Video

Separating Self-Expression and Visual Content in Hashtag Supervision. CVPR, 2018.


Fast Linear Model for Knowledge Graph Embeddings. AKBC, 2017.

PDF Project

Holographic Embeddings of Knowledge Graphs. AAAI, 2016.

PDF Code Project

A Review of Relational Machine Learning for Knowledge Graphs. Proc. IEEE, 2016.

Preprint PDF Project

Querying the Web with Statistical Machine Learning. THESEUS, 2014.

Preprint PDF