Word Sense Embedded in Geometric Spaces - From Induction to Applications using Machine Learning
Licentiate thesis, 2016

Words are not detached individuals but part of a beautifully interconnected web of related concepts, and to capture the full complexity of this web they need to be represented in a way that encapsulates all the semantic and syntactic facets of the language. Further, to enable computational processing they need to be expressed in a consistent manner so that similar properties are encoded in a similar way. In this thesis, dense real-valued vector representations, i.e. word embeddings, are extended and studied for their applicability to natural language processing (NLP). Word embeddings of two distinct flavours are presented as part of this thesis: sense-aware word representations, where different word senses are represented as distinct objects, and grounded word representations, which are learned using multi-agent deep reinforcement learning to explicitly express properties of the physical world while the agents learn to play Guess Who?. The empirical usefulness of word embeddings is evaluated by employing them in a series of NLP applications: word sense induction, word sense disambiguation, and automatic document summarisation. The results show great potential for word embeddings: they outperform previous state-of-the-art methods in two of the three applications, and achieve a statistically equivalent result in the third while using a much simpler model than previous work.
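The core idea above — that similar properties are encoded in a similar way — means that semantically related words end up close to each other in the vector space. The thesis itself contains no code at this point, but the intuition can be sketched with hypothetical toy vectors and cosine similarity (the vectors below are made up for illustration; real embeddings are learned from corpora and typically have hundreds of dimensions):

```python
import numpy as np

# Hypothetical 3-dimensional word embeddings, for illustration only.
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1]),
    "queen": np.array([0.7, 0.7, 0.2]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Related concepts lie closer together in the embedding space.
sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])
print(sim_royal > sim_fruit)  # prints True
```

The sense-aware representations studied in the thesis extend this picture by giving each word *sense* (e.g. "apple" the fruit vs. "Apple" the company) its own vector rather than collapsing all senses into one.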

word sense induction

word sense disambiguation

word embeddings

extractive summarisation

neural networks

deep learning

natural language processing

reinforcement learning

HC1, Hörsalsvägen 14, Chalmers
Opponent: Richard Socher, Chief Scientist at Salesforce, San Francisco, USA


Mikael Kågebäck

Chalmers, Computer Science and Engineering, Computing Science

Extractive Summarization using Continuous Vector Space Models

Proceedings of the 2nd Workshop on Continuous Vector Space Models and their Compositionality (CVSC), EACL, April 26–30, 2014, Gothenburg, Sweden (2014), pp. 31–39

Paper in proceedings

Extractive summarization by aggregating multiple similarities

International Conference on Recent Advances in Natural Language Processing (RANLP), Vol. 2015 (2015), pp. 451–457

Paper in proceedings

Neural context embeddings for automatic discovery of word senses

Proceedings of the 1st Workshop on Vector Space Modeling for Natural Language Processing, Denver, United States (2015), pp. 25–32

Paper in proceedings

E. Jorge, M. Kågebäck, E. Gustavsson, Learning to Play Guess Who? and Inventing a Grounded Language as a Consequence

M. Kågebäck and H. Salomonsson, Word Sense Disambiguation using a Bidirectional LSTM


Language Technology (Computational Linguistics)

Comparative Language Studies and General Linguistics

Computer Science



