Word Sense Embedded in Geometric Spaces - From Induction to Applications using Machine Learning
Licentiate thesis, 2016
word sense induction
word sense disambiguation
word embeddings
extractive summarisation
neural networks
deep learning
natural language processing
reinforcement learning
Author
Mikael Kågebäck
Chalmers University of Technology, Department of Computer Science and Engineering, Computing Science
Extractive Summarization using Continuous Vector Space Models
Proceedings of the 2nd Workshop on Continuous Vector Space Models and their Compositionality (CVSC), EACL, April 26-30, 2014, Gothenburg, Sweden (2014), p. 31-39
Paper in proceeding
Extractive summarization by aggregating multiple similarities
International Conference on Recent Advances in Natural Language Processing (RANLP), Vol. 2015 (2015), p. 451-457
Paper in proceeding
Neural context embeddings for automatic discovery of word senses
Proceedings of the 1st Workshop on Vector Space Modeling for Natural Language Processing, Denver, United States (2015), p. 25-32
Paper in proceeding
E. Jorge, M. Kågebäck, E. Gustavsson, Learning to Play Guess Who? and Inventing a Grounded Language as a Consequence
M. Kågebäck and H. Salomonsson, Word Sense Disambiguation using a Bidirectional LSTM
Subject Categories
Language Technology (Computational Linguistics)
General Language Studies and Linguistics
Computer Science
Publisher
Chalmers
HC1, Hörsalsvägen 14, Chalmers
Opponent: Richard Socher, Chief Scientist at Salesforce, San Francisco, USA