Word Sense Embedded in Geometric Spaces - From Induction to Applications using Machine Learning
Licentiate thesis, 2016
word sense induction
word sense disambiguation
word embeddings
extractive summarisation
neural networks
deep learning
natural language processing
reinforcement learning
Author
Mikael Kågebäck
Chalmers, Computer Science and Engineering, Computer Science
Extractive Summarization using Continuous Vector Space Models
Proceedings of the 2nd Workshop on Continuous Vector Space Models and their Compositionality (CVSC), EACL, April 26-30, 2014, Gothenburg, Sweden (2014), p. 31-39
Paper in proceedings
Extractive summarization by aggregating multiple similarities
International Conference on Recent Advances in Natural Language Processing (RANLP), Vol. 2015 (2015), p. 451-457
Paper in proceedings
Neural context embeddings for automatic discovery of word senses
Proceedings of the 1st Workshop on Vector Space Modeling for Natural Language Processing, Denver, United States (2015), p. 25-32
Paper in proceedings
E. Jorge, M. Kågebäck, E. Gustavsson, Learning to Play Guess Who? and Inventing a Grounded Language as a Consequence
M. Kågebäck and H. Salomonsson, Word Sense Disambiguation using a Bidirectional LSTM
Subject categories
Language Technology (Computational Linguistics)
Comparative and General Linguistics
Computer Science
Publisher
Chalmers
HC1, Hörsalsvägen 14, Chalmers
Opponent: Richard Socher, Chief Scientist at Salesforce, San Francisco, USA