Optimization methods with performance guarantees for machine learning methods
Research project, 2019–2022
Machine learning methods play an ever-increasing role in today's society. Intelligent systems have traditionally been used for applications requiring large-scale data analysis, such as recommender systems, but are now appearing in highly complex tasks such as self-driving cars. This project studies inference methods for learning and dimensionality reduction related to matrix factorization.

Learning a low-dimensional subspace from high-dimensional data is a core problem in many branches of science, with applications in computer vision, bioinformatics, control theory and many other areas. In its simplest form, subspace learning amounts to finding a low-rank approximation of a matrix containing the observed data. Current state-of-the-art methods suffer from significant shortcomings that limit their applicability. For example, many formulations are purely data driven and do not allow incorporation of prior knowledge about the observed system, so their success hinges on the availability of data sets large enough to infer all model characteristics.

This project aims to develop flexible methods that simultaneously incorporate multiple subspace constraints and priors, making them capable of learning compact models from a minimum amount of data. We design formulations that model the problem more accurately than current methods while still allowing efficient inference. We aim to develop new strong relaxations, both convex and non-convex, and algorithms that scale well beyond today's standard.
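As a rough illustration of the core problem (not of the methods developed in the project), the sketch below shows the simplest form of subspace learning: the best rank-r approximation of a data matrix via the truncated SVD, followed by singular value soft-thresholding, which is the proximal step behind the standard nuclear-norm convex relaxation of the rank constraint. All variable names, sizes, and the threshold value are illustrative assumptions.

```python
# Minimal sketch: low-rank approximation and a nuclear-norm-style relaxation.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a rank-3 matrix observed with additive noise (assumed setup).
m, n, r = 100, 80, 3
X_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
X = X_true + 0.1 * rng.standard_normal((m, n))

# Truncated SVD gives the best rank-r approximation in Frobenius norm
# (Eckart-Young theorem).
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
print("rank-r approximation error:", np.linalg.norm(X - X_r))

# A common convex relaxation replaces the non-convex rank constraint with a
# nuclear-norm penalty; its proximal operator is singular value
# soft-thresholding, shown here with an arbitrarily chosen threshold tau.
tau = 1.0
X_nuc = U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt
print("rank after soft-thresholding:", np.linalg.matrix_rank(X_nuc))
```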
Participants
Carl Olsson (contact)
Digitala bildsystem och bildanalys
Fredrik Kahl
Digitala bildsystem och bildanalys
Funding
Swedish Research Council (Vetenskapsrådet, VR)
Project ID: 2018-05375
Funds Chalmers' participation during 2019–2022