Optimization Methods with Performance Guarantees for Subspace Learning
Research Project, 2019–2022
Machine learning methods play an ever increasing role in today's society. Intelligent systems have traditionally been used for applications requiring large-scale data analysis, such as recommender systems, but are now appearing in highly complex tasks like self-driving cars. This project studies inference methods for learning and dimensionality reduction related to matrix factorization.

Learning a low-dimensional subspace from high-dimensional data is a core problem in many branches of science, with applications in computer vision, bioinformatics, control theory and many other areas. In its simplest form, subspace learning amounts to finding a low-rank approximation of a matrix containing the observed data. Current state-of-the-art methods suffer from significant shortcomings that limit their applicability. For example, many formulations are purely data driven and do not allow prior knowledge of the observed system to be incorporated; their success therefore hinges on the availability of data sets large enough to infer all model characteristics.

This project aims to develop flexible methods that simultaneously incorporate multiple subspace constraints and priors, making them capable of learning compact models from a minimum amount of data. We design formulations that model the problem more accurately than current methods while still allowing efficient inference. We aim to develop new strong relaxations, both convex and non-convex, together with algorithms that scale well beyond today's standard.
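To make the simplest form of the problem concrete, the sketch below computes the best rank-r approximation of a data matrix in the Frobenius norm via a truncated SVD (the Eckart–Young theorem). The matrix M and the rank r are illustrative placeholders, and this baseline is only a starting point relative to the constrained formulations the project targets.

```python
import numpy as np

def low_rank_approximation(M, r):
    """Best rank-r approximation of M in the Frobenius norm,
    obtained by truncating the SVD (Eckart-Young theorem)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    # Keep the r largest singular values/vectors and recompose.
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

# Illustrative use: recover a rank-2 signal from noisy observations.
rng = np.random.default_rng(0)
M_true = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 50))
M_noisy = M_true + 0.1 * rng.standard_normal(M_true.shape)
M_hat = low_rank_approximation(M_noisy, 2)
print(np.linalg.norm(M_true - M_hat) / np.linalg.norm(M_true))
```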
Participants
Carl Olsson (contact)
Imaging and Image Analysis
Fredrik Kahl
Imaging and Image Analysis
Funding
Swedish Research Council (VR)
Project ID: 2018-05375
Funding Chalmers participation during 2019–2022