Towards Better Representation Learning in the Absence of Sufficient Supervision
Licentiate thesis, 2022

We focus on the problem of learning representations from data in settings where sufficient supervision, such as labels or feature values, is not available. This situation arises in many real-world machine learning tasks. We approach the problem from several perspectives, summarized as follows.
First, we assume that some knowledge is already available from a different but related task or model, and aim to use that knowledge in our task of interest. We perform this knowledge transfer in two different but related ways: (i) using the knowledge encoded in kernel embeddings to improve the training properties of a neural network, and (ii) transferring the knowledge of a large model to a smaller one. In the former case, we build on recent theoretical results on neural network training together with a multiple kernel learning algorithm to achieve high performance in terms of both optimization and generalization.
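As a rough illustration of the first direction, the minimal sketch below (not the thesis method; the kernels, their fixed weights, and all hyperparameters are illustrative placeholders) feeds explicit kernel embeddings, built from a weighted combination of two kernels, into a small neural network. A multiple kernel learning algorithm would instead optimize the kernel weights rather than fixing them by hand.

```python
# Minimal sketch: train a small network on combined kernel embeddings
# instead of raw features. Kernels, weights, and hyperparameters are
# illustrative placeholders, not the thesis method.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.kernel_approximation import Nystroem
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Two candidate kernels; the weights stand in for what a multiple kernel
# learning algorithm would optimize (here they are fixed by hand).
kernels = [Nystroem(kernel="rbf", gamma=0.1, n_components=100, random_state=0),
           Nystroem(kernel="polynomial", degree=2, n_components=100, random_state=0)]
weights = [0.7, 0.3]

def embed(X, fit=False):
    # Concatenating sqrt(w)-scaled feature maps corresponds to the
    # weighted sum of the underlying kernels.
    parts = []
    for w, k in zip(weights, kernels):
        Z = k.fit_transform(X) if fit else k.transform(X)
        parts.append(np.sqrt(w) * Z)
    return np.hstack(parts)

Z_tr, Z_te = embed(X_tr, fit=True), embed(X_te)

# Small network trained on the kernel embedding instead of raw features.
net = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
net.fit(Z_tr, y_tr)
print("test accuracy:", net.score(Z_te, y_te))
```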
Next, we tackle the problem of learning appropriate data representations from an online learning perspective, in which one must learn incrementally from an incoming stream of data. We assume that the full feature set of an input is not always available, and seek a way to learn efficiently from a smaller set of feature values. We propose a novel online learning framework that builds a decision tree from a data stream and yields highly accurate predictions, competitive with classical online decision tree learners but at a significantly lower feature-acquisition cost.
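As a rough illustration of the second direction, the self-contained sketch below (not the thesis algorithm; the per-feature costs, the utility score, and the budget are hypothetical) grows an online decision stump from a synthetic binary stream, acquiring for each example only the few features with the best estimated information gain per unit cost.

```python
# Sketch of cost-aware online learning on a stream with a one-level tree
# (a decision stump). Costs, utility score, and budget are hypothetical.
import math
import random
from collections import defaultdict

random.seed(0)
N_FEATURES, BUDGET = 10, 3                          # acquire at most 3 of 10 features per example
COSTS = [1.0 + 0.2 * j for j in range(N_FEATURES)]  # hypothetical per-feature acquisition costs

# counts[j][(value, label)] = number of seen examples with feature j = value and class = label
counts = [defaultdict(int) for _ in range(N_FEATURES)]

def entropy(pos, neg):
    tot = pos + neg
    if tot == 0 or pos == 0 or neg == 0:
        return 0.0
    p = pos / tot
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def info_gain(j):
    """Streaming estimate of the information gain of splitting on binary feature j."""
    p1 = counts[j][(0, 1)] + counts[j][(1, 1)]
    p0 = counts[j][(0, 0)] + counts[j][(1, 0)]
    tot = p1 + p0
    if tot == 0:
        return 0.0
    cond = sum((counts[j][(v, 1)] + counts[j][(v, 0)]) / tot
               * entropy(counts[j][(v, 1)], counts[j][(v, 0)]) for v in (0, 1))
    return entropy(p1, p0) - cond

def stream(n):
    """Synthetic binary stream in which the label mostly follows feature 2."""
    for _ in range(n):
        x = [random.randint(0, 1) for _ in range(N_FEATURES)]
        y = x[2] if random.random() < 0.9 else 1 - x[2]
        yield x, y

correct = 0
for t, (x, y) in enumerate(stream(2000), 1):
    # Acquire only the features with the best estimated gain per unit cost.
    # (A real method would also handle exploration of rarely acquired features.)
    ranked = sorted(range(N_FEATURES), key=lambda j: info_gain(j) / COSTS[j], reverse=True)
    acquired = ranked[:BUDGET]

    # Predict with the current best acquired feature: majority class in its branch.
    best = acquired[0]
    pos, neg = counts[best][(x[best], 1)], counts[best][(x[best], 0)]
    correct += int((1 if pos >= neg else 0) == y)

    # Update streaming statistics using the acquired feature values only.
    for j in acquired:
        counts[j][(x[j], y)] += 1

print(f"online accuracy using {BUDGET}/{N_FEATURES} features per example: {correct / t:.3f}")
```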

Keywords

Online learning, Neural network, Decision tree, Representation learning, Kernel embedding, Supervision, Knowledge transfer, Feature acquisition

Rännvägen 6B, Analysen
Opponent: Prof. Josephine Sullivan, Division of Robotics, Perception and Learning, KTH Royal Institute of Technology, Sweden

Author

Arman Rahbar

Chalmers, Computer Science and Engineering (Chalmers), Data Science and AI

Do Kernel and Neural Embeddings Help in Training and Generalization?

Neural Processing Letters, Vol. 55 (2023), pp. 1681-1695

Journal article

Analysis of Knowledge Transfer in Kernel Regime

International Conference on Information and Knowledge Management, Proceedings (2022), pp. 1615-1624

Paper in proceeding

Arman Rahbar, Ziyu Ye, Chaoqi Wang, Yuxin Chen, Morteza Haghir Chehreghani. Efficient Online Decision Tree Learning by Utility of Features

Areas of Advance

Information and Communication Technology

Subject Categories

Information Science

Computer Science

Publisher

Chalmers


More information

Latest update

10/26/2023