Lifelong learning starting from zero
Paper in proceedings, 2019

We present a deep neural-network model for lifelong learning inspired by several forms of neuroplasticity. The neural network develops continuously in response to signals from the environment. In the beginning the network is a blank slate with no nodes at all. It develops according to four rules: (i) expansion, which adds new nodes to memorize new input combinations; (ii) generalization, which adds new nodes that generalize from existing ones; (iii) forgetting, which removes nodes that are of relatively little use; and (iv) backpropagation, which fine-tunes the network parameters. We analyze the model from the perspective of accuracy, energy efficiency, and versatility and compare it to other network models, finding better performance in several cases.
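The four developmental rules can be illustrated with a minimal sketch. This is an assumption-laden toy, not the authors' actual architecture: nodes are modeled as prototype vectors, novelty is nearest-prototype distance, and the thresholds (`expand_thresh`, `merge_thresh`, `forget_after`) and the averaging heuristic for generalization are all invented for illustration.

```python
import numpy as np

class LifelongNet:
    """Toy sketch of the four rules: expansion, generalization,
    forgetting, and fine-tuning. All names and thresholds are
    illustrative assumptions, not the paper's algorithm."""

    def __init__(self, expand_thresh=1.0, merge_thresh=0.2,
                 forget_after=50, lr=0.1):
        self.prototypes = []      # the network starts as a blank slate
        self.last_used = []       # timestep each node was last activated
        self.expand_thresh = expand_thresh
        self.merge_thresh = merge_thresh
        self.forget_after = forget_after
        self.lr = lr
        self.t = 0

    def observe(self, x):
        x = np.asarray(x, dtype=float)
        self.t += 1
        if not self.prototypes:            # blank slate: first input
            self._expand(x)
            return
        d = [np.linalg.norm(x - p) for p in self.prototypes]
        i = int(np.argmin(d))
        if d[i] > self.expand_thresh:      # (i) expansion: memorize novel input
            self._expand(x)
        else:                              # (iv) gradient-like fine-tuning
            self.prototypes[i] += self.lr * (x - self.prototypes[i])
            self.last_used[i] = self.t
        self._forget()

    def _expand(self, x):
        self.prototypes.append(x.copy())
        self.last_used.append(self.t)

    def generalize(self):
        # (ii) generalization: add a node that averages two nearby nodes
        for i in range(len(self.prototypes)):
            for j in range(i + 1, len(self.prototypes)):
                gap = np.linalg.norm(self.prototypes[i] - self.prototypes[j])
                if gap < self.merge_thresh:
                    self._expand((self.prototypes[i] + self.prototypes[j]) / 2)
                    return

    def _forget(self):
        # (iii) forgetting: drop nodes that have been idle too long
        keep = [k for k in range(len(self.prototypes))
                if self.t - self.last_used[k] < self.forget_after]
        self.prototypes = [self.prototypes[k] for k in keep]
        self.last_used = [self.last_used[k] for k in keep]
```

For example, feeding two distant inputs grows two nodes, while a nearby input only nudges an existing node; shrinking `forget_after` makes idle nodes disappear.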

Dynamic architectures

Lifelong learning

Deep learning

Authors

Claes Strannegård

Chalmers, Computer Science and Engineering

FS Dynamics Sweden AB

Herman Carlström

Chalmers, Computer Science and Engineering

Niklas Engsner

FS Dynamics Sweden AB

Fredrik Mäkeläinen

FS Dynamics Sweden AB

Filip Slottner Seholm

Chalmers, Computer Science and Engineering

Morteza Haghir Chehreghani

Chalmers, Computer Science and Engineering

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

0302-9743 (ISSN) 1611-3349 (eISSN)

Vol. 11654 LNAI, pp. 188-197
978-3-030-27005-6 (ISBN)

12th International Conference on Artificial General Intelligence, AGI 2019
Shenzhen, China

Subject categories

Telecommunications

Communication systems

Bioinformatics (computational biology)

DOI

10.1007/978-3-030-27005-6_19

More information

Last updated

2023-11-20