Mapping Multiple LSTM models on FPGAs
Paper in proceedings, 2020

Recurrent Neural Networks (RNNs) and their more recent variant, Long Short-Term Memory (LSTM), are utilised in a number of modern applications, such as Natural Language Processing and human action recognition, where capturing long-term dependencies in sequential and temporal data is required. However, their computational structure makes their efficient mapping onto a computing device challenging due to its memory-bound nature. As recent approaches aim to capture longer dependencies through the use of Hierarchical and Stacked RNN/LSTM models, i.e. models that utilise multiple LSTM models for prediction, meeting the desired application latency becomes even more challenging. This paper addresses the problem of mapping multiple LSTM models to a device by introducing a framework that alters their computational structure, opening opportunities for co-optimising the memory requirements with the target architecture. Targeting an FPGA device, the proposed framework achieves 3× to 5× higher performance than state-of-the-art approaches for the same accuracy loss, opening the path to the deployment of high-performance systems for Hierarchical and Stacked LSTM models.
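To illustrate the memory-bound behaviour mentioned above, the sketch below (an illustrative NumPy example with hypothetical layer sizes, not code from the paper) shows a single LSTM time step: the full weight matrices W and U must be read from memory at every step, and with batch size one there is little data reuse, so weight traffic rather than arithmetic dominates the run time.

import numpy as np

def lstm_step(x, h_prev, c_prev, W, U, b):
    # One LSTM time step. W: (4*H, D), U: (4*H, H), b: (4*H,).
    # Both weight matrices are read in full at every step, which is
    # what makes low-batch LSTM inference memory-bound.
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b          # pre-activations of all four gates
    i = 1 / (1 + np.exp(-z[0*H:1*H]))   # input gate
    f = 1 / (1 + np.exp(-z[1*H:2*H]))   # forget gate
    o = 1 / (1 + np.exp(-z[2*H:3*H]))   # output gate
    g = np.tanh(z[3*H:4*H])             # candidate cell state
    c = f * c_prev + i * g              # new cell state
    h = o * np.tanh(c)                  # new hidden state
    return h, c

# Hypothetical sizes: input D = 128, hidden H = 256.
D, H = 128, 256
rng = np.random.default_rng(0)
W = rng.standard_normal((4 * H, D))
U = rng.standard_normal((4 * H, H))
b = np.zeros(4 * H)
h = c = np.zeros(H)
for x in rng.standard_normal((10, D)):  # a short input sequence
    h, c = lstm_step(x, h, c, W, U, b)

In a Hierarchical or Stacked model, several such sets of W and U matrices have to be fetched per time step, which is the memory pressure that the framework described above targets.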

Authors

Stefano Ribes

Chalmers University of Technology, Computer Science and Engineering, Computer Engineering

Pedro Petersen Moura Trancoso

Chalmers University of Technology, Computer Science and Engineering, Computer Engineering

Ioannis Sourdis

Chalmers University of Technology, Computer Science and Engineering, Computer Engineering

C. S. Bouganis

Imperial College London

Proceedings - 2020 International Conference on Field-Programmable Technology, ICFPT 2020

pp. 1-9, Article no. 9415569
ISBN: 9780738105185

2020 International Conference on Field-Programmable Technology, ICFPT 2020
Maui, HI, USA

Energy-efficient Heterogeneous COmputing at exaSCALE (ECOSCALE)

European Commission (EC) (EC/H2020/671632), 2015-10-01 -- 2018-12-31.

Subject Categories

Computer Engineering

Embedded Systems

Computer Science

DOI

10.1109/ICFPT51103.2020.00010

More information

Latest update

3/2/2022