Learned Belief-Propagation Decoding with Simple Scaling and SNR Adaptation
Paper in proceedings, 2019

We consider the weighted belief-propagation (WBP) decoder recently proposed by Nachmani et al., in which different weights are introduced for each Tanner graph edge and optimized using machine learning techniques. Our focus is on simple-scaling models that use the same weights across certain edges to reduce the storage and computational burden. The main contribution is to show that simple scaling with few parameters often achieves the same gain as the full parameterization. Moreover, several training improvements for WBP are proposed. For example, it is shown that minimizing average binary cross-entropy is, in general, suboptimal in terms of bit error rate (BER), and a new "soft-BER" loss is proposed which can lead to better performance. We also investigate parameter adapter networks (PANs) that learn the relation between the signal-to-noise ratio and the WBP parameters. As an example, for the (32, 16) Reed-Muller code with a highly redundant parity-check matrix, training a PAN with soft-BER loss gives near-maximum-likelihood performance assuming simple scaling with only three parameters.
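To make the simple-scaling and soft-BER ideas concrete, the sketch below shows a normalized min-sum decoder with only two shared scaling parameters (one on the channel LLRs, one on the check-to-variable messages) and a smooth soft-BER-style objective computed from the output LLRs. The (7,4) Hamming parity-check matrix, the placement of the weights, and the exact loss form are illustrative assumptions, not the parameterization used in the paper.

# Hedged sketch: a normalized min-sum decoder with a few shared ("simple-scaling")
# weights, plus a smooth "soft-BER"-style loss evaluated on the output LLRs.
# The code, weight placement, and loss form here are illustrative assumptions.
import numpy as np

H = np.array([[1, 1, 0, 1, 1, 0, 0],      # (7,4) Hamming code parity-check matrix
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def scaled_min_sum(llr_ch, H, alpha=0.8, beta=1.0, iters=5):
    """Min-sum decoding where alpha scales check-to-variable messages and
    beta scales the channel LLRs (two shared parameters in total)."""
    m, n = H.shape
    msg_vc = np.tile(beta * llr_ch, (m, 1)) * H        # variable-to-check messages
    for _ in range(iters):
        msg_cv = np.zeros_like(msg_vc)
        for c in range(m):
            idx = np.nonzero(H[c])[0]
            for v in idx:
                others = msg_vc[c, idx[idx != v]]      # extrinsic inputs to check c
                sign = np.prod(np.sign(others))
                msg_cv[c, v] = alpha * sign * np.min(np.abs(others))
        total = beta * llr_ch + msg_cv.sum(axis=0)     # posterior LLRs
        msg_vc = (np.tile(total, (m, 1)) - msg_cv) * H # extrinsic variable-to-check
    return total

def soft_ber(llr_out, codeword):
    """Smooth surrogate for BER: average probability that each bit is wrong,
    given the decoder's output LLRs (one plausible form of a soft-BER loss)."""
    signs = 1.0 - 2.0 * codeword                       # map bits {0,1} -> {+1,-1}
    return np.mean(1.0 / (1.0 + np.exp(np.clip(signs * llr_out, -30.0, 30.0))))

# Toy usage: all-zero codeword over BPSK/AWGN, then decode and score.
rng = np.random.default_rng(0)
codeword = np.zeros(7)
snr_db, rate = 3.0, 4 / 7
sigma = np.sqrt(1.0 / (2 * rate * 10 ** (snr_db / 10)))
y = 1.0 - 2.0 * codeword + sigma * rng.normal(size=7)
llr_ch = 2.0 * y / sigma ** 2
llr_out = scaled_min_sum(llr_ch, H, alpha=0.8, beta=1.0, iters=5)
print("soft-BER:", soft_ber(llr_out, codeword))

In a training setup, the shared parameters (and, for a PAN, a small network mapping the SNR to those parameters) would be optimized by gradient descent on a loss of this kind; the sketch above only evaluates the forward pass.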

Authors

Mengke Lian

Duke University

Fabrizio Carpi

University of Parma

Christian Häger

Chalmers, Electrical Engineering, Communication, Antennas and Optical Networks

Henry D. Pfister

Duke University

IEEE International Symposium on Information Theory - Proceedings

2157-8095 (ISSN)

Vol. 2019-July, pp. 161-165, Article no. 8849419
978-1-5386-9291-2 (ISBN)

IEEE International Symposium on Information Theory (ISIT), Paris, France

Coding for terabit-per-second fiber-optical communications (TERA)

European Commission (EC) (EC/H2020/749798), 2017-01-01 -- 2019-12-31.

Areas of Advance

Information and Communication Technology

Subject Categories

Telecommunications

Communication Systems

Signal Processing

DOI

10.1109/ISIT.2019.8849419

More information

Latest update

3/2/2022