FR3.R3.1

Generalized Gradient Flow Decoding and Its Tensor-Computability

Tadashi Wadayama, Lantian Wei, Nagoya Institute of Technology, Japan

Session:
Iterative Decoding

Track:
2: Modern Coding Theory

Location:
Ypsilon IV-V-VI

Presentation Time:
Fri, 12 Jul, 14:35 - 14:55

Session Chair:
Michael Lentmaier

Abstract
This paper introduces an extension of Gradient Flow (GF) decoding for LDPC codes. GF decoding is a continuous-time method based on gradient flow that employs a potential energy function associated with the bipolar codewords of an LDPC code. The original GF decoding was designed for AWGN channels; in this paper, we generalize it by introducing the negative log-likelihood function of the channel. The proposed method is shown to be tensor-computable, meaning that the gradient of the objective function can be evaluated with a combination of basic tensor computations. This property makes the method well suited to emerging AI accelerators and potentially applicable to wireless signal processing. The paper assesses the decoding performance of generalized GF decoding on LDPC-coded MIMO channels. Our numerical experiments show that its decoding performance rivals that of established techniques such as MMSE + BP. A further benefit of the proposed method is its suitability for deep unfolding, since each component of the method is differentiable.
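To illustrate what "tensor-computable" means here, the following is a minimal sketch, not the paper's formulation: it assumes an AWGN negative log-likelihood, a hypothetical quadratic parity-check penalty, and a forward-Euler discretization of the flow dx/dt = -grad E(x), written in JAX so that every step is built from basic tensor operations and stays differentiable (the property the abstract points to for deep unfolding). The penalty weight beta, step size eta, and step count are illustrative choices, not values from the paper.

# Hypothetical gradient-flow-style decoding sketch (assumptions noted above).
import jax
import jax.numpy as jnp

def make_energy(H, y, sigma2, beta):
    """Return E(x) = channel negative log-likelihood + beta * parity penalty."""
    H = jnp.asarray(H, dtype=jnp.float32)   # parity-check matrix (m x n), entries in {0, 1}
    y = jnp.asarray(y, dtype=jnp.float32)   # received bipolar observation

    def energy(x):
        # AWGN negative log-likelihood (up to an additive constant).
        nll = jnp.sum((y - x) ** 2) / (2.0 * sigma2)
        # For each check, the product of the participating bipolar symbols,
        # computed with elementwise select and a row-wise product only.
        syndromes = jnp.prod(jnp.where(H > 0, x, 1.0), axis=1)
        # Quadratic penalty pushing every check product toward +1 (satisfied parity).
        penalty = jnp.sum((1.0 - syndromes) ** 2)
        return nll + beta * penalty

    return energy

def gf_decode(H, y, sigma2=0.5, beta=1.0, eta=0.05, num_steps=200):
    """Forward-Euler integration of the gradient flow; returns bipolar hard decisions."""
    energy = make_energy(H, y, sigma2, beta)
    grad_e = jax.grad(energy)               # autodiff: each update is a differentiable tensor op

    def step(x, _):
        return x - eta * grad_e(x), None    # x_{k+1} = x_k - eta * grad E(x_k)

    x0 = jnp.asarray(y, dtype=jnp.float32)  # start the flow from the channel observation
    x_final, _ = jax.lax.scan(step, x0, xs=None, length=num_steps)
    return jnp.where(x_final >= 0.0, 1, -1)

# Toy usage with the (7, 4) Hamming code's parity-check matrix.
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
y = [0.9, 1.1, 0.8, -1.2, 0.2, -0.9, -1.0]
print(gf_decode(H, y))

Because the whole update is a composition of tensor operations and jax.grad, the same loop can be unrolled into a fixed number of layers with trainable step sizes, which is the deep-unfolding direction the abstract mentions.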