TU4.R1.1

Section-wise Revolving NBP-like Decoders for QC-LDPC Codes

Qinshan Zhang, Tsinghua University, China; Bin Chen, Harbin Institute of Technology (Shenzhen), China; Tianqu Zhuang, Yong Jiang, Shu-Tao Xia, Tsinghua University, China

Session:
Deep Learning in Coding

Track:
8: Deep Learning (such as understanding large language models)

Location:
Ballroom II & III

Presentation Time:
Tue, 9 Jul, 16:05 - 16:25

Session Chair:
Natasha Devroye, University of Illinois Chicago

Abstract
Although deep learning has demonstrated improvements over classical decoding algorithms for various families of error-correcting codes with short to moderate block lengths, and neural decoders such as neural belief propagation (NBP), constructed from the Tanner graphs of linear codes, have been extensively studied, their application to longer codes is limited because network complexity grows with block length. This complexity leads to higher computational costs and deployment challenges such as GPU memory usage. To address this, we propose a novel revolving framework for NBP-like decoders tailored to QC-LDPC codes, a widely used class of error-correcting codes. Our approach leverages the section-wise cyclic structure inherent in QC-LDPC codes, significantly reducing network complexity. Experimental results demonstrate the effectiveness of our method across various QC-LDPC codes. In particular, compared with the traditional decoding scheme based on the section-wise cyclic structure, our proposed decoder shows substantial improvements on 5G LDPC codes, outperforms the traditional min-sum decoder, and approaches the sum-product decoder without a noticeable error floor within the investigated noise levels.
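The section-wise cyclic structure the abstract refers to comes from the quasi-cyclic construction: the parity-check matrix of a QC-LDPC code is obtained by lifting a small base (exponent) matrix, replacing each entry with a Z x Z cyclically shifted identity matrix (or an all-zero block). A minimal sketch of this standard lifting is shown below; the base matrix and lifting size are toy values for illustration, not the paper's or the 5G standard's.

```python
import numpy as np

def circulant(shift, Z):
    """Z x Z identity matrix cyclically shifted by `shift` columns;
    shift == -1 denotes an all-zero block (a common QC-LDPC convention)."""
    if shift < 0:
        return np.zeros((Z, Z), dtype=int)
    return np.roll(np.eye(Z, dtype=int), shift, axis=1)

def expand_base_matrix(B, Z):
    """Lift a base (exponent) matrix B into the full parity-check matrix H."""
    return np.block([[circulant(s, Z) for s in row] for row in B])

# Toy 2 x 3 base matrix with lifting size Z = 4 (shifts chosen arbitrarily).
B = [[0, 1, -1],
     [2, -1, 0]]
H = expand_base_matrix(B, Z=4)
print(H.shape)  # (8, 12)
```

Because every block of H is a cyclic shift of the identity, shifting all variable nodes within a section by one position maps the Tanner graph onto itself; this is the symmetry a revolving decoder can exploit to share parameters across an entire section instead of learning one weight per edge.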