Nested Construction of Polar Codes via Transformers
Sravan Ankireddy, University of Texas at Austin, United States; Ashwin Hebbar, Princeton University, United States; Heping Wan, Joonyoung Cho, Charlie Zhang, Samsung Research America, United States
Session: Deep Learning in Coding
Track: 8: Deep Learning (such as understanding large language models)
Location: Ballroom II & III
Presentation Time: Tue, 9 Jul, 16:45 - 17:05
Session Chair: Natasha Devroye, University of Illinois Chicago
Abstract
THIS PAPER IS ELIGIBLE FOR THE STUDENT PAPER AWARD. Constructing polar codes for decoding algorithms other than successive cancellation is a long-standing open problem. Recent advances in artificial intelligence (AI) have sparked interest in the information and coding theory communities in using AI-based optimization to tailor code construction to practical decoding algorithms. However, despite the inherent nested structure of polar codes, the use of sequence models for polar code construction remains understudied. In this work, we propose a sequence modeling framework that iteratively constructs a polar code for any given length and rate under various channel conditions. Simulations show that polar codes designed via sequential modeling with transformers outperform both the 5G-NR reliability sequence and Density Evolution-based constructions on both AWGN and Rayleigh fading channels.
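The iterative, nested construction described in the abstract can be pictured as an autoregressive loop: at each step, every not-yet-selected bit position is scored given the partial reliability sequence built so far, and the best one is appended, so every length-K prefix directly yields the information set of a rate-K/N code. The sketch below is an illustrative assumption, not the authors' method: the function names are hypothetical, and a fixed Bhattacharyya-parameter score for the BEC stands in for the learned transformer scores the paper would use.

```python
def bec_bhattacharyya(N, z0=0.5):
    """Bhattacharyya parameters of the N synthesized channels of a
    BEC(z0) under the polarization recursion Z- = 2Z - Z^2, Z+ = Z^2.
    Lower Z means a more reliable bit channel."""
    z = [z0]
    while len(z) < N:
        nz = []
        for zi in z:
            nz.append(2 * zi - zi * zi)  # "minus" (degraded) channel
            nz.append(zi * zi)           # "plus" (upgraded) channel
        z = nz
    return z

def build_sequence(N, score_fn):
    """Autoregressively build a reliability order: score the remaining
    positions given the partial sequence, append the best, repeat.
    In the paper this scoring would come from a trained transformer;
    here score_fn is a stand-in that may ignore the partial sequence."""
    chosen, remaining = [], set(range(N))
    while remaining:
        scores = score_fn(chosen, remaining)
        best = min(remaining, key=lambda i: scores[i])
        chosen.append(best)
        remaining.remove(best)
    return chosen

N = 8
z = bec_bhattacharyya(N)
seq = build_sequence(N, lambda chosen, remaining: z)
# seq is a full reliability order; the rate-K/N code freezes the
# positions outside seq[:K], and prefixes are nested by construction.
info_set_half_rate = set(seq[:4])
```

Because the sequence is built position by position, the nested property of polar codes comes for free: the information set of any lower-rate code is contained in that of any higher-rate code drawn from the same sequence, which is what makes a sequence model a natural fit for this construction.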