TH3.R1.1

QML-IB: Quantized Collaborative Intelligence between Multiple Devices and the Mobile Network

Jingchen Peng, Boxiang Ren, Tsinghua University, China; Lu Yang, Chenghui Peng, Huawei Tech. Co. Ltd, China; Panpan Niu, Hao Wu, Tsinghua University, China

Session:
Information Bottleneck

Track:
8: Machine Learning

Location:
Ballroom II & III

Presentation Time:
Thu, 11 Jul, 14:35 - 14:55

Session Chair:
Lampros Gavalakis, Gustave Eiffel University

Abstract
The integration of artificial intelligence (AI) and mobile networks is regarded as one of the most important scenarios for 6G, in which a major objective is the efficient transmission of task-relevant data. A key problem then arises: how to design collaborative AI models for the device side and the network side so that the data transmitted between them is efficient, i.e., the transmission overhead is low while the AI task result remains accurate. In this paper, we propose the multi-link information bottleneck (ML-IB) scheme for the design of such collaborative models. We formulate our problem based on a novel performance metric that evaluates both task accuracy and transmission overhead. We then introduce a quantizer whose bit depth, amplitudes, and breakpoints are all adjustable. Since computing our proposed metric on high-dimensional data is infeasible, we establish a variational upper bound for it. However, owing to the incorporation of quantization, this variational upper bound still admits no computable closed form. We therefore employ the Log-Sum Inequality to derive an approximation and provide a theoretical guarantee. Based on this, we devise the quantized multi-link information bottleneck (QML-IB) algorithm for generating the collaborative AI models. Finally, numerical experiments demonstrate the superior performance of QML-IB compared with the state-of-the-art algorithm.
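
For orientation, the classical single-link information bottleneck (the standard formulation that the ML-IB metric presumably generalizes to multiple device-network links; the notation below is generic rather than the paper's) seeks an encoding p(z|x) of the device-side data X into a transmitted representation Z that stays informative about the task variable Y while keeping the rate low:

    \min_{p(z \mid x)} \; \beta\, I(X; Z) - I(Z; Y)

where I(\cdot;\cdot) denotes mutual information and \beta trades transmission overhead against task accuracy. In QML-IB, Z additionally passes through the adjustable quantizer, which is what removes the closed form of the variational upper bound and motivates the Log-Sum Inequality approximation described in the abstract.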