Max-sliced mutual information (mSMI) was recently proposed as a data-efficient measure of statistical dependence. It extends popular correlation-based methods and has proven useful in various machine learning tasks. In this paper, we extend the notion of mSMI to discrete variables and investigate its role in classical problems of information theory and statistics. We use mSMI to propose a soft version of the G\'acs-K\"orner common information which, owing to the structure of mSMI, naturally extends to continuous domains and multivariate settings. We then characterize the optimal growth rate in a horse race with constrained side information. Additionally, we examine the error of independence testing under communication constraints. Finally, we study mSMI in communications: we characterize the capacity of discrete memoryless channels with constrained encoders and decoders, and propose an mSMI-based scheme for decoding information obtained through remote sensing. These connections motivate the use of max-slicing in information theory and highlight its merits.