We consider the rate-distortion function for lossy source compression, as well as the channel capacity for error correction, through the lens of distributional robustness. We assume that the distribution of the source, or of the additive channel noise, is unknown and lies within a Wasserstein-2 ambiguity set of a given radius centered at a specified nominal distribution, and we seek the worst-case asymptotically optimal coding rate over this ambiguity class. Varying the radius of the ambiguity set interpolates between the purely stochastic and the worst-case scenarios within a single probabilistic framework. Our problem setting fits into the paradigm of compound source and compound channel models introduced by Sakrison and Blackwell, respectively. This paper shows that if the nominal distribution is Gaussian, then so is the worst-case source or noise distribution, and the compound rate-distortion and channel capacity functions admit convex formulations with linear matrix inequality (LMI) constraints. These formulations yield simple closed-form expressions in the scalar case, offering insight into the behavior of Shannon limits as the radius of the Wasserstein-2 ambiguity set varies.
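As an illustrative sketch of the scalar source-coding case (the notation $P_0$, $W_2$, $r$, $\sigma^2$, and $D$ is introduced here for illustration and is not taken verbatim from the paper), the Wasserstein-2 ambiguity set of radius $r$ around a nominal distribution $P_0$ is
\[
\mathcal{P}_r \;=\; \{\, P : W_2(P, P_0) \le r \,\}.
\]
For a scalar Gaussian source $P_0 = \mathcal{N}(0,\sigma^2)$ under squared-error distortion $D \le \sigma^2$, the classical rate-distortion function is $R(D) = \tfrac{1}{2}\log_2(\sigma^2/D)$, and $W_2\!\left(\mathcal{N}(0,\sigma^2), \mathcal{N}(0,s^2)\right) = |s-\sigma|$ for zero-mean Gaussians. If, as stated above, the worst case over $\mathcal{P}_r$ is again Gaussian, the largest admissible variance in the ambiguity set is $(\sigma + r)^2$, which suggests a compound rate-distortion function of the form
\[
\sup_{P \in \mathcal{P}_r} R_P(D) \;=\; \tfrac{1}{2}\log_2\!\frac{(\sigma + r)^2}{D},
\qquad D \le (\sigma + r)^2,
\]
recovering the nominal Gaussian limit at $r = 0$ and increasing with the radius; this is an informal sketch under the stated Gaussian worst-case assumption, not a restatement of the paper's exact results.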