In this paper, we are interested in the regime where the common information between two Gaussian random vectors (X, Y) can be (or can approach) infinity. We ask two main questions: at what rate does the common information grow from a finite to an infinite number of bits as the dependency between the variables increases, and how well can we "approximately" simulate a pair of random variables (X, Y) with infinite common information using a finite number of shared bits? We prove analytically that the answer to both questions depends on the common information dimension d(X, Y) between X and Y, which we introduced in our recent work [1]. We characterize the asymptotic behaviors in closed form by building a connection to the singular values associated with the covariance matrix Σ of (X, Y). We conclude the paper with numerical evaluation results that indicate fast convergence to the asymptotic regime.
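As an illustrative sketch of the kind of quantity involved (not the paper's actual characterization), the following assumes that the common information dimension of jointly Gaussian vectors is tied to the number of unit canonical correlations, i.e., singular values equal to 1 of the whitened cross-covariance derived from Σ. The helper `canonical_correlations` and the toy example are our own constructions: X = (2Z, A) and Y = (Z, B) share the single component Z.

```python
import numpy as np

def canonical_correlations(Sx, Sy, Sxy):
    """Singular values of Sx^{-1/2} Sxy Sy^{-1/2} (the canonical correlations)."""
    def inv_sqrt(S):
        # Inverse matrix square root via eigendecomposition (S symmetric PD).
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    M = inv_sqrt(Sx) @ Sxy @ inv_sqrt(Sy)
    return np.linalg.svd(M, compute_uv=False)

# Toy example: X = (2Z, A), Y = (Z, B) with Z, A, B i.i.d. standard normal,
# so X and Y share exactly one common component (Z).
Sx = np.array([[4.0, 0.0], [0.0, 1.0]])   # Cov(X)
Sy = np.eye(2)                            # Cov(Y)
Sxy = np.array([[2.0, 0.0], [0.0, 0.0]])  # Cross-covariance Cov(X, Y)

rho = canonical_correlations(Sx, Sy, Sxy)
d = int(np.sum(np.isclose(rho, 1.0)))     # count of unit canonical correlations
```

Here `rho` comes out as [1, 0] and `d` equals 1, matching the single shared component; the paper's closed-form results concern how quantities like these govern the growth rate of the common information.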