Gradient coding is a distributed computing technique that provides robustness against slow or non-responsive computing nodes, known as stragglers, while balancing the computational load across the responsive nodes. Among existing gradient codes, a construction based on combinatorial designs, called the BIBD gradient code, achieves the best trade-off between robustness and computational load in the worst-case adversarial straggler setting. However, the range of system parameters for which BIBD gradient codes exist is limited. In this paper, we overcome this limitation and propose a new probabilistic gradient code, termed the Sparse Gaussian (SG) gradient code. The encoding matrix of the proposed SG gradient code is generated from a carefully chosen correlated multivariate Gaussian distribution and masked by Bernoulli random variables to reduce the computational load. The proposed gradient code achieves worst-case error performance similar to that of the BIBD gradient code (when a BIBD gradient code with the same parameters exists) and outperforms several other existing gradient codes, including Fractional Repetition gradient codes and Bernoulli gradient codes. Moreover, it supports a wider range of system parameters than existing BIBD and soft BIBD gradient codes, making it a promising solution for distributed computing tasks.
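The construction summarized above can be sketched in a few lines of code. The snippet below is a minimal illustration under stated assumptions, not the paper's exact construction: it uses an equicorrelated covariance (parameter rho) as a stand-in for the "carefully chosen" correlation structure, and the function name sparse_gaussian_encoding_matrix, its parameters, and the decoding step are hypothetical placeholders.

```python
import numpy as np

def sparse_gaussian_encoding_matrix(n, k, p, rho, seed=0):
    """Illustrative SG-style encoding matrix (sketch, not the paper's construction).

    n   : number of workers (rows of the encoding matrix)
    k   : number of data partitions / gradient pieces (columns)
    p   : Bernoulli keep-probability controlling each worker's load
    rho : off-diagonal correlation of the Gaussian entries (placeholder choice)
    """
    rng = np.random.default_rng(seed)
    # Equicorrelated covariance as a stand-in for the correlated
    # multivariate Gaussian distribution described in the abstract.
    cov = (1 - rho) * np.eye(k) + rho * np.ones((k, k))
    G = rng.multivariate_normal(mean=np.zeros(k), cov=cov, size=n)
    # Bernoulli mask: each worker is assigned roughly p*k partitions,
    # which limits its computational load.
    M = rng.binomial(1, p, size=(n, k))
    return G * M

# Usage sketch: worker i computes the coded partial gradient
# sum_j B[i, j] * g_j over the partitions with nonzero B[i, j];
# the master aggregates the results returned by non-straggling workers.
B = sparse_gaussian_encoding_matrix(n=12, k=12, p=0.3, rho=0.2)
```

In this sketch, the Bernoulli parameter p directly trades off per-worker computational load against redundancy, while the correlation of the Gaussian entries governs how well the sum of the surviving coded gradients approximates the full gradient when stragglers are dropped.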