We revisit guessing random additive noise decoding (GRAND) in discrete additive noise channels. Using elementary tools, we derive a non-asymptotic random coding bound that applies to arbitrary noise guessing orders. We then use this bound to analyze a universal variant of GRAND that does not require knowledge of the noise distribution, and show that it achieves the random coding error exponent. Finally, we apply GRAND to an instance of the Slepian-Wolf coding problem.
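To make the guessing procedure concrete, the following is a minimal sketch of GRAND over a binary additive channel, not taken from the paper: noise patterns are queried in order of increasing Hamming weight (the maximum-likelihood order for a memoryless binary symmetric channel with crossover probability below 1/2), and the first candidate that passes a codebook-membership test is returned. The `is_codeword` predicate, the `max_queries` abandonment threshold, and the repetition-code example are illustrative assumptions.

```python
from itertools import combinations

def grand_decode(y, is_codeword, n, max_queries=None):
    """Decode the received word y (length n, bits) by guessing additive
    noise patterns in increasing Hamming-weight order and returning the
    first codeword found, together with the number of queries made."""
    queries = 0
    for w in range(n + 1):  # weight 0, 1, 2, ... : ML order for a BSC
        for flips in combinations(range(n), w):
            z = [0] * n
            for i in flips:
                z[i] = 1
            x = [yi ^ zi for yi, zi in zip(y, z)]  # subtract noise over GF(2)
            queries += 1
            if is_codeword(x):
                return x, queries
            if max_queries is not None and queries >= max_queries:
                return None, queries  # abandon guessing (declare an erasure)
    return None, queries

# Toy example (assumed): the length-3 repetition code {000, 111}.
codebook = {(0, 0, 0), (1, 1, 1)}
decoded, q = grand_decode([0, 1, 0], lambda x: tuple(x) in codebook, 3)
```

Here the weight-0 guess and one weight-1 guess fail before the pattern flipping position 1 recovers the all-zero codeword; other guessing orders plug in by changing only the enumeration loop.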