A family of $(n = 2^\ell, M = 2^{k_\ell}, d = 2)$ binary constant-weight codes, where $k_\ell = \frac{\ell(\ell+1)}{2} - 1$, was recently proposed in the literature for every integer $\ell \geq 3$~\cite{SasVitDau}. As information is encoded in the gaps (counted cyclically) between successive $1$'s, we refer to these codes as cyclic-gap constant-weight codes, denoted ${\cal C}_G[\ell]$. These codes admit very low-complexity algorithms for mapping and demapping between message and codeword vectors, fully eliminating the need for costly computations of binomial coefficients. In this paper, we study maximum-likelihood (ML) decoding of these codes over the additive white Gaussian noise (AWGN) channel. Since the minimum distance of the code is $d = 2$, hard-decision decoders cannot correct any errors. Motivated by the error-correcting capability of the soft-decision Wagner-rule decoder for single-parity-check codes, we derive an ML decoder for ${\cal C}_G[\ell]$. We also derive a low-complexity approximation of the ML decoder with time complexity $O(n \log n)$. The algorithm is based on a novel technique that traverses the Hasse diagram of the partially ordered set of all $\ell$-subsets of $\{1, 2, \ldots, n\}$ in a breadth-first manner. Our approach applies to the decoding of any binary constant-weight code and is therefore of general interest. We simulate the performance of the low-complexity decoder for the $(8, 32, 2)$ code ${\cal C}_G[3]$ and show that it performs nearly identically to the ML decoder when the parameter $\lambda_{\max}$ (which determines how far the Hasse diagram is traversed) is set to $3$. Simulations also show that it outperforms a comparable $[8, 5, 2]$ linear code under ML decoding.
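
As a point of reference for the traversal's starting point, the following minimal Python sketch (our illustration, not the paper's algorithm) records the standard fact that ML decoding over the set of \emph{all} weight-$w$ words under AWGN reduces to a sort; the BPSK convention ($0 \mapsto +1$, $1 \mapsto -1$) and the function name are assumptions made for this sketch only.

\begin{verbatim}
# Minimal sketch (illustration only): over ALL weight-w binary words of
# length n, the ML decision under AWGN with BPSK (0 -> +1, 1 -> -1)
# places the w ones on the positions with the w smallest channel outputs.
import numpy as np

def ml_support_all_weight_w(y, w):
    """Support (indices of the 1's) of the ML weight-w word for received y."""
    # Smaller y[i] means bit i is more likely to be a 1.
    return sorted(np.argsort(y)[:w].tolist())

# Hypothetical usage: n = 8, w = 3 (the regime of the (8,32,2) code C_G[3]).
y = np.array([0.9, -1.2, 1.1, -0.8, 0.7, -1.0, 1.3, 0.6])
print(ml_support_all_weight_w(y, 3))  # -> [1, 3, 5]
\end{verbatim}

One natural reading of the traversal described above is that, because only $2^{k_\ell}$ of the $\binom{n}{\ell}$ weight-$\ell$ supports are valid codewords, the decoder must explore $\ell$-subsets near this unconstrained optimum, with $\lambda_{\max}$ bounding how far the breadth-first search proceeds.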