Differential Privacy (DP) has become a gold standard in privacy-preserving data analysis. While it provides a rigorous notion of privacy, there are settings where its applicability is limited. In this work, we introduce a new notion of privacy, called \emph{Utilitarian Privacy} (UP), that complements DP. Informally, a UP mechanism must not include any ``non-utile information'' in its output: if two databases result in ``close-by'' outputs, then the mechanism should not allow distinguishing between them. On the one hand, UP permits weaker privacy guarantees when distinguishing between neighboring databases is important for utility; on the other hand, UP gives stronger privacy guarantees by making even non-neighboring databases indistinguishable from each other whenever they yield close-by outcomes. We show that for real-valued functions, remarkably, adding appropriately calibrated Laplace noise to the output achieves UP guarantees. A separate contribution of this work is the study of \emph{private sampling}, which extends the accuracy notion of mechanisms to sampling tasks. We show that for real-valued random variables, adding Laplace noise calibrated according to a \emph{generalized} sensitivity measure of the output distribution yields both DP and UP. Both extensions build on the recently introduced notion of ``lossy Wasserstein distance'' -- a two-parameter error measure for distributions.
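For reference, here is a minimal sketch of the classical Laplace mechanism that the above calibrations refine; in this sketch $\Delta f$ is the standard global sensitivity, an assumption for illustration only, whereas the UP and private-sampling results calibrate the noise scale to the generalized sensitivity measure defined in the body of the paper:
% Background sketch only: the classical (DP) Laplace mechanism,
% not the paper's UP-specific calibration.
\[
  M(x) \;=\; f(x) + \mathrm{Lap}\!\left(\frac{\Delta f}{\varepsilon}\right),
  \qquad
  \Delta f \;=\; \max_{x \sim x'} \bigl|f(x) - f(x')\bigr|,
\]
where $x \sim x'$ ranges over neighboring databases and $\mathrm{Lap}(b)$ denotes the Laplace distribution with density $\frac{1}{2b}\,e^{-|t|/b}$.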