Motivated by controlling the errors of individual edges in causal graph discovery, we propose a novel framework for causal discovery inspired by the Neyman-Pearson formulation of hypothesis testing. In particular, our formulation requires that the false negative rate be minimized while the false positive rate is simultaneously held below a specified tolerance level. This allows us to draw on techniques from binary hypothesis testing. Specifically, we derive the optimal decision rule for our problem, which consists of a likelihood ratio test on the edges, and establish a series of matching upper and lower bounds on the false negative rate, characterized by the Rényi divergence, which can serve as benchmarks for current discovery algorithms.
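The Neyman-Pearson recipe described above can be sketched as follows. This is a minimal illustration, not the paper's method: it assumes a hypothetical scalar edge statistic that is Gaussian under both hypotheses (the actual edge likelihoods would depend on the causal model), calibrates the likelihood-ratio threshold so the false positive rate is at most a tolerance `alpha`, and then measures the resulting false negative rate.

```python
# Hedged sketch of a Neyman-Pearson likelihood ratio test for a single edge.
# Assumption (not from the source): the edge statistic x is N(0,1) under H0
# (edge absent) and N(1,1) under H1 (edge present).
import numpy as np

rng = np.random.default_rng(0)

def log_likelihood_ratio(x, mu0=0.0, mu1=1.0, sigma=1.0):
    # log p1(x) - log p0(x) for two Gaussians with a common variance.
    return ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)

# Calibrate the threshold so the false positive rate is at most alpha:
# simulate the statistic under H0 and take the (1 - alpha) quantile of the LLR.
alpha = 0.05
null_samples = rng.normal(0.0, 1.0, size=100_000)
threshold = np.quantile(log_likelihood_ratio(null_samples), 1 - alpha)

def declare_edge(x):
    # Reject H0 (declare an edge) when the likelihood ratio exceeds the threshold.
    return log_likelihood_ratio(x) > threshold

# Empirical check: the FPR under H0 is held near alpha; the FNR under H1
# is the quantity the Neyman-Pearson-optimal rule minimizes at this level.
h0 = rng.normal(0.0, 1.0, size=100_000)
h1 = rng.normal(1.0, 1.0, size=100_000)
fpr = declare_edge(h0).mean()
fnr = (~declare_edge(h1)).mean()
```

Because the likelihood ratio is monotone in `x` here, the test reduces to a threshold on the statistic itself; in the general case, thresholding the likelihood ratio is what the Neyman-Pearson lemma guarantees to be optimal at a fixed false positive level.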