Estimating unknown parameters in a restricted parameter space is an important problem with applications in communication, statistics, and machine learning. In this paper, we adopt the conventional minimax formulation to investigate such problems. In particular, we focus on the second-order characterization of the minimax risk in the asymptotic regime. We first show that the second-order convergence rate of the minimax risk depends on the local flatness of the Fisher information around its global optimum. We then demonstrate that the second-order terms can be computed by solving certain ordinary differential equations, and in some cases the coefficients of these terms admit explicit expressions. Finally, we present estimators that achieve the minimax risk, which offers potential guidance for the design of machine learning algorithms.
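For concreteness, the conventional minimax formulation referenced above can be sketched as follows; this is a minimal illustration assuming squared-error loss and $n$ i.i.d. observations, and the notation $R_n^{*}$, $\hat{\theta}_n$, $I(\theta)$, $c_1$, $c_2$, and $r(n)$ is ours rather than taken from the paper:
\[
  R_n^{*}(\Theta) \;=\; \inf_{\hat{\theta}_n}\; \sup_{\theta \in \Theta}\;
  \mathbb{E}_{\theta}\!\left[ \bigl(\hat{\theta}_n - \theta\bigr)^{2} \right],
\]
where the infimum runs over all estimators and $\Theta$ is the restricted parameter space. A second-order asymptotic characterization then refines the classical first-order term, schematically
\[
  R_n^{*}(\Theta) \;=\; \frac{c_1}{n} \;+\; c_2\, r(n) \;+\; o\bigl(r(n)\bigr)
  \qquad \text{as } n \to \infty,
\]
with the first-order coefficient $c_1$ governed by the Fisher information $I(\theta)$, and, per the result stated above, the second-order rate $r(n)$ and coefficient $c_2$ determined by the local flatness of $I(\theta)$ around its global optimum.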