Detection of sparse signals arises in many modern applications such as signal processing, bioinformatics, finance, and disease surveillance. In many of these applications, however, the data may contain sensitive personal information that should be protected during the analysis. In this article, we consider the problem of $(\epsilon,\delta)$-differentially private detection of a general sparse mixture, with a focus on how privacy affects the detection power. By investigating a nonasymptotic upper bound on the sum of the error probabilities, we find that no $(\epsilon,\delta)$-differentially private test can detect the sparse signal if the privacy constraint is too strong or if the model parameters lie in the undetectable region (Cai and Wu, 2014). Moreover, we study the private clamped log-likelihood ratio test proposed by Canonne et al. (2019) and show that it achieves vanishing error probabilities under certain conditions on the model parameters and the privacy parameters. Then, for the case where the null distribution is standard normal, we propose an adaptive $(\epsilon,\delta)$-differentially private test that achieves vanishing error probabilities throughout the detectable region (Cai and Wu, 2014) whenever the privacy parameters satisfy certain sufficient conditions. Several numerical experiments are conducted to verify our theoretical results and to illustrate the performance of our proposed test.
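For readers unfamiliar with the setting, the following sketch of the classical sparse normal mixture detection problem (the special case with a standard normal null, as studied by Donoho and Jin, 2004, and generalized by Cai and Wu, 2014) may be helpful; the calibration below is the standard one, not a restatement of this paper's exact assumptions:

$$
H_0:\ X_i \overset{\text{i.i.d.}}{\sim} N(0,1)
\qquad \text{vs.} \qquad
H_1:\ X_i \overset{\text{i.i.d.}}{\sim} (1-\varepsilon_n)\,N(0,1) + \varepsilon_n\,N(\mu_n,1),
\quad i = 1,\dots,n,
$$

with sparsity and signal strength calibrated as $\varepsilon_n = n^{-\beta}$, $\beta \in (\tfrac{1}{2},1)$, and $\mu_n = \sqrt{2r\log n}$, $r \in (0,1)$. In this regime, consistent (non-private) detection is possible if and only if $r$ exceeds the detection boundary $\rho^*(\beta)$, where $\rho^*(\beta) = \beta - \tfrac{1}{2}$ for $\beta \in (\tfrac{1}{2},\tfrac{3}{4}]$ and $\rho^*(\beta) = (1-\sqrt{1-\beta})^2$ for $\beta \in (\tfrac{3}{4},1)$; the region $r < \rho^*(\beta)$ is the undetectable region referred to above.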