For a sample $x_1, \dots, x_n$ from a Poisson($\lambda$) distribution, the likelihood function is given by:
$$L(\lambda) = \prod_{i=1}^{n} \frac{\lambda^{x_i} e^{-\lambda}}{x_i!}$$
Taking the logarithm and differentiating with respect to $\lambda$:

$$\frac{\partial \log L}{\partial \lambda} = \frac{1}{\lambda} \sum_{i=1}^{n} x_i - n = 0$$

so the maximum likelihood estimator is

$$\hat{\lambda} = \bar{x}$$
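The closed-form result $\hat{\lambda} = \bar{x}$ can be checked numerically by maximizing the log-likelihood over a grid. This is a minimal sketch using NumPy; the sample size and true $\lambda$ are illustrative assumptions, not values from the text.

```python
import numpy as np

# Illustrative Poisson sample (true lambda = 3.5 is an assumption for the demo).
rng = np.random.default_rng(42)
x = rng.poisson(lam=3.5, size=10_000)

lam_hat = x.mean()  # closed-form MLE: lambda_hat = x_bar

# Evaluate the log-likelihood on a grid; the sum of log(x_i!) terms is
# constant in lambda and can be dropped when locating the maximum.
grid = np.linspace(0.5, 10.0, 2000)
log_lik = x.sum() * np.log(grid) - len(x) * grid

lam_grid = grid[np.argmax(log_lik)]
print(lam_hat, lam_grid)  # the grid maximizer should sit next to x_bar
```

The grid maximizer agrees with the sample mean up to the grid resolution, confirming that the score equation has its root at $\bar{x}$.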
For a sample from a normal $N(\mu, \sigma^2)$ distribution, taking the logarithm of the likelihood and differentiating with respect to $\mu$ and $\sigma^2$, we get:

$$\frac{\partial \log L}{\partial \mu} = \sum_{i=1}^{n} \frac{x_i-\mu}{\sigma^2} = 0$$

which gives $\hat{\mu} = \bar{x}$.
$$\frac{\partial \log L}{\partial \sigma^2} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4} \sum_{i=1}^{n} (x_i-\mu)^2 = 0$$

Substituting $\hat{\mu} = \bar{x}$ and solving for $\sigma^2$:

$$\hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} (x_i-\bar{x})^2$$
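Note the divisor $n$ rather than $n-1$: the MLE of $\sigma^2$ is biased. A minimal NumPy check of both normal MLEs, with an illustrative sample (the true $\mu = 2$ and $\sigma = 1.5$ are assumptions for the demo):

```python
import numpy as np

# Illustrative normal sample.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=10_000)

mu_hat = x.mean()                        # MLE of mu: the sample mean
sigma2_hat = ((x - mu_hat) ** 2).mean()  # MLE of sigma^2: divisor n, not n-1

# np.var with ddof=0 uses the same divisor-n formula as the MLE.
print(mu_hat, sigma2_hat)
assert np.isclose(sigma2_hat, np.var(x, ddof=0))
```

`np.var(x, ddof=0)` matches $\hat{\sigma}^2$ exactly, while `ddof=1` would give the unbiased divisor-$(n-1)$ estimator instead.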