The variance of the score is called the Fisher information:
$$
\mathcal{I}(\theta) := \operatorname{Var}_\theta\!\left[\frac{\partial}{\partial\theta}\log f(X;\theta)\right] = E_\theta\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^2\right],
$$
where the second equality holds because the score has mean zero.
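As a concrete illustration (a standard worked example, not from the original text), consider a single Bernoulli observation $X \in \{0,1\}$ with success probability $p$:
$$
\log f(x;p) = x\log p + (1-x)\log(1-p), \qquad \frac{\partial}{\partial p}\log f(x;p) = \frac{x}{p} - \frac{1-x}{1-p}.
$$
Since the score has mean zero, its variance is its second moment:
$$
\mathcal{I}(p) = p\cdot\frac{1}{p^2} + (1-p)\cdot\frac{1}{(1-p)^2} = \frac{1}{p(1-p)},
$$
which is smallest at $p = 1/2$ and grows without bound as $p$ approaches $0$ or $1$.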
On its surface, this would seem to have nothing to do with information. However, the connection between the variance of the score and the curvature of the log-likelihood is made clear in the following theorem.
Theorem: If the likelihood allows all second-order partial derivatives to be passed under the integral sign, then
$$
\mathcal{I}(\theta) = -E_\theta\!\left[\frac{\partial^2}{\partial\theta^2}\log f(X;\theta)\right].
$$
Proof: Differentiating the score with respect to $\theta$ gives, by the quotient rule,
$$
\frac{\partial^2}{\partial\theta^2}\log f(x;\theta)
= \frac{\partial}{\partial\theta}\left(\frac{\partial f(x;\theta)/\partial\theta}{f(x;\theta)}\right)
= \frac{\partial^2 f(x;\theta)/\partial\theta^2}{f(x;\theta)} - \left(\frac{\partial f(x;\theta)/\partial\theta}{f(x;\theta)}\right)^2.
$$
Letting $f = f(X;\theta)$ for brevity and taking expectations of both sides,
$$
-E_\theta\!\left[\frac{\partial^2}{\partial\theta^2}\log f\right]
= -E_\theta\!\left[\frac{\partial^2 f/\partial\theta^2}{f}\right] + E_\theta\!\left[\left(\frac{\partial}{\partial\theta}\log f\right)^2\right]
= \mathcal{I}(\theta),
$$
using that the score has mean zero, so its second moment equals its variance. In the final step, note that
$$
E_\theta\!\left[\frac{\partial^2 f/\partial\theta^2}{f}\right]
= \int \frac{\partial^2 f(x;\theta)}{\partial\theta^2}\,\frac{f(x;\theta)}{f(x;\theta)}\,dx
= \frac{\partial^2}{\partial\theta^2}\int f(x;\theta)\,dx
= \frac{\partial^2}{\partial\theta^2}\,1 = 0,
$$
where passing the derivatives under the integral sign is exactly the regularity assumption of the theorem. $\blacksquare$
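The identity between the variance of the score and the negative expected second derivative can be checked numerically. The sketch below (an illustration, not from the original text) estimates both quantities by Monte Carlo for a Bernoulli($p$) model, where the closed form is $\mathcal{I}(p) = 1/(p(1-p))$:

```python
import random

# Numerical check of the theorem for a Bernoulli(p) observation.
# Score:             d/dp log f(x;p)   =  x/p - (1-x)/(1-p)
# Second derivative: d^2/dp^2 log f    = -x/p^2 - (1-x)/(1-p)^2
# Closed form:       I(p) = 1 / (p(1-p))

def fisher_info_estimates(p, n=200_000, seed=0):
    rng = random.Random(seed)
    xs = [1 if rng.random() < p else 0 for _ in range(n)]
    scores = [x / p - (1 - x) / (1 - p) for x in xs]
    hessians = [-x / p**2 - (1 - x) / (1 - p) ** 2 for x in xs]
    mean_score = sum(scores) / n
    var_score = sum((s - mean_score) ** 2 for s in scores) / n  # Var of the score
    neg_exp_hess = -sum(hessians) / n                           # -E[second derivative]
    return var_score, neg_exp_hess

p = 0.3
var_score, neg_exp_hess = fisher_info_estimates(p)
print(var_score, neg_exp_hess, 1 / (p * (1 - p)))  # all three should be close
```

Both estimates converge to $1/(0.3 \cdot 0.7) \approx 4.76$, matching the closed form.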
Multiple observations
The above definition and proof assume that we have a single observation. In cases where we have repeated observations $X_1, \ldots, X_n$, the Fisher information is defined analogously through the joint likelihood:
$$
\mathcal{I}_n(\theta) = \operatorname{Var}_\theta\!\left[\frac{\partial}{\partial\theta}\log f(X_1,\ldots,X_n;\theta)\right].
$$
If observations are iid, then each observation carries the same Fisher information, and because the joint log-likelihood is a sum of independent per-observation terms whose variances add,
$$
\mathcal{I}_n(\theta) = n\,\mathcal{I}(\theta).
$$
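The additivity of information under iid sampling can also be checked by simulation. This sketch (an illustration under the same Bernoulli assumption as above, not from the original text) estimates the variance of the joint score for $n$ iid Bernoulli($p$) draws, which should be close to $n/(p(1-p))$:

```python
import random

# For n iid Bernoulli(p) draws, the joint score is the sum of the per-observation
# scores, so its variance is n * I(p) = n / (p(1-p)).

def joint_score_variance(p, n, trials=100_000, seed=1):
    rng = random.Random(seed)
    scores = []
    for _ in range(trials):
        k = sum(1 for _ in range(n) if rng.random() < p)  # number of successes
        scores.append(k / p - (n - k) / (1 - p))          # joint score at true p
    m = sum(scores) / trials
    return sum((s - m) ** 2 for s in scores) / trials

p, n = 0.3, 5
print(joint_score_variance(p, n), n / (p * (1 - p)))  # estimate vs. n * I(p)
```

With $p = 0.3$ and $n = 5$, both values are near $5/(0.3 \cdot 0.7) \approx 23.8$.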
If observations are not independent, then the Fisher information is more difficult to calculate, as it involves the multivariate (joint) distribution of $X_1, \ldots, X_n$: the score must be computed from the joint log-likelihood, and its variance no longer decomposes into a sum of per-observation terms.