Definition: An estimator \(\bth\) is said to be a consistent estimator of \(\bt\) if \(\bth \inP \bt\).

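As a standard illustration (not stated above, but the usual first example): for i.i.d. observations \(X_1,\dots,X_n\) with finite mean \(\mu\), the sample mean is a consistent estimator of \(\mu\) by the weak law of large numbers,
\[
\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i \inP \mu \quad\text{as } n \to \infty .
\]
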
On its own, consistency is often too weak to be interesting: essentially any reasonable estimator is consistent given enough data, and the definition says nothing about how fast \(\bth\) approaches \(\bt\). A stronger notion is \(\sqrt{n}\)-consistency.

Definition: An estimator \(\bth\) is said to be a \(\sqrt{n}\)-consistent estimator of \(\bt\) if \(\norm{\bth-\bt} = O_p(1/\sqrt{n})\).

This means that not only does \(\bth\) get close to \(\bt\) as \(n \to \infty\), but it converges to \(\bt\) fast enough that, for a suitably large constant \(M\), it stays within the ever-shrinking neighborhood \(N_{M/\sqrt{n}}(\bt)\) with high probability.
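
To unpack the \(O_p\) notation (a standard reading, spelled out here for completeness): \(\norm{\bth-\bt} = O_p(1/\sqrt{n})\) means that for every \(\varepsilon > 0\) there is a constant \(M\) such that
\[
\Pr\!\bigl(\norm{\bth-\bt} > M/\sqrt{n}\bigr) < \varepsilon
\]
for all sufficiently large \(n\). As a sketch of how this is verified in the simplest case, assuming i.i.d. data with finite variance \(\sigma^2\), Chebyshev's inequality applied to the sample mean (whose variance is \(\sigma^2/n\)) gives
\[
\Pr\!\bigl(\lvert \bar{X}_n - \mu \rvert \ge M/\sqrt{n}\bigr) \le \frac{\sigma^2}{M^2},
\]
which can be made arbitrarily small by taking \(M\) large, so \(\bar{X}_n\) is a \(\sqrt{n}\)-consistent estimator of \(\mu\).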

See also: Proving consistency via quadratic mean