Deterministic

Definition: A sequence of numbers \(X_n\) is said to be \(O(1)\) if there exist \(M\) and \(n_0\) such that

\[\abs{X_n} < M\]

for all \(n>n_0\). Likewise, \(X_n\) is said to be \(O(r_n)\) if there exist \(M\) and \(n_0\) such that for all \(n>n_0\),

\[\left|\frac{X_n}{r_n}\right| < M.\]

Definition: A sequence of numbers \(X_n\) is said to be \(o(1)\) if it converges to zero. Likewise, \(X_n\) is said to be \(o(r_n)\) if

\[\frac{X_n}{r_n} \to 0\]

as \(n \to \infty\).
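As a concrete illustration (not in the source): the sequence \(X_n = 5 + 1/n\) is \(O(1)\), since \(\abs{X_n} < 6\) for all \(n > 1\); the sequence \(X_n = 1/n\) is \(o(1)\); and

\[X_n = 2n + 3 = O(n) \text{ but not } o(n), \quad \text{since} \quad \frac{X_n}{n} = 2 + \frac{3}{n} \to 2 \ne 0.\]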

Stochastic

Definition: A sequence of random vectors \(\x_n\) is said to be \(o_p(1)\) if it converges to \(\zero\) in probability. Furthermore, \(\x_n\) is said to be \(o_p(r_n)\) if

\[\frac{\x_n}{r_n} \inP \zero.\]

Definition: A sequence of random vectors \(\x_n\) is said to be \(O_p(1)\) if it is bounded in probability; that is, if for every \(\eps > 0\) there exist \(M\) and \(N\) such that \(\Pr(\lVert\x_n\rVert > M) < \eps\) for all \(n > N\). Furthermore, \(\x_n\) is said to be \(O_p(r_n)\) if \(\x_n/r_n\) is bounded in probability.
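As an informal numerical sketch (not from the source; the helper name `sample_mean_deviation` is ours): for the centered sample mean of i.i.d. Uniform\((0,1)\) draws, the law of large numbers gives \(\bar{X}_n - 1/2 = o_p(1)\), while the central limit theorem gives \(\bar{X}_n - 1/2 = O_p(n^{-1/2})\), so the \(\sqrt{n}\)-rescaled deviation stays bounded in probability even as the raw deviation shrinks.

```python
import random

def sample_mean_deviation(n, seed=0):
    """Deviation of the mean of n Uniform(0,1) draws from its limit 1/2.

    (Hypothetical helper for illustration only.)
    """
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n)) / n - 0.5

# o_p(1): the deviation shrinks toward zero as n grows (law of large numbers).
# O_p(n^{-1/2}): rescaled by sqrt(n), it stays bounded in probability (CLT).
for n in (10**2, 10**4, 10**6):
    dev = sample_mean_deviation(n)
    print(f"n={n:>7}  |mean - 1/2| = {abs(dev):.5f}  sqrt(n)*dev = {n**0.5 * dev:+.3f}")
```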

Algebra of \(O, o\) notation

The following rules hold for both the deterministic and stochastic (\(O_p\), \(o_p\)) versions; in the last two rules, \(a \le b\):

\[\begin{alignat*}{2} O(1) + O(1) &= O(1) & O\{O(1)\} &= O(1) \\ o(1) + o(1) &= o(1) & o\{O(1)\} &= o(1) \\ o(1) + O(1) &= O(1) & o(r_n) &= r_no(1) \\ O(1)O(1) &= O(1) & O(r_n) &= r_nO(1) \\ O(1)o(1) &= o(1) & O(n^a) + O(n^b) &= O(n^b) \\ \{1+o(1)\}^{-1} &= O(1) &\qquad\qquad o(n^a) + o(n^b) &= o(n^b) \end{alignat*}\]
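As an example of chaining these rules (our own illustration), a product of an \(O(n^{-1/2})\) term and an \(o(1)\) term is \(o(n^{-1/2})\):

\[O(n^{-1/2})\,o(1) = n^{-1/2}\,O(1)\,o(1) = n^{-1/2}\,o(1) = o(n^{-1/2}),\]

using first \(O(r_n) = r_nO(1)\), then \(O(1)o(1) = o(1)\), then \(o(r_n) = r_no(1)\).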

Source: van der Vaart, A. W. (1998). Asymptotic Statistics. Cambridge University Press.

Proof: As an example, we show that \(o(1) + O(1) = O(1)\); the other results follow by similar arguments.

Let \(A_n = o(1)\), \(B_n = O(1)\), and \(\eps > 0\).

\(\begin{alignat*}{2} \tag*{$\tcirc{1}$} \exists N_a: n > N_a \implies \abs{A_n} &< \eps &\hspace{4em}& A_n=o(1) \\ \tag*{$\tcirc{2}$} \exists N_b, M: n > N_b \implies \abs{B_n} &< M && B_n=O(1) \\ n > \max(N_a, N_b) \implies \abs{A_n + B_n} &\le \abs{A_n} + \abs{B_n} && \href{norm.html}{\text{Triangle inequality}} \\ &< \eps + \abs{B_n} && \tcirc{1} \\ &< \eps + M && \tcirc{2} \end{alignat*}\)
Since \(\eps + M\) is a constant that does not depend on \(n\), taking \(n_0 = \max(N_a, N_b)\) in the definition shows that \(A_n + B_n = O(1)\).