A fundamental property of probability is that if we have two events $A$ and $B$ from the same probability space, and $A$ is a subset of $B$ ($A \subseteq B$), then $\Pr(A) \le \Pr(B)$. This should be intuitive: $A$ occurring is one of the ways that $B$ can occur, but there may also be ways $B$ can happen without $A$, so the probability of $B$ must be at least as large. This property is known as the monotonicity of probability measures; in this course, for the sake of brevity I typically refer to it as the principle of inclusion.
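For completeness, monotonicity is a quick consequence of the additivity axiom: writing $B$ as the disjoint union $A \cup (B \setminus A)$,
\[\Pr(B) = \Pr(A) + \Pr(B \setminus A) \ge \Pr(A),\]since probabilities are nonnegative.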
A great deal of statistical theory involves inequalities, and a typical scenario in which this arises is: we know that $b < c$, and we conclude that $\Pr(Y < b) \le \Pr(Y < c)$ for a random variable $Y$. This is intuitively simple, but it is easy to get the direction of the inequality wrong. As a mnemonic device, I suggest “if the large side gets larger, or the small side gets smaller, the probability goes up,” but use whatever works for you. Here is the result stated as a theorem:
Theorem: Suppose $a < b$ and $Y$ is a random variable. Then both of the following are true:
\[\as{ \Pr\{Y < a\} &\le \Pr\{Y < b\}, \\ \Pr\{b < Y\} &\le \Pr\{a < Y\}. }\]
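Both lines are instances of monotonicity. Since $a < b$, any outcome with $Y < a$ also satisfies $Y < b$, and any outcome with $b < Y$ also satisfies $a < Y$; in set notation,
\[\{Y < a\} \subseteq \{Y < b\} \quad \text{and} \quad \{b < Y\} \subseteq \{a < Y\}.\]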
As a more specific example, suppose $X$, $Y$, and $Z$ are random variables. The triangle inequality guarantees that $\abs{X + Y} \le \abs{X} + \abs{Y}$ for every outcome, so the event $\{Z < \abs{X + Y}\}$ is a subset of $\{Z < \abs{X} + \abs{Y}\}$, and by the monotonicity of probability measures we have
\[\Pr(Z < \abs{X + Y}) \le \Pr(Z < \abs{X} + \abs{Y}).\]Monotonicity also comes up often in the context of convergence, as stated in the theorem below.
Theorem: Suppose now we have sequences of events $A_n$ and $B_n$, and for every $n$, $A_n \subseteq B_n$. Then both of the following are true:
\[\as{ \Pr(A_n) \to 1 \quad &\text{implies} \quad \Pr(B_n) \to 1, \\ \Pr(B_n) \to 0 \quad &\text{implies} \quad \Pr(A_n) \to 0. }\]
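Both implications are squeeze arguments. Monotonicity gives $\Pr(A_n) \le \Pr(B_n)$ for every $n$, so
\[0 \le \Pr(A_n) \le \Pr(B_n) \le 1,\]and squeezing $\Pr(B_n)$ between $\Pr(A_n)$ and $1$ yields the first implication, while squeezing $\Pr(A_n)$ between $0$ and $\Pr(B_n)$ yields the second.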