Let $Y_1, Y_2, … $ be a sequence of random variables. We say that $Y_n $ converges to a random variable $Y $ in probability, denoted by $Y_n \ipto Y $, if
$$\limu{ n }{ \infty } P(\abs{ Y_n - Y } \geq \epsilon) = 0, \forall \epsilon > 0 \tag{1}$$
Equivalently, when the limit is a number $a$: $\forall \epsilon > 0, \forall \delta > 0, \exists n_0$ such that
$$P(\abs{ Y_n - a } \geq \epsilon) \leq \delta, \forall n \geq n_0 \tag{2}$$
We refer to $\epsilon$ as the accuracy level, and $\delta$ as the confidence level. Then we can restate the second definition as follows:
For any given level of accuracy and confidence, $Y_n $ will be equal to $a$, within these levels of accuracy and confidence, provided that $n $ is large enough.
Consider a sequence of independent random variables $X_n $ that are uniformly distributed in the interval $[0,1 ] $, and let
$$Y_n = \min\{ X_1, X_2, …, X_n \} $$
Observe that the sequence of values of $Y_n $ cannot increase as $n $ increases, and it will occasionally decrease. Thus we intuitively expect that $Y_n $ converges to zero. Indeed, for $\epsilon > 0 $, we have, using the independence of the $X_n $,
$$\begin{align*} P(\abs{ Y_n - 0 } \geq \epsilon) &= P(X_1 \geq \epsilon, …, X_n \geq \epsilon) \br &= P(X_1 \geq \epsilon) … P(X_n \geq \epsilon) \br &= (1 - \epsilon)^n \end{align*}$$
In particular,
$$\limu{ n }{ \infty } P(\abs{ Y_n - 0 } \geq \epsilon) = \limu{ n }{ \infty } (1 - \epsilon)^n = 0 $$
Since this is true for every $\epsilon > 0 $, we conclude that $Y_n \ipto 0 $.
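The closed-form tail probability $(1 - \epsilon)^n$ can be checked against a Monte Carlo estimate. A minimal sketch in Python; the choices of $\epsilon$, sample sizes, and trial counts below are arbitrary, not from the text:

```python
import random

def tail_prob_estimate(n, eps, trials=20_000):
    """Estimate P(min(X_1, ..., X_n) >= eps) for i.i.d. X_i ~ Uniform[0, 1]."""
    hits = 0
    for _ in range(trials):
        y_n = min(random.random() for _ in range(n))
        if y_n >= eps:
            hits += 1
    return hits / trials

eps = 0.05
for n in (10, 50, 100):
    exact = (1 - eps) ** n  # the closed-form tail probability derived above
    print(n, exact, tail_prob_estimate(n, eps))
```

The estimate tracks $(1 - \epsilon)^n$ closely, and both shrink toward zero as $n$ grows.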
Now let $Y \expdist{ 1 }$, i.e., $Y$ is exponentially distributed with parameter 1. For any positive integer $n $, let $Y_n = Y/ n $. (Note that these random variables are dependent.) We will show that $Y_n \ipto 0$.
For $\epsilon > 0 $, we have
$$P(\abs{ Y_n - 0 } \geq \epsilon) = P(Y_n \geq \epsilon) = P(Y \geq n \epsilon) = e^{ -n \epsilon }$$
In particular,
$$\limu{ n }{ \infty } P(\abs{ Y_n - 0 } \geq \epsilon) = \limu{ n }{ \infty } e^{-n \epsilon} = 0$$
Since this is the case for every $\epsilon > 0 $, $Y_n \ipto 0$.
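The same kind of check works here: for $Y \expdist{ 1 }$, the tail $P(Y \geq n\epsilon) = e^{-n\epsilon}$ can be estimated by sampling. A sketch; the $\epsilon$ and trial counts are arbitrary choices:

```python
import math
import random

def exp_tail_estimate(n, eps, trials=100_000):
    """Estimate P(Y/n >= eps) where Y ~ Exponential(1)."""
    hits = sum(1 for _ in range(trials)
               if random.expovariate(1.0) / n >= eps)
    return hits / trials

eps = 0.1
for n in (1, 5, 10):
    exact = math.exp(-n * eps)  # e^{-n eps}, the tail derived above
    print(n, exact, exp_tail_estimate(n, eps))
```

As $n$ grows, both the exact value and the estimate decay to zero, matching $Y_n \ipto 0$.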
One might be tempted to believe that if a sequence $Y_n $ converges in probability to a number $a $, then $E[Y_n] $ must also converge to $a $.
This is not necessarily true.
Consider a sequence of discrete random variables $Y_n $ with the following distribution:
$$P(Y_n = y) = \begin{cases} 1 - \frac{ 1 }{ n }& y = 0 \br \frac{ 1 }{ n }& y = n^2 \br 0 &\text{ otherwise} \end{cases}$$
For every $\epsilon > 0 $,
$$\limu{n}{\infty} P(\abs{ Y_n } \geq \epsilon) = \limu{ n }{ \infty } \frac{ 1 }{ n } = 0$$
so $Y_n \ipto 0$.
However,
$$E[Y_n] = \frac{ n^2 }{ n } = n $$
which goes to infinity as $n$ increases.
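The gap between convergence in probability and convergence of expectations is easy to see numerically. A sketch that evaluates both quantities directly from the distribution above (the choices of $n$ are arbitrary):

```python
def prob_nonzero(n):
    """P(|Y_n| >= eps) for any 0 < eps <= n^2: the mass placed at y = n^2."""
    return 1 / n

def expectation(n):
    """E[Y_n] = 0 * (1 - 1/n) + n^2 * (1/n) = n."""
    return 0 * (1 - 1 / n) + n ** 2 * (1 / n)

for n in (10, 100, 1000):
    print(n, prob_nonzero(n), expectation(n))
# the tail probability shrinks to 0 while the expectation grows without bound
```

The rare event $\{Y_n = n^2\}$ becomes less likely, but its value grows fast enough that the mean diverges anyway.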