We derive some important inequalities in this section, which use the mean, and sometimes the variance, of a random variable to draw conclusions about the probabilities of certain events.
They are primarily useful in situations where exact values or bounds for the mean and variance of a random variable $X $ are easily computable, but the distribution of $X $ is either unavailable or hard to calculate. Essentially, they give us bounds on probabilities rather than exact values.
The Markov inequality asserts that if a nonnegative random variable has a small mean, then the probability that it takes a large value must also be small.
If a random variable $X $ can only take nonnegative values, then
$$P(X \geq a) \leq \frac{ E [X] }{ a }, \forall a > 0$$
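A standard way to see this (a sketch using the indicator function $\mathbf{1}\{X \geq a\} $): since $X $ is nonnegative, the pointwise comparison $a \, \mathbf{1}\{X \geq a\} \leq X $ holds for every $a > 0 $, and taking expectations gives
$$a \, P(X \geq a) = E\big[ a \, \mathbf{1}\{X \geq a\} \big] \leq E[X]$$
Dividing by $a $ yields the inequality.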
Let $X$ be a random variable. Then $\forall a > 0$
$$\begin{align*} P(\abs{ X } \geq a) &\leq \frac{ E [ \abs{ X } ] }{ a } \\ P(\abs{ X } \geq a) &\leq \frac{ E [ \abs{ X }^n ] }{ a^n } \end{align*}$$
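Both bounds are instances of the same idea (sketched here): apply the Markov inequality to a suitable nonnegative variable. The first line applies it to $\abs{ X } $ directly; for the second, note that the events $\{\abs{ X } \geq a\} $ and $\{\abs{ X }^n \geq a^n\} $ coincide, so
$$P(\abs{ X } \geq a) = P(\abs{ X }^n \geq a^n) \leq \frac{ E [ \abs{ X }^n ] }{ a^n }$$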
If $X$ is a random variable with mean $\mu$ and variance $\sigma^2$, then
$$\forall c > 0, P(\abs{ X - \mu } \geq c) \leq \frac{ \sigma^2 }{ c^2 }$$
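This is again the Markov inequality in disguise, applied to the nonnegative variable $(X - \mu)^2 $ with threshold $c^2 $ (a sketch):
$$P(\abs{ X - \mu } \geq c) = P\big((X - \mu)^2 \geq c^2\big) \leq \frac{ E [(X - \mu)^2] }{ c^2 } = \frac{ \sigma^2 }{ c^2 }$$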
If $X$ is a random variable with mean $\mu$ and variance $\sigma^2$, then
$$\forall s>0, P(\abs{X- \mu} \geq s \sigma) \leq \frac{1}{s^2}$$
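This is simply the previous form of the Chebyshev inequality with the substitution $c = s \sigma $:
$$P(\abs{ X - \mu } \geq s \sigma) \leq \frac{ \sigma^2 }{ (s \sigma)^2 } = \frac{ 1 }{ s^2 }$$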
The Chebyshev inequality tends to be more powerful than the Markov inequality, since it exploits the additional information provided by the variance of $X$ to produce tighter bounds.
Still, the mean and the variance of a random variable are only a rough summary of its properties, so we cannot expect the bounds to be close approximations of the exact probabilities.
Let $X$ be uniformly distributed in $[0, 4] $, so that $\mu = 2 $ and $\sigma^2 = 4^2 / 12 = 4 / 3 $. Use the Chebyshev inequality to bound the probability that $\abs{ X - 2 } \geq 1 $. We have
$$P(\abs{ X - 2 } \geq 1) \leq \frac{ \sigma^2 }{ 1^2 } = \frac{ 4 }{ 3 } $$
Since the exact probability is $1/2 $, this example shows that the Chebyshev inequality sometimes provides bounds that are so loose (here, greater than $1 $) that they provide no information at all.
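As a quick sanity check, here is a minimal Monte Carlo sketch in Python (not part of the original example) comparing the empirical frequency of $\abs{ X - 2 } \geq 1 $ with the exact value $1/2 $ and the vacuous Chebyshev bound $4/3 $:

```python
# Monte Carlo check of the uniform example (illustrative sketch).
import random

n = 1_000_000
hits = sum(abs(random.uniform(0, 4) - 2) >= 1 for _ in range(n))

sigma2 = (4 - 0) ** 2 / 12       # variance of Uniform[0, 4] is 4/3
chebyshev_bound = sigma2 / 1**2  # threshold c = 1

print(f"empirical P(|X - 2| >= 1) ~ {hits / n:.3f}")       # about 0.5
print(f"Chebyshev bound            = {chebyshev_bound:.3f}")  # 1.333..., exceeds 1
```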
Let $X $ be exponentially distributed with parameter $\lambda = 1 $, so that $E[X] = \var{ X } = 1 $. For $c > 1 $, using the Chebyshev inequality, we obtain
$$P(X \geq c) = P(X - 1 \geq c - 1) \leq P(\abs{ X - 1 } \geq c - 1) \leq \frac{ 1 }{(c - 1)^2 }$$
Comparing to the exact answer $P(X \geq c) = e^{-c} $, this is a very conservative estimate.
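A small numerical sketch (with illustrative choices of $c $) makes the gap between the Chebyshev bound $1/(c-1)^2 $ and the exact tail $e^{-c} $ concrete:

```python
# Compare the Chebyshev bound with the exact tail for X ~ Exponential(1).
import math

for c in (2, 3, 5, 10):
    chebyshev = 1 / (c - 1) ** 2
    exact = math.exp(-c)
    print(f"c = {c:2d}: Chebyshev bound = {chebyshev:.4f}, exact P(X >= c) = {exact:.6f}")
```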
When $X $ is known to take values in a range $[a, b] $, we claim (see the sketch below) that
$$\sigma^2 \leq (b-a)^2 / 4$$
Thus if $\sigma^2$ is unknown, we may use the bound $(b -a)^2 /4 $ in place of $\sigma^2 $ in the Chebyshev inequality, and obtain
$$P(\abs{ X - \mu } \geq c) \leq \frac{(b - a)^2 }{ 4 c^2 }, \forall c > 0 $$
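A short sketch of why the claim holds: the mean $\mu $ minimizes the mean squared deviation $E[(X - c)^2] $ over $c $, so comparing with the midpoint of the range,
$$\sigma^2 = E \big[(X - \mu)^2\big] \leq E \Big[ \Big(X - \tfrac{a + b}{2}\Big)^2 \Big] \leq \Big(\frac{b - a}{2}\Big)^2 = \frac{(b - a)^2}{4}$$
where the last step uses $\abs{ X - (a+b)/2 } \leq (b - a)/2 $ whenever $X \in [a, b] $.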
Let $X $ be a random variable. Then, $\forall s > 0 $ and $\forall a \in \mathbb{R} $,
$$P(X \geq a) \leq e^{ - sa } M_X(s)$$
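This follows from the Markov inequality applied to the nonnegative variable $e^{sX} $ (a sketch): since $s > 0 $, the events $\{X \geq a\} $ and $\{e^{sX} \geq e^{sa}\} $ coincide, so
$$P(X \geq a) = P\big(e^{sX} \geq e^{sa}\big) \leq \frac{ E [e^{sX}] }{ e^{sa} } = e^{-sa} M_X(s)$$
where $M_X(s) = E[e^{sX}] $ is the moment generating function of $X $.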