The conditional PDF of a continuous random variable $X $, given an event $A $ with $\b{P}(A) > 0 $, is defined as a nonnegative function $f_{X|A} $ that satisfies
$$\b{P}(X \in B | A) = \int_{ B } f_{X|A}(x) \d x $$
for any subset $B $ of the real line.
Taking $B $ to be the entire real line gives the normalization property
$$\int_{ -\infty }^{ +\infty } f_{X|A}(x) \d x = 1$$
In particular, if we condition on an event of the form $\set{ X \in A } $, with $\b{P}(X \in A) > 0 $, the definition of conditional probability yields
$$\b{P}(X \in B | X \in A) = \frac{ \b{P}(X \in B, X \in A)}{ \b{P}(X \in A)} = \frac{ \int_{ A \cap B } f_X(x) \d x }{ \b{P}(X \in A)} $$
We conclude that
$$ f_{X | \set{ X \in A }}(x) = \begin{cases} \frac{ f_X(x)}{ \b{P}(X \in A)}, \if x \in A \br 0, \otherwise \end{cases} $$
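As a numerical sketch of this formula (an assumed example, not from the source): take $X \sim $ Exponential(1) and $A = [1, 2]$, restrict and renormalize the density, and check that the conditional PDF integrates to 1.

```python
import math

# Assumed example: X ~ Exponential(1), conditioned on the event A = {1 <= X <= 2}.
def f_X(x):
    return math.exp(-x) if x >= 0 else 0.0

a, b = 1.0, 2.0
p_A = math.exp(-a) - math.exp(-b)  # P(X in A) for Exponential(1)

def f_X_given_A(x):
    # f_{X|A}(x) = f_X(x) / P(X in A) on A, and 0 elsewhere
    return f_X(x) / p_A if a <= x <= b else 0.0

# Check normalization by integrating f_{X|A} over A (midpoint rule).
n = 100_000
h = (b - a) / n
total = sum(f_X_given_A(a + (i + 0.5) * h) for i in range(n)) * h
print(round(total, 6))  # should be close to 1
```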
Let $A_1, A_2, …, A_{ n } $ be disjoint events that form a partition of the sample space, and assume that $\b{P}(A_i) > 0 $ for all $i $. Then,
$$f_X(x) = \sum_{ i=1 }^{ n } \b{P}(A_i)f_{X|A_i}(x) \quad \text{(total probability theorem)}$$
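A small Monte Carlo sketch of the total probability theorem (the mixture below is an assumed example): partition by a fair coin, draw $X$ from a different exponential on each side, and compare a simulated probability against the integral of the mixture density.

```python
import math, random

# Assumed example: A_1 = heads (prob 0.5), X | A_1 ~ Exponential(1);
#                  A_2 = tails (prob 0.5), X | A_2 ~ Exponential(2).
random.seed(0)

def f_X(x):
    # f_X(x) = P(A_1) f_{X|A_1}(x) + P(A_2) f_{X|A_2}(x)
    return 0.5 * math.exp(-x) + 0.5 * 2.0 * math.exp(-2.0 * x)

# Monte Carlo estimate of P(X <= 1) from the two-stage experiment.
n = 200_000
count = 0
for _ in range(n):
    rate = 1.0 if random.random() < 0.5 else 2.0
    if random.expovariate(rate) <= 1.0:
        count += 1
mc = count / n

# Closed-form P(X <= 1) obtained by integrating f_X over [0, 1].
exact = 0.5 * (1 - math.exp(-1.0)) + 0.5 * (1 - math.exp(-2.0))
print(round(exact, 4))  # 0.7484
print(abs(mc - exact) < 0.01)
```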
Let $X $ and $Y $ be jointly continuous random variables with joint PDF $f_{X,Y} $.
The joint, marginal, and conditional PDFs are related to each other by the formulas:
$$\begin{align*} f_{X, Y}(x, y) &= f_Y(y)f_{X|Y}(x|y) \\ f_X(x) &= \int_{ -\infty }^{ +\infty } f_Y(y) f_{X|Y}(x | y) \d y \end{align*}$$
The conditional PDF $f_{X|Y}(x|y)$ is defined only for those $y $ for which $f_Y(y) > 0 $.
We have
$$\b{P}(X \in A | Y = y) = \int_{ A } f_{X|Y}(x|y) \d x$$
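A concrete sketch of this relation, assuming the joint PDF $f_{X,Y}(x,y) = 2$ on the triangle $0 \leq x \leq y \leq 1$ (an illustrative choice, not from the source). There $f_Y(y) = 2y$, so $X$ given $Y = y$ is uniform on $[0, y]$, and $\b{P}(X \leq 0.25 \mid Y = 0.5)$ should come out to $0.5$.

```python
# Assumed joint PDF: f_{X,Y}(x, y) = 2 on the triangle 0 <= x <= y <= 1.
def f_XY(x, y):
    return 2.0 if 0.0 <= x <= y <= 1.0 else 0.0

def f_Y(y):
    # marginal of Y: integrating the joint over x gives 2y on [0, 1]
    return 2.0 * y if 0.0 <= y <= 1.0 else 0.0

def f_X_given_Y(x, y):
    # conditional PDF, defined only where f_Y(y) > 0
    return f_XY(x, y) / f_Y(y) if f_Y(y) > 0 else 0.0

# P(X <= 0.25 | Y = 0.5) = integral of f_{X|Y}(x | 0.5) over [0, 0.25]
y = 0.5
n = 100_000
h = 0.25 / n
prob = sum(f_X_given_Y((i + 0.5) * h, y) for i in range(n)) * h
print(round(prob, 4))  # 0.5
```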
Let $X $ and $Y $ be jointly continuous random variables, and let $A $ be an event with $\b{P}(A) > 0 $.
The conditional expectation of $X $ given the event $A $ is defined by
$$E[X|A] = \int_{ -\infty }^{ +\infty } x f_{X|A}(x) \d x $$
The conditional expectation of $X $ given that $Y = y$ is defined by
$E[X|Y = y] = \int_{ -\infty }^{ +\infty } x f_{X|Y} (x | y) \d x$
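To see the definition in action, here is a sketch under the same assumed joint PDF $f_{X,Y}(x,y) = 2$ on $0 \leq x \leq y \leq 1$: then $X$ given $Y = y$ is uniform on $[0, y]$, so $E[X \mid Y = y] = y/2$, which the numerical integral should reproduce.

```python
# Assumed joint: f_{X,Y}(x, y) = 2 on 0 <= x <= y <= 1, so X | Y = y ~ Uniform[0, y].
def f_X_given_Y(x, y):
    return 1.0 / y if 0.0 <= x <= y <= 1.0 else 0.0

def cond_expectation(y, n=100_000):
    # numerically integrate x * f_{X|Y}(x | y) over [0, y] (midpoint rule)
    h = y / n
    return sum((i + 0.5) * h * f_X_given_Y((i + 0.5) * h, y) for i in range(n)) * h

print(round(cond_expectation(0.8), 4))  # 0.4, matching y / 2
```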
Let $X $ and $Y $ be jointly continuous random variables, and let $A $ be an event with $\b{P}(A) > 0 $.
The expectation of $g(X)$ given the event $A $ is defined by
$$E[g(X)|A] = \int_{ -\infty }^{ +\infty } g(x) f_{X|A}(x) \d x $$
The expectation of $g(X)$ given that $Y = y$ is defined by
$E[g(X)|Y = y] = \int_{ -\infty }^{ +\infty } g(x) f_{X|Y} (x | y) \d x$
Discrete case: if $X $ takes non-negative integer values, then
$$E[X] = \sum_{ k = 1 }^{ \infty } \b{P}(X \geq k) = \sum_{ k = 0 }^{ \infty } \b{P}(X > k) $$
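A quick numerical check of the tail-sum formula on a Geometric($p$) variable supported on $\set{1, 2, …}$ (an assumed example): $\b{P}(X \geq k) = (1-p)^{k-1}$, so the sum should equal $E[X] = 1/p$.

```python
# Assumed example: X ~ Geometric(p) on {1, 2, ...}, P(X >= k) = (1 - p)**(k - 1).
p = 0.3
tail_sum = sum((1 - p) ** (k - 1) for k in range(1, 200))  # truncated sum of P(X >= k)
print(round(tail_sum, 6))  # close to 1/p ≈ 3.333333
```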
Continuous case: if $X \geq 0 $, then
$$E[X] = \int_{ 0 }^{ \infty } \b{P}(X > x) \d x = \int_{ 0 }^{ \infty } (1 - F_X(x)) \d x$$
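A sketch checking the tail-probability formula $E[X] = \int_0^{\infty} \b{P}(X > x) \d x$ on an assumed $X \sim $ Exponential($\lambda$), where $\b{P}(X > x) = e^{-\lambda x}$ and the integral should equal $1/\lambda$.

```python
import math

# Assumed example: X ~ Exponential(lam), so P(X > x) = exp(-lam * x) for x >= 0.
lam = 2.0
n = 200_000
h = 20.0 / n  # truncate the integral at x = 20; the tail beyond is negligible
integral = sum(math.exp(-lam * (i + 0.5) * h) for i in range(n)) * h
print(round(integral, 4))  # close to 1/lam = 0.5
```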
Let $A_1, A_2, …, A_{ n } $ be disjoint events that form a partition of the sample space, and assume that $\b{P}(A_i) > 0$ for all $i$. Then,
$$E[X] = \sum_{ i=1 }^{ n } \b{P}(A_i) E[X | A_i]$$
Similarly,
$$E[X] = \int_{ -\infty }^{ +\infty } E[X | Y = y] f_Y(y) \d y $$
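The discrete version of the total expectation theorem can be sketched on an assumed two-event partition: a fair coin decides between Exponential(1) (mean 1) and Exponential(2) (mean 0.5), so the theorem predicts $E[X] = 0.5 \cdot 1 + 0.5 \cdot 0.5 = 0.75$, which a simulation should match.

```python
import random

# Assumed mixture: A_1 = heads (prob 0.5), X | A_1 ~ Exponential(1) => E[X | A_1] = 1.0
#                  A_2 = tails (prob 0.5), X | A_2 ~ Exponential(2) => E[X | A_2] = 0.5
random.seed(1)
n = 400_000
total = 0.0
for _ in range(n):
    rate = 1.0 if random.random() < 0.5 else 2.0
    total += random.expovariate(rate)
mc_mean = total / n

theorem = 0.5 * 1.0 + 0.5 * 0.5  # sum of P(A_i) * E[X | A_i]
print(round(theorem, 2))  # 0.75
print(abs(mc_mean - theorem) < 0.01)
```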