We say that random variable $X$ is independent of the event $A$ if
$$\b{P}(\{X = x\} \cap A) = \b{P}(X = x)\b{P}(A) = p_X(x)\b{P}(A), \forall x $$
It follows that if $\b{P}(A) \neq 0$, independence is the same as the condition
$$p_{X|A} (x) = p_X(x), \forall x $$
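As a concrete sanity check, here is a minimal Python sketch (the die example and all names in it are my own illustration, not from the text): the parity of a fair die roll is independent of the event $A = \{\text{roll} \leq 4\}$, and the defining identity above holds exactly for every $x$.

```python
from fractions import Fraction

# Fair six-sided die; X = parity of the roll, A = {roll <= 4}.
outcomes = range(1, 7)
p = Fraction(1, 6)                      # each outcome is equally likely

A = {w for w in outcomes if w <= 4}     # the conditioning event
P_A = len(A) * p                        # P(A) = 2/3

for x in (0, 1):                        # the two possible values of X = roll % 2
    p_X = sum(p for w in outcomes if w % 2 == x)   # p_X(x)
    p_joint = sum(p for w in A if w % 2 == x)      # P({X = x} ∩ A)
    assert p_joint == p_X * P_A                    # independence holds for this x
print("X (parity) is independent of A = {roll <= 4}")
```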
We say that two random variables $X$ and $Y$ are independent if
$$p_{X,Y}(x, y) = p_X(x)p_Y(y), \forall x, y$$
It follows that
$$p_{X|Y}(x|y) = p_X(x), \forall x, \forall y \text{ with } p_Y(y) \neq 0$$
$$p_{Y|X}(y|x) = p_Y(y), \forall y, \forall x \text{ with } p_X(x) \neq 0$$
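To see the factorization in action, the following sketch (with arbitrary toy pmfs of my own choosing) builds the joint pmf of two independent variables as the product of the marginals and verifies that $p_{X|Y}(x|y)$ collapses to $p_X(x)$:

```python
from fractions import Fraction

p_X = {0: Fraction(1, 4), 1: Fraction(3, 4)}     # marginal pmf of X (a biased coin)
p_Y = {0: Fraction(2, 5), 1: Fraction(3, 5)}     # marginal pmf of Y, independent of X

# Under independence, the joint pmf factors into the product of the marginals.
p_XY = {(x, y): p_X[x] * p_Y[y] for x in p_X for y in p_Y}

# p_{X|Y}(x|y) = p_{X,Y}(x, y) / p_Y(y) then reduces to p_X(x) wherever p_Y(y) != 0.
for y in p_Y:
    for x in p_X:
        assert p_XY[(x, y)] / p_Y[y] == p_X[x]
print("p_{X|Y}(x|y) = p_X(x) for every x and y")
```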
Given an event $A$ with $\b{P}(A) \neq 0$, we say that random variables $X$ and $Y$ are conditionally independent given $A$ if
$$\b{P}(X=x,Y=y|A)=\b{P}(X=x|A)\b{P}(Y=y|A), \forall x,y$$
or in another notation,
$$p_{X,Y|A}(x, y) = p_{X|A}(x)p_{Y|A}(y), \forall x, y$$
Once more, this is equivalent to
$$p_{X|Y, A} (x|y) = p_{X|A}(x), \forall x, \forall y \text{ with } p_{Y|A}(y) > 0$$
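Note that conditional independence given $A$ does not, in general, imply unconditional independence. Here is a hedged Python sketch of the classic two-coin example (the coin biases and the uniform mixture are my own choices): pick one of two coins uniformly at random, let $A$ be the event that the fair coin was picked, and flip the chosen coin twice. Given $A$, the flips $X$ and $Y$ are independent by construction, yet unconditionally they are dependent, since each flip carries information about which coin was picked.

```python
from fractions import Fraction

# Pick one of two coins uniformly at random; A = "the fair coin was picked".
# Given the choice, flip it twice: X and Y are the outcomes (1 = heads).
P_A = Fraction(1, 2)
q_fair, q_biased = Fraction(1, 2), Fraction(9, 10)   # P(heads) for each coin

def p_flip(h, bias):                   # pmf of a single flip given the coin's bias
    return bias if h == 1 else 1 - bias

for x in (0, 1):
    for y in (0, 1):
        # Given A, the flips are independent: p_{X,Y|A} factors by construction.
        # Unconditionally, the joint pmf is a mixture over the two coins:
        p_joint = P_A * p_flip(x, q_fair) * p_flip(y, q_fair) \
                + (1 - P_A) * p_flip(x, q_biased) * p_flip(y, q_biased)
        p_x = P_A * p_flip(x, q_fair) + (1 - P_A) * p_flip(x, q_biased)
        p_y = P_A * p_flip(y, q_fair) + (1 - P_A) * p_flip(y, q_biased)
        assert p_joint != p_x * p_y    # X and Y are dependent without conditioning
print("X, Y conditionally independent given A, yet dependent unconditionally")
```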
If $X$ and $Y$ are independent random variables, then
$$E\brac{ XY } = E\brac{ X } E\brac{ Y } $$
as shown by the following calculation, where the second equality uses the independence of $X$ and $Y$:
$$\begin{align*} E\brac{ XY } &= \sum_{ x } \sum_{ y } xy p_{X,Y}(x, y) \br &= \sum_{ x } \sum_{ y } xy p_X(x) p_Y(y) \br &= \sum_{ x } x p_X(x) \sum_{ y } y p_Y(y) \br &= E\brac{ X } E\brac{ Y } \end{align*}$$
More generally, for any functions $g$ and $h$,
$$E\brac{ g(X) h(Y)} = E\brac{ g(X)} E\brac{ h(Y)} $$
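Both product rules can be checked exactly. In the sketch below, the pmfs and the functions $g$ and $h$ are arbitrary illustrative choices of mine:

```python
from fractions import Fraction

p_X = {1: Fraction(1, 3), 2: Fraction(1, 6), 5: Fraction(1, 2)}   # toy pmf of X
p_Y = {0: Fraction(1, 4), 3: Fraction(3, 4)}                      # toy pmf of Y

def E(f, pmf):                          # expectation of f(V) for V with this pmf
    return sum(f(v) * p for v, p in pmf.items())

# For independent X and Y, the joint pmf is the product of the marginals,
# so E[XY] splits into E[X]E[Y]:
E_XY = sum(x * y * p_X[x] * p_Y[y] for x in p_X for y in p_Y)
assert E_XY == E(lambda v: v, p_X) * E(lambda v: v, p_Y)

# The same factorization holds for g(X)h(Y); g and h here are arbitrary examples.
g, h = (lambda v: v * v), (lambda v: v + 1)
E_gh = sum(g(x) * h(y) * p_X[x] * p_Y[y] for x in p_X for y in p_Y)
assert E_gh == E(g, p_X) * E(h, p_Y)
print("E[XY] = E[X]E[Y] and E[g(X)h(Y)] = E[g(X)]E[h(Y)]")
```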
Let $X$ and $Y$ be two independent random variables. Let $\tilde X = X - E\brac{ X } $ and $\tilde Y = Y - E\brac{ Y } $. Note that $\tilde X$ and $\tilde Y$ are also independent and have zero mean, so $E\brac{ \tilde X \tilde Y } = E\brac{ \tilde X } E\brac{ \tilde Y } = 0$; moreover, shifting a random variable by a constant does not change its variance. We have
$$\begin{align*} \var{ X+Y } &= \var{ \tilde X + \tilde Y } \br &= E\brac{(\tilde X + \tilde Y)^2 } \br &= E\brac{{\tilde X}^2 + 2 \tilde X \tilde Y + {\tilde Y}^2 } \br &= E\brac{{\tilde X}^2 } + 2 E\brac{ \tilde X \tilde Y } + E\brac{{\tilde Y}^2 } \br &= E\brac{{\tilde X}^2 } + E\brac{{\tilde Y}^2 } \br &= \var{ \tilde X } + \var{ \tilde Y } \br &= \var{ X } + \var{ Y } \end{align*}$$
In conclusion, the variance of the sum of two independent random variables is equal to the sum of their variances.
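As a final check, the sketch below (again with toy pmfs of my own choosing) computes the pmf of $X + Y$ by convolving the marginals, which is valid under independence, and verifies the variance identity exactly:

```python
from fractions import Fraction
from collections import defaultdict

p_X = {0: Fraction(1, 2), 1: Fraction(1, 3), 4: Fraction(1, 6)}   # toy pmf of X
p_Y = {-1: Fraction(2, 5), 2: Fraction(3, 5)}                     # toy pmf of Y

def mean(pmf):
    return sum(v * p for v, p in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum((v - m) ** 2 * p for v, p in pmf.items())

# Since X and Y are independent, the pmf of X + Y is the convolution of the marginals.
p_sum = defaultdict(Fraction)           # Fraction() == 0, so missing keys start at 0
for x, px in p_X.items():
    for y, py in p_Y.items():
        p_sum[x + y] += px * py

assert var(p_sum) == var(p_X) + var(p_Y)
print("var(X + Y) = var(X) + var(Y) =", var(p_X) + var(p_Y))
```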