Let $X$ be a continuous random variable with PDF $f_X : \R \to [0, +\infty)$, and let $g: \R \to \R$ be a function. We want to determine the distribution of the derived random variable $Y = g(X)$.
We first calculate the CDF $F_Y(y)$:
$$F_Y(y) = \int_{\set{ x : g(x) \leq y}} f_X(x) \d x$$
Assuming $F_Y(y)$ is differentiable, then
$$f_Y(y) = \der{F_Y}{y}(y)$$
$X \sim \text{Uniform}(-1, 1]$, $Y = X^3$. Find the distribution of $Y$.
$$f_X(x) = \begin{cases} \frac{ 1 }{ 2 } & x \in (-1, 1] \br 0 & x \not \in (-1, 1] \end{cases}$$
$$\begin{align} F_Y(y) =& \int_{\set{ x: x^3 \leq y }} f_X(x) \d x = \int_{ -\infty }^{ \sqrt[ 3 ]{y}} f_X(x) \d x \br =& \begin{cases} 0& y < -1 \br \frac{ 1 }{ 2 } \sqrt[ 3 ]{ y } + \frac{ 1 }{ 2 } & y \in (-1, 1] \br 1 & y > 1 \end{cases} \end{align}$$
$$f_Y(y) =\der{ }{ y } F_Y(y) = \begin{cases} \frac{ 1 }{ 2 } \abs{ \frac{ 1 }{ 3 } y ^{ - \frac{ 2 }{ 3 }}}=\frac{ 1 }{ 6 } y ^{ - \frac{ 2 }{ 3 }} & y \in (-1, 1] \br 0 & y \not \in (-1, 1] \end{cases}$$
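To sanity-check the derived CDF, here is a minimal Monte Carlo sketch (NumPy assumed; variable names are illustrative): sample $X$, transform to $Y = X^3$, and compare the empirical CDF against $F_Y(t) = \frac{1}{2}\sqrt[3]{t} + \frac{1}{2}$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample X ~ Uniform(-1, 1] and transform: Y = X^3.
x = rng.uniform(-1.0, 1.0, size=200_000)
y = x**3

# Derived CDF on (-1, 1]: F_Y(t) = (t^(1/3) + 1) / 2.
def F_Y(t):
    return (np.cbrt(t) + 1.0) / 2.0

# Empirical CDF should match the analytic one at a few test points.
for t in (-0.5, 0.0, 0.3, 0.8):
    assert abs(np.mean(y <= t) - F_Y(t)) < 0.01
```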
Let $f: I \to J$, where $I, J \subseteq \R$.
$f$ is strictly increasing if $f(x) < f(y)$ for all $x, y \in I$ with $x < y$.
$f$ is strictly decreasing if $f(x) > f(y)$ for all $x, y \in I$ with $x < y$.
$f$ is strictly monotonic if it is either strictly increasing or strictly decreasing.
If $g: I \to J$ is a strictly monotonic function mapping $I$ onto $J$, then there exists an inverse $h: J \to I$ such that $g(h(y)) = y$ for all $y \in J$.
Assuming $g$ and $h$ are differentiable, differentiating $g(h(y)) = y$ with the chain rule gives
$$g'(h(y)) h'(y) = 1, \quad y \in J$$
and therefore
$$h'(y) = \frac{ 1 }{ g'(h(y))}$$
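As a quick sanity check (a hypothetical example, not from the notes above): take $g(x) = e^x$ with $I = \R$, $J = (0, +\infty)$. Then $h(y) = \ln y$, and indeed
$$h'(y) = \frac{ 1 }{ y } = \frac{ 1 }{ e^{\ln y} } = \frac{ 1 }{ g'(h(y))}$$
as the formula predicts.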
Let $X$ be a continuous random variable such that $F_X$ is differentiable.
Let $I, J$ be open intervals in $\R$, $g: I \to J$ be a strictly monotonic differentiable function. That is
$$g'(x) > 0\text{ or } g'(x) < 0, \forall x \in I$$
Let $h: J \to I$ be the inverse of $g$. Notice that $h$ is also differentiable.
Then, the PDF of $Y = g(X)$ in the region where $f_Y(y) > 0$ is given by
$$f_Y(y) = f_X(h(y)) \abs{h'(y)} = f_X(h(y)) \frac{ 1 }{ \abs{ g'(h(y))}}, \forall y \in J$$
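As a numerical check of this formula (a hypothetical example; NumPy assumed): take $X \sim \text{Uniform}(0, 1)$ and the strictly decreasing map $g(x) = -\ln(x)/\lambda$, whose inverse is $h(y) = e^{-\lambda y}$. The formula gives $f_Y(y) = f_X(h(y)) \abs{h'(y)} = \lambda e^{-\lambda y}$, i.e. $Y \sim \text{Exp}(\lambda)$.

```python
import numpy as np

lam = 2.0  # illustrative rate parameter
rng = np.random.default_rng(1)

# X ~ Uniform(0, 1); g(x) = -ln(x)/lam is strictly decreasing on (0, 1).
x = rng.uniform(0.0, 1.0, size=200_000)
y = -np.log(x) / lam

# By the change-of-variables formula, Y ~ Exp(lam), so its CDF is
# F_Y(t) = 1 - exp(-lam * t).  Check against the empirical CDF.
for t in (0.1, 0.5, 1.0, 2.0):
    assert abs(np.mean(y <= t) - (1 - np.exp(-lam * t))) < 0.01
```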
If $g$ is increasing,
$$\begin{align*} F_Y(y) &= \b{P}(Y \leq y) = \b{P}(g(X) \leq y) = \b{P}(X \leq h(y)) \br &= \int_{-\infty}^{h(y)}f_X(x)\d x \br &= F_X(h(y)) \end{align*}$$
$f_Y(y) = \der{}{y}F_X(h(y)) = f_X(h(y))h'(y)$.
If $g$ is decreasing,
$$\begin{align*} F_Y(y) &= \b{P}(Y \leq y) = \b{P}(g(X) \leq y) = \b{P}(X \geq h(y)) \br &= 1 - F_X(h(y)) \end{align*}$$
$f_Y(y) = \der{}{y} (1- F_X(h(y))) = - f_X(h(y))h'(y)$. Since $h' < 0$ when $g$ is decreasing, both cases combine into $f_Y(y) = f_X(h(y)) \abs{h'(y)}$.
$T_A \sim \text{Exp}(\lambda), T_B \sim \text{Exp}(\mu)$, with $\lambda, \mu > 0$. Suppose $T_A, T_B$ are independent, and let $X = \max(T_A, T_B)$, $Y = \min(T_A, T_B)$. We want to find $F_X$ and $F_Y$.
$\begin{align*} F_X(y)=& \b{P}(\max(T_A, T_B) \leq y) \br =& \b{P}(T_A \leq y, T_B \leq y) \br =& \b{P}(T_A \leq y) \b{P}(T_B \leq y) \br =& F_{T_A}(y)F_{T_B}(y) \br =& (1 - e^{- \lambda y})(1 - e^{ - \mu y }), \quad y \geq 0 \end{align*}$
$f_X(y) = \der{ }{ y }F_X(y) = \lambda e^{-\lambda y}(1 - e^{-\mu y}) + \mu e^{-\mu y}(1 - e^{-\lambda y}), \quad y \geq 0$
$\begin{align*} F_Y(y) =& \b{P}(\min(T_A, T_B) \leq y) \br =& 1 - \b{P}(\min(T_A, T_B) > y) \br =& 1 - \b{P}(T_A > y, T_B > y) \br =& 1 - \b{P}(T_A > y) \b{P}(T_B > y) \br =& 1 - (1 - F_{T_A}(y))(1 - F_{T_B}(y)) \br =& 1 - e^{-\lambda y}e^{-\mu y} = 1 - e ^{- (\lambda + \mu)y}, \quad y \geq 0 \end{align*}$
Note, $\min(T_A, T_B) \sim \text{Exp}(\lambda + \mu)$.
Since $\max(T_A, T_B) + \min(T_A, T_B) = T_A + T_B$,
$E(X) = E(T_A + T_B - \min(T_A, T_B)) = \frac{ 1 }{ \lambda } + \frac{ 1 }{ \mu } - \frac{ 1 }{ \lambda + \mu }$
$E(Y) = \frac{ 1 }{ \lambda + \mu } $
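A quick Monte Carlo check of the two expectations (illustrative rates $\lambda = 1$, $\mu = 2$; NumPy assumed):

```python
import numpy as np

lam, mu = 1.0, 2.0  # illustrative rates
rng = np.random.default_rng(2)
n = 500_000

# NumPy's exponential sampler is parameterized by the scale 1/rate.
ta = rng.exponential(1 / lam, size=n)
tb = rng.exponential(1 / mu, size=n)

# E[max] = 1/lam + 1/mu - 1/(lam + mu),  E[min] = 1/(lam + mu).
assert abs(np.mean(np.maximum(ta, tb)) - (1/lam + 1/mu - 1/(lam + mu))) < 0.01
assert abs(np.mean(np.minimum(ta, tb)) - 1/(lam + mu)) < 0.01
```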
$X = \max(T_A, T_B)$
$Y = \min(T_A, T_B)$
$T_A, T_B$ independent. In terms of the joint density,
$F_Y(y) = \iint_{ \set{(a, b): \min(a, b) \leq y }} f_{T_A, T_B}(a, b) \d a \d b = \iint_{ \set{(a, b): \min(a, b) \leq y }}f_{T_A}(a) f_{T_B}(b) \d a \d b$
$\begin{align*} F_Y(y) &= \b{P}(\min(T_A, T_B) \leq y) = 1 - \b{P}(\min(T_A, T_B) > y) \br &= 1 - \b{P}(T_A > y, T_B > y) = 1- \b{P}(T_A > y)\b{P}(T_B > y) \br &= 1 - (1 - F_{T_A}(y))(1 - F_{T_B}(y)) \end{align*}$
Suppose $X, Y $ are two independent integer-valued random variables. Then the probability mass function of $X + Y $ is given by
for $t \in \Z,$
$\begin{align*} P_{X+Y}(t) \br =& \b{P}(X+Y = t) \br =& \sum_{ x \in \Z } \b{P}(X = x) \b{P}(X + Y = t | X = x) \quad \text{(total probability theorem)} \br =& \sum_{ x \in \Z } \b{P}(X = x) \b{P}(Y = t - x | X = x) \br =& \sum_{ x \in \Z } \b{P}(X = x) \b{P}(Y = t - x) \quad \text{(independence)} \br =& \sum_{ x \in \Z } P_X(x)P_Y(t- x) \end{align*}$
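For instance (a hypothetical example, NumPy assumed), the PMF of the sum of two independent fair dice is the convolution of their PMFs, which `np.convolve` computes directly:

```python
import numpy as np

# PMF of one fair die: p[i] = P(die = i + 1) = 1/6 for faces 1..6.
p = np.full(6, 1/6)

# PMF of the sum of two independent dice: the convolution of the PMFs.
# p_sum[k] = P(sum = k + 2), so the support is {2, ..., 12}.
p_sum = np.convolve(p, p)

assert np.isclose(p_sum.sum(), 1.0)   # still a valid PMF
assert np.isclose(p_sum[5], 6/36)     # P(sum = 7) = 6/36
```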
Let $X, Y $ be two independent random variables such that $F_{X+Y}(t)$ is differentiable for all $t \in \R$. Then
$$f_{X+Y}(t) = (f_X * f_Y)(t), \quad \forall t \in \R $$
(discrete case)
Let $Z = X + Y $, where $X$ and $Y$ are independent integer-valued random variables with PMFs $p_X $ and $p_Y $, respectively. Then, for any integer $z $,
$$\begin{align*} p_Z(z) &= \b{P}(X + Y = z) \br &= \sum_{ \set{(x, y) | x + y = z }} \b{P}(X = x, Y = y) \br &= \sum_{ x } \b{P}(X = x, Y = z - x) \br &= \sum_{ x } p_X(x)p_Y(z - x) \end{align*}$$
The resulting PMF $p_Z $ is called the convolution of the PMFs of $X $ and $Y $.
(continuous case)
Suppose now that $X $ and $Y $ are independent continuous random variables with PDFs $f_X$ and $f_Y $, respectively. Then, to find the PDF of $Z $, we first note that
$$\begin{align*} \b{P}(Z \leq z | X = x) =& \b{P}(X + Y \leq z | X = x) \br =& \b{P}(x + Y \leq z | X = x) \br =& \b{P}(x + Y \leq z) \br =& \b{P}(Y \leq z - x) \end{align*}$$
By differentiating both sides with respect to $z$, we see that $f_{Z|X}(z|x) = f_Y(z - x) $. Using the multiplication rule, we have
$$\begin{align*} f_{X, Z}(x, z) =& f_X(x) f_{Z|X}(z|x) \br =& f_X(x)f_Y(z- x) \end{align*}$$
Then,
$$\begin{align*} f_Z(z) =& \int_{ -\infty }^{ +\infty } f_{X, Z} (x, z) \d x \br =& \int_{ -\infty }^{ +\infty } f_X(x) f_Y(z - x) \d x \end{align*}$$
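As an illustration (not from the notes above; NumPy assumed): discretizing two $\text{Uniform}(0,1)$ densities and convolving them numerically recovers the triangular density of the sum, $f_Z(z) = z$ on $[0, 1]$ and $2 - z$ on $[1, 2]$.

```python
import numpy as np

# Discretize f_X = f_Y = 1 on (0, 1) and approximate the convolution
# integral f_Z(z) = ∫ f_X(x) f_Y(z - x) dx by a Riemann sum.
grid = np.linspace(0.0, 2.0, 2001)
dx = grid[1] - grid[0]
f = np.where(grid <= 1.0, 1.0, 0.0)

f_Z = np.convolve(f, f) * dx          # values at z = 0, dx, 2*dx, ...
z = np.arange(len(f_Z)) * dx

# Triangular PDF of the sum of two independent Uniform(0,1) variables.
tri = np.where(z <= 1.0, z, np.maximum(2.0 - z, 0.0))
assert np.max(np.abs(f_Z - tri)) < 0.01
```

The discretized convolution matches the analytic triangular density up to the grid spacing, which is the expected Riemann-sum error.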