Let $\b{X} = (X_1, X_2, …, X_n)$ be a vector of $n$ discrete random variables, and let $\b{x} = (x_1, x_2, …, x_n)$ be a vector of possible values of these random variables. The PMF of the event
$$\set{ \b{X} = \b{x}} = \set{X_1 = x_1, X_2 = x_2, …, X_n = x_n}$$
or more precisely
$$\bigcap_{i=1}^n \set{X_i = x_i}$$
is the joint PMF, denoted as follows
$$p_{ \b{X}}(\b{x}) = \b{P}(\b{X} = \b{x})$$
The joint PMF determines the probability of any event $A$ that can be specified in terms of the random variables $X_1, X_2, …, X_n$
$$\b{P}(\b{X} \in A) = \sum_{ \b{x} \in A } p_{ \b{X}}(\b{x})$$
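The sum over $\b{x} \in A$ can be sketched numerically. Below, a hypothetical joint PMF for two fair coin flips (my own illustrative choice, not from the text) is stored as a dictionary mapping outcomes $\b{x}$ to probabilities, and an event $A$ is a predicate on outcomes:

```python
# Hypothetical joint PMF for (X1, X2): two independent fair coin flips,
# encoded as 0/1, so p_X(x) = 1/4 for each of the four outcomes.
joint_pmf = {
    (0, 0): 0.25, (0, 1): 0.25,
    (1, 0): 0.25, (1, 1): 0.25,
}

def prob_of_event(pmf, event):
    """P(X in A): sum p_X(x) over all outcomes x for which the event holds."""
    return sum(p for x, p in pmf.items() if event(x))

# Event A = "at least one flip is heads (1)"
p_A = prob_of_event(joint_pmf, lambda x: 1 in x)
print(p_A)  # 0.75
```

Because the joint PMF lists every outcome with its probability, any event expressible in terms of $X_1, …, X_n$ reduces to this filtered sum.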
Given a joint PMF $p_{\b{X}}$, we call $p_{X_i}(x_i)$, $i \in \set{ 1, 2, …, n }$, the marginal PMF of $X_i$
$$p_{X_i}(x_i) = \sum_{x_1} … \sum_{x_{i-1}}\sum_{x_{i+1}}… \sum_{x_n} p_{\b{X}} (\b{x})$$
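Marginalization is just this sum over all coordinates other than the $i$-th. A minimal sketch, using a small hypothetical joint PMF with dependent coordinates (chosen here for illustration):

```python
from collections import defaultdict

def marginal(pmf, i):
    """Marginal PMF of X_i: sum the joint PMF over all other coordinates."""
    out = defaultdict(float)
    for x, p in pmf.items():
        out[x[i]] += p   # group outcomes by their i-th coordinate
    return dict(out)

# Hypothetical joint PMF of (X1, X2); note X1 and X2 are not independent here.
joint_pmf = {(0, 0): 0.5, (0, 1): 0.2, (1, 1): 0.3}

p_X1 = marginal(joint_pmf, 0)
print(p_X1)  # {0: 0.7, 1: 0.3}
```

Grouping by the $i$-th coordinate implements the nested sums in one pass: each outcome $\b{x}$ contributes its probability to the bucket $x_i$.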
Let $X_1, X_2, …, X_n$ be $n$ discrete random variables, collected in the vector $\b{X}$, and let $Y = g(\b{X})$ be another random variable. Its PMF can be calculated from the joint PMF according to
$$p_Y(y) = \sum_{\set{ \b{x} | g(\b{x}) = y}} p_{ \b{X}}(\b{x})$$
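The formula partitions the outcomes $\b{x}$ by the value $g(\b{x})$ takes. A sketch under the same hypothetical two-coin joint PMF as above, with $g$ taken to be the sum of the flips:

```python
from collections import defaultdict

def pmf_of_function(pmf, g):
    """p_Y(y): sum p_X(x) over all outcomes x with g(x) = y."""
    out = defaultdict(float)
    for x, p in pmf.items():
        out[g(x)] += p   # each x lands in exactly one preimage set {x | g(x) = y}
    return dict(out)

# Hypothetical joint PMF: two independent fair coin flips encoded as 0/1.
joint_pmf = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

# Y = X1 + X2, the number of heads.
p_Y = pmf_of_function(joint_pmf, sum)
print(p_Y)  # {0: 0.25, 1: 0.5, 2: 0.25}
```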
The expected value rule for functions naturally extends and takes the form
$$E [g(\b{X})] = \sum_{x_1} \sum_{x_2}… \sum_{x_n} g(\b{x}) p_{\b{X}}(\b{x})$$
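The rule says $E[g(\b{X})]$ can be computed directly from the joint PMF, without first deriving $p_Y$. A sketch on the same hypothetical two-coin PMF, with $g$ again the sum of the flips:

```python
def expected_value(pmf, g):
    """E[g(X)]: weight each g(x) by p_X(x) and sum over all outcomes."""
    return sum(g(x) * p for x, p in pmf.items())

# Hypothetical joint PMF: two independent fair coin flips encoded as 0/1.
joint_pmf = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

# E[X1 + X2] = expected number of heads in two fair flips.
e = expected_value(joint_pmf, sum)
print(e)  # 1.0
```

Computing the same quantity via $p_Y$ (here $0 \cdot 0.25 + 1 \cdot 0.5 + 2 \cdot 0.25 = 1$) gives an identical answer, which is exactly what the rule guarantees.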