A simple [[Probability Mass Function]] describes the probability law of a single discrete [[Random Variable]]. As an extension, we can describe the combination of multiple discrete r.v.'s with a multi-dimensional joint PMF.
## Two-Dimensional PMF
$ p_{XY}(x,y)=\mathbf P(X=x, Y=y) $
Normalization has to hold: summing over all $(x, y)$ pairs must yield $1$.
$ \sum_x \sum_yp_{XY}(x,y)=1 $
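As a quick sketch in code, a joint PMF can be stored as a dictionary mapping $(x, y)$ pairs to probabilities. The values below are made up for illustration; the only requirement is that they sum to $1$:

```python
# Toy joint PMF p_XY: maps (x, y) pairs to probabilities.
# The values are made up for illustration; they must sum to 1.
p_XY = {
    (1, 1): 0.10, (1, 2): 0.20,
    (2, 1): 0.25, (2, 2): 0.15,
    (3, 1): 0.05, (3, 2): 0.25,
}

# Normalization: summing over all (x, y) pairs yields 1.
assert abs(sum(p_XY.values()) - 1.0) < 1e-12
```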
We can always reduce a multivariate PMF by one dimension by summing over one of the r.v.'s. E.g., the first equation below sums out $Y$ to obtain the one-dimensional PMF $p_X$; the second evaluates it at the specific value $x=2$.
$ \begin{aligned} p_X(x) &=\sum_y p_{XY}(x,y) \\ p_X(2) &=\sum_y p_{XY}(2,y) \end{aligned} $
We call this process marginalization of the PMF.
![[marginalize-joint-pmf.png|center|300]]
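A minimal sketch of marginalization, reusing the same toy `p_XY` as above: summing over $y$ collapses the joint PMF into the one-dimensional marginal $p_X$.

```python
# Same toy joint PMF as above.
p_XY = {
    (1, 1): 0.10, (1, 2): 0.20,
    (2, 1): 0.25, (2, 2): 0.15,
    (3, 1): 0.05, (3, 2): 0.25,
}

# Marginalize out Y: p_X(x) = sum over y of p_XY(x, y).
p_X = {}
for (x, y), p in p_XY.items():
    p_X[x] = p_X.get(x, 0.0) + p

print(p_X[2])  # p_X(2) = p_XY(2,1) + p_XY(2,2) = 0.25 + 0.15 = 0.4
```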
## Multi-Dimensional PMF
This is a natural extension of the two-dimensional case from above.
$ p_{XYZ}(x,y,z)=\mathbf P(X=x, Y=y, Z=z) $
Normalization has to hold: summing over all $(x, y, z)$ triples must yield $1$.
$ \sum_x \sum_y \sum_z p_{XYZ}(x,y,z)=1 $
Reduce to one-dimensional PMF:
$ p_X(x)=\sum_y \sum_z p_{XYZ}(x,y,z) $
Reduce to two-dimensional PMF:
$ p_{XY}(x,y)=\sum_zp_{XYZ}(x,y,z) $
>[!note]
>Always sum over the variables that do not appear in the PMF on the left-hand side.
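The same pattern extends directly to code. A sketch with a made-up three-dimensional PMF `p_XYZ`, reduced to $p_{XY}$ (sum over $z$) and to $p_X$ (sum over $y$ and $z$), per the note above:

```python
# Toy three-dimensional PMF (made-up values that sum to 1).
p_XYZ = {
    (1, 1, 1): 0.10, (1, 1, 2): 0.15,
    (1, 2, 1): 0.05, (1, 2, 2): 0.20,
    (2, 1, 1): 0.10, (2, 1, 2): 0.10,
    (2, 2, 1): 0.20, (2, 2, 2): 0.10,
}

# Reduce to two dimensions: sum over z (the variable missing on the left).
p_XY = {}
for (x, y, z), p in p_XYZ.items():
    p_XY[(x, y)] = p_XY.get((x, y), 0.0) + p

# Reduce to one dimension: sum over y and z.
p_X = {}
for (x, y, z), p in p_XYZ.items():
    p_X[x] = p_X.get(x, 0.0) + p

# The marginals must still normalize to 1.
assert abs(sum(p_X.values()) - 1.0) < 1e-12
```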
## Functions with Multiple Inputs
Let us assume that our r.v. of interest is a function of two other r.v.'s.
$ Z=g(X,Y) $
The [[Expectation]] is simply the sum of the function evaluated at all possible $(x,y)$ pairs, each weighted with the respective two-dimensional PMF $p_{XY}(x,y)$.
$ \mathbb E[Z] = \mathbb E[g(X,Y)] = \sum_x \sum_y g(x,y)\cdot p_{XY}(x,y) $
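A sketch of this expectation on the toy `p_XY` from the two-dimensional section, with a hypothetical choice $g(x,y) = x + y$:

```python
# Same toy joint PMF as in the two-dimensional section.
p_XY = {
    (1, 1): 0.10, (1, 2): 0.20,
    (2, 1): 0.25, (2, 2): 0.15,
    (3, 1): 0.05, (3, 2): 0.25,
}

def g(x, y):
    # Hypothetical function of two r.v.'s, chosen for illustration.
    return x + y

# E[Z] = E[g(X, Y)]: evaluate g at every (x, y) pair, weight by p_XY.
E_Z = sum(g(x, y) * p for (x, y), p in p_XY.items())
print(E_Z)  # ≈ 3.6
```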
To obtain the PMF at some specific value $z$, we sum the probabilities of the two-dimensional PMF $p_{XY}(x,y)$ over all $(x,y)$ pairs that evaluate to $z$ under the function, i.e. $g(x,y)=z$.
$ p_Z(z) = \sum_{(x,y):g(x,y)=z} p_{XY}(x,y) $
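And a sketch of the derived PMF $p_Z$ for the same toy setup: each $(x,y)$ pair contributes its joint probability to the bucket of the value $z = g(x,y)$ it maps to.

```python
# Same toy joint PMF and hypothetical g(x, y) = x + y as above.
p_XY = {
    (1, 1): 0.10, (1, 2): 0.20,
    (2, 1): 0.25, (2, 2): 0.15,
    (3, 1): 0.05, (3, 2): 0.25,
}

def g(x, y):
    return x + y

# p_Z(z): sum p_XY(x, y) over all (x, y) pairs with g(x, y) = z.
p_Z = {}
for (x, y), p in p_XY.items():
    z = g(x, y)
    p_Z[z] = p_Z.get(z, 0.0) + p

print(p_Z)  # e.g. p_Z(3) = p_XY(1,2) + p_XY(2,1) = 0.20 + 0.25 = 0.45
```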