## Probability Mass Function
A series of i.i.d. [[Bernoulli Distribution|Bernoulli]] trials (e.g. coin flips) is executed. The binomial gives the probability of exactly $k$ successes out of $n$ trials, where each trial has success probability $p$.
$ \mathbf P(k \text{ heads})= \binom{n}{k}* p^k*(1-p)^{n-k} $
| Expression | Meaning |
| ---------------------------- | ----------------------------------------------------------------------- |
| $\mathbf P(k \text{ heads})$ | Probability that out of $n$ coin flips, there will be exactly $k$ heads |
| $\binom{n}{k}$               | Number of ways to arrange exactly $k$ heads among the $n$ flips        |
| $p^k$                        | Probability that $k$ specific flips all come up heads                   |
| $(1-p)^{n-k}$                | Probability that the remaining $(n-k)$ flips all come up tails          |
Note that we are allowed to multiply $p^k$ by $(1-p)^{n-k}$ because of the [[Probability Multiplication Rule]] for [[Independence of Events|independent events]].
![[binomial-pmf.png|center|400]]
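The formula translates directly into code. A minimal Python sketch (the function name `binomial_pmf` is illustrative, not from a library):
```python
from math import comb

def binomial_pmf(k: int, n: int, p: float) -> float:
    """Probability of exactly k successes in n i.i.d. Bernoulli(p) trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Example: exactly 3 heads in 5 fair coin flips
print(binomial_pmf(3, 5, 0.5))  # 0.3125

# Sanity check: the PMF sums to 1 over k = 0..n
print(sum(binomial_pmf(k, 5, 0.5) for k in range(6)))  # 1.0
```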
## Expectation
**Via Expected Value Rule:**
As usual, we take the sum over all $k$ weighted by the PMF. While feasible for small $n$, this gets cumbersome for larger values, since the binomial coefficient is expensive to compute.
$ \mathbb E[X] =\sum_{k=0}^n k* \underbrace{\binom{n}{k}*p^k(1-p)^{n-k}}_{p_X(k)} $
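For concreteness, this sum can be evaluated directly for small $n$; a quick sketch (function name illustrative) confirms it already matches $np$ numerically:
```python
from math import comb

def expectation_direct(n: int, p: float) -> float:
    """E[X] computed directly as the sum of k * p_X(k) over all k."""
    return sum(k * comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1))

print(expectation_direct(10, 0.3))  # 3.0 == n * p (up to float error)
```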
Using the property of the [[Binomial Coefficient]]:
$
\begin{align}
k *\binom{n}{k} &= k*\frac{n!}{k!*(n-k)!} \tag{1}\\[6pt]
&= \frac{n!}{(k-1)!*(n-k)!} \tag{2}\\[6pt]
&= n*\frac{(n-1)!}{(k-1)!*(n-k)!} \tag{3}\\[6pt]
&= n*\frac{(n-1)!}{(k-1)!*\big((n-1)-(k-1)\big)!} \tag{4}\\[6pt]
& = n* \binom{n-1}{k-1} \tag{5}
\end{align}
$
where:
- (2) Canceling $k$ against $k!$ leaves $(k-1)!$ in the denominator.
- (3) Factoring out $n$ reduces the numerator to $(n-1)!$
- (4) $(n-k)!$ is rewritten as the equivalent $\big((n-1)-(k-1)\big)!$
- (5) The fraction is written again as a binomial coefficient.
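The identity is easy to spot-check numerically, e.g. for an arbitrary small $n$:
```python
from math import comb

n = 10
for k in range(1, n + 1):
    # k * C(n, k) == n * C(n-1, k-1)
    assert k * comb(n, k) == n * comb(n - 1, k - 1)
```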
Expressing the expectation with the rewritten binomial coefficient:
$
\begin{align}
\mathbb E[X] &=\sum_{k=0}^n n* \binom{n-1}{k-1}*p^k(1-p)^{n-k} \tag{6}\\[2pt]
&=\sum_{k=0}^n np* \binom{n-1}{k-1}*p^{k-1}(1-p)^{n-k} \tag{7}\\[2pt]
&=np*\sum_{k=1}^n \binom{n-1}{k-1}*p^{k-1}(1-p)^{n-k} \tag{8}\\[2pt]
&=np*\sum_{k=1}^n \binom{n-1}{k-1}*p^{k-1}(1-p)^{(n-1)-(k-1)} \tag{9}\\
\end{align}
$
where:
- (7) Factoring out $p$ reduces the exponent of $p^k$ by one.
- (8) The $k=0$ term is zero (it carries the factor $k$ in the original form), so the summation can start at $k=1$. The constant $np$ can also be moved outside the summation.
- (9) Rewriting the exponent as $(n-1)-(k-1)$ shows that the terms inside the summation form a valid binomial PMF, evaluated at $k^\prime=k-1$ with parameters $n^\prime=n-1$ and $p$.
$
\mathbb E[X]=np*\sum_{k^\prime=0}^{n^\prime} \mathrm{Binom}(k^\prime; n^\prime,p) \quad \text{where}
\begin{cases}
k^\prime=k-1\\
n^\prime=n-1
\end{cases}
$
Since summing a PMF over its entire support equals $1$, we can conclude that:
$ \mathbb E[X]=np$
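A short numerical check (parameter values chosen arbitrarily) confirms that the shifted inner sum is indeed $1$:
```python
from math import comb

n, p = 10, 0.3
n_prime = n - 1

# Binom(k'; n', p) summed over its full support k' = 0..n'
inner = sum(comb(n_prime, k) * p**k * (1 - p)**(n_prime - k)
            for k in range(n_prime + 1))

print(inner)          # 1.0 (up to float error)
print(n * p * inner)  # 3.0 == E[X]
```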
**Via Indicator Variables:**
The binomial r.v. counts the number of successes in a sequence of Bernoulli trials. We can compute the [[Expectation]] of each trial separately. Then, due to [[Linearity of Expectations]], we can sum up all individual expectations.
$ X = X_1 + \dots + X_n \quad \text{where} \quad X_i= \begin{cases} 1 & \text{success w.p. } p\\ 0 & \text{otherwise} \end{cases} $
$ \mathbb E[X]= \underbrace{\mathbb E[X_1]}_{p}+ \dots + \underbrace{\mathbb E[X_n]}_{p} = np $
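This view also suggests a simple Monte Carlo check: draw $X$ as a sum of indicator variables and compare the sample mean to $np$. A sketch with illustrative parameter values:
```python
import random

def draw_binomial(n: int, p: float) -> int:
    """One draw of X as the sum of n Bernoulli indicators."""
    return sum(1 if random.random() < p else 0 for _ in range(n))

n, p, trials = 10, 0.3, 100_000
mean = sum(draw_binomial(n, p) for _ in range(trials)) / trials
print(mean)  # approximately 3.0 == n * p
```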
## Variance
Since the binomial consists of multiple Bernoulli trials that are independent of each other, we can write the [[Variance of Sum of Random Variables|Variance of Sum of r.v's.]] as the sum of all individual variances. Each Bernoulli trial contributes a variance of $p(1-p)$.
$
\begin{align}
\mathrm{Var}(X) &= \mathrm{Var}(X_1)+ \dots+\mathrm{Var}(X_n) \\
&=p(1-p)+\dots+p(1-p) \\
&=np(1-p)
\end{align}
$
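The same simulation verifies the variance: the sample variance of the draws should approach $np(1-p)$ (here $10 \cdot 0.3 \cdot 0.7 = 2.1$, parameter values illustrative):
```python
import random

n, p, trials = 10, 0.3, 100_000
samples = [sum(1 if random.random() < p else 0 for _ in range(n))
           for _ in range(trials)]

mean = sum(samples) / trials
var = sum((x - mean) ** 2 for x in samples) / trials
print(var)  # approximately 2.1 == n * p * (1 - p)
```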