This [[Random Variable]] takes values in $\{0,1\}$. It typically models a single success/failure trial with success probability $p$.
$
X=
\begin{cases}
1 & \text{w.p.} &p\\
0 & \text{w.p.} &(1-p)\\
\end{cases}
$
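A Bernoulli draw can be sketched with one uniform sample: return $1$ exactly when the uniform value falls below $p$. A minimal sketch using only the standard library (the function name `bernoulli` is ours, not from any library):

```python
import random

def bernoulli(p):
    """One Bernoulli(p) draw: 1 with probability p, else 0."""
    return 1 if random.random() < p else 0

random.seed(0)
samples = [bernoulli(0.3) for _ in range(100_000)]
# The sample mean should be close to p = 0.3
print(sum(samples) / len(samples))
```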
**Indicator variable:** To translate an [[Events|event]] $A$ into a r.v., we use a Bernoulli that takes the value $1$ when $A$ occurs and $0$ otherwise.
$
I_A=
\begin{cases}
1 & \text{if } A \text{ occurs, w.p.} &\mathbf P(A)\\
0 & \text{if } A^C \text{ occurs, w.p.} &1-\mathbf P(A)\\
\end{cases}
\implies \mathbb E[I_A]=\mathbf P(A)
$
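The identity $\mathbb E[I_A]=\mathbf P(A)$ can be checked by simulation. A sketch with an assumed example event $A$ = "a fair die shows an even number", so $\mathbf P(A)=1/2$:

```python
import random

def indicator(event_happened):
    """I_A = 1 if the event A occurred, 0 otherwise."""
    return 1 if event_happened else 0

random.seed(1)
# Event A: a fair die roll is even, so P(A) = 1/2
draws = [indicator(random.randint(1, 6) % 2 == 0) for _ in range(100_000)]
# The average of I_A estimates E[I_A] = P(A) = 0.5
print(sum(draws) / len(draws))
```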
**Expectation:**
$ \mathbb E[X] = 1\cdot p + 0\cdot(1-p) = p $
**Variance:**
$ \mathrm{Var}(X) = p\,(1-p) $
![[bernoulli-variance.png|center|400]]
Variance derivation via expected value rule:
$
\begin{align}
\mathrm{Var}(X)
&=\sum_x (x-\mu)^2\, p_X(x) \\
&=(1-p)^2\cdot p + (0-p)^2\cdot(1-p) \\[4pt]
&=(1-2p+p^2)\cdot p+p^2\cdot(1-p) \\[4pt]
&=p-2p^2+p^3+ p^2-p^3 \\[4pt]
&=p-p^2 \\[4pt]
&=p\,(1-p)
\end{align}
$
Variance derivation via the second-moment formula:
$
\begin{align}
\mathrm{Var}(X)
&=\mathbb E[X^2]- (\mathbb E[X])^2 \\
&=\mathbb E[X]- (\mathbb E[X])^2 \\
&=p-p^2 \\
&=p\,(1-p)
\end{align}
$
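Both derivations give $\mathrm{Var}(X)=p(1-p)$, which we can sanity-check numerically against the empirical mean and variance of simulated draws (the choice $p=0.3$ is an arbitrary example):

```python
import random

random.seed(2)
p = 0.3
xs = [1 if random.random() < p else 0 for _ in range(200_000)]

mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)

# mean should be close to p = 0.3; var close to p*(1-p) = 0.21
print(mean, var)
```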
where:
- (2) For a Bernoulli r.v., the second moment equals the first moment (i.e. the [[Expectation]]): since $X \in \{0,1\}$ and $0^2=0$, $1^2=1$, we have $X^2=X$ and hence $\mathbb E[X^2]=\mathbb E[X]$.
- (3) Replacing the expectation with $p$.