## Independence of Events
The occurrence of [[Events|event]] $A$ does not affect the probability of event $B$ in any way, leading to $\mathbf P(B \vert A) = \mathbf P(B)$. Applying this equality to the [[Probability Multiplication Rule|multiplication rule]] shows that the probability of the intersection of two independent events is the product of their individual probabilities.
$
\begin{align}
\mathbf P(A \cap B) &= \overbrace{\mathbf P(B \vert A)}^{=\mathbf P(B)}* \mathbf P(A) \tag{1}\\
\mathbf P(A \cap B) &= \mathbf P(B)*\mathbf P(A) \tag{2}
\end{align}
$
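To make equation (2) concrete, here is a minimal Python sketch that checks independence by brute-force enumeration of a small sample space; the setup (two fair dice, $A$ = first die is even, $B$ = second die shows a 6) is an assumed illustration.

```python
from fractions import Fraction
from itertools import product

# Sample space: all ordered pairs of two fair six-sided dice (36 equally likely outcomes).
outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    """Probability of an event (a set of outcomes) under the uniform distribution."""
    return Fraction(len(event), len(outcomes))

A = {o for o in outcomes if o[0] % 2 == 0}   # first die shows an even number
B = {o for o in outcomes if o[1] == 6}       # second die shows a 6

print(prob(A & B))         # 1/12
print(prob(A) * prob(B))   # 1/2 * 1/6 = 1/12  -> equal, so A and B are independent
```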
**Example:** In the graph below, $A$ and $B$ are definitely NOT independent: if one of them occurs, the other cannot occur.
![[independent-events.png|center|400]]
**Symmetry:** If $A$ is independent of $B$, then $B$ is also independent of $A$. This also holds in the edge case where $\mathbf P(A)=0$ or $\mathbf P(B)=0$, since an event with probability zero gives no information about the other event.
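One way to see both claims from the product form (2): the defining equation is symmetric in $A$ and $B$, and if $\mathbf P(A)=0$ it holds for any $B$.
$
\begin{align}
\mathbf P(A \cap B) = \mathbf P(A)*\mathbf P(B) &\iff \mathbf P(B \cap A) = \mathbf P(B)*\mathbf P(A) \\
\mathbf P(A)=0 &\implies \mathbf P(A \cap B) \le \mathbf P(A) = 0 = \mathbf P(A)*\mathbf P(B)
\end{align}
$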
## Independence of Complements
If $A$ and $B$ are independent, the same is true for their complements.
$ A \perp B \implies A \perp B^C \implies B \perp A^C \implies A^C \perp B^C $
To prove the first of these implications, it has to be shown that the following equation holds:
$ \mathbf P(A \cap B^C)= \mathbf P(A)* \mathbf P(B^C) $
**Derivation:**
$
\begin{align}
\mathbf P(A) &= \mathbf P(A \cap B) + \mathbf P (A \cap B^C) \tag{1}\\
\mathbf P(A) &= \mathbf P(A)*\mathbf P(B) + \mathbf P(A \cap B^C) \tag{2}\\
\mathbf P(A) - \big(\mathbf P(A)*\mathbf P(B)\big) &= \mathbf P(A \cap B^C) \tag{3}\\
\mathbf P(A)* \big(1- \mathbf P(B)\big) &=\mathbf P(A \cap B^C) \tag{4}\\
\mathbf P(A)*\mathbf P(B^C) &= \mathbf P(A \cap B^C) \tag{5}
\end{align}
$
where:
- (1) Split $\mathbf P(A)$ into its intersection with $B$ and its intersection with $B^C$ (total probability).
- (2) Since we assume $A \perp B$, the intersection $\mathbf P(A \cap B)$ can be written as a product.
- (5) Replace $1-\mathbf P(B)$ with $\mathbf P(B^C)$, the probability of the complement of $B$.
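As a quick numeric check with assumed values, say $\mathbf P(A) = 0.5$ and $\mathbf P(B) = 0.2$ for two independent events:
$ \mathbf P(A \cap B^C) = \mathbf P(A) * \big(1 - \mathbf P(B)\big) = 0.5 * 0.8 = 0.4 $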
## Independence of Conditional Events
The concept of independence also exists for [[Conditional Probability]]. However, independence between $A$ and $B$ can change once we condition on another event $C$.
$
\text{depending on } C:
\begin{cases} \mathbf P(A \cap B \vert C) =
\mathbf P(A \vert C) * \mathbf P(B \vert C) \\[4pt]
\mathbf P(A \cap B \vert C) \not = \mathbf P(A \vert C) * \mathbf P(B \vert C)
\end{cases}
$
**Example:** Let us assume that $A \perp B$. However, in the conditional universe $C$, they are clearly NOT independent, as one event rules out the occurrence of the other.
![[independent-conditional-events.png|center|400]]
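A minimal Python sketch of this effect, using an assumed example rather than the events in the figure: two fair dice with $A$ = "first die shows 1" and $B$ = "second die shows 1", which are independent on their own but become dependent once we condition on $C$ = "the sum is at most 3".

```python
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # two fair six-sided dice

def prob(event, given=None):
    """P(event) or P(event | given) by counting equally likely outcomes."""
    if given is None:
        given = set(outcomes)
    return Fraction(len(event & given), len(given))

A = {o for o in outcomes if o[0] == 1}      # first die shows 1
B = {o for o in outcomes if o[1] == 1}      # second die shows 1
C = {o for o in outcomes if sum(o) <= 3}    # conditioning event: sum is at most 3

# Unconditionally independent:
print(prob(A & B), prob(A) * prob(B))           # 1/36  1/36
# Given C, they are dependent:
print(prob(A & B, C), prob(A, C) * prob(B, C))  # 1/3   4/9
```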
## Independence of Multiple Events
The notion of independence also extends to a collection of events $\{A_1, \dots, A_n\}$. They are *mutually independent* if the multiplication rule holds not only for all of them together,
$ \mathbf P(A_1 \cap \dots \cap A_n) = \prod_i \mathbf P(A_i) $
but also for every sub-collection of them.
**Note:**
- *Mutual independence* of all events collectively implies pairwise independence between any two events.
- *Pairwise independence* between every pair of events does NOT guarantee mutual independence.
**Example:** We flip a fair coin twice; each flip comes up $H$ or $T$. Let $H_1$ be the event that the first flip shows heads, $H_2$ the event that the second flip shows heads, and $C$ the event that both flips give the same result $\{HH, TT\}$.
![[independent-multiple-events.png|center|400]]
We have pairwise independence for all combinations:
- $H_1 \perp H_2$
- $H_1 \perp C$
- $H_2 \perp C$
If all three events were mutually independent, the equation below would also need to hold. However, a direct computation shows that this is NOT the case, so we conclude that there is no mutual independence.
$
\begin{align}
\mathbf P(H_1 \cap H_2 \cap C) &= \mathbf P(H_1)*\mathbf P(H_2)*\mathbf P(C) \\
\mathbf P(H_1 \cap H_2 \cap C) &= 1/4 \\
\mathbf P(H_1)*\mathbf P(H_2)*\mathbf P(C) &= 1/8
\end{align}
$
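These numbers can also be verified by brute-force enumeration; a minimal Python sketch using the events defined above ($H_1$, $H_2$, $C$):

```python
from fractions import Fraction
from itertools import product

outcomes = list(product("HT", repeat=2))    # two fair coin flips: HH, HT, TH, TT

def prob(event):
    return Fraction(len(event), len(outcomes))

H1 = {o for o in outcomes if o[0] == "H"}   # heads on the first flip
H2 = {o for o in outcomes if o[1] == "H"}   # heads on the second flip
C  = {o for o in outcomes if o[0] == o[1]}  # both flips show the same result

# Pairwise independence holds for every pair ...
for X, Y in [(H1, H2), (H1, C), (H2, C)]:
    assert prob(X & Y) == prob(X) * prob(Y)

# ... but the three-way product rule fails, so there is no mutual independence.
print(prob(H1 & H2 & C))                 # 1/4
print(prob(H1) * prob(H2) * prob(C))     # 1/8
```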