Covariance is a measure of dependence between two [[Random Variable|random variables]] $X$ and $Y$. When covariance is positive, the r.v.s tend to move together; when it is negative, they tend to move in opposite directions.

$$
\mathrm{Cov}(X,Y)= \mathbb{E}\Big[(X- \mathbb{E}[X])\cdot(Y-\mathbb{E}[Y])\Big]
$$

## Covariance of Independent Random Variables

If $X \perp Y$, then their covariance has to be $0$. The derivation follows:

$$
\begin{align}
\mathrm{Cov}(X,Y)&=\mathbb{E}\Big[\overbrace{(X- \mathbb{E}[X])}^{A}\cdot \overbrace{(Y-\mathbb{E}[Y])}^{B}\Big] \tag{1}\\
&= \mathbb{E}\Big[X- \mathbb{E}[X]\Big] \cdot \mathbb{E}\Big[Y- \mathbb{E}[Y]\Big] \tag{2}\\[4pt]
&= (\mathbb{E}[X]-\mathbb{E}[X]) \cdot (\mathbb{E}[Y]-\mathbb{E}[Y]) \tag{3}\\[4pt]
&= 0 \cdot 0 = 0 \tag{4}
\end{align}
$$

where:

- (2) Since $A \perp B$, the [[Expectation of a Product]] is the product of the individual expectations.
- (3) Apply [[Linearity of Expectations]] inside each bracket; by the [[Law of Iterated Expectations]], the expectation of an expectation is just that expectation itself: $\mathbb{E}\big[\mathbb{E}[X]\big]=\mathbb{E}[X]$.

>[!note]
>However, $\mathrm{Cov}(X,Y)=0$ does not imply that $X,Y$ are independent. Covariance only measures linear dependence, so there may still be non-linear dependencies. For example, if $X \sim \mathcal{N}(0,1)$ and $Y=X^2$, then $\mathrm{Cov}(X,Y)=\mathbb{E}[X^3]=0$ even though $Y$ is completely determined by $X$.

## Covariance as Variance

The covariance of $X$ with itself is equal to the variance of $X$.

$$
\begin{align}
\mathrm{Cov}(X,X)&=\mathbb{E}\Big[(X- \mathbb{E}[X])\cdot(X-\mathbb{E}[X])\Big] \\
&= \mathbb{E}\Big[(X-\mathbb{E}[X])^2\Big] \\[4pt]
&=\mathrm{Var}(X)
\end{align}
$$

## Expanded Form of Covariance

$$
\begin{align}
\mathrm{Cov}(X,Y)&=\mathbb{E}\Big[(X- \mathbb{E}[X])\cdot(Y-\mathbb{E}[Y])\Big] \tag{1}\\
&= \mathbb{E}\Big[XY-\mathbb{E}[Y]X-\mathbb{E}[X]Y+\mathbb{E}[X]\mathbb{E}[Y]\Big] \tag{2}\\
&= \mathbb{E}[XY]- \mathbb{E}\Big[\mathbb{E}[Y]X\Big]-\mathbb{E}\Big[\mathbb{E}[X]Y\Big]+ \mathbb{E}\Big[\mathbb{E}[X]\mathbb{E}[Y]\Big] \tag{3}\\[6pt]
&= \mathbb{E}[XY] - \mathbb{E}[X]\mathbb{E}[Y] - \mathbb{E}[X]\mathbb{E}[Y] + \mathbb{E}[X]\mathbb{E}[Y] \tag{4}\\[6pt]
&= \mathbb{E}[XY] - \mathbb{E}[X]\mathbb{E}[Y] \tag{5}
\end{align}
$$

where:

- (2) Take the regular covariance formula and multiply out the terms.
- (3) Apply [[Linearity of Expectations]] to break up the outer expectation into separate expectations.
- (4) $\mathbb{E}[X]$ and $\mathbb{E}[Y]$ are constants, so they can be pulled out of the inner expectations, e.g. $\mathbb{E}\big[\mathbb{E}[Y]X\big]=\mathbb{E}[Y]\mathbb{E}[X]$.

## Covariance after Linear Transformation

A linear transformation of a r.v. $X$ also affects its covariance with $Y$. Consider $\mathrm{Cov}(aX + b, Y)$ where $a,b \in \mathbb{R}$. For simplicity, let $\mathbb{E}[X]=0$ and $\mathbb{E}[Y]=0$, so that covariance simplifies to:

$$
\mathrm{Cov}(X,Y) = \mathbb{E}[XY] - \mathbb{E}[X]\mathbb{E}[Y] = \mathbb{E}[XY]
$$

**Derivation:**

$$
\begin{align}
\mathrm{Cov}(aX+b,Y)&= \mathbb{E}[(aX+b)\cdot Y] \\
&=\mathbb{E}[aXY + bY] \\
&=a\cdot\mathbb{E}[XY]+b\cdot\mathbb{E}[Y] \\
&= a\cdot \mathrm{Cov}(X,Y)
\end{align}
$$

The last step uses $\mathbb{E}[Y]=0$, so the term $b\cdot\mathbb{E}[Y]$ vanishes; the result $\mathrm{Cov}(aX+b,Y)=a\cdot\mathrm{Cov}(X,Y)$ also holds without the zero-mean assumption.

**Key Points:**

- Scaling $X$ by a factor $a$ also scales its covariance by the same factor $a$.
- Shifting $X$ by a constant $b$ has no effect on covariance (similar to [[Variance after Linear Transformation]]).
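These identities are easy to sanity-check by simulation. The following is a minimal NumPy sketch (an illustration added here, not part of the derivations above): the helper `cov` and the chosen distributions are arbitrary, and sample estimates match the identities only up to sampling noise.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two linearly dependent variables: Y = 0.5*X plus independent noise.
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)

def cov(u, v):
    """Sample covariance via the expanded form E[UV] - E[U]E[V]."""
    return np.mean(u * v) - np.mean(u) * np.mean(v)

a, b = 3.0, 7.0

print(cov(x, x), np.var(x))              # Cov(X, X) equals Var(X)
print(cov(a * x + b, y), a * cov(x, y))  # Cov(aX + b, Y) equals a*Cov(X, Y)
print(cov(x, x**2))                      # ~0, yet X and X^2 are dependent
```

The last line illustrates the note above: a near-zero sample covariance between $X$ and $X^2$, even though the two are perfectly (non-linearly) dependent.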
## Covariance of Sums of Random Variables

We are interested in the covariance of $X,Y$ where one or both of them are themselves sums of other random variables. Keeping the zero-mean simplification from above:

$$
\begin{align}
\mathrm{Cov}(X,Y+Z)&=\mathbb{E}[X(Y+Z)] \\
&=\mathbb{E}[XY+XZ] \\
&=\mathbb{E}[XY]+\mathbb{E}[XZ] \\
&=\mathrm{Cov}(X,Y)+\mathrm{Cov}(X,Z)
\end{align}
$$

**Generalizing:** The covariance can be found by summing the covariances of all combinations of the individual $X_i$ and $Y_j$. Terms that appear in both sums (i.e. $X_i = Y_j$) are included as well; they contribute $\mathrm{Cov}(X_i, X_i) = \mathrm{Var}(X_i)$.

$$
\mathrm{Cov}\left(\sum_{i=1}^m X_i, \sum_{j=1}^n Y_j \right) = \sum_{i=1}^m \sum_{j=1}^n \mathrm{Cov}(X_i, Y_j)
$$
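As a quick check of the generalized formula, here is another illustrative sketch under the same assumptions as the previous snippet: the covariance of two sums matches the double sum over all pairwise covariances, including the diagonal term where the two sums share a variable.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

def cov(u, v):
    """Sample covariance via E[UV] - E[U]E[V]."""
    return np.mean(u * v) - np.mean(u) * np.mean(v)

# The second sum reuses x2, so one pairwise term is Cov(x2, x2) = Var(x2).
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y1 = 0.3 * x1 + rng.normal(size=n)

lhs = cov(x1 + x2, y1 + x2)
rhs = cov(x1, y1) + cov(x1, x2) + cov(x2, y1) + cov(x2, x2)
print(lhs, rhs)  # approximately equal
```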