The expected value of a [[Random Variable]] can be interpreted as the average outcome over a large number of [[Independence of Random Variables|independent]] draws from that random variable. For the calculation we sum over all possible values $x$, weighting each $x$ by its probability of occurrence $p_X(x)$; in the continuous case we integrate against the density $f_X(x)$ instead. ^52e656
$
\begin{align}
\mathbb E[X]&= \sum_x x*p_X(x) \quad &&\text{(discrete)} \\[4pt]
\mathbb E[X]&= \int_{-\infty}^\infty x*f_X(x)\, dx \quad &&\text{(continuous)}
\end{align}
$
^32925b
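As a sketch of both the weighted-sum formula and the "average over many independent draws" interpretation, here is a small numerical check using a hypothetical fair six-sided die as the discrete r.v. (the die and the sample size are illustrative choices, not from the note):

```python
import random

# Hypothetical example: a fair six-sided die.
# pmf maps each value x to its probability p_X(x).
pmf = {x: 1 / 6 for x in range(1, 7)}

# E[X] = sum over x of x * p_X(x)  ->  3.5 for a fair die
expectation = sum(x * p for x, p in pmf.items())

# Interpretation: the sample mean of many independent draws
# should approach E[X].
random.seed(0)  # fixed seed for reproducibility
draws = [random.choice(list(pmf)) for _ in range(100_000)]
sample_mean = sum(draws) / len(draws)

print(expectation, sample_mean)  # sample_mean lands close to 3.5
```

The gap between `sample_mean` and `expectation` shrinks as the number of draws grows, which is exactly the long-run-average reading of $\mathbb E[X]$.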
## Properties of Expectation
**Property 1:** If $X \ge 0$ then $\mathbb E[X] \ge 0$
- *Description:* If all $x_i$ are $\ge0$, then the expectation has to be $\ge 0$ as well.
- *Proof:* Since probabilities are always $\ge 0$ by [[Probability Axioms#Axioms|axiom]], and in this case $x_i \ge0$, we are only multiplying and summing non-negative terms.
$ \mathbb E[X] = \sum_x x* \underbrace{p_X(x)}_{\ge0} $
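The proof of Property 1 can be mirrored numerically: for a hypothetical non-negative PMF (values chosen only for illustration), every term $x*p_X(x)$ is non-negative, so their sum is too.

```python
# Hypothetical r.v. with non-negative support.
pmf = {0: 0.3, 2: 0.5, 5: 0.2}

# Each summand x * p_X(x) is a product of non-negative numbers.
terms = [x * p for x, p in pmf.items()]
assert all(t >= 0 for t in terms)

# Hence the expectation, the sum of those terms, is non-negative.
expectation = sum(terms)
assert expectation >= 0
```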
**Property 2:** If $a \le X \le b$ then $a \le \mathbb E[X] \le b$
- *Description:* If all $x_i$ are between $[a,b]$, then the expectation is within these bounds as well.
- *Proof:* Since every possible value satisfies $x \ge a$, replacing each $x$ in the sum by $a$ can only decrease it; the upper bound $\mathbb E[X] \le b$ follows by the same argument with $b$.
$
\begin{align}
\mathbb E[X]
&= \sum_x x*p_X(x) \\
&\ge\sum_x a*p_X(x) \\
&=a*\underbrace{\sum_x p_X(x)}_{=1}=a
\end{align}
\end{align}
$
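A quick numerical check of Property 2, using a hypothetical PMF supported on $[1,4]$ (values are illustrative only): the expectation must land between the smallest and largest possible value.

```python
# Hypothetical PMF with support {1, 2, 4}, so a = 1 and b = 4.
pmf = {1: 0.2, 2: 0.5, 4: 0.3}
a, b = min(pmf), max(pmf)

expectation = sum(x * p for x, p in pmf.items())

# Property 2: a <= E[X] <= b
assert a <= expectation <= b
```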
**Property 3:** If $X$ is a constant $c$ then $\mathbb E[X] = c$
- If there is only one possible outcome $c$ of the r.v. $X$, i.e. $p_X(c)=1$, then the expectation is $c$ as well.
**Property 4:** If $Y=g(X)$ then the expectation of $Y$ can be evaluated in two ways.
- *Directly:* Iterating over the possible values of $Y$.
$ \mathbb E[Y] = \sum_y y*p_Y(y) $
- *Indirectly:* Iterating over the possible values of $X$, each transformed by $g$.
$ \mathbb E[Y] = \mathbb E[g(X)] = \sum_x g(x)*p_X(x) $
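The two routes of Property 4 can be checked against each other on a small hypothetical example (uniform $X$ and $g(x)=x^2$ are illustrative choices, not from the note):

```python
from collections import defaultdict

# Hypothetical example: X uniform on {-1, 0, 1, 2}, Y = g(X) = X^2.
p_X = {-1: 0.25, 0: 0.25, 1: 0.25, 2: 0.25}

def g(x):
    return x ** 2

# Indirect route: sum over x of g(x) * p_X(x), no PMF of Y needed.
indirect = sum(g(x) * p for x, p in p_X.items())

# Direct route: first derive p_Y by collecting the probability mass
# of every x that maps to the same y, then sum over y of y * p_Y(y).
p_Y = defaultdict(float)
for x, p in p_X.items():
    p_Y[g(x)] += p
direct = sum(y * p for y, p in p_Y.items())

assert abs(direct - indirect) < 1e-12
```

Note that the indirect route skips deriving $p_Y$ entirely, which is why it is often the more convenient of the two.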