## Stochastic Process
A collection of [[Random Variable|Random Variables]] indexed by a discrete or continuous parameter $t$. All r.v.'s in the collection are defined on a common probability space $\Omega$.
$ X_1, \dots, X_n : \Omega \to \mathbb R $
- *Time series process:* A [[Stochastic Process]] where the index parameter $t$ represents discrete time steps with fixed gaps. We assign one r.v. to each time stamp.
- *Time series:* The observations $(x_1, \dots, x_n)$ obtained by evaluating each r.v. of the stochastic process at a single realization $\omega$.
$
\begin{align}
x_1 &= X_1(\omega) \\
&\;\;\vdots \\
x_n &= X_n(\omega)
\end{align}
\quad \text{for one fixed } \omega \in \Omega
$
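A minimal numpy sketch of this distinction (the random-walk process and all parameters are illustrative assumptions): seeding the generator plays the role of fixing one $\omega \in \Omega$, and the resulting array is the single realized time series $(x_1, \dots, x_n)$.

```python
import numpy as np

# Fixing the seed corresponds to fixing one omega in the sample space:
# every X_t is evaluated at the SAME realization omega.
rng = np.random.default_rng(42)

n = 200
# Example process (assumed for illustration): a Gaussian random walk
# X_t = X_{t-1} + W_t with i.i.d. increments W_t ~ N(0, 1).
w = rng.normal(0.0, 1.0, size=n)  # the increments W_1, ..., W_n
x = np.cumsum(w)                  # one realized time series (x_1, ..., x_n)
```

Re-running with the same seed reproduces the identical path; a different seed corresponds to a different $\omega$ and hence a different time series from the same process.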
## Dependency Structure
In time series we usually do not assume [[Independence of Random Variables]]. The dependency structure is therefore the essence of what needs to be modeled.
- *Advantage:* Strong dependency helps us make predictions, since past realizations carry information about future realizations.
- *Disadvantage:* Strong autocorrelation means each observation carries less independent information about the data-generating process. The [[Law of Large Numbers|LLN]] and [[Central Limit Theorem|CLT]] do not directly apply.
## Time Series Decomposition
The [[Expectation]] of the r.v. $X_t$ at timestamp $t$ is $\mathbb E[X_t]$. It can be decomposed into a trend $m_X(t)$ and, potentially, a seasonal variation $s_X(t)$.
$
\begin{align}
\mu_X(t)&:= \mathbb E[X_t] \\[6pt]
\mu_X(t) &= m_X(t) +s_X(t)
\end{align}
$
A realized time series can therefore be modeled in the following form, where $W_t$ follows a [[Autoregressive Model#^33d29b|White-Noise Model]].
$ X_t= m_X(t)+s_X(t)+W_t, \quad \text{where}\: W_t \stackrel{iid}{\sim}\mathcal N(0, \sigma^2) $
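A sketch of this decomposition on simulated data (the linear trend, sinusoidal season, period, and noise level are all assumed for illustration): generate $X_t = m_X(t) + s_X(t) + W_t$, then recover the components naively with a least-squares line for the trend and per-phase averages of the detrended series for the season.

```python
import numpy as np

rng = np.random.default_rng(1)
n, period = 240, 12

t = np.arange(n)
m = 0.05 * t                              # trend m_X(t), assumed linear
s = 2.0 * np.sin(2 * np.pi * t / period)  # seasonal variation s_X(t)
w = rng.normal(0.0, 0.5, size=n)          # white noise W_t ~ N(0, sigma^2)
x = m + s + w

# Naive decomposition:
# 1) fit the trend by ordinary least squares (degree-1 polynomial),
slope, intercept = np.polyfit(t, x, 1)
m_hat = intercept + slope * t
# 2) estimate the seasonal component as the average of the
#    detrended series at each phase k = 0, ..., period-1.
detrended = x - m_hat
s_hat = np.array([detrended[k::period].mean() for k in range(period)])
```

With enough full periods, `slope` lands near the true 0.05 and `s_hat` traces the sinusoid; what remains after subtracting both estimates is approximately the white-noise component.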