A Gaussian vector $X \in \mathbb R^d$ is a [[Vector Operations|Vector]] whose $d$ "coordinates" jointly follow a $d$-dimensional [[Gaussian Distribution]]. Individually, each component of the vector is a univariate Gaussian [[Random Variable|r.v.]]; jointly, all components form a multivariate Gaussian. The Gaussian vector is completely determined by the [[Expectation]] of each component, collected in $\mu \in \mathbb R^d$, and by the [[Covariance]] between each pair of components, i.e. the [[Covariance Matrix]] $\Sigma$.

$
X=
\begin{bmatrix}
X^{(1)}\\
\vdots \\[4pt]
X^{(d)}\\
\end{bmatrix},
\quad
\mu=
\begin{bmatrix}
\mu^{(1)}\\
\vdots \\[4pt]
\mu^{(d)}\\
\end{bmatrix},
\quad
\Sigma=
\begin{bmatrix}
\mathrm{Cov}(X^{(1)},X^{(1)}) & \cdots & \mathrm{Cov}(X^{(1)},X^{(d)}) \\
\vdots & \ddots & \vdots \\[4pt]
\mathrm{Cov}(X^{(d)},X^{(1)}) & \cdots & \mathrm{Cov}(X^{(d)},X^{(d)})
\end{bmatrix}
$

The short notation of a multivariate Gaussian looks as follows:

$
X \sim \mathcal N_d(\mu, \Sigma)
$

The generalized [[Probability Density Function|PDF]] of a Gaussian in $\mathbb R^d$ looks as follows:

$
f(x)=f\Big(x^{(1)},\dots, x^{(d)}\Big)=
\frac{1}{(2 \pi)^{d/2} \det(\Sigma)^{1/2}}
\exp\Big(-\frac{1}{2}(x-\mu)^T \, \Sigma^{-1}(x-\mu)\Big)
$

In the 1-dimensional case, the covariance matrix $\Sigma$ is just the $(1 \times 1)$ matrix containing the [[Variance]] $\sigma^2$, and $\mu$ is just a scalar. By plugging in these values, we arrive at the univariate Gaussian PDF.

$
\begin{align}
f(x)
&= \frac{1}{(2 \pi)^{1/2} \det(\sigma^2)^{1/2}} \exp\left\{-\frac{1}{2}(x-\mu)^T \, \frac{1}{\sigma^2}(x-\mu)\right\} \tag{1}\\[8pt]
&= \frac{1}{\sqrt{2\pi \sigma^2}}\, \exp\left\{-\frac{1}{2\sigma^2}(x-\mu)^2 \right\} \tag{2} \\[8pt]
&= \frac{1}{\sqrt{2\pi \sigma^2}}\, \exp\left\{-\frac{1}{2}\big(\frac{x-\mu}{ \sigma}\big)^2\right\} \tag{3}
\end{align}
$

where:

- (1) Substituting $\Sigma = \sigma^2$ and $d=1$.
- (2) The determinant of a $(1 \times 1)$ matrix is its single entry, and since $x$ and $\mu$ are scalars, the product of the $(x-\mu)$ terms can be written as the square $(x-\mu)^2$.
- (3) Pulling the factor $\frac{1}{\sigma^2}$ inside the square.
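The reduction to the univariate PDF can also be checked numerically. The sketch below (plain NumPy, with illustrative values for $\mu$, $\sigma^2$, and the evaluation point) implements the general density with the $(2\pi)^{d/2}\det(\Sigma)^{1/2}$ normalizer and confirms that for $d=1$ it matches the univariate formula.

```python
import numpy as np

def gaussian_pdf(x, mu, Sigma):
    """Density of N_d(mu, Sigma) evaluated at x (x, mu: shape (d,); Sigma: (d, d))."""
    d = mu.shape[0]
    diff = x - mu
    # Normalizing constant: (2*pi)^(d/2) * det(Sigma)^(1/2)
    norm = (2 * np.pi) ** (d / 2) * np.sqrt(np.linalg.det(Sigma))
    # Quadratic form (x - mu)^T Sigma^{-1} (x - mu), via solve instead of inverting
    quad = diff @ np.linalg.solve(Sigma, diff)
    return float(np.exp(-0.5 * quad) / norm)

# d = 1 sanity check with illustrative values: mu = 1, sigma^2 = 4, x = 2
mu1, sigma2, x1 = 1.0, 4.0, 2.0
univariate = np.exp(-0.5 * (x1 - mu1) ** 2 / sigma2) / np.sqrt(2 * np.pi * sigma2)
multivariate = gaussian_pdf(np.array([x1]), np.array([mu1]), np.array([[sigma2]]))
print(np.isclose(univariate, multivariate))  # True
```

Using `np.linalg.solve` avoids forming $\Sigma^{-1}$ explicitly, which is the numerically preferable way to evaluate the quadratic form.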