This distribution is defined by a collection of $k$ [[Independence and Identical Distribution|i.i.d.]] [[Random Variable|r.v's.]] from a standard [[Gaussian Distribution]] $Z_1, \dots, Z_k \stackrel{iid}{\sim} \mathcal N (0,1)$. Concretely, each Gaussian is squared and the squares are summed. The only parameter is $k$, also referred to as the degrees of freedom.
$ Z_1^2+\dots+Z_k^2 \sim \chi^2_k $
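As a quick sanity check, here is a minimal simulation sketch (assuming NumPy and SciPy are available; the choices of $k$ and the sample size $n$ are arbitrary illustration values, not from the text) comparing the empirical distribution of the summed squares against $\chi^2_k$:

```python
import numpy as np
from scipy import stats

# Sketch: build chi-square samples from k squared standard normals.
# k and n are arbitrary illustration choices.
rng = np.random.default_rng(0)
k, n = 5, 100_000

Z = rng.standard_normal((n, k))   # n draws of (Z_1, ..., Z_k)
V = (Z ** 2).sum(axis=1)          # Z_1^2 + ... + Z_k^2 per draw

# A large p-value is consistent with V ~ chi^2 with k degrees of freedom.
print(stats.kstest(V, "chi2", args=(k,)))
```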
If $\mathbf Z$ is a [[Vector Operations|Vector]] of standard Gaussians, then we can also state the chi-square distribution in terms of the squared Euclidean [[Vector Length#Norm Notation|Vector Norm]].
$ \text{if} \quad \mathbf Z= \begin{bmatrix} Z_1\\ \vdots\\ Z_k \end{bmatrix} \quad \text{then} \quad \lVert \mathbf Z \rVert_2^2 \sim \chi_k^2 $
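The two constructions are numerically identical, as a minimal sketch (again assuming NumPy) illustrates:

```python
import numpy as np

# Sketch: the squared Euclidean norm of Z equals the sum of squares,
# so both formulations produce the same chi-square variate.
rng = np.random.default_rng(0)
k = 5                              # arbitrary degrees of freedom
Z = rng.standard_normal(k)

v_sum = (Z ** 2).sum()             # Z_1^2 + ... + Z_k^2
v_norm = np.linalg.norm(Z) ** 2    # ||Z||_2^2

print(np.isclose(v_sum, v_norm))   # True
```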
We can compute the [[Expectation]] and [[Variance]] of $V \sim \chi_k^2$.
$
\begin{align}
\mathbb E[V] &=\mathbb E\left [\sum_{i=1}^k Z_i^2\right] \tag{1}\\
&=\sum_{i=1}^k \underbrace{\mathbb E\left [Z_i^2\right]}_{=1} \tag{2}\\
&=k \tag{3}
\end{align}
$
(2) uses the [[Linearity of Expectations]] to move the expectation inside the sum; each term satisfies $\mathbb E[Z_i^2]=\text{var}(Z_i)+\mathbb E[Z_i]^2=1$.
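The result $\mathbb E[V]=k$ can be cross-checked empirically and against SciPy's exact moments (a sketch; $k$ and $n$ are arbitrary choices):

```python
import numpy as np
from scipy import stats

# Sketch: check E[V] = k both empirically and exactly.
rng = np.random.default_rng(0)
k, n = 5, 100_000

V = (rng.standard_normal((n, k)) ** 2).sum(axis=1)
print(V.mean())              # ~ 5 (empirical)
print(stats.chi2(k).mean())  # 5.0 (exact mean of chi^2_k)
```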
$
\begin{align}
\text{var}(V)&= \text{var}\left(\sum_{i=1}^k Z_i^2\right) \tag{1}\\[4pt]
&=\sum_{i=1}^k\text{var}\left( Z_i^2\right) \tag{2}\\[4pt]
&=k\cdot\Big(\underbrace{\mathbb E[Z_i^4]}_{=3}- \underbrace{\mathbb E[Z_i^2]^2}_{=1} \Big) \tag{3} \\[2pt]
&=2k \tag{4}
\end{align}
$
(2) uses the [[Variance of Sum of Random Variables#Special Case of Independence|Variance of Sum of independent Random Variables]]: since the $Z_i$ are independent, the variance of the sum is the sum of the variances. (3) applies $\text{var}(Y)=\mathbb E[Y^2]-\mathbb E[Y]^2$ with $Y=Z_i^2$; the fourth moment of a standard Gaussian is $\mathbb E[Z_i^4]=3$.
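Both the fourth-moment value used in step (3) and the final result $\text{var}(V)=2k$ can be verified with a short sketch (assuming SciPy; $k$ and $n$ are arbitrary):

```python
import numpy as np
from scipy import stats

# Sketch: E[Z^4] = 3 for a standard normal, and var(V) = 2k.
rng = np.random.default_rng(0)
k, n = 5, 200_000

print(stats.norm.moment(4))  # 3.0: fourth raw moment of N(0, 1)

V = (rng.standard_normal((n, k)) ** 2).sum(axis=1)
print(V.var())               # ~ 10 (empirical)
print(stats.chi2(k).var())   # 10.0 (exact variance = 2k)
```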