An absorbing state is a [[Convergence of Markov Chains#Recurrent & Transient States|recurrent]] state $k$ with $p_{kk}=1$, i.e. a state whose self-loop is taken with certainty. The [[Markov Chain]] will never leave that state once it has entered it. This implies that as $n \to \infty$, only absorbing states have transition probabilities $>0$.

## Expected Time to Absorption

Assume a MC with only one absorbing state. As discussed, the steady-state transition probabilities of the transient states go to $0$, while all probability mass ends up on the absorbing state (which will be reached with certainty).

![[markov-chain-absorbing-states.png|center|400]]

We want to calculate the expected number of transitions (steps) $\mu_i$ from state $i$ until the absorbing state is reached.

- We express $\mu_i$ via the expected times $\mu_j$ of all possible next states $j$, weighted by their transition probabilities.
- We add $+1$ for the first step that we already took to get from $i$ to $j$.

$$
\mu_i = 1+ \sum_j p_{ij}\mu_j
$$

This system of equations has a unique solution when the MC has only one absorbing state.

$$
\begin{align}
\mu_4&=0\\
\mu_2 &= 1+ 0.2\mu_4+0.8\mu_1 \\
\mu_1 &= 1+ 0.6\mu_2+0.4\mu_3 \\
\mu_3 &= 1+ 0.5\mu_1+ 0.5\mu_2
\end{align}
$$

## Multiple Absorbing States

When there are $>1$ absorbing states $[\mu_x, \mu_y]$, the expected number of transitions until reaching one *particular* absorbing state is $\infty$: the MC ends up with some probability $p$ in the other absorbing state and then needs an infinite number of steps to reach the first. Thus, it makes more sense to compute the number of steps until *any* absorbing state is reached. This can be done by treating all absorbing states as one single state: we simply add up all arcs that go from a state to any of the absorbing states.

![[markov-chain-multiple-absorbing-states.png|center|400]]
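The linear system for the expected absorption times can be solved numerically. A minimal sketch using only the standard library, with the matrix $Q$ transcribed from the example equations above (the state numbering and probabilities come from that example, not from any additional source):

```python
# Solve mu = 1 + Q*mu for the transient states, i.e. (I - Q)*mu = 1,
# where Q holds the transition probabilities among transient states only.
# State 4 is absorbing (mu_4 = 0), so it drops out of the system.

def solve(A, b):
    """Gaussian elimination with partial pivoting (stdlib only)."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Q[i][j]: probability of moving from transient state i+1 to transient state j+1
Q = [
    [0.0, 0.6, 0.4],  # state 1: 0.6 -> state 2, 0.4 -> state 3
    [0.8, 0.0, 0.0],  # state 2: 0.8 -> state 1 (0.2 -> absorbing state 4)
    [0.5, 0.5, 0.0],  # state 3: 0.5 -> state 1, 0.5 -> state 2
]
I_minus_Q = [[(1.0 if i == j else 0.0) - Q[i][j] for j in range(3)] for i in range(3)]
mu = solve(I_minus_Q, [1.0, 1.0, 1.0])
print(mu)  # [mu_1, mu_2, mu_3] = [13.75, 12.0, 13.875]
```

Plugging the result back into the equations confirms it: e.g. $\mu_2 = 1 + 0.2 \cdot 0 + 0.8 \cdot 13.75 = 12$.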
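The merging of multiple absorbing states can be sketched in code as well. The 4-state matrix `P` below is a made-up example (states 0 and 1 transient, states 2 and 3 absorbing), not taken from the figure:

```python
# Collapse all absorbing states into a single one so that the expected
# time to absorption is finite and well defined.

P = [
    [0.1, 0.5, 0.3, 0.1],
    [0.4, 0.2, 0.1, 0.3],
    [0.0, 0.0, 1.0, 0.0],  # absorbing: p_22 = 1
    [0.0, 0.0, 0.0, 1.0],  # absorbing: p_33 = 1
]

absorbing = [i for i, row in enumerate(P) if row[i] == 1.0]
transient = [i for i in range(len(P)) if i not in absorbing]

# Transient states keep their mutual probabilities; all arcs into any
# absorbing state are summed into one merged "absorbed" column.
merged = []
for i in transient:
    row = [P[i][j] for j in transient]
    row.append(sum(P[i][j] for j in absorbing))  # combined arc to merged state
    merged.append(row)
merged.append([0.0] * len(transient) + [1.0])  # the single absorbing state

print(merged)  # 3x3 matrix; e.g. state 0 now reaches "absorbed" with 0.3 + 0.1 = 0.4
```

The merged matrix then has exactly one absorbing state, so the expected-time equations from the previous section apply unchanged.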