Definition (State Space and Markov Property)
Let \(E = \{e_1, e_2, \dots, e_k\}\) be a finite set, called the state space.
A sequence of random variables \((X_n)_{n \in \mathbb{N}}\) taking values in \(E\) is a Markov chain if, for all \(n \in \mathbb{N}\), all indices \(i, j \in \{1,\dots,k\}\), and all indices \(i_0,\dots,i_{n-1} \in \{1,\dots,k\}\) such that the conditioning event has positive probability,
$$P\!\Bigl(X_{n+1}=e_j \,\Big|\, X_n=e_i,\; X_{n-1}=e_{i_{n-1}},\dots,\; X_0=e_{i_0}\Bigr)=P\!\Bigl(X_{n+1}=e_j \,\Big|\, X_n=e_i\Bigr).$$
Informally, the distribution of the next state depends on the past only through the current state.

In this chapter, we consider time-homogeneous Markov chains, i.e.\ chains for which the conditional probability
$$p_{ij}=P\!\bigl(X_{n+1}=e_j \mid X_n=e_i\bigr)$$
does not depend on \(n\). The number \(p_{ij} \in [0, 1]\) is the transition probability from state \(e_i\) to state \(e_j\).
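The definition above can be illustrated by simulation. The sketch below, a minimal example and not part of the text, uses a hypothetical 3-state chain with an illustrative transition matrix \(P = (p_{ij})\): at each step, the next state is drawn using only the row of \(P\) indexed by the current state, which is exactly the Markov property combined with time-homogeneity. The long-run fraction of transitions out of a state then approximates the corresponding \(p_{ij}\).

```python
import random

# Hypothetical 3-state chain on E = {e_1, e_2, e_3}.
# The matrix below is an illustrative choice, not taken from the text;
# row i lists the transition probabilities p_ij and sums to 1.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
]

def step(i, rng):
    """Sample the next state index j from row i of P (inverse-CDF sampling)."""
    u = rng.random()
    acc = 0.0
    for j, p in enumerate(P[i]):
        acc += p
        if u < acc:
            return j
    return len(P) - 1  # guard against floating-point rounding

def simulate(n_steps, start=0, seed=0):
    """Generate a path X_0, X_1, ..., X_n; each step uses only the current state."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

# Empirical check: the fraction of transitions 0 -> 1 among all steps
# leaving state 0 should approach p_01 = P[0][1] = 0.3.
path = simulate(100_000)
visits_0 = sum(1 for a in path[:-1] if a == 0)
trans_01 = sum(1 for a, b in zip(path, path[1:]) if a == 0 and b == 1)
print(trans_01 / visits_0)  # should be close to 0.3
```

Because the chain is time-homogeneous, transitions from early and late portions of the path can be pooled in this estimate; for a non-homogeneous chain the ratio would mix different time-dependent probabilities.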