What Is a Markov Chain?
A Markov chain or Markov process is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
Transition Matrix
A transition matrix is also called a stochastic matrix. It describes a Markov chain $X_t$ over a finite state space $S$ with cardinality $N$.

If the probability of moving from state $i$ to state $j$ in one time step is $\Pr(j \mid i) = P_{i,j}$, the stochastic matrix $P$ is given by using $P_{i,j}$ as the $i$-th row and $j$-th column element, e.g.,

$$
P = \begin{pmatrix}
P_{1,1} & P_{1,2} & \dots & P_{1,N} \\
P_{2,1} & P_{2,2} & \dots & P_{2,N} \\
\vdots  & \vdots  & \ddots & \vdots \\
P_{N,1} & P_{N,2} & \dots & P_{N,N}
\end{pmatrix}
$$

Since the total transition probability from a state $i$ to all other states must be 1,

$$
\sum_{j=1}^{N} P_{i,j} = 1.
$$
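As a quick illustration (a minimal sketch, not part of the original text), the snippet below builds a tiny two-state stochastic matrix with NumPy, checks the row-sum constraint, and advances a distribution by one step; the states and probabilities are made up purely for demonstration.

```python
import numpy as np

# Hypothetical 2-state chain; the numbers are illustrative only.
# Row i holds the probabilities of moving from state i to each state j,
# so every row must sum to 1.
P = np.array([
    [0.9, 0.1],   # from state 0: stay with prob 0.9, switch with prob 0.1
    [0.5, 0.5],   # from state 1: switch with prob 0.5, stay with prob 0.5
])

assert np.allclose(P.sum(axis=1), 1.0)  # stochastic-matrix row constraint

pi0 = np.array([1.0, 0.0])   # start in state 0 with certainty
pi1 = pi0 @ P                # distribution after one transition
print(pi1)                   # -> [0.9 0.1]
```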
Example
Consider an RPG in which each attack has a nominal 33% blitz rate, with one extra rule: if the first two attacks both fail to blitz, the third attack is guaranteed to blitz. So what is the actual blitz rate?
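One way to answer this with the machinery above (a sketch under an assumed state labelling, not necessarily how the original solves it): let the state be the number of consecutive non-blitz attacks so far (0, 1, or 2), build the $3 \times 3$ transition matrix, and read the actual blitz rate off the stationary distribution.

```python
import numpy as np

# States = consecutive non-blitz attacks before the current one (0, 1, 2).
# The 1/3 nominal rate comes from the problem; the state labelling is an
# assumption made for this sketch.
p = 1 / 3
P = np.array([
    [p,   1 - p, 0    ],  # 0 misses: blitz -> back to 0, miss -> 1
    [p,   0,     1 - p],  # 1 miss:   blitz -> back to 0, miss -> 2
    [1.0, 0,     0    ],  # 2 misses: the next attack always blitzes
])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.isclose(eigvals, 1))])
pi /= pi.sum()

# Probability that an attack blitzes, given the state it is made from.
blitz_given_state = np.array([p, p, 1.0])
print(pi @ blitz_given_state)  # ~0.4737, i.e. 9/19
```

With $p = 1/3$ the stationary distribution works out to $(9/19,\ 6/19,\ 4/19)$, so the actual blitz rate is $9/19 \approx 47.4\%$, noticeably higher than the nominal 33%.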
Simulation Code:
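A minimal Monte Carlo sketch of the same mechanic (the function name, trial count, and use of Python's `random` module are arbitrary choices for illustration); its estimate should land near the analytical 9/19:

```python
import random

def simulate(num_attacks: int = 1_000_000, p: float = 1 / 3) -> float:
    """Estimate the actual blitz rate under the guaranteed-third-blitz rule."""
    blitzes = 0
    misses_in_a_row = 0
    for _ in range(num_attacks):
        # After two consecutive misses, the next attack always blitzes.
        if misses_in_a_row == 2 or random.random() < p:
            blitzes += 1
            misses_in_a_row = 0
        else:
            misses_in_a_row += 1
    return blitzes / num_attacks

print(simulate())  # ~0.474, matching the stationary-distribution answer
```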