Probability of Transition Times for a Markov Chain

I have a simple Markov chain with two states, A and B. Each day, if the chain is in state A, the probability of staying in state A is 0.6 and of moving to state B is 0.4. If the chain is in state B, the probability of moving to state A is 0.6 and of staying in state B is 0.4. So in the stationary distribution, the chain spends 60% of the time in state A and 40% of the time in state B.

My question is: if the Markov chain starts in state A, then after N days (say N = 100), what is the probability that the chain makes the round trip from A to B and back to A exactly n times, for n = 1, 2, ..., 50?

Is there a theory for calculating this? Any ideas or references would be appreciated. Thanks a lot.


Well-Known Member
I can't think of any practical way to get an exact solution. In principle there is an exact answer for any given set of conditions, but finding it would mean tabulating all the possible state sequences, calculating the probability of each, and collating the results. That's 2^100 sequences for 100 days. There might possibly be a shortcut if all the probabilities were 0.5.
In any particular situation, instead of enumerating the 2^100 possibilities, you can take a random sample of, say, 10000 of them. In other words, do a Monte Carlo simulation: run the chain for 100 days, moving between A and B according to your probabilities, count how many round trips occur in those 100 days, and write the number down. Repeat 10000 times (or as many as you want), then find the proportion of runs with n = 1, 2, ... round trips.
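A minimal sketch of that simulation in Python, assuming the B-to-A probability is 0.6 (consistent with the 60/40 stationary split in the question) and counting a round trip as completed each time the chain moves B → A; the function names are my own invention:

```python
import random

def count_round_trips(days, p_stay_a=0.6, p_b_to_a=0.6):
    """Simulate one run of `days` steps starting in state A.
    Returns the number of completed A -> B -> A round trips."""
    state = 'A'
    trips = 0
    for _ in range(days):
        if state == 'A':
            if random.random() >= p_stay_a:   # move A -> B with prob 0.4
                state = 'B'
        else:
            if random.random() < p_b_to_a:    # move B -> A with prob 0.6
                state = 'A'
                trips += 1  # each B -> A move completes one round trip
    return trips

def monte_carlo(runs=10000, days=100):
    """Estimate P(n round trips in `days` days) from `runs` simulations."""
    counts = {}
    for _ in range(runs):
        n = count_round_trips(days)
        counts[n] = counts.get(n, 0) + 1
    return {n: c / runs for n, c in sorted(counts.items())}
```

With 10000 runs the estimates for the common values of n are typically good to a couple of decimal places; rare values of n (far out in the tails) will need many more runs.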


Well-Known Member
Are you still interested in an exact solution? I think one is possible using dynamic programming, which is just a fancy way of saying that you build up a table of smaller sub-results to calculate the final answer.
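A sketch of one way that table could look, under the same assumptions as before (B-to-A probability 0.6, a round trip counted on each B → A move): track, for each day, the probability of being in each state with each possible number of completed trips, and update it one day at a time. This avoids the 2^100 enumeration entirely; the cost is only days × states × trip counts.

```python
def exact_distribution(days, max_trips, p_aa=0.6, p_ba=0.6):
    """Exact distribution of the number of completed A -> B -> A round
    trips after `days` steps, starting in state A.  Dynamic programming
    over (current state, number of completed trips)."""
    p_ab = 1.0 - p_aa   # A -> B
    p_bb = 1.0 - p_ba   # B -> B
    # prob[state][n] = P(in `state` with n completed trips); state 0 = A, 1 = B
    prob = [[0.0] * (max_trips + 2) for _ in range(2)]
    prob[0][0] = 1.0    # day 0: in A with 0 trips
    for _ in range(days):
        new = [[0.0] * (max_trips + 2) for _ in range(2)]
        for n in range(max_trips + 1):
            pa, pb = prob[0][n], prob[1][n]
            new[0][n] += pa * p_aa        # stay in A
            new[1][n] += pa * p_ab        # move A -> B
            new[1][n] += pb * p_bb        # stay in B
            new[0][n + 1] += pb * p_ba    # B -> A completes a trip
        prob = new
    # marginal over the final state
    return [prob[0][n] + prob[1][n] for n in range(max_trips + 1)]
```

For the question as posed you would call exact_distribution(100, 50); since each round trip takes at least two days, 50 is the largest possible count in 100 days, so the returned probabilities sum to 1.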