However, upon further analysis, the data do not appear to have the "memoryless" property of a Markov chain. In my data (transition matrices shown in the image at the bottom), whether an element has ever touched state 04 or 05 has a significant impact on that element's state behavior in ALL future periods, not only the next period. The "memory" from one period to the next is quite sticky. Elements generally move among states NULL, 01, and 02 fairly randomly from one period to the next, but once an element touches any of states 03-05 it is forever "tainted" in its behavior and rarely falls back out of those states (mostly ending up in state 05).
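For illustration, here is the kind of informal base-R check that led me to this conclusion: comparing the next-state distribution conditioned on the current state alone against the distribution conditioned on the current and previous states. The `seq_states` vector is a made-up example sequence, not my actual data.

```r
# Hypothetical observed state sequence for one element over time
seq_states <- c("NULL", "01", "02", "01", "03", "04", "05", "05", "04", "05")

# First-order transition counts: rows = current state, columns = next state
first_order <- table(head(seq_states, -1), tail(seq_states, -1))

# Second-order counts: rows = (previous, current) pair, columns = next state.
# If the conditional distributions differ materially across 'previous',
# the process is not memoryless at first order.
n <- length(seq_states)
second_order <- table(paste(seq_states[1:(n - 2)], seq_states[2:(n - 1)]),
                      seq_states[3:n])
```

In my data the second-order tables differ sharply depending on whether the history includes states 03-05.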

I can also easily calculate the standard deviation of the transition probabilities. Note that in the image below the from/to transitions are oriented vertically along columns; my R code simply transposes the matrices.
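A minimal sketch of both steps, with hypothetical numbers: `t()` converts a column-oriented matrix to the usual row-oriented form, and the element-wise SD can be taken across a stack of estimated matrices.

```r
# Transitions stored column-wise (from-states along columns), as in the image
P_colwise <- matrix(c(0.7, 0.3,
                      0.2, 0.8), nrow = 2,
                    dimnames = list(to = c("01", "02"), from = c("01", "02")))
P <- t(P_colwise)  # rows now sum to 1: P[from, to]

# Given several estimated matrices (e.g. one per period), take the
# element-wise SD of each transition probability across the stack
P_list <- list(P, P)              # hypothetical stack of estimates
P_arr  <- simplify2array(P_list)  # 2 x 2 x 2 array
sd_mat <- apply(P_arr, c(1, 2), sd)
```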

My end goal is to run simulations to derive the probability distribution over the states NULL, 01, 02, 03, 04, 05 at period x, where x is a user input (with x > 6 in this example dataset).
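If the memoryless assumption held, this would be straightforward: the distribution at period x is the initial distribution times the x-th power of the transition matrix. A sketch with hypothetical states and numbers (this is exactly the computation I cannot trust, given the stickiness above):

```r
# Hypothetical 3-state example under the (possibly violated) Markov assumption
states <- c("NULL", "01", "02")
P <- matrix(c(0.5, 0.3, 0.2,
              0.2, 0.6, 0.2,
              0.1, 0.3, 0.6), nrow = 3, byrow = TRUE,
            dimnames = list(states, states))
init <- c(1, 0, 0)  # all elements start in state NULL

# x-th matrix power by repeated multiplication
mat_pow <- function(M, k) Reduce(`%*%`, replicate(k, M, simplify = FALSE))
x <- 7
dist_x <- init %*% mat_pow(P, x)  # state distribution at period x
```

Alternatively one can simulate many sample paths, drawing each step with `sample(states, 1, prob = P[current, ])`, and tabulate the states reached at period x.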

So my question is:

**What is a good approach for running transition-matrix simulations when your transitions do not meet the memoryless property of a Markov chain?**

*In this context, a pointer to an appropriate R package for running the simulations would be most helpful; I do everything in R now. Or, in practice, can the memoryless property be safely ignored when running MCMC? Is there an R package for testing the "Markovness" of a dataset?*