Markov Chains
Markov Chain
•a sequence of random variables $X_1, X_2, \ldots, X_T$
•$X_t$ is the state of the model at time $t$
•Markov assumption: each state is dependent only on the previous one
–dependency given by a conditional probability: $P(X_t \mid X_1, \ldots, X_{t-1}) = P(X_t \mid X_{t-1})$
•The above is actually a first-order Markov chain (see the sampling sketch below)
•An N-th order Markov chain conditions on the previous N states: $P(X_t \mid X_1, \ldots, X_{t-1}) = P(X_t \mid X_{t-N}, \ldots, X_{t-1})$
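To make the first-order dependency concrete, here is a minimal sketch that samples a state sequence from a small transition table. The two weather states and their probabilities are illustrative assumptions, not taken from the slides.

```python
import random

# Illustrative states and transition probabilities (assumed for this sketch).
# transition[i][j] = P(X_t = j | X_{t-1} = i)
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def sample_chain(start, length):
    """Sample X_1, ..., X_T under the first-order Markov assumption:
    each next state depends only on the current state."""
    sequence = [start]
    for _ in range(length - 1):
        current = sequence[-1]
        next_states = list(transition[current].keys())
        weights = list(transition[current].values())
        sequence.append(random.choices(next_states, weights=weights)[0])
    return sequence

print(sample_chain("sunny", 10))
```

Because of the Markov assumption, the sampler only ever inspects the last state in the sequence; an N-th order chain would instead look back at the previous N states.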