Definitions
There is one meaning of the phrase Markov Chain.
Markov Chain - as a noun
A Markov process for which the parameter takes discrete time values.
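In other words, the chain moves between states at discrete time steps, and the probability of the next state depends only on the current state, not on the earlier history. As a minimal sketch of that idea, the Python snippet below simulates a hypothetical two-state "weather" chain; the states, transition probabilities, and function names are illustrative assumptions, not part of the definition itself.

```python
import random

# Hypothetical two-state chain, purely for illustration.
# Each row lists (next_state, probability) pairs summing to 1.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state: str) -> str:
    """Sample the next state given only the current one (the Markov property)."""
    next_states, weights = zip(*TRANSITIONS[state])
    return random.choices(next_states, weights=weights)[0]

def simulate(start: str, n_steps: int) -> list[str]:
    """Walk the chain for a number of discrete time steps."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 10))
```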
Synonyms (Exact Relations)
Markoff chain