Definition of Markoff chain
1. Noun. A Markov process in which the time parameter takes discrete values.
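The definition above can be illustrated with a short sketch: a discrete-time Markov chain is driven by a row-stochastic transition matrix, where entry (i, j) gives the probability of stepping from state i to state j. The two-state "weather" chain below is a hypothetical example, not drawn from any source cited on this page.

```python
import random

def simulate_chain(P, states, start, n_steps, seed=0):
    """Run a discrete-time Markov chain for n_steps one-step transitions.

    P is a row-stochastic matrix: P[i][j] is the probability of moving
    from states[i] to states[j] in a single time step.
    """
    rng = random.Random(seed)
    path = [start]
    i = states.index(start)
    for _ in range(n_steps):
        # Sample the next state using the current state's row of P.
        i = rng.choices(range(len(states)), weights=P[i])[0]
        path.append(states[i])
    return path

# Hypothetical two-state chain: each row sums to 1.
P = [[0.9, 0.1],   # sunny -> sunny, sunny -> rainy
     [0.5, 0.5]]   # rainy -> sunny, rainy -> rainy
print(simulate_chain(P, ["sunny", "rainy"], "sunny", 5))
```

Because the process is Markov, each step depends only on the current state, which is why a single matrix row is enough to sample the next state.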
Literary usage of Markoff chain
Below is an example of this term's usage in modern and classical literature:
1. R.R. Bahadur's Lectures on the Theory of Estimation by Raghu Raj Bahadur, Stephen M. Stigler, Wing Hung Wong, Daming Xu (2002)
"... 1) x (0, 1) and that a Markoff chain with transition probability matrix as above starts at T and is observed for n one-step transitions. ..."