Definition of Markoff chain

1. Noun. A Markov process in which the parameter takes discrete time values.

Exact synonyms: Markov Chain
Generic synonyms: Markoff Process, Markov Process
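
For illustration, here is a minimal Python sketch of a discrete-time Markov chain: the chain occupies one of a finite set of states, and at each discrete time step it moves to a next state with probabilities that depend only on the current state. The states, transition matrix, and function names below are illustrative assumptions, not part of this entry.

import random

# A minimal sketch of a discrete-time Markov chain (illustrative only).
# The states and transition probabilities are made-up assumptions.
STATES = ["sunny", "rainy"]

# TRANSITION[i][j] = probability of moving from state i to state j
# in one discrete time step; each row sums to 1.
TRANSITION = [
    [0.8, 0.2],  # from "sunny"
    [0.4, 0.6],  # from "rainy"
]

def simulate(start, n_steps, rng=random):
    """Observe the chain for n_steps one-step transitions from a start state."""
    i = STATES.index(start)
    path = [STATES[i]]
    for _ in range(n_steps):
        # The next state depends only on the current state (Markov property),
        # and the time parameter advances in discrete steps.
        i = rng.choices(range(len(STATES)), weights=TRANSITION[i])[0]
        path.append(STATES[i])
    return path

if __name__ == "__main__":
    print(simulate("sunny", 10))

Running the script prints one random path of eleven states: the start state followed by ten observed one-step transitions.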

Lexicographical Neighbors of Markoff Chain

Mark
Mark Anthony
Mark Antony
Mark Clark
Mark Hopkins
Mark Rothko
Mark Tobey
Mark Twain
Mark Wayne Clark
Markab
Markan
Markaz-ud-Dawa-wal-Irshad
Markie
Markland
Markoff
Markoff chain
Markoff process
Markov
Markov chain
Markov chains
Markov jump process
Markov process
Markova
Markovian
Markowitz
Marks
Marks and Sparks
Markus
Marky
Marla

Literary usage of Markoff chain

Below you will find example usage of this term as found in modern and/or classical literature:

1. R.R. Bahadur's Lectures on the Theory of Estimation by Raghu Raj Bahadur, Stephen M. Stigler, Wing Hung Wong, Daming Xu (2002)
"... 1) x (0, 1) and that a markoff chain with transition probability matrix as above starts at Т and is observed for n one-step transitions. ..."
