Definition of Markov chain
1. Noun. A Markov process in which the time parameter takes discrete values.
2. Noun. (probability theory) A discrete-time stochastic process with the Markov property. ¹
¹ Source: wiktionary.com
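The definitions above can be illustrated with a minimal simulation. The sketch below uses a hypothetical two-state "weather" chain (the states and transition probabilities are illustrative, not from the source); at each discrete time step the next state depends only on the current state, which is the Markov property.

```python
import random

# Hypothetical two-state chain: transition[s][t] is the probability of
# moving from state s to state t in one time step.
transition = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """Advance the chain one discrete time step; the next state
    depends only on the current state (the Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in transition[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # fallback for floating-point rounding

def simulate(start, n, seed=0):
    """Run the chain for n steps and count visits to each state."""
    rng = random.Random(seed)
    state = start
    counts = {s: 0 for s in transition}
    for _ in range(n):
        state = step(state, rng)
        counts[state] += 1
    return counts

counts = simulate("sunny", 100_000)
# Long-run visit fractions approximate the stationary distribution,
# which for these probabilities is 5/6 sunny, 1/6 rainy.
print({s: c / 100_000 for s, c in counts.items()})
```

The long-run fractions converge because this small chain is irreducible and aperiodic, so it has a unique stationary distribution.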
Literary usage of Markov chain
Below are examples of this term's usage in modern and classical literature:
1. Statistics in Molecular Biology and Genetics: Selected Proceedings of a 1997 by Françoise Seillier-Moiseiwitsch (1999)
"The challenging part is to approximate the posterior, and we do this by constructing a Markov chain having the posterior as its invariant distribution, ..."
2. Assessing the Human Health Risks of Trichloroethylene: Key Scientific Issues by National Research Council (U.S.) (2006)
"Posterior Markov chain Monte Carlo simulation was used to implement Bayesian posterior inference—again, a natural choice and almost a compulsory consequence ..."
3. State of the Art in Probability and Statistics: Festschrift for Willem R by Mathisca de Gunst, Chris Klaassen, A. W. van der Vaart (2001)
"Associated with the estimation problem and the improper prior is a symmetric Markov chain. It is shown that if the Markov chain is recurrent, ..."
4. Spatial Statistics and Imaging by Antonio Possolo (1991)
"Let X₁, X₂, ... be a stationary Markov chain with r + 1 states, 1 ≤ r < ∞. (With more notation the results of this section extend to countable-state chains ..."
5. Statistics, Probability, and Game Theory: Papers in Honor of David Blackwell by David Blackwell, Thomas Shelburne Ferguson, Lloyd S. Shapley, James B. MacQueen (1996)
"We have a Markov chain with finite state space and known initial state, ... A state of a Markov chain is called recurrent if the probability that it is ..."
6. Statistical Inference from Genetic Data on Pedigrees by Elizabeth A. Thompson (2000)
"Chapter 8 Markov chain Monte Carlo on Pedigrees 8.1 Simulation conditional on data: MCMC Equation (7.10) gave the likelihood for a genetic model on a ..."
7. Inequalities in Statistics and Probability: Proceedings of the Symposium on by Yung Liang Tong (1984)
"A central limit theorem for processes defined on a finite Markov chain. Proc. Camb. Phil. Soc. 60, 547-567. KEILSON, J. and WISHART, D. M. G. (1967). ..."
8. A Festschrift for Herman Rubin by Anirban DasGupta, Herman Rubin (2004)
"In the current Markov chain context, here is a "one-set" condition that implies ... Suppose the Markov chain W with state space 0 and transition function ..."
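Several of the excerpts above refer to Markov chain Monte Carlo: constructing a Markov chain whose invariant distribution is a posterior of interest, then sampling by running the chain. The following is a minimal random-walk Metropolis sketch; the target density (a standard normal, standing in for an unnormalized posterior) and the step size are illustrative assumptions, not taken from any of the cited works.

```python
import math
import random

def log_target(x):
    # Unnormalized log-density of the target distribution. A standard
    # normal stands in for a posterior; the normalizing constant is
    # never needed, which is the point of the method.
    return -0.5 * x * x

def metropolis_hastings(n, step_size=1.0, seed=0):
    """Random-walk Metropolis sampler. The resulting Markov chain has
    the target as its invariant distribution, so long-run samples
    approximate draws from it."""
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n):
        proposal = x + rng.gauss(0.0, step_size)
        # Accept with probability min(1, target(proposal) / target(x)),
        # computed on the log scale for numerical stability.
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis_hastings(50_000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
# The sample mean and variance should be close to 0 and 1,
# the moments of the standard normal target.
```

In practice one would discard an initial burn-in portion of the chain and check convergence diagnostics before trusting the samples; this sketch omits both for brevity.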