Definition of Markoff process
1. Noun. A simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state.
Specialized synonyms: Markoff Chain, Markov Chain
Generic synonyms: Stochastic Process
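As an illustrative sketch (not part of the original entry), the defining property above can be shown with a small Python simulation of a two-state Markov chain: the next state is sampled using only the current state, never the earlier history. The states and transition probabilities here are invented for illustration.

```python
import random

# Hypothetical two-state chain; the probabilities are made up
# purely to illustrate the Markov property.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def next_state(state, rng=random):
    """Sample the next state from the current state alone --
    the distribution of the future depends only on the present."""
    states, probs = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=probs, k=1)[0]

def simulate(start, steps, seed=0):
    """Run the chain for `steps` transitions from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain

print(simulate("sunny", 5))
```

Note that `next_state` receives only the current state, so how the chain arrived there cannot influence the outcome; this is exactly the memorylessness the definition describes.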