
Definition of Transition matrix
1. Noun. (mathematics, stochastic processes) A square matrix whose rows consist of nonnegative real numbers, with each row summing to $1$. Used to describe the transitions of a Markov chain; the entry in the $i$th row and $j$th column gives the probability of moving from state $i$ to state $j$ in one time step. ¹
¹ Source: wiktionary.com
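The definition above can be sketched in code. A minimal illustration (using NumPy, which is an assumption; the matrix values are invented for the example):

```python
import numpy as np

# Transition matrix for a 2-state Markov chain: P[i, j] is the
# probability of moving from state i to state j in one time step.
P = np.array([
    [0.9, 0.1],
    [0.5, 0.5],
])

# Each row consists of nonnegative reals and sums to 1.
assert np.all(P >= 0)
assert np.allclose(P.sum(axis=1), 1.0)

# Evolve a distribution over states by one step: row vector times P.
pi0 = np.array([1.0, 0.0])   # start in state 0 with certainty
pi1 = pi0 @ P                # distribution after one step
print(pi1)                   # [0.9 0.1]
```

Starting in state 0, the chain stays in state 0 with probability 0.9 and moves to state 1 with probability 0.1, matching the first row of the matrix.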