

3 Markov Chains: Introduction

3.1 Definitions

A Markov process {X_t} is a stochastic process with the property that, given the value of X_t, the values of X_s for s > t are not influenced by the values of X_u for u < t. In words, the probability of any particular future behavior of the process, when its current state is known exactly, is not altered by additional knowledge concerning its past behavior. A discrete-time Markov chain is a Markov process whose state space is a finite or countable set, and whose (time) index set is T = (0, 1, 2, ...).

In formal terms, the Markov property is that

Pr{X_{n+1} = j | X_0 = i_0, ..., X_{n-1} = i_{n-1}, X_n = i} = Pr{X_{n+1} = j | X_n = i}    (3.1)

for all time points n and all states i_0, ..., i_{n-1}, i, j.

It is frequently convenient to label the state space of the Markov chain by the nonnegative integers {0, 1, 2, ...}, which we will do unless the contrary is explicitly stated, and it is customary to speak of X_n as being in state i if X_n = i.
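The Markov property (3.1) can be illustrated with a short simulation sketch: at each step, the next state is sampled using only the current state, never the earlier history. The 3-state transition matrix below is purely illustrative and is not taken from the text (transition matrices themselves are defined later in the chapter).

```python
import random

# Hypothetical 3-state transition matrix: P[i][j] = Pr{X_{n+1} = j | X_n = i}.
# The specific probabilities are made up for illustration.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
]

def simulate(P, start, n, rng=random):
    """Simulate n steps of a discrete-time Markov chain from state `start`.

    The next state is drawn from the row of P indexed by the current state
    alone -- earlier states are ignored, which is exactly the Markov
    property of Eq. (3.1).
    """
    states = list(range(len(P)))
    path = [start]
    for _ in range(n):
        current = path[-1]
        # Conditioning only on X_n = current; X_0, ..., X_{n-1} play no role.
        path.append(rng.choices(states, weights=P[current])[0])
    return path

random.seed(0)
print(simulate(P, start=0, n=10))
```

Note that the function never inspects `path[:-1]` when choosing the next state; that structural restriction, not any property of the particular numbers in P, is what makes the simulated process a Markov chain.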