# Definitions

A process is a series of events.

A stochastic, or chance, process is a probabilistic mathematical model consisting of a collection, or family, of random variables indexed by time or space.  It is a series of events whose evolution is not fully determined by its initial state, which usually makes it a good model for a time series.

A process whose parameter (time index) and state space are both discrete is called a Markov chain.  Given a series of events or states, a Markov chain can be used to compute the probability of being in a particular state at a specific time, as well as the time until a state is first reached.
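The probability calculation described above can be sketched with a transition matrix: the distribution over states after n steps is obtained by repeatedly multiplying the current distribution by the matrix. The three states and probabilities below are hypothetical, chosen only for illustration.

```python
def step(dist, P):
    """Advance a probability distribution one transition: dist' = dist * P."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def distribution_at(dist, P, n):
    """Distribution over states after n transitions, starting from `dist`."""
    for _ in range(n):
        dist = step(dist, P)
    return dist

# Hypothetical 3-state chain; each row gives the transition
# probabilities out of one state and therefore sums to 1.
P = [
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.5, 0.3],
]
start = [1.0, 0.0, 0.0]              # start in state 0 with certainty
print(distribution_at(start, P, 2))  # probability of each state at time 2
```

Because the chain is memoryless, the whole calculation needs only the current distribution and the matrix, never the full history of states.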

Every Markov chain has a set of mutually exclusive states.  Transitions from state to state are considered instantaneous, meaning they take zero time.  Furthermore, the future of the system depends only on the current state, not on past states.  A Markov chain with only two states is called a two-state Markov chain.
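A two-state chain is small enough to check by simulation. The sketch below assumes hypothetical transition probabilities p (leaving state 0) and q (leaving state 1); in the long run such a chain spends a fraction p / (p + q) of its time in state 1.

```python
import random

def simulate(p, q, steps, seed=0):
    """Simulate a two-state chain; return the fraction of time in state 1.

    p: probability of moving 0 -> 1 on a step where the chain is in state 0.
    q: probability of moving 1 -> 0 on a step where the chain is in state 1.
    Note the next state is drawn from the current state alone (Markov property).
    """
    rng = random.Random(seed)
    state, time_in_1 = 0, 0
    for _ in range(steps):
        if state == 0:
            if rng.random() < p:
                state = 1
        else:
            if rng.random() < q:
                state = 0
        time_in_1 += state
    return time_in_1 / steps

p, q = 0.2, 0.5
print(simulate(p, q, 100_000))  # should approach p / (p + q) ≈ 0.2857
```

The long-run fraction p / (p + q) is the stationary distribution's weight on state 1, so a long simulation provides a quick sanity check of the transition probabilities.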
