- Markov process, Markoff process - a simple stochastic process in which the distribution of future states depends only on the present state and not on how the process arrived at that state (the Markov property, formalized below)
--1 is a kind of stochastic process
--1 has particulars: Markov chain, Markoff chain
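Formally, for a discrete-time process X_0, X_1, X_2, ... the Markov property in the gloss above can be written as

$$P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)$$

i.e. conditioning on the full history gives the same distribution as conditioning on the present state alone.

As a minimal illustrative sketch of the "Markov chain" particular, here is a two-state chain in Python; the state names ("sunny", "rainy") and transition probabilities are invented for the example, not taken from the entry:

```python
import random

# Hypothetical transition probabilities for a two-state weather chain.
# Each row depends only on the current state -- the Markov property.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state: str) -> str:
    """Sample the next state using only the current state."""
    states = list(TRANSITIONS[state])
    weights = [TRANSITIONS[state][s] for s in states]
    return random.choices(states, weights=weights)[0]

# Simulate a short trajectory; earlier states never influence the next draw.
state = "sunny"
for _ in range(10):
    state = step(state)
    print(state)
```

Because `step` reads nothing but its `state` argument, the simulated trajectory depends on the past only through the present state, matching the definition above.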