The noun Markoff chain has 1 sense
  1. Markov chain, Markoff chain - a Markov process whose time parameter takes only discrete values
    --1 is a kind of Markov process, Markoff process
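
To make the definition concrete, here is a minimal sketch of a discrete-time Markov chain in Python: the process moves between states at integer time steps, and the next state depends only on the current one. The states ("sunny", "rainy") and the transition probabilities are illustrative assumptions, not part of the dictionary entry.

import random

# transition[s] gives the probability distribution over the next state,
# conditioned only on the current state s (the Markov property).
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state: str) -> str:
    # Draw the next state from the current state's transition distribution.
    next_states = list(transition[state])
    weights = [transition[state][s] for s in next_states]
    return random.choices(next_states, weights=weights)[0]

# Simulate the chain for 10 discrete time steps.
state = "sunny"
for t in range(10):
    print(t, state)
    state = step(state)

The discreteness of the loop variable t is exactly what the gloss refers to: the chain is observed only at time values 0, 1, 2, ..., in contrast to a continuous-time Markov process.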