Markov chain

or Mar·koff chain

[ mahr-kawf ]

noun

Statistics.
  1. a Markov process restricted to discrete random events or to discontinuous time sequences.


Markov chain

/ ˈmɑːkɒf /

noun

  1. statistics a sequence of events the probability for each of which is dependent only on the event immediately preceding it
“Collins English Dictionary — Complete & Unabridged” 2012 Digital Edition © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012
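
A minimal sketch of this definition in Python may help; the state names ("sunny", "rainy") and transition probabilities below are invented purely for illustration. The point is that each next event is drawn using only the event immediately preceding it.

```python
import random

# Invented transition probabilities for illustration:
# transitions[state][next_state] = probability of moving to next_state
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_event(current):
    # The Markov property: the distribution of the next event depends
    # only on the current event, not on the rest of the history.
    outcomes = list(transitions[current])
    weights = [transitions[current][s] for s in outcomes]
    return random.choices(outcomes, weights=weights)[0]

state = "sunny"
history = [state]
for _ in range(10):
    state = next_event(state)
    history.append(state)
print(" -> ".join(history))
```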

Word History and Origins

Origin of Markov chain

First recorded in 1940–45; Markov process

C20: named after Andrei Markov (1856–1922), Russian mathematician

Example Sentences

These rules could be decomposed into two sets that dominate at distinct length scales -- Markov chain and random nuclei.

The Markov chain, simple as it is, somehow captures something of the style of naming practices of different eras.

My ego would like me to believe that my writing process is a little more complicated than a Markov chain.

The full Markov chain Monte Carlo analysis and uncertainties are discussed in Methods.

The main tool in the Duke paper is a method called the “Markov chain Monte Carlo” algorithm.
