Markov process

Definitions

from the GNU version of the Collaborative International Dictionary of English.

  • noun (Statistics) a random process in which the probabilities of states in a series depend only on the properties of the immediately preceding state or the next preceding state, independent of the path by which the preceding state was reached. It is distinguished from a Markov chain in that the states of a Markov process may be continuous as well as discrete.

from Wiktionary, Creative Commons Attribution/Share-Alike License.

  • noun (probability theory) A stochastic process in which the probability distribution of the current state is conditionally independent of the path of past states.

from WordNet 3.0 Copyright 2006 by Princeton University. All rights reserved.

  • noun a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
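The common thread of the definitions above is that the next state is sampled from a distribution that depends only on the current state. A minimal sketch of this in Python, using a hypothetical two-state weather chain (the states and transition probabilities here are illustrative assumptions, not part of the definitions):

```python
import random

# Illustrative transition probabilities: from each state, the chance of
# moving to each possible next state. These numbers are made up.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state using ONLY the current state (the Markov property)."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, n, seed=0):
    """Generate a path of n steps; the history beyond the last state is never consulted."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5, seed=1))
```

Because `step` receives only the current state, the path by which that state was reached cannot influence the next draw. With discrete states like these, this is a Markov chain; the GNU definition's broader "Markov process" also allows a continuum of states.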
