Markoff process
noun see Markov process

New Collegiate Dictionary. 2001.

Look at other dictionaries:

  • Markoff process — Markov process, n. [after A. A. Markov, Russian mathematician, b. 1856, d. 1922.] (Statistics) a random process in which the probabilities of states in a series depend only on the properties of the immediately preceding state or… …   The Collaborative International Dictionary of English

  • Markoff process — noun a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state • Syn: ↑Markov process • Hypernyms: ↑stochastic process • Hyponyms: ↑Markov chain,… …   Useful english dictionary

  • Markoff chain — noun a Markov process for which the parameter is discrete time values • Syn: ↑Markov chain • Hypernyms: ↑Markov process, ↑Markoff process * * * noun see Markov chain …   Useful english dictionary

  • Markoff chain — Markov chain, n. [after A. A. Markov, Russian mathematician, b. 1856, d. 1922.] (Statistics) A random process (Markov process) in which the probabilities of discrete states in a series depend only on the properties of the… …   The Collaborative International Dictionary of English

  • Markov process — noun a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state • Syn: ↑Markoff process • Hypernyms: ↑stochastic process • Hyponyms: ↑Markov chain,… …   Useful english dictionary

  • Markov process — Markov process, n. [after A. A. Markov, Russian mathematician, b. 1856, d. 1922.] (Statistics) a random process in which the probabilities of states in a series depend only on the properties of the immediately preceding state or the next… …   The Collaborative International Dictionary of English

  • Markov process — Statistics. a process in which future values of a random variable are statistically determined by present events and dependent only on the event immediately preceding. Also, Markoff process. [1935–40; after Russian mathematician Andrei Andreevich …   Universalium

  • stochastic process — noun a statistical process involving a number of random variables depending on a variable parameter (which is usually time) • Hypernyms: ↑model, ↑theoretical account, ↑framework • Hyponyms: ↑Markov process, ↑Markoff process, ↑random walk, ↑ …   Useful english dictionary

  • Markov process — noun Date: 1938 a stochastic process (as Brownian motion) that resembles a Markov chain except that the states are continuous; also: Markov chain; called also Markoff process …   New Collegiate Dictionary

  • Markov process — [mär′kôf] n. a chain of random events in which only the present state influences the next future state, as in a genetic code; also Markoff process …   English World dictionary
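
All of the definitions above describe the same property: given the present state, the distribution of the next state does not depend on how the process arrived there. A standard symbolic statement of this Markov property (not drawn from any single entry above) is

    P(X_{n+1} = x \mid X_n = x_n, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n).

A minimal Python sketch of a discrete-time, two-state Markov chain may make this concrete; the transition probabilities here are hypothetical, chosen only for illustration:

    import random

    # Probability of moving to state 1, given the current state.
    # These numbers are illustrative, not taken from any entry above.
    p_to_one = {0: 0.3, 1: 0.7}

    def step(state):
        # The next state depends only on the current state (the Markov property).
        return 1 if random.random() < p_to_one[state] else 0

    state = 0
    path = [state]
    for _ in range(10):
        state = step(state)
        path.append(state)
    print(path)  # e.g. [0, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1]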
