- Markoff chain
- noun see Markov chain
New Collegiate Dictionary. 2001.
Markoff chain — Markov chain \Markov chain\, n. [after A. A. Markov, Russian mathematician, b. 1856, d. 1922.] (Statistics) A random process (Markov process) in which the probabilities of discrete states in a series depend only on the properties of the… … The Collaborative International Dictionary of English
Markoff chain — noun a Markov process for which the parameter is discrete time values • Syn: ↑Markov chain • Hypernyms: ↑Markov process, ↑Markoff process * * * noun see markov chain … Useful english dictionary
Markoff process — noun a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state • Syn: ↑Markov process • Hypernyms: ↑stochastic process • Hyponyms: ↑Markov chain,… … Useful english dictionary
Lempel-Ziv-Markoff chain-Algorithm — The Lempel-Ziv-Markov algorithm (LZMA) is a free data-compression algorithm, developed by Igor Pavlov since 1998, that achieves comparatively good compression ratios and high decompression speed. It is named after… … Deutsch Wikipedia
Markoff process — Markov process \Markov process\, n. [after A. A. Markov, Russian mathematician, b. 1856, d. 1922.] (Statistics) a random process in which the probabilities of states in a series depend only on the properties of the immediately preceding state or… … The Collaborative International Dictionary of English
Markoff-Kette — A Markow-Kette (English: Markov chain, also Markov process, after Andrei Andrejewitsch Markow; other spellings: Markov-Kette, Markoff-Kette) is a special class of stochastic processes. A Markov chain is distinguished in… … Deutsch Wikipedia
Markov chain — noun a Markov process for which the parameter is discrete time values • Syn: ↑Markoff chain • Hypernyms: ↑Markov process, ↑Markoff process … Useful english dictionary
Markov chain — \Markov chain\, n. [after A. A. Markov, Russian mathematician, b. 1856, d. 1922.] (Statistics) A random process (Markov process) in which the probabilities of discrete states in a series depend only on the properties of the immediately preceding… … The Collaborative International Dictionary of English
Markov chain — noun Etymology: A. A. Markov died 1922 Russian mathematician Date: 1938 a usually discrete stochastic process (as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or… … New Collegiate Dictionary
Markov chain — /mahr kawf/, Statistics. a Markov process restricted to discrete random events or to discontinuous time sequences. Also, Markoff chain. [1940 45; see MARKOV PROCESS] * * * … Universalium
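The entries above all describe the same defining feature: in a Markov chain, the probability of the next state depends only on the current state, not on how the chain arrived there. A minimal sketch of this in Python, using an invented two-state weather model (the states "sunny"/"rainy" and the transition probabilities are illustrative assumptions, not taken from any dictionary entry):

```python
import random

# Transition table: each current state maps to (next state, probability)
# pairs. The next state is drawn using only the current state's row,
# which is exactly the Markov property described in the definitions.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Draw the next state from the current state's distribution."""
    choices, weights = zip(*TRANSITIONS[state])
    return rng.choices(choices, weights=weights, k=1)[0]

def simulate(start, n, seed=0):
    """Run the chain for n discrete time steps; return the visited states."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states

print(simulate("sunny", 5))
```

Because the parameter here is discrete time values, this is a Markov chain in the narrow sense used above; a Markov process is the broader notion covering continuous time as well.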