Markov process
noun
Markov process  n.  (Also spelled Markoff process)  (Statistics) A random process in which the probabilities of states in a series depend only on the properties of the immediately preceding state or the next preceding state, independent of the path by which the preceding state was reached. It is distinguished from a Markov chain in that the states of a Markov process may be continuous as well as discrete.

Collaborative International Dictionary of English 0.48
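
As an informal illustration (the notation X_n is assumed here and is not part of the dictionary entry), the "memoryless" property described above can be written for a discrete-time process as:

    P(X_{n+1} = x | X_n = x_n, X_{n-1} = x_{n-1}, ..., X_0 = x_0) = P(X_{n+1} = x | X_n = x_n)

In words: the conditional distribution of the next state depends only on the current state, not on the path by which that state was reached.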

Words linked to "Markov process" :   Markov chain, Markoff process, stochastic process


