Definition of 'markov process'
English to English
noun
1 a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
source: wordnet30
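
The defining feature is the Markov property: P(X(n+1) = x | X(n), X(n-1), ..., X(0)) = P(X(n+1) = x | X(n)); the past influences the future only through the present state. Below is a minimal Python sketch of this idea, using a hypothetical two-state weather chain (the state names and transition probabilities are illustrative assumptions, not part of the definition):

import random

# Hypothetical two-state chain: the next state is drawn from a
# distribution that depends only on the current state.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    # Sample the next state; earlier history is never consulted.
    states, weights = zip(*transitions[state])
    return random.choices(states, weights=weights)[0]

state = "sunny"
for _ in range(5):
    state = step(state)
    print(state)
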
More Word(s)
stochastic process, markoff chain, markov chain
