Definition of 'markoff chain'
English to English
noun
1 a Markov process for which the parameter is discrete time values
source: wordnet30
More Word(s)
markoff process, markov process
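As a minimal illustration of the definition above (a Markov process whose parameter is discrete time values), the sketch below simulates a two-state chain. The states, the transition probabilities, and the function names are all hypothetical, chosen only to show the discrete-time Markov property: each next state depends only on the current state.

```python
import random

# Hypothetical two-state Markov chain over states "sunny" and "rainy".
# transitions[s] gives the probability of each next state from state s.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state; it depends only on the current state
    (the Markov property), observed at a discrete time step."""
    probs = transitions[state]
    return rng.choices(list(probs), weights=list(probs.values()), k=1)[0]

def simulate(start, n_steps, seed=0):
    """Return the chain's trajectory over n_steps discrete time steps."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

Because the parameter is discrete time, the trajectory is simply a list indexed 0, 1, 2, ...; a continuous-time Markov process would instead attach a random holding time to each state.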
