Results for « Markov chain » (eng)
13511368-n
Markov chain, Markoff chain: a Markov process in which the time parameter takes discrete values
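
The gloss describes a Markov process evolving over discrete time steps. Below is a minimal sketch of such a chain; the states and transition probabilities are hypothetical and chosen only to illustrate the definition.

```python
# Minimal sketch of a discrete-time Markov chain (illustrative only).
# The states and transition probabilities below are hypothetical.
import random

# transition[i][j] = P(next state = j | current state = i)
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(current):
    """Sample the next state given only the current one (the Markov
    property: the future depends on the present, not on the past)."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in transition[current].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

state = "sunny"
chain = [state]
for _ in range(10):      # 10 discrete time steps
    state = step(state)
    chain.append(state)
print(chain)
```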

