Markov information source, or simply, a Markov source




In mathematics, a Markov information source, or simply, a Markov source, is an information source whose underlying dynamics are given by a stationary finite Markov chain.



Formal definition

An information source is a sequence of random variables ranging over a finite alphabet Γ, having a stationary distribution.

A Markov information source is then a (stationary) Markov chain M, together with a function

f : S → Γ

that maps states s in the Markov chain to letters in the alphabet Γ.

A unifilar Markov source is a Markov source for which the values f(s_k) are distinct whenever each of the states s_k is reachable, in one step, from a common prior state. Unifilar sources are notable in that many of their properties are far more easily analyzed than in the general case.
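
To make the definition concrete, here is a minimal Python sketch of a Markov source: a two-state chain M together with a map f from states to letters of Γ. The state names, transition probabilities, and alphabet are invented for illustration; since f assigns distinct letters to the two states, this particular source also happens to be unifilar.

```python
import random

# Hypothetical two-state Markov chain M: transition probabilities out of each state.
transitions = {
    "s0": {"s0": 0.7, "s1": 0.3},
    "s1": {"s0": 0.4, "s1": 0.6},
}

# The map f : S -> Gamma from states to letters of the alphabet (here Gamma = {"a", "b"}).
f = {"s0": "a", "s1": "b"}

def emit(state, n, rng=random):
    """Emit n letters from the Markov source, starting in `state`."""
    letters = []
    for _ in range(n):
        letters.append(f[state])
        # The next state is drawn using only the current state (Markov property).
        next_states, probs = zip(*transitions[state].items())
        state = rng.choices(next_states, weights=probs, k=1)[0]
    return "".join(letters)

print(emit("s0", 20))  # e.g. "aababbba..."
```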

Applications

Markov sources are commonly used in communication theory, as a model of a transmitter. Markov sources also occur in natural language processing, where they are used to represent hidden meaning in a text. Given the output of a Markov source, whose underlying Markov chain is unknown, the task of solving for the underlying chain is undertaken by the techniques of hidden Markov models, such as the Viterbi algorithm.
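
As a sketch of that last step, the function below is a bare-bones Viterbi decoder: given an observed letter sequence and assumed hidden Markov model parameters (start, transition, and emission probabilities), it returns the most probable hidden state path. All names and numbers are illustrative assumptions, not values from the article; for a Markov source proper, the emission distribution is degenerate, since each state emits exactly the letter f(s).

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most probable hidden state path for the observation sequence `obs`."""
    # V[t][s] = probability of the best path that ends in state s at time t.
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    path = {s: [s] for s in states}

    for t in range(1, len(obs)):
        V.append({})
        new_path = {}
        for s in states:
            # Best predecessor for state s at time t.
            prob, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p) for p in states
            )
            V[t][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path

    # Best final state and its path.
    prob, last = max((V[-1][s], s) for s in states)
    return prob, path[last]

# Illustrative (hypothetical) parameters for a two-state source over {"a", "b"}.
states = ("s0", "s1")
start_p = {"s0": 0.5, "s1": 0.5}
trans_p = {"s0": {"s0": 0.7, "s1": 0.3}, "s1": {"s0": 0.4, "s1": 0.6}}
emit_p = {"s0": {"a": 0.9, "b": 0.1}, "s1": {"a": 0.2, "b": 0.8}}

print(viterbi("abba", states, start_p, trans_p, emit_p))
```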

Related definitions

A Markov chain is a characterization of a system that transits from one state to another in a chain-like manner. It concerns any random process endowed with the Markov property, i.e., simply put, the property that the next state depends only on the current state and not on the past. It is a Markov model, named after Andrey Markov, for a particular type of Markov process in which the process can only be in a finite or countable number of states. Markov chains are useful as tools for statistical modeling in almost all fields of modern applied mathematics.
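
A small numerical illustration of this property: with a hypothetical two-state transition matrix (the entries are made up for the example), the distribution over the next state is computed from the current distribution alone, and iterating the step drives the distribution toward the chain's stationary distribution mentioned in the formal definition above.

```python
# Hypothetical two-state transition matrix: row i gives the probabilities out of state i.
P = [[0.7, 0.3],
     [0.4, 0.6]]

def step(dist, P):
    """One step of the chain: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0]        # start in state 0 with certainty
for _ in range(50):      # iterating the step converges to the stationary distribution
    dist = step(dist, P)
print(dist)              # approximately [0.571..., 0.428...], i.e. [4/7, 3/7]
```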



In probability theory and statistics, a Markov process, named after the Russian mathematician Andrey Markov, is a time-varying random phenomenon for which a specific property (the Markov property) holds.

The term Markov property refers to the memoryless property of a stochastic process.



In a stochastic or random process, there is some indeterminacy in its future evolution, described by probability distributions. This means that even if the initial condition (or starting point) is known, there are many possible paths the process might follow, some more probable than others.

