Markov chain

What is a Markov chain?
A Markov chain is a mathematical process that moves from one state to another within a finite set of possible states. It consists of a collection of states together with transition probabilities, where the probability of the next state depends only on the current state, not on the states that came before it. A Markov chain is also known as a discrete-time Markov chain (DTMC) or Markov process.
Markov chains are mainly used to predict the future state of a variable or system from its current state, using a probabilistic approach to determine the next state. They are commonly represented as directed graphs in which the nodes are the states and the edges are labeled with the probabilities of transitioning from one state to another, as in the sketch below.
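The following is a minimal sketch of such a chain in Python. The two weather states and their transition probabilities are illustrative assumptions, not values from this article; the point is that the next state is sampled using only the current state.

```python
import random

# Illustrative two-state weather Markov chain (assumed example values).
# Each state maps to the probabilities of moving to the possible next states.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    """Sample the next state using only the current state (Markov property)."""
    states = list(transitions[current].keys())
    weights = list(transitions[current].values())
    return random.choices(states, weights=weights)[0]

# Simulate a short sequence of states starting from "sunny".
state = "sunny"
sequence = [state]
for _ in range(10):
    state = next_state(state)
    sequence.append(state)

print(" -> ".join(sequence))
```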

Markov chains have several applications in computer and Internet technologies. For example, the PageRank formula used by Google Search models web browsing as a Markov chain to calculate the PageRank of a particular webpage. Markov chains are also used to predict user behavior on a website based on previous user preferences or interactions.
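As a rough illustration of the PageRank idea, the sketch below computes page ranks as the long-run distribution of a Markov chain over a tiny, made-up link graph using power iteration. The three-page graph and the damping factor of 0.85 are assumptions for demonstration; this is not Google's production formula.

```python
# Assumed toy link graph: each page maps to the pages it links to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

damping = 0.85  # commonly cited damping factor; assumed here
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start with equal rank

# Power iteration: repeatedly redistribute each page's rank along its outgoing links.
for _ in range(50):
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outgoing in links.items():
        share = damping * rank[page] / len(outgoing)
        for target in outgoing:
            new_rank[target] += share
    rank = new_rank

print({p: round(r, 3) for p, r in rank.items()})
```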
