Origin of Markov chains | Journey into information theory | Computer Science | Khan Academy

## Translated in collaboration with Agami and Grameenphone ##

When observing the natural world, many of us notice a beautiful dichotomy. No two things are ever exactly alike, yet they all seem to follow some underlying form. Plato believed that the true forms of the universe were hidden from us; through observation of the natural world we could acquire only approximate knowledge of them. They were hidden blueprints. The pure forms were accessible only through the abstract reasoning of philosophy and mathematics. Take, for example, the circle: that which has the distance from its circumference to its center everywhere equal. Yet we will never find a material manifestation of a perfect circle or a perfectly straight line. Interestingly, though, Plato speculated that after countless years the universe would reach an ideal state, returning to its perfect form. This Platonic focus on abstract, pure forms remained popular for centuries. It wasn't until the 16th century that people began to embrace the messy variation of the real world and to apply mathematics to tease out its underlying patterns.

Bernoulli refined the idea of expectation. He focused on a method of accurately estimating the unknown probability of an event based on the number of times it occurs in independent trials. He used a simple example: suppose that, without your knowledge, 3,000 white pebbles and 2,000 black pebbles are hidden in a container, and that to determine the ratio of white to black by experiment, you draw one pebble after another, with replacement, noting how many white versus black pebbles you draw. He proved that the observed ratio of white to black pebbles converges on the actual ratio as the number of trials increases, a result known as the weak law of large numbers. He concluded by saying, "If observations of all events were continued for all eternity, it would be noticed that everything in the world is governed by precise ratios and a constant law of change."
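Bernoulli's experiment is easy to simulate: draw from the urn with replacement and watch the observed fraction of white pebbles approach the true ratio, 3,000/5,000 = 0.6. A minimal sketch, where the function name, draw counts, and seed are illustrative choices:

```python
import random

def estimate_ratio(num_draws, white=3000, black=2000, seed=0):
    """Draw pebbles with replacement from an urn of `white` white and
    `black` black pebbles; return the observed fraction of white draws."""
    rng = random.Random(seed)
    p_white = white / (white + black)  # true fraction = 0.6
    hits = sum(rng.random() < p_white for _ in range(num_draws))
    return hits / num_draws

# The weak law of large numbers: the estimate converges on 0.6
# as the number of trials increases.
for n in (10, 1_000, 100_000):
    print(n, estimate_ratio(n))
```

With only 10 draws the estimate can be far off; by 100,000 draws it sits within a fraction of a percent of the true ratio.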

" This idea spread very fast Because it was remarkable that It doesn't just match the expected average, But also the type of average probability A familiar, underlying shape [ __ ] distribution follows. A great example of this is the bean machine by Francis Galton. We consider each conflict as a single independent event, Throw such a coin. After 10 collisions or incidents, The beans fall into a container which Left vs.

Right deviation, Or express the head vs. tail of the coin. This overall curvature, known as binomial distribution, Which is an ideal structure Because its presence is noticeable everywhere. Any large number at any time This can be understood by looking at any test. Apparently the result of these events By no means do I want to convey that I recommend for the mother to be inactive. Known as central boundary theory. But the idea was dangerous to some people. Pavel Nekrasov, originally a theologian, Later he studied mathematics and he Was a strong religious promoter of free will.
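The bean machine can be simulated directly: each bean's final bucket is just the count of "right" deflections in 10 independent fair coin flips, so the bucket totals follow a binomial(10, 0.5) distribution. A minimal sketch, where the function name, bean count, and seed are illustrative:

```python
import random
from collections import Counter

def galton(num_beans=10_000, levels=10, seed=1):
    """Drop beans through `levels` rows of pegs. Each collision is an
    independent fair coin flip deflecting the bean right (1) or left (0);
    the bucket index is the number of right deflections."""
    rng = random.Random(seed)
    buckets = Counter()
    for _ in range(num_beans):
        rights = sum(rng.random() < 0.5 for _ in range(levels))
        buckets[rights] += 1
    return buckets

buckets = galton()
# The middle bucket (5 rights out of 10) is the most common,
# since C(10,5) / 2**10 ~ 0.246 is the largest binomial weight.
for k in sorted(buckets):
    print(k, "#" * (buckets[k] // 100))
```

The printed histogram traces out the familiar bell-shaped curvature the passage describes.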

He didn't like the idea of our having this predetermined statistical fate. He made a famous claim that independence is a necessary condition for the law of large numbers, since independence describes only toy examples like these, using beans or dice, where the outcome of previous events does not change the probability of current or future events. However, as we can all relate, most things in the physical world clearly depend on prior outcomes, such as the chance of fire or sun, or even our life expectancy. When the probability of an event depends, or is conditional, on previous events, we say they are dependent events, or dependent variables. This claim angered another Russian mathematician, Andrey Markov, who maintained a very public animosity towards Nekrasov. He went on to say in a letter that "this circumstance prompts me to explain in a series of articles that the law of large numbers can apply to dependent variables," using a construction which, he bragged, Nekrasov could not even dream about.

Markov extended Bernoulli's results to dependent variables using an ingenious construction. Imagine a coin flip which isn't independent but depends on the previous outcome, so that it has a short-term memory of one event. This can be visualized with a hypothetical machine containing two cups, which we call states. In one state we have a 50-50 mix of white and black pebbles, while in the other we have more black pebbles than white. One cup we can call state zero: it represents a black pebble having previously occurred. The other cup we can call state one: it represents a white pebble having previously occurred. To run the machine, we simply start in a random state and make a selection; then we move to state zero or state one, depending on that event. Based on the outcome of the selection, we output a zero if the pebble is black, or a one if it is white. With this two-state machine, we can identify four possible transitions. If we are in state zero and a black pebble occurs, we loop back to the same state and select again. If a white pebble is selected, we jump to state one, which can also loop back on itself, or jump back to state zero if a black pebble is chosen.
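The two-cup machine can be sketched as a short simulation. This is a minimal sketch, not the historical construction itself: the exact pebble mixes (70-30 black/white in cup zero, 50-50 in cup one) and the function name are illustrative assumptions, since the passage only says one cup holds more black pebbles than white.

```python
import random

def run_machine(steps, seed=2):
    """Simulate the hypothetical two-cup machine. Cup 0 (entered after a
    black pebble) is assumed to hold 70% black / 30% white; cup 1 (entered
    after a white pebble) holds a 50-50 mix. Each draw outputs the pebble
    colour (0 = black, 1 = white) and decides which cup we draw from next."""
    rng = random.Random(seed)
    p_white = {0: 0.3, 1: 0.5}   # chance of drawing white in each state
    state = rng.randint(0, 1)    # start in a random state
    outputs = []
    for _ in range(steps):
        white = rng.random() < p_white[state]
        outputs.append(1 if white else 0)
        state = 1 if white else 0  # the next draw depends on this outcome
    return outputs

seq = run_machine(20)  # a sequence of dependent 0/1 outcomes
```

Each output is a coin flip with one event of memory: the four possible transitions (0→0, 0→1, 1→1, 1→0) are exactly the loops and jumps described above.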

The probability of selecting white versus black is clearly not independent here, since it depends on the previous outcome. But Markov proved that as long as every state in the machine is reachable, when you run these machines in a sequence, they reach equilibrium. That is, no matter where you start, once the sequence begins, the fraction of times you visit each state converges to a specific ratio, or probability.
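Markov's equilibrium claim can be checked numerically: start the two-cup machine in state zero or in state one, and the long-run fraction of visits to each state settles on the same value. A minimal sketch, reusing illustrative pebble mixes (white is drawn with probability 0.3 in state zero and 0.5 in state one, an assumption, not a figure from the passage); for a two-state chain the expected equilibrium fraction of state one is 0.3 / (0.3 + 0.5) = 0.375:

```python
import random

def fraction_in_state_one(start, steps=200_000, seed=3):
    """Run the two-cup machine from a fixed starting state and return
    the fraction of steps spent in state 1. Drawing a white pebble
    (probability 0.3 in state 0, 0.5 in state 1) moves the machine
    to state 1; drawing black moves it to state 0."""
    rng = random.Random(seed)
    p_white = {0: 0.3, 1: 0.5}
    state = start
    visits_one = 0
    for _ in range(steps):
        white = rng.random() < p_white[state]
        state = 1 if white else 0
        visits_one += state
    return visits_one / steps

# Both starting points settle near the same equilibrium,
# pi(1) = 0.3 / (0.3 + 0.5) = 0.375, regardless of where we began.
print(fraction_in_state_one(0))
print(fraction_in_state_one(1))
```

This is the heart of Markov's rebuttal: the outcomes are dependent, yet the visit frequencies still converge on fixed proportions.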

This simple example disproved Nekrasov's claim that only independent events could converge on predictable distributions. And the concept of modeling sequences of random events using states and transitions between states became known as a Markov chain. One of the first and most famous applications of Markov chains was published by Claude Shannon.
