
Video transcript

Now, Shannon had just finished developing his theories related to cryptography, and therefore was well aware that human communication was a mix of randomness and statistical dependencies: letters in our messages were obviously dependent on previous letters to some extent. And in 1948 he published a groundbreaking paper, "A Mathematical Theory of Communication," and in it he uses Markov models as the basis for how we can think about communication.

He starts with a toy example. Imagine you encounter a bunch of text written in an alphabet of A, B, and C. Perhaps you know nothing about this language, though you notice A's seem to clump together while B's and C's do not. He then shows that you could design a machine to generate similar-looking text using a Markov chain. He starts off with a zeroth-order approximation, which means we just independently select a symbol, A, B, or C, at random and form a sequence. However, notice that this sequence doesn't look like the original. He shows then that you could do a bit better with a first-order approximation, where the letters are chosen independently but according to the probability of each letter in the original sequence. This is slightly better, as A's are now more likely, but it still doesn't capture much structure.

The next step is key. A second-order approximation takes into account each pair of letters which can occur, and in this case we need three states: the first state represents all pairs which begin with A, the second all pairs which begin with B, and the third all pairs which begin with C. And notice now that the A cup has many AA pairs, which makes sense, since the conditional probability of an A after an A is higher in our original message. Now we can generate a sequence using this second-order model easily, as follows: we start anywhere and pick a tile, and we write down, or output, the first letter and move to the cup defined by the second letter. Then we pick a new tile and repeat this process indefinitely. Notice that this sequence is starting to look very similar to the original message, because this model is capturing the conditional dependencies between letters. And if we wanted to do even better, we could move to a third-order approximation, which takes into account groups of three letters, or trigrams, and in this case we would need nine states.

But next, Shannon applies this exact same logic to actual English text, using statistics that were known for letters, pairs, trigrams, and so on. And he shows the same progression from zeroth-order random letters to first-order, second-order, and third-order sequences. He then goes on and tries the same thing using words instead of letters, and he writes that the resemblance to ordinary English text increases quite noticeably at each depth. Indeed, these machines were producing meaningless text, though it contained approximately the same statistical structure you'd see in actual English. Shannon then proceeds to define a quantitative measure of information, as he realizes that the amount of information in some message must be tied up in the design of the machine which could be used to generate similar-looking sequences, which brings us to his concept of entropy.
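The progression from zeroth-order to second-order text described above can be illustrated with a short program. This is only a minimal sketch, not anything from Shannon's paper: the toy source string, the function names, and the output length are assumptions made for illustration, chosen so that A's clump together the way the transcript describes.

import random
from collections import defaultdict

# Toy source text over the alphabet {A, B, C}; the particular string is
# an illustrative assumption in which A's tend to clump together.
SOURCE = "AAACABAAACACAABAAACACABAAAACAAB"

def zeroth_order(length):
    # Pick each symbol uniformly at random, ignoring the source statistics.
    return "".join(random.choice("ABC") for _ in range(length))

def first_order(text, length):
    # Pick symbols independently, but with the single-letter frequencies
    # of the source text.
    return "".join(random.choice(text) for _ in range(length))

def second_order(text, length):
    # One "cup" (state) per letter, holding the second letter of every
    # pair that begins with that letter.
    cups = defaultdict(list)
    for a, b in zip(text, text[1:]):
        cups[a].append(b)
    # Start anywhere, output the current letter, then move to the cup
    # named by the letter that followed it, and repeat.
    state = random.choice(text)
    out = []
    for _ in range(length):
        out.append(state)
        state = random.choice(cups[state])
    return "".join(out)

print("0th order:", zeroth_order(30))
print("1st order:", first_order(SOURCE, 30))
print("2nd order:", second_order(SOURCE, 30))

Run repeatedly, the second-order output tends to show the same clumping of A's as the source, because each draw is conditioned on the previous letter, which is exactly the dependency the two earlier approximations miss.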