# How much information does a language have? Shannon, C. E., "Prediction and Entropy of Printed English," Bell System Technical Journal, 1951.



Motivation/Skills

Redundancy

The redundancy of ordinary English, not considering statistical structure over distances greater than about eight letters, is roughly 50%. This means that when we write:

En_ _ _sh ha_f o_ w_ _t w_ w_ _te i_ dete_ _ _ _e_ b_ t_e str_ct_r_ _ f _ _ _ lang_ _ _ _ a_d h_ _f i_ c_os_n fre_ _ _

(English, half of what we write is determined by the structure of the language and half is chosen freely.)

Redundancy = 1 - H / H_max
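As a minimal sketch (not code from the paper), the redundancy formula can be evaluated directly. H_max = log2(26) assumes a 26-letter alphabet, and the 2.3 bits/letter plugged in below is an illustrative long-range entropy estimate:

```python
import math

H_MAX = math.log2(26)  # ~4.7 bits: all 26 letters equally likely

def redundancy(h):
    """Redundancy = 1 - H / H_max."""
    return 1 - h / H_MAX

# An entropy of ~2.3 bits/letter corresponds to roughly 50% redundancy.
print(round(redundancy(2.3), 2))  # → 0.51
```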

Entropy

How much information is produced, on average, for each letter?
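The definition fits in a few lines; this is a generic sketch, not code from the paper:

```python
import math

def entropy(probs):
    """H = -sum(p * log2 p): average bits of information per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # a fair coin: exactly 1 bit per toss
print(entropy([0.9, 0.1]))  # a biased coin carries less information
```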

Two French passages by Georges Perec show how a language's letter statistics can be pushed to extremes: the first (from Les Revenentes) uses e as its only vowel, while the second (apparently from La Disparition) avoids the letter e entirely. They are left in French, since their letter statistics are the point.

L Evêqe en effet est très streect: le clergé, de temps en temps, se permet de révéler ses préférences envers des événements frenchement débreedés, mets l évêqe hème qe ses fêtes respectent des règles sévères et les trensgresser, c est fréqemment reesqer de se fère relegger.

Saisi par l'inspiration, il composa illico un lai, qui, suivant la tradition du Canticum Canticorum Salomonis, magnifiait l'illuminant corps d'Anastasia : Ton corps, un grand galion où j'irai au long-cours, un sloop, un brigantin tanguant sous mon roulis, Ton front, un fort dont j'irai à l'assaut, un bastion, un glacis qui fondra sous l'aquilon du transport qui m'agit,

Single-letter frequencies in English and in a second language (apparently Spanish):

English: E 0.131, T 0.105, A 0.082, O 0.080, N 0.071, R 0.068, I 0.063, S 0.061, H 0.053, D 0.038, L 0.034, F 0.029, C 0.028, M 0.025, U ≈0.025, G 0.020, Y ≈0.020, P ≈0.020, W 0.015, B 0.014, V 0.009, K 0.004, X 0.002, J 0.001, Q ≈0.001, Z 0.0008

Spanish: E 0.13676, A 0.12529, O 0.08684, S 0.07980, R 0.06873, N 0.06712, I 0.06249, D 0.05856, L 0.04971, C 0.04679, T 0.04629, U 0.03934, M 0.03150, P 0.02505, B 0.01420, G 0.01006, Y 0.00895, V ≈0.0089, Q 0.00875, H 0.00704, F 0.00694, Z 0.00523, J 0.00443, X 0.00221, W 0.00023, K 0.00004
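Feeding the English column into the entropy formula reproduces the single-letter estimate of about 4 bits per letter. A sketch, using the frequencies as transcribed above (a few garbled values approximated):

```python
import math

# English single-letter frequencies, approximately as on the slide
english = [0.131, 0.105, 0.082, 0.080, 0.071, 0.068, 0.063, 0.061,
           0.053, 0.038, 0.034, 0.029, 0.028, 0.025, 0.025, 0.020,
           0.020, 0.020, 0.015, 0.014, 0.009, 0.004, 0.002, 0.001,
           0.001, 0.0008]

f1 = -sum(p * math.log2(p) for p in english)
print(round(f1, 2))  # close to the F1 of roughly 4 bits/letter
```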

How much information is obtained by adding one letter? Unconditionally, the next letter follows the single-letter distribution (E 0.131, T 0.105, A 0.082, ..., X 0.002, J 0.001, Z 0.0008). After seeing S, and even more after SE, the distribution of the next letter narrows, so each additional letter of context makes the next one less surprising.
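The information carried by one observed letter is its surprisal, -log2 p: common letters carry little information, rare ones a lot. A small sketch using the frequencies above:

```python
import math

def surprisal(p):
    """Bits of information conveyed by observing a symbol of probability p."""
    return -math.log2(p)

print(round(surprisal(0.131), 2))   # E: common, low information → 2.93
print(round(surprisal(0.0008), 2))  # Z: rare, high information → 10.29
```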

Bits per letter:

| n | F_n (bits per letter) |
|---|---|
| 0 | 4.75 |
| 1 | 4.03 |
| 2 | 3.32 |
| 3 | 3.1 |

Third-order approximation: IN NO IST LAT WHEY CRATICT FROURE BIRS GROCID PONDENOME OF DEMONSTURES OF THE REPTAGIN IS REGOACTIONA OF CRE.
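Samples like the one above come from a letter-level Markov model. Here is a hedged sketch of a third-order (trigram) generator, where each letter is drawn according to the two letters before it; the toy corpus is a placeholder, not from the slides:

```python
import random
from collections import defaultdict

def third_order_sample(corpus, length=60):
    """Generate text following the trigram statistics of the corpus:
    each symbol is conditioned on the previous two."""
    follow = defaultdict(list)
    for i in range(len(corpus) - 2):
        follow[corpus[i:i + 2]].append(corpus[i + 2])
    out = list(corpus[:2])
    for _ in range(length - 2):
        # fall back to a space if the current pair was never seen
        nxt = random.choice(follow.get("".join(out[-2:]), [" "]))
        out.append(nxt)
    return "".join(out)

sample = third_order_sample("the theory of the thing that the author thought " * 4)
print(sample)
```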

The most frequent words:

| # | Word | Probability |
|---|------|-------------|
| 1 | the | 0.071 |
| 2 | of | 0.034 |
| 3 | and | 0.030 |

A small vocabulary covers most of running text:

| Vocabulary size (no. of lemmas) | % of content in OEC | Example lemmas |
|---|---|---|
| 10 | 25% | the, of, and, to, that, have |
| 100 | 50% | from, because, go, me, our, well, way |
| 1,000 | 75% | girl, win, decide, huge, difficult, series |
| 7,000 | 90% | tackle, peak, crude, purely, dude, modest |
| 50,000 | 95% | saboteur, autocracy, calyx, conformist |
| >1,000,000 | 99% | laggardly, endobenthic, pomological |
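The coverage figures in the table are roughly what the classic Zipf approximation p(rank) ≈ 0.1/rank predicts. A sketch (the 0.1/rank form is the textbook approximation, not a figure taken from the OEC):

```python
def zipf_coverage(n_words):
    """Fraction of running text covered by the n most frequent words,
    under the Zipf approximation p(rank) = 0.1 / rank."""
    return sum(0.1 / r for r in range(1, n_words + 1))

for n in (10, 100, 1000):
    print(n, round(zipf_coverage(n), 2))
```

This reproduces the 50% at 100 words and 75% at 1,000 words of the table; the approximation cannot hold for unboundedly many words, since the harmonic sum diverges.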

| # | Word | Probability |
|---|------|-------------|
| 1 | the | 0.071 |
| 2 | of | 0.034 |
| 3 | and | 0.030 |

Zipf's Law

Is English trying to warn us? Word-frequency ranks:

| Ranks | Words |
|---|---|
| 992-995 | America, ensure, oil, opportunity |
| 2629-2634 | bush, admit, specifically, agents, smell, denied |
| 16047-16048 | arafat, unhealthy |

How to continue? Aoccdrnig to rseearch at an Elingsh uinervtisy, it deosn't mttaer in waht oredr the ltteers in a wrod are, the olny iprmoatnt tihng is that the frist and lsat ltteer is at the rghit pclae. The rset can be a toatl mses and you can sitll raed it wouthit a porbelm. Tihs is bcuseae we do not raed ervey lteter by it slef but the wrod as a wlohe.
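The effect is easy to reproduce. A sketch that scrambles only the interior letters of each word, keeping the first and last in place:

```python
import random

def scramble_word(word):
    """Keep the first and last letter, shuffle everything in between."""
    if len(word) <= 3:
        return word
    middle = list(word[1:-1])
    random.shuffle(middle)
    return word[0] + "".join(middle) + word[-1]

print(" ".join(scramble_word(w) for w in "according to research this works".split()))
```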

Revealing the statistics of the language: 2034 words start with q, while only 8 words finish with q; the digram qq almost never occurs.

Revealing the statistics of the language: in Shannon's prediction experiment, a subject guesses the text letter by letter, and under each character we record how many guesses were needed.

THERE IS NO REVERSE ON A MOTORCYCLE
1115112112111511711121321227111141111131

FRIEND OF MINE FOUND THIS OUT
861311111111111621111112111111

RATHER DRAMATICALLY THE OTHER DAY
41111111151111111111161111111111111

The same letter can require very different numbers of guesses depending on its context: the four occurrences of R in the last sentence took 4, 1, 1, and 1 guesses.

[Figure: number of times guessed vs. position of the guessed letter]

What is the probability of finding the number 1 in the third position?

THE 1 1 1
REV 1 1 5
ERS 1 1 2
MOT 1 1 2
THA 1 1 2

THE 1 1 1, ANT 3 1 3, ERS 1 1 2, MOT 1 1 2, HER 2 2 2
THA 1 1 2, HEN 1 1 3, ERS 1 1 2
TH_ 1 1 3, AN_ 3 1 2, HE_ 2 2 1
REV 1 1 5, ERS 1 1 2, MOT 1 1 2, AND 3 1 1

Probability of finding the number i in position n.
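Shannon's key observation is that the guess-number sequence is an encoding of the text: anyone with the same predictive model can reconstruct the letters from the numbers, so the entropy of the numbers upper-bounds the entropy of the text. A sketch, using the digits transcribed from the first sentence above (the transcription may be imperfect):

```python
import math
from collections import Counter

def per_symbol_entropy(seq):
    """Empirical entropy (bits per symbol) of a sequence."""
    freq = Counter(seq)
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in freq.values())

# Guess counts for "THERE IS NO REVERSE ON A MOTORCYCLE", as on the slide
guesses = [1, 1, 1, 5, 1, 1, 2, 1, 1, 2, 1, 1, 1, 5, 1, 1, 7, 1, 1, 1,
           2, 1, 3, 2, 1, 2, 2, 7, 1, 1, 1, 1, 4, 1, 1, 1, 1, 1, 3, 1]
print(round(per_symbol_entropy(guesses), 2))  # far below log2(26) ≈ 4.7 bits
```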

Bounds

For the text THERE IS NO REVERSE ON A MOTORCYCLE:
F0: all letters have the same probability
F1: each letter has its own probability
F2: correlations between pairs of letters

For the guess-count sequence 1115112112111511711121321227111141111131:
F0: all numbers have the same probability
F1: each number has its own probability
F2: correlations between pairs of numbers
... and so on up to F_N.
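The bounds F0, F1, F2, ... can be estimated from n-gram counts: F_n is the block entropy of n-grams minus that of (n-1)-grams, with F0 simply log2 of the alphabet size. A generic sketch (not Shannon's original procedure):

```python
import math
from collections import Counter

def block_entropy(text, n):
    """Entropy (bits) of the empirical distribution of n-letter blocks."""
    if n == 0:
        return 0.0
    counts = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def F(text, n):
    """F_n = H_n - H_(n-1): entropy of a letter given n-1 predecessors.
    For n = 0, use log2(alphabet size)."""
    if n == 0:
        return math.log2(len(set(text)))
    return block_entropy(text, n) - block_entropy(text, n - 1)
```

On a long English corpus this produces the decreasing sequence F0 ≥ F1 ≥ F2 ≥ ... that approaches the entropy of the language, matching the 4.75, 4.03, 3.32, 3.1 pattern of the tables.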

Bounds

Entropy

[Worked numeric example, garbled in the transcript; the computation leads to an estimate of about 1.3 bits per letter.]

Bounds

Redundancy ≈ 75%

| n | F_n (bits per letter) |
|---|---|
| 0 | 4.75 |
| 1 | 4.03 |
| 2 | 3.32 |
| 3 | 3.1 |
