Presentation on theme: "Quantity of Information (noiseless system)" — Presentation transcript:

1 Quantity of Information (noiseless system)
a) Depends on the probability of the event: I = -log2(p) bits, where p is the probability of the event.
b) Depends on the length of the message.
Average information (entropy) of a source producing many symbols with probabilities p1, p2, p3, …:
H = -Σ pi log2(pi) bits/symbol
A review of the important bits of Part I.
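A minimal Python sketch of these two formulas; the symbol probabilities are illustrative (the same five-symbol source is used in the worked examples later in the deck):

```python
import math

def self_information(p):
    """Information content of an event of probability p, in bits: I = -log2(p)."""
    return -math.log2(p)

def entropy(probs):
    """Average information (entropy) of a source, in bits/symbol: H = -sum(p * log2(p))."""
    return sum(p * self_information(p) for p in probs)

probs = [0.5, 0.2, 0.1, 0.1, 0.1]        # illustrative source symbol probabilities
print(self_information(0.5))              # 1.0 bit for an event of probability 1/2
print(round(entropy(probs), 2))           # 1.96 bits/symbol
```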

2 For a binary source with symbol probabilities p and 1 - p:
H = -p log2(p) - (1 - p) log2(1 - p)
Maximum entropy is 1 bit/symbol, reached when p = 1/2.
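A short sketch of the binary-source entropy, showing the maximum of 1 bit at p = 1/2:

```python
import math

def binary_entropy(p):
    """Entropy of a binary source that emits one symbol with probability p, in bits."""
    if p in (0.0, 1.0):
        return 0.0                        # a certain event carries no information
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(p, round(binary_entropy(p), 3))   # peaks at 1.0 bit when p = 0.5
```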

3 Redundancy: a measure of how far the source entropy falls below its maximum possible value.
Conditional entropy H(j|i): if there is intersymbol influence, the average information is given by
H(j|i) = -Σi Σj p(i,j) log2 p(j|i)
where p(i,j) is the joint probability and p(j|i) is the conditional probability (the probability of j given i).
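A sketch of H(j|i) computed from a joint-probability table; the 2x2 table here is made up purely to illustrate the formula:

```python
import math

def conditional_entropy(joint):
    """H(j|i) = -sum over i,j of p(i,j) * log2(p(j|i)), with p(j|i) = p(i,j) / p(i)."""
    h = 0.0
    for row in joint:                     # row i of the joint probability table
        p_i = sum(row)                    # marginal probability p(i)
        for p_ij in row:
            if p_ij > 0:
                h -= p_ij * math.log2(p_ij / p_i)
    return h

# Hypothetical source with intersymbol influence: joint[i][j] = p(i, j).
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(round(conditional_entropy(joint), 3))   # ~0.722 bits, less than the 1 bit of an
                                              # independent equiprobable binary source
```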

4 Coding in a noiseless channel: source coding (speed of transmission is the main consideration).
Important properties of codes:
1. uniquely decodable (all combinations of code words are distinct);
2. instantaneous (no code word is a prefix of another);
3. compact (shorter code words are given to the more probable symbols).
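The instantaneous property is easy to check mechanically; a small sketch with illustrative code words:

```python
def is_instantaneous(codewords):
    """True if no code word is a prefix of another (prefix-free, hence instantaneous)."""
    for a in codewords:
        for b in codewords:
            if a != b and b.startswith(a):
                return False
    return True

print(is_instantaneous(["0", "10", "110", "111"]))   # True:  decodable digit by digit
print(is_instantaneous(["0", "01", "11"]))           # False: "0" is a prefix of "01"
```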

5 Important parameters:
Average code word length L = Σ pi li, where li is the length (in binary digits) of the code word for symbol i.
Efficiency E = H / L.
Coding methods: the Fano-Shannon method and Huffman’s method (described on the following slides).
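A sketch of these two parameters, checked against the figures of the worked example that follows (probabilities 0.5, 0.2, 0.1, 0.1, 0.1 and code word lengths 1, 3, 3, 3, 3):

```python
import math

def average_length(probs, lengths):
    """L = sum of p_i * l_i, in binary digits per symbol."""
    return sum(p * l for p, l in zip(probs, lengths))

def efficiency(probs, lengths):
    """E = H / L."""
    h = -sum(p * math.log2(p) for p in probs)
    return h / average_length(probs, lengths)

probs = [0.5, 0.2, 0.1, 0.1, 0.1]
lengths = [1, 3, 3, 3, 3]
print(round(average_length(probs, lengths), 2))   # 2.0 binary digits/symbol
print(round(efficiency(probs, lengths), 2))       # 0.98
```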

6 Coding methods: Fano-Shannon method
1. Write the symbols in a table in descending order of probability;
2. Insert dividing lines to successively divide the probabilities into halves, quarters, etc. (or as near as possible);
3. Add a ‘0’ and a ‘1’ to the code at each division (one for each group);
4. The final code for each symbol is obtained by reading the digits from the first division towards the symbol.

7 [Fano-Shannon coding table for the five-symbol example; the table itself is not reproduced in the transcript]

8 Fano-Shannon coding of a five-symbol source: s1, s2, s3, s4, s5 with probabilities 0.5, 0.2, 0.1, 0.1, 0.1 and code word lengths 1, 3, 3, 3, 3.
L = 0.5×1 + 0.2×3 + 3×0.1×3 = 2.0 binary digits/symbol
H = 1.96 bits/symbol
E = H/L = 0.98
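A sketch of the Fano-Shannon procedure from slide 6, applied to the five-symbol source above. The code words printed are one possible assignment consistent with the lengths used on this slide; only the division rule (split the probabilities as evenly as possible) is essential:

```python
def fano_shannon(symbols):
    """Fano-Shannon coding sketch. symbols: list of (name, probability). Returns {name: code word}."""
    ordered = sorted(symbols, key=lambda s: s[1], reverse=True)
    codes = {name: "" for name, _ in ordered}

    def split(group):
        if len(group) < 2:
            return
        total = sum(p for _, p in group)
        # Choose the dividing line that makes the two groups as nearly equal as possible.
        best_k, best_diff = 1, float("inf")
        running = 0.0
        for k in range(1, len(group)):
            running += group[k - 1][1]
            if abs(total - 2 * running) <= best_diff:
                best_k, best_diff = k, abs(total - 2 * running)
        top, bottom = group[:best_k], group[best_k:]
        for name, _ in top:
            codes[name] += "0"            # '0' for the upper group at this division
        for name, _ in bottom:
            codes[name] += "1"            # '1' for the lower group
        split(top)
        split(bottom)

    split(ordered)
    return codes

source = [("s1", 0.5), ("s2", 0.2), ("s3", 0.1), ("s4", 0.1), ("s5", 0.1)]
codes = fano_shannon(source)
print(codes)                                              # e.g. s1: 0, s2: 100, s3: 101, s4: 110, s5: 111
print(round(sum(p * len(codes[s]) for s, p in source), 2))  # L = 2.0 binary digits/symbol
```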

9 Coding methods: Huffman’s method
1. Write the symbols in a table in descending order of probability;
2. Add the two smallest probabilities together and reorder the list;
3. Place a ‘0’ or ‘1’ on each branch of every pairing;
4. The final code for each symbol is obtained by reading the branch digits from the final pairing back towards the symbol.

10 Huffman reduction, first step: combine the two least probable symbols, S5 and S4.
S5 (0.1), S4 (0.1), S3 (0.1), S2 (0.2), S1 (0.5) → S3 (0.1), {S5,S4} (0.2), S2 (0.2), S1 (0.5)

11 Continuing the reduction:
S3 (0.1) + {S5,S4} (0.2) → {S3,{S5,S4}} (0.3), giving S2 (0.2), {S3,{S5,S4}} (0.3), S1 (0.5);
then S2 (0.2) + {S3,{S5,S4}} (0.3) → {S2,{S3,{S5,S4}}} (0.5), leaving {S2,{S3,{S5,S4}}} (0.5) and S1 (0.5).

12 Codes: S1: 0 S2: 11 S3: 101 S4: 1000 S5: 1001

13 L = 0.5×1 + 0.2×2 + 0.1×3 + 2×0.1×4 = 2.0 binary digits/symbol
H = 1.96 bits/symbol
E = H/L = 0.98
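A heapq-based sketch of the Huffman procedure from slide 9. Ties between equal probabilities may be broken differently from the hand construction above, so the individual code words can differ from slide 12, but the average length comes out the same:

```python
import heapq
import itertools

def huffman(symbols):
    """Huffman coding sketch. symbols: list of (name, probability). Returns {name: code word}."""
    counter = itertools.count()               # tie-breaker so the heap never compares dicts
    heap = [(p, next(counter), {name: ""}) for name, p in symbols]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, group0 = heapq.heappop(heap)   # the two least probable groups are paired,
        p1, _, group1 = heapq.heappop(heap)   # exactly as in step 2 of the method
        merged = {name: "0" + code for name, code in group0.items()}
        merged.update({name: "1" + code for name, code in group1.items()})
        heapq.heappush(heap, (p0 + p1, next(counter), merged))
    return heap[0][2]

source = [("S1", 0.5), ("S2", 0.2), ("S3", 0.1), ("S4", 0.1), ("S5", 0.1)]
codes = huffman(source)
print(codes)
print(round(sum(p * len(codes[s]) for s, p in source), 2))   # L = 2.0, as on this slide
```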

14 Shannon’s first theorem: matching source to channel
The coding process is sometimes known as ‘matching the source to the channel’, that is, making the output of the coder as suitable as possible for the channel.
Shannon proved formally that if the source symbols are coded in groups of n, then the average length per source symbol tends to the source entropy H as n tends to infinity. In consequence, a further increase in efficiency can be obtained by grouping the source symbols (in pairs, threes, etc.) and applying the coding procedure to the probabilities of the groups; a sketch of this effect follows.
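Rather than building the compact code for every block length, this sketch uses Shannon code word lengths ceil(-log2 p) (a prefix code with these lengths always exists), which is enough to see the average length per symbol move towards H as the block size n grows:

```python
import math
from itertools import product

def digits_per_symbol(probs, n):
    """Average binary digits per source symbol when blocks of n symbols are coded
    with Shannon code word lengths ceil(-log2 p). Bounded above by H + 1/n."""
    blocks = [math.prod(combo) for combo in product(probs, repeat=n)]
    return sum(q * math.ceil(-math.log2(q)) for q in blocks) / n

probs = [0.8, 0.15, 0.05]                    # the A, B, C source of the next slide
h = -sum(p * math.log2(p) for p in probs)
print(round(h, 3))                           # ~0.884 bits/symbol
for n in (1, 2, 3, 4):
    print(n, round(digits_per_symbol(probs, n), 3))   # tends towards H as n grows
```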

15 Example
An information source produces a long sequence of three independent symbols A, B, C with probabilities 16/20, 3/20 and 1/20 respectively; 100 such symbols are produced per second. The information is to be transmitted via a noiseless binary channel which can transmit up to 100 binary digits per second. Design a suitable compact instantaneous code and find the probabilities of the binary digits produced.
source → coder → channel → decoder (100 symbols/s into the coder; binary digits 0, 1 on the channel)
P(A) = 16/20, P(B) = 3/20, P(C) = 1/20
Coding singly, using the Fano-Shannon method:
A  16/20  0
B  3/20   10
C  1/20   11
P(0) = 0.73, P(1) = 0.27

16 Coding in pairs: the nine pairs in descending order of probability are AA, AB, BA, AC, CA, BB, BC, CB, CC.
L = 1.865 binary digits per pair, so R = 93.25 binary digits/s; p(0) = 0.547, p(1) = 0.453.
The entropy of the output stream is -(p(0) log2 p(0) + p(1) log2 p(1)) = 0.993 bits, close to the maximum value of 1 bit (reached when p(0) = p(1)).
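A sketch of the pair probabilities behind this slide, computed from P(A), P(B), P(C). Note that coding the symbols singly needs at least 1.2 binary digits per symbol, i.e. 120 digits/s, more than the 100 digit/s channel can carry, so grouping into pairs is needed to fit the channel. The 1.865 digits/pair figure comes from applying the Fano-Shannon procedure of slide 6 to these nine probabilities:

```python
import math
from itertools import product

p = {"A": 16 / 20, "B": 3 / 20, "C": 1 / 20}

# Probabilities of the nine pairs, printed in descending order.
pairs = {a + b: p[a] * p[b] for a, b in product(p, repeat=2)}
for pair, prob in sorted(pairs.items(), key=lambda kv: kv[1], reverse=True):
    print(pair, round(prob, 4))          # AA 0.64, AB 0.12, BA 0.12, AC 0.04, ...

# Entropy per pair is the lower bound on the average code word length per pair.
h_pair = -sum(q * math.log2(q) for q in pairs.values())
print(round(h_pair, 3))                  # ~1.768 bits/pair
# The slide's Fano-Shannon code achieves 1.865 digits/pair: 50 pairs/s x 1.865
# = 93.25 digits/s, within the 100 digit/s channel capacity.
```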

