Published by Rashad Fereday; modified about 1 year ago

1
A review of the important bits of Part I

Quantity of information (noiseless system): a) depends on the probability of the event; b) depends on the length of the message. For an event of probability p,

I = log2(1/p) = -log2 p  bits

Average information (entropy) of a source producing many symbols with probabilities p1, p2, p3, etc.:

H = -Σ_i p_i log2 p_i  bits/symbol
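As a quick numerical illustration of both definitions (a sketch; the probabilities below are made up for the example):

```python
import math

def information(p):
    """Quantity of information of an event with probability p, in bits."""
    return -math.log2(p)

def entropy(probs):
    """Average information (entropy) of a source, in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Rarer events carry more information.
print(information(0.5))    # 1.0 bit
print(information(0.125))  # 3.0 bits
print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits/symbol
```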

2
For a binary source with symbol probabilities p and 1-p:

H = -p log2 p - (1-p) log2(1-p)

Maximum entropy is 1 bit/symbol, reached when p = 1/2.
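The binary-source entropy and its 1-bit maximum can be checked directly (a sketch):

```python
import math

def binary_entropy(p):
    """H = -p log2 p - (1-p) log2 (1-p) for a binary source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))            # 1.0 -- the maximum, at p = 1/2
print(round(binary_entropy(0.1), 3))  # 0.469 -- a biased source carries less
```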

3
Redundancy: R = 1 - H/H_max.

Conditional entropy:

H(j|i) = -Σ_i Σ_j p(i,j) log2 p(j|i)

where p(i,j) is the joint probability and p(j|i) is the conditional probability (probability of j given i), related by p(i,j) = p(i) p(j|i). If there is intersymbol influence, the average information is given by H(j|i).
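A sketch of computing H(j|i) from a table of joint probabilities (the distribution below is illustrative; when i and j are independent, H(j|i) reduces to the entropy of j alone):

```python
import math

def conditional_entropy(joint):
    """H(j|i) = -sum over i,j of p(i,j) log2 p(j|i), with p(j|i) = p(i,j)/p(i).

    `joint` maps (i, j) pairs to joint probabilities p(i, j).
    """
    p_i = {}
    for (i, _), p in joint.items():
        p_i[i] = p_i.get(i, 0.0) + p  # marginal p(i), summing the joint over j
    return -sum(p * math.log2(p / p_i[i])
                for (i, _), p in joint.items() if p > 0)

# Independent i and j: H(j|i) equals H(j) = 1 bit here.
joint = {("a", "x"): 0.25, ("a", "y"): 0.25,
         ("b", "x"): 0.25, ("b", "y"): 0.25}
print(conditional_entropy(joint))  # 1.0
```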

4
Important properties of codes

Coding in a noiseless channel: source coding (speed of transmission is the main consideration).
1. Uniquely decodable (all combinations of code words distinct);
2. Instantaneous (no code word is a prefix of another);
3. Compact (shorter code words given to the more probable symbols).
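The instantaneous (prefix) property is easy to test mechanically; a sketch:

```python
def is_instantaneous(words):
    """True iff no code word is a prefix of another.

    After sorting, any prefix relation appears between lexicographic
    neighbours, so checking adjacent pairs suffices.
    """
    words = sorted(words)
    return not any(b.startswith(a) for a, b in zip(words, words[1:]))

print(is_instantaneous(["0", "10", "11"]))  # True
print(is_instantaneous(["0", "01", "11"]))  # False: '0' is a prefix of '01'
```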

5
Important parameters: average code word length

L = Σ_i p_i l_i

where l_i is the length of the i-th code word (in binary digits), and efficiency E = H/L.

Coding methods: the Fano-Shannon method and Huffman's method.

6
Coding methods: Fano-Shannon method
1. Write the symbols in a table in descending order of probability;
2. Insert dividing lines to successively divide the probabilities into halves, quarters, etc. (or as near as possible);
3. Add a '0' and a '1' to the code at each division;
4. The final code for each symbol is obtained by reading from the first division towards the symbol.
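A minimal sketch of the procedure (where the two halves tie, a split can go either way; the worked example on the next slide makes a different but equally good choice, and the average length comes out the same):

```python
def fano_shannon(probs):
    """Fano-Shannon coding: sort by descending probability, then recursively
    split where the two halves' totals are most nearly equal, appending
    '0' to the upper half and '1' to the lower half."""
    def split(items, prefix, codes):
        if len(items) == 1:
            codes[items[0][0]] = prefix or "0"
            return
        total = sum(p for _, p in items)
        run, best, best_diff = 0.0, 1, float("inf")
        for k in range(1, len(items)):
            run += items[k - 1][1]
            diff = abs(2 * run - total)  # imbalance of the split at position k
            if diff < best_diff:
                best, best_diff = k, diff
        split(items[:best], prefix + "0", codes)
        split(items[best:], prefix + "1", codes)

    items = sorted(probs.items(), key=lambda kv: -kv[1])
    codes = {}
    split(items, "", codes)
    return codes

probs = {"s1": 0.5, "s2": 0.2, "s3": 0.1, "s4": 0.1, "s5": 0.1}
codes = fano_shannon(probs)
L = sum(probs[s] * len(w) for s, w in codes.items())
print(codes["s1"])  # '0'
print(round(L, 2))  # 2.0 digits/symbol, same as the slide's code
```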

8
s1  0.5  0
s2  0.2  100
s3  0.1  101
s4  0.1  110
s5  0.1  111

L = 0.5×1 + 0.2×3 + 3×0.1×3 = 2.0 digits/symbol; H = 1.96 bits/symbol; E = H/L = 0.98
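The slide's figures can be reproduced directly from the table (a sketch):

```python
import math

probs = [0.5, 0.2, 0.1, 0.1, 0.1]
codes = ["0", "100", "101", "110", "111"]

L = sum(p * len(c) for p, c in zip(probs, codes))  # average code length
H = -sum(p * math.log2(p) for p in probs)          # source entropy
print(round(L, 2))      # 2.0 digits/symbol
print(round(H, 2))      # 1.96 bits/symbol
print(round(H / L, 2))  # E = 0.98
```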

9
Coding methods: Huffman's method
1. Write the symbols in a table in descending order of probability;
2. Add the probabilities in pairs from the bottom and reorder;
3. Place a '0' or '1' at each branch;
4. The final code for each symbol is obtained by reading from the final merged node back towards the symbol.

10
Initial order:
S1 0.5, S2 0.2, S3 0.1, S4 0.1, S5 0.1

After merging S5 and S4 (0.1 + 0.1 = 0.2) and reordering:
S1 0.5, S2 0.2, {S5,S4} 0.2, S3 0.1

11
After merging S3 and {S5,S4} (0.1 + 0.2 = 0.3):
S1 0.5, {S3,{S5,S4}} 0.3, S2 0.2

After merging S2 and {S3,{S5,S4}} (0.2 + 0.3 = 0.5):
S1 0.5, {S2,{S3,{S5,S4}}} 0.5

12
Codes: S1: 0 S2: 11 S3: 101 S4: 1000 S5: 1001

13
L = 0.5×1 + 0.2×2 + 0.1×3 + 2×0.1×4 = 2.0 digits/symbol; H = 1.96 bits/symbol; E = H/L = 0.98
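The merging steps above can be sketched with a priority queue. Huffman codes are not unique (ties may merge in a different order than on the slides), but every choice attains the same average length:

```python
import heapq

def huffman_codes(probs):
    """Huffman coding: repeatedly merge the two least probable nodes,
    prefixing '0' to one branch and '1' to the other."""
    # Heap entries: (probability, unique tie-breaker, {symbol: partial code}).
    heap = [(p, n, {s: ""}) for n, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    n = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, n, merged))
        n += 1
    return heap[0][2]

probs = {"S1": 0.5, "S2": 0.2, "S3": 0.1, "S4": 0.1, "S5": 0.1}
codes = huffman_codes(probs)
L = sum(probs[s] * len(w) for s, w in codes.items())
print(round(L, 2))  # 2.0 digits/symbol, matching the slide
```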

14
Shannon's first theorem: matching source to channel

The coding process is sometimes known as 'matching source to channel', that is, making the output of the coder as suitable as possible for the channel. Shannon proved formally that if the source symbols are coded in groups of n, then the average length per symbol tends to the source entropy H as n tends to infinity. In consequence, a further increase in efficiency can be obtained by grouping the source symbols (in pairs, threes, etc.) and applying the coding procedure to the probabilities of the chosen groups.

15
Example: An information source produces a long sequence of three independent symbols A, B, C with probabilities 16/20, 3/20 and 1/20 respectively; 100 such symbols are produced per second. The information is to be transmitted via a noiseless binary channel which can transmit up to 100 binary digits per second. Design a suitable compact instantaneous code and find the probabilities of the binary digits produced.

source → coder → channel → decoder (100 symbols/s; binary digits 0, 1)

Coding singly, using the Fano-Shannon method:

A  16/20  0
B   3/20  10
C   1/20  11

This gives L = 0.8×1 + 0.15×2 + 0.05×2 = 1.2 digits/symbol, i.e. 120 digits/s, which exceeds the channel's 100 digits/s, so coding singly is not sufficient. The digit probabilities are p(0) = 0.95/1.2 ≈ 0.79 and p(1) ≈ 0.21.
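Checking the single-symbol figures (the rates follow from the code A→0, B→10, C→11):

```python
p = {"A": 16 / 20, "B": 3 / 20, "C": 1 / 20}
code = {"A": "0", "B": "10", "C": "11"}

digits = sum(p[s] * len(code[s]) for s in p)       # digits per symbol
zeros = sum(p[s] * code[s].count("0") for s in p)  # zeros per symbol
print(round(digits, 2))          # 1.2 -> 120 digits/s at 100 symbols/s
print(round(zeros / digits, 2))  # p(0) ~ 0.79
```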

16
Coding in pairs:

AA  0.64    0
AB  0.12    10
BA  0.12    110
AC  0.04    11100
CA  0.04    11101
BB  0.0225  11110
BC  0.0075  111110
CB  0.0075  1111110
CC  0.0025  1111111

L = 1.8675 digits per pair, i.e. about 0.93 digits/symbol, so R ≈ 93.4 digits/s, within the channel's capacity. The digit probabilities are p(0) ≈ 0.556 and p(1) ≈ 0.444. The entropy of the output stream is -(p(0) log2 p(0) + p(1) log2 p(1)) ≈ 0.99 bits, close to the maximum value of 1 bit (reached when p(0) = p(1)).
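And for the pair code, recomputing the average length, the digit probabilities, and the output entropy from the table:

```python
import math

pairs = {  # pair: (probability, Fano-Shannon code word)
    "AA": (0.64, "0"),        "AB": (0.12, "10"),
    "BA": (0.12, "110"),      "AC": (0.04, "11100"),
    "CA": (0.04, "11101"),    "BB": (0.0225, "11110"),
    "BC": (0.0075, "111110"), "CB": (0.0075, "1111110"),
    "CC": (0.0025, "1111111"),
}

L = sum(p * len(c) for p, c in pairs.values())            # digits per pair
zeros = sum(p * c.count("0") for p, c in pairs.values())  # zeros per pair
p0 = zeros / L
H_out = -(p0 * math.log2(p0) + (1 - p0) * math.log2(1 - p0))
print(round(L, 4))      # 1.8675 -> ~93.4 digits/s at 50 pairs/s
print(round(p0, 3))     # 0.556
print(round(H_out, 2))  # 0.99 bits, close to the 1-bit maximum
```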
