
Huffman Code and Shannon Fano Code Problems By L.SARAVANAN, AP/ECE, Rajalakshmi Institute of Technology, Chennai, TN, India.


1 Huffman Code and Shannon Fano Code Problems By L.SARAVANAN, AP/ECE, Rajalakshmi Institute of Technology, Chennai, TN, India

2 Consider a source with 5 messages having the probabilities 0.3, 0.25, 0.2, 0.15, 0.1. Calculate the average length, entropy, and efficiency using the Huffman code. Arrange the symbols in decreasing order of their probabilities:

Symbol  Probability
A1      0.3
A2      0.25
A3      0.2
A4      0.15
A5      0.1

3 Huffman reduction table. At each stage, combine the two smallest probabilities into one node (assigning bit 0 to one branch and bit 1 to the other) and re-sort:

Stage 1   Stage 2   Stage 3   Stage 4
0.3       0.3       0.45      0.55
0.25      0.25      0.3       0.45
0.2       0.25      0.25
0.15      0.2
0.1
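The pairwise merging in the reduction table can be sketched in code. Below is a minimal Python sketch (my own illustration, not from the slides) that builds the Huffman codeword lengths with a min-heap; the function and variable names are assumptions:

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Repeatedly merge the two least probable nodes (as in the
    reduction table) and return each symbol's codeword length."""
    tie = count()  # tie-breaker so the heap never compares dicts
    heap = [(p, next(tie), {sym: 0}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)
        # every symbol inside the merged node gains one bit
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, next(tie), merged))
    return heap[0][2]

probs = {"A1": 0.3, "A2": 0.25, "A3": 0.2, "A4": 0.15, "A5": 0.1}
lengths = huffman_lengths(probs)
avg = sum(probs[s] * lengths[s] for s in probs)
print(sorted(lengths.values()))  # [2, 2, 2, 3, 3]
print(avg)                       # ≈ 2.25 bits/symbol
```

Ties between equal probabilities may be merged in a different order than the table shows, but the multiset of codeword lengths, and hence the average length, comes out the same.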

5 Calculate the average length:

L(z) = Σ L(ai) P(ai)
     = 0.3×2 + 0.25×2 + 0.2×2 + 0.15×3 + 0.1×3
     = 2.25 bits/symbol

Probability  Codeword  Length
0.3          00        2
0.25         10        2
0.2          11        2
0.15         010       3
0.1          011       3

6 Calculate the entropy:

H(z) = Σ P(ai) log2(1/P(ai))
     = 0.3 log2(1/0.3) + 0.25 log2(1/0.25) + 0.2 log2(1/0.2) + 0.15 log2(1/0.15) + 0.1 log2(1/0.1)
     ≈ 2.23 bits/symbol

7 Efficiency:

η = H(z) / L(z) = 2.228 / 2.25 ≈ 0.9903

Efficiency ≈ 99.03%
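The three quantities can be checked numerically. A small Python sketch (my own, using the probabilities and Huffman codeword lengths from the tables above):

```python
from math import log2

probs = [0.3, 0.25, 0.2, 0.15, 0.1]
lengths = [2, 2, 2, 3, 3]  # Huffman codeword lengths from the table above

L = sum(p * l for p, l in zip(probs, lengths))  # average length
H = sum(p * log2(1 / p) for p in probs)         # entropy
efficiency = H / L
print(round(L, 3), round(H, 3), round(efficiency, 4))  # 2.25 2.228 0.9903
```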

8 Consider a source with 7 messages having the probabilities below. Calculate the average length, entropy, and efficiency. Arrange the symbols in decreasing order of their probabilities:

Symbol       A1    A2    A3     A4      A5     A6      A7
Probability  0.25  0.25  0.125  0.0625  0.125  0.0625  0.125

Sorted:
A1  0.25
A2  0.25
A3  0.125
A7  0.125
A5  0.125
A6  0.0625
A4  0.0625

9 Huffman reduction table (combine the two smallest probabilities at each stage, assigning bits 0 and 1 to the merged branches):

Stage 1   Stage 2   Stage 3   Stage 4   Stage 5   Stage 6
0.25      0.25      0.25      0.25      0.5       0.5
0.25      0.25      0.25      0.25      0.25      0.5
0.125     0.125     0.25      0.25      0.25
0.125     0.125     0.125     0.25
0.125     0.125     0.125
0.0625    0.125
0.0625

10 Calculate the average length:

L(z) = Σ L(ai) P(ai)
     = 0.25×2 + 0.25×2 + 0.125×3 + 0.125×3 + 0.125×3 + 0.0625×4 + 0.0625×4
     = 2.625 bits/symbol

11 Calculate the entropy:

H(z) = Σ P(ai) log2(1/P(ai))
     = 0.25 log2(1/0.25) + 0.25 log2(1/0.25) + 0.125 log2(1/0.125) + 0.125 log2(1/0.125) + 0.125 log2(1/0.125) + 0.0625 log2(1/0.0625) + 0.0625 log2(1/0.0625)
     = 2×0.5 + 3×0.375 + 2×0.25
     = 2.625 bits/symbol

(Every probability is a negative power of 2, so the entropy works out exactly.)
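Because every probability in this source is a negative power of 2, each Huffman codeword length equals log2(1/p) exactly, so the entropy and the average length coincide. A quick numerical check (my own sketch, not from the slides):

```python
from math import log2

probs = [0.25, 0.25, 0.125, 0.125, 0.125, 0.0625, 0.0625]
lengths = [2, 2, 3, 3, 3, 4, 4]  # Huffman lengths from the tree above

H = sum(p * log2(1 / p) for p in probs)         # entropy
L = sum(p * l for p, l in zip(probs, lengths))  # average length
print(H, L, H / L)  # 2.625 2.625 1.0
```

These values are exact in floating point because all the probabilities and logarithms involved are exact binary fractions.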

12 Efficiency:

η = H(z) / L(z) = 2.625 / 2.625 = 1

Efficiency = 100%. The Huffman code is exactly optimal here because every probability is a negative power of 2, so each codeword length equals log2(1/P(ai)).

13 Shannon-Fano Coding

A Shannon-Fano tree is built according to a specification designed to define an effective code table. The actual algorithm is simple:

1. For a given list of symbols, develop a corresponding list of probabilities or frequency counts so that each symbol's relative frequency of occurrence is known.
2. Sort the list of symbols by frequency, with the most frequently occurring symbols at the left and the least common at the right.
3. Divide the list into two parts, with the total frequency count of the left half as close to the total of the right half as possible.

14 Shannon-Fano Coding

4. Assign the binary digit 0 to the left half of the list and 1 to the right half. This means that the codes for the symbols in the first half will all start with 0, and the codes in the second half will all start with 1.
5. Recursively apply steps 3 and 4 to each of the two halves, subdividing groups and adding bits to the codes until each symbol has become a corresponding code leaf on the tree.

15 The steps of the Shannon-Fano algorithm are as follows:

1. List the symbols in a frequency table and sort the table by frequency.
2. Divide the table into two halves so that the groups have as nearly equal total frequencies as possible.
3. Assign 0 to the upper half and 1 to the lower half.
4. Repeat the process recursively until each symbol becomes a leaf of the tree.

16 Shannon-Fano Coding Steps

17 Example of a Shannon-Fano frequency code.

Symbol     A   B   C   D   E
Frequency  12  8   7   6   5

First division (left sum 20, right sum 18):

Group         {A, B}  {C, D, E}
Sum           20      18
Assigned bit  0       1

Second division (split {A, B} into A | B, and {C, D, E} into C | {D, E}):

Symbol  A   B   C   {D, E}
Sum     12  8   7   11
Code    00  01  10  11

18 Contd.

Third division (split {D, E}):

Symbol  D    E
Sum     6    5
Code    110  111

Final codes:

Symbol  A   B   C   D    E
Code    00  01  10  110  111
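The divisions above can be sketched as a short recursive function. This is my own illustrative Python (the names and structure are assumptions, not from the slides); it reproduces the A-E codes of this example:

```python
def shannon_fano(symbols, prefix=""):
    """symbols: list of (name, frequency) pairs, sorted by descending
    frequency. Split where the two halves' totals are closest, assign
    0 to the left group and 1 to the right, and recurse."""
    if len(symbols) == 1:
        return {symbols[0][0]: prefix or "0"}
    total = sum(f for _, f in symbols)
    best_split, best_diff, running = 1, float("inf"), 0
    for i, (_, f) in enumerate(symbols[:-1]):
        running += f
        diff = abs(total - 2 * running)  # |left sum - right sum|
        if diff < best_diff:
            best_diff, best_split = diff, i + 1
    codes = {}
    codes.update(shannon_fano(symbols[:best_split], prefix + "0"))
    codes.update(shannon_fano(symbols[best_split:], prefix + "1"))
    return codes

freqs = [("A", 12), ("B", 8), ("C", 7), ("D", 6), ("E", 5)]
codes = shannon_fano(freqs)
print(codes)  # {'A': '00', 'B': '01', 'C': '10', 'D': '110', 'E': '111'}
```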

19 Consider a source with 5 messages having the probabilities 0.3, 0.25, 0.2, 0.15, 0.1. Calculate the average length, entropy, and efficiency using the Shannon-Fano code. Arrange the symbols in decreasing order of their probabilities:

Symbol  Probability
A1      0.3
A2      0.25
A3      0.2
A4      0.15
A5      0.1

20 Shannon-Fano divisions:

Start: {A1, A2, A3, A4, A5}
First division:  {A1, A2} (sum 0.55) gets bit 0 | {A3, A4, A5} (sum 0.45) gets bit 1
Second division: A1 gets 0, A2 gets 1 (codes 00, 01); A3 (0.2) gets 0 (code 10) | {A4, A5} (sum 0.25) gets 1
Third division:  A4 gets 0 (code 110), A5 gets 1 (code 111)

After coding:
A1  0.3   00
A2  0.25  01
A3  0.2   10
A4  0.15  110
A5  0.1   111

21 Calculate the average length:

L(z) = Σ L(ai) P(ai)
     = 0.3×2 + 0.25×2 + 0.2×2 + 0.15×3 + 0.1×3
     = 2.25 bits/symbol

Probability  Codeword  Length
0.3          00        2
0.25         01        2
0.2          10        2
0.15         110       3
0.1          111       3

22 Calculate the entropy:

H(z) = Σ P(ai) log2(1/P(ai))
     = 0.3 log2(1/0.3) + 0.25 log2(1/0.25) + 0.2 log2(1/0.2) + 0.15 log2(1/0.15) + 0.1 log2(1/0.1)
     ≈ 2.23 bits/symbol

23 Efficiency:

η = H(z) / L(z) = 2.228 / 2.25 ≈ 0.9903

Efficiency ≈ 99.03%. For this source the Shannon-Fano codeword lengths match the Huffman lengths, so the two codes have the same efficiency.

