
Lecture 3: Source Coding Theory
TSBK01 Image Coding and Data Compression
Jörgen Ahlberg, Div. of Sensor Technology, Swedish Defence Research Agency (FOI)

Outline
1. Coding & codes
2. Code trees and tree codes
3. Optimal codes; the source coding theorem

Part 1: Coding & Codes

Coding: to each source symbol (or group of symbols) we assign a codeword from an extended binary alphabet.

Types of codes:
– FIFO: fixed input (# source symbols), fixed output (# code symbols).
– FIVO: fixed input, variable output.
– VIFO: variable input, fixed output.
– VIVO: variable input, variable output.

FIVO and VIVO are called variable-length codes (VLC). They should be comma-free.

Example

Assume a memoryless source with alphabet A = {a1, …, a4} and probabilities P(a1) = 1/2, P(a2) = 1/4, P(a3) = P(a4) = 1/8.

Symbol  FIFO  FIVO
a1      00    0
a2      01    10
a3      10    110
a4      11    111
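A quick check (a hypothetical sketch, not part of the lecture) that the FIVO code above matches the source entropy exactly, which happens here because every probability is a power of 1/2:

```python
import math

# Source and FIVO code from the example above.
probs = {"a1": 0.5, "a2": 0.25, "a3": 0.125, "a4": 0.125}
fivo = {"a1": "0", "a2": "10", "a3": "110", "a4": "111"}

entropy = -sum(p * math.log2(p) for p in probs.values())   # H(X)
avg_len = sum(probs[s] * len(w) for s, w in fivo.items())  # average length L

print(entropy)  # 1.75 bits/symbol
print(avg_len)  # 1.75 bits/codeword: this VLC reaches the entropy
```

The fixed-length FIFO code spends 2 bits on every symbol; the VLC saves 0.25 bits per symbol on average by exploiting the uneven probabilities.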

Four Different Classes

Each class contains the next: all codes ⊃ non-singular codes ⊃ uniquely decodable codes ⊃ instantaneous codes.

Symbol  Singular  Non-singular  Uniquely decodable  Instantaneous
a1      0         0             10                  0
a2      0         010           00                  10
a3      0         01            11                  110
a4      0         10            110                 111

Decoding problem (non-singular code): 010 could mean a1a4, or a2, or a3a1.
Decoding problem (uniquely decodable code): 1100000000000000001… is uniquely decodable, but the first symbol (a3 or a4) cannot be decoded until the third '1' arrives (compare 11010 and 110010).
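The property that separates the instantaneous class, "no codeword is a prefix of another", is easy to test mechanically (a hypothetical helper, not from the lecture):

```python
def is_prefix_free(codewords):
    """True iff no codeword is a prefix of another (instantaneous code)."""
    for c in codewords:
        for d in codewords:
            if c != d and d.startswith(c):
                return False
    return True

print(is_prefix_free(["0", "10", "110", "111"]))  # True: instantaneous
print(is_prefix_free(["10", "00", "11", "110"]))  # False: "11" prefixes "110"
print(is_prefix_free(["0", "010", "01", "10"]))   # False: merely non-singular
```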

Data Compression

Efficient codes utilize the following properties:
– Uneven symbol probability
– Inter-symbol dependence (a source with memory)
– Acceptable distortion

Examples:
– The FIVO example above.
– "There's always an a3 after a1."
– Don't care whether it's a3 or a4.

Part 2: Code Trees and Tree Codes

Consider, again, our favourite example code {a1, …, a4} = {0, 10, 110, 111}. The codewords are the leaves in a binary code tree (0 branches left, 1 branches right):

      *
     / \
    a1  *
       / \
      a2  *
         / \
        a3  a4

Tree codes are comma-free and instantaneous. No codeword is a prefix of another!
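Because no codeword is a prefix of another, a decoder can emit a symbol the instant it reaches a leaf. A minimal sketch (hypothetical, using a reverse codeword map in place of an explicit tree walk):

```python
def decode(bits, code):
    """Instantaneous decoding: emit a symbol as soon as the buffered
    bits match a codeword, which is unambiguous for a prefix-free code."""
    inverse = {w: s for s, w in code.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inverse:       # a leaf is reached: emit immediately
            out.append(inverse[buf])
            buf = ""
    return out

code = {"a1": "0", "a2": "10", "a3": "110", "a4": "111"}
print(decode("0101100", code))  # ['a1', 'a2', 'a3', 'a1']
```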

Kraft's Inequality

For a uniquely decodable code with codeword lengths l_i we have

    Σ_i 2^(-l_i) ≤ 1.

Conversely, if this holds for a given set of codeword lengths, it is possible to construct a code tree with those lengths. (Proof: Sayood 2.4)
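The Kraft sum is a one-liner to evaluate (a hypothetical helper, not from the lecture), shown here on the two length sets the next slides ask you to try:

```python
def kraft_sum(lengths):
    """Sum of 2^(-l_i) over the codeword lengths; a tree code with these
    lengths exists iff the sum is at most 1."""
    return sum(2.0 ** -l for l in lengths)

print(kraft_sum([1, 2, 3, 3]))  # 1.0   -> a tree code exists
print(kraft_sum([1, 2, 2, 3]))  # 1.125 -> violates Kraft's Inequality
```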

Kraft's Inequality and Tree Codes

If KI holds for a set of codeword lengths, there is a tree code with those lengths.

Proof sketch: Create a maximal tree of depth equal to the longest codeword length l_max.
– The tree then has 2^l_max leaves.
– Place the codewords, cut the tree, and use KI to show that there are enough leaves.
– Let's illustrate.

l_max = 3 leads to a maximal tree with 2^l_max = 8 leaves.

– Place l_1 in the tree. The subtree below it cannot be used: cut! Then 2^(l_max − l_1) leaves disappear, and 2^l_max − 2^(l_max − l_1) = 2^l_max (1 − 2^(-l_1)) leaves remain.
– Place l_2 in the tree. Then 2^l_max (1 − 2^(-l_1) − 2^(-l_2)) leaves remain.
– After placing N codeword lengths, 2^l_max (1 − Σ_i 2^(-l_i)) leaves remain. This is possible whenever KI is valid, i.e., Σ_i 2^(-l_i) ≤ 1.

Try with {l_i} = {1, 2, 3, 3} and {l_i} = {1, 2, 2, 3}!
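The leaf-cutting construction above can be sketched in code (a hypothetical helper, not from the lecture): assigning codewords in order of increasing length and advancing a binary counter plays the role of cutting away each placed subtree.

```python
def tree_code(lengths):
    """Build codewords for the given lengths by placing leaves left to
    right in a maximal tree, shortest codewords first. Returns None if
    Kraft's Inequality fails (not enough leaves)."""
    if sum(2.0 ** -l for l in lengths) > 1:
        return None
    code, value, prev = [], 0, 0
    for l in sorted(lengths):
        value <<= (l - prev)                 # descend to depth l
        code.append(format(value, f"0{l}b")) # next free leaf at this depth
        value += 1                           # everything below is now cut
        prev = l
    return code

print(tree_code([1, 2, 3, 3]))  # ['0', '10', '110', '111']
print(tree_code([1, 2, 2, 3]))  # None: KI fails, no such tree
```

Note that {1, 2, 3, 3} reproduces exactly the favourite example code, while {1, 2, 2, 3} runs out of leaves.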

Part 3: Optimal Codes

Average codeword length: L = Σ_i p_i l_i [bits/codeword].

Minimizing L subject to Kraft's Inequality gives the optimal codeword lengths l_i = −log2 p_i, for which L = H(X): the entropy limit is reached! But what about the integer constraints? l_i = −log2 p_i is not always an integer!

The Source Coding Theorem

Assume that the source X is memoryless, and create the tree code for the extended source, i.e., blocks of n symbols. Then the average codeword length L per source symbol satisfies

    H(X) ≤ L < H(X) + 1/n.

We can come arbitrarily close to the entropy!
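A small numeric sketch of this bound (hypothetical, not from the lecture), assuming codeword lengths l(b) = ⌈−log2 P(b)⌉ for each block b of n symbols; these lengths satisfy Kraft's Inequality, so a tree code with them exists:

```python
import math
from itertools import product

p = {"0": 0.25, "1": 0.75}                      # an assumed binary source
H = -sum(q * math.log2(q) for q in p.values())  # ~0.8113 bits/symbol

rates = {}
for n in (1, 2, 4, 8):
    L = 0.0
    for block in product(p, repeat=n):          # all 2^n blocks
        pb = math.prod(p[s] for s in block)
        L += pb * math.ceil(-math.log2(pb))     # rounded-up block lengths
    rates[n] = L / n                            # bits per source symbol

print(round(H, 4))
print({n: round(r, 4) for n, r in rates.items()})  # approaches H from above
```

Each rate stays within the promised window [H, H + 1/n), so longer blocks squeeze the rate toward the entropy.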

In Practice

Two practical problems need to be solved:
– Bit assignment
– The integer constraint

Theoretically: choose l_i = ⌈−log2 p_i⌉.
– Rounding up is not always the best!
– Example: a binary source with p_1 = 0.25, p_2 = 0.75 gives l_1 = log2 4 = 2 and l_2 = ⌈−log2 0.75⌉ = 1.

Instead, use, e.g., the Huffman algorithm (D. Huffman, 1952) to create an optimal tree code!
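The Huffman idea, repeatedly merging the two least probable nodes, can be sketched as follows (a minimal illustration, not the lecture's own code):

```python
import heapq

def huffman(probs):
    """Minimal Huffman sketch: merge the two least probable subtrees
    until one tree remains. Returns {symbol: codeword}."""
    # Heap entries: (probability, unique tiebreak, {symbol: partial word}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}   # left subtree
        merged.update({s: "1" + w for s, w in c2.items()})  # right subtree
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

code = huffman({"a1": 0.5, "a2": 0.25, "a3": 0.125, "a4": 0.125})
print(sorted(len(w) for w in code.values()))  # [1, 2, 3, 3]: optimal lengths
```

On the favourite example source this recovers the optimal length set {1, 2, 3, 3}; which leaf gets 0 and which gets 1 at each merge is arbitrary, so the exact codewords may differ from {0, 10, 110, 111} while the lengths do not.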

Summary

– Coding: assigning binary codewords to (blocks of) source symbols.
– Variable-length codes (VLC) and fixed-length codes.
– Instantaneous codes ⊂ uniquely decodable codes ⊂ non-singular codes ⊂ all codes.
– Tree codes are instantaneous.
– Tree codes and Kraft's Inequality.
– The Source Coding Theorem.
