
1 Image Processing Architecture
ECE-C490, Winter 2004
Lecture 2, 1/8/2004: Lossless Coding (More)
Oleh Tretiak, Drexel University
© 2001-2004 Oleh Tretiak

2 Review: Need for Compression
Example: Fax document
- 8.5 x 11", 200 dpi: 8.5 x 11 x (200)^2 = 3.74 Mbits
- At 28.8 kbits/sec: 3,740,000 / 28,800 = 130 sec
- Typical compression = 15; with compression, about 8.7 sec
Example: Video
- 640 x 480 pictures, 24 bits/sample, 30 frames/sec
- 640 x 480 x 24 x 30 = 2.21E+08 bits/sec = 2.76E+07 bytes/sec
- A CD-ROM stores 650 Mbytes -> playing time = 650/27.6 = 23.5 sec
- With compression, 74 minutes of low (VHS) quality video can be stored on a CD-ROM
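A quick check of these figures (a Python sketch using the numbers above):

```python
# Fax example: 8.5 x 11 inch page scanned at 200 dpi, 1 bit per pixel
bits_fax = 8.5 * 11 * 200**2             # 3.74e6 bits
print(bits_fax / 28.8e3)                  # ~130 s at 28.8 kbit/s
print(bits_fax / 15 / 28.8e3)             # ~8.7 s with 15:1 compression

# Video example: 640 x 480, 24 bits/sample, 30 frames/s
bits_per_sec = 640 * 480 * 24 * 30        # ~2.21e8 bits/s
bytes_per_sec = bits_per_sec / 8          # ~2.76e7 bytes/s
print(650e6 / bytes_per_sec)              # ~23.5 s of raw video on a 650 MB CD-ROM
```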

3 Review: How is Compression Possible?
Statistical Redundancy
- Adjacent pixels are similar (spatial correlation)
- Color components are similar (spectral correlation)
- Successive frames are similar (temporal correlation)
Perceptual Redundancy
- Data can be eliminated from the signal with no visible change.
Lossy compression:
- Send an (inferior) picture that requires fewer bits.

4 Example: Compression
We have 1000 symbols from the alphabet {a b c d}
- Fixed-length coding: 2 bits per symbol, total bits = 2 x 1000 = 2000
- Variable-length coding (some symbols are more frequent): total bits = 900 + 100 + 75 + 75 = 1150
- Average bits/symbol = 1150/1000 = 1.15 < 2
- Compression = source bits/code bits = 2/1.15 = 1.74
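The totals can be reproduced with a short Python sketch. The per-symbol counts and code-word lengths below are an assumption, chosen only to be consistent with the totals on the slide:

```python
# assumed counts and code word lengths for the 1000-symbol example
counts  = {'a': 900, 'b': 50, 'c': 25, 'd': 25}   # assumption: consistent with 900 + 100 + 75 + 75
lengths = {'a': 1,   'b': 2,  'c': 3,  'd': 3}    # frequent symbols get short code words

total_symbols = sum(counts.values())                        # 1000
total_bits = sum(counts[s] * lengths[s] for s in counts)    # 1150
print(total_bits / total_symbols)                           # 1.15 bits/symbol
print(2 / (total_bits / total_symbols))                     # compression vs. 2 bits/symbol ~ 1.74
```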

5 Review: Typical Encoder System
Issues
- Constant Bit Rate vs. Variable Bit Rate
  o In a lossless encoder, the bit rate depends on the compression efficiency
  o Variable bit rate is undesirable in real-time applications
  o In a lossy encoder, the bit rate can be kept constant by varying the quality
- Single or Multiple Sample encoding
  o Multiple sample encoding is usually more efficient, but also more complex.

6 Review: Information Theory
Let the i-th symbol have probability p_i. The information of this symbol is defined as log2(1/p_i) bits, and its average contribution is p_i log2(1/p_i) bits. The entropy of the symbol set is defined as
  H = Σ_i p_i log2(1/p_i) bits
Shannon Coding Theorem: It is possible to encode a (long) sequence of symbols with H + ε bits per symbol. If the symbols are statistically independent, it is impossible to encode them with fewer than H bits per symbol.
How to do it?
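The definition translates directly into a short function (a sketch):

```python
from math import log2

def entropy(probs):
    """H = sum of p_i * log2(1/p_i) over symbols with p_i > 0, in bits/symbol."""
    return sum(p * log2(1.0 / p) for p in probs if p > 0)

# example: a 4-symbol source
print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits/symbol
```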

7 Review: Entropy and Huffman Codes
Theory
- Information
- Entropy
- Shannon Compression Theorem
Practice
- Deriving symbols from signals
- Huffman encoding
  o Coder construction
  o Encoders
  o Decoders

8 Review: Variable Length Codes
Symbol set: s_i, i = 1 … N, with symbol probabilities p_i
Code: (c_i, l_i), where c_i is a sequence of 1's and 0's of length l_i
The code words must be decodable: the transmitted bit stream is just a string of 1's and 0's, and code word boundaries are not indicated. For a decodable code, the code word boundaries can be found uniquely from the transmitted sequence.
Sufficient condition for decodability: a code is decodable if no code word is the beginning (prefix) of another code word. This is called the prefix condition.
Average code word length: L_ave = Σ_i p_i l_i. For a decodable code, L_ave ≥ H.
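A minimal sketch of checking the prefix condition and computing L_ave; the example source and code below are hypothetical:

```python
from math import log2

def is_prefix_free(codewords):
    """True if no code word is a prefix of another (sufficient for decodability)."""
    words = sorted(codewords)
    # in sorted order, a prefix relation always shows up between adjacent words
    return all(not words[i + 1].startswith(words[i]) for i in range(len(words) - 1))

def average_length(probs, codewords):
    return sum(p * len(c) for p, c in zip(probs, codewords))

probs = [0.5, 0.25, 0.125, 0.125]          # hypothetical source
code  = ['0', '10', '110', '111']          # hypothetical prefix code
print(is_prefix_free(code))                 # True
print(average_length(probs, code))          # 1.75 = H here, so L_ave >= H holds with equality
```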

9 This Lecture
Huffman codes
- How to construct codes
- Encoding and decoding algorithms
- Huffman codes with constrained length
Golomb and Rice coding
Arithmetic coding
- Coding when H < 1

10 Construction of a Huffman Code
Tree construction
- Order the symbols according to their probabilities
- Apply the contraction process to the two symbols with the lowest probability
  o Assign a new (hypothetical) symbol to these two, with probability equal to the sum of the two symbol probabilities
- Repeat until one symbol is left
Code construction
- For the two initial branches of the tree, attach bit '0' to one and '1' to the other at the end of the code word.
- For both branches
  o If the branch is a source symbol, done
  o else repeat the above process
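A sketch of this construction using a priority queue on probabilities; the example probabilities are hypothetical:

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Build a Huffman code from {symbol: probability}; returns {symbol: bit string}."""
    tick = count()                               # tie-breaker so the heap never compares dicts
    heap = [(p, next(tick), {s: ''}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, code0 = heapq.heappop(heap)       # contract the two least probable entries
        p1, _, code1 = heapq.heappop(heap)
        merged = {s: '0' + c for s, c in code0.items()}
        merged.update({s: '1' + c for s, c in code1.items()})
        heapq.heappush(heap, (p0 + p1, next(tick), merged))
    return heap[0][2]

print(huffman_code({'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}))
# e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'} (bit labels may differ, lengths will not)
```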

11 Code construction example
Contraction steps: A = {k, w}, B = {u, A}, C = {B, ?}, D = {l, r}, E = {e, C}
H = 2.55

12 Variable Length Code: Encoding
Source sequence: werule?
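A sketch of the encoding step. The code table here is an assumption: the words for w, e, r, u and l are chosen to match the lookup-table example on the later slides, and the assignments for '?' and 'k' are guesses that keep the code prefix-free:

```python
# assumed prefix code (consistent with the decoding example below)
code = {'r': '00', 'l': '01', 'e': '10', '?': '111',
        'u': '1100', 'k': '11010', 'w': '11011'}

def encode(text, code):
    """Concatenate code words; the bit stream carries no word boundaries."""
    return ''.join(code[s] for s in text)

print(encode('werule?', code))   # '11011100011000110111' under this assumed table
```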

13 Prefix code: bit-serial decoding
Algorithm steps (each group of tree-node labels ends at the decoded output symbol):
- ECBAw, Ee, Dr, ECBu, Dl, ...

14 Prefix code: table decoding
Let k be the maximum code word length. Construct a table with 2^k entries; each table location contains an input symbol and a code word length. Order the code words by binary value. A code word of length l fills 2^(k-l) entries.
Since k = 5, we use 32 table entries. Code word '00' uses 2^(5-2) = 8 entries; each of these entries holds output symbol 'r' and length 2. The next 8 entries will be for 'l', the following 8 entries for 'e', the following 4 entries for '?', etc.
Decoding: take k = 5 bits from the encoded sequence and decode them by table lookup. From the table, find the code word length l, discard only those l bits from the bits used for the lookup, and take additional bits from the encoded sequence.
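A sketch of building and using the lookup table, with the same assumed code table as above (k = 5):

```python
def build_table(code, k=5):
    """For each k-bit index store (symbol, length); a word of length l fills 2**(k-l) slots."""
    table = [None] * (1 << k)
    for symbol, word in code.items():
        l = len(word)
        base = int(word, 2) << (k - l)           # first k-bit index that starts with this word
        for i in range(1 << (k - l)):
            table[base + i] = (symbol, l)
    return table

def decode(bits, table, n_symbols, k=5):
    out, pos = [], 0
    bits = bits + '0' * k                        # pad so the last lookup always has k bits
    while len(out) < n_symbols:
        index = int(bits[pos:pos + k], 2)        # take k bits, look them up
        symbol, l = table[index]
        out.append(symbol)
        pos += l                                 # discard only l bits, keep the rest
    return ''.join(out)

code = {'r': '00', 'l': '01', 'e': '10', '?': '111',
        'u': '1100', 'k': '11010', 'w': '11011'}
table = build_table(code)
print(decode('11011100011000110111', table, 7))   # 'werule?'
```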

15 Lookup table decoding example
- First lookup: 11011. Output = 'w', l = 5; read 5 new bits from the input stream.
- Lookup: 10001. Output = 'e', l = 2; discard the initial '10' from the lookup word, read two more bits.
- Lookup: 00110. Output = 'r', l = 2; discard the initial '00', read two more bits.
- Lookup: 11000. Output = 'u', l = 4; discard the initial '1100', read 4 more bits.
- Lookup: 01101. Output = 'l', ...

16 Huffman Codes With Constrained Length
Some codes used in practice have longest code words around 20 bits: direct lookup decoding would need huge tables.
- Solution concept 1: 'escape' sequences
- Solution concept 2: limit the maximum code word length

17 Code construction - Escape
Escape sequence approach
- High-probability symbols are decoded with a lookup table of modest size.
- All low-probability symbols are lumped together: they are assigned one symbol (the escape symbol) in the high-probability table.
- To encode a low-probability symbol, send the escape symbol plus a low-probability code word (this can be another Huffman code).
- Decoding: use the high-probability table; if the escape symbol is encountered, switch to the low-probability table.
This approach uses a hierarchical set of lookup tables for decoding.

18 Rule of Thumb
If a symbol has probability p, then the length of its VLC code word should be about l = -log2 p
Examples
- p = 0.5, l = 1 bit
- p = 0.1, l = 3 bits
- p = 0.01, l = 7 bits

19 Two level hierarchy: method
S is the source; let L be the maximum code word length.
- Sort the symbols by probability, so that p_1 ≥ p_2 ≥ … ≥ p_N
- Split the source into two sets: S_1 = the symbols with p_i > 2^-L, S_2 = the remaining low-probability symbols
- Create a special symbol Q with probability equal to the sum of the probabilities in S_2
- Augment S_1 with Q to form a new set W, and design a Huffman code for this set
- Encoding: for symbols in S_1, output the code word. For symbols in S_2, send the code word for Q, then the symbol without encoding (this requires about ceil(log2 |S_2|) bits).

20 Example
H = 2.65, L_ave = 2.69
- Maximum l is 9, so the decoding table requires 512 entries
- Target maximum length 5: min p > 1/2^5 = 1/32 = 0.03125
- S_1 = (a, b, c, d, e), S_2 = (f … p)
- P(S_2) = 0.0559, min p in S_1 = 0.0513
- Both are greater than 1/32, so we expect the longest code word to be 5 bits or less

21 Example - Continued
- Build a Huffman code for a-f plus the escape symbol S_2
- If one of a-f needs encoding, send its code word
- If one of g-p needs encoding, send the code word for S_2, followed by a binary number identifying the symbol (one of 10) -> 4 bits
- Example: encode "cam": 010 00 111 0110
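A sketch of the escape-style encoder for this example. The code words for c (010), a (00) and the escape symbol (111) come from the slide; the remaining code words and the 4-bit index order for g..p are assumptions:

```python
# Huffman code for the high-probability symbols plus the escape symbol
# ('a' = 00, 'c' = 010 and escape = 111 are from the slide; the others are assumptions)
vlc = {'a': '00', 'b': '011', 'c': '010', 'd': '100', 'e': '101', 'f': '110', 'ESC': '111'}
low_prob = 'ghijklmnop'                        # the 10 symbols lumped into S_2 (assumed order)

def encode_symbol(s):
    if s in vlc:
        return vlc[s]
    # low-probability symbol: escape code, then a fixed index (10 symbols -> ceil(log2 10) = 4 bits)
    return vlc['ESC'] + format(low_prob.index(s), '04b')

print(' '.join(encode_symbol(s) for s in 'cam'))   # 010 00 1110110  (escape 111 + index 0110 for 'm')
```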

22 Performance Analysis
What is L_ave?
L_ave = (average bits for the VLC part) + P(S_2) x 4 = 2.5421 + 0.0559 x 4 = 2.766
How does this compare with the unconstrained Huffman code? For that code, L_ave = 2.694.
Other ideas?

23 Constrained Length - 2nd method
Long code words are produced by low-probability symbols.
- Idea: modify the probabilities, increasing the low values, and design a Huffman code for the modified distribution
- This approach produces codes with a lower maximum length but a larger average length
- It leads to a simpler encoder and decoder structure than the escape (hierarchical) approach, but performance may not be as good.

24 Golomb-Rice coding
Golomb: P(n) = (1 - p_0) p_0^n, n ≥ 0 (geometric distribution)
- Choose the parameter m (a common choice is m such that p_0^m ≈ 1/2)
- Given n, compute the quotient and remainder for n = mq + r
- The code word is the unary code for q (q zeroes followed by a 1), followed by the binary code for r
- Golomb-Rice: the special case m = 2^k (the only case used in practice)
Example: m = 4, n = 13 = 4*3 + 1, so q = 3, r = 1. Code for q = 0001, code for r = 01, full code word = 000101.
This distribution is of much practical use!
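A sketch of the Rice code word construction (m = 2^k, so the remainder takes exactly k bits):

```python
def golomb_rice_encode(n, k):
    """Rice code with m = 2**k: unary quotient (q zeros then a 1), then k-bit remainder."""
    m = 1 << k
    q, r = divmod(n, m)
    return '0' * q + '1' + format(r, f'0{k}b')

print(golomb_rice_encode(13, 2))   # m = 4: q = 3, r = 1 -> '0001' + '01' = '000101'
```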

25 Huffman vs. Arithmetic Code
The lowest possible L_ave for a Huffman code is 1 bit/symbol. What if H << 1?
- One option: use one code symbol for several source symbols
- Another option: arithmetic coding
Idea behind arithmetic coding:
- Represent the probability of a sequence by a binary number.

26 Arithmetic Encoding
Assume the source alphabet has values 0 and 1, with p_0 = p, p_1 = 1 - p.
A sequence of symbols s_1, s_2, … s_m is represented by a probability interval found as follows:
- Initialize: lo = 0; hi = 1
- For i = 1 to m
  o if s_i = 0
    § hi = lo + (hi - lo)*p_0
  o else
    § lo = lo + (hi - lo)*p_0
  o end
- end
Send a binary fraction x such that lo ≤ x < hi. This requires about ceil(-log2(hi - lo)) bits.

27 Arithmetic Encoding (range form)
Assume the source alphabet has values 0 and 1, with p_0 = p, p_1 = 1 - p.
A sequence of symbols s_1, s_2, … s_m is represented by a probability interval found as follows:
- Initialize: lo = 0; range = 1
- For i = 1 to m
  o if s_i = 0
    § range = range*p
  o else // s_i = 1
    § lo = lo + range*p
    § range = range*(1 - p)
  o end
- end
Send a binary fraction x such that lo ≤ x < lo + range. This requires about ceil(-log2(range)) bits.
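A sketch of the range form of the encoder in Python, using plain floating point (adequate for short sequences; practical coders renormalize to fixed precision):

```python
from math import ceil, log2

def arithmetic_encode(symbols, p0):
    """Binary source: returns (lo, range) and the bits of a fraction inside [lo, lo + range)."""
    lo, rng = 0.0, 1.0
    for s in symbols:
        if s == 0:
            rng = rng * p0                 # keep the lower p0 part of the interval
        else:
            lo = lo + rng * p0             # skip over the '0' part
            rng = rng * (1.0 - p0)
    n_bits = ceil(-log2(rng))              # enough bits so that 2**-n_bits <= range
    idx = ceil(lo * 2**n_bits)             # smallest n_bits-bit fraction >= lo is idx / 2**n_bits
    return lo, rng, format(idx, f'0{n_bits}b')

print(arithmetic_encode([1, 1, 0, 1], 0.2))   # lo ~ 0.3856, range ~ 0.1024, bits '0111'
```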

28 Arithmetic coding: example
p_0 = 0.2, source sequence is 1101
- s_1 = 1: lo = 0 + (1 - 0)*0.2 = 0.2, hi = 1
- s_2 = 1: lo = 0.2 + (1 - 0.2)*0.2 = 0.36, hi = 1
- s_3 = 0: hi = 0.36 + (1 - 0.36)*0.2 = 0.36 + 0.128 = 0.488, lo = 0.36
- s_4 = 1: lo = 0.36 + (0.488 - 0.36)*0.2 = 0.3856, hi = 0.488
Final interval width = 0.488 - 0.3856 = 0.1024
Number of bits = ceiling(-log2(0.1024)) = 4
Bits sent: 0111

29 Arithmetic coding: example (continued)
p_0 = 0.2, source sequence is 1101
Number of bits = ceiling(-log2(0.1024)) = 4
low_2 = .01100010, (low + range)_2 = .01111100
Bits sent: 0111

30 Arithmetic Decoding
We receive x, a binary fraction.
- lo = 0; hi = 1
- for i = 1 to m
  o if (x - lo) < p*(hi - lo)
    § s_i = 0
    § hi = lo + (hi - lo)*p
  o else
    § s_i = 1
    § lo = lo + (hi - lo)*p
  o end
- end

31 Arithmetic Decoding (range form)
We receive x, a binary fraction.
- lo = 0; range = 1
- for i = 1 to m
  o if (x - lo) < p*range
    § s_i = 0
    § range = p*range
  o else
    § s_i = 1
    § lo = lo + range*p
    § range = range*(1 - p)
  o end
- end

32 Arithmetic Decoding (rescaling form)
We receive x, a binary fraction.
- for i = 1 to m
  o if x < p
    § s_i = 0
    § x = x/p
  o else // x ≥ p
    § s_i = 1
    § x = (x - p)/(1 - p)
  o end
- end
Example: receive x = 0111 = 0.4375, with p = 0.2.
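The rescaling decoder as a sketch; applied to x = 0.4375 with p_0 = 0.2 it recovers the sequence 1101:

```python
def arithmetic_decode(x, p0, m):
    """Decode m binary symbols from fraction x by rescaling the code value each step."""
    out = []
    for _ in range(m):
        if x < p0:
            out.append(0)
            x = x / p0                     # zoom into the '0' part of the interval
        else:
            out.append(1)
            x = (x - p0) / (1.0 - p0)      # zoom into the '1' part
    return out

print(arithmetic_decode(0.4375, 0.2, 4))   # [1, 1, 0, 1]
```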

33 Arithmetic decoding example
Receive 0111 (x = 0.4375), decode 4 bits, p_0 = 0.2
- x = 0.4375 ≥ 0.2: s_1 = 1, x = (0.4375 - 0.2)/0.8 = 0.296875
- x = 0.296875 ≥ 0.2: s_2 = 1, x = (0.296875 - 0.2)/0.8 = 0.12109375
- x = 0.12109375 < 0.2: s_3 = 0, x = 0.12109375/0.2 = 0.60546875
- x = 0.60546875 ≥ 0.2: s_4 = 1
Decoded sequence: 1101


35 Magic Features of Arithmetic Coding
Remember I (information) = -log2 p
- p = 0.5, I = 1
- p = 0.125, I = 3
- p = 0.99, I = 0.0145 (wow!)
For high-probability symbols, less than 1 code bit per symbol!
In the encoder, -log2(hi - lo) = Σ I(symbols): the final interval width is the product of the symbol probabilities.

36 Summary: Arithmetic Coding
- Complexity: requires arithmetic (multiplications, divisions) rather than just table lookups
- Algorithms are complex; accuracy (significant bits) is tricky
- Can be made to operate incrementally
  o Both encoder and decoder can output symbols with limited internal memory
- Provides important compression savings in certain settings
- Part of standards

37 Summary: VLC
Lowest level (basis) of image compression
We talked about
- Huffman
- Golomb/Rice
- Arithmetic
Two phases
- Code design
- Encoding/decoding
More about VLC
- Adaptive coding: estimate probabilities
- There are universal coders (good non-adaptive coders) such as Lempel-Ziv (zip)

