§3 Discrete memoryless sources and their rate-distortion function: §3.1 Source coding; §3.2 Distortionless source coding theorem; §3.3 The rate-distortion function; §3.4 Distortion source coding theorem.


§3 Discrete memoryless sources and their rate-distortion function
§3.1 Source coding
§3.2 Distortionless source coding theorem
§3.3 The rate-distortion function
§3.4 Distortion source coding theorem

§3.1 Source coding. 1. Source coder. A source coder maps each symbol of the source alphabet to a codeword over the channel input alphabet; the set of codewords is the code. An extended source coder encodes blocks of m source symbols at a time. Example 3.1.

§3.1 Source coding. 2. Examples. 1) ASCII source coder: the ASCII coder maps {English symbols, commands} to {binary codes, 7 bits} over the alphabet {0,1}.

§3.1 Source coding. 2. Examples. 2) Morse source coder: source coder (1) maps the letters {A, B, …, Z} to sequences over {., —}; source coder (2) maps {., —} to binary codes over {0,1}.

§3.1 Source coding. 2. Examples. 3) Chinese telegraph coder: each character is first mapped to a four-digit decimal code (e.g. "中" → "0022"), and each digit is then encoded in binary.

§3.1 Source coding. 3. Classification of source coding: constant-length codes vs. variable-length codes; distortionless codes vs. distortion codes; uniquely decodable (UD) codes vs. non-UD codes. The code C is called uniquely decodable (UD) if each string in each C^k arises in only one way as a concatenation of codewords. That is, if τ_1 τ_2 … τ_m = σ_1 σ_2 … σ_n and each of the τ's and σ's is a codeword, then m = n and τ_i = σ_i for every i. Thus every string in C^k can be uniquely decoded into a concatenation of codewords.
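
Unique decodability can be tested mechanically with the Sardinas–Patterson algorithm; a minimal Python sketch (the two example codes are illustrative, not from the slides):

    def is_uniquely_decodable(code):
        # Sardinas-Patterson test: UD iff no dangling suffix is itself a codeword.
        codewords = set(code)

        def dangling(a, b):
            # Suffixes left over when a word of 'a' is a proper prefix of a word of 'b'.
            return {y[len(x):] for x in a for y in b
                    if y.startswith(x) and len(y) > len(x)}

        s = dangling(codewords, codewords)    # S_1
        seen = set()
        while s:
            if s & codewords:                 # ambiguity found: not UD
                return False
            seen |= s
            s = (dangling(s, codewords) | dangling(codewords, s)) - seen
        return True

    print(is_uniquely_decodable(["0", "10", "110", "111"]))  # True: a prefix code
    print(is_uniquely_decodable(["0", "01", "10"]))          # False: "010" parses two ways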

§3.1 Source coding. 3. Classification of source coding: Example 3.2.

§3.1 Source coding. 4. Parameters of source coding. 1) Average length of coding: n̄ = Σ_i p(a_i) n_i (code/sig), where n_i is the length of the codeword assigned to source symbol a_i. For extended source coding: n̄_m = Σ_i p(α_i) n_i (code/m-sigs), so per source symbol n̄ = n̄_m / m (code/sig).

§3.1 Source coding. 4. Parameters of source coding. 2) Information rate of coding: R = H(U)/n̄ (bit/code), the average information carried by each code symbol.

§3.1 Source coding. 4. Parameters of source coding. 3) Coding efficiency: η = R / R_max, the actual rate over the maximum rate R_max = log s of an s-ary code alphabet, so η = H(U) / (n̄ log s). For extended source coding: η = m·H(U) / (n̄_m log s).
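
A small Python sketch of these three parameters, assuming an illustrative three-symbol source and binary code (not taken from the slides):

    import math

    def coding_parameters(probs, lengths, s=2):
        # Average length, information rate, and efficiency of a source code.
        H = -sum(p * math.log2(p) for p in probs if p > 0)  # source entropy H(U), bits
        n_bar = sum(p * n for p, n in zip(probs, lengths))  # code symbols / source symbol
        R = H / n_bar                                       # bits / code symbol
        eta = R / math.log2(s)                              # efficiency vs. R_max = log s
        return n_bar, R, eta

    # Binary (s = 2) code with lengths 1, 2, 2 for a three-symbol source.
    n_bar, R, eta = coding_parameters([0.5, 0.25, 0.25], [1, 2, 2])
    print(n_bar, R, eta)   # 1.5, 1.0, 1.0: this code is 100% efficient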

§3.2 Distortionless source coding theorem

§3 Discrete memoryless sources and their rate-distortion function
§3.1 Source coding
§3.2 Distortionless source coding theorem
§3.3 The rate-distortion function
§3.4 Distortion source coding theorem

§3.2 Distortionless source coding theorem. Example 3.3. A binary DMS emits symbols a_1, a_2 according to its probability space. Two codes: 1) a_1 → "0", a_2 → "1"; 2) for the second extension: a_1a_1 → 0, a_1a_2 → 10, a_2a_1 → 110, a_2a_2 → 111.

§3.2 Distortionless source coding theorem. Example 3.3, code 1) (a_1 → "0", a_2 → "1"): average length of coding n̄ = 1 code/sig; rate R = H(U)/n̄ = H(U) bit/code; code efficiency η = H(U)/log 2 = H(U).

§3.2 Distortionless source coding theorem. Example 3.3, code 2), extended source coding:

pair     code   length of codeword
a_1a_1   0      1
a_1a_2   10     2
a_2a_1   110    3
a_2a_2   111    3

Average length of coding: n̄ = n̄_2 / 2, with n̄_2 = Σ_i p(α_i) n_i; rate R_2 = 2H(U)/n̄_2 (bit/code); code efficiency η_2 = 2H(U)/n̄_2 (binary alphabet, log s = 1).

§3.2 Distortionless source coding theorem. Example 3.3, m-fold extended source coding: m = 3: R_3 = … (bit/code); m = 4: R_4 = … (bit/code). As m grows, the rate R_m rises toward its maximum and the coding efficiency approaches 1.
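
The trend can be reproduced numerically, assuming source statistics p(a_1) = 3/4, p(a_2) = 1/4 (an assumption consistent with the codeword table above, not read from the transcript); a Huffman code on the m-th extension gives:

    import heapq, itertools, math

    def huffman_lengths(probs):
        # Codeword lengths of an optimal binary (Huffman) code.
        heap = [(p, [i]) for i, p in enumerate(probs)]
        lengths = [0] * len(probs)
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, ids1 = heapq.heappop(heap)
            p2, ids2 = heapq.heappop(heap)
            for i in ids1 + ids2:
                lengths[i] += 1          # each merge adds one bit to its members
            heapq.heappush(heap, (p1 + p2, ids1 + ids2))
        return lengths

    p = {"a1": 0.75, "a2": 0.25}                     # assumed source statistics
    H = -sum(q * math.log2(q) for q in p.values())   # H(U), about 0.811 bit
    for m in range(1, 5):
        blocks = list(itertools.product(p, repeat=m))
        probs = [math.prod(p[s] for s in b) for b in blocks]
        n_m = sum(q * L for q, L in zip(probs, huffman_lengths(probs)))
        print(f"m={m}: n_bar={n_m / m:.4f} code/sig, eta={H * m / n_m:.4f}")

Under this assumption the efficiency climbs from about 0.811 at m = 1 to about 0.961 at m = 2 and keeps approaching 1, as the distortionless source coding theorem predicts.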

§3.2 Distortionless source coding theorem. Theorem 3.1: If the code C is UD, its average length can be no less than the s-ary entropy of the source, that is, n̄ ≥ H_s(U) = H(U)/log s. (Theorem 11.3 in textbook)
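
A sketch of the usual argument behind Theorem 3.1, combining the log-sum inequality with Kraft's inequality (not reproduced from the slides), in LaTeX:

    \bar{n} - H_s(U)
      = \sum_i p_i n_i + \sum_i p_i \log_s p_i
      = \sum_i p_i \log_s \frac{p_i}{s^{-n_i}}
      \ge \Big(\sum_i p_i\Big) \log_s \frac{\sum_i p_i}{\sum_i s^{-n_i}}
      = -\log_s \sum_i s^{-n_i} \ \ge\ 0,

where the first inequality is the log-sum inequality and the last step uses Kraft's inequality for UD codes, Σ_i s^(-n_i) ≤ 1.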

§3.2 Distortionless source coding theorem. Theorem 3.2: Conversely, there exists a UD (indeed prefix) code whose average length satisfies n̄ < H_s(U) + 1, so an optimal UD code obeys H_s(U) ≤ n̄ < H_s(U) + 1. (Theorem 11.4 in textbook)

§3.2 Distortionless source coding theorem. Theorem 3.3: Applying Theorem 3.2 to the m-th extension gives H_s(U) ≤ n̄_m/m < H_s(U) + 1/m, so the source can indeed be represented faithfully using approximately H_s(U) s-ary symbols per source symbol. (Theorem 11.5 in textbook)

§3.2 Distortionless source coding theorem. Corollary: efficient UD codes are achievable if the rate satisfies R ≤ C, where C is the capacity of the s-ary lossless channel, C = log s.

Review. KeyWords: source coder; variable-length codes; distortionless codes; uniquely decodable codes; average length of coding; information rate of coding; coding efficiency; Shannon's Theorem 1.

Homework: 1. p. 344: … ; p. 345: 11.20.

§3 Discrete memoryless sources and their rate-distortion function
§3.1 Source coding
§3.2 Distortionless source coding theorem
§3.3 The rate-distortion function
§3.4 Distortion source coding theorem

§3.3 The rate-distortion function. 1. Introduction. Review. The distortionless source coding theorem (corollary): efficient UD codes are achievable if the rate R ≤ C (C being the capacity of the s-ary lossless channel); conversely, any sequence of (2^(nR), n) codes with error probability tending to zero must have R ≤ C. The channel coding theorem (statement 2): all rates below capacity C are achievable; specifically, for every rate R < C there exists a sequence of (2^(nR), n) codes whose maximum probability of error tends to zero.

§3.3 The rate-distortion function. 1. Introduction. Review: for distortionless coding, R ≤ C (P_E → 0 as R → C⁻). But in practice some distortion is usually tolerable, which raises two questions. Given a source distribution and a distortion measure: what is the minimum expected distortion achievable at a particular rate? And what is the minimum rate required to achieve a particular distortion?

§3.3 The rate-distortion function. 2. Distortion measure. A source symbol u_i ∈ A_U = {u_1, u_2, …, u_r} passes through the coding channel to a destination symbol v_j ∈ A_V = {v_1, v_2, …, v_s}; the distortion measure d(u_i, v_j) ≥ 0 quantifies the cost of reproducing u_i as v_j.

§3.3 The rate-distortion function. 2. Distortion measure. Let the input and output of the channel be U = (U_1, U_2, …, U_k) and V = (V_1, V_2, …, V_k) respectively, where d(U, V) = (1/k) Σ_i d(U_i, V_i). The average distortion measure is then D̄ = E[d(U, V)] = Σ_{u,v} p(u) p(v|u) d(u, v).

§3.3 The rate-distortion function. 2. Distortion measure. Example: A_U = A_V = {0,1}; source statistics p(0) = p, p(1) = q = 1-p, where p ≤ 1/2; and the Hamming distortion matrix, d(u, v) = 0 for u = v and 1 otherwise.

§3.3 The rate-distortion function. 2. Distortion measure. Example: A_U = {-1, 0, +1}, A_V = {-1/2, +1/2}; source statistics (1/3, 1/3, 1/3); and distortion matrix … .

§3.3 The rate-distortion function. 2. Distortion measure. Fidelity criterion: with the source statistics p(u) and the distortion measure d(u, v) fixed, require the average distortion to satisfy D̄ ≤ δ. Test channel: any channel transition matrix p(v|u) meeting the fidelity criterion; the set of test channels is P_δ = { p(v|u) : D̄ ≤ δ }.
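
A brief Python sketch of the fidelity criterion, assuming an illustrative binary source, Hamming distortion, and δ = 0.2 (none of these numbers are from the slides):

    import numpy as np

    def average_distortion(p_u, p_v_given_u, d):
        # D_bar = sum over u, v of p(u) p(v|u) d(u, v)
        return float(np.sum(p_u[:, None] * p_v_given_u * d))

    p_u = np.array([0.4, 0.6])                # source statistics
    d = np.array([[0.0, 1.0], [1.0, 0.0]])    # Hamming distortion matrix
    Q = np.array([[0.9, 0.1], [0.2, 0.8]])    # candidate channel p(v|u)

    delta = 0.2
    D_bar = average_distortion(p_u, Q, d)     # 0.4*0.1 + 0.6*0.2 = 0.16
    print(D_bar, D_bar <= delta)              # 0.16 True: Q is a test channel in P_delta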

§3.3 The rate-distortion function. 3. Rate-distortion function. 1) Definition. The information rate distortion function R_k(δ) for a source U with distortion measure d(U, V) is defined as R_k(δ) = min over p(v|u) ∈ P_δ of (1/k) I(U; V), the minimum taken over all test channels for blocks of length k. It is a function of the source statistics p(u), the distortion matrix D, and the real number δ.

§3.3 The rate-distortion function. 3. Rate-distortion function. Remarks: ① the function I(U;V) actually achieves its minimum value on the region P_δ, since the region is closed and bounded and I(U;V) is continuous; ② the minimum value so obtained is the counterpart of the capacity C: R(δ) minimizes I(U;V) over test channels, while C maximizes I(U;V) over input distributions; ③ if δ ≥ δ_max = min_v Σ_u p(u) d(u, v), then R(δ) = 0.
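
R(δ) rarely has a closed form, but points on the curve can be computed numerically with the Blahut–Arimoto algorithm; a minimal Python sketch, assuming an illustrative binary source with Hamming distortion (the algorithm itself is not covered in the slides):

    import numpy as np

    def blahut_arimoto_rd(p_u, d, beta, iters=500):
        # One point on the R(delta) curve; larger beta -> smaller delta, larger R.
        q_v = np.full(d.shape[1], 1.0 / d.shape[1])   # output distribution, init uniform
        for _ in range(iters):
            A = q_v * np.exp(-beta * d)               # unnormalized p(v|u)
            Q = A / A.sum(axis=1, keepdims=True)      # test channel p(v|u)
            q_v = p_u @ Q                             # induced output distribution
        delta = float(np.sum(p_u[:, None] * Q * d))   # average distortion D_bar
        R = float(np.sum(p_u[:, None] * Q * np.log2(Q / q_v)))   # I(U;V), bits
        return R, delta

    p_u = np.array([0.25, 0.75])                      # illustrative binary source
    d = np.array([[0.0, 1.0], [1.0, 0.0]])            # Hamming distortion
    for beta in (0.5, 1.0, 2.0, 4.0):
        R, delta = blahut_arimoto_rd(p_u, d, beta)
        print(f"beta={beta}: delta={delta:.3f}, R={R:.3f} bit")

Sweeping the slope parameter beta traces out the whole R(δ) curve.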

§3.3 The rate-distortion function. 3. Rate-distortion function. 2) Properties. Theorem 3.4: R(δ) is a convex function of δ, with R(0) = H(U). (Theorem 3.1 in textbook)

§3.3 The rate-distortion function. 3. Rate-distortion function. 2) Properties. Theorem 3.4: R(δ) is a convex function of δ. (Theorem 3.1 in textbook) Theorem 3.5: for a DMS, R_k(δ) = R(δ) for all k and δ. (Theorem 3.2 in textbook)

§3.3 The rate-distortion function. 2) Properties. Example (continued): A_U = A_V = {0,1}; source statistics p(0) = p, p(1) = q = 1-p, where p ≤ 1/2; with the Hamming distortion matrix the rate-distortion function is R(δ) = H(p) - H(δ) for 0 ≤ δ ≤ p, and R(δ) = 0 for δ > p.
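
A quick numeric look at this closed form, assuming the Hamming distortion above (the Blahut–Arimoto sketch from the definition produces matching points):

    import math

    def Hb(x):
        # binary entropy in bits
        return 0.0 if x in (0.0, 1.0) else -x*math.log2(x) - (1-x)*math.log2(1-x)

    def R_binary_hamming(p, delta):
        # R(delta) = H(p) - H(delta) for 0 <= delta <= p (with p <= 1/2), else 0
        return Hb(p) - Hb(delta) if delta <= p else 0.0

    p = 0.25                                  # illustrative p <= 1/2
    for delta in (0.0, 0.05, 0.1, 0.25, 0.3):
        print(f"delta={delta}: R={R_binary_hamming(p, delta):.3f} bit")
    # starts at R(0) = H(U), about 0.811 bit, and falls to 0 at delta = p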

§3.3 The rate-distortion function. 2) Properties. Example: A_U = A_V = {0, 1, …, r-1}, P{U = u} = 1/r, with distortions given by the Hamming measure, d(u, v) = 0 for u = v and 1 otherwise. The rate-distortion function is R(δ) = log r - H(δ) - δ log(r-1) for 0 ≤ δ ≤ 1 - 1/r, and 0 beyond.

§3 Discrete memoryless sources and their rate-distortion function
§3.1 Source coding
§3.2 Distortionless source coding theorem
§3.3 The rate-distortion function
§3.4 Distortion source coding theorem

§3.4 Distortion source coding theorem. 1. Distortion source coding theorem (modified from Theorem 3.4 in the textbook). Theorem 3.6 (Shannon's source coding theorem with a fidelity criterion): if R > R(δ), then for sufficiently large k there exists a source code C of length k with M ≤ 2^(kR) codewords whose average distortion is at most δ + ε; if R < R(δ), no such codes exist. In short, a source symbol can be compressed into R(δ) bits if a distortion δ is allowable.

2. Relation of shannon’s theorems §3.4 Distortion source coding theorem Source Distortion source coder Distortionless source coder Sink Distortion source decoder Distortionless source decoder channel Channel coder Channel decoder A general communication system

Review. KeyWords: distortion measure; average distortion measure; fidelity criterion; test channel; rate-distortion function; Shannon's Theorem 3.

§3.4 Distortion source coding theorem. Thinking exercise: source X has the alphabet set {a_1, a_2, …, a_{2n}} with P{X = a_i} = 1/(2n), i = 1, 2, …, 2n. The distortion measure is the Hamming distortion measure, that is, d(a_i, a_j) = 0 for i = j and 1 otherwise. Design a source coding method with δ = 1/2.
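
One possible approach, sketched in Python (an illustration, not necessarily the intended solution): reproduce each pair {a_{2i-1}, a_{2i}} by its first element. The reproduction is wrong exactly half the time, so the average Hamming distortion is 1/2, while only n reproduction symbols (log n bits instead of log 2n) are needed:

    import math, random

    n = 4                                            # illustrative alphabet half-size
    symbols = [f"a{i}" for i in range(1, 2*n + 1)]

    def encode(x):
        # Map a_{2i-1} and a_{2i} both to a_{2i-1}: n outputs, log2(n) bits each.
        i = symbols.index(x)
        return symbols[i - i % 2]

    random.seed(0)
    sample = random.choices(symbols, k=100_000)      # uniform source
    avg_d = sum(x != encode(x) for x in sample) / len(sample)
    print(f"rate: log2({n}) = {math.log2(n)} bits vs. log2({2*n}) = {math.log2(2*n)}")
    print(f"average Hamming distortion ~ {avg_d:.3f}")   # about 0.5 = delta

This meets the fidelity criterion δ = 1/2 at rate log n bits per symbol; the rate-distortion function suggests that even lower rates are achievable asymptotically.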