Lecture 6 Instantaneous Codes and Kraft’s Theorem (Section 1.4)


Theory of Information Lecture 6: Instantaneous Codes and Kraft's Theorem (Section 1.4)

Instantaneous Codes

DEFINITION A code is said to be instantaneous if, whenever any sequence of codewords is transmitted, each codeword can be interpreted as soon as it is received, without waiting for later symbols.

Example: {0, 10} is instantaneous. {0, 01} is not: after receiving a 0, the decoder must wait for the next symbol to know whether the 0 stands alone or begins the codeword 01.
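The contrast between the two example codes can be sketched with a greedy decoder that emits a codeword the moment its buffer matches one. The function name and helper structure below are illustrative, not from the text:

```python
def decode_instantaneous(bits, codewords):
    """Greedy left-to-right decoding: emit a codeword as soon as the
    buffer matches one. This succeeds exactly when immediate
    interpretation is possible."""
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in codewords:  # recognized immediately -> instantaneous
            out.append(buf)
            buf = ""
    return out if buf == "" else None  # leftover bits: greedy decoding failed

# {0, 10} has the prefix property, so greedy decoding works:
print(decode_instantaneous("010100", {"0", "10"}))  # ['0', '10', '10', '0']

# {0, 01} is uniquely decipherable but NOT instantaneous: the greedy
# decoder emits '0' too early and is left holding an unmatched '1'.
print(decode_instantaneous("01", {"0", "01"}))      # None
```

Note that the failure on {0, 01} does not mean the string is undecodable, only that it cannot be decoded symbol-by-symbol without lookahead.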

Prefix and Suffix Properties

DEFINITION 1) A code has the prefix property if no codeword is a prefix of any other codeword. 2) A code has the suffix property if no codeword is a suffix of any other codeword.

Fact: Having the prefix or suffix property is sufficient for a code to be uniquely decipherable, but not necessary.

Does {0, 01} have the prefix property? The suffix property? Is it uniquely decipherable?

Does {00, 001000, 001011, 11} have the prefix property? The suffix property? Is it uniquely decipherable?
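Both properties are easy to test mechanically, which makes the exercise codes quick to check. A minimal sketch (function names are illustrative):

```python
def has_prefix_property(code):
    """True iff no codeword is a prefix of a different codeword."""
    return not any(a != b and b.startswith(a) for a in code for b in code)

def has_suffix_property(code):
    """True iff no codeword is a suffix of a different codeword."""
    return not any(a != b and b.endswith(a) for a in code for b in code)

# {0, 01}: "0" is a prefix of "01", but no codeword is a suffix of another.
print(has_prefix_property({"0", "01"}))  # False
print(has_suffix_property({"0", "01"}))  # True

# {00, 001000, 001011, 11}: "00" is both a prefix and a suffix of "001000".
print(has_prefix_property({"00", "001000", "001011", "11"}))  # False
print(has_suffix_property({"00", "001000", "001011", "11"}))  # False
```

Whether the second code is nonetheless uniquely decipherable is left, as on the slide, for the reader to decide.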

Prefix Property = Being Instantaneous

THEOREM 1.4.1 A code is instantaneous if and only if it has the prefix property.

The comma code {0, 10, 110, 1110} has the prefix property and is instantaneous. How about {0, 01, 011, 0111}: Prefix property? Instantaneous? Suffix property? Uniquely decipherable?

Kraft's Theorem

THEOREM 1.4.2 There exists an instantaneous r-ary code with codeword lengths a1, a2, …, aq if and only if Kraft's inequality is satisfied:

1/r^a1 + … + 1/r^aq ≤ 1.

Let C be an instantaneous r-ary code. Then C is maximal instantaneous, i.e. C is not contained in any strictly larger instantaneous code, if and only if equality holds in Kraft's inequality. Suppose that C is an instantaneous code with maximum codeword length m. If C is not maximal, then it is possible to add a word of length m to C without destroying its property of being instantaneous.

Kraft's Theorem (continued)

THEOREM 1.4.2 There exists an instantaneous r-ary code with codeword lengths a1, a2, …, aq if and only if Kraft's inequality is satisfied:

1/r^a1 + … + 1/r^aq ≤ 1.

Note: That a given code C satisfies Kraft's inequality does not necessarily mean that C is instantaneous; it only guarantees that some code with the same codeword lengths is instantaneous. Example: C = {0, 11, 100, 110}. Does C satisfy Kraft's inequality? Is C instantaneous? However, D = {0, 11, 101, 100} has the same codeword lengths and is instantaneous.
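The Kraft sum for the example above is easy to compute exactly. A small sketch, using exact rational arithmetic so the equality case is unambiguous (the function name is illustrative):

```python
from fractions import Fraction

def kraft_sum(lengths, r=2):
    """Sum of r^(-l) over the codeword lengths. By Theorem 1.4.2,
    an instantaneous r-ary code with these lengths exists iff this sum <= 1."""
    return sum(Fraction(1, r**l) for l in lengths)

# C = {0, 11, 100, 110} has lengths 1, 2, 3, 3:
s = kraft_sum([1, 2, 3, 3])
print(s)          # 1 -> Kraft's inequality holds, with equality
print(s <= 1)     # True
```

So C passes the Kraft test (with equality, so any instantaneous code with these lengths is maximal), yet C itself is not instantaneous, since 11 is a prefix of 110; D realizes the same lengths with the prefix property.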

The Utility of Kraft's Theorem

Kraft's theorem allows us to find an instantaneous code (if one exists) with given codeword lengths a1 ≤ a2 ≤ … ≤ aq as follows: For i = 1 to i = q, pick any codeword of length ai such that no earlier codeword is its prefix.

Example: Construct a ternary code {c1, c2, c3, c4, c5, c6} with codeword lengths 1, 1, 2, 4, 4, 5. Kraft's test to see if such a code exists: c1= c2= c3= c4= c5= c6=
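The greedy procedure above can be sketched directly; this is one possible implementation, not the book's, and the function name is illustrative:

```python
from fractions import Fraction
from itertools import product

def build_instantaneous(lengths, r=2):
    """Greedy construction from Kraft's theorem: process the lengths in
    nondecreasing order and, for each, take the first r-ary word (in
    lexicographic order) that has no earlier codeword as a prefix.
    Returns None when Kraft's inequality fails."""
    lengths = sorted(lengths)
    if sum(Fraction(1, r**l) for l in lengths) > 1:
        return None  # Kraft's test fails: no instantaneous code exists
    digits = "0123456789"[:r]
    code = []
    for l in lengths:
        for cand in ("".join(t) for t in product(digits, repeat=l)):
            if not any(cand.startswith(c) for c in code):
                code.append(cand)
                break
    return code

# The slide's ternary example, lengths 1, 1, 2, 4, 4, 5:
print(build_instantaneous([1, 1, 2, 4, 4, 5], r=3))

# Lengths 1, 1, 1 in binary fail Kraft's test (1/2 + 1/2 + 1/2 > 1):
print(build_instantaneous([1, 1, 1]))  # None
```

The Kraft inequality guarantees the inner loop always finds a candidate, so the construction never gets stuck once the test passes.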

Reasonable Uniquely Decipherable Codes Are Instantaneous

THEOREM 1.4.3 If a uniquely decipherable code exists with codeword lengths l1, l2, …, ln, then an instantaneous code also exists with these same codeword lengths.

Proof: Suppose such a uniquely decipherable code exists. Then, by McMillan's Theorem, Kraft's inequality is satisfied for those lengths. But then, by Kraft's Theorem, there must also exist an instantaneous code with the same codeword lengths.

Homework

Exercises 2 through 14 of Section 1.4.