Coding Theory: Efficient and Reliable Transfer of Information

Coding Theory: Efficient and Reliable Transfer of Information
Two main branches: Information Theory and Algebraic Coding Theory

Information Source → Encoder → Communication Channel → Decoder → Information Sink
(noise enters at the channel)
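
To make the pipeline concrete, here is a minimal illustrative sketch, not from the slides: it pushes a random bit string through the five stages, using a 3-fold repetition code as the (assumed) channel encoder and a binary symmetric channel as the noise model.

```python
import random

def channel_encode(bits, n=3):
    """Repetition code: send each source bit n times."""
    return [b for b in bits for _ in range(n)]

def noisy_channel(bits, p=0.1):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def channel_decode(bits, n=3):
    """Majority vote over each block of n received bits."""
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

random.seed(1)
source = [random.randint(0, 1) for _ in range(1000)]          # information source
sink = channel_decode(noisy_channel(channel_encode(source)))  # what the sink sees
uncoded = noisy_channel(source)                               # sending raw bits instead
print("bit errors without coding:", sum(a != b for a, b in zip(source, uncoded)))
print("bit errors with coding:   ", sum(a != b for a, b in zip(source, sink)))
```

With p = 0.1, roughly 10% of uncoded bits arrive flipped, while the repetition code cuts the error rate to about 3%: redundancy buys reliability at the cost of sending three times as many bits.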

What is Information?

Information Theory
Information is a decrease in uncertainty. But what is uncertainty?

Uncertainty
Count the number of possibilities: a choice made in 2 ways combined with an independent choice made in 3 ways gives 2 × 3 = 6 ways together. Since we prefer an additive measure, take the logarithm of the number of symbols M.
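
The logarithm is exactly what turns multiplying possibilities into adding uncertainties; worked out for the slide's own example:

```latex
U(M) = \log_2 M, \qquad
U(2 \cdot 3) = \log_2 6 = \log_2 2 + \log_2 3 \approx 1 + 1.585 = 2.585 \text{ bits}
```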

Symbols with unequal probability
A symbol occurring with probability p contributes log2(1/p) bits of uncertainty, so rare symbols carry more information than common ones.

Average uncertainty per symbol: the Shannon entropy
H = −Σ p_i log2 p_i (bits per symbol)
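
A minimal sketch of the entropy computation (illustrative, not from the slides):

```python
import math

def shannon_entropy(probs):
    """Average uncertainty in bits per symbol: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair coin: maximal uncertainty, 1 bit per symbol.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# Biased source: less uncertainty, so fewer bits needed on average.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```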

Mutual Information
The change in uncertainty about one random variable gained from knowledge of another:
I(X; Y) = H(X) − H(X | Y)
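
A small illustrative sketch (not from the slides) computing I(X;Y) from a joint distribution, using the equivalent identity I(X;Y) = H(X) + H(Y) − H(X,Y):

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint given as a 2-D list."""
    px = [sum(row) for row in joint]            # marginal of X (rows)
    py = [sum(col) for col in zip(*joint)]      # marginal of Y (columns)
    pxy = [p for row in joint for p in row]     # flattened joint
    return entropy(px) + entropy(py) - entropy(pxy)

# Perfectly correlated: knowing Y removes all uncertainty about X.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))   # 1.0
# Independent: knowing Y tells us nothing about X.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```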

Molecular Information Theory
Applied to DNA sequences: a pattern of information (a label) signals the ribosome to start translation at a given site.

Channel Capacity
The maximum rate at which information can be sent over the channel and recovered with a vanishing probability of error (the Channel Coding Theorem).
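
The theorem is general, but a standard concrete instance (not worked on the slide) is the binary symmetric channel with crossover probability p, whose capacity is C = 1 − H(p):

```python
import math

def binary_entropy(p):
    """H(p) in bits for a Bernoulli(p) variable."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel: C = 1 - H(p)."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0  (noiseless: every bit gets through)
print(bsc_capacity(0.11))  # ~0.5 (half a bit of useful info per bit sent)
print(bsc_capacity(0.5))   # 0.0  (pure noise: nothing gets through)
```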

Source Coding
Information cannot be compressed below the entropy of the source. Optimal prefix-free codes represent the source with an expected code length greater than or equal to its entropy.
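
Huffman's algorithm constructs such an optimal prefix-free code. A compact sketch follows; the function and variable names are my own, and the tree is represented implicitly by growing each symbol's codeword as subtrees merge.

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build an optimal prefix-free code from {symbol: frequency}."""
    # Heap entries: (total frequency, tie-breaker, {symbol: codeword so far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # two least frequent subtrees
        f2, _, right = heapq.heappop(heap)
        # Prepend one bit to every codeword in each merged subtree.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

text = "abracadabra"
code = huffman_code(Counter(text))
print(code)                              # e.g. {'a': '0', 'r': '10', ...}
print("".join(code[ch] for ch in text))  # encoded bitstring
```

Frequent symbols get short codewords and rare ones get long codewords, which is how the expected length approaches (but never beats) the entropy bound above.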

Rate Distortion
How many bits of information per source bit must be encoded if we allow a certain distortion (error probability)?
[Diagram: binary test channel with transition probabilities p and 1 − p]
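
The slide does not spell out the formula, but the standard result it alludes to, assuming a fair binary source under Hamming distortion, is the rate-distortion function

```latex
R(D) = 1 - H(D) = 1 + D\log_2 D + (1-D)\log_2(1-D),
\qquad 0 \le D \le \tfrac{1}{2}
```

At D = 0 the full 1 bit per source bit is required; at D = 1/2 random guessing already achieves the allowed distortion, so R = 0.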

Algorithmic Information Theory (Kolmogorov)
The descriptive complexity of an object is the length of the shortest computer program that describes it. If that length equals the length of the object itself, the object is random. Example: the digits of π look random but have a short description (a small program that generates them), so they are not.