
Chapter 4: Information Theory

Learning Objectives
LO 4.1 – Understand discrete and continuous messages, message sources, and the amount of information and its measure.
LO 4.2 – Discuss the probabilistic behaviour of a source of information.
LO 4.3 – Illustrate the properties of a discrete memoryless channel and mutual information.
LO 4.4 – Analyze the intrinsic ability of the communication channel to convey information reliably.

4.1.1 Discrete and Continuous Messages
[Figure: An Analog Discrete-Time Signal, s(t) versus t]
[Figure: An Analog Continuous-Time Signal, s(t) versus t]

… Discrete and Continuous Messages
[Figure: A Digital Discrete-Time Signal, s(t) versus t]
[Figure: A Digital Continuous-Time Signal, s(t) versus t]

… Discrete and Continuous Messages
[Figure: A Digital Communication System with DMS. Discrete Memoryless Source (DMS) → Binary Source and Channel Encoder (transmitter side, symbols 0, 1) → Binary Symmetric Channel (with channel noise) → Binary Source and Channel Decoder (receiver side, symbols 0, 1) → Destination]

4.1.2 Amount of Information
Measures of Information:
Bit (logarithmic base 2)
Nat (logarithmic base e) – 1 nat = 1.44 bits
Decit or Hartley (logarithmic base 10) – 1 decit = 3.32 bits
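For reference, the standard definition behind these units (the slide's formula is an image): the amount of information in a message x_i of probability p_i is

```latex
I(x_i) = \log_2 \frac{1}{p_i}\ \text{bits}
       = \ln \frac{1}{p_i}\ \text{nats}
       = \log_{10} \frac{1}{p_i}\ \text{hartleys},
```

and the conversions follow from a change of logarithm base: 1 nat = log2 e ≈ 1.44 bits and 1 hartley = log2 10 ≈ 3.32 bits.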

4.2 Average Information and Entropy
The statistical average of the information per individual message generated by a source is known as entropy, expressed in bits per symbol.
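The defining formula, standard for a discrete memoryless source with K symbols of probabilities p_1, …, p_K (shown as an image on the slide):

```latex
H(X) = \sum_{k=1}^{K} p_k \log_2 \frac{1}{p_k}
     = -\sum_{k=1}^{K} p_k \log_2 p_k \quad \text{bits/symbol}.
```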

Concept of Information and Entropy
The information contained in a message depends on its probability of occurrence: the higher the probability of a particular message, the less information it contains, and vice versa. The entropy of a source is a measure of the average amount of information per source symbol in a long message, usually expressed in bits per symbol.
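A minimal Python sketch (illustrative only; the symbols and probabilities are assumed, not from the text) that makes both ideas concrete:

```python
import math

def self_information(p: float) -> float:
    """Information carried by a message of probability p, in bits."""
    return -math.log2(p)

def entropy(probs) -> float:
    """Average information (bits/symbol) of a discrete memoryless source."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Assumed four-symbol source: rarer messages carry more information.
probs = [0.5, 0.25, 0.125, 0.125]
for p in probs:
    print(f"p = {p:<6}  I = {self_information(p):.3f} bits")
print(f"H(X) = {entropy(probs):.3f} bits/symbol")  # 1.750 bits/symbol
```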

Properties of Entropy
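The slide's equations are images; the standard properties for a source alphabet of K symbols are:

```latex
0 \le H(X) \le \log_2 K
```

H(X) = 0 when one symbol has probability 1 (no uncertainty), and H(X) = log2 K when all K symbols are equiprobable (maximum uncertainty).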

Entropy of Binary Memoryless Source
A binary source is said to be memoryless when the successive symbols 0 and 1 it generates are statistically independent.
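Writing P(symbol 1) = p and P(symbol 0) = 1 − p, the entropy reduces to the binary entropy function (a standard result):

```latex
H(p) = -p \log_2 p - (1-p)\log_2 (1-p),
```

which equals 0 at p = 0 or p = 1 and reaches its maximum of 1 bit/symbol at p = 1/2.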

Differential Entropy
Properties of Differential Entropy
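The slide's formulas are images; for reference, the standard definition for a continuous random variable X with probability density f(x) is

```latex
h(X) = -\int_{-\infty}^{\infty} f(x) \log_2 f(x)\, dx.
```

Unlike the entropy of a discrete source, differential entropy can be negative.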

Joint Entropy
The joint entropy is the average uncertainty of the communication channel as a whole, accounting for the entropy of the channel input together with that of the channel output.
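In standard notation (the slide's formula is an image):

```latex
H(X,Y) = -\sum_{i}\sum_{j} p(x_i, y_j) \log_2 p(x_i, y_j).
```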

Conditional Entropy
Conditional entropy measures the average uncertainty remaining about the channel input after the channel output has been observed, written H(X|Y), and about the channel output after the channel input has been observed, written H(Y|X).
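The standard formula and the chain rule that ties it to the joint entropy:

```latex
H(X \mid Y) = -\sum_{i}\sum_{j} p(x_i, y_j) \log_2 p(x_i \mid y_j), \qquad
H(X,Y) = H(Y) + H(X \mid Y) = H(X) + H(Y \mid X).
```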

Average Effective Entropy
It is the difference between the entropy of the source and the conditional entropy of the message. If a discrete memoryless source generates r messages per second, then the information rate, the average information per second, is defined as R = r H(X) bits per second.
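A quick worked example with assumed numbers: a source emitting four equiprobable symbols has H(X) = log2 4 = 2 bits/symbol, so at r = 1000 symbols per second

```latex
R = r\,H(X) = 1000 \times 2 = 2000\ \text{bits per second}.
```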

Coding of Information
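The slide's coding examples are not reproduced in this transcript. As one illustration of source coding, here is a minimal Huffman coder sketch in Python (the symbols and probabilities are assumed); for this dyadic source the average codeword length exactly equals the entropy:

```python
import heapq
import math
from itertools import count

def huffman_code(probs):
    """Build a binary Huffman code for a {symbol: probability} map."""
    tie = count()  # tie-breaker so equal probabilities never compare dicts
    heap = [(p, next(tie), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}        # prefix one side with 0
        merged.update({s: "1" + w for s, w in c1.items()})  # the other with 1
        heapq.heappush(heap, (p0 + p1, next(tie), merged))
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}  # assumed source
code = huffman_code(probs)
avg_len = sum(probs[s] * len(w) for s, w in code.items())
H = -sum(p * math.log2(p) for p in probs.values())
print(code)        # {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(avg_len, H)  # 1.75 1.75 (average length meets the entropy bound)
```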

4.3 Characteristics of a Discrete Memoryless Channel
A Channel Matrix, or Probability Transition Matrix
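In standard notation (the slide's matrix is an image), a discrete memoryless channel with m input symbols and n output symbols is characterized by

```latex
P(Y \mid X) =
\begin{bmatrix}
p(y_1 \mid x_1) & \cdots & p(y_n \mid x_1) \\
\vdots & \ddots & \vdots \\
p(y_1 \mid x_m) & \cdots & p(y_n \mid x_m)
\end{bmatrix},
```

where each row sums to 1.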

Binary Symmetric Channel (BSC)
It is a binary channel which can transmit only one of two symbols (0 and 1). In a BSC, transmission is not perfect, and occasionally the receiver gets the wrong bit.
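With crossover probability p (the probability that a transmitted bit is flipped), the BSC transition matrix and its well-known capacity are

```latex
P(Y \mid X) = \begin{bmatrix} 1-p & p \\ p & 1-p \end{bmatrix}, \qquad
C_{\text{BSC}} = 1 - H(p)\ \text{bits per channel use},
```

where H(p) is the binary entropy function introduced earlier.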

Binary Erasure Channel (BEC)
[Figure: A General Model of a Binary Erasure Channel. Input symbols x0 = 0 and x1 = 1 map to output symbols y0 = 0 and y1 = 1 or to the erasure symbol ye = e, with transition probabilities p(y0|x0), p(y1|x1), p(ye|x0), and p(ye|x1).]
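Denoting the erasure probability by α, the standard transition probabilities and capacity are

```latex
p(y_0 \mid x_0) = p(y_1 \mid x_1) = 1 - \alpha, \qquad
p(y_e \mid x_0) = p(y_e \mid x_1) = \alpha, \qquad
C_{\text{BEC}} = 1 - \alpha\ \text{bits per channel use}.
```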

4.3.1 Mutual Information
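The slide's derivation is an image; the standard definition is the reduction in uncertainty about the channel input obtained by observing the channel output:

```latex
I(X;Y) = H(X) - H(X \mid Y)
       = \sum_{i}\sum_{j} p(x_i, y_j) \log_2 \frac{p(x_i, y_j)}{p(x_i)\,p(y_j)}.
```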

Properties of Mutual Information (formulas below):
Symmetrical property
Non-negative property
Joint entropy of the channel input/output
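The corresponding standard formulas, in order:

```latex
I(X;Y) = I(Y;X), \qquad I(X;Y) \ge 0, \qquad
I(X;Y) = H(X) + H(Y) - H(X,Y).
```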

4.4 Shannon’s Channel Coding Theorem
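For reference, the theorem states: for a discrete memoryless channel of capacity C, if information is transmitted at a rate

```latex
R < C,
```

then there exist coding schemes making the probability of error arbitrarily small; conversely, if R > C, reliable transmission is impossible regardless of the coding scheme used.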

Implementation of Shannon’s Channel Coding Theorem in BSC
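A minimal Python simulation (an illustrative sketch, not the book's worked example): bits are passed through a BSC with crossover probability p = 0.1, uncoded and with a rate-1/3 repetition code, showing how redundancy buys a lower error probability:

```python
import random

def bsc(bits, p):
    """Binary symmetric channel: flip each bit with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def encode_rep3(bits):
    """Rate-1/3 repetition code: transmit each bit three times."""
    return [b for b in bits for _ in range(3)]

def decode_rep3(bits):
    """Majority-vote decoding of the repetition code."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

random.seed(1)
p, n = 0.1, 100_000
msg = [random.randint(0, 1) for _ in range(n)]

# Uncoded: bit error rate approaches p = 0.1.
uncoded = sum(a != b for a, b in zip(msg, bsc(msg, p))) / n
# Coded: error rate approaches 3p^2 - 2p^3 = 0.028.
coded = sum(a != b for a, b in
            zip(msg, decode_rep3(bsc(encode_rep3(msg), p)))) / n
print(f"uncoded: {uncoded:.4f}   rate-1/3 repetition: {coded:.4f}")
```

The repetition code lowers the error rate only by sacrificing rate; Shannon's theorem promises that better codes can drive the error probability toward zero at any rate below C = 1 − H(0.1) ≈ 0.531 bits per channel use.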

4.4.1 Channel Capacity
A channel that possesses Gaussian noise characteristics is known as a Gaussian channel.
If band-limited white Gaussian noise is added linearly to the input during transmission through a channel, the noise is called additive white Gaussian noise (AWGN) and the channel is called an AWGN channel.

Shannon Channel Capacity Theorem
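The formula itself (the standard Shannon–Hartley result; the slide shows it as an image): for an AWGN channel of bandwidth B hertz and signal-to-noise power ratio S/N,

```latex
C = B \log_2\!\left(1 + \frac{S}{N}\right)\ \text{bits per second}.
```

As a quick check with assumed numbers, a B = 3000 Hz channel at S/N = 1000 (30 dB) gives C = 3000 × log2(1001) ≈ 29,900 bits per second.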

About the Author
T. L. Singal graduated from the National Institute of Technology, Kurukshetra, and post-graduated from Punjab Technical University in Electronics & Communication Engineering. He began his career with the Avionics Design Bureau, HAL, Hyderabad in 1981, working on radar communication systems. He then led an R&D group in a telecom company and successfully developed multi-access VHF wireless communication systems. He visited Germany, and executed an international assignment as Senior Network Consultant with Flextronics Network Services, Texas, USA. He was associated with Nokia, AT&T, Cingular Wireless and Nortel Networks for the optimization of 2G/3G cellular networks in the USA. Since 2003 he has been in the teaching profession at reputed engineering colleges in India. He has a number of technical research papers published in IEEE proceedings, journals, and international/national conferences. He has authored three textbooks, `Wireless Communications (2010)', `Analog & Digital Communications (2012)', and `Digital Communication (2015)', with the internationally renowned publisher McGraw-Hill Education.

THANKS!