
1 Chapter 4: Information Theory

2 Learning Objectives
LO 4.1 – Understand discrete and continuous messages, message sources, and the amount of information and its measure.
LO 4.2 – Discuss the probabilistic behaviour of a source of information.
LO 4.3 – Illustrate the properties of a discrete memoryless channel and mutual information.
LO 4.4 – Analyze the intrinsic ability of a communication channel to convey information reliably.

3 4.1.1 Discrete and Continuous Messages
Figure 4.1.1 An Analog Discrete-Time Signal [plot of s(t) versus t]
Figure 4.1.2 An Analog Continuous-Time Signal [plot of s(t) versus t]

4 … Discrete and Continuous Messages
Figure 4.1.3 A Digital Discrete-Time Signal [plot of s(t) versus t]
Figure 4.1.4 A Digital Continuous-Time Signal [plot of s(t) versus t]

5 ….. Discrete and Continuous Messages
Figure 4.1.5 A Digital Communication System with DMS: a Discrete Memoryless Source (DMS) feeds a binary source and channel encoder (0, 1) on the transmitter side; the encoded bits pass through a binary symmetric channel subject to channel noise; a binary source and channel decoder (0, 1) on the receiver side delivers the message to the destination.

6 4.1.2 Amount of Information
Measures of information:
- Bit (logarithm to base 2)
- Nat (natural logarithm) – 1 nat = 1.44 bits
- Decit or Hartley (logarithm to base 10) – 1 decit = 3.32 bits
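For reference (the slide's own equations are not in the transcript), the amount of information carried by a message x_i of probability p(x_i) is, in the three units:

I(x_i) = \log_2 \frac{1}{p(x_i)} \ \text{bits} = \ln \frac{1}{p(x_i)} \ \text{nats} = \log_{10} \frac{1}{p(x_i)} \ \text{decits}

The conversion factors follow from the change of base: 1 nat = \log_2 e \approx 1.44 bits and 1 decit = \log_2 10 \approx 3.32 bits.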

7 4.2 Average Information and Entropy
The average information per individual message generated by a source, taken as a statistical average over all messages, is known as entropy, expressed in bits per symbol.
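In symbols, for a source emitting M symbols with probabilities p_1, \dots, p_M, the standard definition (the slide's own equation is not in the transcript) is:

H = \sum_{i=1}^{M} p_i \log_2 \frac{1}{p_i} \ \text{bits/symbol}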

8 Concept of Information and Entropy
The information contained in a message depends on its probability of occurrence: the higher the probability of occurrence of a particular message, the less information it contains, and vice versa. The entropy of a source is a measure of the average amount of information per source symbol in a long message, usually expressed in bits per symbol.
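A minimal Python sketch of this definition (the function name and example probabilities are illustrative, not from the slides):

```python
import math

def entropy(probs):
    """Entropy H = -sum(p * log2(p)) in bits per symbol.

    Terms with p == 0 contribute nothing (the limit of p*log2(p) as p -> 0 is 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A rare message carries more information than a common one:
print(entropy([0.5, 0.5]))  # 1.0 bit/symbol (maximum for two symbols)
print(entropy([0.9, 0.1]))  # ~0.469 bits/symbol (less uncertainty)
```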

9 Properties of Entropy
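The standard properties (stated here since the slide's list is not in the transcript): H(X) \ge 0; H(X) = 0 when one symbol occurs with probability 1 (no uncertainty); and for an M-symbol source, H(X) \le \log_2 M, with equality when all symbols are equiprobable.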

10 Entropy of Binary Memoryless Source A binary source is said to be memoryless when it generates statistically independent successive symbols 0 and 1.
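For such a source emitting symbol 1 with probability p and symbol 0 with probability 1 - p, the entropy is the binary entropy function (standard result):

H(p) = -p \log_2 p - (1 - p) \log_2 (1 - p)

H(p) is zero at p = 0 and p = 1, and reaches its maximum of 1 bit/symbol at p = 1/2.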

11 Differential Entropy
Properties of Differential Entropy
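For a continuous source with probability density f(x), the differential entropy is defined in the standard form as

h(X) = -\int_{-\infty}^{\infty} f(x) \log_2 f(x) \, dx

Unlike discrete entropy, h(X) can be negative. For a Gaussian density with variance \sigma^2, h(X) = \frac{1}{2} \log_2 (2 \pi e \sigma^2), the largest differential entropy among all densities with that variance.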

12 Joint Entropy
The joint entropy is the average uncertainty of the communication channel as a whole, accounting for the entropy of the channel input as well as the channel output.
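In symbols (standard definition):

H(X, Y) = -\sum_{x} \sum_{y} p(x, y) \log_2 p(x, y)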

13 Conditional Entropy
The conditional entropies H(X|Y) and H(Y|X) measure the average uncertainty remaining about the channel input after the channel output has been observed, and about the channel output after the channel input has been observed, respectively.
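In standard notation:

H(X \mid Y) = -\sum_{x} \sum_{y} p(x, y) \log_2 p(x \mid y), \qquad H(Y \mid X) = -\sum_{x} \sum_{y} p(x, y) \log_2 p(y \mid x)

These relate to the joint entropy through the chain rule H(X, Y) = H(X) + H(Y \mid X) = H(Y) + H(X \mid Y).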

14 Average Effective Entropy
The average effective entropy is the difference between the entropy of the source and the conditional entropy of the message. If a discrete memoryless source with entropy H bits/message generates r messages per second, the information rate, or average information per second, is defined as R = rH bits per second.
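A quick worked example with illustrative numbers: a source with entropy H = 2 bits/message generating r = 1000 messages per second has information rate R = rH = 2000 bits per second.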

15 Coding of Information

16 4.3 Characteristics of a Discrete Memoryless Channel
A discrete memoryless channel is characterized by its channel matrix, also called the probability transition matrix.
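For a channel with inputs x_1, \dots, x_m and outputs y_1, \dots, y_n, the channel matrix collects the transition probabilities p(y_j / x_i); each row sums to 1. An illustrative binary example (values chosen for illustration, not from the slides):

P = \begin{bmatrix} p(y_1/x_1) & p(y_2/x_1) \\ p(y_1/x_2) & p(y_2/x_2) \end{bmatrix} = \begin{bmatrix} 0.9 & 0.1 \\ 0.2 & 0.8 \end{bmatrix}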

17 Binary Symmetric Channel (BSC)
A BSC is a binary channel that can transmit only one of two symbols (0 and 1). In a BSC, transmission is not perfect: occasionally the receiver gets the wrong bit.
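A minimal Python sketch of a BSC with crossover probability p (function names and parameters are illustrative):

```python
import random

def bsc(bits, p, seed=None):
    """Pass a bit sequence through a binary symmetric channel.

    Each bit is flipped independently with crossover probability p.
    """
    rng = random.Random(seed)
    return [b ^ (rng.random() < p) for b in bits]

tx = [0, 1, 1, 0, 1, 0, 0, 1] * 1000
rx = bsc(tx, p=0.1, seed=42)
errors = sum(t != r for t, r in zip(tx, rx))
print(f"bit error rate ~ {errors / len(tx):.3f}")  # close to p = 0.1
```

For a BSC with crossover probability p, the capacity is C = 1 - H(p) bits per channel use, where H(p) is the binary entropy function above (standard result).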

18 Binary Erasure Channel (BEC)
Figure 4.5.1 A General Model of a Binary Erasure Channel: input symbols x0 = 0 and x1 = 1 map to output symbols y0 = 0 and y1 = 1 with probabilities p(y0/x0) and p(y1/x1), or to the erasure symbol ye = e with probabilities p(ye/x0) and p(ye/x1).
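In the symmetric case where each input is erased with the same probability \alpha (that is, p(ye/x0) = p(ye/x1) = \alpha and p(y0/x0) = p(y1/x1) = 1 - \alpha), the capacity is C = 1 - \alpha bits per channel use, a standard result: the channel simply loses a fraction \alpha of the transmitted bits and never flips one.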

19 4.3.1 Mutual Information
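Mutual information measures how much information the channel output conveys about the channel input. In standard notation (the slide's own equations are not in the transcript):

I(X; Y) = H(X) - H(X \mid Y) = \sum_{x} \sum_{y} p(x, y) \log_2 \frac{p(x, y)}{p(x) \, p(y)}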

20 Properties of Mutual Information
Symmetry property: I(X; Y) = I(Y; X)
Non-negativity property: I(X; Y) ≥ 0
Relation to the joint entropy of the channel input/output: I(X; Y) = H(X) + H(Y) - H(X, Y)
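A small Python sketch that evaluates these identities numerically from a joint distribution (the example joint matrix is illustrative):

```python
import math

def mutual_information(joint):
    """I(X;Y) = sum over x,y of p(x,y) * log2( p(x,y) / (p(x) p(y)) )."""
    px = [sum(row) for row in joint]            # marginal p(x)
    py = [sum(col) for col in zip(*joint)]      # marginal p(y)
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# Joint distribution of a BSC with p = 0.1 and equiprobable inputs:
joint = [[0.45, 0.05],
         [0.05, 0.45]]
print(mutual_information(joint))  # ~0.531 bits = 1 - H(0.1), the BSC capacity
```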

21 4.4 Shannon’s Channel Coding Theorem
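The theorem's standard statement: for a discrete memoryless channel of capacity C, if the transmission rate R satisfies R < C, there exist coding schemes that make the probability of error arbitrarily small; if R > C, no coding scheme can achieve reliable transmission.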

22 Implementation of Shannon’s Channel Coding Theorem in BSC
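A minimal Python sketch of the underlying idea on a BSC: adding redundancy trades rate for reliability. The rate-1/3 repetition code below with majority-vote decoding is an illustrative (far from optimal) code, not Shannon's construction:

```python
import random

def bsc(bits, p, seed=None):
    """Binary symmetric channel: flip each bit independently with probability p."""
    rng = random.Random(seed)
    return [b ^ (rng.random() < p) for b in bits]

def encode_rep3(bits):
    """Rate-1/3 repetition code: transmit each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode_rep3(bits):
    """Majority vote over each received group of three bits."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

tx = [random.randint(0, 1) for _ in range(10_000)]
rx = decode_rep3(bsc(encode_rep3(tx), p=0.1))
ber = sum(t != r for t, r in zip(tx, rx)) / len(tx)
print(f"decoded BER ~ {ber:.3f}")  # ~3p²(1-p) + p³ = 0.028, versus 0.1 uncoded
```

A decoded bit is wrong only if two or three of its copies are flipped, so the error rate drops from p = 0.1 to about 0.028, at the cost of sending three channel bits per message bit.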

23 4.4.1 Channel Capacity
- A channel that possesses Gaussian noise characteristics is known as a Gaussian channel.
- If band-limited white Gaussian noise adds linearly to the input during transmission through a channel, the noise is called additive white Gaussian noise (AWGN) and the channel is called an AWGN channel.

24 Shannon Channel Capacity Theorem
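In its standard form, for an AWGN channel of bandwidth B Hz and signal-to-noise power ratio S/N:

C = B \log_2 \left( 1 + \frac{S}{N} \right) \ \text{bits/second}

A quick worked example with illustrative numbers: a telephone channel with B = 3.1 kHz and S/N = 1000 (30 dB) has C = 3100 × \log_2(1001) ≈ 3100 × 9.97 ≈ 30.9 kbit/s.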

25 About the Author
T. L. Singal graduated from the National Institute of Technology, Kurukshetra, and completed his post-graduation in Electronics & Communication Engineering at Punjab Technical University. He began his career with the Avionics Design Bureau, HAL, Hyderabad in 1981, working on radar communication systems. He then led the R&D group in a telecom company, where he successfully developed multi-access VHF wireless communication systems, and visited Germany during 1990-92. He executed an international assignment as Senior Network Consultant with Flextronics Network Services, Texas, USA during 2000-02, and was associated with Nokia, AT&T, Cingular Wireless, and Nortel Networks for the optimization of 2G/3G cellular networks in the USA. Since 2003 he has been teaching at reputed engineering colleges in India. He has published a number of technical research papers in IEEE proceedings, journals, and international/national conferences, and has authored three textbooks with McGraw-Hill Education: Wireless Communications (2010), Analog & Digital Communications (2012), and Digital Communication (2015).

26 THANKS!

