Basic Concepts of Information Theory. A measure of uncertainty. Entropy.


1 Basic Concepts of Information Theory. A measure of uncertainty. Entropy.

2 The amount of Information C. Shannon suggested the random variable –log P{E_k} as a relative measure of the occurrence of the event E_k. The mean of this function is a good indication of the average uncertainty over all outcomes of the experiment.

3 The amount of Information Consider the sample space Ω. Let us partition the sample space into a finite number of mutually exclusive events E_1, E_2, …, E_n with P{E_k} ≥ 0 and P{E_1} + P{E_2} + … + P{E_n} = 1. A probability space defined by such equations is called a complete finite scheme.

4 The amount of Information Consider the sample space Ω, partitioned into the mutually exclusive events E_1, E_2, …, E_n. C. Shannon suggested the random variable I = –log P{E_k} as a relative measure of the occurrence of the event E_k. This measure is called the amount of information contained in the occurrence of the event E_k.
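
As a quick numeric illustration, here is a minimal Python sketch of this definition. The base of the logarithm is an assumption (base 2, so that information is measured in bits), and the function name is ours:

    import math

    def information(p):
        # Amount of information I = -log2 P{E_k} of an event with probability p.
        return -math.log2(p)

    print(information(0.5))    # 1.0 bit: a fair coin toss
    print(information(0.125))  # 3.0 bits: a rarer event carries more information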

5 The amount of Information. Entropy. It is important to evaluate not only the amount of information (the uncertainty) contained in a single isolated event (message), but also the average uncertainty of the entire complete finite scheme. C. Shannon and N. Wiener suggested the following measure of uncertainty, the entropy: H(p_1, p_2, …, p_n) = –(p_1 log p_1 + p_2 log p_2 + … + p_n log p_n), where p_k = P{E_k}.
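
A minimal Python sketch of this definition (base-2 logarithms assumed; the function name is ours):

    import math

    def entropy(probs):
        # H = -(p_1 log2 p_1 + ... + p_n log2 p_n) over a complete finite scheme;
        # terms with p_k = 0 contribute nothing, by the convention 0 log 0 = 0.
        return sum(-p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits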

6 Entropy of a Bit (a simple communication channel) A completely random bit with p = (½, ½) has H(p) = –(½ log ½ + ½ log ½) = –(–½ – ½) = 1. A deterministic bit with p = (1, 0) has H(p) = –(1 log 1 + 0 log 0) = –(0 + 0) = 0, taking 0 log 0 = 0. A biased bit with p = (0.1, 0.9) has H(p) = –(0.1 log 0.1 + 0.9 log 0.9) ≈ 0.468996. (Logarithms are base 2 throughout, so entropy is measured in bits.) In general, the entropy looks as follows as a function of 0 ≤ P{X=1} ≤ 1: [Figure: the binary entropy function, rising from 0 at the endpoints to its maximum of 1 bit at P{X=1} = ½.]
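
The three values above can be checked numerically; a sketch assuming base-2 logarithms:

    import math

    def H(p):
        # Binary entropy of a bit with P{X=1} = p.
        return sum(-q * math.log2(q) for q in (p, 1 - p) if q > 0)

    print(H(0.5))  # 1.0          completely random bit
    print(H(1.0))  # 0.0          deterministic bit (0 log 0 taken as 0)
    print(H(0.1))  # 0.468996...  biased bit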

7 The amount of Information. Entropy. We have to investigate the principal properties of this measure with respect to statistical problems of communication systems. We have to generalize this concept to two-dimensional and n-dimensional probability schemes.

8 Entropy. Basic Properties Continuity: if the probabilities of the occurrence of events are slightly changed, the entropy is slightly changed accordingly. Symmetry: H(p_1, p_2, …, p_n) is unchanged under any permutation of its arguments. Extremal property: when all the events are equally likely, the average uncertainty attains its largest value: H(p_1, p_2, …, p_n) ≤ H(1/n, …, 1/n) = log n.
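
Both the symmetry and the extremal property are easy to verify numerically; a sketch (base-2 logarithms, math.isclose guards against float rounding):

    import math

    def entropy(probs):
        return sum(-p * math.log2(p) for p in probs if p > 0)

    p = [0.5, 0.3, 0.2]
    # Symmetry: reordering the events leaves the entropy unchanged.
    print(math.isclose(entropy(p), entropy([0.2, 0.5, 0.3])))  # True
    # Extremal property: the equally-likely scheme maximizes the entropy.
    print(entropy(p))                # ~1.485, below log2(3)
    print(entropy([1/3, 1/3, 1/3]))  # ~1.585, equal to log2(3)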

9 Entropy. Basic Properties Additivity. Let H(p_1, p_2, …, p_n) be the entropy associated with a complete set of events E_1, E_2, …, E_n, where p_k = P{E_k}. Let the event E_n be divided into m disjoint subsets F_1, …, F_m, so that p_n = q_1 + q_2 + … + q_m, where q_j = P{F_j}. Then the entropy of the refined scheme satisfies H(p_1, …, p_{n-1}, q_1, …, q_m) = H(p_1, …, p_n) + p_n H(q_1/p_n, …, q_m/p_n).
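
A numerical check of this grouping identity (the probabilities are illustrative, not from the slides):

    import math

    def entropy(probs):
        return sum(-p * math.log2(p) for p in probs if p > 0)

    p1, p2, pn = 0.5, 0.3, 0.2   # scheme E_1, E_2, E_n
    q = [0.15, 0.05]             # E_n split into F_1, F_2 with q_1 + q_2 = p_n

    lhs = entropy([p1, p2] + q)
    rhs = entropy([p1, p2, pn]) + pn * entropy([qj / pn for qj in q])
    print(math.isclose(lhs, rhs))  # True: the refined scheme obeys the identity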

10 Entropy. Basic Properties In general, H(p_1, p_2, …, p_n) = –(p_1 log p_1 + p_2 log p_2 + … + p_n log p_n) is continuous in each p_i for all 0 ≤ p_i ≤ 1.

11 Entropy for Two-dimensional Discrete Finite Probability Schemes The two-dimensional probability scheme provides the simplest mathematical model for a communication system with a transmitter and a receiver. Consider two finite discrete sample spaces, Ω_1 (the transmitter space) and Ω_2 (the receiver space), and their product space Ω.

12 Entropy for Two-dimensional Discrete Finite Probability Schemes In Ω_1 and Ω_2 we select complete sets of events E_1, E_2, …, E_n and F_1, F_2, …, F_m. Each event E_k of Ω_1 may occur in conjunction with any event F_j of Ω_2. Thus for the product space Ω = Ω_1 × Ω_2 we have the following complete set of events: E_1F_1, E_1F_2, …, E_kF_j, …, E_nF_m.
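
The complete set of product events can be enumerated directly; a sketch with illustrative event labels:

    from itertools import product

    E = ["E1", "E2"]        # complete set of events in Omega_1
    F = ["F1", "F2", "F3"]  # complete set of events in Omega_2

    # Complete set of events in the product space Omega = Omega_1 x Omega_2:
    print(list(product(E, F)))
    # [('E1', 'F1'), ('E1', 'F2'), ..., ('E2', 'F3')]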

13 Entropy for Two-dimensional Discrete Finite Probability Schemes We deal with the following three complete probability schemes: the marginal scheme {E_k, P{E_k}} of the transmitter space Ω_1, the marginal scheme {F_j, P{F_j}} of the receiver space Ω_2, and the joint scheme {E_kF_j, P{E_kF_j}} of the product space Ω. Hence the joint probability matrix is the n × m matrix P = [p_kj], whose entry in row k and column j is the joint probability p_kj = P{E_kF_j}.
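
A sketch of such a joint probability matrix, with the two marginal schemes recovered by summing rows and columns (the numbers are illustrative, not from the slides):

    # Joint probabilities p_kj = P{E_k F_j} for n = 2 transmitter events
    # and m = 3 receiver events; all n*m entries sum to 1.
    joint = [
        [0.20, 0.10, 0.10],
        [0.05, 0.25, 0.30],
    ]

    p_E = [sum(row) for row in joint]        # marginal scheme on Omega_1
    p_F = [sum(col) for col in zip(*joint)]  # marginal scheme on Omega_2
    print(p_E)  # ~[0.4, 0.6]
    print(p_F)  # ~[0.25, 0.35, 0.4]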

