Information Theory (資訊理論). Instructor: 陳建源. Office: 法 401. Ch2: Basic Concepts.

1 Information Theory (資訊理論). Instructor: 陳建源. Email: cychen07@nuk.edu.tw. Office: 法 401. Website: http://www.csie.nuk.edu.tw/~cychen/. Ch2: Basic Concepts

2 2.1 Self-information. Let S be a system of events E_1, ..., E_n. Def: The self-information of the event E_k is written I(E_k): I(E_k) = -log P(E_k), in which P(E_k) is the probability of E_k. The base of the logarithm: 2 (log) or e (ln); units: bit or nat.
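
As a quick illustration (not from the slides), here is a minimal Python sketch of the definition; the function name self_information is ours:

```python
import math

def self_information(p: float, base: float = 2.0) -> float:
    """Self-information I(E) = -log P(E); base 2 gives bits, base e gives nats."""
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must be in (0, 1]")
    return -math.log(p, base)

print(self_information(0.5))          # 1.0 bit
print(self_information(0.5, math.e))  # ~0.693 nat
```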

3 Ch2: Basic Concepts, 2.1 Self-information. When P(E_k) = 1, I(E_k) = 0; when P(E_k) = 1/2, I(E_k) = 1 bit; when P(E_k) = 1/4, I(E_k) = 2 bits; when P(E_k) → 0, I(E_k) → ∞. In short: the smaller P(E_k), the larger I(E_k).

4 Ch2: Basic Concepts, 2.1 Self-information. Ex1. A letter is chosen at random from the English alphabet: I = -log(1/26) = log 26 ≈ 4.70 bits. Ex2. A binary number of m digits is chosen at random: I = -log(2^-m) = m bits.
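
A short Python check of Ex1 and Ex2 (our own verification; m = 8 is an arbitrary sample value for Ex2):

```python
import math

# Ex1: a letter chosen uniformly from the 26-letter English alphabet.
print(-math.log2(1 / 26))   # log2(26) ≈ 4.700 bits

# Ex2: an m-digit binary number chosen uniformly among 2**m possibilities.
m = 8
print(-math.log2(2 ** -m))  # exactly m = 8.0 bits
```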

5 Ch2: Basic Concepts, 2.1 Self-information. Ex3. 64 points are arranged in a square grid. Let E_j be the event that a point picked at random lies in the j-th column, and E_k the event that it lies in the k-th row. Then P(E_j) = P(E_k) = 1/8, so I(E_j) = I(E_k) = 3 bits, while I(E_j ∩ E_k) = log 64 = 6 bits = I(E_j) + I(E_k). Why? Because the row and column events are statistically independent.
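
A small Python check of Ex3 (ours), confirming that the self-informations of the independent row and column events add up:

```python
import math

# 64 points in an 8x8 grid: picking a point uniformly fixes a row and a column.
I_col   = -math.log2(1 / 8)   # 3 bits: which column
I_row   = -math.log2(1 / 8)   # 3 bits: which row
I_point = -math.log2(1 / 64)  # 6 bits: which point

# Row and column are independent, so the self-informations add exactly.
assert I_point == I_col + I_row
print(I_col, I_row, I_point)  # 3.0 3.0 6.0
```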

6 Ch2: Basic Concepts, 2.2 Entropy. Let S be the system with events E_1, ..., E_n, the associated probabilities being p_1, ..., p_n. Let E(f) be the expectation (average, mean) of a function f: E_k → f_k, that is, E(f) = Σ_k p_k f_k.

7 Ch2: Basic Concepts, 2.2 Entropy. Def: The entropy of S, written H(S), is the average of the self-information: H(S) = E(I) = -Σ_k p_k log p_k. The self-information of an event increases as its uncertainty grows. Observation: the minimum value is 0, attained at certainty (some p_k = 1). But what is the maximum?
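
A minimal Python sketch of the entropy definition (the helper name entropy is ours); by convention, terms with p_k = 0 contribute nothing:

```python
import math

def entropy(probs, base: float = 2.0) -> float:
    """H(S) = -sum p_k log p_k; terms with p_k = 0 are skipped."""
    return sum(-p * math.log(p, base) for p in probs if p > 0)

print(entropy([1.0, 0.0]))   # 0.0  -> certainty
print(entropy([0.5, 0.5]))   # 1.0 bit
print(entropy([0.25] * 4))   # 2.0 bits
```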

8 Ch2: Basic Concepts, 2.2 Entropy. Thm: H(S) ≤ log n, with equality only when p_k = 1/n for every k. Proof: via the lemma on the next slide.

9 Ch2: Basic Concepts, 2.2 Entropy. Thm 2.2: For x > 0, ln x ≤ x - 1, with equality only when x = 1. To prove the entropy bound, assume that p_k ≠ 0.

10 Ch2: Basic Concepts, 2.2 Entropy. H(S) - log n = Σ_k p_k log(1/(n p_k)) ≤ (log e) Σ_k p_k (1/(n p_k) - 1) = (log e)(Σ_k 1/n - Σ_k p_k) = 0, with equality only when n p_k = 1, i.e. p_k = 1/n, for every k.
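
A numeric sanity check of the theorem (our own, with n = 5 and random distributions): entropy never exceeds log n, and the uniform distribution attains the bound:

```python
import math, random

def entropy(probs):
    return sum(-p * math.log2(p) for p in probs if p > 0)

n = 5
for _ in range(1000):
    w = [random.random() for _ in range(n)]
    p = [x / sum(w) for x in w]               # a random distribution on n events
    assert entropy(p) <= math.log2(n) + 1e-9  # H(S) <= log n

print(entropy([1 / n] * n), math.log2(n))     # equal: uniform achieves the bound
```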

11 Ch2: Basic Concepts, 2.2 Entropy. In the system S the probabilities p_1 and p_2, where p_2 > p_1, are replaced by p_1 + ε and p_2 - ε respectively, under the proviso 0 < 2ε < p_2 - p_1. Prove that H(S) is increased. We know that entropy H(S) can be viewed as a measure of _____ about S. Please list 3 items for this blank: information, uncertainty, randomness.
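
A quick numeric check of the exercise (the probabilities and ε below are our own, chosen to satisfy 0 < 2ε < p_2 - p_1):

```python
import math

def entropy(probs):
    return sum(-p * math.log2(p) for p in probs if p > 0)

p = [0.1, 0.6, 0.3]    # here p1 = 0.1, p2 = 0.6, so p2 - p1 = 0.5
eps = 0.2              # satisfies 0 < 2*eps < p2 - p1
q = [p[0] + eps, p[1] - eps, p[2]]

print(entropy(p), entropy(q))   # ≈ 1.295 -> ≈ 1.571: entropy increased
assert entropy(q) > entropy(p)
```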

12 Ch2: Basic Concepts, 2.3 Mutual information. Let S_1 be the system with events E_1, ..., E_n, the associated probabilities being P(E_1), ..., P(E_n). Let S_2 be the system with events F_1, ..., F_m, the associated probabilities being P(F_1), ..., P(F_m).

13 Ch2: Basic Concepts, 2.3 Mutual information. Two systems S_1 and S_2 have joint probabilities p_jk = P(E_j ∩ F_k), satisfying the relations Σ_k p_jk = P(E_j) and Σ_j p_jk = P(F_k).

14 Ch2: Basic Concepts, 2.3 Mutual information. They also satisfy the relation Σ_j Σ_k p_jk = 1.

15 Ch2: Basic Concepts, 2.3 Mutual information. Conditional probability: P(E_j | F_k) = P(E_j ∩ F_k) / P(F_k). Conditional self-information: I(E_j | F_k) = -log P(E_j | F_k). Mutual information: I(E_j ; F_k) = log [P(E_j | F_k) / P(E_j)]. NOTE: I(E_j ; F_k) = I(F_k ; E_j), since both equal log [P(E_j ∩ F_k) / (P(E_j) P(F_k))].
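
A Python sketch of these event-level quantities on a made-up 2x2 joint distribution (all numbers are ours), including the symmetry noted above:

```python
import math

# Toy joint probabilities p_jk = P(E_j and F_k); the numbers are made up.
p_joint = {("E0", "F0"): 0.4, ("E0", "F1"): 0.1,
           ("E1", "F0"): 0.2, ("E1", "F1"): 0.3}

P_E0 = p_joint[("E0", "F0")] + p_joint[("E0", "F1")]   # marginal P(E0) = 0.5
P_F0 = p_joint[("E0", "F0")] + p_joint[("E1", "F0")]   # marginal P(F0) = 0.6

cond = p_joint[("E0", "F0")] / P_F0   # conditional probability P(E0|F0)
I_cond = -math.log2(cond)             # conditional self-information I(E0|F0)
I_mut = math.log2(cond / P_E0)        # mutual information I(E0;F0)

# NOTE: mutual information of a pair of events is symmetric.
I_mut_rev = math.log2((p_joint[("E0", "F0")] / P_E0) / P_F0)  # I(F0;E0)
assert abs(I_mut - I_mut_rev) < 1e-12
print(I_cond, I_mut)
```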

16 Ch2: Basic Concepts, 2.3 Mutual information. Conditional entropy: H(S_1 | S_2) = -Σ_j Σ_k p_jk log P(E_j | F_k). Mutual information of the systems: I(S_1 ; S_2) = Σ_j Σ_k p_jk log [p_jk / (P(E_j) P(F_k))].

17 Ch2: Basic Concepts, 2.3 Mutual information. Conditional self-information and mutual information: I(E_j ; F_k) = I(E_j) - I(E_j | F_k), and likewise I(E_j ; F_k) = I(F_k) - I(F_k | E_j). If E_j and F_k are statistically independent, then I(E_j ; F_k) = 0.

18 Ch2: Basic Concepts, 2.3 Mutual information. Joint entropy: H(S_1, S_2) = -Σ_j Σ_k p_jk log p_jk. Joint entropy and conditional entropy: H(S_1, S_2) = H(S_2) + H(S_1 | S_2) = H(S_1) + H(S_2 | S_1).

19 Ch2: Basic Concepts, 2.3 Mutual information. Mutual information and conditional entropy: I(S_1 ; S_2) = H(S_1) - H(S_1 | S_2) = H(S_2) - H(S_2 | S_1) = H(S_1) + H(S_2) - H(S_1, S_2).
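
A Python check of these identities on a made-up 2x2 joint distribution (numbers are ours); H(S_1|S_2) is computed from its definition and the identities are then asserted:

```python
import math

def H(probs):
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Joint distribution p_jk over S1 x S2 (rows E_j, columns F_k); made-up numbers.
P = [[0.4, 0.1],
     [0.2, 0.3]]

p1 = [sum(row) for row in P]            # marginal probabilities of S1
p2 = [sum(col) for col in zip(*P)]      # marginal probabilities of S2
H1, H2 = H(p1), H(p2)
H12 = H([x for row in P for x in row])  # joint entropy H(S1,S2)

# Conditional entropy H(S1|S2) straight from its definition.
H1g2 = sum(-P[j][k] * math.log2(P[j][k] / p2[k])
           for j in range(2) for k in range(2))

I12 = H1 - H1g2                            # I(S1;S2) = H(S1) - H(S1|S2)
assert abs(I12 - (H1 + H2 - H12)) < 1e-12  # = H(S1) + H(S2) - H(S1,S2)
assert abs(H12 - (H2 + H1g2)) < 1e-12      # H(S1,S2) = H(S2) + H(S1|S2)
print(H1, H2, H12, I12)
```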

20 Ch2: Basic Concepts, 2.3 Mutual information. Thm: The mutual information of two systems cannot exceed the sum of their separate entropies: I(S_1 ; S_2) ≤ H(S_1) + H(S_2).

21 Ch2: Basic Concepts, 2.3 Mutual information. Statistically independent systems: if S_1 and S_2 are statistically independent, i.e. p_jk = P(E_j) P(F_k) for all j, k, then the joint entropy of the two systems is the sum of their separate entropies: H(S_1, S_2) = H(S_1) + H(S_2).

22 Ch2: Basic Concepts, 2.3 Mutual information. Thm: H(S_1, S_2) ≤ H(S_1) + H(S_2), with equality only if S_1 and S_2 are statistically independent. Proof: Assume that p_jk ≠ 0 and apply ln x ≤ x - 1 with x = P(E_j) P(F_k) / p_jk.
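
A numeric illustration of the theorem (our own): for many random joint distributions, the joint entropy never exceeds the sum of the marginal entropies:

```python
import math, random

def H(probs):
    return sum(-p * math.log2(p) for p in probs if p > 0)

for _ in range(1000):
    w = [[random.random() for _ in range(3)] for _ in range(3)]
    s = sum(map(sum, w))
    P = [[x / s for x in row] for row in w]   # a random 3x3 joint distribution
    H1 = H([sum(row) for row in P])
    H2 = H([sum(col) for col in zip(*P)])
    H12 = H([x for row in P for x in row])
    assert H12 <= H1 + H2 + 1e-9              # equality only if independent

print("subadditivity held on all samples")
```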

23 Ch2: Basic Concepts, 2.3 Mutual information. Thm: I(S_1 ; S_2) ≥ 0, with equality only if S_1 and S_2 are statistically independent. Proof: Follows from the previous theorem, since I(S_1 ; S_2) = H(S_1) + H(S_2) - H(S_1, S_2).

24 Ch2: Basic Concepts, 2.3 Mutual information. Ex: A binary symmetric channel with crossover probability ε. Let S_1 be the input, with E_0 = 0 and E_1 = 1, and S_2 be the output, with F_0 = 0 and F_1 = 1.

25 Ch2: Basic Concepts, 2.3 Mutual information. Assume that P(E_0) = p and P(E_1) = 1 - p. Then P(F_0 | E_0) = P(F_1 | E_1) = 1 - ε and P(F_1 | E_0) = P(F_0 | E_1) = ε.

26 Ch2: Basic Concepts, 2.3 Mutual information. Compute the output probabilities. Then P(F_0) = p(1 - ε) + (1 - p)ε and P(F_1) = pε + (1 - p)(1 - ε). If p = 1/2, then P(F_0) = P(F_1) = 1/2.

27 Ch2: Basic Concepts, 2.3 Mutual information. Compute the mutual information of the individual events: I(E_j ; F_k) = log [P(F_k | E_j) / P(F_k)], e.g. I(E_0 ; F_0) = log [(1 - ε) / P(F_0)].

28 Ch2: Basic Concepts, 2.3 Mutual information. Compute the mutual information of the systems: I(S_1 ; S_2) = H(S_2) - H(S_2 | S_1) = H(S_2) - H(ε), where H(ε) = -ε log ε - (1 - ε) log(1 - ε). For p = 1/2 this gives I(S_1 ; S_2) = 1 - H(ε).
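
A Python sketch of the whole BSC example (ε = 0.1 and p = 1/2 are our sample values):

```python
import math

def Hb(p):  # binary entropy in bits
    return 0.0 if p in (0.0, 1.0) else -p*math.log2(p) - (1-p)*math.log2(1-p)

eps = 0.1                        # crossover probability
p0  = 0.5                        # P(E0): equiprobable input
q0  = p0*(1 - eps) + (1 - p0)*eps  # output P(F0) = 0.5 when p0 = 0.5

# I(S1;S2) = H(output) - H(output | input) = Hb(q0) - Hb(eps)
I = Hb(q0) - Hb(eps)
print(q0, I)                     # 0.5, 1 - Hb(0.1) ≈ 0.531 bits
```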

29 Ch2: Basic Concepts, 2.3 Mutual information. Ex: The following messages may be sent over a binary symmetric channel with crossover probability ε, and they are equally probable at the input. What is the mutual information between M_1 and the first output digit being 0? What additional mutual information is conveyed by the knowledge that the second output digit is also 0?

30 Ch2: Basic Concepts, 2.3 Mutual information. For the output 00: first compute the mutual information conveyed by the first digit, then the extra mutual information conveyed by the second.

31 Ch2: Basic Concepts, 2.4 Data processing theorem. Data processing theorem: if S_1 and S_3 are statistically independent when conditioned on S_2, then I(S_1 ; S_3) ≤ I(S_1 ; S_2); likewise, if S_1 and S_3 are statistically independent when conditioned on S_2, then I(S_1 ; S_3) ≤ I(S_2 ; S_3). Convexity theorem.

32 Ch2: Basic Concepts, 2.4 Data processing theorem. Data processing theorem, proof: if S_1 and S_3 are statistically independent when conditioned on S_2, then I(S_1 ; S_3) ≤ I(S_1 ; S_2).
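
A numeric illustration of the data processing theorem (our own construction): cascade two binary symmetric channels so that S_1 → S_2 → S_3 is a Markov chain, and check both inequalities:

```python
import math

def H(probs):
    return sum(-p * math.log2(p) for p in probs if p > 0)

def I(P):  # mutual information of a 2x2 joint distribution
    p1 = [sum(row) for row in P]
    p2 = [sum(col) for col in zip(*P)]
    return H(p1) + H(p2) - H([x for row in P for x in row])

def bsc(eps):  # transition matrix of a binary symmetric channel
    return [[1 - eps, eps], [eps, 1 - eps]]

def joint(px, W):  # joint distribution of (input, output) for channel W
    return [[px[j] * W[j][k] for k in range(2)] for j in range(2)]

px = [0.5, 0.5]
W1, W2 = bsc(0.1), bsc(0.2)   # S1 -> S2 -> S3 is a Markov chain
P12 = joint(px, W1)
p2  = [sum(col) for col in zip(*P12)]
P23 = joint(p2, W2)
# End-to-end channel S1 -> S3: compose the two transition matrices.
W13 = [[sum(W1[j][m] * W2[m][k] for m in range(2)) for k in range(2)]
       for j in range(2)]
P13 = joint(px, W13)

assert I(P13) <= I(P12) + 1e-12 and I(P13) <= I(P23) + 1e-12
print(I(P12), I(P23), I(P13))  # processing cannot increase information
```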

33 Ch2: Basic Concepts, 2.5 Uniqueness theorem. Def: Let f(p_1, ..., p_n) be a continuous function of its arguments, in which p_k ≥ 0 and Σ_k p_k = 1, satisfying: (a) f takes its largest value at p_k = 1/n; (b) f is unaltered if an impossible event is added to the system; (c) the grouping condition.

34 Ch2: Basic Concepts, 2.5 Uniqueness theorem. Uniqueness theorem: every f satisfying conditions (a), (b), (c) has the form f(p_1, ..., p_n) = -C Σ_k p_k log p_k for a positive constant C. Proof:

