1 15-781 Machine Learning (Recitation 1) By Jimeng Sun 9/15/05

2 Outline: Entropy and Information Gain; Decision Tree; Probability

3 Entropy. Suppose X can have one of m values V1, V2, …, Vm, with P(X = V1) = p1, P(X = V2) = p2, …, P(X = Vm) = pm. What is the smallest possible number of bits, on average, per symbol, needed to transmit a stream of symbols drawn from X's distribution? It is H(X) = −Σj pj log2 pj, the entropy of X. "High entropy" means X is from a uniform (boring) distribution; "low entropy" means X is from a varied (peaks and valleys) distribution.
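A minimal Python sketch of the entropy formula above; the function name entropy and the two example distributions are my own illustration, not something from the slides:

    from math import log2

    def entropy(probs):
        # H(X) = -sum_j p_j * log2(p_j); terms with p_j = 0 contribute nothing.
        return -sum(p * log2(p) for p in probs if p > 0)

    print(entropy([0.25, 0.25, 0.25, 0.25]))  # uniform over 4 values: 2.0 bits (high entropy)
    print(entropy([0.99, 0.01]))              # heavily skewed: about 0.08 bits (low entropy)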

4 Entropy H(·), example. X = College Major, Y = Likes "XBOX".

    X        Y
    Math     Yes
    History  No
    CS       Yes
    Math     No
    Math     No
    CS       Yes
    History  No
    Math     Yes

H(X) = 1.5, H(Y) = 1
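The slide's two numbers can be checked directly from the table. A short sketch, assuming the record list layout below (the names records and entropy_of are mine):

    from collections import Counter
    from math import log2

    records = [("Math", "Yes"), ("History", "No"), ("CS", "Yes"), ("Math", "No"),
               ("Math", "No"), ("CS", "Yes"), ("History", "No"), ("Math", "Yes")]

    def entropy_of(values):
        # Empirical entropy of a list of symbols, in bits: sum_j p_j * log2(1 / p_j).
        n = len(values)
        return sum((c / n) * log2(n / c) for c in Counter(values).values())

    print(entropy_of([x for x, _ in records]))  # H(X) = 1.5
    print(entropy_of([y for _, y in records]))  # H(Y) = 1.0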

5 Specific Conditional Entropy H(Y|X=v). Definition: H(Y|X=v) = the entropy of Y among only those records in which X has value v. (X = College Major, Y = Likes "XBOX", using the table above.)

6 Specific Conditional Entropy H(Y|X=v), example. With the same definition and table: H(Y|X=Math) = 1, H(Y|X=History) = 0, H(Y|X=CS) = 0.
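Continuing the same sketch (it reuses records and entropy_of from the block above), H(Y|X=v) simply restricts the table to the rows where X = v before computing the entropy of Y:

    def specific_cond_entropy(records, v):
        # H(Y | X = v): entropy of Y over only the records where X == v.
        return entropy_of([y for x, y in records if x == v])

    print(specific_cond_entropy(records, "Math"))     # 1.0 (2 Yes, 2 No)
    print(specific_cond_entropy(records, "History"))  # 0.0 (all No)
    print(specific_cond_entropy(records, "CS"))       # 0.0 (all Yes)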

7 Conditional Entropy H(Y|X). Definition: H(Y|X) = the average specific conditional entropy of Y = the conditional entropy of Y you expect if you choose a record at random and condition on that record's value of X = the expected number of bits needed to transmit Y if both sides already know the value of X = Σj P(X = vj) H(Y | X = vj). (X = College Major, Y = Likes "XBOX".)

8 Conditional Entropy, example. H(Y|X) = Σj P(X = vj) H(Y | X = vj). With X = College Major and Y = Likes "XBOX":

    vj        P(X = vj)   H(Y | X = vj)
    Math      0.5         1
    History   0.25        0
    CS        0.25        0

H(Y|X) = 0.5 * 1 + 0.25 * 0 + 0.25 * 0 = 0.5
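The weighted average on this slide, again continuing the sketch above (reuses records, entropy_of, and specific_cond_entropy):

    from collections import Counter

    def cond_entropy(records):
        # H(Y|X) = sum_j P(X = v_j) * H(Y | X = v_j)
        n = len(records)
        return sum((c / n) * specific_cond_entropy(records, v)
                   for v, c in Counter(x for x, _ in records).items())

    print(cond_entropy(records))  # 0.5 * 1 + 0.25 * 0 + 0.25 * 0 = 0.5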

9 Information Gain. Definition: IG(Y|X) = I must transmit Y; how many bits, on average, would it save me if both ends of the line knew X? IG(Y|X) = H(Y) − H(Y|X). Example (X = College Major, Y = Likes "XBOX"): H(Y) = 1 and H(Y|X) = 0.5, so IG(Y|X) = 1 − 0.5 = 0.5.
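Information gain is then a one-line difference on top of the helpers above (same running sketch):

    def info_gain(records):
        # IG(Y|X) = H(Y) - H(Y|X)
        return entropy_of([y for _, y in records]) - cond_entropy(records)

    print(info_gain(records))  # 1.0 - 0.5 = 0.5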

10 Decision Tree
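The decision-tree slides in this transcript are image-only, but the standard way information gain is used in tree learning is to pick, at each node, the attribute whose split gives the largest gain. A sketch under that assumption (the row layout, attribute names, and best_split are illustrative and reuse info_gain from above; this is the generic ID3-style choice, not necessarily the exact procedure shown on the slides):

    def best_split(rows, attributes, label):
        # Greedy choice: the attribute a maximizing IG(label | a) on these rows.
        return max(attributes,
                   key=lambda a: info_gain([(r[a], r[label]) for r in rows]))

    rows = [{"Major": x, "LikesXbox": y} for x, y in records]
    print(best_split(rows, ["Major"], "LikesXbox"))  # "Major" (the only candidate here)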

11

12 Tree pruning

13 Probability

14 Test your understanding: All the time? Only when X and Y are independent? It can fail even if X and Y are independent?

15

