Machine Learning: Concept Learning and General-to-Specific Ordering


1 Machine Learning: Concept Learning and General-to-Specific Ordering
(Inductive Classification)

2 Note
A simple approach. Assumption: no noise in the training data. It illustrates the key concepts.

3 Concept learning task
Concept learning = classification (categorization). Given examples, learn a general concept (category): a subset of some general domain, or equivalently a boolean-valued function over the domain (its characteristic function). Determine: infer a boolean-valued function from samples of it (input/output pairs).

4 Training Examples for EnjoySport
Concept learning: inferring a boolean-valued function from training examples (binary classification). Concept to learn: EnjoySport.

Sky    Temp  Humid   Wind    Water  Forecast  EnjoySport
Sunny  Warm  Normal  Strong  Warm   Same      Yes
Sunny  Warm  High    Strong  Warm   Same      Yes
Rainy  Cold  High    Strong  Warm   Change    No
Sunny  Warm  High    Strong  Cool   Change    Yes

5 Representing Hypotheses
Many possible representations. Here, h is a conjunction of constraints on attributes. Each constraint can be:
- a specific value (e.g., Water=Warm)
- don't care (e.g., Water=?)
- no value allowed (e.g., Water=Ø)
For example: <Sky, Temp, Humid, Wind, Water, Forecast> = <Sunny, ?, ?, Strong, ?, Same>. A code sketch of this representation follows.
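One way to make this concrete in Python (a minimal sketch; the tuple encoding, the name matches, and the use of None for Ø are my choices, not from the slides):

    # A hypothesis is a tuple of constraints, one per attribute: a concrete
    # value (e.g. "Warm"), "?" for don't-care, or None for "no value allowed" (Ø).
    def matches(h, x):
        """Return True iff instance x (a tuple of attribute values) satisfies h."""
        return all(c == "?" or c == v for c, v in zip(h, x))

    h = ("Sunny", "?", "?", "Strong", "?", "Same")
    x = ("Sunny", "Warm", "Normal", "Strong", "Warm", "Same")
    print(matches(h, x))  # True; a None (Ø) constraint never matches any value

Note that a hypothesis containing any Ø constraint classifies every instance as negative, which is exactly the role of the most specific hypothesis.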

6 Concept learning task
Given: a description of an instance, x ∈ X, where X is the instance language or instance space, and a fixed set of categories C = {c1, c2, …, cn}.
Determine: the category of x, c(x) ∈ C, where c(x) is a categorization function whose domain is X and whose range is C. If c(x) is a binary function, C = {0,1} ({true, false}, {positive, negative}, {yes, no}), then it is called a concept.

7 Example task: “Days on which Aldo enjoys his water sports”
A day = a set of attributes; predict the value of one attribute given the values of the others.
Representation? A conjunction of constraints on attributes.
(Training data: the EnjoySport table from slide 4.)

8 Notation
X: the set of items over which the concept is defined
c: the target concept, a function X → {0,1}
  c(x) = 1: x is a positive example of c
  c(x) = 0: x is a negative example of c
D: the set of examples <x, c(x)>
H: the hypothesis space (a design choice); members of H are functions X → {0,1}
Goal of (concept) learning: find h in H such that h(x) = c(x) for all x in X.

9 Prototypical Concept Learning Task
Given:
  Instances X: possible days, each described by the attributes Sky, Temp, Humidity, Wind, Water, Forecast
  Target function c: EnjoySport: X → {0,1}
  Hypotheses H: conjunctions of literals, e.g. <?, Cold, High, ?, ?, ?>
  Training examples D: positive and negative examples of the target function: <x1, c(x1)>, …, <xn, c(xn)>
Determine: a hypothesis h in H such that h(x) = c(x) for all x in D.

10 Inductive Learning Hypothesis
Any hypothesis found to approximate the target function well over a sufficiently large set of training examples will also approximate the target function well over other unobserved examples.

11 Notes
Unique: H contains a unique most general hypothesis (<?, ?, ?, ?, ?, ?>) and a unique most specific hypothesis (<Ø, Ø, Ø, Ø, Ø, Ø>).
A concept learning task requires: a domain (set of instances), a target function, a set of candidate hypotheses, and a set of (labeled) examples.

12 Induction
Learning hypothesis: all we know of c are the examples, so the best we can do is to produce an h consistent with the example data. The best h for unseen instances is the best h for the seen ones; this is the fundamental assumption of inductive learning.

13 Concept learning as search
H is the search space; the task is to find the h that best fits D. The choice of representation defines the search space and its size (syntactic/semantic). What is needed is an efficient search for the best h in a very large (or infinite) space.

14 General-to-Specific order
A useful structure: it organizes the search process, it exists for any concept learning task, and it makes search possible without enumerating all members of H (which may be infinite).
h1 is 'more general than or equal to' h2 (h1 ≥ h2) iff for all x: h2(x) = 1 implies h1(x) = 1.
This relation is independent of c, and it defines a partial order on H. (A code sketch of the test follows.)
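For the conjunctive hypotheses above, ≥ can be tested constraint-by-constraint; a minimal sketch building on the slide 5 encoding (function names are mine):

    def constraint_subsumes(c1, c2):
        # c1 accepts every value that c2 accepts: "?" accepts everything,
        # None (Ø) accepts nothing, a concrete value accepts only itself.
        return c1 == "?" or c2 is None or c1 == c2

    def more_general_or_equal(h1, h2):
        """True iff h1 >= h2: every instance accepted by h2 is accepted by h1.
        (Exact when h2 is Ø-free; a sufficient syntactic test otherwise.)"""
        return all(constraint_subsumes(c1, c2) for c1, c2 in zip(h1, h2))

For example, more_general_or_equal(("Sunny", "?"), ("Sunny", "Warm")) is True, while the reverse is False.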

15 Instance, Hypotheses, and More-General-Than

16 Find-S Algorithm
1. Initialize h to the most specific hypothesis in H
2. For each positive training instance x:
     For each attribute constraint ai in h:
       if the constraint ai is satisfied by x, do nothing;
       else replace ai in h by the next more general constraint that is satisfied by x
3. Output hypothesis h
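A runnable sketch of Find-S under the slide 5 tuple encoding (the function name and encoding choices are mine; the data is the EnjoySport table from slide 4):

    def find_s(examples, n_attrs):
        """Find-S for conjunctive hypotheses; examples are (instance, label) pairs."""
        h = tuple([None] * n_attrs)         # most specific hypothesis <Ø, ..., Ø>
        for x, label in examples:
            if not label:
                continue                    # Find-S ignores negative examples
            h = tuple(v if c is None        # Ø generalizes to the observed value
                      else c if c == v      # constraint already satisfied
                      else "?"              # conflict: generalize to don't-care
                      for c, v in zip(h, x))
        return h

    D = [(("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
         (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"), True),
         (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
         (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True)]
    print(find_s(D, 6))   # ('Sunny', 'Warm', '?', 'Strong', '?', '?')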

17 Find-S Algorithm

18 Key property of Find-S
For H defined as conjunctions of constraints, Find-S finds the most specific h consistent with the positive examples. This h is also consistent with the negative examples, provided there is no noise and c is in H.

19 Problems of Find-S
Cannot tell whether it has learned c: have we converged to c, and if not, how uncertain are we about c?
Why choose the most specific h? The 'most specific' h is not always unique (depending on H, there might be several!).
Is the training data consistent? Noise can severely mislead Find-S, and it offers no way to detect or overcome this. Realistic training data is frequently corrupted by errors (noise) in the feature or class values, and such noise can result in missing valid generalizations.

20 Version Spaces and Candidate Elimination
Another learning approach: output a description of all members of H that are consistent with D. This is possible without enumerating the members of H (thanks to the ordering!). It still suffers from noise, but it is a useful framework for introducing many fundamental issues of ML.

21 Version Space
Given a hypothesis space H and training data D, the version space (VS) is the complete subset of H that is consistent with D. The version space can be naively generated for any finite H by enumerating all hypotheses and eliminating the inconsistent ones.

22 Version Space with S and G
The version space can be represented more compactly by maintaining two boundary sets of hypotheses: S, the set of most specific consistent hypotheses, and G, the set of most general consistent hypotheses. S and G represent the entire version space via its boundaries in the generalization lattice.
(Diagram: the lattice with G at the top, the version space in between, and S at the bottom.)

23 List-Then-Eliminate Algorithm
1. VersionSpace ← a list containing every hypothesis in H
2. For each training example <x, c(x)>: remove from VersionSpace any hypothesis h for which h(x) ≠ c(x)
3. Output the list of hypotheses in VersionSpace
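A direct sketch in Python, feasible only when H is small enough to enumerate (EnjoySport's H has 973 semantically distinct members); enumerate_hypotheses and attr_values are my names, and matches() is from the slide 5 sketch:

    from itertools import product

    def enumerate_hypotheses(attr_values):
        """All conjunctive hypotheses over the given attribute value lists.
        Any hypothesis with an Ø constraint rejects every instance, so a
        single all-Ø representative stands in for all of them."""
        hs = [tuple(h) for h in product(*[vals + ["?"] for vals in attr_values])]
        hs.append(tuple([None] * len(attr_values)))
        return hs

    def list_then_eliminate(attr_values, examples):
        vs = enumerate_hypotheses(attr_values)
        for x, label in examples:
            vs = [h for h in vs if matches(h, x) == label]
        return vs

Run on the four EnjoySport examples, this leaves the six-hypothesis version space shown on the next slide.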

24 Example Version Space
(Training data: the EnjoySport table from slide 4. The resulting version space contains six hypotheses, bounded by S = {<Sunny, Warm, ?, Strong, ?, ?>} and G = {<Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?>}.)

25 Representing Version Spaces
The General boundary G of version space VS_{H,D} is the set of its maximally general members.
The Specific boundary S of version space VS_{H,D} is the set of its maximally specific members.
Every member of the version space lies between these boundaries:
  VS_{H,D} = {h ∈ H | ∃s ∈ S, ∃g ∈ G: g ≥ h ≥ s}
where x ≥ y means x is more general than or equal to y.
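The boundary representation turns membership testing into two boundary checks; a sketch using more_general_or_equal() from the slide 14 sketch (the function name is mine):

    def in_version_space(h, S, G):
        """h is in the version space iff it lies between the boundaries:
        some g in G is at least as general as h, and h is at least as
        general as some s in S."""
        return (any(more_general_or_equal(g, h) for g in G)
                and any(more_general_or_equal(h, s) for s in S))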

26 Elimination algorithms
List-Then-Eliminate: something to start with, but inefficient (it must enumerate H).
Candidate Elimination: the same principle with a more compact representation: maintain only the most specific and most general elements of VS(H,D).

27 Candidate Elimination Algorithm
G ← the maximally general hypotheses in H
S ← the maximally specific hypotheses in H
For each training example d, do:
  If d is a positive example:
    Remove from G any hypothesis inconsistent with d
    For each hypothesis s in S that is not consistent with d:
      Remove s from S
      Add to S all minimal generalizations h of s such that h is consistent with d and some member of G is more general than h
      Remove from S any hypothesis that is more general than another hypothesis in S
  If d is a negative example:
    Remove from S any hypothesis inconsistent with d
    For each hypothesis g in G that is not consistent with d:
      Remove g from G
      Add to G all minimal specializations h of g such that h is consistent with d and some member of S is more specific than h
      Remove from G any hypothesis that is less general than another hypothesis in G

28 Example Trace
Next training example? (Trace over the EnjoySport table from slide 4, one example at a time.)

29 (Example trace, continued, on the EnjoySport training data from slide 4.)

30 How Should These Be Classified?

31 What Justifies this Inductive Leap?

32 Remarks on VS & CE
Does it converge to the correct h? If there are no errors and H is rich enough: yes. Convergence is exact when G = S and both are singletons.
Effect of errors (noise): if, say, example 2 were incorrectly labeled negative, CE would remove the correct target from the VS! Detection: the VS becomes empty. The same happens when c cannot be represented in H (e.g., it requires disjunctions).

33 What example next?
Assume the learner may ask (analogy: experiments, a teacher): a query is an instance constructed by the learner and classified by the teacher. What should we ask in the example case? Is there a good general strategy? We should attempt to discriminate among the alternatives in the VS.

34 What to ask...
A discriminating example: if c(x) = 1, S will generalize; if c(x) = 0, G will specialize. The optimal choice halves the VS: an x that satisfies half of the VS members. In general it is not possible to generate such an instance.

35 How to use partially learned concepts?
Classification with an ambiguous VS: if h(x) agrees (0 or 1) for every h in the VS, the answer is safe; it is enough to check against G (for 0) and S (for 1). Otherwise the VS is split: in the example, the 3rd new instance gets 50% support for each class and the 4th gets 66% support for 0. Majority vote + confidence? OK only if all h are equally likely. (A voting sketch follows.)
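A sketch of majority-vote classification over an explicit version space, using matches() from the slide 5 sketch (the function name is mine):

    def vote(vs, x):
        """Classify x by majority vote over the version space; the vote
        fraction is a meaningful confidence only if all h are equally likely."""
        pos = sum(1 for h in vs if matches(h, x))
        neg = len(vs) - pos
        return pos > neg, max(pos, neg) / len(vs)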

36 Inductive Bias
What if c is not in H? Use an H that includes 'everything'? Such an H would be quite large, which affects learning (generalization, number of examples needed).
We concentrate on Candidate Elimination here, but the results apply to any algorithm that outputs any h consistent with D.

37 Biased hypothesis space
To assure that H contains c, make H general enough. What if it is not? In the example case, the most specific h consistent with x1 and x2 (positives) is too general for x3 (a negative). The reason: we have biased the learner to consider only conjunctive hypotheses.

