Kansas State University, Department of Computing and Information Sciences
CIS 730: Introduction to Artificial Intelligence
Lecture 34 of 41, Wednesday, 10 November 2004
William H. Hsu, Department of Computing and Information Sciences, KSU
Reading: Sections and 18.5, Russell and Norvig
Introduction to Machine Learning

Lecture Outline
Today's Reading
– Sections and 18.5, Russell and Norvig
– References: Chapters 1-3, Machine Learning, Mitchell
Next Week: Sections , , Russell and Norvig
Previously: Representation and Reasoning (Inference)
– Logical
– Probabilistic ("Soft Computing")
Today: Introduction to Learning
– Machine learning framework
– Definitions: taxonomies (supervised, unsupervised, reinforcement); instance spaces (X); hypotheses (h) and hypothesis spaces (H)
– Basic examples
– Version spaces and the candidate elimination algorithm
Next Thursday: Inductive Bias and Learning Decision Trees

Specifying a Learning Problem
Learning = improving with experience at some task
– Improve over task T,
– with respect to performance measure P,
– based on experience E.
Example: learning to play checkers
– T: play games of checkers
– P: percent of games won in world tournament
– E: opportunity to play against self
Refining the problem specification: issues
– What experience?
– What exactly should be learned?
– How shall it be represented?
– What specific algorithm should be used to learn it?
Defining the problem milieu
– Performance element: how shall the results of learning be applied?
– How shall the performance element be evaluated? The learning system?

Example (Revisited): Learning to Play Board Games
Type of training experience
– Direct or indirect?
– Teacher or not?
– Knowledge about the game (e.g., openings/endgames)?
Problem: is the training experience representative (of the performance goal)?
Software design
– Assumptions of the learning system: a legal move generator exists
– Software requirements: generator, evaluator(s), parametric target function
Choosing a target function
– ChooseMove: Board → Move (action selection function, or policy)
– V: Board → R (board evaluation function)
– Ideal target V; approximated target V̂
– Goal of learning process: operational description (approximation) of V

Implicit Representation in Learning: Target Evaluation Function for Checkers
Possible definition
– If b is a final board state that is won, then V(b) = 100
– If b is a final board state that is lost, then V(b) = -100
– If b is a final board state that is drawn, then V(b) = 0
– If b is not a final board state, then V(b) = V(b'), where b' is the best final board state that can be achieved starting from b and playing optimally until the end of the game
– Correct values, but not operational
Choosing a representation for the target function
– Collection of rules?
– Neural network?
– Polynomial function (e.g., linear, quadratic combination) of board features?
– Other?
A representation for the learned function: a linear combination of six board features (see the sketch below)
– V̂(b) = w0 + w1·bp(b) + w2·rp(b) + w3·bk(b) + w4·rk(b) + w5·bt(b) + w6·rt(b)
– bp/rp = number of black/red pieces; bk/rk = number of black/red kings; bt/rt = number of black/red pieces threatened (can be taken on next turn)
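Below is a minimal runnable sketch of this linear evaluator. The function name v_hat, the weights, and the hand-filled feature counts are illustrative assumptions, not values from the lecture; a real learner would extract the six features from an actual board and fit the weights from experience.

```python
def v_hat(board_features, weights):
    """Linear evaluation: V_hat(b) = w0 + w1*bp + w2*rp + w3*bk + w4*rk + w5*bt + w6*rt."""
    w0, *ws = weights
    return w0 + sum(w * f for w, f in zip(ws, board_features))

# Illustrative only: 12 black pieces, 11 red pieces, 1 black king, 0 red kings,
# 2 black pieces threatened, 3 red pieces threatened.
features = [12, 11, 1, 0, 2, 3]                    # (bp, rp, bk, rk, bt, rt)
weights = [0.0, 1.0, -1.0, 3.0, -3.0, -0.5, 0.5]   # (w0, ..., w6), made up
print(v_hat(features, weights))                    # 4.5: positive favors black here
```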

Design Choices for Learning to Play Checkers
– Determine type of training experience: games against experts, games against self, or a table of correct moves
– Determine target function: Board → value, or Board → move
– Determine representation of learned function: polynomial, linear function of six features, or artificial neural network
– Determine learning algorithm: gradient descent or linear programming
– Completed design

Performance Element: What to Learn?
Classification functions
– Hidden functions: estimating ("fitting") parameters
– Concepts (e.g., chair, face, game)
– Diagnosis, prognosis: medical, risk assessment, fraud, mechanical systems
Models
– Map (for navigation)
– Distribution (query answering, aka QA)
– Language model (e.g., automaton/grammar)
Skills
– Playing games
– Planning
– Reasoning (acquiring a representation to use in reasoning)
Cluster definitions for pattern recognition
– Shapes of objects
– Functional or taxonomic definition
Many learning problems can be reduced to classification.

Representations and Algorithms: How to Learn It?
Supervised
– What is learned? Classification function; other models
– Inputs and outputs?
– How is it learned? Presentation of examples to learner (by teacher)
Unsupervised
– What is learned? Cluster definition, or vector quantization function (codebook)
– How is it learned? Formation, segmentation, labeling of clusters based on observations and a similarity metric
Reinforcement
– What is learned? Control policy (function from states of the world to actions)
– How is it learned? (Delayed) feedback of reward values to agent based on actions selected; model updated based on reward and (partially) observable state

(Supervised) Concept Learning
Given: training examples of some unknown function f
Find: a good approximation to f
Examples (besides concept learning)
– Disease diagnosis: x = properties of patient (medical history, symptoms, lab tests); f = disease (or recommended therapy)
– Risk assessment: x = properties of consumer or policyholder (demographics, accident history); f = risk level (expected cost)
– Automatic steering: x = bitmap picture of road surface in front of vehicle; f = degrees to turn the steering wheel
– Part-of-speech tagging
– Fraud/intrusion detection
– Web log analysis
– Multisensor integration and prediction

Example: Supervised Inductive Learning Problem
Unknown function: y = f(x1, x2, x3, x4)
– Each input xi has type ti and the output y has type t, so f: t1 × t2 × t3 × t4 → t
– Our learning function maps a set of labeled examples to a hypothesis: Vector(t1 × t2 × t3 × t4 × t) → ((t1 × t2 × t3 × t4) → t)

Hypothesis Space: Unrestricted Case
– |A → B| = |B|^|A|
– |H| = |{0,1} × {0,1} × {0,1} × {0,1} → {0,1}| = 2^(2^4) = 65536 function values
Complete ignorance: is learning possible?
– Need to see every possible input/output pair
– After 7 examples, still have 2^9 = 512 possibilities (out of 65536) for f
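To make the counting argument concrete, this small sketch computes both quantities for the Boolean case above:

```python
from itertools import product

inputs = list(product([0, 1], repeat=4))   # 2^4 = 16 possible input vectors
num_functions = 2 ** len(inputs)           # 2^(2^4) = 65536 Boolean functions
print(num_functions)                       # 65536

observed = 7                               # input/output pairs seen so far
remaining = 2 ** (len(inputs) - observed)  # the other 9 outputs are unconstrained
print(remaining)                           # 512
```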

Example: Learning a Concept (EnjoySport) from Data
Specification for training examples
– Similar to a data type definition
– 6 attributes (aka variables, features): Sky, Temp, Humidity, Wind, Water, Forecast
– Nominal-valued (symbolic) attributes: enumerative data type
Binary (Boolean-valued) concept
Supervised learning problem: describe the general concept

  Example  Sky    AirTemp  Humidity  Wind    Water  Forecast  EnjoySport
  1        Sunny  Warm     Normal    Strong  Warm   Same      Yes
  2        Sunny  Warm     High      Strong  Warm   Same      Yes
  3        Rainy  Cold     High      Strong  Warm   Change    No
  4        Sunny  Warm     High      Strong  Cool   Change    Yes

Representing Hypotheses
Many possible representations
Hypothesis h: conjunction of constraints on attributes
Constraint values
– Specific value (e.g., Water = Warm)
– Don't care (e.g., "Water = ?")
– No value allowed (e.g., "Water = Ø")
Example hypothesis for EnjoySport
– ⟨Sky, AirTemp, Humidity, Wind, Water, Forecast⟩ = ⟨Sunny, ?, ?, Strong, ?, Same⟩
– Is this consistent with the training examples?
– What are some hypotheses that are consistent with the examples?
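A minimal sketch of this representation: a hypothesis is a tuple of per-attribute constraints, with "?" for don't care and "0" standing in for Ø. The helper name satisfies is an assumption for illustration.

```python
ATTRIBUTES = ("Sky", "AirTemp", "Humidity", "Wind", "Water", "Forecast")

def satisfies(hypothesis, instance):
    """True iff every attribute constraint accepts the instance ('0' accepts nothing)."""
    return all(h == "?" or h == x for h, x in zip(hypothesis, instance))

h = ("Sunny", "?", "?", "Strong", "?", "Same")
x = ("Sunny", "Warm", "Normal", "Strong", "Warm", "Same")
print(satisfies(h, x))  # True
```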

Typical Concept Learning Tasks
Given
– Instances X: possible days, each described by attributes Sky, AirTemp, Humidity, Wind, Water, Forecast
– Target function c ≡ EnjoySport: X → {0, 1}, where X ≡ {Rainy, Sunny} × {Warm, Cold} × {Normal, High} × {None, Mild, Strong} × {Cool, Warm} × {Same, Change}
– Hypotheses H: conjunctions of literals (e.g., ⟨?, Cold, High, ?, ?, ?⟩)
– Training examples D: positive and negative examples of the target function
Determine
– Hypothesis h ∈ H such that h(x) = c(x) for all x ∈ D
– Such h are consistent with the training data
Training examples
– Assumption: no missing X values
– Noise in values of c (contradictory labels)?

Inductive Learning Hypothesis
Fundamental assumption of inductive learning
Informal statement
– Any hypothesis found to approximate the target function well over a sufficiently large set of training examples will also approximate the target function well over other unobserved examples
– Definitions deferred: sufficiently large, approximate well, unobserved
Formal statements, justification, analysis
– Statistical (Mitchell, Chapter 5; statistics textbook)
– Probabilistic (R&N, Chapters and 19; Mitchell, Chapter 6)
– Computational (R&N, Section 18.6; Mitchell, Chapter 7)
More on this topic: Machine Learning and Pattern Recognition (CIS 732)
Next: how to find this hypothesis?

Instances, Hypotheses, and the Partial Ordering Less-Specific-Than
Instances X
– x1 = ⟨Sunny, Warm, High, Strong, Cool, Same⟩
– x2 = ⟨Sunny, Warm, High, Light, Warm, Same⟩
Hypotheses H
– h1 = ⟨Sunny, ?, ?, Strong, ?, ?⟩
– h2 = ⟨Sunny, ?, ?, ?, ?, ?⟩
– h3 = ⟨Sunny, ?, ?, ?, Cool, ?⟩
Ordering (specific to general): h1 ≤P h2 and h3 ≤P h2, where ≤P denotes Less-Specific-Than (its inverse is More-General-Than); h2 is more general than both h1 and h3.

Find-S Algorithm
1. Initialize h to the most specific hypothesis in H
   (H: the hypothesis space, a partially ordered set under the relation Less-Specific-Than)
2. For each positive training instance x:
   For each attribute constraint ai in h:
     IF the constraint ai in h is satisfied by x
     THEN do nothing
     ELSE replace ai in h by the next more general constraint that is satisfied by x
3. Output hypothesis h
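A runnable sketch of Find-S under the same tuple encoding ("0" for Ø, "?" for don't care), applied to Mitchell's EnjoySport examples; the function name find_s is an assumption for illustration:

```python
def find_s(examples, n_attrs=6):
    """Start from the most specific hypothesis; minimally generalize on positives."""
    h = ["0"] * n_attrs                       # <Ø, Ø, ..., Ø>
    for x, positive in examples:
        if not positive:
            continue                          # Find-S ignores negative examples
        for i in range(n_attrs):
            if h[i] == "0":
                h[i] = x[i]                   # first positive: adopt its values
            elif h[i] != x[i]:
                h[i] = "?"                    # disagreement: generalize to don't care
    return tuple(h)

# EnjoySport training data (Mitchell, Chapter 2)
D = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High", "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High", "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High", "Strong", "Cool", "Change"), True),
]
print(find_s(D))  # ('Sunny', 'Warm', '?', 'Strong', '?', '?')
```

The search trace on the next slide follows exactly this run: h1 through h4 are the successive values of h.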

Hypothesis Space Search by Find-S
Instances X (training examples)
– x1 = ⟨Sunny, Warm, Normal, Strong, Warm, Same⟩, +
– x2 = ⟨Sunny, Warm, High, Strong, Warm, Same⟩, +
– x3 = ⟨Rainy, Cold, High, Strong, Warm, Change⟩, -
– x4 = ⟨Sunny, Warm, High, Strong, Cool, Change⟩, +
Hypotheses H (search trace)
– h0 = ⟨Ø, Ø, Ø, Ø, Ø, Ø⟩
– h1 = ⟨Sunny, Warm, Normal, Strong, Warm, Same⟩
– h2 = h3 = ⟨Sunny, Warm, ?, Strong, Warm, Same⟩
– h4 = ⟨Sunny, Warm, ?, Strong, ?, ?⟩
Shortcomings of Find-S
– Can't tell whether it has learned the concept
– Can't tell when the training data are inconsistent
– Picks a maximally specific h (why?)
– Depending on H, there might be several!

Version Spaces
Definition: Consistent Hypotheses
– A hypothesis h is consistent with a set of training examples D of target concept c if and only if h(x) = c(x) for each training example ⟨x, c(x)⟩ in D.
– Consistent(h, D) ≡ ∀ ⟨x, c(x)⟩ ∈ D . h(x) = c(x)
Definition: Version Space
– The version space VS_H,D, with respect to hypothesis space H and training examples D, is the subset of hypotheses from H consistent with all training examples in D.
– VS_H,D ≡ { h ∈ H | Consistent(h, D) }
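These definitions translate directly into code. Below is a minimal sketch of Consistent(h, D) plus the List-Then-Eliminate algorithm mentioned in the Terminology slide, which enumerates H outright and is feasible only for small hypothesis spaces. The attribute domains are an assumption following Mitchell's EnjoySport setup, and hypotheses containing Ø are omitted since none can be consistent once a positive example appears.

```python
from itertools import product

def satisfies(h, x):  # as in the representation sketch above
    return all(hi == "?" or hi == xi for hi, xi in zip(h, x))

def consistent(h, D):
    """Consistent(h, D): h(x) = c(x) for every training example (x, c(x)) in D."""
    return all(satisfies(h, x) == label for x, label in D)

def list_then_eliminate(domains, D):
    """Version space by brute force: keep every enumerated h consistent with D."""
    H = product(*[tuple(values) + ("?",) for values in domains])
    return [h for h in H if consistent(h, D)]

# Assumed attribute domains for EnjoySport:
domains = [("Sunny", "Rainy"), ("Warm", "Cold"), ("Normal", "High"),
           ("Strong", "Weak"), ("Warm", "Cool"), ("Same", "Change")]
# With the four training examples D from the Find-S sketch above:
# len(list_then_eliminate(domains, D)) == 6
```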

Candidate Elimination Algorithm [1]
1. Initialization
   G ← (singleton) set containing the most general hypothesis in H, denoted {⟨?, ?, ?, ?, ?, ?⟩}
   S ← set of most specific hypotheses in H, denoted {⟨Ø, Ø, Ø, Ø, Ø, Ø⟩}
2. For each training example d:
   If d is a positive example (Update-S):
     Remove from G any hypotheses inconsistent with d
     For each hypothesis s in S that is not consistent with d:
       Remove s from S
       Add to S all minimal generalizations h of s such that
         1. h is consistent with d, and
         2. some member of G is more general than h
       (These are the greatest lower bounds, or meets, s ∧ d, in VS_H,D)
     Remove from S any hypothesis that is more general than another hypothesis in S (remove any dominated elements)

Candidate Elimination Algorithm [2]
(continued)
   If d is a negative example (Update-G):
     Remove from S any hypotheses inconsistent with d
     For each hypothesis g in G that is not consistent with d:
       Remove g from G
       Add to G all minimal specializations h of g such that
         1. h is consistent with d, and
         2. some member of S is more specific than h
       (These are the least upper bounds, or joins, g ∨ d, in VS_H,D)
     Remove from G any hypothesis that is less general than another hypothesis in G (remove any dominating elements)
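The two updates above combine into the following compact sketch, specialized to conjunctions over nominal attributes under the tuple encoding used earlier ("0" for Ø, "?" for don't care). The names candidate_elimination, covers, and more_general_or_equal are assumptions for illustration, not code from the course.

```python
def covers(gi, hi):
    # Constraint gi is at least as general as constraint hi ('0' accepts nothing).
    return gi == "?" or gi == hi or hi == "0"

def more_general_or_equal(g, h):
    """True iff hypothesis g is equal to or more general than hypothesis h."""
    return all(covers(gi, hi) for gi, hi in zip(g, h))

def satisfies(h, x):
    return all(hi == "?" or hi == xi for hi, xi in zip(h, x))

def candidate_elimination(examples, domains):
    n = len(domains)
    S = {("0",) * n}                 # most specific boundary
    G = {("?",) * n}                 # most general boundary
    for x, positive in examples:
        if positive:                 # Update-S
            G = {g for g in G if satisfies(g, x)}
            for s in list(S):
                if satisfies(s, x):
                    continue
                S.remove(s)
                # The unique minimal generalization of a conjunction:
                s_new = tuple(xi if si == "0" else (si if si == xi else "?")
                              for si, xi in zip(s, x))
                if any(more_general_or_equal(g, s_new) for g in G):
                    S.add(s_new)
            S = {s for s in S        # drop elements more general than another in S
                 if not any(t != s and more_general_or_equal(s, t) for t in S)}
        else:                        # Update-G
            S = {s for s in S if not satisfies(s, x)}
            for g in list(G):
                if not satisfies(g, x):
                    continue
                G.remove(g)
                # Minimal specializations: fill one '?' with a value != x[i].
                for i, gi in enumerate(g):
                    if gi != "?":
                        continue
                    for v in domains[i]:
                        if v != x[i]:
                            h = g[:i] + (v,) + g[i + 1:]
                            if any(more_general_or_equal(h, s) for s in S):
                                G.add(h)
            G = {g for g in G        # drop elements less general than another in G
                 if not any(t != g and more_general_or_equal(t, g) for t in G)}
    return S, G
```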

Example Trace
– S0 = {⟨Ø, Ø, Ø, Ø, Ø, Ø⟩}; G0 = {⟨?, ?, ?, ?, ?, ?⟩}
– d1 = ⟨Sunny, Warm, Normal, Strong, Warm, Same⟩, +:
  S1 = {⟨Sunny, Warm, Normal, Strong, Warm, Same⟩}; G1 = G0
– d2 = ⟨Sunny, Warm, High, Strong, Warm, Same⟩, +:
  S2 = {⟨Sunny, Warm, ?, Strong, Warm, Same⟩}; G2 = G1
– d3 = ⟨Rainy, Cold, High, Strong, Warm, Change⟩, -:
  S3 = S2; G3 = {⟨Sunny, ?, ?, ?, ?, ?⟩, ⟨?, Warm, ?, ?, ?, ?⟩, ⟨?, ?, ?, ?, ?, Same⟩}
– d4 = ⟨Sunny, Warm, High, Strong, Cool, Change⟩, +:
  S4 = {⟨Sunny, Warm, ?, Strong, ?, ?⟩}; G4 = {⟨Sunny, ?, ?, ?, ?, ?⟩, ⟨?, Warm, ?, ?, ?, ?⟩}
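Assuming the candidate_elimination sketch and the EnjoySport data D and domains defined in the earlier sketches, the final boundary sets reproduce this trace:

```python
# Final boundaries after processing d1..d4 (D and domains as defined above).
S, G = candidate_elimination(D, domains)
print(sorted(S))  # [('Sunny', 'Warm', '?', 'Strong', '?', '?')]
print(sorted(G))  # [('?', 'Warm', '?', '?', '?', '?'), ('Sunny', '?', '?', '?', '?', '?')]
```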

Summary Points
Taxonomies of learning
Definition of learning: task, performance measure, experience
Concept learning as search through H
– Hypothesis space H as a state space
– Learning: finding the correct hypothesis
General-to-specific ordering over H
– Partially ordered set: Less-Specific-Than (More-General-Than) relation
– Upper and lower bounds in H
Version space candidate elimination algorithm
– S and G boundaries characterize the learner's uncertainty
– Version space can be used to make predictions over unseen cases
Learner can generate useful queries
Next Tuesday: when and why are inductive leaps possible?

Terminology
Supervised learning
– Concept: function from observations to categories (so far, Boolean-valued: +/-)
– Target (function): true function f
– Hypothesis: proposed function h believed to be similar to f
– Hypothesis space: space of all hypotheses that can be generated by the learning system
– Example: tuples of the form ⟨x, f(x)⟩
– Instance space (aka example space): space of all possible examples
– Classifier: discrete-valued function whose range is a set of class labels
The version space algorithm
– Algorithms: Find-S, List-Then-Eliminate, candidate elimination
– Consistent hypothesis: one that correctly predicts observed examples
– Version space: space of all currently consistent (or satisfiable) hypotheses
Inductive learning
– Inductive generalization: process of generating hypotheses that describe cases not yet observed
– The inductive learning hypothesis