1 Concept Learning By Dong Xu State Key Lab of CAD&CG, ZJU

2 Overview
- Introduction
- Perspective
- Algorithms
- Remarks
- Inductive Bias
- Conclusions

3 Introduction
- What is concept learning?
  – Inducing a boolean-valued function from a sample of positive and negative training examples
- Concept learning in daily life
  – Judging whether a suspect is guilty from witness testimony and physical evidence
  – Deciding whether to hire a candidate from written tests and interviews
  – Any more?

4 A Demo Task – EnjoySport
- Given:
  – Instances X: possible days, each described by the attributes
    Sky (Sunny, Cloudy, Rainy), Temp (Warm, Cold), Humidity (Normal, High),
    Wind (Strong, Weak), Water (Warm, Cool), Forecast (Same, Change)
  – Hypotheses H: each hypothesis is a conjunction of constraints, one per
    attribute. A constraint may be "?" (any value), "∅" (no value is
    acceptable), or a specific value.
- Determine:
  – A hypothesis h in H such that h(x) = c(x) for all x in X
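A hypothesis of this form can be checked against an instance in a few lines of Python. This is a minimal sketch: the tuple encoding and the function name `matches` are my own, with the string "0" standing in for the empty constraint ∅.

```python
# Hypotheses are 6-tuples of constraints: "?" (any value), "0" (no value
# is acceptable, i.e. the empty constraint), or a specific attribute value.
# Attribute order: Sky, Temp, Humidity, Wind, Water, Forecast.

def matches(h, x):
    """Return True if hypothesis h classifies instance x as positive."""
    return all(c == "?" or c == v for c, v in zip(h, x))

h = ("Sunny", "Warm", "?", "Strong", "?", "?")
x = ("Sunny", "Warm", "High", "Strong", "Cool", "Change")
print(matches(h, x))           # True: each constraint is "?" or equals x's value
print(matches(("0",) * 6, x))  # False: the empty constraint rejects everything
```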

5 EnjoySport Training Data

ID  Sky    Temp  Humidity  Wind    Water  Forecast  Enjoy
1   Sunny  Warm  Normal    Strong  Warm   Same      Yes
2   Sunny  Warm  High      Strong  Warm   Same      Yes
3   Rainy  Cold  High      Strong  Warm   Change    No
4   Sunny  Warm  High      Strong  Cool   Change    Yes

6 The Inductive Learning Hypothesis
- Any hypothesis found to approximate the target function well over a sufficiently large set of training examples will also approximate the target function well over other unobserved examples.
- In other words: infer the unknown from the known, on the assumption that the known obeys some regularity.

7 Perspective
- Concept learning can be formulated as a search through a predefined space of potential hypotheses for the hypothesis that best fits the training examples.
- General-to-specific ordering
  – Example: <Sunny, ?, ?, ?, ?, ?> ≥g <Sunny, Warm, ?, ?, ?, ?>
  – Introduces a partial order on the hypothesis space, which enables efficient search strategies.
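The ≥g relation can be sketched directly: h1 is more general than or equal to h2 when every instance satisfying h2 also satisfies h1. The encoding below is my own ("?" for any value, "0" for the empty constraint ∅).

```python
def more_general_or_equal(h1, h2):
    """h1 >=g h2: every instance covered by h2 is also covered by h1."""
    return all(c1 == "?" or c2 == "0" or c1 == c2
               for c1, c2 in zip(h1, h2))

# <Sunny, ?, ?, ?, ?, ?> is at least as general as <Sunny, Warm, ?, ?, ?, ?> ...
print(more_general_or_equal(("Sunny", "?", "?", "?", "?", "?"),
                            ("Sunny", "Warm", "?", "?", "?", "?")))  # True
# ... but not the other way around: the ordering is only partial.
print(more_general_or_equal(("Sunny", "Warm", "?", "?", "?", "?"),
                            ("Sunny", "?", "?", "?", "?", "?")))     # False
```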

8 Algorithms

Algorithm              Order                 Strategy   Uses examples
FIND-S                 Specific-to-general   Top-down   Positive
LIST-THEN-ELIMINATE    General-to-specific   Bottom-up  Negative
CANDIDATE-ELIMINATION  Bi-directional        Both       Both

9 FIND-S

Training examples:
1. <Sunny, Warm, Normal, Strong, Warm, Same>, EnjoySport = Yes
2. <Sunny, Warm, High, Strong, Warm, Same>, EnjoySport = Yes
3. <Rainy, Cold, High, Strong, Warm, Change>, EnjoySport = No
4. <Sunny, Warm, High, Strong, Cool, Change>, EnjoySport = Yes

h0 = <∅, ∅, ∅, ∅, ∅, ∅>
h1 = <Sunny, Warm, Normal, Strong, Warm, Same>
h2 = <Sunny, Warm, ?, Strong, Warm, Same>
h3 = <Sunny, Warm, ?, Strong, Warm, Same>   (negative example: unchanged)
h4 = <Sunny, Warm, ?, Strong, ?, ?>

Report the most specific hypothesis
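FIND-S on this data can be written as a short, runnable Python sketch (encoding and names are my own, with "0" for the empty constraint ∅): it generalizes the most specific hypothesis just enough to cover each positive example, and ignores negatives.

```python
examples = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]

def find_s(examples, n_attrs=6):
    h = ["0"] * n_attrs  # most specific hypothesis: covers nothing
    for x, positive in examples:
        if not positive:
            continue  # FIND-S ignores negative examples
        for i, v in enumerate(x):
            if h[i] == "0":
                h[i] = v       # first positive example: copy its values
            elif h[i] != v:
                h[i] = "?"     # conflicting value: generalize to "any"
    return tuple(h)

print(find_s(examples))  # ('Sunny', 'Warm', '?', 'Strong', '?', '?')
```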

10 LIST-THEN-ELIMINATE

h0 = <?, ?, ?, ?, ?, ?>
h1 = <Sunny, ?, ?, ?, ?, ?> or <?, Warm, ?, ?, ?, ?> or <?, ?, ?, ?, ?, Same>

Report the most general hypothesis

11 CANDIDATE-ELIMINATION (1)

Training examples:
1. <Sunny, Warm, Normal, Strong, Warm, Same>, EnjoySport = Yes
2. <Sunny, Warm, High, Strong, Warm, Same>, EnjoySport = Yes

S0: { <∅, ∅, ∅, ∅, ∅, ∅> }
S1: { <Sunny, Warm, Normal, Strong, Warm, Same> }
S2: { <Sunny, Warm, ?, Strong, Warm, Same> }
G0, G1, G2: { <?, ?, ?, ?, ?, ?> }

12 CANDIDATE-ELIMINATION (2)

Training example:
3. <Rainy, Cold, High, Strong, Warm, Change>, EnjoySport = No

S2, S3: { <Sunny, Warm, ?, Strong, Warm, Same> }
G2: { <?, ?, ?, ?, ?, ?> }
G3: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?>, <?, ?, ?, ?, ?, Same> }

13 CANDIDATE-ELIMINATION (3)

Training example:
4. <Sunny, Warm, High, Strong, Cool, Change>, EnjoySport = Yes

S3: { <Sunny, Warm, ?, Strong, Warm, Same> }
S4: { <Sunny, Warm, ?, Strong, ?, ?> }
G3: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?>, <?, ?, ?, ?, ?, Same> }
G4: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }

14 Final Version Space

S4: { <Sunny, Warm, ?, Strong, ?, ?> }
G4: { <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?> }

Report the version space – all hypotheses between the S and G boundaries:
<Sunny, Warm, ?, Strong, ?, ?>, <Sunny, ?, ?, Strong, ?, ?>, <Sunny, Warm, ?, ?, ?, ?>, <?, Warm, ?, Strong, ?, ?>, <Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?>
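The trace on the preceding slides can be reproduced with a Python sketch of CANDIDATE-ELIMINATION for conjunctive hypotheses. All names and the encoding ("?" any value, "0" the empty constraint ∅) are my own, and the boundary-set minimality pruning is omitted for brevity (this data never needs it), so this is an illustrative sketch rather than the complete algorithm.

```python
# Attribute domains from the EnjoySport task definition.
DOMAINS = [
    ["Sunny", "Cloudy", "Rainy"],   # Sky
    ["Warm", "Cold"],               # Temp
    ["Normal", "High"],             # Humidity
    ["Strong", "Weak"],             # Wind
    ["Warm", "Cool"],               # Water
    ["Same", "Change"],             # Forecast
]

def matches(h, x):
    return all(c == "?" or c == v for c, v in zip(h, x))

def more_general_or_equal(h1, h2):
    return all(c1 == "?" or c2 == "0" or c1 == c2 for c1, c2 in zip(h1, h2))

def min_generalization(h, x):
    # Generalize h just enough to cover positive example x (as in FIND-S).
    return tuple(v if c == "0" else (c if c == v else "?")
                 for c, v in zip(h, x))

def min_specializations(g, x):
    # Specialize g just enough to exclude negative example x.
    return [g[:i] + (v,) + g[i + 1:]
            for i, c in enumerate(g) if c == "?"
            for v in DOMAINS[i] if v != x[i]]

def candidate_elimination(examples):
    S = {("0",) * 6}   # most specific boundary
    G = {("?",) * 6}   # most general boundary
    for x, positive in examples:
        if positive:
            # Drop general hypotheses that reject x; generalize S to cover x,
            # keeping only hypotheses still below some member of G.
            G = {g for g in G if matches(g, x)}
            S = {h for s in S
                 for h in [s if matches(s, x) else min_generalization(s, x)]
                 if any(more_general_or_equal(g, h) for g in G)}
        else:
            # Drop specific hypotheses that cover x; specialize G to reject x,
            # keeping only hypotheses still above some member of S.
            S = {s for s in S if not matches(s, x)}
            G = {h for g in G
                 for h in ([g] if not matches(g, x) else min_specializations(g, x))
                 if any(more_general_or_equal(h, s) for s in S)}
    return S, G

examples = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High", "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High", "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High", "Strong", "Cool", "Change"), True),
]
S, G = candidate_elimination(examples)
print(sorted(S))  # [('Sunny', 'Warm', '?', 'Strong', '?', '?')]
print(sorted(G))  # [('?', 'Warm', '?', '?', '?', '?'), ('Sunny', '?', '?', '?', '?', '?')]
```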

15 Remarks
- Convergence conditions
  – Noise-free training data (no errors)
  – The target concept actually lies in the hypothesis space H
- What training example should the learner request next?
  – One that satisfies exactly half the hypotheses in the current version space
  – Fastest convergence, fewest examples needed, greatest reduction of uncertainty
- How can partially learned concepts be used?
  – Accept: every hypothesis in the version space classifies the instance as positive
  – Reject: every hypothesis classifies it as negative
  – Pending: the hypotheses disagree, so the classification is ambiguous

16 Inductive Bias
- Biased vs. unbiased learners
- The futility of bias-free learning
  – The search space becomes too large
  – Convergence is impossible
  – Rational inference beyond the training data is impossible
- Inductive bias of the CANDIDATE-ELIMINATION algorithm
  – The assumption that the target concept c is contained in the given hypothesis space H
  – Inductive system == deductive system + inductive bias

17 Conclusions
- Concept learning can be cast as a search through a predefined hypothesis space.
- The general-to-specific partial ordering of hypotheses leads to efficient search strategies, such as the CANDIDATE-ELIMINATION algorithm.
- A practical concept learning method must employ an inductive bias; otherwise it can only classify the observed training examples.
- Version spaces and the CANDIDATE-ELIMINATION algorithm provide a useful conceptual framework for studying concept learning. However, their correctness relies on noise-free training examples and on the hypothesis space being able to express the unknown target concept.

18 Thank you