Knowledge Representation and Reasoning Learning Sets of Rules and Analytical Learning Harris Georgiou – 4.


Learning Sets of Rules
What are Inductive Learning Methods (ILM)?
What is the concept of Analytical Learning?
What is Explanation-Based Learning (EBL)?
When do we use each of them?
How can the two be combined?
Which one is more human-like AI?

IF-THEN Rules
IF-THEN rules model exactly our own prior knowledge and reasoning at a high level
In generalized form they use variables, as Predicate Calculus does
Most common case: first-order Horn clauses, used in Inductive Logic Programming (ILP)
Algorithms are needed to implement the induction process (cf. resolution in PROLOG)
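As a minimal illustration of the Horn-clause representation, a rule with variables can be checked against a set of ground facts. This is only a sketch: the `match`/`covers` helpers and the family example are invented here, not part of the slides.

```python
# A first-order Horn-clause rule checked against ground facts.
# Atoms are (predicate, args) tuples; variables are capitalized strings.

def is_var(t):
    return isinstance(t, str) and t[0].isupper()

def match(atom, fact, binding):
    """Try to extend `binding` so that `atom` matches the ground `fact`."""
    pred_a, args_a = atom
    pred_f, args_f = fact
    if pred_a != pred_f or len(args_a) != len(args_f):
        return None
    b = dict(binding)
    for a, f in zip(args_a, args_f):
        if is_var(a):
            if a in b and b[a] != f:
                return None
            b[a] = f
        elif a != f:
            return None
    return b

def covers(rule, facts, example):
    """Does rule (head :- body) cover `example`, given ground `facts`?"""
    head, body = rule
    b0 = match(head, example, {})
    if b0 is None:
        return False
    bindings = [b0]
    for lit in body:  # conjunctive body: every literal must hold
        bindings = [b2 for b in bindings for f in facts
                    if (b2 := match(lit, f, b)) is not None]
    return len(bindings) > 0

# granddaughter(X, Y) :- father(Y, Z), father(Z, X), female(X).
# where father(a, b) means "a is the father of b"
rule = (("granddaughter", ("X", "Y")),
        [("father", ("Y", "Z")), ("father", ("Z", "X")), ("female", ("X",))])
facts = [("father", ("victor", "bob")),
         ("father", ("bob", "sharon")),
         ("female", ("sharon",))]
```

With these facts, `covers(rule, facts, ("granddaughter", ("sharon", "victor")))` holds, while the same query for `bob` fails on the `female` test.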

Induction Algorithms
Sequential Covering Algorithms:
– A "Learn-One-Rule" subroutine learns one rule at a time; each new rule restricts the remaining samples/atoms to those satisfying the conditions enforced by the rules so far
General-to-Specific Beam Search (CN2):
– Works like a tree search, organizing hypotheses from general to specific and extending the most promising candidates (a beam)
Simultaneous Covering Algorithms (ID3):
– Search simultaneously all the alternatives that can be created by applying any one of the available attributes
The choice between CN2-like and ID3-like systems is problem-dependent and not exclusive
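The sequential-covering loop can be sketched for simple attribute-value data. This is a toy version: a greedy Learn-One-Rule (beam width 1) with a precision heuristic; all names and the weather-style data are illustrative, not a faithful CN2.

```python
# Sequential covering: learn one rule, remove the positives it covers, repeat.

def rule_covers(rule, x):
    """A rule is a conjunction of (attribute, value) tests."""
    return all(x.get(a) == v for a, v in rule)

def learn_one_rule(examples, attrs, target=True):
    rule, pool = [], list(examples)
    while True:
        pos = [x for x in pool if x["label"] == target]
        neg = [x for x in pool if x["label"] != target]
        if not neg or not pos:  # rule is pure (or hopeless): stop specializing
            return rule
        # greedily add the test with the best precision on still-covered examples
        best, best_prec = None, -1.0
        for a in attrs:
            for v in {x[a] for x in pool}:
                cov = [x for x in pool if x[a] == v]
                prec = sum(x["label"] == target for x in cov) / len(cov)
                if prec > best_prec:
                    best, best_prec = (a, v), prec
        rule.append(best)
        pool = [x for x in pool if rule_covers([best], x)]

def sequential_covering(examples, attrs):
    rules, remaining = [], list(examples)
    while any(x["label"] for x in remaining):  # positives left to cover
        r = learn_one_rule(remaining, attrs)
        rules.append(r)
        remaining = [x for x in remaining
                     if not (rule_covers(r, x) and x["label"])]
    return rules
```

On four toy examples where `sky == "sunny"` exactly separates the positives, the loop learns the single rule `[("sky", "sunny")]` and stops.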

FOIL Algorithm
First-order Horn clauses are a powerful yet simple way to describe general rules
FOIL (Quinlan, 1990): uses a restricted form of Horn clauses and works like the sequential covering algorithms
FOIL examples: learning Quicksort and chess concepts from large sets of examples
In practice, algorithms like FOIL implement induction by generalizing from examples
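FOIL chooses which literal to add to a clause body with an information-based gain measure over variable bindings. The sketch below renders that heuristic from the usual binding counts; the bookkeeping of how bindings are enumerated is simplified away.

```python
import math

def foil_gain(p0, n0, p1, n1, t):
    """FOIL's gain for adding a literal to a clause.
    p0, n0: positive/negative bindings covered before adding the literal
    p1, n1: positive/negative bindings covered after
    t:      positive bindings that remain covered after the addition
    """
    if p1 == 0:
        return 0.0
    info_before = -math.log2(p0 / (p0 + n0))
    info_after = -math.log2(p1 / (p1 + n1))
    return t * (info_before - info_after)
```

For example, a literal that keeps 5 of 10 positive bindings but eliminates all 10 negative ones yields a gain of `5 * (1 - 0) = 5` bits.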

Inductive Learning Methods (ILM)
Decision Tree Learning (DTL)
Artificial Neural Networks (ANN)
Inductive Logic Programming (ILP)
Genetic Programming (GP)
...
=> They are all based on statistical generalization and reasoning

Analytical Learning
Inductive learning methods (ANN, DTL) are based on "generalization by examples" algorithms
Analytical learning uses prior knowledge and deductive reasoning to "augment" the information in the examples
Explanation-Based Learning (EBL): prior knowledge is used to analyze/explain how each observed training example satisfies the target concept. Generalization here is based on logical rather than statistical reasoning.

Explanation-Based Learning
Generic prior knowledge may be included for "explaining" the examples
In this case, the "complexity" of the hypothesis (input) space can be reduced drastically
Alternatively, far fewer examples are needed
Example (chess):
– concept: "black loses its Queen in one move"
– statistical learning requires seeing every possible board setup
– explanation-based learning generalizes from a few simple examples
– result: "black loses its Queen if its King is in check"
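The chess example can be caricatured in code: explain one observed position using a domain theory, then keep only the proof's operational leaves as the generalized rule, discarding every feature the explanation never touched. The propositional "chess" theory below is entirely invented for illustration.

```python
# A toy EBL step: backward-chain through a domain theory and collect the
# operational (directly observable) conditions that the proof rests on.

# Hypothetical domain theory: goal -> conjunctive body.
theory = {
    "loses_queen": ["king_in_check", "queen_is_only_defense"],
    "king_in_check": ["attacked_by_rook"],
}

def explain(goal, facts, theory):
    """Return the operational leaf conditions of one proof of `goal`,
    or None if the example cannot be explained by the theory."""
    if goal in facts:          # operational: observable in the example itself
        return [goal]
    body = theory.get(goal)
    if body is None:
        return None
    leaves = []
    for sub in body:           # all conjuncts must be explained
        sub_leaves = explain(sub, facts, theory)
        if sub_leaves is None:
            return None
        leaves += sub_leaves
    return leaves

# One observed position, with many irrelevant features:
position = {"attacked_by_rook", "queen_is_only_defense", "pawn_on_e4"}
```

Here `explain("loses_queen", position, theory)` keeps only the two conditions the proof actually used ("pawn_on_e4" is dropped), which is exactly the sense in which EBL shrinks the hypothesis space.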

Explanation-Based Learning
Basic limitation of EBL: the learner's information and assertions are assumed to be 100% correct!
If they are not, more weight must be given to statistical learning (ILM) to avoid being misled
PROLOG-EBG: explanation-based generalization
Translates new positive examples into generalized hypotheses that cover the entire training set
Represents hypotheses as Horn clauses rather than attribute-value pairs

Comparison of ILM and EBL

LEARNING        Inductive (ILM)                   Analytical (EBL)
Goal:           Hypothesis fits the data          Hypothesis fits the domain theory
Justification:  Statistical inference             Deductive inference
Advantages:     Requires little prior knowledge   Learns from scarce data
Pitfalls:       Scarce data, incorrect bias       Imperfect domain theory

Combining ILM and EBL
Use prior knowledge to:
derive an initial hypothesis from which to begin the search (example: KBANN)
alter the objective of the hypothesis search (example: EBNN)
alter the available search steps by applying multiple "revisions" (example: FOCL)
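The KBANN idea of "deriving an initial hypothesis from prior knowledge" can be sketched as compiling a propositional rule into the initial weights of a sigmoid unit, so the unit starts out behaving like the rule's AND and is then refined from data. The weight value `W` and the rule used below are illustrative, not KBANN's exact scheme.

```python
import math

W = 4.0  # large positive weight for each (non-negated) antecedent

def rule_to_unit(antecedents):
    """Return (weights, bias) so a sigmoid unit fires (> 0.5) only when
    all antecedents are true: an AND gate, as in KBANN's initialization."""
    n = len(antecedents)
    weights = {a: W for a in antecedents}
    bias = -(n - 0.5) * W  # threshold just below n * W
    return weights, bias

def unit_output(weights, bias, inputs):
    """Sigmoid unit over binary inputs (missing inputs count as 0)."""
    s = bias + sum(w * inputs.get(a, 0) for a, w in weights.items())
    return 1.0 / (1.0 + math.exp(-s))
```

A two-antecedent rule yields a unit that outputs above 0.5 only when both inputs are on; gradient training can then move these weights away from the pure rule when the data disagrees with the theory.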

Food for Thought
When is ILM better than EBL?
When is EBL better than ILM?
Which one should be used in designing a computer chess player?
Which one should be used in designing a computer medical assistant?
Which one is more similar to the human way of thinking and problem-solving?

Readings
Tom Mitchell, "Machine Learning", McGraw-Hill, 1997. [see: ch.10, ch.11, ch.12]
S. J. Russell, P. Norvig, "Artificial Intelligence: A Modern Approach", 2nd Ed., Prentice Hall, 2003.