Learning Qualitative Models. Ivan Bratko, Dorian Šuc. Presented by Cem Dilmegani. Feel free to ask questions during the presentation.

Summary
● Understand the QUIN algorithm
● Explore the crane example
● Analyze learning of models expressed as QDEs (qualitative differential equations)
  ○ GENMODEL by Coiera
  ○ QSI by Say and Kuru
  ○ QOPH by Coghill et al.
  ○ ILP systems
● Conclusion
  ○ Applications
  ○ Further progress

Modeling
● Modeling is complex
● Modeling requires creativity
● Solution: use machine learning algorithms for modeling

Learning
[Diagram: examples are fed to a learning algorithm, which produces a hypothesis]

Decision Tree

Decision Tree Algorithm

QUIN (QUalitative INduction)
● Looks for qualitative patterns in quantitative data
● Uses so-called qualitative trees

Qualitative tree
● The splits define a partition of the attribute space into areas with common qualitative behaviour of the class variable
● Qualitatively constrained functions (QCFs) in the leaves define qualitative constraints on the class variable

Qualitatively constrained functions (QCFs)
● A QCF, written e.g. Z = M+,-(X, Y), gives one sign per attribute
● The sign of the i-th attribute only states that when that attribute increases, the class value changes in the direction given by that sign, all other attributes being equal
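To make the notation concrete, here is a minimal Python sketch (an illustration, not QUIN code) of a QCF such as Z = M+,-(X, Y): the QCF is encoded by one sign per attribute, and each attribute's qualitative change predicts a qualitative change of the class.

```python
# Minimal sketch of a QCF such as Z = M+,-(X, Y): Z increases monotonically
# with X and decreases monotonically with Y, all else being equal.
# Qualitative changes are encoded as +1 (inc), -1 (dec) and 0 (no change).

def qcf_predictions(signs, attr_changes):
    """Qualitative change of the class predicted by each attribute:
    a '+' attribute predicts the same direction as its own change,
    a '-' attribute predicts the opposite direction."""
    return [s * d for s, d in zip(signs, attr_changes)]

signs = [+1, -1]                              # Z = M+,-(X, Y)
print(qcf_predictions(signs, [+1, -1]))       # [1, 1]: both attributes predict Z increases
print(qcf_predictions(signs, [0, 0]))         # [0, 0]: no attribute predicts any change
```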

Qualitative Tree Example

Explanation of the Algorithm (Leaf Level)
● At each leaf, the minimal-cost QCF is sought
● Cost = cost of encoding the QCF M + cost of the inconsistencies or ambiguities between the dataset and the QCF

Consistency
● A QCV (qualitative change vector) is consistent with a QCF if either:
  a) the class qualitative change is zero,
  b) all attribute QCF-predictions are zero, or
  c) there exists an attribute whose QCF-prediction equals the class' qualitative change
● Examples for Z = M+,-(X, Y), written as class change = (change of X, change of Y):
  a) no change = (inc, dec)
  a) no change = (inc, inc)
  b) * = (no change, no change)   (* = any class change)
  c) inc = (inc, dec)

Ambiguity
● A qualitative ambiguity appears:
  a) when there exist both positive and negative QCF-predictions, or
  b) whenever all QCF-predictions are zero
● Examples for Z = M+,-(X, Y):
  a) * = (inc, inc)
  b) * = (no change, no change)
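The consistency and ambiguity conditions on the last two slides are easy to state in code. The following Python sketch is my own illustration (not from QUIN): a QCF is given by its sign vector, qualitative changes are encoded as +1, -1 and 0, and the two functions implement conditions (a)-(c) for consistency and (a)-(b) for ambiguity.

```python
# Toy check of the consistency and ambiguity conditions for a QCF given by its
# sign vector, e.g. Z = M+,-(X, Y) -> signs = [+1, -1] (illustration only).
# Qualitative changes: +1 (inc), -1 (dec), 0 (no change).

def predictions(signs, attr_changes):
    # Qualitative change of the class predicted by each attribute.
    return [s * d for s, d in zip(signs, attr_changes)]

def is_consistent(signs, class_change, attr_changes):
    preds = predictions(signs, attr_changes)
    return (class_change == 0                  # (a) class change is zero
            or all(p == 0 for p in preds)      # (b) all QCF-predictions are zero
            or class_change in preds)          # (c) some attribute agrees with the class

def is_ambiguous(signs, attr_changes):
    preds = predictions(signs, attr_changes)
    return ((+1 in preds and -1 in preds)      # (a) opposing QCF-predictions
            or all(p == 0 for p in preds))     # (b) all QCF-predictions are zero

signs = [+1, -1]                               # Z = M+,-(X, Y)
print(is_consistent(signs, +1, [+1, -1]))      # True: both attributes predict inc
print(is_consistent(signs, +1, [-1, -1]))      # True: predictions are -1 and +1, ambiguous but consistent
print(is_ambiguous(signs, [+1, +1]))           # True: predictions are +1 and -1
```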

Ambiguity-Inconsistency

Explanation of the Algorithm (Tree Level)
● Start with the QCF that minimizes the cost on a single attribute, then use the error-cost to decide whether refining the current QCF with another attribute pays off
● Tree-level algorithm: QUIN chooses the best split by comparing the partitions of the examples it generates. For every possible split, it divides the examples into two subsets, finds the minimal-cost QCF in each subset, and selects the split that minimizes the tree error-cost. This is repeated recursively until a specified error bound is reached.
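To make the tree-level loop concrete, here is a simplified Python sketch (my own illustration, not QUIN itself). It assumes a helper min_qcf_cost(examples) returning the error-cost of the best QCF for a set of examples, which stands in for QUIN's actual cost measure; stopping criteria and the handling of attributes are simplified.

```python
# Simplified sketch of QUIN's tree-level split selection (illustration only).
# `examples` is a list of (attribute_vector, class_value) pairs.
# `min_qcf_cost(examples)` is an assumed helper returning the error-cost of the
# best (minimal-cost) QCF for those examples -- a stand-in for QUIN's measure.

def candidate_splits(examples, n_attrs):
    """All (attribute index, threshold) pairs, thresholds taken midway
    between adjacent observed values of each attribute."""
    for i in range(n_attrs):
        values = sorted({x[i] for x, _ in examples})
        for lo, hi in zip(values, values[1:]):
            yield i, (lo + hi) / 2.0

def best_split(examples, n_attrs, min_qcf_cost):
    """The split that minimizes the tree error-cost of its two subsets."""
    best = None
    for attr, thr in candidate_splits(examples, n_attrs):
        left = [e for e in examples if e[0][attr] <= thr]
        right = [e for e in examples if e[0][attr] > thr]
        if not left or not right:
            continue
        cost = min_qcf_cost(left) + min_qcf_cost(right)
        if best is None or cost < best[0]:
            best = (cost, attr, thr)
    return best   # (cost, attribute index, threshold), or None if no split exists

# Tiny usage example with a dummy cost (range of the class values in a subset):
data = [((0.0, 1.0), 1.0), ((1.0, 1.5), 2.0), ((2.0, 0.5), 1.5), ((3.0, 0.2), 0.5)]
dummy_cost = lambda ex: max(c for _, c in ex) - min(c for _, c in ex)
print(best_split(data, 2, dummy_cost))
```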

Qualitative Reverse Engineering
● In industry, there exist libraries of designs and corresponding simulation models that are not well documented
● We may have to reverse-engineer complex simulations to understand how they function
● Similar in spirit to QSI

Crane Simulation

QUIN Approach
● Looks counterintuitive?
● Yes, but it outperforms straightforward transformations of quantitative data into a quantitative model, such as regression

Identification of the Operator's Skill
● The skill cannot be elicited from the operator verbally (Bratko and Urbancic 1999)
● The skill is manifested in the operator's actions; QUIN explains such skills better than quantitative models do

Comparison of two operators: S (slow) and L (adventurous)

Explanation of S's Strategy
● At the beginning, V increases as X increases (the load is behind the crane)
● Later, V decreases as X increases (the load gradually moves ahead of the crane)
● V increases as the angle increases (the crane catches up with the load)

GENMODEL by Coiera
● Like QSI, but without hidden variables
● Algorithm (a toy sketch follows below):
  ○ Construct all possible constraints over the observed variables
  ○ Evaluate each constraint against the observed states
  ○ Retain the constraints that are satisfied by all states; discard all others
  ○ The retained constraints form the model
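The generate-and-test loop above can be sketched in a few lines. The Python snippet below is a toy illustration (not Coiera's implementation): it uses hypothetical U-tube observations, considers only one family of candidate constraints (monotonic M+/M- relations between pairs of observed variables, judged on their directions of change), and keeps the constraints satisfied in every observed state; the real GENMODEL works with the full QSIM constraint vocabulary.

```python
# Toy generate-and-test loop in the spirit of GENMODEL (illustration only).
# Each observed state records the qualitative direction of change of every
# variable: +1 (increasing), -1 (decreasing) or 0 (steady). The only candidate
# constraint family here is M+/M- between pairs of variables; the real system
# uses the full QSIM constraint vocabulary.

from itertools import combinations

states = [                                   # hypothetical U-tube observations
    {"level_a": -1, "level_b": +1, "flow": -1},
    {"level_a": -1, "level_b": +1, "flow": -1},
    {"level_a":  0, "level_b":  0, "flow":  0},
]

def m_plus_ok(state, x, y):                  # directions of change agree
    return state[x] == state[y]

def m_minus_ok(state, x, y):                 # directions of change oppose
    return state[x] == -state[y]

candidates = []
for x, y in combinations(states[0], 2):      # all pairs of observed variables
    candidates.append((f"M+({x},{y})", lambda s, x=x, y=y: m_plus_ok(s, x, y)))
    candidates.append((f"M-({x},{y})", lambda s, x=x, y=y: m_minus_ok(s, x, y)))

# Retain the constraints satisfied by every observed state: they form the model.
model = [name for name, holds in candidates if all(holds(s) for s in states)]
print(model)
```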

GENMODEL by Coiera
● Limitations:
  ○ Assumes that all variables are observed
  ○ Biased towards the most specific models (overfitting)
  ○ Does not support operating regions

QSI by Say and Kuru
● Explained last week
● Algorithm:
  ○ Starts like GENMODEL
  ○ Constructs new variables if needed
● Limitations:
  ○ Biased towards the most specific model

Negative Examples
● Consider the U-tube example:
  ○ Water is conserved until the second tube bursts or overflows
  ○ There cannot be a negative amount of water in a container
● Evaporation?

Inductive Logic Programming (ILP)
● ILP is a machine learning approach that uses techniques of logic programming
● From a database of facts divided into positive and negative examples, an ILP system tries to derive a logic program that proves all of the positive and none of the negative examples (a toy illustration of this criterion follows below)
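As a toy illustration of that acceptance criterion (not a real ILP system, which searches over logic-program clauses), the sketch below represents candidate hypotheses as Python predicates over hypothetical examples and keeps the ones that cover every positive and no negative example.

```python
# Toy illustration of the ILP acceptance criterion (not a real ILP system):
# a hypothesis is accepted if it proves all positive and none of the negative
# examples. Hypotheses are plain Python predicates here; a real ILP system
# searches the space of logic-program clauses instead.

positives = [(2, 4), (3, 6), (5, 10)]        # hypothetical examples of a relation over (X, Y)
negatives = [(2, 5), (3, 3), (4, 10)]

hypotheses = {
    "Y = X + 2": lambda x, y: y == x + 2,
    "Y = 2 * X": lambda x, y: y == 2 * x,
    "Y > X":     lambda x, y: y > x,
}

accepted = [name for name, h in hypotheses.items()
            if all(h(*e) for e in positives) and not any(h(*e) for e in negatives)]
print(accepted)                              # ['Y = 2 * X']
```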

Inductive Logic Programming (ILP)
● Advantages:
  ○ No need to create a new program: uses an established framework
  ○ Hidden variables can be introduced
  ○ Can learn models with multiple operating regions as well

Applications
● A German car manufacturer simplified their wheel suspension system with QUIN
● Induction of patient-specific models from patients' measured cardiovascular signals using GENMODEL
● An ILP-based learning system (QuMAS) learnt the electrical system of the heart and is able to explain many types of cardiac arrhythmias

Suggestions for Further Progress
● Better methods for transforming numerical data into qualitative data
● Deeper study of the principles and heuristics associated with the discovery of hidden variables
● More effective use of general ILP techniques

Sources
● Dorian Šuc and Ivan Bratko, “Qualitative Induction”
● Ethem Alpaydin, “Introduction to Machine Learning”, MIT Press
● Wikipedia

Any Questions?