CPSC 322 Introduction to Artificial Intelligence December 1, 2004.


Things... Welcome to December! Kaili’s practice homework assignment has been posted. Is there some law prohibiting the sale of Christmas trees in British Columbia?

Example: Supervised learning of concept. Negative examples help the learning procedure specialize. If the model of an arch is too general (too inclusive), the negative examples will tell the procedure in what ways to make the model more specific. [Diagram: arch-1: two upright blocks (has_part) joined by a must_support link to a sideways block]

Example: Supervised learning of concept. Here comes another near miss. What's the difference between the near miss and the current concept of an arch? [Diagram: the current concept arch-1 shown next to near miss not-arch-3, in which the upright blocks also carry touches links]

Example: Supervised learning of concept. The difference is the existence of the touches links in the near miss. That is, there's no gap between the upright blocks. Since that's the only difference, the supporting blocks in an arch must not touch. [Diagram: the current concept arch-1 shown next to near miss not-arch-3, whose upright blocks carry touches links]

Example: Supervised learning of concept. The program updates its representation to reflect that the touches links between the upright blocks are forbidden. [Diagram: arch-1 updated with a must_not_touch link between the upright blocks, shown next to near miss not-arch-3]

Example: Supervised learning of concept. Because of the second negative example, the concept of the arch is even more specific than before. [Diagram: arch-1: upright blocks (has_part, must_not_touch) with a must_support link to a sideways block]

Example: Supervised learning of concept. Here's yet another training example, but this time it's a positive example. What's the difference between the new positive example and the current concept of an arch? [Diagram: the current concept arch-1 shown next to positive example arch-4, whose top part is a sideways wedge]

Example: Supervised learning of concept. The difference is that the block being supported has a different shape: it's a wedge. So the block being supported can be either a rectangular block or a wedge. The model of the arch is updated accordingly. [Diagram: arch-1 updated so the supported part is a sideways block or wedge, shown next to arch-4]

Example: Supervised learning of concept. Positive examples tell the learning procedure how to make its model more general, so that the model covers more instances. [Diagram: the current arch-1 concept: upright blocks (has_part, must_not_touch) supporting a sideways block or wedge]

Example: Supervised learning of concept. If we take the program out of learning mode and ask it to classify a new input, what happens? [Diagram: the current concept arch-1 shown next to new input maybe-arch-5, whose top part is a sideways arc]

Example: Supervised learning of concept. Warning: Do not learn the wrong things from this example. It is not the case that negative examples are only about links and positive examples are only about nodes! [Diagram: the current arch-1 concept]

Example: Supervised learning of concept. This is a very simple model of one specific kind of learning, but it's easy to understand and easy to implement. That's one reason it's presented in just about every introductory AI course. But it also presents many of the issues that are common to all sorts of approaches to learning. [Diagram: the current arch-1 concept]
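The specialize-on-near-miss and generalize-on-positive-example moves described above can be sketched in a few lines of code. This is an illustrative sketch, not the historical program: the function names, the link triples, and the dictionary keys (required_links, forbidden_links, top_shapes) are all invented for this example.

```python
# A minimal, hypothetical sketch of Winston-style arch learning.
# A concept is a set of required links, a set of forbidden links,
# and the shapes allowed for the supported part.

def specialize(concept, near_miss_links):
    """A near miss differs from the concept in one crucial way;
    turn the extra links it exhibits into must-not constraints."""
    extra = near_miss_links - concept["required_links"]
    concept["forbidden_links"] |= extra

def generalize(concept, top_shape):
    """A positive example with a new top shape widens the set of
    shapes the supported part may have."""
    concept["top_shapes"].add(top_shape)

# Initial model built from the first positive example (arch-1).
arch = {
    "required_links": {("uprights", "must_support", "top")},
    "forbidden_links": set(),
    "top_shapes": {"block"},
}

# Near miss not-arch-3: the upright blocks touch each other,
# so 'touches' becomes a forbidden link.
specialize(arch, {("uprights", "must_support", "top"),
                  ("left_upright", "touches", "right_upright")})

# Positive example arch-4: the top part is a wedge,
# so the allowed top shapes now include 'wedge'.
generalize(arch, "wedge")
```

Each negative example shrinks what counts as an arch (a new forbidden link); each positive example that doesn't fit widens it (a new allowed shape), exactly mirroring the slides above.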

What's Minsky teaching his class?

Example: Supervised learning of concept. Another way of looking at this is that the system is trying to gain the ability to make predictions beyond the given data. We've talked about this before, when we discussed inductive inference -- making generalizations from lots of examples. [Diagram: the current arch-1 concept]

Inductive inference and learning by example. If you were a robot trying to learn when to cross the street and had seen lots of successful and unsuccessful examples of street crossings, how would you know what to pay attention to? [Candidate features: width of street, driver attributes, number of cars, type of cars, speed of cars, trees along the street, weather, gas station on corner, daytime or nighttime, color of cars, and countless others]

Inductive inference and learning by example. While computers so far tend to be bad at figuring out for themselves what the salient features are, people are magically great at this sort of thing. It's a survival thing. [Candidate features repeated from the previous slide]

Learning is about choosing the best representation. That's certainly true in a logic-based AI world. Our arch learner started with some internal representation of an arch. As examples were presented, the arch learner modified its internal representation either to accommodate positive examples (generalization) or to exclude negative examples (specialization). There's really nothing else the learner could modify... the reasoning system is what it is. So any learning problem can be mapped onto one of choosing the best representation...

Learning is about search. ...but wait, there's more! By now, you've figured out that the arch learner was doing nothing more than searching the space of possible representations, right? So learning, like everything else, boils down to search. If that wasn't obvious, you'll probably want to do a little extra preparation for the final exam...

Same problem - different representation. The arch learner could have represented the arch concept as a decision tree if we wanted. [Diagram: the tree is built up over several slides; the final version reads:]

do upright blocks support sideways block?
  no: not arch
  yes: do upright blocks touch each other?
    yes: not arch
    no: is the top block either a rectangle or a wedge?
      no: not arch
      yes: arch
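Read top-down, that tree is just a chain of tests, which makes the "different representation, same concept" point concrete. Here is a minimal sketch; the dict keys (supports, touch, top_shape) are invented for this example.

```python
def classify(example):
    """Classify a structure as 'arch' or 'not arch' by walking
    the decision tree from the root down."""
    if not example["supports"]:        # do uprights support the top?
        return "not arch"
    if example["touch"]:               # do the uprights touch?
        return "not arch"
    if example["top_shape"] not in ("rectangle", "wedge"):
        return "not arch"
    return "arch"

# arch-4 from the slides: supported, non-touching uprights, wedge on top.
print(classify({"supports": True, "touch": False, "top_shape": "wedge"}))
# prints "arch"
```

Each internal node of the tree becomes one if-test, and each leaf becomes a return value, so the learned concept and the tree encode exactly the same classifications.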

Other issues with learning by example:
- The learning process requires someone to say which examples are positive and which are negative.
- This approach must start with a positive example to specialize or generalize from.
- Learning by example is sensitive to the order in which examples are presented.
- Learning by example doesn't work well with noisy, randomly erroneous data.