Slide 1: Logical Agents and Propositional Logic (Discussion: Logic in AI)
CIS 530 / 730: Artificial Intelligence, Lecture 09 of 42
Wednesday, 17 September 2008
William H. Hsu, Department of Computing and Information Sciences, Kansas State University
KSOL course page: http://snipurl.com/v9v3
Course web site: http://www.kddresearch.org/Courses/Fall-2008/CIS730
Instructor home page: http://www.cis.ksu.edu/~bhsu
Reading for next class: Sections 7.5 – 7.7, pp. 211 – 232, Russell & Norvig 2nd edition
Slide 2: Lecture Outline
Reading for next class: Sections 7.5 – 7.7, R&N 2e
Today: Logical Agents
- Classical knowledge representation
- Limitations of the classical symbolic approach
- Modern approach: representation, reasoning, learning
- "New" aspects: uncertainty, abstraction, classification paradigm
Next week: start of material on logic
- Representation: "a bridge between learning and reasoning" (Koller)
- Basis for automated reasoning: theorem proving, other inference
Slide 3: Example: Learning to Play Checkers
Type of training experience
- Direct or indirect? Teacher or not?
- Knowledge about the game (e.g., openings/endgames)?
- Problem: is the training experience representative of the performance goal?
Software design
- Assumptions of the learning system: a legal move generator exists
- Software requirements: generator, evaluator(s), parametric target function
Choosing a target function
- ChooseMove: Board → Move (action selection function, or policy)
- V: Board → ℝ (board evaluation function)
- Ideal target V; learned approximation V̂
- Goal: operational description (approximation) of V
Slide 4: A Target Function for Learning to Play Checkers
Possible definition
- If b is a final board state that is won, then V(b) = 100
- If b is a final board state that is lost, then V(b) = -100
- If b is a final board state that is drawn, then V(b) = 0
- If b is not a final board state, then V(b) = V(b'), where b' is the best final board state reachable from b when playing optimally until the end of the game
- Correct values, but not operational
Choosing a representation for the target function
- Collection of rules? Neural network? Polynomial function (e.g., linear or quadratic combination) of board features? Other?
A representation for the learned function: linear function of six board features
- bp / rp = number of black / red pieces
- bk / rk = number of black / red kings
- bt / rt = number of black / red pieces threatened (can be taken on the next turn)
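To make this representation concrete, here is a minimal Python sketch (illustrative, not from the original slides) of the linear evaluation function over the six features listed above; the dictionary-based board representation and the starting weight values are assumptions for the example.

```python
# Illustrative sketch of the linear board-evaluation function
# V_hat(b) = w0 + w1*bp + w2*rp + w3*bk + w4*rk + w5*bt + w6*rt

def features(board):
    """Return the six features named on the slide for a board state.

    `board` is assumed to expose the counts directly; a real checkers
    engine would compute them from the piece layout.
    """
    return [board["bp"], board["rp"], board["bk"],
            board["rk"], board["bt"], board["rt"]]

def v_hat(board, weights):
    """Linear approximation V_hat of the target function V."""
    w0, ws = weights[0], weights[1:]
    return w0 + sum(w * f for w, f in zip(ws, features(board)))

# Example: 7 weights (bias plus one per feature), values chosen only for illustration
weights = [0.0, 1.0, -1.0, 3.0, -3.0, -0.5, 0.5]
board = {"bp": 12, "rp": 12, "bk": 0, "rk": 0, "bt": 1, "rt": 2}
print(v_hat(board, weights))
```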
Slide 5: Training Procedure for Learning to Play Checkers
Obtaining training examples
- V(b): the target function
- V̂(b): the learned function
- V_train(b): the training value
One rule for estimating training values: V_train(b) ← V̂(Successor(b))
Choose weight tuning rule
- Least Mean Square (LMS) weight update rule:
  REPEAT
  - Select a training example b at random
  - Compute error(b) = V_train(b) - V̂(b) for this training example
  - For each board feature f_i, update weight w_i as follows: w_i ← w_i + c · f_i · error(b)
  where c is a small constant factor that adjusts the learning rate
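A minimal sketch of this LMS weight-tuning loop follows, assuming the v_hat and features helpers from the previous sketch; the successor callable (which supplies the board position whose estimated value serves as the training value) and the learning rate c = 0.01 are illustrative assumptions.

```python
import random

def lms_update(boards, weights, successor, c=0.01, iterations=1000):
    """LMS weight tuning as sketched on the slide:
    V_train(b) <- V_hat(Successor(b));  w_i <- w_i + c * f_i * error(b).

    Assumes v_hat() and features() from the previous sketch; `successor`
    is a hypothetical callable returning the next board position.
    """
    for _ in range(iterations):
        b = random.choice(boards)                # select a training example at random
        v_train = v_hat(successor(b), weights)   # estimate the training value
        error = v_train - v_hat(b, weights)      # compute error(b)
        feats = [1.0] + features(b)              # bias feature f_0 = 1, then the six features
        for i, f_i in enumerate(feats):          # update every weight
            weights[i] += c * f_i * error
    return weights
```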
Slide 6: Design Choices for Learning to Play Checkers (completed design)
Determine type of training experience
- Games against experts, games against self, table of correct moves
Determine target function
- Board → value, or Board → move
Determine representation of learned function
- Polynomial, linear function of six features, artificial neural network
Determine learning algorithm
- Gradient descent, linear programming
Slide 7: Knowledge Bases (adapted from slides by S. Russell, UC Berkeley)
Slide 8: Simple Knowledge-Based Agent (Figure 6.1, p. 152, R&N; adapted from slides by S. Russell, UC Berkeley)
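The figure referenced above is the generic knowledge-based agent control loop: TELL the KB what was perceived, ASK it which action to take, then TELL it that the action was performed. A minimal Python sketch of that loop follows; the KB object and the three sentence-construction helpers are illustrative stand-ins for whatever representation language the agent uses.

```python
class KBAgent:
    """Generic knowledge-based agent: TELL percepts, ASK for an action."""

    def __init__(self, kb):
        self.kb = kb   # any knowledge base offering tell(sentence) and ask(query)
        self.t = 0     # time counter

    def agent_program(self, percept):
        # TELL the KB what the agent perceives at time t
        self.kb.tell(self.make_percept_sentence(percept, self.t))
        # ASK the KB which action is appropriate now
        action = self.kb.ask(self.make_action_query(self.t))
        # TELL the KB that the chosen action was executed
        self.kb.tell(self.make_action_sentence(action, self.t))
        self.t += 1
        return action

    # The helpers below are placeholders; their form depends on the
    # representation language (propositional logic, FOL, etc.).
    def make_percept_sentence(self, percept, t):
        return ("Percept", percept, t)

    def make_action_query(self, t):
        return ("Action?", t)

    def make_action_sentence(self, action, t):
        return ("Did", action, t)
```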
Slide 9: Overview
Today's reading: Sections 7.1 – 7.4, Russell and Norvig 2e
Recommended reference: Genesereth and Nilsson, Logical Foundations of Artificial Intelligence
Previously: logical agents
- Knowledge bases (KB) and KB agents
- Motivating example: Wumpus World
- Logic in general
- Syntax of propositional calculus
Today
- Propositional calculus (concluded): normal forms, production systems
- Predicate logic
- Introduction to First-Order Logic (FOL): examples, inference rules (sketch)
Next week: First-Order Logic review, resolution
Slide 10: Knowledge Representation (KR) for Intelligent Agent Problems
Percepts
- What can the agent observe? What can its sensors tell it?
Actions
- What actuators does the agent have? In what context are they applicable?
Goals
- What are the agent's goals? Preferences (utilities)?
- How does the agent evaluate them (check environment, deliberate, etc.)?
Environment
- What are the "rules of the world"? How can these be represented and simulated?
Slide 11: Review: Simple Knowledge-Based Agent (Figure 6.1, p. 152, R&N; adapted from slides by S. Russell, UC Berkeley)
Slide 12: Review: Types of Logic (Figure 6.7, p. 166, R&N; adapted from slides by S. Russell, UC Berkeley)
Slide 13: Propositional Logic: Semantics (adapted from slides by S. Russell, UC Berkeley)
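The slide body is a figure; as a brief worked illustration of the truth-functional semantics it presents (the sentences and model here are illustrative, not taken from the figure): in the model m = {P = true, Q = false}, ¬P is false, P ∧ Q is false, P ∨ Q is true, P ⇒ Q is false, and P ⇔ Q is false. A model fixes a truth value for every proposition symbol, and the connectives are then evaluated by their truth tables.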
Slide 14: Propositional Inference: Enumeration (Model Checking) Method (adapted from slides by S. Russell, UC Berkeley)
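The enumeration (model checking) method decides KB ⊨ α by enumerating every assignment to the proposition symbols and confirming that α holds in each model of the KB. A minimal sketch follows, with sentences represented as Python predicates over a model dictionary; the example KB and query are illustrative, not taken from the slide.

```python
from itertools import product

def tt_entails(kb, alpha, symbols):
    """Return True if KB entails alpha, by enumerating all models.

    `kb` and `alpha` are callables mapping a model (dict: symbol -> bool)
    to a truth value; `symbols` lists the proposition symbols involved.
    """
    for values in product([False, True], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not alpha(model):   # found a model of KB where alpha fails
            return False
    return True                              # alpha holds in every model of KB

# Illustrative KB: (P => Q) and P; query: Q.
kb = lambda m: (not m["P"] or m["Q"]) and m["P"]
alpha = lambda m: m["Q"]
print(tt_entails(kb, alpha, ["P", "Q"]))     # True: modus ponens recovered by model checking
```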
Slide 15: Normal Forms: CNF, DNF, Horn (adapted from slides by S. Russell, UC Berkeley)
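As a brief worked example of the three forms named in the title (the formula itself is illustrative, not from the slide): the sentence (P ∨ Q) ⇒ R can be written in conjunctive normal form as (¬P ∨ R) ∧ (¬Q ∨ R) and in disjunctive normal form as (¬P ∧ ¬Q) ∨ R. Both CNF clauses contain at most one positive literal, so each is a Horn clause, equivalent to the implications P ⇒ R and Q ⇒ R.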
Slide 16: Validity and Satisfiability (adapted from slides by S. Russell, UC Berkeley)
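The slide body is a figure; the standard definitions it covers can be illustrated briefly (the specific sentences are illustrative): P ∨ ¬P is valid (true in all models), P ∧ Q is satisfiable but not valid, and P ∧ ¬P is unsatisfiable. These notions connect to inference through the deduction theorem: KB ⊨ α iff KB ⇒ α is valid, iff KB ∧ ¬α is unsatisfiable, which is the basis of proof by refutation.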
Slide 17: Proof Methods (adapted from slides by S. Russell, UC Berkeley)
Slide 18: Inference (Sequent) Rules for Propositional Logic (adapted from slides by S. Russell, UC Berkeley)
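As a small worked illustration of the kind of sequent rules the figure lists (the sentences are illustrative): Modus Ponens licenses inferring β from α ⇒ β and α, and And-Elimination licenses inferring α from α ∧ β. For example, from (P ∧ Q) ⇒ R, P, and Q, And-Introduction yields P ∧ Q, and Modus Ponens then yields R.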
Slide 19: Logical Agents: Taking Stock (adapted from slides by S. Russell, UC Berkeley)
Slide 20: The Road Ahead: Predicate Logic and FOL (adapted from slides by S. Russell, UC Berkeley)
Predicate logic
- Enriching the language: predicates, functions
- Syntax and semantics of predicate logic
First-Order Logic (FOL, FOPC)
- Need for quantifiers
- Relation to (unquantified) predicate logic
- Syntax and semantics of FOL
Fun with sentences
Wumpus World in FOL
Slide 21: Syntax of FOL: Basic Elements (adapted from slides by S. Russell, UC Berkeley)
Slide 22: FOL: Atomic Sentences (Atomic Well-Formed Formulae) (adapted from slides by S. Russell, UC Berkeley)
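The slide body is a figure; as a brief illustration of atomic well-formed formulae (these particular sentences are illustrative, in the style of the Russell slides rather than necessarily the ones shown): an atomic sentence applies a predicate symbol to terms, where a term is a constant, a variable, or a function symbol applied to terms. For instance, Brother(Richard, John) and Married(Father(Richard), Mother(John)) are atomic sentences, while Father(Richard) by itself is a term, not a sentence.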
Slide 23: Summary Points
Logical agents overview (last time)
- Knowledge bases (KB) and KB agents
- Motivating example: Wumpus World
- Logic in general
- Syntax of propositional calculus
Propositional and first-order calculi (today)
- Propositional calculus (concluded): normal forms, inference (aka sequent) rules, production systems
- Predicate logic without quantifiers
- Introduction to First-Order Logic (FOL): examples, inference rules (sketch)
Next week: First-Order Logic review, intro to theorem proving
Slide 24: Fun with Sentences: Family Feud (adapted from slides by S. Russell, UC Berkeley)
Brothers are siblings
- ∀ x, y . Brother(x, y) ⇒ Sibling(x, y)
The sibling relation is symmetric
- ∀ x, y . Sibling(x, y) ⇔ Sibling(y, x)
One's mother is one's female parent
- ∀ x, y . Mother(x, y) ⇔ Female(x) ∧ Parent(x, y)
A first cousin is a child of a parent's sibling
- ∀ x, y . First-Cousin(x, y) ⇔ ∃ p, ps . Parent(p, x) ∧ Sibling(p, ps) ∧ Parent(ps, y)
Slide 25: Jigsaw Exercise: First-Order Logic Sentences
"Every dog chases its own tail"
- ∀ d . Chases(d, tail-of(d))
- Alternative statement: ∀ d . ∃ t . Tail-Of(t, d) ∧ Chases(d, t)
- Prefigures the concept of Skolemization (Skolem variables / functions)
"Every dog chases its own (unique) tail"
- ∀ d . ∃1 t . Tail-Of(t, d) ∧ Chases(d, t)
- ∀ d . ∃ t . Tail-Of(t, d) ∧ Chases(d, t) ∧ [∀ t' . Chases(d, t') ⇒ t' = t]
"Only the wicked flee when no one pursueth"
- ∀ x . Flees(x) ∧ [¬ ∃ y . Pursues(y, x)] ⇒ Wicked(x)
- Alternative: ∀ x . [∃ y . Flees(x, y)] ∧ [¬ ∃ z . Pursues(z, x)] ⇒ Wicked(x)
Offline exercise: What is an nth cousin, m times removed?
Slide 26: Terminology
Logical frameworks
- Knowledge bases (KB)
- Logic in general: representation languages, syntax, semantics
- Propositional logic
- First-order logic (FOL, FOPC)
- Model theory, domain theory: possible-worlds semantics, entailment
Normal forms
- Conjunctive Normal Form (CNF)
- Disjunctive Normal Form (DNF)
- Horn form
Proof theory and inference systems
- Sequent calculi: rules of proof theory
- Derivability or provability
- Properties: soundness (derivability implies entailment), completeness (entailment implies derivability)