Chapter 5. Advanced Search
Fall 2011, COMP3710 Artificial Intelligence
Computing Science, Thompson Rivers University

Course Outline
Part I – Introduction to Artificial Intelligence
Part II – Classical Artificial Intelligence
- Knowledge Representation
- Searching: Search Methodologies, Advanced Search, Genetic Algorithms (a relatively new study area)
- Knowledge Representation and Automated Reasoning: Propositional and Predicate Logic, Inference and Resolution for Problem Solving, Rules and Expert Systems
Part III – Machine Learning
Part IV – Advanced Topics

Chapter Objectives
- Given a constraint satisfaction problem, define its variables and constraints.
- Use the Most-Constrained Variable and Most-Constraining Variable heuristics for the 8-queens problem and the map coloring problem.
- Use the Least-Constraining Value heuristic for the 8-queens problem and the map coloring problem.
- Use Heuristic Repair for the 8-queens problem.
- ...

Chapter Outline
1. Constraint Satisfaction Search
   - Forward Checking
   - Most-Constrained Variable First
   - Least-Constraining Value First
   - Heuristic Repair
2. Combinatorial Optimization Problems
   - How to use a greedy approach – Local Search
   - How to improve local search:
     - Exchanging Heuristics
     - Iterated Local Search
     - Simulated Annealing
     - Parallel Search
     - Genetic Algorithms for Search

1. Constraint Satisfaction Problems
Put n queens on an n × n board with no two queens on the same row, column, or diagonal.
[Q] How to solve?
[Q] Which one do we need to move first?

Combinatorial optimization problems involve assigning values to a number of variables.
A constraint satisfaction problem (CSP) is a combinatorial optimization problem with a set of constraints.
Example: the 8-queens problem. Eight queens must be placed on a chess board in such a way that no two queens are on the same diagonal, row, or column.
[Q] How to model the constraints?
- 8 variables, a through h (one per column).
- Each variable can take a value from 1 to 8 (the row of the queen in that column).
- The values must satisfy the constraints.
[Q] What does this mean? (A code sketch of this model follows below.)
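
A minimal sketch of the 8-queens CSP model just described. The helper name `consistent` and the dictionary encoding are illustrative choices, not from the slides; the constraints themselves (no shared row or diagonal, columns distinct by construction) are the ones stated above.

```python
# Variables a..h are board columns; each takes a row value 1..8.
# A partial assignment is a dict such as {"a": 1, "b": 3, "c": 5}.

COLUMNS = "abcdefgh"

def consistent(assignment):
    """Return True if no two assigned queens share a row or a diagonal.
    (Two queens can never share a column, since each variable is a distinct column.)"""
    items = list(assignment.items())
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            (col1, row1), (col2, row2) = items[i], items[j]
            col_dist = abs(COLUMNS.index(col1) - COLUMNS.index(col2))
            if row1 == row2 or abs(row1 - row2) == col_dist:
                return False
    return True

print(consistent({"a": 1, "b": 3, "c": 5}))   # True: no conflicts
print(consistent({"a": 1, "b": 2}))           # False: b=2 is diagonal to a=1
```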

Combinatorial optimization problems involve assigning values to a number of variables.
A constraint satisfaction problem (CSP) is a combinatorial optimization problem with a set of constraints.
Example: the 8-queens problem. Eight queens must be placed on a chess board in such a way that no two queens are on the same diagonal, row, or column.
[Q] How to solve? Can it be solved using search? DFS? A*? What kind of search tree?
- Huge search space: how large?
- With many variables it is essential to use heuristics.

1.1 Forward Checking
The search tree is huge. [Q] Do we have to visit any choice (i.e., state or node in the search) that conflicts with the constraints?
- Forward checking deletes, from the set of possible future choices, any that have been rendered impossible by placing the queen on that square.
- If placing a queen on the board removes all remaining squares for some variable, backtrack immediately. (A pruning sketch follows below.)
How about using heuristics?
- Most-Constrained Variable First
- Least-Constraining Value First
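
A minimal sketch of forward checking for n-queens, under the modeling above. The function name `prune_domains` and the 0-based column numbering are illustrative assumptions; the slides only describe the idea of removing impossible future choices and backtracking when a domain empties.

```python
# After each assignment, remove conflicting rows from the domains of the
# unassigned columns; if any domain becomes empty, fail immediately.

def prune_domains(domains, col, row):
    """Return pruned copies of the other columns' domains after placing a queen
    at (col, row), or None if some domain becomes empty (forcing backtracking)."""
    pruned = {}
    for other, rows in domains.items():
        if other == col:
            continue
        dist = abs(other - col)
        remaining = {r for r in rows if r != row and abs(r - row) != dist}
        if not remaining:
            return None          # dead end: backtrack right away
        pruned[other] = remaining
    return pruned

# Columns are numbered 0..7; every column initially allows rows 0..7.
domains = {c: set(range(8)) for c in range(8)}
print(prune_domains(domains, 0, 0)[1])   # column 1 loses rows 0 and 1
```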

1.2 Most-Constrained Variables
Most-Constrained Variable First heuristic:
- At each stage of the search, work with the variable that has the fewest possible valid choices remaining.
Example, in the 8-queens problem (assigning values to the 8 variables a through h):
- After a = 1, b = 3, c = 5:
  - d has 3 choices, e has 3 choices, f has 1 choice, g has 3 choices, h has 3 choices.
- The next move is therefore to place a queen in column f, not d.

What if there are ties?
- Then use the Most-Constraining Variable heuristic to break the tie:
- Choose the variable that places the greatest number of constraints on future variables.
- E.g., the map coloring problem with only 3 colors: what is the next choice? (A selection sketch combining both heuristics follows below.)
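
A minimal sketch combining the two variable-selection heuristics above: pick the most-constrained variable, then break ties with the most-constraining variable. The function name, data layout, and the tiny map-coloring fragment are illustrative, not taken from the slides.

```python
# domains maps each variable to its set of remaining values;
# neighbors maps each variable to the variables it shares a constraint with.

def select_variable(domains, neighbors, assigned):
    unassigned = [v for v in domains if v not in assigned]
    # Most-constrained variable first: fewest remaining values.
    fewest = min(len(domains[v]) for v in unassigned)
    candidates = [v for v in unassigned if len(domains[v]) == fewest]
    if len(candidates) == 1:
        return candidates[0]
    # Tie-break with the most-constraining variable: most unassigned neighbours.
    return max(candidates,
               key=lambda v: sum(1 for n in neighbors[v] if n not in assigned))

# Tiny 3-color map-coloring fragment (regions and adjacencies are made up).
domains = {"WA": {"r", "g", "b"}, "NT": {"r", "g", "b"},
           "SA": {"r", "g", "b"}, "Q": {"r", "g", "b"}}
neighbors = {"WA": ["NT", "SA"], "NT": ["WA", "SA", "Q"],
             "SA": ["WA", "NT", "Q"], "Q": ["NT", "SA"]}
print(select_variable(domains, neighbors, assigned=set()))  # NT: tied on domain size, most unassigned neighbours
```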

1.3 Least-Constraining Values
Instead of the previous two heuristics (most-constrained variable first, with most-constraining variable to break ties), the Least-Constraining Value First heuristic:
- Assign to the current variable the value that leaves the greatest number of choices for the other variables.
- More intuitive.
- This heuristic makes n-queens problems with extremely large values of n, e.g., 1000, quite solvable.
- Can you try it with n = 8? (A value-ordering sketch follows below.)
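
A minimal sketch of least-constraining value ordering: count how many options each candidate value would remove from the neighbouring variables, and try the least damaging value first. The function names, the `conflicts` callback, and the small map-coloring data are illustrative assumptions.

```python
def order_values(var, domains, neighbors, conflicts):
    """Order the values of var so that the one ruling out the fewest
    neighbouring options comes first.
    conflicts(var, value, other, other_value) -> True if the pair clashes."""
    def damage(value):
        return sum(1
                   for other in neighbors[var]
                   for other_value in domains[other]
                   if conflicts(var, value, other, other_value))
    return sorted(domains[var], key=damage)

# Map coloring again: a value clashes with the same value on a neighbour.
domains = {"NT": {"g", "b"}, "SA": {"b"}, "Q": {"r", "g", "b"}}
neighbors = {"Q": ["NT", "SA"]}
same_color = lambda v, val, o, oval: val == oval
print(order_values("Q", domains, neighbors, same_color))  # 'r' first: it removes nothing
```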

1.4 Heuristic Repair
A heuristic method for solving CSPs:
- Generate a possible solution (randomly, or using a heuristic to generate a position that is close to a solution).
- Then make small changes to bring it closer to satisfying the constraints.

Heuristic Repair for the 8-Queens Problem
Initial state – one queen is conflicting with another. We’ll now move that queen to the square with the fewest conflicts.

Second state – now the queen on the f column is conflicting, so we’ll move it to the square with the fewest conflicts. (A sketch of this repair loop follows below.)
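
A minimal sketch of heuristic repair for the 8-queens problem in the style of the example above (this repair scheme is often called min-conflicts). The function names, the random initial state, and the step limit are illustrative assumptions, not from the slides.

```python
import random

def conflicts(state, col, row):
    """Number of other queens attacking a queen placed at (col, row).
    state[c] is the row of the queen in column c."""
    return sum(1 for c, r in enumerate(state)
               if c != col and (r == row or abs(r - row) == abs(c - col)))

def heuristic_repair(n=8, max_steps=10_000):
    state = [random.randrange(n) for _ in range(n)]      # random initial state
    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(state, c, state[c]) > 0]
        if not conflicted:
            return state                                  # all constraints satisfied
        col = random.choice(conflicted)
        # Move that queen to the row with the fewest conflicts in its column.
        state[col] = min(range(n), key=lambda row: conflicts(state, col, row))
    return None                                           # gave up within the step limit

print(heuristic_repair())
```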

2. Combinatorial Optimization Problems
[Wikipedia] In applied mathematics and theoretical computer science, combinatorial optimization is a topic that consists of finding an optimal object from a finite set of objects. In many such problems, exhaustive search is not feasible. It operates on the domain of those optimization problems in which the set of feasible solutions is discrete or can be reduced to a discrete one, and in which the goal is to find the best solution.
Some common problems involving combinatorial optimization are the traveling salesman problem (TSP) and the minimum spanning tree problem (MST).
Combinatorial optimization is a subset of mathematical optimization that is related to operations research, algorithm theory, and computational complexity theory. It has important applications in several fields, including artificial intelligence, machine learning, mathematics, auction theory, and software engineering.

[Q] How to solve combinatorial optimization problems?
- From a greedy approach – local search –
- to advanced local search algorithms.

2.1 Local Search
- Like heuristic repair, local search methods start from a random state and make small changes (improvements) until a goal state is achieved.
- Most local search methods are susceptible to local maxima, like hill-climbing. (A hill-climbing sketch follows below.)
- Local search methods are known as metaheuristics.
- Some very important local search methods:
  - Simulated annealing
  - Genetic algorithms
  - Ant colony optimization
  - Neural networks
- We will discuss some ideas of how to improve local search in the following slides.
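
A minimal hill-climbing sketch of the basic local-search idea: start from a state and repeatedly move to the best neighbour, stopping when no neighbour improves the score, which may be only a local maximum. The function names and the toy objective are illustrative assumptions.

```python
import random

def hill_climb(current, neighbours, score):
    """Greedy local search: keep taking the best-scoring neighbour."""
    while True:
        best = max(neighbours(current), key=score, default=current)
        if score(best) <= score(current):
            return current                    # no improving neighbour: local (or global) maximum
        current = best

# Toy example: maximise -(x - 3)^2 over the integers by stepping +/- 1.
score = lambda x: -(x - 3) ** 2
neighbours = lambda x: [x - 1, x + 1]
print(hill_climb(random.randint(-10, 10), neighbours, score))   # 3
```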

[Diagram: hill-climbing local search – from an initial state through successively "a bit better" states (cumulative improvement); susceptible to local maxima. How to solve this?]

How to Improve Local Search
[Q] Any good ideas?
1. Keep cumulative improvement.
2. Give more diversity while keeping stability.
3. Allow some random walks.

2.2 Exchanging Heuristics
- A simple family of local search methods; heuristic repair is an example of an exchanging heuristic.
- Involves exchanging the values of one or more variables at each step:
  - Exchange variable values until the new state becomes better.
  - Repeat this step until a solution is found.
- A k-exchange involves swapping the values of k variables.
- Can be used to solve the traveling salesman problem. (A 2-exchange sketch follows below.)
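
A minimal sketch of a 2-exchange (commonly called 2-opt) for the traveling salesman problem: keep reversing a segment of the tour whenever that shortens it. The city names and coordinates are made up for the example; the slides only say that exchanging heuristics can be applied to the TSP.

```python
import math

def tour_length(tour, coords):
    """Total length of a closed tour over the given city coordinates."""
    return sum(math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, coords):
    """Repeatedly reverse a segment of the tour whenever that shortens it."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                candidate = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(candidate, coords) < tour_length(tour, coords):
                    tour, improved = candidate, True
    return tour

coords = {"A": (0, 0), "B": (0, 1), "C": (1, 1), "D": (1, 0)}
print(two_opt(["A", "C", "B", "D"], coords))   # the square tour ['A', 'B', 'C', 'D']
```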

[Diagram: exchanging heuristic – from the initial state through successively "a bit better" states; random choice of exchanges gives diversity on top of cumulative improvement.]

2.3 Iterated Local Search
- A local search is applied repeatedly from different initial states. (A restart sketch follows below.)
- Useful in cases where the search space is extremely large and exhaustive search is not possible.
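
A minimal sketch of the restart idea described above: run hill climbing from several random initial states and keep the best local optimum found. The bumpy toy objective and all function names are illustrative assumptions.

```python
import random

def hill_climb(state, neighbours, score):
    while True:
        best = max(neighbours(state), key=score)
        if score(best) <= score(state):
            return state
        state = best

def restart_search(restarts, random_state, neighbours, score):
    """Apply local search from several random initial states; keep the best result."""
    results = [hill_climb(random_state(), neighbours, score) for _ in range(restarts)]
    return max(results, key=score)

# Toy objective with many local maxima; restarts make finding the best one likely.
score = lambda x: -(x % 17 - 8) ** 2 - (x - 42) ** 2 / 1000.0
neighbours = lambda x: [x - 1, x + 1]
print(restart_search(20, lambda: random.randint(0, 100), neighbours, score))
```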

2.4 Simulated Annealing
- A method based on the way in which metal is heated and then cooled very slowly in order to make it extremely strong.
- Aims at obtaining a minimum value for some function of a large number of variables. This value is known as the energy of the system.
- Based on the Metropolis Monte Carlo simulation.
- Simple Monte Carlo simulation: a method of learning information about the shape of a search space by random sampling.
  - E.g., a square partially contained within a circle: how do we identify what proportion of the square is within the circle? By random sampling. (A sampling sketch follows below.)
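
A minimal sketch of the simple Monte Carlo idea just mentioned: estimate what proportion of a unit square lies inside a circle by random sampling. The specific geometry (a quarter circle of radius 1, where the exact answer is pi/4) is an illustrative choice, not the figure from the slides.

```python
import random

def proportion_inside(samples=100_000):
    """Fraction of random points in the unit square that fall inside the quarter circle."""
    hits = sum(1 for _ in range(samples)
               if random.random() ** 2 + random.random() ** 2 <= 1.0)
    return hits / samples

print(proportion_inside())        # roughly 0.785 (= pi/4)
```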

Algorithm:
- A random initial state is selected.
- A small random change is made, i.e., a new state is selected that makes a small change to the current state.
- If this change lowers the system energy, it is accepted.
- If it increases the energy, it may still be accepted, with a probability given by the Boltzmann acceptance criterion: e^(-dE/T), where T is the current temperature and dE is the increase in energy produced by moving from the previous state to the new state.

e^(-dE/T), where T is the current temperature and dE is the increase in energy produced by moving from the previous state to the new state.
- To decide whether to move to a higher-energy state: if a random number in (0, 1) is less than the probability above, then move.
- When the process starts, T is high, meaning increases in energy are relatively likely to be accepted. Over successive iterations, T is lowered and increases in energy become less likely to be accepted. (A code sketch of this acceptance rule follows below.)
[Plot: acceptance probability P against decreasing temperature T.]
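
A minimal sketch of simulated annealing with the Boltzmann acceptance rule e^(-dE/T) described above. The geometric cooling schedule, the temperature parameters, and the toy energy function are illustrative assumptions; the slides do not specify them.

```python
import math
import random

def simulated_annealing(initial, neighbour, energy,
                        t_start=10.0, t_end=0.01, alpha=0.99):
    state, temperature = initial, t_start
    while temperature > t_end:
        candidate = neighbour(state)
        d_e = energy(candidate) - energy(state)
        # Always accept a lower energy; accept a higher energy with probability e^(-dE/T).
        if d_e <= 0 or random.random() < math.exp(-d_e / temperature):
            state = candidate
        temperature *= alpha                    # slow (geometric) cooling
    return state

# Toy example: minimise a bumpy one-dimensional energy function.
energy = lambda x: (x - 3) ** 2 + 4 * math.sin(5 * x)
neighbour = lambda x: x + random.uniform(-0.5, 0.5)
print(simulated_annealing(random.uniform(-10, 10), neighbour, energy))
```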

[Diagram: simulated annealing – from the initial state the search accepts "a bit better" states and, occasionally, "a bit worse" states; random choice gives better diversity, the diversity gradually becomes less effective (better stability), while cumulative improvement continues.]

[Q] Why is simulated annealing good?
- Because the energy of the system is allowed to increase through random selection, simulated annealing is able to escape from local minima.
- Simulated annealing is a widely used local search method for solving problems with very large numbers of variables, for example: scheduling problems, the traveling salesman problem, and placing VLSI (chip) components.
[Q] How to improve further? A combination of iterated local search and simulated annealing?

2.5 Parallel Search
- Some search methods can easily be split into tasks which can be solved in parallel -> improved diversity. (A simple parallel-restart sketch follows below.)
- Important concepts to consider:
  - Divide and conquer?
  - Task distribution
  - Load balancing
  - Tree ordering
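
A minimal sketch of one simple way to parallelise local search: run independent random-restart searches in separate worker processes and keep the best result. This only illustrates the task-distribution idea; the worker function, energy function, and pool size are illustrative assumptions, and load balancing and tree ordering are not addressed.

```python
import math
import random
from multiprocessing import Pool

def one_restart(seed):
    """One independent stochastic-descent run, seeded so workers differ."""
    rng = random.Random(seed)
    energy = lambda x: (x - 3) ** 2 + 4 * math.sin(5 * x)
    x = rng.uniform(-10, 10)
    for _ in range(1000):
        candidate = x + rng.uniform(-0.5, 0.5)
        if energy(candidate) < energy(x):
            x = candidate
    return energy(x), x

if __name__ == "__main__":
    with Pool(processes=4) as pool:             # distribute 16 restarts over 4 workers
        results = pool.map(one_restart, range(16))
    print(min(results))                         # best (energy, state) found overall
```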

2.6 Genetic Algorithms
- A method based on biological evolution.
- Create chromosomes which represent possible solutions to a problem.
- The best chromosomes in each generation are bred with each other to produce a new generation.
- Much more detail on this later.
- A form of local search which has:
  - Cumulative improvement
  - Better diversity
  - Better stability
  - Randomness
  - Quick searching
- Can be seen as an advanced form of the combination of iterated local search and simulated annealing.

- Start with k randomly generated states (the population) – the 1st generation.
- A state is represented as a string over a finite alphabet (often a string of 0s and 1s) – the encoding.
- An evaluation function (fitness function) gives higher values for better states.
- Next generation: successor states are generated by combining pairs of parent states. Produce the next generation of states by selection according to the fitness function, crossover, and mutation.
- Next generation ...

[Q] How to represent the individual (i.e., state) shown on the left? As (3, 1, 7, 5, 8, 6, 4, 6).
Fitness function: the number of non-attacking pairs of queens (min = 0, max = 8 × 7/2 = 28). The larger, the better, in this example. (A sketch of this fitness function follows below.)
[Q] What are the fitness values of the next states?
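
A direct sketch of the fitness function defined above – the number of non-attacking pairs of queens – for a state encoded as one row value per column. Only the function name and indexing convention are illustrative; the definition is the one on the slide.

```python
def fitness(state):
    """Number of non-attacking pairs of queens (0 .. n*(n-1)/2, i.e. 0 .. 28 for n = 8)."""
    n = len(state)
    attacking = sum(1
                    for i in range(n)
                    for j in range(i + 1, n)
                    if state[i] == state[j] or abs(state[i] - state[j]) == j - i)
    return n * (n - 1) // 2 - attacking

print(fitness((3, 1, 7, 5, 8, 6, 4, 6)))   # fitness of the state encoded above
```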

[Board figures: two example states, with fitness values 23 and 24.]

Fitness function: the number of non-attacking pairs of queens (min = 0, max = 8 × 7/2 = 28).
Selection is proportional to fitness: e.g., 24/(sum of the fitness values of the population) = 31%, 23/(sum of the fitness values) = 29%, etc. (A selection sketch follows below.)
Can you evaluate them?
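
A minimal sketch of fitness-proportionate (roulette-wheel) selection as described above: each state is chosen with probability fitness / (sum of all fitness values). The population labels and the extra fitness values 20 and 11 are made up for the example; only 24 and 23 appear on the slide.

```python
import random

def selection_probabilities(fitnesses):
    """Probability of selecting each state, proportional to its fitness."""
    total = sum(fitnesses)
    return [f / total for f in fitnesses]

def select_parent(population, fitnesses):
    """Pick one parent with probability proportional to fitness."""
    return random.choices(population, weights=fitnesses, k=1)[0]

population = ["state1", "state2", "state3", "state4"]
fitnesses = [24, 23, 20, 11]
print([round(p, 2) for p in selection_probabilities(fitnesses)])  # [0.31, 0.29, 0.26, 0.14]
print(select_parent(population, fitnesses))
```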