
1 Chapter 5. Advanced Search Fall 2011 Comp3710 Artificial Intelligence Computing Science Thompson Rivers University

2 Course Outline Part I – Introduction to Artificial Intelligence Part II – Classical Artificial Intelligence Knowledge Representation Searching Search Methodologies Advanced Search Genetic Algorithms (relatively new study area) Knowledge Representation and Automated Reasoning Propositional and Predicate Logic Inference and Resolution for Problem Solving Rules and Expert Systems Part III – Machine Learning Part IV – Advanced Topics

3 Chapter Objectives Given a constraint satisfaction problem, define variables and constraints. Use of the Most-Constrained Variable and Most-Constraining Variable heuristics for the 8-queens problem and the map coloring problem. Use of the Least-Constraining Variable heuristic for the 8-queens problem and the map coloring problem. Use of Heuristic Repair for the 8-queens problem. ...

4 Chapter Outline 1. Constraint Satisfaction Search Forward Checking Most-Constrained Variable First Least-Constraining Variable First Heuristic Repair 2. Combinatorial Optimization Problems How to use greedy approach – Local Search How to improve local search Exchanging Heuristic Iterated Local Search Simulated Annealing Parallel Search Genetic Algorithms for Search

5 1. Constraint Satisfaction Problems Put n queens on an n × n board with no two queens on the same row, column, or diagonal. [Q] How to solve? [Q] Which one do we need to move first?

6 Combinatorial optimization problems involve assigning values to a number of variables. A constraint satisfaction problem (CSP) is a combinatorial optimization problem with a set of constraints. Example: The 8-queens problem. Eight queens must be placed on a chess board in such a way that no two queens are on the same diagonal, row, or column. [Q] How to model the constraints? 8 variables, a through h. Each variable can have a value 1 to 8. The values must satisfy the constraints. [Q] What does this mean?
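
A minimal sketch (not from the slides) of how this modeling could be checked in code, assuming columns a–h are list positions 0–7 and each value is the row of the queen in that column:

```python
def satisfies_constraints(rows):
    """rows[i] is the row (1..8) of the queen in column i; one queen per column by construction."""
    n = len(rows)
    for i in range(n):
        for j in range(i + 1, n):
            if rows[i] == rows[j]:                    # same row
                return False
            if abs(rows[i] - rows[j]) == j - i:       # same diagonal
                return False
    return True

print(satisfies_constraints([1, 5, 8, 6, 3, 7, 2, 4]))  # a valid placement -> True
```

Because the representation gives every column exactly one queen, the column constraint never needs an explicit check.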

7 Combinatorial optimization problems involve assigning values to a number of variables. A constraint satisfaction problem (CSP) is a combinatorial optimization problem with a set of constraints. Example: The 8-queens problem. Eight queens must be placed on a chess board in such a way that no two queens are on the same diagonal, row, or column. [Q] How to solve? [Q] Can it be solved using search? DFS? A*? What kind of search tree? Huge space: ??? With many variables it is essential to use heuristics.

8 1.1 Forward Checking Huge search tree. [Q] Do we have to visit any choice (i.e., a state or node in the search) that conflicts with the constraints? Forward checking deletes, from the set of possible future choices, any that have been rendered impossible by placing the queen on that square. If placing a queen on the board results in removing all remaining squares, then backtrack immediately. How about using heuristics? Most-Constrained Variable First. Least-Constraining Variable First.
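
A hedged sketch of the forward-checking idea for the queens problem (names and structure are illustrative, not from the slides): after each placement, prune the attacked rows from the remaining columns' domains, and backtrack as soon as some domain becomes empty.

```python
def forward_checking_queens(n, col=0, placed=(), domains=None):
    """Place one queen per column (rows 0..n-1); prune future domains after each placement."""
    if domains is None:
        domains = [set(range(n)) for _ in range(n)]
    if col == n:
        return placed
    for row in sorted(domains[col]):
        # Prune rows in later columns that this placement would attack.
        pruned = []
        for c in range(col + 1, n):
            new_dom = {r for r in domains[c]
                       if r != row and abs(r - row) != c - col}
            pruned.append(new_dom)
        if all(pruned):                                # no future domain was emptied
            result = forward_checking_queens(
                n, col + 1, placed + (row,), domains[:col + 1] + pruned)
            if result is not None:
                return result
    return None                                        # dead end: backtrack

print(forward_checking_queens(8))
```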

9 1.2 Most-Constrained Variables Most-Constrained Variable First heuristic: at each stage of the search, this heuristic involves working with the variable that has the least possible number of valid choices. In the example of the 8-queens problem, assigning a value to the 8 variables, a through h: with a = 1, b = 3, c = 5, then d has 3 choices, e has 3 choices, f has 1 choice, g has 3 choices, and h has 4 choices. The next move is to place a queen in column f, not d.
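
A small illustrative sketch (not from the slides) of this choice: with the remaining domains tracked as in the forward-checking sketch above, pick the variable with the fewest remaining legal values.

```python
def most_constrained_variable(domains):
    """Return the variable with the fewest remaining legal values (minimum remaining values)."""
    return min(domains, key=lambda var: len(domains[var]))

# Remaining legal rows after a = 1, b = 3, c = 5 (worked out from the row/diagonal constraints):
remaining = {'d': {2, 7, 8}, 'e': {2, 4, 8}, 'f': {4}, 'g': {2, 4, 6}, 'h': {2, 4, 6, 7}}
print(most_constrained_variable(remaining))  # -> 'f', so column f is filled next
```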

10 What if there are ties? Then use the most-constraining variable heuristic to break the tie: assign a value to the variable that places the greatest number of constraints on future variables. E.g., the map coloring problem with only 3 colors. What is the next choice?

11 1.3 Least-Constraining Variables Instead of the previous two heuristics – most-constrained variable first and most-constraining variable to break ties – the Least-Constraining Variable First heuristic assigns a value to a variable that leaves the greatest number of choices for other variables. More intuitive. This heuristic makes n-queens problems with extremely large values of n, e.g., 1000, quite solvable. Can you try with n = 8?
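
A sketch of the value-ordering side of this idea (illustrative, not the slides' own code): for the column being assigned, try first the rows that leave the most options open in the remaining columns. The domains below are the ones worked out after a = 1, b = 3, c = 5, with column d as the variable being assigned.

```python
def least_constraining_order(col, candidates, future_domains):
    """Order candidate rows for `col` so those leaving the most options in later columns come first."""
    def options_left(row):
        return sum(
            sum(1 for r in rows
                if r != row and abs(r - row) != other_col - col)
            for other_col, rows in future_domains.items())
    return sorted(candidates, key=options_left, reverse=True)

# Remaining legal rows for columns e..h (indexed 5..8) after a = 1, b = 3, c = 5:
future = {5: {2, 4, 8}, 6: {4}, 7: {2, 4, 6}, 8: {2, 4, 6, 7}}
print(least_constraining_order(4, {2, 7, 8}, future))  # -> [8, 7, 2]
```

Placing d = 2 would empty column f's domain entirely, which is why it comes last in this ordering.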

12 1.4 Heuristic Repair A heuristic method for solving CSPs. Generate a possible solution (randomly, or using a heuristic to generate a position that is close to a solution), and then make small changes to bring it closer to satisfying the constraints.

13 Heuristic Repair for the 8-Queens Problem Initial state – one queen is conflicting with another. We’ll now move that queen to the square with the fewest conflicts.

14 Second state – now the queen on the f column is conflicting, so we’ll move it to the square with the fewest conflicts.
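
A compact sketch of this repair loop – a min-conflicts style search written as an illustration, not the slides' exact procedure: repeatedly pick a conflicted queen and move it to the row in its column with the fewest conflicts.

```python
import random

def conflicts(rows, col, row):
    """Number of queens attacking square (col, row), ignoring the queen in col itself."""
    return sum(1 for c, r in enumerate(rows)
               if c != col and (r == row or abs(r - row) == abs(c - col)))

def min_conflicts(n=8, max_steps=10000):
    rows = [random.randrange(n) for _ in range(n)]     # random initial state, one queen per column
    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(rows, c, rows[c]) > 0]
        if not conflicted:
            return rows                                # all constraints satisfied
        col = random.choice(conflicted)
        rows[col] = min(range(n), key=lambda r: conflicts(rows, col, r))
    return None                                        # gave up after max_steps

print(min_conflicts())
```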

16 2. Combinatorial Optimization Problems [Wikipedia] In applied mathematics and theoretical computer science, combinatorial optimization is a topic that consists of finding an optimal object from a finite set of objects. In many such problems, exhaustive search is not feasible. It operates on the domain of those optimization problems in which the set of feasible solutions is discrete or can be reduced to a discrete one, and in which the goal is to find the best solution. Some common problems involving combinatorial optimization are the traveling salesman problem ("TSP") and the minimum spanning tree problem ("MST"). Combinatorial optimization is a subset of mathematical optimization that is related to operations research, algorithm theory, and computational complexity theory. It has important applications in several fields, including artificial intelligence, machine learning, mathematics, auction theory, and software engineering.

17 [Q] How to solve combinatorial optimization problems? From the greedy approach – local search – to advanced local search algorithms.

18 2.1 Local Search Like heuristic repair, local search methods start from a random state, and make small changes (improvements) until a goal state is achieved. Most local search methods are susceptible to local maxima, like hill-climbing. Local search methods are known as metaheuristics. Here are some very important local search methods: Simulated annealing, Genetic algorithms, Ant colony optimization, Neural networks. We will discuss some ideas of how to improve local search in the following slides.
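
A generic hill-climbing sketch of the "small improvement" loop described above (illustrative; the neighbours and score functions are placeholders supplied per problem, and the loop can get stuck on a local maximum exactly as the next slide shows):

```python
def hill_climb(initial, neighbours, score, max_steps=1000):
    """Greedy local search: move to a better neighbour until none exists (may stop at a local maximum)."""
    current = initial
    for _ in range(max_steps):
        best = max(neighbours(current), key=score, default=current)
        if score(best) <= score(current):
            return current                     # no improving neighbour: local maximum
        current = best
    return current

# Toy usage: maximise -(x - 3)^2 over the integers, with neighbours x - 1 and x + 1.
print(hill_climb(0, lambda x: [x - 1, x + 1], lambda x: -(x - 3) ** 2))  # -> 3
```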

19 [Diagram] Local search: starting from an initial state, cumulative improvement moves to a bit better state at each step. Susceptible to local maxima. How to solve?

20 How to Improve Local Search [Q] Any good idea? 1. Keep cumulative improvement. 2. Give more diversity while keeping stability. 3. Give some random walks.

21 2.2 Exchanging Heuristics A simple local search method. Heuristic repair is an example of an exchanging heuristic. It involves exchanging one or more variables at each step by giving them different values; variables are exchanged until the new state becomes better, and this step is repeated until a solution is found. A k-exchange involves swapping the values of k variables. It can be used to solve the traveling salesman problem.
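
A sketch of a 2-exchange for the traveling salesman problem (the details are an assumption, not the slides' own code): reverse a randomly chosen segment of the tour and keep the change only when the tour gets shorter.

```python
import math, random

def tour_length(tour, coords):
    """Total length of a closed tour over the given city coordinates."""
    return sum(math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_exchange(tour, coords, steps=10000):
    """Repeatedly try reversing a random segment (a 2-exchange); keep it if the tour improves."""
    tour = list(tour)
    best = tour_length(tour, coords)
    for _ in range(steps):
        i, j = sorted(random.sample(range(len(tour)), 2))
        candidate = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        length = tour_length(candidate, coords)
        if length < best:
            tour, best = candidate, length
    return tour, best

coords = [(random.random(), random.random()) for _ in range(20)]   # 20 random cities
print(two_exchange(list(range(20)), coords))
```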

22 [Diagram] Exchanging heuristics: from an initial state, a random choice of exchange gives diversity; cumulative improvement moves to a bit better state at each step.

23 2.3 Iterated Local Search A local search is applied repeatedly from different initial states. Useful in cases where the search space is extremely large, and exhaustive search will not be possible.
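
A hedged sketch of that restart idea, reusing the hill_climb function from the local-search sketch above (so this is only a fragment, under that assumption): run the local search from several random initial states and keep the best result found.

```python
def iterated_local_search(random_state, neighbours, score, restarts=20):
    """Run a local search from several random initial states and return the best result found."""
    best = None
    for _ in range(restarts):
        candidate = hill_climb(random_state(), neighbours, score)  # hill_climb from the sketch above
        if best is None or score(candidate) > score(best):
            best = candidate
    return best
```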

24 2.4 Simulated Annealing A method based on the way in which metal is heated and then cooled very slowly in order to make it extremely strong. It aims at obtaining a minimum value for some function of a large number of variables; this value is known as the energy of the system. Based on the Metropolis Monte Carlo simulation. Simple Monte Carlo simulation: a method of learning information about the shape of a search space. E.g., a square partially contained within a circle. How to identify what proportion of the square is within the circle? By random sampling.
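
A tiny illustration of that random-sampling idea, under the assumption that the square is the unit square [0, 1] × [0, 1] and the circle is the unit circle centred at the origin (so the true proportion is π/4 ≈ 0.785):

```python
import random

def proportion_inside_circle(samples=100_000):
    """Estimate what fraction of the unit square lies inside the unit circle centred at the origin."""
    hits = sum(1 for _ in range(samples)
               if random.random() ** 2 + random.random() ** 2 <= 1.0)
    return hits / samples

print(proportion_inside_circle())  # roughly pi/4 ~ 0.785
```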

25 Algorithm: A random initial state is selected. A small random change is made, i.e., a new state is selected that makes a small change to the current state. If this change lowers the system energy, it is accepted. If it increases the energy, it may be accepted, depending on a probability called the Boltzmann acceptance criterion: e^(-dE/T), where T is the current temperature and dE is the increase in energy produced by moving from the previous state to the new state.

26 e^(-dE/T), where T is the current temperature and dE is the increase in energy produced by moving from the previous state to the new state. To determine whether to move to a higher-energy state or not: if a random number in (0, 1) is less than the probability above, then move. When the process starts, T is high, meaning increases in energy are relatively likely to happen. Over successive iterations, T lowers and increases in energy become less likely. [Plot: the acceptance probability P falls as T decreases.]
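
A sketch of the acceptance rule and cooling loop described on these two slides; the geometric cooling schedule and its parameters are assumptions, not taken from the slides.

```python
import math, random

def accept(dE, T):
    """Boltzmann acceptance: always accept improvements; accept increases with probability e^(-dE/T)."""
    return dE <= 0 or random.random() < math.exp(-dE / T)

def simulated_annealing(state, neighbour, energy, T=10.0, cooling=0.995, T_min=1e-3):
    """Minimise energy(state) by accepting occasional uphill moves while the temperature is high."""
    while T > T_min:
        candidate = neighbour(state)
        if accept(energy(candidate) - energy(state), T):
            state = candidate
        T *= cooling              # geometric cooling schedule (an assumption)
    return state

# Toy usage: minimise (x - 3)^2 over the reals with a small random step as the neighbour move.
print(simulated_annealing(0.0, lambda x: x + random.uniform(-1, 1), lambda x: (x - 3) ** 2))
```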

27 [Diagram] Simulated annealing: from an initial state, random choices give better diversity – each step may move to a bit better or a bit worse state – while cumulative improvement continues; the diversity gradually becomes less effective -> better stability.

28 [Q] Why is Simulated Annealing good? Because the energy of the system is allowed to increase by using random selection, simulated annealing is able to escape from local minima. Simulated annealing is a widely used local search method for solving problems with very large numbers of variables. For example: scheduling problems, traveling salesman, placing VLSI (chip) components. [Q] How to improve? Combination of iterated local search and simulated annealing?

29 2.5 Parallel Search Some search methods can be easily split into tasks which can be solved in parallel -> improved diversity. Important concepts to consider are: Divide and conquer? Task distribution. Load balancing. Tree ordering.

30 2.6 Genetic Algorithms A method based on biological evolution. Create chromosomes which represent possible solutions to a problem. The best chromosomes in each generation are bred with each other to produce a new generation. Much more detail on this later. A form of local search, which has: cumulative improvement, better diversity, better stability, randomness, and quick searching. An advanced form of the combination of iterated local search and simulated annealing.

31 Start with k randomly generated states (population) – the 1st generation. A state is represented as a string over a finite alphabet (often a string of 0s and 1s) (encoding). Evaluation function (fitness function): higher values for better states. Next generation: a successor state is generated by combining two parent states. Produce the next generation of states by selection according to the fitness function, crossover, and mutation. Next generation …
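
A hedged sketch of this generation loop for the 8-queens encoding used on the next slides (population size, mutation rate, and crossover details are assumptions): selection is fitness-proportional, crossover splices two parents at a random cut point, and mutation occasionally changes a queen's row.

```python
import random

def fitness(state):
    """Number of non-attacking pairs of queens (max 28 for 8 queens)."""
    n = len(state)
    attacks = sum(1 for i in range(n) for j in range(i + 1, n)
                  if state[i] == state[j] or abs(state[i] - state[j]) == j - i)
    return n * (n - 1) // 2 - attacks

def crossover(p1, p2):
    cut = random.randrange(1, len(p1))           # splice the parents at a random cut point
    return p1[:cut] + p2[cut:]

def mutate(state, rate=0.1):
    return [random.randint(1, 8) if random.random() < rate else g for g in state]

def genetic_algorithm(pop_size=50, generations=1000):
    population = [[random.randint(1, 8) for _ in range(8)] for _ in range(pop_size)]
    for _ in range(generations):
        if max(map(fitness, population)) == 28:   # a solution has appeared
            break
        weights = [fitness(s) for s in population]                     # fitness-proportional selection
        parents = random.choices(population, weights=weights, k=2 * pop_size)
        population = [mutate(crossover(parents[2 * i], parents[2 * i + 1]))
                      for i in range(pop_size)]
    return max(population, key=fitness)

best = genetic_algorithm()
print(best, fitness(best))
```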

32 [Figure: two parent states, 32752411 and 24748552, produce the offspring 32748552 by crossover] How to represent the left individual (i.e., state)? (3, 1, 7, 5, 8, 6, 4, 6). Fitness function: number of non-attacking pairs of queens (min = 0, max = 8 × 7/2 = 28). The larger, the better, in this example. [Q] Fitness values of the next states?

33 [Figure] The fitness value of 32752411 is 23; the fitness value of 24748552 is 24.

34 Fitness function: number of non-attacking pairs of queens (min = 0, max = 8 × 7/2 = 28). 24/(24+23+20+11) = 31%, 23/(24+23+20+11) = 29%, etc. Can you evaluate them?
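
The selection percentages on this slide come from dividing each fitness value by the total fitness of the generation; a short check:

```python
fitnesses = [24, 23, 20, 11]
total = sum(fitnesses)                                                  # 78
for f in fitnesses:
    print(f"fitness {f}: selected with probability {f / total:.0%}")    # 31%, 29%, 26%, 14%
```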

