Artificial Intelligence (CS 370D)


Artificial Intelligence (CS 370D) — Princess Nora University, Faculty of Computer & Information Systems

Chapter 4: Beyond Classical Search

Local Search Strategies: Hill-Climbing Search, Simulated Annealing Search, Local Beam Search, and Genetic Algorithms.

Classical Search versus Local Search. Classical search performs a systematic exploration of the search space: it keeps one or more paths in memory and records which alternatives have been explored at each point along the path; the path to the goal is itself the solution to the problem. In many optimization problems, however, the path to the goal is irrelevant; the goal state itself is the solution. The state space is then a set of "complete" configurations, and the task is to find a configuration satisfying the constraints, or the best state according to some objective function h(s) — e.g., in n-queens, h(s) = number of pairs of attacking queens. In such cases, we can use local search algorithms.

Local Search Algorithms keep a single "current" state and move to neighboring states in order to try to improve it. The solution path need not be maintained; hence the search is "local." Local search is suitable for problems in which the path is not important; the goal state itself is the solution. It is an optimization search.

Example: n-queens. Put n queens on an n × n board so that no two queens share the same row, column, or diagonal. In the 8-queens problem, what matters is the final configuration of queens, not the order in which they are added.
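As a concrete instance, the n-queens objective can be scored by counting attacking pairs. This is an illustrative sketch; the tuple-per-column representation (one queen per column) is an assumption, not part of the slides:

```python
# Hypothetical sketch: h(s) = number of pairs of queens attacking each other.
# A state is a tuple where state[col] = row of the queen in that column,
# so no two queens ever share a column.
def attacking_pairs(state):
    n = len(state)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            same_row = state[i] == state[j]
            same_diag = abs(state[i] - state[j]) == j - i
            if same_row or same_diag:
                count += 1
    return count

print(attacking_pairs((0, 1, 2, 3)))  # all queens on one diagonal -> 6 pairs
print(attacking_pairs((1, 3, 0, 2)))  # a 4-queens solution -> 0
```

A goal state is any state with h(s) = 0.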

Local Search: Key Idea. 1. Select a (random) initial state (generate an initial guess). 2. Make a local modification to improve the current state (evaluate the current state and move to a neighboring state). 3. Repeat Step 2 until a goal state is found (or time runs out).

Local Search: Advantages and Drawback. Advantages: uses very little memory (usually a constant amount); can often find reasonable solutions in large or infinite (e.g., continuous) state spaces for which systematic search is unsuitable. Drawback: local search can get stuck in local maxima and fail to find the optimal solution.

State Space Landscape. A state-space landscape is a graph of states associated with their costs.

Hill-Climbing Search

Hill-Climbing Search. Main Idea: keep a single current node and move to a neighboring state to improve it. It uses a loop that continuously moves in the direction of increasing value (uphill): choose the best successor (choosing randomly if there is more than one), and terminate when a peak is reached where no neighbor has a higher value. It is also called greedy local search, or steepest ascent/descent.

Hill-Climbing Search
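A minimal sketch of this loop, assuming problem-specific `neighbors` and `value` functions (both hypothetical here); for simplicity, ties are broken by taking the first best successor rather than randomly:

```python
# Steepest-ascent hill climbing: repeatedly move to the best neighbor
# until no neighbor improves on the current state.
def hill_climb(start, neighbors, value):
    current = start
    while True:
        best = max(neighbors(current), key=value, default=None)
        if best is None or value(best) <= value(current):
            return current  # peak reached: no neighbor has a higher value
        current = best

# Toy usage: maximize -(x - 3)^2 over the integers by moving +/- 1.
result = hill_climb(0,
                    neighbors=lambda x: [x - 1, x + 1],
                    value=lambda x: -(x - 3) ** 2)
print(result)  # climbs from 0 to the peak at x = 3
```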

Hill-Climbing in Action …
[Figure: a sequence of animation frames showing the current solution moving along the cost curve, with the local and global minima marked.]
Drawback: depending on the initial state, hill climbing can get stuck in a local maximum/minimum or on a flat local maximum and fail to find the solution. Cure: random restarts.

Simulated Annealing Search

The Problem. Most minimization strategies find the nearest local minimum. The standard strategy: generate a trial point based on current estimates; evaluate the function at the proposed location; accept the new value if it improves the solution.

The Solution. We need a strategy to find other minima. This means we must sometimes accept new points that do not improve the solution. How?

Annealing is one manner in which crystals are formed: gradual cooling of a liquid. At high temperatures, molecules move freely; at low temperatures, molecules are "stuck." If cooling is slow, a low-energy, organized crystal lattice is formed.

Simulated Annealing Search. Main Idea: escape local maxima by allowing some "bad" moves but gradually decreasing their frequency. Instead of always picking the best move, it picks a random move.

Simulated Annealing Search. Say the change in the objective function is d. If d is positive, move to that state; otherwise, move to that state with probability exp(d/T). Thus worse moves (very large negative d) are executed less often.

Simulated Annealing
[Figure: a sequence of animation frames showing the current solution exploring the cost landscape, occasionally accepting worse states, and eventually settling near the best (global) minimum.]

Simulated Annealing Search: worked example. Let's say there are 3 moves available, with changes in the objective function of d1 = -0.1, d2 = 0.5, d3 = -5 (let T = 1). Pick a move randomly: if d2 is picked, move there. If d1 or d3 is picked, the probability of the move is exp(d/T). Move 1: prob1 = exp(-0.1) ≈ 0.9, i.e., we will accept this move about 90% of the time. Move 3: prob3 = exp(-5) ≈ 0.007, i.e., we will accept this move less than 1% of the time.

Properties of Simulated Annealing. Cooling schedule: determines the rate at which the temperature T is lowered.
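The acceptance rule and cooling schedule can be sketched together. The toy 1-D problem, the neighbor move, and the geometric schedule parameters (t0, alpha, step count) are illustrative assumptions, not part of the slides:

```python
import math
import random

# Simulated annealing with a geometric cooling schedule (T <- alpha * T).
def simulated_annealing(start, neighbor, cost, t0=10.0, alpha=0.95, steps=1000):
    current = start
    t = t0
    for _ in range(steps):
        candidate = neighbor(current)
        d = cost(current) - cost(candidate)  # d > 0 means the move improves
        if d > 0 or random.random() < math.exp(d / t):
            current = candidate  # worse moves accepted with probability e^(d/T)
        t *= alpha  # cooling: as T falls, fewer bad moves are accepted
    return current

# Toy usage: minimize (x - 5)^2 starting far from the minimum.
random.seed(0)
best = simulated_annealing(
    start=0.0,
    neighbor=lambda x: x + random.uniform(-1, 1),
    cost=lambda x: (x - 5) ** 2)
print(round(best, 1))  # settles near the global minimum at x = 5
```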

Local Beam Search

Local Beam Search. Main Idea: keep track of k states rather than just one. Start with k randomly generated states. At each iteration, generate all the successors of all k states. If any one is a goal state, stop; otherwise select the k best successors from the complete list and repeat. Drawback: the k states tend to regroup very quickly in the same region, causing a lack of diversity.
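The k-state loop above can be sketched as follows; the `successors` and `value` functions and the toy objective are illustrative assumptions:

```python
import heapq
import random

# Local beam search: pool the current beam with all successors of its
# k states, then keep only the k best by value.
def local_beam_search(starts, successors, value, k=3, steps=40):
    beam = list(starts)
    for _ in range(steps):
        pool = beam + [s for state in beam for s in successors(state)]
        beam = heapq.nlargest(k, pool, key=value)  # k best survive
    return max(beam, key=value)

# Toy usage: maximize -(x - 7)^2 from several random starting points.
random.seed(1)
starts = [random.randint(-20, 20) for _ in range(3)]
best = local_beam_search(starts,
                         successors=lambda x: [x - 1, x + 1],
                         value=lambda x: -(x - 7) ** 2)
print(best)  # the beam converges on the maximum at x = 7
```

Note how quickly the beam collapses onto the single best region — the diversity drawback mentioned above.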

Local Beam Search
[Figure: animation frames showing k parallel states climbing the cost landscape and clustering in the most promising region.]

Genetic Algorithm Search

Genetic Algorithms were formally introduced in the US in the 1970s by John Holland. GAs emulate ideas from genetics and natural selection and can search potentially large spaces. Before we can apply a genetic algorithm to a problem, we need to answer: How is an individual represented? What is the fitness function? How are individuals selected? How do individuals reproduce?

Genetic Algorithms: Representation of states (solutions). Each state or individual is represented as a string over a finite alphabet; it is also called a chromosome, which contains genes. Example encoding: the solution 607 is encoded as the binary-string chromosome 1001011111.

Genetic Algorithms: Fitness Function. Each state is rated by an evaluation function called the fitness function. The fitness function should return higher values for better states, e.g., Fitness(x) = 1/Cost(x), so that Fitness(X) is greater than Fitness(Y) whenever state X has lower cost than state Y (as in the cost-landscape figure, where X lies below Y).

GA Parent Selection — Roulette Wheel. 1. Sum the fitnesses of all the population members, giving the total fitness TF. 2. Generate a random number m between 0 and TF. 3. Return the first population member whose fitness, added to the fitnesses of the preceding population members, is greater than or equal to m.
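The selection rule above can be sketched directly; the example population and fitness values below are invented for illustration:

```python
import random

# Roulette-wheel (fitness-proportionate) selection: walk the population
# accumulating fitness until the running total reaches a random
# threshold m in [0, TF].
def roulette_select(population, fitness):
    total = sum(fitness(ind) for ind in population)  # TF
    m = random.uniform(0, total)
    running = 0.0
    for ind in population:
        running += fitness(ind)
        if running >= m:
            return ind
    return population[-1]  # guard against floating-point round-off

# Toy usage: the fitter an individual, the more often it is drawn.
random.seed(0)
pop = ["weak", "average", "strong"]
scores = {"weak": 1.0, "average": 3.0, "strong": 6.0}
picks = [roulette_select(pop, scores.get) for _ in range(1000)]
print(picks.count("strong"))  # roughly 600 of 1000 draws (6/10 of TF)
```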

Genetic Algorithms: Selection. How are individuals selected? Roulette-wheel selection. [Figure: a wheel with slices 1–8, one per chromosome, sized by fitness; total fitness = 18.] Example draws: Rnd[0..18] = 7 selects Chromosome 4; Rnd[0..18] = 12 selects Chromosome 6.

Genetic Algorithms: Crossover and Mutation. How do individuals reproduce?

Genetic Algorithms: Crossover (Recombination). With some high probability (the crossover rate, typically 0.8 to 0.95), apply crossover to the parents: pick a single crossover point at random and swap the tails. Example: Parent1 = 1010000000, Parent2 = 1001011111 → Offspring1 = 1011011111 (the first three bits from Parent1, the rest from Parent2), with Offspring2 built from the complementary pieces.
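A sketch of single-point crossover on bit-string parents; the cut position is drawn at random, so the exact offspring vary from run to run:

```python
import random

# Single-point crossover: cut both parents at one random position and
# swap the tails, as in the example above.
def single_point_crossover(p1, p2):
    point = random.randint(1, len(p1) - 1)  # cut strictly inside the string
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

random.seed(3)
o1, o2 = single_point_crossover("1010000000", "1001011111")
print(o1, o2)  # two length-10 offspring; the exact split depends on the seed
```

Between them, the two offspring contain exactly the bits of the two parents, just recombined.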

Genetic Algorithms: Mutation. With some small probability (the mutation rate, typically between 0.001 and 0.1), flip each bit in the offspring. Example: Offspring1 = 1011011111 mutates to 1011001111; Offspring2 = 1010000000 mutates to 1000000000.
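Bit-flip mutation can be sketched as follows; the extreme rates in the usage lines are only to make the behavior visible:

```python
import random

# Flip each bit independently with probability `rate` (the mutation rate).
def mutate(chromosome, rate=0.01):
    return "".join(
        ("1" if bit == "0" else "0") if random.random() < rate else bit
        for bit in chromosome)

random.seed(0)
print(mutate("1011011111", rate=0.0))  # rate 0: string is unchanged
print(mutate("1011011111", rate=1.0))  # rate 1: every bit is flipped
```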

Exercise: Given the following parents, P1 and P2, and the template T, show how the uniform crossover operator works. [Table residue: C1 = A F C D B I H J; C2 = E G … (the remainder of the table was lost in extraction).]

Exercise: If P3 and P2 are chosen as parents and we apply one-point crossover, show the resulting children C1 and C2, using a crossover point of 1 (after the first digit). Do the same using P4 and P2 with a crossover point of 2 (after the second digit) to create C3 and C4. Then do multiple-point crossover using parents P1 and P3, with crossover points at digits 1 and 4, to give C5 and C6.

Chromosome  Binary String
P1          11100
P2          01111
P3          10111
P4          00100

Chromosome  Binary String
C1          11111
C2          00111
C3          00111
C4          01100
C5          10110
C6          11101

Genetic Algorithms. Algorithm: 1. Initialize the population with p individuals at random. 2. For each individual h, compute its fitness. 3. While the maximum fitness is below the threshold, create a new generation Ps (by selection, crossover, and mutation) and recompute fitness. 4. Return the individual with the highest fitness.
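Putting the pieces together, here is an end-to-end sketch of the algorithm on the toy "one-max" problem (maximize the number of 1 bits). The population size, rates, and best-so-far bookkeeping are illustrative choices, not from the slides:

```python
import random

def one_max(bits):
    # fitness = number of 1 bits in the chromosome
    return bits.count("1")

def roulette(pop):
    # fitness-proportionate selection, as on the earlier slides
    total = sum(one_max(b) for b in pop)
    m = random.uniform(0, total)
    running = 0.0
    for b in pop:
        running += one_max(b)
        if running >= m:
            return b
    return pop[-1]

def genetic_algorithm(n_bits=20, pop_size=30, generations=100,
                      crossover_rate=0.9, mutation_rate=0.01):
    pop = ["".join(random.choice("01") for _ in range(n_bits))
           for _ in range(pop_size)]
    best = max(pop, key=one_max)
    for _ in range(generations):
        best = max(pop + [best], key=one_max)
        if one_max(best) == n_bits:
            break  # fitness threshold reached
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = roulette(pop), roulette(pop)
            if random.random() < crossover_rate:  # single-point crossover
                cut = random.randint(1, n_bits - 1)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):  # bit-flip mutation
                nxt.append("".join(
                    ("1" if c == "0" else "0")
                    if random.random() < mutation_rate else c
                    for c in child))
        pop = nxt[:pop_size]
    return max(pop + [best], key=one_max)

random.seed(42)
winner = genetic_algorithm()
print(one_max(winner))  # typically at or near the optimum of 20
```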

Genetic algorithms: crossover can have the effect of "jumping" to a completely different part of the search space (quite non-local).

Thank you End of Chapter 4