
1 CS 175 Project in AI

2 Lectures: ICS 180, Tuesday, Thursday, 9.30-10.50 am
Discussion: DBH 1300, Wednesday, 1.00-1.50 pm
Instructor: Natalia Flerova, DBH 4099, e-mail: nflerova@uci.edu
TA: TBA
Class website: www.ics.uci.edu/~nflerova/cs175/index.html

3 Grading
Project proposal: 5%
Midterm report: 30%
Demo-presentation: 30%
Final report: 35%

4 Grading: deadlines
Project proposal: due April 7th
Midterm report: due May 5th
Demo-presentation: during 10th and finals weeks
Final report: due June 11th

5 Project ideas
Constraints: class scheduling, TA assignment, Sudoku
Bayesian networks: advising a first-year student
Vision: image classification, image recognition (e.g., handwritten digit recognition)

6 Project ideas
Games: Othello (Reversi), Checkers, Go, Japanese crossword, etc.
Other: RoboSoccer simulation, email management project, estimating the space explored by heuristic search

7 Project proposal
A single-page report describing two ideas for your project (indicate priorities). Be sure to address the following issues:
- What is the main purpose of your project? What tasks will your end product solve?
- What AI techniques and approaches are you planning to use? (It's OK to provide only a high-level description.)
- How are you planning to split the responsibilities between group members?

8 Constraint Networks Overview

9 Suggested reading: Russell and Norvig, Artificial Intelligence: A Modern Approach, Chapter 5.

10 A good source of advanced information: Rina Dechter, Constraint Processing, Morgan Kaufmann.

11 11 Outline CSP: Definition, and simple modeling examples Representing constraints Basic search strategy Improving search: –Consistency algorithms –Look-ahead methods –Look-back methods

12 12 Outline CSP: Definition, and simple modeling examples Representing constraints Basic search strategy Improving search: –Consistency algorithms –Look-ahead methods –Look-back methods

13 Constraint Satisfaction
Example: map coloring
Variables: countries (A, B, C, etc.)
Values: colors (e.g., red, green, black)
Constraints: adjacent countries must take different colors; e.g., the allowed (A,B) pairs are (red,green), (red,black), (green,red), (green,black), (black,green), (black,red).

14 Constraint Network: Definition
A constraint network is R = (X, D, C): X the variables, D their domains, C the constraints. Each constraint in R expresses the allowed tuples over its scope.
A solution is an assignment to all variables that satisfies all constraints (the join of all relations).
Tasks: decide consistency, find one or all solutions, counting, optimization.
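To make the R = (X, D, C) notation concrete, here is a minimal Python sketch (not from the slides) modeling the map-coloring example from the previous slide; the adjacency structure used is an illustrative assumption.

```python
from itertools import product

# Variables X, domains D, constraints C as (scope, allowed-relation) pairs.
# The adjacency below is illustrative, not taken from the slide's figure.
X = ["A", "B", "C", "D"]
D = {v: ["red", "green", "black"] for v in X}
adjacent = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]
C = [((u, v), lambda a, b: a != b) for (u, v) in adjacent]

def is_solution(assignment):
    """A solution assigns every variable and satisfies every constraint."""
    return all(rel(assignment[u], assignment[v]) for (u, v), rel in C)

# Brute-force enumeration of the (small) assignment space.
solutions = [dict(zip(X, vals))
             for vals in product(*(D[v] for v in X))
             if is_solution(dict(zip(X, vals)))]
print(len(solutions), "solutions; one of them:", solutions[0])
```

Enumerating the whole assignment space is only feasible for tiny networks; the backtracking and consistency techniques later in the deck are what make larger instances tractable.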

15 Example: the 4-queens problem
Place 4 queens on a 4x4 chess board such that no two queens reside in the same row, column or diagonal.
Standard CSP formulation of the problem:
Variables: each row is a variable (x1, x2, x3, x4).
Domains: the column positions {1, 2, 3, 4}.

16 Example: the 4-queens problem (continued)
Constraints: there are C(4,2) = 6 binary constraints, one for each pair of rows, forbidding two queens in the same column or on the same diagonal.
Constraint graph: every pair of variables is constrained, so the graph is complete.
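A hedged Python sketch of this formulation (variable and function names are illustrative): rows are variables, columns are values, and each of the 6 row pairs carries one binary constraint.

```python
from itertools import combinations, product

N = 4
rows = list(range(1, N + 1))          # variables: one per row
domain = list(range(1, N + 1))        # values: column positions

def ok(ri, ci, rj, cj):
    """Binary constraint between rows ri and rj: not same column, not same diagonal."""
    return ci != cj and abs(ci - cj) != abs(ri - rj)

def is_solution(cols):
    # cols[i] is the column of the queen in row i+1
    return all(ok(ri, cols[ri - 1], rj, cols[rj - 1])
               for ri, rj in combinations(rows, 2))

solutions = [cols for cols in product(domain, repeat=N) if is_solution(cols)]
print(solutions)   # the 4-queens problem has exactly 2 solutions
```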

17

18

19 19 Outline CSP: Definition, and simple modeling examples Representing constraints Basic search strategy Improving search: –Consistency algorithms –Look-ahead methods –Look-back methods

20 Representations of a constraint
- Relation: the set of allowed tuples
- Algebraic expression
- Propositional formula
- Constraint graph

21 21 Figure 1.8: Example of set operations intersection, union, and difference applied to relations.

22 Constraint Graphs: Primal, Dual and Hypergraphs
A (primal) constraint graph: a node per variable; arcs connect constrained variables.
A dual constraint graph: a node per constraint scope; an arc connects nodes whose scopes share variables.
A hypergraph: a node per variable; a hyperedge per constraint scope.
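Both graph views can be derived mechanically from the constraint scopes. A small illustrative sketch (the scopes themselves are made up for the example):

```python
from itertools import combinations

# Illustrative constraint scopes (not the slide's example).
scopes = [("A", "B", "C"), ("B", "D"), ("C", "D", "E")]

# Primal graph: one node per variable, an edge between any two variables
# that appear together in some scope.
primal_edges = {frozenset(pair)
                for scope in scopes
                for pair in combinations(scope, 2)}

# Dual graph: one node per scope, an edge between two scopes that share
# variables, labelled by the shared variables.
dual_edges = {(i, j): set(scopes[i]) & set(scopes[j])
              for i, j in combinations(range(len(scopes)), 2)
              if set(scopes[i]) & set(scopes[j])}

print(sorted(tuple(sorted(e)) for e in primal_edges))
print(dual_edges)
```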

23

24

25

26 26 Outline CSP: Definition, and simple modeling examples Representing constraints Basic search strategy Improving search: –Consistency algorithms –Look-ahead methods –Look-back methods

27 Search space

28 28 Backtracking search

29 Backtracking

30

31

32

33

34 The search space
A tree of all partial solutions. A partial solution (a1, ..., aj) satisfies all relevant constraints.
The size of the underlying search space depends on:
- the variable ordering
- the level of consistency possessed by the problem

35 Search space and the effect of ordering

36 Backtracking
Complexity of extending a partial solution:
- Complexity of CONSISTENT: O(e log t), where t bounds the number of tuples per constraint and e is the number of constraints
- Complexity of SELECTVALUE: O(e k log t), where k is the domain size
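A minimal, illustrative Python sketch of backtracking built around the two subroutines named above: CONSISTENT checks the constraints that become fully assigned, and the value loop plays the role of SELECTVALUE. It uses a fixed variable ordering and is not the course's reference implementation.

```python
def consistent(assignment, var, value, constraints):
    """Check every constraint whose scope is fully assigned once var=value is added."""
    trial = dict(assignment, **{var: value})
    return all(rel(*(trial[v] for v in scope))
               for scope, rel in constraints
               if all(v in trial for v in scope))

def backtrack(variables, domains, constraints, assignment=None, all_solutions=False):
    """Depth-first search over partial solutions; returns one solution or a list of all."""
    assignment = assignment or {}
    if len(assignment) == len(variables):
        return [dict(assignment)] if all_solutions else dict(assignment)
    var = variables[len(assignment)]          # fixed variable ordering
    found = []
    for value in domains[var]:                # SELECTVALUE: try the next consistent value
        if consistent(assignment, var, value, constraints):
            assignment[var] = value
            result = backtrack(variables, domains, constraints, assignment, all_solutions)
            if all_solutions:
                found.extend(result)
            elif result is not None:
                return result
            del assignment[var]               # backtrack
    return found if all_solutions else None
```

Called on the map-coloring sketch given earlier, backtrack(X, D, C, all_solutions=True) returns every proper coloring.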

37 A coloring problem example

38 Backtracking search for a solution

39 Backtracking Search for a Solution

40 Backtracking Search for All Solutions

41 41 Outline CSP: Definition, and simple modeling examples Representing constraints Basic search strategy Improving search: –Consistency algorithms –Look-ahead methods –Look-back methods

42 Improving Backtracking
Before search (reducing the search space):
- Arc-consistency, path-consistency, i-consistency
- Variable ordering (fixed)
During search:
- Look-ahead schemes: value ordering/pruning (choose the least restricting value), variable ordering (choose the most constraining variable); see the forward-checking sketch below
- Look-back schemes: backjumping, constraint recording, dependency-directed backtracking
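As a sketch of the simplest look-ahead scheme (forward checking, an assumption here since the slide only names the general idea): after each tentative assignment, prune unsupported values from the domains of unassigned neighbors; an empty domain signals a dead-end immediately, and the surviving domain sizes can drive the value and variable ordering heuristics mentioned above.

```python
import copy

def forward_check(var, value, domains, unassigned, binary_constraints):
    """Look-ahead by forward checking: after tentatively assigning var=value,
    remove unsupported values from the domains of unassigned neighbours.
    binary_constraints maps a pair (x, y) to a relation rel(val_x, val_y).
    Returns the pruned domains, or None if some domain is wiped out (dead-end)."""
    pruned = copy.deepcopy(domains)
    pruned[var] = [value]
    for (x, y), rel in binary_constraints.items():
        if x == var and y in unassigned:
            pruned[y] = [b for b in pruned[y] if rel(value, b)]
        elif y == var and x in unassigned:
            pruned[x] = [a for a in pruned[x] if rel(a, value)]
        if not pruned[x] or not pruned[y]:
            return None            # empty domain: prune this branch immediately
    return pruned
```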

43 Consistency methods
Constraint propagation: inferring new constraints.
It can produce a network explicit enough that search finds a solution without dead-ends.
Approximations of full inference: arc-, path- and i-consistency.
These methods transform the original network into tighter and tighter representations.

44-52 Arc-consistency (worked example)
1 ≤ X, Y, Z, T ≤ 3, with binary constraints over the pairs (X,Y), (Y,Z), (Z,T) and (X,T), one of them Y = Z.
Arc-consistency infers constraints based on pairs of variables: it ensures that every legal value in the domain of one variable has a legal match in the domain of the other variable of the constraint.
Slides 44-52 animate the propagation on this network: unsupported values are repeatedly deleted from the domains until a fixed point is reached.

53 Arc-consistency algorithm (Revise)
An arc (x_i, x_j) is arc-consistent if for every value a in the domain of x_i there exists a matching value b in the domain of x_j.
Algorithm Revise((x_i), x_j) makes arc (x_i, x_j) consistent:
Begin
1. For each a in D_i: if there is no value b in D_j that matches a, then delete a from D_i.
End.
The complexity of Revise is O(k^2), where k is the number of values in each domain.

54 Algorithm AC-3
Begin
1. Q <- all arcs (x_i, x_j), in both directions
2. While Q is not empty do
3.   Select and delete an arc (x_i, x_j) from Q
4.   Revise((x_i), x_j)
5.   If Revise caused a change, then add to the queue all arcs that touch x_i (namely (x_i, x_m) and (x_l, x_i))
6. end-while
End
Complexity:
- Processing an arc requires O(k^2) steps
- Each arc can be processed at most 2k times (once for each value removed from a domain of its endpoints)
- Total complexity is O(e k^3), where e is the number of constraints
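A runnable sketch of REVISE and AC-3 for binary constraints, following the pseudocode above; representing the constraints as a dict of pairwise relations, given in both directions, is an assumption for illustration.

```python
from collections import deque

def revise(xi, xj, domains, constraints):
    """Make arc (xi, xj) consistent: delete values of xi with no support in xj.
    Returns True if the domain of xi changed. O(k^2) per call."""
    rel = constraints[(xi, xj)]
    supported = [a for a in domains[xi] if any(rel(a, b) for b in domains[xj])]
    changed = len(supported) < len(domains[xi])
    domains[xi] = supported
    return changed

def ac3(domains, constraints):
    """constraints maps each ordered pair (xi, xj) to a relation rel(a, b);
    both directions of every constraint must be present."""
    queue = deque(constraints.keys())            # 1. all arcs, both directions
    while queue:                                 # 2.
        xi, xj = queue.popleft()                 # 3. select and delete an arc
        if revise(xi, xj, domains, constraints): # 4.
            if not domains[xi]:
                return False                     # a domain was wiped out: inconsistent
            for (xk, xl) in constraints:         # 5. re-queue arcs pointing into xi
                if xl == xi and xk != xj:
                    queue.append((xk, xl))
    return True                                  # arc-consistent (not necessarily solvable)
```

To run it on the X, Y, Z, T example, each of the four constraints would be entered twice, once per direction (e.g. ('Y','Z') and ('Z','Y') both mapping to an equality test).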

55 Sudoku as Constraint Satisfaction
Each row, column and major block must be all-different.
The puzzle is "well posed" if it has a unique solution.
Variables: the empty slots. Domains: {1, 2, ..., 9}. Constraints: 27 all-different constraints (9 rows, 9 columns, 9 blocks).
Constraint propagation = inference.
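As an illustration of how the 27 all-different constraints drive inference, the sketch below performs the simplest propagation (assigning any empty slot whose candidate set shrinks to a single value); the grid encoding is an assumption, with 0 marking an empty slot.

```python
def units(r, c):
    """Indices of the row, column and 3x3 block containing cell (r, c)."""
    row = [(r, j) for j in range(9)]
    col = [(i, c) for i in range(9)]
    br, bc = 3 * (r // 3), 3 * (c // 3)
    block = [(br + i, bc + j) for i in range(3) for j in range(3)]
    return row + col + block

def propagate_naked_singles(grid):
    """grid: 9x9 list of lists, 0 = empty.  Fill cells forced by all-different."""
    changed = True
    while changed:
        changed = False
        for r in range(9):
            for c in range(9):
                if grid[r][c] == 0:
                    seen = {grid[i][j] for (i, j) in units(r, c)} - {0}
                    candidates = set(range(1, 10)) - seen
                    if len(candidates) == 1:
                        grid[r][c] = candidates.pop()
                        changed = True
    return grid
```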

56 56 Path-consistency

57 57 I-consistency

58 The effect of the consistency level
After arc-consistency: z = 5 and l = 5 are removed.
After path-consistency the pairwise relations are tightened: R'_zx, R'_zy, R'_zl, R'_xy, R'_xl, R'_yl.
Tighter networks yield smaller search spaces.

59 59 Outline CSP: Definition, and simple modeling examples Representing constraints Basic search strategy Improving search: –Consistency algorithms –Look-ahead methods –Look-back methods

60

61

62

63

64

65

66

67

68

69

70

71

72

73 73 Outline CSP: Definition, and simple modeling examples Representing constraints Basic search strategy Improving search: –Consistency algorithms –Look-ahead methods –Look-back methods

74 Look-back: Backjumping / Learning
Backjumping: at a dead-end, jump back to the most recent culprit variable.
Learning: constraint recording (no-good recording), good recording.

75 Backjumping
Partial assignment: (x1=r, x2=b, x3=b, x4=b, x5=g, x6=r), with x7 = {r,b} remaining.
Leaf dead-end: (r,b,b,b,g,r).
(r,b,b,b,g,r) is a conflict set of x7; (r,-,b,b,g,-) is a conflict set of x7; (r,-,b,-,-,-) is a minimal conflict set.
Every conflict set is a no-good.
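A small illustrative sketch of computing a (not necessarily minimal) conflict set in the spirit of Gaschnig-style backjumping, assuming the dead-end is caused by binary constraints with earlier variables; all names are hypothetical.

```python
def conflict_set(deadend_var, domain, assignment, order, binary_constraints):
    """Variables (in search order) whose current values conflict with some value
    of the dead-end variable -- candidates to jump back to.
    binary_constraints maps (x, y) -> rel(val_x, val_y)."""
    culprits = set()
    for value in domain:
        for past_var in order:
            if past_var == deadend_var or past_var not in assignment:
                continue
            rel = binary_constraints.get((past_var, deadend_var))
            if rel is not None and not rel(assignment[past_var], value):
                culprits.add(past_var)   # earliest culprit that kills this value
                break
    return culprits
```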

76

77

78

79

80

81 The cycle-cutset method
An instantiation can be viewed as blocking cycles in the graph.
Given an instantiation to a set of variables that cuts all cycles (a cycle-cutset), the rest of the problem can be solved in linear time by a tree algorithm.
Complexity (n = number of variables, k = domain size, C = cycle-cutset size): enumerating the k^C cutset assignments and solving the remaining tree for each gives roughly O((n - C) k^(C+2)) time.

82 Tree Decomposition

83

84

85

86 GSAT - local search for SAT (Selman, Levesque and Mitchell, 1992)
1. For i = 1 to MaxTries
2.   Select a random assignment A
3.   For j = 1 to MaxFlips
4.     If A satisfies all constraints, return A
5.     Else flip a variable to maximize the score (the number of satisfied constraints); if no flip increases the score, flip at random
6.   end
7. end
Greatly improves hill-climbing by adding restarts and sideways moves.

87 WalkSAT (Selman, Kautz and Cohen, 1994)
Adds random walk to GSAT:
- With probability p, take a random-walk step: flip a variable in some unsatisfied constraint
- With probability 1-p, perform a hill-climbing (GSAT) step
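A compact Python sketch combining the two slides: GSAT-style greedy flips with WalkSAT's probability-p random-walk step. The clause encoding and parameter defaults are illustrative assumptions, and the greedy step here omits GSAT's explicit tie-breaking by random flips.

```python
import random

def num_satisfied(clauses, assignment):
    """A clause is a list of literals; literal i means var i is True, -i means False."""
    return sum(any((lit > 0) == assignment[abs(lit)] for lit in clause)
               for clause in clauses)

def walksat(clauses, n_vars, max_tries=10, max_flips=1000, p=0.5):
    for _ in range(max_tries):                                   # restarts
        a = {v: random.random() < 0.5 for v in range(1, n_vars + 1)}
        for _ in range(max_flips):
            if num_satisfied(clauses, a) == len(clauses):
                return a                                         # all clauses satisfied
            if random.random() < p:                              # random-walk step
                unsat = [c for c in clauses
                         if not any((lit > 0) == a[abs(lit)] for lit in c)]
                v = abs(random.choice(random.choice(unsat)))
            else:                                                # greedy (GSAT) step
                def score(var):
                    a[var] = not a[var]
                    s = num_satisfied(clauses, a)
                    a[var] = not a[var]
                    return s
                v = max(a, key=score)
            a[v] = not a[v]                                      # flip
    return None
```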

88 More stochastic search: simulated annealing, reweighting
Simulated annealing: a method for escaping local minima. It allows bad moves with some probability: with a probability related to a temperature parameter T, the next move is picked randomly. Theoretically, with a slow enough cooling schedule, this algorithm will find the optimal solution (but so will searching randomly).
Breakout method (Morris, 1990): adjust the weights of the violated constraints.

89

