1 Stocs – A Stochastic CSP Solver Bella Dubrov IBM Haifa Research Lab © Copyright IBM

2 Outline
- CSP solving algorithms
  - Systematic
  - Stochastic
- Limitations of systematic methods
- Stochastic approach
- Stocs algorithm
- Stocs challenges
- Summary

3 Constraint satisfaction problems
- Variables: Anna, Beth, Cory, Dave
- Domains: Red, Green, Orange, Yellow houses
- Constraints:
  - The Red and Green houses are in the city
  - The Orange and Yellow houses are in the countryside
  - The Red and Green houses are neighbors, as are the Orange and Yellow houses
  - Anna and Dave have dogs; Beth owns a cat
  - Dogs and cats cannot be neighbors
  - Dogs must live in the countryside
- Solution: Anna lives in the Orange house, Beth in the Red house, Cory in the Green house, and Dave in the Yellow house
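The doll-house puzzle above is small enough to solve by brute force. Here is a minimal sketch in Python (all names are ours for illustration, not part of any solver); note that the constraints as stated also admit symmetric variants of the slide's solution:

```python
from itertools import permutations

people = ["Anna", "Beth", "Cory", "Dave"]
houses = ["Red", "Green", "Orange", "Yellow"]
countryside = {"Orange", "Yellow"}
# Red/Green are neighbors, and so are Orange/Yellow.
neighbor_pairs = {frozenset({"Red", "Green"}), frozenset({"Orange", "Yellow"})}
dog_owners = {"Anna", "Dave"}
cat_owner = "Beth"

def satisfies(assignment):
    # Dogs must live in the countryside.
    if any(assignment[p] not in countryside for p in dog_owners):
        return False
    # Dogs and cats cannot be neighbors.
    for d in dog_owners:
        if frozenset({assignment[d], assignment[cat_owner]}) in neighbor_pairs:
            return False
    return True

# Enumerate all assignments of distinct houses to the four people.
solutions = [dict(zip(people, perm))
             for perm in permutations(houses)
             if satisfies(dict(zip(people, perm)))]
```

The assignment given on the slide (Anna → Orange, Beth → Red, Cory → Green, Dave → Yellow) is among the solutions found.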

4 CSP solving algorithms
- Systematic: GEC
- Stochastic: Stocs

5 Systematic approach
- Systematically go over the search space
- Use pruning whenever possible
- Pruning is done by projection

6 Example: a projector for multiplication
- a × b = c, with a ∈ [2, 20], b ∈ [3, 20], c ∈ [1, 20]
- Projection to input 1: a’ = [2, 6]
- Projection to input 2: b’ = [3, 10]
- Projection to the result: c’ = {6, 8, 9, 10, 12, 14, 15, 16, 18, 20}
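For small finite domains, these projections can be computed by brute force: keep only the values that participate in at least one consistent triple a × b = c. A minimal sketch (function name is ours):

```python
def project_multiply(a_dom, b_dom, c_dom):
    # Keep only values that occur in some consistent triple a * b == c.
    a2 = {a for a in a_dom if any(a * b in c_dom for b in b_dom)}
    b2 = {b for b in b_dom if any(a * b in c_dom for a in a_dom)}
    c2 = {a * b for a in a_dom for b in b_dom if a * b in c_dom}
    return a2, b2, c2

# The domains from the slide: a in [2, 20], b in [3, 20], c in [1, 20].
a2, b2, c2 = project_multiply(set(range(2, 21)),
                              set(range(3, 21)),
                              set(range(1, 21)))
```

This reproduces the slide's projections: a’ = [2, 6], b’ = [3, 10], and c’ = {6, 8, 9, 10, 12, 14, 15, 16, 18, 20}. Real projectors typically work on interval or bit-set representations rather than explicit enumeration, which only scales to tiny domains.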

7 Limitations of systematic methods: example 1
- Propagation is hard (e.g., factoring)

8 Limitations of systematic methods: example 2
- The Some-Different constraint: given a graph on the variables, variables connected by an edge must take different values
- Propagation is NP-hard for domains of size k ≥ 3 (by reduction from k-colorability)

9 Limitations of systematic methods: example 3
- A problem with only one solution
- Local consistency holds at the onset, so propagation prunes nothing
- Probability of success: 1/N

10 Stochastic approach
- State: an assignment of values to all the variables
- Cost: a function from the set of states to {0} ∪ R+
- Cost = 0 iff all constraints are satisfied by the state

11 Stochastic approach
- General idea:
  - Start from some state
  - Find the next state and move there
  - Stop when a state with cost 0 is found
- Stochastic algorithms are usually incomplete
- Different stochastic algorithms use different heuristics for finding the next state
- Examples:
  - Simulated annealing
  - Tabu search
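The general idea above can be sketched as a greedy local-search loop, parameterized by a cost function and a neighbor generator. This is an illustrative sketch (names and the acceptance rule are ours), not Stocs itself:

```python
import random

def stochastic_search(initial_state, cost, neighbor, max_steps=10000, seed=0):
    # Generic stochastic local search: move to a candidate neighbor when
    # it is not worse, and stop when a zero-cost state (a solution) is found.
    rng = random.Random(seed)
    state = initial_state
    for _ in range(max_steps):
        if cost(state) == 0:
            return state
        candidate = neighbor(state, rng)
        if cost(candidate) <= cost(state):
            state = candidate
    return None  # incomplete: may give up even though a solution exists

# Toy usage: find x = 42 by random ±1 moves, cost = distance to 42.
result = stochastic_search(0, lambda x: abs(x - 42),
                           lambda x, rng: x + rng.choice([-1, 1]))
```

Returning `None` after `max_steps` makes the incompleteness explicit: unlike a systematic search, exhausting the step budget proves nothing about the existence of a solution.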

12 Stocs algorithm overview
- Check states on length scales “typical” for the problem; hop to a new state if its cost is lower
- Learn the topography of the problem: the typical step sizes and directions
- Accept domain knowledge as input strategies

13 Example: the Social Golfer Problem
- Problem:
  - 7 groups of players, 6 members in each group
  - Play for 4 weeks
  - No two players may play together (in the same group) twice
- Exponential decrease
- No sense in trying step sizes larger than 20, but the search may benefit strongly from step sizes of 10–15
- Reproducible: it characterizes the problem

14 Example: LABS
- Problem: minimize the autocorrelation of a sequence of N (here 45) bits
- Non-exponential decrease, followed by saturation
- It makes sense to always try large steps
- Identifies small characteristic features
- Extremely reproducible

15 Example: Selection Problem
- Problem: select different values for three variables out of a given set of values (smaller than the domains)
- Easy problem: results are aggregated over many runs
- Prefer larger step sizes (up to a cutoff)
- Reproducible

16 Example: Selection Problem, different modeling
- Problem: same as before, modeled differently
- Prefer intermediate step sizes
- Reproducible

17 Stocs algorithm
At each step:
  decide the attempt type: random, learned, or user-defined
  if random:
    choose a random step
  if learned:
    decide the learn type: step size, direction, …
    if step size:
      choose a step size that was previously successful (weighted)
      create a random attempt with the chosen step size
    if direction:
      choose a direction that was previously successful (weighted)
      create a random attempt with the chosen direction
  if user-defined:
    get the next user-defined attempt
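The attempt-selection step can be sketched as follows. This is a hypothetical illustration of the pseudocode above, covering only the step-size branch; all names, the success-count representation, and the fallback behavior are our own assumptions, not Stocs internals:

```python
import random

def next_attempt(successful_sizes, user_attempts, rng):
    # successful_sizes: map from step size to the number of times it
    # previously led to a lower-cost state.
    # user_attempts: queue of user-defined attempts (may be empty).
    kinds = ["random", "learned"] + (["user-defined"] if user_attempts else [])
    kind = rng.choice(kinds)
    if kind == "learned" and successful_sizes:
        # Sample a step size weighted by how often it succeeded before.
        sizes = sorted(successful_sizes)
        weights = [successful_sizes[s] for s in sizes]
        return ("learned", rng.choices(sizes, weights=weights)[0])
    if kind == "user-defined":
        return ("user-defined", user_attempts.pop(0))
    return ("random", rng.randint(1, 20))  # fall back to a random step

rng = random.Random(0)
attempt = next_attempt({5: 3, 12: 1}, [], rng)
```

Weighting by past success counts is what makes the learning self-reinforcing: step sizes that match the problem's topography get tried more often.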

18 Optimization problems
- Constraints must be satisfied
- In addition, an objective function to be optimized is given
- Example: doll houses
  - Constraints as before
  - In addition, each doll has a preferred set of houses
  - The best solution satisfies as many of the preferences as possible

19 Optimization with Stocs
- Last year we added an optimization capability to Stocs
- Optimization is natural for Stocs:
  - First find a solution
  - Then keep searching for a better state
- Implementation:
  - The cost function maps a state to a pair of non-negative numbers (c1, c2):
    - c1 is the cost of the constraints
    - c2 is the value of the objective function
  - Pairs are ordered lexicographically:
    - a better state always improves the constraints first
    - after a state with c1 = 0 is found, Stocs keeps searching for a better c2
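The lexicographic cost pair falls out almost for free in languages with tuple comparison. A minimal sketch (function names are ours):

```python
def combined_cost(state, constraint_cost, objective):
    # (c1, c2): c1 is the constraint-violation cost, c2 is the objective
    # value to minimize. Python tuples compare lexicographically, so the
    # ordinary < on cost pairs gives exactly the order described above.
    return (constraint_cost(state), objective(state))

# Any reduction in constraint cost dominates the objective:
assert (0, 99) < (1, 0)   # a feasible state beats any infeasible one
# Among feasible states (c1 == 0), the lower objective wins:
assert (0, 3) < (0, 7)
```

This ordering guarantees the search never trades constraint satisfaction for a better objective value.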

20 Preprocessing and initialization
- Before the search starts, two things happen:
- Preprocessing of the problem, including:
  - finding bits that must be constant in any solution
  - removing unnecessary variables
  - simplifying constraints
- Preprocessing has a big impact on the search: last year it improved performance by a factor of 100
- Initialization: finding the initial state
  - Starting the search from a good state is critical
  - Currently, each constraint tries to initialize its variables to a satisfying assignment while considering the “wishes” of the other constraints

21 Summary
- Limitations of systematic methods
- Stochastic approach: move between full assignments
- Stocs: learn the topography of the problem; allow user-defined heuristics
- Optimization with Stocs
- Preprocessing and initialization
- Variable types

22 Thank you
