
1 Gradual Relaxation Techniques with Applications to Behavioral Synthesis. Zhiru Zhang, Yiping Fan, Miodrag Potkonjak, Jason Cong. Department of Computer Science, University of California, Los Angeles. Partially supported by NSF under award CCR-0096383.

2 Outline
Motivations & objectives
Gradual relaxation techniques
–Application example: Time-Constrained Scheduling
Other application examples
–Maximum Independent Set (MIS)
–Soft Real-Time System Scheduling
Conclusions

3 Motivations & Objectives
Motivations
–Many synthesis tasks are computationally intractable: SAT, scheduling, graph coloring, …
–Lack of a systematic way to develop effective heuristics
Objectives
–Development of a new general heuristic paradigm: Gradual Relaxation
–Applications to a wide range of synthesis problems

4 Gradual Relaxation Paradigm
Techniques
–Most constrained principle
–Minimal freedom reduction & Negative thinking
–Compounding variables & Simultaneous step consideration
–Calibration
–Probabilistic modeling
Detailed discussion with illustrations

5 Application Example: Time-Constrained Scheduling
Problem: Time-Constrained Scheduling
–Given: (1) A CDFG G(V, E); (2) A time constraint T
–Objective: Schedule the operations of V into T cycles so that the resource usage is minimized and all precedence constraints in G are satisfied
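
For concreteness, here is a minimal sketch of how such an instance might be represented in Python; the CDFG class, the field names, and the single-cycle-operation assumption are illustrative and not taken from the paper.

```python
# Hypothetical representation of a time-constrained scheduling instance.
# Assumption: every operation takes one control step.
from dataclasses import dataclass, field

@dataclass
class CDFG:
    ops: dict                      # operation name -> operation type, e.g. "mul"
    edges: list                    # precedence edges (u, v): u must precede v
    succ: dict = field(default_factory=dict)
    pred: dict = field(default_factory=dict)

    def __post_init__(self):
        self.succ = {v: [] for v in self.ops}
        self.pred = {v: [] for v in self.ops}
        for u, v in self.edges:
            self.succ[u].append(v)
            self.pred[v].append(u)

# Example: schedule four operations into T = 4 control steps.
g = CDFG(ops={"a": "mul", "b": "mul", "c": "mul", "e": "add"},
         edges=[("a", "c"), ("b", "c"), ("c", "e")])
T = 4
```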

6 Time-Constrained Scheduling: Related Work
McFarland, et al., 1990
–High-level synthesis tasks
De Micheli, 1994
–ILP formulations
Lee and Messerschmitt, 1987
–SDF scheduling
Paulin and Knight, 1987
–Force-Directed Scheduling (FDS): exploit schedule freedom (slack) to minimize the hardware resources; iteratively schedule one operation per iteration
…

7 Time-Constrained Scheduling: Concepts in FDS
Determine ASAP & ALAP schedules
Determine the Time Frame of each operation
–Length of box: possible execution cycles
–Width of box: probability of assignment
–Uniform distribution, area assigned = 1
Create Distribution Graphs (DG)
–Sum of probabilities of each Op type
–Indicates concurrency of similar Ops
–DG(i) = Σ_Op Prob(Op, i)
[Figure: ASAP and ALAP schedules, time frames over C-steps 1–4, and DGs for Multiply and for Add/Sub/Comp]
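
As a hedged illustration (reusing the CDFG sketch above, with one-cycle operations assumed), time frames and distribution graphs could be computed as follows; this is not the authors' code.

```python
# ASAP/ALAP time frames and distribution graphs for the FDS setting above.
def time_frames(g, T):
    asap, alap = {}, {}
    def asap_of(v):                 # earliest step: longest path from the sources
        if v not in asap:
            asap[v] = 1 + max((asap_of(u) for u in g.pred[v]), default=0)
        return asap[v]
    def alap_of(v):                 # latest step: longest path to the sinks, from T
        if v not in alap:
            alap[v] = min((alap_of(w) for w in g.succ[v]), default=T + 1) - 1
        return alap[v]
    return {v: (asap_of(v), alap_of(v)) for v in g.ops}

def distribution_graphs(g, frames, T):
    # DG[type][i] = sum over ops of that type of Prob(op, i); each op is
    # uniformly distributed over its time frame, so its total area is 1.
    dg = {}
    for v, (lo, hi) in frames.items():
        p = 1.0 / (hi - lo + 1)
        row = dg.setdefault(g.ops[v], [0.0] * (T + 1))
        for i in range(lo, hi + 1):
            row[i] += p
    return dg
```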

8 Most Constrained Principle
First resolve the most constrained components
Tie breaking – make a decision that minimally impacts the difficulty of still-unresolved constraints

9 Most Constrained Principle: Time-Constrained Scheduling
Operation Op, at control step i, targeting control step t
–Force(Op, i, t) = DG(i) * x(Op, i, t)
–x(Op, i, t): the probability change in step i when Op is scheduled to t
The self force of operation Op w.r.t. control step t
–Self Force(Op, t) = Σ_{i ∈ time frame} Force(Op, i, t)
[Figure: time frames and assignment probabilities for operations a*–k over C-steps 1–4, illustrating the force computation]
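
A small sketch of the self-force computation described above; the uniform-redistribution model for x(Op, i, t) follows the standard FDS formulation, and the function names are assumptions carried over from the earlier fragments.

```python
# Self force of tentatively scheduling 'op' into control step 't'.
def self_force(g, frames, dg, op, t):
    lo, hi = frames[op]
    p_old = 1.0 / (hi - lo + 1)            # current uniform probability
    row = dg[g.ops[op]]                    # DG of op's resource type
    force = 0.0
    for i in range(lo, hi + 1):
        p_new = 1.0 if i == t else 0.0     # op fixed to step t
        force += row[i] * (p_new - p_old)  # Force(op, i, t) = DG(i) * x(op, i, t)
    return force
```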

10 Most Constrained Principle: Related Work
General technique
–Bitner and Reingold, 1975
–Brelaz, 1979, application to graph coloring
–Pearl, 1984, application to intelligent search
–…
Slack-based heuristics
–Davis, et al., 1993
–Goldwasser, 2003
–…
Force-directed heuristic
–Paulin and Knight, 1989
–…

11 Minimal Freedom Reduction / Negative Thinking
Minimal Freedom Reduction – the key to a good heuristic:
–Avoid the greedy behavior of optimization
–Make a small, gradual, atomic decision
–Evaluate its individual impact before committing to large decisions
Negative Thinking – a way to realize Minimal Freedom Reduction
–Traditional heuristics resolve a specific component of the solution
–Negative thinking determines what will NOT be considered as a component of the solution

12 Negative Thinking: Time-Constrained Scheduling
Traditional FDS:
–Select the minimum-force (Op, t), schedule Op to t
Negative-thinking FDS:
–Select the maximum-force (Op, t), remove t from Op's time frame
[Figure: time frames and DGs for Multiply and for Add/Sub/Comp, before and after one negative-thinking step]
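
Below is a minimal sketch of this negative-thinking loop, again reusing the earlier fragments. Only the two endpoints of each time frame are considered as removal candidates so that frames stay contiguous, and precedence propagation is omitted; both are simplifying assumptions.

```python
# Negative-thinking FDS sketch: shrink one time frame per iteration.
def negative_thinking_fds(g, T):
    frames = time_frames(g, T)
    while any(lo < hi for lo, hi in frames.values()):
        dg = distribution_graphs(g, frames, T)
        # Most "harmful" tentative assignment among frame endpoints ...
        op, t = max(((op, t) for op, (lo, hi) in frames.items() if lo < hi
                     for t in (lo, hi)),
                    key=lambda cand: self_force(g, frames, dg, *cand))
        # ... is forbidden: remove t from op's time frame (a small decision).
        lo, hi = frames[op]
        frames[op] = (lo + 1, hi) if t == lo else (lo, hi - 1)
    return {op: lo for op, (lo, hi) in frames.items()}
```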

13 Negative Thinking: Similar Ideas
Verhaegh, et al., 1995
–Improved force-directed scheduling: gradually shrink operations' time frames
Cong and Madden, 1997
–Iterative deletion method for standard-cell global routing: from the complete routing graph, delete edges one by one to obtain an optimized routing tree
…

14 Compounding Variables / Simultaneous Steps Consideration
Compounding variables
–For problems where variables can only be assigned binary values
–Combine several variables together
Simultaneous steps consideration
–Consider a small negative decision on a set of variables simultaneously
Example: a SAT instance
–Compound x1 and x2: there are 4 assignment options
–Evaluate their impact on the most constrained clauses
–Negative thinking: remove one option, keep the other three promising options
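
A toy sketch of compounding on a SAT instance; the three clauses, the difficulty measure (clauses made unsatisfiable by the partial assignment), and the variable names are all invented for illustration.

```python
# Compounding variables + negative thinking on a tiny SAT instance.
# A clause is a list of literals; a literal is (variable, required polarity).
clauses = [[("x1", True), ("x2", False)],
           [("x1", False), ("x3", True)],
           [("x2", True), ("x3", True)]]

def unsatisfied_if(assignment, clauses):
    """Clauses whose literals are all assigned and all false under the partial
    assignment (an illustrative difficulty measure, not the paper's metric)."""
    bad = 0
    for clause in clauses:
        decided = [assignment.get(v) == pol for v, pol in clause if v in assignment]
        if len(decided) == len(clause) and not any(decided):
            bad += 1
    return bad

# Compound x1 and x2: four joint assignment options.
options = [{"x1": a, "x2": b} for a in (False, True) for b in (False, True)]
# Negative thinking: discard only the single worst option, keep the other three.
worst = max(options, key=lambda o: unsatisfied_if(o, clauses))
options.remove(worst)
print("pruned:", worst, "kept:", options)
```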

15 Calibration
Heuristics conduct the optimization
–Keep the options for important variables
–Discard the options for unimportant variables
Example: Time-Constrained Scheduling
–Multipliers are much more expensive than adders
–Preserve maximum slack for the multiplications
–Lower the priority of minimizing the required adders
[Figure: time frames over C-steps 1–4 before and after calibration]
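
One plausible way to realize this calibration (the cost weights below are invented) is to scale each operation's force by the cost of its resource type, so decisions that consume multiplier slack are penalized far more than those that consume adder slack.

```python
# Calibration sketch: cost-weighted force (reuses self_force above).
COST = {"mul": 8.0, "add": 1.0, "sub": 1.0, "cmp": 1.0}   # illustrative weights

def calibrated_force(g, frames, dg, op, t):
    return COST[g.ops[op]] * self_force(g, frames, dg, op, t)
```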

16 Probabilistic Modeling
Options of variables are non-uniformly distributed
Probabilistic modeling
–A non-uniform function of all constraints imposed on a particular variable
–Prob(Op1, 1) = 0.6
–Prob(Op1, 2) = 0.3
–Prob(Op1, 3) = 0.1
[Figure: non-uniform assignment probabilities of operations 1, 2, 3 across control steps]
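
As a sketch of one possible non-uniform model (the slack-product weighting below is an assumption, not the paper's formula), each candidate step of an operation can be weighted by the room it leaves to the operation's predecessors and successors.

```python
# Non-uniform assignment probabilities derived from precedence constraints.
def nonuniform_probs(g, frames, op):
    lo, hi = frames[op]
    weights = []
    for i in range(lo, hi + 1):
        pred_room = min((i - frames[u][0] for u in g.pred[op]), default=1)
        succ_room = min((frames[w][1] - i for w in g.succ[op]), default=1)
        weights.append(max(pred_room, 0) * max(succ_room, 0) + 1e-9)
    total = sum(weights)
    return {i: w / total for i, w in zip(range(lo, hi + 1), weights)}
```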

17 When is Gradual Relaxation Most Effective?
Minimal freedom reduction / Negative thinking
–A large number of variables have significant slack
–Variables have complex interactions among a large number of constraints
Compounding variables / Simultaneous steps consideration
–Each variable has a small set of potential values
Calibration
–The final solution only involves relatively few types of resources
Probabilistic modeling
–Effective for large and complex instances

18 Experimental Results: Time-Constrained Scheduling (1)
TCS results comparison under critical-path time constraint

19 Experimental Results: Time-Constrained Scheduling (2)
TCS results comparison under time constraint with 1.5x critical path length

20 Application Example: Maximum Independent Set (1)
Problem: Maximum Independent Set
–Given: G(V, E)
–Objective: find a maximum-size independent set V′ ⊆ V, such that for all u ∈ V′ and v ∈ V′, (u, v) ∉ E
Related work
–Garey and Johnson, 1979: a popular generic NP-Complete problem
–Kirovski and Potkonjak, 1998: useful for efficient graph coloring
–…

21 Application Example: Maximum Independent Set (2)
Reasoning:
–In practice, the MIS size is much smaller than the total graph size
A smaller decision:
–Select a most constrained vertex NOT to be in the MIS
–Simple heuristic: h1(v) = number of neighbors of v
–Look-forward heuristic: h2(v) = Σ_{u ∈ Neighbors(v)} 1 / (number of neighbors of u)
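
A hedged sketch of the negative-thinking MIS heuristic built on these two scores; the adjacency representation and removal loop are assumptions consistent with the slides, not the authors' implementation.

```python
# Negative-thinking MIS: repeatedly decide that the most constrained vertex
# will NOT be in the independent set, until no edges remain.
def mis_negative_thinking(adj, look_forward=True):
    alive = {v: set(adj[v]) for v in adj}      # vertex -> surviving neighbors
    def h1(v):                                 # simple heuristic
        return len(alive[v])
    def h2(v):                                 # look-forward heuristic
        return sum(1.0 / max(len(alive[u]), 1) for u in alive[v])
    score = h2 if look_forward else h1
    while any(alive[v] for v in alive):        # some edge remains
        worst = max((v for v in alive if alive[v]), key=score)
        for u in alive[worst]:                 # delete 'worst' from the graph
            alive[u].discard(worst)
        del alive[worst]
    return set(alive)                          # the remaining vertices are independent
```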

22 Experimental Results: Maximum Independent Set
Apply to DIMACS benchmark graphs
Compare to a state-of-the-art iterative algorithm
–MIS algorithm used in Kirovski and Potkonjak, DAC 1998
–Similar quality
–Much faster: 50X using h1, 30X using h2
Look-forward heuristic outperforms the simple version

23 Application Example: Soft Real-Time System Scheduling (1)
Problem: Soft Real-Time System Scheduling
–Given: (1) A set of non-preemptive tasks {τ1, τ2, …, τn}, where each task τi = (ai, di, ei) is characterized by an arrival time ai, a deadline di, and an execution time ei; (2) A single processor P; (3) A time constraint T
–Objective: Schedule a subset of the tasks on processor P within the available time T so that the number of tasks scheduled is maximized

24 Application Example: Soft Real-Time System Scheduling (2)
Modeling multimedia applications
–Kao and Garcia-Molina, 1994
–Adelberg, et al., 1994
Modeling video and WWW servers
–Jones, et al., 1997
CAD and embedded system modeling
–Ziegenbein, et al., 2000
–Verkest, et al., 2001
–Richter, et al., 2002
Formal definition
–D'Argenio, et al., 1999
…

25 Application Example: Soft Real-Time System Scheduling (3)
Two-phase heuristic:
–Conflict minimization: gradually shrink the time frame of every task
–Legalization
Probabilistic modeling:
–Trapezoid-shaped task probability distribution
[Figure: prob(τi, t) over time t, a trapezoid with corners at si, si+ei, ci−ei, and ci]
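
Interpreting the trapezoid corners above as si, si+ei, ci−ei, and ci (window start, ramp-up end, ramp-down start, cutoff), one consistent sketch assumes the start time is uniform over the window and computes the slot-occupancy probability directly; the exact form in the paper may differ.

```python
# Trapezoid-shaped probability that task i occupies time slot t.
def trapezoid_prob(t, s_i, c_i, e_i):
    """Assumes the task's start is uniform over [s_i, c_i - e_i] and slots
    are unit length (illustrative model)."""
    n_starts = c_i - e_i - s_i + 1             # number of feasible start slots
    if n_starts <= 0 or t < s_i or t >= c_i:
        return 0.0
    covering = min(t, c_i - e_i) - max(s_i, t - e_i + 1) + 1
    return max(covering, 0) / n_starts
```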

26 Application Example: Soft Real-Time System Scheduling (4)
Objective:
–Minimize the number of conflicts
Repeat until all tasks are locked
–Update the distribution graph
–Compute forces for every task at its start and cutoff time slots
–Select the maximum-force (task, slot) pair; remove that slot from the task's time frame
[Figure: task probability vs. time slot, before and after one frame-shrinking step]
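
A compact sketch of this loop, reusing trapezoid_prob above; the conflict measure (expected number of tasks per slot) and the endpoint-force definition mirror the FDS analogy and are assumptions, not the paper's exact algorithm.

```python
# Conflict minimization: shrink task windows until every task is locked.
def conflict_minimization(tasks, horizon):
    # tasks: dict name -> [s, c, e] = window start, cutoff, execution time.
    def dg(tasks):
        # Expected number of tasks occupying each time slot.
        return [sum(trapezoid_prob(t, s, c, e) for s, c, e in tasks.values())
                for t in range(horizon)]
    def endpoint_force(name, start, graph):
        # Force of committing 'name' to begin at 'start' (cf. the FDS self force).
        s, c, e = tasks[name]
        return sum(graph[t] * ((1.0 if start <= t < start + e else 0.0)
                               - trapezoid_prob(t, s, c, e))
                   for t in range(s, c))
    while any(c - s > e for s, c, e in tasks.values()):          # unlocked tasks left
        graph = dg(tasks)
        cands = []
        for name, (s, c, e) in tasks.items():
            if c - s > e:
                cands.append((endpoint_force(name, s, graph), name, "start"))
                cands.append((endpoint_force(name, c - e, graph), name, "cutoff"))
        _, name, end = max(cands)                                # most harmful endpoint
        if end == "start":
            tasks[name][0] += 1                                  # forbid the earliest start
        else:
            tasks[name][1] -= 1                                  # forbid the latest finish
    return tasks
```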

27 Experimental Results: Soft Real-Time System Scheduling

28 Conclusions
Development of gradual relaxation techniques
–Most constrained principle
–Minimal freedom reduction & Negative thinking
–Compounding variables & Simultaneous step consideration
–Calibration
–Probabilistic modeling
Applications to:
–Maximum independent set
–Time-constrained scheduling
–Soft real-time scheduling

29 Thank you!




