
1 Composing Dataflow Analyses and Transformations Sorin Lerner (University of Washington) David Grove (IBM T.J. Watson) Craig Chambers (University of Washington)

2 Phase ordering problem
Optimizations can interact in mutually beneficial ways, and no order exploits all of these interactions. Classic example: constant propagation and unreachable code elimination.
x := 11; if (x == 11) { DoSomething(); } else { DoSomethingElse(); x := x + 1; } y := x; // value of y?
const prop (the test folds to true) followed by unreachable code elimination gives:
x := 11; DoSomething(); y := x; // value of y?
const prop again gives:
x := 11; DoSomething(); y := 11;

3 One known solution: iterate individual analyses until the results don't change.
x := 11; do { if (x == 11) { DoSomething(); } else { DoSomethingElse(); x := x + 1; } } while (...) y := x; // value of y?
Drawbacks: the compiler is slow, and in the presence of loops in the source program, iteration might not yield the best possible results.

4 Another known solution: a hand-written, monolithic super-analysis.
This loses modularity: such analyses are difficult to write, reuse, and extend.
Examples:
– conditional constant propagation [Wegman and Zadeck 91]
– class analysis, splitting and inlining [Chambers and Ungar 90]
– const prop and pointer analysis [Pioli and Hind 99]

5 Ideally, we want to:
– write analyses modularly
– exploit mutually beneficial interactions
– have a fast compiler
We present a composition framework that achieves this.

6 The key to modular composition
Traditionally, optimizations are defined in two parts:
1. A dataflow analysis.
2. Rules for transforming the program representation after the analysis is solved.
The key insight is to merge these two parts: dataflow functions return either a dataflow value OR a replacement graph with which to replace the current statement.
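As a rough illustration of this merged interface, a flow function can be modeled as returning one of two tagged results. The Python encoding below is a sketch with assumed names (PROPAGATE/REPLACE tags, a tuple-based statement form), not the framework's actual API, which works over the Vortex/Whirlwind graph IR.

    # Constant-propagation flow function in the merged style: it returns either
    # ("PROPAGATE", outgoing environment) or ("REPLACE", replacement graph).
    TOP = "TOP"   # lattice top: value unknown

    def const_prop_flow(stmt, env):
        kind = stmt[0]
        if kind == "const":                     # ("const", "y", 5)  ~  y := 5
            _, var, k = stmt
            return ("PROPAGATE", {**env, var: k})
        if kind == "add":                       # ("add", "y", "x", 2)  ~  y := x + 2
            _, var, src, k = stmt
            if isinstance(env.get(src), int):
                # The operand is a known constant: return a replacement graph
                # ("y := 5") instead of a plain outgoing dataflow value.
                return ("REPLACE", [("const", var, env[src] + k)])
            return ("PROPAGATE", {**env, var: TOP})
        return ("PROPAGATE", env)

    print(const_prop_flow(("const", "y", 5), {}))           # ('PROPAGATE', {'y': 5})
    print(const_prop_flow(("add", "y", "x", 2), {"x": 3}))  # ('REPLACE', [('const', 'y', 5)])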

7 Roadmap
– Several small examples that show how flow functions work
– One large example that shows how modular analyses are automatically composed together
– Overview of the theory behind the framework
– Experimental validation

8 Flow function returning a dataflow value y := 5

9 Flow function returning a dataflow value y := 5 [... ] [..., y → 5] PROPAGATE

10 Flow function returning a replacement graph y := x+2

11 [x → 3] Flow function returning a replacement graph y := x+2 [x → 3] REPLACE y := 5 Replacement graph Step 1: Initialize input edges with dataflow information

12 Flow function returning a replacement graph y := 5 [x → 3] PROPAGATE [x → 3, y → 5] Step 2: Perform recursive dataflow analysis on the replacement graph

13 Flow function returning a replacement graph y := 5 [x → 3] PROPAGATE [x → 3, y → 5] Step 3: Propagate dataflow information from output edges.

14 Flow function returning a replacement graph y := x+2 [x → 3] [x → 3, y → 5] Replacement graphs: – used to compute outgoing dataflow information for the current statement. – a convenient way of specifying what might otherwise be a complicated flow function.
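A minimal sketch of the framework side of steps 1-3, under the same assumed encoding as the earlier sketch: when a flow function returns a replacement graph, the framework seeds the graph with the incoming facts, analyzes it recursively, and takes the facts on its output edge as the statement's outgoing facts.

    def analyze_stmt(stmt, env, flow):
        tag, payload = flow(stmt, env)
        if tag == "REPLACE":
            # Steps 1-3: seed the replacement graph with the incoming facts,
            # analyze it recursively, and use its output facts as this
            # statement's outgoing facts.
            return analyze_graph(payload, env, flow)
        return payload                        # ordinary PROPAGATE case

    def analyze_graph(stmts, env, flow):
        for s in stmts:                       # straight-line graph, for simplicity
            env = analyze_stmt(s, env, flow)
        return env

    def demo_flow(stmt, env):                 # tiny constant-propagation flow function
        kind, var, *rest = stmt
        if kind == "const":
            return ("PROPAGATE", {**env, var: rest[0]})
        src, k = rest
        if isinstance(env.get(src), int):
            return ("REPLACE", [("const", var, env[src] + k)])
        return ("PROPAGATE", {**env, var: "TOP"})

    print(analyze_stmt(("add", "y", "x", 2), {"x": 3}, demo_flow))   # {'x': 3, 'y': 5}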

15 Flow function returning a replacement graph y := x+2 [x → 3] [x → 3, y → 5] Soundness requirement: –Replacement graph must have the same concrete semantics as the original statement, but only on concrete inputs that are consistent with the current dataflow facts.

16 Flow function returning a replacement graph y := x+2 [x → 3] [x → 3, y → 5] Let’s assume we’ve reached a fixed point.

17 Flow function returning a replacement graph y := x+2 [x → 3] [x → 3, y → 5] y := 5 Let’s assume we’ve reached a fixed point.

18 Flow function returning a replacement graph y := 5 [x → 3] [x → 3, y → 5] Replacement graphs: –used to transform the program once a fixed point has been reached. Let’s assume we’ve reached a fixed point.

19 Iterative analysis example y := x+2 [x → 3, y → 5] [x → 3][x → T ] Now, let’s assume we haven’t reached a fixed point.

20 Iterative analysis example y := x+2 [x → 3, y → 5] PROPAGATE [x → 3][x → T ] [x → T, y → T ] Now, let’s assume we haven’t reached a fixed point.
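The "T" in these frames denotes the lattice top (unknown value). A small sketch of the merge that produces it for constant-propagation environments, again with an assumed dictionary encoding: a variable keeps a constant only if both incoming edges agree on it.

    # Merge (meet) of two constant-propagation environments.
    TOP = "TOP"

    def merge_envs(a, b):
        out = {}
        for var in set(a) | set(b):
            va, vb = a.get(var, TOP), b.get(var, TOP)
            out[var] = va if va == vb else TOP
        return out

    # [x -> 3] merged with [x -> TOP] gives [x -> TOP], so on the next visit the
    # flow function for "y := x + 2" must PROPAGATE [x -> TOP, y -> TOP] rather
    # than request a replacement.
    print(merge_envs({"x": 3}, {"x": TOP}))   # {'x': 'TOP'}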

21 Branch folding example if (x == 11) FT

22 Branch folding example if (x == 11) REPLACE [x → 11] FT

23 Branch folding example [x → 11]

24 Branch folding example if (x == 11) [x → 11] FT
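Branch folding fits the same pattern. A hedged sketch of a flow function for "if (x == k)": when the incoming facts pin down x, it returns a replacement graph containing only a jump to the taken successor (the "goto" encoding and the T/F edge labels are assumptions, not the framework's real graph representation).

    # Sketch of a branch-folding flow function for "if (x == k)": if x is a
    # known constant, replace the branch with an unconditional jump.
    def branch_fold_flow(var, k, env):
        v = env.get(var)
        if isinstance(v, int):
            target = "T" if v == k else "F"
            return ("REPLACE", [("goto", target)])   # branch folded away
        return ("PROPAGATE", env)                    # both successors stay reachable

    # "if (x == 11)" under [x -> 11] folds to a jump to the true successor.
    print(branch_fold_flow("x", 11, {"x": 11}))      # ('REPLACE', [('goto', 'T')])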

25 Composing several analyses
x := new C; do { b := x instanceof C; if (b) { x := x.foo(); } else { x := new D; } } while (...)
class A { A foo() { return new A; } }; class C extends A { A foo() { return self; } }; class D extends A { };
Composed analyses: Constant Propagation, Class Analysis, Inlining, Unreachable code elimination
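The composed analysis carries a tuple of dataflow values, one component per analysis, which is what the (..., ..., ..., ...) facts in the following slides show. A rough sketch of how the composed flow function might be built, with assumed names and encodings:

    # Composed flow function over a tuple of analyses.  Each component analysis
    # sees the statement together with its own component of the incoming tuple.
    # If any analysis asks for a replacement graph, the whole composed function
    # returns REPLACE; the replacement graph is then re-analyzed by all of the
    # analyses, which is how they communicate implicitly through transformations.
    def compose(analyses):
        def composed_flow(stmt, tuple_env):
            outs = []
            for flow, env in zip(analyses, tuple_env):
                tag, payload = flow(stmt, env)
                if tag == "REPLACE":
                    return ("REPLACE", payload)
                outs.append(payload)
            return ("PROPAGATE", tuple(outs))
        return composed_flow

In this sketch the first analysis that requests a replacement decides the current visit; because the replacement graph is then re-analyzed by the composed flow function, every other analysis sees the transformation immediately, which is the implicit communication the later slides point out.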

26 x := new C merge b := x instanceof C x := new Dx := x.foo() merge while(…) if (b) TF

27 x := new C b := x instanceof C x := new Dx := x.foo() if (b) PROPAGATE while(…) PROPAGATE [x → T ] [x → {C}] T merge TF PROPAGATE T

28 x := new C b := x instanceof C x := new Dx := x.foo() if (b) PROPAGATE while(…) PROPAGATE [x → T ] [x → {C}] T ([x → T ], [x → {C}], T, T ) merge PROPAGATE TF T

29 x := new C b := x instanceof C x := new Dx := x.foo() if (b) PROPAGATE ([x → T ], [x → {C}], T, T ) while(…) merge TF

30 x := new C b := x instanceof C x := new Dx := x.foo() if (b) while(…) PROPAGATE [x → T, b → T ] merge TF ([x → T ], [x → {C}], T, T )

31 x := new C b := x instanceof C x := new Dx := x.foo() if (b) ([x → T ], [x → {C}], T, T ) REPLACE b := true while(…) [x → T, b → T ] merge TF ([x → T ], [x → {C}], T, T )

32 b := true ([x → T ], [x → {C}], T, T ) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) PROPAGATE

33 ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) b := true ([x → T ], [x → {C}], T, T )

34 ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) x := new C b := x instanceof C x := new Dx := x.foo() if (b) Replacement graph is analyzed by composed analysis. When one analysis chooses a replacement graph, other analyses see it immediately. Analyses communicate implicitly through graph transformations while(…) merge TF ([x → T ], [x → {C}], T, T )

35 x := new C b := x instanceof C x := new Dx := x.foo() if (b) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) REPLACE σ while(…) merge TF ([x → T ], [x → {C}], T, T )

36 x := new C b := x instanceof C x := new Dx := x.foo() if (b) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) σσ while(…) merge TF ([x → T ], [x → {C}], T, T )

37 σ σ (,,, )

38 σ σ σ (,,, ) (,,, )

39 x := new C b := x instanceof C x := new Dx := x.foo() if (b) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) while(…) merge TF ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) (,,, ) ([x → T ], [x → {C}], T, T )

40 x := new C b := x instanceof C x := new Dx := x.foo() if (b) ([x → T ], [x → {C}], T, T ) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) while(…) merge TF ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) (,,, ) REPLACE (,,, )

41 (,,, ) (,,, ) (,,, )

42 x := new C b := x instanceof C x := new Dx := x.foo() if (b) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) while(…) merge TF ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) (,,, ) ([x → T ], [x → {C}], T, T ) (,,, )

43 σ x := new C b := x instanceof C x := new Dx := x.foo() if (b) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) REPLACE x := C::foo(x) while(…) merge T (,,, ) (,,, ) F ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) ([x → T ], [x → {C}], T, T ) σ

44 x := C::foo(x) σ REPLACE x := x σ class C extends A { A foo() { return self; } }
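A sketch of the inlining step shown here: a hypothetical flow function consults the class-analysis component of the composed tuple and, when the receiver's class set is a singleton with a known method body, returns that body as the replacement graph. Parameter and result substitution are elided, and the method table below is invented for this example.

    # Illustrative class table: method bodies as tiny replacement graphs.
    # C::foo's body "return self" becomes "x := x" for this call site.
    METHOD_BODIES = {("C", "foo"): [("assign_var", "x", "x")]}

    def inline_call_flow(dst, recv, method, class_env):
        # Flow function for "dst := recv.method()" driven by class-analysis facts.
        classes = class_env.get(recv, {"unknown"})
        if len(classes) == 1:
            (cls,) = classes
            body = METHOD_BODIES.get((cls, method))
            if body is not None:
                return ("REPLACE", body)     # inline the unique callee's body
        return ("PROPAGATE", class_env)      # cannot devirtualize: keep the call

    # With class analysis giving [x -> {C}], "x := x.foo()" is replaced by "x := x".
    print(inline_call_flow("x", "x", "foo", {"x": {"C"}}))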

45 x := x σ σ PROPAGATE

46 x := x σ σ σ

47 x := C::foo(x) σ σ σ

48 σ σ ([x → T, b → true], [x → {C}, b → {Bool}], T, T )

49 ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) x := new C b := x instanceof C x := x.foo() if (b) while(…) merge T ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) (,,, ) ([x → T ], [x → {C}], T, T ) (,,, ) x := new D F

50 x := new C b := x instanceof C x := x.foo() if (b) PROPAGATE ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) while(…) merge T x := new D F ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) (,,, ) ([x → T ], [x → {C}], T, T ) (,,, )

51 x := new C b := x instanceof C x := x.foo() if (b) PROPAGATE ([x → T ], [x → {C}], T, T ) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) while(…) merge T x := new D F ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) (,,, ) ([x → T ], [x → {C}], T, T ) (,,, )

52 x := new C b := x instanceof C x := x.foo() if (b) while(…) merge T x := new D F ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) ([x → T, b → true], [x → {C}, b → {Bool}], T, T ) (,,, ) ([x → T ], [x → {C}], T, T ) (,,, ) ([x → T ], [x → {C}], T, T )

53 x := new C b := x instanceof C x := x.foo() if (b) x := x b := true while(…) merge T x := new D F

54 x := new C b := true x := x x := new C; do { b := x instanceof C; if (b) { x := x.foo(); } else { x := new D; } } while (...) x := new C; do { b := true; x := x; } while (...) while(…) merge

55 x := new C; do { b := x instanceof C; if (b) { x := x.foo(); } else { x := new D; } } while (...) x := new C; do { b := true; x := x; } while (...) Analyses are defined modularly and separately, yet combining them achieves the results of a monolithic analysis. If the analyses were instead run separately, in any order and any number of times, none of these optimizations could be performed.

56 Theoretical foundation
Definition: we used abstract interpretation to define precisely how the framework composes analyses.
Soundness theorem: if the individual analyses are sound, the composed analysis is sound.
Termination theorem: composed analyses are guaranteed to terminate, under reasonable conditions.
Precision theorem: if the composed flow function is monotonic, the composed analysis is guaranteed to produce results at least as precise as running the individual analyses in sequence, any number of times, in any order.

57 Experimental validation
We implemented and used our framework in the Vortex and Whirlwind compilers for 5+ years.
– composed: class analysis, splitting, inlining, const prop, CSE, removal of redundant loads and stores, symbolic assertion prop
We compiled a range of big programs in Vortex; the largest benchmark was the Vortex compiler itself, with ~60,000 lines.
Our framework vs. iteration: compiles at least 5 times faster.
Our framework vs. a monolithic super-analysis: same precision, compiles at most 20% slower.

58 Conclusions
We developed and implemented a new approach for defining dataflow analyses.
Our approach allows analyses to be written modularly, making them easier to write and reuse.
Our approach allows analyses to be automatically combined into a monolithic analysis.

