7. Optimization
Prof. O. Nierstrasz
Lecture notes by Marcus Denker

© Marcus Denker

Roadmap
 Introduction
 Optimizations in the Back-end
 The Optimizer
 SSA Optimizations
 Advanced Optimizations

Literature
 Muchnick: Advanced Compiler Design and Implementation
—more than 600 pages on optimizations
 Appel: Modern Compiler Implementation in Java
—covers the basics

Roadmap
 Introduction
 Optimizations in the Back-end
 The Optimizer
 SSA Optimizations
 Advanced Optimizations

Optimization: The Idea
 Transform the program to improve efficiency
 Performance: faster execution
 Size: smaller executable, smaller memory footprint
Tradeoffs:
1) Performance vs. size
2) Compilation speed and memory

No Magic Bullet!
 There is no perfect optimizer
 Example: optimize for simplicity
Opt(P): the smallest program with the same behaviour as P
Q: a program with no output that does not stop
Opt(Q)?  →  L1: goto L1
Recognizing Q would solve the halting problem!

Another way to look at it...
 Rice (1953): For every compiler there is a modified compiler that generates shorter code.
 Proof: Assume there is a compiler U that generates the shortest optimized program Opt(P) for all P.
—Take P to be a program that does not stop and has no output
—Opt(P) will be L1: goto L1
—This decides the halting problem. Thus: U does not exist.
 There will always be a better optimizer!
—Job guarantee for compiler architects :-)

Optimization on many levels
 Optimizations happen both in the optimizer and in the back-end

Roadmap
 Introduction
 Optimizations in the Back-end
 The Optimizer
 SSA Optimizations
 Advanced Optimizations

Optimizations in the Back-end
 Register Allocation
 Instruction Selection
 Peephole Optimization

Register Allocation
 The processor has only a finite number of registers
—The register set can be very small (x86)
 Temporary variables
—non-overlapping temporaries can share one register
 Passing arguments via registers
 Optimizing register allocation is very important for good performance
—Especially on x86

Instruction Selection
 For every expression, there are many ways to realize it on a given processor
 Example: multiplication by 2 can be done by a bit-shift
Instruction selection is a form of optimization

Peephole Optimization
 Simple local optimization
 Look at the code “through a small hole”
—replace known sequences by shorter ones
—the table of replacements is pre-computed
Examples:
store R,a; load a,R  →  store R,a
imul 2,R  →  ashl 1,R
Important when using simple instruction selection!
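The table-driven scheme above can be sketched in Python. The instruction strings and the two table entries are illustrative only, not a real instruction set:

```python
# Sketch of a table-driven peephole optimizer (hypothetical instructions).
# Known sequences are mapped to shorter equivalents.
PEEPHOLE_TABLE = {
    ("store R,a", "load a,R"): ("store R,a",),  # load after store is redundant
    ("imul 2,R",): ("ashl 1,R",),               # strength reduction via shift
}

def peephole(code):
    """Slide a small window over the code and rewrite until a fixpoint."""
    changed = True
    while changed:
        changed = False
        for size in (2, 1):                     # try longer patterns first
            i = 0
            while i + size <= len(code):
                seq = tuple(code[i:i + size])
                if seq in PEEPHOLE_TABLE:
                    code[i:i + size] = list(PEEPHOLE_TABLE[seq])
                    changed = True
                else:
                    i += 1
    return code
```

Real peephole optimizers work on the back-end's actual instruction encoding, but the fixpoint loop over a small window is the essential idea.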

Optimization on many levels
 Most optimization is done in a dedicated optimizer phase

Roadmap
 Introduction
 Optimizations in the Back-end
 The Optimizer
 SSA Optimizations
 Advanced Optimizations

Examples of Optimizations
 Constant Folding / Propagation
 Copy Propagation
 Algebraic Simplifications
 Strength Reduction
 Dead Code Elimination
—Structure Simplifications
 Loop Optimizations
 Partial Redundancy Elimination
 Code Inlining

Constant Folding
 Evaluate constant expressions at compile time
 Only possible when side-effect freeness is guaranteed
c := 1 + 3  →  c := 4
not false  →  true
Caveat: floats — the result could differ between machines!
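As an illustration, folding constant additions can be sketched with Python's standard ast module. This operates on Python source rather than the lecture's IR, and only handles integer `+` to keep the sketch short:

```python
import ast

# Sketch: fold constant '+' expressions at "compile time" using the ast module.
class ConstantFolder(ast.NodeTransformer):
    def visit_BinOp(self, node):
        self.generic_visit(node)  # fold children first (bottom-up)
        if (isinstance(node.op, ast.Add)
                and isinstance(node.left, ast.Constant)
                and isinstance(node.right, ast.Constant)):
            # replace the whole addition by its computed value
            return ast.copy_location(
                ast.Constant(node.left.value + node.right.value), node)
        return node

def fold(src):
    tree = ConstantFolder().visit(ast.parse(src))
    ast.fix_missing_locations(tree)
    return ast.unparse(tree)
```

Note how floats are deliberately not special-cased here; a production folder must decide whether compile-time float arithmetic matches the target machine.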

Constant Propagation
 Variables that have a constant value, e.g. c := 3
—Later uses of c can be replaced by the constant
—If there is no change of c in between!
Before:
b := 3
c := 1 + b
d := b + c
After:
b := 3
c := 1 + 3
d := 3 + c
Analysis is needed, as b can be assigned more than once!
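A minimal sketch of this pass over straight-line code. The tuple encoding of statements is a made-up representation for illustration: each statement is `(target, expr)` where `expr` is an int constant, a variable name, or `(op, a, b)`:

```python
# Sketch of constant propagation over straight-line three-address code.
def propagate(stmts):
    env, out = {}, []          # env: variable -> known constant value
    for target, expr in stmts:
        if isinstance(expr, tuple):            # (op, a, b): substitute operands
            op, a, b = expr
            expr = (op, env.get(a, a), env.get(b, b))
        elif isinstance(expr, str):            # plain copy x := y
            expr = env.get(expr, expr)
        if isinstance(expr, int):
            env[target] = expr                 # target is now a known constant
        else:
            env.pop(target, None)              # reassignment kills the fact
        out.append((target, expr))
    return out
```

In real compilers the "if no change in between" condition requires data-flow analysis over the CFG; straight-line code sidesteps that.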

Copy Propagation
 For a statement x := y
 replace later uses of x with y, if x and y have not been changed in between.
Before:
x := y
c := 1 + x
d := x + c
After:
x := y
c := 1 + y
d := y + c
Analysis is needed, as y and x can be assigned more than once!

Algebraic Simplifications
 Use algebraic properties to simplify expressions
-(-i)  →  i
b or true  →  true
Important to simplify code for later optimizations
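Such rewrites can be sketched as pattern matches on small expression trees. The tuple encoding `(op, *args)` is hypothetical:

```python
# Sketch: algebraic simplification on tiny expression trees.
# An expression is a variable name, a bool, or a tuple (op, *args).
def simplify(e):
    if not isinstance(e, tuple):
        return e
    # simplify sub-expressions first, then match patterns on the result
    e = (e[0],) + tuple(simplify(a) for a in e[1:])
    if e[0] == "neg" and isinstance(e[1], tuple) and e[1][0] == "neg":
        return e[1][1]                         # -(-i)  =>  i
    if e[0] == "or" and any(a is True for a in e[1:]):
        return True                            # b or true  =>  true
    return e
```

Production compilers keep such rules in large, carefully ordered tables; the bottom-up traversal shown here is what lets one simplification expose another.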

Strength Reduction
 Replace expensive operations with simpler ones
 Example: multiplications replaced by additions
y := x * 2  →  y := x + x
Peephole optimizations are often strength reductions
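The slide's single rule can be sketched as a one-statement rewrite; the tuple form `(target, op, a, b)` is an assumed encoding:

```python
# Sketch: strength reduction of y := x * 2 into y := x + x.
def reduce_strength(target, op, a, b):
    if op == "*" and b == 2:
        return (target, "+", a, a)   # multiplication by 2 becomes an addition
    if op == "*" and a == 2:
        return (target, "+", b, b)   # commuted form
    return (target, op, a, b)        # anything else is left alone
```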

Dead Code
 Remove unnecessary code
—e.g. variables assigned but never read
Before:
b := 3
c := 1 + 3
d := 3 + c
After:
c := 1 + 3
d := 3 + c
 Remove code that is never reached
if (false) {a := 5}  →  if (false) {}
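For straight-line code, "assigned but never read" can be decided by a backwards pass. A minimal sketch, with statements encoded as `(target, operands)` (an assumed representation) and `live_out` naming the variables read after the block:

```python
# Sketch: dead-assignment elimination via a backwards liveness pass.
def eliminate_dead(stmts, live_out):
    live = set(live_out)
    kept = []
    for target, operands in reversed(stmts):
        if target in live:
            live.discard(target)       # this definition satisfies the use
            live.update(o for o in operands if isinstance(o, str))
            kept.append((target, operands))
        # else: target is never read later -> drop the assignment
    kept.reverse()
    return kept
```

The side-effect caveat from the slide applies: only assignments known to be side-effect free may be dropped this way.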

Simplify Structure
 Similar to dead code: simplify the CFG structure
 Optimizations can degenerate the CFG
 It needs to be cleaned up to enable further optimization!

Delete Empty Basic Blocks

Fuse Basic Blocks

Common Subexpression Elimination (CSE)
 Common subexpression: there is another occurrence of the expression whose evaluation always precedes this one, and whose operands remain unchanged
 Local (inside one basic block): done when building the IR
 Global (on the complete flow graph)

Example CSE
Before:
b := a + 2
c := 4 * b
b < c?
b := 1
d := a + 2
After:
t1 := a + 2
b := t1
c := 4 * b
b < c?
b := 1
d := t1
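The local variant can be sketched as one forward pass over a basic block, introducing fresh temporaries exactly as the example does. The `(target, expr)` encoding and the `tN` temporary names are assumptions of the sketch:

```python
import itertools

# Sketch: local CSE over one basic block. Statements are (target, expr)
# with expr either a tuple (op, a, b) or a plain constant/variable.
def local_cse(stmts):
    temps = (f"t{i}" for i in itertools.count(1))
    avail, out = {}, []        # avail: expr -> temp already holding its value
    for target, expr in stmts:
        if isinstance(expr, tuple):
            if expr not in avail:
                t = next(temps)
                out.append((t, expr))          # compute once into a fresh temp
                avail[expr] = t
            out.append((target, avail[expr]))  # copy the temp's value
        else:
            out.append((target, expr))
        # expressions reading the (re)assigned variable are no longer available
        avail = {e: v for e, v in avail.items() if target not in e}
    return out
```

On the slide's example this reuses `t1` for the second `a + 2` even though `b` was reassigned in between, which is exactly why the temporary is needed.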

Loop Optimizations
 Optimizing code in loops is important
—loops are often executed, so there is a large payoff
 All optimizations help when applied to loop bodies
 Some optimizations are loop-specific

Loop-Invariant Code Motion
 Move expressions that are constant over all iterations out of the loop
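The transformation can be illustrated directly in Python, with the hoisting done by hand on a hypothetical example (`x * y` does not change inside the loop):

```python
# Sketch: loop-invariant code motion, shown as a before/after pair.
def before(x, y, n):
    total = 0
    for i in range(n):
        t = x * y            # invariant: recomputed every iteration
        total += t + i
    return total

def after(x, y, n):
    t = x * y                # hoisted: computed once, before the loop
    total = 0
    for i in range(n):
        total += t + i
    return total
```

A compiler must additionally prove the hoisted expression is side-effect free and that the loop executes (or guard the hoisted code) before performing this move.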

Induction Variable Optimizations
 Values of variables form an arithmetic progression
Before:
integer a(100)
do i = 1, 100
  a(i) = 202 - 2 * i
enddo
After:
integer a(100)
t1 := 202
do i = 1, 100
  t1 := t1 - 2
  a(i) = t1
enddo
The value assigned to a decreases by 2 each iteration; this uses strength reduction
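A Python rendering of the same pair makes it easy to check that the strength-reduced loop computes identical values (the `202 - 2*i` progression is taken from the slide's example):

```python
# Sketch: induction-variable strength reduction, before and after.
def original():
    # a(i) = 202 - 2*i for i = 1..100, using a multiplication per iteration
    return [202 - 2 * i for i in range(1, 101)]

def strength_reduced():
    a, t1 = [], 202
    for _ in range(1, 101):
        t1 -= 2              # the multiplication becomes a running subtraction
        a.append(t1)
    return a
```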

Partial Redundancy Elimination (PRE)
 Combines multiple optimizations:
—global common-subexpression elimination
—loop-invariant code motion
 Partial redundancy: a computation done more than once on some path through the flow graph
 PRE: insert and delete code to minimize redundancy

Code Inlining
 All optimizations up to now were local to one procedure
 Problem: procedures or functions are often very short
—Especially in good OO code!
 Solution: copy the code of small procedures into the caller
—OO: polymorphic calls. Which method is called?

Example: Inlining
power2(x) { return x*x }
a := power2(b)  →  a := b * b

Roadmap
 Introduction
 Optimizations in the Back-end
 The Optimizer
 SSA Optimizations
 Advanced Optimizations

Recall: SSA
 SSA: Static Single Assignment Form
 Definition: every variable is assigned only once

Properties
 Definitions of variables (assignments) have a list of all their uses
 Variable uses (reads) point to their single definition
 CFG of basic blocks

Examples: Optimization on SSA
 We look at three simple ones:
—Constant Propagation
—Copy Propagation
—Simple Dead Code Elimination

Recall: Constant Propagation
 Variables that have a constant value, e.g. c := 3
—Later uses of c can be replaced by the constant
—If there is no change of c in between!
Before:
b := 3
c := 1 + b
d := b + c
After:
b := 3
c := 1 + 3
d := 3 + c
Analysis is needed, as b can be assigned more than once!

Constant Propagation and SSA
 Variables are assigned once
 We know that we can replace all uses by the constant!
Before:
b1 := 3
c1 := 1 + b1
d1 := b1 + c1
After:
b1 := 3
c1 := 1 + 3
d1 := 3 + c1

Recall: Copy Propagation
 For a statement x := y
 replace later uses of x with y, if x and y have not been changed in between.
Before:
x := y
c := 1 + x
d := x + c
After:
x := y
c := 1 + y
d := y + c
Analysis is needed, as y and x can be assigned more than once!

Copy Propagation and SSA
 For a statement x1 := y1
 replace all later uses of x1 with y1
Before:
x1 := y1
c1 := 1 + x1
d1 := x1 + c1
After:
x1 := y1
c1 := 1 + y1
d1 := y1 + c1

Dead Code Elimination and SSA
 A variable is live if its list of uses is not empty.
 Dead definitions can be deleted
—(if there is no side effect)
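Because SSA definitions carry their use lists explicitly, this becomes a simple worklist algorithm; deleting one dead definition may empty the use lists of the definitions it read. A sketch, with `defs` mapping each SSA variable to the variables its definition reads, and `uses` mapping each variable to the set of definitions that read it (both encodings are assumptions of the sketch):

```python
# Sketch: dead-code elimination on an SSA-like form with explicit use lists.
def ssa_dce(defs, uses):
    """Delete side-effect-free definitions whose use set is empty,
    cascading to definitions that become dead as a result."""
    worklist = [v for v in defs if not uses.get(v)]
    while worklist:
        v = worklist.pop()
        for operand in defs.pop(v):            # delete v's definition
            uses[operand].discard(v)           # v no longer uses its operands
            if not uses[operand] and operand in defs:
                worklist.append(operand)       # operand just became dead too
    return defs
```

Uses from outside the analyzed code (returns, calls with side effects) must appear in the use sets, which is what keeps live definitions from being deleted.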

Roadmap
 Introduction
 Optimizations in the Back-end
 The Optimizer
 SSA Optimizations
 Advanced Optimizations

Advanced Optimizations
 Optimizing for multiple processors
—Auto-parallelization
—A very active area of research (again)
 Inter-procedural optimizations
—Global view, not just one procedure
 Profile-guided optimization
 Vectorization
 Dynamic optimization
—Used in virtual machines (both hardware and language VMs)

Iterative Process
 There is no general “right” order of optimizations
 One optimization can generate new opportunities for a preceding one.
 Optimization is an iterative process
Trade-off: compile time vs. code quality

What you should know!
✎ Why do we optimize programs?
✎ Is there an optimal optimizer?
✎ Where in a compiler does optimization happen?
✎ Can you explain constant propagation?

Can you answer these questions?
✎ What makes SSA suitable for optimization?
✎ When is a definition of a variable live in SSA form?
✎ Why don’t we just optimize on the AST?
✎ Why do we need to optimize the IR on different levels?
✎ In which order do we run the different optimizations?

License
Attribution-ShareAlike 3.0 Unported
You are free:
 to Share — to copy, distribute and transmit the work
 to Remix — to adapt the work
Under the following conditions:
 Attribution. You must attribute the work in the manner specified by the author or licensor (but not in any way that suggests that they endorse you or your use of the work).
 Share Alike. If you alter, transform, or build upon this work, you may distribute the resulting work only under the same, similar or a compatible license.
For any reuse or distribution, you must make clear to others the license terms of this work. The best way to do this is with a link to this web page. Any of the above conditions can be waived if you get permission from the copyright holder. Nothing in this license impairs or restricts the author's moral rights.