Intermediate Code Representations


Conceptual phases of a compiler
- Lexical analysis (scanner) -> sequence of tokens
- Syntax analysis (parser)
- Semantic analysis -> intermediate code (IR1)
- Code optimization -> optimized intermediate code (IR2)
- Code generation -> target code
The front end (scanner, parser, semantic analysis) is machine independent but language dependent; the back end (code generation) is machine dependent but language independent; the middle (code optimization) sits between them and works on the IR.

Why use an IR?
1. It separates the machine-independent and machine-dependent parts of the compiler, making both the front end and the back end retargetable.
2. Machine-independent optimizations are easier to perform on an IR than at the machine-code level. Example: common sub-expression elimination.
3. It simplifies code generation.

IR – Encodes the Compiler's Program Knowledge
Some IR properties to weigh:
- Ease of generation
- Ease of manipulation
- Size
- Freedom of expression
- Level of abstraction
Selecting the IR is critical.

3 Categories of IRs
1. Structural/graphical: AST and concrete syntax tree, call graph, program dependence graph (PDG)
2. Linear: three-address code, abstract stack machine code
3. Hybrid: control flow graph (CFG)
Each category has its own advantages, disadvantages, and typical uses.

Level of Abstraction
Consider the array reference A[i,j], whose address is computed as @A + (j-1)*10 + (i-1).
High-level AST: a single subscript node [ ] with children A, i, j.
Low-level three-address/register code:
Loadi 1, R1
Sub Rj, R1, R2
Loadi 10, R3
Mult R2, R3, R4
Sub Ri, R1, R5
Add R4, R5, R6
Loadi @A, R7
Add R7, R6, R8
Load R8, Raij
What construct is being represented? Array subscripting of A[i,j]. The high-level AST is good for memory disambiguation and easy to generate, but some optimizations are harder to apply to it; the low-level three-address code exposes the address arithmetic, so a different set of optimizations becomes possible.
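To make the address arithmetic concrete, here is a minimal Python sketch of the computation the low-level code performs; the function name, the 1-based indexing, and the stride of 10 are assumptions carried over from the register code above, not part of any particular machine.

def element_address(base_A, i, j, stride=10):
    # Mirrors the register code: offset = (j - 1) * 10 + (i - 1), added to @A.
    return base_A + (j - 1) * stride + (i - 1)

# Example: with @A = 1000, the address of A[2,3] is 1000 + 2*10 + 1 = 1021.
print(element_address(1000, 2, 3))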

Some Design Issues for IRs
Questions to ponder:
1. What is the minimum needed in the IR's set of operators?
2. What is the advantage of a small set of operators?
3. What is the concern with designing operations close to actual machine operations?
4. What is the potential problem of having a small set of IR operations?
Briefly: the operators must be able to express everything in the source languages; a small set of operators is easier to implement; operations too close to a particular machine lose portability; and a small set can lead to long instruction sequences, which means more work during the optimization phase.

High Level Graphical Representations
Consider the grammar: A -> V := E, E -> E + E | E * E | - E | id
String: a := b * - c + b * - c
Exercise: draw the concrete syntax tree, the AST, and the DAG.
The AST is more compact than the concrete syntax tree and is easier to generate code from. The DAG has a unique node for each value, so it is more compact still and shows redundant expressions (the two copies of b * - c) explicitly; it is easy to generate during parsing and encodes the redundancy.
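As an illustration (not from the slides), a small Python sketch of building the DAG by hash-consing: identical (operator, children) combinations map to a single shared node, so the second b * - c reuses the first.

class DagBuilder:
    def __init__(self):
        self.nodes = {}                          # (op, children) -> node id

    def node(self, op, *children):
        key = (op, children)
        if key not in self.nodes:
            self.nodes[key] = len(self.nodes)    # allocate a fresh node id
        return self.nodes[key]

dag = DagBuilder()
b = dag.node("id", "b")
c = dag.node("id", "c")
t1 = dag.node("*", b, dag.node("neg", c))
t2 = dag.node("*", b, dag.node("neg", c))        # hash-consing returns the same node
root = dag.node("+", t1, t2)
print(t1 == t2, len(dag.nodes))                  # True 5  (the AST would need 9 nodes)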

Linear IRs: Three-Address Code
A sequence of instructions of the form x := y op z, where x, y, and z are variable names, constants, or compiler-generated variables ("temporaries"). Only one operator is permitted on the right-hand side, so larger expressions are computed using temporaries.
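One common way to store such instructions (a sketch of a typical representation, not something mandated by the slides) is as quadruples:

from dataclasses import dataclass
from typing import Optional

@dataclass
class Quad:
    op: str                  # operator, e.g. "+", "*", "uminus", "copy"
    arg1: Optional[str]      # first operand: name, constant, or temporary
    arg2: Optional[str]      # second operand, or None for unary/copy instructions
    result: str              # destination: name or temporary

code = [Quad("uminus", "c", None, "t1"), Quad("*", "b", "t1", "t2")]
print(code[1])               # Quad(op='*', arg1='b', arg2='t1', result='t2')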

Simple Linear IRs
Write the three-address code for: a := b * - c + b * - c
Start: T1 = -c, T2 = b * T1, ... complete the code from the AST, then from the DAG.
Compiler-generated temporary variables (temps) are needed to hold the intermediate values at the internal nodes of the AST.
Code from the AST:
T1 = -c
T2 = b * T1
T3 = -c
T4 = b * T3
T5 = T2 + T4
a = T5
Versus from the DAG, where the shared subexpression is computed only once:
T1 = -c
T2 = b * T1
T3 = T2 + T2
a = T3
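A hedged Python sketch of the usual bottom-up walk that produces the AST version above; the tuple encoding of the AST and the helper names are assumptions made for illustration.

counter = 0
def newtemp():
    global counter
    counter += 1
    return f"T{counter}"

def gen(node, out):
    """Emit code for an AST node (a leaf name or an (op, child...) tuple); return its place."""
    if isinstance(node, str):                     # leaf: variable or constant
        return node
    op, *kids = node
    places = [gen(k, out) for k in kids]
    t = newtemp()
    if len(places) == 1:                          # unary operator
        out.append(f"{t} = {op}{places[0]}")
    else:                                         # binary operator
        out.append(f"{t} = {places[0]} {op} {places[1]}")
    return t

# a := b * - c + b * - c
ast = ("+", ("*", "b", ("-", "c")), ("*", "b", ("-", "c")))
out = []
out.append(f"a = {gen(ast, out)}")
print("\n".join(out))                             # prints T1..T5 followed by a = T5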

Exercise Give the 3 address code for: Z := x * y + a[j] / sum(b)

More Simple Linear IRs
Stack machine code: push, pop, and operators that work on the top of the stack.
Consider: x - 2 * y
Push x
Push 2
Push y
Mult
Sub
Advantages? The code is compact and the temporary names are implicit, so temporaries take up no extra space; it is simple to generate and execute, and it is useful when code is transmitted over slow communication links (e.g., the internet).
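A tiny illustrative interpreter for this stack code in Python (the instruction tuples and the variable environment are assumptions made for the sketch):

def run(program, env):
    stack = []
    for instr in program:
        op = instr[0]
        if op == "push":
            v = instr[1]
            stack.append(env[v] if isinstance(v, str) else v)   # variable or literal
        elif op == "mult":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "sub":
            b, a = stack.pop(), stack.pop()
            stack.append(a - b)
    return stack.pop()

# x - 2 * y with x = 10, y = 3 evaluates to 10 - 6 = 4
program = [("push", "x"), ("push", 2), ("push", "y"), ("mult",), ("sub",)]
print(run(program, {"x": 10, "y": 3}))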

Hybrid IRs

Exercise – Construct the CFG
Where are the leaders? What are the basic blocks? What are the edges?
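As a reminder of the standard rule used in this exercise, here is a hedged Python sketch of finding leaders and splitting three-address code into basic blocks; the instruction encoding (dicts with an op and a branch target index) is an assumption.

def find_leaders(instrs):
    leaders = {0} if instrs else set()                 # the first instruction is a leader
    for i, ins in enumerate(instrs):
        if ins["op"] in ("goto", "if_goto"):
            leaders.add(ins["target"])                 # every branch target is a leader
            if i + 1 < len(instrs):
                leaders.add(i + 1)                     # the instruction after a branch is a leader
    return sorted(leaders)

def basic_blocks(instrs):
    bounds = find_leaders(instrs) + [len(instrs)]
    return [instrs[a:b] for a, b in zip(bounds, bounds[1:])]   # one block per leader

CFG edges then connect each block to the blocks its last instruction can branch or fall through to.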

Call Graph Representation
Node = function or method
Edge from A to B: A has a call site where B is potentially called.
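A minimal Python sketch of this definition (the input format, a mapping from each function to the functions its call sites may invoke, is an assumption):

def call_graph(call_sites):
    """Return the edge set {(A, B)} where A has a call site that may call B."""
    return {(caller, callee)
            for caller, callees in call_sites.items()
            for callee in callees}

print(call_graph({"main": ["parse", "report"], "parse": ["scan"]}))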

Exercise: Construct a call graph

Multiple IRs: WHIRL – a single program can be kept in several IRs at different levels of abstraction; WHIRL (the IR of the SGI/Open64 compilers) is an example that is lowered step by step from a high-level form toward machine level.

Key Highlights of IRs