CSE P 501 – Compilers
Intermediate Representations
Hal Perkins
Winter 2008
© 2002-08 Hal Perkins & UW CSE

Agenda
- Parser semantic actions
- Intermediate representations
- Abstract syntax trees (ASTs)
- Linear representations & more

Compiler Structure (review)
Source (characters) → Scanner → tokens → Parser → IR → Middle (optimization) → IR (maybe different) → Code Gen → assembly or binary code → Target

What's a Parser to Do?
Idea: at significant points in the parse, perform a semantic action
- Typically when a production is reduced (LR) or at a convenient point in the parse (LL)
Typical semantic actions:
- Build (and return) a representation of the parsed chunk of the input (compiler)
- Perform some sort of computation and return the result (interpreter)
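To make the first kind of action concrete, here is a minimal sketch of a recursive-descent (LL-style) parsing routine that builds an IR node as it recognizes an addition. Everything in it is hypothetical: Exp and BinOp stand for AST node classes of the sort sketched on the AST slides below, and Token, lookahead, expect, and parseTerm are assumed helpers, not names from the course.

    // Sketch only: the semantic action is the construction of an AST node
    // for the chunk of input just parsed (expr ::= term { + term }).
    Exp parseExpr() {
        Exp left = parseTerm();                  // parse the first operand
        while (lookahead == Token.PLUS) {        // more '+' operators to consume?
            expect(Token.PLUS);                  // match the '+'
            Exp right = parseTerm();             // parse the next operand
            left = new BinOp('+', left, right);  // action: build (and return) the representation
        }
        return left;                             // hand the IR chunk back to the caller
    }

An interpreter would perform a computation at the same point instead: evaluate left and right and return their sum rather than building a tree node.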

Intermediate Representations
- In most compilers, the parser builds an intermediate representation of the program
- Rest of the compiler transforms the IR to "improve" (optimize) it and eventually translates it to final code
- Often will transform the initial IR to one or more different IRs along the way

IR Design
Decisions affect the speed and efficiency of the rest of the compiler
Desirable properties:
- Easy to generate
- Easy to manipulate
- Expressive
- Appropriate level of abstraction
Different tradeoffs depending on compiler goals
Different tradeoffs in different parts of the same compiler

Types of IRs
Three major categories:
- Structural
- Linear
- Hybrid
Some basic examples now; more when we get to later phases of the compiler

Levels of Abstraction
Key design decision: how much detail to expose
- Affects the possibility and profitability of various optimizations
- Structural IRs are typically fairly high-level
- Linear IRs are typically low-level
- But these generalizations don't always hold

Example: Array Reference A[i,j]
Low-level (linear) code:
    loadI 1     => r1
    sub   rj,r1 => r2
    loadI 10    => r3
    mult  r2,r3 => r4
    sub   ri,r1 => r5
    add   r4,r5 => r6
    loadI @A    => r7
    add   r7,r6 => r8
    load  r8    => r9
High-level (structural) form: a single tree node, subscript(A, i, j)
The low-level code spells out the address arithmetic (the offset (j-1)*10 + (i-1) added to A's base address in r7), while the high-level tree records only that A is subscripted by i and j.

Structural IRs
- Typically reflect source (or other higher-level) language structure
- Tend to be large
- Examples: syntax trees, DAGs
- Particularly useful for source-to-source transformations

Concrete Syntax Trees
The full grammar is needed to guide the parser, but contains many extraneous details:
- Chain productions
- Rules that control precedence and associativity
Typically the full syntax tree does not need to be used explicitly

Syntax Tree Example
Concrete syntax for x=2*(n+m);
    expr   ::= expr + term | expr - term | term
    term   ::= term * factor | term / factor | factor
    factor ::= int | id | ( expr )
Note how much of the resulting parse tree is chain productions: for instance, the constant 2 appears only at the bottom of the chain term → factor → int, exactly the kind of extraneous detail an AST omits.

Abstract Syntax Trees
- Want only essential structural information
- Omit extraneous junk
- Can be represented explicitly as a tree or in a linear form
- Example: LISP/Scheme S-expressions are essentially ASTs
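As one illustration, explicit AST nodes might look like the following Java sketch; the class and field names are made up for this example, not taken from the slides or from any particular compiler.

    // Hypothetical AST node classes: each node records only essential structure.
    abstract class Exp { }

    class IntLit extends Exp {                 // integer literal, e.g. 2
        final int value;
        IntLit(int value) { this.value = value; }
    }

    class Id extends Exp {                     // identifier, e.g. x, n, m
        final String name;
        Id(String name) { this.name = name; }
    }

    class BinOp extends Exp {                  // binary operation: +, -, *, /
        final char op;
        final Exp left, right;
        BinOp(char op, Exp left, Exp right) {
            this.op = op; this.left = left; this.right = right;
        }
    }

    class Assign {                             // assignment statement: target = value;
        final Id target;
        final Exp value;
        Assign(Id target, Exp value) { this.target = target; this.value = value; }
    }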

AST Example
AST for x=2*(n+m);
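One way to write this tree down: in S-expression form (the linear AST notation mentioned on the previous slide) it reads

    (= x (* 2 (+ n m)))

and, using the hypothetical node classes sketched above, it could be built as

    Assign ast =
        new Assign(new Id("x"),
                   new BinOp('*', new IntLit(2),
                                  new BinOp('+', new Id("n"), new Id("m"))));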

Linear IRs
- Pseudo-code for an abstract machine
- Level of abstraction varies
- Simple, compact data structures
- Examples: stack machine code, three-address code

Stack Machine Code
- Originally used for stack-based computers (famous example: B5000)
- Now used for Java (.class files), C# (MSIL)
Advantages:
- Very compact; mostly 0-address opcodes
- Easy to generate
- Simple to translate to naïve machine code
- And a good starting point for generating optimized code

Stack Code Example
Hypothetical code for x=2*(n+m);
    pushaddr  x
    pushconst 2
    pushval   n
    pushval   m
    add
    mult
    store
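For comparison with real stack code: if x, n, and m were local int variables, a Java compiler would emit bytecode with the same shape. The local-variable slot numbers below are assumed for illustration, and the JVM stores through a variable index rather than a pushed address, so there is no counterpart to pushaddr.

    iconst_2        // push the constant 2
    iload_1         // push n   (assuming n lives in local slot 1)
    iload_2         // push m   (assuming m lives in local slot 2)
    iadd            // n + m
    imul            // 2 * (n + m)
    istore_3        // pop result into x   (assuming x lives in local slot 3)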

Three-Address Code
Many different representations
General form: x ← y (op) z
- One operator
- Maximum of three names
Example: x=2*(n+m); becomes
    t1 ← n + m
    t2 ← 2 * t1
    x  ← t2

Three-Address Code (cont.)
Advantages:
- Resembles code for actual machines
- Explicitly names intermediate results
- Compact
- Often easy to rearrange
Various representations:
- Quadruples, triples, SSA
Much more later…
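As a sketch of the first of these, a quadruple representation stores each instruction as an explicit (operator, operand, operand, result) tuple; the Java names below are illustrative only, not the course's data structures.

    // Hypothetical quadruple form of the example x=2*(n+m); from the previous slide.
    record Quad(String op, String arg1, String arg2, String result) { }

    Quad[] quads = {
        new Quad("+", "n",  "m",  "t1"),   // t1 ← n + m
        new Quad("*", "2",  "t1", "t2"),   // t2 ← 2 * t1
        new Quad("=", "t2", null, "x")     // x  ← t2 (simple copy)
    };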

Hybrid IRs
- Combination of structural and linear
- Level of abstraction varies
- Example: control-flow graph
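A control-flow graph is structural between basic blocks (a graph of nodes and edges) and linear within each block (a straight-line sequence of instructions). A minimal sketch of such a data structure, reusing the hypothetical Quad record from the previous sketch:

    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical CFG skeleton: graph structure at the top, linear code inside.
    class BasicBlock {
        final List<Quad> code = new ArrayList<>();              // straight-line instructions
        final List<BasicBlock> successors = new ArrayList<>();  // outgoing control-flow edges
    }

    class ControlFlowGraph {
        BasicBlock entry;                                       // unique entry block
        final List<BasicBlock> blocks = new ArrayList<>();      // all blocks in the procedure
    }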

What to Use?
Common choice: all(!)
- AST or other structural representation built by the parser and used in early stages of the compiler
  - Closer to source code
  - Good for semantic analysis
  - Facilitates some higher-level optimizations
- Flatten to a linear IR for later stages of the compiler
  - Closer to machine code
  - Exposes machine-related optimizations
- Hybrid forms in optimization phases

Coming Attractions
- Representing ASTs
- Working with ASTs
  - Where do the algorithms go?
  - Is it really object-oriented? (Does it matter?)
  - Visitor pattern
- Then: semantic analysis, type checking, and symbol tables
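As a preview of the visitor pattern in this setting: the usual shape is an interface with one visit method per AST node class, plus an accept method on each node that calls back into the visitor. The sketch below uses the hypothetical node classes from earlier and is generic, not the course's actual interface.

    // Hypothetical visitor over the AST classes sketched above.
    interface Visitor<R> {
        R visit(IntLit n);
        R visit(Id n);
        R visit(BinOp n);
        R visit(Assign n);
    }

    // Each concrete node class would add a method like
    //     <R> R accept(Visitor<R> v) { return v.visit(this); }
    // so that algorithms (type checking, code generation, ...) live in visitor
    // classes instead of being scattered through the node classes.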