Semantics: Static Semantics, Dynamic Semantics, Attribute Grammars

Semantics
“Semantics” has to do with the meaning of a program. We will consider two types of semantics:
– Static semantics: semantics which can be enforced at compile-time.
– Dynamic semantics: semantics which express the run-time meaning of programs.

Static semantics
Semantic checking which can be done at compile-time.
Type-compatibility:
– int can be assigned to double (type coercion)
– double cannot be assigned to int without an explicit type cast
Can be captured in the grammar, but only at the expense of a larger, more complex grammar.
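A small Java fragment illustrating both rules (the class and variable names are just for illustration):

class CoercionDemo {
    public static void main(String[] args) {
        int i = 5;
        double d = i;          // legal: int is coerced (widened) to double
        // int bad = d;        // compile-time error: possible lossy conversion
        int j = (int) d;       // legal only with an explicit cast
        System.out.println(d + " " + j);   // prints "5.0 5"
    }
}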

Type rules in grammar
Must introduce new non-terminals which encode types. Instead of a generic grammar rule for assignment:
– <assign> → <var> '=' <expr> ';'
we need multiple, type-specific rules:
– <double_assign> → <double_var> '=' <double_expr> | <int_expr> ';'
– <int_assign> → <int_var> '=' <int_expr> ';'
Of course, such rules need to handle all the relevant type possibilities (e.g. byte, char, short, int, long, float and double).

Attribute grammars Attribute grammars provide a neater way of encoding such information.

Attributes
We can associate with each symbol X of the grammar a set of attributes A(X). Attributes are partitioned into:
– synthesized attributes S(X): pass info up the tree
– inherited attributes I(X): pass info down the tree

Semantic functions
We can associate with each rule R of the grammar a set of semantic functions. For rule X0 → X1 X2 … Xn:
S(X0) = f(A(X1), A(X2), …, A(Xn))
I(Xj) = f(A(X0), …, A(Xj-1)), for 1 <= j <= n

Predicates
A Boolean expression involving the attributes and a set of attribute values.
– If true, the node is OK.
– If false, the node violates a semantic rule.

Example
Syntax rule: <assign> → <var> = <expr>
Semantic rule: <expr>.expType ← <var>.actType
Syntax rule: <expr> → <var>[2] + <var>[3]
Semantic rule: <expr>.actType ← if (<var>[2].actType = int) and (<var>[3].actType = int) then int else real
Semantic predicate: <expr>.actType == <expr>.expType
Syntax rule: <expr> → <var>
Semantic rule: <expr>.actType ← <var>.actType
Semantic predicate: <expr>.actType == <expr>.expType
Syntax rule: <var> → A | B | C
Semantic rule: <var>.actType ← lookUp(<var>.string)

[Parse tree for A = A + B, with <var>[2] = A and <var>[3] = B]
Suppose: A is int, B is int.

[Decorated parse tree for A = A + B]
Suppose: A is int, B is int. The actual types of A, B, and A + B are all int, and the expected type is int, so the semantic predicate holds.

[Decorated parse tree for A = A + B]
Suppose: A is real, B is int. The actual type of A is real and the actual type of B is int; type coercion during '+' (int → real) makes the actual type of A + B real, which matches the expected type real, so the semantic predicate holds.

[Decorated parse tree for A = A + B]
Suppose: A is int, B is real. The actual type of A + B is real, but the expected type is int. Houston, we have a problem! The semantic predicate is false.
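The same check, sketched in Java under the assumption of a symbol table mapping variable names to declared types (the names AttrDemo, Type, lookUp and checkAssign are made up for illustration):

import java.util.Map;

class AttrDemo {
    enum Type { INT, REAL }

    // <var>.actType <- lookUp(<var>.string)
    static Type lookUp(Map<String, Type> symtab, String name) { return symtab.get(name); }

    // Predicate for an assignment of the form  lhs = v2 + v3
    static boolean checkAssign(Map<String, Type> symtab, String lhs, String v2, String v3) {
        Type expType = lookUp(symtab, lhs);                         // <expr>.expType <- <var>.actType
        Type actType = (lookUp(symtab, v2) == Type.INT
                     && lookUp(symtab, v3) == Type.INT)
                     ? Type.INT : Type.REAL;                        // <expr>.actType
        return actType == expType;                                  // semantic predicate
    }

    public static void main(String[] args) {
        System.out.println(checkAssign(Map.of("A", Type.INT,  "B", Type.INT),  "A", "A", "B"));  // true
        System.out.println(checkAssign(Map.of("A", Type.REAL, "B", Type.INT),  "A", "A", "B"));  // true
        System.out.println(checkAssign(Map.of("A", Type.INT,  "B", Type.REAL), "A", "A", "B"));  // false
    }
}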

Dynamic semantics
Dynamic semantics precisely specify the meanings of programs. Why is this important?
– programmers need to understand the meanings of programs they read
– programmers need to understand how to express themselves in a given language
– compiler writers need to understand how to translate a high-level program into a semantically equivalent low-level one
Formal semantics are not always used – no single formalism exists. We will look at:
– operational semantics
– axiomatic semantics
There is also denotational semantics. Look in the text if you're interested.

Operational semantics
Basic idea: we describe the meanings of program elements by describing how a machine executes them.
Basic approach: express the semantics of a high-level construct in terms of its translation into a low-level construct.
– Example: Java described in terms of the JVM
– Example: Pascal described in terms of p-code
– Example: C described in terms of PDP-11 assembly

Example
High-level program:
    for (e1; e2; e3) {
        ...
    }
Operational semantics:
    e1;
    L1: if not(e2) goto L2
        ...
        e3;
        goto L1
    L2:
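Java has no goto, but the same idea can be shown by desugaring a for loop into lower-level constructs; a minimal sketch (the loop bound is made up):

class ForDesugar {
    public static void main(String[] args) {
        // The high-level construct:
        for (int i = 0; i < 3; i++) {
            System.out.println("for:   " + i);
        }
        // One lower-level account of its meaning:
        int i = 0;                          // e1
        while (i < 3) {                     // keep going while e2 holds (if not(e2) goto L2)
            System.out.println("while: " + i);
            i++;                            // e3; goto L1
        }                                   // L2:
    }
}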

Axiomatic semantics
Basic idea: you make assertions about statements in a program.
– A precondition is an assertion about what is true prior to execution of a statement.
– A postcondition is an assertion about what is true after execution of a statement.
For example: sum = 2*sum + 1 {sum > 1}

Weakest precondition
“The weakest precondition is the least restrictive precondition that will guarantee the validity of the associated postcondition.” [p. 151]
In the example above, {sum > 10} is a valid precondition, but {sum > 0} is the weakest precondition:
{sum > 0} sum = 2*sum + 1 {sum > 1}
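One informal way to see what such a triple claims is to check it at run time with assertions; a minimal sketch (the class and method names are made up; enable assertions with java -ea):

class TripleDemo {
    // {sum > 0} sum = 2*sum + 1 {sum > 1}
    static int step(int sum) {
        assert sum > 0 : "precondition violated";    // weakest precondition
        sum = 2 * sum + 1;                           // the statement
        assert sum > 1 : "postcondition violated";   // postcondition
        return sum;
    }
    public static void main(String[] args) {
        System.out.println(step(11));   // {sum > 10} also works, since it implies {sum > 0}
        System.out.println(step(1));    // 1 > 0, so this is fine too
        // step(0) would trip the precondition assertion
    }
}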

Correctness proofs “If the weakest precondition can be computed from the given postconditions for each statement of a language, then correctness proofs can be constructed for programs in that language.” [p. 151] To do this, start with output specifications for the program as the postcondition for the program as a whole.

Next, work backwards, computing weakest preconditions one statement at a time, to derive the weakest precondition for the program as a whole. Given the program stmt1; stmt2; … stmtN with overall postcondition {cN}:
{cN-1} stmtN {cN}
…
{c1} stmt2 {c2}
{c0} stmt1 {c1}

If the input specification for the program satisfies the program’s weakest precondition, then the program will (provably) produce the correct result.

Assignment statements
The axiomatic semantics of an assignment statement x = E is written as:
P = Q(x → E)
This means that the precondition P is the postcondition Q with all instances of x replaced by E.
Example: a = b/2 - 1 {a < 10}
To compute the precondition, replace all instances of a in the postcondition a < 10 by b/2 - 1: b/2 - 1 < 10, or b < 22.
Semantics: {b < 22} a = b/2 - 1 {a < 10}
In general: {Q(x → E)} x = E {Q}
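A quick sanity check of this particular triple, enumerating integer values of b (a sketch, not a proof; the class name is made up, and assertions need java -ea):

class AssignWp {
    public static void main(String[] args) {
        for (int b = -100; b < 22; b++) {      // every tested b satisfies the precondition {b < 22}
            int a = b / 2 - 1;
            assert a < 10 : "postcondition failed for b = " + b;
        }
        int a = 22 / 2 - 1;                    // b = 22 falls outside the precondition...
        System.out.println(a);                 // ...and indeed yields 10, which is not < 10
    }
}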

Inference
Suppose we wanted to prove the following:
{sum > 5} sum = 2*sum + 1 {sum > 3}
Starting with the postcondition, we derive as the weakest precondition something different:
2*sum + 1 > 3
2*sum > 2
sum > 1 (weakest precondition)
Clearly this is OK, because the actual precondition {sum > 5} implies the weakest precondition {sum > 1}:
sum > 5 => sum > 1
We use an inference rule, the rule of consequence, to prove this.

Rule of consequence
General form:
S1, S2, …, Sn
-------------
      S
Precondition strengthening:
P => P',  {P'} C {Q}
--------------------
     {P} C {Q}
Postcondition weakening:
Q' => Q,  {P} C {Q'}
--------------------
     {P} C {Q}

Precondition strengthening applied
Recall our example. We want to prove the following:
{sum > 5} sum = 2*sum + 1 {sum > 3}
The weakest precondition we derive is: sum > 1
Apply precondition strengthening to finish the proof:
P => P',  {P'} C {Q}
--------------------
     {P} C {Q}
sum > 5 => sum > 1,  {sum > 1} sum = 2*sum + 1 {sum > 3}
--------------------------------------------------------
          {sum > 5} sum = 2*sum + 1 {sum > 3}

Hang on! Why get so formal? Because this way correctness proofs can be (partially) automated.

Sequences: S1; S2
Start with a sequence of statements:
y = 3*x + 1; x = y + 3; {x < 10}
Compute the weakest precondition for the 2nd statement, {y + 3 < 10} or {y < 7}, and use it as the postcondition for the 1st statement:
y = 3*x + 1; {y < 7}
{y < 7} x = y + 3; {x < 10}
Compute the weakest precondition for the 1st statement, {3*x + 1 < 7} or {x < 2}:
{x < 2} y = 3*x + 1; {y < 7}
{y < 7} x = y + 3; {x < 10}
Conclude, applying the sequence rule:
{P} S1 {Q},  {Q} S2 {R}
-----------------------
     {P} S1; S2 {R}
{x < 2} y = 3*x + 1; x = y + 3; {x < 10}
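The derived triple, with its intermediate assertion, checked at run time (a sketch; class and method names are made up, assertions need java -ea):

class SeqDemo {
    static int run(int x) {
        assert x < 2;            // weakest precondition of the whole sequence
        int y = 3 * x + 1;
        assert y < 7;            // intermediate assertion {y < 7}
        x = y + 3;
        assert x < 10;           // postcondition
        return x;
    }
    public static void main(String[] args) {
        System.out.println(run(1));    // 7
        System.out.println(run(-4));   // -8
    }
}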

Selection: if B then S1 else S2
Start with a conditional statement:
if (x > 0) then y = y-1 else y = y+1 {y > 0}
Deal with the arms of the conditional one at a time. First the then-arm:
y = y-1 {y > 0} gives {y > 1} y = y-1; {y > 0}
Now the else-arm:
y = y+1 {y > 0} gives {y > -1} y = y+1; {y > 0}
Use the rule of precondition strengthening on the else-arm result (to make both arms uniform):
{y > 1} y = y+1; {y > 0}
Now strengthen both arms' preconditions by imposing a constraint on x:
{x > 0 & y > 1} y = y-1; {y > 0}
{!(x > 0) & y > 1} y = y+1; {y > 0}
Conclude, applying the selection rule:
{B & P} S1 {Q},  {!B & P} S2 {Q}
--------------------------------
  {P} if B then S1 else S2 {Q}
{y > 1} if (x > 0) then y = y-1 else y = y+1 {y > 0}
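The concluded triple as a run-time check (a sketch; names are made up, assertions need java -ea):

class IfDemo {
    static int run(int x, int y) {
        assert y > 1;                                    // precondition
        if (x > 0) { y = y - 1; } else { y = y + 1; }
        assert y > 0;                                    // postcondition, whichever arm ran
        return y;
    }
    public static void main(String[] args) {
        System.out.println(run(5, 2));    // then-arm: prints 1
        System.out.println(run(-5, 2));   // else-arm: prints 3
    }
}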

While: while B do S end
Let's prove the following:
{true} r = x; q = 0; while y <= r do r = r-y; q = q+1; end {y > r & x = r + y*q}
Start by proving the loop body:
{y <= r & x = r + y*q} r = r-y; q = q+1; {x = r + y*q}
Start with the last statement, q = q+1 {x = r + y*q}:
{x = r + y*(q+1)} q = q+1 {x = r + y*q}
{x = r + y + y*q} q = q+1 {x = r + y*q}
Continue with the second-to-last statement, r = r-y {x = r + y + y*q}:
{x = r - y + y + y*q} r = r-y {x = r + y + y*q}
{x = r + y*q} r = r-y {x = r + y + y*q}
Use the rule for sequence to get:
{x = r + y*q} r = r-y; q = q+1; {x = r + y*q}
Now strengthen the precondition to conclude the proof of the loop body:
{y <= r & x = r + y*q} r = r-y; q = q+1; {x = r + y*q}
This lets us derive a weakest precondition for the while loop:
{x = r + y*q} while y <= r do r = r-y; q = q+1; end {x = r + y*q & !(y <= r)}
While rule:
{B & I} S {I}
-------------------------------
{I} while B do S end {I and !B}

While: while B do S end
The next step is to prove the sequence:
{true} r = x; q = 0; while y <= r do r = r-y; q = q+1; end {x = r + y*q & !(y <= r)}
Start by moving backwards from the while loop (since we already derived a weakest precondition from its postcondition):
{true} r = x; q = 0; {x = r + y*q}
Start with the last statement, q = 0 {x = r + y*q}:
{x = r + y*0} q = 0 {x = r + y*q}
{x = r} q = 0 {x = r + y*q}
Continue with the second-to-last statement, r = x {x = r}:
{x = x} r = x {x = r}
Precondition strengthening:
{true} r = x {x = r}
Sequence rule (applied in general form):
{true} r = x; q = 0; while y <= r do r = r-y; q = q+1; end {x = r + y*q & !(y <= r)}
Finally, postcondition weakening, because !(y <= r) is the same as y > r:
{true} r = x; q = 0; while y <= r do r = r-y; q = q+1; end {x = r + y*q & y > r}
We're done!
While rule:
{B & I} S {I}
-------------------------------
{I} while B do S end {I and !B}
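The loop we just verified, written as Java with the invariant and the final assertion checked at run time (a sketch; it assumes y > 0 so the loop terminates, and needs java -ea):

class WhileDemo {
    // Computes quotient q and remainder r of x / y by repeated subtraction.
    static void divide(int x, int y) {
        int r = x, q = 0;
        assert x == r + y * q;                     // the invariant I holds on entry
        while (y <= r) {
            r = r - y;
            q = q + 1;
            assert x == r + y * q;                 // I is preserved by the body
        }
        assert x == r + y * q && y > r;            // postcondition: I and !B
        System.out.println(x + " = " + r + " + " + y + "*" + q);
    }
    public static void main(String[] args) {
        divide(17, 5);    // 17 = 2 + 5*3
        divide(4, 9);     // 4 = 4 + 9*0
    }
}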

Loop invariant Recall the While rule: {B & I} S {I} _ {I} while B do S end {I and !B} I is called the loop invariant: –"…if executing [the body] once preserves the truth of [the invariant], then executing [the body] any number of times also preserves the truth of [the invariant]." [Gordon, Programming Language Theory and its Implementation, paraphrased from page 24]

Importance of loop invariants
Developing loop invariants is a powerful way to design and understand algorithms. Consider selection sort:
static void selectionSort(int[] array) {
    int min, temp, bar = 0;
    while (bar < array.length - 1) {
        min = indexOfSmallest(array, bar);   // find min of the unsorted region
        temp = array[bar];                   // swap array[bar] and array[min]
        array[bar] = array[min];
        array[min] = temp;
        bar = bar + 1;
        // Loop invariant: the region before bar is sorted:
        // for all i, j <= bar, if i < j then array[i] <= array[j]
    }
}
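The slide assumes a helper indexOfSmallest; one possible implementation (hypothetical, not from the slides) returns the index of the smallest element at or after position from:

static int indexOfSmallest(int[] array, int from) {
    int min = from;
    for (int i = from + 1; i < array.length; i++) {
        if (array[i] < array[min]) {
            min = i;                 // remember the index of the smallest element seen so far
        }
    }
    return min;
}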

Example
[Diagram: an array drawn as a row of boxes, divided at bar into a region known to be sorted (before bar) and a region not known to be sorted (from bar onward).]