CSCE 2100: Computing Foundations 1 Running Time of Programs

CSCE 2100: Computing Foundations 1 Running Time of Programs Tamara Schneider Summer 2013

What is Efficiency?
Time it takes to run a program?
Other resources:
Storage space taken by variables
Traffic generated on the computer network
Amount of data moved to and from disk

Summarizing Running Time
Benchmarking: use benchmarks, a small collection of typical inputs.
Analysis: group inputs based on their size.
Running time is also influenced by various external factors, such as the computer and the compiler.

Running Time
Worst-case running time: maximum running time over all inputs of size n
Average running time: average running time over all inputs of size n
Best-case running time: minimum running time over all inputs of size n

Worst, Best, and Average Case

Running Time of a Program 𝑇(𝑛) is the running time of a program as a function of the input size 𝑛. 𝑇(𝑛) = 𝑐𝑛 indicates that the running time is linearly proportional to the size of the input, that is, linear time.

Running Time of Simple Statements
We assume that "primitive operations" take a single instruction:
Arithmetic operations (+, %, *, -, ...)
Logical operations (&&, ||, ...)
Accessing operations (A[i], x->y, ...)
Simple assignment
Calls to library functions (scanf, printf, ...)

Code Segment 1

sum = 0;                     1
for(i=0; i<n; i++)           1 + (n+1) + n = 2n+2
    sum++;                   1, executed n times

Total: 1 + (2n+2) + n·1 = 3n + 3    Complexity?

Code Segment 2

sum = 0;                     1
for(i=0; i<n; i++)           2n+2
    for(j=0; j<n; j++)       2n+2, executed n times
        sum++;               1, executed n·n times

Total: 1 + (2n+2) + (2n+2)·n + n·n·1    Complexity?

Code Segment 3

sum = 0;                     1
for(i=0; i<n; i++)           2n+2
    for(j=0; j<n*n; j++)     ?
        sum++;               1

Complexity?

Code Segment 4

sum = 0;                     1
for(i=0; i<=n; i++)          2n+4
    for(j=0; j<i; j++)       ?
        sum++;               1

Complexity?

Inner-loop iterations:
i=0: (none)
i=1: j=0
i=2: j=0, 1
i=3: j=0, 1, 2
...
i=n: j=0, 1, 2, ..., n-1

How Do Running Times Compare?

Towards "Big Oh"
[Graph: time t on the vertical axis, input size n on the horizontal axis, showing T(n) and c·f(n) crossing at n = n0.]
c·f(n), e.g. 5n² with c = 5, f(n) = n²
T(n) describes the runtime of some program, e.g. T(n) = 2n² - 4n + 3
We can observe that for input sizes n ≥ n0, the graph of c·f(n) lies above the graph of T(n).
For n ≥ n0, c·f(n) is an upper bound on T(n), i.e. c·f(n) ≥ T(n).

Big-Oh [1]
It is too much work to count the exact number of machine instructions. Instead, hide the details:
the average number of machine instructions generated by the compiler
the average number of instructions executed by a machine per second
Simplification: instead of 4m-1, we simply write O(m).

Big-Oh [2]
Restrict the argument to integers n ≥ 0; T(n) is nonnegative for all n.
Definition: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ c·f(n)
(∃ "there exists", ∀ "for all")

Big-Oh - Example [1]
Definition: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ c·f(n)
Example 1: T(0) = 1, T(1) = 4, T(2) = 9, and in general T(n) = (n+1)².
Is T(n) also O(n²)?

Big-Oh - Example [2]
T(n) = (n+1)². We want to show that T(n) is O(n²); in other words, f(n) = n².
If this holds, there exist an integer n0 and a constant c > 0 such that for all n ≥ n0: T(n) ≤ c·n²

Big-Oh - Example [3]
T(n) ≤ c·n² ⇔ (n+1)² ≤ c·n²
Choose c = 4, n0 = 1, and show that (n+1)² ≤ 4n² for n ≥ 1:
(n+1)² = n² + 2n + 1 ≤ n² + 2n² + 1 = 3n² + 1 ≤ 3n² + n² = 4n² = c·n²

Big-Oh - Example [Alt 3]
T(n) ≤ c·n² ⇔ (n+1)² ≤ c·n²
Choose c = 2, n0 = 3, and show that (n+1)² ≤ 2n² for n ≥ 3:
(n+1)² = n² + 2n + 1 ≤ n² + n² = 2n² = c·n²
using that 2n + 1 ≤ n² for all n ≥ 3.

Simplification Rules for Big-Oh
Constant factors can be omitted: O(54n²) = O(n²)
Lower-order terms can be omitted: O(n⁴ + n²) = O(n⁴), O(n²) + O(1) = O(n²)
Note that the highest-order term should never be negative. Lower-order terms can be negative; negative terms can be omitted since they do not increase the runtime.

Transitivity [1]
What is transitivity? A relation ☺ is transitive if A ☺ B and B ☺ C imply A ☺ C.
Example: "<" is transitive: a < b and b < c imply a < c, e.g. 2 < 4 and 4 < 7, so 2 < 7.
Is Big-Oh transitive?

Transitivity [2]
If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)).
f(n) is O(g(n)): ∃ n1, c1 such that f(n) ≤ c1·g(n) ∀ n ≥ n1
g(n) is O(h(n)): ∃ n2, c2 such that g(n) ≤ c2·h(n) ∀ n ≥ n2
Choose n0 = max{n1, n2} and c = c1·c2. Then for all n ≥ n0:
f(n) ≤ c1·g(n) ≤ c1·c2·h(n) ⇒ f(n) is O(h(n))

Tightness
Use the constant factor 1.
Use the tightest upper bound that we can prove.
3n is O(n²) and O(n) and O(2n). Which one should we use?

Summation Rule [1]
Consider a program that contains 2 parts:
Part 1 takes T1(n) time and is O(f1(n))
Part 2 takes T2(n) time and is O(f2(n))
We also know that f2 grows no faster than f1, i.e., f2(n) is O(f1(n)).
What is the running time of the entire program? T1(n) + T2(n) is O(f1(n) + f2(n)). But can we simplify this?

Summation Rule [2]
T1(n) + T2(n) is O(f1(n)), since f2 grows no faster than f1.
Proof:
T1(n) ≤ c1·f1(n) for n ≥ n1
T2(n) ≤ c2·f2(n) for n ≥ n2
f2(n) ≤ c3·f1(n) for n ≥ n3
Choose n0 = max{n1, n2, n3}. Then for all n ≥ n0:
T1(n) + T2(n) ≤ c1·f1(n) + c2·f2(n) ≤ c1·f1(n) + c2·c3·f1(n) = (c1 + c2·c3)·f1(n) = c·f1(n) with c = c1 + c2·c3
⇒ T1(n) + T2(n) is O(f1(n))

Summation Rule - Example

//make A an identity matrix
scanf("%d", &d);             O(1)
for(i=0; i<n; i++)
    for(j=0; j<n; j++)
        A[i][j] = 0;         O(n²)
    A[i][i] = 1;             O(n)

O(1) + O(n²) + O(n) = O(n²)

Summary of Rules & Concepts [1]
Worst-case, average-case, and best-case running times are compared for a fixed input size n, not for varying n!
Counting instructions: assume 1 instruction for assignments, simple calculations, comparisons, etc.
Definition of Big-Oh: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ c·f(n)

Summary of Rules & Concepts [2]
Rule 1: Constant factors can be omitted. Example: O(3n⁵) = O(n⁵)
Rule 2: Lower-order terms can be omitted. Example: O(3n⁵ + 10n⁴ - 4n³ + n + 1) = O(3n⁵)
We can combine Rule 1 and Rule 2. Example: O(3n⁵ + 10n⁴ - 4n³ + n + 1) = O(n⁵)

Summary of Rules & Concepts [3]
For O(f(n) + g(n)), we can neglect the function with the slower growth rate. Example: O(n + n·log n) = O(n·log n)
Transitivity: if f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)). Example: f(n) = 3n, g(n) = n², h(n) = n⁶; 3n is O(n²) and n² is O(n⁶), so 3n is O(n⁶).
Tightness: we try to find an upper-bound Big-Oh that is as small as possible. Example: n² is O(n⁶), but O(n²) is a much tighter (and better) bound.

Solutions to Instruction Counts on Code Segments

Code Segment    Instructions    Big-Oh
1               3n + 3          O(n)
2               3n² + 4n + 3    O(n²)
3               3n³ + 4n + 3    O(n³)
4               Argh!           O(n²)