Lecture 4 (Mon 29 Sep 2014): Running Time and Performance Analysis



Running Time / Performance Analysis

Techniques until now:
- Experimental
- Cost models: counting executions of operations or lines of code
  - under some assumptions: only some operations count, and the cost of each operation = 1 time unit
- Tilde notation: T(n) ~ (5/3)n²

Today:
- Θ-notation
- Examples: insertionSort and binarySearch

InsertionSort – Pseudocode

Algorithm (in pseudocode):
1. for (j = 1; j < A.length; j++) {
2.   // shift A[j] into the sorted A[0..j-1]
3.   i = j-1
4.   while i >= 0 and A[i] > A[i+1] {
5.     swap A[i], A[i+1]
6.     i = i-1
7. }}
8. return A
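For readers who want to experiment, the pseudocode above can be translated directly into runnable code. This is a sketch in Python (the slides themselves are language-neutral); the function name is ours:

```python
def insertion_sort(a):
    """Sort list a in place by repeatedly shifting a[j] into the sorted prefix a[0..j-1]."""
    for j in range(1, len(a)):
        i = j - 1
        # Swap a[j] leftwards while it is smaller than its left neighbour,
        # exactly as in lines 4-6 of the pseudocode.
        while i >= 0 and a[i] > a[i + 1]:
            a[i], a[i + 1] = a[i + 1], a[i]
            i -= 1
    return a
```

For example, `insertion_sort([5, 2, 4, 6, 1, 3])` returns `[1, 2, 3, 4, 5, 6]`.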

Worst Case

Line                                         Cost   No. of times
1. for (j = 1; j < A.length; j++) {          1      n
2.   // shift A[j] into the sorted A[0..j-1]
3.   i = j-1                                 1      n-1
4.   while i >= 0 and A[i] > A[i+1] {        1      2+…+n
5.     swap A[i], A[i+1]                     1      1+…+(n-1)
6.     i = i-1                               1      1+…+(n-1)
7. }}
8. return A                                  1      1

In the worst case the array is in reverse sorted order. Then:
T(n) = n + (n-1) + Σ_{x=2..n} x + 2·Σ_{x=1..n-1} x + 1
     = n + (n-1) + (n(n+1)/2 - 1) + 2·n(n-1)/2 + 1
     = (3/2)n² + (3/2)n - 1

We also saw the best and average cases.
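The count for line 5 can be checked empirically: on a reverse-sorted input, the swap on line 5 executes 1 + 2 + … + (n-1) = n(n-1)/2 times. A small Python sketch (the instrumented function name is ours, not from the slides):

```python
def insertion_sort_swaps(a):
    """Run insertion sort on a copy of a and return how many swaps were performed."""
    a = list(a)
    swaps = 0
    for j in range(1, len(a)):
        i = j - 1
        while i >= 0 and a[i] > a[i + 1]:
            a[i], a[i + 1] = a[i + 1], a[i]
            swaps += 1
            i -= 1
    return swaps

n = 10
worst = list(range(n, 0, -1))  # reverse-sorted: every comparison triggers a swap
assert insertion_sort_swaps(worst) == n * (n - 1) // 2  # 45 swaps for n = 10
```

An already-sorted input triggers no swaps at all, which is the best case mentioned on the slide.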

How fast is T(n) = (3/2)n² + (3/2)n - 1?

Fast computer vs. slow computer

Fast Computer vs. Smart Programmer

T1(n) = (3/2)n² + (3/2)n - 1        T2(n) = (3/2)n - 1

Fast Computer vs. Smart Programmer (rematch!)

T1(n) = (3/2)n² + (3/2)n - 1        T2(n) = (3/2)n - 1

A smart programmer with a better algorithm always beats a fast computer with a worse algorithm, for sufficiently large inputs.

At large enough input sizes, only the rate of growth of an algorithm's running time matters. That is why we dropped the lower-order terms with the tilde notation: when T(n) = (3/2)n² + (3/2)n - 1, we write T(n) ~ (3/2)n².

However: to arrive at (3/2)n² we first had to compute the complete T(n) = (3/2)n² + (3/2)n - 1. It is not possible to determine the coefficient 3/2 without the complete polynomial.

Simpler approach

It turns out that even the coefficient of the highest-order term of the polynomial is not all that important for large enough inputs. This leads us to the asymptotic running time:

T(n) = (3/2)n² + (3/2)n - 1 = Θ(n²)

We ignore everything except for the most significant growth function. Even with such a simplification, we can compare algorithms to discover the best ones. Sometimes constants matter in the real-world performance of algorithms, but this is rare.

Important Growth Functions

From better to worse:
Function f   Name
1            constant
log n        logarithmic
n            linear
n·log n
n²           quadratic
n³           cubic
…
2ⁿ           exponential
…

Important Growth Functions

From better to worse:
Function f   Name
1            constant
log n        logarithmic
n            linear
n·log n
n²           quadratic
n³           cubic
…
2ⁿ           exponential
…

The first 4 are practically fast (most commercial programs run in such Θ-time).
Anything less than exponential is theoretically fast (P vs NP).

Important Growth Functions

From better to worse:
Function f   Name          Problem size solved in mins (today)
1            constant      any
log n        logarithmic   any
n            linear        billions
n·log n                    hundreds of millions
n²           quadratic     tens of thousands
n³           cubic         thousands
…
2ⁿ           exponential

Growth Functions

From better to worse:
Function f   Name          Example code of Θ(f):
1            constant      swap A[i], A[j]
log n        logarithmic   j=n; while(j>0){ …; j=j/2 }
n            linear        for(j=1; j<n; j++){ … }
n·log n                    [best sorting algorithms]
n²           quadratic     for(j=1; j<n; j++){ for(i=1; i<j; i++){ … }}
n³           cubic         [3 nested for-loops]
…
2ⁿ           exponential   [brute-force password breaking tries all combinations]
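The logarithmic entry can be verified directly: the halving loop j=n; while(j>0){ …; j=j/2 } runs exactly floor(log₂ n) + 1 times for n ≥ 1. A small Python check (the counter function is ours, added for illustration):

```python
import math

def halving_iterations(n):
    """Count the iterations of: j = n; while j > 0: j = j // 2."""
    j, count = n, 0
    while j > 0:
        count += 1
        j //= 2  # integer halving, as in the table's logarithmic example
    return count

# The iteration count matches floor(log2 n) + 1 for a range of sizes.
for n in (1, 10, 1000, 10**6):
    assert halving_iterations(n) == math.floor(math.log2(n)) + 1
```

Doubling n adds only one extra iteration, which is exactly why logarithmic algorithms scale to inputs of any practical size.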

Asymptotic Running Time Θ(f(n))

It has useful operations:
Θ(n) + Θ(n²) = Θ(n²)
Θ(n) × Θ(n²) = Θ(n³)
Θ(n) × Θ(log n) = Θ(n·log n)

In general:
Θ(f(n)) + Θ(g(n)) = Θ(g(n))   if Θ(f(n)) ≤ Θ(g(n))
Θ(f(n)) × Θ(g(n)) = Θ(f(n) × g(n))
If f(n) = Θ(g(n)) and g(n) = Θ(h(n)), then f(n) = Θ(h(n))

InsertionSort – asymptotic worst-case analysis

                                             Asymptotic cost (LoC model)
1. for j = 1 to A.length {
2.   // shift A[j] into the sorted A[0..j-1]
3.   i = j-1
4.   while i >= 0 and A[i] > A[i+1] {
5.     swap A[i], A[i+1]
6.     i = i-1
7. }}
8. return A

T(n) =

InsertionSort – asymptotic worst-case analysis

                                             Asymptotic cost (LoC model)
1. for j = 1 to A.length {                   Θ(n)
2.   // shift A[j] into the sorted A[0..j-1]
3.   i = j-1                                 Θ(n)
4.   while i >= 0 and A[i] > A[i+1] {        Θ(n²)
5.     swap A[i], A[i+1]                     Θ(n²)
6.     i = i-1                               Θ(n²)
7. }}
8. return A                                  Θ(1)

T(n) = Θ(n) + Θ(n) + Θ(n²) + Θ(n²) + Θ(n²) + Θ(1) = Θ(n²)

More Asymptotic Notation: O, Ω

When we are giving exact bounds we write: T(n) = Θ(f(n))
When we are giving upper bounds we write: T(n) ≤ Θ(f(n)), or alternatively T(n) = O(f(n))
When we are giving lower bounds we write: T(n) ≥ Θ(f(n)), or alternatively T(n) = Ω(f(n))

Examples: Θ, O, Ω

3n²·log n + n² + 4n - 2 = ?
3n²·log n + n² + 4n - 2 = O(n²·log n)
3n²·log n + n² + 4n - 2 = O(n³)
3n²·log n + n² + 4n - 2 = O(2ⁿ)
3n²·log n + n² + 4n - 2 ≠ O(n²)

Examples: Θ, O, Ω

3n²·log n + n² + 4n - 2 = Θ(n²·log n)
3n²·log n + n² + 4n - 2 = Ω(n²·log n)
3n²·log n + n² + 4n - 2 = Ω(n²)
3n²·log n + n² + 4n - 2 = Ω(1)
3n²·log n + n² + 4n - 2 ≠ Ω(n³·log n)

Examples (comparisons)

Θ(n·log n) =?= Θ(n)

Examples (comparisons)

Θ(n·log n) > Θ(n)
Θ(n² + 3n - 1) =?= Θ(n²)

Examples (comparisons)

Θ(n·log n) > Θ(n)
Θ(n² + 3n - 1) = Θ(n²)

Examples (comparisons)

Θ(n·log n) > Θ(n)
Θ(n² + 3n - 1) = Θ(n²)
Θ(1) =?= Θ(10)
Θ(5n) =?= Θ(n²)
Θ(n³ + log n) =?= Θ(100n³ + log n)

Write all of the above in order, writing = or < between them.

Principle

Θ bounds are the most precise asymptotic performance bounds we can give.
O/Ω bounds may be imprecise.

One more example: BinarySearch

Specification:
Input: array a[0..n-1], integer key
Input property: a is sorted
Output: integer pos
Output property: if key == a[i] then pos == i

Trivial?
- First binary search published in 1946
- First bug-free binary search published in 1962
- Bug in Java's Arrays.binarySearch() found in 2006

BinarySearch – pseudocode

1. lo = 0, hi = a.length-1
2. while (lo <= hi) {
3.   int mid = lo + (hi - lo) / 2
4.   if (key < a[mid]) then hi = mid - 1
5.   else if (key > a[mid]) then lo = mid + 1
6.   else return mid
7. }
8. return -1

Note: here array indices start from 0 and go up to length-1.
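A runnable version of this pseudocode, sketched here in Python. Note the midpoint computation lo + (hi - lo) / 2: in languages with fixed-width integers, the naive (lo + hi) / 2 can overflow for very large arrays, which was the nature of the 2006 bug in Java's Arrays.binarySearch mentioned on the previous slide (Python integers do not overflow, but the pattern is kept for fidelity):

```python
def binary_search(a, key):
    """Return an index pos with a[pos] == key, or -1 if key is not in sorted list a."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = lo + (hi - lo) // 2  # overflow-safe midpoint, as in the pseudocode
        if key < a[mid]:
            hi = mid - 1           # key, if present, lies to the left of mid
        elif key > a[mid]:
            lo = mid + 1           # key, if present, lies to the right of mid
        else:
            return mid
    return -1
```

For example, `binary_search([1, 3, 5, 7, 9], 7)` returns `3`, and searching for an absent key returns `-1`.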

BinarySearch – Loop Invariant

1. lo = 0, hi = a.length-1
2. while (lo <= hi) {
3.   int mid = lo + (hi - lo) / 2
4.   if (key < a[mid]) then hi = mid - 1
5.   else if (key > a[mid]) then lo = mid + 1
6.   else return mid
7. }
8. return -1

Note: here array indices start from 0 and go up to length-1.

Invariant: if key is in a[0..n-1], then key is in a[lo..hi]

BinarySearch – Asymptotic Running Time

                                               Asymptotic cost
1. lo = 0, hi = a.length-1
2. while (lo <= hi) {
3.   int mid = lo + (hi - lo) / 2
4.   if (key < a[mid]) then hi = mid - 1
5.   else if (key > a[mid]) then lo = mid + 1
6.   else return mid
7. }
8. return -1

Note: array indices start from 0 and go up to length-1.

BinarySearch – Asymptotic Running Time

                                               Asymptotic cost
1. lo = 0, hi = a.length-1                     Θ(1)
2. while (lo <= hi) {                          Θ(log n)
3.   int mid = lo + (hi - lo) / 2              Θ(log n)
4.   if (key < a[mid]) then hi = mid - 1       Θ(log n)
5.   else if (key > a[mid]) then lo = mid + 1  Θ(log n)
6.   else return mid                           Θ(log n)
7. }
8. return -1                                   Θ(1)

Note: array indices start from 0 and go up to length-1.

T(n) = Θ(log n)

When a loop throws away half the input array at each iteration, it will perform Θ(log n) iterations!
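The Θ(log n) iteration count can be observed directly by instrumenting the loop. This sketch (the counting function is ours, added for illustration) counts while-loop iterations for an unsuccessful search, which forces the longest chain of halvings; the count never exceeds floor(log₂ n) + 1:

```python
import math

def binary_search_iterations(a, key):
    """Count the while-loop iterations binary search performs on sorted list a."""
    lo, hi, iters = 0, len(a) - 1, 0
    while lo <= hi:
        iters += 1
        mid = lo + (hi - lo) // 2
        if key < a[mid]:
            hi = mid - 1
        elif key > a[mid]:
            lo = mid + 1
        else:
            break  # found: stop counting
    return iters

n = 1024
a = list(range(n))
# An unsuccessful search halves the range until it is empty.
assert binary_search_iterations(a, -1) <= math.floor(math.log2(n)) + 1
```

Doubling the array size adds at most one iteration, matching the Θ(log n) bound on the slide.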

- We will use the asymptotic Θ-notation from now on because it is easier to calculate. The book uses the ~ notation (more accurate, but similar to Θ).
- We will mostly look at the worst case (sometimes the average).
- Sometimes we can sacrifice some memory space to improve running time. We will discuss space performance and the space/time tradeoff in the next lecture.
- Don't forget the labs tomorrow, Tuesday and Thursday; see the website.