1 Asymptotic Notations
Iterative Algorithms and their analysis
Asymptotic Notations – Big O, Ω, Θ notations
Review of Discrete Math – Summations – Logarithms



2 Example I: Finding the sum of an array of numbers

int Sum(int A[], int N) {
    int sum = 0;
    for (int i = 0; i < N; i++) {
        sum += A[i];
    } //end-for
    return sum;
} //end-Sum

How many steps does this algorithm take to finish?
–We define a step to be a unit of work that can be executed in a constant amount of time on a machine.

3 Example I: Finding the sum of an array of numbers

int Sum(int A[], int N) {
    int sum = 0;                 // executed 1 time
    for (int i = 0; i < N; i++) {  // loop control: N times
        sum += A[i];             // executed N times
    } //end-for
    return sum;                  // executed 1 time
} //end-Sum

Total: 1 + N + N + 1 = 2N + 2
Running Time: T(N) = 2N + 2
–N is the input size (the number of ints to add)

4 Example II: Searching a key in an array of numbers

int LinearSearch(int A[], int N, int key) {
    int i = 0;                      // 1 time
    while (i < N) {                 // L times
        if (A[i] == key) break;     // L times
        i++;                        // L times
    } //end-while
    if (i < N) return i;            // 1 time
    else return -1;
} //end-LinearSearch

Total: 1 + 3*L + 1 = 3L + 2, where L is the number of loop iterations

5 Example II: Searching a key in an array of numbers

What's the best case?
–Loop iterates just once => T(n) = 3*1 + 2 = 5
What's the average (expected) case?
–Loop iterates N/2 times => T(n) = 3*N/2 + 2 = 1.5N + 2
What's the worst case?
–Loop iterates N times => T(n) = 3N + 2
–Recall that we are mostly interested in the worst case

6 Example III: Nested for loops

for (i = 1; i <= N; i++) {
    for (j = 1; j <= N; j++) {
        printf("Foo\n");
    } //end-for-inner
} //end-for-outer

How many times is the printf statement executed? That is, how many times will you see Foo on the screen? The inner loop runs N times for each of the N outer iterations, so the answer is N*N = N^2.

7 Example IV: Matrix Multiplication

/* Two-dimensional N x N arrays A, B, C. Compute C = A*B */
for (int i = 0; i < N; i++) {
    for (int j = 0; j < N; j++) {
        C[i][j] = 0;
        for (int k = 0; k < N; k++) {
            C[i][j] += A[i][k] * B[k][j];
        } //end-for-innermost
    } //end-for-inner
} //end-for-outer

8 Example V: Binary Search

Problem: You are given a sorted array of integers, and you are searching for a key.
–Linear Search: T(n) = 3n + 2 (worst case)
–Can we do better?
–E.g., search for 55 in the sorted array below

9 Example V: Binary Search

Since the array is sorted, we can cut the search space in half at each step by comparing the target key with the key in the middle of the array, and then continue the same way on the surviving half.

Example: searching for 55. (The slide figure shows the left, right, and middle markers on the array, with the eliminated half grayed out at each step.)

10 Binary Search (continued)

(The slide figure shows the next two steps: the middle element is 50, which eliminates the left half, and then the middle element is 55.)

Now we have found 55: a successful search. Had we searched for 57, we would have terminated at the next step unsuccessfully.

11 Binary Search (continued)

At any step during a search for "target", we have restricted our search space to the keys between "left" and "right". Any key to the left of "left" is smaller than "target" and is thus eliminated from the search space; any key to the right of "right" is greater than "target" and is likewise eliminated.

12 Binary Search - Algorithm

// Return the index of the array element containing the key, or -1 if the key is not found
int BinarySearch(int A[], int N, int key) {
    int left = 0;
    int right = N - 1;
    while (left <= right) {
        int middle = (left + right) / 2;        // Index of the key to test against
        if (A[middle] == key) return middle;    // Key found: return the index
        else if (key < A[middle]) right = middle - 1;  // Eliminate the right side
        else left = middle + 1;                 // Eliminate the left side
    } //end-while
    return -1;  // Key not found
} //end-BinarySearch

Worst-case running time: T(n) = 3 + 5*log2(N). Why? Each loop iteration does a constant number of steps and halves the search space, so the loop runs at most about log2(N) times.

13 Asymptotic Notations

Linear Search: T(N) = 3N + 2
Binary Search: T(N) = 3 + 5*log2(N)
Nested Loop: T(N) = N^2
Matrix Multiplication: T(N) = N^3 + N^2

In general, what really matters is the "asymptotic" performance as N → ∞, regardless of what happens for small input sizes N.

3 asymptotic notations:
–Big-O: asymptotic upper bound
–Omega (Ω): asymptotic lower bound
–Theta (Θ): asymptotic tight bound

14 Big-Oh Notation: Asymptotic Upper Bound

T(n) = O(f(n)) [T(n) is big-Oh of f(n), or of order f(n)]
–If there are positive constants c and n0 such that T(n) <= c*f(n) for all n >= n0
–Example: T(n) = 50n is O(n). Why?
–Choose c = 50, n0 = 1. Then 50n <= 50n for all n >= 1
–Many other choices work too!

(The slide figure plots Running Time against Input size N, with c*f(n) lying above T(n) for all n >= n0.)

15 Common Functions We Will Encounter

Name         Big-Oh          Comment
Constant     O(1)            Can't beat it!
Log log      O(log log N)    Extrapolation search
Logarithmic  O(log N)        Typical time for good searching algorithms
Linear       O(N)            About the fastest an algorithm can run, given that we need O(N) just to read the input
N log N      O(N log N)      Most sorting algorithms
Quadratic    O(N^2)          Acceptable when the data size is small (N < 1000)
Cubic        O(N^3)          Acceptable when the data size is small (N < 1000)
Exponential  O(2^N)          Only good for really small input sizes (N <= 20)

(Cost increases down the table; everything up to cubic is polynomial time.)

16 Ω Notation: Asymptotic Lower Bound

T(n) = Ω(f(n))
–If there are positive constants c and n0 such that T(n) >= c*f(n) for all n >= n0
–Example: T(n) = 2n + 5 is Ω(n). Why?
–2n + 5 >= 2n for all n >= 1
–T(n) = 5n^2 - 3n is Ω(n^2). Why?
–5n^2 - 3n >= 4n^2 for all n >= 4

(The slide figure plots T(n) lying above c*f(n) for all n >= n0.)

17 Θ Notation: Asymptotic Tight Bound

T(n) = Θ(f(n))
–If there are positive constants c1, c2, and n0 such that c1*f(n) <= T(n) <= c2*f(n) for all n >= n0
–Example: T(n) = 2n + 5 is Θ(n). Why?
–2n <= 2n + 5 <= 3n for all n >= 5
–T(n) = 5n^2 - 3n is Θ(n^2). Why?
–4n^2 <= 5n^2 - 3n <= 5n^2 for all n >= 4

(The slide figure shows T(n) sandwiched between c1*f(n) below and c2*f(n) above for all n >= n0.)

18 Big-Oh, Theta, Omega

Tips to guide your intuition:
Think of O(f(N)) as "less than or equal to" f(N)
–Upper bound: "grows slower than or at the same rate as" f(N)
Think of Ω(f(N)) as "greater than or equal to" f(N)
–Lower bound: "grows faster than or at the same rate as" f(N)
Think of Θ(f(N)) as "equal to" f(N)
–"Tight" bound: same growth rate
(True for large N and ignoring constant factors)

19 Some Math

Sum of the first N integers: S(N) = 1 + 2 + 3 + ... + N = N(N+1)/2
Sum of Squares: 1^2 + 2^2 + ... + N^2 = N(N+1)(2N+1)/6
Geometric Series (1 + A + A^2 + ... + A^N):
–A > 1: the sum is (A^(N+1) - 1)/(A - 1)
–A < 1: the sum is (1 - A^(N+1))/(1 - A), which is at most 1/(1 - A)

20 Some More Math

Linear Geometric Series: sum over i = 1..N of i*A^i = A*(1 - (N+1)*A^N + N*A^(N+1)) / (1 - A)^2
Harmonic Series: H_N = 1 + 1/2 + 1/3 + ... + 1/N ≈ ln N + γ, where γ ≈ 0.577
Logs:
–log(A*B) = log A + log B
–log(A/B) = log A - log B
–log(A^B) = B * log A
–log_A(B) = log B / log A (change of base)

21 More on Summations

Summations with general bounds: sum over i = m..n of f(i) = (sum over i = 1..n of f(i)) - (sum over i = 1..m-1 of f(i))
Linearity of Summations: sum over i = 1..n of (c*a_i + b_i) = c*(sum over i = 1..n of a_i) + (sum over i = 1..n of b_i)