Time Complexity Problem size (Input size)

Time Complexity: problem size (input size) and time analysis (a priori analysis)

Problem instances. An instance is the actual data for which the problem needs to be solved. In this class we use the terms instance and input interchangeably. Problem: sort a list of records. Instances: (1, 10, 5), (20, -40), (1, 2, 3, 4, 1000, -27)

Instance Size. Size of an instance: the number of data items (e.g., elements, values, records, parameters, bits, words, etc.) needed to represent the instance on a computer.

Input Size Examples. Search and sort: size = n, the number of records in the list (for search, ignore the search key). Graph problems: size = |V| + |E|, the number of nodes |V| plus the number of edges |E| in the graph. Matrix problems: size = r*c, the number of rows r multiplied by the number of columns c.

Number problems. Problems where the input numbers can become arbitrarily large. Examples: factorial of 10, 10^6, 10^15; Fibonacci numbers; number operations (e.g., multiplying, adding). For these problems we should use the formal definition of input size (the number of bits needed to represent the numbers).

Example 1: Factorial. Compute the factorial of an n-bit number v, so 2^(n-1) <= v < 2^n. Since v! = v * (v-1) * ... * 3 * 2 * 1, the algorithm does O(v) multiplications. Size of input = n. Example: n = 100 bits (the first bit is 1), so 2^99 <= v < 2^100, i.e., v > 0.6 * 10^30. How many multiplications are needed?
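A small sketch of this point: the helpers below (illustrative, not from the slides) count the multiplications performed by the iterative factorial, showing that for an n-bit value the count can reach 2^n - 2, exponential in the input size n.

```c
#include <assert.h>
#include <stdint.h>

/* Multiplications performed by the iterative factorial
   v! = v * (v-1) * ... * 2 * 1: exactly v - 1 of them for v >= 2.
   (The products themselves overflow quickly; only the count matters here.) */
static uint64_t factorial_mult_count(uint64_t v) {
    uint64_t count = 0;
    for (uint64_t k = 2; k <= v; k++)
        count++;                      /* one multiplication per iteration */
    return count;
}

/* Worst-case multiplication count over all n-bit values (v <= 2^n - 1):
   2^n - 2, i.e., exponential in the input size n. */
static uint64_t worst_count_for_bits(unsigned n) {
    uint64_t vmax = ((uint64_t)1 << n) - 1;   /* largest n-bit value */
    return factorial_mult_count(vmax);
}
```

So measured against the input size n (bits), the O(v) multiplications are Theta(2^n) in the worst case.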

Efficiency. The efficiency of an algorithm depends on the quantity of resources it requires. Usually we compare algorithms based on their time, and sometimes also on the space resources they need. The time required by an algorithm depends on the instance size and on the data itself.

Time Analysis. Best case: the smallest amount of time needed to run any instance of a given size. Worst case: the largest amount of time needed to run any instance of a given size. Average case: the expected time required for an instance of a given size.

Time Analysis. If the best, worst, and average "times" of an algorithm are identical, we have an every-case time analysis (e.g., array addition, matrix multiplication). Usually, the best, worst, and average times of an algorithm are different.

Example: Sequential search. Problem: find a search key in a list of records. Algorithm: sequential search. Main idea: compare the search key to each key until a match is found or the list is exhausted. The time depends on the size of the list n and on the data stored in the list.

Example: Time Analysis for Sequential search. Worst case: the search key x is the last item in the array, or x is not in the array: W(n) = n. Best case: x matches the first key in array S, i.e., x = S[1], regardless of array size n: B(n) = 1.

Time Analysis for Sequential search. Average case: if the probability that x is in the k-th array slot is 1/n, i.e., x is uniformly distributed over the array slots, then A(n) = (1/n) * (1 + 2 + ... + n) = (n+1)/2. Question: what is A(n) if x could be outside of the array?
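The uniform-distribution average can be checked empirically. The sketch below (hypothetical helper names, assuming distinct keys and a search key that is always present) counts comparisons for every possible position of x; the total is 1 + 2 + ... + n = n(n+1)/2, so the average is (n+1)/2.

```c
#include <assert.h>

/* Sequential search over S[0..n-1], returning the number of key
   comparisons performed. */
static int seq_search_comparisons(const int *S, int n, int x) {
    int comps = 0;
    for (int i = 0; i < n; i++) {
        comps++;                      /* compare x against S[i] */
        if (S[i] == x)
            return comps;             /* found after i + 1 comparisons */
    }
    return comps;                     /* unsuccessful search: n comparisons */
}

/* Total comparisons when x is placed in each of the n slots in turn;
   equals n(n+1)/2, giving an average of (n+1)/2. */
static int total_comparisons_over_positions(const int *S, int n) {
    int total = 0;
    for (int k = 0; k < n; k++)
        total += seq_search_comparisons(S, n, S[k]);
    return total;
}
```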

Worst-case time analysis. It is the most commonly used time complexity analysis approach because: it is easier to compute than the average case (no knowledge of the probability of occurrence of instances is needed); it gives the maximum "time" needed as a function of instance size, which is more useful than the best case; and it is important for mission-critical and real-time problems.

Worst-case time analysis. Drawbacks of comparing algorithms by their worst-case time: an algorithm can be better on average than another even though its worst-case time complexity is worse; and for some algorithms, a worst-case instance is very unlikely to occur in practice.

Evaluation of running time through experiments. The algorithm must be fully implemented. To compare running times we need to use the same hardware and software environment. There may also be differences due to the coding styles of different individuals.

Requirements for time analysis: independence, a priori analysis, large instances, growth rate classes.

Independence Requirement. Time analysis must be independent of: the hardware of the computer, the programming language used for the pseudocode, and the programmer who wrote the code.

A Priori Requirement. Analysis should be a priori: derived for any algorithm expressed as a high-level description or pseudocode.

Large Instance Requirement. Algorithms that run efficiently on small instances may run very slowly on large ones. Therefore the analysis must capture the algorithm's behavior when problem instances are large, and ignore its behavior for small sizes.

Growth Rate Classes Requirement. Time analysis must classify algorithms into ordered classes, so that all algorithms in a single class are considered to have the same efficiency. If class A "is better than" class B, then all algorithms in A are considered more efficient than all algorithms in B.

The classes. Growth rate classes are derived from instruction counts. Time analysis partitions algorithms into general equivalence classes such as: logarithmic, linear, quadratic, cubic, polynomial, exponential, etc.

Instruction counts. These provide rough estimates of the actual number of instructions executed. They depend on: the language used to describe the algorithm, the programmer's style, and the method used to derive the count. They may be quite different from actual counts: an algorithm with count = 2n may not be faster than one with count = 5n.

Comparing an n log n algorithm to an n^2 algorithm. The following example shows that an n log n algorithm is eventually more efficient for large instances. Pete is a programmer for a supercomputer that executes 100 million instructions per second. His implementation of insertion sort requires only 2n^2 computer instructions to sort n numbers. Joe has a PC that executes 1 million instructions per second. Joe's sloppy implementation of merge sort requires 75 n lg n computer instructions to sort n numbers.

Who is faster sorting a million numbers? Super Pete: (2 * (10^6)^2 instructions) / (10^8 instructions/sec) = 20,000 seconds, about 5.56 hours. Joe: (75 * 10^6 * lg(10^6) instructions) / (10^6 instructions/sec) = 1494.9 seconds, about 25 minutes.

Who is faster sorting 50 items? Super Pete: (2 * 50^2 instructions) / (10^8 instructions/sec) = 0.00005 seconds. Joe: (75 * 50 * lg(50) instructions) / (10^6 instructions/sec) = about 0.021 seconds.

Insertion sort. Worst-case time complexity for the number of comparisons: in the while loop, for a given i, the comparison is done at most i-1 times. In total: 1 + 2 + ... + (n-1) = n(n-1)/2.

Insertionsort(int n, keytype S[])
{
  index i, j;
  keytype x;
  for (i = 2; i <= n; i++) {
    x = S[i];
    j = i - 1;
    while (j > 0 && S[j] > x) {
      S[j+1] = S[j];
      j--;
    }
    S[j+1] = x;
  }
}
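A runnable C version of the pseudocode above (keytype taken as int, arrays 1-based as in the slide; the comparison counter is our addition for illustration):

```c
#include <assert.h>

/* Insertion sort on S[1..n], returning the number of S[j] > x key
   comparisons performed. Worst case (reverse-sorted input): n(n-1)/2. */
static long insertion_sort(int n, int S[]) {
    long comparisons = 0;
    for (int i = 2; i <= n; i++) {
        int x = S[i];
        int j = i - 1;
        /* The comma operator counts every evaluation of S[j] > x;
           the j > 0 guard is not a key comparison. */
        while (j > 0 && (comparisons++, S[j] > x)) {
            S[j + 1] = S[j];          /* shift larger keys right */
            j--;
        }
        S[j + 1] = x;                 /* insert x into its slot */
    }
    return comparisons;
}
```

On a reverse-sorted array of 6 keys it performs 6*5/2 = 15 comparisons; on an already-sorted one, only n - 1.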

Example: Binary search (recursive)

index Binsearch(index low, index high)
{
  index mid;
  if (low > high)
    return 0;
  else {
    mid = floor((low + high) / 2);
    if (x == S[mid])
      return mid;
    else if (x < S[mid])
      return Binsearch(low, mid - 1);
    else
      return Binsearch(mid + 1, high);
  }
}

Worst-case time complexity: W(n) = W(n/2) + 1, W(1) = 1, which gives W(n) = floor(lg n) + 1.
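A runnable version of Binsearch (the slide leaves x and S as globals; here they are passed as parameters, arrays are 1-based, and index 0 means "not found"):

```c
#include <assert.h>

/* Recursive binary search for x in sorted S[low..high] (1-based).
   Returns the matching index, or 0 when x is absent. */
static int binsearch(const int S[], int x, int low, int high) {
    if (low > high)
        return 0;                          /* empty range: not found */
    int mid = (low + high) / 2;            /* floor((low + high) / 2) */
    if (x == S[mid])
        return mid;
    else if (x < S[mid])
        return binsearch(S, x, low, mid - 1);   /* search left half */
    else
        return binsearch(S, x, mid + 1, high);  /* search right half */
}
```

Each call halves the remaining range, which is exactly the recurrence W(n) = W(n/2) + 1.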

Time Complexity (non-recursive algorithm) Instruction counts

Deriving Instruction Counts. Given a (non-recursive) algorithm expressed in pseudocode, we explain how to: assign counts to high-level statements, describe methods for deriving an instruction count, and compute counts for several examples.

Counts for High Level Statements: assignment, loop condition, for loop (control and body), while loop (control and body), if statement. Note: the counts we use are not accurate; the goal is to derive a correct growth function.

Assignment Statement. 1. A = B*C - D/F. Count1 = 1. In reality? At least 4. Note: when the numbers B, C, D, F are very large, algorithms for large-number arithmetic must be used, and the count will depend on the number of digits needed to store the large numbers.

Loop condition: (i < n) && (!found). Count1 = 1. Note: if the loop condition invokes functions, the count of the function must be used.

for loop body: for (i=0; i < n; i++) A[i] = i. Count2 = 1 (the loop body). Count1(2) = ? How many times is the condition check performed? (Flowchart: i = 0; test i < n; if true, execute A[i] = i and i = i + 1; if false, exit.)

for loop control: for (i=0; i < n; i++) <body>. Count = number of times the loop condition is executed (assuming the loop condition has a count of 1). (Flowchart: test i < n; if true, execute the body and i = i + 1; if false, exit.)

for loop control: for (i=0; i < n; i++) <body>. Count1 = number of times the loop condition i < n is executed = n + 1. Note: the last time the condition is checked, i = n and (n < n) evaluates to false.
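The n + 1 condition checks can be confirmed by instrumenting the loop condition with a counter (a sketch using the comma operator; the helper name is ours):

```c
#include <assert.h>

/* Run "for (i = 0; i < n; i++) A[i] = i" and count how many times the
   condition i < n is evaluated; the final, failing check makes it n + 1. */
static int condition_checks_for(int n) {
    int checks = 0;
    int A[100];                        /* assumes n <= 100 for this sketch */
    for (int i = 0; (checks++, i < n); i++)
        A[i] = i;                      /* body executes n times */
    (void)A;                           /* silence unused warning when n == 0 */
    return checks;
}
```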

while loop control: Line 1: i = 0; Line 2: while (i < n) { Line 3: A[i] = i; Line 4: i = i + 1 }. Count = number of times the loop condition is executed (assuming the loop condition has a count of 1).

while loop control: Line 1: i = 0; Line 2: while (i < n) { Line 3: A[i] = i; Line 4: i = i + 1 }. Count2 = number of times the loop condition (i < n) is executed = n + 1.

If statement: Line 1: if (i == 0) Line 2: statement; else Line 3: statement. For worst-case analysis, how many counts are there for Countif? Countif = 1 + max{count2, count3}.

Method 1: Sum Line Counts Derive a count for each line of code taking into account all loop nesting. Compute total by adding line counts.

Method 2: Barometer Operation. A "barometer instruction" is selected, and Count = number of times that barometer instruction is executed. Search algorithms: the barometer instruction is the key comparison (x == L[j]?). Sort algorithms: the barometer instruction is the key comparison (L[i] == L[j]?).

Example 1: 1. for (i=0; i<n; i++) 2. A[i] = i. Method 1: count1 = n+1, count1(2) = n*1 = n. Total = (n+1) + n = 2n + 1.

Example 1 continued. Method 2: barometer operation = the + in the body of the loop. 1. for (i=0; i<n; i++) 2. A[i] = i + 1. count1(+) = n.

Example 2: what is count1(2)? 1. for (i=1; i<=n; i=3*i) 2. A[i] = i. (Flowchart: test i <= n; if yes, execute A[i] = i and i = 3*i; if no, exit.)

Example 2 continued: 1. for (i=1; i<=n; i=3*i) 2. A[i] = i. For simplicity, let n = 3^k for some positive integer k. The body of the loop is executed for i = 1 (= 3^0), 3^1, 3^2, ..., 3^k, so count1(2) = k + 1. Since k = log3 n, the body is executed log3 n + 1 times.
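Counting the iterations directly confirms the log3 n + 1 figure (illustrative helper, not from the slides):

```c
#include <assert.h>

/* Iterations of "for (i = 1; i <= n; i = 3*i)": the body runs for
   i = 3^0, 3^1, ..., up to the largest power of 3 that is <= n,
   i.e., floor(log3 n) + 1 times. */
static int triple_loop_count(long n) {
    int count = 0;
    for (long i = 1; i <= n; i = 3 * i)
        count++;
    return count;
}
```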

Example 3: Sequential search. 1. location = 0 2. while (location <= n-1 && L[location] != x) 3. location++ 4. return location. Barometer operation = the key comparison (L[location] != x?). Best-case analysis: x == L[0], and the count is 1. Worst-case analysis: x == L[n-1], or x is not in the list; the count is n.

Example 4: 1. x = 0 2. for (i=0; i<n; i++) 3. for (j=0; j<m; j++) 4. x = x + 1. Barometer is the + in the body of the loop. count2(3(+)) = n * m * 1 = nm.

Flowchart for Example 4: line 1 sets x = 0; the outer loop (line 2) tests i < n and increments i; the inner loop (line 3) tests j < m and increments j; the body (line 4) executes x = x + 1.

Example 5: 1. x = 0 2. for (i=0; i<n; i++) 3. for (j=0; j<n^2; j++) 4. x = x + 1. The body of the loop of line 2 is executed n times. The body of the loop of line 3 is executed n^2 times per iteration of line 2. count2(3(+)) = n * n^2 * 1 = n^3.

Example 6: Line 1: for (i=0; i<n; i++) Line 2: for (j=0; j<i; j++) Line 3: x = x + 1. Barometer = +. count1(2(+)) = 0 + 1 + ... + (n-1) = n(n-1)/2.

Example 7: 1. for (i=0; i<n; i++) 2. for (j=0; j<i; j++) 3. for (k=0; k<=j; k++). For each i and j, line 3 executes j + 1 times, so count1(2(3)) = sum over i of i(i+1)/2 = (n-1)n(n+1)/6, which is of order n^3.
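The closed form can be verified by simply running the loops and counting the innermost-body executions (illustrative helper):

```c
#include <assert.h>

/* Executions of the innermost body in Example 7's triply nested loop.
   Closed form: (n-1)n(n+1)/6, of order n^3. */
static long triple_nested_count(int n) {
    long count = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < i; j++)
            for (int k = 0; k <= j; k++)
                count++;               /* innermost body */
    return count;
}
```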

Example of Count: Pigeonhole sort. Input: array T containing n keys in the range 1..s. Idea (similar to a histogram): 1) maintain the count of the number of keys of each value in an auxiliary array U; 2) use the counts to overwrite the array in place.
Input T (n = 7): 2 1 2 4 3 1 2
Aux U (s = 4): 2 3 1 1
Output T: 1 1 2 2 2 3 4

Algorithm: Pigeonhole-Sort(T, s)
1.  for i = 1 to s          // initialize U
2.    U[i] = 0
3.  for j = 1 to length[T]
4.    U[T[j]] = U[T[j]] + 1 // count keys
5.  q = 1
6.  for j = 1 to s          // rewrite T
7.    while U[j] > 0
8.      T[q] = j
9.      U[j] = U[j] - 1
10.     q = q + 1
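A runnable C version of Pigeonhole-Sort (1-based arrays as in the text; the fixed-size count array and the two counters, tracking the line-9 decrements and the while-condition checks used as barometer operations, are our additions for illustration):

```c
#include <assert.h>

/* Pigeonhole sort of T[1..n] with keys in 1..s. Also counts the
   U[j] = U[j] - 1 decrements (n total) and the while-condition
   evaluations (n + s total). */
static void pigeonhole_sort(int T[], int n, int s,
                            long *decrements, long *while_checks) {
    int U[64] = {0};                   /* assumes s < 64 for this sketch */
    *decrements = 0;
    *while_checks = 0;
    for (int j = 1; j <= n; j++)       /* count the keys */
        U[T[j]]++;
    int q = 1;
    for (int j = 1; j <= s; j++) {     /* rewrite T in sorted order */
        while ((*while_checks)++, U[j] > 0) {
            T[q] = j;
            U[j]--;
            (*decrements)++;
            q++;
        }
    }
}
```

On the text's example (T = 2 1 2 4 3 1 2, s = 4) it produces 1 1 2 2 2 3 4, with 7 decrements and 11 condition checks.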

Pseudocode in the text: the body of a loop is indicated by indentation; "for i = 1 to n" is equivalent to "for (i = 1; i <= n; i++)"; arrays are assumed to start at index 1.

Count: q  1 for j  1 to s //rewrite T while U[j] > 0 T[q] = j U[ j ]  U[ j ]-1 q  q+1 Barometer operation – in line 9 n (not n + s)

Count: using the loop condition of the while (line 7) as the barometer operation, Count6(7) = n + s, since the condition evaluates to true n times (once per key written back) and to false s times (once per value of j). Is this algorithm efficient? It runs in time proportional to n + s, so it is very efficient when the key range s is not much larger than n.