Analysis of Recursive Algorithms

Analysis of Recursive Algorithms
What is a recurrence relation?
Forming Recurrence Relations
Solving Recurrence Relations
Analysis of Recursive Factorial
Analysis of Recursive Selection Sort
Analysis of Recursive Binary Search
Analysis of Recursive Towers of Hanoi
Analysis of Recursive Fibonacci
Simplified Master Theorem

What is a recurrence relation?

A recurrence relation, T(n), is a recursive function of an integer variable n. Like all recursive functions, it has one or more recursive cases and one or more base cases.

Example (for instance): T(0) = a; T(n) = b + c·T(n - 1) for n > 0.

The portion of the definition that does not contain T is called the base case of the recurrence relation; the portion that contains T is called the recurrent or recursive case.

Recurrence relations are useful for expressing the running times (i.e., the number of basic operations executed) of recursive algorithms.

The specific values of the constants such as a, b, and c (in the above recurrence) are important in determining the exact solution to the recurrence. Often, however, we are only concerned with finding an asymptotic upper bound on the solution. We call such a bound an asymptotic solution to the recurrence.

Forming Recurrence Relations

For a given recursive method, the base case and the recursive case of its recurrence relation correspond directly to the base case and the recursive case of the method.

Example 1: Write the recurrence relation for the following method:

public void f (int n) {
    if (n > 0) {
        System.out.println(n);
        f(n - 1);
    }
}

The base case is reached when n == 0. The method performs one comparison. Thus, the number of operations when n == 0, T(0), is some constant a.

When n > 0, the method performs two basic operations and then calls itself, using ONE recursive call, with a parameter n - 1.

Therefore the recurrence relation is:
T(0) = a          for some constant a
T(n) = b + T(n - 1)   for some constant b

In general, T(n) is usually a sum of various choices of T(m), the cost of the recursive subproblems, plus the cost of the work done outside the recursive calls:
T(n) = aT(f(n)) + bT(g(n)) + ... + c(n)
where a and b are the number of subproblems, f(n) and g(n) are the subproblem sizes, and c(n) is the cost of the work done outside the recursive calls. [Note: c(n) may be a constant.]

Forming Recurrence Relations (Cont’d)

Example 2: Write the recurrence relation for the following method:

public int g(int n) {
    if (n == 1)
        return 2;
    else
        return 3 * g(n / 2) + g(n / 2) + 5;
}

The base case is reached when n == 1. The method performs one comparison and one return statement. Therefore, T(1) is some constant c.

When n > 1, the method performs TWO recursive calls, each with the parameter n / 2, and some constant number of basic operations.

Hence, the recurrence relation is:
T(1) = c              for some constant c
T(n) = b + 2T(n / 2)  for some constant b

Forming Recurrence Relations (Cont’d)

Example 3: Write the recurrence relation for the following method:

long fibonacci (int n) {
    // Recursively calculates Fibonacci number
    if (n == 1 || n == 2)
        return 1;
    else
        return fibonacci(n - 1) + fibonacci(n - 2);
}

The base case is reached when n == 1 or n == 2. The method performs one comparison and one return statement. Therefore each of T(1) and T(2) is some constant c.

When n > 2, the method performs TWO recursive calls, one with the parameter n - 1, another with the parameter n - 2, and some constant number of basic operations.

Hence, the recurrence relation is:
T(n) = c                          if n = 1 or n = 2
T(n) = T(n - 1) + T(n - 2) + b    if n > 2

Forming Recurrence Relations (Cont’d)

Example 4: Write the recurrence relation for the following method:

long power (long x, long n) {
    if (n == 0)
        return 1;
    else if (n == 1)
        return x;
    else if ((n % 2) == 0)
        return power (x, n/2) * power (x, n/2);
    else
        return x * power (x, n/2) * power (x, n/2);
}

The base case is reached when n == 0 or n == 1. The method performs one comparison and one return statement. Therefore T(0) and T(1) are some constant c.

At every step the problem size reduces to half, and the method calls itself TWICE, each time with the parameter n / 2. When the power is an odd number, an additional multiplication is involved. To work out the time complexity, consider the worst case, i.e., assume that at every step the additional multiplication is needed. Thus the total number of operations T(n) reduces to the operations for the two subproblems of size n / 2, that is 2T(n / 2), plus a constant number of additional basic operations.

Hence, the recurrence relation is:
T(n) = c              if n = 0 or n = 1
T(n) = 2T(n / 2) + b  if n > 1
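A brief follow-up sketch (written for these notes, not part of the original slides; the name fastPower is an assumption): if the half power is computed once and stored, only ONE recursive call of size n / 2 remains, so the recurrence becomes T(n) = T(n / 2) + b, which solves to O(log n), whereas 2T(n / 2) + b solves to O(n).

// Sketch (assumed helper, not from the original slides): storing the half
// power removes one of the two recursive calls, so T(n) = T(n / 2) + b.
long fastPower (long x, long n) {
    if (n == 0)
        return 1;
    else if (n == 1)
        return x;
    long half = fastPower(x, n / 2);   // ONE recursive call of size n / 2
    if ((n % 2) == 0)
        return half * half;
    else
        return x * half * half;
}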

Solving Recurrence Relations

To solve a recurrence relation T(n) we need to derive a form of T(n) that is not a recurrence relation. Such a form is called a closed form of the recurrence relation.

There are five methods to solve recurrence relations that represent the running time of recursive methods:
1. Iteration method (unrolling and summing)
2. Substitution method (guess the solution and verify by induction)
3. Recursion tree method
4. Master theorem (Master method)
5. Using generating functions or characteristic equations

In this course, we will use the Iteration method and a simplified Master theorem.

Solving Recurrence Relations - Iteration method

Steps:
1. Expand the recurrence.
2. Express the expansion as a summation by plugging the recurrence back into itself until you see a pattern.
3. Evaluate the summation.

In evaluating the summation one or more of the following summation formulae may be used:

Arithmetic series: 1 + 2 + 3 + ... + n = n(n + 1) / 2

Geometric series: 1 + x + x^2 + ... + x^n = (x^(n+1) - 1) / (x - 1), for x ≠ 1

Special cases of the geometric series: 1 + 2 + 2^2 + ... + 2^n = 2^(n+1) - 1, and 1 + 1/2 + 1/4 + ... + 1/2^n = 2 - 1/2^n < 2

Solving Recurrence Relations - Iteration method (Cont’d)

Harmonic series: 1 + 1/2 + 1/3 + ... + 1/n ≈ ln n, which is Θ(log n)

Others (for example): log 1 + log 2 + ... + log n = log(n!), which is Θ(n log n)

Analysis Of Recursive Factorial method

Example 1: Form and solve the recurrence relation for the running time of the factorial method and hence determine its big-O complexity:

long factorial (int n) {
    if (n == 0)
        return 1;
    else
        return n * factorial (n - 1);
}

T(0) = c                       (1)
T(n) = b + T(n - 1)            (2)
     = b + b + T(n - 2)              by substituting T(n - 1) in (2)
     = b + b + b + T(n - 3)          by substituting T(n - 2) in (2)
     ...
     = kb + T(n - k)

The base case is reached when n - k = 0, i.e., k = n; we then have:
T(n) = nb + T(n - n) = bn + T(0) = bn + c

Therefore the method factorial is O(n).
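As a quick sanity check (a sketch written for these notes, not part of the original slides; the class and counter names are assumptions), counting the recursive calls confirms the linear growth predicted by T(n) = bn + c:

public class FactorialCount {
    static long calls = 0;                       // counts invocations of factorial

    static long factorial (int n) {
        calls++;                                 // one call = one unit of constant work
        if (n == 0)
            return 1;
        else
            return n * factorial (n - 1);
    }

    public static void main(String[] args) {
        for (int n = 5; n <= 20; n += 5) {
            calls = 0;
            factorial(n);
            System.out.println("n = " + n + ", calls = " + calls);  // calls = n + 1
        }
    }
}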

Analysis Of Recursive Selection Sort

public static void selectionSort(int[] x) {
    selectionSort(x, x.length - 1);
}

private static void selectionSort(int[] x, int n) {
    int minPos;
    if (n > 0) {
        minPos = findMinPos(x, n);
        swap(x, minPos, n);
        selectionSort(x, n - 1);
    }
}

private static int findMinPos (int[] x, int n) {
    int k = n;
    for (int i = 0; i < n; i++)
        if (x[i] < x[k]) k = i;
    return k;
}

private static void swap(int[] x, int minPos, int n) {
    int temp = x[n];
    x[n] = x[minPos];
    x[minPos] = temp;
}

Analysis Of Recursive Selection Sort (Cont’d)

findMinPos is O(n), and swap is O(1), therefore the recurrence relation for the running time of the selectionSort method is:

T(0) = a                                   (1)
T(n) = T(n - 1) + n + c      if n > 0      (2)
     = [T(n - 2) + (n - 1) + c] + n + c = T(n - 2) + (n - 1) + n + 2c                       by substituting T(n - 1) in (2)
     = [T(n - 3) + (n - 2) + c] + (n - 1) + n + 2c = T(n - 3) + (n - 2) + (n - 1) + n + 3c  by substituting T(n - 2) in (2)
     = T(n - 4) + (n - 3) + (n - 2) + (n - 1) + n + 4c
     = ...
     = T(n - k) + (n - k + 1) + (n - k + 2) + ... + n + kc

The base case is reached when n - k = 0, i.e., k = n; we then have:
T(n) = T(0) + 1 + 2 + ... + n + nc = a + n(n + 1)/2 + cn

Therefore, Recursive Selection Sort is O(n^2).

Analysis Of Recursive Binary Search

public int binarySearch (int target, int[] array, int low, int high) {
    if (low > high)
        return -1;
    else {
        int middle = (low + high) / 2;
        if (array[middle] == target)
            return middle;
        else if (array[middle] < target)
            return binarySearch(target, array, middle + 1, high);
        else
            return binarySearch(target, array, low, middle - 1);
    }
}

The recurrence relation for the running time of the method is:
T(1) = a              if n = 1 (one-element array)
T(n) = T(n / 2) + b   if n > 1
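A minimal usage sketch (the surrounding class and main method are assumptions, not part of the original slides); note the array must already be sorted, and the initial call covers the whole array:

public class BinarySearchDemo {
    public static void main(String[] args) {
        int[] array = {2, 3, 4, 7, 9};                // must already be sorted
        BinarySearchDemo d = new BinarySearchDemo();
        System.out.println(d.binarySearch(7, array, 0, array.length - 1));  // prints 3
        System.out.println(d.binarySearch(5, array, 0, array.length - 1));  // prints -1
    }

    // The method as shown above, repeated here so the sketch is self-contained.
    public int binarySearch (int target, int[] array, int low, int high) {
        if (low > high) return -1;
        int middle = (low + high) / 2;
        if (array[middle] == target) return middle;
        else if (array[middle] < target) return binarySearch(target, array, middle + 1, high);
        else return binarySearch(target, array, low, middle - 1);
    }
}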

Analysis Of Recursive Binary Search (Cont’d)

Without loss of generality, assume n, the problem size, is a power of 2, i.e., n = 2^k.

Expanding:
T(1) = a                          (1)
T(n) = T(n / 2) + b               (2)
     = [T(n / 2^2) + b] + b = T(n / 2^2) + 2b       by substituting T(n / 2) in (2)
     = [T(n / 2^3) + b] + 2b = T(n / 2^3) + 3b      by substituting T(n / 2^2) in (2)
     = ...
     = T(n / 2^k) + kb

The base case is reached when n / 2^k = 1, i.e., n = 2^k, i.e., k = log_2 n; we then have:
T(n) = T(1) + b log_2 n = a + b log_2 n

Therefore, Recursive Binary Search is O(log n).

Analysis Of Recursive Towers of Hanoi Algorithm

The recurrence relation for the running time of the method hanoi is:
T(n) = a                if n = 1
T(n) = 2T(n - 1) + b    if n > 1

public static void hanoi(int n, char from, char to, char temp) {
    if (n == 1)
        System.out.println(from + " --------> " + to);
    else {
        hanoi(n - 1, from, temp, to);
        System.out.println(from + " --------> " + to);
        hanoi(n - 1, temp, to, from);
    }
}
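A minimal usage sketch (the class and main method are assumptions, not part of the original slides): moving 3 disks from peg A to peg C using B as the temporary peg prints 2^3 - 1 = 7 moves, matching the O(2^n) growth derived next.

public class HanoiDemo {
    public static void main(String[] args) {
        hanoi(3, 'A', 'C', 'B');   // prints the 7 moves for 3 disks
    }

    // The method as shown above, repeated here so the sketch is self-contained.
    public static void hanoi(int n, char from, char to, char temp) {
        if (n == 1)
            System.out.println(from + " --------> " + to);
        else {
            hanoi(n - 1, from, temp, to);
            System.out.println(from + " --------> " + to);
            hanoi(n - 1, temp, to, from);
        }
    }
}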

Analysis Of Recursive Towers of Hanoi Algorithm (Cont’d)

Expanding:
T(1) = a                                   (1)
T(n) = 2T(n - 1) + b     if n > 1          (2)
     = 2[2T(n - 2) + b] + b = 2^2 T(n - 2) + 2b + b                                        by substituting T(n - 1) in (2)
     = 2^2 [2T(n - 3) + b] + 2b + b = 2^3 T(n - 3) + 2^2 b + 2b + b                        by substituting T(n - 2) in (2)
     = 2^3 [2T(n - 4) + b] + 2^2 b + 2b + b = 2^4 T(n - 4) + 2^3 b + 2^2 b + 2^1 b + 2^0 b by substituting T(n - 3) in (2)
     = ...
     = 2^k T(n - k) + b[2^(k-1) + 2^(k-2) + ... + 2^1 + 2^0]

The base case is reached when n - k = 1, i.e., k = n - 1; we then have:
T(n) = 2^(n-1) T(1) + b[2^(n-1) - 1] = 2^(n-1) a + 2^(n-1) b - b = (a + b) 2^(n-1) - b

Therefore, the method hanoi is O(2^n).

Analysis Of Recursive Fibonacci

long fibonacci (int n) {
    // Recursively calculates Fibonacci number
    if (n == 1 || n == 2)
        return 1;
    else
        return fibonacci(n - 1) + fibonacci(n - 2);
}

T(n) = c                           if n = 1 or n = 2    (1)
T(n) = T(n - 1) + T(n - 2) + b     if n > 2             (2)

We determine a lower bound on T(n). Expanding:
T(n) = T(n - 1) + T(n - 2) + b
     ≥ T(n - 2) + T(n - 2) + b = 2T(n - 2) + b
     = 2[T(n - 3) + T(n - 4) + b] + b                          by substituting T(n - 2) in (2)
     ≥ 2[T(n - 4) + T(n - 4) + b] + b = 2^2 T(n - 4) + 2b + b
     = 2^2 [T(n - 5) + T(n - 6) + b] + 2b + b                  by substituting T(n - 4) in (2)
     ≥ 2^3 T(n - 6) + (2^2 + 2^1 + 2^0)b
     ...
     ≥ 2^k T(n - 2k) + (2^(k-1) + 2^(k-2) + ... + 2^1 + 2^0)b = 2^k T(n - 2k) + (2^k - 1)b

The base case is reached when n - 2k = 2, i.e., k = (n - 2) / 2. Hence:
T(n) ≥ 2^((n-2)/2) T(2) + [2^((n-2)/2) - 1]b = (b + c) 2^((n-2)/2) - b = [(b + c) / 2] · 2^(n/2) - b

Therefore, Recursive Fibonacci is exponential (its running time is Ω(2^(n/2))).
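As a contrast (a sketch written for these notes, not part of the original slides; the class, array, and method names are assumptions), storing already computed values removes the repeated work: each value is computed once, the recurrence becomes roughly T(n) = T(n - 1) + c, and the running time drops to O(n).

public class FibonacciMemoDemo {
    // memo[i] holds fibonacci(i) once it has been computed (0 = not yet computed)
    static long[] memo;

    static long fibonacciMemo(int n) {
        if (n == 1 || n == 2) return 1;
        if (memo[n] != 0) return memo[n];        // reuse the stored value
        memo[n] = fibonacciMemo(n - 1) + fibonacciMemo(n - 2);
        return memo[n];
    }

    public static void main(String[] args) {
        int n = 50;
        memo = new long[n + 1];
        System.out.println(fibonacciMemo(n));    // 12586269025
    }
}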

Master Theorem (Master Method)

The master method provides an estimate of the growth rate of the solution for recurrences of the form:
T(n) = aT(n / b) + f(n)
where a ≥ 1, b > 1 and the overhead function f(n) > 0.

If T(n) is interpreted as the number of steps needed to execute an algorithm for an input of size n, this recurrence corresponds to a “divide and conquer” algorithm, in which a problem of size n is divided into a sub-problems of size n / b, where a and b are positive constants.

Divide-and-conquer algorithm:
• divide the problem into a number of subproblems
• conquer the subproblems (solve them)
• combine the subproblem solutions to get the solution to the original problem

Example: Merge Sort (see the sketch after this slide)
• divide the n-element sequence to be sorted into two n/2-element sequences
• conquer the subproblems recursively using merge sort
• combine the resulting two sorted n/2-element sequences by merging
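The following merge sort sketch (an illustration written for these notes, not code from the original slides) makes the divide/conquer/combine steps and the resulting recurrence T(n) = 2T(n / 2) + cn concrete:

import java.util.Arrays;

public class MergeSortSketch {
    // Sorts a[low..high]; each call divides the range in half (two subproblems
    // of size n/2) and then merges in O(n) time, so T(n) = 2T(n/2) + cn.
    static void mergeSort(int[] a, int low, int high) {
        if (low >= high) return;                 // base case: 0 or 1 element
        int mid = (low + high) / 2;
        mergeSort(a, low, mid);                  // conquer the left half
        mergeSort(a, mid + 1, high);             // conquer the right half
        merge(a, low, mid, high);                // combine in O(n)
    }

    // Merges the two sorted ranges a[low..mid] and a[mid+1..high].
    static void merge(int[] a, int low, int mid, int high) {
        int[] tmp = new int[high - low + 1];
        int i = low, j = mid + 1, k = 0;
        while (i <= mid && j <= high)
            tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        while (i <= mid)  tmp[k++] = a[i++];
        while (j <= high) tmp[k++] = a[j++];
        System.arraycopy(tmp, 0, a, low, tmp.length);
    }

    public static void main(String[] args) {
        int[] a = {7, 2, 9, 4, 3, 8, 6, 1};
        mergeSort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a));  // [1, 2, 3, 4, 6, 7, 8, 9]
    }
}

With a = 2, b = 2 and f(n) = cn, the master method gives T(n) is O(n log n) for this recurrence.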

Simplified Master Theorem

The Simplified Master Method for Solving Recurrences: Consider recurrences of the form:
T(1) = 1
T(n) = aT(n / b) + k·n^c + h
for constants a ≥ 1, b > 1, c ≥ 0, k ≥ 1, and h ≥ 0. Then:
T(n) is O(n^(log_b a))    if a > b^c
T(n) is O(n^c log n)      if a = b^c
T(n) is O(n^c)            if a < b^c

Note: Since k and h do not affect the result, they are sometimes not included in the above recurrence.
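For instance (a worked check added here, consistent with the earlier iteration-method analysis), the binary search recurrence T(n) = T(n / 2) + b has a = 1, b = 2, c = 0, so a = b^c (the second case) and the theorem gives T(n) is O(n^0 log n) = O(log n). By contrast, the Towers of Hanoi recurrence T(n) = 2T(n - 1) + b is not of the form aT(n / b) + k·n^c + h, so the simplified master theorem does not apply to it.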

Simplified Master Theorem (Cont’d)

Example 1: Find the big-Oh running time of the following recurrence. Use the Master Theorem:
T(1) = 1
T(n) = 3T(n / 4) + k·n^(1/2) + h
Solution: a = 3, b = 4, c = 1/2, so a > b^c (Case 1). Hence T(n) is O(n^(log_4 3)).

Example 2: Find the big-Oh running time of the following recurrence. Use the Master Theorem:
T(1) = 1
T(n) = 2T(n / 2) + n
Solution: a = 2, b = 2, c = 1, so a = b^c (Case 2). Hence T(n) is O(n log n).

Example 3: Find the big-Oh running time of the following recurrence. Use the Master Theorem:
T(1) = 1
T(n) = 4T(n / 2) + k·n^3 + h, where k ≥ 1 and h ≥ 1
Solution: a = 4, b = 2, c = 3, so a < b^c (Case 3). Hence T(n) is O(n^3).