Lecture 7. Solution by Substitution Method


Lecture 7

Solution by Substitution Method

T(n) = 2T(n/2) + n

Substitute n/2 into the main equation:
2T(n/2) = 2(2T(n/4) + n/2) = 4T(n/4) + n
and so T(n) = 4T(n/4) + 2n

Substituting n/4 again:
4T(n/4) = 4(2T(n/8) + n/4) = 8T(n/8) + n
and so T(n) = 8T(n/8) + 3n

Continuing in this manner, we obtain
T(n) = 2^k T(n/2^k) + k·n

Using k = lg n (so that 2^k = n):
T(n) = nT(1) + n lg n = n lg n + n
T(n) = O(n lg n)
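As a quick sanity check (not part of the original slides), the sketch below evaluates the recurrence directly for powers of two, taking T(1) = 1 as in the derivation, and compares it with the closed form n lg n + n; the function names are illustrative.

```cpp
#include <cmath>
#include <cstdio>

// T(n) = 2*T(n/2) + n with T(1) = 1, evaluated for n a power of two.
long long T(long long n) {
    if (n == 1) return 1;
    return 2 * T(n / 2) + n;
}

int main() {
    // Compare the recurrence with the closed form n*lg(n) + n.
    for (long long n = 2; n <= (1 << 20); n <<= 1) {
        double closed = n * std::log2(static_cast<double>(n)) + n;
        std::printf("n = %8lld   T(n) = %10lld   n lg n + n = %10.0f\n",
                    n, T(n), closed);
    }
    return 0;
}
```

The two columns agree exactly for every power of two, which is what the derivation predicts.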

3 Overview Divide and Conquer Merge Sort Quick Sort

4 Divide: Pick any element p as the pivot, e.g., the first element. Partition the remaining elements into FirstPart, which contains all elements < p, and SecondPart, which contains all elements ≥ p. Recursively sort FirstPart and SecondPart. Combine: no work is necessary, since the sorting is done in place.

5 Quick Sort (diagram): the array A is partitioned around a pivot p into FirstPart, containing the elements x < p, and SecondPart, containing the elements p ≤ x; recursive calls then sort FirstPart and SecondPart, leaving the whole array sorted.

6 Quick Sort

Quick-Sort(A, left, right)
  if left ≥ right
    return
  else
    middle ← Partition(A, left, right)
    Quick-Sort(A, left, middle−1)
    Quick-Sort(A, middle+1, right)
  end if

7 Partition (diagram): starting from A with the pivot p = A[left], the elements are rearranged so that all elements x < p end up before p and all elements p ≤ x end up after it.

8–20 Partition Example: a step-by-step trace of Partition on an eight-element array A with pivot 4. The scan index j moves left to right while i marks the boundary of the elements smaller than the pivot; after the loop, the pivot 4 is swapped into its correct position, with all elements x < 4 to its left and all elements 4 ≤ x to its right.

21 Partition(A, left, right)
  x ← A[left]
  i ← left
  for j ← left+1 to right
    if A[j] < x then
      i ← i + 1
      swap(A[i], A[j])
    end if
  end for
  swap(A[i], A[left])
  return i

n = right − left + 1
Time: cn for some constant c
Space: constant (in place)
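The pseudocode above translates almost line for line into C++. The sketch below is an illustration rather than the lecture's own code, and the array in main is an arbitrary example, not the one from the slides.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Partition A[left..right] (inclusive) around the pivot x = A[left].
// During the scan, A[left+1..i] < x and A[i+1..j-1] >= x; the final
// swap moves the pivot into its correct position at index i.
int partition(std::vector<int>& A, int left, int right) {
    int x = A[left];              // pivot
    int i = left;
    for (int j = left + 1; j <= right; ++j) {
        if (A[j] < x) {
            ++i;
            std::swap(A[i], A[j]);
        }
    }
    std::swap(A[i], A[left]);     // place the pivot
    return i;
}

void quickSort(std::vector<int>& A, int left, int right) {
    if (left >= right) return;                // base case: 0 or 1 element
    int middle = partition(A, left, right);   // pivot now at index 'middle'
    quickSort(A, left, middle - 1);
    quickSort(A, middle + 1, right);
}

int main() {
    std::vector<int> A = {4, 8, 6, 3, 5, 1, 7, 2};
    quickSort(A, 0, static_cast<int>(A.size()) - 1);
    for (int v : A) std::printf("%d ", v);    // prints: 1 2 3 4 5 6 7 8
    std::printf("\n");
    return 0;
}
```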

22–34 Quick-Sort(A, 0, 7) example: the sequence of recursive calls is

Quick-Sort(A, 0, 7): partition
Quick-Sort(A, 0, 2): partition
Quick-Sort(A, 0, 0): base case, return
Quick-Sort(A, 1, 1): base case, return
Quick-Sort(A, 2, 2): return; Quick-Sort(A, 0, 2): return
Quick-Sort(A, 4, 7): partition
Quick-Sort(A, 5, 7): partition
Quick-Sort(A, 6, 7): partition
Quick-Sort(A, 7, 7): base case, return
Quick-Sort(A, 6, 7): return
Quick-Sort(A, 5, 7): return
Quick-Sort(A, 4, 7): return
Quick-Sort(A, 0, 7): done!

35 Quick-Sort: Best Case (even partition). If every partition splits the array evenly, the subproblem sizes are n, n/2, n/4, …, and the recurrence is the one solved earlier: T(n) = 2T(n/2) + cn. The work at the top level is cn; at the next level 2 × cn/2 = cn; at the next 4 × cn/4 = cn; and so on. There are about log n levels, each contributing cn, so the total time is Θ(n log n).

36 Quick-Sort: Worst Case (unbalanced partition). If the pivot is always the smallest (or largest) remaining element, the subproblem sizes are n, n−1, n−2, …, 2, and the work per level is cn, c(n−1), c(n−2), …, 3c, 2c. The total is c(n + (n−1) + … + 2) = Θ(n²). This happens, for example, when the input is already sorted or reverse sorted.

37 Quick-Sort: an Average Case. Suppose every split is 1/10 : 9/10, i.e., T(n) = T(n/10) + T(9n/10) + cn. Each level of the recursion tree still costs at most cn (the subproblem sizes are n; then 0.1n and 0.9n; then 0.01n, 0.09n, 0.09n, 0.81n; and so on). The shallowest branch reaches size 1 after about log_10(n) levels and the deepest after about log_{10/9}(n) levels, both of which are Θ(log n), so the total time is still Θ(n log n).

38 Quick-Sort Summary
Time: most of the work is done in partitioning; the average case takes Θ(n log n) time and the worst case takes Θ(n²) time.
Space: sorts in place, i.e., does not require additional space.

Recursion Recursion is more than just a programming technique. It has two other uses in computer science and software engineering: as a way of describing, defining, or specifying things, and as a way of designing solutions to problems (divide and conquer).

In general, we can define the factorial function in the following way:
factorial(n) = 1 if n = 0
factorial(n) = n x (n-1) x (n-2) x … x 2 x 1 if n > 0

Iterative Definition This is an iterative definition of the factorial function. It is iterative because the definition contains only the algorithm's parameter n and not the algorithm (the factorial function) itself. This will be easier to see after defining the recursive version.

Recursive Definition We can also define the factorial function in the following way:
factorial(n) = 1 if n = 0
factorial(n) = n x factorial(n-1) if n > 0

Iterative vs. Recursive

Iterative (the function does NOT call itself):
factorial(n) = 1 if n = 0
factorial(n) = n x (n-1) x (n-2) x … x 2 x 1 if n > 0

Recursive (the function calls itself):
factorial(n) = 1 if n = 0
factorial(n) = n x factorial(n-1) if n > 0

Recursion To see how the recursion works, let's break down the factorial function to solve factorial(3).

Breakdown Here, we see that we start at the top level, factorial(3), and simplify the problem into 3 x factorial(2). Now, we have a slightly less complicated problem in factorial(2), and we simplify this problem into 2 x factorial(1).

Breakdown We continue this process until we are able to reach a problem that has a known solution. In this case, that known solution is factorial(0) = 1. The functions then return in reverse order to complete the solution.
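Written out in full, the breakdown of factorial(3) looks like this:

factorial(3) = 3 x factorial(2)
             = 3 x (2 x factorial(1))
             = 3 x (2 x (1 x factorial(0)))
             = 3 x (2 x (1 x 1))
             = 6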

Breakdown This known solution is called the base case. Every recursive algorithm must have a base case to simplify to. Otherwise, the algorithm would run forever (or until the computer ran out of memory).

Breakdown The other parts of the algorithm, excluding the base case, are known as the general case. For example:
3 x factorial(2) → general case
2 x factorial(1) → general case
etc.

Iterative Algorithm

factorial(n) {
  i = 1
  factN = 1
  loop (i <= n)
    factN = factN * i
    i = i + 1
  end loop
  return factN
}

The iterative solution is very straightforward. We simply loop through all the integers between 1 and n and multiply them together.

Recursive Algorithm

factorial(n) {
  if (n = 0)
    return 1
  else
    return n * factorial(n-1)
  end if
}

Note how much simpler the code for the recursive version of the algorithm is compared with the iterative version: we have eliminated the loop and implemented the algorithm with a single 'if' statement.
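For concreteness, here is one possible C++ rendering of the two algorithms above; it is a sketch with illustrative names, and unsigned long long overflows for n > 20.

```cpp
#include <cstdio>

// Iterative version: loop over 1..n and multiply.
unsigned long long factorialIterative(unsigned int n) {
    unsigned long long factN = 1;
    for (unsigned int i = 1; i <= n; ++i) {
        factN *= i;
    }
    return factN;
}

// Recursive version: base case n == 0, general case n * factorial(n - 1).
unsigned long long factorialRecursive(unsigned int n) {
    if (n == 0) return 1;
    return n * factorialRecursive(n - 1);
}

int main() {
    for (unsigned int n = 0; n <= 5; ++n) {
        std::printf("%u! = %llu (iterative) = %llu (recursive)\n",
                    n, factorialIterative(n), factorialRecursive(n));
    }
    return 0;
}
```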

How Recursion Works To truly understand how recursion works, we first need to explore how any function call works. When a program calls a subroutine (function), the current function must suspend its processing. The called function then takes over control of the program.

How Recursion Works When the function is finished, it needs to return to the function that called it. The calling function then 'wakes up' and continues processing. One important point in this interaction is that, unless changed through call-by-reference, all local data in the calling module must remain unchanged.

How Recursion Works Therefore, when a function is called, some information needs to be saved in order to return the calling module back to its original state (i.e., the state it was in before the call). We need to save information such as the local variables and the spot in the code to return to after the called function is finished.

How Recursion Works To do this we use a stack. Before a function is called, all relevant data is stored in a stackframe. This stackframe is then pushed onto the system stack. After the called function is finished, it simply pops the system stack to return to the original state.

How Recursion Works By using a stack, we can have functions call other functions which can call other functions, etc. Because the stack is a last-in, first-out (LIFO) data structure, as the stackframes are popped, the data comes out in the correct order.
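A small traced version of factorial makes this push/pop order visible; the extra depth parameter is only there to indent the output and is not part of the original algorithm.

```cpp
#include <cstdio>

// Traced factorial: prints when each call begins (its frame is pushed)
// and when it returns (its frame is popped), showing the last-in,
// first-out order of the stackframes.
int factorial(int n, int depth) {
    std::printf("%*scall   factorial(%d)\n", depth * 2, "", n);
    int result = (n == 0) ? 1 : n * factorial(n - 1, depth + 1);
    std::printf("%*sreturn factorial(%d) = %d\n", depth * 2, "", n, result);
    return result;
}

int main() {
    factorial(3, 0);   // calls nest down to factorial(0), then unwind
    return 0;
}
```

The innermost call, factorial(0), is the last frame pushed and the first one to return.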

Main disadvantage of programming recursively The main disadvantage of programming recursively is that, while it makes it easier to write simple and elegant programs, it also makes it easier to write inefficient ones. When we use recursion to solve problems, we are typically concerned mainly with correctness rather than efficiency. Consequently, our simple, elegant recursive algorithms may be inherently inefficient.

Limitations of Recursion Recursive solutions may involve extensive overhead because they use function calls. When a call is made, it takes time to build a stackframe and push it onto the system stack. Likewise, when a return is executed, the stackframe must be popped from the stack and the local variables reset to their previous values; this also takes time.

Limitations of Recursion In general, recursive algorithms run slower than their iterative counterparts. Also, every time we make a call, we must use some of the memory resources to make room for the stackframe.

Limitations of Recursion Therefore, if the recursion is deep, say, factorial(1000), we may run out of memory. Because of this, it is usually best to develop iterative algorithms when we are working with large numbers.

Recursion is based upon calling the same function over and over, whereas iteration simply 'jumps back' to the beginning of the loop. A function call is often more expensive than a jump.