CS 46101 Section 600 CS 56101 Section 002 Dr. Angela Guercio Spring 2010.


 Sorting
◦ Insertion Sort
◦ Complexity
 Θ(n²) in the worst case and in the average case (input sorted in the opposite direction and partly sorted, respectively)
 Θ(n) in the best case (input already sorted)
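As a reminder of where those bounds come from, here is a minimal insertion sort sketch (Python, not taken from the slides): the inner while loop does almost no work on already-sorted input and Θ(n) work per element on reverse-sorted input.

def insertion_sort(a):
    # Sort list a in place by growing a sorted prefix a[0..j-1].
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:   # shift larger elements one slot right
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key                 # drop key into its final position
    return a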

 To sort A[p..r]:
◦ Divide by splitting into two subarrays A[p..q] and A[q+1..r], where q is the halfway point of A[p..r].
◦ Conquer by recursively sorting the two subarrays A[p..q] and A[q+1..r].
◦ Combine by merging the two sorted subarrays A[p..q] and A[q+1..r] to produce a single sorted subarray A[p..r]. To accomplish this step, we’ll define a procedure MERGE(A, p, q, r).
◦ The recursion bottoms out when the subarray has just 1 element, so that it’s trivially sorted.
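A minimal sketch of that recursion (Python, 0-based inclusive indices rather than the slides’ A[p..r] convention; merge is sketched after the MERGE discussion below):

def merge_sort(a, p, r):
    # Sort a[p..r] in place (0-based, inclusive bounds).
    if p < r:                      # more than one element?
        q = (p + r) // 2           # divide: q is the halfway point
        merge_sort(a, p, q)        # conquer: sort the left half
        merge_sort(a, q + 1, r)    # conquer: sort the right half
        merge(a, p, q, r)          # combine: merge the two sorted halves

Calling merge_sort(A, 0, len(A) - 1) sorts the whole array A.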

 Bottom-up view for n = 8

 Bottom-up view for n = 11

 What remains is the MERGE procedure.
 Input: Array A and indices p, q, r such that
◦ p ≤ q < r.
◦ Subarray A[p..q] is sorted and subarray A[q+1..r] is sorted. By the restrictions on p, q, r, neither subarray is empty.
 Output: The two subarrays are merged into a single sorted subarray in A[p..r].
 We implement it so that it takes Θ(n) time, where n = r - p + 1 = the number of elements being merged.
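One common way to implement MERGE, sketched in Python (0-based inclusive bounds, without the textbook’s sentinel values; the cost is the same Θ(n)):

def merge(a, p, q, r):
    # Merge sorted subarrays a[p..q] and a[q+1..r] into a single sorted a[p..r].
    left = a[p:q + 1]          # copy out the two sorted halves: Θ(n) total
    right = a[q + 1:r + 1]
    i = j = 0
    for k in range(p, r + 1):  # one pass over a[p..r]: Θ(n)
        if j >= len(right) or (i < len(left) and left[i] <= right[j]):
            a[k] = left[i]     # take from the left half (ties go left: stable)
            i += 1
        else:
            a[k] = right[j]    # take from the right half
            j += 1

With this in place, merge_sort(A, 0, len(A) - 1) from the earlier sketch sorts A.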

 Running time of MERGE
◦ The first two for loops take Θ(n₁ + n₂) = Θ(n) time, where n₁ and n₂ are the sizes of the two subarrays. The last for loop makes n iterations, each taking constant time, for Θ(n) time.
◦ Total time: Θ(n).

 Use a recurrence equation (more commonly, a recurrence) to describe the running time of a divide-and-conquer algorithm.
 Let T(n) = running time on a problem of size n.
◦ If the problem size is small enough (say, n ≤ c for some constant c), we have a base case. The brute-force solution takes constant time: Θ(1).
◦ Otherwise, suppose that we divide into a subproblems, each 1/b the size of the original. (In merge sort, a = b = 2.)

◦ Let the time to divide a size-n problem be D(n).
◦ Have a subproblems to solve, each of size n/b ⇒ each subproblem takes T(n/b) time to solve ⇒ we spend aT(n/b) time solving subproblems.
◦ Let the time to combine solutions be C(n).
◦ We get the recurrence
T(n) = Θ(1) if n ≤ c,
T(n) = aT(n/b) + D(n) + C(n) otherwise.

 For simplicity, assume that n is a power of 2 ⇒ each divide step yields two subproblems, both of size exactly n/2.
 The base case occurs when n = 1.
 When n ≥ 2, time for merge sort steps:
◦ Divide: Just compute q as the average of p and r ⇒ D(n) = Θ(1).
◦ Conquer: Recursively solve 2 subproblems, each of size n/2 ⇒ 2T(n/2).
◦ Combine: MERGE on an n-element subarray takes Θ(n) time ⇒ C(n) = Θ(n).

 Since D(n) = Θ(1) and C(n) = Θ(n), summed together they give a function that is linear in n: Θ(n) ⇒ the recurrence for the merge sort running time is
T(n) = Θ(1) if n = 1,
T(n) = 2T(n/2) + Θ(n) if n > 1.

 By the master theorem in Chapter 4, we can show that this recurrence has the solution T(n) = Θ(n lg n).
 Compared to insertion sort (Θ(n²) worst-case time), merge sort is faster.
 On small inputs, insertion sort may be faster.
 We can understand how to solve the merge-sort recurrence without the master theorem.

 Let c be a constant that describes the running time for the base case and also is the time per array element for the divide and conquer steps.
 We rewrite the recurrence as
T(n) = c if n = 1,
T(n) = 2T(n/2) + cn if n > 1.

 Draw a recursion tree, which shows successive expansions of the recurrence.
 For the original problem, we have a cost of cn, plus the two subproblems, each costing T(n/2):

 For each of the size-n/2 subproblems, we have a cost of cn/2, plus two subproblems, each costing T(n/4):

 Continue expanding until the problem sizes get down to 1:
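Every level of that expansion contributes the same total cost. A hypothetical snippet (Python, not the slide’s figure) that tabulates the levels for n = 8 and c = 1; each level sums to cn = 8, and there are lg n + 1 = 4 levels:

n, c = 8, 1
size, level = n, 0
while size >= 1:
    nodes = n // size                      # 2^level subproblems of this size
    print(level, nodes, nodes * c * size)  # level, node count, total level cost
    size //= 2
    level += 1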

 Each level has cost cn.
◦ The top level has cost cn.
◦ The next level down has 2 subproblems, each contributing cost cn/2.
◦ …
 There are lg n + 1 levels (height is lg n).
◦ Use induction.
◦ Base case: n = 1 ⇒ 1 level, and lg 1 + 1 = 0 + 1 = 1.
◦ Inductive hypothesis is that a tree for a problem size of 2^i has lg 2^i + 1 = i + 1 levels.

◦ Because we assume that the problem size is a power of 2, the next problem size up after 2^i is 2^(i+1).
◦ A tree for a problem size of 2^(i+1) has one more level than the size-2^i tree ⇒ i + 2 levels.
◦ Since lg 2^(i+1) + 1 = i + 2, we’re done with the inductive argument.
◦ Total cost is the sum of the costs at each level. Have lg n + 1 levels, each costing cn ⇒ total cost is cn lg n + cn.
◦ Ignore the low-order term of cn and the constant coefficient c ⇒ Θ(n lg n).
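As a quick sanity check (a hypothetical snippet, not part of the slides), one can evaluate the rewritten recurrence directly and compare it with cn lg n + cn; for powers of 2 the two agree exactly.

import math

def T(n, c=1):
    # T(n) = c if n = 1, else 2*T(n/2) + c*n, for n a power of 2.
    return c if n == 1 else 2 * T(n // 2, c) + c * n

for n in [2, 8, 64, 1024]:
    closed_form = n * math.log2(n) + n   # cn lg n + cn with c = 1
    print(n, T(n), closed_form)          # e.g. n = 8 gives 32 and 32.0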

 The computation:

 Growth of functions  Reading: ◦ Read Chapter 3