Application: Efficiency of Algorithms II

Application: Efficiency of Algorithms II Lecture 45 Section 9.5 Tue, Apr 24, 2007

Algorithm Analysis We will analyze the merge sort algorithm.

The Merge Sort Algorithm The Merge Sort Algorithm sorts a list of numbers by repeatedly merging longer and longer sublists. The initial sublists are of size 1. The final sublist is the entire list.

Example 88 16 2 43 57 67 79 34 5 71 62 69 49 90 29 65

Example
88 16 2 43 57 67 79 34 5 71 62 69 49 90 29 65
88 16 2 43 57 67 79 34 5 71 62 69 49 90 29 65        (sublists of size 1)
16 88  2 43  57 67  34 79  5 71  62 69  49 90  29 65  (after merging into sublists of size 2)
2 16 43 88  34 57 67 79  5 62 69 71  29 49 65 90      (size 4)
2 16 34 43 57 67 79 88  5 29 49 62 65 69 71 90        (size 8)
2 5 16 29 34 43 49 57 62 65 67 69 71 79 88 90         (sorted)
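The picture above is a bottom-up view of merge sort: each row merges adjacent sorted runs of width 1, 2, 4, and so on. A sketch of that view in C (the function names here are ours, not from the lecture):

```c
#include <stdlib.h>
#include <string.h>

/* Merge each pair of adjacent sorted runs of width w from src into dst. */
void merge_pass(const int *src, int *dst, int n, int w)
{
    for (int lo = 0; lo < n; lo += 2 * w) {
        int mid = lo + w < n ? lo + w : n;
        int hi  = lo + 2 * w < n ? lo + 2 * w : n;
        int i = lo, j = mid, k = lo;
        while (i < mid && j < hi)
            dst[k++] = src[i] < src[j] ? src[i++] : src[j++];
        while (i < mid) dst[k++] = src[i++];   /* leftovers from the first run */
        while (j < hi)  dst[k++] = src[j++];   /* leftovers from the second run */
    }
}

/* Repeatedly double the run width until the whole array is one sorted run. */
void bottom_up_merge_sort(int *a, int n)
{
    int *tmp = malloc(n * sizeof *tmp);
    for (int w = 1; w < n; w *= 2) {
        merge_pass(a, tmp, n, w);
        memcpy(a, tmp, n * sizeof *a);
    }
    free(tmp);
}
```

Each call to merge_pass produces one row of the example.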

Analyzing the Merge Sort We will use two functions: MergeSort() and Merge(). To analyze the algorithm, we will count the number of comparisons required to sort a list of length n.

The MergeSort() Function The MergeSort() function is recursive.

void MergeSort(int a[], int low, int high)
{
    if (low < high)
    {
        int mid = (low + high) / 2;
        MergeSort(a, low, mid);
        MergeSort(a, mid + 1, high);
        Merge(a, low, mid, high);
    }
    return;
}

The MergeSort() Function The initial call, MergeSort(a, size), would be to a non-recursive “starter” function, where MergeSort() is

void MergeSort(int a[], int size)
{
    MergeSort(a, 0, size - 1);
    return;
}

The Merge() Function The real action takes place in the Merge() function. This function makes a single pass down each of the two lists to be merged, comparing elements of the first list to elements of the second list. The smaller element is copied into the new list.

The Merge() Function

void Merge(int a[], int low, int mid, int high)
{
    int b[high - low + 1];
    int i = low;
    int j = mid + 1;
    int k = 0;
    while (i <= mid && j <= high)
    {
        if (a[i] < a[j])
            b[k++] = a[i++];
        else
            b[k++] = a[j++];
    }
    :

The Merge() Function

    :
    while (i <= mid)
        b[k++] = a[i++];
    while (j <= high)
        b[k++] = a[j++];
    j = low;
    for (i = 0; i < high - low + 1; i++)
        a[j++] = b[i];
    return;
}
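Putting the pieces together, a complete, runnable version of the algorithm might look like this. This is a sketch, not the lecture's exact code: the temporary buffer is heap-allocated instead of a variable-length array, and since C does not allow two functions with the same name, the recursive helper is renamed merge_sort_range() here.

```c
#include <stdlib.h>

/* Merge the sorted runs a[low..mid] and a[mid+1..high]. */
static void merge(int a[], int low, int mid, int high)
{
    int n = high - low + 1;
    int *b = malloc(n * sizeof *b);
    int i = low, j = mid + 1, k = 0;
    while (i <= mid && j <= high)           /* at most n - 1 comparisons */
        b[k++] = (a[i] < a[j]) ? a[i++] : a[j++];
    while (i <= mid)  b[k++] = a[i++];      /* copy any leftovers */
    while (j <= high) b[k++] = a[j++];
    for (k = 0; k < n; k++)                 /* copy back into a[] */
        a[low + k] = b[k];
    free(b);
}

/* Recursively sort a[low..high]. */
static void merge_sort_range(int a[], int low, int high)
{
    if (low < high) {
        int mid = (low + high) / 2;
        merge_sort_range(a, low, mid);
        merge_sort_range(a, mid + 1, high);
        merge(a, low, mid, high);
    }
}

/* Non-recursive "starter" function. */
void merge_sort(int a[], int size)
{
    merge_sort_range(a, 0, size - 1);
}
```

Calling merge_sort() on the 16-number list from the example produces the sorted row shown there.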

Analysis of Merge() What is the growth rate of Merge()? Inspect the loops. The length of each loop is proportional to the length of the lists. Therefore, the run-time of Merge() is Θ(n), where n is the combined length of the two lists. In fact, the number of comparisons is ≤ n − 1.
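The comparison bound is easy to check empirically. The sketch below (the function name is ours) is the same merge with a counter added; for a combined length n it never performs more than n − 1 comparisons, because each comparison moves one element and the loop stops once either run is exhausted:

```c
#include <stdlib.h>

/* Merge sorted runs a[low..mid] and a[mid+1..high], returning the
 * number of element comparisons performed. */
int merge_count(int a[], int low, int mid, int high)
{
    int n = high - low + 1;
    int *b = malloc(n * sizeof *b);
    int i = low, j = mid + 1, k = 0, comparisons = 0;
    while (i <= mid && j <= high) {
        comparisons++;                      /* one comparison per loop pass */
        b[k++] = (a[i] < a[j]) ? a[i++] : a[j++];
    }
    while (i <= mid)  b[k++] = a[i++];      /* no comparisons needed here */
    while (j <= high) b[k++] = a[j++];
    for (k = 0; k < n; k++)
        a[low + k] = b[k];
    free(b);
    return comparisons;                     /* always <= n - 1 */
}
```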

Analysis of MergeSort() Now we can analyze MergeSort(). Let c(n) be the number of comparisons needed by MergeSort() for a list of length n. Then c(n) ≤ c(⌊n/2⌋) + c(⌈n/2⌉) + (n − 1). Furthermore, c(1) = 0.

Analysis of MergeSort() Assume the worst case: c(n) = c(⌊n/2⌋) + c(⌈n/2⌉) + (n − 1). Compute: c(1) = 0. c(2) = 2c(1) + 1 = 1. c(3) = c(1) + c(2) + 2 = 3. c(4) = 2c(2) + 3 = 5.

Analysis of MergeSort() c(5) = c(2) + c(3) + 4 = 8. c(6) = 2c(3) + 5 = 11. c(7) = c(3) + c(4) + 6 = 14. c(8) = 2c(4) + 7 = 17. c(9) = c(4) + c(5) + 8 = 21. And so on.

Analysis of MergeSort() Note that c(1) = 0 = (−1)·1 + 1. c(2) = 1 = 0·2 + 1. c(4) = 5 = 1·4 + 1. c(8) = 17 = 2·8 + 1. Does c(16) = 3·16 + 1 = 49? Does c(32) = 4·32 + 1 = 129?

Analysis of MergeSort() For n a power of 2, this generalizes as c(n) = (log₂ n − 1)·n + 1 = n log₂ n − n + 1.
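The guesses above are easy to check by evaluating the recurrence directly; a small sketch (the function name c is ours):

```c
/* Worst-case comparison count of merge sort, computed directly from
 * the recurrence c(n) = c(floor(n/2)) + c(ceil(n/2)) + (n - 1), c(1) = 0. */
long c(long n)
{
    if (n <= 1)
        return 0;
    /* n - n/2 is ceil(n/2) in integer arithmetic. */
    return c(n / 2) + c(n - n / 2) + (n - 1);
}
```

Evaluating gives c(16) = 49 and c(32) = 129, matching (log₂ n − 1)·n + 1 at those powers of 2.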

Analysis of MergeSort() Assume the worst case: c(n) = c(⌊n/2⌋) + c(⌈n/2⌉) + (n − 1). Show by induction that c(n) ≥ n log₂ n − n + 1. The base case is trivial. So suppose the inequality is true for all n from 1 to k − 1 for some k ≥ 2. Show that the inequality is true when n = k.

Analysis of MergeSort() Suppose k is even. Then
c(k) = 2c(k/2) + (k − 1)
     ≥ 2[(k/2) log₂(k/2) − (k/2) + 1] + (k − 1)
     = k(log₂ k − 1) − k + 2 + k − 1
     = k log₂ k − k + 1.

Analysis of MergeSort() Suppose k is odd. Then we need the following lemma. Lemma: If k > 1, then (k − 1) log₂(k − 1) + (k + 1) log₂(k + 1) > 2k log₂ k. Proof: Use calculus.

Analysis of MergeSort() Then
c(k) = c((k − 1)/2) + c((k + 1)/2) + (k − 1)
     ≥ [(k − 1)/2] log₂((k − 1)/2) − (k − 1)/2 + 1 + [(k + 1)/2] log₂((k + 1)/2) − (k + 1)/2 + 1 + (k − 1)
     = ½[(k − 1) log₂(k − 1) + (k + 1) log₂(k + 1)] − k + 1
     > k log₂ k − k + 1, by the lemma.

Analysis of MergeSort() In both cases, c(k) ≥ k log₂ k − k + 1, so the inequality holds for n = k, completing the induction.

Analysis of MergeSort() Thus, c(n) is Ω(n log₂ n). Now we will show that c(n) is O(n log₂ n). It will follow that c(n) is Θ(n log₂ n).

Analysis of MergeSort() Show by induction that c(n) ≤ 2n log₂ n. The base case is trivial. Suppose that the inequality is true for all n from 1 to k − 1 for some k ≥ 2. Show that it is true when n = k.

Analysis of MergeSort() Suppose k is even. Then
c(k) = 2c(k/2) + (k − 1)
     ≤ 2·2(k/2) log₂(k/2) + (k − 1)
     = 2k(log₂ k − 1) + k − 1
     = 2k log₂ k − k − 1
     ≤ 2k log₂ k.

Analysis of MergeSort() Suppose k is odd. Then
c(k) = c((k − 1)/2) + c((k + 1)/2) + (k − 1)
     ≤ (k − 1) log₂((k − 1)/2) + (k + 1) log₂((k + 1)/2) + (k − 1)
     = (k − 1) log₂(k − 1) + (k + 1) log₂(k + 1) − k − 1
     ≤ (k − 1) log₂ k + (k + 1)(log₂ k + 1) − k − 1, since log₂(k + 1) ≤ log₂(2k)
     = 2k log₂ k.

Analysis of MergeSort() In both cases, c(k) ≤ 2k log₂ k, so the inequality holds for n = k, completing the induction.

Analysis of MergeSort() Therefore, the worst case of the merge sort is Θ(n log₂ n).

Analysis of MergeSort() Suppose that MergeSort() sorts a list of 100 numbers in 1 s. How long will it take to sort a list of one thousand numbers? How long will it take to sort a list of one million numbers? How long will it take to sort a list of one billion numbers?