Copyright (C) Gal Kaminka 2003
Data Structures and Algorithms
Sorting II: Divide and Conquer Sorting
Gal A. Kaminka, Computer Science Department

Last week: in-place sorting
- Bubble Sort – O(n^2) comparisons; O(n) comparisons in the best case, O(n^2) exchanges
- Selection Sort – O(n^2) comparisons, even in the best case; O(n) exchanges (always)
- Insertion Sort – O(n^2) comparisons; O(n) comparisons in the best case; fewer exchanges than bubble sort; best in practice for small lists (<30)
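As a refresher, the insertion sort reviewed above can be sketched in Python (the function name is mine, not from the slides):

```python
def insertion_sort(a):
    """In-place insertion sort: O(n^2) comparisons, O(n) when nearly sorted."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger items one slot right until key's position is found
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```

Note the best case: on an already-sorted list the inner loop never runs, giving the O(n) behavior cited above.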

This week
- Mergesort: O(n log n) always; O(n) extra storage
- Quicksort: O(n log n) average, O(n^2) worst case; good in practice (lists >30); O(log n) storage

MergeSort
A divide-and-conquer technique:
- Each unsorted collection is split into 2, then again, then again... until we have collections of size 1
- Now we merge sorted collections, then again, then again... until we merge the two halves

MergeSort(array a, indexes low, high)
1. if (low < high)
2.   middle ← (low + high) / 2
3.   MergeSort(a, low, middle)       // split 1
4.   MergeSort(a, middle+1, high)    // split 2
5.   Merge(a, low, middle, high)     // merge 1+2

Merge(array a, indexes low, mid, high)
1. b ← empty array, tl ← low, t ← mid+1, i ← low
2. while (tl <= mid AND t <= high)
3.   if (a[tl] <= a[t])
4.     b[i] ← a[tl]
5.     i ← i+1, tl ← tl+1
6.   else
7.     b[i] ← a[t]
8.     i ← i+1, t ← t+1
9. if tl <= mid, copy a[tl…mid] into b[i…]
10. else if t <= high, copy a[t…high] into b[i…]
11. copy b[low…high] onto a[low…high]
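A direct Python transcription of the two routines above might look like this (a sketch; the index conventions follow the pseudocode, with high inclusive):

```python
def merge(a, low, mid, high):
    # Merge the sorted runs a[low..mid] and a[mid+1..high] via a temp array b
    b, tl, t = [], low, mid + 1
    while tl <= mid and t <= high:
        if a[tl] <= a[t]:
            b.append(a[tl]); tl += 1
        else:
            b.append(a[t]); t += 1
    b.extend(a[tl:mid + 1])    # leftover left run, if any
    b.extend(a[t:high + 1])    # leftover right run, if any
    a[low:high + 1] = b        # copy b back onto a[low..high]

def merge_sort(a, low, high):
    if low < high:
        middle = (low + high) // 2
        merge_sort(a, low, middle)        # split 1
        merge_sort(a, middle + 1, high)   # split 2
        merge(a, low, middle, high)       # merge 1+2
```

The comparison on a[tl] <= a[t] (rather than <) keeps the sort stable: on ties, the left run's item is copied first.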

An example: the initial array is repeatedly split in half, then the halves are merged back in sorted order. (The concrete values appeared only in the slide's figure.)

The complexity of MergeSort
Every split, we halve the collection. How many times can this be done? We are looking for x such that 2^x = n, i.e., x = log2 n. So there are a total of log n splits.
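One way to see the log2 n count concretely (a throwaway sketch, not from the slides):

```python
n = 1024
size, splits = n, 0
while size > 1:
    size //= 2      # each split halves the collection
    splits += 1
print(splits)       # 10, i.e. log2(1024)
```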

The complexity of MergeSort (continued)
What is the run-time of each merge step?
- First merge step: n/2 merges of size-2 runs → n work
- Second merge step: n/4 merges of size-4 runs → n work
- Third merge step: n/8 merges of size-8 runs → n work
- ...
How many merge steps? Same as splits: log n.
Total: n log n steps.

Storage complexity of MergeSort
Every merge, we need temporary storage to hold the merged array, which is the same size as the two collections together. To merge the last two sub-arrays (each of size n/2) we need n/2 + n/2 = n temporary cells.
Total: O(n) storage.

MergeSort summary
- O(n log n) runtime (best and worst)
- O(n) storage (not in-place)
- Very naturally done using recursion (but note: it can be done without recursion!)
- In practice, it can be improved by combining with insertion sort: split down to sub-arrays of size 20-30, insertion-sort those, then merge.
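The insertion-sort hybrid mentioned above can be sketched like this (the cutoff of 24 and all names are my own choices, not from the slides):

```python
CUTOFF = 24  # assumed threshold, in the 20-30 range suggested above

def insertion_sort_range(a, low, high):
    # In-place insertion sort of a[low..high] (inclusive)
    for i in range(low + 1, high + 1):
        key, j = a[i], i - 1
        while j >= low and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def merge(a, low, mid, high):
    # Merge sorted runs a[low..mid] and a[mid+1..high]
    b, tl, t = [], low, mid + 1
    while tl <= mid and t <= high:
        if a[tl] <= a[t]:
            b.append(a[tl]); tl += 1
        else:
            b.append(a[t]); t += 1
    b.extend(a[tl:mid + 1]); b.extend(a[t:high + 1])
    a[low:high + 1] = b

def hybrid_merge_sort(a, low, high):
    if high - low + 1 <= CUTOFF:
        insertion_sort_range(a, low, high)  # small run: insertion sort wins
        return
    mid = (low + high) // 2
    hybrid_merge_sort(a, low, mid)
    hybrid_merge_sort(a, mid + 1, high)
    merge(a, low, mid, high)
```

The asymptotic cost is unchanged; the win is constant-factor, since insertion sort has very low overhead on tiny, cache-resident runs.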

QuickSort
Key idea:
- Select an item (called the pivot)
- Put it into its proper FINAL position
- Make sure all greater items are on one side (side 1), and all smaller items are on the other side (side 2)
- Repeat for side 1; repeat for side 2

Short example: let's select 25 as our initial pivot. We move items so that all items left of 25 are smaller, and all items right of 25 are larger. As a result, 25 is now in its final position. (The concrete array appeared only in the slide's figure.)

Now, repeat (recursively) for the left and right sides. In the slide's example the left side holds only 12, so it needs no sorting. For the other side, we repeat the process: select a pivot item (let's take 57), and move items around so that the items to its left are smaller, etc.

The array changes accordingly, and now we repeat the process for the left side, and for the right side. (Again, the arrays themselves appeared only in the slides' figures.)

QuickSort(array a; indexes low, hi)
1. if (low >= hi)
2.   return      // a[low..hi] is sorted
3. pivot ← FindPivot(a, low, hi)
4. p_index ← Partition(a, low, hi, pivot)
5. QuickSort(a, low, p_index-1)
6. QuickSort(a, p_index+1, hi)

Key questions
How do we select an item (FindPivot())?
- If we always select the largest item as the pivot, then this process becomes Selection Sort, which is O(n^2). That offsets the benefit of partitioning.
- So this works well only if we select items "in the middle", since then we will have log n divisions.
How do we move items around efficiently (Partition())?

FindPivot
Finding the real median (middle item) takes O(n). In practice, however, we want this to be O(1), so we approximate:
- Take the first item (a[low]) as the pivot, or
- Take the median of {a[low], a[hi], a[(low+hi)/2]}

FindPivot(array a; indexes low, hi)
1. return a[low]
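The median-of-three option might be sketched as follows (a hypothetical helper returning the pivot's index rather than its value; the name is mine):

```python
def find_pivot_index(a, low, hi):
    # Median of the first, middle, and last items:
    # an O(1) approximation of the true median.
    mid = (low + hi) // 2
    trio = sorted([(a[low], low), (a[mid], mid), (a[hi], hi)])
    return trio[1][1]   # index of the median value
```

Returning an index (not a value) lets QuickSort swap the chosen pivot into place before partitioning.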

Partition (in O(n))
Key idea: keep two indexes into the array:
- down scans up from the left, stopping at the lowest item > pivot (out of place on the left side)
- up scans down from the right, stopping at the highest item <= pivot (out of place on the right side)
We move up and down through the array; whenever they point at such an inconsistent pair, interchange the two items. At the end, up and down meet at the pivot's final location.

Partition(array a; indexes low, hi; pivot; index pivot_i)
1. down ← low, up ← hi
2. while (down < up)
3.   while (a[down] <= pivot AND down < hi)
4.     down ← down + 1
5.   while (a[up] > pivot)
6.     up ← up – 1
7.   if (down < up)
8.     swap(a[down], a[up])
9. a[pivot_i] ← a[up]
10. a[up] ← pivot
11. return up
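In Python, with the pivot sitting at a[low] (so pivot_i = low, matching the simple FindPivot above), the routine might read (a sketch following the pseudocode's scanning scheme):

```python
def partition(a, low, hi):
    # Assumes the pivot is already at a[low] (pivot_i = low)
    pivot = a[low]
    down, up = low, hi
    while down < up:
        while a[down] <= pivot and down < hi:   # line 3: scan right
            down += 1
        while a[up] > pivot:                    # line 5: scan left
            up -= 1
        if down < up:                           # inconsistent pair: swap
            a[down], a[up] = a[up], a[down]
    a[low], a[up] = a[up], a[low]               # put pivot in its final place
    return up
```

After the call, everything in a[low..up-1] is <= pivot and everything in a[up+1..hi] is > pivot, so up is the pivot's final index.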

Example: Partition() with pivot = 25
(The array contents and the positions of down and up at each step appeared in the slides' figures; only the narration survives.)

First pass through the loop on line 2:
- We go into the loop on line 3 (while a[down] <= pivot), moving down to the right.
- We go into the loop on line 5 (while a[up] > pivot), moving up to the left, and again.
- Now we found an inconsistency! So we swap a[down] with a[up].

Second pass through the loop on line 2:
- Move down again (increasing) – loop on line 3.
- Now we move up again (decreasing) – loop on line 5, and again.
- down < up? No. So we don't swap. Instead, we are done: just put the pivot in place (swap it with a[up] – for us, a[low] was the pivot).
- Now we return 2 as the new pivot index.

Notes
We need the initial pivot_index in Partition(). For instance, change FindPivot() to return the pivot (a[low]) as well as the initial pivot_index (low), then use pivot_index in the final swap.
QuickSort: average O(n log n), worst case O(n^2). It works very well in practice (collections >30). Space requirements: O(log n) – for the recursion.
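Putting the pieces together, a complete sketch (median-of-three pivot swapped to the front so the final swap in Partition works; all names are mine, not the slides'):

```python
def partition(a, low, hi):
    pivot = a[low]                  # pivot_i = low after the swap in quicksort
    down, up = low, hi
    while down < up:
        while a[down] <= pivot and down < hi:
            down += 1
        while a[up] > pivot:
            up -= 1
        if down < up:
            a[down], a[up] = a[up], a[down]
    a[low], a[up] = a[up], a[low]   # final swap: pivot into its place
    return up

def quicksort(a, low, hi):
    if low >= hi:
        return                      # a[low..hi] is sorted
    mid = (low + hi) // 2
    # Median-of-three pivot choice, moved to the front of the range
    p_i = sorted([(a[low], low), (a[mid], mid), (a[hi], hi)])[1][1]
    a[low], a[p_i] = a[p_i], a[low]
    p = partition(a, low, hi)
    quicksort(a, low, p - 1)
    quicksort(a, p + 1, hi)
```

With the median-of-three choice, already-sorted and reverse-sorted inputs no longer trigger the O(n^2) worst case, though adversarial inputs still can.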