1 Sorting Algorithms (Part II)
Overview
- Divide and Conquer Sorting Methods.
- Merge Sort and its Implementation.
- Brief Analysis of Merge Sort.
- Quick Sort and its Implementation.
- Brief Analysis of Quick Sort.
- Preview: Searching Algorithms.

2 Sorting Algorithms (Part II)
Divide & Conquer Sorting Methods
- In the last lecture, we studied two sorting methods, both of which are quadratic; that is, they are said to be of order n².
- An interesting question to ask is: can we have a linear sorting method, one involving order n comparisons and order n data movements?
- The answer to this question is generally no. However, we can have something in between.
- The two methods we consider in this lecture, namely merge sort and quick sort, are of order n log₂ n.
- Both methods take an approach called divide and conquer, usually implemented using recursion.
- In this approach, the array is repeatedly divided into two until the simplest sub-divisions (containing one element) are obtained. These sub-divisions, which are automatically sorted, are then combined to form larger sorted parts until the entire sorted array is obtained.
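As a warm-up, the following is a minimal sketch of the divide-and-conquer pattern applied to a simpler problem, finding the maximum of an array. The DivideAndConquerDemo class and its names are purely illustrative and are not part of the course code; merge sort and quick sort follow the same split / solve / combine shape.

// Illustrative sketch only: divide and conquer on a simpler problem (maximum),
// showing the split / solve / combine pattern used by merge sort and quick sort.
public class DivideAndConquerDemo {
    // Returns the largest value in a[from..to] (inclusive).
    public static int max(int[] a, int from, int to) {
        if (from == to) return a[from];       // base case: a single element
        int mid = (from + to) / 2;            // divide the range in half
        int leftMax = max(a, from, mid);      // conquer the first half
        int rightMax = max(a, mid + 1, to);   // conquer the second half
        return Math.max(leftMax, rightMax);   // combine the two answers
    }

    public static void main(String[] args) {
        int[] a = {7, 3, 9, 1, 5};
        System.out.println(max(a, 0, a.length - 1)); // prints 9
    }
}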

3 Sorting Algorithms (Part II)
Merge Sort: An Implementation
- Merge sort is sometimes called EasySplit/HardJoin, because the main work is in the merging part.
- Its algorithm consists of the following steps:
  1. Split the list into two equal (or nearly equal) sub-lists, since smaller lists are easier to sort.
  2. Repeat the process on the sub-lists (recursively) until all the sub-lists are of size 1, which means they are already sorted.
  3. Unwind the recursion by merging the sub-lists to form larger sorted lists. At the end, the original list will have been sorted.
- The diagram on the next slide illustrates merge sort.

4 Sorting Algorithms (Part II)
Merge Sort: An Implementation (Cont'd)
- The following diagram (not reproduced in this transcript) illustrates the merge sort algorithm.

5 Sorting Algorithms (Part II)
Merge Sort: An Implementation (Cont'd)

6 Sorting Algorithms (Part II)
Merge Sort: An Implementation (Cont'd)

public class MergeSort {

    // Merges the two sorted ranges a[from..mid] and a[mid+1..to] into sorted order.
    public static void merge(int[] a, int from, int mid, int to) {
        int n = to - from + 1;   // size of the range to be merged
        int[] b = new int[n];    // temporary array to hold the merged result
        int i1 = from;           // next element to consider in the first half
        int i2 = mid + 1;        // next element to consider in the second half
        int j = 0;               // next open position in b

        // While neither half is exhausted, copy the smaller front element into b.
        while (i1 <= mid && i2 <= to) {
            if (a[i1] < a[i2]) {
                b[j] = a[i1];
                i1++;
            } else {
                b[j] = a[i2];
                i2++;
            }
            j++;
        }

        // Copy any remaining elements of the first half.
        while (i1 <= mid) {
            b[j] = a[i1];
            i1++;
            j++;
        }

        // Copy any remaining elements of the second half.
        while (i2 <= to) {
            b[j] = a[i2];
            i2++;
            j++;
        }

        // Copy the merged result back into the original array.
        for (j = 0; j < n; j++)
            a[from + j] = b[j];
    }

    // Recursively sorts the range a[from..to].
    public static void mergeSort(int[] a, int from, int to) {
        if (from == to) return;        // a single element is already sorted
        int mid = (from + to) / 2;
        mergeSort(a, from, mid);       // sort the first half
        mergeSort(a, mid + 1, to);     // sort the second half
        merge(a, from, mid, to);       // merge the two sorted halves
    }

    public static void sort(int[] a) {
        mergeSort(a, 0, a.length - 1);
    }
}
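To try the class out, a small driver like the following can be used; the MergeSortDemo class name and the array values are illustrative only and not part of the slides.

// Example driver (arbitrary values), assuming the MergeSort class above is available.
public class MergeSortDemo {
    public static void main(String[] args) {
        int[] a = {38, 27, 43, 3, 9, 82, 10};
        MergeSort.sort(a);             // sorts the array in place
        System.out.println(java.util.Arrays.toString(a));
        // Expected output: [3, 9, 10, 27, 38, 43, 82]
    }
}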

7 Sorting Algorithms (Part II)
Brief Analysis of Merge Sort
- First we notice that the main work is being done by the merge() method; this is where both the comparisons and the data movements take place.
- The number of comparisons in the merge() method depends on the number of elements in the sub-lists and their ordering. However, since all the elements must be moved to the temporary array and then moved back to the sub-list, the number of moves is twice the size of the sub-list.
- At the top level, for example, at most n key comparisons are made, together with 2n data movements.
- As we go down the recursive levels, the size reduces by half each time, but the number of recursive calls increases by the same factor, so the overall number of comparisons is at most n at each level, as shown by the following diagram (not reproduced in this transcript).

8 Sorting Algorithms (Part II)
Brief Analysis of Merge Sort (Cont'd)
- Since there are about log₂ n levels of recursion, each doing at most n comparisons, the complexity of merge sort is O(n log n). A recurrence relation is used to compute the complexity of merge sort formally [ICS 353 course].
- One disadvantage of merge sort is that a separate array of the same size as the original is required when merging the sub-lists. This takes extra space and computer time.
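For reference, the recurrence mentioned above can be sketched as follows; this is the standard textbook form with constants suppressed, written in LaTeX notation, and is not taken verbatim from the slides.

% Merge sort recurrence: two half-size subproblems plus a linear-time merge.
T(n) = 2\,T(n/2) + cn, \qquad T(1) = c
% Unrolling the recursion gives \log_2 n levels, each contributing cn work:
T(n) = cn\,\log_2 n + cn \in O(n \log n)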

9 Sorting Algorithms (Part II)
Quick Sort: An Implementation
- Quick sort is another divide-and-conquer algorithm. It spends more time in the partitioning step than merge sort, so it is sometimes called HardSplit/EasyJoin.
- To do the partitioning, quick sort first selects an element called the pivot and conceptually divides the list into two sub-lists with respect to the pivot: the first sub-list consists of all elements less than or equal to the pivot, and the second consists of all elements greater than or equal to the pivot.
- These two sub-lists are then sorted using the same idea. By the time the sub-lists reduce to single elements, the list will have been sorted.
- The partitioning is achieved by using two variables, left and right, which are initially set to the first and last index and are allowed to move towards each other.
- The left variable is allowed to increase until it reaches an element greater than or equal to the pivot.
- Similarly, the right variable is allowed to decrease until it reaches an element less than or equal to the pivot.
- Provided the two variables have not crossed, the elements they point to are swapped; after swapping, left is increased and right is decreased by 1. This process continues until the variables cross each other, at which stage the partition has been achieved.
- The pivot could be any element, but for simplicity we take the middle element.

10 Sorting Algorithms (Part II)
Quick Sort: An Implementation (Cont'd)
- The following set of diagrams (not reproduced in this transcript) shows how quick sort works on a sample list.
- First we choose a pivot, the middle element = 55.
- left moves and stops at 81, since 81 > 55, while right cannot move, since 23 < 55.
- At this point, the two elements are swapped and the variables are updated: left++, right--.
- Next, left moves and stops at 55, while right remains at 17. After swapping, left++ and right--, we get the configuration shown on the next slide.

11 Sorting Algorithms (Part II)
Quick Sort: An Implementation (Cont'd)
- Next, left remains at 65 and right moves and stops at 17. Since the two variables have crossed each other, we do not swap; instead we have the configuration shown in the diagram.
- Because the variables have crossed, the first partitioning process terminates, with the two parts as shown in the diagram.
- The process is then repeated with each of the sub-partitions.

12 Sorting Algorithms (Part II)
Quick Sort: An Implementation (Cont'd)

public class QuickSort {

    // Partitions a[left..right] around the given pivot value and returns the
    // last index of the left partition.
    public static int partition(int[] a, int left, int right, int pivot) {
        do {
            while (a[left] < pivot) left++;     // stop at an element >= pivot
            while (a[right] > pivot) right--;   // stop at an element <= pivot
            if (left < right) {                 // left did not cross right
                ArrayUtil.swap(a, left, right); // course helper: exchanges a[left] and a[right]
                left++;
                right--;
            } else if (left == right) {
                left++;
            }
        } while (left <= right);
        return right;
    }

    // Recursively sorts the range a[from..to].
    public static void quickSort(int[] a, int from, int to) {
        if (to <= from) return;                 // ranges of size 0 or 1 are already sorted
        int left = from, right = to;
        int pivot = a[(from + to) / 2];         // choose the middle element as pivot
        int newRight = partition(a, left, right, pivot);
        int newLeft = newRight + 1;
        if (from < newRight) quickSort(a, from, newRight);
        if (newLeft < to) quickSort(a, newLeft, to);
    }

    public static void sort(int[] a) {
        quickSort(a, 0, a.length - 1);
    }
}
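The partition() method above calls ArrayUtil.swap, a helper from the course library. To try the class on its own, a minimal stand-in and driver such as the following can be used; the stand-in class, the QuickSortDemo name and the sample values are assumptions for illustration only.

// Minimal stand-in for the course's ArrayUtil helper so the example compiles
// by itself; the real course class may contain additional utilities.
class ArrayUtil {
    public static void swap(int[] a, int i, int j) {
        int temp = a[i];
        a[i] = a[j];
        a[j] = temp;
    }
}

// Example driver with arbitrary values, assuming the QuickSort class above.
public class QuickSortDemo {
    public static void main(String[] args) {
        int[] a = {65, 55, 81, 23, 17, 42};
        QuickSort.sort(a);                     // sorts the array in place
        System.out.println(java.util.Arrays.toString(a));
        // Expected output: [17, 23, 42, 55, 65, 81]
    }
}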

13 Sorting Algorithms (Part II)
Brief Analysis of Quick Sort
- The complexity of quick sort is O(n log n) on average (the worst case is O(n²), but it rarely occurs with a good choice of pivot).
- Again, most of the work is done by the partition() method, which does both the comparisons and the data movements.
- The number of comparisons depends on the size of the sub-list being considered and, like merge sort, it is at most n for each level of recursion.
- However, the number of data movements depends not only on the size of the sub-list, but also on the choice of the pivot and the relative ordering of the keys. It is at worst equal to the size of the list (at most n) but can be considerably less.
- The next question is: how many levels of recursion are involved? This again depends on the choice of pivot. A good choice of pivot divides the list into two nearly equal sub-lists, giving about log₂ n levels, whereas a bad choice can lead to n levels. However, in practice, because quick sort performs fewer data movements, it is much faster than merge sort.
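As a hedged sketch of the two extremes (standard textbook analysis, not spelled out on the slide), written in LaTeX notation with constants suppressed:

% Balanced partitions (good pivots): the same recurrence as merge sort.
T(n) = 2\,T(n/2) + cn \in O(n \log n)
% Fully unbalanced partitions (worst-case pivots): one sub-list of size n-1 each time.
T(n) = T(n-1) + cn \in O(n^2)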