Examples of Recursion Data Structures in Java with JUnit ©Rick Mercer.



Converting Decimal Numbers to other bases
 Problem: Convert a decimal (base 10) number into other bases

Method call       Return
convert(99, 2)    1100011
convert(99, 3)    10200
convert(99, 4)    1203
convert(99, 5)    344
convert(99, 6)    243
convert(99, 7)    201
convert(99, 8)    143
convert(99, 9)    120

Digits are multiplied by powers of the base: 10, 8, 2, or whatever
 First: converting from other bases to decimal
 Decimal numbers multiply digits by powers of 10: a four-digit number d3 d2 d1 d0 = d3 x 10^3 + d2 x 10^2 + d1 x 10^1 + d0 x 10^0
 Octal numbers use powers of 8: 143 in base 8 = 1 x 8^2 + 4 x 8^1 + 3 x 8^0 = 64 + 32 + 3 = 99 in base 10
 Binary numbers use powers of 2: 1101 in base 2 = 1 x 2^3 + 1 x 2^2 + 0 x 2^1 + 1 x 2^0 = 8 + 4 + 0 + 1 = 13 in base 10
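The base-to-decimal arithmetic above can be double-checked with Java's radix-aware parsing from the standard library (a small sketch; nothing here is beyond java.lang):

```java
public class BaseToDecimal {
    public static void main(String[] args) {
        // Binary: 1101 = 1*2^3 + 1*2^2 + 0*2^1 + 1*2^0 = 13
        System.out.println(Integer.parseInt("1101", 2)); // 13
        // Octal: 143 = 1*8^2 + 4*8^1 + 3*8^0 = 99
        System.out.println(Integer.parseInt("143", 8));  // 99
    }
}
```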

Converting base 10 to base 2
1) divide the number (5) by the new base (2), write remainder (1)
2) divide the quotient (2), write new remainder (0) to the left
3) divide the quotient (1), write new remainder (1) to the left

  2 ) 5   remainder = 1
  2 ) 2   remainder = 0
  2 ) 1   remainder = 1

Stop when the quotient is 0. Print the remainders in reverse order: 101

Converting base 10 to base 8
1) divide the number (99) by the new base (8), write remainder (3)
2) divide the quotient (12), write new remainder (4) to the left
3) divide the quotient (1), write new remainder (1) to the left

  8 ) 99  quotient = 12, remainder = 3
  8 ) 12  quotient = 1,  remainder = 4
  8 ) 1   quotient = 0,  remainder = 1

Stop when the quotient is 0. Print the remainders in reverse order: 143

Possible Solutions
 We could either
 store the remainders in an array and reverse it, or
 write out the remainders in reverse order
 have to postpone the output until we get quotient = 0
 store the result as a String (like a Recursion assignment)
 Iterative Algorithm
while the decimal number > 0 {
  Divide the decimal number by the new base
  Store the remainder to the left of any preceding remainders
  Set the decimal number = the quotient
}
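As a sketch, the iterative algorithm above might look like this in Java (the method name convertIterative, and returning "0" for an input of 0, are my own choices, not from the slides):

```java
public class Converter {
    // Iteratively build the result by storing each remainder
    // to the left of any preceding remainders.
    static String convertIterative(int num, int base) {
        if (num == 0)
            return "0";
        String result = "";
        while (num > 0) {
            int remainder = num % base;
            result = remainder + result; // prepend the new remainder
            num = num / base;            // continue with the quotient
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(convertIterative(99, 2)); // 1100011
        System.out.println(convertIterative(99, 8)); // 143
    }
}
```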

Recursive algorithm Base Case -- Recursive Case  Base case  if decimal number being converted = 0  do nothing (or return "")  Recursive case  if decimal number being converted > 0  solve a simpler version of the problem –use the quotient as the argument to the next call  store the current remainder (number % base) in the correct place

One solution
assertEquals("14", rf.convert(14, 10));
assertEquals("1100011", rf.convert(99, 2));
assertEquals("143", rf.convert(99, 8));
assertEquals("98", rf.convert(98, 10)); // 9*10+8

public String convert(int num, int base) {
  if (num == 0)
    return "";
  else
    return convert(num / base, base) + (num % base);
}

Hexadecimal, something we see as Computer Scientists
 Convert this algorithm to handle all bases up through hexadecimal (base 16)
 10 = A
 11 = B
 12 = C
 13 = D
 14 = E
 15 = F
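One way to approach this exercise, sketched here under the assumption that indexing into a digit string is acceptable (the class name, field name, and test values are mine):

```java
public class HexConverter {
    // Digit characters for bases 2 through 16.
    static final String DIGITS = "0123456789ABCDEF";

    // Same recursive shape as the slides' convert, but each
    // remainder is looked up in DIGITS so 10..15 become A..F.
    static String convert(int num, int base) {
        if (num == 0)
            return "";
        return convert(num / base, base) + DIGITS.charAt(num % base);
    }

    public static void main(String[] args) {
        System.out.println(convert(255, 16)); // FF
        System.out.println(convert(99, 16));  // 63
        System.out.println(convert(99, 2));   // 1100011
    }
}
```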

Quicksort: O(n log n) Sorting
 Quicksort was discovered by Tony Hoare in 1962
 Here is an outline of his famous algorithm:
 Pick one item from the array -- call it the pivot
 Partition the items in the array around the pivot so all elements to the left are <= the pivot and all elements to the right are greater than the pivot
 Use recursion to sort the two partitions

[snapshot: partition 1 (items <= pivot), pivot, partition of items > pivot]

Before and After
 Let's sort 14 integers
 Pick the leftmost one (27) as the pivot value
 The array before the call to partition(a, 0, n-1)
 The array looks like this after the first partition is done: items < pivot, the pivot, items > pivot

The partition method
 partition divvies up a around the split and returns the position of the split, an integer in the range 0..n-1
 The postcondition of partition: a[first]..a[split-1] <= a[split] && a[split+1]..a[last] > a[split]
 Notes:
 There may be more than 1 element equal to the pivot
 We put them in the left partition; it could have been the right

Recursive call to sort smaller part of the array
quickSort(a, split+1, last); // sort right
 QuickSort the right. At some point
 the pivot will be 53
 and the left portion will be sorted

[diagram: items < pivot, pivot, items > pivot; the left is already sorted, begin to sort the part to the right of split]

Complete the sort
// sort left and right around the new split
quickSort(a, first, split-1); // sort left
quickSort(a, split+1, last);  // sort right
Entire array is now sorted

Start Over (i == 1)
 Now let's back up and start with empty partitions
int partition(int a[], int first, int last) {
  int lastSmall = first;
  int i = first + 1; // Beginning of unknowns
 Compare all items from a[i]..a[last]:
 if a[i] > a[first] (the pivot), do nothing
 if a[i] <= a[first], swap a[lastSmall+1] with a[i]

[diagram: first, lastSmall, i; all items are unknown except first]

Result of the 1st loop iteration (i == 2)
 No changes were made in the array since a[i] <= a[first] was false
 So simply add 1 to i (i++) -- this creates 2 partitions
 Now partition 1 has one element (the pivot 27) and partition 2 has 1 element (41)

[diagram: first, lastSmall, i, unknown items; partition 1: all items <= pivot]

Result of the 2nd loop iteration (i == 3)
 A swap was made in the array since a[i] <= a[first] was true (14 < 27)
 a[i] (14) is swapped with a[lastSmall+1] (41)
 lastSmall gets incremented to point to the last element in partition 1
 i gets incremented to reference 56

Result of the 3rd loop iteration (i == 4)
 No swap was made in the array since a[i] <= a[first] was false
 lastSmall does NOT get incremented
 i gets incremented to reference 31

Result of the 4th loop iteration (i == 5)
 No swap was made in the array since a[i] <= a[first] was false
 lastSmall does NOT get incremented
 i gets incremented to reference 9

Result of the 5th loop iteration (i == 6)
 A swap was made in the array since a[i] <= a[first] was true (9 < 27)
 a[i] (9) is swapped with a[lastSmall+1] (41)
 lastSmall gets incremented to point to the last element in partition 1
 i++ points to the first unknown (22)

Iterations i == 7 through i == 13 continue the same way: each compares a[i] with the pivot, swapping a[i] into partition 1 whenever a[i] <= a[first]; the diagrams track first, lastSmall, i, and the shrinking region of unknown items.

Result of the 14th loop iteration (i == 14)
 After traversing the entire array with this loop (i > last), partition 1 holds all items <= pivot:
for (i = first + 1; i <= last; i++) {
  if (a[i] <= a[first]) {
    lastSmall++;
    swapElements(a, lastSmall, i);
  }
}

Post Loop Detail
 Now place the pivot where we expect it to be: in between the two partitions
swapElements(a, first, lastSmall);
 Then we can return lastSmall for the next call
return lastSmall;

[diagram: first, lastSmall (pivot position); partition 1: all items <= pivot]

quickSort is called like this: quickSort(a, 0, n-1)
void quickSort(int a[], int first, int last) {
  // precondition: a is an array to be sorted from
  // a[first]..a[last]
  if (first >= last)
    return; // Done: an empty or one-element array

  // The difficult algorithm is in partition
  int split = partition(a, first, last);

  // Recursively Quicksort left, then right
  quickSort(a, first, split-1); // sort left
  quickSort(a, split+1, last);  // sort right
  // post: the array a is sorted
}
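Assembling the partition loop, the post-loop detail, and quickSort from these slides gives a runnable sketch (the swapElements body and the sample array in main are my additions):

```java
import java.util.Arrays;

public class QuickSortDemo {
    static void swapElements(int[] a, int i, int j) {
        int temp = a[i];
        a[i] = a[j];
        a[j] = temp;
    }

    // Partition a[first..last] around the pivot a[first];
    // return the pivot's final position (the split).
    static int partition(int[] a, int first, int last) {
        int lastSmall = first;
        for (int i = first + 1; i <= last; i++) {
            if (a[i] <= a[first]) {
                lastSmall++;
                swapElements(a, lastSmall, i);
            }
        }
        swapElements(a, first, lastSmall); // pivot goes between the partitions
        return lastSmall;
    }

    static void quickSort(int[] a, int first, int last) {
        if (first >= last)
            return;                        // 0 or 1 elements: already sorted
        int split = partition(a, first, last);
        quickSort(a, first, split - 1);    // sort left
        quickSort(a, split + 1, last);     // sort right
    }

    public static void main(String[] args) {
        int[] a = {27, 41, 14, 56, 31, 9, 22};
        quickSort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a)); // [9, 14, 22, 27, 31, 41, 56]
    }
}
```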

Analyzing Quicksort
 The critical statement is the comparison in the for loop of the partition function:
if (a[i] <= a[first])
 So how many times is partition called?
 And what are the values for first and last (# comparisons)?
 If the pivot element is near the median, we don't make many levels of calls to quickSort: the recursion depth is O(log n) and the sort is O(n log n)

The best of Quicksort, the worst of Quicksort
 In the best case (the 1st element is always the middle value), with 7 elements, partition is called 3 times:
first == 0, last == 6 // 6 comparisons
first == 0, last == 2 // 2 comparisons
first == 4, last == 6 // 2 comparisons
 In the worst case (a sorted array), with 7 elements, partition is called 6 times:
first == 0, last == 6 // 6 comparisons
first == 1, last == 6 // 5 comparisons
first == 2, last == 6 // 4 comparisons
first == 3, last == 6 // 3 comparisons
first == 4, last == 6 // 2 comparisons
first == 5, last == 6 // 1 comparison

Best Case:
[ 4, 1, 3, 2, 6, 5, 7 ]
[ 2, 1, 3, 4, 6, 5, 7 ]
[ 2, 1, 3 ] [ 6, 5, 7 ]
[ 1, 2, 3 ] [ 5, 6, 7 ]
[1] [3] [5] [7]

Worst Case:
[ 1, 2, 3, 4, 5, 6, 7 ]
[] [ 2, 3, 4, 5, 6, 7 ]
[] [ 3, 4, 5, 6, 7 ]
[] [ 4, 5, 6, 7 ]
[] [ 5, 6, 7 ]
[] [ 6, 7 ]
[] [ 7 ]
[] []

The Best and Worst continued
 So in the worst case, partition is called n-1 times, doing (n-1) + (n-2) + ... + 1 comparisons = O(n^2)
 The worst case of Quicksort may be the same as Selection Sort, which is O(n^2)
 Quicksort is used because of its best and average cases
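The best- and worst-case counts above can be checked empirically by counting the comparisons partition makes. This instrumented sketch is mine, not from the slides; the arrays in main reuse the 7-element best and worst cases traced earlier:

```java
public class QuickSortCount {
    static long comparisons = 0;

    static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }

    static int partition(int[] a, int first, int last) {
        int lastSmall = first;
        for (int i = first + 1; i <= last; i++) {
            comparisons++;                 // count the critical statement
            if (a[i] <= a[first]) {
                lastSmall++;
                swap(a, lastSmall, i);
            }
        }
        swap(a, first, lastSmall);
        return lastSmall;
    }

    static void quickSort(int[] a, int first, int last) {
        if (first >= last) return;
        int split = partition(a, first, last);
        quickSort(a, first, split - 1);
        quickSort(a, split + 1, last);
    }

    public static void main(String[] args) {
        int[] best = {4, 1, 3, 2, 6, 5, 7};  // pivot is always the middle value
        comparisons = 0;
        quickSort(best, 0, best.length - 1);
        System.out.println("best case:  " + comparisons); // 10 (= 6 + 2 + 2)

        int[] worst = {1, 2, 3, 4, 5, 6, 7}; // already sorted
        comparisons = 0;
        quickSort(worst, 0, worst.length - 1);
        System.out.println("worst case: " + comparisons); // 21 (= 6+5+4+3+2+1)
    }
}
```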