Recursion, Complexity, and Sorting
By Andrew Zeng

Table of Contents
– Quicksort
– Mergesort
– Recursion
– Algorithms and Complexity
– Review Questions

Overview
In this slideshow we will cover:
– The implementation of the sorts
– The speed of the sorts
– The benefits and disadvantages of recursion
– The analysis of algorithms
– Big-O notation
– Review problems

Algorithms and Complexity
Big-O notation
– An abstract notation used to represent the time complexity of an algorithm, or of another function, in terms of the input size n
– Examples are O(n^3), O(log n), and O(a^n). Note that a can take different values, such as 2 and 3, and these are different complexities, because O(2^n) and O(3^n) differ by the non-constant factor O(1.5^n). Also note that O(n^2) is the same as O(n^2 / 100), since constant factors are dropped.

Algorithms and Complexity cont.
Big-O analysis
– Simple assignment statements are O(1)
– The running time of a function call is the Big-O of its body
– Passing method parameters is O(1)
– Logic tests, such as the conditions in if/else statements, are O(1). The complexity of an if/else chain is the complexity of its worst-case branch
– Add the complexities of sequential steps together, and multiply a body's complexity by the iteration count when it sits inside a loop, to get the overall Big-O
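The counting rules above can be sketched on a small function. This is a hypothetical example (the function and its name are mine, not from the slides); the comments tally the cost of each part:

```python
def count_inversions(items):
    """Count pairs (i, j) with i < j and items[i] > items[j]."""
    count = 0                        # simple assignment: O(1)
    n = len(items)                   # O(1)
    for i in range(n):               # outer loop runs n times
        for j in range(i + 1, n):    # inner loop runs up to n times
            if items[i] > items[j]:  # logic test: O(1)
                count += 1           # O(1)
    return count                     # O(1) body * n * n iterations => O(n^2)
```

Adding the O(1) pieces and multiplying by the two nested loop counts gives O(n^2) overall, exactly as the rules prescribe.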

Recursion
In computer science, recursion is when a method or function calls itself, much like a recursive sequence in mathematics. This requires stack memory to keep track of all the methods that have been called.
Recursion must have a base case, which stops the function from calling itself infinitely. For example, a selection sort could call itself to sort from 0 to size – 1, then call itself to sort from 0 to size – 2, and so on, each time finding the greatest element and putting it at the top of the sorted portion of the array.
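The recursive selection sort described above might look like the following sketch (the function name and in-place style are my assumptions, not the slides'):

```python
def selection_sort_rec(a, end=None):
    """Recursively selection-sort a[0:end] in place.

    Base case: a prefix of length <= 1 is already sorted.
    """
    if end is None:
        end = len(a)
    if end <= 1:                  # base case stops the recursion
        return a
    # find the index of the greatest element in a[0:end]
    biggest = max(range(end), key=lambda i: a[i])
    # put the greatest element at the top of the unsorted portion
    a[biggest], a[end - 1] = a[end - 1], a[biggest]
    # recurse on the remaining shorter prefix (0 .. end - 2)
    return selection_sort_rec(a, end - 1)
```

Each call shortens the unsorted prefix by one, so a list of size n needs n calls on the stack, which is exactly the memory concern raised on the next slide.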

Pros and cons of recursion
Imagine the sort previously mentioned.
– If the size was 1 billion, it would require 1 billion method calls, so the sort would run out of stack memory. Now imagine a stack a million times as big.
Therefore it is almost always better to use iterative solutions whenever possible. Recursion can be useful when:
– It is difficult to express the problem as an iterative approach
– It is as fast as, and consumes no more memory than, an iterative approach, because it will probably make the code clearer

Example of recursion
Not having a base case is like falling forever without an edge to hang on to in Portal.

Quicksort
Its complexity is O(n^2) in the worst case: if the worst pivot (the highest remaining value) is chosen every time, each partition removes only a single element, and the sort degrades into quadratic behavior much like a bubble sort.
On average it is O(n log n).

Quicksort cont.
Generally what happens is you choose a pivot. Then all the values smaller than the pivot are moved to its left and the bigger ones to its right, and each side is sorted recursively. This is the simplest version. The algorithm can also be modified to sort items ascending or descending.
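The simple version described above can be sketched as follows (a sketch, not an in-place implementation; the naive first-element pivot is chosen deliberately, since it exhibits the O(n^2) worst case on already-sorted input):

```python
def quicksort(a):
    """Simple quicksort: pick a pivot, split smaller/bigger, recurse."""
    if len(a) < 2:                                 # base case: 0 or 1 elements
        return a
    pivot = a[0]                                   # naive pivot choice
    smaller = [x for x in a[1:] if x <= pivot]     # values <= pivot go left
    bigger = [x for x in a[1:] if x > pivot]       # values > pivot go right
    return quicksort(smaller) + [pivot] + quicksort(bigger)
```

Sorting descending instead is just a matter of flipping the two comparisons.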

Merge Sort
Background
– Developed by the mathematician John von Neumann in 1945, who is also known for the von Neumann computer architecture
– Based on the divide-and-conquer algorithm idea
– Its complexity is O(n log n) in both the best and worst case, but its average O(n log n) carries a larger constant than quicksort's. Sometimes, though, Mergesort can be tied with quicksort

Merge Sort cont.
The general algorithm for the sort is:
– If the list size is less than 2, it is already sorted; return the sublist
– Otherwise divide the list into two half-size lists and recursively sort the sublists
– Merge the sorted sublists together and return the new list
(The original slide included a visual sample of Mergesort here.)
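The three steps above translate almost line for line into a sketch like this (function name assumed):

```python
def merge_sort(a):
    """Mergesort: split in half, recursively sort, merge."""
    if len(a) < 2:                 # a list of size < 2 is already sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])     # recursively sort each half
    right = merge_sort(a[mid:])
    # merge the two sorted sublists into one sorted list
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])        # one side may have leftovers
    merged.extend(right[j:])
    return merged
```

The merge step does O(n) work at each of the O(log n) levels of splitting, which is where the guaranteed O(n log n) bound comes from.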

Pros and Cons
Pros
– It has good speed and is very consistent
– It also makes fewer comparisons than quicksort, except in rare cases
Cons
– Not really much, except that with a huge array, memory problems can arise from an overly large stack
– Its total number of recursive calls is O(2n – 1), compared with quicksort's n

Summary
We have found in this presentation:
– That recursion should be used only when necessary
– The speed of the sorts Quicksort and Mergesort, among others
– How to analyze an algorithm, recursive or iterative

Review Questions
#1 – Analyze the Big-O of the code shown on the slide:
– A: O(n^2 log n)
– B: O(n · n!)
– C: O(n^n)
– D: O(n^3)

Review Questions cont.
#2 – Which of the following is the fastest algorithm?
– A: Quicksorting an array of size 5n
– B: Mergesorting an array of size 5n
– C: Bubble sorting an array of size n
– D: It depends
#3 – Analyze the result of the code shown on the left of the slide for a = 3. (Spaces = line feeds)
– A: c++ c++ c++ .net .net .net
– B: c++ c++ c++ .net .net
– C: c++ c++ c++ .net c++ c++ c++ .net .net
– D: None of the above

Review Answers
#1 – Start with sort(), because that is the first method called. It calls inorder() and negates the result, which is O(1). inorder() is clearly O(n), because a single equality test is done at most n times, so it is O(n · 1) = O(n). shuffle() is O(n) because all of the actions in its loop are O(1) and the shuffle touches n elements. Adding the costs of inorder() and shuffle(), plus loop overhead, still gives O(n), because the O(1) terms are of little consequence as n gets big. Now analyze the while loop. The probability that all the elements are in order after a shuffle is 1/n · 1/(n – 1) · … · 1/2 · 1 = 1/n!, so the expected number of shuffles is n!, and the overall complexity is O(n · n!). In this case we use the expected rather than the worst case, because the shuffling may never randomly reach the sorted order.
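The code being analyzed is not reproduced in this transcript, but the answer's description matches a "bogosort". The following is a hypothetical reconstruction consistent with that analysis (all names taken from the answer text):

```python
import random

def inorder(a):
    """O(n): one comparison per adjacent pair, at most n of them."""
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

def shuffle(a):
    """O(n): Fisher-Yates shuffle, O(1) work per element."""
    for i in range(len(a) - 1, 0, -1):
        j = random.randint(0, i)
        a[i], a[j] = a[j], a[i]

def sort(a):
    """Shuffle until sorted: n! expected iterations of O(n) work each,
    giving the expected complexity O(n * n!)."""
    while not inorder(a):
        shuffle(a)
    return a
```

With only a few elements the loop terminates quickly in practice, but the expected-case analysis above is why answer B, O(n · n!), is correct.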

Review Answers cont.
#2 – The answer is D: it depends. Quicksort is on average faster than Mergesort, but it can be slower. On the other hand, bubble sort is slower than the other two for arrays of the same size, but here its array is smaller, so for small n it can be the fastest.

Review Answers cont.
#3 – The answer is C. For a = 3, the method recursively calls itself for a = 2, which then calls a = 1, and then a = 0. At a = 0, the method tests the value of a and breaks out of the recursive sequence. Then, back at a = 1, "c++" is printed once by the 1st loop and ".net" zero times by the 2nd. The method then finishes and a = 2 executes, acting just like a = 1 except that the number of times each word is printed increases by 1 as a goes up by 1. The same holds for a = 3.
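The quiz code itself is not in the transcript, but the answer's walkthrough pins down its shape. Here is a hypothetical reconstruction (collecting the words in a list rather than printing, so the output order is easy to check):

```python
def print_words(a, out=None):
    """Recurse first, then emit "c++" a times and ".net" (a - 1) times."""
    if out is None:
        out = []
    if a == 0:                  # base case breaks the recursive sequence
        return out
    print_words(a - 1, out)     # recursive call happens before any printing
    for _ in range(a):          # 1st loop: "c++" printed a times
        out.append("c++")
    for _ in range(a - 1):      # 2nd loop: ".net" printed a - 1 times
        out.append(".net")
    return out
```

Running it for a = 3 produces the sequence in answer C: one "c++" from a = 1, then two "c++" and one ".net" from a = 2, then three "c++" and two ".net" from a = 3.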

Code writing exercise
Write a recursive and an iterative algorithm to create anagrams of a word. Think of a word as a collection of characters, let n = size, and assume that no letter repeats.
– Recursive: Starting with the whole collection, do the following size-of-word times:
– Create the anagrams of the sub-word (the whole word excluding the first letter), displaying the word when it reaches a size of 2 (the base case is size 1, because a single letter has no permutation except itself)
– Then rotate the word by moving the first letter to the end
– Iterative: Starting at element 0, swap adjacent elements until you reach the end. Repeat this process size! / (size – 1) times.
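One possible answer to the recursive half of the exercise, following the rotate-and-recurse outline above (the function name is mine; it returns the anagrams as a list instead of displaying them):

```python
def anagrams(word):
    """Recursive anagram generator.

    Base case: a 1-letter word has only itself as an anagram.
    """
    if len(word) <= 1:
        return [word]
    result = []
    for _ in range(len(word)):            # do this size-of-word times
        # prepend the first letter to every anagram of the sub-word
        for tail in anagrams(word[1:]):
            result.append(word[0] + tail)
        # rotate: move the first letter to the end
        word = word[1:] + word[0]
    return result
```

Each of the n rotations fixes a different first letter, and the recursion permutes the remaining n – 1 letters, so a repeat-free word of size n yields all n! anagrams.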