Sorting by Tammy Bailey


Algorithm analysis
- Determine the amount of resources an algorithm requires to run: computation time, space in memory.
- The running time of an algorithm is the number of basic operations performed: additions, multiplications, comparisons.
- Running time usually grows with the size of the input: it is faster to add 2 numbers than to add 2,000,000!
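
The slides do not include code, so the sketches added throughout this transcript use Java as an assumed language, with illustrative class and method names. As a first example of counting basic operations, the snippet below sums an array while tallying the additions performed; the count grows with the input size n.

public class OperationCount {
    // Sum an array while counting the additions performed.
    static long additions = 0;

    static int sum(int[] a) {
        int total = 0;
        for (int x : a) {
            total += x;      // one addition per element
            additions++;
        }
        return total;
    }

    public static void main(String[] args) {
        sum(new int[2]);
        System.out.println("additions for n = 2: " + additions);          // 2
        additions = 0;
        sum(new int[2_000_000]);
        System.out.println("additions for n = 2,000,000: " + additions);  // 2,000,000
    }
}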

Running time
- Worst-case running time: upper bound on the running time; a guarantee that the algorithm will never take longer to run.
- Average-case running time: the time it takes the algorithm to run on average (expected value).
- Best-case running time: lower bound on the running time; a guarantee that the algorithm will not run faster.

Comparisons in insertion sort
- Worst case: element k requires (k-1) comparisons; total number of comparisons: 0 + 1 + 2 + … + (n-1) = ½(n)(n-1) = ½(n² - n)
- Best case: elements 2 through n each require one comparison; total: 1 + 1 + 1 + … + 1 = n-1 ((n-1) times)
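
A minimal insertion sort sketch (not taken from the slides) with a comparison counter added, so the best- and worst-case counts above can be checked directly; the counter field and method names are illustrative choices.

public class InsertionSort {
    static long comparisons = 0;

    // Sort a[] in place, counting element comparisons.
    static void insertionSort(int[] a) {
        for (int k = 1; k < a.length; k++) {      // index k is element k+1 in the slide's 1-based numbering
            int key = a[k];
            int j = k - 1;
            while (j >= 0) {
                comparisons++;                    // one comparison against the sorted prefix
                if (a[j] <= key) break;           // already in place: best case stops after one comparison
                a[j + 1] = a[j];                  // shift the larger element right
                j--;
            }
            a[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        insertionSort(new int[]{1, 2, 3, 4, 5});  // sorted input: n-1 = 4 comparisons
        System.out.println("best case:  " + comparisons);
        comparisons = 0;
        insertionSort(new int[]{5, 4, 3, 2, 1});  // reversed input: n(n-1)/2 = 10 comparisons
        System.out.println("worst case: " + comparisons);
    }
}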

Running time of insertion sort
- Best case running time is linear.
- Worst case running time is quadratic.
- Average case running time is also quadratic: on average, element k requires (k-1)/2 comparisons, so the total number of comparisons is ½(0 + 1 + 2 + … + (n-1)) = ¼(n)(n-1) = ¼(n² - n)
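
The average-case count from the slide, written out as a worked sum (LaTeX, for clarity):

\[
\text{average comparisons}
  = \sum_{k=2}^{n} \frac{k-1}{2}
  = \frac{1}{2}\sum_{j=1}^{n-1} j
  = \frac{1}{2}\cdot\frac{n(n-1)}{2}
  = \frac{n^2 - n}{4}
\]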

Mergesort
Example: sorting [27 10 12 20]
- divide: [27 10 12 20] → [27 10], [12 20]
- divide: [27 10] → [27], [10]; [12 20] → [12], [20]
- merge: [27], [10] → [10 27]; [12], [20] → [12 20]
- merge: [10 27], [12 20] → [10 12 20 27]

Merging two sorted lists

first list | second list | result of merge
10 27      | 12 20       | 10
10 27      | 12 20       | 10 12
10 27      | 12 20       | 10 12 20
10 27      | 12 20       | 10 12 20 27

Comparisons in merging
- Merging two sorted lists of size m requires at least m and at most 2m-1 comparisons.
- m comparisons if all elements in one list are smaller than all elements in the second list.
- 2m-1 comparisons if the smallest element alternates between lists.
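
A sketch of the merge step (illustrative code, not from the slides), again with a comparison counter so the m to 2m-1 range can be observed:

import java.util.Arrays;

public class Merge {
    static long comparisons = 0;

    // Merge two sorted arrays into one sorted array.
    static int[] merge(int[] first, int[] second) {
        int[] result = new int[first.length + second.length];
        int i = 0, j = 0, k = 0;
        while (i < first.length && j < second.length) {
            comparisons++;                         // one comparison per output element, until a list empties
            if (first[i] <= second[j]) result[k++] = first[i++];
            else                       result[k++] = second[j++];
        }
        // One list is exhausted: copy the rest of the other with no further comparisons.
        while (i < first.length)  result[k++] = first[i++];
        while (j < second.length) result[k++] = second[j++];
        return result;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(merge(new int[]{1, 2}, new int[]{3, 4})));
        System.out.println("comparisons: " + comparisons);   // m = 2: one list is entirely smaller
        comparisons = 0;
        System.out.println(Arrays.toString(merge(new int[]{10, 27}, new int[]{12, 20})));
        System.out.println("comparisons: " + comparisons);   // 2m-1 = 3: elements interleave
    }
}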

Logarithm
- The logarithm of n is the power to which a number a must be raised to produce n; a is called the base of the logarithm.
- Frequently used logarithms have special symbols: lg n = log₂ n (logarithm base 2), ln n = logₑ n (natural logarithm, base e), log n = log₁₀ n (common logarithm, base 10).
- If we assume n is a power of 2, then the number of times we can recursively divide n numbers in half is lg n.
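
A tiny sketch (not from the slides) that counts how many times n can be halved, which equals lg n when n is a power of 2:

public class Halvings {
    // Count how many times n can be divided in half before reaching 1.
    static int halvings(int n) {
        int count = 0;
        while (n > 1) {
            n /= 2;
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(halvings(16));    // 4, since lg 16 = 4
        System.out.println(halvings(1024));  // 10, since lg 1024 = 10
    }
}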

Comparisons at each merge

#lists | #elements in each list | #merges | #comparisons per merge | #comparisons total
n      | 1                      | n/2     | 1                      | n/2
n/2    | 2                      | n/4     | 3                      | 3n/4
n/4    | 4                      | n/8     | 7                      | 7n/8
...    | ...                    | ...     | ...                    | ...
2      | n/2                    | 1       | n-1                    | n-1

Comparisons in mergesort
- The total number of comparisons is the sum of the number of comparisons made at each level of merging.
- There are at most n comparisons at each level.
- The number of times we can recursively divide n numbers in half is lg n, so there are lg n levels of merging.
- There are at most n lg n comparisons in total.
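
Putting the pieces together, a mergesort sketch built on the merge step above (again an illustrative implementation, not code from the slides):

import java.util.Arrays;

public class MergeSort {
    // Recursively sort a[] by splitting it in half and merging the sorted halves.
    static int[] mergeSort(int[] a) {
        if (a.length <= 1) return a;                                    // a single element is already sorted
        int mid = a.length / 2;
        int[] left  = mergeSort(Arrays.copyOfRange(a, 0, mid));         // divide
        int[] right = mergeSort(Arrays.copyOfRange(a, mid, a.length));
        return merge(left, right);                                      // merge: at most n comparisons per level
    }

    static int[] merge(int[] first, int[] second) {
        int[] result = new int[first.length + second.length];
        int i = 0, j = 0, k = 0;
        while (i < first.length && j < second.length)
            result[k++] = (first[i] <= second[j]) ? first[i++] : second[j++];
        while (i < first.length)  result[k++] = first[i++];
        while (j < second.length) result[k++] = second[j++];
        return result;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(mergeSort(new int[]{27, 10, 12, 20})));  // [10, 12, 20, 27]
    }
}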

Comparison of sorting algorithms
- Best, worst and average-case running time of mergesort is Θ(n lg n).
- Compare to the average-case behavior of insertion sort:

n      | Insertion sort | Mergesort
10     | 25             | 33
100    | 2500           | 664
1000   | 250000         | 9965
10000  | 25000000       | 132877
100000 | 2500000000     | 1660960
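
As a rough check (not from the slides), the values in the table appear to be n²/4 (the average insertion-sort comparison count derived earlier) and roughly n·lg n; the sketch below prints both quantities for the same inputs and lands close to the table's entries.

public class CompareCounts {
    public static void main(String[] args) {
        // Approximate comparison counts: n^2/4 for insertion sort (average case), n*lg n for mergesort.
        for (long n : new long[]{10, 100, 1000, 10000, 100000}) {
            long insertion = n * n / 4;
            long mergesort = Math.round(n * Math.log(n) / Math.log(2));
            System.out.printf("%7d %12d %9d%n", n, insertion, mergesort);
        }
    }
}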

Quicksort
- Most commonly used sorting algorithm.
- One of the fastest sorts in practice.
- Best and average-case running time is O(n lg n).
- Worst-case running time is quadratic.
- Runs very fast on most computers when implemented correctly.
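
A minimal quicksort sketch, an illustrative in-place version; the slides do not show an implementation or a pivot rule, so the last-element pivot used here is an assumption.

import java.util.Arrays;

public class QuickSort {
    // Sort a[lo..hi] in place by partitioning around a pivot and recursing on both sides.
    static void quickSort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        int pivot = a[hi];                      // last element as pivot (illustrative choice)
        int i = lo;
        for (int j = lo; j < hi; j++) {
            if (a[j] < pivot) { swap(a, i, j); i++; }
        }
        swap(a, i, hi);                         // pivot lands in its final position i
        quickSort(a, lo, i - 1);                // elements smaller than the pivot
        quickSort(a, i + 1, hi);                // elements greater than or equal to the pivot
    }

    static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

    public static void main(String[] args) {
        int[] a = {27, 10, 12, 20, 9, 30};
        quickSort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a));   // [9, 10, 12, 20, 27, 30]
    }
}

With this pivot choice, an already-sorted or reverse-sorted input produces the most unbalanced partitions and triggers the quadratic worst case mentioned above.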

Searching

Searching
- Determine the location or existence of an element in a collection of elements of the same type.
- It is easier to search large collections when the elements are already sorted: finding a phone number in the phone book, looking up a word in the dictionary.
- What if the elements are not sorted?

Sequential search
- Given a collection of n unsorted elements, compare each element in sequence.
- Worst case: unsuccessful search (the search element is not in the input); make n comparisons; search time is linear.
- Average case: expect to search half the elements; make n/2 comparisons.
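
A sequential (linear) search sketch, illustrative rather than from the slides:

public class SequentialSearch {
    // Return the index of target in a[], or -1 if it is not present.
    static int sequentialSearch(int[] a, int target) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == target) return i;   // found after i+1 comparisons
        }
        return -1;                          // unsuccessful search: n comparisons
    }

    public static void main(String[] args) {
        int[] a = {27, 10, 12, 20};
        System.out.println(sequentialSearch(a, 12));  // 2
        System.out.println(sequentialSearch(a, 99));  // -1
    }
}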

Searching sorted input
- If the input is already sorted, we can search more efficiently than linear time.
- Example, "Higher-Lower": think of a number between 1 and 1000; have someone try to guess the number; if they are wrong, you tell them whether the number is higher than their guess or lower.
- Strategy? How many guesses should we expect to make?

Best strategy
- Always pick the number in the middle of the range.
- Why? You eliminate half of the possibilities with each guess.
- We should expect to make at most lg 1000 ≈ 10 guesses.
- Binary search: search n sorted inputs in logarithmic time.

Binary search
Search for 9 in a sorted list of 16 elements:
1 3 4 5 7 9 10 11 13 14 18 20 22 23 30
→ 1 3 4 5 7 9 10
→ 5 7 9 10
→ 9 10
→ 9
Each comparison with the middle element discards half of the remaining range until only the search element is left.
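
A binary search sketch over a sorted array (illustrative code, not from the slides):

public class BinarySearch {
    // Return the index of target in sorted array a[], or -1 if it is not present.
    static int binarySearch(int[] a, int target) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) / 2;            // middle of the remaining range
            if (a[mid] == target) return mid;
            if (a[mid] < target) lo = mid + 1;  // target is in the upper half
            else                 hi = mid - 1;  // target is in the lower half
        }
        return -1;                              // not found after about lg n comparisons
    }

    public static void main(String[] args) {
        int[] a = {1, 3, 4, 5, 7, 9, 10, 11, 13, 14, 18, 20, 22, 23, 30};
        System.out.println(binarySearch(a, 9));    // 5 (the index of 9)
        System.out.println(binarySearch(a, 17));   // -1
    }
}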

Sequential vs. binary search
- Average-case running time of sequential search is linear.
- Average-case running time of binary search is logarithmic.
- Number of comparisons:

n     | sequential search | binary search
2     | 1                 | 1
16    | 8                 | 4
256   | 128               | 8
4096  | 2048              | 12
65536 | 32768             | 16