WHICH SEARCH OR SORT IS BETTER?

COMPARING ALGORITHMS
Time efficiency refers to how long it takes an algorithm to run.
Space efficiency refers to the amount of memory space an algorithm uses.
Algorithms are compared to each other by expressing their efficiency in big-O notation.

BIG-O NOTATION
Constant, O(1): No matter how large or small the data set, the algorithm takes the same amount of time. Example: 1 item takes 1 second, 10 items take 1 second, 100 items take 1 second.
Logarithmic, O(log n): Time increases with larger data sets, but not proportionately. Example: 1 item takes 1 second, 10 items take 2 seconds, 100 items take 3 seconds.
Linear, O(n): The larger the data set, the more time is taken, in direct proportion. Example: 1 item takes 1 second, 10 items take 10 seconds, 100 items take 100 seconds.
Linearithmic, O(n log n): Combines the previous two; typically the sort has an O(n) pass that is repeated O(log n) times, combining to O(n log n). Example: 1 item takes 2 seconds, 10 items take 12 seconds, 100 items take 103 seconds.
Quadratic, O(n²): Things are getting noticeably slow. Example: 1 item takes 1 second, 10 items take 100 seconds, 100 items take 10,000 seconds.
Exponential, O(2ⁿ): The algorithm takes twice as long for every new element added. Example: 1 item takes 1 second, 10 items take 1,024 seconds, 100 items take about 2¹⁰⁰ (roughly 1.3 × 10³⁰) seconds.

BIG-O NOTATION
Sequential Search: O(n)
Binary Search: O(log n)
Selection Sort: O(n²)
Insertion Sort: O(n²)
Merge Sort: O(n log n)
Quick Sort: O(n log n)

ANALYSIS OF SEQUENTIAL SEARCH
Problem: find a target in a list of n values.
Best case: 1 comparison (target found immediately).
Worst case: n comparisons (target is the last element, or not in the list).
Average case: n/2 comparisons (target found near the middle).
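A minimal Java sketch of the sequential search analyzed above (the method name and int-array signature are illustrative, not from the slides):

```java
/** Returns the index of target in a, or -1 if it is not found. */
public static int sequentialSearch(int[] a, int target) {
    for (int i = 0; i < a.length; i++) {   // up to n comparisons
        if (a[i] == target) {
            return i;                      // best case: found on the first comparison
        }
    }
    return -1;                             // worst case: all n elements were examined
}
```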

ANALYSIS OF BINARY SEARCH
Problem: find a target in a sorted array of n values (like finding an entry in a phone book).
Algorithm: look in the middle; every iteration halves the remaining list (n, n/2, n/4, …).
Best case: target is found on the first try (in the middle).
Worst case: target is not in the list, or is at either end of a sublist.
How do you know the maximum number of comparisons in the worst case? Round n up to the nearest power of two, 2^m; then at most about m halvings are needed. Example: if the array has 9 elements, round up to 16 = 2⁴, so at most about 4 halvings.
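A minimal Java sketch of binary search, assuming the array is already sorted (the method name and signature are our own):

```java
/** Returns the index of target in the sorted array a, or -1 if it is not found. */
public static int binarySearch(int[] a, int target) {
    int low = 0;
    int high = a.length - 1;
    while (low <= high) {                   // each pass halves the remaining range
        int mid = low + (high - low) / 2;   // look in the middle
        if (a[mid] == target) {
            return mid;
        } else if (a[mid] < target) {
            low = mid + 1;                  // discard the left half
        } else {
            high = mid - 1;                 // discard the right half
        }
    }
    return -1;                              // at most about log2(n) passes
}
```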

LOG₂ N
log₂ n is the number of times you can halve a (positive) number n until it reaches 1 or below.
log₂ n = m means 2^m = n.
Examples:
log₂ 16 = 4 because 2⁴ = 16 (16/2 = 8, 8/2 = 4, 4/2 = 2, 2/2 = 1).
log₂ 8 = 3 because 2³ = 8.
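As a small illustration of this "repeated halving" view, here is a Java sketch (the method name is made up; with integer division it gives the exact log₂ only for powers of two):

```java
/** Counts how many times n can be halved until it reaches 1. */
public static int halvings(int n) {
    int count = 0;
    while (n > 1) {
        n = n / 2;     // integer halving
        count++;
    }
    return count;      // halvings(16) == 4, halvings(8) == 3
}
```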

LOG₂ N
log₂ n increases very slowly:
log₂ 8 = 3
log₂ 32 = 5
log₂ 128 = 7
log₂ 1,024 = 10
log₂ 1,048,576 = 20 (about one million)
log₂ 1,073,741,824 = 30 (about one billion)
…

LOG₂ N
Does efficiency matter? Say n = 10⁹ (1 billion elements).
A 10 MHz computer executes 10⁷ instructions per second, so 1 instruction takes 10⁻⁷ seconds.
Sequential search, O(n): 10⁹ × 10⁻⁷ seconds = 100 seconds.
Binary search, O(log n): log₂ 10⁹ × 10⁻⁷ seconds ≈ 30 × 10⁻⁷ seconds = 3 microseconds.
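A tiny Java program reproducing the back-of-the-envelope arithmetic above (the class name is ours; the 10 MHz figure and n = 10⁹ come from the slide, everything else is derived):

```java
public class SearchTiming {
    public static void main(String[] args) {
        double secondsPerInstruction = 1e-7;   // 10 MHz => 10^7 instructions per second
        long n = 1_000_000_000L;               // one billion elements

        double sequential = n * secondsPerInstruction;                                 // about 100 seconds
        double binary = Math.ceil(Math.log(n) / Math.log(2)) * secondsPerInstruction;  // about 3e-6 seconds

        System.out.printf("sequential: %.0f s, binary: %.1e s%n", sequential, binary);
    }
}
```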

COMPARING SORTS
Selection sort and insertion sort are similar in efficiency.
Both have an outer loop that scans all elements and an inner loop that compares the current element of the outer loop with almost all other values in the list.
Approximately n² comparisons are made to sort a list of size n.
We therefore say these sorts have efficiency O(n²), or are of order n².
Other sorts are more efficient: O(n log₂ n).

ANALYSIS OF SELECTION SORT
Iteration 1: find the largest value in a list of n numbers (n-1 comparisons); exchange values and move the marker.
Iteration 2: find the largest value in a list of n-1 numbers (n-2 comparisons); exchange values and move the marker.
Iteration 3: find the largest value in a list of n-2 numbers (n-3 comparisons); exchange values and move the marker.
…
Iteration n: find the largest value in a list of 1 number (0 comparisons); exchange values and move the marker.
Total comparisons: (n-1) + (n-2) + … + 1 = n(n-1)/2, which is approximately n²/2.

ANALYSIS OF SELECTION SORT
There is no separate best or worst case: selection sort makes the same number of comparisons regardless of the initial order of the data.
This is an O(n²) algorithm.
Selection sort is too inefficient to use with a large array.
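A compact Java version of the "find the largest value, exchange, move the marker" scheme described two slides earlier (the method name and in-place int-array form are our own):

```java
/** Sorts a in place by repeatedly selecting the largest remaining value and swapping it to the end. */
public static void selectionSort(int[] a) {
    for (int end = a.length - 1; end > 0; end--) {   // the "marker" moves in from the right
        int largest = 0;
        for (int i = 1; i <= end; i++) {             // end comparisons on this pass: n-1, n-2, ...
            if (a[i] > a[largest]) {
                largest = i;
            }
        }
        int tmp = a[end];                            // exchange the largest value with the marker position
        a[end] = a[largest];
        a[largest] = tmp;
    }
}
```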

ANALYSIS OF INSERTION SORT
Insertion sort uses nested loops, so as the number of elements in the array grows, the work grows roughly as n × n; in big-O notation this is written O(n²).
Best case is O(n): if the array is already sorted, the inner loop never has to shift any elements.
Space efficiency: O(1), because it sorts in place.
Appropriate for very small arrays.
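A minimal in-place Java sketch of insertion sort matching the analysis above (method name and signature are illustrative):

```java
/** Sorts a in place by inserting each element into the already-sorted prefix to its left. */
public static void insertionSort(int[] a) {
    for (int i = 1; i < a.length; i++) {
        int key = a[i];
        int j = i - 1;
        while (j >= 0 && a[j] > key) {   // on already-sorted input this loop exits at once: O(n) best case
            a[j + 1] = a[j];             // shift larger elements one slot to the right
            j--;
        }
        a[j + 1] = key;                  // sorts in place: O(1) extra space
    }
}
```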

ANALYSIS OF MERGE SORT
Three steps:
1. First we compute the midpoint. This is O(1) no matter how large the array.
2. Next we recursively sort the two subarrays, each approximately half the size.
3. Finally we merge the two sorted subarrays back together.
The merge sort algorithm has time efficiency O(n log n).
Space efficiency: O(n), because it uses a temporary array.
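One possible Java sketch of those three steps, using a caller-supplied temporary array (the exact signature is our choice, not from the slides); call it as mergeSort(a, new int[a.length], 0, a.length - 1):

```java
/** Sorts a[from..to] (inclusive); temp must be an array of the same length as a. */
public static void mergeSort(int[] a, int[] temp, int from, int to) {
    if (from >= to) return;                        // 0 or 1 elements: already sorted
    int mid = (from + to) / 2;                     // step 1: compute the midpoint, O(1)
    mergeSort(a, temp, from, mid);                 // step 2: recursively sort each half
    mergeSort(a, temp, mid + 1, to);
    int i = from, j = mid + 1, k = from;           // step 3: merge the two sorted halves
    while (i <= mid && j <= to) {
        temp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    }
    while (i <= mid) temp[k++] = a[i++];
    while (j <= to)  temp[k++] = a[j++];
    for (k = from; k <= to; k++) a[k] = temp[k];   // copy the merged run back: O(n) extra space overall
}
```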

ANALYSIS OF QUICK SORT
Three steps:
1. First we select a pivot. This is O(1) no matter how large the array.
2. Next we partition the array so that items less than the pivot are on the left and items greater than the pivot are on the right.
3. Finally we recursively quicksort each partition until all elements are sorted in place (no temporary arrays needed).
Worst case: the pivot splits the array into a subset of 1 item and a subset of n-1 items, which gives O(n²) behavior.
Average case: the quick sort algorithm has time efficiency O(n log n).
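A possible Java sketch of those steps; the deck does not specify a pivot rule, so this version assumes the middle element as pivot and uses a two-pointer in-place partition. Call it as quickSort(a, 0, a.length - 1):

```java
/** Sorts a[left..right] (inclusive) in place; no temporary arrays are needed. */
public static void quickSort(int[] a, int left, int right) {
    if (left >= right) return;
    int pivot = a[(left + right) / 2];     // step 1: pick a pivot (middle element here)
    int i = left, j = right;
    while (i <= j) {                       // step 2: partition around the pivot
        while (a[i] < pivot) i++;          // items less than the pivot stay on the left
        while (a[j] > pivot) j--;          // items greater than the pivot stay on the right
        if (i <= j) {
            int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
            i++;
            j--;
        }
    }
    quickSort(a, left, j);                 // step 3: recursively sort both sides
    quickSort(a, i, right);
}
```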

ANALYZING THE MERGE SORT ALGORITHM
[Table: running times in milliseconds of merge sort vs. selection sort for n = 10,000 through 60,000; the individual timing values were garbled in this transcript.]

Merge Sort Timing vs. Selection Sort