Presentation transcript:

Computer Science 101 Fast Algorithms

What Is Really Fast?

     n   O(log2 n)   O(n)   O(n^2)       O(2^n)
     2       1         2        4            4
     4       2         4       16           16
     8       3         8       64          256
    16       4        16      256        65536
    32       5        32     1024   4294967296

Even for modest n, O(2^n) dwarfs the others: at n = 32, 2^n already has ten digits. Whoa!
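The values in this table are easy to regenerate. A short Python sketch (the variable names are ours, not from the slides):

```python
import math

# Regenerate the growth table for n = 2, 4, 8, 16, 32.
rows = [(n, int(math.log2(n)), n, n ** 2, 2 ** n) for n in (2, 4, 8, 16, 32)]
for n, log_n, linear, squared, exponential in rows:
    print(f"{n:>3} {log_n:>3} {linear:>3} {squared:>6} {exponential:>12}")
```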

Logarithms

Definition: the logarithm to base 2 of n is the number k such that 2^k = n, so log2(n) = k. When we say log(n), we mean log2(n).

Example: 2^5 = 32, so log(32) = 5.

Another way to think of this is that log(n) is the number of times we must divide n by 2 until we get 1.

Note: log(n) is usually not a whole number, but the closest integer will work for us.
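The "repeated halving" view of the logarithm can be checked directly. A minimal sketch (the function name is ours):

```python
def halvings(n):
    """Count how many times n can be halved (integer division) before reaching 1."""
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count

print(halvings(32))  # 5, matching log2(32) = 5
```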

Improving Efficiency

Tweaking does not improve the big-O, so the change is not noticeable when n becomes very large. If we can go from O(n) to O(log n), we have really accomplished something.

Example: Sequential Search

If the data items are in random order, then each one must be examined in the worst case. Complexity of sequential search = O(n).

    set Current to 1
    set Found to false
    while Current <= N and not Found do
        if A(Current) = Target then
            set Found to true
        else
            increment Current
    if Found then output Current else output 0
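As a sketch in Python (the pseudocode uses 1-based positions, so we convert; the function name is ours):

```python
def sequential_search(a, target):
    """Return the 1-based position of target in list a, or 0 if absent."""
    current = 0                    # 0-based index into the Python list
    while current < len(a):
        if a[current] == target:
            return current + 1     # report the 1-based position
        current += 1
    return 0                       # not found

print(sequential_search([7, 3, 9, 4], 9))  # 3
```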

Searching a Sorted List

When we search a phone book, we don't begin with the first name and look at each successor. Instead, we skip over large numbers of names until we find the target or give up.

Binary Search Strategy

–Have pointers marking the left and right ends of the list still to be processed
–Compute the position of the midpoint between the two pointers
–If the target equals the value at the midpoint, quit with the position found
–Otherwise, if the target is less than that value, search just the positions to the left of the midpoint
–Otherwise, search just the positions to the right of the midpoint
–Give up when the pointers cross

The Binary Search Algorithm

    set Left to 1
    set Right to N
    set Found to false
    while Left <= Right and not Found do
        compute the midpoint
        if Target = A(Mid) then
            set Found to true
        else if Target < A(Mid) then
            search to the left of the midpoint
        else
            search to the right of the midpoint
    if Found then output Mid else output 0

The Binary Search Algorithm

    set Left to 1
    set Right to N
    set Found to false
    while Left <= Right and not Found do
        set Mid to (Left + Right) / 2
        if Target = A(Mid) then
            set Found to true
        else if Target < A(Mid) then
            search to the left of the midpoint
        else
            search to the right of the midpoint
    if Found then output Mid else output 0

The Binary Search Algorithm

    set Left to 1
    set Right to N
    set Found to false
    while Left <= Right and not Found do
        set Mid to (Left + Right) / 2
        if Target = A(Mid) then
            set Found to true
        else if Target < A(Mid) then
            set Right to Mid - 1
        else
            search to the right of the midpoint
    if Found then output Mid else output 0

The Binary Search Algorithm

    set Left to 1
    set Right to N
    set Found to false
    while Left <= Right and not Found do
        set Mid to (Left + Right) / 2
        if Target = A(Mid) then
            set Found to true
        else if Target < A(Mid) then
            set Right to Mid - 1
        else
            set Left to Mid + 1
    if Found then output Mid else output 0
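The finished pseudocode translates almost line for line into Python. A sketch, assuming a sorted Python list (0-based, so we convert the algorithm's 1-based positions; the function name is ours):

```python
def binary_search(a, target):
    """Return the 1-based position of target in sorted list a, or 0 if absent."""
    left, right = 1, len(a)
    while left <= right:
        mid = (left + right) // 2
        if a[mid - 1] == target:       # a is 0-based; the algorithm is 1-based
            return mid
        elif target < a[mid - 1]:
            right = mid - 1            # discard the right half
        else:
            left = mid + 1             # discard the left half
    return 0                           # pointers crossed: not found

print(binary_search([2, 5, 7, 11, 13, 17], 11))  # 4
```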

Analysis of Binary Search

On each pass through the loop, half of the positions in the list are discarded. In the worst case, the number of comparisons equals the number of times the size of the list can be divided by 2. An O(log n) algorithm! However, we must have a sorted list.

Improving on n^2 Sorting

Several algorithms have been developed to break the O(n^2) barrier for sorting. Most of them use a divide-and-conquer strategy: break the list into smaller pieces and apply another algorithm to them.

Quicksort

Strategy - divide and conquer:
–Partition the list into two parts, with small elements in the first part and large elements in the second part
–Sort the first part
–Sort the second part

Question - How do we sort the two parts? Answer - Apply Quicksort to them. A recursive algorithm is one which makes use of itself to solve smaller problems of the same type.

Quicksort

Question - Will this recursive process ever stop? Answer - Yes: when the problem is small enough, we no longer use recursion. Such cases are called base cases.

Partitioning a List

To partition a list, we choose a pivot element. The elements that are less than or equal to the pivot go into the first section; the elements larger than the pivot go into the second section.

Partitioning a List

[Diagram: a sublist to sort is partitioned; the pivot is the element at the midpoint. After Partition, the data are where they should be relative to the pivot.]

The Quicksort Algorithm

    if the list to sort has more than 1 element then
        if the list has exactly two elements then
            if the elements are out of order then
                exchange them
        else
            perform the Partition Algorithm on the list
            apply Quicksort to the first section
            apply Quicksort to the second section
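A minimal sketch of this recursive structure in Python. For clarity it builds new lists rather than partitioning in place, uses the middle element as pivot, and treats lists of 0 or 1 elements as the base case; all names are ours:

```python
def quicksort(items):
    """Recursive quicksort sketch: partition around a pivot, then recurse."""
    if len(items) <= 1:
        return items                              # base case: already sorted
    pivot = items[len(items) // 2]
    first = [x for x in items if x < pivot]       # small elements
    equal = [x for x in items if x == pivot]
    second = [x for x in items if x > pivot]      # large elements
    return quicksort(first) + equal + quicksort(second)

print(quicksort([5, 1, 9, 3, 7]))  # [1, 3, 5, 7, 9]
```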

Partitioning: Choosing the Pivot

The ideal would be to choose the median element as the pivot, but finding it would take too long. Some versions just choose the first element. Our choice: the median of the first three elements.

Partitioning a List

[Diagram: partitioning a list where the pivot is the median of the first three items.] The median of the first three items is a better approximation to the actual median than the item at the midpoint, and results in more even splits.

The Partition Algorithm

    exchange the median of the first 3 items with the first
    set P to first position of list
    set L to second position of list
    set U to last position of list
    while L <= U do
        while A(L) <= A(P) do
            set L to L + 1
        while A(U) > A(P) do
            set U to U - 1
        if L < U then
            exchange A(L) and A(U)
    exchange A(P) and A(U)

Here A is the list, P is the position of the pivot element, L probes for items > pivot, and U probes for items <= pivot.
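The partition algorithm above can be sketched in Python as follows. All names are ours, and we add one bounds check on L (needed when every remaining item is <= the pivot) that the pseudocode leaves implicit:

```python
def partition(a, first, last):
    """Partition a[first..last] in place, pivoting on the median of the
    first three items; return the pivot's final position.
    Assumes the section has at least three items."""
    # Exchange the median of the first 3 items with the first.
    three = sorted((a[first + i], first + i) for i in range(3))
    med_pos = three[1][1]
    a[first], a[med_pos] = a[med_pos], a[first]
    p, l, u = first, first + 1, last
    while l <= u:
        while l <= last and a[l] <= a[p]:   # L probes for items > pivot
            l += 1
        while a[u] > a[p]:                  # U probes for items <= pivot
            u -= 1
        if l < u:
            a[l], a[u] = a[u], a[l]
    a[p], a[u] = a[u], a[p]                 # drop the pivot into place
    return u

def quicksort_in_place(a, first=0, last=None):
    """Quicksort driver matching the slides: two elements are a base case."""
    if last is None:
        last = len(a) - 1
    n = last - first + 1
    if n == 2:
        if a[first] > a[last]:
            a[first], a[last] = a[last], a[first]
    elif n > 2:
        p = partition(a, first, last)
        quicksort_in_place(a, first, p - 1)
        quicksort_in_place(a, p + 1, last)
```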

Quicksort: Rough Analysis

For simplicity, assume that we always get even splits when we partition. When we partition the entire list, each element is compared with the pivot: approximately n comparisons. Each of the halves is then partitioned, each taking about n/2 comparisons, thus about n more comparisons. Each of the fourths is partitioned, each taking about n/4 comparisons: n more.

Quicksort: Rough Analysis

How many levels of about n comparisons do we get? Roughly, we keep splitting until the pieces are about size 1. How many times must we divide n by 2 before we get 1? log(n) times, of course. Thus comparisons ≈ n log(n), and Quicksort is O(n log n) – in the ideal or best case.

Quicksort: Best Case/Worst Case

If the list is already sorted and we choose the pivot from the midpoint, we get even splits all the way down: log(n) levels of splitting. If the list is already sorted and we choose the pivot from the first position, we get the worst possible split at every level: n levels of splitting. So Quicksort is O(n log n) in the best case and O(n^2) in the worst case.
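The worst case is easy to see by counting pivot comparisons. A sketch (names are ours) that deliberately uses the first element as pivot, so an already-sorted list produces the most lopsided splits:

```python
def quicksort_count(items):
    """Quicksort with the FIRST element as pivot; returns
    (sorted_list, comparisons_with_a_pivot). Illustration only."""
    if len(items) <= 1:
        return items, 0
    pivot, rest = items[0], items[1:]
    first = [x for x in rest if x <= pivot]
    second = [x for x in rest if x > pivot]
    s1, c1 = quicksort_count(first)
    s2, c2 = quicksort_count(second)
    # Each element of rest was compared against the pivot once.
    return s1 + [pivot] + s2, len(rest) + c1 + c2

# Already-sorted input is the worst case for a first-element pivot:
_, worst = quicksort_count(list(range(16)))
print(worst)  # 120 = 16*15/2, i.e. about n^2/2 comparisons
```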