INTRO2CS Tirgul 8

Searching and Sorting
- Tips for debugging
- Binary search
- Sorting algorithms:
  - Bogo sort
  - Bubble sort
  - Quick sort
  - and maybe: Radix sort, Merge sort

Tips for Debugging
- Debugging is the process of finding bugs, or defects, in a program.
- We can roughly divide bugs into two types:
  1. Bugs that cause the program to terminate its execution. These raise an exception, and Python usually prints a traceback of the error, including the line where it occurred and the type of error. Those are the easy bugs to solve.

Tips for Debugging
- For example:
  - Division by zero
  - Index error
  - Undefined variable
  - ...
- There are some examples in tirgul 6.

Tips for Debugging
  2. Bugs in the implementation: no exception is thrown, but the program does not do what we wanted it to do.
- Those are hard to find.
- This is where we need to debug.

Tips for Debugging - example
- Say we want to implement a function that gets a list and sums its elements.
- No exception will be thrown, but it will not return the result we intended.
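The code from this slide is not in the transcript. A minimal sketch of the kind of buggy function meant here, assuming the names sum_list and sum_lst (taken from the pdb session shown later):

    def sum_list(lst):
        """Intended to sum the elements of lst, but contains a logic bug."""
        sum_lst = 0
        for i in lst:
            sum_lst = i        # bug: overwrites instead of accumulating (should be sum_lst += i)
        return sum_lst

    print(sum_list([7, 2]))    # prints 2, not the intended 9 -- and no exception is raised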

Tips for Debugging - example
- We can add print commands in places we suspect can go wrong:
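The slide's screenshot is missing; a sketch of the same (assumed) buggy function with temporary print statements added inside the loop:

    def sum_list(lst):
        sum_lst = 0
        for i in lst:
            sum_lst = i                              # the suspected line
            print("i =", i, "sum_lst =", sum_lst)    # temporary debug print
        return sum_lst

    print(sum_list([7, 2]))    # the prints reveal that sum_lst is overwritten, not accumulated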

Tips for Debugging - example
- A breakpoint is a point in the program where execution is paused and we can inspect the current values of the variables.
- We can use the pdb module (the Python debugger) and create breakpoints throughout the program using the set_trace() function.
- At each breakpoint we can ask for the current variable values in stdout.
- We can continue execution using 'continue'.
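A minimal sketch of using pdb.set_trace() as a breakpoint inside the (assumed) buggy function; the exact placement of the breakpoint is an illustration, not the slide's original code:

    import pdb

    def sum_list(lst):
        sum_lst = 0
        for i in lst:
            pdb.set_trace()    # execution pauses here on every pass through the loop
            sum_lst = i
        return sum_lst

    print(sum_list([7, 2]))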

Tips for Debugging - example
A pdb session looks like this; the prompt shows the file, line number and function that are paused:

    > test.py(4)sum_list()
    -> for i in lst:
    (Pdb)

- Type a variable name (e.g. i or sum_lst) to print its current value.
- Type continue to run on to the next breakpoint.

Sorting
- A sorting algorithm is an algorithm that puts elements in a certain order:
  - Lexicographic
  - Numeric
  - Chronological
- Sorting is important since many search algorithms are much more efficient (lower running-time complexity) when running on a sorted input.
- We will talk about lists during the presentation, but everything applies to any type of sequence that keeps the order of its elements.

Example
- Find the youngest person
  - Unsorted: go over n rows; the worst case takes n steps.
- Find the person who is 10 years old
  - Unsorted: go over n rows; the worst case takes n steps.

Name    Age
Bart    10
Homer   37
Lisa    8
Marge   35
Meggie  1

Example
- Find the youngest person
  - Sorted (by age): take the first person in the table; takes 1 step.
- Find the person who is 10 years old
  - How long would it take on a sorted list?

Name    Age
Meggie  1
Lisa    8
Bart    10
Marge   35
Homer   37

Runtime Analysis
- It is like "counting" the number of operations needed, given an input of size n:
  - How many iterations or recursive calls
  - How many operations are done in each iteration or recursive call
- We will learn about it more deeply in the future.

Binary Search
- Given a sorted list and a key to be found:
  - Get the middle element.
  - If key == middle, we are done.
  - If key < middle, do a binary search on the left half of the list.
  - If key > middle, do a binary search on the right half of the list.

Binary Search - example
- The list: [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
- We want to find the index of 3 in the list:
  1. middle = 8
  2. 3 < 8, therefore we will look for it in the left half: [1, 1, 2, 3, 5]
  3. Search again on the updated list.

Binary Search - find the index of 3 (diagram: the list with the middle element marked at each step)

Binary Search – implementation (1)
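The implementation screenshot is not in the transcript; a recursive sketch that follows the algorithm described above:

    def binary_search(lst, key, low=0, high=None):
        """Recursive binary search on a sorted list; returns an index of key, or None."""
        if high is None:
            high = len(lst) - 1
        if low > high:
            return None                                          # key is not in the list
        middle = (low + high) // 2
        if lst[middle] == key:
            return middle
        if key < lst[middle]:
            return binary_search(lst, key, low, middle - 1)      # search the left half
        return binary_search(lst, key, middle + 1, high)         # search the right half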

Binary Search – implementation (2)
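The second implementation slide is also missing; an iterative sketch of the same search:

    def binary_search_iterative(lst, key):
        """Iterative binary search on a sorted list; returns an index of key, or None."""
        low, high = 0, len(lst) - 1
        while low <= high:
            middle = (low + high) // 2
            if lst[middle] == key:
                return middle
            if key < lst[middle]:
                high = middle - 1            # continue in the left half
            else:
                low = middle + 1             # continue in the right half
        return None

    print(binary_search_iterative([1, 1, 2, 3, 5, 8, 13, 21, 34, 55], 3))   # 3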

Binary Search – How long does it take?
- In each iteration (or each recursive call) the search is done on a list that is half the size of the previous one.
- Assume the list is of size n:
  - After the first iteration the list is of size n/2.
  - After the second iteration the list is of size (n/2)*(1/2) = n/2^2.
  - After k iterations the list is of size n/2^k.
- The algorithm is done when the list is of size 1, therefore n/2^k = 1. Taking log: k = log(n).

Sorting
Motivation:
- In many applications we would like to retrieve information from a given input, for example to get the grades of a student.
- Given a list of data of size n:
  - Searching an unsorted list requires n steps.
  - Searching a sorted list requires log(n) steps when we use binary search.

Stable Sorting
- A sorting algorithm is considered stable if the initial order of equal items is preserved.
- That means that if two items are equal and one came before the other in the input, then this order is kept in the output.

Stable Sorting
- A stable sort will always return the same solution (permutation) on the same input.

Stable Sorting
- (Card example) The order of the two 5 cards is preserved.
- Actually, the items are sorted by a certain key but have more than one comparable property.

Stable Sorting
- We can implement unstable algorithms as stable ones.
- For example, we can add to each element its original index as an additional property.
- This, of course, would cost more memory.
- Stable algorithms usually have higher CPU and/or memory usage than unstable algorithms.

Bogo Sort
- Also called "random sort", "slow sort", or "stupid sort".
- Simply shuffle the given collection randomly until it is sorted.

Bogo Sort - implementation
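The slide's code is not in the transcript; a minimal sketch of bogo sort:

    import random

    def is_sorted(lst):
        """Check whether lst is in non-decreasing order."""
        return all(lst[i] <= lst[i + 1] for i in range(len(lst) - 1))

    def bogo_sort(lst):
        """Shuffle the list randomly until it happens to be sorted (in place)."""
        while not is_sorted(lst):
            random.shuffle(lst)
        return lst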

Bogo Sort Runtime Analysis
- How long would it take?
  - 1 iteration if we are lucky.
  - About n! iterations on average.
  - An infinite number of iterations in the worst case.
- Is it stable? No.

Bubble Sort
- The idea: bubble up the largest element.
- Iteration 1: compare each element with its neighbor to the right, from index 0 to index n-1.
  - If they are out of order, swap them.
  - At the end of this iteration, the largest element is at the end of the list.
  - If there were no swaps after going over all the elements, we are done.

Bubble Sort
- Iteration 2: compare each element with its neighbor to the right, from index 0 to index n-2.
- ...
- Continue until there are no more elements to swap.

Bubble Sort - example

Bubble Sort - implementation
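The implementation screenshot is missing; a sketch that matches the description above, including the early exit when no swaps occur:

    def bubble_sort(lst):
        """In-place bubble sort; the largest remaining element 'bubbles up' each pass."""
        n = len(lst)
        for i in range(n):
            swapped = False
            for j in range(n - 1 - i):              # the last i elements are already in place
                if lst[j] > lst[j + 1]:
                    lst[j], lst[j + 1] = lst[j + 1], lst[j]
                    swapped = True
            if not swapped:                         # no swaps: the list is already sorted
                break
        return lst

    print(bubble_sort([5, 1, 4, 2, 8]))             # [1, 2, 4, 5, 8]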

Bubble Sort Runtime Analysis (I)
- For a list of size n, the outer loop does n iterations.
- In each iteration i of the outer loop, the inner loop does n-i iterations: (n-1) + (n-2) + ... + 1 = n(n-1)/2.
- Note that each operation within the inner loop (comparing and swapping) takes constant time c.
- The number of steps is a polynomial of 2nd degree (n^2).

Bubble Sort Runtime Analysis (II)
- What would be the number of steps in the best case (i.e., the input list is already sorted)?
- Start with the first iteration, going over the whole list.
- The list is sorted, so there are no swaps.
- Only n steps are required.

Bubble Sort
- Is it stable? Yes; we do not swap elements of equal value.
- It is relatively inefficient, so we would probably use it only in cases where we know the input is nearly sorted.

Quick Sort
- Choose an element from the list, called the pivot.
- Partition the list:
  - All elements < pivot will be on the left.
  - All elements ≥ pivot will be on the right.
- Recursively call the quicksort function on each part of the list.

Quick Sort - implementation

Quick Sort – implementation (II)

Quick Sort – implementation (III)
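The three implementation slides are not in the transcript; a compact sketch that partitions with list comprehensions (the original course code may have partitioned in place instead):

    def quick_sort(lst):
        """Return a new sorted list; the pivot is the first element."""
        if len(lst) <= 1:
            return lst
        pivot = lst[0]
        smaller = [x for x in lst[1:] if x < pivot]               # goes to the left of the pivot
        greater_or_equal = [x for x in lst[1:] if x >= pivot]     # goes to the right of the pivot
        return quick_sort(smaller) + [pivot] + quick_sort(greater_or_equal)

    print(quick_sort([493, 812, 715, 710, 195, 437]))             # [195, 437, 493, 710, 715, 812]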

Quick Sort – Runtime Analysis (I)
- On each level of the recursion, we go over lists that contain a total of n elements: about n steps at each level.

Quick Sort – Runtime Analysis (II)
- How many levels are there? It depends on the pivot value.
- Let's say we choose the median value each time.
- Each time the list size is divided in half: n/2, n/4, ..., 1.
- There will be log(n) levels, and each takes n steps.
- It would take about n·log(n) steps.

Quick Sort – Runtime Analysis (III)
- Let's say we choose an extreme value (smallest or largest) each time; this is unlikely.
- Each time we get one list of size 1 and one of size n-1: n-1, n-2, ..., 1.
- There will be n levels, and each takes n steps.
- It would take about n^2 steps.

Quick Sort
- The efficiency depends on the pivot choice.
- Is it stable? No, since the partition step reorders elements within a partition.

Find Duplicate Values
- We would like to find duplicate values in a given list. It can be done with two approaches:
  1. Compare all possible pairs.
  2. Sort the list and then search for neighboring elements with the same value.

Find Duplicate Values (1)
- How long does it take?
  - The outer loop iterates n times.
  - For each outer iteration i, the inner loop runs i times: 1 + 2 + ... + (n-1) = n(n-1)/2.
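The code for approach 1 is not in the transcript; a sketch in which, for outer index i, the inner loop runs i times as described above:

    def has_duplicates_pairs(lst):
        """Approach 1: compare every pair of elements (about n*(n-1)/2 comparisons)."""
        for i in range(len(lst)):
            for j in range(i):             # inner loop runs i times
                if lst[i] == lst[j]:
                    return True
        return False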

Find Duplicate Values (2)
- How long does that take?
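The code for approach 2 is not in the transcript either; a sketch that sorts first (using the built-in sorted here, while the slides analyze quicksort) and then scans neighboring elements:

    def has_duplicates_sorted(lst):
        """Approach 2: sort, then look for equal neighbors in a single pass."""
        sorted_lst = sorted(lst)           # the slides use quicksort for this step
        for i in range(len(sorted_lst) - 1):
            if sorted_lst[i] == sorted_lst[i + 1]:
                return True
        return False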

Find Duplicate Values (2)
- Quicksort takes n·log(n) steps on average.
- Make one pass over the sorted data: n steps.
- In total: n + n·log(n) steps.
- This is much more efficient than ~n^2 steps.

Radix Sort
- Sorts integers by grouping keys by the individual digits that share the same significant position and value.
- Assumption: the input numbers have d digits, each ranging from 0 to k. For each position there is a finite number of possible digits, depending on the base of the number (for decimal representation there are 10).

Radix Sort (assume decimal representation)
- Divide all the integers into 10 groups according to the least significant digit (why?).
- For two numbers with the same digit, keep the original order.
- Repeat the above steps with the next least significant digit.

Radix Sort
Input list: [493, 812, 715, 710, 195, 437, 582, 340, 385]
Divide by the least significant digit:

digit  sublist
0      710, 340
2      812, 582
3      493
5      715, 195, 385
7      437

Result: [710, 340, 812, 582, 493, 715, 195, 385, 437]

Radix Sort
Current list: [710, 340, 812, 582, 493, 715, 195, 385, 437]
Divide by the second least significant digit:

digit  sublist
1      710, 812, 715
3      437
4      340
8      582, 385
9      493, 195

Result: [710, 812, 715, 437, 340, 582, 385, 493, 195]

Radix Sort
Current list: [710, 812, 715, 437, 340, 582, 385, 493, 195]
Divide by the third least significant (and last) digit:

digit  sublist
1      195
3      340, 385
4      437, 493
5      582
7      710, 715
8      812

Result: [195, 340, 385, 437, 493, 582, 710, 715, 812] – and we are done!

Radix Sort - implementation
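The implementation screenshot is missing; an LSD radix-sort sketch that follows the bucket tables above (decimal base by default):

    def radix_sort(lst, base=10):
        """LSD radix sort for non-negative integers, using 'base' buckets per pass."""
        if not lst:
            return lst
        exp = 1                                            # digit weight: 1, base, base**2, ...
        while max(lst) // exp > 0:
            buckets = [[] for _ in range(base)]
            for num in lst:
                buckets[(num // exp) % base].append(num)   # appending keeps each pass stable
            lst = [num for bucket in buckets for num in bucket]
            exp *= base
        return lst

    print(radix_sort([493, 812, 715, 710, 195, 437, 582, 340, 385]))
    # [195, 340, 385, 437, 493, 582, 710, 715, 812]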

Radix Sort Runtime Analysis
- Notation:
  - n: the number of elements in the input
  - d: the number of digits in the largest number
  - k: the base of the numbers
- The outer loop iterates d times.
- In each iteration, we go over n elements and rebuild a list from k buckets.
- It would take about d*(n+k) steps.
- For d and k small enough, this is linear runtime.

Merge Sort
- A divide-and-conquer algorithm:
  - Recursively break down a problem into two or more sub-problems of the same (or a related) type, until these become simple enough to be solved directly.
  - The solutions to the sub-problems are then combined to give a solution to the original problem.

Merge Sort
- Divide the unsorted list into n lists, each containing 1 element.
- Repeatedly merge sublists to produce new sorted sublists until only 1 sublist remains.

Merge Sort (diagram: split phase followed by merge phase)

Merge Sort - implementation
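The implementation screenshot is missing; a sketch of merge sort with a separate merge helper:

    def merge_sort(lst):
        """Return a new sorted list using merge sort."""
        if len(lst) <= 1:
            return lst
        mid = len(lst) // 2
        left = merge_sort(lst[:mid])
        right = merge_sort(lst[mid:])
        return merge(left, right)

    def merge(left, right):
        """Merge two sorted lists into one sorted list."""
        result = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:        # <= keeps equal elements in order (stable)
                result.append(left[i])
                i += 1
            else:
                result.append(right[j])
                j += 1
        result.extend(left[i:])            # one of these two tails is already empty
        result.extend(right[j:])
        return result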

Merge Sort Runtime Analysis (I)
- How many times is merge called?
  - The last level of merge works on lists of size n/2.
  - At the previous level, the lists are of size (n/2)*(1/2) = n/2^2.
  - ...
  - The first call is on lists of size 1.
- Assume we have k levels of merge calls; we stop when the list size is 1: n/2^k = 1. Taking log: k = log(n).

Merge Sort Runtime Analysis (II)
- In each merge call, we iterate len(right) + len(left) times.
- Each level of merging processes a total of n elements (the number of elements at that level).
- In total: we have log(n) levels of merge, and each level takes n steps.
- It would take about n·log(n) steps.

Merge Sort Runtime Analysis (III) (recursion-tree diagram: log(n) levels, n operations on each level)

Insertion Sort
- Similar to bubble sort, we go over the list n times.
- Iteration 1: compare the first element with the second one.
  - If they are out of order, swap them.
  - At the end of this iteration, the sub-list of the first two elements is sorted.
- Iteration 2: compare the third element with the elements to its left, from index 1 down to index 0.
- ...
- Continue until the end of the list.

Insertion Sort (diagram labels: the sorted prefix, the next element to be inserted, temp, elements less than 10)

Insertion Sort - implementation
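The implementation screenshot is missing; an in-place sketch that matches the temp/shift picture in the diagram above:

    def insertion_sort(lst):
        """In-place insertion sort: grow a sorted prefix one element at a time."""
        for i in range(1, len(lst)):
            temp = lst[i]                  # the next element to be inserted
            j = i - 1
            while j >= 0 and lst[j] > temp:
                lst[j + 1] = lst[j]        # shift larger elements one slot to the right
                j -= 1
            lst[j + 1] = temp              # insert into its place in the sorted prefix
        return lst

    print(insertion_sort([5, 1, 4, 2, 8]))     # [1, 2, 4, 5, 8]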

Insertion Sort Runtime Analysis
- For a list of size n, the outer loop does n iterations.
- In each iteration i of the outer loop, the inner loop does i iterations in the worst case: 1 + 2 + ... + (n-2) + (n-1) = n(n-1)/2.
- Note that each operation within the inner loop (comparing and swapping) takes constant time c.
- The number of steps is a polynomial of 2nd degree (n^2).
- And what about the best case (i.e., the list is already sorted)? Only n steps.

What have we seen?

Algorithm        Best Case Runtime   Worst Case Runtime   Stable
Bogo sort        n                   infinite             no
Bubble sort      n                   n^2                  yes
Insertion sort   n                   n^2                  yes
Merge sort       n·log(n)            n·log(n)             yes
Quick sort       n·log(n)            n^2                  no
Radix sort       d*(n+k)             d*(n+k)              yes