
1
Sorting Techniques

2
Bubble sort: One of the simplest sorting methods. The basic idea treats each record as having a weight. The records are kept in an array held vertically, and light records bubble up to the top: we make repeated passes over the array from bottom to top, and whenever two adjacent elements are out of order (the lighter one is below), we swap them. (Internal sorting model)

3
Bubble sort: The overall effect is that after the first pass the lightest record has bubbled all the way to the top. On the second pass, the second lightest rises to the second position, and so on. On the second pass we need not try bubbling into the first position, because we know the lowest key is already there. (Internal sorting model)

4
Example: the slide traces an 8-element array through the 1st, 2nd, and 3rd passes; after the k-th pass the k smallest keys occupy the first k positions.

for i := 1 to n-1 do
    for j := i+1 to n do
        if A[j] < A[i] then swap(A[j], A[i])

5
Insertion Sort: On the i-th pass we insert the element A[i] into its rightful place among A[1], A[2], ..., A[i-1], which were placed in sorted order on earlier passes. After this insertion, A[1], A[2], ..., A[i] are in sorted order. (Internal sorting model)

6
Insertion Sort: think of sorting a hand of cards (A = 3 6 8 4 9 7 2 5 1).
Strategy:
Start empty-handed.
Insert each card into the right position of the already sorted hand.
Continue until all cards are inserted (sorted).

for j = 2 to length(A) do
    key = A[j]
    // insert A[j] into the sorted sequence A[1..j-1]
    i = j - 1
    while i > 0 and A[i] > key do
        A[i+1] = A[i]
        i--
    A[i+1] = key

7
Insertion Sort

8
Time taken by Insertion Sort depends on the input: sorting 1000 numbers takes longer than sorting 3 numbers, and it can take different amounts of time to sort two input sequences of the same size. In general, the time taken by an algorithm grows with the size of the input, so we describe the running time of a program as a function of the size of its input.

9
Analysis of Algorithms. Efficiency: running time and space used. Efficiency is measured as a function of input size: the number of data elements (numbers, points), or the number of bits in an input number.

10
Analysis of Insertion Sort

1. for j = 2 to length(A)                           cost c1, times n
2.     key = A[j]                                   cost c2, times n-1
3.     // insert A[j] into the sorted A[1..j-1]     cost c3 = 0, times n-1
4.     i = j-1                                      cost c4, times n-1
5.     while i > 0 and A[i] > key                   cost c5, times Σ(j=2..n) t_j
6.         A[i+1] = A[i]                            cost c6, times Σ(j=2..n) (t_j - 1)
7.         i--                                      cost c7, times Σ(j=2..n) (t_j - 1)
8.     A[i+1] = key                                 cost c8, times n-1

This lets us compute the running time as a function of the input size.

11
Insertion-Sort Running Time
T(n) = c1·n + c2(n-1) + c3(n-1) + c4(n-1) + c5·Σ(j=2..n) t_j + c6·Σ(j=2..n)(t_j - 1) + c7·Σ(j=2..n)(t_j - 1) + c8(n-1)
c3 = 0, of course, since line 3 is a comment.

12
Best/Worst/Average Case. Best case: elements already sorted; t_j = 1, running time = f(n), i.e., linear time. Worst case: elements sorted in inverse order; t_j = j, running time = f(n²), i.e., quadratic time. Average case: t_j ≈ j/2, running time = f(n²), i.e., quadratic time.

13
Best Case Result. Occurs when the array is already sorted: for each j = 2, 3, ..., n we find that A[i] <= key in line 5 when i has its initial value j-1.
T(n) = c1·n + (c2 + c4)(n-1) + c5(n-1) + c8(n-1)
     = n(c1 + c2 + c4 + c5 + c8) + (-c2 - c4 - c5 - c8)
     = c9·n + c10
     = f1(n¹) + f2(n⁰)

14
Worst Case T(n). Occurs when the loop of lines 5-7 is executed as many times as possible, which is when A[] is in reverse sorted order. key is A[j] (line 2); i starts at j-1 (line 4); i goes down to 0 (line 7). So t_j for lines 5-7 is [(j-1) - 0] + 1 = j; the 1 at the end counts the final test that fails, causing exit from the loop.

15
Contd. T(n) = c1·n + c2(n-1) + c4(n-1) + c5·Σ(j=2..n) j + c6·Σ(j=2..n)(j-1) + c7·Σ(j=2..n)(j-1) + c8(n-1)

16
Contd. T(n) = c1·n + c2(n-1) + c4(n-1) + c8(n-1) + c5·Σ(j=2..n) j + c6·Σ(j=2..n)(j-1) + c7·Σ(j=2..n)(j-1)
     = c9·n + c10 + c5·Σ(j=2..n) j + c11·Σ(j=2..n)(j-1)

17
Contd. T(n) = c9·n + c10 + c5·Σ(j=2..n) j + c11·Σ(j=2..n)(j-1)
But Σ(j=2..n) j = n(n+1)/2 - 1, so
Σ(j=2..n)(j-1) = Σ(j=2..n) j - Σ(j=2..n) 1
             = [n(n+1)/2 - 1] - (n - 2 + 1)
             = n(n+1)/2 - 1 - n + 1
             = n(n+1)/2 - n
             = [n(n+1) - 2n]/2
             = n(n+1-2)/2
             = n(n-1)/2

18
Contd. In conclusion,
T(n) = c9·n + c10 + c5·[n(n+1)/2 - 1] + c11·n(n-1)/2
     = c12·n² + c13·n + c14
     = f1(n²) + f2(n¹) + f3(n⁰)

19
Selection Sort. Given an array of length n:
Search elements 0 through n-1 and select the smallest; swap it with the element in location 0.
Search elements 1 through n-1 and select the smallest; swap it with the element in location 1.
Search elements 2 through n-1 and select the smallest; swap it with the element in location 2.
Search elements 3 through n-1 and select the smallest; swap it with the element in location 3.
Continue in this fashion until there's nothing left to search.

20
Example and analysis of Selection Sort. The Selection Sort might swap an array element with itself; this is harmless.
7 2 8 5 4
2 7 8 5 4
2 4 8 5 7
2 4 5 8 7
2 4 5 7 8

21
Code for Selection Sort

void selectionSort(int a[], int size) {
    int outer, inner, min;
    for (outer = 0; outer < size - 1; outer++) {   // outer counts up
        min = outer;
        for (inner = outer + 1; inner < size; inner++) {
            if (a[inner] < a[min]) {
                min = inner;
            }
        }
        // a[min] is the least among a[outer]..a[size - 1]
        int temp = a[outer];
        a[outer] = a[min];
        a[min] = temp;
    }
}

22
Quick Sort: Based on the Divide and Conquer paradigm. One of the fastest in-memory sorting algorithms (if not the fastest), it is a very efficient sorting algorithm designed by C. A. R. Hoare in 1960. It consists of two phases: a partition phase and a sort phase.

23
Steps:
1. Divide: pick a pivot element and rearrange the array so that all elements to the left of the pivot are smaller than the pivot, and all elements to the right of the pivot are larger than the pivot.
2. Conquer: recursively quicksort the left and right subarrays.
3. Combine: since the subarrays are sorted in place, no work is needed to combine them; the array is now sorted.

24
Quicksort. Initial step: the first partition. Then sort the left and right partitions in the same way.

25
Quicksort – Partition phase. Goals:
Select a pivot value.
Move everything less than the pivot value to the left of it.
Move everything greater than the pivot value to the right of it.

26
Algorithm: Quick Sort

Procedure QuickSort(A, l, r)
    if (r > l) then
        j = partition(A, l, r)
        QuickSort(A, l, j - 1)
        QuickSort(A, j + 1, r)
    end if

27
Partition example:
A (before): 9 8 2 3 88 34 5 10 11 0     (l = first index, r = last index)
A (after):  5 8 2 3 0 9 34 11 10 88     (pivot 9 now at position j)

28
Algorithm: Partition

Function Partition(A, l, r)
    v = A[l]; i = l; j = r
    while i < j
        while (A[i] <= v && i < r) i++
        while (A[j] > v) j--
        if (i < j) then swap(A[i], A[j])
    A[l] = A[j]; A[j] = v
    return j

There are various algorithms for partition. The one above is the most popular because it does not need an extra array: only the 3 variables v, i, and j.

29
Trace of Partition (v = A[l] = 9):
9 8 2 3 88 34 5 10 11 0      i stops at 88, j stops at 0; swap them
9 8 2 3 0 34 5 10 11 88      i stops at 34, j stops at 5; swap them

30
9 8 2 3 0 5 34 10 11 88      i and j have crossed; out of the outer while loop
5 8 2 3 0 9 34 10 11 88      A[l] = A[j]; A[j] = v; return j (pivot 9 is in its final position)

31
Time Complexity. When the partition splits the array roughly in half, the recurrence relation is T(n) = 2T(n/2) + n. Applying case 2 of the Master Theorem, the time complexity is O(n log n).
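The Master Theorem step can be written out explicitly (with a = 2, b = 2, f(n) = n):

```latex
T(n) = 2\,T(n/2) + n, \qquad a = 2,\; b = 2,\; f(n) = n.
% Compare f(n) with n^{\log_b a} = n^{\log_2 2} = n^{1} = n.
% Since f(n) = \Theta(n^{\log_b a}), case 2 of the Master Theorem applies:
T(n) = \Theta\!\left(n^{\log_b a} \log n\right) = \Theta(n \log n).
```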

32
The worst case is when the input is already sorted, e.g. 1 2 3 4 5 6 7 8: since the first element is always chosen as the pivot, every partition splits off only one element, and the running time becomes quadratic.

33
Randomized Quick Sort. Either randomize the input data before giving it to Quick Sort, OR, in the partition phase, rather than choosing the first value as the pivot value, choose a RANDOM value as the pivot value. This makes Quick Sort's running time independent of the input ordering, so the worst case won't happen for SORTED inputs; it can only happen through bad luck of the random number generator.
