
1 ALGORITHM By D.Kumaragurubaran Adishesh Pant

2 Agenda: Introduction, Big O notation, Simple algorithms, Complex algorithms

3 Introduction An algorithm is a sequence of instructions that can be followed to perform a specific task, such as calculation or data processing.

4 Big O Notation A shorthand way to say how efficient a computer algorithm is; in computer science, this rough measure is called "Big O" notation. For example, automobiles are divided by size into several categories: subcompacts, compacts, midsize, and so on. These categories give a quick idea of what size car you're talking about without needing to mention actual dimensions.

5 Why a shorthand way? "Algorithm A is twice as fast as algorithm B" - is that meaningful? Not really, because the proportion can change radically as the number of items changes. The idea behind Big O notation isn't to give actual figures for running times but to convey how the running times are affected by the number of items. Other factors that affect the running time, such as the speed of the microprocessor and how efficiently the compiler has generated the program code, are absorbed into a constant factor K.

6 Invariants An invariant is a condition that remains unchanged while the algorithm runs. Recognizing invariants can be useful in understanding the algorithm. In certain situations they may also be helpful in debugging: you can repeatedly check that the invariant is true, and signal an error if it isn't.
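For example, a minimal sketch of such a check (the helper name and the sorted-prefix invariant are illustrative assumptions, not from the slides): during an insertion-style sort, the elements a[0..out-1] should be in order at the start of every outer-loop pass.

static void checkSortedPrefix(long[] a, int out)
{
    // verify that a[0..out-1] is sorted; signal an error if it isn't
    for(int i = 1; i < out; i++)
        if(a[i-1] > a[i])
            throw new IllegalStateException("invariant violated at index " + i);
}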

7 Algorithm Running Time
Linear search: O(N)
Binary search: O(log N)
Insertion in unordered array: O(1)
Insertion in ordered array: O(N)
Deletion in unordered array: O(N)
Deletion in ordered array: O(N)

8 Search Algorithms
Linear search: search time is proportional to N. On average, half the items must be examined, so T = K * N / 2, which simplifies to T = K * N.
Binary search: search time is proportional to log(N). Binary search uses the same strategy as a children's guessing game, halving the remaining range with each guess, so T = K * log2(N), which simplifies to T = K * log(N).
Here N is the total number of items, T is the search time, and K is a constant.
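As an illustration (not code from the slides), a minimal binary search over a sorted int[]; each probe halves the remaining range, which is where the log2(N) factor comes from.

static int binarySearch(int[] a, int key)
{
    int low = 0, high = a.length - 1;
    while(low <= high)
    {
        int mid = (low + high) >>> 1;      // midpoint (avoids int overflow)
        if(a[mid] == key)
            return mid;                    // found
        else if(a[mid] < key)
            low = mid + 1;                 // search the upper half
        else
            high = mid - 1;                // search the lower half
    }
    return -1;                             // not found
}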

9 Sorting Algorithms Sorting is the process of rearranging a set of items in a specific order. Why sorting? Sorting is essential in data processing so that applications can access the data more efficiently; examples include "Order By" and "Group By" operations. A sorting algorithm is an algorithm whose purpose is to rearrange items in a specific order.

10 Sorting Categories
Sorting by Exchange - Bubble sort, Quicksort
Sorting by Insertion - Insertion sort, Shellsort
Sorting by Selection - Selection sort, Heapsort
Sorting by Merging - Merge sort
Sorting by Distribution - Radix sort

11 Complexity The following are the fundamental operations that take place during sorting:
Comparison of two keys
Interchange of records
Assignment of a record to a temporary location
The complexity of a sorting algorithm measures the running time as a function of n, the number of records sorted. It is proportional to the number of comparisons (a comparison sees only two keys at a time and has no picture of the data as a whole).
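A minimal sketch of how these operations could be counted (class and method names are assumptions, not from the slides); any of the sorts in these slides can be expressed through such compare/swap helpers, so counting their calls measures the algorithm's complexity.

class OpCounter
{
    long comparisons = 0, swaps = 0;
    long[] a;                              // the records being sorted

    OpCounter(long[] a) { this.a = a; }

    boolean greater(int i, int j)          // comparison of two keys
    {
        comparisons++;
        return a[i] > a[j];
    }

    void swap(int i, int j)                // interchange of records
    {
        swaps++;
        long temp = a[i];                  // assignment to a temporary location
        a[i] = a[j];
        a[j] = temp;
    }
}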

12 Simple Algorithms The bubble sort, selection sort, and insertion sort can be classified as simple algorithms. They use two steps, executed over and over until the data is sorted: compare two items; swap two items, or copy one item.

13 Bubble Sort Notoriously slow, but conceptually the simplest of the sorting algorithms - a good beginning for our exploration. Imagine lining up players by height. Here are the rules you're following: 1. Compare two players. 2. If the one on the left is taller, swap them. 3. Move one position right. You continue down the line this way until you reach the right end. You have by no means finished sorting the players, but you do know that the tallest player is on the right.

14 Cont.. This is why it's called the bubble sort: as the algorithm progresses, the biggest items "bubble up" to the top end of the array. After this first pass through all the data, you've made N-1 comparisons and somewhere between 0 and N-1 swaps, depending on the initial arrangement of the data. The item at the end of the array is sorted and won't be moved again. Example: (5,6,3,2,4). First pass: (5,6,3,2,4) → (5,3,6,2,4) → (5,3,2,6,4) → (5,3,2,4,6)

15 Sample code
public void bubbleSort()
{
    int out, in;
    for(out=nElems-1; out>0; out--)      // outer loop (backward)
        for(in=0; in<out; in++)          // inner loop (forward)
            if( a[in] > a[in+1] )        // out of order?
                swap(in, in+1);          // swap them
}  // end bubbleSort()
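The sample code on this slide and on slides 19 and 22 assumes it lives inside a simple array-wrapper class with a data array a, a count nElems, and a swap() helper. A minimal sketch of such a class follows (the names are assumptions, not shown on the slides).

class SortableArray
{
    private long[] a;                      // array of items
    private int nElems;                    // number of items currently stored

    public SortableArray(int max)
    {
        a = new long[max];
        nElems = 0;
    }

    public void insert(long value)         // put an item at the end
    {
        a[nElems++] = value;
    }

    private void swap(int one, int two)    // used by the sort methods
    {
        long temp = a[one];
        a[one] = a[two];
        a[two] = temp;
    }

    // bubbleSort(), insertionSort(), and selectionSort() from the slides
    // would be added as methods of this class.
}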

16 Efficiency A swap is not needed for every comparison, although in the worst case, with the initial data inversely sorted, a swap is necessary with every comparison. The outer loop executes about N times, and the inner loop executes up to N times for each cycle of the outer loop. Thus both comparisons and swaps are proportional to N², so the bubble sort runs in O(N²) time, which is slow.

17 Insertion Sort The insertion sort is preferable for small files and for almost-sorted files. It still executes in O(N²) time, but it's about twice as fast as the bubble sort and somewhat faster than the selection sort in normal situations. It's often used as the final stage of more sophisticated sorts, such as quicksort.

18 Insertion Sort cont… Consider the partially sorted data (3,5,6,4,7): the sorted part is (3,5,6) and the unsorted part is (4,7). The first item of the unsorted part, 4, is removed and stored in a temporary variable, then compared against the sorted part from the right. Each sorted item larger than 4 is shifted one position to the right; when an item smaller than 4 is reached, 4 is inserted just to its right.
1st step: (3,5,_,6,7)
2nd step: (3,_,5,6,7)
3rd step: (3,4,5,6,7)

19
public void insertionSort()
{
    int in, out;
    for(out=1; out<nElems; out++)        // out is dividing line
    {
        long temp = a[out];              // remove marked item
        in = out;                        // start shifts at out
        while(in>0 && a[in-1] >= temp)   // until one is smaller,
        {
            a[in] = a[in-1];             // shift item right,
            --in;                        // go left one position
        }
        a[in] = temp;                    // insert marked item
    }  // end for
}  // end insertionSort()

20 Selection Sort The selection sort improves on the bubble sort by reducing the number of swaps from O(N²) to O(N). Unfortunately, the number of comparisons remains O(N²). However, the selection sort can still offer a significant improvement for large records that must be physically moved around in memory, where the swap time matters much more than the comparison time.

21 Selection cont… The selection sort makes a pass through all the data and picks (or selects, hence the name of the sort) the smallest item. This smallest item is then swapped with the item at the left end of the array, at position 0. Now the leftmost item is sorted and won't need to be moved again. Notice that in this algorithm the sorted items accumulate on the left (lower indices), whereas in the bubble sort they accumulated on the right. On the next pass through the data, you start at position 1 and, finding the minimum of the remaining items, swap it into position 1. This process continues until all the items are sorted.

22 Example
public void selectionSort()
{
    int out, in, min;
    for(out=0; out<nElems-1; out++)      // outer loop
    {
        min = out;                       // minimum
        for(in=out+1; in<nElems; in++)   // inner loop
            if( a[in] < a[min] )         // if a[min] is greater,
                min = in;                // we have a new min
        swap(out, min);                  // swap them
    }  // end for
}  // end selectionSort()

23 Selection Sort Is an O(N²) sorting algorithm; a simple algorithm.

24 Algorithm There are two lists: an input list and an output list.
1. Find the minimum value in the input list and place it in the output list.
2. Replace that value in the input list with ∞.
3. Repeat steps 1 and 2 until all the values in the input list have been replaced with ∞.
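A direct sketch of these steps in Java (illustrative; Long.MAX_VALUE stands in for ∞, so keys are assumed to be smaller than that sentinel).

static long[] selectionSortTwoLists(long[] input)
{
    long[] in = input.clone();             // working copy of the input list
    long[] out = new long[in.length];      // output list
    for(int k = 0; k < out.length; k++)
    {
        int minIndex = 0;
        for(int i = 1; i < in.length; i++) // find the minimum value
            if(in[i] < in[minIndex])
                minIndex = i;
        out[k] = in[minIndex];             // place it in the output list
        in[minIndex] = Long.MAX_VALUE;     // replace it with "infinity"
    }
    return out;
}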

25 Optimisations The two lists can be avoided by doing the work in place on the input list. Instead of placing ∞, the minimum value is swapped with the first element of the list. The next search for the minimum begins at the second element, and the minimum found is swapped into the second position. The next search starts at the third element, and so on…

26 Tree Selection Sort A tournament picks the best player, but the runner-up will not always be the second best.

27 Negative weights: replacing the strongest player with the weakest one gives us the actual second-best player.

28 After 7 steps
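A sketch of the tournament idea from slides 26-28 in Java (illustrative, not the slides' code). Here the tournament selects the smallest key, and Long.MAX_VALUE plays the role of the "weakest player" that replaces each winner, so only the winner's path up the tree has to be replayed.

class TreeSelectionSort
{
    // Leaves hold the keys; each internal node stores the index of the
    // winning (smaller) leaf below it.
    static void sort(long[] a)
    {
        int n = a.length;
        if(n < 2) return;
        int size = 1;
        while(size < n) size *= 2;                  // round up to a power of two
        long[] key = new long[2 * size];            // leaf keys, padded with sentinels
        int[] win = new int[size];                  // winner per internal node
        java.util.Arrays.fill(key, Long.MAX_VALUE);
        for(int i = 0; i < n; i++) key[size + i] = a[i];
        for(int node = size - 1; node >= 1; node--) // play the initial tournament
            win[node] = playMatch(key, win, node, size);
        for(int out = 0; out < n; out++)
        {
            int w = win[1];                         // overall winner = current minimum
            a[out] = key[w];
            key[w] = Long.MAX_VALUE;                // replace the winner with the "weakest"
            for(int node = w / 2; node >= 1; node /= 2)  // replay only the winner's path
                win[node] = playMatch(key, win, node, size);
        }
    }

    // Decide the winner of the match at an internal node; a child index below
    // 'size' is another internal node, whose stored winner is reused.
    private static int playMatch(long[] key, int[] win, int node, int size)
    {
        int left  = node * 2     < size ? win[node * 2]     : node * 2;
        int right = node * 2 + 1 < size ? win[node * 2 + 1] : node * 2 + 1;
        return key[left] <= key[right] ? left : right;
    }
}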

29 Finally A simple but inefficient algorithm
Faster than bubble sort but slower than insertion sort. All the elements should be available before sorting can begin.

30 Merge Sort Merge sort is an O(n log n) sorting algorithm.
Well suited to sequential access, such as sorting data on tape drives. A divide-and-conquer approach.

31 Merging sorted lists One of the basic building blocks of the merge sort algorithm is merging two sorted lists.
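A minimal sketch of this building block (illustrative; the method name mergeSortedLists is an assumption, not from the slides).

static long[] mergeSortedLists(long[] a, long[] b)
{
    long[] result = new long[a.length + b.length];
    int i = 0, j = 0, k = 0;
    while(i < a.length && j < b.length)            // take the smaller head each time
        result[k++] = (a[i] <= b[j]) ? a[i++] : b[j++];
    while(i < a.length) result[k++] = a[i++];      // copy what is left of a
    while(j < b.length) result[k++] = b[j++];      // copy what is left of b
    return result;
}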

32 Implementations In Python

33 Java:
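An illustrative recursive merge sort sketch (not the slide's original listing), built on the mergeSortedLists sketch shown under slide 31.

static long[] mergeSort(long[] a)
{
    if(a.length <= 1)                              // base case: already sorted
        return a;
    int mid = a.length / 2;
    long[] left  = mergeSort(java.util.Arrays.copyOfRange(a, 0, mid));        // divide,
    long[] right = mergeSort(java.util.Arrays.copyOfRange(a, mid, a.length)); // sort each half
    return mergeSortedLists(left, right);          // conquer: merge the sorted halves
}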

34 Unsorted lists Extending merge sort to unsorted lists.
Identifying already-sorted sublists within an unsorted list.

35 An unsorted list can always be divided into sorted sublists, because a single element is always sorted.

36 Once the unsorted list has been divided into sorted sublists, apply the merge-sorted-lists step to merge pairs of sorted sublists into single sorted lists. Repeat this step until all the sorted sublists have been merged into one sorted list.
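A sketch of these two steps, sometimes called a natural merge (illustrative names; it reuses the mergeSortedLists sketch from slide 31 and a queue of runs).

static long[] naturalMergeSort(long[] input)
{
    java.util.Deque<long[]> runs = new java.util.ArrayDeque<>();
    int start = 0;
    for(int i = 1; i <= input.length; i++)
        if(i == input.length || input[i] < input[i - 1])   // a sorted run ends here
        {
            runs.add(java.util.Arrays.copyOfRange(input, start, i));
            start = i;
        }
    if(runs.isEmpty())
        return new long[0];
    while(runs.size() > 1)                                 // merge sublists pairwise
        runs.add(mergeSortedLists(runs.poll(), runs.poll()));
    return runs.poll();
}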

37 There are many variants of merge sort, such as k-way merge sorts.
Merge sort is faster than insertion sort and bubble sort.
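For illustration, a sketch of a k-way merge using a priority queue of per-list cursors (names are assumptions, not from the slides).

static long[] kWayMerge(long[][] lists)
{
    int total = 0;
    for(long[] list : lists) total += list.length;
    long[] result = new long[total];
    // each queue entry is {list index, position within that list},
    // ordered by the value the cursor currently points at
    java.util.PriorityQueue<int[]> heads = new java.util.PriorityQueue<>(
            (x, y) -> Long.compare(lists[x[0]][x[1]], lists[y[0]][y[1]]));
    for(int i = 0; i < lists.length; i++)
        if(lists[i].length > 0)
            heads.add(new int[]{i, 0});
    int k = 0;
    while(!heads.isEmpty())
    {
        int[] cur = heads.poll();                  // smallest current head value
        result[k++] = lists[cur[0]][cur[1]];
        if(cur[1] + 1 < lists[cur[0]].length)      // advance that list's cursor
            heads.add(new int[]{cur[0], cur[1] + 1});
    }
    return result;
}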

38 Visual Demo..

