Algorithm Design Techniques, Greedy Method – Knapsack Problem, Job Sequencing, Divide and Conquer Method – Quick Sort, Finding Maximum and Minimum, Dynamic Programming – Matrix Chain Multiplication.



Algorithm Design Techniques: Divide-and-Conquer, Prune-and-Search, Dynamic Programming, Greedy Algorithms.

Divide-and-Conquer. Essence of Divide and Conquer: Divide the problem into several smaller subproblems; normally the subproblems are similar to the original. Conquer the subproblems by solving them recursively; in the base case, solve small enough subproblems by brute force. Combine the solutions of the subproblems to obtain a solution to the original problem. Divide and Conquer algorithms are normally recursive.

Divide-and-Conquer – Quick Sort. QuickSort is a recursive Divide-and-Conquer sorting algorithm. Divide and Conquer is one of the best-known algorithmic techniques. Its philosophy is to divide the whole problem into smaller, manageable subproblems, solve these subproblems, and then combine the intermediate partial solutions. Famous examples that use the Divide and Conquer strategy include Binary Search, Merge Sort and Quick Sort.
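The divide/conquer/combine pattern can be sketched with merge sort in Python. This is a minimal illustration; the function name and the list-based merging are our own choices, not from the slides:

```python
def merge_sort(a):
    # Base case: solve trivially small instances by brute force.
    if len(a) <= 1:
        return a
    # Divide: split the problem into two subproblems of (nearly) equal size.
    mid = len(a) // 2
    # Conquer: solve each subproblem recursively.
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Combine: merge the two sorted halves into one sorted list.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([8, 3, 25, 6, 10, 17, 1, 2, 18, 5]))  # [1, 2, 3, 5, 6, 8, 10, 17, 18, 25]
```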

Basic Idea of QuickSort 1. Pick an element of the array as the pivot element. 2. Make a pass over the array, called the PARTITION step, which rearranges the elements so that: a. the pivot element is in its proper place; b. the elements less than the pivot are to its left; c. the elements greater than the pivot are to its right. 3. Recursively apply the above process to the parts left and right of the pivot.
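The three steps above can be sketched in Python. This is a minimal list-based version that copies data rather than partitioning in place (the in-place PARTITION step is described in the next slides); the first element serves as the pivot:

```python
def quicksort(a):
    # Base case: a list of 0 or 1 elements is already sorted.
    if len(a) <= 1:
        return a
    pivot = a[0]                               # step 1: pick a pivot element
    left = [x for x in a[1:] if x < pivot]     # step 2: elements less than the pivot
    right = [x for x in a[1:] if x >= pivot]   #         elements greater than (or equal to) it
    # step 3: recursively sort both parts; the pivot lands in its proper place between them
    return quicksort(left) + [pivot] + quicksort(right)

print(quicksort([8, 3, 25, 6, 10, 17, 1, 2, 18, 5]))  # [1, 2, 3, 5, 6, 8, 10, 17, 18, 25]
```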

QuickSort Step 1. Choosing the Pivot Element The choice of pivot can determine the complexity of the algorithm, i.e. whether it runs in n log n or quadratic time: a. Normally we choose the first, last or middle element as the pivot. This can hurt us badly, as the pivot might turn out to be the smallest or the largest element, leaving one of the partitions empty. b. A better choice is the median of the first, middle and last elements; for N elements, the middle element is the one at position ⌈N/2⌉. Example: 8, 3, 25, 6, 10, 17, 1, 2, 18, 5. First element: 8; middle element: 10; last element: 5. Therefore the median of [8, 10, 5] is 8, and 8 is chosen as the pivot.
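A short sketch of the median-of-three pivot choice. We take the middle element at 1-based position ⌈N/2⌉, which is index (N−1)//2 in 0-based Python; the function name is ours:

```python
def median_of_three(a):
    # Candidate pivots: first, middle (1-based position ceil(N/2)) and last element.
    first = a[0]
    middle = a[(len(a) - 1) // 2]
    last = a[-1]
    # The median is the middle value of the three candidates.
    return sorted([first, middle, last])[1]

print(median_of_three([8, 3, 25, 6, 10, 17, 1, 2, 18, 5]))  # 8 (median of 8, 10, 5)
```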

QuickSort Step 2. Partitioning a. First, get the pivot out of the way by swapping it with the last element. Example (using the array above): 5, 3, 25, 6, 10, 17, 1, 2, 18, 8 b. Now we want the elements greater than the pivot on its right and the elements less than the pivot on its left. For this we define two pointers, i and j, with i at the first index and j at the last index of the array. * While i is less than j, keep incrementing i until we find an element greater than the pivot. * Similarly, while j is greater than i, keep decrementing j until we find an element less than the pivot. * When both i and j have stopped, swap the elements at indexes i and j. c. Restoring the pivot: when the above steps are done, we swap the pivot back into its final position and get this output: [5, 3, 2, 6, 1] [8] [10, 25, 18, 17] Step 3. Recursively sort the parts left and right of the pivot.
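Steps a–c can be sketched in Python as one possible in-place partition (the function name and exact pointer handling are our choices); on the slide's example it reproduces the slide's output:

```python
def partition(a, lo, hi, pivot_index):
    # a. Get the pivot out of the way: swap it with the last element.
    a[pivot_index], a[hi] = a[hi], a[pivot_index]
    pivot = a[hi]
    i, j = lo, hi - 1
    # b. Move i right past elements smaller than the pivot and j left past
    #    larger ones, swapping the out-of-place pair whenever both stop.
    while True:
        while i < j and a[i] < pivot:
            i += 1
        while j > i and a[j] >= pivot:
            j -= 1
        if i >= j:
            break
        a[i], a[j] = a[j], a[i]
    # c. Restore the pivot to its final position.
    if a[i] >= pivot:
        a[i], a[hi] = a[hi], a[i]
        return i
    return hi

data = [8, 3, 25, 6, 10, 17, 1, 2, 18, 5]
q = partition(data, 0, len(data) - 1, 0)   # pivot 8, the first element
print(data)  # [5, 3, 2, 6, 1, 8, 10, 25, 18, 17] -- the slide's partitioned array
print(q)     # 5: the pivot's final index
```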

QuickSort Complexity of QuickSort Worst case: O(N^2). This happens when the pivot is the smallest or the largest element: one of the partitions is empty, and we repeat the recursion on N-1 elements. Best case: O(N log N). This is when the pivot is the median of the array, so the left and right parts are of the same size. There are log N levels of partitioning, and at each level we do N comparisons.
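A quick way to see the two cases empirically is to count pivot comparisons with a first-element-pivot quicksort: on already-sorted input the pivot is always the minimum, giving the quadratic worst case, while on shuffled input the count stays near N log N. The helper below is illustrative, not from the slides:

```python
import random

def quicksort_count(a):
    # First-element-pivot quicksort; returns (sorted list, number of comparisons).
    if len(a) <= 1:
        return a, 0
    pivot = a[0]
    comparisons = len(a) - 1          # every other element is compared with the pivot
    left = [x for x in a[1:] if x < pivot]
    right = [x for x in a[1:] if x >= pivot]
    left_sorted, left_count = quicksort_count(left)
    right_sorted, right_count = quicksort_count(right)
    return left_sorted + [pivot] + right_sorted, comparisons + left_count + right_count

n = 200
ordered = list(range(n))
shuffled = ordered[:]
random.shuffle(shuffled)
_, worst = quicksort_count(ordered)     # pivot is always the smallest element
_, typical = quicksort_count(shuffled)
print(worst)     # n(n-1)/2 = 19900 comparisons: the quadratic worst case
print(typical)   # usually a few thousand, close to n log n
```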

Analysis of Quick Sort How is it that quicksort's worst-case and average-case running times differ? Let's start by looking at the worst-case running time. Suppose that we're really unlucky and the partition sizes are really unbalanced. In particular, suppose that the pivot chosen by the partition function is always either the smallest or the largest element of the n-element subarray. Then one of the partitions will contain no elements and the other partition will contain n−1 elements (all but the pivot). So the recursive calls will be on subarrays of sizes 0 and n−1.

QuickSort Worst-case running time When quicksort always has the most unbalanced partitions possible, the original call takes cn time for some constant c, the recursive call on n−1 elements takes c(n−1) time, the recursive call on n−2 elements takes c(n−2) time, and so on. When we total up the partitioning times for each level, we get cn + c(n−1) + c(n−2) + ⋯ + 2c = c(n + (n−1) + (n−2) + ⋯ + 2) = c((n+1)(n/2) − 1). The last line holds because 1 + 2 + 3 + ⋯ + n is the arithmetic series, as we saw when we analyzed selection sort. (We subtract 1 because for quicksort the summation starts at 2, not 1.) We have some low-order terms and constant coefficients, but when we use big-Θ notation we ignore them. In big-Θ notation, quicksort's worst-case running time is Θ(n²).

T(N) = T(i) + T(N − i − 1) + cN The time to sort the file is equal to the time to sort the left partition with i elements, plus the time to sort the right partition with N − i − 1 elements, plus the time to build the partitions.

Worst case analysis The pivot is the smallest element:
T(N) = T(N−1) + cN, N > 1
Telescoping:
T(N−1) = T(N−2) + c(N−1)
T(N−2) = T(N−3) + c(N−2)
T(N−3) = T(N−4) + c(N−3)
…
T(2) = T(1) + c·2
Add all equations; the terms T(N−1), T(N−2), …, T(2) appear on both sides and cancel:
T(N) = T(1) + c(2 + 3 + … + N) = 1 + c(N(N+1)/2 − 1)
Therefore T(N) = O(N²)

Best-case analysis: The pivot is in the middle:
T(N) = 2T(N/2) + cN
Divide by N: T(N)/N = T(N/2)/(N/2) + c
Telescoping:
T(N/2)/(N/2) = T(N/4)/(N/4) + c
T(N/4)/(N/4) = T(N/8)/(N/8) + c
…
T(2)/2 = T(1)/1 + c
Add all equations; the terms T(N/2)/(N/2), T(N/4)/(N/4), …, T(2)/2 appear on both sides and cancel:
T(N)/N = T(1)/1 + c·log N = 1 + c·log N
T(N) = N + cN·log N
Therefore T(N) = O(N log N)

Average case analysis Similar computations give T(N) = O(N log N).
The average value of T(i) is 1/N times the sum of T(0) through T(N−1): (1/N) Σ T(j), j = 0 to N−1
T(N) = (2/N)(Σ T(j)) + cN
Multiply by N: N·T(N) = 2(Σ T(j)) + cN²
To remove the summation, we write the same equation for N−1: (N−1)·T(N−1) = 2(Σ T(j)) + c(N−1)², j = 0 to N−2, and subtract:
N·T(N) − (N−1)·T(N−1) = 2T(N−1) + 2cN − c
Prepare for telescoping: rearrange terms and drop the insignificant c:
N·T(N) = (N+1)·T(N−1) + 2cN
Divide by N(N+1): T(N)/(N+1) = T(N−1)/N + 2c/(N+1)
Telescope:
T(N−1)/N = T(N−2)/(N−1) + 2c/N
T(N−2)/(N−1) = T(N−3)/(N−2) + 2c/(N−1)
…
T(2)/3 = T(1)/2 + 2c/3
Add the equations and cancel the equal terms:
T(N)/(N+1) = T(1)/2 + 2c Σ (1/j), j = 3 to N+1
T(N) = (N+1)(1/2 + 2c Σ (1/j))
The sum Σ (1/j), j = 3 to N+1, is about log N. Thus T(N) = O(N log N)

Randomized Quick Sort In the randomized version of Quick sort we impose a random distribution on the input. This does not improve the worst-case running time, but it makes the expected running time independent of the input ordering. In this version we choose a random key as the pivot. Assume that a procedure RANDOM(a, b) returns a random integer in the range [a, b]; there are b − a + 1 integers in the range, and the procedure is equally likely to return any one of them. The new partition procedure simply performs the swap before actually partitioning:
RANDOMIZED_PARTITION (A, p, r)
i ← RANDOM (p, r)
Exchange A[p] ↔ A[i]
return PARTITION (A, p, r)
Randomized quicksort then calls the above procedure in place of PARTITION:
RANDOMIZED_QUICKSORT (A, p, r)
If p < r then
    q ← RANDOMIZED_PARTITION (A, p, r)
    RANDOMIZED_QUICKSORT (A, p, q)
    RANDOMIZED_QUICKSORT (A, q+1, r)
Like other randomized algorithms, RANDOMIZED_QUICKSORT has the property that no particular input elicits its worst-case behavior; the behavior of the algorithm depends only on the random-number generator. Even intentionally, we cannot produce a bad input for RANDOMIZED_QUICKSORT unless we can predict what the random-number generator will produce next.
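The pseudocode above can be sketched in Python. We use random.randint for RANDOM(p, r) (inclusive on both ends, matching the b − a + 1 equally likely values) and a Hoare-style PARTITION, since the recursion here is on A[p..q] and A[q+1..r]; the function names are ours:

```python
import random

def hoare_partition(a, p, r):
    # PARTITION in the Hoare style: returns q such that every element of
    # a[p..q] is <= every element of a[q+1..r].
    pivot = a[p]
    i, j = p - 1, r + 1
    while True:
        j -= 1
        while a[j] > pivot:
            j -= 1
        i += 1
        while a[i] < pivot:
            i += 1
        if i < j:
            a[i], a[j] = a[j], a[i]
        else:
            return j

def randomized_partition(a, p, r):
    # RANDOM(p, r), then Exchange A[p] <-> A[i] before partitioning as usual.
    i = random.randint(p, r)
    a[p], a[i] = a[i], a[p]
    return hoare_partition(a, p, r)

def randomized_quicksort(a, p, r):
    if p < r:
        q = randomized_partition(a, p, r)
        randomized_quicksort(a, p, q)
        randomized_quicksort(a, q + 1, r)

data = [8, 3, 25, 6, 10, 17, 1, 2, 18, 5]
randomized_quicksort(data, 0, len(data) - 1)
print(data)  # [1, 2, 3, 5, 6, 8, 10, 17, 18, 25]
```

Whatever values RANDOM returns, the output is the same sorted array; only the intermediate partitions differ from run to run.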