DIVIDE AND CONQUER APPROACH
General Method
Divide: break the given problem into smaller subproblems (ideally of about the same size).
Conquer: solve the subproblems independently, typically by recursion.
Combine: combine the solutions of the subproblems to obtain the solution of the original problem.
Examples to be considered: binary search, merge sort, quick sort, selection sort, radix sort, and Strassen’s matrix multiplication.
Binary Search Basics
Sequential search is not very efficient for large lists; binary search is faster but works only on a sorted list (e.g. looking up a person’s number in a phone book ordered by name).
Logic: compare the target (search key) with the middle element of the array and terminate if they are equal; otherwise continue the search in the left or right subarray according to whether the target is less than or greater than the middle element.
Binary Search … non-recursive
Bin_Search(A, n, target)
    left ← 0
    right ← n - 1
    while (left ≤ right) do
        mid ← floor((left + right) / 2)
        if (target = A[mid]) then
            return mid
        else if (target < A[mid]) then
            right ← mid - 1
        else
            left ← mid + 1
    return "element not found"
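A minimal runnable Python sketch of the same iterative search (the function name and the use of -1 as the "not found" result are illustrative choices, not from the slides):

    def binary_search(a, target):
        """Return the index of target in the sorted list a, or -1 if absent."""
        left, right = 0, len(a) - 1
        while left <= right:
            mid = (left + right) // 2
            if a[mid] == target:
                return mid
            elif target < a[mid]:
                right = mid - 1
            else:
                left = mid + 1
        return -1

    # Example: binary_search([7, 20, 30, 40, 50, 60, 80, 100], 50) -> 4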
Binary Search … non-recursive
Average and worst case complexity is O(log n).
Vs. linear search: linear search is O(n). The benefit of binary search over linear search becomes significant for lists of more than about 100 elements; for smaller lists linear search may be faster, because the simple increment it performs per step is cheaper than the division needed by binary search.
Binary Search … recursive
Bin_Search(A, target, low, high)
    if (high < low) then
        return "not found"
    mid ← low + floor((high - low) / 2)
    if (A[mid] > target) then
        return Bin_Search(A, target, low, mid - 1)
    else if (A[mid] < target) then
        return Bin_Search(A, target, mid + 1, high)
    else
        return mid
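The same recursion as a short Python sketch (again returning -1 when the target is absent):

    def binary_search_rec(a, target, low, high):
        """Recursively search a[low..high] (inclusive) for target."""
        if high < low:
            return -1                      # empty range: not found
        mid = low + (high - low) // 2      # avoids overflow in fixed-width integer languages
        if a[mid] > target:
            return binary_search_rec(a, target, low, mid - 1)
        elif a[mid] < target:
            return binary_search_rec(a, target, mid + 1, high)
        else:
            return mid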
Binary Search … recursive
The complexity order is the same as for the non-recursive version, i.e. O(log n), and the recursion is easier to implement.
Merge Sort
Divide: If A has at least two elements (nothing needs to be done if A has zero or one element), remove the elements from A and put them into two sequences, L and R, each containing about half of the elements of A (L gets the first ⌈n/2⌉ elements and R the remaining ⌊n/2⌋).
Conquer: Sort sequences L and R recursively using merge sort.
Combine: Put the elements back into A by merging the sorted sequences L and R into one sorted sequence.
Merge Sort … contd
Merge_Sort(A, low, high)
    if (low < high) then
        mid ← floor((low + high) / 2)
        Merge_Sort(A, low, mid)
        Merge_Sort(A, mid + 1, high)
        Merge(A, low, mid, high)
Merge(A, low, mid, high)
Repeatedly take the smaller of the two front elements of the sorted runs A[low..mid] and A[mid+1..high] and append it to an auxiliary sequence; when one run is exhausted, copy over the rest of the other; finally copy the result back into A[low..high].

Merge(A, low, mid, high)
    create auxiliary array B
    h ← low, i ← low, j ← mid + 1
    while (h ≤ mid and j ≤ high) do
        if (A[h] ≤ A[j]) then
            B[i] ← A[h]; h ← h + 1
        else
            B[i] ← A[j]; j ← j + 1
        i ← i + 1
    if (h > mid) then
        for k ← j to high do
            B[i] ← A[k]; i ← i + 1
    else
        for k ← h to mid do
            B[i] ← A[k]; i ← i + 1
    for k ← low to high do
        A[k] ← B[k]
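A compact runnable Python sketch of the same merge sort (it appends into an auxiliary list b instead of indexing B by position; the behaviour is the same):

    def merge_sort(a, low, high):
        """Sort a[low..high] in place using merge sort."""
        if low < high:
            mid = (low + high) // 2
            merge_sort(a, low, mid)
            merge_sort(a, mid + 1, high)
            merge(a, low, mid, high)

    def merge(a, low, mid, high):
        """Merge the sorted runs a[low..mid] and a[mid+1..high]."""
        b = []
        h, j = low, mid + 1
        while h <= mid and j <= high:
            if a[h] <= a[j]:
                b.append(a[h]); h += 1
            else:
                b.append(a[j]); j += 1
        b.extend(a[h:mid + 1])          # copy whichever run still has elements
        b.extend(a[j:high + 1])
        a[low:high + 1] = b

    # Example:
    # data = [38, 27, 43, 3, 9, 82, 10]
    # merge_sort(data, 0, len(data) - 1)   # data becomes [3, 9, 10, 27, 38, 43, 82]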
Merge Sort (Example): a sequence of 22 slides tracing the recursive splits and merge steps on a sample array (figures not reproduced here).
Running time of Merge Sort
To sort n numbers: if n = 1 we are done; otherwise recursively sort two lists of ⌈n/2⌉ and ⌊n/2⌋ elements and merge the two sorted lists in O(n) time. The running time can therefore be expressed as the recurrence T(n) = 2T(n/2) + O(n), with T(1) = O(1), which solves to O(n log n).
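Unrolling that recurrence step by step (a standard textbook expansion, written in LaTeX here; it is not part of the original slides):

\begin{aligned}
T(n) &= 2T(n/2) + cn \\
     &= 4T(n/4) + 2cn \\
     &= 8T(n/8) + 3cn \\
     &\;\;\vdots \\
     &= 2^{k} T(n/2^{k}) + kcn \qquad (\text{stop at } 2^{k} = n,\ k = \log_2 n) \\
     &= n\,T(1) + cn \log_2 n = O(n \log n).
\end{aligned}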
Running time of Merge Sort
Worst case and best case complexity are the same as the average case, i.e. O(n log n).
Applications: sorting linked lists (merge sort does not require random access, so it works well on lists).
Quicksort
Also called partition-exchange sort; one of the fastest known sorting algorithms in practice.
The array A[p..r] is partitioned into two non-empty subarrays A[p..q] and A[q+1..r] such that every element of the first is no larger than every element of the second.
Unlike merge sort, there is no combining step: once both subarrays are sorted, the whole array is sorted.
Quicksort … contd
Quicksort(A, lb, ub)
    if (lb < ub) then
        pivot ← Partition(A, lb, ub)
        Quicksort(A, lb, pivot - 1)
        Quicksort(A, pivot + 1, ub)
To sort an entire array A, the initial call uses lb = 0 and ub = n - 1 (or lb = 1 and ub = n with 1-based indexing).
Quicksort … contd
Partition(A, lb, ub)
    x ← A[lb]                // pivot: first element of the range
    up ← ub
    down ← lb
    while (down < up) do
        while (A[down] ≤ x and down < ub) do
            down ← down + 1
        while (A[up] > x) do
            up ← up - 1
        if (down < up) then
            swap A[down] and A[up]
    A[lb] ← A[up]            // move the element at the split point to the front
    A[up] ← x                // place the pivot in its final position
    return up
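A runnable Python sketch of quicksort built around this first-element-pivot partition; variable names follow the slides, and the example array is the one used in the trace below:

    def partition(a, lb, ub):
        """Partition a[lb..ub] around the pivot a[lb]; return the pivot's final index."""
        x = a[lb]
        down, up = lb, ub
        while down < up:
            while a[down] <= x and down < ub:
                down += 1
            while a[up] > x:
                up -= 1
            if down < up:
                a[down], a[up] = a[up], a[down]
        a[lb], a[up] = a[up], x          # put the pivot into its final slot
        return up

    def quicksort(a, lb, ub):
        if lb < ub:
            p = partition(a, lb, ub)
            quicksort(a, lb, p - 1)
            quicksort(a, p + 1, ub)

    # Example:
    # data = [40, 20, 10, 80, 60, 50, 7, 30, 100]
    # quicksort(data, 0, len(data) - 1)   # data becomes [7, 10, 20, 30, 40, 50, 60, 80, 100]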
Quicksort … contd
Partition trace (pivot x = 40) on A = [40, 20, 10, 80, 60, 50, 7, 30, 100]:
[40, 20, 10, 80, 60, 50, 7, 30, 100]   down stops at 80, up stops at 30 → swap
[40, 20, 10, 30, 60, 50, 7, 80, 100]   down stops at 60, up stops at 7 → swap
[40, 20, 10, 30, 7, 50, 60, 80, 100]   down and up cross; up ends at index 4
[7, 20, 10, 30, 40, 50, 60, 80, 100]   A[lb] ← A[up], A[up] ← x; Partition returns 4
Quicksort … Analysis
For analyzing quicksort we count only the number of element comparisons.
Running time: pivot selection takes constant time, O(1); partitioning takes linear time, O(n); plus the running time of the two recursive calls.
T(n) = T(i) + T(n - i - 1) + cn, where c is a constant and i is the number of elements in the left partition (the right partition has n - i - 1).
Quicksort … Analysis
Worst case complexity: O(n^2) (e.g. already sorted input with a fixed first-element pivot).
Average case complexity: O(n log n).
Applications: widely used in commercial and library sorting routines.
Randomized Quicksort
Choose the pivot randomly (or randomly permute the input array before sorting).
The running time of the algorithm is then independent of the input ordering: no specific input elicits the worst-case behaviour, which depends only on the random number generator.
This modification helps quicksort perform well on every input.
Randomized Quicksort
R_Quicksort(A, lb, ub)
    if (lb < ub) then
        i ← Random(lb, ub)
        swap A[lb] and A[i]          // move the random element into the pivot position
        pivot ← Partition(A, lb, ub)
        R_Quicksort(A, lb, pivot - 1)
        R_Quicksort(A, pivot + 1, ub)
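A short Python sketch of the randomized variant; it reuses the partition() function from the quicksort sketch earlier, so it is not fully standalone:

    import random

    def r_quicksort(a, lb, ub):
        """Quicksort with a uniformly random pivot (assumes partition() from above is in scope)."""
        if lb < ub:
            i = random.randint(lb, ub)       # pick a random index in [lb, ub]
            a[lb], a[i] = a[i], a[lb]        # move it to the front, where partition expects the pivot
            p = partition(a, lb, ub)
            r_quicksort(a, lb, p - 1)
            r_quicksort(a, p + 1, ub)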
Randomized Quicksort … Analysis
Assumptions: all elements are distinct, and every split from 0 : n-1 through n-1 : 0 is equally likely, i.e. each partition occurs with probability 1/n.
Under these assumptions the average case complexity is O(n log n).
Selection Sort
Successive elements are selected in order and placed into their proper position.
An in-place sort, and simple to implement. It works as follows: find the minimum value in the list, swap it with the value in the first position, then repeat the steps above for the remainder of the list (starting at the second position and advancing one position each time).
Selection Sort … contd
Sel_Sort(A, n)
    for i ← 0 to n - 2 do
        k ← i
        for j ← i + 1 to n - 1 do     // find the index of the i-th smallest element
            if (A[j] < A[k]) then
                k ← j
        if (k ≠ i) then
            swap A[i] and A[k]
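The same algorithm as runnable Python (an in-place sketch):

    def selection_sort(a):
        """Sort list a in place by repeatedly selecting the minimum of the unsorted suffix."""
        n = len(a)
        for i in range(n - 1):
            k = i
            for j in range(i + 1, n):     # find the index of the smallest element in a[i..n-1]
                if a[j] < a[k]:
                    k = j
            if k != i:
                a[i], a[k] = a[k], a[i]   # move it to position i

    # Example: a = [29, 10, 14, 37, 13]; selection_sort(a)  # a becomes [10, 13, 14, 29, 37]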
Selection Sort … Analysis
Selection sort stops when the unsorted part becomes empty; each step of the outer loop reduces the number of unsorted elements by one, so the algorithm performs n - 1 outer-loop steps (n is the number of elements in the array).
The number of swaps varies from 0 to n - 1, while the number of comparisons is always about n^2/2, so the overall complexity is O(n^2).
Radix Sort
A sorting algorithm that sorts integers by processing individual digits, grouping numbers by the digit they share at the same significant position (the radix position).
When two numbers are compared digit by digit, the one with the larger digit at the most significant position where they differ is the larger number; LSD radix sort exploits this by sorting on the least significant digit first and working towards the most significant digit, keeping each pass stable.
Radix Sort … steps
1. Partition the numbers into ten groups based on their least significant digit; ten queues, say Q[0] – Q[9], can be used for this.
2. Merge all the numbers from Q[0] – Q[9] back into one list, in order.
3. Repeat step 1, now distributing on the second least significant digit, and merge again.
4. Continue until the numbers have been distributed and collected on the most significant digit; the list is then sorted.
Radix Sort … example
Start with elements that all have the same number of digits: read an array A of size n, e.g. 38, 36, 32, 25, 29, 54, 44, 40.
Begin by inserting the elements into the queues based on their least significant digit (38 → Q[8], 36 → Q[6], 32 → Q[2], and so on), then collect them and repeat on the tens digit.
Radix Sort … contd
Elements may have different numbers of digits, e.g. 121, 345, 2, 4343, 12, 34, 5, 1231, 231, 3212 (shorter numbers behave as if padded with leading zeros).
Complexity: sorting n elements takes m passes, where m is the maximum number of digits in any element, so the order is O(n * m).
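A runnable Python sketch of the queue-based LSD radix sort described above; it handles non-negative integers only and uses one queue per decimal digit:

    from collections import deque

    def radix_sort(a):
        """LSD radix sort for a list of non-negative integers; returns a new sorted list."""
        if not a:
            return []
        max_digits = len(str(max(a)))             # m = digits in the largest element
        for d in range(max_digits):               # one pass per digit, least significant first
            queues = [deque() for _ in range(10)] # Q[0] .. Q[9]
            for x in a:
                digit = (x // 10 ** d) % 10
                queues[digit].append(x)
            a = [x for q in queues for x in q]    # merge Q[0] .. Q[9] in order (stable)
        return a

    # Example: radix_sort([121, 345, 2, 4343, 12, 34, 5, 1231, 231, 3212])
    # -> [2, 5, 12, 34, 121, 231, 345, 1231, 3212, 4343]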
Strassen’s Matrix Multiplication
The standard method of multiplying two n × n matrices takes O(n^3) operations. Strassen’s algorithm is a divide-and-conquer algorithm that is asymptotically faster, running in O(n^lg 7) ≈ O(n^2.81) time.
The usual multiplication of two 2 × 2 matrices takes 8 multiplications and 4 additions; Strassen showed how two 2 × 2 matrices can be multiplied using only 7 multiplications and 18 additions/subtractions.
Basic Matrix Multiplication
Splitting each matrix into four n/2 × n/2 blocks, C = A × B is computed block-wise as
C11 = A11 B11 + A12 B21
C12 = A11 B12 + A12 B22
C21 = A21 B11 + A22 B21
C22 = A21 B12 + A22 B22
This uses 8 (recursive) block multiplications, giving the recurrence T(n) = 8T(n/2) + O(n^2) = O(n^3).
Strassen’s Observation
With the same block splitting, compute the seven products
M1 = (A11 + A22)(B11 + B22)
M2 = (A21 + A22) B11
M3 = A11 (B12 − B22)
M4 = A22 (B21 − B11)
M5 = (A11 + A12) B22
M6 = (A21 − A11)(B11 + B12)
M7 = (A12 − A22)(B21 + B22)
where
C11 = M1 + M4 − M5 + M7
C12 = M3 + M5
C21 = M2 + M4
C22 = M1 − M2 + M3 + M6
Only 7 multiplications are needed, so T(n) = 7T(n/2) + O(n^2) = O(n^lg 7).
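A small Python sketch that applies the seven products to ordinary 2 × 2 numeric matrices, just to check the identities; a full recursive Strassen multiplies n/2 × n/2 blocks in exactly the same pattern:

    def strassen_2x2(A, B):
        """Multiply 2x2 matrices A and B (lists of lists) using Strassen's 7 products."""
        a11, a12 = A[0]; a21, a22 = A[1]
        b11, b12 = B[0]; b21, b22 = B[1]
        m1 = (a11 + a22) * (b11 + b22)
        m2 = (a21 + a22) * b11
        m3 = a11 * (b12 - b22)
        m4 = a22 * (b21 - b11)
        m5 = (a11 + a12) * b22
        m6 = (a21 - a11) * (b11 + b12)
        m7 = (a12 - a22) * (b21 + b22)
        return [[m1 + m4 - m5 + m7, m3 + m5],
                [m2 + m4,           m1 - m2 + m3 + m6]]

    # Example: strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]) -> [[19, 22], [43, 50]]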
Self Study
Finding the minimum and maximum of a list using the divide-and-conquer technique (a starting sketch follows below).
Find other problems that can be solved using divide and conquer.
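As a starting point for the first self-study item, a hedged divide-and-conquer sketch for finding both the minimum and the maximum (my illustration, not material from the slides):

    def min_max(a, low, high):
        """Return (min, max) of a[low..high] using divide and conquer."""
        if low == high:                       # one element
            return a[low], a[low]
        if high == low + 1:                   # two elements: one comparison
            return (a[low], a[high]) if a[low] <= a[high] else (a[high], a[low])
        mid = (low + high) // 2
        lmin, lmax = min_max(a, low, mid)
        rmin, rmax = min_max(a, mid + 1, high)
        return min(lmin, rmin), max(lmax, rmax)

    # Example: min_max([40, 20, 10, 80, 60, 50, 7, 30, 100], 0, 8) -> (7, 100)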
Next: Greedy Method