Algorithm Design and Analysis (ADA): 4. Divide and Conquer
Objective: to look at several divide and conquer examples (merge sort, binary search), and three approaches for calculating their running time.

Overview
1. Divide and Conquer
2. A Faster Sort: Merge Sort
3. The Iteration Method
4. Recursion Trees
5. Merge Sort vs Insertion Sort
6. Binary Search
7. Recursion Tree Examples
8. Iteration Method Examples
9. The Master Method

1. Divide and Conquer
1. Divide the problem into subproblems.
2. Conquer the subproblems by solving them recursively.
3. Combine the subproblem solutions.

2. A Faster Sort: Merge Sort

MergeSort(A, left, right)
1. if left < right              // if left ≥ right, do nothing
2.     mid := floor((left + right) / 2)
3.     MergeSort(A, left, mid)
4.     MergeSort(A, mid+1, right)
5.     Merge(A, left, mid, right)
6. return

Initial call: MergeSort(A, 1, n)
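The pseudocode maps directly onto Java. A minimal recursive sketch (my code, not from the slides), using 0-based indices and assuming the merge() method shown a few slides below:

void mergeSort(int[] A, int left, int right) {
    if (left < right) {                // zero or one element: already sorted
        int mid = (left + right) / 2;  // floor of the midpoint
        mergeSort(A, left, mid);       // sort the left half
        mergeSort(A, mid + 1, right);  // sort the right half
        merge(A, left, mid, right);    // merge the two sorted halves
    }
}
// initial call for an n-element (0-based) array: mergeSort(A, 0, A.length - 1)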

A faster sort: MergeSort
[Diagram] The input A[1..n] is split into A[1..mid] and A[mid+1..n]; MergeSort sorts each half, producing sorted A[1..mid] and sorted A[mid+1..n]; Merge then combines them into the sorted output.

Tracing MergeSort()
[Figure: a trace of the recursive MergeSort() calls and the merge steps on a small example array.]

Merging two sorted arrays: the merge() function
[Figure: step-by-step animation of merging two sorted arrays.]

Merging two sorted arrays: time = one pass through each array = Θ(n) to merge a total of n elements (linear time).

Analysis of Merge Sort

Statement                           Effort
MergeSort(A, left, right)           T(n)
  if (left < right) {               O(1)
    mid = floor((left+right)/2);    O(1)
    MergeSort(A, left, mid);        T(n/2)
    MergeSort(A, mid+1, right);     T(n/2)
    Merge(A, left, mid, right);     O(n)   // as shown on the previous slides
  }

merge() Code
merge(A, left, mid, right) merges two adjacent sorted subranges of an array A:
o left == the index of the first element of the first range
o mid == the index of the last element of the first range
o right == the index of the last element of the second range

void merge(int[] A, int left, int mid, int right) {
    int[] temp = new int[right - left + 1];  // scratch space for the merged range
    int aIdx = left;       // next unused element of the first range
    int bIdx = mid + 1;    // next unused element of the second range
    for (int i = 0; i < temp.length; i++) {
        if (aIdx > mid)
            temp[i] = A[bIdx++];             // first range exhausted: copy 2nd range
        else if (bIdx > right)
            temp[i] = A[aIdx++];             // second range exhausted: copy 1st range
        else if (A[aIdx] <= A[bIdx])         // take the smaller front element
            temp[i] = A[aIdx++];
        else
            temp[i] = A[bIdx++];
    }
    // copy the merged result back into A
    for (int j = 0; j < temp.length; j++)
        A[left + j] = temp[j];
}
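A small driver tying the pieces together (hypothetical: it assumes the mergeSort() and merge() methods above live in a class called MergeDemo):

public static void main(String[] args) {
    int[] A = {7, 2, 9, 4, 3, 8, 6, 1};
    new MergeDemo().mergeSort(A, 0, A.length - 1);     // sort the whole array in place
    System.out.println(java.util.Arrays.toString(A));  // prints [1, 2, 3, 4, 6, 7, 8, 9]
}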

3. The Iteration Method
Up to now, we have been solving recurrences using the iteration method:
o Write T() as a recursive equation using big-Oh
o Convert the T() equation into algebra (replace the O()'s with constants)
o Expand the recurrence
o Rewrite the recursion as a summation
o Convert the algebra back to O()

MergeSort Running Time
Recursive T() equation:
o T(1) = O(1)
o T(n) = 2T(n/2) + O(n), for n > 1
Convert to algebra:
o T(1) = a
o T(n) = 2T(n/2) + cn

Recurrence for Merge Sort
The expression
    T(n) = a               if n = 1
    T(n) = 2T(n/2) + cn    if n > 1
is called a recurrence. A recurrence is an equation that describes a function in terms of its value on smaller inputs.

T(n) = 2T(n/2) + cn
     = 2(2T(n/4) + cn/2) + cn
     = 2^2 T(n/2^2) + cn(2/2) + cn
     = 2^2 T(n/2^2) + cn(2/2 + 1)
     = 2^2 (2T(n/2^3) + cn/2^2) + cn(2/2 + 1)
     = 2^3 T(n/2^3) + cn(2^2/2^2) + cn(2/2 + 1)
     = 2^3 T(n/2^3) + cn(2^2/2^2 + 2/2 + 1)
     ...
     = 2^k T(n/2^k) + cn(2^{k-1}/2^{k-1} + 2^{k-2}/2^{k-2} + ... + 2/2 + 1)

So we have
o T(n) = 2^k T(n/2^k) + cn(2^{k-1}/2^{k-1} + ... + 2/2 + 1)    // k terms, each equal to 1
For k = log2 n:
o n = 2^k, so the T() argument becomes 1
o T(n) = 2^k T(1) + cn·k
       = na + cn(log2 n)
       = O(n) + O(n log2 n)
       = O(n log2 n)
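Equivalently, in summation form (a sketch of the same calculation, using the slides' constants a = T(1) and c):

T(n) = 2^k\,T(1) + cn\sum_{i=0}^{k-1}\frac{2^i}{2^i} = an + cnk = an + cn\log_2 n = O(n\log_2 n), \qquad k = \log_2 n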

4. Recursion Trees
A graphical technique for finding a big-oh solution to a recurrence:
o Draw a tree of the recursive function calls.
o Each tree node is assigned the big-oh work done during its call to the function.
o The big-oh solution is the sum of the work at all the nodes in the tree.

MergeSort Recursion Tree
Solve T(n) = 2T(n/2) + cn, where c > 0 is constant. We usually omit stating the base case because our algorithms always run in time Θ(1) when n is a small constant.

MergeSort Recursion Tree (fully expanded)
Solve T(n) = 2T(n/2) + cn, where c > 0 is constant.
[Figure: the full recursion tree] The root costs cn, its two children cost cn/2 each, the four grandchildren cost cn/4 each, and so on down to n leaves of cost Θ(1). Height h = log n, #leaves = n, so the leaf level contributes Θ(n); every level sums to cn, so the total is Θ(n log n).
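The Θ(n log n) total can be sanity-checked numerically. A minimal sketch (my code, not from the slides) that evaluates the recurrence T(n) = 2T(n/2) + n directly (c = 1, T(1) = 1) and compares it to n log2 n:

// evaluate T(n) = 2*T(n/2) + n for powers of two
static long T(long n) {
    if (n <= 1) return 1;           // base case: T(1) = 1
    return 2 * T(n / 2) + n;        // two half-size subproblems plus linear merge cost
}

public static void main(String[] args) {
    for (long n = 2; n <= (1 << 20); n *= 2) {
        double ratio = T(n) / (n * (Math.log(n) / Math.log(2)));
        System.out.printf("n = %8d   T(n) / (n log2 n) = %.3f%n", n, ratio);  // ratio tends to 1
    }
}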

Height and no. of leaves
[Figure] The subproblem size halves at every level, so after h steps it reaches 1 when n/2^h = 1, i.e. h = log2 n. A binary tree of height h has 2^h = n leaves.

Logarithm Equalities
[Slide of logarithm identities. The one used later is change of base: log_b n = log2 n / log2 b, so logs to different bases differ only by a constant factor, and O(n log_b n) = O(n log n).]

5. Merge Sort vs Insertion Sort
O(n lg n) grows more slowly than O(n^2). In other words, merge sort is asymptotically faster than insertion sort in the worst case. In practice, merge sort beats insertion sort for n > 30 or so.

Timing Comparisons
Running time estimates:
o A laptop executes 10^8 compares/second.
o A supercomputer executes far more compares/second.
Lesson 1: good algorithms are better than supercomputers.
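A back-of-the-envelope comparison (my numbers, not from the slide), assuming the laptop's 10^8 compares/second and n = 10^6 items:

\text{insertion sort: } \approx n^2 = 10^{12}\text{ compares} \;\Rightarrow\; 10^{12}/10^{8} = 10^{4}\text{ s} \approx 2.8\text{ hours}
\text{merge sort: } \approx n\log_2 n \approx 2\times 10^{7}\text{ compares} \;\Rightarrow\; 2\times 10^{7}/10^{8} = 0.2\text{ s}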

6. Binary Search
Binary search from part 3 is a divide and conquer algorithm. Find an element in a sorted array:
1. Divide: check the middle element.
2. Conquer: recursively search one subarray.
3. Combine: trivial; return the index.
[Example figure: searching a sorted array for a key, halving the range at each step.]

Binary Search Code (again)

int binSrch(char A[], int i, int j, char key)
{
    int k;
    if (i > j)                  /* empty range: key not found */
        return -1;
    k = (i + j) / 2;            /* middle index */
    if (key == A[k])            /* key found */
        return k;
    if (key < A[k])
        j = k - 1;              /* search left half */
    else
        i = k + 1;              /* search right half */
    return binSrch(A, i, j, key);
}
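For comparison, a minimal iterative Java version (my sketch, not from the slides); it performs the same halving without a recursive call, so the Θ(log n) analysis below applies to it as well:

// iterative binary search over a sorted int array; returns the index of key, or -1 if absent
static int binSearch(int[] A, int key) {
    int i = 0, j = A.length - 1;
    while (i <= j) {
        int k = (i + j) / 2;        // middle of the current range
        if (A[k] == key) return k;  // key found
        if (key < A[k]) j = k - 1;  // search left half
        else            i = k + 1;  // search right half
    }
    return -1;                      // range became empty: key not found
}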

Running Time (again)
n == the size of the range of the array being looked at
Using big-oh:
o Basis: T(1) = O(1)
o Induction: T(n) = O(1) + T(n/2), for n > 1
As algebra:
o Basis: T(1) = a
o Induction: T(n) = c + T(n/2), for n > 1
Running time for binary search is O(log2 n).

Recurrence for Binary Search
T(n) = 1·T(n/2) + Θ(1)
       (1 = # subproblems, n/2 = subproblem size, Θ(1) = work dividing and combining)
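Unrolling this recurrence with the iteration method (a sketch in the slides' algebraic form, T(1) = a and T(n) = c + T(n/2)):

T(n) = c + T(n/2) = 2c + T(n/2^2) = \cdots = kc + T(n/2^k)
     = c\log_2 n + a = O(\log_2 n) \qquad \text{taking } k = \log_2 n,\ \text{so } n/2^k = 1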

BS Recursion Tree
Solve T(n) = T(n/2) + c, where c > 0 is constant. We usually don't bother with the base case because our algorithms always run in time Θ(1) when n is a small constant.

BS Recursion Tree (fully expanded)
Solve T(n) = T(n/2) + c, where c > 0 is constant.
[Figure: the tree is a single chain of nodes] Each of the h = log2 n internal levels costs c, and the single Θ(1) leaf costs a. Total = c log2 n + a = O(log2 n).

Two recurrences so far
Merge Sort:     T(n) = 2T(n/2) + Θ(n)  =  Θ(n log n)
Binary Search:  T(n) = T(n/2) + Θ(1)   =  Θ(log n)
The big-oh running times were calculated in two ways: the iteration method and recursion trees. Let's do some more examples of both.

7. Recursion Tree Examples
Example 1: Solve T(n) = T(n/4) + T(n/2) + n^2

Example 1 (fully expanded)
Solve T(n) = T(n/4) + T(n/2) + n^2:
[Figure: the recursion tree] The root costs n^2; its children cost (n/4)^2 and (n/2)^2; the next level costs (n/16)^2, (n/8)^2, (n/8)^2 and (n/4)^2; and so on down to O(1) leaves. The level sums are n^2, (5/16)n^2, (5/16)^2 n^2, ... (a geometric series), so
Total = n^2 (1 + 5/16 + (5/16)^2 + ...) = n^2 · 1/(1 - 5/16) = (16/11) n^2, which is O(n^2).
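Where the 5/16 ratio comes from (a one-line check, not spelled out on the slide): a node costing m^2 has children costing (m/4)^2 and (m/2)^2, so each level's sum is a fixed fraction of the one above:

\frac{(m/4)^2 + (m/2)^2}{m^2} = \frac{1}{16} + \frac{1}{4} = \frac{5}{16}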

Geometric Series Reminder
1 + x + x^2 + ... = 1/(1 - x)                      for |x| < 1
1 + x + x^2 + ... + x^k = (x^{k+1} - 1)/(x - 1)    for x ≠ 1

Recursion Tree 2
T(n) = 3T(n/4) + cn^2

T(n) = 3T(n/4) + cn^2
height (h) = log4 n
no. of leaves = 3^h = 3^{log4 n} = n^{log4 3}

Height and no. of leaves
[Figure] The subproblem size is divided by 4 at every level, so it reaches 1 after h steps when n/4^h = 1, i.e. h = log4 n. Each node has 3 children, so there are 3^h = 3^{log4 n} = n^{log4 3} leaves.

Cost of the Tree
Add the cost of all the levels:
T(n) = cn^2 + (3/16)cn^2 + (3/16)^2 cn^2 + ... + (3/16)^{log4 n - 1} cn^2 + Θ(n^{log4 3})   // last term = the leaves level
     < cn^2 · 1/(1 - 3/16) + Θ(n^{log4 3})     // bound the finite sum by the infinite geometric series
     = (16/13) cn^2 + Θ(n^{log4 3})
     = O(n^2)

Recursion Tree 3
T(n) = T(n/3) + T(2n/3) + cn
height = log_{3/2} n (following the longest path, along the 2n/3 branches)

Height and no. of leaves
[Figure] Along the longest path, the subproblem shrinks by a factor of 2/3 at every step, so it reaches 1 after h steps when n(2/3)^h = 1, i.e. h = log_{3/2} n.

Cost of the Tree
Since the tree is contained in a complete binary tree of height log_{3/2} n, and each level sums to at most cn, the cost of all the levels is:
T(n) ≤ cn · log_{3/2} n, so T(n) is O(n log_{3/2} n), which is O(n log n),
since log_{3/2} n = log2 n / log2 (3/2) = (constant) · log2 n   // see the Logarithm Equalities slide

8. Iteration Method Examples

Example 1
T(n) = c + T(n-1)
     = c + c + T(n-2) = 2c + T(n-2)
     = 2c + c + T(n-3) = 3c + T(n-3)
     ...
     = kc + T(n-k)

When k == n:
o T(n) = cn + T(0) = cn + constant
Converting back to big-oh:
o T(n) is O(n)
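As a concrete example (my illustration, not from the slides), a recursive maximum function does constant work per call and then recurses on one fewer element, so its running time follows T(n) = c + T(n-1), i.e. O(n):

// returns the maximum of A[0..i]
static int recMax(int[] A, int i) {
    if (i == 0) return A[0];        // base case: a single element
    int rest = recMax(A, i - 1);    // T(n-1): maximum of the first i elements
    return Math.max(rest, A[i]);    // c: one extra comparison
}
// usage: recMax(A, A.length - 1)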

Example 2
T(n) = n + T(n-1)
     = n + (n-1) + T(n-2)
     = n + (n-1) + (n-2) + T(n-3)
     = n + (n-1) + (n-2) + (n-3) + T(n-4)
     = ...
     = n + (n-1) + (n-2) + ... + (n-(k-1)) + T(n-k)

When k = n:
T(n) = n + (n-1) + ... + 2 + 1 = n(n+1)/2
In general, T(n) is O(n^2).
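For instance (my illustration, not from the slides), a recursive selection sort scans the whole range to find the largest element (n work) and then recurses on the remaining n-1 elements, giving T(n) = n + T(n-1), i.e. O(n^2):

// sorts A[0..i]: put the largest element at position i, then recurse on A[0..i-1]
static void recSelectionSort(int[] A, int i) {
    if (i <= 0) return;                                 // zero or one element: already sorted
    int maxIdx = 0;
    for (int k = 1; k <= i; k++)                        // n work: scan for the largest element
        if (A[k] > A[maxIdx]) maxIdx = k;
    int tmp = A[maxIdx]; A[maxIdx] = A[i]; A[i] = tmp;  // swap it into place
    recSelectionSort(A, i - 1);                         // T(n-1)
}
// usage: recSelectionSort(A, A.length - 1)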

Example 3
T(n) = aT(n/b) + cn
     = a(aT(n/b^2) + cn/b) + cn
     = a^2 T(n/b^2) + cn(a/b) + cn
     = a^2 T(n/b^2) + cn(a/b + 1)
     = a^2 (aT(n/b^3) + cn/b^2) + cn(a/b + 1)
     = a^3 T(n/b^3) + cn(a^2/b^2) + cn(a/b + 1)
     = a^3 T(n/b^3) + cn(a^2/b^2 + a/b + 1)
     ...
     = a^k T(n/b^k) + cn(a^{k-1}/b^{k-1} + a^{k-2}/b^{k-2} + ... + a^2/b^2 + a/b + 1)

So we have
o T(n) = a^k T(n/b^k) + cn(a^{k-1}/b^{k-1} + ... + a^2/b^2 + a/b + 1)
For k = log_b n:
o n = b^k, so the T() argument becomes 1
o T(n) = a^k T(1) + cn(a^{k-1}/b^{k-1} + ... + a^2/b^2 + a/b + 1)
       = a^k d + cn(a^{k-1}/b^{k-1} + ... + a/b + 1)        // writing T(1) = d
       ≈ c a^k + cn(a^{k-1}/b^{k-1} + ... + a/b + 1)        // absorb d into the constant c
       = cn(a^k/b^k) + cn(a^{k-1}/b^{k-1} + ... + a/b + 1)  // since n = b^k, a^k = n · a^k/b^k
       = cn(a^k/b^k + a^{k-1}/b^{k-1} + ... + a^2/b^2 + a/b + 1)

Case 1
With k = log_b n:
o T(n) = cn(a^k/b^k + ... + a^2/b^2 + a/b + 1)
There are three cases at this stage, depending on whether a == b, a < b, or a > b.
If a == b:
o T(n) = cn(k + 1) = cn(log_b n + 1) = O(n log_b n)

Case 2
With k = log_b n:
o T(n) = cn(a^k/b^k + ... + a^2/b^2 + a/b + 1)
If a < b:
o Recall that (x^k + x^{k-1} + ... + x + 1) = (x^{k+1} - 1)/(x - 1)   // see the Geometric Series slide
o With x = a/b < 1, the sum is bounded by the constant 1/(1 - a/b), so:
o T(n) = cn · O(1) = O(n)

Case 3
With k = log_b n:
o T(n) = cn(a^k/b^k + ... + a^2/b^2 + a/b + 1)
If a > b?

[Derivation] With x = a/b > 1, the geometric sum (x^k + ... + x + 1) = (x^{k+1} - 1)/(x - 1) is Θ(x^k), i.e. it is dominated by its largest term a^k/b^k. So
T(n) = cn · Θ(a^k/b^k) = Θ(c a^k) = Θ(a^{log_b n}) = Θ(n^{log_b a})   (using the identity a^{log_b n} = n^{log_b a})

So, for T(n) = aT(n/b) + cn:
o a < b:   T(n) = O(n)
o a == b:  T(n) = O(n log_b n)      // e.g. merge sort (a = b = 2 and c = 1)
o a > b:   T(n) = O(n^{log_b a})

9. The Master Method
The master method only applies to divide and conquer recurrences of the form:
    T(n) = a T(n/b) + f(n)
where a ≥ 1, b > 1, and f(n) > 0 for all n > n0.
This is a more general version of the last example: the master method gives us a cookbook solution for an algorithm's running time; plug in the numbers, get the answer.

Three cases
When T(n) = aT(n/b) + f(n), compare f(n) with n^{log_b a} (n^{log_b a} == the no. of leaves in the recursion tree; see the next slides). Note: n^{log_b a} is a polynomial in n.
o Case 1 (f(n) < n^{log_b a}): if f(n) = O(n^{log_b a - ε}) for some constant ε > 0, then T(n) = Θ(n^{log_b a}).
o Case 2 (f(n) = n^{log_b a}): if f(n) = Θ(n^{log_b a}), then T(n) = Θ(n^{log_b a} · log n).
o Case 3 (f(n) > n^{log_b a}): if f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0, and a·f(n/b) ≤ c·f(n) for some constant c < 1 (the regularity condition), then T(n) = Θ(f(n)).
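As a quick consistency check against the earlier analysis (my example, not one of the slides'): binary search has T(n) = 1·T(n/2) + Θ(1), so a = 1, b = 2, n^{log_b a} = n^0 = 1, and f(n) = Θ(1) matches it, which is Case 2:

T(n) = \Theta\left(n^{\log_2 1} \cdot \log n\right) = \Theta(\log n)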

Example 1
Ex. T(n) = 4T(n/2) + n
a = 4, b = 2, so n^{log_b a} = n^2; f(n) = n.
Case 1, since f(n) < n^{log_b a} (n < n^2)  ⇒  T(n) is Θ(n^2).

Example 2
Ex. T(n) = 4T(n/2) + n^2
a = 4, b = 2, so n^{log_b a} = n^2; f(n) = n^2.
Case 2, since f(n) is the same as n^{log_b a} (n^2 = n^2)  ⇒  T(n) is Θ(n^2 log n).

Example 3
Ex. T(n) = 4T(n/2) + n^3
a = 4, b = 2, so n^{log_b a} = n^2; f(n) = n^3.
Case 3, since f(n) > n^{log_b a} (n^3 > n^2) and 4(n/2)^3 ≤ cn^3 (the regularity condition) holds for c = 1/2  ⇒  T(n) is Θ(n^3).

Example 4 (fail)
Ex. T(n) = 4T(n/2) + n^2/log n
a = 4, b = 2, so n^{log_b a} = n^2; f(n) = n^2/log n.
The master method does not apply, because n^2/log n is not O(n^{2-ε}) for any constant ε > 0 (and it is not Θ(n^2) either): f(n) must differ from n^{log_b a} by at least a polynomial factor for the master method to be applicable.

Example 5

Common Examples
[Table of common recurrences and their solutions; for one entry the Master method cannot be used since its f(n) is not a polynomial.]

Recursion Tree for the Master T()
T(n) = aT(n/b) + f(n)
[Figure: the recursion tree] The root costs f(n); each of its a children costs f(n/b), for a level total of a·f(n/b); the next level totals a^2·f(n/b^2); and so on. The height is h = log_b n, and there are n^{log_b a} leaves, each costing Θ(1), so the leaf level totals Θ(n^{log_b a}).

Height and no. of leaves
[Figure] The subproblem size is divided by b at every level, so it reaches 1 after h steps when n/b^h = 1, i.e. h = log_b n. Each node has a children, so there are a^h = a^{log_b n} = n^{log_b a} leaves.

Case 1 Explained
[Same recursion tree figure] The level sums increase geometrically from the root to the leaves, so the leaves hold the biggest part of the total sum, and the total is Θ(n^{log_b a}). The -ε in the case condition means that f(n) is polynomially smaller than the leaf sum.

Case 2 Explained
[Same recursion tree figure] The level sums are approximately the same on each of the log n levels, so the total (Θ of the sum over all levels) is Θ(n^{log_b a} · log n). No ε appears in this case's condition, which means that f(n) is roughly equal to the leaf sum.

Case 3 Explained
[Same recursion tree figure] The level sums decrease geometrically from the root to the leaves, so the root holds the biggest part of the total sum, and the total is Θ(f(n)). The regularity condition a·f(n/b) ≤ c·f(n) guarantees that the level sums keep getting smaller at lower levels (see the case definition).