
1 Introduction To Algorithms

2 Algorithm
An algorithm is any well-defined computational procedure that takes some value, or set of values, as input and produces some value, or set of values, as output. It is a tool for solving a well-specified computational problem. E.g., sorting a sequence of numbers into nondecreasing order:
Input: a sequence of n numbers ⟨a_1, a_2, ..., a_n⟩.
Output: a permutation (reordering) ⟨a'_1, a'_2, ..., a'_n⟩ of the input sequence such that a'_1 ≤ a'_2 ≤ ... ≤ a'_n.

3 Algorithm
An instance of a problem consists of all the inputs needed to compute a solution to the problem. Choosing an algorithm depends on:
- the number of elements to be sorted
- to what extent the elements are already sorted
- the kind of storage used (main memory, disk, and/or tape)
An algorithm is correct if, for every input instance, it halts with the correct output; a correct algorithm solves the given computational problem. An incorrect algorithm might not halt at all on some input instances, or it might halt with an answer other than the desired one. If the error can be controlled, incorrect algorithms can still be useful (e.g., approximation algorithms).

4 Analysing Algorithms
Analysis means predicting the resources that the algorithm requires:
- memory
- communication bandwidth
- computational time
One problem can often be solved by a number of candidate algorithms, so the most efficient one is chosen. For sequential algorithms we assume a generic one-processor, random-access machine (RAM) model of computation: instructions are executed one after another, with no concurrent operations.

5 Analysis of Insertion Sort
The running time depends on the input size and on how sorted the input already is (what "input size" means depends on the problem being studied). The running time on a particular input is the number of primitive operations (steps) executed; since the pseudocode assumes "atomic" operations, it is the sum of the running times of each statement executed.
Insertion sort uses an incremental approach: having sorted the subarray A[1..j-1], we insert the single element A[j] into its proper place, resulting in the sorted subarray A[1..j].

6 Insertion Sort
INSERTION-SORT(A)                                        cost   times
  for j <- 2 to length[A]                                c1     n
    do key <- A[j]                                       c2     n-1
       // insert A[j] into the sorted sequence A[1..j-1]
       i <- j-1                                          c4     n-1
       while i > 0 and A[i] > key                        c5     Σ_{j=2}^n t_j
         do A[i+1] <- A[i]                               c6     Σ_{j=2}^n (t_j - 1)
            i <- i-1                                     c7     Σ_{j=2}^n (t_j - 1)
       A[i+1] <- key                                     c8     n-1
Here t_j is the number of times the while-loop test is executed for that value of j. In the best case (already sorted) t_j = 1; in the worst case (reverse sorted) t_j = j, so Σ_{j=2}^n t_j = n(n+1)/2 - 1 and Σ_{j=2}^n (t_j - 1) = n(n-1)/2.
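For concreteness, here is a minimal runnable sketch of the pseudocode above in Python, using 0-based indexing (the function name and the sample input are illustrative only):

```python
def insertion_sort(a):
    """Sort the list a in place into nondecreasing order."""
    for j in range(1, len(a)):        # for j <- 2 to length[A]
        key = a[j]                    # key <- A[j]
        i = j - 1                     # i <- j - 1
        # shift elements of the sorted prefix a[0..j-1] that exceed key
        while i >= 0 and a[i] > key:  # while i > 0 and A[i] > key
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key                # A[i+1] <- key
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]
```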

7 Insertion Sort: Running time
T(n) = c1·n + c2·(n-1) + c4·(n-1) + c5·Σ_{j=2}^n t_j + c6·Σ_{j=2}^n (t_j - 1) + c7·Σ_{j=2}^n (t_j - 1) + c8·(n-1)   <- general
Best case (array already sorted, t_j = 1):
T(n) = c1·n + c2·(n-1) + c4·(n-1) + c5·(n-1) + c8·(n-1)
     = (c1 + c2 + c4 + c5 + c8)·n - (c2 + c4 + c5 + c8)
     = a·n + b   <- linear
Worst case (array sorted in reverse order, t_j = j):
T(n) = c1·n + c2·(n-1) + c4·(n-1) + c5·(n(n+1)/2 - 1) + c6·(n(n-1)/2) + c7·(n(n-1)/2) + c8·(n-1)
     = (c5/2 + c6/2 + c7/2)·n^2 + (c1 + c2 + c4 + c5/2 - c6/2 - c7/2 + c8)·n - (c2 + c4 + c5 + c8)
     = a·n^2 + b·n + c   <- quadratic
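As a small empirical illustration of the linear vs. quadratic behaviour, the sketch below counts element shifts, i.e. roughly Σ (t_j - 1); the helper name and the choice of what to count are mine, not from the slides:

```python
def insertion_sort_shifts(a):
    """Return the total number of element shifts performed by insertion sort."""
    a = list(a)
    shifts = 0
    for j in range(1, len(a)):
        key, i = a[j], j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
            shifts += 1
        a[i + 1] = key
    return shifts

n = 100
print(insertion_sort_shifts(range(n)))          # best case (sorted input): 0 shifts
print(insertion_sort_shifts(range(n, 0, -1)))   # worst case (reversed): n(n-1)/2 = 4950
```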

8 Insertion Sort: Running time
Worst-case analysis: the longest running time for any input of size n; it gives an upper bound on the running time, and the worst case occurs fairly often for many algorithms. The average case is often roughly as bad as the worst case.
Average-case (expected time) analysis: of interest when analyzing specific scientific problems, but it is often difficult to determine what constitutes an "average" input for a problem (e.g., one may assume all inputs of a given size are equally likely). It is also used when analyzing randomized algorithms.

9 Order of Growth
We consider only the rate of growth of the running time: only the leading term (e.g., a·n^2) counts, since lower-order terms are insignificant for large n. The worst-case running time of insertion sort is Θ(n^2). One algorithm is considered more efficient than another if its worst-case running time has a lower order of growth.

10 Divide and Conquer Approach
Used for algorithms that are recursive in structure:
- Divide: break the problem into several subproblems that are similar to the original but smaller in size.
- Conquer: solve the subproblems recursively.
- Combine: combine the subproblem solutions into a solution to the original problem.

11 E.g. MERGE SORT
- Divide the n-element sequence into two subsequences of n/2 elements each.
- Sort the two subsequences recursively using merge sort.
- Merge the two sorted subsequences to produce the sorted answer.
Note: when the sequence to be sorted has length 1, it is already sorted.

12 MERGE SORT
MERGE-SORT(A, p, r)
  if p < r
    then q <- ⌊(p+r)/2⌋
         MERGE-SORT(A, p, q)
         MERGE-SORT(A, q+1, r)
         MERGE(A, p, q, r)
Here A is an array and p, q, r are indices into A such that p ≤ q < r. MERGE(A, p, q, r) is an auxiliary procedure that assumes the subarrays A[p..q] and A[q+1..r] are in sorted order and merges them into a single sorted subarray A[p..r]; it takes Θ(n) time, where n = r - p + 1.
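A minimal Python sketch of MERGE-SORT and MERGE, assuming 0-based half-open ranges instead of the 1-based inclusive indices of the pseudocode:

```python
def merge(a, p, q, r):
    """Merge sorted a[p:q] and a[q:r] into sorted a[p:r] (Theta(n) time)."""
    left, right = a[p:q], a[q:r]
    i = j = 0
    for k in range(p, r):
        if j >= len(right) or (i < len(left) and left[i] <= right[j]):
            a[k] = left[i]; i += 1
        else:
            a[k] = right[j]; j += 1

def merge_sort(a, p=0, r=None):
    """Sort a[p:r] in place."""
    if r is None:
        r = len(a)
    if r - p > 1:                 # a range of length 1 (or 0) is already sorted
        q = (p + r) // 2          # divide
        merge_sort(a, p, q)       # conquer
        merge_sort(a, q, r)
        merge(a, p, q, r)         # combine
    return a

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))   # [1, 2, 2, 3, 4, 5, 6, 7]
```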

13 Analysis of Divide-and-Conquer
T(n): running time on a problem of size n.
If the problem size is small enough, n ≤ c, the straightforward solution takes Θ(1). Otherwise we divide the problem into a subproblems, each 1/b the size of the original.
D(n): time to divide the problem into subproblems.
C(n): time to combine the subproblem solutions into a solution to the original problem.

14 Analysis
Running time of the algorithm:
T(n) = Θ(1)                          if n ≤ c
T(n) = a·T(n/b) + D(n) + C(n)        otherwise
       (conquer)  (divide)(combine)
Merge sort (assume n = 2^k):
T(n) = Θ(1)                          if n ≤ c
T(n) = 2·T(n/2) + Θ(1) + Θ(n)        otherwise
- Conquer: recursively solve 2 subproblems, each of size n/2.
- Divide: compute the middle of the subarray, Θ(1).
- Combine: the MERGE procedure, Θ(n).
Solution: T(n) = Θ(n log n).
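A quick numerical sanity check, not from the slides: iterating the merge-sort recurrence with an assumed base case T(1) = 1 and comparing it to n·log2 n:

```python
import math

def T(n):
    """Iterate the recurrence T(n) = 2*T(n/2) + n with T(1) = 1."""
    return 1 if n <= 1 else 2 * T(n // 2) + n

for k in (4, 8, 12, 16, 20):
    n = 2 ** k
    print(n, T(n), round(n * math.log2(n)))   # T(n) tracks n*log2(n) up to lower-order terms
```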

15 Growth of Functions
Asymptotic analysis gives a simple characterization of algorithm efficiency and allows us to compare the relative performance of alternative algorithms. For large enough inputs, the multiplicative constants and lower-order terms of an exact running time are dominated by the effect of the input size itself. When input sizes are large enough that only the order of growth of the running time is relevant, we are studying the asymptotic efficiency of the algorithm.

16 Growth of Functions
Asymptotic efficiency: how the running time of an algorithm increases with the size of the input, as the size of the input increases without bound. An algorithm that is asymptotically more efficient is the best choice for all but very small inputs. In general, asymptotic running time is defined in terms of functions whose domain is the set of natural numbers N = {0, 1, 2, ...}.

17 Asymptotic notation: Θ-notation (asymptotically tight bound)
For a given function g(n), we denote by Θ(g(n)) the set of functions
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }.
Any polynomial p(n) = Σ_{i=0}^d a_i·n^i of degree d, with the a_i constant and a_d > 0 (so that p(n) is asymptotically positive), satisfies p(n) = Θ(n^d).
A function f(n) is polynomially bounded if f(n) = O(n^k) for some constant k.

18 Θ-notation (asymptotically tight bound)
For all n ≥ n0, f(n) equals g(n) to within a constant factor; we say g(n) is an asymptotically tight bound for f(n).
Note: f(n) must be nonnegative for n ≥ n0, and g(n) must be asymptotically nonnegative.

19 Θ-notation (asymptotically tight bound)
Any constant is a degree-zero polynomial, so Θ(n^0) = Θ(1).
Note: p(n) = Σ_{i=0}^d a_i·n^i with the a_i constant and a_d > 0 implies p(n) = Θ(n^d).

20 Asymptotic notation: O-notation (asymptotic upper bound)
O(g(n)) = { f(n) : there exist positive constants c, n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }.
Θ(g(n)) ⊆ O(g(n)).

21 Asymptotic notation
Observations: O-notation describes an upper bound on the worst-case running time, and hence on the running time for arbitrary inputs, i.e., on every input.
E.g., for insertion sort, O(n^2) is an upper bound both for arbitrary inputs (the worst case) and for every input. Θ(n^2) is a tight bound for arbitrary inputs (the worst case), but not for every input: when the list is already sorted the running time is Θ(n).

22 Asymptotic notation: Ω-notation (asymptotic lower bound)
Ω(g(n)) = { f(n) : there exist positive constants c, n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }.
It bounds the best-case running time to within a constant factor, and hence the running time on arbitrary inputs.
E.g., insertion sort is Ω(n).

23 Theorem
For any two functions f(n) and g(n), f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).
Note: Ω-notation describes a lower bound on the best-case running time: the running time of an algorithm on arbitrary inputs is at least a constant times g(n) for n ≥ n0.
E.g., insertion sort is O(n^2) and Ω(n); both bounds hold for every input, but neither is asymptotically tight for every input.
Note also: n^2 = Ω(n) and n = O(n^2).

24 Theorem
E.g., show that (1/2)n^2 - 3n = Θ(n^2).
Proof. Notation: g(n) = n^2, so Θ(g(n)) = Θ(n^2), and f(n) = (1/2)n^2 - 3n. We need to prove that there exist positive constants c1, c2, n0 such that
c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0, i.e.
c1·n^2 ≤ (1/2)n^2 - 3n ≤ c2·n^2.
Dividing by n^2: c1 ≤ 1/2 - 3/n ≤ c2,
which is true for n0 = 7, c1 = 1/14, c2 = 1/2 (other choices of constants work as well).
Note: we only need to show that at least one choice of constants exists.
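A quick numeric sanity check of the constants chosen above (c1 = 1/14, c2 = 1/2, n0 = 7), using exact rational arithmetic; the check itself is mine, not part of the proof:

```python
from fractions import Fraction as F

c1, c2, n0 = F(1, 14), F(1, 2), 7
f = lambda n: F(1, 2) * n * n - 3 * n      # f(n) = n^2/2 - 3n
g = lambda n: F(n * n)                     # g(n) = n^2

# verify c1*g(n) <= f(n) <= c2*g(n) on a range of n >= n0
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 5000))
print("c1*g(n) <= f(n) <= c2*g(n) holds for all tested n >= 7")
```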

25 Theorem
E.g., show that 6n^3 ≠ O(n^2).
Proof. f(n) = 6n^3, g(n) = n^2. Suppose there exist constants c2, n0 such that 6n^3 ≤ c2·n^2 for all n ≥ n0. Then n ≤ c2/6, which cannot hold for all large n since c2 is a constant.

26 Theorem
E.g., show that any quadratic function f(n) = a·n^2 + b·n + c, with a, b, c constants and a > 0, is in Θ(n^2), and hence also in O(n^2).
Proof.
1) Informally: throw away the lower-order terms => f(n) = Θ(n^2).
2) Formally: for c1 = a/4, c2 = 7a/4 and n0 = 2·max(|b|/a, sqrt(|c|/a)),
0 ≤ c1·n^2 ≤ a·n^2 + b·n + c ≤ c2·n^2 for all n ≥ n0,
so f(n) = Θ(n^2). In particular f(n) = O(n^2), since the c2 and n0 above satisfy 0 ≤ f(n) ≤ c2·n^2 for all n ≥ n0.
E.g., any linear function a·n + b is in O(n^2).
Proof: f(n) = a·n + b, g(n) = n^2; 0 ≤ a·n + b ≤ c·n^2 holds for n0 = 1 and c = a + |b|. q.e.d.

27 Recurrences
A recurrence is an equation that describes a function in terms of its value on smaller inputs.
E.g., worst-case merge sort:
T(n) = Θ(1)              if n = 1
T(n) = 2·T(n/2) + Θ(n)   if n > 1
with solution T(n) = Θ(n log n).
Solving a recurrence means obtaining asymptotic Θ or O bounds on its solution.

28 Recurrences: Methods
- Substitution: guess a bound, then use induction to prove the guess correct.
- Iteration: convert the recurrence into a summation, then bound the summation to solve the recurrence.
- Master method: for recurrences of the form T(n) = a·T(n/b) + f(n), where a ≥ 1, b > 1 and f(n) is a given function.

29 Assumptions and observations
1) Merge sort:
T(n) = Θ(1)                              if n = 1
T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + Θ(n)        if n > 1
The running time is defined only when n is an integer.
2) Boundary conditions: the running time of an algorithm on a constant-size input is a constant, so T(n) = Θ(1) for sufficiently small n. => We omit statements of the boundary conditions of recurrences and assume that T(n) is constant for small n.

30 Assumptions and observations
E.g., we write T(n) = 2·T(n/2) + Θ(n) and omit giving values for small n.
Reason: although changing T(1) changes the exact solution of the recurrence, the solution does not change by more than a constant factor, so the order of growth is unchanged.
In general: omit floors, ceilings, and boundary conditions.

31 Substitution method
Guess the form of the solution, then use mathematical induction to find the constants and show that the solution works. Used to establish upper or lower bounds on a recurrence.
E.g., find an upper bound for T(n) = 2·T(⌊n/2⌋) + n.
Guess: T(n) = O(n log n). We need to prove that T(n) ≤ c·n·log n for an appropriate constant c > 0.
Assume the bound is correct for ⌊n/2⌋, i.e., T(⌊n/2⌋) ≤ c·⌊n/2⌋·log(⌊n/2⌋). Substituting into T(n):
T(n) ≤ 2·(c·⌊n/2⌋·log(⌊n/2⌋)) + n
     ≤ c·n·log(n/2) + n
     = c·n·log n - c·n·log 2 + n
     = c·n·log n - c·n + n
     ≤ c·n·log n   for c ≥ 1. q.e.d.

32 Substitution method
We must also prove that T(n) ≤ c·n·log n holds for the boundary conditions. Taking T(2) = 4 and T(3) = 5 as boundary cases, we need T(2) ≤ c·2·log 2 and T(3) ≤ c·3·log 3, both of which hold for c ≥ 2.
Note: T(1) is not a good boundary case, since T(1) ≤ c·1·log 1 = 0 and no c can satisfy that.

33 Subtleties
T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + 1; guess O(n).
Try to show T(n) ≤ c·n for an appropriate c:
T(n) ≤ c·⌈n/2⌉ + c·⌊n/2⌋ + 1 = c·n + 1,
which does NOT imply T(n) ≤ c·n for any c.
Trying the larger guess T(n) = O(n^2) would work, but the guess T(n) = O(n) is in fact correct; we just strengthen the inductive hypothesis: guess T(n) ≤ c·n - b for some constant b ≥ 0. Then
T(n) ≤ (c·⌈n/2⌉ - b) + (c·⌊n/2⌋ - b) + 1 = c·n - 2b + 1 ≤ c·n - b   for all b ≥ 1.
We must also show that the bound holds for the boundary conditions.

34 False proof
E.g., determine an upper bound for the recurrence T(n) = 2·T(⌊n/2⌋) + n.
"Proof": guess T(n) ≤ c·n; assume T(⌊n/2⌋) ≤ c·⌊n/2⌋ and substitute:
T(n) ≤ 2·c·⌊n/2⌋ + n ≤ c·n + n = O(n)
This proof is wrong, because we had to prove the exact form of the inductive hypothesis, namely T(n) ≤ c·n (not T(n) ≤ c·n + n).

35 Iteration Method
Expand the recurrence and express it as a summation of terms that depend only on n and the initial conditions.
E.g.: T(n) = 3·T(⌊n/4⌋) + n
T(n) = n + 3·T(⌊n/4⌋)
     = n + 3·(⌊n/4⌋ + 3·T(⌊n/16⌋))
     = n + 3·⌊n/4⌋ + 9·(⌊n/16⌋ + 3·T(⌊n/64⌋))
     = ...
     ≤ n + 3n/4 + 9n/16 + 27n/64 + ... + 3^{log_4 n}·Θ(1)
     ≤ n·Σ_{i=0}^{∞} (3/4)^i + Θ(n^{log_4 3})
     = 4n + Θ(n^{log_4 3}),
since Σ_{k=0}^{∞} x^k = 1/(1-x) when |x| < 1, so n·Σ_{i=0}^{∞} (3/4)^i = 4n.
Notes: the general term of the expansion is 3^i·T(⌊n/4^i⌋); the last term is reached after about log_4 n iterations; ⌊n/4^i⌋ ≤ n/4^i and 3^{log_4 n} = n^{log_4 3}.

36 Iteration Method
⟹ T(n) ≤ 4n + O(n^{log_4 3}) = O(n)
Note: O(n^{log_4 3}) = O(n), since log_4 3 < 1 (this step is not asymptotically tight, but an upper bound is all we need).
Observations: what matters is the number of times the recurrence needs to be iterated to reach the boundary condition, and the sum of the terms arising from each level of the iteration.
Recursion trees help visualize what happens when a recurrence is iterated.
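A small numeric sketch, not from the slides, of the bound just derived: iterating T(n) = 3·T(⌊n/4⌋) + n with an assumed constant base case and comparing against 4n:

```python
def T(n):
    """Iterate T(n) = 3*T(n/4) + n with a constant base case for n < 4."""
    return 1 if n < 4 else 3 * T(n // 4) + n

for n in (10, 100, 1_000, 10_000, 100_000):
    print(n, T(n), 4 * n)   # T(n) stays below 4n, matching the bound derived above
```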

37 Iteration Method
E.g. 1) T(n) = 2·T(n/2) + n^2, assume n = 2^k.
Construction of the recursion tree: the root costs n^2 and has two children, each costing (n/2)^2; in general, level i has 2^i nodes, each costing (n/2^i)^2, so the cost per level is n^2·(1/2)^i.

38 Iteration Method
The total is at most a constant factor more than the largest term:
Total = Θ(n^2), since n^2·Σ_{i=0}^{∞} (1/2)^i = n^2·(1/(1 - 1/2)) = 2n^2.
The height of the tree + 1 gives the number of levels of the iteration; at each level the partial terms are computed.

39 Iteration Method
E.g. 2) T(n) = T(n/3) + T(2n/3) + n (omitting floors/ceilings).
The height of the tree is k = log_{3/2} n, since the longest path from the root to a leaf follows the 2/3 branch: n → (2/3)n → (2/3)^2·n → ... → 1.
Each level contributes at most n, so the total is at most n × log_{3/2} n = O(n log n).

40 Iteration Method
Observation: Example 2 differs from Example 1 in that the total does not differ from the term at the top of the tree by a constant factor (the per-level cost stays about n instead of decreasing geometrically).

41 Induction Method
E.g., use mathematical induction to show that the solution of the recurrence
T(n) = 2               if n = 2
T(n) = 2·T(n/2) + n    if n = 2^k, k > 1
is T(n) = n·log n.
Base case: T(2) = 2·log 2 = 2. ✓
Inductive step: for n = 2^k with k ≥ 1, assume T(2^k) = 2^k·log 2^k = k·2^k. Then, for n = 2^{k+1}:
T(2^{k+1}) = 2·T(2^k) + 2^{k+1}
           = 2·k·2^k + 2^{k+1}
           = (k+1)·2^{k+1}
           = 2^{k+1}·log 2^{k+1},
so the formula holds. q.e.d.
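A quick programmatic check, not from the slides, that the recurrence really gives T(2^k) = k·2^k:

```python
def T(n):
    """The recurrence T(2) = 2, T(n) = 2*T(n/2) + n for n = 2^k."""
    return 2 if n == 2 else 2 * T(n // 2) + n

for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * k      # n * log2(n), since log2(n) = k
print("T(2^k) = k * 2^k verified for k = 1..10")
```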

42 Induction Method (recurrence theory)
T(n) = 2·T(n/2) + n. With n = 2^k: T(2^k) = 2·T(2^{k-1}) + 2^k. Let V(k) = T(2^k), so
V(k) = 2·V(k-1) + 2^k.
Particular solution: try V_p(k) = A·k·2^k. Substituting:
A·k·2^k = 2·A·(k-1)·2^{k-1} + 2^k  =>  A·k = A·k - A + 1  =>  A = 1  =>  V_p(k) = k·2^k.
Homogeneous solution: V_h(k) = c_1·2^k.
General solution: V(k) = V_h(k) + V_p(k) = c_1·2^k + k·2^k.
Initial condition: V(1) = T(2) = 2  =>  2·c_1 + 2 = 2  =>  c_1 = 0, so V_h(k) = 0 and V(k) = V_p(k) = k·2^k.
Therefore T(n) = n·log n. q.e.d.

43 Induction Method
E.g., find Σ_{k=1}^{∞} (2k+1)·x^{2k} for |x| < 1. Put y = x^2, so y < 1:
Σ_{k=1}^{∞} (2k+1)·x^{2k} = Σ_{k=0}^{∞} (2k+1)·x^{2k} - 1
 = Σ_{k=0}^{∞} 2k·y^k + Σ_{k=0}^{∞} y^k - 1
 = 2·y/(1-y)^2 + 1/(1-y) - 1
 = 2·x^2/(1-x^2)^2 + x^2/(1-x^2)
 = x^2·(3 - x^2)/(1 - x^2)^2.

44 The Master Theorem
Let a ≥ 1 and b > 1 be constants, let f(n) be an (asymptotically positive) function, and let T(n) be defined on the nonnegative integers by the recurrence
T(n) = a·T(n/b) + f(n).
Interpretation: a problem of size n is divided into a subproblems, each of size n/b; the a subproblems are solved recursively, each in time T(n/b); the cost of dividing the problem and combining the results is f(n).
E.g., merge sort: T(n) = 2·T(n/2) + Θ(n), so a = 2, b = 2, f(n) = Θ(n).
Then T(n) is bounded asymptotically as follows:
1) If f(n) = O(n^{log_b a - ε}) for some constant ε > 0, then T(n) = Θ(n^{log_b a}).
2) If f(n) = Θ(n^{log_b a}), then T(n) = Θ(n^{log_b a}·log n).
3) If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n (the "regularity" condition), then T(n) = Θ(f(n)).

45 The Master Theorem: Intuition
We compare f(n) with n^{log_b a}:
1) If n^{log_b a} is polynomially larger than f(n), then T(n) = Θ(n^{log_b a}).
2) If n^{log_b a} = Θ(f(n)) (the functions have the same order), then T(n) = Θ(n^{log_b a}·log n) = Θ(f(n)·log n).
3) If f(n) is polynomially larger than n^{log_b a}, then T(n) = Θ(f(n)).
"Polynomially larger (smaller)" means f(n) is asymptotically larger (smaller) than n^{log_b a} by a factor of n^ε for some constant ε > 0.
Gaps (where the master theorem cannot be used):
- between cases (1) and (2): f(n) is smaller than n^{log_b a} but not polynomially smaller;
- between cases (2) and (3): f(n) is larger than n^{log_b a} but not polynomially larger;
- when the regularity condition in case (3) fails to hold.

46 Examples
1) T(n) = 9·T(n/3) + n: a = 9, b = 3, n^{log_b a} = n^{log_3 9} = n^2, f(n) = n.
f(n) = n = n^{2-1} = O(n^{log_3 9 - ε}) with ε = 1, so case (1) applies ⟹ T(n) = Θ(n^2).
2) T(n) = T(2n/3) + 1: a = 1, b = 3/2, n^{log_b a} = n^{log_{3/2} 1} = n^0 = 1, f(n) = 1.
f(n) = 1 = Θ(n^{log_b a}) = Θ(1), so case (2) applies ⟹ T(n) = Θ(log n).

47 Examples
3) T(n) = 3·T(n/4) + n·log n: a = 3, b = 4, n^{log_b a} = n^{log_4 3} ≈ n^0.793, f(n) = n·log n.
f(n) = Ω(n^{log_4 3 + ε}) = Ω(n) for ε ≈ 0.2, so case (3) applies if the "regularity" condition holds for f(n).
For n sufficiently large: a·f(n/b) = 3·(n/4)·log(n/4) ≤ (3/4)·n·log n = c·f(n) with c = 3/4 < 1.
⟹ By case (3), T(n) = Θ(n·log n).

48 Examples
The master method does not apply to the recurrence T(n) = 2·T(n/2) + n·log n:
a = 2, b = 2, n^{log_b a} = n, f(n) = n·log n.
f(n) = n·log n is asymptotically larger than n^{log_b a} = n, but not polynomially larger:
f(n)/n^{log_b a} = log n, and log n < n^ε for every constant ε > 0 (for large n),
⟹ the recurrence falls into the gap between cases (2) and (3).

49 Recurrences: Linear homogeneous recurrences (constant coefficients)
f(n) = a_1·f(n-1) + a_2·f(n-2) + ... + a_k·f(n-k)
Solution: plug in f(n) = v^n, divide by v^{n-k}, and solve the resulting degree-k polynomial (characteristic) equation for v, which has k roots. Then
f(n) = c_1·v_1^n + c_2·v_2^n + ... + c_k·v_k^n,
where v_1, v_2, ..., v_k are the roots of the polynomial and the c_i are determined by the initial conditions.
E.g., Fibonacci numbers: F(n) = F(n-1) + F(n-2). Let F(n) = v^n:
v^n - v^{n-1} - v^{n-2} = 0   | divide by v^{n-2}
v^2 - v - 1 = 0
v = (1 ± √5)/2, giving Φ = (1 + √5)/2 (the golden ratio) and Φ̂ = (1 - √5)/2.

50 Recurrences
⟹ F(n) = c_1·Φ^n + c_2·Φ̂^n
Initial conditions: F(0) = 0, F(1) = 1:
c_1·Φ^0 + c_2·Φ̂^0 = 0  ⟹  c_1 + c_2 = 0
c_1·Φ^1 + c_2·Φ̂^1 = 1  ⟹  Φ·c_1 + Φ̂·c_2 = 1
⟹ (Φ - Φ̂)·c_1 = 1  ⟹  c_1 = 1/(Φ - Φ̂) = 1/√5,  c_2 = -1/√5.
Thus F(n) = (Φ^n - Φ̂^n)/√5.
Check: F(0) = (Φ^0 - Φ̂^0)/√5 = 0 ✓ and F(1) = (Φ^1 - Φ̂^1)/√5 = 1 ✓.
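A short numerical check, not from the slides, of the closed form F(n) = (Φ^n - Φ̂^n)/√5 against the recurrence itself (double precision is accurate enough for small n):

```python
import math

SQRT5 = math.sqrt(5)
PHI = (1 + SQRT5) / 2          # golden ratio
PHI_HAT = (1 - SQRT5) / 2

def fib_closed(n):
    """Binet-style closed form, rounded to the nearest integer."""
    return round((PHI ** n - PHI_HAT ** n) / SQRT5)

def fib_rec(n):
    """Iterate the recurrence F(n) = F(n-1) + F(n-2), F(0) = 0, F(1) = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

assert all(fib_closed(n) == fib_rec(n) for n in range(40))
print([fib_closed(n) for n in range(10)])   # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```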

51 Recurrences
Inductive step: assume the formula holds for F(i-1) and F(i-2). Then
F(i) = F(i-1) + F(i-2)
     = (1/√5)·(Φ^{i-1} - Φ̂^{i-1}) + (1/√5)·(Φ^{i-2} - Φ̂^{i-2})
     = (1/√5)·(Φ^{i-2}·(Φ + 1) - Φ̂^{i-2}·(Φ̂ + 1)).
Since Φ + 1 = (3 + √5)/2 = ((1 + √5)/2)^2 = Φ^2 and Φ̂ + 1 = (3 - √5)/2 = ((1 - √5)/2)^2 = Φ̂^2,
⟹ F(i) = (1/√5)·(Φ^{i-2}·Φ^2 - Φ̂^{i-2}·Φ̂^2) = (1/√5)·(Φ^i - Φ̂^i).
Note: if a root is repeated, put a factor of n in front of the repeated root's term.
E.g., f(n) = 6·f(n-1) - 9·f(n-2):  v^n - 6·v^{n-1} + 9·v^{n-2} = 0  ⟹  v_{1,2} = 3, 3  ⟹  f(n) = c_1·3^n + c_2·n·3^n.

52 Recurrences
Non-homogeneous linear recurrences with constant coefficients:
f(n) = a_1·f(n-1) + a_2·f(n-2) + ... + a_k·f(n-k) + b(n),
where b(n) is a polynomial in n of degree m multiplied by α^n (α a constant), e.g.
b(n) = (b_m·n^m + b_{m-1}·n^{m-1} + ... + b_1·n + b_0)·α^n.
Solution: f(n) = f_h(n) + f_p(n), where
f_h(n) is the solution of the homogeneous equation (f(n) without b(n)), and
f_p(n) is a particular solution formed by making an educated guess at its form and plugging it into the recurrence.
E.g. (exercise): recursive insertion sort, f(n) = f(n-1) + n.
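A quick sketch for the exercise, assuming the initial condition f(1) = 1 (not stated on the slide): the solution is the quadratic n(n+1)/2, so recursive insertion sort does Θ(n^2) work:

```python
def f(n):
    """Recursive insertion sort cost: f(n) = f(n-1) + n, with f(1) = 1 assumed."""
    return 1 if n == 1 else f(n - 1) + n

assert all(f(n) == n * (n + 1) // 2 for n in range(1, 200))
print("f(n) = n(n+1)/2, i.e. Theta(n^2)")
```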

53 Recurrences: Nonlinear recurrences
f(n) = c·f(n/a) + b(n)
Let n = a^k and V(k) = f(a^k); then f(a^k) = c·f(a^{k-1}) + b(a^k), i.e. V(k) = c·V(k-1) + b(a^k), a linear recurrence, so V(k) = V_h(k) + V_p(k).
E.g. (exercise): merge sort, f(n) = 2·f(n/2) + n. With n = 2^k:
f(2^k) = 2·f(2^{k-1}) + 2^k,  i.e.  V(k) = 2·V(k-1) + 2^k.
Homogeneous solution: V_h(k) = c_1·2^k.
Particular solution: try V_p(k) = A·k·2^k:
A·k·2^k = 2·A·(k-1)·2^{k-1} + 2^k = A·k·2^k - A·2^k + 2^k  ⟹  A·k = A·k - A + 1  ⟹  A = 1.
So V(k) = c_1·2^k + k·2^k, and since n = 2^k and k = log n,
f(n) = c_1·n + n·log n = Θ(n·log n).

