Introduction To Algorithms
Algorithm: any well-defined computational procedure that takes some value, or set of values, as input and produces some value, or set of values, as output. A tool for solving a well-specified computational problem.
E.g. sorting a sequence of numbers into nondecreasing order.
Input: a sequence of n numbers (a_1, a_2, ..., a_n)
Output: a permutation (reordering) (a'_1, a'_2, ..., a'_n) of the input sequence such that a'_1 <= a'_2 <= ... <= a'_n
Algorithm
An instance of a problem consists of all the inputs needed to compute a solution to the problem.
Choosing an algorithm depends on:
- the number of elements to be sorted
- to what extent the elements are already sorted
- the kind of storage used: disk and/or tape
An algorithm is correct if, for every input instance, it halts with the correct output. A correct algorithm solves the given computational problem. An incorrect algorithm might not halt at all on some input instances, or it might halt with other than the desired answer. If the error can be controlled, incorrect algorithms can still be useful (e.g. approximation algorithms).
Analyzing Algorithms
Analyzing an algorithm means predicting the resources the algorithm requires:
- memory
- communication bandwidth
- computational time
One problem can often be solved by a number of candidate algorithms, from which the most efficient is chosen. Sequential algorithms assume a generic one-processor, random-access machine (RAM) model of computation (instructions are executed sequentially, with no concurrent operations).
Analysis of Insertion Sort
The running time depends on the input size and on how sorted the input already is (both depend on the problem being studied). The running time on a particular input is the number of primitive operations (steps) executed; e.g. pseudocode assumes "atomic" operations, so the running time is the sum of the running times of each statement executed.
Insertion sort uses an incremental approach: having sorted the subarray A[1..j-1], we insert the single element A[j] into its proper place, resulting in the sorted subarray A[1..j].
Insertion Sort

INSERTION-SORT(A)                                      cost   times
  for j <- 2 to length[A]                              c1     n
      do key <- A[j]                                   c2     n-1
         // insert A[j] into the sorted sequence A[1..j-1]
         i <- j-1                                      c4     n-1
         while i > 0 and A[i] > key                    c5     sum_{j=2}^{n} t_j
             do A[i+1] <- A[i]                         c6     sum_{j=2}^{n} (t_j - 1)
                i <- i-1                               c7     sum_{j=2}^{n} (t_j - 1)
         A[i+1] <- key                                 c8     n-1

where t_j is the number of times the while-loop test is executed for that value of j.
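The pseudocode above translates directly to a runnable sketch (using 0-based indexing instead of the slides' 1-based A[1..n]):

```python
def insertion_sort(a):
    """In-place insertion sort mirroring the pseudocode (0-based indexing)."""
    for j in range(1, len(a)):          # for j <- 2 to length[A]
        key = a[j]                      # key <- A[j]
        i = j - 1                       # i <- j-1
        # shift elements of the sorted prefix a[0..j-1] that exceed key
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key                  # insert key into its proper place
    return a
```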
Insertion Sort: Running time

General:
T(n) = c1*n + c2(n-1) + c4(n-1) + c5*sum_{j=2}^{n} t_j + c6*sum_{j=2}^{n} (t_j - 1) + c7*sum_{j=2}^{n} (t_j - 1) + c8(n-1)

Best case (array already sorted, so every t_j = 1):
T(n) = c1*n + c2(n-1) + c4(n-1) + c5(n-1) + c8(n-1)
     = (c1 + c2 + c4 + c5 + c8)n - (c2 + c4 + c5 + c8)
     = a*n + b   <- linear

Worst case (array sorted in reverse order, so t_j = j):
T(n) = c1*n + c2(n-1) + c4(n-1) + c5(n(n+1)/2 - 1) + c6(n(n-1)/2) + c7(n(n-1)/2) + c8(n-1)
     = a*n^2 + b*n + c   <- quadratic
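A quick way to see the linear/quadratic split is to count the shift operations (the c6/c7 term): on an already sorted array every t_j = 1 and no shifts occur, while on a reversed array t_j = j and the shifts total n(n-1)/2. A small counting sketch:

```python
def count_shifts(a):
    """Count executions of the inner-loop body (the c6/c7 term) of insertion sort."""
    a = list(a)
    shifts = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
            shifts += 1
        a[i + 1] = key
    return shifts
```

For n = 10, a sorted input gives 0 shifts and a reversed input gives 10*9/2 = 45.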
Insertion Sort: Running time
Worst case: the longest running time for any input of size n — an upper bound on the running time. It occurs often in many algorithms, and the average case is often about as bad as the worst case.
Average case (expected time): of interest when analyzing specific scientific problems, but it is difficult to determine what constitutes an "average" input for a problem (e.g. assume all inputs of a given size are equally likely). Used in randomized algorithms.
Order of Growth
Rate of growth of the running time: only the leading term (e.g. a*n^2) counts, since lower-order terms are insignificant for large n. The worst-case running time for insertion sort is Θ(n^2). One algorithm is considered more efficient than another if its worst-case running time has a lower order of growth.
Divide and Conquer Approach
For algorithms that are recursive in structure:
- Divide: break the problem into several subproblems that are similar to the original but smaller in size.
- Conquer: solve the subproblems recursively.
- Combine: combine the subproblem solutions to create a solution to the original problem.
E.g. MERGE SORT
- Divide the n-element sequence into two subsequences of n/2 elements each.
- Sort the two subsequences recursively using merge sort.
- Merge the two sorted subsequences to produce the sorted answer.
Note: when the sequence to be sorted has length 1, it is already sorted.
MERGE SORT

MERGE-SORT(A, p, r)
  if p < r
      then q <- floor((p + r) / 2)
           MERGE-SORT(A, p, q)
           MERGE-SORT(A, q + 1, r)
           MERGE(A, p, q, r)

where A is an array and p, q, r are indices into A such that p <= q < r.
MERGE(A, p, q, r) is an auxiliary procedure that assumes the subarrays A[p..q] and A[q+1..r] are in sorted order, and merges them into a single sorted subarray A[p..r]. It takes Θ(n) time.
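A runnable sketch of MERGE-SORT and the MERGE auxiliary procedure (0-based, with p and r inclusive, matching the p <= q < r convention above):

```python
def merge(a, p, q, r):
    """Merge sorted subarrays a[p..q] and a[q+1..r] into sorted a[p..r]; Theta(n) time."""
    left, right = a[p:q + 1], a[q + 1:r + 1]
    i = j = 0
    for k in range(p, r + 1):
        # take from left while it has elements and its head is <= right's head
        if j >= len(right) or (i < len(left) and left[i] <= right[j]):
            a[k] = left[i]
            i += 1
        else:
            a[k] = right[j]
            j += 1

def merge_sort(a, p, r):
    if p < r:                    # length-1 subarrays are already sorted
        q = (p + r) // 2         # q <- floor((p+r)/2)
        merge_sort(a, p, q)
        merge_sort(a, q + 1, r)
        merge(a, p, q, r)
```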
Analysis of Divide-and-Conquer
T(n): running time on a problem of size n.
- If the problem size is small, n <= c, the solution takes Θ(1).
- If we divide the problem into a subproblems, each subproblem has size n/b (1/b the size of the original).
- D(n): time to divide the problem into subproblems.
- C(n): time to combine the subproblem solutions into a solution of the original.
Analysis of Divide-and-Conquer
Running time of the algorithm:
T(n) = Θ(1)                        if n <= c
T(n) = a*T(n/b) + D(n) + C(n)      otherwise
       (conquer)  (divide)(combine)

Merge sort (assume n = 2^k):
T(n) = Θ(1)                        if n <= c
T(n) = 2T(n/2) + Θ(1) + Θ(n)       otherwise
- Conquer: solve 2 subproblems recursively, each of size n/2.
- Divide: compute the middle of the subarray: Θ(1).
- Combine: the MERGE procedure: Θ(n).
=> T(n) = Θ(n log n)
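The merge-sort recurrence can be checked numerically. With the Θ-terms replaced by the concrete values 1 and n (an arbitrary choice of constants), T(2^k) = 2^k*(k + 1), which grows as n log n:

```python
def T(n):
    """Merge-sort recurrence with Theta(1) and Theta(n) replaced by 1 and n (assumed constants)."""
    return 1 if n <= 1 else 2 * T(n // 2) + n

# on powers of two: T(2^k) = 2^k * k + 2^k = n * (log2(n) + 1)
```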
Growth of Functions
Gives a simple characterization of algorithm efficiency and allows us to compare the relative performance of alternative algorithms. For large enough inputs, the multiplicative constants and lower-order terms of an exact running time are dominated by the effects of the input size itself. When input sizes are large enough that only the order of growth of the running time is relevant, we are studying the asymptotic efficiency of the algorithm.
Growth of Functions
Asymptotic efficiency: how the running time of an algorithm increases with the size of the input, as the size of the input increases without bound. An algorithm that is asymptotically more efficient is the best choice for all but a very small set of inputs. In general, asymptotic running time is defined in terms of functions whose domains are the set of natural numbers N = {0, 1, 2, ...}.
Asymptotic Notation: Θ-notation (asymptotic tight bound)
For a given function g(n), Θ(g(n)) denotes the set of functions:
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, n0 such that 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0 }
(f(n) is asymptotically positive if f(n) > 0 for all sufficiently large n.)
Any polynomial p(n) = sum_{i=0}^{d} a_i * n^i of degree d is in Θ(n^d).
A function f(n) is polynomially bounded if f(n) = O(n^k) for some constant k.
Θ-notation (asymptotic tight bound), continued
For all n >= n0, f(n) equals g(n) to within a constant factor; g(n) is an asymptotically tight bound for f(n).
Note: f(n) must be nonnegative for n >= n0, and g(n) must be asymptotically nonnegative.
Θ-notation (asymptotic tight bound), continued
Any constant is a degree-zero polynomial: Θ(n^0) = Θ(1).
Note: for p(n) = sum_{i=0}^{d} a_i * n^i with constants a_i and a_d > 0, p(n) = Θ(n^d).
Asymptotic Notation: O-notation (asymptotic upper bound)
O(g(n)) = { f(n) : there exist positive constants c, n0 such that 0 <= f(n) <= c*g(n) for all n >= n0 }
Note: Θ(g(n)) is a subset of O(g(n)).
Asymptotic Notation
Observations: O-notation describes an upper bound on:
- the worst-case running time
- the running time on arbitrary inputs
- the running time on every input (included in the above)
E.g. insertion sort: O(n^2) is an upper bound for arbitrary inputs (the worst case) and for every input. Θ(n^2) is a tight bound for arbitrary inputs (the worst case), but not for every input: on an already sorted list the running time is Θ(n).
Asymptotic Notation: Ω-notation (asymptotic lower bound)
Ω(g(n)) = { f(n) : there exist positive constants c, n0 such that 0 <= c*g(n) <= f(n) for all n >= n0 }
It bounds the best-case running time to within a constant factor, on arbitrary inputs. E.g. insertion sort: Ω(n).
Theorem: for any f(n), g(n): f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).
Note: Ω-notation describes a lower bound — the best-case running time: the running time of an algorithm on arbitrary inputs is at least a constant times g(n) for n >= n0.
E.g. insertion sort is O(n^2) and Ω(n); these are not asymptotically tight bounds for every input.
Note: n^2 = Ω(n) and n = O(n^2).
E.g. show that (1/2)n^2 - 3n = Θ(n^2).
Proof. Here g(n) = n^2 and f(n) = (1/2)n^2 - 3n; we must show f(n) = Θ(n^2).
We need positive constants c1, c2, n0 such that
c1*n^2 <= (1/2)n^2 - 3n <= c2*n^2   for all n >= n0.
Dividing by n^2:
c1 <= 1/2 - 3/n <= c2,
which is true for n0 = 7, c1 = 1/14, c2 = 1/2 (other choices of constants work as well).
Note: we only need to show that at least one choice of constants exists.
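The constants in this proof can be checked mechanically (a finite range of n only supports, not proves, the claim):

```python
def f(n):
    """f(n) = (1/2)n^2 - 3n, the function bounded in the proof."""
    return 0.5 * n * n - 3 * n

c1, c2, n0 = 1 / 14, 1 / 2, 7

# c1*n^2 <= f(n) <= c2*n^2 must hold for all n >= n0; spot-check a range
assert all(c1 * n * n <= f(n) <= c2 * n * n for n in range(n0, 10_000))
# for this c1 the bound indeed fails just below n0: f(6) = 0 < c1*36
assert not (c1 * 36 <= f(6))
```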
E.g. show that 6n^3 is not Θ(n^2).
Proof. f(n) = 6n^3, g(n) = n^2. Suppose there exist c2, n0 such that 6n^3 <= c2*n^2 for all n >= n0. Then n <= c2/6, which cannot hold for all large n, since c2 is a constant.
E.g. show that any quadratic function is in Θ(n^2), and hence in O(n^2) as well.
Proof. f(n) = a*n^2 + b*n + c, with constants a, b, c and a > 0.
1) Informally: throw away the lower-order terms => f(n) = Θ(n^2).
2) Formally: for c1 = a/4, c2 = 7a/4, and n0 = 2*max(|b|/a, sqrt(|c|/a)):
0 <= c1*n^2 <= a*n^2 + b*n + c <= c2*n^2   for all n >= n0,
so f(n) = Θ(n^2). Also f(n) = O(n^2), since the c2 and n0 above satisfy 0 <= f(n) <= c2*n^2 for all n >= n0.
E.g. any linear function a*n + b is in O(n^2): 0 <= a*n + b <= c*n^2 holds for n0 = 1 and c = a + |b|.  q.e.d.
Recurrences
A recurrence is an equation that describes a function in terms of its value on smaller inputs.
E.g. worst-case merge sort:
T(n) = Θ(1)              if n = 1
T(n) = 2T(n/2) + Θ(n)    if n > 1
with solution T(n) = Θ(n log n).
Solving recurrences means obtaining asymptotic Θ or O bounds on the solution.
Recurrences: Methods
- Substitution: guess a bound, then use induction to prove the guess is correct.
- Iteration: convert the recurrence into a summation, then bound the summation to solve the recurrence.
- Master method: for recurrences of the form T(n) = a*T(n/b) + f(n), where a >= 1, b > 1, and f(n) is a given function.
Assumptions and observations:
1) Merge sort:
T(n) = Θ(1)                                    if n = 1
T(n) = T(ceil(n/2)) + T(floor(n/2)) + Θ(n)     if n > 1
The running time is defined only when n is an integer.
2) Boundary conditions: the running time of an algorithm on a constant-size input is a constant, T(n) = Θ(1) for sufficiently small n. => We usually omit statements of the boundary conditions of recurrences and assume that T(n) is constant for small n.
Assumptions and observations:
E.g. we write T(n) = 2T(n/2) + Θ(n) and omit giving values for small n.
Reason: although changing T(1) changes the solution to the recurrence, the solution does not change by more than a constant factor, so the order of growth is unchanged.
In general: omit floors, ceilings, and boundary conditions.
Substitution Method
Guess the form of the solution, then use mathematical induction to find the constants and show that the solution works. Used to establish upper/lower bounds on a recurrence.
E.g. find an upper bound for T(n) = 2T(floor(n/2)) + n.
Guess: T(n) = O(n log n). We need to prove that T(n) <= c*n*log n for an appropriate constant c > 0.
Assume the bound holds for floor(n/2): T(floor(n/2)) <= c*(n/2)*log(n/2).
Substituting into the recurrence:
T(n) <= 2(c*(n/2)*log(n/2)) + n
     <= c*n*log(n/2) + n
     = c*n*log n - c*n*log 2 + n
     = c*n*log n - c*n + n
     <= c*n*log n   for c >= 1.   q.e.d.
Substitution Method
Prove that the guess T(n) <= c*n*log n works for the boundary conditions as well.
Boundary: T(2) = 4 requires 4 <= c*2*log 2, and T(3) = 5 requires 5 <= c*3*log 3; both hold for c >= 2 (for n > 3 the inductive step applies).
Note: T(1) is not a good boundary case, since T(1) <= c*1*log 1 = 0, and no c can satisfy that.
Subtleties
T(n) = T(floor(n/2)) + T(ceil(n/2)) + 1; guess O(n).
Try to show T(n) <= c*n for an appropriate c:
T(n) <= c*floor(n/2) + c*ceil(n/2) + 1 = c*n + 1,
which does NOT imply T(n) <= c*n for any c.
Trying to show T(n) = O(n^2) would work, but the guess T(n) = O(n) was in fact correct; just strengthen the guess to T(n) <= c*n - b for some constant b >= 0:
T(n) <= (c*floor(n/2) - b) + (c*ceil(n/2) - b) + 1 = c*n - 2b + 1 <= c*n - b   for all b >= 1.
We must also show that it works for the boundary conditions.
False proof
E.g. determine an upper bound for the recurrence T(n) = 2T(floor(n/2)) + n.
"Proof": guess T(n) <= c*n.
Assume T(floor(n/2)) <= c*(n/2) and substitute:
T(n) <= 2*c*(n/2) + n = c*n + n = O(n).
This proof is wrong, because we had to prove the exact form T(n) <= c*n, and c*n + n <= c*n holds for no c.
Iteration Method
Expand the recurrence and express it as a summation of terms that depend only on n and the initial conditions.
E.g. T(n) = 3T(floor(n/4)) + n:
T(n) = n + 3T(n/4)
     = n + 3(n/4 + 3T(n/16))
     = n + 3n/4 + 9T(n/16)
     = n + 3n/4 + 9n/16 + ... + 3^{log_4 n} * Θ(1)
     <= n * sum_{i=0}^{infinity} (3/4)^i + Θ(n^{log_4 3})
     = 4n + Θ(n^{log_4 3}),
since sum_{i=0}^{infinity} x^i = 1/(1-x) for |x| < 1.
Note: floor(n/4^i) <= n/4^i, the last term is the boundary Θ(1) reached after about log_4 n iterations, and 3^{log_4 n} = n^{log_4 3}.
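The 4n bound from the summation can be sanity-checked by evaluating the recurrence directly (with the assumed base case T(n) = 1 for n < 4):

```python
def T(n):
    """T(n) = 3*T(floor(n/4)) + n, with assumed base case T(n) = 1 for n < 4."""
    return 1 if n < 4 else 3 * T(n // 4) + n

# the iteration method gives T(n) <= 4n + lower-order terms
assert all(T(4 ** k) <= 4 * 4 ** k for k in range(1, 10))
```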
Iteration Method
=> T(n) <= 4n + O(n) = O(n) (the constant factor 4 is an overestimate, not a tight constant).
Note: O(n^{log_4 3}) = O(n), since log_4 3 < 1.
Observations: the iteration method requires tracking the number of times the recurrence must be iterated to reach the boundary condition, and the sum of the terms arising from each level of the iteration process.
Recursion trees help visualize what happens when a recurrence is iterated.
Iteration Method
E.g. 1) T(n) = 2T(n/2) + n^2, assume n = 2^k.
Construction of the recursion tree: level i (i = 0, 1, ..., log n) has 2^i nodes, each of cost (n/2^i)^2, so level i contributes n^2/2^i in total.
Iteration Method
The total is at most a constant factor more than the largest term:
Total = n^2 * sum_{i=0}^{infinity} (1/2)^i = 2n^2, so T(n) = Θ(n^2).
The height of the tree (+1) gives the number of levels of iteration; at each level the partial terms are computed.
Iteration Method
E.g. 2) T(n) = T(n/3) + T(2n/3) + n (omitting floors/ceilings).
The height of the tree is k = log_{3/2} n, since the longest path from root to leaf is n -> (2/3)n -> (2/3)^2 n -> ... -> 1.
Each level contributes at most n, so T(n) = O(n log n).
Iteration Method
Observation: Eg. 2 differs from Eg. 1 in that the total is not within a constant factor of the term at the top of the tree — every level contributes about equally.
Induction Method
E.g. use mathematical induction to show that the solution of the recurrence
T(n) = 2              if n = 2
T(n) = 2T(n/2) + n    if n = 2^k, k > 1
is T(n) = n log n.
Basis: T(2) = 2*log 2 = 2.
Inductive step: for n = 2^k, k >= 1, assume T(2^k) = 2^k * log 2^k = k * 2^k.
Then, for n = 2^{k+1}:
T(2^{k+1}) = 2T(2^k) + 2^{k+1}
           = 2 * k * 2^k + 2^{k+1}
           = 2^{k+1}(k + 1)
           = 2^{k+1} * log 2^{k+1}.   q.e.d.
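The induction can be mirrored numerically: evaluating the recurrence on powers of two reproduces n * log2(n) exactly:

```python
def T(n):
    """T(2) = 2; T(n) = 2*T(n/2) + n for n = 2^k, k > 1."""
    return 2 if n == 2 else 2 * T(n // 2) + n

# claimed solution: T(2^k) = 2^k * k, i.e. T(n) = n * log2(n)
assert all(T(2 ** k) == 2 ** k * k for k in range(1, 15))
```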
Induction Method: Recurrence Theory
T(n) = 2T(n/2) + n, so T(2^k) = 2T(2^{k-1}) + 2^k.
Let V(k) = T(2^k); then V(k) = 2V(k-1) + 2^k.
Particular solution: try V_p(k) = A*k*2^k:
A*k*2^k = 2A(k-1)*2^{k-1} + 2^k  =>  Ak = Ak - A + 1  =>  A = 1,
so V_p(k) = k*2^k.
General solution: V(k) = V_h(k) + V_p(k) = c1*2^k + k*2^k.
Initial condition: V(1) = T(2) = 2  =>  2*c1 + 2 = 2  =>  c1 = 0.
Hence V(k) = k*2^k, i.e. T(n) = n log n.   q.e.d.
Induction Method
E.g. find, for x < 1:
sum_{n=1}^{infinity} (2n+1)x^{2n}
  = sum_{n=0}^{infinity} (2n+1)x^{2n} - 1
  = 2*sum_{n=0}^{infinity} n*x^{2n} + sum_{n=0}^{infinity} x^{2n} - 1.
Substituting y = x^2 (so y < 1):
  = 2*sum_{n=0}^{infinity} n*y^n + sum_{n=0}^{infinity} y^n - 1
  = 2y/(1-y)^2 + 1/(1-y) - 1
  = 2x^2/(1-x^2)^2 + x^2/(1-x^2)
  = x^2(3 - x^2)/(1 - x^2)^2.
The Master Theorem
Let a >= 1 and b > 1 be constants, let f(n) be an asymptotically positive function, and let T(n) be defined on the nonnegative integers by the recurrence
T(n) = a*T(n/b) + f(n).
A problem of size n is divided into a subproblems, each of size n/b; the a subproblems are solved recursively, each in time T(n/b); the cost of dividing plus combining is f(n).
E.g. merge sort: T(n) = 2T(n/2) + Θ(n), so a = 2, b = 2, f(n) = Θ(n).
Then T(n) is bounded asymptotically as follows:
1) If f(n) = O(n^{log_b a - ε}) for some constant ε > 0, then T(n) = Θ(n^{log_b a}).
2) If f(n) = Θ(n^{log_b a}), then T(n) = Θ(n^{log_b a} * log n).
3) If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0, and if a*f(n/b) <= c*f(n) for some constant c < 1 and all sufficiently large n (the "regularity" condition), then T(n) = Θ(f(n)).
The Master Theorem
Intuition: we compare f(n) with n^{log_b a}.
1) If n^{log_b a} is polynomially larger than f(n), then T(n) = Θ(n^{log_b a}).
2) If the functions have the same size, f(n) = Θ(n^{log_b a}), then T(n) = Θ(n^{log_b a} * log n) = Θ(f(n) * log n).
3) If f(n) is polynomially larger than n^{log_b a}, then T(n) = Θ(f(n)).
"Polynomially larger (smaller)": f(n) must be asymptotically larger (smaller) than n^{log_b a} by a factor of n^ε for some constant ε > 0.
Gaps (where the master theorem cannot be used):
- between cases (1) and (2): f(n) is smaller than n^{log_b a} but not polynomially smaller;
- between cases (2) and (3): f(n) is larger than n^{log_b a} but not polynomially larger;
- when the regularity condition in case (3) fails to hold.
Examples
T(n) = 9T(n/3) + n: a = 9, b = 3, n^{log_b a} = n^{log_3 9} = n^2, f(n) = n.
f(n) = n = n^{2-1} = O(n^{log_3 9 - ε}) with ε = 1, so case (1) applies: T(n) = Θ(n^2).

T(n) = T(2n/3) + 1: a = 1, b = 3/2, n^{log_b a} = n^0 = 1, f(n) = 1.
f(n) = 1 = Θ(n^{log_b a}) = Θ(1), so case (2) applies: T(n) = Θ(log n).
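Case (1) can be checked numerically for the first example. With the assumed base case T(1) = 1, the exact solution on powers of three is T(3^k) = (3*9^k - 3^k)/2, so T(n)/n^2 -> 3/2, consistent with Θ(n^2):

```python
def T(n):
    """T(n) = 9*T(n/3) + n with the assumed base case T(1) = 1."""
    return 1 if n == 1 else 9 * T(n // 3) + n

# closed form on n = 3^k: T(n) = (3*n^2 - n)/2, hence T(n) = Theta(n^2)
assert all(T(3 ** k) == (3 * 9 ** k - 3 ** k) // 2 for k in range(0, 8))
assert abs(T(3 ** 10) / 9 ** 10 - 1.5) < 1e-4
```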
Examples
T(n) = 3T(n/4) + n log n: a = 3, b = 4, n^{log_b a} = n^{log_4 3} ~ n^{0.793}, f(n) = n log n.
f(n) = Ω(n^{log_4 3 + ε}) for ε ~ 0.2, so case (3) applies if the regularity condition holds for f(n):
for large n, a*f(n/4) = 3*(n/4)*log(n/4) <= (3/4)*n*log n = c*f(n), with c = 3/4 < 1.
=> By case (3), T(n) = Θ(n log n).
Examples
The master method does not apply to the recurrence
T(n) = 2T(n/2) + n log n: a = 2, b = 2, n^{log_b a} = n, f(n) = n log n.
f(n) is asymptotically larger than n^{log_b a} = n, but not polynomially larger:
f(n)/n^{log_b a} = log n < n^ε for any constant ε > 0.
=> The recurrence falls into the gap between case (2) and case (3).
Recurrences: Linear Homogeneous Recurrences (constant coefficients)
f(n) = a1*f(n-1) + a2*f(n-2) + ... + ak*f(n-k)
Solution:
- plug in f(n) = v^n
- divide by v^{n-k}
- solve the resulting characteristic polynomial for v, obtaining k roots.
f(n) = c1*v1^n + c2*v2^n + ... + ck*vk^n,
where v1, v2, ..., vk are the roots of the polynomial (the ci are determined by the initial conditions).
E.g. Fibonacci numbers: F(n) = F(n-1) + F(n-2). Let v^n = F(n):
v^n - v^{n-1} - v^{n-2} = 0   | divide by v^{n-2}
v^2 - v - 1 = 0
v = (1 ± sqrt(5))/2, i.e. φ = (1 + sqrt(5))/2 (the golden ratio) and φ' = (1 - sqrt(5))/2.
Recurrences
=> F(n) = c1*φ^n + c2*φ'^n.
Initial conditions: F(0) = 0, F(1) = 1:
c1*φ^0 + c2*φ'^0 = 0  =>  c1 + c2 = 0
c1*φ^1 + c2*φ'^1 = 1  =>  φ*c1 + φ'*c2 = 1
=> (φ - φ')*c1 = 1  =>  c1 = 1/(φ - φ') = 1/sqrt(5),  c2 = -1/sqrt(5).
Thus F(n) = (φ^n - φ'^n)/sqrt(5).
Check the basis: F(0) = (φ^0 - φ'^0)/sqrt(5) = 0 and F(1) = (φ^1 - φ'^1)/sqrt(5) = 1, as required.
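The closed form (Binet's formula) can be verified against the recurrence directly; rounding absorbs the floating-point error for moderate n:

```python
from math import sqrt

phi = (1 + sqrt(5)) / 2       # golden ratio
psi = (1 - sqrt(5)) / 2       # conjugate root

def fib_closed(n):
    """F(n) = (phi^n - psi^n)/sqrt(5), rounded to the nearest integer."""
    return round((phi ** n - psi ** n) / sqrt(5))

def fib_rec(n):
    """Iterative evaluation of F(n) = F(n-1) + F(n-2), F(0) = 0, F(1) = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

assert all(fib_closed(n) == fib_rec(n) for n in range(40))
```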
Recurrences
By induction, assuming the formula holds for F(i-1) and F(i-2):
F(i) = F(i-1) + F(i-2)
     = (1/sqrt(5))(φ^{i-1} - φ'^{i-1}) + (1/sqrt(5))(φ^{i-2} - φ'^{i-2})
     = (1/sqrt(5))(φ^{i-2}(φ + 1) - φ'^{i-2}(φ' + 1)).
Now φ + 1 = (3 + sqrt(5))/2 = ((1 + sqrt(5))/2)^2 = φ^2, and φ' + 1 = (3 - sqrt(5))/2 = ((1 - sqrt(5))/2)^2 = φ'^2, so
F(i) = (1/sqrt(5))(φ^{i-2}*φ^2 - φ'^{i-2}*φ'^2) = (1/sqrt(5))(φ^i - φ'^i).
Note: if a root is repeated, put a factor of n in front of the repeated root's term.
E.g. f(n) = 6f(n-1) - 9f(n-2):
v^n - 6v^{n-1} + 9v^{n-2} = 0  =>  v_{1,2} = 3, 3
f(n) = c1*3^n + c2*n*3^n.
Recurrences
Non-homogeneous linear recurrences with constant coefficients:
f(n) = a1*f(n-1) + a2*f(n-2) + ... + ak*f(n-k) + b(n),
where b(n) is a polynomial in n of degree m multiplied by α^n (α a constant), e.g.
b(n) = (b_m*n^m + b_{m-1}*n^{m-1} + ... + b_1*n + b_0) * α^n.
Solution: f = f_h + f_p, where
- f_h is the solution to the homogeneous equation (f(n) without b(n)), and
- f_p is a particular solution, found by making an educated guess at its form and plugging it into the recurrence.
E.g. recursive insertion sort (exercise): f(n) = f(n-1) + n.
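For the exercise, b(n) = n is a degree-1 polynomial with α = 1, so a natural guess for the particular solution of f(n) = f(n-1) + n is a quadratic; with the assumed initial condition f(1) = 1, the solution is f(n) = n(n+1)/2, which a direct evaluation confirms:

```python
def f(n):
    """f(n) = f(n-1) + n with assumed initial condition f(1) = 1."""
    return 1 if n == 1 else f(n - 1) + n

# guessed particular solution: f(n) = n(n+1)/2, i.e. Theta(n^2)
assert all(f(n) == n * (n + 1) // 2 for n in range(1, 200))
```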
Recurrences
Non-linear recurrences: f(n) = c*f(n/m) + b(n).
Let n = m^k, so f(m^k) = c*f(m^{k-1}) + b(m^k); setting V(k) = f(m^k) gives a linear recurrence V(k) = c*V(k-1) + b(m^k), whose solution has the form V(k) = V_h + V_p.
E.g. merge sort (exercise): f(n) = 2f(n/2) + n.
With n = 2^k: f(2^k) = 2f(2^{k-1}) + 2^k, i.e. V(k) = 2V(k-1) + 2^k.
Homogeneous part: V_h(k) = c1*2^k.
Particular part: try V_p(k) = A*k*2^k:
A*k*2^k = 2A(k-1)*2^{k-1} + 2^k = A*k*2^k - A*2^k + 2^k  =>  A = 1.
So V(k) = c1*2^k + k*2^k, and f(n) = c1*n + n log n (since n = 2^k, k = log n).