
A Basic Study on the Algorithm Analysis, Chapter 2: Getting Started. 한양대학교 (Hanyang University) Information Security and Algorithm Lab. January 2, 2008. Presenter: 이재준. Advisor: Professor 박희진.





2 Table of Contents
1. Algorithmic Paradigms
2. Analysis of Computer Algorithms
3. Analyzing the Insertion Sort Algorithm
4. Analyzing the Merge Sort Algorithm
5. Comparing Insertion Sort and Merge Sort
6. Next Steps for Algorithm Analysis
7. References
8. Question & Answer

3 1. Algorithmic Paradigms
Design and analysis of computer algorithms; critical thinking and problem solving.
1) Greedy
2) Divide and conquer
3) Dynamic programming
4) Network flow
5) Randomized algorithms
6) Intractability
7) Coping with intractability

4 1. Algorithmic Paradigms
Where the topics of Chapter 2 are developed further:
- Chapter 3: Theta (θ) notation; formally interpreting equations containing θ-notation
- Chapter 4: How to solve recurrence relations; the master theorem
- Chapter 5: Probabilistic analysis and randomized algorithms

5 2. Analysis of Computer Algorithms
Loop Invariant
- Definition: a statement that remains true each time the program enters, executes, and exits the loop.
- Why it matters: loop invariants help us analyze programs, check for errors, and derive programs from specifications.
Asymptotic Complexity
- Big-Oh (O), Omega (Ω), Theta (θ)

6 2. Analysis of Computer Algorithms
Using a loop invariant to prove correctness:
- Initialization: it is true prior to the first iteration of the loop.
- Maintenance: if it is true before an iteration of the loop, it remains true before the next iteration.
- Termination: when the loop terminates, the invariant gives us a useful property that helps show that the algorithm is correct.
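The three steps above can be checked mechanically at runtime. Below is a minimal Python sketch (the summation loop and its invariant are an illustration chosen here, not an example from the slides): the assertion before each iteration covers initialization and maintenance, and the final assertion is the termination property.

```python
def array_sum(a):
    """Sum a list while checking the loop invariant at runtime.

    Invariant: before iteration i, total == sum(a[:i]).
    """
    total = 0
    for i in range(len(a)):
        # Holds before the first iteration (initialization) and is
        # re-established by each iteration (maintenance).
        assert total == sum(a[:i])
        total += a[i]
    # Termination: the loop has consumed every element, so total == sum(a).
    assert total == sum(a)
    return total
```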

7 2. Analysis of Computer Algorithms
Asymptotic complexity (Theta, θ):
- f(n) = θ(g(n)) if there exist positive constants c1, c2, and n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
- Equivalently, if g(n) is both an upper bound (Big-Oh) and a lower bound (Omega) on f(n), then f(n) = θ(g(n)).
- The θ-notation is therefore more precise than either Big-Oh or Omega alone.
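The definition can be sanity-checked numerically. In this sketch, f(n) = n(n-1)/2 (the worst-case comparison count of insertion sort, used here as an illustrative choice) and g(n) = n²; the constants c1 = 1/4, c2 = 1/2, n0 = 2 were picked by inspection, not derived formally.

```python
def f(n):
    """Illustrative f(n): n(n-1)/2, e.g. insertion sort's worst-case comparisons."""
    return n * (n - 1) // 2

def g(n):
    """Candidate growth function g(n) = n^2."""
    return n * n

# Candidate witnesses for f(n) = theta(g(n)); chosen by inspection.
c1, c2, n0 = 0.25, 0.5, 2

# Check c1*g(n) <= f(n) <= c2*g(n) for a large range of n >= n0.
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10000))
```

A finite check like this cannot prove the bound for all n, but it catches wrongly chosen constants quickly.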

8 3. Analyzing the Insertion Sort Algorithm
[Figure: trace of insertion sort for j = 2, 3, 4, 5, 6, …, n + 1]

9 3. Analyzing the Insertion Sort Algorithm
Loop invariant for insertion sort:
- Initialization: when j = 2, the subarray A[1..j-1] = A[1] is sorted.
- Maintenance: the body of the outer for loop moves A[j] into place, so A[1..j-1] remains sorted.
- Termination: when j = n + 1 the for loop ends, and A[1..n] is sorted.
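The invariant can be asserted directly inside an implementation. This is a minimal 0-indexed Python sketch of insertion sort (the slides use CLRS's 1-indexed pseudocode); the assertion at the top of the outer loop is exactly the invariant above.

```python
def insertion_sort(a):
    """Insertion sort (0-indexed), asserting the loop invariant each iteration."""
    for j in range(1, len(a)):
        # Invariant: a[0..j-1] is sorted. At j = 1 the one-element
        # prefix is trivially sorted (initialization).
        assert all(a[i] <= a[i + 1] for i in range(j - 1))
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]     # shift larger elements right
            i -= 1
        a[i + 1] = key          # insert key into its place (maintenance)
    # Termination: j has reached len(a), so the whole array is sorted.
    return a
```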

10 3. Analyzing the Insertion Sort Algorithm
Asymptotic complexity of insertion sort:
- Best case: the array is already sorted (t_j = 1; T(n) is a linear function of n).
- Worst case: the array is in reverse sorted order (t_j = j; T(n) is a quadratic function of n).
- Average case: the running time is approximately half the worst-case running time (t_j ≈ j/2); it is still a quadratic function of n.
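The t_j counts can be measured rather than assumed. This Python sketch counts how often the while-loop test runs (the quantity t_j of the analysis; the 0-indexed j here corresponds to the slide's 1-indexed j, so the worst case gives j + 1 tests per pass):

```python
def count_tests(a):
    """Count the while-loop tests executed by insertion sort on a copy of a."""
    a = list(a)
    tests = 0
    for j in range(1, len(a)):
        key, i = a[j], j - 1
        while True:
            tests += 1                  # one evaluation of the while condition
            if i >= 0 and a[i] > key:
                a[i + 1] = a[i]
                i -= 1
            else:
                break
        a[i + 1] = key
    return tests

n = 100
best = count_tests(range(n))            # already sorted: one test per pass
worst = count_tests(range(n, 0, -1))    # reverse sorted: j + 1 tests per pass

assert best == n - 1                    # linear in n
assert worst == (n - 1) * (n + 2) // 2  # quadratic in n
```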

11 3. Analyzing the Insertion Sort Algorithm
Asymptotic complexity of insertion sort: we can express T(n) as a·n² + b·n + c for constants a, b, c (which again depend on the statement costs), so T(n) is a quadratic function of n.

12 4. Analyzing the Merge Sort Algorithm
Merge sort is a divide-and-conquer algorithm:
- Divide the problem into a number of subproblems.
- Conquer the subproblems by solving them recursively. Base case: if the subproblems are small enough, just solve them by brute force.
- Combine the subproblem solutions to give a solution to the original problem.
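The three steps map directly onto code. This is a minimal Python sketch of merge sort on list slices, not the in-place MERGE-SORT(A, p, r) pseudocode the slides analyze, but the divide/conquer/combine structure is the same.

```python
def merge(left, right):
    """Combine step: merge two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]   # append the leftover tail

def merge_sort(a):
    """Divide-and-conquer merge sort on a list."""
    if len(a) <= 1:                     # base case: already sorted
        return list(a)
    mid = len(a) // 2                   # divide
    left = merge_sort(a[:mid])          # conquer
    right = merge_sort(a[mid:])         # conquer
    return merge(left, right)           # combine
```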

13 4. Analyzing the Merge Sort Algorithm
[Figure: the divide, conquer, and combine steps illustrated on an example array]

14 4. Analyzing the Merge Sort Algorithm
[Figure]

15 4. Analyzing the Merge Sort Algorithm
Example: call of Merge(9, 12, 16).
[Figure: the two sorted subarrays before merging]

16 4. Analyzing the Merge Sort Algorithm
Example: call of Merge(9, 12, 16).
[Figure: the merge completed]

17 4. Analyzing the Merge Sort Algorithm
Loop invariant for merging:
- Initialization: when k = p, A[p..k-1] is empty; nothing has been copied back to A yet.
- Maintenance: if L[i] ≤ R[j], then L[i] is copied into A[k]; otherwise R[j] is copied into A[k].
- Termination: when k = r + 1 the loop ends, and A[p..r] is sorted.
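A sketch of the CLRS MERGE procedure with infinity sentinels, translated to 0-indexed Python, with the invariant asserted at the top of the loop (checking only that the copied-back prefix A[p..k-1] is sorted; the full invariant also says it holds the k-p smallest elements):

```python
import math

def merge(A, p, q, r):
    """Merge sorted A[p..q] and A[q+1..r] in place, CLRS-style with sentinels."""
    L = A[p:q + 1] + [math.inf]      # left run plus infinity sentinel
    R = A[q + 1:r + 1] + [math.inf]  # right run plus infinity sentinel
    i = j = 0
    for k in range(p, r + 1):
        # Invariant (partial check): A[p..k-1] is sorted.
        assert all(A[t] <= A[t + 1] for t in range(p, k - 1))
        if L[i] <= R[j]:
            A[k] = L[i]; i += 1
        else:
            A[k] = R[j]; j += 1
    # Termination: k has passed r, so A[p..r] is fully merged and sorted.
```

The sentinels mean neither run can be exhausted before the loop ends, so no "which run is empty" bookkeeping is needed.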

18 4. Analyzing the Merge Sort Algorithm
Asymptotic complexity of merging: θ(n1 + n2) = θ(n).

19 4. Analyzing the Merge Sort Algorithm
If we assume that n is a power of 2, each divide step yields two subproblems, both of size exactly n/2. The base case occurs when n = 1. When n ≥ 2, the merge sort steps take:
- Divide: just compute q as the average of p and r, so D(n) = θ(1).
- Conquer: recursively solve 2 subproblems, each of size n/2, contributing 2T(n/2).
- Combine: MERGE on an n-element subarray takes θ(n) time, so C(n) = θ(n).

20 4. Analyzing the Merge Sort Algorithm
Rewrite the recurrence using a constant c that describes both the running time of the base case and the time per array element of the divide and combine steps: T(n) = c if n = 1, and T(n) = 2T(n/2) + cn if n > 1. Then draw a recursion tree.

21 4. Analyzing the Merge Sort Algorithm
Continue expanding until the problem sizes get down to 1. The tree has log n + 1 levels, each costing cn, for a total of cn(log n + 1) ⇒ θ(n log n).

22 4. Analyzing the Merge Sort Algorithm
After i expansions, T(n) = 2^i T(n/2^i) + i·cn. The expansion stops when n/2^i = 1, i.e. i = lg n, giving 2^(lg n) T(1) + cn·lg n = cn + cn·lg n. Ignoring the low-order term and the constant coefficient c ⇒ θ(n lg n).
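The closed form can be checked against the recurrence itself. A small Python sketch, assuming unit cost c = 1 and n a power of 2:

```python
from math import log2

c = 1  # assumed unit cost for the base case and per array element

def T(n):
    """Evaluate T(n) = 2*T(n/2) + c*n with T(1) = c, for n a power of 2."""
    if n == 1:
        return c
    return 2 * T(n // 2) + c * n

# Closed form from the expansion: T(n) = c*n + c*n*lg(n).
for n in (1, 2, 8, 1024):
    assert T(n) == c * n + c * n * int(log2(n))
```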

23 5. Comparing Insertion Sort and Merge Sort
Compared to insertion sort, merge sort is faster. On small inputs, insertion sort may be faster; but for large enough inputs, merge sort will always be faster.
[Figure: running-time curves for insertion sort and merge sort]
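The claim can be illustrated by counting element comparisons instead of wall-clock time (a Python sketch; the counting experiment is an illustration added here, not one from the slides). On a large random input the θ(n lg n) count is far below the θ(n²) count.

```python
import random

def insertion_comparisons(a):
    """Count element comparisons made by insertion sort on a copy of a."""
    a = list(a)
    cmp = 0
    for j in range(1, len(a)):
        key, i = a[j], j - 1
        while i >= 0:
            cmp += 1
            if a[i] <= key:
                break
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return cmp

def merge_comparisons(a):
    """Return (sorted list, element comparisons) for merge sort."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, cl = merge_comparisons(a[:mid])
    right, cr = merge_comparisons(a[mid:])
    out, i, j, c = [], 0, 0, 0
    while i < len(left) and j < len(right):
        c += 1
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:], cl + cr + c

random.seed(0)
data = [random.random() for _ in range(2000)]
ins = insertion_comparisons(data)
srt, mrg = merge_comparisons(data)

assert srt == sorted(data)
assert mrg < ins   # merge sort wins decisively at this input size
```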

24 6. Next Steps for Algorithm Analysis
To reinforce this study, the following chapters come next:
- Chapter 3: Theta (θ) notation; formally interpreting equations containing θ-notation
- Chapter 4: How to solve recurrence relations; the master theorem
- Chapter 5: Probabilistic analysis and randomized algorithms

25 7. References
▣ Introduction to Algorithms. Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, Clifford Stein. The MIT Press. pp. 18-39.
▣ Fundamentals of Data Structures in C. Horowitz, Sahni, Anderson-Freed. Computer Science Press. pp. 31-49.
▣ Wikipedia, the free encyclopedia. http://www.wikipedia.org (definitions of keywords).
▣ Algorithm Design. Jon Kleinberg & Eva Tardos. Pearson International Edition, Addison Wesley.

26 8. Q & A
Thank you, and Happy New Year!




