Algorithmics - Lecture 7 (slide 1)
LECTURE 7: Algorithms design techniques - Decrease and conquer -

Algorithmics - Lecture 7 (slide 2)
Outline
– What is an algorithm design technique?
– Brute force technique
– Decrease-and-conquer technique
– Recursive algorithms and their analysis
– Applications of decrease-and-conquer

Algorithmics - Lecture 7 (slide 3)
What is an algorithm design technique?
… it is a general approach to solving problems algorithmically
… it can be applied to a variety of problems from different areas of computing

Algorithmics - Lecture 7 (slide 4)
Why do we need to know such techniques?
… they provide us with guidance in designing algorithms for new problems
… they represent a collection of tools useful for applications

Algorithmics - Lecture 7 (slide 5)
Which are the most frequently used techniques?
– Brute force
– Decrease and conquer
– Divide and conquer
– Greedy technique
– Dynamic programming
– Backtracking

Algorithmics - Lecture 7 (slide 6)
Brute force
… it is a straightforward approach to solving a problem, usually based directly on the problem's statement
… it is the easiest (and most intuitive) way of solving a problem
… algorithms designed by brute force are not always efficient

Algorithmics - Lecture 7 (slide 7)
Brute force
Example: compute x^n, where x is a real number and n a natural number
Idea: x^n = x*x*…*x (n times)

Power(x,n)
  p ← 1
  FOR i ← 1,n DO
    p ← p*x
  ENDFOR
  RETURN p

Efficiency class: Θ(n)
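For reference, a minimal runnable Python counterpart of the Power pseudocode above (the name brute_force_power is my own choice, not from the slides):

    def brute_force_power(x: float, n: int) -> float:
        # Brute force: multiply x into the result n times -> Theta(n) multiplications.
        p = 1.0
        for _ in range(n):
            p = p * x
        return p

    print(brute_force_power(2.0, 10))  # 1024.0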

Algorithmics - Lecture 7 (slide 8)
Brute force
Example: compute n!, n a natural number (n>=1)
Idea: n! = 1*2*…*n

Factorial(n)
  f ← 1
  FOR i ← 1,n DO
    f ← f*i
  ENDFOR
  RETURN f

Efficiency class: Θ(n)
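A similar hedged Python sketch of the Factorial pseudocode (again, the function name is illustrative only):

    def brute_force_factorial(n: int) -> int:
        # Brute force: Theta(n) multiplications for n >= 1.
        f = 1
        for i in range(1, n + 1):
            f = f * i
        return f

    print(brute_force_factorial(5))  # 120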

Algorithmics - Lecture 7 (slide 9)
Decrease and conquer
Basic idea: exploit the relationship between the solution of a given instance of a problem and the solution of a smaller instance of the same problem. By successively reducing the problem's size we eventually arrive at a particular case which can be solved directly.
Motivation:
– such an approach can lead to an algorithm which is more efficient than a brute force algorithm
– sometimes it is easier to describe the solution of a problem by referring to the solution of a smaller problem than to describe the solution explicitly

Algorithmics - Lecture 7 (slide 10)
Decrease and conquer
Example: let us consider the problem of computing x^n for n = 2^m, m>=1.
Since
  x^(2^m) = x*x                          if m=1
  x^(2^m) = x^(2^(m-1)) * x^(2^(m-1))    if m>1
it follows that we can compute x^(2^m) by computing:
  m=1 => p := x*x = x^2
  m=2 => p := p*p = x^2 * x^2 = x^4
  m=3 => p := p*p = x^4 * x^4 = x^8
  …

Algorithmics - Lecture 7 (slide 11)
Decrease and conquer

Power2(x,m)
  p ← x*x
  FOR i ← 1,m-1 DO
    p ← p*p
  ENDFOR
  RETURN p

Analysis:
a) Correctness. Loop invariant: p = x^(2^i)
b) Efficiency
   (i) problem size: m
   (ii) dominant operation: * (multiplication)
   T(m) = m
Remark: m = lg n
Bottom-up approach (start with the smallest instance of the problem)
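A Python sketch of the bottom-up Power2 algorithm; the assert only illustrates the loop invariant p = x^(2^i) and assumes exact arithmetic (e.g. a small integer base), so it is not part of the algorithm itself:

    def power2(x: int, m: int) -> int:
        # Computes x**(2**m), m >= 1, using exactly m multiplications.
        p = x * x                       # p = x^(2^1)
        for i in range(2, m + 1):
            p = p * p                   # loop invariant: p = x^(2^i)
            assert p == x ** (2 ** i)   # invariant check (exact for integer x)
        return p

    print(power2(2, 3))  # 2^(2^3) = 2^8 = 256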

Algorithmics - Lecture 7 (slide 12)
Decrease and conquer

x^(2^m) = x*x                          if m=1
x^(2^m) = x^(2^(m-1)) * x^(2^(m-1))    if m>1

power3(x,m)
  IF m=1 THEN RETURN x*x
  ELSE p ← power3(x,m-1)
       RETURN p*p
  ENDIF

x^n = x*x                 if n=2
x^n = x^(n/2) * x^(n/2)   if n>2      (decrease by a constant factor)

power4(x,n)
  IF n=2 THEN RETURN x*x
  ELSE p ← power4(x, n DIV 2)
       RETURN p*p
  ENDIF

Algorithmics - Lecture 7 (slide 13)
Decrease and conquer

power3(x,m)
  IF m=1 THEN RETURN x*x
  ELSE p ← power3(x,m-1)
       RETURN p*p
  ENDIF

power4(x,n)
  IF n=2 THEN RETURN x*x
  ELSE p ← power4(x, n DIV 2)
       RETURN p*p
  ENDIF

Remarks:
1. Top-down approach (start with the largest instance of the problem)
2. Both algorithms are recursive algorithms
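The two recursive (top-down) variants, transcribed as a minimal Python sketch:

    def power3(x: float, m: int) -> float:
        # x**(2**m): m decreases by a constant (1) at each call.
        if m == 1:
            return x * x
        p = power3(x, m - 1)
        return p * p

    def power4(x: float, n: int) -> float:
        # x**n for n a power of two, n >= 2: n decreases by a constant factor (2).
        if n == 2:
            return x * x
        p = power4(x, n // 2)   # n DIV 2
        return p * p

    print(power3(2.0, 3), power4(2.0, 8))  # 256.0 256.0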

Algorithmics - Lecture 7 (slide 14)
Decrease and conquer
This idea can be extended to the case of an arbitrary value of n:

x^n = x*x                              if n=2
x^n = x^(n/2) * x^(n/2)                if n>2, n is even
x^n = x^((n-1)/2) * x^((n-1)/2) * x    if n>2, n is odd

power5(x,n)
  IF n=1 THEN RETURN x
  ELSE IF n=2 THEN RETURN x*x
       ELSE p ← power5(x, n DIV 2)
            IF n MOD 2=0 THEN RETURN p*p
            ELSE RETURN p*p*x
            ENDIF
       ENDIF
  ENDIF
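A Python transcription of power5 for an arbitrary natural exponent n >= 1 (a sketch mirroring the pseudocode, not Python's built-in pow):

    def power5(x: float, n: int) -> float:
        # O(log n) multiplications for any natural n >= 1.
        if n == 1:
            return x
        if n == 2:
            return x * x
        p = power5(x, n // 2)           # n DIV 2 works for both even and odd n
        return p * p if n % 2 == 0 else p * p * x

    print(power5(3.0, 5))  # 243.0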

Algorithmics - Lecture 7 (slide 15)
Outline
– What is an algorithm design technique?
– Brute force technique
– Decrease-and-conquer technique
– Recursive algorithms and their analysis
– Applications of decrease-and-conquer

Algorithmics - Lecture 7 (slide 16)
Recursive algorithms
Definitions:
– Recursive algorithm = an algorithm which contains at least one recursive call
– Recursive call = a call of the same algorithm, either directly (algorithm A calls itself) or indirectly (algorithm A calls algorithm B, which calls algorithm A)
Remarks:
– The cascade of recursive calls is similar to an iterative process
– Each recursive algorithm must contain a particular case for which it returns the result without calling itself again
– Recursive algorithms are easy to implement, but their implementation is not always efficient (due to the supplementary space on the program stack needed to deal with the recursive calls)

Algorithmics - Lecture 7 (slide 17)
Recursive calls - example

fact(n)
  IF n<=1 THEN rez ← 1
  ELSE rez ← fact(n-1)*n
  ENDIF
  RETURN rez

Sequence of recursive calls (descending), for fact(4):
  fact(4) -> 4*fact(3)      stack = [4]
  fact(3) -> 3*fact(2)      stack = [3,4]
  fact(2) -> 2*fact(1)      stack = [2,3,4]
  fact(1) -> 1              stack = [1,2,3,4]
Back to the calling function (ascending):
  fact(2) = 2*1 = 2         stack = [3,4]
  fact(3) = 3*2 = 6         stack = [4]
  fact(4) = 4*6 = 24        stack = []
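A small Python sketch that makes the call/return sequence and the stack depth visible (the depth parameter is an illustrative addition, not part of the original algorithm):

    def fact(n: int, depth: int = 0) -> int:
        # depth mirrors how many frames of fact are currently on the call stack.
        print("  " * depth + f"call fact({n})")
        if n <= 1:
            rez = 1
        else:
            rez = fact(n - 1, depth + 1) * n
        print("  " * depth + f"fact({n}) returns {rez}")
        return rez

    fact(4)  # shows the descending calls and the ascending returns; final result is 24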

Algorithmics - Lecture 7 (slide 18)
Recursive algorithms - correctness
Correctness analysis. To prove that a recursive algorithm is correct it suffices to show that:
– the recurrence relation which describes the relationship between the solution of the problem and the solution for other instances of the problem is correct (from a mathematical point of view)
– the algorithm implements the recurrence relation in a correct way
Since recursive algorithms describe an iterative process, correctness can be proved by identifying an assertion (similar to a loop invariant) with the following properties:
– it is true for the particular case
– it remains true after the recursive call
– for the actual values of the algorithm parameters it implies the postcondition

Algorithmics - Lecture 7 (slide 19)
Recursive algorithms - correctness
Example. P: a, b naturals, a<>0; Q: returns gcd(a,b)

Recurrence relation:
  gcd(a,b) = a                  if b=0
  gcd(a,b) = gcd(b, a MOD b)    if b<>0

gcd(a,b)
  IF b=0 THEN rez ← a
  ELSE rez ← gcd(b, a MOD b)
  ENDIF
  RETURN rez

Invariant property: rez = gcd(a,b)
Particular case: b=0 => rez = a = gcd(a,b)
After the recursive call: since for b<>0 gcd(a,b) = gcd(b, a MOD b), it follows that rez = gcd(a,b)
Postcondition: rez = gcd(a,b) => Q
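A direct Python transcription of the gcd algorithm above (a sketch; the standard library's math.gcd provides the same result):

    def gcd(a: int, b: int) -> int:
        # Precondition: a, b naturals, a != 0. Postcondition: returns gcd(a, b).
        if b == 0:
            return a
        return gcd(b, a % b)    # a MOD b

    print(gcd(48, 18))  # 6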

Algorithmics - Lecture 7 (slide 20)
Recursive algorithms - efficiency
Steps of the efficiency analysis:
– establish the problem size
– choose the dominant operation
– check if the running time depends only on the problem size or also on the properties of the input data (if so, the best case and the worst case should be analyzed)
– (particular to recursive algorithms) set up a recurrence relation which describes the relation between the running time for the problem and the running time for a smaller instance of the problem; establish the initial value (based on the particular case)
– solve the recurrence relation

Algorithmics - Lecture 7 (slide 21)
Recursive algorithms - efficiency
Remark: algorithm design goes from a recurrence relation to a recursive algorithm, while efficiency analysis goes from the recursive algorithm back to a recurrence relation.

Algorithmics - Lecture 7 (slide 22)
Recursive algorithms - efficiency

rec_alg(n)
  IF n=n0 THEN <particular-case processing>
  ELSE rec_alg(h(n))
  ENDIF

Assumptions:
– the particular-case processing is a step of constant cost (c0)
– h is a decreasing function and there exists k such that h^(k)(n) = h(h(…(h(n))…)) = n0
– the cost of computing h(n) is c

The recurrence relation for the running time is:
  T(n) = c0            if n=n0
  T(n) = T(h(n)) + c   if n>n0

Algorithmics - Lecture 7 (slide 23)
Recursive algorithms - efficiency
Computing n!, n>=1

Recurrence relation:
  n! = 1           if n=1
  n! = (n-1)!*n    if n>1

Algorithm:
fact(n)
  IF n<=1 THEN RETURN 1
  ELSE RETURN fact(n-1)*n
  ENDIF

Problem size: n
Dominant operation: multiplication
Recurrence relation for the running time:
  T(n) = 0             if n=1
  T(n) = T(n-1) + 1    if n>1

Algorithmics - Lecture 7 (slide 24)
Recursive algorithms - efficiency
Methods to solve the recurrence relations:
Forward substitution
– start from the particular case and construct terms of the sequence
– identify a pattern in the sequence and infer the formula of the general term
– prove by direct computation or by mathematical induction that the inferred formula satisfies the recurrence relation
Backward substitution
– start from the general case T(n) and replace T(h(n)) with the right-hand side of the corresponding relation, then replace T(h(h(n))) and so on, until we arrive at the particular case
– compute the expression of T(n)

Algorithmics - Lecture 7 (slide 25)
Recursive algorithms - efficiency
Example: n!
  T(n) = 0             if n=1
  T(n) = T(n-1) + 1    if n>1

Forward substitution:
  T(1)=0, T(2)=1, T(3)=2, …
  Inferred formula: T(n) = n-1 (it satisfies the recurrence relation)

Backward substitution:
  T(n)   = T(n-1) + 1
  T(n-1) = T(n-2) + 1
  …
  T(2)   = T(1) + 1
  T(1)   = 0
  (by adding) T(n) = n-1

Remark: the same efficiency as that of the brute force algorithm!
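The closed form T(n) = n-1 can also be checked empirically; the sketch below counts multiplications alongside the recursive factorial (the instrumentation is my own addition, not from the slides):

    def fact_count(n: int):
        # Returns (n!, number of multiplications performed).
        if n <= 1:
            return 1, 0
        value, mults = fact_count(n - 1)
        return value * n, mults + 1

    for n in (1, 2, 5, 10):
        _, mults = fact_count(n)
        print(n, mults)   # mults equals n - 1 in every case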

Algorithmics - Lecture 7 (slide 26)
Recursive algorithms - efficiency
Example: x^n, n=2^m
  T(n) = 1             if n=2
  T(n) = T(n/2) + 1    if n>2

Backward substitution:
  T(2^m)     = T(2^(m-1)) + 1
  T(2^(m-1)) = T(2^(m-2)) + 1
  …
  T(2)       = 1
  (by adding) T(n) = m = lg n

power4(x,n)
  IF n=2 THEN RETURN x*x
  ELSE p ← power4(x, n DIV 2)
       RETURN p*p
  ENDIF

Algorithmics - Lecture 7 (slide 27)
Recursive algorithms - efficiency
Remark: for this example the decrease and conquer algorithm is more efficient than the brute force algorithm.
Explanation: x^(n/2) is computed only once. If it were computed twice, then … it would no longer be decrease and conquer:

pow(x,n)
  IF n=2 THEN RETURN x*x
  ELSE RETURN pow(x,n/2)*pow(x,n/2)
  ENDIF

  T(n) = 1              if n=2
  T(n) = 2T(n/2) + 1    if n>2

  T(2^m)     = 2T(2^(m-1)) + 1
  T(2^(m-1)) = 2T(2^(m-2)) + 1   |*2
  T(2^(m-2)) = 2T(2^(m-3)) + 1   |*2^2
  …
  T(2)       = 1                 |*2^(m-1)
  (by adding) T(n) = 1 + 2 + … + 2^(m-1) = 2^m - 1 = n-1
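The two recurrences can be compared numerically as well; this sketch evaluates only the multiplication counts given by the recurrences (it does not compute any power):

    def mults_single(n: int) -> int:
        # T(n) = T(n/2) + 1, T(2) = 1  ->  lg n
        return 1 if n == 2 else mults_single(n // 2) + 1

    def mults_double(n: int) -> int:
        # T(n) = 2T(n/2) + 1, T(2) = 1  ->  n - 1
        return 1 if n == 2 else 2 * mults_double(n // 2) + 1

    for n in (2, 8, 64, 1024):
        print(n, mults_single(n), mults_double(n))  # e.g. 1024 -> 10 vs 1023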

Algorithmics - Lecture 7 (slide 28)
Outline
– What is an algorithm design technique?
– Brute force
– Decrease-and-conquer technique
– Recursive algorithms and their analysis
– Applications of decrease-and-conquer

Algorithmics - Lecture 7 (slide 29)
Applications of decrease-and-conquer
Example 1: generating all n! permutations of {1,2,…,n}
Idea: the k! permutations of {1,2,…,k} can be obtained from the (k-1)! permutations of {1,2,…,k-1} by placing the k-th element successively on the first position, the second position, the third position, …, the k-th position.
Placing the k-th element on position i is realized by swapping the elements on positions i and k.

Algorithmics - Lecture 7 (slide 30)
Generating permutations
Illustration for n=3 (top-down approach): [tree of recursive calls; the root corresponds to k=3, the next level to k=2, and the leaves to k=1, where a permutation is written; each node performs the swaps x[i]↔x[k], i=1..k, around its recursive calls; downward edges denote recursive calls, upward edges denote returns]

Algorithmics - Lecture 7 (slide 31)
Generating permutations

perm(k)
  IF k=1 THEN WRITE x[1..n]
  ELSE FOR i ← 1,k DO
         x[i] ↔ x[k]
         perm(k-1)
         x[i] ↔ x[k]
       ENDFOR
  ENDIF

Let x[1..n] be a global array (accessed by all functions) containing at the beginning the values (1,2,…,n).
The algorithm has a formal parameter, k. It is called for k=n.
The particular case corresponds to k=1, when a permutation is obtained and printed.
Remark: the algorithm contains k recursive calls.

Efficiency analysis:
  Problem size: k
  Dominant operation: swap
  Recurrence relation:
    T(k) = 0               if k=1
    T(k) = k(T(k-1)+2)     if k>1
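A Python sketch of perm, with the array passed explicitly instead of being global and 0-based indexing (so x[k-1] plays the role of x[k] in the pseudocode):

    def perm(x: list, k: int) -> None:
        # Prints all permutations of x[0..k-1]; call initially with k = len(x).
        if k == 1:
            print(x)
            return
        for i in range(k):
            x[i], x[k - 1] = x[k - 1], x[i]   # x[i] <-> x[k]
            perm(x, k - 1)
            x[i], x[k - 1] = x[k - 1], x[i]   # undo the swap

    perm([1, 2, 3], 3)   # prints the 3! = 6 permutations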

Algorithmics - Lecture 7 (slide 32)
Generating permutations

  T(k) = 0               if k=1
  T(k) = k(T(k-1)+2)     if k>1

Backward substitution:
  T(k)   = k(T(k-1)+2)
  T(k-1) = (k-1)(T(k-2)+2)   |*k
  T(k-2) = (k-2)(T(k-3)+2)   |*k*(k-1)
  …
  T(2)   = 2(T(1)+2)         |*k*(k-1)*…*3
  T(1)   = 0                 |*k*(k-1)*…*3*2
  (by adding)
  T(k) = 2(k + k(k-1) + k(k-1)(k-2) + … + k!)
       = 2k!(1/(k-1)! + 1/(k-2)! + … + 1/2! + 1)
       -> 2(e-1)k! (for large values of k)
For k=n => T(n) ∈ Θ(n!)

Algorithmics - Lecture 7 (slide 33)
Towers of Hanoi
Hypotheses: let us consider three rods labeled S (source), D (destination) and I (intermediary). Initially rod S holds n disks of different sizes, in decreasing order of size: the largest on the bottom and the smallest on the top.
Goal: move all disks from S to D, using (if necessary) the rod I as an auxiliary.
Restriction: we can move only one disk at a time and it is forbidden to place a larger disk on top of a smaller one.

Algorithmics - Lecture 7 (slides 34-41)
Towers of Hanoi
[Sequence of illustrations for n=3, showing the rods S, I and D after each move]
  Initial configuration: all disks on S
  Move 1: S->D
  Move 2: S->I
  Move 3: D->I
  Move 4: S->D
  Move 5: I->S
  Move 6: I->D
  Move 7: S->D

Algorithmics - Lecture 7 (slide 42)
Towers of Hanoi
Idea:
– move (n-1) disks from rod S to I (using D as auxiliary)
– move the disk left on S directly to D
– move the (n-1) disks from rod I to D (using S as auxiliary)

Algorithm:
hanoi(n,S,D,I)
  IF n=1 THEN "move from S to D"
  ELSE hanoi(n-1,S,I,D)
       "move from S to D"
       hanoi(n-1,I,D,S)
  ENDIF

Significance of parameters:
  first parameter: number of disks to be moved
  second parameter: source rod
  third parameter: destination rod
  fourth parameter: auxiliary rod
Remark: the algorithm contains 2 recursive calls.
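A Python sketch of the algorithm that also counts the moves (the counting and return value are an illustrative addition, not part of the original pseudocode):

    def hanoi(n: int, source: str = "S", dest: str = "D", aux: str = "I") -> int:
        # Prints the moves for n disks and returns their number (2**n - 1).
        if n == 1:
            print(f"move from {source} to {dest}")
            return 1
        moves = hanoi(n - 1, source, aux, dest)        # move n-1 disks S -> I
        print(f"move from {source} to {dest}")         # move the largest disk
        moves += 1 + hanoi(n - 1, aux, dest, source)   # move n-1 disks I -> D
        return moves

    print(hanoi(3))  # prints the 7 moves illustrated above, then 7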

Algorithmics - Lecture 7 (slide 43)
Towers of Hanoi
Illustration for n=3 (the tree of recursive calls and the moves they produce, in order):
  hanoi(3,s,d,i)
    hanoi(2,s,i,d)
      hanoi(1,s,d,i)   s -> d
      move             s -> i
      hanoi(1,d,i,s)   d -> i
    move               s -> d
    hanoi(2,i,d,s)
      hanoi(1,i,s,d)   i -> s
      move             i -> d
      hanoi(1,s,d,i)   s -> d

Algorithmics - Lecture 7 (slide 44)
Towers of Hanoi

hanoi(n,S,D,I)
  IF n=1 THEN "move from S to D"
  ELSE hanoi(n-1,S,I,D)
       "move from S to D"
       hanoi(n-1,I,D,S)
  ENDIF

Problem size: n
Dominant operation: move
Recurrence relation:
  T(n) = 1              if n=1
  T(n) = 2T(n-1) + 1    if n>1

Backward substitution:
  T(n)   = 2T(n-1) + 1
  T(n-1) = 2T(n-2) + 1   |*2
  T(n-2) = 2T(n-3) + 1   |*2^2
  …
  T(2)   = 2T(1) + 1     |*2^(n-2)
  T(1)   = 1             |*2^(n-1)
  (by adding) T(n) = 1 + 2 + … + 2^(n-1) = 2^n - 1
T(n) ∈ Θ(2^n)

Algorithmics - Lecture 7 (slide 45)
Decrease and conquer variants

Decrease by a constant
– Example: n!
  n! = 1           if n=1
  n! = (n-1)!*n    if n>1

Decrease by a constant factor
– Example: x^n (n=2^m)
  x^n = x*x                 if n=2
  x^n = x^(n/2) * x^(n/2)   if n>2

Decrease by a variable
– Example: gcd(a,b)
  gcd(a,b) = a               if a=b
  gcd(a,b) = gcd(b, a-b)     if a>b
  gcd(a,b) = gcd(a, b-a)     if b>a

Decrease by a variable factor
– Example: gcd(a,b)
  gcd(a,b) = a                  if b=0
  gcd(a,b) = gcd(b, a MOD b)    if b<>0
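The last two variants can be contrasted directly; a Python sketch of both gcd recurrences (the function names are illustrative):

    def gcd_subtract(a: int, b: int) -> int:
        # Decrease by a variable: the arguments shrink by a-b or b-a.
        if a == b:
            return a
        return gcd_subtract(b, a - b) if a > b else gcd_subtract(a, b - a)

    def gcd_mod(a: int, b: int) -> int:
        # Decrease by a variable factor: (a, b) -> (b, a MOD b).
        return a if b == 0 else gcd_mod(b, a % b)

    print(gcd_subtract(48, 18), gcd_mod(48, 18))  # both print 6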

Algorithmics - Lecture 7 (slide 46)
Next lecture will be on …
… divide and conquer
… and its analysis