CS 46101 Section 600 CS 56101 Section 002 Dr. Angela Guercio Spring 2010.

 Strassen’s Algorithm for Matrix Multiplication  More on Recurrence Relations

 Is Θ(n^3) the best we can do? Can we multiply matrices in o(n^3)?
 It seems as if any algorithm to multiply matrices must take Ω(n^3) time:
◦ Must compute n^2 entries
◦ Each entry is the sum of n terms
 But with Strassen's method we can multiply in o(n^3).
 Strassen's method runs in Θ(n^(lg 7)) time. Since 2.80 < lg 7 < 2.81, it runs in O(n^2.81) time.

 As with the other divide-and-conquer algorithms, assume that n is a power of 2.
 Partition each of A, B, C into four n/2 × n/2 matrices:

A = | A11  A12 |    B = | B11  B12 |    C = | C11  C12 |
    | A21  A22 |        | B21  B22 |        | C21  C22 |

 Rewrite C = A ⋅ B as

| C11  C12 |   | A11  A12 |   | B11  B12 |
| C21  C22 | = | A21  A22 | ⋅ | B21  B22 |

 giving the four equations:

C11 = A11 ⋅ B11 + A12 ⋅ B21
C12 = A11 ⋅ B12 + A12 ⋅ B22
C21 = A21 ⋅ B11 + A22 ⋅ B21
C22 = A21 ⋅ B12 + A22 ⋅ B22

 Each of these equations multiplies two pairs of n/2 × n/2 matrices and then adds the two n/2 × n/2 products.
 Use these equations to get a divide-and-conquer algorithm.
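The four equations translate directly into code. A minimal sketch in Python, using plain lists and assuming n is a power of 2 (helper names like `split` and `combine` are illustrative, not from the slides):

```python
def mat_add(X, Y):
    """Entrywise sum of two equal-size matrices."""
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def split(M):
    """Partition an n x n matrix (n a power of 2) into four n/2 x n/2 blocks."""
    h = len(M) // 2
    return ([row[:h] for row in M[:h]], [row[h:] for row in M[:h]],
            [row[:h] for row in M[h:]], [row[h:] for row in M[h:]])

def combine(C11, C12, C21, C22):
    """Reassemble four blocks into one matrix."""
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot

def rec_mat_mul(A, B):
    n = len(A)
    if n == 1:                      # base case: one scalar multiplication
        return [[A[0][0] * B[0][0]]]
    A11, A12, A21, A22 = split(A)
    B11, B12, B21, B22 = split(B)
    # 8 recursive multiplications, 4 additions => T(n) = 8T(n/2) + Theta(n^2)
    C11 = mat_add(rec_mat_mul(A11, B11), rec_mat_mul(A12, B21))
    C12 = mat_add(rec_mat_mul(A11, B12), rec_mat_mul(A12, B22))
    C21 = mat_add(rec_mat_mul(A21, B11), rec_mat_mul(A22, B21))
    C22 = mat_add(rec_mat_mul(A21, B12), rec_mat_mul(A22, B22))
    return combine(C11, C12, C21, C22)
```

Note the 8 recursive calls, one per product on the right-hand sides of the four equations; this count is exactly what the analysis below charges as 8T(n/2).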

 Analysis
◦ Let T(n) be the time to multiply two n × n matrices.
◦ Base case: n = 1. Perform one scalar multiplication: Θ(1).
◦ Recursive case: n > 1.
 Dividing takes Θ(1) time using index calculations.
 Conquering makes 8 recursive calls, each multiplying two n/2 × n/2 matrices ⇒ 8T(n/2).
 Combining takes Θ(n^2) time to add n/2 × n/2 matrices 4 times.

 Recurrence is

T(n) = Θ(1)                if n = 1
T(n) = 8T(n/2) + Θ(n^2)    if n > 1

 Can use the master method to show that its solution is T(n) = Θ(n^3): here a = 8 and b = 2, so n^(log_2 8) = n^3 dominates f(n) = Θ(n^2). Asymptotically, no better than the obvious method.
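The Θ(n^3) solution can be sanity-checked by counting scalar multiplications: they satisfy M(n) = 8M(n/2) with M(1) = 1, which solves to exactly n^3. A quick illustrative check (not from the slides):

```python
def mults(n):
    """Scalar multiplications done by the 8-way recursive method (n a power of 2)."""
    # M(n) = 8*M(n/2), M(1) = 1  =>  M(n) = 8^(lg n) = n^3
    return 1 if n == 1 else 8 * mults(n // 2)
```

For example, mults(64) equals 64^3.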

 When setting up recurrences, we can absorb constant factors into the asymptotic notation, but we cannot absorb a constant number of subproblems.
◦ Although we absorb the 4 additions of n/2 × n/2 matrices into the Θ(n^2) term, we cannot lose the 8 in front of the T(n/2) term.
◦ If we absorbed the constant number of subproblems, the recursion tree would not be "bushy" and would instead be a linear chain.

 Idea: make the recursion tree less bushy. Perform only 7 recursive multiplications of n/2 × n/2 matrices, rather than 8. This will cost several extra additions of n/2 × n/2 matrices, but only a constant number more ⇒ we can still absorb the constant factor for matrix additions into the Θ(n^2) term.

 Algorithm
◦ As in the recursive method, partition each of the matrices into four n/2 × n/2 submatrices. Time: Θ(1).
◦ Create 10 matrices S1, S2, …, S10. Each is n/2 × n/2 and is the sum or difference of two matrices created in the previous step. Time: Θ(n^2) to create all 10 matrices.
◦ Recursively compute 7 matrix products P1, P2, …, P7, each n/2 × n/2.
◦ Compute the n/2 × n/2 submatrices of C by adding and subtracting various combinations of the Pi. Time: Θ(n^2).

 Analysis:
◦ Recurrence will be

T(n) = Θ(1)                if n = 1
T(n) = 7T(n/2) + Θ(n^2)    if n > 1

◦ By the master method, the solution is T(n) = Θ(n^(lg 7)).
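Analogously to the 8-way count, the number of scalar multiplications under the 7-way scheme satisfies M(n) = 7M(n/2) with M(1) = 1, i.e. M(n) = 7^(lg n) = n^(lg 7). A quick illustrative check (not from the slides):

```python
import math

def mults7(n):
    """Scalar multiplications done by the 7-way scheme (n a power of 2)."""
    # M(n) = 7*M(n/2), M(1) = 1  =>  M(n) = 7^(lg n) = n^(lg 7)
    return 1 if n == 1 else 7 * mults7(n // 2)
```

For example, mults7(64) = 7^6 = 117649, far below 64^3 = 262144, and 7^(lg n) agrees with n^(lg 7).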

 Step 2, create the 10 matrices:

S1 = B12 - B22
S2 = A11 + A12
S3 = A21 + A22
S4 = B21 - B11
S5 = A11 + A22
S6 = B11 + B22
S7 = A12 - A22
S8 = B21 + B22
S9 = A11 - A21
S10 = B11 + B12

 Add or subtract n/2 × n/2 matrices 10 times ⇒ time is Θ(n^2).

 Step 3, create the 7 matrices:

P1 = A11 ⋅ S1  = A11 ⋅ B12 - A11 ⋅ B22
P2 = S2 ⋅ B22  = A11 ⋅ B22 + A12 ⋅ B22
P3 = S3 ⋅ B11  = A21 ⋅ B11 + A22 ⋅ B11
P4 = A22 ⋅ S4  = A22 ⋅ B21 - A22 ⋅ B11
P5 = S5 ⋅ S6   = A11 ⋅ B11 + A11 ⋅ B22 + A22 ⋅ B11 + A22 ⋅ B22
P6 = S7 ⋅ S8   = A12 ⋅ B21 + A12 ⋅ B22 - A22 ⋅ B21 - A22 ⋅ B22
P7 = S9 ⋅ S10  = A11 ⋅ B11 + A11 ⋅ B12 - A21 ⋅ B11 - A21 ⋅ B12

 The only multiplications needed are in the middle column; the right-hand column just shows the products in terms of the original submatrices of A and B.

 Step 4, add and subtract the Pi to construct the submatrices of C:

C11 = P5 + P4 - P2 + P6
C12 = P1 + P2
C21 = P3 + P4
C22 = P5 + P1 - P3 - P7

 To see how these computations work, expand each right-hand side, replacing each Pi with the submatrices of A and B that form it, and cancel terms. For example:

C12 = P1 + P2 = (A11 ⋅ B12 - A11 ⋅ B22) + (A11 ⋅ B22 + A12 ⋅ B22) = A11 ⋅ B12 + A12 ⋅ B22
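Putting steps 1 through 4 together, the 7-multiplication scheme can be sketched in Python, again using plain lists and assuming n is a power of 2 (helper names are illustrative, not from the slides):

```python
def add(X, Y):
    return [[a + b for a, b in zip(r, s)] for r, s in zip(X, Y)]

def sub(X, Y):
    return [[a - b for a, b in zip(r, s)] for r, s in zip(X, Y)]

def strassen(A, B):
    n = len(A)
    if n == 1:                          # base case: one scalar multiplication
        return [[A[0][0] * B[0][0]]]
    h = n // 2                          # step 1: partition into four blocks
    A11 = [r[:h] for r in A[:h]]; A12 = [r[h:] for r in A[:h]]
    A21 = [r[:h] for r in A[h:]]; A22 = [r[h:] for r in A[h:]]
    B11 = [r[:h] for r in B[:h]]; B12 = [r[h:] for r in B[:h]]
    B21 = [r[:h] for r in B[h:]]; B22 = [r[h:] for r in B[h:]]
    # steps 2-3: the S matrices are formed inline; 7 recursive products
    P1 = strassen(A11, sub(B12, B22))            # A11 * S1
    P2 = strassen(add(A11, A12), B22)            # S2 * B22
    P3 = strassen(add(A21, A22), B11)            # S3 * B11
    P4 = strassen(A22, sub(B21, B11))            # A22 * S4
    P5 = strassen(add(A11, A22), add(B11, B22))  # S5 * S6
    P6 = strassen(sub(A12, A22), add(B21, B22))  # S7 * S8
    P7 = strassen(sub(A11, A21), add(B11, B12))  # S9 * S10
    # step 4: combine the products into the blocks of C
    C11 = add(sub(add(P5, P4), P2), P6)
    C12 = add(P1, P2)
    C21 = add(P3, P4)
    C22 = sub(sub(add(P5, P1), P3), P7)
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot
```

Only 7 recursive calls appear, matching the 7T(n/2) term; all other work is a constant number of Θ(n^2) block additions and subtractions.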

 Strassen's algorithm was the first to beat Θ(n^3) time, but it is not the asymptotically fastest known. A method by Coppersmith and Winograd runs in O(n^2.376) time.
 Practical issues weigh against Strassen's algorithm:
◦ Higher constant factor than the obvious Θ(n^3)-time method.
◦ Not good for sparse matrices.
◦ Not numerically stable: larger errors accumulate than in the obvious method.
◦ Submatrices consume space, especially if copying.
◦ The crossover point is somewhere between n = 8 and n = 400.

 Use the recursion tree to generate a guess. Then verify by the substitution method.
 Example: T(n) = T(n/3) + T(2n/3) + Θ(n)

 There are log_3 n full levels, and after log_(3/2) n levels, the problem size is down to 1.
 Each level contributes ≤ cn.
 Lower-bound guess: T(n) ≥ d n log_3 n = Ω(n lg n) for some positive constant d.
 Upper-bound guess: T(n) ≤ d n log_(3/2) n = O(n lg n) for some positive constant d.
 Then prove by substitution.

 Upper bound: guess T(n) ≤ dn lg n for a suitable constant d > 0.

T(n) ≤ T(n/3) + T(2n/3) + cn
     ≤ d(n/3) lg(n/3) + d(2n/3) lg(2n/3) + cn
     = dn lg n - dn(lg 3 - 2/3) + cn
     ≤ dn lg n,

as long as d ≥ c / (lg 3 - 2/3).

 Lower bound: guess T(n) ≥ dn lg n for a suitable constant d > 0. The same algebra gives

T(n) ≥ dn lg n - dn(lg 3 - 2/3) + cn ≥ dn lg n,

as long as d ≤ c / (lg 3 - 2/3).
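The Θ(n lg n) guess can also be checked numerically. A small illustrative script (my own, with c = 1 and floor division standing in for exact thirds) confirms that T(n)/(n lg n) stays within constant bounds as n grows:

```python
import math
from functools import lru_cache

# T(n) = T(n/3) + T(2n/3) + n, with T(1) = 1 and floor division
@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1
    return T(n // 3) + T(2 * n // 3) + n

# the ratio T(n) / (n lg n) should remain bounded above and below
ratios = [T(n) / (n * math.log2(n)) for n in (2**8, 2**12, 2**16, 2**20)]
```

Memoization (`lru_cache`) keeps the computation fast even for n around a million, since the recursion revisits the same subproblem sizes many times.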

 Used for many divide-and-conquer recurrences of the form ◦ T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, f(n) > 0. ◦ Based on the master theorem (Theorem 4.1).

 The theorem does not cover all the cases:
◦ There is a gap between case 1 and case 2 when f(n) is smaller than n^(log_b a) but not polynomially smaller.
◦ Similarly, there is a gap between case 2 and case 3 when f(n) is larger than n^(log_b a) but not polynomially larger.
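For recurrences the theorem does cover, with f(n) = Θ(n^k), applying it reduces to comparing k against log_b a. A small illustrative helper (my own names, not from the slides; the regularity condition of case 3 holds automatically for f(n) = n^k):

```python
import math

def master_method(a, b, k, eps=1e-9):
    """Solve T(n) = a*T(n/b) + Theta(n^k); return a description of Theta(T(n))."""
    p = math.log(a, b)          # critical exponent log_b(a)
    if k < p - eps:             # case 1: f(n) polynomially smaller than n^p
        return f"Θ(n^{p:.3f})"
    if abs(k - p) <= eps:       # case 2: f(n) matches n^p
        return f"Θ(n^{p:.3f} lg n)"
    return f"Θ(n^{k})"          # case 3: f(n) polynomially larger, f dominates
```

For example, master_method(8, 2, 2) reports Θ(n^3.000) for the obvious matrix-multiplication recurrence, while master_method(7, 2, 2) reports Θ(n^2.807) for Strassen's.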

 Heapsort  Reading: ◦ Read Chapter 7