CSCI 256 Data Structures and Algorithm Analysis, Lecture 12. Some slides by Kevin Wayne, copyright 2005 Pearson Addison-Wesley, all rights reserved; some by Iker Gondra
More divide and conquer strategies
Today we will blast through some problems that show how careful analysis of the particular problem is required in order to come up with a more efficient strategy. The first involves finding the closest pair of points in the plane, where some knowledge of Euclidean distances is necessary. The second involves faster integer (binary) and matrix multiplication, which takes advantage of preprocessing with cheap additions to cut down on the expensive multiplications. For the second we recall the result proved several days ago on recurrence relations with q subproblems of size n/2: for q > 2, if T(n) ≤ q·T(n/2) + cn for n > 2, and T(2) = c, then T(n) = O(n^(log₂ q))
Closest Pair
Closest pair: Given n points, find a pair with the smallest Euclidean distance between them
Brute force: check all pairs of points p and q with Θ(n²) comparisons
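A minimal sketch of the brute-force method (the function name and point representation are illustrative):

```python
import math

def closest_pair_brute(points):
    """Check all Theta(n^2) pairs; return (distance, pair)."""
    best = (math.inf, None)
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            d = math.dist(points[i], points[j])
            if d < best[0]:
                best = (d, (points[i], points[j]))
    return best
```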
Closest Pair: The One Dimensional Case
Closest pair on the line: Given n points on a line, find a pair of points that are minimally distant from each other
Divide & Conquer? Partition the points into two sets S1 and S2 by some point m: the points in S1 are to the left of m and those in S2 are to the right of m
Let {p1, p2} be the closest pair in S1 and {q1, q2} the closest pair in S2. Let d be the smaller of dist(p1, p2) and dist(q1, q2)
We can now say that the closest pair is {p1, p2}, or {q1, q2}, or some pair {p3, q3} where p3 is in S1 and q3 is in S2
Notice that if the closest pair is {p3, q3}, then both p3 and q3 must be within d of the dividing point m. WHY? If either p3 or q3 were farther than d from m, the distance between them would be bigger than d, so we would not need to compare them
How many candidate pairs {p3, q3} can there be? Just one: if two points p and p' of S1 (or of S2) were both within d of m, their distance would be less than d, contradicting the choice of d
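The 1-D strategy above can be sketched as follows (a sketch with hypothetical names, assuming distinct points; it sorts first, so after the split the only cross pair worth checking is the adjacent pair around the dividing index):

```python
def closest_pair_1d(xs):
    """Divide & conquer closest pair on a line; returns the smallest gap."""
    xs = sorted(xs)

    def solve(lo, hi):  # half-open range [lo, hi)
        if hi - lo < 2:
            return float('inf')
        mid = (lo + hi) // 2
        d = min(solve(lo, mid), solve(mid, hi))
        # Only one candidate pair can straddle the split:
        # the rightmost point of S1 and the leftmost point of S2.
        return min(d, xs[mid] - xs[mid - 1])

    return solve(0, len(xs))
```

Of course, once the points are sorted the answer is simply the minimum adjacent gap; the recursion is here to mirror the slide's argument.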
Closest Pair in the Plane
Can we generalize the 1-D Divide & Conquer algorithm to the 2-D case? Yes, but it’s a bit more complicated
Closest Pair in the Plane
Algorithm
Divide: draw a vertical line L so that roughly ½n points lie on each side
Conquer: find the closest pair on each side recursively
Combine: find the closest pair with one point on each side
Return the best of the 3 solutions - the combine step seems to require Θ(n²) time
Closest Pair in the Plane
Find the closest pair with one point on each side, assuming that their distance is less than δ = min(δ1, δ2)
Observation: we only need to consider points within δ of line L (otherwise their distance is bigger than δ)
Till now, we are completely in step with the 1-D case. At this point, however, the extra dimension causes some problems: in the 1-D case there was at most one pair of points in the strip. How many points could there be in the strip in the 2-D case? Why is this a problem?
Combining Solutions
Suppose the minimum separation from the subproblems is δ
In looking for cross-set closest pairs, we only need to consider points within δ of the boundary
How many cross-border interactions do we need to test?
Claim: for any point si = (xi, yi) in the 2δ-strip, just check the 6 nearest points on the other side of the divider
Def. Let si be the point in the 2δ-strip with the ith smallest y-coordinate
Claim. If |i − j| ≥ 7, then the distance between si and sj is at least δ
Proof idea (uses results from some geometry): no two points lie in the same ½δ-by-½δ box; two points at least 2 rows apart have distance at least 2(½δ) = δ; geometry tells us we only need to consider at most 6 points ▪
Details
Preprocessing: sort the points by y-coordinate
Merge step:
Select the points in the boundary zone (the strip) - there are at most n
For each point in the boundary zone:
Find the highest point on the other side that is at most δ above
Find the lowest point on the other side that is at most δ below
Compare with the points in this interval - there are at most 6
We don't need n² comparisons; at most 6n suffice to check all candidate pairs
However, since we sort the points in the strip by their y-coordinates, the merge step is not linear but in fact takes O(n log n) time
Closest Pair in the Plane: Algorithm
Closest-Pair(p1, …, pn) {
    Compute separation line L such that half the points      O(n log n)
    are on one side and half on the other side
    δ1 = Closest-Pair(left half)                             2T(n / 2)
    δ2 = Closest-Pair(right half)
    δ  = min(δ1, δ2)
    Delete all points further than δ from separation line L  O(n)
    Sort remaining points by y-coordinate                    O(n log n)
    Scan and compare the distance between each point in      O(n)
    the boundary and its at most 6 neighbors; if any of
    these distances is less than δ, update δ
    return δ
}
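A direct Python transcription of this outline (a sketch, assuming distinct points; the helper names are illustrative). For simplicity it rebuilds the y-sorted sublists with a set-membership filter rather than a true merge-split, and it checks up to 7 successors in the strip, a slightly looser bound than the 6 the slides argue for:

```python
import math

def closest_pair(points):
    """Divide & conquer closest pair; points are (x, y) tuples."""
    px = sorted(points)                       # sorted by x
    py = sorted(points, key=lambda p: p[1])   # sorted by y

    def solve(px, py):
        n = len(px)
        if n <= 3:  # base case: brute force
            return min((math.dist(p, q) for i, p in enumerate(px)
                        for q in px[i + 1:]), default=math.inf)
        mid = n // 2
        x_mid = px[mid][0]
        lx, rx = px[:mid], px[mid:]
        left_set = set(lx)
        ly = [p for p in py if p in left_set]
        ry = [p for p in py if p not in left_set]
        d = min(solve(lx, ly), solve(rx, ry))
        # Combine: scan points within d of the line, in y order
        strip = [p for p in py if abs(p[0] - x_mid) < d]
        for i, p in enumerate(strip):
            for q in strip[i + 1:i + 8]:      # at most 7 successors
                d = min(d, math.dist(p, q))
        return d

    return solve(px, py)
```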
Closest Pair in the Plane: Analysis
Running time: T(n) = O(n log n)
Prove this by induction (hint: use strong induction)
Start with T(n) ≤ 2T(n/2) + cn for some constant c, and T(2) ≤ c
Base case: T(2) ≤ c ≤ c·2·log₂ 2
Induction hypothesis???
Induction step???
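One way to complete the induction the slide asks for (a sketch, assuming the recurrence T(n) ≤ 2T(n/2) + cn with T(2) ≤ c, and n a power of 2):

```latex
\textbf{Claim:}\quad T(n) \le c\,n \log_2 n \text{ for all } n \ge 2.

\textbf{Base case:}\quad T(2) \le c \le c \cdot 2 \log_2 2.

\textbf{Induction step (strong induction):} assuming the claim for all smaller sizes,
\begin{align*}
T(n) &\le 2\,T(n/2) + cn \\
     &\le 2 \cdot c\,\tfrac{n}{2}\log_2\tfrac{n}{2} + cn \\
     &= cn(\log_2 n - 1) + cn \\
     &= cn \log_2 n.
\end{align*}
```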
Closest-Pair Using Recursion:
The Closest-Pair algorithm outlined in the text (page 230) uses a Closest-Pair-Recursion
Don't sort the points in the strip from scratch each time
Each recursive call returns two lists: all points sorted by y-coordinate, and all points sorted by x-coordinate
Sort by merging two pre-sorted lists - this takes O(n) time; thus we achieve a better overall time: it is now O(n log n)
Rearranging and recombining
In this section we shall see how calculations like multiplying numbers and multiplying matrices can be done more efficiently by using creative divide and conquer techniques The trick is to creatively break these problems into subproblems taking advantage of cheap additions
Integer Arithmetic
Add: Given two n-digit integers a and b, compute a + b - O(n) bit operations
Multiply: Given two n-digit integers a and b, compute a × b - brute-force solution: Θ(n²) bit operations
Divide-and-Conquer Multiplication: Warmup
To multiply two n-digit integers Multiply four ½n-digit integers Add two ½n-digit integers, and shift to obtain result What is time complexity??
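The warmup rests on the grade-school identity, writing x = x1·2^(n/2) + x0 and y = y1·2^(n/2) + y0:

```latex
xy = \bigl(x_1 2^{n/2} + x_0\bigr)\bigl(y_1 2^{n/2} + y_0\bigr)
   = x_1 y_1\, 2^{n} + (x_1 y_0 + x_0 y_1)\, 2^{n/2} + x_0 y_0
```

Four half-size multiplications (x1y1, x1y0, x0y1, x0y0), plus additions and shifts.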
We can use recursion but Ouch! No time improvement
Recursively compute results for 4 subproblems on n/2-bit numbers and combine them (since the combining requires a constant number of additions of n-bit numbers, it takes O(n)), so T(n) ≤ 4T(n/2) + cn
This is an example of the problem we saw the other day with q = 4, so T(n) = O(n^(log₂ 4)) = O(n²)
Karatsuba Multiplication gives us a speed up by thinking about the problem in a different way
To multiply two n-digit integers:
Add two ½n-digit integers
Multiply three ½n-digit integers
Add, subtract, and shift ½n-digit integers to obtain the result
Karatsuba Multiplication
Recursive-Multiply(x, y):
    Write x = x1·2^(n/2) + x0,  y = y1·2^(n/2) + y0
    Compute x1 + x0 and y1 + y0
    p    = Recursive-Multiply(x1 + x0, y1 + y0)
    x1y1 = Recursive-Multiply(x1, y1)
    x0y0 = Recursive-Multiply(x0, y0)
    Return x1y1·2^n + (p − x1y1 − x0y0)·2^(n/2) + x0y0
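A sketch of Recursive-Multiply in Python, working in base 2 (bit shifts play the role of multiplying by powers of 2; the function name is illustrative):

```python
def karatsuba(x, y):
    """Recursive-Multiply for nonnegative integers (base 2)."""
    if x < 2 or y < 2:          # base case: single-bit operand
        return x * y
    n = max(x.bit_length(), y.bit_length())
    half = n // 2
    x1, x0 = x >> half, x & ((1 << half) - 1)   # x = x1*2^half + x0
    y1, y0 = y >> half, y & ((1 << half) - 1)
    p = karatsuba(x1 + x0, y1 + y0)
    a = karatsuba(x1, y1)
    b = karatsuba(x0, y0)
    # xy = x1y1*2^(2*half) + (p - x1y1 - x0y0)*2^half + x0y0
    return (a << (2 * half)) + ((p - a - b) << half) + b
```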
Karatsuba Multiplication: use recursion with 3 problems of size n/2 and cheap additions
Theorem [Karatsuba-Ofman, 1962]: we can multiply two n-digit integers in O(n^(log₂ 3)) = O(n^1.585) bit operations
Karatsuba: Recursion Tree - we already solved this recurrence for arbitrary q (here q = 3)
Level 0: 1 problem of size n - work cn
Level 1: 3 problems of size n/2 - work 3(cn/2)
Level 2: 9 problems of size n/4 - work 9(cn/4)
. . .
Level k: 3^k problems of size n/2^k - work 3^k(cn/2^k)
. . .
Level lg n: 3^(lg n) problems of size 2 - constant work each
Summing the per-level work gives a geometric series dominated by the bottom level, 3^(lg n) = n^(lg 3), so T(n) = O(n^(log₂ 3))
Matrix Multiplication
Matrix multiplication: Given two n-by-n matrices A and B, compute C = AB
Brute force: Θ(n³) arithmetic operations - can you see why??
Fundamental question: Can we improve upon brute force?
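The brute-force Θ(n³) method, for reference (a minimal sketch using nested lists):

```python
def mat_mul(A, B):
    """Brute-force C = AB for n-by-n matrices: three nested loops."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):       # inner product of row i and column j
                C[i][j] += A[i][k] * B[k][j]
    return C
```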
Matrix Multiplication: Warmup
Divide-and-conquer Divide: partition A and B into ½n-by-½n blocks Conquer: multiply 8 ½n-by-½n matrices recursively Combine: add appropriate products using 4 matrix additions Time complexity???
Still the same complexity: T(n) = 8T(n/2) + Θ(n²), so the time is still Θ(n³)
Matrix Multiplication: 7 problems of size n/2 plus cheap additions and subtractions
Key idea: multiply 2-by-2 block matrices with only 7 multiplications
7 multiplications
18 additions (or subtractions)
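For concreteness, Strassen's original choice of the 7 block products and the 4 block combinations (supplied here, since the formulas themselves are standard):

```latex
\begin{aligned}
P_1 &= A_{11}(B_{12} - B_{22}) \\
P_2 &= (A_{11} + A_{12})\,B_{22} \\
P_3 &= (A_{21} + A_{22})\,B_{11} \\
P_4 &= A_{22}(B_{21} - B_{11}) \\
P_5 &= (A_{11} + A_{22})(B_{11} + B_{22}) \\
P_6 &= (A_{12} - A_{22})(B_{21} + B_{22}) \\
P_7 &= (A_{11} - A_{21})(B_{11} + B_{12})
\end{aligned}
\qquad
\begin{aligned}
C_{11} &= P_5 + P_4 - P_2 + P_6 \\
C_{12} &= P_1 + P_2 \\
C_{21} &= P_3 + P_4 \\
C_{22} &= P_5 + P_1 - P_3 - P_7
\end{aligned}
```

Forming the P's takes 10 block additions/subtractions and combining them into C takes 8 more, which accounts for the 18 on the slide.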
Fast Matrix Multiplication
Fast matrix multiplication (Strassen, 1969)
Divide: partition A and B into ½n-by-½n blocks
Compute: 14 ½n-by-½n matrices via 10 matrix additions
Conquer: multiply 7 ½n-by-½n matrices recursively
Combine: 7 products into 4 terms using 8 matrix additions
Analysis
Assume n is a power of 2, and let T(n) = # arithmetic operations
T(n) = 7T(n/2) + Θ(n²), so T(n) = Θ(n^(log₂ 7)) = O(n^2.81)
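A sketch of the scheme in Python, assuming n is a power of 2 and using Strassen's standard product formulas (helper names are illustrative; a practical implementation would switch to the brute-force method below some cutoff size):

```python
def strassen(A, B):
    """Strassen's algorithm for n-by-n matrices, n a power of 2 (a sketch)."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    def block(M, r, c):                     # extract an h-by-h block
        return [row[c:c + h] for row in M[r:r + h]]
    def add(X, Y):
        return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    def sub(X, Y):
        return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    A11, A12, A21, A22 = block(A,0,0), block(A,0,h), block(A,h,0), block(A,h,h)
    B11, B12, B21, B22 = block(B,0,0), block(B,0,h), block(B,h,0), block(B,h,h)
    # 7 recursive multiplications (10 additions/subtractions to form them)
    P1 = strassen(A11, sub(B12, B22))
    P2 = strassen(add(A11, A12), B22)
    P3 = strassen(add(A21, A22), B11)
    P4 = strassen(A22, sub(B21, B11))
    P5 = strassen(add(A11, A22), add(B11, B22))
    P6 = strassen(sub(A12, A22), add(B21, B22))
    P7 = strassen(sub(A11, A21), add(B11, B12))
    # Combine the 7 products into the 4 blocks (8 more additions)
    C11 = add(sub(add(P5, P4), P2), P6)
    C12 = add(P1, P2)
    C21 = add(P3, P4)
    C22 = sub(sub(add(P5, P1), P3), P7)
    return [C11[i] + C12[i] for i in range(h)] + \
           [C21[i] + C22[i] for i in range(h)]
```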
Fast Matrix Multiplication in Practice
Implementation issues:
Sparsity
Numerical stability (number of significant digits)
Odd matrix dimensions
Common misperception: "Strassen is only a theoretical curiosity."
The Advanced Computation Group at Apple Computer reports an 8x speedup on the G4 Velocity Engine when n ≈ 2,500
The range of instances where it's useful is a subject of controversy
Fast Matrix Multiplication in Theory
Decimal wars
December 1979: O(n^2.521813)
January 1980: O(n^2.521801)
Best known: O(n^2.376) [Coppersmith-Winograd, 1987]
Conjecture: O(n^(2+ε)) for any ε > 0