
1 Algorithmic Paradigms
Greed. Build up a solution incrementally, myopically optimizing some local criterion.
Divide-and-conquer. Break up a problem into two sub-problems, solve each sub-problem independently, and combine the solutions to the sub-problems to form a solution to the original problem.
Dynamic programming. Break up a problem into a series of overlapping sub-problems, and build up solutions to larger and larger sub-problems.

2 Dynamic Programming History
Bellman. Pioneered the systematic study of dynamic programming in the 1950s.
Etymology.
- Dynamic programming = planning over time.
- The Secretary of Defense was hostile to mathematical research.
- Bellman sought an impressive name to avoid confrontation.
  - "it's impossible to use dynamic in a pejorative sense"
  - "something not even a Congressman could object to"
Reference: Bellman, R. E., Eye of the Hurricane: An Autobiography.
Quotation: "Those who cannot remember the past are condemned to repeat it."

3 Dynamic Programming Applications
Areas.
- Bioinformatics.
- Control theory.
- Information theory.
- Operations research.
- Computer science: theory, graphics, AI, systems, ...
Some famous dynamic programming algorithms.
- Viterbi for hidden Markov models.
- Unix diff for comparing two files.
- Smith-Waterman for sequence alignment.
- Bellman-Ford for shortest path routing in networks.
- Cocke-Kasami-Younger for parsing context-free grammars.

4 6.1 Weighted Interval Scheduling

5 Weighted Interval Scheduling
Weighted interval scheduling problem.
- Job j starts at s_j, finishes at f_j, and has weight or value v_j.
- Two jobs are compatible if they don't overlap.
- Goal: find a maximum-weight subset of mutually compatible jobs.
[Figure: eight jobs a-h drawn as intervals on a timeline from 0 to 11.]

6 Unweighted Interval Scheduling Review
Recall. The greedy algorithm works if all weights are 1.
- Consider jobs in ascending order of finish time.
- Add a job to the subset if it is compatible with the previously chosen jobs.
Observation. The greedy algorithm can fail spectacularly if arbitrary weights are allowed.
[Figure: two overlapping jobs a and b on a timeline from 0 to 11, one with weight 999 and one with weight 1.]

7 Weighted Interval Scheduling
Notation. Label jobs by finishing time: f_1 ≤ f_2 ≤ ... ≤ f_n.
Def. p(j) = largest index i < j such that job i is compatible with job j.
Ex: p(8) = 5, p(7) = 3, p(2) = 0.
[Figure: eight jobs numbered 1-8 on a timeline from 0 to 11, illustrating the p(j) values above.]
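
As an added illustration (not part of the original slides), here is a minimal Python sketch that computes p(j) for jobs given as (start, finish) pairs, assuming the jobs are already sorted by finish time and that jobs i and j count as compatible when f_i ≤ s_j; it binary-searches the sorted finish times.

from bisect import bisect_right

def compute_p(jobs):
    # jobs: list of (start, finish) pairs, sorted by finish time
    # returns p as a 1-indexed list; p[0] is unused
    finishes = [f for (s, f) in jobs]
    p = [0] * (len(jobs) + 1)
    for j, (s, f) in enumerate(jobs, start=1):
        # largest 1-indexed i < j with finish_i <= start_j, or 0 if none
        p[j] = bisect_right(finishes, s, 0, j - 1)
    return p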

8 Dynamic Programming: Binary Choice
Notation. OPT(j) = value of an optimal solution to the problem consisting of job requests 1, 2, ..., j.
- Case 1: OPT selects job j.
  - can't use incompatible jobs { p(j) + 1, p(j) + 2, ..., j - 1 }
  - must include an optimal solution to the problem consisting of the remaining compatible jobs 1, 2, ..., p(j)
- Case 2: OPT does not select job j.
  - must include an optimal solution to the problem consisting of the remaining compatible jobs 1, 2, ..., j - 1
This optimal substructure gives the recurrence: OPT(0) = 0, and OPT(j) = max( v_j + OPT(p(j)), OPT(j-1) ) for j ≥ 1.

9 Weighted Interval Scheduling: Brute Force
Brute force algorithm.

Input: n, s_1, ..., s_n, f_1, ..., f_n, v_1, ..., v_n

Sort jobs by finish times so that f_1 ≤ f_2 ≤ ... ≤ f_n.
Compute p(1), p(2), ..., p(n)

Compute-Opt(j) {
   if (j = 0)
      return 0
   else
      return max(v_j + Compute-Opt(p(j)), Compute-Opt(j-1))
}
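
A direct Python transcription of this recursion (an added sketch, not from the slides; v and p are assumed to be 1-indexed lists matching the pseudocode):

def compute_opt(j, v, p):
    # plain recursion with no caching: the same sub-problems are
    # recomputed over and over (see the next slide)
    if j == 0:
        return 0
    return max(v[j] + compute_opt(p[j], v, p),
               compute_opt(j - 1, v, p))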

10 Weighted Interval Scheduling: Brute Force
Observation. The recursive algorithm fails spectacularly because of redundant sub-problems, giving an exponential-time algorithm.
Ex. The number of recursive calls for the family of "layered" instances with p(1) = 0 and p(j) = j - 2 grows like the Fibonacci sequence.
[Figure: recursion tree for n = 5, in which the sub-problems 3, 2, 1, 0 are recomputed many times.]

11 Weighted Interval Scheduling: Memoization
Memoization. Store the result of each sub-problem in a cache (a global array M); look it up as needed.

Input: n, s_1, ..., s_n, f_1, ..., f_n, v_1, ..., v_n

Sort jobs by finish times so that f_1 ≤ f_2 ≤ ... ≤ f_n.
Compute p(1), p(2), ..., p(n)

for j = 1 to n
   M[j] = empty
M[0] = 0

M-Compute-Opt(j) {
   if (M[j] is empty)
      M[j] = max(v_j + M-Compute-Opt(p(j)), M-Compute-Opt(j-1))
   return M[j]
}
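
A compact top-down Python version of the memoized recurrence (an added sketch, not the slides' own code; v and p are 1-indexed lists, with index 0 serving as the empty problem):

def m_compute_opt(j, v, p, M):
    # M[j] caches OPT(j); recursion depth is O(n)
    if M[j] is None:
        M[j] = max(v[j] + m_compute_opt(p[j], v, p, M),
                   m_compute_opt(j - 1, v, p, M))
    return M[j]

def weighted_interval_opt(v, p):
    n = len(v) - 1
    M = [None] * (n + 1)
    M[0] = 0                      # base case: empty problem has value 0
    return m_compute_opt(n, v, p, M)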

12 Weighted Interval Scheduling: Running Time
Claim. The memoized version of the algorithm takes O(n log n) time.
- Sort by finish time: O(n log n).
- Computing p(·): O(n) after sorting by start time.
- M-Compute-Opt(j): each invocation takes O(1) time and either
  - (i) returns an existing value M[j], or
  - (ii) fills in one new entry M[j] and makes two recursive calls.
- Progress measure Φ = # nonempty entries of M[].
  - Initially Φ = 0; throughout, Φ ≤ n.
  - Case (ii) increases Φ by 1, so there are at most 2n recursive calls.
- Overall running time of M-Compute-Opt(n) is O(n). ▪
Remark. O(n) overall if jobs are pre-sorted by start and finish times.

13 Automated Memoization
Automated memoization. Many functional programming languages (e.g., Lisp) have built-in support for memoization.
Q. Why not in imperative languages (e.g., Java)?

Java (exponential):
static int F(int n) {
   if (n <= 1) return n;
   else return F(n-1) + F(n-2);
}

Lisp (efficient):
(defun F (n)
   (if (<= n 1)
       n
       (+ (F (- n 1)) (F (- n 2)))))

[Figure: recursion tree of F(40) — F(39), F(38), F(37), ... — showing the repeated calls that the cache avoids.]
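
For comparison (an added example, not from the original slides), Python's standard library offers the same convenience in an imperative language: functools.lru_cache memoizes a recursive function without changing its body.

from functools import lru_cache

@lru_cache(maxsize=None)     # cache every computed value of F
def F(n):
    if n <= 1:
        return n
    return F(n - 1) + F(n - 2)

print(F(40))   # fast: each F(k) is computed only once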

14 Weighted Interval Scheduling: Finding a Solution
Q. The dynamic programming algorithm computes the optimal value. What if we want the solution itself?
A. Do some post-processing: run M-Compute-Opt(n), then run Find-Solution(n).

Find-Solution(j) {
   if (j = 0)
      output nothing
   else if (v_j + M[p(j)] > M[j-1])
      print j
      Find-Solution(p(j))
   else
      Find-Solution(j-1)
}

# of recursive calls ≤ n, so post-processing takes O(n) time.

15 Weighted Interval Scheduling: Bottom-Up
Bottom-up dynamic programming. Unwind the recursion.

Input: n, s_1, ..., s_n, f_1, ..., f_n, v_1, ..., v_n

Sort jobs by finish times so that f_1 ≤ f_2 ≤ ... ≤ f_n.
Compute p(1), p(2), ..., p(n)

Iterative-Compute-Opt {
   M[0] = 0
   for j = 1 to n
      M[j] = max(v_j + M[p(j)], M[j-1])
}
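
A bottom-up Python sketch of the same table, together with the Find-Solution traceback from slide 14 (an added illustration; v and p are 1-indexed lists with index 0 as the empty problem):

def iterative_compute_opt(v, p):
    # M[j] = value of an optimal solution for jobs 1..j
    n = len(v) - 1
    M = [0] * (n + 1)
    for j in range(1, n + 1):
        M[j] = max(v[j] + M[p[j]], M[j - 1])
    return M

def find_solution(v, p, M):
    # trace back through M to recover one optimal set of jobs
    chosen = []
    j = len(M) - 1
    while j > 0:
        if v[j] + M[p[j]] > M[j - 1]:   # job j belongs to an optimal solution
            chosen.append(j)
            j = p[j]
        else:
            j -= 1
    return list(reversed(chosen))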

16 Dynamic Programming
1. Replaces an exponential-time algorithm with a polynomial-time algorithm.
2. Powerful for designing algorithms for optimization problems.
3. Similar to divide-and-conquer, but solves every sub-problem only once, avoiding re-computation (it "remembers the past").
Elements:
- Use tables.
- Subproblems: decompose the problem into smaller subproblems.
- Main principle: for the global problem to be solved optimally, each sub-problem should be solved optimally.
- Top-down vs. bottom-up:
  - Top-down: memoization.
  - Bottom-up: combine solutions of smaller subproblems to solve larger subproblems.

17 6.3 Segmented Least Squares

18 Segmented Least Squares
Least squares.
- Foundational problem in statistics and numerical analysis.
- Given n points in the plane: (x_1, y_1), (x_2, y_2), ..., (x_n, y_n).
- Find a line y = ax + b that minimizes the sum of the squared error:
  SSE(a, b) = Σ_{i=1..n} (y_i - a x_i - b)².
Solution. Calculus shows the minimum error is achieved when
  a = ( n Σ_i x_i y_i - (Σ_i x_i)(Σ_i y_i) ) / ( n Σ_i x_i² - (Σ_i x_i)² ),   b = ( Σ_i y_i - a Σ_i x_i ) / n.
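
A small Python sketch of this closed form (an added illustration, not from the slides; fit_line is a hypothetical helper name): it returns the best-fit slope a, intercept b, and the resulting sum of squared errors for a list of (x, y) points.

def fit_line(points):
    # points: list of (x, y) pairs; returns (a, b, sse)
    n = len(points)
    sx  = sum(x for x, y in points)
    sy  = sum(y for x, y in points)
    sxy = sum(x * y for x, y in points)
    sxx = sum(x * x for x, y in points)
    denom = n * sxx - sx * sx
    a = (n * sxy - sx * sy) / denom if denom != 0 else 0.0   # degenerate: all x equal
    b = (sy - a * sx) / n
    sse = sum((y - a * x - b) ** 2 for x, y in points)
    return a, b, sse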

19 Segmented Least Squares
Segmented least squares.
- Points lie roughly on a sequence of several line segments.
- Given n points in the plane (x_1, y_1), (x_2, y_2), ..., (x_n, y_n) with x_1 < x_2 < ... < x_n, find a sequence of lines that minimizes f(x).
Q. What's a reasonable choice for f(x) to balance accuracy (goodness of fit) and parsimony (number of lines)?

20 Segmented Least Squares
Segmented least squares.
- Points lie roughly on a sequence of several line segments.
- Given n points in the plane (x_1, y_1), (x_2, y_2), ..., (x_n, y_n) with x_1 < x_2 < ... < x_n, find a sequence of lines that minimizes:
  - the sum E of the sums of the squared errors in each segment, and
  - the number of lines L.
- Tradeoff function: E + cL, for some constant c > 0.

21 Dynamic Programming: Multiway Choice
Notation.
- OPT(j) = minimum cost for points p_1, p_2, ..., p_j.
- e(i, j) = minimum sum of squares for points p_i, p_{i+1}, ..., p_j.
To compute OPT(j):
- The last segment uses points p_i, p_{i+1}, ..., p_j for some i.
- Cost = e(i, j) + c + OPT(i-1).
This gives the recurrence OPT(0) = 0 and OPT(j) = min_{1 ≤ i ≤ j} { e(i, j) + c + OPT(i-1) } for j ≥ 1.

22 Segmented Least Squares: Algorithm

INPUT: n, p_1, ..., p_n, c

Segmented-Least-Squares() {
   M[0] = 0
   for j = 1 to n
      for i = 1 to j
         compute the least squares error e(i, j) for the segment p_i, ..., p_j
   for j = 1 to n
      M[j] = min_{1 ≤ i ≤ j} ( e(i, j) + c + M[i-1] )
   return M[n]
}

Running time. O(n³).
- Bottleneck = computing e(i, j) for O(n²) pairs, O(n) per pair using the previous formula.
- Can be improved to O(n²) by pre-computing various statistics.
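
A direct Python transcription of this O(n³) algorithm (an added sketch, reusing the hypothetical fit_line helper defined above; it returns only the optimal cost, not the segmentation itself):

def segmented_least_squares(points, c):
    # points: list of (x, y) sorted by x; c: cost added per segment
    n = len(points)
    # e[i][j] = least squares error of one line through points i..j (1-indexed)
    e = [[0.0] * (n + 1) for _ in range(n + 1)]
    for j in range(1, n + 1):
        for i in range(1, j + 1):
            e[i][j] = fit_line(points[i - 1:j])[2]
    M = [0.0] * (n + 1)
    for j in range(1, n + 1):
        M[j] = min(e[i][j] + c + M[i - 1] for i in range(1, j + 1))
    return M[n]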

23 Knapsack
Subproblems:
- B[n, k] = true iff there exists a subset of (s[1], s[2], ..., s[n]) that exactly fills up a knapsack of size k. [It's a boolean value.]
- Recurrence: B[n, k] = B[n-1, k] or B[n-1, k - s[n]].
- How many such subproblems in total? O(nk).

24 Knapsack: Recursive

bool B(int n, int k) {
   if (n == 0 && k == 0) return true;
   if (n == 0 && k > 0) return false;
   if (B(n-1, k) || ((k - s[n] >= 0) && B(n-1, k - s[n])))
      return true;
   return false;
}
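
The same recurrence filled bottom-up as a table in Python (an added sketch; s is taken to be 1-indexed with s[0] unused, matching the slide's notation, and K is the knapsack size):

def subset_fills(s, K):
    # B[i][k] = True iff some subset of s[1..i] fills a knapsack of size k exactly
    n = len(s) - 1
    B = [[False] * (K + 1) for _ in range(n + 1)]
    B[0][0] = True                       # the empty subset fills size 0
    for i in range(1, n + 1):
        for k in range(0, K + 1):
            B[i][k] = B[i - 1][k] or (k - s[i] >= 0 and B[i - 1][k - s[i]])
    return B[n][K]                       # O(nK) subproblems, O(1) work each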

25 Edit Distance
Given: strings s[1..n] and t[1..m].
Find the fewest number of edits needed to convert s to t, where an 'edit' is:
- deleting a character,
- inserting a character, or
- changing a character.
a.k.a. Levenshtein distance.

26 Edit Distance Applications
- Spell checking
- Genomics
- Morphing / vision

27 Edit Distance Example
algorithm to logarithm?
- algorithm
- lgorithm (delete 'a')
- logorithm (insert 'o')
- logarithm (replace 'o' with 'a')
Edit distance = 3.

28 Edit Distance Subproblems?
What do we want?
- A small number of subproblems (polynomial?).
- We should be able to recover the solution easily from other, smaller subproblems.
Edit distance d(i, j) = edit distance between s[1..i] and t[1..j]?
Does it satisfy what we want?

29 Edit Distance
How can we solve the subproblem (i, j) by looking at smaller subproblems?
- Solve s[1..i] to t[1..j] using a minimum of d[i, j] operations,
- given the edit distances between s[1..i-1] and t[1..j-1] and ...
- What if we look at s[i] and t[j]?

30 Edit Distance Recursion
Delete s[i]. If we can transform s[1..i-1] to t[1..j] in k operations, then we can do the same operations on s[1..i] and then remove the original s[i] at the end, for k + 1 operations.
Insert t[j]. If we can transform s[1..i] to t[1..j-1] in k operations, then we can simply add t[j] afterwards to get t[1..j], for k + 1 operations.

31 Edit Distance Recursion
Replace. If we can transform s[1..i-1] to t[1..j-1] in k operations, we can do the same to s[1..i] and then substitute t[j] for the original s[i] at the end if necessary, requiring k + cost operations (cost = 0 if s[i] = t[j], else 1).

32 Base Cases
- To turn the empty string into t[1..j], do j inserts.
- To turn s[1..i] into the empty string, do i deletes.
d(i, 0) = i, for i = 1 to n (deletes)
d(0, j) = j, for j = 1 to m (inserts)
Can we fill up d(i, j) using this table and the recursion given? In what order? What is the space requirement? Running time?

33 Loop Assignment

d(0,0) = 0
for i = 1 to n
   for j = 1 to m
      d(i,j) = min( d(i-1,j) + 1,
                    d(i,j-1) + 1,
                    d(i-1,j-1) + ((s[i] == t[j]) ? 0 : 1) );
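
A runnable Python version of this fill order (an added sketch; Python strings are 0-indexed, so s[i-1] and t[j-1] play the roles of the slides' s[i] and t[j]):

def edit_distance(s, t):
    n, m = len(s), len(t)
    # d[i][j] = edit distance between s[:i] and t[:j]
    d = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i                                  # i deletes
    for j in range(1, m + 1):
        d[0][j] = j                                  # j inserts
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i][j] = min(d[i - 1][j] + 1,           # delete s[i]
                          d[i][j - 1] + 1,           # insert t[j]
                          d[i - 1][j - 1] + (0 if s[i - 1] == t[j - 1] else 1))  # replace
    return d[n][m]

print(edit_distance("kitten", "sitting"))   # 3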

34 Kitten → Sitting

         k  i  t  t  e  n
     0   1  2  3  4  5  6
  s  1   1  2  3  4  5  6
  i  2   2  1  2  3  4  5
  t  3   3  2  1  2  3  4
  t  4   4  3  2  1  2  3
  i  5   5  4  3  2  2  3
  n  6   6  5  4  3  3  2
  g  7   7  6  5  4  4  3

Edit distance between "kitten" and "sitting" = 3 (bottom-right entry).


36 6.5 RNA Secondary Structure

37 RNA Secondary Structure
RNA. String B = b_1 b_2 ... b_n over the alphabet { A, C, G, U }.
Secondary structure. RNA is single-stranded, so it tends to loop back and form base pairs with itself. This structure is essential for understanding the behavior of the molecule.
Ex: GUCGAUUGAGCGAAUGUAACAACGUGGCUACGGCGAGA
[Figure: the folded secondary structure of this string, with complementary base pairs A-U and C-G joined.]

38 RNA Secondary Structure
Secondary structure. A set of pairs S = { (b_i, b_j) } that satisfies:
- [Watson-Crick.] S is a matching, and each pair in S is a Watson-Crick complement: A-U, U-A, C-G, or G-C.
- [No sharp turns.] The ends of each pair are separated by at least 4 intervening bases: if (b_i, b_j) ∈ S, then i < j - 4.
- [Non-crossing.] If (b_i, b_j) and (b_k, b_l) are two pairs in S, then we cannot have i < k < j < l.
Free energy. The usual hypothesis is that an RNA molecule will form the secondary structure with the optimum total free energy (approximated here by the number of base pairs).
Goal. Given an RNA molecule B = b_1 b_2 ... b_n, find a secondary structure S that maximizes the number of base pairs.

39 RNA Secondary Structure: Examples
[Figure: three candidate structures for similar strings — one valid ("ok"), one violating the no-sharp-turns condition, and one with crossing base pairs.]

40 RNA Secondary Structure: Subproblems
First attempt. OPT(j) = maximum number of base pairs in a secondary structure of the substring b_1 b_2 ... b_j.
Difficulty. Matching b_t with b_n results in two sub-problems:
- finding a secondary structure in b_1 b_2 ... b_{t-1}, which is OPT(t-1), and
- finding a secondary structure in b_{t+1} b_{t+2} ... b_{n-1}, which does not start at b_1 — we need more sub-problems.

41 Dynamic Programming Over Intervals
Notation. OPT(i, j) = maximum number of base pairs in a secondary structure of the substring b_i b_{i+1} ... b_j.
- Case 1. If i ≥ j - 4:
  - OPT(i, j) = 0 by the no-sharp-turns condition.
- Case 2. Base b_j is not involved in a pair:
  - OPT(i, j) = OPT(i, j-1).
- Case 3. Base b_j pairs with b_t for some i ≤ t < j - 4 such that b_t and b_j are Watson-Crick complements:
  - the non-crossing constraint decouples the resulting sub-problems, so
  - OPT(i, j) = 1 + max_t { OPT(i, t-1) + OPT(t+1, j-1) }, taking the max over all such t.
Remark. The same core idea appears in the CKY algorithm for parsing context-free grammars.

42 Bottom-Up Dynamic Programming Over Intervals
Q. In what order should we solve the sub-problems?
A. Do shortest intervals first.

RNA(b_1, ..., b_n) {
   for k = 5, 6, ..., n-1
      for i = 1, 2, ..., n-k
         j = i + k
         Compute M[i, j] using the recurrence
   return M[1, n]
}

Running time. O(n³).
[Figure: the M[i, j] table, filled in order of increasing interval length.]
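
A Python sketch of this interval DP (an added illustration of the recurrence, often called the Nussinov algorithm; M is kept 1-indexed to match the slides):

def rna_opt_pairs(b):
    # b: RNA string over {A, C, G, U}; M[i][j] = OPT(i, j)
    wc = {("A", "U"), ("U", "A"), ("C", "G"), ("G", "C")}
    n = len(b)
    M = [[0] * (n + 2) for _ in range(n + 2)]   # short intervals stay 0
    for k in range(5, n):                        # interval length j - i
        for i in range(1, n - k + 1):
            j = i + k
            best = M[i][j - 1]                   # case: b_j is unpaired
            for t in range(i, j - 4):            # case: b_j pairs with b_t, i <= t < j - 4
                if (b[t - 1], b[j - 1]) in wc:
                    best = max(best, 1 + M[i][t - 1] + M[t + 1][j - 1])
            M[i][j] = best
    return M[1][n] if n >= 1 else 0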

43 Dynamic Programming Summary
Recipe.
- Characterize the structure of the problem.
- Recursively define the value of an optimal solution.
- Compute the value of an optimal solution.
- Construct an optimal solution from the computed information.
Dynamic programming techniques.
- Binary choice: weighted interval scheduling.
- Multi-way choice: segmented least squares.
- Adding a new variable: knapsack.
- Dynamic programming over intervals: RNA secondary structure.
Remarks.
- Top-down vs. bottom-up: different people have different intuitions.
- The Viterbi algorithm for HMMs also uses DP to optimize a maximum-likelihood tradeoff between parsimony and accuracy.
- The CKY parsing algorithm for context-free grammars has a similar structure.

