CSCI 256 Data Structures and Algorithm Analysis, Lecture 14. Some slides by Kevin Wayne, copyright © 2005 Pearson Addison-Wesley, all rights reserved, and some by Iker Gondra.

Elements of DP

From an engineering perspective, when should we look for a DP solution to a problem?
– Optimal substructure: The first step in solving an optimization problem by DP is to characterize the structure of an optimal solution. A problem exhibits optimal substructure if an optimal solution to the problem contains within it optimal solutions to subproblems
– Overlapping subproblems: The space of subproblems must be “small” in the sense that a recursive algorithm for the problem solves the same subproblems over and over again, rather than always generating new subproblems. Typically, the total number of distinct subproblems is polynomial in the input size. DP algorithms take advantage of this by solving each subproblem once and storing the solution in a table
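
As a quick illustration of overlapping subproblems (a minimal sketch, not from the slides): the naive Fibonacci recursion solves the same subproblems exponentially often, while a memo table, the core DP device, solves each distinct subproblem exactly once.

    # Minimal sketch (illustrative, not the lecture's example): the naive
    # recursion recomputes subproblems; the memoized version stores each
    # answer in a table so every distinct subproblem is solved once.
    from functools import lru_cache

    def fib_naive(n):
        return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)   # exponential

    @lru_cache(maxsize=None)
    def fib_memo(n):
        return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)     # linear

    print(fib_memo(80))   # instant; fib_naive(80) would never finish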

Least Squares

Least squares:
– Foundational problem in statistics and numerical analysis
– Given n points in the plane: (x_1, y_1), (x_2, y_2), ..., (x_n, y_n)
– Find a line y = ax + b that minimizes the sum of the squared errors, SSE = Σ_i (y_i - a·x_i - b)^2
– Solution: calculus. The minimum error is achieved when

  a = (n·Σ_i x_i·y_i - (Σ_i x_i)(Σ_i y_i)) / (n·Σ_i x_i^2 - (Σ_i x_i)^2),   b = (Σ_i y_i - a·Σ_i x_i) / n
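
A hedged, runnable rendering of the closed-form fit above (function and variable names are my own, not from the slides):

    # Sketch: closed-form least squares line fit, following the formula above.
    # Assumes at least two points with distinct x values.
    def fit_line(xs, ys):
        n = len(xs)
        sx, sy = sum(xs), sum(ys)
        sxy = sum(x * y for x, y in zip(xs, ys))
        sxx = sum(x * x for x in xs)
        a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        b = (sy - a * sx) / n
        sse = sum((y - a * x - b) ** 2 for x, y in zip(xs, ys))
        return a, b, sse

    a, b, err = fit_line([1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8])
    print(a, b, err)   # roughly a = 2, b = 0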

Least Squares Solution? Sensible?? (Figure: a single fitted line through points that follow several distinct linear trends.)

Segmented Least Squares

Segmented least squares (first attempt):
– Points lie roughly on a sequence of several line segments
– Given n points in the plane (x_1, y_1), (x_2, y_2), ..., (x_n, y_n) with x_1 < x_2 < ... < x_n, find a sequence of lines that minimizes the SSE, which we could call the error

Segmented Least Squares: How many line segments should we choose?

To optimize, we want to assess a penalty both for a larger number of segments and for the error, the squared deviations of the points from their corresponding lines. The penalty of a partition is the sum of:
– the number of segments into which we partition the points, times a given multiplier c
– for each segment, the error value of the optimal line through that segment

This is a partitioning problem. It is an important problem in data mining and statistics known as change detection: given a sequence of data points, identify a few points in the sequence at which a discrete change occurs (in this case, a change from one linear approximation to another)

Segmented Least Squares

Goal in the Segmented Least Squares problem: find a partition of minimum penalty

What is the optimal linear interpolation with two line segments?

Optimal interpolation with two segments

Give an equation for the error of the optimal interpolation (having minimal least squares error) through p_1, ..., p_n with two line segments. Let E_{i,j} be the least squares error for the optimal line through p_i, ..., p_j (DONE IN CLASS: choose the breakpoint i that minimizes E_{1,i} + E_{i+1,n})

What is the optimal linear interpolation with three line segments?

Optimal interpolation with three segments

Give an equation for the error of the optimal interpolation (having minimal least squares error) through p_1, ..., p_n with three line segments. Let E_{i,j} be the least squares error for the optimal line through p_i, ..., p_j. We need to find the i and j which minimize E_{1,i} + E_{i+1,j} + E_{j+1,n}. (Note we haven't included a penalty term accounting for the number of segments.) Can we do this recursively?
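
A small hypothetical sketch of that minimization, assuming a table E[i][j] of segment errors (1-indexed, as in the slides) is already available; the nested search over breakpoints is exactly the redundancy the DP will later avoid:

    # Sketch: brute-force search for the two breakpoints i < j that minimize
    # E[1][i] + E[i+1][j] + E[j+1][n]. Assumes n >= 3 and a precomputed,
    # 1-indexed error table E.
    def best_three_segments(E, n):
        best, breaks = float("inf"), None
        for i in range(1, n - 1):        # last point of the first segment
            for j in range(i + 1, n):    # last point of the second segment
                cost = E[1][i] + E[i + 1][j] + E[j + 1][n]
                if cost < best:
                    best, breaks = cost, (i, j)
        return best, breaks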

What is the optimal linear interpolation with n line segments?

Segmented Least Squares

Segmented least squares:
– Points lie roughly on a sequence of several line segments
– Given n points in the plane (x_1, y_1), (x_2, y_2), ..., (x_n, y_n) with x_1 < x_2 < ... < x_n, find a sequence of lines that minimizes f(x)

Question: What's a reasonable choice for f(x) to balance accuracy (goodness of fit) and parsimony (number of lines)?

Segmented Least Squares

Segmented least squares:
– Points lie roughly on a sequence of several line segments
– Given n points in the plane (x_1, y_1), (x_2, y_2), ..., (x_n, y_n) with x_1 < x_2 < ... < x_n, find a sequence of lines that minimizes the sum E of the sums of the squared errors in each segment, together with the number of lines L

Tradeoff function: E + cL, for some constant c > 0

Optimal substructure property

An optimal solution with k line segments extends an optimal solution with k-1 line segments on a smaller problem

DP: Multiway Choice

Notation:
– OPT[j] = minimum cost for points p_1, p_2, ..., p_j
– E_{i,j} = minimum sum of squares for points p_i, p_{i+1}, ..., p_j

Give a recursive definition for OPT[j]

Notation. OPT[j] = minimum cost for points p_1, p_2, ..., p_j. E_{i,j} = minimum sum of squares for points p_i, p_{i+1}, ..., p_j.

To compute OPT[j]:
– The last segment uses points p_i, p_{i+1}, ..., p_j for some i
– Cost = E_{i,j} + c + OPT[i-1]
– Which i??? Try them all and take the minimum:

OPT[j] = min over 1 ≤ i ≤ j of (E_{i,j} + c + OPT[i-1])

Segmented Least Squares: Algorithm (can be improved to O(n^2) by pre-computing various statistics)

INPUT: n, p_1, ..., p_n, c

Segmented-Least-Squares() {
    OPT[0] = 0
    for j = 1 to n
        for i = 1 to j
            compute the least squares error E_{i,j} for the segment p_i, ..., p_j
        endfor
    endfor
    for j = 1 to n
        OPT[j] = min over 1 ≤ i ≤ j of (E_{i,j} + c + OPT[i-1])
    endfor
    return OPT[n]
}
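
A runnable Python rendering of the algorithm (my own sketch, not from the slides), using the closed-form line fit from earlier to compute each E_{i,j}; it runs in O(n^3) as written:

    # Sketch: segmented least squares DP. points is a list of (x, y) pairs
    # sorted by x; c is the per-segment penalty. 0-indexed internally:
    # E[i][j] = error of the best line through points i..j, and
    # OPT[j] = minimum penalty for the first j points.
    def segment_error(points, i, j):
        pts = points[i:j + 1]
        m = len(pts)
        sx = sum(x for x, _ in pts); sy = sum(y for _, y in pts)
        sxy = sum(x * y for x, y in pts); sxx = sum(x * x for x, _ in pts)
        denom = m * sxx - sx * sx
        if denom == 0:                 # single point: zero error by convention
            return 0.0
        a = (m * sxy - sx * sy) / denom
        b = (sy - a * sx) / m
        return sum((y - a * x - b) ** 2 for x, y in pts)

    def segmented_least_squares(points, c):
        n = len(points)
        E = [[0.0] * n for _ in range(n)]
        for j in range(n):
            for i in range(j + 1):
                E[i][j] = segment_error(points, i, j)
        OPT = [0.0] * (n + 1)
        for j in range(1, n + 1):
            # last segment covers points i..j-1 at cost E[i][j-1] + c
            OPT[j] = min(E[i][j - 1] + c + OPT[i] for i in range(j))
        return OPT[n]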

Total Running Time: O(n^3)

Computing E_{i,j} for the O(n^2) pairs, at O(n) per pair using the previous formula:
– this gives O(n^3) to compute all E_{i,j} values

Following this, the algorithm has n iterations, for values j = 1, ..., n; for each value of j we have to compute the minimum in the recurrence to fill the array entry OPT[j]; this takes O(n) for each j:
– this part gives O(n^2)

Remark: there is an exercise in the text which shows how to reduce the total running time from O(n^3) to O(n^2)
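
One way to get the O(n^2) bound (a hedged sketch of the pre-computation idea; the textbook exercise may organize it differently): build prefix sums of x, y, xy, x^2, and y^2 once, after which each E_{i,j} follows from the closed-form fit in O(1) time.

    # Sketch: O(1) per-pair segment error via prefix sums. Arrays are
    # 1-indexed so S[j] - S[i-1] sums over points i..j.
    def build_prefix_sums(points):
        n = len(points)
        Sx, Sy, Sxy, Sxx, Syy = ([0.0] * (n + 1) for _ in range(5))
        for k, (x, y) in enumerate(points, start=1):
            Sx[k] = Sx[k - 1] + x;  Sy[k] = Sy[k - 1] + y
            Sxy[k] = Sxy[k - 1] + x * y
            Sxx[k] = Sxx[k - 1] + x * x;  Syy[k] = Syy[k - 1] + y * y
        return Sx, Sy, Sxy, Sxx, Syy

    def segment_error_fast(prefix, i, j):
        Sx, Sy, Sxy, Sxx, Syy = prefix
        m = j - i + 1
        sx = Sx[j] - Sx[i - 1];  sy = Sy[j] - Sy[i - 1]
        sxy = Sxy[j] - Sxy[i - 1];  sxx = Sxx[j] - Sxx[i - 1]
        syy = Syy[j] - Syy[i - 1]
        denom = m * sxx - sx * sx
        if denom == 0:
            return 0.0
        a = (m * sxy - sx * sy) / denom
        b = (sy - a * sx) / m
        # SSE expanded so only the prefix sums are needed:
        # sum (y - ax - b)^2 = Syy + a^2 Sxx + m b^2 - 2a Sxy - 2b Sy + 2ab Sx
        return (syy + a * a * sxx + m * b * b
                - 2 * a * sxy - 2 * b * sy + 2 * a * b * sx)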

Determining the solution

When OPT[j] is computed, record the value of i that minimized the sum. Store this value in an auxiliary array and use it to reconstruct the solution.

Determining the solution

Find-Segments(j)
    if j = 0 then
        output nothing
    else
        find the i that minimizes E_{i,j} + c + OPT[i-1]
        output the segment {p_i, ..., p_j} and the result of Find-Segments(i-1)
    endif
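
In Python, the traceback can simply re-find the minimizing i from the E and OPT tables built earlier (a hedged sketch matching the 0-indexed implementation above):

    # Sketch: reconstruct the optimal segments for the first j points,
    # returned as 0-indexed (start, end) pairs.
    def find_segments(E, OPT, c, j):
        if j == 0:
            return []
        i = min(range(j), key=lambda i: E[i][j - 1] + c + OPT[i])
        return find_segments(E, OPT, c, i) + [(i, j - 1)]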