CPSC 320: Intermediate Algorithm Design and Analysis, July 28, 2014.

1 CPSC 320: Intermediate Algorithm Design and Analysis July 28, 2014

2 Course Outline
- Introduction and basic concepts
- Asymptotic notation
- Greedy algorithms
- Graph theory
- Amortized analysis
- Recursion
- Divide-and-conquer algorithms
- Randomized algorithms
- Dynamic programming algorithms
- NP-completeness

3 Dynamic Programming

4 Dynamic Programming Components
- Analyse the structure of an optimal solution
  - Separate one choice (usually the last) from a subproblem
  - Phrase the value of a choice as a function of the choice and the subproblem
- Phrase an optimal solution as the value of the best choice
  - Usually a max/min result
- Implement the calculation of the optimal value
  - Memoization: save optimal values as we compute them
  - Bottom-up: evaluate smaller problems first and use them for bigger problems
  - Top-down: evaluate the big problem by calling smaller problems recursively, saving each result
  - Keep a record of the choice made at each level
- Rebuild the optimal solution from the recorded optimal-value choices
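The two implementation styles above can be contrasted on a small example. Fibonacci is not from the slides; it is used here only as a minimal sketch of top-down memoization versus bottom-up tabulation (function names are illustrative):

```python
from functools import lru_cache

# Top-down: recurse on subproblems, memoizing each result so it is
# computed only once.
@lru_cache(maxsize=None)
def fib_top_down(n):
    if n < 2:
        return n
    return fib_top_down(n - 1) + fib_top_down(n - 2)

# Bottom-up: solve the smallest subproblems first and build upward,
# reusing the stored values.
def fib_bottom_up(n):
    table = [0, 1]
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])
    return table[n]
```

Both run in O(n) time instead of the exponential time of naive recursion, because each subproblem is evaluated once.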

5 Knapsack Problem

6 Knapsack Algorithm - Complexity
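The knapsack slides' content is not captured in this transcript. As a hedged sketch of the standard 0-1 knapsack dynamic program (the function name and the one-dimensional table are choices of this example, not necessarily the slides' formulation), the optimal value can be computed in O(n * capacity) pseudo-polynomial time:

```python
def knapsack(values, weights, capacity):
    """0-1 knapsack: maximize total value subject to a weight limit.

    best[w] = best value achievable with capacity w using the items
    considered so far. O(n * capacity) time, O(capacity) space.
    """
    best = [0] * (capacity + 1)
    for v, wt in zip(values, weights):
        # Iterate capacities downward so each item is used at most once.
        for w in range(capacity, wt - 1, -1):
            best[w] = max(best[w], best[w - wt] + v)
    return best[capacity]
```

Note the complexity depends on the numeric value of `capacity`, not just the input length, which is why this is pseudo-polynomial rather than polynomial.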

7 Algorithm Strategies - Review
- Dynamic programming algorithms: the choice is made by evaluating all possible subproblem results
  - Time and space complexity are usually higher
- Greedy algorithms: the choice is made based on a locally optimal criterion
  - Usually faster, but may not yield a globally optimal solution
- Divide-and-conquer algorithms: the input is divided on the assumption that merging the subproblem results gives an optimal solution

8 Global Sequence Alignment Problem
Problem: given two sequences, analyse how similar they are, allowing both gaps and mismatches.
Applications:
- Suggesting corrections for misspelled words (comparing strings)
- Comparing files (diff)
- Analysing whether two pieces of DNA match
Example: “ocurrance” vs “occurrence”
- A letter “c” is missing (gap)
- An “a” was used instead of an “e” (mismatch)
- A mismatch may also be seen as a gap on both sides: “oc-urra-nce” vs “occurr-ence”

9 Formal Definition

10 Finding the Best Alignment

11 Algorithm (Smith-Waterman)

12 Algorithm (cont.)
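The algorithm slides' tables are not reproduced in this transcript. As one illustrative sketch of the alignment recurrence (not necessarily the slides' exact formulation), the minimum-cost global alignment with unit gap and mismatch penalties fills an (m+1) x (n+1) table in O(mn) time; the parameter names `gap` and `mismatch` are choices of this example:

```python
def alignment_cost(x, y, gap=1, mismatch=1):
    """Minimum-cost global alignment of strings x and y.

    cost[i][j] = optimal cost of aligning x[:i] with y[:j].
    Each cell takes the cheapest of: match/mismatch the last
    characters, gap in x, or gap in y. O(mn) time and space.
    """
    m, n = len(x), len(y)
    cost = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        cost[i][0] = i * gap          # align x[:i] against all gaps
    for j in range(1, n + 1):
        cost[0][j] = j * gap          # align y[:j] against all gaps
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            diag = cost[i - 1][j - 1] + (0 if x[i - 1] == y[j - 1] else mismatch)
            cost[i][j] = min(diag,
                             cost[i - 1][j] + gap,
                             cost[i][j - 1] + gap)
    return cost[m][n]
```

On the slide's example, “ocurrance” vs “occurrence”, the optimal alignment uses one gap (the missing “c”) and one mismatch (“a” for “e”), for a total cost of 2.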

13 Longest Common Subsequence

14 Characterizing the LCS

15 Algorithm

16 Algorithm (cont.)
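The LCS slides' content is not captured in the transcript. A hedged sketch of the standard longest-common-subsequence dynamic program (the function name is illustrative); the actual subsequence can be rebuilt afterwards by walking the table back from `L[m][n]`, following the choice made at each cell:

```python
def lcs_length(x, y):
    """Length of the longest common subsequence of x and y.

    L[i][j] = LCS length of x[:i] and y[:j].
    If the last characters match, extend the diagonal subproblem;
    otherwise take the better of dropping one character from x or y.
    O(mn) time and space.
    """
    m, n = len(x), len(y)
    L = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                L[i][j] = L[i - 1][j - 1] + 1
            else:
                L[i][j] = max(L[i - 1][j], L[i][j - 1])
    return L[m][n]
```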

17 NP Complexity

18 Time Complexity for Decision Problems
- From this point on we analyse the time complexity of problems, not algorithms
  - We want to know the best possible complexity for a problem
- Our focus now is on decision problems, not optimization problems
  - Decision problems ask for a Yes/No answer
  - Optimization problems ask to “find the best”, “find the maximum”, “find the minimum”
- We also need to distinguish “finding” a solution from “checking” one

19 Time Complexity - Classes

20 Example: Hamiltonian Path
- Problem: given a graph, is there a path that goes through every node exactly once?
  - Decision problem: the answer is yes or no
  - The optimization version (find such a path with minimum cost, etc.) is not required here
- Is this problem in NP? Given a path, can we verify that the path is correct in polynomial time?
- Is this problem in P? Can we solve it in polynomial time?
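The verification question on this slide has a concrete answer: checking a claimed path is easy even though finding one is not. A minimal sketch (the adjacency-set representation and function name are choices of this example):

```python
def verify_hamiltonian_path(adj, path):
    """Verify a claimed Hamiltonian path in polynomial time.

    adj: dict mapping each node to the set of its neighbours.
    The path must visit every node of the graph exactly once,
    moving only along existing edges.
    """
    # Exactly the graph's nodes, each appearing once.
    if len(path) != len(adj) or set(path) != set(adj):
        return False
    # Every consecutive pair must be an edge.
    return all(path[i + 1] in adj[path[i]] for i in range(len(path) - 1))
```

This check runs in O(n + e) time, which shows the Hamiltonian path decision problem is in NP; whether it is in P is exactly the open question the slide raises.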

21 Example: Satisfiability

22 NP-Complete Problems
- Turns out nobody knows (whether problems like Hamiltonian Path are in P)!
  - There is no known algorithm that runs in polynomial time
  - There is no proof that such an algorithm doesn’t exist
- NP-complete: the set of all decision problems that:
  - are in NP (a solution can be verified in polynomial time)
  - are at least as hard as any problem in NP
- NP-hard: the set of all problems for which the second condition applies
  - Includes non-decision problems (e.g., optimization)

23 Problem Reduction

24 NP-Complete

25 Graph Coloring
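The graph coloring slide's content is not captured in the transcript. As with Hamiltonian path, a proposed k-coloring can be verified in polynomial time, which places the decision version of graph coloring in NP. A minimal sketch (the edge-list and dict representations are choices of this example):

```python
def verify_coloring(edges, coloring, k):
    """Check that `coloring` is a valid k-coloring of the graph.

    edges: iterable of (u, v) pairs; coloring: dict node -> colour.
    Valid means at most k distinct colours are used and no edge
    joins two nodes of the same colour.
    """
    if len(set(coloring.values())) > k:
        return False
    return all(coloring[u] != coloring[v] for u, v in edges)
```

The check is linear in the number of edges; finding such a coloring for k >= 3, by contrast, is a classic NP-complete problem.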