Design and Analysis of Approximation Algorithms

Presentation transcript:

Design and Analysis of Approximation Algorithms Ding-Zhu Du

Textbook Ding-Zhu Du and Ker-I Ko, Design and Analysis of Approximation Algorithms (Lecture Notes), Chapters 1-8.

Schedule: Introduction; Greedy Strategy; Restriction; Partition; Guillotine Cut; Relaxation; Linear Programming; Local Ratio; Semi-definite Programming.

Rules You may discuss the 5 homework assignments with each other, but do not copy from each other. Each homework is worth 10 points; the 4 top scores will be counted. The midterm exam (take-home) is worth 30 points. The final exam (in class) is worth 30 points. The final grade is based on the total points (A ≥ 80; 80 > B ≥ 60; 60 > C ≥ 40).

Chapter 1 Introduction: Computational Complexity (background); Approximation Performance Ratio; Early results.

Computability: Deterministic Turing Machine; Nondeterministic Turing Machine; Church-Turing Thesis.

Deterministic Turing Machine (DTM) [Figure: a tape, a tape head, and a finite control]

[Figure: a right-infinite tape whose cells hold symbols.] The tape has a left end but is infinite to the right. It is divided into cells. Each cell contains a symbol from an alphabet Γ. There is a special symbol B which represents an empty cell.

The head scans a cell on the tape and can read, erase, or write a symbol in that cell. In each move, the head can move to the right cell or to the left cell (or stay in the same cell).


The finite control has finitely many states, which form a set Q. For each move, the state is changed according to the evaluation of a transition function δ : Q x Γ → Q x Γ x {R, L}.

δ(q, a) = (p, b, L) means that if the head reads symbol a and the finite control is in the state q, then the next state should be p, the symbol a should be changed to b, and the head moves one cell to the left.

δ(q, a) = (p, b, R) means that if the head reads symbol a and the finite control is in the state q, then the next state should be p, the symbol a should be changed to b, and the head moves one cell to the right.

There are two special states: an initial state s and a final state h. Initially, the DTM is in the initial state and the head scans the leftmost cell. The tape holds an input string.

When the DTM reaches the final state h, it stops. An input string x is accepted by the DTM if the DTM reaches the final state h. Otherwise, the input string is rejected.

The DTM can be represented by M = (Q, Σ, Γ, δ, s), where Σ is the alphabet of input symbols. The set of all strings accepted by a DTM M is denoted by L(M). We also say that the language L(M) is accepted by M.

The transition diagram of a DTM is an alternative way to represent the DTM. For M = (Q, Σ, Γ, δ, s), the transition diagram of M is an edge-labeled digraph G = (V, E) with V = Q (the initial state s and the final state h are marked specially) and an edge from p to q labeled a/b,D whenever δ(p, a) = (q, b, D).

M = (Q, Σ, Γ, δ, s) where Q = {s, p, q, h}, Σ = {0, 1}, Γ = {0, 1, B}, and δ is given by the table below. [Figure: the corresponding transition diagram, with edges labeled a/b,D.] L(M) = (0+1)*00(0+1)*.
δ   0          1          B
s   (p, 0, R)  (s, 1, R)  -
p   (q, 0, R)  (s, 1, R)  -
q   (q, 0, R)  (q, 1, R)  (h, B, R)
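
To see how this transition table drives a computation, here is a small Python sketch of a DTM simulator (an illustration added here, not part of the lecture notes); the dictionary delta encodes exactly the table above, and the function name dtm_accepts is chosen only for this example.

```python
# A minimal DTM simulator (illustrative sketch, not from the lecture notes).
# delta encodes the machine above, which accepts L(M) = (0+1)*00(0+1)*,
# i.e., the binary strings containing "00".

BLANK = 'B'

# delta[(state, symbol)] = (next state, written symbol, head direction)
delta = {
    ('s', '0'): ('p', '0', 'R'),
    ('s', '1'): ('s', '1', 'R'),
    ('p', '0'): ('q', '0', 'R'),
    ('p', '1'): ('s', '1', 'R'),
    ('q', '0'): ('q', '0', 'R'),
    ('q', '1'): ('q', '1', 'R'),
    ('q', BLANK): ('h', BLANK, 'R'),
}

def dtm_accepts(x, start='s', final='h', max_steps=10000):
    """Run the DTM on input x; return True iff it reaches the final state h."""
    tape = dict(enumerate(x))            # sparse right-infinite tape
    head, state = 0, start
    for _ in range(max_steps):           # guard against non-halting runs
        if state == final:
            return True
        symbol = tape.get(head, BLANK)
        if (state, symbol) not in delta:
            return False                 # no move defined: reject
        state, written, direction = delta[(state, symbol)]
        tape[head] = written
        head += 1 if direction == 'R' else -1
        if head < 0:                     # fell off the left end of the tape
            return False
    return False

if __name__ == '__main__':
    for word in ['1001', '0101', '00', '']:
        print(repr(word), dtm_accepts(word))
```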

Nondeterministic Turing Machine (NTM) [Figure: a tape, a tape head, and a finite control]

[Figure: a right-infinite tape whose cells hold symbols.] The tape has a left end but is infinite to the right. It is divided into cells. Each cell contains a symbol from an alphabet Γ. There is a special symbol B which represents an empty cell.

The finite control has finitely many states, which form a set Q. For each move, the state is changed according to the evaluation of a transition function δ : Q x Γ → 2^{Q x Γ x {R, L}}.

Church-Turing Thesis Computability is Turing-Computability.

Multi-tape DTM: an input tape (read only); storage tapes; an output tape (possibly write only).

Time of TM Time_M(x) = # of moves that TM M takes on input x. Time_M(x) < ∞ iff x ∈ L(M).

Space Space_M(x) = # of cells that M visits on the work (storage) tapes during the computation on input x. If M is a multi-tape DTM, then the work tapes do not include the input tape and the write-only output tape.

Time Bound M is said to have a time bound t(n) if for every x with |x| ≤ n, Time_M(x) ≤ max {n+1, t(n)}.

Complexity Class A language L has a (deterministic) time-complexity t(n) if there is a multitape TM M accepting L, with time bound t(n). DTIME(t(n)) = {L(M) | DTM M has a time bound t(n)} NTIME(t(n)) = {L(M) | NTM M has a time bound t(n)}

P = ∪_{c>0} DTIME(n^c)          NP = ∪_{c>0} NTIME(n^c)

NP Class

Earlier Results on Approximations: Vertex Cover; Traveling Salesman Problem; Knapsack Problem.

Performance Ratio

Constant-Approximation A c-approximation is a polynomial-time approximation satisfying 1 ≤ approx(input)/opt(input) ≤ c for a minimization problem, or 1 ≤ opt(input)/approx(input) ≤ c for a maximization problem.

Vertex Cover Given a graph G=(V,E), find a minimum subset C of vertices such that every edge is incident to a vertex in C.

Vertex-Cover The vertex set of a maximal matching gives a 2-approximation, i.e., approx / opt ≤ 2.
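
A minimal Python sketch of this matching-based algorithm; the edge-list input format and the function name are choices made for this illustration, not taken from the notes.

```python
# 2-approximation for Vertex Cover via a maximal matching (illustrative sketch).
# Both endpoints of every matched edge go into the cover.  No two matched edges
# share a vertex, so any cover (in particular an optimal one) must contain at
# least one endpoint per matched edge; hence |cover| <= 2 * opt.

def vertex_cover_2approx(edges):
    """edges: iterable of (u, v) pairs.  Returns a vertex cover of size <= 2*opt."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:   # edge not covered yet:
            cover.update((u, v))                # take both endpoints
    return cover

if __name__ == '__main__':
    # A 5-cycle with one chord; an optimal cover has 3 vertices.
    print(vertex_cover_2approx([(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (1, 3)]))
```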

Traveling Salesman Given n cities with a distance table, find a tour of minimum total distance that visits each city exactly once.

Traveling Salesman with triangular inequality Traveling around a minimum spanning tree is a 2-approximation.
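
A Python sketch of this tree-traversal heuristic (added here as an illustration): build a minimum spanning tree with Prim's algorithm, walk it in preorder, and shortcut the repeated cities. The instance in the usage example is simply a set of points in the plane, so the triangle inequality holds.

```python
# Metric TSP 2-approximation (illustrative sketch): travel around a minimum
# spanning tree and shortcut repeated cities.  Assumes dist is a symmetric
# matrix satisfying the triangle inequality.

def mst_tsp_2approx(dist):
    n = len(dist)
    # Prim's algorithm; children[] records the tree for the later traversal.
    in_tree = [False] * n
    best = [float('inf')] * n
    parent = [-1] * n
    best[0] = 0.0
    children = {v: [] for v in range(n)}
    for _ in range(n):
        u = min((v for v in range(n) if not in_tree[v]), key=lambda v: best[v])
        in_tree[u] = True
        if parent[u] != -1:
            children[parent[u]].append(u)
        for v in range(n):
            if not in_tree[v] and dist[u][v] < best[v]:
                best[v], parent[v] = dist[u][v], u
    # Preorder walk of the tree = "traveling around" the MST with shortcuts.
    tour, stack = [], [0]
    while stack:
        u = stack.pop()
        tour.append(u)
        stack.extend(reversed(children[u]))
    cost = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
    return tour, cost

if __name__ == '__main__':
    import math
    pts = [(0, 0), (0, 3), (4, 3), (4, 0), (2, 1)]
    dist = [[math.dist(p, q) for q in pts] for p in pts]
    print(mst_tsp_2approx(dist))
```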

Traveling Salesman with Triangular Inequality Minimum spanning tree + minimum-length perfect matching on the odd-degree vertices is a 1.5-approximation (Christofides' algorithm).

The minimum perfect matching on the odd-degree vertices has weight at most 0.5·opt, because an optimal tour, shortcut to the odd-degree vertices, splits into two perfect matchings on them.
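
For completeness, a prototype of the 1.5-approximation built on the networkx library (a library choice made here, not prescribed by the notes): the minimum-weight perfect matching on the odd-degree vertices is obtained by flipping the weights and asking networkx for a maximum-weight, maximum-cardinality matching.

```python
# Sketch of the MST + matching (Christofides-style) 1.5-approximation using
# networkx.  Assumes G is a complete undirected graph whose 'weight' attribute
# satisfies the triangle inequality.
import networkx as nx

def mst_plus_matching_tsp(G):
    # 1. Minimum spanning tree.
    T = nx.minimum_spanning_tree(G, weight='weight')
    # 2. Vertices of odd degree in the tree (there is always an even number of them).
    odd = [v for v, d in T.degree() if d % 2 == 1]
    # 3. Minimum-weight perfect matching on the odd-degree vertices: networkx only
    #    offers max_weight_matching, so flip the weights (max_w + 1 - w) and force
    #    maximum cardinality.
    max_w = max(G[u][v]['weight'] for i, u in enumerate(odd) for v in odd[i + 1:])
    H = nx.Graph()
    for i, u in enumerate(odd):
        for v in odd[i + 1:]:
            H.add_edge(u, v, weight=max_w + 1.0 - G[u][v]['weight'])
    M = nx.max_weight_matching(H, maxcardinality=True, weight='weight')
    # 4. Tree edges + matching edges give even degrees, so an Eulerian circuit exists.
    multi = nx.MultiGraph(T)
    multi.add_edges_from(M)
    circuit = nx.eulerian_circuit(multi, source=next(iter(G.nodes)))
    # 5. Shortcut repeated vertices (valid by the triangle inequality).
    tour, seen = [], set()
    for u, _ in circuit:
        if u not in seen:
            seen.add(u)
            tour.append(u)
    n = len(tour)
    cost = sum(G[tour[i]][tour[(i + 1) % n]]['weight'] for i in range(n))
    return tour, cost

if __name__ == '__main__':
    import itertools, math
    pts = [(0, 0), (0, 3), (4, 3), (4, 0), (2, 1)]
    G = nx.Graph()
    for i, j in itertools.combinations(range(len(pts)), 2):
        G.add_edge(i, j, weight=math.dist(pts[i], pts[j]))
    print(mst_plus_matching_tsp(G))
```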

Lower Bound [Figure: an example instance with edge lengths 1 and 1+ε]

Traveling Salesman without Triangular Inequality Theorem: For any constant c > 0, TSP has no c-approximation unless NP = P. Given a graph G=(V,E), define a distance table on V as follows:

Contradiction Argument Suppose a c-approximation exists. Then we have a polynomial-time algorithm to solve Hamiltonian Cycle as follows: the c-approximation solution is < cn if and only if G has a Hamiltonian cycle.
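
A sketch of the reduction this argument relies on, with the details hedged: edges of G get distance 1 and non-edges get a prohibitively large distance; the value cn + 1 used below is one common choice, since the exact table from the slide above is not in the transcript.

```python
# Reduction sketch: a c-approximation for general TSP would decide Hamiltonian
# Cycle in polynomial time.  Edges of G get distance 1; non-edges get c*n + 1
# (one common choice -- the exact value on the original slide is not available).

def tsp_instance_from_graph(n, edges, c):
    """Distance table on vertices 0..n-1 built from the graph G = (V, E)."""
    big = c * n + 1
    edge_set = {frozenset(e) for e in edges}
    return [[0 if i == j else (1 if frozenset((i, j)) in edge_set else big)
             for j in range(n)] for i in range(n)]

def has_hamiltonian_cycle_via(approx_tsp, n, edges, c):
    """approx_tsp(dist) is assumed to return a (tour, cost) pair and to be a
    c-approximation.  If G has a Hamiltonian cycle then opt = n and the returned
    cost is <= c*n; otherwise every tour uses a non-edge and costs > c*n."""
    _, cost = approx_tsp(tsp_instance_from_graph(n, edges, c))
    return cost <= c * n
```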

Knapsack

2-approximation
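
The formulas for this slide are not in the transcript, so here is a sketch of the standard greedy 2-approximation for Knapsack (sort by value density, fill greedily, and return the better of the greedy packing and the single most valuable item); it may differ in detail from the variant in the notes. Here c[i] is the value of item i, w[i] its size, and S the knapsack size.

```python
# Standard 2-approximation for 0/1 Knapsack (illustrative sketch).

def knapsack_2approx(c, w, S):
    # Items larger than the knapsack can never be used.
    items = [i for i in range(len(c)) if w[i] <= S]
    items.sort(key=lambda i: c[i] / w[i], reverse=True)   # by value density
    chosen, total_w, total_c = [], 0, 0
    for i in items:
        if total_w + w[i] <= S:
            chosen.append(i)
            total_w += w[i]
            total_c += c[i]
    # Greedy value + value of the first item that did not fit is >= opt, so the
    # better of "greedy packing" and "best single item" is >= opt / 2.
    best_single = max(items, key=lambda i: c[i], default=None)
    if best_single is not None and c[best_single] > total_c:
        return [best_single], c[best_single]
    return chosen, total_c

if __name__ == '__main__':
    c, w, S = [60, 100, 120], [10, 20, 30], 50
    print(knapsack_2approx(c, w, S))   # optimum is 220 (items 1 and 2)
```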

PTAS A problem has a PTAS (polynomial-time approximation scheme) if for any ε > 0, it has a (1+ε)-approximation.

Knapsack has a PTAS Classify: for i ≤ m, c_i ≤ a = εc_G (the small items); sort them by value density; for each choice of the large items, complete the packing greedily.
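
The classification details above are only partly readable in the transcript, so the sketch below uses one standard knapsack PTAS (try every set of at most ⌈1/ε⌉ items and complete each guess greedily by value density); it achieves ratio 1 + ε but is not necessarily the exact algorithm of the notes.

```python
# A standard PTAS for 0/1 Knapsack (illustrative sketch): enumerate every subset
# of at most k = ceil(1/eps) items, complete each guess greedily by value density,
# and keep the best packing.  Running time O(n^(k+1)): polynomial for fixed eps.
from itertools import combinations
from math import ceil

def knapsack_ptas(c, w, S, eps):
    n = len(c)
    k = ceil(1 / eps)
    by_density = sorted(range(n), key=lambda i: c[i] / w[i], reverse=True)
    best_set, best_val = [], 0
    for r in range(min(k, n) + 1):
        for T in combinations(range(n), r):
            weight = sum(w[i] for i in T)
            if weight > S:
                continue                       # this guess does not fit
            value = sum(c[i] for i in T)
            chosen = list(T)
            for i in by_density:               # greedy completion of the guess
                if i not in T and weight + w[i] <= S:
                    chosen.append(i)
                    weight += w[i]
                    value += c[i]
            if value > best_val:
                best_set, best_val = chosen, value
    return best_set, best_val

if __name__ == '__main__':
    c, w, S = [60, 100, 120], [10, 20, 30], 50
    print(knapsack_ptas(c, w, S, eps=0.5))     # already finds the optimum 220
```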

Proof.

Time

Fully PTAS A problem has a fully PTAS (FPTAS) if for any ε > 0, it has a (1+ε)-approximation running in time poly(n, 1/ε).

Fully PTAS for Knapsack

Pseudo Polynomial-time Algorithm for Knapsack Initially,

Time Outside loop: O(n). Inside loop: O(nM), where M = max c_i. Core: O(n log(MS)). Total: O(nM log(MS)). Since the input size is O(n log(MS)), this is a pseudo-polynomial-time algorithm, because M = 2^{log M} is exponential in the length of the binary encoding of M.
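
Since the recurrence and loops behind these bounds are only partly visible in the transcript, here is a sketch of the usual value-indexed dynamic program (minimum total size needed to reach each value) together with the ε-scaling step that turns it into a fully polynomial-time approximation scheme; the variable names and the choice K = εM/n are assumptions made for this illustration.

```python
# Pseudo-polynomial DP for 0/1 Knapsack: f[v] = minimum total size needed to
# collect value exactly v.  Runs in O(n * V) time with V = sum of the values,
# which is polynomial in the numbers but exponential in their bit length.

def knapsack_exact(c, w, S):
    V = sum(c)
    INF = float('inf')
    f = [0] + [INF] * V
    for ci, wi in zip(c, w):
        for v in range(V, ci - 1, -1):         # downward: each item used at most once
            if f[v - ci] + wi < f[v]:
                f[v] = f[v - ci] + wi
    return max(v for v in range(V + 1) if f[v] <= S)

# Fully PTAS sketch: scale the values down by K = eps * M / n (M = max value),
# solve the scaled instance exactly, and return the chosen items.  Assuming each
# item fits by itself, opt >= M, so the rounding loses at most n*K <= eps * opt,
# and the DP now runs in time polynomial in n and 1/eps.

def knapsack_fptas(c, w, S, eps):
    n = len(c)
    K = eps * max(c) / n
    scaled = [int(ci // K) for ci in c]
    V = sum(scaled)
    INF = float('inf')
    f = [(0, [])] + [(INF, [])] * V            # (min size, items achieving it)
    for idx, (ci, wi) in enumerate(zip(scaled, w)):
        for v in range(V, ci - 1, -1):
            if ci > 0 and f[v - ci][0] + wi < f[v][0]:
                f[v] = (f[v - ci][0] + wi, f[v - ci][1] + [idx])
    best = max(v for v in range(V + 1) if f[v][0] <= S)
    chosen = f[best][1]
    return chosen, sum(c[i] for i in chosen)

if __name__ == '__main__':
    c, w, S = [60, 100, 120], [10, 20, 30], 50
    print(knapsack_exact(c, w, S))             # 220
    print(knapsack_fptas(c, w, S, eps=0.1))    # ([1, 2], 220)
```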

Thanks, End

Lecture 3 Complexity of Approximation: L-reduction; two subclasses of PTAS; Set Cover ((ln n)-approximation); Independent Set (n^c-approximation).