PROBABILISTIC TURING MACHINES
Stephany Coffman-Wolph
Wednesday, March 28, 2007


PROBABILISTIC TURING MACHINE
There are several popular definitions:
- A nondeterministic Turing Machine (TM) that randomly chooses between available transitions at each point according to some probability distribution
- A type of nondeterministic TM where each nondeterministic step is called a coin-flip step and has two legal next moves
- A Turing Machine in which some transitions are random choices among finitely many alternatives
Also known as a Randomized Turing Machine

TM SPECIFICS
There are (at least) three tapes:
- 1st tape holds the input
- 2nd tape (also known as the random tape) is covered randomly (and independently) with 0's and 1's: probability 1/2 of a 0 and probability 1/2 of a 1
- 3rd tape is used as the scratch tape
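The three-tape setup can be modeled as a short sketch (an illustrative Python model added here, not part of the original slides; the function names are hypothetical): the "random tape" is a lazily generated stream of fair bits that the machine consumes one bit at a time at each coin-flip step.

```python
import random

def random_tape():
    """The 'random tape': an endless stream of independent fair bits,
    each 0 or 1 with probability 1/2."""
    while True:
        yield random.randint(0, 1)

def run_probabilistic_tm(input_string, step, max_steps=1000):
    """Drive a machine whose transition function may consume one bit
    of the random tape per move (a coin-flip step).

    step(state, scratch, bit) -> (next_state, scratch); the states
    'accept' and 'reject' halt the machine.
    """
    tape = random_tape()
    state, scratch = "start", list(input_string)  # scratch tape starts as a copy of the input
    for _ in range(max_steps):
        if state in ("accept", "reject"):
            return state
        state, scratch = step(state, scratch, next(tape))
    return "no-halt"  # a probabilistic TM need not halt on every branch

# A toy machine: accept iff its first coin flip comes up 1.
def coin_machine(state, scratch, bit):
    return ("accept" if bit == 1 else "reject"), scratch

print(run_probabilistic_tm("abba", coin_machine))
```

Running the toy machine twice may give different answers on the same input, which is exactly the behavior the later slides describe.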

WHEN A PROBABILISTIC TM RECOGNIZES A LANGUAGE
- It accepts all strings in the language
- It rejects all strings not in the language
- However, a probabilistic TM will have some probability of error

PROBABILISTIC TM FACTS
- Each "branch" in the TM's computation has a probability
- Results can be stochastic; hence, on a given input the machine:
  - May have different run times
  - May not halt
  - May accept the input in one execution but reject it in another
- Time and space complexity are measured using the worst-case computation branch

PROBABILISTIC ALGORITHM
- Also known as a randomized algorithm
- An algorithm designed to use the outcome of a random process; in other words, part of the algorithm's logic uses randomness
- Often the algorithm has access to a pseudo-random number generator
- The algorithm uses random bits to help make choices (in hope of getting better performance)
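As a concrete illustration (a sketch added here, not taken from the slides), randomized quickselect uses one random choice per recursive call to find the k-th smallest element in expected linear time on any input ordering:

```python
import random

def quickselect(items, k):
    """Return the k-th smallest element of items (k is 0-based).

    The pivot is drawn uniformly at random, so the expected running
    time is O(n) regardless of how the input is ordered.
    """
    pivot = random.choice(items)                  # the random choice
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    if k < len(smaller):
        return quickselect(smaller, k)
    if k < len(smaller) + len(equal):
        return pivot
    return quickselect(larger, k - len(smaller) - len(equal))

print(quickselect([5, 2, 9, 1, 7], 2))  # the median of the list: 5
```

The random bits do not change the answer here, only the running time; algorithms whose answer itself is random appear later in the deck.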

WHY USE PROBABILISTIC ALGORITHMS?
Probabilistic algorithms are useful because:
- It may be too time consuming to calculate the "best" answer exactly
- Estimation could introduce an unwanted bias that invalidates the results
For example, random sampling:
- Random sampling is used to obtain information about individuals in a large population
- Asking everyone would take too long
- Querying a non-randomly selected subset might influence (or bias) the results

BPP
Bounded-error Probabilistic Polynomial time
Definition: the class of languages that are recognized by a probabilistic polynomial-time TM with an error probability of 1/3 (or less)
Put another way: the class of languages for which a probabilistic TM halts in polynomial time with the correct accept or reject answer at least 2/3 of the time

A PROBLEM IN BPP:
- Can be solved by an algorithm that is allowed to make random decisions (called coin-flips)
- Is guaranteed to run in polynomial time
- On a given run of the algorithm, there is at most a 1/3 probability of giving an incorrect answer
Such algorithms are known as probabilistic algorithms

WHY 1/3?
- Actually, this choice is arbitrary; it can be any constant strictly between 0 and 1/2 (as long as it is independent of the input)
- Why? If the algorithm is run many times, the probability that the probabilistic TM is wrong the majority of the time decreases exponentially
- Therefore, these algorithms can be made more accurate by running them several times (and then taking the majority vote of the results)

TO ILLUSTRATE THE CONCEPT
- Let the error probability be 1/3
- We have a box containing many red and blue balls: 2/3 of the balls are one color and 1/3 are the other (but we don't know which color is the 2/3 and which is the 1/3)
- To find out, we take samples at random and keep track of which color ball we pulled from the box
- The color that comes up most frequently during a large sampling is most likely the majority color in the box
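The balls-in-a-box experiment can be simulated directly (an illustrative sketch added here, not from the slides): each draw reports the true majority color with probability 2/3, and a majority vote over many draws is wrong only exponentially rarely.

```python
import random

def noisy_draw():
    """One sample from the box: the true majority color ('red') comes
    up with probability 2/3, the minority color otherwise."""
    return "red" if random.random() < 2 / 3 else "blue"

def majority_vote(samples):
    """Guess the majority color by taking many noisy draws and
    voting; more samples make a wrong vote exponentially unlikely."""
    draws = [noisy_draw() for _ in range(samples)]
    return max(("red", "blue"), key=draws.count)

random.seed(0)           # fixed seed so the demo is repeatable
print(majority_vote(1001))
```

With around a thousand draws the vote essentially always identifies "red", mirroring how repeated runs of a BPP machine drive its error probability down.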

HOW THIS RELATES...
- The blue and red balls correspond to branches in a probabilistic (polynomial-time) TM; let's call it M1
- We can assign each color: red = accepting, blue = rejecting
- The sampling can be done by running M1 from another probabilistic TM (let's call it M2) with a better error probability
- M2's error probability is exponentially small if it runs M1 a polynomial number of times and outputs the result that occurs most often

FORMALLY:
- Let the error probability ε be a fixed constant strictly between 0 and 1/2
- Let poly(n) be any polynomial
- Then a probabilistic polynomial-time TM M1 that operates with error probability ε has an equivalent probabilistic polynomial-time TM M2 with error probability 2^-poly(n)

RP
Randomized Polynomial time
A class of problems that can be decided in polynomial time on a probabilistic TM with the following properties:
- If the correct answer is no, it always returns no
- If the correct answer is yes, it returns yes with probability at least 1/2; otherwise, it returns no
Formally: the class of languages for which membership can be determined in polynomial time by a probabilistic TM with no false acceptances and where fewer than half of the rejections are false rejections

FACTS ABOUT RP
- If the algorithm returns a yes answer, then yes is the correct answer
- If the algorithm returns a no answer, then it may or may not be correct
- The 1/2 in the definition is arbitrary; as we saw with the BPP class, running the algorithm for additional repetitions decreases the chance of it giving a wrong answer
- Such an algorithm is often referred to as a Monte Carlo algorithm (or Monte Carlo Turing Machine)

MONTE CARLO ALGORITHM
- A numerical method used to find solutions to problems that cannot easily be solved using standard numerical methods
- Often relies on random (or pseudo-random) numbers
- Is stochastic or nondeterministic in some manner

CO-RP
A class of problems that can be decided in polynomial time on a probabilistic TM with the following properties:
- If the correct answer is yes, it always returns yes
- If the correct answer is no, it returns no with probability at least 1/2; otherwise, it returns yes
In other words:
- If the algorithm returns a no answer, then no is the correct answer
- If the algorithm returns a yes answer, then it may or may not be correct

ZPP
Zero-error Probabilistic Polynomial time
The class of languages for which a probabilistic TM halts in polynomial time with no false acceptances or rejections, but sometimes gives an "I don't know" answer
In other words:
- Whenever it returns a yes or no answer, that answer is guaranteed correct
- It might instead return an "I don't know" answer

FACTS ABOUT ZPP
- The running time is unbounded, but it is polynomial on average (for any input); the machine is expected to halt in polynomial time
- The definition is similar to that of P, except that ZPP allows the TM to use randomness, and the expected running time is measured (instead of the worst case)
- Such an algorithm is often referred to as a Las Vegas algorithm (or Las Vegas Turing Machine)

LAS VEGAS ALGORITHM
- A randomized algorithm that never gives an incorrect result: it either produces a correct result or fails
- Therefore, it is said that the algorithm "does not gamble" with its result; it only "gambles" with the resources used for computation
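Randomized quicksort is the textbook Las Vegas example (a sketch added for illustration, not from the slides): the output is always correctly sorted, and only the running time depends on the random pivot choices.

```python
import random

def randomized_quicksort(items):
    """Las Vegas sort: the result is always correct; the random
    pivot only affects how long the computation takes (expected
    O(n log n), worst case O(n^2) with vanishing probability)."""
    if len(items) <= 1:
        return list(items)
    pivot = random.choice(items)
    left = [x for x in items if x < pivot]
    middle = [x for x in items if x == pivot]
    right = [x for x in items if x > pivot]
    return randomized_quicksort(left) + middle + randomized_quicksort(right)

print(randomized_quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```

Contrast this with the Monte Carlo style: here the algorithm gambles with time, never with correctness.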

L, ¬L, AND ZPP
- If L is in ZPP, then ¬L is in ZPP (where ¬L represents the complement of L)
- Why? Suppose L is accepted by a TM M that witnesses membership in ZPP. We can alter M to accept ¬L by:
  - Turning each acceptance by M into halting without acceptance
  - Wherever M halted without accepting before, instead accepting and halting

RELATIONSHIP BETWEEN RP AND ZPP
ZPP = RP ∩ co-RP
Proof Part 1: RP ∩ co-RP is contained in ZPP
- Let L be a language recognized by RP algorithm A and co-RP algorithm B, and let w be an input string
- Run w on A. If A returns yes, the answer must be yes
- If A returns no, run w on B. If B returns no, then the answer must be no. Otherwise, repeat
- On each round, only one of the algorithms can give a wrong answer, and it does so with probability at most 1/2, so the chance of reaching the k-th repetition shrinks exponentially in k. Therefore, the expected running time is polynomial
- Hence, RP ∩ co-RP is contained in ZPP

RELATIONSHIP BETWEEN RP AND ZPP
ZPP = RP ∩ co-RP
Proof Part 2: ZPP is contained in RP ∩ co-RP
- Let C be an algorithm in ZPP
- Construct an RP algorithm using C: run C for (at least) double its expected running time. If it gives an answer, that must be the answer. If it doesn't give an answer before being stopped, answer no
- The chance that C produces an answer before it is stopped is at least 1/2, fitting the definition of an RP algorithm
- The co-RP algorithm is almost identical, except that it answers yes whenever C fails to produce an answer in time
- Therefore, ZPP is contained in RP ∩ co-RP
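The Part 1 construction can be sketched concretely (illustrative Python added here; the subroutines rp_A and corp_B are hypothetical stand-ins for the RP and co-RP algorithms): keep running both one-sided tests until one of them gives an answer it can never get wrong.

```python
import random

def zpp_decide(w, rp_A, corp_B):
    """Combine an RP algorithm A and a co-RP algorithm B for the same
    language L into a zero-error (Las Vegas) decider.

    rp_A(w):   a 'yes' is always correct; a 'no' may be wrong.
    corp_B(w): a 'no' is always correct; a 'yes' may be wrong.
    Each round one of the two is certain with probability >= 1/2,
    so the expected number of rounds is constant.
    """
    while True:
        if rp_A(w) == "yes":    # RP never falsely accepts
            return "yes"
        if corp_B(w) == "no":   # co-RP never falsely rejects
            return "no"
        # Neither side was certain this round: repeat.

# Toy stand-ins for L = {strings of even length}, each with
# one-sided error probability 1/2.
def rp_A(w):
    in_L = len(w) % 2 == 0
    return "yes" if in_L and random.random() < 0.5 else "no"

def corp_B(w):
    in_L = len(w) % 2 == 0
    return "no" if not in_L and random.random() < 0.5 else "yes"

print(zpp_decide("ab", rp_A, corp_B))   # always "yes" ("ab" has even length)
```

The loop can in principle run for many rounds, but never forever in expectation, which is exactly the ZPP trade: certainty of the answer, randomness only in the running time.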

WHAT WE CAN ALSO CONCLUDE
From the proof of ZPP = RP ∩ co-RP we can conclude that:
- ZPP ⊆ RP
- ZPP ⊆ co-RP

RELATIONSHIP BETWEEN P AND ZPP
P ⊆ ZPP
Proof: any deterministic, polynomial-time-bounded TM is also a probabilistic TM that simply ignores the special feature allowing it to make random choices

RELATIONSHIP BETWEEN NP AND RP
RP ⊆ NP
Proof:
- Let M1 be a probabilistic TM in RP for language L
- Construct a nondeterministic TM M2 for L, bounded by the same polynomial
- When M1 examines a random bit for the first time, M2 nondeterministically chooses both possible values for the bit and writes its choice on a tape
- M2 accepts whenever M1 accepts, and does not accept otherwise

RELATIONSHIP BETWEEN NP AND RP
Proof continued:
- Let w be in L. M1 accepts w with probability at least 1/2, so there must be some sequence of bits on the random tape that leads to the acceptance of w
- M2 can choose that sequence of bits and accept. Thus, w is in the language of M2
- If w is not in L, then no sequence of random bits makes M1 accept, so M2 cannot choose a sequence of bits that leads to acceptance. Thus, w is not in the language of M2

DIAGRAM SHOWING RELATIONSHIP OF PROBLEM CLASSES
[Diagram: P inside ZPP; ZPP inside both RP and co-RP; RP inside NP; co-RP inside co-NP]

WHERE DOES BPP FIT IN?
- It is still an open question whether NP is a subset of BPP or BPP is a subset of NP
- However, RP is a subset of BPP: running an RP algorithm twice pushes its one-sided error probability from 1/2 down to 1/4, below the 1/3 bound

DIAGRAM SHOWING RELATIONSHIP OF PROBLEM CLASSES
[Diagram: P inside ZPP; ZPP inside RP; RP inside BPP]

WHY STUDY PROBABILISTIC TM?
To attempt to answer the question: does randomness add power?
Or, put another way: are there problems that can be solved by a probabilistic TM in polynomial time but cannot be solved by a deterministic TM in polynomial time?

RESOURCES
- Introduction to the Theory of Computation by Michael Sipser, PWS Publishing Company, 1997
- Introduction to Automata Theory, Languages, and Computation by Hopcroft, Motwani, and Ullman, Pearson Education, Inc., 2006
- An Introduction to the Theory of Computation by Eitan Gurari, Computer Science Press, 1989
- "Dictionary of Algorithms and Data Structures", NIST website
- "Probabilistic Turing machine", "Randomized algorithm", "BPP", "ZPP", "RP", "Monte Carlo algorithm", and "Las Vegas algorithm", Wikipedia website