Giorgi Japaridze, Theory of Computability
Probabilistic algorithms (Section 10.2)
10.2.a Definition of probabilistic Turing machines

Definition 10.3 A probabilistic Turing machine M is a type of nondeterministic TM in which each nondeterministic step is called a coin-flip step and has two legal next moves. We assign a probability to each branch b of M's computation on input w as follows. Define the probability of branch b to be

  Pr[b] = 2^(-k),

where k is the number of coin-flip steps that occur on branch b. We define the probability that M accepts w to be

  Pr[M accepts w] = Σ Pr[b], the sum taken over all accepting branches b.

In other words, the probability that M accepts w is the probability that we would reach an accepting configuration if we simulated M on w by flipping a fair coin to determine which move to follow at each coin-flip step. We let

  Pr[M rejects w] = 1 − Pr[M accepts w].
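The branch-sum definition above can be computed exactly by exploring the computation tree. A minimal sketch, not from the slides: a branch is represented by the tuple of coin outcomes along it, and `run` is an assumed encoding of a machine that maps such a tuple to 'accept', 'reject', or 'flip' (meaning the branch takes another coin-flip step).

```python
from fractions import Fraction

def accept_probability(run, max_flips=20):
    """Pr[M accepts w] = sum of Pr[b] = 2^(-k) over all accepting
    branches b, where k is the number of coin-flip steps on b.
    `run` maps a tuple of coin outcomes to 'accept', 'reject',
    or 'flip' (the branch needs one more coin flip)."""
    def explore(flips):
        outcome = run(flips)
        if outcome == 'accept':
            return Fraction(1)
        if outcome == 'reject':
            return Fraction(0)
        if len(flips) >= max_flips:
            return Fraction(0)  # cut off unreasonably long branches
        # coin-flip step: each of the two legal next moves has probability 1/2
        return (explore(flips + (0,)) + explore(flips + (1,))) / 2
    return explore(())

def toy_machine(flips):
    """Hypothetical machine: flips two coins and rejects only
    when both come up 1, so Pr[accept] = 3/4."""
    if len(flips) < 2:
        return 'flip'
    return 'reject' if flips[:2] == (1, 1) else 'accept'

print(accept_probability(toy_machine))  # 3/4
```

Using exact rationals (`Fraction`) keeps the computed probability in the form 2^(-k) sums rather than floating-point approximations.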
10.2.b Example

[State diagram of a small probabilistic TM lost in extraction: a start state with coin-flip steps (transitions on 0 and the blank symbol, each move taken with probability 1/2) leading to accept and reject states.]

What is the probability that 0 is accepted? 75%
What is the probability that 00 is rejected? 100%
The language {0} is recognized with what error probability (see next slide)? 25%
Any other language is recognized with what error probability (see next slide)? 100%
10.2.c The class BPP

For 0 ≤ ε < ½, we say that M recognizes language A with error probability ε if the probability that we would obtain the wrong answer by simulating M is at most ε. That is:
1. w ∈ A implies Pr[M accepts w] ≥ 1 − ε, and
2. w ∉ A implies Pr[M rejects w] ≥ 1 − ε.

We also consider error probability bounds that depend on the input length n. For example, error probability ε = 2^(−n) indicates an exponentially small probability of error.

Definition 10.4 BPP is the class of languages that are recognized by probabilistic polynomial time TMs with an error probability of 1/3.

Instead of 1/3, any ε strictly between 0 and ½ would yield an equivalent definition, by virtue of the amplification lemma (on the next slide). It gives a simple way of making the error probability exponentially small. Note that a probabilistic algorithm with an error probability of 2^(−100) is far more likely to give an erroneous result because the computer on which it runs has a hardware failure than because of an unlucky toss of its coins.
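Conditions 1 and 2 can be checked empirically by simulating the machine many times and counting accepting runs. A sketch under stated assumptions: `toy_decider` is a hypothetical probabilistic decider for the language {0} that accepts '0' with probability 3/4 (and anything else with probability 1/4), so its error probability is 1/4 < 1/3; the function names are illustrative, not from the text.

```python
import random

def toy_decider(w, rng):
    """Hypothetical probabilistic decider for the language {0}:
    flips two fair coins; both coming up heads is the unlucky outcome."""
    both_heads = rng.random() < 0.5 and rng.random() < 0.5
    if w == '0':
        return not both_heads   # accepts '0' with probability 3/4
    return both_heads           # accepts anything else with probability 1/4

def estimate_accept_probability(machine, w, trials=20000, seed=1):
    """Monte Carlo estimate of Pr[M accepts w]: simulate M on w,
    flipping a fair coin at each coin-flip step."""
    rng = random.Random(seed)
    return sum(machine(w, rng) for _ in range(trials)) / trials

# Both estimates land near 3/4, so both BPP conditions hold with
# error probability about 1/4, which is at most 1/3.
print(estimate_accept_probability(toy_decider, '0'))       # Pr[accept '0']
print(1 - estimate_accept_probability(toy_decider, '00'))  # Pr[reject '00']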
10.2.d The amplification lemma

Lemma 10.5 Let ε be a fixed constant strictly between 0 and ½, and let p(n) be any polynomial. Then any probabilistic polynomial time TM M1 that operates with error probability ε has an equivalent probabilistic polynomial time TM M2 that operates with an error probability of 2^(−p(n)).

Proof idea: M2 simulates M1 by running it a polynomial number of times and taking the majority vote of the outcomes. The probability of error decreases exponentially with the number of runs of M1 made.
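The proof idea can be made quantitative with an exact binomial computation: if each run errs independently with probability ε, the majority vote over n runs errs exactly when more than half the runs are wrong. A sketch (the function name is mine, not from the lemma):

```python
from fractions import Fraction
from math import comb

def majority_error(eps, runs):
    """Exact probability that the majority vote over `runs`
    independent runs is wrong, when each run errs with probability
    eps (use an odd number of runs to avoid ties)."""
    eps = Fraction(eps)
    return sum(comb(runs, i) * eps**i * (1 - eps)**(runs - i)
               for i in range(runs // 2 + 1, runs + 1))

# Starting from the BPP error bound eps = 1/3, the error of the
# majority vote shrinks rapidly as the number of runs grows:
for n in (1, 25, 101):
    print(n, float(majority_error(Fraction(1, 3), n)))
```

Since each extra pair of runs multiplies the error bound by a constant factor less than 1, polynomially many runs suffice for an error of 2^(−p(n)), matching the lemma.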
10.2.e Open problems surrounding BPP

Besides the problems in P, which are obviously in BPP, many problems were known to be in BPP but not known to be in P. The number of such problems is decreasing, and it is conjectured that P = BPP. For a long time, one of the most famous problems known to be in BPP but not known to be in P was PRIMES. However, in 2002, Agrawal and his students showed that PRIMES ∈ P.

The relationship between BPP and NP is unknown: it is not known whether BPP is a subset of NP, whether NP is a subset of BPP, or whether they are incomparable. BPP is known to be a subset of PSPACE; whether the converse inclusion also holds is unknown. It is also known that either P = BPP or P ≠ NP (or both).