
1
So You Think Quantum Computing Is Bunk? Scott Aaronson (MIT) | "You measurin' ME?"

2
Quantum Computing When I first heard about QC (around 1996), I was certain it was bunk! But to find the catch, I'd first have to figure out what the deal was with quantum mechanics itself…

3
Quantum Mechanics in 1 Slide
Like probability theory, but over the complex numbers.
Probability Theory: linear transformations that conserve the 1-norm of probability vectors: stochastic matrices.
Quantum Mechanics: linear transformations that conserve the 2-norm of amplitude vectors: unitary matrices.
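The two norm-preservation rules on this slide can be checked directly with NumPy. A minimal sketch (my own illustration, not from the talk):

```python
import numpy as np

# Probability theory: a stochastic matrix (columns sum to 1)
# preserves the 1-norm of a probability vector.
S = np.array([[0.9, 0.2],
              [0.1, 0.8]])
p = np.array([0.5, 0.5])
assert np.isclose(np.sum(S @ p), 1.0)

# Quantum mechanics: a unitary matrix preserves the 2-norm
# of a complex amplitude vector.  (Here: the Hadamard gate.)
U = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)
psi = np.array([0.6, 0.8j])   # amplitudes with |0.6|^2 + |0.8j|^2 = 1
assert np.isclose(np.linalg.norm(U @ psi), 1.0)
```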

4
Interference: The Source of All Quantum Weirdness
Possible states of a single quantum bit, or qubit: α|0⟩ + β|1⟩, where the amplitudes α, β are complex numbers satisfying |α|² + |β|² = 1.
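One concrete way to see interference: apply the Hadamard transform to |0⟩ twice. The first application creates an equal superposition; in the second, the two computational paths leading to |1⟩ carry amplitudes +1/2 and -1/2 and cancel. A short NumPy sketch (my own illustration):

```python
import numpy as np

H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)   # Hadamard transform
ket0 = np.array([1.0, 0.0])            # the state |0>

once = H @ ket0     # (|0> + |1>)/sqrt(2): equal superposition
twice = H @ once    # the two paths to |1> carry +1/2 and -1/2 and cancel

assert np.allclose(twice, ket0)        # interference returns us to |0>
```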

5
Measurement
Measurement is a destructive process: if you ask α|0⟩ + β|1⟩ whether it's |0⟩ or |1⟩, it answers |0⟩ with probability |α|² and |1⟩ with probability |β|². And it sticks with its answer from then on!
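The measurement rule for a single qubit is easy to simulate classically. A sketch (the function name `measure` is mine) showing both the Born-rule probabilities and the "sticky" collapse:

```python
import numpy as np

rng = np.random.default_rng(42)

def measure(psi):
    """Measure alpha|0> + beta|1> in the standard basis.
    Returns the outcome and the collapsed post-measurement state."""
    probs = np.abs(psi) ** 2              # Born rule: |alpha|^2, |beta|^2
    outcome = rng.choice([0, 1], p=probs)
    collapsed = np.zeros(2)
    collapsed[outcome] = 1.0              # state collapses to |outcome>
    return outcome, collapsed

psi = np.array([0.5, np.sqrt(0.75)])      # |alpha|^2 = 0.25, |beta|^2 = 0.75
outcome, post = measure(psi)

# "It sticks with its answer from then on": re-measuring the collapsed
# state always repeats the first outcome.
assert measure(post)[0] == outcome
```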

6
The deep mystery of QM: who decides when a "measurement" happens?
Product state of two qubits: |ψ⟩⊗|φ⟩. Entangled state (can't be written as a product state): e.g., (|00⟩ + |11⟩)/√2.
An outsider's view ("Many Worlds? Or Many Words?"): the qubit simply gets entangled with your own body (and lots of other stuff), so that it collapses to |0⟩ or |1⟩ relative to you.

7
Quantum Computing: Quantum Mechanics on Steroids
A general entangled state of n qubits requires ~2^n amplitudes to specify, which presents an obvious practical problem when using conventional computers to simulate quantum mechanics.
Feynman 1981: So then why not turn things around, and build computers that themselves exploit superposition?
Shor 1994: Such a computer could do more than simulate QM; e.g., it could factor integers in polynomial time.
Interesting…
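To make the "~2^n amplitudes" point concrete, here is a quick back-of-the-envelope sketch (my own numbers, assuming 16 bytes per complex128 amplitude):

```python
import numpy as np

# Memory needed to store a general n-qubit state vector classically.
for n in (10, 20, 30, 50):
    amplitudes = 2 ** n
    gigabytes = amplitudes * 16 / 1e9      # complex128 = 16 bytes
    print(f"{n:2d} qubits -> {amplitudes:>20,} amplitudes (~{gigabytes:.2g} GB)")

# Explicitly building the uniform superposition over all 2**20 basis states:
n = 20
state = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=np.complex128)
assert state.size == 2 ** n
assert np.isclose(np.linalg.norm(state), 1.0)
```

Already at 50 qubits the state vector outgrows any existing classical memory, which is exactly Feynman's point.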

8
Where we are: a QC has now factored 21 into 3 × 7, with high probability (Martín-López et al. 2012). Why is scaling up so hard? Decoherence!
The famous Fault-Tolerance Theorem suggests we "only" need to get decoherence down to some finite level (~1% per qubit per gate time?) to do arbitrarily long quantum computations.
Many discussions of the feasibility of QC focus entirely on the Fault-Tolerance Theorem and its assumptions. My focus is different! For I take it as obvious that, if QC is impossible, there must exist a deeper explanation than "such-and-such error-correction schemes might not work against every conceivable kind of noise."

9
A few physicists and computer scientists remain vocally skeptical that scalable QC is possible… (and perhaps a much larger number are silently skeptical?): 't Hooft, Kalai, Goldreich, Wolfram, Alicki, Dyakonov, Levin.
One historical analogy: people thought Charles Babbage's idea was cute but would never work in practice. And they were right, for ~130 years!

10
3 Skeptical Positions and My Responses
1. The difficulties are immense! QC might not be practical for a very long time (and will have limited applications even if built). My response: Agreement.
2. QC will fail because quantum mechanics itself is wrong. My response: Awesome! A revolution in physics, even better than QC. Count me in.
3. Quantum mechanics is fine, but QC won't work because of some principle of unavoidable noise (?) on top of QM. My response: Also wonderful! Explain your principle, why it's true, and why it kills QC. Does it imply a fast classical simulation of realistic quantum systems?
Skeptical position I won't address in this talk: BPP = BQP.

11
Common Reasons for QC Skepticism
1. Sounds too good to be true / like science fiction. Response: Would any science-fiction writer have imagined a computer that solved factoring, discrete log, and a few other special problems, but not NP-complete problems?
2. Annoyance at hype/misrepresentations in the popular press. Response: Tell me about it…
3. The Extended Church-Turing Thesis rules out QC. Response: The ECT was a CS encroachment onto physics turf… we can't cry foul if physics counterattacks us!
4. n qubits couldn't possibly encode 2^n bits.
5. Underlying skepticism of QM itself (or modern physics in general?).

12
The "2^n Bits Is Too Many" Argument
Shouldn't we search for a more reasonable theory that agrees with QM on existing experiments, but:
- Lets us feasibly prepare only a singly-exponential number of states, not a doubly-exponential number?
- Predicts that in a volume of size n, only poly(n) bits can be reliably stored and retrieved, not exp(n) bits?
- Lets us summarize the results of exp(n) possible measurements on an n-qubit state using only poly(n) classical bits?
- Predicts that n-qubit states should be PAC-learnable with only poly(n) samples, not exp(n)?
Such a theory exists! It's called quantum mechanics.

13
OK, but suppose QC is impossible. Obvious question: what's the criterion that tells us which quantum-mechanical experiments can be done, and which ones can't?
Possibility 1: Precision in Amplitudes. "The major problem is the requirement that basic quantum equations hold to multi-hundredth if not millionth decimal positions where the significant digits of the relevant quantum amplitudes reside. We have never seen a physical law valid to over a dozen decimals… Are quantum amplitudes still complex numbers to such accuracies or do they become quaternions, colored graphs, or sick-humored gremlins?" (Leonid Levin)
Obvious Response:

14
Possibility 2: OK, small amplitudes might be fine for separable states, but entanglement is an illusion.
Obvious Response: the Bell Inequality (and its experimental violation).

15
Possibility 3: Fine, 2 or 3 particles might be entangled, but a thousand particles could never be entangled!
That doesn't work either: consider high-temperature superconductors, or the buckyball double-slit experiment.

16
Needed: a "Sure/Shor separator" (A. 2004): a principled line between the many-particle quantum states we're sure we can create and those that suffice for things like Shor's algorithm.

17
My Candidate: Tree Size (the minimum size of a formula, built from superpositions and tensor products of single-qubit states, that represents the state)
Symmetrized states of n identical fermions/bosons can be shown to have tree size n^Ω(log n) (using the breakthrough lower bound of [Raz 2004] on the multilinear formula size of the permanent and determinant).
The n^Ω(log n) lower bound probably also holds for 2D and 3D spin lattices. (Indeed, in all these cases, the true tree size is probably exp(n).)
But this doesn't work either!

18
God, Dice, Yadda Yadda
A completely different way quantum mechanics might not be the whole story: what if there were deeper, underlying physical laws, and quantum mechanics was merely a statistical tool derivable from those laws?
Recently, I became interested in ψ-epistemic theories, an attempt to formalize the above Einsteinian impulse…
Note: if quantum mechanics were exactly derivable, this still wouldn't kill QC! But maybe it could tell us where to look for a breakdown?

19
A d-dimensional ψ-Epistemic Theory is defined by:
- A set Λ of ontic states ("ontic" = philosopher-speak for "real")
- For each pure state |ψ⟩ ∈ H_d, a probability measure μ_ψ over ontic states
- For each orthonormal basis B = (v_1, …, v_d) and i ∈ [d], a response function R_{i,B}: Λ → [0,1], satisfying Conservation of Probability and the Born Rule
Can trivially satisfy these axioms by setting Λ = H_d, μ_ψ = the point measure concentrated on |ψ⟩ itself, and R_{i,B}(λ) = |⟨v_i|λ⟩|². Gives a completely uninteresting restatement of quantum mechanics (called the Beltrametti-Bugajski theory).
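The two axioms named above are not spelled out in the transcript (they were equations on the slide); reconstructed from the standard definition used in this line of work, they read:

```latex
% Conservation of Probability: for every basis B and every ontic state,
% the outcome probabilities sum to 1
\sum_{i=1}^{d} R_{i,B}(\lambda) = 1 \quad \text{for all } \lambda \in \Lambda

% Born Rule: averaging the response function over the ontic distribution
% reproduces the usual quantum outcome probabilities
\int_{\Lambda} R_{i,B}(\lambda)\, d\mu_{\psi}(\lambda) = \left|\langle v_i | \psi \rangle\right|^2
```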

20
More Interesting Example: Kochen-Specker Theory
Response functions R_{i,B}(λ): deterministically return the basis vector closest to |λ⟩. Accounts beautifully for one qubit ψ-epistemically! (One qutrit: already a problem…)
Observation: if ⟨ψ|φ⟩ = 0, then μ_ψ and μ_φ can't overlap.
Call the theory maximally nontrivial if (as above) μ_ψ and μ_φ overlap whenever |ψ⟩ and |φ⟩ are not orthogonal.

21
PBR (Pusey-Barrett-Rudolph 2011) No-Go Theorem
Suppose we assume μ_{ψ⊗φ} = μ_ψ × μ_φ (ψ-epistemic theories must "behave well" under tensor product). Then there's a 2-qubit entangled measurement M, such that the only way to explain M's behavior on the 4 states |0⟩|0⟩, |0⟩|+⟩, |+⟩|0⟩, |+⟩|+⟩ is using a trivial theory that doesn't mix |0⟩ and |+⟩. (Can be generalized to any pair of states, not just |0⟩ and |+⟩.)
Bell's Theorem: can't locally simulate all separable measurements on a fixed entangled state.
PBR Theorem: can't locally simulate a fixed entangled measurement on all separable states (at least, nontrivially so).

22
But suppose we drop PBR's tensor assumption. Then:
Theorem (A.-Bouland-Chua-Lowther 2013): There's a maximally-nontrivial ψ-epistemic theory in any finite dimension d. (Albeit an extremely weird one!) Solves the main open problem of Lewis et al. 2012.
Ideas of the construction:
- Cover H_d with ε-nets, for all ε = 1/n
- "Mix" the states in pairs of small balls (B_ψ, B_φ), where |ψ⟩, |φ⟩ both belong to some ε-net ("mix" = make their ontic distributions overlap)
- To mix all non-orthogonal states, take a convex combination of countably many such theories

23
On the other hand, suppose we want our theory to be symmetric.
Theorem (ABCL 2013): There's no symmetric, maximally-nontrivial ψ-epistemic theory in dimensions d ≥ 3.
Our proof, in the general case, uses some measure theory and differential geometry (and strangely, currently works only with complex amplitudes, not real ones).

24
If scalable QC is indeed possible, are there any experiments that could help demonstrate that, short of actually building a general-purpose QC? Some possibilities:
- Keep 1 qubit coherent for an extremely long time (current record: ~15 minutes in ion traps)
- Quantum adiabatic optimization (the D-Wave approach)
- BosonSampling (and other restricted QC proposals)

25
BosonSampling [A.-Arkhipov 2011]
For when you only need your QC to overthrow the Extended Church-Turing Thesis, not do anything useful.
n identical photons are generated, sent through a network of beamsplitters, then measured to see where they are. The result: a sample from a distribution {p_x}, such that each probability p_x equals |Per(A_x)|², for some known n × n complex matrix A_x. (Permanent: famous #P-complete problem.)
Theorem: A classical computer can't sample the same distribution in polynomial time, unless P^#P = BPP^NP. We conjecture that this extends even to approximate/noisy classical simulations.
Leads to a beautiful complexity-theoretic open problem: is it #P-complete to approximate Per(A), with high probability over an n × n matrix A of independent N(0,1) Gaussians?
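For a feel of the quantity behind the outcome probabilities: the permanent can be computed exactly, in exponential time (as expected for a #P-hard quantity), via Ryser's formula. A NumPy sketch (the function name `permanent` is mine):

```python
import itertools
import numpy as np

def permanent(A):
    """Permanent of an n x n matrix via Ryser's formula, O(2^n * n^2):
    Per(A) = (-1)^n * sum over nonempty column subsets S of
             (-1)^|S| * prod_i sum_{j in S} A[i, j]
    """
    n = A.shape[0]
    total = 0j
    for r in range(1, n + 1):
        for cols in itertools.combinations(range(n), r):
            row_sums = A[:, cols].sum(axis=1)   # sum_{j in S} A[i, j] for each row i
            total += (-1) ** r * np.prod(row_sums)
    return (-1) ** n * total

A = np.array([[1, 2],
              [3, 4]], dtype=complex)
assert np.isclose(permanent(A), 1 * 4 + 2 * 3)   # 2x2 permanent: ad + bc = 10

# A BosonSampling outcome probability has the form |Per(A_x)|^2:
p_x = abs(permanent(A)) ** 2
```

Unlike the determinant, no row-reduction trick applies, which is why even this brute-force sum over subsets is essentially the state of the art.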

26
Recent BosonSampling demonstrations with 3-4 photons [Broome et al., Tillmann et al., Walmsley et al., Crespi et al.] If this could be scaled to ~20-30 photons, it would probably BosonSample faster than a classical simulation of itself… Main engineering challenge: Deterministic generation of single photons, for synchronized arrival at the detectors

27
Conclusion
I don't know for sure that scalable QC is possible. But I do know that the popular framing of the question gets it exactly backwards!
Believing that QC can work doesn't make you a starry-eyed visionary, but a scientific conservative.
Doubting that QC can work doesn't make you a cautious realist, but a scientific radical.
