Quantum Gibbs Samplers


1 Quantum Gibbs Samplers
Fernando G.S.L. Brandão QuArC, Microsoft Research Q-QuArC Retreat, 2015

2 Dynamical Properties
Hamiltonian: H = Σij Hij (sum of local terms Hij)
State at time t: |ψ(t)> = e-iHt |ψ(0)>
Compute: expectation values <ψ(t)| A |ψ(t)> and temporal correlations <ψ(0)| A(t) B(0) |ψ(0)>
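As a concrete toy illustration of these quantities, the following numpy sketch exactly diagonalizes a small spin chain and computes an expectation value and a temporal correlation function (the 3-site Hamiltonian and observables here are hypothetical choices for illustration, not models from the talk):

```python
import numpy as np
from scipy.linalg import expm

# Pauli matrices
I2 = np.eye(2); X = np.array([[0, 1], [1, 0]]); Z = np.diag([1.0, -1.0])

def kron_all(ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

def op_at(op, site, n):
    return kron_all([op if i == site else I2 for i in range(n)])

n = 3
# Toy Hamiltonian (assumption for illustration): H = sum_i Z_i Z_{i+1} + sum_i X_i
H = sum(op_at(Z, i, n) @ op_at(Z, i + 1, n) for i in range(n - 1)) \
  + sum(op_at(X, i, n) for i in range(n))

psi0 = np.zeros(2 ** n); psi0[0] = 1.0          # initial state |00...0>
t = 1.0
psi_t = expm(-1j * H * t) @ psi0                 # state at time t
A = op_at(Z, 0, n)
expectation = np.vdot(psi_t, A @ psi_t).real     # <psi(t)| Z_0 |psi(t)>
# Temporal correlation <psi(0)| Z_0(t) Z_0(0) |psi(0)>, with Z_0(t) = e^{iHt} Z_0 e^{-iHt}
corr = np.vdot(psi0, expm(1j * H * t) @ A @ expm(-1j * H * t) @ A @ psi0)
print(expectation, corr)
```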

3 Quantum Simulators, Dynamical
Quantum Computer: can simulate the dynamics of every multi-particle quantum system
- Spin models (Lloyd ‘96, …, Berry, Childs, Kothari ‘15)
- Fermionic and bosonic models (Bravyi, Kitaev ’00, …)
- Topological quantum field theory (Freedman, Kitaev, Wang ‘02)
- Quantum field theory (Jordan, Lee, Preskill ‘11)
- Quantum chemistry (Hastings, Wecker, Bauer, Troyer ’14)
Quantum Simulators: can simulate the dynamics of particular models (optical lattices, ion traps, superconducting systems)

4 Static Properties

5 Static Properties
Hamiltonian: H = Σij Hij

6 Static Properties
Hamiltonian: H = Σij Hij
Groundstate: |ψ0>, the lowest-energy eigenstate of H. Thermal state: ρβ = e-βH/Z(β)
Compute: local expectation values (e.g. magnetization), correlation functions, …

7 Static Properties Can we prepare groundstates?
Warning: in general hard, even for one-dimensional translation-invariant models (…, Gottesman-Irani ‘09)

8 Static Properties Can we prepare groundstates?
Warning: in general hard, even for one-dimensional translation-invariant models (…, Gottesman-Irani ‘09)
Method 1: Adiabatic Evolution. Interpolate along a path of Hamiltonians H(s) from a simple H(si) to the target H(sf), with instantaneous eigenstates H(s)ψs = E0,s ψs; works if the minimum gap Δ := mins Δ(s) satisfies Δ ≥ n-c.
Method 2: Phase Estimation (Abrams, Lloyd ‘99); works if one can find a “simple” state |0> with at least inverse-polynomial overlap with the groundstate.
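A minimal numerical sketch of Method 1 on a toy two-qubit problem, assuming a linear interpolation H(s) = (1-s)·Hi + s·Hf and small-step time evolution (the Hamiltonians and schedule below are ad hoc assumptions for illustration):

```python
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]]); Z = np.diag([1.0, -1.0]); I2 = np.eye(2)

# H_i: simple initial Hamiltonian with known groundstate |++>; H_f: toy target Hamiltonian
H_i = -(np.kron(X, I2) + np.kron(I2, X))
H_f = -np.kron(Z, Z) - 0.5 * (np.kron(Z, I2) + np.kron(I2, Z))

psi = np.ones(4, dtype=complex) / 2.0         # groundstate of H_i

T, steps = 50.0, 2000                         # total evolution time and discretization
dt = T / steps
for k in range(steps):
    s = (k + 0.5) / steps                     # interpolation parameter s in [0, 1]
    H_s = (1 - s) * H_i + s * H_f
    psi = expm(-1j * H_s * dt) @ psi          # one small time step under H(s)

vals, vecs = np.linalg.eigh(H_f)              # exact spectrum of the target Hamiltonian
print("groundstate fidelity:", abs(np.vdot(vecs[:, 0], psi)) ** 2)
```

For a slow enough schedule (large T relative to 1/Δ²), the fidelity approaches 1, as the adiabatic theorem predicts.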

9 Static Properties Can we prepare thermal states?
Warning: NP-hard to estimate energy of general classical Gibbs states

10 Static Properties Can we prepare thermal states?
Are quantum computers useful in some cases? Warning: NP-hard to estimate energy of general classical Gibbs states

11 Plan
1. Classical Glauber dynamics
2. Mixing in space vs mixing in time
3. Quantum Master Equations
4. Mixing in space vs mixing in time for commuting quantum systems
5. Approach to non-commuting
6. Potential Applications

12 Thermalization Can we prepare thermal states?
Method 1: Couple the system S to a bath B of the right temperature and wait. But the size of the environment might be huge, so this may not be efficient, and there is no guarantee the Gibbs state will be reached.

13 Method 2 (classical): Metropolis Sampling
Consider e.g. the Ising model. Coupling to the bath is modeled by a stochastic map Q. Metropolis update: pick a spin, compute the energy change ΔE of flipping it, and accept the flip with probability min(1, e-βΔE). The stationary state is the thermal (Gibbs) state e-βH/Z(β).

14 Method 2 (classical): Metropolis Sampling
Consider e.g. the Ising model. Coupling to the bath is modeled by a stochastic map Q. Metropolis update: pick a spin, compute the energy change ΔE of flipping it, and accept the flip with probability min(1, e-βΔE). The stationary state is the thermal (Gibbs) state e-βH/Z(β).
(Metropolis et al ’53) “We devised a general method to calculate the properties of any substance comprising individual molecules with classical statistics”
This is an example of a Markov Chain Monte Carlo method, an extremely useful algorithmic technique.
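A minimal sketch of the Metropolis update for the 2D Ising model (a standard textbook implementation, not code from the talk; lattice size, temperature, and sweep counts are arbitrary choices):

```python
import numpy as np

def metropolis_sweep(spins, beta, rng):
    """One sweep of single-spin Metropolis updates for the 2D Ising model
    H = -sum_<ij> s_i s_j with periodic boundary conditions."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(L), rng.integers(L)
        # Energy change if spin (i, j) is flipped
        nn = spins[(i+1) % L, j] + spins[(i-1) % L, j] + spins[i, (j+1) % L] + spins[i, (j-1) % L]
        dE = 2.0 * spins[i, j] * nn
        # Metropolis rule: accept with probability min(1, e^{-beta * dE})
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1
    return spins

rng = np.random.default_rng(0)
L, beta = 16, 0.4
spins = rng.choice([-1, 1], size=(L, L))
for _ in range(500):                      # equilibration sweeps
    metropolis_sweep(spins, beta, rng)
print("magnetization per site:", spins.mean())
```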

15 Glauber Dynamics A stochastic map R = eG is a Glauber dynamics for a (classical) Hamiltonian if its generator G is local and the unique fixed point of R is e-βH/Z(β) (+ detailed balance). Ex: Metropolis, heat-bath generator, …

16 Glauber Dynamics
A stochastic map R = eG is a Glauber dynamics for a (classical) Hamiltonian if its generator G is local and the unique fixed point of R is e-βH/Z(β) (+ detailed balance). Ex: Metropolis, heat-bath generator, …
When is Glauber dynamics effective for sampling from the Gibbs state?
(Stroock, Zegarlinski ’92; Martinelli, Olivieri ’94, …) Gibbs state with finite correlation length ⇔ rapidly mixing Glauber dynamics
(Sly ‘10) Gibbs sampling in P (vs NP-hard); proved only for the hard-core model and the 2-spin anti-ferromagnetic model

17 Temporal Mixing
Spectral decomposition of the transition matrix after t time steps: Pt = Σk λkt Πk, with eigenvalues 1 = λ0 > λ1 ≥ … and eigenprojectors Πk.
Convergence time given by the gap Δ = 1 - λ1: time of equilibration ≈ n/Δ.
We have rapid mixing if Δ = constant.
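To make the gap concrete, the sketch below builds the single-site heat-bath (Glauber) transition matrix for a tiny Ising chain by brute force and reads off Δ = 1 − λ1 (exact diagonalization is only feasible for a handful of spins; parameters are arbitrary):

```python
import numpy as np
import itertools

n, beta = 4, 0.5
states = list(itertools.product([-1, 1], repeat=n))
index = {s: k for k, s in enumerate(states)}

def energy(s):
    # 1D Ising chain with periodic boundary: H = -sum_i s_i s_{i+1}
    return -sum(s[i] * s[(i + 1) % n] for i in range(n))

# Heat-bath (Glauber) dynamics: pick a site uniformly at random,
# then resample that spin from its conditional Gibbs distribution.
P = np.zeros((2 ** n, 2 ** n))
for s in states:
    for i in range(n):
        for new in (-1, 1):
            t = list(s); t[i] = new; t = tuple(t)
            p_new = np.exp(-beta * energy(t))
            p_both = np.exp(-beta * energy(s[:i] + (1,) + s[i+1:])) \
                   + np.exp(-beta * energy(s[:i] + (-1,) + s[i+1:]))
            P[index[s], index[t]] += (1.0 / n) * p_new / p_both

eigvals = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
gap = 1.0 - eigvals[1]
print("spectral gap:", gap, "rough equilibration time ~ n/gap =", n / gap)
```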

18 Spatial Mixing (figure: region V in blue, its boundary in red)
Let pτV be the Gibbs state of the model on the lattice region V with the boundary spins fixed to a configuration τ (e.g. τ = (0, …, 0)).

19 Spatial Mixing (figure: region V in blue, its boundary in red)
Let pτV be the Gibbs state of the model on the lattice region V with the boundary spins fixed to a configuration τ (e.g. τ = (0, …, 0)).
def: The Gibbs state has correlation length ξ if for every pair of observables f, g the covariance decays as |cov(f, g)| ≤ c ||f|| ||g|| e-d(f,g)/ξ, where d(f, g) is the distance between the supports of f and g.
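A quick numerical check of this definition for the 1D Ising chain with no field, where the covariance of single-site observables decays as tanh(β)^d, giving ξ = −1/log tanh(β) (brute-force enumeration over a toy chain):

```python
import numpy as np
import itertools

n, beta = 10, 0.5
configs = np.array(list(itertools.product([-1, 1], repeat=n)))
# 1D Ising chain, free boundary, no field: H = -sum_i s_i s_{i+1}
energies = -np.sum(configs[:, :-1] * configs[:, 1:], axis=1)
weights = np.exp(-beta * energies)
probs = weights / weights.sum()

def cov(i, j):
    e_i = probs @ configs[:, i]
    e_j = probs @ configs[:, j]
    return probs @ (configs[:, i] * configs[:, j]) - e_i * e_j

# Covariance decays like tanh(beta)^d, i.e. correlation length xi = -1/log(tanh(beta))
for d in range(1, 6):
    print(d, cov(0, d), np.tanh(beta) ** d)
```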

20 Temporal Mixing <-> Spatial Mixing
(Stroock, Zegarlinski ’92; Martinelli, Olivieri ’94, …) For every classical Hamiltonian, the Gibbs state has finite correlation length if, and only if, the Glauber dynamics has a finite gap

21 Temporal Mixing <-> Spatial Mixing
(Stroock, Zegarlinski ’92; Martinelli, Olivieri ’94, …) For every classical Hamiltonian, the Gibbs state has finite correlation length if, and only if, the Glauber dynamics has a finite gap.
Obs1: The same is true for the log-Sobolev constant of the system.
Obs2: For many models, when the correlation length diverges the gap is exponentially small in the system size (e.g. Ising model).
Obs3: Any model in 1D, and any model in arbitrary dimension at high enough temperature, has a finite correlation length (connected to uniqueness of the phase, e.g. Dobrushin’s condition).

22 Temporal Mixing <-> Spatial Mixing
(Stroock, Zegarlinski ’92; Martinelli, Olivieri ’94, …) For every classical Hamiltonian, the Gibbs state has finite correlation length if, and only if, the Glauber dynamics has a finite gap.
Obs1: The same is true for the log-Sobolev constant of the system.
Obs2: For many models, when the correlation length diverges the gap is exponentially small in the system size (e.g. Ising model).
Obs3: Any model in 1D, and any model in arbitrary dimension at high enough temperature, has a finite correlation length (connected to uniqueness of the phase, e.g. Dobrushin’s condition).
Does something similar hold in the quantum case? 1st step: we need a quantum version of Glauber dynamics…

23 Preparing Quantum Thermal States
(Terhal, DiVincenzo ’00, …) Simulate the interaction of the system with a heat bath — no run-time estimate.
(Poulin, Wocjan ’09, …) Grover-type speed-up for preparing Gibbs states — still exponential run-time in general.
(Temme et al ‘09) Quantum Metropolis: a quantum channel that (i) can be implemented efficiently on a quantum computer and (ii) has the Gibbs state as its fixed point.
(Yung, Aspuru-Guzik ‘10) Amplitude amplification applied to quantum Metropolis: square-root speed-up on the spectral gap.
(Hastings ’08; Bilgin, Boixo ’10) Poly-time quantum/classical algorithm for every 1D model — restricted to 1D.

24 Quantum Master Equations
Canonical example: cavity QED
Lindblad Equation: dρ/dt = -i[H, ρ] + Σi (Ai ρ Ai† - ½{Ai†Ai, ρ}) (the most general Markovian and time-homogeneous quantum master equation)

25 Quantum Master Equations
Canonical example: cavity QED
Lindblad Equation: dρ/dt = L(ρ) = -i[H, ρ] + Σi (Ai ρ Ai† - ½{Ai†Ai, ρ}) (the most general Markovian and time-homogeneous quantum master equation)
The generator L gives a completely positive trace-preserving map etL; fixed point: L(ρfix) = 0.
How fast does it converge? Determined by the gap of the Lindbladian.
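A minimal sketch of integrating a Lindblad equation for a single qubit with a crude Euler step (the Hamiltonian, the single jump operator, and the rate below are ad hoc choices for illustration):

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
sm = np.array([[0, 0], [1, 0]], dtype=complex)   # single jump operator A = sigma_-
H = 0.5 * X                                      # toy Hamiltonian
gamma = 0.3                                      # toy decay rate

def lindblad_rhs(rho):
    """dρ/dt = -i[H, ρ] + γ (A ρ A† − ½{A†A, ρ})."""
    comm = H @ rho - rho @ H
    AdA = sm.conj().T @ sm
    diss = sm @ rho @ sm.conj().T - 0.5 * (AdA @ rho + rho @ AdA)
    return -1j * comm + gamma * diss

rho = np.array([[1, 0], [0, 0]], dtype=complex)  # start in |0><0|
dt, steps = 0.01, 5000                           # crude Euler integration up to t = 50
for _ in range(steps):
    rho = rho + dt * lindblad_rhs(rho)

print("trace:", np.trace(rho).real)              # stays 1: the evolution is trace preserving
print("state at t = 50:\n", np.round(rho, 3))    # close to the fixed point of the Lindbladian
```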

26 Quantum Master Equations
Canonical example: cavity QED
Lindblad Equation: dρ/dt = -i[H, ρ] + Σi (Ai ρ Ai† - ½{Ai†Ai, ρ})
Local master equations: L is k-local if all Ai act on at most k sites.
(Kliesch et al ‘11) The time evolution of every k-local Lindbladian on n qubits can be simulated in time poly(n, 2k) in the circuit model.

27 Davies Maps
Lindbladian: L(ρ) = -i[H, ρ] + Σα,ω γα(ω) (Sα(ω) ρ Sα(ω)† - ½{Sα(ω)†Sα(ω), ρ})
Lindblad terms: Sα(ω), the Fourier components of the coupling operators Sα at Bohr frequency ω; γα(ω): spectral density.

28 Davies Maps
Lindbladian: L(ρ) = -i[H, ρ] + Σα,ω γα(ω) (Sα(ω) ρ Sα(ω)† - ½{Sα(ω)†Sα(ω), ρ})
Lindblad terms: Sα(ω), the Fourier components of the coupling operators Sα (e.g. single-site Paulis Xα, Yα, Zα) at Bohr frequency ω; γα(ω): spectral density, with γα(-ω) = e-βω γα(ω).
The thermal state is the unique fixed point (it satisfies quantum detailed balance with respect to e-βH/Z(β)).
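A minimal single-qubit sketch of a Davies-type generator: H = (ω/2)Z, coupling operator X split into its Fourier components σ± at the Bohr frequencies ±ω, with rates obeying γ(−ω) = e^{−βω} γ(ω); the code checks that the Gibbs state is the fixed point and reads off the spectral gap (a toy construction for illustration; the talk concerns n-spin lattice models):

```python
import numpy as np

omega, beta = 1.0, 2.0
Z = np.diag([1.0, -1.0])
H = 0.5 * omega * Z
sm = np.array([[0, 0], [1, 0]], dtype=complex)   # sigma_- (Fourier component at +omega)
sp = sm.conj().T                                  # sigma_+ (Fourier component at -omega)
g_minus = 1.0
g_plus = np.exp(-beta * omega) * g_minus          # KMS / detailed-balance condition
jumps = [(g_minus, sm), (g_plus, sp)]

# Vectorized superoperators: vec(A rho B) = (A kron B^T) vec(rho) for row-major vec
I2 = np.eye(2)
def left(A):  return np.kron(A, I2)
def right(A): return np.kron(I2, A.T)

L = -1j * (left(H) - right(H))
for g, A in jumps:
    L += g * (np.kron(A, A.conj()) - 0.5 * (left(A.conj().T @ A) + right(A.conj().T @ A)))

vals, vecs = np.linalg.eig(L)
order = np.argsort(-vals.real)                    # eigenvalue 0 corresponds to the fixed point
rho_fix = vecs[:, order[0]].reshape(2, 2)
rho_fix /= np.trace(rho_fix)

rho_gibbs = np.diag(np.exp(-beta * np.diag(H)))
rho_gibbs /= np.trace(rho_gibbs)

print("fixed point:\n", np.round(rho_fix.real, 4))
print("Gibbs state:\n", np.round(rho_gibbs, 4))
print("spectral gap of the Davies generator:", -vals[order[1]].real)
```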

29 Why Davies Maps?
Interaction Hamiltonian: system H coupled weakly (strength λ) to a bath.
(Davies ‘74) Rigorous derivation in the weak-coupling limit: coarse-grain over times t ≈ λ-2 >> max(1/(Ei – Ej + Ek – El)) (Ei: eigenvalues of H)

30 Why Davies Maps?
Interaction Hamiltonian: system H coupled weakly (strength λ) to a bath.
(Davies ‘74) Rigorous derivation in the weak-coupling limit: coarse-grain over times t ≈ λ-2 >> max(1/(Ei – Ej + Ek – El)) (Ei: eigenvalues of H)
But for an n-spin Hamiltonian H, max(1/(Ei – Ej + Ek – El)) = exp(O(n)).
Consequence: the Sα(ω) are non-local (they act on n qubits); but for a commuting Hamiltonian H they are local.

31 Gap
The relevant gap is defined with respect to the L2 inner product weighted by the Gibbs state, and the associated variance Var(f).
The gap Λ equals the spectral gap of the Lindbladian; the mixing time is of order roughly n/Λ.

32 Previous Results Gap has been estimated for:
(Alicki, Fannes, Horodecki ‘08) Λ = Ω(1) for the 2D toric code
(Alicki, Horodecki, Horodecki, Horodecki ‘08) Λ = exp(-Ω(n)) for the 4D toric code
(Temme ‘14) Λ > exp(-βε)/n, with ε the energy barrier, for stabilizer Hamiltonians

33 Equivalence of Clustering in Space and Time for Quantum Commuting
thm (Kastoryano, Brandão) For commuting Hamiltonians in a finite-dimensional lattice, the Davies generator has a constant gap if, and only if, the Gibbs state satisfies strong clustering of correlations.

34 Equivalence of Clustering in Space and Time for Quantum Commuting
thm For commuting Hamiltonians in a finite-dimensional lattice, the Davies generator has a constant gap if, and only if, the Gibbs state satisfies strong clustering of correlations.
Strong clustering holds true: in 1D at any temperature; in any dimension at sufficiently high temperature (critical T determined only by the dimension and the interaction range).

35 Equivalence of Clustering in Space and Time for Quantum Commuting
thm For commuting Hamiltonians in a finite-dimensional lattice, the Davies generator has a constant gap if, and only if, the Gibbs state satisfies strong clustering of correlations.
Strong clustering holds true: in 1D at any temperature; in any dimension at sufficiently high temperature (critical T determined only by the dimension and the interaction range).
Gives the first polynomial-time quantum algorithm for preparing Gibbs states of commuting models at high temperature.
Caveat: at high temperature the cluster expansion already works well for computing local expectation values. (Open: how do the two threshold temperatures compare?)
Quantum advantage: we get the full Gibbs state (e.g. one could perform a swap test on purifications of two Gibbs states. Good for anything?)

36 Strong Clustering def: Strong clustering holds if there is ξ > 0 s.t. for every pair of regions A and B and every operator f acting on A ∪ B, the conditional covariance of f between A and B decays as exp(-d(A, B)/ξ).
Notation: EX denotes the conditional expectation, CovX the conditional covariance, Xc the complement of X, and d(X, Y) the distance between regions X and Y.

37 Strong Clustering def: Strong clustering holds if there is ξ > 0 s.t. for every pair of regions A and B and every operator f acting on A ∪ B, the conditional covariance of f between A and B decays as exp(-d(A, B)/ξ). Fact 1: … Fact 2: … Strong clustering follows if …

38 Strong Clustering -> Gap
We show that under the clustering condition the gap of the dynamics on the entire lattice V is lower-bounded in terms of the gap on sublattices V0 of size O(ξ). Follows the idea of a proof of the classical analogue for Glauber dynamics (Bertini et al ‘00). Key lemma (roughly): if the dynamics restricted to overlapping regions A and B are gapped, then so is the dynamics on A ∪ B, with a loss controlled by the clustering condition.

39 Gap -> Strong Clustering
Employs a mapping between Liouvillians for commuting Hamiltonians and local Hamiltonians on a larger space, then applies the detectability lemma (Aharonov et al ‘10) to prove gap -> strong clustering (strengthening a previous proof that gap -> clustering).

40 Non-Commuting Hamiltonians?
Davies Master Equation is Non-Local… Question: Can it be implemented efficiently on a quantum computer?

41 Non-Commuting Hamiltonians?
Davies Master Equation is Non-Local… Beyond master equations:
def: Hamiltonian H satisfies local indistinguishability (LI) if, for every region A, expectation values of observables in A computed from the Gibbs state are essentially unchanged when the Hamiltonian is modified far away from A (beyond a surrounding region B).

42 Non-Commuting Hamiltonians?
Davies Master Equation is Non-Local… Beyond master equations:
def: Hamiltonian H satisfies local indistinguishability (LI) if, for every region A, expectation values of observables in A computed from the Gibbs state are essentially unchanged when the Hamiltonian is modified far away from A (beyond a surrounding buffer Bl of width l).
thm Suppose H (a local Hamiltonian in d dimensions) satisfies LI and, for every region A, the conditional mutual information between A and the rest of the system, conditioned on the buffer Bl, decays rapidly in l. Then e-H/T/Z(T) can be created by a quantum circuit of size exp(O(logd(n))).
Based on a recent lower bound by Fawzi and Renner on the conditional mutual information.

43 Non-Commuting Hamiltonians?
Davies Master Equation is Non-Local… Beyond master equations:
def: Hamiltonian H satisfies local indistinguishability (LI) if, for every region A, expectation values of observables in A computed from the Gibbs state are essentially unchanged when the Hamiltonian is modified far away from A (beyond a surrounding buffer Bl of width l).
thm Suppose H (a local Hamiltonian in d dimensions) satisfies LI and, for every region A, the conditional mutual information between A and the rest of the system, conditioned on the buffer Bl, decays rapidly in l. Then e-H/T/Z(T) can be created by a quantum circuit of size exp(O(logd(n))).
When is the requirement on the conditional mutual information true? (It always holds for commuting models.) Can we improve to poly(n) time?

44 Applications of Q. Gibbs Samplers
Machine Learning? (Nate’s talk)
Quantum algorithms for semidefinite programming? O(d4 log(1/ε))-time algorithm (interior point methods). For sparse Ai, Y, is there a polylog(d) quantum algorithm? (Think of the HHL quantum algorithm for linear equations.)

45 Applications of Q. Gibbs Samplers
Machine Learning? (Nate’s talk)
Quantum algorithms for semidefinite programming? O(d4 log(1/ε))-time algorithm (interior point methods). For sparse Ai, Y, is there a polylog(d) quantum algorithm? (Think of the HHL quantum algorithm for linear equations.) No, unless NP ⊆ BQP.

46 Applications of Q. Gibbs Samplers
Machine Learning? (Nate’s talk)
Quantum algorithms for semidefinite programming?
(Peng, Tangwongsan ‘12) For Ai > 0 (||Ai|| < 1), the problem can be reduced to polylog(d) evaluations of Gibbs-state expectation values of the form Tr(Ai e-Σj λj Aj)/Tr(e-Σj λj Aj) (for λi < log(d)/ε).

47 Applications of Q. Gibbs Samplers
Machine Learning? (Nate’s talk)
Quantum algorithms for semidefinite programming?
(Peng, Tangwongsan ‘12) For Ai > 0 (||Ai|| < 1), the problem can be reduced to polylog(d) evaluations of Gibbs-state expectation values of the form Tr(Ai e-Σj λj Aj)/Tr(e-Σj λj Aj) (for λi < log(d)/ε).
Are there interesting choices of {Ai} for which quantum computers give an advantage in computing the Gibbs states above?
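A small classical sketch of the kind of quantity such a reduction evaluates: a Gibbs-state expectation value of the form Tr(Ai e^(−Σj λj Aj)) / Tr(e^(−Σj λj Aj)) for random positive matrices (dense-matrix evaluation with scipy; the matrices, dimensions, and the sign convention in the exponent are assumptions for illustration):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
d, m = 64, 5                                  # toy dimension and number of constraint matrices

def random_psd(dim):
    """Random positive matrix with spectral norm < 1 (toy instance)."""
    B = rng.standard_normal((dim, dim))
    M = B @ B.T
    return M / (1.1 * np.linalg.norm(M, 2))

A = [random_psd(d) for _ in range(m)]
lam = rng.uniform(0, 2, size=m)               # multipliers lambda_j (arbitrary here)

# Gibbs-state expectation values Tr(A_i rho) with rho proportional to exp(-sum_j lambda_j A_j)
G = expm(-sum(l * a for l, a in zip(lam, A)))
rho = G / np.trace(G)
values = [np.trace(a @ rho) for a in A]
print(np.round(values, 4))
```

For exponentially large but sparse Aj, the hoped-for quantum advantage would come from preparing the Gibbs state directly instead of exponentiating a dense d × d matrix.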

