
Optimization Algorithms on a Quantum Computer: A New Paradigm for Technical Computing
Richard H. Warren, PhD
© Copyright 2011 Lockheed Martin Corporation

Lockheed Martin Buys Quantum Computer
In 2011, Lockheed Martin Corp. purchased a quantum computer from D-Wave Systems, a Canadian technology firm. Lockheed Martin's planned uses:
- software validation and verification methods
- complex systems engineering
- algorithms and computer science
- enhanced machine learning

Ising Function for Quantum Annealing
Quantum bits (qubits) can reach an optimal low-energy state when supercooled. The Ising function has the form E(s) = Σ_i h_i s_i + Σ_{i<j} J_ij s_i s_j, where i and j are qubits.
Inputs: h_i = the energy bias for qubit i; J_ij = the coupling energy between qubits i and j.
Output: s_i = the state of qubit i, either 0 or 1.
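A minimal sketch, assuming plain NumPy rather than any vendor toolkit, of how the energy above could be evaluated for a candidate state. The function name and the tiny two-qubit example are invented for illustration.

```python
import numpy as np

def ising_energy(h, J, s):
    """Energy E(s) = sum_i h_i s_i + sum_{i<j} J_ij s_i s_j for a state s.

    h : 1-D array of per-qubit biases h_i
    J : 2-D array of couplings J_ij (only the i < j entries are used)
    s : sequence of qubit states, 0 or 1 as on the slide
    """
    s = np.asarray(s, dtype=float)
    linear = float(np.dot(h, s))
    quadratic = sum(J[i, j] * s[i] * s[j]
                    for i in range(len(s)) for j in range(i + 1, len(s)))
    return linear + quadratic

# Invented two-qubit example: bias qubit 0 up, qubit 1 down, weak coupling.
h = np.array([1.0, -2.0])
J = np.array([[0.0, 0.5],
              [0.0, 0.0]])
print(ising_energy(h, J, [0, 1]))  # -2.0: the lower-energy state here
print(ising_energy(h, J, [1, 1]))  # -0.5
```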

Quantum Machine Hardware
- A qubit is a loop of superconducting wire made of niobium.
- Coupling between qubits is magnetic wiring.
- The operating temperature is close to 0 kelvin.
- The quantum machine has 128 qubits arranged in blocks of 8 qubits. All qubits in a block are coupled; there is limited coupling between qubits in different blocks. 32 qubits were inoperative. Result: the maximum number of qubits that can be logically coupled is 13.

Adiabatic Quantum Optimization
- A physics-inspired approach: the Hamiltonian changes over time.
- After the computation is complete, there is a non-zero probability that the system is in its lowest energy state, which encodes a global minimum of the problem being solved.

Programming a Minimization Problem
1. Convert the variables to binary 0, 1.
2. Design the objective function.
3. Convert the constraints to penalty functions.
4. Combine the objective function and the penalty functions into one expression that has only linear and quadratic terms.
5. Map the expression from step 4 to the quantum machine hardware.
6. Interface to the quantum machine through a processor, such as MATLAB or Python.
A small end-to-end illustration of steps 1-4 follows.
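The toy below (the costs and the penalty weight P are invented for illustration) shows steps 1-4: binary variables, a linear objective, one constraint turned into a penalty, and a single combined expression with only linear and quadratic terms. Brute force stands in for the quantum machine of steps 5-6.

```python
from itertools import product

def combined_expression(x1, x2, P=10):
    """Steps 2-4: objective plus constraint penalty, linear and quadratic terms only."""
    objective = 3 * x1 + 2 * x2        # step 2: invented costs
    penalty = (x1 + x2 - 1) ** 2       # step 3: "choose exactly one" as a penalty
    return objective + P * penalty     # step 4: one combined expression

# Step 1: the variables are binary, so all assignments can be checked directly.
best = min(product((0, 1), repeat=2), key=lambda v: combined_expression(*v))
print(best)  # (0, 1): choosing x2 costs 2 and satisfies the constraint
```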

Programming a Minimization Problem (cont'd)
- The s_i are the variables; the hardware is limited to 13 fully coupled variables.
- The Hamiltonian is at most a 13 x 13 matrix for full coupling: the diagonal elements are the h_i and the off-diagonal elements are the J_ij. A small sketch of this layout follows.
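A sketch of the layout just described (plain NumPy, with an invented helper name): linear coefficients on the diagonal, quadratic coefficients off the diagonal.

```python
import numpy as np

def build_hamiltonian(n, linear, quadratic):
    """n variables; linear = {i: coefficient of s_i};
    quadratic = {(i, j): coefficient of s_i s_j} with i < j."""
    H = np.zeros((n, n))
    for i, coeff in linear.items():
        H[i, i] = coeff              # h_i on the diagonal
    for (i, j), coeff in quadratic.items():
        H[i, j] = coeff              # J_ij in the upper triangle
    return H

# Invented 3-variable example: biases -3 each, couplings 2 on every pair.
print(build_hamiltonian(3, {0: -3, 1: -3, 2: -3},
                           {(0, 1): 2, (0, 2): 2, (1, 2): 2}))
```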

Symmetric Traveling Salesman Problem (TSP)
A salesman needs to visit n cities once each and return to the starting city. The distance d_ij between each pair of cities i, j is known. Symmetric means d_ij = d_ji. What is a route that minimizes the distance the salesman travels?
- Our binary variables are x_ij for the road from city i to city j, for i < j. The number of variables is n(n - 1)/2, which is 10 for 5 cities and 15 for 6 cities.
- Our objective function is Σ d_ij x_ij over i < j.
- The constraints are x_ij = x_ji, x_ij² = x_ij, and constraints ensuring that the salesman uses exactly 2 roads at each city: one into the city and a different one out of the city. For city 1 in a 4-city problem, the constraint is C(x_12, x_13, x_14) = (x_12 + x_13 + x_14 - 2)².
A sketch of the resulting objective-plus-penalty expression follows.
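A sketch of the objective-plus-penalty expression for the symmetric TSP above. The dictionary layout and the penalty weight P are assumptions, not the author's code; the penalty is the degree-2 term C applied at every city.

```python
def tsp_qubo_value(d, x, P):
    """d: {(i, j): distance} for i < j; x: {(i, j): 0 or 1} road choices; P: penalty weight."""
    cities = {c for pair in d for c in pair}
    objective = sum(d[e] * x[e] for e in d)                  # sum of d_ij x_ij
    penalty = sum((sum(x[e] for e in d if k in e) - 2) ** 2  # (roads at city k - 2)^2
                  for k in cities)
    return objective + P * penalty

# Invented 4-city distances; the tour 1-2-3-4-1 meets every degree-2 constraint.
d = {(1, 2): 3, (1, 3): 5, (1, 4): 4, (2, 3): 4, (2, 4): 6, (3, 4): 3}
tour = {(1, 2): 1, (2, 3): 1, (3, 4): 1, (1, 4): 1, (1, 3): 0, (2, 4): 0}
print(tsp_qubo_value(d, tour, P=10))  # 14: pure tour length, zero penalty
```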

Constraint C(x_12, x_13, x_14) = (x_12 + x_13 + x_14 - 2)² ensures one route into city 1 and one route out of city 1.
- None of the routes x_12, x_13, x_14 is selected: then C(x_12, x_13, x_14) = (0 + 0 + 0 - 2)² = 4.
- One of the routes is selected, say x_12: then C(x_12, x_13, x_14) = (1 + 0 + 0 - 2)² = 1.
- Two of the routes are selected, say x_12 and x_13: then C(x_12, x_13, x_14) = (1 + 1 + 0 - 2)² = 0.
- All three routes are selected: then C(x_12, x_13, x_14) = (1 + 1 + 1 - 2)² = 1.

Ensuring the Result Is a Tour for the 4-City TSP
For city 1, use C(x_12, x_13, x_14) = (x_12 + x_13 + x_14 - 2)².
- Consider the solution (x_12 x_21) and (x_34 x_43): then C(x_12, x_13, x_14) = (1 + 0 + 0 - 2)² = 1.
- Consider the solution (x_23 x_34 x_42): then C(x_12, x_13, x_14) = (0 + 0 + 0 - 2)² = 4.
- Consider the solution (x_12 x_23 x_34 x_41): then C(x_12, x_13, x_14) = (1 + 0 + 1 - 2)² = 0.

Compute the Hamiltonian for the 4-City TSP
Step 1: Reduce the constraint function for each city. For city 1, let a = x_12, b = x_13, and c = x_14. Then
(x_12 + x_13 + x_14 - 2)² = a² + b² + c² + 2(ab + bc + ac) - 4(a + b + c) + 4.
Applying a² = a, b² = b, and c² = c gives
(x_12 + x_13 + x_14 - 2)² = 2(ab + bc + ac) - 3(a + b + c) + 4.
Step 2: Add the coefficients of the linear terms to the diagonal; add the coefficients of the quadratic terms to the off-diagonal.
Step 3: Add the coefficients of the objective function Σ d_ij x_ij to the diagonal.
The Hamiltonian is the input to the quantum computer. A quick check of the Step 1 reduction follows.
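A brute-force check (plain Python, not part of the author's workflow) that the reduced quadratic form above agrees with (a + b + c - 2)² on every binary assignment.

```python
from itertools import product

for a, b, c in product((0, 1), repeat=3):
    squared = (a + b + c - 2) ** 2
    reduced = 2 * (a * b + b * c + a * c) - 3 * (a + b + c) + 4
    assert squared == reduced, (a, b, c)
print("reduced form matches on all 8 binary assignments")
```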

Job Shop Problem on One Processor
There are n jobs. Each job has an earliest time it can be started, a latest time it can be finished, a length of time to complete it, and a setup time to prepare the processor. The number of start times is small. A value v_j is assigned for completing job j. Maximize the total value obtained by completing jobs.
- Our binary variables are x_jt, where j designates a job and t designates a start time for job j.
- Our objective function is -Σ v_j x_jt over all jobs j and all start times t for job j.
- The constraints model: at most one job can start at time t; a job can start at most once; two jobs cannot be running at the same time. (A sketch of the first two penalties follows.)
- The constraints introduce new variables, which limits the number of job variables x_jt to 7 on the quantum processor. These are slack variables, similar to those in linear programming.
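A sketch of the first two constraint families listed above. The variable layout x[j, t] and the weight P are assumptions; the no-overlap constraints and their slack variables are omitted here.

```python
def job_shop_penalties(x, jobs, times, P):
    """x: {(j, t): 0 or 1} meaning job j starts at time t; returns the penalty value."""
    penalty = 0
    for j in jobs:                                  # a job can start at most once
        started = sum(x.get((j, t), 0) for t in times)
        penalty += P * started * (started - 1)      # 0 when started is 0 or 1
    for t in times:                                 # at most one job starts at time t
        starts = sum(x.get((j, t), 0) for j in jobs)
        penalty += P * starts * (starts - 1)
    return penalty

# Invented example: two jobs both trying to start at time 0 is penalized.
print(job_shop_penalties({(1, 0): 1, (2, 0): 1}, jobs=[1, 2], times=[0, 1], P=5))  # 10
```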

Other Inputs to the Quantum Processor
- Lagrange multipliers: balance the objective function and the constraints. Very sensitive.
- Cooling time: time to allow the processor to reach near 0 kelvin.
- Annealing time: time to allow the processor to reach its minimum energy state after it has cooled.
- Iterations: the number of times the same problem is sampled. We found the optimum 99% of the time with 100 iterations of the 4-city TSP.
- Loads: the number of times to iterate.
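An illustrative run configuration collecting the tunable inputs listed above. The names, units, and values are invented placeholders, not the machine's actual interface.

```python
run_config = {
    "lagrange_multiplier": 10.0,  # balances objective vs. penalty terms; very sensitive
    "cooling_time": 120.0,        # time for the processor to reach near 0 kelvin
    "annealing_time": 20.0,       # time to reach the minimum energy state after cooling
    "iterations": 100,            # number of times the same problem is sampled
    "loads": 5,                   # number of times to iterate
}
print(run_config)
```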

Advantage of Quantum Annealing
Speed in solving intractable problems:
- Experimental evidence shows it is several orders of magnitude faster than current digital computers.
- It is not known whether the scaling is polynomial or exponential.

Limitations of Quantum Annealing
- The limited number of qubits restricts the number of variables h_i.
- The limited connectivity between qubits restricts which parameters J_ij can be nonzero.
- Precision is affected by temperature and by the analog nature of the parameters h_i and J_ij in hardware.
There are techniques that mitigate these limitations.

Comparison to Linear Programming

Linear Programming                      Quantum Computing
Optimization technique                  Optimization technique
Linear objective function               Quadratic objective function
Variables ≥ 0                           Variables are 0 or 1
Linear constraints                      Quadratic constraints
Solution: Simplex Method                Solution: Adiabatic Annealing
Runs in polynomial time                 Extremely fast on intractable problems
Optimal solution from 1 execution       Near-optimal solution from many executions

References
1. D. Aharonov et al., Adiabatic quantum computation is equivalent to standard quantum computation, Proceedings of the 45th Annual IEEE Symposium on Foundations of Computer Science, pp. 42-51 (2004).
2. G. A. Kochenberger et al., A unified modeling and solution framework for combinatorial optimization problems, OR Spectrum 26 (2004).
3. K. Karimi et al. (D-Wave Systems), Investigating the performance of an adiabatic quantum optimization processor, Quantum Information Processing, published online (2011).

Summary
- Inputs to the adiabatic quantum computer are integers that are interpreted as analog values.
- The adiabatic quantum computer optimizes over all solutions simultaneously.
- Output from the adiabatic quantum computer is a 0, 1 assignment to the variables with an extremely high probability of being a minimum.

Terminology
- Adiabatic: describes a process in which there is no loss or gain of heat.
- Quantum annealing: the hardware is cooled to a very low temperature, which allows it to approach the minimum of the energy function in the Ising model.