
Quantum Computers

Overview: Brief History of Computing (generations); Current Technology; Limitations; Theory of Quantum Computing; How It Works; Applications; Timeline; Questions

History From the abacus, to gear-driven machines, to integrated circuits containing over 200 million transistors.

Moore’s Law In 1965 Gordon Moore observed that the number of transistors per square inch on integrated circuits had doubled every year since the integrated circuit was invented, and he predicted that this trend would continue for the foreseeable future. This has held true …… so far.

Moore’s Law (cont.) More recently, Gordon Moore announced that his 1965 prediction would not remain true for much longer: the microprocessor industry is getting closer to the limits of current technology, with the size of transistors on chips approaching the atomic level.

Stretching the Limits Intel has announced new SRAM chips for high-density memory, containing 330 million transistors; by comparison, the Pentium 4 has 30 million transistors.

Problems … Current technology is not yet having difficulty adding more transistors, but at the current rate of shrinkage transistors will soon be as small as an atom. If the scale becomes too small, electrons tunnel through the micro-thin barriers between wires, corrupting signals.

Quantum Computers A completely new approach to computing that uses quantum particles to perform computation. Still largely theoretical.

Entanglement Entanglement baffled Albert Einstein, who coined the phrase “spooky action at a distance.” It still remains a mystery, as does superposition.

Two States Are Better Than One! Digital computers rely on 0’s and 1’s: voltage levels produce highs and lows, and a bit can hold only one state at a time. Quantum computers can hold multiple states at once, effectively being in two places at the same time.

Example Reflect photons off a half-silvered mirror (which deflects half the light). Where do you think the photon lands?

Example (cont.) Proof ….. both detectors record the photon! A qubit (quantum bit) is in a superposition of 0 and 1.
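The half-silvered mirror experiment can be sketched as a tiny simulation. This is a minimal illustration, not from the original slides: it assumes the mirror acts as a Hadamard transform on a qubit represented by two amplitudes, and uses the Born rule (squared amplitudes) for the detector probabilities.

```python
import math

# A qubit is a pair of amplitudes (a, b) with a^2 + b^2 = 1.
# |0> = (1, 0) models the photon before the mirror.
ket0 = (1.0, 0.0)

def hadamard(q):
    """Model the half-silvered mirror: send the photon into an
    equal superposition of the two paths (a Hadamard transform)."""
    a, b = q
    s = 1.0 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(q):
    """Born rule: each detector fires with the squared amplitude."""
    a, b = q
    return (abs(a) ** 2, abs(b) ** 2)

after_mirror = hadamard(ket0)
p0, p1 = probabilities(after_mirror)
# Each detector fires with probability 1/2 -- the qubit is in
# superposition until it is measured.
```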

Why Is This Helpful? Multiple computations run simultaneously, so computing power grows exponentially with the number of qubits.

Digital vs. Quantum A digital computer produces results serially, even with threads; a quantum computer is truly concurrent. To cover the same possibilities, a digital computer needs an exponential amount of resources, while a quantum computer performs 2^n simultaneous computations with only n qubits (n = number of qubits).
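The 2^n scaling above is why classical simulation of quantum computers hits a wall. A small illustration (my own, not from the slides): a classical machine must store one amplitude per basis state, and that count doubles with every qubit added.

```python
def amplitudes_needed(n_qubits):
    """Number of amplitudes a classical computer must store to
    describe the joint state of n qubits: 2^n."""
    return 2 ** n_qubits

# 10 qubits: 1,024 amplitudes -- trivial.
# 30 qubits: 2^30 amplitudes -- at 16 bytes per complex number,
# that is already 16 GiB of memory just to hold the state.
small = amplitudes_needed(10)
large = amplitudes_needed(30)
```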

The Larger the Problem … Because of this exponential factor, larger computations save more resources than smaller ones. Adding large calculations to existing algorithms does not complicate the computation; the efficiency comes from the qubits themselves.

Power of Algorithms Multiplication algorithms assist in large computations. Quantum algorithms can speed up processing by using logic instructions with no classical counterpart, such as “… and now take a superposition of all numbers from the previous operations …”.
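The instruction “take a superposition of all numbers” corresponds to applying a Hadamard to every qubit of an all-zero register. A hypothetical sketch of the resulting state (names are mine, not from the slides):

```python
import math

def uniform_superposition(n_qubits):
    """State after a Hadamard on each of n qubits starting in |0...0>:
    an equal superposition of every number 0 .. 2^n - 1, represented
    here as a list of 2^n identical amplitudes."""
    count = 2 ** n_qubits
    amp = 1.0 / math.sqrt(count)
    return [amp] * count

state = uniform_superposition(3)
# All eight numbers 0..7 now appear in one register, each with
# probability (1/sqrt(8))^2 = 1/8, so later gates act on all of
# them at once.
```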

Algorithms (cont.) Quantum algorithms will be extremely effective for logic-based problems such as factoring large numbers, and they are not just theoretical anymore: in 1994 Peter Shor of Bell Labs devised a polynomial-time algorithm for factoring large numbers on a quantum computer.
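Shor’s algorithm reduces factoring to finding the period of a^x mod N; only that period-finding step needs a quantum computer. The sketch below is a classical stand-in: the quantum subroutine is replaced by brute force (so it only works for tiny N), but the surrounding reduction is the same one Shor uses.

```python
import math
import random

def find_period(a, N):
    """Classical stand-in for the quantum subroutine: the smallest
    r > 0 with a^r = 1 (mod N). Shor's quantum period-finding does
    this in polynomial time; brute force does not scale."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_via_periods(N):
    """The classical reduction from factoring N to period finding."""
    while True:
        a = random.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:
            return g  # lucky guess: a already shares a factor with N
        r = find_period(a, N)
        # An even period with a^(r/2) != -1 (mod N) yields a factor.
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            f = math.gcd(pow(a, r // 2, N) - 1, N)
            if 1 < f < N:
                return f

factor = factor_via_periods(15)  # finds 3 or 5
```

On a quantum computer, replacing `find_period` with quantum period finding is what turns this into a polynomial-time factoring algorithm.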

Status … Quantum algorithms do exist (Peter Shor). Intellectual hives devoted to quantum computing include Oxford University, the University of Innsbruck in Austria, the National Institute of Standards & Technology (NIST) labs in Boulder, Colorado, Los Alamos National Laboratory, the Massachusetts Institute of Technology, and many others. Even Microsoft Research in Redmond, Wash., now counts a quantum computer scientist among its theorists.

Status (cont.) Rudimentary quantum computers exist: on December 19, 2001, IBM performed Shor’s algorithm, factoring the number 15. Quantum computing is so complex that expanding on simple operations is still 10–20 years away. The best-known QCs are based on nuclear magnetic resonance (NMR).

Who’s Who in Quantum Computing Albert Einstein – questioned quantum entanglement, the phenomenon quantum computing exploits. Richard Feynman – considered simulation of quantum-mechanical objects by other quantum systems in 1982. Peter Shor – developed the first major quantum algorithm in 1994. And a ton of well-formed teams ……

Timeline Desktop quantum computers are expected by many within 20 years, with progress faster than anticipated.

Questions?