Presentation on theme: "Physical Limits of Computing, Dr. Mike Frank — Slides from a Course Taught at the University of Florida College of Engineering, Department of Computer & Information Science & Engineering" — Presentation transcript:

1 Physical Limits of Computing Dr. Mike Frank Slides from a Course Taught at the University of Florida College of Engineering Department of Computer & Information Science & Engineering Spring 2000, Spring 2002, Fall 2003

2 Overview of First Lecture
Introduction: Moore's Law vs. Known Physics
Mechanics of the course:
–Course website
–Books / readings
–Topics & schedule
–Assignments & grading policies
–Misc. other administrivia

3 Physical Limits of Computing Introductory Lecture Moore’s Law vs. Known Physics

4 Moore's Law
Moore's Law proper:
–Trend of doubling of the number of transistors per integrated circuit every 18 (later 24) months
–First observed by Gordon Moore in 1965 (see readings)
"Generalized Moore's Law":
–Various trends of exponential improvement in many aspects of information processing technology (both computing & communication): storage capacity/cost, clock frequency, performance/cost, size/bit, cost/bit, energy/operation, bandwidth/cost, …
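The doubling trend above can be sketched in a few lines of Python. This is an illustrative sketch: the 2,300-transistor starting point (roughly the Intel 4004 of 1971) is an assumption for the example, not a figure from the slides.

```python
def transistors(months_elapsed, start=2300, doubling_months=24):
    """Project transistor count per IC under a fixed doubling period.

    start=2300 (roughly the Intel 4004, 1971) is an illustrative
    assumption; the slide gives only the doubling period.
    """
    return start * 2 ** (months_elapsed / doubling_months)

# With 24-month doubling, one decade (120 months) gives 2**5 = 32x growth;
# with 18-month doubling, it gives 2**(120/18), roughly 101x.
```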

5 Moore's Law – Devices per IC (figure: device counts from early Fairchild ICs through Intel microprocessors)

6 Microprocessor Performance Trends Raw technology performance (gate ops/sec/chip): up ~55%/year. Source: Hennessy & Patterson, Computer Architecture: A Quantitative Approach. Added performance analysis based on data from the ITRS 1999 roadmap.
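As a quick sanity check (a sketch added here, not from the slide), ~55%/year compound growth implies a doubling time of about 1.6 years, consistent with the 18–24-month Moore's Law cadence:

```python
import math

annual_growth = 1.55  # ~55%/year raw performance growth (gate ops/sec/chip)

# The doubling time T satisfies annual_growth**T = 2,
# so T = ln(2) / ln(annual_growth).
doubling_time_years = math.log(2) / math.log(annual_growth)
# roughly 1.58 years, i.e. about 19 months
```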

7 Super-Exponential Long-Term Trend (figure: ops/second per $1,000 over time) Source: Kurzweil '99

8 Known Physics
The history of physics has been a story of:
–Ever-increasing precision, unity, & explanatory power
Modern physics is very nearly perfect!
–All accessible phenomena are exactly modeled, as far as we know, to the limits of experimental precision (~11 decimal places today).
However, the story is not quite complete yet:
–There is no experimentally validated theory unifying GR & QM (yet): string theory? M-theory? Loop quantum gravity? Other?

9 Fundamental Physical Limits of Computing
Thoroughly confirmed physical theories: Theory of Relativity, Quantum Theory
Implied universal facts: Speed-of-Light Limit, Uncertainty Principle, Definition of Energy, Reversibility, 2nd Law of Thermodynamics, Adiabatic Theorem, Gravity
Affected quantities in information processing: Communications Latency, Information Capacity, Information Bandwidth, Memory Access Times, Processing Rate, Energy Loss per Operation

10–11 (figure-only slides; no transcript text)

12 A Precise Definition of Nanoscale
Each scale is a three-decade band centered logarithmically on its characteristic length (10⁻⁶ m = 1 µm, 10⁻⁹ m = 1 nm, 10⁻¹² m = 1 pm):
–Microscale (characteristic length scale in microcomputers): 10⁻⁴·⁵ m ≈ 31.6 µm down to 10⁻⁷·⁵ m ≈ 31.6 nm
–Nanoscale (characteristic length scale in nanocomputers): 10⁻⁷·⁵ m ≈ 31.6 nm down to 10⁻¹⁰·⁵ m ≈ 31.6 pm; the near nanoscale lies above 1 nm, the far nanoscale below it (~atom size)
–Picoscale (characteristic length scale in picocomputers, if possible): below 10⁻¹⁰·⁵ m ≈ 31.6 pm
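The slide's half-decade boundaries follow from centering a three-decade band on each characteristic length. A minimal sketch of that definition:

```python
def scale_band(center_exponent):
    """Return (lower, upper) bounds in meters of the three-decade band
    centered logarithmically on 10**center_exponent meters, matching the
    slide's half-decade boundaries (e.g. nanoscale: 10**-10.5 m to
    10**-7.5 m)."""
    return 10.0 ** (center_exponent - 1.5), 10.0 ** (center_exponent + 1.5)

nano_lo, nano_hi = scale_band(-9)    # ~3.16e-11 m (31.6 pm) to ~3.16e-8 m (31.6 nm)
micro_lo, micro_hi = scale_band(-6)  # ~3.16e-8 m (31.6 nm) to ~3.16e-5 m (31.6 um)
# Note the bands tile exactly: the microscale's lower bound is the
# nanoscale's upper bound.
```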

13 (figure: ½CV² gate energy calculated from ITRS '99 geometry/voltage data)

14 What is entropy?
First characterized by Rudolf Clausius in 1850.
–Originally defined simply as heat ÷ temperature.
–Noted never to decrease in thermodynamic processes.
–Its significance and physical meaning were mysterious.
In the ~1880s, Ludwig Boltzmann proposed that entropy S is the logarithm of the number N of states: S = k ln N.
–What we would now call the information capacity of a system
–Holds for systems at equilibrium, in the maximum-entropy state
The modern consensus that emerged from 20th-century physics is that entropy is indeed the amount of unknown or incompressible information in a physical system.
–Important contributions to this understanding were made by von Neumann, Shannon, Jaynes, and Zurek.
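Boltzmann's formula S = k ln N is easy to evaluate directly; a minimal sketch, using the SI-exact value of Boltzmann's constant:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K (exact in SI since 2019)

def boltzmann_entropy(num_states):
    """Boltzmann's S = k ln N for a system with N equally likely states."""
    return k_B * math.log(num_states)

# One bit of information capacity (N = 2 equally likely states):
S_bit = boltzmann_entropy(2)   # = k ln 2, about 9.57e-24 J/K
```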

15 Landauer's 1961 Principle (from basic quantum theory)
Before bit erasure, the bit can be 0 or 1, and the rest of the system can be in any of N distinct states s₀ … s_N−1: 2N distinct states in all. Unitary (1-1) evolution must map these to 2N distinct states after erasure; since the erased bit always reads 0, the distinction is pushed into the 2N states s″₀ … s″₂N−1 of the environment.
Increase in entropy: S = log 2 = k ln 2 (one bit's worth). Energy lost to heat: ST = kT ln 2.
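The kT ln 2 bound is concrete enough to compute; a small sketch evaluating it at room temperature (the 300 K figure is an illustrative choice, not from the slide):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def landauer_limit(temperature_kelvin):
    """Minimum heat dissipated per bit erased: kT ln 2."""
    return k_B * temperature_kelvin * math.log(2)

E_room = landauer_limit(300)  # about 2.87e-21 J per erased bit at ~300 K
```
Note that the bound scales linearly with temperature, which is one motivation for considering low-temperature or reversible computing.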

16 Adiabatic Cost-Efficiency Benefits
(figure: bit-operations per US dollar over time, comparing conventional irreversible computing, worst-case reversible computing at ~1,000× advantage, and best-case reversible computing at ~100,000× advantage)
Scenario: $1,000 / 3 years, 100-watt conventional computer vs. reversible computers with the same capacity. All curves would →0 if leakage were not reduced.

