Presentation on theme: "The Origins of Computing"— Presentation transcript:
1 The Origins of Computing
History of Computing: The Origins of Computing. Sources: Combrink, Thomas Cortina, Fergus Toolan.
2 What is a Computer?
Originally, one who computes: "one who computes; a calculator, reckoner: specifically a person employed to make calculations in an observatory, in surveying, etc." – Oxford English Dictionary
"a machine for performing calculations automatically" – WordNet Dictionary
"a programmable machine that can respond to a specific set of instructions in a well-defined manner and can execute a prerecorded list of instructions" – Webopedia
3 What is a modern computer?
A machine which can execute billions of instructions per second. It uses a "stored program" to execute instructions in a specific order to "solve a problem".
4 Modern Computers are assemblies of components
Six logical units of a computer system:
Input unit: mouse, keyboard
Output unit: printer, monitor, audio speakers
Memory unit: retains input and processed information
Arithmetic and logic unit (ALU): performs calculations
Central processing unit (CPU): supervises operation of the other devices
Secondary storage unit: hard drives, floppy drives
5 CPU (Microprocessor Chip)
The brain of the computer. Made of Integrated Circuits (ICs), which contain millions of tiny transistors and other components. Performs all calculations and executes all instructions.
Example chips for PCs: Intel (Celeron, Pentium), AMD (K-6 and Athlon)
Inside the Chip
6 What's a Giga Hertz (GHz)?
A unit of measurement for CPU speed (clock speed).
G (giga) means 1 billion; M (mega) would be 1 million. Hz (hertz) means cycles per second, so GHz means 1 billion clock cycles per second.
CPUs may execute multiple operations each clock cycle.
So what does a 2.8 GHz CPU mean? 2,800,000,000 clock cycles per second, performing at least 2,800,000,000 operations per second.
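The clock-speed arithmetic above can be sketched in a few lines of Python. The one-operation-per-cycle floor is the slide's lower bound; real CPUs vary:

```python
# Clock-speed arithmetic from the slide, as a minimal sketch.
GIGA = 10**9  # "giga" = one billion

clock_ghz = 2.8
cycles_per_second = round(clock_ghz * GIGA)
print(cycles_per_second)  # 2800000000

# With at least one operation per clock cycle, a 2.8 GHz CPU
# performs at least this many operations per second.
min_ops_per_second = cycles_per_second
```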
7 Main Memory (RAM)
Stores data for programs currently running.
Temporary: emptied when the power is turned off.
Fast access for the CPU.
8 What's a Giga Byte (GB)?
GB measures the amount of data that can be stored. G (giga) for 1 billion, M (mega) for 1 million.
Data quantities are measured in bytes:
1 bit = stores a single on/off piece of information
1 byte = 8 bits
1 kilobyte = 2^10 bytes (~1,000 bytes)
1 megabyte = 2^20 bytes (~1,000,000 bytes)
1 gigabyte = 2^30 bytes (~1,000,000,000 bytes)
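The powers of two in the table can be checked directly in Python, which makes concrete how close the binary units are to their decimal approximations:

```python
# Binary size units: each step up multiplies by 2**10 = 1024,
# close to the decimal prefixes 1,000; 1,000,000; 1,000,000,000.
KILOBYTE = 2**10
MEGABYTE = 2**20
GIGABYTE = 2**30

print(KILOBYTE, MEGABYTE, GIGABYTE)  # 1024 1048576 1073741824

BITS_PER_BYTE = 8  # a bit stores a single on/off value
bits_per_gigabyte = GIGABYTE * BITS_PER_BYTE
```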
9 Hard Drive
Stores data and programs. Permanent storage (in theory): when you turn off the computer, it is not emptied.
10 Motherboard
Connects all the components together.
11 How did we get here?
In studying the history of computers, where do we start? We could go back thousands of years:
Mathematical developments
Manufacturing developments
Engineering innovations
The wheel?
16 Early Computational Devices (Chinese) Abacus
Used for performing arithmetic operations.
Mathematical concepts and their offspring, arithmetical operations, were considered for thousands of years a purely intellectual exercise which could not be duplicated or performed by a man-made artifact. Even the abacus, which appeared in Asia Minor 2,500 years ago and is still in use today, is only a memory-helping device rather than a real calculating machine.
The abacus is an ingenious counting device based on the relative positions of two sets of beads moving on parallel strings. The first set contains five beads on each string and allows counting from 1 to 5, while the second set has only two beads per string, representing the numbers 5 and 10. The abacus system seems to be based on a radix of five. Using a radix of five makes sense, since humans started counting objects on their fingers.
17 Al'Khowarizmi and the algorithm
A 9th-century Central Asian scholar whose work reached Europe in 12th-century Latin translations; his name gives us the word "algorithm".
Developed the concept of a written, step-by-step process for doing something, and published a book on such processes.
The basis of software.
18 Early Computational Devices Napier's Bones, 1617
For performing multiplication and division.
Another interesting invention is Napier's bones, a clever multiplication tool invented in 1617 by the mathematician John Napier (1550–1617) of Scotland.
The bones are a set of vertical rectangular rods, each one divided into 10 squares. The top square contains a digit and the remaining squares contain the first 9 multiples of that digit. Each multiple has its digits separated by a diagonal line. When a number is constructed by arranging side by side the rods with the corresponding digits on top, its multiple can be easily obtained by reading the corresponding row of multiples from left to right while adding the digits found in the parallelograms formed by the diagonal lines. No wonder John Napier is also the inventor of logarithms, a concept used to change multiplication into addition.
Napier's bones were very successful and were widely used in Europe until the mid-1960s.
Logarithms were also the basis for the invention of the slide rule by William Oughtred (1574–1660) of England, in 1633.
John Napier
20 Philosopher Forefathers of Modern Computing, 1600-1700
Von Leibniz developed binary arithmetic and a hand-cranked calculator. The calculator was able to add, subtract, multiply and divide.
Blaise Pascal developed the Pascaline, a desktop calculator that worked like an odometer.
21 Blaise Pascal
Pascal (1623–1662) was the son of a tax collector and a mathematical genius. He designed the first mechanical calculator (the Pascaline) based on gears. It performed addition and subtraction.
23 Early Computational Devices Pascaline mechanical calculator
Blaise Pascal (1623–1662) was only 18 years old when he conceived the Pascaline. A precocious French mathematician and philosopher, Pascal discovered at the age of 12 that the sum of the angles in a triangle is always 180 degrees. Later on, he set the basis for probability theory and made significant contributions to the science of hydraulics.
The Pascaline, built in 1643, was possibly the first mechanical adding device actually used for a practical purpose. It was built by Pascal to help his father, Etienne Pascal, a tax collector, with the tedious activity of adding and subtracting large sequences of numbers. However, the machine was difficult to use and probably not very useful, because the French currency system was not base 10: a livre had 20 sols and a sol had 12 deniers.
Pascal was not aware of Schickard's machine, and his solution was not as elegant and efficient. As Paul E. Dune said, "had Schickard's ideas found a wide audience then Pascal's machine would not have been invented."
It was built on a brass rectangular box, where a set of notched dials moved internal wheels in such a way that a full rotation of a wheel caused the wheel at its left to advance one tenth. Although the first prototype contained only 5 wheels, later units were built with 6 and 8 wheels. A pin was used to rotate the dials. As opposed to Schickard's machine, the wheels moved only clockwise and were designed only to add numbers. Subtraction was done by applying a cumbersome technique based on the addition of the nine's complement.
Although the machine attracted a lot of attention in those days, it did not get wide acceptance because it was expensive, unreliable, and difficult to use and manufacture. By 1652 about 50 units had been made but fewer than 15 had been sold.
Initially, Pascal got a lot of interest in his invention and he even obtained a "privilege" protection (the medieval equivalent of a patent) for his idea in 1649, but his interest in science and "material" pursuits ended when he retreated to a Jansenist convent in 1655, concentrating all his attention on philosophy. He died in 1662.
During a period of 30 years after Pascal's invention, several people built calculating machines based on this design. The most notorious was the adding machine of Sir Samuel Morland (1625–1695) of England. This machine, invented in 1666, had a duodecimal scale based on the English currency, and required human intervention to enter the carry displayed in an auxiliary dial.
It is interesting to note that even at the beginning of the 20th century, several companies introduced models based directly on Pascal's design. One example is the Lightning Portable Adder, introduced in 1908 by the Lightning Adding Machine Co. of Los Angeles. Another example is the Addometer, introduced in 1920 by the Reliable Typewriter and Adding Machine Co. of Chicago. None of them achieved commercial success.
Blaise Pascal
24 Early Computational Devices Slide Calculators
Oughtred is best known for his invention of an early form of the slide rule. Edmund Gunter (1620) plotted a logarithmic scale along a single straight two-foot-long ruler. He added and subtracted lengths by using a pair of dividers, operations that were equivalent to multiplying and dividing. In 1630 Oughtred invented a circular slide rule. In 1632 he used two Gunter rulers so that he could do away with the dividers. He published Circles of Proportion and the Horizontal Instrument in 1632, describing slide rules and sundials.
There was a dispute, however, regarding priority over the invention of the circular slide rule. Delamain certainly published a description of a circular slide rule before Oughtred, in his Grammelogia, or the Mathematicall ring. It may well be that both invented this instrument independently. Unfortunately a very heated argument ensued, and to some extent this formed a cloud over the later years of Oughtred's life.
The present form of the slide rule was designed in 1850 by a French army officer, Amedee Mannheim.
MAGIC-BRAIN
HOW TO CLEAR THE MACHINE: Before solving any math problem, be sure your Calculator is clear (all 000 in total windows). This is done simply by pulling up the wire bar on top and pushing it back down.
HOW TO ADD: Addition is simple, but requires practice! Use the large black figures only for your addition problems. Put the point of the stylus into the right-hand slot directly opposite the number you wish to add. If the indicated slot is RED, pull DOWN. If the slot is WHITE, push the stylus all the way UP, OVER and DOWN around the curve (unless this operation is performed properly, the figure will not carry over into the second window).
William Oughtred
25 Gottfried Wilhelm von Leibniz
Leibniz (1646–1716) was a German mathematician who built the first calculator that could do multiplication and division. It was not reliable, due to the accuracy of contemporary parts.
He also documented the binary number system, which is used in all modern computers.
26 Count to 8 in binary
0001 0010 0011 0100 0101 0110 0111 1000
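The count above can be regenerated with Python's binary string formatting:

```python
# Count 1 to 8 in 4-bit binary, reproducing the slide's table.
for n in range(1, 9):
    print(f"{n} -> {n:04b}")
# 1 -> 0001, 2 -> 0010, ... 8 -> 1000
```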
27 Modern Computers use Binary
Why? Much simpler circuits are needed for performing arithmetic.
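One way to see why the circuits are simpler: a one-bit "half adder" needs only two logic gates, XOR for the sum bit and AND for the carry. A minimal sketch (the function is illustrative, not from the slides):

```python
def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two one-bit values using two gates:
    XOR (^) gives the sum bit, AND (&) gives the carry bit."""
    return a ^ b, a & b

# The complete truth table: only four cases to wire up in hardware.
for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print(f"{a} + {b} -> carry {carry}, sum {s}")
```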
28 Early Computational Devices Leibniz's calculating machine, 1674
It was 1672 when the famous German polymath, mathematician and philosopher Gottfried Wilhelm von Leibniz (1646–1716), co-inventor of the differential calculus, decided to build a machine able to perform the four basic arithmetical operations. He was inspired by a step-counting device (pedometer) he saw while on a diplomatic mission in Paris.
Like Pascal, Leibniz was a child prodigy. He learned Latin by the age of 8 and got his second doctorate when he was 19. As soon as he knew about Pascal's design, he absorbed all its details and improved the design so as to allow for multiplication and division. By 1674 his design was complete and he commissioned the building of a prototype from a Paris craftsman named Olivier.
The Stepped Reckoner, as Leibniz called his machine, used a special type of gear named the stepped drum or Leibniz wheel, a cylinder with nine bar-shaped teeth of incrementing length parallel to the cylinder's axis. When the drum is rotated by using a crank, a regular ten-tooth wheel, fixed over a sliding axis, is rotated zero to nine positions depending on its relative position to the drum. As in the Pascal device, there is one set of wheels for each digit. This allows the user to slide the mobile axis so that when the drum is rotated it generates in the regular wheels a movement proportional to their relative position. This movement is then translated by the device into multiplication or division, depending on which direction the stepped drum is rotated.
There is no evidence that more than two prototypes of this machine were ever made. Even though Leibniz was one of the most outstanding polymaths of his time, he died in poverty and unrewarded. His machine remained in the attic of the University of Göttingen until a worker found it in 1879 while fixing a leak in the roof. Now it is in the State Museum of Hanover; another one is in the Deutsches Museum in München.
Gottfried Wilhelm von Leibniz
29 George Boole (1815-1864)
Invented Boolean algebra: a system of logic using Boolean (true/false) values and the operations AND, OR and NOT.
Used in computer switching circuits.
A modern everyday use is in library searches, where search terms are combined with AND, OR and NOT.
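The library-search use of Boolean operators can be sketched directly in Python. The catalogue titles below are made up for illustration:

```python
# AND narrows a search, OR widens it, NOT excludes results.
titles = [
    "history of computing",
    "history of art",
    "theory of computing",
]

both = [t for t in titles if "history" in t and "computing" in t]
either = [t for t in titles if "history" in t or "computing" in t]
excluded = [t for t in titles if "computing" in t and "history" not in t]

print(both)      # ['history of computing']
print(excluded)  # ['theory of computing']
```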
30 Charles Babbage
Babbage (1791–1871) was a British inventor who designed two important machines: the Difference Engine and the Analytical Engine.
He saw a need to replace the human computers used to calculate numerical tables, who were prone to error, with a more accurate machine.
31 Charles Babbage Difference engine
Designed to compute values of polynomial functions automatically.
No multiplication was needed, because he used the method of finite differences.
He never built one; a working Difference Engine was built from 1989 to 1991 for the London Science Museum.
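The method of finite differences that the Difference Engine mechanized can be sketched in Python. For p(x) = x^2 + x + 1 the second difference is the constant 2, so every table entry comes from additions alone. The function below is an illustrative model, not Babbage's actual design:

```python
def difference_engine(value, *diffs, steps):
    """Tabulate a polynomial using only addition, starting from an
    initial value and its initial finite differences."""
    row = [value, *diffs]  # current value plus its differences
    out = []
    for _ in range(steps):
        out.append(row[0])
        # add each difference into the column to its left
        for i in range(len(row) - 1):
            row[i] += row[i + 1]
    return out

# p(x) = x**2 + x + 1: p(0) = 1, first difference 2, second difference 2
print(difference_engine(1, 2, 2, steps=5))  # [1, 3, 7, 13, 21]
```

Note that the loop performs no multiplication at all, which is exactly what made the design practical for gears and wheels.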
36 Lady Ada Byron – World's first programmer
Countess of Lovelace, daughter of Lord Byron. One of the first women mathematicians in England.
Documented Babbage's work and wrote an account of the Analytical Engine.
Wrote a program for the Analytical Engine for computing Bernoulli numbers.
37 Herman Hollerith
Hollerith developed an electromechanical punched-card tabulator to tabulate the data for the 1890 U.S. census. Data was entered on punched cards and could be sorted according to the census requirements. The machine was powered by electricity.
He formed the Tabulating Machine Company, which became International Business Machines (IBM). IBM is currently the largest computer manufacturer, employing in excess of 300,000 people.
38 Herman Hollerith punch card tabulating machine 1890 Census
39 Hollerith Tables and the Census
Improved the speed of the census
Reduced cost by $5 million
Greater accuracy of data collected
Hollerith – unemployed after the census
41 The War Years 1939-1945
Two primary uses:
Artillery tables: hand calculation replaced by machine calculation (Department of the Navy)
Cryptography: the art or process of writing in or deciphering secret writing
Bletchley Park and the Enigma codes – U23
43 History of Computers
Alan Turing was a British mathematician who also made significant contributions to the early development of computing, especially to the theory of computation.
He developed an abstract theoretical model of a computer called a Turing machine, which is used to capture the notion of computability, i.e. what problems can and cannot be computed. Not all problems can be solved on a computer.
Note: a Turing machine is an abstract model, not a physical computer.
44 Alan Turing – misunderstood genius
In 1936 he published the paper "On Computable Numbers".
Turing's machine: a hypothetical computer that could perform any computation or logical operation a human could devise.
45 Turing's Heritage
Code breaking was Turing's strength.
Colossus: a computer built at Bletchley Park to break high-level German ciphers.
Searched billions of alternatives; ran at a rate of 25,000 characters per second.
47 The World War II Years, 1939-1945
Computers were used to calculate artillery tables, to break codes (machines like Colossus), and to model future events, such as the atomic and hydrogen bombs.
Cmdr. Grace Hopper
48 Howard Aiken (1900–73)
Aiken, a Harvard professor, with the backing of IBM, built the Harvard Mark I computer (51 ft long). It was based on relays (which operate in milliseconds), as opposed to the use of gears, and required 3 seconds for a multiplication.
Aiken's Mark I (1944), based on Babbage's original design, was built at IBM labs; it was electromechanical and weighed 5 tons. Admiral Grace Hopper worked as a programmer on this computer, and coined the term 'bug' for a computer fault.
50 The Mark I - a dinosaur
51 feet long
3,304 electromechanical switches
Add or subtract 23-digit numbers in 3/10 of a second
Instructions (software) loaded by paper tape
The infamous "Bug"
51 ENIAC - The Next Jump Forward - 1946
1st electronic digital computer
Operated with vacuum tubes rather than electromechanical switches
1000 times faster than the Mark I
No program storage: programs were wired into the circuitry, "programmed" by switches and cords
Still based on the decimal numbering system
53 The Advent of the Semiconductor - 1947
Developed at Bell Labs by Shockley, Bardeen and Brattain (Nobel Prize).
The point-contact transistor replaced power-hungry, hot and short-lived vacuum tubes.
54 History of Computers
Von Neumann was a scientific genius and was a consultant on the ENIAC project. He formulated plans with Mauchly and Eckert for a new computer (EDVAC) which was to store programs as well as data. This is called the stored program concept, and Von Neumann is credited with it. Almost all modern computers are based on this idea and are referred to as Von Neumann machines.
He also concluded that the binary system was more suitable for computers, since switches have only two values. He went on to design his own computer at Princeton, which was a general purpose machine.
55 First Generation Computers (1951-58) These machines were used in business for accounting and payroll applications. Valves were unreliable components generating a lot of heat (still a problem in computers). They had very limited memory capacity. Magnetic drums were developed to store information and tapes were also developed for secondary storage.They were initially programmed in machine language (binary). A major breakthrough was the development of assemblers and assembly language.
56 EDVAC - Electronic Discrete Variable Automatic Computer, 1951
Data stored internally on a magnetic drum, a random-access magnetic storage device.
First stored-program computer.
58 Second Generation ( )
The development of the transistor revolutionised the development of computers. Invented at Bell Labs in 1947, transistors were much smaller, more rugged, cheaper to make and far more reliable than valves.
Core memory (non-volatile) was introduced, and disk storage was also used. The hardware became smaller and more reliable, a trend that still continues.
Another major feature of the second generation was the use of high-level programming languages such as Fortran and Cobol. These revolutionised the development of software for computers. The computer industry experienced explosive growth.
59 Technical Advances in the 60's
John McCarthy coins the term "Artificial Intelligence"
Removable disks appear
BASIC - Beginner's All-purpose Symbolic Instruction Code
Texas Instruments offers the first solid-state hand-held calculators
1st issue of Computerworld published
60 Third Generation ( )
ICs (Integrated Circuits) were again smaller, cheaper, faster and more reliable than transistors. Speeds went from the microsecond to the nanosecond (billionth of a second) and picosecond (trillionth) range. ICs were used for main memory despite the disadvantage of being volatile. Minicomputers were developed at this time.
Terminals replaced punched cards for data entry, and disk packs became popular for secondary storage.
IBM introduced the idea of a compatible family of computers, the 360 family, easing the problem of upgrading to a more powerful machine.
61 Third Generation ( )
Substantial operating systems were developed to manage and share the computing resources, and time-sharing operating systems were developed. These greatly improved the efficiency of computers. Computers had by now pervaded most areas of business and administration.
The number of transistors that can be fabricated on a chip is referred to as the scale of integration (SI). Early chips had SSI (small SI): tens to a few hundred transistors. Later chips were MSI (medium SI): hundreds to a few thousand. Then came LSI chips (large SI), in the thousands range.
62 Moore's Law
In 1965 Gordon Moore graphed data about growth in memory chip performance. He realized that each new chip had roughly twice the capacity of its predecessor, and was released within about two years of it; this implied that computing power would rise exponentially over relatively brief periods of time.
The law is still fairly accurate. In 30 years, the number of transistors on a chip increased about 20,000 times, from 2,300 on the 4004 in 1971 to 42 million on the Pentium® 4.
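The slide's numbers can be checked with simple doubling arithmetic. This is a rough model; the fixed two-year doubling period is the slide's assumption:

```python
def projected_transistors(start_count, start_year, year, doubling_years=2):
    """Project a transistor count assuming a fixed doubling period."""
    doublings = (year - start_year) / doubling_years
    return start_count * 2 ** doublings

# 2,300 transistors in 1971, 15 doublings over 30 years gives about
# 75 million: the same order of magnitude as the 42 million actually
# on the Pentium 4.
print(round(projected_transistors(2_300, 1971, 2001)))  # 75366400
```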
63 The 1970’s - The Microprocessor Revolution A single chip containing all the elements of a computer’s central processing unit.Small, integrated, relatively cheap to manufacture.
64 The Super Computers - 1972
The Cray: parallel processing power
Speed: 100 million arithmetical operations per second
Sensitive to heat - cooled with liquid refrigerant
Very expensive
65 Fourth Generation
VLSI allowed the equivalent of tens of thousands of transistors to be incorporated on a single chip. This led to the development of the microprocessor: a processor on a chip.
Intel produced the 4004, which was followed by the 8008, 8080, 8088, 8086, etc. Other companies developing microprocessors included Motorola (6800, 68000), Texas Instruments and Zilog.
66 Fourth Generation
Personal computers were developed, and IBM launched the IBM PC based on the 8088 and 8086 microprocessors. Mainframe computers have grown in power. Memory chips are in the megabit range. VLSI chips had enough transistors to build 20 ENIACs.
Secondary storage has also evolved at fantastic rates, with storage devices holding gigabytes (1,000 MB = 1 GB) of data.
67 Fourth Generation
On the software side, more powerful operating systems are available, such as Unix. Applications software has become cheaper and easier to use. Software development techniques have vastly improved. Fourth-generation languages (4GLs) make the development process much easier and faster.
68 Fourth Generation
Languages are also classified according to generations, from machine language (1GL), assembly language (2GL) and high-level languages (3GL) to 4GLs.
Software is often developed as application packages. VisiCalc, a spreadsheet program, was the pioneering application package and the original killer application.
Killer application: a piece of software that is so useful that people will buy a computer just to use that application.
69 The ALTAIR, from a voyage to Altair in Star Trek - 1975
70 The Birth of the Micro Computer, 1975
Jobs and Wozniak develop the Apple II
Commodore PET: programs stored on a cassette
Tandy-Radio Shack TRS-80
5 1/4 inch floppy disk becomes the standard for software
71 Finally, The Computer as Man of the Year - 1982
72 Revenge of the nerds
Bill Gates, Microsoft, 1978
Steve Jobs
Steve Wozniak
Alan Turing