Introduction to the History of Computing
Mechanical “Computers” Generation 0 These devices didn’t use electricity; some used gears, wires, or beads. Abacus, 1000-500 BC (Babylonians): a mechanical aid used for counting. The Salamis Tablet (Greek, c300 BC). The Roman hand abacus.
Abacus (cont.) Ancient times: 300 B.C. to c500 A.D. Middle Ages: c500 A.D. to c1400 A.D. Modern: 1200 A.D. to present.
Da Vinci’s Mechanical Calculator Notebook sketches, c1500. A working model has since been built from the sketches.
Napier’s Bones Early 1600s. Multiplication tables inscribed on strips of wood or bone.
Oughtred’s Slide Rule Rev. William Oughtred, 1621. Uses logarithms to perform multiplication and division by means of addition and subtraction.
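The slide rule works because of the logarithm identity log(a·b) = log(a) + log(b): adding lengths on two logarithmic scales multiplies the numbers. A minimal sketch of that identity (function names here are invented for illustration):

```python
import math

# Slide-rule principle: log(a*b) = log(a) + log(b), so multiplication
# becomes addition of lengths on two logarithmic scales; division
# likewise becomes subtraction.
def slide_rule_multiply(a, b):
    return math.exp(math.log(a) + math.log(b))

def slide_rule_divide(a, b):
    return math.exp(math.log(a) - math.log(b))

print(round(slide_rule_multiply(6, 7), 6))   # 42.0
print(round(slide_rule_divide(84, 2), 6))    # 42.0
```

On a physical rule the addition of logarithms is done by sliding one scale against the other; the rounding here only hides floating-point noise.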
Pascal’s Arithmetic Engine Blaise Pascal. A mechanical calculator for addition and subtraction.
Leibniz’s Step Reckoner Gottfried von Leibniz, 1670. Could add, subtract, multiply, divide, and take square roots.
Jacquard’s Punch Card Joseph Marie Jacquard, 1805. Punched cards were used to operate the loom; the loom could be reprogrammed by changing the cards.
Babbage’s Engines Charles Babbage held the same chair at Cambridge as Newton and Hawking. He designed the Difference Engine and later the Analytical Engine: brass gears and strings of punched cards, driven by steam. The Analytical Engine was never built.
The World’s First Programmer Lady Ada Byron, Countess of Lovelace, understood Babbage’s Analytical Engine and saw it as what we would now call a general-purpose computer. Her notes anticipate future developments, including computer-generated music.
Hollerith’s Tabulating Machine Herman Hollerith invented a punched-card device to help analyse the 1890 US census data. He founded the Tabulating Machine Company, which later merged with other firms to form IBM.
MIT Differential Analyzer 1930s. Purpose: to solve differential equations. Mechanical computation, with the first use of vacuum tubes for memory. Programmed by aligning gears on shafts.
Alan Turing Developed the theory of computability and the “Turing Machine” model – a simple but elegant mathematical model of a general-purpose computer (~1936). Helped crack German codes in WWII.
Konrad Zuse 1936: Z1, the first binary computer, built from Erector Set parts, with a keyboard and lights for output (mechanical memory). 1938: Z2, using punched tape and relays. Z1
Vacuum Tubes Generation 1 Atanasoff-Berry Computer – the first electronic digital computer? Binary numbers, direct logic for calculation, regenerative memory. A prototype was built first; the full-scale model took years more to complete. One operation per 15 seconds; 300 vacuum tubes, 700 pounds, a mile of wire. ABC Prototype
The first computers (cont.) 1943 British Colossus – the first all-electronic computer? (2,400 vacuum tubes). Deciphered coded German messages at 5,000 chars/sec. At its peak, 10 machines ran 24 hours a day. A German Enigma coding machine
The first computers (cont.) Aiken at Harvard/IBM: the “Mark 1” – the first electromechanical digital computer (electromagnetic relays – magnets open and close metal switches); in effect a realization of Babbage’s Analytical Engine. 8 ft tall, 50 ft long, 1 million parts. Three 23-decimal-digit additions per second; storage for 23-digit numbers.
ENIAC (1946) 18,000 tubes, 1500 sq ft. Programmed by plugging wires into panels. 5,000 decimal-digit additions/sec; twenty 10-decimal-digit “accumulators”.
Von Neumann and ENIAC 1945: Von Neumann proposes EDVAC – the Electronic Discrete Variable Computer. A computer should use binary, have stored programs, and be function-oriented.
UNIVAC-1 The world’s first commercially available (non-military) computer “I think there is a world market for about five computers” –Thomas J. Watson, IBM Chairman
Transistors Generation 2 Transistors replace vacuum tubes; size and cost decreased while speed increased. In the 1960s IBM sells large mainframe computers, the 700 series, to businesses. Mainframes run operating systems that allow many dumb terminals to be attached. Typical business applications are custom-written and run in batch mode.
Integrated Circuits Generation 3 Integrated circuits contain many transistors on one chip. In 1971 Intel produces the 4004, a chip with all the circuitry for a calculator.
VLSI Generation 4 Mid-1970s: very large scale integration. 1977: Apple, started by Steve Jobs and Steve Wozniak, sells a personal computer for hobbyists. 1981: IBM creates the PC to sell to businesses. The PC is widely cloned and becomes widely accepted as prices drop; PCs and clones use a text-based operating system called DOS to run programs. 1984: Apple releases the Macintosh, with a graphical user interface. IBM PC, c1982
Programming Language History Programming languages instruct computers what to do. Charles Babbage’s Difference Engine could only be made to execute tasks by changing the gears which carried out the calculations. The US government’s ENIAC could only be “programmed” by presetting switches and rewiring the entire system for each new “program” or calculation.
Programming Language History Generation 1 Late 40’s / early 50’s: programmers coded directly in machine language, writing statements in 0’s and 1’s by hand.
Programming Language History Generation 2 Mid 1950’s: assembly languages replaced numeric codes with mnemonic names. An assembler is a program that translates assembly code into machine code – input: an assembly language program; output: a machine language program. Still low-level and machine-specific, but easier to program. In 1951, Grace Hopper (later a US Navy Rear Admiral) wrote the first compiler, A-0, which turned English-like instructions into 0’s and 1’s. Example: SPARC assembly produced by gcc for a C++ “Hello world!” program:
gcc2_compiled.:
.global _Q_qtod
.section ".rodata"
.align 8
.LLC0:
.asciz "Hello world!"
.section ".text"
.align 4
.global main
.type main,#function
.proc 04
main:
!#PROLOGUE# 0
save %sp,-112,%sp
!#PROLOGUE# 1
sethi %hi(cout),%o1
or %o1,%lo(cout),%o0
sethi %hi(.LLC0),%o2
or %o2,%lo(.LLC0),%o1
call __ls__7ostreamPCc,0
nop
mov %o0,%l0
mov %l0,%o0
sethi %hi(endl__FR7ostream),%
or %o2,%lo(endl__FR7ostream),%
call __ls__7ostreamPFR7ostream_R7ostream,0
nop
mov 0,%i0
b .LL230
nop
.LL230:
ret
restore
.LLfe1:
.size main,.LLfe1-main
.ident "GCC: (GNU) 2.7.2"
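The mnemonic-to-machine-code translation an assembler performs can be sketched in a few lines. This is a toy: the mnemonics, opcodes, and 8-bit instruction format below are invented for illustration, not any real instruction set.

```python
# Toy assembler sketch: maps invented mnemonics to 4-bit opcodes and
# packs each instruction into an 8-bit word (opcode << 4 | operand),
# the way real assemblers spared programmers from writing raw 0's and 1's.
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011, "HALT": 0b1111}

def assemble(lines):
    """Translate 'MNEMONIC [operand]' lines into 8-bit machine words."""
    words = []
    for line in lines:
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        words.append((OPCODES[mnemonic] << 4) | (operand & 0b1111))
    return words

program = ["LOAD 2", "ADD 3", "STORE 7", "HALT"]
print([format(w, "08b") for w in assemble(program)])
# ['00010010', '00100011', '00110111', '11110000']
```

Real assemblers add symbol tables, labels, and addressing modes, but the core job is exactly this table lookup plus bit packing.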
Programming Language History Generation 3 In 1957, IBM creates the first of the major languages, FORTRAN. Its name stands for FORmula TRANslating system. Designed for scientific computing: an excellent language for scientific work, but with difficult input/output operations.
Programming Language History In 1958, John McCarthy of MIT created the LISt Processing (or LISP) language, designed for Artificial Intelligence (AI) research. Because it was designed for such a highly specialized field, its syntax resembles little seen before or since. Still in use today for AI research; its offspring include Scheme.
Programming Language History 1959: COBOL was developed for businesses. COBOL statements have a very English-like grammar, making the language quite easy to learn. Much better input/output than FORTRAN, permitting business applications. Highly successful, and still used on many IBM mainframe computers even today.
Programming Language History The BASIC language was developed in 1964 by John Kemeny and Thomas Kurtz. BASIC is a very limited language, designed for people outside computer science. Many versions of BASIC were developed; Bill Gates and his partner Paul Allen started their business by writing a version of BASIC for a hobby computer. Their company, Microsoft, later grew dramatically after licensing the DOS operating system to IBM.
Programming Language History Pascal was begun in 1968 by Niklaus Wirth, mainly out of the need for a good teaching tool. Designed with a very orderly approach, it combined many of the best features of the languages then in use: COBOL, FORTRAN, and ALGOL.
Programming Language History C was developed in 1972 by Dennis Ritchie while working at Bell Labs in New Jersey. The transition in usage from the first major languages to the major languages of today occurred with the transition from Pascal to C. C was built to be fast and powerful, at the expense of being hard to read. Ritchie developed C for the new Unix system being created at the same time. C is very commonly used to program operating systems such as Unix, Windows, the Mac OS, and Linux.
Programming Language History In the late 1970’s and early 1980’s, a new programming method was being developed called Object Oriented Programming, or OOP. Bjarne Stroustrup liked this method and developed extensions to C known as C++. C++ was designed to organize the raw power of C using OOP while maintaining the speed of C and the ability to run on many different types of computers. C++ is most often used in simulations, such as games.
Programming Language History Visual Basic 1 was released by Microsoft in 1991. It combined QuickBasic (Microsoft’s version of BASIC) with a graphical design tool for creating the user interface (originally developed by Alan Cooper), and introduced an event-driven programming paradigm.
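In the event-driven model, code sits in handlers that run only when the user interface raises an event (a button click, a key press), rather than executing top to bottom. A minimal sketch of that idea; the event names and functions here are invented, not any Visual Basic API:

```python
# Minimal event-driven sketch: handlers register for named events and
# run only when the event fires, as in VB's Button_Click procedures.
handlers = {}

def on(event_name):
    """Decorator that registers a function as the handler for an event."""
    def register(fn):
        handlers[event_name] = fn
        return fn
    return register

def fire(event_name, *args):
    """Simulate the UI raising an event; runs the matching handler."""
    if event_name in handlers:
        return handlers[event_name](*args)

@on("Button1_Click")
def button1_click():
    return "Hello from Button1!"

print(fire("Button1_Click"))  # Hello from Button1!
```

The program's flow is driven by which events arrive, not by a fixed sequence of statements; that inversion is what made VB's model feel so different from classic BASIC.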
Programming Language History In the early 1990's, interactive TV was the technology of the future. Sun Microsystems decided that interactive TV needed a special, portable (can run on many types of machines), language. This language eventually became Java. In 1994, the Java project team changed their focus to the web, which was becoming "the cool thing" after interactive TV failed. The next year, Netscape licensed Java for use in their internet browser, Navigator. At this point, Java became the language of the future.
Programming Language History Generation 4 Often abbreviated 4GL, fourth-generation languages are programming languages closer to human languages than typical 3rd-generation languages. In 1969, a language called RAMIS was released. Most 4GLs are used to access databases, doing in a few lines of code what would require hundreds of lines of COBOL or C. For example, a typical 4GL command is FIND ALL RECORDS WHERE NAME IS "SMITH"
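The declarative style of that 4GL command can be mimicked in a few lines: you state *which* records you want, not *how* to loop over files and parse them. The table and function names below are invented for illustration:

```python
# A hypothetical in-memory table and a one-line declarative query,
# mimicking the 4GL command: FIND ALL RECORDS WHERE NAME IS "SMITH".
# In a 3GL the same job would need explicit file handling, loops,
# and record parsing.
records = [
    {"name": "SMITH", "dept": "SALES"},
    {"name": "JONES", "dept": "IT"},
    {"name": "SMITH", "dept": "HR"},
]

def find_all(table, **criteria):
    """Return every record matching all of the given field=value criteria."""
    return [row for row in table
            if all(row.get(field) == value for field, value in criteria.items())]

print(find_all(records, name="SMITH"))
# [{'name': 'SMITH', 'dept': 'SALES'}, {'name': 'SMITH', 'dept': 'HR'}]
```

A real 4GL (or SQL) pushes this further: the query engine, not the programmer, decides how to scan, index, and filter the data.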