Early Computer History


1 Early Computer History
How it all began

2 Early Computer History
Pascaline, 1642: the first accurate mechanical calculator; created by Blaise Pascal; used to add, subtract, multiply, and divide. Jacquard Loom, 1801: created by Joseph Jacquard; a machine that automated the weaving of complex patterns; used holes punched in cards to automate the process.
The Pascaline was the first accurate mechanical calculator. This machine, created by the French mathematician Blaise Pascal in 1642, used revolutions of gears to count by tens. The Pascaline could be used to add, subtract, multiply, and divide. The basic design of the Pascaline was so sound that it lived on in mechanical calculators for more than 300 years. Some 160 years later, Joseph Jacquard revolutionized the fabric industry by creating a machine that automated the weaving of complex patterns. Although not a counting or calculating machine, the Jacquard Loom was significant because it relied on stiff cards with punched holes to automate the process. Much later this process would be adopted as a means to record and read data by using punch cards in computers.

3 Early Computer History
Analytical Engine, 1834: created by Charles Babbage, the father of computing; the first automatic calculator; includes components similar to those found in today's computers. Hollerith Tabulating Machine, 1890: created by Herman Hollerith; used punch cards to tabulate census data; Hollerith started the Tabulating Machine Company, which later became IBM.
Decades later, in 1834, Charles Babbage designed the first automatic calculator, called the Analytical Engine. Although it was never built, Babbage's detailed drawings and descriptions of the machine include components similar to those found in today's computers, including the store (RAM), the mill (central processing unit), and input and output devices. This invention earned Charles Babbage the title of the "father of computing." In 1890, Herman Hollerith developed a machine called the Hollerith Tabulating Machine that used punch cards to tabulate census data. Hollerith left the Census Bureau in 1896 to start the Tabulating Machine Company, which later changed its name to International Business Machines, or IBM.

4 Early Computer History
Z1, 1936: created by Konrad Zuse; a mechanical calculator; it included a control unit and memory functions. Atanasoff-Berry Computer, 1939: created by John Atanasoff and Clifford Berry; the first electrically powered digital computer; used vacuum tubes to store data; the first computer to use the binary system.
German inventor Konrad Zuse is credited with creating a mechanical calculator called the Z1 in 1936. The Z1 is thought to be the first computer to include features that are integral to today's systems, including a control unit and separate memory functions. In 1939, John Atanasoff and Clifford Berry built the first electrically powered digital computer, called the Atanasoff-Berry Computer (ABC). The computer was the first to use vacuum tubes to store data. Most important, the ABC was the first to use the binary system.

5 Early Computer History
Harvard Mark I, 1944: created by Howard Aiken and Grace Hopper; a computer used by the US Navy for ballistics calculations. Hopper's contributions to computing: invention of the compiler; coined the term "computer bug." Turing Machine, 1936: created by Alan Turing; a hypothetical model that defined a mechanical procedure or algorithm; its concept of an infinite tape that could be read, written to, and erased was the precursor to today's RAM.
From the late 1930s to the early 1950s, Howard Aiken and Grace Hopper designed the Mark series of computers at Harvard University. The U.S. Navy used these computers for ballistic and gunnery calculations. Aiken, an electrical engineer and physicist, designed the computer, while Hopper did the programming. The Harvard Mark I, finished in 1944, could perform all four arithmetic operations (addition, subtraction, multiplication, and division). Hopper's greatest contribution was the invention of the compiler, a program that translates English-language instructions into computer language. Hopper was also the first to "debug" a computer when she removed a moth that had flown into the Harvard Mark I. After that, problems that caused a computer to not run were called "bugs." In 1936, British mathematician Alan Turing created an abstract computer model that could perform logical operations. The Turing Machine was not a real machine but rather a hypothetical model that mathematically defined a mechanical procedure (or algorithm). Turing's concept described a process by which the machine could read, write, or erase symbols written on squares of an infinite paper tape. This concept of an infinite tape that could be read, written to, and erased was the precursor to today's RAM.

6 Early Computer History
ENIAC, 1945: created by John W. Mauchly and J. Presper Eckert; the first successful high-speed electronic digital computer. UNIVAC, 1951: the first commercially successful electronic digital computer; used magnetic tape.
The Electronic Numerical Integrator and Computer (ENIAC) was created by John W. Mauchly and J. Presper Eckert and was placed in operation in 1945. The ENIAC is generally thought of as the first successful high-speed electronic digital computer. The Universal Automatic Computer, or UNIVAC, was the first commercially successful electronic digital computer. Completed in 1951, the UNIVAC operated on magnetic tape (as opposed to its competitors, which ran on punch cards).

7 Early Computer History
Transistors, 1947: invented at Bell Laboratories; replaced vacuum tubes. Integrated circuits, 1958: invented by Jack Kilby of Texas Instruments; a small chip containing thousands of transistors; enabled computers to become smaller and lighter.
In 1947, scientists at the Bell Telephone Laboratories invented the transistor as a means to store data. The transistor replaced the bulky vacuum tubes of earlier computers and was smaller and more powerful. In 1958, Jack Kilby, while working at Texas Instruments, invented the world's first integrated circuit, a small chip capable of containing thousands of transistors. This consolidation in design enabled computers to become smaller and lighter.

8 Early Computer History
Microprocessor chip, 1971: created by Intel Corporation; a single chip containing the circuitry of the central processing unit (CPU).
In 1971, the Intel Corporation introduced the microprocessor chip, which placed the circuitry of a CPU on a single small chip. Intel's first microprocessor, the 4004, contained only about 2,300 transistors; today's microprocessors contain billions. The microprocessor functions as the CPU.

9 Computer Generations
First-generation computers (1946–1958): use vacuum tubes to store data (e.g., ENIAC, UNIVAC). Second-generation computers (1959–1964): use transistors to store data. Third-generation computers (1965–1970): use integrated circuits. Fourth-generation computers (1971–today): use a microprocessor chip.
Computers have been classified into four generations. The first generation, like the ENIAC and UNIVAC, used vacuum tubes for memory. The second generation used transistors. Integrated circuits were the hallmark of third-generation computers. Fourth-generation computers use the microchip.

10 Intel 8080 and the Altair 8800
The first microcomputer; sold as a kit; switches for input, lights for output; Gates and Allen create a version of BASIC; MITS receives 4,000 orders.
A company called MITS created a kit computer based on the Intel 8080 chip, called the Altair 8800. It appeared on the cover of Popular Electronics in 1975, and soon there were hundreds of orders a month for the $395 box of parts. The computer had switches for input and lights for output and a total of 256 bytes of memory (today's computers have billions of bytes). Bill Gates and Paul Allen saw the article; Gates left Harvard, and the pair moved to Albuquerque, NM, convincing MITS that they could write a version of the BASIC programming language to run on the Altair. Their success led to the formation of Microsoft.

11 Beginner's All-Purpose Symbolic Instruction Code (BASIC)
Revolutionized the software industry; a programming language that beginners could easily learn; key language of the PC; Bill Gates and Paul Allen used BASIC to write the program for the Altair; led to the creation of Microsoft.
The software industry began in the 1950s with the development of programming languages such as FORTRAN, ALGOL, and COBOL. These languages were used mainly by businesses to create financial, statistical, and engineering programs for corporate enterprises. The 1964 introduction of Beginner's All-Purpose Symbolic Instruction Code (BASIC) revolutionized the software industry. BASIC was a programming language that the beginning programming student could easily learn. It thus became enormously popular and the key language of the PC. In fact, Bill Gates and Paul Allen used BASIC to write the program for the Altair. This program led to the creation of Microsoft, a company that produced software for the microcomputer.

12 Apple I and Apple II
Apple I built by Steve Wozniak in 1976; Apple II introduced in 1977; uses the MOS Technology 6502 processor; first fully contained microcomputer; highly successful.
In 1976, Steve Wozniak and Steve Jobs met at the Homebrew Computer Club. Wozniak had built a computer based on the inexpensive MOS Technology 6502 microprocessor that worked well. Jobs joined up with him, and together they founded the Apple Computer Company and developed the sensational Apple II, the first fully contained microcomputer in a plastic box, looking like a piece of consumer electronics. The machine was an instant success, selling for about $1,300.

13 Early Competitors: Commodore, TRS-80, Osborne
Around the time Apple was experiencing success with its computers, a number of competitors entered the market. Among Apple's strongest competitors were the Commodore PET 2001 and Tandy RadioShack's TRS-80. Commodore introduced the PET in January 1977. It was featured on the cover of Popular Science in October 1977 as the "new $595 home computer." Tandy RadioShack's home computer also garnered immediate popularity. Just one month after its release in 1977, the TRS-80 Model 1 sold approximately 10,000 units. Priced at $599.95, the easy-to-use machine included a monochrome display and 4 KB of memory. The Osborne Company introduced the Osborne 1 in April 1981 as the industry's first portable computer. Although portable, the computer weighed 24.5 pounds, and its screen was just 5 inches wide. In addition to its hefty weight, it came with a hefty price tag of $1,795. Still, the Osborne included 64 KB of memory, two floppy disk drives, and preinstalled software programs (such as word processing and spreadsheet software). The Osborne was an overnight success, with sales quickly reaching 10,000 units per month.

14 IBM PC
IBM enters the small computer market in 1981; uses open architecture; purchases its operating system from Microsoft.
As the microcomputer market grew, everyone waited to see if the dominant computer company, IBM, would enter the market. It did in 1981, but veered from its normal corporate philosophy by developing a product from off-the-shelf parts. Only one essential part of the IBM PC was proprietary: the ROM-BIOS. Two choices IBM made changed the world of computing forever: one was choosing the Intel microprocessor, the other was licensing its operating system from Microsoft. The IBM PC made it acceptable for American business to buy small computers, and the sensational sales led software developers to write for it. The IBM PC became the standard on which all Intel/Microsoft PCs would be based. Unfortunately for IBM, its machine was cloned by Compaq in 1983. IBM had created an empire for Intel and Microsoft.

15 Graphical User Interface
Xerox Alto (Xerox Palo Alto Research Center): 1973. Apple Lisa: 1983. Macintosh: 1984.
Another important advancement in PCs was the introduction of the graphical user interface (GUI), which allowed users to interact with the computer more easily. In 1972, Xerox was hard at work in its Palo Alto Research Center designing a PC of its own. Named the Alto and completed in 1973, it included a mouse and a file management system with directories and folders. For a variety of reasons, Xerox never sold the Alto commercially. In 1983, Apple introduced the Lisa, the first PC brought to market with a GUI. It incorporated a user interface similar to the Alto's, including features such as windows, drop-down menus, icons, a file system with folders and files, and a mouse. In 1984, Apple introduced the Macintosh, which was everything the Lisa was and then some, at about a third of the cost. The Macintosh was also the first personal computer to introduce 3.5-inch floppy disks with a hard cover.

16 The Internet Boom: Mosaic, Netscape, Internet Explorer, Windows 95
The GUI made it easier for users to work on the computer. The Internet provided another reason for consumers to buy computers. Now they could conduct research and communicate with each other in a new and convenient way. In 1993, the Web browser Mosaic was introduced. This browser allowed users to view multimedia on the Web, causing Internet traffic to increase by nearly 350 percent. In 1994, members of the Mosaic development team developed the commercial Web browser Netscape Navigator. Netscape's popularity grew quickly, and it soon became a predominant player in browser software. Meanwhile, companies discovered the Internet as a means to do business, and computer sales took off. IBM-compatible PCs became the PC system of choice when, in 1995, Microsoft introduced Internet Explorer, a Web browser that integrated Web functionality into Microsoft Office applications, and Windows 95, the first Microsoft OS designed principally as a GUI OS.

17 Computer Hardware: Central Processing Unit & Random Access Memory

18 The CPU: Processing Digital Information
CPU is the brains of the computer. Different types of CPUs: Intel and AMD chips, used in most Windows-based PCs; Apple systems used a different CPU design. Differentiating CPUs: processing power; clock speed and cache.
The central processing unit (CPU, or processor) executes every instruction given to your computer. The entire CPU fits on a tiny chip, called the microprocessor, which contains all of the hardware responsible for processing information, including millions of transistors. The CPU is located in the system unit on the computer's motherboard. Both Intel and AMD chips are used in the majority of Windows-based PCs. Apple computer systems used a different CPU design: PowerPC chips such as the G4 and G5 powered Apple machines until 2005, when Apple announced that all of its systems would be redesigned to use Intel CPUs. The primary distinction between CPUs is processing power, which is determined by the number of transistors on each CPU. The greatest differentiators are how quickly the processor can work (called its clock speed) and the amount of immediate-access memory the CPU has (called its cache memory).

19 The Control Unit Manages the switches inside the CPU
Is programmed by CPU designers to remember the sequence of processing stages for that CPU. Moves each switch to its correct setting (on or off) and then performs the work of that stage.
The control unit of the CPU manages the switches inside the CPU. It is programmed by CPU designers to remember the sequence of processing stages for that CPU and how each switch in the CPU should be set, on or off, for each stage. As soon as the system clock says so, the control unit moves each switch to its correct setting (on or off) and then performs the work of that stage.

20 The Arithmetic Logic Unit (ALU)
Part of the CPU designed to perform mathematical operations (addition, subtraction, multiplication, division, etc.); also performs logical OR, AND, and NOT operations; is fed data from the CPU registers. Word size: number of bits a computer can work with at a time.
The arithmetic logic unit (ALU) is the part of the CPU designed to perform mathematical operations such as addition, subtraction, multiplication, and division, and to compare values as greater than, less than, or equal to. The ALU also performs logical OR, AND, and NOT operations. The ALU is specially designed to execute such calculations flawlessly and with incredible speed. The ALU is fed data from the CPU's registers. The amount of data a CPU can process at a time is based in part on the amount of data each register can hold. The number of bits a computer can work with at a time is referred to as its word size. Therefore, a 64-bit processor can process more information at a time than a 32-bit processor, as the sketch below illustrates.
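To make the ALU's job concrete, here is a minimal Python sketch of the kinds of operations described above, performed on a fixed word size. The 8-bit word, the sample values, and the masking scheme are illustrative assumptions, not how any particular ALU is wired.

```python
# Toy illustration of ALU-style operations on a fixed word size.
# An 8-bit word size is assumed purely for readability; real CPUs
# typically use 32- or 64-bit words.
WORD_SIZE = 8
MASK = (1 << WORD_SIZE) - 1    # 0b11111111 keeps results within one word

a, b = 0b00101101, 0b00001111  # sample operands (45 and 15)

add = (a + b) & MASK   # arithmetic: addition, truncated to the word size
sub = (a - b) & MASK   # subtraction (wraps around, like real hardware)
and_ = a & b           # logical AND
or_ = a | b            # logical OR
not_a = ~a & MASK      # logical NOT, masked back to 8 bits

print(f"add={add:08b} sub={sub:08b} and={and_:08b} or={or_:08b} not={not_a:08b}")
print(a > b, a < b, a == b)    # comparisons: results feed the CPU's status flags
```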

21 Registers Small areas of storage in the CPU
Hold data and results of current operations. Hold the current instruction. Hold the address in memory of the next instruction to execute.

22 The CPU Machine Cycle Fetch Decode Execute Store
Fetch: the program's binary code is "fetched" from its temporary location in RAM and moved to the CPU. Decode: the program's binary code is decoded into commands the CPU understands. Execute: the ALU performs the calculations. Store: the results are stored in the registers.
Any program you run on your computer is actually a long series of binary code, 1s and 0s, describing a specific set of commands the CPU must perform. Each CPU is a bit different in the exact steps it follows to perform its tasks, but all CPUs must perform a series of similar general steps. These steps, referred to as a CPU machine cycle (or processing cycle), are: 1. Fetch: When any program begins to run, the 1s and 0s that make up the program's binary code must be "fetched" from their temporary storage location in RAM and moved to the CPU before they can be executed. 2. Decode: Once the program's binary code is in the CPU, it is decoded into the commands the CPU understands. 3. Execute: Next, the CPU actually performs the work described in the command. Specialized hardware on the CPU performs addition, subtraction, multiplication, division, and other mathematical and logical operations at incredible speeds. 4. Store: The result is stored in registers, special memory storage areas built into the CPU, which are the most expensive, fastest memory in your computer. The CPU is then ready to fetch the next set of bits encoding the next instruction.
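As an illustration of the four stages, the following Python sketch runs a fetch-decode-execute-store loop for an invented toy machine. The opcodes, the program, and the single accumulator register are all hypothetical, chosen only to make the cycle visible.

```python
# Minimal fetch-decode-execute-store loop for an invented toy machine.
# Opcodes (hypothetical): 1 = ADD operand, 2 = MUL operand, 0 = HALT.
ram = [1, 5, 2, 3, 1, 2, 0]   # program in "RAM": ADD 5, MUL 3, ADD 2, HALT
pc = 0                        # program counter register
acc = 0                       # accumulator register holds results

while True:
    opcode = ram[pc]              # FETCH the next instruction from RAM
    if opcode == 0:               # DECODE: HALT stops the machine
        break
    operand = ram[pc + 1]         # fetch the instruction's operand
    if opcode == 1:               # DECODE: ADD
        acc = acc + operand       # EXECUTE in the "ALU"
    elif opcode == 2:             # DECODE: MUL
        acc = acc * operand
    pc += 2                       # STORE: result stays in acc; move on

print(acc)  # (0 + 5) * 3 + 2 = 17
```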

23 The System Clock Located on the motherboard
Controls the CPU's processing cycles. Clock cycle: pulse or tick. Clock speed: number of pulses per second, measured in hertz (Hz).
To move from one stage of the machine cycle to the next, the motherboard contains a built-in system clock. This internal clock is a special crystal that controls when the CPU moves to the next stage of processing. These "ticks" of the system clock, known as the clock cycle, set the pace by which the computer moves from process to process. The pace, known as clock speed, is measured in hertz (Hz). Today's system clocks are measured in gigahertz (GHz), or billions of clock ticks per second.
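Because clock speed is simply ticks per second, the time available per tick is its reciprocal. The short calculation below uses an assumed 3 GHz clock as an example figure:

```python
clock_speed_hz = 3_000_000_000    # assumed example: a 3 GHz system clock
cycle_time_s = 1 / clock_speed_hz # seconds available per clock tick

print(f"{cycle_time_s * 1e9:.3f} ns per tick")        # ~0.333 ns per tick
print(f"{clock_speed_hz / 1e9:.0f} billion ticks per second")
```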

24 Making Computers Faster
Dual processing: two CPUs on the same system; each processor shares the workload. Parallel processing: a network of computers; each computer works on a portion of the problem simultaneously.
Although the vast majority of home and work desktop systems today use only one processor, it is becoming increasingly common to use dual processors. A dual-processor design has two separate CPU chips installed on the same system. Operating systems such as Windows XP Professional and Apple's Mac OS X are able to work with dual processors and automatically decide how to share the workload between them. Certain types of problems are well suited to a parallel-processing environment. In parallel processing, there is a large network of computers, with each computer working on a portion of the same problem simultaneously. To be a good candidate for parallel processing, a problem must be one that can be divided into a set of tasks that can be run simultaneously. If the next step of an algorithm can be started only after the results of the previous step have been computed, parallel processing will present no advantages.
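As a sketch of the parallel-processing idea in miniature, the Python example below splits one divisible problem (summing a range of numbers) into chunks that worker processes compute simultaneously; the range and chunk size are arbitrary assumptions.

```python
# Sketch of parallel processing: split one problem into independent
# chunks and let several worker processes compute them simultaneously.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n = 10_000_000
    step = 2_500_000
    chunks = [(i, min(i + step, n)) for i in range(0, n, step)]
    with Pool() as pool:               # one worker per CPU by default
        total = sum(pool.map(partial_sum, chunks))
    print(total == sum(range(n)))      # same answer as the serial version
```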

25 Making Computers Faster
Pipelining: the CPU processes more than one instruction at a time.
One method found to speed up a CPU is called pipelining. As an instruction is processed, the CPU runs through the four stages of processing in sequential order: fetch, decode, execute, store. Pipelining is a technique that allows the CPU to work on more than one instruction (or stage of processing) at a time, thereby boosting CPU performance, and it is used in some fashion in all modern CPUs. Instead of the CPU carrying out one step of the machine cycle on every pulse of the system clock, the CPU performs different parts of the cycle simultaneously, theoretically making the CPU four times faster. Chip makers have also designed special instruction sets to handle multimedia content, again speeding up the overall performance of the system.
Non-pipelined CPU:
  Instruction 1: Fetch  Decode  Execute  Store
  Instruction 2:                                Fetch  Decode  Execute  Store
Pipelined CPU:
  Instruction 1: Fetch  Decode  Execute  Store
  Instruction 2:        Fetch  Decode  Execute  Store
  Instruction 3:               Fetch  Decode  Execute  Store
  Instruction 4:                      Fetch  Decode  Execute  Store
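The toy Python loop below prints that pipelined schedule explicitly: with four stages, four instructions complete in 7 ticks rather than the 16 a non-pipelined CPU would need. The stage names follow the slide; everything else is illustrative.

```python
# Toy pipeline schedule: instruction i enters the pipeline on tick i,
# so with 4 stages, 4 instructions finish in 4 + (4 - 1) = 7 ticks
# instead of the 4 * 4 = 16 ticks a non-pipelined CPU would need.
STAGES = ["Fetch", "Decode", "Execute", "Store"]
NUM_INSTRUCTIONS = 4

for tick in range(NUM_INSTRUCTIONS + len(STAGES) - 1):
    active = []
    for instr in range(NUM_INSTRUCTIONS):
        stage = tick - instr               # which stage this instruction is in
        if 0 <= stage < len(STAGES):
            active.append(f"I{instr + 1}:{STAGES[stage]}")
    print(f"tick {tick + 1}: " + "  ".join(active))
```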

26 Moore’s Law Number of transistors on a CPU will double every 18 months
The first 8086 chip had 29,000 transistors; a Pentium chip has 169,000,000 transistors; Moore's Law has been accurate for over 25 years.
Gordon Moore, the cofounder of processor manufacturer Intel, predicted more than 25 years ago that the number of transistors on a processor would double every 18 months. Known as Moore's Law, this prediction has been remarkably accurate, but only with tremendous engineering ingenuity. The first 8086 chip had only 29,000 transistors and ran at 5 MHz. Advances in the number of transistors on processors through the 1970s, 1980s, and 1990s continued to align with Moore's prediction.
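Moore's prediction can be written as a formula: transistors after t months = base * 2^(t / doubling period). The quick check below plugs in the slide's own figures (the 8086's 29,000 transistors and the 18-month doubling period). Note that with these inputs the rule overshoots the slide's 169-million-transistor Pentium figure; a roughly 24-month doubling period fits it much better, which is why the period is often quoted as 18 to 24 months.

```python
# Moore's Law as a formula, using the slide's own figures.
base = 29_000          # transistors on the 8086 (1978)
doubling_months = 18   # doubling period quoted on the slide

for years in (3, 12, 26):
    projected = base * 2 ** (years * 12 / doubling_months)
    print(f"after {years:2d} years: ~{projected:,.0f} transistors")
# A 24-month doubling period over the same 26 years projects ~238 million
# transistors, much closer to the slide's 169,000,000 Pentium figure.
```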

27 Cache Memory Small amount of memory located on the CPU chip or near it
Stores recent or frequently used instructions and data. Used for quick access by the CPU. Different levels of cache.
Cache memory consists of small blocks of memory located directly on and next to the CPU chip. These memory blocks are holding places for recently or frequently used instructions or data that the CPU needs the most. When these instructions or data are stored in cache memory, the CPU can retrieve them more quickly than if it had to access them in RAM. Modern CPU designs include a number of types of cache memory. Level 1 cache is a block of memory that is built onto the CPU chip for the storage of data or commands that have just been used. Level 2 cache is located on the CPU chip but slightly farther away from the CPU, or on a separate chip next to the CPU, and therefore takes somewhat longer to access. Some newer CPUs have an additional third level of cache memory storage, called Level 3 cache.
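The underlying logic is "check the small fast store first, fall back to the big slow one." The Python sketch below is an analogy only: dictionaries stand in for cache and RAM, and the eviction policy is deliberately crude; real caches are hardware-managed and organized into lines and levels.

```python
# Analogy only: a small dict standing in for cache in front of slower "RAM".
# Real caches are hardware-managed and track lines of memory, not keys.
ram = {addr: addr * 2 for addr in range(1_000)}   # stand-in for main memory
cache = {}                                        # small, fast store
CACHE_SIZE = 4

def read(addr):
    if addr in cache:                  # cache hit: fast path
        return cache[addr]
    value = ram[addr]                  # cache miss: fetch from slower RAM
    if len(cache) >= CACHE_SIZE:       # tiny capacity: make room
        cache.pop(next(iter(cache)))   # evict the oldest entry (crude policy)
    cache[addr] = value                # keep it nearby for next time
    return value

for addr in (7, 7, 7, 42):             # repeated reads of 7 hit the cache
    print(addr, read(addr))
```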

28 RAM: The Next Level of Temporary Storage
Volatile: when you turn off your computer, the data is erased. Several kinds of RAM exist; each type of RAM has a different design; some types work at much faster speeds; some transfer data more quickly.
Random access memory (RAM) is volatile, meaning that when you turn off your computer, the data stored in RAM is erased. There are several kinds of RAM. Each type of RAM has a very different internal design, allowing some types to work at much faster speeds and to transfer data much more quickly than others.

29 Memory Modules & RAM
Memory modules: SIMM, DIMM. Types of RAM: SRAM, DRAM, SDRAM.
Random access memory (RAM) is your computer's temporary storage space. Although we refer to RAM as a form of storage, RAM is really the computer's short-term memory. As such, it remembers everything that the computer needs to process data into information, such as inputted data and software instructions, but only while the computer is on. This means that RAM is an example of volatile storage. When the power is off, the data stored in RAM is cleared out. This is why, in addition to RAM, systems always include nonvolatile storage devices for permanent storage of instructions and data when the computer is powered off. Hard disks provide the greatest nonvolatile storage capacity in the computer system. Memory modules (or memory cards), the small circuit boards that hold a series of RAM chips, fit into special slots on the motherboard. Most memory modules in today's systems are called dual inline memory modules (DIMMs). Several different types of RAM are available. Currently, DDR SDRAM is very common in the marketplace, although some high-performance systems offer the even faster DDR2 RAM. RAM has a significant impact on the overall performance of your computer.

30 Types of RAM: DRAM Dynamic RAM (DRAM)
Cheapest and most basic type of RAM. Loses its electrical charge, so it needs to be refreshed. Many types of DRAM: SDRAM (synchronous DRAM), DDR SDRAM (double data rate SDRAM).
The cheapest and most basic type of RAM is dynamic RAM (DRAM). It is used in older systems or in systems in which cost is an important factor. To store 1 bit of data inside DRAM, a transistor and a capacitor are used. A transistor is a switch that can be turned on or off. A capacitor is a device that acts like a storage space for the charge coming from the transistor. To store a 1, the transistor is turned to the "on" position and fills the capacitor with charge. If the capacitor is filled with charge just once, it eventually loses that charge; a 1 bit might then be read as a 0, and the data stored there would be corrupted. To make sure each capacitor holding a 1 value is filled with enough charge to be read as a 1 at any time, a refresh signal is applied. The refresh floods current through the open transistors to refill the capacitors so they continue to store a valid 1. A variety of types of DRAM are on the market, each with different performance levels and prices. Synchronous DRAM (SDRAM) is much faster than traditional DRAM. The current standard of DRAM in home systems is double data rate synchronous DRAM (DDR SDRAM). DDR SDRAM is faster than regular SDRAM but not as fast as DDR2 SDRAM, the most recent entry on the market. Each of these types of DRAM increases the speed with which the CPU can access data but also increases the cost of the memory modules.
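As a toy model of why the refresh signal matters, the sketch below leaks a fraction of each capacitor's charge every cycle and tops up the cells on a fixed interval. Every number in it (leak rate, read threshold, refresh interval) is invented for illustration:

```python
# Toy model of DRAM refresh: each cell's capacitor leaks charge over time,
# and a periodic refresh tops up cells storing a 1 before they decay to 0.
# All numbers here (leak rate, threshold, interval) are invented.
LEAK = 0.15          # fraction of charge lost per cycle
THRESHOLD = 0.5      # below this, a stored 1 is misread as 0
REFRESH_EVERY = 4    # refresh interval, in cycles

cells = [1.0, 0.0, 1.0, 1.0]   # capacitor charge: 1.0 = bit 1, 0.0 = bit 0

for cycle in range(1, 13):
    cells = [c * (1 - LEAK) for c in cells]   # charge leaks away
    if cycle % REFRESH_EVERY == 0:            # refresh refills cells still read as 1
        cells = [1.0 if c > THRESHOLD else 0.0 for c in cells]
    bits = [int(c > THRESHOLD) for c in cells]
    charge = " ".join(f"{c:.2f}" for c in cells)
    print(f"cycle {cycle:2d}: charge [{charge}] read as {bits}")
```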

31 Types of RAM: SRAM Static RAM (SRAM)
Does not lose its electrical charge. Faster than DRAM. More expensive than DRAM. Used only in locations like cache memory.
All of the refresh signals required to keep the data "fresh" in DRAM take time. A faster type of RAM is static RAM (SRAM). In SRAM, more transistors are used to store a single bit, but no capacitor is needed. This eliminates the need for a refresh signal, making SRAM much faster than DRAM. However, because it is more expensive than DRAM, it is used only in locations such as the CPU's cache, where the system demands the fastest possible storage.

32 More Memory Types
Read-only memory (ROM). Complementary metal-oxide semiconductor (CMOS). Video RAM (VRAM).

33 Buses: The CPU’s Data Highway
Electrical pathway used to move data between components. Local bus: connects the CPU with the memory. Expansion bus: connects the CPU with peripheral devices.
A bus is an electrical wire in the computer's circuitry: the highway that data (or bits) travels on between the computer's various components. Computers have two different kinds of buses. Local buses are on the motherboard and run between the CPU and the main system memory. Most systems also have another type of bus, called an expansion bus, which expands the capabilities of the computer by allowing a range of different expansion cards (such as video cards and sound cards) to communicate with the motherboard.

34 Bus Performance Bus clock speed Bus width
Bus clock speed: rate at which data moves from one location to another, measured in MHz (millions of clock cycles per second). Bus width: the number of bits of data moved on a bus at any one time, measured in bits (e.g., 16 bits, 32 bits).
Some buses move data along more quickly than others, whereas some can move more data at one time. The rate at which data moves from one location to another, known as bus clock speed, affects the overall performance of the computer. Bus clock speed is measured in units of megahertz (MHz), or millions of clock cycles per second. The width of the bus (or bus width) determines how many bits of data can be sent along a given bus at any one time. The wider the bus, the more data can be sent at one time. Bus width is measured in bits, so a 32-bit bus can carry more data at one time than a 16-bit bus, as the worked example below shows.
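Clock speed and width combine into a bus's peak throughput: bytes per second = clock cycles per second * bits per cycle / 8 bits per byte. The figures below are assumed examples, not any specific bus standard:

```python
def bus_throughput(clock_mhz, width_bits):
    """Peak bytes/second: cycles per second * bits per cycle / 8 bits per byte."""
    return clock_mhz * 1_000_000 * width_bits / 8

# Assumed example figures, not any specific bus standard:
for mhz, bits in [(100, 16), (100, 32), (133, 64)]:
    mb_per_s = bus_throughput(mhz, bits) / 1_000_000
    print(f"{mhz} MHz x {bits}-bit bus: {mb_per_s:,.0f} MB/s peak")
```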

