
1 Computer Science Theory & Introduction
Week 1 Lecture Material – F'13 Revision
Doug Hogan, Penn State University
CMPSC 201 – C++ Programming for Engineers
CMPSC 202 – FORTRAN Programming for Engineers

2 Hardware vs. Software
Hardware: essentially, things you can touch
- input, output, storage devices
- memory
Software: essentially, what the computer knows
- data, 0s and 1s
- programs (this is a software course)

3 Categories of Memory
Read-only memory (ROM): can only read data
Random-access memory (RAM): can read and write information
- primary storage, the computer’s main memory
- volatile

4 Sequential Access vs. Random Access Memory
Sequential access: must access each location in memory in order
Random access: can access memory locations using addresses, in any order
Speed implications?

5 Measuring Memory
Base unit: 1 bit (binary digit), 0 or 1
8 bits = 1 byte (B)
1000 bytes ≈ 1 kilobyte (KB)
1000 KB ≈ 1,000,000 B ≈ 1 megabyte (MB)
1000 MB ≈ 1,000,000,000 B ≈ 1 gigabyte (GB)
1000 GB ≈ 1,000,000,000,000 B ≈ 1 terabyte (TB)
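A minimal C++ sketch of these unit relationships; the constant names and the DVD example are mine (the 4.7 GB figure comes from the next slide):

#include <iostream>

int main() {
    // Decimal (SI-style) units, matching the approximations on the slide.
    const long long BYTE     = 1;                  // 8 bits
    const long long KILOBYTE = 1000 * BYTE;
    const long long MEGABYTE = 1000 * KILOBYTE;    // ~1,000,000 B
    const long long GIGABYTE = 1000 * MEGABYTE;    // ~1,000,000,000 B

    // Express a 4.7 GB DVD in megabytes.
    std::cout << "A 4.7 GB DVD holds about "
              << 4.7 * GIGABYTE / MEGABYTE << " MB\n";   // about 4700 MB
    return 0;
}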

6 Storage Device Capacities
Floppy disk (the old 3.5” ones): 1.44 MB
Compact disc (CD): 650-700 MB
Digital Versatile/Video Disc (DVD): 4.7 GB
Hard disks, flash drives: typical sizes in GB

7 Software Overview
System software: controls basic operations of the computer
The operating system manages memory, files, and application software
File management tasks – deleting, etc.

8 Software Overview
Application software: not essential to the system running; enables you to perform specific tasks
Examples: office software, web browsers, media players, games

9 Algorithms and Languages
An algorithm is a set of instructions to solve a problem. Think recipes.
Many algorithms may solve the same problem. How do we choose?
We use a programming language to explain our algorithms to the computer and write programs.

10 Programming Paradigms/Models
Imperative programming: specify steps to solve the problem; use methods; methods could get long
Object-oriented programming (OOP): create objects to model real-world phenomena; send messages to objects; typically shorter methods
Event-driven programming: create methods that respond to events like mouse clicks, key presses, etc.
Others: functional, logic, etc.
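A minimal C++ sketch contrasting the imperative and object-oriented styles; the Circle example and all names in it are illustrative, not from the slides:

#include <iostream>

// Imperative style: a free function spelling out the steps.
double circleArea(double radius) {
    return 3.14159 * radius * radius;
}

// Object-oriented style: a Circle object models the real-world thing,
// and we "send it a message" by calling its member function.
class Circle {
public:
    explicit Circle(double radius) : radius_(radius) {}
    double area() const { return 3.14159 * radius_ * radius_; }
private:
    double radius_;
};

int main() {
    std::cout << circleArea(2.0) << "\n";   // imperative call
    Circle c(2.0);
    std::cout << c.area() << "\n";          // message sent to an object
    return 0;
}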

11 Compiled vs. Interpreted Languages
Interpreted language
- requires software called an interpreter to run the code
- code is checked for errors as it runs (erroneous code: do the best we can…)
- examples: HTML, JavaScript, PHP
Compiled language
- requires software called a compiler to run the code
- code must be compiled into an executable before running (and thus error-free)
- examples: C, C++, Pascal, Fortran, BASIC

12 Compiling Process
Source code (C++, Fortran, …) is translated by the compiler into object code; the linker then combines that object code with object code from libraries to produce an executable program.
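A minimal sketch of the two steps in C++, with one common toolchain's commands shown in comments; the g++ commands and file names are my assumptions, not from the slides:

// hello.cpp
#include <iostream>

int main() {
    std::cout << "Hello, world!\n";
    return 0;
}

// Compile to object code:           g++ -c hello.cpp       (produces hello.o)
// Link with library object code:    g++ hello.o -o hello   (produces the executable)
// Run the executable:               ./hello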

13 Errors
Syntax errors: misuse of the language, much like using incorrect punctuation in English; the compiler reports them, and the program won’t run until they’re resolved
Logic errors: the program doesn’t solve the problem at hand correctly
Runtime errors: errors that occur while the program is running, e.g. problems accessing memory and files, divide by zero
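A small C++ sketch of the three kinds of errors; the specific statements are my illustrations, not from the slides:

#include <iostream>

int main() {
    // Syntax error (commented out so this file still compiles): missing semicolon.
    // int x = 5

    // Logic error: compiles and runs, but divides by the wrong count,
    // so the answer is simply wrong.
    double average = (10.0 + 20.0 + 30.0) / 2.0;   // should be / 3.0
    std::cout << "Average: " << average << "\n";

    // Runtime error: dividing by zero is only a problem when the program runs.
    int denominator = 0;
    if (denominator != 0) {        // guard added so the sketch is safe to execute
        std::cout << 100 / denominator << "\n";
    }
    return 0;
}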

14 Abstraction
Poll: Who can use a CD player? Who can explain how a CD player works? Who can drive a car? Who is an auto mechanic?
Abstraction: the principle of ignoring details that allows us to use complex devices
Focus on the WHAT, not the HOW
Fundamental to CS
Other examples?

15 Levels of Abstraction
0. Digital Logic
1. Microprocessor
2. Machine Language
3. Operating System
4. Assembly Language
5. High-Level Language
6. Application Software

16 Binary Numbers
Use two symbols: 0 and 1; base 2
Compare with the decimal number system: uses symbols 0, 1, 2, 3, 4, 5, 6, 7, 8, 9; base 10
At the lowest level of abstraction, everything in a computer is expressed in binary.

17 Binary Numbers, ctd.
Counting in binary: 0, 1, 10, 11, 100, 101, 110, 111, 1000, 1001, 1010, 1011, 1100, 1101, 1110, 1111, 10000
Places: decimal uses 1s, 10s, 100s, etc.; binary uses 1s, 2s, 4s, 8s, etc.
To convert binary to decimal, add up the place values (powers of 2) that hold a 1; to convert decimal to binary, repeatedly divide by 2 and record the remainders.
“There are 10 kinds of people in the world…”
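A worked example and a minimal C++ sketch of the decimal-to-binary conversion; the function name is my own:

Example: 13 = 8 + 4 + 1, so 13 in decimal is 1101 in binary.

#include <iostream>
#include <string>

// Convert a non-negative integer to its binary representation
// by repeated division by 2.
std::string toBinary(int n) {
    if (n == 0) return "0";
    std::string bits;
    while (n > 0) {
        bits = std::to_string(n % 2) + bits;   // the remainder is the next bit
        n /= 2;
    }
    return bits;
}

int main() {
    std::cout << toBinary(13) << "\n";   // prints 1101
    return 0;
}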

18 Other Number Systems
Any positive integer could be the base of a number system. (Big topic in number theory.)
Others used in computer science:
- Octal: base 8
- Hexadecimal: base 16; new symbols A, B, C, D, E, F
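A minimal C++ sketch of octal and hexadecimal in practice; the literal values are my own examples:

#include <iostream>

int main() {
    int fromOctal = 017;    // octal literal: 1*8 + 7 = 15
    int fromHex   = 0xFF;   // hexadecimal literal: 15*16 + 15 = 255

    std::cout << fromOctal << " " << fromHex << "\n";                 // prints 15 255
    std::cout << std::oct << 15 << " " << std::hex << 255 << "\n";    // prints 17 ff
    return 0;
}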

19 ASCII Every character on a computer -- letters, digits, symbols, etc. -- is represented by a numeric code behind the scenes. This system of codes is called ASCII, short for American Standard Code for Information Interchange. We’ll learn more in lab…
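A tiny C++ sketch showing the numeric code behind a character; the particular characters are just examples:

#include <iostream>

int main() {
    char letter = 'A';
    // Casting a char to int reveals the code it is stored as ('A' is 65 in ASCII).
    std::cout << letter << " is stored as " << static_cast<int>(letter) << "\n";
    std::cout << '0' << " is stored as " << static_cast<int>('0') << "\n";   // 48
    return 0;
}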

20 # Transistors on a Processor

Processor     Date   Number of Transistors
4004          1971   2,250
8008          1972   2,500
8080          1974   5,000
8086          1978   29,000
286           1982   120,000
386           1985   275,000
486 DX        1989   1,180,000
Pentium       1993   3,100,000
Pentium II    1997   7,500,000
Pentium III   1999   24,000,000
Pentium 4     2000   42,000,000

Data for Intel processors, from Section 4.1 of: Yates, Daniel S., David S. Moore, and Daren S. Starnes. The Practice of Statistics. 2nd ed. New York: Freeman, 2003.

21 A Graphical View
Graph from Intel's web site (http://www.intel.com/technology/mooreslaw/index.htm); retrieved 9/24/2006.
Pay attention to the units on the axes…

22 Moore’s Law
Prediction from Gordon Moore of Intel in 1965 that the number of transistors on a chip would keep doubling at a regular rate.
Implication: the speed of processors doubles roughly every 12 to 18 months.
Exponential relationship in the data.
For the curious: a regression equation can be fit to the data two slides back.
Can this go on forever?
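A quick check of the doubling idea against the table two slides back, as a C++ sketch; the doubling-period arithmetic is standard exponential-growth math, not taken from the slides:

#include <cmath>
#include <iostream>

int main() {
    // From the table: the 4004 (1971) had 2,250 transistors,
    // the Pentium 4 (2000) had 42,000,000.
    double years     = 2000 - 1971;                   // 29 years
    double growth    = 42000000.0 / 2250.0;           // overall growth factor
    double doublings = std::log2(growth);             // about 14 doublings
    std::cout << "Doubling period: about "
              << years / doublings << " years\n";     // roughly 2 years
    return 0;
}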

