
1 CS 101 History and Basics

2 Brief history
Electronic computers
- First ones built in the 1940s
- Military uses: projectile trajectories, code breaking (Enigma)
- Mainly big calculators
- Built on the binary principle: a switch is either on or off (1 or 0)

3 Moore’s Law
Gordon Moore
- Engineer, co-founder of Intel with Robert Noyce
- The “Law”: he observed that the number of transistors on a chip doubles about every two years
- It is not a physical law, just a projection of technological advancement
- Held roughly from 1975 into the 2010s
- Implications for the world economy and consumer expectations

4 Computers all around us
- The first computers were huge and expensive, owned mainly by businesses and governments
- In the 1970s computers were reduced in size by the introduction of “microprocessors”; personal computers like the Apple II, Commodore, Sinclair, and Atari were popular
- Later people bought desktops and then laptops for home use
- Around the turn of the 21st century, more electronic devices became popular: e-books, smartphones, tablets

5 Embedded computers
Many electronic devices have computers inside them
- Not generally meant for humans to interact with directly
- Usually special purpose
- Examples: cars (in fact vehicles of all kinds), dishwashers, security systems, medical devices, ATMs

6 Representing information as bits
- Computers are based on devices that are in one of two states, on or off; usually the on state is called 1 and the off state 0
- One such device stores a bit (binary digit)
- A bit is a very small amount of information, so bits are often grouped into groups of 8, called bytes
- One byte is big enough to store a number from 0 to 255, or one character like “A” or “3” or “?”
- All information in a computer is represented by numbers: characters, colors, sounds, programs, …
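Not on the original slides, but a quick Python sketch of these ideas, using the built-in ord and chr functions:

```python
# A byte holds 8 bits, so it can represent 2**8 = 256 distinct values (0-255).
print(2 ** 8)       # 256

# Characters are stored as numbers; ord() and chr() convert between the two.
print(ord("A"))     # 65
print(chr(65))      # A
```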

7 Character codes
- In the early days of computers, every manufacturer had its own code for characters (HP, IBM, Sperry, Digital)
- The users didn’t care, as long as pressing an A on the keyboard produced an A on the screen
- As time passed, more computers meant more data and more users
- People wanted to share, buy, or sell data with colleagues
- The incompatibilities in the character codes between hardware makers created a problem
- Eventually everyone decided that a “standard code” was needed

8 ASCII
Several codes were considered, but ASCII won!
- American Standard Code for Information Interchange
- The beginnings of the Internet in the 1970s also gave impetus to the desire for a standard code, so that clients didn’t have to know a dozen different codes just to read from different machines across the Net

9 ASCII
So what? Why do I care? Most of the time, you don’t.
- If you press A on the keyboard and get an A on the screen, what does the code matter?
- ASCII is efficient, known by almost every device, easy to transmit and receive
- Has codes for A-Z, a-z, 0-9, space, punctuation marks, and a few control codes
- 128 different codes: the textbook shows ASCII as a seven-bit code; it was later extended to 8 bits (256 codes), so one ASCII character fits into one byte of data
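A small Python check of these properties (not from the slides):

```python
# Standard ASCII is a 7-bit code: every code is below 2**7 = 128.
for ch in "Hello, World!":
    assert ord(ch) < 128

print(ord("A"), ord("a"), ord("0"))    # 65 97 48

# Encoded as ASCII, a string takes exactly one byte per character.
print(len("Hello".encode("ascii")))    # 5
```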

10 Time passes again
- The world starts using computers, not just countries that use the “Roman” alphabet (Asian countries, Eastern Europe, the Middle East)
- First response from computer makers: sorry, learn English, use ASCII
- But eventually the realization came that a new, bigger code was needed to represent more symbols
- Around 1991 Unicode was released: a “Universal Code” with room for over 64K different codes

11 Unicode
Covers all human alphabets and has room for more!
- Includes ASCII as its first 128 codes
- A Unicode character takes TWICE as much space (at least!) as an ASCII character (2 bytes)
- Characters beyond the original 16-bit range take 4 bytes
- Becoming the default code for many applications
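Not from the slides, but Python can show how many bytes a character needs in two common Unicode encodings (the sample characters below are my own illustrative picks):

```python
# Bytes per character in UTF-8 (1-4 bytes) and UTF-16 (2 or 4 bytes).
for ch in ("A", "é", "€", "𝄞"):
    print(ch, len(ch.encode("utf-8")), len(ch.encode("utf-16-le")))
# "A" needs only 1 byte in UTF-8, while the musical symbol 𝄞
# (outside the original 16-bit range) needs 4 bytes in both encodings.
```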

12 Unicode
Note: Unicode is NOT a “translator program”
- What it does do is allow you (if you know a foreign language) to write the foreign words properly spelled, with the correct characters
- They will get transmitted correctly
- The recipient still has to know how to read them, but at least the words will be correctly spelled
- [Chart identifying the various languages]

13 Binary numbers
- A decimal number is “base 10”: it uses 10 different symbols (0, 1, 2, 3, 4, 5, 6, 7, 8, 9)
- A binary number is “base 2”: it uses only 2 different symbols (0, 1)
- Computers do arithmetic with binary numbers, not decimal. Why? It’s easier to build a switch with TWO different states than one with TEN different states
- [Tutorials for binary-to-decimal and decimal-to-binary conversion]
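A minimal Python sketch of the conversions mentioned above (the helper name to_binary is my own, not from the slides):

```python
# Decimal to binary: repeatedly divide by 2 and collect the remainders.
def to_binary(n):
    digits = ""
    while n > 0:
        digits = str(n % 2) + digits  # remainder becomes the next digit
        n //= 2
    return digits or "0"

print(to_binary(13))     # 1101

# Binary to decimal: Python's built-in int() accepts a base argument.
print(int("1101", 2))    # 13
```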

14 Computing and careers
Jobs in computing
- Research
- Software development
- Systems analysis
Jobs in other domains which use computers
- Bioinformatics
- Engineering
- Medicine
- Manufacturing
- Finance

15 Computers and careers
Demand for computer skills is projected to increase
- Software Developer positions: up 17% from 2014 to 2024
- Web Developer positions: up 27% from 2014 to 2024
- Computer Systems Analyst positions: up 21% from 2014 to 2024
Source: Bureau of Labor Statistics
Many low-level jobs are being replaced by automation

