
1 What’s A Bit? Noah Mendelsohn Tufts University Web: COMP 40: Machine Structure and Assembly Language Programming (Spring 2014)

2 © 2010 Noah Mendelsohn Topics  What is information?  What’s a bit?  How do computer memories store information?

3 © 2010 Noah Mendelsohn 3 The History of Information Theory

4 © 2010 Noah Mendelsohn 4 Claude Shannon 1948: Claude Shannon publishes: A mathematical theory of communication* * Photo by Tekniska Museet

5 © 2010 Noah Mendelsohn Questions  We can weigh things that have mass  We can determine the volume of a solid object or a liquid  We can measure the height of the walls in this room  Can we measure information?  Can we distinguish more information from less?  What units could we use? 5

6 © 2010 Noah Mendelsohn 6 Intuition  There is more information in the Library of Congress than there is in a single word of text…  …but how can we prove that rigorously?

7 © 2010 Noah Mendelsohn Crucial insight  Whenever two parties communicate: –We can view the communication as answering one or more questions –Example: you and I are deciding whether to have dinner. We agree in advance that I am going to phone you and give you the shortest possible message to convey the answer. I will say “yes” if we’re having dinner and “no” if not. –Harder example: we also need to decide whether we’re going to the movies. This time, we agree in advance that I will say “yes, yes” for movie and dinner, “yes, no” for dinner only, “no, yes” for movie only, and “no, no” for staying home. 7 This is profound… …communication is answering questions!

8 © 2010 Noah Mendelsohn Crucial insight  Whenever two parties communicate: –We can view the communication as answering one or more questions –Example: you and I are deciding whether to have dinner. We agree in advance that I am going to phone you and give you the shortest possible message to convey the answer. I will say “yes” if we’re having dinner and “no” if not. –Harder example: we also need to decide whether we’re going to the movies. This time, we agree in advance that I will say “yes, yes” for movie and dinner, “yes, no” for dinner only, “no, yes” for movie only, and “no, no” for staying home. 8 This is profound… …communication is choosing among possibilities!

9 © 2010 Noah Mendelsohn Measuring information  Just saying “yes” or “no” isn’t enough…  …we have to agree on what the choices are  The more choices we have to make, the more “yes” or “no” answers we’ll have to communicate 9 We define the ability to convey a single yes/no answer as a bit We define the amount of information as the number of yes/no questions to be answered Shannon’s paper introduced the term bit! (attributing it to co-worker John Tukey)

10 © 2010 Noah Mendelsohn How many bits for Beethoven’s 9th Symphony?  If you and I agree in advance that we are choosing between only two recordings that we both have, then: –We can choose between them with 1 bit!  If we agree in advance only that it is some digital sound recording, then: –We need enough bits so that you can choose the intended sound wave from all possible such 70+ minute recordings* 10 * By the way, the compact disc format was chosen to have enough bits to encode Beethoven’s 9th, at 44.1 kHz x 16 bits/sample x 2 channels, estimated at 74 minutes. Approximately 5,872,025,600 bits

11 © 2010 Noah Mendelsohn Things to notice  We always have to agree in advance what the possible choices are: –Whether we’re having dinner or not –Which of N sound wave forms I want you to reproduce  We always have to agree on which answers (bit values) correspond to which choices  We can use any labels we like for the bit values, e.g. –[yes] will mean Beethoven –[yes yes] will mean dinner and movie  Or… –[true] will mean Beethoven –[true false] will mean dinner and no movie 11

12 © 2010 Noah Mendelsohn 12 What if we want to encode numbers?  We always have to agree in advance what the possible choices are: –Whether we’re having dinner or not –Which of N sound wave forms I want you to reproduce –Which of N numbers I’ve stored in a computer’s memory  Question: What are good labels for encoding numbers?

13 © 2010 Noah Mendelsohn Let’s try some labels for encoding numbers

Encoding          Number encoded
[no, no, no]      0
[no, no, yes]     1
[no, yes, no]     2
[no, yes, yes]    3
[yes, no, no]     4
[yes, no, yes]    5
[yes, yes, no]    6
[yes, yes, yes]   7

13

14 © 2010 Noah Mendelsohn Let’s try some labels for encoding numbers

Encoding                Number encoded
[false, false, false]   0
[false, false, true]    1
[false, true, false]    2
[false, true, true]     3
[true, false, false]    4
[true, false, true]     5
[true, true, false]     6
[true, true, true]      7

14 Any two labels will do… …but do you notice a pattern?

15 © 2010 Noah Mendelsohn Let’s try some labels for encoding numbers

Encoding    Number encoded
[0,0,0]     0
[0,0,1]     1
[0,1,0]     2
[0,1,1]     3
[1,0,0]     4
[1,0,1]     5
[1,1,0]     6
[1,1,1]     7

15 Hey, that’s the binary representation of the number!

16 © 2010 Noah Mendelsohn Encoding numbers in a computer memory  How many bits do I need if we need to encode which of 8 values are in the memory? –1 bit: 0 or 1 [two choices] –2 bits: 00, 01, 10, 11 [four choices] –3 bits: 000, 001, 010, 011, 100, 101, 110, 111 [eight choices]  number_of_choices = 2^N_bits 16 N_bits = log2(number_of_choices)

17 © 2010 Noah Mendelsohn Encoding numbers in a computer memory  How many bits do I need if we need to encode which of 8 values are in the memory? –1 bit: 0 or 1 [two choices] –2 bits: 00, 01, 10, 11 [four choices] –3 bits: 000, 001, 010, 011, 100, 101, 110, 111 [eight choices]  number_of_choices = 2^N_bits  As we said, those are binary numbers! 1×2^2 + 0×2^1 + 1×2^0 = 5  So… that’s why we label the states zero and one: we can play this game to assign bit patterns to binary encodings of numbers 17 N_bits = log2(number_of_choices) Our hardware has instructions to do very efficient arithmetic on these binary representations of numbers

18 © 2010 Noah Mendelsohn 18 Note: we will discuss negative numbers, numbers with fractions, very large and very small numbers, and arithmetic on all of these, at a later time

19 © 2010 Noah Mendelsohn Software structures model real world objects and concepts  Numbers  Students  Bank statements  Photographic images  Sound recordings  Etc. 19 These things aren’t bits!! They don’t live in computers, but…

20 © 2010 Noah Mendelsohn Software structures model real world objects and concepts  Numbers  Students  Bank statements  Photographic images  Sound recordings  Etc. 20 We build data structures that model them...we agree which bit patterns represent them

21 © 2010 Noah Mendelsohn What we’ve learned so far…  Bits encode yes/no choices  To communicate, we agree in advance on which bit patterns represent which choices  More information: more choices…which means more bits!  We can store any information in a computer memory as long as we agree on which bit patterns represent which choice  If we label the bit states 0 and 1, then binary numbers are an obvious representation for the integers  We choose other encodings for characters (e.g. ASCII), photos (pixel on/off), music (digitized wave amplitude) 21

22 © 2010 Noah Mendelsohn 22 How Do We Build Bits into Computers?

23 © 2010 Noah Mendelsohn Building a bit in hardware  We need hardware that can be in a choice of two states  Computer main memory history: –1940s: spots on a TV tube; sound pressure waves in a mercury delay line; vacuum tubes –1950s: rotating magnetic drum; vacuum tubes –1950s – 1970s: tiny magnetizable iron donuts (core memory) –1970s – present: charges on a capacitor driving a transistor  Computer bulk storage –Magnetizable tape –Magnetizable disk –Transistors holding charge or solid state magnetic devices 23 These vary in cost/size/speed – all encode bits

24 © 2010 Noah Mendelsohn Technology for Storing Bits 24 Relay Thyratrons & Vacuum Tubes

25 © 2010 Noah Mendelsohn Technology for Storing Bits 25 Punch Cards Transistors Core Memory Limited Integration: Magnetic Tape Integrated circuit USB Key

26 © 2010 Noah Mendelsohn 26 Binary Numbers

27 © 2010 Noah Mendelsohn Learn your binary numbers 27

N     2^N
20    ~1M (1,048,576)
30    ~1B (1,073,741,824)
32    ~4B (4,294,967,296)
64    HUGE

28–31 © 2010 Noah Mendelsohn Another way to think about binary numbers [diagrams not captured in transcript; the worked example arrives at 11 (decimal)]

32 © 2010 Noah Mendelsohn Another way to think about binary numbers The binary representation encodes a binary search for the number!

33 © 2010 Noah Mendelsohn bits [diagram of memory as a row of bits; not captured in transcript]

34 © 2010 Noah Mendelsohn The logical structure of computer memory  Can we get a C pointer to a bit in memory? NO! Pointers (on most modern machines) address whole bytes [address diagram not captured in transcript]

35 © 2010 Noah Mendelsohn Why byte addressing?  Can address more memory with smaller pointers  Not too big, not too small  256 values: about right for holding the characters Western cultures need (ASCII) – one character fits in one byte  8 is a power of 2 … we’ll see advantages later  Unfortunately: we need multiple byte representations for non-alphabetic languages (Chinese, Japanese Kanji etc.) – we deal with that in software 35 What’s the largest integer we can store in a byte?

36 © 2010 Noah Mendelsohn Computers can work efficiently with bigger words 36  C has types for these  The hardware has instructions to directly manipulate these  The memory system moves these efficiently (in parallel)  Sizes vary with machine architecture; these are for AMD64: BYTE (8 bits), SHORT (16), INT (32), LONG (64), POINTER (64)

37 © 2010 Noah Mendelsohn 37 Review

38 © 2010 Noah Mendelsohn Review  Bits encode choices  We can thus choose a representation for information in bits  We can interpret the same bit values in different ways (number 66 or ASCII letter B)  If we call the bit states 0 & 1: then we easily get binary numbers  We know how to implement bit stores in hardware and to compute with them  We generally address bytes, not bits  We often use words of types like integer…these are useful, and the machine handles them efficiently 38

39 © 2010 Noah Mendelsohn 39 Abstractions -- again

40 © 2010 Noah Mendelsohn 40 Abstractions Are Layered at Every Level in our Systems “Real world” concepts modeled in Hanson ADTs & C Types Hanson ADTs implemented in C Types Soon: bits & bytes used to encode machine instructions Words, bytes, and bits used to implement C types Bytes grouped in hardware to make words (int, long, etc.) True/false bits grouped to make bytes Information modeled as true/false bits True/false bits encoded in charges on transistors

41 © 2010 Noah Mendelsohn 41 An Aside on Information Theory

42 © 2010 Noah Mendelsohn We’ve over-simplified the story a little  What we’ve said about bits and choices is true  However: –Many encodings are wasteful, i.e. the values of the bits are somewhat predictable –Example: for each day of the year: [1=There was a hurricane, 0=No hurricane]…we know most bits will be zero –Can you find a better encoding?  To really measure information: we use the smallest possible encoding  Also: Shannon didn’t just measure information…he predicted how reliably you could send it through a noisy transmission line 42 Still: what we’ve studied here is a great start on thinking about bits and information, which are the foundations for modern digital computing.

