1.4 Representation of data in computer systems Character.


1 1.4 Representation of data in computer systems Character

2 How are binary codes used to represent characters? Each character (such as the uppercase and lowercase letters, digits and symbols) must be stored as a binary number if a computer system is to store and process it. Each character is therefore given a unique number code; for example, the ASCII number code for the character 'a' is decimal 97. When a character is stored on a computer system, it is this number code that is actually stored (as a binary number).
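A minimal sketch of this idea, using Python's built-in ord() and chr() functions (these report Unicode code points, which are identical to ASCII for the characters used here):

```python
# Convert a character to its number code, and back again
code = ord('a')
print(code)        # 97 - the decimal number code for 'a'
print(bin(code))   # 0b1100001 - the binary form the computer actually stores
print(chr(97))     # 'a' - number code back to character
```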

3 What is a character set? A character set is a complete set of the characters and their number codes that can be recognised by a computer system.

4 How are the number of bits per character related to the number of different possible characters? The ASCII Character Set - 7-8 bits per character: The ASCII (American Standard Code for Information Interchange) character set uses 7 bits of memory per character, allowing 128 different characters to be represented, using the binary codes 0000000 to 1111111. Extensions to the ASCII character set use 8 bits (1 byte) per character, allowing 256 characters in total. This is still limited, and means that different extended ASCII character sets are needed for the characters and symbols used in different countries, such as accented characters.
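The relationship between bits per character and the number of representable characters is a power of two, which a quick Python check illustrates:

```python
# n bits can represent 2**n distinct codes
print(2 ** 7)   # 128 characters with 7-bit ASCII
print(2 ** 8)   # 256 characters with 8-bit extended ASCII

# The 7-bit codes run from binary 0000000 to 1111111
print(int('0000000', 2))  # 0
print(int('1111111', 2))  # 127
```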

5 How are the number of bits per character related to the number of different possible characters? Example characters, their decimal codes and the binary codes actually stored by the computer:

Character   Decimal ASCII code   Binary code
A           65                   1000001
B           66                   1000010
a           97                   1100001
b           98                   1100010
SPACE       32                   0100000

Control characters: ASCII reserves the first 32 codes (numbers 0–31 decimal) for non-printable control characters. For example: ASCII code 13 is a carriage return, moving the cursor to a new line when typing; ASCII code 9 inserts a tab into a line of text.
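The table and the control characters above can be verified in Python; format(code, '07b') pads each binary code to 7 bits:

```python
# Print each example character with its decimal and 7-bit binary code
for ch in ['A', 'B', 'a', 'b', ' ']:
    code = ord(ch)
    print(repr(ch), code, format(code, '07b'))

# Two of the non-printable control characters
print(ord('\r'))  # 13 - carriage return
print(ord('\t'))  # 9  - tab
```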

6 How are the number of bits per character related to the number of different possible characters? The Unicode Character Set - 16 bits per character: The Unicode character set uses up to 16 bits (2 bytes) of memory per character, allowing 65,536 different characters to be represented. Using 16 bits means that Unicode can represent the characters and symbols from all the alphabets used around the globe, rather than having to use different character sets for different countries. The first 128 characters in Unicode have the same numeric codes as those in ASCII, making Unicode backward compatible with ASCII.
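This backward compatibility can be demonstrated in Python, whose ord() function returns Unicode code points (the non-ASCII characters below are illustrative examples):

```python
# The first 128 Unicode code points are identical to ASCII
print(ord('A'))    # 65 in both ASCII and Unicode

# Characters outside ASCII get higher Unicode code points
print(ord('é'))    # 233 - accented character, beyond 7-bit ASCII
print(ord('Ω'))    # 937 - Greek capital omega

# 16 bits allow this many distinct characters
print(2 ** 16)     # 65536
```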

