Channel Coding Part 1: Block Coding


1 Channel Coding Part 1: Block Coding
Doug Young Suh

2 Physical Layer
Lowering the bit error rate requires ① raising the transmitting power or ② reducing the noise power, both of which create problems in hardware or operating cost. An algorithmic (logical) approach is needed!!
[Figure: BPSK levels 1V / -1V with AWGN; noise pushes received values across the 0V threshold, producing bit errors (BER: bit error rate)]

3 Datalink/transport layer
Error control coding by addition of redundancy: block coding, convolutional coding.
Error control in computer networks:
Error detection: ARQ (automatic repeat request)
Error correction: FEC (forward error correction)
Errors and erasures:
Error: 0 → 1 or 1 → 0 at an unknown location
Erasure: packet (frame) loss at a known location

4 Discrete memoryless channels
Decision between 0 and 1:
Hard decision: simple, but loses information
Soft decision: complex, but preserves reliability information
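The hard/soft distinction can be illustrated with a small sketch (the BPSK model and the 0.5 noise level are taken from slide 2; the LLR form for AWGN is a standard assumption, not from the slides):

```python
import random

def bpsk_receive(bit, sigma=0.5):
    """Transmit a bit as +1/-1 volts, add Gaussian noise, return the analog value."""
    s = 1.0 if bit else -1.0
    return s + random.gauss(0.0, sigma)

def hard_decision(r):
    """Threshold at 0: keeps only the sign of r, discarding its reliability."""
    return 1 if r > 0 else 0

def soft_decision(r, sigma=0.5):
    """Log-likelihood ratio for BPSK over AWGN: 2r/sigma^2 (sign AND reliability)."""
    return 2.0 * r / (sigma ** 2)

random.seed(0)
r = bpsk_receive(1)
print(hard_decision(r), round(soft_decision(r), 2))
```

A value of r = 0.9 and r = 0.1 both hard-decide to 1, but their soft values differ, which is exactly the information a hard decision throws away.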

5 Channel model : BSC
BSC (binary symmetric channel): each transmitted bit is delivered correctly with probability 1-p and flipped with crossover probability p. For BPSK with Gaussian noise, p is the bit error rate determined by the signal energy Ec and the noise power N0, i.e., by the SNR in dB.
[Figure: BSC transition diagram (0→0 and 1→1 with probability 1-p; 0→1 and 1→0 with probability p), and a BER-vs-SNR(dB) curve]
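The BSC model above is easy to simulate; a minimal sketch that estimates the empirical BER for p = 0.1:

```python
import random

def bsc(bits, p, rng):
    """Pass bits through a binary symmetric channel: flip each bit with probability p."""
    return [b ^ (1 if rng.random() < p else 0) for b in bits]

rng = random.Random(42)
tx = [rng.randint(0, 1) for _ in range(100_000)]
rx = bsc(tx, p=0.1, rng=rng)
ber = sum(a != b for a, b in zip(tx, rx)) / len(tx)
print(f"empirical BER ≈ {ber:.3f}")   # close to the crossover probability p = 0.1
```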

6 Entropy and codeword
Example) Huffman coding
[Figure: Huffman tree built by repeatedly merging the two least probable symbols; the probabilities 1/2, 1/4, 1/8 and codewords 10, 110, 111 are visible. Shorter codewords are assigned to more probable symbols, and the average codeword length is the probability-weighted sum of the codeword lengths.]
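The average-length calculation can be reproduced, assuming the four-symbol source p = (1/2, 1/4, 1/8, 1/8) with codewords 0, 10, 110, 111 that matches the probabilities and codewords visible on the slide:

```python
import math

# Assumed source distribution and a valid Huffman code for it (from the slide figure).
probs = [1/2, 1/4, 1/8, 1/8]
codewords = ["0", "10", "110", "111"]

avg_len = sum(p * len(c) for p, c in zip(probs, codewords))
entropy = -sum(p * math.log2(p) for p in probs)
print(avg_len, entropy)   # both 1.75 bits: Huffman is optimal for dyadic probabilities
```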

7 Fine-to-coarse Quantization
Dice vs. coin: each die face has probability 1/6; quantizing {1,2,3} → head and {4,5,6} → tail yields a coin with probability 1/2 per side (∙∙∙ H T H H T T ∙∙∙).
Effects of quantization: data compression; information is lost, but not all of it.
4/22/2017 Media Lab. Kyung Hee University

8 Shannon coding theory: entropy and bitrate R
Example) dice: p(i) = 1/6 for i = 1,…,6
H(X) = Σ (1/6) log2 6 = log2 6 ≈ 2.58 bits
Shannon coding theorem: no error, if H(X) < R(X) = 3 bits.
If R(X) = 2, {00, 01, 10, 11} → {1, 2, {3,4}, {5,6}}.
With received information Y = "even number":
H(X|Y) = Σ (1/3) log2 3 = log2 3 ≈ 1.58 < R(X|Y) = 2
If the receiver receives 2 more bits → decodable.
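The two entropy values above can be checked directly:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p log2 p, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

H_dice = entropy([1/6] * 6)          # fair die: log2(6)
H_given_even = entropy([1/3] * 3)    # X given Y = "even number": 3 equiprobable faces
print(round(H_dice, 2), round(H_given_even, 2))   # 2.58 and 1.58
```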

9 BSC and mutual information
BSC (Binary Symmetric Channel): input X ∈ {0,1}, output Y ∈ {0,1}, crossover probability p.
H(X|Y) = -Σ Σ p(x,y) log2 p(x|y)
For equiprobable inputs, H(X|Y) = -p log2 p - (1-p) log2 (1-p):
p = 0.1 → 0.47, p = 0.2 → 0.72, p = 0.5 → 1
Mutual information (channel capacity): I(X;Y) = H(X) - H(X|Y):
p = 0.1 → 0.53, p = 0.2 → 0.28, p = 0.5 → 0
One bit of transmission delivers I(X;Y) bits of information; H(X|Y) is the loss of information.
[Figure: H(X|Y) vs p, rising from 0 at p = 0 to 1 bit at p = 0.5 and falling back to 0 at p = 1]
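The tabulated values follow from the binary entropy function; a short sketch reproducing them:

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """I(X;Y) for a BSC with equiprobable inputs: 1 - H(p)."""
    return 1.0 - h2(p)

for p in (0.1, 0.2, 0.5):
    print(f"p={p}: H(X|Y)={h2(p):.2f}, I(X;Y)={bsc_capacity(p):.2f}")
```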

10 Probability of non-detected errors
(n,k) code: k data bits plus n-k redundant bits (parity bits, check bits).
Code rate r = k/n: information of k bits is delivered by transmission of n bits.
Parity symbol ⊃ parity bit (for RS codes, a symbol is a byte).
Example) (n,k) = (4,3) even-parity error-detection code with bit error rate p = 0.001.
An even-parity check misses exactly the patterns with an even, nonzero number of bit errors, so the probability of non-detected errors is
P = C(4,2) p^2 (1-p)^2 + C(4,4) p^4 ≈ 6×10^-6.
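The undetected-error probability above generalizes to any even-parity code length; a sketch:

```python
from math import comb

def p_undetected_even_parity(n, p):
    """Probability that an (n, n-1) even-parity code fails to detect errors:
    an even, nonzero number of the n transmitted bits is flipped."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(2, n + 1, 2))

print(p_undetected_even_parity(4, 0.001))   # ≈ 6e-6, dominated by the 2-error term
```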

11 Trade-offs
Trade-off 1: error performance vs. bandwidth. Code A uses less bandwidth but has a higher error rate than C under the same channel condition.
Trade-off 2: coding gain, the reduction in required Eb/N0 between the uncoded point D and the coded point E at the same BER.
Trade-off 3: capacity vs. bandwidth.
[Figure: BER (10^-2, 10^-4, 10^-6) vs. Eb/N0 (dB, roughly 8-14) with coded curves A, B, C and uncoded points D, E]

12 Trade-off : an example
Example) Coded vs. uncoded performance
R = 4800 bps, (n, k) = (15, 11), t = 1 (single-error correcting).
What is the performance of the coding?
Sol) Without coding: [the worked equations are elided in the original slides]

13 (continued) Trade-off : an example
Transmitter: higher layer 11 kbps → datalink layer (11 ⇒ 15) → physical layer 15 kbps.
Receiver: physical layer → datalink layer (15 ⇒ 11) → higher layer.
Code performance at low values of Eb/N0: too many errors to be corrected ⇒ Turbo codes.
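The elided comparison hinges on the post-decoding bit error rate. A common approximation for a t-error-correcting (n, k) block code over a BSC with crossover probability p (a sketch of the standard formula, not the slide's exact numbers — the value p = 0.01 below is an assumed illustration):

```python
from math import comb

def post_decoding_ber(n, t, p):
    """Approximate decoded bit error probability for a t-error-correcting block
    code over a BSC: blocks with more than t channel errors fail, and a failed
    block is charged roughly i/n erroneous bits."""
    return sum((i / n) * comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(t + 1, n + 1))

p = 0.01   # assumed channel BER for illustration
print(post_decoding_ber(15, 1, p))   # well below the raw BER of 0.01
```

At very poor channels (low Eb/N0, large p) this sum approaches the raw BER and coding stops helping, which is the regime the slide attributes to Turbo codes.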

14 Linear Block Code (n, k) code n: length of a codeword
k: number of message bits; n-k: number of parity bits.
Example) Even parity check is an (8,7) code.
Systematic code: message bits are left unchanged. (Parity check is one of the systematic codes.)
GF(2) (GF: Galois field, /galoa/)
"field": a set of elements closed under its operations.
closed: the result of an operation is also an element of the field.
Ex) The set of positive integers is closed for + and ×, but open for - and /.

15 GF(2) : Galois Field GF(2) is closed for the following two operations.
Addition:        Multiplication:
+ | 0 1          × | 0 1
0 | 0 1          0 | 0 0
1 | 1 0          1 | 0 1
The two operations above are XOR and AND, respectively.
[Figure: block encoder and block decoder]
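The two GF(2) operations map directly onto bit operators; a minimal sketch verifying closure:

```python
def gf2_add(a, b):
    """Addition in GF(2) is XOR."""
    return a ^ b

def gf2_mul(a, b):
    """Multiplication in GF(2) is AND."""
    return a & b

# Closure: every result of either operation is again an element of {0, 1}.
table = [(a, b, gf2_add(a, b), gf2_mul(a, b)) for a in (0, 1) for b in (0, 1)]
print(table)
```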

16 Error-Detecting/correcting capability
Hamming distance: how many bits must be changed to turn one word into the other?
Example) Effect of repetition code: send (0 1) by using (  ) or (  ). [The repetition codewords are elided in the original slides.]
Minimum distance dmin determines the maximum error detection capability (dmin - 1) and the maximum error correction capability (⌊(dmin-1)/2⌋):
dmin = 2 ◇ single error detection
dmin = 3 ◇ double error detection OR single error correction
dmin = 4 ◇ (double error detection AND single error correction) OR triple error detection
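The distance notions above can be computed directly; the 3-bit repetition code {000, 111} used below is a hypothetical illustration of the slide's repetition-code example:

```python
from itertools import combinations

def hamming_distance(a, b):
    """Number of bit positions in which two equal-length words differ."""
    return sum(x != y for x, y in zip(a, b))

def minimum_distance(code):
    """d_min of a code: the smallest pairwise Hamming distance between codewords."""
    return min(hamming_distance(a, b) for a, b in combinations(code, 2))

rep3 = ["000", "111"]   # hypothetical 3-bit repetition code
d = minimum_distance(rep3)
print(d, "-> detects", d - 1, "errors, corrects", (d - 1) // 2)
```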

17 Error-Detecting/correcting capability
Correcting t errors requires dmin ≥ 2t + 1 (double error correction: dmin ≥ 5). For k message bits, the n - k check bits must satisfy the Hamming bound 2^(n-k) ≥ Σ_{i=0}^{t} C(n, i); a code that meets this bound with equality is a perfect code.
Example) Extended Hamming code: (7,4) + one overall parity bit = (8,4) ⇒ dmin increases from 3 to 4.

18 Cyclic code
Cyclic codes (Cyclic codes ⊂ Linear block codes)
For n = 7, X^7 + 1 = (X+1)(X^3+X^2+1)(X^3+X+1).
Generator polynomial g(X) = X^3+X^2+1 or X^3+X+1. Note that X^7 + 1 = 0 when g(X) = 0.
Systematic encoding: c(X) = X^(n-k) m(X) + [X^(n-k) m(X) mod g(X)], where mod is the remainder operator. Example) 7 % 3 = 1.
Example) Calculate c(X) for m(X) = [1010].
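The m(X) = [1010] exercise can be worked with GF(2) polynomial long division; a sketch (assuming g(X) = X^3 + X + 1, one of the two generators named above, and [1010] read most-significant bit first):

```python
def gf2_polymod(dividend, divisor):
    """Remainder of GF(2) polynomial division (polynomials as bit masks)."""
    dlen = divisor.bit_length()
    while dividend.bit_length() >= dlen:
        dividend ^= divisor << (dividend.bit_length() - dlen)
    return dividend

def cyclic_encode(m, g, n, k):
    """Systematic cyclic encoding: c(X) = X^(n-k) m(X) + (X^(n-k) m(X) mod g(X))."""
    shifted = m << (n - k)
    return shifted | gf2_polymod(shifted, g)

# g(X) = X^3 + X + 1 -> 0b1011; m(X) for [1 0 1 0] -> 0b1010
c = cyclic_encode(0b1010, 0b1011, n=7, k=4)
print(f"{c:07b}")   # -> 1010011 (message bits followed by parity bits)
```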

19 Hamming (7,4) Decoding
Example) Make the syndrome table of the Hamming(7,4) code. [The syndrome table is elided in the original slides; the all-zero syndrome 0 0 0 corresponds to no error.]
Example) For Hamming(7,4), find the syndrome and the corrected codeword for received words C1 and C2. [The worked matrix calculations are elided in the original slides.]
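For the cyclic form of Hamming(7,4), the syndrome is r(X) mod g(X), and a single error at position i gives syndrome X^i mod g(X). A sketch of the syndrome table and single-error correction, assuming g(X) = X^3 + X + 1 (the codeword 1010011 below is only an illustrative valid word for that generator):

```python
def gf2_polymod(dividend, divisor):
    """Remainder of GF(2) polynomial division (polynomials as bit masks)."""
    dlen = divisor.bit_length()
    while dividend.bit_length() >= dlen:
        dividend ^= divisor << (dividend.bit_length() - dlen)
    return dividend

G = 0b1011  # assumed generator g(X) = X^3 + X + 1

# Syndrome table: a single error at bit position i has syndrome X^i mod g(X).
SYNDROME = {gf2_polymod(1 << i, G): 1 << i for i in range(7)}

def decode(r):
    """Correct up to one bit error in a 7-bit received word."""
    s = gf2_polymod(r, G)
    return r ^ SYNDROME.get(s, 0)   # syndrome 0 -> no correction needed

c = 0b1010011                # a valid codeword for this g(X)
r = c ^ (1 << 5)             # flip one bit to simulate a channel error
print(f"{decode(r):07b}")    # -> 1010011
```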

20 Example) Hamming (7,4) code
b) Parity check polynomial h(X): g(X) h(X) = X^n + 1 over GF(2), so h(X) = (X^n + 1) / g(X).
For n = 7 and g(X) = X^3+X+1: h(X) = (X+1)(X^3+X^2+1) = X^4+X^2+X+1.
[The slide's derivation, built from a root of g(X), is elided in the original.]
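The factorization g(X) h(X) = X^7 + 1 can be verified with carry-less multiplication:

```python
def gf2_polymul(a, b):
    """Carry-less (GF(2)) polynomial multiplication on bit-mask polynomials."""
    result = 0
    while b:
        if b & 1:
            result ^= a
        a <<= 1
        b >>= 1
    return result

g = 0b1011    # X^3 + X + 1
h = 0b10111   # X^4 + X^2 + X + 1
print(bin(gf2_polymul(g, h)))   # -> 0b10000001, i.e. X^7 + 1
```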

21 Entropy and Hamming (7,4) k=4, n=7
How many codewords? 2^k = 2^4 = 16.
Their entropy? Each codeword has probability 1/2^k → k bits per codeword.
Information transmission rate = coding rate r = k/n [information/transmission]; the value of each transmitted bit is k/n.
Suitable when I(X;Y) = H(X) - H(X|Y) > k/n = 0.57:
p = 0.1 → I = 0.53, p = 0.2 → 0.28, p = 0.5 → 0
So the code is suitable at a BER of less than 10%.
[Figure: H(X|Y) vs p curve, as on slide 9]

22 Other Block Codes CRC codes : error detection only, for long packets
(1) CRC codes, e.g., the CRC-CCITT code.
Open question) How many combinations of non-detectable errors are there for the CRC-12 code used on 100-bit-long data? What is the probability of the non-detectable errors when BER is 0.01?
(2) BCH Codes (Bose-Chaudhuri-Hocquenghem)
Block length: n = 2^m - 1
Number of message bits: k ≥ n - mt
Minimum distance: dmin ≥ 2t + 1
Ex) (7,4,1) g(X)=13, (15,11,1) g(X)=23, (15,7,2) g(X)=721, (15,5,3) g(X)=2467, (255,171,11) (generator polynomials in octal notation)
(3) Reed Solomon Codes : GF(2^m) arithmetic
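The slide names CRC-CCITT; a minimal bitwise sketch of the common XModem variant of CRC-16/CCITT (polynomial X^16 + X^12 + X^5 + 1 = 0x1021, initial value 0 — parameter choices assumed, as the slide does not specify them):

```python
def crc16_ccitt(data: bytes, poly=0x1021, init=0x0000):
    """Bitwise CRC-16/CCITT (XModem variant): MSB-first polynomial division."""
    crc = init
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ poly) & 0xFFFF if crc & 0x8000 else (crc << 1) & 0xFFFF
    return crc

print(hex(crc16_ccitt(b"123456789")))   # standard check value: 0x31c3
```

The receiver recomputes the CRC over the received data and detects an error whenever it disagrees with the transmitted checksum; like all CRCs, this detects errors but corrects none.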

