Institute for Experimental Mathematics, Ellernstrasse 29, 45326 Essen, Germany
DATA COMMUNICATION: introduction
A.J. Han Vinck, May 10, 2003

Slide 2 – Content
We describe:
- a simple transmission model
- detection
- error probability
- a comparison of coded and uncoded transmission

Slide 3 – Figure 1: the transmission chain
source → transmitter → channel → receiver: a message i = 1, ..., M is mapped to a signal s(i); the channel delivers a signal r; the receiver produces an estimate i'.
The optimum receiver maximizes the probability of being correct.

Slide 4 – Optimum receiver
Suppose that with every possible received signal r we associate a message estimate, so that
P( correct | r ) = P( i transmitted | r and estimate is i ).
Optimum detection rule: for a given r, set the estimate to i if
P( i transmitted | r and estimate is i ) ≥ P( k transmitted | r and estimate is k ) for all k ≠ i.
This partitions the space of received signals into decision regions, one per message: region k collects the received signals that yield estimate k.

Slide 5 – Maximum likelihood receiver
By Bayes' rule, P( i | r ) = P(i) p( r | i ) / p(r), and p(r) is independent of the index i.
The optimum receiver therefore gives as estimate the i that maximizes P(i) p( r | i ).
The ML receiver gives as estimate the i that maximizes p( r | i ) alone; the two coincide when all messages are equally likely.
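As a small illustration (not from the slides), here are the two rules for a toy discrete channel; the priors and likelihood table are made up for the example:

```python
import numpy as np

# Hypothetical example: 3 messages, 4 possible received values.
prior = np.array([0.5, 0.3, 0.2])          # P(i), assumed priors
lik = np.array([[0.7, 0.2, 0.1, 0.0],      # p(r | i), one row per message i
                [0.1, 0.6, 0.2, 0.1],
                [0.0, 0.1, 0.3, 0.6]])

r = 2  # index of the received value

map_est = np.argmax(prior * lik[:, r])  # optimum: maximizes P(i) p(r|i)
ml_est = np.argmax(lik[:, r])           # ML: maximizes p(r|i) only

print("MAP estimate:", map_est, " ML estimate:", ml_est)
```

With these numbers the two estimates differ, which is exactly the effect of the prior P(i).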

Slide 6 – Binary signaling
The symbols 1 and 0 are sent as antipodal pulses s(t) of duration T and energy E; noise n(t) is added, so the receiver sees r(t) = s(t) + n(t) and samples the detector output y_i every T seconds.
Hard decision: y_i > 0 → output symbol 1; y_i < 0 → output symbol 0.
Error probability: p := P( 1 (0) transmitted and 0 (1) decided ).

Slide 7 – Binary symmetric channel model
Each transmitted bit arrives correctly with probability 1-p and flipped with probability p, the same for 0 and 1: the famous satellite transmission model, the BSC.
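A minimal simulation of the BSC, assuming nothing beyond the crossover probability p defined above:

```python
import numpy as np

def bsc(bits, p, rng):
    """Binary symmetric channel: flip each bit independently
    with crossover probability p."""
    flips = (rng.random(bits.shape) < p).astype(bits.dtype)
    return bits ^ flips

rng = np.random.default_rng(1)
x = rng.integers(0, 2, size=100_000)
y = bsc(x, p=0.05, rng=rng)
print("empirical crossover rate:", np.mean(x != y))  # close to 0.05
```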

Slide 8 – Decoding a codeword sent over the BSC
We transmit a series of symbols or codewords: message → C → channel → R → estimate, where each digit of C is flipped with probability p.
Let R differ from C in d positions; then P( R | C ) = p^d (1-p)^(n-d).
For p < 1/2 this is decreasing in d, hence the decoder should find the codeword C with minimum Hamming distance d to R.
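A sketch of this minimum-distance rule in code, using a length-3 repetition code as a stand-in codebook:

```python
import numpy as np

def md_decode(r, codebook):
    """Hard-decision ML decoding over the BSC (p < 1/2):
    return the codeword at minimum Hamming distance from r."""
    dists = np.sum(codebook != r, axis=1)  # Hamming distance d to each C
    return codebook[np.argmin(dists)]

codebook = np.array([[0, 0, 0], [1, 1, 1]])      # repetition code
print(md_decode(np.array([0, 1, 0]), codebook))  # -> [0 0 0]
```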

Slide 9 – The noise n(t) comes from an additive white Gaussian noise source
A noise process is called Gaussian if ∫ n(t) g(t) dt is Gaussian distributed for every weighting function g(t).
White: noise samples are statistically independent of each other, i.e. p_n(a, b) = p_n(a) p_n(b).

Slide 10 – For our example
Average noise level: E[n] = 0. Variance of the noise: E[n²] = σ².
Probability density: p_n(a) = (1/√(2πσ²)) exp( -a²/(2σ²) ).
Expected detector output: +√E when 1 is sent, -√E when 0 is sent.

Slide 11 – Intermezzo

Slide 12 – Intermezzo (cont'd)

Slide 13 – One-dimensional model of transmission
The detector output is ±√E plus a zero-mean Gaussian noise sample; decide 1 when the output lies above the threshold 0, decide 0 when it lies below.
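For this one-dimensional model the hard-decision error probability comes out as p = Q(√E/σ); a quick numerical check, with E and σ² chosen arbitrarily:

```python
from math import erfc, sqrt

def Q(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * erfc(x / sqrt(2.0))

E, sigma2 = 1.0, 0.25    # assumed symbol energy and noise variance
p = Q(sqrt(E / sigma2))  # threshold at 0, signals at +/- sqrt(E)
print(f"hard-decision error probability p = {p:.3e}")  # Q(2) ~ 2.3e-2
```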

Slide 14 – An obvious receiver
The transmitted signal s(t), with 2T = 1/B, is disturbed by Gaussian noise n(t) of power spectral density 2σ². A filter passes the basic signal components, i.e. sine waves with frequency ≤ αB, α ≥ 1, and delivers filtered Gaussian noise n'(t) with E[(n'(t))²] = αB·2σ². The output is sampled at the right moments.
Error probability: for α = 1 the same as the optimum receiver (but the signal is not reproducible); for α > 1 there is a loss.


Slide 16 – Filtering causes interference
Limiting the highest frequency spreads the pulses in time, so neighboring symbols interfere. Eye diagram = overlapping registration of the signal in time.

Slide 17 – Amplitude shift keying (ASK)
Homework: calculate the error probabilities for average energy E. Why did we choose this assignment?

Slide 18 – Coded vs. uncoded transmission: bandwidth expansion
Uncoded: k information bits in time T = kτ with total energy kE_b.
Coded: n code digits in the same time T = nτ_s with total energy nE_s = kE_b; since kτ = nτ_s, the symbol duration shrinks to τ_s = (k/n)τ.
The bandwidth, and with it the noise power, increases by the factor n/k (see the sketch after slide 19).

Slide 19 – Coded vs. uncoded transmission: no bandwidth expansion
Uncoded: k information bits in time T = kτ with total energy kE_b.
Coded: n code digits in time T = nτ with total energy nE_s = kE_b, i.e. the budget kE_b is spread over n symbols.
The symbol energy is reduced to E_s = (k/n)E_b.
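The bookkeeping of slides 18 and 19 in a few lines, for an assumed (7,4) code:

```python
k, n = 4, 7          # assumed code parameters, rate R = k/n
Eb, tau = 1.0, 1.0   # energy per information bit, uncoded symbol duration

# Slide 18: same total time, bandwidth expanded by n/k
tau_s = k * tau / n          # shorter code-symbol duration
noise_factor = tau / tau_s   # noise power grows by n/k

# Slide 19: same symbol duration, total energy kEb spread over n symbols
Es = k * Eb / n              # reduced energy per code symbol

print(noise_factor, Es)      # 1.75 and ~0.571 for the (7,4) code
```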

Slide 20 – Uncoded error probability (upper bound; formula not transcribed)
Homework: derive this upper bound.
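The slide's own formula did not survive transcription; one standard bound of this kind, offered here only as an assumed reconstruction, is the union bound on the k-bit message error probability:

```latex
% Assumed reconstruction -- the slide's own formula was not transcribed.
% A k-bit uncoded message is wrong as soon as one bit is wrong:
P_e^{\mathrm{uncoded}} = 1 - (1-p)^k \le k\,p ,
\qquad p = Q\!\left(\sqrt{E_b}/\sigma\right).
```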

Slide 21 – For a code with minimum distance d_min
The coded error probability improves on the uncoded one, i.e. there is a CODING GAIN, if the product R·d_min/2 > 1.
Hence: look for codes with large d_min at high ratio R = k/n.

Slide 22 – "Waterfall curves"
P_e plotted against 10 log10( E_b/σ² ) in decibel (dB).

Slide 23 – Can we do better using "soft decisions"?
Transmit c_i ∈ { +√E, -√E }; receive r_i = c_i + n_i.
ML decoder: choose the code vector C_I of length n that maximizes p( R | C_I ).

Slide 24 – Soft decoding (cont'd)
For Gaussian noise, maximizing p( R | C_I ) is the same as choosing the codeword that minimizes the "squared Euclidean distance" Σ_i ( r_i - c_i )².
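A sketch of the squared-Euclidean-distance decoder, reusing the repetition-code example from slide 8; the mapping 0 → -√E, 1 → +√E is the antipodal signaling assumed above:

```python
import numpy as np

def soft_decode(r, codebook, E=1.0):
    """Soft-decision ML decoding in AWGN: pick the codeword whose
    antipodal image minimizes the squared Euclidean distance to r."""
    signals = np.sqrt(E) * (2 * codebook - 1)    # 0 -> -sqrt(E), 1 -> +sqrt(E)
    sqdist = np.sum((r - signals) ** 2, axis=1)  # ||R - C_I||^2
    return codebook[np.argmin(sqdist)]

codebook = np.array([[0, 0, 0], [1, 1, 1]])
r = np.array([0.9, -0.2, 0.3])                   # noisy received vector
print(soft_decode(r, codebook))                  # -> [1 1 1]
```

Note that the symbol-by-symbol hard decision on this r is [1, 0, 1]: the soft metric keeps the reliability information that the middle sample was barely negative.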

Slide 25 – Performance
Assume we transmit the all-zero codeword, i.e. c_i = -√E for i = 1, 2, ..., n. Let D be the set of d positions in which a competing codeword differs.

Slide 26 – Performance (cont'd)
Note: the decision variable is the sum of d independent Gaussian random variables.

Slide 27 – For a code with minimum distance d_min
We transmit k bits of information with total energy kE_b in n transmissions of energy E each, so nE = kE_b and E = kE_b/n. The resulting gain factor is R·d_min, twice the hard-decision figure: a factor-of-2 CODING GAIN!

Slide 28 – Example: single parity-check code
n-1 information bits plus 1 parity bit; minimum distance d_min = 2.
Hard decision: R·d_min/2 = (n-1)/n < 1, so no gain. Soft decision: gain factor R·d_min ≈ 2, i.e. 10 log10(2) ≈ 3 dB. Check! (see the simulation sketch below)
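One way to do the "Check!" is to simulate hard and soft decoding of the single parity-check code in the antipodal AWGN model above; n, σ, and the trial count are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(7)
n, E, sigma = 4, 1.0, 0.6
trials = 20_000

# All even-weight words of length n: the single parity-check codebook.
msgs = np.array([[(i >> j) & 1 for j in range(n - 1)]
                 for i in range(2 ** (n - 1))])
codebook = np.hstack([msgs, msgs.sum(1, keepdims=True) % 2])
signals = np.sqrt(E) * (2 * codebook - 1)

errs_hard = errs_soft = 0
for _ in range(trials):
    c = codebook[rng.integers(len(codebook))]
    r = np.sqrt(E) * (2 * c - 1) + rng.normal(0, sigma, n)
    # soft: minimum squared Euclidean distance to r
    soft = codebook[np.argmin(np.sum((r - signals) ** 2, axis=1))]
    # hard: threshold first, then minimum Hamming distance
    y = (r > 0).astype(int)
    hard = codebook[np.argmin(np.sum(codebook != y, axis=1))]
    errs_soft += int(np.any(soft != c))
    errs_hard += int(np.any(hard != c))

print("hard:", errs_hard / trials, " soft:", errs_soft / trials)
```

The soft decoder should show a visibly lower word-error rate, consistent with the claimed 3 dB advantage.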

Slide 29 – Sum of 2 Gaussian random variables
Motivation: any process that combines many random variables produces random variables that are approximately Gaussian (central limit theorem).
Let z = x + y, where x and y are independent Gaussian random variables with E(x) = E(y) = 0, E(x²) = σ_x², E(y²) = σ_y². Then z is a Gaussian random variable with E(z) = 0 and E(z²) = σ_x² + σ_y².
Homework: calculate the probability density function for z.

Slide 30 – Sum of 2 Gaussian random variables: sketch of proof
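The equations on this slide did not survive transcription; the standard convolution argument it presumably sketches goes as follows:

```latex
% z = x + y with x, y independent zero-mean Gaussians:
p_z(z) = \int_{-\infty}^{\infty} p_x(a)\, p_y(z-a)\, da
       = \int_{-\infty}^{\infty}
         \frac{e^{-a^2/2\sigma_x^2}}{\sqrt{2\pi\sigma_x^2}}\,
         \frac{e^{-(z-a)^2/2\sigma_y^2}}{\sqrt{2\pi\sigma_y^2}}\, da
       = \frac{1}{\sqrt{2\pi(\sigma_x^2+\sigma_y^2)}}\,
         e^{-z^2/2(\sigma_x^2+\sigma_y^2)} .
% Completing the square in a collapses the integral to a Gaussian
% integral of total mass 1, leaving the N(0, sigma_x^2 + sigma_y^2) density.
```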