
1 Information in Continuous Signals
[Figure: a continuous signal f(t) plotted against time t.]
In practice, many signals are essentially analogue, i.e. continuous: e.g. the speech signal from a microphone, or a radio signal. So far our attention has been on discrete signals, typically represented as streams of binary digits. How do we deduce the information capacity of continuous signals?

2 Sampling Theorem
[Figure: a continuous signal f(t) and its sampled version f_s(t), taken at instants T, 2T, 3T, 4T, …; the spectrum F(f) of the continuous signal, band-limited to ±W; and the periodic spectrum of the sampled signal.]
Number of samples per second ≥ 2W (i.e. 1/T ≥ 2W); 2W is the Nyquist rate.
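A minimal numeric sketch of why the sample rate must reach 2W (the frequencies here are assumed for illustration, not from the slide): a 7 Hz tone sampled at only 10 samples/s produces exactly the same sample values as a 3 Hz tone, so the two are indistinguishable after sampling.

```python
import numpy as np

# A 7 Hz sinusoid sampled at fs = 10 Hz violates fs >= 2W (here W = 7 Hz),
# so it aliases onto the lower frequency fs - f = 3 Hz.
f, fs_low = 7.0, 10.0

t = np.arange(0, 1, 1 / fs_low)                    # sample instants at 10 Hz
orig = np.cos(2 * np.pi * f * t)                   # samples of the 7 Hz tone
alias = np.cos(2 * np.pi * (fs_low - f) * t)       # samples of the 3 Hz alias

print(np.allclose(orig, alias))  # True: identical samples, information lost
```

Sampling at 14 samples/s or more (≥ 2W) would keep the two tones distinguishable.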

3 Information Capacity in Continuous Signals
Information per second: R = (number of independent samples/s) × (maximum information per sample).
Number of independent samples/s = 2W.
What is the maximum information per sample in a continuous signal? For a discrete signal the maximum information per sample is H = −Σ p log p.
Number of distinguishable levels ≈ √(1 + S/N), so
C = 2W log √(1 + S/N) = W log(1 + S/N) = W log(1 + SNR).
For a continuous signal, the maximum information per second is usually denoted the Information Capacity.

4 Relative Entropy of Continuous Signals
Discrete systems: H = −Σᵢ pᵢ log pᵢ
Continuous systems: h = −∫ p(x) log p(x) dx
Gaussian with variance σ²: h = ½ log(2πeσ²)
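A sketch verifying the Gaussian entry above (σ = 2 is an assumed example value): estimating h = −E[log p(x)] by Monte Carlo from Gaussian samples reproduces the closed form ½ ln(2πeσ²).

```python
import math
import random

sigma = 2.0  # assumed example standard deviation
random.seed(0)
xs = [random.gauss(0, sigma) for _ in range(200_000)]

def log_pdf(x):
    # Natural log of the N(0, sigma^2) density
    return -0.5 * math.log(2 * math.pi * sigma**2) - x**2 / (2 * sigma**2)

h_mc = -sum(log_pdf(x) for x in xs) / len(xs)        # Monte Carlo estimate
h_exact = 0.5 * math.log(2 * math.pi * math.e * sigma**2)  # closed form

print(abs(h_mc - h_exact) < 0.02)  # True: estimate matches 0.5*ln(2*pi*e*sigma^2)
```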

5 Information Capacity of Continuous Signals
[Figure: a channel with input x of power S and output y of power S + N.]
Information capacity C = [H(y) − H(n)] × 2W.
This leads to the Ideal Communication Theorem: C = W log(1 + S/N).
Theoretically, information can be transmitted at any rate up to C with no net errors.
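The step from C = [H(y) − H(n)] × 2W to C = W log(1+S/N) can be checked numerically with the Gaussian entropy formula from the previous slide (the values W = 1000, S = 15, N = 1 are assumed for illustration): H(y) − H(n) = ½ log(2πe(S+N)) − ½ log(2πeN) = ½ log(1 + S/N), and multiplying by 2W gives W log(1 + S/N).

```python
import math

W, S, N = 1000.0, 15.0, 1.0  # assumed example values

# Differential entropies (bits/sample) of Gaussian output and Gaussian noise
H_y = 0.5 * math.log2(2 * math.pi * math.e * (S + N))
H_n = 0.5 * math.log2(2 * math.pi * math.e * N)

C = (H_y - H_n) * 2 * W              # [H(y) - H(n)] x 2W
assert math.isclose(C, W * math.log2(1 + S / N))
print(round(C))  # → 4000 (bits/s, since log2(16) = 4)
```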

6 Transmission Media
The physical medium of electronic transmission limits the achievable bit rate: it acts as a “filter” on the signal being transmitted.

7 [Figure only; no transcript text.]

8 Shannon’s Theorem
Our phone line can carry frequencies between 300 Hz and 3300 Hz unattenuated. The channel capacity C is
C = W log₂(1 + S/N),
where W is the bandwidth, 3300 − 300 = 3000 Hz, and S/N is the signal-to-noise ratio, typically 1000, which corresponds to 10 log₁₀(S/N) = 30 dB. In our case C ≈ 30 kb/s, which corresponds well with a 28.8 kb/s modem.
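The phone-line figure above can be reproduced directly:

```python
import math

W = 3300 - 300          # usable bandwidth in Hz
snr = 1000              # linear S/N; 10*log10(1000) = 30 dB

C = W * math.log2(1 + snr)
print(f"{C / 1000:.1f} kb/s")  # → 29.9 kb/s, close to a 28.8 kb/s modem
```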

9 Implications of the Ideal Theorem
I = WT log(1 + SNR) bits in time T.
A given amount of information can be transmitted by many combinations of W, T and SNR.
[Figure: W against SNR along the contour C = 3 units, T = 1 s, with points a, b, c marked.]
a. W = 1, SNR = 7.
b. Halving W requires S/N = 63: a very large increase in power.
c. Halving S/N (to ≈3) requires only W ≈ 1.5. Useful: power can be halved with only a 50% increase in bandwidth.
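The three trade-off points can be checked against I = WT log₂(1+SNR); each combination below carries the same 3 bits in T = 1 s (point c uses SNR = 3, the value that makes W = 1.5 exact).

```python
import math

T = 1.0  # seconds
# (W, SNR) pairs from the slide's points a, b, c; all give I = 3 bits.
combos = {"a": (1.0, 7.0), "b": (0.5, 63.0), "c": (1.5, 3.0)}

for name, (W, snr) in combos.items():
    bits = W * T * math.log2(1 + snr)
    print(name, bits)  # each prints 3.0
```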

10 Maximum Capacity for Given Transmitted Power
C = W log(1 + S/(N₀W)), where N₀ is the noise power spectral density.
As W → ∞, C → S/N₀ nats/s = 1.44 S/N₀ bits/s (about 3×10⁻²¹ J of received energy is required per bit at room temperature).
This suggests that, for efficiency in power requirements, power should be spread over a wide bandwidth and transmitted at as low an S/N as possible. The maximum value of C occurs for W → ∞ and S/N → 0.
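The wideband limit can be seen numerically (S = N₀ = 1 are assumed units): as W grows with total noise N = N₀W, C = W ln(1 + S/(N₀W)) in nats/s climbs toward, but never exceeds, S/N₀.

```python
import math

S, N0 = 1.0, 1.0        # assumed signal power and noise spectral density
limit = S / N0          # capacity limit in nats/s (= 1.44*S/N0 bits/s)

caps = []
for W in (1, 10, 100, 10_000):
    C_nats = W * math.log(1 + S / (N0 * W))   # natural log -> nats/s
    caps.append(C_nats)
    print(W, round(C_nats, 5))                # approaches 1.0 = S/N0

# Capacity increases monotonically with W but stays below the limit
assert all(a < b < limit for a, b in zip(caps, caps[1:]))
```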

