# Theory of Discrete Information and Communication Systems
Dr. Colin Campbell, Room 2.9, Queens Building. 10 lectures, 2 example classes.


Theory of Discrete Information and Communication Systems
Dr. Colin Campbell, Room 2.9, Queens Building, C.Campbell@bristol.ac.uk
10 lectures, 2 example classes
Exam? Yes: 2 questions from 4.

Who are you and why are you here?

- Weeks 1-6: Theory of Discrete Information and Communication Systems
- Weeks 7-12: Communications Systems Performance (Mark Beach)
- Coursework: Languages, Automata and Complexity (Colin Campbell); 1st-order Predicate Logic (Enza di Tomaso)

Unit codes:
- EMAT 31520 Information Systems (CSE 3, Eng Maths 3, Knowledge Eng 3)
- EMAT 20530 Logic and Information (CSE 2, Eng Maths 2)
- EENG 32000 Communication Systems (EE 3, Avionics 3)
- EENG M2100 Communication Systems (MSc Comms/Sig Proc)

Resources

- www.enm.bris.ac.uk/admin/courses/EMAT20530.htm
- http://www.enm.bris.ac.uk/staff/kob/EMAT20530_lects.htm
- www.google.com
- Library, TK5101:
  - LATHI: Intro. to Random Signals and Communication Theory (Ch 7)
  - HAYKIN: Communication Systems (Chapter 10)
  - HAYBER: Intro. to Information & Communication Theory
- Library, Q360 (more mathematical), e.g.:
  - SHANNON & WEAVER: Mathematical Theory of Communication
  - YOUNG: Information Theory
  - JONES: Elementary Information Theory
  - USHER: Information Theory for Information Technologists (Chapters 1-4)
- email: information on course arrangements etc.

The Communication Model

source -> channel -> destination

Example channels: magnetic disk, CD, telephone, smoke signals, Morse code.

More formally

Ideal: source -> encode/transmit -> channel -> receive/decode -> destination
Actual: source -> encode/transmit -> channel (+ NOISE) -> receive/decode -> destination

The source produces a message; the transmitter converts it into a signal sent over the channel.

What is Information Theory?

- Measuring and comparing how much information is generated by an entity or system.
- Calculating how much information can be communicated.
- Making sure that we maximise information content for a given communication system.

Information Theory measures the quantity of information; Communication Theory measures the rate at which a channel can carry information.

What is information? Information is order. Information must have some order for it to be useful. Disorder means we don't know anything; therefore, disorder implies uncertainty. Information is inversely related to uncertainty.

Sources of information (anything in the universe that is not completely random):
- DNA
- traffic light sequences
- your timetable
- your daily life
- your brain's neurons
- etc.

How much information?

Case 1: a coin which always lands on heads
- the coin is tossed
- how much information do you gain if I tell you the outcome?
- answer: none

Case 2: a coin which lands on heads 50% of the time
- the coin is tossed
- how much information do you gain if I tell you the outcome?
- answer: 1 bit

Case 3: a coin which lands on heads with probability p

Case 4: an n-sided die which shows face i with probability p_i
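One way to make the cases concrete: the information gained from learning an outcome of probability p is log2(1/p) bits (this quantity is formalised as entropy later in the notes). A sketch; the function names are mine, not from the course:

```python
import math

def surprisal(p):
    """Information gained (in bits) on learning an outcome of probability p."""
    return math.log2(1 / p)

# Case 1: a coin that always lands heads -> no information
print(surprisal(1.0))   # 0.0 bits

# Case 2: a fair coin -> exactly one bit
print(surprisal(0.5))   # 1.0 bits

# Case 3: a biased coin, e.g. p = 0.9 for heads -> "heads" carries little news
print(surprisal(0.9))   # about 0.152 bits

# Case 4: average information per roll of an n-sided die with face probabilities p_i
def average_information(probs):
    return sum(p * surprisal(p) for p in probs)

print(average_information([1 / 6] * 6))  # fair die: log2(6), about 2.585 bits
```

Note how the fair coin is the most surprising binary source: any bias makes the average information smaller.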

Uncertainty

Information is related to uncertainty, but what is uncertainty?
- how many heads / tails from a fair coin?
- will John's car be stolen if it is parked on Woodland Road for a week?
- does the car type make a difference?
- will the car be damaged?

Uncertainty is fuzzy: is a clear definition possible?

Probability - revision 1

1. Random Variable
- a real-valued function X on a sample space S
- e.g. X is the number of heads in a sequence of 10 coin tosses: S contains all 2^10 combinations of 10 heads or tails in sequence; for the particular outcome s ∈ S, s = [HTHTHTTTHT], X(s) = 4
- an RV can be discrete (S is discrete) or continuous (S is continuous)

Probability - 2

2. Probability distribution
- a function defined on the random variable
- discrete: f(x) = Pr(X = x), e.g. X = number of Hs in a sequence of 3 H or T:
  s_1 = [HHT], X(s_1) = 2
  s_2 = [HTH], X(s_2) = 2
  s_3 = [THT], X(s_3) = 1
  s_4 = [TTT], X(s_4) = 0
- or continuous

[Figures: a discrete f(x) plotted over x = 0, 1, 2, 3; a continuous density f(x) over an interval from a to b]
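The discrete distribution above can be built by enumerating the whole sample space. A sketch for a fair coin tossed 3 times:

```python
from itertools import product

# Sample space S: all 2^3 sequences of H/T; X(s) = number of heads in s
S = list(product("HT", repeat=3))

# Count how many sequences give each value of X
counts = {x: 0 for x in range(4)}
for s in S:
    counts[s.count("H")] += 1

# Convert counts to probabilities f(x) = Pr(X = x), assuming a fair coin
f = {x: c / len(S) for x, c in counts.items()}
print(f)  # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
```

The probabilities sum to 1, and the two middle values X = 1 and X = 2 are the most likely, matching the examples s_1..s_4 on the slide.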

Intuition and probability

Game rules:
- three boxes, only one contains a prize
- you choose a box
- I open one of the other two boxes and show you it is empty
- you are now allowed to change your choice of box

(a) I want to change
(b) I want to stick with my first choice
(c) I don't care, because it makes no difference

Probability - 3

3. Probability interpretations
- relative frequency (sampling): the probability that you will get run over when crossing the road
- belief (subjective, no sampling): what is the probability that there is life on Mars? Laplace calculated the mass of Saturn and announced: "It is a bet of 11000 to 1 that the error in this result is not within 1/100th of its value"; the latest estimate differs from Laplace's calculation by 0.63%
- independence
- conditional probabilities, Pr(A|B), Pr(B|A), etc.

Probability - 4

4. Probability rules

1. Product rule: Pr(A,B) = Pr(A|B)Pr(B) = Pr(B|A)Pr(A), the joint probability of A and B. If A and B are independent then Pr(A|B) = Pr(A) and Pr(B|A) = Pr(B); therefore, for independent events, Pr(A,B) = Pr(A)Pr(B).
2. Sum rule: Pr(A) = Σ_B Pr(A,B) = Σ_B Pr(A|B)Pr(B). That is, if we don't know Pr(A) directly we can calculate it from the known conditionals Pr(A|B) and Pr(B).
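Both rules can be verified mechanically on a small joint distribution. A sketch over two binary events; the joint probabilities are illustrative numbers, not from the slides:

```python
# Hypothetical joint distribution Pr(A, B) over binary events A and B
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

# Sum rule: marginals Pr(A=a) = sum_b Pr(A=a, B=b), and likewise for B
pr_A = {a: sum(p for (x, b), p in joint.items() if x == a) for a in (0, 1)}
pr_B = {b: sum(p for (a, y), p in joint.items() if y == b) for b in (0, 1)}

# Product rule rearranged: Pr(A|B) = Pr(A,B) / Pr(B)
pr_A_given_B = {(a, b): joint[(a, b)] / pr_B[b] for (a, b) in joint}

# Sum rule with conditionals: Pr(A=a) = sum_b Pr(A=a|B=b) Pr(B=b)
for a in (0, 1):
    recovered = sum(pr_A_given_B[(a, b)] * pr_B[b] for b in (0, 1))
    assert abs(recovered - pr_A[a]) < 1e-12

print(pr_A, pr_B)  # marginals: {0: 0.5, 1: 0.5} and {0: 0.4, 1: 0.6}
```

Note that here Pr(A,B) ≠ Pr(A)Pr(B) (e.g. 0.4 ≠ 0.5 × 0.6), so A and B are not independent.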

Bayes' Rule

Since Pr(A,B) = Pr(B,A), it follows from the product rule that Pr(A|B)Pr(B) = Pr(B|A)Pr(A). Rearranging this we get Bayes' rule:

Pr(B|A) = Pr(A|B)Pr(B) / Pr(A)

Now we can reverse the conditioning of events: we can calculate the posterior Pr(B|A) from Pr(A|B), the prior Pr(B), and Pr(A). The rule can be applied recursively.

Bayesian example

Should Arthur panic?
- Arthur has tested positive for telephone sanitiser disease (TSD)
- the test for TSD is 95% reliable
- 1% of the population suffer from TSD

Let D = Arthur has the disease, T = the test is positive. We want Pr(D|T).
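The posterior follows from the sum rule and Bayes' rule. A sketch, assuming "95% reliable" means both Pr(T|D) = 0.95 and Pr(T|not D) = 0.05 (the slide does not split these):

```python
# Assumption: "95% reliable" = 0.95 true-positive rate, 0.05 false-positive rate
pr_D = 0.01            # prior: 1% of the population has TSD
pr_T_given_D = 0.95
pr_T_given_notD = 0.05

# Sum rule: Pr(T) = Pr(T|D)Pr(D) + Pr(T|not D)Pr(not D)
pr_T = pr_T_given_D * pr_D + pr_T_given_notD * (1 - pr_D)

# Bayes' rule: Pr(D|T) = Pr(T|D)Pr(D) / Pr(T)
pr_D_given_T = pr_T_given_D * pr_D / pr_T
print(round(pr_D_given_T, 3))  # 0.161
```

So despite the positive test, Arthur has only about a 16% chance of having TSD: the disease is rare enough that most positives are false positives. No need to panic yet.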

Bayesian Example - 2

Is man X guilty?
- definite DNA match
- population = 60 million
- only 1 person in 10 million has this DNA profile

Let G = guilty, D = DNA match. Need more evidence!
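The same Bayes calculation works here. A sketch, assuming a uniform prior over the population and that the guilty person matches with certainty (both are assumptions about the missing slide equations):

```python
# Prior: any one of the 60 million people is equally likely to be guilty
pr_G = 1 / 60_000_000
pr_D_given_G = 1.0        # the guilty person certainly matches
pr_D_given_notG = 1e-7    # 1 person in 10 million shares the profile

# Sum rule, then Bayes' rule, exactly as in the TSD example
pr_D = pr_D_given_G * pr_G + pr_D_given_notG * (1 - pr_G)
pr_G_given_D = pr_D_given_G * pr_G / pr_D
print(round(pr_G_given_D, 3))  # 0.143
```

Intuitively, about six of the other 60 million people are also expected to match, so the DNA evidence alone leaves roughly a 1-in-7 chance that X is the guilty one, hence "need more evidence!".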

A Brief History of Entropy

- 1865, Clausius: thermodynamic entropy, ΔS = Q/T, the change in entropy of a thermodynamic system during a reversible process in which an amount of heat Q is applied at constant absolute temperature T
- 1877, Boltzmann: S = k ln N, where the entropy S of a system is related to the number of possible microscopic states (N) consistent with macroscopic observations, e.g. an ideal gas, or 10 coins in a box
- 1940s, Turing: weight of evidence (see "Alan Turing: The Enigma")
- 1948, Shannon: the related notion of information entropy

Definition of Entropy

Let X be a random variable with probability distribution p(x), x ∈ S. The entropy H(X) of X is defined by

H(X) = -Σ_{x ∈ S} p(x) log2 p(x)
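The definition translates directly into code (with the usual convention 0 log 0 = 0). A minimal sketch; the function name is mine:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p(x) log2 p(x), in bits; 0 log 0 taken as 0."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # fair coin: 1.0 bit (Case 2 earlier)
print(entropy([1.0]))         # certain coin: 0.0 bits (Case 1 earlier)
print(entropy([0.25] * 4))    # uniform 4-sided die: 2.0 bits
print(entropy([0.9, 0.1]))    # biased coin: about 0.469 bits
```

This recovers the coin examples from the "How much information?" slide: the fair coin yields 1 bit per toss, the always-heads coin none, and any bias gives something in between.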

