Basic Concepts of Information Theory: A Measure of Uncertainty. Entropy.


The amount of Information

C. Shannon suggested that the random variable $-\log P\{E_k\}$ is an indicative relative measure of the occurrence of the event $E_k$. The mean of this quantity is a good indication of the average uncertainty with respect to all outcomes of the experiment.

The amount of Information

Consider the sample space $\Omega$. Let us partition it into a finite number of mutually exclusive events $E_1, E_2, \dots, E_n$ with probabilities

$$p_k = P\{E_k\}, \qquad \sum_{k=1}^{n} p_k = 1.$$

A probability space defined by such equations is called a complete finite scheme.

The amount of Information

Consider the sample space $\Omega$, partitioned into a finite number of mutually exclusive events $E_1, E_2, \dots, E_n$. C. Shannon suggested that the random variable

$$I = -\log P\{E_k\}$$

is an indicative relative measure of the occurrence of the event $E_k$. This measure is called the amount of information contained in the occurrence of the event $E_k$.
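To make the definition concrete, here is a minimal Python sketch of the amount of information (not from the original slides; the function name self_information, the base-2 logarithm, and the sample probabilities are illustrative assumptions):

```python
import math

def self_information(p: float, base: float = 2.0) -> float:
    """Shannon's amount of information I = -log p for an event of probability p."""
    if not 0.0 < p <= 1.0:
        raise ValueError("p must lie in (0, 1]")
    return -math.log(p, base)

# The rarer the event, the larger the amount of information:
print(self_information(0.5))    # 1.0 bit
print(self_information(0.125))  # 3.0 bits
```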

The amount of Information. Entropy.

It is important to evaluate not only the amount of information (the uncertainty) contained in a single isolated event (message), but also the average uncertainty of the entire complete finite scheme. C. Shannon and N. Wiener suggested the following measure of uncertainty, the entropy:

$$H(p_1, p_2, \dots, p_n) = -\sum_{k=1}^{n} p_k \log p_k.$$
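A direct translation of this formula into a small Python helper (an illustrative sketch; the base-2 logarithm and the convention $0 \log 0 = 0$ are assumptions consistent with the bit examples on the next slide):

```python
import math

def entropy(probs, base: float = 2.0) -> float:
    """Shannon entropy H = -sum p_k log p_k of a complete finite scheme."""
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities of a complete scheme must sum to 1")
    # By convention 0 log 0 = 0, so zero-probability events contribute nothing.
    return -sum(p * math.log(p, base) for p in probs if p > 0)
```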

Entropy of a Bit (a simple communication channel)

A completely random bit with $p = (\tfrac{1}{2}, \tfrac{1}{2})$ has $H(p) = -(\tfrac{1}{2}\log_2\tfrac{1}{2} + \tfrac{1}{2}\log_2\tfrac{1}{2}) = -(-\tfrac{1}{2} - \tfrac{1}{2}) = 1$. A deterministic bit with $p = (1, 0)$ has $H(p) = -(1\log_2 1 + 0\log_2 0) = -(0 + 0) = 0$. A biased bit with $p = (0.1, 0.9)$ has $H(p) = -(0.1\log_2 0.1 + 0.9\log_2 0.9) \approx 0.47$. In general, as a function of $0 \le P\{X=1\} \le 1$, the entropy is a concave curve that vanishes at both endpoints and reaches its maximum of 1 bit at $P\{X=1\} = \tfrac{1}{2}$.
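The three cases above can be reproduced with a short, self-contained check (illustrative; the helper name h2 is hypothetical):

```python
import math

def h2(p1: float) -> float:
    """Binary entropy in bits of a bit with P{X=1} = p1."""
    return -sum(p * math.log2(p) for p in (p1, 1.0 - p1) if p > 0)

print(h2(0.5))            # 1.0   -- completely random bit
print(h2(1.0))            # 0.0   -- deterministic bit (may print as -0.0)
print(round(h2(0.9), 3))  # 0.469 -- biased bit p = (0.1, 0.9)
```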

The amount of Information. Entropy.

We have to investigate the principal properties of this measure with respect to the statistical problems of communication systems, and to generalize the concept to two-dimensional and $n$-dimensional probability schemes.

Entropy. Basic Properties

Continuity: if the probabilities of occurrence of the events are slightly changed, the entropy changes only slightly.

Symmetry: the entropy is invariant under any reordering of its arguments, $H(p_1, p_2, \dots, p_n) = H(p_{i_1}, p_{i_2}, \dots, p_{i_n})$ for any permutation $(i_1, \dots, i_n)$.

Extremal property: when all the events are equally likely, the average uncertainty takes its largest value,

$$H(p_1, \dots, p_n) \le H\left(\frac{1}{n}, \dots, \frac{1}{n}\right) = \log n.$$
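Symmetry and the extremal property are easy to confirm numerically; a small sketch with an arbitrary example distribution (the helper H is a compact version of the entropy function sketched earlier):

```python
import math

def H(probs):
    """Shannon entropy in bits (0 log 0 = 0 by convention)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

p = [0.5, 0.2, 0.3]
print(abs(H(p) - H([0.2, 0.3, 0.5])) < 1e-12)  # symmetry: order does not matter
print(H(p) <= math.log2(len(p)))               # extremal: H <= log n
print(H([1/3, 1/3, 1/3]))                      # maximum: log2(3) ~ 1.585
```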

Entropy. Basic Properties

Additivity: let $H(p_1, \dots, p_n)$ be the entropy associated with a complete set of events $E_1, E_2, \dots, E_n$. Let the event $E_n$ be divided into $m$ disjoint subsets with probabilities $q_1, \dots, q_m$, so that

$$p_n = \sum_{j=1}^{m} q_j.$$

Then the entropy of the refined scheme is

$$H(p_1, \dots, p_{n-1}, q_1, \dots, q_m) = H(p_1, \dots, p_n) + p_n H\left(\frac{q_1}{p_n}, \dots, \frac{q_m}{p_n}\right),$$

where the second term is the uncertainty within $E_n$, weighted by its probability.
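A numerical spot-check of this grouping identity (an illustrative sketch; the particular probabilities are arbitrary):

```python
import math

def H(probs):
    """Shannon entropy in bits (0 log 0 = 0 by convention)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

p = [0.5, 0.3, 0.2]             # complete scheme; E_3 has p_3 = 0.2
q = [0.15, 0.05]                # E_3 split into two disjoint subsets
lhs = H([0.5, 0.3] + q)         # entropy of the refined scheme
rhs = H(p) + p[2] * H([qj / p[2] for qj in q])
print(abs(lhs - rhs) < 1e-12)   # True: splitting E_n adds p_n * H(q/p_n)
```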

Entropy. Basic Properties

In general, $H(p_1, \dots, p_n)$ is continuous in each $p_i$ for all $0 \le p_i \le 1$.

Entropy for Two-dimensional Discrete Finite Probability Schemes

The two-dimensional probability scheme provides the simplest mathematical model for a communication system with a transmitter and a receiver. Consider two finite discrete sample spaces $\Omega_1$ (the transmitter space) and $\Omega_2$ (the receiver space), and their product space $\Omega$.

Entropy for Two-dimensional Discrete Finite Probability Schemes

In $\Omega_1$ and $\Omega_2$ we select complete sets of events

$$E_1, E_2, \dots, E_n \qquad \text{and} \qquad F_1, F_2, \dots, F_m.$$

Each event $E_k$ of $\Omega_1$ may occur in conjunction with any event $F_j$ of $\Omega_2$. Thus for the product space $\Omega = \Omega_1 \times \Omega_2$ we have the following complete set of events:

$$\{ E_k \cap F_j \}, \qquad k = 1, \dots, n; \; j = 1, \dots, m.$$

Entropy for Two-dimensional Discrete Finite Probability Schemes

We deal with the following three complete probability schemes: the scheme $\{E_k\}$ of $\Omega_1$, the scheme $\{F_j\}$ of $\Omega_2$, and the joint scheme $\{E_k \cap F_j\}$ with probabilities $p_{kj} = P\{E_k \cap F_j\}$. Hence the joint probability matrix is

$$P = \begin{pmatrix} p_{11} & p_{12} & \cdots & p_{1m} \\ p_{21} & p_{22} & \cdots & p_{2m} \\ \vdots & \vdots & & \vdots \\ p_{n1} & p_{n2} & \cdots & p_{nm} \end{pmatrix}.$$
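As an illustrative sketch (not part of the original slides), the two marginal schemes and their entropies can be recovered from a joint probability matrix by summing its rows and columns; the example matrix here is arbitrary:

```python
import math

def H(probs):
    """Shannon entropy in bits (0 log 0 = 0 by convention)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint matrix p_kj = P{E_k and F_j}; all entries sum to 1.
P = [[0.25, 0.10],
     [0.05, 0.60]]

p_E = [sum(row) for row in P]        # marginal scheme of Omega_1 (row sums)
p_F = [sum(col) for col in zip(*P)]  # marginal scheme of Omega_2 (column sums)

print(H(p_E), H(p_F))                    # marginal entropies
print(H([p for row in P for p in row]))  # joint entropy of the product scheme
```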