Revision of Chapter III


1 Revision of Chapter III For an information source {p i, i = 1, 2, …, N}, its entropy is defined by H = −Σ p i log₂ p i (bits per symbol). Shannon's first theorem: for an instantaneous (prefix) coding, we have H ≤ B s < H + 1, where B s = Σ p i l i is the average length of the code.
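The bound above can be checked numerically. A minimal sketch (the function and variable names are illustrative, not from the slides) computing H and B s for a small source and verifying H ≤ B s < H + 1:

```python
import math

def entropy(probs):
    """H = -sum(p_i * log2(p_i)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def average_length(probs, lengths):
    """B_s = sum(p_i * l_i), the mean codeword length of a prefix code."""
    return sum(p * l for p, l in zip(probs, lengths))

probs = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]          # e.g. the prefix code 0, 10, 110, 111

H = entropy(probs)                    # 1.75 bits/symbol
Bs = average_length(probs, lengths)   # 1.75 bits/symbol
assert H <= Bs < H + 1                # Shannon's first theorem holds
```

For this dyadic source the code meets the entropy exactly (B s = H = 1.75), the best case allowed by the theorem.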

2 The optimal coding can be achieved via Huffman coding, which is constructed by repeatedly merging the two least probable symbols into a single node until one node remains; the codewords are then read off the resulting binary tree.
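The merging procedure can be sketched with a priority queue; this is an illustrative implementation (not from the slides) that returns only the codeword lengths, which is what the average-length bound needs:

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Build a Huffman tree by repeatedly merging the two least
    probable nodes; return the codeword length of each symbol."""
    tiebreak = count()  # avoids comparing lists when probabilities tie
    # heap items: (probability, tiebreaker, symbol indices under this node)
    heap = [(p, next(tiebreak), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:   # every symbol under the merged node gains one bit
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(tiebreak), s1 + s2))
    return lengths

print(huffman_lengths([0.5, 0.25, 0.125, 0.125]))  # [1, 2, 3, 3]
```

Each merge pushes every symbol beneath it one level deeper in the tree, so incrementing the lengths at merge time is equivalent to reading depths off the finished tree.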

3 Shannon’s capacity: C = B log₂(1 + S/N) b/s. The Hamming distance between two codes X and Y is the number of bit positions in which X and Y differ. A parity check code is achieved via adding a parity bit to the actual code. Encryption is achieved via a matched key approach.
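A short sketch of the three computable items on this slide (the bandwidth/SNR values are an illustrative telephone-line example, not from the slides):

```python
import math

def capacity(bandwidth_hz, snr):
    """Shannon capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr)

def hamming_distance(x, y):
    """Number of bit positions in which codewords x and y differ."""
    return sum(a != b for a, b in zip(x, y))

def add_even_parity(bits):
    """Append a parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

print(round(capacity(3000, 1000), 1))              # ~29901.7 b/s
print(hamming_distance([1, 0, 1, 1], [1, 1, 0, 1]))  # 2
print(add_even_parity([1, 0, 1]))                  # [1, 0, 1, 0]
```

A single parity bit makes the minimum Hamming distance of the code 2, enough to detect (but not correct) any single-bit error.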

