1 Computer Vision – Compression (1), Hanyang University, Jong-Il Park

2 Image Compression
The problem of reducing the amount of data required to represent a digital image.
- Underlying basis: removal of redundant data.
- Mathematical viewpoint: transforming a 2-D pixel array into a statistically uncorrelated data set.

3 Topics to Be Covered
- Fundamentals: basic concepts of the source coding theorem.
- Practical techniques: lossless coding, lossy coding, optimum quantization, predictive coding, transform coding.
- Standards: JPEG, MPEG, and recent issues.

4 History of Image Compression
- Theoretical foundation: C. E. Shannon's work in the 1940s.
- Analog compression: aimed at reducing video transmission bandwidth ("bandwidth compression"), e.g., subsampling methods and subcarrier modulation.
- Digital compression: driven by the development of ICs and computers. Early 1970s: facsimile transmission (2-D binary image coding). Academic research through the 1970s and 1980s; the field matured rapidly around 1990 with standards such as JPEG, MPEG, H.263, ...

5 Data Redundancy
- Data vs. information: the same information can be represented by different amounts of data.
- Data redundancy and relative data redundancy (see the definitions below).
- Three basic redundancies: 1. coding redundancy, 2. interpixel redundancy, 3. psychovisual redundancy.
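The compression ratio and relative data redundancy referred to above are not written out on the slide; the standard definitions, with n1 and n2 the numbers of information-carrying units in two representations of the same information, are:

```latex
C_R = \frac{n_1}{n_2}, \qquad R_D = 1 - \frac{1}{C_R}
```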

6 Coding Redundancy
- Code: a system of symbols used to represent a body of information or a set of events.
- Code word: a sequence of code symbols assigned to an event.
- Code length: the number of symbols in each code word.
- Average number of bits required to represent each pixel (see below).
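The average number of bits mentioned above is the expected code-word length; in the usual notation, with r_k the gray levels, p_r(r_k) their probabilities, and l(r_k) the number of bits used to represent r_k:

```latex
L_{\text{avg}} = \sum_{k} l(r_k)\, p_r(r_k)
```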

7 Example: Coding Redundancy
Reduction of the average code length by variable-length coding: shorter code words are assigned to the more probable gray levels. A small numerical sketch follows.
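The slide's table of gray levels, probabilities, and code assignments is not reproduced here, so the numbers below are hypothetical; the point is only that a valid variable-length (prefix) code can yield a smaller L_avg than the fixed-length natural code:

```python
# Hypothetical probabilities and code lengths (the slide's actual table is not shown here).
probs = [0.4, 0.3, 0.1, 0.1, 0.06, 0.04]      # assumed gray-level probabilities
fixed_lengths = [3] * len(probs)              # natural 3-bit fixed-length code
var_lengths = [1, 2, 3, 4, 5, 5]              # a valid prefix code (Kraft sum = 1)

l_fixed = sum(p * l for p, l in zip(probs, fixed_lengths))
l_var = sum(p * l for p, l in zip(probs, var_lengths))
print(f"fixed-length:    {l_fixed:.2f} bits/pixel")
print(f"variable-length: {l_var:.2f} bits/pixel")
print(f"compression ratio C_R = {l_fixed / l_var:.2f}")
```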

8 Correlation
- Cross correlation between two sequences.
- Autocorrelation of a single sequence, used here to measure the dependence between neighboring pixels along an image line; a standard normalized form is given below.
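A normalized autocorrelation along one image line, as commonly used to illustrate interpixel dependence (an assumption, since the slide's own formula is not reproduced): for row x of an image f that is N pixels wide,

```latex
\gamma(\Delta n) = \frac{A(\Delta n)}{A(0)}, \qquad
A(\Delta n) = \frac{1}{N - \Delta n} \sum_{y=0}^{N-1-\Delta n} f(x, y)\, f(x, y + \Delta n)
```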

9 Example: Correlation

10 Interpixel Redundancy
Often referred to, depending on the context, as:
- Spatial redundancy
- Geometric redundancy
- Interframe redundancy

11 Example: Interpixel Redundancy

12 Example: Run-Length Coding (see the sketch below)
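Run-length coding maps each line of pixels into (value, run length) pairs, which removes interpixel redundancy in images with long constant runs. A minimal sketch; the input row is a hypothetical binary image line, not the slide's actual example:

```python
from itertools import groupby

def run_length_encode(row):
    """Map a sequence of pixel values to (value, run_length) pairs."""
    return [(value, sum(1 for _ in run)) for value, run in groupby(row)]

def run_length_decode(pairs):
    """Inverse mapping: expand (value, run_length) pairs back to pixels."""
    return [value for value, length in pairs for _ in range(length)]

row = [0, 0, 0, 1, 1, 1, 1, 0, 0, 1]        # hypothetical binary image row
encoded = run_length_encode(row)
assert run_length_decode(encoded) == row     # the mapping is reversible (lossless)
print(encoded)                               # [(0, 3), (1, 4), (0, 2), (1, 1)]
```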

13 Psychovisual Redundancy
Information that is not essential to normal visual perception; it can be eliminated (e.g., by coarser quantization) without a large loss in perceived image quality, although the elimination is irreversible (lossy).

14 Image Compression Models
- Communication model: source encoder, channel encoder, channel, channel decoder, source decoder.
- Source encoder and decoder: the source encoder is typically built from a mapper, a quantizer, and a symbol encoder; the source decoder reverses the symbol coding and the mapping.

15 Basic Concepts in Information Theory
- Self-information of an event E: I(E) = -log P(E).
- Source alphabet A with symbols a_j; the probabilities of the source events are collected in the vector z; the pair (A, z) is the source ensemble.
- Entropy (uncertainty) of the source: see below.
- Channel alphabet B and channel matrix Q of transition probabilities.
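The entropy referred to above is the usual average self-information of the source ensemble (A, z) with J symbols:

```latex
H(\mathbf{z}) = -\sum_{j=1}^{J} P(a_j) \log P(a_j)
```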

16 Mutual Information and Capacity
- Equivocation H(z|v): the uncertainty about the source output that remains after the channel output v has been observed.
- Mutual information I(z, v): the information about the source gained by observing the channel output.
- Channel capacity C: the minimum possible I(z, v) is 0; the maximum of I(z, v) over all possible choices of source probabilities in z is the channel capacity.
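Written out (standard definitions, since the slide's formulas are not reproduced here):

```latex
I(\mathbf{z}, \mathbf{v}) = H(\mathbf{z}) - H(\mathbf{z} \mid \mathbf{v}),
\qquad
C = \max_{\mathbf{z}} I(\mathbf{z}, \mathbf{v})
```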

17 Example: Binary Symmetric Channel
- Entropy of the binary source, mutual information, and channel capacity of the BSC.
[Figure: BSC transition diagram with source probabilities p_bs and 1 - p_bs, crossover (error) probability p_e, and correct-transmission probability 1 - p_e.]
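Using the slide's notation (p_bs for the probability of one source symbol, p_e for the crossover probability), the standard results are, with H_bs the binary entropy function:

```latex
\begin{align*}
H_{bs}(p) &= -p \log_2 p - (1 - p) \log_2 (1 - p) \\
I(\mathbf{z}, \mathbf{v}) &= H_{bs}\bigl(p_{bs}(1 - p_e) + (1 - p_{bs})\,p_e\bigr) - H_{bs}(p_e) \\
C &= \max_{p_{bs}} I(\mathbf{z}, \mathbf{v}) = 1 - H_{bs}(p_e) \quad \text{(attained at } p_{bs} = 1/2\text{)}
\end{align*}
```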

18 Noiseless Coding Theorem
- Shannon's first theorem, for a zero-memory source: it is possible to make L'_avg / n arbitrarily close to H(z) by coding sufficiently long extensions of the source (see the statement below).
- Efficiency = entropy / L'_avg.
- Example: coding an extension of the source gives better efficiency (next slide).
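A compact statement, where L'_{avg,n} is taken to denote the average code-word length used for the n-th extension of the source (the interpretation assumed here):

```latex
\lim_{n \to \infty} \frac{L'_{\text{avg},n}}{n} = H(\mathbf{z}),
\qquad
\eta = \frac{n\, H(\mathbf{z})}{L'_{\text{avg},n}}
```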

19 Extension Coding
[Tables: code A for the original source and code B for its second extension.]
- Code A: efficiency = 0.918 / 1.0 = 0.918.
- Code B: efficiency = (0.918 × 2) / 1.89 = 0.97.
- Coding the extension gives better efficiency.
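The slide's numbers can be reproduced under the assumption that the source is binary with symbol probabilities 2/3 and 1/3 and that code B is a prefix code for pairs of source symbols; these assumptions give H(z) ≈ 0.918 and L'_avg ≈ 1.89:

```python
from math import log2

p = [2/3, 1/3]                                          # assumed source probabilities
H = -sum(q * log2(q) for q in p)                        # source entropy, ~0.918 bits

# Code A: one bit per source symbol.
L_A = 1.0
print(f"code A efficiency: {H / L_A:.3f}")              # ~0.918

# Code B: a prefix code for the 2nd extension (pairs of symbols), e.g. 0, 10, 110, 111.
pairs = {(i, j): p[i] * p[j] for i in range(2) for j in range(2)}
lengths = {(0, 0): 1, (0, 1): 2, (1, 0): 3, (1, 1): 3}
L_B = sum(pairs[k] * lengths[k] for k in pairs)         # ~1.89 bits per symbol pair
print(f"code B efficiency: {2 * H / L_B:.3f}")          # ~0.97
```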

20 Noisy Coding Theorem
- Shannon's second theorem, for a zero-memory channel: for any R < C there exist an integer r and a code of block length r and rate R such that the probability of a block decoding error is arbitrarily small.
- Rate-distortion theory: the source output can be recovered at the decoder with an arbitrarily small probability of error provided the channel capacity satisfies C > R(D) + ε.
[Figure: rate-distortion function R(D); operating points on or above the curve are feasible, points below it are never feasible.]
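The rate-distortion function used above is conventionally defined as the smallest mutual information achievable when the average distortion does not exceed D; Q_D denotes that constraint set (this is the standard definition, not reproduced on the slide):

```latex
R(D) = \min_{Q \in Q_D} I(\mathbf{z}, \mathbf{v})
```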

21 Using Mappings to Reduce Entropy
- First-order estimate of the entropy > second-order estimate > third-order estimate > ...
- The (estimated) entropy of a properly mapped image (e.g., a "difference source") is in most cases smaller than that of the original image source; a small sketch follows.
- How to implement this? The topic of the next lecture!
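A minimal sketch of the observation above, using a hypothetical synthetic image (numpy and the test image are assumptions for illustration): the first-order entropy estimate of the horizontal-difference mapping is typically lower than that of the original pixels.

```python
import numpy as np

def first_order_entropy(values):
    """First-order entropy estimate (bits/symbol) from the value histogram."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
# Hypothetical "smooth" image: a ramp plus mild noise, so neighboring pixels are correlated.
image = (np.tile(np.arange(64), (64, 1)) + rng.integers(0, 4, (64, 64))).astype(np.int32)

# Difference mapping: keep the first column, replace the rest by horizontal differences.
diff = np.column_stack([image[:, :1], np.diff(image, axis=1)])

print(f"original image : {first_order_entropy(image):.2f} bits/pixel")
print(f"difference map : {first_order_entropy(diff):.2f} bits/pixel")
```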

