16.548 Notes II. Jay Weitzen, University of Massachusetts Lowell.


Slide 15: Information and Communication

Slide 16: Key Slide

Slide 18: Hard Decision vs. Soft Decision
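Slide 18 contrasts hard- and soft-decision decoding. A minimal Monte Carlo sketch (not the course's code) using a length-3 repetition code over BPSK with additive Gaussian noise: the hard decoder thresholds each received sample and takes a majority vote, while the soft decoder sums the raw samples before deciding once, so it keeps the per-sample reliability information. The SNR and simulation parameters below are arbitrary illustrative choices.

```python
import math
import random

def repetition_ber(num_bits=20000, snr_db=0.0, seed=1):
    """Monte Carlo bit error rate of a 3x repetition code over BPSK/AWGN,
    decoded with hard decisions (majority vote) vs. soft decisions
    (sum the raw samples). Illustrative sketch only."""
    random.seed(seed)
    snr = 10 ** (snr_db / 10.0)           # per-symbol Es/N0, linear
    sigma = math.sqrt(1.0 / (2.0 * snr))  # noise standard deviation
    hard_err = soft_err = 0
    for _ in range(num_bits):
        bit = random.randint(0, 1)
        tx = 1.0 if bit else -1.0         # BPSK mapping: 0 -> -1, 1 -> +1
        rx = [tx + random.gauss(0.0, sigma) for _ in range(3)]
        # Hard decision: threshold each sample, then majority vote
        hard_bit = 1 if sum(r > 0 for r in rx) >= 2 else 0
        # Soft decision: combine the raw samples first, then decide once
        soft_bit = 1 if sum(rx) > 0 else 0
        hard_err += (hard_bit != bit)
        soft_err += (soft_bit != bit)
    return hard_err / num_bits, soft_err / num_bits

hard_ber, soft_ber = repetition_ber()
```

For this code, soft combining typically recovers about 2 dB relative to hard decisions, which is why practical decoders for convolutional and turbo codes are fed soft inputs.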

Slide 27: Applications of Information Theory: Compression

Slide 28: Shannon’s First Theorem, a.k.a. the Source Coding Theorem

Slide 29: Shannon’s Source Coding Theorem
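Slides 28 and 29 state the source coding theorem: any uniquely decodable binary code for a memoryless source has average length at least the entropy H(X), and a Huffman code achieves average length below H(X) + 1. A self-contained sketch, using a standard heapq-based Huffman construction and made-up probabilities (not taken from the slides):

```python
import heapq
import math

def entropy(probs):
    """Shannon entropy in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs.values() if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code, built with a heap.
    Each heap entry tracks the depth of every symbol in its subtree."""
    heap = [(p, i, {s: 0}) for i, (s, p) in enumerate(sorted(probs.items()))]
    heapq.heapify(heap)
    tie = len(heap)                       # tie-breaker so dicts never compare
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, tie, merged))
        tie += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}   # dyadic example
H = entropy(probs)                                       # 1.75 bits/symbol
lengths = huffman_lengths(probs)
avg_len = sum(probs[s] * lengths[s] for s in probs)
# Source coding theorem: H <= avg_len < H + 1
```

Because these probabilities are dyadic (negative powers of two), the Huffman code meets the entropy bound exactly; for general distributions the average length lands strictly between H and H + 1.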

Slide 33: Rate of a Source Code
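The rate of a source code is its expected codeword length, R = sum_i p_i * l_i bits per source symbol. A tiny worked example; the symbols, probabilities, and codewords below are made up for illustration:

```python
import math

# A hypothetical source and a prefix-free (instantaneous) code for it.
probs = {"a": 0.5, "b": 0.3, "c": 0.2}
code = {"a": "0", "b": "10", "c": "11"}

# Rate = expected codeword length, in bits per source symbol.
rate = sum(probs[s] * len(code[s]) for s in probs)   # 0.5*1 + 0.3*2 + 0.2*2 = 1.5

# The rate can never beat the source entropy (about 1.485 bits here).
H = -sum(p * math.log2(p) for p in probs.values())
```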

Slide 34: Second-Order Block Codes and Huffman Encoding
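The idea behind slide 34: a second-order code assigns codewords to pairs of source symbols rather than to single symbols, which lets the per-symbol rate drop below 1 bit even for a binary alphabet. A sketch with a made-up biased source, using a standard heapq-based Huffman construction (not the slides' code):

```python
import heapq
import math

def huffman_lengths(probs):
    """Binary Huffman codeword lengths via a heap; each entry tracks
    the depth of every symbol in its subtree."""
    heap = [(p, i, {s: 0}) for i, (s, p) in enumerate(sorted(probs.items()))]
    heapq.heapify(heap)
    tie = len(heap)                       # tie-breaker so dicts never compare
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, tie,
                              {s: d + 1 for s, d in {**d1, **d2}.items()}))
        tie += 1
    return heap[0][2]

p = {"0": 0.9, "1": 0.1}                          # biased binary source
H = -sum(q * math.log2(q) for q in p.values())    # about 0.469 bits/symbol

# First order: one codeword per symbol; a binary alphabet forces 1 bit/symbol.
L1 = sum(p[s] * l for s, l in huffman_lengths(p).items())

# Second order: one codeword per *pair* of symbols.
pairs = {a + b: p[a] * p[b] for a in p for b in p}
L2 = sum(pairs[s] * l for s, l in huffman_lengths(pairs).items()) / 2.0
# L2 is 0.645 bits/symbol: between H and L1, so blocking helps.
```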

Slide 40: Some Definitions

Slide 41: Higher-Order Codes Converge
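Slide 41's claim can be checked numerically: coding blocks of n symbols at a time gives a per-symbol rate between H and H + 1/n, so the rate converges to the entropy as n grows. A sketch assuming an i.i.d. biased binary source; the average Huffman length is computed via the standard identity that it equals the sum of all internal-node weights:

```python
import heapq
import math
from itertools import product

def huffman_avg_length(probs):
    """Average codeword length of a binary Huffman code, computed as
    the sum of the weights of all internal (merged) nodes."""
    heap = list(probs.values())
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        merged = heapq.heappop(heap) + heapq.heappop(heap)
        total += merged
        heapq.heappush(heap, merged)
    return total

p = {"0": 0.9, "1": 0.1}                          # assumed i.i.d. source
H = -sum(q * math.log2(q) for q in p.values())    # about 0.469 bits/symbol

rates = []
for n in (1, 2, 3):
    blocks = {"".join(t): math.prod(p[c] for c in t)
              for t in product(p, repeat=n)}
    rates.append(huffman_avg_length(blocks) / n)  # H <= rate < H + 1/n
# rates decrease toward H: roughly 1.0, 0.645, 0.533
```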
