Information without rolling dice. Taehyung J. Lim and Massimo Franceschetti. Simons Conference, Austin, Texas, May 21, 2015.

Shannon’s information theory: analogous notions in a deterministic setting?

Why deterministic?
- Networked control systems (CPS): Nair (2013), Matveev and Savkin (2007)
- Electromagnetic wave theory: Franceschetti (2015), Franceschetti, Migliore, and Minero (2009)

Deterministic setting
- “His mathematical intuition is remarkably precise.” A.N. Kolmogorov on Shannon’s work
- Influenced by Shannon’s work, Kolmogorov introduced the concepts of entropy and capacity in the deterministic framework of functional approximation.

Deterministic vs. stochastic
- Kolmogorov drew some connections in Kolmogorov (ISIT 1956) and in Kolmogorov and Tikhomirov (AMS translation 1961, Appendix II), but limited the discussion “at the level of analogy and parallelism.”
- At the time, the theory of spectral concentration of bandlimited functions had not yet been developed by Landau, Pollak, and Slepian (1960s-70s).
- Donoho (2000) describes the connection between entropy and rate distortion.

Capacity

Comparison
- The two models are driven by the same geometric intuition and have the same operational significance in terms of reliable communication with noise-perturbed signals.
- Shannon and Kolmogorov are playing the same game of packing billiard balls in high-dimensional space (and keeping score in bits).
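The billiard-ball picture can be checked numerically. Below is a minimal sketch, using the standard formula for the volume of an n-ball; the function name `ball_volume` and the parameter values are mine, not the talk's. Packing disjoint noise balls of radius r inside a signal ball of radius R bounds the number of codewords by the volume ratio, so the score in bits grows linearly with the dimension n.

```python
import math

def ball_volume(n, r):
    """Volume of an n-dimensional ball of radius r."""
    return math.pi ** (n / 2) * r ** n / math.gamma(n / 2 + 1)

# Packing disjoint noise balls of radius r inside a signal ball of
# radius R: a volume bound limits the number of codewords to
#   M <= ball_volume(n, R) / ball_volume(n, r) = (R / r) ** n,
# so the number of bits, log2(M), grows linearly with dimension n.
n, R, r = 100, 10.0, 1.0
bits = math.log2(ball_volume(n, R) / ball_volume(n, r))
print(bits)  # n * log2(R / r) = 100 * log2(10) ~= 332.2
```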

 Jagerman’s bounds

Lower bound
- The volumes of both the cube and the ball tend to zero.
- The volume of the cube vanishes at a faster rate.
- Volume escapes to the boundary of the sphere.
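The three bullets can be verified in a few lines. This is a sketch under the assumption (mine) that the "cube" is the hypercube inscribed in the unit ball, side 2/sqrt(n):

```python
import math

def unit_ball_volume(n):
    """Volume of the unit ball in n dimensions."""
    return math.pi ** (n / 2) / math.gamma(n / 2 + 1)

for n in (10, 50, 100):
    ball = unit_ball_volume(n)
    # The hypercube inscribed in the unit ball has side 2 / sqrt(n).
    cube = (2 / math.sqrt(n)) ** n
    # Both volumes tend to zero, the cube's faster than the ball's:
    print(n, cube, ball, cube / ball)
    # Volume escapes to the boundary: the fraction of the ball's
    # volume in the thin shell of radii [0.99, 1] tends to 1.
    print(n, 1 - 0.99 ** n)
```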

Upper bound
- Follows from the Kolmogorov inequality and an upper bound on

- Tight up to half a bandwidth.
- Capacity scales linearly with bandwidth and logarithmically with SNR.
- The bounds do not provide an explicit codebook, but rely on geometric properties of covering and packing.
- Lim and F. (preprint)
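For comparison, Shannon's AWGN capacity formula C = W log2(1 + SNR) exhibits the same scaling: linear in bandwidth, logarithmic in SNR. A quick numerical check (function name is mine):

```python
import math

def awgn_capacity(bandwidth_hz, snr):
    """Shannon capacity of a band-limited AWGN channel, in bits/s:
    C = W * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + snr)

# Linear in bandwidth at fixed SNR: doubling W doubles C.
print(awgn_capacity(2e6, 100) / awgn_capacity(1e6, 100))  # 2.0
# Logarithmic in SNR at fixed bandwidth: squaring a large SNR
# only roughly doubles C.
print(awgn_capacity(1e6, 1e4) / awgn_capacity(1e6, 1e2))
```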

 “What is the notion of an error exponent in a deterministic setting?” François Baccelli at Allerton. End of the game?

Revisit the model
- Allow for a certain amount of overlap in the packing of the balls.
- Signals in a codebook are δ-distinguishable if the portion of space where the received signal may lie and result in a decoding error is at most δ.
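A Monte Carlo sketch of the overlap idea (the sampling trick, function names, and parameter values are mine, chosen for illustration): estimate the fraction of one noise ball's volume that also lies in a neighbor's ball when the centers are pulled closer than the overlap-free distance 2r.

```python
import math
import random

def sample_in_ball(n, r, rng):
    """Uniform random point in an n-ball of radius r (Gaussian trick:
    a normalized Gaussian gives the direction, u**(1/n) the radius)."""
    g = [rng.gauss(0.0, 1.0) for _ in range(n)]
    norm = math.sqrt(sum(x * x for x in g))
    rho = r * rng.random() ** (1.0 / n)
    return [rho * x / norm for x in g]

def overlap_fraction(n, r, d, trials, seed=0):
    """Monte Carlo estimate of the fraction of one noise ball's volume
    that also lies inside a second ball of radius r whose center is a
    distance d away -- the region where decoding can be ambiguous."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        p = sample_in_ball(n, r, rng)
        # Squared distance from p to the second center (d, 0, ..., 0).
        dist2 = (p[0] - d) ** 2 + sum(x * x for x in p[1:])
        if dist2 <= r * r:
            hits += 1
    return hits / trials

# Disjoint packing needs center distance 2r; pulling the centers in
# to 1.2r leaves only a small ambiguous fraction of each ball.
print(overlap_fraction(n=10, r=1.0, d=1.2, trials=20000))
```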

- Tight in the high-SNR regime.
- As in Shannon’s case, the capacity per unit time does not depend on the size of the error region.
- Lim and F. (preprint)

Error exponent
- Let M be the number of messages in the codebook, and suppose the size of the error region is at most δ.
- The error exponent in the deterministic model is the exponential rate at which δ can be driven to zero.

Entropy
- Connections between deterministic and stochastic notions of entropy: Donoho (2000), Neuhoff and Gray (1998).
- Kolmogorov entropy corresponds to the minimum number of bits that can represent any signal in the space within a given error constraint.
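A toy instance of this deterministic entropy, for the unit interval: covering [0, 1] by intervals of length 2ε takes ⌈1/(2ε)⌉ intervals, so log2⌈1/(2ε)⌉ bits suffice to name a representative within error ε. A minimal sketch (the function name is mine):

```python
import math

def eps_entropy_interval(eps):
    """Kolmogorov eps-entropy of [0, 1]: log2 of the minimum number of
    intervals of length 2*eps (radius-eps balls) needed to cover it,
    i.e. the bits needed to name a representative within error eps."""
    return math.log2(math.ceil(1 / (2 * eps)))

for eps in (0.25, 0.05, 0.005):
    print(eps, eps_entropy_interval(eps))
# H_eps ~ log2(1 / (2 * eps)): each halving of eps costs about one bit.
```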

Entropy
- Shannon’s entropy corresponds to the average number of bits necessary to represent any quantized stochastic process of a given class, with uniform quantization on each coordinate of the space.
- If a stochastic process is represented by a codebook point with mean-square error at most D, then the minimum number of bits per second that represents the process with the given accuracy is given by Shannon’s rate-distortion function.
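On the stochastic side, the classical example is the memoryless Gaussian source, whose rate-distortion function is R(D) = (1/2) log2(σ²/D) bits per sample; this is a standard textbook result used here for illustration, not a formula from the talk.

```python
import math

def gaussian_rd(sigma2, D):
    """Rate-distortion function of a memoryless Gaussian source with
    variance sigma2 under mean-square distortion D, in bits/sample:
    R(D) = max(0, 0.5 * log2(sigma2 / D))."""
    return max(0.0, 0.5 * math.log2(sigma2 / D))

# Each extra bit per sample cuts the achievable distortion by 4x:
print(gaussian_rd(1.0, 0.25))    # 1.0
print(gaussian_rd(1.0, 0.0625))  # 2.0
print(gaussian_rd(1.0, 2.0))     # 0.0 (distortion above the variance is free)
```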

Entropy

- Lim and F. (preprint)
- Jagerman

It’s a draw!