Probabilistic verification
Mario Szegedy, Rutgers
www/cs.rutgers.edu/~szegedy/07540
Lecture 4

Codes I (outline)
Madhu's notes:
- It all started with Shannon: binary symmetric (noisy) channels
- Malicious (rather than random) errors
- Error-correcting codes and their parameters
Papadimitriou's notes:
- Reed-Solomon (RS) codes
- Berlekamp-Welch (BW) decoding algorithm for RS codes
Madhu's notes:
- A generalization of the BW decoding algorithm by Sudan
My notes:
- Generalized Reed-Solomon codes and their parameters
- Self-correcting properties of the generalized RS codes (local decodability)

Claude Shannon (1916 – 2001). In 1948 Shannon published "A Mathematical Theory of Communication" in two parts, in the July and October issues of the Bell System Technical Journal. This work focuses on the problem of how best to encode the information a sender wants to transmit. In this fundamental work he used tools from probability theory, developed by Norbert Wiener, which were at the time in their nascent stages of being applied to communication theory. Shannon developed information entropy as a measure of the uncertainty in a message, essentially inventing the field of information theory.

Some channel models. A channel takes an input X and produces an output Y according to transition probabilities P(y|x). Memoryless channel:
- the output at time i depends only on the input at time i
- the input and output alphabets are finite

Example: binary symmetric channel (BSC). The output is Y = X + E (mod 2), where X is the binary information sequence, E is the binary error sequence with P(1) = 1 - P(0) = p independently in each position, and Y is the binary output sequence. Each bit is flipped with probability p and received correctly with probability 1-p.
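A quick way to get a feel for the model is to simulate it. Below is a minimal sketch (my own illustration, not from the notes) that pushes random bits through a BSC with crossover probability p and checks the empirical flip rate:

```python
import random

def bsc(bits, p, rng=random):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

if __name__ == "__main__":
    random.seed(0)
    n, p = 100_000, 0.1
    x = [random.randint(0, 1) for _ in range(n)]
    y = bsc(x, p)
    flips = sum(xi != yi for xi, yi in zip(x, y))
    print(f"empirical flip rate: {flips / n:.4f}  (expected ~{p})")
```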

Other models.
Z-channel (optical): input 1 ("light on") is received as 1 with probability 1-p and as 0 with probability p; input 0 ("light off") is always received correctly. Input distribution: P(X=0) = P_0.
Erasure channel (MAC): each input bit is received correctly with probability 1-e and erased (output E) with probability e. Input distribution: P(X=0) = P_0.

Erasure channel with errors: each input bit is received correctly with probability 1-p-e, flipped with probability p, and erased (output E) with probability e.

A general noisy channel is an arbitrary bipartite graph between the input and output alphabets: p_{ij} is the probability that, upon sending i, the receiver receives j. The matrix (p_{ij}) is stochastic (each row sums to 1).
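All of the channels above are instances of one recipe: sample from a row of a stochastic matrix, once per transmitted symbol. A minimal sketch (my notation and parameter choices, not the lecture's), using the erasure-with-errors channel as the concrete matrix:

```python
import random

def channel(symbol, transition, outputs, rng=random):
    """Send one symbol through a memoryless channel given by a row-stochastic
    matrix: transition[i][j] = P(receive outputs[j] | send symbol i)."""
    return rng.choices(outputs, weights=transition[symbol])[0]

if __name__ == "__main__":
    p, e = 0.05, 0.10                       # flip and erasure probabilities
    outputs = ["0", "1", "E"]               # "E" marks an erasure
    T = {
        "0": [1 - p - e, p, e],             # sent 0: correct, flipped, erased
        "1": [p, 1 - p - e, e],             # sent 1: flipped, correct, erased
    }
    random.seed(1)
    print("".join(channel("0", T, outputs) for _ in range(20)))
```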

The coding pipeline: original message → (encoding: add redundancy) → encoded message → (noisy channel) → received message → (decoding) → original message.

Encoded message: m ↦ E(m). Rate of the encoding: R = |m| / |E(m)|. For a fixed channel, is there a fixed rate that allows almost certain recovery as |m| → infinity?

Madhu's notes: 4.pdf

THEOREM: For the binary symmetric channel with error probability p we can find codes with rate 1 - H(p), where H(p) = -p log2(p) - (1-p) log2(1-p) (the binary entropy function).
DEFINITIONS: The Hamming distance Δ(x,y) of two binary strings x, y is the number of positions in which they differ. The Hamming ball with radius r around a binary string x is B(x,r) = { y : Δ(x,y) ≤ r }.
BASIC FACT: For r = pn (with p ≤ 1/2) we have |B(x,r)| ≈ 2^{H(p)n}, up to polynomial factors.
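As a sanity check (my own sketch, not part of the notes), one can compute H(p), the resulting rate 1 - H(p), and compare the exact Hamming ball volume against the 2^{H(p)n} approximation:

```python
from math import comb, log2

def H(p):
    """Binary entropy function H(p) = -p log2 p - (1-p) log2 (1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def ball_volume(n, r):
    """Exact number of binary strings within Hamming distance r of a fixed string."""
    return sum(comb(n, i) for i in range(r + 1))

if __name__ == "__main__":
    p, n = 0.1, 200
    print(f"H({p}) = {H(p):.4f}, achievable rate 1 - H(p) = {1 - H(p):.4f}")
    exact = log2(ball_volume(n, int(p * n)))
    # agreement is up to lower-order (polynomial-in-n) factors
    print(f"log2 |B(x, pn)| = {exact:.1f}  vs  H(p)*n = {H(p) * n:.1f}")
```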

ENCODING (rate R = 1 - H(p)): E: {0,1}^{Rn} → {0,1}^n, chosen uniformly at random.
DECODING: Given y = E(m) + error, find the unique m' such that Δ(y, E(m')) ≤ pn, if such an m' exists; otherwise decode arbitrarily.
ANALYSIS: Pr[decoding error] ≤ Pr[Δ(y, E(m)) > pn] + Pr[some other codeword lies in B(y, pn)] ≤ small + 2^{Rn} · 2^{H(p)n} · 2^{-n}.
The second bound uses: for a random set A and a fixed set B in a universe X, Pr[A and B intersect] ≤ |A||B| / |X|. Here A is the set of the other 2^{Rn} codewords (each uniform in X = {0,1}^n) and B = B(y, pn) with |B| ≈ 2^{H(p)n}.
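The random-coding argument can be run as a toy experiment at tiny block length (a sketch under my own simplifications: exhaustive minimum-distance decoding instead of the ball decoder on the slide, and parameters chosen small enough to enumerate):

```python
import random

def hamming(a, b):
    """Hamming distance between two equal-length tuples."""
    return sum(x != y for x, y in zip(a, b))

def experiment(n=20, rate=0.25, p=0.1, trials=200, seed=0):
    """Random code E: {0,1}^{Rn} -> {0,1}^n, minimum-distance decoding;
    returns the empirical decoding error rate over random messages."""
    rng = random.Random(seed)
    k = int(rate * n)
    code = {m: tuple(rng.randint(0, 1) for _ in range(n)) for m in range(2 ** k)}
    errors = 0
    for _ in range(trials):
        m = rng.randrange(2 ** k)
        y = tuple(b ^ (rng.random() < p) for b in code[m])     # BSC noise
        decoded = min(code, key=lambda mm: hamming(code[mm], y))
        errors += decoded != m
    return errors / trials

if __name__ == "__main__":
    # rate 0.25 is well below capacity 1 - H(0.1) ~ 0.53, so even at this
    # toy block length the decoding error rate should be small
    print("empirical decoding error rate:", experiment())
```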

No better rate is possible. Transmit random messages: the decoding error is large for any code with rate R > 1 - H(p).
PROOF SKETCH: The decoding procedure D partitions {0,1}^n into 2^{Rn} parts, one per message m (the set of received words that decode to m). The distribution of E(m) + error, however, is spread essentially uniformly over ≈ 2^{H(p)n} strings around E(m).

The average part has size 2^n / 2^{Rn} = 2^{(1-R)n}, while correct decoding of m requires its part to capture most of the ≈ 2^{H(p)n} likely received words. This forces 2^{(1-R)n} ≥ 2^{H(p)n} up to lower-order terms, i.e. R ≤ 1 - H(p).

C = (n,k,d)_q codes:
n = block length
k = information length
d = minimum distance
k/n = information rate
d/n = distance rate
q = alphabet size
(Example: the binary Hamming code is a (7,4,3)_2 code, with information rate 4/7 and distance rate 3/7.)

Linear Codes
C = { x : x ∈ L }, where L is a linear subspace of F_q^n.
Since Δ(x,y) = Δ(x-z, y-z), we get min_{0 ≠ x ∈ L} Δ(0,x) = min_{x ≠ y ∈ L} Δ(x,y): for a linear code, the minimum distance equals the minimum weight of a nonzero codeword.
k = dimension of the message space = dim L
n = dimension of the whole space (in which the code words lie)
Generator matrix G: C = { xG : x ∈ Σ^k }
Parity-check matrix H: C = { y ∈ Σ^n : yH = 0 }
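As a concrete sanity check (a minimal sketch: the specific (7,4) Hamming generator matrix and the brute-force distance computation are my additions, not the lecture's), the following encodes via xG, verifies yH = 0 for every codeword, and finds d as the minimum weight:

```python
import itertools

# A standard (7,4,3)_2 Hamming code in systematic form G = [I_4 | P].
P = [(1, 1, 0), (1, 0, 1), (0, 1, 1), (1, 1, 1)]
G = [tuple(int(i == j) for j in range(4)) + P[i] for i in range(4)]
# Parity-check matrix H (7 x 3), stacked so that cH = 0 for every codeword:
H = P + [tuple(int(i == j) for j in range(3)) for i in range(3)]

def encode(m):
    """Codeword xG over F_2 (row vector times generator matrix)."""
    return tuple(sum(mi * gi for mi, gi in zip(m, col)) % 2 for col in zip(*G))

def checks(c):
    """cH over F_2; equals the zero vector iff c is a codeword."""
    return tuple(sum(ci * hi for ci, hi in zip(c, col)) % 2 for col in zip(*H))

if __name__ == "__main__":
    codewords = [encode(m) for m in itertools.product((0, 1), repeat=4)]
    assert all(checks(c) == (0, 0, 0) for c in codewords)
    d = min(sum(c) for c in codewords if any(c))   # min distance = min weight
    print(f"(n, k, d)_q = (7, 4, {d})_2, rate {4/7:.2f}, distance rate {d/7:.2f}")
```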

Reed-Solomon codes
The Reed-Solomon code is an error-correcting code that works by oversampling a polynomial constructed from the data: the k message symbols define a polynomial of degree < k over a field F, which is evaluated at n > k distinct points of F. C is an [n, k, n-k+1] code; in other words, it is a linear code of length n (over F) with dimension k and minimum distance n-k+1.
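A minimal sketch of the encoding step (my choices of the prime field GF(17), the parameters, and the evaluation points 0..n-1; the lecture does not fix these): view the k message symbols as coefficients and evaluate at n distinct field points.

```python
q = 17  # work over the prime field F = GF(17)

def rs_encode(message, n, q=q):
    """Reed-Solomon encoding: treat the k message symbols as coefficients of a
    polynomial of degree < k and evaluate it at n distinct points of GF(q)."""
    assert len(message) <= n <= q

    def poly(a):
        # Horner evaluation of the message polynomial at the point a
        v = 0
        for coeff in reversed(message):
            v = (v * a + coeff) % q
        return v

    return [poly(a) for a in range(n)]

if __name__ == "__main__":
    k, n = 4, 9
    print(rs_encode([3, 0, 1, 5], n))
    # Two distinct degree < k polynomials agree on at most k-1 = 3 points,
    # so distinct codewords differ in at least n-k+1 = 6 positions.
```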