3/23/04 1 Information Rates for Two-Dimensional ISI Channels Jiangxin Chen and Paul H. Siegel Center for Magnetic Recording Research University of California, San Diego DIMACS Workshop March 22-24, 2004

DIMACS Workshop 3/23/04 2 Outline Motivation: Two-dimensional recording Channel model Information rates Bounds on the Symmetric Information Rate (SIR) Upper Bound Lower Bound Convergence Alternative upper bound Numerical results Conclusions

DIMACS Workshop 3/23/04 3 Two-Dimensional Channel Model Constrained input array Linear intersymbol interference Additive, i.i.d. Gaussian noise
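The channel equations on this slide are images in the original deck. A standard way to write the model described by the three bullets, as a sketch with impulse response coefficients h[k,l] on a small support region S and noise variance sigma^2 (these symbols are assumptions, not taken from the slide), is:

```latex
\[
y[i,j] \;=\; \sum_{(k,l)\in S} h[k,l]\, x[i-k,\, j-l] \;+\; w[i,j],
\qquad w[i,j] \sim \mathcal{N}(0,\sigma^{2})\ \text{i.i.d.},
\]
```

where x[i,j] is the (possibly constrained) binary input array.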

DIMACS Workshop 3/23/04 4 Two-Dimensional Processes Input process, output process, and finite output arrays specified by an upper left corner and a lower right corner; a sketch of the notation follows.
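The symbols on this slide are also images. A sketch of one consistent choice of notation (assumed, not taken from the slide): the input and output processes are X = {x[i,j]} and Y = {y[i,j]}, and

```latex
\[
Y_{(1,1)}^{(N_1,N_2)} \;=\; \{\, y[i,j] \;:\; 1 \le i \le N_1,\ 1 \le j \le N_2 \,\}
\]
```

denotes the finite output array with upper left corner (1,1) and lower right corner (N_1, N_2).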

DIMACS Workshop 3/23/04 5 Entropy Rates Output entropy rate: Noise entropy rate: Conditional entropy rate:
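With that block notation, the three rates listed on the slide take their usual forms (a sketch, assuming additive i.i.d. Gaussian noise of variance sigma^2):

```latex
\[
H(Y) = \lim_{N_1,N_2\to\infty} \frac{1}{N_1 N_2}\, H\!\big(Y_{(1,1)}^{(N_1,N_2)}\big),
\qquad
H(N) = \tfrac{1}{2}\log\!\big(2\pi e\,\sigma^{2}\big),
\qquad
H(Y\mid X) = H(N).
\]
```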

DIMACS Workshop 3/23/04 6 Mutual Information Rates Mutual information rate: Capacity: Symmetric information rate (SIR): Inputs are constrained to be independent, identically distributed, and equiprobable binary.
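Correspondingly, a sketch of the standard definitions:

```latex
\[
I(X;Y) = H(Y) - H(Y\mid X) = H(Y) - H(N),
\qquad
C = \sup_{P_X} I(X;Y),
\qquad
\mathrm{SIR} = I(X;Y)\ \text{with i.u.d. equiprobable binary inputs}.
\]
```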

DIMACS Workshop 3/23/04 7 Capacity and SIR The capacity and SIR are useful measures of the achievable storage densities on the two-dimensional channel. They serve as performance benchmarks for channel coding and detection methods. So, it would be nice to be able to compute them.

DIMACS Workshop 3/23/04 8 Finding the Output Entropy Rate For the one-dimensional ISI channel model, the mutual information rate reduces to the output entropy rate minus the noise entropy rate, so the problem is to estimate the output entropy rate (see the sketch below).
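A sketch of the one-dimensional relations the slide refers to (the channel memory m and taps h_k are assumed notation):

```latex
\[
y_n = \sum_{k=0}^{m} h_k\, x_{n-k} + w_n,
\qquad
I(X;Y) = H(Y) - H(N),
\qquad
H(Y) = \lim_{n\to\infty} \frac{1}{n}\, H(Y_1,\dots,Y_n),
\]
```

so the only quantity that needs to be estimated is the output entropy rate H(Y).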

DIMACS Workshop 3/23/04 9 Sample Entropy Rate If we simulate the channel N times, using inputs with specified (Markovian) statistics, and generate output realizations, then the sample entropy rate converges to the true output entropy rate with probability 1 as the number of realizations grows (see the sketch below).
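In formula form (a sketch): if y^(1), ..., y^(N) are the simulated output realizations, each of length n, the sample entropy rate is

```latex
\[
\hat{H}(Y) \;=\; -\frac{1}{N}\sum_{i=1}^{N} \frac{1}{n}\,\log p\big(y^{(i)}_1,\dots,y^{(i)}_n\big),
\]
```

which converges to H(Y) with probability 1 as N and the block length n grow.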

DIMACS Workshop 3/23/04 10 Computing Sample Entropy Rate The forward recursion of the sum-product (BCJR) algorithm can be used to calculate the probability of a sample realization of the channel output. In fact, the log-probability of the output sequence can be written as a sum of per-symbol terms, where each term is precisely the normalization constant in the (normalized) forward recursion.
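A minimal sketch of this computation for a one-dimensional binary-input ISI channel with i.u.d. inputs. The taps, noise level, and block length below are illustrative assumptions, not values from the talk; the point is that the per-symbol normalization constants of the forward recursion accumulate to -(1/n) log p(y_1^n):

```python
import numpy as np

def sample_entropy_rate(h, sigma, n, rng):
    """Estimate -(1/n) log2 p(y_1^n) for a binary-input 1-D ISI channel
    with i.u.d. inputs, via the normalized BCJR forward recursion."""
    m = len(h) - 1                                     # channel memory
    states = [tuple(s) for s in np.ndindex(*([2] * m))] if m > 0 else [()]
    # simulate one output realization with i.u.d. +/-1 inputs
    x = rng.choice([-1.0, 1.0], size=n + m)
    y = np.convolve(x, h, mode="full")[m:n + m] + sigma * rng.standard_normal(n)
    # forward recursion; alpha is normalized at every step
    alpha = np.full(len(states), 1.0 / len(states))
    log_prob = 0.0
    for k in range(n):
        new_alpha = np.zeros(len(states))
        for si, s in enumerate(states):
            for b in (0, 1):
                xk = 2.0 * b - 1.0                     # current input symbol
                past = [2.0 * v - 1.0 for v in s]      # previous m inputs, most recent first
                mean = h[0] * xk + sum(h[i + 1] * past[i] for i in range(m))
                like = np.exp(-(y[k] - mean) ** 2 / (2 * sigma ** 2))
                ns = tuple([b] + list(s[:-1])) if m > 0 else ()
                new_alpha[states.index(ns)] += 0.5 * like * alpha[si]
        z = new_alpha.sum()          # normalization constant = p(y_k | y_1^{k-1}), up to the Gaussian factor
        log_prob += np.log2(z) + np.log2(1.0 / np.sqrt(2 * np.pi * sigma ** 2))
        alpha = new_alpha / z
    return -log_prob / n             # sample entropy rate estimate (bits/symbol)

rng = np.random.default_rng(0)
h = np.array([1.0, -1.0])            # illustrative dicode-like response (assumption)
print(sample_entropy_rate(h, sigma=0.5, n=10000, rng=rng))
```

Averaging the returned value over many realizations gives the estimate of H(Y); subtracting H(N) then gives the SIR estimate.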

DIMACS Workshop 3/23/04 11 Computing Entropy Rates The Shannon-McMillan-Breiman theorem implies that this estimate converges to the true entropy rate as the sample length grows, where the estimate is computed from a single long sample realization of the channel output process.

DIMACS Workshop 3/23/04 12 SIR for Partial-Response Channels

DIMACS Workshop 3/23/04 13 Capacity Bounds for Dicode

DIMACS Workshop 3/23/04 14 Markovian Sufficiency Remark: It can be shown that optimized Markovian processes whose states are determined by their previous r symbols can asymptotically achieve the capacity of finite-state intersymbol interference channels with AWGN as the order r of the input process approaches infinity. (J. Chen and P.H. Siegel, ISIT 2004)

DIMACS Workshop 3/23/04 15 Capacity and SIR in Two Dimensions In two dimensions, we could estimate the output entropy rate by calculating the sample entropy rate of a very large simulated output array. However, there is no counterpart of the BCJR algorithm in two dimensions to simplify the calculation. Instead, we use conditional entropies to derive upper and lower bounds on the entropy rate.

DIMACS Workshop 3/23/04 16 Array Ordering Permuted lexicographic ordering: choose a vector k that is a permutation of the coordinate indices (1,2), and order the array indices by their permuted coordinates: one index precedes another if its first permuted coordinate is smaller, or if the first permuted coordinates are equal and its second permuted coordinate is smaller. Therefore, one permutation gives row-by-row ordering and the other gives column-by-column ordering.
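In symbols (a sketch of the assumed notation): with a permutation k = (k_1, k_2) of (1,2), an index i = (i_1, i_2) precedes j = (j_1, j_2) if

```latex
\[
i_{k_1} < j_{k_1}
\quad\text{or}\quad
\big( i_{k_1} = j_{k_1} \ \text{and}\ i_{k_2} < j_{k_2} \big).
\]
```

Taking k = (1,2) gives row-by-row ordering, and k = (2,1) gives column-by-column ordering.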

DIMACS Workshop 3/23/04 17 Two-Dimensional “Past” Let l be a non-negative vector. Define Past{Y[i,j]} to be the elements that precede Y[i,j] (with respect to the permuted lexicographic ordering given by permutation k) inside the region of extent l around position [i,j].

DIMACS Workshop 3/23/04 18 Examples of Past{Y[i,j]}

DIMACS Workshop 3/23/04 19 Conditional Entropies For a stationary two-dimensional random field Y on the integer lattice, the entropy rate can be expressed as the conditional entropy of a single element given its past, as sketched below. (The proof uses the entropy chain rule. See [5-6].) This extends to random fields on the hexagonal lattice, via the natural mapping to the integer lattice.
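A sketch of the characterization, in the Past notation of the earlier slides (the truncation vector l is assumed notation):

```latex
\[
H(Y) \;=\; \lim_{l\to\infty} H\!\big(Y[i,j] \,\big|\, \mathrm{Past}_{l}\{Y[i,j]\}\big)
\;=\; H\!\big(Y[i,j] \,\big|\, \mathrm{Past}\{Y[i,j]\}\big),
\]
```

where Past_l denotes the past truncated to the region of extent l and Past denotes the full past.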

DIMACS Workshop 3/23/04 20 Upper Bound on H(Y) For a stationary two-dimensional random field Y, the entropy rate is upper-bounded by the conditional entropy of a single element given a finite, truncated version of its past, as sketched below.
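A sketch of the bound (truncating the conditioning can only increase the conditional entropy):

```latex
\[
H(Y) \;\le\; H\!\big(Y[i,j] \,\big|\, \mathrm{Past}_{l}\{Y[i,j]\}\big)
\qquad \text{for every finite } l.
\]
```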

DIMACS Workshop 3/23/04 21 Two-Dimensional Boundary of Past{Y[i,j]} Define the boundary of Past{Y[i,j]} as the set of elements along its edge. The exact expression for the boundary is messy, but the geometrical concept is simple.

DIMACS Workshop 3/23/04 22 Two-Dimensional Boundary of Past{Y[i,j]}

DIMACS Workshop 3/23/04 23 Lower Bound on H(Y) For a stationary two-dimensional hidden Markov field Y, the entropy rate is lower-bounded by the conditional entropy of a single element given a finite, truncated version of its past together with the “state information” for the strip containing it, as sketched below.
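A sketch of the bound, with S_l denoting the “state information” for the strip containing the truncated past (the symbol is an assumption):

```latex
\[
H(Y) \;\ge\; H\!\big(Y[i,j] \,\big|\, \mathrm{Past}_{l}\{Y[i,j]\},\, S_l\big).
\]
```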

DIMACS Workshop 3/23/04 24 Sketch of Proof Upper bound: note that the truncated past is contained in the full past, and that conditioning reduces entropy. Lower bound: follows from the Markov property of Y, given the “state information”.

DIMACS Workshop 3/23/04 25 Convergence Properties The upper bound on the entropy rate is monotonically non-increasing as the size of the conditioning array defined by l increases. The lower bound on the entropy rate is monotonically non-decreasing as the size of the conditioning array defined by l increases.

DIMACS Workshop 3/23/04 26 Convergence Rate The upper bound and lower bound converge to the true entropy rate at least as fast as O(1/l_min), where l_min is the smallest component of the vector l defining the conditioning region.

DIMACS Workshop 3/23/04 27 Computing the SIR Bounds Estimate the two-dimensional conditional entropies over a small array. Calculate the probabilities of the relevant output regions for many realizations of the output array and average the resulting sample entropies. For column-by-column ordering, treat each row as a variable and calculate the joint probability row-by-row using the BCJR forward recursion, as sketched below.
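A minimal sketch of the row-by-row forward recursion for a channel with a 2x2 impulse response on a narrow array, with i.u.d. +/-1 inputs. The taps, array size, and noise level below are illustrative assumptions, not values from the talk. Each input row acts as the state, so the number of states is 2^(number of columns), which is why the method is limited to small arrays:

```python
import itertools
import numpy as np

def sample_entropy_rate_2d(h, sigma, rows, cols, rng):
    """Estimate -(1/(rows*cols)) log2 p(Y) for a 2x2-support ISI channel
    with i.u.d. +/-1 inputs by running a BCJR-style forward recursion
    row by row, with each input row acting as the state."""
    # Simulate one output array; the row above and the column to the left
    # of the array are taken to be zero (boundary condition).
    x = rng.choice([-1.0, 1.0], size=(rows, cols))
    xp = np.zeros((rows + 1, cols + 1))
    xp[1:, 1:] = x
    y = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            y[i, j] = sum(h[k, l] * xp[i + 1 - k, j + 1 - l]
                          for k in range(2) for l in range(2))
    y += sigma * rng.standard_normal((rows, cols))

    row_states = [np.array(s) for s in itertools.product((-1.0, 1.0), repeat=cols)]

    def row_likelihood(y_row, x_prev, x_cur):
        # p(y_row | previous input row, current input row), omitting the
        # (2*pi*sigma^2)^(-1/2) factor for each output sample.
        xp_prev = np.concatenate(([0.0], x_prev))
        xp_cur = np.concatenate(([0.0], x_cur))
        mean = (h[0, 0] * xp_cur[1:] + h[0, 1] * xp_cur[:-1]
                + h[1, 0] * xp_prev[1:] + h[1, 1] * xp_prev[:-1])
        return np.exp(-np.sum((y_row - mean) ** 2) / (2 * sigma ** 2))

    zero_row = np.zeros(cols)
    decode = {s.tobytes(): s for s in row_states}
    decode[zero_row.tobytes()] = zero_row
    alpha = {zero_row.tobytes(): 1.0}          # previous row is the zero boundary
    log_prob = 0.0
    for i in range(rows):
        new_alpha = {}
        for key, a in alpha.items():
            x_prev = decode[key]
            for x_cur in row_states:
                w = a * (0.5 ** cols) * row_likelihood(y[i], x_prev, x_cur)
                k = x_cur.tobytes()
                new_alpha[k] = new_alpha.get(k, 0.0) + w
        z = sum(new_alpha.values())            # normalization constant for row i
        log_prob += np.log2(z) + cols * np.log2(1.0 / np.sqrt(2 * np.pi * sigma ** 2))
        alpha = {k: v / z for k, v in new_alpha.items()}
    return -log_prob / (rows * cols)           # bits per array element

rng = np.random.default_rng(1)
h = np.array([[1.0, 0.5], [0.5, 0.25]])        # illustrative 2x2 response (assumption)
print(sample_entropy_rate_2d(h, sigma=0.7, rows=200, cols=4, rng=rng))
```

Conditional entropies can then be estimated as differences of such joint sample entropies, averaged over many realizations.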

DIMACS Workshop 3/23/04 28 2x2 Impulse Response “Worst-case” scenario - large ISI: Conditional entropies computed from 100,000 realizations. Upper bound: Lower bound: (corresponds to element in middle of last column)

DIMACS Workshop 3/23/04 29 Two-Dimensional “State”

DIMACS Workshop 3/23/04 30 SIR Bounds for 2x2 Channel

DIMACS Workshop 3/23/04 31 Computing the SIR Bounds The number of states for each variable increases exponentially with the number of columns in the array. This requires that the two-dimensional impulse response have a small support region. It is desirable to find other approaches to computing bounds that reduce the complexity, perhaps at the cost of weakening the resulting bounds.

DIMACS Workshop 3/23/04 32 Alternative Upper Bound The modified BCJR approach is limited to channels whose impulse response has a small support region. Instead, introduce an “auxiliary ISI channel” and bound the output entropy rate using an arbitrary conditional probability distribution in place of the true output distribution, as sketched below.
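A sketch of the standard form of this bound: for any auxiliary conditional output distribution q, the cross-entropy of the true output process with respect to q upper-bounds the true output entropy rate,

```latex
\[
H(Y) \;\le\; \lim_{n\to\infty} -\frac{1}{n}\, E\big[\log q(Y_1,\dots,Y_n)\big]
\;=\; H(Y) + D(p \,\|\, q),
\]
```

and the expectation can again be estimated by simulation, running the forward recursion of the auxiliary channel on realizations of the true channel output.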

DIMACS Workshop 3/23/04 33 Choosing the Auxiliary Channel Assume the auxiliary distribution is the conditional probability distribution of the output of an auxiliary ISI channel. A one-dimensional auxiliary channel permits a calculation based upon a larger number of columns in the output array. The conversion of the two-dimensional array into a one-dimensional sequence should “preserve” the statistical properties of the array. Pseudo-Peano-Hilbert space-filling curves can be used on a rectangular array to convert it to a sequence.

DIMACS Workshop 3/23/04 34 Pseudo-Peano-Hilbert Curve
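A minimal sketch of how a 2^k x 2^k array can be read out along a Hilbert curve, using the standard bit-manipulation construction (the 4x4 example is an illustrative assumption; the talk's pseudo-Peano-Hilbert curve for general rectangular arrays is a variant of this idea). The readout keeps nearby array elements close together in the resulting sequence:

```python
def hilbert_d2xy(order, d):
    """Map a distance d along a Hilbert curve covering a 2**order x 2**order
    grid to (x, y) coordinates (standard bit-twiddling construction)."""
    x = y = 0
    t = d
    s = 1
    while s < (1 << order):
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                      # rotate the quadrant if needed
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def hilbert_scan(array_2d):
    """Read a 2**k x 2**k array into a 1-D list along the Hilbert curve."""
    n = len(array_2d)
    order = n.bit_length() - 1           # assumes n is a power of two
    return [array_2d[y][x] for x, y in (hilbert_d2xy(order, d) for d in range(n * n))]

# Illustrative 4x4 example (assumption): order in which row-major indices are visited.
grid = [[4 * r + c for c in range(4)] for r in range(4)]
print(hilbert_scan(grid))
```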

DIMACS Workshop 3/23/04 35 SIR Bounds for 2x2 Channel Alternative upper bounds

DIMACS Workshop 3/23/04 36 3x3 Impulse Response Two-DOS transfer function Auxiliary one-dimensional ISI channel with memory length 4. Useful upper bound up to E_b/N_0 = 3 dB.

DIMACS Workshop 3/23/04 37 SIR Upper Bound for 3x3 Channel

DIMACS Workshop 3/23/04 38 Concluding Remarks Upper and lower bounds on the SIR of two-dimensional finite-state ISI channels were presented. Monte Carlo methods were used to compute the bounds for channels with a small impulse response support region. The bounds can be extended to multi-dimensional ISI channels. Further work is required to develop computable, tighter bounds for general multi-dimensional ISI channels.

DIMACS Workshop 3/23/04 39 References
1. D. Arnold and H.-A. Loeliger, “On the information rate of binary-input channels with memory,” IEEE International Conference on Communications, Helsinki, Finland, June 2001, vol. 9.
2. H.D. Pfister, J.B. Soriaga, and P.H. Siegel, “On the achievable information rate of finite state ISI channels,” Proc. Globecom 2001, San Antonio, TX, November 2001, vol. 5.
3. V. Sharma and S.K. Singh, “Entropy and channel capacity in the regenerative setup with applications to Markov channels,” Proc. IEEE International Symposium on Information Theory, Washington, DC, June 2001.
4. A. Kavcic, “On the capacity of Markov sources over noisy channels,” Proc. Globecom 2001, San Antonio, TX, November 2001, vol. 5.
5. D. Arnold, H.-A. Loeliger, and P.O. Vontobel, “Computation of information rates from finite-state source/channel models,” Proc. 40th Annual Allerton Conf. Commun., Control, and Computing, Monticello, IL, October 2002.

DIMACS Workshop 3/23/04 40 References
6. Y. Katznelson and B. Weiss, “Commuting measure-preserving transformations,” Israel J. Math., vol. 12, 1972.
7. D. Anastassiou and D.J. Sakrison, “Some results regarding the entropy rates of random fields,” IEEE Trans. Inform. Theory, vol. 28, no. 2, March 1982.