Slide 1: Information Rates for Two-Dimensional ISI Channels
Jiangxin Chen and Paul H. Siegel
Center for Magnetic Recording Research, University of California, San Diego
DIMACS Workshop, March 22-24, 2004

Slide 2: Outline
- Motivation: two-dimensional recording
- Channel model
- Information rates
- Bounds on the symmetric information rate (SIR): upper bound, lower bound, convergence
- Alternative upper bound
- Numerical results
- Conclusions

Slide 3: Two-Dimensional Channel Model
- Constrained input array
- Linear intersymbol interference
- Additive, i.i.d. Gaussian noise
The output is modeled as $y[i,j] = \sum_{k,l} h[k,l]\, x[i-k,\, j-l] + w[i,j]$, where $h$ is the two-dimensional impulse response and $w$ is the white Gaussian noise.
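A minimal simulation sketch of this model, assuming binary ±1 inputs; the array size, impulse response h, and noise level sigma below are illustrative placeholders, not values from the talk:

```python
import numpy as np
from scipy.signal import convolve2d

def simulate_2d_isi(x, h, sigma, rng):
    """Pass an input array x through a linear 2-D ISI channel with AWGN."""
    isi = convolve2d(x, h, mode="full")                   # linear intersymbol interference
    return isi + sigma * rng.standard_normal(isi.shape)  # additive i.i.d. Gaussian noise

rng = np.random.default_rng(0)
x = rng.choice([-1.0, 1.0], size=(64, 64))   # i.i.d. equiprobable binary inputs
h = np.array([[1.0, 0.5],
              [0.5, 0.25]])                  # hypothetical 2x2 impulse response
y = simulate_2d_isi(x, h, sigma=0.5, rng=rng)
```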

Slide 4: Two-Dimensional Processes
Input process: $X = \{x[i,j]\}$. Output process: $Y = \{y[i,j]\}$. For arrays, $Y_{(i_1,j_1)}^{(i_2,j_2)}$ denotes the subarray with upper-left corner $(i_1,j_1)$ and lower-right corner $(i_2,j_2)$.

Slide 5: Entropy Rates
Output entropy rate: $\mathcal{H}(Y) = \lim_{m,n\to\infty} \frac{1}{mn} H\big(Y_{(1,1)}^{(m,n)}\big)$.
Noise entropy rate: $\mathcal{H}(W) = \frac{1}{2}\log_2(2\pi e \sigma^2)$, the differential entropy rate of the i.i.d. Gaussian noise.
Conditional entropy rate: $\mathcal{H}(Y \mid X) = \mathcal{H}(W)$, since the noise is additive.

Slide 6: Mutual Information Rates
Mutual information rate: $\mathcal{I}(X;Y) = \mathcal{H}(Y) - \mathcal{H}(Y \mid X) = \mathcal{H}(Y) - \mathcal{H}(W)$.
Capacity: $C = \sup \mathcal{I}(X;Y)$, taken over the allowed input distributions.
Symmetric information rate (SIR): $\mathcal{I}(X;Y)$ when the inputs are constrained to be independent, identically distributed, and equiprobable binary.

Slide 7: Capacity and SIR
The capacity and SIR are useful measures of the achievable storage densities on the two-dimensional channel. They serve as performance benchmarks for channel coding and detection methods. So it would be valuable to be able to compute them.

Slide 8: Finding the Output Entropy Rate
For the one-dimensional ISI channel model $y_i = \sum_{k=0}^{\nu} h_k x_{i-k} + w_i$, the output entropy rate is
$\mathcal{H}(Y) = \lim_{n\to\infty} \frac{1}{n} H(Y_1^n) = \lim_{n\to\infty} -\frac{1}{n} E\big[\log_2 P(Y_1^n)\big]$,
where $Y_1^n = (y_1, \ldots, y_n)$.

Slide 9: Sample Entropy Rate
If we simulate the channel $N$ times, using inputs with specified (Markovian) statistics, and generate output realizations $y^{(1)}, \ldots, y^{(N)}$, then the sample entropy rate
$\frac{1}{N} \sum_{t=1}^{N} -\frac{1}{n} \log_2 P\big(y^{(t)}\big)$
converges to $\frac{1}{n} H(Y_1^n)$ with probability 1 as $N \to \infty$.

Slide 10: Computing Sample Entropy Rate
The forward recursion of the sum-product (BCJR) algorithm can be used to calculate the probability of a sample realization of the channel output. In fact, we can write
$P(y_1^n) = \prod_{i=1}^{n} \lambda_i$,
where the quantity $\lambda_i = P(y_i \mid y_1^{i-1})$ is precisely the normalization constant in the (normalized) forward recursion.
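A minimal sketch of that recursion for a one-dimensional binary-input ISI channel (my own illustration, not the authors' code), assuming i.i.d. equiprobable ±1 inputs and Gaussian noise with known variance; the normalization constants $\lambda_i$ accumulate directly into the sample entropy rate:

```python
import itertools
import numpy as np

def neg_log2_prob(y, h, sigma):
    """Sample entropy rate -(1/n) log2 P(y) via the normalized BCJR forward pass."""
    nu = len(h) - 1                                           # channel memory
    states = list(itertools.product([-1.0, 1.0], repeat=nu))  # last nu inputs
    alpha = {s: 1.0 / len(states) for s in states}            # uniform initial state
    gauss_const = 1.0 / (np.sqrt(2.0 * np.pi) * sigma)
    log2_p = 0.0
    for yi in y:
        new_alpha = dict.fromkeys(states, 0.0)
        for s, a in alpha.items():
            for x in (-1.0, 1.0):                             # equiprobable input, P(x) = 1/2
                mean = h[0] * x + sum(hk * sk for hk, sk in zip(h[1:], s))
                like = gauss_const * np.exp(-(yi - mean) ** 2 / (2.0 * sigma**2))
                new_alpha[(x,) + s[:-1]] += 0.5 * a * like    # shift new input into the state
        lam = sum(new_alpha.values())                         # normalization constant lambda_i
        log2_p += np.log2(lam)                                # accumulate log2 P(y_i | y_1^{i-1})
        alpha = {s: a / lam for s, a in new_alpha.items()}
    return -log2_p / len(y)
```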

Slide 11: Computing Entropy Rates
The Shannon-McMillan-Breiman theorem implies
$-\frac{1}{n} \log_2 P(y_1^n) \to \mathcal{H}(Y)$ as $n \to \infty$,
where $y_1^n$ is a single long sample realization of the channel output process.
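By the theorem, a single long run suffices. A hypothetical usage of the neg_log2_prob sketch above for the dicode channel h = (1, −1); the noise level is illustrative, and the small boundary mismatch at the start of the sequence is ignored:

```python
h = [1.0, -1.0]                                # dicode channel
sigma = 0.5
rng = np.random.default_rng(1)
x = rng.choice([-1.0, 1.0], size=200_000)      # i.i.d. equiprobable inputs
y = np.convolve(x, h)[: len(x)] + sigma * rng.standard_normal(len(x))
H_Y = neg_log2_prob(y, h, sigma)               # -> H(Y) as n grows (SMB theorem)
```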

Slide 12: SIR for Partial-Response Channels
(Figure: SIR curves for one-dimensional partial-response channels.)

Slide 13: Capacity Bounds for Dicode
(Figure: capacity bounds for the one-dimensional dicode channel.)

Slide 14: Markovian Sufficiency
Remark: It can be shown that optimized Markovian input processes, whose states are determined by their previous r symbols, asymptotically achieve the capacity of finite-state intersymbol interference channels with AWGN as the order r of the input process approaches infinity. (J. Chen and P.H. Siegel, ISIT 2004)

Slide 15: Capacity and SIR in Two Dimensions
In two dimensions, we could estimate $\mathcal{H}(Y)$ by calculating the sample entropy rate of a very large simulated output array. However, there is no counterpart of the BCJR algorithm in two dimensions to simplify the calculation. Instead, we use conditional entropies to derive upper and lower bounds on $\mathcal{H}(Y)$.

Slide 16: Array Ordering
Permuted lexicographic ordering: choose a vector $k = (k_1, k_2)$, a permutation of $(1, 2)$. Map each array index $(i_1, i_2)$ to $(i_{k_1}, i_{k_2})$. Then $(i_1, i_2)$ precedes $(j_1, j_2)$ if $i_{k_1} < j_{k_1}$, or $i_{k_1} = j_{k_1}$ and $i_{k_2} < j_{k_2}$. Therefore $k = (1, 2)$ gives row-by-row ordering, and $k = (2, 1)$ gives column-by-column ordering.
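A small sketch of this ordering in code (my own illustration, using 0-based axes instead of the slide's 1-based subscripts):

```python
def precedes(i, j, k=(0, 1)):
    """True if 2-D index i precedes j under permuted lexicographic order k."""
    return (i[k[0]], i[k[1]]) < (j[k[0]], j[k[1]])

# k = (0, 1): row-by-row ordering; k = (1, 0): column-by-column ordering.
assert precedes((0, 5), (1, 0), k=(0, 1))   # row index dominates (left-to-right, top-down)
assert precedes((5, 0), (0, 1), k=(1, 0))   # column index dominates
```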

Slide 17: Two-Dimensional "Past"
Let $l = (l_1, l_2)$ be a non-negative vector. Define $\mathrm{Past}_l\{y[i,j]\}$ to be the elements preceding $y[i,j]$ (under the ordering with permutation $k$) inside the finite region around $[i,j]$ whose extent is determined by $l$.

Slide 18: Examples of Past{Y[i,j]}
(Figure: example conditioning regions $\mathrm{Past}_l\{y[i,j]\}$ for different orderings and window sizes.)

Slide 19: Conditional Entropies
For a stationary two-dimensional random field $Y$ on the integer lattice, the entropy rate satisfies
$\mathcal{H}(Y) = \lim_{l \to \infty} H\big(y[i,j] \mid \mathrm{Past}_l\{y[i,j]\}\big)$.
(The proof uses the entropy chain rule; see [5-6].) This extends to random fields on the hexagonal lattice, via the natural mapping to the integer lattice.
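A sketch of the chain-rule step behind this identity (my paraphrase of the cited argument, not the slide's own derivation), under a fixed ordering:

```latex
\frac{1}{mn}\, H\big(Y_{(1,1)}^{(m,n)}\big)
  \;=\; \frac{1}{mn} \sum_{(i,j)} H\big(y[i,j] \,\big|\, \text{entries preceding } (i,j)\big)
  \;\xrightarrow{\;m,n\to\infty\;}\; \lim_{l\to\infty} H\big(y[i,j] \,\big|\, \mathrm{Past}_l\{y[i,j]\}\big),
```

since, by stationarity, all but a vanishing fraction of the terms condition on essentially the same growing past.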

Slide 20: Upper Bound on H(Y)
For a stationary two-dimensional random field $Y$,
$\mathcal{H}(Y) \le H\big(y[i,j] \mid \mathrm{Past}_l\{y[i,j]\}\big)$ for any finite $l$,
where $\mathrm{Past}_l\{y[i,j]\}$ is the finite conditioning window defined on the previous slide.

Slide 21: Two-Dimensional Boundary of Past{Y[i,j]}
Define $\partial\mathrm{Past}_l\{y[i,j]\}$ to be the boundary of $\mathrm{Past}_l\{y[i,j]\}$. The exact expression for the boundary is messy, but the geometric concept is simple.

Slide 22: Two-Dimensional Boundary of Past{Y[i,j]}
(Figure: the boundary region $\partial\mathrm{Past}_l\{y[i,j]\}$.)

Slide 23: Lower Bound on H(Y)
For a stationary two-dimensional hidden Markov field $Y$,
$\mathcal{H}(Y) \ge H\big(y[i,j] \mid \mathrm{Past}_l\{y[i,j]\},\, S_l\big)$,
where $S_l$ is the "state information" for the strip containing the conditioning window.

Slide 24: Sketch of Proof
Upper bound: note that $\mathrm{Past}_l\{y[i,j]\}$ is a subset of the full past, and that conditioning reduces entropy.
Lower bound: use the Markov property of $Y$, given the "state information" $S_l$.

Slide 25: Convergence Properties
The upper bound on the entropy rate is monotonically non-increasing as the size of the array defined by $l$ increases. The lower bound on the entropy rate is monotonically non-decreasing as the size of the array defined by $l$ increases.

Slide 26: Convergence Rate
The upper bound and lower bound converge to the true entropy rate at least as fast as $O(1/l_{\min})$, where $l_{\min} = \min(l_1, l_2)$ is the smallest dimension of the conditioning window.

Slide 27: Computing the SIR Bounds
Estimate the two-dimensional conditional entropies over a small array: calculate the joint probabilities of the relevant regions to get sample conditional entropies for many realizations of the output array (see the estimator sketched below). For column-by-column ordering, treat each row as a single variable and calculate the joint probability row-by-row using the BCJR forward recursion.
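One way to make this estimate concrete (my paraphrase, using the identity $H(A \mid B) = H(A,B) - H(B)$): average two joint sample entropies over $N$ simulated output arrays,

```latex
\widehat{H}\big(y[i,j] \mid \mathrm{Past}_l\{y[i,j]\}\big)
  = \frac{1}{N} \sum_{t=1}^{N}
    \Big[ -\log_2 P\big(\mathrm{Past}_l^{(t)},\, y^{(t)}[i,j]\big)
          + \log_2 P\big(\mathrm{Past}_l^{(t)}\big) \Big],
```

where each joint probability is evaluated with the row-by-row BCJR forward recursion described above.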

Slide 28: 2x2 Impulse Response
"Worst-case" scenario with large ISI. Conditional entropies were computed from 100,000 realizations. The upper and lower bounds correspond to the element in the middle of the last column of the array.

Slide 29: Two-Dimensional "State"
(Figure: the two-dimensional state structure used in the row-by-row recursion.)

Slide 30: SIR Bounds for 2x2 Channel
(Figure: upper and lower SIR bounds for the 2x2 channel.)

Slide 31: Computing the SIR Bounds
The number of states for each variable increases exponentially with the number of columns in the array, so this approach requires that the two-dimensional impulse response have a small support region. It is desirable to find other approaches that reduce the complexity, perhaps at the cost of weakening the resulting bounds.

Slide 32: Alternative Upper Bound
The modified BCJR approach is limited to a small impulse response support region. Instead, introduce an "auxiliary ISI channel" and bound
$\mathcal{H}(Y) \le \lim_{n\to\infty} -\frac{1}{n} E\big[\log_2 q(Y_1^n)\big]$,
where the expectation is taken under the true output distribution $P$ and $q$ is an arbitrary conditional probability distribution.
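This is the standard auxiliary-channel upper bound; it follows in one line from the nonnegativity of relative entropy:

```latex
-\,E_P\big[\log_2 q(Y_1^n)\big]
  \;=\; -\,E_P\big[\log_2 P(Y_1^n)\big] + D\big(P \,\Vert\, q\big)
  \;=\; H(Y_1^n) + D\big(P \,\Vert\, q\big)
  \;\ge\; H(Y_1^n),
```

with equality if and only if $q = P$, so any tractable auxiliary distribution $q$ yields a valid upper bound.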

Slide 33: Choosing the Auxiliary Channel
Assume $q$ is the conditional probability distribution of the output from an auxiliary ISI channel.
- A one-dimensional auxiliary channel permits a calculation based upon a larger number of columns in the output array.
- Conversion of the two-dimensional array into a one-dimensional sequence should "preserve" the statistical properties of the array.
- Pseudo-Peano-Hilbert space-filling curves can be used to convert a rectangular array to a sequence.

Slide 34: Pseudo-Peano-Hilbert Curve
(Figure: a pseudo-Peano-Hilbert space-filling curve traversing a rectangular array.)
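A minimal sketch of such a traversal, assuming a standard Hilbert curve on a 2^k x 2^k array (the talk's "pseudo" variant for general rectangles would need extra handling, which is omitted here):

```python
def hilbert_d2xy(order, d):
    """Map distance d along a Hilbert curve filling a 2^order x 2^order grid to (x, y)."""
    x = y = 0
    s, t = 1, d
    while s < (1 << order):
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                          # rotate the sub-quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def hilbert_scan(arr):
    """Read a 2^k x 2^k array into a 1-D sequence along the Hilbert curve."""
    n = len(arr)
    order = n.bit_length() - 1               # assumes n is a power of two
    return [arr[y][x] for x, y in (hilbert_d2xy(order, d) for d in range(n * n))]
```

Consecutive elements of the scan are adjacent in the array, which is what "preserving" the two-dimensional statistics asks of the conversion.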

Slide 35: SIR Bounds for 2x2 Channel
(Figure: SIR bounds for the 2x2 channel, with the alternative upper bounds indicated on the plot.)

Slide 36: 3x3 Impulse Response
- TwoDOS (two-dimensional optical storage) transfer function.
- Auxiliary one-dimensional ISI channel with memory length 4.
- Useful upper bound up to $E_b/N_0 = 3$ dB.

Slide 37: SIR Upper Bound for 3x3 Channel
(Figure: SIR upper bound for the 3x3 channel.)

Slide 38: Concluding Remarks
- Upper and lower bounds on the SIR of two-dimensional finite-state ISI channels were presented.
- Monte Carlo methods were used to compute the bounds for channels with a small impulse response support region.
- The bounds can be extended to multi-dimensional ISI channels.
- Further work is required to develop computable, tighter bounds for general multi-dimensional ISI channels.

Slide 39: References
1. D. Arnold and H.-A. Loeliger, "On the information rate of binary-input channels with memory," Proc. IEEE International Conference on Communications, Helsinki, Finland, June 2001, vol. 9, pp. 2692-2695.
2. H.D. Pfister, J.B. Soriaga, and P.H. Siegel, "On the achievable information rate of finite state ISI channels," Proc. Globecom 2001, San Antonio, TX, November 2001, vol. 5, pp. 2992-2996.
3. V. Sharma and S.K. Singh, "Entropy and channel capacity in the regenerative setup with applications to Markov channels," Proc. IEEE International Symposium on Information Theory, Washington, DC, June 2001, p. 283.
4. A. Kavcic, "On the capacity of Markov sources over noisy channels," Proc. Globecom 2001, San Antonio, TX, November 2001, vol. 5, pp. 2997-3001.
5. D. Arnold, H.-A. Loeliger, and P.O. Vontobel, "Computation of information rates from finite-state source/channel models," Proc. 40th Annual Allerton Conf. Commun., Control, and Computing, Monticello, IL, October 2002, pp. 457-466.

Slide 40: References (continued)
6. Y. Katznelson and B. Weiss, "Commuting measure-preserving transformations," Israel J. Math., vol. 12, pp. 161-173, 1972.
7. D. Anastassiou and D.J. Sakrison, "Some results regarding the entropy rates of random fields," IEEE Trans. Inform. Theory, vol. 28, no. 2, pp. 340-343, March 1982.

