10/15/01 1 The Continuing Miracle of Information Storage Technology Paul H. Siegel Director, CMRR University of California, San Diego

Shannon Symposium 10/15/01 2 Outline
- The Shannon Statue
- A Miraculous Technology
- Information Theory and Information Storage
- A Tale of Two Capacities
- Conclusion

Shannon Symposium 10/15/01 3 Claude E. Shannon

Shannon Symposium 10/15/01 4 Acknowledgments
For the statue, from conception to realization:
- IEEE Information Theory Society
- Prof. Dave Neuhoff (University of Michigan)
- Eugene Daub, Sculptor
For bringing it to CMRR:
- Prof. Jack K. Wolf

Shannon Symposium 10/15/01 5 How Jack Did It
Six casts of the statue. Spoken for:
1. Shannon Park, Gaylord, Michigan
2. The University of Michigan
3. Lucent Technologies - Bell Labs
4. AT&T Research Labs
5. MIT
Jack's idea: "6. CMRR"

Shannon Symposium 10/15/01 6 The Inscription

Shannon Symposium 10/15/01 7 Data Storage and Transmission A data transmission system communicates information through space, i.e., “from here to there.” A data storage system communicates information through time, i.e., “from now to then.” [Berlekamp, 1980]

Shannon Symposium 10/15/01 8 Figure 1 (for Magnetic Recording)
[Shannon's Figure 1 block diagram: Information Source -> Transmitter -> Signal -> Received Signal -> Receiver -> Destination, with a Noise Source acting on the signal; here the noisy channel is the recording channel (head/disk).]
The recording channel is modeled as a binary-input inter-symbol interference (ISI) channel with additive Gaussian noise.

Shannon Symposium 10/15/01 9 A Miraculous Technology
- Areal Density Perspective: 45 Years of Progress
- Average Price of Storage

Shannon Symposium 10/15/01 10 Areal Density Perspective

Shannon Symposium 10/15/01 11 Average Price of Storage

Shannon Symposium 10/15/01 12 The Formula on the "Paper"
Capacity of a discrete channel with noise [Shannon, 1948]: C = max (H(x) - H_y(x)).
For a noiseless channel, H_y(x) = 0, so C = max H(x).
Gaylord, MI: C = W log (P+N)/N
Bell Labs: no formula on the paper ("H = - p log p - q log q" on the plaque)
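As a quick numeric illustration, the two formulas on the statue can be evaluated directly; the bandwidth and power values below are made-up examples, not numbers from the talk.

```python
import math

def awgn_capacity(W, P, N):
    """Shannon's formula C = W log2((P + N) / N) for a bandlimited
    channel with signal power P and noise power N (bits per second)."""
    return W * math.log2((P + N) / N)

def binary_entropy(p):
    """H = -p log2 p - q log2 q, the formula on the Bell Labs plaque."""
    q = 1.0 - p
    return -p * math.log2(p) - q * math.log2(q)

# Illustrative values only: 3 kHz of bandwidth at a 1000:1 SNR.
print(f"C = {awgn_capacity(W=3000, P=1000, N=1):.0f} bits/s")
print(f"H(0.5) = {binary_entropy(0.5)} bit")  # entropy peaks at p = 1/2
```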

Shannon Symposium 10/15/01 13 Discrete Noiseless Channels (Constrained Systems)
A constrained system S is the set of sequences generated by walks on a labeled, directed graph G.
Example: the telegraph channel constraints [Shannon, 1948], with symbols DOT, DASH, LETTER SPACE, and WORD SPACE.

Shannon Symposium 10/15/01 14 Magnetic Recording Constraints
Runlength constraints ("finite-type": determined by a finite list F of forbidden words), e.g., the constraint with forbidden words F = {101, 010}.
Spectral null constraints ("almost-finite-type"), e.g., the biphase and even constraints.

Shannon Symposium 10/15/01 15 Practical Constrained Codes
Finite-state encoder (from binary data into S): m data bits in, n code bits out, implemented by encoder logic with states. Rate m:n.
Sliding-block decoder (inverse mapping from S to data): n code bits in, m data bits out.
We want: high rate R = m/n and low complexity.
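As a toy illustration of such an encoder/decoder pair, here is a minimal rate-1:2 block code sketch; the mapping 0 -> 01, 1 -> 10 is an assumed biphase-like example, not the specific code from the slide.

```python
# Assumed illustrative mapping (biphase-like), rate 1:2.
ENC = {0: "01", 1: "10"}

def encode(bits):
    """Map each data bit to a 2-bit codeword block."""
    return "".join(ENC[b] for b in bits)

def decode(codeword):
    """Sliding-block decoder with a one-block window: each 2-bit block
    decodes independently, so a channel error corrupts at most one
    data bit (no error propagation)."""
    return [0 if codeword[i:i + 2] == "01" else 1
            for i in range(0, len(codeword), 2)]

data = [1, 0, 0, 1]
cw = encode(data)
print(cw)                  # 10010110
print(decode(cw) == data)  # True
```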

Shannon Symposium 10/15/01 16 Codes and Capacity
How high can the code rate be? Shannon defined the capacity of the constrained system S as
C = lim_{n → ∞} (1/n) log_2 N(S,n),
where N(S,n) is the number of sequences in S of length n.
Theorem [Shannon, 1948]: If there exists a decodable code at rate R = m/n from binary data to S, then R ≤ C.
Theorem [Shannon, 1948]: For any rate R = m/n < C there exists a block code from binary data to S with rate km:kn, for some integer k ≥ 1.
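The definition can be checked by brute force: count N(S, n) for small n and watch (1/n) log2 N(S, n) decrease toward the capacity. The forbidden list below is the one quoted later in the talk, whose capacity is 0.6942...; the script itself is just an illustrative sketch.

```python
import math
from itertools import product

def count_constrained(n, forbidden):
    """N(S, n): the number of length-n binary strings containing no
    word from the forbidden list (brute force; fine for small n)."""
    total = 0
    for bits in product("01", repeat=n):
        s = "".join(bits)
        if not any(f in s for f in forbidden):
            total += 1
    return total

F = ["101", "010"]  # capacity of this constraint is 0.6942...
for n in (4, 8, 12, 16):
    N = count_constrained(n, F)
    print(n, N, round(math.log2(N) / n, 4))
```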

Shannon Symposium 10/15/01 17 Computing Capacity: Adjacency Matrices
Let A = A(G) be the adjacency matrix of the graph G representing S. The entries of A^n count the paths in G of length n.

Shannon Symposium 10/15/01 18 Computing Capacity (cont.)
Shannon showed that, for suitable representing graphs G,
C = log_2 λ(A),
where λ(A) is the spectral radius of the matrix A.
Assigning "transition probabilities" to the edges of G, the constrained system S becomes a Markov source x, with entropy H(x). Shannon proved that the maximum of H(x) over such sources equals C, and he expressed the maximizing probabilities in terms of the spectral radius and corresponding eigenvector of A.
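As a concrete sketch (with my own choice of state encoding), the constraint with forbidden words F = {101, 010} has a 4-state graph whose states are the last two channel bits; power iteration on its adjacency matrix recovers C = log2 of the golden ratio, about 0.6942.

```python
import math

A = [  # state order: 00, 01, 10, 11
    [1, 1, 0, 0],  # 00 -> 00, 01
    [0, 0, 0, 1],  # 01 -> 11 only (01 -> 10 would spell 010)
    [1, 0, 0, 0],  # 10 -> 00 only (10 -> 01 would spell 101)
    [0, 0, 1, 1],  # 11 -> 10, 11
]

def spectral_radius(M, iters=200):
    """Estimate the spectral radius of a nonnegative matrix by power
    iteration (the dominant eigenvalue is real by Perron-Frobenius)."""
    v = [1.0] * len(M)
    lam = 1.0
    for _ in range(iters):
        w = [sum(M[i][j] * v[j] for j in range(len(M))) for i in range(len(M))]
        lam = max(w)
        v = [x / lam for x in w]
    return lam

lam = spectral_radius(A)
print(round(lam, 6))             # 1.618034, the golden ratio
print(round(math.log2(lam), 4))  # 0.6942
```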

Shannon Symposium 10/15/01 19 Constrained Coding Theorems
Stronger coding theorems were motivated by the problem of constrained code design for magnetic recording.
Theorem [Adler-Coppersmith-Hassner, 1983]: Let S be a finite-type constrained system. If m/n ≤ C, then there exists a rate m:n sliding-block decodable, finite-state encoder. (The proof is constructive: the state-splitting algorithm.)
Theorem [Karabed-Marcus, 1988]: Ditto if S is almost-finite-type. (The proof is not so constructive...)

Shannon Symposium 10/15/01 20 Distance-Enhancing Codes for Partial Response Channels
Since 1990, disk drives have used a technique called partial-response equalization with maximum-likelihood detection, or PRML. In the late 1990s, extensions of PRML, denoted EPRML and EEPRML, were introduced.
The performance of such PRML systems can be improved by using codes with "distance-enhancing" constraints. These constraints are described by a finite set D of "forbidden differences," corresponding to differences of channel input sequences whose outputs are most likely to produce detection errors.

Shannon Symposium 10/15/01 21 Codes that Avoid Specified Differences
The difference of length-n binary words u and v is the word u - v, computed componentwise, with symbols in {+1, 0, -1} (written +, 0, -).
A length-n code avoids D if no difference of codewords contains any string in D.
Example: a length-2 code. [Table of codewords and their differences]
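A small sketch of these definitions in code, using a hypothetical one-element difference set (the slide's actual D and example table are not reproduced here):

```python
from itertools import combinations

def diff(u, v):
    """Componentwise difference of two binary words, written over the
    alphabet {+, 0, -}."""
    sym = {1: "+", 0: "0", -1: "-"}
    return "".join(sym[a - b] for a, b in zip(u, v))

def avoids(code, D):
    """True iff no difference of two distinct codewords contains any
    string in D as a substring (both u - v and v - u are checked)."""
    for u, v in combinations(code, 2):
        if any(d in w for w in (diff(u, v), diff(v, u)) for d in D):
            return False
    return True

D = ["+-+"]  # hypothetical forbidden difference pattern
print(avoids([(0, 0, 0), (1, 1, 1)], D))  # True: differences are +++ and ---
print(avoids([(0, 1, 0), (1, 0, 1)], D))  # False: (1,0,1) - (0,1,0) = +-+
```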

Shannon Symposium 10/15/01 22 Capacity of Difference Set D [Moision-Orlitsky-Siegel]
How high can the rate be for a code avoiding D? Define the capacity of the difference set D as
cap(D) = lim_{n → ∞} (1/n) log_2 M(D,n),
where M(D,n) is the maximum number of codewords in a (block) code of length n that avoids D.
Problem: Determine cap(D) and find codes that achieve it.

Shannon Symposium 10/15/01 23 Computing cap(D): Adjacency Matrices
Associate to D a set of graphs and a corresponding set A of adjacency matrices reflecting disallowed pairs of code patterns. Consider the set of n-fold products of matrices in A: each product corresponds (roughly) to a code avoiding D.

Shannon Symposium 10/15/01 24 Generalized Spectral Radius
For a finite set of matrices A, let ρ_n(A) denote the largest spectral radius of an n-fold product of matrices in A (so ρ_1(A) is the largest spectral radius of a single matrix in A). The generalized spectral radius of A is defined as
ρ*(A) = limsup_{n → ∞} ρ_n(A)^{1/n}.
[Daubechies-Lagarias, 1992], cf. [Rota-Strang, 1960]
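A brute-force sketch of the standard two-sided bounds, max_P ρ(P)^{1/n} ≤ ρ*(A) ≤ max_P ||P||^{1/n} over all products P of n matrices from the set, on an assumed toy pair of 2x2 matrices (the talk's actual difference-set computations rely on a constructive bounding algorithm, not this enumeration):

```python
import math
from itertools import product

def matmul(X, Y):
    """Multiply two 2x2 matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def rho2(M):
    """Exact spectral radius of a 2x2 matrix via the eigenvalue formula."""
    tr = M[0][0] + M[1][1]
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    disc = tr * tr - 4 * det
    if disc >= 0:
        s = math.sqrt(disc)
        return max(abs((tr + s) / 2), abs((tr - s) / 2))
    return math.sqrt(det)  # complex conjugate pair: |lambda| = sqrt(det)

def norm_inf(M):
    """Max-row-sum norm, which upper-bounds the spectral radius."""
    return max(abs(M[0][0]) + abs(M[0][1]), abs(M[1][0]) + abs(M[1][1]))

def gsr_bounds(mats, n):
    """Lower/upper bounds on the generalized spectral radius from all
    products of length n."""
    lo = hi = 0.0
    for choice in product(mats, repeat=n):
        P = choice[0]
        for M in choice[1:]:
            P = matmul(P, M)
        lo = max(lo, rho2(P) ** (1.0 / n))
        hi = max(hi, norm_inf(P) ** (1.0 / n))
    return lo, hi

# Toy pair: for these two matrices the bounds pinch toward the golden
# ratio as n grows.
mats = ([[1, 1], [0, 1]], [[1, 0], [1, 1]])
for n in (1, 2, 4, 8):
    lo, hi = gsr_bounds(mats, n)
    print(n, round(lo, 4), round(hi, 4))
```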

Shannon Symposium 10/15/01 25 Computing cap(D) (cont.)
Theorem [Moision-Orlitsky-Siegel, 2001]: For any finite difference set D,
cap(D) = log_2 ρ*(A),
where ρ*(A) is the generalized spectral radius of the associated set A of adjacency matrices.
(Recall the analogous formula C = log_2 λ(A) for the capacity of a constrained system S.)
Computing cap(D) can be difficult, but a constructive bounding algorithm has yielded good results.

Shannon Symposium 10/15/01 26 A Real Example: EEPRML
Codes can improve EEPRML performance by avoiding a particular difference set D. Codes satisfying the following constraints avoid D:
- F = {101, 010}: C = 0.6942...
- F = {101}: C = 0.8113...
- F = {0101, 1010}: C = 0.8791... (MTR [Moon, 1996])
What is cap(D), and are there other simple constraints with higher capacity that avoid D?
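The three capacities quoted here can be reproduced with a short script: build the finite-state graph for a forbidden-word list F (states = the last max|f| - 1 bits) and take log2 of the spectral radius of its adjacency matrix. This is a standard construction written as an illustrative sketch, not code from the talk.

```python
import math
from itertools import product

def capacity(F, iters=500):
    """log2 of the spectral radius of the state graph of the binary
    constrained system with forbidden-word list F."""
    m = max(len(f) for f in F) - 1
    ok = lambda s: not any(f in s for f in F)
    states = ["".join(p) for p in product("01", repeat=m) if ok("".join(p))]
    idx = {s: i for i, s in enumerate(states)}
    A = [[0] * len(states) for _ in states]
    for s in states:
        for b in "01":
            t = (s + b)[1:]
            if ok(s + b) and t in idx:
                A[idx[s]][idx[t]] = 1
    # power iteration for the dominant eigenvalue
    v = [1.0] * len(states)
    lam = 1.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(v))]
        lam = max(w)
        v = [x / lam for x in w]
    return math.log2(lam)

print(capacity(["101", "010"]))    # ~0.6942 (log2 of the golden ratio)
print(capacity(["101"]))           # ~0.81137, the slide's 0.8113...
print(capacity(["0101", "1010"]))  # ~0.8791 (log2 of the tribonacci constant)
```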

Shannon Symposium 10/15/01 27 EEPRML Example (cont.)
For this D, numerical lower and upper bounds on cap(D) were computed. The lower bound, conjectured to be exactly cap(D), is achieved by the "time-varying MTR (TMTR)" constraint, with a finite periodic forbidden list.
A rate 8/9 TMTR code has been used in commercial disk drives [Bliss; Wood; Karabed-Siegel-Soljanin].

Shannon Symposium 10/15/01 28 Periodic Finite-Type Constraints
The TMTR (and biphase) constraints represent a new class of constraints, called "periodic finite-type," characterized by a finite set of periodically forbidden words. [Moision-Siegel, ISIT 2001]
[Diagram: nested constraint classes - runlength, finite-type, periodic finite-type, almost-finite-type - with biphase, TMTR, and even as examples.]

Shannon Symposium 10/15/01 29 Other Storage-Related Research
Page-oriented storage technologies, such as holographic memories, require codes generating arrays of bits with two-dimensional constraints. This is a very active area of research. [Wolf, Shannon Lecture, ISIT 2001]
There has been recent progress on computing the capacity C = max (H(x) - H_y(x)) of noisy magnetic recording (ISI) channels. [Arnold-Loeliger, Arnold-Vontobel, Pfister-Soriaga-Siegel, Kavcic, 2001]

Shannon Symposium 10/15/01 30 Conclusion The work of Claude Shannon has been a key element in the “miraculous” progress of modern information storage technologies. In return, the ongoing demand for data storage devices with larger density and higher data transfer rates has “miraculously” continued to inspire new concepts and results in information theory.