National Science Foundation / Science & Technology Centers Program
Center for Science of Information

Frontiers of Science of Information
Wojciech Szpankowski, IS4IS Summit, Vienna, 2015

Partner institutions: Bryn Mawr, Howard, MIT, Princeton, Purdue, Stanford, Texas A&M, UC Berkeley, UC San Diego, UIUC
Outline
1. Science of Information & Center Mission
2. Post-Shannon Challenges
3. Center Team
4. Research
   - Fundamentals: Structural Information
   - Data Science: Data Models & Inference
   - Life Sciences: Protein Folding
The Information Revolution started in 1948 with the publication of Claude Shannon's A Mathematical Theory of Communication; the digital age began.
Shannon information quantifies the extent to which a recipient of data can reduce its statistical uncertainty. ("Semantic aspects of communication are irrelevant...")
Objective: reliably reproducing data - fundamental limits for storage and communication.
Application enablers/drivers: CD, iPod, DVD, video games, Internet, Facebook, WiFi, mobile, Google, ...
Design drivers: universal data compression, voiceband modems, CDMA, multi-antenna systems, discrete denoising, space-time codes, cryptography, distortion-theory approaches to Big Data.
Claude Shannon laid the foundation of information theory, demonstrating that problems of data transmission and compression can be precisely modeled, formulated, and analyzed.
SCIENCE OF INFORMATION builds on Shannon's principles to address key challenges in understanding information that nowadays is not only communicated but also acquired, curated, organized, aggregated, managed, processed, suitably abstracted and represented, analyzed, inferred, valued, secured, and used in various scientific, engineering, and socio-economic processes.
Georg Cantor (1845-1918): "In re mathematica ars proponendi questionem pluris facienda est quam solvendi." (In mathematics the art of proposing a question must be held of higher value than solving it.)
CSoI MISSION: Advance science and technology through a new quantitative understanding of the representation, communication, and processing of information in biological, physical, social, and engineering systems.
Extend information theory to meet new challenges in biology, economics, data and social sciences, and distributed systems. Understand new aspects of information embedded in structure, time, space, and semantics, as well as dynamic information, limited resources, complexity, representation-invariant information, and cooperation and dependency.
Outline
1. Science of Information & Center Mission
2. Post-Shannon Challenges
3. Center Team
4. Research
   - Fundamentals: Structural Information
   - Data Science: Data Models & Inference
   - Life Sciences: Protein Folding
Structure: measures are needed for quantifying information embodied in structures (e.g., information in material structures, nanostructures, biomolecules, gene regulatory networks, protein networks, social networks, financial transactions) [F. Brooks, JACM, 2003].
- Szpankowski, Choi, Magner: information contained in unlabeled graphs and universal graphical compression.
- Grama & Subramaniam: quantifying the role of noise and incomplete data, identifying conserved structures, finding orthologies in biological network reconstruction.
- Neville: outlining characteristics (e.g., weak dependence) sufficient for network models to be well defined in the limit.
- Yu & Qi: finding distributions of latent structures in social networks.
- Szpankowski, Baryshnikov, & Duda: structure of Markov fields and optimal compression.
- Szpankowski, Wolpert: network-constrained optimization (NCO).
Time: classical information theory is at its weakest in dealing with problems of delay (e.g., information arriving late may be useless or have less value).
- Verdu, Polyanskiy, Kostina: major breakthrough extending Shannon's capacity theorem to finite-blocklength information theory for lossy data compression.
- Kumar: designing reliable scheduling policies with delay constraints for wireless networks; a new axiomatic approach to secure wireless networks.
- Weissman: a real-time coding system with lookahead to investigate the impact of delay on expected distortion and bit rate.
- Subramaniam: reconstructing networks from dynamic biological data.
Representation invariance: how do we know whether two representations of the same information are information-equivalent?
Limited resources: in many scenarios, information is limited by available computational resources (e.g., a cell phone, a living cell).
- Bialek works on the structure of molecular networks that optimize information flow, subject to constraints on the total number of molecules being used.
- Verdu and Goldsmith investigate the minimum energy per bit as a function of data length in Gaussian channels.
Semantics: is there a way to account for the meaning or semantics of information?
- Sudan argues that the meaning of information becomes relevant whenever there is diversity across communicating parties and when the parties themselves evolve over time. New collaboration between Sudan and Tse on human communication.
Learnable information (Big Data): data-driven science focuses on extracting information from data. How much information can actually be extracted from a given data repository? Is there an information theory of Big Data?
The big-data domain exhibits certain properties:
- Large (peta- and exa-scale)
- Noisy (high rate of false positives and negatives)
- Multiscale (interactions at different levels of abstraction)
- Dynamic (temporal and spatial changes)
- Heterogeneous (high variability over space and time)
- Distributed (collected and stored at distributed locations)
- Elastic (flexibility in data models and clustering capabilities)
- Complex dependencies (structural, long-term)
- High-dimensional
Ad hoc solutions do not work at scale! "Big data has arrived but big insights have not." (Financial Times, J. Hartford)
Cooperation & dependency: how does cooperation impact information (nodes should cooperate in their own self-interest)?
- Cover initiated a theory of cooperation and coordination in networks: studying the achievable joint distribution among network nodes, provided the communication rates are given.
- Dependency and rational expectations are critical ingredients in Sims' work on modern dynamic economic theory.
- Coleman is studying statistical causality in neural systems using Granger principles.
Quantum information: the flow of information in macroscopic and microscopic systems may possess different characteristics. Aaronson and Shor lead these investigations; Aaronson develops the computational complexity of linear systems.
Economic systems share many features with communication networks: acquiring, storing, sharing, and processing information. Yet one crucial feature that distinguishes economic networks from reliable communication is the value of information. A major challenge in economics is to formalize the notion of information value for dynamic settings involving delay constraints.
Value of information: for a lattice (L, <=) with join v and meet ^, define a value function V on L such that:
1. V(a) >= 0 (non-negativity);
2. if a <= b then V(a) <= V(b) (monotonicity);
3. V is submodular: V(a v b) + V(a ^ b) <= V(a) + V(b) for all a, b in L.
Flow of information?
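The three axioms above can be checked mechanically on a concrete lattice. The sketch below (Python; the coverage-style value function is a hypothetical stand-in, not the valuation meant on the slide) uses the subset lattice, where inclusion is the order, union is join, and intersection is meet:

```python
from itertools import combinations

# Illustrative lattice: subsets of {1,2,3,4} ordered by inclusion,
# with union as join and intersection as meet.
def powerset(ground):
    items = list(ground)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

# Hypothetical value function: coverage (a classic submodular function).
# Each "source" i covers some items; V(a) = number of distinct items covered.
COVERS = {1: {"x", "y"}, 2: {"y", "z"}, 3: {"z"}, 4: {"w", "x"}}

def V(a):
    covered = set()
    for i in a:
        covered |= COVERS[i]
    return len(covered)

elements = powerset({1, 2, 3, 4})
for a in elements:
    assert V(a) >= 0                               # axiom 1: non-negativity
    for b in elements:
        if a <= b:
            assert V(a) <= V(b)                    # axiom 2: monotonicity
        assert V(a | b) + V(a & b) <= V(a) + V(b)  # axiom 3: submodularity
print("all three value-of-information axioms hold")
```

Coverage is a natural toy model here: adding a source to a larger collection yields a smaller marginal gain, which is exactly the diminishing-returns reading of submodularity.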
Life of a bit (Y. Polyanskiy and Y. Wu, "Dissipation of Information"):
- A copier has finite energy, so information dissipates (slowly!).
- A DRAM controller performs a refresh every 64 ms.
- Study: a sizable fraction of Google servers have DRAM errors each year; the majority are data-path related.
Framing the Foundation: Data, Information, Knowledge
- Data: a set of values of qualitative variables; individual pieces of information.
- Information: a measure of distinguishability.
- Knowledge: actionable information.
1. Information: Core Principles
   - Structural Information
   - Temporal Information
   - Value of Information
2. Communication & Control: Fundamental Limits
   - Flow of Information in Dynamic/Cooperative Networks
   - Provable Security
3. Data: Framing the Foundations
   - Information-Theoretic Models
   - Precise Complexity (Small Data)
   - Structural Insights into Data
4. Modeling and Analysis: Life Sciences
   - Sequence Analysis
   - Network Inference, Modeling, and Analysis
   - Information Flow in the Human Brain
   - From Energetics to Sequences and Conformations
Outline
1. Science of Information & Center Mission
2. Post-Shannon Challenges
3. Center Team
4. Research
   - Fundamentals: Structural Information
   - Data Science: Data Models & Inference
   - Life Sciences: Protein Folding
Center participants:
- Purdue University (lead): W. Szpankowski (PI)
- Bryn Mawr College: D. Kumar
- Howard University: C. Liu, L. Burge
- MIT: P. Shor (co-PI), M. Sudan
- Princeton University: S. Verdu (co-PI)
- Stanford University: A. Goldsmith (co-PI)
- Texas A&M: P. R. Kumar
- University of California, Berkeley: Bin Yu (co-PI)
- University of California, San Diego: S. Subramaniam
- UIUC: O. Milenkovic
- University of Hawaii: P. Santhanam
Other members: R. Aguilar, M. Atallah, S. Datta, A. Grama, A. Mathur, J. Neville, D. Ramkrishna, L. Si, V. Rego, A. Qi, M. Ward, D. Xu, S. Aaronson, N. Lynch, R. Rivest, Y. Polyanskiy, W. Bialek, S. Kulkarni, C. Sims, G. Bejerano, T. Cover, A. Ozgur, T. Weissman, V. Anantharam, J. Gallant, T. Courtade, M. Mahoney, D. Tse, T. Coleman, Y. Baryshnikov, M. Raginsky, ...
- Nobel Prize (Economics): Chris Sims
- National Academies (NAS/NAE): Bialek, Cover, Datta, Lynch, Kumar, Ramkrishna, Rice, Rivest, Shor, Sims, Verdu, Yu
- Turing Award: Rivest
- Shannon Award: Cover and Verdu
- Nevanlinna Prize (outstanding contributions in mathematical aspects of information sciences): Sudan and Shor
- Richard W. Hamming Medal: Cover and Verdu
- Humboldt Research Award: Szpankowski
- Swartz Prize in Neuroscience: Bialek
RESEARCH MISSION: Create a shared intellectual space, integral to the Center's activities, providing a collaborative research environment that crosses disciplinary and institutional boundaries.
RESEARCH THRUSTS:
1. Information & Communication
2. Knowledge Extraction (Data Science)
3. Life Sciences
Thrust leaders include S. Subramaniam, T. Weissman, J. Neville, A. Grama, S. Kulkarni, and David Tse.
Outline
1. Science of Information & Center Mission
2. Post-Shannon Challenges
3. Center Team
4. Research
   - Fundamentals: Structural Information
   - Data Science: Data Models & Inference
   - Life Sciences: Protein Folding
Structure vs. labeled structure. (Figure: a graph of cities labeled by country - Paris, Lyon, Strasbourg: FRANCE; Berlin, Strasburg, Dortmund: GERMANY; Gdansk, Warsaw: POLAND.)
How much information is embedded in the structure and in the (correlated) labels? What are the fundamental new limits?
Information content of unlabeled graphs: a structure model S of a graph G is defined for its unlabeled version; several labeled graphs share the same structure.
Graph entropy vs. structural entropy: the probability of a structure S is P(S) = N(S) * P(G), where N(S) is the number of different labeled graphs having the same structure.
Y. Choi and W. Szpankowski, IEEE Trans. Information Theory, 2012.
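The identity P(S) = N(S) * P(G) can be verified exhaustively for a small graph. The sketch below (Python; the choice n = 3 and p = 0.3 is ours, for illustration) exploits the fact that for n = 3 every unlabeled structure is determined by its edge count:

```python
from math import comb, log2

# Brute-force comparison of labeled-graph entropy H(G) and structural
# entropy H(S) for Erdos-Renyi graphs G(n, p) with n = 3.
# For n = 3 an unlabeled structure is determined by its edge count k,
# and N(S_k) = C(3, k) labeled graphs share that structure.
n, p = 3, 0.3
m = comb(n, 2)  # number of possible edges

def h(q):  # binary entropy in bits
    return -q * log2(q) - (1 - q) * log2(1 - q)

# Labeled graph entropy: each of the m edges is an independent Bernoulli(p) bit.
H_G = m * h(p)

# Structural entropy via P(S) = N(S) * P(G).
H_S = 0.0
E_logN = 0.0
for k in range(m + 1):
    P_G = p**k * (1 - p)**(m - k)  # probability of one labeled graph with k edges
    N_S = comb(m, k)               # labeled graphs sharing structure S_k
    P_S = N_S * P_G
    H_S -= P_S * log2(P_S)
    E_logN += P_S * log2(N_S)

# Grouping labeled graphs by structure gives H(S) = H(G) - E[log2 N(S)].
assert abs(H_S - (H_G - E_logN)) < 1e-9
print(f"H(G) = {H_G:.4f} bits, H(S) = {H_S:.4f} bits")
```

The gap H(G) - H(S) = E[log2 N(S)] is exactly the information carried by the labeling, which is what structural compression can discard.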
Theorem [Choi, Szpankowski, 2012]. Let L(S) be the length of the structural code. Then for Erdos-Renyi graphs G(n, p):
(i) Lower bound: E[L(S)] >= C(n,2) h(p) - n log2 n + O(n);
(ii) Upper bound: E[L(S)] <= C(n,2) h(p) - n log2 n + (c + Phi(log2 n)) n + o(n),
where h(p) is the binary entropy, c is an explicitly computable constant, and Phi is a fluctuating function with a small amplitude (or zero).
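To get a feel for the savings, note that to leading order the structural bound sits about log2 n! bits below the labeled-graph entropy C(n,2) h(p) (since N(S) is close to n! for asymmetric graphs). A quick numerical check (Python; the values of n and p are chosen arbitrarily):

```python
from math import comb, lgamma, log, log2

def h(p):  # binary entropy in bits
    return -p * log2(p) - (1 - p) * log2(1 - p)

def labeled_bits(n, p):
    # Entropy of a labeled Erdos-Renyi graph: C(n,2) independent edge bits.
    return comb(n, 2) * h(p)

def structural_bits_leading(n, p):
    # Leading-order structural entropy: C(n,2) h(p) - log2 n!
    # (valid when almost all graphs are asymmetric, so N(S) ~ n!).
    return comb(n, 2) * h(p) - lgamma(n + 1) / log(2)

for n in (100, 1000):
    saved = labeled_bits(n, 0.1) - structural_bits_leading(n, 0.1)
    print(f"n = {n}: discarding labels saves about {saved:.0f} bits")
```

The savings grow like n log2 n, matching the -n log2 n term in the theorem's bounds.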
Outline
1. Science of Information & Center Mission
2. Post-Shannon Challenges
3. Center Team
4. Research
   - Fundamentals: Structural Information
   - Data Science: Data Models & Inference
   - Life Sciences: Protein Folding
Data is often processed for purposes other than reproduction of the original data - the new goal is to reliably answer queries rather than reproduce data.
Recommendation systems make suggestions based on prior information (e.g., email on a server, prior search history); recommendations are usually lists indexed by likelihood.
Distributed data: the original (large) database is compressed, and the compressed version can be stored at several locations. Queries about the original database can then be answered reliably from its compressed version - for example, queries of the form "Is there an entry similar to y in the database?".
Courtade, Weissman, IEEE Trans. Information Theory, 2013.
Fundamental tradeoff: what is the minimum description (compression) rate required to generate a quantifiably good set of beliefs and/or reliable answers?
General result: queries can be answered reliably if and only if the compression rate exceeds the identification rate.
Examples: quadratic similarity queries on compressed data; distributed (multiterminal) source coding under logarithmic loss.
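A familiar (much simpler) instance of answering queries from a compressed version of a database is a Bloom filter: it stores the entries in a short bit array and answers membership queries with no false negatives and a controllable false-positive rate. This is only an analogy for the idea on the slide, not the identification-rate-optimal scheme of Courtade and Weissman:

```python
import hashlib

# A Bloom filter: answer "is y in the database?" queries from a compressed
# representation. Analogy only - not the Courtade-Weissman scheme.
class BloomFilter:
    def __init__(self, num_bits: int, num_hashes: int):
        self.m = num_bits
        self.k = num_hashes
        self.bits = bytearray((num_bits + 7) // 8)

    def _positions(self, item: str):
        # Derive k bit positions from salted SHA-256 digests.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item: str):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def query(self, item: str) -> bool:
        # False means "definitely not in the database";
        # True means "probably in the database" (false positives possible).
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in self._positions(item))

db = ["alice@example.com", "bob@example.com", "carol@example.com"]
bf = BloomFilter(num_bits=256, num_hashes=4)  # 32 bytes for the whole database
for entry in db:
    bf.add(entry)

assert all(bf.query(e) for e in db)  # no false negatives, ever
```

The rate tradeoff shows up directly: shrinking `num_bits` raises the false-positive rate, mirroring the slide's claim that reliable answers require the compression rate to exceed a threshold.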
Outline
1. Science of Information & Center Mission
2. Post-Shannon Challenges
3. Center Team
4. Research
   - Fundamentals: Structural Information
   - Data Science: Data Models & Inference
   - Life Sciences: Protein Folding
Protein folds in nature. (Figure: probability of protein folds vs. sequence rank.)
Information-theoretic model: a protein-fold channel mapping a sequence s (e.g., the HP sequence HPHHPPHPHHPPHHPH) to a fold f. The optimal input distribution is obtained from the Blahut-Arimoto algorithm.
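The Blahut-Arimoto algorithm alternates between the posterior induced by the current input distribution and a multiplicative update of that distribution. The sketch below (Python) implements the standard iteration; since the slide does not specify the protein-fold channel matrix, a binary symmetric channel stands in as the test case:

```python
from math import exp, log, log2

# Blahut-Arimoto iteration for the capacity-achieving input distribution of a
# discrete memoryless channel W[x][y] = P(y | x).
def blahut_arimoto(W, iters=200):
    nx, ny = len(W), len(W[0])
    r = [1.0 / nx] * nx  # current input distribution, initially uniform
    for _ in range(iters):
        # Output marginal and posterior q[x][y] proportional to r[x] * W[x][y].
        col = [sum(r[i] * W[i][y] for i in range(nx)) for y in range(ny)]
        q = [[r[x] * W[x][y] / col[y] for y in range(ny)] for x in range(nx)]
        # Multiplicative update: r(x) proportional to exp(sum_y W(y|x) ln q(x|y)).
        w = [exp(sum(W[x][y] * log(q[x][y]) for y in range(ny)))
             for x in range(nx)]
        total = sum(w)
        r = [v / total for v in w]
    # Mutual information I(X;Y) at the current r, in bits.
    col = [sum(r[i] * W[i][y] for i in range(nx)) for y in range(ny)]
    C = sum(r[x] * W[x][y] * log2(W[x][y] / col[y])
            for x in range(nx) for y in range(ny) if W[x][y] > 0)
    return r, C

# Binary symmetric channel, crossover 0.1: capacity = 1 - h(0.1) ~ 0.531 bits.
eps = 0.1
W = [[1 - eps, eps], [eps, 1 - eps]]
r, C = blahut_arimoto(W)
print(r, round(C, 4))  # prints: [0.5, 0.5] 0.531
```

For the actual protein-fold channel one would replace `W` with the empirical sequence-to-fold conditional probabilities; the iteration itself is unchanged.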
The model considers the set of self-avoiding walks of a given length, with an energy assigned to each walk over the sequence. One can prove an expression for the free energy, and that the free energy (and hence the capacity) undergoes a phase transition.