
Slide 1: Universal Semantic Communication
Madhu Sudan, MIT CSAIL. Joint work with Brendan Juba (MIT CSAIL). (10/14/2008)

Slide 2: The Meaning of Bits
- Is this perfect communication?
- What if Alice is trying to send instructions? In other words … an algorithm.
- Does Bob understand the correct algorithm?
- What if Alice and Bob speak in different (programming) languages?
[Figure: Alice sends 01001011 to Bob over a channel; Bob: "Freeze!"]

Slide 3: Motivation: Better Computing
- Networked computers use common languages:
  - interaction between computers (getting your computer onto the internet);
  - interaction between pieces of software;
  - interaction between software, data and devices.
- Getting two computing environments to "talk" to each other is getting problematic: time consuming, unreliable, insecure.
- Can we communicate more like humans do?

Slide 4: Some modelling
- Say Alice and Bob know different programming languages. Alice wishes to send an algorithm A to Bob.
- Bad news: can't be done. For every Bob, there exist algorithms A and A', and Alices Alice and Alice', such that Alice sending A is indistinguishable (to Bob) from Alice' sending A'.
- Good news: need not be done. From Bob's perspective, if A and A' are indistinguishable, then they are equally useful to him.
- What should be communicated? Why?

Slide 5: Aside: Why communicate?
- Classical "Theory of Computing" issues: Time/Space on DFA? Turing machines?
- Modern theory issues: Reliability, Security, Privacy, Agreement?
- If communication is so problematic, then why not simply not do it?
[Figure: Alice, Bob, Charlie and Dick jointly computing F(X) from input X.]

Slide 6: (Selfish) Motivations for Communication
- Bob speaks to some environment (a collection of entities). Why? He has some goal!
  - "Control": wants to alter the state of the environment.
  - "Intellectual": wants to glean knowledge (about the universe/environment).
- Claim: by studying the goals, we can enable Bob to overcome linguistic differences (and achieve the goal).

Slide 7: Rest of the talk
- Part I: Bob is computationally limited but wishes to solve a hard problem, and Alice can solve the problem.
- Part II: Generic goals.
- Part III: Bob is a teacher and wants to test a student's ability.

Slide 8: Part I: A Computational Goal

Slide 9: Modelling the communicator (Bob)
$\mathrm{Bob} : \Omega \times \Sigma^{k} \to \Omega \times \Gamma^{\ell}$, where $\Omega$ = countable state space, $\Sigma^{k}$ = input signals, $\Gamma^{\ell}$ = output signals. (Alice is modelled similarly.)
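To make the model concrete, here is a minimal Python sketch of a communicator in this sense; the names (`State`, `Signals`, `echo_bob`) and the choice of integers and strings as carriers are my own illustration, not from the talk.

```python
from typing import Callable, Tuple

State = int                # Omega: a countable state space, modelled as ints
Signals = Tuple[str, ...]  # Sigma^k inputs / Gamma^l outputs, modelled as strings

# A communicator maps (current state, input signals) to (next state, output
# signals), i.e., Bob : Omega x Sigma^k -> Omega x Gamma^l.
Communicator = Callable[[State, Signals], Tuple[State, Signals]]

def echo_bob(state: State, inputs: Signals) -> Tuple[State, Signals]:
    """A trivial communicator: advances its state and echoes what it heard."""
    return state + 1, inputs
```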

Slide 10: Computational Goal for Bob
- Bob is probabilistic polynomial-time bounded and wants to decide membership in a set S.
- Alice is computationally unbounded, does not speak the same language as Bob, but is "helpful".
- What kind of sets S? E.g., undecidable? decidable? PSPACE, NP, BPP?

Slide 11: Setup
Bob draws random coins $R$, exchanges queries $q_1, \dots, q_k$ and answers $a_1, \dots, a_k$ with Alice, and decides "$x \in S$?" by evaluating: $f(x; R; a_1, \dots, a_k) = 1$? Hopefully, $x \in S \iff f(\cdots) = 1$.
Different from IP: in IP Bob does not trust Alice, while here he does not understand her.
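A toy version of this setup, assuming Bob's query strategy, Alice's answering function, and the decision predicate f are supplied as callables (all names here are mine):

```python
import random
from typing import Callable, List

def run_setup(bob_query: Callable[[int, str, List[str]], str],
              alice_answer: Callable[[str], str],
              f: Callable[[str, int, List[str]], bool],
              x: str, k: int) -> bool:
    """Bob draws coins R, exchanges q_1..q_k / a_1..a_k with Alice,
    and outputs f(x; R; a_1, ..., a_k)."""
    R = random.getrandbits(64)           # Bob's private randomness
    answers: List[str] = []
    for _ in range(k):
        q = bob_query(R, x, answers)     # next query may depend on R, x, history
        answers.append(alice_answer(q))  # Alice's (untranslated) reply
    return f(x, R, answers)              # hopefully: True iff x in S
```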

Slide 12: Helpful Alice?
For Bob to have a non-trivial interaction, Alice must be:
- Intelligent: capable of deciding if x is in S;
- Cooperative: must communicate this to Bob.
Formally: …

Slide 13: Successful universal communication
Bob should be able to talk to any S-helpful Alice and decide S. Formally, a ppt $B$ is $S$-universal if for every $x \in \{0,1\}^{*}$:
- $A$ is $S$-helpful $\Rightarrow$ $[A \leftrightarrow B(x)] = 1$ iff $x \in S$ (w.h.p.);
- $A$ is not $S$-helpful $\Rightarrow$ nothing!
Or should it be: $A$ is not $S$-helpful $\Rightarrow$ ($[A \leftrightarrow B(x)] = 1$ implies $x \in S$).

Slide 14: Main Theorem
- If S is PSPACE-complete, then there exists an S-universal Bob (generalizes to other checkable sets S).
- Conversely, if there exists an S-universal Bob, then S is in PSPACE.
- In other words: if S is moderately stronger than what Bob can do on his own, then attempting to solve S leads to non-trivial (useful) conversation; if S is too strong, it leads to ambiguity.
- Uses IP = PSPACE [LFKN, Shamir].

Slide 15: Few words about the proof
Positive result: enumeration + interactive proofs (sketched below).
Guess: an Interpreter; run the proof for "$x \in S$?" through it. Proof works $\Rightarrow x \in S$; doesn't work $\Rightarrow$ guess wrong. Alice $S$-helpful $\Rightarrow$ an Interpreter exists!
[Figure: Alice (Prover) ↔ Interpreter ↔ Bob.]
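A hedged sketch of the shape of this argument: enumerate candidate Interpreters and, through each guess, run an interactive-proof verifier for the claim "x in S" and for "x not in S" (both directions have interactive proofs when S is PSPACE-complete). IP soundness means a verified proof can be trusted even if the guessed Interpreter was wrong. `enumerate_interpreters` and `verify_ip` are hypothetical stubs, not the paper's actual routines.

```python
from itertools import islice
from typing import Callable, Iterable, Optional

Interpreter = Callable[[str], str]

def universal_bob(x: str,
                  enumerate_interpreters: Iterable[Interpreter],
                  verify_ip: Callable[[Interpreter, str, bool], bool],
                  max_guesses: int = 10_000) -> Optional[bool]:
    """Enumeration + interactive proofs: guess an Interpreter, then try to
    verify a proof of 'x in S' and of 'x not in S' through it."""
    for interp in islice(enumerate_interpreters, max_guesses):
        if verify_ip(interp, x, True):   # proof works => x in S
            return True
        if verify_ip(interp, x, False):  # proof works => x not in S
            return False
        # neither proof verified => this guess was wrong; try the next one
    return None                          # no helpful interpretation found in budget
```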

Slide 16: Proof of Negative Result
- S not in PSPACE implies Bob makes mistakes.
- Suppose Alice answers every question so as to minimize the conversation length (a reasonable effect of misunderstanding).
- The conversation comes to an end quickly, and Bob has to decide.
- The conversation + decision are simulatable in PSPACE (since Alice's strategy can be computed in PSPACE).
- So Bob must be wrong if S is not in PSPACE.
- Warning: this only leads to finitely many mistakes.
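In symbols, a compressed paraphrase of this argument (my notation, with $\mathrm{Alice}^{*}$ the conversation-length-minimizing strategy):

```latex
% Paraphrase of the slide's chain of implications, not a verbatim statement.
\mathrm{Alice}^{*} \text{ is PSPACE-computable}
\;\Longrightarrow\;
\bigl(x \mapsto [\mathrm{Alice}^{*} \leftrightarrow B(x)]\bigr) \text{ is PSPACE-computable}
\;\Longrightarrow\;
\text{if } B \text{ decides } S \text{ correctly against } \mathrm{Alice}^{*},
\text{ then } S \in \mathrm{PSPACE}.
```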

Slide 17: Part II: Generic Goals

Slide 18: Generically
- Bob interacts with an environment (a collection of Alices).
- What should the goal depend on?
  - States of Bob? Then how can Bob adapt to Alice?
  - State of the Alice(s)? Bob doesn't know this!
  - Transcript of the interaction? Does this mean the same thing for different Alice/Bob pairs?

Slide 19: An Analogy: Multiparty Computation
- Need to model generic multiparty computation, to present general protocols for "secure, private, multiparty computation".
- Modelled by an "Ideal Trusted Party".
[Figure: parties A, B, C, D, E with private inputs a, b, c, d, e and a Trusted Party computing F = (f_a, f_b, f_c, f_d, f_e).]
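A toy ideal trusted party in Python for this analogy (five parties; the encoding and names are my own illustration): each party submits a private input and receives only its own component of F.

```python
from typing import Any, Callable, Dict

def trusted_party(inputs: Dict[str, Any],
                  F: Dict[str, Callable[[Dict[str, Any]], Any]]) -> Dict[str, Any]:
    """Ideal functionality: collect all private inputs, hand party p the
    value f_p(a, b, c, d, e). No party sees anything beyond its own output."""
    return {party: f_p(inputs) for party, f_p in F.items()}

# Example: F = (f_a, ..., f_e) where every party learns the sum of all inputs.
outputs = trusted_party(
    {"a": 1, "b": 2, "c": 3, "d": 4, "e": 5},
    {p: (lambda ins: sum(ins.values())) for p in "abcde"},
)
assert outputs["a"] == 15
```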

Slide 20: Generic Goals
- Framework: Bob talks to Alice through an Interpreter (one round trip is sketched below).
- Roles:
  - Bob defines the Goal (though his actions may also depend on what the Interpreter hears from Alice).
  - Alice comes from a class Ă; the Interpreter from a class Ĭ.
  - Alice is helpful if Bob achieves his goal with her through some Interpreter in Ĭ.
  - An Interpreter is universal if Bob achieves his goal for every helpful Alice in Ă.
[Figure: Alice ↔ Interpreter ↔ Bob.]
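One round of this framework, as a minimal sketch (function names are mine): Bob only ever exchanges messages with the Interpreter, which translates in both directions.

```python
from typing import Callable

def round_trip(bob_message: str,
               to_alice: Callable[[str], str],
               alice: Callable[[str], str],
               to_bob: Callable[[str], str]) -> str:
    """Bob -> Interpreter -> Alice -> Interpreter -> Bob."""
    return to_bob(alice(to_alice(bob_message)))

# Toy 'language difference': Alice only understands reversed strings.
reply = round_trip("hello",
                   to_alice=lambda s: s[::-1],
                   alice=lambda s: s.upper(),
                   to_bob=lambda s: s[::-1])
assert reply == "HELLO"
```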

Slide 21: Generic Helpfulness, Universality
Consider a class Ă of Alices, a class Ĭ of Interpreters, and some goal given by Bob B.
- (B, Ĭ)-Helpful: Alice is helpful to Bob via some Interpreter in Ĭ.
- (B, Ă)-Universal: an Interpreter that works with every helpful Alice in Ă.
- Theorem: "forgiving", "verifiable" goals can be achieved universally.
  - "Forgiving": no finite prefix of the interaction should rule out achievement of the goal.
  - "Verifiability": …

Slide 22: Typical Goals
- Intent of goals: usually depends on the state of Alice!
- Realizable goals: can only depend on the states of Bob and the Interpreter, and on the interactions.
- Translating intent into a realizable goal: non-trivial.
[Figure: Alice ↔ Interpreter ↔ Bob; an intellectual layer above the physical layer.]

Slide 23: Part III: Intellectual Curiosity

Slide 24: Setting: Bob more powerful than Alice
- What should Bob's goal be?
  - He can't use Alice to solve problems that are hard for him.
  - He can pose problems and see if she can solve them, e.g., teacher-student interactions.
- But how does he verify "non-triviality"? What is "non-trivial"? He must distinguish …
[Figure: two scenes of Alice ↔ Interpreter ↔ Bob.]

Slide 25: Setting: Bob more powerful than Alice
- Concretely: Bob is capable of TIME(n^10); Alice is capable of TIME(n^3), or of nothing. Can Bob distinguish the two settings?
- Definition: …
- Theorem: there exists a universal Bob that distinguishes helpful Alices from trivial ones (a toy version of such a test is sketched below).
- Moral: language (translation) should be simpler than the problems being discussed.
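A toy distinguishing test, showing only the shape of the idea and not the paper's construction: Bob, the stronger party, generates problems together with their answers within his own time bound, poses them, and checks agreement; a trivial Alice is right only by chance. All names and the threshold are my own.

```python
import random
from typing import Callable, Tuple

def pose() -> Tuple[int, int]:
    """Bob generates an instance together with its answer, within his budget."""
    n = random.randrange(10**6)
    return n, n * n % 97

def looks_helpful(alice: Callable[[int], int], trials: int = 200) -> bool:
    """Credit Alice as helpful if she matches Bob's known answers far above chance."""
    correct = 0
    for _ in range(trials):
        inst, ans = pose()
        if alice(inst) == ans:
            correct += 1
    return correct > trials * 9 // 10        # illustrative threshold

assert looks_helpful(lambda n: n * n % 97)                 # capable Alice passes
assert not looks_helpful(lambda n: random.randrange(97))   # guesser fails (w.h.p.)
```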

Slide 26: Conclusions
- Communication of "meaning/context" is feasible, provided goals are explicit.
- Verifying "goal achievement" for non-trivial goals is the (only?) way to learn languages.
- Currently the learning is slow … is this inherent? A better class of Alices?
- What are interesting goals, and how can they be verified?

Slide 27: Thank You!

Slide 28: Computers Communicate!
- Classical "Theory of Computing" issues: Time/Space on DFA? Turing machines?
- Modern theory issues: Reliability, Security, Privacy, Agreement?
[Figure: Alice, Bob, Charlie and Dick jointly computing F(X) from input X.]

Slide 29: Computers Communicate! How? Why?
- Classical introduction to the Theory of Computing.
- Bad news: can't be done. For every Bob, there exist algorithms A and A', and Alices Alice and Alice', such that the two are indistinguishable to Bob.
- Good news: need not be done. From Bob's perspective, if A and A' are indistinguishable, then they are equally useful to him.
- What should be communicated? Why?

Slide 31: A fantasy setting (SETI)
Bob receives: 010010101010001111001000
- Is there an intelligent alien (Alice) out there? No common language!
- Is meaningful communication possible?
- What should Bob's response be? If there are further messages, are they reacting to him?

Slide 32: Pioneer's face plate
- Why did they put this image? What would you put?
- What are the assumptions and implications?
[Figure: the Pioneer plaque.]

