Communicating Agents in a Shared World Natalia Komarova (IAS & Rutgers) Partha Niyogi (Chicago)


1 Communicating Agents in a Shared World Natalia Komarova (IAS & Rutgers) Partha Niyogi (Chicago)

2 What is your location? Nice weather! * Two linguistic agents in a shared world. Q: How can we quantify the rate of success in information transfer? Q: Given the language of the first agent, what language makes the best communicator?

3 * A population of communicating agents. Q: Can they develop a common communication system? Q: What are the properties of this system?

4 Plan Formalize the notion of “language” Introduce the “mutual intelligibility” function for two communicating agents What is the optimal communicator? Implications for evolution: populations of communicating agents Nash equilibria and evolutionarily stable strategies Part I. Part II.

5 Linguistic system A set of associations between form and meaning Let S be the set of all possible linguistic forms (words, codes, signals, sentences, …) Let M be the set of all possible semantic objects (meanings, messages, events, referents, …) Language, L, is a probability measure over M x S

6 (Diagram: all meanings, m11, …, m33, on one side; all signals — cat, dog, hippo — on the other, with associations between them.)

7 The sets S and M: Can be finite (animal signals) Or infinite (human sentences) We assume that they are countable… The measure on S x M is called an association matrix

8 Two modes of communication: production mode and comprehension mode.

9 Production mode. Given a meaning, what is the probability of using each signal? Normalize the association matrix by column:

  cat    0.9  0  0
  dog    0.1  1  0
  hippo  0    0  1

(each column sums to 1)

10 Comprehension mode. Given a signal, what is the probability of interpreting it as each possible meaning? Normalize the association matrix by row:

  cat    0.7  0.1  0.2
  dog    0.1  0.9  0
  hippo  0    0    1

(each row sums to 1)
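The two normalizations can be sketched in a few lines of Python. The association matrix below is hypothetical (rows index signals, columns index meanings); its entries are chosen so that the resulting P reproduces the production matrix of slide 9:

```python
import numpy as np

# Hypothetical association matrix: rows = signals (cat, dog, hippo),
# columns = meanings.
A = np.array([[9.0,  0.0, 0.0],
              [1.0, 10.0, 0.0],
              [0.0,  0.0, 5.0]])

# Production mode: given a meaning (a column), normalize by column.
P = A / A.sum(axis=0, keepdims=True)

# Comprehension mode: given a signal (a row), normalize by row.
C = A / A.sum(axis=1, keepdims=True)

print(P)  # each column sums to 1
print(C)  # each row sums to 1
```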

11 Each agent has a language; that is, each agent comes equipped with two matrices: a P matrix for production and a C matrix for comprehension (rows: signals; columns: meanings).

P =
  1  0.2  0.01
  0  0.8  0.01
  0  0    0.98

C =
  1  0     0
  0  0.9   0.1
  0  0.01  0.99

12 Production and comprehension The production and comprehension modes are connected (in humans and animals). They arise out of highly structured systems in the brain and cannot change independently. We assume that they arise out of a common association matrix

13 A fully-coordinated (non-ambiguous) language: P = C, where P and C are binary permutation matrices. For example:

P = C =
  1 0 0
  0 1 0
  0 0 1

or

P = C =
  0 0 1
  1 0 0
  0 1 0

14 Intelligibility function. Probability of successful communication when agent 1 speaks: F12 = the sum, over all meanings i and all signals j, of P(agent 1 uses signal j to express meaning i) × P(agent 2 interprets signal j as meaning i).

15 Intelligibility function. Probability of successful communication when agent 2 speaks: F21 = the sum, over all meanings i and all signals j, of P(agent 2 uses signal j to express meaning i) × P(agent 1 interprets signal j as meaning i).

16 Intelligibility function. The overall intelligibility combines both directions: F(L1, L2) = ½ (F12 + F21), where F12 is the probability of success when agent 1 speaks and F21 when agent 2 speaks.
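A minimal numerical sketch of this intelligibility function, assuming meanings are weighted uniformly so that two identical, fully coordinated agents score exactly 1:

```python
import numpy as np

def intelligibility(P1, C1, P2, C2):
    """Mutual intelligibility of two agents (a sketch).
    Matrices have rows = signals, columns = meanings; meanings are
    assumed uniformly weighted, so perfect communication scores 1."""
    n = P1.shape[1]               # number of meanings
    f12 = np.sum(P1 * C2) / n     # agent 1 speaks, agent 2 listens
    f21 = np.sum(P2 * C1) / n     # agent 2 speaks, agent 1 listens
    return 0.5 * (f12 + f21)

# Two identical, fully coordinated agents understand each other perfectly.
I = np.eye(3)
print(intelligibility(I, I, I, I))  # -> 1.0

# Agents using mismatched permutation languages never succeed.
J = np.roll(np.eye(3), 1, axis=0)
print(intelligibility(I, I, J, J))  # -> 0.0
```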

17 Given the language of agent 1, what language of agent 2 maximizes the mutual intelligibility?

18 Fully coordinated languages A language is fully coordinated if it defines a one-to-one correspondence between signals and meanings (binary permutation matrices). For fully coordinated languages, the production and comprehension matrices are the same. The best communicator with a fully-coordinated language is the language itself.

19 What happens if the language is not fully coordinated?

20 Given P1, what is the best C2?

P1 =
  0.1  0.3  0.1
  0.2  0.4  0.6
  0.7  0.3  0.3

C2 =
  *  *  *
  *  *  *
  *  *  *

21 Given P1, what is the best C2?

P1 =
  0.1  0.3  0.1
  0.2  0.4  0.6
  0.7  0.3  0.3

C2 =
  0  1  0
  0  0  1
  1  0  0

22 Given C1, what is the best P2?

C1 =
  0.1  0.3  0.6
  0.4  …    …
  0.3  …    …

P2 =
  *  *  *
  *  *  *
  *  *  *

23 Given C1, what is the best P2?

C1 =
  0.1  0.3  0.6
  0.4  …    …
  0.3  …    …

P2 =
  0  0  1
  1  1  0
  0  0  0
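Both constructions can be sketched as argmax operations (uniform meaning weights assumed). Here P1 is the production matrix of slide 20, with the entry lost in transcription taken as 0.3 so that its columns sum to 1, and the encoder is demonstrated on the comprehension matrix of slide 10:

```python
import numpy as np

def best_decoder(P):
    """Given a production matrix P (rows = signals, columns = meanings),
    interpret each signal as its most probable meaning
    (uniform meaning weights assumed)."""
    C = np.zeros_like(P)
    C[np.arange(P.shape[0]), P.argmax(axis=1)] = 1.0
    return C

def best_encoder(C):
    """Given a comprehension matrix C, express each meaning with the
    signal most likely to be understood as that meaning."""
    P = np.zeros_like(C)
    P[C.argmax(axis=0), np.arange(C.shape[1])] = 1.0
    return P

# P1 from slide 20: each signal's row argmax gives the decoder.
P1 = np.array([[0.1, 0.3, 0.1],
               [0.2, 0.4, 0.6],
               [0.7, 0.3, 0.3]])
print(best_decoder(P1))   # one-hot rows: meaning 2, meaning 3, meaning 1

# The comprehension matrix of slide 10: its best encoder is the identity.
C_demo = np.array([[0.7, 0.1, 0.2],
                   [0.1, 0.9, 0.0],
                   [0.0, 0.0, 1.0]])
print(best_encoder(C_demo))
```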

24 Constructing the best communicator Given the production matrix, can construct the optimal decoder

25 Constructing the best communicator Given the production matrix, can construct the optimal decoder Given the comprehension matrix, can construct the optimal encoder

26 Constructing the best communicator Given the production matrix, can construct the optimal decoder Given the comprehension matrix, can construct the optimal encoder Recall: the production and comprehension mode are connected.

27 This puts a constraint on possible pairs of production and comprehension matrices. If the best encoder and decoder are constructed, do they correspond to a self-consistent language?

28 Constructing the best communicator. Given a measure on the space S x M, construct the production and comprehension matrices:

A =
   1  64   2  23  90
  92   8  42  81  42
  53  77  60   2  50
  88  15  68  73  59
  39  48  66  65  37

P =
  0.004  0.302  0.008  0.094  0.324
  0.337  0.038  0.176  0.332  0.151
  0.194  0.363  0.252  0.008  0.180
  0.322  0.071  0.286  0.299  0.212
  0.143  0.226  0.277  0.266  0.133

C =
  0.006  0.356  0.011  0.128  0.500
  0.347  0.030  0.158  0.306  0.158
  0.219  0.318  0.248  0.008  0.207
  0.290  0.050  0.224  0.241  0.195
  0.152  0.188  0.259  0.254  0.145

29 For P, construct the best decoder; for C, construct the best encoder.

P =
  0.004  0.302  0.008  0.094  0.324
  0.337  0.038  0.176  0.332  0.151
  0.194  0.363  0.252  0.008  0.180
  0.322  0.071  0.286  0.299  0.212
  0.143  0.226  0.277  0.266  0.133

C =
  0.006  0.356  0.011  0.128  0.500
  0.347  0.030  0.158  0.306  0.158
  0.219  0.318  0.248  0.008  0.207
  0.290  0.050  0.224  0.241  0.195
  0.152  0.188  0.259  0.254  0.145

Best C =
  0 0 0 0 1
  1 0 0 0 0
  0 1 0 0 0
  1 0 0 0 0
  0 0 1 0 0

Best P =
  0 1 0 0 1
  1 0 0 0 0
  0 0 0 0 0
  0 0 0 0 0
  0 0 1 1 0
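The worked example can be checked numerically. The integer matrix below is recovered from the run-together digits in the transcript of slide 28; it reproduces the P and C shown there to three decimals:

```python
import numpy as np

# Association matrix of slide 28, recovered from the transcript
# (rows = signals, columns = meanings).
A = np.array([[ 1, 64,  2, 23, 90],
              [92,  8, 42, 81, 42],
              [53, 77, 60,  2, 50],
              [88, 15, 68, 73, 59],
              [39, 48, 66, 65, 37]], dtype=float)

P = A / A.sum(axis=0)                  # production: normalize by column
C = A / A.sum(axis=1, keepdims=True)   # comprehension: normalize by row

# Best decoder: each signal is mapped to its most probable meaning.
best_C = np.zeros_like(P)
best_C[np.arange(5), P.argmax(axis=1)] = 1.0

print(P.round(3)[0])  # [0.004 0.302 0.008 0.094 0.324] -- matches the slide
print(best_C)
```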

30 Best C = (0 0 0 0 1; 1 0 0 0 0; 0 1 0 0 0; 1 0 0 0 0; 0 0 1 0 0); Best P = (0 1 0 0 1; 1 0 0 0 0; 0 0 0 0 0; 0 0 0 0 0; 0 0 1 1 0). Can we identify a measure on the space S x M such that it gives rise to these production and comprehension matrices?

31 Best C = (0 0 0 0 1; 1 0 0 0 0; 0 1 0 0 0; 1 0 0 0 0; 0 0 1 0 0); Best P = (0 1 0 0 1; 1 0 0 0 0; 0 0 0 0 0; 0 0 0 0 0; 0 0 1 1 0). Can we identify a measure on the space S x M such that it gives rise to these production and comprehension matrices? Do they come from a self-consistent language?

32 Best C = (0 0 0 0 1; 1 0 0 0 0; 0 1 0 0 0; 1 0 0 0 0; 0 0 1 0 0); Best P = (0 1 0 0 1; 1 0 0 0 0; 0 0 0 0 0; 0 0 0 0 0; 0 0 1 1 0). Can we identify a measure on the space S x M such that it gives rise to these production and comprehension matrices? Do they come from a self-consistent language? In general, NO.

33 Constructing the best communicator. Theorem. Let language L be self-consistent; that is, its production and comprehension matrices come from some measure on the space S x M. Let C* and P* be the best encoder and decoder constructed as above. It is always possible to find a family of measures on the space S x M, parameterized by a parameter ε, such that it gives rise to production and comprehension matrices arbitrarily close to C* and P*. These languages tend to the best communicator as ε → 0.

34 Best C = (0 0 0 0 1; 1 0 0 0 0; 0 1 0 0 0; 1 0 0 0 0; 0 0 1 0 0); Best P = (0 1 0 0 1; 1 0 0 0 0; 0 0 0 0 0; 0 0 0 0 0; 0 0 1 1 0). Step 1. Build an auxiliary matrix.

35 Best C = (0 0 0 0 1; 1 0 0 0 0; 0 1 0 0 0; 1 0 0 0 0; 0 0 1 0 0); Best P = (0 1 0 0 1; 1 0 0 0 0; 0 0 0 0 0; 0 0 0 0 0; 0 0 1 1 0). Step 2. Connect vertices that share a row (column).

36 Best C = (0 0 0 0 1; 1 0 0 0 0; 0 1 0 0 0; 1 0 0 0 0; 0 0 1 0 0); Best P = (0 1 0 0 1; 1 0 0 0 0; 0 0 0 0 0; 0 0 0 0 0; 0 0 1 1 0). Step 3. Assign direction.

37 Best C = (0 0 0 0 1; 1 0 0 0 0; 0 1 0 0 0; 1 0 0 0 0; 0 0 1 0 0); Best P = (0 1 0 0 1; 1 0 0 0 0; 0 0 0 0 0; 0 0 0 0 0; 0 0 1 1 0). Step 4. Assign powers of ε so that the entries increase in the direction of the arrows.

38 Constructing the best communicator. Given language:

   1  64   2  23  90
  92   8  42  81  42
  53  77  60   2  50
  88  15  68  73  59
  39  48  66  65  37

The constructed family of languages approaches the best communicator as ε → 0.

39 Constructing the best communicator. For fully coordinated languages, the language is the best communicator with itself. For general languages, the best communicator is different from the language itself. In general, the best communicator cannot be constructed exactly, but it can be approached arbitrarily closely. This theory can be extended to noisy communication, infinite languages, etc.

40 Evolutionary dynamics of communicating agents.

41 Evolutionary dynamics. Consider a population of communicating agents. Each agent has a language (an association matrix → matrices P and C). Apply a "genetic algorithm": agents proliferate and die. "Children" inherit/learn the language of their "parent". Keep a score for each agent of how successfully it communicates with others…

42 Evolutionary dynamics. (Diagram: a population of agents with languages L1, …, L7.)

43 Evolutionary dynamics. The score of agent i is computed by means of mutual intelligibility: the sum of F(Li, Lj) over the other agents j. Agents with a higher score have a higher chance of reproduction (and a lower chance of death).
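A toy sketch of the scoring step. The three-agent population is hypothetical; fully coordinated languages are represented by single permutation matrices (P = C), and meanings are weighted uniformly:

```python
import numpy as np

def F(L1, L2):
    """Mutual intelligibility of two fully coordinated languages, each
    represented by a single permutation matrix (so P = C); meanings
    are weighted uniformly -- a modelling assumption."""
    n = L1.shape[1]
    return float(np.sum(L1 * L2)) / n

# A hypothetical three-agent population: two agents share a language,
# the third uses a different permutation of signals.
shared = np.eye(3)
odd = np.roll(np.eye(3), 1, axis=0)
population = [shared, shared, odd]

# Score of each agent: summed intelligibility with all other agents.
scores = [sum(F(Li, Lj) for j, Lj in enumerate(population) if j != i)
          for i, Li in enumerate(population)]
print(scores)  # the agents sharing a language outscore the odd one out
```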

44 Evolutionary dynamics. Numerical studies of evolutionary dynamics of communicating agents: Hurford 1989; Steels 1996; Steels & Vogt 1997; Steels & Kaplan 1998; Oliphant & Batali 1996; Oliphant 1997; Kirby 1999; Briscoe 2000; Smith 2002; Smith 2003.

45 Numerical experiments. A typical outcome (plot: average communicability vs. time): starting from a random initial condition, the population converges to a common, highly coordinated communication system.

46 Evolutionary dynamics We attempt an analytical description

47 Evolutionary dynamics We attempt an analytical description Use properties of the “score” function F(Li,Lj)

48 Evolutionary dynamics We attempt an analytical description Use properties of the “score” function F(Li,Lj) Identify and classify important states of the system

49 Navigating in the fitness landscape

50 Definitions of Game Theory. Language L is a strict Nash equilibrium if for all L' ≠ L: F(L,L) > F(L',L).

51 Definitions of Game Theory. Language L is a Nash equilibrium if for all L' ≠ L: F(L,L) ≥ F(L',L).

52 Definitions of Game Theory. Language L is an evolutionarily stable state (ESS) if L is Nash and, for every L' ≠ L with F(L',L) = F(L,L), we have F(L,L') > F(L',L').

53 Definitions of Game Theory. Language L is a weak ESS if L is Nash and, for every L' ≠ L with F(L',L) = F(L,L), we have F(L,L') ≥ F(L',L').

54 Definitions of game theory. Strict Nash ⊂ ESS ⊂ weak ESS ⊂ Nash.

55 Definitions of game theory. Strict Nash ⊂ ESS ⊂ weak ESS ⊂ Nash. This is what we observe in computer simulations.

56 Evolutionarily Stable States. A language is an ESS if it is an extended permutation matrix:

  1 0 0 0 0
  0 0 0 1 0
  0 1 0 0 0
  0 0 1 0 0
  0 0 0 0 0
  0 0 0 0 0

57 Evolutionarily stable states ESS are important in the dynamics However, much more common outcomes of evolutionary dynamics are weak ESS. The system will converge to a weak ESS and then slowly drift to the nearest ESS without change in communicability function.

58 Evolutionarily stable states ESS are important in the dynamics However, much more common outcomes of evolutionary dynamics are weak ESS. The system will converge to a weak ESS and then slowly drift to the nearest ESS without change in communicability function. Weak ESS are harder to characterize…

59 Synonyms and Homonyms. A language has synonyms (homonyms) if some column (row) of the association matrix has more than one positive element.

  0 1 2 0
  2 0 0 0
  0 1 0 3
  0 0 0 1

(rows: signals; columns: meanings)

Homonyms: bank – bank; spring – spring; fall – to fall.
Synonyms: beautiful – fair; medicine – drug; to phone – to call.

60 Isolated Synonyms. A language has isolated synonyms if the corresponding rows of the association matrix have no other positive elements.

  0 1 2 0
  2 0 0 0
  0 0 0 3
  0 0 0 1

(the two signals for the last meaning — rows 3 and 4 — are isolated synonyms; row 1 still contains homonyms)

61 Weak ESS. Theorem. The language L is a weak ESS if and only if the only kind of synonyms it has are isolated synonyms. It cannot have homonyms.

  0 0 0 a 0
  0 0 0 b 0
  0 0 f 0 0
  0 0 e 0 0
  0 0 0 c 0
  0 0 0 d 0
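The theorem's condition is easy to test on an association matrix: with no homonyms, every signal (row) has at most one positive entry, and any remaining synonyms are automatically isolated. A sketch, with the letter entries of slide 61 replaced by arbitrary positive numbers:

```python
import numpy as np

def is_weak_ess(A):
    """A language is a weak ESS iff its only synonyms are isolated ones,
    i.e. no signal (row) is associated with more than one meaning."""
    return bool(np.all((A > 0).sum(axis=1) <= 1))

# Slide-59-style matrix: has homonyms (rows with two positive entries).
A_hom = np.array([[0, 1, 2, 0],
                  [2, 0, 0, 0],
                  [0, 1, 0, 3],
                  [0, 0, 0, 1]])

# Slide-61-style matrix (letters a..f replaced by arbitrary positives):
# several signals per meaning, but every signal is unambiguous.
A_syn = np.array([[0, 0, 0, 1, 0],
                  [0, 0, 0, 2, 0],
                  [0, 0, 3, 0, 0],
                  [0, 0, 4, 0, 0],
                  [0, 0, 0, 5, 0],
                  [0, 0, 0, 6, 0]])

print(is_weak_ess(A_hom))  # False
print(is_weak_ess(A_syn))  # True
```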

62 Definitions of game theory. Strict Nash ⊂ ESS ⊂ weak ESS ⊂ Nash. ESS: fully coordinated languages. Weak ESS: languages that can have isolated synonyms but no homonyms. This is exactly what is observed in computer simulations.

63 Paradox Studies of natural languages suggest that full synonyms are extremely rare, and homonyms are quite common Our results (together with many numerical simulations) indicate that synonyms are evolutionarily stable, and homonyms are not

64 Paradox Studies of natural languages suggest that full synonyms are extremely rare, and homonyms are quite common Our results (together with many numerical simulations) indicate that synonyms are evolutionarily stable, and homonyms are not What is wrong?

65 Paradox: resolution. The columns of our matrices refer to meanings; the rows refer to signals. Let us add context to the picture: intelligibility should be defined per utterance, rather than per word. Now the results can be re-evaluated.

66 Paradox: resolution At the new level of description, the mathematical model correctly describes the following observation: There are no true homonyms that remain homonyms in the presence of contextual clues. Context-dependent synonyms are rather common.

67 Example: the level of words Fall (autumn) and Fall (down) are homonyms on the level of words. Beautiful and Fair are not complete synonyms on the level of words (fair price?)

68 Example: the level of utterances Fall (autumn) and Fall (down) are homonyms on the level of words. On the level of utterances they are not! Beautiful and Fair are not complete synonyms on the level of words. They are synonymous on the level of utterances (beautiful lady = fair lady)

69 Context The presence of context destroys homonymy and creates synonymy This is exactly what is observed in the model. Synonyms are stable because they do not reduce coherence. Homonyms make the language ambiguous and thus reduce coherence. They are unstable.

70 Conclusions. Given a language L, one can find a self-consistent language L' that approximates the best communicator arbitrarily well. The best communicator is not equal to the language itself, unless L is fully coordinated.

71 Conclusions In a population of learning agents, the mutual intelligibility will increase The system will reach a weak ESS state characterized by a common language with no homonyms. This language may contain isolated synonyms Slow random drift may then bring the system to a fully coordinated language without change in intelligibility

