
1 Modelling Language Evolution Lecture 5: Iterated Learning Simon Kirby University of Edinburgh Language Evolution & Computation Research Unit

2 Models so far…
- Models of learning language
- Models of evolving the ability to learn language
- Models of differing abilities to learn differing languages
What do these have in common? The language comes from "outside" the agent.
[Diagram: LINGUISTIC AGENT receiving LANGUAGE from outside]

3 Two kinds of models
- Neural network: training sentences in, weight settings out. What can be learned?
- Language Acquisition Device (LAD): Primary Linguistic Data (PLD) in, Grammatical Competence (GC) out. What can evolve?
[Diagram: a repeated LAD/PLD/GC unit, one per generation]

4 A new kind of model: Iterated Learning
[Diagram: a chain of generations, each LAD turning Primary Linguistic Data into Grammatical Competence, which in turn produces the Primary Linguistic Data for the next generation's LAD]
What kind of language evolves?

5 What can Iterated Learning explain?
My hypothesis: some functional linguistic structure emerges inevitably from the process of iterated learning, without the need for natural selection or explicit functional pressure.
First target structure: Recursive Compositionality: the meaning of an utterance is some function of the meanings of the parts of that utterance and the way they are put together.

Compositional               | Holistic
walked                      | went
I greet you                 | Hi
I thought I saw a pussy cat | chutter

6 The agent
[Diagram: Agent (simulated individual). Meaning-signal pairs in (utterances from parent) feed a Learning Algorithm, which builds an internal linguistic representation; a Production Algorithm uses that representation, given meanings generated by the environment, to produce meaning-signal pairs out (to the next generation).]
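Read as code, the diagram amounts to the following interface (a minimal sketch; the class and method names are illustrative, not from the original model, and the two algorithms are the subject of the next slides):

```python
class Agent:
    """Simulated individual: learns from a parent, speaks to a child."""
    def __init__(self):
        self.grammar = []                  # internal linguistic representation

    def learn(self, meaning, signal):
        """Learning algorithm: update the grammar from one heard pair."""
        ...

    def produce(self, meaning):
        """Production algorithm: map a meaning (from the environment)
        to a signal for the next generation."""
        ...
```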

7 What will the agents talk about?
Need some simple but structured "world": simple predicate logic.
Agents can string random characters together to form utterances.
loves(mary, john)
admires(gavin, heather)
thinks(mary, likes(john, heather))
knows(heather, thinks(mary, likes(john, heather)))
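For concreteness, these meanings could be encoded as nested tuples, with an embedded proposition as an argument giving the recursion (an illustrative encoding, not necessarily the original implementation):

```python
# Predicate-logic meanings as nested tuples; an argument may itself be
# an embedded proposition, which makes the meaning space recursive.
m1 = ("loves", "mary", "john")
m2 = ("admires", "gavin", "heather")
m3 = ("thinks", "mary", ("likes", "john", "heather"))
m4 = ("knows", "heather", ("thinks", "mary", ("likes", "john", "heather")))

def depth(meaning):
    """Depth of embedding: 1 for a simple proposition."""
    return 1 + max((depth(arg) for arg in meaning[1:]
                    if isinstance(arg, tuple)), default=0)

assert depth(m1) == 1 and depth(m4) == 3
```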

8 How do agents learn?
Not using neural networks: in this model we are interested in more traditional, symbolic grammars.
Learners try to form a grammar that is consistent with the primary linguistic data they hear.
Fundamental principle: learning is compression. Learners try to fit the data they hear, but also to generalise; learning is a trade-off between these two.

9 Two steps to learning
INCORPORATION (for each sentence heard)
GENERALISATION (whenever possible)
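A minimal sketch of what these two steps could look like, using flat meaning tuples and meaning-signal rules; this is a simplified stand-in for the grammar-induction algorithm in the actual model, and every name in it is illustrative. Incorporation stores each heard pair as a holistic rule; generalisation looks for two rules whose meanings differ in exactly one argument and whose signals share a prefix and suffix, and factors out the common template:

```python
def incorporate(rules, meaning, signal):
    """INCORPORATION: store the heard pair as a holistic rule."""
    rules.append((meaning, signal))

def split_once(sig_a, sig_b):
    """Find a shared prefix and suffix of two signals. Returns
    (template, mid_a, mid_b) with 'X' marking the variable slot,
    or None if either middle part is empty."""
    i = 0
    while i < min(len(sig_a), len(sig_b)) and sig_a[i] == sig_b[i]:
        i += 1
    j = 0
    while (j < min(len(sig_a), len(sig_b)) - i
           and sig_a[-1 - j] == sig_b[-1 - j]):
        j += 1
    mid_a = sig_a[i:len(sig_a) - j]
    mid_b = sig_b[i:len(sig_b) - j]
    if not mid_a or not mid_b:
        return None
    return sig_a[:i] + "X" + sig_a[len(sig_a) - j:], mid_a, mid_b

def generalise(rules, lexicon):
    """GENERALISATION: merge two holistic rules whose meanings differ
    in exactly one argument into one schema plus two lexical entries.
    Returns True if a merge was made (call repeatedly until False)."""
    for a, (m_a, s_a) in enumerate(rules):
        for m_b, s_b in rules[a + 1:]:
            if len(m_a) != len(m_b):
                continue
            diffs = [k for k in range(len(m_a)) if m_a[k] != m_b[k]]
            split = split_once(s_a, s_b)
            if len(diffs) == 1 and split:
                k = diffs[0]
                template, mid_a, mid_b = split
                lexicon[m_a[k]], lexicon[m_b[k]] = mid_a, mid_b
                rules.remove((m_a, s_a))
                rules.remove((m_b, s_b))
                rules.append((m_a[:k] + ("X",) + m_a[k + 1:], template))
                return True
    return False
```

Given the pairs (("admires", "mary", "john"), "gj h f tej m") and (("loves", "mary", "john"), "gj h f tej wp"), for example, this would factor out the schema ("X", "mary", "john") with signal template "gj h f tej X" and the lexical entries admires = "m", loves = "wp".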

10 A simulation run
1. Start with one learner and one adult speaker, neither of which has a grammar.
2. Choose a meaning at random.
3. Get the speaker to produce a signal for that meaning (it may need to "invent" a random string).
4. Give the meaning-signal pair to the learner.
5. Repeat steps 2-4 one hundred and fifty times.
6. Delete the speaker.
7. Make the learner the new speaker.
8. Introduce a new learner (with no initial grammar).
9. Repeat steps 2-8 thousands of times.
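This schedule maps directly onto a loop. Here is a runnable sketch, with a trivial memorising agent standing in for the grammar learner sketched above (all names and the meaning space are illustrative):

```python
import random
import string

class Agent:
    """Trivial stand-in for the grammar-learning agent: memorises pairs
    and invents a random string for any meaning it has no signal for."""
    def __init__(self):
        self.memory = {}                        # meaning -> signal

    def learn(self, meaning, signal):
        self.memory[meaning] = signal

    def produce(self, meaning):
        if meaning not in self.memory:          # step 3: "invent"
            self.memory[meaning] = "".join(
                random.choices(string.ascii_lowercase,
                               k=random.randint(2, 6)))
        return self.memory[meaning]

NAMES = ("mary", "john", "gavin", "heather")
MEANINGS = [(verb, a, b) for verb in ("loves", "admires")
            for a in NAMES for b in NAMES if a != b]

speaker, learner = Agent(), Agent()             # step 1: no grammars
for generation in range(5000):                  # step 9: thousands of times
    for _ in range(150):                        # step 5: 150 utterances
        meaning = random.choice(MEANINGS)       # step 2: random meaning
        signal = speaker.produce(meaning)       # step 3: produce (or invent)
        learner.learn(meaning, signal)          # step 4: pass pair on
    speaker, learner = learner, Agent()         # steps 6-8: new generation
```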

11 Results 1a: initial stages
Initially, speakers have no language, so they "invent" random strings of characters. A protolanguage emerges for some meanings, but with no structure. These are holistic expressions:
1. ldg: "Mary admires John"
2. xkq: "Mary loves John"
3. gj: "Mary admires Gavin"
4. axk: "John admires Gavin"
5. gb: "John knows that Mary knows that John admires Gavin"


13 Results 1b: many generations later…
6. gj h f tej m (John Mary admires): "Mary admires John"
7. gj h f tej wp (John Mary loves): "Mary loves John"
8. gj qp f tej m (Gavin Mary admires): "Mary admires Gavin"
9. gj qp f h m (Gavin John admires): "John admires Gavin"
10. i h u i tej u gj qp f h m (John knows Mary knows Gavin John admires): "John knows that Mary knows that John admires Gavin"


15 Quantitative results: languages evolve

16 What's going on?
There is no biological evolution in the ILM. There isn't even any communication; there is no notion of function in the model at all. So why are structured languages evolving?
Hypothesis: languages themselves are adapting to the conditions of the ILM so that they are learnable. The agents never see all the meanings… Only rules that are generalisable from limited exposure are stable.

17 Language has to fit through a narrow bottleneck
This has profound implications for the structure of language.
[Diagram: Linguistic competence, via Production, yields Linguistic performance, which via Learning yields the next generation's Linguistic competence]

18 A nice and simple model…
Language: meanings are 8-bit binary numbers; signals are 8-bit binary numbers.
Agents: an 8x8x8 neural network (not an SRN) that learns to associate signals with meanings.
[Diagram: SIGNALS mapped to MEANINGS]

19 Bottleneck
There is only one parameter in this model, the bottleneck: the number of meaning-signal pairs (randomly chosen) given to the next generation…
In each simulation, we can measure two things:
- Expressivity: the proportion of the meaning space an adult agent can give a unique signal to
- Instability: how different each generation's language is from that of the previous generation
[Diagram: a subset of meaning-signal pairs passes through to the next generation]
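Both measures are easy to state over a tabulated adult mapping from all 256 meanings to signals. A sketch following the definitions on this slide (the exact formulas in the original model may differ):

```python
from collections import Counter

MEANINGS = range(256)    # all 8-bit meanings

def expressivity(mapping):
    """Proportion of the meaning space given a unique signal."""
    counts = Counter(mapping[m] for m in MEANINGS)
    return sum(counts[mapping[m]] == 1 for m in MEANINGS) / len(MEANINGS)

def instability(prev, curr):
    """Proportion of meanings whose signal differs from the previous
    generation's signal for the same meaning."""
    return sum(prev[m] != curr[m] for m in MEANINGS) / len(MEANINGS)
```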

20 Results Bottleneck too tight: unstable and inexpressive language

21 Results Bottleneck too wide: a fairly stable and eventually expressive language

22 Results Medium bottleneck: maximal stability and expressivity

23 Adaptation
Language is evolving to be learnable. Structure in the mapping emerges: meanings and signals are related by simple rules of bit flipping and re-ordering, and these rules can be learned from a subset. Despite the hugely different model, this is a very similar result to the earlier simulation.
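As an illustration of the kind of rule involved (the particular permutation and flips below are invented for the example, not taken from an actual run): a mapping built from bit re-ordering and bit flipping is fully determined by a handful of meaning-signal pairs, which is exactly what lets it survive the bottleneck.

```python
PERM = [7, 6, 5, 4, 3, 2, 1, 0]    # re-ordering: reverse the bits (illustrative)
FLIP = 0b00010010                  # then flip bits 1 and 4 (illustrative)

def signal_for(meaning):
    """Compositional mapping: each signal bit depends on one meaning bit."""
    reordered = sum(((meaning >> PERM[i]) & 1) << i for i in range(8))
    return reordered ^ FLIP
```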

24 Summary
- Language is learned by individuals with innate learning biases.
- The language data an individual hears is itself the result of learning.
- Languages adapt through iterated learning in response to our innate biases.
There's more! Our learning biases adapt through biological evolution in response to the language we use.
Tomorrow… use a simulation package to look at "grounding" models in an environment.
[Diagram: interacting loops of Individual learning, Cultural evolution, and Biological evolution]

