
1 A Syntagmatic Paradigmatic Model of Serial Order Memory. Simon Dennis, University of Adelaide

2 Serial Order in Many Guises: SW1A 1AA; +44 20 7321 2233; "You ain't nothing but a hound dog"; Supercalifragilisticexpialidocious; /k/ /a/ /t/

3 Chaining Models. Links successive items by pair-wise associations, e.g. the TODAM model (Lewandowsky & Murdock, 1989). [Diagram: 'Start of List' cue → Item 1 → Item 2 → Item 3]
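
As a rough illustration only (my own toy sketch, not the TODAM implementation), a chaining mechanism can be written as a map from each studied item to its successor, with a 'start of list' cue retrieving the first item:

```python
# A minimal chaining sketch (toy illustration, not the TODAM model):
# each studied item is associated with its successor, and recall chains
# through the pair-wise links starting from a 'start of list' cue.

def learn_chain(items):
    """Store pair-wise forward associations, keyed by the preceding item."""
    chain = {"START": items[0]}
    for current, following in zip(items, items[1:]):
        chain[current] = following
    return chain

def recall_chain(chain, list_length):
    """Use each recalled item as the cue for the next one."""
    recalled, cue = [], "START"
    for _ in range(list_length):
        next_item = chain.get(cue)
        if next_item is None:
            break
        recalled.append(next_item)
        cue = next_item
    return recalled

print(recall_chain(learn_chain(list("CGETBD")), 6))  # ['C', 'G', 'E', 'T', 'B', 'D']
```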

4 Ordinal Models. Assign an activation strength to each item in memory; ranking these strengths determines item order during recall. e.g. the Primacy model (Page & Norris, 1998). [Diagram: activation strengths for Item 1, Item 2, Item 3]
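
As a toy illustration (not the Primacy model itself), an ordinal mechanism can assign a decaying activation gradient at study and then rank items by strength at recall:

```python
# A minimal ordinal-model sketch (toy illustration, not the Primacy model):
# each item gets an activation strength that decays across list position,
# and recall outputs items in descending order of strength.

def encode_gradient(items, start=1.0, decay=0.8):
    """Assign decreasing activation strengths across serial positions."""
    return {item: start * decay ** position for position, item in enumerate(items)}

def recall_ordinal(strengths):
    """Rank items by activation strength to determine output order."""
    return sorted(strengths, key=strengths.get, reverse=True)

strengths = encode_gradient(list("KDNCSG"))
print(recall_ordinal(strengths))  # ['K', 'D', 'N', 'C', 'S', 'G']
```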

5 Positional Models. A positional cue is linked with each item; recall involves re-instating these positional cues in sequence. e.g. OSCAR (Brown, Preece, & Hulme, 2000). [Diagram: Positional cue 1 → Item 1, Positional cue 2 → Item 2, Positional cue 3 → Item 3]
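
As a toy illustration (loosely in the spirit of OSCAR, not its oscillator-based implementation), a positional mechanism binds each item to a positional cue at study and re-instates the cues in order at recall:

```python
# A minimal positional-model sketch (toy illustration):
# each item is bound to a positional cue at study, and recall re-instates
# the cues in sequence to read the items back out.

def bind_positions(items):
    """Associate a positional cue (here just an index token) with each item."""
    return {f"pos{p}": item for p, item in enumerate(items, start=1)}

def recall_positional(bindings, list_length):
    """Re-instate positional cues in order and retrieve the bound items."""
    return [bindings[f"pos{p}"] for p in range(1, list_length + 1)]

bindings = bind_positions(list("KNSLRX"))
print(recall_positional(bindings, 6))  # ['K', 'N', 'S', 'L', 'R', 'X']
```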

6 Chaining. Fill-in errors: ACB 53% vs. ACD 21%. Mixed lists of confusable/nonconfusable stimuli (e.g. CGETBD, KDNCSG, DKCNGS, KNSLRX). Henson concluded that the chaining model is not viable.

7 Chaining. Bigram frequency (Baddeley, 1964) and approximation to English (Miller & Selfridge, 1951). Circular list learning (Addis & Kahana, in prep): ABCDEF, DEFABC, BCDEFA. Puzzle: how can the mechanism be chaining and not chaining?

8 Insights from Other Serial Phenomena
- Word reading: SALT/SLAT
- Speech errors: "The queer old dean" (phonemic); "Too slicely thinned" (morphemic); "I wrote a mother to my letter" (whole word)
- Sentence comprehension: "The waiter served calzone complained"; "As she lowered the veil the temptress lured the hero"
- Contentions: items are activated and then assembled; backward links impact processing

9 The Syntagmatic Paradigmatic Model
- Assumes that people store a large number of sentence instances.
- When trying to interpret a new sentence, they retrieve similar traces from memory and resolve the constraints in these traces in working memory.
- Traces consist of syntagmatic associations between serially presented items, and paradigmatic associations between items that fill similar slots in different sentences (see the sketch below).
- Dennis, S. (2005). A memory-based theory of verbal cognition. Cognitive Science, 29(2).
- Dennis, S. (2004). An unsupervised method for the extraction of propositional information from text. Proceedings of the National Academy of Sciences, 101, 5206-5213.
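
The sketch below is my own toy encoding, not Dennis's implementation: it only illustrates the two kinds of association the slide names. The comparison sentence "the chef served pasta" is a made-up example; only "the waiter served calzone" comes from the slides.

```python
# Toy illustration of the two association types in an SP trace:
# syntagmatic pairs link serially presented items within one sentence;
# paradigmatic pairs link words filling the same slot in two similar sentences.

def syntagmatic_pairs(sentence):
    """Associations between successive words within a single trace."""
    words = sentence.split()
    return list(zip(words, words[1:]))

def paradigmatic_pairs(new_sentence, retrieved_sentence):
    """Associations between words occupying the same slot in two sentences."""
    return [(a, b)
            for a, b in zip(new_sentence.split(), retrieved_sentence.split())
            if a != b]

print(syntagmatic_pairs("the waiter served calzone"))
# [('the', 'waiter'), ('waiter', 'served'), ('served', 'calzone')]
print(paradigmatic_pairs("the waiter served calzone", "the chef served pasta"))
# [('waiter', 'chef'), ('calzone', 'pasta')]
```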

10 The Formal Model. [Equations omitted in transcript: syntagmatic and paradigmatic associations, error function, sampling distribution.]

11 Position Curves. [Figure: data vs. SP model]

12 Parameterization

13 Position Curve. [Figure: data vs. SP model]

14 Lag Conditional Response Probability. [Figure: data vs. SP model]

15 Fill-in Errors

16 [Figure: recall of the list ABCDEF unfolding across output positions, illustrating how error sequences such as ACBDEF arise]

17 Confusable / Nonconfusable Lists

18 Discussion & Conclusions
- Free recall can be modelled by adjusting the contribution of syntagmatic and paradigmatic associations.
- Constraints can also be resolved using a gradient descent procedure, which can capture backward recall and judgements of recency (McElree & Dosher).
- Fill-in and confusable-list phenomena can be captured.
- Bigram frequency and approximation-to-English phenomena can be captured by assuming pre-existing background associations.
- Circular lists can be captured by increasing the syntagmatic and paradigmatic learning parameters.
- Chaining models are viable.

19 Other Related Tasks
- Backward recall: cue with the end-of-list context; assuming backward connections exist, the model does backward recall automatically.
- Judgements of recency: the time to make a judgement depends only on the lag of the most recent item (McElree & Dosher); do backward recall and wait until one of the probes is above threshold.
- Free recall: by manipulating the contribution of syntagmatic and paradigmatic constraints, the model can shift between serial and free recall.

20 Forward Recall with Gradient Descent

21 Forward with Noise

22 Backward Recall and Judgement of Recency

23 Metropolis-Hastings. Sample from the target distribution [equation omitted in transcript] using a candidate distribution [equation omitted in transcript], where v is the vocabulary size and u is a uniform random variate.

24 Metropolis-Hastings. Start the chain at [a,b,c,d,e,f], which is the mode. Used 1,000,000 samples.
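
The sketch below is only a hedged stand-in: the slides' actual target and candidate distributions are not reproduced in the transcript, so this toy version scores a candidate sequence by how many positions match the studied list and proposes moves by replacing one randomly chosen position with a uniformly drawn vocabulary item (a symmetric proposal).

```python
import random

# Toy Metropolis-Hastings sketch: start the chain at the studied list (the
# mode), propose single-position substitutions drawn uniformly from the
# vocabulary, and accept with probability min(1, p(candidate)/p(state)).

def mh_sample(score, vocab, start, n_samples=10_000):
    state = list(start)                       # start the chain at the mode
    samples = []
    for _ in range(n_samples):
        candidate = list(state)
        candidate[random.randrange(len(candidate))] = random.choice(vocab)
        # symmetric proposal, so the Metropolis acceptance rule suffices
        if random.random() < min(1.0, score(candidate) / score(state)):
            state = candidate
        samples.append(tuple(state))
    return samples

studied = list("abcdef")

def score(seq):
    """Toy target: weight sequences by positional matches to the studied list."""
    return 2.0 ** sum(x == y for x, y in zip(seq, studied))

samples = mh_sample(score, list("abcdefgh"), studied)
print(samples.count(tuple(studied)) / len(samples))  # the mode is sampled most often
```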

25 Lag Conditional Response Probability. [Figure: data vs. SP model]

