Unit 3. Attention

1/28 Unit 3. Attention
3.1 Visual Locations
3.2 The Sperling Task
3.3 Visual Attention
3.4 Auditory Attention
3.5 Typing and Control
3.6 Declarative Finsts
3.7 Data Fitting
3.8 The Subitizing Task

2/28 3.1 Visual Locations

> (print-visicon)   ; prints the current contents of the visicon

Three steps to encode visual objects:
1. Find the location of an object
2. Shift attention to that location
3. Harvest the encoded object
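The three steps typically chain across productions, as in the sketch below. This is only an illustration: the production names, the read-letters goal chunk-type, and its slots are assumptions here, not the tutorial's own code (which appears on later slides).

(p find-letter                  ; step 1: request a location
   =goal>
      isa      read-letters
      state    find
==>
   +visual-location>
      isa       visual-location
      :attended nil
   =goal>
      state    attend)

(p attend-letter                ; step 2: shift attention to that location
   =goal>
      isa      read-letters
      state    attend
   =visual-location>
      isa      visual-location
   ?visual>
      state    free
==>
   +visual>
      isa        move-attention
      screen-pos =visual-location
   =goal>
      state    encode)

(p encode-letter                ; step 3: harvest the encoded object
   =goal>
      isa      read-letters
      state    encode
   =visual>
      isa      text
      value    =letter
==>
   =goal>
      letter   =letter
      state    find)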

3/28 3.1 Visual Locations – Attended tests

Attended tests let the model draw attention to previously unattended objects.

The attended slot:
– Has the object at this location been attended to before?
– Values: new, nil, or t
– An object becomes attended t when a +visual> isa move-attention request to it completes

4/28 3.1 Visual Locations – Attended tests

Random selection among all unattended items:

(P find-random-letter
   =goal>
      isa   read-letters
      state find
      tone  nil
==>
   +visual-location>
      isa       visual-location
      :attended nil
   =goal>
      state attending)

Shifting attention tags the feature (in the visicon) as attended:

(P attend-letter
   =goal>
      isa   read-letters
      state find-location
   =visual-location>
      isa   visual-location
==>
   +visual>
      isa        move-attention
      screen-pos =visual-location
   =goal>
      state attending)

5/28 3.1 Visual Locations – Attended tests

Attentional tags – FINSTs (FINgers of INSTantiation) – Zenon Pylyshyn
– Limit on the number of objects that can be tagged as attended t
– Time limit on how long a tag sticks
– i.e., limited in both number and time

:VISUAL-NUM-FINSTS 4   default: 4   : Number of visual finsts.
:VISUAL-FINST-SPAN 3.0 default: 3.0 : Lifespan of a visual finst.
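These limits are module parameters and can be changed with sgp in a model definition; a minimal sketch (the values below are illustrative, not the tutorial's):

(sgp :visual-num-finsts 5     ; allow five objects to carry the attended t tag at once
     :visual-finst-span 4.0)  ; a tag persists 4 seconds before the object reverts to attended nil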

6/28 3.1 Visual Locations – Attended tests

7/28 3.1 Visual Locations – Visual-location request

Slots that can be tested:
– distance
– screen-x, screen-y
– kind, value, color, size
– attended, nearest

If multiple features match, one is chosen at random.

8/28 3.1 Visual Locations – Visual-location request

1. Exact values
+visual-location>
   isa      visual-location
   screen-x 50
   screen-y 124

2. General values (※ modifiers <, <=, >, >=)
+visual-location>
   isa         visual-location
   >  screen-x 50
   <= screen-y 124

+visual-location>
   isa        visual-location
   >  screen-x 10
   <  screen-x 100
   >  screen-y 10
   <  screen-y

3. Production variables
(p find-by-color
   =goal>
      isa    find-goal
      target =color
==>
   +visual-location>
      isa   visual-location
      color =color)

+visual-location>
   isa         visual-location
   kind        =kind
   <  screen-x =x
   >  screen-y =y

9/28 3.1 Visual Locations – Visual-location request

4. Relative values
+visual-location>
   isa      visual-location
   width    highest
   screen-x lowest
   color    red

5. Current value
+visual-location>
   isa         visual-location
   >  screen-x current
   <= screen-y current
   -  color    current

6. Request variables
+visual-location>
   isa    visual-location
   height &height
   width  &height

10/28 3.1 Visual Locations – :nearest request parameter

Find the item closest to the currently attended location, or to some other location:

+visual-location>
   isa      visual-location
   :nearest current

+visual-location>
   isa      visual-location
   :nearest =some-location

=some-location must be bound to a chunk of type visual-location.
The :nearest constraint is tested last.

11/28 3.1 Visual Locations – Ordered Search

Read words on the screen from left to right:

(p read-next-word
   =goal>
      isa   read-word
      state find
==>
   +visual-location>
      isa       visual-location
      :attended nil
      screen-x  lowest
   =goal>
      state attend)

(p read-next-word
   =goal>
      isa   read-word
      state find
==>
   +visual-location>
      isa         visual-location
      >  screen-x current
      screen-x    lowest
   =goal>
      state attend)

12/28 3.2 The Sperling Task

Partial report version:
– Rows of letters are displayed
– After some interval (0, .15, .3, or 1 sec), subjects are cued by a beep
– Total display time: 1 sec
– Report the letters in the cued row

Crude approximation to the original:
– Display is on for a full second vs. 50 ms
– The cue occurs while the display is still on
– Subjects in the original experiment were extremely well practiced
– Visual conditions very different
– But this representation is adequate for our purposes

13/28 3.2 The Sperling Task

Whole report condition:
– 50 ms display, then a mask
– 4.4 letters reported on average

Partial report condition:
– Auditory cue (tones of different frequencies) indicates which row to report
– 3.3 letters reported from that row
– As the delay of the cue gets longer (up to 1 sec), the number reported falls (to 1.5)

14/28 3.2 The Sperling Task

Subjects' recall at a one-second delay fell to about a third of the whole-report level – by then they could report only as many items from the cued row as they would have encoded without the cue.

Interpretation: subjects have access to all the letters in a visual buffer, but they have difficulty reporting them before they decay away.

People seldom have to process a 50 ms display; the task is interesting for what it shows about how fast visual attention can move over the entire display.

15/28 3.2 The Sperling Task – Human data

16/28 3.3 Visual Attention – Buffer stuffing

If the visual-location buffer is empty, a location chunk is automatically placed ("stuffed") into it when the display changes, so the model can detect changes to the screen.
Default specification: attended new, screen-x lowest

Trace: VISION  SET-BUFFER-CHUNK VISUAL-LOCATION LOC0 REQUESTED NIL

Three steps to encode visual objects:
1. Find the location of an object
2. Shift attention to that location
3. Harvest the encoded object
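The stuffing criteria can be changed; a minimal sketch assuming the ACT-R 6 set-visloc-default command, which takes the same slot tests as a +visual-location> request (the choice of screen-y lowest is purely illustrative):

(set-visloc-default isa visual-location
                    :attended new     ; only stuff locations not yet attended
                    screen-y lowest)  ; prefer the topmost new feature instead of the leftmost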

17/28 3.3 Visual Attention – Testing and Requesting Locations

– Constraints are passed from the goal to the RHS
– Checks for a range of screen-y
– Modifies the chunk before it is written to memory
– The visual buffer is cleared so that the chunk enters declarative memory at this time

(p encode-row-and-find
   =goal>
      isa      read-letters
      location =pos
      upper-y  =uy
      lower-y  =ly
   =visual>
      isa    text
      status nil
==>
   =visual>
      status =pos
   -visual>       ;; add this to make sure it flushes to DM
   =goal>
      location nil
      state    attending
   +visual-location>
      isa      visual-location
      attended nil
      screen-y (within =uy =ly))

18/28 3.4 Auditory Attention

Two steps to encode a sound:
– aural-location buffer -> aural buffer
– Happens automatically, but with a delay

There is a delay between the initial onset of a sound and the model being able to detect it:
– The delay depends on the type of sound: tone or digit

The Sperling model assumes there is only one sound, so it can rely on buffer stuffing.

19/28 3.4 Auditory Attention – Audio Module Parameters

:TONE-RECODE-DELAY        default:      : Recoding delay for tone sound content.
:DIGIT-RECODE-DELAY 0.5   default: 0.5  : Recoding delay for digit sound content.
:DIGIT-DURATION 0.6       default: 0.6  : Default duration for digit sounds.
:DIGIT-DETECT-DELAY 0.3   default: 0.3  : Lag between onset and detectability for digits.
:TONE-DETECT-DELAY 0.05   default: 0.05 : Lag between sound onset and detectability for tones.
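As with the vision parameters, these are set with sgp; a minimal sketch that simply restates the listed defaults:

(sgp :tone-detect-delay  0.05  ; tones become detectable 50 ms after onset
     :digit-detect-delay 0.3   ; digits take 300 ms to become detectable
     :digit-recode-delay 0.5   ; and another 500 ms before their content is encoded
     :digit-duration     0.6)  ; default length of a digit sound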

20/28 3.4 Auditory Attention

(p detected-sound
   =aural-location>
      isa audio-event
   ?aural>
      state free
==>
   +aural>
      isa   sound
      event =aural-location)

(p sound-respond-low
   =goal>
      isa  read-letters
      tone nil
   =aural>
      isa     sound
      content 500
==>
   =goal>
      tone    low
      upper-y 205
      lower-y 215)

Trace:
VISION      SET-BUFFER-CHUNK VISUAL TEXT
PROCEDURAL  PRODUCTION-FIRED ENCODE-ROW-AND-FIND
VISION      SET-BUFFER-CHUNK VISUAL-LOCATION LOC
AUDIO       SET-BUFFER-CHUNK AURAL-LOCATION AUDIO-EVENT0 REQUESTED NIL
PROCEDURAL  PRODUCTION-FIRED ATTEND-HIGH
PROCEDURAL  PRODUCTION-FIRED DETECTED-SOUND
AUDIO       SET-BUFFER-CHUNK AURAL TONE
PROCEDURAL  PRODUCTION-FIRED ATTEND-HIGH
PROCEDURAL  PRODUCTION-FIRED SOUND-RESPOND-MEDIUM
VISION      SET-BUFFER-CHUNK VISUAL TEXT
PROCEDURAL  PRODUCTION-FIRED ENCODE-ROW-AND-FIND
VISION      SET-BUFFER-CHUNK VISUAL-LOCATION LOC
PROCEDURAL  PRODUCTION-FIRED ATTEND-MEDIUM

21/28 3.5 Typing and Control

(P start-report
   =goal>
      isa  read-letters
      tone =tone
   ?visual>
      state free
==>
   +goal>
      isa report-row
      row =tone
   +retrieval>
      isa    text
      status =tone)

– This production competes with several other productions
– Creates and switches to a new goal
– Issues a retrieval constrained by the tone

22/28 3.5 Typing and Control – Control

start-report should only fire when there are no more letters to perceive:

(spp start-report :c 2)
(spp detected-sound :c 0)
(spp sound-respond-low :c 0)
(spp sound-respond-medium :c 0)
(spp sound-respond-high :c 0)

The default cost (:c) is 0.05 sec.

23/28 3.5 Typing and Control

(P do-report
   =goal>
      isa report-row
      row =tone
   =retrieval>
      isa    text
      status =tone
      value  =val
   ?manual>
      state free
==>
   +manual>
      isa press-key
      key =val
   +retrieval>
      isa    text
      status =tone
      :recently-retrieved nil)   ; to keep from retrieving the same letter

Trace:
PROCEDURAL   PRODUCTION-FIRED DO-REPORT
MOTOR        PRESS-KEY c
DECLARATIVE  SET-BUFFER-CHUNK RETRIEVAL TEXT
PROCEDURAL   PRODUCTION-FIRED DO-REPORT

24/28 3.5 Typing and Control

(p stop-report
   =goal>
      isa report-row
      row =row
   ?retrieval>
      state error
   ?manual>
      state free
==>
   +manual>
      isa press-key
      key space
   =goal>
      row nil)

25/28 3.6 Declarative FINSTs

+retrieval>
   isa    text
   status =tone
   :recently-retrieved nil

:declarative-num-finsts   default = 4 items
:declarative-finst-span   default = 3 sec
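Like the visual finsts, the declarative finsts are module parameters that can be raised with sgp; a minimal sketch (the values are illustrative, not the tutorial's):

(sgp :declarative-num-finsts 12   ; mark up to 12 chunks as recently retrieved
     :declarative-finst-span 10)  ; keep the marks for 10 seconds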

26/28 3.7 Data Fitting

;(sgp :seed (100 0))
The leading ; signifies a comment, so the line is commented out, which allows random generation.

> (repeat-experiment 100)
CORRELATION:
MEAN DEVIATION:
Condition    Current Participant    Original Experiment
0.00 sec
0.15 sec
0.30 sec
1.00 sec

27/28 3.8 The Subitizing Task

? (experiment)
CORRELATION:
MEAN DEVIATION:
Items   Current Participant   Original Experiment
 1 (T )
 2 (T )
 3 (T )
 4 (T )
 5 (T )
 6 (T )
 7 (T )
 8 (T )
 9 (T )
10 (T )                       2.58

28/28 3.8 The Subitizing Task – Vocal System

(P do-report
   =goal>
      isa report-row
      row =tone
   =retrieval>
      isa    text
      status =tone
      value  =val
   ?vocal>
      state free
==>
   =goal>       ; need this to keep the goal buffer from clearing
   +vocal>
      isa    speak
      string =val
   +retrieval>
      isa    text
      status =tone
      :recently-retrieved nil)