Visual search: Who cares?

Visual search: Who cares? This is a visual task that is important outside psychology laboratories (for both humans and non-humans).

Feature search vs. conjunction search [example displays of X's and O's] (Treisman & Gelade, 1980)

“Serial” vs. “Parallel” Search [plot of reaction time (ms) against set size]
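A minimal simulation sketch of the standard prediction behind this plot (not from the slides; the base RT, per-item cost, and noise values below are illustrative assumptions): parallel "pop-out" search yields a roughly flat reaction-time function over set size, whereas serial self-terminating search yields RTs that grow with set size, with target-absent trials roughly twice as steep as target-present trials.

```python
import random

def simulate_rt(set_size, search="serial", target_present=True,
                base_rt=400.0, per_item=40.0, noise_sd=30.0):
    """Toy reaction-time model for a single visual-search trial.

    Parallel search: RT does not depend on set size.
    Serial self-terminating search: a present target is found after checking
    about half the items on average; an absent target requires checking all.
    """
    if search == "parallel":
        items_checked = 1
    elif target_present:
        items_checked = random.randint(1, set_size)  # ~set_size / 2 on average
    else:
        items_checked = set_size
    return base_rt + per_item * items_checked + random.gauss(0, noise_sd)

if __name__ == "__main__":
    for search in ("parallel", "serial"):
        for present in (True, False):
            for n in (1, 12, 29):
                mean_rt = sum(simulate_rt(n, search, present) for _ in range(2000)) / 2000
                print(f"{search:8s} present={present!s:5s} set size={n:2d}  mean RT ~ {mean_rt:6.1f} ms")
```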

Feature Integration Theory (FIT): Basics (Treisman, 1988, 1993)
- Distinction between objects and features
- Attention is used to bind features together ("glue") at the attended location
- One object is coded at a time, based on location
- Pre-attentive, parallel processing of features
- Serial process of feature integration

FIT: Details
- Sensory "features" (color, size, orientation, etc.) are coded in parallel by specialized modules
- Modules form two kinds of "maps":
  - feature maps (color maps, orientation maps, etc.)
  - a master map of locations

Feature Maps
- Contain two kinds of information:
  - the presence of a feature anywhere in the field ("there's something red out there…")
  - implicit spatial information about the feature
- Activity in feature maps can tell us what's out there, but can't tell us:
  - where it is located
  - what other features the red thing has

Master Map of Locations
- Codes where features are located, but not which features are located where
- We need some way of:
  - locating features
  - binding the appropriate features together
- [Enter focal attention…]

Role of Attention in FIT
- Attention moves within the location map
- It selects whatever features are linked to that location
- Features of other objects are excluded
- The attended features are then entered into the current temporary object representation
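A minimal code sketch of the architecture described on the last few slides (the scene, the dictionary representation, and the function names are my own illustrative assumptions, not Treisman's formalism): feature maps register which feature values are present and where, but only focal attention to one location binds that location's features into a single temporary object representation.

```python
# Toy scene: each location holds some features, but nothing is "bound" yet.
scene = {
    (0, 0): {"color": "red",   "shape": "T"},
    (1, 2): {"color": "green", "shape": "T"},
    (3, 1): {"color": "red",   "shape": "X"},
}

def build_feature_maps(scene):
    """Pre-attentive stage: one map per feature dimension.

    A feature map says that a value (e.g. "red") occurred and at which
    locations, but not which other features co-occur with it there.
    """
    maps = {}
    for loc, features in scene.items():
        for dimension, value in features.items():
            maps.setdefault(dimension, {}).setdefault(value, set()).add(loc)
    return maps

def attend(scene, location):
    """Focal attention: select one location on the master map and bind
    whatever features are linked to it into a temporary object file."""
    return {"location": location, **scene.get(location, {})}

feature_maps = build_feature_maps(scene)
print(feature_maps["color"]["red"])  # red exists at {(0, 0), (3, 1)}, but is unbound
print(attend(scene, (0, 0)))         # {'location': (0, 0), 'color': 'red', 'shape': 'T'}
```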

Evidence for FIT
- visual search tasks
- illusory conjunctions

Feature Search: Find red dot

“Pop-Out Effect”

Conjunction: white vertical

1 Distractor

12 Distractors

29 Distractors

Feature Search [displays of T's in two colors]
- Is there a red T in the display?
- The target is defined by a single feature (color)
- According to FIT, the target should "pop out"

Conjunction Search [displays of T's and X's in two colors]
- Is there a red T in the display?
- The target is defined by both shape and color
- Target detection involves binding features, so it demands serial search with focal attention

Visual Search Experiments
- Record the time taken to determine whether the target is present or absent
- Vary the number of distractors
- FIT predicts that:
  - feature search should be independent of the number of distractors
  - conjunction search should get slower with more distractors

Typical Findings & Interpretation
- Feature targets pop out: flat display-size function
- Conjunction targets demand serial search: non-zero slope
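To make "flat display-size function" vs. "non-zero slope" concrete, here is a small illustrative calculation (the RT values are invented, not data from any study cited here): the search slope is the least-squares slope of mean RT against set size, in ms per item. Slopes near zero are usually read as efficient, parallel search; slopes of a few tens of ms per item as inefficient, serial search.

```python
def search_slope(set_sizes, rts):
    """Ordinary least-squares slope of mean RT (ms) against set size."""
    n = len(set_sizes)
    mean_x = sum(set_sizes) / n
    mean_y = sum(rts) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(set_sizes, rts))
    var = sum((x - mean_x) ** 2 for x in set_sizes)
    return cov / var

set_sizes = [1, 12, 29]
feature_rts     = [420, 425, 430]    # roughly flat -> "pop-out"
conjunction_rts = [450, 780, 1290]   # increases steeply -> serial search

for label, rts in [("feature", feature_rts), ("conjunction", conjunction_rts)]:
    print(f"{label:12s} slope = {search_slope(set_sizes, rts):5.1f} ms/item")
    # feature      slope =   0.4 ms/item
    # conjunction  slope =  30.0 ms/item
```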

…not that simple…
- Some conjunctions are easy [example display of X's and O's]: depth & shape, and movement & shape (Theeuwes & Kooi, 1994)

Guided Search
- Triple conjunctions are frequently easier than double conjunctions
- This led Wolfe and Cave to modify FIT into the Guided Search model (Wolfe & Cave)

Guided Search (Wolfe & Cave) [display of X's and O's]
- Separate processes search for X's and for white things (the target's features), and the resulting double activation draws attention to the target.
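A toy sketch of the idea on this slide (the numeric values and the simple additive rule are my assumptions, not Wolfe and Cave's actual model): each target feature contributes top-down activation to every item that has it, the contributions are summed into an activation map, and attention is drawn to the peak, which is the doubly activated target.

```python
# Toy display: four items with a location, a shape, and a color.
items = [
    {"loc": (0, 0), "shape": "X", "color": "black"},
    {"loc": (0, 1), "shape": "O", "color": "white"},
    {"loc": (1, 0), "shape": "O", "color": "black"},
    {"loc": (1, 1), "shape": "X", "color": "white"},  # the target: a white X
]
target = {"shape": "X", "color": "white"}

def activation_map(items, target):
    """One unit of activation per target feature an item shares.

    Distractors sharing a single feature get 1; the item sharing both
    features ("double activation") gets 2 and captures attention.
    """
    return {item["loc"]: sum(item[dim] == value for dim, value in target.items())
            for item in items}

amap = activation_map(items, target)
attended = max(amap, key=amap.get)  # attention goes to the activation peak
print(amap)      # {(0, 0): 1, (0, 1): 1, (1, 0): 0, (1, 1): 2}
print(attended)  # (1, 1)
```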

Problems for Both of These Theories
- Both FIT and Guided Search assume that attention is directed at locations, not at objects in the scene.
- Goldsmith (1998) showed much more efficient search for a target location containing both redness and S-ness when the features were combined (in an "object") than when they were not.

More Problems: Hayward & Burke (2000)
[Displays: lines alone, lines in circles, lines + circles]

Results (target-present trials only): a pop-out search should be unaffected by the circles.

More Problems: Enns & Rensink (1991)
- Search is very fast in this situation only when the objects look 3D. Can the direction a whole object points be a "feature"?

Duncan & Humphreys (1989): Similarity
- Visual search tasks are:
  - easy when distractors are homogeneous and very different from the target
  - hard when distractors are heterogeneous and not very different from the target
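A toy rendering of the similarity account just stated (my own illustrative formula, not Duncan and Humphreys' model): treat search difficulty as increasing with target-distractor similarity and with distractor heterogeneity.

```python
def search_difficulty(target_distractor_similarity, distractor_heterogeneity,
                      w_sim=1.0, w_het=1.0):
    """Toy difficulty index: both inputs are assumed to lie in [0, 1].

    Difficulty rises as distractors become more similar to the target and
    as the distractors become less similar to one another. The weights and
    the additive form are arbitrary illustrative choices.
    """
    return w_sim * target_distractor_similarity + w_het * distractor_heterogeneity

# Easy case: homogeneous distractors, very different from the target
print(search_difficulty(0.25, 0.0))   # 0.25
# Hard case: heterogeneous distractors, similar to the target
print(search_difficulty(0.75, 0.5))   # 1.25
```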

Asymmetries in Visual Search [example display pairs]
- The presence of a "feature" is easier to find than its absence.

Kristjansson & Tse (2001) Faster detection of presence than absence - but what is the “feature”?

Familiarity and Asymmetry
- The search asymmetry appears for German readers but not for Cyrillic readers.

Other High-Level Effects
- Finding a tilted black line is not affected by the white lattice, so "feature" search is sensitive to occlusion (Wolfe, 1996).