Schema Theory and Soft Constraint Satisfaction Psych 85-419/719 January 23, 2001
Announcements Assignment #1 handed out at end of class today –Due Feb 1 Also, very quick demo of PDP software at end of this class. Installation instructions on class web page Next class: –short lecture on stochastic optimization –… then full demo of software –Bring two blank floppies to Thursday class
Constraint Satisfaction Units represent facts or hypotheses Activation level indicates degree of belief or disbelief in that fact or hypothesis Connections represent constraints between facts or hypotheses The sign and magnitude of a connection weight represent our degree of belief in the relation between these items
Constraints as Weights A positive weight between units indicates we believe these facts are consistent with each other A negative weight indicates that they are not consistent A low-magnitude weight indicates ambivalence about the relation between the hypotheses
Bi-Directional Example [Diagram: units for Has Freckles, Has Measles, and Left Handed, with weighted connections between them] Having freckles suggests one might have measles Having measles strongly suggests one has freckles Being left handed doesn’t tell us much about freckles or measles (and vice versa)
“Goodness” of Network The goodness measure formally states how consistent the activity of the network is When two units are connected by a positive weight, goodness increases if both are active When the weight is negative, goodness decreases if both are active, so it is highest when at most one of them is active
Formally... Goodness is the sum of the product of the weight between units and their respective activations, Plus the product of the external input to each unit and that unit’s activity, Plus the product of each unit’s bias and its activity
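The goodness sum above can be sketched in code. This is a minimal illustration, not the course software; the function and variable names are mine:

```python
import numpy as np

def goodness(w, a, ext, bias):
    # Sum over unit pairs of weight * activation_i * activation_j
    # (the 0.5 counts each symmetric pair once), plus the
    # external-input and bias terms from the slide.
    return 0.5 * a @ w @ a + ext @ a + bias @ a

# Two units joined by a positive weight: goodness is highest
# when both units are active.
w = np.array([[0.0, 1.0],
              [1.0, 0.0]])
ext = np.zeros(2)
bias = np.zeros(2)
print(goodness(w, np.array([1.0, 1.0]), ext, bias))  # 1.0
print(goodness(w, np.array([0.0, 1.0]), ext, bias))  # 0.0
```

With the positive weight, the state with both units on scores higher than the state with only one on, which is exactly the consistency reading of goodness.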
Dynamics The set of all units’ activation values is the state of the network. The set of all possible values is the state space. The state can change over time. The points the network can settle into are the stable states or fixed point attractors. Moving through state space is called settling, or relaxation.
Simple Example of Settling A single unit with a positive weight to itself An activation function that preserves the sign of its input What happens over time in response to input? [Diagram: one unit with a positive self-connection, receiving an external input]
The “Goodness” Measure For Simple Example Recall: goodness is the sum of: –Product of the activation of the from unit, the activation of the to unit, and the weight –Product of input and activation –And the bias terms (which we didn’t use) [Plot: goodness as a function of the input]
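Settling for the single-unit example can be sketched as below. The tanh squashing function is just one sign-preserving choice, and the self-weight, input, and step-size values are illustrative:

```python
import numpy as np

def settle_single_unit(self_weight, ext_input, steps=50, rate=0.1):
    # Repeatedly nudge the unit's activation toward the squashed
    # net input (self-connection plus external input).
    a = 0.0
    for _ in range(steps):
        net = self_weight * a + ext_input
        a += rate * (np.tanh(net) - a)
    return a

# Positive input settles the unit into a positive stable state,
# negative input into a negative one.
print(settle_single_unit(1.0, 0.5))
print(settle_single_unit(1.0, -0.5))
```

The two runs end at mirror-image fixed points, one per sign of the input: the unit relaxes into whichever stable state the input favors.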
A More Complex Example [Plot: goodness surface over Unit 1 activity and Unit 2 activity, each ranging from 0 to 1]
Input Changes Shape of Goodness Plot There is a term for external input to units Changes shape of plot: –Alters where the peaks of goodness may be –May eliminate a peak entirely –Or form new ones
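The effect of input on the goodness landscape can be seen numerically for the single self-connected unit; a small sketch (function name and values are mine):

```python
import numpy as np

def goodness_curve(w, ext, n=201):
    # Goodness of one self-connected unit, g(a) = w*a*a + ext*a,
    # evaluated on a grid of activations in [-1, 1].
    a = np.linspace(-1.0, 1.0, n)
    return a, w * a * a + ext * a

a, g0 = goodness_curve(1.0, 0.0)  # no input: equal peaks at a = -1 and a = +1
a, g1 = goodness_curve(1.0, 0.5)  # positive input: the peak at a = +1 wins
print(a[np.argmax(g0)])  # -1.0 (argmax picks the first of the two tied peaks)
print(a[np.argmax(g1)])  # 1.0
```

With no input the curve has two equally good optima; adding external input tilts the landscape so one peak dominates, which is the "changes shape of plot" point above.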
Two Unit Example, Revisited [Plot: goodness surface over Unit 1 activity and Unit 2 activity, each ranging from 0 to 1; the figure notes that one value is now zero]
Properties of Goodness Functions Set of optima (stable fixed points) Basins of attraction around optima –If you’re close, you get pulled in Height of optima (convergence rate)
Potential Problems Local optima. Where to go next? [Plot: goodness as a function of activation state, showing a local optimum alongside the true optimum]
Necker Cube Can interpret cube as facing more to the left or right Seems to spontaneously shift interpretations Only one of two interpretations allowed Bi-stable system
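The bi-stability can be sketched with two units, one per interpretation, that support themselves and inhibit each other. The weights and the tanh settling dynamics are illustrative choices, not the course software:

```python
import numpy as np

def settle(a, w, steps=200, rate=0.1):
    # Pull each activation toward tanh of its net input.
    for _ in range(steps):
        a = a + rate * (np.tanh(w @ a) - a)
    return a

# Each unit supports itself and inhibits the other interpretation;
# a tiny initial bias decides which stable state wins.
w = np.array([[ 2.0, -2.0],
              [-2.0,  2.0]])
print(settle(np.array([ 0.01, -0.01]), w))  # settles near [ 1, -1]
print(settle(np.array([-0.01,  0.01]), w))  # settles near [-1,  1]
```

Only one interpretation is active in each stable state, never both at once, mirroring how the cube admits one reading at a time.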
Categorical Perception of Phonemes Can vary sounds along a continuum between two endpoints –From a pure “spy” to a pure “sky”, with ambiguous stimuli in between [Plot: percentage of “sky” responses as a function of position along the continuum, showing a sharp category boundary] Phonemes are perceived categorically; the listener is pulled into stable attractor states
Another Example: Phonological Attractors (Harm & Seidenberg 1999)
Constraint Satisfaction: Summary Propositions are units Knowledge of relations between them encoded as weights Activity can be driven externally, and/or by internal dynamics One way of thinking of world knowledge (characteristics of phonemes, animals, living rooms, stories, etc.)
What’s a Schema? Traditionally, a store of knowledge about characteristic events or things Used for reasoning and induction Characteristic roles, values, defaults, relationships We can consider this a case of constraint satisfaction
Challenges for any Schema Theory Navigating a huge search space –Set of concepts is potentially vast Flexibility; novel situations –Birthday party at restaurant Specified versus Unspecified knowledge –I know there is (or isn’t) an oven in the kitchen, versus not knowing one way or the other Learning!
The PDP View Schemas are not explicitly coded structures Every feature is a variable Search is conducted through parallel constraint satisfaction Novel collections of features naturally arise in novel situations Knowledge of the world: weighted connections between concepts Learning is the tuning of weights based on experience
Implementation Details Use the goodness measure as defined before Activation update (activations kept in [0, 1]): –If the net input to a unit is > 0, then Δa = net(1 − a) –If the net input to a unit is < 0, then Δa = net × a
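A settling loop with this kind of update rule can be sketched as follows. The rule used here is the standard schema-model update (Δa = net(1 − a) for positive net input, net × a for negative, with activations in [0, 1]); the step size, weights, and inputs are illustrative:

```python
import numpy as np

def update(a, w, ext, steps=100, rate=0.1):
    # Schema-model rule: da = net*(1 - a) for positive net input,
    # da = net*a for negative, keeping activations in [0, 1].
    for _ in range(steps):
        net = w @ a + ext
        da = np.where(net > 0, net * (1.0 - a), net * a)
        a = np.clip(a + rate * da, 0.0, 1.0)
    return a

# Two mutually supporting units: external input to the first
# pulls the second unit on as well.
w = np.array([[0.0, 1.0],
              [1.0, 0.0]])
ext = np.array([0.5, 0.0])
print(update(np.full(2, 0.1), w, ext))  # both activations settle near 1
```

Because the step toward 1 shrinks as a unit approaches 1 (and likewise toward 0), activations stay bounded without any extra squashing function.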
Pre-set Connections Note: weights are symmetric This rule cannot capture the idea that a implies b, but b might not imply a as strongly But there are other rules that allow this (more in the next few lectures)
Some Nice Properties Can store a large amount of information with a small number of processing units –Knowledge is in the weights Default assignment –If you don’t know whether the kitchen has an oven or not, you can infer that it does Potential for simple, exposure-based learning
Some Potential Problems What about multiple variables? Murder mystery: –If x is the killer, x has a gun –If y is the victim, then y is dead [Diagram: units for killer, has_gun, is_dead, victim, butler, widow, detective] Proposition: the butler is the killer, the widow is the victim Who has the gun? Who is dead?
Closing... What do you think of this approach? Are there domains that might be problematic for this approach? Would it be useful in your area of interest? –Robot navigation? –Phoneme perception? –Data mining? –Concept acquisition? Cup vs mug vs glass? –Vision? Human or Machine...
For Next Time Read PDP1, Chapter 7, pages 282-290 only
Homework #1 Due Feb 1 –But I’ll take it up to 2 working days late without penalty, just for this assignment! –(Note: the next assignment is handed out Feb 1) Look it over before next class. Come with questions if you have any –Bring 2 blank floppies to class Try out the software as soon as you can, in case there are problems Quick demo of software...