
1 Motivating Operations
Berkshire Association for Behavior Analysis and Therapy, October 13th & 14th, 2005
Jack Michael, Ph.D., Psychology Department, Western Michigan University

2 Timetable
1955 (KU): Repertoire based heavily on SHB (re motivation, Chapters 9, 10, 11) and also the Wm. James lectures, and to some extent K & S (from Wike).
1957 (U of H): SHB for grad courses, K & S for an undergrad learning course (2 semesters). Thoroughly exposed to estab. oper. in Chapters 9 and 10.
1958 or 59: Holland-Skinner disks from the Harvard course. Used them for some instruction.
1960 (ASU): Used the H-S program in teaching Psych (?) until Keller took over 112. Deprivation and aversive stimulation as motivational operations. Gave many talks on behavior modification, rehabilitation, etc. It is not clear that I made much use of the motivation area, but probably it played a role in some highly general talks.
1967 (WMU): Taught VB (Psych 260), statistics, JEAB, Indiv. Org. Res. Methodology (IORM), and eventually Psych 151. For 151 I used SHB, H-S, and various sources. I could not determine by looking at old course materials when I started using the EO concept in the undergrad course, or in the VB course when the mand was covered. By 1980 I am pretty sure it was well developed. Critical was a grad seminar on Skinner where a group of very effective grad students and I discussed the SE (transitive CEO) at great length. I started using EO as a general term, including both deprivation and aversive stim (salt ingestion issue), and most importantly, from my perspective, with respect to learned EOs (sketch of a cat, slotted screw).
Michael, J. (1982). Distinguishing between discriminative and motivational functions of stimuli. JEAB, 37.
Michael, J. (1988). Establishing operations and the mand. TAVB, 6, 3-9.
Michael, J. (1993). Establishing operations. TBA, 16.
Michael, J. (2000). Implications and refinements of the EO concept. JABA, 33.
Laraway, S., Snycerski, S., Michael, J., & Poling, A. (2003). Motivating operations and terms to describe them: Some further refinements. JABA, 36.
Started using the EO concept as a result of seminar interactions, Skinner's recent writings, Verbal Behavior, possibly related to offering the grad course on motivation -- very important was the extensive discussion of the topic as mentioned in the footnote to the 1982 article (give the names). I think I was mostly concerned with accounting for the transitive CEO in these discussions.

3 First a personal historical perspective1
UCLA, 1943, B.A., M.A., Ph.D. in psychology. Main interests: physiological psych, learning theory (Hull,2 Tolman -- not much Skinner), statistics, philosophy of science.
1st academic position, Psych Dept at Kansas Univ. Intro psych for non-majors as a teaching responsibility but not a major one: traditional eclectic intro text; typical course format -- 3 lectures per week, exams every 5 weeks or so.
My approach to lecturing: keep a little ahead of the class in the text, supplement the text with information from other sources, mainly from my own library.
Studied Skinner's Science and Human Behavior to provide lecture material for the upcoming section on learning, and it was perfect for my lectures in the intro course.3 Section II is the comprehensive basic behavior analysis approach that I soon adopted and used all my life.4
1. It is not important that this historical material become a part of your repertoire. However, it is important in that it provides a context which led to the EO concept.
2. At UCLA I was very familiar with Hull's D x sHr = sEr notion of the role of drive -- my dissertation was related to this issue!
3. Non-majors were a good context -- they wanted to learn something other than what would be valuable for studying further psych courses -- SHB's later sections, Sections 3-5, were ideal. (give titles?)
4. The material in Section 2 has been the basis for all of my teaching at U of H, ASU, and especially WMU.

4 Chapters 4 to 12, Section II, The Analysis of Behavior1
Phase 1: Skinner, B. F. (1953). Science and human behavior. New York: Macmillan. Chapters 4 to 12, Section II, The Analysis of Behavior.1
4. Reflexes and conditioned reflexes (respondent relations)
5. Operant behavior (rfmt, extinction, conditioned rfers)
6. Shaping and maintaining operant behavior (differential rfmt, intermittent rfmt -- interval, ratio)
7. Operant discrimination (discriminative stimuli -- SDs, discriminative repertoires, attention)
8. The controlling environment (generalization, discrimination)
9. Deprivation and satiation (needs and drives)2
10. Emotion (as a predisposition,3 responses that vary together in emotion, emotional operations)
11. Aversion, avoidance, anxiety (aversive stimuli, conditioned aversive stimuli, escape, avoidance)
12. Punishment (does it work? how? by-products, alternatives)
1. The essential principles of the science of behavior. However, motivation was not of special interest to me at that time. Points below are in retrospect. {"Respondent beh controlled by eliciting stimuli; operant behavior controlled by consequences" -- Not Quite Right!}
2. Skinner introduces "drive" but does not use it much beyond that chapter, using "deprivation" instead. Sort of defines the term on p. 144,1: "simply a convenient way of referring to the effects of deprivation and satiation and of other operations which alter the probability of behavior in more or less the same way." I could (then and for 10 or more years) easily get by just referring to deprivation and aversive stimulation. [Points relevant to later developments or analysis: evocative relation clearly implied on p. 149; then questions the rfer estab effect (p. 150); behavior strengthened by a cond rfer varies with the deprivation appropriate to the relevant uncond rfer; almost deals with learned motivative relations (enabling events -- taxi example -- but does not examine the status of the evocative variable).]
3. Emotion very much like deprivation.

5 My slight modifications of the SHB arrangement
1. Unlearned environment-behavior relations
2. Respondent relations (conditioning, extinction,1 S-change decrement, discrimination, and others2) [Aside on behavior-altering vs. function-altering relations, considered in detail later]
3. Reinforcement: basic relation and three qualifications (R-SR delay, stimulus conditions, motivating operations3)
4. Extinction4
5. Punishment: basic relation and three qualifications
6. Schedules of reinforcement
7. Motivating operations (2 defining effects, human UMOs, multiple effects, CMOs, others5)
8. Discriminative stimulus control (stimulus change decrement, SD and S∆)
9. Conditioned reinforcement and punishment
10. Escape and avoidance6
1. Strictly defined -- occurrence of the CS without the US.
2. Some of the field of emotion is concerned with respondent elicitation, and the pairing of neutral stimuli with emotional ones (Proust?).
3. No longer an additional topic, separated from operant conditioning, but an essential component -- no MO, no conditioning, and the effect of conditioning depends on MO presence and strength.
4. Again, strictly defined -- occurrence of the response without its reinforcement. Not a synonym for weakening. There are many ways to weaken that are not extinction. How do you extinguish a child's running in the street to fetch a ball?
5. The emotional predisposition. What does it mean to be angry?
6. Much confusion exists about avoidance, in terms of its evocation (by a warning stimulus functioning as what?), in terms of extinguishing avoidance behavior, and the notion that it extinguishes slowly.

6 Phase 1: Science and Human Behavior (cont'd.)
So, motivation is concerned with the operations of deprivation and satiation, and aversive stimulation. Deprivation increases and satiation decreases the probability of behavior that has been reinforced with the relevant reinforcer.1 An increase or decrease in aversive stimulation increases or decreases the probability of behavior that has terminated that type of aversive stimulation.
What about the term drive?2 Skinner used it extensively in Beh. of Organisms (Chapters 9 and 10), and also in the William James Lectures (1947),3 but by SHB (1953) the chapter title was not "Drive" but rather "Deprivation and Satiation." Drive occurred frequently, but it was accompanied by much cautionary language. I avoided the term (without thinking about it) because of its internal implications. Also I was now uncomfortable with Hull's theoretical4 approach (despite my dissertation).
1. Here only the evocative effect is emphasized -- remind of earlier comment.
2. Why is the term drive important? Widely used in psych for motivational variables; it is a single term that could include a variety of different Env-Beh relations. [more detail needed here]
3. Read part of the Wm Jms material on the mand.
4. I do not think I had been affected yet by ATLN (1950), but SHB was negative about internal concepts, and I was getting a lot of mileage out of Skinner's descriptive approach vs. theoretical behavioral approaches.

7 Excerpt from Skinner's 1947 William James Lectures, from page 31 of Chapter 2, "Verbal behavior as a scientific subject matter." His use of the drive concept: "We may begin with the type of vb which involves the fewest variables. In any verbal community we observe that certain responses are characteristically followed by certain consequences. Wait is followed by someone's waiting, Shh by silence, and so on. The case is defined by the fact that the form of the response is related to a particular consequence. There is a simple non-verbal parallel. Out! has the same ultimate effect as turning the knob and pushing against the door. The explanation of both behaviors is the same. They are examples of law-of-effect, or what I should like to call operant conditioning. Each response is acquired and continues to be maintained in strength because it is frequently followed by an appropriate consequence. The verbal response may have a slightly different "feel," but this is due to the special dynamic properties which arise from the mediation of the reinforcing organism. The basic relation is the same. The particular consequence which is used to account for the appearance of behavior of this sort - to use a technical term, the reinforcement for the response - is not the controlling variable. Reinforcement is merely the operation which establishes control. In changing the strength of such a response we manipulate any condition which alters what we call the drive. This is true whether the door is opened with a "twist and push" or with an "out!" We can make either response more likely to appear by increasing the drive to get outside - as by putting an attractive object beyond the door. We can reduce the strength of either by reducing the drive - as by introducing some object which strengthens staying in. Our control over the verbal response Out!, as in the case of any response showing a similar relation to a subsequent reinforcement, is thus reduced to our control of the underlying drive."

8 Phase 2: Keller, F. S. , & Schoenfeld, W. N. (1950)
Phase 2: Keller, F. S., & Schoenfeld, W. N. (1950). Principles of psychology: Chapter 9, Motivation.
First read it in 1955 or 56.1 K & S2 was written in 1950, but I read it after I was very familiar with Skinner's 1953 Science and Human Behavior, and made little use of it at that time.
2nd academic position, University of Houston. Here I used SHB and Verbal Behavior (available in 1957) for graduate seminars and informal discussion sessions.3 Also used K & S for two terms as the text for an undergrad learning course. I don't remember much about my dealing with the motivation chapter; it was certainly assigned and I lectured and examined over it. I think I simply adopted the expression "establishing operation" meaning "establishes something as a reinforcer."4
1. A colleague (Ed Wike) at KU gave me his copy when he realized how interested I had become in Skinner.
2. Always referred to (affectionately) as Keller and Schoenfeld, or just K & S -- the actual title is useless in identifying any specific book.
3. Also used journal articles that were becoming available or that I was becoming familiar with.
4. Establishing operation for a drive, establishes something as a form of reinforcement -- on p. 269,0; much use of drive on pages ; also the first joining of deprivation and aversive stimulation, "Two classes of drives", page , both considered establishing operations. [The notion and phrase that some environmental operations "establish something as a reinforcer" must have become a part of my repertoire. It was introduced as a way of objectifying the concept of drive as a relation between an environmental variable (the establishing operation) and the changes in behavior, and I had never adopted drive as a technical behavioral term -- being quite satisfied at that time with deprivation/satiation and aversive stimulation.]

9 Phase 3: Holland, J. G. , & Skinner, B. F. (1961)
Phase 3: Holland, J. G., & Skinner, B. F. (1961). The analysis of behavior. New York: McGraw-Hill.
While at Houston I obtained the circular disks used at Harvard in the programmed course. Very interesting! The format1 was not convenient for use in a course without the teaching machines, but the content covered the essential2 aspects of Section 2 of SHB, and programmed instruction was a very exciting new behavioral development.
3rd academic position, Arizona State University. Taught undergrad statistics, several grad courses, and by 1962 I was teaching an intro course for majors, using the Holland-Skinner programmed textbook3 (and my own lab manual4). Fred Keller started teaching that intro course using his PSI5 approach in late 64-early 65, using K & S as his text, but I made little or no use of the book once the H-S program came out.
1. Circular disks explained.
2. Elaborate on essential features vs. Skinner's dealing with earlier psychology notions, which I had to deal with and then explain that they were not important any more.
3. Describe and show if possible the programmed text.
4. Comment on my lab manual.
5. Explain PSI -- most probably know.

10 Phase 4: Motivation in decline
Although quite important to Skinner (1938, 1953) and to Keller and Schoenfeld (1950),1 motivation during the 60s and 70s was hardly mentioned as a significant behavioral concept. Deprivation was mentioned but played only a small role. Why?
(1) Knowledge of intermittent rfmt schedules2 showed that behavior is much more sensitive to rfmt frequency and rfmt schedule than to deprivation.
(2) Wants, needs, drives, etc. were usually explanatory fictions referring to inner entities inferred from the behavior they were supposed to explain.
(3) Also, with increasing applied work, generalized conditioned rfmt (praise, points, money) was usually the immediate consequence for much human behavior, which made deprivation and aversive stimulation less necessary.3
But ultimately motivational variables could not be ignored without leaving the system incomplete.
1. A few texts had motivation chapters: Lundin, Millenson. Many did not: Catania, Mazur, Fantino and Logan, Karen, Powers and Osborne, Honig (Handbook).
2. Give example -- possibly show Clark data. Even SHB examples, and the point that much of what seemed to be related to motivation was really rfmt frequency -- e.g., the sex example.
3. Define gen cond rfer carefully -- heading off the usual mistake. ??But mention Wolf et al. work at U of W, and how important deprivation was in a clearly applied area. ??Ullman and Krasner -- possibly list Ayllon and Michael, Wolf and colleagues.

11 4th (and last) academic position: 1967-2003, WMU
Taught undergrad VB course (called "Social Psych"), statistics, JEAB, Indiv. Org. Res. Methodology, and eventually (~1970) an intro course for majors, for which I used SHB, H-S, and various sources. For my teaching I had to have consistent, coherent concepts and principles.1 So deprivation/satiation and aversive stimulation must be dealt with, even though such effects seem less important than reinforcement frequency.
Deprivation increases and satiation decreases the frequency2 of behavior that has been reinforced with the relevant reinforcer. An increase or decrease in aversive stimulation increases or decreases the (current) frequency of behavior that has terminated that type of aversive stimulation.
1. Not just the three-term contingency, shaping, stimulus control, etc., but rather starting with unlearned behavior (like reflexes), respondent relations (conditioning, extinction, and more), operant relations starting with reinforcement (and here deprivation and aversive stim are critical to make something function as rfmt), and so on -- maybe show the outline from C & P.
2. Skinner and also K & S wrote probability, but I much preferred something that was not susceptible to interpretation as an underlying cause of the behavior; frequency refers to an actual dependent variable. Later I must distinguish between current and future frequency -- behavior-altering vs. function-altering relations.

12 Phase 5: Necessity for a general concept
But this was somewhat unsatisfactory. (1) There are a few operations with similar effects which are neither deprivation nor aversive stim (salt ingestion, blood loss, perspiration; temperature too high/low).1 (2) Also, I needed a technical term for this area that would refer to deprivation, aversive stimulation, and any other variables that have similar effects. Motivation is too broad in its ordinary usage (including reinforcement). Drive lends itself too easily to an internal interpretation.
The solution: Sometime during the early or mid 70s it occurred to me that what these operations did was establish something as a reinforcer. Let's call them establishing operations.2a, 2b
An unanticipated result of this usage was to emphasize the reinforcer-establishing (later "value-altering") effect as separate from the evocative (later "behavior-altering") effect much more than with K & S (1950) or SHB (1953).3
1. These could be just ignored as exceptions or special cases.
2a. It is no doubt my exposure to K & S that was responsible for this term being a part of my repertoire, but at the time I did not realize where it came from -- it just seemed like a very reasonable way to characterize these functional relations. Actually I (unknowingly) departed some from the K & S implication of establishing. For them deprivation and aversive stimulation established a drive; for me it established the effectiveness of something as a form of rfmt. After I wrote the 1982 paper (and had been using EO for some time -- a year or more) someone at WVU pointed out that K & S had used this term much earlier (to my embarrassment -- what Skinner referred to as unintentional plagiarism in VB).
2b. I could not determine by looking at old course materials when I started using the EO concept in the undergrad course, or in the VB course when the mand was covered.
3. K & S deal with it very briefly -- p. 264: "food deprivation is itself a prerequisite for using food as reinforcement... a reinforcer is such by virtue of some operation that makes it act so." This implies an all-or-nothing relation, although I suspect if asked they would have made it more a matter of degree. Three years later Skinner specifically questioned the effect of deprivation on the effectiveness of food as a form of reinforcement (based on research related to the old latent learning controversy). However, I simply assumed the quantitative relation from the all-or-none one, which seemed to me to be taken for granted in the field.

13 Learned motivation:1 The sketch of a cat
Skinner's example of how to get someone to mand a pencil was the initial stimulus for the learned EO analysis. The sketch of a cat: In Verbal Behavior (p. 253) Skinner explained how one could use basic verbal relations to get a person to say pencil. To evoke a mand of that form, "we could make sure that no pencil or writing instrument is available, then hand our subject a pad of paper appropriate to pencil sketching, and offer him a handsome reward for a recognizable picture of a cat."2 How should we classify the offer of money for the sketch in its evocation of the response pencil?3 An SD? Why not?
The main purpose of the 1982 paper was to provide an analysis of what I first called an establishing stimulus (SE), and later a transitive conditioned establishing operation (CEO-T).4 Michael, J. (1982). Distinguishing between discriminative5 and motivational functions of stimuli. JEAB, 37.
1. The use of EO as a general term, including deprivation, aversive stimulation, and things like salt ingestion and temperature changes, came first. I am not sure when learned motivation became important to me, but by 1980 the analysis was well developed. Critical for me were discussions (probably around 1978) with a group of very effective grad students: Give the list?
2. He goes on to add echoic, textual, intraverbal and tact stimuli so that it will be "highly probable that our subject will say pencil."
3. Remind them of the taxi example in SHB -- possibly read it. Also the slotted screw.
4. However, I am pretty sure that the main impact of the paper was with respect to the general use of the EO concept rather than for the SE (CEO-T). Even by the 1993 paper relatively little use was being made of the learned MO (EO).
5. The more restricted definition of the discriminative relation based on the requirement of a proper S∆ was a major result of this analysis, and the part that was most difficult for some others to accept.

14 EO terminology was adopted to some extent in the applied field from the 1982 paper.1 So, a new paper: Michael, J. (1993). Establishing operations. TBA, 16. (Somewhat clearer language, more details re pain as an EO, two new learned EOs.2)
McGill, P. (1999). Establishing operations: Implications for the assessment, treatment, and prevention of problem behavior. JABA, 32.
Michael, J. (2000). Implications and refinements of the EO concept. JABA, 33.
Iwata, B. A., Smith, R. G., & Michael, J. (2000). Current research on the influence of EOs on behavior in applied settings. JABA, 33.
Laraway, S., Snycerski, S., Michael, J., & Poling, A. (2003). Motivating operations and terms to describe them: Some further refinements. JABA, 36.
1. The JEAB paper most often cited in JABA.
2. Reviewers still didn't like the learned EOs.

15 Functional analysis added to the need for effective language with respect to variables that affected rfmt effectiveness and behavior frequency.

16 This ends the historical introduction to the topic, which consisted of 14 slides.
Now back to a logical and conceptual rather than a historical development of motivation.

17 Motivating Operations
I. Definition and Characteristics
  A. Basic features
    Brief history: a. Skinner, 1938 & 1953, NQR; b. K & S, 1950, better; c. Michael's extension
    Figure 1. EO defining effects
    Figure 2. MO defining effects
    Figure 3. EO and MO compared
  B. Important details
II. Distinguishing Motivative from Discriminative Relations
III. Unconditioned Motivating Operations
  A. UMOs vs. CMOs
  B. Nine main UMOs for humans
  C. Weakening the effects of UMOs
  D. UMOs for punishment
  E. A complication: Multiple effects
IV. Conditioned Motivating Operations
  A. Surrogate CMO (CMO-S)
  B. Reflexive CMO (CMO-R)
  C. Transitive CMO (CMO-T)
V. General Implications of MOs for Behavior Analysis

18 I. Definition and Characteristics
A. Basic features
  Brief history: a. Skinner, 1938 & 1953; b. K & S, 1950; c. Michael's extension
  Figure 1. EO defining effects named, with food and pain examples
  Figure 2. MO defining effects
  Figure 3. EO and MO effects compared

19 A. Basic features: Brief history
a. Skinner, 1938 & 1953: Motivation concerned with deprivation/satiation and aversive stimulation. Deprivation/satiation alter the probability of behavior that has been reinforced with the relevant reinforcer. Alteration in aversive stimulation alters the probability of behavior that has reduced such aversive stimulation. But unsatisfactory: a general term is needed for both (drive no good). Also salt ingestion, blood loss, etc.
b. Keller & Schoenfeld, 1950: Both deprivation and aversive stimulation are operations that establish a drive. An elaboration of commonsense psychology: they also have to know how to do it (thus knowledge and desire, or knowing and wanting). If either is absent or weak, then the behavior cannot occur. But psychological theory also evolves from this. Food deprivation establishes food as a reinforcer. Aversive stim establishes its reduction as a reinforcer. Establishing operation is good for 2 reasons: (a) it includes both deprivation/satiation & aversive stim, (b) it implies the environment rather than an internal state.

20 A. Basic features: Brief history (cont'd.)
c. Michael, 1982 JEAB: Let us use establishing operation (EO) for any environmental variable (deprivation, aversive stimulation, salt ingestion, becoming too warm or too cold, and also a learned variable) that does these two things:
i. Increases the current reinforcing effectiveness of some stimulus, object, or event.
ii. Increases the current frequency of (evokes) all behavior that has obtained that stimulus, object, or event in the past.
Furthermore, let us give each of these effects a name:

21 Fig. 1 Establishing Operations (EOs): 2 Defining Effects
Rfer Establishing or Abolishing Effect: EOs establish the current reinforcing effectiveness of some stimulus, object, or event. (And establish includes the effect in the opposite direction, abolish.) Food deprivation increases and food ingestion decreases the reinforcing effectiveness of food. An increase in pain causes an increase, and a decrease in pain causes a decrease, in the reinforcing effectiveness of pain reduction.
Evocative or Abative Effect: EOs evoke any behavior that has been reinforced by the same stimulus that is altered in rfing effectiveness by the same EO. (And evoke includes an effect in the opposite direction, abate.) The food EOs alter the current frequency of any behavior that has been reinforced* by food; the pain EOs alter the current frequency of any behavior that has been rfed by pain reduction.
Problems with the terminology! rfer, rfing, rfed are necessary abbreviations for this topic. Evoke/abate = increases/decreases the current frequency of behavior. Current frequency is contrasted with future frequency -- see later, evocative vs. function-altering relations. (*It is easy to say "that are reinforced with food" and I will slip up sometimes. But it should always be in the past tense: that have been rfed with food.)
Problems: (1) EO includes both estab and abolish. (2) Evocative/abative seems secondary.

22 Fig. 2 Motivating Operations (MOs): 2 Defining Effects
Value-Altering Effect: MOs alter the current reinforcing effectiveness of some stimulus, object, or event. This effect is either an Establishing Effect or an Abolishing Effect. Food deprivation increases and food ingestion decreases the reinforcing effectiveness of food. An increase in pain causes an increase, and a decrease in pain causes a decrease, in the reinforcing effectiveness of pain reduction.
Behavior-Altering Effect: MOs alter any behavior that has been reinforced by the same stimulus, object, or event that is altered in value by the same MO. This effect is either an Evocative Effect or an Abative Effect. The food MOs alter the current frequency of any behavior that has been rfed by food; the pain MOs alter the current frequency of any behavior that has been rfed by pain reduction.
(rfer, rfing, rfed are necessary abbreviations for this topic. Evoke/abate = increases/decreases the current frequency of behavior. Current frequency is contrasted with future frequency -- see later, evocative vs. function-altering relations.)

23 Establishing Operations (EOs): 2 Defining Effects
Figure 3: EO and MO comparison.
Establishing Operations (EOs): 2 Defining Effects. Rfer Establishing or Abolishing Effect: EOs establish the current reinforcing effectiveness of some stimulus, object, or event (and establish includes the effect in the opposite direction, abolish). Evocative or Abative Effect: EOs evoke any behavior that has been reinforced by the same stimulus that is altered in rfing effectiveness by the same EO (and evoke includes an effect in the opposite direction, abate).
Motivating Operations (MOs): 2 Defining Effects. Value-Altering Effect (Establishing Effect or Abolishing Effect): MOs alter the current reinforcing effectiveness of some stimulus, object, or event. Behavior-Altering Effect (Evocative Effect or Abative Effect): MOs alter any behavior that has been reinforced by the same stimulus, object, or event that is altered in value by the same MO.
(rfer, rfing, rfed are necessary abbreviations for this topic. Evoke/abate = increases/decreases the current frequency of behavior. Current frequency is contrasted with future frequency -- see later, evocative vs. function-altering relations.)
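The comparison above is essentially a small taxonomy, so a toy sketch can make it concrete. The sketch below is my own illustration and is not part of the original slides; the class and function names are assumptions made for the example. It simply encodes an MO's two defining effects and prints a description in the newer MO terminology.

# A minimal, hypothetical sketch of the MO terminology in Figure 3.
# Nothing here is standard behavior-analytic software; it just encodes
# the value-altering and behavior-altering directions as data.

from dataclasses import dataclass

@dataclass
class MotivatingOperation:
    name: str                # e.g., "food deprivation"
    reinforcer: str          # the stimulus, object, or event whose value is altered
    value_direction: int     # +1 = establishing effect, -1 = abolishing effect
    behavior_direction: int  # +1 = evocative effect, -1 = abative effect

    def describe(self) -> str:
        value = "establishes" if self.value_direction > 0 else "abolishes"
        behavior = "evokes" if self.behavior_direction > 0 else "abates"
        return (f"{self.name}: {value} the reinforcing effectiveness of "
                f"{self.reinforcer}, and {behavior} behavior that has been "
                f"reinforced by {self.reinforcer}.")

if __name__ == "__main__":
    print(MotivatingOperation("food deprivation", "food", +1, +1).describe())
    print(MotivatingOperation("food ingestion", "food", -1, -1).describe())
    print(MotivatingOperation("pain increase", "pain reduction", +1, +1).describe())

Running the three example lines prints the food-deprivation, food-ingestion, and pain-increase cases used on the preceding slides.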

24 Motivating Operations Where are we?
I. Definition and Characteristics
  A. Basic features
  B. Important details (next)
II. Distinguishing Motivative from Discriminative Relations
III. Unconditioned Motivating Operations
  A. UMOs vs. CMOs
  B. Nine main UMOs for humans
  C. Weakening the effects of UMOs
  D. UMOs for punishment
  E. A complication: Multiple effects
IV. Conditioned Motivating Operations
  A. Surrogate CMO (CMO-S)
  B. Reflexive CMO (CMO-R)
  C. Transitive CMO (CMO-T)
V. General Implications of MOs for Behavior Analysis

25 I. Definition and Characteristics
IB. Important details
1. What about MOs and punishment?
2. Direct and Indirect Effects.
3. Not Just Frequency.
4. Common Misunderstandings.
5. Current vs. Future Effects; Evocative/Abative vs. Function-Altering Effects; Antecedent vs. Consequence Effects
6. Generality depends on MO as well as stimulus conditions

26 IB. Important Details
1. What about MOs and punishment? Only recently considered -- most of MO theory and knowledge relates to MOs for rfmt. Some in a later section.
2. Direct and indirect effects1
  a. MO alters response frequency directly.
  b. MO alters evocative strength of relevant SDs.
  c. Also establishing/abolishing effects, and evocative and abative effects re relevant conditioned reinforcers (but not for the same response).
3. Not just frequency: magnitude (more or less forceful R), latency (shorter or longer time from MO or SD to R), relative frequency (response occurrences per response opportunities), & others.
1. Controversial issue.

27 IB. More Details1
4. Misunderstanding #1: Evocative/abative effect is secondary to the value-altering effect. This is the interpretation of altered response frequency as solely the result of contact with the reinforcer of altered effectiveness; that is, the frequency change follows that contact, and behavior is increased or decreased because of the smaller or greater strengthening effect of the reinforcer on subsequent responses. But not true. Evocative/abative effects can be seen in extinction responding -- that is, without contacting the reinforcer. Best thought of as two separate effects.
1. A lot of my personal point of view in this and the next several slides.

28 IB. More Details (cont'd.)
But the two effects do often work together. Reinforcing effectiveness will only be seen in the future, after some behavior has been reinforced, but this can be immediately after the MO alteration. Thus ongoing increased reinforcer effectiveness will combine with an evocative effect. If behavior is occurring too infrequently: Strengthening the MO will result in responses being followed by more effective rfer (rfer estab. effect); and all behavior that has been so rfed will be occurring at a higher frequency (evocative effect). The increase cannot be unambiguously interpreted, but in practice it may make no difference. If behavior is occurring too frequently: Weakening MO will result in a weaker evocative effect, and a weaker reinforcer.

29 4. Misunderstanding #2: The cognitive interpretation
IB. More Details (still cont'd.)
4. Misunderstanding #2: The cognitive interpretation. This is the belief that evocative and abative effects only work because the individual understands (is able to verbally describe) the situation and behaves appropriately as a result of understanding. Not true. Reinforcement automatically adds the reinforced behavior to the repertoire that will be evoked or abated by the relevant MO. The individual does not have to understand anything in the sense of verbal description. (*Consider rats.)
There are 2 harmful effects of this belief. Little effort may be made to alter the behavior of non-verbal persons who seem incapable of such understanding. Teachers are not prepared for disruptive behavior acquired by non-verbal persons who have been so reinforced.

30 IB. More Details (finished at last)
5. Current vs. Future Effects; Evocative/Abative vs. Function-Altering Effects; Antecedents vs. Consequents
Evocative/abative (antecedent) variables with current effects:
  Operant repertoire: (MO + SD) -----> R relations
  Respondent repertoire: US or CS -----> UR or CR
Function-altering variables (consequences) with future effects:
  Operant consequences: R followed by SR, SP, Sr, Sp; and R occurs w/o consequence (extinction)
  Respondent pairing/unpairing: CS paired w/ US; CS occurs w/o US (extinction)
6. Generality depends on MO as well as stim conditions

31 1st. Review: Basic features, important details.
IA. Basic Features.
  Brief history (Skinner, K & S, Michael)
  EO defining effects with examples
  MO defining effects with examples
  EO and MO effects compared
IB. Important details.
  1. What about punishment?
  2. Direct and indirect effects
  3. Not just frequency
  4. Two misunderstandings
  5. Current vs. future; evocative/abative vs. function-altering; antecedents vs. consequents
  6. Generality depends on MO as well as stim conditions
(*Also awkwardness of terms, & past tense is critical)

32 Motivating Operations Where are we?
I. Definition and Characteristics
  A. Basic features
  B. Important details
II. Distinguishing Motivative from Discriminative Relations (next)
III. Unconditioned Motivating Operations
  A. UMOs vs. CMOs
  B. Nine main UMOs for humans
  C. Weakening the effects of UMOs
  D. UMOs for punishment
  E. A complication: Multiple effects
IV. Conditioned Motivating Operations
  A. Surrogate CMO (CMO-S)
  B. Reflexive CMO (CMO-R)
  C. Transitive CMO (CMO-T)
V. General Implications of MOs for Behavior Analysis
This is an outline of the major topics covered in the slides and lecture, with the topics of the next several slides and lectures circled in dark red. Some topics in behavior analysis will be elaborated on beyond their relevance to motivating operations. These will be listed in appropriate places and identified as Diversions or Side Trips. [Put in the number of slides for each topic when I have finished, and also put the diversions into the major and minor outlines.]

33 II. A Critical Distinction: Motivative vs. Discriminative Relations
II. A Critical Distinction: Motivative vs. Discriminative Relations; MO vs. SD
A. The General Contrast
Both MOs and SDs are learned, operant, antecedent, evocative/abative, not function-altering relations. SDs evoke (S∆s abate) because of differential past availability of a reinforcer. MOs evoke or abate because of the differential current effectiveness of a reinforcer. But more is needed on differential availability.

34 B. Differential Availability Refined
An SD (discriminative stimulus) is a type of stimulus that evokes a type of response. But so does the respondent CS (conditioned stimulus). An SD is a type of S that evokes a type of R because that R has been reinforced in the presence of that S. But it will not have strong control unless the R has also occurred without rfmt in the absence of the S (in the S∆ condition).1 An SD evokes its R because the R has been reinforced in the SD and has occurred w/o rfmt in S∆. But now another assumption must be made explicit.
1. Comment on S-change decrement.

35 C. MO in the S∆ condition
An SD evokes its R because it has been reinforced in the SD and has occurred w/o rfmt in S∆. But occurring w/o rfmt in S∆ would be behaviorally irrelevant unless the unavailable reinforcement would have been effective as reinforcement if it had been obtained. This means that the relevant MO for the rfmt in SD must also be in effect during S∆.1
In everyday* language: For development of an SD---R relation, an organism must have wanted something in the SD, R occurred, and it was reinforced; and it must also have wanted it in the S∆, R occurred, and was not reinforced. (*I'll admit that this is not exactly everyday language.)
1. With ordinary lab procedure, S∆ w/o the MO would not have occurred.

36 D. Food example: Could food deprivation (or relevant internal stimuli1) qualify as an SD, and absence of deprivation as an S∆, for a food-reinforced response?
Two SD requirements: (1) R must have been rfed with food in SD, and (2) R must have occurred w/o food rfmt in S∆, with the relevant MO (food deprivation) in effect during S∆.
(1) Food deprivation sort of2 meets the requirement: food available and R rfed w/ food in the presence of deprivation. (2) R may have occurred w/o food rfmt in S∆, but S∆ is specified as the absence of food deprivation (or of related internal stimuli), so the MO is clearly absent. Doesn't qualify! The absence of food deprivation does not qualify as an S∆, but food deprivation clearly qualifies as an MO.
Everyday language: (1) Food may have been wanted in the SD condition, and obtained. (2) But what was wanted in the S∆ condition that was not obtained? Nothing.
1. Hunger pangs, or dry throat in the case of water deprivation, and so on.
2. Whether food has been differentially available when one is food deprived would depend on the situation. Many hungry organisms would say no. An SD is a guarantee of reinforcement -- not necessarily 100%, but clearly better than the S∆. For many organisms food deprivation may be completely uncorrelated with the availability of food reinforcement. Similarly with pain and pain removal.

37 E. Pain example: Could pain qualify as SD, and pain absence as S∆, for an R rfed by pain reduction?
Two SD requirements: (1) R was rfed with pain reduction in SD (painful S present), and (2) R occurred w/o pain-reduction rfmt in S∆ (when the painful S was absent), with the relevant MO (painful S) in effect during S∆.
(1) Pain sort of meets the first requirement.1 Pain reduction may have been available and may have typically followed R in the presence of pain. (2) R may have occurred w/o being followed by pain reduction in S∆ (when pain was not present), but the relevant MO (painful S) was specified as not present. Pain absence clearly fails to qualify as an S∆, so pain is no good as an SD. Pain does not qualify as an SD, but clearly qualifies as an MO.
Everyday language: (1) Pain reduction was wanted in SD and obtained. (2) What was wanted in the S∆ condition?2 Nothing.
1. Pain is a necessary condition for pain removal, but not a sufficient one, as many organisms who are in pain will tell you.
2. Do we want pain so that pain reduction rfmt is available? No.

38 II. Motivative vs. discriminative relations: MO vs. SD
2nd. Review
II. Motivative vs. discriminative relations: MO vs. SD
A. The general contrast.
B. Differential availability refined.
C. Another assumption: MO in S∆.
D. Example: Food deprivation as SD? Why not?
E. Example: Pain as SD? Why not?

39 Motivating Operations Where are we?
I. Definition and characteristics
  A. Basic features
  B. Important details
II. Distinguishing motivative from discriminative relations
III. Unconditioned Motivating Operations (next)
  A. UMOs vs. CMOs
  B. Nine main UMOs for humans
  C. Weakening the effects of UMOs
  D. UMOs for punishment
  E. A complication: Multiple effects
IV. Conditioned Motivating Operations
  A. Surrogate CMO (CMO-S)
  B. Reflexive CMO (CMO-R)
  C. Transitive CMO (CMO-T)
V. General Implications of MOs for Behavior Analysis

40 IIIA. UMOs vs. CMOs UMOs are events, operations, or stimulus conditions with unlearned value-altering effects. Conditioned motivating operations (CMOs) are MOs with learned value-altering effects. The distinction depends solely on the value-altering effect; an MO's behavior-altering (evocative/abative) effect is always learned. UMO: Humans are born with the capacity to be reinforced by food when food deprived (reinforcer-establishing effect), but the behavior that gets food has to be learned. CMO: The capacity to be reinforced by having a key, when we have to open a locked door (reinforcer-establishing effect) depends on our history with doors and keys. And we also have to learn behavior that obtains keys (evocative effect).

41 IIIB. Nine main human UMOs1
Five deprivation and satiation UMOs: food, water, sleep, activity, and oxygen.2
UMOs related to sex.
Two UMOs related to uncomfortable temperatures: being too cold or too warm.
A UMO consisting of an increase in painful stimulation.3
1. It is easiest to become fluent with the MO concepts by working with UMOs, which is one reason I have a fairly large section on them and also the practice exercises.
2. Not really O2 deprivation, but rather CO2 excess.
3. Pain is almost always present.

42 IIIB1. Five Deprivation/Satiation UMOs: food, water, sleep, activity, and oxygen.
Reinforcer-establishing effect: X deprivation increases the effectiveness of X as a reinforcer.
Evocative effect: X deprivation increases the current frequency of all behavior that has been reinforced with X.
Reinforcer-abolishing effect: X consumption decreases the effectiveness of X as a reinforcer.
Abative effect: X consumption decreases the current frequency of all behavior that has been reinforced with X.
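Because all four statements follow the same template, a tiny script can generate them for any deprivation/satiation reinforcer. The sketch below is my own illustration (the function name and exact wording are assumptions, not part of the presentation); it simply fills the template above for a given X.

# A hypothetical helper that instantiates the four UMO effects on this slide
# for a deprivation/satiation reinforcer X (food, water, sleep, activity, oxygen).

def deprivation_satiation_umo_effects(x: str) -> dict:
    """Return the four UMO effect statements for reinforcer x."""
    return {
        "reinforcer-establishing": f"{x} deprivation increases the effectiveness of {x} as a reinforcer.",
        "evocative": f"{x} deprivation increases the current frequency of all behavior that has been reinforced with {x}.",
        "reinforcer-abolishing": f"{x} consumption decreases the effectiveness of {x} as a reinforcer.",
        "abative": f"{x} consumption decreases the current frequency of all behavior that has been reinforced with {x}.",
    }

if __name__ == "__main__":
    for effect, statement in deprivation_satiation_umo_effects("water").items():
        print(f"{effect}: {statement}")

Running it with "water", for example, prints that water deprivation increases the effectiveness of water as a reinforcer, matching the wording of this slide.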

43 IIIB2a. UMOs related to sex
For many mammals, time passage and environmental conditions related to successful reproduction (e.g. ambient light conditions, average daily temperature) produce hormonal changes in the female that as UMOs cause contact with a male to be an effective reinforcer for the female. These changes produce visual changes in some aspect of the female's body and elicit chemical attractants that function as UMOs making contact with a female a rfer for the male and evoking behavior that has produced such contact. These changes may also evoke behaviors by the female (a sexually receptive posture) that function as UMOs for sexual behavior by the male. There is also often a deprivation effect that may also function as a UMO for both genders.

44 IIIB2b. The sex UMO in humans
In the human, learning plays such a strong role in the determination of sexual behavior that the role of unlearned environment-behavior relations has been difficult to determine. The effect of hormonal changes in the female on the female's behavior is unclear; and similarly for the role of chemical attractants in changing the male's behavior. Other things being equal, both male and female seem to be affected by the passage of time since last sexual activity (deprivation) functioning as a UMO with establishing and evocative effects, and sexual orgasm functioning as a UMO with abolishing and abative effects. In addition, tactile stimulation of erogenous regions of the body seems to function as a UMO making further similar stimulation even more effective as rfmt and evoking any behavior that has achieved such further stimulation.

45 IIIB3a. Temperature UMOs, Too Cold
Becoming too cold, reinforcer establishing effect: Increases effectiveness of an increase in temperature as a reinforcer. Evocative effect: Increases the current frequency of all behavior that has increased warmth. Return to normal temperature1, reinforcer abolishing effect: Decreases2 effectiveness of becoming warmer as a reinforcer. Abative effect: Decreases2 current frequency of all behavior that has increased warmth. 1. For our purposes this is the appropriate alternative to becoming too cold, not becoming too hot. 2. It is very important in considering this type of change to be careful with "decrease" and "increase."

46 IIIB3b. Temperature UMOs, Too Warm
Becoming too warm, reinforcer establishing effect: Increases effectiveness of a decrease in temperature as a reinforcer. Evocative effect: Increases the current frequency of all behavior that has decreased warmth. Return to normal temperature1, reinforcer abolishing effect: Decreases effectiveness of becoming cooler as a reinforcer. Abative effect: Decreases current frequency of all behavior that has decreased warmth. 1. This is the proper alternative to becoming too warm, not becoming too cold.

47 IIIB4a. Painful Stimulation UMO
Reinforcer Establishing Effect: An increase in pain increases the current reinforcing effectiveness of pain reduction.1
Evocative Effect: An increase in pain increases the current frequency of all types of behavior that have been reinforced by pain reduction.1
Reinforcer Abolishing Effect: A decrease in pain decreases the current reinforcing effectiveness of pain reduction.
Abative Effect: A decrease in pain decreases the current frequency of all types of behavior that have been reinforced with pain reduction.
The pain MO is an appropriate conceptual model for motivation by any form of worsening.2
1. In my courses and workshops on MOs, a common mistake is to refer to the "current reinforcing effectiveness of pain," or to "behaviors that have been reinforced by pain." In general, pain (a pain increase) does not function as a form of reinforcement, but rather as a form of punishment. There are some rather complex exceptions to this statement, where painful stimulation has become a form of conditioned reinforcement. One such example involves pain being a sign of effective exercise -- no pain no gain, etc. -- either ongoing or in the past. Another involves painful stimulation having been paired with sexual stimulation. However, at this point, we are considering only the effect of pain increase and decrease as UMOs.
2. Discuss "worsening" and refer to the later section on aversive and appetitive stimuli.

48 IIIB4b. More on pain as a UMO
Skinner's emotional predisposition refers to an operant1 aspect of emotion, as a form of MO.2 For anger, the cause is any worsening in the presence of another organism -- pain, interference with rfed behavior, etc. For some organisms, this seems to function as a UMO making signs of damage or discomfort3 by the other organism function as rfmt, and evoking behavior that has been so rfed. Whether such effects are related to UMOs in humans is presently unclear.
The similarity of emotional and motivational functional relations was well developed by Skinner in his 1938 book, The Behavior of Organisms. The concept of an emotional predisposition (and a more extensive analysis of emotion) is in Science and Human Behavior, 1953, pp.
1. As opposed to the well recognized respondent aspects of emotion.
2. Although he did not use EO or MO language, his description and his reference to drive is clearly in this direction.
3. It is hard to see how "signs of damage" could be specific enough to function as unlearned forms of rfmt. More plausible perhaps would be the feeling of pressure on the front teeth or on claws.

49 IIIB. Practice Exercise #1: UMO Effects
Provide each of the following:
1. Evocative effect of sleep deprivation.
2. Reinforcer-abolishing effect of water ingestion.
3. Abative effect of pain decrease. (Be careful.)
4. Reinforcer-establishing effect of becoming too cold.
5. Abative effect of pain increase. (Trick question.)
6. Rfer-abolishing effect of engaging in much activity.
7. Evocative effect of sex deprivation.
8. Rfer-abolishing effect of a return to normal temperature after having been too warm. (What has been rfing?)
9. Evocative effect of pain increase. (Be careful.)
10. Rfer-establishing effect of pain increase. (Be careful.)

50 IIIB. Answers for Exercise #1: UMO Effects
1. Increased current frequency of all behavior that has facilitated going to sleep.
2. Decreased reinforcing effectiveness of water.
3. Decreased current frequency of all behavior that has been rfed by pain decrease (not "by pain").
4. Increased reinforcing effectiveness of temperature increase.
5. Pain increase does not have an abative effect.
6. Decreased reinforcing effectiveness of activity.
7. Increased current frequency of all behavior that has led to sexual stimulation.
8. Decreased reinforcing effectiveness of becoming cooler.
9. Increased current freq of all behavior that has reduced pain.
10. Increased reinforcing effectiveness of pain reduction.

51 Motivating Operations Where are we now?
I. Definition and characteristics
  A. Basic features
  B. Important details
II. Distinguishing motivative from discriminative relations
III. Unconditioned Motivating Operations
  A. UMOs vs. CMOs
  B. Nine main UMOs for humans
  C. Weakening the effects of UMOs (next)
  D. UMOs for punishment
  E. A complication: Multiple effects
IV. Conditioned Motivating Operations
  A. Surrogate CMO (CMO-S)
  B. Reflexive CMO (CMO-R)
  C. Transitive CMO (CMO-T)
V. General Implications of MOs for Behavior Analysis

52 IIIC. Weakening the Effects of UMOs
For practical reasons it may be necessary to weaken some UMO effects.
Permanent weakening of a UMO's unlearned rfer-establishing effect is not possible. A pain increase will always make pain reduction more effective as rfmt.
Temporary weakening by rfer-abolishing and abative variables is possible. Food stealing can be temporarily abated by inducing food ingestion, but when deprivation recurs, the behavior will come back.
Evocative effects depend on a history of rfmt, and can be reversed by an extinction procedure -- let the evoked R occur without rfmt (not possible in practice if control of the rfer is not possible); and abative effects of a punishment history can be reversed by a recovery-from-pmt procedure -- the R occurs without the punishment.

53 IIID. UMOs for Punishment
An environmental variable that (1) alters the punishing effectiveness (up or down) of a stimulus, object, or event, and (2) alters the current frequency (up or down) of all behavior that has been so punished is an MO for punishment; and if the first effect does not depend on a learning history, then it is a UMO.
1. Reinforcer-establishing effects.
UMOs: A pain increase/decrease will always increase/decrease the effectiveness of pain reduction as a rfer. Also true for other uncond. pners (some sounds, odors, tastes, etc.). Painful stim can be a conditioned punisher or even a conditioned reinforcer under some conditions. Can you give examples?
MOs for conditioned punishers: Most punishers for humans are conditioned, not unconditioned, punishers. Two kinds (we really need the CMO concept here; the UMOs below are not really UMOs, as will be considered in detail in the section on CMOs):
a. S paired with an unconditioned pner (SP): the UMO is the UMO for that unconditioned pner.
b. Historical relation to reduced availability of rfers: the UMO is the UMO for those rfers.
(cont'd on next slide)

54 1. Reinforcer-establishing effects (cont'd.)
Examples: Removing food as pmt (or changing to an S related to less food) will only punish if food is a reinforcer, so the MO for food removal as pmt is food deprivation.1
Social disapproval as a punisher (frown, head shake, "bad!") may work because of being paired with an SP like painful stimulation, so the MO would be the MO for the relevant SP. More often social disapproval works because some of the rfers provided by the disapprover have been withheld when disapproval stimuli have occurred. The MOs would be the MOs for those reinforcers.
Time-out as punishment is similar. The MOs are the MOs for reinforcers that have been unavailable during time-out.
Response cost (taking away tokens, money, or reducing the score in a point bank) only works if the things that can be obtained with the tokens, etc. are effective as reinforcers at the time the response cost procedure occurs.
(continued on next slide)
1. An exception is related to a highly verbal human.

55 2. Abative effects of MO for pmt: Quite complex.
An increase in an MO for pmt would abate (decrease the current frequency of) all behavior that had been punished with that type of punisher. To observe this effect, however, the punished behavior must be occurring so that a decrease can be observed. This depends on the current strength of the MO for the reinforcers for the punished behavior. This means that the observation of an MO abative effect for punishment requires the MO evocative effect of the rfmt for the behavior that was punished, otherwise there would be no behavior to punish. Example for time-out punishment: Assume a time-out procedure was used to punish behavior that was disruptive to a therapy situation. Problems: Only if MOs for the rfers available in the situation had been in effect would the time-out have functioned as punishment. Then, only if those MOs were in effect would one expect to see the abative effect of the previous punishment procedure on the disruptive behavior. But only if the MO for the disruptive behavior were in effect would there be any disruptive behavior to be abated. These issues have not been much considered in behavior analysis up to now. But you should be aware of the complications. They will be there.

56 3rd. Review C. Weakening the effects of UMOs.
1. Weakening reinforcer-establishing effects. Permanent (not possible). Temporary (evocative weakening).
2. Weakening evocative effects.
D. MOs for punishment. Definition.
1. Rfer-establishing effects. Pain and other UMOs. MOs for conditioned punishers. Examples (social disapproval, time-out, response cost).
2. Abative effects (considerable complexity).

57 Motivating Operations Where are we now?
I. Definition and characteristics
  A. Basic features
  B. Important details
II. Distinguishing motivative from discriminative relations
III. Unconditioned Motivating Operations
  A. UMOs vs. CMOs
  B. Nine main UMOs for humans
  C. Weakening the effects of UMOs
  D. UMOs for punishment
  E. A complication: Multiple effects (next)
IV. Conditioned Motivating Operations
  A. Surrogate CMO (CMO-S)
  B. Reflexive CMO (CMO-R)
  C. Transitive CMO (CMO-T)
V. General Implications of MOs for Behavior Analysis

58 E. Multiple effects: Many environmental events have more than one behavioral effect.
1. SD and Sr in a simple operant chain
2. MO evocative/abative effects vs. SR/SP function-altering effects
3. Practical implications
4. Terminological note: Aversive stimuli

59 1. SD and Sr in a simple operant chain
A food-deprived pigeon presses a treadle (R1) protruding from the chamber wall, which turns on an auditory tone stimulus. With the tone on, the pigeon pecks a disk on the wall (R2), which delivers 3-sec exposure to a grain hopper where the pigeon can eat the grain.
[Diagram: tone off, rfmt off --> R1 --> tone ON --> R2 --> rfmt ON for 3 sec. R1 = treadle press, R2 = key peck, rfmt = 3" grain available.]
Tone onset is the SD for the key peck, and the Sr for the treadle press.
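To make the two-link chain concrete, here is a toy state-machine sketch of the chamber (my own illustration; the class and method names are assumptions, not standard operant-lab software). The same tone onset is produced as a consequence of R1 and sets the occasion for R2.

# A hypothetical simulation of the two-link chain on this slide.
# Tone onset follows the treadle press (Sr for R1) and is the condition
# under which the key peck produces grain (SD for R2).

class ChainChamber:
    def __init__(self):
        self.tone_on = False
        self.events = []

    def treadle_press(self):          # R1
        if not self.tone_on:
            self.tone_on = True       # tone onset: Sr for R1, SD for R2
            self.events.append("tone ON (Sr for treadle press, SD for key peck)")

    def key_peck(self):               # R2
        if self.tone_on:              # reinforced only in the tone-on (SD) condition
            self.events.append("3-sec grain access (rfmt for key peck)")
            self.tone_on = False      # tone and rfmt off again; the chain can start over
        else:
            self.events.append("key peck with tone off (S-delta): no grain")

if __name__ == "__main__":
    chamber = ChainChamber()
    chamber.key_peck()       # tone off: no grain
    chamber.treadle_press()  # turns the tone on
    chamber.key_peck()       # tone on: produces grain
    for event in chamber.events:
        print(event)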

60 Pigeon Operant Chamber
These pictures are slightly modified versions of the pictures on pages 14 and 15 of Ferster, C. B., & Skinner, B. F. (1957). Schedules of reinforcement. New York: Appleton-Century-Crofts.
[Picture: pigeon operant chamber with the grain hopper down. Labeled parts: pecking key, key lights, aperture light, food aperture, grain hopper, treadle.]
Rfmt = aperture light on, grain hopper up to the bottom of the food aperture. (Pigeon sticks its head into the aperture and pecks at the grain.) After 3 sec, the light goes off and the hopper goes back down (where the grain can't be reached).

61 Pigeon Operant Chamber
[Picture: the same pigeon operant chamber with the grain hopper up (rfmt on): aperture light on, grain hopper raised to the bottom of the food aperture, where the pigeon sticks its head in and pecks at the grain.]

62 Pigeon Operant Chamber
[Figure: the same chamber with the grain hopper back down after the 3-sec rfmt period.] Rfmt = aperture light on, grain hopper up to the bottom of the food aperture. (Pigeon sticks its head into the aperture and pecks at the grain.) After 3 sec, light goes off and hopper goes back down (where grain can't be reached).

63 1. SD and Sr in a simple operant chain
Evocative/abative (antecedent) variables with current effects: Operant repertoire: (MO + SD)----->R relations Respondent repertoire: CS----->CR Function-altering (consequence) variables with future effects: Operant consequences: R followed by SR, SP, Sr, Sp; R occurs without consequence (Respondent pairing/unpairing: CS paired w/ US; CS without US) (Above is from the earlier section IB5, slide 31.)

64 2. MO evocative/abative & SR/SP function-altering effects.
Pain increase has an MO evocative effect (increase in the current frequency of (evokes) all behavior reinforced by pain reduction). Pain increase also functions as SP to cause a decrease in the future frequency of the particular type of behavior that immediately preceded that instance of pain increase. Food ingestion has an MO abative effect (decrease in the current frequency of (abates) all food rfed behavior). Food ingestion also functions as SR to cause an increase in the future frequency of the particular type of behavior that immediately preceded that instance of food ingestion (operant conditioning).

65 2. MO evocative/abative & SR/SP function-altering effects: Direction of the effects
Becoming too cold or too warm is similar to pain increase in producing increases in current frequency and decreases in future frequency. However, the evocative effects of most of the deprivation MOs develop too slowly for those operations to function as effective consequences. Abative effects are all quick acting, and the relevant variables will also function as reinforcers. Note that SD and Sr effects are in the same direction--both are increases (not of the same behavior). MO and related SR/SP effects are typically in opposite directions; thus the MO decreased the current frequency of all food rfed behavior (abated it), while the SR caused an increase in future frequency (but not of the same behaviors).
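As an informal aid (my own tabulation, not from the slides), the dual effects just described can be listed side by side in a small Python table; the entries only restate relations already given above, and the wording is illustrative.

MULTIPLE_EFFECTS = {
    # event: (current-frequency MO effect, future-frequency function-altering effect)
    "pain increase": ("evocative: evokes all behavior reinforced by pain reduction",
                      "punisher (SP): decreases the future frequency of the behavior that just preceded it"),
    "food ingestion": ("abative: abates all food-reinforced behavior",
                       "reinforcer (SR): increases the future frequency of the behavior that just preceded it"),
    "becoming too cold": ("evocative: evokes behavior that has produced warming",
                          "punisher (SP): decreases the future frequency of the behavior that just preceded it"),
    "return to a comfortable temperature after being too warm": (
                          "abative: abates behavior that has produced cooling",
                          "reinforcer (SR): increases the future frequency of the behavior that just preceded it"),
}

for event, (mo_effect, fa_effect) in MULTIPLE_EFFECTS.items():
    print(f"{event}: MO effect = {mo_effect}; function-altering effect = {fa_effect}")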

66 3. Practice Exercise For each of the following, name the effect (evocative, abative, reinforcer, punisher) and describe it using the language of the preceding slide. I will give the first two. The function-altering effect of pain decrease. (Answer: Reinforcer: Increases the future frequency of whatever behavior preceded that instance of pain reduction.) The MO effect of returning to a comfortable temperature after having been too warm. (Answer: Abative: Decrease in all the behavior that has caused a decrease in temperature.) MO effect of becoming too cold. Function-altering effect of becoming too cold. MO effect of water ingestion. Function-altering effect of being able to sleep after sleep deprivation. (More on the next slide.)

67 3. More Practice MO effect of sexual orgasm.
Function-altering effect of suddenly not being able to breathe. Function-altering effect of return to a comfortable temperature after having been too cold. Function altering effect of sexual orgasm. MO effect of pain decrease. MO effect of activity deprivation. Function-altering effect of engaging in activity after activity deprivation. MO effect of becoming too cold. MO effect of oxygen deprivation.

68 4. Practical Implications.
Many behavioral interventions are chosen because of their MO evocative or abative effects, or because of their reinforcement or punishment effects. Any of these operations will also have related effects in the opposite direction. These effects could be counter-productive and should be understood and prepared for. MO weakening = reinforcement: MO weakened to decrease some undesirable behavior (weaker SR for ongoing behavior and weaker evocative effect): food satiation to reduce food stealing, attention satiation to reduce disruptive behavior for which attention is the reinforcer. But some behavior will be reinforced by the satiation operation. Maybe not a problem, but could be. (continued on next slide)

69 4. Practical Implications (cont'd.)
MO strengthening = punishment: MO strengthened to increase some desirable behavior (stronger SR for ongoing behavior plus stronger evocative effect): food deprivation to enhance the effectiveness of food as a reinforcer; attention deprivation, music deprivation, toy deprivation, etc. to increase the effectiveness of those items as reinforcers. But some behavior will be punished by the operation unless the deprivation onset is very slow. And even with a slow build-up of deprivation effects, if they have been systematically related to a stimulus condition, then the presentation of that stimulus condition will function as punishment.

70 4. Practical Implications (cont'd.)
Reinforcement = MO weakening: Food, attention, toys, etc. used as reinforcers to develop new behavior. But providing these reinforcers will weaken the MO, thus ongoing rfers will be less effective and the evocative effect will be weakened. If reinforcers are small the effect may not be counterproductive, as with pigeons on 24-hour food deprivation being reinforced with 3 seconds of exposure to grain. However, it is not clear what "small" means in terms of the kinds of reinforcers mentioned above.

71 4. Practical Implications (cont'd.)
Punishment = MO strengthening: Considering that most punishers used deliberately to weaken human behavior are stimulus conditions that have been related to a lower availability of various kinds of reinforcement, such a punishment operation will be like deprivation. It will result in a stronger Sr for ongoing behavior plus an increase in current frequency (stronger evocation). With a time-out procedure, the reinforcing effect of obtaining a reinforcer will be greater when one is obtained (perhaps by stealing), and the behavior that has gotten such reinforcers will be stronger.

72 5. Terminological Note: Aversive and appetitive stimuli.
Some environmental events have all three of the following effects: MO evocative effects. Punishment function-altering effects. Certain respondent evocative effects: heart rate increase, adrenal secretion, peripheral vasodilation, galvanic skin response, and so on (often called the activation syndrome). (Continued on the next slide.)

73 5. Aversive and appetitive stimuli (cont'd.)
Such events are often referred to as aversive stimuli, where the specific behavioral function [MO, SP or Sp, and US (unconditioned stimulus)] is not specified. This type of omnibus term is of problematic value. In many cases it seems to be little more than a technical translation of mentalisms like "an unpleasant stimulus" or "I don't like it." It can be avoided in favor of the more specific terms (MO, SP or Sp, or US) whenever possible. The term appetitive stimuli has sometimes been used for events with (a) MO abative effects, (b) reinforcing function-altering effects, and (c) respondent evocative effects characteristic of happiness, affection, etc. But like aversive stimulus it is too unspecific, and happily is not much used in behavior analysis.

74 4th. Review E. Multiple effects 1. SD and Sr in a simple operant chain
evocative/abative effects function-altering effects 2. MO evocative/abative effects & SR/SP function-altering effects. 3. Practice exercises 4. Practical implications a. MO weakening = reinforcement b. MO strengthening = punishment c. Reinforcement = MO weakening d. Punishment = MO strengthening 5. Terminological Note: Aversive stimuli

75 Motivating Operations Where are we now?
I. Definition and characteristics A. Basic features B. Important details II. Distinguishing motivative from discriminative relations III. Unconditioned Motivating Operations A. UMOs vs. CMOs B. Nine main UMOs for humans C. Weakening the effects of UMOs D. UMOs for punishment E. A complication: Multiple effects This is an outline of the major topics covered in the slides and lecture, with the topics of the next several slides and lectures circled in dark red. Some topics in behavior analysis will be elaborated on beyond their relevance to motivating operations. These will be listed in appropriate places and identified as Diversions or Side Trips. [Put in the number of slides for each topic when I have finished, and also put the diversions into the major and minor outlines.] IV. Conditioned Motivating Operations (Review of the UMO-CMO distinction) A. Surrogate CMO (CMO-S) B. Reflexive CMO (CMO-R) C. Transitive CMO (CMO-T) next V. General Implications of MOs for Behavior Analysis

76 Unconditioned vs. Conditioned MOs
UMOs are events, operations, or stimulus conditions with unlearned reinforcer-establishing effects. Conditioned motivating operations (CMOs) are MOs with learned reinforcer-establishing effects. The distinction depends solely upon the reinforcer-establishing effect; an MO's evocative/abative effect is always learned. Humans are born with the capacity to be reinforced by food when food deprived (reinforcer-estab. effect), but the behavior that gets food has to be learned. The capacity to be reinforced by having a key when we have to open a locked door (reinforcer-estab. effect) depends on our history with doors and keys. And we also have to learn how to obtain the key (evocative effect). (Same as slide 41.)

77 IV. Conditioned Motivating Operations: Three kinds.
Variables that alter the reinforcing effectiveness (value) of other stimuli, objects, and events, but only as a result of a learning history, can be called Conditioned Motivating Operations (CMOs). There seem to be three kinds of CMOs: Surrogate: CMO-S Reflexive: CMO-R Transitive: CMO-T

78 IVA. Surrogate CMO (CMO-S)
1. Description a. Pairing: The pairing of stimuli develops the respondent CS, and the operant Sr, and Sp, and possibly the SD. Maybe also an MO, by pairing with another MO. Such a CMO will be called a surrogate CMO, a CMO-S. It would have the same reinforcer-establishing effect and the same evocative effect as the MO it had been paired with. Example: A stimulus paired with the UMO of being too cold might 1) increase the effectiveness of warmth as a reinforcer, and 2) evoke behavior that had been so reinforced, more than warranted by the actual temperature. b. Evidence for such a CMO is not strong. Also, it would not have good survival value; still, evolution does not always work perfectly. 1. Rat experiment and food deprivation.

79 IVA1. CMO-S (cont'd.) c. Emotional MOs: With sexual motivation, MOs for aggressive behavior, and the other emotional MOs, the issue has not been addressed in terms specific to the CMO, because its distinction from CS, Sr, and Sp has not been previously emphasized. The surrogate CMO is only just beginning to be considered within applied behavior analysis (see McGill, 1999, p. 396), but its effects could be quite prevalent. d. Practical importance: From a practical perspective, it may be helpful to consider the possibility of this type of CMO when trying to understand the origin of some puzzling or especially irrational behavior. 1. Two kinds of unpairing--a without b, or b without a as often as with a. 2. Weakening the effects of the CMO-S: Any relation developed by pairing can be weakened by the two forms of unpairing1. The stimulus that had been paired with being too cold would weaken if it occurred repeatedly in normal temperature, or if one was too cold as often in the absence as in the presence of the stimulus.

80 Motivating Operations Where are we now?
I. Definition and characteristics A. Basic features B. Important details II. Distinguishing motivative from discriminative relations III. Unconditioned Motivating Operations A. UMOs vs. CMOs B. Nine main UMOs for humans C. Weakening the effects of UMOs D. UMOs for punishment E. A complication: Multiple effects This is an outline of the major topics covered in the slides and lecture, with the topics of the next several slides and lectures circled in dark red. Some topics in behavior analysis will be elaborated on beyond their relevance to motivating operations. These will be listed in appropriate places and identified as Diversions or Side Trips. [Put in the number of slides for each topic when I have finished, and also put the diversions into the major and minor outlines.] IV. Conditioned Motivating Operations (Review of the UMO-CMO distinction) A. Surrogate CMO (CMO-S) B. Reflexive CMO (CMO-R) C. Transitive CMO (CMO-T) next V. General Implications of MOs for Behavior Analysis

81 IVB. Reflexive CMO
1. Description: Escape and avoidance; Escape extinction; Avoidance: CMO-R defined; Avoidance extinction; Avoidance misconceptions. 2. Human Examples: Ordinary social interactions; Academic demand situation.

82 IVB1. CMO-R Description: a. Escape-Avoidance
The warning stimulus in an avoidance procedure. Animal lab escape-avoidance as a box diagram. [Box diagram: tone off, shock off (30") --> tone on (5") --> shock on; R1 terminates the tone, R2 terminates the shock. R1 = lever press, the avoidance rsp. R2 = chain pull, the escape rsp.] Escape: What evokes R2? The shock onset. What is the rfer for R2? Shock termination. How should the evocative relation be named? Is shock onset an SD for R2? No, because the SΔ condition is defective--no MO for rfmt consisting of shock termination when shock is not on. (What is wanted? Nothing.) Shock is a UMO (recall the earlier section on MO vs. SD, and pain as a UMO).
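A minimal simulation sketch of this box diagram may help (my own construction, not from the slides; it assumes the 30-sec value is the tone-off/shock-off period, the 5-sec value is the tone-shock interval, and that the shock stays on until R2; all names are illustrative).

class Avoidance:
    TONE_OFF_S = 30.0    # assumed duration of the tone-off, shock-off period
    TONE_SHOCK_S = 5.0   # assumed interval from tone onset to shock onset

    def __init__(self):
        self.state = "tone_off"    # tone_off -> tone_on -> shock_on
        self.timer = 0.0

    def tick(self, dt=1.0):
        self.timer += dt
        if self.state == "tone_off" and self.timer >= self.TONE_OFF_S:
            self.state, self.timer = "tone_on", 0.0    # warning stimulus (the CMO-R)
        elif self.state == "tone_on" and self.timer >= self.TONE_SHOCK_S:
            self.state, self.timer = "shock_on", 0.0   # the worsening arrives

    def lever_press(self):         # R1: avoidance rsp, reinforced by tone termination
        if self.state == "tone_on":
            self.state, self.timer = "tone_off", 0.0

    def chain_pull(self):          # R2: escape rsp, reinforced by shock termination
        if self.state == "shock_on":
            self.state, self.timer = "tone_off", 0.0

proc = Avoidance()
for _ in range(31):
    proc.tick()                    # 31 sec elapse with no responding
print(proc.state)                  # -> "tone_on": the warning stimulus evokes R1
proc.lever_press()                 # R1 terminates the tone; the shock never occurs
print(proc.state)                  # -> "tone_off"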

83 IVB1. CMO-R b. Escape Extinction
Animal lab escape-avoidance procedure as a diagram. [Box diagram: same procedure as the previous slide, with the R2 (escape) contingency shown dimmed. R1 = lever press, the avoidance rsp. R2 = chain pull, the escape rsp.] How can R2 be reduced or prevented? What would extinction of R2 consist of? Extinction = R occurs w/o SR. Remove the R2 contingency (shown dimmed) from the diagram--not an actual lab procedure. In general, to extinguish escape behavior the rsp must not escape the later worsening. How else to prevent R2? Omit shock, but this is only temporary--evocative, not function-altering.

84 IVB1. CMO-R c. Avoidance & Definition
[Box diagram: the same escape-avoidance procedure. R1 = lever press, the avoidance rsp. R2 = chain pull, the escape rsp.] What evokes R1? Onset of the warning stimulus (tone). What reinforces R1? Avoiding the shock? No, terminating the tone. How should the evocative relation be named? Is tone onset an SD for R1? No, because the SΔ condition is defective--no MO for rfmt consisting of tone termination when tone is not on. (What is wanted? Nothing.) Tone is a CMO-R. (Why not a UMO?) CMO-R: Any stimulus that has systematically preceded the onset of any avoidable worsening.

85 IVB1. CMO-R d. Avoidance Extinction
[Box diagram: the same procedure, with the R1 (avoidance) contingency shown dimmed out. R1 = lever press, the avoidance rsp. R2 = chain pull, the escape rsp.] How can R1 be weakened or prevented? Evocative weakening: Leave tone off. But only temporary. When tone next comes on, R will occur. Function-altering weakening: True extinction of R1: Remove the R1 contingency (dimmed out). R1 occurs, but the tone stays on and the shock occurs when it would have if R1 had not occurred. Result: Reduction in R frequency will take place at the usual rate for extinction.

86 IVB1. CMO-R e. Avoidance Misconceptions
[Box diagram: tone off, shock off --> tone on; R1 = lever press, the avoidance rsp.; label: omitting shock.] Widespread misconceptions about avoidance: 1st misconception: Rfmt for R1 is not getting the shock. Wrong: Too long-range and a nonevent. Rfmt for R1 is termination of the warning S. 2nd misconception: To extinguish R1, omit the shock when R1 fails to occur and return to the beginning. Wrong: Leave the warning S on when R1 occurs and give the shock when it is due. This error is based on the previous one. Omitting the shock will lead to a reduction of R1 frequency, but should not be called extinction. This procedure is shown on the next slide.

87 IVB1. CMO-R e. Misconceptions (cont'd.)
[Box diagram: the shock-omission procedure: tone off, shock off (30") --> tone on, shock off (5"); R1 = lever press, the avoidance rsp.] Widespread misconceptions about avoidance: 3rd misconception: Avoidance behavior extinguishes very slowly. Based on the erroneous definition of extinction; also on the notion that the organism has to find out that the shock is gone and the avoidance R prevents this discovery; also on research results where the shock was omitted and the behavior decreased very slowly. Why does this procedure reduce responding, and why so slowly? Tone-off is better than tone-on, but only because shock is closer once the tone comes on. When the shock is omitted, tone-on loses its aversiveness, so (tone-on ---> tone-off) loses its reinforcing value--but only very gradually.
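To keep the procedures straight, here is a small sketch (again my own, for illustration only) contrasting the consequences of R1 under the baseline procedure, true extinction, and the shock-omission procedure just discussed.

def consequence_of_R1(procedure):
    # Baseline avoidance: R1 terminates the tone, and the shock is thereby avoided.
    if procedure == "baseline":
        return "tone terminates; shock avoided"
    # True extinction: R1 no longer terminates the tone, and the shock occurs when due.
    if procedure == "true_extinction":
        return "tone stays on; shock occurs when it would have anyway"
    # Shock omission: R1 still terminates the tone, but the shock would not have occurred anyway.
    if procedure == "shock_omission":
        return "tone terminates; shock would not have occurred in any case"
    raise ValueError(procedure)

for p in ("baseline", "true_extinction", "shock_omission"):
    print(p, "->", consequence_of_R1(p))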

88 IVB2. CMO-R Human examples a. Everyday social interactions.
The CMO-R is important in identifying a negative aspect of many everyday interactions that might seem free of deliberate aversiveness. The interactions are usually interpreted as a sequence of SD--->R interactions, with each one being an opportunity for one person to provide some form of rfmt to the other person. But there is a slightly darker side to everyday life. i. Response to a request for information: You are on campus and a stranger asks you where the library is. The appropriate R is to give the information or to say that you don't know. What evokes your answer? The request. What reinforces your response? The person asking will smile and thank you. Also, you will be rfed by the knowledge that you have helped another person.

89 IVB2. CMO-R Human examples a. Everyday social interactions (cont'd.)
So the request is an SD. But it also begins a brief period that can be considered a warning stimulus, and if a rsp is not made soon, some form of mild social worsening will occur. The asker may repeat the question, more clearly or loudly, and will think you are strange if you do not respond. You, yourself, would consider your behavior socially inappropriate if you did not respond quickly. Even with no clear threat implied for non-responding, our social history implies some form of worsening for continued inappropriate behavior. So, the request plus the brief following period is in part a CMO-R in evoking the response. It is best considered a mixture of positive and negative parts. But when the question is an inconvenience (e.g., when you are in a rush to get somewhere) the CMO-R is probably the main component.

90 IVB2. Human examples a. Everyday social interactions (cont'd.)
ii."Thanks" When a person does something for another that is a kindness of some sort, it is customary for the recipient of the kindness to thank the person performing the kindness, who then typically says "You're welcome." What evokes the thanking rsp, and what is its rfmt? Clearly it is evoked by the person's performing the kindness. And the "You're welcome" acknowledgment is the obvious rfmt. So the kindness is an SD in the presence of which a "Thanks" response can receive a "You're welcome." But what if the recipient fails to thank the donor? The performance of the kindness is also a CMO-R that begins a period that functions like a warning stimulus. Failure to thank is inappropriate.

91 IVB2. CMO-R Human examples b. Academic Demand.
In applied behavior analysis the CMO-R may be an unrecognized component of procedures used for training individuals with defective social repertoires. Learners are typically asked questions or given verbal instructions, and appropriate responses are rfed in some way (an edible, praise, a toy, etc.). Should the questions and instructions be considered primarily SDs evoking behavior because of the availability of the rfers? I think not. What happens if an appropriate response does not occur fairly quickly? Usually a more intense social interaction ensues. The question usually has relatively strong CMO-R characteristics. Although it may not be possible to completely eliminate this negative component, it is important to recognize its existence and to understand its nature and origin.

92 IVB3. CMO-R Weakening the CMO-R
Evocative and temporary weakening will occur if the warning stimulus does not occur. Function-altering weakening will result from extinction (R does not terminate the warning stimulus) and from unpairing (ultimate worsening does not occur even if the warning stimulus is not terminated, or occurs even when the warning stimulus is terminated). The analysis of weakening the CMO-R involved in everyday social interactions becomes more complex than seems appropriate for this type of presentation. It is possibly useful to suggest that the larger the CMO-R vs. the SD component, the "meaner" the culture. Early phases of an academic demand situation may evoke tantrums, self-injury, aggressive behavior, etc. This behavior may have been rfed by terminating the early phases and not progressing to the more demanding phases.

93 IVB3. CMO-R Weakening the CMO-R (cont'd.)
The effects of the CMO-R in evoking the problem behavior can be weakened by extinction or by unpairing. If later phases must occur because of the importance of the repertoire being taught, and assuming they cannot be made less aversive, then extinction of problem behavior is the only practical solution. (Unpairing will lead to no training.) But the demand can often be made less aversive. Better instruction will result in less failure and more frequent positive rfmt. The CMO-R will weaken as the final components become less demanding. The negativity of the training situation would not be expected to vanish completely unless the rfers in the non-training situation did not compete with what was available in the training situation. However, as the SD component related to the positive reinforcers in the situation becomes more important as compared with the CMO-R component, problem behavior should be less frequent and less intense.

94 5th. Review Review of the UMO - CMO difference. Surrogate CMO
Description in terms of pairing Example Reflexive CMO Avoidance and escape Animal lab procedure Evocation and rfmt of the escape and the avoidance Rs Extinction of escape and of avoidance Rs Misconceptions (rfmt, true extinction vs. unpairing) Human examples Everyday social interactions Academic demand Weakening the effects of the CMO-R (everyday social interactions, academic demand)

95 Motivating Operations Where are we now?
I. Definition and characteristics A. Basic features B. Important details II. Distinguishing motivative from discriminative relations III. Unconditioned Motivating Operations A. UMOs vs. CMOs B. Nine main UMOs for humans C. Weakening the effects of UMOs D. UMOs for punishment E. A complication: Multiple effects This is an outline of the major topics covered in the slides and lecture, with the topics of the next several slides and lectures circled in dark red. Some topics in behavior analysis will be elaborated on beyond their relevance to motivating operations. These will be listed in appropriate places and identified as Diversions or Side Trips. [Put in the number of slides for each topic when I have finished, and also put the diversions into the major and minor outlines.] IV. Conditioned Motivating Operations (Review of the UMO-CMO distinction) A. Surrogate CMO (CMO-S) B. Reflexive CMO (CMO-R) C. Transitive CMO (CMO-T) next V. General Implications of MOs for Behavior Analysis

96 IVC. Transitive CMO, CMO-T
Definition and animal examples. Human CMO-T Examples Weakening the Effects of the CMO-T Importance for Language Training Practical implications of CMO-T in general

97 CMO-T: 1. Definition & animal example
CMO-T: An environmental variable that is related to the relation between another stimulus and some form of rfmt; it establishes the reinforcing effectiveness of that other stimulus and evokes all behavior that has produced that stimulus. Examples: UMOs function as CMO-Ts for stimuli that are Srs because of their relation to the relevant SR. [Box diagram: tone off, rfmt off --R1--> tone ON --R2--> rfmt ON (3 sec). R1 = treadle press, R2 = key peck, rfmt = 3" grain available.] Food deprivation is a CMO-T for the rfer effectiveness of the tone, and evokes all Rs that have produced the tone (in this case, R1).

98 CMO-T: 1. Another animal example
[Box diagram: tone off, rfmt off --R1--> tone ON --R2--> rfmt ON (3 sec). R1 = treadle press, R2 = key peck, rfmt = 3" grain available.] Onset of tone makes sight of the key effective as rfmt and evokes observing behavior--visual search behavior. Why is tone onset not an SD for looking for the key? What is the rfmt for looking for key? Seeing key. Is the tone onset related to the availability of this rfmt? Can the key be more easily seen when tone is on than when tone is off? No. Tone makes seeing key more valuable, not more available. As a supposed SD for looking for key, tone is defective in two ways. (1) An SD is a stimulus in the absence of which the relevant rfer is not available, but the key can be successfully looked for when tone is off. (2) When tone is off, there is no MO making sight of key valuable.

99 CMO-T: 1. Avoidance and All 3 CMOs
[Box diagram: the escape-avoidance procedure again: tone off, shock off (30") --> tone on (5") --> shock on. R1 = lever press, the avoidance rsp. R2 = chain pull, the escape rsp.] Tone onset is a CMO-S in evoking the chain pull. Tone onset is a CMO-R in evoking the lever press. Tone onset is a CMO-T in evoking looking for the lever. CMO-T: 2. A complication: SDs may also be involved. Although tone onset (in the earlier chained food procedure) is not an SD but rather a CMO-T for looking for the key, it is an SD for pecking the key. What is the rfmt for pecking the key? Food. Is food rfmt more available when the tone is on than when it is not on? Yes.

100 3. Human CMO-T: a. Flashlight example
The rfing effectiveness of many human Srs is dependent on other stimulus conditions because of a learning history. Thus conditioned rfing effectiveness depends on a context. When the context is not appropriate the S may be available, but not accessed because it is not effective rfmt in that context. A change to an appropriate context will evoke behavior that has been followed by that S. The occurrence of the behavior is not related to the availability of the S, but to its value. Flashlights are available in most home settings, but are not accessed until the existing lighting becomes inadequate, as with a power failure. Sudden darkness, as a CMO-T, evokes getting a flashlight. The motivative nature of this relation is not widely appreciated. The sudden darkness is usually interpreted as an SD for looking for a flashlight. But are flashlights more available in the dark? No. They are more valuable.

101 3. Human CMO-T: b. Slotted screw example
Consider a workman disassembling a piece of equipment, with an assistant providing tools as they are requested. The workman sees a slotted screw and requests a screwdriver. The sight of the screw evoked the request, the rfmt for which is receiving the tool. Prior to the CMO-T analysis the sight of the screw would have been considered an SD for requesting the tool. But the sight of such screws has not been differentially related to the availability of screwdrivers. Workmen's assistants have typically provided requested tools irrespective of the stimulus conditions that evoked the request. The sight of the screw does not make screwdrivers more available, but rather more valuable--a CMO-T, not an SD. SDs are involved: The screw is an SD for unscrewing motions, and the request is also dependent upon the presence of the assistant as an SD. But the sight of the screw is a CMO-T for the request.

102 3. Human CMO-T: The danger stimulus
A security guard hears a suspicious sound. He activates his mobile phone, which signals another guard, who calls back and asks if help is needed (the Sr for the first guard's response). Is the suspicious sound an SD for contacting the other guard? Only if the rfmt for the rsp is more available in the presence than in the absence of the suspicious sound, which it is not. The sound makes the rsp by the other guard more valuable, not more available, so it is a CMO-T for activating the phone. The CMO-T is not an SD because the absence of the stimulus does not qualify as an SΔ. The relevant rfmt is just as available in the supposed SΔ as in the SD, and there is no MO for the rfmt in the SΔ condition--nothing is wanted. The other guard's phone ringing is an SD for his activating his phone and saying "Hello"; getting some rsp from the person phoning has not been available from non-sounding phones. (A danger signal is not a CMO-R, because it is rfed by producing another S, not by its own removal.)

103 CMO-T: 4. Weakening the CMO-T
Abative weakening: The CMO-T can be temporarily weakened by weakening the MO related to the ultimate outcome of the sequence of behaviors. If the workman is told that the equipment does not have to be disassembled for this job, the behavior evoked by the sight of the slotted screw will be weak. Of course, the next time a screw has to be removed the request will be as strong as before. Function-altering weakening by extinction: Something changes so that requests are no longer honored, e.g., assistants now believe that workmen should get their own tools. By one type of unpairing: screwdrivers no longer work, say because construction practices changed so that screws are welded as soon as they are inserted. By another type of unpairing: slotted screws can be unscrewed just as easily by hand as with the screwdriver.

104 IVC. 5. The CMO-T and language training
It is increasingly recognized that mand training is an important part of language programs for individuals with nonfunctional verbal repertoires. With such individuals, manding seldom arises spontaneously from tact and receptive language training. The learner has to want something, make an appropriate request, and receive what was requested, and thus the rsp comes under control by the MO and becomes a part of the individual's verbal repertoire as a mand. The occurrence of UMOs can be taken advantage of to teach mands, but there are two problems. Manipulating UMOs will usually raise ethical problems. Much of the human mand repertoire is for conditioned rather than unconditioned reinforcers. The CMO-T can be a way to make a learner want anything that can be a means to another end.

105 5. The CMO-T and language training (cont'd.)
Any stimulus, object, or event can be the basis for a mand simply by arranging an environment in which that stimulus can function as an Sr. Thus if a pencil mark on a piece of paper is required for an opportunity to play with a favored toy, mands for a pencil and a piece of paper can be taught. This approach is somewhat similar to Hart and Risley's (1975) procedure called incidental teaching. It is also an essential aspect of the verbal behavior approach to much current work in the area of autism, for example by Sundberg, M. L., & Partington, J. W. (1998). Teaching language to children with autism or other developmental disabilities. Pleasant Hill, CA: Behavior Analysts, Inc.

106 6. Practical implications of the CMO-T vs. SD analysis.
A CMO-T evokes behavior because of its relation to the value of a consequence; an SD evokes behavior because of its relation to the availability of a consequence. This distinction must be relevant in subtle ways to the effective understanding and manipulation of behavioral variables for a variety of practical purposes. Attempting to develop new behavior or to eliminate old behavior by manipulating value when availability is relevant, or availability when value is relevant, will be inadequate or at least less effective than the more precise manipulation. The distinction is an example of a terminological refinement, not an empirical issue. Its value will be seen in the improved theoretical and practical effectiveness of those whose verbal behavior has been affected.

107 6th. Review C. Transitive CMO Definition and animal examples
A complication: SDs are also involved Human CMO-T examples Flashlight Slotted screw Danger stimulus Weakening the effects of the CMO-T Importance for language training Practical implications of the CMO-T vs. SD analysis.

108 V. General Implications for Applied Behavior Analysis.
Behavior analysis makes extensive use of the three-term contingency relation involving stimulus, response, and consequence. However, the reinforcing or punishing effectiveness of the consequence in developing control by the stimulus depends on an MO. And the future effectiveness of the stimulus in evoking the response depends on the presence of the same MO in that future condition. In other words, the three-term relation cannot be fully understood, nor most effectively used for practical purposes, without a thorough understanding of motivating operations. In principle it should be referred to as a four-term contingency.

109 The slide show has ended.
