
1 Schedules of reinforcement

2 Schedules of Reinforcement Continuous reinforcement refers to reinforcement being administered to each instance of a response. Partial reinforcement lies between continuous reinforcement and extinction.

3 An Example of Continuous Reinforcement Each instance of a smile is reinforced.

4 Schedules of Reinforcement Continuous Reinforcement – A schedule of reinforcement in which every correct response is reinforced. Partial Reinforcement – One of several reinforcement schedules in which not every correct response is reinforced. Which method do you think is used more in real life?

5 Schedules of Reinforcement Ratio version – based on the number of instances of the behavior. Ex. – Reinforce the behavior after it has been demonstrated a set number of times. Interval version – based on the passage of time. Ex. – Reinforce the participant after a set period of time during which the behavior is displayed.

6 When deciding whether an example is interval or ratio, always ask: Does it deal with time? If yes = interval. If no = ratio.

7 Fixed-Interval Schedule Fixed-interval schedule – A schedule in which a fixed amount of time must elapse between one reinforcement and the next. Ex. Johnny gets five M&M's for every 5 minutes he sits in his seat quietly. (Think preschool.)

8 A fixed-interval schedule is one in which reinforcement is received after a fixed amount of time has passed. Ex. You get allowance every other Friday.

9 A variable-interval schedule is one in which reinforcement occurs after varying amounts of time. Ex. Fishing, where you catch a fish after varying amounts of time.

10 Variable-Ratio Schedule Variable-ratio schedule – A schedule in which reinforcement is provided after a variable number of correct responses. Produces an overall high, consistent rate of responding. Ex. – On average, the rat presses the bar 5 times for one pellet of food.

11 A variable-ratio schedule is one in which an unpredictable number of responses is required before reinforcement can be obtained. Ex. Slot machines.

12 In a fixed-ratio schedule, a specific number of correct responses is required before reinforcement can be obtained. Ex. Buy 10 haircuts, get 1 free.
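The four schedules defined above are, in effect, decision rules for when a response earns a reinforcer. Here is a minimal sketch of those rules in Python; the function names, parameters, and the choice of random distributions are illustrative assumptions, not taken from the slides.

```python
import random

# Each factory returns a respond() function that is called once per
# response and returns True when that response is reinforced.

def fixed_ratio(n):
    """FR-n: reinforce every n-th response (e.g. buy 10 haircuts, get 1 free)."""
    state = {"count": 0}
    def respond():
        state["count"] += 1
        if state["count"] == n:
            state["count"] = 0
            return True
        return False
    return respond

def variable_ratio(mean_n):
    """VR: reinforce after an unpredictable number of responses that
    averages mean_n (e.g. a slot machine)."""
    state = {"count": 0, "target": random.randint(1, 2 * mean_n - 1)}
    def respond():
        state["count"] += 1
        if state["count"] >= state["target"]:
            state["count"] = 0
            state["target"] = random.randint(1, 2 * mean_n - 1)
            return True
        return False
    return respond

def fixed_interval(period, clock):
    """FI: reinforce the first response after `period` time units have
    elapsed (e.g. M&M's every 5 minutes). `clock` returns the current time."""
    state = {"last": clock()}
    def respond():
        if clock() - state["last"] >= period:
            state["last"] = clock()
            return True
        return False
    return respond

def variable_interval(mean_period, clock):
    """VI: like FI, but the required wait varies around mean_period
    (e.g. waiting for a fish to bite)."""
    state = {"last": clock(), "wait": random.uniform(0, 2 * mean_period)}
    def respond():
        if clock() - state["last"] >= state["wait"]:
            state["last"] = clock()
            state["wait"] = random.uniform(0, 2 * mean_period)
            return True
        return False
    return respond
```

For instance, `fixed_ratio(10)` returns False on nine calls and True on the tenth, while `fixed_interval(300, time.time)` would reinforce the first call made after five minutes of real time. Note how the two ratio rules count responses while the two interval rules consult a clock, matching the time-vs.-count test on slide 6.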


14 Comparisons of Schedules of Reinforcement

Schedule          | Form of reward                              | Influence on performance                          | Effects on behavior
Fixed interval    | Reward on fixed time basis                  | Leads to average and irregular performance        | Fast extinction of behavior
Fixed ratio       | Reward tied to specific number of responses | Leads quickly to very high and stable performance | Moderately fast extinction of behavior
Variable interval | Reward given after varying periods of time  | Leads to moderately high and stable performance   | Slow extinction of behavior
Variable ratio    | Reward given for some behaviors             | Leads to very high performance                    | Very slow extinction of behavior

15 FI, VI, FR, or VR?
1. When I bake cookies, I can only put one set in at a time, so after 10 minutes my first set of cookies is done. After another ten minutes, my second set of cookies is done. I get to eat a cookie after each set is done baking.
2. After every 10 math problems that I complete, I allow myself a 5-minute break.
3. I look over my notes every night in math because I never know when my teacher will give a pop quiz.
4. When hunting season comes around, sometimes I'll spend all day sitting in the woods waiting to get a shot at a big buck. It's worth it, though, when I get a nice 10-point buck.
5. Today in Psychology class we were talking about Schedules of Reinforcement and everyone was eagerly raising their hands and participating. Miranda raised her hand a couple of times and was eventually called on.
Answers: 1. FI 2. FR 3. VI 4. VI 5. VR

16 FI, VI, FR, or VR?
6. Madison spanks her son if she has to ask him three times to clean up his room.
7. Emily has a spelling test every Friday. She usually does well and gets a star sticker.
8. Steve's a big gambling man. He plays the slot machines all day hoping for a big win.
9. Snakes get hungry at certain times of the day. They might watch any number of prey go by before they decide to strike.
10. Mr. Bertani receives a salary paycheck every 2 weeks. (Miss Suter doesn't.)
11. Christina works at a tanning salon. For every 2 bottles of lotion she sells, she gets 1 dollar in commission.
12. Mike is trying to study for his upcoming Psychology quiz. He reads five pages, then takes a break. He resumes reading and takes another break after he has completed 5 more pages.
Answers: 6. FR 7. FI 8. VR 9. VI 10. FI 11. FR 12. FR

17 FI, VI, FR, or VR?
13. Megan is fundraising to try to raise money so she can go on the annual band trip. She goes door to door in her neighborhood trying to sell popcorn tins. She eventually sells some.
14. Kylie is a businesswoman who works in the big city. Her boss is busy, so he only checks her work periodically.
15. Mark is a lawyer who owns his own practice. His customers make payments at irregular times.
16. Jessica is a dental assistant and gets a raise every year at the same time and never in between.
17. Andrew works at a GM factory and is in charge of attaching 3 parts. After he gets his parts attached, he gets some free time before the next car moves down the line.
18. Brittany is a telemarketer trying to sell life insurance. After so many calls, someone will eventually buy.
Answers: 13. VR 14. VI 15. VI 16. FI 17. FR 18. VR

