
Operant Conditioning






Presentation Transcript


  1. Operant Conditioning

  2. Complex Learning • Why do we learn new behaviors? • Classical conditioning only deals with reflex responses that we already possess. • Most of our behaviors are voluntary (volitional) and are stimulated by something in our environment.

  3. Operant Conditioning • Defined as - the form of learning concerned with changes in emitted responses as a function of their consequences.

  4. Origins of Operant Conditioning • Edward Thorndike • Instrumental Conditioning • “Law of Effect” • Satisfying outcome • Unsatisfactory outcome

  5. Outcomes of Thorndike’s Work • How long - the length of time it took the cat to escape from the puzzle box. As it declined, learning was taking place. • This change in performance represented a change in behavior resulting from experience: learning.

  6. Question • In Thorndike’s terms, what sort of things give you satisfaction? What things produce dissatisfaction? Why?

  7. Edward Thorndike • His research provided a foundation for the study of “non-reflexive” learning. • He drew a connection between action and its outcomes.

  8. B. F. Skinner • Skinner coined the term “operant”. • Disagreed with the “soft” concepts of Thorndike’s “satisfying” and “unsatisfactory” outcome(s)

  9. B. F. Skinner • Operant Conditioning replaced Thorndike’s term “instrumental learning”. • Emitted behavior is now called “operant responses”. • Classical conditioning is now called “respondent conditioning”. • The Skinner Box or “auto-environmental chamber”.

  10. Skinner Box in Action • Zack Florin '99 using a Skinner box to shape a rat's behavior.

  11. Reinforcement • Primary reinforcers - food, water, shelter: those that satisfy innate biological needs. • Secondary reinforcers (conditioned reinforcers) - something that will provide a primary reinforcer (money, poker chips, etc.).

  12. Primary vs. Secondary • Which of the following are secondary reinforcers: • quarters spilling from a slot machine, • a winner’s blue ribbon, • a piece of candy, • an A on an exam, • frequent-flyer miles.

  13. Reinforcement • Negative Reinforcer - an aversive stimulus whose removal serves to increase the probability of the response in the future. • Positive Reinforcer - a stimulus which when applied increases the probability of the response in the future.

  14. Contingencies of Reinforcement • According to Skinner the relationship between a response and a reinforcer is a contingency. • One type of contingency is “reinforcement”

  15. Desired change in behavior

  16. Shaping • Some learning does not occur in a single event. • A series of successive steps (successive approximations) leads to a learned behavior. • Playing the piano, swimming, etc.
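
The idea of successive approximation in slide 16 can be made concrete with a small simulation. Here is a minimal sketch (not part of the original presentation): the numeric “response level”, the step size, and the reinforcement rule are all hypothetical stand-ins for a real behavior such as a first piano piece.

```python
# A minimal sketch of shaping by successive approximations.
# All numbers here are hypothetical; "level" stands in for how close
# the learner's behavior currently is to the target behavior.
import random

TARGET = 10.0      # the final behavior we want
STEP = 0.5         # how much the criterion is raised after each success

criterion = STEP   # start by reinforcing an easy approximation
level = 0.0        # what the learner currently does reliably

while criterion <= TARGET:
    response = level + random.uniform(0.0, 1.0)  # behavior varies a little
    if response >= criterion:                    # meets the current approximation
        level = response                         # reinforcement strengthens it
        criterion += STEP                        # now demand a closer approximation

print(f"shaped up to level {level:.1f} (target {TARGET})")
```

Only responses that meet the current criterion are reinforced, and the criterion is raised step by step, which is the core of the shaping procedure described above.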

  17. Applying the Principles • When asked, choose the best alternative and explain why. • You want your 2-year-old to ask for water with a word instead of a grunt. Should you give him water when he says “wa-wa” or wait until his pronunciation improves?

  18. Applying the Principles • When asked, choose the best alternative and explain why. • Your roommate keeps interrupting your studying even though you have asked her/him to stop. Should you ignore her/him completely or occasionally respond for the sake of good manners?

  19. Applying the Principles • When asked, choose the best alternative and explain why. • Your father, who rarely writes to you, has finally sent a letter. Should you reply quickly or wait a while so he will know how it feels to be ignored?

  20. Extinction • What happens when the reinforcement stops? • Extinction - in operant conditioning, a drop in responding when reinforcement is discontinued.

  21. Schedules of Reinforcement

  22. Schedules of Reinforcement • Continuous reinforcement - every response is followed by a reinforcer (FR 1 schedule). • Partial reinforcement - a contingency of reinforcement in which not every response gets a reinforcer.

  23. Fixed Interval Schedule • Referred to as FI x - a reinforcement contingency defined by the amount of time that must pass since the previous reinforcer. • Based on time. • Example: paychecks.

  24. Fixed Ratio Schedule • Referred to as FR x - reinforcement contingency defined by the number of responses the organism must make in order to get a reinforcer. • Example: piece work.

  25. Variable Interval Schedule • Referred to as VI x - a reinforcement contingency defined by the average time interval which must elapse since the last reinforcer. • Example: Quality Control

  26. Variable Ratio Schedule • Referred to as VR x - a reinforcement contingency defined in terms of the average number of responses required to receive a reinforcer. • Example: Slot Machine
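
The four partial schedules in slides 23-26 are really just four different delivery rules. The sketch below (not part of the original presentation) assumes a hypothetical rat pressing a lever once per second; the parameter values are made up, and the variable-interval wait is drawn from an exponential distribution so that it varies around the stated mean.

```python
# A minimal sketch of the FR, VR, FI, and VI reinforcement schedules.
import random

def simulate(kind, x, presses=120):
    """Count reinforcers earned over `presses` lever presses, one per second."""
    reinforcers = 0
    since_last = 0     # presses since the last reinforcer (ratio schedules)
    last_time = 0      # time (s) of the last reinforcer (interval schedules)
    wait = None        # current required wait for the VI schedule
    for t in range(1, presses + 1):
        since_last += 1
        if kind == "FR":    # fixed ratio: every x-th response
            earned = since_last >= x
        elif kind == "VR":  # variable ratio: x responses on average
            earned = random.random() < 1 / x
        elif kind == "FI":  # fixed interval: first response after x seconds
            earned = t - last_time >= x
        else:               # "VI": first response after a wait averaging x seconds
            if wait is None:
                wait = random.expovariate(1 / x)
            earned = t - last_time >= wait
        if earned:
            reinforcers += 1
            since_last = 0
            last_time = t
            wait = None
    return reinforcers

for kind, x in [("FR", 5), ("VR", 5), ("FI", 10), ("VI", 10)]:
    print(f"{kind} {x}: {simulate(kind, x)} reinforcers in 120 presses")
```

Running it shows the two ratio rules paying off for the number of responses made, while the two interval rules pay off only for the first response after the required time has passed.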

  27. Non-Contingent Reinforcement • Random “reinforcement” • Development of what Skinner called ‘superstition’ in the pigeon.

  28. Applying Conditioning • We must always keep in mind that all this is done to match the goals of psychology. • Behavior Modification. • Mary Cover Jones - the mother of behavior therapy. • Controls: aversive and positive.

  29. Punishment • Most used and most misunderstood • Occurs after the ‘offense’ has taken place. • Requires “contiguity” • Encourages avoidance behaviors.

  30. Negative Reinforcement

  31. Autonomic Conditioning • Neal Miller and Leo DiCara • ‘proprioceptive feedback’

  32. Some Unanswered Questions: Biological Constraints • Equipotentiality premise - the premise that principles of conditioning will apply to any response and any species. • Ethology - the study of the behavior of animals in their natural environment. • Species-specific behavior - behaviors which are characteristic of all members of a particular species (instincts). • Critical period - a period during development when there are optimal times for learning. • Preparedness - a concept developed by Martin Seligman to describe how physiological structure influences the occurrence of behavior.

  33. Biological Constraints • Unlearnable associations • Species-specific behavior • Bait shyness • Classical and operant conditioning
