Animal Behavior for Shelter Veterinarians and Staff. Group of Authors

or potentially any other operant behavior. However, the punishing effects of the tone will eventually wear off if the shock no longer accompanies it. For a conditioned punisher to maintain its suppressive effects, it too must occasionally precede the unconditioned punisher.

      Readers experienced in animal training may wonder why we don’t discuss clickers in this section. For readers unfamiliar with animal training, clickers are hand‐held devices that make a clicking sound when pressed. Animal trainers describe clickers and similar devices (such as whistles) as conditioned reinforcers because they are paired with food. However, this function has been questioned (Dorey and Cox 2018), and more research is needed to support the claim.

      3.4.2 Extinction and Shaping

      Behaviors maintained by consistent and predictable reinforcement are highly sensitive to discontinuing reinforcement (Williams 1994). For example, if someone pressed an elevator button, but it didn’t light up to indicate that an elevator was on its way, what would the person do? Most people would press the button again, maybe a few more times in rapid succession, or hold the button down harder and longer than usual. After a few attempts, most people would eventually just take the stairs. The process by which a response stops occurring when reinforcement no longer follows the behavior is termed extinction.

      Extinction can be both a process and a procedure. Extinction as a procedure entails withholding the reinforcer that previously maintained a response. Extinction as a process involves the decrease and eventual elimination of a response. It is important to note this difference because for extinction as a process to successfully occur, the reinforcer that is maintaining the response must be identified. Sometimes we assume that a behavior is maintained by a certain reinforcer, but relying on assumptions can lead us astray when trying to implement extinction to decrease behavior.

      The discovery of extinction as a behavioral process in operant conditioning was completely serendipitous (Skinner 1956). Skinner was running an experiment in which a rat pressed a lever for food. Unbeknownst to Skinner, the pellet dispenser jammed at some point during the session. Therefore, presses on the lever no longer produced reinforcement and underwent extinction. The rat didn’t immediately stop pressing the lever; instead, there was a gradual reduction in the number of lever presses before the behavior finally stopped. Skinner’s accidental demonstration of extinction highlights an important feature of the process: the behavior under extinction diminishes gradually, not in an all‐or‐none fashion. How quickly behavior decreases during extinction is a function of the schedule of reinforcement that maintained the behavior, the length of time the behavior has been in the animal’s repertoire, and whether conditioned reinforcers are delivered during extinction.
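      The gradual character of extinction that Skinner observed can be sketched with a toy decay model in Python. The starting rate, decay factor, and number of sessions below are illustrative assumptions, not data from the text, and the sketch deliberately omits the brief intensification of responding (as with the elevator button) that can precede the decline:

```python
def extinction_curve(initial_rate=30.0, decay=0.8, sessions=10):
    """Toy illustration of extinction as a process: once reinforcement is
    withheld, the response rate declines gradually across sessions rather
    than dropping to zero in an all-or-none fashion."""
    rates = [initial_rate]  # responses per minute before extinction begins
    for _ in range(sessions):
        # each unreinforced session weakens responding by a constant fraction
        rates.append(rates[-1] * decay)
    return rates

curve = extinction_curve()
```

The point of the sketch is only the shape of the curve: responding weakens a little with each unreinforced session, so the decrease is gradual rather than abrupt.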

      Extinction and reinforcement are used in combination to teach new behaviors through a technique called shaping. In shaping, a behavior is trained by reinforcing responses with forms that are closer and closer to a final desired behavior. In the laboratory, a common scenario is for a rat to press a lever for food. However, when a rat is put in the operant chamber for the very first time, it is highly unlikely that he would press the lever since the lever‐press response and the reinforcer have yet to be associated. Experimenters must first shape the lever‐press response before they can run their experiments. As the rat sniffs around the operant chamber and looks in the direction of the lever, the experimenter delivers a food pellet. As the rat moves progressively closer, each approach is reinforced with a food pellet. The experimenter might then wait for the rat to place his paw on the lever before delivering food. And finally, the rat presses down on the lever, exhibiting the final desired behavior. As successive approximations to a lever press are reinforced, previous responses that had formerly been reinforced are extinguished.
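      The successive-approximation logic of shaping can be sketched as a small simulation. Reducing the behavior to a single number (its current "form"), the criterion step, and the learning rule are all illustrative assumptions for this sketch, not part of the laboratory procedure described above:

```python
import random

def shape(target=5.0, step=0.5, trials=500, seed=0):
    """Toy model of shaping by successive approximation.

    The learner's behavior is reduced to a number (its current 'form');
    the trainer reinforces any response that meets the current criterion,
    then raises the criterion so that earlier, cruder forms go
    unreinforced (i.e., are placed on extinction)."""
    rng = random.Random(seed)
    criterion = step  # first approximation: an easy requirement
    tendency = 0.0    # the form the learner currently tends to emit
    for _ in range(trials):
        response = tendency + rng.uniform(-1.0, 1.0)  # natural variability
        if response >= criterion:
            tendency = response                       # reinforced form is repeated
            criterion = min(response + step, target)  # wait for a closer approximation
        # unreinforced responses are simply not strengthened
    return criterion, tendency

final_criterion, final_tendency = shape()
```

As in the rat example, the key mechanism is that each time the criterion is raised, the previously reinforced approximation stops producing food and is extinguished, while the closer approximation is strengthened.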

      3.4.3 Stimulus Control

      Term          Description                                                                          Example
      Antecedent    A stimulus that precedes a response                                                  Mailperson walks down the street
      Behavior      The organism’s response to the antecedent                                            The dog barks
      Consequence   The stimulus change that follows the behavior (addition or removal of a stimulus)    The mailperson crosses the street and thereby reinforces the dog’s barking behavior

      In technical terms, if the presentation of a stimulus reliably evokes an operant response, the stimulus is called a “discriminative stimulus.” In application, a discriminative stimulus is often called a “cue.” For a stimulus to function reliably as a discriminative stimulus, the same rules for creating strong associations apply: the cue must reliably and consistently signal a certain consequence if a behavior occurs. Naive trainers sometimes attempt to train their pet to sit by repeatedly saying “sit.” After saying “sit” a dozen times, the pet sits and gets a treat. Unfortunately, “sit” never becomes a reliable cue because the pet did not sit most of the times the cue was presented. By contrast, “sit” becomes a reliable cue after a few pairings in which the trainer says “sit” once and the dog sits; the dog is then unlikely to sit when the trainer
