Operant conditioning
From Wikipedia the free encyclopedia, by MultiMedia
Operant conditioning, so named by psychologist B. F. Skinner, is the modification of
behavior brought about over time by the consequences of that behavior. Operant
conditioning differs from Pavlovian conditioning in that operant conditioning deals
with voluntary behavior explained by its consequences, while Pavlovian conditioning
deals with involuntary behavior triggered by its antecedents.
Operant conditioning, sometimes called instrumental conditioning or
instrumental learning, was first extensively studied by
Edward L. Thorndike (1874-1949), who observed the behavior of cats trying to
escape from home-made puzzle boxes. When first constrained in the boxes, the
cats took a long time to escape. With experience, ineffective responses occurred
less frequently and successful responses occurred more frequently, enabling the
cats to escape in less time over successive trials. In his
Law of Effect,
Thorndike theorized that successful responses, those producing satisfying
consequences, were "stamped in" by the experience and thus occurred more
frequently. Unsuccessful responses, those producing annoying
consequences, were "stamped out" and subsequently occurred less frequently.
In short, some consequences strengthened behavior and some consequences
weakened behavior.
B.F. Skinner
(1904-1990) built upon Thorndike's ideas to construct a more detailed theory of
operant conditioning based on reinforcement and punishment.
Reinforcement and punishment
Reinforcement and punishment, the core ideas of operant conditioning, are either
positive (adding a stimulus to an organism's environment) or negative (removing a
stimulus from an organism's environment). This creates a total of four basic
consequences, with the addition of no consequence (i.e., nothing happens); a short
sketch after the definitions below lays out the four combinations. It is important
to note that organisms are not reinforced or punished; behavior is reinforced or
punished.
- Reinforcement is a consequence that causes a behavior to occur with
greater frequency.
- Punishment
is a consequence that causes a behavior to occur with less frequency.
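As a rough illustration (a sketch added here, not part of the original article; the
function and argument names are hypothetical), the following Python snippet encodes
the scheme just described: whether a stimulus is added or removed, combined with
whether the behavior then occurs more or less often, names one of the four basic
consequences.

    def classify_consequence(stimulus_added, behavior_increases):
        # Reinforcement makes a behavior more frequent; punishment makes it less
        # frequent. "Positive" means a stimulus is added, "negative" that one is removed.
        if behavior_increases:
            return "positive reinforcement" if stimulus_added else "negative reinforcement"
        return "positive punishment" if stimulus_added else "negative punishment"

    # Food delivered after a lever press, and pressing becomes more frequent:
    print(classify_consequence(stimulus_added=True, behavior_increases=True))
    # Loud noise switched off after a lever press, and pressing becomes more frequent:
    print(classify_consequence(stimulus_added=False, behavior_increases=True))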
According to Skinner's theory of operant conditioning, there are two methods
of decreasing a behavior or response: punishment and extinction.
Four contexts of operant conditioning: Here the terms "positive"
and "negative" are not used in their popular sense, but rather:
"positive" refers to addition, and "negative" refers to subtraction.
What is added or subtracted may serve as either reinforcement or punishment. Hence
positive punishment is sometimes a confusing term, as it denotes the addition of an
aversive stimulus (such as a spanking or an electric shock), a context that may seem
very negative in the lay sense. The four situations are:
- Positive reinforcement occurs when a behavior (response) is followed
  by a pleasant stimulus that rewards it. In the Skinner box experiment,
  positive reinforcement is the rat pressing a lever and receiving a food
  reward.
- Negative reinforcement occurs when a behavior (response) is followed
  by the removal of an unpleasant stimulus. In the Skinner box experiment,
  negative reinforcement is a loud noise continuously sounding inside the
  rat's cage until it presses the lever, at which point the noise ceases.
- Positive punishment occurs when a behavior (response) is followed by an
  aversive stimulus, such as a shock or loud noise.
- Negative punishment, or extinction, removes a pleasant stimulus, such as
  taking away a child's toy. Extinction occurs when a behavior (response)
  that had previously been followed by a pleasant stimulus is followed by no
  stimulus at all. In the Skinner box experiment, this is the rat pushing the
  lever and being rewarded with a food pellet several times, and then pushing
  the lever again and never receiving a food pellet. Eventually the rat would
  learn that no food would come and would cease pushing the lever; the toy
  simulation after this list walks through acquisition followed by extinction.
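The following toy simulation (an added sketch, not taken from the article; the
update rule and the numbers are assumptions chosen only for illustration) mimics the
Skinner box examples above: a simulated rat presses the lever with some probability,
each rewarded press nudges that probability up, and each unrewarded press nudges it
down, so pressing first becomes frequent under positive reinforcement and then fades
under extinction.

    import random

    random.seed(0)
    LEARNING_RATE = 0.05   # hypothetical step size for the probability update

    def run_trials(press_prob, n_trials, food_available):
        # Each trial the rat presses with probability press_prob; the consequence
        # of a press (food or nothing) shifts that probability for later trials.
        for _ in range(n_trials):
            if random.random() < press_prob:                         # the rat presses
                if food_available:
                    press_prob = min(1.0, press_prob + LEARNING_RATE)   # reinforced
                else:
                    press_prob = max(0.0, press_prob - LEARNING_RATE)   # extinction
        return press_prob

    p = run_trials(0.1, 200, food_available=True)    # acquisition under reinforcement
    print("after reinforcement, press probability is about", round(p, 2))
    p = run_trials(p, 200, food_available=False)     # reward withheld: extinction
    print("after extinction, press probability is about", round(p, 2))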
Also:
- A type of learning in which a certain behavior (usually an undesired one) is
  withheld in an attempt to avoid a punishment is termed avoidance learning.
- One of the practical aspects of operant conditioning in relation to animal
  training is the use of shaping (reinforcing successive approximations toward a
  desired behavior), as well as chaining; a rough sketch of shaping follows this
  list.
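As an added sketch of shaping (the numbers and variable names here are assumptions,
not taken from the article), the idea is to reinforce any response that meets the
current criterion and then tighten the criterion step by step toward the target
behavior.

    import random

    random.seed(1)
    target = 1.0        # e.g. a complete lever press
    criterion = 0.1     # start by rewarding even a slight movement toward the lever
    strength = 0.0      # the animal's current typical response

    for trial in range(300):
        response = strength + random.uniform(0.0, 0.2)    # behavior varies from trial to trial
        if response >= criterion:
            strength = min(target, strength + 0.02)        # reinforced responses are repeated
            criterion = min(target, criterion + 0.01)      # require a closer approximation next time

    print("shaped response strength is about", round(strength, 2), "of target", target)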
References
- Skinner, B. F. (1938). The behavior of organisms: An experimental
analysis. Acton, MA: Copley.
- Skinner, B. F. (1953). Science and human behavior. New York:
Macmillan.
- Skinner, B. F. (1957). Verbal behavior. Englewood Cliffs, NJ:
Prentice Hall.
- Thorndike, E. L. (1901). Animal intelligence: An experimental study of
the associative processes in animals. Psychological Review Monograph
Supplement, 2, 1-109.