Tuesday, November 23, 2010

Phase Two Part I

Learning-a relatively permanent change in an organism’s behavior due to experience.(Molly)
Associative learning-learning that certain events occur together. The events may be two stimuli (classical conditioning) or a response and its consequences (operant conditioning). (Molly)
Classical conditioning-a type of learning in which an organism comes to associate stimuli. A neutral stimulus that signals an unconditioned stimulus begins to produce a response that anticipates and prepares for the unconditioned stimulus. Also called Pavlovian conditioning. (Molly)
Operant Conditioning-a type of learning in which behavior is strengthened if followed by a reinforcer or diminished if followed by a punisher. (Molly)
Behaviorism-the view that psychology (1) should be an objective science that (2) studies behavior without reference to mental processes. Most psychologists agree with 1, not 2. (Molly)
Observational learning- learning by observing others. (Molly)
Ivan Pavlov’s Experiment-Pavlov began doing experiments on learning when he noticed that food invariably caused his dogs to salivate. At first Pavlov tried to imagine what the dog was thinking and feeling. Then he paired neutral stimuli with food to see how the dog would react the next time he presented the same neutral stimuli. He ruled out other possible stimuli by isolating the dog in a room. Pavlov hoped to see whether the dog would associate the neutral stimuli with food and therefore salivate. (Molly)
Findings of Pavlov’s experiment-Pavlov found that the dog did come to associate neutral stimuli (like a voice, a buzzer, a touch, a light, or a smell) with food and salivation. Since salivation in response to food in the mouth was unlearned, it was called the unconditioned response, and the food was called the unconditioned stimulus. After conditioning, the dog salivated to the neutral stimulus alone, which means it had learned: the learned salivation is called the conditioned response, and the previously neutral stimulus is called the conditioned stimulus. (Molly)
Acquisition-the initial stage in classical conditioning; the phase associating a neutral stimulus with an unconditioned stimulus so that the neutral stimulus comes to elicit a conditioned response. In operant conditioning, the strengthening of a reinforced response.
Provide an example of classical conditioning NOT found in your book-my mom is understanding, so when I smell her perfume I feel at home and comfortable. (Molly)
Extinction-the diminishing of a conditioned response; occurs in classical conditioning when an unconditioned stimulus does not follow a conditioned stimulus; occurs in operant conditioning when a response is no longer reinforced. (Molly)
Spontaneous Recovery-the reappearance, after a rest period, of an extinguished conditioned response.
Generalization (in context of learning)-the tendency, once a response has been conditioned, for stimuli similar to the conditioned stimulus to elicit similar responses.
Discrimination-in classical conditioning, the learned ability to distinguish between a conditioned stimulus and other stimuli that do not signal an unconditioned stimulus.
Importance of Cognitive Processes in Classical Conditioning-Animals learn when to “expect” an unconditioned stimulus. The more predictable the association, the stronger the conditioned response. It’s as if the animal learns an expectancy, an awareness of how likely it is that the UCS will occur. Conditioning occurs best when the CS and UCS have just the sort of relationship that would lead a scientist to conclude that the CS causes the UCS. This principle helps explain why conditioning treatments that ignore cognition often have limited success.
Importance of Biology in Classical Conditioning-An animal’s capacity for conditioning is constrained by its biology. The biological predispositions of each species dispose it to learn the particular associations that enhance its survival. Environments are not the whole story.
Taste Aversion (Research of John Garcia)-if sickened as late as several hours after tasting a particular novel flavor, an animal avoids that flavor in the future. Aversions develop to tastes, but not to sights or sounds.
An example of taste aversion not described in your book-My friend ate white cheddar popcorn and hours later, she was sick from it. She no longer eats white cheddar popcorn.
Watson, Rayner and research after Pavlov-Pavlov’s work provided a basis for John Watson’s idea that human emotions and behavior, though biologically influenced, are mainly a bundle of conditioned responses. Watson and Rayner showed how specific fears might be conditioned. Their subject was an 11-month-old infant named Albert. “Little Albert” feared loud noises but not white rats. Watson and Rayner presented him with a white rat and, as he reached to touch it, struck a hammer against a steel bar just behind his head. After seven repetitions of seeing the rat and hearing the frightening noise, Albert burst into tears at the mere sight of the rat. Five days later, Albert showed generalization of his conditioned response by reacting with fear to a rabbit, a dog, and a sealskin coat, but not to dissimilar objects such as toys.
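The acquisition and extinction process described in the terms above can be sketched as a simple associative-strength simulation. This is a minimal, illustrative model (an error-correction update in the spirit of Rescorla and Wagner); the learning rate and trial counts are assumptions chosen for illustration, not values from these notes:

```python
# Minimal associative-strength model of acquisition and extinction.
# alpha (learning rate) and the trial counts are illustrative assumptions.

def run_trials(v, n_trials, ucs_present, alpha=0.3):
    """Update associative strength v over n_trials.
    The target is 1.0 when the UCS follows the CS (acquisition)
    and 0.0 when the CS is presented alone (extinction)."""
    target = 1.0 if ucs_present else 0.0
    history = []
    for _ in range(n_trials):
        v += alpha * (target - v)   # error-correction update
        history.append(round(v, 3))
    return v, history

v = 0.0
v, acquisition = run_trials(v, 10, ucs_present=True)   # CS paired with food
v, extinction = run_trials(v, 10, ucs_present=False)   # CS presented alone

print("strength after acquisition:", acquisition[-1])  # rises toward 1
print("strength after extinction:", extinction[-1])    # falls back toward 0
```

Each pairing moves the conditioned response closer to full strength, and each unreinforced presentation diminishes it, matching the acquisition and extinction curves the definitions describe.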
(the following terms relate to operant conditioning)
Respondent behavior-behavior that occurs as an automatic response to some stimulus
Operant behavior-behavior that operates on the environment, producing consequences
Law of effect-Thorndike’s principle that behaviors followed by favorable consequences become more likely, and that behaviors followed by unfavorable consequences become less likely
Skinner Box (operant chamber)-a chamber containing a bar or key that an animal can manipulate to obtain a food or water reinforcer, with attached devices to record the animal’s rate of bar pressing or key pecking; used in operant conditioning research.
Shaping-an operant conditioning procedure in which reinforcers guide behavior toward closer and closer approximations of a desired goal.
Successive approximations-one rewards responses that are ever closer to the final desired behavior and ignore all other responses. It is a way to shape complex behaviors. Example: after we got our electric fence for my dog, she refused to cross the line even when the collar was off and she was on a leash. She would sit down and not budge. We used treats to make her come closer and closer until she finally crossed the line. Each time she took a step closer we gave her a treat to try to reinforce that she was okay to cross. (Chrissy)
Discriminative stimulus-a stimulus that signals whether a response will be reinforced. Psychologists shape nonverbal organisms to discriminate between stimuli in order to determine what they can perceive; for example, if a dog can be shaped to respond to one stimulus and not another, it must be able to perceive the difference between them. (Chrissy)
Reinforcement-any event that increases the frequency of a preceding response. (Chrissy)
Positive reinforcement-strengthens a response by presenting a desirable stimulus, such as a tangible reward, praise, or attention. Example: a child was in speech class for eight years because he had trouble saying the “r” sound; he could not say words like “girl” or “car”. Throughout the classes, the teacher used positive reinforcement when the boy pronounced a word correctly. Each time he did it right, he received a piece of candy to try to instill this behavior. (Chrissy)
Negative reinforcement-strengthens a response by reducing or removing an aversive stimulus. Example: a student does not pay attention during class and does not prepare for a test in his chemistry class, and he receives a bad grade. After seeing the result, the student changes his behavior by taking notes, asking questions, and studying; the studying is strengthened because it reduces his stress and anxiety. Example: my dad used to never wear his seat belt, but his new car makes a constant beeping noise if he doesn’t buckle up. He now wears his seat belt in order to turn off the annoying beeping sound. (Chrissy)
Primary reinforcers-an innately reinforcing stimulus, like one that satisfies a biological need. Example: someone gets a glass of water to relieve their thirst. (Chrissy)
Conditioned reinforcer-a stimulus that gains its reinforcing power through its association with a primary reinforcer; also known as a “secondary reinforcer”. Example: if a light signals that food is coming for a rat, the rat will work to turn on the light in order to get food. The light is a secondary reinforcer because of its association with food. (Chrissy)
Immediate reinforcer-a reinforcer delivered right after a behavior, like food for a rat or a cookie for a whining child. Example: if a child whines to their parents for a cookie and the parent gives them one right away to stop the whining, the child will begin to associate whining with getting what they want, and will whine more frequently to get it. (Chrissy)
Delayed reinforcer-if the parent delayed the reinforcement by giving the child the reward later, it would keep the child from making the cause-and-effect connection (“if I whine, I will get what I want”), because other behaviors would intervene and be reinforced in the meantime.
Reinforcement Schedules:
Variable-interval schedule-reinforces the first response after varying time intervals. Produces slow, steady responding.
Fixed-interval schedule-reinforces the first response after a fixed time period. Produces a choppy stop-and-start pattern rather than a steady rate of response.
Variable-ratio schedules-provide reinforcers after an unpredictable number of responses, as slot machines do for gamblers. Produce high rates of responding, because reinforcers increase as the number of responses increases.
Fixed-ratio schedules-reinforce behavior after a set number of responses. Produce a high rate of responding, often with a brief pause after each reinforcement.
Partial (intermittent) reinforcement-responses are sometimes reinforced, sometimes not. Initial learning is slower than with continuous reinforcement, but partial reinforcement produces greater persistence and resistance to extinction.
Continuous reinforcement-the desired response is reinforced every time it occurs. Learning occurs more rapidly, but once the reinforcement stops, the behavior is extinguished quickly. (Chrissy)
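The difference between fixed-ratio and variable-ratio schedules above can be sketched in a short simulation. This is a minimal illustration, not from the notes: the ratio of 5, the response count of 100, and the use of a uniform random target for the variable schedule are all assumptions made for the example:

```python
import random

random.seed(0)  # fixed seed so the illustration is reproducible

def fixed_ratio(n):
    """Reinforce every n-th response (fixed-ratio schedule)."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True   # reinforcer delivered
        return False
    return respond

def variable_ratio(mean_n):
    """Reinforce after an unpredictable number of responses,
    averaging mean_n responses per reinforcer (variable-ratio schedule)."""
    target = random.randint(1, 2 * mean_n - 1)
    count = 0
    def respond():
        nonlocal count, target
        count += 1
        if count >= target:
            count = 0
            target = random.randint(1, 2 * mean_n - 1)  # next requirement is unpredictable
            return True
        return False
    return respond

fr = fixed_ratio(5)
vr = variable_ratio(5)
fr_rewards = sum(fr() for _ in range(100))
vr_rewards = sum(vr() for _ in range(100))
print("fixed-ratio rewards:", fr_rewards)     # exactly 100 // 5 = 20
print("variable-ratio rewards:", vr_rewards)  # roughly 20, but unpredictable trial to trial
```

Both schedules pay off about once per five responses on average, but on the fixed schedule the responder can predict exactly when the next reinforcer comes (hence the post-reinforcement pause), while on the variable schedule the very next response might pay off, which is what sustains the high, steady response rates seen in gamblers.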
Punishment-the opposite of reinforcement: reinforcement increases a behavior, while punishment decreases it. A punisher is any consequence that decreases the frequency of a preceding behavior, usually by administering an undesirable consequence or taking away something desirable. Example: if a young child yells back at a parent or is rude or disrespectful, the parent might take away dessert so the child learns not to repeat the behavior. (Chrissy)
Cognitive map-a mental representation of the layout of one’s environment. Example: after exploring a maze, rats act as if they have learned a cognitive map of it. They will demonstrate their knowledge of the maze regardless of whether there is a reinforcer, because they have made this “map” to remember their way. (Chrissy)
Latent Learning- learning that occurs but is not apparent until there is an incentive to demonstrate it (Laura)
Intrinsic Motivation vs. Extrinsic Motivation-intrinsic motivation is a desire to perform a behavior for its own sake and to be effective and extrinsic motivation is the desire to perform a behavior due to promised rewards or threats of punishment (Laura)
How does cognition affect operant learning?  Cognition is all the mental activities associated with thinking, knowing, remembering, and communicating, and operant conditioning is a type of learning in which behavior is strengthened if followed by a reinforcer or diminished if followed by a punisher. Cognition affects operant learning because it lets you remember and anticipate the reinforcer or punisher.  (Laura)
How does biology play a role in operant learning? Animals and humans can most easily learn and retain behaviors that draw on their biological predispositions. For example, cats have inborn tendencies to leap high and land on their feet. (Laura)
Research after Skinner: Skinner repeatedly insisted that external influences, not internal thoughts and feelings, shape behavior, and he urged the use of operant principles to influence people’s behavior at school, work, and home. He said that, recognizing that behavior is shaped by its consequences, we should promote more desirable behavior.  (Laura)
Modeling: The process of observing and imitating a specific behavior (Laura)
Mirror Neurons: Frontal lobe neurons that fire when performing certain actions or when observing another doing so. The brain’s mirroring of another’s action may enable imitation, language learning, and empathy. (For example, when one person yawns, it often causes others to yawn.) (Laura)
Albert Bandura: Bandura conducted a controversial experiment known as the Bobo doll experiment, to study patterns of behavior associated with aggression. He learned that similar behaviors were learned by individuals modeling their own behavior after the actions of others. He believes part of whether we imitate a model is due to reinforcements and punishments. (Laura)
Prosocial Models: People who exemplify non-violent, helpful behavior can prompt similar behavior in others. (For example, Martin Luther King, Jr. set a prosocial example by using non-violence). (Laura)
The Impact of Television: The more hours children spend watching violent programs, the more at risk they are for aggression and crime as teens and adults. (Laura)
The Good News about TV: Correlation does not imply causation. Therefore, correlation studies do not prove that viewing violence causes aggression. Maybe aggressive children prefer violent programs.  (Laura)
Desensitizing Youth: Prolonged exposure to violence desensitizes viewers. They become more indifferent to it when later viewing a brawl, whether on television or in real life. (Laura)
Provide one example of Observational Learning from your lifetime (4)-
An example of observational learning from my life is when my mom used to do my hair when I was younger. I would watch how she would pull my hair up and put a hair tie in it to keep it up. By watching (observing) how my mom did my hair, I was able to learn how to do my hair on my own. (Nina)
After my younger cousin was born when I was 7, I watched how my aunt held, fed, and comforted the baby boy. I watched every move, like how she rocked him and held the bottle. When I went home, I brought out the baby doll that I always played with and imitated my aunt’s actions. I pretended to rock my baby to sleep, comfort her when she “cried”, feed her when she was hungry, and sing her a lullaby to help her fall asleep. I watched my aunt take care of my younger cousin and repeated her actions on my own baby doll. (Chrissy)
When I was younger, my mom always used to make cookies, and I would watch her every time. Every holiday, I was surrounded by my older sister and mom making cookies. By watching them make cookies, it encouraged me to want to make them; I learned how to make cookies by watching them. (Molly)
In order to encourage me to read, my parents read to me, and surrounded me with books and people that read a lot.  (Laura)

