Thursday, March 11, 2010

Schedules of Reinforcement

Written 10 March, 2010

The death of the infant daughter of a Korean couple who were so immersed in raising a virtual baby in a "Second Life-like" virtual world that they neglected their real child has made big news in recent weeks:

-----

The couple left their 3-month-old daughter alone for 12-hour intervals while they went to a cyber cafe to play an online game, according to AOL News. The 41-year-old man and 25-year-old woman dropped by the house occasionally to feed the baby powdered milk.
The couple had become "obsessed with raising a virtual girl called Anima in the popular role-playing game Prius Online. The game, similar to Second Life, allows players to create another existence for themselves in a virtual world, including getting a job, interacting with other users and earning an extra avatar to nurture once they reach a certain level," according to the London Guardian.

-----

Video game addiction is in fact a topic of considerable concern in modern society. There are big questions about whether the American Psychiatric Association will list it in the upcoming revision of the Diagnostic and Statistical Manual of Mental Disorders, and already at least one nonprofit exists to combat what it characterizes as a mental illness.

In today's society it's almost impossible to avoid computers. And they certainly affect our bodies. We are subject to repetitive motion syndrome, migraines, pinched nerves, and other physical ailments caused by spending too much time sitting in front of screens. But computers affect our psyches also, and in ways we are only just beginning to understand.

There's little doubt, for example, that exposure to computers during childhood and teen years is causing dramatic changes in the way young people think. Educators are noticing difficulties with attention span in university students who, immersed in a world of tweets and Facebooking and text messaging, seem unable to concentrate on a single subject for more than a minute or two. In fact, in the news this week were reports that some university professors are banning laptops in classrooms because students are focusing on their screens rather than their instructors' words. The recent PBS Frontline film Digital Nation (a must-watch) compellingly documents this change in the way young people think.

And what of video games and MMORPGs-- massively multiplayer online role-playing games like World of Warcraft and Ultima Online? Does their immersive nature pose additional risks, cause additional changes in the way we think and behave? Are they in fact addictive? And what of virtual worlds like Second Life? Will they be so compelling that we will starve our babies because we can't tear ourselves away from our computers?

On March 8, www.cracked.com posted this humorously written but serious article about video game addiction. Author David Wong asks, "Are some games intentionally designed to keep you compulsively playing, even when you're not enjoying it?"

His answer? "Oh hell, yes!"

Wong references an article by John Hopson called Behavioral Game Design. Hopson discusses principles of behavioral psychology, in particular something called schedules of reinforcement. While Hopson doesn't quite say these schedules can be used to manipulate people into spending more time in video games, he at least implies it.

Wong thinks Hopson is just talking in code...

-----

"Each contingency is an arrangement of time, activity, and reward, and there are an infinite number of ways these elements can be combined to produce the pattern of activity you want from your players."


Notice his article doesn't contain the words "fun" or "enjoyment." That's not his field. Instead it's "the pattern of activity you want."

-----

... and probably, Hopson is talking in code.

The schedules of reinforcement Hopson describes were derived from work done by B.F. Skinner and his students and co-workers and described in Skinner's 1938 book The Behavior of Organisms.

At a time when experimental psychologists were trying to derive universal laws of learning by running rats through mazes and devising elaborate intra-organism equations to describe the resulting behavior, Skinner treated the organism as a black box, looking at what went into the box and what came out of it without worrying about what happened inside. He applied stimuli and analyzed the resulting behavior, without speculating about intervening mental processes.

This was in a way similar to the work of Ivan Pavlov, the Russian physiologist whose experiments with salivation in dogs resulted in a theory of learning called Classical Conditioning. When the presentation of meat powder was repeatedly paired with the ringing of a bell, Pavlov discovered, his dogs would begin to salivate when the bell was rung without any meat powder being present. Pavlov, too, you see, treated his dogs as black boxes.

Through work in his lab, using rats and pigeons as subjects, Skinner developed theories about how the consequences of a behavior affect how often that behavior is repeated. He called this Operant Conditioning. From his theories emerged the applied science of Applied Behavior Analysis-- or, in the vernacular, behavior modification.

A principal tenet of operant conditioning concerns the aforementioned schedules of reinforcement.

Schedules of reinforcement are, simply, rules for the awarding or removal of reinforcers after a specified behavior.

Skinner derived his schedules by starving pigeons and rats down to 80% of their normal body weight, then placing them in an operant conditioning chamber, or "Skinner Box": an enclosure with a lever to press (for rats) or a button to peck (for pigeons) and a chute that delivers food (the reinforcer) in response to one or more lever presses or button pecks, according to a schedule.

Obviously, a rat will soon learn to press a lever, or a pigeon to peck a button, to get a reinforcer (food). The rat might at first accidentally blunder into the lever, or the pigeon quizzically peck the button, but they will quickly figure out that the action brings them food. And predictably, they will press or peck until they are satiated.

Also predictably, if the lever or button stops delivering food, the rat or pigeon will quickly stop pressing or pecking. Behaviorists call this extinction.

What Skinner discovered, however, was that the intervals and ratios of reinforcement have powerful effects upon the frequency of button pecking and lever pressing-- and moreover, upon extinction.

When reinforcement is delivered on a fixed ratio (after every five pecks, say), responding is steady and rapid, with a brief pause after each reinforcement. An example of a fixed ratio schedule for humans is piece work, in which pay is based on the number of units assembled or delivered.
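To make the rule concrete, here is a minimal sketch in Python (my own illustration, not anything from Skinner's lab or Hopson's article) of a fixed-ratio schedule: the reinforcer is delivered after every Nth response.

```python
# A fixed-ratio (FR) schedule: reinforce after every `ratio`-th response.
# The class and names here are invented for illustration.

class FixedRatio:
    def __init__(self, ratio):
        self.ratio = ratio        # e.g. FR-5: every fifth peck pays off
        self.count = 0

    def respond(self):
        """Register one lever press or button peck; return True if reinforced."""
        self.count += 1
        if self.count >= self.ratio:
            self.count = 0
            return True           # a food pellet drops down the chute
        return False

# A pigeon on FR-5 gets fed on pecks 5, 10, 15, ...
schedule = FixedRatio(5)
print([schedule.respond() for _ in range(10)])
# [False, False, False, False, True, False, False, False, False, True]
```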

When reinforcement is delivered at a fixed interval (it comes at the first lever press or peck after the interval has expired), responses increase gradually as the end of the interval approaches, then drop off after delivery. An example of a fixed interval schedule for humans might be checking to see whether the mail has been delivered. As mail time approaches, you check more and more frequently; after delivery, you stop checking.
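A fixed-interval schedule can be sketched the same way; here the clock, rather than the response count, decides which response pays off. (Again, this is a hypothetical illustration, with the mailbox figure invented.)

```python
# A fixed-interval (FI) schedule: the first response after `interval_seconds`
# have elapsed is reinforced; responses before that point earn nothing.

import time

class FixedInterval:
    def __init__(self, interval_seconds):
        self.interval = interval_seconds
        self.last = time.monotonic()

    def respond(self):
        """Return True if this is the first response after the interval expires."""
        now = time.monotonic()
        if now - self.last >= self.interval:
            self.last = now
            return True           # the mail has arrived, so to speak
        return False

# Checking the mailbox every few minutes only "pays" once per delivery cycle.
mailbox = FixedInterval(interval_seconds=3600)
```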

When reinforcement is delivered on variable schedules, however, things get interesting-- interesting indeed.

What typically occurs is a high, steady rate of responding both before and after delivery of reinforcement, and tremendous resistance to extinction.

In real life, slot machines pay off on a variable ratio, and pulling the lever of a slot machine is notoriously resistant to extinction. Just think of your grandma in Atlantic City, who dropped 1,500 quarters into a broken slot machine.

Consider: you have a machine in your hallway that gives you a quarter every time you pull the lever. You pull it happily for several days, then it stops delivering. After you fiddle with it for five or ten minutes, you walk away. Thereafter you might pull the lever every once in a while as you pass the machine, just to see if the rules have changed.

But what if you were playing a slot machine that suddenly stopped paying off? How long would it be before you figured out money wasn't forthcoming? Answer: it would be a very long time! Think thousands of responses. Thousands of quarters.
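A sketch of the variable-ratio case shows why. The number of responses required for the next payoff is drawn at random around some average, so a long dry streak looks no different from ordinary play. (The code and the VR-50 figure below are my own illustration, not anyone's actual slot-machine logic.)

```python
# A variable-ratio (VR) schedule: reinforcement arrives after a randomly
# varying number of responses with a fixed mean (the slot-machine case).

import random

class VariableRatio:
    def __init__(self, mean_ratio):
        self.mean_ratio = mean_ratio
        self._next_target()

    def _next_target(self):
        # Draw the next required response count; it averages `mean_ratio`.
        self.target = random.randint(1, 2 * self.mean_ratio - 1)
        self.count = 0

    def respond(self):
        self.count += 1
        if self.count >= self.target:
            self._next_target()
            return True           # the machine pays off
        return False

# On VR-50, payoffs average one per fifty pulls, but any given pull might win.
slot_machine = VariableRatio(mean_ratio=50)
wins = sum(slot_machine.respond() for _ in range(10_000))
print(f"{wins} payoffs in 10,000 pulls")   # roughly 200

# A machine that has quietly stopped paying is indistinguishable from one in
# an ordinary dry streak, which is (informally) why responding on a VR
# schedule is so resistant to extinction.
```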

Variable intervals are much the same. They pay off on the first response after an unpredictable, varying period of time. Human examples aren't plentiful, but consider giveaways on radio shows: you find yourself listening all day in hopes the DJ will finally play Abbey Road so you can call in for free tickets to next week's Paul McCartney concert.
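In the same sketch style, a variable-interval schedule reinforces the first response after a randomly varying wait (the half-hour average below is invented for illustration).

```python
# A variable-interval (VI) schedule: the first response after a randomly
# chosen waiting period is reinforced; the waits average `mean_seconds`.

import random
import time

class VariableInterval:
    def __init__(self, mean_seconds):
        self.mean = mean_seconds
        self._reset()

    def _reset(self):
        self.wait = random.uniform(0, 2 * self.mean)   # averages self.mean
        self.start = time.monotonic()

    def respond(self):
        """One check of the radio dial; True means the song finally came on."""
        if time.monotonic() - self.start >= self.wait:
            self._reset()
            return True
        return False

# Listening all day and checking in now and then is exactly the steady,
# moderate responding a VI schedule tends to produce.
radio = VariableInterval(mean_seconds=1800)
```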

There's a reason I have given human examples for these schedules of reinforcement: even though they were derived from experiments with pigeons and rats, they apply equally well to dogs and to fish and to baboons and to humans. They are universal laws of learning, certain invertebrates possibly excepted. As much as some humans might not like the idea, we learn in the same ways as other animals.

For this reason, manipulation of schedules of reinforcement can have powerful effects on humans. Just ask anyone who has left Las Vegas with only his or her shirt!

Since their inception, video games have taken advantage of the various schedules of reinforcement. This adds to their attraction and makes it more difficult to tear oneself away from the computer to, oh, say, go home and feed the baby powdered milk. Certainly game designers, whether cynically or otherwise, do whatever they can to keep people immersed.
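As a purely hypothetical sketch of what that can look like in practice: many games hand out rare loot with a small fixed probability per attempt, which behaves like a variable-ratio schedule. Nothing below describes any real game's code; the function name and drop rate are invented.

```python
# Hypothetical loot loop: each kill is a "response" reinforced with
# probability 1/50, i.e. a variable-ratio-like schedule averaging 50 kills.

import random

RARE_DROP_CHANCE = 0.02   # invented figure, not taken from any actual game

def kill_monster():
    """One response; occasionally reinforced with a rare item."""
    return random.random() < RARE_DROP_CHANCE

kills = 1
while not kill_monster():   # "just one more fight..."
    kills += 1
print(f"Rare item dropped after {kills} kills")
```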

Second Life and other virtual worlds are indeed attractive to many people-- and, as with games, staying in world depends upon reinforcement. But here's the distinction between games and virtual worlds:

In games, computer-based or not, reinforcement is delivered according to the will of the game designer and comes according to strict rules. Your checker gets kinged when it reaches the other side of the board, you go to Level II when you kill enough orcs or find the treasure, you win the archery contest when your arrow lands closest to the center of the target.

But in virtual worlds, the reinforcement comes not from external sources, but from within the user himself or herself.

Those of us who are in Second Life are here because it satisfies us. We are reinforced by the landscapes and cities built by others, or by building or scripting things for ourselves, or by talking to strangers and to our friends, or by performing music, or by watching the performances of others.

We can certainly steer our second lives in the direction of artificial rewards-- by gambling, or by playing HUD-based games like Tiny Empires, or by role playing-- but we do that in real life as well. We are not cynically manipulated by the intrinsic design of Second Life-- this world is, after all, absent the creations of its residents, mostly empty space. It is compelling because LIFE is compelling. We are here because we are living a life in a distinct virtual space-- not because we are responding to someone else's devised schedules of reinforcement.
