Sunday, September 23, 2012

Situational Power

This is the seventh time on this blog that I've referred to The Lucifer Effect: Understanding How Good People Turn Evil. Here, I call attention to Chapter 12, "Investigating Social Dynamics."

Throughout the chapter, author Philip Zimbardo presents the results of a number of psychological investigations into the propensity of people to go along with the crowd, to comply with authority, and to dismiss their own feelings and even concrete evidence that is directly in front of them.

You should keep the results of this body of work in mind when trying to make sense of the otherwise rather ridiculous claims made by those who argue that the oversight of experiments on animals is subject to careful ethical consideration.

Before giving an example from Zimbardo that illustrates how malleable our perceptions and beliefs actually are when certain situational influences are at work, I think it worthwhile to mention that even people studying this psychosocial phenomenon are themselves at risk of being drawn in.

The first half of The Lucifer Effect is about the experiment conducted by Zimbardo that has come to be known as the Stanford Prison Experiment. It's not important to this short essay, but very briefly, Zimbardo turned an empty floor of a university building into a mock prison, randomly assigned volunteer male undergrads to role-play either prison guards or prisoners, and then let them "play" prison.

Things very quickly got out of hand, but Zimbardo himself was unable to see it until he invited a colleague to observe his experiment. She was horrified. Zimbardo realized he had fallen victim to the very phenomenon he was studying. He writes,
In retrospect, my role transformation from usually compassionate teacher to data-focused researcher to callous prison superintendent was most distressing. I did improper or bizarre things in that new strange role.... I so fully adopted that role it made the prison "work" as well as it did. However, by adopting that role, with its focus on the security and maintenance of "my prison," I failed to appreciate the need to terminate the experiment as soon as the second prisoner went over the edge. (p. 218)
If someone studying this phenomenon is so at risk of falling victim to it, imagine how much more likely it is that it routinely ensnares others without their realizing it. One of the dangers of these social dynamics is that we generally assume that we are immune; it's the weak-willed other guy who is likely to just go along with the crowd. Zimbardo writes:
I must warn you of a bias you likely possess that might shield you from drawing the right conclusion from all you are about to read. Most of us construct self-enhancing, self-serving, egocentric biases that make us feel special--never ordinary, and certainly "above average." ...

Yet these biases can be maladaptive as well by blinding us to our similarity to others and distancing us from the reality that people just like us behave badly in certain toxic situations. Such biases mean that we don't take basic precautions to avoid the undesired consequences of our behavior, assuming it won't happen to us... In the extreme version of these biases, most people believe that they are less vulnerable to these self-serving biases than other people, even after being taught about them.
The biases Zimbardo is talking about cover a wide spectrum of circumstances, from nationalism to local team spirit. The situational influences that govern and control what is done to an animal on a college or university campus are an extreme case. One example directly on point discussed by Zimbardo is the research reported in Sheridan, Charles L., and King, Richard G., "Obedience to authority with an authentic victim," Proceedings of the Annual Convention of the American Psychological Association, 1972. Here's the abstract:
Instructed 13 male and 13 female undergraduates to deliver 30 graded shocks to a puppy. The extreme shocks, though not actually so, were purportedly dangerous and labeled up to 450 v. Ss'[subjects'] levels of obedience were assessed by noting the shock level at which they refused to comply with instructions. Despite the reactions of the puppy to the shock, 54% of the male Ss and 100% of the female Ss threw all switches. The sex difference was statistically reliable. Thus, Ss obeyed authoritatively given commands even when the victim was authentic.
The puppy, standing on an electrified grid, was really being shocked, though at a lower level than shown on the voltage labels the students saw. The shocks were sufficient to make the puppy cry out and jump around. Some of the female students cried as they shocked the puppy with higher and higher voltages. The lesson here is that situational influences can easily make us do things we would otherwise believe we never would or even could.

The reason I got to thinking about all of this again is because of a long comment by UW-Madison associate professor Robert Streiffer during a public "forum" on animal research. I've written about these "forums" before on a number of occasions. Streiffer is a member of the Letters and Sciences Animal Care and Use Committee. He says on his website that his "research includes bioethics (both medical and agricultural), ethical theory, metaethics, and political philosophy, with a focus on ethical and policy issues arising from modern biotechnology."

You can watch the forum here. Streiffer begins speaking at about 45:00.

Here, I'm primarily interested in the remarks he makes beginning at about 50:35. He argues that a "thoroughgoing utilitarian decision-making process" occurs with regard to experiments on animals. He says that the researchers themselves are likely familiar with only a small part of the process, which explains why they give "truncated" justifications for the things they do to animals when asked by a reporter. He argues that this thoroughgoing utilitarian decision-making process begins with Congress, then is continued by the NIH, then by the university, then by university departments, then by the lab animal veterinarians and animal care staff, then by the animal care and use committees, and then ("of course") the vivisector him- or herself is responsible for parts of this thoroughgoing utilitarian decision-making process. And, because of all these steps along the way, yes indeed, a thoroughgoing utilitarian decision-making process is in place.

This seems too convenient a way to sidestep responsibility for what is done to an animal in one of the university's labs. I questioned his claim at about 1:11:45. Streiffer finishes up his answer at about 1:16:00.

He argued in his prepared remarks that a thoroughgoing utilitarian decision-making process takes place, and then says in his answer to me that it's biased, but that there isn't much we can do about it. He also says that in past forums there has been some limited discussion about the systems in place to try to mitigate those biases -- which he agrees can be financial and caused by group membership. But I don't know of any such system or effort in place. In the end, he seems to just shake it all off and say, oh well, what can we do?

Streiffer's claim that there is a thoroughgoing utilitarian decision-making process isn't very accurate with regard to the topic of and reason for the forums: the ethics of animal research. In the thoroughgoing process he thinks he sees, the ethics of experiments on animals are not part of the discussion. The process sits squarely and unquestioned on the assumption, shared by everyone in the system, that using animals isn't a big deal or worthy of consideration.

When I say that Streiffer thinks he sees something that isn't actually there, I'm not being trite. In fact, research discussed by Zimbardo makes it very clear that situational pressures will make people see things that aren't as they claim them to be if others around them claim to see them that way. This has been demonstrated in a number of studies. The opinions of those around us genuinely change what we believe we see, even when the change we think we see isn't real.

When challenged, and generally only when challenged, those who experiment on animals pontificate about their goals and high ideals and rail on at length about how much their critics must hate sick babies. But they've been conditioned by their membership in their group to respond like this. The proof that there isn't likely to be any discussion about the costs to animals themselves during Streiffer's imagined thoroughgoing utilitarian decision-making process is simple to see: essentially everyone along the way eats animals.

The taste of an animal's fried flesh is a significant enough reason to raise and kill them. This is believed by essentially all the decision makers in Streiffer's imagined thoroughgoing utilitarian decision-making process. The idea that some more meaningful justification is needed prior to approving someone's getting a massive amount of money to poison a bunch of rats, to drill holes in cats' heads, or to raise monkeys without mothers is too silly to believe.

The connection between this and Zimbardo is the apparent erosion of Streiffer's better judgment through his now extended membership in a group situation that includes most, and perhaps all, of the elements identified by sociologists as reinforcements of the subjugation of independent thought.

Zimbardo summarizes these elements in what he calls "Ten Lessons from the Milgram Studies: Creating Evil Traps for Good People." (See his on-line version here.)

1. Offering an Ideology so that a big lie provides justification for any means to be used to achieve the seemingly desirable, essential goal. Presenting an acceptable justification, or rationale, for engaging in the undesirable action ...

-- Everything we do to animals, no matter how much pain and misery it causes is fully justified because it is for the good of Humanity. Human Exceptionalism.

2. Arranging some form of contractual obligation, verbal or written, to enact the behavior.

3. Giving participants meaningful roles to play [ACUC membership] that carry with them previously learned positive values and response scripts.

4. Presenting basic rules to be followed, that seem to make sense prior to their actual use, but then can be arbitrarily used to justify mindless compliance.

-- All protocols will be approved when every box is appropriately checked and questions are answered in an approved manner.

5. Altering the semantics of the act, the actor, and the action, [from hurting victims to helping children] -- replace reality with desirable rhetoric.

6. Creating opportunities for diffusion of responsibility for negative outcomes; others will be responsible...

-- this seems to describe Streiffer's imagined thoroughgoing utilitarian decision-making process.

7. Starting the path toward the ultimate evil act with a small, insignificant first step.

-- Just sit in on a few of our meetings.

8. Having successively increasing steps on the pathway be gradual, so that they are hardly noticed as being different from one’s most recent prior action.

9. Changing the nature of the influence authority from initially “Just” and reasonable to “Unjust” and demanding, even irrational, elicits initial compliance and later confusion, but continued obedience.

10. Making the "exit costs" high, and making the process of exiting difficult by allowing usual forms of verbal dissent (that make people feel good about themselves), while insisting on behavioral compliance (“I know you are not that kind of person, just keep doing as I tell you.”)

-- Whistleblowers are frightened and want to remain anonymous; people speak to colleagues or outsiders on the sly and are afraid to be seen by the group as a critic.

I hope people like Robert Streiffer, who don't themselves experiment on animals and thus might not be quite as far under the Situation's influence, will take the time to reread, or read for the first time, at least part of the very large body of research that explains to a large degree why otherwise good people so often end up doing really despicable things when ordered or encouraged to do so.

The good news is that some people's actions in the face of toxic situations have shown us that these strong negative influences are not universally able to corrupt everyone all the time.

1 comment:

Charlie Talbert said...

Hmmm ... this post affirms what I've heard others suggest: that the only ethical experiments and studies - and surely the most useful ones to humanity - should focus on the individual psychologies and self-reinforcing group dynamics of the human primates at UW who have been victimizing others with their self-congratulatory "research".