Sunday, August 02, 2015

INTERVIEW: Dr. Philip Zimbardo and Director Kyle Patrick Alvarez on The Stanford Prison Experiment

When it comes to research into human behavior in groups, one of the most notable, foundational studies is the 1971 Stanford Prison Experiment. Conducted by Dr. Philip Zimbardo, a psychology professor at Stanford University, the experiment enlisted twenty-four volunteers -- half designated "prisoners" and half "guards" -- with the goal of studying how each group adapted to its designated role in a mock prison. As it turns out, they all adapted a little too well. Though scheduled to run for two weeks, the experiment was cut short after six days when the guards began to abuse the prisoners.

The study says a lot about human nature, and it's now the subject of a gripping new docudrama entitled, appropriately enough, The Stanford Prison Experiment, starring Billy Crudup (as Zimbardo) and directed by Kyle Patrick Alvarez. As a Communication Studies instructor, I refer to the Stanford Experiment early and often when teaching Small Group Communication and Interpersonal Communication, so when the opportunity arose to speak with director Alvarez and Dr. Zimbardo, I leaped at it. What follows are some highlights from our chat:


Dr. Zimbardo, when you first conducted this experiment, did you have any indication of how long-lasting its implications would end up becoming?

Zimbardo: No, we never thought it would have any far-reaching implications other than to be a bookend to the Milgram Study on the power of the situation. In his study, it's one-on-one authority over individuals. In ours, it's a system, a situation, and playing a role in a setting that validates that role. In fact, after I finished this study, I wrote a few articles and then moved on, so it wasn't until after Abu Ghraib that I finally wrote a book about it.

Kyle, you and I are roughly the same age, and we both grew up knowing about this study in some capacity, I'm sure. What drew you to wanting to dramatize it?

Alvarez: Well, the project existed before me, so there was a script already and...

It's been, for like, ten years, right?

Yeah, and arguably more, but this iteration, ten years. And I was familiar with the experiment, but I didn't understand or really know the finer details of it, and I think in a lot of ways, some of the most fascinating things about it are in those details: the accents one of the guards took on; certain events and how they occurred; or the role that the advisor played; these kinds of things.

So, when I first read the script, I assumed a lot of it had been manufactured for the sake of turning it into a movie. But then I started doing the research, and I was like, oh, wait, no, this all really happened. It wasn't just this dangerous experiment or whatever the very broad view of it had been.

It was actually something much more fascinating than that. And so, for me, I was really excited that you could do a movie that played with the cerebral more than the physical.

And you didn't really have to expand or exaggerate.

No, and the few moments we did were very carefully calculated. I feel like I could stand by why we did that, why we...any of those changes, I feel like the movie can hold up to that kind of scrutiny. But I do genuinely feel that. So, for me, it was that opportunity to make something that was really based on a true story.

As opposed to -- the term "based on a true story" means nothing nowadays. Really, it's totally absent...horror movies use it more than movies based on non-fiction. And so, for me, it was an opportunity to try to do a real event justice, and tell a really contained...almost make a clinical film about clinical stuff.

Doctor, this many years later, do you feel like you're still learning new things about what happened in terms of the lasting ramifications?

Zimbardo: Yeah, the first lesson was back in 2004, when everybody was horrified by the dozen or so images of American military police in Abu Ghraib humiliating, torturing, degrading prisoners who they should have been taking care of, and the parallels with the study were obvious and apparent. People wrote to me, said, "Hey, Dr. Z, those images look like they're from your study -- guards putting bags over prisoners' heads, stripping them naked, sexually abusing them."

And so, what happened was, through various circumstances, I went on NPR immediately afterwards, and I said: I want to challenge the characterization of these military police as bad apples, both by the Bush administration's spokespeople, Cheney and Rumsfeld, and also by the military, because whenever there's a scandal, whatever it is -- if it's a bank, if it's the police -- it's always a few rogue employees. So, you say that to get the system off the hook: don't blame us, blame them.

So, what I did, I said, you know what? I believe our soldiers are good apples, and somebody put them in a bad barrel. The media loved that metaphor, and then I said, the next question we have to ask is: who made that bad barrel? And this is the system, the bad barrel makers. So, actually, a lawyer for one of the guards heard my interview, called me, and said, "Want to be an expert witness?" At first I said no -- what he did was reprehensible -- but the lawyer persuaded me.

He's like, "But if you get to know him, you get to read all the investigative reports. You can see all the images." The images that are prohibited from being seen: there's 1,000 of them, including videos.

Wow.

And the ones that were shown are almost the least objectionable. So, I got to know everything about Abu Ghraib. I think I know probably more than anyone, because each of those reports is 200-300 pages, and I read every one of them. Most of them are by generals, and one of them is by a guy named James Schlesinger, a former Secretary of Defense.

And so, I became an expert witness for him, and in the process of doing that, I had to go back and revisit the prison studies. So, what I did is I got two students, new students who had not worked with me, and together we looked at each of the 12 hours of the videos, and then we made transcripts.

In the process of making transcripts, I said: this is how I have to write the book, in present tense, first person. Because when I was thinking of doing a book, it would've been in retrospect: let me tell you about this thing we did then, which could be good, but it's not as good as: "You know what I'm going to do with those sausages, boy, if you don't eat them?" "No, sir, what are you going to do?" "I'm going to shove them up your a**, boy." "Please, sir, don't..." -- you know.

So, here it's in the script, and in reading it, I remembered: this is how it happened. So, I said to myself, okay, I have to write a book that goes into great depth on what happened in the studies, and that's what The Lucifer Effect was.

And can you expand on “the Lucifer Effect”? Because I think that's fascinating.

The Lucifer Effect is understanding how good people turn evil. That's the subtitle. And so, what I did is I began by saying, what is the Lucifer myth? How do we understand the nature of evil? In theological terms and psychological terms. And then let's look at some examples. So, we visit the Holocaust; we visit Rwanda; we visit Bosnia; we visit the Crusades. Evil has been part of human nature all these times, and then I get into the prison studies, and here's an example of evil that I created.

So, there's ten chapters on that: a chapter for each of the six days, plus a chapter for the parole board hearing, a chapter for the visiting days, maybe a chapter for the arrests even. And then I say: but this is only one study. How can you draw a powerful conclusion? So, I have one or two other chapters on what all the other research says, in great detail, about the power of the situation.

There's Milgram, etc., and then I say: now the reader is equipped with the analytical tools to visit Abu Ghraib. So, then we say: what happened at Abu Ghraib? What was it? And then, the next chapter is: what is the system that created it? And then the final chapter is: how do we resist? In all these situations in the real world and experiments, a small percentage of people always resist: 10%, 20%, never more than 30%, and then: what is it about them?

Never more than 30%.

Almost never, no.

Wow.

Compliance is the majority. So, if resistance is, let's say, 35%, Milgram's compliance is 65%. His study was really the first to quantify evil by giving it a precise measure: what percentage of people gave the full 450 volts. And then in the Holocaust, in every nation, there were small percentages of Christians who helped Jews. But it's minimal compared to the millions of people who went along.

So, the point is: how do you resist? So, I began by saying, here's Dr. Z.'s ten steps on resisting unwanted influence, and I list them. And then I said, really, if somebody resists these powerful group pressures, don't they qualify as heroes? And then, what are heroes? I had never even thought about it, never written about it. So, I started researching. So, the last part of the chapter is: how can an ordinary person become an everyday hero? I looked at the banality of evil and the banality of heroism.

The typical notion is that heroes are people who walk on special soil. You know, the Agamemnons, the Achilleses, the samurai warriors; it's the mythical heroes, and then the superheroes. I said, you know what? In most of the situations, in everything I could find -- the Christians who helped Jews -- they were very ordinary people. And so, then I began to give a taxonomy for it: here are 12 different categories of heroes, here are exemplars.

And then I end with: we ought to promote the notion that anybody can be a hero, and so that's what I've been doing in the last six years. I started a nonprofit foundation in San Francisco, the Heroic Imagination Project, that we teach people how to stand up, speak out, and take action in challenging situations -- in your life, in your family, in your school.

And we have a revolutionary education program -- we don't have it in San Francisco schools, because they won't let us in; they said they don't have enough room. We have it in Poland, all over Hungary, in Sicily, and in some community colleges in Oregon.

I saw the movie just a couple days ago, and then the footage of Sandra Bland's arrest was released, and I see so much of what the film is about being illustrated there. Do you see parallels?

Alvarez: To a degree, yes. I don't know enough about the individual cop yet to really speak about it, so let's speak more about police brutality than about the specific instance. And I'm borrowing some of the doctor's words here, but the individual is still responsible; to the degree that we hold the individual responsible, though, we also need to look at the system -- as he said, the barrel itself. We train these cops. We put guns in their hands. They face immense opposition every single day.

And we can solve it in the moment by dealing with the individual, but the bigger picture -- the solutions to gun use, police brutality, incarceration issues, gang issues, all these things -- what we rarely do is treat the issue. We punish the individual as opposed to asking: why is this happening? Police are being too brutal, and in many cases, it has to do with race. So, why can't we look at the bigger picture and say, okay, yes, these individuals are responsible, but what is it that's encouraging or allowing this behavior? Is there something about the way we're structured or the training we're given?

There's also something inherent there, as the study showed and I think the film shows: you put on a uniform and you're given a task, and it changes you. It alters the way you interact with people. The person you are when you put that on isn't the same person you are when you go home. There is something different there, and I don't know what that is because I'm just a filmmaker, but there is, and maybe it's just that 30% or something, but it happens almost instantaneously. So, how do you deal with that? How do you bring accountability into that, and not just be part of it?

Zimbardo: In our study, the symbol of power was a billy club, but policemen strap on that big belt with a huge gun, a set of bullets, handcuffs, whistles, and once they put that on and step out into the street, they know there are a lot of people who don't like them. There are criminals there; there are drug dealers; there are pimps, etc.

There are people who don't like them for a variety of reasons, and so, for the police, the world is a dangerous place. The Tenderloin is a more dangerous place than Presidio Heights. And so, they start each day with a certain degree of fear, and they also realize that, in dealing with the public, the public has to obey them. The public is lots of different people. So, when they say, "Stop," they want you to stop. If you don't stop, then you're a threat to their authority. If you don't stop when they say stop, they say you are suspicious.

So, they want you to obey them, and that's part of the job. So, in this case of this woman, Sandra Bland, essentially, what he asked her to do is stop smoking, and when she refused, that was an insult because she's saying: I'm not doing what you tell me to do. And at that point, everything changed. She's going to fight back, and this is exactly our scene in the movie, where you're not going to make your bed -- you know, the prisoner lunges at you, the guard hits, so suddenly, now, nobody's in control. The problem is what happened when they took her to the police station. So, I think it's at that level where there's a comparison.

Alvarez: I have many friends who are cops, amazing cops who serve their job well and don't even have the desire to be violent. I know cops who wish they didn't have to have a gun on them, who would prefer not to. And so, it's an interesting thing now, where there's going to be a natural defensive quality, I think, that is expected, and it's going to perpetuate and grow through no fault of any individual.

It's a vicious circle.

Alvarez: Right now, it's really easy to say, oh, it's because cops are racist or cops are violent, but it runs deeper than that.

That's a very reductive approach to it, yeah.

Zimbardo: Exactly, but I fear that's what's going to happen because of the acts of a few individuals. But this is what we face. We were talking about Abu Ghraib; we were talking about the Stanford Prison Experiment. In a lot of ways, one of the really compelling issues with some of the guards during the experiment was the passivity, which, in many cases, is almost even worse.

It's not worse in the sense that the actions are more culpable, but in terms of the bigger picture, of perpetuating this kind of behavior, it's worse when you don't act. It's the old line: all that's necessary for evil to exist is for good men to do nothing. Silence can be approval or allowance, and that's a really scary thing, because I think, for me, it's easier to relate to the silent one...the person who's like, okay, I'm just going to sit back, and this is not a behavior, but I'm going to...

I don't want to get involved.

Yeah, and really, the true heroes, as you described before, are the people who could be passive but choose to be active, who could take the path of least resistance and elect not to.

The Schindlers, if you will.

Zimbardo: Exactly. Those are the people who I think deserve the real admiration, because they have no reason to do...no, even, guilt to...you know. It's hard to describe what that is, but there's a really interesting thing: with the three shifts of these guards, each shift had a guard who kind of stayed in the corner, was a little meek, kind of hung back, didn't really say anything. And each shift had a leader who instigated everything.

Alvarez: Spontaneously evolved, not assigned.

That's amazing.

And then each shift had "the swing man," the guy who, if he had elected to side with the passive one, wouldn't have allowed the behavior -- but instead, it's easier to follow the stronger leader. I just know that as a filmmaker. The more authoritative a leader I am -- and I say this considering myself to be quite a nice director, comparatively speaking -- the more people rally behind you, because there is a comfort in someone...right?

If there are four people in a room and you say, "Hey, we need you to do this task," there's a ten-second pause, an actual human moment of: who is taking the lead? And body language always tells it. I'm someone who doesn't always go out of my way to take the lead but will if I have to. You see the body language of the people who...and then sometimes you see the body language of the person who really wants to take the lead, and sometimes that's the person you've got to subdue, because they're going to be terrible at what they're going to do.

So, you've got to be up there, try to be...you know, leadership is a weird thing, but it becomes a big part of how these group situations play out. Including with Ezra [Miller]'s character -- he was the leader of the prisoners, right, but he actually perpetuated the violence as much as anybody else, because he didn't try to negotiate with the guards. He didn't try to say, "Hey, let's come up with a plan," or, "This isn't..." He responded with vitriol, which only encouraged it -- as we were saying, you don't want to encourage bad behavior.

********

Many thanks to the doctor and the director for their time. The Stanford Prison Experiment is one of the most bracing, engaging dramas I've seen all year, and I highly recommend seeking it out at select theaters.
