0:00:06.901,0:00:10.493 [Man] I was the first one to be [br]picked up, so they put me in a cell
0:00:10.493,0:00:11.951 they locked me in there
0:00:11.951,0:00:15.691 in this degrading little outfit.
0:00:15.691,0:00:18.029 [Guard] Hey! I don't want anybody laughing!
0:00:18.029,0:00:20.291 My way is the rule!
0:00:20.291,0:00:22.290 [unintelligible yelling]
0:00:25.189,0:00:27.389 [Man] I've gotta go to a doctor, anything.
0:00:27.829,0:00:30.889 [Man] Jesus Christ, I'm burning up inside dontcha know!
0:00:31.589,0:00:32.589 I want out!
0:00:32.839,0:00:33.839 I want out now!
0:00:35.739,0:00:38.647 [Man] I've never screamed [br]so loud in my life,
0:00:38.647,0:00:40.323 never been so upset in my life.
0:00:40.783,0:00:43.587 It was an experience of [br]being out of control.
0:00:44.738,0:00:46.518 [Man] I just fucking can't take it.
0:00:50.459,0:00:53.706 [Narrator] Stanford University, [br]Northern California.
0:00:53.706,0:00:57.425 One of America's most [br]prestigious academic institutions
0:00:57.425,0:01:01.840 and in 1971, the scene of one [br]of the most notorious experiments
0:01:01.840,0:01:04.100 in the history of psychology.
0:01:08.343,0:01:13.125 [Zimbardo] I was interested in what happens [br]if you put good people in an evil place.
0:01:15.552,0:01:20.992 Does the situation outside of you, the [br]institution, come to control your behavior
0:01:20.992,0:01:23.156 or does the things inside of you,
0:01:23.156,0:01:25.971 your attitudes, your values, your morality
0:01:25.971,0:01:30.625 allow you to rise above [br]a negative environment?
0:01:32.920,0:01:36.291 [Narrator] The negative environment [br]Zimbardo chose to test his ideas
0:01:36.291,0:01:37.571 was a prison.
0:01:37.571,0:01:41.410 He would convert the basement of [br]the university psychology department
0:01:41.410,0:01:43.840 into a subterranean jail.
0:01:43.840,0:01:47.308 [Zimbardo] We put prison doors [br]on each of three office cells.
0:01:47.308,0:01:50.276 In the cells, it was [br]nothing but three beds
0:01:50.276,0:01:54.171 and there was actually very little room for [br]anything else because they're very small.
0:01:54.171,0:01:58.685 And here we had solitary confinement,[br]which we call "the hole."
0:01:58.685,0:02:02.931 And in the hole was where the prisoners [br]would be put for punishment.
0:02:02.931,0:02:04.593 It was a very very small area.
0:02:04.593,0:02:07.555 When you close the door, [br]it was totally dark.
0:02:14.855,0:02:17.175 All the guards wore [br]military uniforms
0:02:17.175,0:02:20.641 and we had them wear these [br]silver reflecting sunglasses.
0:02:20.641,0:02:23.044 And what it does is, [br]you can't see someone's eyes
0:02:23.044,0:02:26.406 and that loses some of the [br]humanness, the humanity.
0:02:27.351,0:02:29.936 In general, we wanted to [br]create a sense of power.
0:02:29.936,0:02:33.835 That the guards as a category, are [br]people who have power over others.
0:02:33.835,0:02:36.320 And in this case, [br]power over the prisoners.
0:02:38.639,0:02:40.337 [Narrator] A decade earlier,
0:02:40.337,0:02:44.557 psychologist Stanley Milgram had also [br]looked at how we respond to authority.
0:02:45.887,0:02:50.502 In order to understand how people were [br]induced to obey unjust regimes
0:02:50.502,0:02:56.194 and participate in atrocities such as [br]the Holocaust, he set up an experiment.
0:02:56.194,0:03:01.162 Volunteers were told they were taking part [br]in scientific research to improve memory.
0:03:01.712,0:03:05.061 [Experimenter] Would you open those and [br]tell me which of you is which please?
0:03:05.061,0:03:06.724 [First man] Teacher.[br][Second man] Learner.
0:03:08.314,0:03:10.076 [Narrator] Separated by a screen,
0:03:10.076,0:03:13.886 the teacher would ask the[br]learner questions in a word game
0:03:13.886,0:03:17.528 and administer an electric shock [br]when the answer was incorrect.
0:03:17.528,0:03:20.908 He was told to increase the [br]voltage with each wrong answer.
0:03:22.369,0:03:27.846 [Teacher] Cloud, horse, [br]rock, or house? Answer?
0:03:27.846,0:03:29.564 [buzz][br][Teacher] Wrong.
0:03:31.354,0:03:35.106 150 volts. [br]Answer: horse.
0:03:35.106,0:03:40.965 [Learner] Ow! That's all! Get me out [br]of here! Get me out of here, please!
0:03:40.965,0:03:43.225 [Experimenter] Continue please.[br][teacher gesturing, unsure]
0:03:43.225,0:03:44.825 [Learner] I refuse to go on! Let me out!
0:03:44.825,0:03:48.346 [Experimenter] The experiment requires [br]you continue, teacher. Please continue.
0:03:48.346,0:03:52.047 [Narrator] Participants didn't know [br]that the learner was really an actor
0:03:52.047,0:03:54.767 and the so-called shocks were harmless.
0:03:54.767,0:03:58.656 [Teacher] Now you'll [br]get a shock. 180 volts.
0:03:58.656,0:04:03.596 [Learner] Ow! I can't stand [br]the pain, let me out of here!
0:04:03.596,0:04:06.184 [Teacher] He can't stand the pain, [br]I'm not going to kill that man.
0:04:06.184,0:04:09.184 Who's going to take the responsibility if [br]anything happens to that gentleman?
0:04:09.184,0:04:13.148 [Experimenter] I'm responsible for [br]anything that happens here. Continue please.
0:04:13.148,0:04:18.886 [Teacher] Alright, next one. [br]Slow, walk, dance, truck, music.
0:04:18.886,0:04:23.229 [Narrator] Two-thirds of volunteers were prepared to [br]administer a potentially fatal electric shock
0:04:23.229,0:04:28.281 when encouraged to do so by what they [br]perceived to be a legitimate authority figure.
0:04:28.281,0:04:31.667 In this case, a man in a white coat.
0:04:31.667,0:04:36.672 [Teacher] 375 volts. I think something's [br]happened to that fellow in there.
0:04:36.672,0:04:39.810 I got no answer, he was [br]hollering at all this voltage.
0:04:39.810,0:04:42.616 Can you check him to see [br]if he's alright please?
0:04:42.616,0:04:44.940 [Narrator] Milgram's [br]findings horrified America.
0:04:44.940,0:04:48.345 They showed that decent American citizens [br]were as capable of committing acts
0:04:48.345,0:04:49.635 against their conscience
0:04:49.635,0:04:52.248 as the Germans had been under the Nazis.
0:04:53.908,0:04:56.266 Like Milgram, Zimbardo was interested
0:04:56.266,0:05:00.026 in the power of social situations [br]to overwhelm individuals.
0:05:01.477,0:05:05.191 His experiment would test people's [br]responses to an oppressive regime.
0:05:05.491,0:05:08.197 Would they accept it? [br]Or act against it?
0:05:09.761,0:05:14.297 Zimbardo's experiment was conducted [br]against a backdrop of Civil Rights activism
0:05:14.297,0:05:16.625 and protests against the Vietnam War.
0:05:17.264,0:05:20.050 [Zimbardo] It was a sense of [br]student power, student dominance
0:05:20.050,0:05:23.443 and student rebellion [br]against authority in general.
0:05:24.772,0:05:28.693 [Narrator] It was from the student body [br]Zimbardo selected his participants.
0:05:28.693,0:05:33.499 After passing tests to screen out [br]anyone with a psychological abnormality,
0:05:33.499,0:05:36.538 they were paid $15 a day.
0:05:36.538,0:05:40.057 Each was randomly assigned [br]the role of guard or prisoner.
0:05:41.077,0:05:43.698 [Man] It was a prison to me, [br]it still is a prison to me.
0:05:43.698,0:05:46.036 I don't look at it as an [br]experiment or a simulation.
0:05:46.036,0:05:50.962 It was just a prison that was run by [br]psychologists instead of run by the state.
0:05:52.050,0:05:56.409 [Ramsay] I was 20, and that [br]September I was going to college.
0:05:56.409,0:06:01.815 And it would be nice to have a summer job, [br]but there sure wasn't a lot of time left.
0:06:01.815,0:06:06.186 And I looked in the want ads and I found [br]this thing which was just going to fit.
0:06:06.186,0:06:08.706 It was just two weeks.
0:06:08.706,0:06:12.801 [Man] You put a uniform on, [br]and are given a job
0:06:12.801,0:06:14.496 to keep these people in line.
0:06:14.496,0:06:17.546 You really become that person once [br]you put on that khaki uniform,
0:06:17.546,0:06:20.931 you put on the glasses, [br]you take the nightstick.
0:06:20.931,0:06:23.636 [Eshleman] I was on summer [br]break from my first year of college
0:06:23.636,0:06:25.446 and I was looking for a job.
0:06:25.976,0:06:28.690 Had to choose between [br]that or making pizzas.
0:06:28.690,0:06:30.591 And that sounded like a lot more fun.
0:06:31.151,0:06:32.868 [Narrator] As well as running the experiment,
0:06:32.868,0:06:36.038 Zimbardo took on the role [br]of prison superintendent.
0:06:36.038,0:06:38.442 He began by briefing the guards.
0:06:38.442,0:06:40.685 [Zimbardo] I said, "We have to [br]maintain law and order.
0:06:40.685,0:06:45.616 "If prisoners escape, the study is over.[br]And you can't use physical violence."
0:06:46.156,0:06:48.143 [Zimbardo] You can [br]create a sense of fear in them.
0:06:48.143,0:06:52.541 You can create a notion that their [br]life is totally in control by us.
0:06:52.541,0:06:54.453 There will be constant surveillance,
0:06:54.453,0:06:57.683 we have total power of the [br]situation and they have none.
0:07:00.677,0:07:03.042 [Narrator] Prisoners were [br]brought to the basement prison,
0:07:03.042,0:07:05.991 blindfolded to confuse them [br]about their whereabouts.
0:07:05.991,0:07:08.809 They were stripped and deloused.
0:07:08.809,0:07:11.137 [Zimbardo] Of course the guards [br]started making fun of their genitals
0:07:11.137,0:07:15.452 and humiliating them and really it starts [br]what's known as a degradation process.
0:07:15.452,0:07:21.849 Which not only prisons, but lots of [br]military type outfits use that process.
0:07:26.273,0:07:29.510 [Prisoner] When I first got there, [br]even though I had to strip,
0:07:29.510,0:07:32.459 and they would call me names, I still [br]didn't feel at all like it was a prison.
0:07:32.459,0:07:34.287 I just looked at it like a job.
0:07:35.321,0:07:39.310 [Eshleman] I recall sort of walking [br]up and down the very short hallway,
0:07:39.310,0:07:42.015 which was the prison hall [br]and looking in on the prisoners.
0:07:42.015,0:07:44.554 And they're basically lounging [br]around on their beds.
0:07:44.554,0:07:47.322 I felt it was like [br]a day in summer camp.
0:07:48.850,0:07:50.275 [Zimbardo] The first day I said,
0:07:50.275,0:07:54.685 "This might be a very long [br]and very boring experiment,"
0:07:54.685,0:07:57.502 because it's conceivable [br]nothing will ever happen.
0:07:59.515,0:08:01.554 [Eshleman] I arrived independently [br]at the conclusion
0:08:01.554,0:08:04.054 that this experiment must [br]have been put together
0:08:04.054,0:08:09.280 to prove a point about prisons [br]being a cruel and inhumane place.
0:08:09.280,0:08:15.346 And therefore, I would do my part, [br]to help those results come about.
0:08:15.936,0:08:20.442 I was a confrontational and [br]arrogant 18-year-old at the time
0:08:20.442,0:08:24.205 and I said "somebody oughta [br]stir things up a bit here."
0:08:25.067,0:08:28.951 [Prisoner] Fuck this experiment [br]and fuck that Zimbardo!
0:08:30.650,0:08:34.933 [Narrator] On the second morning, [br]prisoners decided to stir things up as well.
0:08:36.013,0:08:39.563 The guards found some of them had used [br]their beds to barricade their cell.
0:08:40.173,0:08:43.955 Prisoner 8612 was one of the [br]ringleaders of the rebellion.
0:08:43.955,0:08:48.913 [yelling] ...Fucking simulation! It's a fucking simulated experiment!
0:08:48.913,0:08:53.053 [Indistinct yelling]
0:08:54.443,0:08:56.123 [Zimbardo] Initially I was stunned.
0:08:56.123,0:08:58.183 I didn't expect a rebellion [br]because not much happened.
0:08:58.183,0:09:00.660 And it wasn't clear what [br]they were rebelling against,
0:09:00.660,0:09:02.852 but they were rebelling against the status,
0:09:02.852,0:09:04.494 rebelling against being anonymous,
0:09:04.494,0:09:09.310 against having to follow orders [br]from these other students.
0:09:10.144,0:09:12.418 [Narrator] As punishment for the rebellion,
0:09:12.418,0:09:17.442 prisoner 8612 was put in the hole and [br]the guards turned on the other prisoners.
0:09:18.536,0:09:21.866 [Zimbardo] The guards felt that they [br]now had to up the ante of being tough.
0:09:22.336,0:09:27.498 The prisoners made the mistake of [br]beginning to use profanity against the guards
0:09:27.498,0:09:29.592 in a very personalized way.
0:09:29.592,0:09:34.059 So not against the guards, but now[br]"you little punk" "you big shit" and stuff.
0:09:34.059,0:09:36.124 And the guards got furious.
0:09:37.186,0:09:40.652 [Guard] Everybody up! Everybody get up!
0:09:40.652,0:09:43.342 Well gentlemen, here [br]it is time for count.
0:09:43.342,0:09:46.445 [Narrator] Prisoners were repeatedly [br]woken in the middle of the night.
0:09:46.445,0:09:49.388 The guards made them [br]do menial physical tasks
0:09:49.388,0:09:52.053 and clean out toilets [br]with their bare hands.
0:09:52.053,0:09:56.431 [Eshleman] We made it a point not [br]to give them any sense of comfort
0:09:56.431,0:09:58.145 or what to expect.
0:09:58.145,0:09:59.915 Anything could happen to them at any time,
0:09:59.915,0:10:03.861 including being rousted from [br]their sleep at any hour.
0:10:03.861,0:10:08.174 And forced to stand up in a line [br]and have me hurl insults at them
0:10:08.174,0:10:11.861 and make them do exercises.
0:10:11.861,0:10:15.948 When you interrupt people's sleep, [br]they tend to become a little disoriented.
0:10:15.948,0:10:20.116 And since there was no daylight in the prison, [br]they had no idea whether it was night or day.
0:10:22.415,0:10:25.379 I think that I was the instigator of this
0:10:25.379,0:10:28.851 whole schedule of harassment.
0:10:30.221,0:10:34.347 [Narrator] The guards' harassment [br]took its toll on rebellion leader 8612.
0:10:34.347,0:10:38.115 He told Zimbardo he wanted [br]to leave the experiment.
0:10:38.115,0:10:43.179 Zimbardo responded not as a psychologist [br]but as a prison superintendent.
0:10:43.179,0:10:46.907 [Zimbardo] I said, "Well, I can see to it [br]the guards don't hassle you personally
0:10:46.907,0:10:50.869 and in return all I would like [br]is some information from time to time
0:10:50.869,0:10:52.779 about what the prisoners are doing."
0:10:52.779,0:10:56.548 So essentially I'm saying "I'd [br]like you to be a snitch, an informant."
0:10:56.548,0:11:00.274 And I said "think it over and [br]if you still want to leave, fine."
0:11:00.274,0:11:04.920 [Narrator] Confused, prisoner [br]8612 returned to his cell
0:11:04.920,0:11:07.540 and told the other prisoners [br]that no one could leave.
0:11:12.925,0:11:16.968 [Zimbardo] He believed we wouldn't[br]let him go, although we never said that.
0:11:16.968,0:11:19.451 But the fact that he was the [br]ringleader of the rebellion
0:11:19.451,0:11:22.576 and he told the other prisoners [br]"they won't let you leave,"
0:11:22.576,0:11:26.770 that really transformed the [br]experiment into a prison.
0:11:26.770,0:11:31.839 [Prisoner] I was told I couldn't quit. And [br]at that point, I just felt totally hopeless.
0:11:31.839,0:11:33.966 More hopeless than I [br]had ever felt before.
0:11:35.995,0:11:37.730 [Narrator] Soon after returning to his cell,
0:11:37.730,0:11:41.740 prisoner 8612 started showing [br]signs of severe distress.
0:11:42.220,0:11:46.073 [Prisoner 8612] Goddammit. Fucked up! [br]You don't know, you don't know.
0:11:46.073,0:11:50.409 I mean, God, I mean Jesus Christ, [br]I'm burning up inside, don't you know?
0:11:54.212,0:11:58.700 [Zimbardo] He came up with a [br]plan that if he acted crazy,
0:11:58.700,0:12:00.400 we would have to release him.
0:12:02.140,0:12:05.101 [Prisoner 8612] I feel fucked up inside, [br]I feel really fucked up inside. You don't know.
0:12:05.101,0:12:09.865 I gotta go to a doctor, anything. [br]I can't stay here, I'm fucked up.
0:12:09.865,0:12:13.339 I don't know how to explain it, [br]I'm fucked up inside! And I want out!
0:12:16.151,0:12:19.717 [Zimbardo] It starts with make-believe and [br]then he's doing it and cursing and screaming.
0:12:19.717,0:12:24.070 You know, whatever that little [br]boundary is, he moved across,
0:12:24.070,0:12:29.254 not that he became really crazy, [br]but he became excessively disturbed.
0:12:29.254,0:12:32.088 So much so, we immediately said,[br]"We have to release him."
0:12:33.081,0:12:35.630 [Korpi--prisoner 8612] As an [br]experience, it was unique.
0:12:35.630,0:12:37.960 I've never screamed [br]so loud in my life.
0:12:38.700,0:12:41.374 Never been so upset in my life,
0:12:41.374,0:12:44.575 and it was an experience [br]of being out of control.
0:12:46.890,0:12:49.446 [Narrator] The boundary between [br]reality and make-believe
0:12:49.446,0:12:51.810 was to become blurred [br]even for Zimbardo.
0:12:52.330,0:12:56.646 A rumor circulated that released [br]prisoner 8612 would return with friends
0:12:56.646,0:12:59.316 to liberate the remaining prisoners.
0:12:59.316,0:13:01.732 [Zimbardo] I quickly convinced myself that
0:13:01.732,0:13:08.330 my most important function was not to [br]allow this prison liberation to occur.
0:13:08.330,0:13:10.920 And what could I do to keep my prison going?
0:13:10.920,0:13:12.408 Not the experiment going.
0:13:13.546,0:13:15.143 [Narrator] The prison was dismantled
0:13:15.143,0:13:17.783 and the prisoners were moved [br]to another part of the building.
0:13:19.243,0:13:23.523 Zimbardo waited in the empty [br]corridor preparing to tell 8612
0:13:23.523,0:13:25.511 and his friends that the study was over,
0:13:25.511,0:13:26.749 when a colleague appeared
0:13:26.749,0:13:30.349 and began asking questions [br]about the scientific basis of the research.
0:13:30.919,0:13:33.361 [Zimbardo] I'm trying to get [br]rid of him and then he says,
0:13:33.361,0:13:35.351 "What's the independent variable?"
0:13:35.721,0:13:41.869 I got furious, because he doesn't understand [br]that there's a riot about to take place,
0:13:41.869,0:13:44.038 that this prison is about to erupt.
0:13:44.038,0:13:49.979 I had totally lost this whole other identity [br]of scientist, researcher, psychologist.
0:13:51.136,0:13:54.107 [Narrator] The rumored jailbreak [br]never materialized.
0:13:54.107,0:13:56.511 The guards had dismantled [br]the prison for nothing
0:13:56.511,0:13:58.355 and had to rebuild it.
0:13:58.355,0:14:00.657 They took their frustration [br]out on the prisoners.
0:14:02.492,0:14:05.282 [Zimbardo] They escalated [br]again the level of control,
0:14:05.282,0:14:08.606 the level of dominance,[br]the level of humiliating behavior.
0:14:13.783,0:14:18.245 [Narrator] 819 was the next prisoner to [br]rebel against the harassment of the guards.
0:14:18.245,0:14:22.465 He barricaded himself in his cell [br]and refused to take part in the count.
0:14:22.465,0:14:24.346 [Guard] You're not only [br]not getting cigarettes,
0:14:24.346,0:14:26.226 but for as long as this cell's blockaded
0:14:26.226,0:14:28.356 you're going to be in [br]solitary when you get out.
0:14:28.356,0:14:32.645 [Narrator] For 819's disobedience, the guards [br]made his cellmates do mindless work.
0:14:32.645,0:14:36.991 This undermined any vestige of [br]solidarity amongst the prisoners
0:14:36.991,0:14:39.162 who now chose to accept [br]the tyranny of the guards
0:14:39.162,0:14:41.412 rather than risk further harassment.
0:14:42.422,0:14:45.446 [Eshleman] That was one of [br]the surprising things to me was that
0:14:45.446,0:14:49.607 there was so little that the [br]prisoners did to support one another
0:14:49.607,0:14:53.967 after we started our [br]campaign of divide and conquer.
0:14:56.117,0:14:58.588 [Narrator] Isolated and [br]distraught, prisoner 819
0:14:58.588,0:15:00.904 told Zimbardo he wanted to leave.
0:15:01.344,0:15:05.377 [Zimbardo] While I'm interviewing [br]819, and saying,
0:15:05.377,0:15:08.687 "Okay, it's all over, thank you [br]for your participation.
0:15:08.687,0:15:13.293 I'll give you money for the whole two weeks, [br]even though you're leaving early."
0:15:13.293,0:15:17.693 He hears the prisoners [br]shouting: "819 did a bad thing."
0:15:17.934,0:15:23.334 [Prisoners] Prisoner 819 did a bad thing.[br]Prisoner 819 did a bad thing.
0:15:25.440,0:15:27.038 And he said, "I can't leave." [br]And he's crying.
0:15:27.038,0:15:28.026 And he said, "I can't leave."
0:15:28.026,0:15:29.041 And I said, "What do [br]you mean you can't leave?"
0:15:29.041,0:15:32.626 And he said, "No, I have to go back because [br]I don't want them to think I'm a bad prisoner."
0:15:32.626,0:15:37.229 And that's when I really flipped [br]out that in such a short time
0:15:37.229,0:15:42.888 a college student's thinking [br]could become so distorted.
0:15:42.888,0:15:44.891 I said, "You're not a bad prisoner.
0:15:44.891,0:15:46.931 "You're not a prisoner. [br]And this is not a prison."
0:15:46.931,0:15:48.044 And it was just this thing where
0:15:48.044,0:15:51.262 he opened up his eyes, really [br]like a cloud being lifted.
0:15:52.108,0:15:57.553 [Narrator] Seeing things clearly, prisoner 819 [br]reverted to his original request and was released.
0:15:57.553,0:16:03.029 To replace him, the experimenters called [br]in one of their reserves from the standby list.
0:16:04.414,0:16:08.453 [Ramsay] I got a phone call saying, [br]"Are you still available as an alternate?"
0:16:08.453,0:16:11.181 Kind of a cheery, female secretary voice.
0:16:11.181,0:16:13.321 And I said, "Yes, sure."
0:16:13.321,0:16:16.452 And so she said, [br]"Could you start this afternoon?"
0:16:16.452,0:16:18.717 And I said, "Yes, sure."
0:16:18.717,0:16:21.818 And my role in the [br]experiment really began.
0:16:26.859,0:16:32.308 I was blindfolded and then [br]stripped and supposedly deloused.
0:16:34.543,0:16:38.579 [Zimbardo] He came into [br]a madhouse, full blown.
0:16:38.579,0:16:42.660 All of us had gradually acclimated [br]to the increasing level of aggression,
0:16:42.660,0:16:45.162 increasing powerlessness of the prisoners,
0:16:45.162,0:16:48.416 increasing dominance of the guards.
0:16:48.416,0:16:51.942 And he comes in and says, "What's [br]happening here?" to the other prisoners.
0:16:51.942,0:16:53.400 And they said [br]"Yeah, you better not make trouble,
0:16:53.400,0:16:55.920 it's really terrible, it's a real prison."
0:16:55.920,0:17:00.561 And he says, "I'm out of here, [br]I don't want this."
0:17:00.568,0:17:01.835 And they said "No, you can't leave.
0:17:01.835,0:17:04.135 Once you're here, you're stuck. [br]This is a real prison."
0:17:04.135,0:17:07.859 [Guard] 416 put your hands in the air [br]or why don't you play Frankenstein?
0:17:07.859,0:17:11.572 293 you can be the bride of [br]Frankenstein, you stand here.
0:17:11.572,0:17:15.845 [Narrator] Prisoner 416 was soon subjected [br]to the harassment of Dave Eshleman,
0:17:15.845,0:17:19.481 nicknamed John Wayne [br]because of his macho attitude.
0:17:19.481,0:17:23.761 [Guard] 416 I want you to walk over here like [br]Frankenstein and say that you love 2093.
0:17:25.951,0:17:27.714 That ain't a Frankenstein walk!
0:17:27.714,0:17:29.848 [Eshleman] I made the [br]decision that I would be
0:17:29.848,0:17:34.639 as intimidating, as cold, as cruel as possible.
0:17:34.639,0:17:38.359 [Prisoner 416] I love you 2093. [br][Guard] Get up close! Get up close!
0:17:38.359,0:17:42.339 [Prisoner 416] I love you 2093.[br]I love you 2093.
0:17:42.339,0:17:46.390 [Guard] You get down [br]here and do ten pushups!
0:17:47.869,0:17:50.365 [Eshleman] I had just watched [br]a movie called Cool Hand Luke
0:17:50.365,0:17:55.601 and the mean intimidating southern [br]prison warden character in that film
0:17:55.601,0:17:59.324 really was my inspiration for the [br]role that I created for myself.
0:18:09.728,0:18:12.244 [Zimbardo] He was creative in his evil.
0:18:12.244,0:18:18.432 He would think of very ingenious ways [br]to degrade, to demean the prisoners.
0:18:18.432,0:18:23.403 [Guard] What if I told you to get [br]down on that floor and fuck the floor
0:18:23.403,0:18:25.262 what would you do then?
0:18:26.155,0:18:29.696 [Zimbardo] One of the best guards [br]was also on that shift
0:18:29.696,0:18:33.684 and instead of confronting the [br]bad guard, the sadistic guard
0:18:33.684,0:18:36.298 essentially, because he didn't [br]want to see what was happening,
0:18:36.298,0:18:40.279 he became the gofer, he went out to [br]get the food and things of this kind.
0:18:40.279,0:18:46.300 And that left the John Wayne guard and the [br]other guard on that shift to be dominant.
0:18:46.300,0:18:49.065 [One of the guards] We were [br]continually called upon to act
0:18:49.065,0:18:53.203 in a way that is contrary [br]to what I really feel inside.
0:18:53.203,0:18:55.355 Just continually giving out shit.
0:18:55.355,0:18:59.018 It's really just one of the [br]most oppressive things you can do.
0:19:01.178,0:19:05.578 [Guard] 416, while they do pushups, [br]you sing Amazing Grace.
0:19:05.578,0:19:08.296 Ready? Down.
0:19:08.296,0:19:14.098 [prisoner singing]
0:19:14.098,0:19:15.767 [Guard] Keep going.
0:19:15.767,0:19:19.835 [Narrator] The madness of the [br]experiment started to affect 416.
0:19:19.835,0:19:23.455 [prisoner singing]
0:19:23.455,0:19:24.908 [Guard] Keep going!
0:19:24.908,0:19:30.394 [Prisoner 416] I began feeling like I was [br]losing my identity, until I wasn't Clay.
0:19:30.394,0:19:32.034 I was 416.
0:19:32.034,0:19:35.574 I was really my number and 416 [br]was going to decide what to do.
0:19:37.256,0:19:42.015 [Narrator] Prisoner 416 decided [br]to go on a hunger strike.
0:19:42.448,0:19:45.139 [Ramsay] They were pushing my limits,
0:19:45.139,0:19:47.424 but here was the thing that I could do,
0:19:47.424,0:19:50.046 that could push their limits.
0:19:53.024,0:19:55.399 After I had missed a couple meals,
0:19:55.399,0:19:59.292 I saw this was not a matter [br]of indifference to the guards.
0:19:59.292,0:20:02.024 I was making headway, they were upset.
0:20:04.639,0:20:08.622 [Eshleman] I thought, "How dare this [br]newcomer come in and try to change
0:20:08.622,0:20:12.055 everything that we had worked [br]for the first three days to set up.
0:20:12.055,0:20:14.586 And by God, he's going to suffer for that."
0:20:16.306,0:20:19.660 [Narrator] Frustrated by [br]his continued defiance,
0:20:19.660,0:20:22.600 John Wayne threw prisoner 416 into the hole.
0:20:22.600,0:20:25.953 After punishing the other prisoners [br]for his disobedience,
0:20:25.953,0:20:29.889 John Wayne encouraged them to [br]vent their anger at 416 directly.
0:20:29.889,0:20:31.329 [Prisoner] Thank you, 416. [br][bangs on door]
0:20:32.807,0:20:36.754 [Guard] Ok, 209.[br][Prisoner] Thank you, 416.
0:20:36.754,0:20:40.535 [Eshleman] We would use our nightsticks to [br]bang on the door and we would kick the door
0:20:40.535,0:20:46.462 so hard it must've shaken [br]him very seriously inside.
0:20:46.462,0:20:48.025 Scared the life out of him.
0:20:49.335,0:20:52.548 [Ramsay] He yelled at me, and [br]threatened me and actually sort of
0:20:52.548,0:20:58.409 smashed a sausage into my face [br]to try to get me to open up.
0:20:58.409,0:21:02.812 But I didn't have any intention [br]of eating until I was out.
0:21:04.590,0:21:07.750 [Zimbardo] 416 should've [br]been, at some level, a hero
0:21:07.750,0:21:11.413 'cause he's willing to oppose [br]the authority of the system.
0:21:11.413,0:21:15.566 In fact, the prisoners accept the guards' [br]definition of him as a troublemaker.
0:21:16.235,0:21:20.243 [Eshleman] I remember some of them [br]saying: "Would you eat goddammit!"
0:21:20.243,0:21:22.705 "We're sick and tired of this."
0:21:22.705,0:21:30.665 And that was proof that there was no solidarity, [br]there was no support between the prisoners.
0:21:30.665,0:21:32.956 [Narrator] While 416 was still in the hole,
0:21:32.956,0:21:35.676 John Wayne made a final [br]attempt to break him
0:21:35.676,0:21:38.450 by giving his fellow prisoners a choice.
0:21:38.450,0:21:42.035 They could vote to release [br]him by making a small sacrifice.
0:21:42.035,0:21:48.088 [Guard] You can give me your blankets [br]and sleep on the bare mattress
0:21:48.088,0:21:53.351 or you can keep your blankets [br]and 416 will stay in another day.
0:21:54.821,0:21:57.144 What will it be?
0:21:57.144,0:21:58.632 [Prisoners] I'll keep my blankets.
0:21:58.632,0:21:59.952 [Guard] What will it be over here?
0:22:00.414,0:22:01.414 [Prisoners] I'll keep my blankets.
0:22:02.415,0:22:04.335 [Guard] How about 536?
0:22:04.515,0:22:06.575 [Prisoners] I'll give you my blankets [br]Mr. Correctional Officer.
0:22:06.940,0:22:08.370 [Guard] We don't want your blankets.
0:22:08.526,0:22:09.344 [Guard] We got 3 in favor of keeping their blankets.
0:22:10.163,0:22:12.283 We got 3 against 1.
0:22:12.283,0:22:13.403 Keep your blankets.
0:22:13.403,0:22:18.520 416 you're going to be in there for a while,[br]so just get used to it.
0:22:18.520,0:22:21.472 [Eshleman] The study [br]showed that power corrupts
0:22:21.472,0:22:27.485 and how difficult it is for people who are the [br]victims of abuse to stand up and defend themselves.
0:22:27.485,0:22:33.079 Why doesn't anybody who is being abused [br]by a spouse or something like that
0:22:33.079,0:22:35.216 just say "stop it?"
0:22:35.216,0:22:37.856 And we realize now that it's [br]not as easy as it sounds.
0:22:41.217,0:22:45.618 [Narrator] By the end of the 5th day, 4 [br]prisoners had broken down and been released.
0:22:45.618,0:22:47.929 416 was on the second [br]day of his hunger strike
0:22:47.929,0:22:50.989 and the experiment still had another 9 days to run.
0:22:53.571,0:22:57.583 At this point, a fellow psychologist [br]visited Zimbardo's basement prison
0:22:57.591,0:23:01.417 and would witness the brutality [br]of the experiment first hand.
0:23:02.233,0:23:05.383 [Zimbardo] The guards had lined up [br]the prisoners to go to the toilet.
0:23:05.383,0:23:07.726 They had bags over their heads, [br]chains on their feet,
0:23:07.726,0:23:09.622 and were marching by and I looked up.
0:23:09.622,0:23:12.926 And I saw this circus, this parade,
0:23:12.926,0:23:14.901 and I said, "Hey Chris, look at that."
0:23:15.475,0:23:18.547 [Christina Maslach] I looked up, and [br]I began to feel sick to my stomach.
0:23:18.547,0:23:22.286 I had this just...[br]chilling, sickening feeling
0:23:22.286,0:23:25.367 of watching this and you [br]know, I just turned away.
0:23:25.918,0:23:29.324 And I just let loose [br]in this emotional tyranny.
0:23:29.324,0:23:30.605 I just lost it.
0:23:30.605,0:23:33.703 I was angry, scared, I was in tears.
0:23:33.703,0:23:35.790 [Zimbardo] And I'm furious,
0:23:35.790,0:23:37.584 saying you know, we had a big argument.
0:23:37.584,0:23:39.019 You're supposed to [br]be a psychologist.
0:23:39.019,0:23:42.757 This is interesting dynamic behavior [br]and I'm going through this whole thing
0:23:42.757,0:23:44.651 the power of the situation.
0:23:44.651,0:23:48.416 And she says "No, no, it's [br]that young boys are suffering
0:23:48.416,0:23:51.823 and you're responsible. [br]You're letting it happen."
0:23:51.823,0:23:54.882 I said "Oh my god, [br]of course you're right."
0:23:56.129,0:23:59.417 [Narrator] The next day, [br]Zimbardo ended the experiment.
0:24:01.330,0:24:06.839 Studies like his stimulated heated debate [br]about the ethics of using human subjects.
0:24:06.839,0:24:10.769 [Zimbardo] Really young men [br]suffered verbally, physically.
0:24:10.769,0:24:13.465 Prisoners felt shame in their role.
0:24:13.465,0:24:15.365 Guards felt guilt.
0:24:15.365,0:24:17.359 So in that sense, it's unethical.
0:24:17.359,0:24:22.096 That is, nobody has the right, the power, [br]the privilege to do that to other people.
0:24:22.900,0:24:26.290 [Narrator] In the wake of experiments [br]like Zimbardo's and Milgram's,
0:24:26.290,0:24:31.665 ethical guidelines changed, introducing [br]greater safeguards to protect participants.
0:24:32.415,0:24:36.218 In the Stanford experiment, Zimbardo [br]might have spared his volunteers' distress
0:24:36.218,0:24:39.781 had he not taken on a dual role in the study.
0:24:39.781,0:24:41.731 [Zimbardo] If I was going to [br]be the prison superintendent,
0:24:41.731,0:24:45.416 I should have had a colleague [br]who was overseeing the experiment,
0:24:45.416,0:24:50.623 who was in a position to stop it at any point.
0:24:50.623,0:24:52.619 Or I should've been the principal investigator
0:24:52.619,0:24:55.903 and get somebody who was going [br]to be the prison superintendent.
0:24:55.903,0:24:58.571 I realized that was a big mistake, [br]to play both those roles.
0:24:58.571,0:25:00.436 And by shifting back and forth.
0:25:03.393,0:25:06.534 [Narrator] After the experiment, Zimbardo [br]brought all the participants together
0:25:06.534,0:25:08.851 to talk about their experiences.
0:25:09.371,0:25:11.794 John Wayne would now come face to face
0:25:11.794,0:25:14.214 with the hunger striker that he had tormented.
0:25:14.914,0:25:15.873 [Eshleman] I was a little worried.
0:25:15.873,0:25:19.424 I said "Oh my god, he's really [br]gonna come down on me hard now."
0:25:19.424,0:25:22.368 Now that we're on equal footing.
0:25:22.368,0:25:23.662 [Ramsay] It harms me.
0:25:23.662,0:25:26.084 [Eshleman] How did it harm you? [br]How does it harm you?
0:25:26.084,0:25:28.924 Just to think ((cross talk)) you know [br]people can be like that?
0:25:28.924,0:25:32.105 [Ramsay] Yeah, it let me in [br]on some knowledge
0:25:32.105,0:25:34.385 that I've never experienced first hand.
0:25:34.385,0:25:37.099 Because I know what you can turn into,
0:25:37.099,0:25:39.425 I know what you're willing to do.
0:25:39.425,0:25:41.887 [Eshleman] When I look back on [br]it now, I behaved appallingly.
0:25:41.887,0:25:45.308 You know, it was just horrid to look at.
0:25:45.308,0:25:48.062 I think I tried to explain [br]to him that at the time,
0:25:48.062,0:25:50.928 what you experience and [br]what you hated so much
0:25:50.928,0:25:54.126 was a role that I was playing, [br]that's not me at all.
0:25:54.126,0:26:00.461 [Ramsay] He was trying to dissociate [br]himself from what he had done.
0:26:00.461,0:26:03.026 That did make me angry.
0:26:03.026,0:26:05.545 Everyone was acting out [br]a part and playing a role:
0:26:05.545,0:26:08.457 prisoners, guards, staff,
0:26:08.457,0:26:11.166 everyone was acting out a part.
0:26:11.166,0:26:15.513 It's when you start contributing [br]to the script,
0:26:16.377,0:26:20.976 that's you and thus it's something you [br]should take responsibility for.
0:26:20.976,0:26:23.776 [Eshleman] Uh, I didn't [br]see where it was really harmful.
0:26:23.776,0:26:27.295 It was degrading and that was part [br]of my particular little experiment
0:26:27.295,0:26:30.164 to see how I could--
0:26:30.164,0:26:31.843 [Ramsay] Your particular little experiment?!
0:26:31.843,0:26:33.053 Why don't you tell me about that.
0:26:33.053,0:26:35.534 [Eshleman] Yes, I was running...[br]I was running a little experiment of my own.
0:26:35.534,0:26:37.941 [Ramsay] Tell me about your [br]little experiments, I'm curious.
0:26:37.941,0:26:43.062 [Eshleman] I wanted to see what [br]kind of verbal abuse that people can take
0:26:43.062,0:26:47.183 before they start objecting, [br]before they start lashing back.
0:26:47.183,0:26:49.256 [Eshleman] If I have any regret, right now,
0:26:49.256,0:26:52.776 it's that I made that decision, [br]because it would've been interesting
0:26:52.776,0:26:58.964 to see what would have happened [br]had I not decided to force things.
0:27:00.959,0:27:05.645 It could be that I only accelerated them, [br]that the same things would've happened.
0:27:05.645,0:27:08.221 But we'll never know.
0:27:08.221,0:27:11.859 [Narrator] If the extreme nature of [br]Dave Eshleman's behavior tested the prisoners,
0:27:11.859,0:27:14.791 it also presented the other [br]guards with the choice:
0:27:14.791,0:27:16.837 to intervene, or not.
0:27:16.837,0:27:19.248 [Eshleman] It surprised me [br]that no one said anything to stop me.
0:27:19.248,0:27:20.640 They just accepted what I'd say.
0:27:20.640,0:27:22.726 And no one questioned [br]my authority at all.
0:27:22.726,0:27:27.429 And it really shocked me, why didn't [br]people say when I started to get so abusive?
0:27:27.429,0:27:32.658 I started to get so profane, [br]and still people didn't say anything.
0:27:34.357,0:27:38.374 [Zimbardo] There were a few guards who[br]hated to see the prisoners suffer,
0:27:38.374,0:27:41.562 they never did anything that [br]would be demeaning of the prisoners.
0:27:41.562,0:27:45.434 The interesting thing is, none [br]of the good guards ever intervened
0:27:45.434,0:27:50.383 in the behavior of the guards who gradually [br]became more and more sadistical over time.
0:27:51.893,0:27:56.288 We like to think there is this core of human [br]nature that good people can't do bad things.
0:27:56.288,0:28:00.877 And that good people will [br]dominate over bad situations.
0:28:00.877,0:28:03.772 In fact, one way to look [br]at this prison study is that
0:28:03.772,0:28:06.088 we put good people in an evil place
0:28:06.088,0:28:07.718 and we saw who won.
0:28:07.718,0:28:11.936 And the sad message in this case is, [br]the evil place won over the good people.
0:28:14.318,0:28:17.285 [Eshleman] It did show some very [br]interesting and maybe some unpleasant things
0:28:17.285,0:28:20.091 about human behavior.
0:28:20.091,0:28:24.113 It seems like every century, [br]every decade that we go through,
0:28:24.113,0:28:29.893 we're suffering the same kind of [br]atrocities and you need to understand
0:28:29.893,0:28:30.873 why these things happen,
0:28:30.873,0:28:33.769 you need to understand [br]why people behave like this.