1 00:00:00,000 --> 00:00:06,960 Why would a woman obey phone commands from a stranger to strip search an innocent employee? 2 00:00:08,080 --> 00:00:11,800 She pretty much got this victim trapped in the office, totally naked. 3 00:00:12,160 --> 00:00:15,440 But unless you're put in that situation, how do you know what you would do? 4 00:00:15,780 --> 00:00:16,360 You don't. 5 00:00:16,880 --> 00:00:17,200 Guilty. 6 00:00:17,560 --> 00:00:18,160 Guilty, sir. 7 00:00:18,300 --> 00:00:18,540 Guilty. 8 00:00:18,880 --> 00:00:19,240 Guilty. 9 00:00:19,620 --> 00:00:24,800 Why would four young men watch their friend die when they could have intervened to save him? 10 00:00:25,160 --> 00:00:26,000 Matt stopped breathing. 11 00:00:26,940 --> 00:00:28,220 This guy is in real trouble. 12 00:00:28,880 --> 00:00:30,060 You call 911. 13 00:00:30,260 --> 00:00:32,280 I had it typed into my phone. 14 00:00:32,540 --> 00:00:34,280 All I had to do was press the green button. 15 00:00:34,520 --> 00:00:36,460 I hit the red button and canceled it out. 16 00:00:36,620 --> 00:00:38,980 When good people do nothing, evil prevails. 17 00:00:42,640 --> 00:00:47,400 How could soldiers with good service records suddenly descend into barbaric behavior? 18 00:00:48,060 --> 00:00:51,240 Sometimes you cross a line, and it's a thin line. 19 00:00:51,240 --> 00:00:54,980 At any time, that can be crossed by anybody if placed in certain conditions. 20 00:00:54,980 --> 00:01:01,380 The answer to these questions can be found in the human behavior experiments. 21 00:01:01,380 --> 00:01:02,360 Hands off the door. 22 00:01:02,480 --> 00:01:04,040 Hey, I don't want anybody left. 23 00:01:04,040 --> 00:01:35,840 You want to take your bed and your clothes and present?
25 00:01:35,840 --> 00:01:40,600 In a unique period from the early 60s to the early 70s, a group of social scientists conducted 26 00:01:40,600 --> 00:01:46,120 a series of experiments examining the nature of human behavior and its relationship to social 27 00:01:46,120 --> 00:01:47,760 conventions and situations. 28 00:01:47,760 --> 00:01:53,700 In this setting, I allow things to be done to me that I wouldn't allow in any other context. 29 00:01:53,700 --> 00:01:57,240 A dentist is about to put an electric drill into my mouth. 30 00:01:57,240 --> 00:02:17,040 Stanley Milgram, one of the most influential social psychologists of the time, was particularly fascinated with the dangers of group behavior and blind obedience to authority. 31 00:02:17,040 --> 00:02:32,040 What is there in human nature that allows an individual to act without any restraints whatsoever so that he can act inhumanely, harshly, severely, and in no way limited by feelings of compassion or conscience? 32 00:02:32,040 --> 00:02:34,040 These are questions that concern me. 33 00:02:34,040 --> 00:02:36,040 But he might be dead in there. 35 00:02:36,040 --> 00:02:39,040 The experiment requires that you continue teaching. 36 00:02:39,040 --> 00:02:41,040 330 volts. 37 00:02:41,040 --> 00:02:48,040 The experiments that Milgram and others conducted were controversial, and for ethical reasons 38 00:02:48,040 --> 00:02:50,040 may never be conducted again. 39 00:02:50,040 --> 00:03:03,040 Yet, the results of those experiments remain groundbreaking, profoundly revealing about the tensions between the individual and society, and increasingly relevant to contemporary life. 40 00:03:07,040 --> 00:03:13,040 In 1962, Stanley Milgram shocked the world with his study on obedience. 41 00:03:13,040 --> 00:03:19,040 To test his theories, he invented an electronic box that would become a window into human cruelty.
42 00:03:19,040 --> 00:03:28,040 In ascending order, a row of buttons marked the amount of voltage one person would inflict upon another. 43 00:03:31,040 --> 00:03:36,040 Milgram's original motive for the experiment was to understand the unthinkable. 44 00:03:36,040 --> 00:03:40,040 How the German people could permit the extermination of the Jews. 45 00:03:40,040 --> 00:03:51,040 When I learn of incidents such as the massacre of millions of men, women, and children perpetrated by the Nazis in World War II, 46 00:03:51,040 --> 00:03:57,040 how is it possible, I ask myself, that ordinary people, who are courteous and decent in everyday life, 47 00:03:57,040 --> 00:04:02,040 can act callously, inhumanely, without any limitations of conscience? 48 00:04:02,040 --> 00:04:09,040 Now, there are some studies in my discipline, social psychology, that seem to provide a clue to this question. 49 00:04:13,040 --> 00:04:16,040 The problem I wanted to study was a little different; it went a little bit further. 50 00:04:16,040 --> 00:04:18,040 It was the issue of authority. 51 00:04:18,040 --> 00:04:23,040 Under what conditions would a person obey authority who commanded actions that went against conscience? 52 00:04:23,040 --> 00:04:27,040 These are exactly the questions that I wanted to investigate at Yale University. 53 00:04:30,040 --> 00:04:32,040 It is May 1962. 54 00:04:32,040 --> 00:04:37,040 An experiment is being conducted in the elegant interaction laboratory at Yale University. 55 00:04:38,040 --> 00:04:43,040 The subjects are 40 males between the ages of 20 and 50 residing in the greater New Haven area. 56 00:04:44,040 --> 00:04:47,040 Psychologists have developed several theories to explain how people learn. 57 00:04:47,040 --> 00:04:51,040 One theory is that people learn things correctly whenever they get punished for making a mistake.
58 00:04:52,040 --> 00:04:58,040 Forty years later, Milgram's infamous experiment, Obedience, is still taught in classrooms around the world. 59 00:04:59,040 --> 00:05:02,040 Would you open those and tell me which of you is which, please? 60 00:05:05,040 --> 00:05:09,040 All right, now the next thing we'll have to do is set the learner up so that he can get some sort of punishment. 61 00:05:10,040 --> 00:05:12,040 What inspired Milgram? I would say there were a number of factors. 62 00:05:12,040 --> 00:05:14,040 One of them is he was very ambitious. 63 00:05:14,040 --> 00:05:16,040 He wanted to make a mark in social psychology. 64 00:05:17,040 --> 00:05:21,040 And he wanted, as he wrote to one friend, he wanted to come up with the most, 65 00:05:21,040 --> 00:05:23,040 the boldest experiment that he could think of. 66 00:05:24,040 --> 00:05:25,040 Would you roll up your right sleeve, please? 67 00:05:28,040 --> 00:05:31,040 This electrode is connected to the shock generator in the next room. 68 00:05:32,040 --> 00:05:36,040 And this electrode paste is to provide a good contact to avoid any blister or burn. 69 00:05:37,040 --> 00:05:40,040 Do you have any questions now before we go into the next room? 70 00:05:40,040 --> 00:05:43,040 About two years ago, I was in the Veterans Hospital in West Haven. 71 00:05:44,040 --> 00:05:46,040 And while there, they detected a heart condition. 72 00:05:47,040 --> 00:05:49,040 There's nothing serious, but as long as I'm having these shocks, 73 00:05:50,040 --> 00:05:52,040 how strong are they? How dangerous are they? 74 00:05:53,040 --> 00:05:55,040 Well, no, although they may be painful, they're not dangerous. 75 00:05:56,040 --> 00:05:57,040 Anything else? 76 00:05:58,040 --> 00:05:59,040 No, that's all. 77 00:06:00,040 --> 00:06:04,040 All right, teacher, would you take the test and be seated in front of the shock generator, please, in the next room? 78 00:06:05,040 --> 00:06:06,040 But the experiment was rigged.
79 00:06:06,040 --> 00:06:09,040 The victim was an accomplice of the experimenter. 80 00:06:10,040 --> 00:06:13,040 The victim, according to plan, provided many wrong answers. 81 00:06:14,040 --> 00:06:20,040 His verbal responses were standardized on tape, and each protest was coordinated to a particular voltage level on the shock generator. 82 00:06:20,040 --> 00:06:26,040 Now, as teacher, you are seated in front of this impressive-looking instrument, the shock generator. 83 00:06:27,040 --> 00:06:37,040 Its essential feature is a line of switches that goes from 15 volts to 450 volts, and a set of verbal designations that goes from slight shock to moderate shock. 84 00:06:37,040 --> 00:06:47,040 Strong shock, very strong shock, intense shock, extreme intensity shock, and finally, XXX, danger-severe shock. 85 00:06:48,040 --> 00:06:52,040 Your job, the experimenter explains to you, is to teach the learner a simple word pair test. 86 00:06:52,040 --> 00:06:55,040 If he gets each answer correctly, fine, you move on to the next pair. 87 00:06:56,040 --> 00:07:07,040 But if he makes a mistake, you are instructed to give him an electric shock, starting with 15 volts, and you increase the shock one step on each error. 88 00:07:07,040 --> 00:07:11,040 Incorrect. You'll now get a shock of 105 volts. 89 00:07:14,040 --> 00:07:15,040 Hard head. 90 00:07:17,040 --> 00:07:19,040 Just how far can you go on this thing? 91 00:07:20,040 --> 00:07:21,040 As far as it's necessary. 92 00:07:22,040 --> 00:07:23,040 What do you mean, as far as it's necessary? 93 00:07:24,040 --> 00:07:25,040 Complete the test. 94 00:07:26,040 --> 00:07:30,040 Milgram was very much aware that obedience is a necessary ingredient for society to function. 95 00:07:31,040 --> 00:07:33,040 But he focused on the darker side of obedience. 96 00:07:34,040 --> 00:07:35,040 Incorrect. 150 volts. 97 00:07:35,040 --> 00:07:36,040 Ah! 98 00:07:38,040 --> 00:07:39,040 Sad face.
99 00:07:39,040 --> 00:07:40,040 That's all. Get me out of here. 100 00:07:41,040 --> 00:07:44,040 I told you I had heart trouble. My heart's starting to bother me now. 101 00:07:45,040 --> 00:07:46,040 It's absolutely essential that you continue. 102 00:07:47,040 --> 00:07:48,040 You had no other choice, teacher. 103 00:07:49,040 --> 00:07:50,040 Oh, I had a lot of choices. 104 00:07:51,040 --> 00:07:54,040 My number one choice is that I wouldn't go on if I thought he was being harmed. 105 00:07:55,040 --> 00:07:59,040 Now, this man makes disobedience seem a very rational and simple deed. 106 00:08:00,040 --> 00:08:03,040 Now, other subjects respond quite differently to the experimenter's authority. 107 00:08:04,040 --> 00:08:06,040 Wrong. It's hair. 108 00:08:07,040 --> 00:08:08,040 75 volts, Jim. 109 00:08:09,040 --> 00:08:10,040 Oh! 110 00:08:11,040 --> 00:08:12,040 Please continue. 111 00:08:13,040 --> 00:08:15,040 Some psychologists were troubled by the ethics of it. 112 00:08:15,040 --> 00:08:20,040 Many, if not most, subjects found it a highly stressful, conflicted experience. 113 00:08:21,040 --> 00:08:25,040 People are stammering, stuttering, laughing hysterically, inappropriately. 114 00:08:26,040 --> 00:08:27,040 150 volts. 115 00:08:28,040 --> 00:08:29,040 Oh! 116 00:08:30,040 --> 00:08:32,040 Experimenter. That's all. Get me out of here. 117 00:08:33,040 --> 00:08:34,040 Please quit. 118 00:08:34,040 --> 00:08:35,040 Oh, my heart's starting to bother me now. 119 00:08:36,040 --> 00:08:37,040 Get me out of here, please. 120 00:08:38,040 --> 00:08:40,040 Let me out of here. You have no right to keep me here. Let me out. 121 00:08:41,040 --> 00:08:42,040 Let me out of here. Let me out. 122 00:08:42,040 --> 00:08:43,040 Continue, please. 123 00:08:43,040 --> 00:08:53,040 Clearly, you know, when we say people went to the top of the shock board, it wasn't like they were going blithely, sadistically. 
124 00:08:54,040 --> 00:08:55,040 People went, stop and go, stop and go. 125 00:08:56,040 --> 00:08:59,040 They were in a state of conflict, which created a tremendous amount of stress. 126 00:08:59,040 --> 00:09:00,040 So that was the main critique. 127 00:09:00,040 --> 00:09:02,040 This will be 330 volts. 128 00:09:03,040 --> 00:09:04,040 Wow! 129 00:09:07,040 --> 00:09:13,040 As his voice began to show increasing frustration, so did I. 130 00:09:14,040 --> 00:09:19,040 And I was really in a state of real conflict and agitation. 131 00:09:20,040 --> 00:09:28,040 One of Stanley Milgram's basic contributions was that you don't ask people what they would do given this hypothetical situation. 132 00:09:29,040 --> 00:09:31,040 You put them in the situation. 133 00:09:32,040 --> 00:09:33,040 Wrong. 134 00:09:35,040 --> 00:09:36,040 I want the 180 volts. 135 00:09:37,040 --> 00:09:38,040 Please continue, teacher. 136 00:09:39,040 --> 00:09:40,040 180 volts. 137 00:09:42,040 --> 00:09:44,040 I can't stand the pain. Let me out of here. 138 00:09:44,040 --> 00:09:46,040 I can't stand it. I'm not going to kill that man in there. 139 00:09:47,040 --> 00:09:58,040 According to Milgram, one of the things that's a prerequisite for carrying out acts that are evil is to shed responsibility from your shoulders and hand it over to the person in charge. 140 00:09:58,040 --> 00:10:01,040 I mean, who's going to take the responsibility if anything happens to that gentleman? 141 00:10:01,040 --> 00:10:03,040 I'm responsible for anything that happens here. 142 00:10:04,040 --> 00:10:05,040 Continue, please. 143 00:10:06,040 --> 00:10:07,040 All right, next one. Slow. 144 00:10:08,040 --> 00:10:09,040 I didn't hold any gun to anybody's head. 145 00:10:10,040 --> 00:10:12,040 Just the fact that he conveyed a sense of authority. 146 00:10:13,040 --> 00:10:18,040 Roughly 60, 65% of the people went all the way to the top of the shock board. 147 00:10:18,040 --> 00:10:19,040 450 volts.
148 00:10:20,040 --> 00:10:21,040 That's it. 149 00:10:22,040 --> 00:10:26,040 Now continue using the last switch on the board, please. The 450 switch for each wrong answer. Continue, please. 150 00:10:27,040 --> 00:10:29,040 I'm not getting no answer. Don't the man's health mean anything? 151 00:10:30,040 --> 00:10:31,040 Whether the learner likes it or not, we must... 152 00:10:31,040 --> 00:10:32,040 But he might be dead in there. 153 00:10:32,040 --> 00:10:46,040 Milgram made the point, I think very effectively, that the Nazis weren't all a bunch of psychopaths at Belsen and Dachau, that you could staff a death camp from the middle class in New Haven. 154 00:10:46,040 --> 00:10:47,040 Who was actually pushing the switch? 155 00:10:48,040 --> 00:10:52,040 I was, but he kept insisting. I told him no, but he said, you've got to keep going. 156 00:10:53,040 --> 00:10:56,040 What kind of obedience would Milgram get today, if he were to do the experiment today? 157 00:10:57,040 --> 00:10:58,040 Probably about the same. 158 00:10:59,040 --> 00:11:00,040 Probably about the same. Why? 159 00:11:01,040 --> 00:11:03,040 I don't know. I think people are just inherently obedient. 160 00:11:04,040 --> 00:11:09,040 It just really shows how far human beings will go to appease what they perceive to be an authority figure. 161 00:11:10,040 --> 00:11:14,040 Milgram has identified one of the constants, one of the universals of social behavior. 162 00:11:14,040 --> 00:11:17,040 The readiness to obey authority cuts across time. It's a constant. 163 00:11:18,040 --> 00:11:25,040 The other outstanding and distinctive thing about the obedience experiment is how much it has and keeps on permeating contemporary culture and thought. 164 00:11:26,040 --> 00:11:28,040 It's still with us in very, very important ways. 165 00:11:33,040 --> 00:11:38,040 A series of strange events recently confirmed Milgram's theories about obedience. 
166 00:11:38,040 --> 00:11:50,040 Targeting fast food restaurants across the country, a con man telephoned restaurant managers and convinced them to strip search and sometimes sexually abuse their employees. 167 00:11:51,040 --> 00:11:56,040 The mystery is not in the con man, but in the victims. Why would they obey? 168 00:11:56,040 --> 00:12:10,040 This person was so convincing. People saw him as a legitimate authority. I think we have a, um, probably the closest thing that we have to the Milgram experiment today in the, uh, in these strip searches. 169 00:12:10,040 --> 00:12:16,040 The most famous of these incidents took place at a McDonald's in Mount Washington, Kentucky. 170 00:12:17,040 --> 00:12:20,040 There was a video security camera that had filmed it. 171 00:12:21,040 --> 00:12:33,040 We didn't hear what the instructions were, but due to the actions that were, uh, had taken place, what the victim was doing in the video and stuff, it was pretty evident what each instruction was. 172 00:12:33,040 --> 00:12:42,040 An anonymous caller pretending to be a police officer told the assistant manager that an employee had stolen some money. 173 00:12:43,040 --> 00:12:53,040 He said, I'm Officer Scott. And he said, I'm with the police department. I'm investigating a complaint. 174 00:12:53,040 --> 00:12:59,040 It went directly from a theft into a drug thing. So I was asked to search her clothing. 175 00:13:02,040 --> 00:13:06,040 You know, he would tell me, take her shoes, click them, take her shirt, shake it out. 176 00:13:06,040 --> 00:13:23,040 I know how it seems to people, but you weren't on the phone with him. The man has convinced 70 to 100 other places, the very same thing. He's very good at what he does. Very good. 177 00:13:23,040 --> 00:13:37,040 He sounded like a police officer. And, um, I'm thinking, okay, you know, I'm doing what I'm supposed to do.
178 00:13:38,040 --> 00:13:48,040 He was getting some kind of satisfaction by being an authoritative figure and telling people what to do and then realizing by the phone conversation that they were actually doing what he said. 179 00:13:48,040 --> 00:13:55,040 He's telling me that I needed to get someone to sit with her while he goes and gets somebody to come in to pick her up. 180 00:13:56,040 --> 00:14:09,040 The caller then asked the manager if she was married or had a boyfriend. She said that she had a fiancé. Then the caller asked if she could have her fiancé come to the restaurant and assist her with the strip search of the victim. 181 00:14:09,040 --> 00:14:18,040 He says, well, why don't you have him come up and sit there? I mean, you can trust him. So, um, I called Wes, my fiancé. We were going to get married and asked him if he would come up. 182 00:14:18,040 --> 00:14:39,040 The manager goes about doing her duties of running the restaurant and leaves the fiancé there in the office and then the caller starts giving instructions over the phone of things that he wants the victim to do and what he wants the fiancé to tell her to do. 183 00:14:39,040 --> 00:14:50,040 Have her remove her apron and instruct her to do jumping jacks and jog in place and several more things. 184 00:14:50,040 --> 00:15:15,040 She was still in high school. The kind of person she was, she was actually graduating the top ten in her class. And she was scared of being in trouble with the police. So she sort of just went along and did whatever the fiancé told her to do because she didn't want to be in trouble for anything. 185 00:15:15,040 --> 00:15:32,040 During all this time, I'm working. I'm running the floor. I'm getting change. And then when I would walk into the office to get the change or whatever I had to get, Wes would be sitting where he was when I left. And she was sitting where she was. And no one said anything. 
186 00:15:32,040 --> 00:15:48,040 After over two and a half hours, Summers' fiancé Walter Nix did something that was unthinkable. Complying with the instructions of the caller, he ordered the employee to perform a sexual act. 187 00:15:48,040 --> 00:16:06,040 There's no way that I could take away from what happened to her. A lot of people, you know, look at you and go, well, you're, you know, you're a nut. You should be strung up. I've had it even said to me. But it's really hard because you weren't there. 188 00:16:06,040 --> 00:16:23,040 The Milgram study showed us that most people would do that. If you structure the environment such that, you know, you provide all the authority and, you know, the commands, just anybody might do this. 189 00:16:23,040 --> 00:16:27,040 But I do think this sounds worse. You think this is worse than what Milgram did? 190 00:16:27,040 --> 00:16:40,040 With Milgram, there was somebody, like, right, sitting right there and instructing them. If they hesitated, they could turn and then somebody could encourage them and, and they could sort of maybe psychologically leave that responsibility on that other person. 191 00:16:40,040 --> 00:16:44,040 But in this case, the police officer's on the phone. He's not standing there. 192 00:16:44,040 --> 00:16:47,040 Exactly. It's a very good point. 193 00:16:47,040 --> 00:16:58,040 You know, you look back on it and you say, I wouldn't have done it. But unless you're put in that situation at that time, how do you know what you would do? You don't. You don't. 194 00:16:58,040 --> 00:17:08,040 Over 60 other people did exactly as Donna Summers did. Why is it so easy for us to obey orders even when we know they are wrong? 195 00:17:08,040 --> 00:17:13,040 Why are we willing to inflict pain on others if someone else takes responsibility? 196 00:17:13,040 --> 00:17:20,040 There's nothing more difficult for people than to violate a social structure which all participants have initially accepted.
197 00:17:20,040 --> 00:17:24,040 It reminds me of a situation that once occurred in South America. I was in an airplane. 198 00:17:24,040 --> 00:17:28,040 The pilot came into the plane. He was drunk. He was reeling toward the cockpit. 199 00:17:28,040 --> 00:17:34,040 The passengers looked at each other, but no one got up. No one said to the pilot, you're drunk, we can't fly in this plane. 200 00:17:34,040 --> 00:17:39,040 There are a set of pressures that keep you in the role that you've initially accepted. 201 00:17:39,040 --> 00:17:49,040 What pressures would keep a friend from calling for help even when it was a matter of life and death? 202 00:17:49,040 --> 00:17:58,040 In 2005, four fraternity brothers watched and did nothing to help as their close friend, 21-year-old Matthew Carrington, died in front of them. 203 00:17:58,040 --> 00:18:05,040 I will live with the consequences for the rest of my life. My actions killed a good person. 204 00:18:05,040 --> 00:18:10,040 Nothing I can say here today will bring back Matthew Carrington or lessen the grief that his family feels. 205 00:18:10,040 --> 00:18:17,040 His death was preventable. And I will live with the guilt for the rest of my life. 206 00:18:17,040 --> 00:18:20,040 Why did these four boys do nothing? 207 00:18:20,040 --> 00:18:30,040 Every time I think about it, the feelings rush back and the idea of what if just stands in the corner, just not leaving. 208 00:18:30,040 --> 00:18:33,040 It's always there. 209 00:18:33,040 --> 00:18:38,040 I had no doubt that if I would have known what I know now that I could have stopped it. 210 00:18:38,040 --> 00:18:48,040 This story is not unique, but it raises a question. Is there something in human nature that can keep us from helping? 211 00:18:48,040 --> 00:18:55,040 In 1964, 38 New Yorkers watched through their windows as one of their neighbors was brutally murdered. 212 00:18:55,040 --> 00:19:01,040 Her name was Kitty Genovese, a 28-year-old woman.
213 00:19:01,040 --> 00:19:14,040 The Genovese incident where a young woman coming home late at night from her work was assaulted by somebody who was one of those random, crazy people. 214 00:19:14,040 --> 00:19:27,040 Kitty was running up the block and Winston Moseley ran after her until she reached the midpoint of the block, almost directly under this streetlight. 215 00:19:29,040 --> 00:19:35,040 Moseley caught up with her and stabbed her four times in the back. 216 00:19:35,040 --> 00:19:44,040 Her screams were loud, unmistakable, and reverberated throughout the entire area. 217 00:19:46,040 --> 00:19:53,040 The lights went on in the windows around the courtyard so we know that people were seeing this. 218 00:19:53,040 --> 00:19:55,040 Nobody called the police. 219 00:19:55,040 --> 00:20:05,040 Somebody who lived on the seventh floor opened his window and yelled out, what's going on down there? 220 00:20:06,040 --> 00:20:12,040 When Moseley heard somebody yelling out, he ran back to his car. Kitty was still alive. 221 00:20:12,040 --> 00:20:17,040 She managed to get up. She staggers around the corner here, still screaming. 222 00:20:17,040 --> 00:20:22,040 People in that building heard her as well. And she collapses inside this hallway. 223 00:20:22,040 --> 00:20:29,040 There's one apartment above there. It was occupied by Carl Ross. 224 00:20:30,040 --> 00:20:38,040 Carl opened his door at the time that Moseley returns and he saw the second attack taking place. 225 00:20:39,040 --> 00:20:41,040 And he did nothing. 226 00:20:42,040 --> 00:20:48,040 After stabbing Kitty another eight times in this very hallway, the killer ran away, leaving Kitty to bleed to death. 227 00:20:48,040 --> 00:20:54,040 Eventually, a neighbor called the police, but it was too late. 228 00:20:54,040 --> 00:20:57,040 Kitty died before the ambulance could get her to the hospital. 229 00:20:58,040 --> 00:21:00,040 That shocked the city.
230 00:21:00,040 --> 00:21:05,040 Now, it's not that a person got murdered that shocked the city. 231 00:21:05,040 --> 00:21:15,040 That happens, sadly. It's that a person got murdered and her neighbors watched and nobody did anything. 232 00:21:15,040 --> 00:21:21,040 Bibb Latané and I, we read about the murder as did everybody else. 233 00:21:22,040 --> 00:21:26,040 Here we were two young social psychologists starting our research careers. 234 00:21:27,040 --> 00:21:31,040 We knew about Stanley Milgram's set of experiments on obedience to authority. 235 00:21:31,040 --> 00:21:39,040 And we started to think about, in an offhand way, what could have produced the Genovese effect. 236 00:21:39,040 --> 00:21:44,040 Perhaps Kitty Genovese might have been alive today if fewer people had seen her. 237 00:21:45,040 --> 00:21:52,040 There were perhaps 38 people who could have responded, but each were looking to see what these other people were doing. 238 00:21:52,040 --> 00:22:06,040 We decided to try to create a relatively ambiguous situation in which we could see how people responded. 239 00:22:07,040 --> 00:22:13,040 We thought that one kind of thing that comes up that's often hard to tell whether it's a real emergency or not has to do with fire. 240 00:22:13,040 --> 00:22:29,040 You see smoke coming through the vent, and it is ambiguous. What do you do? 241 00:22:30,040 --> 00:22:35,040 Hey, there's smoke coming out from under the door in that room where I was filling out the questionnaire. 242 00:22:36,040 --> 00:22:40,040 Almost everybody does that if they face the smoke alone. 243 00:22:40,040 --> 00:22:44,040 Now let's have you face the smoke with two strangers. 244 00:22:53,040 --> 00:22:59,040 One person can be seen glancing at the other. The other is continuing to fill out the questionnaire. 245 00:23:00,040 --> 00:23:04,040 It's getting a little more smoky in the room, but nonetheless you stay in the room.
246 00:23:04,040 --> 00:23:12,040 By and large, people surrounded by people who react as if there's nothing wrong, don't respond. 247 00:23:13,040 --> 00:23:22,040 Everybody sees the other people not reacting, so they create a definition of the situation. No emergency. 248 00:23:22,040 --> 00:23:32,040 To test their theories about how groups and individuals respond differently to a crisis, Darley and Latané conducted a second experiment. 249 00:23:33,040 --> 00:23:35,040 This time, the emergency was clearly defined. 250 00:23:35,040 --> 00:23:39,040 First of all, I would like to thank the two of you for being here today to help out in the study. 251 00:23:40,040 --> 00:23:45,040 In this experiment, one student was asked to communicate via intercom with another student down the hall. 252 00:23:45,040 --> 00:24:01,040 What sounded like a real seizure in the subject's headphones was just a tape recording of an actor playing a role for the experiment. 253 00:24:01,040 --> 00:24:20,040 If you knew there was nobody else but you to help, you got up, you opened the door of your room, and you headed off to find the person. 254 00:24:20,040 --> 00:24:32,040 On the other hand, if there were three or four other people present who you heard, 255 00:24:33,040 --> 00:24:36,040 I would like to thank the three of you for being here today to help us with the study. 256 00:24:37,040 --> 00:24:41,040 You are much less likely to respond yourself. 257 00:24:41,040 --> 00:24:49,040 Somebody give me a little help here, because I'm having a real problem right now. Help me out. 258 00:24:50,040 --> 00:25:01,040 The responsibility any individual feels for helping is diffused when there are other people who could also help. 259 00:25:02,040 --> 00:25:09,040 So what can we say back to the bystanders in the Genovese situation? 260 00:25:09,040 --> 00:25:13,040 The first thing we can say, I think, is they got a bum rap.
261 00:25:14,040 --> 00:25:18,040 They were reacting the way that you or me might react in those situations. 262 00:25:19,040 --> 00:25:24,040 There have been many incidents, like the Genovese incident since then, 263 00:25:25,040 --> 00:25:30,040 and there have been many incidents in which people who could help, don't help. 264 00:25:31,040 --> 00:25:35,040 The children are not professional actors. They agreed to participate in an unusual study. 265 00:25:35,040 --> 00:25:40,040 Wired with microphones and filmed from an observation point high above New York's 6th Avenue. 266 00:25:41,040 --> 00:25:43,040 They will be set out to ask a simple question. 267 00:25:44,040 --> 00:25:45,040 Will you help me? 268 00:25:46,040 --> 00:25:48,040 Excuse me, I'm lost. Can you help me, please? 269 00:25:49,040 --> 00:25:51,040 Excuse me, I'm lost. Can you help me, please? 270 00:25:52,040 --> 00:25:54,040 Excuse me, I'm lost. Can you help me, please? 271 00:25:55,040 --> 00:25:57,040 Excuse me, I want to call my mother, please. 272 00:25:58,040 --> 00:26:00,040 What? Only call my mother, can you help me? 273 00:26:00,040 --> 00:26:14,040 We are all creatures of socialization, and socialization is what teaches us to pay attention to what other people do when they're responding to a situation. 274 00:26:15,040 --> 00:26:27,040 I can remember when I first heard about the study, Psych 1. 275 00:26:28,040 --> 00:26:30,040 I was like, wow, that is ridiculous. 276 00:26:30,040 --> 00:26:43,040 How can someone see something happening that they know is wrong, that they know the person standing next to them knows is wrong, but not take action? 277 00:26:44,040 --> 00:26:49,040 It's sickening to know that I took part in it, that I could have just been the one that stood up. 278 00:26:49,040 --> 00:27:00,040 A makeshift memorial of flowers and candles is placed outside the Chi Tau fraternity house for Matthew Carrington.
279 00:27:01,040 --> 00:27:10,040 Police say the 21-year-old Chico State student was in the basement of this house, taking part in a fraternity event at 5 a.m. Wednesday morning, when his body gave out. 280 00:27:13,040 --> 00:27:14,040 Matt didn't have to die that night. 281 00:27:14,040 --> 00:27:18,040 It could have all been so different. 282 00:27:19,040 --> 00:27:24,040 It could have all been from the very beginning when they were all down there, when there was a room full of guys. 283 00:27:25,040 --> 00:27:27,040 It went wrong before they all left. 284 00:27:28,040 --> 00:27:35,040 Matthew Carrington joined a fraternity and was undergoing hazing during the spring semester of his junior year. 285 00:27:35,040 --> 00:27:56,040 He wanted to join because he would get to meet people. He's kind of shy, you know, networking and such when you're older. I mean, you've got brothers and houses in every college all over the country, so. 286 00:27:56,040 --> 00:28:11,040 They did some pretty silly things. More embarrassing than causing anybody any harm. Like wearing a miniskirt out in the intersection or switching your t-shirt with a homeless guy and putting his shirt on. 287 00:28:12,040 --> 00:28:15,040 But it was nothing that was going to get anybody hurt. 288 00:28:15,040 --> 00:28:25,040 How did these seemingly harmless pranks escalate to the point where Matthew died? 289 00:28:25,040 --> 00:28:34,040 Basically, it was the third night of what the fraternity called Inspiration Week, what the pledges call Hell Week. 290 00:28:35,040 --> 00:28:45,040 The pledges, Mike and Matt, were brought into the basement and the first thing they did was undergo some grueling physical calisthenics. 291 00:28:45,040 --> 00:29:02,040 The young men were then given a five-gallon water jug, which in itself weighed about 42 pounds, and were told to stand up on a narrow bench, stand on one foot, and to drink as much as they possibly could.
292 00:29:03,040 --> 00:29:13,040 Matthew at some point became nauseated, vomited, became increasingly confused. The kidneys can only handle so much water, and indeed you can poison yourself. 293 00:29:13,040 --> 00:29:25,040 When you're drinking water and you're acting drunk, okay, something's not right. You know? When you're slurring your words, when you're, you know, you can't manipulate things like you normally can, something's wrong. 294 00:29:26,040 --> 00:29:40,040 Gabriel Maestretti and J.P. Fickes came in at some point, and they were both intoxicated. Maestretti was excessively intoxicated, and they basically took over. 295 00:29:40,040 --> 00:29:51,040 Actually, to tell you the truth, I don't remember most of it. Unfortunately, I was pretty intoxicated when it happened. I remember making him do push-ups. I don't remember why. 296 00:29:51,040 --> 00:30:01,040 Matt was at a point where he couldn't do any more push-ups. He just all of a sudden dropped and his, it just seemed like his whole body just tensed up. 297 00:30:01,040 --> 00:30:15,040 You've got, well, at this point, four boys down there. It just makes me sick that they didn't think. They didn't think something's wrong. So why can't someone say stop? 298 00:30:15,040 --> 00:30:31,040 What could happen is if one person says, this guy is in real trouble. You call 911. You do this. You do that. Everybody will, by that definition, start to react, will be helpful. 299 00:30:31,040 --> 00:30:38,040 The thing is balanced on a knife edge, but sometimes it falls and nothing happens. 300 00:30:38,040 --> 00:30:51,040 His hips moved a little bit and he just seized up. And Mike said, oh my God, I think he bit his tongue. And then he said, somebody needs to call an ambulance. 301 00:30:51,040 --> 00:31:01,040 I was turning on my cell phone when I was walking down the stairs and was typing in 911 when I saw Mike at the bottom of the stairs. I had it typed into my phone. 
302 00:31:01,040 --> 00:31:08,040 All I had to do was press the green button. And he said, it's okay. You don't need to call 911. Matt's just sleeping. 303 00:31:08,040 --> 00:31:14,040 You know, I hit, I hit the red button and canceled it out. And he was snoring. It just sounded like he was snoring. 304 00:31:14,040 --> 00:31:20,040 I remember that thoroughly. I remember the sound of him snoring. I remember thinking, no, he's sleeping. 305 00:31:21,040 --> 00:31:34,040 The snoring was certainly not sleeping. It would have been a result of water intoxication and a pulmonary edema, which is basically the lungs filling with fluid. 306 00:31:34,040 --> 00:31:40,040 Approximately an hour after he had been left to sleep it off, he was not breathing. 307 00:31:40,040 --> 00:31:46,040 You do nothing for an hour while they have him lay there. Then they realize he's not breathing. 308 00:31:46,040 --> 00:31:55,040 Then all of a sudden it's like, call 911. Well, God, at this point they do. But at this point now, it's too late. 309 00:31:55,040 --> 00:32:01,040 Matthew was pronounced dead approximately 27 minutes after arrival in the emergency department. 310 00:32:01,040 --> 00:32:14,040 When we got there, they took me and Debbie in the back and we were still hoping that when they pulled that sheet over his head, it was going to be another kid, not yours. 311 00:32:14,040 --> 00:32:24,040 Just as bad as that sounds, there was just a chance it wasn't our son. And as soon as they pulled the sheet up and you seen his hairdo, you knew it was him. 312 00:32:24,040 --> 00:32:47,040 The four ringleaders in the fraternity hazing and torture death of 21-year-old Chico State student Matthew Carrington accepted responsibility. 313 00:32:47,040 --> 00:32:56,040 All four, some through tears, pleading guilty. Guilty. Guilty, sir. Guilty. Guilty. 314 00:32:56,040 --> 00:33:03,040 All four were given jail time. 
Most culpable, 22-year-old Gabriel Maestretti, sentenced to a year in jail for involuntary manslaughter. 315 00:33:03,040 --> 00:33:10,040 25-year-old Jerry Lim and 19-year-old John Fickes sentenced to six months as accessories to manslaughter. 316 00:33:10,040 --> 00:33:16,040 Matt trusted them to help him out if he was going to get in trouble, because he wasn't worried about getting into trouble 317 00:33:16,040 --> 00:33:25,040 with just drinking water. When it was time for help, they didn't step up, and he didn't get any help. 318 00:33:28,040 --> 00:33:38,040 Nearly a year after Matthew's death, Debbie visits three of the four fraternity brothers who remain in jail, serving time for involuntary manslaughter. 319 00:33:38,040 --> 00:33:51,040 I don't think they're bad kids. I think they just made bad choices, and it's been a terrible, terrible mistake, you know, that we're all going to live with for the rest of our lives. 320 00:33:51,040 --> 00:33:56,040 It's just so hard. 321 00:33:56,040 --> 00:34:06,040 I start all my days crying, because I just miss him so much. 322 00:34:06,040 --> 00:34:21,040 And I think of all of you guys, I think of the pain that you must be feeling, having to live with it. 323 00:34:21,040 --> 00:34:35,040 For like the whole year, it was just one day being played over and over, that night, being played over and over again in my head. 324 00:34:35,040 --> 00:34:42,040 I find it hard to forgive myself. I don't know. It's like the only thing that makes me feel better is to like hate myself. 325 00:34:42,040 --> 00:34:52,040 It's sad in itself that you want, you want retribution just so that it can be over with. 326 00:34:55,040 --> 00:35:04,040 What it's hard for us to realize is the power that situations have over us to cause us to act in certain ways. 327 00:35:04,040 --> 00:35:10,040 It was not the case that they had been horrible, moral failures. 
328 00:35:11,040 --> 00:35:18,040 It's the case that they're like the rest of us, caught up in situations, influenced by the situations, reacting. 329 00:35:22,040 --> 00:35:31,040 I believe that there's all different kinds of people and that a certain kind of people take charge in situations. 330 00:35:31,040 --> 00:35:37,040 Unfortunately for Matt, none of us were the type of person who took charge and told people what to do. 331 00:35:38,040 --> 00:35:44,040 We just found ourselves looking at each other, waiting for someone to step up, and nobody did. 332 00:35:51,040 --> 00:35:55,040 One of the illusions about human behavior is that it stems entirely from personality or character. 333 00:35:55,040 --> 00:36:01,040 But social psychology shows us that often behavior is dominated by the social roles we're asked to play. 334 00:36:02,040 --> 00:36:07,040 This point is driven home with particular force by a study carried out by Professor Philip Zimbardo at Stanford University. 335 00:36:08,040 --> 00:36:16,040 Professor Zimbardo created a prison situation in which ordinary people were asked to play the role either of prisoner or of warden. 336 00:36:17,040 --> 00:36:18,040 Then he observed what happened. 337 00:36:18,040 --> 00:36:26,040 In 1971, in the basement of the psychology department of Stanford University, a mock prison was created. 338 00:36:27,040 --> 00:36:30,040 It rivaled all social psychology experiments in controversy. 339 00:36:31,040 --> 00:36:34,040 Shortly after I finished this Stanford prison study, Milgram embraced me and said, 340 00:36:35,040 --> 00:36:36,040 I'm so happy that you did this. 341 00:36:37,040 --> 00:36:44,040 He said, because now you can take off some of the heat that he's had to bear alone of having done the most unethical study. 342 00:36:44,040 --> 00:36:52,040 Although this experiment is over 30 years old, its enduring power has been underscored by the events at Abu Ghraib. 
343 00:36:53,040 --> 00:36:57,040 When we got to Abu Ghraib, it was eerie. 344 00:36:58,040 --> 00:37:02,040 People were being told to rough up Iraqis that wouldn't cooperate. 345 00:37:02,040 --> 00:37:07,040 I mean, they're torturing, they're abusing detainees. 346 00:37:08,040 --> 00:37:13,040 You're looking at the situation thinking, they've condoned this, but why? 347 00:37:14,040 --> 00:37:21,040 And if it wouldn't have been for those photos, no one would have ever believed what was going on over there. 348 00:37:21,040 --> 00:37:32,040 When I first saw the pictures, immediately a sense of familiarity struck me because I knew that I had been there before. 349 00:37:33,040 --> 00:37:36,040 I'd been in this type of situation. I knew what was going on in my mind. 350 00:37:39,040 --> 00:37:45,040 The photographs were strikingly familiar to the photographs that we had taken, many of the photographs I had taken in the prison study. 351 00:37:45,040 --> 00:37:51,040 We didn't do any of the stuff that you see in Abu Ghraib where they, you know, get into a big pile or something like that. 352 00:37:52,040 --> 00:37:55,040 But I certainly subjected them to all kinds of humiliations. 353 00:37:56,040 --> 00:38:01,040 I don't know where I would have stopped myself. Given enough time, we could have got there. 354 00:38:01,040 --> 00:38:12,040 When the images of the abuse and torture at Abu Ghraib were revealed, immediately the military went on the defensive saying, it's a few bad apples. 355 00:38:13,040 --> 00:38:17,040 When we see somebody doing bad things, we assume they were bad people to begin with. 356 00:38:18,040 --> 00:38:27,040 But what we know in our study is there are a set of social psychological variables that can make ordinary people do things they never could imagine doing. 357 00:38:27,040 --> 00:38:33,040 At Abu Ghraib, ordinary people perpetrated extraordinary abuses. 
358 00:38:34,040 --> 00:38:38,040 To understand why, it helps to reach back to the lessons of Zimbardo's experiment. 359 00:38:39,040 --> 00:38:42,040 How people respond to a cruel environment without clear rules. 360 00:38:42,040 --> 00:38:54,040 I think he and everybody else who came down into that situation got caught up into that situation. 361 00:38:55,040 --> 00:38:58,040 And the sense that this was an experiment, that began to fade away. 362 00:38:59,040 --> 00:39:02,040 It became just life. 363 00:39:03,040 --> 00:39:08,040 We frankly didn't anticipate what was going to happen. 364 00:39:08,040 --> 00:39:14,040 We tried to really test the power of the environment to change and transform otherwise normal people. 365 00:39:15,040 --> 00:39:22,040 Much as Milgram had changed or transformed otherwise normal people in an obedient situation, we wanted to do it in a prison-like situation. 366 00:39:23,040 --> 00:39:26,040 Over 70 men volunteered for Zimbardo's experiment. 367 00:39:26,040 --> 00:39:31,040 And they completed a battery of psychological tests. 368 00:39:32,040 --> 00:39:35,040 We picked two dozen, 24 who were the most normal and most healthy. 369 00:39:36,040 --> 00:39:38,040 Half are going to be guards, half are going to be prisoners. 370 00:39:39,040 --> 00:39:40,040 And it's like flipping a coin. 371 00:39:40,040 --> 00:39:41,040 Heads, this one's a guard; tails, this one's a prisoner. 372 00:39:42,040 --> 00:39:47,040 So at the beginning, there's no difference in the kinds of people who are in your two groups. 373 00:39:47,040 --> 00:39:56,040 When we were given our jobs as guards, we were issued a uniform, which was a plain, sort of khaki or lighter-colored uniform. 374 00:39:57,040 --> 00:40:02,040 And then we gave them the symbols of power, handcuffs, a whistle, a big billy club. 375 00:40:03,040 --> 00:40:07,040 And then the other thing we gave them were silver-reflecting sunglasses. 
376 00:40:07,040 --> 00:40:11,040 When you have mirror sunglasses on, then nobody can see your eyeballs. 377 00:40:12,040 --> 00:40:16,040 I think that any time you put on what essentially is a mask and you mask your identity, 378 00:40:17,040 --> 00:40:22,040 then it allows you to behave in ways that you would not behave if you didn't have the mask on. 379 00:40:23,040 --> 00:40:28,040 To make it more realistic, I had arranged with the Palo Alto Police Department to make mock arrests. 380 00:40:33,040 --> 00:40:35,040 When I was arrested, it was a surprise to me. 381 00:40:35,040 --> 00:40:38,040 I didn't think I was going to be brought to an actual police station. 382 00:40:39,040 --> 00:40:41,040 I didn't think I was going to go through a booking process. 383 00:40:42,040 --> 00:40:45,040 The guards then put a blindfold on them, stripped them naked, 384 00:40:46,040 --> 00:40:49,040 and then they put them in dresses, smocks with no underpants. 385 00:40:50,040 --> 00:40:53,040 Each had a number that replaced their name. 386 00:40:54,040 --> 00:40:56,040 They had to know the number. They could only be referred to by that number. 387 00:40:57,040 --> 00:41:02,040 And they had a chain on one foot, which was put there to remind them at all times of their loss of freedom. 388 00:41:02,040 --> 00:41:06,040 So all of these things produce a sense of being dehumanized. 389 00:41:09,040 --> 00:41:13,040 On the first day, I said, this is not going to work. 390 00:41:14,040 --> 00:41:16,040 I mean, the guards felt awkward giving orders. 391 00:41:17,040 --> 00:41:20,040 And they'd say, okay, line up, repeat your numbers, and the prisoners start giggling. 392 00:41:20,040 --> 00:41:23,040 Hey, I don't want anybody laughing. Three, two, one. 393 00:41:24,040 --> 00:41:25,040 And then a very interesting thing happened. 
394 00:41:25,040 --> 00:41:29,040 Dave Eshelman, who the prisoners named John Wayne, like he's a Wild West cowboy, 395 00:41:30,040 --> 00:41:31,040 he begins to be more extreme. 396 00:41:31,040 --> 00:41:42,040 I decided that I would become the worst, most intimidating, cruel prison guard that I could possibly be. 397 00:41:43,040 --> 00:41:45,040 Have I seen the picture that you work where you told? 398 00:41:46,040 --> 00:41:47,040 Thanks, sir. Say it again. 399 00:41:48,040 --> 00:41:49,040 Bless you, sir. 400 00:41:49,040 --> 00:41:50,040 Say bless you. 401 00:41:51,040 --> 00:41:52,040 Bless you, sir. 402 00:41:53,040 --> 00:41:57,040 I was sort of fascinated myself that people were believing the act. 403 00:41:58,040 --> 00:42:04,040 And I was trying to see how far I could take it before somebody would say, okay, that's enough. Stop. 404 00:42:05,040 --> 00:42:06,040 We did have to do things like push-ups. 405 00:42:07,040 --> 00:42:08,040 We would have to sing things. 406 00:42:09,040 --> 00:42:11,040 At the beginning, we protested some of the actions. 407 00:42:12,040 --> 00:42:14,040 We did things to irritate the guards. 408 00:42:14,040 --> 00:42:20,040 So the guards' authority was challenged right off the bat. 409 00:42:21,040 --> 00:42:22,040 And the guards had to decide how they were going to handle that. 410 00:42:23,040 --> 00:42:24,040 And they had to decide it without our input. 411 00:42:25,040 --> 00:42:28,040 I mean, again, this was not a Milgram study in which we were standing over them telling them what to do. 412 00:42:29,040 --> 00:42:33,040 And they began to see the prisoners' behavior as a kind of an affront to their authority. 413 00:42:34,040 --> 00:42:35,040 And they began to push back. 414 00:42:36,040 --> 00:42:39,040 We would ramp up the general harassment, just sort of crank it up a bit. 415 00:42:40,040 --> 00:42:42,040 Nobody was telling me I shouldn't be doing this. 
416 00:42:42,040 --> 00:42:44,040 The professor is the authority here. 417 00:42:44,040 --> 00:42:45,040 You know, he's the prison warden. 418 00:42:46,040 --> 00:42:47,040 He's not stopping me. 419 00:42:48,040 --> 00:42:49,040 This is unbelievable. 420 00:42:50,040 --> 00:42:51,040 They took our clothes. 421 00:42:52,040 --> 00:42:53,040 Hands off the door. 422 00:42:54,040 --> 00:42:58,040 There was the first evening, a kind of rebellion that took place. 423 00:42:59,040 --> 00:43:00,040 The prisoners rebelled. 424 00:43:01,040 --> 00:43:04,040 They barricaded themselves in their cells and said, we refused to come out. 425 00:43:05,040 --> 00:43:06,040 They took off their numbers. 426 00:43:06,040 --> 00:43:08,040 They didn't want to be de-individuated. 427 00:43:09,040 --> 00:43:10,040 They started cursing the guards to their face. 428 00:43:10,040 --> 00:43:16,040 And the key, the key turning point was the guards began to think of them as dangerous prisoners. 429 00:43:17,040 --> 00:43:22,040 And so the guards formulated a plan to use fire extinguishers. 430 00:43:23,040 --> 00:43:30,040 Took the doors down, dragged the prisoners out, stripped them naked, and essentially broke the rebellion in a purely physical way. 431 00:43:30,040 --> 00:43:39,040 From that point on, the study was as remarkable a series of events as I have ever seen. 432 00:43:40,040 --> 00:43:52,040 It was a real laboratory for Zimbardo and I to watch human nature transformed in a very rapid way in the face of a very powerful situation. 433 00:43:52,040 --> 00:43:56,040 People really suffered. I mean, guards did terrible things to the prisoners. 434 00:43:57,040 --> 00:44:00,040 They punished them by putting them in solitary confinement, which was a small closet. 435 00:44:01,040 --> 00:44:06,040 You could squat or stand, but, you know, you couldn't sit. And it was dark and dank, actually. 
436 00:44:06,040 --> 00:44:12,040 Every hour, every day, there's a teeny little bit more of an increment. 437 00:44:13,040 --> 00:44:19,040 And they're stepping up, taunting the prisoners. They're stepping up the counts, not letting them sleep. They're stepping up. 438 00:44:19,040 --> 00:44:24,040 I don't think from one minute to the next, the people who are in it see the change and see the difference. 439 00:44:24,040 --> 00:44:36,040 And then the next key thing happened, besides the rebellion: Prisoner 8612. He was the first one to have an emotional breakdown. 440 00:44:37,040 --> 00:44:42,040 I feel really fucked up inside. You don't know. I gotta go. To a doctor, anything. 441 00:44:43,040 --> 00:44:50,040 I mean, Jesus Christ, I'm burning up inside. Don't you know? I'm fucked up. I don't know how to explain it. I'm all fucked up inside. 442 00:44:50,040 --> 00:44:53,040 I want out! I want out now! 443 00:44:55,040 --> 00:44:59,040 At the time, if you had questioned me about the effect I was having, I would say, 444 00:45:00,040 --> 00:45:04,040 Well, you know, they must be a wimp. They're weak or they're faking. 445 00:45:05,040 --> 00:45:09,040 Because I wouldn't have believed that what I was doing could actually cause somebody to have a nervous breakdown. 446 00:45:10,040 --> 00:45:13,040 It was just us sort of getting our jollies with it. 447 00:45:14,040 --> 00:45:17,040 You know, let's be like puppeteers here. Let's make these people do things. 448 00:45:17,040 --> 00:45:21,040 What if I told you to get down on that floor and fuck the floor? What would you do then? 449 00:45:22,040 --> 00:45:25,040 The guards now began to escalate their use of power. 450 00:45:26,040 --> 00:45:28,040 Some of them had prisoners clean out toilet bowls with their bare hands. 451 00:45:29,040 --> 00:45:36,040 They now taunt, humiliate, degrade the prisoners in front of each other, and they exert arbitrary control over the prisoners. 
452 00:45:36,040 --> 00:45:45,040 They keep thinking of more and more unusual things to do. And very soon after the fourth day, things begin to turn sexual. 453 00:45:46,040 --> 00:45:53,040 You be the bride of Frankenstein, and you be Frankenstein. I want you to walk over here like Frankenstein and tell her that you love her. 454 00:45:53,040 --> 00:46:01,040 If you want to fully sort of humiliate somebody, then you want to get them in those things where their biggest fears are. 455 00:46:02,040 --> 00:46:07,040 And a lot of us have a lot of sexual hang-ups, and so that was part of that effort to humiliate them even further. 456 00:46:07,040 --> 00:46:24,040 The guards knew that had the coin come up heads rather than tails, they would have had the dress on rather than the uniform on. They knew that. 457 00:46:24,040 --> 00:46:30,040 So they certainly knew that the prisoners who were being mistreated had done nothing wrong to deserve the mistreatment. 458 00:46:31,040 --> 00:46:41,040 And yet, the roles themselves were so powerful, and the environment itself was so powerful, that they ended up punishing those prisoners as though they had done something wrong. 459 00:46:42,040 --> 00:46:44,040 Prisoner 819 did a bad thing! 460 00:46:45,040 --> 00:46:47,040 We were told to chant something about how he was a bad prisoner. 461 00:46:47,040 --> 00:46:53,040 And at the time I went along with it, I'm thinking, what does this matter? We don't believe this, but we can go along and chant it. 462 00:46:54,040 --> 00:46:58,040 Because of what Prisoner 819 did, my cell is the best! 463 00:46:59,040 --> 00:47:03,040 Because of what Prisoner 819 did, my cell is the best! 464 00:47:04,040 --> 00:47:08,040 That night, he had a breakdown. Every day after that, another prisoner broke down in a similar way. 465 00:47:09,040 --> 00:47:14,040 I mean, extreme stress reaction. We released another one on Tuesday, Wednesday, Thursday. 
466 00:47:14,040 --> 00:47:19,040 Nobody who was in that study could deny that the prisoner breakdowns were genuine. 467 00:47:20,040 --> 00:47:27,040 They were scary to see. They were upsetting to us. They were unexpected, but they were very clearly the real thing. 468 00:47:28,040 --> 00:47:32,040 At some level, we understood that something was happening that we were no longer in control of. 469 00:47:33,040 --> 00:47:36,040 It was damaging people. We didn't quite have a grasp on what to do about it. 470 00:47:36,040 --> 00:47:49,040 One of the mistakes we made was that we hadn't built in time to step back and to look at what was happening and call it what it was, which was mistreatment. 471 00:47:49,040 --> 00:47:52,040 We were caught up in the events that were taking place. 472 00:47:53,040 --> 00:47:58,040 Well, you can keep your blankets and 416 will stay in there another day. 473 00:47:59,040 --> 00:48:05,040 We got three, guess one. Keep your blankets, 416, you're going to be in there for a while. 474 00:48:06,040 --> 00:48:07,040 So just get used to it. 475 00:48:07,040 --> 00:48:15,040 On the fifth day of the study, Zimbardo invited his girlfriend, recent psychology graduate Christina Maslach, to visit the mock prison. 476 00:48:16,040 --> 00:48:21,040 I had heard bits and pieces from Phil about what was going on. 477 00:48:22,040 --> 00:48:26,040 And then when I was down there that evening, it really was kind of a, wow. 478 00:48:27,040 --> 00:48:32,040 The thing that really got to me was when some of the guards took the prisoners down the hall to the men's room. 479 00:48:32,040 --> 00:48:42,040 She looks out and sees a line of prisoners with paper bags over their heads, each one holding the other one's shoulder. 480 00:48:43,040 --> 00:48:44,040 And they're leading them down the hall. 481 00:48:45,040 --> 00:48:47,040 And Phil comes over and says, look, look, you know, my God, look at that. 
482 00:48:48,040 --> 00:48:52,040 And I looked up and something about it just, you know, again, it was the dehumanizing, demeaning kind of treatment. 483 00:48:53,040 --> 00:48:54,040 I just, I couldn't watch it. 484 00:48:55,040 --> 00:48:57,040 And she said, it's terrible what you're doing to those boys. 485 00:48:58,040 --> 00:48:59,040 And she got tears in her eyes. I said, what? 486 00:48:59,040 --> 00:49:06,040 And she runs out. And I'm furious. I run outside, and we have this big argument. 487 00:49:06,040 --> 00:49:11,040 I'm saying, look, this is the dynamics of human behavior. Look, it's fascinating, the power of the situation and all. 488 00:49:12,040 --> 00:49:16,040 So I'm giving her all the psychological basis, and, what kind of psychologist are you? You don't appreciate this. 489 00:49:17,040 --> 00:49:25,040 And she said, I don't understand. You're a stranger to me. I don't understand this. How could you not see what I see? 490 00:49:25,040 --> 00:49:29,040 I mean, you know, you're a caring, compassionate person. I know you from all these other things. Something's gone wrong here. 491 00:49:30,040 --> 00:49:39,040 And then the next thing she said, which had an equally big impact, is, you know, I'm not sure I want to, you know, have anything to do with you if this is the real you. 492 00:49:39,040 --> 00:49:51,040 And that was like a slap in the face, because what she was saying is, you've changed. You know, the power of the situation has transformed you from the person I thought I knew to this person that I don't know. 493 00:49:52,040 --> 00:49:54,040 And at that moment, I said, wow, you're right. We got to end it. 494 00:49:55,040 --> 00:49:59,040 After only six days, Dr. Zimbardo shut down his experiment. 495 00:49:59,040 --> 00:50:06,040 What makes this study interesting, and what makes the Milgram study interesting, it's really about the transformation of human character. 
496 00:50:07,040 --> 00:50:10,040 People can be seduced into doing things they never thought they could. 497 00:50:11,040 --> 00:50:14,040 There are interesting parallels that are coming up now with Abu Ghraib. 498 00:50:14,040 --> 00:50:29,040 At Abu Ghraib, standard operating procedures were changed. Normally, military guards are supposed to protect prisoners. Suddenly, they were asked to soften them up for interrogators. 499 00:50:32,040 --> 00:50:41,040 We were never trained to be prison guards. The higher-ups said, use your imagination. Break them. We want them broke by the time we get back. 500 00:50:41,040 --> 00:50:47,040 As soon as we'd have prisoners come in, sandbags instantly over their head. 501 00:50:48,040 --> 00:50:51,040 They would flexi-cuff them, throw them down to the ground. Some would be stripped. 502 00:50:52,040 --> 00:50:57,040 It was told to all of us, they're nothing but dogs. So you start feeding that picture to people. 503 00:50:58,040 --> 00:51:01,040 Then all of a sudden, you start looking at these people as less than human. 504 00:51:02,040 --> 00:51:06,040 And you start doing things to them you would never dream of. And that's where it got scary. 505 00:51:06,040 --> 00:51:14,040 Tier 1A was where a lot of the stuff started happening. And that's the tier Specialist Graner was in charge of. 506 00:51:15,040 --> 00:51:20,040 One evening, after he got off of his shift, he was hoarse. I said, Graner, are you getting sick? 507 00:51:21,040 --> 00:51:23,040 And he goes, no. And I said, well, what's going on? 508 00:51:23,040 --> 00:51:29,040 And he says, well, I'm having to yell and do things to detainees that I feel are morally and ethically wrong. 509 00:51:30,040 --> 00:51:32,040 What do you think I should do? I said, then don't do them. 510 00:51:33,040 --> 00:51:35,040 And he goes, I don't have a choice. And I said, what do you mean? 
511 00:51:36,040 --> 00:51:43,040 He says, well, every time a bomb goes off outside the wire or outside the fence, they come in and they tell me that's another American losing their life. 512 00:51:43,040 --> 00:51:47,040 And unless you help us, their blood's on your hands as well. 513 00:51:56,040 --> 00:52:02,040 So early on in October, what I saw whenever I walked up to the tier was two soldiers that I had no idea who they were. 514 00:52:02,040 --> 00:52:13,040 They had two naked detainees handcuffed to prison cells. They were telling him to confess, confess, confess. He would swat him on the behind with a water bottle. 515 00:52:14,040 --> 00:52:20,040 So then after they did that, they handcuffed them together in what appeared to be a sexual position. 516 00:52:21,040 --> 00:52:27,040 I've never been trained in an interrogation, but I definitely didn't think that this is the way interrogation should be. 517 00:52:27,040 --> 00:52:36,040 And so I reported it to my lieutenant, basically, you know, telling him military intelligence is doing some pretty weird things with naked detainees. 518 00:52:37,040 --> 00:52:38,040 And he seemed not to care. 519 00:52:40,040 --> 00:52:43,040 Whenever there's a system, there are perpetrators of evil. 520 00:52:44,040 --> 00:52:47,040 There's people who do bad things, like in Abu Ghraib, the guards. 521 00:52:48,040 --> 00:52:52,040 But the top administration gives permission, either implicitly or explicitly. 522 00:52:52,040 --> 00:52:59,040 They didn't say put him in a pyramid, but they gave them general permission to do whatever they had to do to get confessions. 523 00:53:02,040 --> 00:53:07,040 They may well be given missions in connection with this overall task, strategy. 524 00:53:08,040 --> 00:53:12,040 We also have to work, though, sort of the dark side, if you will. We're going to spend time in the shadows. 
525 00:53:12,040 --> 00:53:26,040 I can't think of a worse thing for somebody who is in charge of an environment like that, or in charge of people who work in an environment like that, to say it's time to take the gloves off and go to the dark side. 526 00:53:27,040 --> 00:53:32,040 Those kinds of institutional environments create pressures on people to head to the dark side anyway. 527 00:53:32,040 --> 00:53:37,040 I mean, we learned this in the prison study. Those environments elicit the worst from good people. 528 00:53:42,040 --> 00:53:50,040 My guess is that 99.999% of our armed forces behave admirably at all times, but it's like the rest of society. 529 00:53:51,040 --> 00:53:55,040 There will be a few bad apples that will conduct themselves in ways that we're not proud of. 530 00:53:55,040 --> 00:54:05,040 Were there a few bad apples? No. What was bad was the barrel. Who made the barrel? The whole chain of command. 531 00:54:06,040 --> 00:54:10,040 I feel terrible about what happened to these Iraqi detainees. They were in U.S. custody. 532 00:54:11,040 --> 00:54:16,040 Our country had an obligation to treat them right, to treat them as human beings. 533 00:54:17,040 --> 00:54:20,040 We didn't do that. That was wrong. 534 00:54:20,040 --> 00:54:33,040 Prior to the Abu Ghraib scandal, Donald Rumsfeld had personally approved a menu of interrogation techniques, including dogs, stress positions, and nudity, that violated long-standing military rules. 535 00:54:34,040 --> 00:54:37,040 When you follow an order, you've got to be held accountable as well. 536 00:54:38,040 --> 00:54:45,040 But the ones that hold the key to that door, the ones that ask you to walk through that door, hold a higher accountability, because they know better. 537 00:54:45,040 --> 00:54:52,040 I know the situation very closely now, because I was an expert witness for one of those guards, Chip Frederick. 
538 00:54:53,040 --> 00:55:02,040 Exemplary soldier, nine medals, model father, husband, patriotic, normal, healthy, no sadistic tendencies. 539 00:55:03,040 --> 00:55:08,040 Nothing that would indicate he was anything other than an ordinary good guy. 540 00:55:08,040 --> 00:55:12,040 And he gets into this place, and he is totally corrupted. 541 00:55:15,040 --> 00:55:22,040 Sometimes you cross a line, and it's a thin line, at any time that can be crossed by anybody if placed in certain conditions. 542 00:55:22,040 --> 00:55:34,040 I think it's a hard conclusion from all of the research evidence to sort of say, there's nothing inherent in who you are that would necessarily say, I'm safe, I will never cross the line. 543 00:55:35,040 --> 00:55:45,040 That research was done 30-something years ago. This is not news, you know. The lessons that were learned, it's been in textbooks, it's been taught in psychology courses. 544 00:55:45,040 --> 00:55:50,040 Other research, Milgram, all of these other studies are pointing to those same conclusions. 545 00:55:51,040 --> 00:55:55,040 It is the rare person who is able to be in that situation and resist. 546 00:55:56,040 --> 00:56:02,040 It's the majority who conform, who comply, who obey authority, who do these things. 547 00:56:03,040 --> 00:56:07,040 And that's what nobody wants to hear. We want to all think we're heroes. If we were in that situation, we'd be different. 548 00:56:07,040 --> 00:56:20,040 Maybe that's true. But heroes are rare in any society. They are the exception. The rule is the majority. The rule is the base rate, is what the average person would do. 549 00:56:21,040 --> 00:56:28,040 And so the big message from the Stanford Prison Experiment, the big message from the Milgram Obedience Study, from many of these other studies, 550 00:56:28,040 --> 00:56:38,040 is that if you imagine yourself being a participant in the studies, you have to say, it's likely I would do what the majority did. 
551 00:56:39,040 --> 00:56:43,040 And I'm not that special. I'm an ordinary person. They were ordinary people. 558 00:57:11,040 --> 00:57:41,020 Transcription by CastingWords