This is a course about justice, and we begin with a story. Suppose you're the driver of a trolley car, and your trolley car is hurtling down the track at sixty miles an hour, and at the end of the track you notice five workers working on the track. You try to stop, but you can't; your brakes don't work. You feel desperate because you know that if you crash into these five workers, they will all die. Let's assume you know that for sure. And so you feel helpless, until you notice that there is, off to the right, a side track, and at the end of that track there's one worker working on the track. Your steering wheel works, so you can turn the trolley car, if you want to, onto the side track, killing the one but sparing the five.

Here's our first question: what's the right thing to do? What would you do? Let's take a poll. How many would turn the trolley car onto the side track? How many wouldn't? How many would go straight ahead? Keep your hands up, those of you who'd go straight ahead. A handful of people would; the vast majority would turn. Let's hear first. Now we need to begin to investigate the reasons why you think it's the right thing to do. Let's begin with those in the majority, who would turn onto the side track. Why would you do it? What would be your reason? Who's willing to volunteer a reason? Go ahead, stand up.

Because it can't be right to kill five people when you can only kill one person instead.

It wouldn't be right to kill five if you could kill one person instead. That's a good reason. That's a good reason. Who else? Does everybody agree with that reason? Go ahead.
Well, I was thinking it was the same reason as on 9/11: we regard the people who flew the plane into the Pennsylvania field as heroes, because they chose to kill the people on the plane and not kill more people in big buildings.

So the principle there was the same as on 9/11. It's a tragic circumstance, but better to kill one so that five can live. Is that the reason most of you have, those of you who would turn? Yes? Let's hear now from those in the minority, those who wouldn't turn.

Well, I think that's the same type of mentality that justifies genocide and totalitarianism: in order to save one type of race, you wipe out the other.

So what would you do in this case? To avoid the horrors of genocide, you would crash into the five and kill them?

Presumably, yes.

Okay, who else? That's a brave answer, thank you.

Let's consider another trolley car case and see whether those of you in the majority want to adhere to the principle: better that one should die so that five should live. This time you're not the driver of the trolley car, you're an onlooker, standing on a bridge overlooking a trolley car track, and down the track comes a trolley car; at the end of the track are five workers. The brakes don't work; the trolley car is about to careen into the five and kill them. And now you're not the driver; you really feel helpless, until you notice, standing next to you, leaning over the bridge, a very fat man. And you could give him a shove. He would fall over the bridge onto the track, right in the way of the trolley car. He would die, but he would spare the five. Now, how many would push the fat man over the bridge? Raise your hand. How many wouldn't?
Most people wouldn't. Here's the obvious question: what became of the principle, better to save five lives even if it means sacrificing one? What became of the principle that almost everyone endorsed in the first case? I need to hear from someone who was in the majority in both cases: how do you explain the difference between the two?

The second one, I guess, involves an active choice of pushing a person, and that person himself would otherwise not have been involved in the situation at all. And so to choose on his behalf, I guess, to involve him in something that he otherwise would have escaped is, I guess, more than what you have in the first case, where the three parties, the driver and the two sets of workers, are already, I guess, in the situation.

But the guy working, the one on the track off to the side, he didn't choose to sacrifice his life any more than the fat guy did, did he?

That's true, but he was on the tracks.

This guy was on the bridge. Go ahead, you can come back if you want. Alright, it's a hard question, but you did well, you did very well; it's a hard question. Who else can find a way of reconciling the reaction of the majority in these two cases? Yes?

Well, I guess in the first case, where you have the one worker and the five, it's a choice between those two, and you have to make a certain choice, and people are going to die because of the trolley car, not necessarily because of your direct actions. The trolley car is a runaway thing, and you need to make a split-second choice, whereas pushing the fat man over is an actual act of murder on your part. You have control over that, whereas you may not have control over the trolley car. So I think that it's a slightly different situation.

Alright, who has a reply? Is that... who has a reply to that? No, that was good. Who wants to reply? Is that a way out of this?
I don't think that's a very good reason, because either way you have to choose who dies. You either choose to turn and kill a person, which is an act of conscious thought to turn, or you choose to push the fat man over, which is also an active, conscious action. So either way you're making a choice.

Do you want to reply?

Well, I'm not really sure that that's the case. It just still seems kind of different, the act of actually pushing someone over onto the tracks and killing them. You are actually killing him yourself, you're pushing him with your own hands, and that's different than steering something that is going to cause death into another... you know, it doesn't really sound right saying it now when I'm up here.

No, that's good. What's your name?

Andrew.

Andrew, and let me ask you this question, Andrew. Suppose, standing on the bridge next to the fat man, I didn't have to push him. Suppose he was standing over a trap door that I could open by turning a steering wheel, like that. Would you turn it?

For some reason that still just seems more wrong. I mean, maybe if you just accidentally leaned into the steering wheel or something like that, or say that the car is hurtling toward a switch that will drop the trap, then I could agree with that.

Fair enough. It still seems wrong in a way that it doesn't seem wrong in the first case to turn, you say.

And in another way, I mean, in the first situation you're involved directly with the situation; in the second one you're an onlooker as well.

So you have the choice of becoming involved or not by pushing the fat man. Let's forget for the moment about this case. That's good. But let's imagine a different case.
This time you're a doctor in an emergency room, and six patients come to you. They've been in a terrible trolley car wreck. Five of them sustained moderate injuries, one is severely injured. You could spend all day caring for the one severely injured victim, but in that time the five would die. Or you could look after the five, restore them to health, but during that time the one severely injured person would die. How many would save the five, now as the doctor? How many would save the one? Very few people, just a handful of people. Same reason, I assume: one life versus five.

Now consider another doctor case. This time you're a transplant surgeon, and you have five patients, each in desperate need of an organ transplant in order to survive. One needs a heart, one a lung, one a kidney, one a liver, and the fifth a pancreas. And you have no organ donors. You are about to see them die, and then it occurs to you that in the next room there's a healthy guy who came in for a checkup. And he is... you like that... and he's taking a nap. You could go in very quietly, yank out the five organs; that person would die, but you could save the five. How many would do it? Anyone? How many? Put your hands up if you would do it. Anyone in the balcony? You would? Be careful, don't lean over too much. How many wouldn't? All right. What do you say? Speak up in the balcony, you who would yank out the organs. Why?

I'd actually like to explore a slightly alternate possibility: just taking the one of the five who needs an organ who dies first, and using their four healthy organs to save the other four.

That's a pretty good idea. That's a great idea, except for the fact that you just wrecked the philosophical point.
Let's step back from these stories and these arguments to notice a couple of things about the way the arguments have begun to unfold. Certain moral principles have already begun to emerge from the discussions we've had, and let's consider what those moral principles look like.

The first moral principle that emerged from the discussion said that the right thing to do, the moral thing to do, depends on the consequences that will result from your action: at the end of the day, better that five should live even if one must die. That's an example of consequentialist moral reasoning. Consequentialist moral reasoning locates morality in the consequences of an act, in the state of the world that will result from the thing you do.

But then we went a little further. We considered those other cases, and people weren't so sure about consequentialist moral reasoning. When people hesitated to push the fat man over the bridge or to yank out the organs of the innocent patient, people gestured toward reasons having to do with the intrinsic quality of the act itself, consequences be what they may.
People were reluctant. People thought it was just wrong, categorically wrong, to kill a person, an innocent person, even for the sake of saving five lives. At least these people thought that in the second version of each story we considered.

So this points to a second, categorical way of thinking about moral reasoning. Categorical moral reasoning locates morality in certain absolute moral requirements, in certain categorical duties and rights, regardless of the consequences.

We're going to explore, in the days and weeks to come, the contrast between consequentialist and categorical moral principles. The most influential example of consequentialist moral reasoning is utilitarianism, a doctrine invented by Jeremy Bentham, the eighteenth-century English political philosopher. The most important philosopher of categorical moral reasoning is the eighteenth-century German philosopher Immanuel Kant. So we will look at those two different modes of moral reasoning, assess them, and also consider others.

If you look at the syllabus, you'll notice that we read a number of great and famous books: books by Aristotle, John Locke, Immanuel Kant, John Stuart Mill, and others. You'll notice too from the syllabus that we don't only read these books; we also take up contemporary political and legal controversies that raise philosophical questions. We will debate equality and inequality, affirmative action, free speech versus hate speech, same-sex marriage, military conscription, a range of practical questions. Why? Not just to enliven these abstract and distant books, but to make clear, to bring out, what's at stake in our everyday lives, including our political lives, for philosophy.
So we will read these books, and we will debate these issues, and we'll see how each informs and illuminates the other. This may sound appealing enough, but here I have to issue a warning, and the warning is this: to read these books in this way, as an exercise in self-knowledge, to read them in this way carries certain risks, risks that are both personal and political, risks that every student of political philosophy has known. These risks spring from the fact that philosophy teaches us and unsettles us by confronting us with what we already know.

There's an irony: the difficulty of this course consists in the fact that it teaches what you already know. It works by taking what we know from familiar, unquestioned settings and making it strange. That's how those examples worked, the hypotheticals with which we began, with their mix of playfulness and sobriety. It's also how these philosophical books work. Philosophy estranges us from the familiar, not by supplying new information, but by inviting and provoking a new way of seeing.

But, and here's the risk, once the familiar turns strange, it's never quite the same again. Self-knowledge is like lost innocence: however unsettling you find it, it can never be unthought or unknown. What makes this enterprise difficult, but also riveting, is that moral and political philosophy is a story, and you don't know where the story will lead, but what you do know is that the story is about you.

Those are the personal risks. Now, what of the political risks?
One way of introducing a course like this would be to promise you that by reading these books and debating these issues, you will become a better, more responsible citizen. You will examine the presuppositions of public policy, you will hone your political judgment, you'll become a more effective participant in public affairs. But this would be a partial and misleading promise. Political philosophy, for the most part, hasn't worked that way. You have to allow for the possibility that political philosophy may make you a worse citizen rather than a better one, or at least a worse citizen before it makes you a better one. And that's because philosophy is a distancing, even debilitating, activity.

And you see this going back to Socrates. There's a dialogue, the Gorgias, in which one of Socrates' friends, Callicles, tries to talk him out of philosophizing. Callicles tells Socrates: philosophy is a pretty toy if one indulges in it with moderation at the right time of life, but if one pursues it further than one should, it is absolute ruin. Take my advice, Callicles says: abandon argument, learn the accomplishments of active life, take for your models not those people who spend their time on these petty quibbles, but those who have a good livelihood and reputation and many other blessings.

So Callicles is really saying to Socrates: quit philosophizing, get real, go to business school. And Callicles did have a point. He had a point, because philosophy distances us from conventions, from established assumptions, and from settled beliefs. Those are the risks, personal and political, and in the face of these risks there is a characteristic evasion. The name of the evasion is skepticism.
It's the idea, well, it goes something like this: we didn't resolve, once and for all, either the cases or the principles we were arguing about when we began, and if Aristotle and Locke and Kant and Mill haven't solved these questions after all of these years, who are we to think that we, here in Sanders Theatre, over the course of a semester, can resolve them? And so maybe it's just a matter of each person having his or her own principles, and there's nothing more to be said about it, no way of reasoning. That's the evasion, the evasion of skepticism, to which I would offer the following reply.

It's true, these questions have been debated for a very long time, but the very fact that they have recurred and persisted may suggest that though they're impossible in one sense, they're unavoidable in another. And the reason they're unavoidable, the reason they're inescapable, is that we live some answer to these questions every day. So skepticism, just throwing up our hands and giving up on moral reflection, is no solution. Immanuel Kant described very well the problem with skepticism when he wrote: skepticism is a resting place for human reason, where it can reflect upon its dogmatic wanderings, but it is no dwelling place for permanent settlement. Simply to acquiesce in skepticism, Kant wrote, can never suffice to overcome the restlessness of reason.

I've tried to suggest, through these stories and these arguments, some sense of the risks and temptations, of the perils and the possibilities. I would simply conclude by saying that the aim of this course is to awaken the restlessness of reason and to see where it might lead. Thank you very much.
Last time, we started out with some stories, with some moral dilemmas about trolley cars and about doctors and healthy patients vulnerable to being victims of organ transplantation. We noticed two things about the arguments we had. One had to do with the way we were arguing. It began with our judgments in particular cases. We tried to articulate the reasons or the principles lying behind our judgments, and then, confronted with a new case, we found ourselves re-examining those principles, revising each in the light of the other. And we noticed the built-in pressure to try to bring into alignment our judgments about particular cases and the principles we would endorse on reflection.

We also noticed something about the substance of the arguments that emerged from the discussion. We noticed that sometimes we were tempted to locate the morality of an act in the consequences, in the results, in the state of the world that it brought about. We called this consequentialist moral reasoning. But we also noticed that in some cases we weren't swayed only by the results. Sometimes, many of us felt, not just the consequences but also the intrinsic quality or character of the act matters morally. Some people argued that there are certain things that are just categorically wrong, even if they bring about a good result, even if they save five people at the cost of one life. So we contrasted consequentialist moral principles with categorical ones.
Today, and in the next few days, we will begin to examine one of the most influential versions of consequentialist moral theory, and that's the philosophy of utilitarianism. Jeremy Bentham, the eighteenth-century English political philosopher, gave the first clear, systematic expression to the utilitarian moral theory. And Bentham's idea, his essential idea, is a very simple one, with a lot of morally intuitive appeal. Bentham's idea is the following: the right thing to do, the just thing to do, is to maximize utility. What did he mean by utility? He meant by utility the balance of pleasure over pain, of happiness over suffering.

Here's how he arrived at the principle of maximizing utility. He started out by observing that all of us, all human beings, are governed by two sovereign masters, pain and pleasure. We human beings like pleasure and dislike pain, and so we should base morality, whether we are thinking of what to do in our own lives or whether, as legislators or citizens, we are thinking about what the law should be, on this: the right thing to do, individually or collectively, is to maximize, to act in a way that maximizes, the overall level of happiness. Bentham's utilitarianism is sometimes summed up with the slogan: the greatest good for the greatest number.

With this basic principle of utility on hand, let's begin to test it and to examine it by turning to another case, another story, but this time not a hypothetical story, a real-life story: the case of the Queen versus Dudley and Stephens.
This was a nineteenth-century British law case that's famous and much debated in law schools. Here's what happened in the case. I'll summarize the story, and then I want to hear how you would rule, imagining that you are the jury.

A newspaper account of the time described the background: a sadder story of disaster at sea was never told than that of the survivors of the yacht Mignonette. The ship foundered in the South Atlantic, thirteen hundred miles from the Cape. There were four in the crew: Dudley was the captain, Stephens was the first mate, Brooks was a sailor, all men of excellent character, or so the newspaper account tells us. The fourth crew member was the cabin boy, Richard Parker, seventeen years old. He was an orphan, he had no family, and he was on his first long voyage at sea. He went, the news account tells us, rather against the advice of his friends. He went in the hopefulness of youthful ambition, thinking the journey would make a man of him. Sadly, it was not to be.

The facts of the case were not in dispute. A wave hit the ship, and the Mignonette went down. The four crew members escaped to a lifeboat. The only food they had were two cans of preserved turnips, and no fresh water. For the first three days they ate nothing; on the fourth day they opened one of the cans of turnips and ate it. The next day they caught a turtle. Together with the other can of turnips, the turtle enabled them to subsist for the next few days, and then for eight days they had nothing, no food, no water. Imagine yourself in a situation like that. What would you do?
Here's what they did. By now the cabin boy, Parker, is lying at the bottom of the lifeboat in a corner, because he had drunk sea water against the advice of the others, and he had become ill, and he appeared to be dying. So on the nineteenth day, Dudley, the captain, suggested that they should all have a lottery, that they should all draw lots to see who would die to save the rest. Brooks refused. He didn't like the lottery idea. We don't know whether this was because he didn't want to take that chance or because he believed in categorical moral principles, but in any case, no lots were drawn.

The next day there was still no ship in sight, so Dudley told Brooks to avert his gaze, and he motioned to Stephens that the boy Parker had better be killed. Dudley offered a prayer, he told the boy his time had come, and he killed him with a pen knife, stabbing him in the jugular vein. Brooks emerged from his conscientious objection to share in the gruesome bounty. For four days the three of them fed on the body and blood of the cabin boy. True story. And then they were rescued. Dudley describes their rescue in his diary with staggering euphemism, quote: "on the twenty-fourth day, as we were having our breakfast, a ship appeared at last." The three survivors were picked up by a German ship. They were taken back to Falmouth, in England, where they were arrested and tried. Brooks turned state's witness; Dudley and Stephens went to trial.
They didn't dispute the facts. They claimed they had acted out of necessity; that was their defense. They argued, in effect, better that one should die so that three could survive. The prosecutor wasn't swayed by that argument. He said murder is murder, and so the case went to trial. Now imagine you are the jury, and just to simplify the discussion, put aside the question of law, and let's assume that you, as the jury, are charged with deciding whether what they did was morally permissible or not.

How many would vote not guilty, that what they did was morally permissible? And how many would vote guilty, that what they did was morally wrong? A pretty sizable majority. Now let's see what people's reasons are, and let me begin with those who are in the minority. Let's hear first from the defense of Dudley and Stephens. Why would you morally exonerate them? What are your reasons?

I think it is morally reprehensible, but I think that there's a distinction between what's morally reprehensible and what makes someone legally accountable. In other words, as the judge said, what's immoral isn't necessarily against the law, and while I don't think that necessity justifies theft or murder or any illegal act, at some point your degree of necessity does in fact exonerate you from any guilt.

Okay. Other defenders, other voices for the defense? Moral justifications for what they did? Yes, thank you.

I just feel like, in a situation that desperate, you have to do what you have to do to survive.

You have to do what you have to do?

Yeah, you gotta do what you gotta do, pretty much.
If you've been going nineteen days without any food, you know, someone just has to take the sacrifice, someone has to make sacrifices, and people can survive. And furthermore, let's say they survived, and then they become productive members of society who go home and start, like, a million charity organizations and this and that and this and that; I mean, they benefit everybody in the end. So, I mean, I don't know what they did afterwards, I mean they might have gone on and killed more people, but whatever.

What? What if they were going home and turned out to be assassins?

What if they were going home and turned out to be assassins?

You would want to know who they assassinated.

That's true too, that's fair. I would want to know who they assassinated.

Alright, that's good. What's your name? Marcus.

We've heard a defense, a couple of voices for the defense. Now we need to hear from the prosecution. Most people think what they did was wrong. Why?

One of the first things that I was thinking was, oh, well, if they haven't been eating for a really long time, maybe then they're mentally affected. That could be used for the defense, a possible argument that they weren't in a proper state of mind, they were making decisions that they otherwise wouldn't be making. And if that's an appealing argument, that you have to be in an altered mindset to do something like that, it suggests that people who find that argument convincing do think that they're acting immorally.

But I want to know what you think. You're defending... you voted to convict, right?

Yeah, I don't think that they acted in a morally appropriate way.

And why not? What do you say? Here's Marcus, he just defended them. He said... you heard what he said.

Yes, I did.

Yes. That you've got to do what you've got to do in a case like that. What do you say to Marcus?
They didn't... that there is no situation that would allow human beings to take the idea of fate, or other people's lives, into their own hands; that we don't have that kind of power.

Good, okay. Thank you. And what's your name?

Britt.

Britt? Okay. Who else? What do you say? Stand up.

I'm wondering if Dudley and Stephens had asked for Richard Parker's consent in, you know, dying, if that would... would that exonerate them from an act of murder, and if so, is that still morally justifiable?

That's interesting. Alright, consent. Now hang on, what's your name? Kathleen.

Kathleen says: suppose so. What would that scenario look like? So in the story, Dudley is there, pen knife in hand, but instead of the prayer, or before the prayer, he says: Parker, would you mind? We're desperately hungry, as Marcus empathizes with, we're desperately hungry, you're not going to last long anyhow. You can be a martyr. Would you be a martyr? How about it, Parker? Then, then what do you think? Would it be morally justified then? Suppose Parker, in his semi-stupor, says okay.

I don't think it would be morally justifiable, but I'm wondering...

Even then, even then it wouldn't be? No.

You don't think that even with consent it would be morally justified. Are there people who think, who want to take up Kathleen's consent idea, and who think that that would make it morally justified? Raise your hand if it would, if you think it would.

That's very interesting. Why would consent make a moral difference? Why would it?
Well, I just think that if it was his own original idea, and it was his idea to start with, then that would be the only situation in which I would see it being appropriate in any way, because that way you couldn't make the argument that he was pressured, you know, it's three to one, or whatever the ratio was. And I think that if he was making a decision to give his life, then he took on the agency to sacrifice himself, which some people might see as admirable and other people might disagree with that decision.

So if he came up with the idea, that's the only kind of consent we could have confidence in morally; then it would be okay. Otherwise it would be kind of coerced consent under the circumstances, you think.

Is there anyone who thinks that even the consent of Parker would not justify their killing him? Who thinks that? Yes, tell us why. Stand up.

I think that Parker would be killed with the hope that the other crew members would be rescued, so there's no definite reason that he should be killed, because you don't know when they're going to get rescued, so if you kill him you're killing him in vain. Do you keep killing a crew member until you're rescued, and then you're left with no one? Because someone's going to die eventually?

Well, the moral logic of the situation seems to be that: that they would keep on picking off the weakest, maybe, one by one, until they were rescued, and in this case, luckily, at least three were still alive. Now, if Parker did give his consent, would it be all right, do you think, or not?

No, it still wouldn't be right.

Tell us why it wouldn't be all right.

First of all, cannibalism, I believe, is morally incorrect, so you shouldn't be eating a human anyway.

So cannibalism is morally objectionable outside... so then even in the scenario of waiting until someone died, still it would be objectionable?
882 00:42:27,018 --> 00:42:27,929 Yes, to me personally, 883 00:42:27,929 --> 00:42:29,739 I feel like 884 00:42:29,739 --> 00:42:31,199 it all depends on 885 00:42:31,199 --> 00:42:35,288 one's personal morals, like, we can't just, like, this is just my opinion, 886 00:42:35,289 --> 00:42:39,339 of course other people are going to disagree. 887 00:42:39,338 --> 00:42:41,498 Well let's see, let's hear what their disagreements are 888 00:42:41,498 --> 00:42:42,639 and then we'll see 889 00:42:42,639 --> 00:42:44,259 if they have reasons 890 00:42:44,259 --> 00:42:46,228 that can persuade you or not. 891 00:42:46,228 --> 00:42:48,358 Let's try that. 892 00:42:48,358 --> 00:42:50,098 Let's, 893 00:42:50,099 --> 00:42:53,249 now, is there someone 894 00:42:53,248 --> 00:42:57,908 who can explain, those of you who are tempted by consent, 895 00:42:57,909 --> 00:42:59,778 can you explain 896 00:42:59,778 --> 00:43:02,028 why consent makes 897 00:43:02,028 --> 00:43:03,358 such a moral difference? 898 00:43:03,358 --> 00:43:05,650 What about the lottery idea, 899 00:43:05,650 --> 00:43:08,930 does that count as consent? Remember at the beginning 900 00:43:08,929 --> 00:43:11,308 Dudley proposed a lottery. 901 00:43:11,309 --> 00:43:13,839 Suppose that they had agreed 902 00:43:13,838 --> 00:43:16,338 to a lottery, 903 00:43:16,338 --> 00:43:17,369 then 904 00:43:17,369 --> 00:43:20,798 how many would then say 905 00:43:20,798 --> 00:43:23,929 it was all right. Say there was a lottery, 906 00:43:23,929 --> 00:43:25,380 cabin boy lost, 907 00:43:25,380 --> 00:43:32,380 and the rest of the story unfolded. How many people would say it's morally permissible? 908 00:43:33,199 --> 00:43:37,030 So the numbers are rising if we add a lottery, let's hear from one of you 909 00:43:37,030 --> 00:43:41,609 for whom the lottery would make a moral difference, 910 00:43:41,608 --> 00:43:43,458 why would it? 911 00:43:43,458 --> 00:43:44,739 I think the essential 912 00:43:44,739 --> 00:43:45,718 element, 913 00:43:45,719 --> 00:43:47,858 in my mind, that makes it a crime is 914 00:43:47,858 --> 00:43:53,848 the idea that they decided at some point that their lives were more important than his, and that, 915 00:43:53,849 --> 00:43:56,609 I mean, that's kind of the basis for really any crime, 916 00:43:56,608 --> 00:43:57,688 right? It's like, 917 00:43:57,688 --> 00:44:01,949 my needs, my desires are more important than yours and mine take precedence, 918 00:44:01,949 --> 00:44:04,798 and if they had done a lottery where everyone consented 919 00:44:04,798 --> 00:44:06,478 that someone should die, 920 00:44:06,478 --> 00:44:09,239 and it's sort of like they're all sacrificing themselves 921 00:44:09,239 --> 00:44:11,008 to save the rest. 922 00:44:11,009 --> 00:44:12,949 Then it would be all right? 923 00:44:12,949 --> 00:44:15,880 A little grotesque but, 924 00:44:15,880 --> 00:44:18,959 But morally permissible? Yes. 925 00:44:18,958 --> 00:44:22,688 What's your name? Matt. 926 00:44:22,688 --> 00:44:25,578 So, Matt, for you 927 00:44:25,579 --> 00:44:27,329 what bothers you is not 928 00:44:27,329 --> 00:44:31,389 the cannibalism, but the lack of due process.
929 00:44:31,389 --> 00:44:34,689 I guess you could say that. 930 00:44:34,688 --> 00:44:38,170 And can someone who agrees with Matt 931 00:44:38,170 --> 00:44:40,499 say a little bit more 932 00:44:40,498 --> 00:44:41,378 about why 933 00:44:41,378 --> 00:44:43,690 a lottery 934 00:44:43,690 --> 00:44:47,099 would make it, in your view, 935 00:44:47,099 --> 00:44:50,979 morally permissible. 936 00:44:50,978 --> 00:44:55,568 The way I understood it originally was that that was the whole issue, that the cabin boy was never 937 00:44:55,568 --> 00:44:56,398 consulted 938 00:44:56,398 --> 00:45:00,478 about whether or not something was going to happen to him, even though with the original 939 00:45:00,478 --> 00:45:01,088 lottery, 940 00:45:01,088 --> 00:45:04,420 whether or not he would be a part of that, it was just decided 941 00:45:04,420 --> 00:45:08,170 that he was the one that was going to die. Yes, that's what happened in the actual case, 942 00:45:08,170 --> 00:45:11,900 but if there were a lottery and they all agreed to the procedure, 943 00:45:11,900 --> 00:45:13,539 you think that would be okay? 944 00:45:13,539 --> 00:45:16,418 Right, because everyone knows that there's gonna be a death, 945 00:45:16,418 --> 00:45:17,079 whereas, 946 00:45:17,079 --> 00:45:18,878 you know, the cabin boy didn't know that 947 00:45:18,878 --> 00:45:21,038 this discussion was even happening, 948 00:45:21,039 --> 00:45:21,919 there was no, 949 00:45:21,918 --> 00:45:23,578 you know, forewarning 950 00:45:23,579 --> 00:45:28,829 for him to know that, hey, I may be the one that's dying. Okay, now suppose that everyone agrees 951 00:45:28,829 --> 00:45:35,089 to the lottery, they have the lottery, the cabin boy loses, and changes his mind. 952 00:45:35,088 --> 00:45:40,989 You've already decided, it's like a verbal contract, you can't go back on that. You've decided, the decision was made, 953 00:45:40,989 --> 00:45:45,139 you know, if you know you're dying in order for others to live, 954 00:45:45,139 --> 00:45:45,999 you would, you know, 955 00:45:45,998 --> 00:45:47,688 if someone else had died, 956 00:45:47,688 --> 00:45:51,969 you know that you would consume them, so 957 00:45:51,969 --> 00:45:57,429 But then he could say, I know, but I lost. 958 00:45:57,429 --> 00:46:01,939 I just think that that's the whole moral issue, that there was no consulting of the cabin boy, and that's 959 00:46:01,938 --> 00:46:04,298 what makes it the most horrible, 960 00:46:04,298 --> 00:46:08,909 is that he had no idea what was even going on, that if he had known what was going on 961 00:46:08,909 --> 00:46:10,599 it would 962 00:46:10,599 --> 00:46:13,109 be a bit more understandable.
963 00:46:13,108 --> 00:46:14,509 Alright, good, now I want to hear, 964 00:46:14,510 --> 00:46:17,049 so there's some who think 965 00:46:17,048 --> 00:46:18,708 it's morally permissible, 966 00:46:18,708 --> 00:46:24,048 but only about twenty percent, 967 00:46:24,048 --> 00:46:26,559 led by Marcus, 968 00:46:26,559 --> 00:46:28,439 then there are some who say 969 00:46:28,438 --> 00:46:30,208 the real problem here 970 00:46:30,208 --> 00:46:32,838 is the lack of consent, 971 00:46:32,838 --> 00:46:37,139 whether the lack of consent to a lottery, to a fair procedure, 972 00:46:37,139 --> 00:46:38,588 or, 973 00:46:38,588 --> 00:46:39,690 Kathleen's idea, 974 00:46:39,690 --> 00:46:40,749 lack of consent 975 00:46:40,748 --> 00:46:42,928 at the moment 976 00:46:42,929 --> 00:46:45,139 of death, 977 00:46:45,139 --> 00:46:48,318 and if we add consent, 978 00:46:48,318 --> 00:46:49,019 then 979 00:46:49,019 --> 00:46:51,889 more people are willing to consider 980 00:46:51,889 --> 00:46:54,528 the sacrifice morally justified. 981 00:46:54,528 --> 00:46:56,639 I want to hear now, finally, 982 00:46:56,639 --> 00:46:58,548 from those of you who think 983 00:46:58,548 --> 00:47:00,199 even with consent, 984 00:47:00,199 --> 00:47:01,898 even with a lottery, 985 00:47:01,898 --> 00:47:02,519 even with 986 00:47:02,519 --> 00:47:04,579 a final 987 00:47:04,579 --> 00:47:06,950 murmur of consent from Parker 988 00:47:06,949 --> 00:47:08,018 at the 989 00:47:08,018 --> 00:47:09,179 very last moment, 990 00:47:09,179 --> 00:47:10,838 it would still 991 00:47:10,838 --> 00:47:12,639 be wrong, 992 00:47:12,639 --> 00:47:14,248 and why would it be wrong, 993 00:47:14,248 --> 00:47:16,998 that's what I want to hear. 994 00:47:16,998 --> 00:47:18,818 Well, the whole time 995 00:47:18,818 --> 00:47:22,639 I've been leaning towards the categorical moral reasoning, 996 00:47:22,639 --> 00:47:25,568 and I think that 997 00:47:25,568 --> 00:47:29,608 there's a possibility I'd be okay with the idea of the lottery and then the loser 998 00:47:29,608 --> 00:47:31,440 taking it into their own hands to 999 00:47:31,440 --> 00:47:32,749 kill themselves, 1000 00:47:32,748 --> 00:47:33,679 1001 00:47:33,679 --> 00:47:37,358 so there wouldn't be an act of murder, but I still think that 1002 00:47:37,358 --> 00:47:42,278 even that way it's coerced, and also I don't think that there's any remorse, like in 1003 00:47:42,278 --> 00:47:43,338 Dudley's diary, 1004 00:47:43,338 --> 00:47:44,909 we're getting our breakfast, 1005 00:47:44,909 --> 00:47:47,659 it seems as though he's just sort of like, oh, 1006 00:47:47,659 --> 00:47:51,440 you know, that whole idea of not valuing someone else's life, 1007 00:47:51,440 --> 00:47:53,639 so that makes me 1008 00:47:53,639 --> 00:47:57,969 feel like I have to take the categorical stance. You want to throw the book at him 1009 00:47:57,969 --> 00:48:02,298 when he lacks remorse or a sense of having done anything wrong. Right. 1010 00:48:02,298 --> 00:48:06,969 Alright, good, so are there any other 1011 00:48:06,969 --> 00:48:08,769 defenders, who, 1012 00:48:08,768 --> 00:48:13,268 who say it's just categorically wrong, with or without consent? Yes, stand up. Why? 1013 00:48:13,268 --> 00:48:17,288 I think undoubtedly the way our society is shaped, murder is murder, 1014 00:48:17,289 --> 00:48:21,829 murder is murder, and either way our society looks down on it in the same light, 1015 00:48:21,829 --> 00:48:24,780 and I don't think it's any different in any case.
Good, now let me ask you a question, 1016 00:48:24,780 --> 00:48:27,119 there were three lives at stake 1017 00:48:27,119 --> 00:48:30,489 versus one, 1018 00:48:30,489 --> 00:48:33,028 the one, the cabin boy, he had no family, 1019 00:48:33,028 --> 00:48:34,509 he had no dependents, 1020 00:48:34,509 --> 00:48:38,739 these other three had families back home in England, they had dependents, 1021 00:48:38,739 --> 00:48:41,418 they had wives and children. 1022 00:48:41,418 --> 00:48:43,328 Think back to Bentham. 1023 00:48:43,329 --> 00:48:44,989 Bentham says we have to consider 1024 00:48:44,989 --> 00:48:48,048 the welfare, the utility, the happiness 1025 00:48:48,048 --> 00:48:51,288 of everybody. We have to add it all up, 1026 00:48:51,289 --> 00:48:54,640 so it's not just numbers, three against one, 1027 00:48:54,639 --> 00:48:58,759 it's also all of those people at home. 1028 00:48:58,759 --> 00:49:00,909 In fact the London newspaper at the time 1029 00:49:00,909 --> 00:49:04,248 and popular opinion sympathized with them, 1030 00:49:04,248 --> 00:49:05,478 Dudley and Stephens, 1031 00:49:05,478 --> 00:49:07,919 and the paper said if they weren't 1032 00:49:07,920 --> 00:49:08,280 motivated 1033 00:49:08,280 --> 00:49:09,640 by affection 1034 00:49:09,639 --> 00:49:13,489 and concern for their loved ones at home and dependents, surely they wouldn't have 1035 00:49:13,489 --> 00:49:15,969 done this. Yeah, and how is that any different from people 1036 00:49:15,969 --> 00:49:17,369 on the corner 1037 00:49:17,369 --> 00:49:21,108 having the same desire to feed their family? I don't think it's any different. I think in any case, 1038 00:49:21,108 --> 00:49:25,279 if I'm murdering you to advance my status, that's murder, and I think that we should look at all 1039 00:49:25,280 --> 00:49:28,430 of that in the same light, instead of criminalizing certain 1040 00:49:28,429 --> 00:49:30,278 activities 1041 00:49:30,278 --> 00:49:33,760 and making certain things seem more violent and savage, 1042 00:49:33,760 --> 00:49:36,760 when in that same case it's all the same act and mentality 1043 00:49:36,760 --> 00:49:40,150 that goes into the murder, a necessity to feed their families. 1044 00:49:40,150 --> 00:49:43,028 Suppose there weren't three, suppose there were thirty, 1045 00:49:43,028 --> 00:49:44,608 three hundred, 1046 00:49:44,608 --> 00:49:47,358 one life to save three hundred, 1047 00:49:47,358 --> 00:49:48,349 or in more time, 1048 00:49:48,349 --> 00:49:49,589 three thousand, 1049 00:49:49,590 --> 00:49:51,039 or suppose the stakes were even bigger. 1050 00:49:51,039 --> 00:49:52,778 Suppose the stakes were even bigger. 1051 00:49:52,778 --> 00:49:54,608 I think it's still the same deal. 1052 00:49:54,608 --> 00:49:58,108 Do you think Bentham was wrong to say the right thing to do 1053 00:49:58,108 --> 00:49:58,929 is to add 1054 00:49:58,929 --> 00:50:02,479 up the collective happiness, you think he's wrong about that? 1055 00:50:02,478 --> 00:50:06,728 I don't think he is wrong, but I think murder is murder in any case. Well then Bentham has to be wrong, 1056 00:50:06,728 --> 00:50:09,568 if you're right he's wrong. Okay, then he's wrong. 1057 00:50:09,568 --> 00:50:12,818 Alright, thank you, well done. 1058 00:50:12,818 --> 00:50:14,358 Alright, let's step back 1059 00:50:14,358 --> 00:50:16,389 from this discussion 1060 00:50:16,389 --> 00:50:19,728 and notice 1061 00:50:19,728 --> 00:50:23,259 how many objections we have heard to what they did.
1062 00:50:23,259 --> 00:50:26,048 We heard some defenses of what they did. 1063 00:50:26,048 --> 00:50:28,509 The defense had to do with 1064 00:50:28,509 --> 00:50:28,918 necessity, 1065 00:50:28,918 --> 00:50:32,588 the dire circumstance, and, 1066 00:50:32,588 --> 00:50:33,409 implicitly at least, 1067 00:50:33,409 --> 00:50:36,018 the idea that numbers matter, 1068 00:50:36,018 --> 00:50:37,858 and not only numbers matter 1069 00:50:37,858 --> 00:50:40,380 but the wider effects matter, 1070 00:50:40,380 --> 00:50:43,459 their families back home, their dependents. 1071 00:50:43,458 --> 00:50:44,768 Parker was an orphan, 1072 00:50:44,768 --> 00:50:47,978 no one would miss him. 1073 00:50:47,978 --> 00:50:49,578 So if you 1074 00:50:49,579 --> 00:50:50,829 add up, 1075 00:50:50,829 --> 00:50:52,649 if you tried to calculate 1076 00:50:52,648 --> 00:50:53,998 the balance 1077 00:50:53,998 --> 00:50:56,598 of happiness and suffering, 1078 00:50:56,599 --> 00:50:58,838 you might have a case for 1079 00:50:58,838 --> 00:51:02,768 saying what they did was the right thing. 1080 00:51:02,768 --> 00:51:09,468 Then we heard at least three different types of objections. 1081 00:51:09,469 --> 00:51:11,690 We heard an objection that said 1082 00:51:11,690 --> 00:51:14,108 what they did was categorically wrong, 1083 00:51:14,108 --> 00:51:15,750 right here at the end, 1084 00:51:15,750 --> 00:51:17,389 categorically wrong. 1085 00:51:17,389 --> 00:51:19,820 Murder is murder, it's always wrong, 1086 00:51:19,820 --> 00:51:20,969 even if 1087 00:51:20,969 --> 00:51:23,349 it increases the overall happiness 1088 00:51:23,349 --> 00:51:25,639 of society, 1089 00:51:25,639 --> 00:51:28,499 the categorical objection. 1090 00:51:28,498 --> 00:51:30,738 But we still need to investigate 1091 00:51:30,739 --> 00:51:32,749 why murder 1092 00:51:32,748 --> 00:51:35,448 is categorically wrong. 1093 00:51:35,449 --> 00:51:38,579 Is it because 1094 00:51:38,579 --> 00:51:42,339 even cabin boys have certain fundamental rights? 1095 00:51:42,338 --> 00:51:44,400 And if that's the reason, 1096 00:51:44,400 --> 00:51:47,880 where do those rights come from, if not from some idea 1097 00:51:47,880 --> 00:51:53,209 of the larger welfare or utility or happiness? Question number one. 1098 00:51:53,208 --> 00:51:56,308 Others said 1099 00:51:56,309 --> 00:51:58,449 a lottery would make a difference, 1100 00:51:58,449 --> 00:52:00,039 a fair procedure, 1101 00:52:00,039 --> 00:52:05,949 Matt said. 1102 00:52:05,949 --> 00:52:08,769 And some people were swayed by that. 1103 00:52:08,768 --> 00:52:12,188 That's not a categorical objection exactly, 1104 00:52:12,188 --> 00:52:13,828 it's saying 1105 00:52:13,829 --> 00:52:16,798 everybody has to be counted as an equal, 1106 00:52:16,798 --> 00:52:18,469 even though, at the end of the day, 1107 00:52:18,469 --> 00:52:20,769 one can be sacrificed 1108 00:52:20,768 --> 00:52:23,288 for the general welfare. 1109 00:52:23,289 --> 00:52:26,059 That leaves us with another question to investigate: 1110 00:52:26,059 --> 00:52:29,670 Why does agreement to a certain procedure, 1111 00:52:29,670 --> 00:52:31,969 even a fair procedure, 1112 00:52:31,969 --> 00:52:34,739 justify whatever result flows 1113 00:52:34,739 --> 00:52:38,088 from the operation of that procedure? 1114 00:52:38,088 --> 00:52:39,898 Question number two. 1115 00:52:39,898 --> 00:52:42,398 And question number three, 1116 00:52:42,398 --> 00:52:45,338 the basic idea of consent.
1117 00:52:45,338 --> 00:52:48,528 Kathleen got us on to this. 1118 00:52:48,528 --> 00:52:52,719 If the cabin boy had agreed himself, 1119 00:52:52,719 --> 00:52:54,499 and not under duress, 1120 00:52:54,498 --> 00:52:57,068 as was added, 1121 00:52:57,068 --> 00:53:01,918 then it would be all right to take his life to save the rest. 1122 00:53:01,918 --> 00:53:04,900 Even more people signed on to that idea, 1123 00:53:04,900 --> 00:53:06,630 but that raises 1124 00:53:06,630 --> 00:53:08,528 a third philosophical question: 1125 00:53:08,528 --> 00:53:11,009 what is the moral work 1126 00:53:11,009 --> 00:53:12,838 that consent 1127 00:53:12,838 --> 00:53:14,438 does? 1128 00:53:14,438 --> 00:53:16,978 Why does an act of consent 1129 00:53:16,978 --> 00:53:19,188 make such a moral difference 1130 00:53:19,188 --> 00:53:23,818 that an act that would be wrong, taking a life, without consent 1131 00:53:23,818 --> 00:53:25,369 is morally 1132 00:53:25,369 --> 00:53:26,358 permissible 1133 00:53:26,358 --> 00:53:29,619 with consent? 1134 00:53:29,619 --> 00:53:31,699 To investigate those three questions 1135 00:53:31,699 --> 00:53:34,108 we're going to have to read some philosophers, 1136 00:53:34,108 --> 00:53:35,728 and starting next time 1137 00:53:35,728 --> 00:53:36,939 we're going to read 1138 00:53:36,940 --> 00:53:37,720 Bentham 1139 00:53:37,719 --> 00:53:43,798 and John Stuart Mill, utilitarian philosophers. 1140 00:53:43,798 --> 00:53:47,298 Don't miss the chance to interact online with other viewers of Justice, 1141 00:53:43,798 --> 00:53:47,298 join the conversation, 1142 00:53:49,869 --> 00:53:56,869 take a pop quiz, watch lectures you've missed, and a lot more. Visit www.justiceharvard.org. It's the right thing to do. 1143 00:54:36,219 --> 00:54:40,269 Funding for the program is provided by 1144 00:54:40,268 --> 00:54:41,678 Additional funding provided by