This is a course about Justice, and we begin with a story. Suppose you're the driver of a trolley car, and your trolley car is hurtling down the track at sixty miles an hour, and at the end of the track you notice five workers working on the track. You try to stop, but you can't; your brakes don't work. You feel desperate, because you know that if you crash into these five workers, they will all die. Let's assume you know that for sure. And so you feel helpless, until you notice that there is, off to the right, a side track, and at the end of that track there's one worker working on the track. Your steering wheel works, so you can turn the trolley car, if you want to, onto the side track, killing the one but sparing the five.

Here's our first question: what's the right thing to do? What would you do? Let's take a poll. How many would turn the trolley car onto the side track? How many wouldn't? How many would go straight ahead? Keep your hands up, those of you who'd go straight ahead. A handful of people would; the vast majority would turn. Let's hear first, now, as we begin to investigate the reasons why you think it's the right thing to do. Let's begin with those in the majority, who would turn to go onto the side track. Why would you do it? What would be your reason? Who's willing to volunteer a reason? Go ahead, stand up.

Because it can't be right to kill five people when you can only kill one person instead.

It wouldn't be right to kill five if you could kill one person instead. That's a good reason. Who else? Does everybody agree with that reason? Go ahead.

Well, I was thinking it was the same reason on 9/11. We regard the people who flew the plane into the Pennsylvania field as heroes, because they chose to kill the people on the plane and not kill more people in big buildings.
So the principle there was the same on 9/11: it's a tragic circumstance, but better to kill one so that five can live. Is that the reason most of you have, those of you who would turn? Yes? Let's hear now from those in the minority, those who wouldn't turn.

Well, I think that's the same type of mentality that justifies genocide and totalitarianism: in order to save one type of race, you wipe out the other.

So what would you do in this case? To avoid the horrors of genocide, you would crash into the five and kill them?

Presumably, yes.

Okay, who else? That's a brave answer, thank you.

Let's consider another trolley car case, and see whether those of you in the majority want to adhere to the principle: better that one should die so that five should live. This time you're not the driver of the trolley car, you're an onlooker, standing on a bridge overlooking a trolley car track. And down the track comes a trolley car; at the end of the track are five workers; the brakes don't work; the trolley car is about to careen into the five and kill them. And now, you're not the driver, you really feel helpless, until you notice, standing next to you, leaning over the bridge, is a very fat man. And you could give him a shove. He would fall over the bridge onto the track, right in the way of the trolley car. He would die, but he would spare the five.

Now, how many would push the fat man over the bridge? Raise your hand. How many wouldn't? Most people wouldn't. Here's the obvious question: what became of the principle, better to save five lives even if it means sacrificing one? What became of the principle that almost everyone endorsed in the first case? I need to hear from someone who was in the majority in both cases. How do you explain the difference between the two?
The second one, I guess, involves an active choice of pushing a person down, which, I guess, that person himself would otherwise not have been involved in the situation at all. And so, to choose on his behalf, I guess, to involve him in something that he otherwise would have escaped is, I guess, more than what you have in the first case, where the three parties, the driver and the two sets of workers, are already, I guess, in the situation.

But the guy working, the one on the track off to the side, he didn't choose to sacrifice his life any more than the fat guy did, did he?

That's true, but he was on the tracks.

And this guy was on the bridge. Go ahead, you can come back if you want. All right, it's a hard question, but you did well, you did very well. It's a hard question.

Who else can find a way of reconciling the reaction of the majority in these two cases? Yes?

Well, I guess in the first case, where you have the one worker and the five, it's a choice between those two, and you have to make a certain choice, and people are going to die because of the trolley car, not necessarily because of your direct actions. The trolley car is a runaway thing, and you need to make a split-second choice. Whereas pushing the fat man over is an actual act of murder on your part. You have control over that, whereas you may not have control over the trolley car. So I think it's a slightly different situation.

All right, who has a reply? That was good. Who wants to reply? Is that a way out of this?

I don't think that's a very good reason, because either way you have to choose who dies. You either choose to turn and kill a person, which is an act of conscious thought, to turn, or you choose to push the fat man over, which is also an active, conscious action. So either way, you're making a choice.

Do you want to reply?
Well, I'm not really sure that that's the case. It just still seems kind of different, the act of actually pushing someone over onto the tracks and killing them. You are actually killing him yourself; you're pushing him with your own hands. And that's different than steering something that is going to cause death into another... you know, it doesn't really sound right saying it now when I'm up here.

No, that's good. What's your name?

Andrew.

Andrew, let me ask you this question, Andrew. Suppose, standing on the bridge next to the fat man, I didn't have to push him. Suppose he was standing over a trap door that I could open by turning a steering wheel, like that. Would you turn it?

For some reason that still just seems more wrong. I mean, maybe if you just accidentally, like, leaned into the steering wheel or something like that. Or say the car is hurtling toward a switch that will drop the trap. Then I could agree with that.

Fair enough. It still seems wrong in a way that it doesn't seem wrong in the first case, to turn, you say.

And in another way, I mean, in the first situation you're involved directly with the situation; in the second one you're an onlooker as well.

So you have the choice of becoming involved or not by pushing the fat man. Let's forget for the moment about this case. That's good. But let's imagine a different case. This time you're a doctor in an emergency room, and six patients come to you. They've been in a terrible trolley car wreck. Five of them sustained moderate injuries; one is severely injured. You could spend all day caring for the one severely injured victim, but in that time the five would die. Or you could look after the five, restore them to health, but during that time the one severely injured person would die. How many would save the five, now, as the doctor? How many would save the one? Very few people, just a handful of people. Same reason, I assume: one life versus five.

Now consider another doctor case. This time you're a transplant surgeon, and you have five patients, each in desperate need of an organ transplant in order to survive. One needs a heart, one a lung, one a kidney, one a liver, and the fifth a pancreas.
And you have no organ donors. You are about to see them die. And then it occurs to you that in the next room there's a healthy guy who came in for a checkup, and he's taking a nap. You could go in very quietly, yank out the five organs; that person would die, but you could save the five. How many would do it? Anyone? How many? Put your hands up if you would do it. Anyone in the balcony? You would? Be careful, don't lean over too much. How many wouldn't? All right. What do you say, speak up in the balcony, you who would yank out the organs. Why?

I'd actually like to explore a slightly alternate possibility: just taking the one of the five who needs an organ who dies first, and using their four healthy organs to save the other four.

That's a pretty good idea. That's a great idea, except for the fact that you just wrecked the philosophical point.

Let's step back from these stories and these arguments to notice a couple of things about the way the arguments have begun to unfold. Certain moral principles have already begun to emerge from the discussions we've had, and let's consider what those moral principles look like. The first moral principle that emerged from the discussion said that the right thing to do, the moral thing to do, depends on the consequences that will result from your action. At the end of the day, better that five should live, even if one must die. That's an example of consequentialist moral reasoning. Consequentialist moral reasoning locates morality in the consequences of an act,
in the state of the world that will result from the thing you do. But then we went a little further; we considered those other cases, and people weren't so sure about consequentialist moral reasoning. When people hesitated to push the fat man over the bridge, or to yank out the organs of the innocent patient, people gestured toward reasons having to do with the intrinsic quality of the act itself, consequences be what they may. People were reluctant. People thought it was just wrong, categorically wrong, to kill a person, an innocent person, even for the sake of saving five lives. At least these people thought that in the second version of each story we considered.

So this points to a second, categorical way of thinking about moral reasoning. Categorical moral reasoning locates morality in certain absolute moral requirements, in certain categorical duties and rights, regardless of the consequences. We're going to explore, in the days and weeks to come, the contrast between consequentialist and categorical moral principles. The most influential example of consequentialist moral reasoning is utilitarianism, a doctrine invented by Jeremy Bentham, the eighteenth-century English political philosopher. The most important philosopher of categorical moral reasoning is the eighteenth-century German philosopher Immanuel Kant. So we will look at those two different modes of moral reasoning, assess them, and also consider others.

If you look at the syllabus, you'll notice that we read a number of great and famous books: books by Aristotle, John Locke, Immanuel Kant, John Stuart Mill, and others. You'll notice too from the syllabus that we don't only read these books; we also take up contemporary political and legal controversies that raise philosophical questions.
We will debate equality and inequality, affirmative action, free speech versus hate speech, same-sex marriage, military conscription, a range of practical questions. Why? Not just to enliven these abstract and distant books, but to make clear, to bring out, what's at stake in our everyday lives, including our political lives, for philosophy. So we will read these books, and we will debate these issues, and we'll see how each informs and illuminates the other.

This may sound appealing enough, but here I have to issue a warning. And the warning is this: to read these books in this way, as an exercise in self-knowledge, to read them in this way, carries certain risks, risks that are both personal and political, risks that every student of political philosophy has known. These risks spring from the fact that philosophy teaches us and unsettles us by confronting us with what we already know. There's an irony: the difficulty of this course consists in the fact that it teaches what you already know. It works by taking what we know from familiar, unquestioned settings, and making it strange.

That's how those examples worked, the hypotheticals with which we began, with their mix of playfulness and sobriety. It's also how these philosophical books work. Philosophy estranges us from the familiar, not by supplying new information, but by inviting and provoking a new way of seeing. But, and here's the risk, once the familiar turns strange, it's never quite the same again. Self-knowledge is like lost innocence: however unsettling you find it, it can never be unthought or unknown. What makes this enterprise difficult, but also riveting, is that moral and political philosophy is a story, and you don't know where this story will lead, but what you do know is that the story is about you.

Those are the personal risks. Now what of the political risks?
One way of introducing a course like this would be to promise you that by reading these books and debating these issues you will become a better, more responsible citizen. You will examine the presuppositions of public policy, you will hone your political judgment, you'll become a more effective participant in public affairs. But this would be a partial and misleading promise. Political philosophy, for the most part, hasn't worked that way. You have to allow for the possibility that political philosophy may make you a worse citizen rather than a better one, or at least a worse citizen before it makes you a better one. And that's because philosophy is a distancing, even debilitating, activity.

And you see this going back to Socrates. There's a dialogue, the Gorgias, in which one of Socrates' friends, Callicles, tries to talk him out of philosophizing. Callicles tells Socrates: philosophy is a pretty toy if one indulges in it with moderation at the right time of life, but if one pursues it further than one should, it is absolute ruin. Take my advice, Callicles says: abandon argument, learn the accomplishments of active life, take for your models not those people who spend their time on these petty quibbles, but those who have a good livelihood and reputation and many other blessings. So Callicles is really saying to Socrates: quit philosophizing, get real, go to business school. And Callicles did have a point. He had a point, because philosophy distances us from conventions, from established assumptions, and from settled beliefs.

Those are the risks, personal and political. And in the face of these risks there is a characteristic evasion. The name of the evasion is skepticism.
It's the idea, well, it goes something like this: we didn't resolve, once and for all, either the cases or the principles we were arguing over when we began. And if Aristotle and Locke and Kant and Mill haven't solved these questions after all of these years, who are we to think that we, here in Sanders Theatre, over the course of a semester, can resolve them? And so maybe it's just a matter of each person having his or her own principles, and there's nothing more to be said about it, no way of reasoning. That's the evasion, the evasion of skepticism, to which I would offer the following reply.

It's true, these questions have been debated for a very long time. But the very fact that they have recurred and persisted may suggest that though they're impossible in one sense, they're unavoidable in another. And the reason they're unavoidable, the reason they're inescapable, is that we live some answer to these questions every day. So skepticism, just throwing up your hands and giving up on moral reflection, is no solution. Immanuel Kant described very well the problem with skepticism when he wrote: skepticism is a resting place for human reason, where it can reflect upon its dogmatic wanderings, but it is no dwelling place for permanent settlement. Simply to acquiesce in skepticism, Kant wrote, can never suffice to overcome the restlessness of reason.

I've tried to suggest, through these stories and these arguments, some sense of the risks and temptations, of the perils and the possibilities. I would simply conclude by saying that the aim of this course is to awaken the restlessness of reason and to see where it might lead. Thank you very much.

Like, in a situation that desperate, you have to do what you have to do to survive.

You have to do what you have to do?

You've gotta do what you gotta do, pretty much. If you've been going nineteen days without any food, someone has to make the sacrifice so people can survive.

All right, that's good. What's your name? Marcus. What do you say to Marcus?
We started out last time with some stories, with some moral dilemmas about trolley cars, and about doctors, and healthy patients vulnerable to being victims of organ transplantation. We noticed two things about the arguments we had. One had to do with the way we were arguing. It began with our judgments in particular cases. We tried to articulate the reasons or the principles lying behind our judgments, and then, confronted with a new case, we found ourselves re-examining those principles, revising each in the light of the other. And we noticed the built-in pressure to try to bring into alignment our judgments about particular cases and the principles we would endorse on reflection.

We also noticed something about the substance of the arguments that emerged from the discussion. We noticed that sometimes we were tempted to locate the morality of an act in the consequences, in the results, in the state of the world that it brought about. We called this consequentialist moral reasoning. But we also noticed that in some cases we weren't swayed only by the results. Sometimes, many of us felt, not just consequences but also the intrinsic quality or character of the act matters morally. Some people argued that there are certain things that are just categorically wrong, even if they bring about a good result, even if they save five people at the cost of one life. So we contrasted consequentialist moral principles with categorical ones.

Today, and in the next few days, we will begin to examine one of the most influential versions of consequentialist moral theory, and that's the philosophy of utilitarianism. Jeremy Bentham, the eighteenth-century English political philosopher, gave the first clear, systematic expression to the utilitarian moral theory. And Bentham's idea, his essential idea, is a very simple one, with a lot of morally intuitive appeal.
Bentham's idea is the following: the right thing to do, the just thing to do, is to maximize utility. What did he mean by utility? He meant by utility the balance of pleasure over pain, happiness over suffering. Here's how he arrived at the principle of maximizing utility. He started out by observing that all of us, all human beings, are governed by two sovereign masters: pain and pleasure. We human beings like pleasure and dislike pain, and so we should base morality, whether we are thinking of what to do in our own lives, or whether, as legislators or citizens, we are thinking about what the law should be, on this: the right thing to do, individually or collectively, is to maximize, to act in a way that maximizes, the overall level of happiness. Bentham's utilitarianism is sometimes summed up with the slogan: the greatest good for the greatest number.
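As a rough sketch, and only a sketch: if we make the simplifying assumption that each person's pleasures and pains could be scored on a single common scale, Bentham's principle can be written as a simple maximization rule. The symbols below are illustrative shorthand, not Bentham's own notation.

$$
a^{*} \;=\; \arg\max_{a \in A} \; \sum_{i=1}^{N} \Big( \text{pleasure}_i(a) \;-\; \text{pain}_i(a) \Big)
$$

Here $A$ stands for the set of available actions, $N$ for the number of people affected, and the right action $a^{*}$ is whichever one yields the greatest net balance of pleasure over pain, summed over everyone.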
With this basic principle of utility on hand, let's begin to test it and to examine it by turning to another case, another story, but this time not a hypothetical story, a real-life story: the case of the Queen versus Dudley and Stephens. This was a nineteenth-century British law case that's famous and much debated in law schools. Here's what happened in the case. I'll summarize the story, and then I want to hear how you would rule, imagining that you are the jury.

A newspaper account of the time described the background: "A sadder story of disaster at sea was never told than that of the survivors of the yacht Mignonette." The ship foundered in the South Atlantic, thirteen hundred miles from the Cape. There were four in the crew: Dudley was the captain, Stephens was the first mate, Brooks was a sailor, all men of excellent character, or so the newspaper account tells us. The fourth crew member was the cabin boy, Richard Parker, seventeen years old. He was an orphan, he had no family, and he was on his first long voyage at sea. He went, the news account tells us, rather against the advice of his friends. He went in the hopefulness of youthful ambition, thinking the journey would make a man of him. Sadly, it was not to be.

The facts of the case were not in dispute. A wave hit the ship, and the Mignonette went down. The four crew members escaped to a lifeboat. The only food they had were two cans of preserved turnips, and no fresh water. For the first three days they ate nothing. On the fourth day they opened one of the cans of turnips and ate it. The next day they caught a turtle. Together with the other can of turnips, the turtle enabled them to subsist for the next few days. And then for eight days they had nothing, no food, no water.

Imagine yourself in a situation like that. What would you do? Here's what they did. By now the cabin boy, Parker, is lying at the bottom of the lifeboat in a corner, because he had drunk sea water, against the advice of the others, and he had become ill, and he appeared to be dying. So on the nineteenth day, Dudley, the captain, suggested that they should all have a lottery, that they should all draw lots to see who would die to save the rest. Brooks refused. He didn't like the lottery idea. We don't know whether this was because he didn't want to take the chance, or because he believed in categorical moral principles, but in any case, no lots were drawn.

The next day there was still no ship in sight, so Dudley told Brooks to avert his gaze, and he motioned to Stephens that the boy Parker had better be killed. Dudley offered a prayer, he told the boy his time had come, and he killed him with a pen knife, stabbing him in the jugular vein. Brooks emerged from his conscientious objection to share in the gruesome bounty. For four days, the three of them fed on the body and blood of the cabin boy. True story. And then they were rescued.
Dudley describes their rescue in his diary with staggering euphemism. Quote: "on the twenty-fourth day, as we were having our breakfast, a ship appeared at last." The three survivors were picked up by a German ship. They were taken back to Falmouth, in England, where they were arrested and tried. Brooks turned state's witness; Dudley and Stephens went to trial. They didn't dispute the facts. They claimed they had acted out of necessity; that was their defense. They argued, in effect, better that one should die so that three could survive. The prosecutor wasn't swayed by that argument. He said murder is murder, and so the case went to trial.

Now imagine you are the jury. And just to simplify the discussion, put aside the question of law, and let's assume that you, as the jury, are charged with deciding whether what they did was morally permissible or not. How many would vote not guilty, that what they did was morally permissible? And how many would vote guilty, that what they did was morally wrong? A pretty sizable majority. Now let's see what people's reasons are, and let me begin with those who are in the minority. Let's hear first from the defense of Dudley and Stephens. Why would you morally exonerate them? What are your reasons?

I think it is morally reprehensible, but I think that there's a distinction between what's morally reprehensible and what makes someone legally accountable. In other words, as the judge said, what's immoral isn't necessarily against the law. And while I don't think that necessity justifies theft or murder or any illegal act, at some point your degree of necessity does in fact exonerate you from any guilt.

Okay. Other defenders, other voices for the defense? Moral justifications for what they did? Yes, thank you.

I just feel like, in a situation that desperate, you have to do what you have to do to survive.

You have to do what you have to do?

Yeah, you gotta do what you gotta do, pretty much.
If you've been going nineteen days without any food, you know, someone just has to take the sacrifice, someone has to make sacrifices, and people can survive. And furthermore, from that, let's say they survived and then they become productive members of society, who go home and start, like, a million charity organizations, and this and that and this and that. I mean, they benefit everybody in the end. So, I mean, I don't know what they did afterwards; they might have gone on and killed more people, but whatever.

What? What if they were going home and turned out to be assassins?

What if they were going home and turned out to be assassins?

You would want to know who they assassinated.

That's true too, that's fair. I would want to know who they assassinated.

All right, that's good. What's your name? Marcus. We've heard a defense, a couple of voices for the defense. Now we need to hear from the prosecution. Most people think what they did was wrong. Why?

One of the first things that I was thinking was, oh, well, if they haven't been eating for a really long time, maybe then they're mentally affected. That could be used for the defense, a possible argument that, oh, they weren't in a proper state of mind, they were making decisions that they otherwise wouldn't be making. And if that's an appealing argument, that you have to be in an altered mindset to do something like that, it suggests that people who find that argument convincing do think that they're acting immorally.

But I want to know what you think. You voted to convict, right?

Yeah, I don't think that they acted in a morally appropriate way.

And why not? What do you say? Here's Marcus, he just defended them. He said, you heard what he said.

Yes, I did.

Yes, that you've got to do what you've got to do in a case like that. What do you say to Marcus?

That there is no situation that would allow human beings to take the idea of fate, or other people's lives, into their own hands; that we don't have that kind of power.

Good. Okay, thank you. And what's your name? Britt? Okay. Who else? What do you say?
Stand up.

I'm wondering: if Dudley and Stephens had asked for Richard Parker's consent in, you know, dying, would that exonerate them from an act of murder? And if so, is that still morally justifiable?

That's interesting. All right, consent. Now hang on, what's your name? Kathleen.

Kathleen says: suppose so. What would that scenario look like? So in the story, Dudley is there, pen knife in hand, but instead of the prayer, or before the prayer, he says: Parker, would you mind? We're desperately hungry, as Marcus empathizes with, we're desperately hungry, you're not going to last long anyhow. You can be a martyr. Would you be a martyr? How about it, Parker? Then, then what do you think? Would it be morally justified then? Suppose Parker, in his semi-stupor, says okay.

I don't think it would be morally justifiable, but I'm wondering.

Even then, even then it wouldn't be?

No.

You don't think that even with consent it would be morally justified. Are there people who think, who want to take up Kathleen's consent idea, and who think that that would make it morally justified? Raise your hand if you think it would. That's very interesting. Why would consent make a moral difference? Why would it?

Well, I just think that if it was his own original idea, and it was his idea to start with, then that would be the only situation in which I would see it being appropriate in any way, because that way you couldn't make the argument that he was pressured, you know, it's three to one, or whatever the ratio was. And I think that if he was making a decision to give his life, then he took on the agency to sacrifice himself, which some people might see as admirable, and other people might disagree with that decision.

So if he came up with the idea, that's the only kind of consent we could have confidence in morally; then it would be okay. Otherwise it would be kind of coerced consent, under the circumstances, you think. Is there anyone who thinks that even the consent of Parker would not justify their killing him? Who thinks that?
Yes, tell us why. Stand up.

I think that Parker would be killed with the hope that the other crew members would be rescued, so there's no definite reason that he should be killed, because you don't know when they're going to get rescued. So if you kill him, you're killing him in vain. Do you keep killing a crew member until you're rescued and then you're left with no one? Because someone's going to die eventually?

Well, the moral logic of the situation seems to be that: that they would keep on picking off the weakest, maybe, one by one, until they were rescued, and in this case, luckily, three at least were still alive when they were. Now, if Parker did give his consent, would it be all right, do you think, or not?

No, it still wouldn't be right.

Tell us why it wouldn't be all right.

First of all, cannibalism, I believe, is morally incorrect, so you shouldn't be eating a human anyway.

So cannibalism is morally objectionable as such, so then even in the scenario of waiting until someone died, still it would be objectionable?

Yes, to me personally. I feel like it all depends on one's personal morals. Like, this is just my opinion; of course other people are going to disagree.

Well, let's see, let's hear what their disagreements are, and then we'll see if they have reasons that can persuade you or not. Let's try that. Now, is there someone who can explain, those of you who are tempted by consent, can you explain why consent makes such a moral difference? What about the lottery idea, does that count as consent? Remember, at the beginning, Dudley proposed a lottery. Suppose that they had agreed to a lottery. Then how many would say it was all right? Say there was a lottery, the cabin boy lost, and the rest of the story unfolded. How many people would say it's morally permissible? So the numbers are rising if we add a lottery. Let's hear from one of you for whom the lottery would make a moral difference. Why would it?
I think the essential element, in my mind, that makes it a crime is the idea that they decided at some point that their lives were more important than his. And that, I mean, that's kind of the basis for really any crime, right? It's like: my needs, my desires, are more important than yours, and mine take precedence. And if they had done a lottery, where everyone consented that someone should die, and it's sort of like they're all sacrificing themselves to save the rest...

Then it would be all right?

A little grotesque, but...

But morally permissible?

Yes.

What's your name? Matt. So, Matt, for you, what bothers you is not the cannibalism, but the lack of due process.

I guess you could say that.

And can someone who agrees with Matt say a little bit more about why a lottery would make it, in your view, morally permissible?

The way I understood it originally was that that was the whole issue: that the cabin boy was never consulted about whether or not something was going to happen to him, even with the original lottery, whether or not he would be a part of that. It was just decided that he was the one that was going to die.

Yes, that's what happened in the actual case. But if there were a lottery and they all agreed to the procedure, you think that would be okay?

Right, because then everyone knows that there's going to be a death, whereas, you know, the cabin boy didn't know that this discussion was even happening. There was no, you know, forewarning for him to know that, hey, I may be the one that's dying.

Okay. Now suppose everyone agrees to the lottery, they have the lottery, the cabin boy loses, and he changes his mind.

You've already decided. It's like a verbal contract; you can't go back on that. You've decided, the decision was made. You know, if you know you're dying so that others can live, you would, you know, if someone else had died, you know that you would consume them, so...

But then he could say: I know, but I lost.

I just think that that's the whole moral issue, that there was no consulting of the cabin boy, and that's what makes it the most horrible, that he had no idea what was even going on. If he had known what was going on, it would be a bit more understandable.
All right, good. Now I want to hear, so there are some who think it's morally permissible, but only about twenty percent, led by Marcus. Then there are some who say the real problem here is the lack of consent, whether the lack of consent to a lottery, to a fair procedure, or, Kathleen's idea, the lack of consent at the moment of death. And if we add consent, then more people are willing to consider the sacrifice morally justified. I want to hear now, finally, from those of you who think that even with consent, even with a lottery, even with a final murmur of consent from Parker at the very last moment, it would still be wrong. And why would it be wrong? That's what I want to hear.

Well, the whole time I've been leaning toward the categorical moral reasoning, and I think that there's a possibility I'd be okay with the idea of the lottery, and then the loser taking it into their own hands to kill themselves, so there wouldn't be an act of murder. But I still think that even that way it's coerced. And also, I don't think that there's any remorse. Like in Dudley's diary, "we're getting our breakfast," it seems as though he's just sort of, like, oh, you know, that whole idea of not valuing someone else's life. So that makes me feel like I have to take the categorical stance.

You want to throw the book at him when he lacks remorse, or a sense of having done anything wrong.

Right.

All right, good. So are there any other defenders who say it's just categorically wrong, with or without consent? Yes, stand up. Why?

I think, undoubtedly, the way our society is shaped, murder is murder. Murder is murder, and in every way our society looks down on it in the same light, and I don't think it's any different in any case.

Good. Now let me ask you a question. There were three lives at stake versus one. The one, the cabin boy, he had no family, he had no dependents. These other three had families back home in England; they had dependents; they had wives and children. Think back to Bentham. Bentham says we have to consider the welfare, the utility, the happiness of everybody.
We have to add it all up, 0:48:51.289,0:48:54.640 so it's not just numbers, three against one; 0:48:54.639,0:48:58.759 it's also all of those people at home. 0:48:58.759,0:49:00.909 In fact, the London newspaper at the time 0:49:00.909,0:49:04.248 and popular opinion sympathized with them, 0:49:04.248,0:49:05.478 Dudley and Stephens, 0:49:05.478,0:49:07.919 and the paper said if they weren't 0:49:07.920,0:49:08.280 motivated 0:49:08.280,0:49:09.640 by affection 0:49:09.639,0:49:13.489 and concern for their loved ones at home and dependents, surely they wouldn't have 0:49:13.489,0:49:15.969 done this. Yeah, and how is that any different from people 0:49:15.969,0:49:17.369 on the corner 0:49:17.369,0:49:21.108 who have the same desire to feed their family? I don't think it's any different. I think in any case, 0:49:21.108,0:49:25.279 if I'm murdering you to advance my status, that's murder, and I think that we should look at all 0:49:25.280,0:49:28.430 of that in the same light, instead of criminalizing certain 0:49:28.429,0:49:30.278 activities 0:49:30.278,0:49:33.760 and making certain things seem more violent and savage, 0:49:33.760,0:49:36.760 when in that same case it's all the same act and mentality 0:49:36.760,0:49:40.150 that goes into the murder: a necessity to feed their families. 0:49:40.150,0:49:43.028 Suppose there weren't three, suppose there were thirty, 0:49:43.028,0:49:44.608 three hundred, 0:49:44.608,0:49:47.358 one life to save three hundred, 0:49:47.358,0:49:48.349 or, in more time, 0:49:48.349,0:49:49.589 three thousand, 0:49:49.590,0:49:51.039 or suppose the stakes were even bigger. 0:49:51.039,0:49:52.778 Suppose the stakes were even bigger. 0:49:52.778,0:49:54.608 I think it's still the same deal. 0:49:54.608,0:49:58.108 Do you think Bentham was wrong to say the right thing to do 0:49:58.108,0:49:58.929 is to add 0:49:58.929,0:50:02.479 up the collective happiness? You think he's wrong about that? 0:50:02.478,0:50:06.728 I don't think he is wrong, but I think murder is murder in any case. Well, then Bentham has to be wrong; 0:50:06.728,0:50:09.568 if you're right, he's wrong. Okay, then he's wrong. 0:50:09.568,0:50:12.818 Alright, thank you, well done. 0:50:12.818,0:50:14.358 Alright, let's step back 0:50:14.358,0:50:16.389 from this discussion 0:50:16.389,0:50:19.728 and notice 0:50:19.728,0:50:23.259 how many objections we have heard to what they did. 0:50:23.259,0:50:26.048 We heard some defenses of what they did; 0:50:26.048,0:50:28.509 the defenses had to do with 0:50:28.509,0:50:28.918 necessity, 0:50:28.918,0:50:32.588 the dire circumstance, and, 0:50:32.588,0:50:33.409 implicitly at least, 0:50:33.409,0:50:36.018 the idea that numbers matter, 0:50:36.018,0:50:37.858 and not only numbers matter 0:50:37.858,0:50:40.380 but the wider effects matter: 0:50:40.380,0:50:43.459 their families back home, their dependents; 0:50:43.458,0:50:44.768 Parker was an orphan, 0:50:44.768,0:50:47.978 no one would miss him.
0:50:47.978,0:50:49.578 So if you 0:50:49.579,0:50:50.829 add up, 0:50:50.829,0:50:52.649 if you tried to calculate 0:50:52.648,0:50:53.998 the balance 0:50:53.998,0:50:56.598 of happiness and suffering, 0:50:56.599,0:50:58.838 you might have a case for 0:50:58.838,0:51:02.768 saying what they did was the right thing. 0:51:02.768,0:51:09.468 Then we heard at least three different types of objections: 0:51:09.469,0:51:11.690 we heard an objection that said 0:51:11.690,0:51:14.108 what they did was categorically wrong, 0:51:14.108,0:51:15.750 right here at the end, 0:51:15.750,0:51:17.389 categorically wrong. 0:51:17.389,0:51:19.820 Murder is murder, it's always wrong, 0:51:19.820,0:51:20.969 even if 0:51:20.969,0:51:23.349 it increases the overall happiness 0:51:23.349,0:51:25.639 of society: 0:51:25.639,0:51:28.499 the categorical objection. 0:51:28.498,0:51:30.738 But we still need to investigate 0:51:30.739,0:51:32.749 why murder 0:51:32.748,0:51:35.448 is categorically wrong. 0:51:35.449,0:51:38.579 Is it because 0:51:38.579,0:51:42.339 even cabin boys have certain fundamental rights? 0:51:42.338,0:51:44.400 And if that's the reason, 0:51:44.400,0:51:47.880 where do those rights come from, if not from some idea 0:51:47.880,0:51:53.209 of the larger welfare or utility or happiness? Question number one. 0:51:53.208,0:51:56.308 Others said 0:51:56.309,0:51:58.449 a lottery would make a difference, 0:51:58.449,0:52:00.039 a fair procedure, 0:52:00.039,0:52:05.949 Matt said. 0:52:05.949,0:52:08.769 And some people were swayed by that. 0:52:08.768,0:52:12.188 That's not a categorical objection exactly; 0:52:12.188,0:52:13.828 it's saying 0:52:13.829,0:52:16.798 everybody has to be counted as an equal, 0:52:16.798,0:52:18.469 even though, at the end of the day, 0:52:18.469,0:52:20.769 one can be sacrificed 0:52:20.768,0:52:23.288 for the general welfare. 0:52:23.289,0:52:26.059 That leaves us with another question to investigate: 0:52:26.059,0:52:29.670 why does agreement to a certain procedure, 0:52:29.670,0:52:31.969 even a fair procedure, 0:52:31.969,0:52:34.739 justify whatever result flows 0:52:34.739,0:52:38.088 from the operation of that procedure? 0:52:38.088,0:52:39.898 Question number two. 0:52:39.898,0:52:42.398 And question number three: 0:52:42.398,0:52:45.338 the basic idea of consent. 0:52:45.338,0:52:48.528 Kathleen got us onto this. 0:52:48.528,0:52:52.719 If the cabin boy had agreed himself, 0:52:52.719,0:52:54.499 and not under duress, 0:52:54.498,0:52:57.068 as was added, 0:52:57.068,0:53:01.918 then it would be all right to take his life to save the rest. 0:53:01.918,0:53:04.900 Even more people signed on to that idea, 0:53:04.900,0:53:06.630 but that raises 0:53:06.630,0:53:08.528 a third philosophical question: 0:53:08.528,0:53:11.009 what is the moral work 0:53:11.009,0:53:12.838 that consent 0:53:12.838,0:53:14.438 does? 0:53:14.438,0:53:16.978 Why does an act of consent 0:53:16.978,0:53:19.188 make such a moral difference 0:53:19.188,0:53:23.818 that an act that would be wrong, taking a life, without consent 0:53:23.818,0:53:25.369 is morally 0:53:25.369,0:53:26.358 permissible 0:53:26.358,0:53:29.619 with consent? 0:53:29.619,0:53:31.699 To investigate those three questions 0:53:31.699,0:53:34.108 we're going to have to read some philosophers, 0:53:34.108,0:53:35.728 and starting next time 0:53:35.728,0:53:36.939 we're going to read 0:53:36.940,0:53:37.720 Bentham 0:53:37.719,0:53:43.798 and John Stuart Mill, utilitarian philosophers.
0:53:43.798,0:53:47.298 Don't miss the chance to interact online with other viewers of Justice, join the conversation, 0:53:49.869,0:53:56.869 take a pop quiz, watch lectures you've missed, and a lot more. Visit www.justiceharvard.org. It's the right thing to do.