Last time we argued about the case of the Queen versus Dudley and Stephens, the lifeboat case, the case of cannibalism at sea. And with the arguments about the lifeboat in mind, the arguments for and against what Dudley and Stephens did in mind, let's turn back to the philosophy, the utilitarian philosophy, of Jeremy Bentham.

Bentham was born in England in 1748. At the age of twelve he went to Oxford; at fifteen he went to law school. He was admitted to the bar at age nineteen, but he never practiced law. Instead he devoted his life to jurisprudence and moral philosophy.

Last time we began to consider Bentham's version of utilitarianism. The main idea is simply stated, and it's this: the highest principle of morality, whether personal or political morality, is to maximize the general welfare, or the collective happiness, or the overall balance of pleasure over pain; in a phrase, maximize utility.

Bentham arrives at this principle by the following line of reasoning. We're all governed by pain and pleasure; they are our sovereign masters, and so any moral system has to take account of them. How best to take account? By maximizing. And this leads to the principle of the greatest good for the greatest number.

What exactly should we maximize? Bentham tells us: happiness, or more precisely, utility. Maximizing utility is a principle not only for individuals but also for communities and for legislators. What, after all, is a community? Bentham asks. It's the sum of the individuals who comprise it. And that's why, in deciding the best policy, in deciding what the law should be, in deciding what's just, citizens and legislators should ask themselves this question: if we add up all of the benefits of this policy and subtract all of the costs, the right thing to do is the one that maximizes the balance of happiness over suffering.
That's what it means to maximize utility. Now, today I want to see whether you agree or disagree with it. This utilitarian logic often goes under the name of cost-benefit analysis, which is used by companies and by governments all the time. What it involves is placing a value, usually a dollar value, to stand for utility, on the costs and the benefits of various proposals.

Recently in the Czech Republic there was a proposal to increase the excise tax on smoking. Philip Morris, the tobacco company, does huge business in the Czech Republic. They commissioned a cost-benefit analysis of smoking in the Czech Republic, and what their cost-benefit analysis found was that the government gains by having Czech citizens smoke.

Now, how do they gain? It's true that there are negative effects on the public finances of the Czech government, because there are increased health care costs for people who develop smoking-related diseases. On the other hand, there were positive effects, and those were added up on the other side of the ledger. The positive effects included, for the most part, various tax revenues that the government derives from the sale of cigarette products, but they also included health care savings to the government when people die early; pension savings, you don't have to pay pensions for as long; and also savings in housing costs for the elderly.

And when all of the costs and benefits were added up, the Philip Morris study found that there is a net public finance gain in the Czech Republic of a hundred and forty-seven million dollars, and given the savings in housing and health care and pension costs, the government enjoys savings of over twelve hundred dollars for each person who dies prematurely due to smoking.
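The structure of the calculation is nothing more than a ledger: add up the benefits, subtract the costs, and read off the sign of the net. Here is a minimal sketch in Python. The category names follow the lecture's description of the study, but the individual dollar amounts are illustrative placeholders, since the lecture reports only the net figure of $147 million, not the line items; they are chosen here just so the net comes out to the reported number.

```python
# Cost-benefit analysis as a simple ledger: total benefits minus total costs.
# Category names follow the study as described in the lecture; the amounts
# are illustrative placeholders, NOT figures from the Philip Morris study.

costs = {
    "increased health care for smoking-related disease": 300_000_000,
}

benefits = {
    "tax revenue from the sale of cigarette products": 350_000_000,
    "health care savings when people die early": 50_000_000,
    "pension savings (pensions paid for fewer years)": 30_000_000,
    "savings in housing costs for the elderly": 17_000_000,
}

net = sum(benefits.values()) - sum(costs.values())
print(f"Net public finance effect: ${net:,}")  # positive means the state 'gains'
```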
So that's cost-benefit analysis. Now, those among you who are defenders of utilitarianism may think that this is an unfair test. Philip Morris was pilloried in the press, and they issued an apology for this heartless calculation. You may say that what's missing here is something that the utilitarian can easily incorporate, namely the value to the person and to the families of those who die from lung cancer.

What about the value of life? Some cost-benefit analyses incorporate a measure for the value of life. One of the most famous of these involved the Ford Pinto case. Did any of you read about that? This was back in the 1970s. You remember that the Ford Pinto was a kind of car? Anybody? It was a small car, a subcompact car, very popular. But it had one problem, which is that the fuel tank was at the back of the car, and in rear collisions the fuel tank exploded, and some people were killed and some severely injured.

Victims of these injuries took Ford to court to sue, and in the court case it turned out that Ford had long since known about the vulnerable fuel tank and had done a cost-benefit analysis to determine whether it would be worth it to put in a special shield that would protect the fuel tank and prevent it from exploding.
They did a cost-benefit analysis. The cost per part to increase the safety of the Pinto they calculated at eleven dollars per part. And here's the cost-benefit analysis that emerged in the trial: eleven dollars per part, at 12.5 million cars and trucks, came to a total cost of 137 million dollars to improve the safety.

But then they calculated the benefits of spending all this money on a safer car. They counted 180 deaths, and they assigned a dollar value of 200 thousand dollars per death; 180 injuries, at 67 thousand each; and then the replacement cost for the two thousand vehicles that would be destroyed without the safety device, at 700 dollars per vehicle. So the benefits turned out to be only 49.5 million, and so they didn't install the device.

Needless to say, when this memo of the Ford Motor Company's cost-benefit analysis came out in the trial, it appalled the jurors, who awarded a huge settlement.
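The memo's reasoning fits in a few lines of arithmetic. Here is a minimal sketch in Python, using only the figures quoted in the lecture:

```python
# The Ford Pinto memo arithmetic, using only the figures quoted above.

cost_of_fix = 11 * 12_500_000   # $11 per part across 12.5 million cars and trucks
                                # = $137,500,000 ("137 million dollars")

benefits_of_fix = (
    180 * 200_000     # 180 deaths valued at $200,000 each
    + 180 * 67_000    # 180 injuries valued at $67,000 each
    + 2_000 * 700     # 2,000 burned vehicles at $700 replacement cost each
)                     # = $49,460,000 ("only 49.5 million")

print(f"cost: ${cost_of_fix:,}   benefits: ${benefits_of_fix:,}")
print("install the shield?", benefits_of_fix > cost_of_fix)  # False, so Ford didn't
```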
Is this a counterexample to the utilitarian idea of calculating, because Ford included a measure of the value of life? Now, who here wants to defend cost-benefit analysis from this apparent counterexample? Who has a defense? Or do you think it completely destroys the whole utilitarian calculus?

I think that once again they've made the same mistake the previous case did: they've assigned a dollar value to human life, and once again they failed to take into account things like suffering and emotional losses of families. I mean, families lost earnings, but they also lost a loved one, and that is of more value than 200 thousand dollars.

Good. And wait, wait, wait, what's your name? Julie Roto. So if two hundred thousand, Julie, is too low a figure, because it doesn't include the loss of a loved one and the loss of those years of life, what would be, what do you think would be, a more accurate number?

I don't believe I could give a number. I think that this sort of analysis shouldn't be applied to issues of human life. I think it can't be used monetarily.

So they didn't just put too low a number, Julie says; they were wrong to try to put any number at all. All right, let's hear someone who...

You have to adjust for inflation.

All right, fair enough. So what would the number be now? This was thirty-five years ago.

Two million dollars.

You would put two million. And what's your name? Voicheck. Voicheck says we have to allow for inflation; we should be more generous. Then would you be satisfied that this is the right way of thinking about the question?

I guess, unfortunately, it is, for there needs to be a number put somewhere. I'm not sure what that number would be, but I do agree that there could possibly be a number put on a human life.

All right, so Voicheck says, and here he disagrees with Julie: Julie says we can't put a number on human life for the purpose of a cost-benefit analysis; Voicheck says we have to, because we have to make decisions somehow.

What do other people think about this? Is there anyone prepared to defend cost-benefit analysis here as accurate, as desirable?

I think that if Ford and other car companies didn't use cost-benefit analysis, they'd eventually go out of business, because they wouldn't be able to be profitable, and millions of people wouldn't be able to use their cars to get to jobs, to put food on the table, to feed their children. So I think that if cost-benefit analysis isn't employed, the greater good is sacrificed in this case.

All right, let me ask, what's your name? Raul.
Raul, there was recently a study done about cell phone use by drivers, when people are driving a car, and there's a debate about whether that should be banned. The figure was that some two thousand people die each year as a result of accidents involving cell phone use. And yet the cost-benefit analysis, which was done by the Center for Risk Analysis at Harvard, found that if you look at the benefits of the cell phone use, and you put some value on the life, it comes out about the same, because of the enormous economic benefit of enabling people to take advantage of their time, not waste time, be able to make deals and talk to friends and so on while they're driving. Doesn't that suggest that it's a mistake to try to put monetary figures on questions of human life?

Well, I think that if the great majority of people try to derive maximum utility out of a service like using cell phones, and the convenience that cell phones provide, that sacrifice is necessary for satisfaction to occur.

You're an outright utilitarian. Yes, okay. All right then, one last question, Raul, and I put this to Voicheck: what dollar figure should be put on human life to decide whether to ban the use of cell phones?

Well, I don't want to arbitrarily calculate a figure, I mean, right now, I think that...

You want to take it under advisement. Yeah, I'll take it under advisement. But what, roughly speaking, would it be? You've got twenty-three hundred deaths. You've got to assign a dollar value to know whether you want to prevent those deaths by banning the use of cell phones in cars. So what would your hunch be? How much? A million? Two million? Two million was Voicheck's figure. Is that about right?

Maybe a million.

A million?! All right, that's good, thank you.

So these are some of the controversies that arise these days from cost-benefit analysis, especially those that involve placing a dollar value on everything to be added up.
Well, now I want to turn to your objections, not necessarily to cost-benefit analysis specifically, because that's just one version of the utilitarian logic in practice today, but to the theory as a whole, to the idea that the right thing to do, the just basis for policy and law, is to maximize utility.

How many disagree with the utilitarian approach to law and to the common good? How many agree with it? So more agree than disagree. Let's hear from the critics.

My main issue with it is that I feel like you can't say that just because someone's in the minority, what they want and need is less valuable than someone who's in the majority. So I guess I have an issue with the idea that the greatest good for the greatest number is okay, because there's still, what about people who are in the lesser number? It's not fair to them; they didn't have a say in where they wanted to be.

All right, now that's an interesting objection. You're worried about the effect on the minority. Yes. What's your name, by the way? Anna. All right, who has an answer to Anna's worry about the effect on the minority? What do you say to Anna?

She said that the minority's values count for less. I don't think that's the case, because individually the minority's value is just the same as the individual's in the majority; it's just that the numbers outweigh the minority. And I mean, at a certain point you have to make a decision, and I'm sorry for the minority, but sometimes it's for the general, for the greater good.

For the greater good. And what's your name? Youngda. Anna, what do you say to Youngda? Youngda says you just have to add up people's preferences, and those in the minority do have their preferences weighed. Can you give an example of the kind of thing you're worried about, when you say you're worried about utilitarianism violating the concern or respect due the minority? Can you give an example?
Well, with any of the cases that we've talked about, like with the shipwreck one, I think that the boy who was eaten still had just as much of a right to live as the other people, and just because he was the minority in that case, the one who maybe had less of a chance to keep living, that doesn't mean that the others automatically have a right to eat him, just because it would give a greater number of people the chance to live.

So there may be certain rights that the minority members have, that the individual has, that shouldn't be traded off for the sake of utility? Yes, Anna? Now, this would be a test for you. Back in ancient Rome they threw Christians to the lions in the Colosseum for sport. If you think how the utilitarian calculus would go: yes, the Christian thrown to the lions suffers enormous, excruciating pain, but look at the collective ecstasy of the Romans.

Youngda. Well, in that time... I don't think, in the modern day, putting a number on the happiness given to the people watching, any policymaker would say the pain of one person, the suffering of one person, is outweighed by the happiness gained.

No, but you have to admit that if there were enough Romans delirious with happiness, it would outweigh even the most excruciating pain of a handful of Christians thrown to the lions.

So we really have here two different objections to utilitarianism. One has to do with whether utilitarianism adequately respects individual rights, or minority rights. And the other has to do with the whole idea of aggregating utility, of preferences or values. Is it possible to aggregate all values, to translate them into dollar terms? In the 1930s there was a psychologist who tried to address the second question.
He tried to prove what utilitarianism assumes, that it is possible to translate all goods, all values, all human concerns into a single uniform measure. And he did this by conducting a survey of young recipients of relief, this was in the 1930s. He gave them a list of unpleasant experiences, and he asked them: how much would you have to be paid to undergo the following experiences? And he kept track.

For example, how much would you have to be paid to have one upper front tooth pulled out? Or how much would you have to be paid to have one little toe cut off? Or to eat a live earthworm, six inches long? Or to live the rest of your life on a farm in Kansas? Or to choke a stray cat to death with your bare hands?

Now, what do you suppose was the most expensive item on that list? Kansas? You're right, it was Kansas. For Kansas, people said they'd have to be paid three hundred thousand dollars.

What do you think was the next most expensive? Not the cat, not the tooth, not the toe. The worm! People said you'd have to pay them a hundred thousand dollars to eat the worm.

What do you think was the least expensive item? Not the cat. The tooth. During the Depression, people were willing to have their tooth pulled for only forty-five hundred dollars.

Now, here's what Thorndike concluded from his study: any want or satisfaction which exists, exists in some amount and is therefore measurable. The life of a dog or a cat or a chicken consists of appetites, cravings, desires, and their gratifications. So does the life of human beings, though the appetites and desires are more complicated.

But what about Thorndike's study?
Does it support Bentham's idea that all goods, all values, can be captured according to a single uniform measure of value? Or does the preposterous character of those different items on the list suggest the opposite conclusion, that maybe, whether we're talking about life, or Kansas, or the worm, the things we value and cherish can't be captured according to a single uniform measure of value? And if they can't, what are the consequences for the utilitarian theory of morality? That's a question we'll continue with next time.

Last time we began to consider some objections to Jeremy Bentham's version of utilitarianism. People raised two objections in the discussion we had. The first was the objection, the claim, that utilitarianism, by concerning itself with the greatest good for the greatest number, fails adequately to respect individual rights.
Today we have debates about torture and terrorism. Suppose a suspected terrorist was apprehended on September tenth, and you had reason to believe that the suspect had crucial information about an impending terrorist attack that would kill over three thousand people, and you couldn't extract the information. Would it be just to torture the suspect to get the information? Or do you say no, there is a categorical moral duty of respect for individual rights? In a way, we're back to the questions we started with, about trolley cars and organ transplants. So that's the first issue.

And you remember we considered some examples of cost-benefit analysis. But a lot of people were unhappy with cost-benefit analysis when it came to placing a dollar value on human life. And so that led us to the second objection. It questioned whether it's possible to translate all values into a single uniform measure of value. It asks, in other words, whether all values are commensurable.

Let me give you one other example of an experience. This actually is a true story; it comes from personal experience, and it raises a question at least about whether all values can be translated without loss into utilitarian terms.

Some years ago, when I was a graduate student, I was at Oxford in England, and they had men's and women's colleges, they weren't yet mixed, and the women's colleges had rules against overnight male guests. By the nineteen-seventies these rules were rarely enforced and easily violated, or so I was told. By the late nineteen-seventies, when I was there, pressure grew to relax these rules, and it became the subject of debate among the faculty at St. Anne's College, which was one of these all-women colleges.

The older women on the faculty were traditionalists. They were opposed to change, on conventional moral grounds. But times had changed, and they were embarrassed to give the true grounds of their objection, and so they translated their arguments into utilitarian terms. If men stay overnight, they argued, the costs to the college will increase.
How, you might wonder? Well, they'll want to take baths, and that will use up hot water, they said. Furthermore, they argued, we'll have to replace the mattresses more often.

The reformers met these arguments by adopting the following compromise: each woman could have a maximum of three overnight male guests each week. They didn't say whether it had to be the same one or three different ones, provided, and this is the compromise, provided the guest paid fifty pence to defray the cost to the college. The next day the headline in the national newspaper read, "St. Anne's Girls, Fifty Pence a Night."

Another illustration of the difficulty of translating all values, in this case a certain idea of virtue, into utilitarian terms. So that's all to illustrate the second objection to utilitarianism, at least the part of that objection that questions whether utilitarianism is right to assume that we can assume the uniformity of value, the commensurability of values, and translate all moral considerations into dollars or money.

But there is a second aspect to this worry about aggregating values and preferences. Why should we weigh all preferences that people have, without assessing whether they're good preferences or bad preferences? Shouldn't we distinguish between higher pleasures and lower pleasures?

Now, part of the appeal of not making any qualitative distinctions about the worth of people's preferences, part of the appeal is that it is nonjudgmental and egalitarian. The Benthamite utilitarian says everybody's preferences count, and they count regardless of what people want, regardless of what makes different people happy.
For Bentham, all that matters, you'll remember, are the intensity and the duration of a pleasure or pain. The so-called higher pleasures or nobler virtues are simply those, according to Bentham, that produce stronger, longer pleasure. He had a famous phrase to express this idea: "the quantity of pleasure being equal, pushpin is as good as poetry." What was pushpin? It was some kind of a child's game, like tiddlywinks. Pushpin is as good as poetry, Bentham said.

And lying behind this idea, I think, is the claim, the intuition, that it's a presumption to judge whose pleasures are intrinsically higher or worthier or better. And there is something attractive in this refusal to judge. After all, some people like Mozart, others Madonna; some people like ballet, others bowling. Who's to say, a Benthamite might argue, who's to say which of these pleasures, whose pleasures, are higher, worthier, nobler than others?

But is that right, this refusal to make qualitative distinctions? Can we altogether dispense with the idea that certain things we take pleasure in are better or worthier than others?

Think back to the case of the Romans in the Colosseum. One thing that troubled people about that practice is that it seemed to violate the rights of the Christian. Another way of objecting to what's going on there is that the pleasure that the Romans take in this bloody spectacle, should that pleasure, which is a base, kind of corrupt, degrading pleasure, should that even be valorized or weighed in deciding what the general welfare is?

So here are the objections to Bentham's utilitarianism, and now we turn to someone who tried to respond to those objections, a later-day utilitarian, John Stuart Mill. So what we need to examine now is whether John Stuart Mill had a convincing reply to these objections to utilitarianism.
John Stuart Mill was born in 1806. His father, James Mill, was a disciple of Bentham's, and James Mill set about giving his son, John Stuart Mill, a model education. He was a child prodigy; John Stuart Mill knew Greek at the age of three, Latin at eight, and at age ten he wrote a history of Roman law.

At age twenty he had a nervous breakdown. This left him in a depression for five years, but at age twenty-five, what helped lift him out of this depression is that he met Harriet Taylor. She eventually married him, they lived happily ever after, and it was under her influence that John Stuart Mill tried to humanize utilitarianism.

What Mill tried to do was to see whether the utilitarian calculus could be enlarged and modified to accommodate humanitarian concerns, like the concern to respect individual rights, and also to address the distinction between higher and lower pleasures. In 1859 Mill wrote a famous book, On Liberty, the main point of which was the importance of defending individual rights and minority rights. And in 1861, toward the end of his life, he wrote the book we read as part of this course, Utilitarianism.

It makes it clear that utility is the only standard of morality, in his view. So he's not challenging Bentham's premise; he's affirming it. He says very explicitly, "The sole evidence it is possible to produce that anything is desirable is that people actually do desire it." So he stays with the idea that our de facto, actual, empirical desires are the only basis for moral judgment.

But then, on page eight, also in chapter two, he argues that it is possible for a utilitarian to distinguish higher from lower pleasures. Now, those of you who've read Mill already: how, according to him, is it possible to draw that distinction? How can a utilitarian distinguish qualitatively higher pleasures from lesser ones, base ones, unworthy ones?

If you tried both of them, and you prefer the higher one, naturally, always.

That's great. That's right.
What's your name? John. So, as John points out, Mill says here's the test: since we can't step outside actual desires, actual preferences, that would violate utilitarian premises, the only test of whether a pleasure is higher or lower is whether someone who has experienced both would prefer it.

And here, in chapter two, we see the passage where Mill makes the point that John just described: "Of two pleasures, if there be one to which all or almost all who have experience of both give a decided preference, irrespective of any feeling of moral obligation to prefer it," in other words, no outside, no independent standard, "then that is the more desirable pleasure."

What do people think about that argument? Does it succeed? How many think that it does succeed, in arguing within utilitarian terms for a distinction between higher and lower pleasures? How many think it doesn't succeed? I want to hear your reasons.

But before we give the reasons, let's do an experiment, a test of Mill's claim. In order to do this experiment, we're going to look at three short excerpts of popular entertainment. The first one is a Hamlet soliloquy. It'll be followed by two other experiences. See what you think.

"What a piece of work is a man! How noble in reason, how infinite in faculties, in form and moving how express and admirable, in action how like an angel, in apprehension how like a god: the beauty of the world, the paragon of animals. And yet, to me, what is this quintessence of dust? Man delights not me."

"Imagine a world where your greatest fears become reality. Each show, six contestants from around the country battle each other in three extreme stunts. These stunts are designed to challenge these contestants both physically and mentally. Six contestants, three stunts, one winner. Fear Factor."

The Simpsons. "Well, hi-diddly-ho, pedal-to-the-metal-ophiles!" "Flanders, since when do you like anything cool?" "Well, I don't care for the speed, but I can't get enough of that safety gear: helmets, roll bars, caution flags. I like the fresh air and looking at the poor people in the infield."
"Dang, Cletus, why you got to park by my parents?" "Now, honey, it's my parents too."

I don't even have to ask which one you liked most. The Simpsons? How many liked the Simpsons most? How many Shakespeare? What about Fear Factor? How many preferred Fear Factor? Really?

People overwhelmingly like the Simpsons better than Shakespeare. All right, now let's take the other part of the poll, which is: the highest experience or pleasure? How many say Shakespeare? How many say Fear Factor? No, you can't be serious. Really? All right, go ahead, you can say it.

I found that one the most entertaining.

I know, but which do you think was the worthiest, the noblest experience? I know you found it the most entertaining.

If something is good just because it is pleasurable, what does it matter whether you have some kind of abstract idea of whether it is good by someone else's sense or not?

All right, so you come down on the straight Benthamite side: who's to judge, and why should we judge, apart from just registering and aggregating de facto preferences? All right, fair enough. What's your name? Nate? Okay, fair enough.

All right, so how many think that the Simpsons is actually, apart from the liking of it, actually the higher experience, higher than Shakespeare? All right, let's see the vote for Shakespeare again. How many think Shakespeare is higher?

All right, so why is it? Ideally I'd like to hear from someone: is there someone who thinks Shakespeare is highest, but who preferred watching the Simpsons?

Like, I guess, just sitting and watching the Simpsons, it's entertaining, because they make jokes, they make us laugh. But someone has to tell us that Shakespeare was this great writer; we had to be taught how to read him, how to understand him; we had to be taught how to take in Rembrandt, how to analyze a painting.

Well, how do, what's your name? Aneesha.
Aneesha, when you say someone told you that Shakespeare's better, are you accepting it on blind faith? You voted that Shakespeare's higher only because the culture tells you that, or teachers tell you that? Or do you actually agree with that yourself?

Well, in the sense that Shakespeare, no. But earlier you made an example of Rembrandt. I feel like I would enjoy reading a comic book more than I would enjoy analyzing Rembrandt because someone told me it was great, you know.

Right, so some of this seems to be, you're suggesting, a kind of cultural convention and pressure. We're told what books, what works of art, are great. Who else?

Although I enjoyed watching the Simpsons more in this particular moment in Justice, if I were to spend the rest of my life considering the three different video clips shown, I would not want to spend that remainder of my life considering the latter two clips. I think I would derive more pleasure from being able to branch out in my own mind, sort of considering more deep pleasures, more deep thoughts.

And tell me your name. Joe. Joe, so if you had to spend the rest of your life on a farm in Kansas, with only Shakespeare or the collected episodes of the Simpsons, you would prefer Shakespeare. What do you conclude from that about John Stuart Mill's test, that the test of a higher pleasure is whether people who have experienced both prefer it?

Can I cite another example briefly?
In biology, in neurobiology, last year, we were told of a rat that was tested: a particular center in the brain, where the rat was able to stimulate its brain and cause itself intense pleasure repeatedly. The rat did not eat or drink until it died. So the rat was clearly experiencing intense pleasure. Now, if you asked me right now whether I'd rather experience intense pleasure or have a full lifetime of higher pleasure, I would consider intense pleasure to be lower pleasure. Would I enjoy intense pleasure right now? Yes, I would. But over a lifetime, I would think almost a complete majority here would agree that they would rather be a human with higher pleasure than that rat with intense pleasure for a momentary period of time.

So now, in answer to your question, right, I think this proves, or I won't say proves, I think the conclusion is that Mill's theory holds: when a majority of people are asked what they would rather do, they will answer that they would rather engage in a higher pleasure.

So you think that this supports Mill, that Mill was on to something here? I do.

All right, is there anyone who disagrees with Joe, who thinks that our experiment disproves Mill's test, shows that that's not an adequate way, that you can't distinguish higher pleasures within the utilitarian framework?

If whatever is good is truly just whatever people prefer, it's truly relative, and there's no objective definition, then there will be some society where people prefer the Simpsons more. Anyone can appreciate the Simpsons, but I think it does take education to appreciate Shakespeare.

All right, you're saying it takes education to appreciate the higher things. Mill's point is that the higher pleasures do require cultivation and appreciation and education. He doesn't dispute that. But once having been cultivated and educated, people will not only see the difference between higher and lower pleasures, but will actually prefer the higher to the lower.

You find this famous passage from John Stuart Mill: "It is better to be a human being dissatisfied than a pig satisfied;
better to be Socrates dissatisfied than a fool satisfied. And if the fool or the pig are of a different opinion, it is because they only know their side of the question."

So here you have an attempt to distinguish higher from lower pleasures. Going to an art museum, or being a couch potato, swilling beer, watching television at home: sometimes, Mill agrees, we might succumb to the temptation to do the latter, to be couch potatoes. But even when we do that, out of indolence and sloth, we know that the pleasure we get gazing at Rembrandts in the museum is actually higher, because we've experienced both. And it is a higher pleasure, gazing at Rembrandts, because it engages our higher human faculties.

What about Mill's attempt to reply to the objection about individual rights? In a way, he uses the same kind of argument, and this comes out in chapter five. He says, "While I dispute the pretensions of any theory which sets up an imaginary standard of justice not grounded on utility..." But still, he considers justice grounded on utility to be what he calls the chief part, and incomparably the most sacred and binding part, of all morality.

So justice is higher, individual rights are privileged, but not for reasons that depart from utilitarian assumptions. "Justice is a name for certain moral requirements which, regarded collectively, stand higher in the scale of social utility, and are therefore of more paramount obligation, than any others." So justice is sacred, it's prior, it's privileged; it isn't something that can easily be traded off against lesser things. But the reason, Mill claims, is ultimately a utilitarian reason, once you consider the long-run interests of humankind, of all of us, as progressive beings. If we do justice and if we respect rights, society as a whole will be better off in the long run.

Well, is that convincing?
Or is Mill actually, without admitting it, stepping outside utilitarian considerations in arguing for qualitatively higher pleasures, and for sacred or specially important individual rights? We haven't fully answered that question, because to answer it, in the case of rights and justice, will require that we explore other ways, non-utilitarian ways, of accounting for the basis of rights, and then ask whether they succeed.

As for Jeremy Bentham, who launched utilitarianism as a doctrine in moral and legal philosophy, Bentham died in 1832, at the age of eighty-five. But if you go to London, you can visit him today, literally. He provided in his will that his body be preserved, embalmed, and displayed at the University of London, where he still presides in a glass case, with a wax head, dressed in his actual clothing.

You see, before he died, Bentham addressed himself to a question consistent with his philosophy: of what use could a dead man be to the living? One use, he said, would be to make one's corpse available for the study of anatomy. In the case of great philosophers, however, better yet to preserve one's physical presence in order to inspire future generations of thinkers.

You want to see what Bentham looks like, stuffed? Here's what he looks like. There he is. Now, if you look closely, you'll notice that the embalming of his actual head was not a success, so they substituted a wax head, and at the bottom, for verisimilitude, you can actually see his actual head on a plate. You see it? Right there.

So, what's the moral of the story? The moral of the story, by the way, they bring him out during meetings of the board at University College London, and the minutes record him as present but not voting. Here is a philosopher who, in life and in death, adhered to the principles of his philosophy. We'll continue with rights next time.