Last time we argued about the case of the Queen versus Dudley and Stephens, the lifeboat case, the case of cannibalism at sea. With the arguments about the lifeboat in mind, the arguments for and against what Dudley and Stephens did in mind, let's turn back to the philosophy, the utilitarian philosophy of Jeremy Bentham.

Bentham was born in England in 1748. At the age of twelve he went to Oxford; at fifteen he went to law school. He was admitted to the bar at age nineteen, but he never practiced law. Instead he devoted his life to jurisprudence and moral philosophy.

Last time we began to consider Bentham's version of utilitarianism. The main idea is simply stated, and it's this: the highest principle of morality, whether personal or political morality, is to maximize the general welfare, or the collective happiness, or the overall balance of pleasure over pain; in a phrase, maximize utility.

Bentham arrives at this principle by the following line of reasoning: we're all governed by pain and pleasure, they are our sovereign masters, and so any moral system has to take account of them. How best to take account? By maximizing. And this leads to the principle of the greatest good for the greatest number. What exactly should we maximize? Bentham tells us: happiness, or more precisely, utility.
Maximizing utility is a principle not only for individuals but also for communities and for legislators. What, after all, is a community, Bentham asks? It's the sum of the individuals who comprise it. And that's why, in deciding the best policy, in deciding what the law should be, in deciding what's just, citizens and legislators should ask themselves the question: if we add up all of the benefits of this policy and subtract all of the costs, the right thing to do is the one that maximizes the balance of happiness over suffering. That's what it means to maximize utility.

Now, today I want to see whether you agree or disagree with it. This utilitarian logic often goes under the name of cost-benefit analysis, which is used by companies and by governments all the time, and what it involves is placing a value, usually a dollar value, to stand for utility, on the costs and the benefits of various proposals.

Recently in the Czech Republic there was a proposal to increase the excise tax on smoking. Philip Morris, the tobacco company, does huge business in the Czech Republic. They commissioned a cost-benefit analysis of smoking in the Czech Republic, and what their cost-benefit analysis found was that the government gains by having Czech citizens smoke. Now, how do they gain?
It's true that there are negative effects on the public finances of the Czech government, because there are increased health care costs for people who develop smoking-related diseases. On the other hand, there were positive effects, and those were added up on the other side of the ledger. The positive effects included, for the most part, the various tax revenues that the government derives from the sale of cigarette products, but they also included health care savings to the government when people die early, pension savings (you don't have to pay pensions for as long), and also savings in housing costs for the elderly. And when all of the costs and benefits were added up, the Philip Morris study found that there is a net public finance gain in the Czech Republic of a hundred and forty-seven million dollars, and given the savings in housing and health care and pension costs, the government enjoys savings of over twelve hundred dollars for each person who dies prematurely due to smoking. Cost-benefit analysis.

Now, those among you who are defenders of utilitarianism may think that this is an unfair test. Philip Morris was pilloried in the press, and they issued an apology for this heartless calculation. You may say that what's missing here is something that the utilitarian can easily incorporate, namely the value to the person and to the families of those who die from lung cancer. What about the value of life?

Some cost-benefit analyses do incorporate a measure for the value of life. One of the most famous of these involved the Ford Pinto case. Did any of you read about that? This was back in the 1970s. You remember that the Ford Pinto was a kind of car? Anybody? It was a small car, a subcompact car, very popular, but it had one problem, which is that the fuel tank was at the back of the car, and in rear collisions the fuel tank exploded, and some people were killed and some severely injured.
Victims of these injuries took Ford to court to sue, and in the court case it turned out that Ford had long since known about the vulnerable fuel tank and had done a cost-benefit analysis to determine whether it would be worth it to put in a special shield that would protect the fuel tank and prevent it from exploding.

They did a cost-benefit analysis. The cost per part to increase the safety of the Pinto they calculated at eleven dollars per part. And here is the cost-benefit analysis that emerged in the trial: eleven dollars per part, on 12.5 million cars and trucks, came to a total cost of 137 million dollars to improve the safety. But then they calculated the benefits of spending all this money on a safer car. They counted 180 deaths, and they assigned a dollar value of 200 thousand dollars per death; 180 injuries, at 67 thousand dollars each; and then the replacement cost for two thousand vehicles that would be destroyed without the safety device, at 700 dollars per vehicle. So the benefits turned out to be only 49.5 million, and so they didn't install the device.

Needless to say, when this memo of the Ford Motor Company's cost-benefit analysis came out in the trial, it appalled the jurors, who awarded a huge settlement. Is this a counterexample to the utilitarian idea of calculating? Because Ford did include a measure of the value of life. Now, who here wants to defend cost-benefit analysis from this apparent counterexample? Who has a defense? Or do you think it completely destroys the whole utilitarian calculus?
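The arithmetic behind those figures is easy to reproduce. Here is a minimal sketch (in Python), using only the numbers quoted in the lecture; the dollar values are Ford's 1970s-era assignments as reported at trial, not independently verified:

```python
# Cost-benefit arithmetic for the Pinto fuel-tank shield, using the
# figures quoted in the lecture (1970s dollars, as reported at trial).

cost_per_part = 11                 # dollars per shield
vehicles = 12_500_000              # cars and trucks affected

deaths, value_per_death = 180, 200_000
injuries, value_per_injury = 180, 67_000
burnt_vehicles, value_per_vehicle = 2_000, 700

cost_of_fix = cost_per_part * vehicles                    # ~$137.5 million
benefit_of_fix = (deaths * value_per_death                # avoided deaths
                  + injuries * value_per_injury           # avoided injuries
                  + burnt_vehicles * value_per_vehicle)   # avoided vehicle losses
                                                          # ~$49.5 million total

print(f"cost of fix:    ${cost_of_fix:,}")
print(f"benefit of fix: ${benefit_of_fix:,}")
print("install shield?", benefit_of_fix > cost_of_fix)    # False on these numbers
```

On this tally the shield fails the test by roughly 88 million dollars, which is exactly the kind of result the objections that follow are aimed at.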
I think that once again they've made the same mistake the previous case did, in that they've assigned a dollar value to human life, and once again they failed to take into account things like suffering and emotional losses of families. I mean, families lost earnings, but they also lost a loved one, and that is worth more than 200 thousand dollars.

Good. And wait, wait, wait, what's your name? Julie Roto.

So if two hundred thousand, Julie, is too low a figure, because it doesn't include the loss of a loved one and the loss of those years of life, what would be, what do you think would be a more accurate number?

I don't believe I could give a number. I think that this sort of analysis shouldn't be applied to issues of human life. I think it can't be used monetarily.

So they didn't just put too low a number, Julie says; they were wrong to try to put any number at all. All right, let's hear from someone who...

You have to adjust for inflation.

All right, fair enough. So what would the number be now? This was thirty-five years ago.

Two million dollars.

You would put two million. And what's your name? Voicheck.

Voicheck says we have to allow for inflation, we should be more generous. Then would you be satisfied that this is the right way of thinking about the question?

I guess unfortunately it is, for... there needs to be a number put somewhere. I'm not sure what that number would be, but I do agree that there could possibly be a number put on a human life.

All right, so Voicheck says, and here he disagrees with Julie. Julie says we can't put a number on human life for the purpose of a cost-benefit analysis; Voicheck says we have to, because we have to make decisions somehow.

What do other people think about this? Is there anyone prepared to defend cost-benefit analysis here as accurate, as desirable?
I think that if Ford and other car companies didn't use cost-benefit analysis, they'd eventually go out of business because they wouldn't be able to be profitable, and millions of people wouldn't be able to use their cars to get to jobs, to put food on the table, to feed their children. So I think that if cost-benefit analysis isn't employed, the greater good is sacrificed, in this case.

All right, let me ask, what's your name? Raul. Raul.

There was recently a study done about cell phone use by drivers, when people are driving a car, and there's a debate about whether that should be banned. The figure was that some two thousand people die each year as a result of accidents involving cell phones. And yet the cost-benefit analysis, which was done by the Center for Risk Analysis at Harvard, found that if you look at the benefits of the cell phone use and you put some value on the life, it comes out about the same, because of the enormous economic benefit of enabling people to take advantage of their time, not waste time, be able to make deals and talk to friends and so on while they're driving. Doesn't that suggest that it's a mistake to try to put monetary figures on questions of human life?

Well, I think that if the great majority of people try to derive maximum utility out of a service like using cell phones, and the convenience that cell phones provide, that sacrifice is necessary for satisfaction to occur.

You're an outright utilitarian. Yes, okay. All right then, one last question, Raul, and I put this to Voicheck: what dollar figure should be put on human life to decide whether to ban the use of cell phones?

Well, I don't want to arbitrarily calculate a figure. I mean, right now I think that...

You want to take it under advisement. Yeah, I'll take it under advisement. But what, roughly speaking, would it be?
You've got twenty-three hundred deaths. You've got to assign a dollar value to know whether you want to prevent those deaths by banning the use of cell phones in cars. So what would your hunch be? How much? A million? Two million? Two million was Voicheck's figure. Is that about right? Maybe a million. A million?! All right, that's good, thank you.

So these are some of the controversies that arise these days from cost-benefit analysis, especially those that involve placing a dollar value on everything to be added up.

Well, now I want to turn to your objections, not necessarily to cost-benefit analysis specifically, because that's just one version of the utilitarian logic in practice today, but to the theory as a whole, to the idea that the right thing to do, the just basis for policy and law, is to maximize utility.

How many disagree with the utilitarian approach to law and to the common good? How many agree with it? So more agree than disagree. So let's hear from the critics.

My main issue with it is that I feel like you can't say that just because someone's in the minority, what they want and need is less valuable than someone who's in the majority. So I guess I have an issue with the idea that the greatest good for the greatest number is okay, because what about the people who are in the lesser number? It's not fair to them; they didn't have a say in where they wanted to be.

All right, now that's an interesting objection. You're worried about the effect on the minority. Yes. What's your name, by the way? Anna. All right, who has an answer to Anna's worry about the effect on the minority? What do you say to Anna?
She said that the minority is valued less. I don't think that's the case, because individually the minority's value is just the same as the individual's in the majority; it's just that the numbers outweigh the minority. And I mean, at a certain point you have to make a decision, and I'm sorry for the minority, but sometimes it's for the general, for the greater good.

For the greater good. Anna, what do you say? What's your name? Youngda. What do you say to Youngda? Youngda says you just have to add up people's preferences, and those in the minority do have their preferences weighed. Can you give an example of the kind of thing you're worried about, when you say you're worried about utilitarianism violating the concern or respect due the minority? Can you give an example?

So, well, with any of the cases that we've talked about, like with the shipwreck one, I think that the boy who was eaten still had just as much of a right to live as the other people, and just because he was the minority in that case, the one who maybe had less of a chance to keep living, that doesn't mean that the others automatically have a right to eat him, just because it would give a greater number of people the chance to live.

So there may be certain rights that the minority members have, that the individual has, that shouldn't be traded off for the sake of utility? Yes, Anna. Now, this would be a test for you. Back in ancient Rome they threw Christians to the lions in the Colosseum for sport. If you think how the utilitarian calculus would go: yes, the Christian thrown to the lions suffers enormous, excruciating pain, but look at the collective ecstasy of the Romans. Youngda.
Well, in that time... I don't think, in the modern day, to give a number to the happiness given to the people watching, I don't think any policy maker would say the pain of one person, the suffering of one person, is much, much... in comparison to the happiness gained.

No, but you have to admit that if there were enough Romans delirious with happiness, it would outweigh even the most excruciating pain of a handful of Christians thrown to the lions.

So we really have here two different objections to utilitarianism. One has to do with whether utilitarianism adequately respects individual rights or minority rights, and the other has to do with the whole idea of aggregating utility, or preferences, or values. Is it possible to aggregate all values, to translate them into dollar terms?

There was, in the 1930s, a psychologist who tried to address the second question. He tried to prove what utilitarianism assumes, that it is possible to translate all goods, all values, all human concerns into a single uniform measure, and he did this by conducting a survey of young recipients of relief, this was in the 1930s. He gave them a list of unpleasant experiences and asked them: how much would you have to be paid to undergo the following experiences? And he kept track.

For example, how much would you have to be paid to have one upper front tooth pulled out? Or how much would you have to be paid to have one little toe cut off? Or to eat a live earthworm, six inches long? Or to live the rest of your life on a farm in Kansas? Or to choke a stray cat to death with your bare hands? Now, what do you suppose, what do you suppose was the most expensive item on that list?

Kansas?
You're right, it was Kansas. For Kansas, people said they'd have to be paid three hundred thousand dollars.

What do you think, what do you think was the next most expensive? Not the cat. Not the tooth. Not the toe. The worm! People said you'd have to pay them a hundred thousand dollars to eat the worm.

What do you think was the least expensive item? Not the cat: the tooth. During the Depression people were willing to have their tooth pulled for only forty-five hundred dollars.

Now, here's what Thorndike concluded from his study: any want or satisfaction which exists, exists in some amount and is therefore measurable. The life of a dog or a cat or a chicken consists of appetites, cravings, desires, and their gratifications. So does the life of human beings, though the appetites and desires are more complicated.

But what about Thorndike's study? Does it support Bentham's idea that all goods, all values, can be captured according to a single uniform measure of value? Or does the preposterous character of those different items on the list suggest the opposite conclusion, that maybe, whether we're talking about life or Kansas or the worm, the things we value and cherish can't be captured according to a single uniform measure of value? And if they can't, what are the consequences for the utilitarian theory of morality? That's a question we'll continue with next time.
Last time we began to consider some objections to Jeremy Bentham's version of utilitarianism. People raised two objections in the discussion we had.

The first was the objection, the claim, that utilitarianism, by concerning itself with the greatest good for the greatest number, fails adequately to respect individual rights. Today we have debates about torture and terrorism. Suppose a suspected terrorist was apprehended on September tenth, and you had reason to believe that the suspect had crucial information about an impending terrorist attack that would kill over three thousand people, and you couldn't extract the information. Would it be just to torture the suspect to get the information, or do you say no, there is a categorical moral duty of respect for individual rights? In a way we're back to the questions we started with, about trolley cars and organ transplants. So that's the first issue.

And you remember we considered some examples of cost-benefit analysis, but a lot of people were unhappy with cost-benefit analysis when it came to placing a dollar value on human life. And so that led us to the second objection. It questioned whether it's possible to translate all values into a single uniform measure of value; it asks, in other words, whether all values are commensurable.

Let me give you one other example of an experience, this actually is a true story, it comes from personal experience, that raises a question at least about whether all values can be translated without loss into utilitarian terms. Some years ago, when I was a graduate student, I was at Oxford in England, and they had men's and women's colleges.
They weren't yet mixed, and the women's colleges had rules against overnight male guests. By the nineteen-seventies these rules were rarely enforced and easily violated, or so I was told. By the late nineteen-seventies, when I was there, pressure grew to relax these rules, and it became the subject of debate among the faculty at St. Anne's College, which was one of these all-women colleges.

The older women on the faculty were traditionalists. They were opposed to change, on conventional moral grounds, but times had changed, and they were embarrassed to give the true grounds of their objection, and so they translated their arguments into utilitarian terms. If men stay overnight, they argued, the costs to the college will increase. How, you might wonder? Well, they'll want to take baths, and that will use up hot water, they said. Furthermore, they argued, we'll have to replace the mattresses more often.

The reformers met these arguments by adopting the following compromise: each woman could have a maximum of three overnight male guests each week. They didn't say whether it had to be the same one, or three different ones, provided, and this is the compromise, provided the guest paid fifty pence to defray the cost to the college. The next day the headline in the national newspaper read: St. Anne's girls, fifty pence a night.

Another illustration of the difficulty of translating all values, in this case a certain idea of virtue, into utilitarian terms. So that's all to illustrate the second objection to utilitarianism, at least the part of that objection that questions whether utilitarianism is right to assume that we can assume the uniformity of value, the commensurability of values, and translate all moral considerations into dollars or money.
But there is a second aspect to this worry about aggregating values and preferences. Why should we weigh all preferences that people have without assessing whether they're good preferences or bad preferences? Shouldn't we distinguish between higher pleasures and lower pleasures?

Now, part of the appeal of not making any qualitative distinctions about the worth of people's preferences, part of the appeal, is that it is non-judgmental and egalitarian. The Benthamite utilitarian says everybody's preferences count, and they count regardless of what people want, regardless of what makes different people happy. For Bentham, all that matters, you'll remember, are the intensity and the duration of a pleasure or pain. The so-called higher pleasures or nobler virtues are simply those, according to Bentham, that produce stronger, longer pleasure. In a famous phrase he expressed this idea: the quantity of pleasure being equal, pushpin is as good as poetry.

What was pushpin? It was some kind of children's game, like tiddlywinks. Pushpin is as good as poetry, Bentham said. And lying behind this idea, I think, is the claim, the intuition, that it's a presumption to judge whose pleasures are intrinsically higher or worthier or better. And there is something attractive in this refusal to judge. After all, some people like Mozart, others Madonna; some people like ballet, others bowling. Who's to say, a Benthamite might argue, who's to say which of these pleasures, whose pleasures, are higher, worthier, nobler than others?
But is that right, this refusal to make qualitative distinctions? Can we altogether dispense with the idea that certain things we take pleasure in are better or worthier than others?

Think back to the case of the Romans in the Colosseum. One thing that troubled people about that practice is that it seemed to violate the rights of the Christian. Another way of objecting to what's going on there is that the pleasure the Romans take in this bloody spectacle, should that pleasure, which is a base, kind of corrupt, degrading pleasure, should that even be valorized or weighed in deciding what the general welfare is?

So here are the objections to Bentham's utilitarianism, and now we turn to someone who tried to respond to those objections, a later-day utilitarian: John Stuart Mill. So what we need to examine now is whether John Stuart Mill had a convincing reply to these objections to utilitarianism.

John Stuart Mill was born in 1806. His father, James Mill, was a disciple of Bentham's, and James Mill set about giving his son, John Stuart Mill, a model education. He was a child prodigy, John Stuart Mill. He knew Latin, sorry, Greek, at the age of three, Latin at eight, and at age ten he wrote a history of Roman law.
At age twenty he had a nervous breakdown. This left him in a depression for five years, but at age twenty-five, what helped lift him out of this depression is that he met Harriet Taylor. She later married him, they lived happily ever after, and it was under her influence that John Stuart Mill tried to humanize utilitarianism.

What Mill tried to do was to see whether the utilitarian calculus could be enlarged and modified to accommodate humanitarian concerns, like the concern to respect individual rights, and also to address the distinction between higher and lower pleasures. In 1859 Mill wrote a famous book, On Liberty, the main point of which was the importance of defending individual rights and minority rights. And in 1861, toward the end of his life, he wrote the book we read as part of this course, Utilitarianism.

It makes it clear that utility is the only standard of morality, in his view, so he's not challenging Bentham's premise; he's affirming it. He says very explicitly: the sole evidence it is possible to produce that anything is desirable is that people actually do desire it. So he stays with the idea that our de facto, actual, empirical desires are the only basis for moral judgment.

But then, on page eight, also in chapter two, he argues that it is possible for a utilitarian to distinguish higher from lower pleasures. Now, those of you who've read Mill already: how, according to him, is it possible to draw that distinction? How can a utilitarian distinguish qualitatively higher pleasures from lesser ones, base ones, unworthy ones?

If you've tried both of them, you'll prefer the higher one, naturally, always.

That's great, that's right. What's your name? John.
So, as John points out, Mill says here's the test. Since we can't step outside actual desires, actual preferences, that would violate utilitarian premises, the only test of whether a pleasure is higher or lower is whether someone who has experienced both would prefer it. And here, in chapter two, we see the passage where Mill makes the point that John just described: of two pleasures, if there be one to which all or almost all who have experience of both give a decided preference, irrespective of any feeling of moral obligation to prefer it, in other words no outside, no independent standard, then that is the more desirable pleasure.

What do people think about that argument? Does it succeed? How many think that it does succeed, in arguing within utilitarian terms for a distinction between higher and lower pleasures? How many think it doesn't succeed? I want to hear your reasons.

But before we give the reasons, let's do an experiment, a test of Mill's claim. In order to do this experiment we're going to look at three short excerpts of popular entertainment. The first one is a Hamlet soliloquy. It'll be followed by two other experiences. See what you think.

'What a piece of work is a man! How noble in reason, how infinite in faculties, in form and moving how express and admirable, in action how like an angel, in apprehension how like a god: the beauty of the world, the paragon of animals. And yet, to me, what is this quintessence of dust? Man delights not me.'

Imagine a world where your greatest fears become reality. Each show, six contestants from around the country battle each other in three extreme stunts.
These stunts are designed to challenge these contestants both physically and mentally. Six contestants, three stunts, one winner. Fear Factor.

The Simpsons. Well, hi-diddly-ho, pedal-to-the-metal-philes! Flanders, since when do you like anything cool? Well, I don't care for the speed, but I can't get enough of that safety gear: helmets, roll bars, caution flags. I like the fresh air and looking at the poor people in the infield. Dang, Cletus, why you got to park by my parents? Now, honey, it's my parents too.

I don't even have to ask which one you liked most. The Simpsons? How many liked The Simpsons most? How many Shakespeare? What about Fear Factor? How many preferred Fear Factor? Really? People overwhelmingly like The Simpsons better than Shakespeare.

All right, now let's take the other part of the poll, which is: the highest experience or pleasure? How many say Shakespeare? How many say Fear Factor? No, you can't be serious. Really? All right, go ahead, you can say it.

I found that one the most entertaining.

I know, but which do you think was the worthiest, the noblest experience? I know you found it the most entertaining.

If something is good just because it is pleasurable, what does it matter whether you have some kind of abstract idea of whether it is good by someone else's sense or not?

All right, so you come down on the straight Benthamite side: who's to judge, and why should we judge, apart from just registering and aggregating de facto preferences? All right, fair enough. What's your name? Nate. Okay, fair enough.

All right, so how many think that The Simpsons is actually, apart from the liking of it, the higher experience, higher than Shakespeare? All right, let's see the vote for Shakespeare again; how many think Shakespeare is higher?
All right, so why is it? Ideally I'd like to hear from someone who thinks Shakespeare is highest but who preferred watching The Simpsons.

Like, I guess, just sitting and watching The Simpsons, it's entertaining because they make jokes, they make us laugh, but someone has to tell us that Shakespeare was this great writer. We had to be taught how to read him, how to understand him; we had to be taught how to take in Rembrandt, how to analyze a painting.

Well, how do, what's your name? Aneesha. Aneesha, when you say someone told you that Shakespeare's better, are you accepting it on blind faith? You voted that Shakespeare's higher only because the culture tells you that, or your teachers tell you that, or do you actually agree with that yourself?

Well, in the sense that Shakespeare, no, but earlier you made an example of Rembrandt. I feel like I would enjoy reading a comic book more than I would enjoy kind of analyzing Rembrandt because someone told me it was great, you know.

Right, so some of this seems to be, you're suggesting, a kind of cultural convention and pressure. We're told what books, what works of art, are great. Who else?

Although I enjoyed watching The Simpsons more in this particular moment, in Justice, if I were to spend the rest of my life considering the three different video clips shown, I would not want to spend that remainder of my life considering the latter two clips. I think I would derive more pleasure from being able to branch out in my own mind, sort of considering more deep pleasures, more deep thoughts.

And tell me your name. Joe.
879 00:44:49,630 --> 00:44:52,559 Joe, so if you had to spend the rest of your life on 880 00:44:52,559 --> 00:44:55,409 on a farm in Kansas with only 881 00:44:55,409 --> 00:44:57,489 with only Shakespeare 882 00:44:57,489 --> 00:45:02,079 or the collected episodes of the Simpsons 883 00:45:02,079 --> 00:45:04,119 you would prefer 884 00:45:04,119 --> 00:45:06,599 Shakespeare 885 00:45:06,599 --> 00:45:09,849 what do you conclude from that 886 00:45:09,849 --> 00:45:12,149 about John Stuart Mill's test 887 00:45:12,149 --> 00:45:15,078 that the test of a higher pleasure 888 00:45:15,079 --> 00:45:16,359 is whether 889 00:45:16,359 --> 00:45:18,369 people who have experienced 890 00:45:18,369 --> 00:45:21,509 both prefer it? 891 00:45:21,509 --> 00:45:23,880 can I cite another example briefly? 892 00:45:23,880 --> 00:45:25,130 in biology 893 00:45:25,130 --> 00:45:28,769 in neurobiology last year we were told of a rat that was tested on 894 00:45:28,769 --> 00:45:31,209 a particular center in the brain 895 00:45:31,209 --> 00:45:35,879 where the rat was able to stimulate its brain and cause itself intense pleasure repeatedly 896 00:45:35,879 --> 00:45:38,368 the rat did not eat or drink until it died 897 00:45:38,369 --> 00:45:42,179 so the rat was clearly experiencing intense pleasure 898 00:45:42,179 --> 00:45:46,269 now if you asked me right now if I'd rather experience intense pleasure 899 00:45:46,269 --> 00:45:47,140 or have 900 00:45:47,140 --> 00:45:52,690 a full lifetime of higher pleasure, I would consider intense pleasure to be lower pleasure. Would I right 901 00:45:52,690 --> 00:45:55,799 now enjoy intense pleasure? 902 00:45:55,799 --> 00:46:01,969 yes, I would 903 00:46:01,969 --> 00:46:03,159 but over a lifetime I think 904 00:46:03,159 --> 00:46:04,360 I would think 905 00:46:04,360 --> 00:46:06,899 almost a complete majority here would agree 906 00:46:06,899 --> 00:46:11,630 that they would rather be a human with higher pleasure than be that rat 907 00:46:11,630 --> 00:46:13,269 with intense pleasure 908 00:46:13,269 --> 00:46:14,930 for a momentary period of time 909 00:46:14,930 --> 00:46:15,870 so now 910 00:46:15,870 --> 00:46:18,959 in answer to your question, right, I think 911 00:46:18,959 --> 00:46:21,348 this proves that, or I won't say proves 912 00:46:21,349 --> 00:46:24,999 I think the conclusion 913 00:46:24,999 --> 00:46:28,649 is that Mill's theory is that when a majority of people are asked 914 00:46:28,650 --> 00:46:31,439 what they would rather do, 915 00:46:31,439 --> 00:46:33,118 they will answer 916 00:46:33,119 --> 00:46:34,689 that they would rather 917 00:46:34,689 --> 00:46:39,499 engage in a higher pleasure. So you think that this supports Mill, that Mill was on to something here 918 00:46:39,499 --> 00:46:40,738 I do. 919 00:46:40,739 --> 00:46:42,629 all right, is there anyone 920 00:46:42,629 --> 00:46:46,839 who disagrees with Joe, who thinks that our experiment 921 00:46:46,839 --> 00:46:48,578 disproves 922 00:46:48,579 --> 00:46:49,959 Mill's 923 00:46:49,959 --> 00:46:51,098 test 924 00:46:51,099 --> 00:46:53,239 shows that that's not an adequate way 925 00:46:53,239 --> 00:46:57,689 that you can't distinguish higher pleasures within the utilitarian 926 00:46:57,689 --> 00:47:02,649 framework?
927 00:47:05,879 --> 00:47:09,509 If whatever is good is truly just whatever people prefer, it's truly relative and there's 928 00:47:09,509 --> 00:47:11,519 no objective definition, then 929 00:47:11,519 --> 00:47:14,888 there will be some society where people prefer the Simpsons 930 00:47:14,889 --> 00:47:15,849 more 931 00:47:15,849 --> 00:47:21,380 anyone can appreciate the Simpsons, but I think it does take education to appreciate Shakespeare 932 00:47:21,380 --> 00:47:25,660 Alright, you're saying it takes education to appreciate the higher, 933 00:47:25,660 --> 00:47:27,319 truer things 934 00:47:27,319 --> 00:47:29,719 Mill's point is 935 00:47:29,719 --> 00:47:32,890 that the higher pleasures do require 936 00:47:32,890 --> 00:47:35,089 cultivation and appreciation and education 937 00:47:35,090 --> 00:47:37,799 he doesn't dispute that 938 00:47:37,799 --> 00:47:38,650 but 939 00:47:38,650 --> 00:47:41,660 once having been cultivated 940 00:47:41,660 --> 00:47:44,029 and educated 941 00:47:44,029 --> 00:47:45,509 people will 942 00:47:45,509 --> 00:47:48,270 not only see the difference between higher and lower 943 00:47:48,270 --> 00:47:49,009 pleasures 944 00:47:49,009 --> 00:47:51,539 but will actually 945 00:47:51,539 --> 00:47:52,769 prefer 946 00:47:52,769 --> 00:47:53,970 the higher 947 00:47:53,970 --> 00:47:55,848 to the lower. 948 00:47:55,849 --> 00:47:59,519 you find this famous passage from John Stuart Mill: 949 00:47:59,519 --> 00:48:00,808 it is better 950 00:48:00,809 --> 00:48:03,869 to be a human being dissatisfied 951 00:48:03,869 --> 00:48:06,109 than a pig satisfied. 952 00:48:06,109 --> 00:48:10,538 Better to be Socrates dissatisfied than a fool satisfied 953 00:48:10,539 --> 00:48:12,229 and if the fool 954 00:48:12,229 --> 00:48:13,439 or the pig 955 00:48:13,439 --> 00:48:15,538 are of a different opinion 956 00:48:15,539 --> 00:48:18,209 it is because they only know 957 00:48:18,209 --> 00:48:20,700 their side of the question. 958 00:48:20,700 --> 00:48:22,339 so here you have 959 00:48:22,339 --> 00:48:23,159 an attempt 960 00:48:23,159 --> 00:48:24,609 to distinguish 961 00:48:24,609 --> 00:48:27,199 higher from lower 962 00:48:27,199 --> 00:48:28,719 pleasures 963 00:48:28,719 --> 00:48:33,169 so going to an art museum or being a couch potato, swilling beer, watching television 964 00:48:33,169 --> 00:48:35,489 at home 965 00:48:35,489 --> 00:48:37,950 sometimes Mill agrees we might succumb 966 00:48:37,950 --> 00:48:40,839 to the temptation 967 00:48:40,839 --> 00:48:41,839 to do the latter, 968 00:48:41,839 --> 00:48:46,308 to be couch potatoes, 969 00:48:46,309 --> 00:48:47,779 but even when we do that 970 00:48:47,779 --> 00:48:49,609 out of indolence 971 00:48:49,609 --> 00:48:50,680 and sloth, 972 00:48:50,680 --> 00:48:52,200 we know 973 00:48:52,200 --> 00:48:54,029 that the pleasure we get 974 00:48:54,029 --> 00:48:55,670 gazing at Rembrandts 975 00:48:55,670 --> 00:48:56,989 in the museum 976 00:48:56,989 --> 00:49:00,219 is actually higher, 977 00:49:00,219 --> 00:49:03,190 because we've experienced both. 978 00:49:03,190 --> 00:49:05,699 And it is a higher pleasure, 979 00:49:05,699 --> 00:49:06,940 gazing at Rembrandts, 980 00:49:06,940 --> 00:49:11,499 because it engages our higher human faculties 981 00:49:11,499 --> 00:49:13,848 what about Mill's attempt 982 00:49:13,849 --> 00:49:18,890 to reply to the objection about individual rights?
983 00:49:18,890 --> 00:49:21,859 In a way he uses the same 984 00:49:21,859 --> 00:49:25,319 kind of argument 985 00:49:25,319 --> 00:49:27,859 and this comes out in chapter five 986 00:49:27,859 --> 00:49:33,369 he says, while I dispute the pretensions of any theory which sets up an imaginary standard 987 00:49:33,369 --> 00:49:34,889 of justice 988 00:49:34,889 --> 00:49:39,719 not grounded on utility, 989 00:49:39,719 --> 00:49:41,389 but still 990 00:49:41,389 --> 00:49:43,429 he considers 991 00:49:43,429 --> 00:49:44,599 justice 992 00:49:44,599 --> 00:49:48,549 grounded on utility to be what he calls the chief part 993 00:49:48,549 --> 00:49:52,679 and incomparably the most sacred and binding part 994 00:49:52,679 --> 00:49:54,599 of all morality. 995 00:49:54,599 --> 00:49:57,249 so justice is higher 996 00:49:57,249 --> 00:50:00,049 individual rights are privileged 997 00:50:00,049 --> 00:50:01,899 but not for 998 00:50:01,899 --> 00:50:05,310 reasons that depart from utilitarian assumptions. 999 00:50:05,310 --> 00:50:06,769 Justice is a name 1000 00:50:06,769 --> 00:50:09,069 for certain moral requirements 1001 00:50:09,069 --> 00:50:11,220 which, regarded collectively 1002 00:50:11,220 --> 00:50:14,519 stand higher in the scale of social utility 1003 00:50:14,519 --> 00:50:17,109 and are therefore 1004 00:50:17,109 --> 00:50:19,209 of more 1005 00:50:19,209 --> 00:50:20,678 paramount obligation 1006 00:50:20,679 --> 00:50:23,200 than any others 1007 00:50:23,200 --> 00:50:28,919 so justice is sacred, it's prior, it's privileged, it isn't something that can easily be traded 1008 00:50:28,920 --> 00:50:30,890 off against lesser things 1009 00:50:30,890 --> 00:50:32,239 but the reason 1010 00:50:32,239 --> 00:50:33,619 is ultimately 1011 00:50:33,619 --> 00:50:35,799 Mill claims 1012 00:50:35,799 --> 00:50:37,929 a utilitarian reason 1013 00:50:37,929 --> 00:50:39,239 once you consider 1014 00:50:39,239 --> 00:50:41,199 the long-run interests 1015 00:50:41,199 --> 00:50:43,539 of humankind, 1016 00:50:43,539 --> 00:50:44,670 of all of us, 1017 00:50:44,670 --> 00:50:46,329 as progressive 1018 00:50:46,329 --> 00:50:47,699 beings. 1019 00:50:47,699 --> 00:50:51,279 If we do justice and if we respect rights 1020 00:50:51,279 --> 00:50:52,609 society as a whole 1021 00:50:52,609 --> 00:50:55,949 will be better off in the long run. 1022 00:50:55,949 --> 00:50:57,999 Well is that convincing? 1023 00:50:57,999 --> 00:50:59,158 Or 1024 00:50:59,159 --> 00:51:04,549 is Mill actually, without admitting it, stepping outside 1025 00:51:04,549 --> 00:51:06,219 utilitarian considerations 1026 00:51:06,219 --> 00:51:07,789 in arguing 1027 00:51:07,789 --> 00:51:11,129 for qualitatively higher 1028 00:51:11,129 --> 00:51:12,788 pleasures 1029 00:51:12,789 --> 00:51:14,479 and for sacred 1030 00:51:14,479 --> 00:51:16,948 or specially important 1031 00:51:16,949 --> 00:51:18,479 individual rights?
1032 00:51:18,479 --> 00:51:21,718 we haven't fully answered that question 1033 00:51:21,719 --> 00:51:23,809 because to answer that question 1034 00:51:23,809 --> 00:51:26,259 in the case of rights and justice 1035 00:51:26,259 --> 00:51:28,869 will require that we explore 1036 00:51:28,869 --> 00:51:30,419 other ways, 1037 00:51:30,419 --> 00:51:33,368 non-utilitarian ways 1038 00:51:33,369 --> 00:51:35,030 of accounting for the basis 1039 00:51:35,030 --> 00:51:36,479 of rights 1040 00:51:36,479 --> 00:51:38,269 and then asking 1041 00:51:38,269 --> 00:51:40,408 whether they succeed 1042 00:51:40,409 --> 00:51:42,949 as for Jeremy Bentham, 1043 00:51:42,949 --> 00:51:44,969 who launched 1044 00:51:44,969 --> 00:51:46,099 utilitarianism 1045 00:51:46,099 --> 00:51:47,419 as a doctrine 1046 00:51:47,419 --> 00:51:49,979 in moral and legal philosophy 1047 00:51:49,979 --> 00:51:53,819 Bentham died in 1832 at the age of eighty-five 1048 00:51:53,819 --> 00:51:57,509 but if you go to London you can visit him today 1049 00:51:57,509 --> 00:51:58,729 literally. 1050 00:51:58,729 --> 00:52:01,439 he provided in his will 1051 00:52:01,440 --> 00:52:03,390 that his body be preserved, 1052 00:52:03,390 --> 00:52:05,489 embalmed and displayed 1053 00:52:05,489 --> 00:52:07,739 in the University of London 1054 00:52:07,739 --> 00:52:11,069 where he still presides in a glass case 1055 00:52:11,069 --> 00:52:12,950 with a wax head 1056 00:52:12,950 --> 00:52:14,989 dressed in his actual clothing. 1057 00:52:14,989 --> 00:52:17,049 you see before he died, 1058 00:52:17,049 --> 00:52:22,339 Bentham addressed himself to a question consistent with his philosophy, 1059 00:52:22,339 --> 00:52:23,499 of what use 1060 00:52:23,499 --> 00:52:26,839 could a dead man be to the living 1061 00:52:26,839 --> 00:52:30,499 one use, he said, would be to make one's corpse available 1062 00:52:30,499 --> 00:52:33,519 for the study of anatomy 1063 00:52:33,519 --> 00:52:37,129 in the case of great philosophers, however, 1064 00:52:37,129 --> 00:52:38,159 better yet 1065 00:52:38,159 --> 00:52:44,919 to preserve one's physical presence in order to inspire future generations of thinkers. 1066 00:52:44,919 --> 00:52:47,618 You want to see what Bentham looks like stuffed? 1067 00:52:47,619 --> 00:52:50,410 Here's what he looks like 1068 00:52:50,410 --> 00:52:53,549 There he is 1069 00:52:53,549 --> 00:52:55,459 now, if you look closely 1070 00:52:55,459 --> 00:52:57,459 you'll notice 1071 00:52:57,459 --> 00:52:58,529 that 1072 00:52:58,529 --> 00:53:05,529 the embalming of his actual head was not a success, so they substituted a wax head 1073 00:53:06,910 --> 00:53:10,009 and at the bottom, for verisimilitude, 1074 00:53:10,009 --> 00:53:13,390 you can actually see his actual head 1075 00:53:13,390 --> 00:53:14,979 on a plate 1076 00:53:16,529 --> 00:53:17,809 you see it? 1077 00:53:17,809 --> 00:53:22,599 right there 1078 00:53:22,599 --> 00:53:25,709 so, what's the moral of the story? 1079 00:53:25,709 --> 00:53:29,499 the moral of the story 1080 00:53:29,499 --> 00:53:33,698 by the way, they bring him out during meetings of the board at University College London 1081 00:53:33,699 --> 00:53:40,659 and the minutes record him as present but not voting. 1082 00:53:40,659 --> 00:53:42,539 here is a philosopher 1083 00:53:42,539 --> 00:53:45,109 in life and in death 1084 00:53:45,109 --> 00:53:46,769 who adhered 1085 00:53:46,769 --> 00:53:48,399 to the principles 1086 00:53:48,400 --> 00:53:55,400 of his philosophy.
we'll continue with rights next time. 1087 00:53:57,439 --> 00:54:00,808 Don't miss the chance to interact online with other viewers of Justice 1088 00:54:00,809 --> 00:54:03,369 join the conversation, take a pop quiz, 1089 00:54:03,369 --> 00:54:07,799 watch lectures you've missed, and a lot more. Visit Justiceharvard.org 1090 00:54:07,799 --> 00:54:14,799 It's the right thing to do. 1091 00:54:49,739 --> 00:54:53,789 funding for this program is provided by 1092 00:54:53,789 --> 00:54:55,189 additional funding provided by