(intro music)

My name is Laurie Santos. I teach psychology at Yale University, and today I want to talk to you about reference dependence and loss aversion. This lecture is part of a series on cognitive biases.

Imagine that you're a doctor heading a medical team that's trying to fight a new strain of deadly flu, one that's currently spreading at an alarming rate. The new flu is so devastating that six hundred million people have already been infected, and if nothing is done, all of them will die. The good news is that there are two drugs available to treat the disease, and your team can decide which one to put into mass production.

Clinical trials show that if you go with the first drug, drug A, you'll be able to save two hundred million of the infected people. The second option is drug B, which has a one-third chance of saving all six hundred million people, but a two-thirds chance that no one infected will be saved. Which drug do you pick?

You probably thought drug A was the best one. After all, with drug A, two hundred million people will be saved for sure, which is a pretty good outcome. But now imagine that your team is faced with a slightly different choice. This time, it's between drug C and drug D. If you choose drug C, four hundred million infected people will die for sure. If you choose drug D, there's a one-third chance that no one infected will die, and a two-thirds chance that six hundred million infected people will die. Which drug do you choose in this case?

I bet you probably went with drug D. After all, a chance that no one will die seems like a pretty good bet. If you picked drug A in the first scenario and drug D in the second, you're not alone. When behavioral economists Danny Kahneman and Amos Tversky gave these scenarios to college students, seventy-two percent of people said that drug A was better than B, and seventy-eight percent of people said that drug D was better than C.

But let's take a slightly different look at both sets of outcomes. In fact, let's depict both choices in terms of the number of people who will live and die. Here's your first choice. Drug A will save two hundred million people for sure, and for drug B, there's a one-third chance that all six hundred million infected people will be saved and a two-thirds chance that no one infected will be saved. And now, let's do the same thing for drugs C and D.
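(In the video, the four options are presumably laid out on a slide at this point. As a stand-in, here is a minimal Python sketch, with names and structure of my own choosing, that tabulates each drug's outcomes and its expected number of lives saved; the probabilities and totals come from the scenarios described above.)

```python
# Each drug is a list of (probability, lives_saved) outcomes,
# out of 600 million infected people.
TOTAL = 600_000_000

drugs = {
    "A": [(1.0, 200_000_000)],        # 200M saved for sure
    "B": [(1/3, TOTAL), (2/3, 0)],    # 1/3: all saved; 2/3: none saved
    "C": [(1.0, 200_000_000)],        # "400M die for sure" = 200M saved
    "D": [(1/3, TOTAL), (2/3, 0)],    # "1/3: no one dies" = 1/3: all saved
}

for name, outcomes in drugs.items():
    expected_saved = sum(p * saved for p, saved in outcomes)
    print(f"Drug {name}: expected saved = {expected_saved:>13,.0f}, "
          f"expected lost = {TOTAL - expected_saved:>13,.0f}")
```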
Surprisingly, you can now see that the two options are identical. Drugs A and C will both save two hundred million people, while four hundred million people are certain to die. And with both drug B and drug D, you have a one-third chance of saving all six hundred million people and a two-thirds chance of saving no one.

We can argue about whether it's better to save two hundred million people for sure, or to take a one-third chance of saving all of them. But one thing should be clear from the example: it's pretty weird for you to prefer drug A over B at the same time as you prefer drug D over C. After all, they're exactly the same drugs with slightly different labels. Why does a simple change in wording change our judgments about exactly the same options?

Kahneman and Tversky figured out that this strange effect results from two classic biases that affect human choice, biases known as "reference dependence" and "loss aversion." Reference dependence just refers to the fact that we think about our decisions not in terms of absolutes, but relative to some status quo or baseline. This is why, when you find a dollar on the ground, you don't think about that dollar as part of your entire net worth. Instead, you think in terms of the change that the dollar made to your status quo. You think, "Hey, I'm one dollar richer!" Because of reference dependence, you don't think of the options presented earlier in terms of the absolute number of lives saved. Instead, you frame each choice relative to some status quo. And that's why the wording matters.

The first scenario is described in terms of the number of lives saved. That's your reference point. You're thinking in terms of how many additional lives you can save. And in the second, you think relative to how many fewer lives you can lose. And that second part, worrying about losing lives, leads to the second bias that's affecting your choices: loss aversion.

Loss aversion is our reluctance to make choices that lead to losses. We don't like losing stuff, whether it's money, or lives, or even candy. We have an instinct to avoid potential losses at all costs. Economists have found that loss aversion causes us to do a bunch of irrational stuff. Loss aversion causes people to hold onto property that's losing value in the housing market, just because they don't want to sell their assets at a loss.
Loss aversion also leads people to invest poorly, even avoiding risky stocks that will do well overall, because we're afraid of a small probability of losses. Loss aversion causes us to latch onto the fact that drugs C and D involve losing lives. Our aversion to any potential losses causes us to avoid drug C and to go with drug D, which offers a chance of not losing anyone. Our loss aversion isn't as activated when we hear about drugs A and B. Both of them involve saving people, so why not go with the safe option, drug A over drug B?

Merely describing the outcomes differently changes which scenarios we find more aversive. If losses are mentioned, we want to reduce them as much as possible, so much so that we take on a bit more risk than we usually like. So describing the decision one way, as opposed to another, can cause us to make a completely different choice. Even in a life-or-death decision like this, we're at the mercy of how our minds interpret information. And how our minds interpret information is at the mercy of our cognitive biases.
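(To make the two biases concrete: Kahneman and Tversky later formalized these ideas in prospect theory, with a value function that is measured relative to a reference point, is concave for gains, convex for losses, and steeper for losses than for gains. The functional form and parameter values below are commonly cited estimates, not anything given in the lecture, so treat this as an assumption-laden illustration rather than the lecture's own model.)

```python
ALPHA = 0.88   # diminishing sensitivity (curvature of the value function)
LAMBDA = 2.25  # loss aversion: losses loom roughly twice as large as gains

def value(x):
    """Subjective value of a change of x lives relative to the reference point."""
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

def prospect_value(gamble):
    """Probability-weighted subjective value of [(probability, change), ...]."""
    return sum(p * value(x) for p, x in gamble)

M = 1_000_000
# Gain frame (drugs A and B): reference point = all 600M die; changes = lives saved.
A = [(1.0, 200 * M)]
B = [(1/3, 600 * M), (2/3, 0)]
# Loss frame (drugs C and D): reference point = no one dies; changes = lives lost.
C = [(1.0, -400 * M)]
D = [(1/3, 0), (2/3, -600 * M)]

print("prefer A over B:", prospect_value(A) > prospect_value(B))  # True
print("prefer D over C:", prospect_value(D) > prospect_value(C))  # True
```

In this sketch, the reference point decides whether the same outcomes count as gains or losses, and the curvature makes us risk-averse over gains but risk-seeking over losses, which is enough to reproduce the A-over-B and D-over-C pattern; the steeper loss branch (LAMBDA) captures the extra sting of losing, which matters whenever a choice mixes gains and losses.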