WEBVTT

00:00:00.000 --> 00:00:05.944
(intro music)

00:00:05.944 --> 00:00:07.274
My name is Laurie Santos.

00:00:07.274 --> 00:00:11.389
I teach psychology at Yale University, and today I want to talk to you about

00:00:11.389 --> 00:00:13.539
reference dependence and loss aversion.

00:00:13.539 --> 00:00:17.069
This lecture is part of a series on cognitive biases.

00:00:17.069 --> 00:00:21.430
Imagine that you're a doctor heading a medical team that's trying to fight a new

00:00:21.430 --> 00:00:25.720
strain of deadly flu, one that's currently spreading at an alarming rate.

00:00:25.720 --> 00:00:30.388
The new flu is so devastating that six hundred million people have already

00:00:30.388 --> 00:00:34.230
been infected, and if nothing is done, all of them will die.

00:00:34.230 --> 00:00:38.540
The good news is there are two drugs available to treat the disease, and your

00:00:38.540 --> 00:00:42.140
team can decide which one to put into mass production.

00:00:42.140 --> 00:00:46.959
Clinical trials show that if you go with the first drug, drug A, you'll be able to

00:00:46.959 --> 00:00:49.879
save two hundred million of the infected people.

00:00:49.879 --> 00:00:52.489
The second option is drug B, which has a

00:00:52.489 --> 00:00:56.379
one-third chance of saving all six hundred million people, but a two-thirds

00:00:56.379 --> 00:00:58.530
chance that no one infected will be saved.

00:00:58.530 --> 00:01:00.430
Which drug do you pick?

00:01:00.430 --> 00:01:03.870
You probably thought drug A was the best one.

00:01:03.870 --> 00:01:07.850
After all, with drug A, two hundred million people will be saved for sure,

00:01:07.850 --> 00:01:09.550
which is a pretty good outcome.

00:01:09.550 --> 00:01:13.090
But now imagine that your team is faced with a slightly different choice.

00:01:13.090 --> 00:01:16.160
This time, it's between drug C and drug D.

00:01:16.160 --> 00:01:19.750
If you choose drug C, four hundred million infected

00:01:19.750 --> 00:01:21.250
people will die for sure.

00:01:21.250 --> 00:01:24.460
If you choose drug D, there's a one-third chance

00:01:24.460 --> 00:01:28.490
that no one infected will die, and a two-thirds chance that six hundred million

00:01:28.490 --> 00:01:29.929
infected people will die.

00:01:29.929 --> 00:01:32.399
Which drug do you choose in this case?

00:01:32.399 --> 00:01:34.919
I bet you probably went with drug D.

00:01:34.919 --> 00:01:38.560
After all, a chance that no one will die seems like a pretty good bet.

00:01:38.560 --> 00:01:41.100
If you picked drug A in the first scenario

00:01:41.100 --> 00:01:43.690
and drug D in the second, you're not alone.

00:01:43.700 --> 00:01:45.720
When behavioral economists Danny Kahneman

00:01:45.720 --> 00:01:48.450
and Amos Tversky gave these scenarios to college students,

00:01:48.450 --> 00:01:52.260
seventy-two percent of people said that drug A was better than B,

00:01:52.260 --> 00:01:56.290
and seventy-eight percent of people said that drug D was better than C.

00:01:56.290 --> 00:02:00.570
But let's take a slightly different look at both sets of outcomes.

00:02:00.570 --> 00:02:03.680
In fact, let's depict both choices in

00:02:03.680 --> 00:02:06.619
terms of the number of people who will live and die.

00:02:06.619 --> 00:02:08.900
Here's your first choice.

00:02:08.900 --> 00:02:13.230
Drug A will save two hundred million people for sure, and for drug B, there's a

00:02:13.230 --> 00:02:17.400
one-third chance that all six hundred million infected people will be saved and a

00:02:17.400 --> 00:02:20.160
two-thirds chance that no one infected will be saved.

00:02:20.160 --> 00:02:24.040
And now, let's do the same thing for drugs C and D.

00:02:24.040 --> 00:02:28.620
Surprisingly, you can now see that the two options are identical.

00:02:28.620 --> 00:02:32.690
Drugs A and C will save two hundred million people, while four hundred million

00:02:32.690 --> 00:02:34.100
people are certain to die.

00:02:34.100 --> 00:02:38.220
And with both drug B and drug D, you have a one-third chance of saving all

00:02:38.220 --> 00:02:41.779
six hundred million people and a two-thirds chance of saving no one.

00:02:41.779 --> 00:02:46.459
We can argue about whether it's better to save two hundred million people for sure,

00:02:46.459 --> 00:02:49.159
or to take a one-third chance of saving all of them.

00:02:49.159 --> 00:02:50.629
But one thing should be clear from

00:02:50.629 --> 00:02:56.349
the example: it's pretty weird for you to prefer drug A over B at the same time as

00:02:56.349 --> 00:02:58.070
you prefer drug D over C.

00:02:58.070 --> 00:03:02.610
After all, they're exactly the same drugs with slightly different labels.

00:03:02.610 --> 00:03:05.129
Why does a simple change in wording change our

00:03:05.129 --> 00:03:07.840
judgments about exactly the same options?

00:03:07.840 --> 00:03:12.970
Kahneman and Tversky figured out that this strange effect results from two classic

00:03:12.970 --> 00:03:15.000
biases that affect human choice,

00:03:15.000 --> 00:03:18.290
biases known as "reference dependence" and "loss aversion."

00:03:18.290 --> 00:03:22.440
"Reference dependence" just refers to the fact that we think about our decisions

00:03:22.440 --> 00:03:26.840
not in terms of absolutes, but relative to some status quo or baseline.

00:03:26.840 --> 00:03:29.360
This is why, when you find a dollar on the ground,

00:03:29.360 --> 00:03:32.889
you don't think about that dollar as part of your entire net worth.

00:03:32.889 --> 00:03:35.750
Instead, you think in terms of the change that the dollar

00:03:35.750 --> 00:03:37.190
made to your status quo.

00:03:37.190 --> 00:03:39.590
You think, "Hey, I'm one dollar richer!"

00:03:39.590 --> 00:03:43.840
Because of reference dependence, you don't think of the options presented earlier

00:03:43.840 --> 00:03:46.370
in terms of the absolute number of lives saved.

00:03:46.370 --> 00:03:50.470
Instead, you frame each choice relative to some status quo.

00:03:50.470 --> 00:03:52.600
And that's why the wording matters.

00:03:52.600 --> 00:03:54.889
The first scenario is described in terms

00:03:54.889 --> 00:03:56.479
of the number of lives saved.

00:03:56.479 --> 00:03:58.039
That's your reference point.

00:03:58.039 --> 00:04:02.089
You're thinking in terms of how many additional lives you can save.

00:04:02.089 --> 00:04:03.879
And in the second, you think relative

00:04:03.879 --> 00:04:06.510
to how many fewer lives you can lose.

00:04:06.510 --> 00:04:09.510
And that second part, worrying about losing

00:04:09.510 --> 00:04:15.299
lives, leads to the second bias that's affecting your choices: loss aversion.

00:04:15.299 --> 00:04:19.648
Loss aversion is our reluctance to make choices that lead to losses.

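NOTE
The equivalence described above is just arithmetic, and it can help to see it
worked out. Below is a minimal Python sketch, not part of the spoken lecture:
the probabilities and the six-hundred-million figure come from the scenario
itself, while the variable and function names are our own illustration.

NOTE
from fractions import Fraction  # exact thirds; 1/3 as a float would fail the equality checks
THIRD = Fraction(1, 3)
INFECTED = 600                                 # millions of infected people
def expected_saved(option):
    # option: a list of (probability, lives saved in millions) outcomes
    return sum(p * saved for p, saved in option)
drug_a = [(1, 200)]                            # gain framing: 200M saved for sure
drug_b = [(THIRD, 600), (2 * THIRD, 0)]        # gain framing: the gamble
drug_c = [(1, INFECTED - 400)]                 # loss framing: 400M die for sure
drug_d = [(THIRD, INFECTED - 0), (2 * THIRD, INFECTED - 600)]  # loss framing: the gamble
assert expected_saved(drug_a) == expected_saved(drug_c) == 200
assert expected_saved(drug_b) == expected_saved(drug_d) == 200
print("Both framings describe exactly the same pair of options.")
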
00:04:19.648 --> 00:04:24.109
We don't like losing stuff, whether it's money, or lives, or even candy.

00:04:24.109 --> 00:04:28.130
We have an instinct to avoid potential losses at all costs.

00:04:28.130 --> 00:04:31.700
Economists have found that loss aversion causes us to do

00:04:31.700 --> 00:04:33.479
a bunch of irrational stuff.

00:04:33.479 --> 00:04:37.249
Loss aversion causes people to hold onto property that's losing

00:04:37.249 --> 00:04:39.459
value in the housing market, just because

00:04:39.459 --> 00:04:41.960
they don't want to sell their assets at a loss.

00:04:41.960 --> 00:04:46.930
Loss aversion also leads people to invest poorly, even avoiding risky

00:04:46.930 --> 00:04:49.180
stocks that will do well overall, because

00:04:49.180 --> 00:04:52.730
we're afraid of a small probability of losses.

00:04:52.730 --> 00:04:55.470
Loss aversion causes us to latch onto the

00:04:55.470 --> 00:04:58.990
fact that drugs C and D involve losing lives.

00:04:59.000 --> 00:05:01.910
Our aversion to any potential losses causes

00:05:01.910 --> 00:05:04.840
us to avoid drug C and to go with drug D,

00:05:04.840 --> 00:05:07.980
which offers a chance of not losing anyone.

00:05:07.980 --> 00:05:10.470
Our loss aversion isn't as activated

00:05:10.470 --> 00:05:12.380
when we hear about drugs A and B.

00:05:12.380 --> 00:05:16.690
Both of them involve saving people, so why not go with the safe option,

00:05:16.690 --> 00:05:18.830
drug A over drug B?

00:05:18.830 --> 00:05:21.830
Merely describing the outcomes differently

00:05:21.830 --> 00:05:24.990
changes which scenarios we find more aversive.

00:05:24.990 --> 00:05:27.080
If losses are mentioned, we want to

00:05:27.080 --> 00:05:28.820
reduce them as much as possible,

00:05:28.820 --> 00:05:33.040
so much so that we take on a bit more risk than we usually would.

00:05:33.040 --> 00:05:36.570
So describing the decision one way, as opposed to another,

00:05:36.570 --> 00:05:39.189
can cause us to make a completely different choice.

00:05:39.189 --> 00:05:41.159
Even in a life-or-death decision

00:05:41.159 --> 00:05:45.310
like this, we're at the mercy of how our minds interpret information.

00:05:45.310 --> 00:05:50.240
And how our minds interpret information is at the mercy of our cognitive biases.

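NOTE
The lecture stops at the intuition, but loss aversion has a standard
formalization in Kahneman and Tversky's prospect theory: a value function
that is steeper for losses than for gains. The sketch below is illustrative
and not from the lecture; the parameter values are the median estimates
reported in Tversky and Kahneman (1992).

NOTE
ALPHA = 0.88    # diminishing sensitivity: each extra unit matters a bit less
LAMBDA = 2.25   # losses weighted about 2.25 times as heavily as gains
def value(x):
    # subjective value of a gain or loss x, relative to the reference point
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * (-x) ** ALPHA
print(value(100))   # ~57.5: how good a 100-unit gain feels
print(value(-100))  # ~-129.5: a 100-unit loss hurts more than twice as much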