I want to start my talk today with two observations about the human species. The first observation is something that you might think is quite obvious, and that's that our species, Homo sapiens, is actually really, really smart -- like, ridiculously smart -- like you're all doing things that no other species on the planet does right now. And this is, of course, not the first time you've probably recognized this. Of course, in addition to being smart, we're also an extremely vain species. So we like pointing out the fact that we're smart. You know, so I could turn to pretty much any sage from Shakespeare to Stephen Colbert to point out things like the fact that we're noble in reason and infinite in faculties and just kind of awesome-er than anything else on the planet when it comes to all things cerebral.

But of course, there's a second observation about the human species that I want to focus on a little bit more, and that's the fact that even though we're actually really smart, sometimes uniquely smart, we can also be incredibly, incredibly dumb when it comes to some aspects of our decision making. Now I'm seeing lots of smirks out there.
Don't worry, I'm not going to call anyone in particular out on any aspects of your own mistakes. But of course, just in the last two years we see these unprecedented examples of human ineptitude. And we've watched as the tools we uniquely make to pull the resources out of our environment kind of just blow up in our face. We've watched the financial markets that we uniquely create -- these markets that were supposed to be foolproof -- we've watched them kind of collapse before our eyes. But both of these two embarrassing examples, I think, don't highlight what I think is most embarrassing about the mistakes that humans make, which is that we'd like to think that the mistakes we make are really just the result of a couple bad apples or a couple really sort of FAIL Blog-worthy decisions.

But it turns out, what social scientists are actually learning is that most of us, when put in certain contexts, will actually make very specific mistakes. The errors we make are actually predictable. We make them again and again. And they're actually immune to lots of evidence.
When we get negative feedback, we still, the next time we're faced with a certain context, tend to make the same errors. And so this has been a real puzzle to me as a sort of scholar of human nature. What I'm most curious about is, how is a species that's as smart as we are capable of such bad and such consistent errors all the time? You know, we're the smartest thing out there, why can't we figure this out? In some sense, where do our mistakes really come from? And having thought about this a little bit, I see a couple different possibilities.

One possibility is, in some sense, it's not really our fault. Because we're a smart species, we can actually create all kinds of environments that are super, super complicated, sometimes too complicated for us to even actually understand, even though we've actually created them. We create financial markets that are super complex. We create mortgage terms that we can't actually deal with. And of course, if we are put in environments where we can't deal with it, in some sense it makes sense that we actually might mess certain things up. If this was the case, we'd have a really easy solution to the problem of human error.
We'd actually just say, okay, let's figure out the kinds of technologies we can't deal with, the kinds of environments that are bad -- get rid of those, design things better, and we should be the noble species that we expect ourselves to be. But there's another possibility that I find a little bit more worrying, which is, maybe it's not our environments that are messed up. Maybe it's actually us that's designed badly. This is a hint that I've gotten from watching the ways that social scientists have learned about human errors. And what we see is that people tend to keep making errors exactly the same way, over and over again. It feels like we might almost just be built to make errors in certain ways.

This is a possibility that I worry a little bit more about, because, if it's us that's messed up, it's not actually clear how we go about dealing with it. We might just have to accept the fact that we're error prone and try to design things around it. So this is the question my students and I wanted to get at. How can we tell the difference between possibility one and possibility two?
What we need is a population that's basically smart, can make lots of decisions, but doesn't have access to any of the systems we have, any of the things that might mess us up -- no human technology, human culture, maybe even not human language. And so this is why we turned to these guys here. This is one of the guys I work with. This is a brown capuchin monkey. These guys are New World primates, which means they broke off from the human branch about 35 million years ago. This means that your great, great, great, great, great, great -- with about five million "greats" in there -- grandmother was probably the same great, great, great, great grandmother with five million "greats" in there as Holly up here. You know, so you can take comfort in the fact that this guy up here is a really, really distant, but albeit evolutionary, relative.

The good news about Holly though is that she doesn't actually have the same kinds of technologies we do. You know, she's a smart, very cute creature, a primate as well, but she lacks all the stuff we think might be messing us up. So she's the perfect test case. What if we put Holly into the same context as humans?
Does she make the same mistakes as us? Does she not learn from them? And so on. And so this is the kind of thing we decided to do. My students and I got very excited about this a few years ago. We said, all right, let's, you know, throw some problems at Holly, see if she messes these things up. First problem is just, well, where should we start? Because, you know, it's great for us, but bad for humans. We make a lot of mistakes in a lot of different contexts. You know, where are we actually going to start with this?

And because we started this work around the time of the financial collapse, around the time when foreclosures were hitting the news, we said, hmm, maybe we should actually start in the financial domain. Maybe we should look at the monkeys' economic decisions and try to see if they do the same kinds of dumb things that we do. Of course, that's when we hit a sort of second problem -- a little bit more methodological -- which is that, maybe you guys don't know, but monkeys don't actually use money. I know, you haven't met them. But this is why, you know, they're not in the queue behind you at the grocery store or the ATM -- you know, they don't do this stuff.
So now we faced, you know, a little bit of a problem here. How are we actually going to ask monkeys about money if they don't actually use it? So we said, well, maybe we should just, actually just suck it up and teach monkeys how to use money. So that's just what we did. What you're looking at over here is actually the first unit that I know of of non-human currency. We weren't very creative at the time we started these studies, so we just called it a token. But this is the unit of currency that we've taught our monkeys at Yale to actually use with humans, to actually buy different pieces of food.

It doesn't look like much -- in fact, it isn't like much. Like most of our money, it's just a piece of metal. As those of you who've taken currencies home from your trip know, once you get home, it's actually pretty useless. It was useless to the monkeys at first before they realized what they could do with it. When we first gave it to them in their enclosures, they actually kind of picked them up, looked at them. They were these kind of weird things.
But very quickly, the monkeys realized that they could actually hand these tokens over to different humans in the lab for some food. And so you see one of our monkeys, Mayday, up here doing this. A and B are kind of the points where she's sort of a little bit curious about these things -- doesn't know. There's this waiting hand from a human experimenter, and Mayday quickly figures out, apparently the human wants this. Hands it over, and then gets some food. It turns out not just Mayday, all of our monkeys get good at trading tokens with human salesmen.

So here's just a quick video of what this looks like. Here's Mayday. She's going to be trading a token for some food and waiting happily and getting her food. Here's Felix, I think. He's our alpha male; he's a kind of big guy. But he too waits patiently, gets his food and goes on. So the monkeys get really good at this. They're surprisingly good at this with very little training. We just allowed them to pick this up on their own. The question is: is this anything like human money?
Is this a market at all, or did we just do a weird psychologist's trick by getting monkeys to do something, looking smart, but not really being smart? And so we said, well, what would the monkeys spontaneously do if this was really their currency, if they were really using it like money? Well, you might actually imagine them to do all the kinds of smart things that humans do when they start exchanging money with each other. You might have them start paying attention to price, paying attention to how much they buy -- sort of keeping track of their monkey token, as it were. Do the monkeys do anything like this?

And so our monkey marketplace was born. The way this works is that our monkeys normally live in a kind of big zoo social enclosure. When they get a hankering for some treats, we actually allowed them a way out into a little smaller enclosure where they could enter the market.
Upon entering the market -- it was actually a much more fun market for the monkeys than most human markets because, as the monkeys entered the door of the market, a human would give them a big wallet full of tokens so they could actually trade the tokens with one of these two guys here -- two different possible human salesmen that they could actually buy stuff from. The salesmen were students from my lab. They dressed differently; they were different people. And over time, they did basically the same thing so the monkeys could learn, you know, who sold what at what price -- you know, who was reliable, who wasn't, and so on. And you can see that each of the experimenters is actually holding up a little, yellow food dish, and that's what the monkey can buy for a single token. So everything costs one token, but as you can see, sometimes tokens buy more than others, sometimes more grapes than others.

So I'll show you a quick video of what this marketplace actually looks like. Here's a monkey-eye-view. Monkeys are shorter, so it's a little short. But here's Honey. She's waiting for the market to open a little impatiently. All of a sudden the market opens.
Here's her choice: one grape or two grapes. You can see Honey, very good market economist, goes with the guy who gives more. She could teach our financial advisers a few things or two. So not just Honey, most of the monkeys went with guys who had more. Most of the monkeys went with guys who had better food. When we introduced sales, we saw the monkeys paid attention to that. They really cared about their monkey token dollar.

The more surprising thing was that when we collaborated with economists to actually look at the monkeys' data using economic tools, they basically matched, not just qualitatively, but quantitatively with what we saw humans doing in a real market. So much so that, if you saw the monkeys' numbers, you couldn't tell whether they came from a monkey or a human in the same market. And what we'd really thought we'd done is like we'd actually introduced something that, at least for the monkeys and us, works like a real financial currency. Question is: do the monkeys start messing up in the same ways we do? Well, we already saw anecdotally a couple of signs that they might.
One thing we never saw in the monkey marketplace was any evidence of saving -- you know, just like our own species. The monkeys entered the market, spent their entire budget and then went back to everyone else. The other thing we also spontaneously saw, embarrassingly enough, is spontaneous evidence of larceny. The monkeys would rip off the tokens at every available opportunity -- from each other, often from us -- you know, things we didn't necessarily think we were introducing, but things we spontaneously saw.

So we said, this looks bad. Can we actually see if the monkeys are doing exactly the same dumb things as humans do? One possibility is just kind of let the monkey financial system play out, you know, see if they start calling us for bailouts in a few years. We were a little impatient so we wanted to sort of speed things up a bit. So we said, let's actually give the monkeys the same kinds of problems that humans tend to get wrong in certain kinds of economic challenges, or certain kinds of economic experiments.
And so, since the best way to see how people go wrong is to actually do it yourself, I'm going to give you guys a quick experiment to sort of watch your own financial intuitions in action. So imagine that right now I handed each and every one of you a thousand U.S. dollars -- so 10 crisp hundred dollar bills. Take these, put it in your wallet and spend a second thinking about what you're going to do with it. Because it's yours now; you can buy whatever you want. Donate it, take it, and so on.

Sounds great, but you get one more choice to earn a little bit more money. And here's your choice: you can either be risky, in which case I'm going to flip one of these monkey tokens. If it comes up heads, you're going to get a thousand dollars more. If it comes up tails, you get nothing. So it's a chance to get more, but it's pretty risky. Your other option is a bit safer. You're just going to get some money for sure. I'm just going to give you 500 bucks. You can stick it in your wallet and use it immediately. So see what your intuition is here. Most people actually go with the play-it-safe option. Most people say, why should I be risky when I can get 1,500 dollars for sure?
This seems like a good bet. I'm going to go with that. You might say, eh, that's not really irrational. People are a little risk-averse. So what? Well, the "so what?" comes when you start thinking about the same problem set up just a little bit differently. So now imagine that I give each and every one of you 2,000 dollars -- 20 crisp hundred dollar bills. Now you can buy double the stuff you were going to get before. Think about how you'd feel sticking it in your wallet.

And now imagine that I have you make another choice. But this time, it's a little bit worse. Now, you're going to be deciding how you're going to lose money, but you're going to get the same choice. You can either take a risky loss -- so I'll flip a coin. If it comes up heads, you're going to actually lose a lot. If it comes up tails, you lose nothing, you're fine, get to keep the whole thing -- or you could play it safe, which means you have to reach back into your wallet and give me five of those $100 bills, for certain. And I'm seeing a lot of furrowed brows out there.
So maybe you're having the same intuitions as the subjects that were actually tested in this, which is when presented with these options, people don't choose to play it safe. They actually tend to go a little risky. The reason this is irrational is that we've given people in both situations the same choice. It's a 50/50 shot of a thousand or 2,000, or just 1,500 dollars with certainty. But people's intuitions about how much risk to take vary depending on where they started.

So what's going on? Well, it turns out that this seems to be the result of at least two biases that we have at the psychological level. One is that we have a really hard time thinking in absolute terms. You really have to do work to figure out, well, one option's a thousand, 2,000; one is 1,500. Instead, we find it very easy to think in very relative terms as options change from one time to another. So we think of things as, "Oh, I'm going to get more," or "Oh, I'm going to get less." This is all well and good, except that changes in different directions actually affect whether or not we think options are good or not.
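The claim that both setups offer the same choice can be checked with a short sketch (the dollar amounts are taken from the two scenarios described above; the `outcomes` helper is just illustrative, not part of the original experiment):

```python
# Verify that the "gain" framing and the "loss" framing described above
# lead to identical final outcomes, so a rational chooser should treat
# them the same way.

def outcomes(start, safe_delta, risky_deltas):
    """Final bankrolls for the safe option and the 50/50 risky option."""
    safe = start + safe_delta
    risky = sorted(start + d for d in risky_deltas)
    return safe, risky

# Gain frame: start with $1,000; safe option +$500, risky +$1,000 or +$0.
gain_safe, gain_risky = outcomes(1000, +500, (+1000, 0))

# Loss frame: start with $2,000; safe option -$500, risky -$1,000 or -$0.
loss_safe, loss_risky = outcomes(2000, -500, (-1000, 0))

print(gain_safe, gain_risky)   # 1500 [1000, 2000]
print(loss_safe, loss_risky)   # 1500 [1000, 2000]
```

Either way, you end up with $1,500 for certain, or a coin flip between $1,000 and $2,000; only the framing of the deltas differs.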
And this leads to the second bias, which economists have called loss aversion. The idea is that we really hate it when things go into the red. We really hate it when we have to lose out on some money. And this means that sometimes we'll actually switch our preferences to avoid this. What you saw in that last scenario is that subjects get risky because they want the small shot that there won't be any loss. That means when we're in a risk mindset -- excuse me, when we're in a loss mindset, we actually become more risky, which can actually be really worrying.

These kinds of things play out in lots of bad ways in humans. They're why stock investors hold onto losing stocks longer -- because they're evaluating them in relative terms. They're why people in the housing market refuse to sell their house -- because they don't want to sell at a loss. The question we were interested in is whether the monkeys show the same biases. If we set up those same scenarios in our little monkey market, would they do the same thing as people?
And so this is what we did: we gave the monkeys choices between guys who were safe -- they did the same thing every time -- or guys who were risky -- they did things differently half the time. And then we gave them options that were bonuses -- like you guys did in the first scenario -- so they actually have a chance of getting more, or pieces where they were experiencing losses -- they actually thought they were going to get more than they really got.

And so this is what this looks like. We introduced the monkeys to two new monkey salesmen. The guy on the left and right both start with one grape, so it looks pretty good. But they're going to give the monkeys bonuses. The guy on the left is a safe bonus. All the time, he adds one, to give the monkeys two. The guy on the right is actually a risky bonus. Sometimes the monkeys get no bonus -- so this is a bonus of zero. Sometimes the monkeys get two extra. For a big bonus, now they get three. But this is the same choice you guys just faced.
Do the monkeys actually want to play it safe and then go with the guy who's going to do the same thing on every trial, or do they want to be risky and try to get a risky, but big, bonus, but risk the possibility of getting no bonus? People here played it safe. Turns out, the monkeys play it safe too. Qualitatively and quantitatively, they choose exactly the same way as people, when tested in the same thing.

You might say, well, maybe the monkeys just don't like risk. Maybe we should see how they do with losses. And so we ran a second version of this. Now, the monkeys meet two guys who aren't giving them bonuses; they're actually giving them less than they expect. So they look like they're starting out with a big amount. These are three grapes; the monkey's really psyched for this. But now they learn these guys are going to give them less than they expect. The guy on the left is a safe loss. Every single time, he's going to take one of these away and give the monkeys just two. The guy on the right is the risky loss.
406 00:15:34,000 --> 00:15:37,000 Sometimes he gives no loss, so the monkeys are really psyched, 407 00:15:37,000 --> 00:15:39,000 but sometimes he actually gives a big loss, 408 00:15:39,000 --> 00:15:41,000 taking away two to give the monkeys only one. 409 00:15:41,000 --> 00:15:43,000 And so what do the monkeys do? 410 00:15:43,000 --> 00:15:45,000 Again, same choice; they can play it safe 411 00:15:45,000 --> 00:15:48,000 for always getting two grapes every single time, 412 00:15:48,000 --> 00:15:51,000 or they can take a risky bet and choose between one and three. 413 00:15:51,000 --> 00:15:54,000 The remarkable thing to us is that, when you give monkeys this choice, 414 00:15:54,000 --> 00:15:56,000 they do the same irrational thing that people do. 415 00:15:56,000 --> 00:15:58,000 They actually become more risky 416 00:15:58,000 --> 00:16:01,000 depending on how the experimenters started. 417 00:16:01,000 --> 00:16:03,000 This is crazy because it suggests that the monkeys too 418 00:16:03,000 --> 00:16:05,000 are evaluating things in relative terms 419 00:16:05,000 --> 00:16:08,000 and actually treating losses differently than they treat gains. 420 00:16:08,000 --> 00:16:10,000 So what does all of this mean? 421 00:16:10,000 --> 00:16:12,000 Well, what we've shown is that, first of all, 422 00:16:12,000 --> 00:16:14,000 we can actually give the monkeys a financial currency, 423 00:16:14,000 --> 00:16:16,000 and they do very similar things with it. 424 00:16:16,000 --> 00:16:18,000 They do some of the smart things we do, 425 00:16:18,000 --> 00:16:20,000 some of the kind of not so nice things we do, 426 00:16:20,000 --> 00:16:22,000 like steal it and so on. 427 00:16:22,000 --> 00:16:24,000 But they also do some of the irrational things we do. 428 00:16:24,000 --> 00:16:26,000 They systematically get things wrong 429 00:16:26,000 --> 00:16:28,000 and in the same ways that we do. 
430 00:16:28,000 --> 00:16:30,000 This is the first take-home message of the talk, 431 00:16:30,000 --> 00:16:32,000 which is that if you saw the beginning of this and you thought, 432 00:16:32,000 --> 00:16:34,000 oh, I'm totally going to go home and hire a capuchin monkey financial adviser. 433 00:16:34,000 --> 00:16:36,000 They're way cuter than the one at ... you know -- 434 00:16:36,000 --> 00:16:38,000 Don't do that; they're probably going to be just as dumb 435 00:16:38,000 --> 00:16:41,000 as the human one you already have. 436 00:16:41,000 --> 00:16:43,000 So, you know, a little bad -- Sorry, sorry, sorry. 437 00:16:43,000 --> 00:16:45,000 A little bad for monkey investors. 438 00:16:45,000 --> 00:16:48,000 But of course, you know, the reason you're laughing is bad for humans too. 439 00:16:48,000 --> 00:16:51,000 Because we've answered the question we started out with. 440 00:16:51,000 --> 00:16:53,000 We wanted to know where these kinds of errors came from. 441 00:16:53,000 --> 00:16:55,000 And we started with the hope that maybe we can 442 00:16:55,000 --> 00:16:57,000 sort of tweak our financial institutions, 443 00:16:57,000 --> 00:17:00,000 tweak our technologies to make ourselves better. 444 00:17:00,000 --> 00:17:03,000 But what we've learned is that these biases might be a deeper part of us than that. 445 00:17:03,000 --> 00:17:05,000 In fact, they might be due to the very nature 446 00:17:05,000 --> 00:17:07,000 of our evolutionary history. 447 00:17:07,000 --> 00:17:09,000 You know, maybe it's not just humans 448 00:17:09,000 --> 00:17:11,000 at the right side of this chain that's duncey. 449 00:17:11,000 --> 00:17:13,000 Maybe it's sort of duncey all the way back. 450 00:17:13,000 --> 00:17:16,000 And this, if we believe the capuchin monkey results, 451 00:17:16,000 --> 00:17:18,000 means that these duncey strategies 452 00:17:18,000 --> 00:17:20,000 might be 35 million years old. 
453 00:17:20,000 --> 00:17:22,000 That's a long time for a strategy 454 00:17:22,000 --> 00:17:25,000 to potentially get changed around -- really, really old. 455 00:17:25,000 --> 00:17:27,000 What do we know about other old strategies like this? 456 00:17:27,000 --> 00:17:30,000 Well, one thing we know is that they tend to be really hard to overcome. 457 00:17:30,000 --> 00:17:32,000 You know, think of our evolutionary predilection 458 00:17:32,000 --> 00:17:35,000 for eating sweet things, fatty things like cheesecake. 459 00:17:35,000 --> 00:17:37,000 You can't just shut that off. 460 00:17:37,000 --> 00:17:40,000 You can't just look at the dessert cart and say, "No, no, no. That looks disgusting to me." 461 00:17:40,000 --> 00:17:42,000 We're just built differently. 462 00:17:42,000 --> 00:17:44,000 We're going to perceive it as a good thing to go after. 463 00:17:44,000 --> 00:17:46,000 My guess is that the same thing is going to be true 464 00:17:46,000 --> 00:17:48,000 when humans are perceiving 465 00:17:48,000 --> 00:17:50,000 different financial decisions. 466 00:17:50,000 --> 00:17:52,000 When you're watching your stocks plummet into the red, 467 00:17:52,000 --> 00:17:54,000 when you're watching your house price go down, 468 00:17:54,000 --> 00:17:56,000 you're not going to be able to see that 469 00:17:56,000 --> 00:17:58,000 in anything but old evolutionary terms. 470 00:17:58,000 --> 00:18:00,000 This means that the biases 471 00:18:00,000 --> 00:18:02,000 that lead investors to do badly, 472 00:18:02,000 --> 00:18:04,000 that lead to the foreclosure crisis 473 00:18:04,000 --> 00:18:06,000 are going to be really hard to overcome. 474 00:18:06,000 --> 00:18:08,000 So that's the bad news. The question is: is there any good news? 475 00:18:08,000 --> 00:18:10,000 I'm supposed to be up here telling you the good news. 
476 00:18:10,000 --> 00:18:12,000 Well, the good news, I think, 477 00:18:12,000 --> 00:18:14,000 is what I started with at the beginning of the talk, 478 00:18:14,000 --> 00:18:16,000 which is that humans are not only smart; 479 00:18:16,000 --> 00:18:18,000 we're really inspirationally smart 480 00:18:18,000 --> 00:18:21,000 compared to the rest of the animals in the biological kingdom. 481 00:18:21,000 --> 00:18:24,000 We're so good at overcoming our biological limitations -- 482 00:18:24,000 --> 00:18:26,000 you know, I flew over here in an airplane. 483 00:18:26,000 --> 00:18:28,000 I didn't have to try to flap my wings. 484 00:18:28,000 --> 00:18:31,000 I'm wearing contact lenses now so that I can see all of you. 485 00:18:31,000 --> 00:18:34,000 I don't have to rely on my own near-sightedness. 486 00:18:34,000 --> 00:18:36,000 We actually have all of these cases 487 00:18:36,000 --> 00:18:39,000 where we overcome our biological limitations 488 00:18:39,000 --> 00:18:42,000 through technology and other means, seemingly pretty easily. 489 00:18:42,000 --> 00:18:45,000 But we have to recognize that we have those limitations. 490 00:18:45,000 --> 00:18:47,000 And here's the rub. 491 00:18:47,000 --> 00:18:49,000 It was Camus who once said, "Man is the only species 492 00:18:49,000 --> 00:18:52,000 who refuses to be what he really is." 493 00:18:52,000 --> 00:18:54,000 But the irony is that 494 00:18:54,000 --> 00:18:56,000 it might only be in recognizing our limitations 495 00:18:56,000 --> 00:18:58,000 that we can really actually overcome them. 496 00:18:58,000 --> 00:19:01,000 The hope is that you all will think about your limitations, 497 00:19:01,000 --> 00:19:04,000 not necessarily as unovercomable, 498 00:19:04,000 --> 00:19:06,000 but to recognize them, accept them 499 00:19:06,000 --> 00:19:09,000 and then use the world of design to actually figure them out. 
500 00:19:09,000 --> 00:19:12,000 That might be the only way that we will really be able 501 00:19:12,000 --> 00:19:14,000 to achieve our own human potential 502 00:19:14,000 --> 00:19:17,000 and really be the noble species we all hope to be. 503 00:19:17,000 --> 00:19:19,000 Thank you. 504 00:19:19,000 --> 00:19:24,000 (Applause)