So a friend of mine was riding in a taxi to the airport the other day, and on the way, she was chatting with the taxi driver, and he said to her, with total sincerity, "I can tell you are a really good person." And when she told me this story later, she said she couldn't believe how good it made her feel, that it meant a lot to her.

Now, that may seem like a strong reaction from my friend to the words of a total stranger, but she's not alone. I'm a social scientist. I study the psychology of good people, and research in my field says many of us care deeply about feeling like a good person and being seen as a good person.

Now, your definition of "good person" and your definition of "good person" and maybe the taxi driver's definition of "good person" -- we may not all have the same definition, but within whatever our definition is, that moral identity is important to many of us.

Now, if somebody challenges it -- like when they question a joke we tell, or a comment that our workforce is homogeneous, or a slippery business expense -- we go into red-zone defensiveness a lot of the time. I mean, sometimes we point to all the ways we help people from marginalized groups, or the money we donate to charity, or the hours we volunteer to nonprofits. We work to protect that good person identity. It's important to many of us.

But what if I told you this? What if I told you that our attachment to being good people is getting in the way of us being better people? What if I told you that our definition of "good person" is so narrow, it's scientifically impossible to meet? And what if I told you the path to being better people just begins with letting go of being a good person?

Now, to explain, let me tell you a little bit about the research on how the human mind works. The brain relies on shortcuts to do a lot of its work. That means a lot of the time, your mental processes are taking place outside of your awareness, like in low-battery, low-power mode in the back of your mind. That's, in fact, the premise of bounded rationality.
Bounded rationality is the Nobel Prize-winning idea that the human mind has limited storage resources and limited processing power, and as a result, it relies on shortcuts to do a lot of its work.

So for example, some scientists estimate that in any given moment ...

Better, better click, right? There we go.

(Laughter)

At any given moment, 11 million pieces of information are coming into your mind. Eleven million. And only 40 of them are being processed consciously. So 11 million, 40.

I mean, has this ever happened to you? Have you ever had a really busy day at work, and you drive home, and when you get in the door, you realize you don't even remember the drive home, like whether you had green lights or red lights? You don't even remember. You were on autopilot. Or have you ever opened the fridge, looked for the butter, swore there was no butter, and then realized the butter was right in front of you the whole time? These are the kinds of "whoops" moments that make us giggle, and this is what happens in a brain that can handle 11 million pieces of information coming in with only 40 being processed consciously. That's the bounded part of bounded rationality.
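Taking those estimates at face value, the consciously processed share works out to

$$\frac{40}{11{,}000{,}000} \approx 3.6 \times 10^{-6},$$

or about 0.0004 percent: roughly one consciously processed piece of information for every 275,000 that come in.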
This work on bounded rationality is what's inspired work I've done with my collaborators Max Bazerman and Mahzarin Banaji on what we call bounded ethicality. So it's the same premise as bounded rationality: that we have a human mind that is bounded in some sort of way and relying on shortcuts, and that those shortcuts can sometimes lead us astray. With bounded rationality, that might affect the cereal we buy in the grocery store, or the product we launch in the boardroom. With bounded ethicality, the human mind, the same human mind, is making decisions, and here, it's about who to hire next, or what joke to tell, or that slippery business decision.

So let me give you an example of bounded ethicality at work. Unconscious bias is one place where we see the effects of bounded ethicality. Unconscious bias refers to associations we have in our mind, the shortcuts your brain is using to organize information, very likely outside of your awareness, not necessarily lining up with your conscious beliefs.

Researchers Nosek, Banaji and Greenwald have looked at data from millions of people, and what they've found is, for example, that most white Americans can more quickly and easily associate white people and good things than black people and good things, and most men and women can more quickly and easily associate men and science than women and science. And these associations don't necessarily line up with what people consciously think. They may have very egalitarian views, in fact. So sometimes, that 11 million and that 40 just don't line up.

And here's another example: conflicts of interest. We tend to underestimate how much a small gift -- imagine a ballpoint pen or dinner -- can affect our decision making. We don't realize that our mind is unconsciously lining up evidence to support the point of view of the gift-giver, no matter how hard we're consciously trying to be objective and professional.

We also see bounded ethicality in the fact that, despite our attachment to being good people, we still make mistakes -- mistakes that sometimes hurt other people, that sometimes promote injustice, despite our best attempts -- and we explain away our mistakes rather than learning from them.

Like, for example, when I got an email from a female student in my class saying that a reading I had assigned, a reading I had been assigning for years, was sexist. Or when I confused two students in my class of the same race -- who look nothing alike -- for each other, more than once, in front of everybody.

These kinds of mistakes send us, send me, into red-zone defensiveness. They leave us fighting for that good person identity.
But the latest work that I've been doing on bounded ethicality with Mary Kern says that we're not only prone to mistakes -- our tendency towards those mistakes depends on how close we are to that red zone.

Most of the time, nobody's challenging our good person identity, and so we're not thinking too much about the ethical implications of our decisions, and our model shows that we're then spiraling towards less and less ethical behavior. On the other hand, somebody might challenge our identity, or, upon reflection, we may be challenging it ourselves, so the ethical implications of our decisions become really salient, and in those cases, we spiral towards more and more good person behavior -- or, to be more precise, towards more and more behavior that makes us feel like a good person, which isn't always the same, of course.

The idea with bounded ethicality is that we are perhaps overestimating the importance our inner compass is playing in our ethical decisions. We perhaps are overestimating how much our self-interest is driving our decisions, and perhaps we don't realize how much our self-view as a good person is affecting our behavior -- that in fact, we're working so hard to protect that good person identity, to keep out of that red zone, that we're not actually giving ourselves space to learn from our mistakes and actually be better people.

It's perhaps because we expect it to be easy. We have this definition of good person that's either-or. Either you are a good person or you're not. Either you have integrity or you don't. Either you are a racist or a sexist or a homophobe, or you're not. And in this either-or definition, there's no room to grow.

And by the way, this is not what we do in most parts of our lives. Like, if you needed to learn accounting, you would take an accounting class, or if you became a parent, you'd pick up a book and read about it. We talk to experts, we learn from our mistakes, we update our knowledge, we just keep getting better.
But when it comes to being a good person, we think it's something we're just supposed to know, we're just supposed to do, without the benefit of effort or growth.

So what I've been thinking about is: what if we were to just forget about being good people, just let it go, and instead set a higher standard -- a higher standard of being a good-ish person?

A good-ish person absolutely still makes mistakes. As a good-ish person, I'm making them all the time. But as a good-ish person, I'm trying to learn from them, own them. I expect them and I go after them. I understand there are costs to these mistakes. When it comes to issues like ethics and bias and diversity and inclusion, there are real costs to real people, and I accept that.

As a good-ish person, in fact, I become better at noticing my own mistakes. I don't wait for people to point them out. I practice finding them, and as a result ... sure, sometimes it can be embarrassing, it can be uncomfortable. We put ourselves in a vulnerable place, sometimes. But through all that vulnerability, just like in everything else we've ever tried to get better at, we see progress. We see growth. We allow ourselves to get better.

Why wouldn't we give ourselves that? In every other part of our lives, we give ourselves room to grow -- except in this one, where it matters most.

Thank you.

(Applause)