1
00:00:01,077 --> 00:00:02,596
Some years ago,

2
00:00:02,596 --> 00:00:07,166
I was on an airplane with my son who was just five years old at the time.

3
00:00:08,460 --> 00:00:13,623
My son was so excited about being on this airplane with Mommy.

4
00:00:13,623 --> 00:00:17,489
He's looking all around and he's checking things out

5
00:00:17,489 --> 00:00:19,176
and he's checking people out,

6
00:00:19,176 --> 00:00:21,387
and he sees this man, and he says, "Hey,

7
00:00:21,387 --> 00:00:23,843
that guy looks like Daddy."

8
00:00:24,049 --> 00:00:25,992
And I look at the man,

9
00:00:25,992 --> 00:00:29,683
and he didn't look anything at all like my husband,

10
00:00:29,876 --> 00:00:31,245
nothing at all.

11
00:00:31,668 --> 00:00:34,403
And so then I start looking around on the plane,

12
00:00:34,403 --> 00:00:36,273
and I notice this man

13
00:00:36,273 --> 00:00:39,715
was the only black guy on the plane.

14
00:00:41,145 --> 00:00:42,531
And I thought,

15
00:00:42,531 --> 00:00:44,321
all right.

16
00:00:44,552 --> 00:00:47,651
I'm going to have to have a little talk with my son

17
00:00:47,651 --> 00:00:50,070
about how not all black people look alike.

18
00:00:50,293 --> 00:00:54,694
My son, he lifts his head up, and he says to me,

19
00:00:54,694 --> 00:00:59,203
"I hope he doesn't rob the plane."

20
00:00:59,582 --> 00:01:02,174
And I said, "What? What did you say?"

21
00:01:02,174 --> 00:01:05,979
And he says, "Well, I hope that man doesn't rob the plane."

22
00:01:05,979 --> 00:01:09,839
And I said, "Well, why would you say that?

23
00:01:10,700 --> 00:01:13,386
You know Daddy wouldn't rob a plane."

24
00:01:13,386 --> 00:01:16,449
And he says, "Yeah, yeah, yeah, well, I know."

25
00:01:16,449 --> 00:01:18,809
And I said, "Well, why would you say that?"

26
00:01:18,809 --> 00:01:24,438
And he looked at me with this really sad face,

27
00:01:24,438 --> 00:01:26,743
and he says,

28
00:01:26,743 --> 00:01:29,599
"I don't know why I said that.

29
00:01:29,599 --> 00:01:33,273
I don't know why I was thinking that."

30
00:01:34,010 --> 00:01:37,551
We are living with such severe racial stratification

31
00:01:37,551 --> 00:01:39,776
that even a five-year-old

32
00:01:39,776 --> 00:01:42,572
can tell us what's supposed to happen next,

33
00:01:44,292 --> 00:01:46,422
even with no evildoer,

34
00:01:46,422 --> 00:01:50,367
even with no explicit hatred.

35
00:01:50,367 --> 00:01:54,169
This association between blackness and crime

36
00:01:54,169 --> 00:01:58,357
made its way into the mind of my five-year-old.

37
00:02:00,050 --> 00:02:04,463
It makes its way into all of our children,

38
00:02:04,463 --> 00:02:07,052
into all of us.

39
00:02:07,245 --> 00:02:10,690
Our minds are shaped by the racial disparities

40
00:02:10,690 --> 00:02:12,864
we see out in the world,

41
00:02:12,864 --> 00:02:18,205
and the narratives that help us to make sense of the disparities we see.

42
00:02:19,899 --> 00:02:22,448
Those people are criminals.

43
00:02:22,448 --> 00:02:24,335
Those people are violent.

44
00:02:24,335 --> 00:02:27,300
Those people are to be feared.

45
00:02:28,100 --> 00:02:31,657
When my research team brought people into our lab

46
00:02:31,657 --> 00:02:33,395
and exposed them to faces,

47
00:02:33,395 --> 00:02:36,343
we found that exposure to black faces

48
00:02:37,052 --> 00:02:43,634
led them to see blurry images of guns with greater clarity and speed.

49
00:02:43,902 --> 00:02:47,114
Bias can not only control what we see,

50
00:02:47,114 --> 00:02:48,941
but where we look.
51
00:02:48,941 --> 00:02:52,408
We found that prompting people to think of violent crime

52
00:02:52,408 --> 00:02:56,597
can lead them to direct their eyes onto a black face

53
00:02:56,858 --> 00:02:59,011
and away from a white face.

54
00:02:59,011 --> 00:03:03,041
Prompting police officers to think of capturing and shooting

55
00:03:03,041 --> 00:03:04,131
and arresting

56
00:03:04,131 --> 00:03:07,936
leads their eyes to settle on black faces too.

57
00:03:07,936 --> 00:03:12,772
Bias can infect every aspect of our criminal justice system.

58
00:03:13,378 --> 00:03:16,536
In a large dataset of death-eligible defendants,

59
00:03:16,536 --> 00:03:18,478
we found that looking more black

60
00:03:18,478 --> 00:03:22,986
more than doubled their chances of receiving a death sentence,

61
00:03:23,828 --> 00:03:26,345
at least when their victims were white.

62
00:03:26,345 --> 00:03:29,232
This effect is significant even after we controlled

63
00:03:29,232 --> 00:03:33,234
for the severity of the crime

64
00:03:33,234 --> 00:03:34,637
and the defendant's attractiveness,

65
00:03:34,637 --> 00:03:36,089
and no matter what we controlled for,

66
00:03:36,089 --> 00:03:38,654
we found that black people

67
00:03:38,654 --> 00:03:40,785
were punished in proportion

68
00:03:40,785 --> 00:03:44,041
to the blackness of their physical features:

69
00:03:44,041 --> 00:03:45,674
the more black,

70
00:03:45,674 --> 00:03:47,600
the more death-worthy.

71
00:03:47,600 --> 00:03:52,107
Bias can also influence how teachers discipline students.

72
00:03:52,107 --> 00:03:54,113
My colleagues and I

73
00:03:54,113 --> 00:03:56,606
have found that teachers express a desire

74
00:03:56,606 --> 00:03:59,116
to discipline a black middle school student

75
00:03:59,116 --> 00:04:03,535
more harshly than a white student for the same repeated infractions.

76
00:04:04,200 --> 00:04:05,440
In a recent study,

77
00:04:05,440 --> 00:04:09,453
we're finding that teachers treat black students as a group

78
00:04:09,453 --> 00:04:12,388
but white students as individuals.

79
00:04:12,388 --> 00:04:15,947
If, for example, one black student misbehaves

80
00:04:15,947 --> 00:04:20,799
and then a different black student misbehaves a few days later,

81
00:04:20,799 --> 00:04:24,050
the teacher responds to that second black student

82
00:04:24,050 --> 00:04:26,675
as if he had misbehaved twice.

83
00:04:27,214 --> 00:04:30,048
It's as though the sins of one child

84
00:04:30,048 --> 00:04:32,158
get piled onto the other.

85
00:04:32,718 --> 00:04:36,035
We create categories to make sense of the world,

86
00:04:36,035 --> 00:04:39,725
to assert some control and coherence

87
00:04:39,725 --> 00:04:44,270
over the stimuli that we're constantly being bombarded with.

88
00:04:44,270 --> 00:04:48,360
Categorization and the bias that it seeds

89
00:04:48,360 --> 00:04:51,198
allow our brains to make judgments

90
00:04:51,198 --> 00:04:53,370
more quickly and efficiently,

91
00:04:53,370 --> 00:04:56,684
and we do this by instinctively relying on patterns

92
00:04:56,684 --> 00:04:58,376
that seem predictable.

93
00:04:58,376 --> 00:05:01,758
Yet just as the categories we create

94
00:05:01,758 --> 00:05:04,463
allow us to make quick decisions,

95
00:05:04,463 --> 00:05:06,878
they also reinforce bias.

96
00:05:06,878 --> 00:05:11,295
So the very things that help us to see the world

97
00:05:11,295 --> 00:05:13,851
also can blind us to it.
98
00:05:13,851 --> 00:05:16,549
They render our choices effortless,

99
00:05:16,549 --> 00:05:19,441
friction-free.

100
00:05:19,441 --> 00:05:22,397
Yet they exact a heavy toll.

101
00:05:22,397 --> 00:05:24,809
So what can we do?

102
00:05:24,809 --> 00:05:27,284
We are all vulnerable to bias,

103
00:05:27,284 --> 00:05:30,212
but we don't act on bias all the time.

104
00:05:30,212 --> 00:05:33,681
There are certain conditions that can bring bias alive

105
00:05:33,681 --> 00:05:36,281
and other conditions that can muffle it.

106
00:05:36,281 --> 00:05:38,933
Let me give you an example.

107
00:05:38,933 --> 00:05:43,061
Many people are familiar with the tech company Nextdoor.

108
00:05:43,460 --> 00:05:51,738
So their whole purpose is to create stronger, healthier, safer neighborhoods.

109
00:05:51,738 --> 00:05:55,393
And so they offer this online space

110
00:05:55,393 --> 00:05:57,983
where neighbors can gather and share information.

111
00:05:57,983 --> 00:06:02,015
Yet Nextdoor soon found that they had a problem

112
00:06:02,015 --> 00:06:04,187
with racial profiling.

113
00:06:04,187 --> 00:06:06,130
In the typical case,

114
00:06:06,130 --> 00:06:08,713
people would look outside their window

115
00:06:08,713 --> 00:06:12,785
and see a black man in their otherwise white neighborhood

116
00:06:12,785 --> 00:06:14,976
and make the snap judgment

117
00:06:14,976 --> 00:06:17,299
that he was up to no good,

118
00:06:17,299 --> 00:06:20,865
even when there was no evidence of criminal wrongdoing.

119
00:06:21,127 --> 00:06:23,831
In many ways, how we behave online

120
00:06:23,831 --> 00:06:26,945
is a reflection of how we behave in the world.

121
00:06:27,816 --> 00:06:31,372
But what we don't want to do is create an easy-to-use system

122
00:06:31,372 --> 00:06:34,195
that can amplify bias

123
00:06:34,195 --> 00:06:36,439
and deepen racial disparities,

124
00:06:36,439 --> 00:06:39,117
rather than dismantle them.

125
00:06:39,117 --> 00:06:42,491
So the co-founder of Nextdoor reached out to me and to others

126
00:06:42,491 --> 00:06:44,755
to try to figure out what to do,

127
00:06:44,755 --> 00:06:48,052
and they realized that to curb racial profiling on the platform,

128
00:06:48,710 --> 00:06:50,601
they were going to have to add friction.

129
00:06:50,601 --> 00:06:53,244
That is, they were going to have to slow people down.

130
00:06:53,244 --> 00:06:55,423
So Nextdoor had a choice to make,

131
00:06:55,423 --> 00:06:58,066
and against every impulse

132
00:06:58,066 --> 00:07:00,524
they decided to add friction.

133
00:07:00,524 --> 00:07:03,948
And they did this by adding a simple checklist.

134
00:07:03,948 --> 00:07:06,310
There were three items on it.

135
00:07:06,310 --> 00:07:08,766
First, they asked users to pause

136
00:07:08,766 --> 00:07:12,608
and think, what was this person doing

137
00:07:12,608 --> 00:07:14,399
that made him suspicious?

138
00:07:14,399 --> 00:07:19,519
The category "black man" is not grounds for suspicion.

139
00:07:19,713 --> 00:07:24,833
Second, they asked users to describe the person's physical features,

140
00:07:24,833 --> 00:07:27,551
not simply their race and gender.

141
00:07:27,551 --> 00:07:31,107
Third, they realized that a lot of people

142
00:07:31,107 --> 00:07:34,345
didn't seem to know what racial profiling was,

143
00:07:34,345 --> 00:07:36,645
nor that they were engaging in it.
144
00:07:36,875 --> 00:07:40,069
So Nextdoor provided them with a definition

145
00:07:40,069 --> 00:07:43,177
and told them that it was strictly prohibited.

146
00:07:43,484 --> 00:07:47,060
Most of you have seen those signs in airports

147
00:07:47,060 --> 00:07:50,262
and Metro stations, "If you see something, say something."

148
00:07:50,262 --> 00:07:53,919
Nextdoor tried modifying this.

149
00:07:53,919 --> 00:07:59,329
"If you see something suspicious, say something specific."

150
00:07:59,587 --> 00:08:01,700
And using this strategy,

151
00:08:01,700 --> 00:08:04,290
by simply slowing people down,

152
00:08:04,290 --> 00:08:09,834
Nextdoor was able to curb racial profiling by 75 percent.

153
00:08:10,766 --> 00:08:12,879
Now, people often will say to me,

154
00:08:12,879 --> 00:08:17,768
you can't add friction in every situation, in every context,

155
00:08:17,768 --> 00:08:21,914
and especially for people who make split-second decisions all the time,

156
00:08:22,809 --> 00:08:25,586
but it turns out we can add friction

157
00:08:25,586 --> 00:08:28,246
to more situations than we think.

158
00:08:28,246 --> 00:08:30,691
Working with the Oakland Police Department

159
00:08:30,691 --> 00:08:31,919
in California,

160
00:08:31,919 --> 00:08:35,859
I and a number of my colleagues were able to help the department

161
00:08:35,859 --> 00:08:38,360
to reduce the number of stops they made

162
00:08:38,360 --> 00:08:41,805
of people who were not committing any serious crimes,

163
00:08:42,238 --> 00:08:44,500
and we did this by pushing officers

164
00:08:44,500 --> 00:08:46,364
to ask themselves a question

165
00:08:46,364 --> 00:08:48,917
before each and every stop they made:

166
00:08:49,451 --> 00:08:52,797
Is this stop intelligence-led,

167
00:08:52,797 --> 00:08:54,207
yes or no?

168
00:08:55,581 --> 00:08:57,215
In other words,

169
00:08:57,215 --> 00:09:02,317
do I have prior information to tie this particular person

170
00:09:02,317 --> 00:09:04,295
to a specific crime?

171
00:09:04,882 --> 00:09:06,672
By adding that question

172
00:09:06,672 --> 00:09:09,435
to the form officers complete during a stop,

173
00:09:09,435 --> 00:09:11,376
they slow down, they pause,

174
00:09:11,376 --> 00:09:15,237
they think, why am I considering pulling this person over?

175
00:09:16,912 --> 00:09:22,203
In 2017, before we added that intelligence-led question to the form,

176
00:09:23,655 --> 00:09:27,479
officers made about 32,000 stops across the city.

177
00:09:27,934 --> 00:09:29,593
In that next year,

178
00:09:29,593 --> 00:09:31,887
with the addition of this question,

179
00:09:31,887 --> 00:09:34,401
that fell to 19,000 stops.

180
00:09:34,579 --> 00:09:40,612
African-American stops alone fell by 43 percent.

181
00:09:40,612 --> 00:09:44,669
And stopping fewer black people did not make the city any more dangerous.

182
00:09:44,669 --> 00:09:47,292
In fact, the crime rate continued to fall,

183
00:09:47,292 --> 00:09:50,202
and the city became safer for everybody.

184
00:09:50,486 --> 00:09:56,334
So one solution can come from reducing the number of unnecessary stops.

185
00:09:56,334 --> 00:10:00,690
Another can come from improving the quality of the stops

186
00:10:00,690 --> 00:10:02,743
officers do make.

187
00:10:02,743 --> 00:10:05,349
And technology can help us here.
188
00:10:05,349 --> 00:10:07,878
We all know about George Floyd's death

189
00:10:07,878 --> 00:10:11,196
because those who tried to come to his aid

190
00:10:11,517 --> 00:10:17,131
held cell phone cameras to record that horrific, fatal encounter

191
00:10:17,453 --> 00:10:18,918
with the police.

192
00:10:18,918 --> 00:10:21,703
But we have all sorts of technology

193
00:10:21,703 --> 00:10:24,133
that we're not putting to good use.

194
00:10:24,133 --> 00:10:26,396
Police officers across the country

195
00:10:26,396 --> 00:10:30,094
are now required to wear body-worn cameras,

196
00:10:30,094 --> 00:10:35,570
so we have recordings of not only the most extreme and horrific encounters

197
00:10:35,570 --> 00:10:38,354
but of everyday interactions.

198
00:10:38,753 --> 00:10:41,799
With an interdisciplinary team at Stanford,

199
00:10:41,799 --> 00:10:44,806
we've begun to use machine learning techniques

200
00:10:44,806 --> 00:10:47,898
to analyze large numbers of encounters.

201
00:10:47,898 --> 00:10:52,528
This is to better understand what happens in routine traffic stops.

202
00:10:52,528 --> 00:10:56,094
What we found was that even when police officers

203
00:10:56,094 --> 00:10:58,356
are behaving professionally,

204
00:10:58,356 --> 00:11:02,906
they speak to black drivers less respectfully than to white drivers.

205
00:11:03,529 --> 00:11:07,334
In fact, from the words officers use alone,

206
00:11:07,334 --> 00:11:12,973
we could predict whether they were talking to a black driver or a white driver.

207
00:11:13,655 --> 00:11:18,797
The problem is that the vast majority of the footage from these cameras

208
00:11:18,797 --> 00:11:21,424
is not used by police departments

209
00:11:21,424 --> 00:11:23,858
to understand what's going on on the street

210
00:11:23,858 --> 00:11:26,428
or to train officers.

211
00:11:26,659 --> 00:11:28,844
And that's a shame.

212
00:11:28,844 --> 00:11:31,532
How does a routine stop

213
00:11:31,532 --> 00:11:33,812
turn into a deadly encounter?

214
00:11:34,077 --> 00:11:36,287
How did this happen in George Floyd's case?

215
00:11:36,287 --> 00:11:40,511
How did it happen in others?

216
00:11:40,511 --> 00:11:43,252
When my eldest son was 16 years old,

217
00:11:43,252 --> 00:11:46,277
he discovered that when white people look at him,

218
00:11:46,277 --> 00:11:47,838
they feel fear.

219
00:11:49,338 --> 00:11:52,409
"Elevators are the worst," he said.

220
00:11:52,409 --> 00:11:54,592
When those doors close,

221
00:11:54,592 --> 00:11:57,775
people are trapped in this tiny space

222
00:11:57,775 --> 00:12:00,262
with someone they have been taught

223
00:12:00,262 --> 00:12:02,242
to associate with danger.

224
00:12:03,046 --> 00:12:06,163
My son senses their discomfort,

225
00:12:06,163 --> 00:12:09,400
and he smiles to put them at ease,

226
00:12:09,400 --> 00:12:11,566
to calm their fears.

227
00:12:11,566 --> 00:12:13,534
When he speaks,

228
00:12:13,534 --> 00:12:15,720
their bodies relax.

229
00:12:15,720 --> 00:12:17,697
They breathe easier.

230
00:12:17,697 --> 00:12:20,251
They take pleasure in his cadence,

231
00:12:20,251 --> 00:12:22,418
his diction, his word choice.

232
00:12:23,241 --> 00:12:25,157
He sounds like one of them.
233
00:12:25,157 --> 00:12:29,867
I used to think that my son was a natural extrovert like his father,

234
00:12:29,867 --> 00:12:34,413
but I realized at that moment in that conversation

235
00:12:34,413 --> 00:12:38,006
that his smile was not a sign

236
00:12:38,006 --> 00:12:41,049
that he wanted to connect with would-be strangers.

237
00:12:42,175 --> 00:12:45,850
It was a talisman he'd used to protect himself,

238
00:12:45,850 --> 00:12:49,638
a survival skill he had honed over thousands of elevator rides.

239
00:12:49,638 --> 00:12:53,571
He was learning to accommodate the tension

240
00:12:53,571 --> 00:12:56,887
that his skin color generated

241
00:12:58,164 --> 00:13:01,396
and that put his own life at risk.

242
00:13:02,619 --> 00:13:06,187
We know that the brain is wired for bias,

243
00:13:06,187 --> 00:13:09,201
and one way to interrupt that bias

244
00:13:09,559 --> 00:13:13,445
is to pause and to reflect on the evidence of our assumptions.

245
00:13:13,445 --> 00:13:15,156
So we need to ask ourselves,

246
00:13:15,156 --> 00:13:19,702
what assumptions do we bring

247
00:13:19,702 --> 00:13:21,530
when we step onto an elevator?

248
00:13:21,530 --> 00:13:23,393
Or an airplane?

249
00:13:23,739 --> 00:13:28,422
How do we make ourselves aware of our own unconscious bias?

250
00:13:28,422 --> 00:13:32,337
Who do those assumptions keep safe?

251
00:13:32,798 --> 00:13:35,557
Who do they put at risk?

252
00:13:35,824 --> 00:13:38,832
Until we ask these questions,

253
00:13:38,832 --> 00:13:43,702
and insist that our schools and our courts and our police departments

254
00:13:43,702 --> 00:13:47,074
and every institution do the same,

255
00:13:47,074 --> 00:13:51,031
we will continue to allow bias

256
00:13:51,883 --> 00:13:53,179
to blind us,

257
00:13:53,579 --> 00:13:54,988
and if we do,

258
00:13:55,717 --> 00:13:59,274
none of us are truly safe.

259
00:14:00,358 --> 00:14:02,619
Thank you.