1
00:00:00,910 --> 00:00:02,183
Some years ago,

2
00:00:02,207 --> 00:00:07,119
I was on an airplane with my son who was just five years old at the time.

3
00:00:08,245 --> 00:00:13,340
My son was so excited about being on this airplane with Mommy.

4
00:00:13,364 --> 00:00:16,304
He's looking all around and he's checking things out

5
00:00:16,328 --> 00:00:18,164
and he's checking people out.

6
00:00:18,188 --> 00:00:19,818
And he sees this man, and he says,

7
00:00:19,842 --> 00:00:22,703
"Hey! That guy looks like Daddy!"

8
00:00:23,882 --> 00:00:25,802
And I look at the man,

9
00:00:25,826 --> 00:00:29,556
and he didn't look anything at all like my husband,

10
00:00:29,580 --> 00:00:30,949
nothing at all.

11
00:00:31,461 --> 00:00:33,797
And so then I start looking around on the plane,

12
00:00:33,821 --> 00:00:39,706
and I notice this man was the only black guy on the plane.

13
00:00:40,874 --> 00:00:42,286
And I thought,

14
00:00:42,310 --> 00:00:43,504
"Alright.

15
00:00:44,369 --> 00:00:46,894
I'm going to have to have a little talk with my son

16
00:00:46,918 --> 00:00:49,847
about how not all black people look alike."

17
00:00:49,871 --> 00:00:54,248
My son, he lifts his head up, and he says to me,

18
00:00:56,246 --> 00:00:58,617
"I hope he doesn't rob the plane."

19
00:00:59,359 --> 00:01:01,874
And I said, "What? What did you say?"

20
00:01:01,898 --> 00:01:05,326
And he says, "Well, I hope that man doesn't rob the plane."

21
00:01:07,200 --> 00:01:09,952
And I said, "Well, why would you say that?

22
00:01:10,486 --> 00:01:13,149
You know Daddy wouldn't rob a plane."

23
00:01:13,173 --> 00:01:15,485
And he says, "Yeah, yeah, yeah, well, I know."

24
00:01:16,179 --> 00:01:18,306
And I said, "Well, why would you say that?"

25
00:01:20,346 --> 00:01:23,303
And he looked at me with this really sad face,

26
00:01:24,168 --> 00:01:25,422
and he says,

27
00:01:26,890 --> 00:01:28,996
"I don't know why I said that.

28
00:01:30,600 --> 00:01:32,958
I don't know why I was thinking that."

29
00:01:33,724 --> 00:01:37,242
We are living with such severe racial stratification

30
00:01:37,266 --> 00:01:42,326
that even a five-year-old can tell us what's supposed to happen next,

31
00:01:43,990 --> 00:01:46,097
even with no evildoer,

32
00:01:46,121 --> 00:01:48,700
even with no explicit hatred.

33
00:01:50,184 --> 00:01:54,097
This association between blackness and crime

34
00:01:54,121 --> 00:01:58,455
made its way into the mind of my five-year-old.

35
00:01:59,787 --> 00:02:03,050
It makes its way into all of our children,

36
00:02:04,201 --> 00:02:05,592
into all of us.

37
00:02:06,793 --> 00:02:10,317
Our minds are shaped by the racial disparities

38
00:02:10,341 --> 00:02:11,933
we see out in the world

39
00:02:12,752 --> 00:02:18,093
and the narratives that help us to make sense of the disparities we see:

40
00:02:19,637 --> 00:02:22,163
"Those people are criminal."

41
00:02:22,187 --> 00:02:24,112
"Those people are violent."

42
00:02:24,136 --> 00:02:27,101
"Those people are to be feared."

43
00:02:27,814 --> 00:02:31,005
When my research team brought people into our lab

44
00:02:31,029 --> 00:02:33,312
and exposed them to faces,

45
00:02:33,336 --> 00:02:40,336
we found that exposure to black faces led them to see blurry images of guns

46
00:02:40,360 --> 00:02:43,616
with greater clarity and speed.

47
00:02:43,640 --> 00:02:46,994
Bias can not only control what we see,

48
00:02:47,018 --> 00:02:48,666
but where we look.
49
00:02:48,690 --> 00:02:52,134
We found that prompting people to think of violent crime

50
00:02:52,158 --> 00:02:56,454
can lead them to direct their eyes onto a black face

51
00:02:56,478 --> 00:02:58,608
and away from a white face.

52
00:02:58,632 --> 00:03:02,474
Prompting police officers to think of capturing and shooting

53
00:03:02,498 --> 00:03:03,727
and arresting

54
00:03:03,751 --> 00:03:07,610
leads their eyes to settle on black faces, too.

55
00:03:07,634 --> 00:03:12,700
Bias can infect every aspect of our criminal justice system.

56
00:03:13,100 --> 00:03:16,031
In a large data set of death-eligible defendants,

57
00:03:16,055 --> 00:03:20,412
we found that looking more black more than doubled their chances

58
00:03:20,436 --> 00:03:22,493
of receiving a death sentence --

59
00:03:23,494 --> 00:03:25,921
at least when their victims were white.

60
00:03:25,945 --> 00:03:27,383
This effect is significant,

61
00:03:27,407 --> 00:03:30,708
even though we controlled for the severity of the crime

62
00:03:30,732 --> 00:03:33,013
and the defendant's attractiveness.

63
00:03:33,037 --> 00:03:35,686
And no matter what we controlled for,

64
00:03:35,710 --> 00:03:39,055
we found that black people were punished

65
00:03:39,079 --> 00:03:43,404
in proportion to the blackness of their physical features:

66
00:03:43,428 --> 00:03:45,309
the more black,

67
00:03:45,333 --> 00:03:47,086
the more death-worthy.

68
00:03:47,110 --> 00:03:51,319
Bias can also influence how teachers discipline students.

69
00:03:51,781 --> 00:03:56,188
My colleagues and I have found that teachers express a desire

70
00:03:56,212 --> 00:03:59,778
to discipline a black middle school student more harshly

71
00:03:59,802 --> 00:04:00,970
than a white student

72
00:04:00,994 --> 00:04:03,570
for the same repeated infractions.

73
00:04:03,594 --> 00:04:04,888
In a recent study,

74
00:04:04,912 --> 00:04:09,270
we're finding that teachers treat black students as a group

75
00:04:09,294 --> 00:04:11,725
but white students as individuals.

76
00:04:12,126 --> 00:04:15,725
If, for example, one black student misbehaves

77
00:04:15,749 --> 00:04:20,534
and then a different black student misbehaves a few days later,

78
00:04:20,558 --> 00:04:23,786
the teacher responds to that second black student

79
00:04:23,810 --> 00:04:26,435
as if he had misbehaved twice.

80
00:04:26,952 --> 00:04:29,763
It's as though the sins of one child

81
00:04:29,787 --> 00:04:31,963
get piled onto the other.

82
00:04:31,987 --> 00:04:35,281
We create categories to make sense of the world,

83
00:04:35,305 --> 00:04:39,788
to assert some control and coherence

84
00:04:39,812 --> 00:04:43,902
to the stimuli that we're constantly being bombarded with.

85
00:04:43,926 --> 00:04:47,894
Categorization and the bias that it seeds

86
00:04:47,918 --> 00:04:52,940
allow our brains to make judgments more quickly and efficiently,

87
00:04:52,964 --> 00:04:56,366
and we do this by instinctively relying on patterns

88
00:04:56,390 --> 00:04:58,059
that seem predictable.

89
00:04:58,083 --> 00:05:04,026
Yet, just as the categories we create allow us to make quick decisions,

90
00:05:04,050 --> 00:05:06,552
they also reinforce bias.

91
00:05:06,576 --> 00:05:09,968
So the very things that help us to see the world

92
00:05:11,104 --> 00:05:13,084
also can blind us to it.

93
00:05:13,509 --> 00:05:16,287
They render our choices effortless,

94
00:05:16,311 --> 00:05:17,680
friction-free.

95
00:05:18,956 --> 00:05:21,401
Yet they exact a heavy toll.
96
00:05:22,158 --> 00:05:23,569
So what can we do?

97
00:05:24,507 --> 00:05:26,998
We are all vulnerable to bias,

98
00:05:27,022 --> 00:05:29,702
but we don't act on bias all the time.

99
00:05:29,726 --> 00:05:33,370
There are certain conditions that can bring bias alive

100
00:05:33,394 --> 00:05:35,927
and other conditions that can muffle it.

101
00:05:35,951 --> 00:05:37,798
Let me give you an example.

102
00:05:38,663 --> 00:05:43,223
Many people are familiar with the tech company Nextdoor.

103
00:05:44,073 --> 00:05:50,526
So, their whole purpose is to create stronger, healthier, safer neighborhoods.

104
00:05:51,468 --> 00:05:54,389
And so they offer this online space

105
00:05:54,413 --> 00:05:57,562
where neighbors can gather and share information.

106
00:05:57,586 --> 00:06:01,712
Yet, Nextdoor soon found that they had a problem

107
00:06:01,736 --> 00:06:03,404
with racial profiling.

108
00:06:04,012 --> 00:06:05,979
In the typical case,

109
00:06:06,003 --> 00:06:08,399
people would look outside their window

110
00:06:08,423 --> 00:06:12,472
and see a black man in their otherwise white neighborhood

111
00:06:12,496 --> 00:06:17,211
and make the snap judgment that he was up to no good,

112
00:06:17,235 --> 00:06:20,586
even when there was no evidence of criminal wrongdoing.

113
00:06:20,610 --> 00:06:23,544
In many ways, how we behave online

114
00:06:23,568 --> 00:06:26,682
is a reflection of how we behave in the world.

115
00:06:27,117 --> 00:06:31,062
But what we don't want to do is create an easy-to-use system

116
00:06:31,086 --> 00:06:35,249
that can amplify bias and deepen racial disparities,

117
00:06:36,129 --> 00:06:38,395
rather than dismantling them.

118
00:06:38,863 --> 00:06:42,292
So the cofounder of Nextdoor reached out to me and to others

119
00:06:42,316 --> 00:06:44,447
to try to figure out what to do.

120
00:06:44,471 --> 00:06:48,417
And they realized that to curb racial profiling on the platform,

121
00:06:48,441 --> 00:06:50,363
they were going to have to add friction;

122
00:06:50,387 --> 00:06:53,045
that is, they were going to have to slow people down.

123
00:06:53,069 --> 00:06:55,264
So Nextdoor had a choice to make,

124
00:06:55,288 --> 00:06:57,766
and against every impulse,

125
00:06:57,790 --> 00:06:59,906
they decided to add friction.

126
00:07:00,397 --> 00:07:03,837
And they did this by adding a simple checklist.

127
00:07:03,861 --> 00:07:05,531
There were three items on it.

128
00:07:06,111 --> 00:07:09,052
First, they asked users to pause

129
00:07:09,076 --> 00:07:14,193
and think, "What was this person doing that made him suspicious?"

130
00:07:14,876 --> 00:07:19,409
The category "black man" is not grounds for suspicion.

131
00:07:19,433 --> 00:07:24,572
Second, they asked users to describe the person's physical features,

132
00:07:24,596 --> 00:07:27,031
not simply their race and gender.

133
00:07:27,642 --> 00:07:31,025
Third, they realized that a lot of people

134
00:07:31,049 --> 00:07:33,977
didn't seem to know what racial profiling was,

135
00:07:34,001 --> 00:07:35,960
nor that they were engaging in it.

136
00:07:36,462 --> 00:07:39,656
So Nextdoor provided them with a definition

137
00:07:39,680 --> 00:07:42,582
and told them that it was strictly prohibited.

138
00:07:43,071 --> 00:07:45,683
Most of you have seen those signs in airports

139
00:07:45,707 --> 00:07:49,409
and in metro stations, "If you see something, say something."

140
00:07:49,928 --> 00:07:52,742
Nextdoor tried modifying this.
141
00:07:53,584 --> 00:07:56,156
"If you see something suspicious,

142
00:07:56,180 --> 00:07:58,253
say something specific."

143
00:07:59,491 --> 00:08:03,937
And using this strategy, by simply slowing people down,

144
00:08:03,961 --> 00:08:09,652
Nextdoor was able to curb racial profiling by 75 percent.

145
00:08:10,496 --> 00:08:12,586
Now, people often will say to me,

146
00:08:12,610 --> 00:08:17,323
"You can't add friction in every situation, in every context,

147
00:08:17,347 --> 00:08:21,993
and especially for people who make split-second decisions all the time."

148
00:08:22,730 --> 00:08:25,293
But it turns out we can add friction

149
00:08:25,317 --> 00:08:27,593
to more situations than we think.

150
00:08:28,031 --> 00:08:30,105
Working with the Oakland Police Department

151
00:08:30,129 --> 00:08:31,546
in California,

152
00:08:31,570 --> 00:08:35,426
I and a number of my colleagues were able to help the department

153
00:08:35,450 --> 00:08:38,121
to reduce the number of stops they made

154
00:08:38,145 --> 00:08:41,745
of people who were not committing any serious crimes.

155
00:08:41,769 --> 00:08:44,134
And we did this by pushing officers

156
00:08:44,158 --> 00:08:48,601
to ask themselves a question before each and every stop they made:

157
00:08:49,451 --> 00:08:52,466
"Is this stop intelligence-led,

158
00:08:52,490 --> 00:08:53,941
yes or no?"

159
00:08:55,353 --> 00:08:56,749
In other words,

160
00:08:57,621 --> 00:09:02,105
do I have prior information to tie this particular person

161
00:09:02,129 --> 00:09:03,730
to a specific crime?

162
00:09:04,587 --> 00:09:06,045
By adding that question

163
00:09:06,069 --> 00:09:09,148
to the form officers complete during a stop,

164
00:09:09,172 --> 00:09:10,981
they slow down, they pause,

165
00:09:11,005 --> 00:09:15,225
they think, "Why am I considering pulling this person over?"

166
00:09:16,721 --> 00:09:22,282
In 2017, before we added that intelligence-led question to the form,

167
00:09:23,655 --> 00:09:27,601
officers made about 32,000 stops across the city.

168
00:09:27,625 --> 00:09:31,740
In that next year, with the addition of this question,

169
00:09:31,764 --> 00:09:34,208
that fell to 19,000 stops.

170
00:09:34,232 --> 00:09:39,193
African-American stops alone fell by 43 percent.

171
00:09:39,905 --> 00:09:44,343
And stopping fewer black people did not make the city any more dangerous.

172
00:09:44,367 --> 00:09:47,101
In fact, the crime rate continued to fall,

173
00:09:47,125 --> 00:09:50,462
and the city became safer for everybody.

174
00:09:50,486 --> 00:09:55,841
So one solution can come from reducing the number of unnecessary stops.

175
00:09:56,285 --> 00:10:00,555
Another can come from improving the quality of the stops

176
00:10:00,579 --> 00:10:01,884
officers do make.

177
00:10:02,512 --> 00:10:05,108
And technology can help us here.

178
00:10:05,132 --> 00:10:07,547
We all know about George Floyd's death,

179
00:10:08,499 --> 00:10:13,271
because those who tried to come to his aid held cell phone cameras

180
00:10:13,295 --> 00:10:18,726
to record that horrific, fatal encounter with the police.

181
00:10:18,750 --> 00:10:23,781
But we have all sorts of technology that we're not putting to good use.
182
00:10:23,805 --> 00:10:26,308
Police departments across the country

183
00:10:26,332 --> 00:10:29,885
are now required to wear body-worn cameras

184
00:10:29,909 --> 00:10:35,839
so we have recordings of not only the most extreme and horrific encounters

185
00:10:35,863 --> 00:10:38,617
but of everyday interactions.

186
00:10:38,641 --> 00:10:41,418
With an interdisciplinary team at Stanford,

187
00:10:41,442 --> 00:10:44,129
we've begun to use machine learning techniques

188
00:10:44,153 --> 00:10:47,520
to analyze large numbers of encounters.

189
00:10:47,544 --> 00:10:52,155
This is to better understand what happens in routine traffic stops.

190
00:10:52,179 --> 00:10:54,334
What we found was that

191
00:10:54,358 --> 00:10:58,020
even when police officers are behaving professionally,

192
00:10:58,860 --> 00:11:03,322
they speak to black drivers less respectfully than white drivers.

193
00:11:04,052 --> 00:11:08,127
In fact, from the words officers use alone,

194
00:11:08,151 --> 00:11:13,313
we could predict whether they were talking to a black driver or a white driver.

195
00:11:13,337 --> 00:11:19,099
The problem is that the vast majority of the footage from these cameras

196
00:11:19,123 --> 00:11:21,210
is not used by police departments

197
00:11:21,234 --> 00:11:23,510
to understand what's going on on the street

198
00:11:23,534 --> 00:11:25,777
or to train officers.

199
00:11:26,554 --> 00:11:28,012
And that's a shame.

200
00:11:28,796 --> 00:11:33,585
How does a routine stop turn into a deadly encounter?

201
00:11:33,609 --> 00:11:36,279
How did this happen in George Floyd's case?

202
00:11:37,588 --> 00:11:39,670
How did it happen in others?

203
00:11:39,694 --> 00:11:43,090
When my eldest son was 16 years old,

204
00:11:43,114 --> 00:11:46,253
he discovered that when white people look at him,

205
00:11:46,277 --> 00:11:47,838
they feel fear.

206
00:11:49,123 --> 00:11:51,784
Elevators are the worst, he said.

207
00:11:52,313 --> 00:11:54,644
When those doors close,

208
00:11:54,668 --> 00:11:57,751
people are trapped in this tiny space

209
00:11:57,775 --> 00:12:02,242
with someone they have been taught to associate with danger.

210
00:12:02,744 --> 00:12:05,964
My son senses their discomfort,

211
00:12:05,988 --> 00:12:09,145
and he smiles to put them at ease,

212
00:12:09,169 --> 00:12:10,938
to calm their fears.

213
00:12:11,351 --> 00:12:13,296
When he speaks,

214
00:12:13,320 --> 00:12:15,003
their bodies relax.

215
00:12:15,442 --> 00:12:17,345
They breathe easier.

216
00:12:17,369 --> 00:12:19,900
They take pleasure in his cadence,

217
00:12:19,924 --> 00:12:22,241
his diction, his word choice.

218
00:12:22,986 --> 00:12:24,829
He sounds like one of them.

219
00:12:24,853 --> 00:12:29,583
I used to think that my son was a natural extrovert like his father.

220
00:12:29,607 --> 00:12:33,157
But I realized at that moment, in that conversation,

221
00:12:34,143 --> 00:12:39,221
that his smile was not a sign that he wanted to connect

222
00:12:39,245 --> 00:12:41,209
with would-be strangers.

223
00:12:41,920 --> 00:12:45,572
It was a talisman he used to protect himself,

224
00:12:45,596 --> 00:12:51,815
a survival skill he had honed over thousands of elevator rides.

225
00:12:52,387 --> 00:12:57,558
He was learning to accommodate the tension that his skin color generated

226
00:12:59,026 --> 00:13:01,693
and that put his own life at risk.
227
00:13:02,619 --> 00:13:06,402
We know that the brain is wired for bias,

228
00:13:06,426 --> 00:13:10,891
and one way to interrupt that bias is to pause and to reflect

229
00:13:10,915 --> 00:13:13,220
on the evidence of our assumptions.

230
00:13:13,244 --> 00:13:14,999
So we need to ask ourselves:

231
00:13:15,023 --> 00:13:19,688
What assumptions do we bring when we step onto an elevator?

232
00:13:21,776 --> 00:13:23,087
Or an airplane?

233
00:13:23,532 --> 00:13:28,131
How do we make ourselves aware of our own unconscious bias?

234
00:13:28,155 --> 00:13:30,506
Who do those assumptions keep safe?

235
00:13:32,615 --> 00:13:34,547
Who do they put at risk?

236
00:13:35,649 --> 00:13:38,003
Until we ask these questions

237
00:13:38,978 --> 00:13:43,602
and insist that our schools and our courts and our police departments

238
00:13:43,626 --> 00:13:46,168
and every institution do the same,

239
00:13:47,835 --> 00:13:51,664
we will continue to allow bias

240
00:13:51,688 --> 00:13:52,966
to blind us.

241
00:13:53,348 --> 00:13:54,757
And if we do,

242
00:13:56,066 --> 00:13:59,274
none of us are truly safe.

243
00:14:02,103 --> 00:14:03,411
Thank you.