Some years ago, I was on an airplane with my son, who was just five years old at the time. My son was so excited about being on this airplane with Mommy. He's looking all around and he's checking things out and he's checking people out, and he sees this man, and he says, "Hey, that guy looks like Daddy."

And I look at the man, and he didn't look anything at all like my husband, nothing at all. And so then I start looking around on the plane, and I notice this man was the only black guy on the plane. And I thought, all right. I'm going to have to have a little talk with my son about how not all black people look alike.

My son, he lifts his head up, and he says to me, "I hope he doesn't rob the plane."

And I said, "What? What did you say?"

And he says, "Well, I hope that man doesn't rob the plane."

And I said, "Well, why would you say that? You know Daddy wouldn't rob a plane."

And he says, "Yeah, yeah, yeah, well, I know."

And I said, "Well, why would you say that?"

And he looked at me with this really sad face, and he says, "I don't know why I said that. I don't know why I was thinking that."
We are living with such severe racial stratification that even a five-year-old can tell us what's supposed to happen next, even with no evildoer, even with no explicit hatred. This association between blackness and crime made its way into the mind of my five-year-old. It makes its way into all of our children, into all of us.

Our minds are shaped by the racial disparities we see out in the world, and by the narratives that help us to make sense of the disparities we see: Those people are criminals. Those people are violent. Those people are to be feared.

When my research team brought people into our lab and exposed them to faces, we found that exposure to black faces led them to see blurry images of guns with greater clarity and speed. Bias can not only control what we see, but where we look. We found that prompting people to think of violent crime can lead them to direct their eyes onto a black face and away from a white face. Prompting police officers to think of capturing and shooting and arresting leads their eyes to settle on black faces, too. Bias can infect every aspect of our criminal justice system.
In a large dataset of death-eligible defendants, we found that looking more black more than doubled their chances of receiving a death sentence, at least when their victims were white. This effect is significant even though we controlled for the severity of the crime and for the defendant's attractiveness. And no matter what we controlled for, we found that black people were punished in proportion to the blackness of their physical features: the more black, the more death-worthy.

Bias can also influence how teachers discipline students. My colleagues and I have found that teachers express a desire to discipline a black middle school student more harshly than a white student for the same repeated infractions. In a recent study, we're finding that teachers treat black students as a group but white students as individuals. If, for example, one black student misbehaves and then a different black student misbehaves a few days later, the teacher responds to that second black student as if he had misbehaved twice. It's as though the sins of one child get piled onto the other.

We create categories to make sense of the world, to assert some control and coherence over the stimuli we're constantly being bombarded with.
Categorization and the bias that it seeds allow our brains to make judgments more quickly and efficiently, and we do this by instinctively relying on patterns that seem predictable. Yet just as the categories we create allow us to make quick decisions, they also reinforce bias. So the very things that help us to see the world can also blind us to it. They render our choices effortless, friction-free. Yet they exact a heavy toll.

So what can we do? We are all vulnerable to bias, but we don't act on bias all the time. There are certain conditions that can bring bias alive and other conditions that can muffle it. Let me give you an example.

Many people are familiar with the tech company Nextdoor. Their whole purpose is to create stronger, healthier, safer neighborhoods, and so they offer this online space where neighbors can gather and share information. Yet Nextdoor soon found that they had a problem with racial profiling. In the typical case, people would look outside their window and see a black man in their otherwise white neighborhood and make the snap judgment that he was up to no good, even when there was no evidence of criminal wrongdoing.
In many ways, how we behave online is a reflection of how we behave in the world. But what we don't want to do is create an easy-to-use system that can amplify bias and deepen racial disparities rather than dismantling them.

So the co-founder of Nextdoor reached out to me and to others to try to figure out what to do, and they realized that to curb racial profiling on the platform, they were going to have to add friction. That is, they were going to have to slow people down. So Nextdoor had a choice to make, and against every impulse, they decided to add friction.

And they did this by adding a simple checklist. There were three items on it. First, they asked users to pause and think, "What was this person doing that made him suspicious?" The category "black man" is not grounds for suspicion. Second, they asked users to describe the person's physical features, not simply their race and gender. Third, they realized that a lot of people didn't seem to know what racial profiling was, nor that they were engaging in it. So Nextdoor provided them with a definition and told them that it was strictly prohibited.

Most of you have seen those signs in airports and metro stations: "If you see something, say something."
Nextdoor tried modifying this: "If you see something suspicious, say something specific." And using this strategy, by simply slowing people down, Nextdoor was able to curb racial profiling by 75 percent.

Now, people often will say to me, "You can't add friction in every situation, in every context, and especially for people who make split-second decisions all the time." But it turns out we can add friction to more situations than we think.

Working with the Oakland Police Department in California, I and a number of my colleagues were able to help the department reduce the number of stops they made of people who were not committing any serious crimes. And we did this by pushing officers to ask themselves a question before each and every stop they made: "Is this stop intelligence-led, yes or no? In other words, do I have prior information to tie this particular person to a specific crime?" By adding that question to the form officers complete during the stop, they slow down, they pause, they think, "Why am I considering pulling this person over?"

In 2017, before we added that intelligence-led question to the form, officers made about 32,000 stops across the city.
In that next year, with the addition of this question, that number fell to 19,000 stops. African-American stops alone fell by 43 percent. And stopping fewer black people did not make the city any more dangerous. In fact, the crime rate continued to fall, and the city became safer for everybody.

So one solution can come from reducing the number of unnecessary stops. Another can come from improving the quality of the stops officers do make. And technology can help us here. We all know about George Floyd's death, because those who tried to come to his aid held cell phone cameras to record that horrific, fatal encounter with the police. But we have all sorts of technology that we're not putting to good use. Police officers across the country are now required to wear body cameras, so we have recordings of not only the most extreme and horrific encounters but of everyday interactions.

With an interdisciplinary team at Stanford, we've begun to use machine learning techniques to analyze large numbers of encounters. This is to better understand what happens in routine traffic stops. What we found was that even when police officers are behaving professionally, they speak to black drivers less respectfully than to white drivers.
In fact, from the words officers use alone, we could predict whether they were talking to a black driver or a white driver. The problem is that the vast majority of the footage from these cameras is not used by police departments to understand what's going on on the street or to train officers. And that's a shame. How does a routine stop turn into a deadly encounter? How did this happen in George Floyd's case? How did it happen in others?

When my eldest son was 16 years old, he discovered that when white people look at him, they feel fear. Elevators are the worst, he said. When those doors close, people are trapped in this tiny space with someone they have been taught to associate with danger. My son senses their discomfort, and he smiles to put them at ease, to calm their fears. When he speaks, their bodies relax. They breathe easier. They take pleasure in his cadence, his diction, his word choice. He sounds like one of them.

I used to think that my son was a natural extrovert like his father, but I realized at that moment, in that conversation, that his smile was not a sign that he wanted to connect with would-be strangers.
It was a talisman he'd used to protect himself, a survival skill he had honed over thousands of elevator rides. He was learning to accommodate the tension that his skin color generated and that put his own life at risk.

We know that the brain is wired for bias, and one way to interrupt that bias is to pause and to reflect on the evidence of our assumptions. So we need to ask ourselves: What assumptions do we bring when we step onto an elevator? Or an airplane? How do we make ourselves aware of our own unconscious bias? Who do those assumptions keep safe? Who do they put at risk?

Until we ask these questions, and insist that our schools and our courts and our police departments and every institution do the same, we will continue to allow bias to blind us. And if we do, none of us are truly safe.

Thank you.