Some years ago, I was on an airplane with my son, who was just five years old at the time. My son was so excited about being on this airplane with Mommy. He's looking all around and he's checking things out and he's checking people out. And he sees this man, and he says, "Hey! That guy looks like Daddy!"

And I look at the man, and he didn't look anything at all like my husband, nothing at all. And so then I start looking around on the plane, and I notice this man was the only black guy on the plane.

And I thought, "Alright. I'm going to have to have a little talk with my son about how not all black people look alike."

My son, he lifts his head up, and he says to me, "I hope he doesn't rob the plane."

And I said, "What? What did you say?"

And he says, "Well, I hope that man doesn't rob the plane."

And I said, "Well, why would you say that? You know Daddy wouldn't rob a plane."

And he says, "Yeah, yeah, yeah, well, I know."

And I said, "Well, why would you say that?"

And he looked at me with this really sad face, and he says, "I don't know why I said that. I don't know why I was thinking that."

We are living with such severe racial stratification that even a five-year-old can tell us what's supposed to happen next, even with no evildoer, even with no explicit hatred. This association between blackness and crime made its way into the mind of my five-year-old. It makes its way into all of our children, into all of us.

Our minds are shaped by the racial disparities we see out in the world and the narratives that help us to make sense of the disparities we see: "Those people are criminal." "Those people are violent." "Those people are to be feared."

When my research team brought people into our lab and exposed them to faces, we found that exposure to black faces led them to see blurry images of guns with greater clarity and speed. Bias can not only control what we see, but where we look. We found that prompting people to think of violent crime can lead them to direct their eyes onto a black face and away from a white face. Prompting police officers to think of capturing and shooting and arresting leads their eyes to settle on black faces, too.
Bias can infect every aspect of our criminal justice system. In a large data set of death-eligible defendants, we found that looking more black more than doubled their chances of receiving a death sentence -- at least when their victims were white. This effect is significant, even though we controlled for the severity of the crime and the defendant's attractiveness. And no matter what we controlled for, we found that black people were punished in proportion to the blackness of their physical features: the more black, the more death-worthy.

Bias can also influence how teachers discipline students. My colleagues and I have found that teachers express a desire to discipline a black middle school student more harshly than a white student for the same repeated infractions. In a recent study, we're finding that teachers treat black students as a group but white students as individuals. If, for example, one black student misbehaves and then a different black student misbehaves a few days later, the teacher responds to that second black student as if he had misbehaved twice. It's as though the sins of one child get piled onto the other.

We create categories to make sense of the world, to assert some control and coherence over the stimuli that we're constantly being bombarded with. Categorization and the bias that it seeds allow our brains to make judgments more quickly and efficiently, and we do this by instinctively relying on patterns that seem predictable. Yet, just as the categories we create allow us to make quick decisions, they also reinforce bias. So the very things that help us to see the world can also blind us to it. They render our choices effortless, friction-free. Yet they exact a heavy toll.

So what can we do? We are all vulnerable to bias, but we don't act on bias all the time. There are certain conditions that can bring bias alive and other conditions that can muffle it. Let me give you an example.

Many people are familiar with the tech company Nextdoor. Their whole purpose is to create stronger, healthier, safer neighborhoods. And so they offer this online space where neighbors can gather and share information.
Yet Nextdoor soon found that they had a problem with racial profiling. In the typical case, people would look outside their window and see a black man in their otherwise white neighborhood and make the snap judgment that he was up to no good, even when there was no evidence of criminal wrongdoing. In many ways, how we behave online is a reflection of how we behave in the world. But what we don't want to do is create an easy-to-use system that can amplify bias and deepen racial disparities rather than dismantle them.

So the cofounder of Nextdoor reached out to me and to others to try to figure out what to do. And they realized that to curb racial profiling on the platform, they were going to have to add friction; that is, they were going to have to slow people down. So Nextdoor had a choice to make, and against every impulse, they decided to add friction.

And they did this by adding a simple checklist. There were three items on it. First, they asked users to pause and think, "What was this person doing that made him suspicious?" The category "black man" is not grounds for suspicion. Second, they asked users to describe the person's physical features, not simply their race and gender. Third, they realized that a lot of people didn't seem to know what racial profiling was, nor that they were engaging in it. So Nextdoor provided them with a definition and told them that it was strictly prohibited.

Most of you have seen those signs in airports and in metro stations: "If you see something, say something." Nextdoor tried modifying this: "If you see something suspicious, say something specific." And using this strategy, by simply slowing people down, Nextdoor was able to curb racial profiling by 75 percent.

Now, people often will say to me, "You can't add friction in every situation, in every context, and especially for people who make split-second decisions all the time." But it turns out we can add friction to more situations than we think. Working with the Oakland Police Department in California, I and a number of my colleagues were able to help the department reduce the number of stops they made of people who were not committing any serious crimes.
And we did this by pushing officers to ask themselves a question before each and every stop they made: "Is this stop intelligence-led, yes or no?" In other words, do I have prior information to tie this particular person to a specific crime? By adding that question to the form officers complete during a stop, they slow down, they pause, they think, "Why am I considering pulling this person over?"

In 2017, before we added that intelligence-led question to the form, officers made about 32,000 stops across the city. In that next year, with the addition of this question, that fell to 19,000 stops. African-American stops alone fell by 43 percent. And stopping fewer black people did not make the city any more dangerous. In fact, the crime rate continued to fall, and the city became safer for everybody.

So one solution can come from reducing the number of unnecessary stops. Another can come from improving the quality of the stops officers do make. And technology can help us here. We all know about George Floyd's death, because those who tried to come to his aid held cell phone cameras to record that horrific, fatal encounter with the police. But we have all sorts of technology that we're not putting to good use. Police officers across the country are now required to wear body-worn cameras, so we have recordings of not only the most extreme and horrific encounters but of everyday interactions.

With an interdisciplinary team at Stanford, we've begun to use machine learning techniques to analyze large numbers of encounters. This is to better understand what happens in routine traffic stops. What we found was that even when police officers are behaving professionally, they speak to black drivers less respectfully than to white drivers. In fact, from the words officers use alone, we could predict whether they were talking to a black driver or a white driver.

The problem is that the vast majority of the footage from these cameras is not used by police departments to understand what's going on on the street or to train officers. And that's a shame. How does a routine stop turn into a deadly encounter? How did this happen in George Floyd's case? How did it happen in others?
When my eldest son was 16 years old, he discovered that when white people look at him, they feel fear.

Elevators are the worst, he said. When those doors close, people are trapped in this tiny space with someone they have been taught to associate with danger. My son senses their discomfort, and he smiles to put them at ease, to calm their fears. When he speaks, their bodies relax. They breathe easier. They take pleasure in his cadence, his diction, his word choice. He sounds like one of them.

I used to think that my son was a natural extrovert like his father. But I realized at that moment, in that conversation, that his smile was not a sign that he wanted to connect with would-be strangers. It was a talisman he used to protect himself, a survival skill he had honed over thousands of elevator rides. He was learning to accommodate the tension that his skin color generated and that put his own life at risk.

We know that the brain is wired for bias, and one way to interrupt that bias is to pause and to reflect on the evidence of our assumptions. So we need to ask ourselves: What assumptions do we bring when we step onto an elevator? Or an airplane? How do we make ourselves aware of our own unconscious bias? Who do those assumptions keep safe? Who do they put at risk?

Until we ask these questions and insist that our schools and our courts and our police departments and every institution do the same, we will continue to allow bias to blind us. And if we do, none of us are truly safe.

Thank you.