WEBVTT

00:00:00.910 --> 00:00:02.183
Some years ago,

00:00:02.207 --> 00:00:07.119
I was on an airplane with my son who was just five years old at the time.

00:00:08.245 --> 00:00:13.340
My son was so excited about being on this airplane with Mommy.

00:00:13.364 --> 00:00:16.304
He's looking all around and he's checking things out

00:00:16.328 --> 00:00:18.164
and he's checking people out.

00:00:18.188 --> 00:00:19.818
And he sees this man, and he says,

00:00:19.842 --> 00:00:22.703
"Hey! That guy looks like Daddy!"

00:00:23.882 --> 00:00:25.802
And I look at the man,

00:00:25.826 --> 00:00:29.556
and he didn't look anything at all like my husband,

00:00:29.580 --> 00:00:30.949
nothing at all.

00:00:31.461 --> 00:00:33.797
And so then I start looking around on the plane,

00:00:33.821 --> 00:00:39.706
and I notice this man was the only black guy on the plane.

00:00:40.874 --> 00:00:42.286
And I thought,

00:00:42.310 --> 00:00:43.504
"Alright.

00:00:44.369 --> 00:00:46.894
I'm going to have to have a little talk with my son

00:00:46.918 --> 00:00:49.847
about how not all black people look alike."

00:00:49.871 --> 00:00:54.248
My son, he lifts his head up, and he says to me,

00:00:56.246 --> 00:00:58.617
"I hope he doesn't rob the plane."

00:00:59.359 --> 00:01:01.874
And I said, "What? What did you say?"

00:01:01.898 --> 00:01:05.326
And he says, "Well, I hope that man doesn't rob the plane."

00:01:07.200 --> 00:01:09.952
And I said, "Well, why would you say that?

00:01:10.486 --> 00:01:13.149
You know Daddy wouldn't rob a plane."

00:01:13.173 --> 00:01:15.485
And he says, "Yeah, yeah, yeah, well, I know."

00:01:16.179 --> 00:01:18.306
And I said, "Well, why would you say that?"

00:01:20.346 --> 00:01:23.303
And he looked at me with this really sad face,

00:01:24.168 --> 00:01:25.422
and he says,

00:01:26.890 --> 00:01:28.996
"I don't know why I said that.

00:01:30.600 --> 00:01:32.958
I don't know why I was thinking that."

NOTE Paragraph

00:01:33.724 --> 00:01:37.242
We are living with such severe racial stratification

00:01:37.266 --> 00:01:42.326
that even a five-year-old can tell us what's supposed to happen next,

00:01:43.990 --> 00:01:46.097
even with no evildoer,

00:01:46.121 --> 00:01:48.700
even with no explicit hatred.

00:01:50.184 --> 00:01:54.097
This association between blackness and crime

00:01:54.121 --> 00:01:58.455
made its way into the mind of my five-year-old.

00:01:59.787 --> 00:02:03.050
It makes its way into all of our children,

00:02:04.201 --> 00:02:05.592
into all of us.

00:02:06.793 --> 00:02:10.317
Our minds are shaped by the racial disparities

00:02:10.341 --> 00:02:11.933
we see out in the world

00:02:12.752 --> 00:02:18.093
and the narratives that help us to make sense of the disparities we see:

00:02:19.637 --> 00:02:22.163
"Those people are criminal."

00:02:22.187 --> 00:02:24.112
"Those people are violent."

00:02:24.136 --> 00:02:27.101
"Those people are to be feared."

NOTE Paragraph

00:02:27.814 --> 00:02:31.005
When my research team brought people into our lab

00:02:31.029 --> 00:02:33.312
and exposed them to faces,

00:02:33.336 --> 00:02:40.336
we found that exposure to black faces led them to see blurry images of guns

00:02:40.360 --> 00:02:43.616
with greater clarity and speed.

00:02:43.640 --> 00:02:46.994
Bias can not only control what we see,

00:02:47.018 --> 00:02:48.666
but where we look.
00:02:48.690 --> 00:02:52.134
We found that prompting people to think of violent crime

00:02:52.158 --> 00:02:56.454
can lead them to direct their eyes onto a black face

00:02:56.478 --> 00:02:58.608
and away from a white face.

00:02:58.632 --> 00:03:02.474
Prompting police officers to think of capturing and shooting

00:03:02.498 --> 00:03:03.727
and arresting

00:03:03.751 --> 00:03:07.610
leads their eyes to settle on black faces, too.

NOTE Paragraph

00:03:07.634 --> 00:03:12.700
Bias can infect every aspect of our criminal justice system.

00:03:13.100 --> 00:03:16.031
In a large data set of death-eligible defendants,

00:03:16.055 --> 00:03:20.412
we found that looking more black more than doubled their chances

00:03:20.436 --> 00:03:22.493
of receiving a death sentence --

00:03:23.494 --> 00:03:25.921
at least when their victims were white.

00:03:25.945 --> 00:03:27.383
This effect is significant,

00:03:27.407 --> 00:03:30.708
even though we controlled for the severity of the crime

00:03:30.732 --> 00:03:33.013
and the defendant's attractiveness.

00:03:33.037 --> 00:03:35.686
And no matter what we controlled for,

00:03:35.710 --> 00:03:39.055
we found that black people were punished

00:03:39.079 --> 00:03:43.404
in proportion to the blackness of their physical features:

00:03:43.428 --> 00:03:45.309
the more black,

00:03:45.333 --> 00:03:47.086
the more death-worthy.

NOTE Paragraph

00:03:47.110 --> 00:03:51.319
Bias can also influence how teachers discipline students.

00:03:51.781 --> 00:03:56.188
My colleagues and I have found that teachers express a desire

00:03:56.212 --> 00:03:59.778
to discipline a black middle school student more harshly

00:03:59.802 --> 00:04:00.970
than a white student

00:04:00.994 --> 00:04:03.570
for the same repeated infractions.

00:04:03.594 --> 00:04:04.888
In a recent study,

00:04:04.912 --> 00:04:09.270
we're finding that teachers treat black students as a group

00:04:09.294 --> 00:04:11.725
but white students as individuals.

00:04:12.126 --> 00:04:15.725
If, for example, one black student misbehaves

00:04:15.749 --> 00:04:20.534
and then a different black student misbehaves a few days later,

00:04:20.558 --> 00:04:23.786
the teacher responds to that second black student

00:04:23.810 --> 00:04:26.435
as if he had misbehaved twice.

00:04:26.952 --> 00:04:29.763
It's as though the sins of one child

00:04:29.787 --> 00:04:31.963
get piled onto the other.

NOTE Paragraph

00:04:31.987 --> 00:04:35.281
We create categories to make sense of the world,

00:04:35.305 --> 00:04:39.788
to assert some control and coherence

00:04:39.812 --> 00:04:43.902
over the stimuli that we're constantly being bombarded with.

00:04:43.926 --> 00:04:47.894
Categorization and the bias that it seeds

00:04:47.918 --> 00:04:52.940
allow our brains to make judgments more quickly and efficiently,

00:04:52.964 --> 00:04:56.366
and we do this by instinctively relying on patterns

00:04:56.390 --> 00:04:58.059
that seem predictable.

00:04:58.083 --> 00:05:04.026
Yet, just as the categories we create allow us to make quick decisions,

00:05:04.050 --> 00:05:06.552
they also reinforce bias.

00:05:06.576 --> 00:05:09.968
So the very things that help us to see the world

00:05:11.104 --> 00:05:13.084
also can blind us to it.

00:05:13.509 --> 00:05:16.287
They render our choices effortless,

00:05:16.311 --> 00:05:17.680
friction-free.

00:05:18.956 --> 00:05:21.401
Yet they exact a heavy toll.

NOTE Paragraph

00:05:22.158 --> 00:05:23.569
So what can we do?
00:05:24.507 --> 00:05:26.998
We are all vulnerable to bias,

00:05:27.022 --> 00:05:29.702
but we don't act on bias all the time.

00:05:29.726 --> 00:05:33.370
There are certain conditions that can bring bias alive

00:05:33.394 --> 00:05:35.927
and other conditions that can muffle it.

NOTE Paragraph

00:05:35.951 --> 00:05:37.798
Let me give you an example.

00:05:38.663 --> 00:05:43.223
Many people are familiar with the tech company Nextdoor.

00:05:44.073 --> 00:05:50.526
So, their whole purpose is to create stronger, healthier, safer neighborhoods.

00:05:51.468 --> 00:05:54.389
And so they offer this online space

00:05:54.413 --> 00:05:57.562
where neighbors can gather and share information.

00:05:57.586 --> 00:06:01.712
Yet, Nextdoor soon found that they had a problem

00:06:01.736 --> 00:06:03.404
with racial profiling.

00:06:04.012 --> 00:06:05.979
In the typical case,

00:06:06.003 --> 00:06:08.399
people would look outside their window

00:06:08.423 --> 00:06:12.472
and see a black man in their otherwise white neighborhood

00:06:12.496 --> 00:06:17.211
and make the snap judgment that he was up to no good,

00:06:17.235 --> 00:06:20.586
even when there was no evidence of criminal wrongdoing.

00:06:20.610 --> 00:06:23.544
In many ways, how we behave online

00:06:23.568 --> 00:06:26.682
is a reflection of how we behave in the world.

00:06:27.117 --> 00:06:31.062
But what we don't want to do is create an easy-to-use system

00:06:31.086 --> 00:06:35.249
that can amplify bias and deepen racial disparities,

00:06:36.129 --> 00:06:38.395
rather than dismantle them.

NOTE Paragraph

00:06:38.863 --> 00:06:42.292
So the cofounder of Nextdoor reached out to me and to others

00:06:42.316 --> 00:06:44.447
to try to figure out what to do.

00:06:44.471 --> 00:06:48.417
And they realized that to curb racial profiling on the platform,

00:06:48.441 --> 00:06:50.363
they were going to have to add friction;

00:06:50.387 --> 00:06:53.045
that is, they were going to have to slow people down.

00:06:53.069 --> 00:06:55.264
So Nextdoor had a choice to make,

00:06:55.288 --> 00:06:57.766
and against every impulse,

00:06:57.790 --> 00:06:59.906
they decided to add friction.

00:07:00.397 --> 00:07:03.837
And they did this by adding a simple checklist.

00:07:03.861 --> 00:07:05.531
There were three items on it.

00:07:06.111 --> 00:07:09.052
First, they asked users to pause

00:07:09.076 --> 00:07:14.193
and think, "What was this person doing that made him suspicious?"

00:07:14.876 --> 00:07:19.409
The category "black man" is not grounds for suspicion.

00:07:19.433 --> 00:07:24.572
Second, they asked users to describe the person's physical features,

00:07:24.596 --> 00:07:27.031
not simply their race and gender.

00:07:27.642 --> 00:07:31.025
Third, they realized that a lot of people

00:07:31.049 --> 00:07:33.977
didn't seem to know what racial profiling was,

00:07:34.001 --> 00:07:35.960
nor that they were engaging in it.

00:07:36.462 --> 00:07:39.656
So Nextdoor provided them with a definition

00:07:39.680 --> 00:07:42.582
and told them that it was strictly prohibited.

00:07:43.071 --> 00:07:45.683
Most of you have seen those signs in airports

00:07:45.707 --> 00:07:49.409
and in metro stations, "If you see something, say something."

00:07:49.928 --> 00:07:52.742
Nextdoor tried modifying this.

00:07:53.584 --> 00:07:56.156
"If you see something suspicious,

00:07:56.180 --> 00:07:58.253
say something specific."
00:07:59.491 --> 00:08:03.937
And using this strategy, by simply slowing people down,

00:08:03.961 --> 00:08:09.652
Nextdoor was able to curb racial profiling by 75 percent.

NOTE Paragraph

00:08:10.496 --> 00:08:12.586
Now, people often will say to me,

00:08:12.610 --> 00:08:17.323
"You can't add friction in every situation, in every context,

00:08:17.347 --> 00:08:21.993
and especially for people who make split-second decisions all the time."

00:08:22.730 --> 00:08:25.293
But it turns out we can add friction

00:08:25.317 --> 00:08:27.593
to more situations than we think.

00:08:28.031 --> 00:08:30.105
Working with the Oakland Police Department

00:08:30.129 --> 00:08:31.546
in California,

00:08:31.570 --> 00:08:35.426
I and a number of my colleagues were able to help the department

00:08:35.450 --> 00:08:38.121
to reduce the number of stops they made

00:08:38.145 --> 00:08:41.745
of people who were not committing any serious crimes.

00:08:41.769 --> 00:08:44.134
And we did this by pushing officers

00:08:44.158 --> 00:08:48.601
to ask themselves a question before each and every stop they made:

00:08:49.451 --> 00:08:52.466
"Is this stop intelligence-led,

00:08:52.490 --> 00:08:53.941
yes or no?"

00:08:55.353 --> 00:08:56.749
In other words,

00:08:57.621 --> 00:09:02.105
do I have prior information to tie this particular person

00:09:02.129 --> 00:09:03.730
to a specific crime?

00:09:04.587 --> 00:09:06.045
By adding that question

00:09:06.069 --> 00:09:09.148
to the form officers complete during a stop,

00:09:09.172 --> 00:09:10.981
they slow down, they pause,

00:09:11.005 --> 00:09:15.225
they think, "Why am I considering pulling this person over?"

NOTE Paragraph

00:09:16.721 --> 00:09:22.282
In 2017, before we added that intelligence-led question to the form,

00:09:23.655 --> 00:09:27.601
officers made about 32,000 stops across the city.

00:09:27.625 --> 00:09:31.740
In that next year, with the addition of this question,

00:09:31.764 --> 00:09:34.208
that fell to 19,000 stops.

00:09:34.232 --> 00:09:39.193
African-American stops alone fell by 43 percent.

00:09:39.905 --> 00:09:44.343
And stopping fewer black people did not make the city any more dangerous.

00:09:44.367 --> 00:09:47.101
In fact, the crime rate continued to fall,

00:09:47.125 --> 00:09:50.462
and the city became safer for everybody.

NOTE Paragraph

00:09:50.486 --> 00:09:55.841
So one solution can come from reducing the number of unnecessary stops.

00:09:56.285 --> 00:10:00.555
Another can come from improving the quality of the stops

00:10:00.579 --> 00:10:01.884
officers do make.

00:10:02.512 --> 00:10:05.108
And technology can help us here.

00:10:05.132 --> 00:10:07.547
We all know about George Floyd's death,

00:10:08.499 --> 00:10:13.271
because those who tried to come to his aid held cell phone cameras

00:10:13.295 --> 00:10:18.726
to record that horrific, fatal encounter with the police.

00:10:18.750 --> 00:10:23.781
But we have all sorts of technology that we're not putting to good use.

00:10:23.805 --> 00:10:26.308
Police officers across the country

00:10:26.332 --> 00:10:29.885
are now required to wear body-worn cameras,

00:10:29.909 --> 00:10:35.839
so we have recordings of not only the most extreme and horrific encounters

00:10:35.863 --> 00:10:38.617
but of everyday interactions.

NOTE Paragraph

00:10:38.641 --> 00:10:41.418
With an interdisciplinary team at Stanford,

00:10:41.442 --> 00:10:44.129
we've begun to use machine learning techniques

00:10:44.153 --> 00:10:47.520
to analyze large numbers of encounters.
00:10:47.544 --> 00:10:52.155
This is to better understand what happens in routine traffic stops.

00:10:52.179 --> 00:10:54.334
What we found was that

00:10:54.358 --> 00:10:58.020
even when police officers are behaving professionally,

00:10:58.860 --> 00:11:03.322
they speak to black drivers less respectfully than to white drivers.

00:11:04.052 --> 00:11:08.127
In fact, from the words officers use alone,

00:11:08.151 --> 00:11:13.313
we could predict whether they were talking to a black driver or a white driver.

NOTE Paragraph

00:11:13.337 --> 00:11:19.099
The problem is that the vast majority of the footage from these cameras

00:11:19.123 --> 00:11:21.210
is not used by police departments

00:11:21.234 --> 00:11:23.510
to understand what's going on on the street

00:11:23.534 --> 00:11:25.777
or to train officers.

00:11:26.554 --> 00:11:28.012
And that's a shame.

00:11:28.796 --> 00:11:33.585
How does a routine stop turn into a deadly encounter?

00:11:33.609 --> 00:11:36.279
How did this happen in George Floyd's case?

00:11:37.588 --> 00:11:39.670
How did it happen in others?

NOTE Paragraph

00:11:39.694 --> 00:11:43.090
When my eldest son was 16 years old,

00:11:43.114 --> 00:11:46.253
he discovered that when white people look at him,

00:11:46.277 --> 00:11:47.838
they feel fear.

00:11:49.123 --> 00:11:51.784
Elevators are the worst, he said.

00:11:52.313 --> 00:11:54.644
When those doors close,

00:11:54.668 --> 00:11:57.751
people are trapped in this tiny space

00:11:57.775 --> 00:12:02.242
with someone they have been taught to associate with danger.

00:12:02.744 --> 00:12:05.964
My son senses their discomfort,

00:12:05.988 --> 00:12:09.145
and he smiles to put them at ease,

00:12:09.169 --> 00:12:10.938
to calm their fears.

00:12:11.351 --> 00:12:13.296
When he speaks,

00:12:13.320 --> 00:12:15.003
their bodies relax.

00:12:15.442 --> 00:12:17.345
They breathe easier.

00:12:17.369 --> 00:12:19.900
They take pleasure in his cadence,

00:12:19.924 --> 00:12:22.241
his diction, his word choice.

00:12:22.986 --> 00:12:24.829
He sounds like one of them.

00:12:24.853 --> 00:12:29.583
I used to think that my son was a natural extrovert like his father.

00:12:29.607 --> 00:12:33.157
But I realized at that moment, in that conversation,

00:12:34.143 --> 00:12:39.221
that his smile was not a sign that he wanted to connect

00:12:39.245 --> 00:12:41.209
with would-be strangers.

00:12:41.920 --> 00:12:45.572
It was a talisman he used to protect himself,

00:12:45.596 --> 00:12:51.815
a survival skill he had honed over thousands of elevator rides.

00:12:52.387 --> 00:12:57.558
He was learning to accommodate the tension that his skin color generated

00:12:59.026 --> 00:13:01.693
and that put his own life at risk.

NOTE Paragraph

00:13:02.619 --> 00:13:06.402
We know that the brain is wired for bias,

00:13:06.426 --> 00:13:10.891
and one way to interrupt that bias is to pause and to reflect

00:13:10.915 --> 00:13:13.220
on the evidence of our assumptions.

00:13:13.244 --> 00:13:14.999
So we need to ask ourselves:

00:13:15.023 --> 00:13:19.688
What assumptions do we bring when we step onto an elevator?

00:13:21.776 --> 00:13:23.087
Or an airplane?

00:13:23.532 --> 00:13:28.131
How do we make ourselves aware of our own unconscious bias?

00:13:28.155 --> 00:13:30.506
Who do those assumptions keep safe?

00:13:32.615 --> 00:13:34.547
Who do they put at risk?
00:13:35.649 --> 00:13:38.003
Until we ask these questions

00:13:38.978 --> 00:13:43.602
and insist that our schools and our courts and our police departments

00:13:43.626 --> 00:13:46.168
and every institution do the same,

00:13:47.835 --> 00:13:51.664
we will continue to allow bias

00:13:51.688 --> 00:13:52.966
to blind us.

00:13:53.348 --> 00:13:54.757
And if we do,

00:13:56.066 --> 00:13:59.274
none of us are truly safe.

NOTE Paragraph

00:14:02.103 --> 00:14:03.411
Thank you.