WEBVTT

00:00:01.077 --> 00:00:02.596
Some years ago,

00:00:02.596 --> 00:00:07.166
I was on an airplane with my son who was just five years old at the time.

00:00:08.460 --> 00:00:13.623
My son was so excited about being on this airplane with Mommy.

00:00:13.623 --> 00:00:17.489
He's looking all around and he's checking things out

00:00:17.489 --> 00:00:19.176
and he's checking people out,

00:00:19.176 --> 00:00:21.387
and he sees this man, and he says, "Hey,

00:00:21.387 --> 00:00:23.843
that guy looks like Daddy."

00:00:24.049 --> 00:00:25.992
And I look at the man,

00:00:25.992 --> 00:00:29.683
and he didn't look anything at all like my husband,

00:00:29.876 --> 00:00:31.245
nothing at all.

00:00:31.668 --> 00:00:34.403
And so then I start looking around on the plane,

00:00:34.403 --> 00:00:36.273
and I notice this man

00:00:36.273 --> 00:00:39.715
was the only black guy on the plane.

00:00:41.145 --> 00:00:42.531
And I thought,

00:00:42.531 --> 00:00:44.321
all right.

00:00:44.552 --> 00:00:47.651
I'm going to have to have a little talk with my son

00:00:47.651 --> 00:00:50.070
about how not all black people look alike.

00:00:50.293 --> 00:00:54.694
My son, he lifts his head up, and he says to me,

00:00:54.694 --> 00:00:59.203
"I hope he doesn't rob the plane."

00:00:59.582 --> 00:01:02.174
And I said, "What? What did you say?"

00:01:02.174 --> 00:01:05.979
And he says, "Well, I hope that man doesn't rob the plane."

00:01:05.979 --> 00:01:09.839
And I said, "Well, why would you say that?

00:01:10.700 --> 00:01:13.386
You know Daddy wouldn't rob a plane."

00:01:13.386 --> 00:01:16.449
And he says, "Yeah, yeah, yeah, well, I know."

00:01:16.449 --> 00:01:18.809
And I said, "Well, why would you say that?"

00:01:18.809 --> 00:01:24.438
And he looked at me with this really sad face,

00:01:24.438 --> 00:01:26.743
and he says,

00:01:26.743 --> 00:01:29.599
"I don't know why I said that.

00:01:29.599 --> 00:01:33.273
I don't know why I was thinking that."

NOTE Paragraph

00:01:34.010 --> 00:01:37.551
We are living with such severe racial stratification

00:01:37.551 --> 00:01:39.776
that even a five-year-old

00:01:39.776 --> 00:01:42.572
can tell us what's supposed to happen next,

00:01:44.292 --> 00:01:46.422
even with no evildoer,

00:01:46.422 --> 00:01:50.367
even with no explicit hatred.

00:01:50.367 --> 00:01:54.169
This association between blackness and crime

00:01:54.169 --> 00:01:58.357
made its way into the mind of my five-year-old.

00:02:00.050 --> 00:02:04.463
It makes its way into all of our children,

00:02:04.463 --> 00:02:07.052
into all of us.

00:02:07.245 --> 00:02:10.690
Our minds are shaped by the racial disparities

00:02:10.690 --> 00:02:12.864
we see out in the world,

00:02:12.864 --> 00:02:18.205
and the narratives that help us to make sense of the disparities we see.

00:02:19.899 --> 00:02:22.448
Those people are criminals.

00:02:22.448 --> 00:02:24.335
Those people are violent.

00:02:24.335 --> 00:02:27.300
Those people are to be feared.

00:02:28.100 --> 00:02:31.657
When my research team brought people into our lab

00:02:31.657 --> 00:02:33.395
and exposed them to faces,

00:02:33.395 --> 00:02:36.343
we found that exposure to black faces

00:02:37.052 --> 00:02:43.634
led them to see blurry images of guns with greater clarity and speed.

00:02:43.902 --> 00:02:47.114
Bias can not only control what we see,

00:02:47.114 --> 00:02:48.941
but where we look.
00:02:48.941 --> 00:02:52.408
We found that prompting people to think of violent crime

00:02:52.408 --> 00:02:56.597
can lead them to direct their eyes onto a black face

00:02:56.858 --> 00:02:59.011
and away from a white face.

00:02:59.011 --> 00:03:03.041
Prompting police officers to think of capturing and shooting

00:03:03.041 --> 00:03:04.131
and arresting

00:03:04.131 --> 00:03:07.936
leads their eyes to settle on black faces too.

00:03:07.936 --> 00:03:12.772
Bias can infect every aspect of our criminal justice system.

00:03:13.378 --> 00:03:16.536
In a large dataset of death-eligible defendants,

00:03:16.536 --> 00:03:18.478
we found that looking more black

00:03:18.478 --> 00:03:22.986
more than doubled their chances of receiving a death sentence,

00:03:23.828 --> 00:03:26.345
at least when their victims were white.

00:03:26.345 --> 00:03:29.232
This effect is significant even though we controlled

00:03:29.232 --> 00:03:33.234
for the severity of the crime

00:03:33.234 --> 00:03:34.637
and the defendant's attractiveness,

00:03:34.637 --> 00:03:36.089
and no matter what we controlled for,

00:03:36.089 --> 00:03:38.654
we found that black people

00:03:38.654 --> 00:03:40.785
were punished in proportion

00:03:40.785 --> 00:03:44.041
to the blackness of their physical features:

00:03:44.041 --> 00:03:45.674
the more black,

00:03:45.674 --> 00:03:47.600
the more death-worthy.

NOTE Paragraph

00:03:47.600 --> 00:03:52.107
Bias can also influence how teachers discipline students.

00:03:52.107 --> 00:03:54.113
My colleagues and I

00:03:54.113 --> 00:03:56.606
have found that teachers express a desire

00:03:56.606 --> 00:03:59.116
to discipline a black middle school student

00:03:59.116 --> 00:04:03.535
more harshly than a white student for the same repeated infractions.

00:04:04.200 --> 00:04:05.440
In a recent study,

00:04:05.440 --> 00:04:09.453
we're finding that teachers treat black students as a group

00:04:09.453 --> 00:04:12.388
but white students as individuals.

00:04:12.388 --> 00:04:15.947
If, for example, one black student misbehaves

00:04:15.947 --> 00:04:20.799
and then a different black student misbehaves a few days later,

00:04:20.799 --> 00:04:24.050
the teacher responds to that second black student

00:04:24.050 --> 00:04:26.675
as if he had misbehaved twice.

00:04:27.214 --> 00:04:30.048
It's as though the sins of one child

00:04:30.048 --> 00:04:32.158
get piled on to the other.

NOTE Paragraph

00:04:32.718 --> 00:04:36.035
We create categories to make sense of the world,

00:04:36.035 --> 00:04:39.725
to assert some control and coherence

00:04:39.725 --> 00:04:44.270
to the stimuli that we're constantly being bombarded with.

00:04:44.270 --> 00:04:48.360
Categorization and the bias that it seeds

00:04:48.360 --> 00:04:51.198
allow our brains to make judgments

00:04:51.198 --> 00:04:53.370
more quickly and efficiently,

00:04:53.370 --> 00:04:56.684
and we do this by instinctively relying on patterns

00:04:56.684 --> 00:04:58.376
that seem predictable.

00:04:58.376 --> 00:05:01.758
Yet just as the categories we create

00:05:01.758 --> 00:05:04.463
allow us to make quick decisions,

00:05:04.463 --> 00:05:06.878
they also reinforce bias.

00:05:06.878 --> 00:05:11.295
So the very things that help us to see the world

00:05:11.295 --> 00:05:13.851
also can blind us to it.

00:05:13.851 --> 00:05:16.549
They render our choices effortless,

00:05:16.549 --> 00:05:19.441
friction-free.

00:05:19.441 --> 00:05:22.397
Yet they exact a heavy toll.
NOTE Paragraph

00:05:22.397 --> 00:05:24.809
So what can we do?

00:05:24.809 --> 00:05:27.284
We are all vulnerable to bias,

00:05:27.284 --> 00:05:30.212
but we don't act on bias all the time.

00:05:30.212 --> 00:05:33.681
There are certain conditions that can bring bias alive

00:05:33.681 --> 00:05:36.281
and other conditions that can muffle it.

NOTE Paragraph

00:05:36.281 --> 00:05:38.933
Let me give you an example.

00:05:38.933 --> 00:05:43.061
Many people are familiar with the tech company Nextdoor.

00:05:43.460 --> 00:05:51.738
So their whole purpose is to create stronger, healthier, safer neighborhoods.

00:05:51.738 --> 00:05:55.393
And so they offer this online space

00:05:55.393 --> 00:05:57.983
where neighbors can gather and share information.

00:05:57.983 --> 00:06:02.015
Yet Nextdoor soon found that they had a problem

00:06:02.015 --> 00:06:04.187
with racial profiling.

00:06:04.187 --> 00:06:06.130
In the typical case,

00:06:06.130 --> 00:06:08.713
people would look outside their window

00:06:08.713 --> 00:06:12.785
and see a black man in their otherwise white neighborhood

00:06:12.785 --> 00:06:14.976
and make the snap judgment

00:06:14.976 --> 00:06:17.299
that he was up to no good,

00:06:17.299 --> 00:06:20.865
even when there was no evidence of criminal wrongdoing.

00:06:21.127 --> 00:06:23.831
In many ways, how we behave online

00:06:23.831 --> 00:06:26.945
is a reflection of how we behave in the world.

00:06:27.816 --> 00:06:31.372
But what we don't want to do is create an easy-to-use system

00:06:31.372 --> 00:06:34.195
that can amplify bias

00:06:34.195 --> 00:06:36.439
and deepen racial disparities,

00:06:36.439 --> 00:06:39.117
rather than dismantle them.

00:06:39.117 --> 00:06:42.491
So the co-founder of Nextdoor reached out to me and to others

00:06:42.491 --> 00:06:44.755
to try to figure out what to do,

00:06:44.755 --> 00:06:48.052
and they realized that to curb racial profiling on the platform,

00:06:48.710 --> 00:06:50.601
they were going to have to add friction.

00:06:50.601 --> 00:06:53.244
That is, they were going to have to slow people down.

00:06:53.244 --> 00:06:55.423
So Nextdoor had a choice to make,

00:06:55.423 --> 00:06:58.066
and against every impulse,

00:06:58.066 --> 00:07:00.524
they decided to add friction.

00:07:00.524 --> 00:07:03.948
And they did this by adding a simple checklist.

00:07:03.948 --> 00:07:06.310
There were three items on it.

NOTE Paragraph

00:07:06.310 --> 00:07:08.766
First, they asked users to pause

00:07:08.766 --> 00:07:12.608
and think, what was this person doing

00:07:12.608 --> 00:07:14.399
that made him suspicious?

00:07:14.399 --> 00:07:19.519
The category "black man" is not grounds for suspicion.

NOTE Paragraph

00:07:19.713 --> 00:07:24.833
Second, they asked users to describe the person's physical features,

00:07:24.833 --> 00:07:27.551
not simply their race and gender.

NOTE Paragraph

00:07:27.551 --> 00:07:31.107
Third, they realized that a lot of people

00:07:31.107 --> 00:07:34.345
didn't seem to know what racial profiling was,

00:07:34.345 --> 00:07:36.645
nor that they were engaging in it.

00:07:36.875 --> 00:07:40.069
So Nextdoor provided them with a definition

00:07:40.069 --> 00:07:43.177
and told them that it was strictly prohibited.

00:07:43.484 --> 00:07:47.060
Most of you have seen those signs in airports

00:07:47.060 --> 00:07:50.262
and Metro stations, "If you see something, say something."

00:07:50.262 --> 00:07:53.919
Nextdoor tried modifying this.
00:07:53.919 --> 00:07:59.329
"If you see something suspicious, say something specific."

00:07:59.587 --> 00:08:01.700
And using this strategy,

00:08:01.700 --> 00:08:04.290
by simply slowing people down,

00:08:04.290 --> 00:08:09.834
Nextdoor was able to curb racial profiling by 75 percent.

NOTE Paragraph

00:08:10.766 --> 00:08:12.879
Now, people often will say to me,

00:08:12.879 --> 00:08:17.768
you can't add friction in every situation, in every context,

00:08:17.768 --> 00:08:21.914
and especially for people who make split-second decisions all the time,

00:08:22.809 --> 00:08:25.586
but it turns out we can add friction

00:08:25.586 --> 00:08:28.246
to more situations than we think.

00:08:28.246 --> 00:08:30.691
Working with the Oakland Police Department

00:08:30.691 --> 00:08:31.919
in California,

00:08:31.919 --> 00:08:35.859
I and a number of my colleagues were able to help the department

00:08:35.859 --> 00:08:38.360
to reduce the number of stops they made

00:08:38.360 --> 00:08:41.805
of people who were not committing any serious crimes,

00:08:42.238 --> 00:08:44.500
and we did this by pushing officers

00:08:44.500 --> 00:08:46.364
to ask themselves a question

00:08:46.364 --> 00:08:48.917
before each and every stop they made:

00:08:49.451 --> 00:08:52.797
is this stop intelligence-led,

00:08:52.797 --> 00:08:54.207
yes or no?

00:08:55.581 --> 00:08:57.215
In other words,

00:08:57.215 --> 00:09:02.317
do I have prior information to tie this particular person

00:09:02.317 --> 00:09:04.295
to a specific crime?

00:09:04.882 --> 00:09:06.672
By adding that question

00:09:06.672 --> 00:09:09.435
to the form officers complete during the stop,

00:09:09.435 --> 00:09:11.376
they slow down, they pause,

00:09:11.376 --> 00:09:15.237
they think, why am I considering pulling this person over?

NOTE Paragraph

00:09:16.912 --> 00:09:22.203
In 2017, before we added that intelligence-led question to the form,

00:09:23.655 --> 00:09:27.479
officers made about 32,000 stops across the city.

00:09:27.934 --> 00:09:29.593
In that next year,

00:09:29.593 --> 00:09:31.887
with the addition of this question,

00:09:31.887 --> 00:09:34.401
that fell to 19,000 stops.

00:09:34.579 --> 00:09:40.612
African-American stops alone fell by 43 percent.

00:09:40.612 --> 00:09:44.669
And stopping fewer black people did not make the city any more dangerous.

00:09:44.669 --> 00:09:47.292
In fact, the crime rate continued to fall,

00:09:47.292 --> 00:09:50.202
and the city became safer for everybody.

NOTE Paragraph

00:09:50.486 --> 00:09:56.334
So one solution can come from reducing the number of unnecessary stops.

00:09:56.334 --> 00:10:00.690
Another can come from improving the quality of the stops

00:10:00.690 --> 00:10:02.743
officers do make.

00:10:02.743 --> 00:10:05.349
And technology can help us here.

00:10:05.349 --> 00:10:07.878
We all know about George Floyd's death

00:10:07.878 --> 00:10:11.196
because those who tried to come to his aid

00:10:11.517 --> 00:10:17.131
held cell phone cameras to record that horrific, fatal encounter

00:10:17.453 --> 00:10:18.918
with the police.

00:10:18.918 --> 00:10:21.703
But we have all sorts of technology

00:10:21.703 --> 00:10:24.133
that we're not putting to good use.

00:10:24.133 --> 00:10:26.396
Police officers across the country

00:10:26.396 --> 00:10:30.094
are now required to wear body-worn cameras,

00:10:30.094 --> 00:10:35.570
so we have recordings of not only the most extreme and horrific encounters

00:10:35.570 --> 00:10:38.354
but of everyday interactions.
NOTE Paragraph

00:10:38.753 --> 00:10:41.799
With an interdisciplinary team at Stanford,

00:10:41.799 --> 00:10:44.806
we've begun to use machine learning techniques

00:10:44.806 --> 00:10:47.898
to analyze large numbers of encounters.

00:10:47.898 --> 00:10:52.528
This is to better understand what happens in routine traffic stops.

00:10:52.528 --> 00:10:56.094
What we found was that even when police officers

00:10:56.094 --> 00:10:58.356
are behaving professionally,

00:10:58.356 --> 00:11:02.906
they speak to black drivers less respectfully than white drivers.

00:11:03.529 --> 00:11:07.334
In fact, from the words officers use alone,

00:11:07.334 --> 00:11:12.973
we could predict whether they were talking to a black driver or a white driver.

NOTE Paragraph

00:11:13.655 --> 00:11:18.797
The problem is that the vast majority of the footage from these cameras

00:11:18.797 --> 00:11:21.424
is not used by police departments

00:11:21.424 --> 00:11:23.858
to understand what's going on on the street

00:11:23.858 --> 00:11:26.428
or to train officers.

00:11:26.659 --> 00:11:28.844
And that's a shame.

00:11:28.844 --> 00:11:31.532
How does a routine stop

00:11:31.532 --> 00:11:33.812
turn into a deadly encounter?

00:11:34.077 --> 00:11:36.287
How did this happen in George Floyd's case?

00:11:36.287 --> 00:11:40.511
How did it happen in others?

NOTE Paragraph

00:11:40.511 --> 00:11:43.252
When my eldest son was 16 years old,

00:11:43.252 --> 00:11:46.277
he discovered that when white people look at him,

00:11:46.277 --> 00:11:47.838
they feel fear.

00:11:49.338 --> 00:11:52.409
Elevators are the worst, he said.

00:11:52.409 --> 00:11:54.592
When those doors close,

00:11:54.592 --> 00:11:57.775
people are trapped in this tiny space

00:11:57.775 --> 00:12:00.262
with someone they have been taught

00:12:00.262 --> 00:12:02.242
to associate with danger.

00:12:03.046 --> 00:12:06.163
My son senses their discomfort,

00:12:06.163 --> 00:12:09.400
and he smiles to put them at ease,

00:12:09.400 --> 00:12:11.566
to calm their fears.

00:12:11.566 --> 00:12:13.534
When he speaks,

00:12:13.534 --> 00:12:15.720
their bodies relax.

00:12:15.720 --> 00:12:17.697
They breathe easier.

00:12:17.697 --> 00:12:20.251
They take pleasure in his cadence,

00:12:20.251 --> 00:12:22.418
his diction, his word choice.

00:12:23.241 --> 00:12:25.157
He sounds like one of them.

00:12:25.157 --> 00:12:29.867
I used to think that my son was a natural extrovert like his father,

00:12:29.867 --> 00:12:34.413
but I realized at that moment, in that conversation,

00:12:34.413 --> 00:12:38.006
that his smile was not a sign

00:12:38.006 --> 00:12:41.049
that he wanted to connect with would-be strangers.

00:12:42.175 --> 00:12:45.850
It was a talisman he used to protect himself,

00:12:45.850 --> 00:12:49.638
a survival skill he had honed over thousands of elevator rides.

00:12:49.638 --> 00:12:53.571
He was learning to accommodate the tension

00:12:53.571 --> 00:12:56.887
that his skin color generated

00:12:58.164 --> 00:13:01.396
and that put his own life at risk.

NOTE Paragraph

00:13:02.619 --> 00:13:06.187
We know that the brain is wired for bias,

00:13:06.187 --> 00:13:09.201
and one way to interrupt that bias

00:13:09.559 --> 00:13:13.445
is to pause and to reflect on the evidence for our assumptions.

00:13:13.445 --> 00:13:15.156
So we need to ask ourselves,

00:13:15.156 --> 00:13:19.702
what assumptions do we bring

00:13:19.702 --> 00:13:21.530
when we step onto an elevator?

00:13:21.530 --> 00:13:23.393
Or an airplane?
00:13:23.739 --> 00:13:28.422
How do we make ourselves aware of our own unconscious bias?

00:13:28.422 --> 00:13:32.337
Who do those assumptions keep safe?

00:13:32.798 --> 00:13:35.557
Who do they put at risk?

00:13:35.824 --> 00:13:38.832
Until we ask these questions,

00:13:38.832 --> 00:13:43.702
and insist that our schools and our courts and our police departments

00:13:43.702 --> 00:13:47.074
and every institution do the same,

00:13:47.074 --> 00:13:51.031
we will continue to allow bias

00:13:51.883 --> 00:13:53.179
to blind us,

00:13:53.579 --> 00:13:54.988
and if we do,

00:13:55.717 --> 00:13:59.274
none of us are truly safe.

00:14:00.358 --> 00:14:02.619
Thank you.