
How racial bias works -- and how to disrupt it

  • 0:01 - 0:03
    Some years ago,
  • 0:03 - 0:07
    I was on an airplane with my son
    who was just five years old at the time.
  • 0:08 - 0:14
    My son was so excited
    about being on this airplane with Mommy.
  • 0:14 - 0:17
    He's looking all around
    and he's checking things out
  • 0:17 - 0:19
    and he's checking people out,
  • 0:19 - 0:21
    and he sees this man, and he says, "Hey,
  • 0:21 - 0:24
    that guy looks like Daddy."
  • 0:24 - 0:26
    And I look at the man,
  • 0:26 - 0:30
    and he didn't look anything
    at all like my husband,
  • 0:30 - 0:31
    nothing at all.
  • 0:32 - 0:34
    And so then I start
    looking around on the plane,
  • 0:34 - 0:36
    and I notice this man
  • 0:36 - 0:40
    was the only black guy on the plane.
  • 0:41 - 0:43
    And I thought,
  • 0:43 - 0:44
    all right.
  • 0:45 - 0:48
    I'm going to have to have
    a little talk with my son
  • 0:48 - 0:50
    about how not all black people look alike.
  • 0:50 - 0:55
    My son, he lifts his head up,
    and he says to me,
  • 0:55 - 0:59
    "I hope he doesn't rob the plane."
  • 1:00 - 1:02
    And I said, "What? What did you say?"
  • 1:02 - 1:06
    And he says, "Well, I hope that man
    doesn't rob the plane."
  • 1:06 - 1:10
    And I said, "Well, why would you say that?
  • 1:11 - 1:13
    You know Daddy wouldn't rob a plane."
  • 1:13 - 1:16
    And he says, "Yeah, yeah,
    yeah, well, I know."
  • 1:16 - 1:19
    And I said, "Well,
    why would you say that?"
  • 1:19 - 1:24
    And he looked at me
    with this really sad face,
  • 1:24 - 1:27
    and he says,
  • 1:27 - 1:30
    "I don't know why I said that.
  • 1:30 - 1:33
    I don't know why I was thinking that."
  • 1:34 - 1:38
    We are living with such severe
    racial stratification
  • 1:38 - 1:40
that even a five-year-old
  • 1:40 - 1:43
    can tell us what's supposed
    to happen next,
  • 1:44 - 1:46
    even with no evildoer,
  • 1:46 - 1:50
    even with no explicit hatred.
  • 1:50 - 1:54
    This association between
    blackness and crime
  • 1:54 - 1:58
    made its way into the mind
of my five-year-old.
  • 2:00 - 2:04
    It makes its way into all of our children,
  • 2:04 - 2:07
    into all of us.
  • 2:07 - 2:11
    Our minds are shaped
    by the racial disparities
  • 2:11 - 2:13
    we see out in the world,
  • 2:13 - 2:18
    and the narratives that help us
    to make sense of the disparities we see.
  • 2:20 - 2:22
    Those people are criminals.
  • 2:22 - 2:24
    Those people are violent.
  • 2:24 - 2:27
    Those people are to be feared.
  • 2:28 - 2:32
    When my research team
    brought people into our lab
  • 2:32 - 2:33
    and exposed them to faces,
  • 2:33 - 2:36
    we found that exposure to black faces
  • 2:37 - 2:44
    led them to see blurry images
    of guns with greater clarity and speed.
  • 2:44 - 2:47
Bias can not only control what we see,
  • 2:47 - 2:49
    but where we look.
  • 2:49 - 2:52
    We found that prompting people
    to think of violent crime
  • 2:52 - 2:57
    can lead them to direct their eyes
    onto a black face
  • 2:57 - 2:59
    and away from a white face.
  • 2:59 - 3:03
    Prompting police officers
    to think of capturing and shooting
  • 3:03 - 3:04
    and arresting
  • 3:04 - 3:08
    leads their eyes to settle
    on black faces too.
  • 3:08 - 3:13
    Bias can infect every aspect
    of our criminal justice system.
  • 3:13 - 3:17
    In a large dataset
    of death-eligible defendants,
  • 3:17 - 3:18
    we found that looking more black
  • 3:18 - 3:23
    more than doubled their chances
    of receiving a death sentence,
  • 3:24 - 3:26
    at least when their victims were white.
  • 3:26 - 3:29
    This effect is significant
    even though we controlled
  • 3:29 - 3:33
    for the severity of the crime
  • 3:33 - 3:35
    and the defendant's attractiveness,
  • 3:35 - 3:36
    and no matter what we controlled for,
  • 3:36 - 3:39
    we found that black people
  • 3:39 - 3:41
    were punished in proportion
  • 3:41 - 3:44
    to the blackness
    of their physical features:
  • 3:44 - 3:46
    the more black,
  • 3:46 - 3:48
    the more death-worthy.
  • 3:48 - 3:52
    Bias can also influence
    how teachers discipline students.
  • 3:52 - 3:54
    My colleagues and I
  • 3:54 - 3:57
    have found that teachers express a desire
  • 3:57 - 3:59
    to discipline a black
    middle school student
  • 3:59 - 4:04
    more harshly than a white student
    for the same repeated infractions.
  • 4:04 - 4:05
    In a recent study,
  • 4:05 - 4:09
    we're finding that teachers
    treat black students as a group
  • 4:09 - 4:12
    but white students as individuals.
  • 4:12 - 4:16
    If, for example,
    one black student misbehaves
  • 4:16 - 4:21
    and then a different black student
    misbehaves a few days later,
  • 4:21 - 4:24
    the teacher responds
    to that second black student
  • 4:24 - 4:27
    as if he had misbehaved twice.
  • 4:27 - 4:30
    It's as though the sins of one child
  • 4:30 - 4:32
    get piled on to the other.
  • 4:33 - 4:36
    We create categories
    to make sense of the world,
  • 4:36 - 4:40
    to assert some control and coherence
  • 4:40 - 4:44
    to the stimuli that we're
    constantly being bombarded with.
  • 4:44 - 4:48
    Categorization and the bias that it seeds
  • 4:48 - 4:51
    allow our brains to make judgments
  • 4:51 - 4:53
    more quickly and efficiently,
  • 4:53 - 4:57
    and we do this by instinctively
    relying on patterns
  • 4:57 - 4:58
    that seem predictable.
  • 4:58 - 5:02
    Yet just as the categories we create
  • 5:02 - 5:04
    allow us to make quick decisions,
  • 5:04 - 5:07
    they also reinforce bias.
  • 5:07 - 5:11
    So the very things that help us
    to see the world
  • 5:11 - 5:14
    also can blind us to it.
  • 5:14 - 5:17
    They render our choices effortless,
  • 5:17 - 5:19
    friction-free.
  • 5:19 - 5:22
Yet they exact a heavy toll.
  • 5:22 - 5:25
    So what can we do?
  • 5:25 - 5:27
    We are all vulnerable to bias,
  • 5:27 - 5:30
    but we don't act on bias all the time.
  • 5:30 - 5:34
    There are certain conditions
    that can bring bias alive
  • 5:34 - 5:36
    and other conditions that can muffle it.
  • 5:36 - 5:39
    Let me give you an example.
  • 5:39 - 5:43
    Many people are familiar
    with the tech company Nextdoor.
  • 5:43 - 5:52
    So their whole purpose is to create
    stronger, healthier, safer neighborhoods.
  • 5:52 - 5:55
    And so they offer this online space
  • 5:55 - 5:58
    where neighbors can gather
    and share information.
  • 5:58 - 6:02
    Yet Nextdoor soon found
    that they had a problem
  • 6:02 - 6:04
    with racial profiling.
  • 6:04 - 6:06
    In the typical case,
  • 6:06 - 6:09
    people would look outside their window
  • 6:09 - 6:13
    and see a black man
    in their otherwise white neighborhood
  • 6:13 - 6:15
    and make the snap judgment
  • 6:15 - 6:17
    that he was up to no good,
  • 6:17 - 6:21
    even when there was no evidence
    of criminal wrongdoing.
  • 6:21 - 6:24
    In many ways, how we behave online
  • 6:24 - 6:27
    is a reflection of how
    we behave in the world.
  • 6:28 - 6:31
    But what we don't want to do
    is create an easy-to-use system
  • 6:31 - 6:34
    that can amplify bias
  • 6:34 - 6:36
    and deepen racial disparities,
  • 6:36 - 6:39
    rather than dismantling them.
  • 6:39 - 6:42
    So the co-founder of Nextdoor
    reached out to me and to others
  • 6:42 - 6:45
    to try to figure out what to do,
  • 6:45 - 6:48
    and they realized that
    to curb racial profiling on the platform,
  • 6:49 - 6:51
    they were going to have to add friction.
  • 6:51 - 6:53
    That is, they were going
    to have to slow people down.
  • 6:53 - 6:55
    So Nextdoor had a choice to make,
  • 6:55 - 6:58
    and against every impulse
  • 6:58 - 7:01
    they decided to add friction.
  • 7:01 - 7:04
    And they did this by adding
    a simple checklist.
  • 7:04 - 7:06
    There were three items on it.
  • 7:06 - 7:09
    First, they asked users to pause
  • 7:09 - 7:13
    and think, what was this person doing
  • 7:13 - 7:14
    that made him suspicious?
  • 7:14 - 7:20
    The category "black man"
is not grounds for suspicion.
  • 7:20 - 7:25
    Second, they asked users to describe
    the person's physical features,
  • 7:25 - 7:28
    not simply their race and gender.
  • 7:28 - 7:31
    Third, they realized that a lot of people
  • 7:31 - 7:34
    didn't seem to know
    what racial profiling was,
  • 7:34 - 7:37
    nor that they were engaging in it.
  • 7:37 - 7:40
    So Nextdoor provided them
    with a definition
  • 7:40 - 7:43
    and told them that it
    was strictly prohibited.
  • 7:43 - 7:47
    Most of you have seen
    those signs in airports
  • 7:47 - 7:50
    and Metro stations,
    "If you see something, say something."
  • 7:50 - 7:54
    Nextdoor tried modifying this.
  • 7:54 - 7:59
    "If you see something suspicious,
    say something specific."
  • 8:00 - 8:02
And by using this strategy,
  • 8:02 - 8:04
    by simply slowing people down,
  • 8:04 - 8:10
    Nextdoor was able to curb
    racial profiling by 75 percent.
  • 8:11 - 8:13
    Now, people often will say to me,
  • 8:13 - 8:18
    you can't add friction
    in every situation, in every context,
  • 8:18 - 8:22
    and especially for people who make
split-second decisions all the time,
  • 8:23 - 8:26
    but it turns out we can add friction
  • 8:26 - 8:28
    to more situations than we think.
  • 8:28 - 8:31
    Working with the Oakland Police Department
  • 8:31 - 8:32
    in California,
  • 8:32 - 8:36
a number of my colleagues and I
were able to help the department
  • 8:36 - 8:38
    to reduce the number of stops they made
  • 8:38 - 8:42
    of people who were not
    committing any serious crimes,
  • 8:42 - 8:44
    and we did this by pushing officers
  • 8:44 - 8:46
    to ask themselves a question
  • 8:46 - 8:49
    before each and every stop they made:
  • 8:49 - 8:53
    is this stop intelligence-led,
  • 8:53 - 8:54
    yes or no?
  • 8:56 - 8:57
    In other words,
  • 8:57 - 9:02
    do I have prior information
    to tie this particular person
  • 9:02 - 9:04
    to a specific crime?
  • 9:05 - 9:07
    By adding that question
  • 9:07 - 9:09
to the form officers
complete during the stop,
  • 9:09 - 9:11
    they slow down, they pause,
  • 9:11 - 9:15
    they think, why am I considering
    pulling this person over?
  • 9:17 - 9:22
    In 2017, before we added that
    intelligence-led question to the form,
  • 9:24 - 9:27
    officers made about 32,000 stops
    across the city.
  • 9:28 - 9:30
    In that next year,
  • 9:30 - 9:32
    with the addition of this question,
  • 9:32 - 9:34
    that fell to 19,000 stops.
  • 9:35 - 9:41
    African-American stops alone
    fell by 43 percent.
  • 9:41 - 9:45
    And stopping fewer black people
    did not make the city any more dangerous.
  • 9:45 - 9:47
    In fact, the crime rate continued to fall
  • 9:47 - 9:50
    and the city became safer for everybody.
  • 9:50 - 9:56
    So one solution can come from reducing
    the number of unnecessary stops.
  • 9:56 - 10:01
    Another can come from improving
    the quality of the stops
  • 10:01 - 10:03
    officers do make.
  • 10:03 - 10:05
    And technology can help us here.
  • 10:05 - 10:08
    We all know about George Floyd's death
  • 10:08 - 10:11
    because those who tried to come to his aid
  • 10:12 - 10:17
    held cell phone cameras
    to record that horrific, fatal encounter
  • 10:17 - 10:19
    with the police.
  • 10:19 - 10:22
    But we have all sorts of technology
  • 10:22 - 10:24
    that we're not putting to good use.
  • 10:24 - 10:26
    Police departments across the country
  • 10:26 - 10:30
now require officers to wear
body-worn cameras,
  • 10:30 - 10:36
    so we have recordings of not only
    the most extreme and horrific encounters
  • 10:36 - 10:38
    but of everyday interactions.
  • 10:39 - 10:42
    With an interdisciplinary
    team at Stanford,
  • 10:42 - 10:45
    we've begun to use
    machine learning techniques
  • 10:45 - 10:48
    to analyze large numbers of encounters.
  • 10:48 - 10:53
    This is to better understand what happens
    in routine traffic stops.
  • 10:53 - 10:56
    What we found was that
    even when police officers
  • 10:56 - 10:58
    are behaving professionally,
  • 10:58 - 11:03
    they speak to black drivers
    less respectfully than white drivers.
  • 11:04 - 11:07
    In fact, from the words officers use alone
  • 11:07 - 11:13
    we could predict whether they were talking
    to a black driver or a white driver.
  • 11:14 - 11:19
    The problem is that the vast majority
    of the footage from these cameras
  • 11:19 - 11:21
    is not used by police departments
  • 11:21 - 11:24
    to understand what's
    going on on the street
  • 11:24 - 11:26
    or to train officers.
  • 11:27 - 11:29
    And that's a shame.
  • 11:29 - 11:32
    How does a routine stop
  • 11:32 - 11:34
    turn into a deadly encounter?
  • 11:34 - 11:36
    How did this happen
    in George Floyd's case?
  • 11:36 - 11:41
    How did it happen in others?
  • 11:41 - 11:43
    When my eldest son was 16 years old,
  • 11:43 - 11:46
    he discovered that
    when white people look at him,
  • 11:46 - 11:48
    they feel fear.
  • 11:49 - 11:52
"Elevators are the worst," he said.
  • 11:52 - 11:55
    When those doors close,
  • 11:55 - 11:58
    people are trapped in this tiny space
  • 11:58 - 12:00
    with someone they have been taught
  • 12:00 - 12:02
    to associate with danger.
  • 12:03 - 12:06
    My son senses their discomfort,
  • 12:06 - 12:09
    and he smiles to put them at ease,
  • 12:09 - 12:12
    to calm their fears.
  • 12:12 - 12:14
    When he speaks,
  • 12:14 - 12:16
    their bodies relax.
  • 12:16 - 12:18
    They breathe easier.
  • 12:18 - 12:20
    They take pleasure in his cadence,
  • 12:20 - 12:22
    his diction, his word choice.
  • 12:23 - 12:25
    He sounds like one of them.
  • 12:25 - 12:30
    I used to think that my son
    was a natural extrovert like his father,
  • 12:30 - 12:34
    but I realized at that moment
    in that conversation
  • 12:34 - 12:38
    that his smile was not a sign
  • 12:38 - 12:41
    that he wanted to connect
    with would-be strangers.
  • 12:42 - 12:46
    It was a talisman he'd used
    to protect himself,
  • 12:46 - 12:50
    a survival skill he had honed
    over thousands of elevator rides.
  • 12:50 - 12:54
    He was learning to accommodate the tension
  • 12:54 - 12:57
    that his skin color generated
  • 12:58 - 13:01
    and that put his own life at risk.
  • 13:03 - 13:06
    We know that the brain is wired for bias,
  • 13:06 - 13:09
    and one way to interrupt that bias
  • 13:10 - 13:13
    is to pause and to reflect
    on the evidence of our assumptions.
  • 13:13 - 13:15
    So we need to ask ourselves,
  • 13:15 - 13:20
    what assumptions do we bring
  • 13:20 - 13:22
    when we step onto an elevator?
  • 13:22 - 13:23
    Or an airplane?
  • 13:24 - 13:28
    How do we make ourselves aware
    of our own unconscious bias?
  • 13:28 - 13:32
    Who do those assumptions keep safe?
  • 13:33 - 13:36
    Who do they put at risk?
  • 13:36 - 13:39
    Until we ask these questions,
  • 13:39 - 13:44
    and insist that our schools
    and our courts and our police departments
  • 13:44 - 13:47
    and every institution do the same,
  • 13:47 - 13:51
    we will continue to allow bias
  • 13:52 - 13:53
    to blind us,
  • 13:54 - 13:55
    and if we do,
  • 13:56 - 13:59
    none of us are truly safe.
  • 14:00 - 14:03
    Thank you.
Speaker: Jennifer L. Eberhardt
Duration: 14:17