How racial bias works -- and how to disrupt it

  • 0:01 - 0:02
    Some years ago,
  • 0:02 - 0:07
    I was on an airplane with my son
    who was just five years old at the time.
  • 0:08 - 0:13
    My son was so excited
    about being on this airplane with Mommy.
  • 0:13 - 0:16
    He's looking all around
    and he's checking things out
  • 0:16 - 0:18
    and he's checking people out.
  • 0:18 - 0:20
    And he sees this man, and he says,
  • 0:20 - 0:23
    "Hey! That guy looks like Daddy!"
  • 0:24 - 0:26
    And I look at the man,
  • 0:26 - 0:30
    and he didn't look anything
    at all like my husband,
  • 0:30 - 0:31
    nothing at all.
  • 0:31 - 0:34
    And so then I start
    looking around on the plane,
  • 0:34 - 0:40
    and I notice this man was
    the only black guy on the plane.
  • 0:41 - 0:42
    And I thought,
  • 0:42 - 0:44
    "Alright.
  • 0:44 - 0:47
    I'm going to have to have
    a little talk with my son
  • 0:47 - 0:50
    about how not all
    black people look alike."
  • 0:50 - 0:54
    My son, he lifts his head up,
    and he says to me,
  • 0:56 - 0:59
    "I hope he doesn't rob the plane."
  • 0:59 - 1:02
    And I said, "What? What did you say?"
  • 1:02 - 1:05
    And he says, "Well, I hope that man
    doesn't rob the plane."
  • 1:07 - 1:10
    And I said, "Well, why would you say that?
  • 1:10 - 1:13
    You know Daddy wouldn't rob a plane."
  • 1:13 - 1:15
    And he says, "Yeah, yeah,
    yeah, well, I know."
  • 1:16 - 1:18
    And I said, "Well,
    why would you say that?"
  • 1:20 - 1:23
    And he looked at me
    with this really sad face,
  • 1:24 - 1:25
    and he says,
  • 1:27 - 1:29
    "I don't know why I said that.
  • 1:31 - 1:33
    I don't know why I was thinking that."
  • 1:34 - 1:37
    We are living with such severe
    racial stratification
  • 1:37 - 1:42
    that even a five-year-old can tell us
    what's supposed to happen next,
  • 1:44 - 1:46
    even with no evildoer,
  • 1:46 - 1:49
    even with no explicit hatred.
  • 1:50 - 1:54
    This association between
    blackness and crime
  • 1:54 - 1:58
    made its way into the mind
    of my five-year-old.
  • 2:00 - 2:03
    It makes its way into all of our children,
  • 2:04 - 2:06
    into all of us.
  • 2:07 - 2:10
    Our minds are shaped by
    the racial disparities
  • 2:10 - 2:12
    we see out in the world
  • 2:13 - 2:18
    and the narratives that help us
    to make sense of the disparities we see:
  • 2:20 - 2:22
    "Those people are criminal."
  • 2:22 - 2:24
    "Those people are violent."
  • 2:24 - 2:27
    "Those people are to be feared."
  • 2:28 - 2:31
    When my research team
    brought people into our lab
  • 2:31 - 2:33
    and exposed them to faces,
  • 2:33 - 2:40
    we found that exposure to black faces
    led them to see blurry images of guns
  • 2:40 - 2:44
    with greater clarity and speed.
  • 2:44 - 2:47
    Bias can not only control what we see,
  • 2:47 - 2:49
    but where we look.
  • 2:49 - 2:52
    We found that prompting people
    to think of violent crime
  • 2:52 - 2:56
    can lead them to direct their eyes
    onto a black face
  • 2:56 - 2:59
    and away from a white face.
  • 2:59 - 3:02
    Prompting police officers
    to think of capturing and shooting
  • 3:02 - 3:04
    and arresting
  • 3:04 - 3:08
    leads their eyes to settle
    on black faces, too.
  • 3:08 - 3:13
    Bias can infect every aspect
    of our criminal justice system.
  • 3:13 - 3:16
    In a large data set
    of death-eligible defendants,
  • 3:16 - 3:20
    we found that looking more black
    more than doubled their chances
  • 3:20 - 3:22
    of receiving a death sentence --
  • 3:23 - 3:26
    at least when their victims were white.
  • 3:26 - 3:27
    This effect is significant,
  • 3:27 - 3:31
    even though we controlled
    for the severity of the crime
  • 3:31 - 3:33
    and the defendant's attractiveness.
  • 3:33 - 3:36
    And no matter what we controlled for,
  • 3:36 - 3:39
    we found that black
    people were punished
  • 3:39 - 3:43
    in proportion to the blackness
    of their physical features:
  • 3:43 - 3:45
    the more black,
  • 3:45 - 3:47
    the more death-worthy.
  • 3:47 - 3:51
    Bias can also influence
    how teachers discipline students.
  • 3:52 - 3:56
    My colleagues and I have found
    that teachers express a desire
  • 3:56 - 4:00
    to discipline a black
    middle school student more harshly
  • 4:00 - 4:01
    than a white student
  • 4:01 - 4:04
    for the same repeated infractions.
  • 4:04 - 4:05
    In a recent study,
  • 4:05 - 4:09
    we're finding that teachers
    treat black students as a group
  • 4:09 - 4:12
    but white students as individuals.
  • 4:12 - 4:16
    If, for example,
    one black student misbehaves
  • 4:16 - 4:21
    and then a different black student
    misbehaves a few days later,
  • 4:21 - 4:24
    the teacher responds
    to that second black student
  • 4:24 - 4:26
    as if he had misbehaved twice.
  • 4:27 - 4:30
    It's as though the sins of one child
  • 4:30 - 4:32
    get piled onto the other.
  • 4:32 - 4:35
    We create categories
    to make sense of the world,
  • 4:35 - 4:40
    to assert some control and coherence
  • 4:40 - 4:44
    to the stimuli that we're constantly
    being bombarded with.
  • 4:44 - 4:48
    Categorization and the bias that it seeds
  • 4:48 - 4:53
    allow our brains to make judgments
    more quickly and efficiently,
  • 4:53 - 4:56
    and we do this by instinctively
    relying on patterns
  • 4:56 - 4:58
    that seem predictable.
  • 4:58 - 5:04
    Yet, just as the categories we create
    allow us to make quick decisions,
  • 5:04 - 5:07
    they also reinforce bias.
  • 5:07 - 5:10
    So the very things that help us
    to see the world
  • 5:11 - 5:13
    also can blind us to it.
  • 5:14 - 5:16
    They render our choices effortless,
  • 5:16 - 5:18
    friction-free.
  • 5:19 - 5:21
    Yet they exact a heavy toll.
  • 5:22 - 5:24
    So what can we do?
  • 5:25 - 5:27
    We are all vulnerable to bias,
  • 5:27 - 5:30
    but we don't act on bias all the time.
  • 5:30 - 5:33
    There are certain conditions
    that can bring bias alive
  • 5:33 - 5:36
    and other conditions that can muffle it.
  • 5:36 - 5:38
    Let me give you an example.
  • 5:39 - 5:43
    Many people are familiar
    with the tech company Nextdoor.
  • 5:44 - 5:51
    So, their whole purpose is to create
    stronger, healthier, safer neighborhoods.
  • 5:51 - 5:54
    And so they offer this online space
  • 5:54 - 5:58
    where neighbors can gather
    and share information.
  • 5:58 - 6:02
    Yet, Nextdoor soon found
    that they had a problem
  • 6:02 - 6:03
    with racial profiling.
  • 6:04 - 6:06
    In the typical case,
  • 6:06 - 6:08
    people would look outside their window
  • 6:08 - 6:12
    and see a black man
    in their otherwise white neighborhood
  • 6:12 - 6:17
    and make the snap judgment
    that he was up to no good,
  • 6:17 - 6:21
    even when there was no evidence
    of criminal wrongdoing.
  • 6:21 - 6:24
    In many ways, how we behave online
  • 6:24 - 6:27
    is a reflection of how
    we behave in the world.
  • 6:27 - 6:31
    But what we don't want to do
    is create an easy-to-use system
  • 6:31 - 6:35
    that can amplify bias
    and deepen racial disparities,
  • 6:36 - 6:38
    rather than dismantling them.
  • 6:39 - 6:42
    So the cofounder of Nextdoor
    reached out to me and to others
  • 6:42 - 6:44
    to try to figure out what to do.
  • 6:44 - 6:48
    And they realized that
    to curb racial profiling on the platform,
  • 6:48 - 6:50
    they were going to have to add friction;
  • 6:50 - 6:53
    that is, they were going
    to have to slow people down.
  • 6:53 - 6:55
    So Nextdoor had a choice to make,
  • 6:55 - 6:58
    and against every impulse,
  • 6:58 - 7:00
    they decided to add friction.
  • 7:00 - 7:04
    And they did this by adding
    a simple checklist.
  • 7:04 - 7:06
    There were three items on it.
  • 7:06 - 7:09
    First, they asked users to pause
  • 7:09 - 7:14
    and think, "What was this person doing
    that made him suspicious?"
  • 7:15 - 7:19
    The category "black man"
    is not grounds for suspicion.
  • 7:19 - 7:25
    Second, they asked users to describe
    the person's physical features,
  • 7:25 - 7:27
    not simply their race and gender.
  • 7:28 - 7:31
    Third, they realized that a lot of people
  • 7:31 - 7:34
    didn't seem to know
    what racial profiling was,
  • 7:34 - 7:36
    nor that they were engaging in it.
  • 7:36 - 7:40
    So Nextdoor provided them
    with a definition
  • 7:40 - 7:43
    and told them that it was
    strictly prohibited.
  • 7:43 - 7:46
    Most of you have seen
    those signs in airports
  • 7:46 - 7:49
    and in metro stations,
    "If you see something, say something."
  • 7:50 - 7:53
    Nextdoor tried modifying this.
  • 7:54 - 7:56
    "If you see something suspicious,
  • 7:56 - 7:58
    say something specific."
  • 7:59 - 8:04
    And using this strategy,
    by simply slowing people down,
  • 8:04 - 8:10
    Nextdoor was able to curb
    racial profiling by 75 percent.
  • 8:10 - 8:13
    Now, people often will say to me,
  • 8:13 - 8:17
    "You can't add friction
    in every situation, in every context,
  • 8:17 - 8:22
    and especially for people who make
    split-second decisions all the time."
  • 8:23 - 8:25
    But it turns out we can add friction
  • 8:25 - 8:28
    to more situations than we think.
  • 8:28 - 8:30
    Working with the Oakland Police Department
  • 8:30 - 8:32
    in California,
  • 8:32 - 8:35
    I and a number of my colleagues
    were able to help the department
  • 8:35 - 8:38
    to reduce the number of stops they made
  • 8:38 - 8:42
    of people who were not
    committing any serious crimes.
  • 8:42 - 8:44
    And we did this by pushing officers
  • 8:44 - 8:49
    to ask themselves a question
    before each and every stop they made:
  • 8:49 - 8:52
    "Is this stop intelligence-led,
  • 8:52 - 8:54
    yes or no?"
  • 8:55 - 8:57
    In other words,
  • 8:58 - 9:02
    do I have prior information
    to tie this particular person
  • 9:02 - 9:04
    to a specific crime?
  • 9:05 - 9:06
    By adding that question
  • 9:06 - 9:09
    to the form officers complete
    during a stop,
  • 9:09 - 9:11
    they slow down, they pause,
  • 9:11 - 9:15
    they think, "Why am I considering
    pulling this person over?"
  • 9:17 - 9:22
    In 2017, before we added that
    intelligence-led question to the form,
  • 9:24 - 9:28
    officers made about 32,000 stops
    across the city.
  • 9:28 - 9:32
    In that next year,
    with the addition of this question,
  • 9:32 - 9:34
    that fell to 19,000 stops.
  • 9:34 - 9:39
    African-American stops alone
    fell by 43 percent.
  • 9:40 - 9:44
    And stopping fewer black people
    did not make the city any more dangerous.
  • 9:44 - 9:47
    In fact, the crime rate continued to fall,
  • 9:47 - 9:50
    and the city became safer for everybody.
  • 9:50 - 9:56
    So one solution can come from reducing
    the number of unnecessary stops.
  • 9:56 - 10:01
    Another can come from improving
    the quality of the stops
  • 10:01 - 10:02
    officers do make.
  • 10:03 - 10:05
    And technology can help us here.
  • 10:05 - 10:08
    We all know about George Floyd's death,
  • 10:08 - 10:13
    because those who tried to come to his aid
    held cell phone cameras
  • 10:13 - 10:19
    to record that horrific, fatal
    encounter with the police.
  • 10:19 - 10:24
    But we have all sorts of technology
    that we're not putting to good use.
  • 10:24 - 10:26
    Police departments across the country
  • 10:26 - 10:30
    are now required to wear body-worn cameras
  • 10:30 - 10:36
    so we have recordings of not only
    the most extreme and horrific encounters
  • 10:36 - 10:39
    but of everyday interactions.
  • 10:39 - 10:41
    With an interdisciplinary
    team at Stanford,
  • 10:41 - 10:44
    we've begun to use
    machine learning techniques
  • 10:44 - 10:48
    to analyze large numbers of encounters.
  • 10:48 - 10:52
    This is to better understand
    what happens in routine traffic stops.
  • 10:52 - 10:54
    What we found was that
  • 10:54 - 10:58
    even when police officers
    are behaving professionally,
  • 10:59 - 11:03
    they speak to black drivers
    less respectfully than white drivers.
  • 11:04 - 11:08
    In fact, from the words
    officers use alone,
  • 11:08 - 11:13
    we could predict whether they were talking
    to a black driver or a white driver.
  • 11:13 - 11:19
    The problem is that the vast majority
    of the footage from these cameras
  • 11:19 - 11:21
    is not used by police departments
  • 11:21 - 11:24
    to understand what's
    going on on the street
  • 11:24 - 11:26
    or to train officers.
  • 11:27 - 11:28
    And that's a shame.
  • 11:29 - 11:34
    How does a routine stop
    turn into a deadly encounter?
  • 11:34 - 11:36
    How did this happen
    in George Floyd's case?
  • 11:38 - 11:40
    How did it happen in others?
  • 11:40 - 11:43
    When my eldest son was 16 years old,
  • 11:43 - 11:46
    he discovered that
    when white people look at him,
  • 11:46 - 11:48
    they feel fear.
  • 11:49 - 11:52
    Elevators are the worst, he said.
  • 11:52 - 11:55
    When those doors close,
  • 11:55 - 11:58
    people are trapped in this tiny space
  • 11:58 - 12:02
    with someone they have been taught
    to associate with danger.
  • 12:03 - 12:06
    My son senses their discomfort,
  • 12:06 - 12:09
    and he smiles to put them at ease,
  • 12:09 - 12:11
    to calm their fears.
  • 12:11 - 12:13
    When he speaks,
  • 12:13 - 12:15
    their bodies relax.
  • 12:15 - 12:17
    They breathe easier.
  • 12:17 - 12:20
    They take pleasure in his cadence,
  • 12:20 - 12:22
    his diction, his word choice.
  • 12:23 - 12:25
    He sounds like one of them.
  • 12:25 - 12:30
    I used to think that my son
    was a natural extrovert like his father.
  • 12:30 - 12:33
    But I realized at that moment,
    in that conversation,
  • 12:34 - 12:39
    that his smile was not a sign
    that he wanted to connect
  • 12:39 - 12:41
    with would-be strangers.
  • 12:42 - 12:46
    It was a talisman he used
    to protect himself,
  • 12:46 - 12:52
    a survival skill he had honed
    over thousands of elevator rides.
  • 12:52 - 12:58
    He was learning to accommodate the tension
    that his skin color generated
  • 12:59 - 13:02
    and that put his own life at risk.
  • 13:03 - 13:06
    We know that the brain is wired for bias,
  • 13:06 - 13:11
    and one way to interrupt that bias
    is to pause and to reflect
  • 13:11 - 13:13
    on the evidence of our assumptions.
  • 13:13 - 13:15
    So we need to ask ourselves:
  • 13:15 - 13:20
    What assumptions do we bring
    when we step onto an elevator?
  • 13:22 - 13:23
    Or an airplane?
  • 13:24 - 13:28
    How do we make ourselves aware
    of our own unconscious bias?
  • 13:28 - 13:31
    Who do those assumptions keep safe?
  • 13:33 - 13:35
    Who do they put at risk?
  • 13:36 - 13:38
    Until we ask these questions
  • 13:39 - 13:44
    and insist that our schools
    and our courts and our police departments
  • 13:44 - 13:46
    and every institution do the same,
  • 13:48 - 13:52
    we will continue to allow bias
  • 13:52 - 13:53
    to blind us.
  • 13:53 - 13:55
    And if we do,
  • 13:56 - 13:59
    none of us are truly safe.
  • 14:02 - 14:03
    Thank you.
Title:
How racial bias works -- and how to disrupt it
Speaker:
Jennifer L. Eberhardt
Description:

Our brains create categories to make sense of the world, recognize patterns and make quick decisions. But this ability to categorize also exacts a heavy toll in the form of unconscious bias. In this powerful talk, psychologist Jennifer L. Eberhardt explores how our biases unfairly target Black people at all levels of society -- from schools and social media to policing and criminal justice -- and discusses how creating points of friction can help us actively interrupt and address this troubling problem.

Video Language:
English
Team:
closed TED
Project:
TEDTalks
Duration:
14:17
