
Stereotypes vs. Prejudice vs. Discrimination

  • 0:01 - 0:06
    [no audio yet]
  • 0:06 - 0:08
    In this video, we'll differentiate
  • 0:08 - 0:11
    between stereotypes, prejudice,
    and discrimination;
  • 0:11 - 0:15
    and we'll discuss several important
    social psychological concepts
  • 0:15 - 0:17
    and hypotheses related to each,
  • 0:17 - 0:20
    including what causes them
    to arise in the first place.
  • 0:21 - 0:24
    Let's go over a bit of
    terminology to kick things off.
  • 0:24 - 0:28
    A stereotype is a belief
    (which can be positive or negative)
  • 0:28 - 0:30
    about the characteristics
    of members of a group
  • 0:30 - 0:35
    that is applied generally
    to most members of that group.
  • 0:35 - 0:39
    Believing that Asians are good
    at math, for example, is positive;
  • 0:39 - 0:40
    it's not necessarily derogatory,
  • 0:40 - 0:44
    but it's nonetheless a stereotype
    that you have about Asians.
  • 0:44 - 0:49
    Now, stereotypes (these beliefs) can
    lead to prejudice, which in contrast,
  • 0:49 - 0:51
    can only ever be negative.
  • 0:51 - 0:55
    Prejudice involves drawing
    negative conclusions about
  • 0:55 - 0:56
    a person, a group of people,
  • 0:56 - 1:00
    or a situation prior to
    evaluating the evidence.
  • 1:00 - 1:02
    These baseless conclusions are typically
  • 1:02 - 1:06
    the result of those stereotypes
    that you hold about that group.
  • 1:06 - 1:08
    Also, in contrast to stereotypes,
  • 1:08 - 1:11
    prejudice involves emotion;
    it’s an attitude.
  • 1:11 - 1:15
    Being prejudiced against a person
    or a group of people involves
  • 1:15 - 1:17
    feeling negatively toward them.
  • 1:17 - 1:19
    Now, because of these negative emotions
  • 1:19 - 1:22
    and these negative conclusions
    that you're coming to,
  • 1:22 - 1:25
    prejudice often leads to discrimination,
  • 1:25 - 1:29
    which is negative behavior
    towards members of an out-group.
  • 1:29 - 1:32
    And by the way, an out-group is a group
  • 1:32 - 1:36
    that we don't belong to or one that we
    view as fundamentally different from us;
  • 1:36 - 1:37
    whereas an in-group, in contrast,
  • 1:37 - 1:42
    refers to a group that we DO identify
    with or see ourselves as belonging to.
  • 1:42 - 1:44
    So I'll be using that
    terminology quite a bit;
  • 1:44 - 1:46
    it's important to know.
  • 1:46 - 1:47
    So just to summarize,
  • 1:47 - 1:52
    stereotypes are beliefs,
    prejudice is an attitude,
  • 1:52 - 1:54
    and discrimination is a behavior.
  • 1:54 - 1:58
    Let's go over an example
    that puts all of this together.
  • 1:58 - 2:02
    Let's say, for example, that you
    believe older adults are incompetent,
  • 2:02 - 2:05
    and that's a stereotype that
    you have about older adults.
  • 2:05 - 2:09
    (And I'll note that I'm not endorsing
    this stereotype or any other stereotype
  • 2:09 - 2:11
    that I use as an example in this video,
  • 2:11 - 2:14
    but we have to have some kind
    of an example to work with here.)
  • 2:14 - 2:17
    So let's say you work at,
    I don't know, a tech company
  • 2:17 - 2:19
    and you're looking to hire an assistant.
  • 2:19 - 2:21
    If an elderly gentleman applies,
  • 2:21 - 2:24
    you might walk into that interview
    with the gentleman,
  • 2:24 - 2:28
    assuming he won't be a good fit
    or that he'd be difficult to train.
  • 2:28 - 2:31
    Now, we would call this
    premature conclusion,
  • 2:31 - 2:34
    this negative attitude toward
    the gentleman, prejudice.
  • 2:34 - 2:37
    Finally, you may decide not
    to hire the gentleman at all
  • 2:37 - 2:40
    because of your stereotype,
    because of your prejudice.
  • 2:40 - 2:43
    In this case, the behavior
    of not hiring him
  • 2:43 - 2:45
    would be discrimination.
  • 2:45 - 2:49
    Now, stereotypes and prejudice
    can be either explicit
  • 2:49 - 2:52
    (meaning, we're consciously
    aware of having this bias)
  • 2:52 - 2:56
    or implicit (meaning, it's there,
    but we aren't aware of it).
  • 2:56 - 3:00
    Research shows that explicit prejudice
    is in decline, which is encouraging;
  • 3:00 - 3:04
    however, implicit prejudice
    really isn't declining much.
  • 3:04 - 3:08
    That is, people report being
    very anti-bias nowadays,
  • 3:08 - 3:11
    but their behavior still
    tells us a different story.
  • 3:11 - 3:14
    Let's take a look at a few
    examples to illustrate.
  • 3:14 - 3:18
    Starting with the realm of gender,
    we can look to some of my own data.
  • 3:18 - 3:20
    In one study, I searched through
  • 3:20 - 3:23
    the language used by students
    evaluating their teachers
  • 3:23 - 3:29
    in over 14 million reviews posted to
    a popular instructor evaluation website,
  • 3:29 - 3:34
    RateMyProfessors.com,
    which you've perhaps used in the past.
  • 3:34 - 3:37
    I was specifically interested in
    stereotypes about intelligence,
  • 3:37 - 3:42
    so I searched through uses of
    the words “genius” and “brilliant.”
  • 3:42 - 3:46
    So let's take a look at the results.
    There's a lot of information here.
  • 3:46 - 3:48
    Let me help you interpret these graphs.
  • 3:48 - 3:50
    These are graphs for uses of
  • 3:50 - 3:54
    the words “genius” on the left
    and “brilliant” on the right.
  • 3:54 - 3:57
    The x-axis on both of these graphs
  • 3:57 - 4:00
    represents uses per million
    words of text,
  • 4:00 - 4:03
    which might sound a little
    complicated, but really isn't.
  • 4:03 - 4:04
    There's a ton of text here,
  • 4:04 - 4:08
    so to keep the numbers on the
    x-axis from being enormous
  • 4:08 - 4:09
    and just visually unappealing,
  • 4:09 - 4:15
    I used this uses-per-million-words metric,
    but the interpretation is basically the same.
  • 4:15 - 4:17
    The further to the right
    you go on the x-axis
  • 4:17 - 4:21
    (the higher the number),
    the more this word was used.
  • 4:21 - 4:22
    So that's how you can interpret that.
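    [As an aside, the per-million-words normalization described above
    can be sketched in a few lines of Python; the counts here are
    hypothetical, purely for illustration:]

    ```python
    # Sketch of the uses-per-million-words normalization described above.
    # The counts below are made up for illustration, not the study's data.

    def uses_per_million(word_count: int, total_words: int) -> float:
        """Scale a raw word count to uses per million words of text."""
        return word_count / total_words * 1_000_000

    # e.g., if "genius" appeared 350 times in 50 million words of reviews:
    rate = uses_per_million(350, 50_000_000)
    print(rate)  # 7.0 uses per million words
    ```

    [Dividing by the corpus size before scaling is what makes rates
    comparable across fields whose review corpora differ in size.]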
  • 4:22 - 4:27
    The y-axis here displays all of the
    different fields such as philosophy,
  • 4:27 - 4:30
    music, mathematics, psychology;
  • 4:30 - 4:32
    so you can look for your own field
  • 4:32 - 4:36
    or just pause the video and look through
    them in general, if you're curious.
  • 4:36 - 4:39
    And fields that are higher up
    on the y-axis were the ones
  • 4:39 - 4:42
    in which the words were used the most often.
  • 4:42 - 4:47
    The blue dots here on the slide
    represent reviews of male professors,
  • 4:47 - 4:52
    whereas the orange dots represent
    reviews of female professors.
  • 4:52 - 4:55
    Before I give you the punch line,
    what do you notice here?
  • 4:55 - 5:00
    Well, what I found is that in every
    field for which we have data,
  • 5:00 - 5:03
    students describe their male
    professors as genius and brilliant
  • 5:03 - 5:06
    significantly more often than
    they do their female professors.
  • 5:06 - 5:09
    And in no field was this effect reversed,
  • 5:09 - 5:13
    even for fields where women
    were the statistical majority.
  • 5:13 - 5:16
    And this points to a stereotype
    in favor of men's intelligence
  • 5:16 - 5:18
    and against women's intelligence.
  • 5:18 - 5:19
    You might be wondering:
  • 5:19 - 5:23
    Does this reflect an overall
    bias against women,
  • 5:23 - 5:27
    or is the stereotype specific
    to intellectual ability?
  • 5:27 - 5:29
    Well, I was curious about this as well,
  • 5:29 - 5:32
    but if you look at the data for
    the terms “excellent” and “amazing,”
  • 5:32 - 5:34
    the gender bias goes away entirely.
  • 5:34 - 5:36
    It appears that students believe
  • 5:36 - 5:39
    that their female professors
    can be excellent and amazing,
  • 5:39 - 5:44
    but they believe it's mainly the male
    professors who are genius and brilliant.
  • 5:44 - 5:46
    Again, this is evidence of implicit bias
  • 5:46 - 5:47
    because students are likely
  • 5:47 - 5:50
    not consciously aware
    of this discrepancy.
  • 5:50 - 5:52
    They're simply going online
    to review their professors
  • 5:52 - 5:55
    and they're not giving their
    stereotypes any thought.
  • 5:55 - 5:59
    So explicitly, students would
    likely say they don't hold a bias,
  • 5:59 - 6:02
    yet implicitly, they respond in this way.
  • 6:02 - 6:04
    This is a common theme
  • 6:04 - 6:09
    in modern research on stereotypes,
    prejudice, and discrimination.
  • 6:09 - 6:11
    Now that's gender.
    What about race?
  • 6:11 - 6:15
    One study found that doctors
    were only 60% as likely to suggest
  • 6:15 - 6:19
    a top-rated diagnostic test
    for Black heart patients
  • 6:19 - 6:22
    as for White heart patients.
  • 6:22 - 6:23
    There's also evidence to suggest
  • 6:23 - 6:26
    that White men are offered
    greater financial opportunities.
  • 6:26 - 6:29
    As one example, a study found
    that White men were offered
  • 6:29 - 6:32
    the best deals at used car dealerships.
  • 6:32 - 6:37
    White men paid on average
    $109 less than White women,
  • 6:37 - 6:40
    $318 less than Black women,
  • 6:40 - 6:48
    and a whopping $935 less for
    a used car than Black men.
  • 6:48 - 6:51
    Now, these are just two examples out
    of thousands that I could tell you about,
  • 6:51 - 6:53
    but again, it's likely the case
  • 6:53 - 6:57
    that these doctors and car salesmen
    aren't EXPLICITLY biased,
  • 6:57 - 7:02
    but their behavior provides
    evidence of IMPLICIT bias.
  • 7:02 - 7:05
    Okay, so let's finish
    with a brief discussion
  • 7:05 - 7:09
    of what leads to the development and
    perpetuation of some of these things
  • 7:09 - 7:12
    (stereotypes, prejudice,
    and discrimination),
  • 7:12 - 7:14
    starting with stereotypes.
  • 7:14 - 7:17
    A factor that we've learned about
    before is confirmation bias,
  • 7:17 - 7:20
    the tendency to seek out evidence
    that supports our beliefs
  • 7:20 - 7:25
    and to deny, dismiss, or distort
    evidence that contradicts them.
  • 7:25 - 7:29
    Say, for example, that you believe
    women to be bad drivers.
  • 7:29 - 7:31
    If you're out driving for an hour,
  • 7:31 - 7:36
    you might encounter several bad
    drivers, some male, some female.
  • 7:36 - 7:39
    If you don't have a stereotype
    against male drivers, though,
  • 7:39 - 7:44
    you might not think much of them
    when they speed or make dangerous moves.
  • 7:44 - 7:47
    But the second a female driver
    cuts you off, for example,
  • 7:47 - 7:50
    you feel vindicated as though
    you've found additional evidence
  • 7:50 - 7:52
    or proof for your belief.
  • 7:52 - 7:54
    And this reinforces your stereotype
  • 7:54 - 7:59
    even though, in truth, many people are
    bad drivers regardless of their gender.
  • 7:59 - 8:02
    Now, if we used System 2 thinking
  • 8:02 - 8:04
    (which we've learned
    about in a previous video)
  • 8:04 - 8:07
    to evaluate these kinds of assumptions
    and the data that we base them on,
  • 8:07 - 8:12
    we might realize that those assumptions
    are erroneous, but we usually don't.
  • 8:12 - 8:15
    This is because we are cognitive misers.
  • 8:15 - 8:18
    That is, we seek to use only
    minimal cognitive resources
  • 8:18 - 8:21
    when explaining the world around us.
  • 8:21 - 8:23
    Evaluating our stereotypes takes effort,
  • 8:23 - 8:25
    and because we generally
    don't go to more effort
  • 8:25 - 8:27
    than we deem absolutely necessary,
  • 8:27 - 8:31
    we don't evaluate or re-evaluate them at all.
  • 8:31 - 8:34
    Now, what causes prejudice?
  • 8:34 - 8:37
    First, we have in-group bias,
    which refers to the tendency
  • 8:37 - 8:43
    to favor individuals from within our
    group over those from outside our group.
  • 8:43 - 8:46
    Evidence from developmental
    psychology suggests that this bias
  • 8:46 - 8:51
    is innate, with young infants showing
    strong preferences, for example,
  • 8:51 - 8:55
    for others who share their preferences
    (such as their favorite snack)
  • 8:55 - 8:58
    and infants disliking others who
    do not share their preferences
  • 8:58 - 9:03
    (for example, if the other person shows
    that they like a different snack more).
  • 9:03 - 9:06
    Think of the implications
    for racism, sexism, and so on.
  • 9:06 - 9:10
    Another factor is called
    the ultimate attribution error,
  • 9:10 - 9:12
    which refers to the assumption
  • 9:12 - 9:14
    that behaviors among
    individual members of a group
  • 9:14 - 9:18
    are due to their internal dispositions.
  • 9:18 - 9:20
    Out-group members’ flaws
    are due to internal factors
  • 9:20 - 9:25
    such as their personality or their race,
    whereas in-group members' flaws aren't.
  • 9:25 - 9:28
    This might sound a lot like
    the fundamental attribution error,
  • 9:28 - 9:31
    which we've learned about before,
    but it is a bit different.
  • 9:31 - 9:33
    Think of the ultimate attribution error
  • 9:33 - 9:37
    as more of a narrow case of
    the fundamental attribution error
  • 9:37 - 9:39
    applied specifically to attributions
  • 9:39 - 9:44
    about an individual in relation to
    the group to which they belong.
  • 9:44 - 9:45
    Along similar lines,
  • 9:45 - 9:48
    out-group homogeneity
    refers to the tendency to view
  • 9:48 - 9:53
    all individuals outside our group
    as highly similar to one another.
  • 9:53 - 9:54
    Here, think of the implications
  • 9:54 - 9:57
    for identifying a suspect in
    a police lineup, for example,
  • 9:57 - 10:02
    but also consider this bias in relation
    to the ultimate attribution error.
  • 10:02 - 10:05
    It's a very bad combination to assume
  • 10:05 - 10:08
    that out-group members' flaws
    are due to inherent factors
  • 10:08 - 10:10
    such as their personalities or their race,
  • 10:10 - 10:12
    and to simultaneously assume
  • 10:12 - 10:16
    that out-group members are all
    highly similar to one another.
  • 10:16 - 10:20
    Finally, scapegoating refers to
    the act of blaming an out-group
  • 10:20 - 10:26
    when the in-group experiences frustration or
    is blocked from obtaining some kind of a goal.
  • 10:26 - 10:30
    People scapegoat because it preserves
    a positive self-concept.
  • 10:30 - 10:35
    If you believe the reason you can't get a job
    is because immigrants are taking them all,
  • 10:35 - 10:38
    well, then you don't have to
    come to terms with the reality
  • 10:38 - 10:43
    that you simply aren't qualified or
    competent enough for that line of work.
  • 10:43 - 10:46
    Now, this list of causes here
    is by no means all-inclusive
  • 10:46 - 10:50
    but should give you a good idea of
    the general psychological phenomena
  • 10:50 - 10:53
    that lead to the formation and
    perpetuation of stereotypes,
  • 10:53 - 10:56
    prejudice, and discrimination. [END]
Video Language:
English
Duration:
10:56
