0:00:00.743,0:00:05.990 [no audio yet] 0:00:05.990,0:00:08.012 In this video, we'll differentiate 0:00:08.012,0:00:11.440 between stereotypes, prejudice, [br]and discrimination; 0:00:11.440,0:00:14.513 and we'll discuss several important[br]social psychological concepts 0:00:14.513,0:00:17.132 and hypotheses related to each, 0:00:17.132,0:00:20.019 including what causes them [br]to arise in the first place. 0:00:21.081,0:00:24.364 Let's go over a bit of [br]terminology to kick things off. 0:00:24.364,0:00:28.020 A stereotype is a belief [br](which can be positive or negative) 0:00:28.020,0:00:30.491 about the characteristics [br]of members of a group 0:00:30.491,0:00:34.675 that is applied generally [br]to most members of that group. 0:00:34.675,0:00:38.733 Believing that Asians are good [br]at math, for example, is positive; 0:00:38.733,0:00:40.398 it's not necessarily derogatory, 0:00:40.398,0:00:44.433 but it's nonetheless a stereotype [br]that you have about Asians. 0:00:44.433,0:00:48.569 Now, stereotypes (these beliefs) can [br]lead to prejudice, which, in contrast, 0:00:48.569,0:00:50.903 can only ever be negative. 0:00:50.903,0:00:54.504 Prejudice involves drawing [br]negative conclusions about 0:00:54.504,0:00:56.056 a person, a group of people, 0:00:56.056,0:01:00.001 or a situation prior to [br]evaluating the evidence. 0:01:00.001,0:01:01.979 These baseless conclusions are typically 0:01:01.979,0:01:06.200 the result of those stereotypes [br]that you hold about that group. 0:01:06.200,0:01:07.883 Also, in contrast to stereotypes, 0:01:07.883,0:01:11.022 prejudice involves emotion; [br]it’s an attitude. 0:01:11.022,0:01:14.639 Being prejudiced against a person [br]or a group of people involves 0:01:14.639,0:01:16.908 feeling negatively toward them. 
0:01:16.908,0:01:19.290 Now, because of these negative emotions 0:01:19.290,0:01:22.079 and these negative conclusions [br]that you're coming to, 0:01:22.079,0:01:24.664 prejudice often leads to discrimination, 0:01:24.664,0:01:29.017 which is negative behavior [br]towards members of an out-group. 0:01:29.017,0:01:31.670 And by the way, an out-group is a group 0:01:31.670,0:01:35.591 that we don't belong to or one that we [br]view as fundamentally different from us; 0:01:35.591,0:01:37.240 whereas an in-group, in contrast, 0:01:37.240,0:01:41.945 refers to a group that we DO identify [br]with or see ourselves as belonging to. 0:01:41.945,0:01:44.350 I might be using that [br]terminology quite a bit, 0:01:44.350,0:01:45.973 so it's important to know. 0:01:45.973,0:01:47.067 So just to summarize, 0:01:47.067,0:01:51.659 stereotypes are beliefs,[br]prejudice is an attitude, 0:01:51.659,0:01:54.445 and discrimination is a behavior. 0:01:54.445,0:01:57.583 Let's go over an example [br]that puts all of this together. 0:01:57.583,0:02:01.904 Let's say, for example, that you [br]believe older adults are incompetent, 0:02:01.904,0:02:05.437 and that's a stereotype that [br]you have about older adults. 0:02:05.437,0:02:09.111 (And I'll note that I'm not endorsing [br]this stereotype or any other stereotype 0:02:09.111,0:02:11.151 that I use as an example in this video, 0:02:11.151,0:02:14.203 but we have to have some kind [br]of an example to work with here.) 0:02:14.203,0:02:16.777 So let's say you work at, [br]I don't know, a tech company 0:02:16.777,0:02:18.963 and you're looking to hire an assistant. 0:02:18.963,0:02:20.835 If an elderly gentleman applies, 0:02:20.835,0:02:23.505 you might walk into that interview [br]with the gentleman, 0:02:23.505,0:02:27.908 assuming he won't be a good fit [br]or that he'd be difficult to train. 
0:02:27.908,0:02:30.710 Now, we would call this [br]premature conclusion, 0:02:30.710,0:02:33.749 this negative attitude toward [br]this gentleman, prejudice. 0:02:33.749,0:02:36.623 Finally, you may decide not [br]to hire the gentleman at all 0:02:36.623,0:02:39.772 because of your stereotype,[br]because of your prejudice. 0:02:39.772,0:02:42.604 In this case, the behavior [br]of not hiring him 0:02:42.604,0:02:45.223 would be discrimination. 0:02:45.223,0:02:48.649 Now, stereotypes and prejudice [br]can be either explicit 0:02:48.649,0:02:51.851 (meaning, we're consciously [br]aware of having this bias) 0:02:51.851,0:02:55.870 or implicit (meaning, it's there, [br]but we aren't aware of it). 0:02:55.870,0:02:59.857 Research shows that explicit prejudice [br]is in decline, which is encouraging; 0:02:59.857,0:03:03.844 however, implicit prejudice [br]really isn't declining much. 0:03:03.844,0:03:07.749 That is, people report being [br]very anti-bias nowadays, 0:03:07.749,0:03:11.000 but their behavior still [br]tells us a different story. 0:03:11.000,0:03:14.168 Let's take a look at a few [br]examples to illustrate. 0:03:14.168,0:03:18.265 Starting with the realm of gender, [br]we can look to some of my own data. 0:03:18.265,0:03:20.000 In one study, I searched through 0:03:20.000,0:03:23.104 the language used by students [br]evaluating their teachers 0:03:23.104,0:03:28.708 in over 14 million reviews posted to [br]a popular instructor evaluation website, 0:03:28.708,0:03:33.696 RateMyProfessors.com,[br]which you've perhaps used in the past. 0:03:33.696,0:03:37.448 I was specifically interested in[br]stereotypes about intelligence, 0:03:37.448,0:03:41.954 so I searched through uses of [br]the words “genius” and “brilliant.” 0:03:41.954,0:03:46.225 So let's take a look at the results.[br]There's a lot of information here. 0:03:46.225,0:03:48.490 Let me help you interpret these graphs. 
0:03:48.490,0:03:50.101 These are graphs for uses of 0:03:50.101,0:03:54.220 the words “genius” on the left [br]and “brilliant” on the right. 0:03:54.220,0:03:56.715 The x-axis on both of these graphs 0:03:56.715,0:04:00.072 represents uses per million [br]words of text, 0:04:00.072,0:04:02.829 which might sound a little [br]complicated, but really isn't. 0:04:02.829,0:04:04.264 There's a ton of text here, 0:04:04.264,0:04:07.613 so to keep the numbers on the [br]x-axis from being enormous 0:04:07.613,0:04:09.397 and just visually unappealing, 0:04:09.397,0:04:15.170 I used uses per million words of text, [br]but the interpretation is basically the same. 0:04:15.170,0:04:17.338 The further to the right[br]you go on the x-axis 0:04:17.338,0:04:20.590 (the higher the number), [br]the more this word was used. 0:04:20.590,0:04:22.392 So that's how you can interpret that. 0:04:22.392,0:04:26.762 The y-axis here displays all of the [br]different fields such as philosophy, 0:04:26.762,0:04:30.115 music, mathematics, psychology; 0:04:30.115,0:04:31.951 so you can look for your own field 0:04:31.951,0:04:35.990 or just pause the video and look through[br]them in general, if you're curious. 0:04:35.990,0:04:39.092 And fields that are higher up [br]on the y-axis were the ones 0:04:39.092,0:04:41.976 in which the words were used the most often. 0:04:41.976,0:04:47.095 The blue dots here on the slide [br]represent reviews of male professors, 0:04:47.095,0:04:51.881 whereas the orange dots represent[br]reviews of female professors. 0:04:51.881,0:04:55.098 Before I give you the punch line, [br]what do you notice here? 0:04:55.098,0:04:59.557 Well, what I found is that in every [br]field for which we have data, 0:04:59.557,0:05:02.720 students describe their male [br]professors as genius and brilliant 0:05:02.720,0:05:05.840 significantly more often than [br]they do their female professors. 
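As an aside for the quantitatively inclined, the "uses per million words of text" normalization described above can be sketched in a few lines of Python. The counts and corpus sizes below are made-up placeholders for illustration, not the actual RateMyProfessors data.

```python
def uses_per_million(hits: int, total_words: int) -> float:
    """Scale a raw word count to uses per one million words of text,
    the normalization used for the x-axis in the graphs described above."""
    return hits / total_words * 1_000_000

# Hypothetical numbers (not the real data): suppose "genius" appears
# 300 times in 6 million words of reviews of male professors,
# and 120 times in 4 million words of reviews of female professors.
print(uses_per_million(300, 6_000_000))  # 50.0
print(uses_per_million(120, 4_000_000))  # 30.0
```

The point of the normalization is visible in the toy numbers: because the two groups' corpora differ in size, raw counts (300 vs. 120) are not directly comparable, but rates per million words (50 vs. 30) are.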
0:05:05.840,0:05:09.189 And in no field was this effect reversed, 0:05:09.189,0:05:12.874 even for fields where women [br]were the statistical majority. 0:05:12.874,0:05:15.944 And this points to a stereotype [br]in favor of men's intelligence 0:05:15.944,0:05:18.294 and against women's intelligence. 0:05:18.294,0:05:19.480 You might be wondering: 0:05:19.480,0:05:22.700 Does this reflect an overall[br]bias against women, 0:05:22.700,0:05:26.586 or is the stereotype specific [br]to intellectual ability? 0:05:26.586,0:05:28.616 Well, I was curious about this as well, 0:05:28.616,0:05:31.857 but if you look at the data for [br]the terms “excellent” and “amazing,” 0:05:31.857,0:05:34.224 the gender bias goes away entirely. 0:05:34.224,0:05:35.963 It appears that students believe 0:05:35.963,0:05:38.795 that their female professors [br]can be excellent and amazing, 0:05:38.795,0:05:43.516 but they believe it's mainly the male [br]professors who are genius and brilliant. 0:05:43.516,0:05:45.884 Again, this is evidence of implicit bias 0:05:45.884,0:05:47.329 because students are likely 0:05:47.329,0:05:49.506 not consciously aware[br]of this discrepancy. 0:05:49.506,0:05:52.186 They're simply going online[br]to review their professors 0:05:52.186,0:05:54.751 and they're not giving their [br]stereotypes any thought. 0:05:54.751,0:05:58.890 So explicitly, students would [br]likely say they don't hold a bias, 0:05:58.890,0:06:02.142 yet implicitly, they respond in this way. 0:06:02.142,0:06:03.552 This is a common theme 0:06:03.552,0:06:08.744 in modern research on stereotypes, [br]prejudice, and discrimination. 0:06:08.744,0:06:10.987 Now that's gender.[br]What about race? 0:06:10.987,0:06:15.005 One study found that doctors [br]were only 60% as likely to suggest 0:06:15.005,0:06:19.152 a top-rated diagnostic test [br]for Black heart patients 0:06:19.152,0:06:21.504 as for White heart patients. 
0:06:21.504,0:06:23.053 There's also evidence to suggest 0:06:23.053,0:06:26.355 that White men are offered [br]greater financial opportunities. 0:06:26.355,0:06:29.358 As one example, a study found [br]that White men were offered 0:06:29.358,0:06:32.245 the best deals at used car dealerships. 0:06:32.245,0:06:36.622 White men paid on average $109[br]less than White women, 0:06:36.622,0:06:40.302 $318 less than Black women, 0:06:40.302,0:06:47.757 and a whopping $935 less on average [br]for a used car than Black men. 0:06:47.757,0:06:51.470 Now, these are just two examples out [br]of thousands that I could tell you about, 0:06:51.470,0:06:53.022 but again, it's likely the case 0:06:53.022,0:06:56.642 that these doctors and car salesmen[br]aren't EXPLICITLY biased, 0:06:56.642,0:07:01.676 but their behavior provides [br]evidence of IMPLICIT bias. 0:07:01.676,0:07:04.513 Okay, so let's finish [br]with a brief discussion 0:07:04.513,0:07:09.483 of what leads to the development and [br]perpetuation of some of these things 0:07:09.483,0:07:11.704 (stereotypes, prejudice,[br]and discrimination), 0:07:11.704,0:07:13.873 starting with stereotypes. 0:07:13.873,0:07:17.025 A factor that we've learned about [br]before is confirmation bias, 0:07:17.025,0:07:20.462 the tendency to seek out evidence [br]that supports our beliefs 0:07:20.462,0:07:25.381 and to deny, dismiss, or distort [br]evidence that contradicts them. 0:07:25.381,0:07:28.918 Say, for example, that you believe [br]women to be bad drivers. 0:07:28.918,0:07:30.984 If you're out driving for an hour, 0:07:30.984,0:07:36.107 you might encounter several bad [br]drivers, some male, some female. 0:07:36.107,0:07:39.393 If you don't have a stereotype [br]against male drivers, though, 0:07:39.393,0:07:43.776 you might not think much of them [br]when they speed or make dangerous moves. 
0:07:43.776,0:07:47.015 But the second a female driver [br]cuts you off, for example, 0:07:47.015,0:07:50.299 you feel vindicated as though [br]you've found additional evidence 0:07:50.299,0:07:52.286 or proof for your belief. 0:07:52.286,0:07:54.157 And this reinforces your stereotype 0:07:54.157,0:07:59.081 even though, in truth, many people are [br]bad drivers regardless of their gender. 0:07:59.081,0:08:01.503 Now, if we used System 2 thinking 0:08:01.503,0:08:03.782 (which we've learned [br]about in a previous video) 0:08:03.782,0:08:07.386 to evaluate these kinds of assumptions [br]and the data that we base them on, 0:08:07.386,0:08:11.869 we might realize that those assumptions [br]are erroneous, but we usually don't. 0:08:11.869,0:08:14.854 This is because we are cognitive misers. 0:08:14.854,0:08:18.040 That is, we seek to use only [br]minimal cognitive resources 0:08:18.040,0:08:20.593 when explaining the world around us. 0:08:20.593,0:08:23.028 Evaluating our stereotypes takes effort, 0:08:23.028,0:08:25.348 and because we generally [br]don't go to more effort 0:08:25.348,0:08:27.431 than we deem absolutely necessary, 0:08:27.431,0:08:31.465 we don't evaluate or re-evaluate them at all. 0:08:31.465,0:08:33.537 Now, what causes prejudice? 0:08:33.537,0:08:37.372 First, we have in-group bias, [br]which refers to the tendency 0:08:37.372,0:08:43.146 to favor individuals from within our [br]group over those from outside our group. 0:08:43.146,0:08:46.117 Evidence from developmental [br]psychology suggests that this bias 0:08:46.117,0:08:50.551 is innate, with young infants showing[br]strong preferences, for example, 0:08:50.551,0:08:54.670 for others who share their preferences[br](such as their favorite snack) 0:08:54.670,0:08:58.474 and infants disliking others who [br]do not share their preferences 0:08:58.474,0:09:02.605 (for example, if the other person shows[br]that they like a different snack more). 
0:09:02.605,0:09:06.236 Think of the implications [br]for racism, sexism, and so on. 0:09:06.236,0:09:10.051 Another factor is called [br]the ultimate attribution error, 0:09:10.051,0:09:11.676 which refers to the assumption 0:09:11.676,0:09:14.180 that behaviors among [br]individual members of a group 0:09:14.180,0:09:17.554 are due to their internal dispositions. 0:09:17.554,0:09:20.407 Out-group members’ flaws [br]are due to internal factors 0:09:20.407,0:09:25.483 such as their personality or their race, [br]whereas in-group members’ flaws aren't. 0:09:25.483,0:09:28.459 This might sound a lot like [br]the fundamental attribution error, 0:09:28.459,0:09:31.355 which we've learned about before, [br]but it is a bit different. 0:09:31.355,0:09:33.285 Think of the ultimate attribution error 0:09:33.285,0:09:37.202 as more of a narrow case of [br]the fundamental attribution error 0:09:37.202,0:09:39.267 applied specifically to attributions 0:09:39.267,0:09:43.694 about an individual in relation to[br]the group to which they belong. 0:09:43.694,0:09:45.000 Along similar lines, 0:09:45.000,0:09:48.352 out-group homogeneity [br]refers to the tendency to view 0:09:48.352,0:09:52.836 all individuals outside our group [br]as highly similar to one another. 0:09:52.836,0:09:54.356 Here, think of the implications 0:09:54.356,0:09:57.442 for identifying a suspect in [br]a police lineup, for example, 0:09:57.442,0:10:02.314 but also consider this bias in relation [br]to the ultimate attribution error. 0:10:02.314,0:10:04.681 It's a very bad combination to assume 0:10:04.681,0:10:08.136 that out-group members’ flaws [br]are due to inherent factors 0:10:08.136,0:10:10.143 such as their personalities or their race, 0:10:10.143,0:10:11.943 and to simultaneously assume 0:10:11.943,0:10:15.997 that out-group members are all [br]highly similar to one another. 
0:10:15.997,0:10:19.866 Finally, scapegoating refers to [br]the act of blaming an out-group 0:10:19.866,0:10:25.872 when the in-group experiences frustration or [br]is blocked from obtaining some kind of a goal. 0:10:25.872,0:10:30.493 People scapegoat because it preserves [br]a positive self-concept. 0:10:30.493,0:10:35.329 If you believe the reason you can't get a job [br]is because immigrants are taking them all, 0:10:35.329,0:10:38.192 well, then you don't have to [br]come to terms with the reality 0:10:38.192,0:10:42.580 that you simply aren't qualified or [br]competent enough for that line of work. 0:10:42.580,0:10:46.431 Now, this list of causes here [br]is by no means all-inclusive 0:10:46.431,0:10:50.116 but should give you a good idea of [br]the general psychological phenomena 0:10:50.116,0:10:53.311 that lead to the formation and[br]perpetuation of stereotypes, 0:10:53.311,0:10:55.661 prejudice, and discrimination. [END]