0:00:00.780,0:00:04.631
>>Behind every innovation
is an inspired scientist.

0:00:04.631,0:00:08.891
And behind every PBS program
is a viewer like you!

0:00:08.891,0:00:11.152
Get behind your PBS station.

0:00:11.922,0:00:15.070
Your support makes a difference!

0:00:15.530,0:00:18.550
>>Hello and welcome to
Scientific American Frontiers,

0:00:18.550,0:00:19.880
I'm Alan Alda.

0:00:20.451,0:00:24.039
We all like to think that we
understand our own minds.

0:00:24.039,0:00:27.958
That we carefully weigh the pros and cons
when we make a decision.

0:00:27.958,0:00:29.699
And that after we've made it,

0:00:29.699,0:00:33.870
even if we have cause to regret it, we
know why we decided the way we did.

0:00:33.870,0:00:37.923
Oh sure, we know some choices
are more emotional than rational,

0:00:37.923,0:00:40.285
but even then, we think
we're conscious

0:00:40.285,0:00:44.357
of all the conflicting arguments that run
through our minds as we make a choice.

0:00:45.371,0:00:49.868
Well, in this program we'll find out
how utterly deluded we are,

0:00:49.868,0:00:55.650
as we join in experiments revealing just how
sneaky and underhanded our brains can be.

0:00:56.493,0:01:02.992
[music]

0:01:04.159,0:01:06.508
>>I've always thought
of myself as a feminist,

0:01:06.508,0:01:09.950
so I'm pretty sure I know how I'll
do in this test of my reaction

0:01:09.950,0:01:11.748
to women in the workplace.

0:01:11.748,0:01:13.498
I am ready to begin.

0:01:13.498,0:01:19.258
Women, in fact, like Mahzarin Banaji, who's
a professor here at Harvard University.

0:01:19.258,0:01:22.499
The test is called the
"implicit association test"

0:01:22.499,0:01:24.568
and it begins simply enough.

0:01:24.568,0:01:26.657
I have to pair the word in the center

0:01:26.657,0:01:30.340
with one of the words above by
pressing the "e" key with my left hand

0:01:30.340,0:01:32.695
or the "i" key with my right.
0:01:32.695,0:01:34.875
Well, that could be either way.

0:01:34.875,0:01:37.405
Mahzarin has told me to
do this as fast as I can

0:01:37.405,0:01:42.146
because it's the time I take to make the
associations that's critical to the test.

0:01:44.280,0:01:48.560
Now the target words have changed,
but the task remains the same:

0:01:48.560,0:01:52.824
to quickly decide whether the
new words belong to the left or the right.

0:01:56.004,0:01:58.286
But things are about
to get trickier.

0:01:59.884,0:02:03.043
>>It's the same thing except now any
one of these four will show up,

0:02:03.043,0:02:05.295
and when it's "career" or "male"
you'll press the left key,

0:02:05.295,0:02:08.315
when it's "family" or "female" you'll
press the right key.

0:02:08.825,0:02:11.464
>>"Career" or "male"?
>>Yes.

0:02:11.464,0:02:14.945
>>So now the categories are
described by two words,

0:02:14.945,0:02:17.883
making it harder to decide
where the new words belong.

0:02:18.783,0:02:23.273
But since historically, "male" and
"career" have gone together....

0:02:23.273,0:02:27.394
This is like, reinforcing this stereotype.
>>Yeah, that's it exactly.

0:02:27.954,0:02:32.595
>>The point of the test is to discover
if, lurking beneath my feminist convictions,

0:02:32.595,0:02:36.866
I actually harbor a hidden bias
against women in the workplace

0:02:36.866,0:02:41.665
based on all the associations between
"man" and "career" and "woman" and "family"

0:02:41.665,0:02:44.159
that bathed the culture
I grew up in.

0:02:44.159,0:02:49.009
"Family" or "male", "career" or "female".
Okay, now you're testing me.
0:02:49.199,0:02:53.600
The implicit association test
is designed to ferret out any bias

0:02:53.600,0:02:58.389
by seeing if it takes me fractionally longer
to figure out where a word belongs

0:02:58.389,0:03:02.989
when the pairing of the target words, in
this case "family" and "male" together

0:03:02.989,0:03:08.578
and "career" and "female", is slightly
more difficult for my brain to accept.

0:03:08.578,0:03:12.939
"Corporation" - "family"
or...[breaks off in laughter]

0:03:12.939,0:03:16.379
"brother", "management"...

0:03:17.780,0:03:19.251
And we're done.

0:03:20.872,0:03:24.502
>>Slight. I could tell he has a slight.
This is very good.

0:03:24.502,0:03:26.952
You are...so let's see,
you're showing a slight,

0:03:26.952,0:03:29.551
automatic association
between male and career,

0:03:29.551,0:03:34.903
and the...only 12% of the
population that takes this test

0:03:34.903,0:03:38.102
shows this bias. What you're
seeing is that you're,

0:03:38.102,0:03:41.071
you're showing a much smaller bias
than what many other people show.

0:03:41.071,0:03:44.320
I'm up here. I make a strong
association between

0:03:44.320,0:03:47.772
"male" and "career" and "female" and
"family" even though that's not what I

0:03:47.772,0:03:49.232
consciously express.

0:03:49.232,0:03:50.673
>>Even though that's
not what you live, huh?

0:03:50.673,0:03:52.331
>>And that's not what I live,
but it's in my world.

0:03:52.331,0:03:54.781
And I don't know, I'd be
interested to know. Maybe

0:03:54.781,0:03:58.680
many years of working in feminist
causes made you show this less.

0:03:58.680,0:04:01.442
>>So all those years of working
in feminist causes

0:04:01.442,0:04:06.490
didn't manage to totally eliminate my lingering
association between "female" and "family"

0:04:06.490,0:04:09.542
and "male" and "career".
0:04:09.542,0:04:12.990
What's astonishing, though, is that
Mahzarin is far worse,

0:04:12.990,0:04:16.463
and she's not only enjoying
a very successful career--

0:04:16.463,0:04:18.233
she designed the test!

0:04:18.894,0:04:21.854
Another of its designers
is Brian Nosek,

0:04:21.854,0:04:24.473
who personally developed
the test he's taking now,

0:04:24.473,0:04:27.594
intending to reveal hidden
racial prejudice.

0:04:27.594,0:04:29.864
>>These are the same faces
you've seen many, many times.

0:04:29.864,0:04:33.633
>>Many times, many times. In fact I helped
create the faces, so...

0:04:33.633,0:04:36.974
>>The heart of this test is to see whether
it's easier for Brian to match words

0:04:36.974,0:04:40.743
or pictures to the pairing
of African American with bad,

0:04:40.743,0:04:43.282
and European American with good,

0:04:43.742,0:04:45.934
than when the pairings
are reversed,

0:04:45.934,0:04:50.424
European Americans with bad and
African Americans with good.

0:04:50.985,0:04:55.445
[quiet]

0:04:55.445,0:04:57.533
>>I can predict.
>>Yeah [chuckle]

0:04:57.533,0:04:58.583
What do you think?

0:04:58.583,0:05:00.904
>>Moderate. Moderate to strong.

0:05:00.904,0:05:03.455
>>Yeah, I'll say moderate as well.
>>Yeah.

0:05:03.455,0:05:06.335
>>Strong preference for white.
[exclamation of laughter]

0:05:06.335,0:05:09.205
>>You've taken this
for the umpteenth time,

0:05:09.575,0:05:14.055
and you still haven't caught on
to the fact that you're a little biased?

0:05:14.055,0:05:15.975
[laughter]

0:05:15.975,0:05:21.526
>>I think part of that is the insistence
that my conscious beliefs still matter.
0:05:21.526,0:05:25.276
It isn't that-- the fact that I
keep showing these implicit biases

0:05:25.276,0:05:28.577
means that I'm a biased person so
I should just accept that and move on,

0:05:28.577,0:05:32.635
it's that-- no, I don't agree with
those, I do have them

0:05:32.635,0:05:35.065
and I will admit to having
those implicit biases

0:05:35.065,0:05:38.667
but I'm not going to let that rule
what I would consciously want to have.

0:05:39.927,0:05:43.335
>>It really is remarkable that here I am
in Cambridge, Massachusetts--

0:05:43.335,0:05:45.555
perhaps the liberal
capital of the country--

0:05:45.555,0:05:49.904
with two academics who pride themselves
on their enlightened attitudes,

0:05:49.904,0:05:52.246
only to discover that they
have latent within them

0:05:52.246,0:05:56.731
biases they would fervently deny
if their own test hadn't revealed them.

0:05:59.581,0:06:01.569
But all may not be lost.

0:06:02.202,0:06:05.794
>>We found in research, and
also in testing of myself,

0:06:05.794,0:06:10.685
that if I put myself in a situation where
I think about positive black exemplars--

0:06:10.685,0:06:14.402
I think about Michael Jordan
and Colin Powell and people...

0:06:14.402,0:06:19.396
Martin Luther King, who have had a very positive
impact and who are also African American--

0:06:19.396,0:06:23.153
I show much less bias immediately
after thinking about those exemplars

0:06:23.153,0:06:25.355
than if I hadn't thought
about them before.

0:06:25.355,0:06:29.715
>>Even simple things, like the presence
of an African American experimenter,

0:06:29.715,0:06:34.943
reduce race bias. Or that the presence
of a competent woman

0:06:34.943,0:06:38.085
makes women's attitudes to math
become more positive.
0:06:38.085,0:06:40.575
Those kinds of studies

0:06:40.575,0:06:44.515
sit, I think, in contradiction to the ways
in which people like me thought about this--

0:06:44.515,0:06:46.654
we thought they were learned
over long periods of time,

0:06:46.654,0:06:49.143
that they were entrenched,
they were rigid, inflexible--

0:06:49.143,0:06:51.224
and that does not appear
to be the case,

0:06:51.224,0:06:53.534
and I think that that's where
the room for optimism is.

0:06:53.534,0:06:56.805
On the one hand, I would like people
to take these data seriously

0:06:56.805,0:07:00.967
when we can show the vast numbers
of people who show the bias;

0:07:00.967,0:07:02.945
I think we need to contend with that.

0:07:02.945,0:07:07.224
On the other hand, what this work
is showing is that it might be

0:07:07.224,0:07:10.755
not that hard to shape environments
that can change these,

0:07:10.755,0:07:13.878
even automatic kinds of attitudes.

0:07:15.188,0:07:19.806
>>You can find out if you harbor unconscious
biases you would deny even to yourself

0:07:19.806,0:07:23.536
by logging on to the
Project Implicit website.

0:07:23.536,0:07:27.868
There you'll find tests of attitudes towards
such things as age and religion,

0:07:27.868,0:07:33.149
as well as lighter fare like the Harry
Potter movies versus The Lord of the Rings.

0:07:33.878,0:07:37.256
Give it a try. You may be in for a shock.