1
00:00:00,780 --> 00:00:04,631
>>Behind every innovation is an inspired scientist.

2
00:00:04,631 --> 00:00:08,891
And behind every PBS program is a viewer like you!

3
00:00:08,891 --> 00:00:11,152
Get behind your PBS station.

4
00:00:11,922 --> 00:00:15,070
Your support makes a difference!

5
00:00:15,530 --> 00:00:18,550
>>Hello and welcome to Scientific American Frontiers,

6
00:00:18,550 --> 00:00:19,880
I'm Alan Alda.

7
00:00:20,451 --> 00:00:24,039
We all like to think that we understand our own minds.

8
00:00:24,039 --> 00:00:27,958
That we carefully weigh the pros and cons when we make a decision.

9
00:00:27,958 --> 00:00:29,699
And that after we've made it,

10
00:00:29,699 --> 00:00:33,870
even if we have cause to regret it, we know why we decided the way we did.

11
00:00:33,870 --> 00:00:37,923
Oh sure, we know some choices are more emotional than rational,

12
00:00:37,923 --> 00:00:40,285
but even then, we think we're conscious

13
00:00:40,285 --> 00:00:44,357
of all the conflicting arguments that run through our minds as we make a choice.

14
00:00:45,371 --> 00:00:49,868
Well, in this program we'll find out how utterly deluded we are,

15
00:00:49,868 --> 00:00:55,650
as we join in experiments revealing just how sneaky and underhanded our brains can be.

16
00:00:56,493 --> 00:01:02,992
[music]

17
00:01:04,159 --> 00:01:06,508
>>I've always thought of myself as a feminist,

18
00:01:06,508 --> 00:01:09,950
so I'm pretty sure I know how I'll do in this test of my reaction

19
00:01:09,950 --> 00:01:11,748
to women in the workplace.

20
00:01:11,748 --> 00:01:13,498
I am ready to begin.

21
00:01:13,498 --> 00:01:19,258
Women, in fact, like Mahzarin Banaji, who's a professor here at Harvard University.

22
00:01:19,258 --> 00:01:22,499
The test is called the "implicit association test"

23
00:01:22,499 --> 00:01:24,568
and it begins simply enough.

24
00:01:24,568 --> 00:01:26,657
I have to pair the word in the center

25
00:01:26,657 --> 00:01:30,340
with one of the words above by pressing the "e" key with my left hand

26
00:01:30,340 --> 00:01:32,695
or the "i" key with my right.

27
00:01:32,695 --> 00:01:34,875
Well, that could be either way.

28
00:01:34,875 --> 00:01:37,405
Mahzarin has told me to do this as fast as I can,

29
00:01:37,405 --> 00:01:42,146
because it's the time I take to make the associations that's critical to the test.

30
00:01:44,280 --> 00:01:48,560
Now the target words have changed, but the task remains the same:

31
00:01:48,560 --> 00:01:52,824
to quickly decide whether the new words belong to the left or the right.

32
00:01:56,004 --> 00:01:58,286
But things are about to get trickier.

33
00:01:59,884 --> 00:02:03,043
>>It's the same thing, except now any one of these four will show up,

34
00:02:03,043 --> 00:02:05,295
and when it's "career" or "male" you'll press the left key;

35
00:02:05,295 --> 00:02:08,315
when it's "family" or "female" you'll press the right key.

36
00:02:08,825 --> 00:02:11,464
>>"Career" or "male". >>Yes.

37
00:02:11,464 --> 00:02:14,945
>>So now the categories are described by two words,

38
00:02:14,945 --> 00:02:17,883
making it harder to decide where the new words belong.

39
00:02:18,783 --> 00:02:23,273
But since historically, "male" and "career" have gone together...

40
00:02:23,273 --> 00:02:27,394
This is, like, reinforcing this stereotype. >>Yeah, that's it exactly.
41
00:02:27,954 --> 00:02:32,595
>>The point of the test is to discover if, lurking beneath my feminist convictions,

42
00:02:32,595 --> 00:02:36,866
I actually harbor a hidden bias against women in the workplace,

43
00:02:36,866 --> 00:02:41,665
based on all the associations between "man" and "career" and "woman" and "family"

44
00:02:41,665 --> 00:02:44,159
that bathed the culture I grew up in.

45
00:02:44,159 --> 00:02:49,009
"Family" or "male", "career" or "female". Okay, now you're testing me.

46
00:02:49,199 --> 00:02:53,600
The implicit association test is designed to ferret out any bias

47
00:02:53,600 --> 00:02:58,389
by seeing if it takes me fractionally longer to figure out where a word belongs

48
00:02:58,389 --> 00:03:02,989
when the pairing of the target words, in this case "family" and "male" together

49
00:03:02,989 --> 00:03:08,578
and "career" and "female", is slightly more difficult for my brain to accept.

50
00:03:08,578 --> 00:03:12,939
"Corporation" - "family" or... [breaks off in laughter]

51
00:03:12,939 --> 00:03:16,379
"brother", "management"...

52
00:03:17,780 --> 00:03:19,251
...and we're done.

53
00:03:20,872 --> 00:03:24,502
>>Slight. I could tell he has a slight. This is very good.

54
00:03:24,502 --> 00:03:26,952
You are... so let's see, you're showing a slight,

55
00:03:26,952 --> 00:03:29,551
automatic association between male and career,

56
00:03:29,551 --> 00:03:34,903
and the... only 12% of the population that takes this test

57
00:03:34,903 --> 00:03:38,102
shows this bias. What you're seeing is that you're,

58
00:03:38,102 --> 00:03:41,071
you're showing a much smaller bias than what many other people show.

59
00:03:41,071 --> 00:03:44,320
I'm up here. I make a strong association between

60
00:03:44,320 --> 00:03:47,772
"male" and "career" and "female" and "family", even though that's not what I

61
00:03:47,772 --> 00:03:49,232
consciously express.

62
00:03:49,232 --> 00:03:50,673
>>Even though that's not what you live, huh?

63
00:03:50,673 --> 00:03:52,331
>>And that's not what I live, but it's in my world.

64
00:03:52,331 --> 00:03:54,781
And I don't know, I'd be interested to know. Maybe

65
00:03:54,781 --> 00:03:58,680
many years of working in feminist causes made you show this less.

66
00:03:58,680 --> 00:04:01,442
>>So all those years of working in feminist causes

67
00:04:01,442 --> 00:04:06,490
didn't manage to totally eliminate my lingering association between "female" and "family"

68
00:04:06,490 --> 00:04:09,542
and "male" and "career".

69
00:04:09,542 --> 00:04:12,990
What's astonishing, though, is that Mahzarin scores far worse,

70
00:04:12,990 --> 00:04:16,463
and she's not only enjoying a very successful career--

71
00:04:16,463 --> 00:04:18,233
she designed the test!

72
00:04:18,894 --> 00:04:21,854
Another of its designers is Brian Nosek,

73
00:04:21,854 --> 00:04:24,473
who personally developed the test he's taking now,

74
00:04:24,473 --> 00:04:27,594
intended to reveal hidden racial prejudice.

75
00:04:27,594 --> 00:04:29,864
>>These are the same faces you've seen many, many times.

76
00:04:29,864 --> 00:04:33,633
>>Many times, many times. In fact, I helped create the faces, so...

77
00:04:33,633 --> 00:04:36,974
>>The heart of this test is to see whether it's easier for Brian to match words

78
00:04:36,974 --> 00:04:40,743
or pictures to the pairing of African American with bad

79
00:04:40,743 --> 00:04:43,282
and European American with good,
80
00:04:43,742 --> 00:04:45,934
than when the pairings are reversed:

81
00:04:45,934 --> 00:04:50,424
European Americans with bad and African Americans with good.

82
00:04:50,985 --> 00:04:55,445
[quiet]

83
00:04:55,445 --> 00:04:57,533
>>I can predict. >>Yeah. [chuckle]

84
00:04:57,533 --> 00:04:58,583
What do you think?

85
00:04:58,583 --> 00:05:00,904
>>Moderate. Moderate to strong.

86
00:05:00,904 --> 00:05:03,455
>>Yeah, I'll say moderate as well. >>Yeah.

87
00:05:03,455 --> 00:05:06,335
>>Strong preference for white. [exclamation of laughter]

88
00:05:06,335 --> 00:05:09,205
>>You've taken this for the umpteenth time,

89
00:05:09,575 --> 00:05:14,055
and you still haven't caught on to the fact that you're a little biased?

90
00:05:14,055 --> 00:05:15,975
[laughter]

91
00:05:15,975 --> 00:05:21,526
>>I think part of that is the insistence that my conscious beliefs still matter.

92
00:05:21,526 --> 00:05:25,276
It isn't that-- the fact that I keep showing these implicit biases

93
00:05:25,276 --> 00:05:28,577
means that I'm a biased person, so I should just accept that and move on--

94
00:05:28,577 --> 00:05:32,635
it's that-- no, I don't agree with those. I do have them,

95
00:05:32,635 --> 00:05:35,065
and I will admit to having those implicit biases,

96
00:05:35,065 --> 00:05:38,667
but I'm not going to let that rule what I would consciously want to have.

97
00:05:39,927 --> 00:05:43,335
>>It really is remarkable that here I am in Cambridge, Massachusetts--

98
00:05:43,335 --> 00:05:45,555
perhaps the liberal capital of the country--

99
00:05:45,555 --> 00:05:49,904
with two academics who pride themselves on their enlightened attitudes,

100
00:05:49,904 --> 00:05:52,246
only to discover that they have latent within them

101
00:05:52,246 --> 00:05:56,731
biases they would fervently deny if their own test hadn't revealed them.

102
00:05:59,581 --> 00:06:01,569
But all may not be lost.

103
00:06:02,202 --> 00:06:05,794
>>We found in research, and also in testing myself,

104
00:06:05,794 --> 00:06:10,685
that if I put myself in a situation where I think about positive black exemplars--

105
00:06:10,685 --> 00:06:14,402
I think about Michael Jordan and Colin Powell and people...

106
00:06:14,402 --> 00:06:19,396
Martin Luther King, who have had a very positive impact and who are also African American--

107
00:06:19,396 --> 00:06:23,153
I show much less bias immediately after thinking about those exemplars

108
00:06:23,153 --> 00:06:25,355
than if I hadn't thought about them before.

109
00:06:25,355 --> 00:06:29,715
>>Even simple things, like the presence of an African American experimenter,

110
00:06:29,715 --> 00:06:34,943
reduce race bias. Or the presence of a competent woman

111
00:06:34,943 --> 00:06:38,085
makes women's attitudes to math become more positive.

112
00:06:38,085 --> 00:06:40,575
Those kinds of studies

113
00:06:40,575 --> 00:06:44,515
sit, I think, in contradiction to the ways in which people like me thought about this--

114
00:06:44,515 --> 00:06:46,654
we thought they were learned over long periods of time,

115
00:06:46,654 --> 00:06:49,143
that they were entrenched, they were rigid, inflexible--

116
00:06:49,143 --> 00:06:51,224
and that does not appear to be the case,

117
00:06:51,224 --> 00:06:53,534
and I think that that's where the room for optimism is.
118
00:06:53,534 --> 00:06:56,805
On the one hand, I would like people to take these data seriously.

119
00:06:56,805 --> 00:07:00,967
When we can show the vast numbers of people who show the bias,

120
00:07:00,967 --> 00:07:02,945
I think we need to contend with that.

121
00:07:02,945 --> 00:07:07,224
On the other hand, what this work is showing is that it might be

122
00:07:07,224 --> 00:07:10,755
not that hard to shape environments that can change these,

123
00:07:10,755 --> 00:07:13,878
even automatic kinds of attitudes.

124
00:07:15,188 --> 00:07:19,806
>>You can find out if you harbor unconscious biases you would deny even to yourself

125
00:07:19,806 --> 00:07:23,536
by logging on to the Project Implicit website.

126
00:07:23,536 --> 00:07:27,868
There you'll find tests of attitudes towards such things as age and religion,

127
00:07:27,868 --> 00:07:33,149
as well as lighter fare, like the Harry Potter movies versus The Lord of the Rings.

128
00:07:33,878 --> 00:07:37,256
Give it a try. You may be in for a shock.
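[Transcriber's addendum: the scoring Alan describes boils down to comparing how much longer responses take when the word pairings are stereotype-inconsistent ("female"+"career") than when they are stereotype-consistent ("female"+"family"). The Python sketch below is a minimal illustration of that idea, assuming hypothetical per-trial reaction times; the function name, sample data, and simplified score formula are illustrative and are not Project Implicit's actual scoring algorithm, which includes additional refinements such as error penalties and filtering of extreme trials.]

from statistics import mean, stdev

def iat_score(congruent_ms, incongruent_ms):
    # Difference in mean latency between the "incongruent" block
    # (e.g., "male"+"family" / "female"+"career") and the "congruent"
    # block (e.g., "male"+"career" / "female"+"family"), scaled by the
    # standard deviation of all trials from both blocks combined.
    # A positive score means the stereotype-consistent pairings were
    # answered faster, i.e., those associations came more easily.
    all_trials = congruent_ms + incongruent_ms
    return (mean(incongruent_ms) - mean(congruent_ms)) / stdev(all_trials)

# Hypothetical per-trial reaction times in milliseconds.
congruent = [620, 655, 590, 640, 610]
incongruent = [700, 742, 688, 725, 710]

print(f"score = {iat_score(congruent, incongruent):.2f}")  # positive => slight bias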