WEBVTT 00:00:00.780 --> 00:00:04.631 >>Behind every innovation is an inspired scientist. 00:00:04.631 --> 00:00:08.891 And behind every PBS program is a viewer like you! 00:00:08.891 --> 00:00:11.152 Get behind your PBS station. 00:00:11.922 --> 00:00:15.070 Your support makes a difference! 00:00:15.530 --> 00:00:18.550 >>Hello and welcome to Scientific American Frontiers, 00:00:18.550 --> 00:00:19.880 I'm Alan Alda. 00:00:20.451 --> 00:00:24.039 We all like to think that we understand our own minds. 00:00:24.039 --> 00:00:27.958 That we carefully weigh the pros and cons when we make a decision. 00:00:27.958 --> 00:00:29.699 And that after we've made it, 00:00:29.699 --> 00:00:33.870 even if we have cause to regret it, we know why we decided the way we did. 00:00:33.870 --> 00:00:37.923 Oh sure, we know some choices are more emotional than rational, 00:00:37.923 --> 00:00:40.285 but even then, we think we're conscious 00:00:40.285 --> 00:00:44.357 of all the conflicting arguments that run through our minds as we make a choice. 00:00:45.371 --> 00:00:49.868 Well, in this program we'll find out how utterly deluded we are, 00:00:49.868 --> 00:00:55.650 as we join in experiments revealing just how sneaky and underhanded our brains can be. 00:00:56.493 --> 00:01:02.992 [music] 00:01:04.159 --> 00:01:06.508 >>I've always thought of myself as a feminist, 00:01:06.508 --> 00:01:09.950 so I'm pretty sure I know how I'll do in this test of my reaction 00:01:09.950 --> 00:01:11.748 to women in the workplace. 00:01:11.748 --> 00:01:13.498 I am ready to begin. 00:01:13.498 --> 00:01:19.258 Women, in fact, like Mahzarin Banaji, who's a professor here at Harvard University. 00:01:19.258 --> 00:01:22.499 The test is called the "implicit association test" 00:01:22.499 --> 00:01:24.568 and it begins simply enough. 
00:01:24.568 --> 00:01:26.657 I have to pair the word in the center, 00:01:26.657 --> 00:01:30.340 with one of the words above by pressing the "e" key with my left hand 00:01:30.340 --> 00:01:32.695 or the "i" key with my right. 00:01:32.695 --> 00:01:34.875 Well, that could be either way. 00:01:34.875 --> 00:01:37.405 Mahzarin has told me to do this as fast as I can 00:01:37.405 --> 00:01:42.146 because it's the time I take to make the associations that's critical to the test. 00:01:44.280 --> 00:01:48.560 Now the target words have changed, but the task remains the same; 00:01:48.560 --> 00:01:52.824 to quickly decide whether the new words belong to the left or the right. 00:01:56.004 --> 00:01:58.286 But things are about to get trickier. 00:01:59.884 --> 00:02:03.043 >>It's the same thing except now any one of these four will show up, 00:02:03.043 --> 00:02:05.295 and when it's "career" or "male" you'll press the left key, 00:02:05.295 --> 00:02:08.315 when it's "family" or "female" you'll press the right key. 00:02:08.825 --> 00:02:11.464 >>"Career" or "male" >>Yes. 00:02:11.464 --> 00:02:14.945 >>So now the categories are described by two words, 00:02:14.945 --> 00:02:17.883 making it harder to decide where the new words belong. 00:02:18.783 --> 00:02:23.273 But since historically, "male" and "career" have gone together.... 00:02:23.273 --> 00:02:27.394 This is like, reinforcing this stereotype. >>Yeah, that's it exactly. 00:02:27.954 --> 00:02:32.595 >>The point of the test is to discover if lurking beneath my feminist convictions, 00:02:32.595 --> 00:02:36.866 I actually harbor a hidden bias against women in the workplace 00:02:36.866 --> 00:02:41.665 based on all the associations between "man" and "career" and "woman" and "family" 00:02:41.665 --> 00:02:44.159 that bathed the culture I grew up in. 00:02:44.159 --> 00:02:49.009 "Family" or "male", "career" or "female". Okay, now you're testing me. 
00:02:49.199 --> 00:02:53.600 The implicit association test is designed to ferret out any bias 00:02:53.600 --> 00:02:58.389 by seeing if it takes me fractionally longer to figure out where a word belongs 00:02:58.389 --> 00:03:02.989 when the pairing of the target words, in this case: "family" and "male" together 00:03:02.989 --> 00:03:08.578 and "career" and "female", is slightly more difficult for my brain to accept. 00:03:08.578 --> 00:03:12.939 Corporation - "family" or...[breaks off in laughter] 00:03:12.939 --> 00:03:16.379 "brother", "management"... 00:03:17.780 --> 00:03:19.251 and we're done. 00:03:20.872 --> 00:03:24.502 >>Slight. I could tell he has a slight. This is very good. 00:03:24.502 --> 00:03:26.952 You are...so let's see, you're showing a slight, 00:03:26.952 --> 00:03:29.551 automatic association between male and career, 00:03:29.551 --> 00:03:34.903 and the...only 12% of the population that takes this test 00:03:34.903 --> 00:03:38.102 shows this bias. What you're seeing is that you're, 00:03:38.102 --> 00:03:41.071 you're showing a much smaller bias than what many other people show. 00:03:41.071 --> 00:03:44.320 I'm up here. I make a strong association between 00:03:44.320 --> 00:03:47.772 "male" and "career" and "female" and "family" even though that's not what I 00:03:47.772 --> 00:03:49.232 consciously express. 00:03:49.232 --> 00:03:50.673 >>Even though that's not what you live, huh? 00:03:50.673 --> 00:03:52.331 >>And that's not what I live, but in my world. 00:03:52.331 --> 00:03:54.781 And I don't know, I'd be interested to know. Maybe 00:03:54.781 --> 00:03:58.680 many years of working in feminist causes made you show this less. 00:03:58.680 --> 00:04:01.442 >>So all those years of working in feminist causes 00:04:01.442 --> 00:04:06.490 didn't manage to totally eliminate my lingering association between "female" and "family" 00:04:06.490 --> 00:04:09.542 and "male" and "career". 
00:04:09.542 --> 00:04:12.990 What's astonishing though is that Mahzarin is far worse, 00:04:12.990 --> 00:04:16.463 and she's not only enjoying a very successful career- 00:04:16.463 --> 00:04:18.233 she designed the test! 00:04:18.894 --> 00:04:21.854 Another of its designers is Brian Nosek, 00:04:21.854 --> 00:04:24.473 who personally developed the test he's taking now, 00:04:24.473 --> 00:04:27.594 intending to reveal hidden racial prejudice. 00:04:27.594 --> 00:04:29.864 >>These are the same faces you've seen many, many times. 00:04:29.864 --> 00:04:33.633 >>Many times, many times. In fact I helped create the faces, so... 00:04:33.633 --> 00:04:36.974 >>The heart of this test is to see whether it's easier for Brian to match words 00:04:36.974 --> 00:04:40.743 or pictures to the pairing of African American with bad, 00:04:40.743 --> 00:04:43.282 and European American with good, 00:04:43.742 --> 00:04:45.934 than when the pairings are reversed, 00:04:45.934 --> 00:04:50.424 European Americans with bad and African Americans with good. 00:04:50.985 --> 00:04:55.445 [quiet] 00:04:55.445 --> 00:04:57.533 >>I can predict. >>Yeah [chuckle] 00:04:57.533 --> 00:04:58.583 What do you think? 00:04:58.583 --> 00:05:00.904 >>Moderate. Moderate to strong. 00:05:00.904 --> 00:05:03.455 >>Yeah, I'll say moderate as well. >>Yeah. 00:05:03.455 --> 00:05:06.335 >>Strong preference for white. [exclamation of laughter] 00:05:06.335 --> 00:05:09.205 >>You've taken this for the umpteenth time, 00:05:09.575 --> 00:05:14.055 and you still haven't caught on to the fact that you're a little biased? 00:05:14.055 --> 00:05:15.975 [laughter] 00:05:15.975 --> 00:05:21.526 >>I think part of that is the insistence that my conscious beliefs still matter. 
00:05:21.526 --> 00:05:25.276 It isn't that-- the fact that I keep showing these implicit biases 00:05:25.276 --> 00:05:28.577 means that I'm a biased person so I should just accept that and move on, 00:05:28.577 --> 00:05:32.635 it's that-- no, I don't agree with those, I do have them 00:05:32.635 --> 00:05:35.065 and I will admit to having those implicit biases 00:05:35.065 --> 00:05:38.667 but I'm not going to let that rule what I would consciously want to have. 00:05:39.927 --> 00:05:43.335 >>It really is remarkable that here I am in Cambridge, Massachusetts-- 00:05:43.335 --> 00:05:45.555 perhaps the liberal capital of the country-- 00:05:45.555 --> 00:05:49.904 with two academics who pride themselves on their enlightened attitudes, 00:05:49.904 --> 00:05:52.246 only to discover that they have latent within them 00:05:52.246 --> 00:05:56.731 biases they would fervently deny if their own test hadn't revealed them. 00:05:59.581 --> 00:06:01.569 But all may not be lost. 00:06:02.202 --> 00:06:05.794 >>We found in research, and also in testing of myself, 00:06:05.794 --> 00:06:10.685 that if I put myself in a situation where I think about positive black exemplars-- 00:06:10.685 --> 00:06:14.402 I think about Michael Jordan and Colin Powell and people... 00:06:14.402 --> 00:06:19.396 Martin Luther King who have had a very positive impact and who are also African American, 00:06:19.396 --> 00:06:23.153 I show much less bias immediately after thinking about those exemplars 00:06:23.153 --> 00:06:25.355 than if I hadn't thought about them before. 00:06:25.355 --> 00:06:29.715 >>Even simple things, like the presence of an African American experimenter, 00:06:29.715 --> 00:06:34.943 reduces race bias. That the presence of a competent woman 00:06:34.943 --> 00:06:38.085 makes women's attitudes to math become more positive. 
00:06:38.085 --> 00:06:40.575 Those kinds of studies 00:06:40.575 --> 00:06:44.515 sit, I think, in contradiction to the ways in which people like me thought about this-- 00:06:44.515 --> 00:06:46.654 we thought they were learned over long periods of time, 00:06:46.654 --> 00:06:49.143 that they were entrenched, they were rigid, inflexible-- 00:06:49.143 --> 00:06:51.224 and that does not appear to be the case, 00:06:51.224 --> 00:06:53.534 and I think that that's where the room for optimism is. 00:06:53.534 --> 00:06:56.805 On the one hand, I would like people to take these data seriously 00:06:56.805 --> 00:07:00.967 when we can show the vast numbers of people who show the bias, 00:07:00.967 --> 00:07:02.945 I think we need to contend with that. 00:07:02.945 --> 00:07:07.224 On the other hand, what this work is showing is that it might be 00:07:07.224 --> 00:07:10.755 not that hard to shape environments that can change these, 00:07:10.755 --> 00:07:13.878 even automatic kinds of attitudes. 00:07:15.188 --> 00:07:19.806 >>You can find out if you harbor unconscious biases you would deny even to yourself, 00:07:19.806 --> 00:07:23.536 by logging on to the Project Implicit website. 00:07:23.536 --> 00:07:27.868 There you'll find tests of attitudes towards such things as age and religion, 00:07:27.868 --> 00:07:33.149 as well as lighter fare like the Harry Potter movies versus The Lord of the Rings. 00:07:33.878 --> 00:07:37.256 Give it a try. You may be in for a shock.