So you probably have the sense, as most people do, that polarization is getting worse in our country, that the divide between the left and the right is as bad as it's been in really any of our lifetimes. But you might also reasonably wonder if research backs up your intuition. And in a nutshell, the answer is sadly yes.

In study after study, we find that liberals and conservatives have grown further apart. They increasingly wall themselves off in these ideological silos, consuming different news, talking only to like-minded others and more and more choosing to live in different parts of the country. And I think that most alarming of all is this rising animosity on both sides. Liberals and conservatives, Democrats and Republicans, more and more they just don't like one another.

You see it in many different ways. They don't want to befriend one another. They don't want to date one another. If they do, if they find out, they find each other less attractive, and they more and more don't want their children to marry someone who supports the other party, a particularly shocking statistic.

You know, in my lab, when the students I work with and I are talking about some sort of social pattern -- I'm a movie buff, and so I'm often like, what kind of movie are we in here with this pattern? So what kind of movie are we in with political polarization?

Well, it could be a disaster movie. It certainly seems like a disaster. Could be a war movie. Also fits. But what I keep thinking is that we're in a zombie apocalypse movie.

(Laughter)

Right? You know the kind. There are people wandering around in packs, not thinking for themselves, seized by this mob mentality, trying to spread their disease and destroy society.

And you probably think, as I do, that you're the good guy in the zombie apocalypse movie, and all this hate and polarization, it's being propagated by the other people, because we're Brad Pitt, right? Free-thinking, righteous, just trying to hold on to what we hold dear, you know, not foot soldiers in the army of the undead.
Not that. Never that.

But here's the thing: what movie do you suppose they think they're in? Right? Well, they absolutely think that they're the good guys in the zombie apocalypse movie. And you'd better believe that they think that they're Brad Pitt and that we, we are the zombies.

And who's to say that they're wrong? I think that the truth is that we're all a part of this. And the good side of that is that we can be a part of the solution. So what are we going to do? What can we do to chip away at polarization in everyday life? What could we do to connect with and communicate with our political counterparts?

Well, these were exactly the questions that my colleague Matt Feinberg and I became fascinated with a few years ago, and we started doing research on this topic.

And one of the first things we discovered, which I think is really helpful for understanding polarization, is that the political divide in our country is undergirded by a deeper moral divide. One of the most robust findings in the history of political psychology is this pattern, identified by the psychologists Jon Haidt and Jesse Graham, that liberals and conservatives tend to endorse different values to different degrees.

So for example, we find that liberals tend to endorse values like equality and fairness and care and protection from harm more than conservatives do. And conservatives tend to endorse values like loyalty, patriotism, respect for authority and moral purity more than liberals do.

And Matt and I were thinking that maybe this moral divide might be helpful for understanding how it is that liberals and conservatives talk to one another, and why they so often seem to talk past one another when they do.

So we conducted a study where we recruited liberals and asked them to write a persuasive essay in support of same-sex marriage that would be compelling to a conservative. And what we found was that liberals tended to make arguments in terms of the liberal moral values of equality and fairness.
So they said things like, "Everyone should have the right to love whoever they choose," and, "They" -- they being gay Americans -- "deserve the same equal rights as other Americans." Overall, we found that 69 percent of liberals invoked one of the more liberal moral values in constructing their essay, and only nine percent invoked one of the more conservative moral values, even though they were supposed to be trying to persuade conservatives.

And when we studied conservatives and had them make persuasive arguments in support of making English the official language of the US, a classically conservative political position, we found that they weren't much better at this: 59 percent of them made arguments in terms of one of the more conservative moral values, and just eight percent invoked a liberal moral value, even though they were supposed to be targeting liberals for persuasion.

Now, you can see right away why we're in trouble here. Right? People's moral values, they're their most deeply held beliefs. People are willing to fight and die for their values. Why are they going to give that up just to agree with you on something that they don't particularly want to agree with you on anyway? If that persuasive appeal that you're making to your Republican uncle means that he doesn't just have to change his view, he's got to change his underlying values, too, that's not going to go very far.

So what would work better? Well, we believe it's a technique that we call moral reframing, and we've studied it in a series of experiments. In one of these experiments, we recruited liberals and conservatives to a study where they read one of three essays before having their environmental attitudes surveyed.

And the first of these essays was a relatively conventional pro-environmental essay that invoked the liberal values of care and protection from harm. It said things like, "In many important ways we are causing real harm to the places we live in," and, "It is essential that we take steps now to prevent further destruction from being done to our Earth."
Another group of participants were assigned to read a really different essay that was designed to tap into the conservative value of moral purity. It was a pro-environmental essay as well, and it said things like, "Keeping our forests, drinking water, and skies pure is of vital importance," "We should regard the pollution of the places we live in as disgusting," and, "Reducing pollution can help us preserve what is pure and beautiful about the places we live."

And then we had a third group that was assigned to read just a nonpolitical essay. It was just a comparison group so we could get a baseline.

And when we surveyed people about their environmental attitudes afterwards, we found that for liberals, it didn't matter what essay they read. They tended to have highly pro-environmental attitudes regardless. Liberals are on board for environmental protection. Conservatives, however, were significantly more supportive of progressive environmental policies and environmental protection if they had read the moral purity essay than if they had read one of the other two essays.

We even found that conservatives who read the moral purity essay were significantly more likely to say that they believed in global warming and were concerned about global warming, even though this essay didn't even mention global warming. That's just a related environmental issue. But that's how robust this moral reframing effect was.

And we've studied this on a whole slew of different political issues. So if you want to move conservatives on issues like same-sex marriage or national health insurance, it helps to tie these liberal political issues to conservative values like patriotism and moral purity. And we studied it the other way, too. If you want to move liberals to the right on conservative policy issues like military spending and making English the official language of the US, you're going to be more persuasive if you tie those conservative policy issues to liberal moral values like equality and fairness.
All these studies have the same clear message: if you want to persuade someone on some policy, it's helpful to connect that policy to their underlying moral values.

And when you say it like that, it seems really obvious. Right? Like, why did we come here tonight? Why --

(Laughter)

It's incredibly intuitive. And even though it is, it's something we really struggle to do. You know, it turns out that when we go to persuade somebody on a political issue, we talk like we're speaking into a mirror. We don't persuade so much as we rehearse our own reasons for why we believe some sort of political position.

We kept saying, when we were designing these reframed moral arguments, "Empathy and respect, empathy and respect." If you can tap into that, you can connect, and you might be able to persuade somebody in this country.

So thinking again about what movie we're in, maybe I got carried away before. Maybe it's not a zombie apocalypse movie. Maybe instead it's a buddy cop movie.

(Laughter)

Just roll with it, just go with it, please.

(Laughter)

You know the kind: there's a white cop and a black cop, or maybe a messy cop and an organized cop. Whatever it is, they don't get along because of this difference. But in the end, when they have to come together and they cooperate, the solidarity that they feel, it's greater because of that gulf that they had to cross. Right?

And remember that in these movies, it's usually worst in the second act, when our leads are further apart than ever before. And so maybe that's where we are in this country, late in the second act of a buddy cop movie --

(Laughter)

torn apart but about to come back together.

It sounds good, but if we want it to happen, I think the responsibility is going to start with us. So this is my call to you: let's put this country back together.
Let's do it despite the politicians and the media and Facebook and Twitter and congressional redistricting and all of it, all the things that divide us. Let's do it because it's right.

And let's do it because this hate and contempt that flows through all of us every day makes us ugly and it corrupts us, and it threatens the very fabric of our society.

We owe it to one another and our country to reach out and try to connect. We can't afford to hate them any longer, and we can't afford to let them hate us either.

Empathy and respect. Empathy and respect. If you think about it, it's the very least that we owe our fellow citizens.

Thank you.

(Applause)