So you probably have the sense, as most people do, that polarization is getting worse in our country, that the divide between the left and the right is as bad as it's been in really any of our lifetimes. But you might also reasonably wonder if research backs up your intuition. And in a nutshell, the answer is sadly yes.

(Laughter)

In study after study, we find that liberals and conservatives have grown further apart. They increasingly wall themselves off in these ideological silos, consuming different news, talking only to like-minded others and more and more choosing to live in different parts of the country.

And I think most alarming of all is this rising animosity on both sides. Liberals and conservatives, Democrats and Republicans, more and more they just don't like one another.

You see it in many different ways. They don't want to befriend one another. They don't want to date one another. If they do, if they find out, they find each other less attractive, and they more and more don't want their children to marry someone who supports the other party, a particularly shocking statistic.

You know, in my lab, when the students I work with and I are talking about some social pattern -- I'm a movie buff, and so I'm often like, what kind of movie are we in here with this pattern? So what kind of movie are we in with political polarization?

Well, it could be a disaster movie. It certainly seems like a disaster. Could be a war movie. Also fits. But what I keep thinking is that we're in a zombie apocalypse movie.

(Laughter)

Right? You know the kind. There's people wandering around in packs, not thinking for themselves, seized by this mob mentality, trying to spread their disease and destroy society.

And if you're like me, and you're a college-educated liberal -- and statistically, I'm guessing, the majority of you ...

(Laughter)

are exactly that.
(Laughter)

And you probably think, as I do, that you're the good guy in the zombie apocalypse movie, and all this hate and polarization, it's being propagated by the other people, the conservatives.

We're Brad Pitt, right? Free-thinking, righteous, just trying to hold on to what we hold dear, you know, not foot soldiers in the army of the undead. Not that. Never that.

But here's the thing: what movie do you suppose they think they're in? Right? Well, they absolutely think that they're the good guys in the zombie apocalypse movie. Right? And you'd better believe that they think that they're Brad Pitt and that we, we are the zombies.

And who's to say that they're wrong? Look, they click on stupid internet links that say stuff like that ... We click on stupid internet links that say stuff like this.

(Laughter)

They complain about living near us, having to work with us, even eating Thanksgiving dinner with us. And we do all those same things. Right?

Look, it's true. The studies that I see on polarization show that conservatives look a little bit worse. They look a little bit angrier, a little more averse to compromise. And we could tell ourselves that means that this is not our problem. That it's them doing it.

But I think that would be taking the easy way out. I think that the truth is that we're all a part of this. And the good side of that is that we can be a part of the solution.

So what are we going to do? What can we do to chip away at polarization in everyday life? What could we do to connect with and communicate with our political counterparts?

Well, these were exactly the questions that I and my colleague Matt Feinberg became fascinated with a few years ago, and we started doing research on this topic.

And one of the first things that we discovered that I think is really helpful for understanding polarization is to understand that the political divide in our country is undergirded by a deeper moral divide.
So one of the most robust findings in the history of political psychology is this pattern identified by the psychologists Jon Haidt and Jesse Graham: liberals and conservatives tend to endorse different values to different degrees.

So for example, we find that liberals tend to endorse values like equality and fairness and care and protection from harm more than conservatives do, and conservatives tend to endorse values like loyalty, patriotism, respect for authority and moral purity more than liberals do.

And Matt and I were thinking that maybe this moral divide might be helpful for understanding how it is that liberals and conservatives talk to one another and why they so often seem to talk past one another when they do.

So we conducted a study where we recruited liberals and asked them to write a persuasive essay in support of same-sex marriage that would be compelling to a conservative. And what we found was that liberals tended to make arguments in terms of the liberal moral values of equality and fairness. So they said things like "everyone should have the right to love whoever they choose," and they -- "they" being gay Americans -- "deserve the same equal rights as other Americans."

Overall, we found that 69 percent of liberals invoked one of the more liberal moral values in constructing their essay, and only nine percent invoked one of the more conservative moral values, even though they were supposed to be trying to persuade conservatives.

And when we studied conservatives and had them make persuasive arguments in support of making English the official language of the US, a classically conservative political position, we found that they weren't much better at this. 59 percent of them made arguments in terms of one of the more conservative moral values, and just eight percent invoked a liberal moral value, even though they were supposed to be targeting liberals for persuasion.

Now, you can see right away why we're in trouble here. Right? People's moral values, they're their most deeply held beliefs. People are willing to fight and die for their values.
Why are they going to give that up just to agree with you on something that they don't particularly want to agree with you on anyway? If that persuasive appeal that you're making to your Republican uncle means that he doesn't just have to change his view, he's got to change his underlying values too, that's not going to go very far.

So what would work better?

Well, we believe it's a technique that we call moral reframing, and we've studied it in a series of experiments. In one of these experiments, we recruited liberals and conservatives to a study where they read one of three essays before having their environmental attitudes surveyed.

And the first of these essays was a relatively conventional proenvironmental essay that invoked the liberal values of care and protection from harm. It said things like "in many important ways we are causing real harm to the places we live in," and "it is essential that we take steps now to prevent further destruction from being done to our Earth."

Another group of participants were assigned to read a really different essay that was designed to tap into the conservative value of moral purity. It was a proenvironmental essay as well, and it said things like "keeping our forests, drinking water, and skies pure is of vital importance," "we should regard the pollution of the places we live in to be disgusting," and "reducing pollution can help us preserve what is pure and beautiful about the places we live."

And then we had a third group that were assigned to read just a nonpolitical essay. It was just a comparison group so we could get a baseline.

And what we found when we surveyed people about their environmental attitudes afterwards, we found that liberals, it didn't matter what essay they read. They tended to have highly proenvironmental attitudes regardless. Liberals are on board for environmental protection. Conservatives, however, were significantly more supportive of progressive environmental policies and environmental protection if they had read the moral purity essay than if they read one of the other two essays.
We even found that conservatives who read the moral purity essay were significantly more likely to say that they believed in global warming and were concerned about global warming, even though this essay didn't even mention global warming. That's just a related environmental issue. But that's how robust this moral reframing effect was.

And we've studied this on a whole slew of different political issues. So if you want to move conservatives on issues like same-sex marriage or national health insurance, it helps to tie these liberal political issues to conservative values like patriotism and moral purity.

And we studied it the other way too. If you want to move liberals to the right on conservative policy issues like military spending and making English the official language of the US, you're going to be more persuasive if you tie those conservative policy issues to liberal moral values like equality and fairness.

All these studies have the same clear message: if you want to persuade someone on some policy, it's helpful to connect that policy to their underlying moral values.

And when you say it like that, it seems really obvious. Right? Like, why did we come here tonight? Why --

(Laughter)

It's incredibly intuitive.

And even though it is, it's something we really struggle to do. You know, it turns out that when we go to persuade somebody on a political issue, we talk like we're speaking into a mirror. We don't persuade so much as we rehearse our own reasons for why we believe some sort of political position.

But, speaking as a liberal, I believe that we're going to need a whole new set of arguments if we're going to move the next wave of people on critical issues like climate change, immigration and inequality. And to come up with those arguments, we're going to have to take the time to really listen to our conservative counterparts, to understand what they value, and then creatively think about why they should come to agree with us, and in a way that doesn't involve them having to sacrifice the things that they hold most dear.
We kept saying, when we were designing these reframed moral arguments, "empathy and respect, empathy and respect." If you can tap into that, you can connect and you might be able to persuade somebody in this country.

So thinking, again, about what movie we're in, maybe I got carried away before. Maybe it's not a zombie apocalypse movie. Maybe instead it's a buddy cop movie.

(Laughter)

Just roll with it, just go with it please.

(Laughter)

You know the kind: there's a white cop and a black cop, or maybe a messy cop and an organized cop. Whatever it is, they don't get along because of this difference. But in the end, when they have to come together and they cooperate, the solidarity that they feel, it's greater because of that gulf that they had to cross. Right?

And remember that in these movies, it's usually worst in the second act, when our leads are further apart than ever before. And so maybe that's where we are in this country, late in the second act of a buddy cop movie --

(Laughter)

torn apart but about to come back together.

It sounds good, but if we want it to happen, I think the responsibility is going to start with us. So this is my call to you: let's put this country back together.

Let's do it despite the politicians and the media and Facebook and Twitter and Congressional redistricting and all of it, all the things that divide us. Let's do it because it's right.

And let's do it because this hate and contempt that flows through all of us every day makes us ugly and it corrupts us, and it threatens the very fabric of our society.

We owe it to one another and our country to reach out and try to connect. We can't afford to hate them any longer, and we can't afford to let them hate us either.

Empathy and respect. Empathy and respect. If you think about it, it's the very least that we owe our fellow citizens.

Thank you.

(Applause)