1 00:00:00,936 --> 00:00:03,136 Chris Anderson: So, Jon, this feels scary. 2 00:00:03,160 --> 00:00:04,325 Jonathan Haidt: Yeah. 3 00:00:04,349 --> 00:00:06,339 CA: It feels like the world is in a place 4 00:00:06,363 --> 00:00:08,200 that we haven't seen for a long time. 5 00:00:08,224 --> 00:00:12,785 People don't just disagree in the way that we're familiar with, 6 00:00:12,809 --> 00:00:14,709 on the left-right political divide. 7 00:00:14,733 --> 00:00:17,734 There are much deeper differences afoot. 8 00:00:17,758 --> 00:00:21,024 What on earth is going on, and how did we get here? 9 00:00:21,048 --> 00:00:24,051 JH: This is different. 10 00:00:24,075 --> 00:00:27,302 There's a much more apocalyptic sort of feeling. 11 00:00:27,326 --> 00:00:29,652 Survey research by Pew Research shows 12 00:00:29,676 --> 00:00:33,385 that the degree to which we feel that the other side is not just -- 13 00:00:33,409 --> 00:00:36,383 we don't just dislike them; we strongly dislike them, 14 00:00:36,407 --> 00:00:39,883 and we think that they are a threat to the nation. 15 00:00:39,907 --> 00:00:41,871 Those numbers have been going up and up, 16 00:00:41,895 --> 00:00:44,632 and those are over 50 percent now on both sides. 17 00:00:44,656 --> 00:00:45,807 People are scared, 18 00:00:45,831 --> 00:00:49,458 because it feels like this is different than before; it's much more intense. 19 00:00:49,482 --> 00:00:52,002 Whenever I look at any sort of social puzzle, 20 00:00:52,026 --> 00:00:55,143 I always apply the three basic principles of moral psychology, 21 00:00:55,167 --> 00:00:57,061 and I think they'll help us here. 22 00:00:57,085 --> 00:00:59,818 So the first thing that you have to always keep in mind 23 00:00:59,842 --> 00:01:01,578 when you're thinking about politics 24 00:01:01,602 --> 00:01:02,982 is that we're tribal. 25 00:01:03,006 --> 00:01:04,529 We evolved for tribalism. 26 00:01:04,553 --> 00:01:07,722 One of the simplest and greatest insights into human social nature 27 00:01:07,746 --> 00:01:08,919 is the Bedouin proverb: 28 00:01:08,943 --> 00:01:10,335 "Me against my brother; 29 00:01:10,359 --> 00:01:12,286 me and my brother against our cousin; 30 00:01:12,310 --> 00:01:14,812 me and my brother and cousins against the stranger." 31 00:01:14,836 --> 00:01:19,560 And that tribalism allowed us to create large societies 32 00:01:19,584 --> 00:01:22,616 and to come together in order to compete with others. 33 00:01:22,640 --> 00:01:26,321 That brought us out of the jungle and out of small groups, 34 00:01:26,345 --> 00:01:28,368 but it means that we have eternal conflict. 35 00:01:28,392 --> 00:01:30,133 The question you have to look at is: 36 00:01:30,157 --> 00:01:32,821 What aspects of our society are making that more bitter, 37 00:01:32,845 --> 00:01:34,375 and what are calming them down? 38 00:01:34,399 --> 00:01:35,960 CA: That's a very dark proverb. 39 00:01:35,984 --> 00:01:40,157 You're saying that that's actually baked into most people's mental wiring 40 00:01:40,181 --> 00:01:41,332 at some level? 41 00:01:41,356 --> 00:01:45,160 JH: Oh, absolutely. This is just a basic aspect of human social cognition. 42 00:01:45,184 --> 00:01:47,507 But we can also live together really peacefully, 43 00:01:47,531 --> 00:01:50,926 and we've invented all kinds of fun ways of, like, playing war. 
44 00:01:50,950 --> 00:01:52,332 I mean, sports, politics -- 45 00:01:52,356 --> 00:01:56,051 these are all ways that we get to exercise this tribal nature 46 00:01:56,075 --> 00:01:57,658 without actually hurting anyone. 47 00:01:57,682 --> 00:02:02,027 We're also really good at trade and exploration and meeting new people. 48 00:02:02,051 --> 00:02:05,350 So you have to see our tribalism as something that goes up or down -- 49 00:02:05,374 --> 00:02:08,226 it's not like we're doomed to always be fighting each other, 50 00:02:08,250 --> 00:02:10,100 but we'll never have world peace. 51 00:02:10,980 --> 00:02:14,202 CA: The size of that tribe can shrink or expand. 52 00:02:14,226 --> 00:02:15,377 JH: Right. 53 00:02:15,401 --> 00:02:17,388 CA: The size of what we consider "us" 54 00:02:17,412 --> 00:02:19,833 and what we consider "other" or "them" 55 00:02:19,857 --> 00:02:21,987 can change. 56 00:02:22,610 --> 00:02:28,200 And some people believed that process could continue indefinitely. 57 00:02:28,224 --> 00:02:29,416 JH: That's right. 58 00:02:29,440 --> 00:02:32,646 CA: And we were indeed expanding the sense of tribe for a while. 59 00:02:32,670 --> 00:02:33,831 JH: So this is, I think, 60 00:02:33,855 --> 00:02:37,210 where we're getting at what's possibly the new left-right distinction. 61 00:02:37,234 --> 00:02:39,557 I mean, the left-right as we've all inherited it, 62 00:02:39,581 --> 00:02:42,420 comes out of the labor versus capital distinction, 63 00:02:42,444 --> 00:02:44,819 and the working class, and Marx. 64 00:02:44,843 --> 00:02:47,092 But I think what we're seeing now, increasingly, 65 00:02:47,116 --> 00:02:49,970 is a divide in all the Western democracies 66 00:02:49,994 --> 00:02:53,701 between the people who want to stop at nation, 67 00:02:53,725 --> 00:02:55,475 the people who are more parochial -- 68 00:02:55,499 --> 00:02:57,296 and I don't mean that in a bad way -- 69 00:02:57,320 --> 00:03:00,120 people who have much more of a sense of being rooted, 70 00:03:00,144 --> 00:03:03,337 they care about their town, their community and their nation. 71 00:03:03,824 --> 00:03:07,861 And then those who are anti-parochial and who -- 72 00:03:07,885 --> 00:03:11,285 whenever I get confused, I just think of the John Lennon song "Imagine." 73 00:03:11,309 --> 00:03:14,138 "Imagine there's no countries, nothing to kill or die for." 74 00:03:14,162 --> 00:03:17,375 And so these are the people who want more global governance, 75 00:03:17,399 --> 00:03:20,073 they don't like nation states, they don't like borders. 76 00:03:20,097 --> 00:03:21,922 You see this all over Europe as well. 77 00:03:21,946 --> 00:03:25,172 There's a great metaphor guy -- actually, his name is Shakespeare -- 78 00:03:25,196 --> 00:03:26,777 writing ten years ago in Britain. 79 00:03:26,801 --> 00:03:27,968 He had a metaphor: 80 00:03:27,992 --> 00:03:31,265 "Are we drawbridge-uppers or drawbridge-downers?" 81 00:03:31,289 --> 00:03:34,194 And Britain is divided 52-48 on that point. 82 00:03:34,218 --> 00:03:36,338 And America is divided on that point, too. 83 00:03:37,379 --> 00:03:40,956 CA: And so, those of us who grew up with The Beatles 84 00:03:40,980 --> 00:03:44,554 and that sort of hippie philosophy of dreaming of a more connected world -- 85 00:03:44,578 --> 00:03:48,419 it felt so idealistic and "how could anyone think badly about that?" 
86 00:03:48,736 --> 00:03:50,747 And what you're saying is that, actually, 87 00:03:50,771 --> 00:03:55,073 millions of people today feel that that isn't just silly; 88 00:03:55,097 --> 00:03:57,958 it's actually dangerous and wrong, and they're scared of it. 89 00:03:57,982 --> 00:04:01,008 JH: I think the big issue, especially in Europe but also here, 90 00:04:01,032 --> 00:04:02,399 is the issue of immigration. 91 00:04:02,423 --> 00:04:05,492 And I think this is where we have to look very carefully 92 00:04:05,516 --> 00:04:09,147 at the social science about diversity and immigration. 93 00:04:09,171 --> 00:04:10,885 Once something becomes politicized, 94 00:04:10,909 --> 00:04:13,933 once it becomes something that the left loves and the right -- 95 00:04:13,957 --> 00:04:17,058 then even the social scientists can't think straight about it. 96 00:04:17,082 --> 00:04:19,039 Now, diversity is good in a lot of ways. 97 00:04:19,063 --> 00:04:21,270 It clearly creates more innovation. 98 00:04:21,294 --> 00:04:23,771 The American economy has grown enormously from it. 99 00:04:23,795 --> 00:04:26,177 Diversity and immigration do a lot of good things. 100 00:04:26,201 --> 00:04:28,829 But what the globalists, I think, don't see, 101 00:04:28,853 --> 00:04:30,360 what they don't want to see, 102 00:04:30,384 --> 00:04:36,662 is that ethnic diversity cuts social capital and trust. 103 00:04:36,686 --> 00:04:39,022 There's a very important study by Robert Putnam, 104 00:04:39,046 --> 00:04:40,902 the author of "Bowling Alone," 105 00:04:40,926 --> 00:04:42,781 looking at social capital databases. 106 00:04:42,805 --> 00:04:45,752 And basically, the more people feel that they are the same, 107 00:04:45,776 --> 00:04:47,283 the more they trust each other, 108 00:04:47,307 --> 00:04:50,047 the more they can have a redistributionist welfare state. 109 00:04:50,071 --> 00:04:52,042 Scandinavian countries are so wonderful 110 00:04:52,066 --> 00:04:55,415 because they have this legacy of being small, homogenous countries. 111 00:04:55,439 --> 00:04:59,156 And that leads to a progressive welfare state, 112 00:04:59,180 --> 00:05:01,983 a set of progressive left-leaning values, which says, 113 00:05:02,007 --> 00:05:05,008 "Drawbridge down! The world is a great place. 114 00:05:05,032 --> 00:05:08,064 People in Syria are suffering -- we must welcome them in." 115 00:05:08,088 --> 00:05:09,465 And it's a beautiful thing. 116 00:05:09,934 --> 00:05:12,517 But if, and I was in Sweden this summer, 117 00:05:12,541 --> 00:05:15,724 if the discourse in Sweden is fairly politically correct 118 00:05:15,748 --> 00:05:17,985 and they can't talk about the downsides, 119 00:05:18,009 --> 00:05:19,996 you end up bringing a lot of people in. 120 00:05:20,020 --> 00:05:21,712 That's going to cut social capital, 121 00:05:21,736 --> 00:05:23,681 it makes it hard to have a welfare state 122 00:05:23,705 --> 00:05:26,261 and they might end up, as we have in America, 123 00:05:26,285 --> 00:05:29,898 with a racially divided, visibly racially divided, society. 124 00:05:29,922 --> 00:05:32,182 So this is all very uncomfortable to talk about. 125 00:05:32,206 --> 00:05:35,451 But I think this is the thing, especially in Europe and for us, too, 126 00:05:35,475 --> 00:05:36,668 we need to be looking at. 
127 00:05:36,692 --> 00:05:38,778 CA: You're saying that people of reason, 128 00:05:38,802 --> 00:05:41,290 people who would consider themselves not racists, 129 00:05:41,314 --> 00:05:43,041 but moral, upstanding people, 130 00:05:43,065 --> 00:05:46,011 have a rationale that says humans are just too different; 131 00:05:46,035 --> 00:05:51,474 that we're in danger of overloading our sense of what humans are capable of, 132 00:05:51,498 --> 00:05:53,988 by mixing in people who are too different. 133 00:05:54,012 --> 00:05:57,552 JH: Yes, but I can make it much more palatable 134 00:05:57,576 --> 00:06:00,374 by saying it's not necessarily about race. 135 00:06:00,819 --> 00:06:02,086 It's about culture. 136 00:06:02,110 --> 00:06:06,283 There's wonderful work by a political scientist named Karen Stenner, 137 00:06:06,307 --> 00:06:09,534 who shows that when people have a sense 138 00:06:09,558 --> 00:06:11,793 that we are all united, we're all the same, 139 00:06:11,817 --> 00:06:15,067 there are many people who have a predisposition to authoritarianism. 140 00:06:15,091 --> 00:06:17,133 Those people aren't particularly racist 141 00:06:17,157 --> 00:06:19,324 when they feel as though there's not a threat 142 00:06:19,348 --> 00:06:21,047 to our social and moral order. 143 00:06:21,071 --> 00:06:23,104 But if you prime them experimentally 144 00:06:23,128 --> 00:06:26,260 by thinking we're coming apart, people are getting more different, 145 00:06:26,284 --> 00:06:29,807 then they get more racist, homophobic, they want to kick out the deviants. 146 00:06:29,831 --> 00:06:32,738 So it's in part that you get an authoritarian reaction. 147 00:06:32,762 --> 00:06:35,099 The left, following through the Lennonist line -- 148 00:06:35,123 --> 00:06:36,409 the John Lennon line -- 149 00:06:36,433 --> 00:06:38,842 does things that create an authoritarian reaction. 150 00:06:38,866 --> 00:06:41,661 We're certainly seeing that in America with the alt-right. 151 00:06:41,685 --> 00:06:44,267 We saw it in Britain, we've seen it all over Europe. 152 00:06:44,291 --> 00:06:46,873 But the more positive part of that 153 00:06:46,897 --> 00:06:51,194 is that I think the localists, or the nationalists, are actually right -- 154 00:06:51,218 --> 00:06:55,005 that, if you emphasize our cultural similarity, 155 00:06:55,029 --> 00:06:57,256 then race doesn't actually matter very much. 156 00:06:57,280 --> 00:06:59,988 So an assimilationist approach to immigration 157 00:07:00,012 --> 00:07:01,569 removes a lot of these problems. 158 00:07:01,593 --> 00:07:03,980 And if you value having a generous welfare state, 159 00:07:04,004 --> 00:07:06,477 you've got to emphasize that we're all the same. 160 00:07:06,840 --> 00:07:09,934 CA: OK, so rising immigration and fears about that 161 00:07:09,958 --> 00:07:13,125 are one of the causes of the current divide. 162 00:07:13,149 --> 00:07:14,862 What are other causes? 163 00:07:14,886 --> 00:07:16,907 JH: The next principle of moral psychology 164 00:07:16,931 --> 00:07:20,734 is that intuitions come first, strategic reasoning second. 165 00:07:20,758 --> 00:07:23,238 You've probably heard the term "motivated reasoning" 166 00:07:23,262 --> 00:07:24,869 or "confirmation bias." 
167 00:07:24,893 --> 00:07:26,810 There's some really interesting work 168 00:07:26,834 --> 00:07:29,911 on how our high intelligence and our verbal abilities 169 00:07:29,935 --> 00:07:33,137 might have evolved not to help us find out the truth, 170 00:07:33,161 --> 00:07:36,133 but to help us manipulate each other, defend our reputation ... 171 00:07:36,157 --> 00:07:39,123 We're really, really good at justifying ourselves. 172 00:07:39,147 --> 00:07:41,514 And when you bring group interests into account, 173 00:07:41,538 --> 00:07:44,275 so it's not just me, it's my team versus your team, 174 00:07:44,299 --> 00:07:47,212 whereas if you're evaluating evidence that your side is wrong, 175 00:07:47,236 --> 00:07:48,993 we just can't accept that. 176 00:07:49,017 --> 00:07:51,734 So this is why you can't win a political argument. 177 00:07:51,758 --> 00:07:53,251 If you're debating something, 178 00:07:53,275 --> 00:07:56,322 you can't persuade the person with reasons and evidence, 179 00:07:56,346 --> 00:07:58,691 because that's not the way reasoning works. 180 00:07:58,715 --> 00:08:01,830 So now, give us the internet, give us Google: 181 00:08:02,522 --> 00:08:05,252 "I heard that Barack Obama was born in Kenya. 182 00:08:05,276 --> 00:08:09,224 Let me Google that -- oh my God! 10 million hits! Look, he was!" 183 00:08:09,248 --> 00:08:12,388 CA: So this has come as an unpleasant surprise to a lot of people. 184 00:08:12,412 --> 00:08:15,260 Social media has often been framed by techno-optimists 185 00:08:15,284 --> 00:08:20,531 as this great connecting force that would bring people together. 186 00:08:20,555 --> 00:08:24,477 And there have been some unexpected counter-effects to that. 187 00:08:24,922 --> 00:08:26,073 JH: That's right. 188 00:08:26,097 --> 00:08:28,478 That's why I'm very enamored of yin-yang views 189 00:08:28,502 --> 00:08:30,120 of human nature and left-right -- 190 00:08:30,144 --> 00:08:32,610 that each side is right about certain things, 191 00:08:32,634 --> 00:08:34,755 but then it goes blind to other things. 192 00:08:34,779 --> 00:08:37,757 And so the left generally believes that human nature is good: 193 00:08:37,781 --> 00:08:40,946 bring people together, knock down the walls and all will be well. 194 00:08:40,970 --> 00:08:43,528 The right -- social conservatives, not libertarians -- 195 00:08:43,552 --> 00:08:47,805 social conservatives generally believe people can be greedy 196 00:08:47,829 --> 00:08:49,211 and sexual and selfish, 197 00:08:49,235 --> 00:08:51,655 and we need regulation, and we need restrictions. 198 00:08:52,214 --> 00:08:54,616 So, yeah, if you knock down all the walls, 199 00:08:54,640 --> 00:08:56,869 allow people to communicate all over the world, 200 00:08:56,893 --> 00:08:58,951 you get a lot of porn and a lot of racism. 201 00:08:58,975 --> 00:09:00,236 CA: So help us understand. 202 00:09:00,260 --> 00:09:05,812 These principles of human nature have been with us forever. 203 00:09:06,918 --> 00:09:11,692 What's changed that's deepened this feeling of division? 204 00:09:12,360 --> 00:09:17,016 JH: You have to see six to ten different threads all coming together. 205 00:09:17,373 --> 00:09:19,066 I'll just list a couple of them. 206 00:09:19,398 --> 00:09:23,876 So in America, one of the big -- actually, America and Europe -- 207 00:09:23,900 --> 00:09:25,813 one of the biggest ones is World War II. 
208 00:09:25,837 --> 00:09:28,480 There's interesting research from Joe Henrich and others 209 00:09:28,504 --> 00:09:30,907 that says if your country was at war, 210 00:09:30,931 --> 00:09:32,488 especially when you were young, 211 00:09:32,512 --> 00:09:35,626 then we test you 30 years later in a commons dilemma 212 00:09:35,650 --> 00:09:36,979 or a prisoner's dilemma, 213 00:09:37,003 --> 00:09:38,294 you're more cooperative. 214 00:09:38,983 --> 00:09:41,856 Because of our tribal nature, if you're -- 215 00:09:41,880 --> 00:09:44,797 my parents were teenagers during World War II, 216 00:09:44,821 --> 00:09:47,382 and they would go out looking for scraps of aluminum 217 00:09:47,406 --> 00:09:48,595 to help the war effort. 218 00:09:48,619 --> 00:09:50,797 I mean, everybody pulled together. 219 00:09:50,821 --> 00:09:52,350 And so then these people go on, 220 00:09:52,374 --> 00:09:54,772 they rise up through business and government, 221 00:09:54,796 --> 00:09:56,427 they take leadership positions. 222 00:09:56,451 --> 00:09:59,702 They're really good at compromise and cooperation. 223 00:09:59,726 --> 00:10:01,661 They all retire by the '90s. 224 00:10:01,685 --> 00:10:04,986 So we're left with baby boomers by the end of the '90s. 225 00:10:05,010 --> 00:10:08,977 And their youth was spent fighting each other within each country, 226 00:10:09,001 --> 00:10:10,648 in 1968 and afterwards. 227 00:10:10,672 --> 00:10:14,666 The loss of the World War II generation, "The Greatest Generation," 228 00:10:14,690 --> 00:10:15,974 is huge. 229 00:10:16,567 --> 00:10:17,742 So that's one. 230 00:10:18,440 --> 00:10:21,563 Another, in America, is the purification of the two parties. 231 00:10:21,949 --> 00:10:24,996 There used to be liberal Republicans and conservative Democrats. 232 00:10:25,020 --> 00:10:28,187 So America had a mid-20th century that was really bipartisan. 233 00:10:28,211 --> 00:10:32,578 But because of a variety of factors that started things moving, 234 00:10:32,602 --> 00:10:36,000 by the '90s, we had a purified liberal party and conservative party. 235 00:10:36,024 --> 00:10:38,669 So now, the people in either party really are different, 236 00:10:38,693 --> 00:10:41,176 and we really don't want our children to marry them, 237 00:10:41,200 --> 00:10:43,268 which, in the '60s, didn't matter very much. 238 00:10:43,292 --> 00:10:45,089 So, the purification of the parties. 239 00:10:45,113 --> 00:10:47,585 Third is the internet and, as I said, 240 00:10:47,609 --> 00:10:52,292 it's just the most amazing stimulant for post-hoc reasoning and demonization. 241 00:10:52,316 --> 00:10:57,108 CA: The tone of what's happening on the internet now is quite troubling. 242 00:10:57,132 --> 00:10:59,979 I just did a quick search on Twitter about the election 243 00:11:00,003 --> 00:11:02,923 and saw two tweets next to each other. 244 00:11:03,335 --> 00:11:07,547 One, against a picture of racist graffiti: 245 00:11:08,007 --> 00:11:09,640 "This is disgusting! 246 00:11:09,664 --> 00:11:12,834 Ugliness in this country, brought to us by #Trump." 247 00:11:13,424 --> 00:11:14,806 And then the next one is: 248 00:11:15,339 --> 00:11:18,642 "Crooked Hillary dedication page. Disgusting!" 249 00:11:19,176 --> 00:11:23,383 So this idea of "disgust" is troubling to me. 250 00:11:23,407 --> 00:11:26,642 Because you can have an argument or a disagreement about something, 251 00:11:26,666 --> 00:11:28,227 you can get angry at someone. 
252 00:11:29,094 --> 00:11:32,687 Disgust, I've heard you say, takes things to a much deeper level. 253 00:11:32,711 --> 00:11:34,598 JH: That's right. Disgust is different. 254 00:11:34,622 --> 00:11:36,585 Anger -- you know, I have kids. 255 00:11:36,609 --> 00:11:38,359 They fight 10 times a day, 256 00:11:38,383 --> 00:11:40,350 and they love each other 30 times a day. 257 00:11:40,374 --> 00:11:43,284 You just go back and forth: you get angry, you're not angry; 258 00:11:43,308 --> 00:11:44,831 you're angry, you're not angry. 259 00:11:44,855 --> 00:11:46,165 But disgust is different. 260 00:11:46,189 --> 00:11:50,662 Disgust paints the person as subhuman, monstrous, 261 00:11:50,686 --> 00:11:52,495 deformed, morally deformed. 262 00:11:52,519 --> 00:11:54,943 Disgust is like indelible ink. 263 00:11:55,768 --> 00:11:59,283 There's research from John Gottman on marital therapy. 264 00:11:59,307 --> 00:12:04,432 If you look at the faces -- if one of the couple shows disgust or contempt, 265 00:12:04,456 --> 00:12:07,552 that's a predictor that they're going to get divorced soon, 266 00:12:07,576 --> 00:12:10,475 whereas if they show anger, that doesn't predict anything, 267 00:12:10,499 --> 00:12:13,208 because if you deal with anger well, it actually is good. 268 00:12:13,232 --> 00:12:14,684 So this election is different. 269 00:12:14,708 --> 00:12:18,362 Donald Trump personally uses the word "disgust" a lot. 270 00:12:18,386 --> 00:12:21,243 He's very germ-sensitive, so disgust does matter a lot -- 271 00:12:21,267 --> 00:12:25,177 more for him, that's something unique to him -- 272 00:12:25,201 --> 00:12:28,104 but as we demonize each other more, 273 00:12:28,128 --> 00:12:31,537 and again, through the Manichaean worldview, 274 00:12:31,561 --> 00:12:34,291 the idea that the world is a battle between good and evil 275 00:12:34,315 --> 00:12:35,662 as this has been ramping up, 276 00:12:35,686 --> 00:12:39,012 we're more likely not just to say they're wrong or I don't like them, 277 00:12:39,036 --> 00:12:41,572 but we say they're evil, they're satanic, 278 00:12:41,596 --> 00:12:43,517 they're disgusting, they're revolting. 279 00:12:43,541 --> 00:12:46,407 And then we want nothing to do with them. 280 00:12:46,806 --> 00:12:50,293 And that's why I think we're seeing it, for example, on campus now. 281 00:12:50,317 --> 00:12:52,957 We're seeing more the urge to keep people off campus, 282 00:12:52,981 --> 00:12:54,926 silence them, keep them away. 283 00:12:54,950 --> 00:12:57,545 I'm afraid that this whole generation of young people, 284 00:12:57,569 --> 00:13:01,362 if their introduction to politics involves a lot of disgust, 285 00:13:01,386 --> 00:13:05,091 they're not going to want to be involved in politics as they get older. 286 00:13:05,506 --> 00:13:07,346 CA: So how do we deal with that? 287 00:13:07,370 --> 00:13:11,850 Disgust. How do you defuse disgust? 288 00:13:12,874 --> 00:13:14,822 JH: You can't do it with reasons. 289 00:13:15,312 --> 00:13:16,503 I think ... 290 00:13:18,257 --> 00:13:21,474 I studied disgust for many years, and I think about emotions a lot. 291 00:13:21,498 --> 00:13:25,023 And I think that the opposite of disgust is actually love. 292 00:13:25,764 --> 00:13:28,855 Love is all about, like ... 293 00:13:29,239 --> 00:13:31,810 Disgust is closing off, borders. 294 00:13:31,834 --> 00:13:34,379 Love is about dissolving walls. 
295 00:13:35,074 --> 00:13:37,554 So personal relationships, I think, 296 00:13:37,578 --> 00:13:40,337 are probably the most powerful means we have. 297 00:13:41,291 --> 00:13:43,988 You can be disgusted by a group of people, 298 00:13:44,012 --> 00:13:45,930 but then you meet a particular person 299 00:13:45,954 --> 00:13:48,730 and you genuinely discover that they're lovely. 300 00:13:48,754 --> 00:13:53,050 And then gradually that chips away or changes your category as well. 301 00:13:54,016 --> 00:13:59,993 The tragedy is, Americans used to be much more mixed up in their towns 302 00:14:00,017 --> 00:14:02,151 by left-right or politics. 303 00:14:02,175 --> 00:14:04,506 And now that it's become this great moral divide, 304 00:14:04,530 --> 00:14:07,673 there's a lot of evidence that we're moving to be near people 305 00:14:07,697 --> 00:14:09,209 who are like us politically. 306 00:14:09,233 --> 00:14:11,763 It's harder to find somebody who's on the other side. 307 00:14:11,787 --> 00:14:14,077 So they're over there, they're far away. 308 00:14:14,101 --> 00:14:15,671 It's harder to get to know them. 309 00:14:15,695 --> 00:14:19,919 CA: What would you say to someone or say to Americans, 310 00:14:19,943 --> 00:14:21,101 people generally, 311 00:14:21,125 --> 00:14:23,734 about what we should understand about each other 312 00:14:23,758 --> 00:14:27,233 that might help us rethink for a minute 313 00:14:27,257 --> 00:14:29,460 this "disgust" instinct? 314 00:14:30,086 --> 00:14:31,238 JH: Yes. 315 00:14:31,262 --> 00:14:33,415 A really important thing to keep in mind -- 316 00:14:33,439 --> 00:14:38,155 there's research by political scientist Alan Abramowitz, 317 00:14:38,179 --> 00:14:42,172 showing that American democracy is increasingly governed 318 00:14:42,196 --> 00:14:44,439 by what's called "negative partisanship." 319 00:14:44,875 --> 00:14:47,986 That means you think, OK there's a candidate, 320 00:14:48,010 --> 00:14:50,416 you like the candidate, you vote for the candidate. 321 00:14:50,440 --> 00:14:52,499 But with the rise of negative advertising 322 00:14:52,523 --> 00:14:54,747 and social media and all sorts of other trends, 323 00:14:54,771 --> 00:14:56,812 increasingly, the way elections are done 324 00:14:56,836 --> 00:15:00,817 is that each side tries to make the other side so horrible, so awful, 325 00:15:00,841 --> 00:15:02,882 that you'll vote for my guy by default. 326 00:15:03,319 --> 00:15:06,289 And so as we more and more vote against the other side 327 00:15:06,313 --> 00:15:07,644 and not for our side, 328 00:15:07,668 --> 00:15:13,175 you have to keep in mind that if people are on the left, 329 00:15:13,199 --> 00:15:16,109 they think, "Well, I used to think that Republicans were bad, 330 00:15:16,133 --> 00:15:17,616 but now Donald Trump proves it. 331 00:15:17,640 --> 00:15:20,491 And now every Republican, I can paint with all the things 332 00:15:20,515 --> 00:15:21,897 that I think about Trump." 333 00:15:21,921 --> 00:15:23,514 And that's not necessarily true. 334 00:15:23,538 --> 00:15:26,230 They're generally not very happy with their candidate. 335 00:15:26,254 --> 00:15:30,970 This is the most negative partisanship election in American history. 336 00:15:31,860 --> 00:15:35,223 So you have to first separate your feelings about the candidate 337 00:15:35,247 --> 00:15:38,184 from your feelings about the people who are given a choice. 
338 00:15:38,208 --> 00:15:40,691 And then you have to realize that, 339 00:15:41,246 --> 00:15:43,666 because we all live in a separate moral world -- 340 00:15:43,690 --> 00:15:47,141 the metaphor I use in the book is that we're all trapped in "The Matrix," 341 00:15:47,165 --> 00:15:50,689 or each moral community is a matrix, a consensual hallucination. 342 00:15:50,713 --> 00:15:52,956 And so if you're within the blue matrix, 343 00:15:52,980 --> 00:15:56,174 everything's completely compelling that the other side -- 344 00:15:56,198 --> 00:15:59,829 they're troglodytes, they're racists, they're the worst people in the world, 345 00:15:59,853 --> 00:16:01,957 and you have all the facts to back that up. 346 00:16:01,981 --> 00:16:04,256 But somebody in the next house from yours 347 00:16:04,280 --> 00:16:06,313 is living in a different moral matrix. 348 00:16:06,337 --> 00:16:08,284 They live in a different video game, 349 00:16:08,308 --> 00:16:10,686 and they see a completely different set of facts. 350 00:16:10,710 --> 00:16:13,386 And each one sees different threats to the country. 351 00:16:13,410 --> 00:16:15,500 And what I've found from being in the middle 352 00:16:15,524 --> 00:16:18,451 and trying to understand both sides is: both sides are right. 353 00:16:18,475 --> 00:16:20,595 There are a lot of threats to this country, 354 00:16:20,619 --> 00:16:24,104 and each side is constitutionally incapable of seeing them all. 355 00:16:24,963 --> 00:16:31,482 CA: So, are you saying that we almost need a new type of empathy? 356 00:16:31,506 --> 00:16:33,676 Empathy is traditionally framed as: 357 00:16:33,700 --> 00:16:36,391 "Oh, I feel your pain. I can put myself in your shoes." 358 00:16:36,415 --> 00:16:39,344 And we apply it to the poor, the needy, the suffering. 359 00:16:40,023 --> 00:16:43,846 We don't usually apply it to people who we feel as other, 360 00:16:43,870 --> 00:16:45,335 or we're disgusted by. 361 00:16:45,359 --> 00:16:46,510 JH: No. That's right. 362 00:16:46,534 --> 00:16:51,364 CA: What would it look like to build that type of empathy? 363 00:16:52,268 --> 00:16:53,506 JH: Actually, I think ... 364 00:16:54,145 --> 00:16:56,450 Empathy is a very, very hot topic in psychology, 365 00:16:56,474 --> 00:16:59,132 and it's a very popular word on the left in particular. 366 00:16:59,156 --> 00:17:03,156 Empathy is a good thing, and empathy for the preferred classes of victims. 367 00:17:03,180 --> 00:17:04,633 So it's important to empathize 368 00:17:04,657 --> 00:17:07,481 with the groups that we on the left think are so important. 369 00:17:07,505 --> 00:17:10,036 That's easy to do, because you get points for that. 370 00:17:10,442 --> 00:17:14,091 But empathy really should get you points if you do it when it's hard to do. 371 00:17:14,513 --> 00:17:16,267 And, I think ... 372 00:17:16,291 --> 00:17:21,379 You know, we had a long 50-year period of dealing with our race problems 373 00:17:21,403 --> 00:17:23,658 and legal discrimination, 374 00:17:23,682 --> 00:17:25,869 and that was our top priority for a long time 375 00:17:25,893 --> 00:17:27,143 and it still is important. 376 00:17:27,167 --> 00:17:28,696 But I think this year, 377 00:17:28,720 --> 00:17:31,124 I'm hoping it will make people see 378 00:17:31,148 --> 00:17:33,943 that we have an existential threat on our hands. 379 00:17:33,967 --> 00:17:36,634 Our left-right divide, I believe, 380 00:17:36,658 --> 00:17:38,818 is by far the most important divide we face. 
381 00:17:38,842 --> 00:17:41,873 We still have issues about race and gender and LGBT, 382 00:17:41,897 --> 00:17:45,268 but this is the urgent need of the next 50 years, 383 00:17:45,292 --> 00:17:48,153 and things aren't going to get better on their own. 384 00:17:49,021 --> 00:17:51,856 So we're going to need to do a lot of institutional reforms, 385 00:17:51,880 --> 00:17:53,289 and we could talk about that, 386 00:17:53,313 --> 00:17:55,643 but that's like a whole long, wonky conversation. 387 00:17:55,667 --> 00:17:59,513 But I think it starts with people realizing that this is a turning point. 388 00:17:59,537 --> 00:18:02,346 And yes, we need a new kind of empathy. 389 00:18:02,370 --> 00:18:03,875 We need to realize: 390 00:18:03,899 --> 00:18:05,441 this is what our country needs, 391 00:18:05,465 --> 00:18:07,819 and this is what you need if you don't want to -- 392 00:18:07,843 --> 00:18:10,538 Raise your hand if you want to spend the next four years 393 00:18:10,562 --> 00:18:14,048 as angry and worried as you've been for the last year -- raise your hand. 394 00:18:14,072 --> 00:18:15,777 So if you want to escape from this, 395 00:18:15,801 --> 00:18:17,952 read Buddha, read Jesus, read Marcus Aurelius. 396 00:18:17,976 --> 00:18:23,038 They have all kinds of great advice for how to drop the fear, 397 00:18:23,062 --> 00:18:24,240 reframe things, 398 00:18:24,264 --> 00:18:26,347 stop seeing other people as your enemy. 399 00:18:26,371 --> 00:18:29,678 There's a lot of guidance in ancient wisdom for this kind of empathy. 400 00:18:29,702 --> 00:18:31,079 CA: Here's my last question: 401 00:18:31,103 --> 00:18:35,438 Personally, what can people do to help heal? 402 00:18:35,462 --> 00:18:39,545 JH: Yeah, it's very hard to just decide to overcome your deepest prejudices. 403 00:18:39,569 --> 00:18:41,030 And there's research showing 404 00:18:41,054 --> 00:18:45,403 that political prejudices are deeper and stronger than race prejudices 405 00:18:45,427 --> 00:18:46,687 in the country now. 406 00:18:47,395 --> 00:18:50,827 So I think you have to make an effort -- that's the main thing. 407 00:18:50,851 --> 00:18:52,855 Make an effort to actually meet somebody. 408 00:18:52,879 --> 00:18:55,089 Everybody has a cousin, a brother-in-law, 409 00:18:55,113 --> 00:18:56,982 somebody who's on the other side. 410 00:18:57,006 --> 00:18:58,822 So, after this election -- 411 00:18:59,252 --> 00:19:00,603 wait a week or two, 412 00:19:00,627 --> 00:19:03,463 because it's probably going to feel awful for one of you -- 413 00:19:03,487 --> 00:19:07,639 but wait a couple weeks, and then reach out and say you want to talk. 414 00:19:07,663 --> 00:19:09,087 And before you do it, 415 00:19:09,111 --> 00:19:12,256 read Dale Carnegie, "How to Win Friends and Influence People" -- 416 00:19:12,280 --> 00:19:13,319 (Laughter) 417 00:19:13,343 --> 00:19:14,510 I'm totally serious. 418 00:19:14,534 --> 00:19:17,124 You'll learn techniques if you start by acknowledging, 419 00:19:17,148 --> 00:19:18,309 if you start by saying, 420 00:19:18,333 --> 00:19:20,003 "You know, we don't agree on a lot, 421 00:19:20,027 --> 00:19:22,565 but one thing I really respect about you, Uncle Bob," 422 00:19:22,589 --> 00:19:24,648 or "... about you conservatives, is ... " 423 00:19:24,672 --> 00:19:26,006 And you can find something. 424 00:19:26,030 --> 00:19:28,793 If you start with some appreciation, it's like magic. 
425 00:19:28,817 --> 00:19:30,931 This is one of the main things I've learned 426 00:19:30,955 --> 00:19:32,868 that I take into my human relationships. 427 00:19:32,892 --> 00:19:34,812 I still make lots of stupid mistakes, 428 00:19:34,836 --> 00:19:36,852 but I'm incredibly good at apologizing now, 429 00:19:36,876 --> 00:19:39,293 and at acknowledging what somebody was right about. 430 00:19:39,317 --> 00:19:40,471 And if you do that, 431 00:19:40,495 --> 00:19:43,989 then the conversation goes really well, and it's actually really fun. 432 00:19:44,717 --> 00:19:47,362 CA: Jon, it's absolutely fascinating speaking with you. 433 00:19:47,386 --> 00:19:51,144 It really does feel like the ground that we're on 434 00:19:51,168 --> 00:19:56,035 is a ground populated by deep questions of morality and human nature. 435 00:19:56,366 --> 00:19:58,790 Your wisdom couldn't be more relevant. 436 00:19:58,814 --> 00:20:01,109 Thank you so much for sharing this time with us. 437 00:20:01,133 --> 00:20:02,285 JH: Thanks, Chris. 438 00:20:02,309 --> 00:20:03,470 JH: Thanks, everyone. 439 00:20:03,494 --> 00:20:05,494 (Applause)