Chris Anderson: So, Jon, this feels scary.
Jonathan Haidt: Yeah.
CA: It feels like the world is in a place that we haven't seen for a long time. People don't just disagree in the way that we're familiar with, on the left-right political divide. There are much deeper differences afoot. What on earth is going on, and how did we get here?
JH: This is different. There's a much more apocalyptic sort of feeling. Survey research by Pew Research shows that the degree to which we feel that the other side is not just -- we don't just dislike them; we strongly dislike them, and we think that they are a threat to the nation. Those numbers have been going up and up, and those are over 50 percent now on both sides. People are scared, because it feels like this is different than before; it's much more intense. Whenever I look at any sort of social puzzle, I always apply the three basic principles of moral psychology, and I think they'll help us here. So the first thing that you have to always keep in mind when you're thinking about politics is that we're tribal. We evolved for tribalism. One of the simplest and greatest insights into human social nature is the Bedouin proverb: "Me against my brother; me and my brother against our cousin; me and my brother and cousins against the stranger." And that tribalism allowed us to create large societies and to come together in order to compete with others. That brought us out of the jungle and out of small groups, but it means that we have eternal conflict. The question you have to look at is: What aspects of our society are making that more bitter, and what are calming them down?
CA: That's a very dark proverb. You're saying that that's actually baked into most people's mental wiring at some level?
JH: Oh, absolutely. This is just a basic aspect of human social cognition. But we can also live together really peacefully, and we've invented all kinds of fun ways of, like, playing war. I mean, sports, politics -- these are all ways that we get to exercise this tribal nature without actually hurting anyone. We're also really good at trade and exploration and meeting new people.
So you have to see our tribalism as something that goes up or down -- it's not like we're doomed to always be fighting each other, but we'll never have world peace.
CA: The size of that tribe can shrink or expand.
JH: Right.
CA: The size of what we consider "us" and what we consider "other" or "them" can change. And some people believed that process could continue indefinitely.
JH: That's right.
CA: And we were indeed expanding the sense of tribe for a while.
JH: So this is, I think, where we're getting at what's possibly the new left-right distinction. I mean, the left-right as we've all inherited it comes out of the labor versus capital distinction, and the working class, and Marx. But I think what we're seeing now, increasingly, is a divide in all the Western democracies between the people who want to stop at nation, the people who are more parochial -- and I don't mean that in a bad way -- people who have much more of a sense of being rooted, they care about their town, their community and their nation. And then those who are anti-parochial and who -- whenever I get confused, I just think of the John Lennon song "Imagine." "Imagine there's no countries, nothing to kill or die for." And so these are the people who want more global governance, they don't like nation states, they don't like borders. You see this all over Europe as well. There's a great metaphor guy -- actually, his name is Shakespeare -- writing ten years ago in Britain. He had a metaphor: "Are we drawbridge-uppers or drawbridge-downers?" And Britain is divided 52-48 on that point. And America is divided on that point, too.
CA: And so, those of us who grew up with The Beatles and that sort of hippie philosophy of dreaming of a more connected world -- it felt so idealistic and "how could anyone think badly about that?" And what you're saying is that, actually, millions of people today feel that that isn't just silly; it's actually dangerous and wrong, and they're scared of it.
JH: I think the big issue, especially in Europe but also here, is the issue of immigration. And I think this is where we have to look very carefully at the social science about diversity and immigration.
Once something becomes politicized, once it becomes something that the left loves and the right -- then even the social scientists can't think straight about it. Now, diversity is good in a lot of ways. It clearly creates more innovation. The American economy has grown enormously from it. Diversity and immigration do a lot of good things. But what the globalists, I think, don't see, what they don't want to see, is that ethnic diversity cuts social capital and trust. There's a very important study by Robert Putnam, the author of "Bowling Alone," looking at social capital databases. And basically, the more people feel that they are the same, the more they trust each other, the more they can have a redistributionist welfare state. Scandinavian countries are so wonderful because they have this legacy of being small, homogeneous countries. And that leads to a progressive welfare state, a set of progressive left-leaning values, which says, "Drawbridge down! The world is a great place. People in Syria are suffering -- we must welcome them in." And it's a beautiful thing. But if -- and I was in Sweden this summer -- the discourse in Sweden is fairly politically correct and they can't talk about the downsides, you end up bringing a lot of people in. That's going to cut social capital, it makes it hard to have a welfare state, and they might end up, as we have in America, with a visibly racially divided society. So this is all very uncomfortable to talk about. But I think this is the thing, especially in Europe and for us, too, that we need to be looking at.
CA: You're saying that people of reason, people who would consider themselves not racists, but moral, upstanding people, have a rationale that says humans are just too different; that we're in danger of overloading our sense of what humans are capable of, by mixing in people who are too different.
JH: Yes, but I can make it much more palatable by saying it's not necessarily about race. It's about culture. There's wonderful work by a political scientist named Karen Stenner, who shows that when people have a sense that we are all united, we're all the same, there are many people who have a predisposition to authoritarianism.
Those people aren't particularly racist when they feel as though there's not a threat to our social and moral order. But if you prime them experimentally by making them think we're coming apart, that people are getting more different, then they get more racist, homophobic, they want to kick out the deviants. So it's in part that you get an authoritarian reaction. The left, following through the Lennonist line -- the John Lennon line -- does things that create an authoritarian reaction. We're certainly seeing that in America with the alt-right. We saw it in Britain, we've seen it all over Europe. But the more positive part of that is that I think the localists, or the nationalists, are actually right -- that, if you emphasize our cultural similarity, then race doesn't actually matter very much. So an assimilationist approach to immigration removes a lot of these problems. And if you value having a generous welfare state, you've got to emphasize that we're all the same.
CA: OK, so rising immigration and fears about that are one of the causes of the current divide. What are other causes?
JH: The next principle of moral psychology is that intuitions come first, strategic reasoning second. You've probably heard the term "motivated reasoning" or "confirmation bias." There's some really interesting work on how our high intelligence and our verbal abilities might have evolved not to help us find out the truth, but to help us manipulate each other, defend our reputation ... We're really, really good at justifying ourselves. And when you bring group interests into account -- so it's not just me, it's my team versus your team -- then if you're evaluating evidence that your side is wrong, we just can't accept that. So this is why you can't win a political argument. If you're debating something, you can't persuade the person with reasons and evidence, because that's not the way reasoning works. So now, give us the internet, give us Google: "I heard that Barack Obama was born in Kenya. Let me Google that -- oh my God! 10 million hits! Look, he was!"
CA: So this has come as an unpleasant surprise to a lot of people. Social media has often been framed by techno-optimists as this great connecting force that would bring people together. And there have been some unexpected counter-effects to that.
JH: That's right. That's why I'm very enamored of yin-yang views of human nature and left-right -- that each side is right about certain things, but then it goes blind to other things. And so the left generally believes that human nature is good: bring people together, knock down the walls and all will be well. The right -- social conservatives, not libertarians -- social conservatives generally believe people can be greedy and sexual and selfish, and we need regulation, and we need restrictions. So, yeah, if you knock down all the walls, allow people to communicate all over the world, you get a lot of porn and a lot of racism.
CA: So help us understand. These principles of human nature have been with us forever. What's changed that's deepened this feeling of division?
JH: You have to see six to ten different threads all coming together. I'll just list a couple of them. So in America, one of the big -- actually, America and Europe -- one of the biggest ones is World War II. There's interesting research from Joe Henrich and others that says if your country was at war, especially when you were young, then we test you 30 years later in a commons dilemma or a prisoner's dilemma, you're more cooperative. Because of our tribal nature, if you're -- my parents were teenagers during World War II, and they would go out looking for scraps of aluminum to help the war effort. I mean, everybody pulled together. And so then these people go on, they rise up through business and government, they take leadership positions. They're really good at compromise and cooperation. They all retire by the '90s. So we're left with baby boomers by the end of the '90s. And their youth was spent fighting each other within each country, in 1968 and afterwards. The loss of the World War II generation, "The Greatest Generation," is huge. So that's one. Another, in America, is the purification of the two parties. There used to be liberal Republicans and conservative Democrats. So America had a mid-20th century that was really bipartisan. But because of a variety of factors that started things moving, by the '90s, we had a purified liberal party and conservative party.
So now, the people in either party really are different, and we really don't want our children to marry them, which, in the '60s, didn't matter very much. So, the purification of the parties. Third is the internet and, as I said, it's just the most amazing stimulant for post-hoc reasoning and demonization.
CA: The tone of what's happening on the internet now is quite troubling. I just did a quick search on Twitter about the election and saw two tweets next to each other. One, against a picture of racist graffiti: "This is disgusting! Ugliness in this country, brought to us by #Trump." And then the next one is: "Crooked Hillary dedication page. Disgusting!" So this idea of "disgust" is troubling to me. Because you can have an argument or a disagreement about something, you can get angry at someone. Disgust, I've heard you say, takes things to a much deeper level.
JH: That's right. Disgust is different. Anger -- you know, I have kids. They fight 10 times a day, and they love each other 30 times a day. You just go back and forth: you get angry, you're not angry; you're angry, you're not angry. But disgust is different. Disgust paints the person as subhuman, monstrous, deformed, morally deformed. Disgust is like indelible ink. There's research from John Gottman on marital therapy. If you look at the faces -- if one of the couple shows disgust or contempt, that's a predictor that they're going to get divorced soon, whereas if they show anger, that doesn't predict anything, because if you deal with anger well, it actually is good. So this election is different. Donald Trump personally uses the word "disgust" a lot. He's very germ-sensitive, so disgust does matter a lot -- more for him, that's something unique to him -- but as we demonize each other more, and again, through the Manichaean worldview, the idea that the world is a battle between good and evil, as this has been ramping up, we're more likely not just to say they're wrong or I don't like them, but we say they're evil, they're satanic, they're disgusting, they're revolting. And then we want nothing to do with them. And that's why I think we're seeing it, for example, on campus now. We're seeing more of the urge to keep people off campus, silence them, keep them away.
I'm afraid that this whole generation of young people, if their introduction to politics involves a lot of disgust, they're not going to want to be involved in politics as they get older.
CA: So how do we deal with that? Disgust. How do you defuse disgust?
JH: You can't do it with reasons. I think ... I studied disgust for many years, and I think about emotions a lot. And I think that the opposite of disgust is actually love. Love is all about, like ... Disgust is closing off, borders. Love is about dissolving walls. So personal relationships, I think, are probably the most powerful means we have. You can be disgusted by a group of people, but then you meet a particular person and you genuinely discover that they're lovely. And then gradually that chips away or changes your category as well. The tragedy is, Americans used to be much more mixed up in their towns by left-right or politics. And now that it's become this great moral divide, there's a lot of evidence that we're moving to be near people who are like us politically. It's harder to find somebody who's on the other side. So they're over there, they're far away. It's harder to get to know them.
CA: What would you say to someone or say to Americans, people generally, about what we should understand about each other that might help us rethink for a minute this "disgust" instinct?
JH: Yes. A really important thing to keep in mind -- there's research by political scientist Alan Abramowitz, showing that American democracy is increasingly governed by what's called "negative partisanship." That means you think, OK, there's a candidate, you like the candidate, you vote for the candidate. But with the rise of negative advertising and social media and all sorts of other trends, increasingly, the way elections are done is that each side tries to make the other side so horrible, so awful, that you'll vote for my guy by default. And so as we more and more vote against the other side and not for our side, you have to keep in mind that if people are on the left, they think, "Well, I used to think that Republicans were bad, but now Donald Trump proves it. And now every Republican, I can paint with all the things that I think about Trump." And that's not necessarily true.
They're generally not very happy with their candidate. This is the most negative partisanship election in American history. So you have to first separate your feelings about the candidate from your feelings about the people who are given a choice. And then you have to realize that, because we all live in a separate moral world -- the metaphor I use in the book is that we're all trapped in "The Matrix," or each moral community is a matrix, a consensual hallucination. And so if you're within the blue matrix, everything's completely compelling that the other side -- they're troglodytes, they're racists, they're the worst people in the world, and you have all the facts to back that up. But somebody in the next house from yours is living in a different moral matrix. They live in a different video game, and they see a completely different set of facts. And each one sees different threats to the country. And what I've found from being in the middle and trying to understand both sides is: both sides are right. There are a lot of threats to this country, and each side is constitutionally incapable of seeing them all.
CA: So, are you saying that we almost need a new type of empathy? Empathy is traditionally framed as: "Oh, I feel your pain. I can put myself in your shoes." And we apply it to the poor, the needy, the suffering. We don't usually apply it to people who we feel as other, or we're disgusted by.
JH: No. That's right.
CA: What would it look like to build that type of empathy?
JH: Actually, I think ... Empathy is a very, very hot topic in psychology, and it's a very popular word on the left in particular. Empathy is a good thing, and empathy for the preferred classes of victims. So it's important to empathize with the groups that we on the left think are so important. That's easy to do, because you get points for that. But empathy really should get you points if you do it when it's hard to do. And, I think ... You know, we had a long 50-year period of dealing with our race problems and legal discrimination, and that was our top priority for a long time and it still is important. But I think this year, I'm hoping it will make people see that we have an existential threat on our hands. Our left-right divide, I believe, is by far the most important divide we face.
We still have issues about race and gender and LGBT, but this is the urgent need of the next 50 years, and things aren't going to get better on their own. So we're going to need to do a lot of institutional reforms, and we could talk about that, but that's like a whole long, wonky conversation. But I think it starts with people realizing that this is a turning point. And yes, we need a new kind of empathy. We need to realize: this is what our country needs, and this is what you need if you don't want to -- Raise your hand if you want to spend the next four years as angry and worried as you've been for the last year -- raise your hand. So if you want to escape from this, read Buddha, read Jesus, read Marcus Aurelius. They have all kinds of great advice for how to drop the fear, reframe things, stop seeing other people as your enemy. There's a lot of guidance in ancient wisdom for this kind of empathy.
CA: Here's my last question: Personally, what can people do to help heal?
JH: Yeah, it's very hard to just decide to overcome your deepest prejudices. And there's research showing that political prejudices are deeper and stronger than race prejudices in the country now. So I think you have to make an effort -- that's the main thing. Make an effort to actually meet somebody. Everybody has a cousin, a brother-in-law, somebody who's on the other side. So, after this election -- wait a week or two, because it's probably going to feel awful for one of you -- but wait a couple weeks, and then reach out and say you want to talk. And before you do it, read Dale Carnegie, "How to Win Friends and Influence People" --
(Laughter)
I'm totally serious. You'll learn techniques if you start by acknowledging, if you start by saying, "You know, we don't agree on a lot, but one thing I really respect about you, Uncle Bob," or "... about you conservatives, is ..." And you can find something. If you start with some appreciation, it's like magic. This is one of the main things I've learned that I take into my human relationships. I still make lots of stupid mistakes, but I'm incredibly good at apologizing now, and at acknowledging what somebody was right about.
And if you do that, then the conversation goes really well, and it's actually really fun.
CA: Jon, it's absolutely fascinating speaking with you. It really does feel like the ground that we're on is a ground populated by deep questions of morality and human nature. Your wisdom couldn't be more relevant. Thank you so much for sharing this time with us.
JH: Thanks, Chris. Thanks, everyone.
(Applause)