0:00:00.864,0:00:03.160 Chris Anderson: So, Jon, this feels scary. 0:00:03.160,0:00:04.186 Jonathan Haidt: Yeah. 0:00:04.186,0:00:07.956 CA: It feels like the world is in a place[br]that we haven't seen for a long time. 0:00:08.224,0:00:12.790 People don't just disagree[br]in the way that we're familiar with -- 0:00:12.790,0:00:14.656 on the left-right political divide. 0:00:14.656,0:00:17.632 There are much deeper differences afoot. 0:00:17.810,0:00:20.906 What on earth is going on[br]and how did we get here? 0:00:21.200,0:00:24.256 JH: This is different. 0:00:24.256,0:00:27.326 There's a much more[br]apocalyptic sort of feeling. 0:00:27.326,0:00:29.816 Survey research by Pew Research shows 0:00:29.816,0:00:33.556 that the degree to which we feel[br]that the other side is not just -- 0:00:33.556,0:00:34.956 we don't just dislike them, 0:00:34.956,0:00:36.590 we strongly dislike them, 0:00:36.590,0:00:39.906 and we think that they are[br]a threat to the nation. 0:00:39.906,0:00:44.926 Those numbers have been going up and up,[br]and are over 50 percent now on both sides. 0:00:44.926,0:00:48.210 People are scared because it feels[br]like this is different than before. 0:00:48.210,0:00:49.336 It's much more intense. 0:00:49.336,0:00:52.026 Whenever I look[br]at any sort of social puzzle, 0:00:52.026,0:00:55.166 I always just apply the three[br]basic principles of moral psychology, 0:00:55.166,0:00:57.546 and I think they'll help us here. 0:00:57.546,0:01:00.136 So the first thing that you[br]have to always keep in mind 0:01:00.136,0:01:01.895 when you're thinking about politics 0:01:01.895,0:01:03.236 is that we're tribal. 0:01:03.236,0:01:04.646 We evolved for tribalism.
0:01:04.646,0:01:07.746 One of the simplest and greatest[br]insights into human social nature 0:01:07.746,0:01:08.996 is the Bedouin proverb: 0:01:08.996,0:01:10.446 me against my brother, 0:01:10.446,0:01:12.396 me and my brother against our cousin, 0:01:12.396,0:01:14.986 me and my brother and cousins[br]against the stranger. 0:01:14.986,0:01:19.656 And so that tribalism[br]allowed us to create large societies 0:01:19.656,0:01:22.696 and to come together[br]in order to compete with others. 0:01:22.696,0:01:26.466 That brought us out of the jungle[br]and out of small groups, 0:01:26.466,0:01:28.476 but it means that we[br]have eternal conflict. 0:01:28.476,0:01:30.246 And the question you have to look at 0:01:30.246,0:01:33.106 is what aspects of our society[br]are making that more bitter, 0:01:33.106,0:01:34.596 and what are calming them down. 0:01:34.596,0:01:36.106 CA: That's a very dark proverb. 0:01:36.106,0:01:37.726 You're saying that that's actually 0:01:37.726,0:01:41.302 baked into most people's[br]mental wiring at some level. 0:01:41.302,0:01:42.496 JH: Oh yeah, absolutely. 0:01:42.496,0:01:45.056 This is just a basic aspect[br]of human social cognition. 0:01:45.056,0:01:47.706 But we can also live[br]together really peacefully, 0:01:47.706,0:01:50.916 and we've invented all kinds[br]of fun ways of, like, playing war. 0:01:50.916,0:01:52.356 I mean, sports, politics -- 0:01:52.356,0:01:56.206 these are all ways that we get[br]to exercise this tribal nature 0:01:56.206,0:01:58.026 without actually hurting anyone. 0:01:58.026,0:02:02.156 We're actually also very good at trade[br]and exploration and meeting new people. 0:02:02.156,0:02:05.306 So you have to see our tribalism[br]as something that goes up or down. 0:02:05.306,0:02:08.125 It's not like we're doomed[br]to always be fighting each other, 0:02:08.125,0:02:10.330 but we'll never have world peace. 0:02:10.980,0:02:14.376 CA: The size of that tribe[br]can shrink or expand.
0:02:14.376,0:02:15.336 JH: Right. 0:02:15.336,0:02:17.612 CA: The size of what we consider us, 0:02:17.612,0:02:20.056 and what we consider other or them 0:02:20.056,0:02:22.306 can change. 0:02:22.730,0:02:25.066 And some people believe 0:02:25.066,0:02:28.216 that that process[br]could continue indefinitely. 0:02:28.224,0:02:29.440 JH: That's right. 0:02:29.440,0:02:32.746 CA: And that we were indeed expanding[br]the sense of tribe for a while. 0:02:32.746,0:02:33.936 JH: So this is, I think, 0:02:33.936,0:02:37.412 where we're getting at what's possibly[br]the new left-right distinction. 0:02:37.412,0:02:39.716 I mean, the left-right[br]as we've all inherited it, 0:02:39.716,0:02:42.578 comes out of the, you know,[br]labor versus capital distinction, 0:02:42.578,0:02:44.976 and the working class, Marx -- 0:02:44.976,0:02:47.116 but I think what we're seeing[br]now increasingly 0:02:47.116,0:02:50.096 is a divide in all the Western democracies 0:02:50.096,0:02:53.826 between the people[br]who want to stop at nation, 0:02:53.826,0:02:55.576 the people who are more parochial -- 0:02:55.576,0:02:57.386 I don't mean that in a bad way -- 0:02:57.386,0:03:00.476 people who have much more[br]of a sense of being rooted, 0:03:00.476,0:03:03.540 they care about their town,[br]their community and their nation. 0:03:03.824,0:03:07.020 And then those who are antiparochial 0:03:07.020,0:03:08.136 and who -- 0:03:08.136,0:03:11.511 whenever I get confused, I just think[br]of the John Lennon song "Imagine." 0:03:11.511,0:03:12.971 "Imagine there's no countries. 0:03:12.971,0:03:14.400 Nothing to kill or die for." 0:03:14.400,0:03:17.636 And so these are the people[br]who want more global governance, 0:03:17.636,0:03:19.096 they don't like nation states, 0:03:19.096,0:03:20.240 they don't like borders. 0:03:20.240,0:03:22.246 You see this all over Europe as well. 
0:03:22.246,0:03:23.866 There's a great metaphor guy -- 0:03:23.866,0:03:25.530 actually his name is Shakespeare -- 0:03:25.530,0:03:27.134 writing ten years ago in Britain. 0:03:27.134,0:03:28.160 He had a metaphor: 0:03:28.160,0:03:31.456 "Are we drawbridge-uppers[br]or drawbridge-downers?" 0:03:31.456,0:03:34.306 And Britain is divided[br]52-48 on that point. 0:03:34.306,0:03:36.426 And America is divided on that point, too. 0:03:37.380,0:03:41.140 CA: And so those of us[br]who grew up with The Beatles 0:03:41.140,0:03:42.906 and that sort of hippie philosophy 0:03:42.906,0:03:44.870 of dreaming of a more connected world -- 0:03:44.870,0:03:46.270 and it felt so idealistic -- 0:03:46.270,0:03:48.810 and how could anyone[br]think badly about that? 0:03:48.810,0:03:50.760 What you're saying is that actually 0:03:50.760,0:03:55.206 millions of people today[br]feel that that isn't just silly, 0:03:55.206,0:03:58.016 it's actually dangerous and wrong,[br]and they're scared of it. 0:03:58.016,0:04:00.346 JH: I think the big issue,[br]especially in Europe, 0:04:00.346,0:04:01.348 but it's also here, 0:04:01.348,0:04:02.694 is the issue of immigration. 0:04:02.694,0:04:05.620 And I think this is where[br]we have to look very carefully 0:04:05.620,0:04:09.136 at the social science[br]about diversity and immigration. 0:04:09.274,0:04:10.961 Once something becomes politicized, 0:04:10.961,0:04:14.336 once it becomes something[br]that the left loves, and the right -- 0:04:14.336,0:04:17.336 then even the social scientists[br]can't think straight about it. 0:04:17.336,0:04:19.316 Now, diversity is good in a lot of ways. 0:04:19.316,0:04:21.546 It clearly creates more innovation, 0:04:21.546,0:04:24.046 the American economy[br]has grown enormously from it. 
0:04:24.046,0:04:26.376 Diversity and immigration[br]do a lot of good things, 0:04:26.376,0:04:29.136 but what the globalists,[br]I think, don't see, 0:04:29.136,0:04:30.666 what they don't want to see, 0:04:30.666,0:04:36.676 is that ethnic diversity[br]cuts social capital and trust. 0:04:36.806,0:04:39.046 There's a very important[br]study by Robert Putnam, 0:04:39.046,0:04:40.926 the author of "Bowling Alone," 0:04:40.926,0:04:43.046 looking at social capital databases. 0:04:43.046,0:04:46.016 And basically, the more people[br]feel that they are the same, 0:04:46.016,0:04:47.546 the more they trust each other, 0:04:47.546,0:04:50.256 the more they can have[br]a redistributionist welfare state. 0:04:50.256,0:04:52.250 Scandinavian countries are so wonderful 0:04:52.250,0:04:55.676 because they have this legacy[br]of being small, homogenous countries. 0:04:55.676,0:04:59.416 And that leads to a set[br]of progressive welfare state -- 0:04:59.416,0:05:01.496 a set of progressive left-leaning values, 0:05:01.496,0:05:03.636 which says, "Drawbridge down! 0:05:03.636,0:05:05.246 The world is a great place. 0:05:05.246,0:05:06.906 People in Syria are suffering, 0:05:06.906,0:05:08.300 we must welcome them in." 0:05:08.300,0:05:09.720 And it's a beautiful thing. 0:05:09.990,0:05:11.286 But if -- 0:05:11.286,0:05:12.930 and I was in Sweden this summer -- 0:05:12.930,0:05:16.136 if the discourse in Sweden[br]is fairly politically correct, 0:05:16.136,0:05:18.216 and they can't talk about the downsides, 0:05:18.216,0:05:20.226 you end up bringing a lot of people in, 0:05:20.226,0:05:22.066 that's going to cut social capital, 0:05:22.066,0:05:23.970 it makes it hard to have a welfare state 0:05:23.970,0:05:25.480 and they might end up, 0:05:25.480,0:05:26.600 as we have in America, 0:05:26.600,0:05:28.040 with a racially divided -- 0:05:28.040,0:05:29.970 visibly racially divided society. 0:05:29.970,0:05:32.206 So this is all very[br]uncomfortable to talk about. 
0:05:32.206,0:05:33.776 But I think this is the thing -- 0:05:33.776,0:05:35.660 especially in Europe and for us, too -- 0:05:35.660,0:05:36.876 we need to be looking at. 0:05:36.876,0:05:38.985 CA: You're saying that people of reason, 0:05:38.985,0:05:41.496 people who would[br]consider themselves not racists, 0:05:41.496,0:05:43.246 but moral, upstanding people, 0:05:43.246,0:05:44.576 have a rationale that says, 0:05:44.576,0:05:47.126 "Look, humans are just too different." 0:05:47.126,0:05:51.736 We're in danger of overloading[br]our sense of what humans are capable of 0:05:51.736,0:05:53.916 by mixing in people who are too different. 0:05:53.916,0:05:57.906 JH: Yes, but I can make it[br]much more palatable 0:05:57.906,0:06:01.066 by saying it's not necessarily about race. 0:06:01.066,0:06:02.356 It's about culture. 0:06:02.356,0:06:04.196 And so there's wonderful work 0:06:04.196,0:06:06.526 by a political scientist[br]named Karen Stenner, 0:06:06.526,0:06:09.776 who shows that when people have a sense 0:06:09.776,0:06:12.126 that we are all united,[br]we're all the same, 0:06:12.126,0:06:15.321 there are many people who have[br]a predisposition to authoritarianism. 0:06:15.321,0:06:17.386 Those people aren't particularly racist 0:06:17.386,0:06:19.586 when they feel as though[br]there's not a threat 0:06:19.586,0:06:21.270 to our social and moral order. 0:06:21.270,0:06:23.326 But if you prime them experimentally 0:06:23.326,0:06:24.836 by thinking we're coming apart, 0:06:24.836,0:06:26.480 people are getting more different, 0:06:26.480,0:06:30.026 then they get more racist, homophobic,[br]they want to kick out the deviants. 0:06:30.026,0:06:32.956 So it's in part that you get[br]an authoritarian reaction. 0:06:32.956,0:06:35.316 The left, following through[br]the Lennonist line -- 0:06:35.316,0:06:36.720 the John Lennon line -- 0:06:36.720,0:06:39.086 does things that create[br]an authoritarian reaction.
0:06:39.086,0:06:41.976 So we're certainly seeing that[br]in America with the Alt-Right. 0:06:41.976,0:06:42.976 We saw it in Britain, 0:06:42.976,0:06:44.530 we've seen it all over Europe. 0:06:44.530,0:06:47.216 But the more positive part of that 0:06:47.216,0:06:51.536 is that I think the localists,[br]or the nationalists are actually right. 0:06:51.536,0:06:55.236 That if you emphasize[br]our cultural similarity, 0:06:55.236,0:06:57.486 then race doesn't[br]actually matter very much. 0:06:57.486,0:07:00.116 So an assimilationist[br]approach to immigration 0:07:00.116,0:07:01.696 removes a lot of these problems. 0:07:01.696,0:07:04.306 And if you value having[br]a generous welfare state, 0:07:04.306,0:07:06.550 you've got to emphasize[br]that we're all the same. 0:07:06.840,0:07:10.096 CA: OK, so rising immigration[br]and fears about that 0:07:10.096,0:07:13.286 are one of the causes[br]of the current divide. 0:07:13.286,0:07:15.196 What are other causes? 0:07:15.196,0:07:17.240 JH: The next principle of moral psychology 0:07:17.240,0:07:20.926 is that intuitions come first,[br]strategic reasoning second. 0:07:20.926,0:07:23.556 And you've probably heard[br]the term "motivated reasoning" 0:07:23.556,0:07:25.186 or "confirmation bias." 0:07:25.186,0:07:27.126 So there's some really interesting work 0:07:27.126,0:07:30.226 on how our high intelligence[br]and our verbal abilities 0:07:30.226,0:07:33.406 might have evolved[br]not to help us find out the truth, 0:07:33.406,0:07:35.206 but to help us manipulate each other, 0:07:35.206,0:07:36.400 defend our reputation. 0:07:36.400,0:07:39.346 We're really, really good[br]at justifying ourselves. 0:07:39.346,0:07:41.736 And when you bring[br]group interests into account, 0:07:41.736,0:07:42.776 so it's not just me, 0:07:42.776,0:07:44.590 it's like my team versus your team, 0:07:44.590,0:07:47.526 whereas if you're evaluating evidence[br]that your side is wrong, 0:07:47.526,0:07:49.306 we just can't accept that. 
0:07:49.306,0:07:52.046 And so this is why you can't[br]win a political argument. 0:07:52.046,0:07:53.466 If you're debating something, 0:07:53.466,0:07:56.536 you can't persuade the person[br]with reasons and evidence 0:07:56.536,0:07:58.786 because that's not[br]the way reasoning works. 0:07:58.786,0:08:00.846 And so, now give us the Internet. 0:08:00.846,0:08:02.096 Give us Google. 0:08:02.096,0:08:05.276 You know, "I heard that Barack Obama[br]was born in Kenya, 0:08:05.276,0:08:06.360 let me Google that -- 0:08:06.360,0:08:09.376 oh my God! 10 million hits! Look, he was!" 0:08:09.376,0:08:12.476 CA: So this has come as an unpleasant[br]surprise to a lot of people. 0:08:12.476,0:08:15.546 Social media has often been framed[br]by techno-optimists 0:08:15.546,0:08:20.726 as this great connecting force[br]that would bring people together. 0:08:21.150,0:08:25.096 And there have been some[br]unexpected counter effects to that. 0:08:25.096,0:08:27.426 JH: That's right, and that's why[br]I'm very enamored 0:08:27.426,0:08:30.246 of sort of yin-yang views[br]of human nature and left-right. 0:08:30.246,0:08:32.855 That each side is right[br]about certain things, 0:08:32.855,0:08:34.915 but then it goes blind to other things. 0:08:34.915,0:08:37.916 And so the left generally believes[br]that human nature is good, 0:08:37.916,0:08:38.966 bring people together, 0:08:38.966,0:08:40.970 knock down the walls and all will be well. 0:08:40.970,0:08:41.970 The right -- 0:08:41.970,0:08:43.930 social conservatives, not libertarians -- 0:08:43.930,0:08:45.946 social conservatives generally believe 0:08:45.946,0:08:49.346 people can be greedy[br]and sexual and selfish, 0:08:49.346,0:08:51.766 and we need regulation,[br]and we need restrictions. 0:08:52.634,0:08:54.640 So yeah, if you knock down[br]all the walls -- 0:08:54.640,0:08:56.936 allow people to communicate[br]all over the world -- 0:08:56.936,0:08:59.006 you get a lot of porn and a lot of racism.
0:08:59.006,0:09:00.290 CA: So help us understand. 0:09:00.290,0:09:05.720 These principles of human nature[br]have been with us forever. 0:09:07.030,0:09:11.710 What's changed that's deepened[br]this feeling of division? 0:09:12.360,0:09:17.016 JH: You have to see six to 10 different[br]threads all coming together. 0:09:17.564,0:09:19.470 I'll just list a couple of them. 0:09:19.470,0:09:22.740 So in America one of the big -- 0:09:22.740,0:09:24.170 actually America and Europe -- 0:09:24.170,0:09:26.076 one of the biggest ones is World War II. 0:09:26.076,0:09:28.706 There's interesting research[br]from Joe Henrich and others 0:09:28.706,0:09:31.106 showing that if your country was at war, 0:09:31.106,0:09:32.686 especially when you were young, 0:09:32.686,0:09:35.756 then when we test you 30 years later[br]in a commons dilemma, 0:09:35.756,0:09:37.246 or a prisoner's dilemma, 0:09:37.246,0:09:39.126 you're more cooperative. 0:09:39.126,0:09:42.166 Because of our tribal nature, if you're -- 0:09:42.166,0:09:45.106 you know, my parents were[br]teenagers during World War II 0:09:45.106,0:09:47.536 and they would go out[br]looking for scraps of aluminum 0:09:47.536,0:09:48.852 to help the war effort. 0:09:48.852,0:09:50.976 I mean, everybody pulled together. 0:09:50.976,0:09:52.616 And so then these people go on, 0:09:52.616,0:09:54.796 they rise up through business[br]and government, 0:09:54.796,0:09:56.696 they take leadership positions. 0:09:56.696,0:09:59.766 They're really good[br]at compromise and cooperation. 0:09:59.766,0:10:01.922 They all retire by the '90s. 0:10:01.922,0:10:05.246 So we're left with baby boomers[br]by the end of the '90s. 0:10:05.246,0:10:09.236 And their youth was spent fighting[br]each other within each country; 0:10:09.236,0:10:10.906 1968 and afterwards. 0:10:10.906,0:10:13.616 So the loss of the[br]World War II generation, 0:10:13.616,0:10:14.836 the greatest generation, 0:10:14.836,0:10:16.120 is huge. 0:10:16.774,0:10:18.440 So that's one.
0:10:18.440,0:10:22.196 Another in America is the[br]purification of the two parties. 0:10:22.196,0:10:25.266 There used to be liberal Republicans[br]and conservative Democrats. 0:10:25.266,0:10:28.456 So America had a mid-20th century[br]that was really bipartisan. 0:10:28.456,0:10:32.846 But because of a variety of factors[br]that started things moving, 0:10:32.846,0:10:36.176 by the '90s we had a purified[br]liberal party and conservative party. 0:10:36.176,0:10:38.946 And so now the people[br]in either party really are different. 0:10:38.946,0:10:41.596 And now we really don't want[br]our children to marry them, 0:10:41.596,0:10:43.616 which in the '60s didn't matter very much. 0:10:43.616,0:10:45.436 So the purification of the parties. 0:10:45.436,0:10:46.726 Third is the Internet. 0:10:46.726,0:10:47.930 And as I said, 0:10:47.930,0:10:52.636 it's just the most amazing stimulant[br]for post hoc reasoning and demonization. 0:10:52.636,0:10:57.616 CA: The tone of what's happening[br]on the Internet now is quite troubling. 0:10:57.616,0:11:00.486 I just did a quick search[br]on Twitter about the election 0:11:00.486,0:11:03.496 and saw two tweets next to each other. 0:11:03.724,0:11:07.960 One against a picture[br]of a sort of racist graffiti. 0:11:07.960,0:11:09.876 "This is disgusting, 0:11:09.876,0:11:13.176 Ugliness in this country,[br]brought to us by #Trump." 0:11:13.424,0:11:15.480 And then the next one is: 0:11:15.480,0:11:19.406 "Crooked Hillary[br]dedication page. Disgusting!" 0:11:19.406,0:11:23.636 So this idea of disgust[br]is troubling to me. 0:11:23.636,0:11:26.780 Because you can have an argument[br]or a disagreement about something, 0:11:26.780,0:11:28.501 you can get angry at someone. 0:11:29.254,0:11:32.870 Disgust, I've heard you say,[br]takes things to a much deeper level. 0:11:32.870,0:11:34.820 JH: That's right. Disgust is different.
0:11:34.820,0:11:36.160 Anger, you know -- 0:11:36.160,0:11:37.160 I have kids, 0:11:37.160,0:11:40.296 they fight 10 times a day[br]and they love each other 30 times a day. 0:11:40.296,0:11:41.616 You just go back and forth. 0:11:41.616,0:11:43.156 You get angry, you're not angry. 0:11:43.156,0:11:44.690 You're angry, you're not angry. 0:11:44.690,0:11:46.346 But disgust is different. 0:11:46.346,0:11:50.796 Disgust paints the person[br]as being subhuman, monstrous, 0:11:50.796,0:11:51.806 deformed -- 0:11:51.806,0:11:52.906 morally deformed. 0:11:52.906,0:11:56.076 Disgust is like indelible ink. 0:11:56.076,0:11:59.506 There's research from John Gottman[br]on marital therapy. 0:11:59.506,0:12:01.546 If you look at the faces, 0:12:01.546,0:12:04.766 if one of the couple[br]shows disgust or contempt, 0:12:04.766,0:12:07.756 that's a predictor that they're[br]going to get divorced soon. 0:12:07.756,0:12:10.856 Whereas if they show anger[br]that actually doesn't predict anything. 0:12:10.856,0:12:13.476 Because if you deal with anger well[br]it actually is good. 0:12:13.476,0:12:14.929 So this election is different. 0:12:14.929,0:12:18.606 Donald Trump personally[br]uses the word "disgust" a lot. 0:12:18.606,0:12:21.486 He's very germ-sensitive,[br]so disgust does matter a lot -- 0:12:21.486,0:12:22.936 more for him, 0:12:22.936,0:12:25.410 that is something unique to him, 0:12:25.410,0:12:28.336 but as we demonize each other more, 0:12:28.336,0:12:31.736 and again, through[br]the Manichaean worldview, 0:12:31.736,0:12:34.446 the idea that the world[br]is a battle between good and evil, 0:12:34.446,0:12:35.786 as this has been ramping up, 0:12:35.786,0:12:39.036 we're more likely not just to say[br]they're wrong or I don't like them, 0:12:39.036,0:12:41.596 but we say they're evil, they're satanic, 0:12:41.596,0:12:43.856 they're disgusting, they're revolting. 0:12:43.856,0:12:46.926 And then we want nothing to do with them.
0:12:46.926,0:12:48.736 And that's why I think we see -- 0:12:48.736,0:12:50.645 we're seeing it, for example, on campus. 0:12:50.645,0:12:53.306 Now we're seeing more[br]the urge to keep people off campus, 0:12:53.306,0:12:55.096 silence them, keep them away. 0:12:55.096,0:12:57.676 I'm afraid that this whole[br]generation of young people, 0:12:57.676,0:13:01.492 if their introduction to politics[br]involves a lot of disgust, 0:13:01.492,0:13:03.926 they're not going to want[br]to be involved in politics 0:13:03.926,0:13:05.506 as they get older. 0:13:05.506,0:13:07.630 CA: So how do we deal with that? 0:13:07.630,0:13:12.110 Disgust. How do you defuse disgust? 0:13:13.041,0:13:14.989 JH: You can't do it with reasons. 0:13:15.312,0:13:18.034 I think -- 0:13:18.034,0:13:20.311 I studied disgust for many years 0:13:20.311,0:13:22.062 and I think about emotions a lot, 0:13:22.062,0:13:25.587 and I think that the opposite[br]of disgust is actually love. 0:13:26.058,0:13:29.407 Love is all about like -- 0:13:29.407,0:13:31.160 disgust is closing off -- 0:13:31.160,0:13:32.317 borders. 0:13:32.317,0:13:34.774 Love is about dissolving walls. 0:13:35.074,0:13:37.930 And so personal relationships I think 0:13:37.930,0:13:40.689 are probably the most[br]powerful means we have. 0:13:41.561,0:13:44.281 You can be disgusted by a group of people, 0:13:44.281,0:13:46.222 but then you meet a particular person 0:13:46.222,0:13:48.482 and you genuinely discover[br]that they're lovely. 0:13:48.976,0:13:53.200 And then gradually that chips away[br]or changes your category as well. 0:13:54.231,0:14:00.231 The tragedy is that Americans used to be[br]much more mixed up in their towns 0:14:00.231,0:14:02.310 by left-right or politics, 0:14:02.310,0:14:04.606 and now that it's become[br]this great moral divide, 0:14:04.606,0:14:05.838 there's a lot of evidence 0:14:05.838,0:14:09.233 that we're moving to be near people[br]who are like us politically.
0:14:09.233,0:14:11.787 It's harder to find somebody[br]who's on the other side. 0:14:11.787,0:14:13.237 So they're over there, 0:14:13.237,0:14:14.241 they're far away. 0:14:14.241,0:14:15.788 It's harder to get to know them. 0:14:15.788,0:14:18.118 CA: What would you say to someone, 0:14:18.118,0:14:19.943 or say to Americans, 0:14:19.943,0:14:20.944 people generally, 0:14:20.944,0:14:23.842 about what should we[br]understand about each other 0:14:23.842,0:14:27.421 that might help us rethink for a minute 0:14:27.421,0:14:29.842 this disgust instinct. 0:14:30.277,0:14:31.385 JH: Yes, 0:14:31.385,0:14:34.263 a really important[br]thing to keep in mind -- 0:14:34.263,0:14:38.413 there's research by political[br]scientist Alan Abramowitz 0:14:38.413,0:14:42.429 showing that American democracy[br]is increasingly governed 0:14:42.429,0:14:45.115 by what's called negative partisanship. 0:14:45.115,0:14:48.146 That means you think like, "OK[br]there's a candidate, 0:14:48.146,0:14:49.279 you like the candidate, 0:14:49.279,0:14:50.610 you vote for the candidate." 0:14:50.610,0:14:52.625 But with the rise of negative advertising, 0:14:52.625,0:14:53.559 and social media 0:14:53.559,0:14:55.231 and all sorts of other trends, 0:14:55.231,0:14:57.133 increasingly the way elections are done 0:14:57.133,0:14:59.902 is that each side tries to make[br]the other side so horrible, 0:14:59.902,0:15:01.031 so awful, 0:15:01.031,0:15:03.279 that you'll vote for my guy by default. 0:15:03.510,0:15:06.503 And so as we more and more vote[br]against the other side 0:15:06.503,0:15:08.038 and not for our side, 0:15:08.038,0:15:09.489 you have to keep in mind 0:15:09.489,0:15:13.518 that if people are on the left, 0:15:13.518,0:15:16.334 they think well, "I used to think[br]that Republicans were bad, 0:15:16.334,0:15:17.840 but now Donald Trump proves it. 0:15:17.840,0:15:19.652 And now every Republican 0:15:19.652,0:15:22.375 I can paint with all the things[br]that I think about Trump." 
0:15:22.375,0:15:23.991 And that's not necessarily true. 0:15:23.991,0:15:26.548 They're generally not very happy[br]with their candidate. 0:15:26.548,0:15:31.264 This is the most negative partisanship[br]election in American history. 0:15:31.955,0:15:35.578 So you have to first separate your[br]feelings about the candidate 0:15:35.578,0:15:38.538 from your feelings about the people[br]who are given a choice, 0:15:38.538,0:15:40.538 and then you have to realize 0:15:40.538,0:15:44.012 that because we all live[br]in a separate moral world -- 0:15:44.012,0:15:47.482 the metaphor I use in the book[br]is that we're all trapped in "The Matrix," 0:15:47.482,0:15:49.231 or each moral community is a matrix, 0:15:49.231,0:15:51.028 a consensual hallucination. 0:15:51.028,0:15:53.294 And so if you're within the blue matrix, 0:15:53.294,0:15:56.613 everything's completely compelling[br]that the other side, 0:15:56.613,0:16:00.222 they're troglodytes, they're racists,[br]they're the worst people in the world, 0:16:00.222,0:16:02.349 and you have all[br]the facts to back that up. 0:16:02.349,0:16:06.460 But somebody in the next house from yours[br]is living in a different moral matrix. 0:16:06.460,0:16:08.500 They live in a different video game, 0:16:08.500,0:16:10.798 and they see a completely[br]different set of facts. 0:16:10.798,0:16:13.410 And each one sees different[br]threats to the country. 0:16:13.410,0:16:15.524 And what I've found[br]from being in the middle, 0:16:15.524,0:16:17.519 and from trying to understand both sides, 0:16:17.519,0:16:18.696 is both sides are right. 0:16:18.696,0:16:20.763 There are a lot of threats to this country 0:16:20.763,0:16:24.248 and each side is constitutionally[br]incapable of seeing them all. 0:16:24.963,0:16:27.042 CA: So are you almost saying 0:16:27.044,0:16:31.309 that we almost need a new type of empathy? 
0:16:31.705,0:16:35.202 Empathy is traditionally framed as[br]"Oh I feel your pain, 0:16:35.202,0:16:36.906 I can put myself in your shoes," 0:16:36.906,0:16:39.697 and we apply it to the poor,[br]the needy, the suffering. 0:16:40.349,0:16:44.195 We don't usually apply it [br]to people who we feel as other, 0:16:44.195,0:16:45.970 or we're disgusted by. 0:16:46.248,0:16:51.427 What would it look like[br]to build that type of empathy? 0:16:52.491,0:16:54.439 JH: So I think -- 0:16:54.439,0:16:56.710 empathy is a very very[br]hot topic in psychology, 0:16:56.710,0:16:59.289 and it's a very popular word[br]on the left in particular, 0:16:59.289,0:17:00.439 empathy's a good thing. 0:17:00.439,0:17:03.581 And empathy for the preferred[br]classes of victims. 0:17:03.581,0:17:05.047 So it's important to empathize 0:17:05.047,0:17:07.805 with the groups that we on the left[br]think are so important. 0:17:07.805,0:17:10.209 That's easy to do because[br]you get points for that. 0:17:10.840,0:17:14.513 But empathy really should get you points[br]if you do it when it's hard to do. 0:17:14.513,0:17:16.593 And I think, 0:17:16.593,0:17:21.618 we had a long 50-year period[br]of dealing with our race problems 0:17:21.618,0:17:23.681 and legal discrimination, 0:17:23.681,0:17:25.806 and that was our top[br]priority for a long time 0:17:25.806,0:17:27.571 and it still is important, 0:17:27.571,0:17:29.063 but I think this year, 0:17:29.063,0:17:31.490 I'm hoping it will make people see 0:17:31.490,0:17:34.308 that we have an existential[br]threat on our hands. 0:17:34.308,0:17:35.971 Our left-right divide, 0:17:35.971,0:17:36.972 I believe, 0:17:36.972,0:17:39.112 is by far the most important[br]divide we face. 0:17:39.112,0:17:42.120 We still have issues about race[br]and gender and LGBT, 0:17:42.120,0:17:45.668 but this is the urgent need[br]of the next 50 years, 0:17:45.668,0:17:48.529 and things aren't going[br]to get better on their own. 
0:17:49.021,0:17:51.880 So we're going to need to do[br]a lot of institutional reforms, 0:17:51.880,0:17:53.313 and we could talk about that, 0:17:53.313,0:17:55.756 but that's like a whole long,[br]wonky conversation. 0:17:55.756,0:17:59.850 But I think it starts with people[br]realizing that this is a turning point. 0:17:59.850,0:18:02.656 And yes, we need a new kind of empathy. 0:18:02.656,0:18:04.035 We need to realize: 0:18:04.035,0:18:05.537 this is what our country needs, 0:18:05.537,0:18:08.072 and this is what you need. 0:18:08.072,0:18:10.714 Raise your hand if you want[br]to spend the next four years 0:18:10.714,0:18:13.255 as angry and worried[br]as you've been for the last year. 0:18:13.255,0:18:14.255 Raise your hand. 0:18:14.255,0:18:15.924 So if you want to escape from this, 0:18:15.924,0:18:18.201 read Buddha, read Jesus,[br]read Marcus Aurelius, 0:18:18.201,0:18:23.234 I mean they have all kinds of great advice[br]for how to drop the fear, 0:18:23.234,0:18:24.489 reframe things, 0:18:24.489,0:18:26.467 stop seeing other people as your enemy. 0:18:26.467,0:18:29.903 So there's a lot of guidance and ancient[br]wisdom for this kind of empathy. 0:18:29.903,0:18:31.263 CA: Here's my last question. 0:18:31.263,0:18:35.883 Personally, what can[br]people do to help heal? 0:18:35.883,0:18:39.768 JH: Yeah, it's very hard to just decide[br]to overcome your deepest prejudices. 0:18:39.768,0:18:41.252 And there's research showing 0:18:41.252,0:18:45.531 that political prejudices are deeper[br]and stronger than race prejudices 0:18:45.531,0:18:46.974 in the country now. 0:18:47.498,0:18:49.797 So I think you have to make an effort. 0:18:49.797,0:18:50.995 That's the main thing. 0:18:50.995,0:18:53.000 Make an effort to actually[br]meet somebody -- 0:18:53.000,0:18:54.473 everybody has a cousin, 0:18:54.473,0:18:55.482 a brother-in-law, 0:18:55.482,0:18:57.181 somebody who's on the other side.
0:18:57.181,0:18:59.582 So after this election, 0:18:59.582,0:19:03.181 wait a week or two because it's probably[br]going to feel awful for one of you, 0:19:03.181,0:19:04.319 but wait a couple weeks 0:19:04.319,0:19:07.781 and then reach out and say[br]that you want to talk. 0:19:07.781,0:19:09.401 And before you do it, 0:19:09.401,0:19:12.403 read Dale Carnegie, "How to Win[br]Friends and Influence People --" 0:19:12.403,0:19:13.403 (Laughter) 0:19:13.403,0:19:14.517 I'm totally serious. 0:19:14.517,0:19:17.166 You'll learn techniques[br]if you start by acknowledging, 0:19:17.166,0:19:18.410 if you start by saying, 0:19:18.410,0:19:20.027 "You know we don't agree on a lot, 0:19:20.027,0:19:22.493 but one thing I really[br]respect about you, Uncle Bob, 0:19:22.493,0:19:24.672 or about you conservatives is ... " 0:19:24.672,0:19:26.112 And you can find something. 0:19:26.112,0:19:29.164 If you start with some[br]appreciation it's like magic. 0:19:29.164,0:19:31.222 This is one of the main[br]things I've learned, 0:19:31.222,0:19:33.218 that I've taken to my human relationships. 0:19:33.218,0:19:35.001 I still make lots of stupid mistakes, 0:19:35.001,0:19:37.040 but I'm incredibly good[br]at apologizing now, 0:19:37.040,0:19:39.480 and at acknowledging what[br]somebody was right about. 0:19:39.480,0:19:42.281 And if you do that then[br]the conversation goes really well, 0:19:42.281,0:19:44.136 and it's actually really fun. 0:19:44.797,0:19:47.386 CA: Jon, it's absolutely fascinating[br]speaking with you. 0:19:47.386,0:19:51.168 It really does feel like[br]the ground that we're on 0:19:51.168,0:19:56.035 is a ground populated by deep questions[br]of morality and human nature. 0:19:56.454,0:19:59.086 Your wisdom couldn't be more relevant. 0:19:59.086,0:20:01.404 Thank you so much for sharing[br]this time with us. 0:20:01.404,0:20:02.403 (Applause) 0:20:02.403,0:20:03.402 JH: Thank you.