WEBVTT 00:00:00.888 --> 00:00:04.642 Chris Anderson: Hello. Welcome to this TED Dialogue. 00:00:04.666 --> 00:00:08.253 It's the first of a series that's going to be done 00:00:08.277 --> 00:00:11.589 in response to the current political upheaval. 00:00:12.265 --> 00:00:13.434 I don't know about you; 00:00:13.458 --> 00:00:17.069 I've become quite concerned about the growing divisiveness in this country 00:00:17.093 --> 00:00:18.589 and in the world. 00:00:18.613 --> 00:00:20.973 No one's listening to each other. Right? 00:00:21.564 --> 00:00:22.727 They aren't. 00:00:22.751 --> 00:00:26.487 I mean, it feels like we need a different kind of conversation, 00:00:26.511 --> 00:00:32.439 one that's based on -- I don't know, on reason, listening, on understanding, 00:00:32.463 --> 00:00:34.130 on a broader context. NOTE Paragraph 00:00:34.840 --> 00:00:37.867 That's at least what we're going to try in these TED Dialogues, 00:00:37.891 --> 00:00:39.083 starting today. 00:00:39.107 --> 00:00:41.733 And we couldn't have anyone with us 00:00:41.757 --> 00:00:44.566 who I'd be more excited to kick this off. 00:00:44.590 --> 00:00:48.440 This is a mind right here that thinks pretty much like no one else 00:00:48.464 --> 00:00:50.235 on the planet, I would hasten to say. 00:00:50.259 --> 00:00:51.418 I'm serious. NOTE Paragraph 00:00:51.442 --> 00:00:52.704 (Yuval Noah Harari laughs) 00:00:52.728 --> 00:00:53.892 I'm serious. 00:00:53.916 --> 00:00:58.854 He synthesizes history with underlying ideas 00:00:58.878 --> 00:01:01.048 in a way that kind of takes your breath away. NOTE Paragraph 00:01:01.072 --> 00:01:04.481 So, some of you will know this book, "Sapiens." 00:01:04.505 --> 00:01:06.248 Has anyone here read "Sapiens"? NOTE Paragraph 00:01:06.272 --> 00:01:07.484 (Applause) 00:01:07.508 --> 00:01:10.653 I mean, I could not put it down. 00:01:10.677 --> 00:01:14.551 The way that he tells the story of mankind 00:01:14.575 --> 00:01:18.377 through big ideas that really make you think differently -- 00:01:18.401 --> 00:01:20.136 it's kind of amazing. 00:01:20.160 --> 00:01:21.384 And here's the follow-up, 00:01:21.408 --> 00:01:24.484 which I think is being published in the US next week. NOTE Paragraph 00:01:24.508 --> 00:01:25.659 YNH: Yeah, next week. NOTE Paragraph 00:01:25.683 --> 00:01:26.857 CA: "Homo Deus." 00:01:26.881 --> 00:01:29.946 Now, this is the history of the next hundred years. 00:01:30.521 --> 00:01:32.371 I've had a chance to read it. 00:01:32.395 --> 00:01:34.785 It's extremely dramatic, 00:01:34.809 --> 00:01:39.074 and I daresay, for some people, quite alarming. 00:01:39.605 --> 00:01:40.848 It's a must-read. 00:01:40.872 --> 00:01:46.450 And honestly, we couldn't have someone better to help 00:01:46.474 --> 00:01:50.450 make sense of what on Earth is happening in the world right now. 00:01:50.474 --> 00:01:54.631 So a warm welcome, please, to Yuval Noah Harari. NOTE Paragraph 00:01:54.655 --> 00:01:58.048 (Applause) NOTE Paragraph 00:02:02.900 --> 00:02:06.645 It's great to be joined by our friends on Facebook and around the Web. 00:02:06.669 --> 00:02:08.246 Hello, Facebook. 00:02:08.270 --> 00:02:12.196 And all of you, as I start asking questions of Yuval, 00:02:12.220 --> 00:02:13.872 come up with your own questions, 00:02:13.896 --> 00:02:16.543 and not necessarily about the political scandal du jour, 00:02:16.567 --> 00:02:21.336 but about the broader understanding of: Where are we heading? 00:02:22.517 --> 00:02:24.283 You ready? OK, we're going to go. 
NOTE Paragraph 00:02:24.307 --> 00:02:25.582 So here we are, Yuval: 00:02:25.606 --> 00:02:29.326 New York City, 2017, there's a new president in power, 00:02:29.350 --> 00:02:32.471 and shock waves rippling around the world. 00:02:32.495 --> 00:02:33.964 What on Earth is happening? NOTE Paragraph 00:02:35.115 --> 00:02:37.361 YNH: I think the basic thing that happened 00:02:37.385 --> 00:02:39.675 is that we have lost our story. 00:02:40.144 --> 00:02:42.611 Humans think in stories, 00:02:42.635 --> 00:02:46.297 and we try to make sense of the world by telling stories. 00:02:46.321 --> 00:02:47.738 And for the last few decades, 00:02:47.762 --> 00:02:50.632 we had a very simple and very attractive story 00:02:50.656 --> 00:02:52.405 about what's happening in the world. 00:02:52.429 --> 00:02:55.593 And the story said that, oh, what's happening is 00:02:55.617 --> 00:02:58.233 that the economy is being globalized, 00:02:58.257 --> 00:03:00.400 politics is being liberalized, 00:03:00.424 --> 00:03:04.423 and the combination of the two will create paradise on Earth, 00:03:04.447 --> 00:03:07.546 and we just need to keep on globalizing the economy 00:03:07.570 --> 00:03:09.381 and liberalizing the political system, 00:03:09.405 --> 00:03:11.310 and everything will be wonderful. 00:03:11.334 --> 00:03:14.066 And 2016 is the moment 00:03:14.090 --> 00:03:18.050 when a very large segment, even of the Western world, 00:03:18.074 --> 00:03:20.480 stopped believing in this story. 00:03:20.504 --> 00:03:22.636 For good or bad reasons -- it doesn't matter. 00:03:22.660 --> 00:03:24.881 People stopped believing in the story, 00:03:24.905 --> 00:03:28.681 and when you don't have a story, you don't understand what's happening. NOTE Paragraph 00:03:29.212 --> 00:03:33.179 CA: Part of you believes that that story was actually a very effective story. 00:03:33.203 --> 00:03:34.397 It worked. NOTE Paragraph 00:03:34.421 --> 00:03:35.898 YNH: To some extent, yes. 00:03:35.922 --> 00:03:37.984 According to some measurements, 00:03:38.008 --> 00:03:40.593 we are now in the best time ever 00:03:40.617 --> 00:03:42.044 for humankind. 00:03:42.068 --> 00:03:44.508 Today, for the first time in history, 00:03:44.532 --> 00:03:48.945 more people die from eating too much than from eating too little, 00:03:48.969 --> 00:03:50.741 which is an amazing achievement. NOTE Paragraph 00:03:50.765 --> 00:03:53.448 (Laughter) NOTE Paragraph 00:03:53.472 --> 00:03:55.173 Also for the first time in history, 00:03:55.197 --> 00:03:59.332 more people die from old age than from infectious diseases, 00:03:59.356 --> 00:04:02.150 and violence is also down. 00:04:02.174 --> 00:04:03.604 For the first time in history, 00:04:03.628 --> 00:04:08.959 more people commit suicide than are killed by crime and terrorism 00:04:08.983 --> 00:04:10.823 and war put together. 00:04:10.847 --> 00:04:15.007 Statistically, you are your own worst enemy. 00:04:15.031 --> 00:04:17.037 At least, of all the people in the world, 00:04:17.061 --> 00:04:20.183 you are most likely to be killed by yourself -- NOTE Paragraph 00:04:20.207 --> 00:04:21.474 (Laughter) NOTE Paragraph 00:04:21.498 --> 00:04:24.542 which is, again, very good news, compared -- NOTE Paragraph 00:04:24.566 --> 00:04:26.190 (Laughter) NOTE Paragraph 00:04:26.214 --> 00:04:30.515 compared to the level of violence that we saw in previous eras. 
NOTE Paragraph 00:04:30.539 --> 00:04:32.775 CA: But this process of connecting the world 00:04:32.799 --> 00:04:36.698 ended up with a large group of people kind of feeling left out, 00:04:36.722 --> 00:04:38.353 and they've reacted. 00:04:38.377 --> 00:04:40.363 And so we have this bombshell 00:04:40.387 --> 00:04:42.698 that's sort of ripping through the whole system. 00:04:42.722 --> 00:04:46.114 I mean, what do you make of what's happened? 00:04:46.138 --> 00:04:49.416 It feels like the old way that people thought of politics, 00:04:49.440 --> 00:04:52.296 the left-right divide, has been blown up and replaced. 00:04:52.320 --> 00:04:53.917 How should we think of this? NOTE Paragraph 00:04:53.941 --> 00:04:58.325 YNH: Yeah, the old 20th-century political model of left versus right 00:04:58.349 --> 00:05:00.056 is now largely irrelevant, 00:05:00.080 --> 00:05:04.582 and the real divide today is between global and national, 00:05:04.606 --> 00:05:06.395 global or local. 00:05:06.419 --> 00:05:09.276 And you see it again all over the world 00:05:09.300 --> 00:05:11.445 that this is now the main struggle. 00:05:11.469 --> 00:05:14.905 We probably need completely new political models 00:05:14.929 --> 00:05:19.989 and completely new ways of thinking about politics. 00:05:20.499 --> 00:05:26.029 In essence, what you can say is that we now have global ecology, 00:05:26.053 --> 00:05:29.994 we have a global economy but we have national politics, 00:05:30.018 --> 00:05:31.755 and this doesn't work together. 00:05:31.779 --> 00:05:34.005 This makes the political system ineffective, 00:05:34.029 --> 00:05:37.553 because it has no control over the forces that shape our life. 00:05:37.577 --> 00:05:40.856 And you have basically two solutions to this imbalance: 00:05:40.880 --> 00:05:45.454 either de-globalize the economy and turn it back into a national economy, 00:05:45.478 --> 00:05:47.646 or globalize the political system. NOTE Paragraph 00:05:48.948 --> 00:05:53.613 CA: So some, I guess many, liberals out there 00:05:53.637 --> 00:06:00.073 view Trump and his government as kind of irredeemably bad, 00:06:00.097 --> 00:06:02.756 just awful in every way. 00:06:02.780 --> 00:06:09.203 Do you see any underlying narrative or political philosophy in there 00:06:09.227 --> 00:06:11.031 that is at least worth understanding? 00:06:11.055 --> 00:06:13.100 How would you articulate that philosophy? 00:06:13.124 --> 00:06:15.423 Is it just the philosophy of nationalism? NOTE Paragraph 00:06:16.254 --> 00:06:21.654 YNH: I think the underlying feeling or idea 00:06:21.678 --> 00:06:26.170 is that the political system -- something is broken there. 00:06:26.194 --> 00:06:29.964 It doesn't empower the ordinary person anymore. 00:06:29.988 --> 00:06:33.488 It doesn't care so much about the ordinary person anymore, 00:06:33.512 --> 00:06:38.331 and I think this diagnosis of the political disease is correct. 00:06:38.355 --> 00:06:41.714 With regard to the answers, I am far less certain. NOTE Paragraph 00:06:41.738 --> 00:06:45.221 I think what we are seeing is the immediate human reaction: 00:06:45.245 --> 00:06:47.803 if something doesn't work, let's go back. 00:06:47.827 --> 00:06:49.448 And you see it all over the world, 00:06:49.472 --> 00:06:53.898 that people, almost nobody in the political system today, 00:06:53.922 --> 00:06:58.115 has any future-oriented vision of where humankind is going. 
00:06:58.139 --> 00:07:01.165 Almost everywhere, you see retrograde vision: 00:07:01.189 --> 00:07:03.235 "Let's make America great again," 00:07:03.259 --> 00:07:06.688 like it was great -- I don't know -- in the '50s, in the '80s, sometime, 00:07:06.712 --> 00:07:07.882 let's go back there. 00:07:07.906 --> 00:07:12.627 And you go to Russia a hundred years after Lenin, 00:07:12.651 --> 00:07:14.522 Putin's vision for the future 00:07:14.546 --> 00:07:17.743 is basically, ah, let's go back to the Tsarist empire. 00:07:17.767 --> 00:07:20.159 And in Israel, where I come from, 00:07:20.183 --> 00:07:23.435 the hottest political vision of the present is: 00:07:23.459 --> 00:07:25.310 "Let's build the temple again." 00:07:25.334 --> 00:07:28.312 So let's go back 2,000 years backwards. 00:07:28.336 --> 00:07:33.162 So people are thinking sometime in the past we've lost it, 00:07:33.186 --> 00:07:36.924 and sometimes in the past, it's like you've lost your way in the city, 00:07:36.948 --> 00:07:40.098 and you say OK, let's go back to the point where I felt secure 00:07:40.122 --> 00:07:41.484 and start again. 00:07:41.508 --> 00:07:43.081 I don't think this can work, 00:07:43.105 --> 00:07:46.007 but a lot of people, this is their gut instinct. NOTE Paragraph 00:07:46.031 --> 00:07:47.683 CA: But why couldn't it work? 00:07:47.707 --> 00:07:51.346 "America First" is a very appealing slogan in many ways. 00:07:51.370 --> 00:07:55.247 Patriotism is, in many ways, a very noble thing. 00:07:55.271 --> 00:07:58.012 It's played a role in promoting cooperation 00:07:58.036 --> 00:07:59.626 among large numbers of people. 00:07:59.650 --> 00:08:03.532 Why couldn't you have a world organized in countries, 00:08:03.556 --> 00:08:06.333 all of which put themselves first? NOTE Paragraph 00:08:07.373 --> 00:08:10.692 YNH: For many centuries, even thousands of years, 00:08:10.716 --> 00:08:13.478 patriotism worked quite well. 00:08:13.502 --> 00:08:15.381 Of course, it led to wars and so forth, 00:08:15.405 --> 00:08:18.360 but we shouldn't focus too much on the bad. 00:08:18.384 --> 00:08:21.926 There are also many, many positive things about patriotism, 00:08:21.950 --> 00:08:25.724 and the ability to have a large number of people 00:08:25.748 --> 00:08:27.162 care about each other, 00:08:27.186 --> 00:08:28.792 sympathize with one another, 00:08:28.816 --> 00:08:32.030 and come together for collective action. 00:08:32.054 --> 00:08:34.749 If you go back to the first nations, 00:08:34.773 --> 00:08:36.598 so, thousands of years ago, 00:08:36.622 --> 00:08:40.028 the people who lived along the Yellow River in China -- 00:08:40.052 --> 00:08:42.495 it was many, many different tribes 00:08:42.519 --> 00:08:46.889 and they all depended on the river for survival and for prosperity, 00:08:46.913 --> 00:08:51.222 but all of them also suffered from periodical floods 00:08:51.246 --> 00:08:52.846 and periodical droughts. 00:08:52.870 --> 00:08:55.905 And no tribe could really do anything about it, 00:08:55.929 --> 00:09:00.089 because each of them controlled just a tiny section of the river. 
NOTE Paragraph 00:09:00.113 --> 00:09:02.877 And then in a long and complicated process, 00:09:02.901 --> 00:09:06.825 the tribes coalesced together to form the Chinese nation, 00:09:06.849 --> 00:09:09.483 which controlled the entire Yellow River 00:09:09.507 --> 00:09:14.954 and had the ability to bring hundreds of thousands of people together 00:09:14.978 --> 00:09:19.229 to build dams and canals and regulate the river 00:09:19.253 --> 00:09:22.440 and prevent the worst floods and droughts 00:09:22.464 --> 00:09:25.564 and raise the level of prosperity for everybody. 00:09:25.588 --> 00:09:28.205 And this worked in many places around the world. NOTE Paragraph 00:09:28.229 --> 00:09:31.355 But in the 21st century, 00:09:31.379 --> 00:09:34.814 technology is changing all that in a fundamental way. 00:09:34.838 --> 00:09:37.552 We are now living -- all people in the world -- 00:09:37.576 --> 00:09:41.393 are living alongside the same cyber river, 00:09:41.417 --> 00:09:47.023 and no single nation can regulate this river by itself. 00:09:47.047 --> 00:09:51.090 We are all living together on a single planet, 00:09:51.114 --> 00:09:53.875 which is threatened by our own actions, 00:09:53.899 --> 00:09:57.915 and if you don't have some kind of global cooperation, 00:09:57.939 --> 00:10:03.082 nationalism is just not on the right level to tackle the problems, 00:10:03.106 --> 00:10:06.809 whether it's climate change or whether it's technological disruption. NOTE Paragraph 00:10:07.761 --> 00:10:09.952 CA: So it was a beautiful idea 00:10:09.976 --> 00:10:13.924 in a world where most of the action, most of the issues, 00:10:13.948 --> 00:10:16.304 took place on national scale, 00:10:16.328 --> 00:10:19.126 but your argument is that the issues that matter most today 00:10:19.150 --> 00:10:22.294 no longer take place on a national scale but on a global scale. NOTE Paragraph 00:10:22.318 --> 00:10:26.127 YNH: Exactly. All the major problems of the world today 00:10:26.151 --> 00:10:28.540 are global in essence, 00:10:28.564 --> 00:10:30.055 and they cannot be solved 00:10:30.079 --> 00:10:33.976 unless through some kind of global cooperation. 00:10:34.000 --> 00:10:35.629 It's not just climate change, 00:10:35.653 --> 00:10:39.057 which is, like, the most obvious example people give. 00:10:39.081 --> 00:10:42.159 I think more in terms of technological disruption. 00:10:42.183 --> 00:10:45.233 If you think about, for example, artificial intelligence, 00:10:45.257 --> 00:10:48.250 over the next 20, 30 years 00:10:48.274 --> 00:10:52.158 pushing hundreds of millions of people out of the job market -- 00:10:52.182 --> 00:10:54.433 this is a problem on a global level. 00:10:54.457 --> 00:10:57.960 It will disrupt the economy of all the countries. NOTE Paragraph 00:10:57.984 --> 00:11:01.647 And similarly, if you think about, say, bioengineering 00:11:01.671 --> 00:11:04.626 and people being afraid of conducting, 00:11:04.650 --> 00:11:07.311 I don't know, genetic engineering research in humans, 00:11:07.335 --> 00:11:12.624 it won't help if just a single country, let's say the US, 00:11:12.648 --> 00:11:15.972 outlaws all genetic experiments in humans, 00:11:15.996 --> 00:11:19.626 but China or North Korea continues to do it. 00:11:19.650 --> 00:11:22.390 So the US cannot solve it by itself, 00:11:22.414 --> 00:11:27.297 and very quickly, the pressure on the US to do the same will be immense 00:11:27.321 --> 00:11:32.211 because we are talking about high-risk, high-gain technologies. 
00:11:32.235 --> 00:11:36.943 If somebody else is doing it, I can't allow myself to remain behind. 00:11:36.967 --> 00:11:42.655 The only way to have regulations, effective regulations, 00:11:42.679 --> 00:11:44.772 on things like genetic engineering, 00:11:44.796 --> 00:11:46.808 is to have global regulations. 00:11:46.832 --> 00:11:51.870 If you just have national regulations, nobody would like to stay behind. NOTE Paragraph 00:11:51.894 --> 00:11:53.902 CA: So this is really interesting. 00:11:53.926 --> 00:11:55.857 It seems to me that this may be one key 00:11:55.881 --> 00:11:59.353 to provoking at least a constructive conversation 00:11:59.377 --> 00:12:01.050 between the different sides here, 00:12:01.074 --> 00:12:04.216 because I think everyone can agree that the start point 00:12:04.240 --> 00:12:06.937 of a lot of the anger that's propelled us to where we are 00:12:06.961 --> 00:12:09.709 is because of the legitimate concerns about job loss. 00:12:09.733 --> 00:12:13.374 Work is gone, a traditional way of life has gone, 00:12:13.398 --> 00:12:16.773 and it's no wonder that people are furious about that. 00:12:16.797 --> 00:12:21.349 And in general, they have blamed globalism, global elites, 00:12:21.373 --> 00:12:24.120 for doing this to them without asking their permission, 00:12:24.144 --> 00:12:26.239 and that seems like a legitimate complaint. NOTE Paragraph 00:12:26.263 --> 00:12:29.572 But what I hear you saying is that -- so a key question is: 00:12:29.596 --> 00:12:35.057 What is the real cause of job loss, both now and going forward? 00:12:35.081 --> 00:12:37.857 To the extent that it's about globalism, 00:12:37.881 --> 00:12:42.003 then the right response, yes, is to shut down borders 00:12:42.027 --> 00:12:46.007 and keep people out and change trade agreements and so forth. 00:12:46.031 --> 00:12:47.387 But you're saying, I think, 00:12:47.411 --> 00:12:52.202 that actually the bigger cause of job loss is not going to be that at all. 00:12:52.226 --> 00:12:55.940 It's going to originate in technological questions, 00:12:55.964 --> 00:12:57.976 and we have no chance of solving that 00:12:58.000 --> 00:13:00.095 unless we operate as a connected world. NOTE Paragraph 00:13:00.119 --> 00:13:01.638 YNH: Yeah, I think that, 00:13:01.662 --> 00:13:04.903 I don't know about the present, but looking to the future, 00:13:04.927 --> 00:13:08.004 it's not the Mexicans or Chinese who will take the jobs 00:13:08.028 --> 00:13:09.595 from the people in Pennsylvania, 00:13:09.619 --> 00:13:11.362 it's the robots and algorithms. 00:13:11.386 --> 00:13:15.570 So unless you plan to build a big wall on the border of California -- NOTE Paragraph 00:13:15.594 --> 00:13:16.728 (Laughter) NOTE Paragraph 00:13:16.752 --> 00:13:20.443 the wall on the border with Mexico is going to be very ineffective. 00:13:20.467 --> 00:13:26.815 And I was struck when I watched the debates before the election, 00:13:26.839 --> 00:13:32.576 I was struck that certainly Trump did not even attempt to frighten people 00:13:32.600 --> 00:13:35.176 by saying the robots will take your jobs. 00:13:35.200 --> 00:13:37.499 Now even if it's not true, it doesn't matter. 00:13:37.523 --> 00:13:41.041 It could have been an extremely effective way of frightening people -- NOTE Paragraph 00:13:41.065 --> 00:13:42.066 (Laughter) NOTE Paragraph 00:13:42.090 --> 00:13:43.251 and galvanizing people: 00:13:43.275 --> 00:13:44.884 "The robots will take your jobs!" 00:13:44.908 --> 00:13:46.263 And nobody used that line. 
00:13:46.287 --> 00:13:48.979 And it made me afraid, 00:13:49.003 --> 00:13:53.051 because it meant that no matter what happens 00:13:53.075 --> 00:13:55.221 in universities and laboratories, 00:13:55.245 --> 00:13:58.010 and there, there is already an intense debate about it, 00:13:58.034 --> 00:14:02.128 but in the mainstream political system and among the general public, 00:14:02.152 --> 00:14:04.262 people are just unaware 00:14:04.286 --> 00:14:08.796 that there could be an immense technological disruption -- 00:14:08.820 --> 00:14:12.942 not in 200 years, but in 10, 20, 30 years -- 00:14:12.966 --> 00:14:15.536 and we have to do something about it now, 00:14:15.560 --> 00:14:21.836 partly because most of what we teach children today in school or in college 00:14:21.860 --> 00:14:27.861 is going to be completely irrelevant to the job market of 2040, 2050. 00:14:27.885 --> 00:14:31.243 So it's not something we'll need to think about in 2040. 00:14:31.267 --> 00:14:34.860 We need to think today what to teach the young people. NOTE Paragraph 00:14:34.884 --> 00:14:37.537 CA: Yeah, no, absolutely. 00:14:38.595 --> 00:14:42.512 You've often written about moments in history 00:14:42.536 --> 00:14:49.395 where humankind has ... entered a new era, unintentionally. 00:14:49.806 --> 00:14:52.674 Decisions have been made, technologies have been developed, 00:14:52.698 --> 00:14:55.069 and suddenly the world has changed, 00:14:55.093 --> 00:14:57.560 possibly in a way that's worse for everyone. 00:14:57.584 --> 00:14:59.659 So one of the examples you give in "Sapiens" 00:14:59.683 --> 00:15:01.774 is just the whole agricultural revolution, 00:15:01.798 --> 00:15:05.352 which, for an actual person tilling the fields, 00:15:05.376 --> 00:15:08.556 they just picked up a 12-hour backbreaking workday 00:15:08.580 --> 00:15:14.828 instead of six hours in the jungle and a much more interesting lifestyle. NOTE Paragraph 00:15:14.852 --> 00:15:15.894 (Laughter) NOTE Paragraph 00:15:15.918 --> 00:15:19.107 So are we at another possible phase change here, 00:15:19.131 --> 00:15:23.619 where we kind of sleepwalk into a future that none of us actually wants? NOTE Paragraph 00:15:24.058 --> 00:15:26.791 YNH: Yes, very much so. 00:15:26.815 --> 00:15:28.652 During the agricultural revolution, 00:15:28.676 --> 00:15:33.096 what happened is that immense technological and economic revolution 00:15:33.120 --> 00:15:35.985 empowered the human collective, 00:15:36.009 --> 00:15:38.962 but when you look at actual individual lives, 00:15:38.986 --> 00:15:42.480 the life of a tiny elite became much better, 00:15:42.504 --> 00:15:46.742 and the lives of the majority of people became considerably worse. 00:15:46.766 --> 00:15:49.471 And this can happen again in the 21st century. 00:15:49.495 --> 00:15:54.361 No doubt the new technologies will empower the human collective. 00:15:54.385 --> 00:15:57.105 But we may end up again 00:15:57.129 --> 00:16:01.586 with a tiny elite reaping all the benefits, taking all the fruits, 00:16:01.610 --> 00:16:05.796 and the masses of the population finding themselves worse 00:16:05.820 --> 00:16:07.121 than they were before, 00:16:07.145 --> 00:16:09.933 certainly much worse than this tiny elite. NOTE Paragraph 00:16:10.657 --> 00:16:13.312 CA: And those elites might not even be human elites. 00:16:13.336 --> 00:16:15.093 They might be cyborgs or -- NOTE Paragraph 00:16:15.117 --> 00:16:17.324 YNH: Yeah, they could be enhanced super humans. 
00:16:17.348 --> 00:16:18.603 They could be cyborgs. 00:16:18.627 --> 00:16:20.984 They could be completely nonorganic elites. 00:16:21.008 --> 00:16:23.536 They could even be non-conscious algorithms. 00:16:23.560 --> 00:16:28.471 What we see now in the world is authority shifting away 00:16:28.495 --> 00:16:30.764 from humans to algorithms. 00:16:30.788 --> 00:16:34.312 More and more decisions -- about personal lives, 00:16:34.336 --> 00:16:37.008 about economic matters, about political matters -- 00:16:37.032 --> 00:16:39.511 are actually being taken by algorithms. 00:16:39.535 --> 00:16:42.169 If you ask the bank for a loan, 00:16:42.193 --> 00:16:46.890 chances are your fate is decided by an algorithm, not by a human being. 00:16:46.914 --> 00:16:53.101 And the general impression is that maybe Homo sapiens just lost it. NOTE Paragraph 00:16:53.125 --> 00:16:57.685 The world is so complicated, there is so much data, 00:16:57.709 --> 00:17:00.263 things are changing so fast, 00:17:00.287 --> 00:17:03.888 that this thing that evolved on the African savanna 00:17:03.912 --> 00:17:05.619 tens of thousands of years ago -- 00:17:05.643 --> 00:17:09.140 to cope with a particular environment, 00:17:09.164 --> 00:17:12.648 a particular volume of information and data -- 00:17:12.672 --> 00:17:17.008 it just can't handle the realities of the 21st century, 00:17:17.032 --> 00:17:19.929 and the only thing that may be able to handle it 00:17:19.953 --> 00:17:22.025 is big-data algorithms. 00:17:22.049 --> 00:17:28.230 So no wonder more and more authority is shifting from us to the algorithms. NOTE Paragraph 00:17:28.857 --> 00:17:32.706 CA: So we're in New York City for the first of a series of TED Dialogues 00:17:32.730 --> 00:17:35.027 with Yuval Harari, 00:17:35.051 --> 00:17:38.895 and there's a Facebook Live audience out there. 00:17:38.919 --> 00:17:40.570 We're excited to have you with us. 00:17:40.594 --> 00:17:42.696 We'll start coming to some of your questions 00:17:42.720 --> 00:17:44.434 and questions of people in the room 00:17:44.458 --> 00:17:45.623 in just a few minutes, 00:17:45.647 --> 00:17:47.611 so have those coming. NOTE Paragraph 00:17:47.635 --> 00:17:51.532 Yuval, if you're going to make the argument 00:17:51.556 --> 00:17:57.691 that we need to get past nationalism because of the coming technological ... 00:17:59.218 --> 00:18:01.059 danger, in a way, 00:18:01.083 --> 00:18:03.028 presented by so much of what's happening, 00:18:03.052 --> 00:18:05.495 we've got to have a global conversation about this. 00:18:05.519 --> 00:18:08.947 Trouble is, it's hard to get people really believing that, I don't know, 00:18:08.971 --> 00:18:11.132 AI really is an imminent threat, and so forth. 00:18:11.156 --> 00:18:13.882 The things that people, some people at least, 00:18:13.906 --> 00:18:15.941 care about much more immediately, perhaps, 00:18:15.965 --> 00:18:17.549 is climate change, 00:18:17.573 --> 00:18:22.466 perhaps other issues like refugees, nuclear weapons, and so forth. 00:18:22.490 --> 00:18:27.536 Would you argue that where we are right now 00:18:27.560 --> 00:18:31.109 that somehow those issues need to be dialed up? 00:18:31.133 --> 00:18:33.293 You've talked about climate change, 00:18:33.317 --> 00:18:36.973 but Trump has said he doesn't believe in that. 00:18:36.997 --> 00:18:39.436 So in a way, your most powerful argument, 00:18:39.460 --> 00:18:42.206 you can't actually use to make this case. 
NOTE Paragraph 00:18:42.230 --> 00:18:44.416 YNH: Yeah, I think with climate change, 00:18:44.440 --> 00:18:48.147 at first sight, it's quite surprising 00:18:48.171 --> 00:18:50.675 that there is a very close correlation 00:18:50.699 --> 00:18:54.021 between nationalism and climate change. 00:18:54.045 --> 00:18:58.632 I mean, almost always, the people who deny climate change are nationalists. 00:18:58.656 --> 00:19:00.737 And at first sight, you think: Why? 00:19:00.761 --> 00:19:01.914 What's the connection? 00:19:01.938 --> 00:19:04.724 Why don't you have socialists denying climate change? 00:19:04.748 --> 00:19:07.099 But then, when you think about it, it's obvious -- 00:19:07.123 --> 00:19:10.867 because nationalism has no solution to climate change. 00:19:10.891 --> 00:19:14.087 If you want to be a nationalist in the 21st century, 00:19:14.111 --> 00:19:15.983 you have to deny the problem. 00:19:16.007 --> 00:19:20.494 If you accept the reality of the problem, then you must accept that, yes, 00:19:20.518 --> 00:19:23.294 there is still room in the world for patriotism, 00:19:23.318 --> 00:19:27.469 there is still room in the world for having special loyalties 00:19:27.493 --> 00:19:32.127 and obligations towards your own people, towards your own country. 00:19:32.151 --> 00:19:35.971 I don't think anybody is really thinking of abolishing that. NOTE Paragraph 00:19:35.995 --> 00:19:38.996 But in order to confront climate change, 00:19:39.020 --> 00:19:43.231 we need additional loyalties and commitments 00:19:43.255 --> 00:19:45.260 to a level beyond the nation. 00:19:45.284 --> 00:19:47.727 And that should not be impossible, 00:19:47.751 --> 00:19:51.443 because people can have several layers of loyalty. 00:19:51.467 --> 00:19:53.871 You can be loyal to your family 00:19:53.895 --> 00:19:55.408 and to your community 00:19:55.432 --> 00:19:56.761 and to your nation, 00:19:56.785 --> 00:20:00.413 so why can't you also be loyal to humankind as a whole? 00:20:00.437 --> 00:20:03.836 Of course, there are occasions when it becomes difficult, 00:20:03.860 --> 00:20:05.643 what to put first, 00:20:05.667 --> 00:20:07.490 but, you know, life is difficult. 00:20:07.514 --> 00:20:08.665 Handle it. NOTE Paragraph 00:20:08.689 --> 00:20:11.333 (Laughter) NOTE Paragraph 00:20:11.357 --> 00:20:15.855 CA: OK, so I would love to get some questions from the audience here. 00:20:15.879 --> 00:20:17.797 We've got a microphone here. 00:20:17.821 --> 00:20:21.038 Speak into it, and Facebook, get them coming, too. NOTE Paragraph 00:20:21.062 --> 00:20:24.496 Question: One of the things that has clearly made a huge difference 00:20:24.520 --> 00:20:26.306 in this country and other countries 00:20:26.330 --> 00:20:28.544 is the income distribution inequality, 00:20:28.568 --> 00:20:32.782 the dramatic change in income distribution in the US 00:20:32.806 --> 00:20:34.508 from what it was 50 years ago, 00:20:34.532 --> 00:20:35.683 and around the world. 00:20:35.707 --> 00:20:38.850 Is there anything we can do to affect that? 00:20:38.874 --> 00:20:41.715 Because that gets at a lot of the underlying causes. NOTE Paragraph 00:20:44.283 --> 00:20:49.597 YNH: So far I haven't heard a very good idea about what to do about it, 00:20:49.621 --> 00:20:53.349 again, partly because most ideas remain on the national level, 00:20:53.373 --> 00:20:55.141 and the problem is global. 00:20:55.165 --> 00:20:58.143 I mean, one idea that we hear quite a lot about now 00:20:58.167 --> 00:20:59.999 is universal basic income. 
00:21:00.023 --> 00:21:01.174 But this is a problem. 00:21:01.198 --> 00:21:02.850 I mean, I think it's a good start, 00:21:02.874 --> 00:21:06.596 but it's a problematic idea because it's not clear what "universal" is 00:21:06.620 --> 00:21:08.461 and it's not clear what "basic" is. 00:21:08.485 --> 00:21:11.866 Most people when they speak about universal basic income, 00:21:11.890 --> 00:21:14.675 they actually mean national basic income. 00:21:14.699 --> 00:21:16.443 But the problem is global. NOTE Paragraph 00:21:16.467 --> 00:21:22.117 Let's say that you have AI and 3D printers taking away millions of jobs 00:21:22.141 --> 00:21:23.297 in Bangladesh, 00:21:23.321 --> 00:21:26.569 from all the people who make my shirts and my shoes. 00:21:26.593 --> 00:21:27.899 So what's going to happen? 00:21:27.923 --> 00:21:34.462 The US government will levy taxes on Google and Apple in California, 00:21:34.486 --> 00:21:39.067 and use that to pay basic income to unemployed Bangladeshis? 00:21:39.091 --> 00:21:41.727 If you believe that, you can just as well believe 00:21:41.751 --> 00:21:45.414 that Santa Claus will come and solve the problem. 00:21:45.438 --> 00:21:50.564 So unless we have really universal and not national basic income, 00:21:50.588 --> 00:21:53.723 the deep problems are not going to go away. NOTE Paragraph 00:21:53.747 --> 00:21:56.479 And also it's not clear what basic is, 00:21:56.503 --> 00:21:59.136 because what are basic human needs? 00:21:59.160 --> 00:22:01.970 A thousand years ago, just food and shelter was enough. 00:22:01.994 --> 00:22:05.605 But today, people will say education is a basic human need, 00:22:05.629 --> 00:22:07.202 it should be part of the package. 00:22:07.226 --> 00:22:11.005 But how much? Six years? Twelve years? PhD? 00:22:11.029 --> 00:22:12.862 Similarly, with health care, 00:22:12.886 --> 00:22:15.571 let's say that in 20, 30, 40 years, 00:22:15.595 --> 00:22:19.368 you'll have expensive treatments that can extend human life 00:22:19.392 --> 00:22:21.307 to 120, I don't know. 00:22:21.331 --> 00:22:26.522 Will this be part of the basket of basic income or not? 00:22:26.546 --> 00:22:27.975 It's a very difficult problem, 00:22:27.999 --> 00:22:34.257 because in a world where people lose their ability to be employed, 00:22:34.281 --> 00:22:37.862 the only thing they are going to get is this basic income. 00:22:37.886 --> 00:22:43.013 So what's part of it is a very, very difficult ethical question. NOTE Paragraph 00:22:43.037 --> 00:22:46.341 CA: There's a bunch of questions on how the world affords it as well, 00:22:46.365 --> 00:22:47.525 who pays. 00:22:47.549 --> 00:22:50.361 There's a question here from Facebook from Lisa Larson: 00:22:50.385 --> 00:22:52.960 "How does nationalism in the US now 00:22:52.984 --> 00:22:56.399 compare to that between World War I and World War II 00:22:56.423 --> 00:22:57.844 in the last century?" NOTE Paragraph 00:22:57.868 --> 00:23:02.316 YNH: Well the good news, with regard to the dangers of nationalism, 00:23:02.340 --> 00:23:06.263 we are in a much better position than a century ago. 00:23:06.287 --> 00:23:08.959 A century ago, 1917, 00:23:08.983 --> 00:23:12.116 Europeans were killing each other by the millions. 00:23:12.140 --> 00:23:16.491 In 2016, with Brexit, as far as I remember, 00:23:16.515 --> 00:23:21.752 a single person lost their life, an MP who was murdered by some extremist. 00:23:21.776 --> 00:23:23.309 Just a single person. 
00:23:23.333 --> 00:23:26.018 I mean, if Brexit was about British independence, 00:23:26.042 --> 00:23:30.793 this is the most peaceful war of independence in human history. 00:23:30.817 --> 00:23:36.606 And let's say that Scotland will now choose to leave the UK 00:23:36.630 --> 00:23:38.806 after Brexit. NOTE Paragraph 00:23:38.830 --> 00:23:40.814 So in the 18th century, 00:23:40.838 --> 00:23:44.070 if Scotland wanted -- and the Scots wanted several times -- 00:23:44.094 --> 00:23:47.627 to break out of the control of London, 00:23:47.651 --> 00:23:51.949 the reaction of the government in London was to send an army up north 00:23:51.973 --> 00:23:55.444 to burn down Edinburgh and massacre the highland tribes. 00:23:55.468 --> 00:24:01.024 My guess is that if, in 2018, the Scots vote for independence, 00:24:01.048 --> 00:24:04.457 the London government will not send an army up north 00:24:04.481 --> 00:24:06.084 to burn down Edinburgh. 00:24:06.108 --> 00:24:10.375 Very few people are now willing to kill or be killed 00:24:10.399 --> 00:24:13.121 for Scottish or for British independence. 00:24:13.145 --> 00:24:18.165 So for all the talk of the rise of nationalism 00:24:18.189 --> 00:24:20.432 and going back to the 1930s, 00:24:20.456 --> 00:24:24.231 to the 19th century, in the West at least, 00:24:24.255 --> 00:24:30.839 the power of national sentiments today is far, far smaller 00:24:30.863 --> 00:24:32.403 than it was a century ago. NOTE Paragraph 00:24:32.427 --> 00:24:36.264 CA: Although some people now, you hear publicly worrying 00:24:36.288 --> 00:24:39.044 about whether that might be shifting, 00:24:39.068 --> 00:24:42.466 that there could actually be outbreaks of violence in the US 00:24:42.490 --> 00:24:44.837 depending on how things turn out. 00:24:44.861 --> 00:24:46.400 Should we be worried about that, 00:24:46.424 --> 00:24:48.490 or do you really think things have shifted? NOTE Paragraph 00:24:48.514 --> 00:24:50.005 YNH: No, we should be worried. 00:24:50.029 --> 00:24:51.654 We should be aware of two things. 00:24:51.678 --> 00:24:53.315 First of all, don't be hysterical. 00:24:53.339 --> 00:24:56.786 We are not back in the First World War yet. 00:24:56.810 --> 00:24:59.750 But on the other hand, don't be complacent. 00:24:59.774 --> 00:25:05.148 We reached from 1917 to 2017, 00:25:05.172 --> 00:25:07.354 not by some divine miracle, 00:25:07.378 --> 00:25:09.402 but simply by human decisions, 00:25:09.426 --> 00:25:12.089 and if we now start making the wrong decisions, 00:25:12.113 --> 00:25:16.598 we could be back in an analogous situation to 1917 00:25:16.622 --> 00:25:18.128 in a few years. 00:25:18.152 --> 00:25:20.473 One of the things I know as a historian 00:25:20.497 --> 00:25:24.172 is that you should never underestimate human stupidity. NOTE Paragraph 00:25:24.196 --> 00:25:27.079 (Laughter) NOTE Paragraph 00:25:27.103 --> 00:25:30.187 It's one of the most powerful forces in history, 00:25:30.211 --> 00:25:32.538 human stupidity and human violence. 00:25:32.562 --> 00:25:36.667 Humans do such crazy things for no obvious reason, 00:25:36.691 --> 00:25:38.401 but again, at the same time, 00:25:38.425 --> 00:25:42.029 another very powerful force in human history is human wisdom. 00:25:42.053 --> 00:25:43.219 We have both. NOTE Paragraph 00:25:43.243 --> 00:25:46.145 CA: We have with us here moral psychologist Jonathan Haidt, 00:25:46.169 --> 00:25:47.792 who I think has a question. NOTE Paragraph 00:25:48.871 --> 00:25:50.354 Jonathan Haidt: Thanks, Yuval. 
00:25:50.378 --> 00:25:52.861 So you seem to be a fan of global governance, 00:25:52.885 --> 00:25:56.405 but when you look at the map of the world from Transparency International, 00:25:56.429 --> 00:25:59.757 which rates the level of corruption of political institutions, 00:25:59.781 --> 00:26:02.861 it's a vast sea of red with little bits of yellow here and there 00:26:02.885 --> 00:26:04.490 for those with good institutions. 00:26:04.514 --> 00:26:07.015 So if we were to have some kind of global governance, 00:26:07.039 --> 00:26:09.870 what makes you think it would end up being more like Denmark 00:26:09.894 --> 00:26:11.934 rather than more like Russia or Honduras, 00:26:11.958 --> 00:26:13.459 and aren't there alternatives, 00:26:13.483 --> 00:26:15.569 such as we did with CFCs? 00:26:15.593 --> 00:26:18.700 There are ways to solve global problems with national governments. 00:26:18.724 --> 00:26:20.938 What would world government actually look like, 00:26:20.962 --> 00:26:22.683 and why do you think it would work? NOTE Paragraph 00:26:22.707 --> 00:26:26.467 YNH: Well, I don't know what it would look like. 00:26:26.491 --> 00:26:29.543 Nobody still has a model for that. 00:26:29.567 --> 00:26:32.195 The main reason we need it 00:26:32.219 --> 00:26:36.513 is because many of these issues are lose-lose situations. 00:26:36.537 --> 00:26:39.429 When you have a win-win situation like trade, 00:26:39.453 --> 00:26:42.369 both sides can benefit from a trade agreement, 00:26:42.393 --> 00:26:44.657 then this is something you can work out. 00:26:44.681 --> 00:26:47.027 Without some kind of global government, 00:26:47.051 --> 00:26:49.905 national governments each have an interest in doing it. 00:26:49.929 --> 00:26:53.900 But when you have a lose-lose situation like with climate change, 00:26:53.924 --> 00:26:55.565 it's much more difficult 00:26:55.589 --> 00:27:00.475 without some overarching authority, real authority. NOTE Paragraph 00:27:00.499 --> 00:27:03.261 Now, how to get there and what would it look like, 00:27:03.285 --> 00:27:04.645 I don't know. 00:27:04.669 --> 00:27:08.406 And certainly there is no obvious reason 00:27:08.430 --> 00:27:10.710 to think that it would look like Denmark, 00:27:10.734 --> 00:27:12.322 or that it would be a democracy. 00:27:12.346 --> 00:27:14.932 Most likely it wouldn't. 00:27:14.956 --> 00:27:20.987 We don't have workable democratic models 00:27:21.011 --> 00:27:23.107 for a global government. 00:27:23.131 --> 00:27:26.196 So maybe it would look more like ancient China 00:27:26.220 --> 00:27:27.919 than like modern Denmark. 00:27:27.943 --> 00:27:33.166 But still, given the dangers that we are facing, 00:27:33.190 --> 00:27:38.310 I think the imperative of having some kind of real ability 00:27:38.334 --> 00:27:42.462 to force through difficult decisions on the global level 00:27:42.486 --> 00:27:46.616 is more important than almost anything else. NOTE Paragraph 00:27:47.591 --> 00:27:49.689 CA: There's a question from Facebook here, 00:27:49.713 --> 00:27:51.606 and then we'll get the mic to Andrew. 00:27:51.630 --> 00:27:53.826 So, Kat Hebron on Facebook, 00:27:53.850 --> 00:27:55.518 calling in from Vail: 00:27:55.542 --> 00:27:59.753 "How would developed nations manage the millions of climate migrants?" NOTE Paragraph 00:28:00.818 --> 00:28:02.972 YNH: I don't know. NOTE Paragraph 00:28:02.996 --> 00:28:04.888 CA: That's your answer, Kat. (Laughter) NOTE Paragraph 00:28:04.912 --> 00:28:07.058 YNH: And I don't think that they know either. 
00:28:07.082 --> 00:28:08.876 They'll just deny the problem, maybe. NOTE Paragraph 00:28:08.900 --> 00:28:11.925 CA: But immigration, generally, is another example of a problem 00:28:11.949 --> 00:28:14.522 that's very hard to solve on a nation-by-nation basis. 00:28:14.546 --> 00:28:16.016 One nation can shut its doors, 00:28:16.040 --> 00:28:18.574 but maybe that stores up problems for the future. NOTE Paragraph 00:28:18.598 --> 00:28:22.470 YNH: Yes, I mean -- it's another very good case, 00:28:22.494 --> 00:28:24.723 especially because it's so much easier 00:28:24.747 --> 00:28:26.578 to migrate today 00:28:26.602 --> 00:28:30.291 than it was in the Middle Ages or in ancient times. NOTE Paragraph 00:28:30.315 --> 00:28:34.778 CA: Yuval, there's a belief among many technologists, certainly, 00:28:34.802 --> 00:28:37.153 that political concerns are kind of overblown, 00:28:37.177 --> 00:28:40.874 that actually, political leaders don't have that much influence 00:28:40.898 --> 00:28:42.064 in the world, 00:28:42.088 --> 00:28:46.057 that the real determination of humanity at this point is by science, 00:28:46.081 --> 00:28:47.527 by invention, by companies, 00:28:47.551 --> 00:28:51.943 by many things other than political leaders, 00:28:51.967 --> 00:28:54.378 and it's actually very hard for leaders to do much, 00:28:54.402 --> 00:28:56.760 so we're actually worrying about nothing here. NOTE Paragraph 00:28:58.005 --> 00:29:00.241 YNH: Well, first, it should be emphasized 00:29:00.265 --> 00:29:05.262 that it's true that political leaders' ability to do good is very limited, 00:29:05.286 --> 00:29:08.329 but their ability to do harm is unlimited. 00:29:08.353 --> 00:29:10.953 There is a basic imbalance here. 00:29:10.977 --> 00:29:14.545 You can still press the button and blow everybody up. 00:29:14.569 --> 00:29:16.155 You have that kind of ability. 00:29:16.179 --> 00:29:19.748 But if you want, for example, to reduce inequality, 00:29:19.772 --> 00:29:21.649 that's very, very difficult. 00:29:21.673 --> 00:29:23.069 But to start a war, 00:29:23.093 --> 00:29:24.944 you can still do so very easily. 00:29:24.968 --> 00:29:28.560 So there is a built-in imbalance in the political system today 00:29:28.584 --> 00:29:30.195 which is very frustrating, 00:29:30.219 --> 00:29:35.120 where you cannot do a lot of good but you can still do a lot of harm. 00:29:35.144 --> 00:29:39.288 And this makes the political system still a very big concern. NOTE Paragraph 00:29:39.812 --> 00:29:41.963 CA: So as you look at what's happening today, 00:29:41.987 --> 00:29:43.741 and putting your historian's hat on, 00:29:43.765 --> 00:29:47.291 do you look back in history at moments when things were going just fine 00:29:47.315 --> 00:29:52.648 and an individual leader really took the world or their country backwards? NOTE Paragraph 00:29:53.307 --> 00:29:55.936 YNH: There are quite a few examples, 00:29:55.960 --> 00:29:58.779 but I should emphasize, it's never an individual leader. 00:29:58.803 --> 00:30:00.437 I mean, somebody put him there, 00:30:00.461 --> 00:30:03.744 and somebody allowed him to continue to be there. 00:30:03.768 --> 00:30:07.851 So it's never really just the fault of a single individual. 00:30:07.875 --> 00:30:12.488 There are a lot of people behind every such individual. NOTE Paragraph 00:30:12.512 --> 00:30:15.990 CA: Can we have the microphone here, please, to Andrew? 
NOTE Paragraph 00:30:19.132 --> 00:30:22.696 Andrew Solomon: You've talked a lot about the global versus the national, 00:30:22.720 --> 00:30:24.346 but increasingly, it seems to me, 00:30:24.370 --> 00:30:27.013 the world situation is in the hands of identity groups. 00:30:27.037 --> 00:30:29.347 We look at people within the United States 00:30:29.371 --> 00:30:30.998 who have been recruited by ISIS. 00:30:31.022 --> 00:30:33.213 We look at these other groups which have formed 00:30:33.237 --> 00:30:35.199 which go outside of national bounds 00:30:35.223 --> 00:30:37.384 but still represent significant authorities. 00:30:37.408 --> 00:30:39.836 How are they to be integrated into the system, 00:30:39.860 --> 00:30:43.573 and how is a diverse set of identities to be made coherent 00:30:43.597 --> 00:30:45.935 under either national or global leadership? NOTE Paragraph 00:30:47.380 --> 00:30:50.601 YNH: Well, the problem of such diverse identities 00:30:50.625 --> 00:30:52.681 is a problem from nationalism as well. 00:30:53.229 --> 00:30:57.584 Nationalism believes in a single, monolithic identity, 00:30:57.608 --> 00:31:01.724 and exclusive or at least more extreme versions of nationalism 00:31:01.748 --> 00:31:05.317 believe in an exclusive loyalty to a single identity. 00:31:05.341 --> 00:31:08.257 And therefore, nationalism has had a lot of problems 00:31:08.281 --> 00:31:11.157 with people wanting to divide their identities 00:31:11.181 --> 00:31:13.244 between various groups. 00:31:13.268 --> 00:31:18.143 So it's not just a problem, say, for a global vision. NOTE Paragraph 00:31:18.540 --> 00:31:22.392 And I think, again, history shows 00:31:22.416 --> 00:31:28.543 that you shouldn't necessarily think in such exclusive terms. 00:31:28.567 --> 00:31:31.975 If you think that there is just a single identity for a person, 00:31:31.999 --> 00:31:37.039 "I am just X, that's it, I can't be several things, I can be just that," 00:31:37.063 --> 00:31:39.159 that's the start of the problem. 00:31:39.183 --> 00:31:41.971 You have religions, you have nations 00:31:41.995 --> 00:31:45.177 that sometimes demand exclusive loyalty, 00:31:45.201 --> 00:31:46.932 but it's not the only option. 00:31:46.956 --> 00:31:49.338 There are many religions and many nations 00:31:49.362 --> 00:31:53.240 that enable you to have diverse identities at the same time. NOTE Paragraph 00:31:53.264 --> 00:31:57.621 CA: But is one explanation of what's happened in the last year 00:31:57.645 --> 00:32:02.825 that a group of people have got fed up with, if you like, 00:32:02.849 --> 00:32:06.016 the liberal elites, for want of a better term, 00:32:06.040 --> 00:32:10.413 obsessing over many, many different identities and them feeling, 00:32:10.437 --> 00:32:14.296 "But what about my identity? I am being completely ignored here. 00:32:14.320 --> 00:32:17.294 And by the way, I thought I was the majority"? 00:32:17.318 --> 00:32:20.299 And that that's actually sparked a lot of the anger. NOTE Paragraph 00:32:20.918 --> 00:32:24.063 YNH: Yeah. Identity is always problematic, 00:32:24.087 --> 00:32:28.397 because identity is always based on fictional stories 00:32:28.421 --> 00:32:31.310 that sooner or later collide with reality. 00:32:31.890 --> 00:32:33.408 Almost all identities, 00:32:33.432 --> 00:32:36.843 I mean, beyond the level of the basic community 00:32:36.867 --> 00:32:38.336 of a few dozen people, 00:32:38.360 --> 00:32:40.289 are based on a fictional story. 00:32:40.313 --> 00:32:41.954 They are not the truth. 
00:32:41.978 --> 00:32:43.293 They are not the reality. 00:32:43.317 --> 00:32:46.411 It's just a story that people invent and tell one another 00:32:46.435 --> 00:32:47.926 and start believing. 00:32:47.950 --> 00:32:53.270 And therefore all identities are extremely unstable. 00:32:53.294 --> 00:32:55.821 They are not a biological reality. 00:32:55.845 --> 00:32:57.851 Sometimes nationalists, for example, 00:32:57.875 --> 00:33:00.802 think that the nation is a biological entity. 00:33:00.826 --> 00:33:04.439 It's made of the combination of soil and blood, 00:33:04.463 --> 00:33:06.165 which creates the nation. 00:33:06.189 --> 00:33:09.281 But this is just a fictional story. NOTE Paragraph 00:33:09.305 --> 00:33:11.868 CA: Soil and blood kind of makes a gooey mess. NOTE Paragraph 00:33:11.892 --> 00:33:13.714 (Laughter) NOTE Paragraph 00:33:13.738 --> 00:33:16.762 YNH: It does, and also it messes with your mind 00:33:16.786 --> 00:33:21.570 when you think too much that I am a combination of soil and blood. 00:33:21.594 --> 00:33:24.461 If you look from a biological perspective, 00:33:24.485 --> 00:33:27.963 obviously none of the nations that exist today 00:33:27.987 --> 00:33:30.230 existed 5,000 years ago. 00:33:30.254 --> 00:33:34.112 Homo sapiens is a social animal, that's for sure. 00:33:34.136 --> 00:33:36.563 But for millions of years, 00:33:36.587 --> 00:33:41.226 Homo sapiens and our hominid ancestors lived in small communities 00:33:41.250 --> 00:33:43.579 of a few dozen individuals. 00:33:43.603 --> 00:33:45.730 Everybody knew everybody else. 00:33:45.754 --> 00:33:49.775 Whereas modern nations are imagined communities, 00:33:49.799 --> 00:33:52.350 in the sense that I don't even know all these people. 00:33:52.374 --> 00:33:55.222 I come from a relatively small nation, Israel, 00:33:55.246 --> 00:33:57.389 and of eight million Israelis, 00:33:57.413 --> 00:33:59.403 I never met most of them. 00:33:59.427 --> 00:34:01.735 I will never meet most of them. 00:34:01.759 --> 00:34:04.321 They basically exist here. NOTE Paragraph 00:34:04.345 --> 00:34:07.094 CA: But in terms of this identity, 00:34:07.118 --> 00:34:12.555 this group who feel left out and perhaps have work taken away, 00:34:12.579 --> 00:34:14.873 I mean, in "Homo Deus," 00:34:14.897 --> 00:34:18.008 you actually speak of this group in one sense expanding, 00:34:18.032 --> 00:34:21.654 that so many people may have their jobs taken away 00:34:21.678 --> 00:34:26.058 by technology in some way that we could end up with 00:34:26.082 --> 00:34:29.253 a really large -- I think you call it a "useless class" -- 00:34:29.277 --> 00:34:31.380 a class where traditionally, 00:34:31.404 --> 00:34:34.135 as viewed by the economy, these people have no use. NOTE Paragraph 00:34:34.159 --> 00:34:35.357 YNH: Yes. NOTE Paragraph 00:34:35.381 --> 00:34:38.312 CA: How likely a possibility is that? 00:34:38.336 --> 00:34:41.080 Is that something we should be terrified about? 00:34:41.104 --> 00:34:43.763 And can we address it in any way? NOTE Paragraph 00:34:43.787 --> 00:34:46.034 YNH: We should think about it very carefully. 00:34:46.058 --> 00:34:49.029 I mean, nobody really knows what the job market will look like 00:34:49.053 --> 00:34:50.743 in 2040, 2050. 00:34:50.767 --> 00:34:53.475 There is a chance many new jobs will appear, 00:34:53.499 --> 00:34:55.253 but it's not certain. 
00:34:55.277 --> 00:34:57.488 And even if new jobs do appear, 00:34:57.512 --> 00:34:59.496 it won't necessarily be easy 00:34:59.520 --> 00:35:02.519 for a 50-year-old unemployed truck driver 00:35:02.543 --> 00:35:05.576 made unemployed by self-driving vehicles, 00:35:05.600 --> 00:35:09.253 it won't be easy for an unemployed truck driver 00:35:09.277 --> 00:35:14.063 to reinvent himself or herself as a designer of virtual worlds. NOTE Paragraph 00:35:14.087 --> 00:35:18.269 Previously, if you look at the trajectory of the industrial revolution, 00:35:18.293 --> 00:35:22.450 when machines replaced humans in one type of work, 00:35:22.474 --> 00:35:26.755 the solution usually came from low-skill work 00:35:26.779 --> 00:35:29.367 in new lines of business. 00:35:29.391 --> 00:35:32.793 So you didn't need any more agricultural workers, 00:35:32.817 --> 00:35:38.231 so people moved to working in low-skill industrial jobs, 00:35:38.255 --> 00:35:41.724 and when this was taken away by more and more machines, 00:35:41.748 --> 00:35:44.718 people moved to low-skill service jobs. 00:35:44.742 --> 00:35:48.102 Now, when people say there will be new jobs in the future, 00:35:48.126 --> 00:35:50.555 that humans can do better than AI, 00:35:50.579 --> 00:35:52.409 that humans can do better than robots, 00:35:52.433 --> 00:35:55.073 they usually think about high-skill jobs, 00:35:55.097 --> 00:35:58.968 like software engineers designing virtual worlds. 00:35:58.992 --> 00:36:04.386 Now, I don't see how an unemployed cashier from Wal-Mart 00:36:04.410 --> 00:36:09.033 reinvents herself or himself at 50 as a designer of virtual worlds, 00:36:09.057 --> 00:36:10.528 and certainly I don't see 00:36:10.552 --> 00:36:14.019 how the millions of unemployed Bangladeshi textile workers 00:36:14.043 --> 00:36:15.654 will be able to do that. 00:36:15.678 --> 00:36:17.398 I mean, if they are going to do it, 00:36:17.422 --> 00:36:20.778 we need to start teaching the Bangladeshis today 00:36:20.802 --> 00:36:22.556 how to be software designers, 00:36:22.580 --> 00:36:23.823 and we are not doing it. 00:36:23.847 --> 00:36:26.338 So what will they do in 20 years? NOTE Paragraph 00:36:26.362 --> 00:36:30.276 CA: So it feels like you're really highlighting a question 00:36:30.300 --> 00:36:34.483 that's really been bugging me the last few months more and more. 00:36:34.507 --> 00:36:37.362 It's almost a hard question to ask in public, 00:36:37.386 --> 00:36:40.777 but if any mind has some wisdom to offer in it, maybe it's yours, 00:36:40.801 --> 00:36:42.346 so I'm going to ask you: 00:36:42.370 --> 00:36:44.248 What are humans for? NOTE Paragraph 00:36:45.232 --> 00:36:47.166 YNH: As far as we know, for nothing. NOTE Paragraph 00:36:47.190 --> 00:36:48.902 (Laughter) NOTE Paragraph 00:36:48.926 --> 00:36:54.452 I mean, there is no great cosmic drama, some great cosmic plan, 00:36:54.476 --> 00:36:57.317 that we have a role to play in. 00:36:57.341 --> 00:37:00.365 And we just need to discover what our role is 00:37:00.389 --> 00:37:03.381 and then play it to the best of our ability. 00:37:03.405 --> 00:37:08.383 This has been the story of all religions and ideologies and so forth, 00:37:08.407 --> 00:37:11.885 but as a scientist, the best I can say is this is not true. 00:37:11.909 --> 00:37:17.267 There is no universal drama with a role in it for Homo sapiens. 
00:37:17.291 --> 00:37:18.972 So -- NOTE Paragraph 00:37:18.996 --> 00:37:21.489 CA: I'm going to push back on you just for a minute, 00:37:21.513 --> 00:37:22.707 just from your own book, 00:37:22.731 --> 00:37:24.055 because in "Homo Deus," 00:37:24.079 --> 00:37:29.138 you give really one of the most coherent and understandable accounts 00:37:29.162 --> 00:37:31.394 about sentience, about consciousness, 00:37:31.418 --> 00:37:34.376 and that unique sort of human skill. 00:37:34.400 --> 00:37:36.893 You point out that it's different from intelligence, 00:37:36.917 --> 00:37:39.251 the intelligence that we're building in machines, 00:37:39.275 --> 00:37:42.933 and that there's actually a lot of mystery around it. 00:37:42.957 --> 00:37:46.334 How can you be sure there's no purpose 00:37:46.358 --> 00:37:50.409 when we don't even understand what this sentience thing is? 00:37:50.433 --> 00:37:53.009 I mean, in your own thinking, isn't there a chance 00:37:53.033 --> 00:37:57.345 that what humans are for is to be the universe's sentient things, 00:37:57.369 --> 00:38:00.792 to be the centers of joy and love and happiness and hope? 00:38:00.816 --> 00:38:03.851 And maybe we can build machines that actually help amplify that, 00:38:03.875 --> 00:38:06.539 even if they're not going to become sentient themselves? 00:38:06.563 --> 00:38:07.714 Is that crazy? 00:38:07.738 --> 00:38:11.221 I kind of found myself hoping that, reading your book. NOTE Paragraph 00:38:11.245 --> 00:38:15.102 YNH: Well, I certainly think that the most interesting question today in science 00:38:15.126 --> 00:38:17.549 is the question of consciousness and the mind. 00:38:17.573 --> 00:38:21.071 We are getting better and better in understanding the brain 00:38:21.095 --> 00:38:22.355 and intelligence, 00:38:22.379 --> 00:38:24.916 but we are not getting much better 00:38:24.940 --> 00:38:27.283 in understanding the mind and consciousness. 00:38:27.307 --> 00:38:30.669 People often confuse intelligence and consciousness, 00:38:30.693 --> 00:38:32.992 especially in places like Silicon Valley, 00:38:33.016 --> 00:38:36.773 which is understandable, because in humans, they go together. 00:38:36.797 --> 00:38:40.376 I mean, intelligence basically is the ability to solve problems. 00:38:40.400 --> 00:38:42.942 Consciousness is the ability to feel things, 00:38:42.966 --> 00:38:48.178 to feel joy and sadness and boredom and pain and so forth. 00:38:48.202 --> 00:38:52.241 In Homo sapiens and all other mammals as well -- it's not unique to humans -- 00:38:52.265 --> 00:38:54.912 in all mammals and birds and some other animals, 00:38:54.936 --> 00:38:57.586 intelligence and consciousness go together. 00:38:57.610 --> 00:39:01.188 We often solve problems by feeling things. 00:39:01.212 --> 00:39:02.705 So we tend to confuse them. 00:39:02.729 --> 00:39:04.194 But they are different things. NOTE Paragraph 00:39:04.218 --> 00:39:07.306 What's happening today in places like Silicon Valley 00:39:07.330 --> 00:39:10.956 is that we are creating artificial intelligence 00:39:10.980 --> 00:39:12.802 but not artificial consciousness. 00:39:12.826 --> 00:39:16.206 There has been an amazing development in computer intelligence 00:39:16.230 --> 00:39:17.792 over the last 50 years, 00:39:17.816 --> 00:39:22.017 and exactly zero development in computer consciousness, 00:39:22.041 --> 00:39:25.727 and there is no indication that computers are going to become conscious 00:39:25.751 --> 00:39:28.282 anytime soon. 
NOTE Paragraph 00:39:28.306 --> 00:39:33.956 So first of all, if there is some cosmic role for consciousness, 00:39:33.980 --> 00:39:36.110 it's not unique to Homo sapiens. 00:39:36.134 --> 00:39:38.453 Cows are conscious, pigs are conscious, 00:39:38.477 --> 00:39:41.310 chimpanzees are conscious, chickens are conscious, 00:39:41.334 --> 00:39:45.187 so if we go that way, first of all, we need to broaden our horizons 00:39:45.211 --> 00:39:49.936 and remember very clearly we are not the only sentient beings on Earth, 00:39:49.960 --> 00:39:51.755 and when it comes to sentience -- 00:39:51.779 --> 00:39:55.091 when it comes to intelligence, there is good reason to think 00:39:55.115 --> 00:39:58.411 we are the most intelligent of the whole bunch. NOTE Paragraph 00:39:58.435 --> 00:40:01.009 But when it comes to sentience, 00:40:01.033 --> 00:40:04.191 to say that humans are more sentient than whales, 00:40:04.215 --> 00:40:08.362 or more sentient than baboons or more sentient than cats, 00:40:08.386 --> 00:40:10.680 I see no evidence for that. 00:40:10.704 --> 00:40:14.311 So first step is, you go in that direction, expand. 00:40:14.335 --> 00:40:18.317 And then the second question of what is it for, 00:40:18.341 --> 00:40:20.123 I would reverse it 00:40:20.147 --> 00:40:24.383 and I would say that I don't think sentience is for anything. 00:40:24.407 --> 00:40:28.579 I think we don't need to find our role in the universe. 00:40:28.603 --> 00:40:34.416 The really important thing is to liberate ourselves from suffering. 00:40:34.440 --> 00:40:37.433 What characterizes sentient beings 00:40:37.457 --> 00:40:40.177 in contrast to robots, to stones, 00:40:40.201 --> 00:40:41.384 to whatever, 00:40:41.408 --> 00:40:45.199 is that sentient beings suffer, can suffer, 00:40:45.223 --> 00:40:47.563 and what they should focus on 00:40:47.587 --> 00:40:51.707 is not finding their place in some mysterious cosmic drama. 00:40:51.731 --> 00:40:55.550 They should focus on understanding what suffering is, 00:40:55.574 --> 00:40:58.933 what causes it and how to be liberated from it. NOTE Paragraph 00:40:59.572 --> 00:41:03.049 CA: I know this is a big issue for you, and that was very eloquent. 00:41:03.073 --> 00:41:06.487 We're going to have a blizzard of questions from the audience here, 00:41:06.511 --> 00:41:08.431 and maybe from Facebook as well, 00:41:08.455 --> 00:41:10.128 and maybe some comments as well. 00:41:10.152 --> 00:41:11.948 So let's go quick. 00:41:11.972 --> 00:41:13.402 There's one right here. 00:41:15.052 --> 00:41:17.861 Keep your hands held up at the back if you want the mic, 00:41:17.885 --> 00:41:19.304 and we'll get it back to you. NOTE Paragraph 00:41:19.328 --> 00:41:22.447 Question: In your work, you talk a lot about the fictional stories 00:41:22.471 --> 00:41:23.815 that we accept as truth, 00:41:23.839 --> 00:41:25.556 and we live our lives by them. 00:41:25.580 --> 00:41:28.079 As an individual, knowing that, 00:41:28.103 --> 00:41:31.849 how does it impact the stories that you choose to live your life by, 00:41:31.873 --> 00:41:35.613 and do you confuse them with the truth, like all of us? NOTE Paragraph 00:41:36.246 --> 00:41:37.457 YNH: I try not to. 00:41:37.481 --> 00:41:40.249 I mean, for me, maybe the most important question, 00:41:40.273 --> 00:41:42.751 both as a scientist and as a person, 00:41:42.775 --> 00:41:46.650 is how to tell the difference between fiction and reality, 00:41:46.674 --> 00:41:49.270 because reality is there.
00:41:49.294 --> 00:41:51.376 I'm not saying that everything is fiction. 00:41:51.400 --> 00:41:54.452 It's just very difficult for human beings to tell the difference 00:41:54.476 --> 00:41:56.093 between fiction and reality, 00:41:56.117 --> 00:42:01.062 and it has become more and more difficult as history progressed, 00:42:01.086 --> 00:42:03.537 because the fictions that we have created -- 00:42:03.561 --> 00:42:06.729 nations and gods and money and corporations -- 00:42:06.753 --> 00:42:08.263 they now control the world. 00:42:08.287 --> 00:42:09.464 So just to even think, 00:42:09.488 --> 00:42:12.633 "Oh, this is just all fictional entities that we've created," 00:42:12.657 --> 00:42:14.104 is very difficult. 00:42:14.128 --> 00:42:16.408 But reality is there. NOTE Paragraph 00:42:17.043 --> 00:42:19.048 For me the best ... 00:42:19.072 --> 00:42:21.195 There are several tests 00:42:21.219 --> 00:42:23.989 to tell the difference between fiction and reality. 00:42:24.013 --> 00:42:27.439 The simplest one, the best one that I can say in short, 00:42:27.463 --> 00:42:29.044 is the test of suffering. 00:42:29.068 --> 00:42:30.621 If it can suffer, it's real. 00:42:31.192 --> 00:42:32.886 If it can't suffer, it's not real. 00:42:32.910 --> 00:42:34.375 A nation cannot suffer. 00:42:34.399 --> 00:42:35.969 That's very, very clear. 00:42:35.993 --> 00:42:37.931 Even if a nation loses a war, 00:42:37.955 --> 00:42:42.020 we say, "Germany suffered a defeat in the First World War," 00:42:42.044 --> 00:42:43.209 it's a metaphor. 00:42:43.233 --> 00:42:45.790 Germany cannot suffer. Germany has no mind. 00:42:45.814 --> 00:42:47.467 Germany has no consciousness. 00:42:47.491 --> 00:42:51.149 Germans can suffer, yes, but Germany cannot. 00:42:51.173 --> 00:42:54.142 Similarly, when a bank goes bust, 00:42:54.166 --> 00:42:55.937 the bank cannot suffer. 00:42:55.961 --> 00:42:59.352 When the dollar loses its value, the dollar doesn't suffer. 00:42:59.376 --> 00:43:01.626 People can suffer. Animals can suffer. 00:43:01.650 --> 00:43:02.806 This is real. 00:43:02.830 --> 00:43:07.359 So I would start, if you really want to see reality, 00:43:07.383 --> 00:43:09.447 I would go through the door of suffering. 00:43:09.471 --> 00:43:12.425 If you can really understand what suffering is, 00:43:12.449 --> 00:43:14.672 this will give you also the key 00:43:14.696 --> 00:43:16.713 to understand what reality is. NOTE Paragraph 00:43:16.737 --> 00:43:19.520 CA: There's a Facebook question here that connects to this, 00:43:19.544 --> 00:43:22.521 from someone around the world in a language that I cannot read. NOTE Paragraph 00:43:22.545 --> 00:43:24.762 YNH: Oh, it's Hebrew. CA: Hebrew. There you go. NOTE Paragraph 00:43:24.786 --> 00:43:25.848 (Laughter) NOTE Paragraph 00:43:25.872 --> 00:43:27.036 Can you read the name? NOTE Paragraph 00:43:27.060 --> 00:43:28.935 YNH: [??] NOTE Paragraph 00:43:28.959 --> 00:43:30.803 CA: Well, thank you for writing in. 00:43:30.827 --> 00:43:35.382 The question is: "Is the post-truth era really a brand-new era, 00:43:35.406 --> 00:43:39.793 or just another climax or moment in a never-ending trend?" NOTE Paragraph 00:43:40.701 --> 00:43:44.030 YNH: Personally, I don't connect with this idea of post-truth. 00:43:44.054 --> 00:43:46.762 My basic reaction as a historian is: 00:43:46.786 --> 00:43:50.681 If this is the era of post-truth, when the hell was the era of truth? NOTE Paragraph 00:43:50.705 --> 00:43:51.956 CA: Right.
NOTE Paragraph 00:43:51.980 --> 00:43:53.300 (Laughter) NOTE Paragraph 00:43:53.324 --> 00:43:58.007 YNH: Was it the 1980s, the 1950s, the Middle Ages? 00:43:58.031 --> 00:44:02.423 I mean, we have always lived in an era, in a way, of post-truth. NOTE Paragraph 00:44:02.883 --> 00:44:05.194 CA: But I'd push back on that, 00:44:05.218 --> 00:44:07.888 because I think what people are talking about 00:44:07.912 --> 00:44:14.872 is that there was a world where you had fewer journalistic outlets, 00:44:14.896 --> 00:44:18.544 where there were traditions, that things were fact-checked. 00:44:18.568 --> 00:44:22.513 It was incorporated into the charter of those organizations 00:44:22.537 --> 00:44:24.704 that the truth mattered. 00:44:24.728 --> 00:44:26.477 So if you believe in a reality, 00:44:26.501 --> 00:44:28.724 then what you write is information. 00:44:28.748 --> 00:44:32.569 There was a belief that that information should connect to reality in a real way, 00:44:32.593 --> 00:44:35.554 and if you wrote a headline, it was a serious, earnest attempt 00:44:35.578 --> 00:44:37.881 to reflect something that had actually happened. 00:44:37.905 --> 00:44:39.756 And people didn't always get it right. NOTE Paragraph 00:44:39.780 --> 00:44:41.789 But I think the concern now is you've got 00:44:41.813 --> 00:44:44.131 a technological system that's incredibly powerful 00:44:44.155 --> 00:44:48.325 that, for a while at least, massively amplified anything 00:44:48.349 --> 00:44:51.129 with no attention paid to whether it connected to reality, 00:44:51.153 --> 00:44:54.307 only to whether it connected to clicks and attention, 00:44:54.331 --> 00:44:55.947 and that that was arguably toxic. 00:44:55.971 --> 00:44:58.407 That's a reasonable concern, isn't it? NOTE Paragraph 00:44:58.431 --> 00:45:00.717 YNH: Yeah, it is. I mean, the technology changes, 00:45:00.741 --> 00:45:05.969 and it's now easier to disseminate both truth and fiction and falsehood. 00:45:05.993 --> 00:45:07.996 It goes both ways. 00:45:08.020 --> 00:45:12.599 It's also much easier, though, to spread the truth than it was ever before. 00:45:12.623 --> 00:45:16.308 But I don't think there is anything essentially new 00:45:16.332 --> 00:45:21.052 about this disseminating fictions and errors. 00:45:21.076 --> 00:45:25.110 There is nothing that -- I don't know -- Joseph Goebbels didn't know 00:45:25.134 --> 00:45:30.573 about all this idea of fake news and post-truth. 00:45:30.597 --> 00:45:34.315 He famously said that if you repeat a lie often enough, 00:45:34.339 --> 00:45:36.160 people will think it's the truth, 00:45:36.184 --> 00:45:38.540 and the bigger the lie, the better, 00:45:38.564 --> 00:45:44.587 because people won't even think that something so big can be a lie. 00:45:44.611 --> 00:45:50.269 I think that fake news has been with us for thousands of years. 00:45:50.293 --> 00:45:52.194 Just think of the Bible. NOTE Paragraph 00:45:52.218 --> 00:45:53.605 (Laughter) NOTE Paragraph 00:45:53.629 --> 00:45:54.916 CA: But there is a concern 00:45:54.940 --> 00:45:58.957 that the fake news is associated with tyrannical regimes, 00:45:58.981 --> 00:46:01.558 and when you see a rise in fake news, 00:46:01.582 --> 00:46:06.304 that is a canary in the coal mine that there may be dark times coming. NOTE Paragraph 00:46:08.124 --> 00:46:15.086 YNH: Yeah. I mean, the intentional use of fake news is a disturbing sign. 00:46:15.812 --> 00:46:20.393 But I'm not saying that it's not bad, I'm just saying that it's not new.
NOTE Paragraph 00:46:20.820 --> 00:46:23.574 CA: There's a lot of interest on Facebook on this question 00:46:23.598 --> 00:46:28.598 about global governance versus nationalism. 00:46:29.292 --> 00:46:30.800 Question here from Phil Dennis: 00:46:30.824 --> 00:46:34.320 "How do we get people, governments, to relinquish power?" 00:46:34.344 --> 00:46:38.259 Is that -- is that -- actually, the text is so big 00:46:38.283 --> 00:46:39.823 I can't read the full question. 00:46:39.847 --> 00:46:41.386 But is that a necessity? 00:46:41.410 --> 00:46:44.022 Is it going to take war to get there? 00:46:44.046 --> 00:46:47.736 Sorry, Phil -- I mangled your question, but I blame the text right here. NOTE Paragraph 00:46:47.760 --> 00:46:49.860 YNH: One option that some people talk about 00:46:49.884 --> 00:46:54.623 is that only a catastrophe can shake humankind 00:46:54.647 --> 00:46:59.911 and open the path to a real system of global governance, 00:46:59.935 --> 00:47:04.083 and they say that we can't do it before the catastrophe, 00:47:04.107 --> 00:47:06.908 but we need to start laying the foundations 00:47:06.932 --> 00:47:09.432 so that when the disaster strikes, 00:47:09.456 --> 00:47:11.638 we can react quickly. 00:47:11.662 --> 00:47:15.662 But people will just not have the motivation to do such a thing 00:47:15.686 --> 00:47:17.698 before the disaster strikes. 00:47:17.722 --> 00:47:19.987 Another thing that I would emphasize 00:47:20.011 --> 00:47:25.065 is that anybody who is really interested in global governance 00:47:25.089 --> 00:47:27.990 should always make it very, very clear 00:47:28.014 --> 00:47:34.598 that it doesn't replace or abolish local identities and communities, 00:47:34.622 --> 00:47:37.578 that it should come both as -- 00:47:37.602 --> 00:47:40.909 It should be part of a single package. NOTE Paragraph 00:47:40.933 --> 00:47:44.311 CA: I want to hear more on this, 00:47:44.335 --> 00:47:47.388 because the very words "global governance" 00:47:47.412 --> 00:47:52.001 are almost the epitome of evil in the mindset of a lot of people 00:47:52.025 --> 00:47:53.351 on the alt-right right now. 00:47:53.375 --> 00:47:56.329 It just seems scary, remote, distant, and it has let them down, 00:47:56.353 --> 00:48:00.469 and so globalists, global governance -- no, go away! 00:48:00.493 --> 00:48:04.175 And many view the election as the ultimate poke in the eye 00:48:04.199 --> 00:48:05.677 to anyone who believes in that. 00:48:05.701 --> 00:48:09.252 So how do we change the narrative 00:48:09.276 --> 00:48:12.251 so that it doesn't seem so scary and remote? 00:48:12.275 --> 00:48:15.019 Build more on this idea of it being compatible 00:48:15.043 --> 00:48:17.664 with local identity, local communities. NOTE Paragraph 00:48:17.688 --> 00:48:20.288 YNH: Well, I think again we should start 00:48:20.312 --> 00:48:23.444 really with the biological realities 00:48:23.468 --> 00:48:25.479 of Homo sapiens. 00:48:25.503 --> 00:48:29.621 And biology tells us two things about Homo sapiens 00:48:29.645 --> 00:48:31.902 which are very relevant to this issue: 00:48:31.926 --> 00:48:34.955 first of all, that we are completely dependent 00:48:34.979 --> 00:48:37.574 on the ecological system around us, 00:48:37.598 --> 00:48:41.057 and that today we are talking about a global system. 00:48:41.081 --> 00:48:42.438 You cannot escape that.
NOTE Paragraph 00:48:42.462 --> 00:48:46.084 And at the same time, biology tells us about Homo sapiens 00:48:46.108 --> 00:48:48.355 that we are social animals, 00:48:48.379 --> 00:48:53.016 but that we are social on a very, very local level. 00:48:53.040 --> 00:48:56.585 It's just a simple fact of humanity 00:48:56.609 --> 00:49:01.406 that we cannot have intimate familiarity 00:49:01.430 --> 00:49:05.305 with more than about 150 individuals. 00:49:05.329 --> 00:49:09.626 The size of the natural group, 00:49:09.650 --> 00:49:12.752 the natural community of Homo sapiens, 00:49:12.776 --> 00:49:16.120 is not more than 150 individuals, 00:49:16.144 --> 00:49:22.543 and everything beyond that is really based on all kinds of imaginary stories 00:49:22.567 --> 00:49:24.614 and large-scale institutions, 00:49:24.638 --> 00:49:29.014 and I think that we can find a way, 00:49:29.038 --> 00:49:33.608 again, based on a biological understanding of our species, 00:49:33.632 --> 00:49:35.714 to weave the two together 00:49:35.738 --> 00:49:38.814 and to understand that today in the 21st century, 00:49:38.838 --> 00:49:44.374 we need both the global level and the local community. NOTE Paragraph 00:49:44.398 --> 00:49:46.415 And I would go even further than that 00:49:46.439 --> 00:49:49.762 and say that it starts with the body itself. 00:49:50.500 --> 00:49:54.842 The feelings that people today have of alienation and loneliness 00:49:54.866 --> 00:49:58.082 and not finding their place in the world, 00:49:58.106 --> 00:50:03.835 I would think that the chief problem is not global capitalism. 00:50:04.285 --> 00:50:07.311 The chief problem is that over the last hundred years, 00:50:07.335 --> 00:50:11.039 people have been becoming disembodied, 00:50:11.063 --> 00:50:14.222 have been distancing themselves from their body. 00:50:14.246 --> 00:50:17.142 As a hunter-gatherer or even as a peasant, 00:50:17.166 --> 00:50:21.364 to survive, you need to be constantly in touch 00:50:21.388 --> 00:50:23.571 with your body and with your senses, 00:50:23.595 --> 00:50:24.776 every moment. 00:50:24.800 --> 00:50:26.947 If you go to the forest to look for mushrooms 00:50:26.971 --> 00:50:29.348 and you don't pay attention to what you hear, 00:50:29.372 --> 00:50:31.248 to what you smell, to what you taste, 00:50:31.272 --> 00:50:32.423 you're dead. 00:50:32.447 --> 00:50:34.598 So you must be very connected. NOTE Paragraph 00:50:34.622 --> 00:50:39.218 In the last hundred years, people are losing their ability 00:50:39.242 --> 00:50:42.114 to be in touch with their body and their senses, 00:50:42.138 --> 00:50:44.324 to hear, to smell, to feel. 00:50:44.348 --> 00:50:47.474 More and more attention goes to screens, 00:50:47.498 --> 00:50:49.018 to what is happening elsewhere, 00:50:49.042 --> 00:50:50.263 some other time. 00:50:50.287 --> 00:50:52.718 This, I think, is the deep reason 00:50:52.742 --> 00:50:56.636 for the feelings of alienation and loneliness and so forth, 00:50:56.660 --> 00:50:59.162 and therefore part of the solution 00:50:59.186 --> 00:51:03.450 is not to bring back some mass nationalism, 00:51:03.474 --> 00:51:07.598 but also reconnect with our own bodies, 00:51:07.622 --> 00:51:10.885 and if you are back in touch with your body, 00:51:10.909 --> 00:51:14.079 you will feel much more at home in the world also. NOTE Paragraph 00:51:14.103 --> 00:51:17.788 CA: Well, depending on how things go, we may all be back in the forest soon. 
00:51:17.812 --> 00:51:20.161 We're going to have one more question in the room 00:51:20.185 --> 00:51:21.688 and one more on Facebook. NOTE Paragraph 00:51:21.712 --> 00:51:25.093 Ama Adi-Dako: Hello. I'm from Ghana, West Africa, and my question is: 00:51:25.117 --> 00:51:29.719 I'm wondering how do you present and justify the idea of global governance 00:51:29.743 --> 00:51:32.754 to countries that have been historically disenfranchised 00:51:32.778 --> 00:51:34.823 by the effects of globalization, 00:51:34.847 --> 00:51:37.593 and also, if we're talking about global governance, 00:51:37.617 --> 00:51:41.241 it sounds to me like it will definitely come from a very Westernized idea 00:51:41.265 --> 00:51:43.439 of what the "global" is supposed to look like. 00:51:43.463 --> 00:51:46.753 So how do we present and justify that idea of global 00:51:46.777 --> 00:51:49.770 versus wholly nationalist 00:51:49.794 --> 00:51:53.129 to people in countries like Ghana and Nigeria and Togo 00:51:53.153 --> 00:51:55.329 and other countries like that? NOTE Paragraph 00:51:56.131 --> 00:52:02.545 YNH: I would start by saying that history is extremely unfair, 00:52:02.569 --> 00:52:06.491 and that we should realize that. 00:52:07.004 --> 00:52:10.053 Many of the countries that suffered most 00:52:10.077 --> 00:52:14.216 from the last 200 years of globalization 00:52:14.240 --> 00:52:16.200 and imperialism and industrialization 00:52:16.224 --> 00:52:21.934 are exactly the countries which are also most likely to suffer most 00:52:21.958 --> 00:52:24.747 from the next wave. 00:52:24.771 --> 00:52:28.765 And we should be very, very clear about that. 00:52:29.477 --> 00:52:32.528 If we don't have a global governance, 00:52:32.552 --> 00:52:35.755 and if we suffer from climate change, 00:52:35.779 --> 00:52:38.036 from technological disruptions, 00:52:38.060 --> 00:52:41.661 the worst suffering will not be in the US. 00:52:41.685 --> 00:52:46.781 The worst suffering will be in Ghana, will be in Sudan, will be in Syria, 00:52:46.805 --> 00:52:49.542 will be in Bangladesh, will be in those places. NOTE Paragraph 00:52:49.566 --> 00:52:55.602 So I think those countries have an even greater incentive 00:52:55.626 --> 00:53:00.353 to do something about the next wave of disruption, 00:53:00.377 --> 00:53:02.902 whether it's ecological or whether it's technological. 00:53:02.926 --> 00:53:05.772 Again, if you think about technological disruption, 00:53:05.796 --> 00:53:10.412 so if AI and 3D printers and robots will take the jobs 00:53:10.436 --> 00:53:12.805 from billions of people, 00:53:12.829 --> 00:53:15.954 I worry far less about the Swedes 00:53:15.978 --> 00:53:19.583 than about the people in Ghana or in Bangladesh. 00:53:19.607 --> 00:53:24.835 And therefore, because history is so unfair 00:53:24.859 --> 00:53:29.205 and the results of a calamity 00:53:29.229 --> 00:53:31.597 will not be shared equally between everybody, 00:53:31.621 --> 00:53:36.054 as usual, the rich will be able to get away 00:53:36.078 --> 00:53:39.550 from the worst consequences of climate change 00:53:39.574 --> 00:53:42.419 in a way that the poor will not be able to. NOTE Paragraph 00:53:43.347 --> 00:53:46.755 CA: And here's a great question from Cameron Taylor on Facebook: 00:53:46.779 --> 00:53:48.900 "At the end of 'Sapiens,' 00:53:48.924 --> 00:53:50.987 you said we should be asking the question, 00:53:51.011 --> 00:53:53.367 'What do we want to want?'
00:53:53.391 --> 00:53:56.378 Well, what do you think we should want to want?" NOTE Paragraph 00:53:56.402 --> 00:53:59.933 YNH: I think we should want to want to know the truth, 00:53:59.957 --> 00:54:02.607 to understand reality. 00:54:03.207 --> 00:54:08.321 Mostly what we want is to change reality, 00:54:08.345 --> 00:54:12.063 to fit it to our own desires, to our own wishes, 00:54:12.087 --> 00:54:15.807 and I think we should first want to understand it. 00:54:15.831 --> 00:54:19.595 If you look at the long-term trajectory of history, 00:54:19.619 --> 00:54:22.355 what you see is that for thousands of years 00:54:22.379 --> 00:54:25.715 we humans have been gaining control of the world outside us 00:54:25.739 --> 00:54:29.233 and trying to shape it to fit our own desires. 00:54:29.257 --> 00:54:32.445 And we've gained control of the other animals, 00:54:32.469 --> 00:54:34.000 of the rivers, of the forests, 00:54:34.024 --> 00:54:37.517 and reshaped them completely, 00:54:37.541 --> 00:54:40.902 causing an ecological destruction 00:54:40.926 --> 00:54:44.104 without making ourselves satisfied. NOTE Paragraph 00:54:44.128 --> 00:54:47.930 So the next step is we turn our gaze inwards, 00:54:47.954 --> 00:54:52.502 and we say OK, getting control of the world outside us 00:54:52.526 --> 00:54:54.390 did not really make us satisfied. 00:54:54.414 --> 00:54:57.113 Let's now try to gain control of the world inside us. 00:54:57.137 --> 00:54:59.300 This is the really big project 00:54:59.324 --> 00:55:03.620 of science and technology and industry in the 21st century -- 00:55:03.644 --> 00:55:07.166 to try and gain control of the world inside us, 00:55:07.190 --> 00:55:12.113 to learn how to engineer and produce bodies and brains and minds. 00:55:12.137 --> 00:55:16.779 These are likely to be the main products of the 21st century economy. 00:55:16.803 --> 00:55:20.624 When people think about the future, very often they think in terms of, 00:55:20.648 --> 00:55:24.595 "Oh, I want to gain control of my body and of my brain." 00:55:24.619 --> 00:55:27.429 And I think that's very dangerous. NOTE Paragraph 00:55:27.453 --> 00:55:30.719 If we've learned anything from our previous history, 00:55:30.743 --> 00:55:34.656 it's that yes, we gain the power to manipulate, 00:55:34.680 --> 00:55:37.470 but because we didn't really understand the complexity 00:55:37.494 --> 00:55:39.299 of the ecological system, 00:55:39.323 --> 00:55:43.013 we are now facing an ecological meltdown. 00:55:43.037 --> 00:55:48.443 And if we now try to reengineer the world inside us 00:55:48.467 --> 00:55:50.599 without really understanding it, 00:55:50.623 --> 00:55:54.939 especially without understanding the complexity of our mental system, 00:55:54.963 --> 00:55:59.623 we might cause a kind of internal ecological disaster, 00:55:59.647 --> 00:56:03.190 and we'll face a kind of mental meltdown inside us. NOTE Paragraph 00:56:04.270 --> 00:56:06.712 CA: Putting all the pieces together here -- 00:56:06.736 --> 00:56:09.416 the current politics, the coming technology, 00:56:09.440 --> 00:56:11.590 concerns like the one you've just outlined -- 00:56:11.614 --> 00:56:14.709 I mean, it seems like you yourself are in quite a bleak place 00:56:14.733 --> 00:56:16.354 when you think about the future. 00:56:16.378 --> 00:56:17.960 You're pretty worried about it. 00:56:17.984 --> 00:56:19.176 Is that right? 00:56:19.200 --> 00:56:25.888 And if there was one cause for hope, how would you state that?
NOTE Paragraph 00:56:25.912 --> 00:56:30.075 YNH: I focus on the most dangerous possibilities 00:56:30.099 --> 00:56:33.120 partly because this is like my job or responsibility 00:56:33.144 --> 00:56:34.925 as a historian or social critic. 00:56:34.949 --> 00:56:39.711 I mean, the industry focuses mainly on the positive sides, 00:56:39.735 --> 00:56:43.096 so it's the job of historians and philosophers and sociologists 00:56:43.120 --> 00:56:47.561 to highlight the more dangerous potential of all these new technologies. 00:56:47.585 --> 00:56:50.068 I don't think any of that is inevitable. 00:56:50.092 --> 00:56:53.131 Technology is never deterministic. 00:56:53.155 --> 00:56:54.872 You can use the same technology 00:56:54.896 --> 00:56:57.887 to create very different kinds of societies. NOTE Paragraph 00:56:57.911 --> 00:56:59.949 If you look at the 20th century, 00:56:59.973 --> 00:57:02.754 so, the technologies of the Industrial Revolution, 00:57:02.778 --> 00:57:05.835 the trains and electricity and all that 00:57:05.859 --> 00:57:08.911 could be used to create a communist dictatorship 00:57:08.935 --> 00:57:11.740 or a fascist regime or a liberal democracy. 00:57:11.764 --> 00:57:14.292 The trains did not tell you what to do with them. 00:57:14.316 --> 00:57:18.768 Similarly, now, artificial intelligence and bioengineering and all of that -- 00:57:18.792 --> 00:57:22.306 they don't predetermine a single outcome. 00:57:22.886 --> 00:57:26.063 Humanity can rise up to the challenge, 00:57:26.087 --> 00:57:27.778 and the best example we have 00:57:27.802 --> 00:57:31.542 of humanity rising up to the challenge of a new technology 00:57:31.566 --> 00:57:33.289 is nuclear weapons. 00:57:33.313 --> 00:57:36.322 In the late 1940s, '50s, 00:57:36.346 --> 00:57:38.485 many people were convinced 00:57:38.509 --> 00:57:42.815 that sooner or later the Cold War will end in a nuclear catastrophe, 00:57:42.839 --> 00:57:44.614 destroying human civilization. 00:57:44.638 --> 00:57:46.118 And this did not happen. 00:57:46.142 --> 00:57:52.562 In fact, nuclear weapons prompted humans all over the world 00:57:52.586 --> 00:57:57.327 to change the way that they manage international politics 00:57:57.351 --> 00:57:59.720 to reduce violence. NOTE Paragraph 00:57:59.744 --> 00:58:02.983 And many countries basically took out war 00:58:03.007 --> 00:58:04.881 from their political toolkit. 00:58:04.905 --> 00:58:09.175 They no longer tried to pursue their interests with warfare. 00:58:09.580 --> 00:58:12.850 Not all countries have done so, but many countries have. 00:58:12.874 --> 00:58:16.808 And this is maybe the most important reason 00:58:16.832 --> 00:58:22.934 why international violence declined dramatically since 1945, 00:58:22.958 --> 00:58:26.296 and today, as I said, more people commit suicide 00:58:26.320 --> 00:58:28.527 than are killed in war. 00:58:28.551 --> 00:58:33.380 So this, I think, gives us a good example 00:58:33.404 --> 00:58:37.246 that even the most frightening technology, 00:58:37.270 --> 00:58:39.805 humans can rise up to the challenge 00:58:39.829 --> 00:58:42.852 and actually some good can come out of it. 00:58:42.876 --> 00:58:47.163 The problem is, we have very little margin for error. 00:58:47.187 --> 00:58:49.396 If we don't get it right, 00:58:49.420 --> 00:58:53.091 we might not have a second option to try again. NOTE Paragraph 00:58:54.337 --> 00:58:55.904 CA: That's a very powerful note, 00:58:55.928 --> 00:58:58.733 on which I think we should draw this to a conclusion. 
00:58:58.757 --> 00:59:01.868 Before I wrap up, I just want to say one thing to people here 00:59:01.892 --> 00:59:07.438 and to the global TED community watching online, anyone watching online: 00:59:07.462 --> 00:59:10.355 help us with these dialogues. 00:59:10.379 --> 00:59:12.929 If you believe, like we do, 00:59:12.953 --> 00:59:15.933 that we need to find a different kind of conversation, 00:59:15.957 --> 00:59:18.190 now more than ever, help us do it. 00:59:18.214 --> 00:59:20.237 Reach out to other people, 00:59:21.269 --> 00:59:24.009 try and have conversations with people you disagree with, 00:59:24.033 --> 00:59:25.216 understand them, 00:59:25.240 --> 00:59:26.770 pull the pieces together, 00:59:26.794 --> 00:59:30.686 and help us figure out how to take these conversations forward 00:59:30.710 --> 00:59:32.964 so we can make a real contribution 00:59:32.988 --> 00:59:35.733 to what's happening in the world right now. NOTE Paragraph 00:59:35.757 --> 00:59:39.076 I think everyone feels more alive, 00:59:39.100 --> 00:59:41.410 more concerned, more engaged 00:59:41.434 --> 00:59:43.963 with the politics of the moment. 00:59:43.987 --> 00:59:46.441 The stakes do seem quite high, 00:59:46.465 --> 00:59:50.977 so help us respond to it in a wise, wise way. NOTE Paragraph 00:59:51.001 --> 00:59:52.596 Yuval Harari, thank you. NOTE Paragraph 00:59:52.620 --> 00:59:55.928 (Applause)