Chris Anderson: Hello. Welcome to this TED Dialogues. It's the first of a series that's going to be done in response to the current political upheaval.

I don't know about you; I've become quite concerned about the growing divisiveness in this country and in the world. No one's listening to each other. Right? They aren't. I mean, it feels like we need a different kind of conversation, one that's based on -- I don't know, on reason, listening, on understanding, on a broader context. That's at least what we're going to try in these TED Dialogues, starting today.

And there's no one I'd be more excited to have with us to kick this off. This is a mind right here that thinks pretty much like no one else on the planet, I would hasten to say. I'm serious.

(Yuval Noah Harari laughs)

I'm serious. He synthesizes history with underlying ideas in a way that kind of takes your breath away. So, some of you will know this book, "Sapiens." Has anyone here read "Sapiens"?

(Applause)

I mean, I could not put it down. The way that he tells the story of mankind through big ideas that really make you think differently -- it's kind of amazing. And here's the follow-up, which I think is being published in the US next week.

YNH: Yeah, next week.

CA: "Homo Deus." Now, this is the history of the next hundred years. I've had a chance to read it. It's extremely dramatic, and I daresay, for some people, quite alarming. It's a must-read. And honestly, we couldn't have someone better to help make sense of what on Earth is happening in the world right now. So a warm welcome, please, to Yuval Noah Harari.

(Applause)

It's great to be joined by our friends on Facebook and around the Web. Hello, Facebook. And all of you, as I start asking questions of Yuval, come up with your own questions, and not necessarily about the political scandal du jour, but about the broader understanding of: Where are we heading?

You ready? OK, we're going to go. So here we are, Yuval: New York City, 2017, there's a new president in power, and shock waves rippling around the world. What on Earth is happening?

YNH: I think the basic thing that happened is that we have lost our story.
Humans think in stories, and we try to make sense of the world by telling stories. And for the last few decades, we had a very simple and very attractive story about what's happening in the world. And the story said that, oh, what's happening is that the economy is being globalized, politics is being liberalized, and the combination of the two will create paradise on Earth, and we just need to keep on globalizing the economy and liberalizing the political system, and everything will be wonderful. And 2016 is the moment when a very large segment, even of the Western world, stopped believing in this story. For good or bad reasons -- it doesn't matter. People stopped believing in the story, and when you don't have a story, you don't understand what's happening.

CA: Part of you believes that that story was actually a very effective story. It worked.

YNH: To some extent, yes. According to some measurements, we are now in the best time ever for humankind. Today, for the first time in history, more people die from eating too much than from eating too little, which is an amazing achievement.

(Laughter)

Also for the first time in history, more people die from old age than from infectious diseases, and violence is also down. For the first time in history, more people commit suicide than are killed by crime and terrorism and war put together. Statistically, you are your own worst enemy. At least, of all the people in the world, you are most likely to be killed by yourself --

(Laughter)

which is, again, very good news, compared --

(Laughter)

compared to the level of violence that we saw in previous eras.

CA: But this process of connecting the world ended up with a large group of people kind of feeling left out, and they've reacted. And so we have this bombshell that's sort of ripping through the whole system. I mean, what do you make of what's happened? It feels like the old way that people thought of politics, the left-right divide, has been blown up and replaced. How should we think of this?
YNH: Yeah, the old 20th-century political model of left versus right is now largely irrelevant, and the real divide today is between global and national, global or local. And you see it again all over the world that this is now the main struggle. We probably need completely new political models and completely new ways of thinking about politics.

In essence, what you can say is that we now have global ecology and a global economy, but we have national politics, and this doesn't work together. It makes the political system ineffective, because it has no control over the forces that shape our life. And you have basically two solutions to this imbalance: either de-globalize the economy and turn it back into a national economy, or globalize the political system.

CA: So some, I guess many liberals out there view Trump and his government as kind of irredeemably bad, just awful in every way. Do you see any underlying narrative or political philosophy in there that is at least worth understanding? How would you articulate that philosophy? Is it just the philosophy of nationalism?

YNH: I think the underlying feeling or idea is that something is broken in the political system. It doesn't empower the ordinary person anymore. It doesn't care so much about the ordinary person anymore, and I think this diagnosis of the political disease is correct. With regard to the answers, I am far less certain. I think what we are seeing is the immediate human reaction: if something doesn't work, let's go back. And you see it all over the world: almost nobody in the political system today has any future-oriented vision of where humankind is going. Almost everywhere, you see retrograde vision: "Let's make America great again," like it was great -- I don't know -- in the '50s, in the '80s, sometime; let's go back there. And you go to Russia, a hundred years after Lenin, and Putin's vision for the future is basically, ah, let's go back to the Tsarist empire. And in Israel, where I come from, the hottest political vision of the present is: "Let's build the temple again." So let's go back 2,000 years.
So people are thinking that sometime in the past we lost it. It's like when you've lost your way in the city, and you say, OK, let's go back to the point where I felt secure and start again. I don't think this can work, but for a lot of people, this is their gut instinct.

CA: But why couldn't it work? "America First" is a very appealing slogan in many ways. Patriotism is, in many ways, a very noble thing. It's played a role in promoting cooperation among large numbers of people. Why couldn't you have a world organized in countries, all of which put themselves first?

YNH: For many centuries, even thousands of years, patriotism worked quite well. Of course, it led to wars and so forth, but we shouldn't focus too much on the bad. There are also many, many positive things about patriotism, and the ability to have a large number of people care about each other, sympathize with one another, and come together for collective action.

If you go back to the first nations, thousands of years ago, the people who lived along the Yellow River in China -- it was many, many different tribes, and they all depended on the river for survival and for prosperity, but all of them also suffered from periodic floods and periodic droughts. And no tribe could really do anything about it, because each of them controlled just a tiny section of the river. And then, in a long and complicated process, the tribes coalesced to form the Chinese nation, which controlled the entire Yellow River and had the ability to bring hundreds of thousands of people together to build dams and canals and regulate the river and prevent the worst floods and droughts and raise the level of prosperity for everybody.

And this worked in many places around the world. But in the 21st century, technology is changing all that in a fundamental way. All people in the world are now living alongside the same cyber river, and no single nation can regulate this river by itself. We are all living together on a single planet, which is threatened by our own actions. And if you don't have some kind of global cooperation, nationalism is just not on the right level to tackle the problems, whether it's climate change or technological disruption.
CA: So it was a beautiful idea in a world where most of the action, most of the issues, took place on a national scale, but your argument is that the issues that matter most today no longer take place on a national scale but on a global scale.

YNH: Exactly. All the major problems of the world today are global in essence, and they cannot be solved except through some kind of global cooperation. It's not just climate change, which is, like, the most obvious example people give. I think more in terms of technological disruption. If you think about, for example, artificial intelligence, over the next 20, 30 years, pushing hundreds of millions of people out of the job market -- this is a problem on a global level. It will disrupt the economy of all the countries.

And similarly, if you think about, say, bioengineering and people being afraid of conducting, I don't know, genetic engineering research in humans, it won't help if just a single country, let's say the US, outlaws all genetic experiments in humans, but China or North Korea continues to do it. So the US cannot solve it by itself, and very quickly, the pressure on the US to do the same will be immense, because we are talking about high-risk, high-gain technologies. If somebody else is doing it, I can't allow myself to remain behind. The only way to have effective regulations on things like genetic engineering is to have global regulations. If you just have national regulations, nobody would like to stay behind.

CA: So this is really interesting. It seems to me that this may be one key to provoking at least a constructive conversation between the different sides here, because I think everyone can agree that the starting point of a lot of the anger that's propelled us to where we are is legitimate concern about job loss. Work is gone, a traditional way of life has gone, and it's no wonder that people are furious about that. And in general, they have blamed globalism, global elites, for doing this to them without asking their permission, and that seems like a legitimate complaint. But what I hear you saying is that -- so a key question is: What is the real cause of job loss, both now and going forward?
To the extent that it's about globalism, then the right response, yes, is to shut down borders and keep people out and change trade agreements and so forth. But you're saying, I think, that actually the bigger cause of job loss is not going to be that at all. It's going to originate in technological questions, and we have no chance of solving that unless we operate as a connected world.

YNH: Yeah, I think that -- I don't know about the present, but looking to the future, it's not the Mexicans or Chinese who will take the jobs from the people in Pennsylvania, it's the robots and algorithms. So unless you plan to build a big wall on the border of California --

(Laughter)

the wall on the border with Mexico is going to be very ineffective. And I was struck when I watched the debates before the election -- I was struck that Trump did not even attempt to frighten people by saying the robots will take your jobs. Now, even if it's not true, it doesn't matter. It could have been an extremely effective way of frightening people --

(Laughter)

and galvanizing people: "The robots will take your jobs!" And nobody used that line. And it made me afraid, because it meant that no matter what happens in universities and laboratories -- and there, there is already an intense debate about it -- in the mainstream political system and among the general public, people are just unaware that there could be an immense technological disruption -- not in 200 years, but in 10, 20, 30 years -- and we have to do something about it now, partly because most of what we teach children today in school or in college is going to be completely irrelevant to the job market of 2040, 2050. So it's not something we'll need to think about in 2040. We need to think today about what to teach young people.

CA: Yeah, no, absolutely. You've often written about moments in history where humankind has ... entered a new era, unintentionally. Decisions have been made, technologies have been developed, and suddenly the world has changed, possibly in a way that's worse for everyone.
So one of the examples you give in "Sapiens" is the whole agricultural revolution, which, for the actual person tilling the fields, just meant picking up a 12-hour backbreaking workday in place of six hours in the jungle and a much more interesting lifestyle.

(Laughter)

So are we at another possible phase change here, where we kind of sleepwalk into a future that none of us actually wants?

YNH: Yes, very much so. During the agricultural revolution, what happened is that an immense technological and economic revolution empowered the human collective, but when you look at actual individual lives, the life of a tiny elite became much better, and the lives of the majority of people became considerably worse. And this can happen again in the 21st century. No doubt the new technologies will empower the human collective. But we may end up again with a tiny elite reaping all the benefits, taking all the fruits, and the masses of the population finding themselves worse off than they were before, and certainly much worse off than this tiny elite.

CA: And those elites might not even be human elites. They might be cyborgs or --

YNH: Yeah, they could be enhanced superhumans. They could be cyborgs. They could be completely nonorganic elites. They could even be non-conscious algorithms. What we see now in the world is authority shifting away from humans to algorithms. More and more decisions -- about personal lives, about economic matters, about political matters -- are actually being taken by algorithms. If you ask the bank for a loan, chances are your fate is decided by an algorithm, not by a human being. And the general impression is that maybe Homo sapiens just lost it. The world is so complicated, there is so much data, things are changing so fast, that this thing that evolved on the African savanna tens of thousands of years ago -- to cope with a particular environment, a particular volume of information and data -- just can't handle the realities of the 21st century, and the only thing that may be able to handle it is big-data algorithms. So no wonder more and more authority is shifting from us to the algorithms.

CA: So we're in New York City for the first of a series of TED Dialogues with Yuval Harari, and there's a Facebook Live audience out there.
We're excited to have you with us. We'll start coming to some of your questions and questions of people in the room in just a few minutes, so keep those coming.

Yuval, if you're going to make the argument that we need to get past nationalism because of the coming technological ... danger, in a way, presented by so much of what's happening, we've got to have a global conversation about this. Trouble is, it's hard to get people really believing that, I don't know, AI really is an imminent threat, and so forth. The things that people, some people at least, care about much more immediately, perhaps, are climate change, perhaps other issues like refugees, nuclear weapons, and so forth. Would you argue that, where we are right now, somehow those issues need to be dialed up? You've talked about climate change, but Trump has said he doesn't believe in that. So in a way, your most powerful argument, you can't actually use to make this case.

YNH: Yeah, I think with climate change, at first sight, it's quite surprising that there is a very close correlation between nationalism and climate change. I mean, almost always, the people who deny climate change are nationalists. And at first sight, you think: Why? What's the connection? Why don't you have socialists denying climate change? But then, when you think about it, it's obvious -- because nationalism has no solution to climate change. If you want to be a nationalist in the 21st century, you have to deny the problem. If you accept the reality of the problem, then you must accept that, yes, there is still room in the world for patriotism, there is still room in the world for having special loyalties and obligations towards your own people, towards your own country. I don't think anybody is really thinking of abolishing that. But in order to confront climate change, we need additional loyalties and commitments to a level beyond the nation. And that should not be impossible, because people can have several layers of loyalty. You can be loyal to your family and to your community and to your nation, so why can't you also be loyal to humankind as a whole? Of course, there are occasions when it becomes difficult, what to put first, but, you know, life is difficult.
Handle it.

(Laughter)

CA: OK, so I would love to get some questions from the audience here. We've got a microphone here. Speak into it, and Facebook, get them coming, too.

Howard Morgan: One of the things that has clearly made a huge difference in this country and other countries is income inequality -- the dramatic change in income distribution in the US from what it was 50 years ago, and around the world. Is there anything we can do to affect that? Because that gets at a lot of the underlying causes.

YNH: So far I haven't heard a very good idea about what to do about it, again, partly because most ideas remain on the national level, and the problem is global. I mean, one idea that we hear quite a lot about now is universal basic income. But this is a problem. I mean, I think it's a good start, but it's a problematic idea, because it's not clear what "universal" is and it's not clear what "basic" is.

Most people, when they speak about universal basic income, actually mean national basic income. But the problem is global. Let's say that you have AI and 3D printers taking away millions of jobs in Bangladesh, from all the people who make my shirts and my shoes. So what's going to happen? The US government will levy taxes on Google and Apple in California, and use that to pay basic income to unemployed Bangladeshis? If you believe that, you can just as well believe that Santa Claus will come and solve the problem. So unless we have really universal and not national basic income, the deep problems are not going to go away.

And also it's not clear what "basic" is, because what are basic human needs? A thousand years ago, just food and shelter was enough. But today, people will say education is a basic human need; it should be part of the package. But how much? Six years? Twelve years? A PhD? Similarly with health care: let's say that in 20, 30, 40 years, you'll have expensive treatments that can extend human life to 120, I don't know. Will this be part of the basket of basic income or not? It's a very difficult problem, because in a world where people lose their ability to be employed, the only thing they are going to get is this basic income. So what's part of it is a very, very difficult ethical question.
CA: There's a bunch of questions on how the world affords it as well, who pays. There's a question here from Facebook from Lisa Larson: "How does nationalism in the US now compare to that between World War I and World War II in the last century?"

YNH: Well, the good news with regard to the dangers of nationalism is that we are in a much better position than a century ago. A century ago, in 1917, Europeans were killing each other by the millions. In 2016, with Brexit, as far as I remember, a single person lost their life, an MP who was murdered by some extremist. Just a single person. I mean, if Brexit was about British independence, this is the most peaceful war of independence in human history.

And let's say that Scotland will now choose to leave the UK after Brexit. In the 18th century, if Scotland wanted -- and the Scots wanted several times -- to break out of the control of London, the reaction of the government in London was to send an army up north to burn down Edinburgh and massacre the highland tribes. My guess is that if, in 2018, the Scots vote for independence, the London government will not send an army up north to burn down Edinburgh. Very few people are now willing to kill or be killed for Scottish or for British independence. So for all the talk of the rise of nationalism and going back to the 1930s, to the 19th century, in the West at least, the power of national sentiments today is far, far smaller than it was a century ago.

CA: Although some people now, you hear publicly worrying about whether that might be shifting, that there could actually be outbreaks of violence in the US depending on how things turn out. Should we be worried about that, or do you really think things have shifted?

YNH: No, we should be worried. We should be aware of two things. First of all, don't be hysterical. We are not back in the First World War yet. But on the other hand, don't be complacent. We got from 1917 to 2017 not by some divine miracle, but simply by human decisions, and if we now start making the wrong decisions, we could be back in a situation analogous to 1917 within a few years. One of the things I know as a historian is that you should never underestimate human stupidity.
(Laughter)

It's one of the most powerful forces in history -- human stupidity and human violence. Humans do such crazy things for no obvious reason, but again, at the same time, another very powerful force in human history is human wisdom. We have both.

CA: We have with us here moral psychologist Jonathan Haidt, who I think has a question.

Jonathan Haidt: Thanks, Yuval. So you seem to be a fan of global governance, but when you look at the map of the world from Transparency International, which rates the level of corruption of political institutions, it's a vast sea of red with little bits of yellow here and there for those with good institutions. So if we were to have some kind of global governance, what makes you think it would end up being more like Denmark rather than more like Russia or Honduras? And aren't there alternatives, such as what we did with CFCs? There are ways to solve global problems with national governments. What would world government actually look like, and why do you think it would work?

YNH: Well, I don't know what it would look like. Nobody has a model for that yet. The main reason we need it is that many of these issues are lose-lose situations. When you have a win-win situation, like trade, where both sides can benefit from a trade agreement, this is something you can work out even without some kind of global government, because national governments each have an interest in doing it. But when you have a lose-lose situation, like with climate change, it's much more difficult without some overarching authority, real authority. Now, how to get there and what it would look like, I don't know. And certainly there is no obvious reason to think that it would look like Denmark, or that it would be a democracy. Most likely it wouldn't. We don't have workable democratic models for a global government. So maybe it would look more like ancient China than like modern Denmark. But still, given the dangers that we are facing, I think the imperative of having some kind of real ability to force through difficult decisions on the global level is more important than almost anything else.

CA: There's a question from Facebook here, and then we'll get the mic to Andrew.
So, Kat Hebron on Facebook, calling in from Vail: "How would developed nations manage the millions of climate migrants?"

YNH: I don't know.

CA: That's your answer, Kat.

(Laughter)

YNH: And I don't think that they know either. They'll just deny the problem, maybe.

CA: But immigration, generally, is another example of a problem that's very hard to solve on a nation-by-nation basis. One nation can shut its doors, but maybe that stores up problems for the future.

YNH: Yes, I mean -- it's another very good case, especially because it's so much easier to migrate today than it was in the Middle Ages or in ancient times.

CA: Yuval, there's a belief among many technologists, certainly, that political concerns are kind of overblown, that actually, political leaders don't have that much influence in the world, that what really determines humanity at this point is science, invention, companies, many things other than political leaders, and that it's actually very hard for leaders to do much, so we're actually worrying about nothing here.

YNH: Well, first, it should be emphasized that it's true that political leaders' ability to do good is very limited, but their ability to do harm is unlimited. There is a basic imbalance here. You can still press the button and blow everybody up. You have that kind of ability. But if you want, for example, to reduce inequality, that's very, very difficult. But to start a war, you can still do so very easily. So there is a built-in imbalance in the political system today which is very frustrating, where you cannot do a lot of good but you can still do a lot of harm. And this makes the political system still a very big concern.

CA: So as you look at what's happening today, and putting your historian's hat on, do you look back in history at moments when things were going just fine and an individual leader really took the world or their country backwards?

YNH: There are quite a few examples, but I should emphasize, it's never an individual leader. I mean, somebody put him there, and somebody allowed him to continue to be there. So it's never really just the fault of a single individual. There are a lot of people behind every such individual.

CA: Can we have the microphone here, please, to Andrew?
Andrew Solomon: You've talked a lot about the global versus the national, but increasingly, it seems to me, the world situation is in the hands of identity groups. We look at people within the United States who have been recruited by ISIS. We look at these other groups which have formed, which go outside of national bounds but still represent significant authorities. How are they to be integrated into the system, and how is a diverse set of identities to be made coherent under either national or global leadership?

YNH: Well, the problem of such diverse identities is a problem for nationalism as well. Nationalism believes in a single, monolithic identity, and exclusive, or at least more extreme, versions of nationalism believe in an exclusive loyalty to a single identity. And therefore, nationalism has had a lot of problems with people wanting to divide their identities between various groups. So it's not just a problem, say, for a global vision.

And I think, again, history shows that you shouldn't necessarily think in such exclusive terms. If you think that there is just a single identity for a person -- "I am just X, that's it, I can't be several things, I can be just that" -- that's the start of the problem. You have religions, you have nations that sometimes demand exclusive loyalty, but it's not the only option. There are many religions and many nations that enable you to have diverse identities at the same time.

CA: But is one explanation of what's happened in the last year that a group of people have got fed up with, if you like, the liberal elites, for want of a better term, obsessing over many, many different identities, and them feeling, "But what about my identity? I am being completely ignored here. And by the way, I thought I was the majority"? And that that's actually sparked a lot of the anger.

YNH: Yeah. Identity is always problematic, because identity is always based on fictional stories that sooner or later collide with reality. Almost all identities, I mean, beyond the level of the basic community of a few dozen people, are based on a fictional story. They are not the truth. They are not the reality. It's just a story that people invent and tell one another and start believing. And therefore all identities are extremely unstable.
They are not a biological reality. Sometimes nationalists, for example, think that the nation is a biological entity, that the combination of soil and blood creates the nation. But this is just a fictional story.

CA: Soil and blood kind of makes a gooey mess.

(Laughter)

YNH: It does, and it also messes with your mind when you think too much that I am a combination of soil and blood. If you look from a biological perspective, obviously none of the nations that exist today existed 5,000 years ago. Homo sapiens is a social animal, that's for sure. But for millions of years, Homo sapiens and our hominid ancestors lived in small communities of a few dozen individuals. Everybody knew everybody else. Whereas modern nations are imagined communities, in the sense that I don't even know all these people. I come from a relatively small nation, Israel, and of eight million Israelis, I never met most of them. I will never meet most of them. They basically exist here.

CA: But in terms of this identity, this group who feel left out and perhaps have work taken away -- I mean, in "Homo Deus," you actually speak of this group in one sense expanding, that so many people may have their jobs taken away by technology in some way that we could end up with a really large -- I think you call it a "useless class" -- a class where traditionally, as viewed by the economy, these people have no use.

YNH: Yes.

CA: How likely a possibility is that? Is that something we should be terrified about? And can we address it in any way?

YNH: We should think about it very carefully. I mean, nobody really knows what the job market will look like in 2040, 2050. There is a chance many new jobs will appear, but it's not certain. And even if new jobs do appear, it won't necessarily be easy for a 50-year-old truck driver made unemployed by self-driving vehicles to reinvent himself or herself as a designer of virtual worlds. Previously, if you look at the trajectory of the industrial revolution, when machines replaced humans in one type of work, the solution usually came from low-skill work in new lines of business.
So you didn't need any more agricultural workers, so people moved to working in low-skill industrial jobs, and when this was taken away by more and more machines, people moved to low-skill service jobs. Now, when people say there will be new jobs in the future, that humans can do better than AI, that humans can do better than robots, they usually think about high-skill jobs, like software engineers designing virtual worlds. Now, I don't see how an unemployed cashier from Wal-Mart reinvents herself or himself at 50 as a designer of virtual worlds, and certainly I don't see how the millions of unemployed Bangladeshi textile workers will be able to do that. I mean, if they are going to do it, we need to start teaching the Bangladeshis today how to be software designers, and we are not doing it. So what will they do in 20 years?

CA: So it feels like you're really highlighting a question that's really been bugging me the last few months more and more. It's almost a hard question to ask in public, but if any mind has some wisdom to offer on it, maybe it's yours, so I'm going to ask you: What are humans for?

YNH: As far as we know, for nothing.

(Laughter)

I mean, there is no great cosmic drama, some great cosmic plan, in which we have a role to play, where we just need to discover what our role is and then play it to the best of our ability. That has been the story of all religions and ideologies and so forth, but as a scientist, the best I can say is this is not true. There is no universal drama with a role in it for Homo sapiens. So --

CA: I'm going to push back on you just for a minute, just from your own book, because in "Homo Deus," you give really one of the most coherent and understandable accounts of sentience, of consciousness, that unique sort of human skill. You point out that it's different from intelligence, the intelligence that we're building in machines, and that there's actually a lot of mystery around it. How can you be sure there's no purpose when we don't even understand what this sentience thing is? I mean, in your own thinking, isn't there a chance that what humans are for is to be the universe's sentient things, to be the centers of joy and love and happiness and hope?
And maybe we can build machines that actually help amplify that, even if they're not going to become sentient themselves? Is that crazy? I kind of found myself hoping that, reading your book.

YNH: Well, I certainly think that the most interesting question today in science is the question of consciousness and the mind. We are getting better and better at understanding the brain and intelligence, but we are not getting much better at understanding the mind and consciousness. People often confuse intelligence and consciousness, especially in places like Silicon Valley, which is understandable, because in humans, they go together. I mean, intelligence basically is the ability to solve problems. Consciousness is the ability to feel things, to feel joy and sadness and boredom and pain and so forth. In Homo sapiens and all other mammals as well -- it's not unique to humans -- in all mammals and birds and some other animals, intelligence and consciousness go together. We often solve problems by feeling things. So we tend to confuse them. But they are different things.

What's happening today in places like Silicon Valley is that we are creating artificial intelligence but not artificial consciousness. There has been an amazing development in computer intelligence over the last 50 years, and exactly zero development in computer consciousness, and there is no indication that computers are going to become conscious anytime soon.

So first of all, if there is some cosmic role for consciousness, it's not unique to Homo sapiens. Cows are conscious, pigs are conscious, chimpanzees are conscious, chickens are conscious. So if we go that way, first of all, we need to broaden our horizons and remember very clearly we are not the only sentient beings on Earth. When it comes to intelligence, there is good reason to think we are the most intelligent of the whole bunch. But when it comes to sentience, to say that humans are more sentient than whales, or more sentient than baboons or more sentient than cats, I see no evidence for that. So the first step is, you go in that direction: expand. And then the second question, of what is it for, I would reverse, and I would say that I don't think sentience is for anything.
I think we don't need to find our role in the universe. The really important thing is to liberate ourselves from suffering. What characterizes sentient beings, in contrast to robots, to stones, to whatever, is that sentient beings suffer, can suffer, and what they should focus on is not finding their place in some mysterious cosmic drama. They should focus on understanding what suffering is, what causes it, and how to be liberated from it.

CA: I know this is a big issue for you, and that was very eloquent. We're going to have a blizzard of questions from the audience here, and maybe from Facebook as well, and maybe some comments as well. So let's go quick. There's one right here. Keep your hands held up at the back if you want the mic, and we'll get it back to you.

Question: In your work, you talk a lot about the fictional stories that we accept as truth, and we live our lives by them. As an individual, knowing that, how does it impact the stories that you choose to live your life by, and do you confuse them with the truth, like all of us?

YNH: I try not to. I mean, for me, maybe the most important question, both as a scientist and as a person, is how to tell the difference between fiction and reality, because reality is there. I'm not saying that everything is fiction. It's just very difficult for human beings to tell the difference between fiction and reality, and it has become more and more difficult as history progressed, because the fictions that we have created -- nations and gods and money and corporations -- now control the world. So just to even think, "Oh, this is all just fictional entities that we've created," is very difficult. But reality is there.

For me the best ... There are several tests to tell the difference between fiction and reality. The simplest one, the best one that I can say in short, is the test of suffering. If it can suffer, it's real. If it can't suffer, it's not real. A nation cannot suffer. That's very, very clear. Even if a nation loses a war, we say, "Germany suffered a defeat in the First World War," but it's a metaphor. Germany cannot suffer. Germany has no mind. Germany has no consciousness.
0:42:47.491,0:42:51.149 Germans can suffer, yes,[br]but Germany cannot. 0:42:51.173,0:42:54.142 Similarly, when a bank goes bust, 0:42:54.166,0:42:55.937 the bank cannot suffer. 0:42:55.961,0:42:59.352 When the dollar loses its value,[br]the dollar doesn't suffer. 0:42:59.376,0:43:01.626 People can suffer. Animals can suffer. 0:43:01.650,0:43:02.806 This is real. 0:43:02.830,0:43:07.359 So I would start, if you[br]really want to see reality, 0:43:07.383,0:43:09.447 I would go through the door of suffering. 0:43:09.471,0:43:12.425 If you can really understand[br]what suffering is, 0:43:12.449,0:43:14.672 this will give you also the key 0:43:14.696,0:43:16.713 to understand what reality is. 0:43:16.737,0:43:19.520 CA: There's a Facebook question[br]here that connects to this, 0:43:19.544,0:43:22.521 from someone around the world[br]in a language that I cannot read. 0:43:22.545,0:43:24.762 YNH: Oh, it's Hebrew.[br]CA: Hebrew. There you go. 0:43:24.786,0:43:25.848 (Laughter) 0:43:25.872,0:43:27.036 Can you read the name? 0:43:27.060,0:43:28.935 YNH: [??] 0:43:28.959,0:43:30.803 CA: Well, thank you for writing in. 0:43:30.827,0:43:35.382 The question is: "Is the post-truth era[br]really a brand-new era, 0:43:35.406,0:43:39.793 or just another climax or moment[br]in a never-ending trend?" 0:43:40.701,0:43:44.030 YNH: Personally, I don't connect[br]with this idea of post-truth. 0:43:44.054,0:43:46.762 My basic reaction as a historian is: 0:43:46.786,0:43:50.681 If this is the era of post-truth,[br]when the hell was the era of truth? 0:43:50.705,0:43:51.956 CA: Right. 0:43:51.980,0:43:53.300 (Laughter) 0:43:53.324,0:43:58.007 YNH: Was it the 1980s, the 1950s,[br]the Middle Ages? 0:43:58.031,0:44:02.423 I mean, we have always lived[br]in an era, in a way, of post-truth. 0:44:02.883,0:44:05.194 CA: But I'd push back on that, 0:44:05.218,0:44:07.888 because I think what people[br]are talking about 0:44:07.912,0:44:14.872 is that there was a world[br]where you had fewer journalistic outlets, 0:44:14.896,0:44:18.544 where there were traditions[br]that things were fact-checked. 0:44:18.568,0:44:22.513 It was incorporated into the charter[br]of those organizations 0:44:22.537,0:44:24.704 that the truth mattered. 0:44:24.728,0:44:26.477 So if you believe in a reality, 0:44:26.501,0:44:28.724 then what you write is information. 0:44:28.748,0:44:32.569 There was a belief that that information[br]should connect to reality in a real way, 0:44:32.593,0:44:35.554 and if you wrote a headline,[br]it was a serious, earnest attempt 0:44:35.578,0:44:37.881 to reflect something[br]that had actually happened. 0:44:37.905,0:44:39.756 And people didn't always get it right. 0:44:39.780,0:44:41.789 But I think the concern now is you've got 0:44:41.813,0:44:44.131 a technological system[br]that's incredibly powerful 0:44:44.155,0:44:48.325 that, for a while at least,[br]massively amplified anything 0:44:48.349,0:44:51.129 with no attention paid to whether[br]it connected to reality, 0:44:51.153,0:44:54.307 only to whether it connected[br]to clicks and attention, 0:44:54.331,0:44:55.947 and that that was arguably toxic. 0:44:55.971,0:44:58.407 That's a reasonable concern, isn't it? 0:44:58.431,0:45:00.717 YNH: Yeah, it is. I mean,[br]the technology changes, 0:45:00.741,0:45:05.969 and it's now easier to disseminate[br]both truth and fiction and falsehood. 0:45:05.993,0:45:07.996 It goes both ways. 0:45:08.020,0:45:12.599 It's also much easier, though, to spread[br]the truth than it was ever before.
0:45:12.623,0:45:16.308 But I don't think there[br]is anything essentially new 0:45:16.332,0:45:21.052 about disseminating[br]fictions and errors. 0:45:21.076,0:45:25.110 There is nothing that -- I don't know --[br]Joseph Goebbels didn't know 0:45:25.134,0:45:30.573 about this whole idea of fake[br]news and post-truth. 0:45:30.597,0:45:34.315 He famously said that if you repeat[br]a lie often enough, 0:45:34.339,0:45:36.160 people will think it's the truth, 0:45:36.184,0:45:38.540 and the bigger the lie, the better, 0:45:38.564,0:45:44.587 because people won't even think[br]that something so big can be a lie. 0:45:44.611,0:45:50.269 I think that fake news[br]has been with us for thousands of years. 0:45:50.293,0:45:52.194 Just think of the Bible. 0:45:52.218,0:45:53.605 (Laughter) 0:45:53.629,0:45:54.916 CA: But there is a concern 0:45:54.940,0:45:58.957 that fake news is associated[br]with tyrannical regimes, 0:45:58.981,0:46:01.558 and when you see a rise in fake news, 0:46:01.582,0:46:06.304 that is a canary in the coal mine[br]warning that there may be dark times coming. 0:46:08.124,0:46:15.086 YNH: Yeah. I mean, the intentional use[br]of fake news is a disturbing sign. 0:46:15.812,0:46:20.393 But I'm not saying that it's not bad,[br]I'm just saying that it's not new. 0:46:20.820,0:46:23.574 CA: There's a lot of interest[br]on Facebook on this question 0:46:23.598,0:46:28.598 about global governance[br]versus nationalism. 0:46:29.292,0:46:30.800 Question here from Phil Dennis: 0:46:30.824,0:46:34.320 "How do we get people, governments,[br]to relinquish power?" 0:46:34.344,0:46:38.259 Is that -- is that --[br]actually, the text is so big 0:46:38.283,0:46:39.823 I can't read the full question. 0:46:39.847,0:46:41.386 But is that a necessity? 0:46:41.410,0:46:44.022 Is it going to take war to get there? 0:46:44.046,0:46:47.736 Sorry, Phil -- I mangled your question,[br]but I blame the text right here. 0:46:47.760,0:46:49.860 YNH: One option[br]that some people talk about 0:46:49.884,0:46:54.623 is that only a catastrophe[br]can shake humankind 0:46:54.647,0:46:59.911 and open the path to a real system[br]of global governance, 0:46:59.935,0:47:04.083 and they say that we can't do it[br]before the catastrophe, 0:47:04.107,0:47:06.908 but we need to start[br]laying the foundations 0:47:06.932,0:47:09.432 so that when the disaster strikes, 0:47:09.456,0:47:11.638 we can react quickly. 0:47:11.662,0:47:15.662 But people will just not have[br]the motivation to do such a thing 0:47:15.686,0:47:17.698 before the disaster strikes. 0:47:17.722,0:47:19.987 Another thing that I would emphasize 0:47:20.011,0:47:25.065 is that anybody who is really[br]interested in global governance 0:47:25.089,0:47:27.990 should always make it very, very clear 0:47:28.014,0:47:34.598 that it doesn't replace or abolish[br]local identities and communities, 0:47:34.622,0:47:37.578 that it should come both as -- 0:47:37.602,0:47:40.909 It should be part of a single package. 0:47:40.933,0:47:44.311 CA: I want to hear more on this, 0:47:44.335,0:47:47.388 because the very words "global governance" 0:47:47.412,0:47:52.001 are almost the epitome of evil[br]in the mindset of a lot of people 0:47:52.025,0:47:53.351 on the alt-right right now. 0:47:53.375,0:47:56.329 It just seems scary, remote, distant,[br]and it has let them down, 0:47:56.353,0:48:00.469 and so globalists,[br]global governance -- no, go away!
0:48:00.493,0:48:04.175 And many view the election[br]as the ultimate poke in the eye 0:48:04.199,0:48:05.677 to anyone who believes in that. 0:48:05.701,0:48:09.252 So how do we change the narrative 0:48:09.276,0:48:12.251 so that it doesn't seem[br]so scary and remote? 0:48:12.275,0:48:15.019 Build more on this idea[br]of it being compatible 0:48:15.043,0:48:17.664 with local identity, local communities. 0:48:17.688,0:48:20.288 YNH: Well, I think again we should start 0:48:20.312,0:48:23.444 really with the biological realities 0:48:23.468,0:48:25.479 of Homo sapiens. 0:48:25.503,0:48:29.621 And biology tells us two things[br]about Homo sapiens 0:48:29.645,0:48:31.902 which are very relevant to this issue: 0:48:31.926,0:48:34.955 first of all, that we are[br]completely dependent 0:48:34.979,0:48:37.574 on the ecological system around us, 0:48:37.598,0:48:41.057 and that today we are talking[br]about a global system. 0:48:41.081,0:48:42.438 You cannot escape that. 0:48:42.462,0:48:46.084 And at the same time, biology tells us[br]about Homo sapiens 0:48:46.108,0:48:48.355 that we are social animals, 0:48:48.379,0:48:53.016 but that we are social[br]on a very, very local level. 0:48:53.040,0:48:56.585 It's just a simple fact of humanity 0:48:56.609,0:49:01.406 that we cannot have intimate familiarity 0:49:01.430,0:49:05.305 with more than about 150 individuals. 0:49:05.329,0:49:09.626 The size of the natural group, 0:49:09.650,0:49:12.752 the natural community of Homo sapiens, 0:49:12.776,0:49:16.120 is not more than 150 individuals, 0:49:16.144,0:49:22.543 and everything beyond that is really[br]based on all kinds of imaginary stories 0:49:22.567,0:49:24.614 and large-scale institutions, 0:49:24.638,0:49:29.014 and I think that we can find a way, 0:49:29.038,0:49:33.608 again, based on a biological[br]understanding of our species, 0:49:33.632,0:49:35.714 to weave the two together 0:49:35.738,0:49:38.814 and to understand that today[br]in the 21st century, 0:49:38.838,0:49:44.374 we need both the global level[br]and the local community. 0:49:44.398,0:49:46.415 And I would go even further than that 0:49:46.439,0:49:49.762 and say that it starts[br]with the body itself. 0:49:50.500,0:49:54.842 The feelings that people today have[br]of alienation and loneliness 0:49:54.866,0:49:58.082 and not finding their place in the world, 0:49:58.106,0:50:03.835 I would think that the chief problem[br]is not global capitalism. 0:50:04.285,0:50:07.311 The chief problem is that over[br]the last hundred years, 0:50:07.335,0:50:11.039 people have been becoming disembodied, 0:50:11.063,0:50:14.222 have been distancing themselves[br]from their body. 0:50:14.246,0:50:17.142 As a hunter-gatherer or even as a peasant, 0:50:17.166,0:50:21.364 to survive, you need to be[br]constantly in touch 0:50:21.388,0:50:23.571 with your body and with your senses, 0:50:23.595,0:50:24.776 every moment. 0:50:24.800,0:50:26.947 If you go to the forest[br]to look for mushrooms 0:50:26.971,0:50:29.348 and you don't pay attention[br]to what you hear, 0:50:29.372,0:50:31.248 to what you smell, to what you taste, 0:50:31.272,0:50:32.423 you're dead. 0:50:32.447,0:50:34.598 So you must be very connected. 0:50:34.622,0:50:39.218 In the last hundred years,[br]people have been losing their ability 0:50:39.242,0:50:42.114 to be in touch with their body[br]and their senses, 0:50:42.138,0:50:44.324 to hear, to smell, to feel.
0:50:44.348,0:50:47.474 More and more attention goes to screens, 0:50:47.498,0:50:49.018 to what is happening elsewhere, 0:50:49.042,0:50:50.263 some other time. 0:50:50.287,0:50:52.718 This, I think, is the deep reason 0:50:52.742,0:50:56.636 for the feelings of alienation[br]and loneliness and so forth, 0:50:56.660,0:50:59.162 and therefore part of the solution 0:50:59.186,0:51:03.450 is not to bring back[br]some mass nationalism, 0:51:03.474,0:51:07.598 but to reconnect with our own bodies, 0:51:07.622,0:51:10.885 and if you are back[br]in touch with your body, 0:51:10.909,0:51:14.079 you will feel much more at home[br]in the world also. 0:51:14.103,0:51:17.788 CA: Well, depending on how things go,[br]we may all be back in the forest soon. 0:51:17.812,0:51:20.161 We're going to have[br]one more question in the room 0:51:20.185,0:51:21.688 and one more on Facebook. 0:51:21.712,0:51:25.093 Ama Adi-Dako: Hello. I'm from Ghana,[br]West Africa, and my question is: 0:51:25.117,0:51:29.719 I'm wondering how you present[br]and justify the idea of global governance 0:51:29.743,0:51:32.754 to countries that have been[br]historically disenfranchised 0:51:32.778,0:51:34.823 by the effects of globalization, 0:51:34.847,0:51:37.593 and also, if we're talking about[br]global governance, 0:51:37.617,0:51:41.241 it sounds to me like it will definitely[br]come from a very Westernized idea 0:51:41.265,0:51:43.439 of what the "global"[br]is supposed to look like. 0:51:43.463,0:51:46.753 So how do we present and justify[br]that idea of global 0:51:46.777,0:51:49.770 versus wholly nationalist 0:51:49.794,0:51:53.129 to people in countries like Ghana[br]and Nigeria and Togo 0:51:53.153,0:51:55.329 and other countries like that? 0:51:56.131,0:52:02.545 YNH: I would start by saying[br]that history is extremely unfair, 0:52:02.569,0:52:06.491 and that we should realize that. 0:52:07.004,0:52:10.053 Many of the countries that suffered most 0:52:10.077,0:52:14.216 from the last 200 years of globalization 0:52:14.240,0:52:16.200 and imperialism and industrialization 0:52:16.224,0:52:21.934 are exactly the countries[br]which are also most likely to suffer most 0:52:21.958,0:52:24.747 from the next wave. 0:52:24.771,0:52:28.765 And we should be very,[br]very clear about that. 0:52:29.477,0:52:32.528 If we don't have global governance, 0:52:32.552,0:52:35.755 and if we suffer from climate change, 0:52:35.779,0:52:38.036 from technological disruptions, 0:52:38.060,0:52:41.661 the worst suffering will not be in the US. 0:52:41.685,0:52:46.781 The worst suffering will be in Ghana,[br]will be in Sudan, will be in Syria, 0:52:46.805,0:52:49.542 will be in Bangladesh,[br]will be in those places. 0:52:49.566,0:52:55.602 So I think those countries[br]have an even greater incentive 0:52:55.626,0:53:00.353 to do something about[br]the next wave of disruption, 0:53:00.377,0:53:02.902 whether it's ecological[br]or whether it's technological. 0:53:02.926,0:53:05.772 Again, if you think about[br]technological disruption, 0:53:05.796,0:53:10.412 so if AI and 3D printers and robots[br]take the jobs 0:53:10.436,0:53:12.805 from billions of people, 0:53:12.829,0:53:15.954 I worry far less about the Swedes 0:53:15.978,0:53:19.583 than about the people in Ghana[br]or in Bangladesh.
0:53:19.607,0:53:24.835 And therefore,[br]because history is so unfair 0:53:24.859,0:53:29.205 and the results of a calamity 0:53:29.229,0:53:31.597 will not be shared equally[br]between everybody, 0:53:31.621,0:53:36.054 as usual, the rich[br]will be able to get away 0:53:36.078,0:53:39.550 from the worst consequences[br]of climate change 0:53:39.574,0:53:42.419 in a way that the poor[br]will not be able to. 0:53:43.347,0:53:46.755 CA: And here's a great question[br]from Cameron Taylor on Facebook: 0:53:46.779,0:53:48.900 "At the end of 'Sapiens,' 0:53:48.924,0:53:50.987 you said we should be asking the question, 0:53:51.011,0:53:53.367 'What do we want to want?' 0:53:53.391,0:53:56.378 Well, what do you think[br]we should want to want?" 0:53:56.402,0:53:59.933 YNH: I think we should want[br]to want to know the truth, 0:53:59.957,0:54:02.607 to understand reality. 0:54:03.207,0:54:08.321 Mostly what we want is to change reality, 0:54:08.345,0:54:12.063 to fit it to our own desires,[br]to our own wishes, 0:54:12.087,0:54:15.807 and I think we should first[br]want to understand it. 0:54:15.831,0:54:19.595 If you look at the long-term[br]trajectory of history, 0:54:19.619,0:54:22.355 what you see is that[br]for thousands of years 0:54:22.379,0:54:25.715 we humans have been gaining[br]control of the world outside us 0:54:25.739,0:54:29.233 and trying to shape it[br]to fit our own desires. 0:54:29.257,0:54:32.445 And we've gained control[br]of the other animals, 0:54:32.469,0:54:34.000 of the rivers, of the forests, 0:54:34.024,0:54:37.517 and reshaped them completely, 0:54:37.541,0:54:40.902 causing ecological destruction 0:54:40.926,0:54:44.104 without making ourselves satisfied. 0:54:44.128,0:54:47.930 So the next step[br]is we turn our gaze inwards, 0:54:47.954,0:54:52.502 and we say OK, getting control[br]of the world outside us 0:54:52.526,0:54:54.390 did not really make us satisfied. 0:54:54.414,0:54:57.113 Let's now try to gain control[br]of the world inside us. 0:54:57.137,0:54:59.300 This is the really big project 0:54:59.324,0:55:03.620 of science and technology[br]and industry in the 21st century -- 0:55:03.644,0:55:07.166 to try and gain control[br]of the world inside us, 0:55:07.190,0:55:12.113 to learn how to engineer and produce[br]bodies and brains and minds. 0:55:12.137,0:55:16.779 These are likely to be the main[br]products of the 21st century economy. 0:55:16.803,0:55:20.624 When people think about the future,[br]very often they think in terms of, 0:55:20.648,0:55:24.595 "Oh, I want to gain control[br]of my body and of my brain." 0:55:24.619,0:55:27.429 And I think that's very dangerous. 0:55:27.453,0:55:30.719 If we've learned anything[br]from our previous history, 0:55:30.743,0:55:34.656 it's that yes, we gain[br]the power to manipulate, 0:55:34.680,0:55:37.470 but because we didn't really[br]understand the complexity 0:55:37.494,0:55:39.299 of the ecological system, 0:55:39.323,0:55:43.013 we are now facing an ecological meltdown. 0:55:43.037,0:55:48.443 And if we now try to reengineer[br]the world inside us 0:55:48.467,0:55:50.599 without really understanding it, 0:55:50.623,0:55:54.939 especially without understanding[br]the complexity of our mental system, 0:55:54.963,0:55:59.623 we might cause a kind of internal[br]ecological disaster, 0:55:59.647,0:56:03.190 and we'll face a kind of mental[br]meltdown inside us.
0:56:04.270,0:56:06.712 CA: Putting all the pieces[br]together here -- 0:56:06.736,0:56:09.416 the current politics,[br]the coming technology, 0:56:09.440,0:56:11.590 concerns like the one[br]you've just outlined -- 0:56:11.614,0:56:14.709 I mean, it seems like you yourself[br]are in quite a bleak place 0:56:14.733,0:56:16.354 when you think about the future. 0:56:16.378,0:56:17.960 You're pretty worried about it. 0:56:17.984,0:56:19.176 Is that right? 0:56:19.200,0:56:25.888 And if there was one cause for hope,[br]how would you state that? 0:56:25.912,0:56:30.075 YNH: I focus on the most[br]dangerous possibilities 0:56:30.099,0:56:33.120 partly because this is like[br]my job or responsibility 0:56:33.144,0:56:34.925 as a historian or social critic. 0:56:34.949,0:56:39.711 I mean, the industry focuses mainly[br]on the positive sides, 0:56:39.735,0:56:43.096 so it's the job of historians[br]and philosophers and sociologists 0:56:43.120,0:56:47.561 to highlight the more dangerous potential[br]of all these new technologies. 0:56:47.585,0:56:50.068 I don't think any of that is inevitable. 0:56:50.092,0:56:53.131 Technology is never deterministic. 0:56:53.155,0:56:54.872 You can use the same technology 0:56:54.896,0:56:57.887 to create very different[br]kinds of societies. 0:56:57.911,0:56:59.949 If you look at the 20th century, 0:56:59.973,0:57:02.754 so, the technologies[br]of the Industrial Revolution, 0:57:02.778,0:57:05.835 the trains and electricity and all that 0:57:05.859,0:57:08.911 could be used to create[br]a communist dictatorship 0:57:08.935,0:57:11.740 or a fascist regime[br]or a liberal democracy. 0:57:11.764,0:57:14.292 The trains did not tell you[br]what to do with them. 0:57:14.316,0:57:18.768 Similarly, now, artificial intelligence[br]and bioengineering and all of that -- 0:57:18.792,0:57:22.306 they don't predetermine a single outcome. 0:57:22.886,0:57:26.063 Humanity can rise up to the challenge, 0:57:26.087,0:57:27.778 and the best example we have 0:57:27.802,0:57:31.542 of humanity rising up[br]to the challenge of a new technology 0:57:31.566,0:57:33.289 is nuclear weapons. 0:57:33.313,0:57:36.322 In the late 1940s, '50s, 0:57:36.346,0:57:38.485 many people were convinced 0:57:38.509,0:57:42.815 that sooner or later the Cold War[br]would end in a nuclear catastrophe, 0:57:42.839,0:57:44.614 destroying human civilization. 0:57:44.638,0:57:46.118 And this did not happen. 0:57:46.142,0:57:52.562 In fact, nuclear weapons prompted[br]humans all over the world 0:57:52.586,0:57:57.327 to change the way that they manage[br]international politics 0:57:57.351,0:57:59.720 to reduce violence. 0:57:59.744,0:58:02.983 And many countries basically took war 0:58:03.007,0:58:04.881 out of their political toolkit. 0:58:04.905,0:58:09.175 They no longer tried to pursue[br]their interests with warfare. 0:58:09.580,0:58:12.850 Not all countries have done so,[br]but many countries have. 0:58:12.874,0:58:16.808 And this is maybe[br]the most important reason 0:58:16.832,0:58:22.934 why international violence[br]has declined dramatically since 1945, 0:58:22.958,0:58:26.296 and today, as I said,[br]more people commit suicide 0:58:26.320,0:58:28.527 than are killed in war. 0:58:28.551,0:58:33.380 So this, I think, gives us a good example 0:58:33.404,0:58:37.246 that even with the most frightening technology, 0:58:37.270,0:58:39.805 humans can rise up to the challenge 0:58:39.829,0:58:42.852 and actually some good can come out of it.
0:58:42.876,0:58:47.163 The problem is, we have very little[br]margin for error. 0:58:47.187,0:58:49.396 If we don't get it right, 0:58:49.420,0:58:53.091 we might not have[br]a second option to try again. 0:58:54.337,0:58:55.904 CA: That's a very powerful note, 0:58:55.928,0:58:58.733 on which I think we should draw[br]this to a conclusion. 0:58:58.757,0:59:01.868 Before I wrap up, I just want to say[br]one thing to people here 0:59:01.892,0:59:07.438 and to the global TED community[br]watching online, anyone watching online: 0:59:07.462,0:59:10.355 help us with these dialogues. 0:59:10.379,0:59:12.929 If you believe, like we do, 0:59:12.953,0:59:15.933 that we need to find[br]a different kind of conversation, 0:59:15.957,0:59:18.190 now more than ever, help us do it. 0:59:18.214,0:59:20.237 Reach out to other people, 0:59:21.269,0:59:24.009 try and have conversations[br]with people you disagree with, 0:59:24.033,0:59:25.216 understand them, 0:59:25.240,0:59:26.770 pull the pieces together, 0:59:26.794,0:59:30.686 and help us figure out how to take[br]these conversations forward 0:59:30.710,0:59:32.964 so we can make a real contribution 0:59:32.988,0:59:35.733 to what's happening[br]in the world right now. 0:59:35.757,0:59:39.076 I think everyone feels more alive, 0:59:39.100,0:59:41.410 more concerned, more engaged 0:59:41.434,0:59:43.963 with the politics of the moment. 0:59:43.987,0:59:46.441 The stakes do seem quite high, 0:59:46.465,0:59:50.977 so help us respond to it[br]in a wise, wise way. 0:59:51.001,0:59:52.596 Yuval Harari, thank you. 0:59:52.620,0:59:55.928 (Applause)