Chris Anderson: Hello. Welcome to this TED Dialogues. It's the first of a series that's going to be done in response to the current political upheaval. I don't know about you; I've become quite concerned about the growing divisiveness in this country and in the world. No one's listening to each other. Right? They aren't. I mean, it feels like we need a different kind of conversation, one that's based on -- I don't know, on reason, listening, on understanding, on a broader context. That's at least what we're going to try in these TED Dialogues, starting today. And we couldn't have anyone with us who I'd be more excited to kick this off with. This is a mind right here that thinks pretty much like no one else on the planet, I would hasten to say. I'm serious.

(Yuval Noah Harari laughs)

I'm serious. He synthesizes history with underlying ideas in a way that kind of takes your breath away. So, some of you will know this book, "Sapiens." Has anyone here read "Sapiens"?

(Applause)

I mean, I could not put it down. The way that he tells the story of mankind through big ideas that really make you think differently -- it's kind of amazing. And here's the follow-up, which I think is being published in the US next week.

YNH: Yeah, next week.

CA: "Homo Deus." Now, this is the history of the next hundred years. I've had a chance to read it. It's extremely dramatic, and I daresay, for some people, quite alarming. It's a must-read. And honestly, we couldn't have someone better to help make sense of what on Earth is happening in the world right now. So a warm welcome, please, to Yuval Noah Harari.

(Applause)

It's great to be joined by our friends on Facebook and around the Web. Hello, Facebook. And all of you, as I start asking questions of Yuval, come up with your own questions, and not necessarily about the political scandal du jour, but about the broader understanding of: Where are we heading? You ready? OK, we're going to go.
So here we are, Yuval: New York City, 2017, there's a new president in power, and shock waves rippling around the world. What on Earth is happening?

YNH: I think the basic thing that happened is that we have lost our story. Humans think in stories, and we try to make sense of the world by telling stories. And for the last few decades, we had a very simple and very attractive story about what's happening in the world. And the story said that, oh, what's happening is that the economy is being globalized, politics is being liberalized, and the combination of the two will create paradise on Earth, and we just need to keep on globalizing the economy and liberalizing the political system, and everything will be wonderful. And 2016 is the moment when a very large segment, even of the Western world, stopped believing in this story. For good or bad reasons -- it doesn't matter. People stopped believing in the story, and when you don't have a story, you don't understand what's happening.

CA: Part of you believes that that story was actually a very effective story. It worked.

YNH: To some extent, yes. According to some measurements, we are now in the best time ever for humankind. Today, for the first time in history, more people die from eating too much than from eating too little, which is an amazing achievement.

(Laughter)

Also for the first time in history, more people die from old age than from infectious diseases, and violence is also down. For the first time in history, more people commit suicide than are killed by crime and terrorism and war put together. Statistically, you are your own worst enemy. At least, of all the people in the world, you are most likely to be killed by yourself --

(Laughter)

which is, again, very good news, compared --

(Laughter)

compared to the level of violence that we saw in previous eras.
CA: But this process of connecting the world ended up with a large group of people kind of feeling left out, and they've reacted. And so we have this bombshell that's sort of ripping through the whole system. I mean, what do you make of what's happened? It feels like the old way that people thought of politics, the left-right divide, has been blown up and replaced. How should we think of this?

YNH: Yeah, the old 20th-century political model of left versus right is now largely irrelevant, and the real divide today is between global and national, global or local. And you see it all over the world that this is now the main struggle. We probably need completely new political models and completely new ways of thinking about politics. In essence, what you can say is that we now have a global ecology and a global economy, but we have national politics, and this doesn't work together. This makes the political system ineffective, because it has no control over the forces that shape our lives. And you have basically two solutions to this imbalance: either de-globalize the economy and turn it back into a national economy, or globalize the political system.

CA: So some, I guess many liberals out there view Trump and his government as kind of irredeemably bad, just awful in every way. Do you see any underlying narrative or political philosophy in there that is at least worth understanding? How would you articulate that philosophy? Is it just the philosophy of nationalism?

YNH: I think the underlying feeling or idea is that the political system -- something is broken there. It doesn't empower the ordinary person anymore. It doesn't care so much about the ordinary person anymore, and I think this diagnosis of the political disease is correct. With regard to the answers, I am far less certain. I think what we are seeing is the immediate human reaction: if something doesn't work, let's go back. And you see it all over the world: almost nobody in the political system today has any future-oriented vision of where humankind is going.
Almost everywhere, you see a retrograde vision: "Let's make America great again," like it was great -- I don't know -- in the '50s, in the '80s, sometime; let's go back there. And you go to Russia, a hundred years after Lenin, and Putin's vision for the future is basically, ah, let's go back to the Tsarist empire. And in Israel, where I come from, the hottest political vision of the present is: "Let's build the temple again." So let's go back 2,000 years. So people are thinking that sometime in the past we lost it. It's like you've lost your way in the city, and you say, OK, let's go back to the point where I felt secure and start again. I don't think this can work, but for a lot of people, this is their gut instinct.

CA: But why couldn't it work? "America First" is a very appealing slogan in many ways. Patriotism is, in many ways, a very noble thing. It's played a role in promoting cooperation among large numbers of people. Why couldn't you have a world organized in countries, all of which put themselves first?

YNH: For many centuries, even thousands of years, patriotism worked quite well. Of course, it led to wars and so forth, but we shouldn't focus too much on the bad. There are also many, many positive things about patriotism, and the ability to have a large number of people care about each other, sympathize with one another, and come together for collective action. If you go back to the first nations, thousands of years ago, the people who lived along the Yellow River in China were many, many different tribes, and they all depended on the river for survival and for prosperity, but all of them also suffered from periodic floods and periodic droughts. And no tribe could really do anything about it, because each of them controlled just a tiny section of the river.
And then, in a long and complicated process, the tribes coalesced to form the Chinese nation, which controlled the entire Yellow River and had the ability to bring hundreds of thousands of people together to build dams and canals, regulate the river, prevent the worst floods and droughts, and raise the level of prosperity for everybody. And this worked in many places around the world. But in the 21st century, technology is changing all that in a fundamental way. All people in the world are now living alongside the same cyber river, and no single nation can regulate this river by itself. We are all living together on a single planet, which is threatened by our own actions. And if you don't have some kind of global cooperation, nationalism is just not on the right level to tackle the problems, whether it's climate change or technological disruption.

CA: So it was a beautiful idea in a world where most of the action, most of the issues, took place on a national scale, but your argument is that the issues that matter most today no longer take place on a national scale but on a global scale.

YNH: Exactly. All the major problems of the world today are global in essence, and they cannot be solved except through some kind of global cooperation. It's not just climate change, which is, like, the most obvious example people give. I think more in terms of technological disruption. If you think about, for example, artificial intelligence, over the next 20, 30 years pushing hundreds of millions of people out of the job market -- this is a problem on a global level. It will disrupt the economy of all the countries. And similarly, if you think about, say, bioengineering and people being afraid of conducting, I don't know, genetic engineering research in humans, it won't help if just a single country, let's say the US, outlaws all genetic experiments in humans, but China or North Korea continues to do it.
So the US cannot solve it by itself, and very quickly, the pressure on the US to do the same will be immense, because we are talking about high-risk, high-gain technologies. If somebody else is doing it, I can't allow myself to remain behind. The only way to have regulations, effective regulations, on things like genetic engineering is to have global regulations. If you just have national regulations, nobody would like to stay behind.

CA: So this is really interesting. It seems to me that this may be one key to provoking at least a constructive conversation between the different sides here, because I think everyone can agree that the starting point of a lot of the anger that's propelled us to where we are is the legitimate concern about job loss. Work is gone, a traditional way of life has gone, and it's no wonder that people are furious about that. And in general, they have blamed globalism, global elites, for doing this to them without asking their permission, and that seems like a legitimate complaint. But what I hear you saying is that -- so a key question is: What is the real cause of job loss, both now and going forward? To the extent that it's about globalism, then the right response, yes, is to shut down borders and keep people out and change trade agreements and so forth. But you're saying, I think, that actually the bigger cause of job loss is not going to be that at all. It's going to originate in technological questions, and we have no chance of solving that unless we operate as a connected world.

YNH: Yeah, I think that -- I don't know about the present, but looking to the future, it's not the Mexicans or Chinese who will take the jobs from the people in Pennsylvania, it's the robots and algorithms. So unless you plan to build a big wall on the border of California --

(Laughter)

the wall on the border with Mexico is going to be very ineffective. And I was struck, when I watched the debates before the election, that Trump did not even attempt to frighten people by saying the robots will take your jobs. Now, even if it's not true, it doesn't matter.
It could have been an extremely effective way of frightening people --

(Laughter)

and galvanizing people: "The robots will take your jobs!" And nobody used that line. And it made me afraid, because it meant that no matter what happens in universities and laboratories -- and there, there is already an intense debate about it -- in the mainstream political system and among the general public, people are just unaware that there could be an immense technological disruption -- not in 200 years, but in 10, 20, 30 years -- and we have to do something about it now, partly because most of what we teach children today in school or in college is going to be completely irrelevant to the job market of 2040, 2050. So it's not something we'll need to think about in 2040. We need to think today about what to teach young people.

CA: Yeah, no, absolutely. You've often written about moments in history where humankind has ... entered a new era, unintentionally. Decisions have been made, technologies have been developed, and suddenly the world has changed, possibly in a way that's worse for everyone. So one of the examples you give in "Sapiens" is the whole agricultural revolution, where an actual person tilling the fields just picked up a 12-hour backbreaking workday instead of six hours in the jungle and a much more interesting lifestyle.

(Laughter)

So are we at another possible phase change here, where we kind of sleepwalk into a future that none of us actually wants?

YNH: Yes, very much so. During the agricultural revolution, what happened is that an immense technological and economic revolution empowered the human collective, but when you look at actual individual lives, the life of a tiny elite became much better, and the lives of the majority of people became considerably worse. And this can happen again in the 21st century. No doubt the new technologies will empower the human collective.
But we may end up again with a tiny elite reaping all the benefits, taking all the fruits, and the masses of the population finding themselves worse off than they were before, and certainly much worse off than this tiny elite.

CA: And those elites might not even be human elites. They might be cyborgs or --

YNH: Yeah, they could be enhanced superhumans. They could be cyborgs. They could be completely nonorganic elites. They could even be non-conscious algorithms. What we see now in the world is authority shifting away from humans to algorithms. More and more decisions -- about personal lives, about economic matters, about political matters -- are actually being taken by algorithms. If you ask the bank for a loan, chances are your fate is decided by an algorithm, not by a human being. And the general impression is that maybe Homo sapiens just lost it. The world is so complicated, there is so much data, things are changing so fast, that this thing that evolved on the African savanna tens of thousands of years ago -- to cope with a particular environment, a particular volume of information and data -- just can't handle the realities of the 21st century, and the only thing that may be able to handle it is big-data algorithms. So no wonder more and more authority is shifting from us to the algorithms.

CA: So we're in New York City for the first of a series of TED Dialogues with Yuval Harari, and there's a Facebook Live audience out there. We're excited to have you with us. We'll start coming to some of your questions and questions of people in the room in just a few minutes, so have those coming. Yuval, if you're going to make the argument that we need to get past nationalism because of the coming technological ... danger, in a way, presented by so much of what's happening, we've got to have a global conversation about this. Trouble is, it's hard to get people really believing that, I don't know, AI really is an imminent threat, and so forth.
The things that people, some people at least, care about much more immediately, perhaps, are climate change, and perhaps other issues like refugees, nuclear weapons, and so forth. Would you argue that, where we are right now, somehow those issues need to be dialed up? You've talked about climate change, but Trump has said he doesn't believe in that. So in a way, your most powerful argument, you can't actually use to make this case.

YNH: Yeah, I think with climate change, at first sight, it's quite surprising that there is a very close correlation between nationalism and climate change. I mean, almost always, the people who deny climate change are nationalists. And at first sight, you think: Why? What's the connection? Why don't you have socialists denying climate change? But then, when you think about it, it's obvious -- because nationalism has no solution to climate change. If you want to be a nationalist in the 21st century, you have to deny the problem. If you accept the reality of the problem, then you must accept that, yes, there is still room in the world for patriotism, there is still room in the world for having special loyalties and obligations towards your own people, towards your own country. I don't think anybody is really thinking of abolishing that. But in order to confront climate change, we need additional loyalties and commitments to a level beyond the nation. And that should not be impossible, because people can have several layers of loyalty. You can be loyal to your family and to your community and to your nation, so why can't you also be loyal to humankind as a whole? Of course, there are occasions when it becomes difficult -- what to put first -- but, you know, life is difficult. Handle it.

(Laughter)

CA: OK, so I would love to get some questions from the audience here. We've got a microphone here. Speak into it, and Facebook, get them coming, too.
Howard Morgan: One of the things that has clearly made a huge difference in this country and other countries is income inequality -- the dramatic change in income distribution in the US from what it was 50 years ago, and around the world. Is there anything we can do to affect that? Because that gets at a lot of the underlying causes.

YNH: So far I haven't heard a very good idea about what to do about it, again, partly because most ideas remain on the national level, and the problem is global. I mean, one idea that we hear quite a lot about now is universal basic income. But this is a problem. I mean, I think it's a good start, but it's a problematic idea because it's not clear what "universal" is and it's not clear what "basic" is. Most people, when they speak about universal basic income, actually mean national basic income. But the problem is global. Let's say that you have AI and 3D printers taking away millions of jobs in Bangladesh, from all the people who make my shirts and my shoes. So what's going to happen? The US government will levy taxes on Google and Apple in California, and use that to pay basic income to unemployed Bangladeshis? If you believe that, you can just as well believe that Santa Claus will come and solve the problem. So unless we have a really universal and not national basic income, the deep problems are not going to go away. And also it's not clear what "basic" is, because what are basic human needs? A thousand years ago, just food and shelter was enough. But today, people will say education is a basic human need; it should be part of the package. But how much? Six years? Twelve years? A PhD? Similarly with health care: let's say that in 20, 30, 40 years, you'll have expensive treatments that can extend human life to 120, I don't know. Will this be part of the basket of basic income or not? It's a very difficult problem, because in a world where people lose their ability to be employed, the only thing they are going to get is this basic income.
So what's included in it is a very, very difficult ethical question.

CA: There's a bunch of questions on how the world affords it as well, who pays. There's a question here from Facebook from Lisa Larson: "How does nationalism in the US now compare to that between World War I and World War II in the last century?"

YNH: Well, the good news with regard to the dangers of nationalism is that we are in a much better position than a century ago. A century ago, in 1917, Europeans were killing each other by the millions. In 2016, with Brexit, as far as I remember, a single person lost their life, an MP who was murdered by some extremist. Just a single person. I mean, if Brexit was about British independence, this is the most peaceful war of independence in human history. And let's say that Scotland will now choose to leave the UK after Brexit. In the 18th century, if Scotland wanted -- and the Scots wanted several times -- to break out of the control of London, the reaction of the government in London was to send an army up north to burn down Edinburgh and massacre the highland tribes. My guess is that if, in 2018, the Scots vote for independence, the London government will not send an army up north to burn down Edinburgh. Very few people are now willing to kill or be killed for Scottish or for British independence. So for all the talk of the rise of nationalism and going back to the 1930s, to the 19th century, in the West at least, the power of national sentiments today is far, far smaller than it was a century ago.

CA: Although some people now, you hear publicly worrying about whether that might be shifting -- that there could actually be outbreaks of violence in the US depending on how things turn out. Should we be worried about that, or do you really think things have shifted?

YNH: No, we should be worried. We should be aware of two things. First of all, don't be hysterical. We are not back in the First World War yet. But on the other hand, don't be complacent.
We got from 1917 to 2017 not by some divine miracle, but simply by human decisions, and if we now start making the wrong decisions, we could be back in a situation analogous to 1917 in a few years. One of the things I know as a historian is that you should never underestimate human stupidity.

(Laughter)

It's one of the most powerful forces in history: human stupidity and human violence. Humans do such crazy things for no obvious reason. But again, at the same time, another very powerful force in human history is human wisdom. We have both.

CA: We have with us here moral psychologist Jonathan Haidt, who I think has a question.

Jonathan Haidt: Thanks, Yuval. So you seem to be a fan of global governance, but when you look at the map of the world from Transparency International, which rates the level of corruption of political institutions, it's a vast sea of red with little bits of yellow here and there for those with good institutions. So if we were to have some kind of global governance, what makes you think it would end up being more like Denmark rather than more like Russia or Honduras? And aren't there alternatives, such as we did with CFCs? There are ways to solve global problems with national governments. What would world government actually look like, and why do you think it would work?

YNH: Well, I don't know what it would look like. Nobody has a model for that yet. The main reason we need it is that many of these issues are lose-lose situations. When you have a win-win situation, like trade, where both sides can benefit from a trade agreement, then this is something you can work out. Without some kind of global government, national governments each have an interest in doing it. But when you have a lose-lose situation, like with climate change, it's much more difficult without some overarching authority, real authority. Now, how to get there and what it would look like, I don't know.
And certainly there is no obvious reason to think that it would look like Denmark, or that it would be a democracy. Most likely it wouldn't. We don't have workable democratic models for a global government. So maybe it would look more like ancient China than like modern Denmark. But still, given the dangers that we are facing, I think the imperative of having some kind of real ability to force through difficult decisions on the global level is more important than almost anything else.

CA: There's a question from Facebook here, and then we'll get the mic to Andrew. So, Kat Hebron on Facebook, calling in from Vail: "How would developed nations manage the millions of climate migrants?"

YNH: I don't know.

CA: That's your answer, Kat. (Laughter)

YNH: And I don't think that they know either. They'll just deny the problem, maybe.

CA: But immigration, generally, is another example of a problem that's very hard to solve on a nation-by-nation basis. One nation can shut its doors, but maybe that stores up problems for the future.

YNH: Yes, I mean -- it's another very good case, especially because it's so much easier to migrate today than it was in the Middle Ages or in ancient times.

CA: Yuval, there's a belief among many technologists, certainly, that political concerns are kind of overblown, that actually, political leaders don't have that much influence in the world, that the real determination of humanity at this point is by science, by invention, by companies, by many things other than political leaders, and it's actually very hard for leaders to do much, so we're actually worrying about nothing here.

YNH: Well, first, it should be emphasized that it's true that political leaders' ability to do good is very limited, but their ability to do harm is unlimited. There is a basic imbalance here. You can still press the button and blow everybody up. You have that kind of ability. But if you want, for example, to reduce inequality, that's very, very difficult.
But to start a war, you can still do so very easily. So there is a built-in imbalance in the political system today which is very frustrating: you cannot do a lot of good, but you can still do a lot of harm. And this makes the political system still a very big concern.

CA: So as you look at what's happening today, and putting your historian's hat on, do you look back in history at moments when things were going just fine and an individual leader really took the world or their country backwards?

YNH: There are quite a few examples, but I should emphasize, it's never an individual leader. I mean, somebody put him there, and somebody allowed him to continue to be there. So it's never really just the fault of a single individual. There are a lot of people behind every such individual.

CA: Can we have the microphone here, please, to Andrew?

Andrew Solomon: You've talked a lot about the global versus the national, but increasingly, it seems to me, the world situation is in the hands of identity groups. We look at people within the United States who have been recruited by ISIS. We look at these other groups which have formed, which go outside of national bounds but still represent significant authorities. How are they to be integrated into the system, and how is a diverse set of identities to be made coherent under either national or global leadership?

YNH: Well, the problem of such diverse identities is a problem for nationalism as well. Nationalism believes in a single, monolithic identity, and exclusive, or at least more extreme, versions of nationalism believe in an exclusive loyalty to a single identity. And therefore, nationalism has had a lot of problems with people wanting to divide their identities between various groups. So it's not just a problem, say, for a global vision. And I think, again, history shows that you shouldn't necessarily think in such exclusive terms. If you think that there is just a single identity for a person -- "I am just X, that's it, I can't be several things, I can be just that" -- that's the start of the problem.
You have religions, you have nations, that sometimes demand exclusive loyalty, but it's not the only option. There are many religions and many nations that enable you to have diverse identities at the same time.

CA: But is one explanation of what's happened in the last year that a group of people have got fed up with, if you like, the liberal elites, for want of a better term, obsessing over many, many different identities, and them feeling, "But what about my identity? I am being completely ignored here. And by the way, I thought I was the majority"? And that that's actually sparked a lot of the anger.

YNH: Yeah. Identity is always problematic, because identity is always based on fictional stories that sooner or later collide with reality. Almost all identities -- I mean, beyond the level of the basic community of a few dozen people -- are based on a fictional story. They are not the truth. They are not the reality. It's just a story that people invent and tell one another and start believing. And therefore all identities are extremely unstable. They are not a biological reality. Sometimes nationalists, for example, think that the nation is a biological entity -- that the combination of soil and blood creates the nation. But this is just a fictional story.

CA: Soil and blood kind of makes a gooey mess.

(Laughter)

YNH: It does, and it also messes with your mind when you think too much that I am a combination of soil and blood. If you look from a biological perspective, obviously none of the nations that exist today existed 5,000 years ago. Homo sapiens is a social animal, that's for sure. But for millions of years, Homo sapiens and our hominid ancestors lived in small communities of a few dozen individuals. Everybody knew everybody else. Modern nations, by contrast, are imagined communities, in the sense that I don't even know all these people.
I come from a relatively small nation, Israel, and of eight million Israelis, I never met most of them. I will never meet most of them. They basically exist here.

CA: But in terms of this identity, this group who feel left out and perhaps have work taken away -- I mean, in "Homo Deus," you actually speak of this group in one sense expanding, that so many people may have their jobs taken away by technology in some way that we could end up with a really large -- I think you call it a "useless class" -- a class where, traditionally, as viewed by the economy, these people have no use.

YNH: Yes.

CA: How likely a possibility is that? Is that something we should be terrified about? And can we address it in any way?

YNH: We should think about it very carefully. I mean, nobody really knows what the job market will look like in 2040, 2050. There is a chance many new jobs will appear, but it's not certain. And even if new jobs do appear, it won't necessarily be easy for a 50-year-old truck driver made unemployed by self-driving vehicles to reinvent himself or herself as a designer of virtual worlds. Previously, if you look at the trajectory of the industrial revolution, when machines replaced humans in one type of work, the solution usually came from low-skill work in new lines of business. So you didn't need any more agricultural workers, so people moved to working in low-skill industrial jobs, and when this was taken away by more and more machines, people moved to low-skill service jobs. Now, when people say there will be new jobs in the future that humans can do better than AI, that humans can do better than robots, they usually think about high-skill jobs, like software engineers designing virtual worlds.
Now, I don't see how an unemployed cashier from Wal-Mart reinvents herself or himself at 50 as a designer of virtual worlds, and certainly I don't see how the millions of unemployed Bangladeshi textile workers will be able to do that. I mean, if they are going to do it, we need to start teaching the Bangladeshis today how to be software designers, and we are not doing it. So what will they do in 20 years?

CA: So it feels like you're really highlighting a question that's been bugging me more and more over the last few months. It's almost a hard question to ask in public, but if any mind has some wisdom to offer on it, maybe it's yours, so I'm going to ask you: What are humans for?

YNH: As far as we know, for nothing.

(Laughter)

I mean, the idea that there is some great cosmic drama, some great cosmic plan, that we have a role to play in, and that we just need to discover what our role is and then play it to the best of our ability -- this has been the story of all religions and ideologies and so forth, but as a scientist, the best I can say is that it is not true. There is no universal drama with a role in it for Homo sapiens. So --

CA: I'm going to push back on you just for a minute, just from your own book, because in "Homo Deus," you give really one of the most coherent and understandable accounts of sentience, of consciousness, and that unique sort of human skill. You point out that it's different from intelligence, the intelligence that we're building in machines, and that there's actually a lot of mystery around it. How can you be sure there's no purpose when we don't even understand what this sentience thing is? I mean, in your own thinking, isn't there a chance that what humans are for is to be the universe's sentient things, to be the centers of joy and love and happiness and hope? And maybe we can build machines that actually help amplify that, even if they're not going to become sentient themselves? Is that crazy? I kind of found myself hoping that, reading your book.
745 00:38:11,245 --> 00:38:15,102 YNH: Well, I certainly think that the most interesting question today in science 746 00:38:15,126 --> 00:38:17,549 is the question of consciousness and the mind. 747 00:38:17,573 --> 00:38:21,071 We are getting better and better at understanding the brain 748 00:38:21,095 --> 00:38:22,355 and intelligence, 749 00:38:22,379 --> 00:38:24,916 but we are not getting much better 750 00:38:24,940 --> 00:38:27,283 at understanding the mind and consciousness. 751 00:38:27,307 --> 00:38:30,669 People often confuse intelligence and consciousness, 752 00:38:30,693 --> 00:38:32,992 especially in places like Silicon Valley, 753 00:38:33,016 --> 00:38:36,773 which is understandable, because in humans, they go together. 754 00:38:36,797 --> 00:38:40,376 I mean, intelligence basically is the ability to solve problems. 755 00:38:40,400 --> 00:38:42,942 Consciousness is the ability to feel things, 756 00:38:42,966 --> 00:38:48,178 to feel joy and sadness and boredom and pain and so forth. 757 00:38:48,202 --> 00:38:52,241 In Homo sapiens and all other mammals as well -- it's not unique to humans -- 758 00:38:52,265 --> 00:38:54,912 in all mammals and birds and some other animals, 759 00:38:54,936 --> 00:38:57,586 intelligence and consciousness go together. 760 00:38:57,610 --> 00:39:01,188 We often solve problems by feeling things. 761 00:39:01,212 --> 00:39:02,705 So we tend to confuse them. 762 00:39:02,729 --> 00:39:04,194 But they are different things. 763 00:39:04,218 --> 00:39:07,306 What's happening today in places like Silicon Valley 764 00:39:07,330 --> 00:39:10,956 is that we are creating artificial intelligence 765 00:39:10,980 --> 00:39:12,802 but not artificial consciousness. 766 00:39:12,826 --> 00:39:16,206 There has been an amazing development in computer intelligence 767 00:39:16,230 --> 00:39:17,792 over the last 50 years, 768 00:39:17,816 --> 00:39:22,017 and exactly zero development in computer consciousness, 769 00:39:22,041 --> 00:39:25,727 and there is no indication that computers are going to become conscious 770 00:39:25,751 --> 00:39:28,282 anytime soon. 771 00:39:28,306 --> 00:39:33,956 So first of all, if there is some cosmic role for consciousness, 772 00:39:33,980 --> 00:39:36,110 it's not unique to Homo sapiens. 773 00:39:36,134 --> 00:39:38,453 Cows are conscious, pigs are conscious, 774 00:39:38,477 --> 00:39:41,310 chimpanzees are conscious, chickens are conscious, 775 00:39:41,334 --> 00:39:45,187 so if we go that way, first of all, we need to broaden our horizons 776 00:39:45,211 --> 00:39:49,936 and remember very clearly we are not the only sentient beings on Earth, 777 00:39:49,960 --> 00:39:51,755 and when it comes to sentience -- 778 00:39:51,779 --> 00:39:55,091 when it comes to intelligence, there is good reason to think 779 00:39:55,115 --> 00:39:58,411 we are the most intelligent of the whole bunch. 780 00:39:58,435 --> 00:40:01,009 But when it comes to sentience, 781 00:40:01,033 --> 00:40:04,191 to say that humans are more sentient than whales, 782 00:40:04,215 --> 00:40:08,362 or more sentient than baboons or more sentient than cats, 783 00:40:08,386 --> 00:40:10,680 I see no evidence for that. 784 00:40:10,704 --> 00:40:14,311 So the first step is to go in that direction and expand. 785 00:40:14,335 --> 00:40:18,317 And then, on the second question of what it is for, 786 00:40:18,341 --> 00:40:20,123 I would reverse it 787 00:40:20,147 --> 00:40:24,383 and say that I don't think sentience is for anything.
788 00:40:24,407 --> 00:40:28,579 I don't think we need to find our role in the universe. 789 00:40:28,603 --> 00:40:34,416 The really important thing is to liberate ourselves from suffering. 790 00:40:34,440 --> 00:40:37,433 What characterizes sentient beings 791 00:40:37,457 --> 00:40:40,177 in contrast to robots, to stones, 792 00:40:40,201 --> 00:40:41,384 to whatever, 793 00:40:41,408 --> 00:40:45,199 is that sentient beings suffer, can suffer, 794 00:40:45,223 --> 00:40:47,563 and what they should focus on 795 00:40:47,587 --> 00:40:51,707 is not finding their place in some mysterious cosmic drama. 796 00:40:51,731 --> 00:40:55,550 They should focus on understanding what suffering is, 797 00:40:55,574 --> 00:40:58,933 what causes it and how to be liberated from it. 798 00:40:59,572 --> 00:41:03,049 CA: I know this is a big issue for you, and that was very eloquent. 799 00:41:03,073 --> 00:41:06,487 We're going to have a blizzard of questions from the audience here, 800 00:41:06,511 --> 00:41:08,431 and maybe from Facebook as well, 801 00:41:08,455 --> 00:41:10,128 and maybe some comments as well. 802 00:41:10,152 --> 00:41:11,948 So let's go quick. 803 00:41:11,972 --> 00:41:13,402 There's one right here. 804 00:41:15,052 --> 00:41:17,861 Keep your hands held up at the back if you want the mic, 805 00:41:17,885 --> 00:41:19,304 and we'll get it back to you. 806 00:41:19,328 --> 00:41:22,447 Question: In your work, you talk a lot about the fictional stories 807 00:41:22,471 --> 00:41:23,815 that we accept as truth, 808 00:41:23,839 --> 00:41:25,556 and that we live our lives by. 809 00:41:25,580 --> 00:41:28,079 As an individual, knowing that, 810 00:41:28,103 --> 00:41:31,849 how does it impact the stories that you choose to live your life by, 811 00:41:31,873 --> 00:41:35,613 and do you confuse them with the truth, like all of us? 812 00:41:36,246 --> 00:41:37,457 YNH: I try not to. 813 00:41:37,481 --> 00:41:40,249 I mean, for me, maybe the most important question, 814 00:41:40,273 --> 00:41:42,751 both as a scientist and as a person, 815 00:41:42,775 --> 00:41:46,650 is how to tell the difference between fiction and reality, 816 00:41:46,674 --> 00:41:49,270 because reality is there. 817 00:41:49,294 --> 00:41:51,376 I'm not saying that everything is fiction. 818 00:41:51,400 --> 00:41:54,452 It's just very difficult for human beings to tell the difference 819 00:41:54,476 --> 00:41:56,093 between fiction and reality, 820 00:41:56,117 --> 00:42:01,062 and it has become more and more difficult as history has progressed, 821 00:42:01,086 --> 00:42:03,537 because the fictions that we have created -- 822 00:42:03,561 --> 00:42:06,729 nations and gods and money and corporations -- 823 00:42:06,753 --> 00:42:08,263 they now control the world. 824 00:42:08,287 --> 00:42:09,464 So just to even think, 825 00:42:09,488 --> 00:42:12,633 "Oh, these are all just fictional entities that we've created," 826 00:42:12,657 --> 00:42:14,104 is very difficult. 827 00:42:14,128 --> 00:42:16,408 But reality is there. 828 00:42:17,043 --> 00:42:19,048 For me the best ... 829 00:42:19,072 --> 00:42:21,195 There are several tests 830 00:42:21,219 --> 00:42:23,989 to tell the difference between fiction and reality. 831 00:42:24,013 --> 00:42:27,439 The simplest one, the best one that I can offer in short, 832 00:42:27,463 --> 00:42:29,044 is the test of suffering. 833 00:42:29,068 --> 00:42:30,621 If it can suffer, it's real. 834 00:42:31,192 --> 00:42:32,886 If it can't suffer, it's not real.
835 00:42:32,910 --> 00:42:34,375 A nation cannot suffer. 836 00:42:34,399 --> 00:42:35,969 That's very, very clear. 837 00:42:35,993 --> 00:42:37,931 Even when a nation loses a war 838 00:42:37,955 --> 00:42:42,020 and we say, "Germany suffered a defeat in the First World War," 839 00:42:42,044 --> 00:42:43,209 it's a metaphor. 840 00:42:43,233 --> 00:42:45,790 Germany cannot suffer. Germany has no mind. 841 00:42:45,814 --> 00:42:47,467 Germany has no consciousness. 842 00:42:47,491 --> 00:42:51,149 Germans can suffer, yes, but Germany cannot. 843 00:42:51,173 --> 00:42:54,142 Similarly, when a bank goes bust, 844 00:42:54,166 --> 00:42:55,937 the bank cannot suffer. 845 00:42:55,961 --> 00:42:59,352 When the dollar loses its value, the dollar doesn't suffer. 846 00:42:59,376 --> 00:43:01,626 People can suffer. Animals can suffer. 847 00:43:01,650 --> 00:43:02,806 This is real. 848 00:43:02,830 --> 00:43:07,359 So if you really want to see reality, 849 00:43:07,383 --> 00:43:09,447 I would start by going through the door of suffering. 850 00:43:09,471 --> 00:43:12,425 If you can really understand what suffering is, 851 00:43:12,449 --> 00:43:14,672 this will also give you the key 852 00:43:14,696 --> 00:43:16,713 to understanding what reality is. 853 00:43:16,737 --> 00:43:19,520 CA: There's a Facebook question here that connects to this, 854 00:43:19,544 --> 00:43:22,521 from someone around the world in a language that I cannot read. 855 00:43:22,545 --> 00:43:24,762 YNH: Oh, it's Hebrew. CA: Hebrew. There you go. 856 00:43:24,786 --> 00:43:25,848 (Laughter) 857 00:43:25,872 --> 00:43:27,036 Can you read the name? 858 00:43:27,060 --> 00:43:28,935 YNH: [??] 859 00:43:28,959 --> 00:43:30,803 CA: Well, thank you for writing in. 860 00:43:30,827 --> 00:43:35,382 The question is: "Is the post-truth era really a brand-new era, 861 00:43:35,406 --> 00:43:39,793 or just another climax or moment in a never-ending trend?" 862 00:43:40,701 --> 00:43:44,030 YNH: Personally, I don't connect with this idea of post-truth. 863 00:43:44,054 --> 00:43:46,762 My basic reaction as a historian is: 864 00:43:46,786 --> 00:43:50,681 If this is the era of post-truth, when the hell was the era of truth? 865 00:43:50,705 --> 00:43:51,956 CA: Right. 866 00:43:51,980 --> 00:43:53,300 (Laughter) 867 00:43:53,324 --> 00:43:58,007 YNH: Was it the 1980s, the 1950s, the Middle Ages? 868 00:43:58,031 --> 00:44:02,423 I mean, we have always lived in an era, in a way, of post-truth. 869 00:44:02,883 --> 00:44:05,194 CA: But I'd push back on that, 870 00:44:05,218 --> 00:44:07,888 because I think what people are talking about 871 00:44:07,912 --> 00:44:14,872 is that there was a world where you had fewer journalistic outlets, 872 00:44:14,896 --> 00:44:18,544 where there were traditions and things were fact-checked. 873 00:44:18,568 --> 00:44:22,513 It was incorporated into the charter of those organizations 874 00:44:22,537 --> 00:44:24,704 that the truth mattered. 875 00:44:24,728 --> 00:44:26,477 So if you believe in a reality, 876 00:44:26,501 --> 00:44:28,724 then what you write is information. 877 00:44:28,748 --> 00:44:32,569 There was a belief that that information should connect to reality in a real way, 878 00:44:32,593 --> 00:44:35,554 and if you wrote a headline, it was a serious, earnest attempt 879 00:44:35,578 --> 00:44:37,881 to reflect something that had actually happened. 880 00:44:37,905 --> 00:44:39,756 And people didn't always get it right.
881 00:44:39,780 --> 00:44:41,789 But I think the concern now is you've got 882 00:44:41,813 --> 00:44:44,131 a technological system that's incredibly powerful 883 00:44:44,155 --> 00:44:48,325 that, for a while at least, massively amplified anything 884 00:44:48,349 --> 00:44:51,129 with no attention paid to whether it connected to reality, 885 00:44:51,153 --> 00:44:54,307 only to whether it connected to clicks and attention, 886 00:44:54,331 --> 00:44:55,947 and that that was arguably toxic. 887 00:44:55,971 --> 00:44:58,407 That's a reasonable concern, isn't it? 888 00:44:58,431 --> 00:45:00,717 YNH: Yeah, it is. I mean, the technology changes, 889 00:45:00,741 --> 00:45:05,969 and it's now easier to disseminate both truth and fiction and falsehood. 890 00:45:05,993 --> 00:45:07,996 It goes both ways. 891 00:45:08,020 --> 00:45:12,599 It's also much easier, though, to spread the truth than ever before. 892 00:45:12,623 --> 00:45:16,308 But I don't think there is anything essentially new 893 00:45:16,332 --> 00:45:21,052 about this dissemination of fictions and errors. 894 00:45:21,076 --> 00:45:25,110 There is nothing that -- I don't know -- Joseph Goebbels didn't know 895 00:45:25,134 --> 00:45:30,573 about this whole idea of fake news and post-truth. 896 00:45:30,597 --> 00:45:34,315 He famously said that if you repeat a lie often enough, 897 00:45:34,339 --> 00:45:36,160 people will think it's the truth, 898 00:45:36,184 --> 00:45:38,540 and the bigger the lie, the better, 899 00:45:38,564 --> 00:45:44,587 because people won't even think that something so big can be a lie. 900 00:45:44,611 --> 00:45:50,269 I think that fake news has been with us for thousands of years. 901 00:45:50,293 --> 00:45:52,194 Just think of the Bible. 902 00:45:52,218 --> 00:45:53,605 (Laughter) 903 00:45:53,629 --> 00:45:54,916 CA: But there is a concern 904 00:45:54,940 --> 00:45:58,957 that fake news is associated with tyrannical regimes, 905 00:45:58,981 --> 00:46:01,558 and when you see an upsurge in fake news, 906 00:46:01,582 --> 00:46:06,304 it is a canary in the coal mine warning that dark times may be coming. 907 00:46:08,124 --> 00:46:15,086 YNH: Yeah. I mean, the intentional use of fake news is a disturbing sign. 908 00:46:15,812 --> 00:46:20,393 But I'm not saying that it's not bad; I'm just saying that it's not new. 909 00:46:20,820 --> 00:46:23,574 CA: There's a lot of interest on Facebook on this question 910 00:46:23,598 --> 00:46:28,598 about global governance versus nationalism. 911 00:46:29,292 --> 00:46:30,800 Question here from Phil Dennis: 912 00:46:30,824 --> 00:46:34,320 "How do we get people, governments, to relinquish power? 913 00:46:34,344 --> 00:46:38,259 Is that -- is that -- actually, the text is so big 914 00:46:38,283 --> 00:46:39,823 I can't read the full question. 915 00:46:39,847 --> 00:46:41,386 But is that a necessity? 916 00:46:41,410 --> 00:46:44,022 Is it going to take war to get there? 917 00:46:44,046 --> 00:46:47,736 Sorry, Phil -- I mangled your question, but I blame the text right here.
918 00:46:47,760 --> 00:46:49,860 YNH: One option that some people talk about 919 00:46:49,884 --> 00:46:54,623 is that only a catastrophe can shake humankind 920 00:46:54,647 --> 00:46:59,911 and open the path to a real system of global governance, 921 00:46:59,935 --> 00:47:04,083 and they say that we can't do it before the catastrophe, 922 00:47:04,107 --> 00:47:06,908 but we need to start laying the foundations 923 00:47:06,932 --> 00:47:09,432 so that when the disaster strikes, 924 00:47:09,456 --> 00:47:11,638 we can react quickly. 925 00:47:11,662 --> 00:47:15,662 But people will just not have the motivation to do such a thing 926 00:47:15,686 --> 00:47:17,698 before the disaster strikes. 927 00:47:17,722 --> 00:47:19,987 Another thing that I would emphasize 928 00:47:20,011 --> 00:47:25,065 is that anybody who is really interested in global governance 929 00:47:25,089 --> 00:47:27,990 should always make it very, very clear 930 00:47:28,014 --> 00:47:34,598 that it doesn't replace or abolish local identities and communities, 931 00:47:34,622 --> 00:47:37,578 and that the global and the local should come together 932 00:47:37,602 --> 00:47:40,909 as part of a single package. 933 00:47:40,933 --> 00:47:44,311 CA: I want to hear more on this, 934 00:47:44,335 --> 00:47:47,388 because the very words "global governance" 935 00:47:47,412 --> 00:47:52,001 are almost the epitome of evil in the mindset of a lot of people 936 00:47:52,025 --> 00:47:53,351 on the alt-right right now. 937 00:47:53,375 --> 00:47:56,329 It just seems scary, remote, distant, and it has let them down, 938 00:47:56,353 --> 00:48:00,469 and so globalists, global governance -- no, go away! 939 00:48:00,493 --> 00:48:04,175 And many view the election as the ultimate poke in the eye 940 00:48:04,199 --> 00:48:05,677 to anyone who believes in that. 941 00:48:05,701 --> 00:48:09,252 So how do we change the narrative 942 00:48:09,276 --> 00:48:12,251 so that it doesn't seem so scary and remote? 943 00:48:12,275 --> 00:48:15,019 Build more on this idea of it being compatible 944 00:48:15,043 --> 00:48:17,664 with local identity and local communities. 945 00:48:17,688 --> 00:48:20,288 YNH: Well, I think again we should start 946 00:48:20,312 --> 00:48:23,444 really with the biological realities 947 00:48:23,468 --> 00:48:25,479 of Homo sapiens. 948 00:48:25,503 --> 00:48:29,621 And biology tells us two things about Homo sapiens 949 00:48:29,645 --> 00:48:31,902 which are very relevant to this issue: 950 00:48:31,926 --> 00:48:34,955 first of all, that we are completely dependent 951 00:48:34,979 --> 00:48:37,574 on the ecological system around us, 952 00:48:37,598 --> 00:48:41,057 and that today we are talking about a global system. 953 00:48:41,081 --> 00:48:42,438 You cannot escape that. 954 00:48:42,462 --> 00:48:46,084 And at the same time, biology tells us about Homo sapiens 955 00:48:46,108 --> 00:48:48,355 that we are social animals, 956 00:48:48,379 --> 00:48:53,016 but that we are social on a very, very local level. 957 00:48:53,040 --> 00:48:56,585 It's just a simple fact of humanity 958 00:48:56,609 --> 00:49:01,406 that we cannot have intimate familiarity 959 00:49:01,430 --> 00:49:05,305 with more than about 150 individuals.
960 00:49:05,329 --> 00:49:09,626 The size of the natural group, 961 00:49:09,650 --> 00:49:12,752 the natural community of Homo sapiens, 962 00:49:12,776 --> 00:49:16,120 is not more than 150 individuals, 963 00:49:16,144 --> 00:49:22,543 and everything beyond that is really based on all kinds of imaginary stories 964 00:49:22,567 --> 00:49:24,614 and large-scale institutions, 965 00:49:24,638 --> 00:49:29,014 and I think that we can find a way, 966 00:49:29,038 --> 00:49:33,608 again, based on a biological understanding of our species, 967 00:49:33,632 --> 00:49:35,714 to weave the two together 968 00:49:35,738 --> 00:49:38,814 and to understand that today in the 21st century, 969 00:49:38,838 --> 00:49:44,374 we need both the global level and the local community. 970 00:49:44,398 --> 00:49:46,415 And I would go even further than that 971 00:49:46,439 --> 00:49:49,762 and say that it starts with the body itself. 972 00:49:50,500 --> 00:49:54,842 As for the feelings that people today have of alienation and loneliness 973 00:49:54,866 --> 00:49:58,082 and of not finding their place in the world, 974 00:49:58,106 --> 00:50:03,835 I would think that the chief problem is not global capitalism. 975 00:50:04,285 --> 00:50:07,311 The chief problem is that over the last hundred years, 976 00:50:07,335 --> 00:50:11,039 people have been becoming disembodied, 977 00:50:11,063 --> 00:50:14,222 have been distancing themselves from their body. 978 00:50:14,246 --> 00:50:17,142 As a hunter-gatherer or even as a peasant, 979 00:50:17,166 --> 00:50:21,364 to survive, you need to be constantly in touch 980 00:50:21,388 --> 00:50:23,571 with your body and with your senses, 981 00:50:23,595 --> 00:50:24,776 every moment. 982 00:50:24,800 --> 00:50:26,947 If you go to the forest to look for mushrooms 983 00:50:26,971 --> 00:50:29,348 and you don't pay attention to what you hear, 984 00:50:29,372 --> 00:50:31,248 to what you smell, to what you taste, 985 00:50:31,272 --> 00:50:32,423 you're dead. 986 00:50:32,447 --> 00:50:34,598 So you must be very connected. 987 00:50:34,622 --> 00:50:39,218 Over the last hundred years, people have been losing their ability 988 00:50:39,242 --> 00:50:42,114 to be in touch with their body and their senses, 989 00:50:42,138 --> 00:50:44,324 to hear, to smell, to feel. 990 00:50:44,348 --> 00:50:47,474 More and more attention goes to screens, 991 00:50:47,498 --> 00:50:49,018 to what is happening elsewhere, 992 00:50:49,042 --> 00:50:50,263 some other time. 993 00:50:50,287 --> 00:50:52,718 This, I think, is the deep reason 994 00:50:52,742 --> 00:50:56,636 for the feelings of alienation and loneliness and so forth, 995 00:50:56,660 --> 00:50:59,162 and therefore part of the solution 996 00:50:59,186 --> 00:51:03,450 is not to bring back some mass nationalism, 997 00:51:03,474 --> 00:51:07,598 but to reconnect with our own bodies, 998 00:51:07,622 --> 00:51:10,885 and if you are back in touch with your body, 999 00:51:10,909 --> 00:51:14,079 you will also feel much more at home in the world. 1000 00:51:14,103 --> 00:51:17,788 CA: Well, depending on how things go, we may all be back in the forest soon. 1001 00:51:17,812 --> 00:51:20,161 We're going to have one more question in the room 1002 00:51:20,185 --> 00:51:21,688 and one more on Facebook. 1003 00:51:21,712 --> 00:51:25,093 Ama Adi-Dako: Hello.
I'm from Ghana, West Africa, and my question is: 1004 00:51:25,117 --> 00:51:29,719 I'm wondering how you present and justify the idea of global governance 1005 00:51:29,743 --> 00:51:32,754 to countries that have been historically disenfranchised 1006 00:51:32,778 --> 00:51:34,823 by the effects of globalization, 1007 00:51:34,847 --> 00:51:37,593 and also, if we're talking about global governance, 1008 00:51:37,617 --> 00:51:41,241 it sounds to me like it will definitely come from a very Westernized idea 1009 00:51:41,265 --> 00:51:43,439 of what the "global" is supposed to look like. 1010 00:51:43,463 --> 00:51:46,753 So how do we present and justify that idea of the global 1011 00:51:46,777 --> 00:51:49,770 versus the wholly nationalist 1012 00:51:49,794 --> 00:51:53,129 to people in countries like Ghana and Nigeria and Togo 1013 00:51:53,153 --> 00:51:55,329 and other countries like that? 1014 00:51:56,131 --> 00:52:02,545 YNH: I would start by saying that history is extremely unfair, 1015 00:52:02,569 --> 00:52:06,491 and that we should realize that. 1016 00:52:07,004 --> 00:52:10,053 Many of the countries that suffered most 1017 00:52:10,077 --> 00:52:14,216 from the last 200 years of globalization 1018 00:52:14,240 --> 00:52:16,200 and imperialism and industrialization 1019 00:52:16,224 --> 00:52:21,934 are exactly the countries which are also most likely to suffer most 1020 00:52:21,958 --> 00:52:24,747 from the next wave. 1021 00:52:24,771 --> 00:52:28,765 And we should be very, very clear about that. 1022 00:52:29,477 --> 00:52:32,528 If we don't have global governance, 1023 00:52:32,552 --> 00:52:35,755 and if we suffer from climate change, 1024 00:52:35,779 --> 00:52:38,036 from technological disruptions, 1025 00:52:38,060 --> 00:52:41,661 the worst suffering will not be in the US. 1026 00:52:41,685 --> 00:52:46,781 The worst suffering will be in Ghana, will be in Sudan, will be in Syria, 1027 00:52:46,805 --> 00:52:49,542 will be in Bangladesh, will be in those places. 1028 00:52:49,566 --> 00:52:55,602 So I think those countries have an even greater incentive 1029 00:52:55,626 --> 00:53:00,353 to do something about the next wave of disruption, 1030 00:53:00,377 --> 00:53:02,902 whether it's ecological or whether it's technological. 1031 00:53:02,926 --> 00:53:05,772 Again, if you think about technological disruption, 1032 00:53:05,796 --> 00:53:10,412 if AI and 3D printers and robots take the jobs 1033 00:53:10,436 --> 00:53:12,805 from billions of people, 1034 00:53:12,829 --> 00:53:15,954 I worry far less about the Swedes 1035 00:53:15,978 --> 00:53:19,583 than about the people in Ghana or in Bangladesh. 1036 00:53:19,607 --> 00:53:24,835 And therefore, because history is so unfair 1037 00:53:24,859 --> 00:53:29,205 and the results of a calamity 1038 00:53:29,229 --> 00:53:31,597 will not be shared equally among everybody, 1039 00:53:31,621 --> 00:53:36,054 as usual, the rich will be able to get away 1040 00:53:36,078 --> 00:53:39,550 from the worst consequences of climate change 1041 00:53:39,574 --> 00:53:42,419 in a way that the poor will not be able to. 1042 00:53:43,347 --> 00:53:46,755 CA: And here's a great question from Cameron Taylor on Facebook: 1043 00:53:46,779 --> 00:53:48,900 "At the end of 'Sapiens,' 1044 00:53:48,924 --> 00:53:50,987 you said we should be asking the question, 1045 00:53:51,011 --> 00:53:53,367 'What do we want to want?' 1046 00:53:53,391 --> 00:53:56,378 Well, what do you think we should want to want?"
1047 00:53:56,402 --> 00:53:59,933 YNH: I think we should want to want to know the truth, 1048 00:53:59,957 --> 00:54:02,607 to understand reality. 1049 00:54:03,207 --> 00:54:08,321 Mostly what we want is to change reality, 1050 00:54:08,345 --> 00:54:12,063 to fit it to our own desires, to our own wishes, 1051 00:54:12,087 --> 00:54:15,807 and I think we should first want to understand it. 1052 00:54:15,831 --> 00:54:19,595 If you look at the long-term trajectory of history, 1053 00:54:19,619 --> 00:54:22,355 what you see is that for thousands of years 1054 00:54:22,379 --> 00:54:25,715 we humans have been gaining control of the world outside us 1055 00:54:25,739 --> 00:54:29,233 and trying to shape it to fit our own desires. 1056 00:54:29,257 --> 00:54:32,445 And we've gained control of the other animals, 1057 00:54:32,469 --> 00:54:34,000 of the rivers, of the forests, 1058 00:54:34,024 --> 00:54:37,517 and reshaped them completely, 1059 00:54:37,541 --> 00:54:40,902 causing ecological destruction 1060 00:54:40,926 --> 00:54:44,104 without making ourselves satisfied. 1061 00:54:44,128 --> 00:54:47,930 So the next step is that we turn our gaze inwards, 1062 00:54:47,954 --> 00:54:52,502 and we say, OK, getting control of the world outside us 1063 00:54:52,526 --> 00:54:54,390 did not really make us satisfied. 1064 00:54:54,414 --> 00:54:57,113 Let's now try to gain control of the world inside us. 1065 00:54:57,137 --> 00:54:59,300 This is the really big project 1066 00:54:59,324 --> 00:55:03,620 of science and technology and industry in the 21st century -- 1067 00:55:03,644 --> 00:55:07,166 to try and gain control of the world inside us, 1068 00:55:07,190 --> 00:55:12,113 to learn how to engineer and produce bodies and brains and minds. 1069 00:55:12,137 --> 00:55:16,779 These are likely to be the main products of the 21st-century economy. 1070 00:55:16,803 --> 00:55:20,624 When people think about the future, very often they think in terms of, 1071 00:55:20,648 --> 00:55:24,595 "Oh, I want to gain control of my body and of my brain." 1072 00:55:24,619 --> 00:55:27,429 And I think that's very dangerous. 1073 00:55:27,453 --> 00:55:30,719 If we've learned anything from our previous history, 1074 00:55:30,743 --> 00:55:34,656 it's that yes, we gained the power to manipulate, 1075 00:55:34,680 --> 00:55:37,470 but because we didn't really understand the complexity 1076 00:55:37,494 --> 00:55:39,299 of the ecological system, 1077 00:55:39,323 --> 00:55:43,013 we are now facing an ecological meltdown. 1078 00:55:43,037 --> 00:55:48,443 And if we now try to reengineer the world inside us 1079 00:55:48,467 --> 00:55:50,599 without really understanding it, 1080 00:55:50,623 --> 00:55:54,939 especially without understanding the complexity of our mental system, 1081 00:55:54,963 --> 00:55:59,623 we might cause a kind of internal ecological disaster, 1082 00:55:59,647 --> 00:56:03,190 and we'll face a kind of mental meltdown inside us. 1083 00:56:04,270 --> 00:56:06,712 CA: Putting all the pieces together here -- 1084 00:56:06,736 --> 00:56:09,416 the current politics, the coming technology, 1085 00:56:09,440 --> 00:56:11,590 concerns like the one you've just outlined -- 1086 00:56:11,614 --> 00:56:14,709 I mean, it seems like you yourself are in quite a bleak place 1087 00:56:14,733 --> 00:56:16,354 when you think about the future. 1088 00:56:16,378 --> 00:56:17,960 You're pretty worried about it. 1089 00:56:17,984 --> 00:56:19,176 Is that right?
1090 00:56:19,200 --> 00:56:25,888 And if there were one cause for hope, how would you state that? 1091 00:56:25,912 --> 00:56:30,075 YNH: I focus on the most dangerous possibilities 1092 00:56:30,099 --> 00:56:33,120 partly because this is my job or responsibility 1093 00:56:33,144 --> 00:56:34,925 as a historian or social critic. 1094 00:56:34,949 --> 00:56:39,711 I mean, the industry focuses mainly on the positive sides, 1095 00:56:39,735 --> 00:56:43,096 so it's the job of historians and philosophers and sociologists 1096 00:56:43,120 --> 00:56:47,561 to highlight the more dangerous potential of all these new technologies. 1097 00:56:47,585 --> 00:56:50,068 I don't think any of that is inevitable. 1098 00:56:50,092 --> 00:56:53,131 Technology is never deterministic. 1099 00:56:53,155 --> 00:56:54,872 You can use the same technology 1100 00:56:54,896 --> 00:56:57,887 to create very different kinds of societies. 1101 00:56:57,911 --> 00:56:59,949 If you look at the 20th century, 1102 00:56:59,973 --> 00:57:02,754 the technologies of the Industrial Revolution -- 1103 00:57:02,778 --> 00:57:05,835 the trains and electricity and all that -- 1104 00:57:05,859 --> 00:57:08,911 could be used to create a communist dictatorship 1105 00:57:08,935 --> 00:57:11,740 or a fascist regime or a liberal democracy. 1106 00:57:11,764 --> 00:57:14,292 The trains did not tell you what to do with them. 1107 00:57:14,316 --> 00:57:18,768 Similarly, now, artificial intelligence and bioengineering and all of that -- 1108 00:57:18,792 --> 00:57:22,306 they don't predetermine a single outcome. 1109 00:57:22,886 --> 00:57:26,063 Humanity can rise up to the challenge, 1110 00:57:26,087 --> 00:57:27,778 and the best example we have 1111 00:57:27,802 --> 00:57:31,542 of humanity rising up to the challenge of a new technology 1112 00:57:31,566 --> 00:57:33,289 is nuclear weapons. 1113 00:57:33,313 --> 00:57:36,322 In the late 1940s and '50s, 1114 00:57:36,346 --> 00:57:38,485 many people were convinced 1115 00:57:38,509 --> 00:57:42,815 that sooner or later the Cold War would end in a nuclear catastrophe, 1116 00:57:42,839 --> 00:57:44,614 destroying human civilization. 1117 00:57:44,638 --> 00:57:46,118 And this did not happen. 1118 00:57:46,142 --> 00:57:52,562 In fact, nuclear weapons prompted humans all over the world 1119 00:57:52,586 --> 00:57:57,327 to change the way that they manage international politics 1120 00:57:57,351 --> 00:57:59,720 to reduce violence. 1121 00:57:59,744 --> 00:58:02,983 And many countries basically took war 1122 00:58:03,007 --> 00:58:04,881 out of their political toolkit. 1123 00:58:04,905 --> 00:58:09,175 They no longer tried to pursue their interests with warfare. 1124 00:58:09,580 --> 00:58:12,850 Not all countries have done so, but many countries have. 1125 00:58:12,874 --> 00:58:16,808 And this is maybe the most important reason 1126 00:58:16,832 --> 00:58:22,934 why international violence has declined dramatically since 1945, 1127 00:58:22,958 --> 00:58:26,296 and today, as I said, more people commit suicide 1128 00:58:26,320 --> 00:58:28,527 than are killed in war. 1129 00:58:28,551 --> 00:58:33,380 So this, I think, gives us a good example 1130 00:58:33,404 --> 00:58:37,246 that even with the most frightening technology, 1131 00:58:37,270 --> 00:58:39,805 humans can rise up to the challenge 1132 00:58:39,829 --> 00:58:42,852 and actually some good can come out of it. 1133 00:58:42,876 --> 00:58:47,163 The problem is, we have very little margin for error.
1134 00:58:47,187 --> 00:58:49,396 If we don't get it right, 1135 00:58:49,420 --> 00:58:53,091 we might not have a second option to try again. 1136 00:58:54,337 --> 00:58:55,904 CA: That's a very powerful note, 1137 00:58:55,928 --> 00:58:58,733 on which I think we should draw this to a conclusion. 1138 00:58:58,757 --> 00:59:01,868 Before I wrap up, I just want to say one thing to people here 1139 00:59:01,892 --> 00:59:07,438 and to the global TED community and anyone else watching online: 1140 00:59:07,462 --> 00:59:10,355 help us with these dialogues. 1141 00:59:10,379 --> 00:59:12,929 If you believe, like we do, 1142 00:59:12,953 --> 00:59:15,933 that we need to find a different kind of conversation, 1143 00:59:15,957 --> 00:59:18,190 now more than ever, help us do it. 1144 00:59:18,214 --> 00:59:20,237 Reach out to other people, 1145 00:59:21,269 --> 00:59:24,009 try and have conversations with people you disagree with, 1146 00:59:24,033 --> 00:59:25,216 understand them, 1147 00:59:25,240 --> 00:59:26,770 pull the pieces together, 1148 00:59:26,794 --> 00:59:30,686 and help us figure out how to take these conversations forward 1149 00:59:30,710 --> 00:59:32,964 so we can make a real contribution 1150 00:59:32,988 --> 00:59:35,733 to what's happening in the world right now. 1151 00:59:35,757 --> 00:59:39,076 I think everyone feels more alive, 1152 00:59:39,100 --> 00:59:41,410 more concerned, more engaged 1153 00:59:41,434 --> 00:59:43,963 with the politics of the moment. 1154 00:59:43,987 --> 00:59:46,441 The stakes do seem quite high, 1155 00:59:46,465 --> 00:59:50,977 so help us respond to it in a wise, wise way. 1156 00:59:51,001 --> 00:59:52,596 Yuval Harari, thank you. 1157 00:59:52,620 --> 00:59:55,928 (Applause)