Chris Anderson: Nick Bostrom. So, you have already given us so many crazy ideas out there. I think a couple of decades ago, you made the case that we might all be living in a simulation, or perhaps probably were. More recently, you've painted the most vivid examples of how artificial general intelligence could go horribly wrong. And now this year, you're about to publish a paper that presents something called the vulnerable world hypothesis. And our job this evening is to give the illustrated guide to that. So let's do that. What is that hypothesis?

Nick Bostrom: It's trying to think about a sort of structural feature of the current human condition. You like the urn metaphor, so I'm going to use that to explain it. So picture a big urn filled with balls representing ideas, methods, possible technologies. You can think of the history of human creativity as the process of reaching into this urn and pulling out one ball after another, and the net effect so far has been hugely beneficial, right? We've extracted a great many white balls, some various shades of gray, mixed blessings. We haven't so far pulled out the black ball -- a technology that invariably destroys the civilization that discovers it. So the paper tries to think about what such a black ball could be.

CA: So you define that ball as one that would inevitably bring about civilizational destruction.

NB: Unless we exit what I call the semi-anarchic default condition. But sort of, by default.

CA: So, you make the case compelling by showing some counterexamples where you believe that so far we've actually got lucky, that we might have pulled out that death ball without even knowing it. So there's this quote, what's this quote?

NB: Well, I guess it's just meant to illustrate the difficulty of foreseeing what basic discoveries will lead to. We just don't have that capability. We have become quite good at pulling out balls, but we don't really have the ability to put the ball back into the urn, right. We can invent, but we can't un-invent. So our strategy, such as it is, is to hope that there is no black ball in the urn.

CA: So once it's out, it's out, and you can't put it back in, and you think we've been lucky. So talk through a couple of these examples. You talk about different types of vulnerability.
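As a rough sketch of the urn model described above, here is a toy simulation. It assumes, purely for illustration, that some small fixed fraction of draws is "black"; neither the talk nor the paper puts a number on that.

```python
import random

def draws_until_black(p_black=1e-4, seed=0):
    """Draw 'inventions' until the first black ball appears.

    p_black is an invented, purely illustrative probability that any
    given draw is a civilization-destroying technology.
    """
    rng = random.Random(seed)
    draws = 0
    while True:
        draws += 1
        if rng.random() < p_black:
            return draws

# The expected number of safe draws is about 1 / p_black, which is
# finite whenever p_black is greater than zero, however small.
samples = [draws_until_black(seed=i) for i in range(1000)]
print("average draws until the black ball:", sum(samples) / len(samples))
```

The point of the metaphor shows up directly: as long as the urn contains any black balls at all, a strategy of simply continuing to draw eventually finds one.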
NB: So the easiest type to understand is a technology that just makes it very easy to cause massive amounts of destruction. Synthetic biology might be a fecund source of that kind of black ball, but there are many other possible things we could -- think of geoengineering, really great, right? We could combat global warming, but you don't want it to get too easy either; you don't want any random person and his grandmother to have the ability to radically alter the earth's climate. Or maybe lethal autonomous drones, mass-produced, mosquito-sized killer bot swarms. Nanotechnology, artificial general intelligence.

CA: You argue in the paper that it's a matter of luck that when we discovered that nuclear power could create a bomb, it might have been the case that you could have created a bomb with much easier resources, accessible to anyone.

NB: Yeah, so think back to the 1930s, when for the first time we made some breakthroughs in nuclear physics. Some genius figures out that it's possible to create a nuclear chain reaction and then realizes that this could lead to the bomb. And we do some more work, and it turns out that what you require to make a nuclear bomb is highly enriched uranium or plutonium, which are very difficult materials to get. You need ultracentrifuges, you need reactors, like, massive amounts of energy. But suppose it had turned out instead that there had been an easy way to unlock the energy of the atom. That maybe by baking sand in the microwave oven or something like that you could have created a nuclear detonation. So we know that that's physically impossible. But before you did the relevant physics, how could you have known how it would turn out?

CA: Although, couldn't you argue that for life to evolve on Earth, that implied a sort of stable environment, that if it was possible to create massive nuclear reactions relatively easily, the Earth would never have been stable, and we wouldn't be here at all.

NB: Yeah, unless there were something that is easy to do on purpose but that wouldn't happen by random chance. So, like things we can easily do: we can stack 10 blocks on top of one another, but in nature, you're not going to find, like, a stack of 10 blocks.
CA: OK, so this is probably the one that many of us worry about most, and yes, synthetic biology is perhaps the quickest route that we can foresee in our near future to get us here.

NB: Yeah, and so think about what that would have meant if, say, anybody working in their kitchen for an afternoon could destroy a city. It's hard to see how modern civilization as we know it could have survived that. Because in any population of a million people, there will always be some who would, for whatever reason, choose to use that destructive power. So if that apocalyptic residual would choose to destroy a city, or worse, then cities would get destroyed.

CA: So here's another type of vulnerability. Talk about this.

NB: Yeah, so in addition to these kind of obvious types of black balls that would just make it possible to blow up a lot of things, other types would act by creating bad incentives for humans to do things that are harmful. So, the Type-2a, we might call it that, is to think about some technology that incentivizes great powers to use their massive amounts of force to create destruction. So, nuclear weapons were actually very close to this, right? What we did, we spent over 10 trillion dollars to build 70,000 nuclear warheads and put them on hair-trigger alert. And there were several times during the Cold War when we almost blew each other up. It's not because a lot of people felt this would be a great idea, let's all spend 10 trillion dollars to blow ourselves up, but the incentives were such that we were finding ourselves -- this could have been worse. Imagine if there had been a safe first strike. Then it might have been very tricky, in a crisis situation, to refrain from launching all their nuclear missiles. If nothing else, because you would fear that the other side might do it.

CA: Right, mutual assured destruction kept the Cold War relatively stable; without that, we might not be here now.

NB: It could have been more unstable than it was. And there could be other properties of technology. It could have been harder to have arms treaties if, instead of nuclear weapons, there had been some smaller thing or something less distinctive.

CA: And as well as bad incentives for powerful actors, you also worry about bad incentives for all of us, in Type-2b here.
NB: Yeah, so, here we might take the case of global warming. There are a lot of little conveniences that cause each one of us to do things that individually have no significant effect, right? But if billions of people do it, cumulatively, it has a damaging effect. Now, global warming could have been a lot worse than it is. So we have the climate sensitivity parameter, right. It's a parameter that says how much warmer it gets if you emit a certain amount of greenhouse gases. But suppose that it had been the case that with the amount of greenhouse gases we emitted, instead of the temperature rising by, say, between 3 and 4.5 degrees by 2100, suppose it had been 15 degrees or 20 degrees. Like, then we might have been in a very bad situation. Or suppose that renewable energy had just been a lot harder to do. Or that there had been more fossil fuels in the ground.
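For reference, and as an approximation added here rather than anything stated in the talk, the climate sensitivity parameter is usually expressed as warming per doubling of atmospheric CO2, so equilibrium warming can be sketched as sensitivity times the number of doublings:

```python
from math import log2

def equilibrium_warming(co2_ratio, sensitivity_per_doubling=3.0):
    """Approximate equilibrium warming, in degrees Celsius.

    sensitivity_per_doubling is the climate sensitivity parameter:
    degrees of warming per doubling of atmospheric CO2. The 3.0 default
    and the 15.0 below are illustrative placeholders, not figures from
    the talk.
    """
    return sensitivity_per_doubling * log2(co2_ratio)

print(equilibrium_warming(2.0))        # roughly our world: ~3 C per doubling
print(equilibrium_warming(2.0, 15.0))  # the counterfactual: ~15 C per doubling
```

Bostrom's counterfactual is a world in which that parameter had turned out several times larger, so the same emissions would have produced 15 or 20 degrees of warming rather than a few.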
CA: Couldn't you argue that if, in that case, what we are doing today had resulted in 10 degrees difference in the time period that we could see, actually humanity would have got off its ass and done something about it. We're stupid, but we're not maybe that stupid. Or maybe we are.

NB: I wouldn't bet on it.

(Laughter)

You could imagine other features. So, right now, it's a little bit difficult to switch to renewables and stuff, right, but it can be done. But it might just have been, with slightly different physics, it could have been much more expensive to do these things.

CA: And what's your view, Nick? Do you think, putting these possibilities together, that this earth, humanity that we are, we count as a vulnerable world? That there is a death ball in our future?

NB: It's hard to say. I mean, I think there might well be various black balls in the urn; that's what it looks like. There might also be some golden balls that would help us protect against black balls. And I don't know which order they will come out.

CA: I mean, one possible philosophical critique of this idea is that it implies a view that the future is essentially settled. That there either is that ball there or it's not. And in a way, that's not a view of the future that I want to believe. I want to believe that the future is undetermined, that our decisions today will determine what kind of balls we pull out of that urn.

NB: I mean, if we just keep inventing, like, eventually we will pull out all the balls. I mean, I think there's a kind of weak form of technological determinism that is quite plausible; like, you're unlikely to encounter a society that uses flint axes and jet planes. But you can almost think of a technology as a set of affordances. So technology is the thing that enables us to do various things and achieve various effects in the world. How we'd then use that, of course, depends on human choice. But if we think about these three types of vulnerability, they make quite weak assumptions about how we would choose to use them. So a Type-1 vulnerability, again, this massive, destructive power, it's a fairly weak assumption to think that in a population of millions of people there would be some that would choose to use it destructively.

CA: For me, the single most disturbing argument is that we actually might have some kind of view into the urn that makes it actually very likely that we're doomed. Namely, if you believe in accelerating power, that technology inherently accelerates, that we build the tools that make us more powerful, then at some point you get to a stage where a single individual can take us all down, and then it looks like we're screwed. Isn't that argument quite alarming?

NB: Ah, yeah.

(Laughter)

I think -- yeah, we get more and more power, and it's easier and easier to use those powers, but we can also invent technologies that kind of help us control how people use those powers.

CA: So let's talk about that, let's talk about the response. Suppose that, thinking about all the possibilities that are out there now -- it's not just synbio, it's things like cyberwarfare, artificial intelligence, etc., etc. -- that there might be serious doom in our future. What are the possible responses? And you've talked about four possible responses as well.

NB: Restricting technological development doesn't seem promising, if we are talking about a general halt to technological progress. I think it's neither feasible, nor would it be desirable even if we could do it.
I think there might be very limited areas where maybe you would want slower technological progress. You don't, I think, want faster progress in bioweapons, or in, say, isotope separation that would make it easier to create nukes.

CA: I mean, I used to be fully on board with that. But I would like to actually push back on that for a minute. Just because, first of all, if you look at the history of the last couple of decades, you know, it's always been push forward at full speed, it's OK, that's our only choice. But if you look at globalization and the rapid acceleration of that, if you look at the strategy of "move fast and break things" and what happened with that, and then you look at the potential for synthetic biology, I don't know that we should move forward rapidly or without any kind of restriction to a world where you could have a DNA printer in every home and high school lab. There are some restrictions, right?

NB: Possibly. There is the first part, the not feasible. If you think it would be desirable to stop it, there's the problem of feasibility. So it doesn't really help if one nation kind of --

CA: No, it doesn't help if one nation does, but we've had treaties before. That's really how we survived the nuclear threat, by going out there and going through the painful process of negotiating. I just wonder whether the logic isn't that we, as a matter of global priority, should go out there and try, like, now start negotiating really strict rules on where synthetic bioresearch is done, that it's not something that you want to democratize, no?

NB: I totally agree with that -- that it would be desirable, for example, maybe to have DNA synthesis machines, not as a product where each lab has their own device, but maybe as a service. Maybe there could be four or five places in the world where you send in your digital blueprint and the DNA comes back, right? And then, you would have the ability, if one day it really looked like it was necessary, we would have, like, a finite set of choke points. So I think you want to look for kind of special opportunities where you could have tighter control.

CA: Your belief is, fundamentally, we are not going to be successful in just holding back.
Someone, somewhere -- North Korea, you know -- someone is going to go there and discover this knowledge, if it's there to be found.

NB: That looks plausible under current conditions. It's not just synthetic biology, either. I mean, any kind of profound, new change in the world could turn out to be a black ball.

CA: Let's look at another possible response.

NB: This also, I think, has only limited potential. So, with the Type-1 vulnerability again, I mean, if you could reduce the number of people who are incentivized to destroy the world, if only they could get access and the means, that would be good.

CA: In this image that you asked us to do, you're imagining these drones flying around the world with facial recognition. When they spot someone showing signs of sociopathic behavior, they shower them with love, they fix them.

NB: I think it's like a hybrid picture. Eliminate can either mean, like, incarcerate or kill, or it can mean persuade them to a better view of the world. But the point is that, suppose you were extremely successful in this, and you reduced the number of such individuals by half. And if you want to do it by persuasion, you are competing against all other powerful forces that are trying to persuade people -- parties, religion, education system. But suppose you could reduce it by half; I don't think the risk would be reduced by half. Maybe by five or 10 percent.
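A minimal back-of-envelope illustration of why halving the pool barely moves the risk, under the simplifying assumption (not Bostrom's model, just an illustration) that each of N potential attackers acts independently with some small probability: the chance that at least one acts is 1 - (1 - p)^N, which stays close to 1 once N times p is large.

```python
def risk_of_at_least_one(n_people, p_each=1e-4):
    """Chance that at least one of n_people acts, assuming independence.

    Both n_people and p_each are invented numbers, purely illustrative.
    """
    return 1 - (1 - p_each) ** n_people

full = risk_of_at_least_one(50_000)    # ~0.99
halved = risk_of_at_least_one(25_000)  # ~0.92
print(f"risk falls by about {100 * (full - halved) / full:.0f}%")  # roughly 8%
```

With these made-up numbers, cutting the number of such individuals in half reduces the overall risk by only a few percent, which is the shape of the claim above.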
CA: You're not recommending that we gamble humanity's future on response two.

NB: I think it's all good to try to deter and persuade people, but we shouldn't rely on that as our only safeguard.

CA: How about three?

NB: I think there are two general methods that we could use to achieve the ability to stabilize the world against the whole spectrum of possible vulnerabilities. And we probably would need both. So, one is an extremely effective ability to do preventive policing, such that you could intercept. If anybody started to do this dangerous thing, you could intercept them in real time and stop them. So this would require ubiquitous surveillance; everybody would be monitored all the time.

CA: This is "Minority Report," essentially, a form of.

NB: You would have maybe AI algorithms, big freedom centers that were reviewing this, etc., etc.

CA: You know that mass surveillance is not a very popular term right now?

(Laughter)

NB: Yeah, so this little device there, imagine that kind of necklace that you would have to wear at all times with multidirectional cameras. But, to make it go down better, just call it the "freedom tag" or something like that.

(Laughter)

CA: OK. I mean, this is the conversation, friends, this is why this is such a mind-blowing conversation.

NB: Actually, there's a whole big conversation on this on its own, obviously. There are huge problems and risks with that, right? We may come back to that. So the other, the final, the other general stabilization capability is kind of plugging another governance gap. So the surveillance would be kind of a governance gap at the microlevel, like, preventing anybody from ever doing something highly illegal. Then, there's a corresponding governance gap at the macro level, at the global level. You would need the ability, reliably, to prevent the worst kinds of global coordination failures, to avoid wars between great powers, arms races, cataclysmic commons problems, in order to deal with the Type-2a vulnerabilities.

CA: Global governance is a term that's definitely way out of fashion right now, but could you make the case that throughout history, the history of humanity is that at every stage of technological power increase, people have reorganized and sort of centralized the power. So, for example, when a roving band of criminals could take over a society, the response was, well, you have a nation-state and you centralize force, a police force or an army, so, "No, you can't do that." The logic, perhaps, of having a single person or a single group able to take out humanity means at some point we're going to have to go this route, at least in some form, no?

NB: It's certainly true that the scale of political organization has increased over the course of human history. It used to be hunter-gatherer bands, right, and then chiefdoms, city-states, nations; now there are international organizations and so on and so forth.
Again, I just want to make sure I get the chance to stress that obviously there are huge downsides and indeed, massive risks, both to mass surveillance and to global governance. I'm just pointing out that if we are lucky, the world could be such that these would be the only ways you could survive a black ball.

CA: The logic of this theory, it seems to me, is that we've got to recognize we can't have it all. That the sort of, I would say, naive dream that many of us had that technology is always going to be a force for good, keep going, don't stop, go as fast as you can and not pay attention to some of the consequences, that's actually just not an option. We can have that, but if we have that, we're going to have to accept some of these other very uncomfortable things with it, and kind of be in this arms race with ourselves of, you want the power, you better limit it, you better figure out how to limit it.

NB: I think it is an option, a very tempting option; it's in a sense the easiest option and it might work, but it means we are fundamentally vulnerable to extracting a black ball. Now, I think with a bit of coordination, like, if you did solve this macrogovernance problem, and the microgovernance problem, then we could extract all the balls from the urn and we'd benefit greatly.

CA: I mean, if we're living in a simulation, does it matter? We just reboot.

(Laughter)

NB: Then ... I ...

(Laughter)

I didn't see that one coming.

CA: So what's your view? Putting all the pieces together, how likely is it that we're doomed?

(Laughter)

I love how people laugh when you ask that question.

NB: On an individual level, we seem to kind of be doomed anyway, just with the timeline; we're rotting and aging and all kinds of things, right?

(Laughter)

It's actually a little bit tricky. If you want to set it up so that you can attach a probability, first, who are we? If you're very old, probably you'll die of natural causes; if you're very young, you might have a 100-year -- the probability might depend on who you ask. Then there's the threshold: what counts as civilizational devastation?
In the paper I don't require an existential catastrophe in order for it to count. This is just a definitional matter; I say a billion dead, or a reduction of world GDP by 50 percent, but depending on what you say the threshold is, you get a different probability estimate. But I guess you could put me down as a frightened optimist.

(Laughter)

CA: You're a frightened optimist, and I think you've just created a large number of other frightened ... people.

(Laughter)

NB: In the simulation.

CA: In a simulation. Nick Bostrom, your mind amazes me. Thank you so much for scaring the living daylights out of us.

(Applause)