WEBVTT 00:00:01.000 --> 00:00:02.809 Chris Anderson: Nick Bostrom. 00:00:02.833 --> 00:00:06.809 So, you have already given us so many crazy ideas out there. 00:00:06.833 --> 00:00:08.559 I think a couple of decades ago, 00:00:08.583 --> 00:00:11.518 you made the case that we might all be living in a simulation, 00:00:11.542 --> 00:00:13.351 or perhaps probably were. 00:00:13.375 --> 00:00:14.726 More recently, 00:00:14.750 --> 00:00:19.351 you've painted the most vivid examples of how artificial general intelligence 00:00:19.375 --> 00:00:21.208 could go horribly wrong. 00:00:21.750 --> 00:00:23.143 And now this year, 00:00:23.167 --> 00:00:25.393 you're about to publish 00:00:25.417 --> 00:00:29.351 a paper that presents something called the vulnerable world hypothesis. 00:00:29.375 --> 00:00:33.958 And our job this evening is to give the illustrated guide to that. 00:00:34.417 --> 00:00:36.250 So let's do that. 00:00:36.833 --> 00:00:38.625 What is that hypothesis? NOTE Paragraph 00:00:40.000 --> 00:00:42.434 Nick Bostrom: It's trying to think about 00:00:42.458 --> 00:00:45.542 a sort of structural feature of the current human condition. 00:00:47.125 --> 00:00:49.476 You like the urn metaphor, 00:00:49.500 --> 00:00:51.393 so I'm going to use that to explain it. 00:00:51.417 --> 00:00:55.768 So picture a big urn filled with balls 00:00:55.792 --> 00:00:59.750 representing ideas, methods, possible technologies. 00:01:00.833 --> 00:01:04.559 You can think of the history of human creativity 00:01:04.583 --> 00:01:08.393 as the process of reaching into this urn and pulling out one ball after another, 00:01:08.417 --> 00:01:11.643 and the net effect so far has been hugely beneficial, right? 00:01:11.667 --> 00:01:14.393 We've extracted a great many white balls, 00:01:14.417 --> 00:01:17.292 some various shades of gray, mixed blessings. 00:01:18.042 --> 00:01:21.000 We haven't so far pulled out the black ball -- 00:01:22.292 --> 00:01:27.768 a technology that invariably destroys the civilization that discovers it. 00:01:27.792 --> 00:01:31.059 So the paper tries to think about what could such a black ball be. NOTE Paragraph 00:01:31.083 --> 00:01:32.893 CA: So you define that ball 00:01:32.917 --> 00:01:36.601 as one that would inevitably bring about civilizational destruction. NOTE Paragraph 00:01:36.625 --> 00:01:41.934 NB: Unless we exit what I call the semi-anarchic default condition. 00:01:41.958 --> 00:01:43.458 But sort of, by default. NOTE Paragraph 00:01:44.333 --> 00:01:47.851 CA: So, you make the case compelling 00:01:47.875 --> 00:01:49.893 by showing some sort of counterexamples 00:01:49.917 --> 00:01:52.851 where you believe that so far we've actually got lucky, 00:01:52.875 --> 00:01:55.726 that we might have pulled out that death ball 00:01:55.750 --> 00:01:57.309 without even knowing it. 00:01:57.333 --> 00:01:59.625 So there's this quote, what's this quote? NOTE Paragraph 00:02:00.625 --> 00:02:03.309 NB: Well, I guess it's just meant to illustrate 00:02:03.333 --> 00:02:05.434 the difficulty of foreseeing 00:02:05.458 --> 00:02:08.143 what basic discoveries will lead to. 00:02:08.167 --> 00:02:11.226 We just don't have that capability. 00:02:11.250 --> 00:02:14.601 Because we have become quite good at pulling out balls, 00:02:14.625 --> 00:02:18.351 but we don't really have the ability to put the ball back into the urn, right. 00:02:18.375 --> 00:02:20.542 We can invent, but we can't un-invent. 
00:02:21.583 --> 00:02:24.351 So our strategy, such as it is, 00:02:24.375 --> 00:02:26.809 is to hope that there is no black ball in the urn. NOTE Paragraph 00:02:26.833 --> 00:02:30.893 CA: So once it's out, it's out, and you can't put it back in, 00:02:30.917 --> 00:02:32.434 and you think we've been lucky. 00:02:32.458 --> 00:02:34.684 So talk through a couple of these examples. 00:02:34.708 --> 00:02:37.809 You talk about different types of vulnerability. NOTE Paragraph 00:02:37.833 --> 00:02:40.268 NB: So the easiest type to understand 00:02:40.292 --> 00:02:43.434 is a technology that just makes it very easy 00:02:43.458 --> 00:02:45.583 to cause massive amounts of destruction. 00:02:47.375 --> 00:02:50.893 Synthetic biology might be a fecund source of that kind of black ball, 00:02:50.917 --> 00:02:53.601 but many other possible things we could -- 00:02:53.625 --> 00:02:56.143 think of geoengineering, really great, right? 00:02:56.167 --> 00:02:58.393 We could combat global warming, 00:02:58.417 --> 00:03:00.559 but you don't want it to get too easy either, 00:03:00.583 --> 00:03:03.059 you don't want any random person and his grandmother 00:03:03.083 --> 00:03:06.143 to have the ability to radically alter the earth's climate. 00:03:06.167 --> 00:03:09.726 Or maybe lethal autonomous drones, 00:03:09.750 --> 00:03:13.083 mass-produced, mosquito-sized killer bot swarms. 00:03:14.500 --> 00:03:17.226 Nanotechnology, artificial general intelligence. NOTE Paragraph 00:03:17.250 --> 00:03:18.559 CA: You argue in the paper 00:03:18.583 --> 00:03:21.476 that it's a matter of luck that when we discovered 00:03:21.500 --> 00:03:24.934 that nuclear power could create a bomb, 00:03:24.958 --> 00:03:26.351 it might have been the case 00:03:26.375 --> 00:03:28.226 that you could have created a bomb 00:03:28.250 --> 00:03:31.809 with much easier resources, accessible to anyone. NOTE Paragraph 00:03:31.833 --> 00:03:35.393 NB: Yeah, so think back to the 1930s 00:03:35.417 --> 00:03:40.018 where for the first time we make some breakthroughs in nuclear physics, 00:03:40.042 --> 00:03:43.726 some genius figures out that it's possible to create a nuclear chain reaction 00:03:43.750 --> 00:03:46.934 and then realizes that this could lead to the bomb. 00:03:46.958 --> 00:03:48.851 And we do some more work, 00:03:48.875 --> 00:03:51.601 it turns out that what you require to make a nuclear bomb 00:03:51.625 --> 00:03:54.018 is highly enriched uranium or plutonium, 00:03:54.042 --> 00:03:56.059 which are very difficult materials to get. 00:03:56.083 --> 00:03:58.351 You need ultracentrifuges, 00:03:58.375 --> 00:04:02.143 you need reactors, like, massive amounts of energy. 00:04:02.167 --> 00:04:03.976 But suppose it had turned out instead 00:04:04.000 --> 00:04:07.976 there had been an easy way to unlock the energy of the atom. 00:04:08.000 --> 00:04:10.768 That maybe by baking sand in the microwave oven 00:04:10.792 --> 00:04:12.059 or something like that 00:04:12.083 --> 00:04:14.184 you could have created a nuclear detonation. 00:04:14.208 --> 00:04:16.351 So we know that that's physically impossible. 00:04:16.375 --> 00:04:18.268 But before you did the relevant physics 00:04:18.292 --> 00:04:20.483 how could you have known how it would turn out? 
NOTE Paragraph 00:04:20.507 --> 00:04:22.059 CA: Although, couldn't you argue 00:04:22.083 --> 00:04:24.018 that for life to evolve on Earth 00:04:24.042 --> 00:04:27.309 that implied sort of stable environment, 00:04:27.333 --> 00:04:31.518 that if it was possible to create massive nuclear reactions relatively easy, 00:04:31.542 --> 00:04:33.400 the Earth would never have been stable, 00:04:33.424 --> 00:04:34.976 that we wouldn't be here at all. NOTE Paragraph 00:04:35.000 --> 00:04:38.393 NB: Yeah, unless there were something that is easy to do on purpose 00:04:38.417 --> 00:04:41.268 but that wouldn't happen by random chance. 00:04:41.292 --> 00:04:42.871 So, like things we can easily do, 00:04:42.896 --> 00:04:45.006 we can stack 10 blocks on top of one another, 00:04:45.031 --> 00:04:48.228 but in nature, you're not going to find, like, a stack of 10 blocks. NOTE Paragraph 00:04:48.253 --> 00:04:49.926 CA: OK, so this is probably the one 00:04:49.950 --> 00:04:51.893 that many of us worry about most, 00:04:51.917 --> 00:04:55.434 and yes, synthetic biology is perhaps the quickest route 00:04:55.458 --> 00:04:58.476 that we can foresee in our near future to get us here. NOTE Paragraph 00:04:58.500 --> 00:05:01.434 NB: Yeah, and so think about what that would have meant 00:05:01.458 --> 00:05:05.101 if, say, anybody by working in their kitchen for an afternoon 00:05:05.125 --> 00:05:06.518 could destroy a city. 00:05:06.542 --> 00:05:10.101 It's hard to see how modern civilization as we know it 00:05:10.125 --> 00:05:11.559 could have survived that. 00:05:11.583 --> 00:05:14.101 Because in any population of a million people, 00:05:14.125 --> 00:05:16.809 there will always be some who would, for whatever reason, 00:05:16.833 --> 00:05:18.917 choose to use that destructive power. 00:05:19.750 --> 00:05:22.893 So if that apocalyptic residual 00:05:22.917 --> 00:05:24.893 would choose to destroy a city, or worse, 00:05:24.917 --> 00:05:26.476 then cities would get destroyed. NOTE Paragraph 00:05:26.500 --> 00:05:28.851 CA: So here's another type of vulnerability. 00:05:28.875 --> 00:05:30.518 Talk about this. NOTE Paragraph 00:05:30.542 --> 00:05:34.518 NB: Yeah, so in addition to these kind of obvious types of black balls 00:05:34.542 --> 00:05:37.352 that would just make it possible to blow up a lot of things, 00:05:37.376 --> 00:05:41.809 other types would act by creating bad incentives 00:05:41.833 --> 00:05:44.059 for humans to do things that are harmful. 00:05:44.083 --> 00:05:48.184 So, the Type-2a, we might call it that, 00:05:48.208 --> 00:05:52.726 is to think about some technology that incentivizes great powers 00:05:52.750 --> 00:05:57.226 to use their massive amounts of force to create destruction. 00:05:57.250 --> 00:06:00.083 So, nuclear weapons were actually very close to this, right? 00:06:02.083 --> 00:06:05.143 What we did, we spent over 10 trillion dollars 00:06:05.167 --> 00:06:07.684 to build 70,000 nuclear warheads 00:06:07.708 --> 00:06:10.143 and put them on hair-trigger alert. 00:06:10.167 --> 00:06:12.434 And there were several times during the Cold War 00:06:12.458 --> 00:06:13.893 we almost blew each other up. 00:06:13.917 --> 00:06:17.018 It's not because a lot of people felt this would be a great idea, 00:06:17.042 --> 00:06:19.726 let's all spend 10 trillion dollars to blow ourselves up, 00:06:19.750 --> 00:06:22.684 but the incentives were such that we were finding ourselves -- 00:06:22.708 --> 00:06:24.018 this could have been worse. 
00:06:24.042 --> 00:06:26.476 Imagine if there had been a safe first strike. 00:06:26.500 --> 00:06:28.809 Then it might have been very tricky, 00:06:28.833 --> 00:06:30.101 in a crisis situation, 00:06:30.125 --> 00:06:32.602 to refrain from launching all their nuclear missiles. 00:06:32.626 --> 00:06:36.018 If nothing else, because you would fear that the other side might do it. NOTE Paragraph 00:06:36.042 --> 00:06:37.851 CA: Right, mutual assured destruction 00:06:37.875 --> 00:06:40.601 kept the Cold War relatively stable, 00:06:40.625 --> 00:06:42.559 without that, we might not be here now. NOTE Paragraph 00:06:42.583 --> 00:06:44.893 NB: It could have been more unstable than it was. 00:06:44.917 --> 00:06:47.268 And there could be other properties of technology. 00:06:47.292 --> 00:06:49.601 It could have been harder to have arms treaties, 00:06:49.625 --> 00:06:51.226 if instead of nuclear weapons 00:06:51.250 --> 00:06:54.268 there had been some smaller thing or something less distinctive. NOTE Paragraph 00:06:54.292 --> 00:06:56.851 CA: And as well as bad incentives for powerful actors, 00:06:56.875 --> 00:07:00.393 you also worry about bad incentives for all of us, in Type-2b here. NOTE Paragraph 00:07:00.417 --> 00:07:04.667 NB: Yeah, so, here we might take the case of global warming. 00:07:06.958 --> 00:07:08.851 There are a lot of little conveniences 00:07:08.875 --> 00:07:11.059 that cause each one of us to do things 00:07:11.083 --> 00:07:13.934 that individually have no significant effect, right? 00:07:13.958 --> 00:07:15.934 But if billions of people do it, 00:07:15.958 --> 00:07:18.018 cumulatively, it has a damaging effect. 00:07:18.042 --> 00:07:20.851 Now, global warming could have been a lot worse than it is. 00:07:20.875 --> 00:07:23.851 So we have the climate sensitivity parameter, right. 00:07:23.875 --> 00:07:27.518 It's a parameter that says how much warmer does it get 00:07:27.542 --> 00:07:30.226 if you emit a certain amount of greenhouse gases. 00:07:30.250 --> 00:07:32.643 But, suppose that it had been the case 00:07:32.667 --> 00:07:35.184 that with the amount of greenhouse gases we emitted, 00:07:35.208 --> 00:07:37.268 instead of the temperature rising by, say, 00:07:37.292 --> 00:07:41.018 between three and 4.5 degrees by 2100, 00:07:41.042 --> 00:07:43.542 suppose it had been 15 degrees or 20 degrees. 00:07:44.375 --> 00:07:46.934 Like, then we might have been in a very bad situation. 00:07:46.958 --> 00:07:50.101 Or suppose that renewable energy had just been a lot harder to do. 00:07:50.125 --> 00:07:52.768 Or that there had been more fossil fuels in the ground. NOTE Paragraph 00:07:52.792 --> 00:07:55.434 CA: Couldn't you argue that if in that case of -- 00:07:55.458 --> 00:07:57.184 if what we are doing today 00:07:57.208 --> 00:08:01.768 had resulted in 10 degrees difference in the time period that we could see, 00:08:01.792 --> 00:08:05.476 actually humanity would have got off its ass and done something about it. 00:08:05.500 --> 00:08:08.309 We're stupid, but we're not maybe that stupid. 00:08:08.333 --> 00:08:09.601 Or maybe we are. NOTE Paragraph 00:08:09.625 --> 00:08:10.893 NB: I wouldn't bet on it. NOTE Paragraph 00:08:10.917 --> 00:08:13.101 (Laughter) NOTE Paragraph 00:08:13.125 --> 00:08:14.809 You could imagine other features. 00:08:14.833 --> 00:08:20.351 So, right now, it's a little bit difficult to switch to renewables and stuff, right, 00:08:20.375 --> 00:08:21.643 but it can be done. 
00:08:21.667 --> 00:08:24.643 But it might just have been, with slightly different physics, 00:08:24.667 --> 00:08:27.458 it could have been much more expensive to do these things. NOTE Paragraph 00:08:28.375 --> 00:08:29.893 CA: And what's your view, Nick? 00:08:29.917 --> 00:08:32.351 Do you think, putting these possibilities together, 00:08:32.375 --> 00:08:36.643 that this earth, humanity that we are, 00:08:36.667 --> 00:08:38.226 we count as a vulnerable world? 00:08:38.250 --> 00:08:40.667 That there is a death ball in our future? NOTE Paragraph 00:08:43.958 --> 00:08:45.226 NB: It's hard to say. 00:08:45.250 --> 00:08:50.309 I mean, I think there might well be various black balls in the urn, 00:08:50.333 --> 00:08:51.643 that's what it looks like. 00:08:51.667 --> 00:08:54.059 There might also be some golden balls 00:08:54.083 --> 00:08:57.559 that would help us protect against black balls. 00:08:57.583 --> 00:09:00.559 And I don't know which order they will come out. NOTE Paragraph 00:09:00.583 --> 00:09:04.434 CA: I mean, one possible philosophical critique of this idea 00:09:04.458 --> 00:09:10.101 is that it implies a view that the future is essentially settled. 00:09:10.125 --> 00:09:12.601 That there either is that ball there or it's not. 00:09:12.625 --> 00:09:15.643 And in a way, 00:09:15.667 --> 00:09:18.268 that's not a view of the future that I want to believe. 00:09:18.292 --> 00:09:20.643 I want to believe that the future is undetermined, 00:09:20.667 --> 00:09:22.601 that our decisions today will determine 00:09:22.625 --> 00:09:24.833 what kind of balls we pull out of that urn. NOTE Paragraph 00:09:25.917 --> 00:09:29.684 NB: I mean, if we just keep inventing, 00:09:29.708 --> 00:09:32.042 like, eventually we will pull out all the balls. 00:09:32.875 --> 00:09:36.268 I mean, I think there's a kind of weak form of technological determinism 00:09:36.292 --> 00:09:37.559 that is quite plausible, 00:09:37.583 --> 00:09:40.226 like, you're unlikely to encounter a society 00:09:40.250 --> 00:09:43.083 that uses flint axes and jet planes. 00:09:44.208 --> 00:09:48.268 But you can almost think of a technology as a set of affordances. 00:09:48.292 --> 00:09:51.309 So technology is the thing that enables us to do various things 00:09:51.333 --> 00:09:53.309 and achieve various effects in the world. 00:09:53.333 --> 00:09:56.143 How we'd then use that, of course depends on human choice. 00:09:56.167 --> 00:09:58.851 But if we think about these three types of vulnerability, 00:09:58.875 --> 00:10:02.268 they make quite weak assumptions about how we would choose to use them. 00:10:02.292 --> 00:10:05.684 So a Type-1 vulnerability, again, this massive, destructive power, 00:10:05.708 --> 00:10:07.143 it's a fairly weak assumption 00:10:07.167 --> 00:10:09.559 to think that in a population of millions of people 00:10:09.583 --> 00:10:12.518 there would be some that would choose to use it destructively. NOTE Paragraph 00:10:12.542 --> 00:10:14.976 CA: For me, the single most disturbing argument 00:10:15.000 --> 00:10:19.559 is that we actually might have some kind of view into the urn 00:10:19.583 --> 00:10:23.101 that makes it actually very likely that we're doomed. 
00:10:23.125 --> 00:10:27.768 Namely, if you believe in accelerating power, 00:10:27.792 --> 00:10:30.059 that technology inherently accelerates, 00:10:30.083 --> 00:10:32.518 that we build the tools that make us more powerful, 00:10:32.542 --> 00:10:35.184 then at some point you get to a stage 00:10:35.208 --> 00:10:38.268 where a single individual can take us all down, 00:10:38.292 --> 00:10:41.143 and then it looks like we're screwed. 00:10:41.167 --> 00:10:44.101 Isn't that argument quite alarming? NOTE Paragraph 00:10:44.125 --> 00:10:45.875 NB: Ah, yeah. NOTE Paragraph 00:10:46.708 --> 00:10:47.976 (Laughter) NOTE Paragraph 00:10:48.000 --> 00:10:49.333 I think -- 00:10:50.875 --> 00:10:52.476 Yeah, we get more and more power, 00:10:52.500 --> 00:10:56.434 and [it's] easier and easier to use those powers, 00:10:56.458 --> 00:11:00.018 but we can also invent technologies that kind of help us control 00:11:00.042 --> 00:11:02.059 how people use those powers. NOTE Paragraph 00:11:02.083 --> 00:11:04.934 CA: So let's talk about that, let's talk about the response. 00:11:04.958 --> 00:11:07.268 Suppose that thinking about all the possibilities 00:11:07.292 --> 00:11:09.393 that are out there now -- 00:11:09.417 --> 00:11:13.143 it's not just synbio, it's things like cyberwarfare, 00:11:13.167 --> 00:11:16.518 artificial intelligence, etc., etc. -- 00:11:16.542 --> 00:11:21.059 that there might be serious doom in our future. 00:11:21.083 --> 00:11:22.684 What are the possible responses? 00:11:22.708 --> 00:11:27.601 And you've talked about four possible responses as well. NOTE Paragraph 00:11:27.625 --> 00:11:31.268 NB: Restricting technological development doesn't seem promising, 00:11:31.292 --> 00:11:34.518 if we are talking about a general halt to technological progress. 00:11:34.542 --> 00:11:35.809 I think neither feasible, 00:11:35.833 --> 00:11:38.143 nor would it be desirable even if we could do it. 00:11:38.167 --> 00:11:41.184 I think there might be very limited areas 00:11:41.208 --> 00:11:43.934 where maybe you would want slower technological progress. 00:11:43.958 --> 00:11:47.351 You don't, I think, want faster progress in bioweapons, 00:11:47.375 --> 00:11:49.434 or in, say, isotope separation, 00:11:49.458 --> 00:11:51.708 that would make it easier to create nukes. NOTE Paragraph 00:11:52.583 --> 00:11:55.893 CA: I mean, I used to be fully on board with that. 00:11:55.917 --> 00:11:59.184 But I would like to actually push back on that for a minute. 00:11:59.208 --> 00:12:00.518 Just because, first of all, 00:12:00.542 --> 00:12:03.226 if you look at the history of the last couple of decades, 00:12:03.250 --> 00:12:06.809 you know, it's always been push forward at full speed, 00:12:06.833 --> 00:12:08.684 it's OK, that's our only choice. 00:12:08.708 --> 00:12:12.976 But if you look at globalization and the rapid acceleration of that, 00:12:13.000 --> 00:12:16.434 if you look at the strategy of "move fast and break things" 00:12:16.458 --> 00:12:18.518 and what happened with that, 00:12:18.542 --> 00:12:21.309 and then you look at the potential for synthetic biology, 00:12:21.333 --> 00:12:25.768 I don't know that we should move forward rapidly 00:12:25.792 --> 00:12:27.434 or without any kind of restriction 00:12:27.458 --> 00:12:30.768 to a world where you could have a DNA printer in every home 00:12:30.792 --> 00:12:32.125 and high school lab. 00:12:33.167 --> 00:12:34.851 There are some restrictions, right? 
NOTE Paragraph 00:12:34.875 --> 00:12:37.518 NB: Possibly, there is the first part, the not feasible. 00:12:37.542 --> 00:12:39.726 If you think it would be desirable to stop it, 00:12:39.750 --> 00:12:41.476 there's the problem of feasibility. 00:12:41.500 --> 00:12:44.309 So it doesn't really help if one nation kind of -- NOTE Paragraph 00:12:44.333 --> 00:12:46.351 CA: No, it doesn't help if one nation does, 00:12:46.375 --> 00:12:49.309 but we've had treaties before. 00:12:49.333 --> 00:12:52.684 That's really how we survived the nuclear threat, 00:12:52.708 --> 00:12:53.976 was by going out there 00:12:54.000 --> 00:12:56.518 and going through the painful process of negotiating. 00:12:56.542 --> 00:13:01.976 I just wonder whether the logic isn't that we, as a matter of global priority, 00:13:02.000 --> 00:13:03.684 we shouldn't go out there and try, 00:13:03.708 --> 00:13:06.393 like, now start negotiating really strict rules 00:13:06.417 --> 00:13:09.101 on where synthetic bioresearch is done, 00:13:09.125 --> 00:13:11.976 that it's not something that you want to democratize, no? NOTE Paragraph 00:13:12.000 --> 00:13:13.809 NB: I totally agree with that -- 00:13:13.833 --> 00:13:18.059 that it would be desirable, for example, 00:13:18.083 --> 00:13:21.684 maybe to have DNA synthesis machines, 00:13:21.708 --> 00:13:25.268 not as a product where each lab has their own device, 00:13:25.292 --> 00:13:26.768 but maybe as a service. 00:13:26.792 --> 00:13:29.309 Maybe there could be four or five places in the world 00:13:29.333 --> 00:13:32.851 where you send in your digital blueprint and the DNA comes back, right? 00:13:32.875 --> 00:13:34.643 And then, you would have the ability, 00:13:34.667 --> 00:13:37.059 if one day it really looked like it was necessary, 00:13:37.083 --> 00:13:39.434 we would have like, a finite set of choke points. 00:13:39.458 --> 00:13:42.976 So I think you want to look for kind of special opportunities, 00:13:43.000 --> 00:13:45.059 where you could have tighter control. NOTE Paragraph 00:13:45.083 --> 00:13:46.726 CA: Your belief is, fundamentally, 00:13:46.750 --> 00:13:49.643 we are not going to be successful in just holding back. 00:13:49.667 --> 00:13:52.393 Someone, somewhere -- North Korea, you know -- 00:13:52.417 --> 00:13:55.934 someone is going to go there and discover this knowledge, 00:13:55.958 --> 00:13:57.226 if it's there to be found. NOTE Paragraph 00:13:57.250 --> 00:13:59.601 NB: That looks plausible under current conditions. 00:13:59.625 --> 00:14:01.559 It's not just synthetic biology, either. 00:14:01.583 --> 00:14:04.101 I mean, any kind of profound, new change in the world 00:14:04.101 --> 00:14:05.727 could turn out to be a black ball. NOTE Paragraph 00:14:05.727 --> 00:14:07.823 CA: Let's look at another possible response. NOTE Paragraph 00:14:07.823 --> 00:14:10.226 NB: This also, I think, has only limited potential. 00:14:10.250 --> 00:14:13.809 So, with the Type-1 vulnerability again, 00:14:13.833 --> 00:14:18.184 I mean, if you could reduce the number of people who are incentivized 00:14:18.208 --> 00:14:19.476 to destroy the world, 00:14:19.500 --> 00:14:21.559 if only they could get access and the means, 00:14:21.583 --> 00:14:22.851 that would be good. NOTE Paragraph 00:14:22.875 --> 00:14:24.851 CA: In this image that you asked us to do 00:14:24.875 --> 00:14:27.434 you're imagining these drones flying around the world 00:14:27.458 --> 00:14:28.726 with facial recognition. 
00:14:28.750 --> 00:14:31.643 When they spot someone showing signs of sociopathic behavior, 00:14:31.667 --> 00:14:33.851 they shower them with love, they fix them. NOTE Paragraph 00:14:33.875 --> 00:14:35.768 NB: I think it's like a hybrid picture. 00:14:35.792 --> 00:14:39.809 Eliminate can either mean, like, incarcerate or kill, 00:14:39.833 --> 00:14:42.851 or it can mean persuade them to a better view of the world. 00:14:42.875 --> 00:14:44.601 But the point is that, 00:14:44.625 --> 00:14:46.768 suppose you were extremely successful in this, 00:14:46.792 --> 00:14:50.101 and you reduced the number of such individuals by half. 00:14:50.125 --> 00:14:52.018 And if you want to do it by persuasion, 00:14:52.042 --> 00:14:54.434 you are competing against all other powerful forces 00:14:54.458 --> 00:14:56.143 that are trying to persuade people, 00:14:56.167 --> 00:14:57.934 parties, religion, education system. 00:14:57.958 --> 00:14:59.863 But suppose you could reduce it by half, 00:14:59.887 --> 00:15:02.143 I don't think the risk would be reduced by half. 00:15:02.167 --> 00:15:03.726 Maybe by five or 10 percent. NOTE Paragraph 00:15:03.750 --> 00:15:08.101 CA: You're not recommending that we gamble humanity's future on response two. NOTE Paragraph 00:15:08.125 --> 00:15:11.143 NB: I think it's all good to try to deter and persuade people, 00:15:11.167 --> 00:15:14.143 but we shouldn't rely on that as our only safeguard. NOTE Paragraph 00:15:14.167 --> 00:15:15.434 CA: How about three? NOTE Paragraph 00:15:15.458 --> 00:15:18.351 NB: I think there are two general methods 00:15:18.375 --> 00:15:22.351 that we could use to achieve the ability to stabilize the world 00:15:22.375 --> 00:15:25.351 against the whole spectrum of possible vulnerabilities. 00:15:25.375 --> 00:15:26.934 And we probably would need both. 00:15:26.958 --> 00:15:31.476 So, one is an extremely effective ability 00:15:31.500 --> 00:15:33.268 to do preventive policing. 00:15:33.292 --> 00:15:34.816 Such that you could intercept. 00:15:34.840 --> 00:15:37.601 If anybody started to do this dangerous thing, 00:15:37.625 --> 00:15:40.309 you could intercept them in real time, and stop them. 00:15:40.333 --> 00:15:42.809 So this would require ubiquitous surveillance, 00:15:42.833 --> 00:15:45.208 everybody would be monitored all the time. NOTE Paragraph 00:15:46.333 --> 00:15:48.893 CA: This is "Minority Report," essentially, a form of. NOTE Paragraph 00:15:48.917 --> 00:15:50.851 NB: You would have maybe AI algorithms, 00:15:50.875 --> 00:15:55.292 big freedom centers that were reviewing this, etc., etc. NOTE Paragraph 00:15:56.583 --> 00:16:00.976 CA: You know that mass surveillance is not a very popular term right now? NOTE Paragraph 00:16:01.000 --> 00:16:02.250 (Laughter) NOTE Paragraph 00:16:03.458 --> 00:16:05.268 NB: Yeah, so this little device there, 00:16:05.292 --> 00:16:08.893 imagine that kind of necklace that you would have to wear at all times 00:16:08.917 --> 00:16:10.917 with multidirectional cameras. 00:16:11.792 --> 00:16:13.601 But, to make it go down better, 00:16:13.625 --> 00:16:16.149 just call it the "freedom tag" or something like that. NOTE Paragraph 00:16:16.173 --> 00:16:18.184 (Laughter) NOTE Paragraph 00:16:18.208 --> 00:16:19.476 CA: OK. 00:16:19.500 --> 00:16:21.601 I mean, this is the conversation, friends, 00:16:21.625 --> 00:16:25.184 this is why this is such a mind-blowing conversation. 
NOTE Paragraph 00:16:25.208 --> 00:16:27.809 NB: Actually, there's a whole big conversation on this 00:16:27.833 --> 00:16:29.143 on its own, obviously. 00:16:29.167 --> 00:16:31.643 There are huge problems and risks with that, right? 00:16:31.667 --> 00:16:32.934 We may come back to that. 00:16:32.958 --> 00:16:34.226 So the other, the final, 00:16:34.250 --> 00:16:36.809 the other general stabilization capability 00:16:36.833 --> 00:16:38.893 is kind of plugging another governance gap. 00:16:38.917 --> 00:16:43.101 So the surveillance would be kind of governance gap at the microlevel, 00:16:43.125 --> 00:16:46.226 like, preventing anybody from ever doing something highly illegal. 00:16:46.250 --> 00:16:48.559 Then, there's a corresponding governance gap 00:16:48.583 --> 00:16:50.518 at the macro level, at the global level. 00:16:50.542 --> 00:16:54.476 You would need the ability, reliably, 00:16:54.500 --> 00:16:57.309 to prevent the worst kinds of global coordination failures, 00:16:57.333 --> 00:17:01.101 to avoid wars between great powers, 00:17:01.125 --> 00:17:02.458 arms races, 00:17:03.500 --> 00:17:05.708 cataclysmic commons problems, 00:17:07.667 --> 00:17:11.851 in order to deal with the Type-2a vulnerabilities. NOTE Paragraph 00:17:11.875 --> 00:17:13.809 CA: Global governance is a term 00:17:13.833 --> 00:17:16.059 that's definitely way out of fashion right now, 00:17:16.083 --> 00:17:18.601 but could you make the case that throughout history, 00:17:18.625 --> 00:17:19.893 the history of humanity 00:17:19.917 --> 00:17:25.351 is that at every stage of technological power increase, 00:17:25.375 --> 00:17:28.601 people have reorganized and sort of centralized the power. 00:17:28.625 --> 00:17:32.059 So, for example, when a roving band of criminals 00:17:32.083 --> 00:17:33.768 could take over a society, 00:17:33.792 --> 00:17:36.031 the response was, well, you have a nation-state 00:17:36.055 --> 00:17:38.489 and you centralize force, a police force or an army, 00:17:38.513 --> 00:17:40.143 so, "No, you can't do that." 00:17:40.167 --> 00:17:44.726 The logic, perhaps, of having a single person or a single group 00:17:44.750 --> 00:17:46.393 able to take out humanity 00:17:46.417 --> 00:17:49.143 means at some point we're going to have to go this route, 00:17:49.167 --> 00:17:50.601 at least in some form, no? NOTE Paragraph 00:17:50.625 --> 00:17:54.309 NB: It's certainly true that the scale of political organization has increased 00:17:54.333 --> 00:17:56.476 over the course of human history. 00:17:56.500 --> 00:17:58.518 It used to be hunter-gatherer band, right, 00:17:58.542 --> 00:18:01.476 and then chiefdom, city-states, nations, 00:18:01.500 --> 00:18:05.476 now there are international organizations and so on and so forth. 00:18:05.500 --> 00:18:07.018 Again, I just want to make sure 00:18:07.042 --> 00:18:08.684 I get the chance to stress 00:18:08.708 --> 00:18:10.684 that obviously there are huge downsides 00:18:10.708 --> 00:18:12.226 and indeed, massive risks, 00:18:12.250 --> 00:18:15.601 both to mass surveillance and to global governance. 00:18:15.625 --> 00:18:18.184 I'm just pointing out that if we are lucky, 00:18:18.208 --> 00:18:20.893 the world could be such that these would be the only ways 00:18:20.917 --> 00:18:22.434 you could survive a black ball. NOTE Paragraph 00:18:22.458 --> 00:18:24.976 CA: The logic of this theory, 00:18:25.000 --> 00:18:26.268 it seems to me, 00:18:26.292 --> 00:18:29.893 is that we've got to recognize we can't have it all. 
00:18:29.917 --> 00:18:31.750 That the sort of, 00:18:33.500 --> 00:18:36.476 I would say, naive dream that many of us had 00:18:36.500 --> 00:18:39.851 that technology is always going to be a force for good, 00:18:39.875 --> 00:18:42.851 keep going, don't stop, go as fast as you can 00:18:42.875 --> 00:18:45.226 and not pay attention to some of the consequences, 00:18:45.250 --> 00:18:46.934 that's actually just not an option. 00:18:46.958 --> 00:18:48.893 We can have that. 00:18:48.917 --> 00:18:50.184 If we have that, 00:18:50.208 --> 00:18:51.643 we're going to have to accept 00:18:51.667 --> 00:18:54.226 some of these other very uncomfortable things with it, 00:18:54.250 --> 00:18:56.476 and kind of be in this arms race with ourselves 00:18:56.500 --> 00:18:58.768 of, you want the power, you better limit it, 00:18:58.792 --> 00:19:00.934 you better figure out how to limit it. NOTE Paragraph 00:19:00.958 --> 00:19:04.434 NB: I think it is an option, 00:19:04.458 --> 00:19:07.226 a very tempting option, it's in a sense the easiest option 00:19:07.250 --> 00:19:08.518 and it might work, 00:19:08.542 --> 00:19:13.351 but it means we are fundamentally vulnerable to extracting a black ball. 00:19:13.375 --> 00:19:15.518 Now, I think with a bit of coordination, 00:19:15.542 --> 00:19:18.268 like, if you did solve this macrogovernance problem, 00:19:18.292 --> 00:19:19.893 and the microgovernance problem, 00:19:19.917 --> 00:19:22.226 then we could extract all the balls from the urn 00:19:22.250 --> 00:19:24.518 and we'd benefit greatly. NOTE Paragraph 00:19:24.542 --> 00:19:27.976 CA: I mean, if we're living in a simulation, does it matter? 00:19:28.000 --> 00:19:29.309 We just reboot. NOTE Paragraph 00:19:29.333 --> 00:19:30.601 (Laughter) NOTE Paragraph 00:19:30.625 --> 00:19:32.268 NB: Then ... I ... NOTE Paragraph 00:19:32.292 --> 00:19:34.768 (Laughter) 00:19:34.792 --> 00:19:36.208 I didn't see that one coming. NOTE Paragraph 00:19:38.125 --> 00:19:39.393 CA: So what's your view? 00:19:39.417 --> 00:19:44.226 Putting all the pieces together, how likely is it that we're doomed? NOTE Paragraph 00:19:44.250 --> 00:19:46.208 (Laughter) NOTE Paragraph 00:19:47.042 --> 00:19:49.434 I love how people laugh when you ask that question. NOTE Paragraph 00:19:49.458 --> 00:19:50.809 NB: On an individual level, 00:19:50.833 --> 00:19:54.684 we seem to kind of be doomed anyway, just with the time line, 00:19:54.708 --> 00:19:57.309 we're rotting and aging and all kinds of things, right? NOTE Paragraph 00:19:57.333 --> 00:19:58.934 (Laughter) NOTE Paragraph 00:19:58.958 --> 00:20:00.643 It's actually a little bit tricky. 00:20:00.667 --> 00:20:03.434 If you want to set up so that you can attach a probability, 00:20:03.458 --> 00:20:04.726 first, who are we? 00:20:04.750 --> 00:20:07.476 If you're very old, probably you'll die of natural causes, 00:20:07.500 --> 00:20:09.851 if you're very young, you might have a 100-year -- 00:20:09.875 --> 00:20:12.018 the probability might depend on who you ask. 00:20:12.042 --> 00:20:16.268 Then the threshold, like, what counts as civilizational devastation? 00:20:16.292 --> 00:20:21.934 In the paper I don't require an existential catastrophe 00:20:21.958 --> 00:20:23.393 in order for it to count. 
00:20:23.417 --> 00:20:25.101 This is just a definitional matter, 00:20:25.125 --> 00:20:26.434 I say a billion dead, 00:20:26.458 --> 00:20:28.518 or a reduction of world GDP by 50 percent, 00:20:28.542 --> 00:20:30.768 but depending on what you say the threshold is, 00:20:30.792 --> 00:20:32.768 you get a different probability estimate. 00:20:32.792 --> 00:20:37.309 But I guess you could put me down as a frightened optimist. NOTE Paragraph 00:20:37.333 --> 00:20:38.434 (Laughter) NOTE Paragraph 00:20:38.458 --> 00:20:40.101 CA: You're a frightened optimist, 00:20:40.125 --> 00:20:44.393 and I think you've just created a large number of other frightened ... 00:20:44.417 --> 00:20:45.684 people. NOTE Paragraph 00:20:45.708 --> 00:20:46.768 (Laughter) NOTE Paragraph 00:20:46.792 --> 00:20:48.059 NB: In the simulation. NOTE Paragraph 00:20:48.083 --> 00:20:49.351 CA: In a simulation. 00:20:49.375 --> 00:20:51.059 Nick Bostrom, your mind amazes me, 00:20:51.083 --> 00:20:53.976 thank you so much for scaring the living daylights out of us. NOTE Paragraph 00:20:54.000 --> 00:20:56.375 (Applause)