Chris Anderson: Nick Bostrom. So, you have already given us so many crazy ideas out there. I think a couple of decades ago, you made the case that we might all be living in a simulation, or perhaps probably were. More recently, you've painted the most vivid examples of how artificial general intelligence could go horribly wrong. And now this year, you're about to publish a paper that presents something called the vulnerable world hypothesis. And our job this evening is to give the illustrated guide to that. So let's do that. What is that hypothesis?

Nick Bostrom: It's trying to think about a sort of structural feature of the current human condition. You like the urn metaphor, so I'm going to use that to explain it. So picture a big urn filled with balls representing ideas, methods, possible technologies. You can think of the history of human creativity as the process of reaching into this urn and pulling out one ball after another, and the net effect so far has been hugely beneficial, right? We've extracted a great many white balls, some various shades of gray, mixed blessings. We haven't so far pulled out the black ball -- a technology that invariably destroys the civilization that discovers it. So the paper tries to think about what could such a black ball be.

CA: So you define that ball as one that would inevitably bring about civilizational destruction.

NB: Unless we exit what I call the semi-anarchic default condition. But sort of, by default.

CA: So, you make the case compelling by showing some sort of counterexamples where you believe that so far we've actually got lucky, that we might have pulled out that death ball without even knowing it. So there's this quote, what's this quote?

NB: Well, I guess it's just meant to illustrate the difficulty of foreseeing what basic discoveries will lead to. We just don't have that capability. Because we have become quite good at pulling out balls, but we don't really have the ability to put the ball back into the urn, right. We can invent, but we can't un-invent.
So our strategy, such as it is, is to hope that there is no black ball in the urn.

CA: So once it's out, it's out, and you can't put it back in, and you think we've been lucky. So talk through a couple of these examples. You talk about different types of vulnerability.

NB: So the easiest type to understand is a technology that just makes it very easy to cause massive amounts of destruction. Synthetic biology might be a fecund source of that kind of black ball, but many other possible things we could -- think of geoengineering, really great, right? We could combat global warming, but you don't want it to get too easy either, you don't want any random person and his grandmother to have the ability to radically alter the earth's climate. Or maybe lethal autonomous drones, mass-produced, mosquito-sized killer bot swarms. Nanotechnology, artificial general intelligence.

CA: You argue in the paper that it's a matter of luck that when we discovered that nuclear power could create a bomb, it might have been the case that you could have created a bomb with much easier resources, accessible to anyone.

NB: Yeah, so think back to the 1930s, when for the first time we made some breakthroughs in nuclear physics. Some genius figures out that it's possible to create a nuclear chain reaction and then realizes that this could lead to the bomb. And we do some more work, and it turns out that what you require to make a nuclear bomb is highly enriched uranium or plutonium, which are very difficult materials to get. You need ultracentrifuges, you need reactors, like, massive amounts of energy. But suppose it had turned out instead there had been an easy way to unlock the energy of the atom. That maybe by baking sand in the microwave oven or something like that you could have created a nuclear detonation. So we know that that's physically impossible. But before you did the relevant physics, how could you have known how it would turn out?
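To make the urn-drawing intuition above concrete, here is a minimal illustrative sketch in Python. The black-ball fraction and the draw counts are made-up assumptions for illustration, not figures from Bostrom's paper; the point is only that if any black balls are in the urn at all, continued drawing makes pulling one out almost inevitable, which is why "hoping there is no black ball" is the whole of the default strategy.

    # Illustrative sketch of the urn model discussed above.
    # The black-ball fraction and draw counts are made-up assumptions,
    # not estimates from the paper.
    import random

    def survives_all_draws(num_draws: int, black_fraction: float) -> bool:
        """True if none of the draws comes up black."""
        return all(random.random() >= black_fraction for _ in range(num_draws))

    def survival_probability(num_draws: int, black_fraction: float,
                             trials: int = 100_000) -> float:
        """Monte Carlo estimate of never drawing a black ball; closed form is (1 - f) ** n."""
        hits = sum(survives_all_draws(num_draws, black_fraction) for _ in range(trials))
        return hits / trials

    if __name__ == "__main__":
        for n in (10, 100, 1000):
            p = survival_probability(num_draws=n, black_fraction=0.01)
            print(f"{n} draws, 1% black balls: P(no black ball so far) ~ {p:.3f}")

With these illustrative numbers the survival chance is roughly 0.90 after 10 draws, 0.37 after 100, and essentially zero after 1,000 -- and because a drawn ball cannot be put back, the accumulated exposure only ever grows.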
CA: Although, couldn't you argue that for life to evolve on Earth, that implied a sort of stable environment, that if it was possible to create massive nuclear reactions relatively easily, the Earth would never have been stable, that we wouldn't be here at all?

NB: Yeah, unless there were something that is easy to do on purpose but that wouldn't happen by random chance. So, like things we can easily do, we can stack 10 blocks on top of one another, but in nature, you're not going to find, like, a stack of 10 blocks.

CA: OK, so this is probably the one that many of us worry about most, and yes, synthetic biology is perhaps the quickest route that we can foresee in our near future to get us here.

NB: Yeah, and so think about what that would have meant if, say, anybody by working in their kitchen for an afternoon could destroy a city. It's hard to see how modern civilization as we know it could have survived that. Because in any population of a million people, there will always be some who would, for whatever reason, choose to use that destructive power. So if that apocalyptic residual would choose to destroy a city, or worse, then cities would get destroyed.

CA: So here's another type of vulnerability. Talk about this.

NB: Yeah, so in addition to these kind of obvious types of black balls that would just make it possible to blow up a lot of things, other types would act by creating bad incentives for humans to do things that are harmful. So, the Type-2a, we might call it that, is to think about some technology that incentivizes great powers to use their massive amounts of force to create destruction. So, nuclear weapons were actually very close to this, right? What we did, we spent over 10 trillion dollars to build 70,000 nuclear warheads and put them on hair-trigger alert. And there were several times during the Cold War we almost blew each other up.
It's not because a lot of people felt this would be a great idea, let's all spend 10 trillion dollars to blow ourselves up, but the incentives were such that we were finding ourselves -- this could have been worse. Imagine if there had been a safe first strike. Then it might have been very tricky, in a crisis situation, to refrain from launching all their nuclear missiles. If nothing else, because you would fear that the other side might do it.

CA: Right, mutual assured destruction kept the Cold War relatively stable; without that, we might not be here now.

NB: It could have been more unstable than it was. And there could be other properties of technology. It could have been harder to have arms treaties, if instead of nuclear weapons there had been some smaller thing or something less distinctive.

CA: And as well as bad incentives for powerful actors, you also worry about bad incentives for all of us, in Type-2b here.

NB: Yeah, so, here we might take the case of global warming. There are a lot of little conveniences that cause each one of us to do things that individually have no significant effect, right? But if billions of people do it, cumulatively, it has a damaging effect. Now, global warming could have been a lot worse than it is. So we have the climate sensitivity parameter, right. It's a parameter that says how much warmer it gets if you emit a certain amount of greenhouse gases. But suppose that it had been the case that with the amount of greenhouse gases we emitted, instead of the temperature rising by, say, between three and 4.5 degrees by 2100, suppose it had been 15 degrees or 20 degrees. Like, then we might have been in a very bad situation. Or suppose that renewable energy had just been a lot harder to do. Or that there had been more fossil fuels in the ground.

CA: Couldn't you argue that if, in that case of -- if what we are doing today had resulted in a 10-degree difference in the time period that we could see, actually humanity would have got off its ass and done something about it? We're stupid, but maybe we're not that stupid.
Or maybe we are.

NB: I wouldn't bet on it.

(Laughter)

You could imagine other features. So, right now, it's a little bit difficult to switch to renewables and stuff, right, but it can be done. But it might just have been, with slightly different physics, it could have been much more expensive to do these things.

CA: And what's your view, Nick? Do you think, putting these possibilities together, that this earth, humanity that we are, we count as a vulnerable world? That there is a death ball in our future?

NB: It's hard to say. I mean, I think there might well be various black balls in the urn, that's what it looks like. There might also be some golden balls that would help us protect against black balls. And I don't know which order they will come out.

CA: I mean, one possible philosophical critique of this idea is that it implies a view that the future is essentially settled. That there either is that ball there or it's not. And in a way, that's not a view of the future that I want to believe. I want to believe that the future is undetermined, that our decisions today will determine what kind of balls we pull out of that urn.

NB: I mean, if we just keep inventing, like, eventually we will pull out all the balls. I mean, I think there's a kind of weak form of technological determinism that is quite plausible, like, you're unlikely to encounter a society that uses flint axes and jet planes. But you can almost think of a technology as a set of affordances. So technology is the thing that enables us to do various things and achieve various effects in the world. How we'd then use that, of course depends on human choice. But if we think about these three types of vulnerability, they make quite weak assumptions about how we would choose to use them. So a Type-1 vulnerability, again, this massive, destructive power, it's a fairly weak assumption to think that in a population of millions of people there would be some that would choose to use it destructively.
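The "apocalyptic residual" point is at bottom a claim about how risk scales with the pool of people who have access. Here is a back-of-the-envelope sketch with entirely made-up numbers (the population size, the fraction willing to act, and the independence assumption are all illustrative, not figures from the paper). It is in the same spirit as Bostrom's later point that halving the number of motivated individuals would cut the risk far less than half.

    # Back-of-the-envelope sketch for the "apocalyptic residual" argument.
    # Population size and the fraction willing to act are illustrative
    # assumptions, not figures from the paper; independence is also assumed.

    def p_at_least_one(population: int, fraction_willing: float) -> float:
        """P(at least one person acts) = 1 - (1 - f) ** N."""
        return 1.0 - (1.0 - fraction_willing) ** population

    if __name__ == "__main__":
        n = 1_000_000      # a million people with access (assumption)
        f = 1e-5           # one in 100,000 willing to act (assumption)
        print(f"full pool:   {p_at_least_one(n, f):.5f}")      # ~0.99995
        print(f"halved pool: {p_at_least_one(n, f / 2):.5f}")  # still ~0.99326

With numbers like these, halving the pool of willing actors barely moves the bottom line; what matters is whether the capability is widely accessible at all, which is roughly why the conversation turns next to access, policing, and governance rather than persuasion alone.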
CA: For me, the single most disturbing argument is that we actually might have some kind of view into the urn that makes it actually very likely that we're doomed. Namely, if you believe in accelerating power, that technology inherently accelerates, that we build the tools that make us more powerful, then at some point you get to a stage where a single individual can take us all down, and then it looks like we're screwed. Isn't that argument quite alarming?

NB: Ah, yeah.

(Laughter)

I think -- Yeah, we get more and more power, and [it's] easier and easier to use those powers, but we can also invent technologies that kind of help us control how people use those powers.

CA: So let's talk about that, let's talk about the response. Suppose that, thinking about all the possibilities that are out there now -- it's not just synbio, it's things like cyberwarfare, artificial intelligence, etc., etc. -- that there might be serious doom in our future. What are the possible responses? And you've talked about four possible responses as well.

NB: Restricting technological development doesn't seem promising, if we are talking about a general halt to technological progress. I think that's neither feasible, nor would it be desirable even if we could do it. I think there might be very limited areas where maybe you would want slower technological progress. You don't, I think, want faster progress in bioweapons, or in, say, isotope separation, that would make it easier to create nukes.

CA: I mean, I used to be fully on board with that. But I would like to actually push back on that for a minute. Just because, first of all, if you look at the history of the last couple of decades, you know, it's always been push forward at full speed, it's OK, that's our only choice.
But if you look at globalization and the rapid acceleration of that, if you look at the strategy of "move fast and break things" and what happened with that, and then you look at the potential for synthetic biology, I don't know that we should move forward rapidly or without any kind of restriction to a world where you could have a DNA printer in every home and high school lab. There are some restrictions, right?

NB: Possibly. There is the first part, the "not feasible." If you think it would be desirable to stop it, there's the problem of feasibility. So it doesn't really help if one nation kind of --

CA: No, it doesn't help if one nation does, but we've had treaties before. That's really how we survived the nuclear threat -- by going out there and going through the painful process of negotiating. I just wonder whether the logic isn't that we, as a matter of global priority, we shouldn't go out there and try, like, now start negotiating really strict rules on where synthetic bioresearch is done, that it's not something that you want to democratize, no?

NB: I totally agree with that -- that it would be desirable, for example, maybe to have DNA synthesis machines, not as a product where each lab has their own device, but maybe as a service. Maybe there could be four or five places in the world where you send in your digital blueprint and the DNA comes back, right? And then, you would have the ability, if one day it really looked like it was necessary, we would have, like, a finite set of choke points. So I think you want to look for kind of special opportunities, where you could have tighter control.

CA: Your belief is, fundamentally, we are not going to be successful in just holding back. Someone, somewhere -- North Korea, you know -- someone is going to go there and discover this knowledge, if it's there to be found.

NB: That looks plausible under current conditions. It's not just synthetic biology, either. I mean, any kind of profound, new change in the world could turn out to be a black ball.
CA: Let's look at another possible response.

NB: This also, I think, has only limited potential. So, with the Type-1 vulnerability again, I mean, if you could reduce the number of people who are incentivized to destroy the world, if only they could get access and the means, that would be good.

CA: In this image that you asked us to do, you're imagining these drones flying around the world with facial recognition. When they spot someone showing signs of sociopathic behavior, they shower them with love, they fix them.

NB: I think it's like a hybrid picture. "Eliminate" can either mean, like, incarcerate or kill, or it can mean persuade them to a better view of the world. But the point is that, suppose you were extremely successful in this, and you reduced the number of such individuals by half. And if you want to do it by persuasion, you are competing against all other powerful forces that are trying to persuade people -- parties, religion, the education system. But suppose you could reduce it by half, I don't think the risk would be reduced by half. Maybe by five or 10 percent.

CA: You're not recommending that we gamble humanity's future on response two.

NB: I think it's all good to try to deter and persuade people, but we shouldn't rely on that as our only safeguard.

CA: How about three?

NB: I think there are two general methods that we could use to achieve the ability to stabilize the world against the whole spectrum of possible vulnerabilities. And we probably would need both. So, one is an extremely effective ability to do preventive policing. Such that you could intercept: if anybody started to do this dangerous thing, you could intercept them in real time, and stop them. So this would require ubiquitous surveillance; everybody would be monitored all the time.

CA: This is "Minority Report," essentially, a form of.

NB: You would have maybe AI algorithms, big "freedom centers" that were reviewing this, etc., etc.

CA: You know that mass surveillance is not a very popular term right now?
(Laughter)

NB: Yeah, so this little device there, imagine that kind of necklace that you would have to wear at all times with multidirectional cameras. But, to make it go down better, just call it the "freedom tag" or something like that.

(Laughter)

CA: OK. I mean, this is the conversation, friends, this is why this is such a mind-blowing conversation.

NB: Actually, there's a whole big conversation on this on its own, obviously. There are huge problems and risks with that, right? We may come back to that. So the other, the final, the other general stabilization capability is kind of plugging another governance gap. So the surveillance would be kind of plugging the governance gap at the microlevel, like, preventing anybody from ever doing something highly illegal. Then, there's a corresponding governance gap at the macro level, at the global level. You would need the ability, reliably, to prevent the worst kinds of global coordination failures, to avoid wars between great powers, arms races, cataclysmic commons problems, in order to deal with the Type-2a vulnerabilities.

CA: Global governance is a term that's definitely way out of fashion right now, but could you make the case that, throughout history, the history of humanity is that at every stage of technological power increase, people have reorganized and sort of centralized the power? So, for example, when a roving band of criminals could take over a society, the response was, well, you have a nation-state and you centralize force, a police force or an army, so, "No, you can't do that." The logic, perhaps, of having a single person or a single group able to take out humanity means at some point we're going to have to go this route, at least in some form, no?

NB: It's certainly true that the scale of political organization has increased over the course of human history.
It used to be hunter-gatherer bands, right, and then chiefdoms, city-states, nations, and now there are international organizations and so on and so forth. Again, I just want to make sure I get the chance to stress that obviously there are huge downsides and indeed, massive risks, both to mass surveillance and to global governance. I'm just pointing out that if we are lucky, the world could be such that these would be the only ways you could survive a black ball.

CA: The logic of this theory, it seems to me, is that we've got to recognize we can't have it all. That the sort of, I would say, naive dream that many of us had that technology is always going to be a force for good, keep going, don't stop, go as fast as you can and not pay attention to some of the consequences, that's actually just not an option. We can have that. If we have that, we're going to have to accept some of these other very uncomfortable things with it, and kind of be in this arms race with ourselves of, you want the power, you better limit it, you better figure out how to limit it.

NB: I think it is an option, a very tempting option, it's in a sense the easiest option and it might work, but it means we are fundamentally vulnerable to extracting a black ball. Now, I think with a bit of coordination, like, if you did solve this macrogovernance problem, and the microgovernance problem, then we could extract all the balls from the urn and we'd benefit greatly.

CA: I mean, if we're living in a simulation, does it matter? We just reboot.

(Laughter)

NB: Then ... I ...

(Laughter)

I didn't see that one coming.

CA: So what's your view? Putting all the pieces together, how likely is it that we're doomed?

(Laughter)

I love how people laugh when you ask that question.
NB: On an individual level, we seem to kind of be doomed anyway, just with the time line, we're rotting and aging and all kinds of things, right?

(Laughter)

It's actually a little bit tricky. If you want to set up so that you can attach a probability, first, who are we? If you're very old, probably you'll die of natural causes, if you're very young, you might have a 100-year -- the probability might depend on who you ask. Then the threshold, like, what counts as civilizational devastation? In the paper I don't require an existential catastrophe in order for it to count. This is just a definitional matter, I say a billion dead, or a reduction of world GDP by 50 percent, but depending on what you say the threshold is, you get a different probability estimate. But I guess you could put me down as a frightened optimist.

(Laughter)

CA: You're a frightened optimist, and I think you've just created a large number of other frightened ... people.

(Laughter)

NB: In the simulation.

CA: In a simulation. Nick Bostrom, your mind amazes me, thank you so much for scaring the living daylights out of us.

(Applause)