[Script Info]
Title:

[Events]
Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
Dialogue: 0,0:00:01.00,0:00:02.81,Default,,0000,0000,0000,,Chris Anderson: Nick Bostrom.
Dialogue: 0,0:00:02.83,0:00:06.81,Default,,0000,0000,0000,,So, you have already given us\Nso many crazy ideas out there.
Dialogue: 0,0:00:06.83,0:00:08.56,Default,,0000,0000,0000,,I think a couple of decades ago,
Dialogue: 0,0:00:08.58,0:00:11.52,Default,,0000,0000,0000,,you made the case that we might\Nall be living in a simulation,
Dialogue: 0,0:00:11.54,0:00:13.35,Default,,0000,0000,0000,,or perhaps probably were.
Dialogue: 0,0:00:13.38,0:00:14.73,Default,,0000,0000,0000,,More recently,
Dialogue: 0,0:00:14.75,0:00:19.35,Default,,0000,0000,0000,,you've painted the most vivid examples\Nof how artificial general intelligence
Dialogue: 0,0:00:19.38,0:00:21.21,Default,,0000,0000,0000,,could go horribly wrong.
Dialogue: 0,0:00:21.75,0:00:23.14,Default,,0000,0000,0000,,And now this year,
Dialogue: 0,0:00:23.17,0:00:25.39,Default,,0000,0000,0000,,you're about to publish
Dialogue: 0,0:00:25.42,0:00:29.35,Default,,0000,0000,0000,,a paper that presents something called\Nthe vulnerable world hypothesis.
Dialogue: 0,0:00:29.38,0:00:33.96,Default,,0000,0000,0000,,And our job this evening is to\Ngive the illustrated guide to that.
Dialogue: 0,0:00:34.42,0:00:36.25,Default,,0000,0000,0000,,So let's do that.
Dialogue: 0,0:00:36.83,0:00:38.62,Default,,0000,0000,0000,,What is that hypothesis?
Dialogue: 0,0:00:40.00,0:00:42.43,Default,,0000,0000,0000,,Nick Bostrom: It's trying to think about
Dialogue: 0,0:00:42.46,0:00:45.54,Default,,0000,0000,0000,,a sort of structural feature\Nof the current human condition.
Dialogue: 0,0:00:47.12,0:00:49.48,Default,,0000,0000,0000,,You like the urn metaphor,
Dialogue: 0,0:00:49.50,0:00:51.39,Default,,0000,0000,0000,,so I'm going to use that to explain it.
Dialogue: 0,0:00:51.42,0:00:55.77,Default,,0000,0000,0000,,So picture a big urn filled with balls
Dialogue: 0,0:00:55.79,0:00:59.75,Default,,0000,0000,0000,,representing ideas, methods,\Npossible technologies.
Dialogue: 0,0:01:00.83,0:01:04.56,Default,,0000,0000,0000,,You can think of the history\Nof human creativity
Dialogue: 0,0:01:04.58,0:01:08.39,Default,,0000,0000,0000,,as the process of reaching into this urn\Nand pulling out one ball after another,
Dialogue: 0,0:01:08.42,0:01:11.64,Default,,0000,0000,0000,,and the net effect so far\Nhas been hugely beneficial, right?
Dialogue: 0,0:01:11.67,0:01:14.39,Default,,0000,0000,0000,,We've extracted a great many white balls,
Dialogue: 0,0:01:14.42,0:01:17.29,Default,,0000,0000,0000,,some various shades of gray,\Nmixed blessings.
Dialogue: 0,0:01:18.04,0:01:21.00,Default,,0000,0000,0000,,We haven't so far\Npulled out the black ball --
Dialogue: 0,0:01:22.29,0:01:27.77,Default,,0000,0000,0000,,a technology that invariably destroys\Nthe civilization that discovers it.
Dialogue: 0,0:01:27.79,0:01:31.06,Default,,0000,0000,0000,,So the paper tries to think\Nabout what could such a black ball be.
Dialogue: 0,0:01:31.08,0:01:32.89,Default,,0000,0000,0000,,CA: So you define that ball
Dialogue: 0,0:01:32.92,0:01:36.60,Default,,0000,0000,0000,,as one that would inevitably\Nbring about civilizational destruction.
Dialogue: 0,0:01:36.62,0:01:41.93,Default,,0000,0000,0000,,NB: Unless we exit what I call\Nthe semi-anarchic default condition.
Dialogue: 0,0:01:41.96,0:01:43.46,Default,,0000,0000,0000,,But sort of, by default.
Dialogue: 0,0:01:44.33,0:01:47.85,Default,,0000,0000,0000,,CA: So, you make the case compelling
Dialogue: 0,0:01:47.88,0:01:49.89,Default,,0000,0000,0000,,by showing some sort of counterexamples
Dialogue: 0,0:01:49.92,0:01:52.85,Default,,0000,0000,0000,,where you believe that so far\Nwe've actually got lucky,
Dialogue: 0,0:01:52.88,0:01:55.73,Default,,0000,0000,0000,,that we might have pulled out\Nthat death ball
Dialogue: 0,0:01:55.75,0:01:57.31,Default,,0000,0000,0000,,without even knowing it.
Dialogue: 0,0:01:57.33,0:01:59.62,Default,,0000,0000,0000,,So there's this quote, what's this quote?
Dialogue: 0,0:02:00.62,0:02:03.31,Default,,0000,0000,0000,,NB: Well, I guess\Nit's just meant to illustrate
Dialogue: 0,0:02:03.33,0:02:05.43,Default,,0000,0000,0000,,the difficulty of foreseeing
Dialogue: 0,0:02:05.46,0:02:08.14,Default,,0000,0000,0000,,what basic discoveries will lead to.
Dialogue: 0,0:02:08.17,0:02:11.23,Default,,0000,0000,0000,,We just don't have that capability.
Dialogue: 0,0:02:11.25,0:02:14.60,Default,,0000,0000,0000,,Because we have become quite good\Nat pulling out balls,
Dialogue: 0,0:02:14.62,0:02:18.35,Default,,0000,0000,0000,,but we don't really have the ability\Nto put the ball back into the urn, right.
Dialogue: 0,0:02:18.38,0:02:20.54,Default,,0000,0000,0000,,We can invent, but we can't uninvent.
Dialogue: 0,0:02:21.58,0:02:24.35,Default,,0000,0000,0000,,So our strategy, such as it is,
Dialogue: 0,0:02:24.38,0:02:26.81,Default,,0000,0000,0000,,is to hope that there is\Nno black ball in the urn.
Dialogue: 0,0:02:26.83,0:02:30.89,Default,,0000,0000,0000,,CA: So once it's out, it's out,\Nand you can't put it back in,
Dialogue: 0,0:02:30.92,0:02:32.43,Default,,0000,0000,0000,,and you think we've been lucky.
Dialogue: 0,0:02:32.46,0:02:34.68,Default,,0000,0000,0000,,So talk through a couple\Nof these examples.
Dialogue: 0,0:02:34.71,0:02:37.81,Default,,0000,0000,0000,,You talk about different\Ntypes of vulnerability.
Dialogue: 0,0:02:37.83,0:02:40.27,Default,,0000,0000,0000,,NB: So the easiest type to understand
Dialogue: 0,0:02:40.29,0:02:43.43,Default,,0000,0000,0000,,is a technology\Nthat just makes it very easy
Dialogue: 0,0:02:43.46,0:02:45.58,Default,,0000,0000,0000,,to cause massive amounts of destruction.
Dialogue: 0,0:02:47.38,0:02:50.89,Default,,0000,0000,0000,,Synthetic biology might be a fecund\Nsource of that kind of black ball,
Dialogue: 0,0:02:50.92,0:02:53.60,Default,,0000,0000,0000,,but many other possible things we could --
Dialogue: 0,0:02:53.62,0:02:56.14,Default,,0000,0000,0000,,think of geoengineering,\Nreally great, right?
Dialogue: 0,0:02:56.17,0:02:58.39,Default,,0000,0000,0000,,We could combat global warming,
Dialogue: 0,0:02:58.42,0:03:00.56,Default,,0000,0000,0000,,but you don't want it\Nto get too easy either,
Dialogue: 0,0:03:00.58,0:03:03.06,Default,,0000,0000,0000,,you don't want any random person\Nand his grandmother
Dialogue: 0,0:03:03.08,0:03:06.14,Default,,0000,0000,0000,,to have the ability to radically\Nalter the earth's climate.
Dialogue: 0,0:03:06.17,0:03:09.73,Default,,0000,0000,0000,,Or maybe lethal autonomous drones,
Dialogue: 0,0:03:09.75,0:03:13.08,Default,,0000,0000,0000,,mass-produced, mosquito-sized\Nkiller bot swarms.
Dialogue: 0,0:03:14.50,0:03:17.23,Default,,0000,0000,0000,,Nanotechnology,\Nartificial general intelligence.
Dialogue: 0,0:03:17.25,0:03:18.56,Default,,0000,0000,0000,,CA: You argue in the paper
Dialogue: 0,0:03:18.58,0:03:21.48,Default,,0000,0000,0000,,that it's a matter of luck\Nthat when we discovered
Dialogue: 0,0:03:21.50,0:03:24.93,Default,,0000,0000,0000,,that nuclear power could create a bomb,
Dialogue: 0,0:03:24.96,0:03:26.35,Default,,0000,0000,0000,,it might have been the case
Dialogue: 0,0:03:26.38,0:03:28.23,Default,,0000,0000,0000,,that you could have created a bomb
Dialogue: 0,0:03:28.25,0:03:31.81,Default,,0000,0000,0000,,with much easier resources,\Naccessible to anyone.
Dialogue: 0,0:03:31.83,0:03:35.39,Default,,0000,0000,0000,,NB: Yeah, so think back to the 1930s
Dialogue: 0,0:03:35.42,0:03:40.02,Default,,0000,0000,0000,,where for the first time we make\Nsome breakthroughs in nuclear physics,
Dialogue: 0,0:03:40.04,0:03:43.73,Default,,0000,0000,0000,,some genius figures out that it's possible\Nto create a nuclear chain reaction
Dialogue: 0,0:03:43.75,0:03:46.93,Default,,0000,0000,0000,,and then realizes\Nthat this could lead to the bomb.
Dialogue: 0,0:03:46.96,0:03:48.85,Default,,0000,0000,0000,,And we do some more work,
Dialogue: 0,0:03:48.88,0:03:51.60,Default,,0000,0000,0000,,it turns out that what you require\Nto make a nuclear bomb
Dialogue: 0,0:03:51.62,0:03:54.02,Default,,0000,0000,0000,,is highly enriched uranium or plutonium,
Dialogue: 0,0:03:54.04,0:03:56.06,Default,,0000,0000,0000,,which are very difficult materials to get.
Dialogue: 0,0:03:56.08,0:03:58.35,Default,,0000,0000,0000,,You need ultracentrifuges,
Dialogue: 0,0:03:58.38,0:04:02.14,Default,,0000,0000,0000,,you need reactors, like,\Nmassive amounts of energy.
Dialogue: 0,0:04:02.17,0:04:03.98,Default,,0000,0000,0000,,But suppose it had turned out instead
Dialogue: 0,0:04:04.00,0:04:07.98,Default,,0000,0000,0000,,there had been an easy way\Nto unlock the energy of the atom.
Dialogue: 0,0:04:08.00,0:04:10.77,Default,,0000,0000,0000,,That maybe by baking sand\Nin the microwave oven
Dialogue: 0,0:04:10.79,0:04:12.06,Default,,0000,0000,0000,,or something like that
Dialogue: 0,0:04:12.08,0:04:14.18,Default,,0000,0000,0000,,you could have created\Na nuclear detonation.
Dialogue: 0,0:04:14.21,0:04:16.35,Default,,0000,0000,0000,,So we know that that's\Nphysically impossible.
Dialogue: 0,0:04:16.38,0:04:18.27,Default,,0000,0000,0000,,But before you did the relevant physics
Dialogue: 0,0:04:18.29,0:04:20.48,Default,,0000,0000,0000,,how could you have known\Nhow it would turn out?
Dialogue: 0,0:04:20.51,0:04:22.06,Default,,0000,0000,0000,,CA: Although, couldn't you argue
Dialogue: 0,0:04:22.08,0:04:24.02,Default,,0000,0000,0000,,that for life to evolve on Earth
Dialogue: 0,0:04:24.04,0:04:27.31,Default,,0000,0000,0000,,that implied sort of stable environment,
Dialogue: 0,0:04:27.33,0:04:31.52,Default,,0000,0000,0000,,that if it was possible to create\Nmassive nuclear reactions relatively easy,
Dialogue: 0,0:04:31.54,0:04:33.40,Default,,0000,0000,0000,,the Earth would never have been stable,
Dialogue: 0,0:04:33.42,0:04:34.98,Default,,0000,0000,0000,,that we wouldn't be here at all.
Dialogue: 0,0:04:35.00,0:04:38.39,Default,,0000,0000,0000,,NB: Yeah, unless there were some\Nthing that is easy to do on purpose
Dialogue: 0,0:04:38.42,0:04:41.27,Default,,0000,0000,0000,,but that wouldn't happen by random chance.
Dialogue: 0,0:04:41.29,0:04:42.87,Default,,0000,0000,0000,,So, like things we can easily do,
Dialogue: 0,0:04:42.90,0:04:45.01,Default,,0000,0000,0000,,we can stack 10 blocks\Non top of one another,
Dialogue: 0,0:04:45.03,0:04:48.23,Default,,0000,0000,0000,,but in nature, you're not going to find,\Nlike, a stack of 10 blocks.
Dialogue: 0,0:04:48.25,0:04:49.93,Default,,0000,0000,0000,,CA: OK, so this is probably the one
Dialogue: 0,0:04:49.95,0:04:51.89,Default,,0000,0000,0000,,that many of us worry about most,
Dialogue: 0,0:04:51.92,0:04:55.43,Default,,0000,0000,0000,,and yes, synthetic biology\Nis perhaps the quickest route
Dialogue: 0,0:04:55.46,0:04:58.48,Default,,0000,0000,0000,,that we can foresee\Nin our near future to get us here.
Dialogue: 0,0:04:58.50,0:05:01.43,Default,,0000,0000,0000,,NB: Yeah, and so think\Nabout what that would have meant
Dialogue: 0,0:05:01.46,0:05:05.10,Default,,0000,0000,0000,,if, say, anybody by working\Nin their kitchen for an afternoon
Dialogue: 0,0:05:05.12,0:05:06.52,Default,,0000,0000,0000,,could destroy a city.
Dialogue: 0,0:05:06.54,0:05:10.10,Default,,0000,0000,0000,,It's hard to see how\Nmodern civilization as we know it
Dialogue: 0,0:05:10.12,0:05:11.56,Default,,0000,0000,0000,,could have survived that.
Dialogue: 0,0:05:11.58,0:05:14.10,Default,,0000,0000,0000,,Because in any population\Nof a million people,
Dialogue: 0,0:05:14.12,0:05:16.81,Default,,0000,0000,0000,,there will always be some\Nwho would, for whatever reason,
Dialogue: 0,0:05:16.83,0:05:18.92,Default,,0000,0000,0000,,choose to use that destructive power.
Dialogue: 0,0:05:19.75,0:05:22.89,Default,,0000,0000,0000,,So if that apocalyptic residual
Dialogue: 0,0:05:22.92,0:05:24.89,Default,,0000,0000,0000,,would choose to destroy a city, or worse,
Dialogue: 0,0:05:24.92,0:05:26.48,Default,,0000,0000,0000,,then cities would get destroyed.
Dialogue: 0,0:05:26.50,0:05:28.85,Default,,0000,0000,0000,,CA: So here's another type\Nof vulnerability.
Dialogue: 0,0:05:28.88,0:05:30.52,Default,,0000,0000,0000,,Talk about this.
Dialogue: 0,0:05:30.54,0:05:34.52,Default,,0000,0000,0000,,NB: Yeah, so in addition to these\Nkind of obvious types of black balls
Dialogue: 0,0:05:34.54,0:05:37.35,Default,,0000,0000,0000,,that would just make it possible\Nto blow up a lot of things,
Dialogue: 0,0:05:37.38,0:05:41.81,Default,,0000,0000,0000,,other types would act\Nby creating bad incentives
Dialogue: 0,0:05:41.83,0:05:44.06,Default,,0000,0000,0000,,for humans to do things that are harmful.
Dialogue: 0,0:05:44.08,0:05:48.18,Default,,0000,0000,0000,,So, the Type-2a, we might call it that,
Dialogue: 0,0:05:48.21,0:05:52.73,Default,,0000,0000,0000,,is to think about some technology\Nthat incentivizes great powers
Dialogue: 0,0:05:52.75,0:05:57.23,Default,,0000,0000,0000,,to use their massive amounts of force\Nto create destruction.
Dialogue: 0,0:05:57.25,0:06:00.08,Default,,0000,0000,0000,,So, nuclear weapons were actually\Nvery close to this, right?
Dialogue: 0,0:06:02.08,0:06:05.14,Default,,0000,0000,0000,,What we did, we spent\Nover 10 trillion dollars
Dialogue: 0,0:06:05.17,0:06:07.68,Default,,0000,0000,0000,,to build 70,000 nuclear warheads
Dialogue: 0,0:06:07.71,0:06:10.14,Default,,0000,0000,0000,,and put them on hair-trigger alert.
Dialogue: 0,0:06:10.17,0:06:12.43,Default,,0000,0000,0000,,And there were several times\Nduring the Cold War
Dialogue: 0,0:06:12.46,0:06:13.89,Default,,0000,0000,0000,,we almost blew each other up.
Dialogue: 0,0:06:13.92,0:06:17.02,Default,,0000,0000,0000,,It's not because a lot of people felt\Nthis would be a great idea,
Dialogue: 0,0:06:17.04,0:06:19.73,Default,,0000,0000,0000,,let's all spend 10 trillion dollars\Nto blow ourselves up,
Dialogue: 0,0:06:19.75,0:06:22.68,Default,,0000,0000,0000,,but the incentives were such\Nthat we were finding ourselves --
Dialogue: 0,0:06:22.71,0:06:24.02,Default,,0000,0000,0000,,This could have been worse.
Dialogue: 0,0:06:24.04,0:06:26.48,Default,,0000,0000,0000,,Imagine if there had been\Na safe first strike.
Dialogue: 0,0:06:26.50,0:06:28.81,Default,,0000,0000,0000,,Then it might have been very tricky,
Dialogue: 0,0:06:28.83,0:06:30.10,Default,,0000,0000,0000,,in a crisis situation,
Dialogue: 0,0:06:30.12,0:06:32.60,Default,,0000,0000,0000,,to refrain from launching\Nall their nuclear missiles.
Dialogue: 0,0:06:32.63,0:06:36.02,Default,,0000,0000,0000,,If nothing else, because you would fear\Nthat the other side might do it.
Dialogue: 0,0:06:36.04,0:06:37.85,Default,,0000,0000,0000,,CA: Right, mutual assured destruction
Dialogue: 0,0:06:37.88,0:06:40.60,Default,,0000,0000,0000,,kept the Cold War relatively stable,
Dialogue: 0,0:06:40.62,0:06:42.56,Default,,0000,0000,0000,,without that, we might not be here now.
Dialogue: 0,0:06:42.58,0:06:44.89,Default,,0000,0000,0000,,NB: It could have been\Nmore unstable than it was.
Dialogue: 0,0:06:44.92,0:06:47.27,Default,,0000,0000,0000,,And there could be\Nother properties of technology.
Dialogue: 0,0:06:47.29,0:06:49.60,Default,,0000,0000,0000,,It could have been harder\Nto have arms treaties,
Dialogue: 0,0:06:49.62,0:06:51.23,Default,,0000,0000,0000,,if instead of nuclear weapons
Dialogue: 0,0:06:51.25,0:06:54.27,Default,,0000,0000,0000,,there had been some smaller thing\Nor something less distinctive.
Dialogue: 0,0:06:54.29,0:06:56.85,Default,,0000,0000,0000,,CA: And as well as bad incentives\Nfor powerful actors,
Dialogue: 0,0:06:56.88,0:07:00.39,Default,,0000,0000,0000,,you also worry about bad incentives\Nfor all of us, in Type-2b here.
Dialogue: 0,0:07:00.42,0:07:04.67,Default,,0000,0000,0000,,NB: Yeah, so, here we might\Ntake the case of global warming.
Dialogue: 0,0:07:06.96,0:07:08.85,Default,,0000,0000,0000,,There are a lot of little conveniences
Dialogue: 0,0:07:08.88,0:07:11.06,Default,,0000,0000,0000,,that cause each one of us to do things
Dialogue: 0,0:07:11.08,0:07:13.93,Default,,0000,0000,0000,,that individually\Nhave no significant effect, right?
Dialogue: 0,0:07:13.96,0:07:15.93,Default,,0000,0000,0000,,But if billions of people do it,
Dialogue: 0,0:07:15.96,0:07:18.02,Default,,0000,0000,0000,,cumulatively, it has a damaging effect.
Dialogue: 0,0:07:18.04,0:07:20.85,Default,,0000,0000,0000,,Now, global warming\Ncould have been a lot worse than it is.
Dialogue: 0,0:07:20.88,0:07:23.85,Default,,0000,0000,0000,,So we have the climate\Nsensitivity parameter, right.
Dialogue: 0,0:07:23.88,0:07:27.52,Default,,0000,0000,0000,,It's a parameter that says\Nhow much warmer does it get
Dialogue: 0,0:07:27.54,0:07:30.23,Default,,0000,0000,0000,,if you emit a certain amount\Nof greenhouse gasses.
Dialogue: 0,0:07:30.25,0:07:32.64,Default,,0000,0000,0000,,But, suppose that it had been the case
Dialogue: 0,0:07:32.67,0:07:35.18,Default,,0000,0000,0000,,that with the amount\Nof greenhouse gasses we emitted,
Dialogue: 0,0:07:35.21,0:07:37.27,Default,,0000,0000,0000,,instead of the temperature rising by, say,
Dialogue: 0,0:07:37.29,0:07:41.02,Default,,0000,0000,0000,,between three and 4.5 degrees by 2100,
Dialogue: 0,0:07:41.04,0:07:43.54,Default,,0000,0000,0000,,suppose it had been\N15 degrees or 20 degrees.
Dialogue: 0,0:07:44.38,0:07:46.93,Default,,0000,0000,0000,,Like, then we might have been\Nin a very bad situation.
Dialogue: 0,0:07:46.96,0:07:50.10,Default,,0000,0000,0000,,Or suppose that renewable energy\Nhad just been a lot harder to do.
Dialogue: 0,0:07:50.12,0:07:52.77,Default,,0000,0000,0000,,Or that there had been\Nmore fossil fuels in the ground.
Dialogue: 0,0:07:52.79,0:07:55.43,Default,,0000,0000,0000,,CA: Couldn't you argue\Nthat if in that case of --
Dialogue: 0,0:07:55.46,0:07:57.18,Default,,0000,0000,0000,,if what we are doing today
Dialogue: 0,0:07:57.21,0:08:01.77,Default,,0000,0000,0000,,had resulted in 10 degrees difference\Nin the time period that we could see,
Dialogue: 0,0:08:01.79,0:08:05.48,Default,,0000,0000,0000,,actually humanity would have got\Noff its ass and done something about it.
Dialogue: 0,0:08:05.50,0:08:08.31,Default,,0000,0000,0000,,We're stupid, but we're not\Nmaybe that stupid.
Dialogue: 0,0:08:08.33,0:08:09.60,Default,,0000,0000,0000,,Or maybe we are.
Dialogue: 0,0:08:09.62,0:08:10.89,Default,,0000,0000,0000,,NB: I wouldn't bet on it.
Dialogue: 0,0:08:10.92,0:08:13.10,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:08:13.12,0:08:14.81,Default,,0000,0000,0000,,You could imagine other features.
Dialogue: 0,0:08:14.83,0:08:20.35,Default,,0000,0000,0000,,So, right now, it's a little bit difficult\Nto switch to renewables and stuff, right,
Dialogue: 0,0:08:20.38,0:08:21.64,Default,,0000,0000,0000,,but it can be done.
Dialogue: 0,0:08:21.67,0:08:24.64,Default,,0000,0000,0000,,But it might just have been,\Nwith slightly different physics,
Dialogue: 0,0:08:24.67,0:08:27.46,Default,,0000,0000,0000,,it could have been much more expensive\Nto do these things.
Dialogue: 0,0:08:28.38,0:08:29.89,Default,,0000,0000,0000,,CA: And what's your view, Nick?
Dialogue: 0,0:08:29.92,0:08:32.35,Default,,0000,0000,0000,,Do you think, putting\Nthese possibilities together,
Dialogue: 0,0:08:32.38,0:08:36.64,Default,,0000,0000,0000,,that this earth, humanity that we are,
Dialogue: 0,0:08:36.67,0:08:38.23,Default,,0000,0000,0000,,we count as a vulnerable world?
Dialogue: 0,0:08:38.25,0:08:40.67,Default,,0000,0000,0000,,That there is a death ball in our future?
Dialogue: 0,0:08:43.96,0:08:45.23,Default,,0000,0000,0000,,NB: It's hard to say.
Dialogue: 0,0:08:45.25,0:08:50.31,Default,,0000,0000,0000,,I mean, I think there might\Nwell be various black balls in the urn,
Dialogue: 0,0:08:50.33,0:08:51.64,Default,,0000,0000,0000,,that's what it looks like.
Dialogue: 0,0:08:51.67,0:08:54.06,Default,,0000,0000,0000,,There might also be some golden balls
Dialogue: 0,0:08:54.08,0:08:57.56,Default,,0000,0000,0000,,that would help us\Nprotect against black balls.
Dialogue: 0,0:08:57.58,0:09:00.56,Default,,0000,0000,0000,,And I don't know which order\Nthey will come out.
Dialogue: 0,0:09:00.58,0:09:04.43,Default,,0000,0000,0000,,CA: I mean, one possible\Nphilosophical critique of this idea
Dialogue: 0,0:09:04.46,0:09:10.10,Default,,0000,0000,0000,,is that it implies a view\Nthat the future is essentially settled.
Dialogue: 0,0:09:10.12,0:09:12.60,Default,,0000,0000,0000,,That there either\Nis that ball there or it's not.
Dialogue: 0,0:09:12.62,0:09:15.64,Default,,0000,0000,0000,,And in a way,
Dialogue: 0,0:09:15.67,0:09:18.27,Default,,0000,0000,0000,,that's not a view of the future\Nthat I want to believe.
Dialogue: 0,0:09:18.29,0:09:20.64,Default,,0000,0000,0000,,I want to believe\Nthat the future is undetermined,
Dialogue: 0,0:09:20.67,0:09:22.60,Default,,0000,0000,0000,,that our decisions today will determine
Dialogue: 0,0:09:22.62,0:09:24.83,Default,,0000,0000,0000,,what kind of balls\Nwe pull out of that urn.
Dialogue: 0,0:09:25.92,0:09:29.68,Default,,0000,0000,0000,,NB: I mean, if we just keep inventing,
Dialogue: 0,0:09:29.71,0:09:32.04,Default,,0000,0000,0000,,like, eventually we will\Npull out all the balls.
Dialogue: 0,0:09:32.88,0:09:36.27,Default,,0000,0000,0000,,I mean, I think there's a kind\Nof weak form of technological determinism
Dialogue: 0,0:09:36.29,0:09:37.56,Default,,0000,0000,0000,,that is quite plausible,
Dialogue: 0,0:09:37.58,0:09:40.23,Default,,0000,0000,0000,,like, you're unlikely\Nto encounter a society
Dialogue: 0,0:09:40.25,0:09:43.08,Default,,0000,0000,0000,,that uses flint axes and jet planes.
Dialogue: 0,0:09:44.21,0:09:48.27,Default,,0000,0000,0000,,But you can almost think\Nof a technology as a set of affordances.
Dialogue: 0,0:09:48.29,0:09:51.31,Default,,0000,0000,0000,,So technology is the thing\Nthat enables us to do various things
Dialogue: 0,0:09:51.33,0:09:53.31,Default,,0000,0000,0000,,and achieve various effects in the world.
Dialogue: 0,0:09:53.33,0:09:56.14,Default,,0000,0000,0000,,How we'd then use that,\Nof course depends on human choice.
Dialogue: 0,0:09:56.17,0:09:58.85,Default,,0000,0000,0000,,But if we think about these\Nthree types of vulnerability,
Dialogue: 0,0:09:58.88,0:10:02.27,Default,,0000,0000,0000,,they make quite weak assumptions\Nabout how we would choose to use them.
Dialogue: 0,0:10:02.29,0:10:05.68,Default,,0000,0000,0000,,So a Type-1 vulnerability, again,\Nthis massive, destructive power,
Dialogue: 0,0:10:05.71,0:10:07.14,Default,,0000,0000,0000,,it's a fairly weak assumption
Dialogue: 0,0:10:07.17,0:10:09.56,Default,,0000,0000,0000,,to think that in a population\Nof millions of people
Dialogue: 0,0:10:09.58,0:10:12.52,Default,,0000,0000,0000,,there would be some that would choose\Nto use it destructively.
Dialogue: 0,0:10:12.54,0:10:14.98,Default,,0000,0000,0000,,CA: For me, the single most\Ndisturbing argument
Dialogue: 0,0:10:15.00,0:10:19.56,Default,,0000,0000,0000,,is that we actually might have\Nsome kind of view into the urn
Dialogue: 0,0:10:19.58,0:10:23.10,Default,,0000,0000,0000,,that makes it actually\Nvery likely that we're doomed.
Dialogue: 0,0:10:23.12,0:10:27.77,Default,,0000,0000,0000,,Namely, if you believe\Nin accelerating power,
Dialogue: 0,0:10:27.79,0:10:30.06,Default,,0000,0000,0000,,that technology inherently accelerates,
Dialogue: 0,0:10:30.08,0:10:32.52,Default,,0000,0000,0000,,that we build the tools\Nthat make us more powerful,
Dialogue: 0,0:10:32.54,0:10:35.18,Default,,0000,0000,0000,,then at some point you get to a stage
Dialogue: 0,0:10:35.21,0:10:38.27,Default,,0000,0000,0000,,where a single individual\Ncan take us all down,
Dialogue: 0,0:10:38.29,0:10:41.14,Default,,0000,0000,0000,,and then it looks like we're screwed.
Dialogue: 0,0:10:41.17,0:10:44.10,Default,,0000,0000,0000,,Isn't that argument quite alarming?
Dialogue: 0,0:10:44.12,0:10:45.88,Default,,0000,0000,0000,,NB: Ah, yeah.
Dialogue: 0,0:10:46.71,0:10:47.98,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:10:48.00,0:10:49.33,Default,,0000,0000,0000,,I think --
Dialogue: 0,0:10:50.88,0:10:52.48,Default,,0000,0000,0000,,Yeah, we get more and more power,
Dialogue: 0,0:10:52.50,0:10:56.43,Default,,0000,0000,0000,,and [it's] easier and easier\Nto use those powers,
Dialogue: 0,0:10:56.46,0:11:00.02,Default,,0000,0000,0000,,but we can also invent technologies\Nthat kind of help us control
Dialogue: 0,0:11:00.04,0:11:02.06,Default,,0000,0000,0000,,how people use those powers.
Dialogue: 0,0:11:02.08,0:11:04.93,Default,,0000,0000,0000,,CA: So let's talk about that,\Nlet's talk about the response.
Dialogue: 0,0:11:04.96,0:11:07.27,Default,,0000,0000,0000,,Suppose that thinking\Nabout all the possibilities
Dialogue: 0,0:11:07.29,0:11:09.39,Default,,0000,0000,0000,,that are out there now --
Dialogue: 0,0:11:09.42,0:11:13.14,Default,,0000,0000,0000,,it's not just synbio,\Nit's things like cyberwarfare,
Dialogue: 0,0:11:13.17,0:11:16.52,Default,,0000,0000,0000,,artificial intelligence, etc., etc. --
Dialogue: 0,0:11:16.54,0:11:21.06,Default,,0000,0000,0000,,that there might be\Nserious doom in our future.
Dialogue: 0,0:11:21.08,0:11:22.68,Default,,0000,0000,0000,,What are the possible responses?
Dialogue: 0,0:11:22.71,0:11:27.60,Default,,0000,0000,0000,,And you've talked about\Nfour possible responses as well.
Dialogue: 0,0:11:27.62,0:11:31.27,Default,,0000,0000,0000,,NB: Restricting technological development\Ndoesn't seem promising,
Dialogue: 0,0:11:31.29,0:11:34.52,Default,,0000,0000,0000,,if we are talking about a general halt\Nto technological progress.
Dialogue: 0,0:11:34.54,0:11:35.81,Default,,0000,0000,0000,,I think neither feasible,
Dialogue: 0,0:11:35.83,0:11:38.14,Default,,0000,0000,0000,,nor would it be desirable\Neven if we could do it.
Dialogue: 0,0:11:38.17,0:11:41.18,Default,,0000,0000,0000,,I think there might be very limited areas
Dialogue: 0,0:11:41.21,0:11:43.93,Default,,0000,0000,0000,,where maybe you would want\Nslower technological progress.
Dialogue: 0,0:11:43.96,0:11:47.35,Default,,0000,0000,0000,,You don't, I think, want\Nfaster progress in bioweapons,
Dialogue: 0,0:11:47.38,0:11:49.43,Default,,0000,0000,0000,,or in, say, isotope separation,
Dialogue: 0,0:11:49.46,0:11:51.71,Default,,0000,0000,0000,,that would make it easier to create nukes.
Dialogue: 0,0:11:52.58,0:11:55.89,Default,,0000,0000,0000,,CA: I mean, I used to be\Nfully on board with that.
Dialogue: 0,0:11:55.92,0:11:59.18,Default,,0000,0000,0000,,But I would like to actually\Npush back on that for a minute.
Dialogue: 0,0:11:59.21,0:12:00.52,Default,,0000,0000,0000,,Just because, first of all,
Dialogue: 0,0:12:00.54,0:12:03.23,Default,,0000,0000,0000,,if you look at the history\Nof the last couple of decades,
Dialogue: 0,0:12:03.25,0:12:06.81,Default,,0000,0000,0000,,you know, it's always been\Npush forward at full speed,
Dialogue: 0,0:12:06.83,0:12:08.68,Default,,0000,0000,0000,,it's OK, that's our only choice.
Dialogue: 0,0:12:08.71,0:12:12.98,Default,,0000,0000,0000,,But if you look at globalization\Nand the rapid acceleration of that,
Dialogue: 0,0:12:13.00,0:12:16.43,Default,,0000,0000,0000,,if you look at the strategy of\N"move fast and break things"
Dialogue: 0,0:12:16.46,0:12:18.52,Default,,0000,0000,0000,,and what happened with that,
Dialogue: 0,0:12:18.54,0:12:21.31,Default,,0000,0000,0000,,and then you look at the potential\Nfor synthetic biology,
Dialogue: 0,0:12:21.33,0:12:25.77,Default,,0000,0000,0000,,I don't know that we should\Nmove forward rapidly
Dialogue: 0,0:12:25.79,0:12:27.43,Default,,0000,0000,0000,,or without any kind of restriction
Dialogue: 0,0:12:27.46,0:12:30.77,Default,,0000,0000,0000,,to a world where you could have\Na DNA printer in every home
Dialogue: 0,0:12:30.79,0:12:32.12,Default,,0000,0000,0000,,and high school lab.
Dialogue: 0,0:12:33.17,0:12:34.85,Default,,0000,0000,0000,,There are some restrictions, right?
Dialogue: 0,0:12:34.88,0:12:37.52,Default,,0000,0000,0000,,NB: Possibly, there is\Nthe first part, the not feasible.
Dialogue: 0,0:12:37.54,0:12:39.73,Default,,0000,0000,0000,,If you think it would be\Ndesirable to stop it,
Dialogue: 0,0:12:39.75,0:12:41.48,Default,,0000,0000,0000,,there's the problem of feasibility.
Dialogue: 0,0:12:41.50,0:12:44.31,Default,,0000,0000,0000,,So it doesn't really help\Nif one nation kind of --
Dialogue: 0,0:12:44.33,0:12:46.35,Default,,0000,0000,0000,,CA: No, it doesn't help\Nif one nation does,
Dialogue: 0,0:12:46.38,0:12:49.31,Default,,0000,0000,0000,,but we've had treaties before.
Dialogue: 0,0:12:49.33,0:12:52.68,Default,,0000,0000,0000,,That's really how we survived\Nthe nuclear threat,
Dialogue: 0,0:12:52.71,0:12:53.98,Default,,0000,0000,0000,,was by going out there
Dialogue: 0,0:12:54.00,0:12:56.52,Default,,0000,0000,0000,,and going through\Nthe painful process of negotiating.
Dialogue: 0,0:12:56.54,0:13:01.98,Default,,0000,0000,0000,,I just wonder whether the logic isn't\Nthat we, as a matter of global priority,
Dialogue: 0,0:13:02.00,0:13:03.68,Default,,0000,0000,0000,,we shouldn't go out there and try,
Dialogue: 0,0:13:03.71,0:13:06.39,Default,,0000,0000,0000,,like, now start negotiating\Nreally strict rules
Dialogue: 0,0:13:06.42,0:13:09.10,Default,,0000,0000,0000,,on where synthetic bioresearch is done,
Dialogue: 0,0:13:09.12,0:13:11.98,Default,,0000,0000,0000,,that it's not something\Nthat you want to democratize, no?
Dialogue: 0,0:13:12.00,0:13:13.81,Default,,0000,0000,0000,,NB: I totally agree with that --
Dialogue: 0,0:13:13.83,0:13:18.06,Default,,0000,0000,0000,,that it would be desirable, for example,
Dialogue: 0,0:13:18.08,0:13:21.68,Default,,0000,0000,0000,,maybe to have DNA synthesis machines,
Dialogue: 0,0:13:21.71,0:13:25.27,Default,,0000,0000,0000,,not as a product where each lab\Nhas their own device,
Dialogue: 0,0:13:25.29,0:13:26.77,Default,,0000,0000,0000,,but maybe as a service.
Dialogue: 0,0:13:26.79,0:13:29.31,Default,,0000,0000,0000,,Maybe there could be\Nfour or five places in the world
Dialogue: 0,0:13:29.33,0:13:32.85,Default,,0000,0000,0000,,where you send in your digital blueprint\Nand the DNA comes back, right?
Dialogue: 0,0:13:32.88,0:13:34.64,Default,,0000,0000,0000,,And then, you would have the ability,
Dialogue: 0,0:13:34.67,0:13:37.06,Default,,0000,0000,0000,,if one day it really looked\Nlike it was necessary,
Dialogue: 0,0:13:37.08,0:13:39.43,Default,,0000,0000,0000,,we would have like,\Na finite set of choke points.
Dialogue: 0,0:13:39.46,0:13:42.98,Default,,0000,0000,0000,,So I think you want to look\Nfor kind of special opportunities,
Dialogue: 0,0:13:43.00,0:13:45.06,Default,,0000,0000,0000,,where you could have tighter control.
Dialogue: 0,0:13:45.08,0:13:46.73,Default,,0000,0000,0000,,CA: Your belief is, fundamentally,
Dialogue: 0,0:13:46.75,0:13:49.64,Default,,0000,0000,0000,,we are not going to be successful\Nin just holding back.
Dialogue: 0,0:13:49.67,0:13:52.39,Default,,0000,0000,0000,,Someone, somewhere --\NNorth Korea, you know --
Dialogue: 0,0:13:52.42,0:13:55.93,Default,,0000,0000,0000,,someone is going to go there\Nand discover this knowledge,
Dialogue: 0,0:13:55.96,0:13:57.23,Default,,0000,0000,0000,,if it's there to be found.
Dialogue: 0,0:13:57.25,0:13:59.60,Default,,0000,0000,0000,,NB: That looks plausible\Nunder current conditions.
Dialogue: 0,0:13:59.62,0:14:01.56,Default,,0000,0000,0000,,It's not just synthetic biology, either.
Dialogue: 0,0:14:01.58,0:14:04.10,Default,,0000,0000,0000,,I mean, any kind of profound,\Nnew change in the world
Dialogue: 0,0:14:04.12,0:14:05.75,Default,,0000,0000,0000,,could turn out to be a black ball.
Dialogue: 0,0:14:05.78,0:14:07.80,Default,,0000,0000,0000,,CA: Let's look\Nat another possible response.
Dialogue: 0,0:14:07.82,0:14:10.23,Default,,0000,0000,0000,,NB: This also, I think,\Nhas only limited potential.
Dialogue: 0,0:14:10.25,0:14:13.81,Default,,0000,0000,0000,,So, with the Type-1 vulnerability again,
Dialogue: 0,0:14:13.83,0:14:18.18,Default,,0000,0000,0000,,I mean, if you could reduce the number\Nof people who are incentivized
Dialogue: 0,0:14:18.21,0:14:19.48,Default,,0000,0000,0000,,to destroy the world,
Dialogue: 0,0:14:19.50,0:14:21.56,Default,,0000,0000,0000,,if only they could get\Naccess and the means,
Dialogue: 0,0:14:21.58,0:14:22.85,Default,,0000,0000,0000,,that would be good.
Dialogue: 0,0:14:22.88,0:14:24.85,Default,,0000,0000,0000,,CA: In this image that you asked us to do
Dialogue: 0,0:14:24.88,0:14:27.43,Default,,0000,0000,0000,,you're imagining these drones\Nflying around the world
Dialogue: 0,0:14:27.46,0:14:28.73,Default,,0000,0000,0000,,with facial recognition.
Dialogue: 0,0:14:28.75,0:14:31.64,Default,,0000,0000,0000,,When they spot someone\Nshowing signs of sociopathic behavior,
Dialogue: 0,0:14:31.67,0:14:33.85,Default,,0000,0000,0000,,they shower them with love, they fix them.
Dialogue: 0,0:14:33.88,0:14:35.77,Default,,0000,0000,0000,,NB: I think it's like a hybrid picture.
Dialogue: 0,0:14:35.79,0:14:39.81,Default,,0000,0000,0000,,Eliminate can either mean,\Nlike, incarcerate or kill,
Dialogue: 0,0:14:39.83,0:14:42.85,Default,,0000,0000,0000,,or it can mean persuade them\Nto a better view of the world.
Dialogue: 0,0:14:42.88,0:14:44.60,Default,,0000,0000,0000,,But the point is that,
Dialogue: 0,0:14:44.62,0:14:46.77,Default,,0000,0000,0000,,suppose you were\Nextremely successful in this,
Dialogue: 0,0:14:46.79,0:14:50.10,Default,,0000,0000,0000,,and you reduced the number\Nof such individuals by half.
Dialogue: 0,0:14:50.12,0:14:52.02,Default,,0000,0000,0000,,And if you want to do it by persuasion,
Dialogue: 0,0:14:52.04,0:14:54.43,Default,,0000,0000,0000,,you are competing against\Nall other powerful forces
Dialogue: 0,0:14:54.46,0:14:56.14,Default,,0000,0000,0000,,that are trying to persuade people,
Dialogue: 0,0:14:56.17,0:14:57.93,Default,,0000,0000,0000,,parties, religion, education system.
Dialogue: 0,0:14:57.96,0:14:59.86,Default,,0000,0000,0000,,But suppose you could reduce it by half,
Dialogue: 0,0:14:59.89,0:15:02.14,Default,,0000,0000,0000,,I don't think the risk\Nwould be reduced by half.
Dialogue: 0,0:15:02.17,0:15:03.73,Default,,0000,0000,0000,,Maybe by five or 10 percent.
Dialogue: 0,0:15:03.75,0:15:08.10,Default,,0000,0000,0000,,CA: You're not recommending that we gamble\Nhumanity's future on response two.
Dialogue: 0,0:15:08.12,0:15:11.14,Default,,0000,0000,0000,,NB: I think it's all good\Nto try to deter and persuade people,
Dialogue: 0,0:15:11.17,0:15:14.14,Default,,0000,0000,0000,,but we shouldn't rely on that\Nas our only safeguard.
Dialogue: 0,0:15:14.17,0:15:15.43,Default,,0000,0000,0000,,CA: How about three?
Dialogue: 0,0:15:15.46,0:15:18.35,Default,,0000,0000,0000,,NB: I think there are two general methods
Dialogue: 0,0:15:18.38,0:15:22.35,Default,,0000,0000,0000,,that we could use to achieve\Nthe ability to stabilize the world
Dialogue: 0,0:15:22.38,0:15:25.35,Default,,0000,0000,0000,,against the whole spectrum\Nof possible vulnerabilities.
Dialogue: 0,0:15:25.38,0:15:26.93,Default,,0000,0000,0000,,And we probably would need both.
Dialogue: 0,0:15:26.96,0:15:31.48,Default,,0000,0000,0000,,So, one is an extremely effective ability
Dialogue: 0,0:15:31.50,0:15:33.27,Default,,0000,0000,0000,,to do preventive policing.
Dialogue: 0,0:15:33.29,0:15:34.82,Default,,0000,0000,0000,,Such that you could intercept.
Dialogue: 0,0:15:34.84,0:15:37.60,Default,,0000,0000,0000,,If anybody started to do\Nthis dangerous thing,
Dialogue: 0,0:15:37.62,0:15:40.31,Default,,0000,0000,0000,,you could intercept them\Nin real time, and stop them.
Dialogue: 0,0:15:40.33,0:15:42.81,Default,,0000,0000,0000,,So this would require\Nubiquitous surveillance,
Dialogue: 0,0:15:42.83,0:15:45.21,Default,,0000,0000,0000,,everybody would be monitored all the time.
Dialogue: 0,0:15:46.33,0:15:48.89,Default,,0000,0000,0000,,CA: This is "Minority Report,"\Nessentially, a form of.
Dialogue: 0,0:15:48.92,0:15:50.85,Default,,0000,0000,0000,,NB: You would have maybe AI algorithms,
Dialogue: 0,0:15:50.88,0:15:55.29,Default,,0000,0000,0000,,big freedom centers\Nthat were reviewing this, etc., etc.
Dialogue: 0,0:15:56.58,0:16:00.98,Default,,0000,0000,0000,,CA: You know that mass surveillance\Nis not a very popular term right now?
Dialogue: 0,0:16:01.00,0:16:02.25,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:16:03.46,0:16:05.27,Default,,0000,0000,0000,,NB: Yeah, so this little device there,
Dialogue: 0,0:16:05.29,0:16:08.89,Default,,0000,0000,0000,,imagine that kind of necklace\Nthat you would have to wear at all times
Dialogue: 0,0:16:08.92,0:16:10.92,Default,,0000,0000,0000,,with multidirectional cameras.
Dialogue: 0,0:16:11.79,0:16:13.60,Default,,0000,0000,0000,,But, to make it go down better,
Dialogue: 0,0:16:13.62,0:16:16.15,Default,,0000,0000,0000,,just call it the "freedom tag"\Nor something like that.
Dialogue: 0,0:16:16.17,0:16:18.18,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:16:18.21,0:16:19.48,Default,,0000,0000,0000,,CA: OK.
Dialogue: 0,0:16:19.50,0:16:21.60,Default,,0000,0000,0000,,I mean, this is the conversation, friends,
Dialogue: 0,0:16:21.62,0:16:25.18,Default,,0000,0000,0000,,this is why this is\Nsuch a mind-blowing conversation.
Dialogue: 0,0:16:25.21,0:16:27.81,Default,,0000,0000,0000,,NB: Actually, there's\Na whole big conversation on this
Dialogue: 0,0:16:27.83,0:16:29.14,Default,,0000,0000,0000,,on its own, obviously.
Dialogue: 0,0:16:29.17,0:16:31.64,Default,,0000,0000,0000,,There are huge problems and risks\Nwith that, right?
Dialogue: 0,0:16:31.67,0:16:32.93,Default,,0000,0000,0000,,We may come back to that.
Dialogue: 0,0:16:32.96,0:16:34.23,Default,,0000,0000,0000,,So the other, the final,
Dialogue: 0,0:16:34.25,0:16:36.81,Default,,0000,0000,0000,,the other general stabilization capability
Dialogue: 0,0:16:36.83,0:16:38.89,Default,,0000,0000,0000,,is kind of plugging\Nanother governance gap.
Dialogue: 0,0:16:38.92,0:16:43.10,Default,,0000,0000,0000,,So the surveillance would be kind of\Ngovernance gap at the microlevel,
Dialogue: 0,0:16:43.12,0:16:46.23,Default,,0000,0000,0000,,like, preventing anybody\Nfrom ever doing something highly illegal.
Dialogue: 0,0:16:46.25,0:16:48.56,Default,,0000,0000,0000,,Then, there's a corresponding\Ngovernance gap
Dialogue: 0,0:16:48.58,0:16:50.52,Default,,0000,0000,0000,,at the macrolevel, at the global level.
Dialogue: 0,0:16:50.54,0:16:54.48,Default,,0000,0000,0000,,You would need the ability, reliably,
Dialogue: 0,0:16:54.50,0:16:57.31,Default,,0000,0000,0000,,to prevent the worst kinds\Nof global coordination failures,
Dialogue: 0,0:16:57.33,0:17:01.10,Default,,0000,0000,0000,,to avoid wars between great powers,
Dialogue: 0,0:17:01.12,0:17:02.46,Default,,0000,0000,0000,,arms races,
Dialogue: 0,0:17:03.50,0:17:05.71,Default,,0000,0000,0000,,cataclysmic commons problems,
Dialogue: 0,0:17:07.67,0:17:11.85,Default,,0000,0000,0000,,in order to deal with\Nthe Type-2a vulnerabilities.
Dialogue: 0,0:17:11.88,0:17:13.81,Default,,0000,0000,0000,,CA: Global governance is a term
Dialogue: 0,0:17:13.83,0:17:16.06,Default,,0000,0000,0000,,that's definitely way out\Nof fashion right now,
Dialogue: 0,0:17:16.08,0:17:18.60,Default,,0000,0000,0000,,but could you make the case\Nthat throughout history,
Dialogue: 0,0:17:18.62,0:17:19.89,Default,,0000,0000,0000,,the history of humanity
Dialogue: 0,0:17:19.92,0:17:25.35,Default,,0000,0000,0000,,is that at every stage\Nof technological power increase,
Dialogue: 0,0:17:25.38,0:17:28.60,Default,,0000,0000,0000,,people have reorganized\Nand sort of centralized the power.
Dialogue: 0,0:17:28.62,0:17:32.06,Default,,0000,0000,0000,,So, for example,\Nwhen a roving band of criminals
Dialogue: 0,0:17:32.08,0:17:33.77,Default,,0000,0000,0000,,could take over a society,
Dialogue: 0,0:17:33.79,0:17:36.03,Default,,0000,0000,0000,,the response was,\Nwell, you have a nation-state
Dialogue: 0,0:17:36.06,0:17:38.49,Default,,0000,0000,0000,,and you centralize force,\Na police force or an army,
Dialogue: 0,0:17:38.51,0:17:40.14,Default,,0000,0000,0000,,so, "No, you can't do that."
Dialogue: 0,0:17:40.17,0:17:44.73,Default,,0000,0000,0000,,The logic, perhaps, of having\Na single person or a single group
Dialogue: 0,0:17:44.75,0:17:46.39,Default,,0000,0000,0000,,able to take out humanity
Dialogue: 0,0:17:46.42,0:17:49.14,Default,,0000,0000,0000,,means at some point\Nwe're going to have to go this route,
Dialogue: 0,0:17:49.17,0:17:50.60,Default,,0000,0000,0000,,at least in some form, no?
Dialogue: 0,0:17:50.62,0:17:54.31,Default,,0000,0000,0000,,NB: It's certainly true that the scale\Nof political organization has increased
Dialogue: 0,0:17:54.33,0:17:56.48,Default,,0000,0000,0000,,over the course of human history.
Dialogue: 0,0:17:56.50,0:17:58.52,Default,,0000,0000,0000,,It used to be hunter-gatherer band, right,
Dialogue: 0,0:17:58.54,0:18:01.48,Default,,0000,0000,0000,,and then chiefdom, city-states, nations,
Dialogue: 0,0:18:01.50,0:18:05.48,Default,,0000,0000,0000,,now there are international organizations\Nand so on and so forth.
Dialogue: 0,0:18:05.50,0:18:07.02,Default,,0000,0000,0000,,Again, I just want to make sure
Dialogue: 0,0:18:07.04,0:18:08.68,Default,,0000,0000,0000,,I get the chance to stress
Dialogue: 0,0:18:08.71,0:18:10.68,Default,,0000,0000,0000,,that obviously there are huge downsides
Dialogue: 0,0:18:10.71,0:18:12.23,Default,,0000,0000,0000,,and indeed, massive risks,
Dialogue: 0,0:18:12.25,0:18:15.60,Default,,0000,0000,0000,,both to mass surveillance\Nand to global governance.
Dialogue: 0,0:18:15.62,0:18:18.18,Default,,0000,0000,0000,,I'm just pointing out\Nthat if we are lucky,
Dialogue: 0,0:18:18.21,0:18:20.89,Default,,0000,0000,0000,,the world could be such\Nthat these would be the only ways
Dialogue: 0,0:18:20.92,0:18:22.43,Default,,0000,0000,0000,,you could survive a black ball.
Dialogue: 0,0:18:22.46,0:18:24.98,Default,,0000,0000,0000,,CA: The logic of this theory,
Dialogue: 0,0:18:25.00,0:18:26.27,Default,,0000,0000,0000,,it seems to me,
Dialogue: 0,0:18:26.29,0:18:29.89,Default,,0000,0000,0000,,is that we've got to recognize\Nwe can't have it all.
Dialogue: 0,0:18:29.92,0:18:31.75,Default,,0000,0000,0000,,That the sort of,
Dialogue: 0,0:18:33.50,0:18:36.48,Default,,0000,0000,0000,,I would say, naive dream\Nthat many of us had
Dialogue: 0,0:18:36.50,0:18:39.85,Default,,0000,0000,0000,,that technology is always\Ngoing to be a force for good,
Dialogue: 0,0:18:39.88,0:18:42.85,Default,,0000,0000,0000,,keep going, don't stop,\Ngo as fast as you can
Dialogue: 0,0:18:42.88,0:18:45.23,Default,,0000,0000,0000,,and not pay attention\Nto some of the consequences,
Dialogue: 0,0:18:45.25,0:18:46.93,Default,,0000,0000,0000,,that's actually just not an option.
Dialogue: 0,0:18:46.96,0:18:48.89,Default,,0000,0000,0000,,We can have that.
Dialogue: 0,0:18:48.92,0:18:50.18,Default,,0000,0000,0000,,If we have that,
Dialogue: 0,0:18:50.21,0:18:51.64,Default,,0000,0000,0000,,we're going to have to accept
Dialogue: 0,0:18:51.67,0:18:54.23,Default,,0000,0000,0000,,some of these other\Nvery uncomfortable things with it,
Dialogue: 0,0:18:54.25,0:18:56.48,Default,,0000,0000,0000,,and kind of be in this\Narms race with ourselves
Dialogue: 0,0:18:56.50,0:18:58.77,Default,,0000,0000,0000,,of, you want the power,\Nyou better limit it,
Dialogue: 0,0:18:58.79,0:19:00.93,Default,,0000,0000,0000,,you better figure out how to limit it.
Dialogue: 0,0:19:00.96,0:19:04.43,Default,,0000,0000,0000,,NB: I think it is an option,
Dialogue: 0,0:19:04.46,0:19:07.23,Default,,0000,0000,0000,,a very tempting option,\Nit's in a sense the easiest option
Dialogue: 0,0:19:07.25,0:19:08.52,Default,,0000,0000,0000,,and it might work,
Dialogue: 0,0:19:08.54,0:19:13.35,Default,,0000,0000,0000,,but it means we are fundamentally\Nvulnerable to extracting a black ball.
Dialogue: 0,0:19:13.38,0:19:15.52,Default,,0000,0000,0000,,Now, I think with a bit of coordination,
Dialogue: 0,0:19:15.54,0:19:18.27,Default,,0000,0000,0000,,like, if you did solve this\Nmacrogovernance problem,
Dialogue: 0,0:19:18.29,0:19:19.89,Default,,0000,0000,0000,,and the microgovernance problem,
Dialogue: 0,0:19:19.92,0:19:22.23,Default,,0000,0000,0000,,then we could extract\Nall the balls from the urn
Dialogue: 0,0:19:22.25,0:19:24.52,Default,,0000,0000,0000,,and we'd benefit greatly.
Dialogue: 0,0:19:24.54,0:19:27.98,Default,,0000,0000,0000,,CA: I mean, if we're living\Nin a simulation, does it matter?
Dialogue: 0,0:19:28.00,0:19:29.31,Default,,0000,0000,0000,,We just reboot.
Dialogue: 0,0:19:29.33,0:19:30.60,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:19:30.62,0:19:32.27,Default,,0000,0000,0000,,NB: Then ... I ...
Dialogue: 0,0:19:32.29,0:19:34.77,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:19:34.79,0:19:36.21,Default,,0000,0000,0000,,I didn't see that one coming.
Dialogue: 0,0:19:38.12,0:19:39.39,Default,,0000,0000,0000,,CA: So what's your view?
Dialogue: 0,0:19:39.42,0:19:44.23,Default,,0000,0000,0000,,Putting all the pieces together,\Nhow likely is it that we're doomed?
Dialogue: 0,0:19:44.25,0:19:46.21,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:19:47.04,0:19:49.43,Default,,0000,0000,0000,,I love how people laugh\Nwhen you ask that question.
Dialogue: 0,0:19:49.46,0:19:50.81,Default,,0000,0000,0000,,NB: On an individual level,
Dialogue: 0,0:19:50.83,0:19:54.68,Default,,0000,0000,0000,,we seem to kind of be doomed anyway,\Njust with the time line,
Dialogue: 0,0:19:54.71,0:19:57.31,Default,,0000,0000,0000,,we're rotting and aging\Nand all kinds of things, right?
Dialogue: 0,0:19:57.33,0:19:58.93,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:19:58.96,0:20:00.64,Default,,0000,0000,0000,,It's actually a little bit tricky.
Dialogue: 0,0:20:00.67,0:20:03.43,Default,,0000,0000,0000,,If you want to set up\Nso that you can attach a probability,
Dialogue: 0,0:20:03.46,0:20:04.73,Default,,0000,0000,0000,,first, who are we?
Dialogue: 0,0:20:04.75,0:20:07.48,Default,,0000,0000,0000,,If you're very old,\Nprobably you'll die of natural causes,
Dialogue: 0,0:20:07.50,0:20:09.85,Default,,0000,0000,0000,,if you're very young,\Nyou might have a 100-year --
Dialogue: 0,0:20:09.88,0:20:12.02,Default,,0000,0000,0000,,If probability might depend\Non who you ask,
Dialogue: 0,0:20:12.04,0:20:16.27,Default,,0000,0000,0000,,then the threshold, like, what counts\Nas civilizational devastation --
Dialogue: 0,0:20:16.29,0:20:21.93,Default,,0000,0000,0000,,In the paper I don't require\Nan existential catastrophe
Dialogue: 0,0:20:21.96,0:20:23.39,Default,,0000,0000,0000,,in order for it to count.
Dialogue: 0,0:20:23.42,0:20:25.10,Default,,0000,0000,0000,,This is just a definitional matter,
Dialogue: 0,0:20:25.12,0:20:26.43,Default,,0000,0000,0000,,I say a billion dead,
Dialogue: 0,0:20:26.46,0:20:28.52,Default,,0000,0000,0000,,or a reduction of world GDP by 50 percent,
Dialogue: 0,0:20:28.54,0:20:30.77,Default,,0000,0000,0000,,but depending on what\Nyou say the threshold is,
Dialogue: 0,0:20:30.79,0:20:32.77,Default,,0000,0000,0000,,you get a different probability estimate.
Dialogue: 0,0:20:32.79,0:20:37.31,Default,,0000,0000,0000,,But I guess you could\Nput me down as a frightened optimist.
Dialogue: 0,0:20:37.33,0:20:38.43,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:20:38.46,0:20:40.10,Default,,0000,0000,0000,,CA: You're a frightened optimist,
Dialogue: 0,0:20:40.12,0:20:44.39,Default,,0000,0000,0000,,and I think you've just created\Na large number of other frightened ...
Dialogue: 0,0:20:44.42,0:20:45.68,Default,,0000,0000,0000,,people.
Dialogue: 0,0:20:45.71,0:20:46.77,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:20:46.79,0:20:48.06,Default,,0000,0000,0000,,NB: In the simulation.
Dialogue: 0,0:20:48.08,0:20:49.35,Default,,0000,0000,0000,,CA: In a simulation.
Dialogue: 0,0:20:49.38,0:20:51.06,Default,,0000,0000,0000,,Nick Bostrom, your mind amazes me,
Dialogue: 0,0:20:51.08,0:20:53.98,Default,,0000,0000,0000,,thank you so much for scaring\Nthe living daylights out of us.
Dialogue: 0,0:20:54.00,0:20:56.38,Default,,0000,0000,0000,,(Applause)