[Script Info]
Title:

[Events]
Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
Dialogue: 0,0:00:00.57,0:00:04.78,Default,,0000,0000,0000,,I work with a bunch of mathematicians,\Nphilosophers and computer scientists,
Dialogue: 0,0:00:04.78,0:00:09.99,Default,,0000,0000,0000,,and we sit around and think about\Nthe future of machine intelligence,
Dialogue: 0,0:00:09.99,0:00:12.03,Default,,0000,0000,0000,,among other things.
Dialogue: 0,0:00:12.03,0:00:16.76,Default,,0000,0000,0000,,Some people think that some of these\Nthings are sort of science fiction-y,
Dialogue: 0,0:00:16.76,0:00:19.86,Default,,0000,0000,0000,,far out there, crazy.
Dialogue: 0,0:00:19.86,0:00:21.33,Default,,0000,0000,0000,,But I like to say,
Dialogue: 0,0:00:21.33,0:00:24.93,Default,,0000,0000,0000,,okay, let's look at the modern\Nhuman condition.
Dialogue: 0,0:00:24.93,0:00:26.62,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:00:26.62,0:00:29.02,Default,,0000,0000,0000,,This is the normal way for things to be.
Dialogue: 0,0:00:29.02,0:00:31.31,Default,,0000,0000,0000,,But if we think about it,
Dialogue: 0,0:00:31.31,0:00:34.60,Default,,0000,0000,0000,,we are actually recently arrived\Nguests on this planet,
Dialogue: 0,0:00:34.60,0:00:36.68,Default,,0000,0000,0000,,the human species.
Dialogue: 0,0:00:36.68,0:00:41.43,Default,,0000,0000,0000,,Think about if Earth\Nwas created one year ago,
Dialogue: 0,0:00:41.43,0:00:44.98,Default,,0000,0000,0000,,the human species, then,\Nwould be 10 minutes old.
Dialogue: 0,0:00:44.98,0:00:48.15,Default,,0000,0000,0000,,The industrial era started\Ntwo seconds ago.
Dialogue: 0,0:00:49.28,0:00:54.50,Default,,0000,0000,0000,,Another way to look at this is to think of\Nworld GDP over the last 10,000 years.
Dialogue: 0,0:00:54.50,0:00:57.53,Default,,0000,0000,0000,,I've actually taken the trouble\Nto plot this for you in a graph.
Dialogue: 0,0:00:57.53,0:00:59.30,Default,,0000,0000,0000,,It looks like this.
Dialogue: 0,0:00:59.30,0:01:00.67,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:01:00.67,0:01:02.82,Default,,0000,0000,0000,,It's a curious shape\Nfor a normal condition.
Dialogue: 0,0:01:02.82,0:01:04.52,Default,,0000,0000,0000,,I sure wouldn't want to sit on it.
Dialogue: 0,0:01:04.52,0:01:07.07,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:01:07.07,0:01:11.84,Default,,0000,0000,0000,,Let's ask ourselves, what is the cause\Nof this current anomaly?
Dialogue: 0,0:01:11.84,0:01:14.39,Default,,0000,0000,0000,,Some people would say it's technology.
Dialogue: 0,0:01:14.39,0:01:19.06,Default,,0000,0000,0000,,Now it's true, technology has accumulated\Nthrough human history,
Dialogue: 0,0:01:19.06,0:01:23.71,Default,,0000,0000,0000,,and right now, technology\Nadvances extremely rapidly --
Dialogue: 0,0:01:23.71,0:01:25.28,Default,,0000,0000,0000,,that is the proximate cause,
Dialogue: 0,0:01:25.28,0:01:27.84,Default,,0000,0000,0000,,that's why we are currently\Nso very productive.
Dialogue: 0,0:01:28.47,0:01:32.13,Default,,0000,0000,0000,,But I like to think back further\Nto the ultimate cause.
Dialogue: 0,0:01:33.11,0:01:36.88,Default,,0000,0000,0000,,Look at these two highly\Ndistinguished gentlemen:
Dialogue: 0,0:01:36.88,0:01:38.48,Default,,0000,0000,0000,,We have Kanzi --
Dialogue: 0,0:01:38.48,0:01:43.12,Default,,0000,0000,0000,,he's mastered 200 lexical\Ntokens, an incredible feat.
Dialogue: 0,0:01:43.12,0:01:46.82,Default,,0000,0000,0000,,And Ed Witten unleashed the second\Nsuperstring revolution.
Dialogue: 0,0:01:46.82,0:01:49.14,Default,,0000,0000,0000,,If we look under the hood,\Nthis is what we find:
Dialogue: 0,0:01:49.14,0:01:50.71,Default,,0000,0000,0000,,basically the same thing.
Dialogue: 0,0:01:50.71,0:01:52.52,Default,,0000,0000,0000,,One is a little larger,
Dialogue: 0,0:01:52.52,0:01:55.28,Default,,0000,0000,0000,,it maybe also has a few tricks\Nin the exact way it's wired.
Dialogue: 0,0:01:55.28,0:01:59.09,Default,,0000,0000,0000,,These invisible differences cannot\Nbe too complicated, however,
Dialogue: 0,0:01:59.09,0:02:03.38,Default,,0000,0000,0000,,because there have only\Nbeen 250,000 generations
Dialogue: 0,0:02:03.38,0:02:05.11,Default,,0000,0000,0000,,since our last common ancestor.
Dialogue: 0,0:02:05.11,0:02:08.96,Default,,0000,0000,0000,,We know that complicated mechanisms\Ntake a long time to evolve.
Dialogue: 0,0:02:10.00,0:02:12.50,Default,,0000,0000,0000,,So a bunch of relatively minor changes
Dialogue: 0,0:02:12.50,0:02:15.57,Default,,0000,0000,0000,,take us from Kanzi to Witten,
Dialogue: 0,0:02:15.57,0:02:20.11,Default,,0000,0000,0000,,from broken-off tree branches\Nto intercontinental ballistic missiles.
Dialogue: 0,0:02:20.84,0:02:24.77,Default,,0000,0000,0000,,So it then seems pretty obvious\Nthat everything we've achieved,
Dialogue: 0,0:02:24.77,0:02:26.15,Default,,0000,0000,0000,,and everything we care about,
Dialogue: 0,0:02:26.15,0:02:31.38,Default,,0000,0000,0000,,depends crucially on some relatively minor\Nchanges that made the human mind.
Dialogue: 0,0:02:32.65,0:02:36.31,Default,,0000,0000,0000,,And the corollary, of course,\Nis that any further changes
Dialogue: 0,0:02:36.31,0:02:39.79,Default,,0000,0000,0000,,that could significantly change\Nthe substrate of thinking
Dialogue: 0,0:02:39.79,0:02:42.99,Default,,0000,0000,0000,,could have potentially\Nenormous consequences.
Dialogue: 0,0:02:44.32,0:02:47.23,Default,,0000,0000,0000,,Some of my colleagues\Nthink we're on the verge
Dialogue: 0,0:02:47.23,0:02:51.13,Default,,0000,0000,0000,,of something that could cause\Na profound change in that substrate,
Dialogue: 0,0:02:51.13,0:02:54.35,Default,,0000,0000,0000,,and that is machine superintelligence.
Dialogue: 0,0:02:54.35,0:02:59.09,Default,,0000,0000,0000,,Artificial intelligence used to be\Nabout putting commands in a box.
Dialogue: 0,0:02:59.09,0:03:00.75,Default,,0000,0000,0000,,You would have human programmers
Dialogue: 0,0:03:00.75,0:03:03.89,Default,,0000,0000,0000,,that would painstakingly\Nhandcraft knowledge items.
Dialogue: 0,0:03:03.89,0:03:05.97,Default,,0000,0000,0000,,You would build up these expert systems,
Dialogue: 0,0:03:05.97,0:03:08.30,Default,,0000,0000,0000,,and they were kind of useful\Nfor some purposes,
Dialogue: 0,0:03:08.30,0:03:10.98,Default,,0000,0000,0000,,but they were very brittle;\Nyou couldn't scale them.
Dialogue: 0,0:03:10.98,0:03:14.41,Default,,0000,0000,0000,,Basically, you got out only\Nwhat you put in.
Dialogue: 0,0:03:14.41,0:03:15.41,Default,,0000,0000,0000,,But since then,
Dialogue: 0,0:03:15.41,0:03:18.87,Default,,0000,0000,0000,,a paradigm shift has taken place\Nin the field of artificial intelligence.
Dialogue: 0,0:03:18.87,0:03:21.64,Default,,0000,0000,0000,,Today, the action is really\Naround machine learning.
Dialogue: 0,0:03:22.39,0:03:27.78,Default,,0000,0000,0000,,So rather than handcrafting knowledge\Nrepresentations and features,
Dialogue: 0,0:03:28.51,0:03:34.06,Default,,0000,0000,0000,,we create algorithms that learn,\Noften from raw perceptual data.
Dialogue: 0,0:03:34.06,0:03:39.06,Default,,0000,0000,0000,,Basically the same thing\Nthat the human infant does.
Dialogue: 0,0:03:39.06,0:03:43.27,Default,,0000,0000,0000,,The result is A.I. that is not\Nlimited to one domain --
Dialogue: 0,0:03:43.27,0:03:47.90,Default,,0000,0000,0000,,the same system can learn to translate\Nbetween any pairs of languages,
Dialogue: 0,0:03:47.90,0:03:53.34,Default,,0000,0000,0000,,or learn to play any computer game\Non the Atari console.
Dialogue: 0,0:03:53.34,0:03:55.12,Default,,0000,0000,0000,,Now of course,
Dialogue: 0,0:03:55.12,0:03:59.12,Default,,0000,0000,0000,,A.I. is still nowhere near having\Nthe same powerful, cross-domain
Dialogue: 0,0:03:59.12,0:04:02.34,Default,,0000,0000,0000,,ability to learn and plan\Nas a human being has.
Dialogue: 0,0:04:02.34,0:04:04.46,Default,,0000,0000,0000,,The cortex still has some\Nalgorithmic tricks
Dialogue: 0,0:04:04.46,0:04:06.82,Default,,0000,0000,0000,,that we don't yet know\Nhow to match in machines.
Dialogue: 0,0:04:07.89,0:04:09.78,Default,,0000,0000,0000,,So the question is,
Dialogue: 0,0:04:09.78,0:04:13.28,Default,,0000,0000,0000,,how far are we from being able\Nto match those tricks?
Dialogue: 0,0:04:14.24,0:04:15.33,Default,,0000,0000,0000,,A couple of years ago,
Dialogue: 0,0:04:15.33,0:04:18.22,Default,,0000,0000,0000,,we did a survey of some of the world's\Nleading A.I. experts,
Dialogue: 0,0:04:18.22,0:04:21.44,Default,,0000,0000,0000,,to see what they think,\Nand one of the questions we asked was,
Dialogue: 0,0:04:21.44,0:04:24.79,Default,,0000,0000,0000,,"By which year do you think\Nthere is a 50 percent probability
Dialogue: 0,0:04:24.79,0:04:28.28,Default,,0000,0000,0000,,that we will have achieved\Nhuman-level machine intelligence?"
Dialogue: 0,0:04:28.78,0:04:32.97,Default,,0000,0000,0000,,We defined human-level here\Nas the ability to perform
Dialogue: 0,0:04:32.97,0:04:35.84,Default,,0000,0000,0000,,almost any job at least as well\Nas an adult human,
Dialogue: 0,0:04:35.84,0:04:39.84,Default,,0000,0000,0000,,so real human-level, not just\Nwithin some limited domain.
Dialogue: 0,0:04:39.84,0:04:43.49,Default,,0000,0000,0000,,And the median answer was 2040 or 2050,
Dialogue: 0,0:04:43.49,0:04:46.30,Default,,0000,0000,0000,,depending on precisely which\Ngroup of experts we asked.
Dialogue: 0,0:04:46.30,0:04:50.34,Default,,0000,0000,0000,,Now, it could happen much,\Nmuch later, or sooner,
Dialogue: 0,0:04:50.34,0:04:52.28,Default,,0000,0000,0000,,the truth is nobody really knows.
Dialogue: 0,0:04:53.26,0:04:57.67,Default,,0000,0000,0000,,What we do know is that the ultimate\Nlimit to information processing
Dialogue: 0,0:04:57.67,0:05:02.54,Default,,0000,0000,0000,,in a machine substrate lies far outside\Nthe limits in biological tissue.
Dialogue: 0,0:05:03.24,0:05:05.62,Default,,0000,0000,0000,,This comes down to physics.
Dialogue: 0,0:05:05.62,0:05:10.34,Default,,0000,0000,0000,,A biological neuron fires, maybe,\Nat 200 hertz, 200 times a second.
Dialogue: 0,0:05:10.34,0:05:13.93,Default,,0000,0000,0000,,But even a present-day transistor\Noperates at the gigahertz.
Dialogue: 0,0:05:13.93,0:05:19.23,Default,,0000,0000,0000,,Neurons propagate slowly in axons,\N100 meters per second, tops.
Dialogue: 0,0:05:19.23,0:05:22.34,Default,,0000,0000,0000,,But in computers, signals can travel\Nat the speed of light.
Dialogue: 0,0:05:23.08,0:05:24.95,Default,,0000,0000,0000,,There are also size limitations,
Dialogue: 0,0:05:24.95,0:05:27.98,Default,,0000,0000,0000,,like a human brain has\Nto fit inside a cranium,
Dialogue: 0,0:05:27.98,0:05:32.74,Default,,0000,0000,0000,,but a computer can be the size\Nof a warehouse or larger.
Dialogue: 0,0:05:32.74,0:05:38.34,Default,,0000,0000,0000,,So the potential for superintelligence\Nlies dormant in matter,
Dialogue: 0,0:05:38.34,0:05:44.05,Default,,0000,0000,0000,,much like the power of the atom\Nlay dormant throughout human history,
Dialogue: 0,0:05:44.05,0:05:48.45,Default,,0000,0000,0000,,patiently waiting there until 1945.
Dialogue: 0,0:05:48.45,0:05:49.70,Default,,0000,0000,0000,,In this century,
Dialogue: 0,0:05:49.70,0:05:53.82,Default,,0000,0000,0000,,scientists may learn to awaken\Nthe power of artificial intelligence.
Dialogue: 0,0:05:53.82,0:05:57.83,Default,,0000,0000,0000,,And I think we might then see\Nan intelligence explosion.
Dialogue: 0,0:05:58.41,0:06:02.36,Default,,0000,0000,0000,,Now most people, when they think\Nabout what is smart and what is dumb,
Dialogue: 0,0:06:02.36,0:06:05.39,Default,,0000,0000,0000,,I think have in mind a picture\Nroughly like this.
Dialogue: 0,0:06:05.39,0:06:07.98,Default,,0000,0000,0000,,So at one end we have the village idiot,
Dialogue: 0,0:06:07.98,0:06:10.47,Default,,0000,0000,0000,,and then far over at the other side
Dialogue: 0,0:06:10.47,0:06:15.22,Default,,0000,0000,0000,,we have Ed Witten, or Albert Einstein,\Nor whoever your favorite guru is.
Dialogue: 0,0:06:15.22,0:06:19.06,Default,,0000,0000,0000,,But I think that from the point of view\Nof artificial intelligence,
Dialogue: 0,0:06:19.06,0:06:22.74,Default,,0000,0000,0000,,the true picture is actually\Nprobably more like this:
Dialogue: 0,0:06:23.26,0:06:26.64,Default,,0000,0000,0000,,A.I. starts out at this point here,\Nat zero intelligence,
Dialogue: 0,0:06:26.64,0:06:29.65,Default,,0000,0000,0000,,and then, after many, many\Nyears of really hard work,
Dialogue: 0,0:06:29.65,0:06:33.49,Default,,0000,0000,0000,,maybe eventually we get to\Nmouse-level artificial intelligence,
Dialogue: 0,0:06:33.49,0:06:35.92,Default,,0000,0000,0000,,something that can navigate\Ncluttered environments
Dialogue: 0,0:06:35.92,0:06:37.91,Default,,0000,0000,0000,,as well as a mouse can.
Dialogue: 0,0:06:37.91,0:06:42.22,Default,,0000,0000,0000,,And then, after many, many more years\Nof really hard work, lots of investment,
Dialogue: 0,0:06:42.22,0:06:46.86,Default,,0000,0000,0000,,maybe eventually we get to\Nchimpanzee-level artificial intelligence.
Dialogue: 0,0:06:46.86,0:06:50.07,Default,,0000,0000,0000,,And then, after even more years\Nof really, really hard work,
Dialogue: 0,0:06:50.07,0:06:52.98,Default,,0000,0000,0000,,we get to village idiot\Nartificial intelligence.
Dialogue: 0,0:06:52.98,0:06:56.26,Default,,0000,0000,0000,,And a few moments later,\Nwe are beyond Ed Witten.
Dialogue: 0,0:06:56.26,0:06:59.22,Default,,0000,0000,0000,,The train doesn't stop\Nat Humanville Station.
Dialogue: 0,0:06:59.22,0:07:02.25,Default,,0000,0000,0000,,It's likely, rather, to swoosh right by.
Dialogue: 0,0:07:02.25,0:07:04.23,Default,,0000,0000,0000,,Now this has profound implications,
Dialogue: 0,0:07:04.23,0:07:08.09,Default,,0000,0000,0000,,particularly when it comes\Nto questions of power.
Dialogue: 0,0:07:08.09,0:07:09.99,Default,,0000,0000,0000,,For example, chimpanzees are strong --
Dialogue: 0,0:07:09.99,0:07:15.21,Default,,0000,0000,0000,,pound for pound, a chimpanzee is about\Ntwice as strong as a fit human male.
Dialogue: 0,0:07:15.21,0:07:19.83,Default,,0000,0000,0000,,And yet, the fate of Kanzi\Nand his pals depends a lot more
Dialogue: 0,0:07:19.83,0:07:23.97,Default,,0000,0000,0000,,on what we humans do than on\Nwhat the chimpanzees do themselves.
Dialogue: 0,0:07:25.23,0:07:27.54,Default,,0000,0000,0000,,Once there is superintelligence,
Dialogue: 0,0:07:27.54,0:07:31.38,Default,,0000,0000,0000,,the fate of humanity may depend\Non what the superintelligence does.
Dialogue: 0,0:07:32.45,0:07:33.51,Default,,0000,0000,0000,,Think about it:
Dialogue: 0,0:07:33.51,0:07:38.55,Default,,0000,0000,0000,,Machine intelligence is the last invention\Nthat humanity will ever need to make.
Dialogue: 0,0:07:38.55,0:07:41.52,Default,,0000,0000,0000,,Machines will then be better\Nat inventing than we are,
Dialogue: 0,0:07:41.52,0:07:44.06,Default,,0000,0000,0000,,and they'll be doing so\Non digital timescales.
Dialogue: 0,0:07:44.06,0:07:48.97,Default,,0000,0000,0000,,What this means is basically\Na telescoping of the future.
Dialogue: 0,0:07:48.97,0:07:52.52,Default,,0000,0000,0000,,Think of all the crazy technologies\Nthat you could have imagined
Dialogue: 0,0:07:52.52,0:07:55.32,Default,,0000,0000,0000,,maybe humans could have developed\Nin the fullness of time:
Dialogue: 0,0:07:55.32,0:07:58.58,Default,,0000,0000,0000,,cures for aging, space colonization,
Dialogue: 0,0:07:58.58,0:08:02.31,Default,,0000,0000,0000,,self-replicating nanobots or uploading\Nof minds into computers,
Dialogue: 0,0:08:02.31,0:08:04.47,Default,,0000,0000,0000,,all kinds of science fiction-y stuff
Dialogue: 0,0:08:04.47,0:08:07.21,Default,,0000,0000,0000,,that's nevertheless consistent\Nwith the laws of physics.
Dialogue: 0,0:08:07.21,0:08:11.42,Default,,0000,0000,0000,,All of this a superintelligence could\Ndevelop, and possibly quite rapidly.
Dialogue: 0,0:08:12.45,0:08:16.01,Default,,0000,0000,0000,,Now, a superintelligence with such\Ntechnological maturity
Dialogue: 0,0:08:16.01,0:08:18.19,Default,,0000,0000,0000,,would be extremely powerful,
Dialogue: 0,0:08:18.19,0:08:22.73,Default,,0000,0000,0000,,and at least in some scenarios,\Nit would be able to get what it wants.
Dialogue: 0,0:08:22.73,0:08:28.39,Default,,0000,0000,0000,,We would then have a future that would\Nbe shaped by the preferences of this A.I.
Dialogue: 0,0:08:29.86,0:08:33.60,Default,,0000,0000,0000,,Now a good question is,\Nwhat are those preferences?
Dialogue: 0,0:08:34.24,0:08:36.01,Default,,0000,0000,0000,,Here it gets trickier.
Dialogue: 0,0:08:36.01,0:08:37.45,Default,,0000,0000,0000,,To make any headway with this,
Dialogue: 0,0:08:37.45,0:08:40.72,Default,,0000,0000,0000,,we must first of all\Navoid anthropomorphizing.
Dialogue: 0,0:08:41.93,0:08:45.24,Default,,0000,0000,0000,,And this is ironic because\Nevery newspaper article
Dialogue: 0,0:08:45.24,0:08:49.09,Default,,0000,0000,0000,,about the future of A.I.\Nhas a picture of this:
Dialogue: 0,0:08:50.28,0:08:54.41,Default,,0000,0000,0000,,So I think what we need to do is\Nto conceive of the issue more abstractly,
Dialogue: 0,0:08:54.41,0:08:57.20,Default,,0000,0000,0000,,not in terms of vivid Hollywood scenarios.
Dialogue: 0,0:08:57.20,0:09:00.82,Default,,0000,0000,0000,,We need to think of intelligence\Nas an optimization process,
Dialogue: 0,0:09:00.82,0:09:06.47,Default,,0000,0000,0000,,a process that steers the future\Ninto a particular set of configurations.
Dialogue: 0,0:09:06.47,0:09:09.98,Default,,0000,0000,0000,,A superintelligence is\Na really strong optimization process.
Dialogue: 0,0:09:09.98,0:09:14.10,Default,,0000,0000,0000,,It's extremely good at using\Navailable means to achieve a state
Dialogue: 0,0:09:14.10,0:09:16.01,Default,,0000,0000,0000,,in which its goal is realized.
Dialogue: 0,0:09:16.45,0:09:19.12,Default,,0000,0000,0000,,This means that there is no necessary\Nconnection between
Dialogue: 0,0:09:19.12,0:09:21.85,Default,,0000,0000,0000,,being highly intelligent in this sense,
Dialogue: 0,0:09:21.85,0:09:26.52,Default,,0000,0000,0000,,and having an objective that we humans\Nwould find worthwhile or meaningful.
Dialogue: 0,0:09:27.32,0:09:31.12,Default,,0000,0000,0000,,Suppose we give an A.I. the goal\Nto make humans smile.
Dialogue: 0,0:09:31.12,0:09:34.10,Default,,0000,0000,0000,,When the A.I. is weak, it performs useful\Nor amusing actions
Dialogue: 0,0:09:34.10,0:09:36.61,Default,,0000,0000,0000,,that cause its user to smile.
Dialogue: 0,0:09:36.61,0:09:39.03,Default,,0000,0000,0000,,When the A.I. becomes superintelligent,
Dialogue: 0,0:09:39.03,0:09:42.55,Default,,0000,0000,0000,,it realizes that there is a more\Neffective way to achieve this goal:
Dialogue: 0,0:09:42.55,0:09:44.48,Default,,0000,0000,0000,,take control of the world
Dialogue: 0,0:09:44.48,0:09:47.64,Default,,0000,0000,0000,,and stick electrodes into the facial\Nmuscles of humans
Dialogue: 0,0:09:47.64,0:09:50.58,Default,,0000,0000,0000,,to cause constant, beaming grins.
Dialogue: 0,0:09:50.58,0:09:51.61,Default,,0000,0000,0000,,Another example:
Dialogue: 0,0:09:51.61,0:09:55.00,Default,,0000,0000,0000,,suppose we give an A.I. the goal to solve\Na difficult mathematical problem.
Dialogue: 0,0:09:55.00,0:09:56.93,Default,,0000,0000,0000,,When the A.I. becomes superintelligent,
Dialogue: 0,0:09:56.93,0:10:01.10,Default,,0000,0000,0000,,it realizes that the most effective way\Nto get the solution to this problem
Dialogue: 0,0:10:01.10,0:10:04.04,Default,,0000,0000,0000,,is by transforming the planet\Ninto a giant computer,
Dialogue: 0,0:10:04.04,0:10:06.28,Default,,0000,0000,0000,,so as to increase its thinking capacity.
Dialogue: 0,0:10:06.28,0:10:09.04,Default,,0000,0000,0000,,And notice that this gives the A.I.\Nan instrumental reason
Dialogue: 0,0:10:09.04,0:10:11.56,Default,,0000,0000,0000,,to do things to us that we\Nmight not approve of.
Dialogue: 0,0:10:11.56,0:10:13.50,Default,,0000,0000,0000,,Human beings in this model are threats;
Dialogue: 0,0:10:13.50,0:10:16.42,Default,,0000,0000,0000,,we could prevent the mathematical\Nproblem from being solved.
Dialogue: 0,0:10:17.21,0:10:20.70,Default,,0000,0000,0000,,Of course, perceivably things won't\Ngo wrong in these particular ways;
Dialogue: 0,0:10:20.70,0:10:22.45,Default,,0000,0000,0000,,these are cartoon examples.
Dialogue: 0,0:10:22.45,0:10:24.39,Default,,0000,0000,0000,,But the general point here is important:
Dialogue: 0,0:10:24.39,0:10:27.27,Default,,0000,0000,0000,,if you create a really powerful\Noptimization process
Dialogue: 0,0:10:27.27,0:10:29.50,Default,,0000,0000,0000,,to maximize for objective x,
Dialogue: 0,0:10:29.50,0:10:31.78,Default,,0000,0000,0000,,you better make sure\Nthat your definition of x
Dialogue: 0,0:10:31.78,0:10:34.24,Default,,0000,0000,0000,,incorporates everything you care about.
Dialogue: 0,0:10:34.84,0:10:39.22,Default,,0000,0000,0000,,This is a lesson that's also taught\Nin many a myth.
Dialogue: 0,0:10:39.22,0:10:44.52,Default,,0000,0000,0000,,King Midas wishes that everything\Nhe touches be turned into gold.
Dialogue: 0,0:10:44.52,0:10:47.38,Default,,0000,0000,0000,,He touches his daughter,\Nshe turns into gold.
Dialogue: 0,0:10:47.38,0:10:49.93,Default,,0000,0000,0000,,He touches his food, it turns into gold.
Dialogue: 0,0:10:49.93,0:10:52.52,Default,,0000,0000,0000,,This could become practically relevant,
Dialogue: 0,0:10:52.52,0:10:54.59,Default,,0000,0000,0000,,not just as a metaphor for greed,
Dialogue: 0,0:10:54.59,0:10:56.48,Default,,0000,0000,0000,,but as an illustration of what happens
Dialogue: 0,0:10:56.48,0:10:59.32,Default,,0000,0000,0000,,if you create a powerful\Noptimization process
Dialogue: 0,0:10:59.32,0:11:04.11,Default,,0000,0000,0000,,and give it misconceived\Nor poorly specified goals.
Dialogue: 0,0:11:04.11,0:11:09.30,Default,,0000,0000,0000,,Now you might say, if a computer starts\Nsticking electrodes into people's faces,
Dialogue: 0,0:11:09.30,0:11:11.56,Default,,0000,0000,0000,,we'd just shut it off.
Dialogue: 0,0:11:12.56,0:11:17.90,Default,,0000,0000,0000,,A, this is not necessarily so easy to do\Nif we've grown dependent on the system --
Dialogue: 0,0:11:17.90,0:11:20.63,Default,,0000,0000,0000,,like, where is the off switch\Nto the Internet?
Dialogue: 0,0:11:20.63,0:11:25.75,Default,,0000,0000,0000,,B, why haven't the chimpanzees\Nflicked the off switch to humanity,
Dialogue: 0,0:11:25.75,0:11:27.30,Default,,0000,0000,0000,,or the Neanderthals?
Dialogue: 0,0:11:27.30,0:11:29.96,Default,,0000,0000,0000,,They certainly had reasons.
Dialogue: 0,0:11:29.96,0:11:32.76,Default,,0000,0000,0000,,We have an off switch,\Nfor example, right here.
Dialogue: 0,0:11:32.76,0:11:34.31,Default,,0000,0000,0000,,(Choking)
Dialogue: 0,0:11:34.31,0:11:37.24,Default,,0000,0000,0000,,The reason is that we are\Nan intelligent adversary;
Dialogue: 0,0:11:37.24,0:11:39.97,Default,,0000,0000,0000,,we can anticipate threats\Nand plan around them.
Dialogue: 0,0:11:39.97,0:11:42.47,Default,,0000,0000,0000,,But so could a superintelligent agent,
Dialogue: 0,0:11:42.47,0:11:45.72,Default,,0000,0000,0000,,and it would be much better\Nat that than we are.
Dialogue: 0,0:11:45.72,0:11:52.91,Default,,0000,0000,0000,,The point is, we should not be confident\Nthat we have this under control here.
Dialogue: 0,0:11:52.91,0:11:56.36,Default,,0000,0000,0000,,And we could try to make our job\Na little bit easier by, say,
Dialogue: 0,0:11:56.36,0:11:57.95,Default,,0000,0000,0000,,putting the A.I. in a box,
Dialogue: 0,0:11:57.95,0:11:59.74,Default,,0000,0000,0000,,like a secure software environment,
Dialogue: 0,0:11:59.74,0:12:02.77,Default,,0000,0000,0000,,a virtual reality simulation\Nfrom which it cannot escape.
Dialogue: 0,0:12:02.77,0:12:06.91,Default,,0000,0000,0000,,But how confident can we be that\Nthe A.I. couldn't find a bug?
Dialogue: 0,0:12:06.91,0:12:10.08,Default,,0000,0000,0000,,Given that merely human hackers\Nfind bugs all the time,
Dialogue: 0,0:12:10.08,0:12:13.12,Default,,0000,0000,0000,,I'd say, probably not very confident.
Dialogue: 0,0:12:14.24,0:12:18.78,Default,,0000,0000,0000,,So we disconnect the ethernet cable\Nto create an air gap,
Dialogue: 0,0:12:18.78,0:12:21.45,Default,,0000,0000,0000,,but again, merely human hackers
Dialogue: 0,0:12:21.45,0:12:24.83,Default,,0000,0000,0000,,routinely transgress air gaps\Nusing social engineering.
Dialogue: 0,0:12:24.83,0:12:26.09,Default,,0000,0000,0000,,Right now, as I speak,
Dialogue: 0,0:12:26.09,0:12:28.48,Default,,0000,0000,0000,,I'm sure there is some employee\Nout there somewhere
Dialogue: 0,0:12:28.48,0:12:31.83,Default,,0000,0000,0000,,who has been talked into handing out\Nher account details
Dialogue: 0,0:12:31.83,0:12:34.57,Default,,0000,0000,0000,,by somebody claiming to be\Nfrom the I.T. department.
Dialogue: 0,0:12:34.57,0:12:36.70,Default,,0000,0000,0000,,More creative scenarios are also possible,
Dialogue: 0,0:12:36.70,0:12:38.02,Default,,0000,0000,0000,,like if you're the A.I.,
Dialogue: 0,0:12:38.02,0:12:41.55,Default,,0000,0000,0000,,you can imagine wiggling electrodes\Naround in your internal circuitry
Dialogue: 0,0:12:41.55,0:12:45.01,Default,,0000,0000,0000,,to create radio waves that you\Ncan use to communicate.
Dialogue: 0,0:12:45.01,0:12:47.43,Default,,0000,0000,0000,,Or maybe you could pretend to malfunction,
Dialogue: 0,0:12:47.43,0:12:50.93,Default,,0000,0000,0000,,and then when the programmers open\Nyou up to see what went wrong with you,
Dialogue: 0,0:12:50.93,0:12:52.87,Default,,0000,0000,0000,,they look at the source code -- Bam! --
Dialogue: 0,0:12:52.87,0:12:55.31,Default,,0000,0000,0000,,the manipulation can take place.
Dialogue: 0,0:12:55.31,0:12:58.74,Default,,0000,0000,0000,,Or it could output the blueprint\Nto a really nifty technology,
Dialogue: 0,0:12:58.74,0:13:00.14,Default,,0000,0000,0000,,and when we implement it,
Dialogue: 0,0:13:00.14,0:13:04.54,Default,,0000,0000,0000,,it has some surreptitious side effect\Nthat the A.I. had planned.
Dialogue: 0,0:13:04.54,0:13:08.00,Default,,0000,0000,0000,,The point here is that we should\Nnot be confident in our ability
Dialogue: 0,0:13:08.00,0:13:11.81,Default,,0000,0000,0000,,to keep a superintelligent genie\Nlocked up in its bottle forever.
Dialogue: 0,0:13:11.81,0:13:14.06,Default,,0000,0000,0000,,Sooner or later, it will out.
Dialogue: 0,0:13:15.03,0:13:18.14,Default,,0000,0000,0000,,I believe that the answer here\Nis to figure out
Dialogue: 0,0:13:18.14,0:13:23.16,Default,,0000,0000,0000,,how to create superintelligent A.I.\Nsuch that even if -- when -- it escapes,
Dialogue: 0,0:13:23.16,0:13:26.44,Default,,0000,0000,0000,,it is still safe because it is\Nfundamentally on our side
Dialogue: 0,0:13:26.44,0:13:28.34,Default,,0000,0000,0000,,because it shares our values.
Dialogue: 0,0:13:28.34,0:13:31.55,Default,,0000,0000,0000,,I see no way around\Nthis difficult problem.
Dialogue: 0,0:13:32.56,0:13:36.39,Default,,0000,0000,0000,,Now, I'm actually fairly optimistic\Nthat this problem can be solved.
Dialogue: 0,0:13:36.39,0:13:40.29,Default,,0000,0000,0000,,We wouldn't have to write down\Na long list of everything we care about,
Dialogue: 0,0:13:40.29,0:13:43.94,Default,,0000,0000,0000,,or worse yet, spell it out\Nin some computer language
Dialogue: 0,0:13:43.94,0:13:45.39,Default,,0000,0000,0000,,like C++ or Python,
Dialogue: 0,0:13:45.39,0:13:48.16,Default,,0000,0000,0000,,that would be a task beyond hopeless.
Dialogue: 0,0:13:48.16,0:13:52.46,Default,,0000,0000,0000,,Instead, we would create an A.I.\Nthat uses its intelligence
Dialogue: 0,0:13:52.46,0:13:55.23,Default,,0000,0000,0000,,to learn what we value,
Dialogue: 0,0:13:55.23,0:14:00.51,Default,,0000,0000,0000,,and its motivation system is constructed\Nin such a way that it is motivated
Dialogue: 0,0:14:00.51,0:14:05.74,Default,,0000,0000,0000,,to pursue our values or to perform actions\Nthat it predicts we would approve of.
Dialogue: 0,0:14:05.74,0:14:09.15,Default,,0000,0000,0000,,We would thus leverage\Nits intelligence as much as possible
Dialogue: 0,0:14:09.15,0:14:11.90,Default,,0000,0000,0000,,to solve the problem of value-loading.
Dialogue: 0,0:14:12.73,0:14:14.24,Default,,0000,0000,0000,,This can happen,
Dialogue: 0,0:14:14.24,0:14:17.84,Default,,0000,0000,0000,,and the outcome could be\Nvery good for humanity.
Dialogue: 0,0:14:17.84,0:14:21.79,Default,,0000,0000,0000,,But it doesn't happen automatically.
Dialogue: 0,0:14:21.79,0:14:24.79,Default,,0000,0000,0000,,The initial conditions\Nfor the intelligence explosion
Dialogue: 0,0:14:24.79,0:14:27.65,Default,,0000,0000,0000,,might need to be set up\Nin just the right way
Dialogue: 0,0:14:27.65,0:14:31.18,Default,,0000,0000,0000,,if we are to have a controlled detonation.
Dialogue: 0,0:14:31.18,0:14:33.80,Default,,0000,0000,0000,,The values that the A.I. has\Nneed to match ours,
Dialogue: 0,0:14:33.80,0:14:35.56,Default,,0000,0000,0000,,not just in the familiar context,
Dialogue: 0,0:14:35.56,0:14:38.00,Default,,0000,0000,0000,,like where we can easily check\Nhow the A.I. behaves,
Dialogue: 0,0:14:38.00,0:14:41.23,Default,,0000,0000,0000,,but also in all novel contexts\Nthat the A.I. might encounter
Dialogue: 0,0:14:41.23,0:14:42.79,Default,,0000,0000,0000,,in the indefinite future.
Dialogue: 0,0:14:42.79,0:14:47.53,Default,,0000,0000,0000,,And there are also some esoteric issues\Nthat would need to be solved, sorted out:
Dialogue: 0,0:14:47.53,0:14:49.62,Default,,0000,0000,0000,,the exact details of its decision theory,
Dialogue: 0,0:14:49.62,0:14:52.48,Default,,0000,0000,0000,,how to deal with logical\Nuncertainty and so forth.
Dialogue: 0,0:14:53.33,0:14:56.43,Default,,0000,0000,0000,,So the technical problems that need\Nto be solved to make this work
Dialogue: 0,0:14:56.43,0:14:57.54,Default,,0000,0000,0000,,look quite difficult --
Dialogue: 0,0:14:57.54,0:15:00.92,Default,,0000,0000,0000,,not as difficult as making\Na superintelligent A.I.,
Dialogue: 0,0:15:00.92,0:15:03.79,Default,,0000,0000,0000,,but fairly difficult.
Dialogue: 0,0:15:03.79,0:15:05.49,Default,,0000,0000,0000,,Here is the worry:
Dialogue: 0,0:15:05.49,0:15:10.17,Default,,0000,0000,0000,,Making superintelligent A.I.\Nis a really hard challenge.
Dialogue: 0,0:15:10.17,0:15:12.72,Default,,0000,0000,0000,,Making superintelligent A.I. that is safe
Dialogue: 0,0:15:12.72,0:15:15.14,Default,,0000,0000,0000,,involves some additional\Nchallenge on top of that.
Dialogue: 0,0:15:16.22,0:15:19.70,Default,,0000,0000,0000,,The risk is that somebody figures out\Nhow to crack the first challenge
Dialogue: 0,0:15:19.70,0:15:22.70,Default,,0000,0000,0000,,without also having cracked\Nthe additional challenge
Dialogue: 0,0:15:22.70,0:15:24.60,Default,,0000,0000,0000,,of ensuring perfect safety.
Dialogue: 0,0:15:25.38,0:15:28.71,Default,,0000,0000,0000,,So I think that we should\Nwork out a solution
Dialogue: 0,0:15:28.71,0:15:31.53,Default,,0000,0000,0000,,to the control problem in advance,
Dialogue: 0,0:15:31.53,0:15:34.19,Default,,0000,0000,0000,,so that we have it available\Nby the time it is needed.
Dialogue: 0,0:15:34.77,0:15:38.28,Default,,0000,0000,0000,,Now it might be that we cannot solve\Nthe entire control problem in advance
Dialogue: 0,0:15:38.28,0:15:41.30,Default,,0000,0000,0000,,because maybe some elements\Ncan only be put in place
Dialogue: 0,0:15:41.30,0:15:45.30,Default,,0000,0000,0000,,once you know the details of the\Narchitecture where it will be implemented.
Dialogue: 0,0:15:45.30,0:15:48.68,Default,,0000,0000,0000,,But the more of the control problem\Nthat we solve in advance,
Dialogue: 0,0:15:48.68,0:15:52.77,Default,,0000,0000,0000,,the better the odds that the transition\Nto the machine intelligence era
Dialogue: 0,0:15:52.77,0:15:54.31,Default,,0000,0000,0000,,will go well.
Dialogue: 0,0:15:54.31,0:15:58.95,Default,,0000,0000,0000,,This to me looks like a thing\Nthat is well worth doing
Dialogue: 0,0:15:58.95,0:16:02.28,Default,,0000,0000,0000,,and I can imagine that if\Nthings turn out okay,
Dialogue: 0,0:16:02.28,0:16:06.94,Default,,0000,0000,0000,,people a million years from now\Nwill look back at this century
Dialogue: 0,0:16:06.94,0:16:10.94,Default,,0000,0000,0000,,and it might well be that they say that\Nthe one thing we did that really mattered
Dialogue: 0,0:16:10.94,0:16:12.51,Default,,0000,0000,0000,,was to get this thing right.
Dialogue: 0,0:16:12.51,0:16:14.20,Default,,0000,0000,0000,,Thank you.
Dialogue: 0,0:16:14.20,0:16:17.01,Default,,0000,0000,0000,,(Applause)