[Script Info]
Title:

[Events]
Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
Dialogue: 0,0:00:00.76,0:00:04.60,Default,,0000,0000,0000,,When I was a kid,\NI was the quintessential nerd.
Dialogue: 0,0:00:05.32,0:00:07.50,Default,,0000,0000,0000,,I think some of you were too.
Dialogue: 0,0:00:07.52,0:00:08.74,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:00:08.76,0:00:11.98,Default,,0000,0000,0000,,And you, sir, who laughed the loudest,\Nyou probably still are.
Dialogue: 0,0:00:12.00,0:00:14.26,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:00:14.28,0:00:17.78,Default,,0000,0000,0000,,I grew up in a small town\Nin the dusty plains of north Texas,
Dialogue: 0,0:00:17.80,0:00:21.14,Default,,0000,0000,0000,,the son of a sheriff\Nwho was the son of a pastor.
Dialogue: 0,0:00:21.16,0:00:23.08,Default,,0000,0000,0000,,Getting into trouble was not an option.
Dialogue: 0,0:00:24.04,0:00:27.30,Default,,0000,0000,0000,,And so I started reading\Ncalculus books for fun.
Dialogue: 0,0:00:27.32,0:00:28.86,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:00:28.88,0:00:30.58,Default,,0000,0000,0000,,You did too.
Dialogue: 0,0:00:30.60,0:00:34.34,Default,,0000,0000,0000,,That led me to building a laser\Nand a computer and model rockets,
Dialogue: 0,0:00:34.36,0:00:37.36,Default,,0000,0000,0000,,and that led me to making\Nrocket fuel in my bedroom.
Dialogue: 0,0:00:37.96,0:00:41.62,Default,,0000,0000,0000,,Now, in scientific terms,
Dialogue: 0,0:00:41.64,0:00:44.90,Default,,0000,0000,0000,,we call this a very bad idea.
Dialogue: 0,0:00:44.92,0:00:46.14,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:00:46.16,0:00:48.34,Default,,0000,0000,0000,,Around that same time,
Dialogue: 0,0:00:48.36,0:00:51.58,Default,,0000,0000,0000,,Stanley Kubrick's "2001: A Space Odyssey"\Ncame to the theaters,
Dialogue: 0,0:00:51.60,0:00:53.80,Default,,0000,0000,0000,,and my life was forever changed.
Dialogue: 0,0:00:54.28,0:00:56.34,Default,,0000,0000,0000,,I loved everything about that movie,
Dialogue: 0,0:00:56.36,0:00:58.90,Default,,0000,0000,0000,,especially the HAL 9000.
Dialogue: 0,0:00:58.92,0:01:00.98,Default,,0000,0000,0000,,Now, HAL was a sentient computer
Dialogue: 0,0:01:01.00,0:01:03.46,Default,,0000,0000,0000,,designed to guide the Discovery spacecraft
Dialogue: 0,0:01:03.48,0:01:06.02,Default,,0000,0000,0000,,from the Earth to Jupiter.
Dialogue: 0,0:01:06.04,0:01:08.10,Default,,0000,0000,0000,,HAL was also a flawed character,
Dialogue: 0,0:01:08.12,0:01:12.40,Default,,0000,0000,0000,,for in the end he chose\Nto value the mission over human life.
Dialogue: 0,0:01:12.84,0:01:14.94,Default,,0000,0000,0000,,Now, HAL was a fictional character,
Dialogue: 0,0:01:14.96,0:01:17.62,Default,,0000,0000,0000,,but nonetheless he speaks to our fears,
Dialogue: 0,0:01:17.64,0:01:19.74,Default,,0000,0000,0000,,our fears of being subjugated
Dialogue: 0,0:01:19.76,0:01:22.78,Default,,0000,0000,0000,,by some unfeeling, artificial intelligence
Dialogue: 0,0:01:22.80,0:01:24.76,Default,,0000,0000,0000,,who is indifferent to our humanity.
Dialogue: 0,0:01:25.88,0:01:28.46,Default,,0000,0000,0000,,I believe that such fears are unfounded.
Dialogue: 0,0:01:28.48,0:01:31.18,Default,,0000,0000,0000,,Indeed, we stand at a remarkable time
Dialogue: 0,0:01:31.20,0:01:32.74,Default,,0000,0000,0000,,in human history,
Dialogue: 0,0:01:32.76,0:01:37.74,Default,,0000,0000,0000,,where, driven by refusal to accept\Nthe limits of our bodies and our minds,
Dialogue: 0,0:01:37.76,0:01:39.46,Default,,0000,0000,0000,,we are building machines
Dialogue: 0,0:01:39.48,0:01:43.10,Default,,0000,0000,0000,,of exquisite, beautiful\Ncomplexity and grace
Dialogue: 0,0:01:43.12,0:01:45.18,Default,,0000,0000,0000,,that will extend the human experience
Dialogue: 0,0:01:45.20,0:01:46.88,Default,,0000,0000,0000,,in ways beyond our imagining.
Dialogue: 0,0:01:47.72,0:01:50.30,Default,,0000,0000,0000,,After a career that led me\Nfrom the Air Force Academy
Dialogue: 0,0:01:50.32,0:01:52.26,Default,,0000,0000,0000,,to Space Command to now,
Dialogue: 0,0:01:52.28,0:01:53.98,Default,,0000,0000,0000,,I became a systems engineer,
Dialogue: 0,0:01:54.00,0:01:56.74,Default,,0000,0000,0000,,and recently I was drawn\Ninto an engineering problem
Dialogue: 0,0:01:56.76,0:01:59.34,Default,,0000,0000,0000,,associated with NASA's mission to Mars.
Dialogue: 0,0:01:59.36,0:02:01.86,Default,,0000,0000,0000,,Now, in space flights to the Moon,
Dialogue: 0,0:02:01.88,0:02:05.02,Default,,0000,0000,0000,,we can rely upon\Nmission control in Houston
Dialogue: 0,0:02:05.04,0:02:07.02,Default,,0000,0000,0000,,to watch over all aspects of the flight.
Dialogue: 0,0:02:07.04,0:02:10.58,Default,,0000,0000,0000,,However, Mars is 200 times further away,
Dialogue: 0,0:02:10.60,0:02:13.82,Default,,0000,0000,0000,,and as a result it takes\Non average 13 minutes
Dialogue: 0,0:02:13.84,0:02:16.98,Default,,0000,0000,0000,,for a signal to travel\Nfrom the Earth to Mars.
Dialogue: 0,0:02:17.00,0:02:20.40,Default,,0000,0000,0000,,If there's trouble,\Nthere's not enough time.
Dialogue: 0,0:02:20.84,0:02:23.34,Default,,0000,0000,0000,,And so a reasonable engineering solution
Dialogue: 0,0:02:23.36,0:02:25.94,Default,,0000,0000,0000,,calls for us to put mission control
Dialogue: 0,0:02:25.96,0:02:28.98,Default,,0000,0000,0000,,inside the walls of the Orion spacecraft.
Dialogue: 0,0:02:29.00,0:02:31.90,Default,,0000,0000,0000,,Another fascinating idea\Nin the mission profile
Dialogue: 0,0:02:31.92,0:02:34.82,Default,,0000,0000,0000,,places humanoid robots\Non the surface of Mars
Dialogue: 0,0:02:34.84,0:02:36.70,Default,,0000,0000,0000,,before the humans themselves arrive,
Dialogue: 0,0:02:36.72,0:02:38.38,Default,,0000,0000,0000,,first to build facilities
Dialogue: 0,0:02:38.40,0:02:41.76,Default,,0000,0000,0000,,and later to serve as collaborative\Nmembers of the science team.
Dialogue: 0,0:02:43.40,0:02:46.14,Default,,0000,0000,0000,,Now, as I looked at this\Nfrom an engineering perspective,
Dialogue: 0,0:02:46.16,0:02:49.34,Default,,0000,0000,0000,,it became very clear to me\Nthat what I needed to architect
Dialogue: 0,0:02:49.36,0:02:51.54,Default,,0000,0000,0000,,was a smart, collaborative,
Dialogue: 0,0:02:51.56,0:02:53.94,Default,,0000,0000,0000,,socially intelligent\Nartificial intelligence.
Dialogue: 0,0:02:53.96,0:02:58.26,Default,,0000,0000,0000,,In other words, I needed to build\Nsomething very much like a HAL
Dialogue: 0,0:02:58.28,0:03:00.70,Default,,0000,0000,0000,,but without the homicidal tendencies.
Dialogue: 0,0:03:00.72,0:03:02.08,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:03:02.92,0:03:04.74,Default,,0000,0000,0000,,Let's pause for a moment.
Dialogue: 0,0:03:04.76,0:03:08.66,Default,,0000,0000,0000,,Is it really possible to build\Nan artificial intelligence like that?
Dialogue: 0,0:03:08.68,0:03:10.14,Default,,0000,0000,0000,,Actually, it is.
Dialogue: 0,0:03:10.16,0:03:11.42,Default,,0000,0000,0000,,In many ways,
Dialogue: 0,0:03:11.44,0:03:13.42,Default,,0000,0000,0000,,this is a hard engineering problem
Dialogue: 0,0:03:13.44,0:03:14.90,Default,,0000,0000,0000,,with elements of AI,
Dialogue: 0,0:03:14.92,0:03:19.62,Default,,0000,0000,0000,,not some wet hair ball of an AI problem\Nthat needs to be engineered.
Dialogue: 0,0:03:19.64,0:03:22.30,Default,,0000,0000,0000,,To paraphrase Alan Turing,
Dialogue: 0,0:03:22.32,0:03:24.70,Default,,0000,0000,0000,,I'm not interested\Nin building a sentient machine.
Dialogue: 0,0:03:24.72,0:03:26.30,Default,,0000,0000,0000,,I'm not building a HAL.
Dialogue: 0,0:03:26.32,0:03:28.74,Default,,0000,0000,0000,,All I'm after is a simple brain,
Dialogue: 0,0:03:28.76,0:03:31.88,Default,,0000,0000,0000,,something that offers\Nthe illusion of intelligence.
Dialogue: 0,0:03:33.00,0:03:36.14,Default,,0000,0000,0000,,The art and the science of computing\Nhave come a long way
Dialogue: 0,0:03:36.16,0:03:37.66,Default,,0000,0000,0000,,since HAL was onscreen,
Dialogue: 0,0:03:37.68,0:03:40.90,Default,,0000,0000,0000,,and I'd imagine if his inventor\NDr. Chandra were here today,
Dialogue: 0,0:03:40.92,0:03:43.26,Default,,0000,0000,0000,,he'd have a whole lot of questions for us.
Dialogue: 0,0:03:43.28,0:03:45.38,Default,,0000,0000,0000,,Is it really possible for us
Dialogue: 0,0:03:45.40,0:03:49.42,Default,,0000,0000,0000,,to take a system of millions\Nupon millions of devices,
Dialogue: 0,0:03:49.44,0:03:50.90,Default,,0000,0000,0000,,to read in their data streams,
Dialogue: 0,0:03:50.92,0:03:53.18,Default,,0000,0000,0000,,to predict their failures\Nand act in advance?
Dialogue: 0,0:03:53.20,0:03:54.42,Default,,0000,0000,0000,,Yes.
Dialogue: 0,0:03:54.44,0:03:57.62,Default,,0000,0000,0000,,Can we build systems that converse\Nwith humans in natural language?
Dialogue: 0,0:03:57.64,0:03:58.86,Default,,0000,0000,0000,,Yes.
Dialogue: 0,0:03:58.88,0:04:01.86,Default,,0000,0000,0000,,Can we build systems\Nthat recognize objects, identify emotions,
Dialogue: 0,0:04:01.88,0:04:05.26,Default,,0000,0000,0000,,emote themselves,\Nplay games, and even read lips?
Dialogue: 0,0:04:05.28,0:04:06.50,Default,,0000,0000,0000,,Yes.
Dialogue: 0,0:04:06.52,0:04:08.66,Default,,0000,0000,0000,,Can we build a system that sets goals,
Dialogue: 0,0:04:08.68,0:04:12.30,Default,,0000,0000,0000,,that carries out plans against those goals\Nand learns along the way?
Dialogue: 0,0:04:12.32,0:04:13.54,Default,,0000,0000,0000,,Yes.
Dialogue: 0,0:04:13.56,0:04:16.90,Default,,0000,0000,0000,,Can we build systems\Nthat have a theory of mind?
Dialogue: 0,0:04:16.92,0:04:18.42,Default,,0000,0000,0000,,This we are learning to do.
Dialogue: 0,0:04:18.44,0:04:21.92,Default,,0000,0000,0000,,Can we build systems that have\Nan ethical and moral foundation?
Dialogue: 0,0:04:22.48,0:04:24.52,Default,,0000,0000,0000,,This we must learn how to do.
Dialogue: 0,0:04:25.36,0:04:26.74,Default,,0000,0000,0000,,So let's accept for a moment
Dialogue: 0,0:04:26.76,0:04:29.66,Default,,0000,0000,0000,,that it's possible to build\Nsuch an artificial intelligence
Dialogue: 0,0:04:29.68,0:04:31.82,Default,,0000,0000,0000,,for this kind of mission and others.
Dialogue: 0,0:04:31.84,0:04:34.38,Default,,0000,0000,0000,,The next question\Nyou must ask yourself is,
Dialogue: 0,0:04:34.40,0:04:35.86,Default,,0000,0000,0000,,should we fear it?
Dialogue: 0,0:04:35.88,0:04:37.86,Default,,0000,0000,0000,,Now, every new technology
Dialogue: 0,0:04:37.88,0:04:40.78,Default,,0000,0000,0000,,brings with it\Nsome measure of trepidation.
Dialogue: 0,0:04:40.80,0:04:42.50,Default,,0000,0000,0000,,When we first saw cars,
Dialogue: 0,0:04:42.52,0:04:46.54,Default,,0000,0000,0000,,people lamented that we would see\Nthe destruction of the family.
Dialogue: 0,0:04:46.56,0:04:49.26,Default,,0000,0000,0000,,When we first saw telephones come in,
Dialogue: 0,0:04:49.28,0:04:52.18,Default,,0000,0000,0000,,people would worry it would destroy\Nall civil conversation.
Dialogue: 0,0:04:52.20,0:04:56.14,Default,,0000,0000,0000,,At a point in time when we saw\Nthe written word become pervasive,
Dialogue: 0,0:04:56.16,0:04:58.66,Default,,0000,0000,0000,,people thought we would lose\Nour ability to memorize.
Dialogue: 0,0:04:58.68,0:05:00.74,Default,,0000,0000,0000,,These things are all true to a degree,
Dialogue: 0,0:05:00.76,0:05:03.18,Default,,0000,0000,0000,,but it's also the case\Nthat these technologies
Dialogue: 0,0:05:03.20,0:05:06.58,Default,,0000,0000,0000,,brought to us things\Nthat extended the human experience
Dialogue: 0,0:05:06.60,0:05:08.48,Default,,0000,0000,0000,,in some profound ways.
Dialogue: 0,0:05:09.84,0:05:12.12,Default,,0000,0000,0000,,So let's take this a little further.
Dialogue: 0,0:05:13.12,0:05:17.86,Default,,0000,0000,0000,,I do not fear the creation\Nof an AI like this,
Dialogue: 0,0:05:17.88,0:05:21.70,Default,,0000,0000,0000,,because it will eventually\Nembody some of our values.
Dialogue: 0,0:05:21.72,0:05:25.22,Default,,0000,0000,0000,,Consider this: building a cognitive system\Nis fundamentally different
Dialogue: 0,0:05:25.24,0:05:28.54,Default,,0000,0000,0000,,than building a traditional\Nsoftware-intensive system of the past.
Dialogue: 0,0:05:28.56,0:05:31.02,Default,,0000,0000,0000,,We don't program them. We teach them.
Dialogue: 0,0:05:31.04,0:05:33.70,Default,,0000,0000,0000,,In order to teach a system\Nhow to recognize flowers,
Dialogue: 0,0:05:33.72,0:05:36.74,Default,,0000,0000,0000,,I show it thousands of flowers\Nof the kinds I like.
Dialogue: 0,0:05:36.76,0:05:39.02,Default,,0000,0000,0000,,In order to teach a system\Nhow to play a game --
Dialogue: 0,0:05:39.04,0:05:41.00,Default,,0000,0000,0000,,Well, I would. You would too.
Dialogue: 0,0:05:42.60,0:05:44.64,Default,,0000,0000,0000,,I like flowers. Come on.
Dialogue: 0,0:05:45.44,0:05:48.30,Default,,0000,0000,0000,,To teach a system\Nhow to play a game like Go,
Dialogue: 0,0:05:48.32,0:05:50.38,Default,,0000,0000,0000,,I'd have it play thousands of games of Go,
Dialogue: 0,0:05:50.40,0:05:52.06,Default,,0000,0000,0000,,but in the process I also teach it
Dialogue: 0,0:05:52.08,0:05:54.50,Default,,0000,0000,0000,,how to discern\Na good game from a bad game.
Dialogue: 0,0:05:54.52,0:05:58.22,Default,,0000,0000,0000,,If I want to create an artificially\Nintelligent legal assistant,
Dialogue: 0,0:05:58.24,0:06:00.02,Default,,0000,0000,0000,,I will teach it some corpus of law
Dialogue: 0,0:06:00.04,0:06:02.90,Default,,0000,0000,0000,,but at the same time I am fusing with it
Dialogue: 0,0:06:02.92,0:06:05.80,Default,,0000,0000,0000,,the sense of mercy and justice\Nthat is part of that law.
Dialogue: 0,0:06:06.56,0:06:09.54,Default,,0000,0000,0000,,In scientific terms,\Nthis is what we call ground truth,
Dialogue: 0,0:06:09.56,0:06:11.58,Default,,0000,0000,0000,,and here's the important point:
Dialogue: 0,0:06:11.60,0:06:13.06,Default,,0000,0000,0000,,in producing these machines,
Dialogue: 0,0:06:13.08,0:06:16.50,Default,,0000,0000,0000,,we are therefore teaching them\Na sense of our values.
Dialogue: 0,0:06:16.52,0:06:19.66,Default,,0000,0000,0000,,To that end, I trust\Nin artificial intelligence
Dialogue: 0,0:06:19.68,0:06:23.32,Default,,0000,0000,0000,,the same, if not more,\Nas a human who is well-trained.
Dialogue: 0,0:06:24.08,0:06:25.30,Default,,0000,0000,0000,,But, you may ask,
Dialogue: 0,0:06:25.32,0:06:27.94,Default,,0000,0000,0000,,what about rogue agents,
Dialogue: 0,0:06:27.96,0:06:31.30,Default,,0000,0000,0000,,some well-funded\Nnongovernment organization?
Dialogue: 0,0:06:31.32,0:06:35.14,Default,,0000,0000,0000,,I do not fear an artificial intelligence\Nin the hand of a lone wolf.
Dialogue: 0,0:06:35.16,0:06:39.70,Default,,0000,0000,0000,,Clearly, we cannot protect ourselves\Nagainst all random acts of violence,
Dialogue: 0,0:06:39.72,0:06:41.86,Default,,0000,0000,0000,,but the reality is such a system
Dialogue: 0,0:06:41.88,0:06:44.98,Default,,0000,0000,0000,,requires substantial training\Nand subtle training
Dialogue: 0,0:06:45.00,0:06:47.30,Default,,0000,0000,0000,,far beyond the resources of an individual.
Dialogue: 0,0:06:47.32,0:06:48.54,Default,,0000,0000,0000,,And furthermore,
Dialogue: 0,0:06:48.56,0:06:51.82,Default,,0000,0000,0000,,it's far more than just injecting\Nan Internet virus to the world,
Dialogue: 0,0:06:51.84,0:06:54.94,Default,,0000,0000,0000,,where you push a button,\Nall of a sudden it's in a million places
Dialogue: 0,0:06:54.96,0:06:57.42,Default,,0000,0000,0000,,and laptops start blowing up\Nall over the place.
Dialogue: 0,0:06:57.44,0:07:00.26,Default,,0000,0000,0000,,Now, these kinds of substances\Nare much larger
Dialogue: 0,0:07:00.28,0:07:02.00,Default,,0000,0000,0000,,and we'll certainly see them coming.
Dialogue: 0,0:07:02.52,0:07:05.58,Default,,0000,0000,0000,,Do I fear that such\Nan artificial intelligence
Dialogue: 0,0:07:05.60,0:07:07.56,Default,,0000,0000,0000,,might threaten all of humanity?
Dialogue: 0,0:07:08.28,0:07:12.66,Default,,0000,0000,0000,,If you look at movies\Nsuch as "The Matrix," "Metropolis,"
Dialogue: 0,0:07:12.68,0:07:15.86,Default,,0000,0000,0000,,"The Terminator,"\Nshows such as "Westworld,"
Dialogue: 0,0:07:15.88,0:07:18.02,Default,,0000,0000,0000,,they all speak of this kind of fear.
Dialogue: 0,0:07:18.04,0:07:22.34,Default,,0000,0000,0000,,Indeed, in the book "Superintelligence"\Nby the philosopher Nick Bostrom,
Dialogue: 0,0:07:22.36,0:07:23.90,Default,,0000,0000,0000,,he picks up on this theme
Dialogue: 0,0:07:23.92,0:07:27.94,Default,,0000,0000,0000,,and observes that a superintelligence\Nmight not only be dangerous,
Dialogue: 0,0:07:27.96,0:07:31.82,Default,,0000,0000,0000,,it could represent an existential threat\Nto all of humanity.
Dialogue: 0,0:07:31.84,0:07:34.06,Default,,0000,0000,0000,,Dr. Bostrom's basic argument
Dialogue: 0,0:07:34.08,0:07:36.82,Default,,0000,0000,0000,,is that such systems will eventually
Dialogue: 0,0:07:36.84,0:07:40.10,Default,,0000,0000,0000,,have such an insatiable\Nthirst for information
Dialogue: 0,0:07:40.12,0:07:43.02,Default,,0000,0000,0000,,that they will perhaps learn how to learn
Dialogue: 0,0:07:43.04,0:07:45.66,Default,,0000,0000,0000,,and eventually discover\Nthat they may have goals
Dialogue: 0,0:07:45.68,0:07:47.98,Default,,0000,0000,0000,,that are contrary to human needs.
Dialogue: 0,0:07:48.00,0:07:49.86,Default,,0000,0000,0000,,Dr. Bostrom has a number of followers.
Dialogue: 0,0:07:49.88,0:07:54.20,Default,,0000,0000,0000,,He is supported by people\Nsuch as Elon Musk and Stephen Hawking.
Dialogue: 0,0:07:54.88,0:07:57.28,Default,,0000,0000,0000,,With all due respect
Dialogue: 0,0:07:58.16,0:08:00.18,Default,,0000,0000,0000,,to these brilliant minds,
Dialogue: 0,0:08:00.20,0:08:02.46,Default,,0000,0000,0000,,I believe that they\Nare fundamentally wrong.
Dialogue: 0,0:08:02.48,0:08:05.66,Default,,0000,0000,0000,,Now, there are a lot of pieces\Nof Dr. Bostrom's argument to unpack,
Dialogue: 0,0:08:05.68,0:08:07.82,Default,,0000,0000,0000,,and I don't have time to unpack them all,
Dialogue: 0,0:08:07.84,0:08:10.54,Default,,0000,0000,0000,,but very briefly, consider this:
Dialogue: 0,0:08:10.56,0:08:14.30,Default,,0000,0000,0000,,super knowing is very different\Nthan super doing.
Dialogue: 0,0:08:14.32,0:08:16.22,Default,,0000,0000,0000,,HAL was a threat to the Discovery crew
Dialogue: 0,0:08:16.24,0:08:20.66,Default,,0000,0000,0000,,only insofar as HAL commanded\Nall aspects of the Discovery.
Dialogue: 0,0:08:20.68,0:08:23.18,Default,,0000,0000,0000,,So it would have to be\Nwith a superintelligence.
Dialogue: 0,0:08:23.20,0:08:25.70,Default,,0000,0000,0000,,It would have to have dominion\Nover all of our world.
Dialogue: 0,0:08:25.72,0:08:28.54,Default,,0000,0000,0000,,This is the stuff of Skynet\Nfrom the movie "The Terminator"
Dialogue: 0,0:08:28.56,0:08:30.42,Default,,0000,0000,0000,,in which we had a superintelligence
Dialogue: 0,0:08:30.44,0:08:31.82,Default,,0000,0000,0000,,that commanded human will,
Dialogue: 0,0:08:31.84,0:08:35.70,Default,,0000,0000,0000,,that directed every device\Nthat was in every corner of the world.
Dialogue: 0,0:08:35.72,0:08:37.18,Default,,0000,0000,0000,,Practically speaking,
Dialogue: 0,0:08:37.20,0:08:39.30,Default,,0000,0000,0000,,it ain't gonna happen.
Dialogue: 0,0:08:39.32,0:08:42.38,Default,,0000,0000,0000,,We are not building AIs\Nthat control the weather,
Dialogue: 0,0:08:42.40,0:08:43.74,Default,,0000,0000,0000,,that direct the tides,
Dialogue: 0,0:08:43.76,0:08:47.14,Default,,0000,0000,0000,,that command us\Ncapricious, chaotic humans.
Dialogue: 0,0:08:47.16,0:08:51.06,Default,,0000,0000,0000,,And furthermore, if such\Nan artificial intelligence existed,
Dialogue: 0,0:08:51.08,0:08:54.02,Default,,0000,0000,0000,,it would have to compete\Nwith human economies,
Dialogue: 0,0:08:54.04,0:08:56.56,Default,,0000,0000,0000,,and thereby compete for resources with us.
Dialogue: 0,0:08:57.20,0:08:58.42,Default,,0000,0000,0000,,And in the end --
Dialogue: 0,0:08:58.44,0:08:59.68,Default,,0000,0000,0000,,don't tell Siri this --
Dialogue: 0,0:09:00.44,0:09:01.82,Default,,0000,0000,0000,,we can always unplug them.
Dialogue: 0,0:09:01.84,0:09:03.96,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:09:05.36,0:09:07.82,Default,,0000,0000,0000,,We are on an incredible journey
Dialogue: 0,0:09:07.84,0:09:10.34,Default,,0000,0000,0000,,of coevolution with our machines.
Dialogue: 0,0:09:10.36,0:09:12.86,Default,,0000,0000,0000,,The humans we are today
Dialogue: 0,0:09:12.88,0:09:15.42,Default,,0000,0000,0000,,are not the humans we will be then.
Dialogue: 0,0:09:15.44,0:09:18.58,Default,,0000,0000,0000,,To worry now about the rise\Nof a superintelligence
Dialogue: 0,0:09:18.60,0:09:21.66,Default,,0000,0000,0000,,is in many ways a dangerous distraction
Dialogue: 0,0:09:21.68,0:09:24.02,Default,,0000,0000,0000,,because the rise of computing itself
Dialogue: 0,0:09:24.04,0:09:27.06,Default,,0000,0000,0000,,brings to us a number\Nof human and societal issues
Dialogue: 0,0:09:27.08,0:09:28.72,Default,,0000,0000,0000,,to which we must now attend.
Dialogue: 0,0:09:29.36,0:09:32.18,Default,,0000,0000,0000,,How shall I best organize society
Dialogue: 0,0:09:32.20,0:09:34.54,Default,,0000,0000,0000,,when the need for human labor diminishes?
Dialogue: 0,0:09:34.56,0:09:38.38,Default,,0000,0000,0000,,How can I bring understanding\Nand education throughout the globe
Dialogue: 0,0:09:38.40,0:09:40.18,Default,,0000,0000,0000,,and still respect our differences?
Dialogue: 0,0:09:40.20,0:09:44.46,Default,,0000,0000,0000,,How might I extend and enhance human life\Nthrough cognitive healthcare?
Dialogue: 0,0:09:44.48,0:09:47.34,Default,,0000,0000,0000,,How might I use computing
Dialogue: 0,0:09:47.36,0:09:49.12,Default,,0000,0000,0000,,to help take us to the stars?
Dialogue: 0,0:09:49.76,0:09:51.80,Default,,0000,0000,0000,,And that's the exciting thing.
Dialogue: 0,0:09:52.40,0:09:54.74,Default,,0000,0000,0000,,The opportunities to use computing
Dialogue: 0,0:09:54.76,0:09:56.30,Default,,0000,0000,0000,,to advance the human experience
Dialogue: 0,0:09:56.32,0:09:57.74,Default,,0000,0000,0000,,are within our reach,
Dialogue: 0,0:09:57.76,0:09:59.62,Default,,0000,0000,0000,,here and now,
Dialogue: 0,0:09:59.64,0:10:01.32,Default,,0000,0000,0000,,and we are just beginning.
Dialogue: 0,0:10:02.28,0:10:03.50,Default,,0000,0000,0000,,Thank you very much.
Dialogue: 0,0:10:03.52,0:10:05.72,Default,,0000,0000,0000,,(Applause)