[Script Info]
Title:

[Events]
Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
Dialogue: 0,0:00:00.76,0:00:05.18,Default,,0000,0000,0000,,After 13.8 billion years\Nof cosmic history,
Dialogue: 0,0:00:05.20,0:00:07.30,Default,,0000,0000,0000,,our universe has woken up
Dialogue: 0,0:00:07.32,0:00:08.84,Default,,0000,0000,0000,,and become aware of itself.
Dialogue: 0,0:00:09.48,0:00:11.42,Default,,0000,0000,0000,,From a small blue planet,
Dialogue: 0,0:00:11.44,0:00:15.58,Default,,0000,0000,0000,,tiny, conscious parts of our universe\Nhave begun gazing out into the cosmos
Dialogue: 0,0:00:15.60,0:00:16.98,Default,,0000,0000,0000,,with telescopes,
Dialogue: 0,0:00:17.00,0:00:18.48,Default,,0000,0000,0000,,discovering something humbling.
Dialogue: 0,0:00:19.32,0:00:22.22,Default,,0000,0000,0000,,We've discovered that our universe\Nis vastly grander
Dialogue: 0,0:00:22.24,0:00:23.58,Default,,0000,0000,0000,,than our ancestors imagined
Dialogue: 0,0:00:23.60,0:00:27.86,Default,,0000,0000,0000,,and that life seems to be an almost\Nimperceptibly small perturbation
Dialogue: 0,0:00:27.88,0:00:29.60,Default,,0000,0000,0000,,on an otherwise dead universe.
Dialogue: 0,0:00:30.32,0:00:33.34,Default,,0000,0000,0000,,But we've also discovered\Nsomething inspiring,
Dialogue: 0,0:00:33.36,0:00:36.34,Default,,0000,0000,0000,,which is that the technology\Nwe're developing has the potential
Dialogue: 0,0:00:36.36,0:00:39.22,Default,,0000,0000,0000,,to help life flourish like never before,
Dialogue: 0,0:00:39.24,0:00:42.34,Default,,0000,0000,0000,,not just for centuries\Nbut for billions of years,
Dialogue: 0,0:00:42.36,0:00:46.48,Default,,0000,0000,0000,,and not just on Earth but throughout\Nmuch of this amazing cosmos.
Dialogue: 0,0:00:47.68,0:00:51.02,Default,,0000,0000,0000,,I think of the earliest life as "Life 1.0"
Dialogue: 0,0:00:51.04,0:00:52.42,Default,,0000,0000,0000,,because it was really dumb,
Dialogue: 0,0:00:52.44,0:00:56.74,Default,,0000,0000,0000,,like bacteria, unable to learn\Nanything during its lifetime.
Dialogue: 0,0:00:56.76,0:01:00.14,Default,,0000,0000,0000,,I think of us humans as Life 2.0\Nbecause we can learn,
Dialogue: 0,0:01:00.16,0:01:01.66,Default,,0000,0000,0000,,which we, in nerdy geek speak,
Dialogue: 0,0:01:01.68,0:01:04.90,Default,,0000,0000,0000,,might think of as installing\Nnew software into our brains,
Dialogue: 0,0:01:04.92,0:01:07.04,Default,,0000,0000,0000,,like languages and job skills.
Dialogue: 0,0:01:07.68,0:01:11.98,Default,,0000,0000,0000,,Life 3.0, which can design not only\Nits software but also its hardware,
Dialogue: 0,0:01:12.00,0:01:13.66,Default,,0000,0000,0000,,of course, doesn't exist yet.
Dialogue: 0,0:01:13.68,0:01:17.46,Default,,0000,0000,0000,,But perhaps our technology\Nhas already made us Life 2.1,
Dialogue: 0,0:01:17.48,0:01:21.82,Default,,0000,0000,0000,,with our artificial knees,\Npacemakers and cochlear implants.
Dialogue: 0,0:01:21.84,0:01:25.72,Default,,0000,0000,0000,,So let's take a closer look\Nat our relationship with technology, OK?
Dialogue: 0,0:01:26.80,0:01:28.02,Default,,0000,0000,0000,,As an example,
Dialogue: 0,0:01:28.04,0:01:33.34,Default,,0000,0000,0000,,the Apollo 11 moon mission\Nwas both successful and inspiring,
Dialogue: 0,0:01:33.36,0:01:36.38,Default,,0000,0000,0000,,showing that when we humans\Nuse technology wisely,
Dialogue: 0,0:01:36.40,0:01:40.34,Default,,0000,0000,0000,,we can accomplish things\Nthat our ancestors could only dream of.
Dialogue: 0,0:01:40.36,0:01:43.34,Default,,0000,0000,0000,,But there's an even more inspiring journey
Dialogue: 0,0:01:43.36,0:01:46.04,Default,,0000,0000,0000,,propelled by something\Nmore powerful than rocket engines,
Dialogue: 0,0:01:47.20,0:01:49.54,Default,,0000,0000,0000,,where the passengers\Naren't just three astronauts
Dialogue: 0,0:01:49.56,0:01:51.34,Default,,0000,0000,0000,,but all of humanity.
Dialogue: 0,0:01:51.36,0:01:54.30,Default,,0000,0000,0000,,Let's talk about our collective\Njourney into the future
Dialogue: 0,0:01:54.32,0:01:56.32,Default,,0000,0000,0000,,with artificial intelligence.
Dialogue: 0,0:01:56.96,0:02:01.50,Default,,0000,0000,0000,,My friend Jaan Tallinn likes to point out\Nthat just as with rocketry,
Dialogue: 0,0:02:01.52,0:02:04.68,Default,,0000,0000,0000,,it's not enough to make\Nour technology powerful.
Dialogue: 0,0:02:05.56,0:02:08.74,Default,,0000,0000,0000,,We also have to figure out,\Nif we're going to be really ambitious,
Dialogue: 0,0:02:08.76,0:02:10.18,Default,,0000,0000,0000,,how to steer it
Dialogue: 0,0:02:10.20,0:02:11.88,Default,,0000,0000,0000,,and where we want to go with it.
Dialogue: 0,0:02:12.88,0:02:15.72,Default,,0000,0000,0000,,So let's talk about all three\Nfor artificial intelligence:
Dialogue: 0,0:02:16.44,0:02:19.50,Default,,0000,0000,0000,,the power, the steering\Nand the destination.
Dialogue: 0,0:02:19.52,0:02:20.81,Default,,0000,0000,0000,,Let's start with the power.
Dialogue: 0,0:02:21.60,0:02:24.70,Default,,0000,0000,0000,,I define intelligence very inclusively --
Dialogue: 0,0:02:24.72,0:02:29.06,Default,,0000,0000,0000,,simply as our ability\Nto accomplish complex goals,
Dialogue: 0,0:02:29.08,0:02:32.90,Default,,0000,0000,0000,,because I want to include both\Nbiological and artificial intelligence
Dialogue: 0,0:02:32.92,0:02:36.94,Default,,0000,0000,0000,,and I want to avoid\Nthe silly carbon-chauvinism idea
Dialogue: 0,0:02:36.96,0:02:39.32,Default,,0000,0000,0000,,that you can only be smart\Nif you're made of meat.
Dialogue: 0,0:02:40.88,0:02:45.06,Default,,0000,0000,0000,,It's really amazing how the power\Nof AI has grown recently.
Dialogue: 0,0:02:45.08,0:02:46.34,Default,,0000,0000,0000,,Just think about it.
Dialogue: 0,0:02:46.36,0:02:49.56,Default,,0000,0000,0000,,Not long ago, robots couldn't walk.
Dialogue: 0,0:02:51.04,0:02:52.76,Default,,0000,0000,0000,,Now, they can do backflips.
Dialogue: 0,0:02:54.08,0:02:55.90,Default,,0000,0000,0000,,Not long ago,
Dialogue: 0,0:02:55.92,0:02:57.68,Default,,0000,0000,0000,,we didn't have self-driving cars.
Dialogue: 0,0:02:58.92,0:03:01.40,Default,,0000,0000,0000,,Now, we have self-flying rockets.
Dialogue: 0,0:03:03.96,0:03:05.38,Default,,0000,0000,0000,,Not long ago,
Dialogue: 0,0:03:05.40,0:03:08.02,Default,,0000,0000,0000,,AI couldn't do face recognition.
Dialogue: 0,0:03:08.04,0:03:11.02,Default,,0000,0000,0000,,Now, AI can generate fake faces
Dialogue: 0,0:03:11.04,0:03:15.20,Default,,0000,0000,0000,,and simulate your face\Nsaying stuff that you never said.
Dialogue: 0,0:03:16.40,0:03:17.98,Default,,0000,0000,0000,,Not long ago,
Dialogue: 0,0:03:18.00,0:03:19.88,Default,,0000,0000,0000,,AI couldn't beat us at the game of Go.
Dialogue: 0,0:03:20.40,0:03:25.50,Default,,0000,0000,0000,,Then, Google DeepMind's AlphaZero AI\Ntook 3,000 years of human Go games
Dialogue: 0,0:03:25.52,0:03:26.78,Default,,0000,0000,0000,,and Go wisdom,
Dialogue: 0,0:03:26.80,0:03:31.78,Default,,0000,0000,0000,,ignored it all and became the world's best\Nplayer by just playing against itself.
Dialogue: 0,0:03:31.80,0:03:35.50,Default,,0000,0000,0000,,And the most impressive feat here\Nwasn't that it crushed human gamers,
Dialogue: 0,0:03:35.52,0:03:38.10,Default,,0000,0000,0000,,but that it crushed human AI researchers
Dialogue: 0,0:03:38.12,0:03:41.80,Default,,0000,0000,0000,,who had spent decades\Nhandcrafting game-playing software.
Dialogue: 0,0:03:42.20,0:03:46.86,Default,,0000,0000,0000,,And AlphaZero crushed human AI researchers\Nnot just in Go but even at chess,
Dialogue: 0,0:03:46.88,0:03:49.36,Default,,0000,0000,0000,,which we have been working on since 1950.
Dialogue: 0,0:03:50.00,0:03:54.24,Default,,0000,0000,0000,,So all this amazing recent progress in AI\Nreally begs the question:
Dialogue: 0,0:03:55.28,0:03:56.84,Default,,0000,0000,0000,,how far will it go?
Dialogue: 0,0:03:57.80,0:03:59.50,Default,,0000,0000,0000,,I like to think about this question
Dialogue: 0,0:03:59.52,0:04:02.50,Default,,0000,0000,0000,,in terms of this abstract\Nlandscape of tasks,
Dialogue: 0,0:04:02.52,0:04:05.98,Default,,0000,0000,0000,,where the elevation represents\Nhow hard it is for AI to do each task
Dialogue: 0,0:04:06.00,0:04:07.22,Default,,0000,0000,0000,,at human level,
Dialogue: 0,0:04:07.24,0:04:10.00,Default,,0000,0000,0000,,and the sea level represents\Nwhat AI can do today.
Dialogue: 0,0:04:11.12,0:04:13.18,Default,,0000,0000,0000,,The sea level is rising\Nas the AI improves,
Dialogue: 0,0:04:13.20,0:04:16.64,Default,,0000,0000,0000,,so there's a kind of global warming\Ngoing on here in the task landscape.
Dialogue: 0,0:04:18.04,0:04:21.38,Default,,0000,0000,0000,,And the obvious takeaway\Nis to avoid careers at the waterfront --
Dialogue: 0,0:04:21.40,0:04:22.66,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:04:22.68,0:04:25.54,Default,,0000,0000,0000,,which will soon be\Nautomated and disrupted.
Dialogue: 0,0:04:25.56,0:04:28.54,Default,,0000,0000,0000,,But there's a much\Nbigger question as well.
Dialogue: 0,0:04:28.56,0:04:30.37,Default,,0000,0000,0000,,How high will the water end up rising?
Dialogue: 0,0:04:31.44,0:04:34.64,Default,,0000,0000,0000,,Will it eventually rise\Nto flood everything,
Dialogue: 0,0:04:35.84,0:04:38.34,Default,,0000,0000,0000,,matching human intelligence at all tasks?
Dialogue: 0,0:04:38.36,0:04:42.10,Default,,0000,0000,0000,,This is the definition\Nof artificial general intelligence --
Dialogue: 0,0:04:42.12,0:04:43.42,Default,,0000,0000,0000,,AGI,
Dialogue: 0,0:04:43.44,0:04:46.52,Default,,0000,0000,0000,,which has been the holy grail\Nof AI research since its inception.
Dialogue: 0,0:04:47.00,0:04:48.78,Default,,0000,0000,0000,,By this definition, people who say,
Dialogue: 0,0:04:48.80,0:04:52.22,Default,,0000,0000,0000,,"Ah, there will always be jobs\Nthat humans can do better than machines,"
Dialogue: 0,0:04:52.24,0:04:55.16,Default,,0000,0000,0000,,are simply saying\Nthat we'll never get AGI.
Dialogue: 0,0:04:55.68,0:04:59.26,Default,,0000,0000,0000,,Sure, we might still choose\Nto have some human jobs
Dialogue: 0,0:04:59.28,0:05:02.38,Default,,0000,0000,0000,,or to give humans income\Nand purpose with our jobs,
Dialogue: 0,0:05:02.40,0:05:06.14,Default,,0000,0000,0000,,but AGI will in any case\Ntransform life as we know it
Dialogue: 0,0:05:06.16,0:05:08.90,Default,,0000,0000,0000,,with humans no longer being\Nthe most intelligent.
Dialogue: 0,0:05:08.92,0:05:12.62,Default,,0000,0000,0000,,Now, if the water level does reach AGI,
Dialogue: 0,0:05:12.64,0:05:17.94,Default,,0000,0000,0000,,then further AI progress will be driven\Nmainly not by humans but by AI,
Dialogue: 0,0:05:17.96,0:05:19.82,Default,,0000,0000,0000,,which means that there's a possibility
Dialogue: 0,0:05:19.84,0:05:22.18,Default,,0000,0000,0000,,that further AI progress\Ncould be way faster
Dialogue: 0,0:05:22.20,0:05:25.58,Default,,0000,0000,0000,,than the typical human research\Nand development timescale of years,
Dialogue: 0,0:05:25.60,0:05:29.62,Default,,0000,0000,0000,,raising the controversial possibility\Nof an intelligence explosion
Dialogue: 0,0:05:29.64,0:05:31.94,Default,,0000,0000,0000,,where recursively self-improving AI
Dialogue: 0,0:05:31.96,0:05:35.38,Default,,0000,0000,0000,,rapidly leaves human\Nintelligence far behind,
Dialogue: 0,0:05:35.40,0:05:37.84,Default,,0000,0000,0000,,creating what's known\Nas superintelligence.
Dialogue: 0,0:05:39.80,0:05:42.08,Default,,0000,0000,0000,,Alright, reality check:
Dialogue: 0,0:05:43.12,0:05:45.56,Default,,0000,0000,0000,,are we going to get AGI any time soon?
Dialogue: 0,0:05:46.36,0:05:49.06,Default,,0000,0000,0000,,Some famous AI researchers,\Nlike Rodney Brooks,
Dialogue: 0,0:05:49.08,0:05:51.58,Default,,0000,0000,0000,,think it won't happen\Nfor hundreds of years.
Dialogue: 0,0:05:51.60,0:05:55.50,Default,,0000,0000,0000,,But others, like Google DeepMind\Nfounder Demis Hassabis,
Dialogue: 0,0:05:55.52,0:05:56.78,Default,,0000,0000,0000,,are more optimistic
Dialogue: 0,0:05:56.80,0:05:59.38,Default,,0000,0000,0000,,and are working to try to make\Nit happen much sooner.
Dialogue: 0,0:05:59.40,0:06:02.70,Default,,0000,0000,0000,,And recent surveys have shown\Nthat most AI researchers
Dialogue: 0,0:06:02.72,0:06:05.58,Default,,0000,0000,0000,,actually share Demis's optimism,
Dialogue: 0,0:06:05.60,0:06:08.68,Default,,0000,0000,0000,,expecting that we will\Nget AGI within decades,
Dialogue: 0,0:06:09.64,0:06:11.90,Default,,0000,0000,0000,,so within the lifetime of many of us,
Dialogue: 0,0:06:11.92,0:06:13.88,Default,,0000,0000,0000,,which begs the question -- and then what?
Dialogue: 0,0:06:15.04,0:06:17.26,Default,,0000,0000,0000,,What do we want the role of humans to be
Dialogue: 0,0:06:17.28,0:06:19.96,Default,,0000,0000,0000,,if machines can do everything better\Nand cheaper than us?
Dialogue: 0,0:06:23.00,0:06:25.00,Default,,0000,0000,0000,,The way I see it, we face a choice.
Dialogue: 0,0:06:26.00,0:06:27.58,Default,,0000,0000,0000,,One option is to be complacent.
Dialogue: 0,0:06:27.60,0:06:31.38,Default,,0000,0000,0000,,We can say, "Oh, let's just build machines\Nthat can do everything we can do
Dialogue: 0,0:06:31.40,0:06:33.22,Default,,0000,0000,0000,,and not worry about the consequences.
Dialogue: 0,0:06:33.24,0:06:36.50,Default,,0000,0000,0000,,Come on, if we build technology\Nthat makes all humans obsolete,
Dialogue: 0,0:06:36.52,0:06:38.62,Default,,0000,0000,0000,,what could possibly go wrong?"
Dialogue: 0,0:06:38.64,0:06:40.30,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:06:40.32,0:06:43.08,Default,,0000,0000,0000,,But I think that would be\Nembarrassingly lame.
Dialogue: 0,0:06:44.08,0:06:47.58,Default,,0000,0000,0000,,I think we should be more ambitious --\Nin the spirit of TED.
Dialogue: 0,0:06:47.60,0:06:51.10,Default,,0000,0000,0000,,Let's envision the truly inspiring\Nhigh-tech future
Dialogue: 0,0:06:51.12,0:06:52.52,Default,,0000,0000,0000,,and try to steer towards it.
Dialogue: 0,0:06:53.72,0:06:57.26,Default,,0000,0000,0000,,This brings us to the second part\Nof our rocket metaphor: the steering.
Dialogue: 0,0:06:57.28,0:06:59.18,Default,,0000,0000,0000,,We're making AI more powerful,
Dialogue: 0,0:06:59.20,0:07:03.02,Default,,0000,0000,0000,,but how can we steer towards a future
Dialogue: 0,0:07:03.04,0:07:06.12,Default,,0000,0000,0000,,where AI helps humanity flourish\Nrather than flounder?
Dialogue: 0,0:07:06.76,0:07:08.02,Default,,0000,0000,0000,,To help with this,
Dialogue: 0,0:07:08.04,0:07:10.02,Default,,0000,0000,0000,,I cofounded the Future of Life Institute.
Dialogue: 0,0:07:10.04,0:07:12.82,Default,,0000,0000,0000,,It's a small nonprofit promoting\Nbeneficial technology use
Dialogue: 0,0:07:12.84,0:07:15.58,Default,,0000,0000,0000,,and our goal is simply\Nfor the future of life to exist
Dialogue: 0,0:07:15.60,0:07:17.66,Default,,0000,0000,0000,,and to be as inspiring as possible.
Dialogue: 0,0:07:17.68,0:07:20.86,Default,,0000,0000,0000,,You know, I love technology.
Dialogue: 0,0:07:20.88,0:07:23.80,Default,,0000,0000,0000,,Technology is why today\Nis better than the Stone Age.
Dialogue: 0,0:07:24.60,0:07:28.68,Default,,0000,0000,0000,,And I'm optimistic that we can create\Na really inspiring high-tech future ...
Dialogue: 0,0:07:29.68,0:07:31.14,Default,,0000,0000,0000,,if -- and this is a big if --
Dialogue: 0,0:07:31.16,0:07:33.62,Default,,0000,0000,0000,,if we win the wisdom race --
Dialogue: 0,0:07:33.64,0:07:36.50,Default,,0000,0000,0000,,the race between the growing\Npower of our technology
Dialogue: 0,0:07:36.52,0:07:38.72,Default,,0000,0000,0000,,and the growing wisdom\Nwith which we manage it.
Dialogue: 0,0:07:39.24,0:07:41.54,Default,,0000,0000,0000,,But this is going to require\Na change of strategy
Dialogue: 0,0:07:41.56,0:07:44.60,Default,,0000,0000,0000,,because our old strategy\Nhas been learning from mistakes.
Dialogue: 0,0:07:45.28,0:07:46.82,Default,,0000,0000,0000,,We invented fire,
Dialogue: 0,0:07:46.84,0:07:48.38,Default,,0000,0000,0000,,screwed up a bunch of times --
Dialogue: 0,0:07:48.40,0:07:50.22,Default,,0000,0000,0000,,invented the fire extinguisher.
Dialogue: 0,0:07:50.24,0:07:51.58,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:07:51.60,0:07:54.02,Default,,0000,0000,0000,,We invented the car,\Nscrewed up a bunch of times --
Dialogue: 0,0:07:54.04,0:07:56.71,Default,,0000,0000,0000,,invented the traffic light,\Nthe seat belt and the airbag,
Dialogue: 0,0:07:56.73,0:08:00.58,Default,,0000,0000,0000,,but with more powerful technology\Nlike nuclear weapons and AGI,
Dialogue: 0,0:08:00.60,0:08:03.98,Default,,0000,0000,0000,,learning from mistakes\Nis a lousy strategy,
Dialogue: 0,0:08:04.00,0:08:05.22,Default,,0000,0000,0000,,don't you think?
Dialogue: 0,0:08:05.24,0:08:06.26,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:08:06.28,0:08:08.86,Default,,0000,0000,0000,,It's much better to be proactive\Nrather than reactive;
Dialogue: 0,0:08:08.88,0:08:11.18,Default,,0000,0000,0000,,plan ahead and get things\Nright the first time
Dialogue: 0,0:08:11.20,0:08:13.70,Default,,0000,0000,0000,,because that might be\Nthe only time we'll get.
Dialogue: 0,0:08:13.72,0:08:16.06,Default,,0000,0000,0000,,But it is funny because\Nsometimes people tell me,
Dialogue: 0,0:08:16.08,0:08:18.82,Default,,0000,0000,0000,,"Max, shhh, don't talk like that.
Dialogue: 0,0:08:18.84,0:08:20.56,Default,,0000,0000,0000,,That's Luddite scaremongering."
Dialogue: 0,0:08:22.04,0:08:23.58,Default,,0000,0000,0000,,But it's not scaremongering.
Dialogue: 0,0:08:23.60,0:08:26.48,Default,,0000,0000,0000,,It's what we at MIT\Ncall safety engineering.
Dialogue: 0,0:08:27.20,0:08:28.42,Default,,0000,0000,0000,,Think about it:
Dialogue: 0,0:08:28.44,0:08:30.66,Default,,0000,0000,0000,,before NASA launched\Nthe Apollo 11 mission,
Dialogue: 0,0:08:30.68,0:08:33.82,Default,,0000,0000,0000,,they systematically thought through\Neverything that could go wrong
Dialogue: 0,0:08:33.84,0:08:36.22,Default,,0000,0000,0000,,when you put people\Non top of explosive fuel tanks
Dialogue: 0,0:08:36.24,0:08:38.86,Default,,0000,0000,0000,,and launch them somewhere\Nwhere no one could help them.
Dialogue: 0,0:08:38.88,0:08:40.82,Default,,0000,0000,0000,,And there was a lot that could go wrong.
Dialogue: 0,0:08:40.84,0:08:42.32,Default,,0000,0000,0000,,Was that scaremongering?
Dialogue: 0,0:08:43.16,0:08:44.38,Default,,0000,0000,0000,,No.
Dialogue: 0,0:08:44.40,0:08:46.42,Default,,0000,0000,0000,,That was precisely\Nthe safety engineering
Dialogue: 0,0:08:46.44,0:08:48.38,Default,,0000,0000,0000,,that ensured the success of the mission,
Dialogue: 0,0:08:48.40,0:08:52.58,Default,,0000,0000,0000,,and that is precisely the strategy\NI think we should take with AGI.
Dialogue: 0,0:08:52.60,0:08:56.66,Default,,0000,0000,0000,,Think through what can go wrong\Nto make sure it goes right.
Dialogue: 0,0:08:56.68,0:08:59.22,Default,,0000,0000,0000,,So in this spirit,\Nwe've organized conferences,
Dialogue: 0,0:08:59.24,0:09:02.06,Default,,0000,0000,0000,,bringing together leading\NAI researchers and other thinkers
Dialogue: 0,0:09:02.08,0:09:05.82,Default,,0000,0000,0000,,to discuss how to grow this wisdom\Nwe need to keep AI beneficial.
Dialogue: 0,0:09:05.84,0:09:09.14,Default,,0000,0000,0000,,Our last conference\Nwas in Asilomar, California last year
Dialogue: 0,0:09:09.16,0:09:12.22,Default,,0000,0000,0000,,and produced this list of 23 principles
Dialogue: 0,0:09:12.24,0:09:15.14,Default,,0000,0000,0000,,which have since been signed\Nby over 1,000 AI researchers
Dialogue: 0,0:09:15.16,0:09:16.46,Default,,0000,0000,0000,,and key industry leaders,
Dialogue: 0,0:09:16.48,0:09:19.66,Default,,0000,0000,0000,,and I want to tell you\Nabout three of these principles.
Dialogue: 0,0:09:19.68,0:09:24.64,Default,,0000,0000,0000,,One is that we should avoid an arms race\Nin lethal autonomous weapons.
Dialogue: 0,0:09:25.48,0:09:29.10,Default,,0000,0000,0000,,The idea here is that any science\Ncan be used for new ways of helping people
Dialogue: 0,0:09:29.12,0:09:30.66,Default,,0000,0000,0000,,or new ways of harming people.
Dialogue: 0,0:09:30.68,0:09:34.62,Default,,0000,0000,0000,,For example, biology and chemistry\Nare much more likely to be used
Dialogue: 0,0:09:34.64,0:09:39.50,Default,,0000,0000,0000,,for new medicines or new cures\Nthan for new ways of killing people,
Dialogue: 0,0:09:39.52,0:09:41.70,Default,,0000,0000,0000,,because biologists\Nand chemists pushed hard --
Dialogue: 0,0:09:41.72,0:09:42.98,Default,,0000,0000,0000,,and successfully --
Dialogue: 0,0:09:43.00,0:09:45.18,Default,,0000,0000,0000,,for bans on biological\Nand chemical weapons.
Dialogue: 0,0:09:45.20,0:09:46.46,Default,,0000,0000,0000,,And in the same spirit,
Dialogue: 0,0:09:46.48,0:09:50.92,Default,,0000,0000,0000,,most AI researchers want to stigmatize\Nand ban lethal autonomous weapons.
Dialogue: 0,0:09:51.60,0:09:53.42,Default,,0000,0000,0000,,Another Asilomar AI principle
Dialogue: 0,0:09:53.44,0:09:57.14,Default,,0000,0000,0000,,is that we should mitigate\NAI-fueled income inequality.
Dialogue: 0,0:09:57.16,0:10:01.62,Default,,0000,0000,0000,,I think that if we can grow\Nthe economic pie dramatically with AI
Dialogue: 0,0:10:01.64,0:10:04.10,Default,,0000,0000,0000,,and we still can't figure out\Nhow to divide this pie
Dialogue: 0,0:10:04.12,0:10:05.70,Default,,0000,0000,0000,,so that everyone is better off,
Dialogue: 0,0:10:05.72,0:10:06.98,Default,,0000,0000,0000,,then shame on us.
Dialogue: 0,0:10:07.00,0:10:11.10,Default,,0000,0000,0000,,(Applause)
Dialogue: 0,0:10:11.12,0:10:14.72,Default,,0000,0000,0000,,Alright, now raise your hand\Nif your computer has ever crashed.
Dialogue: 0,0:10:15.48,0:10:16.74,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:10:16.76,0:10:18.42,Default,,0000,0000,0000,,Wow, that's a lot of hands.
Dialogue: 0,0:10:18.44,0:10:20.62,Default,,0000,0000,0000,,Well, then you'll appreciate\Nthis principle
Dialogue: 0,0:10:20.64,0:10:23.78,Default,,0000,0000,0000,,that we should invest much more\Nin AI safety research,
Dialogue: 0,0:10:23.80,0:10:27.46,Default,,0000,0000,0000,,because as we put AI in charge\Nof even more decisions and infrastructure,
Dialogue: 0,0:10:27.48,0:10:31.10,Default,,0000,0000,0000,,we need to figure out how to transform\Ntoday's buggy and hackable computers
Dialogue: 0,0:10:31.12,0:10:33.54,Default,,0000,0000,0000,,into robust AI systems\Nthat we can really trust,
Dialogue: 0,0:10:33.56,0:10:34.78,Default,,0000,0000,0000,,because otherwise,
Dialogue: 0,0:10:34.80,0:10:37.62,Default,,0000,0000,0000,,all this awesome new technology\Ncan malfunction and harm us,
Dialogue: 0,0:10:37.64,0:10:39.62,Default,,0000,0000,0000,,or get hacked and be turned against us.
Dialogue: 0,0:10:39.64,0:10:45.34,Default,,0000,0000,0000,,And this AI safety work\Nhas to include work on AI value alignment,
Dialogue: 0,0:10:45.36,0:10:48.18,Default,,0000,0000,0000,,because the real threat\Nfrom AGI isn't malice,
Dialogue: 0,0:10:48.20,0:10:49.86,Default,,0000,0000,0000,,like in silly Hollywood movies,
Dialogue: 0,0:10:49.88,0:10:51.62,Default,,0000,0000,0000,,but competence --
Dialogue: 0,0:10:51.64,0:10:55.06,Default,,0000,0000,0000,,AGI accomplishing goals\Nthat just aren't aligned with ours.
Dialogue: 0,0:10:55.08,0:10:59.82,Default,,0000,0000,0000,,For example, when we humans drove\Nthe West African black rhino extinct,
Dialogue: 0,0:10:59.84,0:11:03.74,Default,,0000,0000,0000,,we didn't do it because we were a bunch\Nof evil rhinoceros haters, did we?
Dialogue: 0,0:11:03.76,0:11:05.82,Default,,0000,0000,0000,,We did it because\Nwe were smarter than them
Dialogue: 0,0:11:05.84,0:11:08.42,Default,,0000,0000,0000,,and our goals weren't aligned with theirs.
Dialogue: 0,0:11:08.44,0:11:11.10,Default,,0000,0000,0000,,But AGI is by definition smarter than us,
Dialogue: 0,0:11:11.12,0:11:14.70,Default,,0000,0000,0000,,so to make sure that we don't put\Nourselves in the position of those rhinos
Dialogue: 0,0:11:14.72,0:11:16.70,Default,,0000,0000,0000,,if we create AGI,
Dialogue: 0,0:11:16.72,0:11:20.90,Default,,0000,0000,0000,,we need to figure out how\Nto make machines understand our goals,
Dialogue: 0,0:11:20.92,0:11:24.08,Default,,0000,0000,0000,,adopt our goals and retain our goals.
Dialogue: 0,0:11:25.32,0:11:28.18,Default,,0000,0000,0000,,And whose goals should these be, anyway?
Dialogue: 0,0:11:28.20,0:11:30.10,Default,,0000,0000,0000,,Which goals should they be?
Dialogue: 0,0:11:30.12,0:11:33.68,Default,,0000,0000,0000,,This brings us to the third part\Nof our rocket metaphor: the destination.
Dialogue: 0,0:11:35.16,0:11:37.02,Default,,0000,0000,0000,,We're making AI more powerful,
Dialogue: 0,0:11:37.04,0:11:38.86,Default,,0000,0000,0000,,trying to figure out how to steer it,
Dialogue: 0,0:11:38.88,0:11:40.56,Default,,0000,0000,0000,,but where do we want to go with it?
Dialogue: 0,0:11:41.76,0:11:45.42,Default,,0000,0000,0000,,This is the elephant in the room\Nthat almost nobody talks about --
Dialogue: 0,0:11:45.44,0:11:47.30,Default,,0000,0000,0000,,not even here at TED --
Dialogue: 0,0:11:47.32,0:11:51.40,Default,,0000,0000,0000,,because we're so fixated\Non short-term AI challenges.
Dialogue: 0,0:11:52.08,0:11:56.74,Default,,0000,0000,0000,,Look, our species is trying to build AGI,
Dialogue: 0,0:11:56.76,0:12:00.26,Default,,0000,0000,0000,,motivated by curiosity and economics,
Dialogue: 0,0:12:00.28,0:12:03.96,Default,,0000,0000,0000,,but what sort of future society\Nare we hoping for if we succeed?
Dialogue: 0,0:12:04.68,0:12:06.62,Default,,0000,0000,0000,,We did an opinion poll on this recently,
Dialogue: 0,0:12:06.64,0:12:07.86,Default,,0000,0000,0000,,and I was struck to see
Dialogue: 0,0:12:07.88,0:12:10.78,Default,,0000,0000,0000,,that most people actually\Nwant us to build superintelligence:
Dialogue: 0,0:12:10.80,0:12:13.96,Default,,0000,0000,0000,,AI that's vastly smarter\Nthan us in all ways.
Dialogue: 0,0:12:15.12,0:12:18.54,Default,,0000,0000,0000,,What there was the greatest agreement on\Nwas that we should be ambitious
Dialogue: 0,0:12:18.56,0:12:20.58,Default,,0000,0000,0000,,and help life spread into the cosmos,
Dialogue: 0,0:12:20.60,0:12:25.10,Default,,0000,0000,0000,,but there was much less agreement\Nabout who or what should be in charge.
Dialogue: 0,0:12:25.12,0:12:26.86,Default,,0000,0000,0000,,And I was actually quite amused
Dialogue: 0,0:12:26.88,0:12:30.34,Default,,0000,0000,0000,,to see that there's some people\Nwho want it to be just machines.
Dialogue: 0,0:12:30.36,0:12:32.06,Default,,0000,0000,0000,,(Laughter)
Dialogue: 0,0:12:32.08,0:12:35.94,Default,,0000,0000,0000,,And there was total disagreement\Nabout what the role of humans should be,
Dialogue: 0,0:12:35.96,0:12:37.94,Default,,0000,0000,0000,,even at the most basic level,
Dialogue: 0,0:12:37.96,0:12:40.78,Default,,0000,0000,0000,,so let's take a closer look\Nat possible futures
Dialogue: 0,0:12:40.80,0:12:43.54,Default,,0000,0000,0000,,that we might choose\Nto steer toward, alright?
Dialogue: 0,0:12:43.56,0:12:44.90,Default,,0000,0000,0000,,So don't get me wrong here.
Dialogue: 0,0:12:44.92,0:12:46.98,Default,,0000,0000,0000,,I'm not talking about space travel,
Dialogue: 0,0:12:47.00,0:12:50.20,Default,,0000,0000,0000,,merely about humanity's\Nmetaphorical journey into the future.
Dialogue: 0,0:12:50.92,0:12:54.42,Default,,0000,0000,0000,,So one option that some\Nof my AI colleagues like
Dialogue: 0,0:12:54.44,0:12:58.06,Default,,0000,0000,0000,,is to build superintelligence\Nand keep it under human control,
Dialogue: 0,0:12:58.08,0:12:59.82,Default,,0000,0000,0000,,like an enslaved god,
Dialogue: 0,0:12:59.84,0:13:01.42,Default,,0000,0000,0000,,disconnected from the internet
Dialogue: 0,0:13:01.44,0:13:04.70,Default,,0000,0000,0000,,and used to create unimaginable\Ntechnology and wealth
Dialogue: 0,0:13:04.72,0:13:05.96,Default,,0000,0000,0000,,for whoever controls it.
Dialogue: 0,0:13:06.80,0:13:08.26,Default,,0000,0000,0000,,But Lord Acton warned us
Dialogue: 0,0:13:08.28,0:13:11.90,Default,,0000,0000,0000,,that power corrupts\Nand absolute power corrupts absolutely,
Dialogue: 0,0:13:11.92,0:13:15.98,Default,,0000,0000,0000,,so you might worry that maybe\Nwe humans just aren't smart enough,
Dialogue: 0,0:13:16.00,0:13:17.54,Default,,0000,0000,0000,,or wise enough rather,
Dialogue: 0,0:13:17.56,0:13:18.80,Default,,0000,0000,0000,,to handle this much power.
Dialogue: 0,0:13:19.64,0:13:22.18,Default,,0000,0000,0000,,Also, aside from any\Nmoral qualms you might have
Dialogue: 0,0:13:22.20,0:13:24.50,Default,,0000,0000,0000,,about enslaving superior minds,
Dialogue: 0,0:13:24.52,0:13:28.50,Default,,0000,0000,0000,,you might worry that maybe\Nthe superintelligence could outsmart us,
Dialogue: 0,0:13:28.52,0:13:30.76,Default,,0000,0000,0000,,break out and take over.
Dialogue: 0,0:13:31.56,0:13:34.98,Default,,0000,0000,0000,,But I also have colleagues\Nwho are fine with AI taking over
Dialogue: 0,0:13:35.00,0:13:37.30,Default,,0000,0000,0000,,and even causing human extinction,
Dialogue: 0,0:13:37.32,0:13:40.90,Default,,0000,0000,0000,,as long as we feel the AIs\Nare our worthy descendants,
Dialogue: 0,0:13:40.92,0:13:42.66,Default,,0000,0000,0000,,like our children.
Dialogue: 0,0:13:42.68,0:13:48.30,Default,,0000,0000,0000,,But how would we know that the AIs\Nhave adopted our best values
Dialogue: 0,0:13:48.32,0:13:52.70,Default,,0000,0000,0000,,and aren't just unconscious zombies\Ntricking us into anthropomorphizing them?
Dialogue: 0,0:13:52.72,0:13:55.58,Default,,0000,0000,0000,,Also, shouldn't those people\Nwho don't want human extinction
Dialogue: 0,0:13:55.60,0:13:57.04,Default,,0000,0000,0000,,have a say in the matter, too?
Dialogue: 0,0:13:58.20,0:14:01.58,Default,,0000,0000,0000,,Now, if you didn't like either\Nof those two high-tech options,
Dialogue: 0,0:14:01.60,0:14:04.78,Default,,0000,0000,0000,,it's important to remember\Nthat low-tech is suicide
Dialogue: 0,0:14:04.80,0:14:06.06,Default,,0000,0000,0000,,from a cosmic perspective,
Dialogue: 0,0:14:06.08,0:14:08.58,Default,,0000,0000,0000,,because if we don't go far\Nbeyond today's technology,
Dialogue: 0,0:14:08.60,0:14:11.42,Default,,0000,0000,0000,,the question isn't whether humanity\Nis going to go extinct,
Dialogue: 0,0:14:11.44,0:14:13.46,Default,,0000,0000,0000,,merely whether\Nwe're going to get taken out
Dialogue: 0,0:14:13.48,0:14:15.62,Default,,0000,0000,0000,,by the next killer asteroid, supervolcano,
Dialogue: 0,0:14:15.64,0:14:18.74,Default,,0000,0000,0000,,or some other problem\Nthat better technology could have solved.
Dialogue: 0,0:14:18.76,0:14:22.34,Default,,0000,0000,0000,,So, how about having\Nour cake and eating it ...
Dialogue: 0,0:14:22.36,0:14:24.20,Default,,0000,0000,0000,,with AGI that's not enslaved
Dialogue: 0,0:14:25.12,0:14:28.30,Default,,0000,0000,0000,,but treats us well because its values\Nare aligned with ours?
Dialogue: 0,0:14:28.32,0:14:32.50,Default,,0000,0000,0000,,This is the gist of what Eliezer Yudkowsky\Nhas called "friendly AI,"
Dialogue: 0,0:14:32.52,0:14:35.20,Default,,0000,0000,0000,,and if we can do this,\Nit could be awesome.
Dialogue: 0,0:14:35.84,0:14:40.66,Default,,0000,0000,0000,,It could not only eliminate negative\Nexperiences like disease, poverty,
Dialogue: 0,0:14:40.68,0:14:42.14,Default,,0000,0000,0000,,crime and other suffering,
Dialogue: 0,0:14:42.16,0:14:44.98,Default,,0000,0000,0000,,but it could also give us\Nthe freedom to choose
Dialogue: 0,0:14:45.00,0:14:49.06,Default,,0000,0000,0000,,from a fantastic new diversity\Nof positive experiences --
Dialogue: 0,0:14:49.08,0:14:52.24,Default,,0000,0000,0000,,basically making us\Nthe masters of our own destiny.
Dialogue: 0,0:14:54.28,0:14:55.66,Default,,0000,0000,0000,,So in summary,
Dialogue: 0,0:14:55.68,0:14:58.78,Default,,0000,0000,0000,,our situation with technology\Nis complicated,
Dialogue: 0,0:14:58.80,0:15:01.22,Default,,0000,0000,0000,,but the big picture is rather simple.
Dialogue: 0,0:15:01.24,0:15:04.70,Default,,0000,0000,0000,,Most AI researchers\Nexpect AGI within decades
Dialogue: 0,0:15:04.72,0:15:07.86,Default,,0000,0000,0000,,and if we just bumble\Ninto this unprepared,
Dialogue: 0,0:15:07.88,0:15:11.22,Default,,0000,0000,0000,,it will probably be\Nthe biggest mistake in human history --
Dialogue: 0,0:15:11.24,0:15:12.66,Default,,0000,0000,0000,,let's face it.
Dialogue: 0,0:15:12.68,0:15:15.26,Default,,0000,0000,0000,,It could enable brutal,\Nglobal dictatorship
Dialogue: 0,0:15:15.28,0:15:18.82,Default,,0000,0000,0000,,with unprecedented inequality,\Nsurveillance and suffering,
Dialogue: 0,0:15:18.84,0:15:20.82,Default,,0000,0000,0000,,and maybe even human extinction.
Dialogue: 0,0:15:20.84,0:15:23.16,Default,,0000,0000,0000,,But if we steer carefully,
Dialogue: 0,0:15:24.04,0:15:27.94,Default,,0000,0000,0000,,we could end up in a fantastic future\Nwhere everybody's better off:
Dialogue: 0,0:15:27.96,0:15:30.34,Default,,0000,0000,0000,,the poor are richer, the rich are richer,
Dialogue: 0,0:15:30.36,0:15:34.32,Default,,0000,0000,0000,,everybody is healthy\Nand free to live out their dreams.
Dialogue: 0,0:15:35.00,0:15:36.54,Default,,0000,0000,0000,,Now, hang on.
Dialogue: 0,0:15:36.56,0:15:41.14,Default,,0000,0000,0000,,Do you folks want the future\Nthat's politically right or left?
Dialogue: 0,0:15:41.16,0:15:44.02,Default,,0000,0000,0000,,Do you want the pious society\Nwith strict moral rules,
Dialogue: 0,0:15:44.04,0:15:45.86,Default,,0000,0000,0000,,or do you want a hedonistic free-for-all,
Dialogue: 0,0:15:45.88,0:15:48.10,Default,,0000,0000,0000,,more like Burning Man 24-7?
Dialogue: 0,0:15:48.12,0:15:50.54,Default,,0000,0000,0000,,Do you want beautiful beaches,\Nforests and lakes,
Dialogue: 0,0:15:50.56,0:15:53.98,Default,,0000,0000,0000,,or would you prefer to rearrange\Nsome of those atoms with the computers,
Dialogue: 0,0:15:54.00,0:15:55.72,Default,,0000,0000,0000,,enabling virtual experiences?
Dialogue: 0,0:15:55.74,0:15:58.90,Default,,0000,0000,0000,,With friendly AI, we could simply\Nbuild all of these societies
Dialogue: 0,0:15:58.92,0:16:02.14,Default,,0000,0000,0000,,and give people the freedom\Nto choose which one they want to live in
Dialogue: 0,0:16:02.16,0:16:05.26,Default,,0000,0000,0000,,because we would no longer\Nbe limited by our intelligence,
Dialogue: 0,0:16:05.28,0:16:06.74,Default,,0000,0000,0000,,merely by the laws of physics.
Dialogue: 0,0:16:06.76,0:16:11.38,Default,,0000,0000,0000,,So the resources and space\Nfor this would be astronomical --
Dialogue: 0,0:16:11.40,0:16:12.72,Default,,0000,0000,0000,,literally.
Dialogue: 0,0:16:13.32,0:16:14.52,Default,,0000,0000,0000,,So here's our choice.
Dialogue: 0,0:16:15.88,0:16:18.20,Default,,0000,0000,0000,,We can either be complacent\Nabout our future,
Dialogue: 0,0:16:19.44,0:16:22.10,Default,,0000,0000,0000,,taking as an article of blind faith
Dialogue: 0,0:16:22.12,0:16:26.14,Default,,0000,0000,0000,,that any new technology\Nis guaranteed to be beneficial,
Dialogue: 0,0:16:26.16,0:16:30.30,Default,,0000,0000,0000,,and just repeat that to ourselves\Nas a mantra over and over and over again
Dialogue: 0,0:16:30.32,0:16:34.00,Default,,0000,0000,0000,,as we drift like a rudderless ship\Ntowards our own obsolescence.
Dialogue: 0,0:16:34.92,0:16:36.80,Default,,0000,0000,0000,,Or we can be ambitious --
Dialogue: 0,0:16:37.84,0:16:40.30,Default,,0000,0000,0000,,thinking hard about how\Nto steer our technology
Dialogue: 0,0:16:40.32,0:16:42.26,Default,,0000,0000,0000,,and where we want to go with it
Dialogue: 0,0:16:42.28,0:16:44.04,Default,,0000,0000,0000,,to create the age of amazement.
Dialogue: 0,0:16:45.00,0:16:47.86,Default,,0000,0000,0000,,We're all here to celebrate\Nthe age of amazement
Dialogue: 0,0:16:47.88,0:16:52.32,Default,,0000,0000,0000,,and I feel that its essence should lie\Nin becoming not overpowered
Dialogue: 0,0:16:53.24,0:16:55.86,Default,,0000,0000,0000,,but empowered by our technology.
Dialogue: 0,0:16:55.88,0:16:57.26,Default,,0000,0000,0000,,Thank you.
Dialogue: 0,0:16:57.28,0:17:00.36,Default,,0000,0000,0000,,(Applause)