How many of you are creatives? Designers, engineers, entrepreneurs, artists, or maybe you just have a really big imagination. Show of hands?

(Cheers)

That's most of you. I have some news for us creatives. Over the course of the next 20 years, more will change around the way we do our work than has happened in the last 2,000. In fact, I think we're at the dawn of a new age in human history.

Now, there have been four major historical eras defined by the way we work. The Hunter-Gatherer Age lasted several million years. And then the Agricultural Age lasted several thousand years. The Industrial Age lasted a couple of centuries. And now the Information Age has lasted just a few decades. And now today, we're on the cusp of our next great era as a species.

Welcome to the Augmented Age. In this new era, your natural human capabilities are going to be augmented by computational systems that help you think, robotic systems that help you make, and a digital nervous system that connects you to the world far beyond your natural senses.

Let's start with cognitive augmentation. How many of you are augmented cyborgs?

(Laughter)

I would actually argue that we're already augmented. Imagine you're at a party, and somebody asks you a question that you don't know the answer to.
If you have one of these, in a few seconds, you can know the answer. But this is just a primitive beginning. Even Siri is just a passive tool.

In fact, for the last three-and-a-half million years, the tools that we've had have been completely passive. They do exactly what we tell them and nothing more. Our very first tool only cut where we struck it. The chisel only carves where the artist points it. And even our most advanced tools do nothing without our explicit direction. In fact, to date -- and this is something that frustrates me -- we've always been limited by this need to manually push our wills into our tools -- like, manual -- like, literally using our hands, even with computers.

But I'm more like Scotty in "Star Trek."

(Laughter)

I want to have a conversation with a computer. I want to say, "Computer, let's design a car," and the computer shows me a car. And I say, "No, more fast-looking, and less German," and bang, the computer shows me an option.

(Laughter)

That conversation might be a little ways off -- probably less than many of us think -- but right now, we're working on it. Tools are making this leap from being passive to being generative.
Generative design tools use a computer and algorithms to synthesize geometry to come up with new designs all by themselves. All they need are your goals and your constraints.

I'll give you an example. In the case of this aerial drone chassis, all you would need to do is tell it something like, it has four propellers, you want it to be as lightweight as possible, and you need it to be aerodynamically efficient. Then what the computer does is it explores the entire solution space: every single possibility that solves and meets your criteria -- millions of them. It takes big computers to do this. But it comes back to us with designs that we, by ourselves, never could've imagined. And the computer's coming up with this stuff all by itself -- no one ever drew anything, and it started completely from scratch.

And by the way, it's no accident that the drone body looks just like the pelvis of a flying squirrel.

(Laughter)

It's because the algorithms are designed to work the same way evolution does.

What's exciting is we're starting to see this technology out in the real world. We've been working with Airbus for a couple of years on this concept plane for the future. It's a ways out still.
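The kind of search described here -- propose candidate designs, keep the ones that best meet the goals and constraints, mutate and repeat -- can be sketched as a toy evolutionary loop. Everything below is invented for illustration: the two-parameter chassis model, the strength threshold, and all the numbers. Real generative-design tools explore full 3D geometry, not two scalars.

```python
import random

# Toy "chassis": each design is (arm_length, arm_thickness) in cm.
# Goal: as light as possible. Constraint: each of the four arms must
# be stiff enough to carry a propeller motor. All formulas are stand-ins.

def weight(design):
    length, thickness = design
    return 4 * length * thickness ** 2      # four arms, mass ~ volume

def strength(design):
    length, thickness = design
    return thickness ** 3 / length          # thick, short arms are stiffer

def fitness(design):
    if strength(design) < 0.5:              # hard constraint violated
        return float("-inf")
    return -weight(design)                  # otherwise: lighter is better

def evolve(generations=200, pop_size=50, seed=0):
    rng = random.Random(seed)
    pop = [(rng.uniform(5, 30), rng.uniform(0.5, 5)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 4]     # selection: keep the best quarter
        children = []
        while len(children) < pop_size - len(survivors):
            length, thickness = rng.choice(survivors)
            children.append((max(1.0, length + rng.gauss(0, 1.0)),
                             max(0.1, thickness + rng.gauss(0, 0.2))))  # mutation
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best, weight(best))
```

The loop never "draws" anything either: it only scores candidates against the stated goals and constraints, which is why the survivors can end up looking organic, like that flying-squirrel pelvis.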
But just recently we used a generative-design AI to come up with this. This is a 3D-printed cabin partition that's been designed by a computer. It's stronger than the original yet half the weight, and it will be flying in the Airbus A320 later this year.

So computers can now generate; they can come up with their own solutions to our well-defined problems. But they're not intuitive. They still have to start from scratch every single time, and that's because they never learn. Unlike Maggie.

(Laughter)

Maggie's actually smarter than our most advanced design tools. What do I mean by that? If her owner picks up that leash, Maggie knows with a fair degree of certainty it's time to go for a walk. And how did she learn? Well, every time the owner picked up the leash, they went for a walk. And Maggie did three things: she had to pay attention, she had to remember what happened and she had to retain and create a pattern in her mind.

Interestingly, that's exactly what computer scientists have been trying to get AIs to do for the last 60 or so years. Back in 1952, they built this computer that could play Tic-Tac-Toe. Big deal. Then 45 years later, in 1997, Deep Blue beats Kasparov at chess.
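Maggie's three steps -- pay attention, remember what happened, form a pattern -- are, at their very simplest, just counting co-occurrences and turning the counts into a confidence. Here is a minimal sketch of that idea; the class name, the events and the counts are all made up for illustration:

```python
from collections import Counter

class Associator:
    """Observe (event, outcome) pairs; estimate P(outcome | event)."""

    def __init__(self):
        self.event_counts = Counter()   # how often each event happened
        self.pair_counts = Counter()    # how often it led to each outcome

    def observe(self, event, outcome):
        # "Pay attention" and "remember what happened."
        self.event_counts[event] += 1
        self.pair_counts[(event, outcome)] += 1

    def confidence(self, event, outcome):
        # "Create a pattern": the fraction of memories that match.
        if self.event_counts[event] == 0:
            return 0.0
        return self.pair_counts[(event, outcome)] / self.event_counts[event]

maggie = Associator()
for _ in range(9):
    maggie.observe("leash picked up", "walk")
maggie.observe("leash picked up", "no walk")    # one false alarm

print(maggie.confidence("leash picked up", "walk"))  # 0.9
```

That "fair degree of certainty" falls out of the counts; real learning systems generalize far beyond literal lookup tables, but the attend-remember-pattern loop is the same.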
2011, Watson beats these two humans at Jeopardy, which is much harder for a computer to play than chess is. In fact, rather than working from predefined recipes, Watson had to use reasoning to overcome his human opponents.

And then a couple of weeks ago, DeepMind's AlphaGo beats the world's best human at Go, which is the most difficult game that we have. In fact, in Go, there are more possible moves than there are atoms in the universe. So in order to win, what AlphaGo had to do was develop intuition. And in fact, at some points, AlphaGo's programmers didn't understand why AlphaGo was doing what it was doing.

And things are moving really fast. I mean, consider -- in the space of a human lifetime, computers have gone from a child's game to what's recognized as the pinnacle of strategic thought. What's basically happening is computers are going from being like Spock to being a lot more like Kirk.

(Laughter)

Right? From pure logic to intuition.

Would you cross this bridge? Most of you are saying, "Oh, hell no!"

(Laughter)

And you arrived at that decision in a split second. You just sort of knew that bridge was unsafe. And that's exactly the kind of intuition that our deep-learning systems are starting to develop right now.
Very soon, you'll literally be able to show something you've made, you've designed, to a computer, and it will look at it and say, "Sorry, homey, that'll never work. You have to try again." Or you could ask it if people are going to like your next song, or your next flavor of ice cream.

Or, much more importantly, you could work with a computer to solve a problem that we've never faced before. For instance, climate change. We're not doing a very good job on our own, so we could certainly use all the help we can get. That's what I'm talking about: technology amplifying our cognitive abilities so we can imagine and design things that were simply out of our reach as plain old un-augmented humans.

So what about making all of this crazy new stuff that we're going to invent and design? I think the era of human augmentation is as much about the physical world as it is about the virtual, intellectual realm. How will technology augment us? In the physical world, robotic systems.

OK, there's certainly a fear that robots are going to take jobs away from humans, and that is true in certain sectors. But I'm much more interested in this idea that humans and robots working together are going to augment each other, and start to inhabit a new space.
This is our applied research lab in San Francisco, where one of our areas of focus is advanced robotics, specifically, human-robot collaboration. And this is Bishop, one of our robots. As an experiment, we set it up to help a person working in construction doing repetitive tasks -- tasks like cutting out holes for outlets or light switches in drywall.

(Laughter)

So, Bishop's human partner can tell it what to do in plain English and with simple gestures, kind of like talking to a dog, and then Bishop executes on those instructions with perfect precision. We're using the human for what the human is good at: awareness, perception and decision making. And we're using the robot for what it's good at: precision and repetitiveness.

Here's another cool project that Bishop worked on. The goal of this project, which we called the HIVE, was to prototype the experience of humans, computers and robots all working together to solve a highly complex design problem.

The humans acted as labor. They cruised around the construction site, they manipulated the bamboo -- which, by the way, because it's a non-isomorphic material, is super hard for robots to deal with. But then the robots did this fiber winding, which was almost impossible for a human to do.
And then we had an AI that was controlling everything. It was telling the humans what to do, telling the robots what to do and keeping track of thousands of individual components. What's interesting is, building this pavilion was simply not possible without human, robot and AI augmenting each other.

OK, I'll share one more project. This one's a little bit crazy. We're working with Amsterdam-based artist Joris Laarman and his team at MX3D to generatively design and robotically print the world's first autonomously manufactured bridge. So, Joris and an AI are designing this thing right now, as we speak, in Amsterdam. And when they're done, we're going to hit "Go," and robots will start 3D printing in stainless steel, and then they're going to keep printing, without human intervention, until the bridge is finished.

So, as computers are going to augment our ability to imagine and design new stuff, robotic systems are going to help us build and make things that we've never been able to make before.

But what about our ability to sense and control these things? What about a nervous system for the things that we make? Our nervous system, the human nervous system, tells us everything that's going on around us. But the nervous system of the things we make is rudimentary at best.
For instance, a car doesn't tell the city's public works department that it just hit a pothole at the corner of Broadway and Morrison. A building doesn't tell its designers whether or not the people inside like being there, and the toy manufacturer doesn't know if a toy is actually being played with -- how and where and whether or not it's any fun.

Look, I'm sure that the designers imagined this lifestyle for Barbie when they designed her --

(Laughter)

But what if it turns out that Barbie's actually really lonely?

(Laughter)

If the designers had known what was really happening in the real world with their designs -- the road, the building, Barbie -- they could've used that knowledge to create an experience that was better for the user. What's missing is a nervous system connecting us to all of the things that we design, make and use. What if all of you had that kind of information flowing to you from the things you create in the real world?

With all of the stuff we make, we spend a tremendous amount of money and energy -- in fact, last year, about two trillion dollars -- convincing people to buy the things we've made.
But if you had this connection to the things that you design and create after they're out in the real world, after they've been sold or launched or whatever, we could actually change that, and go from making people want our stuff, to just making stuff that people want in the first place.

The good news is, we're working on digital nervous systems that connect us to the things we design. We're working on one project with a couple of guys down in Los Angeles called the Bandito Brothers and their team, and one of the things these guys do is build insane cars that do absolutely insane things. These guys are crazy --

(Laughter)

in the best way.

And what we're doing with them is taking a traditional race-car chassis and giving it a nervous system. So we instrumented it with dozens of sensors, put a world-class driver behind the wheel, took it out to the desert and drove the hell out of it for a week. And the car's nervous system captured everything that was happening to the car. We captured four billion data points; all of the forces that it was subjected to. And then we did something crazy. We took all of that data, and plugged it into a generative-design AI we call "Dreamcatcher."
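Distilling billions of raw sensor readings into something a generative-design tool can optimize against might, in its simplest form, look like the sketch below: reduce the telemetry stream to a worst-case load per attachment point. The sensor names and force values are invented, and Dreamcatcher's actual inputs are far richer than this.

```python
from collections import defaultdict

def peak_loads(samples):
    """Reduce a stream of (attachment_point, force_newtons) readings
    to the worst-case magnitude per point -- a constraint set a
    generative-design tool could then design the chassis against."""
    peaks = defaultdict(float)
    for point, force in samples:
        peaks[point] = max(peaks[point], abs(force))
    return dict(peaks)

# A few invented readings standing in for a week of desert telemetry.
telemetry = [
    ("front-left-suspension", 8200.0),
    ("front-left-suspension", -9100.0),   # rebound spike on landing
    ("rear-axle-mount", 6400.0),
    ("rear-axle-mount", 6750.0),
]
print(peak_loads(telemetry))
```

The point of the reduction is that the design tool never sees four billion points; it sees the envelope of forces the real world actually applied, and shapes material around that.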
So what do you get when you give a design tool a nervous system, and you ask it to build you the ultimate car chassis? You get this. This is something that a human could never have designed. Except a human did design this, but it was a human that was augmented by a generative-design AI, a digital nervous system and robots that can actually fabricate something like this.

So if this is the future, the Augmented Age, and we're going to be augmented cognitively, physically and perceptually, what will that look like? What is this wonderland going to be like?

I think we're going to see a world where we're moving from things that are fabricated to things that are farmed. Where we're moving from things that are constructed to that which is grown. We're going to move from being isolated to being connected. And we'll move away from extraction to embrace aggregation. I also think we'll shift from craving obedience from our things to valuing autonomy.

Thanks to our augmented capabilities, our world is going to change dramatically. We're going to have a world with more variety, more connectedness, more dynamism, more complexity, more adaptability and, of course, more beauty.
The shape of things to come will be unlike anything we've ever seen before. Why? Because what will be shaping those things is this new partnership between technology, nature and humanity.

That, to me, is a future well worth looking forward to. Thank you all so much.

(Applause)