WEBVTT 00:00:00.735 --> 00:00:03.024 How many of you are creatives, 00:00:03.048 --> 00:00:06.672 designers, engineers, entrepreneurs, artists, 00:00:06.696 --> 00:00:09.083 or maybe you just have a really big imagination? 00:00:09.107 --> 00:00:10.955 Show of hands? (Cheers) NOTE Paragraph 00:00:10.979 --> 00:00:12.160 That's most of you. 00:00:13.334 --> 00:00:15.628 I have some news for us creatives. 00:00:16.714 --> 00:00:19.287 Over the course of the next 20 years, 00:00:21.471 --> 00:00:24.444 more will change around the way we do our work 00:00:25.382 --> 00:00:27.539 than has happened in the last 2,000. 00:00:28.511 --> 00:00:33.139 In fact, I think we're at the dawn of a new age in human history. NOTE Paragraph 00:00:33.645 --> 00:00:38.406 Now, there have been four major historical eras defined by the way we work. 00:00:39.404 --> 00:00:42.679 The Hunter-Gatherer Age lasted several million years. 00:00:43.163 --> 00:00:46.739 And then the Agricultural Age lasted several thousand years. 00:00:47.195 --> 00:00:50.685 The Industrial Age lasted a couple of centuries. 00:00:50.709 --> 00:00:54.996 And now the Information Age has lasted just a few decades. 00:00:55.020 --> 00:01:00.240 And now today, we're on the cusp of our next great era as a species. NOTE Paragraph 00:01:01.296 --> 00:01:03.976 Welcome to the Augmented Age. 00:01:04.000 --> 00:01:07.693 In this new era, your natural human capabilities are going to be augmented 00:01:07.717 --> 00:01:10.785 by computational systems that help you think, 00:01:10.809 --> 00:01:12.995 robotic systems that help you make, 00:01:13.019 --> 00:01:14.667 and a digital nervous system 00:01:14.691 --> 00:01:18.381 that connects you to the world far beyond your natural senses. 00:01:19.437 --> 00:01:21.379 Let's start with cognitive augmentation. 00:01:21.403 --> 00:01:23.603 How many of you are augmented cyborgs? NOTE Paragraph 00:01:24.133 --> 00:01:26.783 (Laughter) NOTE Paragraph 00:01:26.807 --> 00:01:29.628 I would actually argue that we're already augmented. 00:01:30.288 --> 00:01:31.792 Imagine you're at a party, 00:01:31.816 --> 00:01:35.336 and somebody asks you a question that you don't know the answer to. 00:01:35.360 --> 00:01:39.120 If you have one of these, in a few seconds, you can know the answer. 00:01:39.869 --> 00:01:42.168 But this is just a primitive beginning. 00:01:42.863 --> 00:01:46.194 Even Siri is just a passive tool. 00:01:46.660 --> 00:01:50.041 In fact, for the last three-and-a-half million years, 00:01:50.065 --> 00:01:53.174 the tools that we've had have been completely passive. 00:01:54.203 --> 00:01:57.858 They do exactly what we tell them and nothing more. 00:01:57.882 --> 00:02:00.983 Our very first tool only cut where we struck it. 00:02:01.822 --> 00:02:04.862 The chisel only carves where the artist points it. 00:02:05.343 --> 00:02:10.984 And even our most advanced tools do nothing without our explicit direction. 00:02:11.008 --> 00:02:14.189 In fact, to date, and this is something that frustrates me, 00:02:14.213 --> 00:02:15.661 we've always been limited 00:02:15.685 --> 00:02:19.186 by this need to manually push our wills into our tools -- 00:02:19.210 --> 00:02:21.507 like, manual, literally using our hands, 00:02:21.531 --> 00:02:22.959 even with computers. 00:02:24.072 --> 00:02:26.535 But I'm more like Scotty in "Star Trek." NOTE Paragraph 00:02:26.559 --> 00:02:28.409 (Laughter) NOTE Paragraph 00:02:28.433 --> 00:02:30.579 I want to have a conversation with a computer. 
00:02:30.603 --> 00:02:33.573 I want to say, "Computer, let's design a car," 00:02:33.597 --> 00:02:35.136 and the computer shows me a car. 00:02:35.160 --> 00:02:37.768 And I say, "No, more fast-looking, and less German," 00:02:37.792 --> 00:02:39.955 and bang, the computer shows me an option. NOTE Paragraph 00:02:39.979 --> 00:02:41.844 (Laughter) NOTE Paragraph 00:02:42.208 --> 00:02:44.514 That conversation might be a little ways off, 00:02:44.538 --> 00:02:47.203 probably less than many of us think, 00:02:47.227 --> 00:02:48.990 but right now, 00:02:49.014 --> 00:02:50.165 we're working on it. 00:02:50.189 --> 00:02:54.222 Tools are making this leap from being passive to being generative. 00:02:54.831 --> 00:02:58.139 Generative design tools use a computer and algorithms 00:02:58.163 --> 00:03:00.771 to synthesize geometry 00:03:00.795 --> 00:03:03.549 to come up with new designs all by themselves. 00:03:03.996 --> 00:03:06.744 All it needs are your goals and your constraints. NOTE Paragraph 00:03:06.768 --> 00:03:08.176 I'll give you an example. 00:03:08.200 --> 00:03:10.988 In the case of this aerial drone chassis, 00:03:11.012 --> 00:03:13.638 all you would need to do is tell it something like, 00:03:13.662 --> 00:03:14.935 it has four propellers, 00:03:14.959 --> 00:03:17.090 you want it to be as lightweight as possible, 00:03:17.114 --> 00:03:19.384 and you need it to be aerodynamically efficient. 00:03:19.408 --> 00:03:24.322 Then what the computer does is it explores the entire solution space: 00:03:24.346 --> 00:03:28.273 every single possibility that solves and meets your criteria -- 00:03:28.297 --> 00:03:29.739 millions of them. 00:03:29.763 --> 00:03:31.738 It takes big computers to do this. 00:03:31.762 --> 00:03:33.717 But it comes back to us with designs 00:03:33.741 --> 00:03:36.884 that we, by ourselves, never could've imagined. 00:03:37.326 --> 00:03:40.238 And the computer's coming up with this stuff all by itself -- 00:03:40.262 --> 00:03:41.940 no one ever drew anything, 00:03:41.964 --> 00:03:44.050 and it started completely from scratch. 00:03:45.038 --> 00:03:47.425 And by the way, it's no accident 00:03:47.449 --> 00:03:50.930 that the drone body looks just like the pelvis of a flying squirrel. NOTE Paragraph 00:03:51.287 --> 00:03:53.294 (Laughter) NOTE Paragraph 00:03:54.040 --> 00:03:56.342 It's because the algorithms are designed to work 00:03:56.366 --> 00:03:58.003 the same way evolution does. NOTE Paragraph 00:03:58.715 --> 00:04:01.375 What's exciting is we're starting to see this technology 00:04:01.399 --> 00:04:02.558 out in the real world. 00:04:02.582 --> 00:04:05.034 We've been working with Airbus for a couple of years 00:04:05.058 --> 00:04:06.967 on this concept plane for the future. 00:04:06.991 --> 00:04:09.061 It's a ways out still. 00:04:09.085 --> 00:04:12.865 But just recently we used a generative-design AI 00:04:12.889 --> 00:04:14.696 to come up with this. 00:04:15.609 --> 00:04:20.762 This is a 3D-printed cabin partition that's been designed by a computer. 00:04:20.786 --> 00:04:23.610 It's stronger than the original yet half the weight, 00:04:23.634 --> 00:04:26.780 and it will be flying in the Airbus A320 later this year. 00:04:27.405 --> 00:04:28.964 So computers can now generate; 00:04:28.988 --> 00:04:33.583 they can come up with their own solutions to our well-defined problems. 00:04:34.677 --> 00:04:35.987 But they're not intuitive. 
00:04:36.011 --> 00:04:39.097 They still have to start from scratch every single time, 00:04:39.121 --> 00:04:41.686 and that's because they never learn. 00:04:42.368 --> 00:04:44.134 Unlike Maggie. NOTE Paragraph 00:04:44.158 --> 00:04:45.739 (Laughter) NOTE Paragraph 00:04:45.763 --> 00:04:49.060 Maggie's actually smarter than our most advanced design tools. 00:04:49.467 --> 00:04:50.907 What do I mean by that? 00:04:50.931 --> 00:04:52.521 If her owner picks up that leash, 00:04:52.545 --> 00:04:54.613 Maggie knows with a fair degree of certainty 00:04:54.637 --> 00:04:56.041 it's time to go for a walk. 00:04:56.065 --> 00:04:57.250 And how did she learn? 00:04:57.274 --> 00:05:00.598 Well, every time the owner picked up the leash, they went for a walk. 00:05:00.622 --> 00:05:02.500 And Maggie did three things: 00:05:02.524 --> 00:05:04.393 she had to pay attention, 00:05:04.417 --> 00:05:06.499 she had to remember what happened 00:05:06.523 --> 00:05:10.540 and she had to retain and create a pattern in her mind. NOTE Paragraph 00:05:11.429 --> 00:05:13.524 Interestingly, that's exactly what 00:05:13.548 --> 00:05:16.071 computer scientists have been trying to get AIs to do 00:05:16.095 --> 00:05:17.954 for the last 60 or so years. 00:05:18.683 --> 00:05:20.032 Back in 1952, 00:05:20.056 --> 00:05:23.857 they built this computer that could play Tic-Tac-Toe. 00:05:25.081 --> 00:05:26.241 Big deal. 00:05:27.029 --> 00:05:30.029 Then 45 years later, in 1997, 00:05:30.053 --> 00:05:32.525 Deep Blue beats Kasparov at chess. 00:05:34.046 --> 00:05:39.014 2011, Watson beats these two humans at Jeopardy, 00:05:39.038 --> 00:05:41.966 which is much harder for a computer to play than chess is. 00:05:41.990 --> 00:05:45.802 In fact, rather than working from predefined recipes, 00:05:45.826 --> 00:05:49.149 Watson had to use reasoning to overcome his human opponents. 00:05:50.393 --> 00:05:52.832 And then a couple of weeks ago, 00:05:52.856 --> 00:05:57.118 DeepMind's AlphaGo beats the world's best human at Go, 00:05:57.142 --> 00:05:59.354 which is the most difficult game that we have. 00:05:59.378 --> 00:06:02.274 In fact, in Go, there are more possible moves 00:06:02.298 --> 00:06:04.322 than there are atoms in the universe. 00:06:06.210 --> 00:06:08.036 So in order to win, 00:06:08.060 --> 00:06:10.678 what AlphaGo had to do was develop intuition. 00:06:11.098 --> 00:06:15.208 And in fact, at some points, AlphaGo's programmers didn't understand 00:06:15.232 --> 00:06:17.518 why AlphaGo was doing what it was doing. NOTE Paragraph 00:06:19.451 --> 00:06:21.111 And things are moving really fast. 00:06:21.135 --> 00:06:24.362 I mean, consider -- in the space of a human lifetime, 00:06:24.386 --> 00:06:26.619 computers have gone from a child's game 00:06:27.920 --> 00:06:30.968 to what's recognized as the pinnacle of strategic thought. 00:06:31.999 --> 00:06:34.416 What's basically happening 00:06:34.440 --> 00:06:37.750 is computers are going from being like Spock 00:06:37.774 --> 00:06:39.723 to being a lot more like Kirk. NOTE Paragraph 00:06:39.747 --> 00:06:43.365 (Laughter) NOTE Paragraph 00:06:43.389 --> 00:06:46.813 Right? From pure logic to intuition. 00:06:48.184 --> 00:06:49.927 Would you cross this bridge? 00:06:50.609 --> 00:06:52.932 Most of you are saying, "Oh, hell no!" NOTE Paragraph 00:06:52.956 --> 00:06:54.264 (Laughter) NOTE Paragraph 00:06:54.288 --> 00:06:56.945 And you arrived at that decision in a split second. 
00:06:56.969 --> 00:06:59.397 You just sort of knew that bridge was unsafe. 00:06:59.421 --> 00:07:01.410 And that's exactly the kind of intuition 00:07:01.434 --> 00:07:05.002 that our deep-learning systems are starting to develop right now. 00:07:05.722 --> 00:07:07.429 Very soon, you'll literally be able 00:07:07.453 --> 00:07:09.659 to show something you've made, you've designed, 00:07:09.683 --> 00:07:10.836 to a computer, 00:07:10.860 --> 00:07:12.349 and it will look at it and say, 00:07:12.373 --> 00:07:15.196 "Sorry, homie, that'll never work. You have to try again." 00:07:15.854 --> 00:07:18.924 Or you could ask it if people are going to like your next song, 00:07:19.773 --> 00:07:21.836 or your next flavor of ice cream. 00:07:23.549 --> 00:07:26.128 Or, much more importantly, 00:07:26.152 --> 00:07:28.516 you could work with a computer to solve a problem 00:07:28.540 --> 00:07:30.177 that we've never faced before. 00:07:30.201 --> 00:07:31.602 For instance, climate change. 00:07:31.626 --> 00:07:33.646 We're not doing a very good job on our own, 00:07:33.670 --> 00:07:35.915 so we could certainly use all the help we can get. 00:07:35.939 --> 00:07:37.397 That's what I'm talking about, 00:07:37.421 --> 00:07:39.976 technology amplifying our cognitive abilities 00:07:40.000 --> 00:07:43.552 so we can imagine and design things that were simply out of our reach 00:07:43.576 --> 00:07:46.135 as plain old un-augmented humans. NOTE Paragraph 00:07:47.984 --> 00:07:50.925 So what about making all of this crazy new stuff 00:07:50.949 --> 00:07:53.390 that we're going to invent and design? 00:07:53.952 --> 00:07:58.045 I think the era of human augmentation is as much about the physical world 00:07:58.069 --> 00:08:01.134 as it is about the virtual, intellectual realm. 00:08:01.833 --> 00:08:03.754 How will technology augment us? 00:08:04.261 --> 00:08:06.734 In the physical world, robotic systems. 00:08:07.620 --> 00:08:09.356 OK, there's certainly a fear 00:08:09.380 --> 00:08:11.868 that robots are going to take jobs away from humans, 00:08:11.892 --> 00:08:13.722 and that is true in certain sectors. 00:08:14.174 --> 00:08:17.052 But I'm much more interested in this idea 00:08:17.076 --> 00:08:22.086 that humans and robots working together are going to augment each other, 00:08:22.110 --> 00:08:24.168 and start to inhabit a new space. NOTE Paragraph 00:08:24.192 --> 00:08:26.554 This is our applied research lab in San Francisco, 00:08:26.578 --> 00:08:29.720 where one of our areas of focus is advanced robotics, 00:08:29.744 --> 00:08:32.255 specifically, human-robot collaboration. 00:08:33.034 --> 00:08:35.793 And this is Bishop, one of our robots. 00:08:35.817 --> 00:08:37.606 As an experiment, we set it up 00:08:37.630 --> 00:08:41.090 to help a person working in construction doing repetitive tasks -- 00:08:41.984 --> 00:08:46.178 tasks like cutting out holes for outlets or light switches in drywall. NOTE Paragraph 00:08:46.202 --> 00:08:48.668 (Laughter) NOTE Paragraph 00:08:49.877 --> 00:08:52.988 So, Bishop's human partner can tell it what to do in plain English 00:08:53.012 --> 00:08:54.317 and with simple gestures, 00:08:54.341 --> 00:08:55.788 kind of like talking to a dog, 00:08:55.812 --> 00:08:57.955 and then Bishop executes on those instructions 00:08:57.979 --> 00:08:59.871 with perfect precision. 00:08:59.895 --> 00:09:02.884 We're using the human for what the human is good at: 00:09:02.908 --> 00:09:05.241 awareness, perception and decision making. 
00:09:05.265 --> 00:09:07.505 And we're using the robot for what it's good at: 00:09:07.529 --> 00:09:09.277 precision and repetitiveness. NOTE Paragraph 00:09:10.252 --> 00:09:12.619 Here's another cool project that Bishop worked on. 00:09:12.643 --> 00:09:15.718 The goal of this project, which we called the HIVE, 00:09:15.742 --> 00:09:19.593 was to prototype the experience of humans, computers and robots 00:09:19.617 --> 00:09:22.837 all working together to solve a highly complex design problem. 00:09:23.793 --> 00:09:25.244 The humans acted as labor. 00:09:25.268 --> 00:09:28.741 They cruised around the construction site, they manipulated the bamboo -- 00:09:28.765 --> 00:09:31.521 which, by the way, because it's a non-isotropic material, 00:09:31.545 --> 00:09:33.419 is super hard for robots to deal with. 00:09:33.443 --> 00:09:35.465 But then the robots did this fiber winding, 00:09:35.489 --> 00:09:37.940 which was almost impossible for a human to do. 00:09:37.964 --> 00:09:41.585 And then we had an AI that was controlling everything. 00:09:41.609 --> 00:09:44.899 It was telling the humans what to do, telling the robots what to do 00:09:44.923 --> 00:09:47.838 and keeping track of thousands of individual components. 00:09:47.862 --> 00:09:49.042 What's interesting is, 00:09:49.066 --> 00:09:52.207 building this pavilion was simply not possible 00:09:52.231 --> 00:09:56.755 without human, robot and AI augmenting each other. NOTE Paragraph 00:09:57.890 --> 00:10:01.210 OK, I'll share one more project. This one's a little bit crazy. 00:10:01.234 --> 00:10:05.702 We're working with Amsterdam-based artist Joris Laarman and his team at MX3D 00:10:05.726 --> 00:10:08.604 to generatively design and robotically print 00:10:08.628 --> 00:10:11.623 the world's first autonomously manufactured bridge. 00:10:12.315 --> 00:10:16.000 So, Joris and an AI are designing this thing right now, as we speak, 00:10:16.024 --> 00:10:17.196 in Amsterdam. 00:10:17.220 --> 00:10:19.541 And when they're done, we're going to hit "Go," 00:10:19.565 --> 00:10:22.876 and robots will start 3D printing in stainless steel, 00:10:22.900 --> 00:10:26.183 and then they're going to keep printing, without human intervention, 00:10:26.207 --> 00:10:27.765 until the bridge is finished. NOTE Paragraph 00:10:29.099 --> 00:10:32.027 So, as computers are going to augment our ability 00:10:32.051 --> 00:10:34.201 to imagine and design new stuff, 00:10:34.225 --> 00:10:37.120 robotic systems are going to help us build and make things 00:10:37.144 --> 00:10:39.228 that we've never been able to make before. 00:10:40.347 --> 00:10:44.507 But what about our ability to sense and control these things? 00:10:44.531 --> 00:10:48.562 What about a nervous system for the things that we make? NOTE Paragraph 00:10:48.586 --> 00:10:51.098 Our nervous system, the human nervous system, 00:10:51.122 --> 00:10:53.433 tells us everything that's going on around us. 00:10:54.186 --> 00:10:57.870 But the nervous system of the things we make is rudimentary at best. 00:10:57.894 --> 00:11:01.457 For instance, a car doesn't tell the city's public works department 00:11:01.481 --> 00:11:04.611 that it just hit a pothole at the corner of Broadway and Morrison. 
00:11:04.635 --> 00:11:06.667 A building doesn't tell its designers 00:11:06.691 --> 00:11:09.375 whether or not the people inside like being there, 00:11:09.399 --> 00:11:12.409 and the toy manufacturer doesn't know 00:11:12.433 --> 00:11:14.440 if a toy is actually being played with -- 00:11:14.464 --> 00:11:17.003 how and where and whether or not it's any fun. 00:11:17.620 --> 00:11:21.434 Look, I'm sure that the designers imagined this lifestyle for Barbie 00:11:21.458 --> 00:11:22.682 when they designed her. NOTE Paragraph 00:11:22.706 --> 00:11:24.153 (Laughter) NOTE Paragraph 00:11:24.177 --> 00:11:27.083 But what if it turns out that Barbie's actually really lonely? NOTE Paragraph 00:11:27.107 --> 00:11:30.254 (Laughter) NOTE Paragraph 00:11:31.266 --> 00:11:32.554 If the designers had known 00:11:32.578 --> 00:11:34.685 what was really happening in the real world 00:11:34.709 --> 00:11:37.292 with their designs -- the road, the building, Barbie -- 00:11:37.316 --> 00:11:40.010 they could've used that knowledge to create an experience 00:11:40.034 --> 00:11:41.434 that was better for the user. 00:11:41.458 --> 00:11:43.249 What's missing is a nervous system 00:11:43.273 --> 00:11:46.982 connecting us to all of the things that we design, make and use. 00:11:47.915 --> 00:11:51.470 What if all of you had that kind of information flowing to you 00:11:51.494 --> 00:11:53.677 from the things you create in the real world? 00:11:55.432 --> 00:11:56.883 With all of the stuff we make, 00:11:56.907 --> 00:11:59.342 we spend a tremendous amount of money and energy -- 00:11:59.366 --> 00:12:01.742 in fact, last year, about two trillion dollars -- 00:12:01.766 --> 00:12:04.620 convincing people to buy the things we've made. 00:12:04.644 --> 00:12:08.032 But if you had this connection to the things that you design and create 00:12:08.056 --> 00:12:09.783 after they're out in the real world, 00:12:09.807 --> 00:12:13.421 after they've been sold or launched or whatever, 00:12:13.445 --> 00:12:15.065 we could actually change that, 00:12:15.089 --> 00:12:18.136 and go from making people want our stuff, 00:12:18.160 --> 00:12:21.594 to just making stuff that people want in the first place. NOTE Paragraph 00:12:21.618 --> 00:12:24.405 The good news is, we're working on digital nervous systems 00:12:24.429 --> 00:12:27.230 that connect us to the things we design. 00:12:28.365 --> 00:12:29.992 We're working on one project 00:12:30.016 --> 00:12:33.728 with a couple of guys down in Los Angeles called the Bandito Brothers 00:12:33.752 --> 00:12:35.159 and their team. 00:12:35.183 --> 00:12:38.616 And one of the things these guys do is build insane cars 00:12:38.640 --> 00:12:41.513 that do absolutely insane things. 00:12:42.905 --> 00:12:44.355 These guys are crazy -- NOTE Paragraph 00:12:44.379 --> 00:12:45.415 (Laughter) NOTE Paragraph 00:12:45.439 --> 00:12:46.842 in the best way. 00:12:48.993 --> 00:12:50.756 And what we're doing with them 00:12:50.780 --> 00:12:53.220 is taking a traditional race-car chassis 00:12:53.244 --> 00:12:54.829 and giving it a nervous system. NOTE Paragraph 00:12:54.853 --> 00:12:57.911 So we instrumented it with dozens of sensors, 00:12:57.935 --> 00:13:00.570 put a world-class driver behind the wheel, 00:13:00.594 --> 00:13:03.951 took it out to the desert and drove the hell out of it for a week. 00:13:03.975 --> 00:13:06.466 And the car's nervous system captured everything 00:13:06.490 --> 00:13:07.972 that was happening to the car. 
00:13:07.996 --> 00:13:10.617 We captured four billion data points; 00:13:10.641 --> 00:13:12.951 all of the forces that it was subjected to. 00:13:12.975 --> 00:13:14.634 And then we did something crazy. 00:13:15.268 --> 00:13:16.768 We took all of that data, 00:13:16.792 --> 00:13:20.528 and plugged it into a generative-design AI we call "Dreamcatcher." 00:13:21.270 --> 00:13:25.234 So what do you get when you give a design tool a nervous system, 00:13:25.258 --> 00:13:28.140 and you ask it to build you the ultimate car chassis? 00:13:28.723 --> 00:13:30.696 You get this. 00:13:32.293 --> 00:13:36.006 This is something that a human could never have designed. 00:13:36.707 --> 00:13:38.595 Except a human did design this, 00:13:38.619 --> 00:13:42.928 but it was a human that was augmented by a generative-design AI, 00:13:42.952 --> 00:13:44.183 a digital nervous system 00:13:44.207 --> 00:13:47.212 and robots that can actually fabricate something like this. NOTE Paragraph 00:13:47.680 --> 00:13:51.275 So if this is the future, the Augmented Age, 00:13:51.299 --> 00:13:55.560 and we're going to be augmented cognitively, physically and perceptually, 00:13:55.584 --> 00:13:56.992 what will that look like? 00:13:57.576 --> 00:14:00.897 What is this wonderland going to be like? NOTE Paragraph 00:14:00.921 --> 00:14:02.630 I think we're going to see a world 00:14:02.654 --> 00:14:05.722 where we're moving from things that are fabricated 00:14:05.746 --> 00:14:07.191 to things that are farmed. 00:14:08.159 --> 00:14:11.612 Where we're moving from things that are constructed 00:14:11.636 --> 00:14:13.340 to that which is grown. 00:14:14.134 --> 00:14:16.322 We're going to move from being isolated 00:14:16.346 --> 00:14:17.956 to being connected. 00:14:18.634 --> 00:14:21.045 And we'll move away from extraction 00:14:21.069 --> 00:14:22.942 to embrace aggregation. 00:14:23.967 --> 00:14:27.734 I also think we'll shift from craving obedience from our things 00:14:27.758 --> 00:14:29.399 to valuing autonomy. NOTE Paragraph 00:14:30.510 --> 00:14:32.415 Thanks to our augmented capabilities, 00:14:32.439 --> 00:14:34.816 our world is going to change dramatically. 00:14:35.576 --> 00:14:38.822 We're going to have a world with more variety, more connectedness, 00:14:38.846 --> 00:14:41.133 more dynamism, more complexity, 00:14:41.157 --> 00:14:43.475 more adaptability and, of course, 00:14:43.499 --> 00:14:44.716 more beauty. 00:14:45.231 --> 00:14:46.795 The shape of things to come 00:14:46.819 --> 00:14:49.109 will be unlike anything we've ever seen before. 00:14:49.133 --> 00:14:50.292 Why? 00:14:50.316 --> 00:14:54.071 Because what will be shaping those things is this new partnership 00:14:54.095 --> 00:14:57.765 between technology, nature and humanity. 00:14:59.279 --> 00:15:03.083 That, to me, is a future well worth looking forward to. NOTE Paragraph 00:15:03.107 --> 00:15:04.378 Thank you all so much. NOTE Paragraph 00:15:04.402 --> 00:15:10.071 (Applause)