0:00:00.835,0:00:03.148 How many of you are creatives? 0:00:03.148,0:00:06.796 Designers, engineers,[br]entrepreneurs, artists, 0:00:06.796,0:00:09.207 or maybe you just have[br]a really big imagination. 0:00:09.207,0:00:10.207 Show of hands? 0:00:10.207,0:00:11.207 (Applause) 0:00:11.207,0:00:12.813 That's most of you. 0:00:13.434,0:00:15.728 I have some news for us creatives. 0:00:16.934,0:00:21.571 Over the course of the next 20 years ... 0:00:21.571,0:00:25.570 more will change around[br]the way we do our work 0:00:25.570,0:00:28.254 than has happened in the last 2,000. 0:00:28.611,0:00:33.239 In fact, I think we're at the dawn[br]of a new age in human history. 0:00:33.841,0:00:38.602 Now, there have been four major historical[br]eras defined by the way we work. 0:00:39.504,0:00:42.779 The Hunter-Gatherer Age[br]lasted several million years. 0:00:43.263,0:00:46.839 And then the Agricultural Age[br]lasted several thousand years. 0:00:47.295,0:00:50.809 The Industrial Age lasted[br]a couple of centuries, 0:00:50.809,0:00:54.843 and now the Information Age[br]has lasted just a few decades. 0:00:55.120,0:00:56.756 And now today, 0:00:56.756,0:01:00.340 we're on the cusp of our next[br]great era as a species. 0:01:01.496,0:01:03.969 Welcome to the Augmented Age. 0:01:04.199,0:01:05.351 In this new era, 0:01:05.351,0:01:09.199 your natural human capabilities are going[br]to be augmented by computational systems 0:01:09.199,0:01:11.009 that help you think, 0:01:11.009,0:01:13.175 robotic systems that help you make, 0:01:13.175,0:01:14.891 and a digital nervous system 0:01:14.891,0:01:19.214 that connects you to the world[br]far beyond your natural senses. 0:01:19.530,0:01:21.467 Let's start with cognitive augmentation. 0:01:21.467,0:01:23.667 How many of you are augmented cyborgs? 0:01:24.333,0:01:25.329 (Laughter) 0:01:26.905,0:01:30.006 I would actually argue that we're[br]already augmented.
0:01:30.488,0:01:32.016 Imagine you're at a party, 0:01:32.016,0:01:35.491 and somebody asks you a question[br]that you don't know the answer to. 0:01:35.491,0:01:36.940 If you have one of these, 0:01:36.940,0:01:38.038 in a few seconds, 0:01:38.038,0:01:39.644 you can know the answer. 0:01:39.951,0:01:42.635 But this is just a primitive beginning. 0:01:42.938,0:01:46.436 Even Siri is just a passive tool. 0:01:46.860,0:01:50.186 In fact, for the last[br]three-and-a-half million years, 0:01:50.186,0:01:53.633 the tools that we've had[br]have been completely passive. 0:01:54.403,0:01:57.971 They do exactly what we tell them[br]and nothing more. 0:01:57.971,0:02:01.462 Our very first tool only cut[br]where we struck it. 0:02:02.022,0:02:05.284 The chisel only carves[br]where the artist points it. 0:02:05.543,0:02:10.898 And even our most advanced tools[br]do nothing without our explicit direction. 0:02:11.208,0:02:12.647 In fact, to date -- 0:02:12.647,0:02:14.672 and this is something[br]that frustrates me -- 0:02:14.672,0:02:15.933 we've always been limited 0:02:15.933,0:02:19.215 by this need to manually[br]push our wills into our tools -- 0:02:19.215,0:02:20.230 like manual, 0:02:20.230,0:02:21.715 like literally using our hands, 0:02:21.715,0:02:23.277 even with computers. 0:02:24.272,0:02:26.759 But I'm more like Scotty in "Star Trek." 0:02:26.759,0:02:27.759 (Laughter) 0:02:28.584,0:02:30.803 I want to have a conversation[br]with a computer. 0:02:30.803,0:02:33.797 I want to say, "Computer,[br]let's design a car," 0:02:33.797,0:02:35.360 and the computer shows me a car. 0:02:35.360,0:02:37.992 And I say, "No, more fast-looking,[br]and less German," 0:02:37.992,0:02:40.156 and bang, the computer shows me an option. 
0:02:40.156,0:02:41.159 (Laughter) 0:02:42.408,0:02:44.738 That conversation might be[br]a little ways off, 0:02:44.738,0:02:47.290 probably less far off[br]than any of us think, 0:02:47.290,0:02:49.300 but right now, 0:02:49.300,0:02:50.307 we're working on it. 0:02:50.307,0:02:54.340 Tools are making this leap from being[br]passive to being generative. 0:02:55.031,0:02:58.363 Generative design tools[br]use a computer and algorithms 0:02:58.363,0:03:00.995 to synthesize geometry 0:03:00.995,0:03:03.749 to come up with new designs[br]all by themselves. 0:03:04.196,0:03:06.968 All they need are your goals[br]and your constraints. 0:03:06.968,0:03:08.400 I'll give you an example. 0:03:08.400,0:03:11.212 In the case of this aerial drone chassis, 0:03:11.212,0:03:13.593 all you would need to do[br]is tell it something like, 0:03:13.593,0:03:15.083 it has four propellers, 0:03:15.083,0:03:17.186 you want it to be[br]as lightweight as possible, 0:03:17.186,0:03:19.421 and you need it to be[br]aerodynamically efficient. 0:03:19.421,0:03:21.550 And then what the computer does 0:03:21.550,0:03:24.546 is it explores the entire solution space: 0:03:24.546,0:03:28.497 every single possibility that solves[br]and meets your criteria -- 0:03:28.497,0:03:29.844 millions of them. 0:03:29.844,0:03:31.797 It takes big computers to do this, 0:03:31.797,0:03:33.854 but it comes back to us with designs 0:03:33.854,0:03:36.997 that we by ourselves[br]never could've imagined. 0:03:37.380,0:03:40.143 And the computer's coming up[br]with this stuff all by itself, 0:03:40.143,0:03:42.153 no one ever drew anything, 0:03:42.153,0:03:44.532 and it started completely from scratch. 0:03:45.115,0:03:46.135 And by the way, 0:03:46.135,0:03:47.649 it's no accident 0:03:47.649,0:03:51.130 that the drone body looks just like[br]the pelvis of a flying squirrel.
0:03:51.487,0:03:52.612 (Laughter) 0:03:54.162,0:03:58.314 It's because the algorithms are designed[br]to work the same way that evolution does. 0:03:58.915,0:04:02.685 What's exciting is we're starting to see[br]this technology out in the real world. 0:04:02.685,0:04:05.159 We've been working with Airbus[br]for a couple of years 0:04:05.159,0:04:07.085 on this concept plane for the future. 0:04:07.085,0:04:09.285 It's a ways out still, 0:04:09.285,0:04:13.018 but just recently we used[br]a generative design AI 0:04:13.018,0:04:15.390 to come up with this. 0:04:15.809,0:04:20.891 This is a 3D printed cabin partition[br]that's been designed by a computer. 0:04:20.891,0:04:23.834 It's stronger than the original[br]yet half the weight, 0:04:23.834,0:04:26.980 and it will be flying[br]in the Airbus A320 later this year. 0:04:27.498,0:04:29.188 So, computers can now generate. 0:04:29.188,0:04:33.694 They can come up with their own solutions[br]to our well-defined problems. 0:04:34.815,0:04:36.211 But they're not intuitive. 0:04:36.211,0:04:39.321 They still have to start from scratch[br]every single time, 0:04:39.321,0:04:42.568 and that's because they never learn ... 0:04:42.568,0:04:44.358 unlike Maggie. 0:04:44.358,0:04:45.364 (Laughter) 0:04:45.764,0:04:49.118 Maggie's actually smarter than our[br]most advanced design tools. 0:04:49.522,0:04:50.985 What do I mean by that? 0:04:50.985,0:04:52.578 If her owner picks up that leash, 0:04:52.578,0:04:54.624 Maggie knows with a fair[br]degree of certainty 0:04:54.624,0:04:56.048 that it's time for a walk. 0:04:56.265,0:04:57.475 And how did she learn? 0:04:57.475,0:04:59.703 Well, every time the owner[br]picked up the leash, 0:04:59.703,0:05:00.715 they went for a walk. 0:05:00.715,0:05:02.724 And Maggie did three things: 0:05:02.724,0:05:04.617 she had to pay attention, 0:05:04.617,0:05:06.723 she had to remember what happened 0:05:06.723,0:05:11.157 and she had to retain and create[br]a pattern in her mind.
0:05:11.544,0:05:12.634 Interestingly, 0:05:12.634,0:05:16.094 that's exactly what computer scientists[br]have been trying to get AIs to do 0:05:16.094,0:05:18.056 for the last 60 or so years. 0:05:18.681,0:05:20.256 Back in 1952, 0:05:20.256,0:05:24.292 they built this computer[br]that could play Tic-Tac-Toe. 0:05:25.130,0:05:26.290 Big deal. 0:05:26.931,0:05:28.487 Then 45 years later, 0:05:28.487,0:05:30.253 in 1997, 0:05:30.253,0:05:33.332 Deep Blue beats Kasparov at chess. 0:05:34.246,0:05:39.031 2011, Watson beats these two[br]humans at Jeopardy, 0:05:39.031,0:05:41.901 which is much harder for a computer[br]to play than chess is. 0:05:42.190,0:05:46.026 In fact, rather than working[br]from predefined recipes, 0:05:46.026,0:05:49.349 Watson had to use reasoning[br]to overcome its human opponents. 0:05:50.438,0:05:53.056 And then a couple of weeks ago, 0:05:53.056,0:05:57.225 DeepMind's AlphaGo beats[br]the world's best human at Go, 0:05:57.225,0:05:59.382 which is the most difficult[br]game that we have. 0:05:59.382,0:06:00.379 In fact, in Go, 0:06:00.379,0:06:04.706 there are more possible moves[br]than there are atoms in the universe. 0:06:06.315,0:06:08.260 So in order to win, 0:06:08.260,0:06:11.210 what AlphaGo had to do[br]was develop intuition, 0:06:11.210,0:06:12.273 and in fact, 0:06:12.273,0:06:13.273 at some points, 0:06:13.273,0:06:17.631 AlphaGo's programmers didn't understand[br]why AlphaGo was doing what it was doing. 0:06:19.527,0:06:21.168 And things are moving really fast. 0:06:21.168,0:06:22.286 I mean, consider -- 0:06:22.286,0:06:24.586 in the space of a human lifetime, 0:06:24.586,0:06:28.120 computers have gone from a child's game 0:06:28.120,0:06:31.537 to what's recognized as the pinnacle[br]of strategic thought. 0:06:32.199,0:06:34.602 What's basically happening 0:06:34.602,0:06:37.974 is computers are going[br]from being like Spock 0:06:37.974,0:06:39.947 to being a lot more like Kirk.
0:06:39.947,0:06:41.895 (Laughter) 0:06:43.526,0:06:44.534 Right? 0:06:44.534,0:06:47.111 From pure logic to intuition. 0:06:48.298,0:06:50.373 Would you cross this bridge? 0:06:50.698,0:06:51.810 Most of you are saying, 0:06:51.810,0:06:53.156 "Oh, hell no." 0:06:53.156,0:06:54.159 (Laughter) 0:06:54.488,0:06:57.169 And you arrived at that decision[br]in a split second. 0:06:57.169,0:06:59.518 You just sort of knew[br]that that bridge was unsafe. 0:06:59.518,0:07:01.634 And that's exactly the kind of intuition 0:07:01.634,0:07:05.202 that our deep learning systems[br]are starting to develop right now. 0:07:05.607,0:07:06.607 Very soon, 0:07:06.607,0:07:09.191 you'll literally be able[br]to show something you've made, 0:07:09.191,0:07:10.194 you've designed, 0:07:10.194,0:07:11.193 to a computer, 0:07:11.193,0:07:12.670 and it will look at it and say, 0:07:12.670,0:07:14.540 "Mm, sorry homey, that will never work, 0:07:14.540,0:07:15.672 you have to try again." 0:07:16.078,0:07:19.973 Or you could ask it if people[br]are going to like your next song, 0:07:19.973,0:07:22.568 or your next flavor of ice cream. 0:07:23.607,0:07:26.395 Or, much more importantly, 0:07:26.395,0:07:30.290 you could work with a computer to solve[br]a problem that we've never faced before. 0:07:30.290,0:07:31.648 For instance, climate change. 0:07:31.648,0:07:33.676 We're not doing a very[br]good job on our own, 0:07:33.676,0:07:35.877 and we could certainly use[br]all the help we can get. 0:07:36.048,0:07:37.485 That's what I'm talking about: 0:07:37.485,0:07:40.200 technology amplifying[br]our cognitive abilities 0:07:40.200,0:07:41.866 so we can imagine and design things 0:07:41.866,0:07:46.633 that were simply out of our reach[br]as plain old unaugmented humans. 0:07:48.137,0:07:51.006 So, what about making[br]all of this crazy new stuff 0:07:51.006,0:07:53.754 that we're going to invent and design?
0:07:54.152,0:07:58.269 I think the era of human augmentation[br]is as much about the physical world 0:07:58.269,0:08:01.743 as it is about the virtual,[br]intellectual realm. 0:08:02.033,0:08:04.574 So how will technology augment us? 0:08:04.574,0:08:05.768 In the physical world, 0:08:05.768,0:08:07.098 robotic systems. 0:08:07.820,0:08:11.793 OK, there's certainly a fear that robots[br]are going to take jobs away from humans, 0:08:11.793,0:08:14.029 and that is true in certain sectors. 0:08:14.306,0:08:17.352 But I'm much more interested in this idea 0:08:17.352,0:08:22.310 that humans and robots working together[br]are going to augment each other, 0:08:22.310,0:08:24.392 and start to inhabit a new space. 0:08:24.392,0:08:26.778 This is our applied[br]research lab in San Francisco, 0:08:26.778,0:08:29.944 where one of our areas of focus[br]is advanced robotics, 0:08:29.944,0:08:32.608 specifically human-robot collaboration. 0:08:33.234,0:08:34.644 And this is Bishop, 0:08:34.644,0:08:36.017 one of our robots. 0:08:36.017,0:08:37.340 As an experiment, 0:08:37.340,0:08:41.243 we set it up to help a person working[br]in construction doing repetitive tasks. 0:08:42.098,0:08:46.377 Tasks like cutting out holes for outlets[br]or light switches in drywall. 0:08:46.704,0:08:47.948 (Laughter) 0:08:50.077,0:08:53.212 So, Bishop's human partner can[br]tell it what to do in plain English 0:08:53.212,0:08:54.541 and with simple gestures, 0:08:54.541,0:08:56.012 kind of like talking to a dog, 0:08:56.012,0:08:58.178 and then Bishop executes[br]on those instructions 0:08:58.178,0:08:59.767 with perfect precision. 0:09:00.047,0:09:02.546 We're using the human for what[br]the human is good at, 0:09:02.546,0:09:03.552 right? 0:09:03.552,0:09:05.583 Awareness, perception and decision making. 0:09:05.583,0:09:07.826 And we're using the robot[br]for what it's good at: 0:09:07.826,0:09:09.848 precision and repetitiveness.
0:09:10.256,0:09:12.682 Here's another cool project[br]that Bishop worked on. 0:09:12.682,0:09:14.181 The goal of this project, 0:09:14.181,0:09:15.942 which we called the HIVE, 0:09:15.942,0:09:19.817 was to prototype the experience[br]of humans, computers and robots 0:09:19.817,0:09:23.398 all working together to solve[br]a highly complex design problem. 0:09:23.831,0:09:25.365 The humans acted as labor. 0:09:25.365,0:09:27.383 They cruised around the construction site, 0:09:27.383,0:09:28.735 they manipulated the bamboo, 0:09:28.735,0:09:29.743 which by the way, 0:09:29.743,0:09:31.659 because it's a [nonisomorphic] material, 0:09:31.659,0:09:33.504 is super hard for robots to deal with. 0:09:33.504,0:09:35.598 But then the robots[br]did this fiber winding, 0:09:35.598,0:09:37.940 which was almost impossible[br]for a human to do. 0:09:37.940,0:09:41.702 And then we had an AI[br]that was controlling everything. 0:09:41.702,0:09:43.466 It was telling the humans what to do, 0:09:43.466,0:09:45.265 it was telling the robots what to do, 0:09:45.265,0:09:47.993 and keeping track of thousands[br]of individual components. 0:09:47.993,0:09:52.431 What's interesting is building[br]this pavilion was simply not possible 0:09:52.431,0:09:57.306 without human, robot and AI[br]augmenting each other. 0:09:57.868,0:09:59.391 OK, I'll share one more project. 0:09:59.391,0:10:01.363 This one's a little bit crazy. 0:10:01.363,0:10:05.926 We're working with Amsterdam-based artist,[br]Joris Laarman and his team at MX3D 0:10:05.926,0:10:08.828 to generatively design[br]and robotically print 0:10:08.828,0:10:11.823 the world's first autonomously[br]manufactured bridge. 0:10:12.566,0:10:16.224 So, Joris and an AI are designing[br]this thing right now, as we speak, 0:10:16.224,0:10:17.260 in Amsterdam. 
0:10:17.260,0:10:18.341 And when they're done, 0:10:18.341,0:10:19.842 we're going to hit "go," 0:10:19.842,0:10:23.176 and robots will start 3D printing[br]in stainless steel, 0:10:23.176,0:10:26.482 and then they're going to keep printing[br]without human intervention 0:10:26.482,0:10:28.341 until the bridge is finished. 0:10:29.250,0:10:32.251 So, as computers are going[br]to augment our ability 0:10:32.251,0:10:34.453 to imagine and design new stuff, 0:10:34.453,0:10:36.179 robotic systems are going to help us 0:10:36.179,0:10:39.883 build and make things that we've[br]never been able to make before. 0:10:40.440,0:10:44.378 But what about our ability[br]to sense and control these things? 0:10:44.624,0:10:48.529 What about a nervous system[br]for the things that we make? 0:10:48.787,0:10:50.230 Our nervous system -- 0:10:50.230,0:10:51.542 the human nervous system -- 0:10:51.542,0:10:54.093 tells us everything that's[br]going on around us. 0:10:54.363,0:10:57.750 But the nervous system of the things[br]we make is rudimentary at best. 0:10:58.119,0:10:59.131 For instance, 0:10:59.131,0:11:01.681 a car doesn't tell the city's[br]Public Works department 0:11:01.681,0:11:04.835 that it just hit a pothole at the corner[br]of Broadway and Morrison; 0:11:04.835,0:11:06.891 a building doesn't tell its designers 0:11:06.891,0:11:09.683 whether or not the people[br]inside like being there, 0:11:09.683,0:11:14.570 and the toy manufacturer doesn't know[br]if a toy is actually being played with -- 0:11:14.570,0:11:17.461 how and where and whether[br]or not it's any fun. 0:11:17.820,0:11:21.569 Look, I'm sure that the designers[br]imagined this lifestyle for Barbie 0:11:21.569,0:11:23.591 when they designed her -- 0:11:23.591,0:11:24.589 right? 0:11:24.589,0:11:27.489 But what if it turns out that Barbie's[br]actually really lonely?
0:11:27.489,0:11:29.068 (Laughter) 0:11:31.263,0:11:34.549 If the designers had known what was[br]really happening in the real world 0:11:34.549,0:11:35.551 with their designs -- 0:11:35.551,0:11:37.275 the road, the building and Barbie -- 0:11:37.275,0:11:38.863 they could've used that knowledge 0:11:38.863,0:11:41.337 to create an experience[br]that was better for the user. 0:11:41.337,0:11:43.473 What's missing is a nervous system 0:11:43.473,0:11:47.182 connecting us to all of the things[br]that we design, make and use. 0:11:48.115,0:11:51.694 What if all of you had that kind[br]of information flowing to you 0:11:51.694,0:11:54.005 from the things you create[br]in the real world? 0:11:55.507,0:11:56.951 With all of the stuff we make, 0:11:56.951,0:11:59.349 we spend a tremendous amount[br]of money and energy -- 0:11:59.349,0:12:01.575 in fact last year about[br]two trillion dollars -- 0:12:01.575,0:12:04.298 convincing people to buy[br]the things that we've made. 0:12:04.666,0:12:07.992 But if you had this connection[br]to the things that you design and create 0:12:07.992,0:12:09.744 after they're out in the real world, 0:12:09.744,0:12:13.725 after they've been sold,[br]or launched or whatever, 0:12:13.725,0:12:15.368 we could actually change that, 0:12:15.368,0:12:18.360 and go from making people want our stuff, 0:12:18.360,0:12:21.617 to just making stuff that people[br]want in the first place. 0:12:21.818,0:12:24.629 The good news is we're working[br]on digital nervous systems 0:12:24.629,0:12:27.430 that connect us to the things we design. 0:12:28.732,0:12:32.659 We're working on one project[br]with a couple of guys down in Los Angeles 0:12:32.659,0:12:33.993 called the Bandito Brothers, 0:12:33.993,0:12:35.447 and their team, 0:12:35.447,0:12:38.903 and one of the things these guys do[br]is build insane cars 0:12:38.903,0:12:42.104 that do absolutely insane things. 
0:12:42.927,0:12:44.579 These guys are crazy -- 0:12:44.579,0:12:45.583 (Laughter) 0:12:45.583,0:12:47.213 in the best way. 0:12:49.061,0:12:50.980 And what we're doing with them 0:12:50.980,0:12:53.444 is taking a traditional racecar chassis 0:12:53.444,0:12:55.053 and giving it a nervous system. 0:12:55.053,0:12:58.135 So, we instrumented it[br]with dozens of sensors, 0:12:58.135,0:13:00.675 and then we put a world-class[br]driver behind the wheel, 0:13:00.675,0:13:01.885 took it out to the desert 0:13:01.885,0:13:03.831 and drove the hell out of it for a week. 0:13:04.089,0:13:07.822 And the car's nervous system captured[br]everything that was happening to the car. 0:13:07.822,0:13:10.841 We captured four billion data points ... 0:13:10.841,0:13:13.038 all of the forces[br]that it was subjected to. 0:13:13.038,0:13:15.087 And then we did something crazy. 0:13:15.363,0:13:16.791 We took all of that data, 0:13:16.791,0:13:20.638 and plugged it into a generative[br]design AI that we call Dreamcatcher. 0:13:21.470,0:13:25.458 So, what do you get when you give[br]a design tool a nervous system, 0:13:25.458,0:13:28.400 and you ask it to build you[br]the ultimate car chassis? 0:13:28.923,0:13:31.263 You get this. 0:13:32.493,0:13:36.578 This is something that a human[br]could never have designed. 0:13:36.907,0:13:38.819 Except a human did design this, 0:13:38.819,0:13:42.936 but it was a human that was augmented[br]by a generative design AI, 0:13:42.936,0:13:44.294 a digital nervous system, 0:13:44.294,0:13:47.496 and robots that can actually[br]fabricate something like this. 0:13:47.880,0:13:49.724 So, if this is the future, 0:13:49.724,0:13:51.499 the Augmented Age, 0:13:51.499,0:13:55.784 and we're going to be augmented[br]cognitively, physically and perceptually, 0:13:55.784,0:13:57.483 what will that look like? 0:13:57.895,0:14:01.121 What is this wonderland going to be like?
0:14:01.121,0:14:02.854 I think we're going to see a world 0:14:02.854,0:14:05.946 where we're moving from[br]things that are fabricated 0:14:05.946,0:14:07.608 to things that are farmed. 0:14:08.201,0:14:11.775 Where we're moving from things[br]that are constructed 0:14:11.775,0:14:13.797 to that which is grown. 0:14:14.187,0:14:16.573 We're going to move from being isolated 0:14:16.573,0:14:18.834 to being connected, 0:14:18.834,0:14:21.269 and we'll move away from extraction 0:14:21.269,0:14:23.605 to embrace aggregation. 0:14:24.247,0:14:27.958 I also think we'll shift from craving[br]obedience from our things 0:14:27.958,0:14:29.906 to valuing autonomy. 0:14:30.554,0:14:32.639 Thanks to our augmented capabilities, 0:14:32.639,0:14:35.253 our world is going to change dramatically. 0:14:35.776,0:14:38.048 We're going to have a world[br]with more variety, 0:14:38.048,0:14:39.046 more connectedness, 0:14:39.046,0:14:40.131 more dynamism, 0:14:40.131,0:14:41.357 more complexity, 0:14:41.357,0:14:42.401 more adaptability, 0:14:42.401,0:14:43.699 and of course, 0:14:43.699,0:14:45.058 more beauty. 0:14:45.373,0:14:47.019 The shape of things to come 0:14:47.019,0:14:49.333 will be unlike anything[br]we've ever seen before. 0:14:49.333,0:14:50.328 Why? 0:14:50.328,0:14:52.376 Because what will be shaping those things 0:14:52.376,0:14:57.812 is this new partnership between[br]technology, nature and humanity. 0:14:59.125,0:15:03.022 That to me, is a future[br]well worth looking forward to. 0:15:03.307,0:15:04.602 Thank you all so much. 0:15:04.602,0:15:06.318 (Applause)