All right, Portland, wow. Thank you so much for having me. I love visiting this city. I mean, where else in the world can I have breakfast like this?

(Laughter)

And clearly, this is a very creative place.

(Laughter)

How many of you are creatives, designers, engineers, entrepreneurs, artists, or maybe you just have a really big imagination? Show of hands?

(Cheers)

That's most of you.

I have some news for us creatives. Over the course of the next 20 years, more will change around the way we do our work than has happened in the last 2,000. In fact, I think we're at the dawn of a new age in human history.

There have been four major historical eras defined by the way we work. The Hunter-Gatherer Age lasted several million years. Then the Agricultural Age lasted several thousand years. The Industrial Age lasted a couple of centuries. And the Information Age has lasted just a few decades. Now, today, we're on the cusp of our next great era as a species.

Welcome to the Augmented Age. In this new era, your natural human capabilities are going to be augmented by computational systems that help you think, robotic systems that help you make, and a digital nervous system that connects you to the world far beyond your natural senses.

Let's start with cognitive augmentation. How many of you are augmented cyborgs?

(Laughter)

There are three or four of you, and that's just because it's Portland.

(Laughter)

Yeah, keep it weird.

(Laughter)

I would actually argue that we're already augmented. Imagine you're at a party, and somebody asks you a question that you don't know the answer to. If you have one of these, in a few seconds, you can know the answer. But this is just a primitive beginning.

Even Siri is just a passive tool. In fact, for the last three and a half million years, the tools that we've had have been completely passive. They do exactly what we tell them and nothing more. Our very first tool only cut where we struck it. The chisel only carves where the artist points it. And even our most advanced tools do nothing without our explicit direction.
In fact, to date, and this is something that frustrates me, we've always been limited by this need to manually push our wills into our tools -- like, manual, literally using our hands, even with computers.

But I'm more like Scotty in "Star Trek."

(Laughter)

I want to have a conversation with a computer. I want to say, "Computer, let's design a car," and the computer shows me a car. And I say, "No, more fast-looking, and less German," and bang, the computer shows me an option.

(Laughter)

That conversation might be a little ways off, probably less than many of us think, but right now, we're working on it. Tools are making this leap from being passive to being generative. Generative design tools use a computer and algorithms to synthesize geometry to come up with new designs all by themselves. All they need are your goals and your constraints.

I'll give you an example. In the case of this aerial drone chassis, all you would need to do is tell it something like: it has four propellers, you want it to be as lightweight as possible, and you need it to be aerodynamically efficient. Then what the computer does is explore the entire solution space: every single possibility that solves and meets your criteria -- millions of them. It takes big computers to do this. But it comes back to us with designs that we, by ourselves, never could've imagined. And the computer's coming up with this stuff all by itself -- no one ever drew anything, and it started completely from scratch.

And by the way, it's no accident that the drone body looks just like the pelvis of a flying squirrel.

(Laughter)

It's because the algorithms are designed to work the same way evolution does.

What's exciting is we're starting to see this technology out in the real world. We've been working with Airbus for a couple of years on this concept plane for the future. It's a ways out still. But just recently we used a generative-design AI to come up with this. This is a 3D-printed cabin partition that's been designed by a computer. It's stronger than the original yet half the weight, and it will be flying in the Airbus A320 later this year.
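To make "goals and constraints" concrete, here's a rough sketch of the idea in code, with made-up parameters and a stand-in weight model. It's a toy illustration, not the actual design tool: the designer only states what the chassis has to do, and the computer searches candidate designs on its own.

```python
# Toy sketch of a goals-and-constraints generative search (illustrative only).
import random

def candidate_chassis():
    """Propose a random hypothetical drone-chassis design (made-up parameters)."""
    return {
        "arm_length_mm": random.uniform(80, 200),
        "arm_thickness_mm": random.uniform(2, 10),
        "body_radius_mm": random.uniform(30, 80),
    }

def weight_grams(d):
    # Stand-in weight model: longer, thicker arms and a bigger body weigh more.
    return 4 * d["arm_length_mm"] * d["arm_thickness_mm"] * 0.05 + d["body_radius_mm"] * 1.2

def meets_constraints(d):
    # Constraints stated up front: four propellers need clearance from the body,
    # and the arms have to be thick enough to be stiff.
    return d["arm_length_mm"] > 1.5 * d["body_radius_mm"] and d["arm_thickness_mm"] >= 3.0

# Goal: as lightweight as possible. Explore many candidates, keep the best one.
feasible = (d for d in (candidate_chassis() for _ in range(100_000)) if meets_constraints(d))
best = min(feasible, key=weight_grams)
print("lightest feasible design:", best, "->", round(weight_grams(best), 1), "g")
```

A real generative-design system explores geometry far more cleverly than random sampling, but the contract is the same: you hand over the goal and the constraints, and it hands back options.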
So computers can now generate; they can come up with their own solutions to our well-defined problems. But they're not intuitive. They still have to start from scratch every single time, and that's because they never learn.

Unlike Maggie.

(Laughter)

Maggie's actually smarter than our most advanced design tools. What do I mean by that? If her owner picks up that leash, Maggie knows with a fair degree of certainty it's time to go for a walk. And how did she learn? Well, every time the owner picked up the leash, they went for a walk. And Maggie did three things: she had to pay attention, she had to remember what happened, and she had to retain and create a pattern in her mind.

Interestingly, that's exactly what computer scientists have been trying to get AIs to do for the last 60 or so years. Back in 1952, they built this computer that could play tic-tac-toe. Big deal. Then 45 years later, in 1997, Deep Blue beats Kasparov at chess. 2011, Watson beats these two humans at Jeopardy, which is much harder for a computer to play than chess is. In fact, rather than working from predefined recipes, Watson had to use reasoning to overcome his human opponents.

And then a couple of weeks ago, DeepMind's AlphaGo beats the world's best human at Go, which is the most difficult game that we have. In fact, in Go, there are more possible moves than there are atoms in the universe. So in order to win, what AlphaGo had to do was develop intuition. And in fact, at some points, AlphaGo's programmers didn't understand why AlphaGo was doing what it was doing.

And things are moving really fast. I mean, consider -- in the space of a human lifetime, computers have gone from a child's game to what's recognized as the pinnacle of strategic thought. What's basically happening is computers are going from being like Spock to being a lot more like Kirk.

(Laughter)

Right? From pure logic to intuition.

Would you cross this bridge? Most of you are saying, "Oh, hell no!"

(Laughter)

And you arrived at that decision in a split second. You just sort of knew that bridge was unsafe.
And that's exactly the kind of intuition that our deep-learning systems are starting to develop right now. Very soon, you'll literally be able to show something you've made, something you've designed, to a computer, and it will look at it and say, "Sorry, homie, that'll never work. You have to try again." Or you could ask it if people are going to like your next song, or your next flavor of ice cream.

Or, much more importantly, you could work with a computer to solve a problem that we've never faced before. For instance, climate change. We're not doing a very good job on our own; we could certainly use all the help we can get. That's what I'm talking about: technology amplifying our cognitive abilities so we can imagine and design things that were simply out of our reach as plain old un-augmented humans.

So what about making all of this crazy new stuff that we're going to invent and design? I think the era of human augmentation is as much about the physical world as it is about the virtual, intellectual realm. How will technology augment us? In the physical world, robotic systems.

OK, there's certainly a fear that robots are going to take jobs away from humans, and that is true in certain sectors. But I'm much more interested in this idea that humans and robots working together are going to augment each other, and start to inhabit a new space. This is our applied research lab in San Francisco, where one of our areas of focus is advanced robotics, specifically, human-robot collaboration.

And this is Bishop, one of our robots. As an experiment, we set it up to help a person working in construction with repetitive tasks -- tasks like cutting out holes for outlets or light switches in drywall.

(Laughter)

So, Bishop's human partner can tell it what to do in plain English and with simple gestures, kind of like talking to a dog, and then Bishop executes on those instructions with perfect precision. We're using the human for what the human is good at: awareness, perception and decision making. And we're using the robot for what it's good at: precision and repetitiveness.
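Here's a rough sketch of that division of labor in code. The interface and hole sizes are hypothetical, not Bishop's actual API; the point is just that the human supplies the judgment about where the cut goes, and the robot turns that rough intent into an exact, repeatable toolpath.

```python
# Toy sketch: human decides, robot executes precisely (hypothetical interface).
from dataclasses import dataclass

@dataclass
class CutRequest:
    x_mm: float   # roughly where the human pointed on the drywall
    y_mm: float
    kind: str     # "outlet" or "switch"

# Standard cutout sizes the robot can reproduce exactly, every single time.
HOLE_SIZES_MM = {"outlet": (75, 120), "switch": (75, 115)}

def plan_cut(req: CutRequest):
    """Turn the human's rough intent into an exact rectangular toolpath."""
    w, h = HOLE_SIZES_MM[req.kind]
    x0, y0 = req.x_mm - w / 2, req.y_mm - h / 2
    # Corner-to-corner path the robot follows with sub-millimeter repeatability.
    return [(x0, y0), (x0 + w, y0), (x0 + w, y0 + h), (x0, y0 + h), (x0, y0)]

# Human: "cut an outlet about here" (a word and a gesture).
# Robot: a precise toolpath, identical on the hundredth hole as on the first.
print(plan_cut(CutRequest(x_mm=410.0, y_mm=300.0, kind="outlet")))
```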
Here's another cool project that Bishop worked on. The goal of this project, which we called the HIVE, was to prototype the experience of humans, computers and robots all working together to solve a highly complex design problem.

The humans acted as labor. They cruised around the construction site, they manipulated the bamboo -- which, by the way, because it's a non-isomorphic material, is super hard for robots to deal with. But then the robots did this fiber winding, which was almost impossible for a human to do. And then we had an AI that was controlling everything. It was telling the humans what to do, telling the robots what to do and keeping track of thousands of individual components. What's interesting is, building this pavilion was simply not possible without human, robot and AI augmenting each other.

OK, I'll share one more project. This one's a little bit crazy. We're working with Amsterdam-based artist Joris Laarman and his team at MX3D to generatively design and robotically print the world's first autonomously manufactured bridge. So, Joris and an AI are designing this thing right now, as we speak, in Amsterdam. And when they're done, we're going to hit "Go," and robots will start 3D printing in stainless steel, and then they're going to keep printing, without human intervention, until the bridge is finished.

So, as computers are going to augment our ability to imagine and design new stuff, robotic systems are going to help us build and make things that we've never been able to make before.

But what about our ability to sense and control these things? What about a nervous system for the things that we make? Our nervous system, the human nervous system, tells us everything that's going on around us. But the nervous system of the things we make is rudimentary at best. I would say it's rather shitty, at this point.

(Laughter)

For instance, a car doesn't tell the city's public works department that it just hit a pothole at the corner of Broadway and Morrison. A building doesn't tell its designers whether or not the people inside like being there, and the toy manufacturer doesn't know if a toy is actually being played with -- how and where and whether or not it's any fun. Look, I'm sure that the designers imagined this lifestyle for Barbie when they designed her.
(Laughter)

But what if it turns out that Barbie's actually really lonely?

(Laughter)

If the designers had known what was really happening in the real world with their designs -- the road, the building, Barbie -- they could've used that knowledge to create an experience that was better for the user. What's missing is a nervous system connecting us to all of the things that we design, make and use. What if all of you had that kind of information flowing to you from the things you create in the real world?

With all of the stuff we make, we spend a tremendous amount of money and energy -- in fact, last year, about two trillion dollars -- convincing people to buy the things we've made. But if you had this connection to the things that you design and create after they're out in the real world, after they've been sold or launched or whatever, we could actually change that, and go from making people want our stuff to just making stuff that people want in the first place.

The good news is, we're working on digital nervous systems that connect us to the things we design. We're working on one project with a couple of guys down in Los Angeles called the Bandito Brothers and their team. And one of the things these guys do is build insane cars that do absolutely insane things. These guys are crazy --

(Laughter)

in the best way.

And what we're doing with them is taking a traditional race-car chassis and giving it a nervous system. So we instrumented it with dozens of sensors, put a world-class driver behind the wheel, took it out to the desert and drove the hell out of it for a week. And the car's nervous system captured everything that was happening to the car. We captured four billion data points, all of the forces that it was subjected to. And then we did something crazy. We took all of that data, and plugged it into a generative-design AI we call "Dreamcatcher."
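What does it mean to plug telemetry into a design tool? Here's a toy sketch, with invented data and field names rather than the real Dreamcatcher pipeline: the nervous system's raw force readings get boiled down into worst-case load cases, and those load cases become the constraints the new chassis has to satisfy.

```python
# Toy sketch: telemetry from the instrumented car becomes design constraints.
# Invented data and field names -- illustrative only.
import random

# Pretend telemetry stream: (mounting point, force in newtons) samples.
mounts = ["front_left", "front_right", "rear_left", "rear_right"]
telemetry = [(random.choice(mounts), random.gauss(2000, 800)) for _ in range(10_000)]

# Aggregate the stream into worst-case loads per mounting point.
peak_loads = {}
for mount, force in telemetry:
    peak_loads[mount] = max(peak_loads.get(mount, 0.0), abs(force))

# The peaks, padded with a safety factor, become the constraints the
# generative designer must satisfy while chasing the goal of minimum weight.
design_spec = {
    "goal": "minimize_weight",
    "constraints": {m: {"must_withstand_N": round(1.5 * p)} for m, p in peak_loads.items()},
}
print(design_spec)
```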
So what do you get when you give a design tool a nervous system, and you ask it to build you the ultimate car chassis? You get this. This is something that a human could never have designed. Except a human did design this, but it was a human that was augmented by a generative-design AI, a digital nervous system and robots that can actually fabricate something like this.

So if this is the future, the Augmented Age, and we're going to be augmented cognitively, physically and perceptually, what will that look like? What is this wonderland going to be like? I think we're going to see a world where we're moving from things that are fabricated to things that are farmed. Where we're moving from things that are constructed to that which is grown. We're going to move from being isolated to being connected. And we'll move away from extraction to embrace aggregation. I also think we'll shift from craving obedience from our things to valuing autonomy.

Thanks to our augmented capabilities, our world is going to change dramatically. I think a good analogy is the incredible microcosm of a coral reef. We're going to have a world with more variety, more connectedness, more dynamism, more complexity, more adaptability and, of course, more beauty. The shape of things to come will be unlike anything we've ever seen before. Why? Because what will be shaping those things is this new partnership between technology, nature and humanity.

That, to me, is a future well worth looking forward to. Thank you all so much.

(Applause)