The future of flying robots | Vijay Kumar | TEDxPenn
-
0:35 - 0:38In my lab, we build
autonomous aerial robots -
0:38 - 0:40like the one you see flying here.
-
0:43 - 0:46Unlike the commercially available drones
that you can buy today, -
0:47 - 0:50this robot doesn't have any GPS on board.
-
0:51 - 0:52So without GPS,
-
0:52 - 0:55it's hard for robots like this
to determine their position. -
0:57 - 1:01This robot uses onboard sensors,
cameras and laser scanners, -
1:01 - 1:04to scan the environment.
-
1:04 - 1:07It detects features from the environment,
-
1:07 - 1:10and it determines where it is
relative to those features, -
1:10 - 1:12using a method of triangulation.
-
1:14 - 1:17And then it can assemble
all these features into a map, -
1:17 - 1:19like you see behind me.
-
1:20 - 1:24And this map then allows the robot
to understand where the obstacles are -
1:24 - 1:26and navigate in a collision-free manner.
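[Editor's note: the position-from-features step described above can be sketched with a toy example. This is an illustration, not the lab's actual pipeline: it assumes the robot already knows the map positions of a few detected landmarks and has ideal range measurements to them, then solves for its own 2-D position by least squares.]

```python
# Illustrative sketch only: estimate a robot's 2-D position from ranges
# to landmarks whose map positions are known (a stand-in for the
# "determine where it is relative to those features" step above).
import numpy as np

def locate(landmarks, ranges, iters=20):
    """Gauss-Newton fit of the robot position to measured landmark ranges."""
    p = landmarks.mean(axis=0)               # start from the landmark centroid
    for _ in range(iters):
        diffs = p - landmarks                # vectors from each landmark to guess
        dists = np.linalg.norm(diffs, axis=1)
        J = diffs / dists[:, None]           # Jacobian of dists w.r.t. p
        r = dists - ranges                   # range residuals
        p = p - np.linalg.solve(J.T @ J, J.T @ r)  # Gauss-Newton update
    return p

landmarks = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])  # assumed map features
true_pos = np.array([1.0, 1.0])
ranges = np.linalg.norm(landmarks - true_pos, axis=1)        # ideal measurements
print(locate(landmarks, ranges))             # recovers roughly [1.0, 1.0]
```

[A real system fuses many noisy measurements and estimates orientation too; this only shows why a few non-collinear features pin down a position.]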
-
1:28 - 1:29What I want to show you next
-
1:30 - 1:33is a set of experiments
we did inside our laboratory, -
1:33 - 1:36where this robot was able
to go for longer distances. -
1:37 - 1:42So here you'll see, on the top right,
what the robot sees with the camera. -
1:42 - 1:44And on the main screen...
-
1:44 - 1:46And of course this is sped up
by a factor of four... -
1:46 - 1:49On the main screen you'll see
the map that it's building. -
1:49 - 1:53So this is a high-resolution map
of the corridor around our laboratory. -
1:53 - 1:55And in a minute
you'll see it enter our lab, -
1:55 - 1:58which is recognizable
by the clutter that you see. -
1:58 - 1:59(Laughter)
-
1:59 - 2:01But the main point I want to convey to you
-
2:01 - 2:05is that these robots are capable
of building high-resolution maps -
2:05 - 2:07at five centimeters resolution,
-
2:07 - 2:12allowing somebody who is outside the lab,
or outside the building -
2:12 - 2:15to deploy these
without actually going inside, -
2:15 - 2:18and trying to infer
what happens inside the building. -
2:20 - 2:22Now there are a couple of problems
with robots like this. -
2:23 - 2:25The first problem is it's pretty big.
-
2:25 - 2:27Because it's big, it's heavy.
-
2:28 - 2:31And these robots consume
about 100 watts per pound. -
2:32 - 2:34And this makes for
a very short mission life. -
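[Editor's note: the 100-watts-per-pound figure translates directly into flight time. A back-of-the-envelope sketch; the robot weight and battery capacity here are assumed round numbers, not figures from the talk.]

```python
# Rough mission-life estimate from the ~100 W/lb figure quoted above.
# weight_lb and battery_wh below are illustrative assumptions.
def mission_minutes(weight_lb, battery_wh, watts_per_lb=100.0):
    power_w = weight_lb * watts_per_lb   # total power draw in watts
    return battery_wh / power_w * 60.0   # watt-hours / watts -> hours -> minutes

print(mission_minutes(4.0, 50.0))  # a 4 lb robot on a 50 Wh battery: 7.5 min
```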
2:35 - 2:37The second problem
-
2:37 - 2:41is that these robots have onboard sensors
that end up being very expensive... -
2:41 - 2:44A laser scanner, a camera
and the processors. -
2:45 - 2:48That drives up the cost of this robot.
-
2:49 - 2:51So we asked ourselves a question:
-
2:51 - 2:55what consumer product
can you buy in an electronics store -
2:55 - 3:01that is inexpensive, that's lightweight,
that has sensing onboard and computation? -
3:03 - 3:06And we invented the flying phone.
-
3:06 - 3:08(Laughter)
-
3:08 - 3:14So this robot uses a Samsung Galaxy
smartphone that you can buy off the shelf, -
3:14 - 3:18and all you need is an app that you
can download from our app store. -
3:18 - 3:22And you can see this robot
reading the letters, "TED" in this case, -
3:22 - 3:25looking at the corners
of the "T" and the "E" -
3:25 - 3:29and then triangulating off of that,
flying autonomously. -
3:30 - 3:33That joystick is just there
to make sure if the robot goes crazy, -
3:33 - 3:35Giuseppe can kill it.
-
3:35 - 3:36(Laughter)
-
3:38 - 3:41In addition to building
these small robots, -
3:42 - 3:46we also experiment with aggressive
behaviors, like you see here. -
3:47 - 3:51So this robot is now traveling
at two to three meters per second, -
3:52 - 3:55pitching and rolling aggressively
as it changes direction. -
3:56 - 4:00The main point is we can have
smaller robots that can go faster -
4:00 - 4:03and then travel in these
very unstructured environments. -
4:04 - 4:06And in this next video,
-
4:06 - 4:12just like you see this bird, an eagle,
gracefully coordinating its wings, -
4:13 - 4:16its eyes and feet
to grab prey out of the water, -
4:17 - 4:19our robot can go fishing, too.
-
4:19 - 4:20(Laughter)
-
4:20 - 4:24In this case, this is a Philly cheesesteak
hoagie that it's grabbing out of thin air. -
4:24 - 4:27(Laughter)
-
4:27 - 4:31So you can see this robot
going at about three meters per second, -
4:31 - 4:35which is faster than walking speed,
coordinating its arms, its claws -
4:36 - 4:40and its flight with split-second timing
to achieve this maneuver. -
4:41 - 4:43In another experiment,
-
4:43 - 4:46I want to show you
how the robot adapts its flight -
4:46 - 4:49to control its suspended payload,
-
4:49 - 4:52whose length is actually larger
than the width of the window. -
4:53 - 4:55So in order to accomplish this,
-
4:55 - 4:58it actually has to pitch
and adjust the altitude -
4:58 - 5:01and swing the payload through.
-
5:06 - 5:08But of course we want
to make these even smaller, -
5:08 - 5:11and we're inspired
in particular by honeybees. -
5:11 - 5:14So if you look at honeybees,
and this is a slowed down video, -
5:15 - 5:19they're so small,
their inertia is so low... -
5:19 - 5:21(Laughter)
-
5:21 - 5:22that they don't care...
-
5:22 - 5:24They bounce off my hand, for example.
-
5:24 - 5:27This is a little robot
that mimics the honeybee behavior. -
5:28 - 5:29And smaller is better,
-
5:29 - 5:33because along with the small size
you get lower inertia. -
5:33 - 5:34Along with lower inertia...
-
5:34 - 5:37(Robot buzzing, laughter)
-
5:37 - 5:40along with lower inertia,
you're resistant to collisions. -
5:40 - 5:42And that makes you more robust.
-
5:43 - 5:46So just like these honeybees,
we build small robots. -
5:46 - 5:49And this particular one
is only 25 grams in weight. -
5:49 - 5:51It consumes only six watts of power.
-
5:52 - 5:54And it can travel
up to six meters per second. -
5:54 - 5:57So if I normalize that to its size,
-
5:57 - 6:00it's like a Boeing 787 traveling
ten times the speed of sound. -
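[Editor's note: the Boeing comparison is a body-lengths-per-second normalization. A quick check, taking the robot's size (~10 cm) and the 787's length (~57 m) as assumed round numbers:]

```python
# Normalizing speed by vehicle size, as in the 787 analogy above.
# The 0.1 m robot size and 57 m aircraft length are rough assumptions.
def body_lengths_per_second(speed_ms, length_m):
    return speed_ms / length_m

robot_bls = body_lengths_per_second(6.0, 0.1)  # ~60 body lengths per second
equiv_speed = robot_bls * 57.0                 # a 787 at that rate, in m/s
print(robot_bls, equiv_speed)                  # ~3420 m/s, roughly Mach 10
```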
6:03 - 6:05(Laughter)
-
6:05 - 6:07And I want to show you an example.
-
6:08 - 6:13This is probably the first planned mid-air
collision, at one-twentieth normal speed. -
6:13 - 6:16These are going at a relative speed
of two meters per second, -
6:16 - 6:19and this illustrates the basic principle.
-
6:19 - 6:24The two-gram carbon fiber cage around it
prevents the propellers from entangling, -
6:24 - 6:29but essentially the collision is absorbed
and the robot responds to the collisions. -
6:30 - 6:33And so small also means safe.
-
6:33 - 6:35In my lab, as we developed these robots,
-
6:35 - 6:37we start off with these big robots
-
6:37 - 6:39and then now we're down
to these small robots. -
6:39 - 6:43And if you plot a histogram
of the number of Band-Aids we've ordered -
6:43 - 6:45in the past, it has sort of tailed off now.
-
6:46 - 6:47Because these robots are really safe.
-
6:49 - 6:51The small size has some disadvantages,
-
6:51 - 6:56and nature has found a number of ways
to compensate for these disadvantages. -
6:56 - 7:00The basic idea is they aggregate
to form large groups, or swarms. -
7:01 - 7:05So, similarly, in our lab,
we try to create artificial robot swarms. -
7:05 - 7:06And this is quite challenging
-
7:06 - 7:10because now you have to think
about networks of robots. -
7:10 - 7:12And within each robot,
-
7:12 - 7:17you have to think about the interplay
of sensing, communication, computation... -
7:17 - 7:22And this network then becomes
quite difficult to control and manage. -
7:23 - 7:26So from nature we take away
three organizing principles -
7:27 - 7:30that essentially allow us
to develop our algorithms. -
7:31 - 7:35The first idea is that robots
need to be aware of their neighbors. -
7:35 - 7:39They need to be able to sense
and communicate with their neighbors. -
7:40 - 7:42So this video illustrates the basic idea.
-
7:42 - 7:44You have four robots...
-
7:44 - 7:47One of the robots has actually been
hijacked by a human operator, literally. -
7:49 - 7:51But because the robots
interact with each other, -
7:51 - 7:53they sense their neighbors,
-
7:53 - 7:54they essentially follow.
-
7:54 - 7:59And here there's a single person
able to lead this network of followers. -
8:02 - 8:07So again, it's not because all the robots
know where they're supposed to go. -
8:07 - 8:11It's because they're just reacting
to the positions of their neighbors. -
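[Editor's note: the behavior in that video can be mimicked in a few lines of simulation. This is a simplified sketch, not the lab's controller: followers move toward the average position of an assumed fixed set of neighbors, while one robot is "hijacked" toward an operator target.]

```python
# Toy leader-follower swarm: each follower reacts only to its neighbors'
# positions; robot 0 is steered by the (hypothetical) human operator.
import numpy as np

def step(pos, neighbors, leader_target, gain=0.5):
    new = pos.copy()
    new[0] = pos[0] + gain * (leader_target - pos[0])      # hijacked leader
    for i, nbrs in neighbors.items():                      # anonymous followers
        new[i] = pos[i] + gain * (pos[nbrs].mean(axis=0) - pos[i])
    return new

pos = np.zeros((4, 2))                   # four robots at the origin
neighbors = {1: [0], 2: [1], 3: [2]}     # an assumed chain of followers
target = np.array([5.0, 0.0])            # where the operator drags robot 0
for _ in range(100):
    pos = step(pos, neighbors, target)
print(pos.round(2))                      # every robot converges near [5, 0]
```

[No robot except the leader knows the goal; the whole network still ends up there, which is the point made above.]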
8:13 - 8:16(Laughter)
-
8:18 - 8:22So the next experiment illustrates
the second organizing principle. -
8:25 - 8:28And this principle has to do
with the principle of anonymity. -
8:29 - 8:32Here the key idea is that
-
8:33 - 8:37the robots are agnostic
to the identities of their neighbors. -
8:38 - 8:41They're asked to form a circular shape,
-
8:41 - 8:44and no matter how many robots
you introduce into the formation, -
8:44 - 8:47or how many robots you pull out,
-
8:47 - 8:50each robot is simply
reacting to its neighbor. -
8:50 - 8:54It's aware of the fact that it needs
to form the circular shape, -
8:55 - 8:57but collaborating with its neighbors
-
8:57 - 9:00it forms the shape
without central coordination. -
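[Editor's note: the anonymity principle can be sketched the same way: every robot runs the identical rule with no assigned slot, so adding or removing robots changes nothing. This toy version, an assumption rather than the actual algorithm, only handles attraction to the circle, not spacing along it.]

```python
# Anonymous shape formation: each robot moves toward the nearest point
# on a target circle; no robot has an identity or an assigned position.
import numpy as np

def toward_circle(pos, radius=2.0, gain=0.3):
    r = np.linalg.norm(pos, axis=1, keepdims=True)  # distance from center
    goal = pos / r * radius                         # nearest point on circle
    return pos + gain * (goal - pos)

rng = np.random.default_rng(0)
pos = rng.normal(size=(8, 2))        # eight robots scattered at random
for _ in range(50):
    pos = toward_circle(pos)         # same rule, for any number of robots
print(np.linalg.norm(pos, axis=1).round(3))  # all radii settle near 2.0
```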
9:02 - 9:04Now if you put these ideas together,
-
9:04 - 9:08the third idea is that we
essentially give these robots -
9:08 - 9:12mathematical descriptions
of the shape they need to execute. -
9:13 - 9:16And these shapes can be varying
as a function of time, -
9:16 - 9:21and you'll see these robots
start from a circular formation, -
9:21 - 9:24change into a rectangular formation,
stretch into a straight line, -
9:24 - 9:25back into an ellipse.
-
9:25 - 9:29And they do this with the same
kind of split-second coordination -
9:29 - 9:32that you see in natural swarms, in nature.
-
9:34 - 9:36So why work with swarms?
-
9:36 - 9:40Let me tell you about two applications
that we are very interested in. -
9:40 - 9:43The first one has to do with agriculture,
-
9:43 - 9:46which is probably the biggest problem
that we're facing worldwide. -
9:47 - 9:48As you well know,
-
9:48 - 9:52one in every seven people
on this Earth is malnourished. -
9:52 - 9:55Most of the land that we can cultivate
has already been cultivated. -
9:56 - 10:00And the efficiency of most systems
in the world is improving, -
10:00 - 10:03but our production system
efficiency is actually declining. -
10:03 - 10:07And that's mostly because of water
shortage, crop diseases, climate change -
10:07 - 10:09and a couple of other things.
-
10:09 - 10:11So what can robots do?
-
10:11 - 10:16Well, we adopt an approach that's
called Precision Farming in the community. -
10:16 - 10:21And the basic idea is that we fly
aerial robots through orchards, -
10:21 - 10:24and then we build
precision models of individual plants. -
10:25 - 10:27So just like personalized medicine,
-
10:27 - 10:31while you might imagine wanting
to treat every patient individually, -
10:31 - 10:34what we'd like to do is build
models of individual plants -
10:35 - 10:39and then tell the farmer
what kind of inputs every plant needs... -
10:40 - 10:44The inputs in this case being water,
fertilizer and pesticide. -
10:46 - 10:49Here you'll see robots
traveling through an apple orchard, -
10:49 - 10:52and in a minute you'll see
two of its companions -
10:52 - 10:53doing the same thing on the left side.
-
10:54 - 10:57And what they're doing is essentially
building a map of the orchard. -
10:58 - 11:02Within the map is a map
of every plant in this orchard. -
11:02 - 11:04(Robot buzzing)
-
11:04 - 11:06Let's see what those maps look like.
-
11:06 - 11:11In the next video, you'll see the cameras
that are being used on this robot. -
11:11 - 11:14On the top-left is essentially
a standard color camera. -
11:15 - 11:18On the left-center is an infrared camera.
-
11:18 - 11:22And on the bottom-left
is a thermal camera. -
11:22 - 11:25And on the main panel, you're seeing
a three-dimensional reconstruction -
11:25 - 11:30of every tree in the orchard
as the sensors fly right past the trees. -
11:33 - 11:37Armed with information like this,
we can do several things. -
11:37 - 11:42The first and possibly the most important
thing we can do is very simple: -
11:42 - 11:44count the number of fruits on every tree.
-
11:45 - 11:49By doing this, you tell the farmer
how many [fruits] she has in every tree -
11:50 - 11:53and allow her to estimate
the yield in the orchard, -
11:54 - 11:57optimizing the production
chain downstream. -
11:58 - 12:00The second thing we can do
-
12:00 - 12:04is take models of plants, construct
three-dimensional reconstructions, -
12:04 - 12:07and from that estimate the canopy size,
-
12:07 - 12:10and then correlate the canopy size
to the amount of leaf area on every plant. -
12:10 - 12:13And this is called the leaf area index.
-
12:13 - 12:15So if you know this leaf area index,
-
12:15 - 12:20you essentially have a measure of how much
photosynthesis is possible in every plant, -
12:20 - 12:23which again tells you
how healthy each plant is. -
12:26 - 12:30By combining visual
and infrared information, -
12:30 - 12:33we can also compute indices such as NDVI.
-
12:34 - 12:37And in this particular case,
you can essentially see -
12:37 - 12:40there are some crops that are
not doing as well as other crops. -
12:40 - 12:43This is easily discernible from imagery,
-
12:44 - 12:47not just visual imagery but combining
-
12:47 - 12:49both visual imagery and infrared imagery.
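[Editor's note: NDVI, mentioned above, is a standard vegetation index. Healthy plants reflect strongly in the near-infrared and absorb red light, so NDVI = (NIR − Red) / (NIR + Red) rises with plant vigor. The reflectance values below are made-up samples, not data from the orchard flights.]

```python
# NDVI from red and near-infrared reflectance, as used above to flag
# crops "not doing as well as other crops". Sample values are assumed.
import numpy as np

def ndvi(nir, red):
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

healthy = float(ndvi(0.50, 0.08))   # vigorous canopy: high NIR, low red
stressed = float(ndvi(0.30, 0.20))  # struggling crop: the gap narrows
print(round(healthy, 2), round(stressed, 2))  # 0.72 0.2
```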
-
12:50 - 12:52And then lastly,
-
12:52 - 12:56one thing we're interested in doing is
detecting the early onset of chlorosis... -
12:56 - 12:57And this is an orange tree...
-
12:58 - 13:00Which is essentially seen
by yellowing of leaves. -
13:00 - 13:04But robots flying overhead
can easily spot this autonomously -
13:04 - 13:07and then report to the farmer
that he or she has a problem -
13:07 - 13:08in this section of the orchard.
-
13:10 - 13:13Systems like this can really help,
-
13:13 - 13:19and we're projecting yields
that can improve by about ten percent -
13:19 - 13:22and, more importantly, decrease
the amount of inputs such as water -
13:22 - 13:25by 25 percent by using
aerial robot swarms. -
13:26 - 13:29A second application area
is in first response. -
13:29 - 13:31This is a picture of the Penn campus,
-
13:31 - 13:34actually south of the Penn campus,
the South Bank. -
13:34 - 13:36I want you to imagine a setting
-
13:36 - 13:39where there might be an emergency call
from a building, -
13:39 - 13:40a 911 call.
-
13:41 - 13:45It might take the police
a valuable five to ten minutes to get there. -
13:45 - 13:47But imagine now, robots respond.
-
13:47 - 13:49And you have a whole swarm of them,
-
13:49 - 13:53flying to the scene autonomously,
just triggered by a 911 call -
13:53 - 13:54or by a dispatcher.
-
13:55 - 13:57By the way, if someone is here
from the FAA, -
13:57 - 14:00this was actually shot in South America.
-
14:00 - 14:01(Laughter)
-
14:01 - 14:04So, robots arrive at the scene,
-
14:05 - 14:08and they're all equipped
with downward facing cameras, -
14:08 - 14:10and they can monitor the scene.
-
14:10 - 14:12And they do this autonomously,
-
14:12 - 14:16so by the time a human first responder
or a police officer gets there, -
14:17 - 14:19they have access
to all kinds of information. -
14:20 - 14:23So on the top left,
you see the central screen -
14:23 - 14:24that a dispatcher might see,
-
14:24 - 14:27which is telling her
where the robots are flying -
14:27 - 14:30and how they're encircling the building.
-
14:30 - 14:33And the robots are autonomously deciding
which ingress points -
14:33 - 14:35should be assigned to what robot.
-
14:36 - 14:40On the top right, you essentially see
images from the robots -
14:40 - 14:42being assembled into a mosaic.
-
14:43 - 14:46Which again, gives the first responder
some idea -
14:46 - 14:49of what activity is going on at the scene.
-
14:49 - 14:52And on the bottom, you can see
a three-dimensional reconstruction -
14:52 - 14:54that we developed on the fly.
-
14:55 - 14:57In addition to working outside buildings,
-
14:57 - 15:00we're also interested
in going inside buildings, -
15:00 - 15:03and I want to show you
an experiment we did three years ago -
15:03 - 15:07where our aerial robot -
one exactly like this one - -
15:08 - 15:10is collaborating with a ground robot,
-
15:10 - 15:13in this case it's actually hitching a ride
with a ground robot, -
15:13 - 15:16because it's programmed to be lazy,
to save power. -
15:16 - 15:19So, as it goes up,
the two travel in tandem, -
15:20 - 15:23and this is a collapsed building
after an earthquake, -
15:23 - 15:26this is shortly after the 2011 earthquake
in Fukushima. -
15:27 - 15:30When the robots get stuck
in front of a collapsed doorway, -
15:30 - 15:33our robot takes off
and is able to fly over the obstacles -
15:34 - 15:35around the obstacles,
-
15:35 - 15:39and generate a three-dimensional map,
in this case of a bookcase. -
15:39 - 15:41And it's able to see
what's on the other side. -
15:41 - 15:46Something fairly simple,
but it's hard to do with ground robots, -
15:46 - 15:49and certainly hard to do with humans
when there's potential for harm. -
15:50 - 15:54So these two robots
are collaboratively building these maps, -
15:55 - 15:57and, again,
when the first responders come, -
15:57 - 15:59they can be equipped with these maps.
-
16:00 - 16:03So let me show you
what some of these maps look like. -
16:03 - 16:05So this is a three-story building,
-
16:05 - 16:08the seventh, eighth and what remains
of the ninth floor on top, -
16:08 - 16:11and we were able to construct this map
using this team of two robots, -
16:11 - 16:13operating in tandem, autonomously.
-
16:14 - 16:18However, this experiment took us
two and a half hours to complete. -
16:19 - 16:21Now, no first responder
is going to give you -
16:21 - 16:23two and a half hours
to do this experiment, -
16:23 - 16:25before he or she wants to rush in.
-
16:25 - 16:27They might give you
two and a half minutes, -
16:27 - 16:30you'll be lucky if you get
two and a half seconds. -
16:30 - 16:32But now that's where robot swarms
come in. -
16:32 - 16:35Imagine if you could send in
a hundred of these robots, -
16:35 - 16:37like the little ones
that we were just flying, -
16:37 - 16:40and imagine they went in
and generated maps like this -
16:40 - 16:43well before humans actually arrived
on the scene. -
16:43 - 16:46And that's the vision
we're working towards. -
16:48 - 16:49So, let me conclude
-
16:50 - 16:53with a movie -
a Warner Brothers movie - -
16:53 - 16:56coming up next
in your theater: -
16:57 - 16:59The Swarm!
The Swarm is coming! -
16:59 - 17:02And I love this poster.
Actually, if you've seen the movie, -
17:02 - 17:04you're probably dating yourself.
-
17:05 - 17:08If you have not seen the movie,
I encourage you not to see it; -
17:08 - 17:09it's a terrible movie.
-
17:09 - 17:10(Laughter)
-
17:10 - 17:13It's about killer bees, attacking men
and killing them and so on. -
17:13 - 17:16But everything about this poster is true,
which is why I like it. -
17:16 - 17:19"Its size is immeasurable" -
I hope I've convinced you -
17:19 - 17:21that "its power is limitless",
-
17:21 - 17:23and even this last bit is true,
-
17:23 - 17:26"its enemy is man",
the technology is here today -
17:26 - 17:28and it's people like us that are standing
-
17:28 - 17:31between this technology
and its applications. -
17:31 - 17:34The swarm is coming;
this is not science fiction. -
17:34 - 17:36In fact, this is what lies ahead.
-
17:37 - 17:42Lastly, I want you to applaud
the people who actually create the future, -
17:43 - 17:47Yash Mulgaonkar, Sikang Liu
and Giuseppe Loianno, -
17:47 - 17:50who are responsible for the three
demonstrations that you saw. -
17:51 - 17:52Thank you.
-
17:52 - 17:55(Applause)
At his lab at the University of Pennsylvania, Vijay Kumar and his team have created autonomous aerial robots inspired by honeybees. Their latest breakthrough: Precision Farming, in which swarms of robots map, reconstruct and analyze every plant and piece of fruit in an orchard, providing vital information to farmers that can help improve yields and make water management smarter.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx