The future of flying robots
In my lab, we build autonomous aerial robots like the one you see flying here. Unlike the commercially available drones that you can buy today, this robot doesn't have any GPS on board. So without GPS, it's hard for robots like this to determine their position. This robot uses onboard sensors, cameras and laser scanners, to scan the environment. It detects features from the environment, and it determines where it is relative to those features, using a method of triangulation. And then it can assemble all these features into a map, like you see behind me. And this map then allows the robot to understand where the obstacles are and navigate in a collision-free manner.
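The idea of fixing a position relative to detected features can be illustrated with a toy example. The sketch below is not the lab's actual pipeline (which fuses many features from cameras and laser scanners); it assumes just two features at known positions, with the second landmark lying along the x-axis from the first, and recovers the robot's 2D position from range measurements to each.

```python
import math

# Toy stand-in for feature-based localization: recover the robot's
# position from the intersection of two range circles around known
# landmarks. Assumes l2 lies along the +x axis from l1, and the robot
# is on the upper side of the line joining them.
def locate(l1, l2, r1, r2):
    d = math.dist(l1, l2)                      # distance between landmarks
    x = (r1**2 - r2**2 + d**2) / (2 * d)       # offset along the baseline
    y = math.sqrt(max(r1**2 - x**2, 0.0))      # offset perpendicular to it
    return (l1[0] + x, l1[1] + y)

# Robot at (1, 2), landmarks at (0, 0) and (4, 0):
print(locate((0, 0), (4, 0), math.sqrt(5), math.sqrt(13)))  # (1.0, 2.0)
```

A real system would use many features and a least-squares or filtering formulation, but the geometric principle is the same.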
What I want to show you next is a set of experiments we did inside our laboratory, where this robot was able to go for longer distances. So here you'll see, on the top right, what the robot sees with the camera. And on the main screen -- and of course this is sped up by a factor of four -- on the main screen you'll see the map that it's building. So this is a high-resolution map of the corridor around our laboratory. And in a minute you'll see it enter our lab, which is recognizable by the clutter that you see.

(Laughter)
But the main point I want to convey to you is that these robots are capable of building high-resolution maps at five-centimeter resolution, allowing somebody who is outside the lab, or outside the building, to deploy these without actually going inside, and to infer what happens inside the building.
Now there are a couple of problems with robots like this. The first problem is that it's pretty big. Because it's big, it's heavy. And these robots consume about 100 watts per pound, which makes for a very short mission life. The second problem is that these robots have onboard sensors that end up being very expensive -- a laser scanner, a camera and the processors. That drives up the cost of this robot.
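To see why 100 watts per pound translates into a short mission life, here is a back-of-the-envelope estimate. The platform mass and battery capacity below are assumed numbers for illustration, not figures from the talk; only the 100 W/lb figure is quoted.

```python
# Rough flight-time estimate using the talk's 100 W/lb figure.
# Platform mass and battery capacity are assumptions, not stated values.
robot_mass_lb = 4.4              # assumed: a ~2 kg platform
power_w = 100 * robot_mass_lb    # 440 W draw at 100 W per pound

battery_wh = 75.0                # assumed: 0.5 kg LiPo at ~150 Wh/kg
flight_time_min = battery_wh / power_w * 60
print(round(flight_time_min, 1))  # roughly 10 minutes
```

Even with generous assumptions, the hover power scales with weight, so a bigger robot buys itself only minutes of flight.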
So we asked ourselves a question: what consumer product can you buy in an electronics store that is inexpensive, that's lightweight, that has sensing onboard and computation? And we invented the flying phone.

(Laughter)

So this robot uses a Samsung Galaxy smartphone that you can buy off the shelf, and all you need is an app that you can download from our app store. And you can see this robot reading the letters, "TED" in this case, looking at the corners of the "T" and the "E" and then triangulating off of that, flying autonomously. That joystick is just there to make sure if the robot goes crazy, Giuseppe can kill it.

(Laughter)
In addition to building these small robots, we also experiment with aggressive behaviors, like you see here. So this robot is now traveling at two to three meters per second, pitching and rolling aggressively as it changes direction. The main point is we can have smaller robots that can go faster and then travel in these very unstructured environments. And in this next video, just like you see this bird, an eagle, gracefully coordinating its wings, its eyes and feet to grab prey out of the water, our robot can go fishing, too.

(Laughter)

In this case, this is a Philly cheesesteak hoagie that it's grabbing out of thin air.

(Laughter)

So you can see this robot going at about three meters per second, which is faster than walking speed, coordinating its arms, its claws and its flight with split-second timing to achieve this maneuver.
In another experiment, I want to show you how the robot adapts its flight to control its suspended payload, whose length is actually larger than the width of the window. So in order to accomplish this, it actually has to pitch and adjust the altitude and swing the payload through.
But of course we want to make these even smaller, and we're inspired in particular by honeybees. So if you look at honeybees, and this is a slowed-down video, they're so small, their inertia is so low --

(Laughter)

that they don't care -- they bounce off my hand, for example. This is a little robot that mimics the honeybee behavior. And smaller is better, because along with the small size you get lower inertia. Along with lower inertia --

(Robot buzzing, laughter)

along with lower inertia, you're resistant to collisions. And that makes you more robust. So just like these honeybees, we build small robots. And this particular one is only 25 grams in weight. It consumes only six watts of power. And it can travel up to six meters per second. So if I normalize that to its size, it's like a Boeing 787 traveling ten times the speed of sound.

(Laughter)
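The Boeing comparison checks out if you normalize speed by body length. The robot's size below is an assumption (roughly ten centimeters across, which is plausible for a 25-gram vehicle); the 787 length and the speed of sound are standard approximate figures.

```python
# Sanity-check "a Boeing 787 at ten times the speed of sound".
robot_speed = 6.0       # m/s, from the talk
robot_length = 0.1      # m, assumed size of the 25 g robot
body_lengths_per_s = robot_speed / robot_length  # 60 body lengths per second

boeing_length = 57.0    # m, approximate length of a 787-8
speed_of_sound = 343.0  # m/s at sea level

equivalent_speed = body_lengths_per_s * boeing_length  # m/s at 787 scale
mach = equivalent_speed / speed_of_sound
print(round(mach, 1))  # about 10
```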
And I want to show you an example. This is probably the first planned mid-air collision, at one-twentieth normal speed. These are going at a relative speed of two meters per second, and this illustrates the basic principle. The two-gram carbon fiber cage around it prevents the propellers from entangling, but essentially the collision is absorbed and the robot responds to the collisions. And so small also means safe. In my lab, as we developed these robots, we started off with these big robots and now we're down to these small robots. And if you plot a histogram of the number of Band-Aids we've ordered in the past, that has sort of tailed off now, because these robots are really safe.
The small size has some disadvantages, and nature has found a number of ways to compensate for these disadvantages. The basic idea is they aggregate to form large groups, or swarms. So, similarly, in our lab, we try to create artificial robot swarms. And this is quite challenging because now you have to think about networks of robots. And within each robot, you have to think about the interplay of sensing, communication, computation -- and this network then becomes quite difficult to control and manage. So from nature we take away three organizing principles that essentially allow us to develop our algorithms. The first idea is that robots need to be aware of their neighbors. They need to be able to sense and communicate with their neighbors.
So this video illustrates the basic idea. You have four robots -- one of the robots has actually been hijacked by a human operator, literally. But because the robots interact with each other, they sense their neighbors, they essentially follow. And here there's a single person able to lead this network of followers. So again, it's not because all the robots know where they're supposed to go. It's because they're just reacting to the positions of their neighbors.

(Laughter)
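This follow-your-neighbors behavior can be sketched with a simple consensus-style update. The gain, the neighbor graph, and the update rule below are illustrative assumptions, not the lab's actual controller: each robot steps toward the average position of its neighbors, while one "hijacked" robot is simply moved by the operator.

```python
# Minimal consensus-style sketch of neighbor-following (assumed rule).
def step(positions, neighbors, leader, leader_pos, gain=0.5):
    new = dict(positions)
    new[leader] = leader_pos  # operator overrides the hijacked robot
    for r, pos in positions.items():
        if r == leader:
            continue
        nbrs = neighbors[r]
        # Average position of this robot's neighbors (2D):
        avg = tuple(sum(positions[n][i] for n in nbrs) / len(nbrs)
                    for i in range(2))
        # Step a fraction of the way toward that average:
        new[r] = tuple(pos[i] + gain * (avg[i] - pos[i]) for i in range(2))
    return new

positions = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (0.0, 1.0), 3: (1.0, 1.0)}
neighbors = {1: [0, 3], 2: [0, 3], 3: [1, 2]}  # robot 0 is the hijacked leader
for _ in range(50):
    positions = step(positions, neighbors, leader=0, leader_pos=(5.0, 5.0))
# The followers end up trailing the leader toward (5, 5),
# even though only robots 1 and 2 can sense it directly.
```

Note that robot 3 never senses the leader at all; it reaches the goal purely by reacting to robots 1 and 2, which is the point of the principle.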
So the next experiment illustrates the second organizing principle, which is anonymity. Here the key idea is that the robots are agnostic to the identities of their neighbors. They're asked to form a circular shape, and no matter how many robots you introduce into the formation, or how many robots you pull out, each robot is simply reacting to its neighbor. It's aware of the fact that it needs to form the circular shape, but by collaborating with its neighbors it forms the shape without central coordination.
Now if you put these ideas together, the third idea is that we essentially give these robots mathematical descriptions of the shape they need to execute. And these shapes can vary as a function of time, and you'll see these robots start from a circular formation, change into a rectangular formation, stretch into a straight line, back into an ellipse. And they do this with the same kind of split-second coordination that you see in natural swarms, in nature.
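The second and third principles together can be sketched as anonymous slot assignment on a parametric shape: the formation is a mathematical description, and any robot can fill any slot. The greedy assignment and the time-varying ellipse below are illustrative assumptions, not the published algorithm.

```python
import math

# Anonymous formation sketch (assumed controller, for illustration).
def shape(theta, t):
    """A parametric ellipse whose aspect ratio changes with time t."""
    a, b = 2.0 + t, 1.0
    return (a * math.cos(theta), b * math.sin(theta))

def assign_slots(robots, slots):
    """Greedy anonymous assignment: each slot is taken by the closest
    still-unassigned robot, so identities never matter."""
    free = list(robots)
    assignment = []
    for s in slots:
        free.sort(key=lambda r: math.dist(r, s))
        assignment.append((free.pop(0), s))
    return assignment

robots = [(0.0, 0.0), (1.0, 1.0), (-1.0, 0.5), (0.5, -1.0)]
slots = [shape(2 * math.pi * k / 4, t=0.0) for k in range(4)]
for robot, target in assign_slots(robots, slots):
    print(robot, "->", target)
```

Because slots are recomputed from the shape description at each time, adding or removing a robot just changes how many slots are sampled, with no central coordinator.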
So why work with swarms? Let me tell you about two applications that we are very interested in. The first one has to do with agriculture, which is probably the biggest problem that we're facing worldwide. As you well know, one in every seven people on this earth is malnourished. Most of the land that we can cultivate has already been cultivated. And the efficiency of most systems in the world is improving, but our production system efficiency is actually declining. And that's mostly because of water shortage, crop diseases, climate change and a couple of other things.
So what can robots do? Well, we adopt an approach that's called Precision Farming in the community. And the basic idea is that we fly aerial robots through orchards, and then we build precision models of individual plants. So just like personalized medicine, where you might imagine wanting to treat every patient individually, what we'd like to do is build models of individual plants and then tell the farmer what kind of inputs every plant needs -- the inputs in this case being water, fertilizer and pesticide. Here you'll see robots traveling through an apple orchard, and in a minute you'll see two of its companions doing the same thing on the left side. And what they're doing is essentially building a map of the orchard. Within the map is a map of every plant in this orchard.
(Robot buzzing)

Let's see what those maps look like. In the next video, you'll see the cameras that are being used on this robot. On the top-left is essentially a standard color camera. On the left-center is an infrared camera. And on the bottom-left is a thermal camera. And on the main panel, you're seeing a three-dimensional reconstruction of every tree in the orchard as the sensors fly right past the trees.
Armed with information like this, we can do several things. The first and possibly the most important thing we can do is very simple: count the number of fruits on every tree. By doing this, you tell the farmer how many fruits she has on every tree and allow her to estimate the yield in the orchard, optimizing the production chain downstream. The second thing we can do is take models of plants, construct three-dimensional reconstructions, and from that estimate the canopy size, and then correlate the canopy size to the amount of leaf area on every plant. And this is called the leaf area index. So if you know this leaf area index, you essentially have a measure of how much photosynthesis is possible in every plant, which again tells you how healthy each plant is.
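The leaf area index mentioned here has a simple definition: one-sided leaf area per unit of ground area. The sample values in the sketch below are made up for illustration, not measurements from the talk.

```python
# Leaf area index (LAI): one-sided leaf area per unit ground area.
def leaf_area_index(leaf_area_m2, ground_area_m2):
    return leaf_area_m2 / ground_area_m2

# e.g. 12 m^2 of leaves over a 4 m^2 canopy footprint:
print(leaf_area_index(12.0, 4.0))  # 3.0
```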
By combining visual and infrared information, we can also compute indices such as NDVI, the normalized difference vegetation index. And in this particular case, you can essentially see there are some crops that are not doing as well as other crops. This is easily discernible from imagery, not just visual imagery but combining both visual imagery and infrared imagery.
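NDVI is computed per pixel from near-infrared and red reflectance: healthy vegetation reflects strongly in near-infrared and absorbs red, so high values indicate healthy plants. A minimal sketch (the reflectance values below are illustrative):

```python
# NDVI = (NIR - Red) / (NIR + Red), computed per pixel.
def ndvi(nir, red):
    """nir, red: reflectance values in [0, 1]."""
    denom = nir + red
    if denom == 0:
        return 0.0  # no signal in either band
    return (nir - red) / denom

print(ndvi(0.6, 0.1))   # high NDVI: vigorous vegetation
print(ndvi(0.3, 0.25))  # low NDVI: stressed vegetation or bare soil
```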
And then lastly, one thing we're interested in doing is detecting the early onset of chlorosis -- and this is an orange tree -- which is essentially seen by yellowing of leaves. But robots flying overhead can easily spot this autonomously and then report to the farmer that he or she has a problem in this section of the orchard.
Systems like this can really help, and we're projecting yields that can improve by about ten percent and, more importantly, decrease the amount of inputs such as water by 25 percent by using aerial robot swarms. Lastly, I want you to applaud the people who actually create the future: Yash Mulgaonkar, Sikang Liu and Giuseppe Loianno, who are responsible for the three demonstrations that you saw. Thank you.

(Applause)
Title: The future of flying robots
Speaker: Vijay Kumar
Description: At his lab at the University of Pennsylvania, Vijay Kumar and his team have created autonomous aerial robots inspired by honeybees. Their latest breakthrough: Precision Farming, in which swarms of robots map, reconstruct and analyze every plant and piece of fruit in an orchard, providing vital information to farmers that can help improve yields and make water management smarter.
Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 13:09
Brian Greene: A correction was made to this transcript on 1/15/16. At 10:25, the subtitle now reads: "On the top-left is essentially a standard color camera."