TED Global 2013 Found in Translation: Daniel Suarez
-
0:09 - 0:12Good afternoon, everybody.
I'm Doug Chilcott from TED. -
0:12 - 0:16Welcome to TED Global
here in the Open Translation lounge. -
0:16 - 0:19Today we have Daniel Suarez who just
left the stage moments ago -
0:19 - 0:23with a talk that was pretty alarming
and left a lot of people shivering, -
0:23 - 0:25and I have a lot of questions.
-
0:25 - 0:26So we'll get to you in a moment.
-
0:26 - 0:30Also joining us on stage
here are Irteza from Pakistan, -
0:30 - 0:32Hugo from France,
-
0:32 - 0:34Els from Belgium,
-
0:34 - 0:36and Krystian from Poland.
-
0:36 - 0:40And joining us via Skype,
at all hours of the night and day, -
0:40 - 0:43we have Lazaros from Greece,
-
0:43 - 0:45Anja from Slovenia,
-
0:45 - 0:48Alberto from Italy,
-
0:48 - 0:52and Yasushi, in Japan,
where it's actually midnight. -
0:52 - 0:54Thank you for joining us.
-
0:55 - 0:58I'm just actually
going to throw it out to you. -
0:58 - 1:00Krystian, you had a question
about the use of fiction. -
1:00 - 1:01- Yeah.
-
1:02 - 1:05I was wondering, because your talk
is a cautionary tale, -
1:05 - 1:09but your novel was probably
not meant as one. -
1:09 - 1:13Did this angle, that we have
to be careful about this, -
1:13 - 1:15grow out of your research,
-
1:15 - 1:19or was your novel also
meant as a caution? -
1:19 - 1:21- I really do write cautionary tales.
-
1:21 - 1:26I'll tell you, some people think
thriller writers like to be doomsayers. -
1:26 - 1:29But that's not it, actually.
I love technology. -
1:29 - 1:32I spent nearly 20 years
designing software. -
1:32 - 1:36I try to think of it as spotting
icebergs, as opposed to being a doomsayer. -
1:36 - 1:38That doesn't mean
you're not going to go anywhere, -
1:38 - 1:42it means you might have to turn right
or left occasionally to avoid obstacles. -
1:42 - 1:44I think there is
a way we can navigate past these -
1:44 - 1:47and get to even a better place
than where we are now. -
1:47 - 1:50We just have to be aware of the terrain,
that's all. -
1:50 - 1:51That's really what I try to do.
-
1:51 - 1:54A cautionary tale is probably
a perfect description for it. -
1:54 - 1:58- Irteza from Pakistan,
we were talking earlier about drones -
1:58 - 2:00being part of the last political campaign
in Pakistan. -
2:01 - 2:04Could you talk about the reality
of drones in your country? -
2:04 - 2:08- The second-largest party,
which just won in our recent elections, -
2:08 - 2:11made drones an issue
in their campaign. -
2:11 - 2:13In a lot of incidents over there,
-
2:13 - 2:16they simply targeted
the wrong people -
2:16 - 2:18and killed them.
-
2:18 - 2:20So it's really a bad thing over there.
-
2:20 - 2:24- For every person you harm,
100 might be angry -
2:24 - 2:27and then you create a bigger problem,
and you hit the wrong people. -
2:27 - 2:30It's not an effective way to do things.
-
2:30 - 2:34Even if you agree with it as a technology,
it just is not effective. -
2:34 - 2:38I think especially what will change
America's viewpoint on that is, -
2:38 - 2:42as this technology spreads,
the temptation to use it -
2:42 - 2:44when you're the only one who has it
is very strong, -
2:45 - 2:46but we are rapidly getting to the point
-
2:46 - 2:49where we're going to see
70 nations developing their own. -
2:50 - 2:54I think that will help to finally get this
into some situation where we can -
2:54 - 2:56get a legal framework
-
2:56 - 3:00because it's not in anyone's interest
to continue where we're going. -
3:00 - 3:01It can't just be a free-for-all.
-
3:01 - 3:03- Do you have ideas about
this legal framework? -
3:03 - 3:05What does it look like?
-
3:05 - 3:07Just saying "you can't do it"
is not enough. -
3:07 - 3:11- No, no. Here's the thing, though:
we've done this with all sorts of things. -
3:11 - 3:14Laser blinding weapons is a great example.
-
3:14 - 3:16We can't sit here and recall
all the horrible incidents -
3:16 - 3:20with laser blinding weapons,
because people sat down and said, yeah, -
3:20 - 3:23it's very easy to blind people,
let's not do that. -
3:23 - 3:26And, sure enough, there are all these
international controls. -
3:26 - 3:31So, drones are a little different because
they actually have some civilian uses. -
3:31 - 3:35So, you will have instances
where individuals or small groups -
3:35 - 3:36are putting weapons on drones,
-
3:36 - 3:39and that's why I was thinking
of having an immune system, -
3:39 - 3:44this idea we would create, you know,
drones that would look for other drones. -
3:44 - 3:47And then bring them
to the attention of humans. -
3:47 - 3:49They wouldn't shoot them out of the sky.
-
3:49 - 3:53But there is no way
you're going to stop everything. -
3:53 - 3:55But you don't want to militarise
your entire society -
3:55 - 3:57to defend it against robots.
-
3:57 - 4:00- Actually, let's use
one of the benefits of technology. -
4:00 - 4:02I'm going to bring
some of the Skype people in. -
4:02 - 4:04A non-threatening technology.
-
4:04 - 4:07- I don't think that technology
-
4:07 - 4:09is even a couple
of years away, -
4:09 - 4:13because I found some sites reporting
that you can actually build -
4:13 - 4:18the majority of parts for drones
with a 3D printer. -
4:18 - 4:23So, the drones are around the corner
for small groups, -
4:23 - 4:27not just for terrorist organisations
and countries. -
4:27 - 4:32So, I don't think that a convention
that you spoke about in your TED talk -
4:32 - 4:34is going to do it.
-
4:34 - 4:38I think we need a broader perspective,
more Internet-based. -
4:38 - 4:42I don't know if you have any suggestions;
I think we would all appreciate them. -
4:42 - 4:46- Well, I am curious
when you said a broader solution, -
4:46 - 4:49do you have any idea what that might be?
-
4:50 - 4:53- No, I was hoping you would have it!
-
4:53 - 4:54(Laughter)
-
4:54 - 4:57- Let me say, when I say
international legal framework, -
4:57 - 4:59that's really just the beginning.
-
4:59 - 5:02Again, it's to try to change
the psychology of people -
5:02 - 5:06to say more than just, look at this
cool stuff we can do with these weapons. -
5:06 - 5:07It's the next step.
-
5:07 - 5:11It's an appreciation for how
it would erode democracy, -
5:11 - 5:13this idea of popular government,
making that connection, -
5:14 - 5:15more than just that these are dangerous weapons.
-
5:15 - 5:17Individually, they could hurt people.
-
5:17 - 5:19They can hurt our whole society.
-
5:19 - 5:21So, I take your point,
you're right, in a way, -
5:21 - 5:23but it's the best I can come up with
-
5:23 - 5:27and, you know, I guess I have a devious
mind which is why I write these things, -
5:27 - 5:28but even with a devious mind,
-
5:28 - 5:30I can't come up with
the perfect solution for it. -
5:31 - 5:32- Hugo, do you have a comment?
-
5:32 - 5:38- Yeah, when we look back
at our current technologies, -
5:38 - 5:44the most efficient ones, like GPS,
the Internet, -
5:44 - 5:46or radar,
-
5:46 - 5:51those are technologies
that we get from the military, right? -
5:51 - 5:56So, my question for you is
how is it different with drones now? -
5:56 - 5:58- I don't think it is.
-
5:58 - 6:00- I don't think so.
-
6:00 - 6:04- GPS is used by all of us, really.
Think how great it is. -
6:04 - 6:06I am rather optimistic about human nature.
-
6:06 - 6:08I try to view it as a bell curve.
-
6:08 - 6:11On either extreme, you have people
doing extreme things, -
6:11 - 6:14but most people want to get through
their life and do good things -
6:14 - 6:15and be with their families.
-
6:15 - 6:17So they'll take those technologies,
-
6:17 - 6:19whether developed by the military or not,
-
6:19 - 6:21and they'll do cool,
peaceful things with those. -
6:21 - 6:24Logistics, environmental monitoring,
search and rescue, -
6:24 - 6:26aerial photography.
-
6:26 - 6:29And, then, yes, you'll have
these people who will arm them -
6:29 - 6:33and then they'll have one attack,
and hopefully it'll make them a pariah. -
6:33 - 6:37What would change a lot of Americans'
points of view about it -
6:37 - 6:41is if America were being
attacked like this; -
6:41 - 6:43it would be a page-one issue
we'd have to deal with. -
6:43 - 6:48I'm not saying I want that to happen,
but this is where that goes. -
6:48 - 6:51If you look a few years down the line,
given the price point -
6:51 - 6:55and the accessibility of the technology,
as you pointed out, -
6:55 - 6:58every country will have these
and be tempted to use them -
6:58 - 7:02to solve intractable problems
or problems nobody wants to deal with. -
7:02 - 7:05Again, it's that anonymity thing.
-
7:05 - 7:07That's why we have to get ahead of it,
and at least declare -
7:07 - 7:09that's a very bad way to go
-
7:09 - 7:12because it will destabilise
geopolitics as we know it. -
7:12 - 7:14And that would be very bad.
-
7:14 - 7:16- Anja?
-
7:16 - 7:17- I just had a question.
-
7:17 - 7:19Are you alone in your fight
against drones? -
7:20 - 7:24Like, are you alone in pursuing
the convention, or do you have any, -
7:24 - 7:29like, I don't know,
associations or NGOs helping you? -
7:29 - 7:32- As I was saying, I've talked
to people at Human Rights Watch; -
7:32 - 7:35I think they're working with a group
called Article 36. -
7:35 - 7:38There's also the International
Committee for Robot Arms Control. -
7:38 - 7:40These are all good organisations.
-
7:40 - 7:43I am not directly affiliated with them.
-
7:43 - 7:46But I strongly sympathise
with what they're trying to do. -
7:46 - 7:48Am I alone? Definitely not.
-
7:48 - 7:52There's a lot of people
who are very concerned about this. -
7:52 - 7:56Both autonomous drones
as well as remotely piloted drones -
7:56 - 8:00that are being used for
extrajudicial assassinations, essentially, -
8:00 - 8:05and I think a lot of people
are upset enough -
8:05 - 8:07that they want to prevent
this from happening. -
8:07 - 8:09So, no, I don't think I'm alone.
-
8:09 - 8:11- I wanted to bring you
into the conversation, -
8:11 - 8:14if you have any additional
questions for Daniel or comments? -
8:14 - 8:16- I was thinking, OK,
it sounds inevitable, -
8:16 - 8:17it's going to happen,
-
8:17 - 8:20we're going to have killer robots
in a few years. -
8:20 - 8:23So, should we focus on defence
-
8:23 - 8:29and maybe push the research
towards open research -
8:29 - 8:31instead of banning it,
-
8:31 - 8:36and making sure that defence systems
can be as good as the bad guys'? -
8:36 - 8:38- That's interesting.
-
8:38 - 8:42I think I started from that viewpoint,
-
8:42 - 8:46that idea that it would be important
for democracies not just to have -
8:46 - 8:51robotic drones in defence
but the best robotic drones in defence. -
8:51 - 8:55I've really come around to the idea
that's a bad way to go. -
8:55 - 8:57I really do think
that it shifts the balance -
8:57 - 9:01so much towards offence
that defence is really untenable. -
9:01 - 9:04The best defence
is not to build the damn things. -
9:04 - 9:08That said, I think we can have
very advanced systems -
9:08 - 9:11that are busy snaring
or detecting other drones -
9:11 - 9:13to give humans a heads-up
so that they can say, -
9:13 - 9:15yeah, that's them again.
-
9:15 - 9:18And then have a human make
a lethal decision to, let's say, -
9:18 - 9:20destroy that drone.
-
9:20 - 9:24But I don't think having killer robots
running around overhead, -
9:24 - 9:26that's just going to get ugly real fast.
-
9:26 - 9:29- Again, it's going to be
robots against robots, the good ones. -
9:29 - 9:31- Exactly!
-
9:31 - 9:33Except, this is interesting,
because I hear this a lot: -
9:33 - 9:36what's wrong with bloodless war?
-
9:36 - 9:38And, I don't know,
I feel like telling these people, -
9:38 - 9:40didn't you see Terminator, dammit?!
-
9:40 - 9:44It's the difference between humans
fighting in a war or being the targets -
9:44 - 9:47because, let's face it,
the weak point there would be, -
9:47 - 9:49we've got to get around their drones
-
9:49 - 9:52so we hit the leadership
and the citizens, and make them pay. -
9:52 - 9:54And humans will be
the targets at that point. -
9:54 - 9:56It'll be each side's robots
killing humans. -
9:56 - 10:00The point is that I don't think
we should build them at all, -
10:00 - 10:03and I think we should make sure
they don't have weapons on them. -
10:03 - 10:05- OK, I think we've actually
run out of time. -
10:06 - 10:08I want everyone to get back
into the theatre on time, -
10:08 - 10:10so, Daniel, thank you so much
for joining us. -
10:10 - 10:12You've given us a lot to think about.
-
10:12 - 10:14Thank you for joining us
from Skype around the world. -
10:14 - 10:18And, everyone, we're here
at every break during the week, -
10:18 - 10:19with speakers, all week.
-
10:19 - 10:21So, looking forward to seeing you then.
-
10:21 - 10:22Goodbye.
-
10:22 - 10:25(Applause)
- Title: TED Global 2013 Found in Translation: Daniel Suarez
- Description: In the TED Found in Translation session following his talk, Daniel explores the topic of killer robots in greater detail with a global panel of TED Translators.
- Video Language: English
- Team: closed TED
- Project: TED Translator Resources
- Duration: 10:39