TED Global 2013 Found in Translation: Daniel Suarez

  • 0:09 - 0:12
    Good afternoon, everybody.
    I'm Doug Chilcott from TED.
  • 0:12 - 0:16
    Welcome to TED Global
    here in the Open Translation lounge.
  • 0:16 - 0:19
    Today we have Daniel Suarez, who just
    left the stage moments ago
  • 0:19 - 0:23
    with a talk that was pretty alarming
    and left a lot of people shivering,
  • 0:23 - 0:25
    and I have a lot of questions.
  • 0:25 - 0:26
    So we'll get to you in a moment.
  • 0:26 - 0:30
    Also joining us on stage
    here are Irteza from Pakistan,
  • 0:30 - 0:32
    Hugo from France,
  • 0:32 - 0:34
    Els from Belgium,
  • 0:34 - 0:36
    and Krystian from Poland.
  • 0:36 - 0:40
    And joining us via Skype,
    at all hours of the night and day,
  • 0:40 - 0:43
    we have Lazaros from Greece,
  • 0:43 - 0:45
    Anja from Slovenia,
  • 0:45 - 0:48
    Alberto from Italy,
  • 0:48 - 0:52
    and Yasushi, in Japan,
    where it's actually midnight.
  • 0:52 - 0:54
    Thank you for joining us.
  • 0:55 - 0:58
    I'm just actually
    going to throw it out to you.
  • 0:58 - 1:00
    Krystian, you had a question
    about the use of fiction.
  • 1:00 - 1:01
    - Yeah.
  • 1:02 - 1:05
    I was wondering,
    because your talk is a cautionary tale,
  • 1:05 - 1:09
    but your novel was probably
    not meant as one.
  • 1:09 - 1:13
    Did this angle, that we have to be
    careful about this,
  • 1:13 - 1:15
    grow out of your research,
  • 1:15 - 1:19
    or was your novel also
    meant as a caution?
  • 1:19 - 1:21
    - I really do cautionary tales.
  • 1:21 - 1:26
    I'll tell you, some people think
    thriller writers like to be doomsayers.
  • 1:26 - 1:29
    But that's not it, actually.
    I love technology.
  • 1:29 - 1:32
    I spent nearly 20 years
    designing software.
  • 1:32 - 1:36
    I try to think of it as spotting
    icebergs, as opposed to being a doomsayer.
  • 1:36 - 1:38
    That doesn't mean
    you're not going to go anywhere;
  • 1:38 - 1:42
    it means you might have to turn right
    or left occasionally to avoid obstacles.
  • 1:42 - 1:44
    I think there is
    a way we can navigate past these
  • 1:44 - 1:47
    and get to even a better place
    than where we are now.
  • 1:47 - 1:50
    We just have to be aware of the terrain,
    that's all.
  • 1:50 - 1:51
    That's really what I try to do.
  • 1:51 - 1:54
    A cautionary tale is probably
    a perfect description for it.
  • 1:54 - 1:58
    - Irteza from Pakistan,
    we were talking earlier about drones
  • 1:58 - 2:00
    being part of the last political campaign
    in Pakistan.
  • 2:01 - 2:04
    Could you talk about the reality
    of drones in your country?
  • 2:04 - 2:08
    - The party that finished second
    in our recent elections,
  • 2:08 - 2:11
    made drones a part of their campaign.
  • 2:11 - 2:13
    In a lot of incidents over there,
  • 2:13 - 2:16
    they just targeted
    the wrong guy.
  • 2:16 - 2:18
    And they killed him.
  • 2:18 - 2:20
    So it's really a bad thing over there.
  • 2:20 - 2:24
    - For every person you harm,
    100 might be angry
  • 2:24 - 2:27
    and then you create a bigger problem,
    and you hit the wrong people.
  • 2:27 - 2:30
    It's not an effective way to do things.
  • 2:30 - 2:34
    Even if you agree with it as a technology,
    it just is not effective.
  • 2:34 - 2:38
    I think especially what will change
    America's viewpoint on that is,
  • 2:38 - 2:42
    as this technology spreads,
    the temptation to use it
  • 2:42 - 2:44
    when you're the only one who has it
    is very strong,
  • 2:45 - 2:46
    but we are rapidly getting to the point
  • 2:46 - 2:49
    where we're going to see
    70 nations developing their own.
  • 2:50 - 2:54
    I think that will finally help get this
    to a point where we can
  • 2:54 - 2:56
    get a legal framework
  • 2:56 - 3:00
    because it's not in anyone's interest
    to continue where we're going.
  • 3:00 - 3:01
    It can't just be a free-for-all.
  • 3:01 - 3:03
    - Do you have ideas about
    this legal framework?
  • 3:03 - 3:05
    What does it look like?
  • 3:05 - 3:07
    Just saying "you can't do it"
    is not enough.
  • 3:07 - 3:11
    - No, no. Here's the thing, though:
    we've done this with all sorts of things.
  • 3:11 - 3:14
    Laser blinding weapons are a great example.
  • 3:14 - 3:16
    We cannot sit here and think
    of all the horrible incidents
  • 3:16 - 3:20
    with laser blinding weapons
    because people sat down and said, yeah,
  • 3:20 - 3:23
    it's very easy to blind people,
    let's not do that.
  • 3:23 - 3:26
    And, sure enough, there are all these
    international controls.
  • 3:26 - 3:31
    So, drones are a little different because
    they actually have some civilian uses.
  • 3:31 - 3:35
    So, you will have instances
    where individuals or small groups
  • 3:35 - 3:36
    are putting weapons on drones,
  • 3:36 - 3:39
    and that's why I was thinking
    of having an immune system,
  • 3:39 - 3:44
    this idea we would create, you know,
    drones that would look for other drones.
  • 3:44 - 3:47
    And then bring them
    to the attention of humans.
  • 3:47 - 3:49
    They wouldn't shoot them out of the sky.
  • 3:49 - 3:53
    But there is no way
    you're going to stop everything.
  • 3:53 - 3:55
    But you don't want to militarise
    your entire society
  • 3:55 - 3:57
    to defend it against robots.
  • 3:57 - 4:00
    - Actually, let's use
    one of the benefits of technology.
  • 4:00 - 4:02
    I'm going to bring
    some of the Skype people in.
  • 4:02 - 4:04
    A non-threatening technology.
  • 4:04 - 4:07
    - I don't think that technology
  • 4:07 - 4:09
    is a couple of years away,
  • 4:09 - 4:13
    because I found some sites claiming
    that you can actually build
  • 4:13 - 4:18
    the majority of parts for drones
    with a 3D printer.
  • 4:18 - 4:23
    So, drones are around the corner
    for small groups,
  • 4:23 - 4:27
    not just for terrorist organisations
    and countries.
  • 4:27 - 4:32
    So, I don't think that the convention
    that you spoke about in your TED talk
  • 4:32 - 4:34
    is going to do it.
  • 4:34 - 4:38
    I think we need a broader perspective,
    more Internet-based.
  • 4:38 - 4:42
    I don't know if you have any suggestions;
    I think we would all appreciate them.
  • 4:42 - 4:46
    - Well, I am curious,
    when you said a broader solution,
  • 4:46 - 4:49
    do you have any idea what that might be?
  • 4:50 - 4:53
    - No, I was hoping you would have it!
  • 4:53 - 4:54
    (Laughter)
  • 4:54 - 4:57
    - Let me say, when I say
    international legal framework,
  • 4:57 - 4:59
    that's really just the beginning.
  • 4:59 - 5:02
    Again, it's to try to change
    the psychology of people
  • 5:02 - 5:06
    to say more than just, look at this
    cool stuff we can do with these weapons.
  • 5:06 - 5:07
    It's the next step.
  • 5:07 - 5:11
    It's an appreciation for how
    it would erode democracy,
  • 5:11 - 5:13
    this idea of popular government,
    making that connection,
  • 5:14 - 5:15
    more than just saying these are dangerous weapons.
  • 5:15 - 5:17
    Individually, they could hurt people.
  • 5:17 - 5:19
    But they can hurt our whole society.
  • 5:19 - 5:21
    So, I take your point,
    you're right, in a way,
  • 5:21 - 5:23
    but it's the best I can come up with
  • 5:23 - 5:27
    and, you know, I guess I have a devious
    mind which is why I write these things,
  • 5:27 - 5:28
    but even with a devious mind,
  • 5:28 - 5:30
    I can't come up with
    the perfect solution for it.
  • 5:31 - 5:32
    - Hugo, do you have a comment?
  • 5:32 - 5:38
    - Yeah, when we look back
    at our current technologies,
  • 5:38 - 5:44
    the most efficient ones, like GPS,
    the Internet,
  • 5:44 - 5:46
    or radar,
  • 5:46 - 5:51
    those are technologies
    that we got from the military, right?
  • 5:51 - 5:56
    So, my question for you is
    how is it different with drones now?
  • 5:56 - 5:58
    - I don't think it is.
  • 5:58 - 6:00
    - I don't think so.
  • 6:00 - 6:04
    - GPS is used by all of us, really.
    Think how great it is.
  • 6:04 - 6:06
    I am rather optimistic about human nature.
  • 6:06 - 6:08
    I try to view it as a bell curve.
  • 6:08 - 6:11
    On either extreme, you have people
    doing extreme things,
  • 6:11 - 6:14
    but most people want to get through
    their life and do good things
  • 6:14 - 6:15
    and be with their families.
  • 6:15 - 6:17
    So they'll take those technologies,
  • 6:17 - 6:19
    whether developed by the military or not,
  • 6:19 - 6:21
    and they'll do cool,
    peaceful things with those.
  • 6:21 - 6:24
    Logistics, environmental monitoring,
    search and rescue,
  • 6:24 - 6:26
    aerial photography.
  • 6:26 - 6:29
    And, then, yes, you'll have
    these people who will arm them
  • 6:29 - 6:33
    and then they'll have one attack,
    and hopefully it'll make them a pariah.
  • 6:33 - 6:37
    What would change a lot of Americans'
    points of view about it
  • 6:37 - 6:41
    is if America were being
    attacked like this;
  • 6:41 - 6:43
    it would be a page-one issue
    we'd have to deal with.
  • 6:43 - 6:48
    I'm not saying I want that to happen,
    but this is where that goes.
  • 6:48 - 6:51
    If you look a few years down the line,
    given the price point
  • 6:51 - 6:55
    and the accessibility of the technology,
    as you pointed out,
  • 6:55 - 6:58
    every country will have these
    and be tempted to use them
  • 6:58 - 7:02
    to solve intractable problems
    or problems nobody wants to deal with.
  • 7:02 - 7:05
    Again, it's that anonymity thing.
  • 7:05 - 7:07
    That's why we have to get ahead of it,
    and at least declare
  • 7:07 - 7:09
    that's a very bad way to go
  • 7:09 - 7:12
    because it will destabilise
    geopolitics as we know it.
  • 7:12 - 7:14
    And that would be very bad.
  • 7:14 - 7:16
    - Anja?
  • 7:16 - 7:17
    - I just had a question.
  • 7:17 - 7:19
    Are you alone in your fight
    against drones?
  • 7:20 - 7:24
    Like, are you alone in pursuing
    the convention, or do you have any,
  • 7:24 - 7:29
    like, I don't know,
    associations or NGOs helping you?
  • 7:29 - 7:32
    - Again, as I was saying, I've talked
    to people at Human Rights Watch,
  • 7:32 - 7:35
    I think they're dealing with a group
    called Article 36.
  • 7:35 - 7:38
    There's the International
    Committee for Robot Arms Control.
  • 7:38 - 7:40
    These are all good organisations.
  • 7:40 - 7:43
    I am not directly affiliated with them.
  • 7:43 - 7:46
    But I strongly sympathise
    with what they're trying to do.
  • 7:46 - 7:48
    Am I alone? Definitely not.
  • 7:48 - 7:52
    There's a lot of people
    who are very concerned about this.
  • 7:52 - 7:56
    Both autonomous drones
    as well as remotely piloted drones
  • 7:56 - 8:00
    that are being used for
    extrajudicial assassinations, essentially,
  • 8:00 - 8:05
    and I think a lot of people
    are upset enough
  • 8:05 - 8:07
    that they want to prevent
    this from happening.
  • 8:07 - 8:09
    So, no, I don't think I'm alone.
  • 8:09 - 8:11
    - I wanted to bring you
    into the conversation,
  • 8:11 - 8:14
    if you have any additional
    questions for Daniel or comments?
  • 8:14 - 8:16
    - I was thinking, OK,
    it sounds inevitable,
  • 8:16 - 8:17
    it's going to happen,
  • 8:17 - 8:20
    we're going to have killer robots
    in a few years.
  • 8:20 - 8:23
    So, should we focus on defence
  • 8:23 - 8:29
    and maybe push the research
    towards open research
  • 8:29 - 8:31
    instead of banning it,
  • 8:31 - 8:36
    and making sure that defence systems
    can be as good as the bad guys'?
  • 8:36 - 8:38
    - That's interesting.
  • 8:38 - 8:42
    I think I started from that viewpoint,
  • 8:42 - 8:46
    that idea that it would be important
    for democracies not just to have
  • 8:46 - 8:51
    robotic drones in defence
    but the best robotic drones in defence.
  • 8:51 - 8:55
    I've really come around to the idea
    that that's a bad way to go.
  • 8:55 - 8:57
    I really do think
    that it shifts the balance
  • 8:57 - 9:01
    so much towards offence
    that defence is really untenable.
  • 9:01 - 9:04
    The best defence
    is not to build the damn things.
  • 9:04 - 9:08
    That said, I think we can have
    very advanced systems
  • 9:08 - 9:11
    that are busy snaring
    or detecting other drones
  • 9:11 - 9:13
    to give humans a heads-up
    so that they can say,
  • 9:13 - 9:15
    yeah, that's them again.
  • 9:15 - 9:18
    And then have a human make
    a lethal decision to, let's say,
  • 9:18 - 9:20
    destroy that drone.
  • 9:20 - 9:24
    But if we have killer robots
    running around overhead,
  • 9:24 - 9:26
    that's just going to get ugly real fast.
  • 9:26 - 9:29
    - Again, it's going to be
    robots against robots, the good ones.
  • 9:29 - 9:31
    - Exactly!
  • 9:31 - 9:33
    Except, this is interesting,
    because I hear this a lot:
  • 9:33 - 9:36
    what's wrong with bloodless war?
  • 9:36 - 9:38
    And, I don't know,
    I feel like telling these people,
  • 9:38 - 9:40
    didn't you see Terminator, dammit?!
  • 9:40 - 9:44
    It's the difference between humans
    fighting in a war and being the targets,
  • 9:44 - 9:47
    because, let's face it,
    the weak point there would be,
  • 9:47 - 9:49
    we've got to get around their drones
  • 9:49 - 9:52
    so we hit the leadership
    and the citizens, and make them pay.
  • 9:52 - 9:54
    And humans will be
    the targets at that point.
  • 9:54 - 9:56
    It'll be each side's robots
    killing humans.
  • 9:56 - 10:00
    The point is that I don't think
    we should build them at all,
  • 10:00 - 10:03
    and I think we should make sure
    they don't have weapons on them.
  • 10:03 - 10:05
    - OK, I think we've actually
    run out of time.
  • 10:06 - 10:08
    I want everyone to get back
    into the theatre on time,
  • 10:08 - 10:10
    so, Daniel, thank you so much
    for joining us.
  • 10:10 - 10:12
    You've given us a lot to think about.
  • 10:12 - 10:14
    Thank you for joining us
    via Skype from around the world.
  • 10:14 - 10:18
    And, everyone, we're here
    at every break during the week,
  • 10:18 - 10:19
    with speakers.
  • 10:19 - 10:21
    So, looking forward to seeing you then.
  • 10:21 - 10:22
    Goodbye.
  • 10:22 - 10:25
    (Applause)
Title:
TED Global 2013 Found in Translation: Daniel Suarez
Description:
In the TED Found in Translation Session following his talk, Daniel explores the topic of killer robots in greater detail with a global panel of TED Translators.
Video Language:
English
Team:
TED
Project:
TED Translator Resources
Duration:
10:39