The kill decision shouldn't belong to a robot
I write fiction: sci-fi thrillers. So if I say "killer robots," you'd probably think something like this. But I'm actually not here to talk about fiction. I'm here to talk about very real killer robots: autonomous combat drones. Now, I'm not referring to Predator and Reaper drones, which have a human making targeting decisions. I'm talking about fully autonomous robotic weapons that make lethal decisions about human beings all on their own. There's actually a technical term for this: lethal autonomy. Now, lethally autonomous killer robots would take many forms: flying, driving, or just lying in wait. And actually, they're very quickly becoming a reality.
These are two automatic sniper stations currently deployed in the DMZ between North and South Korea. Both of these machines are capable of automatically identifying a human target and firing on it, the one on the left at a distance of over a kilometer. Now, in both cases there's still a human in the loop to make that lethal firing decision, but it's not a technological requirement. It's a choice.
And it's that choice that I want to focus on, because as we migrate lethal decision-making from humans to software, we risk not only taking the humanity out of war, but also changing our social landscape entirely, far from the battlefield. That's because the way humans resolve conflict shapes our social landscape.
And this has always been the case, throughout history. For example, these were state-of-the-art weapons systems in 1400 A.D. Now, they were both very expensive to build and maintain, but with these you could dominate the populace, and the distribution of political power in feudal society reflected that. Power was focused at the very top. And what changed? Technological innovation. Gunpowder, cannon. And pretty soon armor and castles were obsolete, and it mattered less who you brought to the battlefield than how many people you brought to the battlefield. And as armies grew in size, the nation-state arose as a political and logistical requirement of defense. And as leaders had to rely on more of their populace, they began to share power. Representative government began to form.
So again, the tools we use to resolve conflict shape our social landscape. Autonomous robotic weapons are such a tool, except that, by requiring very few people to go to war, they risk re-centralizing power into very few hands, possibly reversing a five-century trend toward democracy.
Now, I think, knowing this, we can take decisive steps to preserve our democratic institutions, to do what humans do best, which is adapt. But time is a factor. Seventy nations are developing remotely piloted combat drones of their own, and as you'll see, remotely piloted combat drones are the precursors to autonomous robotic weapons. That's because once you've deployed remotely piloted drones, there are three powerful factors pushing decision-making away from humans and onto the weapon platform itself.
The first of these is the deluge of video that drones produce. For example, in 2004 the U.S. drone fleet produced a grand total of 71 hours of video surveillance for analysis. By 2011 this had gone up to 300,000 hours, outstripping human ability to review it all, but even that number is about to go up drastically. The Pentagon's Gorgon Stare and Argus programs will put up to 65 independently operated camera eyes on each drone platform, and this would vastly outstrip human ability to review it. And that means visual intelligence software will need to scan it for items of interest. And that means very soon drones will tell humans what to look at, not the other way around.
But there's a second powerful incentive pushing decision-making away from humans and onto machines, and that's electromagnetic jamming, severing the connection between the drone and its operator. Now, we saw an example of this in 2011, when an American RQ-170 Sentinel drone got a bit confused over Iran due to a GPS spoofing attack, but any remotely piloted drone is susceptible to this type of attack, and that means drones will have to shoulder more decision-making. They'll know their mission objective, and they'll react to new circumstances without human guidance. They'll ignore external radio signals and send very few of their own.
Which brings us to, really, the third and most powerful incentive pushing decision-making away from humans and onto weapons: plausible deniability. Now, we live in a global economy. High-tech manufacturing is occurring on most continents. Cyber espionage is spiriting away advanced designs to parts unknown, and in that environment it is very likely that a successful drone design will be knocked off in contract factories and proliferate on the gray market. And in that situation, sifting through the wreckage of a suicide drone attack, it will be very difficult to say who sent that weapon.
This raises the very real possibility of anonymous war. It could tilt the geopolitical balance on its head, making it very difficult for a nation to turn its firepower against an attacker, and that could shift the balance in the 21st century away from defense and toward offense. It could make military action a viable option not just for small nations, but for criminal organizations, private enterprise, even powerful individuals. It could create a landscape of rival warlords undermining rule of law and civil society. Now, if responsibility and transparency are two of the cornerstones of representative government, autonomous robotic weapons could undermine both.
Now, you might be thinking that citizens of high-tech nations would have the advantage in any robotic war, that citizens of those nations would be less vulnerable, particularly against developing nations. But I think the truth is the exact opposite. I think citizens of high-tech societies are more vulnerable to robotic weapons, and the reason can be summed up in one word: data. Data powers high-tech societies. Cell-phone geolocation, telecom metadata, social media, email, texts, financial transaction data, transportation data: it's a wealth of real-time data on the movements and social interactions of people. In short, we are more visible to machines than any people in history, and this perfectly suits the targeting needs of autonomous weapons.
What you're looking at here is a link analysis map of a social group. Lines indicate social connectedness between individuals. And these types of maps can be automatically generated based on the data trail modern people leave behind. Now, it's typically used to market goods and services to targeted demographics, but it's a dual-use technology, because targeting is used in another context. Notice that certain individuals are highlighted. These are the hubs of social networks. These are organizers, opinion-makers, leaders, and these people, too, can be automatically identified from their communication patterns. Now, if you're a marketer, you might then target them with product samples, try to spread your brand through their social group. But if you're a repressive government searching for political enemies, you might instead remove them, eliminate them, disrupt their social group, and those who remain behind lose social cohesion and organization.
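To make the idea concrete, here is a minimal sketch of the kind of automatic hub identification described above. It assumes only that communication records are available as pairs of people who contact each other; the "hubs" are then simply the highest-degree nodes in that graph. All names and data below are invented for illustration, and real link-analysis tools are far more sophisticated.

```python
from collections import Counter

# Hypothetical communication log: each pair means "these two people interact."
contacts = [
    ("ana", "ben"), ("ana", "cho"), ("ana", "dev"), ("ana", "eli"),
    ("ben", "cho"), ("dev", "eli"), ("fay", "gus"), ("fay", "ana"),
]

def find_hubs(edges, top_n=2):
    """Rank people by degree (number of distinct contacts); return the top N."""
    degree = Counter()
    for a, b in set(edges):  # de-duplicate repeated contacts
        degree[a] += 1
        degree[b] += 1
    return [person for person, _ in degree.most_common(top_n)]

print(find_hubs(contacts))  # "ana" ranks first: she has the most connections
```

Even this toy degree count surfaces the organizer of the group, which is exactly why the same analysis serves both marketers and repressive governments.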
Now, in a world of cheap, proliferating robotic weapons, borders would offer very little protection to critics of distant governments or transnational criminal organizations. Popular movements agitating for change could be detected early and their leaders eliminated before their ideas achieve critical mass. And ideas achieving critical mass is what political activism in popular government is all about. Anonymous lethal weapons could make lethal action an easy choice for all sorts of competing interests. And this would put a chill on free speech and popular political action, the very heart of democracy.
And this is why we need an international treaty on robotic weapons, and in particular a global ban on the development and deployment of killer robots. Now, we already have international treaties on nuclear and biological weapons, and, while imperfect, these have largely worked. But robotic weapons might be every bit as dangerous, because they will almost certainly be used, and they would also be corrosive to our democratic institutions.
Now, in November 2012 the U.S. Department of Defense issued a directive requiring a human being to be present in all lethal decisions. This effectively, if temporarily, banned autonomous weapons in the U.S. military, but that directive needs to be made permanent. And it could set the stage for global action.
Because we need an international legal framework for robotic weapons. And we need it now, before there's a devastating attack or a terrorist incident that causes the nations of the world to rush to adopt these weapons before thinking through the consequences. Autonomous robotic weapons concentrate too much power in too few hands, and they would imperil democracy itself.
Now, don't get me wrong, I think there are tons of great uses for unarmed civilian drones: environmental monitoring, search and rescue, logistics. If we have an international treaty on robotic weapons, how do we gain the benefits of autonomous drones and vehicles while still protecting ourselves against illegal robotic weapons? I think the secret will be transparency. No robot should have an expectation of privacy in a public place.

(Applause)
Each robot and drone should have a cryptographically signed I.D. burned in at the factory that can be used to track its movement through public spaces. We have license plates on cars, tail numbers on aircraft. This is no different. And every citizen should be able to download an app that shows the population of drones and autonomous vehicles moving through public spaces around them, both right now and historically.
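As a rough sketch of what a factory-signed I.D. could look like: the factory attaches a signature over each drone's serial number, and anyone who later observes the drone can check that signature. A real scheme would use an asymmetric signature such as Ed25519, so verifiers hold only a public key; the toy version below substitutes a standard-library HMAC, and the key and serial formats are invented for illustration.

```python
import hmac
import hashlib

# Hypothetical factory key. In practice the factory would keep a private
# signing key and publish a public verification key; a shared HMAC key is
# used here only because it is available in the standard library.
FACTORY_KEY = b"demo-factory-key"

def issue_id(serial: str) -> tuple[str, str]:
    """At the factory: burn in a serial number plus a signature over it."""
    tag = hmac.new(FACTORY_KEY, serial.encode(), hashlib.sha256).hexdigest()
    return serial, tag

def verify_id(serial: str, tag: str) -> bool:
    """In the field: check that a broadcast I.D. was really factory-issued."""
    expected = hmac.new(FACTORY_KEY, serial.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

serial, tag = issue_id("DRONE-0042")
print(verify_id(serial, tag))        # True: genuine I.D.
print(verify_id("DRONE-9999", tag))  # False: serial and signature don't match
```

A drone broadcasting an I.D. that fails this check, or no I.D. at all, is exactly the "rogue drone" the civic sensors described next would flag.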
And civic leaders should deploy sensors and civic drones to detect rogue drones, and instead of sending killer drones of their own up to shoot them down, they should notify humans of their presence. And in certain very high-security areas, perhaps civic drones would snare them and drag them off to a bomb disposal facility. But notice, this is more an immune system than a weapons system. It would allow us to avail ourselves of the use of autonomous vehicles and drones while still preserving our open, civil society.
We must ban the deployment and development of killer robots. Let's not succumb to the temptation to automate war. Autocratic governments and criminal organizations undoubtedly will, but let's not join them. Autonomous robotic weapons would concentrate too much power in too few unseen hands, and that would be corrosive to representative government. Let's make sure, for democracies at least, killer robots remain fiction. Thank you.

(Applause)

Thank you.

(Applause)
Title: The kill decision shouldn't belong to a robot
Speaker: Daniel Suarez
Description: As a novelist, Daniel Suarez spins dystopian tales of the future. But on the TEDGlobal stage, he talks us through a real-life scenario we all need to know more about: the rise of autonomous robotic weapons of war. Advanced drones, automated weapons and AI-powered intelligence-gathering tools, he suggests, could take the decision to make war out of the hands of humans.
Video Language: English
Duration: 13:20