Good afternoon, everybody. I'm Doug Chilcott from TED. Welcome to TED Global, here in the Open Translation lounge. Today we have Daniel Suarez, who left the stage just moments ago after a talk that was pretty alarming and left a lot of people shivering, and I have a lot of questions, so we'll get to you in a moment. Also joining us on stage here are Irteza from Pakistan, Hugo from France, Els from Belgium, and Krystian from Poland. And joining us via Skype, at all hours of the night and day, we have Lazaros from Greece, Anja from Slovenia, Alberto from Italy, and Yasushi in Japan, where it's actually midnight. Thank you for joining us. I'm just going to throw it out to you. Krystian, you had a question about the use of fiction. - Yeah. Your talk is a cautionary tale, but your novel was probably not meant as one. Did this angle, that we have to be careful about this, grow out of your research, or was your novel also meant as a caution? - I really do write cautionary tales. I'll tell you, some people think thriller writers like to be doomsayers, but that's not it, actually. I love technology. I spent nearly 20 years designing software. I try to think of it as spotting icebergs, as opposed to being a doomsayer. That doesn't mean you're not going to go anywhere; it means you might have to turn right or left occasionally to avoid obstacles. I think there is a way we can navigate past these and get to an even better place than where we are now. We just have to be aware of the terrain, that's all. That's really what I try to do. A cautionary tale is probably a perfect description for it. - Irteza from Pakistan, we were talking earlier about drones being part of the last political campaign in Pakistan. Could you talk about the reality of drones in your country? - The party that came second in our recent elections made drones a part of their campaign. There have been a lot of incidents there where the drones targeted the wrong people and killed them. So it's really a bad thing over there. - For every person you harm, a hundred might be angry, and then you create a bigger problem, and you hit the wrong people. It's not an effective way to do things. Even if you agree with it as a technology, it just is not effective. What I think will especially change America's viewpoint on that is the spread of this technology. The temptation to use it when you're the only one who has it is very strong, but we are rapidly getting to the point where we're going to see 70 nations developing their own. I think that will finally help get this to a point where we can establish a legal framework, because it's not in anyone's interest to continue where we're going; it can't just be a free-for-all. - Do you have ideas about this legal framework? What does it look like? Just saying "you can't do it" is not enough. - No, no. Here's the thing, though: we've done this with all sorts of things. Laser blinding weapons are a great example. We can't sit here and list all the horrible incidents involving laser blinding weapons, because people sat down and said, yes, it's very easy to blind people, so let's not do that. And, sure enough, there are all these international controls. Now, drones are a little different, because they actually have some civilian uses. So you will have instances where individuals or small groups are putting weapons on drones, and that's why I was thinking of having an immune system, this idea that we would create, you know, drones that would look for other drones.
And then bring them to the attention of humans. They wouldn't shoot them out of the sky. There is no way you're going to stop everything, but you don't want to militarise your entire society to defend it against robots. - Actually, let's use one of the benefits of technology, a non-threatening technology, and bring some of the Skype people in. - I don't think that technology is a couple of years away, because I've found sites showing that you can actually build the majority of drone parts with a 3D printer. So drones are around the corner for small groups, not just for terrorist organisations and countries. So I don't think that the convention you spoke about in your TED talk is going to do it. I think we need a broader perspective, something more Internet-based. I don't know if you have any suggestions; I think we would all appreciate them. - Well, I am curious: when you say a broader solution, do you have any idea what that might be? - No, I was hoping you would have it! (Laughter) - Let me say, when I say an international legal framework, that's really just the beginning. Again, it's to try to change people's psychology, to say more than just, look at this cool stuff we can do with these weapons. It's the next step. It's an appreciation of how this would erode democracy, this idea of popular government, making that connection, more than just that these are dangerous weapons. Individually, they could hurt people, but they can hurt our whole society. So I take your point, you're right, in a way, but it's the best I can come up with, and, you know, I guess I have a devious mind, which is why I write these things, but even with a devious mind, I can't come up with the perfect solution for it. - Hugo, do you have a comment? - Yeah, when we look back at our current technologies, the most effective ones, like GPS, the Internet, or radar, those are technologies that we got from the military, right? So my question for you is, how is it different with drones now? - I don't think it is. - I don't think so. - GPS is used by all of us, really. Think how great it is. I am rather optimistic about human nature. I try to view it as a bell curve: on either extreme, you have people doing extreme things, but most people want to get through their lives, do good things, and be with their families. So they'll take those technologies, whether developed by the military or not, and they'll do cool, peaceful things with them: logistics, environmental monitoring, search and rescue, aerial photography. And then, yes, you'll have these people who will arm them, and then they'll carry out an attack, and hopefully it'll make them a pariah. What would change a lot of Americans' points of view about it is if America were being attacked like this; it would be a page-one issue we'd have to deal with. I'm not saying I want that to happen, but this is where that goes. If you look a few years down the line, given the price point and the accessibility of the technology, as you pointed out, every country will have these and be tempted to use them to solve intractable problems, or problems nobody wants to deal with. Again, it's that anonymity thing. That's why we have to get ahead of it and at least declare that this is a very bad way to go, because it will destabilise geopolitics as we know it. And that would be very bad. - Anja? - I just had a question. Are you alone in your fight against drones?
Are you alone in pursuing the convention, or do you have any associations or NGOs helping you? - Again, as I was saying, I've talked to people at Human Rights Watch; I think they're working with a group called Article 36. There's also the International Committee for Robot Arms Control. These are all good organisations. I am not directly affiliated with them, but I strongly sympathise with what they're trying to do. Am I alone? Definitely not. There are a lot of people who are very concerned about this, about both autonomous drones and the remotely piloted drones that are being used for what are essentially extrajudicial assassinations, and I think a lot of people are upset enough that they want to prevent this from happening. So, no, I don't think I'm alone. - I wanted to bring you into the conversation: do you have any additional questions or comments for Daniel? - I was thinking, OK, it sounds inevitable, it's going to happen, we're going to have killer robots in a few years. So should we focus on defence, and maybe push the research towards open research instead of banning it, and make sure that defence systems can be as good as the bad guys'? - That's interesting. I think I started from that viewpoint, the idea that it would be important for democracies not just to have robotic drones for defence but the best robotic drones for defence. I've really come around to the idea that that's a bad way to go. I really do think it shifts the balance so much towards offence that defence becomes untenable. The best defence is not to build the damn things. That said, I think we can have very advanced systems that are busy snaring or detecting other drones, to give humans a heads-up so that they can say, yeah, that's them again, and then have a human make the lethal decision to, let's say, destroy that drone. But I don't think we should have killer robots running around overhead; that's just going to get ugly real fast. - So, again, it's going to be robots against robots, the good ones. - Exactly! Except, and this is interesting, because I hear this a lot: what's wrong with bloodless war? And I don't know, I feel like telling these people, didn't you see Terminator, dammit?! It's the difference between humans fighting in a war and humans being the targets, because, let's face it, the weak point there would be the humans: we've got to get around their drones, so we hit the leadership and the citizens, and make them pay. And humans will be the targets at that point. It'll be each side's robots killing humans. The point is that I don't think we should build them at all, and I think we should make sure they don't have weapons on them. - OK, I think we've actually run out of time. I want everyone to get back into the theatre on time, so, Daniel, thank you so much for joining us. You've given us a lot to think about. And thank you to everyone joining us via Skype from around the world. And, everyone, we're here at every break, with speakers, all week. So, looking forward to seeing you then. Goodbye. (Applause)