[Script Info]
Title:

[Events]
Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
Dialogue: 0,0:00:00.00,0:00:15.01,Default,,0000,0000,0000,,{\i1}34c3 intro{\i0}
Dialogue: 0,0:00:15.01,0:00:18.34,Default,,0000,0000,0000,,Herald: ... Anja Dahlmann, a\Npolitical scientist and researcher at
Dialogue: 0,0:00:18.34,0:00:23.31,Default,,0000,0000,0000,,Stiftung Wissenschaft und Politik, a\NBerlin-based think tank. Here we go.
Dialogue: 0,0:00:23.31,0:00:30.16,Default,,0000,0000,0000,,{\i1}applause{\i0}
Dialogue: 0,0:00:30.16,0:00:40.03,Default,,0000,0000,0000,,Anja Dahlmann: Yeah, thanks for being\Nhere. I probably neither cut myself nor
Dialogue: 0,0:00:40.03,0:00:44.27,Default,,0000,0000,0000,,proposed, but I hope it's still\Ninteresting. I'm going to talk about
Dialogue: 0,0:00:44.27,0:00:49.24,Default,,0000,0000,0000,,preventive arms control and international\Nhumanitarian law and their role in this
Dialogue: 0,0:00:49.24,0:00:53.27,Default,,0000,0000,0000,,international debate around autonomous\Nweapons. This type of weapon is also
Dialogue: 0,0:00:53.27,0:00:59.01,Default,,0000,0000,0000,,referred to as Lethal Autonomous Weapons\NSystems, LAWS for short, or killer robots.
Dialogue: 0,0:00:59.01,0:01:04.24,Default,,0000,0000,0000,,So if I say LAWS, I mostly mean these\Nweapons and not legal laws, just to
Dialogue: 0,0:01:04.24,0:01:11.65,Default,,0000,0000,0000,,confuse you a bit. Okay. I will discuss\Nthis topic along three questions. First of
Dialogue: 0,0:01:11.65,0:01:18.30,Default,,0000,0000,0000,,all, what are we actually talking about\Nhere, what are autonomous weapons? Second,
Dialogue: 0,0:01:18.30,0:01:22.21,Default,,0000,0000,0000,,why should we even care about this? Why is\Nit important? And third, how could this
Dialogue: 0,0:01:22.21,0:01:31.48,Default,,0000,0000,0000,,issue be addressed on the international\Nlevel? So. I'll go through my slides.
Dialogue: 0,0:01:31.48,0:01:37.77,Default,,0000,0000,0000,,Anyway, what are we talking about here?\NWell, during the international negotiations, so
Dialogue: 0,0:01:37.77,0:01:45.36,Default,,0000,0000,0000,,far no real, no common definition has been\Nfound. So states parties try to find
Dialogue: 0,0:01:45.36,0:01:50.09,Default,,0000,0000,0000,,something, or not, and for my presentation I\Nwill just use a very broad definition of
Dialogue: 0,0:01:50.09,0:01:57.23,Default,,0000,0000,0000,,autonomous weapons, which is: weapons that\Ncan, once activated, execute a broad range
Dialogue: 0,0:01:57.23,0:02:02.35,Default,,0000,0000,0000,,of tasks or select and engage targets\Nwithout further human intervention. And
Dialogue: 0,0:02:02.35,0:02:08.34,Default,,0000,0000,0000,,there's just a very broad spectrum of weapons\Nthat might fall under this definition.
Dialogue: 0,0:02:08.34,0:02:13.15,Default,,0000,0000,0000,,Actually, some existing ones are in there as\Nwell, which you can't see here. That would
Dialogue: 0,0:02:13.15,0:02:19.94,Default,,0000,0000,0000,,be the Phalanx system, for example. It's\Nbeen around since the 1970s. Sorry...
Dialogue: 0,0:02:19.94,0:02:22.95,Default,,0000,0000,0000,,Herald: We can't hear anything\Non stage. Just keep going.
Dialogue: 0,0:02:22.95,0:02:27.31,Default,,0000,0000,0000,,Dahlmann: Sorry. So, the Phalanx system has\Nbeen around since the 1970s, a US system,
Dialogue: 0,0:02:27.31,0:02:33.07,Default,,0000,0000,0000,,an air defense system based on ships, and\Nit's meant to, just, yeah, defend the ship
Dialogue: 0,0:02:33.07,0:02:39.33,Default,,0000,0000,0000,,against incoming objects from the air. So\Nthat has been around for quite a
Dialogue: 0,0:02:39.33,0:02:45.17,Default,,0000,0000,0000,,long time and it might even be part of\Nthis LAWS definition or not, but just to
Dialogue: 0,0:02:45.17,0:02:49.41,Default,,0000,0000,0000,,give you an impression how broad this\Nrange is: today, we've got for example
Dialogue: 0,0:02:49.41,0:02:57.99,Default,,0000,0000,0000,,demonstrators like the Taranis drone, a UK\Nsystem, or the X-47B, which can, for
Dialogue: 0,0:02:57.99,0:03:04.96,Default,,0000,0000,0000,,example, autonomously land\N{\i1}applause{\i0}
Dialogue: 0,0:03:04.96,0:03:08.70,Default,,0000,0000,0000,,land on aircraft carriers and can be air-\Nrefueled and stuff like that, which is
Dialogue: 0,0:03:08.70,0:03:13.55,Default,,0000,0000,0000,,apparently quite impressive if you don't\Nneed a human to do that, and in the future
Dialogue: 0,0:03:13.55,0:03:18.81,Default,,0000,0000,0000,,there might be even, or there probably\Nwill be, even more autonomous functions,
Dialogue: 0,0:03:18.81,0:03:25.65,Default,,0000,0000,0000,,so navigation, landing, refueling, all\Nthat stuff. That's, you know, old, but at
Dialogue: 0,0:03:25.65,0:03:30.09,Default,,0000,0000,0000,,some point weapons might\Nbe able to choose their own ammunition
Dialogue: 0,0:03:30.09,0:03:34.56,Default,,0000,0000,0000,,according to the situation. They might be\Nable to choose their target and decide
Dialogue: 0,0:03:34.56,0:03:41.72,Default,,0000,0000,0000,,when to engage with the target without any\Nhuman intervention at some point. And
Dialogue: 0,0:03:41.72,0:03:45.00,Default,,0000,0000,0000,,that's quite problematic; I will tell you\Nwhy in a minute. Overall, you can
Dialogue: 0,0:03:45.00,0:03:52.23,Default,,0000,0000,0000,,see that there's a gradual decline of\Nhuman control over weapons systems or over
Dialogue: 0,0:03:52.23,0:03:58.05,Default,,0000,0000,0000,,weapons and the use of force. So that's a\Nvery short and broad impression of what
Dialogue: 0,0:03:58.05,0:04:01.36,Default,,0000,0000,0000,,we're talking about here. And talking\Nabout definitions, it's always interesting
Dialogue: 0,0:04:01.36,0:04:05.73,Default,,0000,0000,0000,,what you're not talking about, and that's\Nwhy I want to address some misconceptions
Dialogue: 0,0:04:05.73,0:04:13.48,Default,,0000,0000,0000,,in the public debate. First of all, when\Nwe talk about machine autonomy, or
Dialogue: 0,0:04:13.48,0:04:19.86,Default,,0000,0000,0000,,artificial intelligence, which is\Nthe technology behind this,
Dialogue: 0,0:04:19.86,0:04:25.42,Default,,0000,0000,0000,,people - not you, probably - in the media\Nand the broader public often get the idea
Dialogue: 0,0:04:25.42,0:04:30.54,Default,,0000,0000,0000,,that these machines might have some kind\Nof real intelligence or intention or be an
Dialogue: 0,0:04:30.54,0:04:36.26,Default,,0000,0000,0000,,entity in their own right, and they're just not.\NIt's just statistical methods, it's just
Dialogue: 0,0:04:36.26,0:04:40.74,Default,,0000,0000,0000,,math, and you know way more about this than\NI do, so I will leave it at this and just
Dialogue: 0,0:04:40.74,0:04:45.60,Default,,0000,0000,0000,,highlight that these\Nmachines, these weapons, have certain
Dialogue: 0,0:04:45.60,0:04:50.07,Default,,0000,0000,0000,,competences for specific tasks. They are\Nnot entities in their own right, they are
Dialogue: 0,0:04:50.07,0:04:55.05,Default,,0000,0000,0000,,not intentional. And that's important when\Nwe talk about ethical and legal challenges
Dialogue: 0,0:04:55.05,0:05:06.58,Default,,0000,0000,0000,,afterwards. Sorry. There it is. And\Nin connection with this, there's
Dialogue: 0,0:05:06.58,0:05:11.38,Default,,0000,0000,0000,,another one, which is the plethora of\NTerminator references in the media as soon
Dialogue: 0,0:05:11.38,0:05:15.00,Default,,0000,0000,0000,,as you talk about autonomous weapons,\Nmostly referred to as killer robots in
Dialogue: 0,0:05:15.00,0:05:20.25,Default,,0000,0000,0000,,this context. And just in case you intend to\Nwrite an article about this: don't use a
Dialogue: 0,0:05:20.25,0:05:24.34,Default,,0000,0000,0000,,Terminator picture, please. Don't, because\Nit's really unhelpful for understanding where
Dialogue: 0,0:05:24.34,0:05:30.35,Default,,0000,0000,0000,,the problems are. With this kind of thing,\Npeople assume that the problems only start
Dialogue: 0,0:05:30.35,0:05:34.16,Default,,0000,0000,0000,,when we have machines with human-like\Nintelligence which look like the
Dialogue: 0,0:05:34.16,0:05:39.75,Default,,0000,0000,0000,,Terminator or something like this. And the\Nproblem is that they really start way before
Dialogue: 0,0:05:39.75,0:05:46.82,Default,,0000,0000,0000,,that: when you use assisting systems, when\Nyou have man-machine or human-machine teaming,
Dialogue: 0,0:05:46.82,0:05:50.86,Default,,0000,0000,0000,,or when you accumulate a couple of\Nautonomous functions through the targeting
Dialogue: 0,0:05:50.86,0:05:57.72,Default,,0000,0000,0000,,cycle - the military steps\Nthat lead to the use of force or to
Dialogue: 0,0:05:57.72,0:06:03.56,Default,,0000,0000,0000,,the killing of people. And the Terminator,\Nthis is really not our problem at the
Dialogue: 0,0:06:03.56,0:06:08.35,Default,,0000,0000,0000,,moment. So please keep this in mind,\Nbecause it's not just semantics
Dialogue: 0,0:06:08.35,0:06:15.35,Default,,0000,0000,0000,,to differentiate between these two things.\NIt really manages the expectations of
Dialogue: 0,0:06:15.35,0:06:20.78,Default,,0000,0000,0000,,political and military decision-makers.\NOk, so now you've got kind of an
Dialogue: 0,0:06:20.78,0:06:23.62,Default,,0000,0000,0000,,impression of what I'm talking about here,\Nso why should we actually talk about this?
Dialogue: 0,0:06:23.62,0:06:30.09,Default,,0000,0000,0000,,What's all the fuss about? Actually,\Nautonomous weapons have or would have
Dialogue: 0,0:06:30.09,0:06:34.17,Default,,0000,0000,0000,,quite a few military advantages: they\Nmight be, in some cases, faster or even
Dialogue: 0,0:06:34.17,0:06:39.78,Default,,0000,0000,0000,,more precise than humans. And you don't\Nneed a constant communication link, so you
Dialogue: 0,0:06:39.78,0:06:43.74,Default,,0000,0000,0000,,don't have to worry about\Nunstable communication links, you don't
Dialogue: 0,0:06:43.74,0:06:50.02,Default,,0000,0000,0000,,have to worry about latency or detection\Nor the vulnerability of this specific link.
Dialogue: 0,0:06:50.02,0:06:57.05,Default,,0000,0000,0000,,So yay! And a lot of, let's say, very\Ninteresting military options come from
Dialogue: 0,0:06:57.05,0:07:02.52,Default,,0000,0000,0000,,that. People talk about stealthy\Noperations in shallow waters, for example,
Dialogue: 0,0:07:02.52,0:07:07.34,Default,,0000,0000,0000,,or, you know, remote missions in secluded\Nareas, things like that. And you can get
Dialogue: 0,0:07:07.34,0:07:13.83,Default,,0000,0000,0000,,very creative with tiny robots and\Nswarms, for example. So shiny new options.
Dialogue: 0,0:07:13.83,0:07:19.63,Default,,0000,0000,0000,,But, and of course there's a "but", it\Ncomes at a price, because you have at least
Dialogue: 0,0:07:19.63,0:07:26.59,Default,,0000,0000,0000,,three dimensions of challenges in this\Nregard. First of all, the legal ones. When
Dialogue: 0,0:07:26.59,0:07:31.39,Default,,0000,0000,0000,,we talk about these weapons, they might\Nbe, they will be applied in conflicts where
Dialogue: 0,0:07:31.39,0:07:37.50,Default,,0000,0000,0000,,international humanitarian law, IHL,\Napplies. And IHL consists of quite a few
Dialogue: 0,0:07:37.50,0:07:45.43,Default,,0000,0000,0000,,very abstract principles. For example:\Nthe principle of distinction between
Dialogue: 0,0:07:45.43,0:07:51.31,Default,,0000,0000,0000,,combatants and civilians, the principle of\Nproportionality, or military necessity.
Dialogue: 0,0:07:51.31,0:07:58.07,Default,,0000,0000,0000,,They are very abstract and I'm pretty sure\Nthey really always need a human judgment
Dialogue: 0,0:07:58.07,0:08:05.89,Default,,0000,0000,0000,,to interpret these principles and\Napply them to dynamic situations. Feel
Dialogue: 0,0:08:05.89,0:08:14.87,Default,,0000,0000,0000,,free to correct me if I'm wrong later. So\Nthat's one thing. So if you remove the
Dialogue: 0,0:08:14.87,0:08:19.44,Default,,0000,0000,0000,,human from the targeting cycle, this human\Njudgment might be missing, and therefore
Dialogue: 0,0:08:19.44,0:08:24.70,Default,,0000,0000,0000,,military decision makers have to evaluate\Nvery carefully the quality of human
Dialogue: 0,0:08:24.70,0:08:31.70,Default,,0000,0000,0000,,control and human judgement within the\Ntargeting cycle. So that's law. The second
Dialogue: 0,0:08:31.70,0:08:38.72,Default,,0000,0000,0000,,dimension of challenges is security\Nissues. When you look at these new systems,
Dialogue: 0,0:08:38.72,0:08:43.64,Default,,0000,0000,0000,,they are cool and shiny, and like most new\Ntypes of weapons they have the
Dialogue: 0,0:08:43.64,0:08:49.05,Default,,0000,0000,0000,,potential to stir an arms race\Nbetween states. So they actually might
Dialogue: 0,0:08:49.05,0:08:53.56,Default,,0000,0000,0000,,make conflicts more likely just because\Nthey are there and states want to have
Dialogue: 0,0:08:53.56,0:09:00.52,Default,,0000,0000,0000,,them and feel threatened by them. The second\Naspect is proliferation. Autonomy is based
Dialogue: 0,0:09:00.52,0:09:04.74,Default,,0000,0000,0000,,on software, and software can be easily\Ntransferred and is really hard to control,
Dialogue: 0,0:09:04.74,0:09:08.58,Default,,0000,0000,0000,,and all the other components, or most of\Nthe other components you will need, are
Dialogue: 0,0:09:08.58,0:09:12.28,Default,,0000,0000,0000,,available on the civilian market, so you\Ncan build this stuff on your own if you're
Dialogue: 0,0:09:12.28,0:09:19.64,Default,,0000,0000,0000,,smart enough. So we might have more\Nconflicts from these types of weapons, and
Dialogue: 0,0:09:19.64,0:09:24.50,Default,,0000,0000,0000,,it might get, well, more difficult to\Ncontrol the application of this
Dialogue: 0,0:09:24.50,0:09:29.56,Default,,0000,0000,0000,,technology. And the third one, which is\Nespecially worrying for me, is the
Dialogue: 0,0:09:29.56,0:09:33.97,Default,,0000,0000,0000,,potential for escalation within the\Nconflict: especially when
Dialogue: 0,0:09:33.97,0:09:40.28,Default,,0000,0000,0000,,both or more sides use these autonomous\Nweapons, you have these very complex
Dialogue: 0,0:09:40.28,0:09:45.55,Default,,0000,0000,0000,,adversarial systems and it will become very\Nhard to predict how they are going to
Dialogue: 0,0:09:45.55,0:09:52.05,Default,,0000,0000,0000,,interact. They will increase the speed\Nof the conflict, and the human might
Dialogue: 0,0:09:52.05,0:09:57.04,Default,,0000,0000,0000,,not even have a chance to process\Nwhat's going on there.
Dialogue: 0,0:09:57.04,0:10:01.95,Default,,0000,0000,0000,,So that's really worrying, and we can see,\Nfor example in high-frequency trading on
Dialogue: 0,0:10:01.95,0:10:05.98,Default,,0000,0000,0000,,the stock markets, which problems arise\Nthere and how difficult it is for humans
Dialogue: 0,0:10:05.98,0:10:12.53,Default,,0000,0000,0000,,to understand what's going on there. So\Nthose are some of these security
Dialogue: 0,0:10:12.53,0:10:23.13,Default,,0000,0000,0000,,issues. And the last and maybe\Nmost important one is ethics. As I
Dialogue: 0,0:10:23.13,0:10:29.09,Default,,0000,0000,0000,,mentioned before, when you use autonomy\Nin weapons or machines, you have
Dialogue: 0,0:10:29.09,0:10:32.76,Default,,0000,0000,0000,,artificial intelligence, so you don't have\Nreal intention, a real entity that's
Dialogue: 0,0:10:32.76,0:10:38.22,Default,,0000,0000,0000,,behind this. So the killing decision might\Nat some point be based on statistical
Dialogue: 0,0:10:38.22,0:10:43.21,Default,,0000,0000,0000,,methods and no one will be involved there,\Nand that's, well, worrying for a lot of
Dialogue: 0,0:10:43.21,0:10:48.40,Default,,0000,0000,0000,,reasons, but also it could constitute a\Nviolation of human dignity. You can argue
Dialogue: 0,0:10:48.40,0:10:54.08,Default,,0000,0000,0000,,that humans have, well, you can kill\Nhumans in war, but they at least have
Dialogue: 0,0:10:54.08,0:10:59.14,Default,,0000,0000,0000,,the right to be killed by another human or\Nat least by the decision of another human,
Dialogue: 0,0:10:59.14,0:11:02.63,Default,,0000,0000,0000,,but we can discuss this later.\NSo at least in this regard it would be
Dialogue: 0,0:11:02.63,0:11:07.61,Default,,0000,0000,0000,,highly unethical, and that really just\Nscratches the surface of problems and
Dialogue: 0,0:11:07.61,0:11:12.79,Default,,0000,0000,0000,,challenges that would arise from the use\Nof these autonomous weapons. I haven't
Dialogue: 0,0:11:12.79,0:11:16.68,Default,,0000,0000,0000,,even touched on the problems with training\Ndata, with accountability, with
Dialogue: 0,0:11:16.68,0:11:23.30,Default,,0000,0000,0000,,verification and all that funny stuff,\Nbecause I only have 20 minutes. So, sounds
Dialogue: 0,0:11:23.30,0:11:32.52,Default,,0000,0000,0000,,pretty bad, doesn't it? So how can this\Nissue be addressed? Luckily, states have,
Dialogue: 0,0:11:32.52,0:11:36.74,Default,,0000,0000,0000,,thanks to a huge campaign by NGOs, noticed\Nthat there might be some problems and
Dialogue: 0,0:11:36.74,0:11:40.44,Default,,0000,0000,0000,,there might be a necessity to address\Nthis issue. They're currently doing
Dialogue: 0,0:11:40.44,0:11:45.38,Default,,0000,0000,0000,,this in the UN Convention on Certain\NConventional Weapons, CCW, where they
Dialogue: 0,0:11:45.38,0:11:53.29,Default,,0000,0000,0000,,discuss a potential ban of the development\Nand use of these lethal weapons, or weapons
Dialogue: 0,0:11:53.29,0:11:57.59,Default,,0000,0000,0000,,that lack meaningful human control over\Nthe use of force. There are several ideas
Dialogue: 0,0:11:57.59,0:12:04.42,Default,,0000,0000,0000,,around there. And such a ban would be\Nreally the maximum goal of the NGOs there,
Dialogue: 0,0:12:04.42,0:12:09.17,Default,,0000,0000,0000,,but it becomes increasingly unlikely that\Nthis happens. Most states do not agree
Dialogue: 0,0:12:09.17,0:12:12.99,Default,,0000,0000,0000,,with a complete ban, they want to regulate\Nit a bit here, a bit there, and they
Dialogue: 0,0:12:12.99,0:12:17.88,Default,,0000,0000,0000,,really can't find a common\Ndefinition, as I mentioned before, because
Dialogue: 0,0:12:17.88,0:12:22.61,Default,,0000,0000,0000,,if you have a broad definition, just as\NI used it, you will notice that you have
Dialogue: 0,0:12:22.61,0:12:26.01,Default,,0000,0000,0000,,existing systems in there that might be\Nnot that problematic or that you just
Dialogue: 0,0:12:26.01,0:12:31.75,Default,,0000,0000,0000,,don't want to ban, and you might stop\Ncivilian or commercial developments, which
Dialogue: 0,0:12:31.75,0:12:38.69,Default,,0000,0000,0000,,you also don't want to do. So states are\Nstuck in this regard and they also really
Dialogue: 0,0:12:38.69,0:12:42.18,Default,,0000,0000,0000,,challenge the notion that we need\Npreventive arms control here, that we
Dialogue: 0,0:12:42.18,0:12:50.63,Default,,0000,0000,0000,,need to act before these systems are\Napplied on the battlefield. So at the
Dialogue: 0,0:12:50.63,0:12:55.98,Default,,0000,0000,0000,,moment, this is the fourth year or\Nsomething of these negotiations and we
Dialogue: 0,0:12:55.98,0:13:00.09,Default,,0000,0000,0000,,will see how it goes this year, and if\Nstates can't find common ground there it
Dialogue: 0,0:13:00.09,0:13:04.95,Default,,0000,0000,0000,,becomes increasingly\Nlikely that it will change to another
Dialogue: 0,0:13:04.95,0:13:11.06,Default,,0000,0000,0000,,forum, just like with anti-personnel mines,\Nfor example, where the treaty was
Dialogue: 0,0:13:11.06,0:13:17.64,Default,,0000,0000,0000,,concluded outside of the United Nations. But\Nyeah, the window of opportunity is really
Dialogue: 0,0:13:17.64,0:13:24.95,Default,,0000,0000,0000,,closing, and states and NGOs have to act\Nand keep on track there. Just
Dialogue: 0,0:13:24.95,0:13:32.34,Default,,0000,0000,0000,,as a side note, probably quite a few\Npeople here are members of NGOs, so if you look
Dialogue: 0,0:13:32.34,0:13:39.06,Default,,0000,0000,0000,,at the Campaign to Stop Killer Robots, the\Nbig campaign behind this process,
Dialogue: 0,0:13:39.06,0:13:43.31,Default,,0000,0000,0000,,there's only one German NGO in it, which is\NFacing Finance. So especially if
Dialogue: 0,0:13:43.31,0:13:48.74,Default,,0000,0000,0000,,you're a German NGO and are interested in\NAI, it might be worthwhile to look into the
Dialogue: 0,0:13:48.74,0:13:54.51,Default,,0000,0000,0000,,military dimension as well. We really need\Nsome expertise in that regard, especially
Dialogue: 0,0:13:54.51,0:14:00.44,Default,,0000,0000,0000,,on AI and these technologies.\NOkay, so just in case you fell asleep in
Dialogue: 0,0:14:00.44,0:14:05.59,Default,,0000,0000,0000,,the last 15 minutes, I want you to take\Naway three key messages: please be aware
Dialogue: 0,0:14:05.59,0:14:11.10,Default,,0000,0000,0000,,of the trends and internal logic that lead\Nto autonomy in weapons. Do not
Dialogue: 0,0:14:11.10,0:14:15.63,Default,,0000,0000,0000,,overestimate the abilities of autonomy, of\Nautonomous machines, like intent and these
Dialogue: 0,0:14:15.63,0:14:20.19,Default,,0000,0000,0000,,things, and because you probably all knew\Nthis already, please tell other people about
Dialogue: 0,0:14:20.19,0:14:23.90,Default,,0000,0000,0000,,this,\Neducate them about this type of
Dialogue: 0,0:14:23.90,0:14:30.67,Default,,0000,0000,0000,,technology. And third, don't underestimate\Nthe potential dangers for security and
Dialogue: 0,0:14:30.67,0:14:37.25,Default,,0000,0000,0000,,human dignity that come from this type of\Nweapon. I hope that I could interest you a
Dialogue: 0,0:14:37.25,0:14:40.84,Default,,0000,0000,0000,,bit more in this particular issue.\NIf you want to learn more, you can find
Dialogue: 0,0:14:40.84,0:14:46.82,Default,,0000,0000,0000,,really interesting sources on the websites\Nof the CCW and the Campaign to Stop Killer
Dialogue: 0,0:14:46.82,0:14:53.41,Default,,0000,0000,0000,,Robots, and from a research project that I\Nhappen to work in, the International Panel
Dialogue: 0,0:14:53.41,0:14:56.76,Default,,0000,0000,0000,,on the Regulation of Autonomous Weapons,\Nwe do have a few studies in that regard
Dialogue: 0,0:14:56.76,0:15:02.06,Default,,0000,0000,0000,,and we're going to publish a few more. So\Nplease check this out, and thank you for
Dialogue: 0,0:15:02.06,0:15:03.48,Default,,0000,0000,0000,,your attention.
Dialogue: 0,0:15:03.48,0:15:13.71,Default,,0000,0000,0000,,{\i1}Applause{\i0}
Dialogue: 0,0:15:13.71,0:15:16.20,Default,,0000,0000,0000,,Questions?
Dialogue: 0,0:15:20.52,0:15:23.78,Default,,0000,0000,0000,,Herald: Sorry. So we have some time for\Nquestions and answers now. Okay, first of all
Dialogue: 0,0:15:23.78,0:15:28.11,Default,,0000,0000,0000,,I have to apologize that we had a hiccup\Nwith the sign language interpretation; the acoustics
Dialogue: 0,0:15:28.11,0:15:32.21,Default,,0000,0000,0000,,over here on the stage were so bad that she\Ncouldn't do her job, so I'm
Dialogue: 0,0:15:32.21,0:15:38.72,Default,,0000,0000,0000,,terribly sorry about that. We fixed it during\Nthe talk, and my apologies for that. We are
Dialogue: 0,0:15:38.72,0:15:42.16,Default,,0000,0000,0000,,queuing here at the microphones already, so\Nwe start with microphone number one, your
Dialogue: 0,0:15:42.16,0:15:44.61,Default,,0000,0000,0000,,question please.\NMic 1: Thanks for your talk, Anja. Don't
Dialogue: 0,0:15:44.61,0:15:49.05,Default,,0000,0000,0000,,you think there is a possibility to reduce\Nwar crimes as well by taking away the
Dialogue: 0,0:15:49.05,0:15:54.06,Default,,0000,0000,0000,,decision from humans and by having\Nalgorithms decide, which are actually
Dialogue: 0,0:15:54.06,0:15:56.93,Default,,0000,0000,0000,,auditable?\NDahlmann: Yeah, actually, that's
Dialogue: 0,0:15:56.93,0:15:59.74,Default,,0000,0000,0000,,something that's discussed in the\Ninternational debate as well, that
Dialogue: 0,0:15:59.74,0:16:05.40,Default,,0000,0000,0000,,machines might be more ethical\Nthan humans could be. And well, of course
Dialogue: 0,0:16:05.40,0:16:11.51,Default,,0000,0000,0000,,they won't just start raping women because\Nthey want to, but you can program them to
Dialogue: 0,0:16:11.51,0:16:18.27,Default,,0000,0000,0000,,do this. So you just shift the\Nproblems, really. And also, maybe these
Dialogue: 0,0:16:18.27,0:16:22.56,Default,,0000,0000,0000,,machines don't get angry, but they don't\Nshow compassion either, so if you are there
Dialogue: 0,0:16:22.56,0:16:26.07,Default,,0000,0000,0000,,and you're a potential target, they just won't\Nstop, they will just kill you and not
Dialogue: 0,0:16:26.07,0:16:32.84,Default,,0000,0000,0000,,think once about this. So you have\Nto really look at both sides there, I guess.
Dialogue: 0,0:16:32.84,0:16:37.84,Default,,0000,0000,0000,,Herald: Thanks. So we switch over\Nto microphone 3, please.
Dialogue: 0,0:16:37.84,0:16:44.70,Default,,0000,0000,0000,,Mic 3: Thanks for the talk. Regarding\Nautonomous cars, self-driving cars,
Dialogue: 0,0:16:44.70,0:16:49.24,Default,,0000,0000,0000,,there's a similar discussion going on\Nregarding the ethics. How should a car
Dialogue: 0,0:16:49.24,0:16:54.14,Default,,0000,0000,0000,,react in the case of an accident? Should it\Nprotect people outside or people inside,
Dialogue: 0,0:16:54.14,0:17:02.07,Default,,0000,0000,0000,,what are the laws? So there is another\Ndiscussion there. Do you work with people
Dialogue: 0,0:17:02.07,0:17:07.27,Default,,0000,0000,0000,,in this area, or is there any\Ncollaboration?
Dialogue: 0,0:17:07.27,0:17:10.17,Default,,0000,0000,0000,,Dahlmann: Maybe there's less collaboration\Nthan one might think there is. I think
Dialogue: 0,0:17:10.17,0:17:17.30,Default,,0000,0000,0000,,there is. Of course, we monitor this\Ndebate as well, and yeah, we think about the
Dialogue: 0,0:17:17.30,0:17:20.69,Default,,0000,0000,0000,,possible applications of the outcomes, for\Nexample from this German ethics
Dialogue: 0,0:17:20.69,0:17:26.85,Default,,0000,0000,0000,,commission on self-driving cars, for our\Nwork. But I'm a bit torn there, because
Dialogue: 0,0:17:26.85,0:17:30.91,Default,,0000,0000,0000,,when you talk about weapons, they are\Ndesigned to kill people, and cars mostly
Dialogue: 0,0:17:30.91,0:17:36.00,Default,,0000,0000,0000,,are not. So with this ethics committee,\Nyou want to avoid killing people or decide
Dialogue: 0,0:17:36.00,0:17:42.10,Default,,0000,0000,0000,,what happens when this accident occurs. So\Nthey are a bit different, but of course,
Dialogue: 0,0:17:42.10,0:17:48.79,Default,,0000,0000,0000,,yeah, you can learn a lot from both\Ndiscussions, and we are aware of that.
Dialogue: 0,0:17:48.79,0:17:53.53,Default,,0000,0000,0000,,Herald: Thanks. Then we're gonna go over\Nto the back, microphone number 2, please.
Dialogue: 0,0:17:53.53,0:17:59.50,Default,,0000,0000,0000,,Mic 2: Also from me, thanks again for this\Ntalk and for infusing all this professionalism
Dialogue: 0,0:17:59.50,0:18:10.28,Default,,0000,0000,0000,,into the debate, because some of the\Nsurroundings of our, so to say, our
Dialogue: 0,0:18:10.28,0:18:17.86,Default,,0000,0000,0000,,scene like to protest against very\Nspecific things, like for example
Dialogue: 0,0:18:17.86,0:18:23.92,Default,,0000,0000,0000,,Ramstein Air Base, and in my view that's a\Nbit misguided if you just go out and
Dialogue: 0,0:18:23.92,0:18:30.56,Default,,0000,0000,0000,,protest in a populist way without\Ninvolving these points of expertise that
Dialogue: 0,0:18:30.56,0:18:38.21,Default,,0000,0000,0000,,you offer. And so, thanks again for that.\NAnd then my question: how would you
Dialogue: 0,0:18:38.21,0:18:46.94,Default,,0000,0000,0000,,propose that protests progress and develop\Nto a higher level, to be on the
Dialogue: 0,0:18:46.94,0:18:55.17,Default,,0000,0000,0000,,one hand more effective and on the other\Nhand more considerate of what is at stake
Dialogue: 0,0:18:55.17,0:19:00.20,Default,,0000,0000,0000,,on all the levels and on\Nall sides involved?
Dialogue: 0,0:19:00.20,0:19:05.69,Default,,0000,0000,0000,,Dahlmann: Yeah well, first, the Ramstein\Nissue is actually a completely
Dialogue: 0,0:19:05.69,0:19:10.34,Default,,0000,0000,0000,,different topic. It's drone warfare,\Nremotely piloted drones, so there are a
Dialogue: 0,0:19:10.34,0:19:14.29,Default,,0000,0000,0000,,lot of problems with this and\Nwith the targeted killings, but it's not about
Dialogue: 0,0:19:14.29,0:19:21.82,Default,,0000,0000,0000,,lethal autonomous weapons in particular.\NWell, if you want to be a part of this
Dialogue: 0,0:19:21.82,0:19:25.33,Default,,0000,0000,0000,,international debate, there's of course\Nthis Campaign to Stop Killer Robots, and
Dialogue: 0,0:19:25.33,0:19:30.43,Default,,0000,0000,0000,,they have a lot of really good people and\Na lot of resources, sources, literature
Dialogue: 0,0:19:30.43,0:19:35.19,Default,,0000,0000,0000,,and things like that to really educate\Nyourself about what's going on there, so that
Dialogue: 0,0:19:35.19,0:19:39.49,Default,,0000,0000,0000,,would be a starting point. And then, yeah,\Njust keep talking to scientists about
Dialogue: 0,0:19:39.49,0:19:45.28,Default,,0000,0000,0000,,this and find out where we see the\Nproblems, and I mean, it's always helpful
Dialogue: 0,0:19:45.28,0:19:52.97,Default,,0000,0000,0000,,for scientists to talk to people in the\Nfield, so to say. So yeah, keep talking.
Dialogue: 0,0:19:52.97,0:19:55.52,Default,,0000,0000,0000,,Herald: Thanks for that. And the\Nsignal angel signaled that we have
Dialogue: 0,0:19:55.52,0:19:59.20,Default,,0000,0000,0000,,something from the internet.\NSignal Angel: Thank you. Question from
Dialogue: 0,0:19:59.20,0:20:04.21,Default,,0000,0000,0000,,IRC: Aren't we already in a killer robot\Nworld? A botnet can attack a nuclear
Dialogue: 0,0:20:04.21,0:20:08.27,Default,,0000,0000,0000,,power plant, for example. What do you think?\NDahlmann: I really didn't understand a
Dialogue: 0,0:20:08.27,0:20:09.71,Default,,0000,0000,0000,,word, I'm sorry.\NHerald: I didn't understand that either,
Dialogue: 0,0:20:09.71,0:20:12.59,Default,,0000,0000,0000,,so can you speak closer to\Nthe microphone, please?
Dialogue: 0,0:20:12.59,0:20:16.48,Default,,0000,0000,0000,,Signal Angel: Yes. Aren't we already in a\Nkiller robot world?
Dialogue: 0,0:20:16.48,0:20:19.51,Default,,0000,0000,0000,,Herald: Sorry, that doesn't work. Sorry,\Nwe stop that here, we can't hear it
Dialogue: 0,0:20:19.51,0:20:21.55,Default,,0000,0000,0000,,over here. Sorry.\NSignal Angel: Okay.
Dialogue: 0,0:20:21.55,0:20:25.75,Default,,0000,0000,0000,,Herald: We're gonna switch over to\Nmicrophone two now, please.
Dialogue: 0,0:20:25.75,0:20:32.53,Default,,0000,0000,0000,,Mic 2: I have one little question. So in\Nyour talk, you were focusing on the
Dialogue: 0,0:20:32.53,0:20:38.97,Default,,0000,0000,0000,,ethical questions related to lethal\Nweapons. Are you aware of ongoing
Dialogue: 0,0:20:38.97,0:20:45.38,Default,,0000,0000,0000,,discussions regarding the ethical aspects\Nof the design and implementation of less-
Dialogue: 0,0:20:45.38,0:20:52.77,Default,,0000,0000,0000,,than-lethal autonomous weapons for crowd\Ncontrol and similar purposes?
Dialogue: 0,0:20:52.77,0:20:57.04,Default,,0000,0000,0000,,Dahlmann: Yeah, actually, within the CCW\Nevery term of "Lethal Autonomous
Dialogue: 0,0:20:57.04,0:21:02.66,Default,,0000,0000,0000,,Weapon Systems" is disputed, also the\N"lethal" aspect, and for the regulation
Dialogue: 0,0:21:02.66,0:21:07.67,Default,,0000,0000,0000,,it might be easier to focus on this for\Nnow, because less-than-lethal weapons come
Dialogue: 0,0:21:07.67,0:21:14.49,Default,,0000,0000,0000,,with their own problems and the question\Nwhether they are ethical and whether
Dialogue: 0,0:21:14.49,0:21:18.82,Default,,0000,0000,0000,,IHL applies to them. But I'm not really\Ndeep into this discussion, so I'll just
Dialogue: 0,0:21:18.82,0:21:22.60,Default,,0000,0000,0000,,have to leave it there.\NHerald: Thanks, and back here to microphone
Dialogue: 0,0:21:22.60,0:21:25.23,Default,,0000,0000,0000,,one, please.\NMic 1: Hi. Thank you very much for the
Dialogue: 0,0:21:25.23,0:21:31.71,Default,,0000,0000,0000,,talk. My question is in the context of the\Ndecreasing cost of both the hardware and
Dialogue: 0,0:21:31.71,0:21:37.19,Default,,0000,0000,0000,,the software over the next, say, 20, 40 years.\NOutside of a nation-state context, like
Dialogue: 0,0:21:37.19,0:21:42.42,Default,,0000,0000,0000,,private forces or non-state actors\Ngaining use of these weapons, do things
Dialogue: 0,0:21:42.42,0:21:45.86,Default,,0000,0000,0000,,like the UN convention or the Campaign to\NStop Killer Robots apply? Are they
Dialogue: 0,0:21:45.86,0:21:52.41,Default,,0000,0000,0000,,considering private individuals trying to\Nleverage these against others?
Dialogue: 0,0:21:52.41,0:21:59.86,Default,,0000,0000,0000,,Dahlmann: Not sure what the campaign says\Nabout this, I'm not a member there.
Dialogue: 0,0:21:59.86,0:22:06.60,Default,,0000,0000,0000,,The CCW mostly focuses on international\Nhumanitarian law, which is important, but I
Dialogue: 0,0:22:06.60,0:22:10.82,Default,,0000,0000,0000,,think it's not broad enough. So\Nquestions like proliferation and all this
Dialogue: 0,0:22:10.82,0:22:16.48,Default,,0000,0000,0000,,are connected to your question and\Nprobably won't be part of a
Dialogue: 0,0:22:16.48,0:22:21.09,Default,,0000,0000,0000,,regulation there. It's discussed only on\Nthe edges of the debates and
Dialogue: 0,0:22:21.09,0:22:26.80,Default,,0000,0000,0000,,negotiations there, but it doesn't seem to\Nbe a real issue there.
Dialogue: 0,0:22:26.80,0:22:29.86,Default,,0000,0000,0000,,Mic 1: Thanks.\NHerald: And over to microphone six,
Dialogue: 0,0:22:29.86,0:22:32.51,Default,,0000,0000,0000,,please.\NMic 6: Thank you. I have a question as a
Dialogue: 0,0:22:32.51,0:22:38.56,Default,,0000,0000,0000,,researcher: do you know how far the\Ndevelopment has already gone? So how
Dialogue: 0,0:22:38.56,0:22:44.77,Default,,0000,0000,0000,,transparent or intransparent is your look\Ninto what is being developed and
Dialogue: 0,0:22:44.77,0:22:50.67,Default,,0000,0000,0000,,researched on the side of the military,\Nthe people working with
Dialogue: 0,0:22:50.67,0:22:54.79,Default,,0000,0000,0000,,autonomous weapons and developing them?\NDahlmann: Well, for me it's quite
Dialogue: 0,0:22:54.79,0:22:59.76,Default,,0000,0000,0000,,intransparent, because I only have\Naccess to publicly available
Dialogue: 0,0:22:59.76,0:23:04.63,Default,,0000,0000,0000,,sources, so I don't really know what's\Ngoing on behind closed doors in the
Dialogue: 0,0:23:04.63,0:23:10.50,Default,,0000,0000,0000,,military or in the industry there. Of\Ncourse you can monitor the
Dialogue: 0,0:23:10.50,0:23:14.82,Default,,0000,0000,0000,,civilian applications or developments,\Nwhich can tell a lot about the state
Dialogue: 0,0:23:14.82,0:23:24.13,Default,,0000,0000,0000,,of the art, and for example DARPA,\Nthe American defense research agency,
Dialogue: 0,0:23:24.13,0:23:30.07,Default,,0000,0000,0000,,sometimes publishes a call for papers,\Nthat's not the term, but there you can see
Dialogue: 0,0:23:30.07,0:23:34.44,Default,,0000,0000,0000,,in which areas they are interested,\Nand for example they really like this
Dialogue: 0,0:23:34.44,0:23:40.99,Default,,0000,0000,0000,,idea of autonomous killer bugs that can\Nact in swarms and monitor or even kill
Dialogue: 0,0:23:40.99,0:23:45.60,Default,,0000,0000,0000,,people and things like that. So yeah, we\Ntry to piece it together in
Dialogue: 0,0:23:45.60,0:23:48.70,Default,,0000,0000,0000,,our work.\NHerald: We do have a little bit more time,
Dialogue: 0,0:23:48.70,0:23:50.57,Default,,0000,0000,0000,,are you okay to answer more questions?\NDahlmann: Sure.
Dialogue: 0,0:23:50.57,0:23:52.96,Default,,0000,0000,0000,,Herald: Then we're gonna switch over to\Nmicrophone three, please.
Dialogue: 0,0:23:52.96,0:24:00.46,Default,,0000,0000,0000,,Mic 3: Yes, hello. I think we are already living\Nin a world of Lethal Autonomous
Dialogue: 0,0:24:00.46,0:24:04.87,Default,,0000,0000,0000,,Weapon Systems if you think about these\Nmillions of landmines which are in operation.
Dialogue: 0,0:24:04.87,0:24:09.25,Default,,0000,0000,0000,,And so the question is: shouldn't it be\Npossible to ban these weapon systems the
Dialogue: 0,0:24:09.25,0:24:14.32,Default,,0000,0000,0000,,same way as landmines, which are already\Nbanned by several countries, so just
Dialogue: 0,0:24:14.32,0:24:19.24,Default,,0000,0000,0000,,include them in that definition? Because\Nthe arguments should be very
Dialogue: 0,0:24:19.24,0:24:23.05,Default,,0000,0000,0000,,similar.\NDahlmann: Yeah, it does come to
Dialogue: 0,0:24:23.05,0:24:26.54,Default,,0000,0000,0000,,mind, of course, because these mines are\Njust lying around there and no one's
Dialogue: 0,0:24:26.54,0:24:32.90,Default,,0000,0000,0000,,interacting when you step on them, and\Nboom! But, well, it
Dialogue: 0,0:24:32.90,0:24:38.89,Default,,0000,0000,0000,,depends first of all a bit on your\Ndefinition of autonomy. So some say
Dialogue: 0,0:24:38.89,0:24:42.54,Default,,0000,0000,0000,,autonomous is when you act in dynamic\Nsituations, and the other ones would be
Dialogue: 0,0:24:42.54,0:24:47.89,Default,,0000,0000,0000,,automated, and things like that. And I think\Nthis autonomy aspect - I really don't want
Dialogue: 0,0:24:47.89,0:24:55.90,Default,,0000,0000,0000,,to define\Nautonomy here, really - but this acting
Dialogue: 0,0:24:55.90,0:25:01.48,Default,,0000,0000,0000,,in more dynamic spaces and the aspect of\Nmachine learning and all these things,
Dialogue: 0,0:25:01.48,0:25:06.45,Default,,0000,0000,0000,,they are way more complex and they bring\Ndifferent problems than just landmines.
Dialogue: 0,0:25:06.45,0:25:10.76,Default,,0000,0000,0000,,Landmines are problematic, anti-personnel\Nmines are banned for good reasons, but they
Dialogue: 0,0:25:10.76,0:25:15.41,Default,,0000,0000,0000,,don't have the same problems, I think. So\NI don't think it would be
Dialogue: 0,0:25:15.41,0:25:21.54,Default,,0000,0000,0000,,sufficient to just put LAWS in there,\Nthe Lethal Autonomous Weapons.
Dialogue: 0,0:25:21.54,0:25:26.40,Default,,0000,0000,0000,,Herald: Thank you very much. I can't see\Nanyone else queuing up, so therefore, Anja,
Dialogue: 0,0:25:26.40,0:25:28.99,Default,,0000,0000,0000,,thank you very much, the applause is yours!
Dialogue: 0,0:25:28.99,0:25:31.81,Default,,0000,0000,0000,,{\i1}applause{\i0}
Dialogue: 0,0:25:31.81,0:25:34.59,Default,,0000,0000,0000,,Herald: And once again my apologies that\Nthat didn't work.
Dialogue: 0,0:25:34.59,0:25:39.80,Default,,0000,0000,0000,,{\i1}34c3 outro{\i0}
Dialogue: 0,0:25:39.80,0:25:56.58,Default,,0000,0000,0000,,subtitles created by c3subtitles.de\Nin the year 2018. Join, and help us!