0:00:00.000,0:00:15.010
34c3 intro
0:00:15.010,0:00:18.339
Herald: ... Anja Dahlmann, a[br]political scientist and researcher at
0:00:18.339,0:00:23.310
Stiftung Wissenschaft und Politik, a[br]Berlin-based think-tank. Here we go.
0:00:23.310,0:00:30.160
applause
0:00:30.160,0:00:40.030
Anja Dahlmann: Yeah, thanks for being[br]here. I probably neither cut myself nor
0:00:40.030,0:00:44.270
proposed but I hope it's still[br]interesting. I'm going to talk about
0:00:44.270,0:00:49.240
preventive arms control and international[br]humanitarian law and their role in this
0:00:49.240,0:00:53.270
international debate around autonomous[br]weapons. This type of weapon is also
0:00:53.270,0:00:59.010
referred to as Lethal Autonomous Weapons[br]Systems, LAWS for short, or killer robots.
0:00:59.010,0:01:04.239
So if I say LAWS, I mostly mean these[br]weapons and not like legal laws, just to
0:01:04.239,0:01:11.650
confuse you a bit. Okay. I will discuss[br]this topic along three questions. First of
0:01:11.650,0:01:18.300
all, what are we actually talking about[br]here, what are autonomous weapons? Second,
0:01:18.300,0:01:22.210
why should we even care about this? Why's[br]it important? And third, how could this
0:01:22.210,0:01:31.480
issue be addressed at the international level?[br]So. I'll go through my slides, anyway,
0:01:31.480,0:01:37.770
what are we talking about here? Well,[br]during the international negotiations, so
0:01:37.770,0:01:45.360
far no real, no common definition has been[br]found. So states parties try to find
0:01:45.360,0:01:50.090
something or not and for my presentation I[br]will just use a very broad definition of
0:01:50.090,0:01:57.229
autonomous weapons, which is: weapons that[br]can, once activated, execute a broad range
0:01:57.229,0:02:02.350
of tasks or select and engage targets[br]without further human intervention. And
0:02:02.350,0:02:08.340
it's just a very broad spectrum of weapons[br]that might fall under this definition.
0:02:08.340,0:02:13.150
Actually, some existing ones are there as[br]well which you can't see here. That would
0:02:13.150,0:02:19.940
be the Phalanx system for example. It's[br]been around since the 1970s. Sorry...
0:02:19.940,0:02:22.950
Herald: We can't hear anything[br]on stage. Just go on.
0:02:22.950,0:02:27.310
Dahlmann: Sorry. So, the Phalanx system has[br]been around since the 1970s, a US system,
0:02:27.310,0:02:33.069
an air defense system based on ships, and[br]it's meant to - just yeah, defend the ship
0:02:33.069,0:02:39.330
against incoming objects from the air. So[br]that has been around for quite a
0:02:39.330,0:02:45.170
long time, and it might even be part of[br]this LAWS definition or not, but just to
0:02:45.170,0:02:49.410
give you an impression of how broad this[br]range is: today, we've got for example
0:02:49.410,0:02:57.990
demonstrators like the Taranis drone, a UK[br]system, or the X-47B, which can, for
0:02:57.990,0:03:04.959
example, autonomously land[br]applause
0:03:04.959,0:03:08.700
land on aircraft carriers and can be air-[br]refueled and stuff like that which is
0:03:08.700,0:03:13.550
apparently quite impressive if you don't[br]need a human to do that and in the future
0:03:13.550,0:03:18.810
there might be, or there probably[br]will be, even more autonomous functions,
0:03:18.810,0:03:25.650
so navigation, landing, refueling, all[br]that stuff. That's, you know, old but at
0:03:25.650,0:03:30.090
some point weapons might[br]be able to choose their own ammunition
0:03:30.090,0:03:34.560
according to the situation. They might be[br]able to choose their target and decide
0:03:34.560,0:03:41.720
when to engage the target without any[br]human intervention. And
0:03:41.720,0:03:45.001
that's quite problematic; I will tell you[br]why in a minute. Overall, you can
0:03:45.001,0:03:52.230
see that there's a gradual decline of[br]human control over weapons systems or over
0:03:52.230,0:03:58.050
weapons and the use of force. So that's a[br]very short and broad impression of what
0:03:58.050,0:04:01.360
we're talking about here. And talking[br]about definitions, it's always interesting
0:04:01.360,0:04:05.730
what you're not talking about and that's[br]why I want to address some misconceptions
0:04:05.730,0:04:13.481
in the public debate. First of all, when[br]we talk about machine autonomy, also
0:04:13.481,0:04:19.858
artificial intelligence,[br]which is the technology behind this,
0:04:19.858,0:04:25.420
people - not you probably - in the media[br]and the broader public often get the idea
0:04:25.420,0:04:30.540
that these machines might have some kind[br]of real intelligence or intention, or be an
0:04:30.540,0:04:36.260
entity in their own right, and they're just not.[br]It's just statistical methods, it's just
0:04:36.260,0:04:40.740
math, and you know way more about this than[br]I do, so I will leave it at this and just
0:04:40.740,0:04:45.600
highlight that these[br]machines, these weapons, have certain
0:04:45.600,0:04:50.070
competences for specific tasks. They are[br]not entities in their own right, they are
0:04:50.070,0:04:55.050
not intentional. And that's important when[br]we talk about ethical and legal challenges
0:04:55.050,0:05:06.580
afterwards. Sorry. There it is. And[br]in connection with this, there's
0:05:06.580,0:05:11.380
another one, which is the plethora of[br]Terminator references in the media as soon
0:05:11.380,0:05:15.000
as you talk about autonomous weapons,[br]mostly referred to as killer robots in
0:05:15.000,0:05:20.251
this context. And just in case you intend to[br]write an article about this: don't use a
0:05:20.251,0:05:24.340
Terminator picture, please. Don't, because[br]it's really unhelpful to understand where
0:05:24.340,0:05:30.350
the problems are. With this kind of thing,[br]people assume that the problems only start
0:05:30.350,0:05:34.160
when we have machines with human-like[br]intelligence which look like the
0:05:34.160,0:05:39.750
Terminator or something like this. And the[br]problem is that they really start way before that,
0:05:39.750,0:05:46.820
when you use assisting systems, when you have[br]man-machine or human-machine teaming,
0:05:46.820,0:05:50.860
or when you accumulate a couple of[br]autonomous functions through the targeting
0:05:50.860,0:05:57.720
cycle - so, the military steps[br]that lead to the use of force or to
0:05:57.720,0:06:03.560
the killing of people. And the Terminator scenario is[br]really not our problem at the
0:06:03.560,0:06:08.350
moment. So please keep this in mind[br]because it's not just semantics
0:06:08.350,0:06:15.350
to differentiate between these two things.[br]It really manages the expectations of
0:06:15.350,0:06:20.780
political and military decision-makers.[br]Ok, so now you've got kind of an
0:06:20.780,0:06:23.620
impression of what I'm talking about here, so[br]why should we actually talk about this?
0:06:23.620,0:06:30.090
What's all the fuss about? Actually,[br]autonomous weapons have or would have
0:06:30.090,0:06:34.169
quite a few military advantages: They[br]might be, in some cases, faster or even
0:06:34.169,0:06:39.780
more precise than humans. And you don't[br]need a constant communication link. So you
0:06:39.780,0:06:43.740
don't have to worry about[br]unstable communication links, you don't
0:06:43.740,0:06:50.020
have to worry about latency or detection[br]or the vulnerability of this specific link.
0:06:50.020,0:06:57.050
So yay! And a lot of, let's say very[br]interesting, military options come from
0:06:57.050,0:07:02.520
that. People talk about stealthy[br]operations in shallow waters, for example.
0:07:02.520,0:07:07.340
Or, you know, remote missions in secluded[br]areas, things like that. And you can get
0:07:07.340,0:07:13.830
very creative with the tiniest robots and[br]swarms, for example. So, shiny new options.
0:07:13.830,0:07:19.630
But, and of course there's a "but", it[br]comes at a price because you have at least
0:07:19.630,0:07:26.590
three dimensions of challenges in this[br]regard. First of all, the legal ones. When
0:07:26.590,0:07:31.389
we talk about these weapons, they might[br]be, they will be applied in conflicts where
0:07:31.389,0:07:37.500
international humanitarian law, IHL,[br]applies. And IHL consists of quite a few
0:07:37.500,0:07:45.430
very abstract principles. For example: the[br]principle of distinction between
0:07:45.430,0:07:51.310
combatants and civilians, the principle of[br]proportionality, or military necessity.
0:07:51.310,0:07:58.069
They are very abstract and I'm pretty sure[br]they really always need human judgment
0:07:58.069,0:08:05.889
to interpret these principles and[br]apply them to dynamic situations. Feel
0:08:05.889,0:08:14.870
free to correct me if I'm wrong later. So[br]that's one thing. So if you remove the
0:08:14.870,0:08:19.440
human from the targeting cycle, this human[br]judgment might be missing and therefore
0:08:19.440,0:08:24.700
military decision makers have to evaluate[br]very carefully the quality of human
0:08:24.700,0:08:31.699
control and human judgement within the[br]targeting cycle. So that's law. Second
0:08:31.699,0:08:38.719
dimension of challenges is security[br]issues. When you look at these new systems
0:08:38.719,0:08:43.639
they are cool and shiny, and as with most new[br]types of weapons, they have the
0:08:43.639,0:08:49.050
potential to stir an arms race[br]between states. So they actually might
0:08:49.050,0:08:53.559
make conflicts more likely just because[br]they are there and states want to have
0:08:53.559,0:09:00.519
them and feel threatened by them. The second[br]aspect is proliferation. Autonomy is based
0:09:00.519,0:09:04.739
on software, and software can be easily[br]transferred and is really hard to control,
0:09:04.739,0:09:08.579
and all the other components, or most of[br]the other components you will need, are
0:09:08.579,0:09:12.279
available on the civilian market so you[br]can build this stuff on your own if you're
0:09:12.279,0:09:19.639
smart enough. So we might have more[br]conflicts from these types of weapons, and
0:09:19.639,0:09:24.500
it might get, well, more difficult to[br]control the application of this
0:09:24.500,0:09:29.560
technology. And the third one, which is[br]especially worrying for me, is the
0:09:29.560,0:09:33.971
potential for escalation within the[br]conflict. Especially when
0:09:33.971,0:09:40.279
both or more sides use these autonomous[br]weapons, you have these very complex
0:09:40.279,0:09:45.550
adversarial systems and it will become very[br]hard to predict how they are going to
0:09:45.550,0:09:52.050
interact. They will increase the speed of[br]the conflict and the human might
0:09:52.050,0:09:57.040
not even have a chance to process[br]what's going on there.
0:09:57.040,0:10:01.949
So that's really worrying and we can see[br]for example in high-frequency trading at
0:10:01.949,0:10:05.980
the stock markets what problems arise[br]there and how difficult it is for humans
0:10:05.980,0:10:12.529
to understand what's going on there. So[br]those are some of these security
0:10:12.529,0:10:23.129
issues there. And the last and maybe[br]most important one is ethics. As I
0:10:23.129,0:10:29.089
mentioned before, when you use autonomy[br]in weapons or machines you have
0:10:29.089,0:10:32.759
artificial intelligence so you don't have[br]real intention, a real entity that's
0:10:32.759,0:10:38.220
behind this. So the killing decision might[br]at some point be based on statistical
0:10:38.220,0:10:43.209
methods and no one will be involved there[br]and that's, well, worrying for a lot of
0:10:43.209,0:10:48.399
reasons but also it could constitute a[br]violation of human dignity. You can argue
0:10:48.399,0:10:54.079
that humans have, well, you can kill[br]humans in war but they at least have
0:10:54.079,0:10:59.140
the right to be killed by another human or[br]at least by the decision of another human,
0:10:59.140,0:11:02.629
but we can discuss this later.[br]So at least in this regard it would be
0:11:02.629,0:11:07.610
highly unethical and that really just[br]scratches the surface of problems and
0:11:07.610,0:11:12.790
challenges that would arise from the use[br]of these autonomous weapons. I haven't
0:11:12.790,0:11:16.680
even touched on the problems with training[br]data, with accountability, with
0:11:16.680,0:11:23.299
verification and all that funny stuff[br]because I only have 20 minutes. So, sounds
0:11:23.299,0:11:32.519
pretty bad, doesn't it? So how can this[br]issue be addressed? Luckily, states have,
0:11:32.519,0:11:36.739
thanks to a huge campaign of NGOs, noticed[br]that there might be some problems and
0:11:36.739,0:11:40.439
there might be a necessity to address[br]this issue. They're currently doing
0:11:40.439,0:11:45.379
this in the UN Convention on Certain[br]Conventional Weapons, CCW, where they
0:11:45.379,0:11:53.290
discuss a potential ban of the development[br]and use of these lethal weapons or weapons
0:11:53.290,0:11:57.589
that lack meaningful human control over[br]the use of force. There are several ideas
0:11:57.589,0:12:04.420
around there. And such a ban would be[br]really the maximum goal of the NGOs there
0:12:04.420,0:12:09.170
but it becomes increasingly unlikely that[br]this happens. Most states do not agree
0:12:09.170,0:12:12.991
with a complete ban, they want to regulate[br]it a bit here, a bit there, and they
0:12:12.991,0:12:17.879
really can't find a common[br]definition, as I mentioned before, because
0:12:17.879,0:12:22.609
if you have a broad definition just as[br]I used it, you will notice that you have
0:12:22.609,0:12:26.009
existing systems in there that might not be[br]that problematic or that you just
0:12:26.009,0:12:31.749
don't want to ban, and you might stop[br]civilian or commercial developments, which
0:12:31.749,0:12:38.690
you also don't want to do. So states are[br]stuck in this regard and they also really
0:12:38.690,0:12:42.179
challenge the notion that we need[br]preventive arms control here, that we
0:12:42.179,0:12:50.629
need to act before these systems are[br]applied on the battlefield. So at the
0:12:50.629,0:12:55.980
moment, this is the fourth year or[br]something of these negotiations and we
0:12:55.980,0:13:00.089
will see how it goes this year and if[br]states can't find a common ground there it
0:13:00.089,0:13:04.949
becomes increasingly[br]likely that it will change to another
0:13:04.949,0:13:11.060
forum, just like with anti-personnel mines[br]for example, where the treaty was
0:13:11.060,0:13:17.639
negotiated outside of the United Nations. But[br]yeah, the window of opportunity is really
0:13:17.639,0:13:24.950
closing, and states and NGOs have to act[br]there and, yeah, keep on track. Just
0:13:24.950,0:13:32.339
as a side note, probably quite a few[br]people here are members of NGOs, so if you look
0:13:32.339,0:13:39.059
at the Campaign to Stop Killer Robots, the[br]big campaign behind this process,
0:13:39.059,0:13:43.310
there's only one German NGO, which is[br]Facing Finance. So, especially if
0:13:43.310,0:13:48.739
you're a German NGO and are interested in[br]AI, it might be worthwhile to look into the
0:13:48.739,0:13:54.509
military dimension as well. We really need[br]some expertise in that regard, especially
0:13:54.509,0:14:00.439
on AI and these technologies. They're...[br]Okay, so just in case you fell asleep in
0:14:00.439,0:14:05.589
the last 15 minutes I want you to take[br]away three key messages: Please be aware
0:14:05.589,0:14:11.100
of the trends and internal logic that lead[br]to autonomy in weapons. Do not
0:14:11.100,0:14:15.629
overestimate the abilities of[br]autonomous machines, like intent and these
0:14:15.629,0:14:20.189
things and because you probably all knew[br]this already, please tell people about
0:14:20.189,0:14:23.899
this, tell other people about this,[br]educate them about this type of
0:14:23.899,0:14:30.669
technology. And third, don't underestimate[br]the potential dangers for security and
0:14:30.669,0:14:37.249
human dignity that come from this type of[br]weapon. I hope that I could interest you a
0:14:37.249,0:14:40.839
bit more in this particular issue.[br]If you want to learn more, you can find
0:14:40.839,0:14:46.819
really interesting sources on the website[br]of the CCW, at the Campaign to Stop Killer
0:14:46.819,0:14:53.410
Robots, and from a research project that I[br]happen to work in, the International Panel
0:14:53.410,0:14:56.759
on the Regulation of Autonomous Weapons -[br]we do have a few studies in that regard
0:14:56.759,0:15:02.059
and we're going to publish a few more. So[br]please, check this out and thank you for
0:15:02.059,0:15:03.481
your attention.
0:15:03.481,0:15:13.714
Applause
0:15:13.714,0:15:16.199
Questions?
0:15:20.519,0:15:23.779
Herald: Sorry. So we have some time for[br]questions and answers now. Okay, first of all
0:15:23.779,0:15:28.110
I have to apologize that we had a hiccup[br]with the sign language interpretation; the acoustics
0:15:28.110,0:15:32.209
over here on the stage were so bad that she[br]couldn't do her job, so I'm
0:15:32.209,0:15:38.720
terribly sorry about that. We fixed it during[br]the talk and my apologies for that. We are
0:15:38.720,0:15:42.160
queuing here on the microphones already so[br]we start with microphone number one, your
0:15:42.160,0:15:44.610
question please.[br]Mic 1: Thanks for your talk Anja. Don't
0:15:44.610,0:15:49.049
you think there is a possibility to reduce[br]war crimes as well by taking away the
0:15:49.049,0:15:54.059
decision from humans and by having[br]algorithms that decide, which are actually
0:15:54.059,0:15:56.929
auditable?[br]Dahlmann: Yeah that's, actually, that's
0:15:56.929,0:15:59.740
something that is discussed in the[br]international debate as well: that
0:15:59.740,0:16:05.399
machines might be more ethical[br]than humans could be. And well, of course
0:16:05.399,0:16:11.509
they won't just start raping women because[br]they want to but you can program them to
0:16:11.509,0:16:18.269
do this. So you just shift the[br]problems, really. And also maybe these
0:16:18.269,0:16:22.559
machines don't get angry but they don't[br]show compassion either so if you are there
0:16:22.559,0:16:26.069
and you're a potential target, they just won't[br]stop, they will just kill you and not
0:16:26.069,0:16:32.839
think once about this. So you really have[br]to look at both sides there, I guess.
0:16:32.839,0:16:37.839
Herald: Thanks. So we switch over[br]to microphone 3, please.
0:16:37.839,0:16:44.699
Mic 3: Thanks for the talk. Regarding[br]autonomous cars, self-driving cars,
0:16:44.699,0:16:49.239
there's a similar discussion going on[br]regarding the ethics. How should a car
0:16:49.239,0:16:54.139
react in the case of an accident? Should it[br]protect people outside, people inside,
0:16:54.139,0:17:02.069
what are the laws? So there is another[br]discussion there. Do you work with people
0:17:02.069,0:17:07.270
in this area, or is there any[br]collaboration?
0:17:07.270,0:17:10.169
Dahlmann: Maybe there's less collaboration[br]than one might think there is. I think
0:17:10.169,0:17:17.299
there is. Of course, we monitor this[br]debate as well and, yeah, we think about the
0:17:17.299,0:17:20.689
possible applications of the outcomes, for[br]example from this German ethics
0:17:20.689,0:17:26.849
commission on self-driving cars for our[br]work. But I'm a bit torn there because
0:17:26.849,0:17:30.910
when you talk about weapons, they are[br]designed to kill people and cars mostly
0:17:30.910,0:17:36.000
are not. So with this ethics committee[br]you want to avoid killing people or decide
0:17:36.000,0:17:42.100
what happens when this accident occurs. So[br]they are a bit different but of course
0:17:42.100,0:17:48.790
yeah, you can learn a lot from both[br]discussions and we are aware of that.
0:17:48.790,0:17:53.530
Herald: Thanks. Then we're gonna go over[br]in the back, microphone number 2, please.
0:17:53.530,0:17:59.500
Mic 2: Also from me, thanks again for this[br]talk and for infusing all this professionalism
0:17:59.500,0:18:10.280
into the debate, because some people in the[br]surroundings of, so to say, our
0:18:10.280,0:18:17.860
scene like to protest against very[br]specific things, like for example the
0:18:17.860,0:18:23.920
Ramstein air base, and in my view that's a[br]bit misguided if you just go out and
0:18:23.920,0:18:30.559
protest in a populist way without[br]involving the kind of expertise that
0:18:30.559,0:18:38.210
you offer. And so, thanks again for that.[br]And then my question: How would you
0:18:38.210,0:18:46.940
propose that protests progress and develop[br]themselves to a higher level to be on the
0:18:46.940,0:18:55.170
one hand more effective and on the other[br]hand more considerate of what is at stake
0:18:55.170,0:19:00.200
on all the levels and on[br]all sides involved?
0:19:00.200,0:19:05.690
Dahlmann: Yeah well, first, the Ramstein[br]issue is actually a completely
0:19:05.690,0:19:10.340
different topic. It's drone warfare,[br]remotely piloted drones, so there are a
0:19:10.340,0:19:14.290
lot of problems with this and[br]with targeted killings, but it's not about
0:19:14.290,0:19:21.820
lethal autonomous weapons in particular.[br]Well if you want to be a part of this
0:19:21.820,0:19:25.330
international debate, there's of course[br]this Campaign to Stop Killer Robots and
0:19:25.330,0:19:30.429
they have a lot of really good people and[br]a lot of resources, sources, literature
0:19:30.429,0:19:35.190
and things like that to really educate[br]yourself about what's going on there, so that
0:19:35.190,0:19:39.490
would be a starting point. And then yeah[br]just keep talking to scientists about
0:19:39.490,0:19:45.280
this and find out where we see the[br]problems and I mean it's always helpful
0:19:45.280,0:19:52.969
for scientists to talk to people in the[br]field, so to say. So yeah, keep talking.
0:19:52.969,0:19:55.519
Herald: Thanks for that. And the[br]signal angel signaled that we have
0:19:55.519,0:19:59.200
something from the internet.[br]Signal Angel: Thank you. Question from
0:19:59.200,0:20:04.210
IRC: Aren't we already in a killer robot[br]world? A botnet can attack a nuclear
0:20:04.210,0:20:08.270
power plant for example. What do you think?[br]Dahlmann: I really didn't understand a
0:20:08.270,0:20:09.710
word, I'm sorry.[br]Herald: I didn't understand that either,
0:20:09.710,0:20:12.590
so can you speak closer to[br]the microphone, please?
0:20:12.590,0:20:16.480
Signal Angel: Yes. Aren't we already in a[br]killer robot world?
0:20:16.480,0:20:19.510
Herald: Sorry, that doesn't work. Sorry.[br]Sorry, we stop that here, we can't hear it
0:20:19.510,0:20:21.550
over here. Sorry.[br]Signal Angel: Okay.
0:20:21.550,0:20:25.749
Herald: We're gonna switch over to[br]microphone two now, please.
0:20:25.749,0:20:32.529
Mic 2: I have one little question. So in[br]your talk, you were focusing on the
0:20:32.529,0:20:38.970
ethical questions related to lethal[br]weapons. Are you aware of ongoing
0:20:38.970,0:20:45.379
discussions regarding the ethical aspects[br]of the design and implementation of less
0:20:45.379,0:20:52.769
than lethal autonomous weapons for crowd[br]control and similar purposes?
0:20:52.769,0:20:57.039
Dahlmann: Yeah actually within the CCW,[br]every term of this Lethal Autonomous
0:20:57.039,0:21:02.660
Weapon Systems is disputed, also the[br]"lethal" aspect, and for the regulation
0:21:02.660,0:21:07.670
it might be easier to focus on this for[br]now, because less-than-lethal weapons come
0:21:07.670,0:21:14.490
with their own problems and the question[br]of whether they are ethical and whether
0:21:14.490,0:21:18.820
IHL applies to them. But I'm not really[br]deep into this discussion, so I'll just
0:21:18.820,0:21:22.600
have to leave it there.[br]Herald: Thanks and back here to microphone
0:21:22.600,0:21:25.230
one, please.[br]Mic 1: Hi. Thank you for the talk very
0:21:25.230,0:21:31.710
much. My question is in the context of the[br]decreasing cost of both the hardware and
0:21:31.710,0:21:37.190
software over the next, say, 20, 40 years.[br]Outside of a nation-state context, like
0:21:37.190,0:21:42.419
private forces or non-state actors[br]gaining use of these weapons, do things
0:21:42.419,0:21:45.860
like the UN convention or the Campaign to[br]Stop Killer Robots apply? Are they
0:21:45.860,0:21:52.409
considering private individuals trying to[br]leverage these against others?
0:21:52.409,0:21:59.860
Dahlmann: Not sure what the campaign says[br]about this, I'm not a member there. The
0:21:59.860,0:22:06.600
CCW mostly focuses on international[br]humanitarian law, which is important, but I
0:22:06.600,0:22:10.820
think it's not broad enough. So[br]questions like proliferation and all this
0:22:10.820,0:22:16.480
are connected to your question and[br]probably won't be part of
0:22:16.480,0:22:21.090
the regulation there. It's discussed only on[br]the edges of the debates and
0:22:21.090,0:22:26.799
negotiations there, but it doesn't seem to[br]be a real issue there.
0:22:26.799,0:22:29.860
Mic 1: Thanks.[br]Herald: And over to microphone six,
0:22:29.860,0:22:32.510
please.[br]Mic 6: Thank you. I have a question as a
0:22:32.510,0:22:38.559
researcher: Do you know how far the[br]development has gone already? So how
0:22:38.559,0:22:44.770
transparent or intransparent is your look[br]into what is being developed and
0:22:44.770,0:22:50.669
researched on the side of the military,[br]people working with
0:22:50.669,0:22:54.789
autonomous weapons and developing them?[br]Dahlmann: Well, for me it's quite
0:22:54.789,0:22:59.759
intransparent, because I only have[br]access to publicly available
0:22:59.759,0:23:04.630
sources, so I don't really know what's[br]going on behind closed doors in the
0:23:04.630,0:23:10.500
military or in the industry there. Of[br]course you can monitor the
0:23:10.500,0:23:14.820
civilian applications or developments,[br]which can tell a lot about the state
0:23:14.820,0:23:24.131
of the art, and for example DARPA,[br]the US Defense Advanced Research Projects Agency, they
0:23:24.131,0:23:30.070
sometimes publish a call for papers -[br]that's not the term - but there you can see
0:23:30.070,0:23:34.440
in which areas they are interested.[br]For example, they really like this
0:23:34.440,0:23:40.990
idea of autonomous killer bugs that can[br]act in swarms and monitor or even kill
0:23:40.990,0:23:45.600
people, and things like that. So yeah, we[br]try to piece it together in
0:23:45.600,0:23:48.699
our work.[br]Herald: We do have a little bit more time,
0:23:48.699,0:23:50.570
are you okay to answer more questions?[br]Dahlmann: Sure.
0:23:50.570,0:23:52.959
Herald: Then we're gonna switch over to[br]microphone three, please.
0:23:52.959,0:24:00.460
Mic 3: Yes, hello. I think we are already living[br]in a world of Lethal Autonomous
0:24:00.460,0:24:04.870
Weapon Systems if you think about these[br]millions of landmines which are operating.
0:24:04.870,0:24:09.249
And so the question is: Shouldn't it be[br]possible to ban these weapon systems the
0:24:09.249,0:24:14.320
same way as landmines, which are already[br]banned by several countries, so just
0:24:14.320,0:24:19.240
include them in that definition?[br]Because the arguments should be very
0:24:19.240,0:24:23.049
similar.[br]Dahlmann: Yeah it does, it does come to
0:24:23.049,0:24:26.539
mind of course because these mines are[br]just lying around there and no one's
0:24:26.539,0:24:32.900
interacting when you step on them and[br]boom! But, well, it depends, it
0:24:32.900,0:24:38.889
depends first of all a bit on your[br]definition of autonomy. So some say
0:24:38.889,0:24:42.539
autonomous is when you act in dynamic[br]situations and the other ones would be
0:24:42.539,0:24:47.889
automated, and things like that. And I think[br]this autonomy aspect - I really don't want
0:24:47.889,0:24:55.899
to define autonomy here - but this action
0:24:55.899,0:25:01.480
in more dynamic spaces and the aspect of[br]machine learning and all these things,
0:25:01.480,0:25:06.450
they are way more complex and they bring[br]different problems than just land mines.
0:25:06.450,0:25:10.759
Landmines are problematic, anti-personnel[br]mines are banned for good reasons but they
0:25:10.759,0:25:15.409
don't have the same problems, I think. So[br]I don't think it would be
0:25:15.409,0:25:21.539
sufficient to just put the LAWS in there,[br]the Lethal Autonomous Weapons.
0:25:21.539,0:25:26.399
Herald: Thank you very much. I can't see[br]anyone else queuing up so therefore, Anja,
0:25:26.399,0:25:28.990
thank you very much it's your applause!
0:25:28.990,0:25:31.810
applause
0:25:31.810,0:25:34.587
and once again my apologies that[br]that didn't work
0:25:34.587,0:25:39.800
34c3 outro
0:25:39.800,0:25:56.575
subtitles created by c3subtitles.de[br]in the year 2018. Join, and help us!