1
00:00:00,000 --> 00:00:15,010
34c3 intro
2
00:00:15,010 --> 00:00:18,339
Herald: ... used Anja Dahlmann, a
political scientist and researcher at
3
00:00:18,339 --> 00:00:23,310
Stiftung Wissenschaft und Politik, a
Berlin-based think tank. Here we go.
4
00:00:23,310 --> 00:00:30,160
applause
5
00:00:30,160 --> 00:00:40,030
Anja Dahlmann: Yeah, thanks for being
here. I probably neither cut myself nor
6
00:00:40,030 --> 00:00:44,270
proposed but I hope it's still
interesting. I'm going to talk about
7
00:00:44,270 --> 00:00:49,240
preventive arms control and international
humanitarian law and their role in this
8
00:00:49,240 --> 00:00:53,270
international debate around autonomous
weapons. This type of weapon is also
9
00:00:53,270 --> 00:00:59,010
referred to as Lethal Autonomous Weapons
Systems, LAWS for short, or also killer robots.
10
00:00:59,010 --> 00:01:04,239
So if I say LAWS, I mostly mean these
weapons and not like legal laws, just to
11
00:01:04,239 --> 00:01:11,650
confuse you a bit. Okay. I will discuss
this topic along three questions. First of
12
00:01:11,650 --> 00:01:18,300
all, what are we actually talking about
here, what are autonomous weapons? Second,
13
00:01:18,300 --> 00:01:22,210
why should we even care about this? Why's
it important? And third, how could this
14
00:01:22,210 --> 00:01:31,480
issue be addressed on an international level?
So, I'll go through my slides. Anyway,
15
00:01:31,480 --> 00:01:37,770
what are we talking about here? Well,
during the international negotiations, so
16
00:01:37,770 --> 00:01:45,360
far no real, no common definition has been
found. So states parties try to find
17
00:01:45,360 --> 00:01:50,090
something or not and for my presentation I
will just use a very broad definition of
18
00:01:50,090 --> 00:01:57,229
autonomous weapons, which is: weapons that
can, once activated, execute a broad range
19
00:01:57,229 --> 00:02:02,350
of tasks or select and engage targets
without further human intervention. And
20
00:02:02,350 --> 00:02:08,340
it's just a very broad spectrum of weapons
that might fall under this definition.
21
00:02:08,340 --> 00:02:13,150
Actually, some existing ones are there as
well which you can't see here. That would
22
00:02:13,150 --> 00:02:19,940
be the Phalanx system for example. It's
been around since the 1970s. Sorry...
23
00:02:19,940 --> 00:02:22,950
Herald: We can't hear anything
on the stage. Please go on.
24
00:02:22,950 --> 00:02:27,310
Dahlmann: Sorry. So, the Phalanx system has
been around since the 1970s, a US system,
25
00:02:27,310 --> 00:02:33,069
air defense system, based on ships, and
it's there to, just yeah, defend the ship
26
00:02:33,069 --> 00:02:39,330
against incoming objects from the air. So
it has been around for quite a
27
00:02:39,330 --> 00:02:45,170
long time and it might even be part of
this LAWS definition or not but just to
28
00:02:45,170 --> 00:02:49,410
give you an impression of how broad this
range is: Today, we've got for example
29
00:02:49,410 --> 00:02:57,990
demonstrators like the Taranis drone, a UK
system, or the X-47B, which can, for
30
00:02:57,990 --> 00:03:04,959
example, autonomously land
applause
31
00:03:04,959 --> 00:03:08,700
land on aircraft carriers and can be air-
refueled and stuff like that which is
32
00:03:08,700 --> 00:03:13,550
apparently quite impressive if you don't
need a human to do that and in the future
33
00:03:13,550 --> 00:03:18,810
there might be even, or there probably
will be even more, autonomous functions,
34
00:03:18,810 --> 00:03:25,650
so navigation, landing, refueling, all
that stuff. That's, you know, old but at
35
00:03:25,650 --> 00:03:30,090
some point weapons might
be able to choose their own ammunition
36
00:03:30,090 --> 00:03:34,560
according to the situation. They might be
able to choose their target and decide
37
00:03:34,560 --> 00:03:41,720
when to engage with the target without any
human intervention at some point. And
38
00:03:41,720 --> 00:03:45,001
that's quite problematic, I will tell you
why that is in a minute. Overall, you can
39
00:03:45,001 --> 00:03:52,230
see that there's a gradual decline of
human control over weapons systems or over
40
00:03:52,230 --> 00:03:58,050
weapons and the use of force. So that's a
very short and broad impression of what
41
00:03:58,050 --> 00:04:01,360
we're talking about here. And talking
about definitions, it's always interesting
42
00:04:01,360 --> 00:04:05,730
what you're not talking about and that's
why I want to address some misconceptions
43
00:04:05,730 --> 00:04:13,481
in the public debate. First of all, when
we talk about machine autonomy, also
44
00:04:13,481 --> 00:04:19,858
artificial intelligence, which is the
technology behind this,
45
00:04:19,858 --> 00:04:25,420
people - not you probably - in the media
and the broader public often get the idea
46
00:04:25,420 --> 00:04:30,540
that these machines might have some kind
of real intelligence or intention or an
47
00:04:30,540 --> 00:04:36,260
entity in their own right, and they're just not.
It's just statistical methods, it's just
48
00:04:36,260 --> 00:04:40,740
math and you know way more about this than
I do so I will leave it with this and just
49
00:04:40,740 --> 00:04:45,600
say, or highlight, that these
machines, these weapons, have certain
50
00:04:45,600 --> 00:04:50,070
competences for specific tasks. They are
not entities in their own right, they are
51
00:04:50,070 --> 00:04:55,050
not intentional. And that's important when
we talk about ethical and legal challenges
52
00:04:55,050 --> 00:05:06,580
afterwards. Sorry. There it is. And in
connection with this, there's
53
00:05:06,580 --> 00:05:11,380
another one, which is the plethora of
Terminator references in the media as soon
54
00:05:11,380 --> 00:05:15,000
as you talk about autonomous weapons,
mostly referred to as killer robots in
55
00:05:15,000 --> 00:05:20,251
this context. And just in case you tend to
write an article about this: don't use a
56
00:05:20,251 --> 00:05:24,340
Terminator picture, please. Don't, because
it's really unhelpful to understand where
57
00:05:24,340 --> 00:05:30,350
the problems are. With this kind of thing,
people assume that the problems only start
58
00:05:30,350 --> 00:05:34,160
when we have machines with a human-like
intelligence which look like the
59
00:05:34,160 --> 00:05:39,750
Terminator or something like this. And the
problem is that, really, way before that,
60
00:05:39,750 --> 00:05:46,820
they start: when you use assisting systems,
when you have human-machine teaming,
61
00:05:46,820 --> 00:05:50,860
or when you accumulate a couple of
autonomous functions through the targeting
62
00:05:50,860 --> 00:05:57,720
cycle. So through this, the military steps
that lead to the use of force or lead to
63
00:05:57,720 --> 00:06:03,560
the killing of people. And that's not,
this is really not our problem at the
64
00:06:03,560 --> 00:06:08,350
moment. So please keep this in mind
because it's not just semantics
65
00:06:08,350 --> 00:06:15,350
to differentiate between these two things.
It really manages the expectations of
66
00:06:15,350 --> 00:06:20,780
political and military decision-makers.
Ok, so now you've got kind of an
67
00:06:20,780 --> 00:06:23,620
impression of what I'm talking about here, so
why should we actually talk about this?
68
00:06:23,620 --> 00:06:30,090
What's all the fuss about? Actually,
autonomous weapons have or would have
69
00:06:30,090 --> 00:06:34,169
quite a few military advantages: They
might be, in some cases, faster or even
70
00:06:34,169 --> 00:06:39,780
more precise than humans. And you don't
need a constant communication link. So you
71
00:06:39,780 --> 00:06:43,740
don't have, you don't have to worry about
unstable communication links, you don't
72
00:06:43,740 --> 00:06:50,020
have to worry about latency or detection
or a vulnerability of this specific link.
73
00:06:50,020 --> 00:06:57,050
So yay! And a lot of, let's say very
interesting, military options come from
74
00:06:57,050 --> 00:07:02,520
that. People talk about stealthy
operations in shallow waters, for example.
75
00:07:02,520 --> 00:07:07,340
Or, you know, remote missions in secluded
areas, things like that. And you can get
76
00:07:07,340 --> 00:07:13,830
very creative with the tiniest robots and
swarms for example. So shiny new options.
77
00:07:13,830 --> 00:07:19,630
But, and of course there's a "but", it
comes at a price because you have at least
78
00:07:19,630 --> 00:07:26,590
three dimensions of challenges in this
regard. First of all, the legal ones. When
79
00:07:26,590 --> 00:07:31,389
we talk about these weapons, they might
be, they will be applied in conflicts where
80
00:07:31,389 --> 00:07:37,500
international humanitarian law, IHL,
applies. And IHL consists of quite a few
81
00:07:37,500 --> 00:07:45,430
very abstract principles. For example:
principle of distinction between
82
00:07:45,430 --> 00:07:51,310
combatants and civilians, principle of
proportionality, or military necessity.
83
00:07:51,310 --> 00:07:58,069
They are very abstract and I'm pretty sure
they really always need a human judgment
84
00:07:58,069 --> 00:08:05,889
to interpret these principles and
apply them to dynamic situations. Feel
85
00:08:05,889 --> 00:08:14,870
free to correct me if I'm wrong later. So
that's one thing. So if you remove the
86
00:08:14,870 --> 00:08:19,440
human from the targeting cycle, this human
judgment might be missing and therefore
87
00:08:19,440 --> 00:08:24,700
military decision makers have to evaluate
very carefully the quality of human
88
00:08:24,700 --> 00:08:31,699
control and human judgement within the
targeting cycle. So that's law. Second
89
00:08:31,699 --> 00:08:38,719
dimension of challenges is security
issues. When you look at these new systems
90
00:08:38,719 --> 00:08:43,639
they are cool and shiny and, as with most new
types of weapons, they have the
91
00:08:43,639 --> 00:08:49,050
potential to stir an arms race between
states. So they actually might
92
00:08:49,050 --> 00:08:53,559
make conflicts more likely just because
they are there and states want to have
93
00:08:53,559 --> 00:09:00,519
them and feel threatened by them. Second
aspect is proliferation. Autonomy is based
94
00:09:00,519 --> 00:09:04,739
on software, and software can be easily
transferred and is really hard to control,
95
00:09:04,739 --> 00:09:08,579
and all the other components, or most of
the other components you will need, are
96
00:09:08,579 --> 00:09:12,279
available on the civilian market so you
can build this stuff on your own if you're
97
00:09:12,279 --> 00:09:19,639
smart enough. So we might have more
conflicts from these types of weapons and
98
00:09:19,639 --> 00:09:24,500
it might get, well, more difficult to
control the application of this
99
00:09:24,500 --> 00:09:29,560
technology. And the third one, which is
especially worrying for me, is the
100
00:09:29,560 --> 00:09:33,971
potential for escalation within the
conflict, especially when you have, when
101
00:09:33,971 --> 00:09:40,279
both or more sides use these autonomous
weapons, you have these very complex
102
00:09:40,279 --> 00:09:45,550
adversarial systems and it will become very
hard to predict how they are going to
103
00:09:45,550 --> 00:09:52,050
interact. They will increase the speed of
the conflict and the human might
104
00:09:52,050 --> 00:09:57,040
not even have a chance to process
what's going on there.
105
00:09:57,040 --> 00:10:01,949
So that's really worrying and we can see
for example in high-frequency trading at
106
00:10:01,949 --> 00:10:05,980
the stock markets, where problems arise
and how difficult it is for humans
107
00:10:05,980 --> 00:10:12,529
to understand what's going on there. So
those are some of these security
108
00:10:12,529 --> 00:10:23,129
issues there. And the last and maybe
most important one is ethics. As I
109
00:10:23,129 --> 00:10:29,089
mentioned before, when you use autonomy
in weapons or machines you have
110
00:10:29,089 --> 00:10:32,759
artificial intelligence so you don't have
real intention, a real entity that's
111
00:10:32,759 --> 00:10:38,220
behind this. So the killing decision might
at some point be based on statistical
112
00:10:38,220 --> 00:10:43,209
methods and no one will be involved there
and that's, well, worrying for a lot of
113
00:10:43,209 --> 00:10:48,399
reasons but also it could constitute a
violation of human dignity. You can argue
114
00:10:48,399 --> 00:10:54,079
that humans have, well, you can kill
humans in war but they at least have
115
00:10:54,079 --> 00:10:59,140
the right to be killed by another human or
at least by the decision of another human,
116
00:10:59,140 --> 00:11:02,629
but we can discuss this later.
So at least in this regard it would be
117
00:11:02,629 --> 00:11:07,610
highly unethical and that really just
scratches the surface of problems and
118
00:11:07,610 --> 00:11:12,790
challenges that would arise from the use
of these autonomous weapons. I haven't
119
00:11:12,790 --> 00:11:16,680
even touched on the problems with training
data, with accountability, with
120
00:11:16,680 --> 00:11:23,299
verification and all that funny stuff
because I only have 20 minutes. So, sounds
121
00:11:23,299 --> 00:11:32,519
pretty bad, doesn't it? So how can this
issue be addressed? Luckily, states have,
122
00:11:32,519 --> 00:11:36,739
thanks to a huge campaign of NGOs, noticed
that there might be some problems and
123
00:11:36,739 --> 00:11:40,439
there might be a necessity to address
this issue. They're currently doing
124
00:11:40,439 --> 00:11:45,379
this in the UN Convention on Certain
Conventional Weapons, CCW, where they
125
00:11:45,379 --> 00:11:53,290
discuss a potential ban of the development
and use of these lethal weapons or weapons
126
00:11:53,290 --> 00:11:57,589
that lack meaningful human control over
the use of force. There are several ideas
127
00:11:57,589 --> 00:12:04,420
around there. And such a ban would be
really the maximum goal of the NGOs there
128
00:12:04,420 --> 00:12:09,170
but it becomes increasingly unlikely that
this happens. Most states do not agree
129
00:12:09,170 --> 00:12:12,991
with a complete ban, they want to regulate
it a bit here, a bit there, and they
130
00:12:12,991 --> 00:12:17,879
really can't find a common
definition as I mentioned before because
131
00:12:17,879 --> 00:12:22,609
if you have a broad definition just as
I used it you will notice that you have
132
00:12:22,609 --> 00:12:26,009
existing systems in there that might be
not that problematic or that you just
133
00:12:26,009 --> 00:12:31,749
don't want to ban and you might stop
civilian or commercial developments which
134
00:12:31,749 --> 00:12:38,690
you also don't want to do. So states are
stuck in this regard and they also really
135
00:12:38,690 --> 00:12:42,179
challenge the notion that we need
preventive arms control here, so that we
136
00:12:42,179 --> 00:12:50,629
need to act before these systems are
applied on the battlefield. So at the
137
00:12:50,629 --> 00:12:55,980
moment, this is the fourth year or
something of these negotiations and we
138
00:12:55,980 --> 00:13:00,089
will see how it goes this year and if
states can't find a common ground there it
139
00:13:00,089 --> 00:13:04,949
becomes increasingly likely
that it will change to another
140
00:13:04,949 --> 00:13:11,060
forum just like with anti-personnel mines
for example, where the treaty was
141
00:13:11,060 --> 00:13:17,639
found outside of the United Nations. But
yeah, the window of opportunity really
142
00:13:17,639 --> 00:13:24,950
is closing and states and NGOs have to act
there and yeah keep on track there. Just
143
00:13:24,950 --> 00:13:32,339
as a side note, probably quite a few
people are members of NGOs so if you look
144
00:13:32,339 --> 00:13:39,059
at the Campaign to Stop Killer Robots, the
big campaign behind this process,
145
00:13:39,059 --> 00:13:43,310
there's only one German NGO, which is
Facing Finance. So, especially if
146
00:13:43,310 --> 00:13:48,739
you're a German NGO and are interested in
AI it might be worthwhile to look into the
147
00:13:48,739 --> 00:13:54,509
military dimension as well. We really need
some expertise in that regard, especially
148
00:13:54,509 --> 00:14:00,439
on AI and these technologies. They're...
Okay, so just in case you fell asleep in
149
00:14:00,439 --> 00:14:05,589
the last 15 minutes I want you to take
away three key messages: Please be aware
150
00:14:05,589 --> 00:14:11,100
of the trends and internal logic that lead
to autonomy in weapons. Do not
151
00:14:11,100 --> 00:14:15,629
overestimate the abilities of autonomy, of
autonomous machines like intent and these
152
00:14:15,629 --> 00:14:20,189
things and because you probably all knew
this already, please tell people about
153
00:14:20,189 --> 00:14:23,899
this, tell other people about this,
educate them about this type of
154
00:14:23,899 --> 00:14:30,669
technology. And third, don't underestimate
the potential dangers for security and
155
00:14:30,669 --> 00:14:37,249
human dignity that come from this type of
weapon. I hope that I could interest you a
156
00:14:37,249 --> 00:14:40,839
bit more in this particular issue.
If you want to learn more, you can find
157
00:14:40,839 --> 00:14:46,819
really interesting sources on the website
of the CCW, at the Campaign to Stop Killer
158
00:14:46,819 --> 00:14:53,410
Robots, and from a research project that I
happen to work in, the International Panel
159
00:14:53,410 --> 00:14:56,759
on the Regulation of Autonomous Weapons,
we do have a few studies in that regard
160
00:14:56,759 --> 00:15:02,059
and we're going to publish a few more. So
please, check this out and thank you for
161
00:15:02,059 --> 00:15:03,481
your attention.
162
00:15:03,481 --> 00:15:13,714
Applause
163
00:15:13,714 --> 00:15:16,199
Questions?
164
00:15:20,519 --> 00:15:23,779
Herald: Sorry. So we have some time for
questions and answers now. Okay, first of all
165
00:15:23,779 --> 00:15:28,110
I have to apologize that we had a hiccup
with the sign language interpretation, the acoustics
166
00:15:28,110 --> 00:15:32,209
over here on the stage were so bad that she
couldn't do her job, so I'm
167
00:15:32,209 --> 00:15:38,720
terribly sorry about that. We fixed it in
the talk and my apologies for that. We are
168
00:15:38,720 --> 00:15:42,160
queuing here on the microphones already so
we start with microphone number one, your
169
00:15:42,160 --> 00:15:44,610
question please.
Mic 1: Thanks for your talk Anja. Don't
170
00:15:44,610 --> 00:15:49,049
you think there is a possibility to reduce
war crimes as well by taking away the
171
00:15:49,049 --> 00:15:54,059
decision from humans and by having
algorithms that decide, which are actually
172
00:15:54,059 --> 00:15:56,929
auditable?
Dahlmann: Yeah that's, actually, that's
173
00:15:56,929 --> 00:15:59,740
something that is discussed in the
international debate as well, that there
174
00:15:59,740 --> 00:16:05,399
might, that machines might be more ethical
than humans could be. And well, of course
175
00:16:05,399 --> 00:16:11,509
they won't just start raping women because
they want to but you can program them to
176
00:16:11,509 --> 00:16:18,269
do this. So you just shift the
problems really. And also maybe these
177
00:16:18,269 --> 00:16:22,559
machines don't get angry but they don't
show compassion either so if you are there
178
00:16:22,559 --> 00:16:26,069
and you're a potential target, they just won't
stop, they will just kill you and not
179
00:16:26,069 --> 00:16:32,839
think once about this. So you have
to really look at both sides there I guess.
180
00:16:32,839 --> 00:16:37,839
Herald: Thanks. So we switch over
to microphone 3, please.
181
00:16:37,839 --> 00:16:44,699
Mic 3: Thanks for the talk. Regarding
autonomous cars, self-driving cars,
182
00:16:44,699 --> 00:16:49,239
there's a similar discussion going on
regarding the ethics. How should a car
183
00:16:49,239 --> 00:16:54,139
react in a case of an accident? Should it
protect people outside, people inside,
184
00:16:54,139 --> 00:17:02,069
what are the laws? So there is another
discussion there. Do you work with people
185
00:17:02,069 --> 00:17:07,270
in this area, or is there any
collaboration?
186
00:17:07,270 --> 00:17:10,169
Dahlmann: Maybe there's less collaboration
than one might think there is. I think
187
00:17:10,169 --> 00:17:17,299
there is. Of course, we monitor this
debate as well and yeah we think about the
188
00:17:17,299 --> 00:17:20,689
possible applications of the outcomes for
example from the German ethics
189
00:17:20,689 --> 00:17:26,849
commission on self-driving cars for our
work. But I'm a bit torn there because
190
00:17:26,849 --> 00:17:30,910
when you talk about weapons, they are
designed to kill people and cars mostly
191
00:17:30,910 --> 00:17:36,000
are not. So with this ethical committee
you want to avoid killing people or decide
192
00:17:36,000 --> 00:17:42,100
what happens when this accident occurs. So
they are a bit different but of course
193
00:17:42,100 --> 00:17:48,790
yeah you can learn a lot from both
discussions and we are aware of that.
194
00:17:48,790 --> 00:17:53,530
Herald: Thanks. Then we're gonna go over
in the back, microphone number 2, please.
195
00:17:53,530 --> 00:17:59,500
Mic 2: Also from me thanks again for this
talk and infusing all this professionalism
196
00:17:59,500 --> 00:18:10,280
into the debate because some of the
surroundings of our, so to say, our
197
00:18:10,280 --> 00:18:17,860
scene, they like to protest against very
specific things like for example the
198
00:18:17,860 --> 00:18:23,920
Ramstein Air Base and in my view that's a
bit misguided if you just go out and
199
00:18:23,920 --> 00:18:30,559
protest in a populist way without
involving these points of expertise that
200
00:18:30,559 --> 00:18:38,210
you offer. And so, thanks again for that.
And then my question: How would you
201
00:18:38,210 --> 00:18:46,940
propose that protests progress and develop
themselves to a higher level to be on the
202
00:18:46,940 --> 00:18:55,170
one hand more effective and on the other
hand more considerate of what is at stake
203
00:18:55,170 --> 00:19:00,200
on all the levels and on
all sides involved?
204
00:19:00,200 --> 00:19:05,690
Dahlmann: Yeah well, first, the Ramstein
issue is completely, actually a completely
205
00:19:05,690 --> 00:19:10,340
different topic. It's drone warfare,
remotely piloted drones, so there are a
206
00:19:10,340 --> 00:19:14,290
lot of problems with this and
with targeted killings, but it's not about
207
00:19:14,290 --> 00:19:21,820
lethal autonomous weapons in particular.
Well if you want to be a part of this
208
00:19:21,820 --> 00:19:25,330
international debate, there's of course
this Campaign to Stop Killer Robots and
209
00:19:25,330 --> 00:19:30,429
they have a lot of really good people and
a lot of resources, sources, literature
210
00:19:30,429 --> 00:19:35,190
and things like that to really educate
yourself what's going on there, so that
211
00:19:35,190 --> 00:19:39,490
would be a starting point. And then yeah
just keep talking to scientists about
212
00:19:39,490 --> 00:19:45,280
this and find out where we see the
problems and I mean it's always helpful
213
00:19:45,280 --> 00:19:52,969
for scientists to to talk to people in the
field, so to say. So yeah, keep talking.
214
00:19:52,969 --> 00:19:55,519
Herald: Thanks for that. And the
signal angel signaled that we have
215
00:19:55,519 --> 00:19:59,200
something from the internet.
Signal Angel: Thank you. Question from
216
00:19:59,200 --> 00:20:04,210
IRC: Aren't we already in a killer robot
world? A botnet can attack a nuclear
217
00:20:04,210 --> 00:20:08,270
power plant for example. What do you think?
Dahlmann: I really didn't understand a
218
00:20:08,270 --> 00:20:09,710
word, I'm sorry.
Herald: I didn't understand that either,
219
00:20:09,710 --> 00:20:12,590
so can you speak closer to
the microphone, please?
220
00:20:12,590 --> 00:20:16,480
Signal Angel: Yes. Aren't we already in a
killer robot world?
221
00:20:16,480 --> 00:20:19,510
Herald: Sorry, that doesn't work. Sorry.
Sorry, we stop that here, we can't hear it
222
00:20:19,510 --> 00:20:21,550
over here. Sorry.
Signal Angel: Okay.
223
00:20:21,550 --> 00:20:25,749
Herald: We're gonna switch over to
microphone two now, please.
224
00:20:25,749 --> 00:20:32,529
Mic 2: I have one little question. So in
your talk, you were focusing on the
225
00:20:32,529 --> 00:20:38,970
ethical questions related to lethal
weapons. Are you aware of ongoing
226
00:20:38,970 --> 00:20:45,379
discussions regarding the ethical aspects
of the design and implementation of less
227
00:20:45,379 --> 00:20:52,769
than lethal autonomous weapons for crowd
control and similar purposes?
228
00:20:52,769 --> 00:20:57,039
Dahlmann: Yeah actually within the CCW,
every term of this Lethal Autonomous
229
00:20:57,039 --> 00:21:02,660
Weapon Systems is disputed, also the
"lethal" aspect, and for the regulation
230
00:21:02,660 --> 00:21:07,670
it might be easier to focus on this for
now because less than lethal weapons come
231
00:21:07,670 --> 00:21:14,490
with their own problems and the question
whether they are ethical and whether
232
00:21:14,490 --> 00:21:18,820
IHL applies to them but I'm not really
deep into this discussion. So I'll just
233
00:21:18,820 --> 00:21:22,600
have to leave it there.
Herald: Thanks and back here to microphone
234
00:21:22,600 --> 00:21:25,230
one, please.
Mic 1: Hi. Thank you for the talk very
235
00:21:25,230 --> 00:21:31,710
much. My question is in the context of the
decreasing cost of both, the hardware and
236
00:21:31,710 --> 00:21:37,190
software, over the next say 20, 40 years.
Outside of a nation-state context like
237
00:21:37,190 --> 00:21:42,419
private forces or non-nation-state actors
gaining use of these weapons, do things
238
00:21:42,419 --> 00:21:45,860
like the UN convention or the Campaign to
Stop Killer Robots apply? Are they
239
00:21:45,860 --> 00:21:52,409
considering private individuals trying to
leverage these against others?
240
00:21:52,409 --> 00:21:59,860
Dahlmann: Not sure what the campaign says
about this, I'm not a member there. The
241
00:21:59,860 --> 00:22:06,600
CCW mostly focuses on international
humanitarian law which is important but I
242
00:22:06,600 --> 00:22:10,820
think it's not broad enough. So
questions like proliferation and all this
243
00:22:10,820 --> 00:22:16,480
is connected to your question but is not
really, or probably won't be, part of
244
00:22:16,480 --> 00:22:21,090
regulation there. It's discussed only on
the edges of the debates and
245
00:22:21,090 --> 00:22:26,799
negotiations there but it doesn't seem to
be a real issue there.
246
00:22:26,799 --> 00:22:29,860
Mic 1: Thanks.
Herald: And over to microphone six,
247
00:22:29,860 --> 00:22:32,510
please.
Mic 6: Thank you. I have a question as a
248
00:22:32,510 --> 00:22:38,559
researcher: Do you know how far the
development has gone already? So how
249
00:22:38,559 --> 00:22:44,770
transparent or intransparent is your look
into what is being developed and
250
00:22:44,770 --> 00:22:50,669
researched on the side of the military,
of military people working with
251
00:22:50,669 --> 00:22:54,789
autonomous weapons and developing them?
Dahlmann: Well, for me it's quite
252
00:22:54,789 --> 00:22:59,759
intransparent because I only have
access to publicly available
253
00:22:59,759 --> 00:23:04,630
sources so I don't really know what's
going on behind closed doors in the
254
00:23:04,630 --> 00:23:10,500
military or in the industry there. Of
course you can monitor the
255
00:23:10,500 --> 00:23:14,820
civilian applications or developments
which can tell a lot about the state
256
00:23:14,820 --> 00:23:24,131
of the art. And for example DARPA,
the US Defense Advanced Research Projects Agency, they
257
00:23:24,131 --> 00:23:30,070
sometimes publish a call for papers,
that's not the term, but there you can see
258
00:23:30,070 --> 00:23:34,440
in which areas they are interested.
Then, for example, they really like this
259
00:23:34,440 --> 00:23:40,990
idea of autonomous killer bugs that can
act in swarms and monitor or even kill
260
00:23:40,990 --> 00:23:45,600
people and things like that. So yeah we
try to piece it together in
261
00:23:45,600 --> 00:23:48,699
our work.
Herald: We do have a little bit more time,
262
00:23:48,699 --> 00:23:50,570
are you okay to answer more questions?
Dahlmann: Sure.
263
00:23:50,570 --> 00:23:52,959
Herald: Then we're gonna switch over to
microphone three, please.
264
00:23:52,959 --> 00:24:00,460
Mic 3: Yes, hello. I think we are living
already in a world of Lethal Autonomous
265
00:24:00,460 --> 00:24:04,870
Weapon Systems if you think about these
millions of landmines which are operating.
266
00:24:04,870 --> 00:24:09,249
And so the question is: Shouldn't it be
possible to ban these weapon systems the
267
00:24:09,249 --> 00:24:14,320
same way as land mines that are already
banned by several countries so just
268
00:24:14,320 --> 00:24:19,240
include them in that definition?
Because the arguments should be very
269
00:24:19,240 --> 00:24:23,049
similar.
Dahlmann: Yeah it does, it does come to
270
00:24:23,049 --> 00:24:26,539
mind of course because these mines are
just lying around there and no one's
271
00:24:26,539 --> 00:24:32,900
interacting when you step on them and
boom! But they are, well it depends, it
272
00:24:32,900 --> 00:24:38,889
depends first of all a bit on your
definition of autonomy. So some say
273
00:24:38,889 --> 00:24:42,539
autonomous is when you act in dynamic
situations and the other ones would be
274
00:24:42,539 --> 00:24:47,889
automated and things like that and I think
this autonomy aspect, I really don't want
275
00:24:47,889 --> 00:24:55,899
to define
autonomy here, but this action
276
00:24:55,899 --> 00:25:01,480
in more dynamic spaces and the aspect of
machine learning and all these things,
277
00:25:01,480 --> 00:25:06,450
they are way more complex and they bring
different problems than just land mines.
278
00:25:06,450 --> 00:25:10,759
Landmines are problematic, anti-personnel
mines are banned for good reasons but they
279
00:25:10,759 --> 00:25:15,409
don't have the same problems, I think. So
I don't think it would be
280
00:25:15,409 --> 00:25:21,539
sufficient to just put the LAWS in there,
the Lethal Autonomous Weapons.
281
00:25:21,539 --> 00:25:26,399
Herald: Thank you very much. I can't see
anyone else queuing up so therefore, Anja,
282
00:25:26,399 --> 00:25:28,990
thank you very much, it's your applause!
283
00:25:28,990 --> 00:25:31,810
applause
284
00:25:31,810 --> 00:25:34,587
and once again my apologies that
that didn't work
285
00:25:34,587 --> 00:25:39,800
34c3 outro
286
00:25:39,800 --> 00:25:56,575
subtitles created by c3subtitles.de
in the year 2018. Join, and help us!