1
00:00:00,000 --> 00:00:20,870
36c3 preroll music
2
00:00:20,870 --> 00:00:24,390
Herald: ...but now we start what we're
here for and I'm really happy to be
3
00:00:24,390 --> 00:00:30,550
allowed to introduce Anna Mazgal. She will
talk about something which has a great title.
4
00:00:30,550 --> 00:00:36,397
I love it. "Confessions of a Future
Terrorist". Terror, terrorism is the one
5
00:00:36,397 --> 00:00:40,450
thing you can always shout out and you get
everything through. And she will give us a
6
00:00:40,450 --> 00:00:47,590
rough guide to overregulating free speech
with anti-terrorist measures. Anna works
7
00:00:47,590 --> 00:00:51,950
for Wikimedia where she's a lobbyist for
human rights in the digital environment
8
00:00:51,950 --> 00:00:56,810
and works in Brussels. And she gives a lot
of talks. And I think it's the first time
9
00:00:56,810 --> 00:00:59,430
at a congress for her, is that right?
Anna: Second time.
10
00:00:59,430 --> 00:01:03,330
Herald: Second time? I haven't really
researched that right, because I searched for
11
00:01:03,330 --> 00:01:08,360
it. So I have to do this again. It's her
2nd time at congress and I'm really
12
00:01:08,360 --> 00:01:12,295
happy to have her here. Please welcome her
with a big round of applause.
13
00:01:12,295 --> 00:01:22,244
Anna, the stage is yours.
Anna: Thank you. Yes. So as you have
14
00:01:22,244 --> 00:01:27,598
already heard, I don't do any of the cool
things that we Wikimedians and Wikipedians
15
00:01:27,598 --> 00:01:34,390
do. I am based in Brussels and the
L-word, I do the lobbying on behalf of our
16
00:01:34,390 --> 00:01:42,850
community. And today I am here because I
wanted to talk to you about one of the
17
00:01:42,850 --> 00:01:48,331
proposals for laws that we are now
observing the development of. And I wanted
18
00:01:48,331 --> 00:01:53,200
to share my concerns also, as
an activist, because I'm really worried
19
00:01:53,200 --> 00:01:58,070
how, if that law passes in its worst
possible version or one of the bad
20
00:01:58,070 --> 00:02:02,160
versions, how it will affect my work. I'm
also concerned how it will affect
21
00:02:02,160 --> 00:02:09,390
your work. And basically all of our
expression online. And I also want to
22
00:02:09,390 --> 00:02:16,420
share with you that this law makes me
really angry. So I think these are a
23
00:02:16,420 --> 00:02:22,690
few good reasons to be here and to talk to
you. And I hope after this presentation we
24
00:02:22,690 --> 00:02:26,370
can have a conversation about this. And
I'm looking forward also to your
25
00:02:26,370 --> 00:02:34,379
perspective on it, and also the things
you may not agree with. So, what
26
00:02:34,379 --> 00:02:41,590
is this law? So, in September 2018, the
European Commission came out with a
27
00:02:41,590 --> 00:02:46,379
proposal of regulation on preventing
the dissemination of terrorist content
28
00:02:46,379 --> 00:02:53,799
online. So there are a few things to
unpack here about what it is about. First
29
00:02:53,799 --> 00:02:58,950
of all, when we see a law that is about
the Internet and that is about the content and
30
00:02:58,950 --> 00:03:03,019
that is about the online environment and
it says it will prevent something, this
31
00:03:03,019 --> 00:03:09,297
always brings a very difficult and
complicated perspective for us, the
32
00:03:09,297 --> 00:03:13,569
digital rights activists in Brussels.
Because prevention online never means
33
00:03:13,569 --> 00:03:19,680
anything good. So this is one thing.
The other thing is this very troubled
34
00:03:19,680 --> 00:03:24,359
concept of terrorist content, I will be
talking about this more; I
35
00:03:24,359 --> 00:03:28,619
will show you how the European Commission
understands it and what are the problems
36
00:03:28,619 --> 00:03:33,393
with that understanding and whether this
is something that can actually be really
37
00:03:33,393 --> 00:03:42,189
defined in the law. So these are already
the red flags that I have seen
38
00:03:42,189 --> 00:03:48,349
and we have seen when we
first got the proposal into our
39
00:03:48,349 --> 00:03:53,030
hands. I would like to tell you a little
bit about the framework of it. This is
40
00:03:53,030 --> 00:04:01,309
probably the driest part of this,
but I think it's important to correctly
41
00:04:01,309 --> 00:04:07,049
place it. First of all, this is the
European Union legislation. So we're
42
00:04:07,049 --> 00:04:15,739
talking about the legislation that will
influence 27 member states. Maybe 28, but
43
00:04:15,739 --> 00:04:23,350
we know about Brexit, so it is
debatable what's going to happen there.
44
00:04:23,350 --> 00:04:29,020
And it's important to note that whenever
we have European legislation in the EU,
45
00:04:29,020 --> 00:04:35,599
these are the laws that
actually shape the laws of all those
46
00:04:35,599 --> 00:04:40,280
countries. And they come before the
national laws. So should this
47
00:04:40,280 --> 00:04:45,590
be implemented in any form, or rather
when it's implemented in any form,
48
00:04:45,590 --> 00:04:52,520
this is what is going to happen. The next
important piece of information that I want
49
00:04:52,520 --> 00:04:56,930
to give you is that this particular
regulation is a part of the framework that
50
00:04:56,930 --> 00:05:03,320
is called the Digital Single Market. So the
European Union, one of the
51
00:05:03,320 --> 00:05:09,270
objectives when the European
Commission creates a law and when other
52
00:05:09,270 --> 00:05:14,560
bodies of the European Union work on it,
is that the laws in
53
00:05:14,560 --> 00:05:20,535
the member states of the
European Union are actually similar. And
54
00:05:20,535 --> 00:05:25,990
the Digital Single Market means that
we want to achieve something on
55
00:05:25,990 --> 00:05:30,570
the Internet that in a way is already
achieved within the European Union
56
00:05:30,570 --> 00:05:35,950
geographically, meaning that we don't want
the borders on the Internet between people
57
00:05:35,950 --> 00:05:40,659
communicating and also delivering goods
and services in the European Union.
58
00:05:40,659 --> 00:05:44,580
And you may ask how that connects with
the terrorist content and how
59
00:05:44,580 --> 00:05:50,091
that connects with today's topic. To
be honest, I am also puzzled because I
60
00:05:50,091 --> 00:05:57,470
think that legislation that talks about
how people communicate online and what is
61
00:05:57,470 --> 00:06:01,650
considered the speech that we want there
and that we don't want, should not be a
62
00:06:01,650 --> 00:06:08,130
part of a framework that is about the market.
So this is also something that
63
00:06:08,130 --> 00:06:18,340
brings a concern. Also, as you've seen on
the first slide, this piece of
64
00:06:18,340 --> 00:06:23,530
legislation, this proposal is called a
regulation. And not to go too much into
65
00:06:23,530 --> 00:06:30,880
details about what are the forms of
legislation in the EU, the important thing
66
00:06:30,880 --> 00:06:37,360
to know here is that the regulation is a
law that once it is adopted by the EU,
67
00:06:37,360 --> 00:06:43,530
once the parliament votes on it,
it is binding directly in
68
00:06:43,530 --> 00:06:48,120
all the member states of the European
Union, which means that there is no
69
00:06:48,120 --> 00:06:52,880
further discussion on how this should be
actually used. Of course, in each country
70
00:06:52,880 --> 00:06:57,510
there are different decisions being made
by different bodies. But it means for us,
71
00:06:57,510 --> 00:07:02,103
the people that work on this and that want
to influence the legislative process, that
72
00:07:02,103 --> 00:07:05,911
once this law is out of Brussels, there
is nothing much to be done
73
00:07:05,911 --> 00:07:13,780
about how it's going to be
implemented. And this is
74
00:07:13,780 --> 00:07:18,780
important because for now, for us,
the discussion
75
00:07:18,780 --> 00:07:23,755
about this is the one that happens in
Brussels. There are a few versions of the
76
00:07:23,755 --> 00:07:29,300
law. And very quickly, European Commission
proposes the law. European Parliament
77
00:07:29,300 --> 00:07:34,390
looks at it, debates it, and then produces
its own version of it. So it amends it or
78
00:07:34,390 --> 00:07:39,196
makes it worse. And then the Council of
the EU, which is the gathering of all the
79
00:07:39,196 --> 00:07:42,760
member states and the
representatives of the governments of the
80
00:07:42,760 --> 00:07:46,449
member states, also creates their own
version. And then, of course, when you
81
00:07:46,449 --> 00:07:49,931
have 3 versions, you also need to have a
lot of conversations and a lot of
82
00:07:49,931 --> 00:07:54,260
negotiation, how to put this together into
one. And all of those bodies have their
83
00:07:54,260 --> 00:07:59,710
own ideas. Every one of those bodies
has its own ideas on how any law should
84
00:07:59,710 --> 00:08:04,950
look. So this process is not only
complicated, but also this negotiation
85
00:08:04,950 --> 00:08:10,950
that is called the trilogues. It's actually
very non-transparent. And there is no, or
86
00:08:10,950 --> 00:08:15,680
almost no, official information about
how those negotiations go, what are the
87
00:08:15,680 --> 00:08:20,530
versions of the document and so on. This
is the part that we are now in. And I will
88
00:08:20,530 --> 00:08:26,000
talk more about this later on. Today I
want to talk to you about the potential
89
00:08:26,000 --> 00:08:31,325
consequences of the version that is the
original one, which is the European
90
00:08:31,325 --> 00:08:36,450
Commission's version. And that's because it
will be very complicated and
91
00:08:36,450 --> 00:08:40,659
confusing I guess, if we look at all of
the proposals that are on the table. But
92
00:08:40,659 --> 00:08:45,184
also, it's important because the European
Commission has a lot of influence also
93
00:08:45,184 --> 00:08:53,530
informally, both on member states and also
- to an extent - on the whole trilogue
94
00:08:53,530 --> 00:08:59,340
process. So whatever gains we have in
other versions or whatever better
95
00:08:59,340 --> 00:09:05,330
solutions we have there, they are not
secure yet. And I promise I'm almost
96
00:09:05,330 --> 00:09:10,520
done with this part. There is other
relevant legislation that we'll
97
00:09:10,520 --> 00:09:15,779
consider. One is the E-Commerce Directive.
And in this, the part that is
98
00:09:15,779 --> 00:09:20,900
very relevant for this particular
conversation is that the platforms,
99
00:09:20,900 --> 00:09:27,710
according to this law, or Internet services
or hosting providers, are not by default
100
00:09:27,710 --> 00:09:32,733
responsible for the content that users
place online. So it's a very
101
00:09:32,733 --> 00:09:38,830
important premise that also protects us,
protects our rights, protects our privacy,
102
00:09:38,830 --> 00:09:47,235
that they cannot
go after us or look for
103
00:09:47,235 --> 00:09:50,970
the content that could be potentially
illegal, which would mean that they would
104
00:09:50,970 --> 00:09:55,540
have to look into everything. But of
course, they have to react when somebody
105
00:09:55,540 --> 00:10:01,420
notifies them and they have to see whether
the information that is placed by the
106
00:10:01,420 --> 00:10:08,027
users should stay up or not. There is also
a directive on combating terrorism. And
107
00:10:08,027 --> 00:10:13,193
this is the piece of legislation that
is quite recent. To my best
108
00:10:13,193 --> 00:10:17,360
knowledge, not all countries in the
European Union, not all member states, have
109
00:10:17,360 --> 00:10:23,110
implemented it yet. So for us, it was also
very puzzling that we actually have a new
110
00:10:23,110 --> 00:10:29,959
law, a new proposal that is talking about
the communication part of what already
111
00:10:29,959 --> 00:10:33,990
has been mentioned in this directive, when
we still don't know how it works. We still
112
00:10:33,990 --> 00:10:39,490
don't know because this law is basically
not being used at all. So this was for
113
00:10:39,490 --> 00:10:46,170
us also difficult to understand why the
commission does not want to wait and see
114
00:10:46,170 --> 00:10:55,025
what comes out of the
directive on combating terrorism. So
115
00:10:55,025 --> 00:11:00,500
why would the European Commission and
the European legislators
116
00:11:00,500 --> 00:11:06,270
actually want such a law that, again,
is about the content that people post
117
00:11:06,270 --> 00:11:17,050
through different services and why this is
an important issue. And why this
118
00:11:17,050 --> 00:11:21,840
issue is actually conflated with
the market questions and the
119
00:11:21,840 --> 00:11:28,650
harmonization in the digital market. So
there are some serious numbers here. 94 %
120
00:11:28,650 --> 00:11:33,610
and 89 %. And I'm wondering if you have
any idea what those numbers are about.
121
00:11:33,610 --> 00:11:36,220
Audience: Persons.
Anna: I'm sorry?
122
00:11:36,220 --> 00:11:41,348
Audience: Persons.
Anna: Yes. It's about people. But the
123
00:11:41,348 --> 00:11:46,810
numbers are actually presenting, so there
was a survey done by Eurostat and those
124
00:11:46,810 --> 00:11:52,420
numbers present the percentage of
people. The first number, 94 %, presents the
125
00:11:52,420 --> 00:11:59,210
percentage of people that say that they
have not come across terrorist content
126
00:11:59,210 --> 00:12:09,100
online. Right. So, inversely, only 6 %
of people actually say that they had
127
00:12:09,100 --> 00:12:13,110
access to terrorist content. It's
important to underline that they say it
128
00:12:13,110 --> 00:12:19,320
because there's no way to check what that
content actually was and of course we can,
129
00:12:19,320 --> 00:12:25,160
you know, here the analogy is what a
certain American judge said about
130
00:12:25,160 --> 00:12:29,420
pornography: "I know it when I see it". It's
not a very good definition of anything,
131
00:12:29,420 --> 00:12:36,460
really. So I would argue that actually
6 % of people being affected by something
132
00:12:36,460 --> 00:12:40,300
is not really a big percentage and that
the European Union actually has bigger
133
00:12:40,300 --> 00:12:45,950
problems to deal with, where they can
spend money and energy. E.g., we are
134
00:12:45,950 --> 00:12:49,978
all affected by, I don't know, air
pollution, and that's many more people.
135
00:12:49,978 --> 00:12:57,261
89% is for people in the age range
between 15 and 24 who, again, were not
136
00:12:57,261 --> 00:13:00,981
affected by something that they would
consider terrorist content. Of course,
137
00:13:00,981 --> 00:13:04,134
won't somebody think of the children?
There you go.
138
00:13:04,134 --> 00:13:08,020
The children and young people also do
not experience
139
00:13:08,020 --> 00:13:15,920
it overwhelmingly.
But this rationale
140
00:13:15,920 --> 00:13:22,570
is being used, the 6 % and 11 %, as one
of the reasons why this regulation is
141
00:13:22,570 --> 00:13:27,380
important, why this law is important. The
other reason
142
00:13:27,380 --> 00:13:32,470
is the exposure to imagery of violent
crimes via social media. So, of course, we
143
00:13:32,470 --> 00:13:38,470
know that platforms such as Facebook
and YouTube contain all sorts of things
144
00:13:38,470 --> 00:13:43,484
that people look at. We also know that
because of their business models, they
145
00:13:43,484 --> 00:13:54,689
sometimes push controversial content or
violent content into the
146
00:13:54,689 --> 00:13:59,570
proposals that
they give to people to watch or to
147
00:13:59,570 --> 00:14:06,029
read. So actually, this 2nd part
is not addressed by this
148
00:14:06,029 --> 00:14:11,880
proposal at all. But nevertheless,
whenever we talk to the representatives of
149
00:14:11,880 --> 00:14:17,060
the commission about why this law is there,
they start waving. That was my experience at
150
00:14:17,060 --> 00:14:21,500
one of the meetings: the person
started waving his phone at me and saying,
151
00:14:21,500 --> 00:14:25,217
"Well, you know, there are beheading
videos online and I can show you how
152
00:14:25,217 --> 00:14:27,790
horrible it is", which I consider
to be an emotional
153
00:14:27,790 --> 00:14:32,112
blackmail at best, but not really a
good regulatory impulse.
154
00:14:32,112 --> 00:14:36,730
So I guess maybe the commission
people are somehow
155
00:14:36,730 --> 00:14:41,630
mysteriously affected by that content more
than anything else. I don't mean to joke
156
00:14:41,630 --> 00:14:49,160
about those videos because of
course, it is not something that I would
157
00:14:49,160 --> 00:14:54,860
want to watch and, and it is very violent.
But I would also argue that the problem is
158
00:14:54,860 --> 00:14:58,974
not that the video is there, but that
somebody has been beheaded. And this is
159
00:14:58,974 --> 00:15:02,710
where we should actually direct our
attention and look for the sources of that
160
00:15:02,710 --> 00:15:08,890
sort of behavior and not only to try and
clean the Internet. The other reason why,
161
00:15:08,890 --> 00:15:16,870
why this law should be enacted is
radicalisation. Of course, this is
162
00:15:16,870 --> 00:15:21,903
a problem for certain
vulnerable populations and people. And we
163
00:15:21,903 --> 00:15:25,970
can read about it a lot. And there are
organizations that are dealing with
164
00:15:25,970 --> 00:15:31,720
strategies to counteract radicalisation.
Again, when we look at the evidence, what
165
00:15:31,720 --> 00:15:38,459
is the relationship between
content that is available online and the
166
00:15:38,459 --> 00:15:42,588
fact that people get radicalized at
different levels, in different ways?
167
00:15:42,588 --> 00:15:46,060
We didn't see any research, and the
commission also did not present any
168
00:15:46,060 --> 00:15:50,870
research that would actually point to at
least a correlation between the two. So
169
00:15:50,870 --> 00:15:56,649
again, asked about, so "How did you come
up with this idea since without really
170
00:15:56,649 --> 00:16:02,825
showing the support for your
claim that radicalisation is connected to
171
00:16:02,825 --> 00:16:08,519
that?" This is a quote from a meeting
that was public and journalists were
172
00:16:08,519 --> 00:16:12,590
there. Again, the person from the
commission said, "We had to make a guess,
173
00:16:12,590 --> 00:16:18,560
so we made the guess that way." There is
the guess being, yes, there is some sort
174
00:16:18,560 --> 00:16:23,501
of connection between the content and the
radicalisation. And then finally, when we
175
00:16:23,501 --> 00:16:27,560
read the impact assessment and when we
look at the different articles or
176
00:16:27,560 --> 00:16:33,027
different explanations that the European
Commission posts about the
177
00:16:33,027 --> 00:16:38,850
rationale for this law, of course, they
bring up the terrorist attacks that have
178
00:16:38,850 --> 00:16:47,300
been happening, and they swiftly
go from naming the different violent
179
00:16:47,300 --> 00:16:52,930
events that have happened in
Europe very recently or quite recently.
180
00:16:52,930 --> 00:16:58,459
And they swiftly make a connection between
the fact that somebody took a truck and
181
00:16:58,459 --> 00:17:05,500
ran into a group of people. Or that
somebody was participating in the shooting
182
00:17:05,500 --> 00:17:10,419
or organizing the shooting of people
enjoying themselves. They swiftly go from
183
00:17:10,419 --> 00:17:15,350
this to the fact that regulation of the
content is needed. Which also, the fact
184
00:17:15,350 --> 00:17:19,456
that you put something in one sentence
does not mean it makes sense. Right? So
185
00:17:19,456 --> 00:17:24,290
this is also not very well documented.
Again, pressed about this, the
186
00:17:24,290 --> 00:17:28,398
representative of the European Commission
said that, well, "We know that and it has
187
00:17:28,398 --> 00:17:33,780
been proven in the investigation, that one
of the people that were responsible for
188
00:17:33,780 --> 00:17:37,758
the Bataclan attack actually used the
Internet before that happened.
189
00:17:37,758 --> 00:17:46,357
laughter
Yes. No more comment needed on that one.
190
00:17:46,357 --> 00:17:52,160
So, well, clearly, there are "very good
reasons", quote unquote, to spend time
191
00:17:52,160 --> 00:17:59,140
and citizens' money on working on
the new law. And I always say that
192
00:17:59,140 --> 00:18:03,102
basically these laws are created
not because there is a reason, but
193
00:18:03,102 --> 00:18:06,439
because there is a do-something-doctrine.
Right? We have a problem, we
194
00:18:06,439 --> 00:18:15,999
have to do something. And this is how this
law, I think, came to be. And the
195
00:18:15,999 --> 00:18:22,990
do-something-doctrine in this particular
case, of course, encompasses a
196
00:18:22,990 --> 00:18:28,330
very broad and blurry definition in that
law. I will talk about this more in a
197
00:18:28,330 --> 00:18:33,620
moment. It also encompasses measures:
if we define something that we want to
198
00:18:33,620 --> 00:18:41,370
counteract, we basically have to say
what should happen, right, so that the
199
00:18:41,370 --> 00:18:46,192
problem is being solved. And there are 3
measures that I will also explain. One
200
00:18:46,192 --> 00:18:50,809
is the removal orders. The other is
referrals. And the third are so-called
201
00:18:50,809 --> 00:18:56,749
proactive measures. This is, I guess, the
part where we touch the prevention
202
00:18:56,749 --> 00:19:06,501
most. And then the third issue,
one of the things I also want to talk
203
00:19:06,501 --> 00:19:10,340
about is the links between the content
that is being removed and the actual
204
00:19:10,340 --> 00:19:14,780
investigations or prosecutions that may
occur, because of course it's possible
205
00:19:14,780 --> 00:19:20,576
that there will be some content found that
actually does document the crime. And
206
00:19:20,576 --> 00:19:32,510
then what do we do about that? So, going
forward, I do think that the definition
207
00:19:32,510 --> 00:19:37,390
in this law, and the law itself, basically
has as its main principle to normalize state
208
00:19:37,390 --> 00:19:44,679
control over how people communicate and
what they want to say. As it was said
209
00:19:44,679 --> 00:19:50,630
before, under the premise of terrorism, we
can actually pack a lot of different
210
00:19:50,630 --> 00:19:56,890
things because people are afraid of this.
And we have also examples from other
211
00:19:56,890 --> 00:20:02,740
topics, other laws that have been debated
in Brussels. One was public sector
212
00:20:02,740 --> 00:20:10,080
information directive, where everybody was
very happy discussing how much public
213
00:20:10,080 --> 00:20:13,980
information should be released and where
it should come from and how people should
214
00:20:13,980 --> 00:20:18,980
have access to it. And a part of public
information is the information that is
215
00:20:18,980 --> 00:20:23,760
produced by companies that perform public
services, but they may also be private,
216
00:20:23,760 --> 00:20:28,027
for example, public
transport is sometimes provided that way. And
217
00:20:28,027 --> 00:20:31,929
actually public transport providers were
the ones that were saying that they cannot
218
00:20:31,929 --> 00:20:36,549
release the information that they have,
namely timetables and other
219
00:20:36,549 --> 00:20:44,270
information about how the system
works that could be useful for citizens
220
00:20:44,270 --> 00:20:49,350
because then it may be used by terrorists.
I guess that maybe prevents the potential
221
00:20:49,350 --> 00:20:53,510
terrorists from going from bus stop to bus
stop and figuring out how the buses go.
222
00:20:53,510 --> 00:20:57,530
But we already know that this does not
work that way. So this is something
223
00:20:57,530 --> 00:21:04,040
that actually normalizes this approach.
And let's first look at the definition of
224
00:21:04,040 --> 00:21:09,899
the proposal as presented by the
European Commission. So they say
225
00:21:09,899 --> 00:21:14,120
basically, let me read: "Terrorist content
means one or more of the following
226
00:21:14,120 --> 00:21:20,720
information. So a) inciting or advocating,
including by glorifying, the commission of
227
00:21:20,720 --> 00:21:25,990
terrorist offences". I do apologise for
the horrible level of English
228
00:21:25,990 --> 00:21:32,040
that they use, I don't know why. And I
don't apologise for them, but for the fact
229
00:21:32,040 --> 00:21:35,960
that they expose you to it. "The
commission of terrorist offences,
230
00:21:35,960 --> 00:21:40,240
clapping
thereby causing a danger that such acts be
231
00:21:40,240 --> 00:21:44,650
committed". You won't believe how many
times I had to read all this to actually
232
00:21:44,650 --> 00:21:48,525
understand what all those things mean.
"Encouraging the contribution to terrorist
233
00:21:48,525 --> 00:21:56,749
offences". So contribution could be money,
could be, I guess, material resources.
234
00:21:56,749 --> 00:22:00,679
"Promoting the activities of a terrorist
group, in particular by encouraging the
235
00:22:00,679 --> 00:22:05,702
participation in or support to a
terrorist group. Instructing on methods or
236
00:22:05,702 --> 00:22:10,076
techniques for the purpose of committing
terrorist offenses". And then there is
237
00:22:10,076 --> 00:22:16,230
also the definition of "dissemination of
terrorist content". That basically means
238
00:22:16,230 --> 00:22:20,490
"making terrorist content available to
third parties on the hosting service
239
00:22:20,490 --> 00:22:26,830
providers services". As you can probably
see, the dissemination and the fact that
240
00:22:26,830 --> 00:22:33,360
third parties are invoked mean that this
law is super broad. So it's not only about
241
00:22:33,360 --> 00:22:38,200
social media because making content
available to third parties may mean
242
00:22:38,200 --> 00:22:43,290
that I am sharing something over some sort
of service with my mom and she is a third
243
00:22:43,290 --> 00:22:48,860
party in the understanding of this law. So
we were actually super troubled to see
244
00:22:48,860 --> 00:22:54,350
that not only does it encompass services
that make information available to the
245
00:22:54,350 --> 00:22:58,809
public, the ones that we all can see,
like social media, but also that
246
00:22:58,809 --> 00:23:04,530
potentially it could be used against
services that let people communicate
247
00:23:04,530 --> 00:23:09,810
privately. So that is a big
issue. The second thing I wanted to direct
248
00:23:09,810 --> 00:23:17,660
your attention to is the parts that
I put in italics. It's how soft those
249
00:23:17,660 --> 00:23:25,178
concepts are: inciting, advocating,
glorifying, encouraging, promoting. This
250
00:23:25,178 --> 00:23:29,470
is a law that actually potentially can
really influence how we talk and how we
251
00:23:29,470 --> 00:23:33,610
communicate, what we want to talk about,
whether we agree or disagree with certain
252
00:23:33,610 --> 00:23:41,572
policies or certain political decisions.
And all those things are super soft. And
253
00:23:41,572 --> 00:23:47,090
it's very, very hard to say what
they really mean. And I want to
254
00:23:47,090 --> 00:23:53,679
give you an example of the same
content used in 3 different cases to
255
00:23:53,679 --> 00:23:59,820
illustrate this. So let's imagine we have
a group of people that recorded a video
256
00:23:59,820 --> 00:24:03,950
and in those videos, they
say that, well, basically they call
257
00:24:03,950 --> 00:24:10,650
themselves terrorists, to make it easier,
and they say that they want to
258
00:24:10,650 --> 00:24:16,820
commit all sorts of horrible things in
specific places, so that constitutes like
259
00:24:16,820 --> 00:24:21,702
some sort of a credible threat. And they
also brag that they killed someone. And
260
00:24:21,702 --> 00:24:25,401
they also say that they're super happy
about this and so on. And they also, of
261
00:24:25,401 --> 00:24:29,620
course, encourage others to join them and
so on and so on. And the 3 cases would be:
262
00:24:29,620 --> 00:24:35,871
1 would be that this particular group
posted videos on, I don't know, their
263
00:24:35,871 --> 00:24:41,340
YouTube channel. The other case would be
that there is a media outlet that reports
264
00:24:41,340 --> 00:24:47,265
on it and either links to this video or
maybe presents snippets of it. And the
265
00:24:47,265 --> 00:24:51,480
third case would be, for example, that
there is some sort of group that is
266
00:24:51,480 --> 00:24:57,370
actually following what's happening in
that region and collects evidence that can
267
00:24:57,370 --> 00:25:01,020
then help identify the people and
prosecute them for the crimes they commit.
268
00:25:01,020 --> 00:25:07,975
Like the crime that our
exemplary terrorists admitted to
269
00:25:07,975 --> 00:25:13,400
committing. Do you think that,
according to this definition, in your
270
00:25:13,400 --> 00:25:18,347
opinion, there is a
difference between those 3 types of
271
00:25:18,347 --> 00:25:23,020
presenting that content: between the
terrorist group that is presenting it on
272
00:25:23,020 --> 00:25:27,919
their channel, between the media outlet
and between the activists? There is none.
273
00:25:27,919 --> 00:25:34,660
Because this law does not
define in any way that the so-called
274
00:25:34,660 --> 00:25:42,179
terrorist content is something that is
published with the intention of actually
275
00:25:42,179 --> 00:25:48,980
advocating and glorifying. So the problem
is that this covers not only content that,
276
00:25:48,980 --> 00:25:54,226
let's say, we may call
manifestly illegal. So somebody kills
277
00:25:54,226 --> 00:25:58,706
someone and is being recorded and we know
it's a crime and perhaps we don't want to
278
00:25:58,706 --> 00:26:02,316
watch it, although I do think that we
should also have a discussion in our
279
00:26:02,316 --> 00:26:08,350
society about what we want to see and
what we don't want to see.
280
00:26:08,350 --> 00:26:12,780
From the perspective that
the world is complicated and we may
281
00:26:12,780 --> 00:26:17,480
have the right to access all sorts of
information, even information that is not so
282
00:26:17,480 --> 00:26:23,030
pleasant and not so easy to digest. So
this law does not make this
283
00:26:23,030 --> 00:26:28,030
differentiation. There is no mention that
this should be intentional to qualify
284
00:26:28,030 --> 00:26:34,600
as so-called terrorist
content. And that's a big problem. So as
285
00:26:34,600 --> 00:26:38,810
you can see, there is a fallacy in this
narrative, because it will be the member
286
00:26:38,810 --> 00:26:45,809
states and their so-called competent
authorities that will be deciding what the
287
00:26:45,809 --> 00:26:54,460
terrorist content is. And, of course,
Europeans have a tendency to
288
00:26:54,460 --> 00:26:59,900
think of
ourselves as the societies and the
289
00:26:59,900 --> 00:27:07,160
nations and the countries that champion
the rule of law and that and that actually
290
00:27:07,160 --> 00:27:13,700
respect fundamental rights and
respect freedom of speech. But we also
291
00:27:13,700 --> 00:27:18,490
know that this is changing rapidly. And I
also will show you examples of how that
292
00:27:18,490 --> 00:27:24,510
changes in this area that we're talking
about right now. So I do not have
293
00:27:24,510 --> 00:27:29,546
great trust in European governments
making the correct judgment about
294
00:27:29,546 --> 00:27:45,421
that. So we have this category of very
dubious and very broad terrorist
295
00:27:45,421 --> 00:27:56,250
content. And then, how is it
being done? Basically, all that
296
00:27:56,250 --> 00:28:01,169
power to decide how
to deal with that content is actually
297
00:28:01,169 --> 00:28:04,880
outsourced to private actors. So the
platforms that we are talking about
298
00:28:04,880 --> 00:28:09,450
become kind of mercenaries, because
both the commission and I guess many
299
00:28:09,450 --> 00:28:13,399
member states say, well, it's not possible
that the judge will actually look through
300
00:28:13,399 --> 00:28:18,169
content that is placed online and give,
you know, proper judicial decisions about
301
00:28:18,169 --> 00:28:21,951
what constitutes freedom of
expression and what goes beyond it.
302
00:28:21,951 --> 00:28:29,425
Because it hurts other people or
is basically a proof of something illegal.
303
00:28:29,425 --> 00:28:32,900
So the platforms will take those
decisions. This will be the hosting
304
00:28:32,900 --> 00:28:39,870
service providers, as I mentioned. And
then also a lot of the reliance that they
305
00:28:39,870 --> 00:28:44,074
will do it right is put into the wishful
thinking in this proposal that says, well,
306
00:28:44,074 --> 00:28:47,855
basically, you have to put in your terms of
service that you will not host terrorist
307
00:28:47,855 --> 00:28:52,550
content. So then again, there is a
layer in there where the platform,
308
00:28:52,550 --> 00:29:01,234
let's say Facebook or Twitter or
anyone else, actually decides what and how
309
00:29:01,234 --> 00:29:06,890
they want to deal with that in detail.
Also, one thing I didn't mention is that
310
00:29:06,890 --> 00:29:10,880
looking at this regulation and looking at
which platforms should basically
311
00:29:10,880 --> 00:29:15,513
have those terms of service, we realized
that Wikimedia, that actually our platforms,
312
00:29:15,513 --> 00:29:21,210
will actually be in the scope of that. So
not only may that affect the way we can
313
00:29:21,210 --> 00:29:29,340
document and reference the articles that
are appearing on Wikipedia, on
314
00:29:29,340 --> 00:29:34,090
the events that are
described or the groups or the political
315
00:29:34,090 --> 00:29:39,990
situation and whatnot. But also that, you
know, our community of editors will have
316
00:29:39,990 --> 00:29:44,792
less and less to say if we have to put a
lot of emphasis on terms of service. I
317
00:29:44,792 --> 00:29:49,530
just think that we are somehow
collateral damage of this. But also this
318
00:29:49,530 --> 00:29:54,860
doesn't console me much because, of
course, the Internet is bigger than our
319
00:29:54,860 --> 00:30:00,379
projects. And also, we want to make sure
that, that content is not being removed
320
00:30:00,379 --> 00:30:07,845
elsewhere. So basically the 3 measures are
the removal orders, as I mentioned. And
321
00:30:07,845 --> 00:30:11,690
this is something that is fairly
straightforward. And actually, I'm
322
00:30:11,690 --> 00:30:15,830
wondering why there has to be a special
law to bring it into being,
323
00:30:15,830 --> 00:30:20,410
because the removal order is basically a
decision that the competent authority in
324
00:30:20,410 --> 00:30:24,950
the member state releases and sends to
the platform. The problem is that
325
00:30:24,950 --> 00:30:30,204
according to the commission, the platform
should actually act on it within 1 hour. And
326
00:30:30,204 --> 00:30:34,586
then again, we ask them why 1 hour and not
74 minutes? And they say, "Well, because
327
00:30:34,586 --> 00:30:39,705
we actually know", I don't know how, but
they say they do. Let's take it at face
328
00:30:39,705 --> 00:30:46,250
value. "We actually know that the content
is the most, you know, viral and spreads
329
00:30:46,250 --> 00:30:50,602
the fastest, has the biggest reach,
within the 1 hour from appearance". And
330
00:30:50,602 --> 00:30:54,475
then we ask them "Well, but how can you
know that? Actually, the people that find
331
00:30:54,475 --> 00:30:59,800
the content find it exactly at the moment
when it comes up? Maybe it has been around
332
00:30:59,800 --> 00:31:05,940
for 2 weeks and this 1 hour window when it
went really viral is like long gone". And
333
00:31:05,940 --> 00:31:11,730
here they don't really answer, obviously.
So this is one of the measures
334
00:31:11,730 --> 00:31:17,090
that I guess makes the most sense out of
all of that. Then we have the referrals
335
00:31:17,090 --> 00:31:20,100
that we call lazy removal
orders. And this is really
336
00:31:20,100 --> 00:31:23,834
something that is very puzzling for me
because the referral is a situation in
337
00:31:23,834 --> 00:31:28,090
which this competent authority and the
person working there goes through the
338
00:31:28,090 --> 00:31:33,480
content, through the videos or postings
and looks at it and says, "Well, I think,
339
00:31:33,480 --> 00:31:38,908
it's against the terms of service
of this platform", but does not actually
340
00:31:38,908 --> 00:31:43,740
release this removal order, but writes to
the platform, lets them know and says,
341
00:31:43,740 --> 00:31:48,809
"Hey, can you please check this out?" I'm
sorry, I'm confused, is this the time that
342
00:31:48,809 --> 00:31:56,223
I have left or the time? OK, good, time is
important here. So basically, you know,
343
00:31:56,223 --> 00:32:00,250
they won't spend the time
to prepare this removal order
344
00:32:00,250 --> 00:32:05,840
and write it, and
tell the platform actually to remove it.
345
00:32:05,840 --> 00:32:09,490
But they will just ask them to please
verify whether this content should be
346
00:32:09,490 --> 00:32:14,821
there or not. And first of all, this is
the real outsourcing of power
347
00:32:14,821 --> 00:32:20,063
over speech and expression. But
also we know how platforms take those
348
00:32:20,063 --> 00:32:26,240
decisions. They have a very short time.
The people that do it are sitting
349
00:32:26,240 --> 00:32:30,650
somewhere, most probably not where the
content originates from. So they don't
350
00:32:30,650 --> 00:32:34,418
understand the context. Sometimes they
don't understand the language. And also,
351
00:32:34,418 --> 00:32:38,062
you know, it's better to get rid of it
just in case it really is problematic,
352
00:32:38,062 --> 00:32:42,240
right? So this is something that has
completely increased this gray area
353
00:32:42,240 --> 00:32:51,559
of information that is controversial
enough to be flagged, but it's not illegal
354
00:32:51,559 --> 00:32:57,070
enough to be removed by the order. By the
way, the European Parliament actually
355
00:32:57,070 --> 00:33:03,200
kicked this out of their version. So now
the fight is in this negotiation between
356
00:33:03,200 --> 00:33:07,598
the 3 institutions to actually follow this
recommendation and just remove it, because
357
00:33:07,598 --> 00:33:13,230
it really does not make sense. And it
really makes the people that
358
00:33:13,230 --> 00:33:17,700
release those referrals not really
accountable for their decisions
359
00:33:17,700 --> 00:33:22,449
because they don't take the decision. They
just make a suggestion. And then we have
360
00:33:22,449 --> 00:33:26,629
the proactive measures, which most
definitely will lead to over-policing of
361
00:33:26,629 --> 00:33:31,799
content. There is a very clever
description in the law that basically
362
00:33:31,799 --> 00:33:35,940
boils down to the point that if you are
going to use content filtering and if
363
00:33:35,940 --> 00:33:40,677
you're going to prevent content from
reappearing, then basically you are
364
00:33:40,677 --> 00:33:44,320
doing a good job as a platform. And
this is the way to actually deal with
365
00:33:44,320 --> 00:33:50,648
terrorist content. Since, however we
define it, again, this is very
366
00:33:50,648 --> 00:33:56,079
context-oriented, very context-dependent,
it's really very difficult to say based on
367
00:33:56,079 --> 00:33:59,829
what sort of criteria and based on
what sort of databases those
368
00:33:59,829 --> 00:34:05,242
automated processes will be
happening. So, of course,
369
00:34:05,242 --> 00:34:10,159
as it happens in today's world,
somebody privatizes
370
00:34:10,159 --> 00:34:17,299
the profits, but the losses are
always socialized. And this is no
371
00:34:17,299 --> 00:34:23,849
exception to that rule. So, again, when
we were talking to the European Commission
372
00:34:23,849 --> 00:34:29,320
and asking them, "Why is this not a piece
of legislation that belongs to the
373
00:34:29,320 --> 00:34:36,560
enforcement of the law?" And that is then
not controlled by a heavily by the
374
00:34:36,560 --> 00:34:40,620
judiciary system and by the other sorts of
oversight that enforcement usually has?"
375
00:34:40,620 --> 00:34:44,460
"They have, well, because, you know,
when we have those videos of beheadings,
376
00:34:44,460 --> 00:34:48,334
they usually don't happen in Europe and
they are really beyond our jurisdiction".
377
00:34:48,334 --> 00:34:51,628
So, of course, nobody will act on it
on the very meaningful level of
378
00:34:51,628 --> 00:34:54,259
actually finding the people
that are killing,
379
00:34:54,259 --> 00:34:57,129
that are in the business of
killing others and making
380
00:34:57,129 --> 00:35:02,873
sure they cannot continue with this
activity. So it's very clear that this
381
00:35:02,873 --> 00:35:08,470
whole law is about cleaning the Internet
and not really about meaningfully
382
00:35:08,470 --> 00:35:18,500
tackling societal problems that lead to
that sort of violence. Also the
383
00:35:18,500 --> 00:35:22,540
redress, which is the mechanism in which
the user can say, hey, this is not the
384
00:35:22,540 --> 00:35:26,459
right decision. I actually believe this
content is not illegal at all. And it's
385
00:35:26,459 --> 00:35:31,397
important for me to say this and this is
my right and I want it to be up. Those
386
00:35:31,397 --> 00:35:38,570
provisions are very weak. You cannot
actually protest meaningfully against a
387
00:35:38,570 --> 00:35:42,670
removal order of your content. Of course,
you can always take the states to
388
00:35:42,670 --> 00:35:47,640
court. But we know how amazingly
interesting that is and how fast it
389
00:35:47,640 --> 00:35:53,030
happens. So I think we
can agree that there is no meaningful way
390
00:35:53,030 --> 00:36:00,140
to actually protest. Also, the state may
ask, well, actually, this removal
391
00:36:00,140 --> 00:36:05,410
order should... the user should not be
informed that the content has
392
00:36:05,410 --> 00:36:10,690
been taken down because of terrorism,
or depicting terrorism or glorifying
393
00:36:10,690 --> 00:36:16,193
or whatever. So you may not even know
why the content is taken down. It will be
394
00:36:16,193 --> 00:36:22,013
a secret. For referrals and for proactive
measures, well, you know what? Go talk to
395
00:36:22,013 --> 00:36:26,535
the platform and protest with them. And
then, of course, the other question is, so
396
00:36:26,535 --> 00:36:30,506
who is the terrorist? Right. Because this
is a very important question that we
397
00:36:30,506 --> 00:36:37,589
should have answered if
we wanted to have a law that actually is
398
00:36:37,589 --> 00:36:42,330
meaningfully engaging with those issues.
And of course, as you know
399
00:36:42,330 --> 00:36:48,591
already from what I said, the European
Commission in that particular case does
400
00:36:48,591 --> 00:36:54,398
not provide a very good answer. But we
have some other responses to that. For
401
00:36:54,398 --> 00:37:03,367
example, Europol has created a report and
then there was a blog post based on that,
402
00:37:03,367 --> 00:37:07,622
On the title "On the importance of taking
down non-violent terrorist content".
403
00:37:07,622 --> 00:37:11,428
So we have the European Commission that
says, "Yes, it's about the beheadings and
404
00:37:11,428 --> 00:37:15,576
about the mutilations". And we have
Europol that says, "You know, actually
405
00:37:15,576 --> 00:37:19,879
this non-violent terrorist content is
super important". So basically what they
406
00:37:19,879 --> 00:37:25,459
say and I quote, "Poetry is a literary
medium that is widely appreciated across
407
00:37:25,459 --> 00:37:30,457
the Arab world and is an important part
of their region's identity. Mastering it
408
00:37:30,457 --> 00:37:34,810
provides the poet with singular authority
in Arabic culture. The most prominent
409
00:37:34,810 --> 00:37:39,660
jihadi leaders - including Osama bin Laden
and former Islamic State spokesman
410
00:37:39,660 --> 00:37:43,196
Abu Muhammad al-Adnani - frequently
included poetry in their
411
00:37:43,196 --> 00:37:46,447
speeches or wrote poems of their own.
Their charisma was
412
00:37:46,447 --> 00:37:49,380
closely intertwined with their
mastery of poetry."
413
00:37:49,380 --> 00:37:54,810
So we can see the link that is being made
by Europol between a very important aspect
414
00:37:54,810 --> 00:37:59,260
of a culture that is beautiful and
enriching, and the fact that
415
00:37:59,260 --> 00:38:03,460
Europol wants to see it
weaponized. The other part of the blogpost
416
00:38:03,460 --> 00:38:08,369
was about how ISIS presents interesting
activities that their members, their
417
00:38:08,369 --> 00:38:12,531
fighters, have. And one of
them is that they are enjoying themselves
418
00:38:12,531 --> 00:38:16,936
and smiling and spending time together and
swimming. So what do we
419
00:38:16,936 --> 00:38:20,228
make out of that? So videos of some
brown people swimming are now
420
00:38:20,228 --> 00:38:27,760
terrorist content? The blatant
racism of this communication
421
00:38:27,760 --> 00:38:33,359
really enrages me. And I think it's really
a shame that nobody called Europol
422
00:38:33,359 --> 00:38:39,530
out on this, when the blogpost came up. We
also have laws in Europe that are
423
00:38:39,530 --> 00:38:44,770
different. I mean, this is not the same
legislation, but they actually give
424
00:38:44,770 --> 00:38:51,466
a taste of what may happen. One
is the Spanish law against hate
425
00:38:51,466 --> 00:38:57,111
speech. And this is an important
part: it didn't happen online, but it
426
00:38:57,111 --> 00:39:02,500
shows the approach, that basically first
you have legislators that say, oh, don't
427
00:39:02,500 --> 00:39:06,450
worry about this, we really want to go
after bad guys. And then what happens is
428
00:39:06,450 --> 00:39:10,930
that there was a puppeteer performance
done by 2 people, "The Witch and Don
429
00:39:10,930 --> 00:39:15,840
Christóbal" and the puppets were
actually... this is the kind of
430
00:39:15,840 --> 00:39:22,890
Punch and Judy performance, which
is a genre of theatrical
431
00:39:22,890 --> 00:39:30,109
performances, that is kind of
full of silly jokes and sometimes
432
00:39:30,109 --> 00:39:36,000
excessive and unjustified violence,
and full of bad taste.
433
00:39:36,000 --> 00:39:40,860
And this is quite serious. And the 2
puppets held a
434
00:39:40,860 --> 00:39:46,400
banner that featured a made-up terrorist
organization. And after that performance,
435
00:39:46,400 --> 00:39:52,410
actually, they were charged with, first of
all, promoting terrorism, even though
436
00:39:52,410 --> 00:39:55,940
there is no terrorist organization like
that. And then also with inciting,
437
00:39:55,940 --> 00:40:01,859
hatred. And this is what one of
the puppeteers said after describing this
438
00:40:01,859 --> 00:40:07,270
whole horrible experience. Finally, the
charges were dropped. So this is good. But
439
00:40:07,270 --> 00:40:11,570
I think this really sums up who the
terrorists are and how those laws are being
440
00:40:11,570 --> 00:40:20,103
used against people who actually have
nothing to do with violence. "We were
441
00:40:20,103 --> 00:40:23,981
charged with inciting hatred, which is a
felony created in theory to protect
442
00:40:23,981 --> 00:40:27,228
vulnerable minorities, the minorities in
this case were the church,
443
00:40:27,228 --> 00:40:29,868
the police and the legal system."
laughter
444
00:40:29,868 --> 00:40:33,569
Then, again in Spain, I don't want to
single out this beautiful country, but
445
00:40:33,569 --> 00:40:36,998
actually, unfortunately, they
have good examples. This is a very recent
446
00:40:36,998 --> 00:40:44,310
one. So Tsunami Democràtic in Catalonia
created an app to actually help people
447
00:40:44,310 --> 00:40:49,770
organize small actions in a decentralized
manner. And they placed the documentation
448
00:40:49,770 --> 00:40:56,609
on GitHub. And it was taken down by the
order of the Spanish court. And also
449
00:40:56,609 --> 00:41:01,463
the - and this is the practical
application of such laws online - also,
450
00:41:01,463 --> 00:41:06,600
the website of Tsunami Democràtic
was taken down by the court. Of course,
451
00:41:06,600 --> 00:41:10,580
both of them on charges
of facilitating terrorist
452
00:41:10,580 --> 00:41:15,790
activities and incitement to
terrorism. So why is it important?
453
00:41:15,790 --> 00:41:19,776
Because of what comes next. So
there will be the Digital Services Act,
454
00:41:19,776 --> 00:41:23,651
which will be an overhaul of this idea
that I mentioned at the beginning, which
455
00:41:23,651 --> 00:41:27,850
is that basically platforms are not
responsible by default for what we put
456
00:41:27,850 --> 00:41:34,250
online. And the European Commission and
other
457
00:41:34,250 --> 00:41:38,070
actors in the EU are toying with the idea
that maybe platforms should be somehow
458
00:41:38,070 --> 00:41:43,800
responsible, of course. And it's not
only about social media, but basically
459
00:41:43,800 --> 00:41:51,180
any sort of a service
that helps people place content online.
460
00:41:51,180 --> 00:41:54,900
And then, one of the ideas, we
don't know what it's going to be, it's not
461
00:41:54,900 --> 00:41:58,176
there yet. It's going to happen at the
beginning of the next year, so
462
00:41:58,176 --> 00:42:01,864
quite soon. But we can actually
expect that the so-called "Good Samaritan"
463
00:42:01,864 --> 00:42:05,704
rule will be 1 of the solutions proposed.
What is this rule? This rule basically
464
00:42:05,704 --> 00:42:11,090
means if a platform is really going the
extra mile and doing a good job in
465
00:42:11,090 --> 00:42:16,851
removing the content that
is either illegal or, again, a
466
00:42:16,851 --> 00:42:22,987
very difficult category, harmful. I also
don't know what that exactly means. Then
467
00:42:22,987 --> 00:42:27,758
if they behave well, then they will not be
held responsible. So this is basically a
468
00:42:27,758 --> 00:42:31,838
proposal that you cannot really turn down,
because if you run a business, you want
469
00:42:31,838 --> 00:42:35,675
to manage the risk of that and you don't
want to be fined. And you don't
470
00:42:35,675 --> 00:42:40,070
want to pay money. So, of course, you
try and overpolice and of course you try
471
00:42:40,070 --> 00:42:44,120
and you filter the content and of course
you take content down when it only raises a
472
00:42:44,120 --> 00:42:52,320
question of what sort of
content it is. Is it neutral or
473
00:42:52,320 --> 00:43:00,330
is it maybe, you know, making somebody
offended or stirred? And, of course,
474
00:43:00,330 --> 00:43:05,480
other attempts, we heard it from Germany,
which is basically that there was a
475
00:43:05,480 --> 00:43:15,220
proposal to actually
oblige platforms to give the passwords
476
00:43:15,220 --> 00:43:21,193
of social media users, the people that
are under investigation or prosecution.
477
00:43:21,193 --> 00:43:26,829
And also, of course, we see that one of
the ideas that supposedly is going to fix
478
00:43:26,829 --> 00:43:31,677
everything is that, well, if terrorists
communicate through encrypted services,
479
00:43:31,677 --> 00:43:35,485
then maybe we should do something about
encryption. And there was a petition
480
00:43:35,485 --> 00:43:41,890
already on (?) to
actually forbid encryption for those
481
00:43:41,890 --> 00:43:46,540
services after one of the
terrorist attacks. So, of course, it
482
00:43:46,540 --> 00:43:53,109
sounds very extreme. But
this is, in my opinion, the next
483
00:43:53,109 --> 00:44:00,383
frontier here. So what can we do? Because
this is all quite difficult. So as I
484
00:44:00,383 --> 00:44:04,920
mentioned, the negotiations are still on.
So there is still time to talk to
485
00:44:04,920 --> 00:44:08,880
your government. And this is very important
because, of course, the governments, when
486
00:44:08,880 --> 00:44:12,850
they have this
proposal on the table, they will be
487
00:44:12,850 --> 00:44:18,360
able to decide finally who is
the terrorist and what is the terrorist
488
00:44:18,360 --> 00:44:23,000
content. That's on one hand. On
the other hand, they know that people
489
00:44:23,000 --> 00:44:26,650
don't really care all that much about what
happens in the E.U., which is
490
00:44:26,650 --> 00:44:31,383
unfortunately true. They are actually
supporting very much the commission's
491
00:44:31,383 --> 00:44:35,188
proposals. The only thing that they don't
like is the fact that somebody from the
492
00:44:35,188 --> 00:44:41,710
police, from other country can maybe
interfere with content in their language,
493
00:44:41,710 --> 00:44:46,570
because that's one of the provisions that
is also there. So this is what
494
00:44:46,570 --> 00:44:50,950
they don't like. They want to keep
the territoriality of their
495
00:44:50,950 --> 00:44:56,701
enforcement laws intact. But there is
still time and we can still do this. And
496
00:44:56,701 --> 00:45:00,220
if you want to talk to me about
what are the good ways to do it, I'm
497
00:45:00,220 --> 00:45:04,778
available here. And I would love to take
that conversation up with you. The other
498
00:45:04,778 --> 00:45:11,374
is a very simple measure that I believe
always works. It's one that basically
499
00:45:11,374 --> 00:45:16,119
is about telling just 1 friend, even 1
friend, and asking them to do the same, to
500
00:45:16,119 --> 00:45:20,134
talk to other people about this. And there
are 2 reasons to do it. One is because, of
501
00:45:20,134 --> 00:45:23,666
course, then we make people aware of what
is happening. And the other, in this
502
00:45:23,666 --> 00:45:29,350
particular case that is very important is
that basically people are scared of
503
00:45:29,350 --> 00:45:33,284
terrorism and, and they support a lot of
measures just because they hear this word.
504
00:45:33,284 --> 00:45:36,999
And when we explain what that
really means and when we unpack this a
505
00:45:36,999 --> 00:45:40,392
little bit, we build resilience to
those arguments. And I think it's
506
00:45:40,392 --> 00:45:43,017
important. The other people
who should know about this
507
00:45:43,017 --> 00:45:46,377
are activists working with
vulnerable groups because of the
508
00:45:46,377 --> 00:45:50,355
stigmatization that I
already mentioned and because
509
00:45:50,355 --> 00:45:54,804
of the fact that we need to document
horrible things that are happening to
510
00:45:54,804 --> 00:45:58,703
people in other places in the world and
also here in Europe. And journalists
511
00:45:58,703 --> 00:46:02,824
and media organizations, because they
will be affected by this law, in the
512
00:46:02,824 --> 00:46:06,918
way they can report and where they
can get the sources for their
513
00:46:06,918 --> 00:46:12,025
information. So I think I went massively
over the time that was planned. I hope
514
00:46:12,025 --> 00:46:16,760
we can still have some questions. Thank
you. So, yeah. Talk to me more about this
515
00:46:16,760 --> 00:46:23,058
now and then after the talk. Thank you.
516
00:46:23,058 --> 00:46:33,057
applause
517
00:46:33,057 --> 00:46:37,310
Herald: Thanks for your talk. We still
have time for questions, so please, if you
518
00:46:37,310 --> 00:46:42,600
have a question, line up at the mics. We
have 1, 2, 3 evenly distributed through
519
00:46:42,600 --> 00:46:47,231
the room. I want to remind you really
quickly that a question normally is one
520
00:46:47,231 --> 00:46:50,520
sentence and ends with a question mark.
laughter
521
00:46:50,520 --> 00:46:55,530
Not everybody seems to know that. So we
start with mic number 2.
522
00:46:55,530 --> 00:47:02,460
Mic2: Hello. So I run Tor relays in
the United States. It seems like a lot of
523
00:47:02,460 --> 00:47:07,570
these laws are focused on the notion of
centralized platforms. Do they define what
524
00:47:07,570 --> 00:47:12,400
a platform is, and are they going to
extradite me because I am facilitating a Tor
525
00:47:12,400 --> 00:47:16,249
onion service?
A: Should we answer now?
526
00:47:16,249 --> 00:47:21,895
H: Yeah.
A: Okay, yes, so they do and they don't
527
00:47:21,895 --> 00:47:26,000
in a way, because the definition is
based on basically what
528
00:47:26,000 --> 00:47:30,839
the hosting provider
is in European law, which is
529
00:47:30,839 --> 00:47:36,089
actually very broad. So it doesn't take
into account how big you are or
530
00:47:36,089 --> 00:47:42,420
how you run your services. The bottom line
is that if you allow people to put content
531
00:47:42,420 --> 00:47:47,161
up and share it with, again, a 3rd party,
which may be the whole room here, it may
532
00:47:47,161 --> 00:47:51,460
be the whole world, but it may be just the
people I want to share things with.
533
00:47:51,460 --> 00:47:57,750
Then you're obliged to comply with
534
00:47:57,750 --> 00:48:01,940
the measures that are envisioned in this
regulation. And there's a
535
00:48:01,940 --> 00:48:06,590
debate also in the parliament. It was
taken up and narrowed down actually to the
536
00:48:06,590 --> 00:48:11,520
communication to the public. So I guess
then as you correctly observe, it is more
537
00:48:11,520 --> 00:48:17,130
about the big platforms or about the
centralized services. But actually, in
538
00:48:17,130 --> 00:48:20,849
the commission version, nothing makes me
believe that only they will be
539
00:48:20,849 --> 00:48:26,358
affected. On the contrary, also the
messaging services may be.
540
00:48:26,358 --> 00:48:34,880
H: Okay, next question, mic number 3.
Mic3: With the upload
541
00:48:34,880 --> 00:48:41,250
filters, in the copyright directive, it was
a really similar debate, especially on
542
00:48:41,250 --> 00:48:46,589
small companies, because at that
time they tried to push
543
00:48:46,589 --> 00:48:51,010
upload filters for copyright content. And
the question was, how does that fit
544
00:48:51,010 --> 00:48:55,871
with small companies? And they still
haven't provided an answer to that.
545
00:48:55,871 --> 00:48:59,560
The problem is they took the copyright
directive, basically took inspiration
546
00:48:59,560 --> 00:49:04,041
from the upload filters, and
applied them to terrorist content. And it's
547
00:49:04,041 --> 00:49:07,928
again, the question, how does that work
with small Internet companies that have to
548
00:49:07,928 --> 00:49:13,788
have someone on call during the
nights and things like that. So even big
549
00:49:13,788 --> 00:49:17,380
providers, I heard, don't have the
means to properly enforce
550
00:49:17,380 --> 00:49:22,569
something like this. This is a
killer for the European Internet industry.
551
00:49:22,569 --> 00:49:26,060
A: Yes.
laughter
552
00:49:26,060 --> 00:49:32,230
applause
H: I want to give a short reminder on the
553
00:49:32,230 --> 00:49:39,339
1 sentence rule. We have a question from
the Internet. Signal angel, please.
554
00:49:39,339 --> 00:49:44,696
Signal Angel: Yes, the question is:
wouldn't decentralized social networks
555
00:49:44,696 --> 00:49:52,429
bypass these regulations?
A: I'm not a lawyer, but I will give
556
00:49:52,429 --> 00:49:55,902
an answer to this
question that a lawyer would give,
557
00:49:55,902 --> 00:49:58,709
maybe I spent too much time with lawyers.
That depends.
558
00:49:58,709 --> 00:50:01,220
laughter
A: Because it really does, because this
559
00:50:01,220 --> 00:50:05,800
definition of who is obliged is so broad
that a lot depends on the context, a lot
560
00:50:05,800 --> 00:50:10,940
depends on what is happening, what is
being shared and how. So it's very
561
00:50:10,940 --> 00:50:14,718
difficult to say. I just want to say that
we also had this conversation about
562
00:50:14,718 --> 00:50:20,489
copyright and many people came to me last
year at Congress. I wasn't giving a talk
563
00:50:20,489 --> 00:50:24,806
about it, but I was at the talk about the
copyright directive and the filtering. And
564
00:50:24,806 --> 00:50:28,649
many people said, well, actually, you
know, if you're not using those services,
565
00:50:28,649 --> 00:50:32,540
you will not be affected. And actually,
when we share peer-to-peer, then this is
566
00:50:32,540 --> 00:50:36,820
not an issue. But this is
changing. And there is
567
00:50:36,820 --> 00:50:41,959
a decision of the European Court of
Justice. And these decisions are not
568
00:50:41,959 --> 00:50:45,242
the law as such, but they
are very often followed and
569
00:50:45,242 --> 00:50:49,319
incorporated. And this
is the decision on the Pirate Bay.
570
00:50:49,319 --> 00:50:53,680
And in this decision,
the court says that, well, the argument
571
00:50:53,680 --> 00:50:57,716
that Pirate Bay made was basically we're
not hosting any content. We're just
572
00:50:57,716 --> 00:51:03,986
connecting people with it. And in short,
the court said, well,
573
00:51:03,986 --> 00:51:09,148
actually, we don't care. Because you
organize it, you optimize
574
00:51:09,148 --> 00:51:12,269
the information, you bring it to people.
575
00:51:12,269 --> 00:51:15,696
And the fact that you don't share
it does not really mean anything. And
576
00:51:15,696 --> 00:51:20,280
you are liable for the copyright
infringements. So, again, this is about a
577
00:51:20,280 --> 00:51:26,700
different issue, but this is a very
relevant way of thinking that we may
578
00:51:26,700 --> 00:51:30,670
expect that it will be translated into
other types of content. So, again, the
579
00:51:30,670 --> 00:51:36,040
fact that you don't host anything but
just connect people to one another
580
00:51:36,040 --> 00:51:42,480
may not be something
that will take you off the hook.
581
00:51:42,480 --> 00:51:49,190
H: Microphone number 3.
Mic3: Do these proposals contain or...
582
00:51:49,190 --> 00:51:54,572
what sort of repercussions do these
proposals contain for filing removal
583
00:51:54,572 --> 00:51:58,921
requests that are later determined
to be illegitimate? Is this just a free
584
00:51:58,921 --> 00:52:03,460
pass to censor things? Or are
there repercussions?
585
00:52:03,460 --> 00:52:07,619
A: I just want to make sure I
understand: you mean the removal orders,
586
00:52:07,619 --> 00:52:09,867
the ones that say remove content, and
that's it?
587
00:52:09,867 --> 00:52:13,450
Mic3: Yeah. If somebody files a removal
order that is determined later to be
588
00:52:13,450 --> 00:52:16,868
completely illegitimate, are there
repercussions?
589
00:52:16,868 --> 00:52:22,882
A: Well, the problem starts even before
that because again, the removal orders are
590
00:52:22,882 --> 00:52:26,839
being issued by competent authorities. So
there's like a designated authority that
591
00:52:26,839 --> 00:52:31,206
can do it. Not everybody can. And
basically, the order says this is the
592
00:52:31,206 --> 00:52:36,410
content. This is the URL. This is the
legal basis. Take it down. So there is no
593
00:52:36,410 --> 00:52:41,452
way to protest it. And the platform can
decline to follow this order within 1 hour
594
00:52:41,452 --> 00:52:46,094
in only 2 situations. One is force
majeure.
595
00:52:46,094 --> 00:52:49,706
Basically, there's some sort of external
circumstance that prevents them from
596
00:52:49,706 --> 00:52:52,432
doing it. I don't know,
a complete power outage
597
00:52:52,432 --> 00:52:55,272
or a problem with their servers,
so that basically they cannot
598
00:52:55,272 --> 00:52:59,550
access and remove or block access
to this content. The other is if the
599
00:52:59,550 --> 00:53:04,796
request... the removal order, I'm sorry,
contains errors that actually make it
600
00:53:04,796 --> 00:53:09,300
impossible to do. So, for example, there
is no URL or it's broken and it doesn't
601
00:53:09,300 --> 00:53:13,565
lead anywhere. And these are the only 2
situations. In all other cases, the content has
602
00:53:13,565 --> 00:53:19,839
to be removed. And there is no way for the
user and no way for the platform
603
00:53:19,839 --> 00:53:23,829
to actually say, well, hold on, this is
not the way to do it. Or, after
604
00:53:23,829 --> 00:53:28,690
it's been implemented, to say, well, that
was a bad decision. As I said, you can
605
00:53:28,690 --> 00:53:33,569
always go to court against your
state, but not many people will do it.
606
00:53:33,569 --> 00:53:39,223
And this is not really a meaningful
way to address this.
607
00:53:39,223 --> 00:53:46,260
H: Next question. Mic number 3.
Mic3: How much time do we have
608
00:53:46,260 --> 00:53:50,890
to contact the parliamentarians to inform
them maybe that there is some big issue
609
00:53:50,890 --> 00:53:55,450
with this? What's the worst-case
timetable at the moment?
610
00:53:55,450 --> 00:53:59,630
A: That's a very good question. And thank
you for asking, because
611
00:53:59,630 --> 00:54:05,020
I forgot to mention this. That is actually
quite urgent. So, as usual
612
00:54:05,020 --> 00:54:10,210
in those situations, the
commission wanted to close this by
613
00:54:10,210 --> 00:54:14,631
the end of the year, and they didn't manage
it because there is no agreement
614
00:54:14,631 --> 00:54:20,240
on those most pressing issues.
But we expect that the best
615
00:54:20,240 --> 00:54:25,900
case scenario is that it goes until March,
maybe until June, but it will probably happen
616
00:54:25,900 --> 00:54:31,280
earlier. It may be the next couple of
months. And there will be lots of meetings
617
00:54:31,280 --> 00:54:36,950
about that. So this is more or less
the timeline. There's no sort of
618
00:54:36,950 --> 00:54:41,507
external deadline for this, right, so this
is just an estimation and of course,
619
00:54:41,507 --> 00:54:44,320
it may change, but this is what we
expect.
620
00:54:44,320 --> 00:54:46,950
H: We have another question from the
Internet.
621
00:54:46,950 --> 00:54:52,500
S: Does this law consider that
such content is used for psychological
622
00:54:52,500 --> 00:54:58,748
warfare by big nations?
A: I'm sorry. Again, please.
623
00:54:58,748 --> 00:55:04,130
S: This content, pictures or
video or whatever, does this law
624
00:55:04,130 --> 00:55:08,520
consider that such content is used for
psychological warfare?
625
00:55:08,520 --> 00:55:17,730
A: Well, I'm trying to see how that
relates. I think the law does not go
626
00:55:17,730 --> 00:55:24,589
into details like that in a way. Which
means that I can go back to the definition
627
00:55:24,589 --> 00:55:31,740
that basically it's just about the fact
that if the content appears to be positive
628
00:55:31,740 --> 00:55:37,550
about terrorist activities, then that's
the basis for taking it down. But there's
629
00:55:37,550 --> 00:55:42,350
nothing else that is being actually said
about that. It's not more nuanced than
630
00:55:42,350 --> 00:55:48,590
that. So I guess the answer is no.
H: One last question from mic number 2.
631
00:55:48,590 --> 00:55:54,690
Mic2: Are there any case studies
published on successful application of
632
00:55:54,690 --> 00:55:59,035
similar laws in other countries? I ask
because we have had similar laws in
633
00:55:59,035 --> 00:56:06,718
Russia for 12 years, and they're not that
useful as far as I can see.
634
00:56:06,718 --> 00:56:11,218
A: Not that I know of. So I think
it's also a very difficult
635
00:56:11,218 --> 00:56:15,840
thing to research, because we
can only research what we
636
00:56:15,840 --> 00:56:21,181
know that happened, right? In a way,
you have to have people that actually are
637
00:56:21,181 --> 00:56:27,210
vocal about this and that complain about
these laws not being enforced in the
638
00:56:27,210 --> 00:56:32,030
proper way. So, for example, content that
is taken down is completely about something
639
00:56:32,030 --> 00:56:39,113
else, which also sometimes happens.
And that's very difficult. I think
640
00:56:39,113 --> 00:56:45,330
the biggest question here is
whether any amount of studies
641
00:56:45,330 --> 00:56:49,920
documenting that something does not work
would prevent the European Union from
642
00:56:49,920 --> 00:56:56,710
actually having this legislative fever.
And I would argue that it would not, because, as
643
00:56:56,710 --> 00:57:00,859
I said, they don't really have good
arguments or good
644
00:57:00,859 --> 00:57:05,829
numbers to justify bringing this law at
all. Not to mention bringing the
645
00:57:05,829 --> 00:57:12,418
ridiculous measures that they propose.
So what we say sometimes
646
00:57:12,418 --> 00:57:15,631
in Brussels, when we're very frustrated,
you
647
00:57:15,631 --> 00:57:21,309
know, being there and advocating for
human rights, is that we hoped
648
00:57:21,309 --> 00:57:26,000
we could contribute to evidence-based
policy. But actually, what's
649
00:57:26,000 --> 00:57:31,678
happening is policy-based evidence.
And this is the difficult part. So I
650
00:57:31,678 --> 00:57:37,819
am all for studies and I am all for
presenting information that, you know, may
651
00:57:37,819 --> 00:57:41,734
possibly help legislators. There are
definitely some MEPs or some people there,
652
00:57:41,734 --> 00:57:45,599
even probably in the commission, maybe
they just are not allowed to voice
653
00:57:45,599 --> 00:57:50,650
their opinion on this because it's a
highly political issue, who would wish to
654
00:57:50,650 --> 00:57:54,930
have those studies or would wish to be
able to use them, and who believe in
655
00:57:54,930 --> 00:58:00,779
them. But it just doesn't
translate into the political process.
656
00:58:00,779 --> 00:58:06,682
H: Okay. Time's up. If you have any
more questions, you can come
657
00:58:06,682 --> 00:58:09,564
up and approach Anna later.
A: Yes.
658
00:58:09,564 --> 00:58:14,498
H: Please. Thanks.
So, first from me:
659
00:58:14,498 --> 00:58:17,060
Thanks for the talk. Thanks for
patiently answering...
660
00:58:17,060 --> 00:58:20,899
36c3 postroll music
661
00:58:20,899 --> 00:58:43,000
Subtitles created by c3subtitles.de
in the year 2021. Join, and help us!