-
36c3 preroll music
-
Herald: ...but now we start what we're
here for and I'm really happy to be
-
allowed to introduce Anna Mazgal. She will
talk about something with a great title.
-
I love it. "Confessions of a Future
Terrorist". Terror, terrorism is the one
-
thing you can always shout out and you get
everything through. And she will give us a
-
rough guide to over-regulating free speech
with anti-terrorist measures. Anna works
-
for Wikimedia, where she lobbies for
human rights in the digital environment
-
and works in Brussels. And she gives a lot
of talks. And I think it's the first time
-
at a congress for her, is that right?
Anna: Second time.
-
Herald: Second time? I haven't
researched that right, even though I searched
-
for it. So I have to do this again. It's her
2nd time at congress and I'm really
-
happy to have her here. Please welcome her
with a big round of applause.
-
Anna, the stage is yours.
Anna: Thank you. Yes. So as you have
-
already heard, I don't do any of the cool
things that we Wikimedians and Wikipedians
-
do. I am based in Brussels and the
L-word, I do the lobbying on behalf of our
-
community. And today I am here because I
wanted to talk to you about one of the
-
proposals, of 4 laws whose development we
are now observing. And I wanted
-
to share my concerns as
an activist, because I'm really worried
-
how, if that law passes in its worst
possible version, or one of the bad
-
versions, how it will affect my work. I'm
also concerned about how it will affect
-
your work. And basically all of our
expression online. And I also want to
-
share with you that this law makes me
really angry. So, so I think these are a
-
few good reasons to be here and to talk to
you. And I hope after this presentation we
-
can have a conversation about this. And
I'm looking forward also to your
-
perspective on it and, and also the things
you may not agree with. So what
-
is this law? So, in September 2018, the
European Commission came out with a
-
proposal of regulation on preventing
the dissemination of terrorist content
-
online. So there are a few things to
unpack here about what it is. First
-
of all, when we see a law that is about
Internet and that is about the content and
-
that is about the online environment and
it says it will prevent something, this
-
always brings a very difficult and
complicated perspective for us, the
-
digital rights activists in Brussels.
Because prevention online never means
-
anything good. So, so this is one thing.
The other thing is this very troubled
-
concept of terrorist content. I will be
talking about this more: I
-
will show you how the European Commission
understands it and what the problems are
-
with that understanding and whether this
is something that can actually be really
-
defined in the law. So these are
the red flags that I have already seen
-
and we have seen when we
first got the proposal into our
-
hands. I would like to tell you a little
bit about the framework of it. This is
-
probably the driest part of that,
but I think it's important to correctly
-
place it. First of all, this is the
European Union legislation. So we're
-
talking about the legislation that will
influence 27 member states. Maybe 28, but
-
we know about Brexit, so it is
debatable what's going to happen there.
-
And it's important to note that whenever
we have European legislation in the EU,
-
these are the laws that
actually shape the laws of all those
-
countries. And they take precedence over
national laws. So should this
-
be implemented in any form, or rather
when it's implemented in any form,
-
this is what is going to happen. The next
important piece of information that I want
-
to give you is that this particular
regulation is a part of the framework that
-
is called Digital Single Market. So the
European Union, one of the
-
objectives when the European
Commission creates a law, and when other
-
bodies of the European Union work on it,
is that the laws in
-
the member states of the
European Union are actually similar. And
-
the Digital Single Market means that we
want to achieve something on
-
the Internet that in a way is already
achieved within the European Union
-
geographically, meaning that we don't want
the borders on the Internet between people
-
communicating and also delivering goods
and services in the European Union.
-
And you may ask how that connects with
the terrorist content and how
-
that connects with today's topic. To
be honest, I am also puzzled because I
-
think that legislation that talks about
how people communicate online, and about
-
which speech we want there
and which we don't, should not be a
-
part of a framework that is about the market.
So this is also something that
-
brings a concern. Also, as you've seen on
the first slide, this piece of
-
legislation, this proposal is called the
regulation. And not to go too much into
-
details about what are the forms of
legislation in the EU, the important thing
-
to know here is that the regulation is a
law that once it is adopted by the EU,
-
once the parliament votes on it, it
is directly binding in
-
all the member states of the European
Union, which means that there is no
-
further discussion on how this should be
actually used. Of course, in each country
-
there are different decisions being made
by different bodies. But it means for us,
-
the people that work on this and that want
to influence the legislative process, that
-
once this law is out of Brussels, there
is nothing much to be done
-
about how it's going to be
implemented. And this is
-
important because for us, the discussion
about this is
-
the one that happens in
Brussels. There are a few versions of the
-
law. And very quickly, European Commission
proposes the law. European Parliament
-
looks at it, debates it, and then produces
its own version of it. So amends it or
-
makes it worse. And then the Council of
the EU, which is the gathering of all the
-
member states and the
representatives of the government of the
-
member states, also creates their own
version. And then, of course, when you
-
have 3 versions, you also need to have a
lot of conversations and a lot of
-
negotiation, how to put this together into
one. And all of those bodies have their
-
own ideas. Every one of those bodies
has its own ideas on what any law should
-
look like. So this process is not only
complicated, but also this negotiation
-
that is called the trilogues. It's actually
very non-transparent. And there is
-
almost no official information about
how those negotiations go, what are the
-
versions of the document and so on. This
is the part that we are now in. And I will
-
talk more about this later on. Today I
want to talk to you about the potential
-
consequences of the version that is the
original one, which is the European
-
Commission's version. And it's because it
will be very complicated and
-
confusing I guess, if we look at all of
the proposals that are on the table. But
-
also, it's important because the European
Commission has a lot of influence also
-
informally, both on member states and also
on - to an extent - on the whole trilogue
-
process. So whatever gains we have in
other versions or whatever better
-
solutions we have there, they are not
secure yet. And I promise I'm almost
-
done with this part. There is other
relevant legislation that we'll
-
consider. One is the E-Commerce Directive.
And in this, the part that is
-
very relevant for this particular
conversation is that the platforms,
-
according to this law or Internet services
or hosting providers are not by default
-
responsible for the content that users
place online. So it's a very
-
important premise that also protects us,
protects our rights, protects our privacy,
-
that they are not, they cannot
go after us or they cannot look for
-
the content that could be potentially
illegal, which would mean that they would
-
have to look into everything. But of
course, they have to react when somebody
-
notifies them and they have to see whether
the information that is placed by the
-
users should stay up or not. There is also
a directive on combating terrorism. And
-
this is the piece of legislation that
is quite recent to my best
-
knowledge. Not all countries in the
European Union, not all member states have
-
implemented it yet. So for us, it was also
very puzzling that we actually have a new
-
law, a new proposal that is talking about
the communication part of what already
-
has been mentioned in this directive. When
we still don't know how it works, we still
-
don't know, because this law is practically
not being used at all. So this was for
-
us also difficult to understand why the
commission does not want to wait and see
-
what comes out from the
directive on combating terrorism. So
-
why would the European Commission and
the European legislators
-
actually want such a law that, again,
is about the content that people post
-
through different services and why this is
an important issue, and why this
-
issue is actually conflated with
the market questions and the
-
harmonization in the digital market. So
there are some serious numbers here. 94 %
-
and 89 %. And I'm wondering if you have
any idea what those numbers are about.
-
Audience: Persons.
Anna: I'm sorry?
-
Audience: Persons.
Anna: Yes. It's about people. But the
-
numbers are actually presenting, so there
was a survey done by Eurostat and those
-
numbers present percentages of
people. The first number, 94 %, is the
-
percentage of people that say that they
have not come across terrorist content
-
online. Right. So, inversely, only 6 %
of people actually say that they had
-
access to terrorist content, it's
important to underline that they say it
-
because there's no way to check what that
content actually was and of course we can,
-
you know, here is the analogy of what a
certain American judge said about
-
pornography. I know it when I see it. It's
not a very good definition of anything,
-
really. So I would argue that actually
6 % of people being affected by something
-
is not really a big percentage and that
the European Union actually has bigger
-
problems to deal with, where they can
spend money and energy on. E.g., we are
-
all affected by, I don't know, air
pollution and that's much more people.
-
89 % is the figure for people in the age
range between 15 and 24 who, again, were not
-
affected by something they would
consider terrorist content. Of course,
-
would somebody think of the children?
There you go.
-
Children and young people also do
not experience it
-
overwhelmingly. But this rationale
-
is being used, 6 % and 11 % as one
of the reasons why this regulation is
-
important, why this law is important. The
other reason
-
is the exposure to imagery of violent
crimes via social media. So, of course, we
-
know that, that platforms such as Facebook
and YouTube contain all sorts of things
-
that people look at. We also know that
because of their business models, they
-
sometimes push controversial content or
violent content into the
-
proposals that
they give to people to watch or to
-
read. Actually, this second part
is not addressed by this
-
proposal at all. But nevertheless,
whenever we talk to the representatives of
-
the commission why this law is there, they
start waving their phones. In my experience,
-
at one of the meetings, a person
started waving his phone at me and saying,
-
"Well, you know, there are beheading
videos online and I can show you how
-
horrible it is", which I consider
to be an emotional
-
blackmail at best, but not really a
good regulatory impulse.
-
So I guess maybe the commission
people are somehow
-
mysteriously affected by that content more
than anything else. I don't mean to joke
-
about those, those videos because of
course, it is not something that I would
-
want to watch and, and it is very violent.
But I would also argue that the problem is
-
not that the video is there, but that
somebody has been beheaded. And this is
-
where we should actually direct our
attention and look for the sources of that
-
sort of behavior and not only to try and
clean the Internet. The other reason why,
-
why this law should be enacted is
radicalisation. Of course, this is a,
-
this is a problem for certain
vulnerable populations and people. And we
-
can read about it a lot. And there are
organizations that are dealing with
-
strategies to counteract radicalisation.
Again, when we look at the evidence, what
-
is the relationship between
content that is available online and the
-
fact that people get radicalized
in different ways?
-
We didn't see any research, and the
commission also did not present any
-
research that would actually point to at
least the correlation between the two. So
-
again, asked "How did you come
up with this idea without really
-
showing the support for your
claim that radicalisation is connected to
-
that?" This is a quote from a meeting
that happened in public and journalists were
-
there. Again, the person from the
commission said, "We had to make a guess,
-
so we made the guess that way." The
guess being that, yes, there is some sort
-
of connection between the content and the
radicalisation. And then finally, when we
-
read the impact assessment and when we
look at the different articles or
-
different explanations that the European
Commission posts about the
-
rationale for this law, of course, they
bring up the terrorist attacks that have
-
been happening and they swiftly
go from naming the different violent
-
events that have happened in
Europe very recently or quite recently.
-
And they swiftly make a connection between
the fact that somebody took a truck and
-
ran it into a group of people. Or that
somebody was participating in the shooting
-
or organizing the shooting of people
enjoying themselves. They swiftly go from
-
this to the fact that regulation of the
content is needed. Which also, the fact
-
that you put something in one sentence
does not mean it makes sense. Right? So
-
this is also not very well documented.
Again, pressed about this, the
-
representative of the European Commission
said that, well, "We know that and it has
-
been proven in the investigation, that one
of the people that were responsible for
-
the Bataclan attack actually used the
Internet before that happened.
-
laughter
Yes. No more comment needed on that one.
-
So, well, clearly, there are "very good
reasons", quote unquote, to spend time
-
and citizens' money on working on
the new law. And I always say that
-
basically these laws are created because,
not because there is a reason, but
-
because there is a do-something-doctrine.
Right. We have a problem, we
-
have to do something. And this is how this
law, I think, came to be. And the
-
do-something-doctrine in this particular
case encompasses, of course, a
-
very broad and blurry definition in that
law. I will talk about this more in a
-
moment. It also encompasses measures:
if we define something that we want to
-
counteract, we have to basically say
what should happen, right, so that the
-
problem is being solved. And there are 3
measures that I will also explain. One
-
is the removal orders. The other is
referrals. And the third are so-called
-
proactive measures. This is, I guess, the
part where we touch the prevention
-
most. And then the third issue,
one of the things I also want to talk
-
about is the links between the content
that is being removed and the actual
-
investigations or prosecutions that may
occur, because of course it's possible
-
that there will be some content found that
actually does document the crime. And
-
then what do we do about that? So, going
forward, I do think that
-
this law's main
principle is basically to normalize state
-
control over how people communicate and
what they want to say. As it was said
-
before, under the premise of terrorism, we
can actually pack a lot of different
-
things because people are afraid of this.
And we have also examples from other
-
topics, other laws that have been debated
in Brussels. One was public sector
-
information directive, where everybody was
very happy discussing how much public
-
information should be released and where
it should come from and how people should
-
have access to it. And a part of public
information is the information that is
-
produced by companies that perform public
services, but they may also be private,
-
for example, sometimes public
transport is provided that way. And
-
actually public transport providers were
the ones that were saying that they cannot
-
release the information that they have,
namely timetables and other
-
information about how the system
works that could be useful for citizens
-
because then it may be used by terrorists.
I guess that maybe prevents the potential
-
terrorists from going from bus stop to bus
stop and figuring out how the buses go.
-
But we already know that this does not
work that way. So this is something
-
that actually normalizes this approach.
And let's first look at the definition in
-
the proposal as presented by the
European Commission. So they say
-
basically, let me read: "Terrorist content
means one or more of the following
-
information. So a) inciting or advocating,
including by glorifying, the commission of
-
terrorist offences". I do apologise for
the horrible level of English
-
that they use. I don't know why. And I
don't apologise for them, but for the fact
-
that they expose you to it. "The
commission of terrorist offences,
-
Clapping
thereby causing a danger that such acts be
-
committed". You won't believe how many
times I had to read all this to actually
-
understand what all those things mean.
"Encouraging the contribution to terrorist
-
offences". So contribution could be money,
could be some, I guess material resources.
-
"Promoting the activities of a terrorist
group, in particular by encouraging the
-
participation in or support to a
terrorist group. Instructing on methods or
-
techniques for the purpose of committing
terrorist offenses". And then there is
-
also the definition of "dissemination of
terrorist content". That basically means
-
"making terrorist content available to
third parties on the hosting service
-
providers services". As you can probably
see, the dissemination and the fact that
-
third parties are invoked mean that this
law is super broad. So it's not only about
-
social media because making content
available through third parties may mean
-
that I am sharing something over some sort
of service with my mom and she is a third
-
party in the understanding of this law. So
we were actually super troubled to see
-
that not only does it encompass services
that make information available to the
-
public, the ones that we all can see
like social media, but also that
-
potentially it could be used against
services that let people communicate
-
privately. So that is a big
issue. The second thing I wanted to direct
-
your attention to is the parts that
are put in italics. It's how soft
-
those concepts are, inciting, advocating,
glorifying, encouraging, promoting. This
-
is a law that actually potentially can
really influence how we talk and how we
-
communicate, what we want to talk about,
whether we agree or disagree with certain
-
policies or certain political decisions.
And all those things are super soft. And
-
it's very, very hard to say what
they really mean. And I want to
-
give you an example of the same
content used in 3 different cases to
-
illustrate this. So let's imagine we have
a group of people that recorded a video
-
and in those videos, they
say that, well, basically they call
-
themselves terrorists, to make it easier,
and they say that they wanted to
-
commit all sorts of horrible things in
specific places, so that constitutes like
-
some sort of a credible threat. And they
also bragged that they killed someone. And
-
they also say that they're super happy
about this and so on. And they also, of
-
course, encourage others to join them and
so on and so on. And the 3 cases would be:
-
1 would be that this particular group
posted videos on, I don't know, their
-
YouTube channel. The other case would be
that there is a media outlet that reports
-
on it and either links to this video or
maybe presents snippets of it. And the
-
third case would be, for example, that
there is some sort of group that is
-
actually following what's happening in
that region and collects evidence that can
-
then help identify the people and
prosecute them for the crimes they committed.
-
Like the crime that our
exemplary terrorists admitted to
-
committing. Do you think that,
according to this definition, in your
-
opinion, there is a
difference between those 3 types of
-
presenting that content between the
terrorist group that is presenting it on
-
their channel, between the media outlet
and between the activists? There is none.
-
Because this law does not
define in any way that so-called
-
terrorist content is something that is
published with the intention of actually
-
advocating and glorifying. So the problem
is not only with the content that,
-
let's say, we may call
manifestly illegal. So somebody kills
-
someone and is being recorded and we know
it's a crime and perhaps we don't want to
-
watch it, although I do think that we
should also have a discussion in our
-
society, about what we want to see and
what we don't want to see.
-
From the perspective that
the world is complicated and we may
-
have the right to access all sorts of
information, even if it is not so
-
pleasant and not so easy to digest. So
this law does not make this
-
differentiation. There is no mention of
how this should be intentional to qualify
-
to be considered so-called terrorist
content. And that's a big problem. So as
-
you can see, there is a fallacy in this
narrative, because it will be the member
-
states and their so-called competent
authorities that will be deciding what the
-
terrorist content is. And, of course,
Europeans have a tendency,
-
we have the tendency, to think of
ourselves as the societies and the
-
nations and the countries that champion
the rule of law and that actually
-
respect fundamental rights and
respect freedom of speech. But we also
-
know that this is changing rapidly. And I
also will show you examples of how that
-
changes in this area that we're talking
about right now. So I do not have
-
great trust in European governments
making the correct judgment about
-
that. So we have this category of very
dubious and very broad terrorist
-
content. And then, how is it
being done? Basically, all that
-
power to decide how
to deal with that content is actually
-
outsourced to private actors. So the
platforms that we are talking about
-
become kind of mercenaries, because
both the commission and I guess many
-
member states say, well, it's not possible
that a judge will actually look through
-
content that is placed online and give,
you know, proper judicial decisions about
-
what constitutes freedom of
expression and what goes beyond it
-
because it hurts other people or
is basically proof of something illegal.
-
So the platforms will take those
decisions. This will be the hosting
-
service providers, as I mentioned. And
then also a lot of the reliance that they
-
will do it right is put into the wishful
thinking in this proposal that says, well,
-
basically, you have to put in terms of
service that you will not host terrorist
-
content. So then again, there is a
layer in there where the platform,
-
let's say Facebook or Twitter or
anyone else, actually decides what and how
-
they wanted to deal with that in detail.
Also, one thing I didn't mention is that
-
looking at this regulation and looking at
which platforms should basically
-
have those terms of service, we realized
that Wikimedia, that actually our platforms,
-
will actually be in the scope of that. So
not only may that affect the way we can
-
document and reference the articles that
are appearing on Wikipedia,
-
on the events that are
described or the groups or the political
-
situation and what not. But also that, you
know, our community of editors will have
-
less and less to say if we have to put a
lot of emphasis on terms of service. I
-
just think that we are somehow
collateral damage of this. But also this
-
doesn't console me much because, of
course, the Internet is bigger than our
-
projects. And also, we want to make sure
that, that content is not being removed
-
elsewhere. So basically the 3 measures are
the removal orders, as I mentioned. And
-
this is something that is fairly, fairly
straightforward. And actually, I'm
-
wondering why there has to be a special
law to bring it into being,
-
because the removal order is basically a
decision that the competent authority in
-
the member state releases and sends it to
the platform. The problem is that
-
according to the commission, the platform
should actually act on it in 1 hour. And
-
then again, we ask them why 1 hour and not
74 minutes? And they say, "Well, because
-
we actually know", I don't know how, but
they say they do. Let's take it at face
-
value. "We actually know that the content
is the most, you know, viral and spreads
-
the fastest and has the biggest range
within the 1 hour from appearance". And
-
then we ask them "Well, but how can you
know that? That the people who find
-
the content find it exactly at the moment
when it comes up? Maybe it has been around
-
for 2 weeks and this 1 hour window when it
went really viral is like long gone". And
-
here they don't really answer, obviously.
So this is one of the measures
-
that I guess makes the most sense out of
all of that. Then we have the referrals
-
that we call lazy removal
orders. And this is really
-
something that is very puzzling for me
because the referral is a situation in
-
which this competent authority and the
person working there goes through the
-
content, through the videos or postings
and looks at it and says, "Well, I think,
-
I think it's against the terms of service
of this platform, but does not actually
-
release this removal order, but writes to
the platform, lets them know and says,
-
"Hey, can you please check this out?" I'm
sorry, I'm confused, is this the time that
-
I have left or the time? OK, good, time is
important here. So so basically, you know,
-
they basically won't spend the time
to prepare this removal order
-
and tell the platform
actually to remove it.
-
But they will just ask them to please
verify whether this content should be
-
there or not. And first of all, this is
the real outsourcing of power
-
over the speech and expression. But
also we know how platforms take those
-
decisions. They have a very short time.
The people that do it are sitting
-
somewhere, most probably where the content
is not originating from. So they don't
-
understand the context. Sometimes they
don't understand the language. And also,
-
you know, it's better to get rid of it
just in case it really is problematic,
-
right? So this is something that
completely increases this gray area
-
of information that is controversial
enough to be flagged, but it's not illegal
-
enough to be removed by the order. By the
way, the European Parliament actually
-
kicked this out from their version. So now
the fight is in this negotiation between
-
the 3 institutions to actually follow this
recommendation and just remove it, because
-
it really does not make sense. And it
really makes the people that
-
release those referrals not really
accountable for their decisions
-
because they don't take the decision. They
just make a suggestion. And then we have
-
the proactive measures, which most
definitely will lead to over-policing of
-
content. There is a whole, a very clever
description in the law that basically
-
boils down to the point that if you are
going to use content filtering and if
-
you're going to prevent content from
appearing, then basically you are
-
doing a good job as a platform. And
this is the way to actually deal with
-
terrorist content. However we
define it, again, this is very
-
context-oriented, very context-dependent.
It's really very difficult to say based on
-
what sort of criteria and based on
what sort of databases those
-
automated processes will be
happening. So, of course,
-
as it happens in today's world,
somebody privatizes
-
the profits, but the losses are
always socialized. And this is no
-
exception to that rule. So, again, when
we were talking to the European Commission
-
and asking them, "Why is this not a piece
of legislation that belongs to the
-
enforcement of the law?" And why is it then
not controlled by the
-
judiciary system and by other sorts of
oversight that law enforcement usually has?
-
They say, "Well, because, you know,
when we have those videos of beheadings,
-
they usually don't happen in Europe and
they are really beyond our jurisdiction".
-
So, of course, nobody will act on it.
On the very meaningful level of
-
actually finding the people
that are killing,
-
that are in the business of
killing others and making
-
sure they cannot continue with this
activity. So it's very clear that this
-
whole law is about cleaning the Internet
and not really about meaningfully
-
tackling societal problems that lead to
that sort of violence. Also the
-
redress, which is the mechanism by which
the user can say, hey, this is not the
-
right decision. I actually believe this
content is not illegal at all. And it's
-
important for me to say this and this is
my right and I want it to be up.
-
Those provisions are very weak. You cannot
actually protest meaningfully against a
-
removal order of your content. Of course,
you can always take the state to
-
court. But we know how amazingly
interesting that is and how fast it
-
happens. So I think we
can agree that there is no meaningful way
-
to actually protest. Also, the state may
ask that, well, actually, for this removal
-
order the user should not be
informed that the content has
-
been taken down because of terrorism,
or depicting terrorism, or glorifying,
-
or whatever. So you may not even know
why the content is taken down. It will be
-
a secret. For referrals and for proactive
measures, well, you know what? Go talk to
-
the platform and protest with them. And
then, of course, the other question is, so
-
who is the terrorist? Right. Because this
is a very important question that we
-
should have answered if we wanted to... if
we wanted to have a law that actually is
-
meaningfully engaging with those issues.
And of course, well, as you know
-
already from what I said, the European
Commission in that particular case does
-
not provide a very good answer. But we
have some other responses to that. For
-
example, Europol has created a report and
then there was a blog post based on that.
-
with the title "On the importance of taking
down non-violent terrorist content".
-
So we have the European Commission that
says, "Yes, it's about the beheadings and
-
about the mutilations". And we have
Europol that says, "You know, actually
-
this non-violent terrorist content is
super important". So basically what they
-
say and I quote, "Poetry is a literary
medium that is widely appreciated across
-
the Arab world and is an important part
of their region's identity. Mastering it
-
provides the poet with singular authority
in Arabic culture. The most prominent
-
jihadi leaders - including Osama bin Laden
and former Islamic State spokesman
-
Abu Muhammad al-Adnani - frequently
included poetry in their
-
speeches or wrote poems of their own.
Their charisma was
-
closely intertwined with their
mastery of poetry."
-
So we can see the link that is being made
by Europol between a very important aspect
-
of a culture that is beautiful and
enriching and the fact
-
that Europol wants to see it
weaponized. The other part of the blogpost
-
was about how ISIS presents the interesting
activities that their fighters
-
have. And one of
them is that they are enjoying themselves
-
and smiling and spending time together and
swimming. So what do we
-
make out of that? So videos of some
brown people swimming are now
-
terrorist content? The blatant
racism of this communication
-
really enrages me. And I think it's really
a shame that nobody called Europol
-
out on this, when the blogpost came up. We
also have laws in Europe that are
-
different. I mean, this is not the same
legislation, but they actually
-
give a taste of what may happen. One
is the Spanish law against hate
-
speech. And, and this is an important
part. It didn't happen online, but it
-
shows the approach, that basically first
you have legislators that say, oh, don't
-
worry about this, we really want to go
after bad guys. And then what happens is
-
that there was a puppeteer performance
done by 2 people, "The Witch and Don
-
Cristóbal". This is a kind of
Punch and Judy performance,
-
a genre of theatrical
performance that is kind of
-
full of silly jokes and sometimes
excessive and unjustified violence,
-
and full of bad taste.
And this is quite serious. The 2
-
characters, the 2 puppets, held a
-
banner that featured a made-up terrorist
organization. And after that performance,
-
actually, they were charged with, first of
all, promoting terrorism, even though
-
there is no terrorist organization like
that. And then also with
-
inciting hatred. And this is what one of
the puppeteers said after describing this
-
whole horrible experience. Finally, the
charges were dropped. So this is good. But
-
I think this really sums up who the
terrorists are and how those laws are being
-
used against people who actually have
nothing to do with, with violence. We were
-
charged with inciting hatred, which is a
felony created in theory to protect
-
vulnerable minorities; the minorities in
this case were the church,
-
the police and the legal system.
laughter
-
Then, again in Spain, I don't want to
single out this beautiful country, but
-
unfortunately, they actually
have good examples. This is a very recent
-
one. So Tsunami Democràtic in Catalonia
created an app to actually help people
-
organize small actions in a decentralized
manner. And they placed the documentation
-
on GitHub. And it was taken down by the
order of, of the Spanish court. And also
-
the - and this is the practical
application of such laws online - also,
-
the website of Tsunami Democràtic
was taken down by the court. Of course,
-
both of them on charges
of facilitating terrorist
-
activities and incitement to
terrorism. So why is it important?
-
Because of what comes next. So
there will be the Digital Services Act,
-
which will be an overhaul of this idea
that I mentioned at the beginning, which
-
is that basically platforms are not
responsible by default for what we put
-
online. And the European Commission and
other actors in the EU are toying
-
with the idea
that maybe platforms should be somehow
-
responsible. And it's not
only about social media, but basically
-
any sort of service
that helps people place content online.
-
And then, one of the ideas, we
don't know what it's going to be, it's not
-
there yet. It's going to happen at the
beginning of next year, so
-
quite soon. But we can actually
expect that the so-called "Good Samaritan"
-
rule will be 1 of the solutions proposed.
What is this rule? This rule basically
-
means if a platform is really going the
extra mile and doing a good job in
-
removing content that
is either illegal or, again, a
-
very difficult category, harmful. I also
don't know what that exactly means. Then
-
if they behave well, then they will not be
held responsible. So this is basically a
-
proposal that you cannot really turn down,
because if you run the business, you want
-
to manage the risk of that and you don't
want to be fined. And you don't
-
want to pay money. So, of course, you
try and overpolice, and of course you
-
filter the content, and of course
you take content down when it only raises a
-
question of what sort of
content it is. Is it neutral or
-
is it maybe, you know, making somebody
offended or... or stirred? And, of course,
-
other attempts, as we heard from Germany:
there was a
-
proposal to actually
oblige platforms to give out the passwords
-
of social media users that
are under investigation or prosecution.
-
And also, of course, we see that one of
the ideas that supposedly is going to fix
-
everything is that, well, if terrorists
communicate through encrypted services,
-
then maybe we should do something about
encryption. And there was a petition
-
already on (?) to
actually forbid encryption for those
-
services after one of the
terrorist attacks. So, of course, it
-
sounds very extreme. But
this is, in my opinion, the next
-
frontier here. So what can we do? Because
this is all quite difficult. So as I
-
mentioned, the negotiations are still on.
So there is still time to talk to
-
your government. And this is very important
because, of course, the governments, when
-
they have this
proposal on the table, know that they will
-
finally be able to decide who is
the terrorist and what is terrorist
-
content. That's on one hand. On
the other hand, they know that people
-
don't really care all that much about what
happens in the E.U., which is
-
unfortunately true. They are actually
supporting very much the commission's
-
proposals. The only thing that they don't
like is the fact that somebody from the
-
police, from other country can maybe
interfere with content in their language,
-
because that's one of the provisions that
is also there. So this is what
-
they don't like. They want to keep
the territoriality of their
-
enforcement laws intact. But there is
still time and we can still do this. And
-
if you want to talk to me about
what are the good ways to do it, I'm
-
available here. And I would love to take
that conversation up with you. The other
-
is a very simple measure that I believe
always works. It is one that basically
-
is about telling just 1 friend, even 1
friend, and asking them to do the same:
-
talk to other people about this. And there
are 2 reasons to do it. One is because, of
-
course, then we make people aware of what
is happening. And the other, which in this
-
particular case is very important, is
that basically people are scared of
-
terrorism and, and they support a lot of
measures just because they hear this word.
-
And when we explain what that
really means and when we unpack this a
-
little bit, we build the resilience to
those arguments. And I think it's
-
important. The other people
who should know about this
-
are activists working with
vulnerable groups because of the
-
stigmatization that I
already mentioned and because
-
of the fact that we need to document
horrible things that are happening to
-
people in other places in the world and
also here in Europe. And journalists
-
and media organizations, because they
will be affected by this law, and by the
-
way it affects how they can report and where they
can get the sources for their
-
information. So I think I went massively
over the time that was planned. I hope
-
we can still have some questions. Thank
you. So, yeah. Talk to me more about this
-
now and then after the talk. Thank you.
-
applause
-
Herald: Thanks for your talk. We still
have time for questions, so please, if you
-
have a question, line up at the mics. We
have 1, 2, 3 evenly distributed through
-
the room. I want to remind you really
quickly that a question normally is one
-
sentence and ends with a question mark.
laughter
-
Not everybody seems to know that. So we
start with mic number 2.
-
Mic2: Hello. I... so I run Tor Relays in
the United States. It seems like a lot of
-
these laws are focused on the notion of
centralized platforms. Do they define what
-
a platform is and are they going to
extradite me because I am facilitating a Tor
-
onion service?
A: Should we answer, no?
-
H: Yeah.
A: Okay, yes, so they do and they don't,
-
in a way: the definition is
based on basically what
-
a hosting provider
is in European law, which is
-
actually very broad. So it doesn't take
into account how big you are or
-
how you run your services. The bottom line
is that if you allow people to put content
-
up and share it with, again, a 3rd party,
which may be the whole room here, it may
-
be the whole world but it may be just the
people I want to share things with.
-
Then you're obliged to comply with
-
the measures that are envisioned in this
regulation. And there's a
-
debate also in the parliament. It was
taken up and narrowed down actually to
-
communication to the public. So I guess
then, as you correctly observe, it is more
-
about the big platforms or about the
centralized services. But actually, in
-
the commission version, nothing makes me
believe that only they will be
-
affected. On the contrary, the
messaging services may be, too.
-
H: Okay, um, next question, mic number 3.
Mic3: Uh, as with the upload
-
filters in the copyright directive, it was a
really similar debate, especially on
-
small companies, because, um, uh, at that
time, they tried to push
-
upload filters for copyright content. And
the question was, uh, how does that fit
-
with small companies? And they still
haven't provided an answer to that. Uh,
-
the problem is they basically took
inspiration from the upload filters
-
in the copyright directive and
applied them to terrorist content. And it's
-
again, the question, how does that work
with small Internet companies that have to
-
have someone on call during the
nights and things like that. So even big
-
providers, I heard, don't have the
means to properly enforce
-
something like this. This is a
killer for the European Internet industry.
-
A: Yes.
laughter
-
applause
H: I want to give a short reminder on the
-
1 sentence rule. We have a question from
the Internet. Signal angel, please.
-
Signal Angel: Yes, what, the question is,
wouldn't decentralized social networks
-
bypass these regulations?
A: I'm not a lawyer, but I will give
-
the answer to this
question that a lawyer would give,
-
I maybe spent too much time with lawyers:
that depends.
-
laughter
A: Because it really does, because this
-
definition of who is obliged is so broad
that a lot depends on the context, a lot
-
depends on what is happening, what is
being shared and how. So it's, it's very
-
difficult to say. I just want to say that
we also had this conversation about
-
copyright and many people came to me last
year at Congress. I wasn't giving a talk
-
about it, but I was at the talk about the
copyright directive and the filtering. And
-
many people said, well, actually, you
know, if you're not using those services,
-
you will not be affected. And actually,
when we share peer to peer, then this is
-
not an issue. But actually, this is
changing. And there is actually
-
a decision of the European Court of
Justice. And the decisions are not
-
the law as such, but they
are very often then followed and
-
incorporated. And this
is the decision on the Pirate Bay.
-
And in this decision,
the court says that, well, the argument
-
that Pirate Bay made was basically we're
not hosting any content. We're just
-
connecting people with it. And in short,
the court said, well,
-
actually, we don't care. Because you
organize it, you optimize
-
the
information, you bring it to people.
-
And the fact that you don't share
it does not really mean anything. And
-
you are liable for the, for the copyright
infringements. So, again, this is about a
-
different issue, but this is a very
relevant way of thinking that we may
-
expect that it will be translated into
other types of content. So, again, the
-
fact that you don't host anything but
you just connect people to one another
-
may not be something
that will take you off the hook.
-
H: Microphone number 3.
Mic3: Do these proposals contain or...
-
what sort of repercussions do these
proposals contain for filing
-
removal requests that are later determined
to be illegitimate? Is this just a free
-
pass to censor things? Or can.... are
there repercussions?
-
A: I just want to make sure I
understand, you mean the removal orders,
-
the ones that say remove content, and
that's it?
-
Mic3: Yeah. If somebody files a removal
order that is determined later to be
-
completely illegitimate. Are there
repercussions?
-
A: Well, the problem starts even before
that because again, the removal orders are
-
being issued by competent authorities. So
there's like a designated authority that
-
can do it. Not everybody can. And
basically, the order says this is the
-
content. This is the URL. This is the
legal basis. Take it down. So there is no
-
way to protest it. And the platform can
only not follow this order within 1 hour
-
in 2 situations. One is force
majeure, which is usually the issue:
-
Basically, there's some sort of external
circumstance that prevents them from
-
doing it. I don't know.
Complete power outage
-
or problem with their servers
that basically they cannot
-
access and remove or block access
to this content. The other is if the
-
request... the removal order, I'm sorry,
contains errors that actually make it
-
impossible to do. So, for example, there
is no URL or it's broken and it doesn't
-
lead anywhere. And these are the only 2
situations. In the rest, the content has
-
to be removed. And there is no way for the
user and no way for the platform
-
to actually say, well, hold on, this is
not the way to do it. Or, after
-
it's been implemented, to say, well, that
was a bad decision. As I said, you can
-
always go to court with the, with your
state, but not many people will do it.
-
And this is not really a meaningful
way to address this.
-
H: Next question. Mic number 3.
Mic3: How many... how much time do we have
-
to contact the parliamentarians to inform
them maybe that there is some big issue
-
with this? What's the worst case
timetable at the moment?
-
A: That's a very good question. And thank
you for asking because... this ... because
-
I forgot to mention this. This actually is
quite urgent. As usual
-
in those situations, the
commission wanted to close the thing by
-
the end of the year, and they didn't manage
it because there is no agreement
-
on those most pressing issues.
But we expect that the best
-
case scenario is by March, maybe
by June, and it will probably happen
-
earlier. It may be the next couple of
months. And there will be lots of meetings
-
about that. So this is more or less
the timeline. There's no sort of
-
external deadline for this, right, so this
is just an estimation and of course,
-
it may change, but, but this is what we
expect.
-
H: We have another question from the
Internet.
-
S: Does this law consider that
such content is used for psychological
-
warfare by big nations?
A: I'm sorry. I... Again, please.
-
S: This content, pictures or
video or whatsoever, does this law
-
consider that such content is used for
psychological warfare?
-
A: Well, I'm trying to see how that
relates. I think the law is... does not go
-
into details like that in a way. Which
means that I can go back to the definition
-
that basically it's just about the fact
that if the content appears to be positive
-
about terrorist activities, then that's
the basis of taking it down. But there's
-
nothing else that is being actually said
about that. It's not more nuanced than
-
that. So I guess the answer is no.
H: One last question from mic number 2.
-
Mic2: Are there any case studies
published on successful application of
-
similar laws in other countries? I ask
because we have had similar laws in
-
Russia for 12 years and they're not that
useful as far as I see.
-
A: Not that I know of. So I think
it's also a very difficult
-
thing to research because we
can only research what we
-
know that happened. Right? In a way that
you have to have people that actually are
-
vocal about this and that complain about
these laws not being enforced in the
-
proper way. So, for example, content that
is taken down is completely about something
-
else, which also sometimes happens. And,
and that's very difficult. I think
-
the biggest question here is
whether there is an amount of studies
-
documenting that something does not work
that would prevent the European Union from
-
actually having this legislative fever.
And I would argue not, because, as
-
I said, they don't have really good
arguments or really good
-
numbers to justify bringing this law at
all. Not to mention bringing the
-
ridiculous measures that they propose.
So what we say sometimes
-
in Brussels when we're very frustrated
is that we were hoping, you
-
know, being there and advocating for
human rights, that we could
-
contribute to evidence-based
policy. But actually, what's
-
happening is policy-based evidence.
And this is the difficult part. So I
-
am all for studies and I am all for
presenting information that, you know, may
-
possibly help legislators. There are
definitely some MEPs, and probably
-
people even in the commission, who
would wish to
-
have those studies, or would wish to be
able to use them, and who believe in
-
that. Maybe they are just not allowed to
voice their opinion on this because it's
-
a highly political issue. But it just doesn't
translate into the political process.
-
H: Okay. Time's up. If you have any
more questions, you can come
-
up and approach Anna later.
A: Yes.
-
H: Please. Thanks.
So first from me:
-
thanks for the talk. Thanks for
patiently answering...
-
36c3 postroll music
-
Subtitles created by c3subtitles.de
in the year 2021. Join, and help us!