36C3 - Confessions of a future terrorist

  • 0:00 - 0:21
    36c3 preroll music
  • 0:21 - 0:24
    Herald: ...but now we start what we're
    here for and I'm really happy to be
  • 0:24 - 0:31
    allowed to introduce Anna Mazgal. She will
    talk about something with a great title.
  • 0:31 - 0:36
    I love it. "Confessions of a Future
    Terrorist". Terror, terrorism is the one
  • 0:36 - 0:40
    thing you can always shout out and you get
    everything through. And she will give us a
  • 0:40 - 0:48
    rough guide to overregulating free speech
    with anti-terrorist measures. Anna works
  • 0:48 - 0:52
    for Wikimedia where she's a lobbyist for
    human rights in the digital environment
  • 0:52 - 0:57
    and works in Brussels. And she gives a lot
    of talks. And I think it's the first time
  • 0:57 - 0:59
    at a congress for her, is that right?
    Anna: Second time.
  • 0:59 - 1:03
    Herald: Second time? I haven't really
    researched right, because I searched for
  • 1:03 - 1:08
    it. So I have to do this again. It's her
    2nd time for congress and I'm really
  • 1:08 - 1:12
    happy to have her here. Please welcome her
    with a big round of applause.
  • 1:12 - 1:22
    Anna, the stage is yours.
    Anna: Thank you. Yes. So as you have
  • 1:22 - 1:28
    already heard, I don't do any of the cool
    things that we Wikimedians and Wikipedians
  • 1:28 - 1:34
    do. I am based in Brussels and the
    L-word, I do the lobbying on behalf of our
  • 1:34 - 1:43
    community. And today I am here because I
    wanted to talk to you about one of the
  • 1:43 - 1:48
    proposals of 4 laws that we are now
    observing the development of. And I wanted
  • 1:48 - 1:53
    to share my concerns also as
    an activist, because I'm really worried
  • 1:53 - 1:58
    how, if that law passes in its worst
    possible version or one of the bad
  • 1:58 - 2:02
    versions, how it will affect my work. I'm
    also concerned how it will affect
  • 2:02 - 2:09
    your work. And basically all of our
    expression online. And I also want to
  • 2:09 - 2:16
    share with you that this law makes me
    really angry. So, so I think these are a
  • 2:16 - 2:23
    few good reasons to be here and to talk to
    you. And I hope after this presentation we
  • 2:23 - 2:26
    can have a conversation about this. And
    I'm looking forward also to your
  • 2:26 - 2:34
    perspective on it, and also the things
    you may not agree with. So, what
  • 2:34 - 2:42
    is this law? So, in September 2018, the
    European Commission came out with a
  • 2:42 - 2:46
    proposal of regulation on preventing
    the dissemination of terrorist content
  • 2:46 - 2:54
    online. So there are a few things to
    unpack here. First
  • 2:54 - 2:59
    of all, when we see a law that is about
    the Internet, that is about content, and
  • 2:59 - 3:03
    that is about the online environment, and
    it says it will prevent something, this
  • 3:03 - 3:09
    always brings a very difficult and
    complicated perspective for us, the
  • 3:09 - 3:14
    digital rights activists in Brussels.
    Because prevention online never means
  • 3:14 - 3:20
    anything good. So this is one thing.
    The other thing is this very troubled
  • 3:20 - 3:24
    concept of terrorist content. I will be
    talking about this more, and I
  • 3:24 - 3:29
    will show you how the European Commission
    understands it and what are the problems
  • 3:29 - 3:33
    with that understanding and whether this
    is something that can actually be really
  • 3:33 - 3:42
    defined in the law. So these are
    already the red flags that I have seen
  • 3:42 - 3:48
    and we have seen when we
    first got the proposal into our
  • 3:48 - 3:53
    hands. I would like to tell you a little
    bit about the framework of it. This is
  • 3:53 - 4:01
    probably the driest part of it,
    but I think it's important to correctly
  • 4:01 - 4:07
    place it. First of all, this is the
    European Union legislation. So we're
  • 4:07 - 4:16
    talking about the legislation that will
    influence 27 member states. Maybe 28, but
  • 4:16 - 4:23
    we know about Brexit, so it is
    debatable what's going to happen there.
  • 4:23 - 4:29
    And it's important to note that whenever
    we have European legislation in the EU,
  • 4:29 - 4:36
    these are the laws that
    actually are shaping the laws of all those
  • 4:36 - 4:40
    countries. And they come before the
    national laws. So, should this
  • 4:40 - 4:46
    be implemented in any form, or rather
    when it's implemented in any form,
  • 4:46 - 4:53
    this is what is going to happen. The next
    important part of information that I want
  • 4:53 - 4:57
    to give you is that this particular
    regulation is a part of the framework that
  • 4:57 - 5:03
    is called Digital Single Market. So in the
    European Union, one of the
  • 5:03 - 5:09
    objectives when the European
    Commission creates a law and when other
  • 5:09 - 5:15
    bodies of the European Union work on it,
    is that the laws in
  • 5:15 - 5:21
    the member states of the
    European Union are actually similar. And
  • 5:21 - 5:26
    the Digital Single Market means that
    we want to achieve something on
  • 5:26 - 5:31
    the Internet that in a way is already
    achieved within the European Union
  • 5:31 - 5:36
    geographically, meaning that we don't want
    the borders on the Internet between people
  • 5:36 - 5:41
    communicating and also delivering goods
    and services in the European Union.
  • 5:41 - 5:45
    And you may ask how that connects with
    the terrorist content and how
  • 5:45 - 5:50
    that connects with today's topic. To
    be honest, I am also puzzled because I
  • 5:50 - 5:57
    think that legislation that talks about
    how people communicate online, and what is
  • 5:57 - 6:02
    considered the speech that we want there
    and what we don't want, should not be a
  • 6:02 - 6:08
    part of a framework that is about the market.
    So this is also something that
  • 6:08 - 6:18
    brings a concern. Also, as you've seen on
    the first slide, this piece of
  • 6:18 - 6:24
    legislation, this proposal is called the
    regulation. And not to go too much into
  • 6:24 - 6:31
    details about what are the forms of
    legislation in the EU, the important thing
  • 6:31 - 6:37
    to know here is that the regulation is a
    law that once it is adopted by the EU,
  • 6:37 - 6:44
    once the parliament votes on it,
    it is binding directly in
  • 6:44 - 6:48
    all the member states of the European
    Union, which means that there is no
  • 6:48 - 6:53
    further discussion on how this should be
    actually used. Of course, in each country
  • 6:53 - 6:58
    there are different decisions being made
    by different bodies. But it means for us,
  • 6:58 - 7:02
    the people that work on this and that want
    to influence the legislative process, that
  • 7:02 - 7:06
    once this law is out of Brussels, there
    is nothing much to be done
  • 7:06 - 7:14
    about how it's going to be
    implemented. And this is
  • 7:14 - 7:19
    important because for now, for us, the discussion
    about this is the one
  • 7:19 - 7:24
    that happens in
    Brussels. There are a few versions of the
  • 7:24 - 7:29
    law. And very quickly, European Commission
    proposes the law. European Parliament
  • 7:29 - 7:34
    looks at it, debates it, and then produces
    its own version of it. So it amends it, or
  • 7:34 - 7:39
    makes it worse. And then the Council of
    the EU, which is the gathering of all the
  • 7:39 - 7:43
    member states and the
    representatives of the governments of the
  • 7:43 - 7:46
    member states, also creates their own
    version. And then, of course, when you
  • 7:46 - 7:50
    have 3 versions, you also need to have a
    lot of conversations and a lot of
  • 7:50 - 7:54
    negotiation, how to put this together into
    one. And all of those bodies have their
  • 7:54 - 8:00
    own ideas. Every one of those bodies
    has its own ideas on how any law should
  • 8:00 - 8:05
    look. So this process is not only
    complicated, but also this negotiation
  • 8:05 - 8:11
    that is called the trilogues. It's actually
    very non-transparent. And there is no, or
  • 8:11 - 8:16
    almost no, official information about
    how those negotiations go, what are the
  • 8:16 - 8:21
    versions of the document and so on. This
    is the part that we are now in. And I will
  • 8:21 - 8:26
    talk more about this later on. Today I
    want to talk to you about the potential
  • 8:26 - 8:31
    consequences of the version that is the
    original one, which is the European
  • 8:31 - 8:36
    Commission's version. And it's because it
    will be very complicated and
  • 8:36 - 8:41
    confusing I guess, if we look at all of
    the proposals that are on the table. But
  • 8:41 - 8:45
    also, it's important because European
    Commission has a lot of influence also
  • 8:45 - 8:54
    informally, both on member states and also
    on - to an extent - on the whole trilogue
  • 8:54 - 8:59
    process. So whatever gains we have in
    other versions or whatever better
  • 8:59 - 9:05
    solutions we have there, they are not
    secure yet. And I promise I'm almost
  • 9:05 - 9:11
    done with this part. There is other
    relevant legislation that we'll
  • 9:11 - 9:16
    consider. One is the E-Commerce Directive.
    And in this, the part that is
  • 9:16 - 9:21
    very relevant for this particular
    conversation is that the platforms,
  • 9:21 - 9:28
    according to this law, or Internet services
    or hosting providers, are not by default
  • 9:28 - 9:33
    responsible for the content that users
    place online. So it's a very
  • 9:33 - 9:39
    important premise that also protects us,
    protects our rights, protects our privacy,
  • 9:39 - 9:47
    that they are not, they cannot
    go after us or they cannot look for
  • 9:47 - 9:51
    the content that could be potentially
    illegal, which would mean that they would
  • 9:51 - 9:56
    have to look into everything. But of
    course, they have to react when somebody
  • 9:56 - 10:01
    notifies them and they have to see whether
    the information that is placed by the
  • 10:01 - 10:08
    users should stay up or not. There is also
    a directive on combating terrorism. And
  • 10:08 - 10:13
    this is a piece of legislation that
    is quite recent. To my best
  • 10:13 - 10:17
    knowledge, not all countries in the
    European Union, not all member states have
  • 10:17 - 10:23
    implemented it yet. So for us, it was also
    very puzzling that we actually have a new
  • 10:23 - 10:30
    law, a new proposal that is talking about
    the communication part of, of what already
  • 10:30 - 10:34
    has been mentioned in this directive. When
    we still don't know how it works, we still
  • 10:34 - 10:39
    don't know because this law is physically
    not being used at all. So this was for
  • 10:39 - 10:46
    us also difficult to understand why the
    commission does not want to wait and see
  • 10:46 - 10:55
    what comes out from the
    directive on combating terrorism. So
  • 10:55 - 11:00
    why would the European Commission and why
    would the European legislators
  • 11:00 - 11:06
    actually want such a law that, again,
    is about the content that people post
  • 11:06 - 11:17
    through different services, and why this is
    an important issue. And why this
  • 11:17 - 11:22
    issue is actually conflated with
    the market questions and the
  • 11:22 - 11:29
    harmonization in the digital market. So
    there are some serious numbers here. 94 %
  • 11:29 - 11:34
    and 89 %. And I'm wondering if you have
    any idea what those numbers are about.
  • 11:34 - 11:36
    Audience: Persons.
    Anna: I'm sorry?
  • 11:36 - 11:41
    Audience: Persons.
    Anna: Yes. It's about people. But the
  • 11:41 - 11:47
    numbers are actually presenting, so there
    was a survey done by Eurostat and those
  • 11:47 - 11:52
    numbers present the percentage of
    people. The first number, 94 %, presents the
  • 11:52 - 11:59
    percentage of people that say that they
    have not come across terrorist content
  • 11:59 - 12:09
    online. Right. So, inversely, only 6 %
    of people actually say that they had
  • 12:09 - 12:13
    access to terrorist content, it's
    important to underline that they say it
  • 12:13 - 12:19
    because there's no way to check what that
    content actually was and of course we can,
  • 12:19 - 12:25
    you know, here is the analogy of what a
    certain American judge said about
  • 12:25 - 12:29
    pornography. I know it when I see it. It's
    not a very good definition of anything,
  • 12:29 - 12:36
    really. So I would argue that actually
    6 % of people being affected by something
  • 12:36 - 12:40
    is not really a big percentage and that
    the European Union actually has bigger
  • 12:40 - 12:46
    problems to deal with and where they can
    spend money and energy on. E.g., we are
  • 12:46 - 12:50
    all affected by, I don't know, air
    pollution and that's much more people.
  • 12:50 - 12:57
    89 % is for people in the age range
    between 15 and 24 who, again, were not
  • 12:57 - 13:01
    affected by something that they would
    consider terrorist content. Of course,
  • 13:01 - 13:04
    won't somebody think of the children?
    There you go.
  • 13:04 - 13:08
    The children and young people also do
    not experience
  • 13:08 - 13:16
    it overwhelmingly. So, but this rationale
  • 13:16 - 13:23
    is being used, 6 % and 11 % as one
    of the reasons why this regulation is
  • 13:23 - 13:27
    important, why this law is important. The
    other reason
  • 13:27 - 13:32
    is the exposure to imagery of violent
    crimes via social media. So, of course, we
  • 13:32 - 13:38
    know that, that platforms such as Facebook
    and YouTube contain all sorts of things
  • 13:38 - 13:43
    that people look at. We also know that
    because of their business models, they
  • 13:43 - 13:55
    sometimes push controversial content or
    violent content into the
  • 13:55 - 14:00
    proposals that
    they give to people to watch or to
  • 14:00 - 14:06
    read. So actually this 2nd part
    is not addressed by this
  • 14:06 - 14:12
    proposal at all. But nevertheless,
    whenever we talk to the representatives of
  • 14:12 - 14:17
    the commission why this law is there, they
    start waving. That was my experience at
  • 14:17 - 14:22
    one of the meetings: the person
    started waving his phone at me and saying,
  • 14:22 - 14:25
    "Well, you know, there are beheading
    videos online and I can show you how
  • 14:25 - 14:28
    horrible it is", which I consider
    to be an emotional
  • 14:28 - 14:32
    blackmail at best, but not really a
    good regulatory impulse.
  • 14:32 - 14:37
    So I guess maybe the commission
    people are somehow
  • 14:37 - 14:42
    mysteriously affected by that content more
    than anything else. I don't mean to joke
  • 14:42 - 14:49
    about those, those videos because of
    course, it is not something that I would
  • 14:49 - 14:55
    want to watch and, and it is very violent.
    But I would also argue that the problem is
  • 14:55 - 14:59
    not that the video is there, but that
    somebody has been beheaded. And this is
  • 14:59 - 15:03
    where we should actually direct our
    attention and look for the sources of that
  • 15:03 - 15:09
    sort of behavior and not only to try and
    clean the Internet. The other reason why,
  • 15:09 - 15:17
    why this law should be enacted is a
    radicalisation. Of course, this is a,
  • 15:17 - 15:22
    this is a problem for certain
    vulnerable populations and people. And we
  • 15:22 - 15:26
    can read about it a lot. And there are
    organizations that are dealing with
  • 15:26 - 15:32
    strategies to counteract radicalisation.
    Again, when we look at the evidence, what
  • 15:32 - 15:38
    is the relationship between
    content that is available online and the
  • 15:38 - 15:43
    fact that people get radicalized in
    different ways and at different levels,
  • 15:43 - 15:46
    we didn't see any research, and the
    commission also did not present any
  • 15:46 - 15:51
    research, that would actually point to at
    least a correlation between the two. So
  • 15:51 - 15:57
    again, asked "How did you come
    up with this idea without really
  • 15:57 - 16:03
    showing the support for your
    claim that radicalisation is connected to
  • 16:03 - 16:09
    that?" This is a quote from a meeting
    that happened in public and journalists were
  • 16:09 - 16:13
    there. Again, the person from the
    commission said, "We had to make a guess,
  • 16:13 - 16:19
    so we made the guess that way." The
    guess being, yes, there is some sort
  • 16:19 - 16:24
    of connection between the content and the
    radicalisation. And then finally, when we
  • 16:24 - 16:28
    read the impact assessment and when we
    look at the different articles or
  • 16:28 - 16:33
    different explanations that the European
    Commission posts about the
  • 16:33 - 16:39
    rationale for this law, of course, they
    bring up the terrorist attacks that have
  • 16:39 - 16:47
    been happening, and they swiftly
    go from naming the different violent
  • 16:47 - 16:53
    events that have happened in
    Europe very recently or quite recently.
  • 16:53 - 16:58
    And they swiftly make a connection between
    the fact that somebody took a truck and
  • 16:58 - 17:06
    ran into a group of people. Or that
    somebody was participating in the shooting
  • 17:06 - 17:10
    or organizing the shooting of people
    enjoying themselves. They swiftly go from
  • 17:10 - 17:15
    this to the fact that regulation of the
    content is needed. Which also, the fact
  • 17:15 - 17:19
    that you put something in one sentence
    does not mean it makes sense. Right? So
  • 17:19 - 17:24
    this is also not very well documented.
    Again, pressed about this, the
  • 17:24 - 17:28
    representative of the European Commission
    said that, well, "We know that and it has
  • 17:28 - 17:34
    been proven in the investigation, that one
    of the people that were responsible for
  • 17:34 - 17:38
    the Bataclan attack actually used the
    Internet before that happened.
  • 17:38 - 17:46
    laughter
    Yes. No more comment needed on that one.
  • 17:46 - 17:52
    So, well, clearly, there are "very good
    reasons", quote unquote, to spend time
  • 17:52 - 17:59
    and citizens' money on working on
    the new law. And I always say that
  • 17:59 - 18:03
    basically these laws are created
    not because there is a reason, but
  • 18:03 - 18:06
    because there is a do-something-doctrine.
    Right. We have a problem, we
  • 18:06 - 18:16
    have to do something. And this is how this
    law, I think, came to be. And the
  • 18:16 - 18:23
    do-something-doctrine in this particular
    case, of course, encompasses a
  • 18:23 - 18:28
    very broad and blurry definition in that
    law. I will talk about this more in a
  • 18:28 - 18:34
    moment. It also encompasses measures:
    if we define something that we want to
  • 18:34 - 18:41
    counteract, we have to basically say
    what should happen. Right. So that the
  • 18:41 - 18:46
    problem is being solved. And there are 3
    measures that I will also explain. One
  • 18:46 - 18:51
    is the removal orders. The other is
    referrals. And the third are so-called
  • 18:51 - 18:57
    proactive measures. This is, I guess, the
    part where we touch the prevention
  • 18:57 - 19:07
    most. And then the third issue,
    one of the things I also want to talk
  • 19:07 - 19:10
    about is the links between the content
    that is being removed and the actual
  • 19:10 - 19:15
    investigations or prosecutions that may
    occur, because of course it's possible
  • 19:15 - 19:21
    that there will be some content found that
    actually does document the crime. And
  • 19:21 - 19:33
    then what do we do about that? So, going
    forward, I do think that the definition
  • 19:33 - 19:37
    and this law, basically its main
    principle, is to normalize state
  • 19:37 - 19:45
    control over how people communicate and
    what they want to say. As it was said
  • 19:45 - 19:51
    before, under the premise of terrorism, we
    can actually pack a lot of different
  • 19:51 - 19:57
    things because people are afraid of this.
    And we have also examples from other
  • 19:57 - 20:03
    topics, other laws that have been debated
    in Brussels. One was public sector
  • 20:03 - 20:10
    information directive, where everybody was
    very happy discussing how much public
  • 20:10 - 20:14
    information should be released and where
    it should come from and how people should
  • 20:14 - 20:19
    have access to it. And a part of public
    information is the information that is
  • 20:19 - 20:24
    produced by companies that perform public
    services, but they may also be private,
  • 20:24 - 20:28
    for example, sometimes transport, public
    transport is provided that way. And
  • 20:28 - 20:32
    actually public transport providers were
    the ones that were saying that they cannot
  • 20:32 - 20:37
    release the information that they have,
    namely timetables and other
  • 20:37 - 20:44
    information about how the system
    works that could be useful for citizens
  • 20:44 - 20:49
    because then it may be used by terrorists.
    I guess that maybe prevents the potential
  • 20:49 - 20:54
    terrorists from going from bus stop to bus
    stop and figuring out how the buses go.
  • 20:54 - 20:58
    But we already know that this does not
    work that way. So this is something
  • 20:58 - 21:04
    that actually normalizes this approach.
    And let's first look at the definition of
  • 21:04 - 21:10
    the proposal as presented by the
    European Commission. So they say
  • 21:10 - 21:14
    basically, let me read: "Terrorist content
    means one or more of the following
  • 21:14 - 21:21
    information. So a) inciting or advocating,
    including by glorifying, the commission of
  • 21:21 - 21:26
    terrorist offences". I do apologise for
    the horrible level of English
  • 21:26 - 21:32
    that they use, I don't know why. And I
    don't apologise for them, but for the fact
  • 21:32 - 21:36
    that they expose you to it. "The
    commission of terrorist offences,
  • 21:36 - 21:40
    Clapping
    thereby causing a danger that such acts be
  • 21:40 - 21:45
    committed". You won't believe how many
    times I had to read all this to actually
  • 21:45 - 21:49
    understand what all those things mean.
    "Encouraging the contribution to terrorist
  • 21:49 - 21:57
    offences". So contribution could be money,
    could be some, I guess material resources.
  • 21:57 - 22:01
    "Promoting the activities of a terrorist
    group, in particular by encouraging the
  • 22:01 - 22:06
    participation in or support to a
    terrorist group. Instructing on methods or
  • 22:06 - 22:10
    techniques for the purpose of committing
    terrorist offenses". And then there is
  • 22:10 - 22:16
    also the definition of "dissemination of
    terrorist content". That basically means
  • 22:16 - 22:20
    "making terrorist content available to
    third parties on the hosting service
  • 22:20 - 22:27
    providers services". As you can probably
    see, the dissemination and the fact that
  • 22:27 - 22:33
    third parties are invoked mean that this
    law is super broad. So it's not only about
  • 22:33 - 22:38
    social media because making content
    available through third parties may mean
  • 22:38 - 22:43
    that I am sharing something over some sort
    of service with my mom and she is a third
  • 22:43 - 22:49
    party in the understanding of this law. So
    we were actually super troubled to see
  • 22:49 - 22:54
    that not only does it encompass services
    that make information available to the
  • 22:54 - 22:59
    public. So the one that we all can see
    like social media, but also that
  • 22:59 - 23:05
    potentially it could be used against
    services that make, let people communicate
  • 23:05 - 23:10
    privately. So that is a that is a big
    issue. The second thing I wanted to direct
  • 23:10 - 23:18
    your attention to is the parts that
    are in italics. It's how soft those
  • 23:18 - 23:25
    concepts are: inciting, advocating,
    glorifying, encouraging, promoting. This
  • 23:25 - 23:29
    is a law that actually potentially can
    really influence how we talk and how we
  • 23:29 - 23:34
    communicate what we want to talk about,
    whether we agree or disagree with certain
  • 23:34 - 23:42
    policies or certain political decisions.
    And all those things are super soft. And
  • 23:42 - 23:47
    it's very, very hard to say what
    they really mean. And I want to
  • 23:47 - 23:54
    give you an example of the same
    content used in 3 different cases to
  • 23:54 - 24:00
    illustrate this. So let's imagine we have
    a group of people that recorded a video
  • 24:00 - 24:04
    and on those videos, they
    say that, well, basically they call
  • 24:04 - 24:11
    themselves terrorists, to make it easier,
    and they say that they wanted to
  • 24:11 - 24:17
    commit all sorts of horrible things in
    specific places, so that constitutes like
  • 24:17 - 24:22
    some sort of a credible threat. And they
    also bragged that they killed someone. And
  • 24:22 - 24:25
    they also say that they're super happy
    about this and so on. And they also, of
  • 24:25 - 24:30
    course, encourage others to join them and
    so on and so on. And the 3 cases would be:
  • 24:30 - 24:36
    1 would be that this particular group
    posted videos on, I don't know, their
  • 24:36 - 24:41
    YouTube channel. The other case would be
    that there is a media outlet that reports
  • 24:41 - 24:47
    on it and either links to this video or
    maybe presents snippets of it. And the
  • 24:47 - 24:51
    third case would be, for example, that
    there is some sort of group that is
  • 24:51 - 24:57
    actually following what's happening in
    that region and collects evidence that can
  • 24:57 - 25:01
    then help identify the people and
    prosecute them for the crimes they commit.
  • 25:01 - 25:08
    Like the crime that our
    exemplary terrorists admitted to
  • 25:08 - 25:13
    committing. And do you think that,
    according to this definition, in your
  • 25:13 - 25:18
    opinion, there is a
    difference between those 3 types of
  • 25:18 - 25:23
    presenting that content: between the
    terrorist group that is presenting it on
  • 25:23 - 25:28
    their channel, the media outlet,
    and the activists? There is none.
  • 25:28 - 25:35
    Because this law does not
    define in any way that the so-called
  • 25:35 - 25:42
    terrorist content is something that is
    published with an intention of actually
  • 25:42 - 25:49
    advocating and glorifying. So the problem
    is not only with content that,
  • 25:49 - 25:54
    let's say, is what we may call
    manifestly illegal. So somebody kills
  • 25:54 - 25:59
    someone and is being recorded and we know
    it's a crime and perhaps we don't want to
  • 25:59 - 26:02
    watch it, although I do think that we
    should also have a discussion in our
  • 26:02 - 26:08
    society, about what we want to see and
    what we don't want to see.
  • 26:08 - 26:13
    From the perspective that
    the world is complicated and we may
  • 26:13 - 26:17
    have the right to access all sorts of
    information, even that is not so
  • 26:17 - 26:23
    pleasant and not so easy to digest. So
    this law does not make this
  • 26:23 - 26:28
    differentiation. There is no mention
    that this should be intentional to qualify
  • 26:28 - 26:35
    to be considered so-called terrorist
    content. And that's a big problem. So as
  • 26:35 - 26:39
    you can see, there is a fallacy in this
    narrative because it will be the member
  • 26:39 - 26:46
    states and their so-called competent
    authorities that will be deciding what the
  • 26:46 - 26:54
    terrorist content is. And, of course,
    Europeans have a tendency
  • 26:54 - 27:00
    to think of
    ourselves as the societies and the
  • 27:00 - 27:07
    nations and the countries that champion
    the rule of law and that actually
  • 27:07 - 27:14
    respect fundamental rights and
    respect freedom of speech. But we also
  • 27:14 - 27:18
    know that this is changing rapidly. And I
    also will show you examples of how that
  • 27:18 - 27:25
    changes in this area that we're talking
    about right now. So I do not have
  • 27:25 - 27:30
    great trust in European governments
    making the correct judgment about
  • 27:30 - 27:45
    that. So we have this category of very
    dubious and very broad terrorist
  • 27:45 - 27:56
    content. And then, how is it
    being done? Basically all the
  • 27:56 - 28:01
    power to decide how
    to deal with that content, is actually
  • 28:01 - 28:05
    outsourced to private actors. So the
    platforms that we are talking about
  • 28:05 - 28:09
    become kind of mercenaries because
    both the commission and I guess many
  • 28:09 - 28:13
    member states say, well, it's not possible
    that the judge will actually look through
  • 28:13 - 28:18
    content that is placed online and give,
    you know, proper judiciary decisions about
  • 28:18 - 28:22
    what constitutes freedom of
    expression and what goes beyond it.
  • 28:22 - 28:29
    Because it hurts other people or
    is basically a proof of something illegal.
  • 28:29 - 28:33
    So the platforms will take those
    decisions. This will be the hosting
  • 28:33 - 28:40
    service providers, as I mentioned. And
    then also a lot of the reliance that they
  • 28:40 - 28:44
    will do it right is put into the wishful
    thinking in this proposal that says, well,
  • 28:44 - 28:48
    basically, you have to put in terms of
    service that you will not host terrorist
  • 28:48 - 28:53
    content. So then again, there is a
    layer in there where the platform,
  • 28:53 - 29:01
    let's say Facebook or Twitter or
    anyone else, actually decides what and how
  • 29:01 - 29:07
    they want to deal with that in detail.
    Also, one thing I didn't mention is that
  • 29:07 - 29:11
    looking at this regulation and looking at
    which platforms should basically
  • 29:11 - 29:16
    have those terms of service, we realized
    that Wikimedia, that actually our platforms,
  • 29:16 - 29:21
    will actually be in the scope of that. So
    not only may that affect the way we can
  • 29:21 - 29:29
    document and reference the articles that
    are appearing on Wikipedia on all those
  • 29:29 - 29:34
    events that are
    described or the groups or the political
  • 29:34 - 29:40
    situation and what not. But also that, you
    know, our community of editors will have
  • 29:40 - 29:45
    less and less to say if we have to put a
    lot of emphasis on terms of service. I
  • 29:45 - 29:50
    just think that we are somehow a
    collateral damage of this. But also this
  • 29:50 - 29:55
    doesn't console me much because, of
    course, the Internet is bigger than our
  • 29:55 - 30:00
    projects. And also, we want to make sure
    that, that content is not being removed
  • 30:00 - 30:08
    elsewhere. So basically the 3 measures are
    the removal orders, as I mentioned. And
  • 30:08 - 30:12
    this is something that is fairly, fairly
    straightforward. And actually, I'm
  • 30:12 - 30:16
    wondering why there has to be a special
    law to bring it into being,
  • 30:16 - 30:20
    because the removal order is basically a
    decision that the competent authority in
  • 30:20 - 30:25
    the member state releases and sends it to
    the platform. The problem is that
  • 30:25 - 30:30
    according to the commission, the platform
    should actually act on it in 1 hour. And
  • 30:30 - 30:35
    then again, we ask them why 1 hour and not
    74 minutes? And they say, "Well, because
  • 30:35 - 30:40
    we actually know", I don't know how, but
    they say they do. Let's take it at face
  • 30:40 - 30:46
    value. "We actually know that the content
    is the most, you know, viral and spreads
  • 30:46 - 30:51
    the fastest within, has the biggest range
    within the 1 hour from appearance". And
  • 30:51 - 30:54
    then we ask them "Well, but how can you
    know that? Do the people that find
  • 30:54 - 31:00
    the content find it exactly at the moment
    when it comes up? Maybe it has been around
  • 31:00 - 31:06
    for 2 weeks and this 1 hour window when it
    went really viral is like long gone". And
  • 31:06 - 31:12
    here they don't really answer, obviously.
    So this is one of the measures
  • 31:12 - 31:17
    that I guess makes the most sense out of
    all of that. Then we have the referrals
  • 31:17 - 31:20
    that we call lazy removal
    orders. And this is really
  • 31:20 - 31:24
    something that is very puzzling for me
    because the referral is a situation in
  • 31:24 - 31:28
    which this competent authority and the
    person working there goes through the
  • 31:28 - 31:33
    content, through the videos or postings
    and looks at it and says, "Well, I think,
  • 31:33 - 31:39
    I think it's against the terms of service
    of this platform, but does not actually
  • 31:39 - 31:44
    release this removal order, but writes to
    the platform, lets them know and says,
  • 31:44 - 31:49
    "Hey, can you please check this out?" I'm
    sorry, I'm confused, is this the time that
  • 31:49 - 31:56
    I have left or the time? OK, good, time is
    important here. So basically, you know,
  • 31:56 - 32:00
    they won't spend the time
    to prepare this removal order
  • 32:00 - 32:06
    and write it and
    tell the platform actually to remove it.
  • 32:06 - 32:09
    But they will just ask them to please
    verify whether this content should be
  • 32:09 - 32:15
    there or not. And first of all, this is
    the real outsourcing of power
  • 32:15 - 32:20
    over the speech and expression. But
    also we know how platforms take those
  • 32:20 - 32:26
    decisions. They have a very short time.
    The people that do it are sitting
  • 32:26 - 32:31
    somewhere, most probably where the content
    is not originating from. So they don't
  • 32:31 - 32:34
    understand the context. Sometimes they
    don't understand the language. And also,
  • 32:34 - 32:38
    you know, it's better to get rid of it
    just in case it really is problematic,
  • 32:38 - 32:42
    right? So this is something that
    completely increases this great gray area
  • 32:42 - 32:52
    of information that is controversial
    enough to be flagged, but it's not illegal
  • 32:52 - 32:57
    enough to be removed by the order. By the
    way, the European Parliament actually
  • 32:57 - 33:03
    kicked this out from their version. So now
    the fight is in this negotiation between
  • 33:03 - 33:08
    the 3 institutions to actually follow this
    recommendation and just remove it, because
  • 33:08 - 33:13
    it really does not make sense. And it
    really makes the people that
  • 33:13 - 33:18
    release those referrals not really
    accountable for their decisions
  • 33:18 - 33:22
    because they don't take the decision. They
    just make a suggestion. And then we have
  • 33:22 - 33:27
    the proactive measures, which most
    definitely will lead to over-policing of
  • 33:27 - 33:32
    content. There is a whole, a very clever
    description in the law that basically
  • 33:32 - 33:36
    boils down to the point that if you are
    going to use content filtering and if
  • 33:36 - 33:41
    you're going to prevent content from
    appearing, then basically you are
  • 33:41 - 33:44
    doing a good job as a platform. And
    this is the way to actually deal with
  • 33:44 - 33:51
    terrorist content. Since, however we
    define it, again, this is very
  • 33:51 - 33:56
    context-oriented, very context-dependent.
    It's really very difficult to say based on
  • 33:56 - 34:00
    what sort of criteria and based, based on
    what sort of databases those
  • 34:00 - 34:05
    automated processes will be, will be
    happening. So, of course,
  • 34:05 - 34:10
    as it happens in today's world,
    somebody privatizes
  • 34:10 - 34:17
    the profits, but the losses are
    always socialized. And this is now no
  • 34:17 - 34:24
    exception from that rule. So, again, when
    we were talking to the European Commission
  • 34:24 - 34:29
    and asking them, "Why is this not a piece
    of legislation that belongs to the
  • 34:29 - 34:37
    enforcement of the law, and that is then
    controlled heavily by the
  • 34:37 - 34:41
    judiciary system and by other sorts of
    oversight that enforcement usually has?"
  • 34:41 - 34:44
    They said, "Well, because, you know,
    when we have those videos of beheadings,
  • 34:44 - 34:48
    they usually don't happen in Europe and
    they are really beyond our jurisdiction".
  • 34:48 - 34:52
    So, of course, nobody will act on it.
    On the very meaningful level of
  • 34:52 - 34:54
    actually finding the people
    that are killing,
  • 34:54 - 34:57
    that are in the business of
    killing others and making
  • 34:57 - 35:03
    sure they cannot continue with this
    activity. So it's very clear that this
  • 35:03 - 35:08
    whole law is about cleaning the Internet
    and not really about meaningfully
  • 35:08 - 35:18
    tackling societal problems that lead to
    that sort of violence. Also the
  • 35:18 - 35:23
    redress, which is the mechanism in which
    the user can say, hey, this is not the
  • 35:23 - 35:26
    right decision. I actually believe this
    content is not illegal at all. And it's
  • 35:26 - 35:31
    important for me to say this and this is
    my right and I want it to be up.
  • 35:31 - 35:39
    Those provisions are very weak. You cannot
    actually protest meaningfully against a
  • 35:39 - 35:43
    removal order of your content. Of course,
    you can always take the states to
  • 35:43 - 35:48
    court. But we know how amazingly
    interesting that is and how fast it
  • 35:48 - 35:53
    happens. So that, so we can I think we
    can agree that there is no meaningful way
  • 35:53 - 36:00
    to actually protest. Also, the state may
    ask, well, actually, this this removal
  • 36:00 - 36:05
    order should... the user should not be
    informed that the content has been has
  • 36:05 - 36:11
    been taken down because of terrorism.
    Or depicting terrorism or glorifying,
  • 36:11 - 36:16
    or whatever. So you may not even know
    why the content is taken down. It will be
  • 36:16 - 36:22
    a secret. For referrals and for proactive
    measures, well, you know what? Go talk to
  • 36:22 - 36:27
    the platform and protest with them. And
    then, of course, the other question is, so
  • 36:27 - 36:31
    who is the terrorist? Right. Because this
    is a very important question that we
  • 36:31 - 36:38
    should have answered if we wanted to... if
    we wanted to have a law that actually is
  • 36:38 - 36:42
    meaningfully engaging with those issues.
    And of course, well, as you know
  • 36:42 - 36:49
    already from what I said, the European
    Commission in that particular case does
  • 36:49 - 36:54
    not provide a very good answer. But we
    have some other responses to that. For
  • 36:54 - 37:03
    example, Europol has created a report and
    then there was a blog post based on that.
  • 37:03 - 37:08
    With the title "On the importance of taking
    down non-violent terrorist content".
  • 37:08 - 37:11
    So we have the European Commission that
    says, "Yes, it's about the beheadings and
  • 37:11 - 37:16
    about the mutilations". And we have
    Europol that says, "You know, actually
  • 37:16 - 37:20
    this non-violent terrorist content is
    super important". So basically what they
  • 37:20 - 37:25
    say and I quote, "Poetry is a literary
    medium that is widely appreciated across
  • 37:25 - 37:30
    the Arab world and is an important part
    of their region's identity. Mastering it
  • 37:30 - 37:35
    provides the poet with singular authority
    in Arabic culture. The most prominent
  • 37:35 - 37:40
    jihadi leaders - including Osama bin Laden
    and former Islamic State spokesman
  • 37:40 - 37:43
    Abu Muhammad al-Adnani - frequently
    included poetry in their
  • 37:43 - 37:46
    speeches or wrote poems of their own.
    Their charisma was
  • 37:46 - 37:49
    closely intertwined with their
    mastery of poetry."
  • 37:49 - 37:55
    So we can see the link that is being made
    by Europol between a very important aspect
  • 37:55 - 37:59
    of a culture that is beautiful and
    enriching, and the fact that
  • 37:59 - 38:03
    Europol wants to see it
    weaponized. The other part of the blogpost
  • 38:03 - 38:08
    was about how ISIS presents interesting
    activities that their members, their
  • 38:08 - 38:13
    fighters, have. And one of
    them is that they are enjoying themselves
  • 38:13 - 38:17
    and smiling and spending time together and
    swimming. So what do we
  • 38:17 - 38:20
    make out of that? So videos of some
    brown people swimming are now
  • 38:20 - 38:28
    terrorist content? This is... the blatant
    racism of this communication
  • 38:28 - 38:33
    really enrages me. And I think it's really
    a shame that nobody called Europol
  • 38:33 - 38:40
    out on this, when the blogpost came up. We
    also have laws in Europe that are
  • 38:40 - 38:45
    different. I mean, this is not the same
    legislation, but that actually give
  • 38:45 - 38:51
    a taste of what may happen. One
    is the Spanish law against hate
  • 38:51 - 38:57
    speech. And, and this is an important
    part. It didn't happen online, but it
  • 38:57 - 39:02
    shows the approach, that basically first
    you have legislators that say, oh, don't
  • 39:02 - 39:06
    worry about this, we really want to go
    after bad guys. And then what happens is
  • 39:06 - 39:11
    that there was a puppeteer performance
    done by 2 people, "The Witch and Don
  • 39:11 - 39:16
    Christóbal" and the puppets were
    actually... this is the kind of
  • 39:16 - 39:23
    Punch and Judy performance in which, this
    is a genre of, of theater, theatric
  • 39:23 - 39:30
    performances, I'm sorry. That is kind of
    full of silly jokes and and sometimes
  • 39:30 - 39:36
    excessive and and unjustified violence
    and, and the, and the full of bad taste.
  • 39:36 - 39:41
    And this is quite serious. And the 2
    characters, the 2 puppets, held a
  • 39:41 - 39:46
    banner that featured a made-up terrorist
    organization. And after that performance,
  • 39:46 - 39:52
    actually, they were charged with, first of
    all, promoting terrorism, even though
  • 39:52 - 39:56
    there is no terrorist organization like
    that. And then also with inciting
  • 39:56 - 40:02
    hatred. And this is what one of
    the puppeteers said after describing this
  • 40:02 - 40:07
    whole horrible experience. Finally, the
    charges were dropped. So this is good. But
  • 40:07 - 40:12
    I think this really sums up who the
    terrorists are and how those laws are being
  • 40:12 - 40:20
    used against people who actually have
    nothing to do with violence. "We were
  • 40:20 - 40:24
    charged with inciting hatred, which is a
    felony created in theory to protect
  • 40:24 - 40:27
    vulnerable minorities; the minorities in
    this case were the church,
  • 40:27 - 40:30
    the police and the legal system."
    laughter
  • 40:30 - 40:34
    Then, again in Spain, I don't want to
    single out this beautiful country, but
  • 40:34 - 40:37
    actually, unfortunately, they
    have good examples. This is a very recent
  • 40:37 - 40:44
    one. So Tsunami Democràtic in Catalonia
    created an app to actually help people
  • 40:44 - 40:50
    organize small actions in a decentralized
    manner. And they placed the documentation
  • 40:50 - 40:57
    on GitHub. And it was taken down by the
    order of, of the Spanish court. And also
  • 40:57 - 41:01
    the - and this is the practical
    application of such laws online - also,
  • 41:01 - 41:07
    the website of Tsunami Democràtic
    was taken down by the court. Of course,
  • 41:07 - 41:11
    both of that on charges
    of facilitating terrorist
  • 41:11 - 41:16
    activities and inciting to
    terrorism. So why is it important?
  • 41:16 - 41:20
    Because of what comes next. So
    there will be the Digital Services Act,
  • 41:20 - 41:24
    which will be an overhaul of this idea
    that I mentioned at the beginning, which
  • 41:24 - 41:28
    is that basically platforms are not
    responsible by default for what we put
  • 41:28 - 41:34
    online. And the European Commission and other
  • 41:34 - 41:38
    actors in the EU are toying with the idea
    that maybe platforms should be somehow
  • 41:38 - 41:44
    responsible. And it's not
    only about social media, but basically
  • 41:44 - 41:51
    anybody, any sort of service
    that helps people place content online.
  • 41:51 - 41:55
    And then, one of the ideas, we
    don't know what it's going to be, it's not
  • 41:55 - 41:58
    there yet. It's going to happen at the
    beginning of the next year, so
  • 41:58 - 42:02
    quite soon. But we can actually
    expect that the so-called "Good Samaritan"
  • 42:02 - 42:06
    rule will be 1 of the solutions proposed.
    What is this rule? This rule basically
  • 42:06 - 42:11
    means if a platform is really going the
    extra mile and doing a good job in
  • 42:11 - 42:17
    removing the content that
    is either illegal or, again, a
  • 42:17 - 42:23
    very difficult category, harmful. I also
    don't know what that exactly means. Then
  • 42:23 - 42:28
    if they behave well, then they will not be
    held responsible. So this is basically a
  • 42:28 - 42:32
    proposal that you cannot really turn down,
    because if you run the business, you want
  • 42:32 - 42:36
    to manage the risk of that and you don't
    want to be fined. And you don't
  • 42:36 - 42:40
    want to pay money. So, of course, you
    try and overpolice and of course you try
  • 42:40 - 42:44
    and you filter the content and of course
    you take content down when it only raises a
  • 42:44 - 42:52
    question of what sort of
    content it is. Is it neutral or
  • 42:52 - 43:00
    is it maybe, you know, making somebody
    offended or... or stirred? And, of course,
  • 43:00 - 43:05
    other attempts, we heard it from Germany,
    which is basically that there was a
  • 43:05 - 43:15
    proposal to actually
    make platforms obliged to give the passwords
  • 43:15 - 43:21
    of users of social media, the people that
    are under investigation or prosecution.
  • 43:21 - 43:27
    And also, of course, we see that one of
    the ideas that supposedly is going to fix
  • 43:27 - 43:32
    everything is that, well, if terrorists
    communicate through encrypted services,
  • 43:32 - 43:35
    then maybe we should do something about
    encryption. And there was a petition
  • 43:35 - 43:42
    already on (?) to
    actually forbid encryption for those
  • 43:42 - 43:47
    services after one of the
    terrorist attacks. So, of course, it
  • 43:47 - 43:53
    sounds very extreme. But
    this is, in my opinion, the next
  • 43:53 - 44:00
    frontier here. So what can we do? Because
    this is all quite difficult. So as I
  • 44:00 - 44:05
    mentioned, the negotiations are still on.
    So there is still time to talk to
  • 44:05 - 44:09
    your government. And this is very important
    because, of course, the governments, when
  • 44:09 - 44:13
    they have this idea... they have this
    proposal on the table, that they will be
  • 44:13 - 44:18
    able to decide finally who is
    the terrorist and what is the terrorist
  • 44:18 - 44:23
    content. And also, that's on one hand. On
    the other hand, they know that people
  • 44:23 - 44:27
    don't really care all that much about what
    happens in the E.U., which is
  • 44:27 - 44:31
    unfortunately true. They are actually
    supporting very much the commission's
  • 44:31 - 44:35
    proposals. The only thing that they don't
    like is the fact that somebody from the
  • 44:35 - 44:42
    police, from another country, can maybe
    interfere with content in their language,
  • 44:42 - 44:47
    because that's one of the provisions that
    is also in there. So, this is what
  • 44:47 - 44:51
    they don't like. They want to keep
    the territoriality of their
  • 44:51 - 44:57
    enforcement laws intact. But there is
    still time and we can still do this. And
  • 44:57 - 45:00
    if you want to talk to me about
    what are the good ways to do it, I'm
  • 45:00 - 45:05
    available here. And I would love to take
    that conversation up with you. The other
  • 45:05 - 45:11
    is a very simple measure that I believe
    always works. It is one that basically
  • 45:11 - 45:16
    is about telling just 1 friend, even 1
    friend, and asking them to do the same to
  • 45:16 - 45:20
    talk to other people about this. And there
    are 2 reasons to do it. One is because, of
  • 45:20 - 45:24
    course, then we make people aware of what
    is happening. And the other, which in this
  • 45:24 - 45:29
    particular case is very important, is
    that basically people are scared of
  • 45:29 - 45:33
    terrorism and, and they support a lot of
    measures just because they hear this word.
  • 45:33 - 45:37
    And when we explain what that
    really means and when we unpack this a
  • 45:37 - 45:40
    little bit, we build the resilience to
    those arguments. And I think it's
  • 45:40 - 45:43
    important. The other people
    who should know about this
  • 45:43 - 45:46
    are activists working with
    vulnerable groups because of the
  • 45:46 - 45:50
    stigmatization that I
    already mentioned and because
  • 45:50 - 45:55
    of the fact that we need to document
    horrible things that are happening to
  • 45:55 - 45:59
    people in other places in the world and
    also here in Europe. And journalists
  • 45:59 - 46:03
    and media organizations, because they
    will be affected by this law, and in the
  • 46:03 - 46:07
    way they can report and where they
    can get the sources for their
  • 46:07 - 46:12
    information. So I think I went massively
    over time from what was planned. I hope
  • 46:12 - 46:17
    we can still have some questions. Thank
    you. So, yeah. Talk to me more about this
  • 46:17 - 46:23
    now and then after the talk. Thank you.
  • 46:23 - 46:33
    applause
  • 46:33 - 46:37
    Herald: Thanks for your talk. We still
    have time for questions, so please, if you
  • 46:37 - 46:43
    have a question, line up at the mics. We
    have 1, 2, 3 evenly distributed through
  • 46:43 - 46:47
    the room. I want to remind you really
    quickly that a question normally is one
  • 46:47 - 46:51
    sentence and ends with a question mark.
    laughter
  • 46:51 - 46:56
    Not everybody seems to know that. So we
    start with mic number 2.
  • 46:56 - 47:02
    Mic2: Hello. I... so I run Tor Relays in
    the United States. It seems like a lot of
  • 47:02 - 47:08
    these laws are focused on the notion of
    centralized platforms. Do they define what
  • 47:08 - 47:12
    a platform is and are they going to
    extradite me because I am facilitating a Tor
  • 47:12 - 47:16
    Onion service?
    A: Should I answer now?
  • 47:16 - 47:22
    H: Yeah.
    A: Okay, yes, so they do and they don't
  • 47:22 - 47:26
    in a way: the definition,
    based basically on what
  • 47:26 - 47:31
    the hosting provider
    is in the European law, is
  • 47:31 - 47:36
    actually very broad. So it doesn't take
    into account how big you are or
  • 47:36 - 47:42
    how you run your services. The bottom line
    is that if you allow people to put content
  • 47:42 - 47:47
    up and share it with, again, 3rd party,
    which may be the whole room here, it may
  • 47:47 - 47:51
    be the whole world but it may be just the
    people I want to share things with.
  • 47:51 - 47:58
    Then you're obliged to use the
    measures... or to comply with
  • 47:58 - 48:02
    the measures that are envisioned in this
    regulation. And there's a
  • 48:02 - 48:07
    debate also in the parliament. It was
    taken up and narrowed down actually to the
  • 48:07 - 48:12
    communication to the public. So I guess
    then as you correctly observe, it is more
  • 48:12 - 48:17
    about the big platforms or about the
    centralized services. But actually, in
  • 48:17 - 48:21
    the commission version, nothing makes me
    believe that only they will be
  • 48:21 - 48:26
    affected. On the contrary, also the
    messaging services may be.
  • 48:26 - 48:35
    H: Okay, um, next question, mic number 3.
    Mic3: It's, uh, like with the upload
  • 48:35 - 48:41
    filters, the copyright directive, it was a
    really similar debate, especially on
  • 48:41 - 48:47
    small companies, because, um, uh, at that
    time, the question was they tried to push
  • 48:47 - 48:51
    upload filters for copyrighted content. And
    the question was, uh, how does that fit
  • 48:51 - 48:56
    with small companies? And they still
    haven't provided an answer to that. Uh,
  • 48:56 - 49:00
    the problem is they took the copyright
    directive and basically inspired
  • 49:00 - 49:04
    themselves from the upload filters and
    applied it to terrorist content. And it's
  • 49:04 - 49:08
    again, the question, how does that work
    with small Internet companies that have to
  • 49:08 - 49:14
    have someone on call during the
    nights and things like that. Even big
  • 49:14 - 49:17
    providers, I heard, don't have the
    means to properly enforce
  • 49:17 - 49:23
    something like this. This is a
    killer for the European Internet industry.
  • 49:23 - 49:26
    A: Yes.
    laughter
  • 49:26 - 49:32
    applause
    H: I want to give a short reminder on the
  • 49:32 - 49:39
    1 sentence rule. We have a question from
    the Internet. Signal angel, please.
  • 49:39 - 49:45
    Signal Angel: Yes, the question is,
    wouldn't decentralized social networks
  • 49:45 - 49:52
    bypass these regulations?
    A: I'm not a lawyer, but I will give
  • 49:52 - 49:56
    an answer to this
    question that a lawyer would give,
  • 49:56 - 49:59
    I maybe spent too much time with lawyers.
    That depends.
  • 49:59 - 50:01
    laughter
    A: Because it really does, because this
  • 50:01 - 50:06
    definition of who is obliged is so broad
    that a lot depends on the context, a lot
  • 50:06 - 50:11
    depends on what is happening, what is
    being shared and how. So it's very
  • 50:11 - 50:15
    difficult to say. I just want to say that
    we also had this conversation about
  • 50:15 - 50:20
    copyright and many people came to me last
    year at Congress. I wasn't giving a talk
  • 50:20 - 50:25
    about it, but I was at the talk about the
    copyright directive and the filtering. And
  • 50:25 - 50:29
    many people said, well, actually, you
    know, if you're not using those services,
  • 50:29 - 50:33
    you will not be affected. And actually,
    when we share peer to peer, then this is
  • 50:33 - 50:37
    not an issue. But actually, this is
    changing. And there is actually
  • 50:37 - 50:42
    a decision of the European Court of
    Justice. And the decisions are not
  • 50:42 - 50:45
    basically the law itself, but they
    are very often then followed and
  • 50:45 - 50:49
    incorporated. And this is
    the decision on the Pirate Bay.
  • 50:49 - 50:54
    And in this decision,
    the court says that, well, the argument
  • 50:54 - 50:58
    that Pirate Bay made was basically we're
    not hosting any content. We're just
  • 50:58 - 51:04
    connecting people with it. And, in short,
    the court said, well,
  • 51:04 - 51:09
    actually, we don't care. Because you
    organize it, you optimize the
  • 51:09 - 51:12
    information, you
    bring it to people.
  • 51:12 - 51:16
    And the fact that you don't share
    it yourself does not really mean anything. And
  • 51:16 - 51:20
    you are liable for the copyright
    infringements. So, again, this is about a
  • 51:20 - 51:27
    different issue, but this is a very
    relevant way of thinking that we may
  • 51:27 - 51:31
    expect that it will be translated into
    other types of content. So, again, the
  • 51:31 - 51:36
    fact that you don't host anything but
    you just connect people to one another
  • 51:36 - 51:42
    may not be something
    that will take you off the hook.
  • 51:42 - 51:49
    H: Microphone number 3.
    Mic3: Do these proposals contain or...
  • 51:49 - 51:55
    what sort of repercussions do these
    proposals contain for filing
  • 51:55 - 51:59
    removal requests that are later determined
    to be illegitimate? Is this just a free
  • 51:59 - 52:03
    pass to censor things? Or... are
    there repercussions?
  • 52:03 - 52:08
    A: I just want to make sure I
    understand, you mean the removal orders,
  • 52:08 - 52:10
    the ones that say remove content, and
    that's it?
  • 52:10 - 52:13
    Mic3: Yeah. If somebody files a removal
    order that is determined later to be
  • 52:13 - 52:17
    completely illegitimate, are there
    repercussions?
  • 52:17 - 52:23
    A: Well, the problem starts even before
    that because again, the removal orders are
  • 52:23 - 52:27
    being issued by competent authorities. So
    there's like a designated authority that
  • 52:27 - 52:31
    can do it. Not everybody can. And
    basically, the order says this is the
  • 52:31 - 52:36
    content. This is the URL. This is the
    legal basis. Take it down. So there is no
  • 52:36 - 52:41
    way to protest it. And the platform can
    only not follow this order within 1 hour
  • 52:41 - 52:46
    in 2 situations. One is force majeure.
  • 52:46 - 52:50
    Basically, there's some sort of external
    circumstance that prevents them from
  • 52:50 - 52:52
    doing it. I don't know,
    a complete power outage
  • 52:52 - 52:55
    or a problem with their servers,
    so that basically they cannot
  • 52:55 - 53:00
    access and remove or block access
    to this content. The other is if the
  • 53:00 - 53:05
    request... the removal order, I'm sorry,
    contains errors that actually make it
  • 53:05 - 53:09
    impossible to do. So, for example, there
    is no URL or it's broken and it doesn't
  • 53:09 - 53:14
    lead anywhere. And these are the only 2
    situations. In all other cases, the content has
  • 53:14 - 53:20
    to be removed. And there is no way for the
    user and no way for the platform
  • 53:20 - 53:24
    to actually say, well, hold on, this is
    not the way to do it, and no way, after
  • 53:24 - 53:29
    it's been implemented, to say, well, that
    was a bad decision. As I said, you can
  • 53:29 - 53:34
    always go to court with your
    state, but not many people will do it.
  • 53:34 - 53:39
    And this is not really a meaningful
    way to address this.
  • 53:39 - 53:46
    H: Next question. Mic number 3.
    Mic3: How many... how much time do we have
  • 53:46 - 53:51
    to contact the parliamentarians to inform
    them maybe that there is some big issue
  • 53:51 - 53:55
    with this? What's the worst case
    timetable at the moment?
  • 53:55 - 54:00
    A: That's a very good question. And thank
    you for asking, because
  • 54:00 - 54:05
    I forgot to mention this. This actually is
    quite urgent. So, as usual
  • 54:05 - 54:10
    in those situations, the
    commission wanted to close the thing by
  • 54:10 - 54:15
    the end of the year and they didn't manage
    it because there is no agreement
  • 54:15 - 54:20
    on those most pressing issues.
    But we expect that the best
  • 54:20 - 54:26
    case scenario is that it goes until March,
    maybe until June, but it will probably happen
  • 54:26 - 54:31
    earlier. It may be the next couple of
    months. And there will be lots of meetings
  • 54:31 - 54:37
    about that. So this is more or less
    the timeline. There's no sort of
  • 54:37 - 54:42
    external deadline for this, right, so this
    is just an estimate and, of course,
  • 54:42 - 54:44
    it may change, but this is what we
    expect.
  • 54:44 - 54:47
    H: We have another question from the
    Internet.
  • 54:47 - 54:52
    S: Does this law consider that
    such content is used for psychological
  • 54:52 - 54:59
    warfare by big nations?
    A: I'm sorry. I... Again, please.
  • 54:59 - 55:04
    S: This content, pictures or
    video or whatever, does this law
  • 55:04 - 55:09
    consider that such content is used for
    psychological warfare?
  • 55:09 - 55:18
    A: Well, I'm trying to see how that
    relates. I think the law does not go
  • 55:18 - 55:25
    into details like that in a way. Which
    means that I can go back to the definition
  • 55:25 - 55:32
    that basically it's just about the fact
    that if the content appears to be positive
  • 55:32 - 55:38
    about terrorist activities, then that's
    the basis of taking it down. But there's
  • 55:38 - 55:42
    nothing else that is being actually said
    about that. It's not more nuanced than
  • 55:42 - 55:49
    that. So I guess the answer is no.
    H: One last question from mic number 2.
  • 55:49 - 55:55
    Mic2: Are there any case studies
    published on successful application of
  • 55:55 - 55:59
    similar laws in other countries? I ask
    because we have had similar laws in
  • 55:59 - 56:07
    Russia for 12 years and they're not that
    useful as far as I see.
  • 56:07 - 56:11
    A: Not that I know of. So I think
    it's also a very difficult
  • 56:11 - 56:16
    thing to research because we
    can only research what we
  • 56:16 - 56:21
    know that happened. Right? In a way that
    you have to have people that actually are
  • 56:21 - 56:27
    vocal about this and that complain about
    these laws not being enforced in the
  • 56:27 - 56:32
    proper way. So, for example, content that
    is taken down is completely about something
  • 56:32 - 56:39
    else, which also sometimes happens. And
    that's very difficult. I think
  • 56:39 - 56:45
    the biggest question here is
    whether there is any amount of studies
  • 56:45 - 56:50
    documenting that something does not work
    that would prevent the European Union from
  • 56:50 - 56:57
    actually having this legislative fever.
    And I would argue not, because, as
  • 56:57 - 57:01
    I said, they don't have really good
    arguments or they don't really have good
  • 57:01 - 57:06
    numbers to justify bringing this law at
    all. Not to mention bringing the
  • 57:06 - 57:12
    ridiculous measures that they propose.
    So what we say sometimes
  • 57:12 - 57:16
    in Brussels, when we're very frustrated,
    is that we were hoping, you
  • 57:16 - 57:21
    know, being there and advocating for
    human rights, that we could
  • 57:21 - 57:26
    contribute to evidence-
    based policy. But actually, what's
  • 57:26 - 57:32
    happening is policy-based evidence.
    And this is the difficult part. So I
  • 57:32 - 57:38
    am all for studies and I am all for
    presenting information that, you know, may
  • 57:38 - 57:42
    possibly help legislators. There are
    definitely some MEPs or some people there,
  • 57:42 - 57:46
    even probably in the commission, maybe
    they just are not allowed to voice
  • 57:46 - 57:51
    their opinion on this because it's a
    highly political issue, who would wish to
  • 57:51 - 57:55
    have those studies or would wish to be
    able to use them, and who believe in
    that. But it just doesn't
    that. But it's just, it doesn't
    translate into the political process.
  • 58:01 - 58:07
    H: Okay. Time's up. If you have any
    more questions, you can come
  • 58:07 - 58:10
    up and approach Anna later.
    A: Yes.
  • 58:10 - 58:14
    H: Please. Thanks.
    So, first, from me:
  • 58:14 - 58:17
    thanks for the talk, thanks for
    patiently answering...
  • 58:17 - 58:21
    36c3 postroll music
  • 58:21 - 58:43
    Subtitles created by c3subtitles.de
    in the year 2021. Join, and help us!