
35C3 - Citizens or subjects? The battle to control our bodies, speech and communications

  • 0:00 - 0:18
    35C3 preroll music
  • 0:18 - 0:25
    Herald Angel: And now to announce the
    speakers. These are Diego Naranjo, who's a
  • 0:25 - 0:32
    senior policy advisor at EDRi and Andreea
    Belu, who's a campaigns and communications
  • 0:32 - 0:36
    manager also at EDRi. EDRi stands for
    European Digital Rights which is an
  • 0:36 - 0:41
    umbrella organization of European NGOs
    active in the field of freedom, rights and
  • 0:41 - 0:46
    the digital sphere. CCC is actually a
    founding member of it. And they will be
  • 0:46 - 0:51
    talking about "Citizens or subjects? The
    battle to control our bodies, speech and
  • 0:51 - 1:02
    communications". The floor is yours.
    applause
  • 1:02 - 1:10
    Andreea Belu: This one this one here. This
    is my phone. There are many like it but
  • 1:10 - 1:17
    this one is mine. My phone is my best
    friend. It is my life and I should master
  • 1:17 - 1:27
    it as I master my life. This is my phone.
    But what makes it mine? I mean it might be
  • 1:27 - 1:32
    quite obvious right now that I'm holding
    it for all of you. What is not that
  • 1:32 - 1:42
    obvious though is that my phone is also
    holding me. On the one hand, we use our phones:
  • 1:42 - 1:47
    we use them to connect to the Internet, get
    online with our friends, exchange
  • 1:47 - 1:57
    opinions, coordinate actions. On the other
    hand, we are used. We are used by third
  • 1:57 - 2:06
    parties, governmental, private, who
    through our phones, through our devices,
  • 2:06 - 2:19
    monitor. They monitor our location, our
    bodies, our speech, the content we share.
  • 2:19 - 2:26
    At EU level right now there is a sort of a
    pattern. There is this tendency, a trend
  • 2:26 - 2:33
    almost. Certain laws like the ePrivacy,
    the Copyright Directive, the Terrorist
  • 2:33 - 2:42
    Regulation, have this very central core
    that we call the body and speech control.
  • 2:42 - 2:49
    It looks like it is really the driving
    force at the moment. So in the next 40
  • 2:49 - 2:56
    minutes or so, what we will do is give you
    short updates about these laws. Talk to
  • 2:56 - 3:01
    you a bit about what their impact is on
    us, and what they mean beyond Article
  • 3:01 - 3:09
    X and Y, and hopefully convince you to get
    involved in changing how they look right
  • 3:09 - 3:13
    now.
    Diego Naranjo: While we represent European
  • 3:13 - 3:18
    Digital Rights, as Walter was mentioning
    before, we are 39 human rights
  • 3:18 - 3:23
    organizations from all across Europe. We
    work on all sorts of human rights in the
  • 3:23 - 3:28
    online environments, so-called digital
    rights. We work on data protection, net
  • 3:28 - 3:33
    neutrality, privacy, freedom of expression
    online and so on, and Andreea and I are
  • 3:33 - 3:39
    glad to be for the very first time here at
    35C3.
  • 3:39 - 3:46
    Andreea: Now to continue that quote, that
    adapted quote from Full Metal Jacket, my
  • 3:46 - 3:54
    phone without me is useless. Without my
    phone I am useless. We spend most of our
  • 3:54 - 4:00
    seconds of our lifetime around devices
    that are connected to the internet,
  • 4:00 - 4:07
    whether a phone, a computer, a fridge or
    whatnot. This means that these devices
  • 4:07 - 4:14
    pretty much become attached to our bodies,
    especially a phone. Tracking these devices
  • 4:14 - 4:25
    therefore is equal to tracking our bodies.
    Controlling our bodies. For the purpose of
  • 4:25 - 4:30
    this presentation, we will talk about
    online tracking in terms of location
  • 4:30 - 4:38
    tracking, the tracking of our devices, the
    behavior tracking of users on websites -
  • 4:38 - 4:44
    how much time they spend on, I don't know,
    what part of a website, where they
  • 4:44 - 4:49
    navigate next, how many clicks they make,
    and the tracking of communications sent
  • 4:49 - 4:58
    between two devices.
    Diego: First, location tracking. There are
  • 4:58 - 5:03
    on average more screens in most
    households than people. We carry
  • 5:03 - 5:08
    some of these devices in our pockets and
    they have more personal information than
  • 5:08 - 5:13
    most diaries used to have before. Our
    phones need to be tracked because they
  • 5:13 - 5:19
    need to be able to receive and send calls,
    messages, data. But this opens of course
  • 5:19 - 5:25
    new ways to use location data for
    commercial purposes, but also for state
  • 5:25 - 5:28
    surveillance.
    Andreea: When it comes to behavioral
  • 5:28 - 5:34
    tracking, tracking our behavior
    online provides a lot more information
  • 5:34 - 5:41
    than just location. It adds on top of it,
    right? A user can then be targeted
  • 5:41 - 5:52
    according to that tracking. And this
    tracking-targeting process basically
  • 5:52 - 5:58
    represents the business model of the
    Internet nowadays. For this reason the
  • 5:58 - 6:05
    more complex and detailed someone's
    profile is, well, the more accurately the
  • 6:05 - 6:13
    targeting can be done, the more effective
    and efficient it is, most of the time, and
  • 6:13 - 6:19
    therefore the more valuable the data about the
    profile is. You can see here a really cool
  • 6:19 - 6:31
    infographic from Cracked Labs on Acxiom's
    and Oracle's profiling of populations. You
  • 6:31 - 6:37
    see the number of variables and the amount
    of information and the depth it
  • 6:37 - 6:43
    goes to, and you get that business model, a
    cash flow.
  • 6:43 - 6:48
    Diego: And this business model is quite
    interesting. I wouldn't imagine a postman
  • 6:48 - 6:54
    going to my physical mailbox at home going
    through my letters and then pouring some
  • 6:54 - 7:01
    leaflets for advertising in there
    according to what he reads. Right now
  • 7:01 - 7:06
    Gmail and many other services, that's what
    they do, as you well know:
  • 7:06 - 7:12
    reading your emails to sell you stuff that
    you don't really need. Facebook conversations
  • 7:12 - 7:18
    are now, through the API, an option:
    they want to read those
  • 7:18 - 7:23
    conversations in order to find patterns,
    for example for intellectual property
  • 7:23 - 7:28
    infringements, especially for
    counterfeiting but not only, also for
  • 7:28 - 7:35
    copyright. Now also WhatsApp metadata is
    used by Facebook in order to know who your
  • 7:35 - 7:41
    friends are, who you contact, who your
    family is, in order for those social media
  • 7:41 - 7:45
    services to gain more and more power, more
    and more data, and more and more profit of
  • 7:45 - 7:55
    course. The Lives of Others. Right? It's a
    good movie. If you haven't seen the movie,
  • 7:55 - 8:01
    I guess everyone has seen it. If not you
    should. It's basically about a Stasi agent
  • 8:01 - 8:06
    who follows the life of an individual
    through a period of time, and after
  • 8:06 - 8:13
    there's no other revelations. This movie
    has changed in a way from drama to soft
  • 8:13 - 8:19
    comedy because the capabilities of
    surveillance services to surveil all of
  • 8:19 - 8:23
    us, to get so much data, also for
    companies to get so much intimate
  • 8:23 - 8:32
    information from us, have doubled, tripled,
    or perhaps grown exponentially compared to
  • 8:32 - 8:39
    what the Stasi would do back then. And all
    of this... to some extent it's going to be
  • 8:39 - 8:45
    regulated by this regulation with a very,
    very long name. I guess half of you will
  • 8:45 - 8:51
    have fallen asleep already going through
    it, but I'll go through it quickly to
  • 8:51 - 8:56
    let you know what this is about. Why do we
    need to know about the ePrivacy Regulation,
  • 8:56 - 9:02
    why it is important for the control of our
    bodies and our devices. The ePrivacy is
  • 9:02 - 9:07
    about online tracking and you might have
    heard of the cookie directive, the one
  • 9:07 - 9:13
    that brought all those cookie banners onto your
    screens, in part due to a bad
  • 9:13 - 9:18
    implementation of this directive. It is
    also about your emails. Who's going to be
  • 9:18 - 9:22
    able to read your emails or use the data
    from your emails to sell you advertising
  • 9:22 - 9:29
    or not. How confidential that information
    can be. It's also about your chats. How
  • 9:29 - 9:32
    you communicate nowadays with
    your WhatsApp, Signal, Wire, or any other
  • 9:32 - 9:39
    apps. And finally it's also about
    location data. Who can track you, why can
  • 9:39 - 9:45
    they track you and what are the safeguards
    that need to be put in place in order
  • 9:45 - 9:51
    to protect your privacy. And I can
    imagine many of you saying well don't we
  • 9:51 - 9:58
    already have this GDPR thingy from the
    emails I received in May. Yes, we do have
  • 9:58 - 10:04
    that. But the GDPR was not enough. After
    more than four years of discussions to
  • 10:04 - 10:10
    achieve this General Data Protection
    Regulation we have achieved a lot, and
  • 10:10 - 10:17
    GDPR was our best possible outcome in the
    political scenario of the time. And there's a
  • 10:17 - 10:19
    lot to do regarding the implementation
    though.
  • 10:19 - 10:21
    We have seen problems in Romania,
  • 10:21 - 10:26
    in Spain, and we expect that to happen in
    many other places. But we still need a
  • 10:26 - 10:31
    specific instrument to cover the right to
    privacy in electronic communications,
  • 10:31 - 10:35
    including everything we mentioned before.
    Metadata, chats, location data, the
  • 10:35 - 10:42
    content of your communications and so on.
    Andreea: So ePrivacy is basically meant to
  • 10:42 - 10:49
    complement GDPR and be more focused on
    exactly the topics that Diego mentioned.
  • 10:49 - 10:56
    What did we advocate for and still do?
    Privacy by design and privacy by default
  • 10:56 - 11:02
    should be the core principles, the pillars
    of this regulation. Moreover politicians
  • 11:02 - 11:10
    need to recognize the value of maintaining
    and enhancing secure encryption. Cookie
  • 11:10 - 11:16
    walls: I mean, we should be able to visit a
    website without having to agree to being
  • 11:16 - 11:23
    tracked by cookies. This is another topic
    that we strongly advocated for.
  • 11:23 - 11:30
    Finally, content should be protected
    together with metadata in storage and in
  • 11:30 - 11:41
    transit. And we actually succeeded at the
    end of last year, 2017. The Parliament
  • 11:41 - 11:49
    adopted a very good text, a very strong
    text. It addressed most of the
  • 11:49 - 11:55
    problems that we pointed out and
  • 11:55 - 12:05
    supported the values that we were going
    for. But this has been quite a ride.
  • 12:05 - 12:13
    I mean it wasn't easy. As Diego said we're
    a network of 39 organizations. They're not
  • 12:13 - 12:19
    just legal people or tech people; it's
    a combination of both. So when we provided
  • 12:19 - 12:27
    our input in the shape of analysis or
    recommendations, some bulleted there, all
  • 12:27 - 12:33
    sorts of skills were combined. And this
    played a big part in our success, the fact
  • 12:33 - 12:38
    that we were able to provide a
    comprehensive yet complex analysis of what
  • 12:38 - 12:45
    encryption should look like, of what
    cookies should act like, and also a legal
  • 12:45 - 12:54
    analysis of existing legislation. The
    diversity of our skills became productive.
  • 12:54 - 13:00
    Diego: Did we win? Well, we are on our
    way. After the EU parliament adopted its
  • 13:00 - 13:07
    position, now it needs to sort of enter
    into discussion with the member states in
  • 13:07 - 13:11
    what is called the Council of the EU. So
    the Parliament, with a strong position, now
  • 13:11 - 13:16
    has to talk with the rest of the member
    states. Currently the negotiations
  • 13:16 - 13:20
    around ePrivacy are not really moving
    forward. They are being delayed by the
  • 13:20 - 13:25
    national governments. They are claiming
    that there are issues that need to be
  • 13:25 - 13:29
    tackled. That it is very technical, that
    we already had the GDPR and we need to see
  • 13:29 - 13:34
    how it is implemented, and Member States
    fear that another layer of protection
  • 13:34 - 13:41
    may prevent some businesses from growing in
    the European Union. And if this were not
  • 13:41 - 13:47
    enough, they are also afraid of getting bad
    press from the press, which right now depends
  • 13:47 - 13:51
    to a high extent on behavioural
    advertising. They say that without
  • 13:51 - 13:58
    tracking all over the internet they are
    unable to sustain their business model.
  • 13:58 - 14:02
    And of course since the national
    governments, the politicians, are afraid
  • 14:02 - 14:10
    of that bad press from the press, then
    they are quite cautious about moving forward.
  • 14:10 - 14:16
    Online we exercise our free speech in
    many ways, and one of those ways is the
  • 14:16 - 14:23
    way we produce, share, or enjoy our
    content online. Our opinions, the people
  • 14:23 - 14:28
    with whom we communicate, can be seen as a
    threat at a given time by certain
  • 14:28 - 14:33
    governments. We have seen the trend in
    certain governments, such as in Poland,
  • 14:33 - 14:38
    Hungary, and to a certain extent in
    Spain as well. All of this
  • 14:38 - 14:43
    information can also be very profitable,
    as we see with the mainstream social media
  • 14:43 - 14:49
    platforms we were mentioning before. So
    there are political and economic reasons
  • 14:49 - 14:56
    to control speech and that's why the best
    way to control speech is to control the
  • 14:56 - 15:04
    way that content is shared online. Right
    now there are two proposals that raise a
  • 15:04 - 15:11
    huge threat to freedom of expression
    online. Both propose upload filters by
  • 15:11 - 15:16
    increasing liability for platforms or
    making platform companies responsible for
  • 15:16 - 15:22
    the content which they host. One of them
    is the famous or infamous Article 13 of
  • 15:22 - 15:27
    the Copyright Directive proposal. And the
    second one is the regulation to prevent
  • 15:27 - 15:33
    the dissemination of terrorist content
    online. Both of them, as you will see,
  • 15:33 - 15:38
    they are just another way to make private
    companies the police and the dad of the
  • 15:38 - 15:44
    Internet. This is the first one. The
    proposal for an act, again with a long name.
  • 15:44 - 15:51
    Just stick to the short name, the
    Copyright Directive. And this Copyright
  • 15:51 - 15:56
    Directive is based on a fable. The fable
    goes like this. There are a wide range of
  • 15:56 - 16:01
    lonely and poor songwriters in their attic
    trying to produce songs for their
  • 16:01 - 16:07
    audience. Then there are these big
    platforms, mainly YouTube but also others,
  • 16:07 - 16:13
    that allow these uploads and gain profit.
    And these platforms give some pennies,
  • 16:13 - 16:17
    some small amount of money, to these
    authors. And the difference between what
  • 16:17 - 16:24
    they earn and what they should be earning
    is what they call the value gap. The fable
  • 16:24 - 16:29
    though conveniently hides the fact that
    the music industry keeps saying year after
  • 16:29 - 16:36
    year after year that their
    revenues increase by a high percentage every year.
  • 16:36 - 16:42
    And that keeps growing, especially in the
    online world. What is the solution to this
  • 16:42 - 16:47
    problem? Well as you can imagine it's a
    magical algorithm that will solve this
  • 16:47 - 16:53
    problem. And this algorithm will filter
    each and every file that you upload to
  • 16:53 - 16:58
    these platforms, will identify it and
    match it against a database, and will
  • 16:58 - 17:04
    block or allow the content depending if it
    is licensed or not and if they like you or
  • 17:04 - 17:11
    not and then according to their terms of
    service in the end. As we will mention
  • 17:11 - 17:15
    there are some technical and legal
    problems with upload filters.
  • 17:15 - 17:17
    In essence, if they are implemented,
  • 17:17 - 17:20
    it will mean that
    YouTube and Facebook will
  • 17:20 - 17:28
    officially become the police and the dad
    of the internet. The other big fight that
  • 17:28 - 17:34
    we have is around terrorism, or to be
    specific, about terrorist content online.
  • 17:34 - 17:40
    After the Cold War, once communism fell,
    we needed a new official enemy.
  • 17:40 - 17:45
    Terrorism is the new threat. It's very real
    to some extent. We lived through it in
  • 17:45 - 17:52
    Brussels recently, but it has been also
    exaggerated and inserted in our daily
  • 17:52 - 17:57
    lives. We see that in the airport
    controls, surveillance online, and
  • 17:57 - 18:02
    offline, restrictions to freedom of
    assembly and expression all over Europe.
  • 18:02 - 18:06
    And whenever a terrorist attack occurs we
    see pushes for legislation and measures
  • 18:06 - 18:13
    that restrict our freedoms. Usually those
    restrictions stay even though the risk or
  • 18:13 - 18:20
    the threat has disappeared or has been
    reduced. Again there we go with a long
  • 18:20 - 18:27
    name. Let's stick to the short name.
    TERREG. It's the regulation to prevent
  • 18:27 - 18:33
    the dissemination of terrorist
    content online. This proposal allegedly aims at
  • 18:33 - 18:40
    reducing terrorist content online. Note:
    not illegal content. Terrorist content. In
  • 18:40 - 18:49
    order to reduce risks of radicalization.
    But what we have seen
  • 18:49 - 18:54
    through experience is that a lot of
    radicalization happens outside the online
  • 18:54 - 19:02
    world, and that radicalization has other
    causes which are not online content.
  • 19:02 - 19:06
    It seems that the politicians need to send
    a strong signal before the EU elections.
  • 19:06 - 19:10
    We need to do something strong against
    terrorism, and the way to do that is
  • 19:10 - 19:22
    through three measures. Three. As we will
    see in a minute. First, Speedy González
  • 19:22 - 19:28
    takedowns. My favorite. Platforms will
    need to remove content which has been
  • 19:28 - 19:32
    declared terrorist content by some
    competent authorities. And this definition
  • 19:32 - 19:38
    of terrorist content is of course vague,
    and also inconsistent with other relevant
  • 19:38 - 19:43
    pieces of legislation which are already in
    place but not implemented all across the
  • 19:43 - 19:52
    EU. This removal needs to happen in one
    hour. This is sort of fast food principles
  • 19:52 - 19:57
    applied to the online world, to audio-visual
    material. And they give you some sort of
  • 19:57 - 20:01
    complaint mechanism, so if you have any
    problem with that, with your content being
  • 20:01 - 20:06
    taken down, you can go and say this
    content is legal, please put it back. But
  • 20:06 - 20:11
    in practice as you read it you will see
    that it's likely to be quite ineffective.
  • 20:11 - 20:16
    First of all, overblocking will not
    be penalized. So if they overblock legal
  • 20:16 - 20:22
    content nothing will happen. If they leave
    one piece of content which is illegal on
  • 20:22 - 20:31
    their platform they will face a sanction.
    The second issue is that of measures for
  • 20:31 - 20:39
    voluntary consideration. According to this
    second measure, the states will be able to
  • 20:39 - 20:44
    tell platforms, "I have seen this
    terrorist content in your platform. This
  • 20:44 - 20:50
    looks bad. Really really bad. So I really
    felt I had to ask you, could you be so
  • 20:50 - 20:55
    kind as to have a look, just if you wish? Of
    course I have no worries.", and the
  • 20:55 - 21:00
    platform then will decide according to
    their own priorities how to deal with this
  • 21:00 - 21:10
    voluntary request. Third. Good old upload
    filters, that's the third measure they are
  • 21:10 - 21:16
    proposing. Upload filters, or general
    monitoring obligations in legal jargon,
  • 21:16 - 21:23
    are prohibited in EU legislation. But
    anyway let's propose them. We'll see what
  • 21:23 - 21:28
    happens. And in order to be able to push
    them into the legislation, let's give
  • 21:28 - 21:35
    our filters an Orwellian twist, let's
    call them something different. So we
  • 21:35 - 21:40
    call them proactive measures. Platforms
    will need to proactively prevent
  • 21:40 - 21:47
    certain content from being uploaded. How will they
    prevent this? Upload filters of course.
  • 21:47 - 21:54
    I meant, proactive measures. And whether
    it is copyright or terrorist content, we
  • 21:54 - 22:02
    see the same trend. We see this one size
    fits all solution: a filter, an algorithm
  • 22:02 - 22:06
    that will compare all the content that is
    uploaded, will match it against a certain
  • 22:06 - 22:11
    database, and that will block it or not.
    We will need many filters, not only one
  • 22:11 - 22:17
    filter. We need filters for audio, for
    images, for text, and also one
  • 22:17 - 22:24
    specifically for terrorist content.
    However that is defined. So this is
  • 22:24 - 22:32
    basically the principle of law making
    today. We really want filters, what can we
  • 22:32 - 22:39
    invent to have them?
    Andreea: So we've got an issue with
  • 22:39 - 22:47
    filters, well, quite a few issues. But, by
    and large, one big issue. First of all, they're
  • 22:47 - 22:52
    illegal. The European Court of Justice
    said like this: "A social network cannot
  • 22:52 - 22:58
    be obliged to install a general filtering
    covering all of its users in order to
  • 22:58 - 23:04
    prevent the unlawful use of musical and
    audio-visual work". In the case SABAM
  • 23:04 - 23:15
    versus Netlog. Despite this, it seems that
    automated content filters are okay, as long as they are not
  • 23:15 - 23:20
    "general filtering covering all of its
    users". Of course there are the technical
  • 23:20 - 23:22
    issues.
    Diego: Yeah, there's some technical
  • 23:22 - 23:27
    issues. One of the best examples of that,
    one of the magnificent examples of how filters do
  • 23:27 - 23:33
    not work: James Rhodes, the pianist,
    a few weeks ago tried to upload a
  • 23:33 - 23:39
    video of himself playing Bach in his
    living room. Then the algorithm detected
  • 23:39 - 23:46
    some copyrighted content owned by Sony
    Music and automatically took down the
  • 23:46 - 23:53
    content. Of course he complained and got
    the content back. But it is a good
  • 23:53 - 23:59
    example of how filters do not work,
    because one piece of Bach, who died around
  • 23:59 - 24:05
    three or four hundred years ago, is of
    course out of copyright. And if the video
  • 24:05 - 24:11
    of a famous artist is taken down, we can
    imagine the same for many of your content.
  • 24:11 - 24:16
    Andreea: So not only do filters not
    recognize what is actually copyrighted and
  • 24:16 - 24:23
    what is not, but they also don't recognize
    exceptions, such as remixes, caricatures
  • 24:23 - 24:31
    or parodies. When it comes to copyright,
    filters can't tell the difference, and this is why memes
  • 24:31 - 24:38
    were a central part of the protest against
    Article 13. And this is why we will show
  • 24:38 - 24:47
    soon why this kind of filter has huge potential
    as a political tool. Another issue with
  • 24:47 - 24:54
    automated content filters is that they
    don't recognize context either.
  • 24:54 - 25:03
    When it comes to hate speech or terrorist
    content, they can't tell nuances. A girl
  • 25:03 - 25:09
    decided to share her traumatic experience
    of receiving a lot of insults
  • 25:09 - 25:17
    in her mailbox
    from a person who hated her
  • 25:17 - 25:23
    and was threatening her, so she took it and
    copy-pasted it on her Facebook account,
  • 25:23 - 25:31
    and made a post, and her profile was taken
    down. Why? Because the automated solutions
  • 25:31 - 25:36
    can't tell that she was the victim, not
    the actual perpetrator, right? And this is
  • 25:36 - 25:41
    very likely to continue happening if this
    is the "solution" put forward.
  • 25:41 - 25:46
    Diego: That is also a problem for SMEs of
    course, because these tools, these
  • 25:46 - 25:52
    filters, are very expensive. YouTube spent
    around $100,000,000 to develop Content ID
  • 25:52 - 25:58
    which is the best, worst, filter we have
    now online. So we can imagine how this is
  • 25:58 - 26:03
    gonna go for European SMEs that will need
    to copy that model, probably getting a
  • 26:03 - 26:09
    license for them, I can imagine, in order
    to implement those filters online. In the
  • 26:09 - 26:13
    end this will just empower these big
    companies who already have their filters
  • 26:13 - 26:17
    in place or they will just keep doing
    their business as usual and these new
  • 26:17 - 26:21
    companies that would like to develop a
    different business model will be prevented
  • 26:21 - 26:26
    from doing so because they will need to
    spend a lot of money on these filters.
  • 26:26 - 26:30
    Andreea: Then there's the issue of
    privatized law enforcement, the
  • 26:30 - 26:38
    privatization of law enforcement.
    Attributes change. Powers that used to belong to the state
  • 26:38 - 26:44
    are now shifted over - "Can you take care
    of it?" - to entities that are not really
  • 26:44 - 26:51
    driven by the same values that a state
    should at least be driven by. I'll just
  • 26:51 - 26:56
    give you one example from a project called
    Demon Dollar Project, a study commissioned
  • 26:56 - 27:03
    by the Parliament to look at hate speech
    definition in different EU member states.
  • 27:03 - 27:09
    Their conclusion: There are huge
    disparities between what it means, "hate
  • 27:09 - 27:13
    speech" - what hate speech means in
    Germany compared to what hate speech means
  • 27:13 - 27:24
    in Romania to what it means in the UK. So
    in this context, how can we ask a company
  • 27:24 - 27:34
    like Google or Facebook to find the
    definition? I mean, are their terms and
  • 27:34 - 27:42
    conditions the standard that we should see
    as the one influencing our legal
  • 27:42 - 27:47
    definitions? Am I the only one seeing a
    conflict of interest here?
  • 27:47 - 27:52
    Diego: And there's a problem there: once
    we have these filters for copyright
  • 27:52 - 27:57
    infringements or any other purposes, like
    terrorist content, we will of course have
  • 27:57 - 28:01
    them as a political tool. Once we have them
    for copyright, why are you not going to
  • 28:01 - 28:08
    look for dissidents in every
    country. These things change very often, I
  • 28:08 - 28:12
    see that in Spain but I see it all across
    the EU nowadays. So once we have them in
  • 28:12 - 28:15
    place for one small thing like
    copyright, why not for something else,
  • 28:15 - 28:21
    something more political?
    Andreea: There is a really interesting
  • 28:21 - 28:28
    example coming from Denmark a year or a
    year and a half ago. The Social Democrats
  • 28:28 - 28:36
    announced their immigration plan. They
    made a video in which Mette Frederiksen
  • 28:36 - 28:41
    talked about how great their plan is and
    so on. Some people were happy, some sad.
  • 28:41 - 28:52
    Some of the sad ones decided to criticize
    the plan and made a video about it.
  • 28:52 - 29:00
    It was a critique during which they
    caricatured her, but they used two audio
  • 29:00 - 29:08
    bits from the announcement video. The
    Social Democrats sent a letter to the NGO
  • 29:08 - 29:13
    accusing them of copyright infringement
    and threatening a lawsuit. Obviously the
  • 29:13 - 29:20
    NGO thought: "Yeah we don't really have
    enough money to go through a big court
  • 29:20 - 29:27
    case, so we're just going to take the
    video down." They took it down. Now, why
  • 29:27 - 29:35
    is this case important? If an automated
    content filter for copyrighted material
  • 29:35 - 29:39
    had been in place, the Social
    Democrats wouldn't even have had to lift a
  • 29:39 - 29:47
    finger. The job would be automatically
    done. Why? Automated content filters can't
  • 29:47 - 29:55
    recognize exceptions, such as parodies. And
    this is a very clear case of how copyright
  • 29:55 - 29:59
    infringement claims can be strategically used to
    silence any critical voices in the
  • 29:59 - 30:05
    political sphere.
    Diego: So we see a few threats to
  • 30:05 - 30:11
    fundamental rights. First, on privacy: every
    piece of content needs to be scanned so
  • 30:11 - 30:17
    that it can be checked and discarded or not. Then we
    will live in a sort of black-box society,
  • 30:17 - 30:23
    and that will affect freedom of
    speech. We will also face overcensoring,
  • 30:23 - 30:29
    overblocking, chilling effects and these
    tools are going to be repurposed as a
  • 30:29 - 30:36
    political tool. In a nutshell, rights can
    only be restricted when there is a proven
  • 30:36 - 30:41
    necessity, when the measure is
    proportionate and when the measure is also
  • 30:41 - 30:47
    effective. These filters are not necessary
    for the ends they want to achieve. They
  • 30:47 - 30:51
    are not proportionate, as we have seen, and
    they are not effective, as we have seen as
  • 30:51 - 30:56
    well. So, in effect, these are an
    unlawful restriction of freedom of
  • 30:56 - 31:03
    expression and privacy rights.
    Andreea: Now obviously we were also
  • 31:03 - 31:10
    unhappy about these, and I mentioned
    before how we organized within our network
  • 31:10 - 31:20
    to fight to get a strong ePrivacy. When it
    comes to copyright, this fight went beyond
  • 31:20 - 31:30
    our network. It got a lot of people mad,
    people like librarians, startups, the U.N.
  • 31:30 - 31:37
    special rapporteur, all of those there,
    basically. And more. And even YouTube
  • 31:37 - 31:46
    in the end, who thought about endorsing
    our campaign. What we learned from
  • 31:46 - 31:55
    these fights is that we really need to
    share knowledge between us. We need to
  • 31:55 - 32:02
    team up, coordinate actions, be patient
    with each other. When it comes to
  • 32:02 - 32:08
    different skills, it is important to unite
    them. When it comes to different
  • 32:08 - 32:15
    perspectives, it is important to
    acknowledge them. If we're separate
  • 32:15 - 32:22
    individuals, by ourselves we're just many;
    but if we're together, we're one big
  • 32:22 - 32:34
    giant. That is where the impact lies. Now,
    this is basically a call to you. If you're
  • 32:34 - 32:41
    worried about anything that we've told you
    today, if you want to support our fight,
  • 32:41 - 32:47
    if you think that laws aimed at
    controlling our bodies and our speech
  • 32:47 - 32:54
    should not be the ones that rule us
    and our internet, I think it's time to get
  • 32:54 - 33:01
    involved. Whether you're a journalist
    writing about privacy or other topics,
  • 33:01 - 33:06
    whether you're a lawyer working in a human
    rights organization, whether you have a
  • 33:06 - 33:13
    technical mindset, whether you have no
    clue about laws or anything like that,
  • 33:13 - 33:19
    come talk to us. We will have two
    workshops, one on ePrivacy, one on
  • 33:19 - 33:25
    upload filters, where we'll be answering more
    questions, if you have any that you can't ask
  • 33:25 - 33:33
    today, and try to put together an
    action plan. We also have a cluster called
  • 33:33 - 33:45
    "about freedom" that you can see there but
    is right by the info point in SSL.
  • 33:45 - 34:04
    And do you have any questions or
    comments? Thank you. applause
  • 34:04 - 34:10
    Angel: There is ample time for Q and A,
    so fire away if you have questions. Go to
  • 34:10 - 34:18
    a microphone, wave your hands.
    Signal Angel, are there any questions on
  • 34:18 - 34:29
    the internet? Nope. Uh, microphone number
    one!
  • 34:29 - 34:40
    inaudible question from the audience
    Diego: So the question is if the content
  • 34:40 - 34:44
    is encrypted, how will companies be
    obliged to implement these filters?
  • 34:44 - 34:47
    Microphone 1: Yes.
    Diego: Good question, I don't know.
  • 34:47 - 34:50
    I don't think that's gonna be possible.
    They will have to find a way to do that, because
  • 34:50 - 34:55
    either they ban the encryption in the
    channels or they don't; it doesn't matter,
  • 34:55 - 34:59
    because they will make you liable.
    If you have a platform with encrypted
  • 34:59 - 35:03
    channels and everything is on
    there, but for any reason they
  • 35:03 - 35:08
    find any copyrighted content which is not
    licensed, which you're not paying money
  • 35:08 - 35:13
    for, that will make you liable. Perhaps in
    practice they will not be able to find you
  • 35:13 - 35:17
    to make you liable because they will not
    be able to access the content but if they
  • 35:17 - 35:19
    find a way to do so, they will
    make you pay.
  • 35:19 - 35:23
    M1: Thank you.
    Angel: Okay, microphone number two.
  • 35:23 - 35:27
    Microphone 2: Thank you very much for
    the presentation. You've been talking a
  • 35:27 - 35:33
    lot about upload filters. A lot of the
    telcos and lobbyists are saying that the
  • 35:33 - 35:43
    upload filters don't exist. The trilogue
    process for the copyright reform is, as
  • 35:43 - 35:49
    I've heard, ending in January, and there
    will be a solution in the European
  • 35:49 - 35:57
    legislation process. How will we be able
    to inform this process and influence it to
  • 35:57 - 36:01
    try and make it better before the European
    elections?
  • 36:01 - 36:06
    Diego: Well we still have time, that's why
    we are here. One of our main goals in being at
  • 36:06 - 36:11
    35C3, apart from enjoying the conference
    for the very first time, is to mobilize all
  • 36:11 - 36:16
    of those who have not been mobilized yet.
    Thousands of people have been active: they
    have been tweeting, they have been calling
    have been tweeting they have been calling
    their MEPs, the members of the European
  • 36:20 - 36:25
    Parliament, they have been contacting the
    national governments. We still have time.
  • 36:25 - 36:29
    The vote will be some time around January
    or February, we still don't know. We are
  • 36:29 - 36:33
    afraid it's going to be sooner than
    expected, but this is the last push, the
  • 36:33 - 36:38
    last push to say no to the entire
    directive and say no to upload filters.
  • 36:38 - 36:42
    And that's why we are here because we
    still have time. Worst case scenario:
  • 36:42 - 36:45
    We'll go to the implementation phase, of
    course, and go to a national member state
  • 36:45 - 36:50
    and say "Do not do this. This goes against
    the Charter of Fundamental Rights", and then
  • 36:50 - 36:55
    we will stop it there. Better either now,
    which is my hope, or in the worst case
  • 36:55 - 37:00
    scenario we will stop it for sure in the
    member states.
  • 37:00 - 37:07
    applause
    Angel: Microphone number one.
  • 37:07 - 37:13
    Microphone 1: You talked about
    voluntary measures that companies are
  • 37:13 - 37:20
    somehow asked to implement. What are the
    incentives for the companies to do so?
  • 37:20 - 37:23
    Diego: What do the companies have to do
    with that?
  • 37:23 - 37:26
    M1: Why should they do that,
    because it's voluntary, you said.
  • 37:26 - 37:30
    Diego: Well they could do that for
    different reasons because they could get
  • 37:30 - 37:35
    bad PR. Imagine you are a small company in
    Hungary and then Orbán comes and tells you
  • 37:35 - 37:39
    "You need to block this because I think
    that this is terrorism. It comes from a
  • 37:39 - 37:43
    human rights organization." What would you
    do, if you understand me? It depends,
  • 37:43 - 37:47
    perhaps not on the government, but on the
    general structure. You could get bad PR
  • 37:47 - 37:52
    from the government, perhaps
    also because you're not acting promptly
  • 37:52 - 37:56
    on this serious content, but it's true
    that it is only for your voluntary
  • 37:56 - 38:03
    consideration.
    Angel: Again microphone number one.
  • 38:03 - 38:08
    Microphone 1: Thanks for the talk. So,
    when I see a problem, I also think, oh,
  • 38:08 - 38:13
    there is a technical solution. So it's
    hard for me to admit that maybe there is not. But it
  • 38:13 - 38:19
    does look like that's the case. But also, you
    mentioned the workshops, maybe more
  • 38:19 - 38:23
    for... I mean anybody can come, but
    maybe more for people with a legal background. I
  • 38:23 - 38:28
    don't have it. I'm a developer, but I want
    to understand how a system is working and
  • 38:28 - 38:31
    I understand a little bit about the
    European process and the regulatory
  • 38:31 - 38:35
    process but not so much. So what's the
    most efficient way for me as a developer
  • 38:35 - 38:43
    to get a better grasp of how this system,
    all those laws and regulations, are getting
  • 38:43 - 38:48
    implemented, and all the different steps?
    Diego: Well yeah. We didn't come to the
  • 38:48 - 38:52
    Lawyers Computer Congress, we came to the
    Chaos Computer Congress, so we can
  • 38:52 - 38:57
    make chaos out of it. We need developers,
    we need lawyers, we need journalists, we
  • 38:57 - 39:02
    need graphic designers, we need people
    with all sorts of skills, as Andreea was
  • 39:02 - 39:06
    saying before. And we need developers to
    develop tools that work, so we are capable
  • 39:06 - 39:11
    of developing any calling tool, any
    tweeting tool or any sort of tool that we can
  • 39:11 - 39:16
    use to transport our message and take it
    to Brussels, take it to the members of the
  • 39:16 - 39:20
    European Parliament, take it to the national
    member states. We really need you; if we
  • 39:20 - 39:24
    need something, it's developers. We have
    enough lawyers in this world. I think we
  • 39:24 - 39:29
    have too many with myself already, so we
    need you tomorrow and the day after
  • 39:29 - 39:39
    tomorrow.
    Angel: Okay. Any other questions? In that
  • 39:39 - 39:43
    case, I'll ask one myself. Andreea, what
    would be a good start at the member state
  • 39:43 - 39:47
    level to start campaigning if you've never
    campaigned before? Andreea: What, what?
  • 39:47 - 39:50
    Can you please repeat?
    Angel: What would be a good start. If you
  • 39:50 - 39:53
    wanted to campaign at the member state
    level ... ?
  • 39:53 - 39:58
    Andreea: ...and never campaigned before.
    Angel: Yes. Campaigning for Dummies.
  • 39:58 - 40:06
    Andreea: Well we've got a lot of
    organizations in EU member states. So, as
  • 40:06 - 40:12
    a person who has never campaigned before
    and was looking for someone to campaign
  • 40:12 - 40:18
    with two years ago in Denmark, I was
    advised to look for the Danish EDRi
  • 40:18 - 40:24
    member. So I did and we managed to
    organize a lot of great workshops in
  • 40:24 - 40:34
    Denmark when nothing existed, because IT-
    Pol, the Danish member, had a very thorough
  • 40:34 - 40:42
    grasp of the political environment and
    most EDRi members understand how this
  • 40:42 - 40:50
    is, how the dynamic is working, both
    politically, but also with journalists, and also
  • 40:50 - 40:59
    what the interests of certain
    nationalities are. So I would say:
  • 40:59 - 41:06
    finding your EDRi organization is
    the first step, and then unite with the
  • 41:06 - 41:09
    rest.
    Diego: And there's another way. Remember
  • 41:09 - 41:13
    you can always contact consumer
    organizations. You can contact
  • 41:13 - 41:18
    your members of parliament directly. You can
    organize yourself with two or three
  • 41:18 - 41:22
    friends and make a few phone calls; that
    is already enough. There are
  • 41:22 - 41:30
    many ways for you to help out.
    Andreea: Of course, make sure you contact
  • 41:30 - 41:37
    your country's MEPs at the European level.
    We are being represented, and we get to
  • 41:37 - 41:42
    actually elect the parliamentarians;
    they're the only ones who are elected by
  • 41:42 - 41:54
    us and not just proposed by governments or
    other politicians. So if we want to be
  • 41:54 - 42:00
    connected to our own member state but
    influence a law at the European level like
  • 42:00 - 42:05
    the ones we talked about, it is very
    important to let our EU parliamentaries
  • 42:05 - 42:11
    know that we are here and we hear them and
    they came from our country to represent us
  • 42:11 - 42:17
    at EU level.
    Angel: Thank you. Any other questions?
  • 42:17 - 42:20
    Signal Angel, do we have questions from
    the Internet?
  • 42:20 - 42:24
    Signal Angel: Unfortunately not.
    Angel: Well in that case we're finished.
  • 42:24 - 42:28
    Thank you all for your attention.
  • 42:28 - 42:39
    applause
  • 42:39 - 43:00
    subtitles created by c3subtitles.de
    in the year 2020. Join, and help us!