
#rC3 - The EU Digital Services Act package

  • 0:00 - 0:13
    rC3 preroll music
  • 0:13 - 0:20
    Herald: And the talk that is about to
    begin now is by Christoph Schmon from EFF
  • 0:20 - 0:24
    and Eliska Pirkova from Access Now and
    they will be talking about this, probably
  • 0:24 - 0:29
    maybe the biggest legislative initiative
    since the GDPR in Brussels. It's called
  • 0:29 - 0:39
    the Digital Services Act. And onto you
    two. You're muted.
  • 0:39 - 0:43
    Eliska Pirkova: I realized. Hello!
    First of all, thank you very much for
  • 0:43 - 0:48
    having us today. It's a great pleasure to
    be a part of this event. I think for
  • 0:48 - 0:52
    both of us, it's the first time we are
    actually joining this year. So it's
  • 0:52 - 0:57
    fantastic to be a part of this great
    community. And today we are going to talk
  • 0:57 - 1:03
    about the legislative proposal, the EU
    proposal which causes a lot of noise all
  • 1:03 - 1:08
    around Europe, but not only in Europe, but
    also beyond. And that's the Digital
  • 1:08 - 1:12
    Services Act legislative package. Today
    we already know that this
  • 1:12 - 1:16
    legislative package actually consists of
    two acts: the Digital Services Act and the
  • 1:16 - 1:22
    Digital Markets Act. And both of them will
    significantly change the regulation of
  • 1:22 - 1:27
    online platforms with a specific focus on
    very large online platforms, also often
  • 1:27 - 1:32
    referred to as gatekeepers. So those who
    actually hold a lot of economic dominance,
  • 1:32 - 1:37
    but also a lot of influence and control
    over users' rights and the public
  • 1:37 - 1:42
    discourse. So I'm going to start with
    giving you a quick introduction into
  • 1:42 - 1:47
    what's the fuss about, what is actually
    the DSA about, why we are also interested
  • 1:47 - 1:50
    in it and why we keep talking about it,
    and why this legislation will keep us
  • 1:50 - 1:57
    preoccupied for the years to come. DSA was
    already announced two years ago as a part
  • 1:57 - 2:05
    of the European Union's digital strategy, and it
    was named as one of the actions that
  • 2:05 - 2:11
    the digital strategy would actually
    consist of. And it was a promise that
  • 2:11 - 2:15
    the European Commission gave us already at
    that time to create the systemic
  • 2:15 - 2:20
    regulation of online platforms that
    actually places hopefully the users and
  • 2:20 - 2:26
    their rights into the center of this
    upcoming legislation. So the promise
  • 2:26 - 2:31
    behind the DSA is that this ad-based
    Internet - and I'm speaking now about Ad-Tech
  • 2:31 - 2:34
    and Online-Targeting - the
    Internet as we actually knew it will be
  • 2:34 - 2:39
    replaced by something that puts
    users, user controls, and users' rights
  • 2:39 - 2:44
    as a priority. So both of these
    laws, if implemented and drafted
  • 2:44 - 2:49
    right, should actually achieve that
    goal in the future. Now, previously,
  • 2:49 - 2:54
    before DSA actually was drafted, there was
    the so-called e-Commerce Directive in
  • 2:54 - 2:58
    place that actually established the basic
    principles, especially in the field of
  • 2:58 - 3:03
    content governance. I won't go into
    details on that because I don't want to
  • 3:03 - 3:08
    make this too legalistic. But ultimately,
    DSA legislation is supposed to not
  • 3:08 - 3:13
    completely replace but build on top
    of this legislation that actually created
  • 3:13 - 3:17
    the ground and the main legal regime for
    almost 20 years in Europe to regulate the
  • 3:17 - 3:23
    user generated content. So the DSA and the DMA
    will seek to harmonize
  • 3:23 - 3:29
    the rules addressing problems such as
    online hate speech and disinformation. But it
  • 3:29 - 3:33
    also puts emphasis finally on increased
    meaningful transparency in online
  • 3:33 - 3:38
    advertising, the way the content is
    actually being distributed across
  • 3:38 - 3:45
    platforms and also will develop a specific
    enforcement mechanism that will be
  • 3:45 - 3:50
    actually looking into it. Now, before I
    go into the details on the DSA
  • 3:50 - 3:54
    and why the DSA matters - do we actually
    need such a big new legislative reform
  • 3:54 - 3:58
    coming from the European
    Commission? - I want to unpack for
  • 3:58 - 4:03
    you a little bit what this legislative
    package actually consists of. So, as I
  • 4:03 - 4:07
    already mentioned, two regulations... the
    regulation, the strongest legal instrument
  • 4:07 - 4:12
    the European Commission actually has in its
    hands, which is supposed to achieve the
  • 4:12 - 4:16
    highest level of harmonization across the
    member states. And we all can imagine how
  • 4:16 - 4:20
    difficult that will be to achieve,
    especially in the realm of freedom of
  • 4:20 - 4:25
    expression and particular categories of
    user generated content, which is so deeply
  • 4:25 - 4:30
    context dependent. All of those
    related to content moderation and content
  • 4:30 - 4:36
    curation will be mainly in the realm of
    Digital Services Act. And then the second
  • 4:36 - 4:40
    regulation, the Digital Markets Act, will
    be specifically looking at the dominance
  • 4:40 - 4:45
    of online platforms, economic dominance,
    competitive environment for smaller
  • 4:45 - 4:50
    players, the fairness in the competition.
    And it will also establish the list of
  • 4:50 - 4:55
    do's and don'ts for gatekeeper
    platforms - that is, the
  • 4:55 - 5:00
    platforms that actually hold a relevant
    dominance. And now, based on these new
  • 5:00 - 5:04
    proposals, we know that these platforms
    are mainly called very large online
  • 5:04 - 5:10
    platforms. So this is exactly how the
    legislation refers to gatekeepers now. And
  • 5:10 - 5:15
    now I think one more point that I want to
    make is that the DSA and the DMA were
  • 5:15 - 5:21
    launched on the 15th of December 2020. So it
    was literally a Christmas present given to
  • 5:21 - 5:25
    the digital rights community by the
    European Commission, a long time
  • 5:25 - 5:31
    anticipated one. The work on DSA
    started, however, much earlier. The Electronic
  • 5:31 - 5:35
    Frontier Foundation, as much as Access
    Now, together with EDRi, were working very
  • 5:35 - 5:39
    hard to come up with the priorities and
    recommendations for what we would like to see
  • 5:39 - 5:43
    enshrined within these laws,
    because from the beginning we
  • 5:43 - 5:47
    understood the importance and the far
    reaching consequences this legislation
  • 5:47 - 5:52
    will have not only inside of the European
    Union, but also beyond. And that brings me
  • 5:52 - 5:55
    to the final introductory point that I
    want to make before I will hand over to
  • 5:55 - 6:01
    Chris, which is: why do we actually
    need the DSA? We strongly believe that there
  • 6:01 - 6:06
    is a big justification and good reason to
    actually establish a systemic regulation
  • 6:06 - 6:11
    of online platforms in order to secure
    users fundamental rights, empower them,
  • 6:11 - 6:16
    and also to protect their democratic
    discourse. And this is due to the fact
  • 6:16 - 6:21
    that for many years we have been witnessing
    quite bad regulatory practices in
  • 6:21 - 6:25
    platform governance coming from
    member states - and Chris will provide
  • 6:25 - 6:30
    very concrete examples in that regard - but
    also coming from the European Commission
  • 6:30 - 6:36
    itself, mainly the proposed online terrorist
    content regulation, for instance, or we
  • 6:36 - 6:41
    all remember the story of copyright that
    Chris will discuss a little bit further.
  • 6:41 - 6:45
    We saw how not only the member states, but
    also European Commission or European
  • 6:45 - 6:52
    Union, in order to actually establish some
    order in the digital space, they started
  • 6:52 - 6:56
    pushing states' obligations - and
    especially states' positive obligation to
  • 6:56 - 7:01
    protect users' human rights - into the
    hands of private online platforms that
  • 7:01 - 7:08
    started replacing state actors and public
    authorities within the online space. They
  • 7:08 - 7:12
    started assessing the content, the
    legality of the content, deciding under
  • 7:12 - 7:16
    very short time frames what should stay
    online and what should go offline with no
  • 7:16 - 7:21
    public scrutiny or transparency about
    practices that they kept deploying. And
  • 7:21 - 7:26
    they keep deploying to this day. Of
    course, platforms under the threat of
  • 7:26 - 7:31
    legal liability often had to rely and
    still have to rely on the content
  • 7:31 - 7:36
    recognition technologies for removing user
    generated content. A typical example could
  • 7:36 - 7:40
    be the Avia law, which will be
    mentioned later during the presentation.
  • 7:40 - 7:45
    And the typical time frames are usually
    those that extend from one hour to 24
  • 7:45 - 7:49
    hours, which is extremely short,
    especially if users would like to
  • 7:49 - 7:55
    appeal such a decision or seek an
    effective remedy. At the same time, due to
  • 7:55 - 8:01
    the lack of harmonization and the lack of a
    proper set of responsibilities that should
  • 8:01 - 8:04
    lie in the hands of these online
    platforms, there was a lack of legal
  • 8:04 - 8:09
    certainty, which only reinforced the
    vicious circle of removing more and more
  • 8:09 - 8:15
    of online content in order to escape any
    possible liability. And at the end, to
  • 8:15 - 8:20
    this day, due to the lack of transparency,
    we lack any evidence- or research-based
  • 8:20 - 8:25
    policy making, because platforms do not
    want to share or inform the public
  • 8:25 - 8:28
    authorities what they actually do with the
    content or how they moderate it, and the
  • 8:28 - 8:32
    transparency information that we receive
    in their transparency reports is
  • 8:32 - 8:37
    usually quantity oriented instead of
    quality oriented. So they focus on how much content
  • 8:37 - 8:42
    is actually being removed and how fast,
    which is not enough in order to create
  • 8:42 - 8:47
    laws that can actually provide any more
    sustainable solutions. And ultimately, as
  • 8:47 - 8:53
    we all agree, the core issue doesn't lie
    that much with how the content is being
  • 8:53 - 8:57
    moderated, but how content is being
    distributed across platforms - within the
  • 8:57 - 9:03
    core of their business models that
    actually stand on an attention economy and
  • 9:03 - 9:08
    on the way sensational content is
    often amplified in order to
  • 9:08 - 9:13
    prolong the attention span of users who
    visit platforms on a regular basis. And I'm
  • 9:13 - 9:18
    unpacking quite a few issues here, and this is
    supposed to be just a quick
  • 9:18 - 9:22
    introductory remark. I will now hand over
    to Chris, who will elaborate on all these
  • 9:22 - 9:26
    points a little bit further, and then we
    take a look and unpack a few quite
  • 9:26 - 9:31
    important parts of the DSA that we feel
    should be prioritized in this debate at
  • 9:31 - 9:37
    the moment. Chris, over to you.
    Christoph Schmon: Hi everybody. And I'm
  • 9:37 - 9:43
    quite sure that many of you have noticed
    that there's a growing appetite from the
  • 9:43 - 9:47
    side of the European Union to regulate the
    Internet by using online platforms as the
  • 9:47 - 9:52
    helping hands to monitor and censor
    users' content online. As you
  • 9:52 - 9:59
    see on the slide, the first highlight of
    this growing appetite was copyright upload
  • 9:59 - 10:03
    filters, which are supposed to stop
    copyright protected content online.
  • 10:03 - 10:07
    Thousands of people, old and young, went
    on the streets to demonstrate for free
  • 10:07 - 10:11
    Internet, to demonstrate against technical
    measures to turn the Internet into some
  • 10:11 - 10:16
    sort of censorship machine. We've made a
    point then, and we continue making the
  • 10:16 - 10:21
    point now that upload filters are prone to
    error, that upload filters cannot
  • 10:21 - 10:24
    understand context, that they are
    unaffordable by all but the largest tech
  • 10:24 - 10:30
    companies, which happen to be all based in
    the United States. But as you well know,
  • 10:30 - 10:34
    policymakers would not listen. Article 13
    of the Copyright Directive was
  • 10:34 - 10:40
    adopted by a small margin in the European
    Parliament, also because some members of
  • 10:40 - 10:44
    European Parliament had trouble pressing
    the right buttons, but I think it's
  • 10:44 - 10:50
    important for us to understand that the
    fight is far from over. The member states
  • 10:50 - 10:53
    must now implement the directive in a
    way that is not at odds with fundamental
  • 10:53 - 10:58
    rights. And we argue that mandated
    automated removal technologies are always
  • 10:58 - 11:03
    in conflict with fundamental rights. And
    this includes data protection rights. It
  • 11:03 - 11:07
    is a data protection right not to be made
    subject to Automated Decision-Making
  • 11:07 - 11:11
    online, if it involves your personal data
    and if such decision making has a negative
  • 11:11 - 11:17
    effect. So we believe, all those legal
    arguments aside, I think the most worrying
  • 11:17 - 11:23
    experience with upload filters is that they
    have spillover effects on other
  • 11:23 - 11:26
    initiatives. Sure, if it works for
    copyright protected content, it may well
  • 11:26 - 11:32
    work for other types of content, right?
    Except that it doesn't. Many now consider
  • 11:32 - 11:36
    it a clever idea that platforms should
    proactively monitor and check all sorts of
  • 11:36 - 11:41
    user content - be it communication,
    pictures or videos, and they should use
  • 11:41 - 11:46
    filters to take it down or to
    prevent the re-upload of such content. An
  • 11:46 - 11:51
    example of such a spillover effect that was
    mentioned is the draft regulation of
  • 11:51 - 11:56
    terrorist related content. It took a huge
    joint effort of civil society groups and
  • 11:56 - 12:03
    some members of Parliament to reject the
    worst parts of the text. We had recent
  • 12:03 - 12:09
    negotiations going on and at least we
    managed to get out the requirement to use
  • 12:09 - 12:13
    upload filters, but there is still a twenty-four-
    hour removal obligation that may nudge
  • 12:13 - 12:20
    platforms to employ those filters
    nevertheless. And we see that particularly
  • 12:20 - 12:23
    in national states, they are very fond of
    the idea that platforms, rather than
  • 12:23 - 12:27
    judges, should be the new law
    enforcers. There are now several states in
  • 12:27 - 12:31
    the European Union that have adopted laws
    that would either oblige or nudge
  • 12:31 - 12:37
    platforms to monitor users' speech
    online. First up was the German NetzDG,
  • 12:37 - 12:40
    which set out systematic duties for
    platforms. Then we had the French law
  • 12:40 - 12:44
    Avia, which copy-pasted the NetzDG and
    made it worse. And last we have the
  • 12:44 - 12:48
    Austrian Hate Speech bill, which is a mix
    of both the German and the French
  • 12:48 - 12:52
    proposal. They all go far beyond
    copyrighted content, but focus on hate
  • 12:52 - 12:56
    speech and all sorts of content that may
    be considered problematic in those
  • 12:56 - 13:02
    respective countries, not necessarily in
    other countries. And this brings me to the
  • 13:02 - 13:06
    next problem. How do we deal with content
    that is illegal in one country, but legal
  • 13:06 - 13:11
    in another? A recent Court of Justice
    ruling had confirmed that a court of a
  • 13:11 - 13:16
    small state like Austria can order
    platforms not only to take down defamatory
  • 13:16 - 13:20
    content globally, but also to take down
    identical or equivalent material using
  • 13:20 - 13:26
    automated technologies. For us this is a
    terrible outcome that will lead to a race
  • 13:26 - 13:31
    to the bottom, where the countries with
    the least freedom of speech friendly laws
  • 13:31 - 13:36
    can superimpose their laws on every other
    state in the world. We really believe that
  • 13:36 - 13:41
    all this nonsense has to stop. It's time
    to acknowledge that the Internet is a
  • 13:41 - 13:44
    global space, a place of exchange of
    creativity and the place where civil
  • 13:44 - 13:50
    liberties are supposed to exist. So we are
    fighting now against all those national
  • 13:50 - 13:54
    initiatives. We had a great first
    victory when we helped bring down the
  • 13:54 - 14:00
    French law Avia - the Avia bill that had
    imposed a duty for platforms to check
  • 14:00 - 14:06
    and remove potentially illegal content within
    24 hours. Before it was reviewed for
  • 14:06 - 14:09
    constitutionality by the French Supreme Court,
    we had argued that this would push
  • 14:09 - 14:14
    platforms to constantly check what users
    post. And with platforms facing high
  • 14:14 - 14:20
    fines... of course, they would rather be
    motivated to block as much of the
    content as possible. We've made a point
    content as possible. We've made a point
    that this would be against the Charter of
  • 14:24 - 14:29
    Fundamental Rights, including freedom of
    information and freedom of expression. And
  • 14:29 - 14:34
    it was a great victory for us that the
    French Supreme Court has struck down the
  • 14:34 - 14:40
    French Avia bill and followed our
    argument, as you see on the slide. We also
  • 14:40 - 14:44
    see now that there's a pushback at least
    against the update of the German NetzDG,
  • 14:44 - 14:48
    which would have provided new access
    rights for law enforcement authorities.
  • 14:48 - 14:51
    This and other provisions are considered
    unconstitutional. And as far as I
  • 14:51 - 14:55
    understand - and perhaps listeners can
    correct me - the German president has
  • 14:55 - 15:01
    refused to sign the bill and the Austrian
    bill goes down a similar pathway - it got
  • 15:01 - 15:06
    a red light from Brussels recently. The
    commission considers it in conflict with
  • 15:06 - 15:12
    EU principles, also thanks to a joint
    effort by epicenter.works. And this shows
  • 15:12 - 15:16
    that something positive is going on; it's
    a positive development, this pushback
  • 15:16 - 15:20
    against the automated filter technologies.
    But it's important to understand that
  • 15:20 - 15:25
    those national initiatives are not just
    purely national attempts to regulate hate
  • 15:25 - 15:29
    speech. It's an attempt of an EU member
    state to make their own bills, as bad as
    they are, some sort of prototype for EU-
    they are, some sort of a prototype for EU-
    wide legislation, a prototype for the
  • 15:34 - 15:38
    Digital Services Act. And as you know,
    national member states do have a say in EU
  • 15:38 - 15:42
    lawmaking: their voices are represented in
    the Council of the EU, and the European
  • 15:42 - 15:47
    Commission will be disincentivized
    to propose anything that would be voted
  • 15:47 - 15:51
    down by the Council. I think that's a nice
    takeaway from today, that lawmaking in
  • 15:51 - 15:56
    national member states is not an isolated
    event. It's always political, it's always
  • 15:56 - 16:03
    nets politics. The good news is that as
    far as the European Commission proposal
  • 16:03 - 16:07
    for a Digital Services Act is concerned,
    it has not followed in the footsteps of
  • 16:07 - 16:13
    those badly designed and misguided
    bills. It has respected our input, the
  • 16:13 - 16:16
    input from Access Now, from EFF, from the
    EDRi network, from academics and many
  • 16:16 - 16:22
    others, that some key principles should
    not be removed, like that liability for
  • 16:22 - 16:27
    speech should rest with the speaker. The
    DSA also gives a red light to general
  • 16:27 - 16:31
    monitoring of users' content. And there are
    no short deadlines in there to remove
  • 16:31 - 16:35
    content that might be illegal. Instead,
    the commission gives more flexibility to
  • 16:35 - 16:42
    platforms to take down posts in good faith,
    which is voluntary - what we call US-
  • 16:42 - 16:46
    style Good Samaritan clauses. Looking
    through the global lenses of law making
  • 16:46 - 16:49
    it's very fascinating to see that while
    the United States is flirting with the
  • 16:49 - 16:53
    idea to move away from the Good Samaritan
    principle in Section 230 of the
  • 16:53 - 16:57
    Communications Decency Act, so the idea is
    that platforms can voluntarily remove
  • 16:57 - 17:02
    content without being held liable for it,
    the European Union flirts with the idea to
  • 17:02 - 17:07
    introduce it, to give more options to
    platforms to act. That being said, the
  • 17:07 - 17:11
    major difference between the US and the
    EU is that in Europe, platforms could be
  • 17:11 - 17:16
    held liable the moment they become aware
    of the illegality of content. That's an
  • 17:16 - 17:20
    issue because the Digital Services Act has
    now introduced a relatively sophisticated
  • 17:20 - 17:25
    system for user notification, complaint
    mechanism, dispute resolution options,
  • 17:25 - 17:29
    which all leads to such awareness about
    illegality or could lead to such
  • 17:29 - 17:34
    awareness. It's not quite clear for us how
    platforms will make use of voluntary
  • 17:34 - 17:39
    measures to remove content. That being
    said, we think that the commission's
  • 17:39 - 17:42
    proposal could have been much worse. And
    the Parliament's reports on Digital
  • 17:42 - 17:45
    Services Act have demonstrated that the
    new parliament is a bit better than the
  • 17:45 - 17:51
    old one. They have a lot of respect for
    fundamental rights. There are many members of
  • 17:51 - 17:55
    parliament who are quite fond of the idea
    to protect civil liberties online. But we
  • 17:55 - 18:00
    know that this was only the start and we
    know that we need another joint effort to
  • 18:00 - 18:03
    ensure that users are not monitored, they
    are not at the mercy of others' decision
  • 18:03 - 18:08
    making. And I think Eliska is now going to
    explain a bit more about all this.
  • 18:08 - 18:15
    Eliska: Thank you. Thank you very much,
    Chris. So, we can actually move now
  • 18:15 - 18:20
    further and unpack a few relevant
    provisions for everything that has already
  • 18:20 - 18:24
    been mentioned, mainly in the realm of
    content moderation and content curation,
  • 18:24 - 18:29
    which ultimately lies at the core of
    Digital Services Act. And maybe not to
  • 18:29 - 18:35
    make it all so abstract... I also have the
    printed version of the law here with me.
  • 18:35 - 18:39
    And if you look at it, it's quite an
    impressive piece of work that the European
  • 18:39 - 18:45
    Commission did there. And I have to say
    that even though it's a great start, it
  • 18:45 - 18:51
    still contains a lot of imperfections. And
    I will try to summarize those now for you,
  • 18:51 - 18:58
    especially in the light of our positioning -
    and I mean civil society's in general -
  • 18:58 - 19:01
    because I believe that on all those
    points, we have a pretty solid agreement
  • 19:01 - 19:06
    among each other. And what we were
    actually hoping that Digital Services Act
  • 19:06 - 19:11
    will do, what it actually does and where
    we see that we will need to actually
  • 19:11 - 19:15
    continue working very closely, especially
    with the members of the European
  • 19:15 - 19:18
    Parliament in the future, once the draft
    actually enters the European
  • 19:18 - 19:24
    Parliament, which should happen relatively
    soon. So, as I already
  • 19:24 - 19:29
    mentioned at the beginning, quite briefly,
    the DSA distinguishes between
  • 19:29 - 19:34
    online platforms, which are defined within
    the scope of the law, and very
  • 19:34 - 19:39
    large online platforms, which is exactly
    the category all large online gatekeepers
  • 19:39 - 19:44
    fall into. The DSA then specifically
    distinguishes between obligations and
  • 19:44 - 19:49
    responsibilities of these actors, some
    assigned to all of them, including online
  • 19:49 - 19:53
    platforms, and some extended
    specifically due to the dominance and
  • 19:53 - 19:59
    power these online gatekeepers hold.
    This is mainly the case when we
  • 19:59 - 20:02
    discuss the requirements for
    transparency. There is a set of
  • 20:02 - 20:05
    requirements for transparency that apply
    to online platforms, but then there is
  • 20:05 - 20:09
    still a specific additional set of
    transparency requirements for very large
  • 20:09 - 20:14
    online platforms. What the DSA does especially -
    and this is the bit which is extremely
  • 20:14 - 20:19
    relevant for content moderation
    practices - is attempt to establish a
  • 20:19 - 20:24
    harmonized model for notice and action
    procedure for allegedly illegal content.
  • 20:24 - 20:29
    One of our red lines - before I go into
    the details on this - was that the DSA should be
  • 20:29 - 20:35
    touching only, or trying to regulate only,
    allegedly illegal content and stay away
  • 20:35 - 20:40
    from vaguely defined content categories
    such as potentially harmful but legal
  • 20:40 - 20:45
    content. There are other ways
    legislation can tackle this content,
  • 20:45 - 20:51
    mainly through the meaningful transparency
    requirements, accountability, tackling
  • 20:51 - 20:57
    issues within content recommender
    systems and algorithmic curation. But we
  • 20:57 - 21:01
    didn't want this specific category of
    content to be included within the scope of
  • 21:01 - 21:06
    DSA. This is due to the fact that vaguely
    defined terms that find their way into
  • 21:06 - 21:12
    legislation always lead to human rights
    abuse in the future. I could give you
  • 21:12 - 21:17
    examples from Europe, such as the concept
    of online harms within the UK; but also, as
  • 21:17 - 21:22
    global organizations, both of us
    actually often see how weak terminology
  • 21:22 - 21:26
    can quickly lead to over-
    criminalization of speech or suppression of
  • 21:26 - 21:31
    dissent. Now, if we go back to
    the harmonized notice and action procedure,
  • 21:31 - 21:36
    what that actually means in practice: as
    Christoph already mentioned, Europe has a
  • 21:36 - 21:40
    so-called conditional model of
    intermediary liability, which is
  • 21:40 - 21:44
    provided and established by the
    initial legal regime, the
  • 21:44 - 21:50
    e-Commerce Directive, under Article 14,
    which actually
  • 21:50 - 21:54
    states that unless the platform holds the
    actual knowledge and according to the
  • 21:54 - 21:59
    wording of DSA now it's the actual
    knowledge or awareness about the presence
  • 21:59 - 22:05
    of illegal content on their platform, they
    cannot be held liable for such content.
  • 22:05 - 22:10
    Now, we were asking for a harmonized
    procedure regarding notice and action across the
  • 22:10 - 22:14
    EU for a while, precisely because we
    wanted to see reinforced legal certainty.
  • 22:14 - 22:19
    The lack of legal certainty also translated
    into over-removal of even legitimate
  • 22:19 - 22:26
    content from the platforms with no public
    scrutiny. The DSA does a good job - it's a good
  • 22:26 - 22:29
    starting point that actually
    attempts to establish such a harmonized
  • 22:29 - 22:34
    procedure - but it's still lagging behind
    on many aspects that we consider important
  • 22:34 - 22:38
    in order to strengthen protection of
    fundamental rights of users. One of them
  • 22:38 - 22:42
    is, for instance, that harmonized notice
    and action procedure, as envisioned by
  • 22:42 - 22:48
    DSA, is not specifically tailored to
    different categories of user
  • 22:48 - 22:53
    generated content. And as we know, there
    are many categories of content
  • 22:53 - 22:58
    that are deeply context dependent, linked
    to the historical and sociopolitical
  • 22:58 - 23:05
    context of the member state in question. And
    due to the reliance on automated
  • 23:05 - 23:09
    measures, which are usually context blind, we
    are worried that if notice and action
  • 23:09 - 23:13
    doesn't reflect this aspect in any further
    ways, we will end up again with over-
  • 23:13 - 23:19
    removal of content. What is probably
    another huge issue: even though the DSA is
  • 23:19 - 23:24
    trying to create proper appeal and
    enforcement mechanisms, and the
  • 23:24 - 23:29
    draft currently contains appeal
    mechanisms for users and different
  • 23:29 - 23:34
    alternative dispute settlement options,
    there is no
  • 23:34 - 23:46
    possibility for content providers - for
    a user that actually uploaded content -
  • 23:46 - 23:51
    to directly appeal or file a
  • 23:51 - 23:57
    counter notification against a notice on
    content that belongs to that user. Nor
  • 23:57 - 24:01
    are platforms obliged to actually send a
    notification to a user prior to any action
  • 24:01 - 24:06
    that is being taken against that
    particular notified content. These are for
  • 24:06 - 24:10
    us a procedural safeguards for fairness
    that users should have, and currently they
  • 24:10 - 24:16
    are not being reflected in the draft.
    However, this is a good start and it's
  • 24:16 - 24:20
    something that we were pushing for. But I
    think there are many more aspects that
  • 24:20 - 24:25
    these notice and action procedures will
    need to contain in order to truly put
  • 24:25 - 24:31
    users at first. Now the notice and action
    procedure is mainly focusing on the
  • 24:31 - 24:35
    illegal content. But there are ways in the
    draft where potentially harmful content,
  • 24:35 - 24:39
    which is still legal - so the content that
    actually violates the terms of service of
  • 24:39 - 24:43
    platforms - is mentioned throughout
    the draft. So for us, it's not at the
  • 24:43 - 24:52
    moment exactly clear how that will work in
    practice. So that's why we often use this
  • 24:52 - 24:56
    phrase that is also on the slide: good
    intentions with imperfect solutions.
  • 24:56 - 25:00
    However, I want to emphasize again that
    this is just the beginning and we will
  • 25:00 - 25:06
    still have time and space to work very
    hard on this. Another kind of novel aspect
  • 25:06 - 25:11
    that DSA actually brings about is the
    already mentioned Good Samaritan Clause, and I
  • 25:11 - 25:17
    tend to call it the EU model or EU version
    of Good Samaritan Clause. Good Samaritan
  • 25:17 - 25:21
    Clause originates in Section 230 of the
    Communications Decency Act, as Christoph
  • 25:21 - 25:27
    already mentioned. But within the European
    Union, it goes hand in hand with this
  • 25:27 - 25:31
    conditional model of liability which is
    being preserved within the DSA legal
  • 25:31 - 25:36
    draft. That was also one of our main asks,
    to preserve this conditional model of
  • 25:36 - 25:42
    liability, and it's great that the
    European Commission really listened. Why
  • 25:42 - 25:47
    do we consider the Good Samaritan Clause
    important? In the past, when such a
  • 25:47 - 25:51
    security wasn't enshrined in the law but
    was just somehow vaguely promised by
  • 25:51 - 25:58
    the Commission: that if platforms
    proactively deploy measures to fight
  • 25:58 - 26:02
    against the spread of illegal content,
    they won't be held liable without
  • 26:02 - 26:07
    acknowledging that through such a use of
    so-called proactive measures, the platform
  • 26:07 - 26:12
    could in theory gain the actual knowledge
    about the existence of such a type of
  • 26:12 - 26:16
    content on its platform, which would
    immediately trigger legal liability. This
  • 26:16 - 26:20
    threat of liability often pushed platforms
    into a corner, so they would rather remove
  • 26:20 - 26:27
    the content very quickly than to face more
    serious consequences later on. That's why
  • 26:27 - 26:32
    we see the importance within the Good
    Samaritan Clause or the European model of
  • 26:32 - 26:37
    Good Samaritan Clause, and we are glad
    that it's currently part of the
  • 26:37 - 26:45
    draft. One of the biggest downfalls or one
    of the biggest disappointments when DSA
  • 26:45 - 26:51
    finally came out on the 15th of December
    for us was to see that it's still online
  • 26:51 - 26:55
    platforms that will remain in charge when
    it comes to assessing the legality of the
  • 26:55 - 27:00
    content and deciding what content should
    be actually restricted and removed from a
  • 27:00 - 27:06
    platform and what should be available. We
    often emphasize that it's very important
  • 27:06 - 27:12
    that the legality of the content is being
    assessed by independent judicial
    authorities, in line with the rule of
    authorities as in line with the rule of
    law principles. We also do understand that
  • 27:17 - 27:23
    such a solution creates a big burden on
    the judicial structure of member states.
  • 27:23 - 27:27
    Many member states see that as a very
    expensive solution, and they don't always
  • 27:27 - 27:33
    want to create a special network of
    courts, or e-courts or other forms of
  • 27:33 - 27:38
    judicial review of the illegal or
    allegedly illegal content. But we still
  • 27:38 - 27:43
    wanted to see more public scrutiny,
    because for us this is truly just the
  • 27:43 - 27:47
    reaffirmation of the already existing status
    quo, as at the moment, in many
  • 27:47 - 27:52
    jurisdictions within the EU and in the EU
    itself, it's still online platforms that
  • 27:52 - 27:57
    call the final shots. What, on the
    other hand, is a positive outcome that
  • 27:57 - 28:02
    we've also been pushing hard for are the
    requirements for meaningful transparency.
  • 28:02 - 28:07
    So to understand better what platforms
    actually do with the individual pieces of
  • 28:07 - 28:13
    content that are being shared on these
    platforms and how actually transparency
  • 28:13 - 28:18
    can then ultimately empower users. Now, I
    want to emphasize this because this is
  • 28:18 - 28:23
    still an ongoing debate and we will touch
    upon those issues in a minute. But we
  • 28:23 - 28:29
    don't see transparency as a silver bullet
    to the issues such as amplification of
  • 28:29 - 28:33
    potentially harmful content or in general
    that transparency will be enough to
  • 28:33 - 28:39
    actually hold platforms accountable.
    Absolutely not. It will never be enough,
  • 28:39 - 28:45
    but it's a precondition for us to actually
    seek such solutions in the future. DSA
  • 28:45 - 28:49
    contains specific requirements for
    transparency, as I already mentioned, a
  • 28:49 - 28:53
    set of requirements that will be
    applicable largely to all online platforms
  • 28:53 - 28:58
    and then a further specific set of
    requirements on top of that, which will
  • 28:58 - 29:02
    be applicable only to very large online
    platforms, the online gatekeepers. We
  • 29:02 - 29:07
    appreciate the effort, we see that the
    list is very promising, but we still think
  • 29:07 - 29:13
    it could be more ambitious. Both EFF and
    Access Now put forward a specific set of
  • 29:13 - 29:18
    requirements for meaningful transparency
    that are in our position papers. And so did EDRi
  • 29:18 - 29:25
    and other civil society or digital rights
    activists in this space. And final point
  • 29:25 - 29:29
    that I'm going to make is the so-called
    Pandora's box of online targeting and
  • 29:29 - 29:35
    recommender systems. Why do I refer to
    this as Pandora's Box? When the European
  • 29:35 - 29:41
    Parliament published its initiative
    reports on DSA, there are two reports, one
  • 29:41 - 29:48
    being tabled by the JURI Committee and then
    another one by IMCO; especially the JURI
  • 29:48 - 29:52
    report contained paragraph 17, which calls
    for better regulation of online
  • 29:52 - 29:57
    targeting and online advertisements, and
    specifically calling for a ban on online
  • 29:57 - 30:03
    targeting, including a phase-out that
    would then lead to a ban. We supported this
  • 30:03 - 30:08
    paragraph, which at the end was voted for
    and is the part of the final report.
  • 30:08 - 30:13
    Nevertheless, we also do understand that
    the wording of the article has to be
  • 30:13 - 30:18
    more nuanced in the future. Before I go
    into the details there, I just want to say
  • 30:18 - 30:23
    that this part never made it into the DSA.
    So there is no ban on online targeting or
  • 30:23 - 30:28
    online advertisement of any sort, which to
    us, to some extent, was certainly
  • 30:28 - 30:33
    disappointing too. We specifically
    called for a much stricter approach
  • 30:33 - 30:39
    when it comes to behavioral targeting as
    well as cross-site tracking of online
  • 30:39 - 30:44
    users, but unfortunately, and as we
    eventually also heard from Commissioner
  • 30:44 - 30:49
    Vestager, that was simply a lack of will or,
    maybe, too much pressure from other
  • 30:49 - 30:54
    lobbies in Brussels. And this provision
    never found its way to the final draft of
  • 30:54 - 30:59
    DSA. That's the current state of play, we
    will see what we will manage to achieve
  • 30:59 - 31:05
    once the DSA enters the European
    Parliament. And finally, the law also
  • 31:05 - 31:10
    contains a specific provision on
    recommender systems, so the way
  • 31:10 - 31:17
    content is being distributed across
    platforms and how the data of users are
  • 31:17 - 31:22
    being abused for such a distribution and
    personalization of user generated content.
  • 31:22 - 31:27
    In both cases, whether it's online
    targeting or recommender systems within
  • 31:27 - 31:32
    the DSA, the DSA goes only as far as the
    transparency requirements, the
  • 31:32 - 31:37
    explainability, but it does very little
    for returning control and empowerment
  • 31:37 - 31:43
    back to the user. So whether users can
    opt out from these algorithmic
  • 31:43 - 31:49
    curation models, how they can actually be
    optimized if users decide to optimize them?
  • 31:49 - 31:55
    All of that is at the moment very much
    left outside of the scope of DSA. And then
  • 31:55 - 32:00
    there's the issue of interoperability,
    which is definitely one of the key
  • 32:00 - 32:06
    issues currently being discussed, and it
    offers some hope in the future for
  • 32:06 - 32:10
    returning that control and empowerment
    back to the user. And I keep repeating
  • 32:10 - 32:14
    this as a mantra, but it's truly the main
    driving force behind all our initiatives
  • 32:14 - 32:20
    and the work we do in these fields: the
    user and their fundamental rights. And on
  • 32:20 - 32:23
    that note, I would like to hand over back
    to Chris, who will explain the issue of
  • 32:23 - 32:28
    interoperability and how to actually
    empower you as a user and to strengthen
  • 32:28 - 32:36
    the protection of fundamental rights
    further. Chris, it's yours now.
  • 32:36 - 32:43
    Christoph: Thank you. I think we all know or feel
    that the Internet has seen better times.
  • 32:43 - 32:49
    If you look back over the last 20 years,
    we have seen that transformation was going
  • 32:49 - 32:55
    on from an open Internet towards a more
    closed one - monopolization. Big platforms
  • 32:55 - 32:59
    have built entire ecosystems and it seems
    that they alone decide who gets to use
  • 32:59 - 33:04
    them. Those platforms have strong network
    effects that have pushed
  • 33:04 - 33:08
    those platforms into the gatekeeper position,
    which made it so easy for them to avoid
  • 33:08 - 33:12
    any real competition. This is especially
    true when we think of social media
  • 33:12 - 33:17
    platforms. This year we celebrate the 20th
    birthday of the e-Commerce Directive that
  • 33:17 - 33:21
    Eliska mentioned. The Internet bill that
    will now be replaced by the Digital
  • 33:21 - 33:26
    Services Act. We believe it's a very good
    time now to think and make a choice:
  • 33:26 - 33:29
    should we give even more power to the big
    platforms that have created a lot of the
  • 33:29 - 33:33
    mess in the first place; or should we give
    the power to the users, give the power
  • 33:33 - 33:39
    back to the people? For us, the answer is
    clear. Big tech companies already employ a
  • 33:39 - 33:44
    wide array of technical measures. They
    monitor, they remove, they disrespect user
  • 33:44 - 33:48
    privacy and the idea is to turn them into
    the Internet Police, with a special
  • 33:48 - 33:55
    license of censoring the speech of users,
    will only solidify their dominance. So we
  • 33:55 - 33:59
    wouldn't like that. What we'd like is to
    put users in charge of their online
  • 33:59 - 34:05
    experience. Users should, so to
    say, choose for themselves which kind of
  • 34:05 - 34:09
    content they can see, what services they
    can use to talk to their friends and
  • 34:09 - 34:13
    families. And we believe it's perhaps time
    to break up those silos those big
  • 34:13 - 34:18
    platforms have become, to end their dominance
    over data. One element to achieve this
  • 34:18 - 34:22
    would be to tackle the targeted ads
    industry, as Eliska mentioned it, perhaps
  • 34:22 - 34:27
    to give an actual right to users not to be
    subject to targeted ads or to give more
  • 34:27 - 34:32
    choice to users to decide which content
    they would like to see or not to see. In
  • 34:32 - 34:35
    the Digital Services Act, the Commission
    went for transparency when it comes to ads
  • 34:35 - 34:39
    and better option for users to decide on
    the recommended content, which is a start,
  • 34:39 - 34:45
    we can work with that. Another important
    element to achieve user autonomy over data
  • 34:45 - 34:50
    is interoperability. If the European Union
    really wants to break the power of those
  • 34:50 - 34:54
    data driven platforms that monopolize the
    Internet, it needs regulations that
  • 34:54 - 34:58
    enables users to be in control over the
    data. We believe that users should be able
  • 34:58 - 35:05
    to access data, to download data, to move,
    manipulate their data as they see fit. And
  • 35:05 - 35:08
    part of their control is to port data
    from one place to another. But data
  • 35:08 - 35:13
    portability, which we have under the
    GDPR, is not good enough. And we
  • 35:13 - 35:16
    see from the GDPR that it's not
    working in practice. Users should be able
  • 35:16 - 35:19
    to communicate with friends across
    platform boundaries, to be able to follow
  • 35:19 - 35:23
    their favorite content across different
    platforms without having to create several
  • 35:23 - 35:29
    accounts. But to put it in other terms, if
    you're upset with the absence of privacy on
  • 35:29 - 35:33
    Facebook or how the content is moderated
    on Facebook, you should be able to just
  • 35:33 - 35:37
    take your data with you using portability
    options and move to an alternative
  • 35:37 - 35:41
    platform that is a better fit, and this
    without losing touch with your friends who
  • 35:41 - 35:46
    stay behind, who have not left the
    incumbent big platform. So what we did for
  • 35:46 - 35:50
    Digital Services Act is to argue for
    mandatory interoperability options that
  • 35:50 - 35:56
    would force Facebook to maintain APIs that
    let users on other platforms exchange
  • 35:56 - 36:01
    messages and content with Facebook users.
    However, if you look in the DSA, we see
  • 36:01 - 36:05
    that the commission completely missed the
    mark on interoperability, which is
  • 36:05 - 36:09
    supposed to be dealt with by related legal
    act, now it gets complicated. It's the
  • 36:09 - 36:15
    Digital Markets Act, the DMA, another
    beautiful acronym. A Digital Markets Act
  • 36:15 - 36:20
    wants to tackle certain harmful business
    practices by those gatekeeper platforms,
  • 36:20 - 36:25
    the very large tech companies that control
    what is called core services. The core
  • 36:25 - 36:28
    service is a search engine, a social
    networking service, a messaging service,
  • 36:28 - 36:33
    its operating systems and online
    intermediation services. Like think of how
  • 36:33 - 36:37
    Amazon controls access to customers for
    merchants that sell on its platforms or
  • 36:37 - 36:44
    how the Android and iPhone app stores as
    chokepoints in delivering mobile software.
  • 36:44 - 36:48
    And many things we like in the new
    proposal, the proposal of the Digital
  • 36:48 - 36:53
    Markets Act, for example there's a ban on
    data mixing in there: it wants to
  • 36:53 - 36:57
    ban gatekeepers from mixing data from
    data brokers with the data they collect
  • 36:57 - 37:04
    on their customers. Another rule is to ban
    cross-tying, the sort of practice where end
  • 37:04 - 37:08
    users must sign up for ancillary services.
    So you should be able to use Android
  • 37:08 - 37:12
    without having to get a Google account for
    example. You believe that this is all
  • 37:12 - 37:19
    good, but the DMA like the DSA is very
    weak on interoperability. What it does is
  • 37:19 - 37:23
    to focus on real time data portability
    instead. So instead of having
  • 37:23 - 37:27
    interoperable services, users will only be
    able to send the data from one service to
  • 37:27 - 37:32
    another like from Facebook to Diaspora,
    meaning that you would end up having two
  • 37:32 - 37:37
    accounts instead of one, or to quote Cory
    Doctorow, who spoke yesterday already:
  • 37:37 - 37:43
    "Users would still be subject to the
    sprawling garbage novella of abusive legalese
  • 37:43 - 37:47
    Facebook laughably calls its terms of
    service." We believe that this is not
  • 37:47 - 37:54
    good enough. And on the last slide, you see a
    quote from Margrethe Vestager, who made
  • 37:54 - 38:00
    a very good statement last month, that we
    need trustworthy services, fair use of
  • 38:00 - 38:06
    data and free speech and interoperability
    on the Internet; people agree on that. And in
  • 38:06 - 38:10
    the next months and years, we will work to
    make this actually happen. However, you can
  • 38:10 - 38:14
    imagine, it will not be easy. We already
    see that European Union member states
  • 38:14 - 38:18
    follow the trend that platforms should
    systematically check for undesirable and
  • 38:18 - 38:23
    inciteful content and share those data
    with law enforcement authorities, which is
  • 38:23 - 38:27
    even worse. We see an international trend
    going on to move away from the immunity of
  • 38:27 - 38:33
    platforms for user content towards a more
    active stance of those platforms. And we
  • 38:33 - 38:38
    see that recent terror attacks have fueled
    ideas that monitoring is a good idea and
  • 38:38 - 38:43
    end to end encryption is a problem. So
    whatever will be the result, you can bet
  • 38:43 - 38:46
    that European Union will want to make the
    Digital Services Act and the Digital
  • 38:46 - 38:51
    Markets Act another export model. So this
    time we want to get the numbers right in
  • 38:51 - 38:55
    parliament and the council, we want to
    help members of parliament to press the
  • 38:55 - 39:00
    right buttons. And for all these we would
    need your help, even if it means to learn
  • 39:00 - 39:04
    yet another acronym or several acronyms in
    order to keep them in your head. That's it from
  • 39:04 - 39:08
    our side - we are looking forward to the
    discussion. Thank you.
  • 39:08 - 39:19
    Herald: OK, thank you Eliska and
  • 39:19 - 39:26
    Christoph. There are questions from the
    Internets and the first one is basically
  • 39:26 - 39:32
    we just had, as you mentioned in your
    slides, the Copyright in the
  • 39:32 - 39:39
    Digital Single Market Directive with both
    accountability and liability provisions;
  • 39:39 - 39:46
    you also briefly mentioned, I think, even
    the e-Evidence proposal. How do all
  • 39:46 - 39:49
    these proposals relate to each other? And
    especially for a layperson, that is not
  • 39:49 - 39:58
    into all the Brussels jargon.
    Christoph: I think Eliska, you've raised
  • 39:58 - 40:02
    your hand, don't you?
    Eliska: ...more or less unintentionally,
  • 40:02 - 40:10
    but yeah, kind of that. I can start and
    then let you Christoph to step in. Yeah,
  • 40:10 - 40:15
    that's a very, very good question. And
    this is specifically due to the fact that
  • 40:15 - 40:20
    when you mention especially online
    terrorist content regulation, but also
  • 40:20 - 40:27
    recently proposed interim regulation on
    child sexual abuse, they.. all these - we
  • 40:27 - 40:33
    call them sectoral legislation, so kind of
    a little bit departing from this
  • 40:33 - 40:38
    horizontal approach, meaning an approach
    that tackles all categories of illegal
  • 40:38 - 40:43
    content in one way, instead of going after
    specific categories such as online
  • 40:43 - 40:48
    terrorist content in separate ways. So
    it's a little bit of a paradoxical situation that
  • 40:48 - 40:51
    is currently also happening at the EU
    level, because on one hand, we were
  • 40:51 - 40:56
    promised this systemic regulation that
    once for all established harmonized
  • 40:56 - 41:01
    approach to combating illegal content
    online, which is
  • 41:01 - 41:06
    specifically the DSA, the Digital Services
    Act, and at the same time we still see
  • 41:06 - 41:10
    European Commission allowing for these
    fundamental rights harmful legislative
  • 41:10 - 41:16
    proposals happening in these specific
    sectors, such as the terrorist online
  • 41:16 - 41:21
    content regulation or other
    legislative acts seeking to somehow
  • 41:21 - 41:25
    regulate specific categories of user
    generated content. This is quite puzzling
  • 41:25 - 41:31
    for us as digital rights activists too.
    Actually, I would maybe
  • 41:31 - 41:36
    separate DSA from this for a moment and
    say that what all of these sectoral
  • 41:36 - 41:40
    legislations have in common is:
    first of all, continuing these negative
  • 41:40 - 41:44
    legislative trends that we already
    described and that we constantly observe
  • 41:44 - 41:49
    in practice, such as shifting more and
    more responsibility on online platforms.
  • 41:49 - 41:53
    And at the same time, what is also very
    interesting, what they have in common is
  • 41:53 - 41:58
    the legal basis that they stand on, and
    that's the legal basis that is rather
  • 41:58 - 42:03
    connected to the cooperation within the
    digital single market, even though they
  • 42:03 - 42:10
    seek to tackle a very particular
    category of content, which is
  • 42:10 - 42:15
    manifestly illegal. So logically, if they
    are to have the appropriate legal ground,
  • 42:15 - 42:20
    it should be something closer to
    police and judicial cooperation, which we
  • 42:20 - 42:25
    don't see happening in practice,
    specifically because there is this idea
  • 42:25 - 42:29
    that platforms are the best suited to
    decide how the illegal content will be
  • 42:29 - 42:33
    tackled in the online space. They can be
    the fastest, they can be the most
  • 42:33 - 42:38
    effective. So they should actually have
    the main decision-making powers and are
  • 42:38 - 42:42
    forced into taking those responsibilities
    which, however, ultimately, according to
  • 42:42 - 42:48
    the rule of law principle, should and have
    to be in the hands of the state and public
  • 42:48 - 42:54
    authorities, preferably judicial
    authorities. So I would say they are all
  • 42:54 - 43:00
    bad news for fundamental rights protection
    of online users, civil rights
  • 43:00 - 43:06
    organizations, all of us that are on this
    call today. We're fighting very hard also
  • 43:06 - 43:10
    against the online terrorist content
    regulation. There was a lot of damage
  • 43:10 - 43:15
    control done, especially with the first
    report that was tabled by the European
  • 43:15 - 43:20
    Parliament and also now during the last
    trialogue since the negotiations seems to
  • 43:20 - 43:25
    be concluded and the outcome is not great.
    It's far from ideal. And I'm worried that
  • 43:25 - 43:29
    with other sectoral legislative attempts
    coming from the European Commission, we
  • 43:29 - 43:33
    might see the same outcome. It will be
    very interesting to see how that will
  • 43:33 - 43:37
    actually then play together with the
    Digital Services Act, which is trying to
  • 43:37 - 43:43
    do the exact opposite, to actually fix these
    negative legislative efforts that we see
  • 43:43 - 43:48
    at the EU level with these sectoral
    legislations, but also with the member
  • 43:48 - 43:52
    states at the national level. I could also
    mention the European Commission reaction
  • 43:52 - 43:56
    to some national legislative proposals.
    But Christoph, I would leave that to you
  • 43:56 - 44:02
    and please expand.
    Christoph: I think you explained it
  • 44:02 - 44:06
    perfectly, and the only thing I can
    supplement here is that if you look at
  • 44:06 - 44:10
    this move from sectoral legislation
    to horizontal
  • 44:10 - 44:16
    legislation, now back to sectoral
    legislation - it's a problem, it's a mess.
  • 44:16 - 44:24
    First, the two sides are not very well
    coordinated, which brings trouble for
  • 44:24 - 44:29
    legal certainty. It makes it very
    troublesome for platforms to follow up.
  • 44:29 - 44:34
    And it's problematic for us, for us in the
    space. We are some sort of lobbyist as
  • 44:34 - 44:38
    well, just for public interest. But you
    will have to deal with copyright,
  • 44:38 - 44:42
    with CSAM, with TERREG, with end-to-
    end encryption, DSA, DMA and 15 other
  • 44:42 - 44:48
    proposals that pop up, content by content. It's
    very hard to manage to have the capacity
  • 44:48 - 44:51
    ready to be early in the debate, and it's
    so important to be early in the debate to
  • 44:51 - 44:55
    prevent that from happening. And I think
    that's a huge challenge for us, to have
  • 44:55 - 45:00
    something for us to reflect on in the next
    days. How can we join forces better in a
  • 45:00 - 45:05
    more systematic way in order to really
    follow up on all those initiatives? That's
  • 45:05 - 45:12
    for me, a very problematic development.
    Herald: So in summary it's a mess. So it
  • 45:12 - 45:18
    is related, but we can't explain how,
    because it's such a mess. Fair enough. I
  • 45:18 - 45:23
    have another question for you, Eliska.
    That is, someone was asking how the proposed
  • 45:23 - 45:29
    Good Samaritan clause works compared to..
    how it currently works in Germany. But I
  • 45:29 - 45:33
    think it's a bit unreasonable to expect
    everyone to know how it works in Germany.
  • 45:33 - 45:39
    I would rephrase it as: how does this
    proposed Good Samaritan clause work
  • 45:39 - 45:42
    compared to how it is now under the
    e-Commerce Directive?
  • 45:42 - 45:51
    Eliska: Thank you very much. Yeah, so a
    great question again. I think, first, if
  • 45:51 - 45:56
    we put it into the context of the EU law,
    and apologies that I cannot really
  • 45:56 - 46:00
    compare to the German context,
    I really don't dare to, I'm not a
  • 46:00 - 46:05
    German lawyer, so I wouldn't like to step
    into those waters. But first of all, there
  • 46:05 - 46:11
    is no Good Samaritan clause per se within
    the scope of e-Commerce Directive. It did
  • 46:11 - 46:17
    not really exist within the law. And I'm
    using the past tense now because DSA is
  • 46:17 - 46:22
    trying to change that. So that level of
    legal certainty was not really, really
  • 46:22 - 46:26
    there for the platforms. There was the
    conditional model of the liability, which
  • 46:26 - 46:31
    is still preserved within the regulation.
    But if you think of a Good Samaritan
  • 46:31 - 46:35
    clause as we know it from Section
    230, or let's use that Good Samaritan
  • 46:35 - 46:38
    clause as an example, because also
    e-Commerce Directive was actually drafted
  • 46:38 - 46:43
    as a response to Communication Decency Act
    that was the legislation that puts things
  • 46:43 - 46:50
    into motion. So that's the first
    important point. I explained at the
  • 46:50 - 46:55
    beginning in my presentation what was then
    happening in the space of combating
  • 46:55 - 47:01
    illegal content at the EU level, and
    especially I would refer to the
  • 47:01 - 47:06
    communication that the European Commission
    published, I think, back in 2018, where it
  • 47:06 - 47:12
    actually encouraged and called on online
    platforms to proactively engage with
  • 47:12 - 47:18
    illegal content and use these proactive
    measures to actually seek an adequate
  • 47:18 - 47:22
    response to illegal content. Now, combine
    that with this conditional model of
  • 47:22 - 47:28
    liability, which is of course defined by
    the platform obtaining actual knowledge,
  • 47:28 - 47:33
    and that created the perfect storm that
    I already explained. So the platforms knew
  • 47:33 - 47:37
    that they are kind of pushed by the
    legislature to actually seek these active
  • 47:37 - 47:42
    responses to illegal content, often
    deploying automated measures. But they
  • 47:42 - 47:47
    didn't have any legal certainty or
    security on their side that if they do so,
  • 47:47 - 47:51
    they won't end up ultimately being held
    legally liable and face legal consequences
  • 47:51 - 47:56
    as a result of obtaining actual knowledge
    through those proactive measures that were
  • 47:56 - 48:02
    kind of the tool, how they could possibly
    actually obtain that knowledge. Now, what
  • 48:02 - 48:08
    DSA does, it simply
    states, and I think it's Article 6 in the
  • 48:08 - 48:14
    Digital Services Act, if I'm not mistaken,
    and I can even open it, it
  • 48:14 - 48:21
    basically says that platforms can use
    these proactive measures, you know,
  • 48:21 - 48:28
    continue using some tools that actually
    seek to provide some responses to this
  • 48:28 - 48:32
    type of content, without the fear of being
    held liable. So it's an article which
  • 48:32 - 48:37
    has approximately, I think, two
    paragraphs, but it's finally in the
  • 48:37 - 48:41
    legislation and that means that it will
    help to reinforce the level of legal
  • 48:41 - 48:45
    certainty. I would also emphasize that
    very often in Europe, when we discuss Good
  • 48:45 - 48:50
    Samaritan clause, and Good Samaritan is
    actually a very unfortunate term, because
  • 48:50 - 48:55
    it's very much connected to the American
    legal tradition. But when it's being mixed
  • 48:55 - 48:59
    up with the conditional model of liability
    and with the prohibition of general
  • 48:59 - 49:03
    monitoring, which is still upheld, these
    are the main principles of the European
  • 49:03 - 49:09
    intermediary liability law and the
    regime that is applicable within the EU,
  • 49:09 - 49:14
    such a safeguard can be actually
    beneficial, and it hopefully won't lead to
  • 49:14 - 49:19
    this blanket immunity for online
    platforms or to this idea that platforms
  • 49:19 - 49:23
    will be able to do whatever they want with
    the illegal content without any public
  • 49:23 - 49:25
    scrutiny, because there are other
    measures, safeguards and principles in
  • 49:25 - 49:31
    place as a part of conditional model of
    liability that we have here in Europe. So
  • 49:31 - 49:36
    I'm sorry, maybe that was a too complicated,
    legalistic explanation there. But this is
  • 49:36 - 49:40
    how these provisions should work in
    practice. We, of course, have to wait for
  • 49:40 - 49:45
    the implementation of the law and see how
    that will turn out. But the main purpose
  • 49:45 - 49:50
    is that this legal certainty that was
    lacking until now can finally come to its
  • 49:50 - 49:55
    existence, which should help us to prevent
    over removal of legitimate speech from
  • 49:55 - 50:01
    online platforms.
    Herald: OK, thank you. I have two other
  • 50:01 - 50:05
    questions from the Internet about
    interoperability, and I suppose I should
  • 50:05 - 50:13
    look at Christoph for them. The last one
    I'm going to ask first is: would such
  • 50:13 - 50:18
    interoperability make it much more
    difficult to combat harassment and
  • 50:18 - 50:22
    stalking on the Internet? How do you
    police that kind of misbehavior if it's
  • 50:22 - 50:30
    across different platforms that are forced
    to interoperate and also become conduits for
  • 50:30 - 50:36
    such bad behavior. And I'll come to the
    earlier question if you've answered this
  • 50:36 - 50:40
    question Christoph.
    Christoph: It's a pretty good question.
  • 50:40 - 50:47
    First, to understand our idea of
    interoperability is to understand that we
  • 50:47 - 50:54
    would like it to be an obligation on
    large platforms and a right
  • 50:54 - 51:00
    of smaller platforms, actually, to make
    use of interoperability. So it should not
  • 51:00 - 51:05
    be only among the big platforms. Small
    platforms should be able to connect to the
  • 51:05 - 51:12
    big platforms. And second, we believe it
    will help and not make it worse because we
  • 51:12 - 51:16
    have now a problem of hate speech, we have
    now a problem of a lack of privacy, we
  • 51:16 - 51:23
    have now a problem of the attention
    industry that works with, you know,
  • 51:23 - 51:29
    certain pictures, certain frames to trigger
    the attention of users, because users
  • 51:29 - 51:32
    don't have a choice of content
    moderation, for instance, you don't have a
  • 51:32 - 51:37
    choice to see which kind of content should
    be shown. And users don't have options to
  • 51:37 - 51:43
    regulate their privacy. The idea of more
    competitors would be exactly that I can
  • 51:43 - 51:51
    move to a space where I'm not harassed
    and not made subject to certain content
  • 51:51 - 51:57
    that hurts my feelings, right? And at that
    moment I get control. I can choose a
  • 51:57 - 52:02
    provider that gives me those options and
    we would like to go even a step further,
  • 52:02 - 52:07
    beyond interoperability as a start. We
    believe if users want to, they should be
  • 52:07 - 52:11
    able to delegate a third party company or
    piece of a third party software to
  • 52:11 - 52:15
    interact with the platform on their
    behalf. So users would have the option to
  • 52:15 - 52:19
    see a news feed in a different order,
    calibrate their own filters on this
  • 52:19 - 52:23
    information. So in this sense,
    interoperability can be a great tool,
  • 52:23 - 52:29
    actually, to tackle hate speech and that
    sort of negative development. Of course,
  • 52:29 - 52:34
    there's a risk to it. I think the risk
    comes rather from the data industry side
  • 52:34 - 52:38
    again, that we need to take care not to
    place one or another data-selling
  • 52:38 - 52:44
    industry on top of the one we already face.
    But for this, we have safeguards to make sure
  • 52:44 - 52:47
    that can't happen. But to answer
    the question, we believe interoperability
  • 52:47 - 52:53
    is a tool actually to escape from the
    negative developments you had mentioned.
  • 52:53 - 52:59
    Herald: Critical counter question for me
    then, aren't you actually advocating for
  • 52:59 - 53:04
    just rolling your own recommendation engine
    to be able to do so? Can't you achieve
  • 53:04 - 53:10
    that without interoperability?
    Christoph: Sure. The counter question: Do
  • 53:10 - 53:15
    you think an average user can accomplish
    that quite easily? You know, like when we
  • 53:15 - 53:21
    look at the Internet through the lens of
    market competition then we see that it is
  • 53:21 - 53:27
    the dominance of platforms over data that
    has created those spaces, those walled
  • 53:27 - 53:32
    gardens where users feel trapped and
    cannot escape. And there are so
  • 53:32 - 53:36
    many alternative options that can't get
    off the ground because users feel trapped,
  • 53:36 - 53:41
    don't want to leave their friends behind
    and don't have options, actually to have a
  • 53:41 - 53:46
    better moderation system. Of course, you
    can be creative and, you know, use plugins
  • 53:46 - 53:50
    and whatever you see fit, but you need to
    stay within the platform barriers. But we
  • 53:50 - 53:54
    would like to enable users to actually
    leave the walled garden, go to another
  • 53:54 - 53:59
    place, but still stay in touch with
    friends who have made the choice to remain
  • 53:59 - 54:01
    there. And I think that's perhaps the
    difference to what you had in mind.
  • 54:01 - 54:06
    Herald: I have a follow up question. Well,
    another question from the Internet,
  • 54:06 - 54:12
    regardless of interoperability, and that
    is, historically speaking, as soon as the
  • 54:12 - 54:16
    big players get involved in certain
    standards, they tend to also shape policy
  • 54:16 - 54:24
    by being involved in that. How would that
    be different in the case of
  • 54:24 - 54:29
    interoperability? And, as specifically
    mentioned by the person who asked the
  • 54:29 - 54:31
    question: Mastodon probably
    flourishes because nobody big was
  • 54:31 - 54:36
    involved in its standards.
    Christoph: A - it's an excellent question.
  • 54:36 - 54:42
    And B, we struggled with the question of
    standards ourselves in our policy paper,
  • 54:42 - 54:47
    which is our recommendations for the European
    Union to enact certain provisions in the
  • 54:47 - 54:56
    new Digital Services Act. We abstained from
    asking to establish new standards like API
  • 54:56 - 55:01
    standards. We believe it's a bad idea to
    regulate technology like that. What we do
  • 55:01 - 55:05
    want is that big platforms just
    offer interoperability. The point is that
  • 55:05 - 55:11
    we don't want to have a standard that can
    be either monopolized again or lobbied by
  • 55:11 - 55:16
    the big platforms? Because then we end up
    with the standards we already see,
  • 55:16 - 55:21
    which we don't like. But it's a good
    question. And what we did with our
  • 55:21 - 55:25
    policy principles on interoperability is
    to give some food for thought on how we
  • 55:25 - 55:31
    believe the end version should look,
    but many questions remain and we
  • 55:31 - 55:36
    don't know exactly how to get
    there.
  • 55:36 - 55:39
    Herald: Yeah, I'm sorry for sticking to the
    topic of interoperability, because most
  • 55:39 - 55:44
    questions are actually about that. One of
    the other questions is how do we prevent
  • 55:44 - 55:49
    this from getting messed up like it
    happened with PSD2? And for the audience
  • 55:49 - 55:55
    that don't know about PSD2 - PSD2 is a
    Directive that forced banks to open up
  • 55:55 - 55:59
    APIs to other financial service providers,
    which is also interoperability between
  • 55:59 - 56:03
    platforms, in this case for banking
    platforms, which comes with all sorts of
  • 56:03 - 56:08
    privacy questions that weren't completely
    thought through when that legislation came
  • 56:08 - 56:13
    about. Sorry for this long-winded
    introduction, Christoph, but I think it
  • 56:13 - 56:16
    is needed for people who don't know what
    PSD2 means.
  • 56:16 - 56:21
    Christoph: It's a good question.
    Interestingly, we never used PSD2 or the
  • 56:21 - 56:24
    Telecommunications Act as negative examples;
    because both have interoperability options,
  • 56:24 - 56:29
    we always used them as examples that,
    hey, it's already possible. So you don't
  • 56:29 - 56:33
    have an excuse to say it's impossible to
    put it in the law. What is true is that
  • 56:33 - 56:39
    there's a lot of mess around the question
    of how to avoid the mess, it's a question
  • 56:39 - 56:42
    of... that's politics again. So the question
    of whether policy makers are actually
  • 56:42 - 56:47
    listening to us or listening to industry
    lobbyists. So the one who raised the
  • 56:47 - 56:51
    question is absolutely right - there's a
    huge risk for every topic we talk about.
  • 56:51 - 56:56
    But interoperability, whether it's user
    control over content, targeted ads,
  • 56:56 - 57:00
    liability, everything that we believe
    should be the law, of course, could be
  • 57:00 - 57:06
    hijacked, could be redesigned in a way
    that it will lead to more problems than
  • 57:06 - 57:11
    fewer problems. So, indeed, for every
    policy question we raise, we need to ask
  • 57:11 - 57:15
    ourselves, is it worth the fight to risk
    opening Pandora's box? Do we make
  • 57:15 - 57:22
    it worse? What we said is on that front -
    we are happy to apply pressure and what
  • 57:22 - 57:25
    we need to do in the next year is to
    convince them that we are the right people
  • 57:25 - 57:31
    to talk to. And that's perhaps a challenge
    how to make that explicable to policy
  • 57:31 - 57:35
    makers. So those who asked the questions, I
    think, should help us to come, in this
  • 57:35 - 57:40
    process, to the parliament and tell people
    how it's going to work.
  • 57:40 - 57:49
    Herald: On those.. a question to both of
    you. Citizen enforcement, I prefer the
  • 57:49 - 57:55
    term citizen over users. Would it be helpful
    to push for amendments in the parliament
  • 57:55 - 58:05
    for at least the targeting points you both
    mentioned before? And if so, how?
  • 58:05 - 58:15
    Eliska: So I guess that I will start just
    so Christoph can rest a little. So the
  • 58:15 - 58:19
    question was whether it would be useful to
    push for those amendments? Was that
  • 58:19 - 58:21
    right?
    Herald: For amendments that cover the
  • 58:21 - 58:26
    targeting of citizens.
    Eliska: Absolutely. So there is, of
  • 58:26 - 58:31
    course, a short and a long answer to every
    question. And so the short answer would be
  • 58:31 - 58:39
    yes, but given that the wording of such an
    amendment will be precise and nuanced. We
  • 58:39 - 58:42
    are still working out our positioning on
    online targeting and I think we all.. we
  • 58:42 - 58:46
    can name those practices that we don't
    want to see being deployed by platforms
  • 58:46 - 58:51
    and where we can actually imagine a proper
    ban on such practices. We have recently
  • 58:51 - 58:58
    published one of our blog posts where we
    actually unfold the way of thinking at
  • 58:58 - 59:02
    Access Now currently, you know, how we
    are brainstorming about this whole issue.
  • 59:02 - 59:07
    And as I said, especially that
    targeting that uses behavioral data of
  • 59:07 - 59:11
    users, citizens, then maybe let's go for
    individuals because they are also obliged
  • 59:11 - 59:19
    to protect the rights of individuals that
    are not their citizens. So that's
  • 59:19 - 59:22
    definitely one form where we can
    definitely see and will be supporting the
  • 59:22 - 59:26
    ban and a possible phase-out that will
    actually lead to a ban. The same goes for
  • 59:26 - 59:34
    the cross-site tracking of users due to
    the fact of how users' data are being abused
  • 59:34 - 59:40
    again as being the integral part of the
    business models of these platforms and so
  • 59:40 - 59:44
    on and so forth. So that's one of the
    directions we will definitely be taking.
  • 59:44 - 59:49
    And again, we are inviting all of you to
    help us out, to brainstorm together with
  • 59:49 - 59:53
    us, to assess different options,
    directions that we should take into
  • 59:53 - 59:58
    consideration and not forget about. But I
    personally think that this will be one of
  • 59:58 - 60:04
    the main battles when it comes to DSA,
    where we will definitely need to be on the
  • 60:04 - 60:10
    same page and harmonize and join
    forces, because DSA gives us a good ground
  • 60:10 - 60:16
    at the moment, but it doesn't go far
    enough. So, yes, definitely the answer is
  • 60:16 - 60:20
    yes, but given that, we will have a very
    nuanced position so we know what we are
  • 60:20 - 60:25
    asking for and we are taking into
    consideration also those other aspects
  • 60:25 - 60:30
    that could eventually play out badly in
    practice. So good intentions are not
  • 60:30 - 60:35
    enough when it comes to DSA.
    Herald: Thank you. I think we are slightly
  • 60:35 - 60:43
    over time, but I've been told beforehand
    it's OK to do so for a few minutes and
  • 60:43 - 60:48
    there's three questions that are open. One
    of them, I will answer myself. That
  • 60:48 - 60:50
    is basically: have the member states
    responded? And the answer to that is no.
  • 60:50 - 60:54
    The member states have not taken any
    position. And two others, I think are
  • 60:54 - 61:00
    quite interesting and important from
    technical perspective. One is: is there
  • 61:00 - 61:03
    anything you see that might affect current
    decentralized platforms like the
  • 61:03 - 61:11
    Fediverse, Mastodon? And the other is:
    will any review of the data protection,
  • 61:11 - 61:18
    sorry, the Database Protection Directive
    affect search engines and the
  • 61:18 - 61:24
    interaction of these again?
    Christoph: Perhaps I jump in and Eliska,
  • 61:24 - 61:28
    you take over. First, member states have
    given an opinion actually on DSA, there
  • 61:28 - 61:33
    have been one or two official submissions,
    plus joint letters, plus discussions in
  • 61:33 - 61:39
    council when the DSA was presented.
    I have some nice protocols, which showed
  • 61:39 - 61:44
    the different attitudes of member states
    towards it. So for us, it also means we
  • 61:44 - 61:49
    need to work straight away with the
    council to ensure that the package would
  • 61:49 - 61:56
    be good. What was the question? Ah, yes, I
    think the answer to the question depends
  • 61:56 - 62:01
    on what lawyers call the material scope of
    application - whether it would apply to
  • 62:01 - 62:06
    those platform models at all. That..
    Eliska you can help me out here. You have
  • 62:06 - 62:09
    always criticized for the e-Commerce
    Directive that it was not quite clear how
  • 62:09 - 62:15
    it would relate to first non profit
    platforms. And many of these alternative
  • 62:15 - 62:20
    platforms are like that, because there was
    this issue of providing a service against
  • 62:20 - 62:23
    remuneration and it's not quite clear
    what it means. Would it apply to Wikipedia
  • 62:23 - 62:27
    if you get, like, donations? Would it have
    applied to a blogger if you have, like,
  • 62:27 - 62:32
    pop-ups, ads or something like that. So
    that's, I think, one huge question. And
  • 62:32 - 62:38
    the second question is how much those due
    diligence obligations would force
  • 62:38 - 62:44
    alternative platforms' governance models
    to redesign their interfaces. And those are
  • 62:44 - 62:48
    open questions for us. We have not analyzed
    that in detail, but we see that.. we are
  • 62:48 - 62:53
    worried that it would not only impact the
    large platforms, but many others as well.
  • 62:53 - 62:58
    What do you think Eliska?
    Eliska: Yeah, I can only agree and
  • 62:58 - 63:04
    especially regarding the non-profit
    question. Yeah, this was and has
  • 63:04 - 63:10
    always been one of our main asks for
    nonprofit organizations. And actually it's
  • 63:10 - 63:14
    not quite clear how that will play out now
    in practice, by the way, given how the DSA stands,
  • 63:14 - 63:17
    because at the moment it actually speaks
    about online platforms and then it speaks
  • 63:17 - 63:31
    about very large online platforms, but
    what it will be and how it will impact
  • 63:31 - 63:36
    nonprofit organizations, whether these are
    bloggers or organizations like civil
  • 63:36 - 63:44
    rights organizations, that remains to be
    seen. I also know that the European Court
  • 63:44 - 63:49
    of Human Rights in its jurisprudence tried
    to establish some principles for the
  • 63:49 - 63:56
    non-profits since Delfi AS versus Estonia
    and then with the MTE decision and the
  • 63:56 - 64:01
    Swedish case that followed afterwards. But
    I'm not sure how well it was actually
  • 64:01 - 64:05
    elaborated later on. But this is something
    we will be definitely looking at and
  • 64:05 - 64:14
    working on further and regarding the
    impact on the smaller players and also
  • 64:14 - 64:18
    the interface idea, this is still
    something we are actually also wondering
  • 64:18 - 64:26
    about and thinking how this will turn out
    in practice. And we are hoping to actually
  • 64:26 - 64:31
    develop our positioning further on these
    issues as well, because we actually
  • 64:31 - 64:37
    started working, all of us on DSA and on
    our recommendations already. I think a
  • 64:37 - 64:41
    year ago, maybe a year and a half ago when
    we were working just with the leaks that
  • 64:41 - 64:45
    Politico or other media platforms in
    Brussels were actually sharing with us.
  • 64:45 - 64:49
    And we were working with the bits and
    pieces and trying to put our thinking
  • 64:49 - 64:53
    together. Now we have the draft and I
    think we need to do another round of very
  • 64:53 - 64:58
    detailed thinking. What will be ultimately
    our position or what will be those
  • 64:58 - 65:01
    recommendations and amendments in the
    European Parliament we will be supporting
  • 65:01 - 65:07
    and pushing for. So it's a period of
    hard work for all of us. Not to mention
  • 65:07 - 65:15
    that I always say that, you know, we stand
    against a huge lobby power that is going
  • 65:15 - 65:22
    to be and is constantly being exercised by
    these private actors. But I also have to
  • 65:22 - 65:25
    say that we had a very good cooperation
    with the European Commission throughout
  • 65:25 - 65:29
    the process. And I think that I can say
    that on behalf of all of us, that we feel
  • 65:29 - 65:34
    that the European Commission really listened
    this time. So, yeah, more question marks
  • 65:34 - 65:38
    than answers here from me, I think.
    Herald: This is fine with me, not knowing
  • 65:38 - 65:43
    something is fine. I think we've
    definitely run out of time. Thank you both
  • 65:43 - 65:48
    for being here. Well, enjoy the 2D online
    world and the Congress. Thank you. Let's
  • 65:48 - 65:51
    end our session here. Thank you all.
    Christoph: Thank you.
  • 65:51 - 65:54
    Eliska: Thank you very much. Bye.
  • 65:54 - 66:38
    Subtitles created by c3subtitles.de
    in the year 2021. Join, and help us!
Title:
#rC3 - The EU Digital Services Act package
Description:

Video Language:
English
Duration:
01:06:38
