WEBVTT
00:00:00.000 --> 00:00:12.719
rC3 preroll music
00:00:12.719 --> 00:00:19.690
Herald: And the talk that is about to
begin now is by Christoph Schmon from EFF
00:00:19.690 --> 00:00:23.890
and Eliska Pirkova from Access Now and
they will be talking about what is probably
00:00:23.890 --> 00:00:28.759
the biggest legislative initiative
since the GDPR in Brussels. It's called
00:00:28.759 --> 00:00:38.885
the Digital Services Act. And onto you
two. You're muted.
00:00:38.885 --> 00:00:43.109
Eliska Pirkova: I realized. Hello!
First of all, thank you very much for
00:00:43.109 --> 00:00:47.710
having us today. It's a great pleasure to
be part of this event. I think for
00:00:47.710 --> 00:00:51.979
both of us, it's the first time we are
actually joining this year. So it's
00:00:51.979 --> 00:00:56.949
fantastic to be part of this great
community. And today we are going to talk
00:00:56.949 --> 00:01:03.229
about the legislative proposal, the EU
proposal which is causing a lot of noise all
00:01:03.229 --> 00:01:07.700
around Europe, but not only in Europe,
also beyond. And that's the Digital
00:01:07.700 --> 00:01:11.950
Services Act legislative package. Today
we already know that this
00:01:11.950 --> 00:01:16.390
legislative package actually consists of
two acts: the Digital Services Act and the
00:01:16.390 --> 00:01:21.829
Digital Markets Act. And both of them will
significantly change the regulation of
00:01:21.829 --> 00:01:26.720
online platforms with a specific focus on
very large online platforms, also often
00:01:26.720 --> 00:01:31.810
referred to as gatekeepers. So those who
actually hold a lot of economic dominance,
00:01:31.810 --> 00:01:37.320
but also a lot of influence and control
over users' rights and the public
00:01:37.320 --> 00:01:42.100
discourse. So I'm going to start with
giving you a quick introduction into
00:01:42.100 --> 00:01:46.509
what's the fuss about, what is actually
the DSA about, why we are also interested
00:01:46.509 --> 00:01:50.460
in it and why we keep talking about it,
and why this legislation will keep us
00:01:50.460 --> 00:01:57.049
preoccupied for years to come. The DSA was
already announced two years ago as part
00:01:57.049 --> 00:02:05.409
of the European Union's digital strategy, and it
was named as one of the actions that
00:02:05.409 --> 00:02:11.020
the digital strategy would actually
consist of. And it was the promise that
00:02:11.020 --> 00:02:14.700
the European Commission gave us already at
that time to create the systemic
00:02:14.700 --> 00:02:19.650
regulation of online platforms that
hopefully places the users and
00:02:19.650 --> 00:02:25.790
their rights at the center of this
upcoming legislation. So the promise
00:02:25.790 --> 00:02:30.879
behind the DSA is that this ad-based
Internet, and I'm speaking now about ad tech
00:02:30.879 --> 00:02:34.319
and online targeting, the
Internet as we knew it, will be
00:02:34.319 --> 00:02:38.760
actually replaced by something that puts
users, user controls and users' rights
00:02:38.760 --> 00:02:43.810
first. So both of these
laws, implemented and drafted
00:02:43.810 --> 00:02:49.209
right, should be actually achieving that
goal in the future. Now, previously,
00:02:49.209 --> 00:02:54.319
before the DSA was actually drafted, there was
the so-called e-Commerce Directive in
00:02:54.319 --> 00:02:58.480
place that actually established the basic
principles, especially in the field of
00:02:58.480 --> 00:03:03.190
content governance. I won't go into
details on that because I don't want to
00:03:03.190 --> 00:03:07.830
make this too legalistic. But ultimately,
the DSA is supposed not to
00:03:07.830 --> 00:03:12.519
completely replace but to build on top
of this legislation, which actually created
00:03:12.519 --> 00:03:17.280
the ground rules and the main legal regime in
Europe for almost 20 years for regulating
00:03:17.280 --> 00:03:23.330
user generated content. So the DSA and DMA,
as legislation, will seek to harmonize
00:03:23.330 --> 00:03:28.620
the rules addressing problems such as
online hate speech and disinformation. But it
00:03:28.620 --> 00:03:33.310
also puts emphasis finally on increased
meaningful transparency in online
00:03:33.310 --> 00:03:37.829
advertising and on the way content is
actually being distributed across
00:03:37.829 --> 00:03:44.519
platforms and also will develop a specific
enforcement mechanism that will be
00:03:44.519 --> 00:03:49.810
actually looking into it. Now, before I
actually go into the details on the DSA,
00:03:49.810 --> 00:03:54.349
why the DSA matters, and whether we actually
need such a big new legislative reform
00:03:54.349 --> 00:03:58.020
coming from the European
Commission, I want to just unpack for
00:03:58.020 --> 00:04:03.200
you a little bit what this legislative
package actually consists of. So, as I
00:04:03.200 --> 00:04:07.409
already mentioned, two regulations... the
regulation being the strongest legal instrument
00:04:07.409 --> 00:04:11.989
the European Commission actually has in its
hands, which is supposed to achieve the
00:04:11.989 --> 00:04:16.380
highest level of harmonization across the
member states. And we all can imagine how
00:04:16.380 --> 00:04:20.120
difficult that will be to achieve,
especially in the realm of freedom of
00:04:20.120 --> 00:04:24.910
expression and particular categories of
user generated content, which are so deeply
00:04:24.910 --> 00:04:29.960
context dependent. All of the issues
related to content moderation and content
00:04:29.960 --> 00:04:35.550
curation will be mainly in the realm of the
Digital Services Act. And then the second
00:04:35.550 --> 00:04:39.710
regulation, the Digital Markets Act, will
be specifically looking at the dominance
00:04:39.710 --> 00:04:44.620
of online platforms, economic dominance,
competitive environment for smaller
00:04:44.620 --> 00:04:49.750
players, and the fairness of competition.
And it will also establish the list of
00:04:49.750 --> 00:04:54.510
dos and don'ts for gatekeeper
platforms. So that is, the
00:04:54.510 --> 00:04:59.650
platforms that actually hold a relevant
dominance. And now, based on these new
00:04:59.650 --> 00:05:03.920
proposals, we know that these platforms
are mainly called very large online
00:05:03.920 --> 00:05:10.040
platforms. So this is exactly how the
legislation refers to gatekeepers now. And
00:05:10.040 --> 00:05:15.259
now I think one more point that I want to
make is that the DSA and the DMA were
00:05:15.259 --> 00:05:20.520
launched on the 15th of December 2020. So it
was literally a Christmas present given to
00:05:20.520 --> 00:05:24.610
the digital rights community by the
European Commission, a long time
00:05:24.610 --> 00:05:30.530
anticipated one. The work on the DSA
started, however, much earlier. The Electronic
00:05:30.530 --> 00:05:34.800
Frontier Foundation, as well as Access
Now, together with EDRi, were working very
00:05:34.800 --> 00:05:39.100
hard to come up with priorities and
recommendations for what we would like to see
00:05:39.100 --> 00:05:42.660
enshrined within these
laws, because from the beginning we
00:05:42.660 --> 00:05:46.690
understood the importance and the far
reaching consequences this legislation
00:05:46.690 --> 00:05:51.690
will have not only inside of the European
Union, but also beyond. And that brings me
00:05:51.690 --> 00:05:55.140
to the final introductory point that I
want to make before I will hand over to
00:05:55.140 --> 00:06:00.731
Chris, which is whether and why we actually
need the DSA. We strongly believe that there
00:06:00.731 --> 00:06:06.139
is a big justification and good reason to
actually establish a systemic regulation
00:06:06.139 --> 00:06:11.190
of online platforms in order to secure
users' fundamental rights, empower them,
00:06:11.190 --> 00:06:15.580
and also to protect our democratic
discourse. And this is due to the fact
00:06:15.580 --> 00:06:20.840
that for many years we have been witnessing
quite bad regulatory practices in
00:06:20.840 --> 00:06:25.231
platform governance that are coming from
member states, and Chris will provide a
00:06:25.231 --> 00:06:29.840
very concrete example in that regard, but
also coming from the European Commission
00:06:29.840 --> 00:06:35.930
itself, mainly the proposed online terrorist
content regulation, for instance, or we
00:06:35.930 --> 00:06:40.670
all remember the story of copyright that
Chris will discuss a little bit further.
00:06:40.670 --> 00:06:45.370
We saw how not only the member states but
also the European Commission, or the European
00:06:45.370 --> 00:06:52.050
Union, in order to actually establish some
order in the digital space, started
00:06:52.050 --> 00:06:56.060
pushing the state's obligations, and
especially the state's positive obligation to
00:06:56.060 --> 00:07:00.940
protect the users' human rights, into the
hands of private online platforms that
00:07:00.940 --> 00:07:07.940
started replacing state actors and public
authorities within the online space. They
00:07:07.940 --> 00:07:11.900
started assessing the content, the
legality of the content, deciding under
00:07:11.900 --> 00:07:16.010
very short time frames what should stay
online and what should go offline with no
00:07:16.010 --> 00:07:20.519
public scrutiny or transparency about
practices that they kept deploying. And
00:07:20.519 --> 00:07:25.590
they keep deploying to this day. Of
course, platforms under the threat of
00:07:25.590 --> 00:07:31.140
legal liability often had to rely, and
still have to rely, on content
00:07:31.140 --> 00:07:36.180
recognition technologies for removing user
generated content. A typical example could
00:07:36.180 --> 00:07:40.190
be the French Avia law, which will be
mentioned again today during the presentation.
00:07:40.190 --> 00:07:45.440
And the typical time frames are usually
those that extend from one hour to 24
00:07:45.440 --> 00:07:49.060
hours, which is extremely short,
especially if any users would like to
00:07:49.060 --> 00:07:54.900
appeal such a decision or seek an
effective remedy. At the same time, due to
00:07:54.900 --> 00:08:00.860
the lack of harmonization and the lack of a
proper set of responsibilities that should
00:08:00.860 --> 00:08:04.129
lie in the hands of these online
platforms, there was a lack of legal
00:08:04.129 --> 00:08:08.550
certainty, which only reinforced the
vicious circle of removing more and more
00:08:08.550 --> 00:08:14.909
of online content in order to escape any
possible liability. And at the end, to
00:08:14.909 --> 00:08:19.940
this day, due to the lack of transparency,
we lack any evidence-based or research-based
00:08:19.940 --> 00:08:24.340
policy making, because platforms do not
want to share or inform the public
00:08:24.340 --> 00:08:28.350
authorities what they actually do with the
content, how they moderate and those
00:08:28.350 --> 00:08:32.320
transparency information that we receive
within their transparency reports is
00:08:32.320 --> 00:08:36.880
usually quantity oriented instead of
quality oriented. So they focus on how much content
00:08:36.880 --> 00:08:42.030
is actually being removed and how fast,
which is not enough in order to create
00:08:42.030 --> 00:08:47.010
laws that can actually provide any more
sustainable solutions. And ultimately, as
00:08:47.010 --> 00:08:52.600
we all agree, the core issue doesn't lie
that much with how the content is being
00:08:52.600 --> 00:08:57.440
moderated, but how content is being
distributed across platforms within the
00:08:57.440 --> 00:09:02.600
core of their business models, which
actually stand on the attention economy and
00:09:02.600 --> 00:09:07.580
on the way sensational content is
often being amplified in order to actually
00:09:07.580 --> 00:09:12.930
prolong the attention span of users that
visit platforms on a regular basis. And I'm
00:09:12.930 --> 00:09:18.000
packing quite a few issues here. And this is
supposed to be just a quick
00:09:18.000 --> 00:09:22.370
introductory remark. I will now hand over
to Chris, who will elaborate on all these
00:09:22.370 --> 00:09:26.080
points a little bit further, and then we
take a look and unpack a few quite
00:09:26.080 --> 00:09:31.200
important parts of the DSA that we feel
should be prioritized in this debate at
00:09:31.200 --> 00:09:36.980
the moment. Chris, over to you.
Christoph Schmon: Hi everybody. I'm
00:09:36.980 --> 00:09:42.860
quite sure that many of you have noticed
that there's a growing appetite from the
00:09:42.860 --> 00:09:47.390
side of the European Union to regulate the
Internet by using online platforms, as the
00:09:47.390 --> 00:09:52.370
helping hands to monitor and censor what
users can say/share/do online. As you
00:09:52.370 --> 00:09:58.940
see on the slide, the first highlight of
this growing appetite was copyright upload
00:09:58.940 --> 00:10:02.880
filters, which are supposed to stop
copyright protected content online.
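NOTE
[Editor's sketch, not part of the talk: a minimal Python illustration of the
point made here, that upload filters match content but cannot understand
context. All names and the blocklist are hypothetical; real systems use
perceptual matching, but the failure mode is the same.]
import hashlib
def fingerprint(data: bytes) -> str:
    # Reduce an upload to a fingerprint; any context around the bytes is lost here.
    return hashlib.md5(data).hexdigest()
BLOCKLIST = {fingerprint(b"the protected film")}  # hypothetical rightsholder reference
def allowed(upload: bytes) -> bool:
    # The only question the filter can ask is "does this match?", never "is this lawful?".
    return fingerprint(upload) not in BLOCKLIST
print(allowed(b"the protected film"))         # False: blocked, even if the use were licensed
print(allowed(b"a review quoting the film"))  # True, but only because the bytes differ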
00:10:02.880 --> 00:10:07.280
Thousands of people, old and young, went
on the streets to demonstrate for a free
00:10:07.280 --> 00:10:11.290
Internet, to demonstrate against technical
measures to turn the Internet into some
00:10:11.290 --> 00:10:15.670
sort of censorship machine. We've made a
point then, and we continue making the
00:10:15.670 --> 00:10:20.700
point now that upload filters are prone to
error, that upload filters cannot
00:10:20.700 --> 00:10:24.330
understand context, that they are
unaffordable by all but the largest tech
00:10:24.330 --> 00:10:29.670
companies, which happen to be all based in
the United States. But as you well know,
00:10:29.670 --> 00:10:34.460
policymakers would not listen and
Article 13 of the copyright directive was
00:10:34.460 --> 00:10:39.750
adopted by a small margin in the European
Parliament, also because some members of
00:10:39.750 --> 00:10:44.410
the European Parliament had trouble pressing
the right buttons, but I think it's
00:10:44.410 --> 00:10:49.510
important for us to understand that the
fight is far from over. The member states
00:10:49.510 --> 00:10:53.170
must now implement the directive in a
way that is not at odds with fundamental
00:10:53.170 --> 00:10:58.130
rights. And we argue that mandated
automated removal technologies are always
00:10:58.130 --> 00:11:02.790
in conflict with fundamental rights. And
this includes data protection rights. It
00:11:02.790 --> 00:11:07.070
is a data protection right not to be made
subject to Automated Decision-Making
00:11:07.070 --> 00:11:10.890
online, if it involves your personal data
and if such decision making has a negative
00:11:10.890 --> 00:11:16.640
effect. So we believe, all those legal
arguments aside, that the most worrying
00:11:16.640 --> 00:11:22.570
experience with upload filters is that they
have spillover effects on other
00:11:22.570 --> 00:11:25.810
initiatives. Sure, if it works for
copyright protected content, it may well
00:11:25.810 --> 00:11:31.350
work for other types of content, right?
Except that it doesn't. Many now consider
00:11:31.350 --> 00:11:35.960
it a clever idea that web forums should
proactively monitor and check all sorts of
00:11:35.960 --> 00:11:40.690
user content, may this be communication,
pictures or videos, and that they should use
00:11:40.690 --> 00:11:45.820
filters to take it down or filters to
prevent the re-upload of such content. An
00:11:45.820 --> 00:11:51.190
example of such spillover effect that Eliska
had mentioned is the draft regulation of
00:11:51.190 --> 00:11:56.110
terrorist related content. It took a huge
joint effort of civil society groups and
00:11:56.110 --> 00:12:02.930
some members of Parliament to reject the
worst parts of the text. We had recent
00:12:02.930 --> 00:12:08.860
negotiations going on and at least we
managed to get the requirement to use
00:12:08.860 --> 00:12:13.050
upload filters out, but there is still a twenty-four-
hour removal obligation that may nudge
00:12:13.050 --> 00:12:19.560
platforms to employ those filters
nevertheless. And we see that particularly
00:12:19.560 --> 00:12:23.140
in the member states, that they are very fond of
the idea that platforms rather than
00:12:23.140 --> 00:12:27.420
judges should be the new law
enforcers. There are now several states in
00:12:27.420 --> 00:12:31.480
the European Union that have adopted laws
that would either oblige or nudge
00:12:31.480 --> 00:12:36.640
platforms to monitor users' speech
online. First up was the German NetzDG,
00:12:36.640 --> 00:12:40.110
which set out systematic duties for
platforms. Then we had the French law
00:12:40.110 --> 00:12:44.320
Avia, which copy-pasted the NetzDG and
made it worse. And last we have the
00:12:44.320 --> 00:12:47.690
Austrian Hate Speech bill, which is a mix
of both the German and the French
00:12:47.690 --> 00:12:51.890
proposal. They all go far beyond
copyright content, but focus on hate
00:12:51.890 --> 00:12:56.500
speech and all sorts of content that may
be considered problematic in those
00:12:56.500 --> 00:13:01.620
respective countries, not necessarily in
other countries. And this brings me to the
00:13:01.620 --> 00:13:05.890
next problem. How do we deal with content
that is illegal in one country, but legal
00:13:05.890 --> 00:13:10.910
in another? A recent Court of Justice
ruling had confirmed that a court of a
00:13:10.910 --> 00:13:16.020
small state like Austria can order
platforms not only to take down defamatory
00:13:16.020 --> 00:13:20.360
content globally, but also to take down
identical or equivalent material using
00:13:20.360 --> 00:13:26.340
automated technologies. For us this is a
terrible outcome, that will lead to a race
00:13:26.340 --> 00:13:30.660
to the bottom, where the countries with
the least freedom of speech friendly laws
00:13:30.660 --> 00:13:36.490
can superimpose their laws on every other
state in the world. We really believe that
00:13:36.490 --> 00:13:40.740
all this nonsense has to stop. It's time
to acknowledge that the Internet is a
00:13:40.740 --> 00:13:44.480
global space, a place of exchange of
creativity and the place where civil
00:13:44.480 --> 00:13:49.730
liberties are supposed to exist. So we are
fighting now against all those national
00:13:49.730 --> 00:13:54.360
initiatives. We had a great first
victory when we helped bring down the
00:13:54.360 --> 00:14:00.070
French Avia law - the Avia bill that had
imposed the duty for platforms to check
00:14:00.070 --> 00:14:05.880
and remove potentially illegal content
within 24 hours. Before the Conseil
00:14:05.880 --> 00:14:09.140
constitutionnel, the French Supreme Court,
we had argued that this would push
00:14:09.140 --> 00:14:13.740
platforms to constantly check what users
post. And if platforms face high
00:14:13.740 --> 00:14:19.740
fines... of course, they would rather be
motivated to block as much of the
00:14:19.740 --> 00:14:24.190
content as possible. We've made a point
that this would be against the Charter of
00:14:24.190 --> 00:14:29.370
Fundamental Rights, including freedom of
information and freedom of expression. And
00:14:29.370 --> 00:14:33.760
it was a great victory for us that the
French Supreme Court has struck down the
00:14:33.760 --> 00:14:39.770
French Avia bill and followed our
argument, as you see on the slide. We also
00:14:39.770 --> 00:14:43.810
see now that there's a push back at least
against the update of the German NetzDG,
00:14:43.810 --> 00:14:47.790
which would have provided new access
rights for law enforcement authorities.
00:14:47.790 --> 00:14:51.230
This and other provisions are considered
unconstitutional. And as far as I
00:14:51.230 --> 00:14:55.300
understand, and perhaps listeners can
correct me, the German president has
00:14:55.300 --> 00:15:00.620
refused to sign the bill, and the Austrian
bill goes a similar pathway - it
00:15:00.620 --> 00:15:05.920
recently got a red light from Brussels. The
Commission considers it in conflict with
00:15:05.920 --> 00:15:12.150
EU principles, also thanks to a joint
effort by epicenter.works. And this shows
00:15:12.150 --> 00:15:15.880
that something positive is going on, it's
a positive development, the pushback
00:15:15.880 --> 00:15:20.310
against automated filter technologies.
But it's important to understand that
00:15:20.310 --> 00:15:24.270
those national initiatives are not just
purely national attempts to regulate hate
00:15:24.270 --> 00:15:29.100
speech. It's an attempt by an EU member
00:15:24.270 --> 00:15:29.100
state to make its own bills, as bad as
they are, some sort of prototype for EU-
wide legislation, a prototype for the
00:15:33.850 --> 00:15:37.690
Digital Services Act. And as you know,
national member states have a say in EU
00:15:37.690 --> 00:15:41.610
lawmaking: their voices are represented in
the Council of the EU, and the European
00:15:41.610 --> 00:15:46.532
Commission will be disincentivized
to propose anything that would be voted
00:15:46.532 --> 00:15:50.970
down by the Council. I think that's a nice
takeaway from today, that lawmaking in
00:15:50.970 --> 00:15:55.770
national member states is not an isolated
event. It's always political, it's always
00:15:55.770 --> 00:16:02.680
Netzpolitik. The good news is that as
far as the European Commission proposal
00:16:02.680 --> 00:16:07.410
for a Digital Services Act is concerned,
it has not followed in the footsteps of
00:16:07.410 --> 00:16:12.700
those badly designed and misguided
bills. It has respected our input, the
00:16:12.700 --> 00:16:16.490
input from Access Now, from EFF, from the
EDRi network, from academics and many
00:16:16.490 --> 00:16:21.830
others, that some key principles should
not be removed, like that liability for
00:16:21.830 --> 00:16:26.560
speech should rest with the speaker. The
DSA is also a red light to general
00:16:26.560 --> 00:16:30.560
monitoring of users' content. And there are
no short deadlines in there to remove
00:16:30.560 --> 00:16:34.600
content that might be illegal. Instead,
the commission gives more slack to
00:16:34.600 --> 00:16:41.770
platforms to take down posts in good faith,
which we call the EU style
00:16:41.770 --> 00:16:45.550
Good Samaritan clause. Looking
through the global lens of lawmaking,
00:16:45.550 --> 00:16:49.190
it's very fascinating to see that while
the United States is flirting with the
00:16:49.190 --> 00:16:52.710
idea to move away from the Good Samaritan
principle in Section 230 of the
00:16:52.710 --> 00:16:56.870
Communications Decency Act, so the idea is
that platforms can voluntarily remove
00:16:56.870 --> 00:17:01.720
content without being held liable for it,
the European Union flirts with the idea to
00:17:01.720 --> 00:17:06.680
introduce it, to give more options to
platforms to act. That being said, the
00:17:06.680 --> 00:17:11.310
major difference between the US and the
EU is that in Europe, platforms could be
00:17:11.310 --> 00:17:15.699
held liable the moment they become aware
of the illegality of content. That's an
00:17:15.699 --> 00:17:20.060
issue because the Digital Services Act has
now introduced a relatively sophisticated
00:17:20.060 --> 00:17:24.870
system for user notification, complaint
mechanism, dispute resolution options,
00:17:24.870 --> 00:17:29.350
which all leads to such awareness about
illegality or could lead to such
00:17:29.350 --> 00:17:33.580
awareness. It's not quite clear for us how
platforms will make use of voluntary
00:17:33.580 --> 00:17:38.510
measures to remove content. That being
said, we think that the commission's
00:17:38.510 --> 00:17:41.870
proposal could have been much worse. And
the Parliament's reports on Digital
00:17:41.870 --> 00:17:46.090
Services Act have demonstrated that the
new parliament is a bit better than the
00:17:46.090 --> 00:17:50.440
old one. They have a lot of respect for
fundamental rights. There are many members of
00:17:50.440 --> 00:17:55.470
parliament who are quite fond of the idea
of protecting civil liberties online. But we
00:17:55.470 --> 00:17:59.780
know that this was only the start and we
know that we need another joint effort to
00:17:59.780 --> 00:18:03.240
ensure that users are not monitored and
are not at the mercy of algorithmic decision
00:18:03.240 --> 00:18:08.480
making. And I think Eliska is now going to
explain a bit more about all this.
00:18:08.480 --> 00:18:15.250
Eliska: Thank you. Thank you very much,
Chris. So, we can actually move now
00:18:15.250 --> 00:18:19.800
further and unpack a few relevant
provisions for everything that has already
00:18:19.800 --> 00:18:23.950
been mentioned, mainly in the realm of
content moderation and content curation,
00:18:23.950 --> 00:18:29.010
which ultimately lies at the core of the
Digital Services Act. And maybe not to
00:18:29.010 --> 00:18:34.700
make it all so abstract... I also have the
printed version of the law here with me.
00:18:34.700 --> 00:18:38.630
And if you look at it, it's quite an
impressive piece of work that the European
00:18:38.630 --> 00:18:44.690
Commission did there. And I have to say
that even though it's a great start, it
00:18:44.690 --> 00:18:50.860
still contains a lot of imperfections. And
I will try to summarize those now for you,
00:18:50.860 --> 00:18:57.560
especially in the light of our own positioning,
and I mean civil society's in general,
00:18:57.560 --> 00:19:00.650
because I believe that on all those
points, we have a pretty solid agreement
00:19:00.650 --> 00:19:05.730
among each other. And what we were
actually hoping that Digital Services Act
00:19:05.730 --> 00:19:11.130
will do, what it actually does and where
we see that we will need to actually
00:19:11.130 --> 00:19:14.880
continue working very closely, especially
with the members of the European
00:19:14.880 --> 00:19:18.470
Parliament in the future, once the draft
will actually enter the European
00:19:18.470 --> 00:19:23.520
Parliament, which should happen relatively
soon. So, as I already
00:19:23.520 --> 00:19:29.170
mentioned at the beginning quite briefly,
the DSA distinguishes between
00:19:29.170 --> 00:19:34.470
online platforms, which are defined within
the scope of the law, and very
00:19:34.470 --> 00:19:38.570
large online platforms, which is exactly
the scope where all large online
00:19:38.570 --> 00:19:43.860
gatekeepers fall into. The DSA specifically then
distinguishes between obligations or
00:19:43.860 --> 00:19:49.370
responsibilities of these actors, some
assigned to all of them, including online
00:19:49.370 --> 00:19:53.170
platforms, and some being extended
specifically due to the dominance and
00:19:53.170 --> 00:19:58.530
power these online gatekeepers hold.
This is mainly then the case when we
00:19:58.530 --> 00:20:01.520
discuss the requirements for
transparency. There is a set of
00:20:01.520 --> 00:20:05.470
requirements for transparency that apply
to online platforms, but then there is
00:20:05.470 --> 00:20:09.160
still a specific additional set of
transparency requirements for very large
00:20:09.160 --> 00:20:14.300
online platforms. What the DSA does especially,
and this is the bit which is extremely
00:20:14.300 --> 00:20:18.890
relevant for the content moderation
practices - it attempts to establish a
00:20:18.890 --> 00:20:23.980
harmonized model for notice and action
procedure for allegedly illegal content.
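NOTE
[Editor's sketch, not part of the talk: one way to picture what a harmonized
notice and action procedure standardizes is the notice itself. The Python field
names below are hypothetical, loosely inspired by the elements the DSA proposal
lists for notices (precise location, reasons, notifier details, good faith).]
from dataclasses import dataclass
@dataclass
class Notice:
    content_url: str     # sufficiently precise location of the item
    explanation: str     # why the notifier considers the content illegal
    notifier_email: str  # who is notifying
    good_faith: bool     # statement that the notice is accurate and complete
def is_well_formed(n: Notice) -> bool:
    # One EU-wide validation instead of diverging national formats.
    return all([n.content_url, n.explanation, n.notifier_email, n.good_faith])
print(is_well_formed(Notice("https://example.org/post/1", "defamation", "a@example.org", True)))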
00:20:23.980 --> 00:20:29.350
One of our red lines, before I go into
the details on this, was that the DSA should be
00:20:29.350 --> 00:20:35.080
touching only, or trying to regulate only
allegedly illegal content and stay away
00:20:35.080 --> 00:20:39.870
from vaguely defined content categories
such as potentially harmful, but legal
00:20:39.870 --> 00:20:44.560
content. There are other ways in which the
legislation can tackle this content,
00:20:44.560 --> 00:20:50.730
mainly through the meaningful transparency
requirements, accountability, tackling
00:20:50.730 --> 00:20:56.820
issues within the content recommender
systems and algorithmic curation. But we
00:20:56.820 --> 00:21:00.950
didn't want this specific category of
content to be included within the scope of
00:21:00.950 --> 00:21:05.830
DSA. This is due to the fact that vaguely
defined terms that find their way into
00:21:05.830 --> 00:21:11.570
legislation always lead to human rights
abuse in the future. I could give you
00:21:11.570 --> 00:21:17.470
examples from Europe, such as the concept
of online harms within the UK; but also, as
00:21:17.470 --> 00:21:21.960
global organizations, both of us
actually often see how weak terminology
00:21:21.960 --> 00:21:26.290
can quickly lead to even over-
criminalization of speech or suppression of
00:21:26.290 --> 00:21:31.120
dissent. Now, if we go back to
the harmonized notice and action procedure,
00:21:31.120 --> 00:21:35.730
what that actually means in practice, as
Christoph already mentioned, Europe has
00:21:35.730 --> 00:21:39.950
a so-called conditional model of
intermediary liability, which was
00:21:39.950 --> 00:21:43.950
already provided and established by the
initial legal regime, the
00:21:43.950 --> 00:21:49.810
e-Commerce Directive, under Article 14,
which actually
00:21:49.810 --> 00:21:54.130
states that unless the platform holds
actual knowledge - and according to the
00:21:54.130 --> 00:21:58.770
wording of the DSA now it's actual
knowledge or awareness - of the presence
00:21:58.770 --> 00:22:04.520
of illegal content on their platform, they
cannot be held liable for such content.
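NOTE
[Editor's sketch, not part of the talk: the conditional liability logic just
described, reduced to illustrative Python. The predicates are hypothetical
stand-ins for legal tests, not an implementation of Article 14.]
def platform_liable(has_knowledge: bool, acted_expeditiously: bool) -> bool:
    # No actual knowledge or awareness of the illegal content: no liability.
    if not has_knowledge:
        return False
    # With knowledge (e.g. via a notice), immunity survives only if the
    # platform acts expeditiously to remove or disable access.
    return not acted_expeditiously
print(platform_liable(False, False))  # False: unaware host stays immune
print(platform_liable(True, True))    # False: knew, but acted expeditiously
print(platform_liable(True, False))   # True: knew and did nothing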
00:22:04.520 --> 00:22:09.540
Now, we were asking for a harmonized
procedure regarding notice and action across
00:22:09.540 --> 00:22:14.470
the EU for a while, precisely because we
wanted to see reinforced legal certainty.
00:22:14.470 --> 00:22:19.070
Lack of legal certainty often translated
into overremoval of even legitimate
00:22:19.070 --> 00:22:25.760
content from the platforms with no public
scrutiny. The DSA does a good job; it's a good
00:22:25.760 --> 00:22:29.330
starting point that actually
attempts to establish such a harmonized
00:22:29.330 --> 00:22:34.230
procedure, but it still lags behind
on many aspects that we consider important
00:22:34.230 --> 00:22:38.350
in order to strengthen protection of
fundamental rights of users. One of them
00:22:38.350 --> 00:22:42.290
is, for instance, that the harmonized notice
and action procedure, as envisioned by
00:22:42.290 --> 00:22:47.640
DSA, is not specifically tailored to
different types of categories of user
00:22:47.640 --> 00:22:52.950
generated content. And as we know, there
are many categories of content
00:22:52.950 --> 00:22:58.280
that are deeply context dependent, linked
to the historical and sociopolitical
00:22:58.280 --> 00:23:04.560
context of the member state in question. And
due to the reliance on automated
00:23:04.560 --> 00:23:08.840
measures, which are usually context blind, we
are worried that if notice and action
00:23:08.840 --> 00:23:13.290
doesn't reflect this aspect in any further
way, we will end up again with over-
00:23:13.290 --> 00:23:18.830
removal of content. What is probably
another huge issue that we are currently
00:23:18.830 --> 00:23:23.640
missing in the draft, even though the DSA is
trying to create proper appeal and
00:23:23.640 --> 00:23:29.200
enforcement mechanisms, and also appeal
mechanisms for users and the different
00:23:29.200 --> 00:23:33.620
alternative dispute settlement options the
draft currently contains: there is no
00:23:33.620 --> 00:23:40.768
possibility for content providers, for the
user that uploads the filter.. Sorry.
00:23:40.768 --> 00:23:45.450
laughs That was a nice Freudian slip
there. For
00:23:45.450 --> 00:23:50.621
a user that actually uploaded content to
appeal directly, to actually use a
00:23:50.621 --> 00:23:56.430
counter-notification regarding that notified
content that belongs to that user. Nor
00:23:56.430 --> 00:24:00.750
are platforms obliged to actually send a
notification to the user prior to any action
00:24:00.750 --> 00:24:05.110
that is being taken against that
particular notified content. These are for
00:24:05.110 --> 00:24:10.120
us procedural safeguards for fairness
that users should have, and currently they
00:24:10.120 --> 00:24:16.030
are not being reflected in the draft.
However, this is a good start and it's
00:24:16.030 --> 00:24:20.190
something that we were pushing for. But I
think there are many more aspects that
00:24:20.190 --> 00:24:24.710
these notice and action procedures will
need to contain in order to truly put
00:24:24.710 --> 00:24:31.420
users first. Now the notice and action
procedure is mainly focusing on the
00:24:31.420 --> 00:24:35.470
illegal content. But there are ways in the
draft where potentially harmful content,
00:24:35.470 --> 00:24:38.910
which is still legal - so the content that
actually violates the terms of service of
00:24:38.910 --> 00:24:42.710
platforms is being mentioned throughout
the draft. So for us, it's not at the
00:24:42.710 --> 00:24:51.890
moment exactly clear how that will work in
practice. So that's why we often use this
00:24:51.890 --> 00:24:56.370
phrase that is also put on the slide: good
intention with imperfect solutions.
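NOTE
[Editor's sketch, not part of the talk: the procedural safeguards argued for
above, as a hypothetical flow in Python. Neither step exists in the draft as
described; that is exactly the gap being criticized.]
from dataclasses import dataclass, field
@dataclass
class Item:
    uploader: str
    restricted: bool = False
@dataclass
class Platform:
    items: dict
    log: list = field(default_factory=list)
    def handle_notice(self, url: str, counter_notice: bool) -> None:
        item = self.items[url]
        # Safeguard 1 (missing from the draft): tell the uploader BEFORE acting.
        self.log.append(f"notified {item.uploader} about a notice on {url}")
        # Safeguard 2 (missing from the draft): allow a counter-notification.
        if counter_notice:
            self.log.append(f"dispute over {url} escalated for independent review")
        else:
            item.restricted = True
p = Platform({"post/1": Item("alice")})
p.handle_notice("post/1", counter_notice=True)
print(p.log)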
00:24:56.370 --> 00:25:00.430
However, I want to emphasize again that
this is just the beginning and we will
00:25:00.430 --> 00:25:05.920
still have time and space to work very
hard on this. Another kind of novel aspect
00:25:05.920 --> 00:25:10.710
that the DSA actually brings about is the already
mentioned Good Samaritan clause, and I
00:25:10.710 --> 00:25:16.560
tend to call it the EU model or EU version
of Good Samaritan Clause. Good Samaritan
00:25:16.560 --> 00:25:21.370
Clause originates in Section 230 of
the Communications Decency Act, as Christoph
00:25:21.370 --> 00:25:26.609
already mentioned. But within the European
realm, it goes hand in hand with this
00:25:26.609 --> 00:25:31.680
conditional model of liability which is
being preserved within the DSA legal
00:25:31.680 --> 00:25:36.180
draft. That was also one of our main asks,
to preserve this conditional model of
00:25:36.180 --> 00:25:41.550
liability, and it's great that this time the
European Commission really listened. Why
00:25:41.550 --> 00:25:46.060
do we consider the Good Samaritan clause
important? In the past, such a
00:25:46.060 --> 00:25:51.380
safeguard wasn't enshrined in the law;
it was just somehow vaguely promised by
00:25:51.380 --> 00:25:58.320
the Commission that if a platform would
proactively deploy measures to fight
00:25:58.320 --> 00:26:01.910
against the spread of illegal content,
they wouldn't be held liable - without
00:26:01.910 --> 00:26:06.921
acknowledging that through such a use of
so-called proactive measures, the platform
00:26:06.921 --> 00:26:11.710
could in theory gain the actual knowledge
about the existence of such a type of
00:26:11.710 --> 00:26:16.350
content on its platform, which would
immediately trigger legal liability. This
00:26:16.350 --> 00:26:20.500
threat of liability often pushed platforms
into a corner, so they would rather remove
00:26:20.500 --> 00:26:27.240
the content very quickly than face more
serious consequences later on. That's why
00:26:27.240 --> 00:26:32.330
we see the importance within the Good
Samaritan Clause or the European model of
00:26:32.330 --> 00:26:36.910
Good Samaritan Clause, and we are glad
that it's currently part of the
00:26:36.910 --> 00:26:44.490
draft. One of the biggest downfalls or one
of the biggest disappointments, when the DSA
00:26:44.490 --> 00:26:50.410
finally came out on the 15th of December
for us was to see that it's still online
00:26:50.410 --> 00:26:54.800
platforms that will remain in charge when
it comes to assessing the legality of the
00:26:54.800 --> 00:26:59.450
content and deciding what content should
be actually restricted and removed from a
00:26:59.450 --> 00:27:06.080
platform and what should be available. We
often emphasize that it's very important
00:27:06.080 --> 00:27:11.470
that the legality of the content is being
assessed by independent judicial
00:27:11.470 --> 00:27:17.020
authorities, in line with rule of
law principles. We also do understand that
00:27:17.020 --> 00:27:23.090
such a solution creates a big burden on
the judicial structure of member states.
00:27:23.090 --> 00:27:26.540
Many member states see that as a very
expensive solution; they don't always
00:27:26.540 --> 00:27:33.020
want to create a special network of
courts, or e-courts or other forms of
00:27:33.020 --> 00:27:38.150
judicial review of the illegal or
allegedly illegal content. But we still
00:27:38.150 --> 00:27:42.559
wanted to see more public scrutiny,
because for us this is truly just the
00:27:42.559 --> 00:27:47.403
reaffirmation of the already existing status
quo: at the moment, in many
00:27:47.403 --> 00:27:51.900
jurisdictions within the EU and in the EU
itself, it's still online platforms that
00:27:51.900 --> 00:27:56.750
will call the final shots. What, on the
other hand, is a positive outcome that
00:27:56.750 --> 00:28:02.370
we were also pushing hard for are the
requirements for meaningful transparency.
00:28:02.370 --> 00:28:06.580
So to understand better what platforms
actually do with the individual pieces of
00:28:06.580 --> 00:28:12.660
content that are being shared on these
platforms and how actually transparency
00:28:12.660 --> 00:28:17.559
can then ultimately empower users. Now, I
want to emphasize this because this is
00:28:17.559 --> 00:28:22.970
still an ongoing debate and we will touch
upon those issues in a minute. But we
00:28:22.970 --> 00:28:28.840
don't see transparency as a silver bullet
to issues such as amplification of
00:28:28.840 --> 00:28:33.420
potentially harmful content or in general
that transparency will be enough to
00:28:33.420 --> 00:28:38.150
actually hold platforms accountable.
Absolutely not. It will never be enough,
00:28:38.150 --> 00:28:44.460
but it's a precondition for us to actually
seek such solutions in the future. DSA
00:28:44.460 --> 00:28:48.910
contains specific requirements for
transparency, as I already mentioned, a
00:28:48.910 --> 00:28:53.300
set of requirements that will be
applicable largely to all online platforms
00:28:53.300 --> 00:28:57.840
and then still a specific set of
requirements on top of it that will
00:28:57.840 --> 00:29:02.350
be applicable only to very large online
platforms, so the online gatekeepers. We
00:29:02.350 --> 00:29:07.230
appreciate the effort, we see that the
list is very promising, but we still think
00:29:07.230 --> 00:29:12.860
it could be more ambitious. Both EFF and
Access Now put forward a specific set of
00:29:12.860 --> 00:29:17.910
requirements for meaningful transparency
that are in our positions. And so did EDRi
00:29:17.910 --> 00:29:25.080
and other civil society or digital rights
activists in this space. And final point
00:29:25.080 --> 00:29:29.100
that I'm going to make is the so-called
Pandora's box of online targeting and
00:29:29.100 --> 00:29:34.809
recommender systems. Why do I refer to
this as a Pandora's box? When the European
00:29:34.809 --> 00:29:41.440
Parliament published its own-initiative
reports on the DSA - there are two reports, one
00:29:41.440 --> 00:29:47.680
tabled by the JURI committee and
another one by IMCO - especially the JURI
00:29:47.680 --> 00:29:52.410
report contained paragraph 17, which calls
out for a better regulation of online
00:29:52.410 --> 00:29:57.220
targeting and online advertisements, and
specifically calls for a ban on online
00:29:57.220 --> 00:30:03.370
targeting, including a phase-out that
would then lead to a ban. We supported this
00:30:03.370 --> 00:30:08.410
paragraph, which at the end was voted for
and is the part of the final report.
00:30:08.410 --> 00:30:12.780
Nevertheless, we also do understand that
the wording of the article has to be
00:30:12.780 --> 00:30:18.000
more nuanced in the future. Before I go
into the details there, I just want to say
00:30:18.000 --> 00:30:22.840
that this part never made it into the DSA.
So there is no ban on online targeting or
00:30:22.840 --> 00:30:27.620
online advertisement of any sort, which to
us, to some extent, was certainly
00:30:27.620 --> 00:30:33.210
disappointing too; we specifically
called for a much stricter approach
00:30:33.210 --> 00:30:38.740
when it comes to behavioral targeting as
well as cross-site tracking of online
00:30:38.740 --> 00:30:43.890
users, but unfortunately, and as we
eventually also heard from Commissioner
00:30:43.890 --> 00:30:48.930
Vestager, there was simply a lack of will or,
maybe, too much pressure from other
00:30:48.930 --> 00:30:54.059
lobbies in Brussels. And this provision
never found its way to the final draft of
00:30:54.059 --> 00:30:58.540
DSA. That's the current state of affairs; we
will see what we will manage to achieve
00:30:58.540 --> 00:31:05.220
once the DSA enters the European
Parliament. And finally, the law also
00:31:05.220 --> 00:31:10.440
contains a specific provision on
recommender systems: so the way
00:31:10.440 --> 00:31:16.630
content is being distributed across
platforms and how the data of users are
00:31:16.630 --> 00:31:22.049
being abused for such a distribution and
personalization of user generated content.
00:31:22.049 --> 00:31:27.110
In both cases, whether it's online
targeting or recommender systems within
00:31:27.110 --> 00:31:31.740
the DSA, the DSA goes as far as
transparency requirements and
00:31:31.740 --> 00:31:37.230
explainability, but it does very little
for returning control and empowerment
00:31:37.230 --> 00:31:42.600
back to the user. So whether users can
opt out from these algorithmic
00:31:42.600 --> 00:31:48.630
curation models, how these can actually be
optimized if users decide to optimize them?
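NOTE
[Editor's sketch, not part of the talk: what user control over algorithmic
curation could look like in code. Hypothetical names; the point is that a
non-profiling option, such as a plain chronological feed, is one concrete form
the opt-out discussed here could take.]
from datetime import datetime, timezone
def rank_feed(posts: list, use_profiling: bool) -> list:
    if use_profiling:
        # Engagement-optimized ranking: the attention-economy default criticized above.
        return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)
    # Opt-out path: reverse-chronological order, no behavioral profile needed.
    return sorted(posts, key=lambda p: p["posted_at"], reverse=True)
posts = [
    {"id": 1, "predicted_engagement": 0.9, "posted_at": datetime(2020, 12, 1, tzinfo=timezone.utc)},
    {"id": 2, "predicted_engagement": 0.2, "posted_at": datetime(2020, 12, 28, tzinfo=timezone.utc)},
]
print([p["id"] for p in rank_feed(posts, use_profiling=False)])  # [2, 1]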
00:31:48.630 --> 00:31:54.550
All of that is at the moment very much
left outside of the scope of the DSA. And then
00:31:54.550 --> 00:31:59.761
there's the issue of interoperability,
which is definitely one of the key
00:31:59.761 --> 00:32:05.630
issues currently being discussed, and it
gives some hope for the future for
00:32:05.630 --> 00:32:09.500
returning that control and empowerment
back to the user. And I keep repeating
00:32:09.500 --> 00:32:14.370
this as a mantra, but it's truly the main
driving force behind all our initiatives
00:32:14.370 --> 00:32:19.289
and the work we do in these fields: the
user and their fundamental rights. And on
00:32:19.289 --> 00:32:23.210
that note, I would like to hand over back
to Chris, who will explain the issue of
00:32:23.210 --> 00:32:28.120
interoperability and how to actually
empower you as a user and to strengthen
00:32:28.120 --> 00:32:35.850
the protection of fundamental rights
further. Chris, it's yours now. Christoph:
00:32:35.850 --> 00:32:43.299
Thank you. I think we all know or feel
that the Internet has seen better times.
00:32:43.299 --> 00:32:48.920
If you look back over the last 20 years,
we have seen a transformation going
00:32:48.920 --> 00:32:54.280
on from an open Internet towards a more
closed one - monopolization. Big platforms
00:32:54.280 --> 00:32:59.230
have built entire ecosystems and it seems
that they alone decide who gets to use
00:32:59.230 --> 00:33:04.260
them. Those platforms have strong network
effects that have pushed
00:33:04.260 --> 00:33:07.830
those platforms into a gatekeeper position,
which made it so easy for them to avoid
00:33:07.830 --> 00:33:12.330
any real competition. This is especially
true when we think of social media
00:33:12.330 --> 00:33:17.280
platforms. This year we celebrate the 20th
birthday of the e-Commerce Directive that
00:33:17.280 --> 00:33:20.641
Eliska mentioned. The Internet bill that
will now be replaced by the Digital
00:33:20.641 --> 00:33:25.870
Services Act. We believe it's a very good
time now to think and make a choice:
00:33:25.870 --> 00:33:29.480
should we give even more power to the big
platforms that have created a lot of the
00:33:29.480 --> 00:33:33.110
mess in the first place; or should we give
the power to the users, give the power
00:33:33.110 --> 00:33:39.270
back to the people? For us, the answer is
clear. Big tech companies already employ a
00:33:39.270 --> 00:33:43.500
wide array of technical measures. They
monitor, they remove, they disrespect user
00:33:43.500 --> 00:33:48.340
privacy and the idea to turn them into
the Internet Police, with a special
00:33:48.340 --> 00:33:54.250
license of censoring the speech of users,
will only solidify their dominance. So we
00:33:54.250 --> 00:33:58.880
wouldn't like that. What we like is to
put users in charge of their online
00:33:58.880 --> 00:34:04.770
experience. Users should, if we had a say,
choose for themselves which kind of
00:34:04.770 --> 00:34:08.329
content they can see, what services they
can use to talk to their friends and
00:34:08.329 --> 00:34:13.389
families. And we believe it's perhaps time
to break up those silos that big
00:34:13.389 --> 00:34:18.070
platforms have become, to end the dominance
over data. One element to achieve this
00:34:18.070 --> 00:34:21.770
would be to tackle the targeted ads
industry, as Eliska mentioned, perhaps
00:34:21.770 --> 00:34:26.970
to give an actual right to users not to be
subject to targeted ads or to give more
00:34:26.970 --> 00:34:30.960
choice to users to decide which content
they would like to see or not to see. In
00:34:30.960 --> 00:34:35.050
the Digital Services Act, the Commission
went for transparency when it comes to ads
00:34:35.050 --> 00:34:39.490
and better options for users to decide on
the recommended content, which is a start,
00:34:39.490 --> 00:34:45.010
we can work with that. Another important
element to achieve user autonomy over data
00:34:45.010 --> 00:34:49.639
is interoperability. If the European Union
really wants to break the power of those
00:34:49.639 --> 00:34:53.820
data driven platforms that monopolize the
Internet, it needs regulations that
00:34:53.820 --> 00:34:58.440
enable users to be in control of their
data. We believe that users should be able
00:34:58.440 --> 00:35:03.710
to access data, to download data, to move,
manipulate their data as they see fit. And
00:35:03.710 --> 00:35:07.920
part of that control is to port data
from one place to another. But data
00:35:07.920 --> 00:35:11.560
portability as we have it under the
GDPR is not good enough. And we
00:35:11.560 --> 00:35:15.592
see from the GDPR that it's not
working in practice. Users should be able
00:35:15.592 --> 00:35:19.350
to communicate with friends across
platform boundaries, to be able to follow
00:35:19.350 --> 00:35:23.480
their favorite content across different
platforms without having to create several
00:35:23.480 --> 00:35:28.950
accounts. But to put it in other terms, if
you are upset with the absence of privacy on
00:35:28.950 --> 00:35:33.300
Facebook or how the content is moderated
on Facebook, you should be able to just
00:35:33.300 --> 00:35:37.080
take your data with you using portability
options and move to an alternative
00:35:37.080 --> 00:35:41.040
platform that is a better fit, and this
without losing touch with your friends who
00:35:41.040 --> 00:35:45.850
stay behind, who have not left the
incumbent big platform. So what we did for
00:35:45.850 --> 00:35:50.190
Digital Services Act is to argue for
mandatory interoperability options that
00:35:50.190 --> 00:35:55.760
would force Facebook to maintain APIs that
let users on other platforms exchange
00:35:55.760 --> 00:36:01.330
messages and content with Facebook users.
However, if you look in the DSA, we see
00:36:01.330 --> 00:36:04.970
that the commission completely missed the
mark on interoperability, which is
00:36:04.970 --> 00:36:09.100
supposed to be dealt with by a related legal
act - now it gets complicated. It's the
00:36:09.100 --> 00:36:15.000
Digital Markets Act, the DMA, another
beautiful acronym. The Digital Markets Act
00:36:15.000 --> 00:36:19.500
wants to tackle certain harmful business
practices by those gatekeeper platforms,
00:36:19.500 --> 00:36:24.490
the very large tech companies that control
what is called core services. The core
00:36:24.490 --> 00:36:28.380
services are search engines, social
networking services, messaging services,
00:36:28.380 --> 00:36:33.410
operating systems and online
intermediation services. Like think of how
00:36:33.410 --> 00:36:37.200
Amazon controls access to customers for
merchants that sell on its platforms or
00:36:37.200 --> 00:36:44.100
how the Android and iPhone app stores act as
chokepoints in delivering mobile software.
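NOTE
[Editor's sketch, not part of the talk: the difference between one-off data
portability and the interoperability discussed in this section, as two
hypothetical Python interfaces. No real platform API is being described.]
from typing import Protocol
class Portability(Protocol):
    # One-off export: you can leave, but every conversation is severed.
    def export_all(self, user_id: str) -> bytes: ...
class Interoperability(Protocol):
    # Ongoing exchange across platform boundaries: the stronger ask in the talk.
    def send_message(self, from_service: str, to_user: str, body: str) -> None: ...
    def fetch_followed_content(self, remote_user: str) -> list: ...
# Portability moves your past; interoperability keeps your social graph alive,
# which is why the talk calls portability alone not good enough.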
00:36:44.100 --> 00:36:47.850
And there are many things we like in the new
proposal, the proposal of the Digital
00:36:47.850 --> 00:36:53.240
Markets Act. For example, there's a ban on
mixing data in there that would
00:36:53.240 --> 00:36:57.190
ban gatekeepers from mixing data from
data brokers with the data they collect
00:36:57.190 --> 00:37:04.030
on their customers. Another rule is to ban
cross-tying - the sort of practice where end
00:37:04.030 --> 00:37:08.170
users must sign up for ancillary services.
So you should be able to use Android
00:37:08.170 --> 00:37:12.280
without having to get a Google account for
example. We believe that this is all
00:37:12.280 --> 00:37:18.570
good, but the DMA, like the DSA, is very
weak on interoperability. What it does is
00:37:18.570 --> 00:37:23.240
to focus on real time data portability
instead. So instead of having
00:37:23.240 --> 00:37:27.000
interoperable services, users will only be
able to send the data from one service to
00:37:27.000 --> 00:37:31.830
another like from Facebook to Diaspora,
meaning that you would end up having two
00:37:31.830 --> 00:37:37.350
accounts instead of one or to quote Cory
Doctorow who spoke yesterday already:
00:37:37.350 --> 00:37:42.580
"Users would still be subject to the
sprawling garbage novella of abusive legalese
00:37:42.580 --> 00:37:47.260
Facebook lovably calls its terms of
service." We believe that this is not
00:37:47.260 --> 00:37:54.359
good enough. And on the last slide, you see a
quote from Margrethe Vestager, who made
00:37:54.359 --> 00:37:59.920
a very good statement last month, that we
need trustworthy services, fair use of
00:37:59.920 --> 00:38:05.270
data and free speech and an interoperable
internet; we fully agree on that. And in
00:38:05.270 --> 00:38:09.560
the next months and years, we will work to
make this actually happen. However, you can
00:38:09.560 --> 00:38:13.520
imagine, it will not be easy. We already
see that European Union member states
00:38:13.520 --> 00:38:18.250
follow the trend that platforms should
systematically check undesirable and
00:38:18.250 --> 00:38:22.560
inciteful content and share those data
with enforcement authorities, which is
00:38:22.560 --> 00:38:27.240
even worse. We see an international trend
going on to move away from the immunity of
00:38:27.240 --> 00:38:32.450
platforms for user content towards a more
active stance of those platforms. And we
00:38:32.450 --> 00:38:37.780
see that recent terror attacks have fueled
ideas that monitoring is a good idea and
00:38:37.780 --> 00:38:43.010
end-to-end encryption is a problem. So
whatever will be the result, you can bet
00:38:43.010 --> 00:38:46.440
that European Union will want to make the
Digital Services Act and the Digital
00:38:46.440 --> 00:38:50.800
Markets Act another export model. So this
time we want to get the numbers right in
00:38:50.800 --> 00:38:54.700
parliament and the council, we want to
help members of parliament to press the
00:38:54.700 --> 00:38:59.620
right buttons. And for all this we will
need your help, even if it means to learn
00:38:59.620 --> 00:39:04.030
yet another acronym or several acronyms
after the GDPR. That's it from
00:39:04.030 --> 00:39:07.770
our side - we are looking forward to the
discussion. Thank you.
00:39:14.584 --> 00:39:18.830
Herald: OK, thank you Eliska and
00:39:18.830 --> 00:39:25.660
Christoph. There are questions from the
internet, and the first one is basically:
00:39:25.660 --> 00:39:31.590
we just had, as you mentioned in your
slides, Christoph, the copyright in the
00:39:31.590 --> 00:39:38.850
digital single market directive with both
accountability and liability provisions;
00:39:38.850 --> 00:39:45.880
you also briefly mentioned, I think, even
the e-evidence proposal. How do all
00:39:45.880 --> 00:39:49.400
these proposals relate to each other? And
especially for a layperson who is not
00:39:49.400 --> 00:39:57.030
into all the Brussels jargon.
Christoph: I think Eliska, you raised
00:39:57.030 --> 00:40:01.700
your hand, didn't you?
Eliska: ...more or less unintentionally,
00:40:01.700 --> 00:40:10.050
but yeah, kind of that. I can start and
then let you, Christoph, step in. Yeah,
00:40:10.050 --> 00:40:15.369
that's a very, very good question. And
this is specifically due to the fact that
00:40:15.369 --> 00:40:19.990
when you mention especially online
terrorist content regulation, but also
00:40:19.990 --> 00:40:27.350
recently proposed interim regulation on
child sexual abuse: all of these - we
00:40:27.350 --> 00:40:32.590
call them sectoral legislation - are kind of
departing a little bit from this
00:40:32.590 --> 00:40:37.790
horizontal approach, meaning an approach
that tackles all categories of illegal
00:40:37.790 --> 00:40:42.720
content in one way, instead of going after
specific categories such as online
00:40:42.720 --> 00:40:47.510
terrorist content, in separate ways. So
it's a little bit paradoxical what
00:40:47.510 --> 00:40:50.869
is currently also happening at the EU
level, because on one hand, we were
00:40:50.869 --> 00:40:55.960
promised this systemic regulation that
will once and for all establish a harmonized
00:40:55.960 --> 00:41:00.660
approach to combating illegal content
online - which is
00:41:00.660 --> 00:41:05.510
specifically the DSA, the Digital Services
Act - and at the same time we still see the
00:41:05.510 --> 00:41:10.109
European Commission allowing for these
legislative proposals harmful to fundamental
00:41:10.109 --> 00:41:14.890
rights, happening in these specific
sectors, such as the proposed online
00:41:14.890 --> 00:41:20.560
terrorist content regulation or other
legislative acts seeking to somehow
00:41:20.560 --> 00:41:24.980
regulate specific categories of user
generated content. This is quite puzzling
00:41:24.980 --> 00:41:31.220
for us as digital rights activists too.
So I would maybe
00:41:31.220 --> 00:41:35.484
separate the DSA from this for a moment and
say that what all of these sectoral
00:41:35.484 --> 00:41:39.990
legislations have in common is:
first of all, continuing these negative
00:41:39.990 --> 00:41:43.730
legislative trends that we already
described and that we constantly observe
00:41:43.730 --> 00:41:49.160
in practice, such as shifting more and
more responsibility onto online platforms.
00:41:49.160 --> 00:41:53.040
And at the same time, what is also very
interesting, what they have in common is
00:41:53.040 --> 00:41:57.680
the legal basis that they stand on, and
that's the legal basis that is rather
00:41:57.680 --> 00:42:03.340
connected to the cooperation within the
digital single market, even though they
00:42:03.340 --> 00:42:09.530
seek to tackle a very particular
category of content, which is
00:42:09.530 --> 00:42:15.270
manifestly illegal. So logically, if they
were to have the appropriate legal ground,
00:42:15.270 --> 00:42:20.010
it should be something closer to
police and judicial cooperation, which we
00:42:20.010 --> 00:42:24.580
don't see happening in practice,
specifically because there is this idea
00:42:24.580 --> 00:42:28.820
that platforms are the best suited to
decide how the illegal content will be
00:42:28.820 --> 00:42:33.100
tackled in the online space. They can be
the fastest, they can be the most
00:42:33.100 --> 00:42:37.640
effective. So they should actually have
the main decision-making powers and
00:42:37.640 --> 00:42:42.070
be forced into taking those responsibilities
which, however, ultimately, according to
00:42:42.070 --> 00:42:47.220
the rule of law principle, should and have
to be in the hands of the state and public
00:42:47.220 --> 00:42:54.300
authorities, preferably judicial
authorities. So I would say they are all
00:42:54.300 --> 00:43:00.150
bad news for fundamental rights protection
of online users. Civil rights
00:43:00.150 --> 00:43:05.810
organizations, all of us that are on this
call today, are fighting very hard also
00:43:05.810 --> 00:43:09.630
against the online terrorist content
regulation. There was a lot of damage
00:43:09.630 --> 00:43:14.800
control done, especially with the first
report that was tabled by the European
00:43:14.800 --> 00:43:19.609
Parliament and also now during the last
trilogue, since the negotiations seem to
00:43:19.609 --> 00:43:24.609
be concluded and the outcome is not great.
It's far from ideal. And I'm worried that
00:43:24.609 --> 00:43:28.619
with other sectoral legislative attempts
coming from the European Commission, we
00:43:28.619 --> 00:43:32.690
might see the same outcome. It will be
very interesting to see how that will
00:43:32.690 --> 00:43:37.130
actually then play together with the
Digital Services Act, which is trying to
00:43:37.130 --> 00:43:43.020
do the exact opposite: to actually fix the
negative legislative efforts that we see
00:43:43.020 --> 00:43:47.580
at the EU level with this sectoral
legislation, but also with the member
00:43:47.580 --> 00:43:52.080
states at the national level. I could also
mention the European Commission's reaction
00:43:52.080 --> 00:43:56.380
to some national legislative proposals.
But Christoph, I would leave that to you
00:43:56.380 --> 00:44:01.710
and please step in.
Christoph: I think you explained it
00:44:01.710 --> 00:44:05.960
perfectly, and the only thing I can
supplement here is that if you look at
00:44:05.960 --> 00:44:09.810
this move from sectoral legislation
to horizontal
00:44:09.810 --> 00:44:15.900
legislation, now back to sectoral
legislation - it's a problem, it's a mess.
00:44:15.900 --> 00:44:24.280
First, the two sides are not very well
coordinated, which creates trouble for
00:44:24.280 --> 00:44:28.880
legal certainty. It makes it very
troublesome for platforms to follow up.
00:44:28.880 --> 00:44:33.520
And it's problematic for us in the
space. We are some sort of lobbyists as
00:44:33.520 --> 00:44:37.530
well, just for the public interest. But you
have to deal with copyright,
00:44:37.530 --> 00:44:42.300
with CSAM, with TERREG, with end-to-end
encryption, DSA, DMA and 15 other
00:44:42.300 --> 00:44:47.780
proposals that pop up, content type by
content type. It's very hard to have the capacity
00:44:47.780 --> 00:44:51.311
ready to be early in the debate, and it's
so important to be early in the debate to
00:44:51.311 --> 00:44:55.470
prevent that from happening. And I think
that's a huge challenge for us, to have
00:44:55.470 --> 00:45:00.280
something for us to reflect on in the next
days: how can we join forces better in a
00:45:00.280 --> 00:45:04.290
more systematic way in order to really
follow up on all those initiatives? That's
00:45:04.290 --> 00:45:12.400
for me, a very problematic development.
Herald: So in summary, it's a mess. So it
00:45:12.400 --> 00:45:16.790
is related, but we can't explain how,
because it's such a mess. Fair enough.
00:45:16.790 --> 00:45:23.480
I have another question for you, Eliska.
Someone was asking how the proposed
00:45:23.480 --> 00:45:29.220
Good Samaritan clause works compared to..
how it currently works in Germany. But I
00:45:29.220 --> 00:45:33.130
think it's a bit unreasonable to expect
everyone to know how it works in Germany.
00:45:33.130 --> 00:45:38.650
I would rephrase it as: how does this
proposed Good Samaritan clause work
00:45:38.650 --> 00:45:41.950
compared to how it is now under the
e-Commerce Directive?
00:45:41.950 --> 00:45:50.840
Eliska: Thank you very much. Yeah, a
great question again. I think, first, if
00:45:50.840 --> 00:45:55.720
we put it into the context of the EU law
and apologies that I cannot really answer
00:45:55.720 --> 00:45:59.770
how it compares to the German context
- I really don't dare to, I'm not a
00:45:59.770 --> 00:46:05.050
German lawyer, so I wouldn't like to step
into those waters. But first of all, there
00:46:05.050 --> 00:46:10.859
is no Good Samaritan clause per se within
the scope of the e-Commerce Directive. It did
00:46:10.859 --> 00:46:16.680
not really exist within the law. And I'm
using the past tense now because the DSA is
00:46:16.680 --> 00:46:22.250
trying to change that. So that level of
legal certainty was not really
00:46:22.250 --> 00:46:26.170
there for the platforms. There was the
conditional model of liability, which
00:46:26.170 --> 00:46:30.480
is still preserved within the regulation.
But if you think of a Good Samaritan
00:46:30.480 --> 00:46:34.770
clause as we know it from Section
230 - or let's use that Good Samaritan
00:46:34.770 --> 00:46:38.060
clause as an example, because the
e-Commerce Directive was actually drafted
00:46:38.060 --> 00:46:43.000
as a response to the Communications Decency Act,
which was the legislation that put things
00:46:43.000 --> 00:46:49.920
into motion. So that's the first
essential point. I explained at the
00:46:49.920 --> 00:46:55.270
beginning of my presentation what was then
happening in the space of combating
00:46:55.270 --> 00:47:00.570
illegal content at the EU level, and
especially I would refer to the
00:47:00.570 --> 00:47:05.700
communication that the European Commission
published, I think, back in 2018, where it
00:47:05.700 --> 00:47:11.850
actually encouraged and called on online
platforms to proactively engage with
00:47:11.850 --> 00:47:17.390
illegal content and use these proactive
measures to actually seek an adequate
00:47:17.390 --> 00:47:22.369
response to illegal content. Now, mix
that with this conditional model of
00:47:22.369 --> 00:47:27.550
liability, which is of course triggered by
the platform obtaining actual knowledge,
00:47:27.550 --> 00:47:32.730
and that created the perfect storm that
I already explained. So the platforms knew
00:47:32.730 --> 00:47:37.120
that they were kind of pushed by the
legislature to actually seek these active
00:47:37.120 --> 00:47:41.629
responses to illegal content, often
deploying automated measures. But they
00:47:41.629 --> 00:47:46.710
didn't have any legal certainty or
security on their side that if they did so,
00:47:46.710 --> 00:47:50.990
they wouldn't ultimately end up being held
legally liable and facing legal consequences
00:47:50.990 --> 00:47:55.650
as a result of obtaining actual knowledge
through those proactive measures that were
00:47:55.650 --> 00:48:01.820
kind of the very tool through which they could
actually obtain that knowledge. Now, what
00:48:01.820 --> 00:48:08.290
the DSA does is simply
state - and I think it's Article 6 of the
00:48:08.290 --> 00:48:13.550
Digital Services Act, if I'm not mistaken,
and I can even open it - it specifically
00:48:13.550 --> 00:48:20.280
says that platforms can use
these proactive measures or, you know,
00:48:20.280 --> 00:48:28.339
continue using some tools that actually
seek to provide some responses to this
00:48:28.339 --> 00:48:32.300
type of content without the fear of being
held liable. So it's an article which
00:48:32.300 --> 00:48:36.680
has approximately, I think, two
paragraphs, but it's finally in the
00:48:36.680 --> 00:48:40.510
legislation and that means that it will
help to reinforce the level of legal
00:48:40.510 --> 00:48:45.210
certainty. I would also emphasize that
very often in Europe, when we discuss Good
00:48:45.210 --> 00:48:49.820
Samaritan clause - and Good Samaritan is
actually a very unfortunate term, because
00:48:49.820 --> 00:48:55.000
it's very much connected to the American
legal tradition. But when it's being mixed
00:48:55.000 --> 00:48:58.910
up with the conditional model of liability
and with the prohibition of general
00:48:58.910 --> 00:49:03.349
monitoring, which is still upheld, these
are the main principles of the European
00:49:03.349 --> 00:49:08.580
intermediary liability law and the
regime that is applicable within the EU,
00:49:08.580 --> 00:49:13.520
such a safeguard can actually be
beneficial, and hopefully it won't lead to
00:49:13.520 --> 00:49:18.619
a blanket immunity for online
platforms or to this idea that platforms
00:49:18.619 --> 00:49:22.050
will be able to do whatever they want with
illegal content without any public
00:49:22.050 --> 00:49:25.250
scrutiny, because there are other
measures, safeguards and principles in
00:49:25.250 --> 00:49:30.180
place as part of the conditional model of
liability that we have here in Europe. So
00:49:30.180 --> 00:49:35.940
I'm sorry, maybe that was too complicated
a legalistic explanation. But this is
00:49:35.940 --> 00:49:40.290
how these provisions should work in
practice. We, of course, have to wait for
00:49:40.290 --> 00:49:44.660
the implementation of the law and see how
that will turn out. But the main purpose
00:49:44.660 --> 00:49:49.550
is that this legal certainty that was
lacking until now can finally come into
00:49:49.550 --> 00:49:54.599
existence, which should help us to prevent
the over-removal of legitimate speech from
00:49:54.599 --> 00:50:00.550
online platforms.
Herald: OK, thank you. I have two other
00:50:00.550 --> 00:50:05.050
questions from the Internet about
interoperability, and I suppose I should
00:50:05.050 --> 00:50:12.570
look at Christoph for them. The last one
I'm going to ask first is: would such
00:50:12.570 --> 00:50:18.060
interoperability make it much more
difficult to combat harassment and
00:50:18.060 --> 00:50:22.500
stalking on the Internet? How do you
police that kind of misbehavior if it's
00:50:22.500 --> 00:50:29.840
across different platforms that are forced
to interoperate and also be conduits for
00:50:29.840 --> 00:50:35.670
such bad behavior? And I'll come to the
earlier question once you've answered this
00:50:35.670 --> 00:50:40.130
question, Christoph.
Christoph: It's a pretty good question.
00:50:40.130 --> 00:50:47.380
First, to understand our vision on
interoperability is to understand that we
00:50:47.380 --> 00:50:53.770
would like to have it oblige large platforms
and give smaller platforms the right,
00:50:47.380 --> 00:50:53.770
actually, to make
use of interoperability. So it should not
00:50:59.920 --> 00:51:05.330
be among the big platforms. So small
platforms should be able to connect to the
00:51:05.330 --> 00:51:11.550
big platforms. And second, we believe it
will help and not make it worse because we
00:51:11.550 --> 00:51:15.590
have now a problem of hate speech, we have
now a problem of a lack of privacy, we
00:51:15.590 --> 00:51:23.210
have now a problem of the attention
industry that works with, you know,
00:51:23.210 --> 00:51:29.320
certain pictures put in certain frames to
trigger the attention of users, because users
00:51:29.320 --> 00:51:31.950
don't have a choice of content
moderation practices, users don't have a
00:51:31.950 --> 00:51:37.310
choice to see which kind of content should
be shown. And users don't have options to
00:51:37.310 --> 00:51:42.580
regulate their privacy. The idea of more
competitors would be exactly that I can
00:51:42.580 --> 00:51:51.160
move to a space where I'm not harassed
and not made subject to certain content
00:51:51.160 --> 00:51:57.160
that hurts my feelings. Right? And at that
moment I get control. I can choose a
00:51:57.160 --> 00:52:02.330
provider that gives me those options, and
we would like to go even a step further.
00:52:02.330 --> 00:52:07.490
Back-end interoperability was a start. We
believe if users want to, they should be
00:52:07.490 --> 00:52:11.160
able to delegate a third-party company or
piece of third-party software to
00:52:11.160 --> 00:52:14.850
interact with the platform on their
behalf. So users would have the option to
00:52:14.850 --> 00:52:19.070
see their news feed in a different order, or
calibrate their own filters on
00:52:19.070 --> 00:52:23.099
misinformation. So in this sense,
interoperability can be a great tool,
00:52:23.099 --> 00:52:28.570
actually, to tackle hate speech and these
sorts of negative developments. Of course,
00:52:28.570 --> 00:52:33.900
there is a risk to it. I think the risk
comes rather from the data industry side
00:52:33.900 --> 00:52:38.500
again, in that we need to take care not to
place yet another data-selling
00:52:38.500 --> 00:52:43.770
industry on top of the one we already face.
But for this, we have options as well to
00:52:43.770 --> 00:52:46.910
prevent that from happening. But to answer
the question, we believe interoperability
00:52:46.910 --> 00:52:52.609
is actually a tool to escape from the
negative developments you mentioned.
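To make the delegation idea concrete, here is a minimal sketch of such a third-party client, assuming a purely hypothetical open feed API - every endpoint, field and token below is invented for illustration, since neither the DSA nor the DMA drafts mandate any concrete interface:

```python
# Minimal sketch of a delegated third-party client: the user authorizes
# a piece of software to fetch their feed through an open API and
# re-rank it locally by their own rules instead of the platform's
# engagement-based ranking. All names here are hypothetical.
import requests

PLATFORM_API = "https://platform.example/api/v1"  # hypothetical endpoint

def fetch_feed(user_token: str) -> list[dict]:
    """Fetch the raw feed items the platform would show this user."""
    resp = requests.get(
        f"{PLATFORM_API}/feed",
        headers={"Authorization": f"Bearer {user_token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["items"]

def rerank(items: list[dict]) -> list[dict]:
    """Apply the user's own rules: drop blocked authors, newest first
    (assumes ISO-8601 timestamps, which sort lexicographically)."""
    blocked = {"harasser-42", "spam.example"}  # user-maintained blocklist
    kept = [i for i in items if i.get("author") not in blocked]
    return sorted(kept, key=lambda i: i["published_at"], reverse=True)

if __name__ == "__main__":
    for item in rerank(fetch_feed("user-supplied-token"))[:20]:
        print(item["published_at"], item["author"], item["title"])
```

The point of the sketch is the division of labor: the platform only has to expose the data; choices about ranking, filtering and blocking move to software the user picked.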
00:52:52.609 --> 00:52:58.850
Herald: A critical counter-question from me
then: aren't you actually advocating for
00:52:58.850 --> 00:53:04.330
just rolling your own recommendation engine
to be able to do so? Can't you achieve
00:53:04.330 --> 00:53:09.950
that without interoperability?
Christoph: Sure. Counter-question back: Do
00:53:09.950 --> 00:53:15.300
you think an average user can accomplish
that quite easily? You know, like when we
00:53:15.300 --> 00:53:21.040
look at the Internet through the lens of
market competition, then we see that it is
00:53:21.040 --> 00:53:27.090
the dominance of platforms over data that
has created those spaces, those walled
00:53:27.090 --> 00:53:31.182
gardens where users have the feeling they are
trapped and cannot escape. And there
00:53:31.182 --> 00:53:36.426
are so many alternative options that
cannot get off the ground because users feel
00:53:36.426 --> 00:53:40.880
trapped, don't want to leave their friends
behind and don't have options, actually to
00:53:40.880 --> 00:53:45.410
have a better moderation system. Of course,
you can be creative and, you know, use
00:53:45.410 --> 00:53:49.650
plugins and whatever you see fit, but you
need to stay within the platform barriers.
00:53:49.650 --> 00:53:54.460
But we would like to enable users to actually
leave the walled garden, go to another
00:53:54.460 --> 00:53:58.599
place, but still stay in touch with
friends who have made the choice to remain
00:53:58.599 --> 00:54:01.420
there. And I think that's perhaps the
difference from what you had in mind.
00:54:01.420 --> 00:54:05.880
Herald: I have a follow up question. Well,
another question from the Internet,
00:54:05.880 --> 00:54:12.130
regardless of interoperability, and that
is, historically speaking, as soon as the
00:54:12.130 --> 00:54:16.470
big players get involved in certain
standards, they tend to also shape policy
00:54:16.470 --> 00:54:23.070
by being involved in that. How would that
be different in the case of
00:54:23.070 --> 00:54:27.640
interoperability? It was specifically
mentioned by the person who asked the
00:54:27.640 --> 00:54:31.160
question: that Mastodon probably
flourishes because nobody else was
00:54:31.160 --> 00:54:35.900
involved in setting that standard.
Christoph: Ah it's an excellent question.
00:54:35.900 --> 00:54:41.859
And we struggled with the question of
standards ourselves in our policy paper,
00:54:41.859 --> 00:54:46.860
which contains our recommendations for the
European Union to enact certain provisions in the
00:54:46.860 --> 00:54:55.970
new Digital Services Act. We abstained from
asking to establish new standards, like API
00:54:55.970 --> 00:55:00.133
standards. We believe it's a bad idea to
regulate technology like that. What we
00:55:00.133 --> 00:55:05.460
want is for big platforms to just
offer interoperability however they see fit.
00:55:05.460 --> 00:55:11.170
We don't want to have a standard that can
again be either monopolized or lobbied by
00:55:11.170 --> 00:55:15.290
the big platforms. Because then we end up
with the standards we already see
00:55:15.290 --> 00:55:21.010
which we don't like. But it's a good
question. And what we did with our
00:55:21.010 --> 00:55:24.501
policy principles on interoperability was to
give some food for thought on how we
00:55:24.501 --> 00:55:31.110
believe the end version should look,
but many questions remain, and we
00:55:31.110 --> 00:55:35.510
don't know exactly how to get
there.
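Since Mastodon came up: the Fediverse is an existing example of interoperability resting on open standards (ActivityPub, with WebFinger for discovery) rather than on an API controlled by one gatekeeper. A minimal sketch of the discovery step follows - the handle is a placeholder, and real delivery to an inbox additionally requires HTTP Signatures:

```python
# Minimal sketch of Fediverse-style discovery: WebFinger (RFC 7033)
# resolves a user handle to an ActivityPub actor document on any
# compliant server; no single vendor controls either specification.
import requests

def resolve_actor(handle: str) -> dict:
    """Resolve e.g. 'someone@example.social' to its actor document."""
    _user, domain = handle.split("@")
    jrd = requests.get(
        f"https://{domain}/.well-known/webfinger",
        params={"resource": f"acct:{handle}"},
        timeout=10,
    ).json()
    # Pick the link pointing at the ActivityPub representation.
    actor_url = next(
        link["href"]
        for link in jrd["links"]
        if link.get("rel") == "self"
        and link.get("type") == "application/activity+json"
    )
    return requests.get(
        actor_url,
        headers={"Accept": "application/activity+json"},
        timeout=10,
    ).json()

if __name__ == "__main__":
    actor = resolve_actor("someone@example.social")  # placeholder handle
    print(actor.get("inbox"), actor.get("outbox"))
```

This is the kind of arrangement described above: the standard exists independently, and any server that implements it can interoperate without asking a dominant platform for permission.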
00:55:35.510 --> 00:55:39.450
Herald: Yeah, I'm sorry for sticking to the
topic of interoperability, because most
00:55:39.450 --> 00:55:44.040
questions are actually about that. One of
the other questions is how do we prevent
00:55:44.040 --> 00:55:49.480
this from getting messed up like it
happened with PSD2? And for the audience
00:55:49.480 --> 00:55:54.950
that don't know about PSD2 - PSD2 is a
Directive that forced banks to open up
00:55:54.950 --> 00:55:58.930
APIs to other financial service providers,
which is also interoperability between
00:55:58.930 --> 00:56:03.079
platforms, in this case for banking
platforms, which comes with all sorts of
00:56:03.079 --> 00:56:07.740
privacy questions that weren't completely
thought through when that legislation came
00:56:07.740 --> 00:56:12.209
about. Sorry for this long-winded
introduction, Christoph, but I think it was
00:56:12.209 --> 00:56:16.140
needed for people that don't know what
PSD2 means.
00:56:16.140 --> 00:56:20.080
Christoph: It's a good question.
Interestingly, we never used PSD2 or the
00:56:20.080 --> 00:56:23.760
Telecommunications Act, both of which have
interoperability options, as negative
00:56:23.760 --> 00:56:28.540
examples; we always used them as examples that,
hey, it's already possible. So you don't
00:56:28.540 --> 00:56:32.570
have an excuse to say it's impossible to put
it in the law. What is true is that there
00:56:32.570 --> 00:56:37.013
is a lot of mess around it. The question
of how to avoid the mess is a question
00:56:37.013 --> 00:56:42.500
of.. Netzpolitik again. So the question
of whether policy makers are actually
00:56:42.500 --> 00:56:47.150
listening to us or listening to industry
lobbyists. So the one who raised the
00:56:47.150 --> 00:56:50.690
question is absolutely right - there's a
huge risk for every topic we talk about.
00:56:50.690 --> 00:56:56.040
Whether it's interoperability, whether it's user
control over content, targeted ads,
00:56:56.040 --> 00:56:59.829
liability - everything that we believe
should be the law could, of course, be
00:56:59.829 --> 00:57:05.640
hijacked, could be redesigned in a way
that will lead to more problems than
00:57:05.640 --> 00:57:10.830
it solves. So, indeed, for every
policy question we raise, we need to ask
00:57:10.830 --> 00:57:14.840
ourselves: is it worth the fight to risk
opening Pandora's box? Do we make
00:57:14.840 --> 00:57:22.220
it worse? What we said on that front is:
we are happy to apply pressure, and what
00:57:22.220 --> 00:57:25.040
we need to do in the next year is to
convince them that we are the right people
00:57:25.040 --> 00:57:30.849
to talk to. And that's perhaps the challenge:
how to make that clear to policy
00:57:30.849 --> 00:57:35.440
makers. So those who asked the questions, I
think, should help us to come to
00:57:35.440 --> 00:57:39.540
Brussels to the parliament and tell MEPs
how it's going to work.
00:57:39.540 --> 00:57:48.820
Herald: On that note, a question to both of
you. Citizen enforcement - I prefer the
00:57:48.820 --> 00:57:55.349
term citizens over users. Would it be helpful
to push for amendments in the parliament
00:57:55.349 --> 00:58:05.470
for at least the targeting points you both
mentioned before? And if so, how?
00:58:05.470 --> 00:58:14.496
Eliska: So I guess that I will start just
so Christoph can rest a little. So the
00:58:14.496 --> 00:58:18.350
question was whether it would be useful to
push for those amendments? Is that
00:58:18.350 --> 00:58:20.710
right?
Herald: For amendments that cover the
00:58:20.710 --> 00:58:25.970
targeting of citizens.
Eliska: Absolutely. So there is, of
00:58:25.970 --> 00:58:30.630
course, a short and a long answer, as to every
question. And so the short answer would be
00:58:30.630 --> 00:58:37.900
yes, provided that the wording of such an
amendment is precise and nuanced. We
00:58:37.900 --> 00:58:42.090
are still working out our positioning on
online targeting, and I think we can all
00:58:42.090 --> 00:58:46.300
name those practices that we don't
want to see being deployed by platforms
00:58:46.300 --> 00:58:50.900
and where we can actually imagine a proper
ban on such practices. We have recently
00:58:50.900 --> 00:58:57.810
published one of our blog posts where we
actually unfold the current thinking at
00:58:57.810 --> 00:59:02.150
Access Now - you know, how we
are brainstorming about this whole issue.
00:59:02.150 --> 00:59:06.740
And as I said, especially
targeting that uses the behavioral data of
00:59:06.740 --> 00:59:11.360
users, citizens - or maybe let's say
individuals, because they are also obliged
00:59:11.360 --> 00:59:18.550
to protect the rights of individuals that
are not their citizens. So that's
00:59:18.550 --> 00:59:21.840
definitely one area where we can
see ourselves supporting a
00:59:21.840 --> 00:59:26.130
ban, or a possible phase-out that will
actually lead to a ban. The same goes for
00:59:26.130 --> 00:59:34.200
the cross-site tracking of users, given
the way users' data are being abused
00:59:34.200 --> 00:59:39.580
again as an integral part of the
business models of these platforms and so
00:59:39.580 --> 00:59:44.230
on and so forth. So that's one of the
directions we will definitely be taking.
00:59:44.230 --> 00:59:48.799
And again, we are inviting all of you to
help us out, to brainstorm together with
00:59:48.799 --> 00:59:53.020
us, to assess different options,
directions that we should take into
00:59:53.020 --> 00:59:58.470
consideration and not forget about. But I
personally think that this will be one of
00:59:58.470 --> 01:00:04.360
the main battles when it comes to DSA,
where we will definitely need to be on the
01:00:04.360 --> 01:00:10.360
same page, harmonize and join
forces, because the DSA gives us good ground
01:00:10.360 --> 01:00:16.040
at the moment, but it doesn't go far
enough. So, yes, definitely the answer is
01:00:16.040 --> 01:00:19.660
yes, provided that we have a very
nuanced position, so we know what we are
01:00:19.660 --> 01:00:24.579
asking for and we are taking into
consideration also those other aspects
01:00:24.579 --> 01:00:30.240
that could eventually play out badly in
practice. So good intentions are not
01:00:30.240 --> 01:00:35.109
enough when it comes to DSA.
Herald: Thank you. I think we are slightly
01:00:35.109 --> 01:00:42.770
over time, but I've been told beforehand
it's OK to do so for a few minutes and
01:00:42.770 --> 01:00:47.170
there are three questions that are open. One
of them I will answer myself. That
01:00:47.170 --> 01:00:50.480
is basically: have the member states
responded? And the answer to that is no.
01:00:50.480 --> 01:00:54.430
The member states have not taken any
position. And two others, I think, are
01:00:54.430 --> 01:00:59.859
quite interesting and important from
a techy perspective. One is: is there
01:00:59.859 --> 01:01:03.320
anything you see that might affect current
decentralized platforms like the
01:01:03.320 --> 01:01:11.250
Fediverse, Mastodon? And the other is:
will any review of the data protection,
01:01:11.250 --> 01:01:17.440
sorry, the Database Protection Directive,
affect search engines and the
01:01:17.440 --> 01:01:23.430
interaction with these again?
Christoph: Perhaps I jump in and Eliska,
01:01:23.430 --> 01:01:28.329
you take over. First, member states have
actually given opinions on the DSA; there
01:01:28.329 --> 01:01:33.350
have been one or two official submissions,
plus joint letters, plus discussions in
01:01:33.350 --> 01:01:39.150
Council when the DSA was presented.
I have some nice protocols which show
01:01:39.150 --> 01:01:44.170
the different attitudes of member states
towards it. So for us, it also means we
01:01:44.170 --> 01:01:48.970
need to work straight away with the
Council to ensure that the package will
01:01:48.970 --> 01:01:55.680
be good. What was the question? Ah, yes, I
think the answer to the question depends
01:01:55.680 --> 01:02:00.520
on what lawyers call the material scope of
application - whether it would apply to
01:02:00.520 --> 01:02:05.760
those platform models at all. But
Eliska, you can help me out here. You have
01:02:05.760 --> 01:02:08.930
always criticized the e-Commerce
Directive because it was not quite clear how
01:02:08.930 --> 01:02:15.190
it would relate to, first, non-profit
platforms. And many of these alternative
01:02:15.190 --> 01:02:19.680
platforms are like that, because there was
this issue of providing a service against
01:02:19.680 --> 01:02:22.660
remuneration, and it's not quite clear
what it means. Would it apply to Wikipedia
01:02:22.660 --> 01:02:26.609
if it gets, like, donations? Would it have
applied to a blogger who has, like,
01:02:26.609 --> 01:02:31.640
pop-up ads or something like that? So
that's, I think, one huge question. And
01:02:31.640 --> 01:02:37.680
the second question is to what extent those
new due diligence obligations would force
01:02:37.680 --> 01:02:44.369
alternative platforms and governance models
to redesign their interfaces. And those are
01:02:44.369 --> 01:02:48.430
open questions for us. We have not analyzed
that in detail, but we are
01:02:48.430 --> 01:02:52.839
worried that it would not only impact the
large platforms, but many others as well.
01:02:52.839 --> 01:02:58.020
What do you think, Eliska?
Eliska: Yeah, I can only agree and
01:02:58.020 --> 01:03:03.620
especially regarding the non-profit
question. Yeah, this is, or was and has
01:03:03.620 --> 01:03:09.859
always been, one of our main asks for
non-profit organizations. And actually it's
01:03:09.859 --> 01:03:13.910
not quite clear how that will play out now
in practice, or, by the way, where the DSA stands,
01:03:13.910 --> 01:03:17.450
because at the moment it actually speaks
about online platforms and then it speaks
01:03:17.450 --> 01:03:30.829
about very large online platforms; but
what that will mean and how it will impact
01:03:30.829 --> 01:03:35.610
non-profit organizations, whether these are
bloggers or organizations like us, civil
01:03:35.610 --> 01:03:43.760
rights organizations, remains to be
seen. I also know that the European Court
01:03:43.760 --> 01:03:49.100
of Human Rights, in its jurisprudence, tried
to establish some principles for
01:03:49.100 --> 01:03:56.410
non-profits, starting with Delfi AS versus Estonia
and then with the MTE decision and the
01:03:56.410 --> 01:04:01.090
Swedish case that followed afterwards. But
I'm not sure how well it was actually
01:04:01.090 --> 01:04:05.430
elaborated later on. But this is something
we will be definitely looking at and
01:04:05.430 --> 01:04:14.030
working on further. And regarding the
impact on the smaller players and also
01:04:14.030 --> 01:04:18.349
the interface idea, this is still
something we are actually also wondering
01:04:18.349 --> 01:04:26.330
about and thinking how this will turn out
in practice. And we are hoping to actually
01:04:26.330 --> 01:04:30.760
develop our positioning further on these
issues as well, because we actually
01:04:30.760 --> 01:04:36.440
started working, all of us, on the DSA and on
our recommendations already, I think, a
01:04:36.440 --> 01:04:40.810
year ago, maybe a year and a half ago, when
we were working just with the leaks that
01:04:40.810 --> 01:04:45.080
Politico or other media platforms in
Brussels were actually sharing with us.
01:04:45.080 --> 01:04:48.551
And we were working with the bits and
pieces and trying to put our thinking
01:04:48.551 --> 01:04:52.570
together. Now we have the draft and I
think we need to do another round of very
01:04:52.570 --> 01:04:58.099
detailed thinking: what will ultimately be
our position, and what will be those
01:04:58.099 --> 01:05:01.480
recommendations and amendments in the
European Parliament we will be supporting
01:05:01.480 --> 01:05:06.749
and pushing for. So it's a period of
hard work for all of us. Not to mention
01:05:06.749 --> 01:05:14.700
that, as I always say, you know, we stand
against a huge lobbying power that is going
01:05:14.700 --> 01:05:21.671
to be and is constantly being exercised by
these private actors. But I also have to
01:05:21.671 --> 01:05:25.270
say that we had a very good cooperation
with the European Commission throughout
01:05:25.270 --> 01:05:28.731
the process. And I think that I can say
that on behalf of all of us, that we feel
01:05:28.731 --> 01:05:33.630
that the European Commission really listened
this time. So, yeah, more question marks
01:05:33.630 --> 01:05:37.980
than answers here from me, I think.
Herald: This is fine with me, not knowing
01:05:37.980 --> 01:05:42.299
something is fine. I think we've
definitely run out of time. Thank you both
01:05:42.299 --> 01:05:51.410
for being here. Well, enjoy the 2D online
world and the Congress. Thank you. That
01:05:51.410 --> 01:05:53.990
wraps the session up for us. Thank you all.
Christoph: Thank you.
01:05:53.990 --> 01:05:57.380
Eliska: Thank you very much. Bye.
01:05:57.380 --> 01:06:00.363
music
01:06:00.363 --> 01:06:36.350
Subtitles created by c3subtitles.de
in the year 2021. Join, and help us!