0:00:00.000,0:00:12.719
rC3 preroll music
0:00:12.719,0:00:19.690
Herald: And the talk that is about to[br]begin now is by Christoph Schmon from EFF
0:00:19.690,0:00:23.890
and Eliska Pirkova from Access Now, and[br]they will be talking about probably
0:00:19.690,0:00:23.890
the biggest legislative initiative[br]since the GDPR in Brussels. It's called
0:00:28.759,0:00:38.885
the Digital Services Act. And onto you[br]two. You're muted.
0:00:38.885,0:00:43.109
Eliska Pirkova: I realized. Hello![br]First of all, thank you very much for
0:00:43.109,0:00:47.710
having us today. It's a great pleasure to[br]be part of this event. I think for
0:00:47.710,0:00:51.979
both of us, it's the first time we are[br]actually joining this year. So it's
0:00:51.979,0:00:56.949
fantastic to be part of this great[br]community. And today we are going to talk
0:00:56.949,0:01:03.229
about the legislative proposal, the EU[br]proposal which is causing a lot of noise all
0:01:03.229,0:01:07.700
around Europe, but not only in Europe, but[br]also beyond. And that's the Digital
0:01:07.700,0:01:11.950
Services Act legislative package. Today[br]we already know that this
0:01:11.950,0:01:16.390
legislative package actually consists of[br]two acts: the Digital Services Act and the
0:01:16.390,0:01:21.829
Digital Markets Act. And both of them will[br]significantly change the regulation of
0:01:21.829,0:01:26.720
online platforms with a specific focus on[br]very large online platforms, also often
0:01:26.720,0:01:31.810
referred to as gatekeepers. So those who[br]actually hold a lot of economic dominance,
0:01:31.810,0:01:37.320
but also a lot of influence and control[br]over users' rights and the public
0:01:37.320,0:01:42.100
discourse. So I'm going to start with[br]giving you a quick introduction into
0:01:42.100,0:01:46.509
what's the fuss about, what is actually[br]the DSA about, why we are also interested
0:01:46.509,0:01:50.460
in it and why we keep talking about it,[br]and why this legislation will keep us
0:01:50.460,0:01:57.049
preoccupied for the years to come. DSA was[br]already announced two years ago as a part
0:01:57.049,0:02:05.409
of the European Union's digital strategy, and[br]it was designated as one of those actions that
0:02:05.409,0:02:11.020
the digital strategy would actually[br]consist of. And it was a promise that
0:02:11.020,0:02:14.700
the European Commission gave us already at[br]that time to create the systemic
0:02:14.700,0:02:19.650
regulation of online platforms that[br]hopefully places the users and
0:02:19.650,0:02:25.790
their rights into the center of this[br]upcoming legislation. So the promise
0:02:25.790,0:02:30.879
behind DSA is that this ad-based Internet,[br]and I'm speaking now about ad tech
0:02:30.879,0:02:34.319
and online targeting, the[br]Internet as we knew it, will be
0:02:34.319,0:02:38.760
actually replaced by something that puts[br]the user, users' controls, and users' rights
0:02:38.760,0:02:43.810
as a priority. So both of these[br]legislations implemented and drafted
0:02:43.810,0:02:49.209
right, should be actually achieving that[br]goal in the future. Now, previously,
0:02:49.209,0:02:54.319
before DSA actually was drafted, there was[br]the so-called e-Commerce Directive in
0:02:54.319,0:02:58.480
place that actually established the basic[br]principles, especially in the field of
0:02:58.480,0:03:03.190
content governance. I won't go into[br]details on that because I don't want to
0:03:03.190,0:03:07.830
make this too legalistic. But ultimately,[br]DSA legislation is supposed to not
0:03:07.830,0:03:12.519
completely replace but build on top[br]of this legislation that actually created
0:03:12.519,0:03:17.280
the ground and the main legal regime for[br]almost 20 years in Europe to regulate
0:03:17.280,0:03:23.330
user generated content. So, DSA and DMA,[br]as the legislation will seek to harmonize
0:03:23.330,0:03:28.620
the rules addressing problems such as[br]online hate speech and disinformation. But it
0:03:28.620,0:03:33.310
also puts emphasis finally on increased[br]meaningful transparency in online
0:03:33.310,0:03:37.829
advertising, the way how the content is[br]actually being distributed across
0:03:37.829,0:03:44.519
platforms and also will develop a specific[br]enforcement mechanism that will be
0:03:44.519,0:03:49.810
actually looking into it. Now, before I[br]will actually go into the details on DSA
0:03:49.810,0:03:54.349
and why DSA matters, and do we actually[br]need such a big new legislative reform
0:03:54.349,0:03:58.020
that is coming from the European[br]Commission? I want to just unpack it for
0:03:58.020,0:04:03.200
you a little bit what this legislative[br]package actually consists of. So, as I
0:04:03.200,0:04:07.409
already mentioned, two regulations... the[br]regulation, the strongest legal instrument
0:04:07.409,0:04:11.989
European Commission actually has in its[br]hands, which is supposed to achieve the
0:04:11.989,0:04:16.380
highest level of harmonization across the[br]member states. And we all can imagine how
0:04:16.380,0:04:20.120
difficult that will be to achieve,[br]especially in the realm of freedom of
0:04:20.120,0:04:24.910
expression and particular categories of[br]user generated content, which is so deeply
0:04:24.910,0:04:29.960
context dependent. All of the rules[br]related to content moderation and content
0:04:29.960,0:04:35.550
curation will be mainly in the realm of[br]Digital Services Act. And then the second
0:04:35.550,0:04:39.710
regulation, the Digital Markets Act, will[br]be specifically looking at the dominance
0:04:39.710,0:04:44.620
of online platforms, economic dominance,[br]competitive environment for smaller
0:04:44.620,0:04:49.750
players, and fairness in competition.[br]And it will also establish a list of
0:04:49.750,0:04:54.510
do's and don'ts for gatekeeper[br]platforms. So... the
0:04:54.510,0:04:59.650
platforms that actually hold a relevant[br]dominance and now based on these new
0:04:59.650,0:05:03.920
proposals, we know that these platforms[br]are mainly referred to as very large online
0:05:03.920,0:05:10.040
platforms. So this is exactly how the[br]legislation refers to gatekeepers now. And
0:05:10.040,0:05:15.259
now I think one more point that I want to[br]make is that the DSA and the DMA were
0:05:15.259,0:05:20.520
launched on the 15th of December 2020. So it[br]was literally a Christmas present given to
0:05:20.520,0:05:24.610
the digital rights community by the[br]European Commission, a long time
0:05:24.610,0:05:30.530
anticipated one. The work on DSA[br]started, however, much earlier. The Electronic
0:05:30.530,0:05:34.800
Frontier Foundation, as much as Access[br]Now, together with EDRi, were working very
0:05:34.800,0:05:39.100
hard to come up with the priorities and[br]recommendations on what we would like to
0:05:34.800,0:05:39.100
see enshrined within these legislations,[br]because from the beginning we
0:05:42.660,0:05:46.690
understood the importance and the far[br]reaching consequences this legislation
0:05:46.690,0:05:51.690
will have not only inside of the European[br]Union, but also beyond. And that brings me
0:05:51.690,0:05:55.140
to the final introductory point that I[br]want to make before I will hand over to
0:05:55.140,0:06:00.731
Chris, which is: why do we actually[br]need the DSA? We strongly believe that there
0:06:00.731,0:06:06.139
is a big justification and good reason to[br]actually establish a systemic regulation
0:06:06.139,0:06:11.190
of online platforms in order to secure[br]users' fundamental rights, empower them,
0:06:11.190,0:06:15.580
and also to protect our democratic[br]discourse. And this is due to the fact
0:06:15.580,0:06:20.840
that for many years we have been witnessing[br]quite bad regulatory practices in
0:06:20.840,0:06:25.231
platform governance that are coming from[br]member states, and Chris will provide a
0:06:25.231,0:06:29.840
very concrete example in that regard, but[br]also coming from the European Commission
0:06:29.840,0:06:35.930
itself, mainly the proposed online terrorist[br]content regulation, for instance, or we
0:06:35.930,0:06:40.670
all remember the story of copyright that[br]Chris will discuss a little bit further.
0:06:40.670,0:06:45.370
We saw how not only the member states, but[br]also European Commission or European
0:06:45.370,0:06:52.050
Union, in order to actually establish some[br]order in the digital space, they started
0:06:52.050,0:06:56.060
pushing the state's obligations, and[br]especially the state's positive obligation to
0:06:56.060,0:07:00.940
protect users' human rights, into the[br]hands of private online platforms that
0:07:00.940,0:07:07.940
started replacing state actors and public[br]authorities within the online space. They
0:07:07.940,0:07:11.900
started assessing the content, the[br]legality of the content, deciding under
0:07:11.900,0:07:16.010
very short time frames what should stay[br]online and what should go offline with no
0:07:16.010,0:07:20.519
public scrutiny or transparency about[br]practices that they kept deploying. And
0:07:20.519,0:07:25.590
they keep deploying to this day. Of[br]course, platforms under the threat of
0:07:25.590,0:07:31.140
legal liability often had to rely, and[br]still have to rely, on content
0:07:31.140,0:07:36.180
recognition technologies for removing user[br]generated content. A typical example could
0:07:36.180,0:07:40.190
be the French Avia law, which will be[br]mentioned again today during the presentation.
0:07:40.190,0:07:45.440
And the typical time frames are usually[br]those that extend from one hour to 24
0:07:45.440,0:07:49.060
hours, which is extremely short,[br]especially if a user would like to
0:07:49.060,0:07:54.900
appeal such a decision or seek an[br]effective remedy. At the same time, due to
0:07:54.900,0:08:00.860
the lack of harmonization and the lack of a[br]proper set of responsibilities that should
0:07:54.900,0:08:00.860
lie in the hands of these online[br]platforms, there was a lack of legal
0:08:00.860,0:08:04.129
certainty, which only reinforced the[br]vicious circle of removing more and more
0:08:08.550,0:08:14.909
of online content in order to escape any[br]possible liability. And at the end, to
0:08:14.909,0:08:19.940
this day, due to the lack of transparency,[br]we lack any evidence- or research-based
0:08:19.940,0:08:24.340
policy making, because platforms do not[br]want to share or inform the public
0:08:24.340,0:08:28.350
authorities what they actually do with the[br]content, how they moderate, and the
0:08:28.350,0:08:32.320
transparency information that we receive[br]within their transparency reports is
0:08:32.320,0:08:36.880
usually quantity oriented instead of[br]quality oriented. So they focus on how much content
0:08:36.880,0:08:42.030
is actually being removed and how fast,[br]which is not enough in order to create
0:08:42.030,0:08:47.010
laws that can actually provide any more[br]sustainable solutions. And ultimately, as
0:08:47.010,0:08:52.600
we all agree, the core issue doesn't lie[br]that much with how the content is being
0:08:52.600,0:08:57.440
moderated, but how content is being[br]distributed across platforms within the
0:08:57.440,0:09:02.600
core of their business models, which[br]actually stand on an attention economy and
0:09:02.600,0:09:07.580
on the way, how sensational content is[br]often being amplified in order to actually
0:09:07.580,0:09:12.930
prolong the attention span of users who[br]visit platforms on a regular basis. And I'm
0:09:12.930,0:09:18.000
packing quite a few issues here. And this[br]is supposed to be just a quick
0:09:18.000,0:09:22.370
introductory remark. I will now hand over[br]to Chris, who will elaborate on all these
0:09:22.370,0:09:26.080
points a little bit further, and then we[br]take a look and unpack a few quite
0:09:26.080,0:09:31.200
important parts of the DSA that we feel[br]should be prioritized in this debate at
0:09:31.200,0:09:36.980
the moment. Chris, over to you.[br]Christoph Schmon: Hi everybody. I'm
0:09:36.980,0:09:42.860
quite sure that many of you have noticed[br]that there's a growing appetite from the
0:09:42.860,0:09:47.390
side of the European Union to regulate the[br]Internet by using online platforms, as the
0:09:47.390,0:09:52.370
helping hands to monitor and censor what[br]users can say/share/do online. As you
0:09:52.370,0:09:58.940
see on the slide, the first highlight of[br]this growing appetite was copyright upload
0:09:58.940,0:10:02.880
filters, which are supposed to stop[br]copyright protected content online.
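To make concrete why the talk calls these filters blunt instruments, here is a toy Python sketch of fingerprint-based matching - purely hypothetical, not any platform's real system. It flags whatever matches a reference database and has no notion of quotation, criticism, or parody:

  import hashlib

  # Hypothetical reference database: fingerprints of registered works.
  BLOCKLIST = {hashlib.sha256(b"<copyrighted recording>").hexdigest()}

  def fingerprint(data: bytes) -> str:
      # Stand-in for a real perceptual hash of audio or video.
      return hashlib.sha256(data).hexdigest()

  def upload_allowed(data: bytes) -> bool:
      # The filter sees only the fingerprint: it cannot tell an
      # infringing copy from a lawful quotation, review, or parody.
      return fingerprint(data) not in BLOCKLIST

  print(upload_allowed(b"<copyrighted recording>"))    # False: blocked
  print(upload_allowed(b"a short quote in a review"))  # True only because
  # the bytes differ; a real perceptual matcher would flag excerpts too.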
0:10:02.880,0:10:07.280
Thousands of people, old and young, went[br]on the streets to demonstrate for a free
0:10:07.280,0:10:11.290
Internet, to demonstrate against technical[br]measures to turn the Internet into some
0:10:11.290,0:10:15.670
sort of censorship machine. We've made a[br]point then, and we continue making the
0:10:15.670,0:10:20.700
point now that upload filters are prone to[br]error, that upload filters cannot
0:10:20.700,0:10:24.330
understand context, that they are[br]unaffordable by all but the largest tech
0:10:24.330,0:10:29.670
companies, which happen to be all based in[br]the United States. But as you well know,
0:10:29.670,0:10:34.460
policymakers would not listen and[br]Article 13 of the copyright directive was
0:10:34.460,0:10:39.750
adopted by a small margin in the European[br]Parliament, also because some members of
0:10:39.750,0:10:44.410
European Parliament had trouble pressing[br]the right buttons, but I think it's
0:10:44.410,0:10:49.510
important for us to understand that the[br]fight is far from over. The member states
0:10:49.510,0:10:53.170
must now implement the directive in a[br]way that is not at odds with fundamental
0:10:53.170,0:10:58.130
rights. And we argue that mandated[br]automated removal technologies are always
0:10:58.130,0:11:02.790
in conflict with fundamental rights. And[br]this includes data protection rights. It
0:11:02.790,0:11:07.070
is a data protection right not to be made[br]subject to Automated Decision-Making
0:11:07.070,0:11:10.890
online, if it involves your personal data[br]and if such decision making has a negative
0:11:10.890,0:11:16.640
effect. So we believe, all those legal[br]arguments aside, I think the most worrying
0:11:16.640,0:11:22.570
experience with upload filters is that they[br]have spillover effects on other
0:11:22.570,0:11:25.810
initiatives. Sure, if it works for[br]copyright protected content, it may well
0:11:25.810,0:11:31.350
work for other types of content, right?[br]Except that it doesn't. Many consider it
0:11:31.350,0:11:35.960
now a clever idea that web forums should[br]proactively monitor and check all sorts of
0:11:35.960,0:11:40.690
user content, be it communication,[br]pictures or videos, and they should use
0:11:40.690,0:11:45.820
filters to take it down or to[br]prevent the re-upload of such content. An
0:11:45.820,0:11:51.190
example of such spillover effect that Eliska[br]had mentioned is the draft regulation of
0:11:51.190,0:11:56.110
terrorist related content. It took a huge[br]joint effort of civil society groups and
0:11:56.110,0:12:02.930
some members of Parliament to reject the[br]worst of the text. We had recent
0:12:02.930,0:12:08.860
negotiations going on and at least we[br]managed to get out the requirement to use
0:12:08.860,0:12:13.050
upload filters, but there is still a twenty-four-[br]hour removal obligation that may nudge
0:12:13.050,0:12:19.560
platforms to employ those filters[br]nevertheless. And we see that particularly
0:12:19.560,0:12:23.140
in nation states: they are very fond of[br]the idea that platforms rather than
0:12:23.140,0:12:27.420
judges should be the new law[br]enforcers. There are now several states in
0:12:27.420,0:12:31.480
the European Union that have adopted laws[br]that would either oblige or nudge
0:12:31.480,0:12:36.640
platforms to monitor users' speech[br]online. First up was the German NetzDG,
0:12:36.640,0:12:40.110
which set out systematic duties for[br]platforms. Then we had the French law
0:12:40.110,0:12:44.320
Avia, which copy-pasted the NetzDG and[br]made it worse. And last we have the
0:12:44.320,0:12:47.690
Austrian Hate Speech bill, which is a mix[br]of both the German and the French
0:12:47.690,0:12:51.890
proposal. They all go well beyond[br]copyrighted content, and focus on hate
0:12:51.890,0:12:56.500
speech and all sorts of content that may[br]be considered problematic in those
0:12:56.500,0:13:01.620
respective countries, not necessarily in[br]other countries. And this brings me to the
0:13:01.620,0:13:05.890
next problem. How do we deal with content[br]that is illegal in one country, but legal
0:13:05.890,0:13:10.910
in another? A recent Court of Justice[br]ruling had confirmed that a court of a
0:13:10.910,0:13:16.020
small state like Austria can order[br]platforms not only to take down defamatory
0:13:16.020,0:13:20.360
content globally, but also to take down[br]identical or equivalent material using
0:13:20.360,0:13:26.340
automated technologies. For us this is a[br]terrible outcome, that will lead to a race
0:13:26.340,0:13:30.660
to the bottom, where the countries with[br]the least freedom of speech friendly laws
0:13:30.660,0:13:36.490
can superimpose their laws on every other[br]state in the world. We really believe that
0:13:36.490,0:13:40.740
all this nonsense has to stop. It's time[br]to acknowledge that the Internet is a
0:13:40.740,0:13:44.480
global space, a place of exchange of[br]creativity and the place where civil
0:13:44.480,0:13:49.730
liberties are supposed to exist. So we are[br]fighting now against all those national
0:13:49.730,0:13:54.360
initiatives. We had a great first[br]victory, when we helped to bring down the
0:13:54.360,0:14:00.070
French law Avia - the Avia bill that had[br]imposed the duty for platforms to check
0:14:00.070,0:14:05.880
and remove potentially illegal content[br]within 24 hours. Before the Conseil
0:14:05.880,0:14:09.140
constitutionnel, the French Supreme Court,[br]we had argued that this would push
0:14:09.140,0:14:13.740
platforms to constantly check what users[br]post. And if platforms face high
0:14:13.740,0:14:19.740
fines... of course, they would be rather[br]motivated to block as much of the
0:14:13.740,0:14:19.740
content as possible. We've made a point[br]that this would be against the Charter of
0:14:24.190,0:14:29.370
Fundamental Rights, including freedom of[br]information and freedom of expression. And
0:14:29.370,0:14:33.760
it was a great victory for us that the[br]French Supreme Court has struck down the
0:14:33.760,0:14:39.770
French Avia bill and followed our[br]argument, as you see on the slide. We also
0:14:39.770,0:14:43.810
see now that there's a push back at least[br]against the update of the German NetzDG,
0:14:43.810,0:14:47.790
which would have provided new access[br]rights for law enforcement authorities.
0:14:47.790,0:14:51.230
This and other provisions are considered[br]unconstitutional. And as far as I
0:14:51.230,0:14:55.300
understand and, perhaps listeners can[br]correct me, the German president has
0:14:55.300,0:15:00.620
refused to sign the bill, and the Austrian[br]bill goes a similar pathway - it got
0:15:00.620,0:15:05.920
recently a red light from Brussels. The[br]commission considers it in conflict with
0:15:05.920,0:15:12.150
EU principles, also thanks to a joint[br]effort by epicenter.works. And this shows
0:15:12.150,0:15:15.880
that something positive is going on, it's[br]a positive development, the pushback
0:15:15.880,0:15:20.310
against automated filter technologies.[br]But it's important to understand that
0:15:20.310,0:15:24.270
those national initiatives are not just[br]purely national attempts to regulate hate
0:15:24.270,0:15:29.100
speech. It's an attempt of an EU member[br]state to make their own bills, as bad as
0:15:29.100,0:15:33.850
they are, some sort of a prototype for EU-[br]wide legislation, a prototype for the
0:15:33.850,0:15:37.690
Digital Services Act. And as you know,[br]national member states have a say in EU
0:15:37.690,0:15:41.610
lawmaking: their voices are represented in[br]the council of the EU and the European
0:15:41.610,0:15:46.532
Commission, which will be disincentivized[br]to propose anything that would be voted
0:15:46.532,0:15:50.970
down by the council. I think that's a nice[br]takeaway from today, that lawmaking in
0:15:50.970,0:15:55.770
national member states is not an isolated[br]event. It's always political, it's always
0:15:55.770,0:16:02.680
Netzpolitik. The good news is that as[br]far as the European Commission proposal
0:16:02.680,0:16:07.410
for a Digital Services Act is concerned,[br]it has not followed in the footsteps of
0:16:07.410,0:16:12.700
those badly designed and misguided[br]bills. It has respected our input, the
0:16:12.700,0:16:16.490
input from Access Now, from EFF, from the[br]EDRi network, from academics and many
0:16:16.490,0:16:21.830
others, that some key principles should[br]not be removed, like that liability for
0:16:21.830,0:16:26.560
speech should rest with the speaker. The[br]DSA also gives a red light to general
0:16:26.560,0:16:30.560
monitoring of users' content. And there are[br]no short deadlines in there to remove
0:16:30.560,0:16:34.600
content that might be illegal. Instead,[br]the commission gives more slack to
0:16:34.600,0:16:41.770
platforms to take down posts in good faith,[br]which we call the EU style
0:16:41.770,0:16:45.550
Good Samaritan clause. Looking[br]through the global lens of lawmaking,
0:16:45.550,0:16:49.190
it's very fascinating to see that while[br]the United States is flirting with the
0:16:49.190,0:16:52.710
idea to move away from the Good Samaritan[br]principle in Section 230 of the
0:16:52.710,0:16:56.870
Communications Decency Act, so the idea is[br]that platforms can voluntarily remove
0:16:56.870,0:17:01.720
content without being held liable for it,[br]the European Union flirts with the idea to
0:17:01.720,0:17:06.680
introduce it, to give more options to[br]platforms to act. That being said, the
0:17:06.680,0:17:11.310
major difference between the US and the[br]EU is that in Europe, platforms could be
0:17:11.310,0:17:15.699
held liable the moment they become aware[br]of the illegality of content. That's an
0:17:15.699,0:17:20.060
issue because the Digital Services Act has[br]now introduced a relatively sophisticated
0:17:20.060,0:17:24.870
system for user notification, complaint[br]mechanism, dispute resolution options,
0:17:24.870,0:17:29.350
which all leads to such awareness about[br]illegality or could lead to such
0:17:29.350,0:17:33.580
awareness. It's not quite clear for us how[br]platforms will make use of voluntary
0:17:33.580,0:17:38.510
measures to remove content. That being[br]said, we think that the commission's
0:17:38.510,0:17:41.870
proposal could have been much worse. And[br]the Parliament's reports on Digital
0:17:41.870,0:17:46.090
Services Act have demonstrated that the[br]new parliament is a bit better than the
0:17:46.090,0:17:50.440
old one. They have a lot of respect for[br]fundamental rights. There are many members of
0:17:50.440,0:17:55.470
parliament that are quite fond of the idea[br]to protect civil liberties online. But we
0:17:55.470,0:17:59.780
know that this was only the start and we[br]know that we need another joint effort to
0:17:59.780,0:18:03.240
ensure that users are not monitored and[br]are not at the mercy of algorithmic decision
0:18:03.240,0:18:08.480
making. And I think Eliska is now going to[br]explain a bit more about all this.
0:18:08.480,0:18:15.250
Eliska: Thank you. Thank you very much,[br]Chris. So, we can actually move now
0:18:15.250,0:18:19.800
further and unpack a few relevant[br]provisions for everything that has already
0:18:19.800,0:18:23.950
been mentioned, mainly in the realm of[br]content moderation and content curation,
0:18:23.950,0:18:29.010
which ultimately lies at the core of the[br]Digital Services Act. And maybe not to
0:18:29.010,0:18:34.700
make it all so abstract... I also have the[br]printed version of the law here with me.
0:18:34.700,0:18:38.630
And if you look at it, it's quite an[br]impressive piece of work that the European
0:18:38.630,0:18:44.690
Commission did there. And I have to say[br]that even though it's a great start, it
0:18:44.690,0:18:50.860
still contains a lot of imperfections. And[br]I will try to summarize those now for you,
0:18:50.860,0:18:57.560
especially in the light of our own positioning,[br]and I mean civil society's in general,
0:18:57.560,0:19:00.650
because I believe that on all those[br]points, we have a pretty solid agreement
0:19:00.650,0:19:05.730
among each other. And what we were[br]actually hoping that Digital Services Act
0:19:05.730,0:19:11.130
will do, what it actually does and where[br]we see that we will need to actually
0:19:11.130,0:19:14.880
continue working very closely, especially[br]with the members of the European
0:19:14.880,0:19:18.470
Parliament in the future, once the draft[br]will actually enter the European
0:19:18.470,0:19:23.520
Parliament, which should happen relatively[br]soon. So, as I already
0:19:23.520,0:19:29.170
mentioned at the beginning quite briefly,[br]DSA distinguishes between
0:19:29.170,0:19:34.470
online platforms, which are defined within[br]the scope of the law and between very
0:19:34.470,0:19:38.570
large online platforms, which is exactly[br]the scope into which all large online
0:19:34.470,0:19:38.570
gatekeepers fall. DSA specifically then[br]distinguishes between obligations or
0:19:43.860,0:19:49.370
responsibilities of these actors, some[br]assigned to all of them, including online
0:19:49.370,0:19:53.170
platforms and some being extended[br]specifically due to the dominance and
0:19:53.170,0:19:58.530
power these online gatekeepers hold.[br]This is mainly the case when we
0:19:58.530,0:20:01.520
discuss the requirements for[br]transparency. There is a set of
0:20:01.520,0:20:05.470
requirements for transparency that apply[br]to online platforms, but then there is
0:20:05.470,0:20:09.160
still a specific additional set of[br]transparency requirements for very large
0:20:09.160,0:20:14.300
online platforms. What DSA does especially,[br]and this is the bit which is extremely
0:20:14.300,0:20:18.890
relevant for the content moderation[br]practices - it attempts to establish a
0:20:18.890,0:20:23.980
harmonized model for notice and action[br]procedure for allegedly illegal content.
0:20:23.980,0:20:29.350
One of our red lines, before I go into[br]the details on this, was that DSA should be
0:20:23.980,0:20:29.350
touching only, or trying to regulate only,[br]allegedly illegal content and stay away
0:20:35.080,0:20:39.870
from vaguely defined content categories[br]such as potentially harmful, but legal
0:20:39.870,0:20:44.560
content. There are other ways how the[br]legislation can tackle this content,
0:20:44.560,0:20:50.730
mainly through the meaningful transparency[br]requirements, accountability, tackling
0:20:50.730,0:20:56.820
issues within the content recommender[br]systems and algorithmic curation. But we
0:20:56.820,0:21:00.950
didn't want the specific category of the[br]content to be included within the scope of
0:21:00.950,0:21:05.830
DSA. This is due to the fact that vaguely[br]defined terms that find their way into
0:21:05.830,0:21:11.570
legislation always lead to human rights[br]abuse in the future. I could give you
0:21:11.570,0:21:17.470
examples from Europe, such as the concept[br]of online harms within the UK. But also, as
0:21:17.470,0:21:21.960
global organizations, both of us[br]actually often see how weak terminology
0:21:21.960,0:21:26.290
can quickly lead to even over-[br]criminalization of speech or suppressing
0:21:26.290,0:21:31.120
dissent. Now, if we go back to the[br]harmonized notice and action procedure,
0:21:31.120,0:21:35.730
what that actually means in practice, as[br]Christoph already mentioned, Europe has
0:21:35.730,0:21:39.950
so-called conditional model of[br]intermediary liability, which is being
0:21:39.950,0:21:43.950
provided already and established by the[br]initial legal regime, which is the
0:21:43.950,0:21:49.810
e-Commerce Directive under Article 14 of[br]the e-Commerce Directive, which actually
0:21:49.810,0:21:54.130
states that unless the platform holds the[br]actual knowledge and according to the
0:21:54.130,0:21:58.770
wording of DSA now it's the actual[br]knowledge or awareness about the presence
0:21:58.770,0:22:04.520
of illegal content on their platform, they[br]cannot be held liable for such content.
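As a rough illustration (a deliberate simplification of the rule just described, not legal advice), that conditional liability model can be phrased as a small decision rule in Python:

  from dataclasses import dataclass

  @dataclass
  class HostState:
      has_actual_knowledge: bool  # e.g. gained via a sufficiently precise notice
      acted_expeditiously: bool   # removed or disabled access after learning of it

  def host_liable(state: HostState) -> bool:
      # Simplified reading of the conditional model described in the talk
      # (e-Commerce Directive Art. 14, carried over into the DSA): no
      # knowledge -> no liability; knowledge plus expeditious action -> no
      # liability; knowledge plus inaction -> liability becomes possible.
      if not state.has_actual_knowledge:
          return False
      return not state.acted_expeditiously

  print(host_liable(HostState(False, False)))  # False: no actual knowledge
  print(host_liable(HostState(True, True)))    # False: acted expeditiously
  print(host_liable(HostState(True, False)))   # True: knew and did nothing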
0:22:04.520,0:22:09.540
Now, we were asking for a harmonized[br]procedure regarding notice and action across
0:22:09.540,0:22:14.470
the EU for a while, precisely because we[br]wanted to see reinforced legal certainty.
0:22:14.470,0:22:19.070
Lack of legal certainty often translated[br]into over-removal of even legitimate
0:22:19.070,0:22:25.760
content from the platforms with no public[br]scrutiny. DSA does a good job, it's a good
0:22:25.760,0:22:29.330
starting point that actually attempts[br]to establish such a harmonized
0:22:29.330,0:22:34.230
procedure, but it's still lagging behind[br]on many aspects that we consider important
0:22:34.230,0:22:38.350
in order to strengthen protection of[br]fundamental rights of users. One of them
0:22:38.350,0:22:42.290
is, for instance, that harmonized notice[br]and action procedure, as envisioned by
0:22:42.290,0:22:47.640
DSA, is not specifically tailored to[br]different types of categories of user
0:22:47.640,0:22:52.950
generated content. And as we know, there[br]are some or many categories of content
0:22:52.950,0:22:58.280
that are deeply context dependent, linked[br]to the historical and sociopolitical
0:22:58.280,0:23:04.560
context of the member state in question. And[br]due to the reliance on automated
0:23:04.560,0:23:08.840
measures, which are usually context blind, we[br]are worried that if notice and action
0:23:08.840,0:23:13.290
doesn't reflect this aspect in any further[br]way, we will end up again with over-
0:23:13.290,0:23:18.830
removal of content. What is probably[br]another huge issue that we are currently
0:23:18.830,0:23:23.640
lacking in the draft: even though DSA is[br]trying to create proper appeal and
0:23:23.640,0:23:29.200
enforcement mechanisms and also appeal[br]mechanisms for users and different
0:23:29.200,0:23:33.620
alternative dispute settlement options that the[br]draft law currently contains, there is no
0:23:33.620,0:23:40.768
possibility for content providers, for the[br]user that uploads the filter.. Sorry.
0:23:40.768,0:23:45.450
laughs That was a nice Freudian slip[br]there. For
0:23:45.450,0:23:50.621
a user that actually uploaded content to[br]appeal directly or to use
0:23:50.621,0:23:56.430
counter-notification about the notified[br]content that belongs to that user. Nor
0:23:56.430,0:24:00.750
are platforms obliged to actually send a[br]notification to the user prior to any action
0:24:00.750,0:24:05.110
that is being taken against that[br]particular notified content. These are for
0:24:05.110,0:24:10.120
us procedural safeguards for fairness[br]that users should have, and currently they
0:24:10.120,0:24:16.030
are not being reflected in the draft.[br]However, this is a good start and it's
0:24:16.030,0:24:20.190
something that we were pushing for. But I[br]think there are many more aspects that
0:24:20.190,0:24:24.710
these notice and action procedures will[br]need to contain in order to truly put
0:24:24.710,0:24:31.420
users at first. Now the notice and action[br]procedure is mainly focusing on the
0:24:31.420,0:24:35.470
illegal content. But there are ways in the[br]draft where potentially harmful content,
0:24:35.470,0:24:38.910
which is still legal - so the content that[br]actually violates the terms of service of
0:24:38.910,0:24:42.710
platforms is being mentioned throughout[br]the draft. So for us, it's not at the
0:24:42.710,0:24:51.890
moment exactly clear how that will work in[br]practice. So that's why we often use this
0:24:51.890,0:24:56.370
phrase that is also put on the slide: good[br]intention with imperfect solutions.
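To illustrate the procedural safeguards just described, here is a minimal, hypothetical notice-and-action state machine in Python. The uploader-notification and counter-notice steps are precisely the ones said to be missing from the draft; nothing here reflects the DSA's actual text:

  from enum import Enum, auto

  class NoticeState(Enum):
      RECEIVED = auto()
      UPLOADER_NOTIFIED = auto()   # safeguard: tell the uploader before acting
      COUNTER_NOTICE = auto()      # safeguard: the uploader can contest
      ACTION_TAKEN = auto()
      DISPUTE_RESOLUTION = auto()

  def advance(state: NoticeState, uploader_contests: bool) -> NoticeState:
      # One step of a hypothetical notice-and-action flow with the
      # safeguards built in before any removal happens.
      if state is NoticeState.RECEIVED:
          return NoticeState.UPLOADER_NOTIFIED
      if state is NoticeState.UPLOADER_NOTIFIED:
          return (NoticeState.COUNTER_NOTICE if uploader_contests
                  else NoticeState.ACTION_TAKEN)
      if state is NoticeState.COUNTER_NOTICE:
          return NoticeState.DISPUTE_RESOLUTION
      return state

  state = NoticeState.RECEIVED
  state = advance(state, uploader_contests=True)   # -> UPLOADER_NOTIFIED
  state = advance(state, uploader_contests=True)   # -> COUNTER_NOTICE
  print(state)                                     # NoticeState.COUNTER_NOTICE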
0:24:56.370,0:25:00.430
However, I want to emphasize again that[br]this is just the beginning and we will
0:25:00.430,0:25:05.920
still have time and space to work very[br]hard on this. Another kind of novel aspect
0:25:05.920,0:25:10.710
that DSA actually brings about is the already[br]mentioned Good Samaritan Clause, and I
0:25:10.710,0:25:16.560
tend to call it the EU model or EU version[br]of Good Samaritan Clause. Good Samaritan
0:25:16.560,0:25:21.370
Clause originates in Section 230 of the[br]Communications Decency Act, as Christoph
0:25:21.370,0:25:26.609
already mentioned. But within the European[br]realm, it goes hand in hand with this
0:25:26.609,0:25:31.680
conditional model of liability which is[br]being preserved within the DSA legal
0:25:31.680,0:25:36.180
draft. That was also one of our main asks,[br]to preserve this conditional model of
0:25:36.180,0:25:41.550
liability and it's great that this time[br]European Commission really listened. Why
0:25:41.550,0:25:46.060
do we consider the Good Samaritan Clause[br]important? In the past, when such a
0:25:46.060,0:25:51.380
security wasn't enshrined in the law, but[br]it was just somehow vaguely promised by
0:25:46.060,0:25:51.380
the commission that if a platform would[br]proactively deploy measures to fight
0:25:58.320,0:26:01.910
against the spread of illegal content,[br]they won't be held liable without
0:26:01.910,0:26:06.921
acknowledging that through such a use of[br]so-called proactive measures, the platform
0:26:06.921,0:26:11.710
could in theory gain the actual knowledge[br]about the existence of such a type of
0:26:11.710,0:26:16.350
content on its platform, which would[br]immediately trigger legal liability. This
0:26:16.350,0:26:20.500
threat of liability often pushed platforms[br]into a corner, so they would rather remove
0:26:20.500,0:26:27.240
the content very quickly than face more[br]serious consequences later on. That's why
0:26:27.240,0:26:32.330
we see the importance within the Good[br]Samaritan Clause or the European model of
0:26:32.330,0:26:36.910
Good Samaritan Clause, and we are glad[br]that it's currently being part of the
0:26:36.910,0:26:44.490
draft. One of the biggest downfalls or one[br]of the biggest disappointments when DSA
0:26:44.490,0:26:50.410
finally came out on the 15th of December[br]for us was to see that it's still online
0:26:50.410,0:26:54.800
platforms that will remain in charge when[br]it comes to assessing the legality of the
0:26:54.800,0:26:59.450
content and deciding what content should[br]be actually restricted and removed from a
0:26:59.450,0:27:06.080
platform and what should be available. We[br]often emphasize that it's very important
0:27:06.080,0:27:11.470
that the legality of the content is being[br]assessed by the independent judicial
0:27:11.470,0:27:17.020
authorities, in line with the rule of[br]law principles. We also do understand that
0:27:17.020,0:27:23.090
such a solution creates a big burden on[br]the judicial structure of member states.
0:27:23.090,0:27:26.540
Many member states see that as a very[br]expensive solution; they don't always
0:27:26.540,0:27:33.020
want to create a special network of[br]courts, or e-courts, or other forms of
0:27:33.020,0:27:38.150
judicial review of the illegal or[br]allegedly illegal content. But we still
0:27:38.150,0:27:42.559
wanted to see more public scrutiny,[br]because for us this is truly just the
0:27:42.559,0:27:47.403
reaffirmation of the already existing status[br]quo: at the moment, in many
0:27:47.403,0:27:51.900
jurisdictions within the EU and in the EU[br]itself, it's still online platforms that
0:27:51.900,0:27:56.750
will call the final shots. What, on the[br]other hand, is a positive outcome that
0:27:56.750,0:28:02.370
we were also pushing hard for are the[br]requirements for meaningful transparency.
0:28:02.370,0:28:06.580
So to understand better what platforms[br]actually do with the individual pieces of
0:28:06.580,0:28:12.660
content that are being shared on these[br]platforms and how actually transparency
0:28:12.660,0:28:17.559
can then ultimately empower users. Now, I[br]want to emphasize this because this is
0:28:17.559,0:28:22.970
still an ongoing debate and we will touch[br]upon those issues in a minute. But we
0:28:22.970,0:28:28.840
don't see transparency as a silver bullet[br]to the issues such as amplification of
0:28:28.840,0:28:33.420
potentially harmful content or in general[br]that transparency will be enough to
0:28:33.420,0:28:38.150
actually hold platforms accountable.[br]Absolutely not. It will never be enough,
0:28:38.150,0:28:44.460
but it's a precondition for us to actually[br]seek such solutions in the future. DSA
0:28:44.460,0:28:48.910
contains specific requirements for[br]transparency, as I already mentioned, a
0:28:48.910,0:28:53.300
set of requirements that will be[br]applicable largely to all online platforms
0:28:53.300,0:28:57.840
and then still a specific set of[br]requirements on top of that, which will
0:28:57.840,0:29:02.350
be applicable only to very large online[br]platforms, so the online gatekeepers. We
0:29:02.350,0:29:07.230
appreciate the effort, we see that the[br]list is very promising, but we still think
0:29:07.230,0:29:12.860
it could be more ambitious. Both EFF and[br]Access Now put forward a specific set of
0:29:12.860,0:29:17.910
requirements for meaningful transparency[br]that are in our positions. And so did EDRi
0:29:17.910,0:29:25.080
and other civil society or digital rights[br]activists in this space. And final point
0:29:25.080,0:29:29.100
that I'm going to make is the so-called[br]Pandora's box of online targeting and
0:29:29.100,0:29:34.809
recommender systems. Why do I refer to[br]this as a Pandora's box? When the European
0:29:34.809,0:29:41.440
Parliament published its own-initiative[br]reports on DSA - there are two reports, one
0:29:41.440,0:29:47.680
tabled by the JURI Committee and then[br]another one by IMCO - especially the JURI
0:29:47.680,0:29:52.410
report contained paragraph 17, which calls[br]for better regulation of online
0:29:52.410,0:29:57.220
targeting and online advertisements, and[br]specifically calling for a ban of online
0:29:57.220,0:30:03.370
targeting, including a phase-out that[br]would then lead to a ban. We supported this
0:30:03.370,0:30:08.410
paragraph, which at the end was voted for[br]and is the part of the final report.
0:30:08.410,0:30:12.780
Nevertheless, we also do understand that[br]the wording of this paragraph has to be
0:30:12.780,0:30:18.000
more nuanced in the future. Before I go[br]into the details there, I just want to say
0:30:18.000,0:30:22.840
that this part never made it into the DSA.[br]So there is no ban on online targeting or
0:30:22.840,0:30:27.620
online advertisement of any sort, which to[br]us, to some extent, was certainly
0:30:27.620,0:30:33.210
disappointing too. We specifically would[br]call for a much stricter approach
0:30:33.210,0:30:38.740
when it comes to behavioral targeting as[br]well as cross-site tracking of online
0:30:38.740,0:30:43.890
users, but unfortunately, and as we[br]eventually also heard from Commissioner
0:30:43.890,0:30:48.930
Vestager, there was simply a lack of will or,[br]maybe, too much pressure from other
0:30:48.930,0:30:54.059
lobbies in Brussels. And this provision[br]never found its way to the final draft of
0:30:54.059,0:30:58.540
DSA. That's the current state of play; we[br]will see what we will manage to achieve
0:30:58.540,0:31:05.220
once the DSA will enter the European[br]Parliament. And finally, the law also
0:31:05.220,0:31:10.440
contains a specific provision on[br]recommender systems. So the way how the
0:31:10.440,0:31:16.630
content is being distributed across[br]platforms and how the data of users are
0:31:16.630,0:31:22.049
being abused for such a distribution and[br]personalization of user generated content.
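A small hypothetical sketch of the kind of user control being asked for here: a feed that exposes personalization as a switch and falls back to plain reverse-chronological ordering when the user opts out. Nothing of this shape is mandated by the DSA draft:

  from datetime import datetime

  posts = [
      {"id": 1, "posted": datetime(2020, 12, 15), "engagement_score": 0.9},
      {"id": 2, "posted": datetime(2020, 12, 20), "engagement_score": 0.2},
  ]

  def timeline(posts, personalized: bool):
      # Personalized: ranked by predicted engagement, the attention-economy
      # default. Opted out: reverse-chronological order, no profiling.
      if personalized:
          return sorted(posts, key=lambda p: p["engagement_score"], reverse=True)
      return sorted(posts, key=lambda p: p["posted"], reverse=True)

  print([p["id"] for p in timeline(posts, personalized=True)])   # [1, 2]
  print([p["id"] for p in timeline(posts, personalized=False)])  # [2, 1]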
0:31:22.049,0:31:27.110
In both cases, whether it's online[br]targeting or recommender systems within
0:31:27.110,0:31:31.740
the DSA, it goes only as far as[br]transparency requirements and
0:31:31.740,0:31:37.230
explainability, but it does very little[br]for returning that control and empowerment
0:31:37.230,0:31:42.600
back to the user. So whether a user can[br]opt in to or opt out from these algorithmic
0:31:42.600,0:31:48.630
curation models, or how they can actually be[br]optimized if the user decides to optimize them -
0:31:48.630,0:31:54.550
All of that is at the moment very much[br]left outside of the scope of DSA. And so
0:31:54.550,0:31:59.761
that's the issue of interoperability,[br]which is definitely one of the key
0:31:59.761,0:32:05.630
issues currently being discussed, and one[br]of the possible hopes in the future for
0:32:05.630,0:32:09.500
returning that control and empowerment[br]back to the user. And I keep repeating
0:32:09.500,0:32:14.370
this as a mantra, but it's truly the main[br]driving force behind all our initiatives
0:32:14.370,0:32:19.289
and the work we do in these fields. So the[br]user and their fundamental rights. And on
0:32:19.289,0:32:23.210
that note, I would like to hand over back[br]to Chris, who will explain the issue of
0:32:23.210,0:32:28.120
interoperability and how to actually[br]empower you as a user and to strengthen
0:32:28.120,0:32:35.850
the protection of fundamental rights[br]further. Chris, it's yours now. Christoph:
0:32:35.850,0:32:43.299
Thank you. I think we all know or feel[br]that the Internet has seen better times.
0:32:43.299,0:32:48.920
If you look back over the last 20 years,[br]we have seen that transformation was going
0:32:48.920,0:32:54.280
on from an open Internet towards a more[br]closed one - monopolization. Big platforms
0:32:54.280,0:32:59.230
have built entire ecosystems and it seems[br]that they alone decide who gets to use
0:32:59.230,0:33:04.260
them. Those platforms have strong network[br]effects that have pushed them
0:33:04.260,0:33:07.830
into a gatekeeper position,[br]which made it so easy for them to avoid
0:33:07.830,0:33:12.330
any real competition. This is especially[br]true when we think of social media
0:33:12.330,0:33:17.280
platforms. This year we celebrate the 20th[br]birthday of the e-Commerce Directive that
0:33:17.280,0:33:20.641
Eliska mentioned. The Internet bill that[br]will now be replaced by the Digital
0:33:20.641,0:33:25.870
Services Act. We believe it's a very good[br]time now to think and make a choice:
0:33:25.870,0:33:29.480
should we give even more power to the big[br]platforms that have created a lot of the
0:33:29.480,0:33:33.110
mess in the first place; or should we give[br]the power to the users, give the power
0:33:33.110,0:33:39.270
back to the people? For us, the answer is[br]clear. Big tech companies already employ a
0:33:39.270,0:33:43.500
wide array of technical measures. They[br]monitor, they remove, they disrespect user
0:33:43.500,0:33:48.340
privacy and the idea to turn them into[br]the Internet Police, with a special
0:33:48.340,0:33:54.250
license of censoring the speech of users,[br]will only solidify their dominance. So we
0:33:54.250,0:33:58.880
wouldn't like that. What we like is to[br]put users in charge over their online
0:33:58.880,0:34:04.770
experience. Users should, if we had a say,[br]choose for themselves which kind of
0:34:04.770,0:34:08.329
content they can see, what services they[br]can use to talk to their friends and
0:34:08.329,0:34:13.389
families. And we believe it's perhaps time[br]to break up those silos, those big
0:34:13.389,0:34:18.070
platforms have become, to end the dominance[br]over data. One element to achieve this
0:34:18.070,0:34:21.770
would be to tackle the targeted ads[br]industry, as Eliska mentioned it, perhaps
0:34:21.770,0:34:26.970
to give an actual right to users not to be[br]subject to targeted ads or to give more
0:34:26.970,0:34:30.960
choice to users to decide which content[br]they would like to see or not to see. In
0:34:30.960,0:34:35.050
the Digital Services Act, the Commission[br]went for transparency when it comes to ads
0:34:35.050,0:34:39.490
and better option for users to decide on[br]the recommended content, which is a start,
0:34:39.490,0:34:45.010
we can work with that. Another important[br]element to achieve user autonomy over data
0:34:45.010,0:34:49.639
is interoperability. If the European Union[br]really wants to break the power of those
0:34:49.639,0:34:53.820
data driven platforms that monopolize the[br]Internet, it needs regulations that
0:34:53.820,0:34:58.440
enable users to be in control over their[br]data. We believe that users should be able
0:34:58.440,0:35:03.710
to access data, to download data, to move,[br]manipulate their data as they see fit. And
0:35:03.710,0:35:07.920
part of that control is to port data[br]from one place to another. But data
0:35:07.920,0:35:11.560
portability, which we have under the[br]GDPR is not good enough. And we
0:35:11.560,0:35:15.592
see from the GDPR that it's not[br]working in practice. Users should be able
0:35:15.592,0:35:19.350
to communicate with friends across[br]platform boundaries, to be able to follow
0:35:19.350,0:35:23.480
their favorite content across different[br]platforms without having to create several
0:35:23.480,0:35:28.950
accounts. But to put it in other terms, if[br]you're upset with the absence of privacy on
0:35:28.950,0:35:33.300
Facebook or how the content is moderated[br]on Facebook, you should be able to just
0:35:33.300,0:35:37.080
take your data with you using portability[br]options and move to an alternative
0:35:37.080,0:35:41.040
platform that is a better fit, and this[br]without losing touch with your friends who
0:35:41.040,0:35:45.850
stay behind, who have not left the[br]incumbent big platform. So what we did for
0:35:45.850,0:35:50.190
Digital Services Act is to argue for[br]mandatory interoperability options that
0:35:50.190,0:35:55.760
would force Facebook to maintain APIs that[br]let users on other platforms exchange
0:35:55.760,0:36:01.330
messages and content with Facebook users.[br]However, if you look in the DSA, we see
0:36:01.330,0:36:04.970
that the commission completely missed the[br]mark on interoperability, which is
0:36:04.970,0:36:09.100
supposed to be dealt with by a related legal[br]act - now it gets complicated. It's the
0:36:09.100,0:36:15.000
Digital Markets Act, the DMA, another[br]beautiful acronym. The Digital Markets Act
0:36:15.000,0:36:19.500
wants to tackle certain harmful business[br]practices by those gatekeeper platforms,
0:36:19.500,0:36:24.490
the very large tech companies that control[br]what is called core services. The core
0:36:24.490,0:36:28.380
service is a search engine, a social[br]networking service, a messaging service,
0:36:28.380,0:36:33.410
operating systems and online[br]intermediation services. Think of how
0:36:33.410,0:36:37.200
Amazon controls access to customers for[br]merchants that sell on its platforms or
0:36:37.200,0:36:44.100
how the Android and iPhone app stores act as[br]chokepoints in delivering mobile software.
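Looping back to the interoperability argument from a moment ago: a deliberately simplified Python sketch of what a mandated cross-platform messaging interface could look like. The message format, class names and behavior are invented for illustration; no such API exists in the DSA or DMA:

  from dataclasses import dataclass

  @dataclass
  class InteropMessage:
      # A hypothetical minimal cross-platform message format.
      sender: str       # e.g. "alice@smallplatform.example"
      recipient: str    # e.g. "bob@bigplatform.example"
      body: str

  class Platform:
      # Each service keeps its own storage but accepts messages
      # from outside over a common, mandated interface.
      def __init__(self, domain: str):
          self.domain = domain
          self.inbox: list[InteropMessage] = []

      def receive(self, msg: InteropMessage) -> None:
          if msg.recipient.endswith("@" + self.domain):
              self.inbox.append(msg)

  big = Platform("bigplatform.example")
  big.receive(InteropMessage("alice@smallplatform.example",
                             "bob@bigplatform.example", "hi from outside"))
  print(len(big.inbox))  # 1: the sender never needed an account on "big"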
0:36:44.100,0:36:47.850
And there are many things we like in the new[br]proposal, the proposal of the Digital
0:36:47.850,0:36:53.240
Markets Act. For example, there's a ban on[br]mixing data in there: the DMA wants to
0:36:53.240,0:36:57.190
ban gatekeepers from mixing data from[br]third-party services with the data they collect
0:36:57.190,0:37:04.030
on their customers. Another rule is a ban on[br]cross-tying - the sort of practice where end
0:37:04.030,0:37:08.170
users must sign up for ancillary services.[br]So you should be able to use Android
0:37:08.170,0:37:12.280
without having to get a Google account, for[br]example. We believe that this is all
0:37:12.280,0:37:18.570
good, but the DMA like the DSA is very[br]weak on interoperability. What it does is
0:37:18.570,0:37:23.240
to focus on real time data portability[br]instead. So instead of having
0:37:23.240,0:37:27.000
interoperable services, users will only be[br]able to send the data from one service to
0:37:27.000,0:37:31.830
another like from Facebook to Diaspora,[br]meaning that you would end up having two
0:37:31.830,0:37:37.350
accounts instead of one or to quote Cory[br]Doctorow who spoke yesterday already:
0:37:37.350,0:37:42.580
"Users would still be subject to the[br]sprawling garbage novela of abusive legalese
0:37:42.580,0:37:47.260
Facebook lovably calls its terms of[br]service." We believe that this is not
0:37:47.260,0:37:54.359
good enough. And on the last slide, you see a[br]quote from Margrethe Vestager, who made
0:37:54.359,0:37:59.920
a very good statement last month, that we[br]need trustworthy services, fair use of
0:37:59.920,0:38:05.270
data and free speech and an interoperable[br]internet; we fully agree on that. And in
0:38:05.270,0:38:09.560
the next months and years, we will work on[br]making this actually happen. However, as you can
0:38:09.560,0:38:13.520
imagine, it will not be easy. We already[br]see that European Union member states
0:38:13.520,0:38:18.250
follow the trend that platforms should[br]systematically check for undesirable and
0:38:18.250,0:38:22.560
inciting content and share those data[br]with enforcement authorities, which is
0:38:22.560,0:38:27.240
even worse. We see an international trend[br]going on to move away from the immunity of
0:38:27.240,0:38:32.450
platforms for user content towards a more[br]active stance of those platforms. And we
0:38:32.450,0:38:37.780
see that recent terror attacks have fueled[br]ideas that monitoring is a good idea and
0:38:37.780,0:38:43.010
end to end encryption is a problem. So[br]whatever will be the result, you can bet
0:38:43.010,0:38:46.440
that European Union will want to make the[br]Digital Services Act and the Digital
0:38:46.440,0:38:50.800
Markets Act another export model. So this[br]time we want to get the numbers right in
0:38:50.800,0:38:54.700
parliament and the council, we want to[br]help members of parliament to press the
0:38:54.700,0:38:59.620
right buttons. And for all this we will[br]need your help, even if it means to learn
0:38:59.620,0:39:04.030
yet another acronym or several acronyms[br]after the GDPR. That's it from
0:39:04.030,0:39:07.770
our side - we are looking forward to the[br]discussion. Thank you.
0:39:14.584,0:39:18.830
Herald: OK, thank you Eliska and
0:39:18.830,0:39:25.660
Christoph. There are questions from the[br]internet, and the first one is: basically
0:39:18.830,0:39:25.660
we just had, as you mentioned in your[br]slides, Christoph, the Copyright in the
0:39:25.660,0:39:31.590
Digital Single Market Directive with both[br]accountability and liability provisions,
0:39:38.850,0:39:45.880
and you also briefly mentioned, I think,[br]the e-evidence proposal. How do all
0:39:45.880,0:39:49.400
these proposals relate to each other? And[br]especially for a layperson, that is not
0:39:49.400,0:39:57.030
into all the Brussels jargon.[br]Christoph: I think Eliska, you raised
0:39:57.030,0:40:01.700
your hand, don't you?[br]Eliska: ...more or less unintentionally,
0:40:01.700,0:40:10.050
but yeah, kind of that. I can start and[br]then let you Christoph to step in. Yeah,
0:40:10.050,0:40:15.369
that's a very, very good question. And[br]this is specifically due to the fact that
0:40:15.369,0:40:19.990
when you mention especially online[br]terrorist content regulation, but also
0:40:19.990,0:40:27.350
recently proposed interim regulation on[br]child sexual abuse - all of these, we
0:40:27.350,0:40:32.590
call them sectoral legislation, are kind of[br]departing a little bit from this
0:40:32.590,0:40:37.790
horizontal approach, meaning an approach[br]that tackles all categories of illegal
0:40:37.790,0:40:42.720
content in one way, instead of going after[br]specific categories such as online
0:40:42.720,0:40:47.510
terrorist content in separate ways. So[br]it's a little bit paradoxical seeing what
0:40:47.510,0:40:50.869
is currently also happening at the EU[br]level, because on one hand, we were
0:40:50.869,0:40:55.960
promised this systemic regulation that[br]will once and for all establish a harmonized
0:40:55.960,0:41:00.660
approach to combating illegal content[br]online - which is
0:41:00.660,0:41:05.510
specifically the DSA, the Digital Services[br]Act - and at the same time we still see
0:41:05.510,0:41:10.109
the European Commission allowing for these[br]legislative proposals, harmful to fundamental
0:41:10.109,0:41:14.890
rights, happening in these specific[br]sectors, such as the proposed online
0:41:14.890,0:41:20.560
terrorist content regulation or other[br]legislative acts seeking to somehow
0:41:20.560,0:41:24.980
regulate specific categories of user[br]generated content. This is quite puzzling
0:41:24.980,0:41:31.220
for us as digital rights activists too.[br]Very often, actually... so I would maybe
0:41:31.220,0:41:35.484
separate DSA from this for a moment and[br]say that what all of these sectoral
0:41:35.484,0:41:39.990
legislations have in common is,[br]first of all, continuing these negative
0:41:39.990,0:41:43.730
legislative trends that we already[br]described and that we constantly observe
0:41:43.730,0:41:49.160
in practice, such as shifting more and[br]more responsibility on online platforms.
0:41:49.160,0:41:53.040
And at the same time, what is also very[br]interesting, what they have in common is
0:41:53.040,0:41:57.680
the legal basis that they stand on, and[br]that's the legal basis that is rather
0:41:57.680,0:42:03.340
connected to the cooperation within the[br]digital single market, even though they
0:42:03.340,0:42:09.530
seek to tackle a very particular type of[br]category of content category, which is
0:42:09.530,0:42:15.270
manifestly illegal. So logically, they[br]should have the appropriate legal ground:
0:42:15.270,0:42:20.010
it should be something closer to[br]police and judicial cooperation, which we
0:42:20.010,0:42:24.580
don't see happening in practice,[br]specifically because there is this idea
0:42:24.580,0:42:28.820
that platforms are the best suited to[br]decide how the illegal content will be
0:42:28.820,0:42:33.100
tackled in the online space. They can be[br]the fastest, they can be the most
0:42:33.100,0:42:37.640
effective. So they should actually have[br]the main decision-making powers and are
0:42:37.640,0:42:42.070
forced into taking those responsibilities[br]which, however, ultimately, according to
0:42:42.070,0:42:47.220
the rule of law principle, should and have[br]to be in the hands of the state and public
0:42:47.220,0:42:54.300
authorities, preferably judicial[br]authorities. So I would say they are all
0:42:54.300,0:43:00.150
bad news for fundamental rights protection[br]of online users, civil rights
0:43:00.150,0:43:05.810
organizations, all of us that are on this[br]call today. We're fighting very hard also
0:43:05.810,0:43:09.630
against the online terrorist content[br]regulation. There was a lot of damage
0:43:09.630,0:43:14.800
control done, especially with the first[br]report that was tabled by the European
0:43:14.800,0:43:19.609
Parliament, and also now during the last[br]trilogue, since the negotiations seem to
0:43:19.609,0:43:24.609
be concluded and the outcome is not great.[br]It's far from ideal. And I'm worried that
0:43:24.609,0:43:28.619
with other sectoral legislative attempts[br]coming from the European Commission, we
0:43:28.619,0:43:32.690
might see the same outcome. It will be[br]very interesting to see how that will
0:43:32.690,0:43:37.130
actually then play together with the[br]Digital Services Act, which is trying to
0:43:37.130,0:43:43.020
do the exact opposite, to actually fix these[br]negative legislative efforts that we see
0:43:43.020,0:43:47.580
at the EU level with these sectoral[br]legislation, but also with the member
0:43:47.580,0:43:52.080
states at the national level. I could also[br]mention the European Commission's reaction
0:43:52.080,0:43:56.380
to some national legislative proposals.[br]But Christoph, I would leave that to you
0:43:56.380,0:44:01.710
and please step in.[br]Christoph: I think you explained it
0:44:01.710,0:44:05.960
perfectly, and the only thing I can[br]supplement here is that if you look at
0:44:05.960,0:44:09.810
this move from sectoral[br]legislation to horizontal
0:44:09.810,0:44:15.900
legislation, now back to sectoral[br]legislation - it's a problem, it's a mess.
0:44:15.900,0:44:24.280
First, the two sides are not very well[br]coordinated, which brings trouble for
0:44:24.280,0:44:28.880
legal certainty. It makes it very[br]hard for platforms to keep up.
0:44:28.880,0:44:33.520
And it's problematic for us in the[br]space. We are some sort of lobbyists as
0:44:33.520,0:44:37.530
well, just for the public interest. But you[br]will have to deal with copyright,
0:44:37.530,0:44:42.300
with CSAM, with TERREG, with end-to-end[br]encryption, DSA, DMA and 15 other
0:44:42.300,0:44:47.780
proposals that pop up, content type by[br]content type. It's very hard to have the capacity
0:44:47.780,0:44:51.311
ready to be early in the debate, and it's[br]so important to be early in the debate to
0:44:51.311,0:44:55.470
prevent that from happening. And I think[br]that's a huge challenge for us, to have
0:44:55.470,0:45:00.280
something to reflect on in the next[br]days: how can we join forces better in a
0:45:00.280,0:45:04.290
more systematic way in order to really[br]follow up on all those initiatives? That's
0:45:04.290,0:45:12.400
for me, a very problematic development.[br]Herald: So in summary it's a mess. So it
0:45:12.400,0:45:16.790
is related, but we can't explain how,[br]because it's such a mess. Fair enough.
0:45:16.790,0:45:23.480
I have another question for you, Eliska.[br]Someone was asking how the proposed
0:45:23.480,0:45:29.220
Good Samaritan clause works compared to..[br]how it currently works in Germany. But I
0:45:29.220,0:45:33.130
think it's a bit unreasonable to expect[br]everyone to know how it works in Germany.
0:45:33.130,0:45:38.650
I would rephrase it as: how does this[br]proposed Good Samaritan clause work
0:45:38.650,0:45:41.950
compared to how it is now under the[br]e-Commerce Directive?
0:45:41.950,0:45:50.840
Eliska: Thank you very much. Yeah, a[br]great question again. I think first, if
0:45:50.840,0:45:55.720
we put it into the context of EU law -[br]and apologies that I cannot really answer
0:45:55.720,0:45:59.770
how it compares to the German context[br]- I really don't dare to, I'm not a
0:45:59.770,0:46:05.050
German lawyer, so I wouldn't like to step[br]into those waters. But first of all, there
0:46:05.050,0:46:10.859
is no Good Samaritan clause per se within[br]the scope of the e-Commerce Directive. It did
0:46:10.859,0:46:16.680
not really exist within the law. And I'm[br]using the past tense now because the DSA is
0:46:16.680,0:46:22.250
trying to change that. So that level of[br]legal certainty was not really
0:46:22.250,0:46:26.170
there for the platforms. There was the[br]conditional model of liability, which
0:46:26.170,0:46:30.480
is still preserved within the regulation.[br]But if you think of a Good Samaritan
0:46:30.480,0:46:34.770
clause as we know it from Section[br]230 - or let's use that Good Samaritan
0:46:34.770,0:46:38.060
clause as an example, because the[br]e-Commerce Directive was actually drafted
0:46:38.060,0:46:43.000
as a response to the Communications Decency[br]Act, the legislation that put things
0:46:43.000,0:46:49.920
into motion. So that's the first[br]point. I explained at the
0:46:49.920,0:46:55.270
beginning of my presentation what was then[br]happening in the space of combating
0:46:55.270,0:47:00.570
illegal content at the EU level, and[br]especially I would refer to the
0:47:00.570,0:47:05.700
communication that the European Commission[br]published, I think, back in 2018, where it
0:47:05.700,0:47:11.850
actually encouraged and called on online[br]platforms to proactively engage with
0:47:11.850,0:47:17.390
illegal content and use these proactive[br]measures to actually seek an adequate
0:47:17.390,0:47:22.369
response to illegal content. Now, mix[br]that with this conditional model of
0:47:22.369,0:47:27.550
liability, which is of course triggered by[br]the platform obtaining actual knowledge,
0:47:27.550,0:47:32.730
and that created the perfect storm that[br]I already explained. So the platforms knew
0:47:32.730,0:47:37.120
that they are kind of pushed by the[br]legislature to actually seek these active
0:47:37.120,0:47:41.629
responses to illegal content, often[br]deploying automated measures. But they
0:47:41.629,0:47:46.710
didn't have any legal certainty or[br]security on their side that if they do so,
0:47:46.710,0:47:50.990
they won't ultimately end up being held[br]legally liable and facing legal consequences
0:47:50.990,0:47:55.650
as a result of obtaining actual knowledge[br]through those proactive measures that were
0:47:55.650,0:48:01.820
kind of the tool through which they could[br]actually obtain that knowledge. Now, what
0:48:01.820,0:48:08.290
the DSA does - and I think it's[br]Article 6 in the
0:48:08.290,0:48:13.550
Digital Services Act, if I'm not mistaken,[br]and I can even open it - is that it
0:48:13.550,0:48:20.280
basically says that platforms can use[br]these proactive measures or, you know,
0:48:20.280,0:48:28.339
continue using some tools that actually[br]seek to provide some responses to this
0:48:28.339,0:48:32.300
type of content without the fear of being[br]held liable. So it's an article which
0:48:32.300,0:48:36.680
has approximately, I think, two[br]paragraphs, but it's finally in the
0:48:36.680,0:48:40.510
legislation and that means that it will[br]help to reinforce the level of legal
0:48:40.510,0:48:45.210
certainty. I would also emphasize that[br]very often in Europe, when we discuss the Good
0:48:45.210,0:48:49.820
Samaritan clause - and Good Samaritan is[br]actually a very unfortunate term, because
0:48:49.820,0:48:55.000
it's very much connected to the American[br]legal tradition - when it's being mixed
0:48:55.000,0:48:58.910
up with the conditional model of liability[br]and with the prohibition of general
0:48:58.910,0:49:03.349
monitoring, which is still upheld - these[br]are the main principles of the European
0:49:03.349,0:49:08.580
intermediary liability law and the[br]regime that is applicable within the EU -
0:49:08.580,0:49:13.520
such a safeguard can actually be[br]beneficial and it hopefully won't lead to
0:49:13.520,0:49:18.619
this blanket immunity for online[br]platforms or to this idea that platforms
0:49:18.619,0:49:22.050
will be able to do whatever they want with[br]the illegal content without any public
0:49:22.050,0:49:25.250
scrutiny, because there are other[br]measures, safeguards and principles in
0:49:25.250,0:49:30.180
place as part of the conditional model of[br]liability that we have here in Europe. So
0:49:30.180,0:49:35.940
I'm sorry, maybe that was too complicated[br]a legalistic explanation there. But this is
0:49:35.940,0:49:40.290
how these provisions should work in[br]practice. We, of course, have to wait for
0:49:40.290,0:49:44.660
the implementation of the law and see how[br]that will turn out. But the main purpose
0:49:44.660,0:49:49.550
is that this legal certainty that was[br]lacking until now can finally come into
0:49:49.550,0:49:54.599
existence, which should help us to prevent[br]over-removal of legitimate speech from
0:49:54.599,0:50:00.550
online platforms.[br]Herald: OK, thank you. I have two other
0:50:00.550,0:50:05.050
questions from the Internet about[br]interoperability, and I suppose I should
0:50:05.050,0:50:12.570
look at Christoph for them. The last one[br]I'm going to ask first is: would such
0:50:12.570,0:50:18.060
interoperability make it much more[br]difficult to combat harassment and
0:50:18.060,0:50:22.500
stalking on the Internet? How do you[br]police that kind of misbehavior if it's
0:50:22.500,0:50:29.840
across different platforms that are forced[br]to interoperate and also be conduits for
0:50:29.840,0:50:35.670
such bad behavior? And I'll come to the[br]earlier question once you've answered this
0:50:35.670,0:50:40.130
question, Christoph.[br]Christoph: It's a pretty good question.
0:50:40.130,0:50:47.380
First, to understand our vision on[br]interoperability is to understand that we
0:50:47.380,0:50:53.770
would like to have it as an obligation on[br]large platforms and a right
0:50:53.770,0:50:59.920
of smaller platforms, actually, to make[br]use of interoperability. So it should not
0:50:59.920,0:51:05.330
be among the big platforms. So small[br]platforms should be able to connect to the
0:51:05.330,0:51:11.550
big platforms. And second, we believe it[br]will help and not make it worse because we
0:51:11.550,0:51:15.590
have now a problem of hate speech, we have[br]now a problem of a lack of privacy, we
0:51:15.590,0:51:23.210
have now a problem of the attention[br]industry that works with, you know,
0:51:23.210,0:51:29.320
certain pictures put in certain frames to[br]trigger the attention of users, because users
0:51:29.320,0:51:31.950
don't have a choice of the content[br]moderation practices, users don't have a
0:51:31.950,0:51:37.310
choice to see which kind of content should[br]be shown. And users don't have options to
0:51:37.310,0:51:42.580
regulate their privacy. The idea of more[br]competitors would be exactly that I can
0:51:42.580,0:51:51.160
move to a space where I'm not harassed[br]and not made subject to certain content
0:51:51.160,0:51:57.160
that hurts my feelings. Right. And at that[br]moment I get control. I can choose a
0:51:57.160,0:52:02.330
provider that gives me those options, and[br]we would even like to go a step further.
0:52:02.330,0:52:07.490
Back end interoperability was a start. We[br]believe if users want to, they should be
0:52:07.490,0:52:11.160
able to delegate a third-party company or[br]piece of third-party software to
0:52:11.160,0:52:14.850
interact with the platform on their[br]behalf. So users would have the option to
0:52:14.850,0:52:19.070
see a news feed in a different order,[br]calibrate their own filters on
0:52:19.070,0:52:23.099
misinformation. So in this sense,[br]interoperability can be a great tool,
0:52:23.099,0:52:28.570
actually, to tackle hate speech and those[br]sorts of negative developments. Of course,
0:52:28.570,0:52:33.900
there is a risk to it. I think the risk[br]comes rather from the data industry side
0:52:33.900,0:52:38.500
again: we need to take care not to[br]place another data-selling
0:52:38.500,0:52:43.770
industry on top of the one we already face.[br]But for this, we have options as well to
0:52:43.770,0:52:46.910
prevent that from happening. But to answer[br]the question, we believe interoperability
0:52:46.910,0:52:52.609
is actually a tool to escape the[br]negative developments you mentioned.
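To make the delegation idea Christoph describes concrete - a piece of third-party software re-ordering and filtering a feed on the user's behalf - here is a minimal sketch in Python. It is purely illustrative: the FeedItem type, the sample posts and the blocked-terms filter are invented for this example and do not correspond to any existing platform API.

# Hypothetical sketch of a delegated third-party client. FeedItem and the
# sample data are invented for illustration; no real platform API is assumed.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class FeedItem:
    author: str
    text: str
    posted_at: datetime
    engagement_score: float  # what an ad-driven ranking would optimize for

def chronological(feed):
    # User-chosen ordering: newest first instead of engagement-ranked.
    return sorted(feed, key=lambda item: item.posted_at, reverse=True)

def apply_user_filter(feed, blocked_terms):
    # User-calibrated filter, e.g. phrases the user flags as misinformation.
    return [item for item in feed
            if not any(term in item.text.lower() for term in blocked_terms)]

if __name__ == "__main__":
    feed = [
        FeedItem("alice", "Miracle cure revealed!", datetime(2020, 12, 28, 9, 0), 9.7),
        FeedItem("bob", "Notes from the DSA panel", datetime(2020, 12, 28, 11, 30), 2.1),
    ]
    for item in apply_user_filter(chronological(feed), {"miracle cure"}):
        print(item.posted_at, item.author, "-", item.text)

In this sketch the ranking and the filter live with the user's chosen tool rather than with the platform, which is the kind of control the answer above argues for.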
0:52:52.609,0:52:58.850
Herald: Critical counter-question from me[br]then: aren't you actually advocating for
0:52:58.850,0:53:04.330
just rolling your own recommendation engines[br]to be able to do so? Can't you achieve
0:53:04.330,0:53:09.950
that without interoperability?[br]Christoph: Sure. A counter-question back: Do
0:53:09.950,0:53:15.300
you think an average user can accomplish[br]that quite easily? You know, like when we
0:53:15.300,0:53:21.040
look at the Internet through the lens of[br]market competition, then we see that it is
0:53:21.040,0:53:27.090
the dominance of platforms over data that[br]has created those spaces, those walled
0:53:27.090,0:53:31.182
gardens where users have the feeling they are[br]trapped in and cannot escape from. And there
0:53:31.182,0:53:36.426
are so many alternative options that[br]cannot get off the ground because users feel
0:53:36.426,0:53:40.880
trapped, don't want to leave their friends[br]behind and don't have options, actually to
0:53:40.880,0:53:45.410
have a better moderation system. Of course,[br]you can be creative and, you know, use
0:53:45.410,0:53:49.650
plugins and whatever you see fit, but you[br]need to stay within the platform barriers.
0:53:49.650,0:53:54.460
But we would like to enable users to actually[br]leave the walled garden, go to another
0:53:54.460,0:53:58.599
place, but still stay in touch with[br]friends who have made the choice to remain
0:53:58.599,0:54:01.420
there. And I think that's perhaps the[br]difference to what you had in mind.
0:54:01.420,0:54:05.880
Herald: I have a follow-up question. Well,[br]another question from the Internet,
0:54:05.880,0:54:12.130
regardless of interoperability, and that[br]is, historically speaking, as soon as the
0:54:12.130,0:54:16.470
big players get involved in certain[br]standards, they tend to also shape policy
0:54:16.470,0:54:23.070
by being involved in that. How would that[br]be different in the case of
0:54:23.070,0:54:27.640
interoperability? And specifically,[br]the person who asked the
0:54:27.640,0:54:31.160
question mentioned that Mastodon probably[br]flourishes because nobody else was
0:54:31.160,0:54:35.900
involved in setting that standard.[br]Christoph: Ah it's an excellent question.
0:54:35.900,0:54:41.859
And we struggled with the question of[br]standards ourselves in our policy paper,
0:54:41.859,0:54:46.860
which is our recommendations for the European[br]Union to enact certain provisions in the
0:54:46.860,0:54:55.970
new Digital Services Act. We abstain from[br]asking to establish new standards like API
0:54:55.970,0:55:00.133
standards. We believe it's a bad idea to[br]regulate technology like that. What we
0:55:00.133,0:55:05.460
want is for big platforms to just[br]offer interoperability however they see fit.
0:55:05.460,0:55:11.170
We don't want to have a standard that can[br]again be either monopolized or lobbied by
0:55:11.170,0:55:15.290
the big platforms. Because then we end up[br]with the standards we already see
0:55:15.290,0:55:21.010
which we don't like. But it's a good[br]question. And what we did with our
0:55:21.010,0:55:24.501
policy principles on interoperability is[br]give kind of food for thought on how we
0:55:24.501,0:55:31.110
believe the end version should look,[br]but many questions remain and we
0:55:31.110,0:55:35.510
don't know exactly how to get[br]there.
0:55:35.510,0:55:39.450
Herald: Yeah, I'm sorry for sticking to the[br]topic of interoperability, because most
0:55:39.450,0:55:44.040
questions are actually about that. One of[br]the other questions is how do we prevent
0:55:44.040,0:55:49.480
this from getting messed up like it[br]happens with PSD2? And for the audience
0:55:49.480,0:55:54.950
that don't know about PSD2 - PSD2 is a[br]Directive that forced banks to open up
0:55:54.950,0:55:58.930
APIs to other financial service providers,[br]which is also interoperability between
0:55:58.930,0:56:03.079
platforms, in this case for banking[br]platforms, which comes with all sorts of
0:56:03.079,0:56:07.740
privacy questions that weren't completely[br]thought through when that legislation came
0:56:07.740,0:56:12.209
about. Sorry for this long-winded[br]introduction, Christoph, but I think it was
0:56:12.209,0:56:16.140
needed for people that don't know what[br]PSD2 means.
0:56:16.140,0:56:20.080
Christoph: It's a good question.[br]Interestingly, we never used PSD2 or the
0:56:20.080,0:56:23.760
Telecommunications Act - both have[br]interoperability options - as negative
0:56:23.760,0:56:28.540
examples; we always used them as examples that,[br]hey, it's already possible. So you don't
0:56:28.540,0:56:32.570
have an excuse to say it's impossible to put[br]it in the law. What is true is that there
0:56:32.570,0:56:37.013
is a lot of mess around it. The question[br]of how to avoid the mess is a question
0:56:37.013,0:56:42.500
of... Netzpolitik again. It's the question[br]of whether policymakers are actually
0:56:42.500,0:56:47.150
listening to us or listening to industry[br]lobbyists. So the one who raised the
0:56:47.150,0:56:50.690
question is absolutely right - there's a[br]huge risk for every topic we talk about.
0:56:50.690,0:56:56.040
Whether it's interoperability, user[br]control over content, targeted ads,
0:56:56.040,0:56:59.829
liability - everything that we believe[br]should be in the law, of course, could be
0:56:59.829,0:57:05.640
hijacked, could be redesigned in a way[br]that will lead to more problems
0:57:05.640,0:57:10.830
rather than fewer. So, indeed, for every[br]policy question we raise, we need to ask
0:57:10.830,0:57:14.840
ourselves: is it worth the fight to risk[br]opening Pandora's box? Do we make
0:57:14.840,0:57:22.220
it worse? What we said on that front is:[br]we are happy to apply pressure, and what
0:57:22.220,0:57:25.040
we need to do in the next year is to[br]convince them that we are the right people
0:57:25.040,0:57:30.849
to talk to. And that's perhaps the challenge:[br]how to make that clear to policy
0:57:30.849,0:57:35.440
makers. So those who asked the questions, I[br]think, should help us and come to
0:57:35.440,0:57:39.540
Brussels, to the Parliament, and tell MEPs[br]how it's going to work.
0:57:39.540,0:57:48.820
Herald: On that note, a question to both of[br]you. Citizen enforcement - I prefer the
0:57:48.820,0:57:55.349
term citizen over users. Would it be helpful[br]to push for amendments in the parliament
0:57:55.349,0:58:05.470
for at least the targeting points you both[br]mentioned before? And if so, how?
0:58:05.470,0:58:14.496
Eliska: So I guess that I will start just[br]so Christoph can rest a little. So the
0:58:14.496,0:58:18.350
question was whether it would be useful to[br]push for those amendments. Was that
0:58:18.350,0:58:20.710
right?[br]Herald: For amendments that cover the
0:58:20.710,0:58:25.970
targeting of citizens.[br]Eliska: Absolutely. So there is, of
0:58:25.970,0:58:30.630
course, a short and a long answer, as to every[br]question. And so the short answer would be
0:58:30.630,0:58:37.900
yes, provided that the wording of such an[br]amendment is precise and nuanced. We
0:58:37.900,0:58:42.090
are still working out our positioning on[br]online targeting, and I think we can all
0:58:42.090,0:58:46.300
now name those practices that we don't[br]want to see being deployed by platforms
0:58:46.300,0:58:50.900
and where we can actually imagine a proper[br]ban on such practices. We have recently
0:58:50.900,0:58:57.810
published one of our blog posts where we[br]actually unfold the way of thinking at
0:58:57.810,0:59:02.150
Access Now currently - you know, how we[br]are brainstorming about this whole issue.
0:59:02.150,0:59:06.740
And as I said, it's especially that[br]targeting that uses behavioral data of
0:59:06.740,0:59:11.360
users, citizens - or maybe let's go for[br]individuals, because they are also obliged
0:59:11.360,0:59:18.550
to protect the rights of individuals that[br]are not their citizens. So that's
0:59:18.550,0:59:21.840
definitely one form where we can[br]see and will be supporting the
0:59:21.840,0:59:26.130
ban, and a possible phase-out that will[br]actually lead to a ban. The same goes for
0:59:26.130,0:59:34.200
the cross-site tracking of users, given[br]the way users' data are being abused
0:59:34.200,0:59:39.580
again as an integral part of the[br]business models of these platforms and so
0:59:39.580,0:59:44.230
on and so forth. So that's one of the[br]directions we will definitely be taking.
0:59:44.230,0:59:48.799
And again, we are inviting all of you to[br]help us out, to brainstorm together with
0:59:48.799,0:59:53.020
us, to assess different options,[br]directions that we should take into
0:59:53.020,0:59:58.470
consideration and not forget about. But I[br]personally think that this will be one of
0:59:58.470,1:00:04.360
the main battles when it comes to DSA,[br]where we will definitely need to be on the
1:00:04.360,1:00:10.360
same page, harmonize and join[br]forces, because the DSA gives us a good ground
1:00:10.360,1:00:16.040
at the moment, but it doesn't go far[br]enough. So, yes, definitely the answer is
1:00:16.040,1:00:19.660
yes, provided that we have a very[br]nuanced position, so we know what we are
1:00:19.660,1:00:24.579
asking for and we are taking into[br]consideration also those other aspects
1:00:24.579,1:00:30.240
that could eventually play out badly in[br]practice. So good intentions are not
1:00:30.240,1:00:35.109
enough when it comes to DSA.[br]Herald: Thank you. I think we are slightly
1:00:35.109,1:00:42.770
over time, but I've been told beforehand[br]it's OK to do so for a few minutes and
1:00:42.770,1:00:47.170
there are three questions that are open. One[br]of them I will answer myself. That
1:00:47.170,1:00:50.480
is basically: have the member states[br]responded? And the answer to that is no.
1:00:50.480,1:00:54.430
The member states have not taken any[br]position. And the two others, I think, are
1:00:54.430,1:00:59.859
quite interesting and important from[br]a techy perspective. One is: is there
1:00:59.859,1:01:03.320
anything you see that might affect current[br]decentralized platforms like the
1:01:03.320,1:01:11.250
Fediverse, Mastodon? And the other is:[br]will any review of the data protection,
1:01:11.250,1:01:17.440
sorry, the Database Protection Directive,[br]affect search engines and the
1:01:17.440,1:01:23.430
interaction of these again?[br]Christoph: Perhaps I jump in and Eliska,
1:01:23.430,1:01:28.329
you take over. First, member states have[br]actually given an opinion on the DSA; there
1:01:28.329,1:01:33.350
have been one or two official submissions,[br]plus joint letters, plus discussions in
1:01:33.350,1:01:39.150
council when the DSA was presented.[br]I have some nice protocols which showed
1:01:39.150,1:01:44.170
the different attitudes of member states[br]towards it. So for us, it also means we
1:01:44.170,1:01:48.970
need to work straight away with the[br]council to ensure that the package will
1:01:48.970,1:01:55.680
be good. What was the question? Ah, yes, I[br]think the answer to the question depends
1:01:55.680,1:02:00.520
on what lawyers call the material scope[br]of application - whether it would apply to
1:02:00.520,1:02:05.760
those platform models at all. But[br]Eliska, you can help me out here. We have
1:02:05.760,1:02:08.930
always criticized the e-Commerce[br]Directive because it was not quite clear how
1:02:08.930,1:02:15.190
it would relate to, first, non-profit[br]platforms. And many of these alternative
1:02:15.190,1:02:19.680
platforms are like that, because there was[br]this issue of providing a service against
1:02:19.680,1:02:22.660
remuneration, and it's not quite clear[br]what that means. Would it apply to Wikipedia
1:02:22.660,1:02:26.609
if it gets, like, donations? Would it have[br]applied to a blogger who has, like,
1:02:26.609,1:02:31.640
pop-ups, ads or something like that? So[br]that's, I think, one huge question. And
1:02:31.640,1:02:37.680
the second question is in how far those[br]new due diligence obligations would force
1:02:37.680,1:02:44.369
alternative platforms and governance models[br]to redesign their interfaces. And those are
1:02:44.369,1:02:48.430
open questions for us. We have not analyzed[br]that in detail, but we see that... we are
1:02:48.430,1:02:52.839
worried that it would not only impact the[br]large platforms, but many others as well.
1:02:52.839,1:02:58.020
What do you think Eliska?[br]Eliska: Yeah, I can only agree and
1:02:58.020,1:03:03.620
especially regarding the non-profit[br]question. Yeah, this has also
1:03:03.620,1:03:09.859
always been one of our main asks for[br]non-profit organizations. And actually it's
1:03:09.859,1:03:13.910
not quite clear how that will play out now[br]in practice, or where the DSA stands,
1:03:13.910,1:03:17.450
because at the moment it actually speaks[br]about online platforms and then it speaks
1:03:17.450,1:03:30.829
about very large online platforms, but[br]what that will mean and how it will impact
1:03:30.829,1:03:35.610
non-profit organizations, whether these are[br]bloggers or organizations like us, civil
1:03:35.610,1:03:43.760
rights organizations - that remains to be[br]seen. I also know that the European Court
1:03:43.760,1:03:49.100
of Human Rights in its jurisprudence tried[br]to establish some principles for
1:03:49.100,1:03:56.410
non-profits, since Delfi AS versus Estonia[br]and then with the MTE decision and the
1:03:56.410,1:04:01.090
Swedish case that followed afterwards. But[br]I'm not sure how well it was actually
1:04:01.090,1:04:05.430
elaborated later on. But this is something[br]we will be definitely looking at and
1:04:05.430,1:04:14.030
working on further. And regarding the[br]impact on the smaller players and also
1:04:14.030,1:04:18.349
the interface idea, this is still[br]something we are actually also wondering
1:04:18.349,1:04:26.330
about and thinking how this will turn out[br]in practice. And we are hoping to actually
1:04:26.330,1:04:30.760
develop our positioning further on these[br]issues as well, because we actually
1:04:30.760,1:04:36.440
started working - all of us - on the DSA and[br]on our recommendations already, I think a
1:04:36.440,1:04:40.810
year ago, maybe a year and a half ago, when[br]we were working just with the leaks that
1:04:40.810,1:04:45.080
Politico or other media platforms in[br]Brussels were actually sharing with us.
1:04:45.080,1:04:48.551
And we were working with the bits and[br]pieces and trying to put our thinking
1:04:48.551,1:04:52.570
together. Now we have the draft and I[br]think we need to do another round of very
1:04:52.570,1:04:58.099
detailed thinking about what will ultimately[br]be our position and what will be those
1:04:58.099,1:05:01.480
recommendations and amendments in the[br]European Parliament we will be supporting
1:05:01.480,1:05:06.749
and pushing for. So it's a period of[br]hard work for all of us. Not to mention
1:05:06.749,1:05:14.700
that, as I always say, you know, we stand[br]against a huge lobbying power that is going
1:05:14.700,1:05:21.671
to be and is constantly being exercised by[br]these private actors. But I also have to
1:05:21.671,1:05:25.270
say that we had a very good cooperation[br]with the European Commission throughout
1:05:25.270,1:05:28.731
the process. And I think that I can say[br]that on behalf of all of us, that we feel
1:05:28.731,1:05:33.630
that the European Commission really listened[br]this time. So, yeah, more question marks
1:05:33.630,1:05:37.980
than answers here from me, I think.[br]Herald: This is fine with me, not knowing
1:05:37.980,1:05:42.299
something is fine. I think we've[br]definitely run out of time. Thank you both
1:05:42.299,1:05:51.410
for being here. Well, enjoy the 2D online[br]world and the Congress. Thank you. That
1:05:51.410,1:05:53.990
wraps the session up for us. Thank you all.[br]Christoph: Thank you.
1:05:53.990,1:05:57.380
Eliska: Thank you very much. Bye.
1:05:57.380,1:06:00.363
music
1:06:00.363,1:06:36.350
Subtitles created by c3subtitles.de[br]in the year 2021. Join, and help us!