rC3 preroll music

Herald: And the talk that is about to begin now is by Christoph Schmon from EFF and Eliska Pirkova from Access Now, and they will be talking about what is probably the biggest legislative initiative since the GDPR in Brussels. It's called the Digital Services Act. And on to you two. You're muted.
Eliska Pirkova: I realized. Hello! First of all, thank you very much for having us today. It's a great pleasure to be part of this event. I think for both of us it's the first time we are actually joining this year, so it's fantastic to be part of this great community. Today we are going to talk about an EU legislative proposal that is causing a lot of noise around Europe, but not only in Europe, also beyond: the Digital Services Act legislative package. We already know today that this package actually consists of two acts, the Digital Services Act and the Digital Markets Act, and both of them will significantly change the regulation of online platforms, with a specific focus on very large online platforms, also often referred to as gatekeepers: those who hold a lot of economic dominance, but also a lot of influence and control over users' rights and public discourse.

So I'm going to start by giving you a quick introduction: what the fuss is about, what the DSA actually is, why we are interested in it and keep talking about it, and why this legislation will keep us preoccupied for years to come. The DSA was already announced two years ago as part of the European Union's digital strategy, appointed as one of the actions that strategy would consist of. It was the promise the European Commission gave us at that time: to create a systemic regulation of online platforms that hopefully places users and their rights at the center of the upcoming legislation. The promise behind the DSA is that the ad-based Internet, and I'm speaking now about ad tech and online targeting, the Internet as we knew it, will be replaced by something that puts the user, user control and users' rights first. Both of these laws, drafted and implemented right, should actually achieve that goal in the future.

Now, before the DSA was drafted, the so-called e-Commerce Directive was in place, which established the basic principles, especially in the field of content governance. I won't go into details on that because I don't want to make this too legalistic. But ultimately the DSA is supposed not to completely replace that legislation but to build on top of it; it created the ground and the main legal regime for regulating user-generated content in Europe for almost 20 years. The DSA and DMA will seek to harmonize the rules addressing problems such as online hate speech and disinformation, but they also finally put emphasis on increased, meaningful transparency in online advertising and in the way content is actually distributed across platforms, and they will develop a specific enforcement mechanism to look into it.

Now, before I go into the details of the DSA, why it matters, and whether we actually need such a big new legislative reform from the European Commission, I want to unpack for you a little what this legislative package actually consists of. As I already mentioned: two regulations. The regulation is the strongest legal instrument the European Commission has in its hands, supposed to achieve the highest level of harmonization across the member states. We can all imagine how difficult that will be to achieve, especially in the realm of freedom of expression and particular categories of user-generated content, which are so deeply context dependent. Everything related to content moderation and content curation will mainly be in the realm of the Digital Services Act. The second regulation, the Digital Markets Act, will look specifically at the dominance of online platforms: economic dominance, the competitive environment for smaller players, fairness in competition. It will also establish a list of do's and don'ts for gatekeeper platforms, that is, the platforms that actually hold a relevant dominance. Based on these new proposals, we know that these platforms are mainly called very large online platforms; this is exactly how the legislation now refers to gatekeepers.

One more point I want to make is that the DSA and the DMA were launched on the 15th of December 2020, so it was literally a Christmas present given to the digital rights community by the European Commission, and a long anticipated one. The work on the DSA started much earlier, however. The Electronic Frontier Foundation, as much as Access Now, together with EDRi, worked very hard to come up with priorities and recommendations for what we would like to see enshrined in these laws, because from the beginning we understood the importance and the far-reaching consequences this legislation will have, not only inside the European Union but also beyond.

And that brings me to the final introductory point I want to make before I hand over to Chris: why, and whether, we actually need the DSA. We strongly believe there is a strong justification and good reason to establish a systemic regulation of online platforms in order to secure users' fundamental rights, empower them, and protect our democratic discourse. This is due to the fact that for many years we have been witnessing quite bad regulatory practices in platform governance coming from member states, and Chris will provide a very concrete example in that regard, but also coming from the European Commission itself, mainly the proposed online terrorist content regulation, for instance, or, as we all remember, the story of copyright, which Chris will discuss a little further.
We saw how not only the member states but also the European Commission, or the European Union, in order to establish some order in the digital space, started pushing the state's obligations, and especially the state's positive obligation to protect users' human rights, into the hands of private online platforms, which started replacing state actors and public authorities in the online space. Platforms started assessing content and its legality, deciding under very short time frames what should stay online and what should go offline, with no public scrutiny or transparency about the practices they deployed, and keep deploying to this day. Of course, under the threat of legal liability, platforms often had to rely, and still have to rely, on content recognition technologies to remove user-generated content. A typical example is the French Avia law, which will be mentioned again during this presentation. The typical time frames extend from one hour to 24 hours, which is extremely short, especially if a user would like to appeal such a decision or seek an effective remedy.

At the same time, due to the lack of harmonization and the lack of a proper set of responsibilities that should lie in the hands of these online platforms, there was a lack of legal certainty, which only reinforced the vicious circle of removing more and more online content in order to escape any possible liability. And to this day, due to the lack of transparency, we lack any evidence- or research-based policymaking, because platforms do not want to share with public authorities, or inform them, what they actually do with content and how they moderate it. The information we receive in their transparency reports is usually quantity oriented instead of quality oriented: it focuses on how much content is being removed and how fast, which is not enough to create laws that can provide more sustainable solutions.

And ultimately, as we all agree, the core issue doesn't lie that much in how content is moderated, but in how content is distributed across platforms. At the core of their business models stands the attention economy and the way sensational content is often amplified in order to prolong the attention span of users who visit the platforms on a regular basis. I'm unpacking quite a few issues here, and this is supposed to be just a quick introductory remark. I will now hand over to Chris, who will elaborate on all these points a little further, and then we will take a look at and unpack a few quite important parts of the DSA that we feel should be prioritized in this debate at the moment. Chris, over to you.
Christoph Schmon: Hi everybody. I'm quite sure that many of you have noticed that there's a growing appetite on the side of the European Union to regulate the Internet by using online platforms as the helping hands to monitor and censor what users can say, share and do online. As you see on the slide, the first highlight of this growing appetite was copyright upload filters, which are supposed to stop copyright-protected content online.
Thousands of people, old and young, went onto the streets to demonstrate for a free Internet, to demonstrate against technical measures that would turn the Internet into some sort of censorship machine. We made the point then, and we continue making the point now, that upload filters are prone to error, that upload filters cannot understand context, and that they are unaffordable for all but the largest tech companies, which happen to all be based in the United States. But as you well know, policymakers would not listen, and Article 13 of the Copyright Directive was adopted by a small margin in the European Parliament, also because some Members of the European Parliament had trouble pressing the right buttons. But I think it's important for us to understand that the fight is far from over. The member states must now implement the directive in a way that is not at odds with fundamental rights, and we argue that mandated automated removal technologies are always in conflict with fundamental rights. This includes data protection rights: it is a data protection right not to be made subject to automated decision-making online if it involves your personal data and if such decision-making has a negative effect.

All those legal arguments aside, I think the most worrying experience with upload filters is that they have spillover effects on other initiatives. Sure, if it works for copyright-protected content, it may well work for other types of content, right? Except that it doesn't. Many now consider it a clever idea that web forums should proactively monitor and check all sorts of user content, be it communications, pictures or videos, and that they should use filters to take it down or to prevent its re-upload. An example of such a spillover effect that Eliska mentioned is the draft regulation on terrorist-related content. It took a huge joint effort by civil society groups and some Members of Parliament to reject the worst of the text. In the recent negotiations we at least managed to get the requirement to use upload filters out, but there is still a twenty-four-hour removal obligation that may nudge platforms to employ those filters nevertheless.

And we see, particularly in the national states, that they are very fond of the idea that platforms, rather than judges, should be the new law enforcers. There are now several states in the European Union that have adopted laws that would either oblige or nudge platforms to monitor users' speech online. First up was the German NetzDG, which set out systematic duties for platforms. Then we had the French law Avia, which copy-pasted the NetzDG and made it worse. And last we have the Austrian hate speech bill, which is a mix of both the German and the French proposals. They all go far beyond copyright content and focus on hate speech and all sorts of content that may be considered problematic in those respective countries, not necessarily in other countries.

And this brings me to the next problem: how do we deal with content that is illegal in one country but legal in another? A recent Court of Justice ruling confirmed that a court of a small state like Austria can order platforms not only to take down defamatory content globally, but also to take down identical or equivalent material using automated technologies. For us this is a terrible outcome that will lead to a race to the bottom, where the countries with the least speech-friendly laws can superimpose their laws on every other state in the world. We really believe that all this nonsense has to stop. It's time to acknowledge that the Internet is a global space, a place of exchange and creativity, and a place where civil liberties are supposed to exist.

So we are now fighting against all those national initiatives. We had a great first victory when we helped bring down the French law Avia, the bill that had imposed a duty on platforms to check and remove potentially illegal content within 24 hours. Before the Conseil constitutionnel, France's constitutional court, we argued that this would push platforms to constantly check what users post, and that if platforms face high fines, they would of course rather be motivated to block as much of the content as possible. We made the point that this would be against the Charter of Fundamental Rights, including freedom of information and freedom of expression. And it was a great victory for us that the Conseil constitutionnel struck down the Avia bill and followed our argument, as you see on the slide. We also now see pushback, at least against the update of the German NetzDG, which would have provided new access rights for law enforcement authorities.
This and other provisions are considered unconstitutional, and as far as I understand, and perhaps listeners can correct me, the German president has refused to sign the bill. The Austrian bill is going down a similar path: it recently got a red light from Brussels, as the Commission considers it in conflict with EU principles, also thanks to a joint effort by epicenter.works. And this shows that something positive is going on; the pushback against automated filter technologies is a positive development.

But it's important to understand that those national initiatives are not just purely national attempts to regulate hate speech. They are attempts by EU member states to make their own bills, as bad as they are, some sort of prototype for EU-wide legislation, a prototype for the Digital Services Act. And as you know, national member states have a say in EU lawmaking: their voices are represented in the Council of the EU, and the European Commission will be disincentivized from proposing anything that would be voted down by the Council. I think that's a nice takeaway from today: lawmaking in national member states is not an isolated event. It's always political, it's always Netzpolitik.

The good news is that, as far as the European Commission's proposal for a Digital Services Act is concerned, it has not followed in the footsteps of those badly designed and misguided bills. It has respected our input, the input from Access Now, from EFF, from the EDRi network, from academics and many others, that some key principles should not be removed, like the principle that liability for speech should rest with the speaker. The DSA is also a red light to general monitoring of users' content, and there are no short deadlines in there to remove content that might be illegal. Instead, the Commission gives more slack to platforms to take down posts in good faith, which we call the EU-style Good Samaritan clause.

Looking through the global lens of lawmaking, it's very fascinating to see that while the United States is flirting with the idea of moving away from the Good Samaritan principle in Section 230 of the Communications Decency Act (the idea being that platforms can voluntarily remove content without being held liable for it), the European Union flirts with the idea of introducing it, to give platforms more options to act. That being said, the major difference between the US and the EU is that in Europe, platforms can be held liable the moment they become aware of the illegality of content. That's an issue, because the Digital Services Act has now introduced a relatively sophisticated system of user notifications, complaint mechanisms and dispute resolution options, which all leads, or could lead, to such awareness of illegality. It's not quite clear to us how platforms will make use of voluntary measures to remove content. That being said, we think that the Commission's proposal could have been much worse. And the Parliament's reports on the Digital Services Act have demonstrated that the new Parliament is a bit better than the old one: it has a lot of respect for fundamental rights, and many Members of Parliament are quite fond of the idea of protecting civil liberties online. But we know that this was only the start, and we know that we need another joint effort to ensure that users are not monitored and are not at the mercy of algorithmic decision-making. And I think Eliska is now going to explain a bit more about all this.
Eliska: Thank you. Thank you very much, Chris. So we can now move further and unpack a few provisions relevant to everything that has already been mentioned, mainly in the realm of content moderation and content curation, which ultimately lies at the core of the Digital Services Act. And maybe, so as not to make it all too abstract, I also have the printed version of the law here with me. If you look at it, it's quite an impressive piece of work that the European Commission did there. But I have to say that even though it's a great start, it still contains a lot of imperfections. I will try to summarize those for you now, especially in the light of our own positioning, and I mean civil society's in general, because I believe that on all those points we have pretty solid agreement among each other: what we were hoping the Digital Services Act would do, what it actually does, and where we see that we will need to continue working very closely, especially with the Members of the European Parliament, once the draft enters the Parliament, which should happen relatively soon.

As I already briefly mentioned at the beginning, the DSA distinguishes between online platforms, which are defined within the scope of the law, and very large online platforms, which is exactly the scope that all large online gatekeepers fall into. The DSA then specifically distinguishes between the obligations or responsibilities of these actors, assigning some to all of them, including online platforms, and extending some specifically due to the dominance and power these online gatekeepers hold. This is mainly the case for the transparency requirements: there is a set of transparency requirements that apply to online platforms, and then a specific additional set of transparency requirements for very large online platforms. What the DSA does especially, and this is the bit which is extremely relevant for content moderation practices, is attempt to establish a harmonized model for a notice-and-action procedure for allegedly illegal content.
One of our red lines, before I go into the details on this, was that the DSA should touch, or try to regulate, only allegedly illegal content and stay away from vaguely defined content categories such as potentially harmful but legal content. There are other ways legislation can tackle such content, mainly through meaningful transparency requirements, accountability, and tackling issues within content recommender systems and algorithmic curation. But we didn't want this specific category of content to be included within the scope of the DSA. This is due to the fact that vaguely defined terms that find their way into legislation always lead to human rights abuse in the future. I could give you examples from Europe, such as the concept of online harms in the UK, but as global organizations, both of us also often see how weak terminology can quickly lead to over-criminalization of speech or to suppressing dissent.

Now, if we go back to the harmonized notice-and-action procedure, what does that actually mean in practice? As Christoph already mentioned, Europe has a so-called conditional model of intermediary liability, established by the initial legal regime, the e-Commerce Directive. Article 14 of the e-Commerce Directive states that unless the platform holds actual knowledge (according to the wording of the DSA, it is now actual knowledge or awareness) of the presence of illegal content on its platform, it cannot be held liable for such content. We had been asking for a harmonized notice-and-action procedure across the EU for a while, precisely because we wanted to see reinforced legal certainty. Lack of legal certainty has often translated into over-removal of even legitimate content from platforms, with no public scrutiny.

The DSA does a good job; it's a good starting point that attempts to establish such a harmonized procedure, but it is still lagging behind on many aspects that we consider important in order to strengthen the protection of users' fundamental rights. One of them, for instance, is that the harmonized notice-and-action procedure, as envisioned by the DSA, is not specifically tailored to different types and categories of user-generated content. As we know, many categories of content are deeply context dependent, linked to the historical and sociopolitical context of the member state in question. And due to the reliance on automated measures, which are usually context blind, we are worried that if notice and action doesn't reflect this aspect in any further way, we will again end up with over-removal of content.

Probably another huge issue currently lacking in the draft: even though the DSA is trying to create proper appeal and enforcement mechanisms for users, and the draft currently contains different alternative dispute settlement options, there is no possibility for the content provider, the user that uploads the filter... sorry. laughs That was a nice Freudian slip there. For the user that actually uploaded the content, there is no way to directly file a counter-notification about the notified content that belongs to them. Nor are platforms obliged to send a notification to the user prior to any action being taken against that particular notified content. These are, for us, procedural safeguards of fairness that users should have, and currently they are not reflected in the draft.
However, this is a good start, and it's something we were pushing for. But I think there are many more aspects that these notice-and-action procedures will need to contain in order to truly put users first. Now, the notice-and-action procedure mainly focuses on illegal content. But there are places in the draft where potentially harmful content which is still legal, that is, content that violates the platforms' terms of service, is mentioned. So for us it's not exactly clear at the moment how that will work in practice. That's why we often use the phrase that is also on the slide: good intentions with imperfect solutions.
However, I want to emphasize again that this is just the beginning, and we will still have time and space to work very hard on this. Another rather novel aspect that the DSA brings is the already mentioned Good Samaritan clause, and I tend to call it the EU model or EU version of the Good Samaritan clause. The Good Samaritan clause originates in Section 230 of the Communications Decency Act, as Christoph already mentioned, but within the European realm it goes hand in hand with the conditional model of liability, which is preserved within the DSA draft. It was also one of our main asks to preserve this conditional model of liability, and it's great that this time the European Commission really listened. Why do we consider the Good Samaritan clause important? In the past, such a safeguard wasn't enshrined in the law; the Commission just somehow vaguely promised that if platforms proactively deployed measures to fight the spread of illegal content, they wouldn't be held liable, without acknowledging that through the use of such so-called proactive measures a platform could in theory gain actual knowledge of the existence of such content on its platform, which would immediately trigger legal liability. This threat of liability often pushed platforms into a corner, so they would rather remove content very quickly than face more serious consequences later on. That's why we see importance in the Good Samaritan clause, or the European model of it, and we are glad that it is currently part of the draft.

One of the biggest downfalls, or one of the biggest disappointments, when the DSA finally came out on the 15th of December, was to see that it's still the online platforms that will remain in charge when it comes to assessing the legality of content and deciding what content should be restricted and removed from a platform and what should remain available. We often emphasize that it's very important that the legality of content is assessed by independent judicial authorities, in line with rule-of-law principles. We also do understand that such a solution creates a big burden on the judicial structures of member states. Many member states see it as a very expensive solution; they don't always want to create a special network of courts, or e-courts, or other forms of judicial review of allegedly illegal content. But we still wanted to see more public scrutiny, because for us this is truly just a reaffirmation of the already existing status quo: in many jurisdictions within the EU, and in the EU itself, it is still the online platforms that will call the final shots.

What, on the other hand, is a positive outcome that we were also pushing hard for are the requirements for meaningful transparency: to understand better what platforms actually do with the individual pieces of content being shared on them, and how transparency can ultimately empower users. Now, I want to emphasize this, because this is still an ongoing debate and we will touch upon those issues in a minute: we don't see transparency as a silver bullet for issues such as the amplification of potentially harmful content, nor do we think transparency will be enough to actually hold platforms accountable. Absolutely not. It will never be enough, but it is a precondition for us to actually seek such solutions in the future. The DSA contains specific requirements for transparency, as I already mentioned: a set of requirements that will be applicable largely to all online platforms, and then a specific set of requirements on top of it that will be applicable only to very large online platforms, the online gatekeepers. We appreciate the effort, we see that the list is very promising, but we still think it could be more ambitious. Both EFF and Access Now have put forward specific sets of requirements for meaningful transparency in our positions, and so have EDRi and other civil society and digital rights activists in this space.

And the final point that I'm going to make is the so-called Pandora's box of online targeting and recommender systems. Why do I refer to this as a Pandora's box? When the European Parliament published its initiative reports on the DSA (there are two reports, one tabled by the JURI committee and another by the IMCO committee), especially the JURI report contained paragraph 17, which calls for better regulation of online targeting and online advertisements, and specifically calls for a ban on online targeting, including a phase-out that would then lead to a ban. We supported this paragraph, which in the end was voted for and is part of the final report.
Nevertheless, we also understand that the wording of the article has to be more nuanced in the future. Before I go into the details there, I just want to say that this part never made it into the DSA. So there is no ban on online targeting or online advertising of any sort, which to us was, to some extent, certainly disappointing too. We would specifically call for a much stricter approach when it comes to behavioral targeting, as well as cross-site tracking of online users. But unfortunately, as we eventually also heard from Commissioner Vestager, there was simply a lack of will, or maybe too much pressure from other lobbies in Brussels, and this provision never found its way into the final draft of the DSA. That's the current state of the art; we will see what we manage to achieve once the DSA enters the European Parliament.

And finally, the law also contains a specific provision on recommender systems: the way content is distributed across a platform and how the data of users is abused for such distribution and personalization of user-generated content. In both cases, whether it's online targeting or recommender systems, the DSA goes only as far as transparency requirements and explainability; it does very little to return control and empowerment to the user. Whether a user can opt out of these algorithmic curation models, and how they can actually be adjusted if the user decides to adjust them: all of that is at the moment very much left outside the scope of the DSA. And then there is the issue of interoperability, which is definitely one of the key issues currently being discussed, and which offers possible hope in the future for returning control and empowerment to the user. I keep repeating this like a mantra, but it is truly the main driving force behind all our initiatives and the work we do in these fields: the user and their fundamental rights. And on that note, I would like to hand back over to Chris, who will explain the issue of interoperability and how to actually empower you as a user and strengthen the protection of fundamental rights further. Chris, it's yours now.

Christoph:
Thank you. I think we all know or feel
that the Internet has seen better times.
-
If you look back over the last 20 years,
we have seen a transformation going
-
on from an open Internet towards a more
closed one: monopolization. Big platforms
-
have built entire ecosystems and it seems
that they alone decide who gets to use
-
them. Strong network effects have pushed
those platforms into a gatekeeper
-
position, which has made it easy for
them to avoid
-
any real competition. This is especially
true when we think of social media
-
platforms. This year we celebrate the 20th
birthday of the e-Commerce Directive that
-
Eliska mentioned, the Internet law that
will now be replaced by the Digital
-
Services Act. We believe it's a very good
time now to think and make a choice:
-
should we give even more power to the big
platforms that have created a lot of the
-
mess in the first place; or should we give
the power to the users, give the power
-
back to the people? For us, the answer is
clear. Big tech companies already employ a
-
wide array of technical measures. They
monitor, they remove, they disrespect user
-
privacy, and the idea of turning them into
the Internet police, with a special
-
license to censor the speech of users,
will only solidify their dominance. So we
-
wouldn't like that. What we would like is
to put users in charge of their online
-
experience. Users should, if we had a say,
choose for themselves what kind of
-
content they see and what services they
use to talk to their friends and
-
families. And we believe it's perhaps time
to break up the silos those big
-
platforms have become, to end their dominance
over data. One element to achieve this
-
would be to tackle the targeted ads
industry, as Eliska mentioned: perhaps
-
to give users an actual right not to be
subjected to targeted ads, or to give
-
users more choice in deciding which content
they would like to see or not to see. In
-
the Digital Services Act, the Commission
went for transparency when it comes to ads
-
and better options for users to decide on
recommended content, which is a start;
-
we can work with that. Another important
element to achieve user autonomy over data
-
is interoperability. If the European Union
really wants to break the power of those
-
data-driven platforms that monopolize the
Internet, it needs regulation that
-
enables users to be in control of their
data. We believe that users should be able
-
to access their data, to download it, to
move and manipulate it as they see fit. And
-
part of that control is being able to port
data from one place to another. But data
-
portability as we have it under the
GDPR is not good enough. And we
-
see from the GDPR that it's not
working in practice. Users should be able
-
to communicate with friends across
platform boundaries, to be able to follow
-
their favorite content across different
platforms without having to create several
-
accounts. To put it in other terms: if
you are upset with the absence of privacy on
-
Facebook, or with how content is moderated
on Facebook, you should be able to just
-
take your data with you using portability
options and move to an alternative
-
platform that is a better fit, and this
without losing touch with your friends who
-
stay behind, who have not left the
incumbent big platform. So what we did for
-
Digital Services Act is to argue for
mandatory interoperability options that
-
would force Facebook to maintain APIs that
let users on other platforms exchange
-
messages and content with Facebook users.
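The mandatory interoperability ask described here can be pictured with a tiny code sketch. Everything in it is a hypothetical illustration (the `Platform` class, the `deliver` endpoint, the example domains); it is not a real Facebook API or anything the DSA mandates, just the shape of the obligation: accept messages for your users no matter which platform the sender is on.

```python
# A purely illustrative sketch of a cross-platform messaging API:
# a large platform accepts well-formed messages from users of other,
# smaller services. All names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Message:
    sender: str      # e.g. "alice@smallplatform.example"
    recipient: str   # e.g. "bob@bigplatform.example"
    body: str

@dataclass
class Platform:
    domain: str
    # maps local user name -> list of received Messages
    inboxes: dict = field(default_factory=dict)

    def deliver(self, msg: Message) -> None:
        # The interoperability obligation in miniature: accept messages
        # addressed to our users regardless of the sender's platform.
        user, _, domain = msg.recipient.partition("@")
        if domain != self.domain:
            raise ValueError("recipient is not on this platform")
        self.inboxes.setdefault(user, []).append(msg)

big = Platform("bigplatform.example")
big.deliver(Message("alice@smallplatform.example",
                    "bob@bigplatform.example",
                    "Hi Bob, greetings from the small platform!"))
# bob's inbox now holds one message from a user on another platform
```

In practice this is roughly what federated protocols such as ActivityPub already do between Mastodon servers; the policy ask is that dominant platforms be required to expose an equivalent.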
However, if you look at the DSA, you see
-
that the Commission completely missed the
mark on interoperability, which is
-
supposed to be dealt with by a related
legal act. Now it gets complicated: it's the
-
Digital Markets Act, the DMA, another
beautiful acronym. The Digital Markets Act
-
wants to tackle certain harmful business
practices by those gatekeeper platforms,
-
the very large tech companies that control
what is called core services. The core
-
service is a search engine, a social
networking service, a messaging service,
-
an operating system or online
intermediation services. Think of how
-
Amazon controls access to customers for
merchants that sell on its platforms, or
-
how the Android and iPhone app stores act
as chokepoints in delivering mobile software.
-
And there are many things we like in the
new proposal, the proposal of the Digital
-
Markets Act. For example, there's a ban on
mixing data in there: it wants to
-
ban gatekeepers from mixing data from
data brokers with the data they collect
-
on their customers. Another rule is a ban on
cross-tying, the sort of practice where end
-
users must sign up for ancillary services.
So you should be able to use Android
-
without having to get a Google account, for
example. We believe that this is all
-
good, but the DMA, like the DSA, is very
weak on interoperability. What it does is
-
to focus on real-time data portability
instead. So instead of having
-
interoperable services, users will only be
able to send their data from one service to
-
another, like from Facebook to Diaspora,
meaning that you would end up having two
-
accounts instead of one. Or, to quote Cory
Doctorow, who spoke yesterday:
-
"Users would still be subject to the
sprawling garbage novella of abusive legalese
-
Facebook laughably calls its terms of
service." We believe that this is not
-
good enough. And on the last slide, you see a
quote from Margrethe Vestager, who made
-
a very good statement last month: that we
need trustworthy services, fair use of
-
data, free speech and an interoperable
internet; we fully agree on that. And in
-
the next months and years, we will work to
make this actually happen. However, you can
-
imagine, it will not be easy. We already
see that European Union member states
-
follow the trend that platforms should
systematically check for undesirable and
-
unlawful content and share that data
with law enforcement authorities, which is
-
even worse. We see an international trend
going on to move away from the immunity of
-
platforms for user content towards a more
active stance by those platforms. And we
-
see that recent terror attacks have fueled
ideas that monitoring is a good idea and
-
end to end encryption is a problem. So
whatever will be the result, you can bet
-
that European Union will want to make the
Digital Services Act and the Digital
-
Markets Act another export model. So this
time we want to get the numbers right in
-
Parliament and the Council; we want to
help Members of Parliament press the
-
right buttons. And for all this we will
need your help, even if it means learning
-
yet another acronym, or several acronyms,
after the GDPR. That's it from
-
our side; we are looking forward to the
discussion. Thank you.
-
Herald: OK, thank you Eliska and
-
Christoph. There are questions from the
internet, and the first one is basically:
-
we already have, as you mentioned in your
slides, Christoph, the Copyright in the
-
Digital Single Market Directive with both
accountability and liability provisions,
-
and you also briefly mentioned, I think,
the e-evidence proposal. How do all
-
these proposals relate to each other,
especially for a layperson who is not
-
into all the Brussels jargon?
Christoph: I think Eliska, you raised
-
your hand, didn't you?
Eliska: ...more or less unintentionally,
-
but yeah, kind of that. I can start and
then let you, Christoph, step in. Yeah,
-
that's a very, very good question. And
this is specifically due to the fact that
-
when you mention especially the online
terrorist content regulation, but also the
-
recently proposed interim regulation on
child sexual abuse: all of these we
-
call sectoral legislation, departing
a little bit from the
-
horizontal approach, meaning an approach
that tackles all categories of illegal
-
content in one way instead of going after
specific categories, such as online
-
terrorist content, in separate ways. So
what is currently happening at the EU
-
level is a little bit paradoxical, because
on one hand, we were promised this
-
systemic regulation that will once and for
all establish a harmonized approach to
-
combating illegal content online, which is
specifically the DSA, the Digital Services
-
Act, and at the same time we still see the
European Commission allowing these
-
legislative proposals that are harmful to
fundamental rights in specific
-
sectors, such as the proposed online
terrorist content regulation or other
-
legislative acts seeking to somehow
regulate specific categories of
user-generated content. This is quite puzzling
-
for us as digital rights activists too.
So I would maybe
-
separate the DSA from this for a moment and
say that what all of these sectoral
-
laws have in common is:
first of all, continuing these negative
-
legislative trends that we already
described and that we constantly observe
-
in practice, such as shifting more and
more responsibility onto online platforms.
-
At the same time, what is also very
interesting is the legal basis
-
that they stand on: a legal basis that is
rather connected to cooperation within the
-
digital single market, even though they
seek to tackle a very particular
-
category of content, which is
manifestly illegal. So logically, the
-
appropriate legal ground should be
something closer to
-
police and judicial cooperation, which we
-
don't see happening in practice,
specifically because there is this idea
-
that platforms are the best suited to
decide how the illegal content will be
-
tackled in the online space. They can be
the fastest, they can be the most
-
effective. So they should actually have
the main decision-making powers, and they are
-
forced into taking on responsibilities
which ultimately, according to
-
the rule of law principle, should and have
to be in the hands of the state and public
-
authorities, preferably judicial
authorities. So I would say they are all
-
bad news for the fundamental rights
protection of online users. Civil rights
-
organizations, all of us that are on this
call today, are fighting very hard
-
against the online terrorist content
regulation. There was a lot of damage
-
control done, especially with the first
report that was tabled by the European
-
Parliament, and also now during the last
trilogue, since the negotiations seem to
-
be concluded and the outcome is not great;
it's far from ideal. And I'm worried that
-
with other sectoral legislative attempts
coming from the European Commission, we
-
might see the same outcome. It will be
very interesting to see how that will
-
actually then play together with the
Digital Services Act, which is trying to
-
do the exact opposite: to actually fix these
negative legislative efforts that we see
-
at the EU level with this sectoral
legislation, but also with the member
-
states at the national level. I could also
mention the European Commission's reaction
-
to some national legislative proposals.
But Christoph, I would leave that to you
-
and please step in.
Christoph: I think you explained it
-
perfectly, and the only thing I can
add here is that if you look at
-
this move from sectoral legislation
to horizontal
-
legislation, and now back to sectoral
legislation: it's a problem, it's a mess.
-
First, the two sides are not very well
coordinated, which brings trouble for
-
legal certainty. It makes it very
troublesome for platforms to follow.
-
And it's problematic for us, for us in the
space. We are some sort of lobbyists as
-
well, just for the public interest. But you
have to deal with copyright,
-
with CSAM, with TERREG, with end-to-end
encryption, DSA, DMA and 15 other
-
proposals that pop up, content type by content
type. It's very hard to have the capacity
-
ready to be early in the debate, and it's
so important to be early in the debate to
-
prevent that from happening. And I think
that's a huge challenge for us, to have
-
something for us to reflect on in the next
days. How can we join forces better in a
-
more systematic way in order to really
follow up on all those initiatives? That's
-
for me, a very problematic development.
Herald: So in summary it's a mess. So it
-
is related, but we can't explain how,
because it's such a mess. Fair enough.
-
I have another question for you, Eliska.
Someone was asking how the proposed
-
Good Samaritan clause works compared to..
how it currently works in Germany. But I
-
think it's a bit unreasonable to expect
everyone to know how it works in Germany.
-
I would rephrase it as: how does this
proposed Good Samaritan clause work
-
compared to how it is now under the
e-Commerce Directive?
-
Eliska: Thank you very much. Yeah, a
great question again. Let me first
-
put it into the context of EU law,
and apologies that I cannot really
-
compare it with the German context;
I really don't dare to, I'm not a
-
German lawyer, so I wouldn't like to step
into those waters. But first of all, there
-
is no Good Samaritan clause per se within
the scope of the e-Commerce Directive. It did
-
not really exist within the law. And I'm
using the past tense now because the DSA is
-
trying to change that. So that level of
legal certainty was not really
-
there for the platforms. There was the
conditional model of the liability, which
-
is still preserved within the regulation.
But if you think of the Good Samaritan
-
clause as we know it from Section
230, let's use that Good Samaritan
-
clause as an example, because the
e-Commerce Directive was actually drafted
-
as a response to the Communications Decency
Act, the legislation that put things
-
into motion. So that's the first
important point. I explained at the
-
beginning of my presentation what was then
happening in the space of combating
-
illegal content at the EU level, and
especially I would refer to the
-
communication that the European Commission
published, I think, back in 2018, where it
-
actually encouraged and called on online
platforms to proactively engage with
-
illegal content and use these proactive
measures to actually seek an adequate
-
response to illegal content. Now, mix
that with the conditional model of
-
liability, which is of course triggered by
the platform obtaining actual
-
knowledge, and that created the perfect storm
I already explained. So the platforms knew
-
that they were kind of pushed by the
legislature to actually seek these active
-
responses to illegal content, often
deploying automated measures. But they
-
didn't have any legal certainty or
security on their side that, if they did so,
-
they wouldn't end up ultimately being held
legally liable and facing legal consequences
-
as a result of obtaining actual knowledge
through those proactive measures, which were
-
precisely the tool through which they
could obtain that knowledge. Now, what
-
the DSA does is simply state, and I
think it's Article 6 of the
-
Digital Services Act, if I'm not mistaken,
and I can even open it, it
-
basically says that platforms can use
these proactive measures or, you know,
-
continue using some tools that actually
seek to provide some response to this
-
type of content without the fear of being
held liable. So it's an article which
-
has approximately, I think, two
paragraphs, but it's finally in the
-
legislation and that means that it will
help to reinforce the level of legal
-
certainty. I would also emphasize that
in Europe, Good Samaritan is
-
actually a very unfortunate term, because
it's very much connected to the American
-
legal tradition. But when such a clause is
combined with the conditional model of
-
liability and with the prohibition of general
monitoring, which is still upheld (these
-
are the main principles of European
intermediary liability law and the
-
regime that is applicable within the EU),
such a safeguard can actually be
-
beneficial, and it hopefully won't lead to
-
a blanket immunity for online
platforms or to this idea that platforms
-
will be able to do whatever they want with
the illegal content without any public
-
scrutiny, because there are other
measures, safeguards and principles in
-
place as part of the conditional model of
liability that we have here in Europe. So
-
I'm sorry, maybe that was too complicated
a legalistic explanation. But this is
-
how these provisions should work in
practice. We, of course, have to wait for
-
the implementation of the law and see how
that will turn out. But the main purpose
-
is that this legal certainty that was
lacking until now can finally come into
-
existence, which should help us prevent
over removal of legitimate speech from
-
online platforms.
Herald: OK, thank you. I have two other
-
questions from the Internet about
interoperability, and I suppose I should
-
look at Christoph for them. The last one
I'm going to ask first is: would such
-
interoperability make it much more
difficult to combat harassment and
-
stalking on the Internet? How do you
police that kind of misbehavior if it's
-
across different platforms that are forced
to interoperate and also be conduits for
-
such bad behavior? And I'll come to the
earlier question once you've answered this
-
question, Christoph.
Christoph: It's a pretty good question.
-
First, to understand our vision of
interoperability is to understand that we
-
would like obligations on the large
platforms and a right
-
for smaller platforms, actually, to make
use of interoperability. So it should not
-
just be among the big platforms: small
platforms should be able to connect to the
-
big platforms. And second, we believe it
will help and not make it worse because we
-
have now a problem of hate speech, we have
now a problem of a lack of privacy, we
-
have now a problem of the attention
industry that works with, you know,
-
certain pictures put in certain frames to
trigger the attention of users, because users
-
don't have a choice of content
moderation practices, users don't have a
-
choice over which kind of content should
be shown, and users don't have options to
-
control their privacy. The idea of more
competitors is exactly that I can
-
move to a space where I'm not harassed
and not made subject to certain content
-
that hurts me. Right. At that
moment I get control: I can choose a
-
provider that gives me those options. And
we would like to go even a step further.
-
Back-end interoperability is a start. We
believe if users want to, they should be
-
able to delegate a third-party company or
a piece of third-party software to
-
interact with the platform on their
behalf. So users would have the option to
-
see their news feed in a different order,
or calibrate their own filters on
-
misinformation. So in this sense,
interoperability can be a great tool,
-
actually, to tackle hate speech and these
sorts of negative developments. Of course,
-
there is a risk to it. I think the risk
comes rather from the data industry side
-
again: we need to take care not to
pile yet another data-selling
-
industry on top of the one we already face.
But for this, we have options as well to
-
avoid that from happening. But to answer
the question, we believe interoperability
-
is a tool actually to escape from the
negative developments you had mentioned.
-
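The delegated-client idea described here can likewise be pictured with a small hypothetical sketch: a piece of third-party software fetches the user's feed and re-ranks it chronologically while applying the user's own keyword filters, instead of the platform's engagement ranking. The feed format and the function name are invented for illustration, not any real platform API.

```python
# Hypothetical sketch of a delegated third-party client: it re-ranks a
# user's feed newest-first and drops items matching the user's own
# blocklist, overriding the platform's engagement-based ordering.
# The feed structure and names are illustrative only.
from datetime import datetime

def reorder_feed(feed, blocked_keywords):
    """Return the feed newest-first, minus items the user filters out."""
    kept = [item for item in feed
            if not any(kw in item["text"].lower() for kw in blocked_keywords)]
    return sorted(kept, key=lambda item: item["posted_at"], reverse=True)

# A toy feed as the platform might rank it (engagement first):
feed = [
    {"text": "Engagement bait you never asked for",
     "posted_at": datetime(2020, 12, 1)},
    {"text": "A friend's holiday photos",
     "posted_at": datetime(2020, 12, 28)},
]

filtered = reorder_feed(feed, blocked_keywords=["engagement bait"])
# filtered now contains only the friend's post, newest first
```

The point of the policy ask is that such a tool could run outside the platform's walls, on the user's behalf, rather than only as a plugin the platform can shut off.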
Herald: Critical counter-question from me
then: aren't you actually advocating for
-
just rolling your own recommendation engines
to be able to do so? Can't you achieve
-
that without interoperability?
Christoph: Sure. A counter-question back: do
-
you think an average user can accomplish
that quite easily? You know, when we
-
look at the Internet through the lens of
market competition, we see that it is
-
the dominance of platforms over data that
has created those spaces, those walled
-
gardens where users have the feeling they are
trapped and cannot escape. And there
-
are so many alternative options that
cannot get off the ground because users feel
-
trapped, don't want to leave their friends
behind and don't have options, actually to
-
have a better moderation system. Of course,
you can be creative and, you know, use
-
plugins and whatever you see fit, but you
need to stay within the platform barriers.
-
But we would like to enable users to actually
leave the walled garden, go to another
-
place, but still stay in touch with
friends who have made the choice to remain
-
there. And I think that's perhaps the
difference to what you had in mind.
-
Herald: I have a follow up question. Well,
another question from the Internet,
-
regardless of interoperability, and that
is, historically speaking, as soon as the
-
big players get involved in certain
standards, they tend to also shape policy
-
by being involved in that. How would that
be different in the case of
-
interoperability? And the person who asked
the question specifically mentioned
-
that Mastodon probably
flourishes because no big player was
-
involved in setting that standard.
Christoph: Ah it's an excellent question.
-
And we struggled with the question of
standards ourselves in our policy paper,
-
which is our recommendations for the European
Union to enact certain provisions in the
-
new Digital Services Act. We abstained from
asking to establish new standards, like API
-
standards. We believe it's a bad idea to
regulate technology like that. What we
-
want is that big platforms simply
offer interoperability however they see fit.
-
We don't want to have a standard that can
be either again monopolized or lobbied by
-
the big platforms, because then we end up
with the standards we already see,
-
which we don't like. But it's a good
question. And what we did with our
-
policy principles on interoperability is to
give some food for thought on how we
-
believe the end version should look,
but many questions remain, and we
-
don't know exactly the way to get
there.
-
Herald: Yeah, I'm sorry for sticking to the
topic of interoperability, because most
-
questions are actually about that. One of
the other questions is how do we prevent
-
this from getting messed up like it
happens with PSD2? And for the audience
-
that don't know about PSD2 - PSD2 is a
Directive that forced banks to open up
-
APIs to other financial service providers,
which is also interoperability between
-
platforms, in this case for banking
platforms, which comes with all sorts of
-
privacy questions that weren't completely
thought through when that legislation came
-
about. Sorry for this long-winded
introduction, Christoph, but I think it was
-
needed for people that don't know what
PSD2 means.
-
Christoph: It's a good question.
Interestingly, we never used PSD2 or the
-
Telecommunications Act as negative
examples; because both have interoperability
-
options, we always used them as examples that,
hey, it's already possible. So you don't
-
have an excuse to say it's impossible to put
it in the law. What is true is that there
-
is a lot of mess around it. The question
of how to avoid the mess is a question
-
of... Netzpolitik again. So the question
is whether policy makers are actually
-
listening to us or listening to industry
lobbyists. So the one who raised the
-
question is absolutely right: there's a
huge risk with every topic we talk about.
-
Whether it's interoperability, user
control over content, targeted ads, or
-
liability, everything that we believe
should be in the law could, of course, be
-
hijacked, could be redesigned in a way
that leads to more problems rather than
-
fewer. So, indeed, for every
policy question we raise, we need to ask
-
ourselves: is it worth the fight to risk
opening Pandora's box? Do we make
-
it worse? What we said on that front is that
we are happy to apply pressure, and what
-
we need to do in the next year is to
convince them that we are the right people
-
to talk to. And that's perhaps a challenge:
how to make that clear to policy
-
makers. So those who asked the questions, I
think they should help us, come to
-
Brussels to the Parliament, and tell MEPs
how it's going to work.
-
Herald: On that note a question to both of
you. Citizen enforcement, and I prefer the
-
term citizens over users. Would it be helpful
to push for amendments in the Parliament
-
for at least the targeting points you both
mentioned before? And if so, how?
-
Eliska: So I guess that I will start just
so Christoph can rest a little. So the
-
question was whether it would be useful to
push for those amendments? Is that
-
right?
Herald: For amendments that cover the
-
targeting of citizens.
Eliska: Absolutely. So there is, of
-
course, a short and a long answer to every
question. The short answer would be
-
yes, provided that the wording of such an
amendment is precise and nuanced. We
-
are still working out our positioning on
online targeting, and I think we can
-
now name those practices that we don't
want to see being deployed by platforms
-
and where we can actually imagine a proper
ban on such practices. We have recently
-
published one of our blog posts where we
actually unfold the way Access Now is
-
currently thinking, you know, how we
are brainstorming about this whole issue.
-
And as I said, especially
targeting that uses behavioral data of
-
users, citizens, or maybe let's go for
individuals, because they are also obliged
-
to protect the rights of individuals that
are not their citizens. So that's
-
definitely one area where we can
definitely see, and will be supporting, a
-
ban, or a possible phase-out that will
actually lead to a ban. The same goes for
-
cross-site tracking of users, given
how user data is being abused
-
again as an integral part of the
business models of these platforms, and so
-
on and so forth. So that's one of the
directions we will definitely be taking.
-
And again, we are inviting all of you to
help us out, to brainstorm together with
-
us, to assess different options,
directions that we should take into
-
consideration and not forget about. But I
personally think that this will be one of
-
the main battles when it comes to DSA,
where we will definitely need to be on the
-
same page and harmonize and join
forces, because the DSA gives us a good ground
-
at the moment, but it doesn't go far
enough. So, yes, definitely the answer is
-
yes, provided that we have a very
nuanced position, so we know what we are
-
asking for and we take into
consideration also those other aspects
-
that could eventually play out badly in
practice. So good intentions are not
-
enough when it comes to the DSA.
Herald: Thank you. I think we are slightly
-
over time, but I've been told beforehand
it's OK to do so for a few minutes and
-
there are three questions that are open. One
of them I will answer myself. That
-
is basically: have the member states
responded? And the answer to that is no.
-
The member states have not taken any
position. And two others, I think are
-
quite interesting and important from
a techy perspective. One is: is there
-
anything you see that might affect current
decentralized platforms like the
-
Fediverse, Mastodon? And the other is:
will any review of the data protection,
-
sorry, the Database Protection Directive,
affect search engines and the
-
interoperability of these again?
Christoph: Perhaps I jump in and Eliska,
-
you take over. First, member states have
actually given opinions on the DSA: there
-
have been one or two official submissions,
plus joint letters, plus discussions in
-
Council when the DSA was presented.
I have some nice protocols, which show
-
the different attitudes of member states
towards it. So for us, it also means we
-
need to work straight away with the
Council to ensure that the package will
be good. What was the question? Ah, yes, I
-
think the answer to the question depends
on what lawyers call the material scope
-
of application: whether it would apply to
those platform models at all. But
-
Eliska, you can help me out here. You have
-
always criticized the e-Commerce
Directive because it was not quite clear how
-
it would relate to, first, non-profit
platforms. And many of these alternative
-
platforms are like that, because there was
this issue of providing a service against
-
remuneration, and it's not quite clear
what that means. Would it apply to Wikipedia
-
if it gets, like, donations? Would it have
applied to a blogger who has, like,
-
pop-up ads or something like that? So
that's, I think, one huge question. And
-
the second question is to what extent those
new due diligence obligations would force
-
alternative platforms and governance models
to redesign their interfaces. Those are
-
open questions for us. We have not analyzed
that in detail, but we are
-
worried that it would impact not only the
large platforms, but many others as well.
-
What do you think, Eliska?
Eliska: Yeah, I can only agree and
-
especially regarding the non-profit
question. Yeah, this has also
-
always been one of our main asks regarding
nonprofit organizations. And actually it's
-
not quite clear how that will play out now
in practice, or where the DSA stands,
-
because at the moment it actually speaks
about online platforms and then it speaks
-
about very large online platforms, but
what it will be and how it will impact
-
nonprofit organizations, whether these are
bloggers or organizations like us civil
-
rights organizations that remain to be
seen. I also know that the European Court
-
of Human Rights in its jurisprudence tried
to establish some principles for
-
nonprofits, since Delfi AS versus Estonia
and then with the MTE decision and the
-
Swedish case that followed afterwards. But
I'm not sure how well it was actually
-
elaborated later on. But this is something
we will be definitely looking at and
-
working on further and regarding the
impact on the smaller players, and also
-
the interface idea, this is still
something we are actually also wondering
-
about and thinking how this will turn out
in practice. And we are hoping to actually
-
develop our positioning further on these
issues as well, because we actually
-
started working, all of us, on the DSA and on
our recommendations already, I think, a
-
year ago, maybe a year and a half ago, when
we were working just with the leaks that
-
Politico or other media outlets in
Brussels were actually sharing with us.
-
And we were working with the bits and
pieces and trying to put our thinking
-
together. Now we have the draft and I
think we need to do another round of very
-
detailed thinking about what will ultimately
be our position and what will be those
-
recommendations and amendments in the
European Parliament we will be supporting
-
and pushing for. So it's a period of
hard work for all of us. Not to mention
-
that I always say that, you know, we stand
against a huge lobby power that is going
-
to be and is constantly being exercised by
these private actors. But I also have to
-
say that we had a very good cooperation
with the European Commission throughout
-
the process. And I think that I can say
that on behalf of all of us, that we feel
-
that the European Commission really listened
this time. So, yeah, more question marks
-
than answers here from me, I think.
Herald: This is fine with me, not knowing
-
something is fine. I think we've
definitely run out of time. Thank you both
-
for being here. Well, enjoy the 2D online
world and the Congress. Thank you. That
-
wraps the session up for us. Thank you all.
Christoph: Thank you.
-
Eliska: Thank you very much. Bye.
-
music
-
Subtitles created by c3subtitles.de
in the year 2021. Join, and help us!