preroll music
Herald: Our next talk will be tackling how social media companies are creating a global morality standard through content regulations. It will be presented by the two people standing here: the digital rights advocate Matthew Stender and the writer and activist Jillian C. York. Please give them a warm round of applause.
applause
Matthew: Hello everybody. I hope you all had a great Congress. Thank you for being here today. You know, we're almost wrapped up with the Congress, but we appreciate you being here. My name is Matthew Stender. I am a communications strategist, creative director and digital rights advocate focusing on privacy, social media censorship, and freedom of the press and expression.
Jillian: And I am Jillian York, and I work at the Electronic Frontier Foundation. I work on privacy and free expression issues, as well as a few other things, and I'm based in Berlin. Thank you. Anyone from Berlin? Awesome. Cool. Hope to see some of you there. Great. So today we're gonna be talking about sin in the time of technology, and what we mean by that is the way in which corporations, particularly content platforms and social media platforms, are driving morality and our perception of it. We've got three key takeaways to start off with. The first is that social media companies have an unparalleled amount of influence over our modern communications. This we know; I think this is probably something everyone in this room can agree on. Second, these companies also play a huge role in shaping our global outlook on morality and what constitutes it. So the ways in which we perceive different imagery and different speech are being increasingly defined by the regulations that these platforms impose on our daily activities on them. And third, they are entirely undemocratic. They're beholden to shareholders and governments, but not at all to the public. Not to me, not to you. Rarely do they listen to us, and when they do, there has to be a fairly exceptional amount of public pressure on them. So that's our starting point, that's what we wanna kick off with, and I'll pass the mic to Matthew.
M: So thinking about these three takeaways, I'm going to bring it to kind of a top level for a moment, to introduce an idea today which some people have talked about: the idea of the rise of the techno class. Probably a lot of people in this room have followed the negotiations around the Trans-Pacific Partnership, the TPP, leaked in part and then in full by WikiLeaks. What some people have mentioned in this debate is the idea of corporate capture: a world in which corporations are maturing to the extent that they can now sue governments, in which the multinational reach of many corporations is larger than the diplomatic reach of countries. And with social media platforms being part of this, these social media companies are now going to have the capacity to influence not only cultures but people within cultures, and how they communicate with people inside their culture and communicate globally. So as activists and technologists, I would like to propose that we start thinking beyond the product and service offerings of today's social media companies and start looking ahead two, five, ten years down the road, when these companies may have social media service offerings which are indistinguishable from today's ISPs and telcos. And this is really to say that social media is moving past the era of the walled garden into neo-empires. On the slide are some headlines about different delivery mechanisms with which social media companies, and also people like Elon Musk, are looking to almost if not completely leapfrog the existing technologies of terrestrial broadcasting, fiber optics, these sorts of things. So we are now looking at a world in which Facebook is going to have drones, Google is looking into balloons, and other people are looking into low-earth-orbit satellites, to be able to deliver directly to the end consumer, to the user, to the handset, the content which flows through these networks. So one of the first things I believe we're gonna see in this field is Free Basics. Facebook has a service that was launched as Internet.org and has now been rebranded Free Basics. Why this is interesting is that while on one hand Free Basics is a free service, it's trying to get the people that are not on the Internet now to use Facebook's window to the world. It has maybe a couple dozen sites that are accessible, and it runs over the data networks of partner carriers; Reliance, a telecommunications company in India, is one of the larger telcos but not the largest.
There's a lot of pressure that Facebook is putting on the government of India right now to be able to have the service offered across the country. One of the ways this is problematic is that only a limited number of websites flow through it, and for the people that get exposed to Free Basics, this might be their first time seeing the Internet. An interesting example to think about is a lion born into a zoo. Evolution may have this lion dream of running wild on the plains of Africa, but at the same time it will never know that world. Free Basics users, knowing only Facebook's window to the Internet, may not all jump over to a full data package on their ISP, and many people may be stuck in Facebook's window to the world.
J: In other words, we've reached an era where these companies have, as I've said, unprecedented control over our daily communications: both the information that we can access and the speech and imagery that we can express to the world and to each other. So the postings and pages and friend requests, the millions of politically active users as well, have helped to make Mark Zuckerberg and his colleagues, as well as the people at Google, Twitter and all these other fine companies, extremely rich, and yet we're pushing back in this case. I've got a great quote from Rebecca MacKinnon where she refers to Facebook as "Facebookistan", and I think that that is an apt example of what we're looking at. These are corporations, but they're not beholden at all to the public, as we know. Instead they've kind of turned into these quasi-dictatorships that dictate precisely how we behave on them. I also wanted to throw this one up to talk a little bit about the global speech norm. This is from Ben Wagner, who's written a number of pieces on this and who kind of coined the concept of a global speech standard, which is what these companies have begun and are increasingly imposing upon us. This global speech standard is essentially catering to everyone in the world, trying to make every user in every country and every government happy. But as a result they have kind of tamped down free speech to this very basic level that makes both the governments of, let's say, the United States and Germany happy, as well as the governments of countries like Saudi Arabia. Therefore we're looking at really kind of the lowest common denominator when it comes to some types of speech, and this sort of flat gray standard when it comes to others.
M: So as Jillian just mentioned, we have countries in play. Facebook and other social media companies are trying to pivot and play within an international field, but let's just take a moment to look at the scale, scope and size of these social media companies. I put up some figures from the Internet. With some of the latest census information, we have China at 1.37 billion people, India at 1.25 billion people, and 2.2 billion practitioners of Islam and Christianity. But now we have Facebook with, according to their statistics, 1.5 billion monthly active users. Those are their statistics, and I'm sure many people would like to dispute these numbers, but at the same time these platforms are now large. Not larger than some of the religions, but Facebook has more monthly active users than China or India has citizens. So we're not talking about, you know, basement startups; we're now talking about companies with the size and scale to be really influential in a larger, institutional way. So: the Magna Carta, the U.S. Constitution, the Declaration of Human Rights, the tree of masters, the Bible, the Quran. These are time-tested, or at least longstanding, principle documents that place upon their constituents, whether citizens or spiritual adherents, a certain code of conduct. Facebook, as Jillian mentioned, is nondemocratic. Facebook's terms and standards were written by a small group of individuals with a few compelling interests in mind. But we are now talking about 1.5 billion people on a monthly basis that are subservient to a terms of service on which they had no input.
To pivot from there and bring it back to spirituality: why is this important? Well, spiritual morality has always been a place for religion. Religion has a monopoly on the soul. You could say that religion is a set of rules such that if you obey, you avoid hell, or reach heaven, or are reincarnated in the afterlife, whatever the religious practice may be. Civil morality is quite interesting in the sense that the sovereign state, as a top-level institution, has the ability to put into place a series of statutes and regulations, the violation of which can send you to jail. Another interesting note is that the state also has a monopoly on the use of sanctioned violence. That is to say, official actors of the state are able to do things which the citizens of that state may not. And if we take a look at this concept of digital morality I spoke about earlier, with services like Free Basics introducing new individuals to the Internet: well, by a violation of the terms of service you can be excluded from these massive global networks. And really, Facebook is actually trying to create, if not a monopoly, a semi-monopoly on global connectivity in a lot of ways. So what drives Facebook? A few things.
One is a protectionistic legal framework. The control of copyright violations is something that a lot of platforms stamped out pretty early; they don't want to be sued by the RIAA or the MPAA, and so there were mechanisms by which copyrighted material was able to be taken off the platform. They also limit potential competition, and I think this is quite interesting in the sense that they have shown this in two ways. One: they've purchased rivals or potential competitors. You see this with Instagram being bought by Facebook. But Facebook has also demonstrated the ability, or willingness, to censor certain content. Tsu.co was a new social site, and mentions of and links to this platform were deleted or not allowed on Facebook. So even using Facebook as a platform to talk about another platform was not allowed. And then a third component is operation on a global scale. It's not only the size of the company, it's also about the global reach. Facebook maintains offices around the world, as other social media companies do; they engage in public diplomacy, and they also offer aid in many countries and many languages. So just to take companies like Facebook for a moment, really in economic terms: you have the traditional multinationals of the 20th century, Coca-Cola, McDonald's. The goal for the end user of these products was consumption. This is changing now. Facebook is looking to capture more and more parts of the supply chain: as a service provider, as a content moderator, and as the party responsible for negotiating and adjudicating content disputes. At the end of the day, the users are really the product. Facebook doesn't run the platform for us users; it's really for advertisers. You can draw a hierarchy of the platform with the corporation at the top, then the advertisers, and then the users kind of at the fringes.
J: So let's get into the nitty gritty a little bit about what content moderation on these platforms actually looks like. I've put up two headlines from Adrian Chen, a journalist who wrote these for Gawker and Wired respectively. They're both a couple of years old. But what he did was investigate who was moderating the content on these platforms, and what he found, and accused these companies of, is outsourcing the content moderation to low-paid workers in developing countries. In the first article, I think Morocco was the country, and I'm gonna show a slide in a bit of what those content moderators worked with. The second article talked a lot about the use of workers in the Philippines for this purpose. We know that these workers are probably low-paid. We know that they're given a very, very minimal timeframe to look at the content that they're being presented. So here's how it basically works across platforms, with small differences. I post something, and I'll show you some great examples of things I posted later. If I post it to my friends only, my friends can then report it to the company. If I post it publicly, anybody who can see it, or who's a user of the product, can report it to the company. Once a piece of content is reported, a content moderator then looks at it, and within that very small time frame, we're talking half a second to two seconds probably, based on the investigative research that's been done by a number of people, they have to decide if this content fits the terms of service or not. Now, most of these companies have a legalistic terms of service as well as a set of community guidelines or community standards which are clearer to the user, but they're still often very vague. And so I wanna get into a couple of examples that show that. This slide is one of the examples that I mentioned. You can't see it very well, so I won't leave it up for too long, but that was what content moderators at the outsourcing company oDesk were allegedly using to moderate content on Facebook. This next photo contains nudity.
So I think everyone probably knows who this is and has seen this photo. Yes? No? OK. This is Kim Kardashian, and this photo allegedly broke the Internet. It was a photo taken for Paper magazine; it was posted widely on the web and it was seen by many, many people. Now, this photograph definitely violates Facebook's terms of service. But Kim Kardashian is really famous and makes a lot of money, so in most instances, as far as I can tell, this photo was totally fine on Facebook. Now let's talk about those rules a little bit. Facebook says that they restrict nudity unless it is art. So they do make an exception for art, which may be why they allowed that image of Kim Kardashian's behind to stay up. Art is defined by the individual. And yet at the same time they make clear that, let's say, a photograph of Michelangelo's David, or a photograph of another piece of art in a museum, would be perfectly acceptable, whereas your sort of average nudity probably is not going to be allowed to remain on the platform. They also note that they restrict the display of nudity because their global community may be sensitive to this type of content, particularly because of their cultural background or age. So this is Facebook, in their community standards, telling you explicitly that they are toning down free speech to make everyone happy. This is another photograph. Germans in particular, I'm interested: is everyone familiar with the show The Golden Girls? OK, quite a few. So you might recognize her: she played Dorothy on The Golden Girls. This is the actress Bea Arthur, and this is from a 1991 painting of her by John Currin. It's unclear whether or not she sat for the painting. It's a wonderful image; it's a very beautiful portrait of her. But I posted it on Facebook several times in a week, I encouraged my friends to report it, and in fact Facebook found this to not be art.
Sorry. Another image: this is by a Canadian artist called Rupi Kaur. She posted a series of images in which she was menstruating; she was trying to essentially convey the normality of this, the fact that this is something that all women, most women, go through. And as a result Instagram removed it. The reasons are unclear: they told her that she had violated the terms of service, but weren't exactly clear as to why. And finally, this is another one. This is by an artist friend of mine; I'm afraid that I have completely blanked on who did this particular piece, but what it was is: they took famous works of nude art and had sex workers pose in the same poses as the pieces of art. I thought it was a really cool project, but Google Plus did not find it to be a really cool project, and because of their guidelines on nudity they banned it. This is a cat. Just want to make sure you're awake. It was totally allowed. So in addition to the problems of content moderation, I'm gonna go ahead and say that we also have a major diversity problem at these companies. These statistics are facts; they come from the companies themselves. They all put out diversity reports recently, or as I like to call them, "diversity" reports, and they show that. The statistics are a little bit different because they only capture data on ethnicity or nationality in their US offices, just because of how those standards are sort of odd all over the world. So the first stats refer to their global staff; the second ones in each line refer to their U.S. staff. But as you can see, these companies are largely made up of white men, which is probably not surprising, but it is a problem. Now why is that a problem? Particularly when you're talking about policy teams: the people who build policies and regulations have an inherent bias. We all have an inherent bias. But what we've seen here is really a bias of sort of the American style of prudishness: nudity is not allowed, but violence, extreme violence, as long as it's fictional, is totally OK. And that's generally how these platforms operate. And so I think that when we ensure that there is diversity in the teams creating both our tools, our technology, and our policies, then we can ensure that diverse worldviews are brought into that creation process, and that the policies are therefore more just. So what can we do about this problem? As consumers, as technologists, as activists, as whomever you might identify as. The first one I think a lot of the technologists are gonna agree with: develop decentralized networks. We need to work toward that ideal, because these companies are not getting any smaller. We're not gonna necessarily go out and say that they're too big to fail, but they are massive, and as Matthew noted earlier, they're buying up properties all over the place and making sure that they do have control over our speech. The second thing is to push for greater transparency around terms of service takedowns. Now, I'm not a huge fan of transparency for the sake of transparency. These companies have been putting out transparency reports for a long time that show what countries ask them to take down content or hand over user data, but we've seen those transparency reports to be incredibly flawed already. So pushing for greater transparency around terms of service takedowns is only a first step. The third thing is: we need to demand that these companies adhere to global speech standards. We already have the Universal Declaration of Human Rights; I don't understand why we need companies to develop their own bespoke rules. And so by..
applause
And so by demanding that companies adhere to global speech standards, we can ensure that these are places of free expression, because it is unrealistic to just tell people to get off Facebook. I can't tell you how many times in the tech community over the years I've heard people say, well, if you don't like it, just leave. That's not a realistic option for many people around the world, and I think we all know that deep down.
applause
Thank you. The other thing I would say, though, is that public pressure works. We saw last year, with Facebook's real name policy, that there were a number of drag performers in the San Francisco Bay Area who were kicked off the platform because they were using their performance names, their drag names, which is a completely legitimate thing to do, just as folks have hacker names or other pseudonyms. But those folks pushed back. They formed a coalition and they got Facebook to change a little bit. It's not completely there yet, but they're making progress, and I'm hoping that this goes well. And then the last thing, and this is totally a pitch thrown right out there: support projects like ours. I'm gonna throw to Matthew to talk about onlinecensorship.org, and another project done by the excellent Rebecca MacKinnon called Ranking Digital Rights.
M: So, just a little bit of thinking outside the box: onlinecensorship.org is a platform that recently launched. Users can go onto the platform and submit a small questionnaire if their content has been taken down by the platforms. Why we think this is exciting is because right now, as we mentioned, transparency reports are fundamentally flawed. We are looking to crowdsource information about the ways in which the social media companies, six social media companies, are moderating and taking down content, because we can't otherwise get that kind of accountability and transparency in real time. We're hoping to be able to find trends, both across the kinds of content being taken down, geographic trends, and news-related trends, within this sort of self-reported content takedown data. But it's platforms like these that I think, hopefully, will begin to spring up, in response, for the community to be able to put tools in place so that people can be a part of the reporting and transparency initiative.
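As a rough illustration of the crowdsourcing idea: a single self-reported takedown of the kind such a questionnaire might collect could be modeled as a small record, and the "trends" Matthew mentions are then simple aggregations over many records. The field names and categories below are purely hypothetical assumptions for illustration, not onlinecensorship.org's actual schema.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record of one self-reported takedown.
# Field names are illustrative assumptions, not the project's real schema.
@dataclass
class TakedownReport:
    platform: str          # e.g. "Facebook", "Instagram"
    content_type: str      # e.g. "photo", "post", "account"
    country: str           # where the reporting user is based
    reported_on: date      # when the takedown happened
    appeal_filed: bool = False
    stated_reason: str = "unknown"  # reason given by the platform, if any

def trends_by_platform(reports):
    """Count reports per platform: the simplest kind of aggregate trend."""
    counts = {}
    for r in reports:
        counts[r.platform] = counts.get(r.platform, 0) + 1
    return counts

reports = [
    TakedownReport("Facebook", "photo", "DE", date(2015, 12, 1)),
    TakedownReport("Facebook", "post", "IN", date(2015, 12, 5)),
    TakedownReport("Instagram", "photo", "CA", date(2015, 12, 7)),
]
print(trends_by_platform(reports))  # {'Facebook': 2, 'Instagram': 1}
```

The same records could just as easily be grouped by country or content type to surface the geographic and topical trends the talk describes.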
J: We launched about a month ago, and we're hoping to put out our first set of reports around March. And finally, I just want to close with one more quote before we slip into Q&A, just to read it: "It's reasonable that we press Facebook on these questions of public responsibility, while also acknowledging that Facebook cannot be all things to all people. We can demand that their design decisions and user policies stand for review, explicit, thoughtful and open to public deliberation." But, and this is the most important part in my view, the choices that Facebook makes in their design and in their policies are value judgments. This is political, and I know you've heard that in a lot of talks; so have I. But I think we cannot forget that this is all political, and we have to address it as such. And for some, if that means, you know, quitting the platform, that's fine too. But I think that we should still understand that our friends, our relatives, our families are using these platforms, and that we do owe it to everybody to make them a better place for free expression and privacy. Thank you.
applause
Herald: Thank you so much. So now we have a section of Q&A. Anyone who has a question, please use one of the mics on the sides. And I think we have a question from one of our viewers? No. OK. Please proceed, number one.
Mike 1: You just addressed that. I am, sort of, especially after listening to your talk, sort of on the verge of quitting Facebook, or starting to, I don't know.
applause
Yeah, I mean, and I agree it's a hard decision. I've been on Facebook for I think six years now, and it is a dispute for me myself. So I'm in this very strange position, and now I have to kind of decide what to do. Is there any help for me out there, something that takes my situation into account and helps me in deciding? I don't know. It's strange.
J: That's such a hard question. I'll put on my privacy hat for just a second and say what I would say to people when they're making that consideration from a privacy viewpoint, because I do think that the privacy implications of these platforms are often much more severe than those for speech. So in that case, you know, I think it's really about understanding your threat model: understanding what sort of threat you're under when it comes to the data collection that these companies are undertaking, as well as the censorship, of course. But I think it really is a personal decision, and I'm sure that there are, you know, great resources out there around digital security and around thinking through those threat model processes, and perhaps that could be of help to you. Do you want to add anything?
M: No, I mean, I think it's one of these big toss-ups. This is a system which many people are connected through; sometimes even e-mail addresses roll over to Facebook. And so I think it's about the opportunity cost of leaving a platform: what do you have to lose, what do you have to gain? But it's also important to remember that the snapshot we see of Facebook now, it's probably not gonna get better. It's probably gonna be more invasive and come into different parts of our lives. So I think, from the security and privacy aspect, it's really just up to the individual.
Mike 1: A short follow-up, if I am allowed. The main point for me is not my personal implications; I am quite aware that Facebook is a bad thing and I can leave it. But I'm sort of thinking that it's way past the point where we can decide on our own: OK, is it good for me, or is it good for my friend, or is it good for my mom or my dad or whatever? We have to think about: is Facebook as such a good thing for society, as you are addressing. So I think we have to take this decision-making from one person to a lot, lot, lot of persons.
J: I agree, and I'll note: what we're talking about..
applause
- I agree. What we're talking about in the
project that we work on together is a
small piece of the broader issue and I
agree that this needs to be tackled from
many angles.
Herald: Ok, we have a question from one of
our viewers on the Internet, please.
Signal Angel: Yeah, one of the questions from the internet is: aren't the moderators the real problem, who ban everything which they don't really like, rather than the providers of the service?
Herald: Can you please repeat that?
Signal Angel: The question was if the moderators, sometimes volunteers, aren't the problem, because they ban everything that they don't like, rather than the providers of a certain service?
J: Ahm, no. I mean, I would say that the content moderators.. we don't know who they are, so that's part of the issue. And I've, you know, heard many allegations over the years when certain content has been taken down in a certain local or cultural context, particularly in the Arab world; I've heard the accusation that, oh, those content moderators are pro-Sisi, the dictator in Egypt, or whatever. I'm not sure how much merit that holds, because, like I said, we don't know who they are. But what I would say is that it doesn't feel like they're given the resources to do their jobs well. So even if they were the best, most neutral people on earth, they're given very little time, probably very little money, and not a whole lot of resources to work with in making those determinations.
Herald: Thank you. We'll take a question from Mike 3, please.
Mike 3: Test, test. OK. First off, thank you so much for the talk. I just have a basic question. It seems logical that Facebook is trying to put out this mantra of "protect the children"; I can kind of get behind that. And it also seems, based on the fact that they have the "real names policy", that they would also expect you to put in your real legal age. So if they're trying to censor things like nudity: why couldn't they simply use things like age as criteria to protect children from nudity, while letting everyone else who is above the legal age make their own decision?
J: You wanna take that?
M: I think it's a few factors. One, I guess, on the technical side: what constitutes nudity? And in a process way: when something is flagged, do you have a channel or tick boxes to say what sort of content it is? You could use a system in which content flagged as nudity gets referred to a special nudity moderator, and then the moderator says "yes, this is nudity", and you then filter it for everyone under, you know, the legal age, or whatever age. But I think it's part of a broader, more systematic approach by Facebook. It's the broad strokes. It's really kind of dictating this digital baseline, this digital morality baseline, and saying: "No, nobody in the world can see this. These are our hard lines, and it doesn't matter what age you are or where you reside. This is the box in which we are placing you, and content that falls outside of this box, for anybody, regardless of age or origin.. this is what we say you can see, and anything that falls outside of it, you risk having your account suspended." So I think it's a mechanism of control.
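The hypothetical flow Matthew sketches (flag, specialist review, then age-gating instead of a blanket ban) could look something like this. Every name and the age threshold here are assumptions for illustration only; nothing of this sort is known to be implemented by Facebook.

```python
# Sketch of the hypothetical age-gated moderation flow described above.
# All function names and thresholds are illustrative assumptions.
ADULT_AGE = 18  # assumed threshold; the "legal age" varies by jurisdiction

def specialist_review(flag_category, content):
    """Stand-in for the specialist moderator confirming a flag.
    This toy version simply confirms every nudity flag."""
    return flag_category == "nudity"

def visibility_after_flag(flag_category, content, viewer_age):
    """Decide whether a given viewer may still see flagged content."""
    if not specialist_review(flag_category, content):
        return True                       # flag not confirmed: stays visible
    if flag_category == "nudity":
        return viewer_age >= ADULT_AGE    # age-gate instead of removing for all
    return False                          # other confirmed categories: hidden

print(visibility_after_flag("nudity", "painting", viewer_age=25))  # True
print(visibility_after_flag("nudity", "painting", viewer_age=15))  # False
```

The point of the sketch is the contrast Matthew draws: a per-viewer decision like this is technically straightforward, whereas the platforms' actual policy applies one global rule regardless of age or origin.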
Herald: Thank you so much. I think, unfortunately, we've run out of time for questions. I would like to apologize to everyone who's still standing; maybe you have time to discuss it afterwards. Thank you everyone, and thank you.
applause
postroll music
subtitles created by c3subtitles.de
Join, and help us!