1
00:00:00,440 --> 00:00:09,460
preroll music
2
00:00:09,460 --> 00:00:13,849
Herald: Our next talk will be tackling how
social media companies are creating a
3
00:00:13,849 --> 00:00:20,480
global morality standard through content
regulations. This will be presented by
4
00:00:20,480 --> 00:00:25,470
two persons standing here: the digital
rights advocate Matthew Stender and the
5
00:00:25,470 --> 00:00:31,220
writer and activist Jillian C. York. Please
give them a warm round of applause.
6
00:00:31,220 --> 00:00:38,090
applause
7
00:00:38,090 --> 00:00:41,480
Matthew: Hello everybody. I hope you all
had a great Congress. Thank you for being
8
00:00:41,480 --> 00:00:46,949
here today. You know we're almost wrapped
up with the Congress but yeah we
9
00:00:46,949 --> 00:00:51,929
appreciate you being here. My name is
Matthew Stender. I am a communications
10
00:00:51,929 --> 00:00:57,419
strategist, creative director, and digital
rights advocate focusing on privacy,
11
00:00:57,419 --> 00:01:02,000
social media censorship, and freedom of
the press and expression.
12
00:01:02,000 --> 00:01:06,550
Jillian: And I am Jillian York and I work
at the Electronic Frontier Foundation. I
13
00:01:06,550 --> 00:01:10,581
work on privacy and free expression issues
as well as a few other things and I'm
14
00:01:10,581 --> 00:01:18,480
based in Berlin. Thank you. For Berlin?
Awesome. Cool. Hope to see some of you
15
00:01:18,480 --> 00:01:22,410
there. Great. So today we're gonna be
talking about sin in the time of
16
00:01:22,410 --> 00:01:27,140
technology and what we mean by that is the
way in which corporations particularly
17
00:01:27,140 --> 00:01:31,320
content platforms and social media
platforms are driving morality and our
18
00:01:31,320 --> 00:01:36,630
perception of it. We've got three key
takeaways to start off with. The first is
19
00:01:36,630 --> 00:01:40,430
that social media companies have an
unparalleled amount of influence over our
20
00:01:40,430 --> 00:01:43,980
modern communications. This we know; I
think this is probably something everyone
21
00:01:43,980 --> 00:01:48,130
in this room can agree on. These companies
also play a huge role in shaping our
22
00:01:48,130 --> 00:01:52,580
global outlook on morality and what
constitutes it. So the ways in which we
23
00:01:52,580 --> 00:01:57,280
perceive different imagery, different
speech, are being increasingly defined by
24
00:01:57,280 --> 00:02:02,100
the regulations that these platforms put
upon us and our daily activities on them.
25
00:02:02,100 --> 00:02:05,780
And third they are entirely undemocratic.
They're beholden to shareholders and
26
00:02:05,780 --> 00:02:10,170
governments but not at all to the public.
Not to me, not to you. Rarely do they
27
00:02:10,170 --> 00:02:14,120
listen to us, and when they do there has to
be a fairly exceptional amount of public
28
00:02:14,120 --> 00:02:18,340
pressure on them. And so that's, that's
our starting point that's what we wanna
29
00:02:18,340 --> 00:02:22,769
kick off with, and I'll pass the mic to
Matthew.
30
00:02:22,769 --> 00:02:28,170
M: So thinking about these three takeaways
I'm going to bring it to kind of a top level
31
00:02:28,170 --> 00:02:38,209
for a moment, to introduce an idea today
which some people have talked about:
32
00:02:38,209 --> 00:02:44,400
the idea of the rise of the techno class.
So probably a lot of people in this room
33
00:02:44,400 --> 00:02:50,990
have followed the negotiations, leaked in
part and then in full by WikiLeaks, about
34
00:02:50,990 --> 00:02:58,050
the Trans-Pacific Partnership, the TPP. What
some people have mentioned in this debate
35
00:02:58,050 --> 00:03:03,360
is the idea of corporate capture: a
world now in which corporations are
36
00:03:03,360 --> 00:03:09,220
maturing to the
extent that they can now sue
37
00:03:09,220 --> 00:03:15,270
governments, and that the multinational reach
of many corporations is larger than
38
00:03:15,270 --> 00:03:21,620
the diplomatic reach of countries. And
with social media platforms being
39
00:03:21,620 --> 00:03:26,250
part of this, these social
media companies now are going to have the
40
00:03:26,250 --> 00:03:30,180
capacity to influence not only cultures
but people within cultures and how they
41
00:03:30,180 --> 00:03:36,239
communicate with people inside their
culture and communicate globally. So as
42
00:03:36,239 --> 00:03:40,989
activists and technologists I would like
to propose at least that we start thinking
43
00:03:40,989 --> 00:03:45,099
about and beyond the product and service
offerings of today's social media
44
00:03:45,099 --> 00:03:50,660
companies and start looking ahead to two -
five - ten years down the road in which
45
00:03:50,660 --> 00:03:56,559
these companies may have social media
46
00:03:56,559 --> 00:04:00,709
service offerings which are
indistinguishable from today's ISPs and
47
00:04:00,709 --> 00:04:05,910
telcos and other things. And this is
really to say that social media is moving
48
00:04:05,910 --> 00:04:15,400
past the era of the walled garden into
neo-empires. So one of the things that is on
49
00:04:15,400 --> 00:04:20,298
the slide are some headlines about
different delivery mechanisms with which
50
00:04:20,298 --> 00:04:24,980
social media companies, and also
people like Elon Musk, are looking to roll
51
00:04:24,980 --> 00:04:30,050
out and almost leapfrog, if not completely
leapfrog, the existing technologies of
52
00:04:30,050 --> 00:04:36,530
terrestrial broadcasting, fiber optics, these
sorts of things. So we are now looking at a
53
00:04:36,530 --> 00:04:41,240
world in which Facebook is now gonna
have drones. Google is looking into
54
00:04:41,240 --> 00:04:46,590
balloons, and other people are looking into low
earth orbit satellites to be able to
55
00:04:46,590 --> 00:04:52,220
provide directly to the end consumer, to
the user, to the handset the content which
56
00:04:52,220 --> 00:04:58,199
flows through these networks. So one of
the first things I believe we're gonna see
57
00:04:58,199 --> 00:05:05,600
in this field is Free Basics. Facebook has
a service that was launched as Internet.org
58
00:05:05,600 --> 00:05:12,021
and has now been rebranded Free Basics,
and why this is interesting is that
59
00:05:12,021 --> 00:05:17,060
on the one hand Free Basics is a free service
that is trying to get the people that
60
00:05:17,060 --> 00:05:21,590
are not on the Internet now to use
Facebook's window to the
61
00:05:21,590 --> 00:05:28,030
world. It has maybe a couple dozen sites
that are accessible, and it runs over the data
62
00:05:28,030 --> 00:05:35,180
networks of carriers. Reliance, the
telecommunications company in India, is one
63
00:05:35,180 --> 00:05:41,550
of the larger telcos but not the largest.
There's a lot of pressure that Facebook is
64
00:05:41,550 --> 00:05:48,320
putting on the government of India right
now to be able to have the service offered
65
00:05:48,320 --> 00:05:54,960
across the country. One of the ways that
this is problematic is that only a limited
66
00:05:54,960 --> 00:05:59,300
number of websites flows through this to the
people that get exposed to Free Basics.
67
00:05:59,300 --> 00:06:05,180
This might be their first time seeing the
Internet in some cases. An example that
68
00:06:05,180 --> 00:06:10,350
is interesting to think about is a lion
born into a zoo. Perhaps evolution and
69
00:06:10,350 --> 00:06:16,889
other things may have this lion dreaming,
perhaps, of running wild on the plains of
70
00:06:16,889 --> 00:06:25,080
Africa, but at the same time it will never
know that world. Facebook Free Basics users,
71
00:06:25,080 --> 00:06:32,199
knowing only Facebook's window to
the Internet, may not all jump over to a
72
00:06:32,199 --> 00:06:37,380
full data package on their ISP. And many
people may be stuck in Facebook's window
73
00:06:37,380 --> 00:06:41,190
to the world.
74
00:06:41,190 --> 00:06:44,740
J: In other words we've reached an era
where these companies have as I've said
75
00:06:44,740 --> 00:06:48,530
unprecedented control over our daily
communications both the information that
76
00:06:48,530 --> 00:06:52,460
we can access and the speech and imagery
that we can express to the world and to
77
00:06:52,460 --> 00:06:55,880
each other. So the postings and pages and
friend requests, the millions of
78
00:06:55,880 --> 00:06:59,620
politically active users as well, have
helped to make Mark Zuckerberg and his
79
00:06:59,620 --> 00:07:02,770
colleagues as well as the people at
Google, Twitter and all these other fine
80
00:07:02,770 --> 00:07:07,850
companies extremely rich and yet we're
pushing back in this case. I've got a
81
00:07:07,850 --> 00:07:11,400
great quote from Rebecca MacKinnon where
she refers to Facebook as Facebookistan
82
00:07:11,400 --> 00:07:14,610
and I think that that is an apt example of
what we're looking at. These are
83
00:07:14,610 --> 00:07:19,280
corporations but they're not beholden at
all to the public as we know. And instead
84
00:07:19,280 --> 00:07:23,190
they've kind of turned into these
quasi-dictatorships that dictate precisely how
85
00:07:23,190 --> 00:07:29,400
we behave on them. I also wanted to throw
this one up to talk a little bit about the
86
00:07:29,400 --> 00:07:34,030
global speech norm. This is from Ben
Wagner who's written a number of pieces on
87
00:07:34,030 --> 00:07:38,370
this but who kind of coined the concept of
a global speech standard which is what
88
00:07:38,370 --> 00:07:42,930
these companies have begun, and are
increasingly, imposing upon us. This global
89
00:07:42,930 --> 00:07:47,800
speech standard is essentially catering to
everyone in the world trying to make every
90
00:07:47,800 --> 00:07:51,800
user in every country and every government
happy. But as a result they have kind of tamped
91
00:07:51,800 --> 00:07:57,360
down free speech to this very basic level
that makes both the governments of let's
92
00:07:57,360 --> 00:08:00,930
say the United States and Germany happy as
well as the governments of countries like
93
00:08:00,930 --> 00:08:05,270
Saudi Arabia. Therefore we're looking at
really kind of the lowest common
94
00:08:05,270 --> 00:08:09,870
denominator when it comes to some
types of speech and this sort of flat
95
00:08:09,870 --> 00:08:14,810
gray standard when it comes to others.
96
00:08:14,810 --> 00:08:22,050
M: So Jillian just mentioned we have
countries in play. Facebook and other
97
00:08:22,050 --> 00:08:26,080
organizations, social media companies, are
trying to pivot and play within an
98
00:08:26,080 --> 00:08:30,490
international field, but let's just take
a moment to look at the scale and
99
00:08:30,490 --> 00:08:35,799
scope and size of these social media
companies. So I just put some figures from
100
00:08:35,799 --> 00:08:42,750
the Internet with some of the latest census
information: we have China, 1.37 billion
101
00:08:42,750 --> 00:08:48,020
people; India, 1.25 billion people; 2.2
billion individuals practicing
102
00:08:48,020 --> 00:08:51,762
Islam and Christianity. But now we have
Facebook with, according to their
103
00:08:51,762 --> 00:08:57,610
statistics 1.5 billion active monthly
users. Their statistics - I'm sure many
104
00:08:57,610 --> 00:09:01,520
people would like to dispute these numbers
- but at the same time these platforms
105
00:09:01,520 --> 00:09:08,640
are now large. I mean, not
larger than some of the religions, but
106
00:09:08,640 --> 00:09:15,040
Facebook has more monthly active users
than China or India have citizens. So
107
00:09:15,040 --> 00:09:19,061
we're not talking about - you know -
basement startups, we're now talking about
108
00:09:19,061 --> 00:09:26,940
companies with the size and scale to be
able to really be influential in a larger
109
00:09:26,940 --> 00:09:42,790
institutional way. So: the Magna Carta, the
U.S. Constitution, the Declaration of Human
110
00:09:42,790 --> 00:09:49,130
Rights, the Treaty of Maastricht, the Bible, the
Quran. These are time-tested, or at least
111
00:09:49,130 --> 00:09:55,310
longstanding, principal documents
that place upon their constituents,
112
00:09:55,310 --> 00:10:03,980
whether citizens or spiritual
adherents, a certain code of conduct.
113
00:10:03,980 --> 00:10:08,510
Facebook, as Jillian mentioned, is
nondemocratic. Facebook's terms and
114
00:10:08,510 --> 00:10:12,990
standards were written by a small group of
individuals with a few compelling
115
00:10:12,990 --> 00:10:18,540
interests in mind. But we are now talking
about 1.5 billion people on a monthly
116
00:10:18,540 --> 00:10:27,240
basis that are subservient to a Terms of
Service that they had no input on.
117
00:10:27,240 --> 00:10:32,470
To pivot from there and bring it back
to spirituality: why is this important?
118
00:10:32,470 --> 00:10:41,890
Well, spiritual morality has always been a
place for religion. Religion has a
119
00:10:41,890 --> 00:10:49,760
monopoly on the soul. You could say that
religion is a set of rules which,
120
00:10:49,760 --> 00:10:55,860
if you obey, let you avoid hell, reach heaven
or an afterlife, be reincarnated,
121
00:10:55,860 --> 00:11:02,560
whatever the religious practice may be.
Civil morality is quite interesting in the
122
00:11:02,560 --> 00:11:07,610
sense that the sovereign state as a top
level institution has the ability to put
123
00:11:07,610 --> 00:11:13,460
into place a series of statutes and
regulations, the violation of which can
124
00:11:13,460 --> 00:11:18,570
send you to jail. Another interesting note
is that the state also has a
125
00:11:18,570 --> 00:11:24,670
monopoly on the use of sanctioned
violence. That is to say, the
126
00:11:24,670 --> 00:11:28,080
official actors of the state are able to
do things that the
127
00:11:28,080 --> 00:11:35,570
citizens of that state may not. And if we
take a look at this concept of
128
00:11:35,570 --> 00:11:41,910
digital morality I spoke about earlier,
with services like Free Basics introducing
129
00:11:41,910 --> 00:11:48,459
new individuals to the Internet: well, by
violating the terms of service you can
130
00:11:48,459 --> 00:11:58,399
be excluded from these massive global
networks. And really, Facebook is
131
00:11:58,399 --> 00:12:04,050
actually trying to create, if not a
monopoly, a semi-monopoly on global
132
00:12:04,050 --> 00:12:14,000
connectivity in a lot of ways. So what
drives Facebook? A few things.
133
00:12:14,000 --> 00:12:20,870
One is a protectionistic
legal framework. The control of
134
00:12:20,870 --> 00:12:25,360
copyright violations is something that a
lot of platforms stomped out pretty early.
135
00:12:25,360 --> 00:12:29,899
They don't want to be sued by the RIAA or
the MPAA. And so there were mechanisms by
136
00:12:29,899 --> 00:12:36,641
which copyrighted material was
able to be taken off the platform. They
137
00:12:36,641 --> 00:12:41,990
also limit potential competition. And I
think this is quite interesting in the
138
00:12:41,990 --> 00:12:47,220
sense that they have shown this in two
ways. One: they've
139
00:12:47,220 --> 00:12:52,140
purchased rivals or potential competitors.
You see this with Instagram being bought
140
00:12:52,140 --> 00:12:57,870
by Facebook. But Facebook has also
demonstrated the ability or willingness to
141
00:12:57,870 --> 00:13:04,899
censor certain content.
tsu.co was a new social site, and
142
00:13:04,899 --> 00:13:10,279
mentions and links to this platform were
deleted or not allowed on Facebook. So
143
00:13:10,279 --> 00:13:16,480
even using Facebook as a platform to talk
about another platform was not
144
00:13:16,480 --> 00:13:23,860
allowed. And then a third component is
operation on a global scale. It's not only
145
00:13:23,860 --> 00:13:29,440
the size of the company, it's also about
the global reach. So Facebook maintains
146
00:13:29,440 --> 00:13:33,320
offices around the world, as other social
media companies do, they engage in public
147
00:13:33,320 --> 00:13:41,720
diplomacy and they also offer aid in many
countries and many languages. So just to
148
00:13:41,720 --> 00:13:47,450
compare companies like Facebook for
a moment, in economic terms: you have the
149
00:13:47,450 --> 00:13:53,490
traditional multinationals of the 20th
century, Coca-Cola, McDonald's. The
150
00:13:53,490 --> 00:14:00,610
goal for the end
user of these products was consumption.
151
00:14:00,610 --> 00:14:05,170
This is changing now. Facebook is looking
to capture more and more parts of the
152
00:14:05,170 --> 00:14:11,339
supply chain. This may be as a
service provider, as a content moderator,
153
00:14:11,339 --> 00:14:21,070
and as the party responsible for negotiating
and adjudicating content disputes. At the end
154
00:14:21,070 --> 00:14:27,510
of the day, the users are really the
product. The platform is not for us Facebook users,
155
00:14:27,510 --> 00:14:32,630
it's really for advertisers.
If we take a hierarchy of the platform:
156
00:14:32,630 --> 00:14:38,510
the corporation, then advertisers, and then users
kind of at the fringes.
157
00:14:38,510 --> 00:14:41,899
J: So let's get into the nitty-gritty a
little bit about what content moderation
158
00:14:41,899 --> 00:14:46,399
on these platforms actually looks like. So
I've put up two headlines from Adrian Chen,
159
00:14:46,399 --> 00:14:50,149
a journalist who wrote these for Gawker
and Wired respectively. They're both a
160
00:14:50,149 --> 00:14:54,910
couple of years old. But what he did was
investigate who was
161
00:14:54,910 --> 00:14:58,899
moderating the content on these platforms
and what he found and accused these
162
00:14:58,899 --> 00:15:03,100
companies of is outsourcing the
content moderation to low-paid workers in
163
00:15:03,100 --> 00:15:06,779
developing countries. In the first article,
I think, Morocco
164
00:15:06,779 --> 00:15:09,970
was the country, and I'm gonna show a slide
from that, a bit of what those content
165
00:15:09,970 --> 00:15:14,050
moderators worked with. And the second
article talked a lot about the use of
166
00:15:14,050 --> 00:15:17,110
workers in the Philippines for this
purpose. We know that these workers are
167
00:15:17,110 --> 00:15:21,529
probably low-paid. We know that they're
given a very, very
168
00:15:21,529 --> 00:15:25,200
minimal timeframe to look at the content
that they're being presented with. So here's
169
00:15:25,200 --> 00:15:31,199
how it basically works across platforms
with small differences. I post something
170
00:15:31,199 --> 00:15:34,580
and I'll show you some great examples of
things I've posted later. I post something
171
00:15:34,580 --> 00:15:39,070
and if I post it to my friends only, my
friends can then report it to the company.
172
00:15:39,070 --> 00:15:43,430
If I post it publicly anybody who can see
it or who's a user of the product can
173
00:15:43,430 --> 00:15:47,800
report it to the company. Once a piece of
content is reported a content moderator
174
00:15:47,800 --> 00:15:51,420
then looks at it and within that very
small time frame we're talking half a
175
00:15:51,420 --> 00:15:56,160
second to two seconds probably based on
the investigative research that's been
176
00:15:56,160 --> 00:16:00,130
done by a number of people, they have to
decide if this content fits the terms of
177
00:16:00,130 --> 00:16:03,990
service or not. Now most of these
companies have a legalistic terms of
178
00:16:03,990 --> 00:16:07,800
service as well as a set of community
guidelines or community standards which
179
00:16:07,800 --> 00:16:12,740
are clearer to the user but still
often very vague. And so I wanna get into
180
00:16:12,740 --> 00:16:18,270
a couple of examples that show that. This
slide is one of the
181
00:16:18,270 --> 00:16:21,290
examples that I gave. You can't see it
very well, so I won't leave it up for too
182
00:16:21,290 --> 00:16:25,800
long. But that was what content moderators
at the outsourcing company oDesk were
183
00:16:25,800 --> 00:16:33,880
allegedly using to moderate content on
Facebook. This next photo contains nudity.
184
00:16:33,880 --> 00:16:40,740
So I think everyone probably knows who
this is and has seen this photo. Yes? No?
185
00:16:40,740 --> 00:16:46,389
OK. This is Kim Kardashian and this photo
allegedly broke the Internet. It was a
186
00:16:46,389 --> 00:16:50,779
photo taken for Paper magazine. It was
posted widely on the web and it was seen
187
00:16:50,779 --> 00:16:55,459
by many many people. Now this photograph
definitely violates Facebook's terms of
188
00:16:55,459 --> 00:17:00,380
service. Buuut Kim Kardashian's really
famous and makes a lot of money. So in
189
00:17:00,380 --> 00:17:06,779
most instances as far as I can tell this
photo was totally fine on Facebook. Now
190
00:17:06,779 --> 00:17:11,169
let's talk about those rules a little bit.
Facebook says that they restrict nudity
191
00:17:11,169 --> 00:17:15,749
unless it is art. So they do make an
exception for art which may be why they
192
00:17:15,749 --> 00:17:21,169
allowed that image of Kim Kardashian's
behind to stay up. Art is defined by the
193
00:17:21,169 --> 00:17:26,128
individual. And yet at the same time, you
know, they make clear that, let's
194
00:17:26,128 --> 00:17:30,420
say a photograph of Michelangelo's David
or a photograph of another piece of art in
195
00:17:30,420 --> 00:17:34,210
a museum would be perfectly acceptable,
whereas, you know, your sort of average
196
00:17:34,210 --> 00:17:39,090
nudity probably is not going to be
allowed to remain on the platform. They
197
00:17:39,090 --> 00:17:42,970
also note that they restrict the display
of nudity to ensure that their global..
198
00:17:42,970 --> 00:17:46,379
because their global community may be
sensitive to this type of content
199
00:17:46,379 --> 00:17:50,120
particularly because of their cultural
background or age. So this is Facebook in
200
00:17:50,120 --> 00:17:54,909
their community standards telling you
explicitly that they are toning down free
201
00:17:54,909 --> 00:18:01,789
speech to make everyone happy. This is
another photograph. Germans
202
00:18:01,789 --> 00:18:07,480
particularly - I'm interested: is everyone
familiar with the show The Golden Girls?
203
00:18:07,480 --> 00:18:10,929
OK. Quite a few. So you might recognize
her: she was Dorothy on the Golden Girls;
204
00:18:10,929 --> 00:18:15,090
this is the actress Bea Arthur and this is
from a painting from 1991 by John Currin
205
00:18:15,090 --> 00:18:19,289
of her. It's unclear whether or not she
sat for the painting. It's a wonderful
206
00:18:19,289 --> 00:18:23,590
image, it's a very beautiful
portrait of her. But I posted it on
207
00:18:23,590 --> 00:18:28,180
Facebook several times in a week. I
encouraged my friends to report it. And in
208
00:18:28,180 --> 00:18:36,010
fact Facebook found this to not be art.
Sorry. Another image: this is by
209
00:18:36,010 --> 00:18:40,730
a Canadian artist called Rupi Kaur. She
posted a series of images in which she was
210
00:18:40,730 --> 00:18:45,509
menstruating; she was trying to
essentially describe the
211
00:18:45,509 --> 00:18:49,580
normality of this. The fact that this is
something that all women go through. Most
212
00:18:49,580 --> 00:18:56,909
women go through. And as a result
Instagram took it down. They were unclear on the reasons:
213
00:18:56,909 --> 00:19:01,100
they told her that it had violated the terms
of service but weren't exactly clear as to
214
00:19:01,100 --> 00:19:06,039
why. And finally this is another one. This
is by an artist friend of mine, I'm afraid
215
00:19:06,039 --> 00:19:09,951
that I have completely blanked on who did
this particular piece, but what it was is:
216
00:19:09,951 --> 00:19:16,370
They took famous works of nude art and had
sex workers pose in the same poses as the
217
00:19:16,370 --> 00:19:19,980
pieces of art. I thought it was a really
cool project but Google Plus did not find
218
00:19:19,980 --> 00:19:23,919
that to be a really cool project. And
because of their guidelines on nudity they
219
00:19:23,919 --> 00:19:30,909
banned it. This is a cat. Just want to
make sure you're awake. It was totally
220
00:19:30,909 --> 00:19:37,629
allowed. So in addition to the problems of
content moderation I'm gonna go ahead and
221
00:19:37,629 --> 00:19:41,290
say that we also have a major diversity
problem at these companies. These
222
00:19:41,290 --> 00:19:45,509
statistics are facts; these are from
the companies themselves. They put out
223
00:19:45,509 --> 00:19:51,109
diversity reports recently, or as I like to
call them, "diversity" reports, and they show
224
00:19:51,109 --> 00:19:54,250
that. So the statistics are a little bit
different because they only capture data
225
00:19:54,250 --> 00:19:59,249
on ethnicity or nationality in their US
offices just because of how those
226
00:19:59,249 --> 00:20:02,710
standards are sort of odd all over the
world. So the first stats refer to
227
00:20:02,710 --> 00:20:06,690
their global staff. The second ones in
each line refer to their U.S. staff but as
228
00:20:06,690 --> 00:20:10,749
you can see these companies are largely
made up of white men, which is probably not
229
00:20:10,749 --> 00:20:16,080
surprising but it is a problem. Now why is
that a problem? Particularly when you're
230
00:20:16,080 --> 00:20:20,440
talking about policy teams: the people who
build policies and regulations have an
231
00:20:20,440 --> 00:20:24,499
inherent bias. We all have an inherent
bias. But what we've seen in this is
232
00:20:24,499 --> 00:20:30,460
really a bias of sort of the American
style of prudishness. Nudity is not allowed,
233
00:20:30,460 --> 00:20:34,669
but violence, extreme violence, as long as
it's fictional, is totally OK. And that's
234
00:20:34,669 --> 00:20:39,221
generally how these platforms operate. And
so I think that when we ensure that there
235
00:20:39,221 --> 00:20:44,779
is diversity in the teams creating both
our tools our technology and our policies,
236
00:20:44,779 --> 00:20:49,200
then we can ensure that diverse world
views are brought into that creation
237
00:20:49,200 --> 00:20:56,479
process and that the policies are
therefore more just. So what can we do
238
00:20:56,479 --> 00:21:00,840
about this problem? As consumers as
technologists as activists as whomever you
239
00:21:00,840 --> 00:21:04,940
might identify as. The first one I
think a lot of the technologists are gonna
240
00:21:04,940 --> 00:21:08,779
agree with: Develop decentralized
networks. We need to work toward that
241
00:21:08,779 --> 00:21:12,409
ideal because these companies are not
getting any smaller. We're not gonna
242
00:21:12,409 --> 00:21:16,279
necessarily go out and say that they're
too big to fail but they are massive and
243
00:21:16,279 --> 00:21:20,479
as Matthew noted earlier they're buying up
properties all over the place and making
244
00:21:20,479 --> 00:21:25,399
sure that they do have control over our
speech. The second thing is to push for
245
00:21:25,399 --> 00:21:29,830
greater transparency around terms of
service takedowns. Now I'm not a huge fan
246
00:21:29,830 --> 00:21:33,210
of transparency for the sake of
transparency. I think that these, you
247
00:21:33,210 --> 00:21:36,509
know, these companies have been putting
out transparency reports for a long time
248
00:21:36,509 --> 00:21:42,379
that show what countries ask them to take
down content or hand over user data. But
249
00:21:42,379 --> 00:21:48,330
we've seen those transparency reports to
be incredibly flawed already. And so in
250
00:21:48,330 --> 00:21:51,529
pushing for greater transparency around
terms of service takedowns, that's only a
251
00:21:51,529 --> 00:21:55,809
first step. The third thing is, we need to
demand that these companies adhere to
252
00:21:55,809 --> 00:21:59,260
global speech standards. We already have
the Universal Declaration of Human Rights.
253
00:21:59,260 --> 00:22:03,950
I don't understand why we need companies
to develop their own bespoke rules. And so
254
00:22:03,950 --> 00:22:10,509
by..
applause
255
00:22:10,509 --> 00:22:13,679
And so by demanding that companies adhere
to global speech standards we can ensure
256
00:22:13,679 --> 00:22:17,519
that these are places of free expression
because it is unrealistic to just tell
257
00:22:17,519 --> 00:22:21,029
people to get off Facebook. I can't tell
you how many times in the tech community
258
00:22:21,029 --> 00:22:26,230
over the years I've heard people say well
if you don't like it just leave. That's
259
00:22:26,230 --> 00:22:29,519
not a realistic option for many people
around the world and I think we all know
260
00:22:29,519 --> 00:22:32,229
that deep down.
applause
261
00:22:32,229 --> 00:22:38,690
Thank you. And so the other thing I would
say though is that public pressure works.
262
00:22:38,690 --> 00:22:42,940
We saw last year, with Facebook's real name
policy, there were a number of drag
263
00:22:42,940 --> 00:22:46,899
performers in the San Francisco Bay Area
who were kicked off the platform because
264
00:22:46,899 --> 00:22:50,090
they were using their performance - their
drag names - which is a completely
265
00:22:50,090 --> 00:22:54,739
legitimate thing to do just as folks have
hacker names or other pseudonyms. But
266
00:22:54,739 --> 00:22:58,849
those folks pushed back. They formed a
coalition and they got Facebook to change
267
00:22:58,849 --> 00:23:04,239
a little bit. It's not completely there
yet but they're making progress and I'm
268
00:23:04,239 --> 00:23:09,119
hoping that this goes well. And then the
last thing is and this is totally a pitch
269
00:23:09,119 --> 00:23:13,249
thrown right out there: support projects
like ours. I'm gonna throw to Matthew to
270
00:23:13,249 --> 00:23:16,970
talk about onlinecensorship.org and
another project done by the excellent
271
00:23:16,970 --> 00:23:21,289
Rebecca MacKinnon called
Ranking Digital Rights.
272
00:23:21,289 --> 00:23:25,320
M: So, just a little bit of thinking
outside the box: onlinecensorship.org is a
273
00:23:25,320 --> 00:23:31,630
platform that's recently launched. Users
can go onto the platform and submit a
274
00:23:31,630 --> 00:23:36,590
small questionnaire if their content has
been taken down by the platforms. Why we
275
00:23:36,590 --> 00:23:39,460
think this is exciting is because right
now, as we mentioned, transparency
276
00:23:39,460 --> 00:23:45,169
reports are fundamentally flawed. We are
looking to crowdsource information about
277
00:23:45,169 --> 00:23:49,090
the ways in which the social media
companies, six social media companies, are
278
00:23:49,090 --> 00:23:53,940
moderating and taking down content, because
we can't otherwise get that kind of
279
00:23:53,940 --> 00:23:57,159
accountability and transparency in real
time. We're hoping to be able to find
280
00:23:57,159 --> 00:24:03,999
trends both across the kind of content that has
been taken down, geographic trends, news-
281
00:24:03,999 --> 00:24:08,350
related trends, within this sort of
self-reported content takedown data. But it's
282
00:24:08,350 --> 00:24:12,710
platforms like these that I hope
will begin to spring up in
283
00:24:12,710 --> 00:24:18,370
response, for the community to be able to
put tools in place so that people can be a
284
00:24:18,370 --> 00:24:21,539
part of the reporting
and transparency initiative.
285
00:24:21,539 --> 00:24:24,269
J: We launched about a month ago and we're
hoping to put out our first set of reports
286
00:24:24,269 --> 00:24:29,059
around March. And finally I just want to
close with one more quote before we slip
287
00:24:29,059 --> 00:24:33,740
into Q&A, and that is just to say it: It's
reasonable that we press Facebook on these
288
00:24:33,740 --> 00:24:37,409
questions of public responsibility, while
also acknowledging that Facebook cannot be
289
00:24:37,409 --> 00:24:41,159
all things to all people. We can demand
that their design decisions and user
290
00:24:41,159 --> 00:24:45,950
policies stand for review, be explicit,
thoughtful and open to public
291
00:24:45,950 --> 00:24:50,200
deliberation. But - and this is the most
important part in my view - the choices that
292
00:24:50,200 --> 00:24:54,549
Facebook makes in their design and in
their policies are value judgments. This is
293
00:24:54,549 --> 00:24:57,929
political, and I know you've heard that in
a lot of talks. So have I. But I think we
294
00:24:57,929 --> 00:25:01,700
can't, we cannot forget that this is all
political and we have to address it as
295
00:25:01,700 --> 00:25:06,460
such. And for some, if that means, you
know, quitting the platform, that's fine too.
296
00:25:06,460 --> 00:25:09,139
But I think that we should still
understand that our friends our relatives
297
00:25:09,139 --> 00:25:13,349
our families are using these platforms and
that we do owe it to everybody to make
298
00:25:13,349 --> 00:25:17,969
them a better place for free expression
and privacy. Thank you.
299
00:25:17,969 --> 00:25:23,780
applause
300
00:25:23,780 --> 00:25:28,629
Herald: Thank you so much. So now
we have a section of Q&A. Anyone who
301
00:25:28,629 --> 00:25:36,110
has a question, please use one of the mics
on the sides. And I think we have a
302
00:25:36,110 --> 00:25:44,259
question from one of our viewers? No. OK.
Please proceed, number one.
303
00:25:44,259 --> 00:25:48,859
Mike 1: You just addressed that. Especially
after listening to your talk,
304
00:25:48,859 --> 00:25:55,499
I'm sort of on the verge of quitting
Facebook, or starting to, I don't know.
305
00:25:55,499 --> 00:26:00,200
applause
Yeah, I mean, and I agree, it's a hard
306
00:26:00,200 --> 00:26:07,200
decision. I've been on Facebook for I
think six years now and it is a dispute
307
00:26:07,200 --> 00:26:12,729
within myself. So I'm in this very
strange position. And now I have to kind
308
00:26:12,729 --> 00:26:18,389
of decide what to do. Are there any.. is
there any help for me out there to tell me
309
00:26:18,389 --> 00:26:26,269
what would be.. I don't know what.. that
takes my state and helps me in deciding..
310
00:26:26,269 --> 00:26:28,950
I don't know. It's strange.
311
00:26:28,950 --> 00:26:33,639
J: That's such a hard question. I mean
I'm.. I'll put on my privacy hat for just
312
00:26:33,639 --> 00:26:38,809
a second and say what I would say to
people when they're making that
313
00:26:38,809 --> 00:26:41,389
consideration from a privacy viewpoint,
because I do think that the
314
00:26:41,389 --> 00:26:44,849
implications for privacy on these platforms
are often much more severe than those for
315
00:26:44,849 --> 00:26:49,390
speech. But this is what I do. So in that
case you know I think it's really about
316
00:26:49,390 --> 00:26:54,249
understanding your threat model,
understanding what sort of threat you're
317
00:26:54,249 --> 00:26:57,799
under when it comes to you know the data
collection that these companies are
318
00:26:57,799 --> 00:27:01,850
undertaking as well as the censorship of
course. But I think it really is a
319
00:27:01,850 --> 00:27:04,989
personal decision, and I'm sure that
there are - you know - there are great
320
00:27:04,989 --> 00:27:09,460
resources out there around digital
security and around thinking through those
321
00:27:09,460 --> 00:27:12,859
threat model processes, and perhaps that
could be of help to you for that. If you
322
00:27:12,859 --> 00:27:17,210
want to add?
M: No, I mean, I think it's one of
323
00:27:17,210 --> 00:27:21,059
these big toss-ups. This is a system
in which many people are connected,
324
00:27:21,059 --> 00:27:26,369
sometimes even through email addresses
rolling over into Facebook. And so I think it's
325
00:27:26,369 --> 00:27:30,059
about the opportunity cost of leaving a
platform: what do you have to lose, what
326
00:27:30,059 --> 00:27:34,619
do you have to gain? But it's also
important to remember that the snapshot
327
00:27:34,619 --> 00:27:40,350
we see of Facebook now
is probably not gonna get better.
328
00:27:40,350 --> 00:27:44,440
It's probably gonna be more invasive,
coming into different parts of our
329
00:27:44,440 --> 00:27:49,710
lives so I think from the security and
privacy aspect it's really just up to the
330
00:27:49,710 --> 00:27:54,019
individual.
Mike 1: Short follow up - if I am allowed
331
00:27:54,019 --> 00:28:03,369
to - I don't see the.. the main point for
me is not my personal implications. I am
332
00:28:03,369 --> 00:28:08,929
quite aware that Facebook is a bad thing
and I can leave it, but I'm sort of
333
00:28:08,929 --> 00:28:13,649
thinking that it's way past the point
where we can decide on our own and decide:
334
00:28:13,649 --> 00:28:17,520
OK, is it good for me or is it good for my
friend or is it good for my mom or for my
335
00:28:17,520 --> 00:28:23,629
dad or whatever? We have to think about:
is Facebook as such a good
336
00:28:23,629 --> 00:28:28,409
thing for society - as you are addressing. So I think
we have to drive this decision-making from
337
00:28:28,409 --> 00:28:34,529
one person to a lot, lot, lot of persons.
J: I agree, and I'll note: what we're
338
00:28:34,529 --> 00:28:36,809
talking about..
applause
339
00:28:36,809 --> 00:28:41,399
- I agree. What we're talking about in the
project that we work on together is a
340
00:28:41,399 --> 00:28:45,049
small piece of the broader issue and I
agree that this needs to be tackled from
341
00:28:45,049 --> 00:28:49,179
many angles.
Herald: Ok, we have a question from one of
342
00:28:49,179 --> 00:28:51,999
our viewers on the Internet, please.
Signal Angel: Yeah, one of the questions
343
00:28:51,999 --> 00:28:55,500
from the internet is: aren't the
moderators, who ban
344
00:28:55,500 --> 00:29:02,729
everything which they don't really like,
the real problem, rather than the providers of the service?
345
00:29:02,729 --> 00:29:05,789
Herald: Can you please repeat that?
Signal Angel: The question was whether the
346
00:29:05,789 --> 00:29:10,799
moderators, sometimes volunteers, aren't the
problem, because they ban everything that
347
00:29:10,799 --> 00:29:15,169
they don't like rather than the providers
of a certain service?
348
00:29:15,169 --> 00:29:19,899
J: Ahm, no I mean I would say that the
content moderators.. we don't know who
349
00:29:19,899 --> 00:29:23,190
they are, so that's part of the issue, as
we don't know and I've - you know - I've
350
00:29:23,190 --> 00:29:27,149
heard many allegations over the years when
certain content's been taken down in a
351
00:29:27,149 --> 00:29:30,799
certain local or cultural context
particularly in the Arab world. I've heard
352
00:29:30,799 --> 00:29:34,629
the accusation that like oh those content
moderators are pro Sisi the dictator in
353
00:29:34,629 --> 00:29:39,249
Egypt or whatever. I'm not sure how much
merit that holds because like I said we
354
00:29:39,249 --> 00:29:44,869
don't know who they are. But what I would say
is that it doesn't feel like
355
00:29:44,869 --> 00:29:48,529
they're given the resources to do their
jobs well, so even if they were the best
356
00:29:48,529 --> 00:29:53,080
most neutral people on earth they're given
very little time probably very little
357
00:29:53,080 --> 00:29:58,509
money and not a whole lot of resources to
work with in making those determinations.
358
00:29:58,509 --> 00:30:01,419
Herald: Thank you. We take a question from
Mike 3 please.
359
00:30:01,419 --> 00:30:06,229
Mike 3: Test, test. OK. First off, thank
you so much for the talk. And I just have
360
00:30:06,229 --> 00:30:12,039
a basic question. So it seems logical that
Facebook is trying to put out this mantra
361
00:30:12,039 --> 00:30:16,749
of "protect the children". I can kind of get
behind that. And it also seems, based on
362
00:30:16,749 --> 00:30:20,530
the fact that they have the "real names
policy" that they would also expect you to
363
00:30:20,530 --> 00:30:25,560
put in a real legal age. So if they're
trying to censor things like nudity: Why
364
00:30:25,560 --> 00:30:30,719
couldn't they simply use things like age
as criteria to protect children from
365
00:30:30,719 --> 00:30:34,239
nudity, while letting everyone else who is
above the legal age make their own
366
00:30:34,239 --> 00:30:38,149
decision?
J: You wanna take that?
367
00:30:38,149 --> 00:30:43,769
M: I think it's a few factors. One: I guess,
on the
368
00:30:43,769 --> 00:30:48,339
technical side, what constitutes nudity?
And in a process way, when something does get
369
00:30:48,339 --> 00:30:53,100
flagged, do you have
a channel or tick boxes to say
370
00:30:53,100 --> 00:30:57,589
what sort of content it is? You could use a system
in which content flagged as nudity
371
00:30:57,589 --> 00:31:02,499
gets referred to a special nudity
moderator and then the moderator says:
372
00:31:02,499 --> 00:31:10,019
"Yes is nudity" then filter all less than
- you know - legal age or whatever age.
373
00:31:10,019 --> 00:31:14,809
But I think it's part of a broader more
systematic approach by Facebook. It's the
374
00:31:14,809 --> 00:31:22,320
broad strokes. It's really kind of
dictating this digital baseline, this
375
00:31:22,320 --> 00:31:26,769
digital morality baseline, and they're
saying: no, nobody in the world can
376
00:31:26,769 --> 00:31:30,519
see this. These are their hard lines, and it
doesn't matter what age you are or where you
377
00:31:30,519 --> 00:31:34,249
reside. This is the box in which they are
placing you; for anybody,
378
00:31:34,249 --> 00:31:38,559
regardless of age or origin,
this is what they say you can see,
379
00:31:38,559 --> 00:31:43,459
and anything that falls outside of it,
you risk
380
00:31:43,459 --> 00:31:47,549
having your account suspended. So I think
it's a mechanism of control.
381
00:31:47,549 --> 00:31:50,729
Herald: Thank you so much. I think
unfortunately we've run out of time for
382
00:31:50,729 --> 00:31:53,880
questions. I would like to apologize to
everyone who's standing; maybe you have
383
00:31:53,880 --> 00:32:01,259
time to discuss that afterwards. Thank you
everyone and thank you.
384
00:32:01,259 --> 00:32:07,972
applause
385
00:32:07,972 --> 00:32:12,801
postroll music
386
00:32:12,801 --> 00:32:20,000
subtitles created by c3subtitles.de
Join, and help us!