0:00:00.440,0:00:09.460
preroll music
0:00:09.460,0:00:13.849
Herald: Our next talk will be tackling how[br]social media companies are creating a
0:00:13.849,0:00:20.480
global morality standard through content[br]regulations. This will be presented by
0:00:20.480,0:00:25.470
two persons standing here: the digital[br]rights advocate Matthew Stender and the
0:00:25.470,0:00:31.220
writer and activist Jillian C. York. Please[br]give them a warm round of applause.
0:00:31.220,0:00:38.090
applause
0:00:38.090,0:00:41.480
Matthew: Hello everybody. I hope you all[br]had a great Congress. Thank you for being
0:00:41.480,0:00:46.949
here today. You know we're almost wrapped[br]up with the Congress but yeah we
0:00:46.949,0:00:51.929
appreciate you being here. My name is[br]Matthew Stender. I am a communications
0:00:51.929,0:00:57.419
strategist, creative director and digital[br]rights advocate focusing on privacy,
0:00:57.419,0:01:02.000
social media censorship and freedom of[br]the press and expression.
0:01:02.000,0:01:06.550
Jillian: And I am Jillian York and I work[br]at the Electronic Frontier Foundation. I
0:01:06.550,0:01:10.581
work on privacy and free expression issues[br]as well as a few other things and I'm
0:01:10.581,0:01:18.480
based in Berlin. Thank you. For Berlin?[br]Awesome. Cool. Hope to see some of you
0:01:18.480,0:01:22.410
there. Great. So today we're gonna be[br]talking about sin in the time of
0:01:22.410,0:01:27.140
technology and what we mean by that is the[br]way in which corporations particularly
0:01:27.140,0:01:31.320
content platforms and social media[br]platforms are driving morality and our
0:01:31.320,0:01:36.630
perception of it. We've got three key[br]takeaways to start off with. The first is
0:01:36.630,0:01:40.430
that social media companies have an[br]unparalleled amount of influence over our
0:01:40.430,0:01:43.980
modern communications. This we know; I[br]think this is probably something everyone
0:01:43.980,0:01:48.130
in this room can agree on. These companies[br]also play a huge role in shaping our
0:01:48.130,0:01:52.580
global outlook on morality and what[br]constitutes it. So the ways in which we
0:01:52.580,0:01:57.280
perceive different imagery different[br]speech is being increasingly defined by
0:01:57.280,0:02:02.100
the regulations that these platforms put[br]upon us on our daily activities on them.
0:02:02.100,0:02:05.780
And third they are entirely undemocratic.[br]They're beholden to shareholders and
0:02:05.780,0:02:10.170
governments but not at all to the public.[br]Not to me, not to you. Rarely do they
0:02:10.170,0:02:14.120
listen to us and when they do there has to[br]be a fairly exceptional amount of public
0:02:14.120,0:02:18.340
pressure on them. And so that's, that's[br]our starting point that's what we wanna
0:02:18.340,0:02:22.769
kick off with and I'll pass the mic to[br]Matthew.
0:02:22.769,0:02:28.170
M: So thinking about these three takeaways[br]I'm going to bring it to kind of top level
0:02:28.170,0:02:38.209
for a moment, to introduce an idea today[br]that some people have talked about:
0:02:38.209,0:02:44.400
the idea of the rise of the techno class.[br]So probably a lot of people in this room
0:02:44.400,0:02:50.990
have followed the negotiations, leaked in[br]part and then in full by WikiLeaks, about
0:02:50.990,0:02:58.050
the Trans-Pacific Partnership, the TPP. What[br]some people have mentioned in this debate
0:02:58.050,0:03:03.360
is the idea of corporate capture: a[br]world now in which corporations are
0:03:03.360,0:03:09.220
maturing to the[br]extent that they can now sue
0:03:09.220,0:03:15.270
governments, and in which the multinational reach[br]of many corporations is larger than
0:03:15.270,0:03:21.620
the diplomatic reach of countries. And[br]social media platforms are
0:03:21.620,0:03:26.250
part of this: these social[br]media companies now are going to have the
0:03:26.250,0:03:30.180
capacity to influence not only cultures[br]but people within cultures and how they
0:03:30.180,0:03:36.239
communicate with people inside their[br]culture and communicate globally. So as
0:03:36.239,0:03:40.989
activists and technologists I would like[br]to propose at least that we start thinking
0:03:40.989,0:03:45.099
about and beyond the product and service[br]offerings of today's social media
0:03:45.099,0:03:50.660
companies and start looking ahead two,[br]five, ten years down the road, when
0:03:50.660,0:03:56.559
these companies may have social media[br]service offerings which are
0:03:56.559,0:04:00.709
indistinguishable from[br]today's ISPs and
0:04:00.709,0:04:05.910
telcos and other things. And this is[br]really to say that social media is moving
0:04:05.910,0:04:15.400
past the era of the walled garden into neo-[br]empires. So on
0:04:15.400,0:04:20.298
the slide are some headlines about[br]different delivery mechanisms with which
0:04:20.298,0:04:24.980
social media companies, and also[br]people like Elon Musk, are looking to
0:04:24.980,0:04:30.050
almost leapfrog, if not completely[br]leapfrog, the existing technologies of
0:04:30.050,0:04:36.530
radio broadcasting, fiber optics, these sorts[br]of things. So we now are looking at a
0:04:36.530,0:04:41.240
world in which Facebook is now gonna[br]have drones, Google is looking into
0:04:41.240,0:04:46.590
balloons, and other people are looking into low[br]earth orbit satellites to be able to
0:04:46.590,0:04:52.220
provide directly to the end consumer, to[br]the user, to the handset the content which
0:04:52.220,0:04:58.199
flows through these networks. So one of[br]the first things I believe we're gonna see
0:04:58.199,0:05:05.600
in this field is Free Basics. Facebook has[br]a service that was launched as Internet.org
0:05:05.600,0:05:12.021
and has now been rebranded Free Basics.[br]Why this is interesting is that while
0:05:12.021,0:05:17.060
on the one hand Free Basics is a free service[br]that is trying to get the people who
0:05:17.060,0:05:21.590
are not on the Internet now to use[br]Facebook's window to the
0:05:21.590,0:05:28.030
world, it has maybe a couple dozen sites[br]that are accessible. It runs over the data
0:05:28.030,0:05:35.180
networks in these countries. Reliance, the[br]telecommunications company in India, is one
0:05:35.180,0:05:41.550
of the larger telcos but not the largest.[br]There's a lot of pressure that Facebook is
0:05:41.550,0:05:48.320
putting on the government of India right[br]now to be able to have the service offered
0:05:48.320,0:05:54.960
across the country. One of the ways that[br]this is problematic is because a limited
0:05:54.960,0:05:59.300
number of websites flow through this to the[br]people that get exposed to Free Basics.
0:05:59.300,0:06:05.180
This might be their first time seeing the[br]Internet in some cases. An example that
0:06:05.180,0:06:10.350
is interesting to think about is a lion[br]born into a zoo. Perhaps evolution and
0:06:10.350,0:06:16.889
other things may have this lion dreaming[br]of running wild on the plains of
0:06:16.889,0:06:25.080
Africa, but at the same time it will never[br]know that world. Facebook Free Basics users,
0:06:25.080,0:06:32.199
knowing only Facebook's window to[br]the Internet, may not all jump over to a
0:06:32.199,0:06:37.380
full data package on their ISP. And many[br]people may be stuck in Facebook's window
0:06:37.380,0:06:41.190
to the world.
0:06:41.190,0:06:44.740
J: In other words we've reached an era[br]where these companies have as I've said
0:06:44.740,0:06:48.530
unprecedented control over our daily[br]communications both the information that
0:06:48.530,0:06:52.460
we can access and the speech and imagery[br]that we can express to the world and to
0:06:52.460,0:06:55.880
each other. So the postings and pages and[br]friend requests, the millions of
0:06:55.880,0:06:59.620
politically active users as well, have[br]helped to make Mark Zuckerberg and his
0:06:59.620,0:07:02.770
colleagues as well as the people at[br]Google, Twitter and all these other fine
0:07:02.770,0:07:07.850
companies extremely rich and yet we're[br]pushing back in this case. I've got a
0:07:07.850,0:07:11.400
great quote from Rebecca MacKinnon where[br]she refers to Facebook as Facebookistan
0:07:11.400,0:07:14.610
and I think that is an apt example of[br]what we're looking at. These are
0:07:14.610,0:07:19.280
corporations but they're not beholden at[br]all to the public as we know. And instead
0:07:19.280,0:07:23.190
they've kind of turned into these quasi[br]dictatorships that dictate precisely how
0:07:23.190,0:07:29.400
we behave on them. I also wanted to throw[br]this one up to talk a little bit about the
0:07:29.400,0:07:34.030
global speech norm. This is from Ben[br]Wagner who's written a number of pieces on
0:07:34.030,0:07:38.370
this but who kind of coined the concept of[br]a global speech standard which is what
0:07:38.370,0:07:42.930
these companies have begun to impose and are[br]increasingly imposing upon us. This global
0:07:42.930,0:07:47.800
speech standard is essentially catering to[br]everyone in the world trying to make every
0:07:47.800,0:07:51.800
user in every country and every government[br]happy. But as a result they have kind of tamped
0:07:51.800,0:07:57.360
down free speech to this very basic level[br]that makes both the governments of let's
0:07:57.360,0:08:00.930
say the United States and Germany happy as[br]well as the governments of countries like
0:08:00.930,0:08:05.270
Saudi Arabia. Therefore we're looking at[br]really kind of the lowest common
0:08:05.270,0:08:09.870
denominator when it comes to some[br]types of speech and this sort of flat
0:08:09.870,0:08:14.810
gray standard when it comes to others.
0:08:14.810,0:08:22.050
M: So Jillian just mentioned we have[br]countries in play. Facebook and other
0:08:22.050,0:08:26.080
social media companies are[br]trying to pivot and play within an
0:08:26.080,0:08:30.490
international field, but let's just take[br]a moment to look at the scale and
0:08:30.490,0:08:35.799
scope and size of these social media[br]companies. So I just put some figures from
0:08:35.799,0:08:42.750
the Internet and some recent census[br]information: we have China with 1.37 billion
0:08:42.750,0:08:48.020
people, India with 1.25 billion people, 2.2[br]billion individuals who are practitioners of
0:08:48.020,0:08:51.762
Islam and Christianity. But now we have[br]Facebook with, according to their
0:08:51.762,0:08:57.610
statistics, 1.5 billion active monthly[br]users. Their statistics - I'm sure many
0:08:57.610,0:09:01.520
people would like to dispute these numbers[br]- but at the same time these platforms
0:09:01.520,0:09:08.640
are now large. I mean larger than - well, not[br]larger than some religions - but
0:09:08.640,0:09:15.040
Facebook has more monthly active users[br]than China or India have citizens. So
0:09:15.040,0:09:19.061
we're not talking about - you know -[br]basement startups, we're now talking about
0:09:19.061,0:09:26.940
companies with the size and scale to be[br]able to really be influential in a larger
0:09:26.940,0:09:42.790
institutional way. So: the Magna Carta, the[br]U.S. Constitution, the Declaration of Human
0:09:42.790,0:09:49.130
Rights, the tree of masters, the Bible, the[br]Quran. These are time-tested, at least
0:09:49.130,0:09:55.310
longstanding, principal documents that[br]place upon their constituents,
0:09:55.310,0:10:03.980
whether they be citizens or spiritual[br]adherents, a certain code of conduct.
0:10:03.980,0:10:08.510
Facebook, as Jillian mentioned, is[br]nondemocratic. Facebook's terms and
0:10:08.510,0:10:12.990
standards were written by a small group of[br]individuals with a few compelling
0:10:12.990,0:10:18.540
interests in mind. But we are now talking[br]about 1.5 billion people on a monthly
0:10:18.540,0:10:27.240
basis that are subservient to a Terms of[br]Service on which they had no input.
0:10:27.240,0:10:32.470
To sort of pivot from there and bring it back[br]to spirituality: why is this important?
0:10:32.470,0:10:41.890
Well, spiritual morality has always been a[br]place for religion. Religion has a
0:10:41.890,0:10:49.760
monopoly on the soul. You could say that[br]religion is a set of rules which,
0:10:49.760,0:10:55.860
if you obey, let you avoid hell or reach heaven[br]or an afterlife or be reincarnated,
0:10:55.860,0:11:02.560
whatever the religious practice may be.[br]Civil morality is quite interesting in the
0:11:02.560,0:11:07.610
sense that the sovereign state as a top[br]level institution has the ability to put
0:11:07.610,0:11:13.460
into place a series of statutes and[br]regulations, the violation of which can
0:11:13.460,0:11:18.570
send you to jail. Another interesting note[br]is that the state also has a
0:11:18.570,0:11:24.670
monopoly on the use of sanctioned[br]violence. That is to say, the
0:11:24.670,0:11:28.080
official actors of the state are able to[br]do things that the
0:11:28.080,0:11:35.570
citizens of that state may not. And if we[br]take a look at this concept of
0:11:35.570,0:11:41.910
digital morality I spoke about earlier,[br]with services like Free Basics introducing
0:11:41.910,0:11:48.459
new individuals to the Internet - well, by a[br]violation of the terms of service you can
0:11:48.459,0:11:58.399
be excluded from these massive global[br]networks. And really, Facebook is
0:11:58.399,0:12:04.050
actually trying to create, if not a[br]monopoly, a semi-monopoly on global
0:12:04.050,0:12:14.000
connectivity, in a lot of ways. So what[br]drives Facebook? A few things.
0:12:14.000,0:12:20.870
One is a protectionistic[br]legal framework. OK. The control of
0:12:20.870,0:12:25.360
copyright violations is something that a[br]lot of platforms stomped out pretty early.
0:12:25.360,0:12:29.899
They don't want to be sued by the RIAA or[br]the MPAA. And so there were mechanisms by
0:12:29.899,0:12:36.641
which copyrighted material was[br]able to be taken off the platform. They
0:12:36.641,0:12:41.990
also limit potential competition. And I[br]think this is quite interesting in the
0:12:41.990,0:12:47.220
sense that they have shown this in two[br]ways. One: they've
0:12:47.220,0:12:52.140
purchased rival or potential competitors.[br]You see this with Instagram being bought
0:12:52.140,0:12:57.870
by Facebook. But Facebook has also[br]demonstrated the ability or willingness to
0:12:57.870,0:13:04.899
censor certain content.[br]tsu.co was a new social site, and
0:13:04.899,0:13:10.279
mentions and links to this platform were[br]deleted or not allowed on Facebook. So
0:13:10.279,0:13:16.480
even using Facebook as a platform to talk[br]about another platform was not
0:13:16.480,0:13:23.860
allowed. And then a third component is[br]operation on a global scale. It's not only
0:13:23.860,0:13:29.440
the size of the company, it's also about[br]the global reach. Facebook maintains
0:13:29.440,0:13:33.320
offices around the world, as other social[br]media companies do, they engage in public
0:13:33.320,0:13:41.720
diplomacy and they also offer aid in many[br]countries and many languages. So just to
0:13:41.720,0:13:47.450
take it to companies like Facebook for[br]a moment: really, in economics you have the
0:13:47.450,0:13:53.490
traditional multinationals of the 20th[br]century - Coca-Cola, McDonald's. The
0:13:53.490,0:14:00.610
goal for the end[br]user of these products was consumption.
0:14:00.610,0:14:05.170
This is changing now. Facebook is looking[br]to capture more and more parts of the
0:14:05.170,0:14:11.339
supply chain. This may be as a[br]service provider, as a content moderator,
0:14:11.339,0:14:21.070
and as the party responsible for negotiating and[br]adjudicating content disputes. At the end
0:14:21.070,0:14:27.510
of the day the users are really the[br]product. The platform is not for us,
0:14:27.510,0:14:32.630
the Facebook users; it's really for advertisers.[br]If we take the hierarchy of the platform, there's
0:14:32.630,0:14:38.510
the corporation, then the advertisers, and then the users[br]kind of at the fringes.
0:14:38.510,0:14:41.899
J: So let's get into the nitty gritty a[br]little bit about what content moderation
0:14:41.899,0:14:46.399
on these platforms actually looks like. So[br]I've put up two headlines from Adrian Chen,
0:14:46.399,0:14:50.149
a journalist who wrote these for Gawker[br]and Wired respectively. They're both a
0:14:50.149,0:14:54.910
couple of years old. But what he did was[br]investigate who was
0:14:54.910,0:14:58.899
moderating the content on these platforms,[br]and what he found, and accused these
0:14:58.899,0:15:03.100
companies of, is that they outsource the[br]content moderation to low-paid workers in
0:15:03.100,0:15:06.779
developing countries. In the first article[br]I think Morocco
0:15:06.779,0:15:09.970
was the country, and I'm gonna show a slide[br]from that, a bit of what those content
0:15:09.970,0:15:14.050
moderators worked with. And the second[br]article talked a lot about the use of
0:15:14.050,0:15:17.110
workers in the Philippines for this[br]purpose. We know that these workers are
0:15:17.110,0:15:21.529
probably low paid. We know that they're[br]given a very, very
0:15:21.529,0:15:25.200
minimal timeframe to look at the content[br]that they're being presented with. So here's
0:15:25.200,0:15:31.199
how it basically works across platforms[br]with small differences. I post something
0:15:31.199,0:15:34.580
and I'll show you some great examples of[br]things that were posted later. I post something
0:15:34.580,0:15:39.070
and if I post it to my friends only, my[br]friends can then report it to the company.
0:15:39.070,0:15:43.430
If I post it publicly anybody who can see[br]it or who's a user of the product can
0:15:43.430,0:15:47.800
report it to the company. Once a piece of[br]content is reported a content moderator
0:15:47.800,0:15:51.420
then looks at it and within that very[br]small time frame we're talking half a
0:15:51.420,0:15:56.160
second to two seconds probably based on[br]the investigative research that's been
0:15:56.160,0:16:00.130
done by a number of people they have to[br]decide if this content fits the terms of
0:16:00.130,0:16:03.990
service or not. Now most of these[br]companies have a legalistic terms of
0:16:03.990,0:16:07.800
service as well as a set of community[br]guidelines or community standards which
0:16:07.800,0:16:12.740
are clearer to the user, but they're still[br]often very vague. And so I wanna get into
0:16:12.740,0:16:18.270
a couple of examples that show that. This[br]slide is one of the
0:16:18.270,0:16:21.290
examples that I gave. You can't see it[br]very well, so I won't leave it up for too
0:16:21.290,0:16:25.800
long. But that was what content moderators[br]at this outsourcing company oDesk were
0:16:25.800,0:16:33.880
allegedly using to moderate content on[br]Facebook. This next photo contains nudity.
0:16:33.880,0:16:40.740
So I think everyone probably knows who[br]this is and has seen this photo. Yes? No?
0:16:40.740,0:16:46.389
OK. This is Kim Kardashian and this photo[br]allegedly broke the Internet. It was a
0:16:46.389,0:16:50.779
photo taken for Paper magazine. It was[br]posted widely on the web and it was seen
0:16:50.779,0:16:55.459
by many many people. Now this photograph[br]definitely violates Facebook's terms of
0:16:55.459,0:17:00.380
service. Buuut Kim Kardashian's really[br]famous and makes a lot of money. So in
0:17:00.380,0:17:06.779
most instances as far as I can tell this[br]photo was totally fine on Facebook. Now
0:17:06.779,0:17:11.169
let's talk about those rules a little bit.[br]Facebook says that they restrict nudity
0:17:11.169,0:17:15.749
unless it is art. So they do make an[br]exception for art which may be why they
0:17:15.749,0:17:21.169
allowed that image of Kim Kardashian's[br]behind to stay up. Art is defined by the
0:17:21.169,0:17:26.128
individual. And yet at the same time,[br]you know, they make clear that, let's
0:17:26.128,0:17:30.420
say a photograph of Michelangelo's David[br]or a photograph of another piece of art in
0:17:30.420,0:17:34.210
a museum would be perfectly acceptable,[br]whereas, you know, your sort of average
0:17:34.210,0:17:39.090
nudity maybe probably is not going to be[br]allowed to remain on the platform. They
0:17:39.090,0:17:42.970
also note that they restrict the display[br]of nudity to ensure that their global..
0:17:42.970,0:17:46.379
because their global community may be[br]sensitive to this type of content
0:17:46.379,0:17:50.120
particularly because of their cultural[br]background or age. So this is Facebook in
0:17:50.120,0:17:54.909
their community standards telling you[br]explicitly that they are toning down free
0:17:54.909,0:18:01.789
speech to make everyone happy. This is[br]another photograph. Germans, I'm
0:18:01.789,0:18:07.480
particularly interested: is everyone[br]familiar with the show The Golden Girls?
0:18:07.480,0:18:10.929
OK. Quite a few. So you might recognize[br]her, she was Dorothy on The Golden Girls;
0:18:10.929,0:18:15.090
this is the actress Bea Arthur and this is[br]from a painting from 1991 by John Currin
0:18:15.090,0:18:19.289
of her. It's unclear whether or not she[br]sat for the painting. It's a wonderful
0:18:19.289,0:18:23.590
image, it's a very beautiful[br]portrait of her. But I posted it on
0:18:23.590,0:18:28.180
Facebook several times in a week. I[br]encouraged my friends to report it. And in
0:18:28.180,0:18:36.010
fact Facebook found this to not be art.[br]Sorry. Another image: this is by a
0:18:36.010,0:18:40.730
Canadian artist called Rupi Kaur. She[br]posted a series of images in which she was
0:18:40.730,0:18:45.509
menstruating; she was trying to[br]essentially describe the
0:18:45.509,0:18:49.580
normality of this. The fact that this is[br]something that all women go through. Most
0:18:49.580,0:18:56.909
women go through. And as a result[br]Instagram took it down. Unclear on the reasons;
0:18:56.909,0:19:01.100
they told her that it had violated the terms[br]of service but weren't exactly clear as to
0:19:01.100,0:19:06.039
why. And finally this is another one. This[br]is by an artist friend of mine, I'm afraid
0:19:06.039,0:19:09.951
that I have completely blanked on who did[br]this particular piece, but what it was is:
0:19:09.951,0:19:16.370
They took famous works of nude art and had[br]sex workers pose in the same poses as the
0:19:16.370,0:19:19.980
pieces of art. I thought it was a really[br]cool project but Google Plus did not find
0:19:19.980,0:19:23.919
that to be a really cool project. And[br]because of their guidelines on nudity they
0:19:23.919,0:19:30.909
banned it. This is a cat. Just want to[br]make sure you're awake. It was totally
0:19:30.909,0:19:37.629
allowed. So in addition to the problems of[br]content moderation I'm gonna go ahead and
0:19:37.629,0:19:41.290
say that we also have a major diversity[br]problem at these companies. These
0:19:41.290,0:19:45.509
statistics are facts; these are from all of[br]these companies themselves. They put out
0:19:45.509,0:19:51.109
diversity reports recently - or as I like to[br]call them, 'diversity' reports - and they show
0:19:51.109,0:19:54.250
that. So the statistics are a little bit[br]different because they only capture data
0:19:54.250,0:19:59.249
on ethnicity or nationality in their US[br]offices just because of how those
0:19:59.249,0:20:02.710
standards are sort of odd all over the[br]world. So the first stats refer to
0:20:02.710,0:20:06.690
their global staff. The second ones in[br]each line refer to their U.S. staff but as
0:20:06.690,0:20:10.749
you can see these companies are largely[br]made of white men which is probably not
0:20:10.749,0:20:16.080
surprising but it is a problem. Now why is[br]that a problem? Particularly when you're
0:20:16.080,0:20:20.440
talking about policy teams: the people who[br]build policies and regulations have an
0:20:20.440,0:20:24.499
inherent bias. We all have an inherent[br]bias. But what we've seen in this is
0:20:24.499,0:20:30.460
really a bias of sort of the American[br]style of prudishness. Nudity is not allowed
0:20:30.460,0:20:34.669
but violence, extreme violence, as long as[br]it's fictional, is totally OK. And that's
0:20:34.669,0:20:39.221
generally how these platforms operate. And[br]so I think that when we ensure that there
0:20:39.221,0:20:44.779
is diversity in the teams creating both[br]our tools our technology and our policies,
0:20:44.779,0:20:49.200
then we can ensure that diverse world[br]views are brought into that creation
0:20:49.200,0:20:56.479
process and that the policies are[br]therefore more just. So what can we do
0:20:56.479,0:21:00.840
about this problem? As consumers as[br]technologists as activists as whomever you
0:21:00.840,0:21:04.940
might identify as. The first one I[br]think a lot of the technologists are gonna
0:21:04.940,0:21:08.779
agree with: Develop decentralized[br]networks. We need to work toward that
0:21:08.779,0:21:12.409
ideal because these companies are not[br]getting any smaller. We're not gonna
0:21:12.409,0:21:16.279
necessarily go out and say that they're[br]too big to fail but they are massive and
0:21:16.279,0:21:20.479
as Matthew noted earlier they're buying up[br]properties all over the place and making
0:21:20.479,0:21:25.399
sure that they do have control over our[br]speech. The second thing is to push for
0:21:25.399,0:21:29.830
greater transparency around terms of[br]service takedowns. Now I'm not a huge fan
0:21:29.830,0:21:33.210
of transparency for the sake of[br]transparency. I think that these, you
0:21:33.210,0:21:36.509
know, these companies have been putting[br]out transparency reports for a long time
0:21:36.509,0:21:42.379
that show what countries ask them to take[br]down content or hand over user data. But
0:21:42.379,0:21:48.330
we've found those transparency reports to[br]be incredibly flawed already. And so in
0:21:48.330,0:21:51.529
pushing for greater transparency around[br]terms of service take downs that's only a
0:21:51.529,0:21:55.809
first step. The third thing is, we need to[br]demand that these companies adhere to
0:21:55.809,0:21:59.260
global speech standards. We already have[br]the Universal Declaration of Human Rights.
0:21:59.260,0:22:03.950
I don't understand why we need companies[br]to develop their own bespoke rules. And so
0:22:03.950,0:22:10.509
by..[br]applause
0:22:10.509,0:22:13.679
And so by demanding that companies adhere[br]to global speech standards we can ensure
0:22:13.679,0:22:17.519
that these are places of free expression[br]because it is unrealistic to just tell
0:22:17.519,0:22:21.029
people to get off Facebook. I can't tell[br]you how many times in the tech community
0:22:21.029,0:22:26.230
over the years I've heard people say well[br]if you don't like it just leave. That's
0:22:26.230,0:22:29.519
not a realistic option for many people[br]around the world and I think we all know
0:22:29.519,0:22:32.229
that deep down.[br]applause
0:22:32.229,0:22:38.690
Thank you. And so the other thing I would[br]say though is that public pressure works.
0:22:38.690,0:22:42.940
We saw last year with Facebook's real name[br]policy that there were a number of drag
0:22:42.940,0:22:46.899
performers in the San Francisco Bay Area[br]who were kicked off the platform because
0:22:46.899,0:22:50.090
they were using their performance names - their[br]drag names - which is a completely
0:22:50.090,0:22:54.739
legitimate thing to do just as folks have[br]hacker names or other pseudonyms. But
0:22:54.739,0:22:58.849
those folks pushed back. They formed a[br]coalition and they got Facebook to change
0:22:58.849,0:23:04.239
a little bit. It's not completely there[br]yet but they're making progress and I'm
0:23:04.239,0:23:09.119
hoping that this goes well. And then the[br]last thing is and this is totally a pitch
0:23:09.119,0:23:13.249
thrown right out there: Support projects[br]like ours. I'm gonna throw to Matthew to
0:23:13.249,0:23:16.970
talk about onlinecensorship.org and[br]another project done by the excellent
0:23:16.970,0:23:21.289
Rebecca MacKinnon called[br]Ranking Digital Rights.
0:23:21.289,0:23:25.320
M: So just a little bit of thinking[br]outside the box: onlinecensorship.org is a
0:23:25.320,0:23:31.630
platform that's recently launched. Users[br]can go onto the platform and submit a
0:23:31.630,0:23:36.590
small questionnaire if their content has[br]been taken down by the platforms. Why we
0:23:36.590,0:23:39.460
think this is exciting is because right[br]now, as we mentioned, transparency
0:23:39.460,0:23:45.169
reports are fundamentally flawed. We are[br]looking to crowdsource information about
0:23:45.169,0:23:49.090
the ways in which the social media[br]companies, six social media companies, are
0:23:49.090,0:23:53.940
moderating and taking down content, because[br]we can't otherwise get that kind of
0:23:53.940,0:23:57.159
accountability and transparency in real[br]time. We're hoping to be able to find
0:23:57.159,0:24:03.999
trends across the kind of content that has[br]been taken down, geographic trends, news-
0:24:03.999,0:24:08.350
related trends, within this sort of[br]self-reported content takedown data. But it's
0:24:08.350,0:24:12.710
platforms like these that I think that[br]hopefully will begin to spring up in
0:24:12.710,0:24:18.370
response, for the community to be able to[br]put tools in place so that people can be a
0:24:18.370,0:24:21.539
part of the reporting[br]and transparency initiative.
0:24:21.539,0:24:24.269
J: We launched about a month ago and we're[br]hoping to put out our first set of reports
0:24:24.269,0:24:29.059
around March. And finally I just want to[br]close with one more quote before we slip
0:24:29.059,0:24:33.740
into Q&A, and that is, just to read it: It's[br]reasonable that we press Facebook on these
0:24:33.740,0:24:37.409
questions of public responsibility, while[br]also acknowledging that Facebook cannot be
0:24:37.409,0:24:41.159
all things to all people. We can demand[br]that their design decisions and user
0:24:41.159,0:24:45.950
policies stand up for review, be explicit,[br]thoughtful and open to public
0:24:45.950,0:24:50.200
deliberation. But - and this is the most[br]important part in my view - the choices that
0:24:50.200,0:24:54.549
Facebook makes in their design and in[br]their policies are value judgments. This is
0:24:54.549,0:24:57.929
political and I know you've heard that in[br]a lot of talks. So have I. But I think we
0:24:57.929,0:25:01.700
can't, we cannot forget that this is all[br]political and we have to address it as
0:25:01.700,0:25:06.460
such. And for some, if that means, you[br]know, quitting the platform, that's fine too.
0:25:06.460,0:25:09.139
But I think that we should still[br]understand that our friends our relatives
0:25:09.139,0:25:13.349
our families are using these platforms and[br]that we do owe it to everybody to make
0:25:13.349,0:25:17.969
them a better place for free expression[br]and privacy. Thank you.
0:25:17.969,0:25:23.780
applause
0:25:23.780,0:25:28.629
Herald: Thank you so much. So please now[br]we have a section of Q&A for anyone who
0:25:28.629,0:25:36.110
has a question please use one of the mics[br]on the sides. And I think we have a
0:25:36.110,0:25:44.259
question from one of our viewers? No. OK.[br]Please proceed, number one.
0:25:44.259,0:25:48.859
Mic 1: You just addressed that. Especially[br]after listening to your talk
0:25:48.859,0:25:55.499
I'm sort of on the verge of quitting[br]Facebook or starting to I don't know.
0:25:55.499,0:26:00.200
applause[br]Yeah, I mean and I agree it's a hard
0:26:00.200,0:26:07.200
decision. I've been on Facebook for I[br]think six years now and it is a dispute
0:26:07.200,0:26:12.729
within myself. So, I'm in this very[br]strange position. And now I have to kind
0:26:12.729,0:26:18.389
of decide what to do. Are there any.. is[br]there any help for me out there to tell me
0:26:18.389,0:26:26.269
what would be.. I don't know what.. that[br]takes my state and helps me in deciding..
0:26:26.269,0:26:28.950
I don't know. It's strange.
0:26:28.950,0:26:33.639
J: That's such a hard question. I mean[br]I'm.. I'll put on my privacy hat for just
0:26:33.639,0:26:38.809
a second and say what I would say to[br]people when they're making that
0:26:38.809,0:26:41.389
consideration from a privacy viewpoint,[br]because I do think that the
0:26:41.389,0:26:44.849
implications for privacy on these platforms[br]are often much more severe than those for
0:26:44.849,0:26:49.390
speech. But this is what I do. So in that[br]case you know I think it's really about
0:26:49.390,0:26:54.249
understanding your threat model,[br]understanding what sort of threat you're
0:26:54.249,0:26:57.799
under when it comes to you know the data[br]collection that these companies are
0:26:57.799,0:27:01.850
undertaking as well as the censorship of[br]course. But I think it really is a
0:27:01.850,0:27:04.989
personal decision and I'm sure that[br]there are - you know - there are great
0:27:04.989,0:27:09.460
resources out there around digital[br]security and around thinking through those
0:27:09.460,0:27:12.859
threat model processes, and perhaps that[br]could be of help to you for that. If you
0:27:12.859,0:27:17.210
want to add?[br]M: No I mean I think it's, it's one of
0:27:17.210,0:27:21.059
these big toss-ups. This is a system[br]through which many people are connected
0:27:21.059,0:27:26.369
- even, sometimes, e-mail addresses[br]roll over into Facebook. And so I think it's
0:27:26.369,0:27:30.059
about the opportunity cost of leaving a[br]platform. What do you have to lose, what
0:27:30.059,0:27:34.619
do you have to gain? But it's also[br]important to remember that the snapshot
0:27:34.619,0:27:40.350
we see of Facebook now is probably[br]not gonna get better.
0:27:40.350,0:27:44.440
It's probably gonna be more invasive and[br]come into different parts of our
0:27:44.440,0:27:49.710
lives so I think from the security and[br]privacy aspect it's really just up to the
0:27:49.710,0:27:54.019
individual.[br]Mic 1: Short follow-up - if I am allowed
0:27:54.019,0:28:03.369
to - I don't see the.. the main point for[br]me is not my personal implications so I am
0:28:03.369,0:28:08.929
quite aware that Facebook is a bad thing[br]and I can leave it, but I'm sort of
0:28:08.929,0:28:13.649
thinking that it's way past the point[br]where we can decide on our own:
0:28:13.649,0:28:17.520
OK, is it good for me or is it good for my[br]friend or is it good for my mom or for my
0:28:17.520,0:28:23.629
dad or whatever? We have to think about:[br]Is Facebook as such a good
0:28:23.629,0:28:28.409
thing for society - as you are addressing. So I think[br]we have to move this decision making from
0:28:28.409,0:28:34.529
one person to a lot, lot, lot of persons.[br]J: I agree, and I'll note: what we're
0:28:34.529,0:28:36.809
talking about..[br]applause
0:28:36.809,0:28:41.399
- I agree. What we're talking about in the[br]project that we work on together is a
0:28:41.399,0:28:45.049
small piece of the broader issue and I[br]agree that this needs to be tackled from
0:28:45.049,0:28:49.179
many angles.[br]Herald: Ok, we have a question from one of
0:28:49.179,0:28:51.999
our viewers on the Internet, please.[br]Signal Angel: Yeah, one of the questions
0:28:51.999,0:28:55.500
from the internet is: Aren't the[br]moderators the real problem, the ones who ban
0:28:55.500,0:29:02.729
everything which they don't really like,[br]rather than the providers of the service?
0:29:02.729,0:29:05.789
Herald: Can you please repeat that?[br]Signal Angel: The question was whether the
0:29:05.789,0:29:10.799
moderators, sometimes volunteers, aren't the[br]problem, because they ban everything that
0:29:10.799,0:29:15.169
they don't like rather than the providers[br]of a certain service?
0:29:15.169,0:29:19.899
J: Ahm, no I mean I would say that the[br]content moderators.. we don't know who
0:29:19.899,0:29:23.190
they are, so that's part of the issue, as[br]we don't know and I've - you know - I've
0:29:23.190,0:29:27.149
heard many allegations over the years when[br]certain content's been taken down in a
0:29:27.149,0:29:30.799
certain local or cultural context[br]particularly in the Arab world. I've heard
0:29:30.799,0:29:34.629
the accusation that, like, oh those content[br]moderators are pro-Sisi, the dictator in
0:29:34.629,0:29:39.249
Egypt or whatever. I'm not sure how much[br]merit that holds because like I said we
0:29:39.249,0:29:44.869
don't know who they are. But what I would say[br]is that it doesn't feel like
0:29:44.869,0:29:48.529
they're given the resources to do their[br]jobs well, so even if they were the best
0:29:48.529,0:29:53.080
most neutral people on earth they're given[br]very little time probably very little
0:29:53.080,0:29:58.509
money and not a whole lot of resources to[br]work with in making those determinations.
0:29:58.509,0:30:01.419
Herald: Thank you. We take a question from[br]Mic 3 please.
0:30:01.419,0:30:06.229
Mic 3: Test, test. OK. First off, thank[br]you so much for the talk. And I just have
0:30:06.229,0:30:12.039
a basic question. So it seems logical that[br]Facebook is trying to put out this mantra
0:30:12.039,0:30:16.749
of protect the children. I can kind of get[br]behind that. And it also seems based on
0:30:16.749,0:30:20.530
the fact that they have the "real names[br]policy" that they would also expect you to
0:30:20.530,0:30:25.560
put in a real legal age. So if they're[br]trying to censor things like nudity: Why
0:30:25.560,0:30:30.719
couldn't they simply use things like age[br]as criteria to protect children from
0:30:30.719,0:30:34.239
nudity, while letting everyone else who is[br]above the legal age make their own
0:30:34.239,0:30:38.149
decision?[br]J: You wanna take that?
0:30:38.149,0:30:43.769
M: I think it's a few factors. One: I[br]guess on the technical side, what
0:30:43.769,0:30:48.339
constitutes nudity?[br]And in a process way, when something is
0:30:48.339,0:30:53.100
flagged, do you have a channel or boxes to say[br]what sort of content it is?
0:30:53.100,0:30:57.589
You could use a system[br]in which content flagged as nudity
0:30:57.589,0:31:02.499
gets referred to a special nudity[br]moderator, and then the moderator says:
0:31:02.499,0:31:10.019
"Yes is nudity" then filter all less than[br]- you know - legal age or whatever age.
0:31:10.019,0:31:14.809
But I think it's part of a broader, more[br]systematic approach by Facebook. It's
0:31:14.809,0:31:22.320
broad strokes. It's really kind of[br]dictating this
0:31:22.320,0:31:26.769
digital morality baseline, and they're[br]saying: "No, nobody in the world can
0:31:26.769,0:31:30.519
see this. These are our hard lines and it[br]doesn't matter what age you are or where you
0:31:30.519,0:31:34.249
reside. This is the box we are[br]placing you in, and content that falls
0:31:34.249,0:31:38.559
outside of this box, for anybody,[br]regardless of age or origin -
0:31:38.559,0:31:43.459
this is what we say you can see, and for[br]anything that falls outside of it you risk
0:31:43.459,0:31:47.549
having your account suspended." So I think[br]it's a mechanism of control.
0:31:47.549,0:31:50.729
Herald: Thank you so much. I think[br]unfortunately we've run out of time for
0:31:50.729,0:31:53.880
questions. I would like to apologize to[br]everyone who's standing; maybe you have
0:31:53.880,0:32:01.259
time to discuss that afterwards. Thank you[br]everyone and thank you.
0:32:01.259,0:32:07.972
applause
0:32:07.972,0:32:12.801
postroll music
0:32:12.801,0:32:20.000
subtitles created by c3subtitles.de[br]Join, and help us!