Matthew Stender, Jillian C. York: Sin in the time of Technology
-
0:00 - 0:09preroll music
-
0:09 - 0:14Herald: Our next talk will be tackling how
social media companies are creating a -
0:14 - 0:20global morality standard through content
regulations. It will be presented by -
0:20 - 0:25the two people standing here: the digital
rights advocate Matthew Stender and the -
0:25 - 0:31writer and activist Jillian C. York. Please
give them a warm round of applause. -
0:31 - 0:38applause
-
0:38 - 0:41Matthew: Hello everybody. I hope you all
had a great Congress. Thank you for being -
0:41 - 0:47here today. You know we're almost wrapped
up with the Congress but yeah we -
0:47 - 0:52appreciate you being here. My name is
Matthew Stender. I am a communications -
0:52 - 0:57strategist, creative director and digital
rights advocate focusing on privacy, -
0:57 - 1:02social media censorship and freedom of
the press and expression. -
1:02 - 1:07Jillian: And I am Jillian York and I work
at the Electronic Frontier Foundation. I -
1:07 - 1:11work on privacy and free expression issues
as well as a few other things and I'm -
1:11 - 1:18based in Berlin. Thank you. For Berlin?
Awesome. Cool. Hope to see some of you -
1:18 - 1:22there. Great. So today we're gonna be
talking about sin in the time of -
1:22 - 1:27technology and what we mean by that is the
way in which corporations particularly -
1:27 - 1:31content platforms and social media
platforms are driving morality and our -
1:31 - 1:37perception of it. We've got three key
takeaways to start off with. The first is -
1:37 - 1:40that social media companies have an
unparalleled amount of influence over our -
1:40 - 1:44modern communications. This we know; I
think this is probably something everyone -
1:44 - 1:48in this room can agree on. These companies
also play a huge role in shaping our -
1:48 - 1:53global outlook on morality and what
constitutes it. So the ways in which we -
1:53 - 1:57perceive different imagery, different
speech, are being increasingly defined by -
1:57 - 2:02the regulations that these platforms put
upon us and on our daily activities on them. -
2:02 - 2:06And third they are entirely undemocratic.
They're beholden to shareholders and -
2:06 - 2:10governments but not at all to the public.
Not to me, not to you. Rarely do they -
2:10 - 2:14listen to us and when they do there has to
be a fairly exceptional amount of public -
2:14 - 2:18pressure on them. And so that's, that's
our starting point that's what we wanna -
2:18 - 2:23kick off with and I'll pass the mike to
Matthew. -
2:23 - 2:28M: So thinking about these three takeaways
I'm going to bring it to kind of a top level -
2:28 - 2:38for a moment, to introduce an idea today
which some people have talked about: -
2:38 - 2:44the idea of the rise of the techno class.
So probably a lot of people in this room -
2:44 - 2:51have followed the negotiations, leaks in
part and then in full by WikiLeaks about -
2:51 - 2:58the Trans-Pacific Partnership, the TPP. What
some people have mentioned in this debate -
2:58 - 3:03is the idea of corporate capture, in a
world now in which corporations are -
3:03 - 3:09maturing to the
extent that they can now sue -
3:09 - 3:15governments, and the multinational reach
of many corporations is larger than -
3:15 - 3:22the diplomatic reach of countries. And
social media platforms are -
3:22 - 3:26part of this: these social
media companies now are going to have the -
3:26 - 3:30capacity to influence not only cultures
but people within cultures, and how they -
3:30 - 3:36communicate with people inside their
culture and communicate globally. So as -
3:36 - 3:41activists and technologists I would like
to propose at least that we start thinking -
3:41 - 3:45about and beyond the product and service
offerings of today's social media -
3:45 - 3:51companies and start looking ahead to two -
five - ten years down the road in which -
3:51 - 3:57these companies may have
social media -
3:57 - 4:01service offerings which are
indistinguishable from today's ISPs and -
4:01 - 4:06telcos and other things. And this is
really to say that social media is moving -
4:06 - 4:15past the era of the walled garden into neo
empires. So one of the things that is on -
4:15 - 4:20the slide are some headlines about
different delivery mechanisms with which -
4:20 - 4:25social media companies and also
people like Elon Musk are looking to roll -
4:25 - 4:30out, to almost if not completely
leapfrog the existing technologies of -
4:30 - 4:37radio broadcasting, fiber optics, these sorts
of things. So we now are looking at a -
4:37 - 4:41world in which Facebook is now gonna
have drones, Google is looking into -
4:41 - 4:47balloons, and other people are looking into low
earth orbit satellites to be able to -
4:47 - 4:52provide directly to the end consumer, to
the user, to the handset the content which -
4:52 - 4:58flows through these networks. So one of
the first things I believe we're gonna see -
4:58 - 5:06in this field is Free Basics. Facebook has
a service that was launched as Internet.org -
5:06 - 5:12and has now been rebranded Free Basics,
and why this is interesting is that while -
5:12 - 5:17on the one hand Free Basics is a free service
that is trying to get the people that -
5:17 - 5:22are not on the Internet now to use
Facebook's window to the -
5:22 - 5:28world, it has maybe a couple dozen sites
that are accessible and it runs over the data -
5:28 - 5:35networks of carriers; Reliance, the
telecommunications company in India, is one -
5:35 - 5:42of the larger telcos but not the largest.
There's a lot of pressure that Facebook is -
5:42 - 5:48putting on the government of India right
now to be able to have the service offered -
5:48 - 5:55across the country. One of the ways that
this is problematic is because a limited -
5:55 - 5:59number of websites flow through this to the
people that get exposed to Free Basics. -
5:59 - 6:05This might be their first time seeing the
Internet in some cases. An example that -
6:05 - 6:10is interesting to think about is a lion
born into a zoo. Perhaps evolution and -
6:10 - 6:17other things may have this lion dreaming,
perhaps, of running wild on the plains of -
6:17 - 6:25Africa, but at the same time it will never
know that world. Facebook Free Basics users, -
6:25 - 6:32knowing only Facebook's window to
the Internet, may not all jump over to a -
6:32 - 6:37full data package on their ISP. And many
people may be stuck in Facebook's window -
6:37 - 6:41to the world.
-
6:41 - 6:45J: In other words we've reached an era
where these companies have as I've said -
6:45 - 6:49unprecedented control over our daily
communications both the information that -
6:49 - 6:52we can access and the speech and imagery
that we can express to the world and to -
6:52 - 6:56each other. So the postings and pages and
friend requests, the millions of -
6:56 - 7:00politically active users as well, have
helped to make Mark Zuckerberg and his -
7:00 - 7:03colleagues as well as the people at
Google, Twitter and all these other fine -
7:03 - 7:08companies extremely rich and yet we're
pushing back in this case. I've got a -
7:08 - 7:11great quote from Rebecca MacKinnon where
she refers to Facebook as Facebookistan -
7:11 - 7:15and I think that that is an apt example of
what we're looking at. These are -
7:15 - 7:19corporations but they're not beholden at
all to the public as we know. And instead -
7:19 - 7:23they've kind of turned into these quasi
dictatorships that dictate precisely how -
7:23 - 7:29we behave on them. I also wanted to throw
this one up to talk a little bit about the -
7:29 - 7:34global speech norm. This is from Ben
Wagner who's written a number of pieces on -
7:34 - 7:38this but who kind of coined the concept of
a global speech standard which is what -
7:38 - 7:43these companies have begun and are
increasingly imposing upon us. This global -
7:43 - 7:48speech standard is essentially catering to
everyone in the world trying to make every -
7:48 - 7:52user in every country and every government
happy. But as a result they have kind of tamped -
7:52 - 7:57down free speech to this very basic level
that makes both the governments of let's -
7:57 - 8:01say the United States and Germany happy as
well as the governments of countries like -
8:01 - 8:05Saudi Arabia. Therefore we're looking at
really kind of the lowest common -
8:05 - 8:10denominator when it comes to some
types of speech and this sort of flat -
8:10 - 8:15gray standard when it comes to others.
-
8:15 - 8:22M: So Jillian just mentioned we have
countries in play. Facebook and other -
8:22 - 8:26organizations, social media companies, are
trying to pivot and play within an -
8:26 - 8:30international field, but let's just take
a moment to look at the scale and -
8:30 - 8:36scope and size of these social media
companies. So I just put some figures from -
8:36 - 8:43the Internet, and with some of the latest census
information we have China, 1.37 billion -
8:43 - 8:48people, India, 1.25 billion people, and 2.2
billion practitioners of -
8:48 - 8:52Islam and Christianity. But now we have
Facebook with according to their -
8:52 - 8:58statistics 1.5 billion active monthly
users. Their statistics - I'm sure many -
8:58 - 9:02people would like to dispute these numbers
- but at the same time these platforms -
9:02 - 9:09are now large. I mean, not
larger than some of the religions, but -
9:09 - 9:15Facebook has more monthly active users
than China or India has citizens. So -
9:15 - 9:19we're not talking about - you know -
basement startups, we're now talking about -
9:19 - 9:27companies with the size and scale to be
able to really be influential in a larger -
9:27 - 9:43institutional way. So, the Magna Carta, the
U.S. Constitution, the Declaration of Human -
9:43 - 9:49Rights, the tree of masters, the Bible, the
Quran. These are time-tested, or at least -
9:49 - 9:55longstanding, documents of principle
that place upon their constituents, -
9:55 - 10:04whether it be citizens or spiritual
adherents, a certain code of conduct. -
10:04 - 10:09Facebook, as Jillian mentioned, is
nondemocratic. Facebook's terms and -
10:09 - 10:13standards were written by a small group of
individuals with a few compelling -
10:13 - 10:19interests in mind. But we are now talking
about 1.5 billion people on a monthly -
10:19 - 10:27basis that are subservient to a Terms of
Service on which they had no input. -
10:27 - 10:32To sort of pivot from there and bring it back
to spirituality: why is this important? -
10:32 - 10:42Well, spiritual morality has always been a
place for - for religion. Religion has a -
10:42 - 10:50monopoly on the soul. You could say that
religion is a set of rules which, -
10:50 - 10:56if you obey them, let you avoid hell
and reach heaven, an afterlife, reincarnation, -
10:56 - 11:03whatever the religious practice may be.
Civil morality is quite interesting in the -
11:03 - 11:08sense that the sovereign state as a top
level institution has the ability to put -
11:08 - 11:13into place a series of statutes and
regulations. The violation of which can -
11:13 - 11:19send you to jail. Another interesting note
is that the state also has a -
11:19 - 11:25monopoly on the use of sanctioned
violence. That is to say, the -
11:25 - 11:28official actors of the state are able to
do things that the -
11:28 - 11:36citizens of that state may not. And if we
take a look at this concept of -
11:36 - 11:42digital morality I spoke about earlier
with services like Free Basics introducing -
11:42 - 11:48new individuals to the Internet, well, by a
violation of the terms of service you can -
11:48 - 11:58be excluded from these massive global
networks. And really, Facebook is -
11:58 - 12:04actually trying to create, if not a
monopoly, a semi-monopoly on global -
12:04 - 12:14connectivity in a lot of ways. So what
drives Facebook? There are a few things. -
12:14 - 12:21One is a protectionistic
legal framework. OK. The control of -
12:21 - 12:25copyright violations is something that a
lot of platforms stomped out pretty early. -
12:25 - 12:30They don't want to be sued by the RIAA or
the MPAA. And so there were mechanisms by -
12:30 - 12:37which copyrighted material was
able to be taken off the platform. They -
12:37 - 12:42also limit potential competition. And I
think this is quite interesting in the -
12:42 - 12:47sense that they have shown this in two
ways. One: they've bought, they've -
12:47 - 12:52purchased, rival or potential competitors.
You see this with Instagram being bought -
12:52 - 12:58by Facebook. But Facebook has also
demonstrated the ability or willingness to -
12:58 - 13:05censor certain content.
tsu.co was a new social site, and -
13:05 - 13:10mentions of and links to this platform were
deleted or not allowed on Facebook. So -
13:10 - 13:16even using Facebook as a platform to talk
about another platform was not -
13:16 - 13:24allowed. And then a third component is the
operation on a global scale. It's not only -
13:24 - 13:29the size of the company, it's also about
the global reach. So Facebook maintains -
13:29 - 13:33offices around the world, as other social
media companies do, they engage in public -
13:33 - 13:42diplomacy and they also offer aid in many
countries and many languages. So just to -
13:42 - 13:47take it to companies like Facebook for
a moment: really, in economics you have the -
13:47 - 13:53traditional multinationals of the 20th
century, Coca-Cola, McDonald's. -
13:53 - 14:01The end goal, the goal for the end
user of these products, was consumption. -
14:01 - 14:05This is changing now. Facebook is looking
to capture more and more parts of the -
14:05 - 14:11supply chain. And this may be as a
service provider, as a content moderator, -
14:11 - 14:21and as the party responsible for negotiating and
adjudicating content disputes. At the end -
14:21 - 14:28of the day the users are really the
product. The platform is not for us Facebook users, -
14:28 - 14:33it's really for advertisers.
If we take a hierarchy of the platform, we have -
14:33 - 14:39the corporation, the advertisers, and then the users
kind of at the fringes. -
14:39 - 14:42J: So let's get into the nitty gritty a
little bit about what content moderation -
14:42 - 14:46on these platforms actually looks like. So
I've put up two headlines from Adrian Chen, -
14:46 - 14:50a journalist who wrote these for Gawker
and Wired respectively. They're both a -
14:50 - 14:55couple of years old. But what he did was
he looked into, he investigated, who was -
14:55 - 14:59moderating the content on these platforms,
and what he found, and accused these -
14:59 - 15:03companies of, is outsourcing the
content moderation to low-paid workers in -
15:03 - 15:07developing countries. In the
first article I think Morocco -
15:07 - 15:10was the country, and I'm gonna show a slide
from that of a bit of what those content -
15:10 - 15:14moderators worked with. And the second
article talked a lot about the use of -
15:14 - 15:17workers in the Philippines for this
purpose. We know that these workers are -
15:17 - 15:22probably low paid. We know that they're
given a very, very -
15:22 - 15:25minimal timeframe to look at the content
that they're being presented with. So here's -
15:25 - 15:31how it basically works across platforms
with small differences. I post something -
15:31 - 15:35and I'll show you some great examples of
things they posted later. I post something -
15:35 - 15:39and if I post it to my friends only, my
friends can then report it to the company. -
15:39 - 15:43If I post it publicly anybody who can see
it or who's a user of the product can -
15:43 - 15:48report it to the company. Once a piece of
content is reported a content moderator -
15:48 - 15:51then looks at it and within that very
small time frame we're talking half a -
15:51 - 15:56second to two seconds probably based on
the investigative research that's been -
15:56 - 16:00done by a number of people they have to
decide if this content fits the terms of -
16:00 - 16:04service or not. Now most of these
companies have a legalistic terms of -
16:04 - 16:08service as well as a set of community
guidelines or community standards which -
16:08 - 16:13are clear to the user but they're still
often very vague. And so I wanna get into -
16:13 - 16:18a couple of examples that show that. This
is just this slide is the one of the -
16:18 - 16:21examples that I gave. You can't see it
very well, so I won't leave it up for too -
16:21 - 16:26long. But that was what content moderators
at this outsource company oDesk were -
16:26 - 16:34allegedly using to moderate content on
Facebook. This next photo contains nudity. -
16:34 - 16:41So I think everyone probably knows who
this is and has seen this photo. Yes? No? -
16:41 - 16:46OK. This is Kim Kardashian and this photo
allegedly broke the Internet. It was a -
16:46 - 16:51photo taken for Paper magazine. It was
posted widely on the web and it was seen -
16:51 - 16:55by many many people. Now this photograph
definitely violates Facebook's terms of -
16:55 - 17:00service. Buuut Kim Kardashian's really
famous and makes a lot of money. So in -
17:00 - 17:07most instances as far as I can tell this
photo was totally fine on Facebook. Now -
17:07 - 17:11let's talk about those rules a little bit.
Facebook says that they restrict nudity -
17:11 - 17:16unless it is art. So they do make an
exception for art which may be why they -
17:16 - 17:21allowed that image of Kim Kardashian's
behind to stay up. Art is defined by the -
17:21 - 17:26individual. And yet at the same time,
you know, they make clear that, let's -
17:26 - 17:30say, a photograph of Michelangelo's David
or a photograph of another piece of art in -
17:30 - 17:34a museum would be perfectly acceptable,
whereas, you know, your sort of average -
17:34 - 17:39nudity probably is not going to be
allowed to remain on the platform. They -
17:39 - 17:43also note that they restrict the display
of nudity to ensure that their global.. -
17:43 - 17:46because their global community may be
sensitive to this type of content -
17:46 - 17:50particularly because of their cultural
background or age. So this is Facebook in -
17:50 - 17:55their community standards telling you
explicitly that they are toning down free -
17:55 - 18:02speech to make everyone happy. This is
another photograph. Germans in -
18:02 - 18:07particular, I'm interested: is everyone
familiar with the show The Golden Girls? -
18:07 - 18:11OK. Quite a few. So you might recognize
her; she was Dorothy on The Golden Girls. -
18:11 - 18:15This is the actress Bea Arthur and this is
from a painting from 1991 by John Currin -
18:15 - 18:19of her. It's unclear whether or not she
sat for the painting. It's a wonderful -
18:19 - 18:24image, it's beautiful it's very beautiful
portrait of her. But I posted it on -
18:24 - 18:28Facebook several times in a week. I
encouraged my friends to report it. And in -
18:28 - 18:36fact Facebook found this to not be art.
Sorry. Another image this is by an artist -
18:36 - 18:41a Canadian artist called Rupi Kaur. She
posted a series of images in which she was -
18:41 - 18:46menstruating, she was trying to
essentially describe the normal the -
18:46 - 18:50normality of this. The fact that this is
something that all women go through. Most -
18:50 - 18:57women go through. And as a result
Instagram took it down. Unclear on the reasons; -
18:57 - 19:01they told her that it had violated the terms
of service but weren't exactly clear as to -
19:01 - 19:06why. And finally this is another one. This
is by an artist friend of mine, I'm afraid -
19:06 - 19:10that I have completely blanked on who did
this particular piece, but what it was is: -
19:10 - 19:16They took famous works of nude art and had
sex workers pose in the same poses as the -
19:16 - 19:20pieces of art. I thought it was a really
cool project but Google Plus did not find -
19:20 - 19:24that to be a really cool project. And
because of their guidelines on nudity they -
19:24 - 19:31banned it. This is a cat. Just want to
make sure you're awake. It was totally -
19:31 - 19:38allowed. So in addition to the problems of
content moderation I'm gonna go ahead and -
19:38 - 19:41say that we also have a major diversity
problem at these companies. These -
19:41 - 19:46statistics are facts; these are from all of
these companies themselves. They put out -
19:46 - 19:51diversity reports recently - or as I like to
call them, "diversity" reports - and they show -
19:51 - 19:54that. So the statistics are a little bit
different because they only capture data -
19:54 - 19:59on ethnicity or nationality in their US
offices just because of how those -
19:59 - 20:03standards are sort of odd all over the
world. So the first stats refer to -
20:03 - 20:07their global staff. The second ones in
each line refer to their U.S. staff but as -
20:07 - 20:11you can see these companies are largely
made of white men which is probably not -
20:11 - 20:16surprising but it is a problem. Now why is
that a problem? Particularly when you're -
20:16 - 20:20talking about policy teams the people who
build policies and regulations have an -
20:20 - 20:24inherent bias. We all have an inherent
bias. But what we've seen in this is -
20:24 - 20:30really a bias of sort of the American
style of prudishness. Nudity is not allowed -
20:30 - 20:35but violence extreme violence as long as
it's fictional is totally OK. And that's -
20:35 - 20:39generally how these platforms operate. And
so I think that when we ensure that there -
20:39 - 20:45is diversity in the teams creating both
our tools our technology and our policies, -
20:45 - 20:49then we can ensure that diverse world
views are brought into that creation -
20:49 - 20:56process and that the policies are
therefore more just. So what can we do -
20:56 - 21:01about this problem? As consumers as
technologists as activists as whomever you -
21:01 - 21:05might identify as. The first one I
think a lot of the technologists are gonna -
21:05 - 21:09agree with: Develop decentralized
networks. We need to work toward that -
21:09 - 21:12ideal because these companies are not
getting any smaller. We're not gonna -
21:12 - 21:16necessarily go out and say that they're
too big to fail but they are massive and -
21:16 - 21:20as Matthew noted earlier they're buying up
properties all over the place and making -
21:20 - 21:25sure that they do have control over our
speech. The second thing is to push for -
21:25 - 21:30greater transparency around terms of
service takedowns. Now I'm not a huge fan -
21:30 - 21:33of transparency for the sake of
transparency. I think that these, you -
21:33 - 21:37know, these companies have been putting
out transparency reports for a long time -
21:37 - 21:42that show what countries ask them to take
down content or hand over user data. But -
21:42 - 21:48we've seen those transparency reports
be incredibly flawed already. And so in -
21:48 - 21:52pushing for greater transparency around
terms of service take downs that's only a -
21:52 - 21:56first step. The third thing is, we need to
demand that these companies adhere to -
21:56 - 21:59global speech standards. We already have
the Universal Declaration of Human Rights. -
21:59 - 22:04I don't understand why we need companies
to develop their own bespoke rules. And so -
22:04 - 22:11by..
applause -
22:11 - 22:14And so by demanding that companies adhere
to global speech standards we can ensure -
22:14 - 22:18that these are places of free expression
because it is unrealistic to just tell -
22:18 - 22:21people to get off Facebook. I can't tell
you how many times in the tech community -
22:21 - 22:26over the years I've heard people say well
if you don't like it just leave. That's -
22:26 - 22:30not a realistic option for many people
around the world and I think we all know -
22:30 - 22:32that deep down.
applause -
22:32 - 22:39Thank you. And so the other thing I would
say though is that public pressure works. -
22:39 - 22:43We saw last year with Facebook's real name
policy there are a number of drag -
22:43 - 22:47performers in the San Francisco Bay Area
who were kicked off the platform because -
22:47 - 22:50they were using their performance names - their
drag names - which is a completely -
22:50 - 22:55legitimate thing to do just as folks have
hacker names or other pseudonyms. But -
22:55 - 22:59those folks pushed back. They formed a
coalition and they got Facebook to change -
22:59 - 23:04a little bit. It's not completely there
yet but they're making progress and I'm -
23:04 - 23:09hoping that this goes well. And then the
last thing is and this is totally a pitch -
23:09 - 23:13thrown right out there: Support projects
like ours I'm gonna throw to Matthew to -
23:13 - 23:17talk about onlinecensorship.org and
another project done by the excellent -
23:17 - 23:21Rebecca MacKinnon called
Ranking Digital Rights. -
23:21 - 23:25M: So just a little bit of thinking
outside the box: onlinecensorship.org is a -
23:25 - 23:32platform that's recently launched. Users
can go onto the platform and submit a -
23:32 - 23:37small questionnaire if their content has
been taken down by the platforms. Why we -
23:37 - 23:39think this is exciting is because right
now, as we mentioned, transparency -
23:39 - 23:45reports are fundamentally flawed. We are
looking to crowdsource information about -
23:45 - 23:49the ways in which the social media
companies, six social media companies, are -
23:49 - 23:54moderating and taking down content, because
we can't otherwise get that kind of -
23:54 - 23:57accountability and transparency in real
time. We're hoping to be able to find -
23:57 - 24:04trends both across the kind of content that has
been taken down, geographic trends, news- -
24:04 - 24:08related trends within sort of
self-reported content takedowns. But it's -
24:08 - 24:13platforms like these that I think
hopefully will begin to spring up in -
24:13 - 24:18response, for the community to be able to
put tools in place so that people can be a -
24:18 - 24:22part of the reporting
and transparency initiative. -
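To make the trend-finding idea concrete, here is a minimal sketch, in Python, of how self-reported takedown records could be aggregated to surface patterns by platform, country and content type. The record fields and sample values are assumptions for illustration only, not the actual onlinecensorship.org questionnaire or schema.

    # Hedged sketch: hypothetical self-reported takedown records, aggregated
    # to spot trends. Field names are assumptions, not onlinecensorship.org's.
    from collections import Counter
    from dataclasses import dataclass

    @dataclass
    class TakedownReport:
        platform: str      # e.g. "Facebook", "Instagram"
        country: str       # reporter's country
        content_type: str  # e.g. "nudity", "political speech"

    def takedown_trends(reports):
        """Count reports per (platform, country, content type) to find clusters."""
        return Counter((r.platform, r.country, r.content_type) for r in reports)

    reports = [
        TakedownReport("Facebook", "IN", "political speech"),
        TakedownReport("Facebook", "IN", "political speech"),
        TakedownReport("Instagram", "US", "nudity"),
    ]
    for key, count in takedown_trends(reports).most_common():
        print(key, count)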
24:22 - 24:24J: We launched about a month ago and we're
hoping to put out our first set of reports -
24:24 - 24:29around March. And finally I just want to
close with one more quote before we slip -
24:29 - 24:34into Q&A, and that is just to say it. It's
reasonable that we press Facebook on these -
24:34 - 24:37questions of public responsibility, while
also acknowledging that Facebook cannot be -
24:37 - 24:41all things to all people. We can demand
that their design decisions and user -
24:41 - 24:46policies stand up for review, be explicit,
thoughtful and open to public -
24:46 - 24:50deliberation. But - and this is the most
important part in my view - the choices that -
24:50 - 24:55Facebook makes in their design and in
their policies are value judgments. This is -
24:55 - 24:58political, and I know you've heard that in
a lot of talks. So have I. But I think we -
24:58 - 25:02can't, we cannot forget that this is all
political and we have to address it as -
25:02 - 25:06such. And for some, if that means, you
know, quitting the platform, that's fine too. -
25:06 - 25:09But I think that we should still
understand that our friends our relatives -
25:09 - 25:13our families are using these platforms and
that we do owe it to everybody to make -
25:13 - 25:18them a better place for free expression
and privacy. Thank you. -
25:18 - 25:24applause
-
25:24 - 25:29Herald: Thank you so much. So now
we have a section of Q&A. Anyone who -
25:29 - 25:36has a question, please use one of the mikes
on the sides. And I think we have a -
25:36 - 25:44question from one of our viewers? No. OK.
Please proceed, number one. -
25:44 - 25:49Mike 1: You just addressed that I am sort
of especially after listening to your talk -
25:49 - 25:55I'm sort of on the verge of quitting
Facebook or starting to I don't know. -
25:55 - 26:00applause
Yeah, I mean and I agree it's a hard -
26:00 - 26:07decision. I've been on Facebook for I
think six years now and it is a dispute -
26:07 - 26:13for me myself. So, I'm in this very
strange position. And now I have to kind -
26:13 - 26:18of decide what to do. Are there any.. is
there any help for me out there to tell me -
26:18 - 26:26what would be.. I don't know what.. that
takes my state and helps me in deciding.. -
26:26 - 26:29I don't know. It's strange.
-
26:29 - 26:34J: That's such a hard question. I mean
I'm.. I'll put on my privacy hat for just -
26:34 - 26:39a second and say what I would say to
people when they're making that -
26:39 - 26:41consideration from a privacy viewpoint
because I do think that the -
26:41 - 26:45implications of privacy on these platforms
is often much more severe than those of -
26:45 - 26:49speech. But this is what I do. So in that
case you know I think it's really about -
26:49 - 26:54understanding your threat model, of
understanding what sort of threat you're -
26:54 - 26:58under when it comes to you know the data
collection that these companies are -
26:58 - 27:02undertaking as well as the censorship of
course. But I think it really is a -
27:02 - 27:05personal decision and I I'm sure that
there are - you know - there are great -
27:05 - 27:09resources out there around digital
security and around thinking through those -
27:09 - 27:13threat model processes and perhaps that
could be of help to you for that. If you -
27:13 - 27:17want to add?
M: No I mean I think it's, it's one of -
27:17 - 27:21these big toss-ups; this is a system
through which many people are connected, -
27:21 - 27:26even, sometimes, e-mail addresses
roll over into Facebook. And so I think it's -
27:26 - 27:30about the opportunity cost of leaving a
platform. What do you have to lose, what -
27:30 - 27:35do you have to gain. But it's also
important to remember that, well, the snapshot -
27:35 - 27:40we see of Facebook now - it's
probably not gonna get better. -
27:40 - 27:44It's probably gonna be more invasive,
coming into different parts of our -
27:44 - 27:50lives so I think from the security and
privacy aspect it's really just up to the -
27:50 - 27:54individual.
Mike 1: Short follow up - if I am allowed -
27:54 - 28:03to - I don't see the.. the main point for
me is not my personal implications so I am -
28:03 - 28:09quite aware that Facebook is a bad thing
and I can leave it, but I'm sort of -
28:09 - 28:14thinking about it's way past the point
where we can decide on our own and decide: -
28:14 - 28:18OK, is it good for me or is it good for my
friend or is it good for my mom or for my -
28:18 - 28:24dad or whatever? We have to think about:
Is Facebook as such a good -
28:24 - 28:28thing for society - as you are addressing? So I think
we have to drive this decision making from -
28:28 - 28:35one person to a lot, lot, lot of persons.
J: I agree and I'll note. What we're -
28:35 - 28:37talking about..
applause -
28:37 - 28:41- I agree. What we're talking about in the
project that we work on together is a -
28:41 - 28:45small piece of the broader issue and I
agree that this needs to be tackled from -
28:45 - 28:49many angles.
Herald: Ok, we have a question from one of -
28:49 - 28:52our viewers on the Internet, please.
Signal Angel: Yeah, one of the questions -
28:52 - 28:56from the internet is: Aren't the
moderators the real problem, who ban -
28:56 - 29:03everything which they don't really like
rather than the providers of the service? -
29:03 - 29:06Herald: Can you please repeat that?
Signal Angel: The question was if the -
29:06 - 29:11moderators, sometimes volunteers, aren't the
problem, because they ban everything that -
29:11 - 29:15they don't like rather than the providers
of a certain service? -
29:15 - 29:20J: Ahm, no I mean I would say that the
content moderators.. we don't know who -
29:20 - 29:23they are, so that's part of the issue, as
we don't know and I've - you know - I've -
29:23 - 29:27heard many allegations over the years when
certain content's been taken down in a -
29:27 - 29:31certain local or cultural context
particularly in the Arab world. I've heard -
29:31 - 29:35the accusation that like oh those content
moderators are pro-Sisi, the dictator in -
29:35 - 29:39Egypt or whatever. I'm not sure how much
merit that holds because like I said we -
29:39 - 29:45don't know who they are. But what I would say
is that they're not.. it doesn't feel like -
29:45 - 29:49they're given the resources to do their
jobs well, so even if they were the best -
29:49 - 29:53most neutral people on earth they're given
very little time probably very little -
29:53 - 29:59money and not a whole lot of resources to
work with in making those determinations. -
29:59 - 30:01Herald: Thank you. We take a question from
Mike 3 please. -
30:01 - 30:06Mike 3: Test, test. OK. First off, thank
you so much for the talk. And I just have -
30:06 - 30:12a basic question. So it seems logical that
Facebook is trying to put out this mantra -
30:12 - 30:17of protect the children. I can kind of get
behind that. And it also seems based on -
30:17 - 30:21the fact that they have the "real names
policy" that they would also expect you to -
30:21 - 30:26put in a real legal age. So if they're
trying to censor things like nudity: Why -
30:26 - 30:31couldn't they simply use things like age
as criteria to protect children from -
30:31 - 30:34nudity. While letting everyone else who is
above the legal age make their own -
30:34 - 30:38decision?
J: You wanna take that? -
30:38 - 30:44M: I think it's a few factors. One: it's, I
guess, on the -
30:44 - 30:48technical side, what constitutes nudity.
And in a process way, when something gets -
30:48 - 30:53flagged, do
you have a channel or tick boxes to say -
30:53 - 30:58what sort of content it is. You could use a system
in which content flagged as nudity -
30:58 - 31:02gets referred to a special nudity
moderator, and then if the moderator says -
31:02 - 31:10"Yes, it is nudity", filter it for everyone less than
- you know - legal age or whatever age. -
31:10 - 31:15But I think it's part of a broader more
systematic approach by Facebook. It's the -
31:15 - 31:22broad strokes. It's really kind of
dictating this digital baseline this -
31:22 - 31:27digital morality baseline, and they're
saying: "No, nobody in the world can -
31:27 - 31:31see this. These are our hard lines and it
doesn't matter what age you are or where you -
31:31 - 31:34reside. This is the box in which we are
placing you." For anybody, -
31:34 - 31:39regardless of age or origin,
this box is what they say you can see, -
31:39 - 31:43and for anything that falls outside of it
you risk -
31:43 - 31:48having your account suspended. So I think
it's a mechanism of control. -
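To make the age-gating idea from this answer concrete, here is a minimal sketch, in Python, of routing a nudity flag to a moderator and then restricting visibility by viewer age instead of removing the content for everyone. The names and the age threshold are assumptions for illustration, not any platform's actual moderation pipeline.

    # Hedged sketch: flag -> category moderator -> age-gated visibility.
    # Threshold and field names are assumptions, not a real platform's rules.
    from dataclasses import dataclass
    from typing import Optional

    ADULT_AGE = 18  # assumed threshold; real rules vary by jurisdiction

    @dataclass
    class Post:
        post_id: int
        flagged_as: Optional[str] = None   # set when users report the post
        confirmed_nudity: bool = False     # set by a nudity moderator

    def moderator_review(post: Post) -> None:
        """A specialised moderator confirms or rejects a nudity flag."""
        if post.flagged_as == "nudity":
            post.confirmed_nudity = True   # stand-in for a human decision

    def visible_to(post: Post, viewer_age: int) -> bool:
        """Age-gate confirmed nudity rather than removing it for everyone."""
        return (not post.confirmed_nudity) or viewer_age >= ADULT_AGE

    p = Post(post_id=1, flagged_as="nudity")
    moderator_review(p)
    print(visible_to(p, viewer_age=16))  # False: hidden from minors
    print(visible_to(p, viewer_age=30))  # True: still visible to adults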
31:48 - 31:51Herald: Thank you so much. I think
unfortunately we've run out of time for -
31:51 - 31:54questions. I would like to apologize to
everyone who's standing; maybe you have -
31:54 - 32:01time to discuss that afterwards. Thank you
everyone, and thank you. -
32:01 - 32:08applause
-
32:08 - 32:13postroll music
-
32:13 - 32:20subtitles created by c3subtitles.de
Join, and help us!
Title: Matthew Stender, Jillian C. York: Sin in the time of Technology
Description: Technology companies now hold an unprecedented ability to shape the world around us by limiting our ability to access certain content and by crafting proprietary algorithms that bring us our daily streams of content.
Matthew Stender, Jillian C. York
Video Language: English
Duration: 32:20