
Matthew Stender, Jillian C. York: Sin in the time of Technology

  • 0:00 - 0:09
preroll music
  • 0:09 - 0:14
Herald: Our next talk will be tackling how
    social media companies are creating a
  • 0:14 - 0:20
    global morality standard through content
regulations. This will be presented by
  • 0:20 - 0:25
    the two persons standing here: the digital
    rights advocate Matthew Stender and the
  • 0:25 - 0:31
    writer and activist Jillian C. York. Please
    give them a warm applause.
  • 0:31 - 0:38
    applause
  • 0:38 - 0:41
    Matthew: Hello everybody. I hope you all
    had a great Congress. Thank you for being
  • 0:41 - 0:47
    here today. You know we're almost wrapped
    up with the Congress but yeah we
  • 0:47 - 0:52
    appreciate you being here. My name is
    Matthew Stender. I am a communications
  • 0:52 - 0:57
strategist, creative director, and digital
    rights advocate focusing on privacy,
  • 0:57 - 1:02
    social media censorship, and freedom of
    the press and expression.
  • 1:02 - 1:07
    Jillian: And I am Jillian York and I work
    at the Electronic Frontier Foundation. I
  • 1:07 - 1:11
    work on privacy and free expression issues
    as well as a few other things and I'm
  • 1:11 - 1:18
    based in Berlin. Thank you. For Berlin?
    Awesome. Cool. Hope to see some of you
  • 1:18 - 1:22
    there. Great. So today we're gonna be
    talking about sin in the time of
  • 1:22 - 1:27
    technology and what we mean by that is the
    way in which corporations particularly
  • 1:27 - 1:31
    content platforms and social media
    platforms are driving morality and our
  • 1:31 - 1:37
    perception of it. We've got three key
    takeaways to start off with. The first is
  • 1:37 - 1:40
    that social media companies have an
    unparalleled amount of influence over our
  • 1:40 - 1:44
modern communications. This we know; I
    think this is probably something everyone
  • 1:44 - 1:48
    in this room can agree on. These companies
    also play a huge role in shaping our
  • 1:48 - 1:53
    global outlook on morality and what
    constitutes it. So the ways in which we
  • 1:53 - 1:57
    perceive different imagery different
    speech is being increasingly defined by
  • 1:57 - 2:02
    the regulations that these platforms put
    upon us on our daily activities on them.
  • 2:02 - 2:06
    And third they are entirely undemocratic.
    They're beholden to shareholders and
  • 2:06 - 2:10
    governments but not at all to the public.
    Not to me, not to you. Rarely do they
  • 2:10 - 2:14
    listen to us and when they do there has to
be a fairly exceptional amount of public
  • 2:14 - 2:18
    pressure on them. And so that's, that's
    our starting point that's what we wanna
  • 2:18 - 2:23
kick off with, and I'll pass the mic to
    Matthew.
  • 2:23 - 2:28
    M: So thinking about these three takeaways
I'm going to bring it to kind of a top level
  • 2:28 - 2:38
    for a moment, to introduce an idea today
    which some people have talked about:
  • 2:38 - 2:44
    the idea of the rise of the techno class.
    So probably a lot of people in this room
  • 2:44 - 2:51
have followed the negotiations, leaked in
    part and then in full by WikiLeaks, about
  • 2:51 - 2:58
    the Trans-Pacific Partnership the TPP what
    some people have mentioned in this debate
  • 2:58 - 3:03
is the idea of corporate capture: a
    world in which corporations are
  • 3:03 - 3:09
    maturing to the
    extent that they can now sue
  • 3:09 - 3:15
    governments, and in which the multinational
    reach of many corporations is larger than
  • 3:15 - 3:22
    the diplomatic reach of countries. And
    with social media platforms being
  • 3:22 - 3:26
    part of this, these social
    media companies are now going to have the
  • 3:26 - 3:30
    capacity to influence not only cultures
    but people within cultures and how they
  • 3:30 - 3:36
    communicate with people inside their
    culture and communicate globally. So as
  • 3:36 - 3:41
    activists and technologists I would like
    to propose at least that we start thinking
  • 3:41 - 3:45
about, and beyond, the product and service
    offerings of today's social media
  • 3:45 - 3:51
companies and start looking ahead to two,
    five, ten years down the road, in which
  • 3:51 - 3:57
    these companies may have social media
  • 3:57 - 4:01
    service offerings which are
    indistinguishable from today's ISPs and
  • 4:01 - 4:06
    telcos and other things. And this is
    really to say that social media is moving
  • 4:06 - 4:15
    past the era of the walled garden into neo
    empires. So one of the things that is on
  • 4:15 - 4:20
    the slide are some headlines about
different delivery mechanisms with which
  • 4:20 - 4:25
    social media companies and also
    people like Elon Musk are looking to
  • 4:25 - 4:30
    almost leapfrog, if not completely
    leapfrog, the existing technologies of
  • 4:30 - 4:37
    broadcasting, fiber optics, these sorts
    of things. So we now are looking at a
  • 4:37 - 4:41
world in which Facebook is now gonna
    have drones. Google is looking into
  • 4:41 - 4:47
    balloons and other people looking into low
    earth orbit satellites to be able to
  • 4:47 - 4:52
    provide directly to the end consumer, to
    the user, to the handset the content which
  • 4:52 - 4:58
    flows through these networks. So one of
    the first things I believe we're gonna see
  • 4:58 - 5:06
in this field is Free Basics. Facebook has
    a service; it was launched as Internet.org
  • 5:06 - 5:12
    and now it's been rebranded Free Basics.
    Why this is interesting is that, on the
  • 5:12 - 5:17
    one hand, Free Basics is a free service
    that is trying to get the people that
  • 5:17 - 5:22
    are not on the Internet now to use
    Facebook's window to the
  • 5:22 - 5:28
    world. It has maybe a couple dozen sites
that are accessible; it runs over the data
  • 5:28 - 5:35
    networks in these countries. Reliance, the
    telecommunications company in India, is one
  • 5:35 - 5:42
    of the larger telcos but not the largest.
    There's a lot of pressure that Facebook is
  • 5:42 - 5:48
    putting on the government of India right
    now to be able to have the service offered
  • 5:48 - 5:55
    across the country. One of the ways that
this is problematic is that only a limited
  • 5:55 - 5:59
    number of websites flow through this to the
    people that get exposed to Free Basics.
  • 5:59 - 6:05
    This might be their first time seeing the
    Internet in some cases. An example that
  • 6:05 - 6:10
    is interesting to think about is a lion
    born into a zoo. Perhaps evolution and
  • 6:10 - 6:17
    other things may have this lion dreaming
    of running wild on the plains of
  • 6:17 - 6:25
    Africa, but at the same time it will never
    know that world. Facebook Free Basics users,
  • 6:25 - 6:32
    knowing only Facebook's window to
    the Internet, may not all jump over to a
  • 6:32 - 6:37
    full data package on their ISP. And many
    people may be stuck in Facebook's window
  • 6:37 - 6:41
    to the world.
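The walled-garden dynamic Matthew describes, a zero-rated service exposing only a couple dozen whitelisted sites, can be sketched in a few lines. This is purely illustrative: the site list and function name below are invented for the sketch, not Free Basics' actual architecture.

```python
# Purely illustrative sketch of a zero-rated walled garden; the site list
# and names are invented, not Free Basics' actual architecture.

ZERO_RATED_SITES = {
    "facebook.com",
    "m.wikipedia.org",
    "weather.example",  # hypothetical partner site
}

def served_for_free(host: str) -> bool:
    """A zero-rating gateway only carries whitelisted hosts at no data cost."""
    return host in ZERO_RATED_SITES

# Everything off the whitelist is effectively invisible to the user.
print(served_for_free("facebook.com"))              # True
print(served_for_free("independent-news.example"))  # False
```

The point of the sketch is that the gateway is a hard filter: for a user who never buys a full data package, a site absent from the set might as well not exist.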
  • 6:41 - 6:45
    J: In other words we've reached an era
    where these companies have as I've said
  • 6:45 - 6:49
    unprecedented control over our daily
    communications both the information that
  • 6:49 - 6:52
    we can access and the speech and imagery
    that we can express to the world and to
  • 6:52 - 6:56
    each other. So the postings and pages and
    friend requests, the millions of
  • 6:56 - 7:00
    politically active users as well, have
    helped to make Mark Zuckerberg and his
  • 7:00 - 7:03
    colleagues as well as the people at
    Google, Twitter and all these other fine
  • 7:03 - 7:08
    companies extremely rich and yet we're
    pushing back in this case. I've got a
  • 7:08 - 7:11
    great quote from Rebecca MacKinnon where
    she refers to Facebook as Facebookistan
  • 7:11 - 7:15
    and I think that that is an apt example of
    what we're looking at. These are
  • 7:15 - 7:19
    corporations but they're not beholden at
    all to the public as we know. And instead
  • 7:19 - 7:23
    they've kind of turned into these quasi
    dictatorships that dictate precisely how
  • 7:23 - 7:29
    we behave on them. I also wanted to throw
    this one up to talk a little bit about the
  • 7:29 - 7:34
    global speech norm. This is from Ben
    Wagner who's written a number of pieces on
  • 7:34 - 7:38
    this but who kind of coined the concept of
    a global speech standard which is what
  • 7:38 - 7:43
    these companies have begun and are
    increasingly imposing upon us. This global
  • 7:43 - 7:48
    speech standard is essentially catering to
    everyone in the world trying to make every
  • 7:48 - 7:52
    user in every country and every government
happy. But as a result they have kind of tamped
  • 7:52 - 7:57
    down free speech to this very basic level
    that makes both the governments of let's
  • 7:57 - 8:01
    say the United States and Germany happy as
    well as the governments of countries like
  • 8:01 - 8:05
    Saudi Arabia. Therefore we're looking at
    really kind of the lowest common
  • 8:05 - 8:10
denominator when it comes to some
    types of speech and this sort of flat
  • 8:10 - 8:15
    gray standard when it comes to others.
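Jillian's lowest-common-denominator point can be modeled as a set intersection: what a platform permits everywhere tends toward what every government it caters to will tolerate. The jurisdictions and content categories below are invented for illustration, not taken from any real policy document.

```python
# Toy model of a "global speech standard": the platform-wide rule set
# collapses to the intersection of what each government tolerates.
# Categories are illustrative only.
from functools import reduce

tolerated = {
    "United States": {"satire", "nudity_in_art", "political_dissent"},
    "Germany":       {"satire", "nudity_in_art", "political_dissent"},
    "Saudi Arabia":  {"satire"},
}

global_standard = reduce(set.intersection, tolerated.values())
print(sorted(global_standard))  # ['satire']
```

Adding one more restrictive jurisdiction to the dictionary can only shrink the intersection, never grow it, which is exactly the dynamic of tamping speech down to the most restrictive common level.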
  • 8:15 - 8:22
    M: So Jillian just mentioned we have
countries in play. Facebook and other
  • 8:22 - 8:26
    organizations, social media companies, are
    trying to pivot and play within an
  • 8:26 - 8:30
    international field, but let's just take
    a moment to look at the scale and
  • 8:30 - 8:36
    scope and size of these social media
    companies. So I just put some figures from
  • 8:36 - 8:43
    the Internet and with some latest census
    information we have China 1.37 billion
  • 8:43 - 8:48
people, India 1.25 billion people, and 2.2
    billion practitioners of
  • 8:48 - 8:52
    Islam and Christianity. But now we have
    Facebook with according to their
  • 8:52 - 8:58
    statistics 1.5 billion active monthly
    users. Their statistics - I'm sure many
  • 8:58 - 9:02
    people would like to dispute these numbers
- but at the same time these platforms
    are now large. I mean larger than - not
  • 9:02 - 9:09
    larger than some of the religions - but
  • 9:09 - 9:15
    Facebook has more monthly active users
    than China or India have citizens. So
  • 9:15 - 9:19
    we're not talking about - you know -
    basement startups, we're now talking about
  • 9:19 - 9:27
    companies with the size and scale to be
    able to really be influential in a larger
  • 9:27 - 9:43
institutional way. So: Magna Carta, the
    U.S. Constitution, the Declaration of Human
  • 9:43 - 9:49
    Rights, the tree of masters, the Bible, the
    Quran. These are time-tested, at least
  • 9:49 - 9:55
    longstanding, principal documents
    that place upon their constituents,
  • 9:55 - 10:04
    whether it be citizens or spiritual
    adherents, a certain code of conduct.
  • 10:04 - 10:09
    Facebook, as Jillian mentioned, is
    nondemocratic. Facebook's terms and
  • 10:09 - 10:13
    standards were written by a small group of
    individuals with a few compelling
  • 10:13 - 10:19
    interests in mind. But we are now talking
    about 1.5 billion people on a monthly
  • 10:19 - 10:27
basis that are subservient to a Terms of
    Service on which they had no input.
  • 10:27 - 10:32
To sort of pivot from there and bring it
    back to spirituality: why is this important?
  • 10:32 - 10:42
    Well, spiritual morality has always been a
    place for - for religion. Religion has a
  • 10:42 - 10:50
    monopoly on the soul. You could say that
    religion is a set of rules such that
  • 10:50 - 10:56
    if you obey, you are able to go to heaven
    rather than hell, to an afterlife, to be reincarnated,
  • 10:56 - 11:03
    whatever the religious practice may be.
    Civil morality is quite interesting in the
  • 11:03 - 11:08
    sense that the sovereign state as a top
    level institution has the ability to put
  • 11:08 - 11:13
    into place a series of statutes and
    regulations. The violation of which can
  • 11:13 - 11:19
    send you to jail. Another interesting note
is that the state also has a
  • 11:19 - 11:25
    monopoly on the use of sanctioned
    violence. That is to say, the
  • 11:25 - 11:28
    official actors of the state are able to
    do things which the
  • 11:28 - 11:36
    citizens of that state may not. And if we
take a look at this concept of
  • 11:36 - 11:42
    digital morality I spoke about earlier,
    with services like Free Basics introducing
  • 11:42 - 11:48
    new individuals to the Internet: well, by a
    violation of the terms of service you can
  • 11:48 - 11:58
    be excluded from these massive global
    networks. And really, Facebook is
  • 11:58 - 12:04
    actually trying to create, if not a
    monopoly, a semi-monopoly on global
  • 12:04 - 12:14
    connectivity, in a lot of ways. So what
drives Facebook? There are a few things.
  • 12:14 - 12:21
    One is a protectionistic
    legal framework. OK. The control of
  • 12:21 - 12:25
    copyright violations is something that a
    lot of platforms stamped out pretty early.
  • 12:25 - 12:30
    They don't want to be sued by the RIAA or
    the MPAA. And so there were mechanisms by
  • 12:30 - 12:37
    which copyrighted material was
    able to be taken off the platform. They
  • 12:37 - 12:42
also limit potential competition. And I
    think this is quite interesting in the
  • 12:42 - 12:47
    sense that they have shown this in two
ways. One: they've bought, they've
    purchased, rival or potential competitors.
    purchased rival or potential competitors.
    You see this with Instagram being bought
  • 12:52 - 12:58
    by Facebook. But Facebook has also
    demonstrated the ability or willingness to
  • 12:58 - 13:05
censor certain content.
    tsu.co was a new social site, and
  • 13:05 - 13:10
    mentions and links to this platform were
    deleted or not allowed on Facebook. So
  • 13:10 - 13:16
    even using Facebook as a platform to talk
about another platform was not
  • 13:16 - 13:24
    allowed. And then a third component is the
operation on a global scale. It's not only
  • 13:24 - 13:29
    the size of the company, it's also about
    the global reach. So Facebook maintains
  • 13:29 - 13:33
    offices around the world, as other social
    media companies do, they engage in public
  • 13:33 - 13:42
    diplomacy and they also offer aid in many
    countries and many languages. So just to
  • 13:42 - 13:47
take it to companies like Facebook for
    a moment, really in economics you have the
  • 13:47 - 13:53
    traditional multinationals of the 20th
    century: Coca-Cola, McDonald's. The
  • 13:53 - 14:01
    end goal for the end
    user of these products was consumption.
  • 14:01 - 14:05
    This is changing now. Facebook is looking
    to capture more and more parts of the
  • 14:05 - 14:11
supply chain. And this may be as a
    service provider, as a content moderator,
  • 14:11 - 14:21
    responsible for negotiating and
    adjudicating content disputes. At the end
  • 14:21 - 14:28
of the day, the users are really the
    product. The platform is not for us Facebook
  • 14:28 - 14:33
    users; it's really for advertisers.
    We can take a hierarchy of the platform with
  • 14:33 - 14:39
    the corporation, then advertisers, and then users
    kind of at the fringes.
  • 14:39 - 14:42
    J: So let's get into the nitty gritty a
    little bit about what content moderation
  • 14:42 - 14:46
    on these platforms actually looks like. So
I've put up two headlines from Adrian Chen,
  • 14:46 - 14:50
    a journalist who wrote these for Gawker
    and Wired respectively. They're both a
  • 14:50 - 14:55
    couple of years old. But what he did was
he investigated who was
  • 14:55 - 14:59
    moderating the content on these platforms,
    and what he found, and accused these
  • 14:59 - 15:03
    companies of, is outsourcing the
    content moderation to low-paid workers in
  • 15:03 - 15:07
developing countries. In this case, in
    the first article I think Morocco
  • 15:07 - 15:10
    was the country, and I'm gonna show a slide
    from that in a bit of what those content
  • 15:10 - 15:14
    moderators worked with. And the second
    article talked a lot about the use of
  • 15:14 - 15:17
    workers in the Philippines for this
    purpose. We know that these workers are
  • 15:17 - 15:22
    probably low paid. We know that they're
given a very, very
  • 15:22 - 15:25
    minimal timeframe to look at the content
    that they're being presented. So here's
  • 15:25 - 15:31
    how it basically works across platforms
    with small differences. I post something
  • 15:31 - 15:35
    and I'll show you some great examples of
    things they posted later. I post something
  • 15:35 - 15:39
    and if I post it to my friends only, my
    friends can then report it to the company.
  • 15:39 - 15:43
    If I post it publicly anybody who can see
    it or who's a user of the product can
  • 15:43 - 15:48
    report it to the company. Once a piece of
    content is reported a content moderator
  • 15:48 - 15:51
    then looks at it and within that very
    small time frame we're talking half a
  • 15:51 - 15:56
    second to two seconds probably based on
    the investigative research that's been
  • 15:56 - 16:00
    done by a number of people they have to
    decide if this content fits the terms of
  • 16:00 - 16:04
    service or not. Now most of these
    companies have a legalistic terms of
  • 16:04 - 16:08
    service as well as a set of community
    guidelines or community standards which
  • 16:08 - 16:13
are clearer to the user, but they're still
    often very vague. And so I wanna get into
  • 16:13 - 16:18
    a couple of examples that show that. This
slide is one of the
  • 16:18 - 16:21
    examples that I mentioned. You can't see it
    very well, so I won't leave it up for too
  • 16:21 - 16:26
    long. But that was what content moderators
at this outsourcing company, oDesk, were
  • 16:26 - 16:34
    allegedly using to moderate content on
    Facebook. This next photo contains nudity.
  • 16:34 - 16:41
    So I think everyone probably knows who
    this is and has seen this photo. Yes? No?
  • 16:41 - 16:46
    OK. This is Kim Kardashian and this photo
    allegedly broke the Internet. It was a
  • 16:46 - 16:51
    photo taken for paper magazine. It was
    posted widely on the web and it was seen
  • 16:51 - 16:55
    by many many people. Now this photograph
    definitely violates Facebook's terms of
  • 16:55 - 17:00
    service. Buuut Kim Kardashian's really
    famous and makes a lot of money. So in
  • 17:00 - 17:07
    most instances as far as I can tell this
    photo was totally fine on Facebook. Now
  • 17:07 - 17:11
    let's talk about those rules a little bit.
    Facebook says that they restrict nudity
  • 17:11 - 17:16
    unless it is art. So they do make an
    exception for art which may be why they
  • 17:16 - 17:21
    allowed that image of Kim Kardashian's
    behind to stay up. Art is defined by the
  • 17:21 - 17:26
    individual. And yet at the same time that
    you know they make clear that that's let's
  • 17:26 - 17:30
    say a photograph of Michelangelo's David
    or a photograph of another piece of art in
  • 17:30 - 17:34
    a museum would be perfectly acceptable
    whereas you know you're sort of average
  • 17:34 - 17:39
    nudity maybe probably is not going to be
    allowed to remain on the platform. They
  • 17:39 - 17:43
    also note that they restrict the display
    of nudity to ensure that their global..
  • 17:43 - 17:46
    because their global community may be
    sensitive to this type of content
  • 17:46 - 17:50
    particularly because of their cultural
    background or age. So this is Facebook in
  • 17:50 - 17:55
    their community standards telling you
    explicitly that they are toning down free
  • 17:55 - 18:02
    speech to make everyone happy. This is
    another photograph. Germans are
  • 18:02 - 18:07
particularly, I'm interested: Is everyone
    familiar with the show the Golden Girls?
  • 18:07 - 18:11
    OK. Quite a few. So you might recognize
her; she was Dorothy on the Golden Girls
  • 18:11 - 18:15
    this is the actress Bea Arthur and this is
from a painting from 1991 by John Currin
  • 18:15 - 18:19
    of her. It's unclear whether or not she
    sat for the painting. It's a wonderful
  • 18:19 - 18:24
image, it's a very beautiful
    portrait of her. But I posted it on
  • 18:24 - 18:28
    Facebook several times in a week. I
    encouraged my friends to report it. And in
  • 18:28 - 18:36
    fact Facebook found this to not be art.
    Sorry. Another image this is by an artist
  • 18:36 - 18:41
    a Canadian artist called Rupi Kaur. She
    posted a series of images in which she was
  • 18:41 - 18:46
    menstruating, she was trying to
essentially describe the
  • 18:46 - 18:50
    normality of this, the fact that this is
    something that all women, most
  • 18:50 - 18:57
    women, go through. And as a result
    Instagram took it down. Unclear on the reasons;
  • 18:57 - 19:01
    they told her that it had violated the terms
    of service but weren't exactly clear as to
  • 19:01 - 19:06
    why. And finally this is another one. This
    is by an artist friend of mine, I'm afraid
  • 19:06 - 19:10
    that I have completely blanked on who did
    this particular piece, but what it was is:
  • 19:10 - 19:16
    They took famous works of nude art and had
    sex workers pose in the same poses as the
  • 19:16 - 19:20
    pieces of art. I thought it was a really
    cool project but Google Plus did not find
  • 19:20 - 19:24
    that to be a really cool project. And
    because of their guidelines on nudity they
  • 19:24 - 19:31
    banned it. This is a cat. Just want to
    make sure you're awake. It was totally
  • 19:31 - 19:38
    allowed. So in addition to the problems of
    content moderation I'm gonna go ahead and
  • 19:38 - 19:41
    say that we also have a major diversity
    problem at these companies. These
  • 19:41 - 19:46
statistics are facts; these are from all of
    these companies themselves. They put out
  • 19:46 - 19:51
    diversity reports recently, and they show
    that. So the statistics are a little bit
    that. So the statistics are a little bit
    different because they only capture data
  • 19:54 - 19:59
    on ethnicity or nationality in their US
    offices just because of how those
  • 19:59 - 20:03
    standards are sort of odd all over the
world. So the first stats refer to
  • 20:03 - 20:07
    their global staff. The second ones in
    each line refer to their U.S. staff but as
  • 20:07 - 20:11
    you can see these companies are largely
    made of white men which is probably not
  • 20:11 - 20:16
    surprising but it is a problem. Now why is
    that a problem? Particularly when you're
  • 20:16 - 20:20
talking about policy teams: the people who
    build policies and regulations have an
  • 20:20 - 20:24
    inherent bias. We all have an inherent
    bias. But what we've seen in this is
  • 20:24 - 20:30
    really a bias of sort of the American
style of prudishness. Nudity is not allowed
  • 20:30 - 20:35
    but violence extreme violence as long as
    it's fictional is totally OK. And that's
  • 20:35 - 20:39
    generally how these platforms operate. And
    so I think that when we ensure that there
  • 20:39 - 20:45
    is diversity in the teams creating both
    our tools our technology and our policies,
  • 20:45 - 20:49
    then we can ensure that diverse world
views are brought into that creation
  • 20:49 - 20:56
    process and that the policies are
    therefore more just. So what can we do
  • 20:56 - 21:01
    about this problem? As consumers as
    technologists as activists as whomever you
  • 21:01 - 21:05
    might identify as. In the first one I
    think a lot of the technologists are gonna
  • 21:05 - 21:09
    agree with: Develop decentralized
    networks. We need to work toward that
  • 21:09 - 21:12
    ideal because these companies are not
    getting any smaller. We're not gonna
  • 21:12 - 21:16
    necessarily go out and say that they're
    too big to fail but they are massive and
  • 21:16 - 21:20
    as Matthew noted earlier they're buying up
    properties all over the place and making
  • 21:20 - 21:25
    sure that they do have control over our
    speech. The second thing is to push for
  • 21:25 - 21:30
    greater transparency around terms of
    service takedowns. Now I'm not a huge fan
  • 21:30 - 21:33
    of transparency for the sake of
    transparency. I think that these, you
  • 21:33 - 21:37
    know, these companies have been putting
    out transparency reports for a long time
  • 21:37 - 21:42
    that show what countries ask them to take
    down content or hand over user data. But
  • 21:42 - 21:48
    we've seen those transparency reports to
    be incredibly flawed already. And so in
  • 21:48 - 21:52
    pushing for greater transparency around
    terms of service take downs that's only a
  • 21:52 - 21:56
    first step. The third thing is, we need to
    demand that these companies adhere to
  • 21:56 - 21:59
    global speech standards. We already have
    the Universal Declaration of Human Rights.
  • 21:59 - 22:04
    I don't understand why we need companies
    to develop their own bespoke rules. And so
  • 22:04 - 22:11
    by..
    applause
  • 22:11 - 22:14
    And so by demanding that companies adhere
    to global speech standards we can ensure
  • 22:14 - 22:18
    that these are places of free expression
    because it is unrealistic to just tell
  • 22:18 - 22:21
    people to get off Facebook. I can't tell
    you how many times in the tech community
  • 22:21 - 22:26
    over the years I've heard people say well
    if you don't like it just leave. That's
  • 22:26 - 22:30
    not a realistic option for many people
    around the world and I think we all know
  • 22:30 - 22:32
    that deep down.
    applause
  • 22:32 - 22:39
    Thank you. And so the other thing I would
    say though is that public pressure works.
  • 22:39 - 22:43
    We saw last year with Facebook's real name
policy: there were a number of drag
  • 22:43 - 22:47
    performers in the San Francisco Bay Area
    who were kicked off the platform because
  • 22:47 - 22:50
    they were using their performance - their
    drag names - which is a completely
  • 22:50 - 22:55
    legitimate thing to do just as folks have
    hacker names or other pseudonyms. But
  • 22:55 - 22:59
    those folks pushed back. They formed a
    coalition and they got Facebook to change
  • 22:59 - 23:04
    a little bit. It's not completely there
    yet but they're making progress and I'm
  • 23:04 - 23:09
    hoping that this goes well. And then the
    last thing is and this is totally a pitch
  • 23:09 - 23:13
    thrown right out there: Support projects
    like ours I'm gonna throw to Matthew to
  • 23:13 - 23:17
    talk about onlinecensorship.org and
    another project done by the excellent
  • 23:17 - 23:21
    Rebecca MacKinnon called
    ranking digital rights.
  • 23:21 - 23:25
    M: So just a little bit of thinking
    outside the box onlinecensorship.org is a
  • 23:25 - 23:32
    platform that's recently launched. Users
    can go onto the platform and submit a
  • 23:32 - 23:37
    small questionnaire if their content has
    been taken down by the platforms. Why we
  • 23:37 - 23:39
    think this is exciting is because right
    now as we mentioned that transparency
  • 23:39 - 23:45
    reports are fundamentally flawed. We are
    looking to crowdsource information about
  • 23:45 - 23:49
    the ways in which the social media
    companies, six social media companies, are
  • 23:49 - 23:54
moderating and taking down content, because
    otherwise we can't know exactly what kind of
  • 23:54 - 23:57
    accountability and transparency there is in real
    time. We're hoping to be able to find
  • 23:57 - 24:04
    trends, both across the kind of content that
    has been taken down, geographic trends, and
  • 24:04 - 24:08
    news-related trends within the sort of
    self-reported content takedowns. But it's
  • 24:08 - 24:13
platforms like these that I think
    will hopefully begin to spring up in
  • 24:13 - 24:18
    response, for the community to be able to
    put tools in place so that people can be a
  • 24:18 - 24:22
    part of the reporting
    and transparency initiative.
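The crowdsourced reporting idea can be sketched as a tiny data pipeline: collect self-reported takedowns, then count them along the axes the talk mentions (geography, content type). The record fields below are assumptions for illustration, not onlinecensorship.org's actual questionnaire schema.

```python
# Hypothetical sketch of aggregating crowdsourced takedown reports;
# field names are invented, not onlinecensorship.org's real schema.
from collections import Counter
from dataclasses import dataclass

@dataclass
class TakedownReport:
    platform: str      # which of the covered social media companies
    country: str       # where the affected user is based
    content_type: str  # e.g. "photo", "page", "account"

reports = [
    TakedownReport("Facebook", "IN", "photo"),
    TakedownReport("Instagram", "CA", "photo"),
    TakedownReport("Facebook", "IN", "page"),
]

# The kind of trend-spotting the talk describes: geographic and by type.
by_country = Counter(r.country for r in reports)
by_type = Counter(r.content_type for r in reports)
print(by_country.most_common(1))  # [('IN', 2)]
print(by_type["photo"])           # 2
```

Even this minimal shape shows why self-reporting helps: the platforms' own transparency reports omit terms-of-service takedowns, so aggregate counts like these are otherwise unobservable from the outside.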
  • 24:22 - 24:24
J: We launched about a month ago and we're
    hoping to put out our first set of reports
  • 24:24 - 24:29
    around March. And finally I just want to
    close with one more quote before we slip
  • 24:29 - 24:34
into Q&A, and that is just to say it: It's
    reasonable that we press Facebook on these
  • 24:34 - 24:37
    questions of public responsibility, while
    also acknowledging that Facebook cannot be
  • 24:37 - 24:41
    all things to all people. We can demand
    that their design decisions and user
  • 24:41 - 24:46
    policies standard for review explicit
    thoughtful and open to public
  • 24:46 - 24:50
    deliberation. But - and this is the most
    important part in my view - the choices that
  • 24:50 - 24:55
    Facebook makes in their design and in
their policies are value judgments. This is
  • 24:55 - 24:58
    political and I know you've heard that in
a lot of talks. So have I. But I think we
  • 24:58 - 25:02
    can't, we cannot forget that this is all
    political and we have to address it as
  • 25:02 - 25:06
such. And for some, if that means, you
    know, quitting the platform, that's fine too.
  • 25:06 - 25:09
    But I think that we should still
    understand that our friends our relatives
  • 25:09 - 25:13
    our families are using these platforms and
    that we do owe it to everybody to make
  • 25:13 - 25:18
    them a better place for free expression
    and privacy. Thank you.
  • 25:18 - 25:24
    applause
  • 25:24 - 25:29
    Herald: Thank you so much. So please now
    we have a section of Q&A for anyone who
  • 25:29 - 25:36
has a question please use one of the mics
    on the sides. And I think we have a
  • 25:36 - 25:44
    question from one of our viewers? No. OK.
Please proceed, number one.
  • 25:44 - 25:49
Mic 1: You just addressed that; I am sort
    of especially after listening to your talk
  • 25:49 - 25:55
    I'm sort of on the verge of quitting
    Facebook or starting to I don't know.
  • 25:55 - 26:00
    applause
    Yeah, I mean and I agree it's a hard
  • 26:00 - 26:07
    decision. I've been on Facebook for I
    think six years now and it is a dispute
  • 26:07 - 26:13
    for me myself. So, I'm in this very
    strange position. And now I have to kind
  • 26:13 - 26:18
    of decide what to do. Are there any.. is
    there any help for me out there to tell me
  • 26:18 - 26:26
    what would be.. I don't know what.. that
    takes my state and helps me in deciding..
  • 26:26 - 26:29
    I don't know. It's strange.
  • 26:29 - 26:34
    J: That's such a hard question. I mean
    I'm.. I'll put on my privacy hat for just
  • 26:34 - 26:39
    a second and say what I would say to
    people when they're making that
  • 26:39 - 26:41
    consideration from a privacy viewpoint,
    because I do think that the
  • 26:41 - 26:45
    implications of privacy on these platforms
    are often much more severe than those of
  • 26:45 - 26:49
    speech. But this is what I do. So in that
    case you know I think it's really about
  • 26:49 - 26:54
    understanding your threat model,
    understanding what sort of threat you're
  • 26:54 - 26:58
    under when it comes to you know the data
    collection that these companies are
  • 26:58 - 27:02
    undertaking as well as the censorship of
    course. But I think it really is a
  • 27:02 - 27:05
    personal decision, and I'm sure that
    there are - you know - there are great
  • 27:05 - 27:09
    resources out there around digital
    security and around thinking through those
  • 27:09 - 27:13
    threat model processes, and perhaps that
    could be of help to you for that. If you
  • 27:13 - 27:17
    want to add?
    M: No I mean I think it's, it's one of
  • 27:17 - 27:21
    these big toss-ups, like, this is a system
    in which so many people are connected
  • 27:21 - 27:26
    through; even sometimes e-mail addresses
    roll over into Facebook. And so I think it's
  • 27:26 - 27:30
    about the opportunity cost of leaving a
    platform. What do you have to lose, what
  • 27:30 - 27:35
    do you have to gain? But it's also
    important to remember that, well, the snapshot
  • 27:35 - 27:40
    we see of Facebook now, it's not gonna get
    any.. it's probably not gonna get better.
  • 27:40 - 27:44
    It's probably gonna be more invasive and
    come into different parts of our
  • 27:44 - 27:50
    lives so I think from the security and
    privacy aspect it's really just up to the
  • 27:50 - 27:54
    individual.
    Mike 1: Short follow up - if I am allowed
  • 27:54 - 28:03
    to - I don't see the.. the main point for
    me is not my personal implications so I am
  • 28:03 - 28:09
    quite aware that Facebook is a bad thing
    and I can leave it, but I'm sort of
  • 28:09 - 28:14
    thinking that it's way past the point
    where we can decide on our own and decide:
  • 28:14 - 28:18
    OK, is it good for me or is it good for my
    friend or is it good for my mom or for my
  • 28:18 - 28:24
    dad or whatever? We have to think about:
    Is Facebook as such a good thing for
  • 28:24 - 28:28
    society? - as you are addressing. So I think
    we have to drive this decision making from
  • 28:28 - 28:35
    one person to a lot, lot, lot of persons.
    J: I agree and I'll note. What we're
  • 28:35 - 28:37
    talking about..
    applause
  • 28:37 - 28:41
    - I agree. What we're talking about in the
    project that we work on together is a
  • 28:41 - 28:45
    small piece of the broader issue and I
    agree that this needs to be tackled from
  • 28:45 - 28:49
    many angles.
    Herald: Ok, we have a question from one of
  • 28:49 - 28:52
    our viewers on the Internet, please.
    Signal Angel: Yeah, one of the questions
  • 28:52 - 28:56
    from the internet is: Aren't the
    moderators the real problem, who ban
  • 28:56 - 29:03
    everything which they don't really like
    rather than the providers of the service?
  • 29:03 - 29:06
    Herald: Can you please repeat that?
    Signal Angel: The question was if the
  • 29:06 - 29:11
    moderators - sometimes volunteers - aren't the
    problem, because they ban everything that
  • 29:11 - 29:15
    they don't like rather than the providers
    of a certain service?
  • 29:15 - 29:20
    J: Ahm, no I mean I would say that the
    content moderators.. we don't know who
  • 29:20 - 29:23
    they are, so that's part of the issue, as
    we don't know and I've - you know - I've
  • 29:23 - 29:27
    heard many allegations over the years when
    certain content's been taken down in a
  • 29:27 - 29:31
    certain local or cultural context
    particularly in the Arab world. I've heard
  • 29:31 - 29:35
    the accusation that like oh those content
    moderators are pro-Sisi, the dictator in
  • 29:35 - 29:39
    Egypt or whatever. I'm not sure how much
    merit that holds because like I said we
  • 29:39 - 29:45
    don't know who they are. But what I would
    say is that.. it doesn't feel like
  • 29:45 - 29:49
    they're given the resources to do their
    jobs well, so even if they were the best
  • 29:49 - 29:53
    most neutral people on earth they're given
    very little time probably very little
  • 29:53 - 29:59
    money and not a whole lot of resources to
    work with in making those determinations.
  • 29:59 - 30:01
    Herald: Thank you. We take a question from
    Mike 3 please.
  • 30:01 - 30:06
    Mike 3: Test, test. OK. First off, thank
    you so much for the talk. And I just have
  • 30:06 - 30:12
    a basic question. So it seems logical that
    Facebook is trying to put out this mantra
  • 30:12 - 30:17
    of protect the children. I can kind of get
    behind that. And it also seems based on
  • 30:17 - 30:21
    the fact that they have the "real names
    policy" that they would also expect you to
  • 30:21 - 30:26
    put in a real legal age. So if they're
    trying to censor things like nudity: Why
  • 30:26 - 30:31
    couldn't they simply use things like age
    as criteria to protect children from
  • 30:31 - 30:34
    nudity, while letting everyone else who is
    above the legal age make their own
  • 30:34 - 30:38
    decision?
    J: You wanna take that?
  • 30:38 - 30:44
    M: I think it's a few factors. One: it's, I
    guess, on the
  • 30:44 - 30:48
    technical side, what constitutes nudity.
    And in a process way, if it does get
  • 30:48 - 30:53
    flagged: when something is flagged, do
    you have a channel or two boxes to say
  • 30:53 - 30:58
    what sort of content it is? You could use
    a system in which content flagged as nudity
  • 30:58 - 31:02
    gets referred to a special nudity
    moderator, and then the moderator says:
  • 31:02 - 31:10
    "Yes, it is nudity", and then it's filtered
    for anyone less than - you know - legal age or whatever age.
  • 31:10 - 31:15
    But I think it's part of a broader, more
    systematic approach by Facebook. It's the
  • 31:15 - 31:22
    broad strokes. It's really kind of
    dictating this digital baseline, this
  • 31:22 - 31:27
    digital morality baseline, and we're
    saying: "No. Nobody in the world can see
  • 31:27 - 31:31
    this." These are our hard lines, and it
    doesn't matter what age you are or where you
  • 31:31 - 31:34
    reside. This is the box in which we are
    placing you, and content that falls
  • 31:34 - 31:39
    outside of this box for anybody,
    regardless of age or origin - this is
  • 31:39 - 31:43
    what we say you can see, and for
    anything that falls outside of that, you risk
  • 31:43 - 31:48
    having your account suspended. So I think
    it's a mechanism of control.
  • 31:48 - 31:51
    Herald: Thank you so much. I think
    unfortunately we've run out of time for
  • 31:51 - 31:54
    questions. I would like to apologize to
    everyone who's standing - maybe you have
  • 31:54 - 32:01
    time to discuss that afterwards. Thank you
    everyone and thank you.
  • 32:01 - 32:08
    applause
  • 32:08 - 32:13
    postroll music
  • 32:13 - 32:20
    subtitles created by c3subtitles.de
    Join, and help us!
Description:

Technology companies now hold an unprecedented ability to shape the world around us by limiting our ability to access certain content and by crafting proprietary algorithms that bring us our daily streams of content.

Matthew Stender, Jillian C. York

Video Language:
English
Duration:
32:20
