< Return to Video

Facebook Files Explained

  • 0:00 - 0:05
    rc3 2021 preroll now here @c-base
  • 0:05 - 0:11
    Herald: The Facebook Files: last year,
    Harvard graduate and Facebook employee
  • 0:11 - 0:18
    FH handed huge amounts of internal
    Facebook documents to, among others,
  • 0:18 - 0:25
    newspapers and journals, and even went
    before the US Senate to testify about
  • 0:25 - 0:35
    Facebook's conduct. This has gained
    quite some attention, but at
  • 0:35 - 0:40
    least in Germany not as much
    attention as it should have. And I'm
  • 0:40 - 0:50
    welcoming today Lena and Svea. Hello! They will
    be giving us a better insight into the
  • 0:50 - 0:57
    Facebook files. We will have a 30
    minute talk and then
  • 0:57 - 1:03
    a Q&A after that. So please do continue
    with your questions on the various
  • 1:03 - 1:10
    channels. They will hopefully, miraculously,
    find their way to be transferred over
  • 1:10 - 1:19
    the air to our guests, who will be giving
    us many insights. The screen is yours,
  • 1:19 - 1:23
    thank you very much.
    Svea: OK. Thank you so much. It's great to
  • 1:23 - 1:30
    be here. Sorry, first of all, for my voice,
    because I'm having the Congress flu
  • 1:30 - 1:36
    without a Congress. laughing No, just
    a bad cold, but no COVID. So
  • 1:36 - 1:43
    everything is fine. Yeah. Thank you so
    much for welcoming us today. Before we
  • 1:43 - 1:50
    dive into the Facebook files and give you
    an exclusive inside view, let us
  • 1:50 - 1:54
    briefly introduce ourselves. Lena,
    you go first.
  • 1:55 - 2:00
    Lena: I am Lena. I'm an investigative
    reporter with the WDR Zeitung in Berlin
  • 2:00 - 2:04
    and I mostly work on terror
    investigations, but also on anything complex.
  • 2:04 - 2:08
    So I, of course, jumped on the Facebook
    files as well.
  • 2:08 - 2:14
    Svea: Yes. And my name is Svea. I'm also
    an investigative reporter, working
  • 2:14 - 2:21
    freelance for NDR television, mostly on
    tech issues. And then Lena and I, we
  • 2:21 - 2:28
    worked very closely together for several
    weeks or months now on the Facebook files.
  • 2:28 - 2:36
    We were on the team that had the first contact to
    FH in Europe, and also
  • 2:36 - 2:42
    we had the chance to look at and
    work very closely with the files, and we
  • 2:42 - 2:47
    did a lot of stories on these issues. And
    so we thought, this would be a great time
  • 2:47 - 2:55
    to give some behind-the-scenes views and
    to tell you a little of what's not
  • 2:55 - 3:01
    in all the newspapers: some more details
    about FH, and also what's in
  • 3:01 - 3:07
    the files and what's not in the files and
    how they should be read or interpreted. So
  • 3:07 - 3:14
    this is what this is about, and I will now
    open the presentation on the screen and
  • 3:15 - 3:24
    vanish behind it. So yeah, have a
    great time, and Lena will start first. OK,
  • 3:24 - 3:32
    let's see. This is who we are, which we just
    told you. And then let's go, Lena.
  • 3:33 - 3:40
    Lena: I can't see the presentation.
    Svea: OK, sorry. Then let me just
  • 3:40 - 3:48
    make sure that... yes, is it? Yes. Sorry. OK,
    now this should work. Who is FH?
  • 3:48 - 3:52
    OK. Lena, you go first.
    Lena: So who is FH? Most of
  • 3:52 - 3:57
    you have seen her already, but in the
    beginning it wasn't so clear. In September,
  • 3:57 - 4:01
    in September 2021, the Wall Street
    Journal started publishing a series of
  • 4:01 - 4:07
    articles and podcasts under the name of
    the Facebook Files. The leaker was still
  • 4:07 - 4:11
    anonymous. The reports were riveting
    because these journalists from the Wall
  • 4:11 - 4:16
    Street Journal were using hundreds and
    thousands of internal documents from
  • 4:16 - 4:23
    Facebook's internal employee network, and
    only after a few weeks was it announced
  • 4:23 - 4:28
    that the leaker would cooperate with
    Congress and the SEC, the Securities and
  • 4:28 - 4:34
    Exchange Commission, and reveal her
    identity. By then, we were already
  • 4:34 - 4:41
    talking to Frances in video call, and she
    appeared on the screen as a very normal
  • 4:41 - 4:48
    woman, our age with headphones and on her
    couch. And until October, the beginning of
  • 4:48 - 4:54
    October, we knew her under a codename. So
    until she really revealed her identity,
  • 4:54 - 5:02
    she was also using a code name with us as
    journalists. And, yeah, we learned that
  • 5:02 - 5:07
    that she was 37 years old, she was born in
    Iowa. And both her parents are professors,
  • 5:08 - 5:13
    but her mother later became a priest. And
    Frances went on to study electrical and
  • 5:13 - 5:17
    computer engineering at a small college
    named Olin College in Massachusetts,
  • 5:18 - 5:23
    and she later became a data engineer and
    product manager for Google, mainly working
  • 5:23 - 5:34
    on Google Books. Then she got jobs at
    Yelp and Pinterest. And then in 2018, she
  • 5:34 - 5:41
    got offered a job at Facebook. And at first
    she was very hesitant, she told us, because
  • 5:41 - 5:45
    of Facebook's reputation for being an
    engine of radicalization and because of
  • 5:45 - 5:50
    a personal experience Frances had: what
    happened to a friend of hers. And this is a
  • 5:50 - 5:57
    clip from our interview last year. video
    plays
    [clip audio partly unintelligible] So I came
  • 5:57 - 6:06
    into Facebook. I was asked by Facebook to
    take stock of [unintelligible]
  • 6:06 - 6:10
    [unintelligible]
  • 6:10 - 6:14
    [unintelligible] this problem
  • 6:14 - 6:17
    is going to feel so much larger than
    you might otherwise think. Video ends
  • 6:17 - 6:26
    Lena: So as we heard, when she
    arrived at Facebook, the disenchantment, or
  • 6:26 - 6:34
    the problems she saw, started
    immediately, because she immediately saw
  • 6:34 - 6:39
    that what she had thought of Facebook, as
    an engine of radicalization, was even
  • 6:39 - 6:46
    bigger. And about her personal experience,
    just to explain that: a few years back,
  • 6:46 - 6:52
    she got very sick and she had an assistant
    who later became almost like a brother, a
  • 6:52 - 7:00
    very close friend, and she lost him to the
    rabbit hole. His journey started, yeah,
  • 7:00 - 7:07
    on normal social media sites and then
    4chan, and led him into the world of
  • 7:07 - 7:14
    conspiracy theories. And she told us that she was
    shocked that she lost him and that he was
  • 7:14 - 7:20
    unreachable for facts. At some point, she
    wasn't able to have a conversation with
  • 7:20 - 7:25
    him anymore. So when she got offered the
    job and was able to work in the Civic
  • 7:25 - 7:30
    Integrity team, the team at Facebook
    that was formed to protect the US
  • 7:30 - 7:40
    elections, working on civic integrity,
    she really thought that she could make a
  • 7:40 - 7:52
    change. But she realized,
    she said, that the company didn't want
  • 7:52 - 7:59
    to change, although she really admired her
    coworkers. She said they were really smart
  • 7:59 - 8:04
    and creative people. But she said the
    leadership didn't want to listen. And so
  • 8:04 - 8:13
    in the end, after about two years, she
    became a whistleblower. And when you talk
  • 8:13 - 8:20
    to her today, you can see that it seems
    like she's the perfect person to do that.
  • 8:20 - 8:25
    She seems really at ease with her role,
    and she's found her role also to be in the
  • 8:25 - 8:29
    public eye and to put a face on the
    whistleblowing. She says both her
  • 8:29 - 8:32
    parents are professors, and it feels very
    natural to her to sit and explain things
  • 8:32 - 8:38
    to people. And that's what she does now:
    she's really touring the world
  • 8:38 - 8:44
    to get her message out. And also we learned
    that she has a photographic memory, a very
  • 8:44 - 8:49
    good memory, and that she's financially
    independent because she invested in crypto
  • 8:49 - 9:02
    in the early days. So, yeah, she seems to
    be, yeah, almost the perfect whistleblower.
  • 9:02 - 9:07
    But let's go to her motivation: why,
    in the end, she wanted to become a
  • 9:07 - 9:19
    whistleblower. So these are two clips from our
    interview. Again, she told us that
  • 9:20 - 9:27
    when she was at Facebook: "I was faced,
    so this is hard, I was faced with
  • 9:27 - 9:32
    information that I believe put many, maybe
    a million or 10 million lives on the line.
  • 9:32 - 9:38
    I sat there and I was staring down at a
    situation where you believe maybe 10
  • 9:38 - 9:42
    million people could die over the next 20
    years. And I knew that I had the
  • 9:42 - 9:47
    information that could potentially save
    a fraction of those lives. I had to do
  • 9:47 - 9:53
    something." And now she says: "It's not about
    bad people or bad content at Facebook,
  • 9:53 - 9:58
    it's about a system. And like, the
    organizational incentives, the
  • 9:58 - 10:06
    incentives at Facebook, they are
    wrong. They are skewed." And then
  • 10:06 - 10:12
    also, she says that she failed to change
    the system from within, and she realized:
  • 10:12 - 10:15
    "This problem was so much larger than even
    I thought it was. I kept trying and
  • 10:15 - 10:18
    trying. At some point, I reached the
    realization that there were enough
  • 10:18 - 10:23
    systematic problems that I would have at
    some point to figure out how to bring the
  • 10:23 - 10:30
    information to the public." So she tried
    actually to make her complaints heard in
  • 10:30 - 10:35
    the company, but she
    was under the impression that the
  • 10:35 - 10:42
    leadership didn't want to listen.
    Svea: Yes. And I think, with these things
  • 10:42 - 10:50
    you have now learned about FH,
    if you know all these things, you
  • 10:50 - 10:57
    can understand the leak much better, and
    this is what we're going to do now. So we
  • 10:57 - 11:03
    will dive into the files. And if you have
    all this in mind, so what was her
  • 11:03 - 11:09
    motivation, then you will now see and
    understand better why some things are in
  • 11:09 - 11:15
    the files and why some things aren't.
    Because you now know why she did what she
  • 11:15 - 11:24
    did and what type of person, roughly, she
    is. So let's go to the revelation timeline
  • 11:24 - 11:34
    just quickly here. In December 2020,
    she worked in the Civic Integrity team,
  • 11:34 - 11:40
    but this team got dissolved and there was
    a Wall Street Journal reporter who saw
  • 11:40 - 11:46
    a chance and approached all the
    people who had worked there, and he tried to
  • 11:46 - 11:51
    get interviews with them. And he also got
    the chance to meet FH, and
  • 11:51 - 11:57
    in the podcast from the Wall
    Street Journal he tells the story of how
  • 11:57 - 12:05
    they were talking to each other. And yeah,
    probably this also gave her confidence
  • 12:05 - 12:09
    that there was somebody she probably could
    talk to, and we assume that she then
  • 12:09 - 12:15
    started collecting the files, but this
    stays blurry. So maybe she started
  • 12:15 - 12:20
    earlier. But this is something which
    is not really known and which has not been
  • 12:20 - 12:24
    answered clearly. When we asked her,
    she was always speaking
  • 12:24 - 12:30
    of summer. So maybe this
    is the summer of 2021, when the Wall Street
  • 12:30 - 12:35
    Journal story got out. And I think
    what's also interesting here to see is,
  • 12:35 - 12:40
    that you have this time period from
    December to September. So more than half a
  • 12:40 - 12:47
    year from the first contact with the
    reporter to the leak. So I think this
  • 12:47 - 12:53
    is quite interesting. And then you have
    the filing to the S.E.C. and to U.S.
  • 12:53 - 13:00
    Congress via Whistleblower Aid. We will
    talk about this later, too. And then
  • 13:00 - 13:09
    things get rushed, then
    everything gets very
  • 13:09 - 13:14
    fast. So there you have the filing, then
    you have the revealing of the identity.
  • 13:14 - 13:20
    And on the 5th of October, she is speaking
    to Congress and then all the publications
  • 13:20 - 13:27
    which maybe you all have read, or some of
    them, went out. So it's pretty
  • 13:27 - 13:33
    interesting to see. Oh yeah, and then
    we also marked where we come into
  • 13:33 - 13:38
    play, in the beginning of
    October. This is where we broke the first
  • 13:38 - 13:44
    story and then formed the EU consortium. So
    we thought, what's
  • 13:44 - 13:48
    probably most interesting for you
    guys here is not the question of what's in the
  • 13:48 - 13:54
    files (we'll tell about this later, too)
    but what's not in the files. Maybe
  • 13:54 - 14:00
    this is the more interesting question. And
    so to get the answer to that, it's
  • 14:00 - 14:06
    really important to understand the nature
    of the files where the files come from and
  • 14:06 - 14:10
    about what kind of files we are talking
    when we're speaking about the Facebook
  • 14:10 - 14:21
    files. And as Lena already mentioned,
    FH was a data scientist within
  • 14:21 - 14:25
    Facebook. She worked with the Civic
    Integrity Team and later she worked with
  • 14:25 - 14:31
    counter-espionage. So she was a regular
    Facebook employee. She didn't have
  • 14:31 - 14:37
    a high rank or anything. She wasn't part
    of the board, she wasn't an executive
  • 14:37 - 14:45
    person. So of course, she had limited
    access to documents. But yeah, luckily for
  • 14:45 - 14:51
    her, Facebook maintains a quite
    transparent approach regarding its own
  • 14:51 - 14:56
    research and a lot of other relevant
    information, probably a lot more
  • 14:56 - 15:01
    transparent than other companies. So at
    Facebook you have some kind
  • 15:01 - 15:08
    of internal Facebook, which is called
    Workplace, and on Workplace you find a
  • 15:08 - 15:15
    vast, vast amount of internal research
    reports, and people are discussing this
  • 15:15 - 15:22
    research with each other. But I must
    admit, in this research you have a
  • 15:22 - 15:28
    lot of technical terms and you have a lot
    of teams speaking to each other; the
  • 15:28 - 15:35
    research is made for Facebook employees.
    Of course, it's full of abbreviations or,
  • 15:35 - 15:42
    yeah, terms you can't, or can hardly,
    understand as an outsider, so we will get
  • 15:42 - 15:49
    into that later. So to give you a glimpse
    of what a file looks like: what did we see when
  • 15:49 - 15:55
    we dived into it? You have these
    totally unstructured PDF documents with,
  • 15:55 - 16:02
    estimated, more than 10,000 pages. They are
    all photographed. And there you have this
  • 16:02 - 16:08
    research and also these discussions and
    you see here where I did the pink arrows.
  • 16:08 - 16:13
    You can see how it looks and it looks a
    little bit like Facebook. You have groups
  • 16:13 - 16:19
    of groups and you have comments, you have
    smileys and then you have these black
  • 16:19 - 16:24
    redactions, which were made later on
    because all these files were made for
  • 16:24 - 16:31
    Congress. And so all the names were
    redacted. And this is pretty
  • 16:31 - 16:38
    important to know when you think about
    what the files don't tell. So what's
  • 16:38 - 16:45
    missing? As I told you, FH was not
    part of the executive
  • 16:45 - 16:52
    board, so it is not in the files what Mark
    Zuckerberg or other key executives know. As
  • 16:52 - 16:59
    these are not reports directly to Mark
    Zuckerberg but rather research reports
  • 16:59 - 17:07
    for the internal network, they don't
    say much about leadership or about
  • 17:07 - 17:13
    decisions from leadership or what was
    discussed by leadership. Sometimes you
  • 17:13 - 17:18
    have these postings in the internal
    Facebook where leadership is discussing
  • 17:18 - 17:24
    something, so you get an idea or a glimpse
    of what the board or high-ranking executives
  • 17:24 - 17:29
    thought. But this is basically
    something the files are not telling.
  • 17:29 - 17:37
    Then the Facebook files don't provide any
    context. And this is, I think,
  • 17:37 - 17:43
    also what FH thinks is, yeah,
    really the reason why she gave all
  • 17:43 - 17:49
    these files to journalists: because she
    hoped that we could provide context,
  • 17:49 - 17:57
    because there is no information on who
    did a report. Usually the authors are
  • 17:57 - 18:02
    redacted, so you can't see: is he a good
    researcher? Has he been working at
  • 18:02 - 18:08
    Facebook for long? What kind of
    education does he have? So you don't know how reliable
  • 18:08 - 18:14
    they are, or what happened before the
    study, or after it. And I think this is
  • 18:14 - 18:20
    very important and we especially saw this.
    I don't know if you remember the reporting
  • 18:20 - 18:26
    on the Instagram study and when you read
    the Instagram study very closely, you see
  • 18:26 - 18:32
    that they are talking about a
    dozen people with whom they conducted
  • 18:32 - 18:40
    qualitative interviews.
    And this is research which is highly
  • 18:40 - 18:47
    qualitative and not quantitative, so it's
    not representative. And so this is very
  • 18:47 - 18:54
    important if you look at the files: that
    you look at the numbers,
  • 18:54 - 19:01
    with how many people a study was
    conducted, or with how many accounts, or,
  • 19:01 - 19:08
    yeah, what is it really about?
    And then you also see in the files that
  • 19:08 - 19:12
    there are specific areas which are very
    well represented, like hate speech. I
  • 19:12 - 19:19
    think this is an issue FH worked a lot on,
    as did the Civic Integrity team.
  • 19:19 - 19:24
    But other areas are
    missing or not represented. So, for
  • 19:24 - 19:29
    example, I'm very interested in fake
    accounts. So this was the first thing I
  • 19:29 - 19:35
    did when I was sweating through the files:
    what research is there on fake
  • 19:35 - 19:41
    accounts? There is some research, of
    course, but yeah, I
  • 19:41 - 19:47
    was missing content and I thought, oh,
    there needs to be more. Or about scams: I
  • 19:47 - 19:52
    don't know if you know love scams,
    though. I did not find anything about this
  • 19:52 - 19:57
    or about click farms. Then engagement numbers:
    there are some engagement numbers, but not
  • 19:57 - 20:04
    that much. So I can only speculate about
    why some areas are very well
  • 20:04 - 20:10
    represented in the files, and some aren't.
    But I think there are probably different
  • 20:10 - 20:18
    reasons. So on the one hand, FH
    had a specific, limited time to
  • 20:18 - 20:26
    get these files, and probably she also had
    limited areas where she could go and read
  • 20:26 - 20:32
    these documents. And also, of course,
    this is internal research; this is
  • 20:32 - 20:37
    entirely research, these are internal
    studies. So there are only studies that
  • 20:37 - 20:42
    Facebook employees started,
    where they thought: this is something we definitely should
  • 20:42 - 20:47
    investigate. And if they
    didn't feel the urge to investigate something, of
  • 20:47 - 20:53
    course there can't be a file. So these are
    some reasons, I think, why some areas are
  • 20:53 - 21:02
    missing. OK, so now [laughs] let's get to
    the good part. So what's in the files? I
  • 21:02 - 21:08
    think some of you probably have read or
    heard about the Wall Street Journal
  • 21:08 - 21:13
    revelations, so we do not want to dive
    into this because it is broadly known:
  • 21:13 - 21:18
    that celebrities were treated
    differently, that human trafficking goes
  • 21:18 - 21:25
    on on Facebook (I don't know who watched
    Jan Böhmermann, he was talking about this),
  • 21:25 - 21:31
    that Instagram was toxic for teens, or
    Mexican drug cartels using Facebook, or
  • 21:31 - 21:36
    that Facebook changed the news feed algorithm
    and polarization got worse. So these
  • 21:36 - 21:43
    were the first revelations.
    So let's talk about growth. So this was
  • 21:43 - 21:48
    something I was looking into, because
    I'm pretty interested in the whole fake
  • 21:48 - 21:53
    accounts area. So I looked into growth and
    I found it pretty interesting because
  • 21:53 - 22:00
    Facebook is always speaking about growth
    and that they are growing and making more
  • 22:00 - 22:06
    and more profits. And of course, this is
    true. But if you look through the files
  • 22:06 - 22:11
    you can see that for young users,
    especially users under the age of 25,
  • 22:11 - 22:17
    the numbers are decreasing. And
    that's what you can see here pretty
  • 22:17 - 22:23
    clearly. You have the red line and then
    you have the blue lines. These are
  • 22:23 - 22:32
    symbolizing the younger users under 25.
    And even in COVID times, where you have, in
  • 22:32 - 22:40
    the right corner, you see this spike in
    all the other age groups. Especially, I
  • 22:40 - 22:45
    think, people above 50 got highly
    interested in Facebook during COVID. Now
  • 22:45 - 22:50
    it's just shocking. So you have every age
    group using Facebook more
  • 22:51 - 22:59
    during COVID, but not the people under
    25. And this is pretty interesting because
  • 22:59 - 23:05
    of the filing with the SEC, the
    Börsenaufsicht (securities regulator), because what about
  • 23:05 - 23:10
    the advertisers, and is Facebook
    really telling the truth when they're always
  • 23:10 - 23:17
    talking about growth and profits? Then, I
    think, hate speech is pretty important. And
  • 23:17 - 23:23
    there are a lot of case studies about
    especially poorer countries or high-risk
  • 23:23 - 23:28
    countries, showing that
    polarization goes on there and that Facebook is
  • 23:28 - 23:36
    not taking enough measures, especially for
    smaller languages, for Arabic
  • 23:36 - 23:43
    countries or Asian countries or African
    countries. Even worse where you have
  • 23:43 - 23:48
    often a lot of dialects and a lot of
    languages, and there are not many
  • 23:48 - 23:54
    people doing, for example, content
    moderation. And one of the probably most
  • 23:54 - 24:00
    difficult documents, but I think one of
    the most interesting ones, this one. So
  • 24:00 - 24:05
    sorry, I think you can't read a lot, but I
    will explain. This report is called
  • 24:05 - 24:12
    the OpEx report. And it's really a
    long document with a lot of numbers, but I
  • 24:12 - 24:16
    liked it very much because there were so
    many numbers and it was quite an official
  • 24:16 - 24:23
    document, and not only a study one
    researcher did. And in this OpEx report,
  • 24:23 - 24:28
    you can see the misproportion in
    spending money to fight misinformation
  • 24:28 - 24:34
    between English-speaking countries like
    the U.S. and RoW, the rest of world.
  • 24:35 - 24:41
    So you see here on the document, on the
    right: in the first column you see
  • 24:41 - 24:47
    misinformation, and then you have these
    rows, and in the right corner
  • 24:47 - 24:53
    of the row, you can see the money, or man
    hours; it is transferred into man hours.
  • 24:54 - 25:03
    Here, for the
    first quarter of 2020, 84
  • 25:03 - 25:11
    percent of man hours was spent on misinformation for
    the U.S., and the rest of world got 16 percent. If you
  • 25:11 - 25:17
    compare the U.S. to the rest of the world,
    you can definitely see where the
  • 25:17 - 25:23
    focus, where Facebook's focus, lies. And this
    also was one document FH
  • 25:23 - 25:28
    pointed us to and said: you
    have to check the numbers, and then you can
  • 25:28 - 25:35
    see what I mean when I say that, yeah,
    profit and growth are more important for
  • 25:35 - 25:44
    Facebook than human lives. I hope
    this document makes this clear.
  • 25:44 - 25:50
    OK, yeah. What does Facebook say? Facebook
    says: no, we're doing a lot, of course, and
  • 25:50 - 25:56
    everything, and the people using Facebook,
    this is really important to us. OK, and
  • 25:56 - 26:03
    now for the last part I hand over; yeah, Lena
    will tell you something more about the
  • 26:03 - 26:09
    fight within. Lena: Yeah, thank you very
    much. So what is really interesting to us
  • 26:09 - 26:15
    is to see the discussion in the work
    culture and the discussions among
  • 26:15 - 26:20
    teams and among employees. And you could
    see that in the chats that were going on,
  • 26:20 - 26:27
    especially under some of the studies. And
    it was kind of confirming our impression
  • 26:27 - 26:33
    that at Facebook there was a lot of
    debate, internal debate, and that there was
  • 26:33 - 26:38
    a lot of frustration among employees
    because as we have counted, there were at
  • 26:38 - 26:45
    least 11 major leaks since 2016. Most of
    them remained anonymous. But there were
  • 26:45 - 26:52
    also some that went public.
    For example, Sophie Zhang went public
  • 26:52 - 27:02
    in April 2021, just
    a few months before FH came
  • 27:02 - 27:10
    out and became public and made
    it public that she had leaked documents.
  • 27:10 - 27:15
    Zhang did not leak any documents, but
    she had talked to reporters. And
  • 27:15 - 27:24
    when she ended her
    employment with Facebook, she
  • 27:24 - 27:28
    posted a badge post, this kind of
    farewell post, and we found many
  • 27:28 - 27:35
    of these badge posts. And she posted: I found
    multiple blatant attempts by foreign
  • 27:35 - 27:39
    national governments to abuse our platform
    to mislead their own
  • 27:39 - 27:45
    citizenry, and caused international news on
    multiple occasions. And just as
  • 27:45 - 27:53
    FH, she was concerned with
    Facebook's lack of content moderation and
  • 27:53 - 28:00
    lack of enforcement of community standards
    outside of the U.S. And so this is just to
  • 28:00 - 28:08
    show you that this debate was very,
    very open internally. So I mean, we
  • 28:08 - 28:19
    found another badge post. It was called
    "Leaving Q&A". This was
  • 28:20 - 28:31
    from May 2021, and there this person was
    concerned about hate speech, and this
  • 28:31 - 28:36
    person said she couldn't take it anymore,
    just to be a Facebook employee, because
  • 28:36 - 28:40
    she says: with so many internal forces propping
    up the production of hateful and violent
  • 28:40 - 28:46
    content, the task of stopping hate and
    violence on Facebook starts to feel even
  • 28:46 - 28:51
    more sisyphean than it already is. And she's
    naming internal forces, meaning the
  • 28:51 - 28:58
    leadership. And this,
    what this person says about so many internal
  • 28:58 - 29:03
    forces and the leadership, is
    something that FH was also
    something that FH was also
  • 29:03 - 29:10
    really concerned about: that on the one
    hand, there are people working on
  • 29:10 - 29:15
    combating hate speech and changing the
    algorithms to make it a better and
  • 29:16 - 29:22
    safer environment, and then, on the other
    hand, there are some forces
  • 29:22 - 29:30
    in the company that apparently work
    against these efforts. And this fight that
  • 29:30 - 29:36
    we could see (could you go back?),
    this fight that we could see, was
  • 29:36 - 29:50
    especially visible after January 6, after
    the storming of the Capitol, and there
  • 29:50 - 29:56
    employees were very much discussing, and
    they were furious. And one
  • 29:56 - 30:00
    person said: we've been fueling
    this fire for a long time and we shouldn't
  • 30:00 - 30:05
    be surprised it's now out of control. So
    employees are giving Facebook
  • 30:06 - 30:12
    responsibility for this development. And
    another person
  • 30:13 - 30:17
    said: employees are tired of thoughts
    and prayers from leadership. We want
  • 30:17 - 30:22
    action. And another person said so many
    research backed ideas get shut down. We
  • 30:22 - 30:27
    need to do a better job at making
    decisions from a research perspective, and
  • 30:27 - 30:33
    this is something that's also very close
    to FH. She said in our
  • 30:33 - 30:40
    interview, she said there are solutions,
    there are wonderful employees and
  • 30:40 - 30:46
    wonderful teams who are working to come
    up with solutions. But they are
  • 30:46 - 30:53
    blocked by leadership out of for-profit
    interests. And here's a screenshot of
  • 30:53 - 30:59
    another person saying I'm struggling to
    match my values to my employment here. I
  • 30:59 - 31:05
    came here hoping to effect change and
    improve society, but all I've seen is atrophy
  • 31:05 - 31:10
    and abdication of responsibility. I'm
    tired of platitudes. I want action items.
  • 31:10 - 31:17
    We are not a neutral entity. So employees
    seem to be extremely critical of
  • 31:17 - 31:32
    Facebook. Yeah. And so as a general take
    away, you could say that we expect to see
  • 31:32 - 31:38
    more leaks from Facebook. We expect to see
    more whistleblowers coming out of this
  • 31:38 - 31:42
    work culture because people seem to be
    extremely frustrated. And this is also
  • 31:42 - 31:47
    just to wrap it up. This is Facebook's
    reaction to the Facebook files and the
  • 31:47 - 31:53
    revelations that came out after October 4th,
    after FH went public. Mark
  • 31:53 - 32:00
    Zuckerberg, in a
    message to his employees, said: "We care
  • 32:00 - 32:05
    deeply about issues like safety, well-
    being and mental health, and that the coverage
  • 32:05 - 32:13
    misrepresents our work and our
    motives." And Facebook Communications VP
  • 32:13 - 32:22
    John Pinette even said that it was an
    orchestrated campaign against Facebook,
  • 32:22 - 32:28
    and in response to our reporting he said: "We
    welcome scrutiny and feedback, but these
  • 32:28 - 32:32
    documents are being used to paint a
    narrative that we hide
  • 32:32 - 32:37
    or cherry-picked data,
    when in fact we do the
  • 32:37 - 32:43
    opposite." So they are kind
    of flatly rejecting FH's
  • 32:43 - 32:53
    accusations. But earn a T. What what's
    happened to FH during this
  • 32:53 - 33:00
    process? I think he was talking to a
    reporter early on and she was on her own.
  • 33:02 - 33:08
    But when when she started and when when
    the when the revelations started in The
  • 33:08 - 33:18
    Wall Street Journal, apparently we don't
    know. And I am sorry, I'm hearing the
  • 33:18 - 33:24
    sound from the c-base, and that's why I
    was a bit confused. But in the process of
  • 33:24 - 33:30
    leaking documents, she got in contact with
    Whistleblower Aid, which was founded
  • 33:30 - 33:39
    by a former whistleblower, John Tye, and
    which is a nonprofit organization claiming
  • 33:39 - 33:43
    that no one should have to risk their
    career or their freedom to follow their
  • 33:43 - 33:48
    conscience. And some of you may know the
    executive director, Libby Liu, a former CEO
  • 33:48 - 33:56
    of Open Technology Fund. So they helped
    Frances to make a protected
  • 33:57 - 34:07
    disclosure. And she also had lawyers who
    represent her. They are from Bryson
  • 34:07 - 34:14
    Gillette. That's a consulting and law firm
    founded in 2020 by Bill Burton, a former
  • 34:14 - 34:20
    deputy spokesman for Obama. And
    Bryson Gillette is also
  • 34:20 - 34:23
    involved with the Lincoln Project. So they
    are clearly from the Democratic spectrum.
  • 34:25 - 34:29
    And this was also why Facebook could
    easily make the claim that it was an
  • 34:29 - 34:35
    orchestrated campaign by the Democrats, an
    orchestrated political campaign.
  • 34:35 - 34:42
    But let's go on to more groups
    involved with Frances, just to wrap it up,
  • 34:42 - 34:48
    and look at Luminate, which was founded
    by Pierre Omidyar, the founder of eBay,
  • 34:49 - 34:54
    and where Ben Scott is also involved. He was a
    senior policy adviser for innovation at
  • 34:54 - 34:59
    the U.S. Department of State under Hillary
    Clinton, and they are
  • 35:00 - 35:06
    cooperating closely on funding, for
    example of Reset Tech, operating in the
  • 35:06 - 35:11
    U.S. and Europe, a nonprofit
    lobbying organization that wants to
  • 35:11 - 35:16
    regulate the market for Big Tech. So these
    organizations, they come from the
  • 35:16 - 35:21
    Democratic spectrum, there's no question.
    But our
  • 35:21 - 35:28
    impression was that their interests
    match FH's. So they
  • 35:28 - 35:35
    have kind of the same goal, and
    they came together in the process.
  • 35:35 - 35:47
    Svea: Yeah, really important to see now,
    OK, what's next? So we have reported on
  • 35:47 - 35:55
    the Facebook Files in Europe, in the U.S.,
    but it's pretty exciting to see. Yeah,
  • 35:55 - 36:01
    what is happening? Is Facebook changing
    something or is there any regulation
  • 36:01 - 36:07
    coming? So of course, the future is still
    open and not written, but to give you a
  • 36:07 - 36:12
    little glimpse of what happened after
    the 7th of October: FH
  • 36:12 - 36:18
    went on some kind of tour through
    Europe. She was in London, she was in
  • 36:18 - 36:25
    Lisbon, and in November she
    went to Berlin, to Brussels and to Paris.
  • 36:25 - 36:30
    So why did she do that? Why did she travel
    from Puerto Rico to the U.S. and then from
  • 36:30 - 36:37
    the U.S. through Europe? I think every
    three days another city, it was a tour
  • 36:37 - 36:42
    like kind of a rock star tour or something.
    Why did she do this? Because,
  • 36:43 - 36:48
    yeah, she clearly has some kind of
    agenda. She definitely wants change.
  • 36:48 - 36:54
    And she did not want to throw a lot of
    documents on the internet and then hide in
  • 36:54 - 36:59
    a castle or somewhere else. She
    definitely wanted something. She
  • 36:59 - 37:08
    wanted action. So she had big hopes in
    Europe because as probably some of you
  • 37:08 - 37:14
    know, here the Digital Services Act is
    being debated. It was debated in the Parliament
  • 37:14 - 37:21
    and now it's up to the commission. So
    hopefully next year, the Digital Services
  • 37:21 - 37:29
    Act will be in place. And yeah, she and
    also many other groups are hoping for more
  • 37:29 - 37:35
    regulations and control, especially
    regarding content moderation, and
  • 37:35 - 37:43
    also transparency, so that you can see
    what is going on on these platforms and in
  • 37:43 - 37:51
    the U.S., the hope lies in Congress and in
    the SEC. So I'm sure they hope
  • 37:51 - 37:56
    that the SEC will impose a billion-
    dollar fine on Facebook and also
  • 37:56 - 38:04
    push for more regulation. And
    the discussion centers around users
  • 38:04 - 38:10
    getting more control of their data,
    more transparency and
  • 38:10 - 38:17
    probably also more taxation
    on digital ads, just to make
  • 38:17 - 38:23
    the business model harder. Because I think
    this is the only way:
  • 38:23 - 38:32
    yeah, to do something about the
    business model. OK. So yeah, we hope you
  • 38:32 - 38:36
    enjoyed our presentation. Yeah, I
    want to thank you for listening, and also
  • 38:36 - 38:42
    big thanks to the team who worked on it.
    And I hope we have another five to 10
  • 38:42 - 38:44
    minutes for some Q&A.
  • 38:44 - 38:49
    Herald: Yes, thank you. Yes, we have
    indeed another 10 minutes before we have
  • 38:49 - 38:54
    to prepare for the next talk. Thank you
    very much for these 45 minutes of
  • 38:54 - 39:04
    content. Svea laughs Very well.
    You are giving the impression,
  • 39:04 - 39:09
    your presentation gives the impression,
    that Facebook is basically rotting from
  • 39:09 - 39:16
    within, that it's only a question of when
    mutiny will break out within the Facebook
  • 39:16 - 39:22
    workforce, and that 85 percent of all the
    content comes from payments for the ugly
  • 39:23 - 39:31
    content and not for the good one. How true
    is this? How evil is Facebook compared to
  • 39:32 - 39:37
    other debated evildoers like Telegram or
    others? What do you think?
  • 39:41 - 39:50
    Svea: I think that with the Facebook
    files you have the feeling that
  • 39:50 - 39:56
    you are in a submarine, you know, diving
    into the ocean, and you start
  • 39:56 - 40:02
    diving underwater and you go with one
    light, and
  • 40:02 - 40:07
    you shine it into the dark, and this is the
    Facebook files: you can't see what
  • 40:07 - 40:13
    surrounds it. So I
    think it's not fair to say the
  • 40:13 - 40:18
    whole company is like this; it's only fair
    to say what you can see in the
  • 40:18 - 40:23
    comments, for example after the storming of
    the Capitol, where many employees were
  • 40:23 - 40:29
    speaking up. So I think it's not that the
    whole company is rotten from within, but
  • 40:29 - 40:34
    of course, there are parts which are
    rotten. And this, I think, is shown in the
  • 40:34 - 40:39
    files. I think it's really hard to compare
    Facebook to Telegram or Google or
  • 40:39 - 40:46
    something, because I think Facebook, or
    Meta, as they are now called, is quite a
  • 40:46 - 40:53
    unique player on the market.
    Yeah, they bought Instagram. They
  • 40:53 - 41:00
    bought WhatsApp. You can't compare
    Facebook or Meta to Telegram because this
  • 41:00 - 41:07
    is a completely different scale. So yeah,
    that's it, I hope I could answer it. It's a
  • 41:07 - 41:12
    little bit of a philosophical question, so
    I hope my answer is okay. laughs
  • 41:12 - 41:18
    Herald: I mean, that's OK. You know...
    Lena: I want to add, I think it's true.
  • 41:18 - 41:24
    What we have, of course, is only a glimpse
    of some of the internal
  • 41:24 - 41:30
    discussions. But it gives us an
    impression. And I think the impression,
  • 41:30 - 41:35
    I would say, is right: there is a lot
    of frustration among the
  • 41:35 - 41:39
    employees, because what Facebook says on
    the outside, we are good, we are
  • 41:39 - 41:46
    connecting people, whatever their slogan
    is, seems not to be
  • 41:46 - 41:54
    confirmed by many employees. They
    think, just as Frances Haugen,
  • 41:54 - 42:02
    that the company is putting profit
    over safety. And there may
  • 42:02 - 42:06
    probably be a lot of employees that are
    super happy to work at Facebook and
  • 42:06 - 42:10
    don't see it. And they have
    amazing perks that they get at the office.
  • 42:10 - 42:15
    You know, they have little kitchens.
    I mean, it seems to be a wonderful
  • 42:15 - 42:20
    place to work. But if you care about these
    issues, it seems that many, many people who
  • 42:20 - 42:25
    really, really care about these issues,
    such as hate speech, such as
  • 42:25 - 42:30
    democratic processes, such as foreign
    influence on elections, seem to
  • 42:30 - 42:36
    be very frustrated. So as I said, I expect
    more whistleblowers. I expect more leaks
  • 42:36 - 42:37
    to come.
    Herald: Hmm.
  • 42:37 - 42:41
    Lena: And also, just to end this,
    it's a structural
  • 42:41 - 42:46
    thing. When you compare it, for
    example, to Twitter, it seems that
  • 42:46 - 42:52
    Twitter is doing many things much better
    than Facebook. But as I said,
  • 42:52 - 42:58
    you can't really compare. There is a
    discussion within Facebook
  • 42:58 - 43:04
    about a firewall between those parts of
    the company that are enforcing
  • 43:04 - 43:10
    internal policies and those parts of the
    company that are responsible for
  • 43:10 - 43:16
    the numbers and who are selling ads, for
    example. And apparently, at Twitter
  • 43:16 - 43:22
    there is a firewall, and at
    Facebook there's no firewall. So the same
  • 43:22 - 43:29
    people who are selling ads are also making
    exceptions for political actors or
  • 43:29 - 43:35
    for certain behaviors. And this is a
    structural thing that frustrated many
  • 43:35 - 43:39
    people.
    Herald: No Chinese wall. I see. I am
  • 43:39 - 43:45
    getting signals that we have only three
    minutes left. A very short question and a
  • 43:45 - 43:50
    short answer, from our technical audience:
    You talked about the internal platform
  • 43:50 - 43:55
    Workplace, Facebook's internal knowledge
    base. It would, of course, be interesting
  • 43:55 - 44:02
    to see how user-level access controls,
    encryption and rights management work on
  • 44:02 - 44:07
    this platform. Is anything in the leaks
    about that? No?
  • 44:07 - 44:13
    Lena: No, only research reports, only the
    vast amount of research reports.
  • 44:13 - 44:18
    Herald: Mm hmm. But this vast amount seems
    to be accessible. OK, somebody is
  • 44:18 - 44:23
    waving us goodbye, I see. Thank you so
    much. Interesting insights. Interesting
  • 44:23 - 44:29
    updates, folks. Follow this issue. It's
    not over. Things will be coming. Thank
  • 44:29 - 44:34
    you, Lena. Thank you, Svea. I wish you
    good health through the winter and thank
  • 44:34 - 44:38
    you again Svea laughs for your great
    effort.
  • 44:38 - 44:43
    Svea, Lena: Thank you.
    postroll music Everything is licensed
  • 44:43 - 44:47
    under CC BY 4.0. And it is all for the
    community and for everybody.
  • 44:47 - 44:54
    Subtitles created by c3subtitles.de
    in the year 2022. Join, and help us!