
36C3 - Inside the Fake Like Factories

  • 0:00 - 0:19
    36C3 preroll music
  • 0:19 - 0:26
    Herald: OK. So inside the fake like
    factories. I'm going to date myself. I
  • 0:26 - 0:33
    remember it was the Congress around
    1990, 1991 or so, where I was sitting
  • 0:33 - 0:39
    together with some people who came over to
    the states to visit the CCC Congress. And
  • 0:39 - 0:43
    we were kind of riffing on how great the
    internet is gonna make the world, you
  • 0:43 - 0:47
    know, how how it's gonna bring world peace
    and truth will rule and everything like
  • 0:47 - 0:57
    that. Boy, were we naive, boy, were we
    totally wrong. And today I'm going to be
  • 0:57 - 1:03
    schooled in how wrong I actually was
    because we have Svea, Dennis and Philip to
  • 1:03 - 1:09
    tell us all about the fake like factories
    around the world. And with that, could you
  • 1:09 - 1:18
    please help me in welcoming them onto the
    stage? Svea, Dennis and Philip.
  • 1:18 - 1:29
    Philip: Thank you very much. Welcome to
    our talk "Inside the Fake Like Factories
  • 1:29 - 1:36
    ". My name is Philip. I'm an Internet
    activist against disinformation and I'm
  • 1:36 - 1:39
    also a student of the University of
    Bamberg.
  • 1:39 - 1:45
    Svea: Hi. Thank you for listening to us
    tonight. My name is Svea. I'm an
  • 1:45 - 1:50
    investigative journalist, freelancing mostly
    for NDR and ARD, which are public
  • 1:50 - 1:56
    broadcasters in Germany. And I focus on
    tech issues. And I had the pleasure to
  • 1:56 - 2:01
    work with these two guys on, for me, a
    journalistic project and for them on a
  • 2:01 - 2:04
    scientific project.
    Dennis: Yeah. Hi, everyone. My name is
  • 2:04 - 2:09
    Dennis. I'm a PhD student from Ruhr
    University Bochum. I'm working as a
  • 2:09 - 2:16
    research assistant for the chair for
    System Security. My research focuses on
  • 2:16 - 2:21
    network security topics and Internet
    measurements. And as Svea said, Philip and
  • 2:21 - 2:27
    myself, we are here for the scientific
    part and Svea is for the journalistic part
  • 2:27 - 2:32
    here.
    Philip: So here's our outline for today.
  • 2:32 - 2:39
    So first, I'm going to briefly talk about
    our motivation for our descent into the
  • 2:39 - 2:45
    fake like factories and then we are going
    to show you how we got our hands on ninety
  • 2:45 - 2:51
    thousand fake like campaigns of a major
    crowdworking platform. And we are also
  • 2:51 - 2:56
    going to show you why we think that there
    are 10 billion registered Facebook users
  • 2:56 - 3:04
    today. So first, I'm going to talk about
    the like button. The like button is the
  • 3:04 - 3:12
    ultimate indicator for popularity on
    social media. It shows you how trustworthy
  • 3:12 - 3:19
    someone is. It shows how popular
    someone is. It is an indicator
  • 3:19 - 3:27
    for economic success of brands and it also
    influences the Facebook algorithm. And as
  • 3:27 - 3:32
    we are going to show now, these kinds of
    likes can be easily forged and
  • 3:32 - 3:39
    manipulated. But the problem is that many
    users will still prefer this bad info on
  • 3:39 - 3:46
    Facebook about the popularity of a product
    to no info at all. And so this is a real
  • 3:46 - 3:54
    problem. And there is no real solution to
    this. So first, we are going to talk about
  • 3:54 - 3:59
    the factories and the workers in the fake
    like factories.
  • 3:59 - 4:04
    Svea: That there are fake likes and that
    you can buy likes everywhere is well
  • 4:04 - 4:10
    known. So if you Google "buying fake
    likes" or even "fake comments" for
  • 4:10 - 4:15
    Instagram or for Facebook, then you will
    get like hundreds of results and you can
  • 4:15 - 4:20
    buy them very cheap and very expensive. It
    doesn't matter, you can buy them from
  • 4:20 - 4:28
    every country. But when you think of these
    bought likes, then you may think of this.
  • 4:28 - 4:35
    So you may think of somebody sitting in
    China, Pakistan or India, and you think of
  • 4:35 - 4:40
    computers and machines doing all this and
    that they are, yeah, that they are fake
  • 4:40 - 4:48
    and also that they can easily be detected
    and that maybe they are not a big problem.
  • 4:48 - 4:55
    But it's not always like this. It also can
    be like this. So, I want you to meet
  • 4:55 - 5:03
    Maria, I met her in Berlin. And Harald, he
    lives near Mönchengladbach. So Maria, she
  • 5:03 - 5:12
    is a retiree. She was a police
    officer. And as money is always short, she
  • 5:12 - 5:20
    is clicking Facebook likes for money. She
    earns between 2 and 6 cents per like.
  • 5:20 - 5:29
    And Harald, he was a baker once, is now
    getting social aid and he is also clicking
  • 5:29 - 5:34
    and liking and commenting the whole day.
    We met them during our research project
  • 5:34 - 5:41
    and did some interviews about their likes.
    And one platform they are clicking and
  • 5:41 - 5:47
    working for is PaidLikes. It's only one
    platform out of a universe, out of a
  • 5:47 - 5:52
    cosmos. PaidLikes, they are sitting just a
    couple of minutes from here in Magdeburg
  • 5:52 - 5:57
    and they are offering that you can earn
    money with liking on different platforms.
  • 5:57 - 6:02
    And it looks like this when you log into
    the platform with your Facebook account
  • 6:02 - 6:07
    then in the morning, in the
    afternoon, in the evening, you get what we
  • 6:07 - 6:13
    call campaigns. But these are pages,
    Facebook fan pages or Instagram pages, or
  • 6:13 - 6:18
    posts, or comments. You can, you know, you
    can work your way through them and click
  • 6:18 - 6:23
    them. And you see here the blue
    bars; I blurred them because we don't want
  • 6:23 - 6:30
    to get sued by all these companies
    which you could see there. To take you a
  • 6:30 - 6:37
    little bit with me on the journey. Harald,
    he was okay with us coming by for
  • 6:37 - 6:44
    television and he was okay that we did a
    long interview with him, and I want to
  • 6:44 - 6:50
    show you a very small piece out of his
    daily life sitting there doing the
  • 6:50 - 6:54
    household, the washing and the cleaning,
    and clicking.
  • 7:27 - 7:36
    Come on. It could be like that. You click
    and you earn some money. How did we meet
  • 7:36 - 7:41
    him and all the others? Of course, because
    Philip and Dennis, they have a more
  • 7:41 - 7:45
    scientific approach. So it was also
    important not only to talk to one or two,
  • 7:45 - 7:50
    but to talk to many. So we created a
    Facebook fan page, which we call "Eine
  • 7:50 - 7:54
    Linie unterm Strich" (a line under a line)
    because I thought, okay, nobody will like
  • 7:54 - 8:01
    this freely. And then we did a post. This
    post, and we bought likes, and you won't
  • 8:01 - 8:10
    believe it, it worked so well; 222 people,
    all the people I paid for liked this. And
  • 8:10 - 8:18
    then we wrote to all of them and we talked to
    many of them. Some of them only in
  • 8:18 - 8:23
    writing, some of them we just called
    or had a phone chat with. But they gave us a
  • 8:23 - 8:30
    lot of information about their life as a
    click worker, which I will sum up. So what
  • 8:30 - 8:36
    PaidLikes itself says is that
    they have 30,000 registered users, and it's
  • 8:36 - 8:41
    really interesting because you might think
    that they are all registered with 10 or 15
  • 8:41 - 8:46
    accounts, but most of them, they are not.
    They are clicking with their real account,
  • 8:46 - 8:58
    which makes it really hard to detect them.
    So they even scan their I.D. so that the
  • 8:58 - 9:03
    company knows that they are real. Then
    they earn their money. And we met men,
  • 9:03 - 9:10
    women, stay-at-home moms, low-income
    earners, retirees, people who are getting
  • 9:10 - 9:18
    social care. So, basically, anybody. There
    was no kind of bias. And many of them are
  • 9:18 - 9:25
    clicking for two or more platforms.
    I didn't meet anybody who's only
  • 9:25 - 9:29
    clicking for one platform. They all have a
    variety of platforms where they are
  • 9:29 - 9:35
    writing comments or clicking likes. And
    you can make - this is what they told us -
  • 9:35 - 9:42
    between 15 euro and 450 euro monthly, if
    you are a so-called power clicker and you
  • 9:42 - 9:48
    do this in a somewhat professional way. But
    these are only the workers, and maybe you
  • 9:48 - 9:53
    are more interested in who are the buyers?
    Who benefits?
  • 9:53 - 10:00
    Dennis: Yeah. Let's come to step two. Who
    benefits from the campaigns? So I think
  • 10:00 - 10:06
    you all remember this page. This is the
    screen if you log into PaidLikes, and
  • 10:06 - 10:14
    you'll see the campaigns which you have to
    click in order to get a little bit of
  • 10:14 - 10:25
    money. And by luck we noticed that if
    you hover over a campaign, we see in the
  • 10:25 - 10:32
    bottom left of the browser a URL
    redirecting to the campaign you have to
  • 10:32 - 10:41
    click, and you see that every campaign is
    using a unique ID. It is just a simple
  • 10:41 - 10:50
    integer, and the good thing is, it is just
    incremented. So now maybe some of you guys
  • 10:50 - 10:57
    notice what we can do with that. And yeah,
    it is really easy with these constructed
  • 10:57 - 11:03
    URLs to implement a crawler for data
    gathering, and our crawler simply
  • 11:03 - 11:12
    requested all campaign IDs between 0 and
    90000. Maybe some of you ask why 90000? As
  • 11:12 - 11:17
    I already said, we were also registered as
    click workers and we see, we saw that the
  • 11:17 - 11:25
    highest ID campaign used is about 88000.
    So we thought OK, 90000 is a good value
  • 11:25 - 11:31
    and we check for every request between
    these 90000 requests if it got resolved or
  • 11:31 - 11:36
    not, and if it got resolved, we redirected
    the URL we present this source. That
  • 11:36 - 11:42
    should be liked or followed. And we did
    not save the page sources from the
  • 11:42 - 11:51
    resolved URLs, we only save the resolved
    URLs in the list of campaigns, and this
  • 11:51 - 11:59
    list was then the basis for further
    analysis. And here you see our list.
  • 11:59 - 12:06
    Svea: Yes. This was the point when Dennis
    and Philip, when they came to us and said,
  • 12:06 - 12:12
    hey, we have a list. So what can you find?
    And of course we searched for AfD; it was one of
  • 12:12 - 12:21
    the first search queries. And yeah, of
    course, AfD is also in that list. Maybe
  • 12:21 - 12:31
    not so surprisingly for some. And when you
    look, it is AfD Gelsenkirchen and their fan
  • 12:31 - 12:40
    page. And we asked AfD Gelsenkirchen, did
    you buy likes? And they said, we don't
  • 12:40 - 12:48
    know how we got on that list. However,
    we do not rule out an anonymous donation.
  • 12:48 - 12:55
    But now you would think, OK, they found
    AfD; this is to be expected. But no, all
  • 12:55 - 13:01
    political parties – mostly local and
    regional entities - showed up on that
  • 13:01 - 13:09
    list. So we have CDU/CSU. We had FDP,
    SPD, AfD, Die Grünen and Die Linke. But
  • 13:09 - 13:15
    don't think that Angela Merkel or some
    very big Facebook fan pages just showed
  • 13:15 - 13:24
    up. No, no. Very small entities with a
    couple of hundred or maybe 10000 or 15000
  • 13:24 - 13:28
    followers. And I think this makes
    perfect sense, because somebody who
  • 13:28 - 13:35
    already has very, very many fans
    probably would not buy them there at
  • 13:35 - 13:46
    PaidLikes. And we asked many of them, and
    mostly they could not explain it. They
  • 13:46 - 13:52
    would never do something like that. Yeah,
    they were completely baffled. But you
  • 13:52 - 13:57
    have to keep in mind that we only saw the
    campaigns, the Facebook
  • 13:57 - 14:03
    fan pages, we could not see who bought the
    likes. And as you can imagine, everybody
  • 14:03 - 14:09
    could have done it: the mother, the
    brother, the fan, you know, the dog. So
  • 14:09 - 14:15
    this was a case where we would have needed a lot
    of luck to call anybody out of the blue
  • 14:15 - 14:20
    and have them say, oh, yes, I did
    this. And there was one, or there were
  • 14:20 - 14:26
    some politicians who admitted it. And one
    of them, she did it also publicly and gave
  • 14:26 - 14:35
    us an interview. It's Tanja Kühne. She is
    a regional politician from Walsrode,
  • 14:35 - 14:40
    Niedersachsen. It
    was the case that it was after an election
  • 14:40 - 14:44
    and she was not very happy with her fan
    page. That is what she told us. She was
  • 14:44 - 14:49
    very unhappy and she wanted, you know, to
    push herself and to boost it a little bit,
  • 14:49 - 14:56
    and get more friends and followers and
    reach. And then she bought 500 followers.
  • 14:56 - 15:03
    And then we had a nice interview with her
    about that. I'll show you a small piece.
  • 15:54 - 16:00
    Svea: Okay, so you see – the answers are pretty
    interesting. And she... I think she was
  • 16:00 - 16:05
    courageous enough to speak out to us. Many
    others did too, but only on the phone.
  • 16:05 - 16:09
    And they didn't want to go on the record.
    But she's not the only one who answered
  • 16:09 - 16:14
    like this. Because, of course, if you call
    through a list of potential fake like
  • 16:14 - 16:21
    buyers, of course they answer like, no,
    it's not a scam. And I also think from a
  • 16:21 - 16:26
    legal standpoint, it's also very
    hard to show that this is fraud and a
  • 16:26 - 16:33
    scam. And it's more an ethical problem
    that you can see here, that
  • 16:33 - 16:40
    it's manipulative if you buy likes. We
    also found a guy from the FDP in the
  • 16:40 - 16:45
    Bundestag. But yeah, he ran away and
    didn't want to get interviewed, so I
  • 16:45 - 16:53
    couldn't show you. So he bought them, or
    probably... He was like 40 times in our
  • 16:53 - 16:59
    list for various Facebook posts and videos
    and also for his Instagram account. But we
  • 16:59 - 17:07
    could not get him
    on record. So what did others say? We, of
  • 17:07 - 17:11
    course, confronted Facebook, Instagram and
    YouTube with this small research. And they
  • 17:11 - 17:18
    said, no, we don't want fake likes on our
    platform. PaidLikes has been active since 2012,
  • 17:18 - 17:25
    you know. So they waited seven years. But
    after our report, at least, Facebook
  • 17:25 - 17:33
    temporarily blocked PaidLikes. And of
    course, we asked them too, and spoke to
  • 17:33 - 17:36
    them and wrote with PaidLikes in
    Magdeburg. And they said, of course, it's
  • 17:36 - 17:42
    not a scam because the click workers
    are freely clicking on pages. So, yeah,
  • 17:42 - 17:48
    kind of nobody cares. But PaidLikes, this
    is only the tip of the iceberg.
  • 17:48 - 17:59
    Philip: So we also wanted to dive a little
    bit into this fake like universe outside
  • 17:59 - 18:06
    of PaidLikes and to see what else is out
    there. And so we did an analysis of
  • 18:06 - 18:13
    account creation on Facebook. So what
    Facebook is saying about account creation
  • 18:13 - 18:19
    is that they are very effective against
    fake accounts. So they say they remove
  • 18:19 - 18:26
    billions of accounts each year, and that
    most of these accounts never reach any
  • 18:26 - 18:33
    real users and they remove them before
    they get reported. So what Facebook
  • 18:33 - 18:39
    basically wants to tell you is that they
    have it under control. However, there are
  • 18:39 - 18:46
    a number of reports that suggest
    otherwise. For example, recently at NATO-
  • 18:46 - 18:54
    Stratcom Taskforce released a report where
    they actually bought 54000 likes, 54000
  • 18:54 - 19:02
    social media interactions for just 300
    Euros. So this is a very low price. And I
  • 19:02 - 19:07
    think you wouldn't expect such a low price
    if it were hard to get that many
  • 19:07 - 19:16
    interactions. They bought 3500 comments,
    25000 likes, 20000 views and 5100
  • 19:16 - 19:23
    followers. Everything for just 300 Euros.
    So, you know, the thing they have in
  • 19:23 - 19:32
    common, they are cheap, the fake likes and
    the fake interactions. And
  • 19:32 - 19:38
    there was also another report from Vice
    Germany recently. And they reported on
  • 19:38 - 19:46
    some interesting facts about automated
    fake accounts. They reported on findings
  • 19:46 - 19:51
    that suggest that people actually use
    hacked Internet of Things
  • 19:51 - 19:59
    devices to create these
    fake accounts and to manage them. And so
  • 19:59 - 20:05
    it's actually kind of interesting to think
    about it this way. To say, OK, maybe next
  • 20:05 - 20:11
    election your fridge is actually going to
    support the other candidate on Facebook.
  • 20:11 - 20:17
    And so we also wanted to look into this
    and we wanted to go a step further and to
  • 20:17 - 20:25
    look at who these people are. Who are
    they, and what are they doing on
  • 20:25 - 20:32
    Facebook? And so we actually examined the
    profiles of purchased likes. For this we
  • 20:32 - 20:38
    created four comments under arbitrary
    posts, and then we bought likes for these
  • 20:38 - 20:46
    comments, and then we examined the
    resulting profiles of the fake likes. So
  • 20:46 - 20:51
    it was pretty cheap to buy these likes.
    Comment likes are always a little bit more
  • 20:51 - 21:00
    expensive than other likes. And we found
    all these offerings on Google and we paid
  • 21:00 - 21:08
    with PayPal. So we actually used a pretty
    neat trick to estimate the age of these
  • 21:08 - 21:16
    fake accounts. So as you can see here, the
    Facebook user ID is incremented. So
  • 21:16 - 21:24
    Facebook started in 2009 to use
    incremented Facebook ID, and they use this
  • 21:24 - 21:32
    pattern of 1 0 0 0 and then the
    incremented number. And as you can see, in
  • 21:32 - 21:40
    2009 this incremented number was very
    close to zero. And then today it is close
  • 21:40 - 21:50
    to 40 billion. And in this time period,
    you can see that you can kind of get a
  • 21:50 - 21:57
    rather fitting line through all these
    points. And you can see that the account IDs
  • 21:57 - 22:03
    are in fact incremented over time. So we
  • 22:03 - 22:09
    can use this fact in reverse to estimate
    the creation date of an account where we
  • 22:09 - 22:15
    know the Facebook ID. And that's exactly
    what we did with these fake likes. So we
  • 22:15 - 22:22
    estimated the account creation dates. And
    as you can see, we get kind of different
  • 22:22 - 22:29
    results from different services. For
    example, PaidLikes, they had rather old
  • 22:29 - 22:36
    accounts. So this means they use very
    authentic accounts. And we already know
  • 22:36 - 22:41
    that because we talked to them. So these
    are very authentic accounts. Also like
  • 22:41 - 22:47
    Service A over here also uses very, very
    authentic accounts. But on the other hand,
  • 22:47 - 22:52
    like service B uses very new accounts,
    they were all created in the last three
  • 22:52 - 22:58
    years. So if you look at the accounts and
    also from these numbers, we think that
  • 22:58 - 23:07
    these accounts were bots and on service C
    it's kind of not clear, are
  • 23:07 - 23:11
    these accounts bots or are these
    clickworkers? Maybe it's a mixture of
  • 23:11 - 23:18
    both, we don't know exactly for sure. But
    this is an interesting metric to measure
  • 23:18 - 23:23
    the age of the accounts to determine if
    some of them might be bots. And that's
  • 23:23 - 23:29
    exactly what we did on this page. So this
    is actually a page for garden furniture
  • 23:29 - 23:37
    and we found it in our list that we got
    from PaidLikes. So, obviously
  • 23:37 - 23:44
    they were on this list for bought likes on
    Facebook, on PaidLikes. And they caught
  • 23:44 - 23:51
    our eye because they had one million
    likes. And that's rather unusual for a
  • 23:51 - 24:01
    shop for garden furniture in Germany. And
    so we looked at this page further and we
  • 24:01 - 24:07
    noticed other interesting things. For
    example, there are posts all the time
  • 24:07 - 24:14
    that got like thousands of likes. And
    that's also kind of unusual for a garden
  • 24:14 - 24:20
    furniture shop. And so we looked into the
    likes and as you can see, they all look
  • 24:20 - 24:27
    like they come from Southeast Asia and
    they don't look very authentic. And we
  • 24:27 - 24:32
    were actually able to estimate the
    creation dates of these accounts. And we
  • 24:32 - 24:37
    found that most of these accounts that
    were used for liking these posts on this
  • 24:37 - 24:44
    page were actually created in the last
    three years. So this is a page where
  • 24:44 - 24:50
    everything, from the number of people who
    like the page to the number of people who
  • 24:50 - 24:56
    like the posts, is complete fraud. So
    nothing about this is real. And it's
  • 24:56 - 25:02
    obvious that this can happen on Facebook
    and that this is a really, really big
  • 25:02 - 25:08
    problem. I mean, this is a, this is a shop
    for garden furniture. Obviously, they
  • 25:08 - 25:15
    probably don't have such huge sums of
    money. So it was probably very cheap to
  • 25:15 - 25:22
    buy this amount of fake accounts. And it
    is really shocking to see how big
  • 25:22 - 25:31
    the scale of these kinds of
    operations is. And so what we have to say is,
  • 25:31 - 25:40
    OK, when Facebook says they have it under
    control, we have to doubt that. So now we
  • 25:40 - 25:46
    can look at the bigger picture. And what
    we are going to do here is we are going to
  • 25:46 - 25:53
    use this same graph that we used before to
    estimate the creation dates, but in a
  • 25:53 - 25:59
    different way. So we can actually see
    the lowest and the highest points of
  • 25:59 - 26:05
    Facebook IDs in this graph. So we know the
    newest Facebook ID by creating a new
  • 26:05 - 26:13
    account. And we know the lowest ID because
    it's zero. And then we know that there are
  • 26:13 - 26:21
    40 billion Facebook IDs. Now, in the next
    step, we took a sample, a random sample
  • 26:21 - 26:28
    from these 40 billion Facebook IDs. And
    inside of the sample, we checked if these
  • 26:28 - 26:34
    accounts exist, if this ID corresponds to
    an existing account. And we do that because
  • 26:34 - 26:39
    we obviously cannot check 40 billion
    accounts and 40 billion IDs, but we can
  • 26:39 - 26:46
    check a small sample of these accounts of
    these IDs and estimate, then, the number
  • 26:46 - 26:54
    of existing accounts on Facebook in
    total. So for this, we repeatedly accessed
  • 26:54 - 27:03
    the same sample of one million random IDs
    over the course of one year. And we also
  • 27:03 - 27:10
    pulled a sample of 10 million random IDs
    for closer analysis this July. And now
  • 27:10 - 27:16
    Dennis is going to tell you how we did it.
    Dennis: Yeah. Well, pretty interesting,
  • 27:16 - 27:21
    pretty interesting results so far, right?
    So we implemented a crawler again, this
  • 27:21 - 27:27
    second time for gathering public Facebook
    information, the public Facebook account
  • 27:27 - 27:36
    data. And, yeah, this was not as easy as
    in the first case. It's not
  • 27:36 - 27:45
    surprising that Facebook is using a lot of
    measures to try to block the automated
  • 27:45 - 27:52
    crawling of the Facebook page, for example
    with IP blocking or CAPTCHAs. But,
  • 27:52 - 28:00
    uh, we could
    pretty easily solve this problem by using
  • 28:00 - 28:07
    the Tor Anonymity Network. So every time
    our IP got blocked while crawling the data,
  • 28:07 - 28:14
    we just made a new Tor connection and
    changed the IP. And the same with the
  • 28:14 - 28:21
    CAPTCHAs. And with this easy method, we
    were able to crawl
  • 28:21 - 28:26
    all the public Facebook data. And
    let's have a look at two examples. The
  • 28:26 - 28:37
    first example is facebook.com/4. So the,
    very, very small Facebook ID. Yeah, in
  • 28:37 - 28:42
    this case, we are redirected and
    check the response and find a valid
  • 28:42 - 28:50
    account page. And does anyone know which
    account this is? Mark Zuckerberg? Yeah,
  • 28:50 - 28:55
    that's correct. This is the public
    account of Mark Zuckerberg. Number four;
  • 28:55 - 29:02
    as we already saw, the other
    IDs are really high. But he got the number
  • 29:02 - 29:11
    four. Second example was facebook.com/3.
    In this case, we are not forwarded. And
  • 29:11 - 29:18
    this means that it is an invalid account.
    And that was really easy to confirm with a
  • 29:18 - 29:24
    quick Google search. And it was a test
    account from the beginning of Facebook. So
  • 29:24 - 29:31
    we did not get redirected. And it's just
    the login page from Facebook. And with
  • 29:31 - 29:38
    these examples, we did a
    lot more experiments. And at the end, we
  • 29:38 - 29:47
    were able to build this tree. And, yeah,
    this tree represents the high-level
  • 29:47 - 29:53
    approach of our scraper. So in the,
    What's that?
  • 29:53 - 29:56
    Svea: Okay. Sleeping.
    Laughing
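
    As a side note on the blocking workaround Dennis mentioned a moment ago: a minimal
    sketch of rotating Tor circuits whenever requests start failing could look like this.
    It assumes a local Tor client with the standard SOCKS port 9050 and control port 9051
    plus the requests and stem packages, and the blocked-response heuristic is an
    assumption, not what the real crawler did:

      import requests
      from stem import Signal
      from stem.control import Controller

      # Route all HTTP(S) traffic through the local Tor SOCKS proxy.
      TOR_PROXIES = {
          "http": "socks5h://127.0.0.1:9050",
          "https": "socks5h://127.0.0.1:9050",
      }

      def new_tor_circuit(control_port=9051):
          """Ask the local Tor daemon for a fresh circuit, i.e. a new exit IP."""
          with Controller.from_port(port=control_port) as controller:
              controller.authenticate()          # assumes cookie or password-less auth
              controller.signal(Signal.NEWNYM)   # request a new identity

      def fetch_with_rotation(url, max_retries=5):
          """Fetch a URL via Tor, switching circuits whenever we look blocked."""
          for _ in range(max_retries):
              resp = requests.get(url, proxies=TOR_PROXIES, timeout=30)
              # Heuristic: treat rate-limit / access-denied responses as "blocked".
              if resp.status_code not in (403, 429):
                  return resp
              new_tor_circuit()
          raise RuntimeError(f"still blocked after {max_retries} circuits: {url}")
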
  • 29:56 - 30:07
    Dennis: Yeah. We still have time. Right.
    So what? Okay, so everyone is waking up
  • 30:07 - 30:17
    again. Oh, yeah. In the first step we call
    the URL www.facebook.com/FID. If we
  • 30:17 - 30:25
    get redirected in this case, then we check
    if the page is an account page. If
  • 30:25 - 30:31
    it's an account page, then it's a public
    account like the example 4 and we were
  • 30:31 - 30:40
    able to save the raw data, the raw page
    source. If it's not an account page,
  • 30:40 - 30:45
    then it's
    not a public account and we are not able
  • 30:45 - 30:53
    to save any data. And if we
    do not get redirected in the
  • 30:53 - 31:02
    first step, then we call the second
    URL, facebook.com/profile.php?id=FID,
  • 31:02 - 31:09
    with a mobile user agent. And if we get
    redirected then, again, it is a
  • 31:09 - 31:15
    nonpublic profile and we cannot save
    anything. And if we do not get
  • 31:15 - 31:23
    redirected, it is an invalid profile and
    it is most often a deleted account. Yeah.
  • 31:23 - 31:29
    And yeah, that's the high level overview
    of our scraper. And Phillip will now give
  • 31:29 - 31:32
    some more information on interesting
    results.
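
    A minimal sketch of this decision tree in Python, assuming the redirect behaviour
    Dennis describes (a redirect on www.facebook.com/FID that lands on an account-looking
    page means public; otherwise a second check against profile.php with a mobile user
    agent separates nonpublic from invalid or deleted). The account-page check itself is
    only a crude placeholder here, since the talk does not say how it was implemented:

      import requests

      DESKTOP_UA = {"User-Agent": "Mozilla/5.0"}           # generic desktop user agent
      MOBILE_UA = {"User-Agent": "Mozilla/5.0 (Android)"}  # generic mobile user agent

      def looks_like_account_page(html):
          # Placeholder heuristic; the real crawler inspected the page structure.
          return "profile" in html.lower()

      def classify_fid(fid, session=None):
          """Classify a Facebook ID as 'public', 'nonpublic' or 'invalid',
          following the two-step redirect check described above."""
          s = session or requests.Session()

          # Step 1: desktop URL. A redirect that ends on an account page
          # means the profile is public and its page source can be saved.
          r1 = s.get(f"https://www.facebook.com/{fid}",
                     headers=DESKTOP_UA, allow_redirects=True, timeout=30)
          if r1.history:  # at least one redirect happened
              if looks_like_account_page(r1.text):
                  return "public", r1.text
              return "nonpublic", None

          # Step 2: profile.php with a mobile user agent. A redirect means the
          # profile exists but is not public; no redirect means it is invalid,
          # most often a deleted account.
          r2 = s.get(f"https://www.facebook.com/profile.php?id={fid}",
                     headers=MOBILE_UA, allow_redirects=True, timeout=30)
          if r2.history:
              return "nonpublic", None
          return "invalid", None
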
  • 31:32 - 31:39
    Phillip: So the most interesting result of
    this scraping of the sample of Facebook
  • 31:39 - 31:47
    IDs was that one in four Facebook IDs
    corresponds to a valid account. And you
  • 31:47 - 31:54
    can do the math. There are 40 billion
    Facebook IDs, so there must be 10 billion
  • 31:54 - 32:00
    registered users on Facebook. And this
    means that there are more registered users
  • 32:00 - 32:08
    on Facebook than there are humans on
    Earth. And also, it means that it's even
  • 32:08 - 32:12
    worse than that because not everybody on
    Earth can have a Facebook account;
  • 32:12 - 32:17
    you need a smartphone for
    that, and many people don't have one. So
  • 32:17 - 32:22
    this is actually a pretty high number and
    it's very unexpected. So in July 2019,
  • 32:22 - 32:29
    there were more than ten billion Facebook
    accounts. Also, we did more research on
  • 32:29 - 32:36
    the timeframe between October 2018 and
    today, or this month. And we found that in
  • 32:36 - 32:43
    this timeframe there were 2 billion newly
    registered Facebook accounts. So this is
  • 32:43 - 32:49
    like the timeframe of one year, more or
    less. And in a similar timeframe, the
  • 32:49 - 32:59
    monthly active user base rose by only 187
    million. Facebook deleted 150 million
  • 32:59 - 33:05
    older accounts between October 2018 and
    July 2019. And we know that because we
  • 33:05 - 33:11
    pulled the same sample over a longer
    period of time. And then we watched for
  • 33:11 - 33:16
    accounts that got deleted in the sample.
    And that enables us to estimate this
  • 33:16 - 33:23
    number of 150 million accounts that got
    deleted that are basically older than our
  • 33:23 - 33:32
    sample. So I made some nice graphs for
    your viewing pleasure. So, again, of the
  • 33:32 - 33:41
    older accounts, just 150 million were
    deleted since October 2018. These are
  • 33:41 - 33:46
    accounts that are older than last year.
    And Facebook claims that since then, about
  • 33:46 - 33:53
    7 billion accounts got deleted from their
    platform, which is vastly more than these
  • 33:53 - 33:58
    older accounts. And that's why we
    think that Facebook mostly deleted these
  • 33:58 - 34:07
    newer accounts. And if an account is older
    than a certain age, then it is very
  • 34:07 - 34:13
    unlikely that it gets deleted. And also, I
    think you can see the scales here. So, of
  • 34:13 - 34:18
    course, the registered users are not the
    same thing as active users, but you can
  • 34:18 - 34:23
    still see that there are many more
    registrations of new users than there
  • 34:23 - 34:30
    are new active
    users during the last year. So what does
  • 34:30 - 34:38
    this all mean? Does it mean that Facebook
    gets flooded by fake accounts? We don't
  • 34:38 - 34:43
    really know. We only know these numbers.
    What Facebook is telling us is that they
  • 34:43 - 34:50
    only count and publish active users, as I
    already said, that there is a disconnect
  • 34:50 - 34:57
    between registered users and
    active users and Facebook only reports on
  • 34:57 - 35:04
    the active users. Also, they say that
    users register accounts, but they don't
  • 35:04 - 35:11
    verify them or they don't use them, and
    that's how this number gets so high. But I
  • 35:11 - 35:19
    think that that's not really explaining
    these high numbers and because that's just
  • 35:19 - 35:26
    by orders of magnitude larger than
    anything that this could cause. Also, they
  • 35:26 - 35:32
    say that they regularly delete fake
    accounts. But we have seen that these are
  • 35:32 - 35:38
    mostly accounts that get deleted directly
    after their creation. And if they survive
  • 35:38 - 35:46
    long enough, then they are getting
    through. So what does this all mean?
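
    As a back-of-the-envelope sketch of the extrapolation behind these figures (draw
    random IDs from the roughly 40-billion-wide ID space, probe which ones resolve to
    valid accounts, and scale the hit rate back up), with illustrative placeholder
    counts rather than the team's real sample data:

      import random

      ID_SPACE = 40_000_000_000     # roughly 40 billion possible Facebook IDs (from the talk)
      SAMPLE_SIZE = 1_000_000       # number of IDs drawn uniformly at random and probed
      VALID_IN_SAMPLE = 250_000     # placeholder: "about one in four" resolved to an account

      def draw_sample(n=SAMPLE_SIZE, space=ID_SPACE, seed=42):
          """Draw n distinct random IDs from the ID space."""
          rng = random.Random(seed)
          return rng.sample(range(space), n)

      def estimate_total(valid=VALID_IN_SAMPLE, n=SAMPLE_SIZE, space=ID_SPACE):
          """Scale the sample's hit rate up to the whole ID space, with a
          simple normal-approximation 95% confidence interval."""
          p = valid / n
          se = (p * (1 - p) / n) ** 0.5
          return space * p, space * (p - 1.96 * se), space * (p + 1.96 * se)

      if __name__ == "__main__":
          total, low, high = estimate_total()
          print(f"estimated registered accounts: {total:,.0f} ({low:,.0f} - {high:,.0f})")

    With a hit rate of one in four, the point estimate lands at the 10 billion
    registered accounts mentioned in the talk.
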
  • 35:46 - 35:55
    Svea: Okay, so you got the full load,
    which I had like over two or three months.
  • 35:55 - 36:03
    And one very big
    conclusion for me was that we have some kind of
  • 36:03 - 36:09
    broken metric here, that all the likes and
    all the hearts on Instagram and the
  • 36:09 - 36:14
    followers can so easily be
    manipulated. And in some cases it's so hard to
  • 36:14 - 36:19
    tell
    if they are real or not real. And this
  • 36:19 - 36:26
    opens the gate for manipulation and, yes,
    untruth. And for economic losses, if
  • 36:26 - 36:33
    you think of somebody who is investing
    money, or of an advertiser, for
  • 36:33 - 36:40
    example. And in the very end, it is a case
    of eroding trust, which means that we
  • 36:40 - 36:46
    cannot trust these numbers anymore. These
    numbers are, you know, they are so easily
  • 36:46 - 36:54
    manipulated. And why should we trust this?
    And this has a severe consequence for all
  • 36:54 - 36:59
    the social networks. If you are still in
    them. So what can be a solution? And
  • 36:59 - 37:05
    Philip, you thought about that.
    Phillip: So basically we have two
  • 37:05 - 37:11
    problems. One is click workers and one is
    fakes. Click workers are basically just
  • 37:11 - 37:18
    hyperactive users and they are selling
    their hyperactivity. And so what social
  • 37:18 - 37:24
    networks could do is just make
    interactions scarce, so just lower the
  • 37:24 - 37:29
    value of more interactions. If you are a
    hyperactive user, then your interactions
  • 37:29 - 37:34
    should count less than the interactions of
    a less active user.
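
    A minimal sketch of what "making interactions scarce" could mean in practice,
    down-weighting likes from hyperactive accounts. The logarithmic curve and the
    cutoff below are purely illustrative assumptions, not a concrete proposal from
    the talk or anything Facebook actually does:

      import math

      def interaction_weight(daily_interactions, normal_rate=20):
          """Weight one like by how hyperactive the liking account is.
          Accounts at or below a 'normal' activity level count fully; beyond
          that, each interaction is worth less. Cutoff and curve are assumptions."""
          if daily_interactions <= normal_rate:
              return 1.0
          return 1.0 / (1.0 + math.log(daily_interactions / normal_rate))

      def weighted_like_count(likers):
          """Sum of weights over the accounts that liked a post;
          `likers` is a list of per-account daily interaction counts."""
          return sum(interaction_weight(d) for d in likers)

      # Example: 100 ordinary users vs. 100 click workers doing ~500 likes a day.
      print(weighted_like_count([10] * 100))    # 100.0
      print(weighted_like_count([500] * 100))   # about 23.7
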
  • 37:34 - 37:39
    Mumbling
    That's kind of solvable, I think. The real
  • 37:39 - 37:47
    problem is the authenticity. So if
    you get stopped from posting or liking
  • 37:47 - 37:53
    hundreds of pages a day, then maybe you
    just create multiple accounts and operate
  • 37:53 - 37:59
    them simultaneously. And this can only be
    solved by authenticity. So this can only
  • 37:59 - 38:05
    be solved if you know that the person who
    is operating the account is just one
  • 38:05 - 38:11
    person, is operating one account. And this
    is really hard to do, because Facebook
  • 38:11 - 38:15
    doesn't know who is clicking. Is it a bot?
    Is it a clickworker, or is it one
  • 38:15 - 38:20
    clickworker for ten accounts? How does
    this work? And so this is really hard for
  • 38:20 - 38:28
    the, for the social media companies to do.
    And you could say, OK, let's send in the
  • 38:28 - 38:32
    passport or something like that to prove
    authenticity. But that's actually not a
  • 38:32 - 38:37
    good idea because nobody wants to send
    their passport to Facebook. And so this is
  • 38:37 - 38:42
    really a hard problem that has to be
    solved, if we want to use social
  • 38:42 - 38:50
    media in a meaningful way. And so this is
    what companies could do. And now...
  • 38:50 - 38:53
    Svea: But what you
    could do. Okay. Of course, you can delete
  • 38:53 - 38:56
    your Facebook account or your Instagram
    account and stop.
  • 38:56 - 39:01
    Slight applause, laughing
    Svea: Yeah! Stay away from social media.
  • 39:01 - 39:09
    But maybe this is not a solution for all of
    us. So I think: be aware, of course.
  • 39:09 - 39:17
    Spread the word, tell others. And if
    you like, and you get more
  • 39:17 - 39:24
    intelligence about that, we are really
    happy to dig deeper into these networks.
  • 39:24 - 39:30
    And we will go on investigating. And so,
    last but not least, we want to say thank you
  • 39:30 - 39:33
    to you guys. Thank you very much for
    listening.
  • 39:33 - 39:40
    Applause
    Svea: And we did not do this alone. We are
  • 39:40 - 39:45
    not just three people. There are many more
    standing behind us and doing this
  • 39:45 - 39:51
    beautiful research. And we are now open
    for questions, please.
  • 39:51 - 39:55
    Herald: Yes. Please, thank Svea, Phil and
    Dennis again.
  • 39:55 - 40:06
    Applause
    And we have microphones out
  • 40:06 - 40:10
    here in the room, about nine of them,
    actually. If you line up behind them to
  • 40:10 - 40:16
    ask a question, remember that a question
    is a sentence with a question mark behind
  • 40:16 - 40:20
    it. And I think I see somebody at number
    three. So let's start with that.
  • 40:20 - 40:26
    Question: Hi. I, I just have a little
    question. Wouldn't a dislike button, the
  • 40:26 - 40:31
    concept of a dislike button, wouldn't that
    be a solution to all the problems?
  • 40:31 - 40:38
    Phillip: So we thought about recommending
    that Facebook ditches the like button
  • 40:38 - 40:42
    altogether. I think that would be a better
    solution than a dislike button, because a
  • 40:42 - 40:47
    dislike button could also be manipulated
    and it would be even worse because you
  • 40:47 - 40:54
    could actually manipulate the network into
    down ranking posts or kind of not showing
  • 40:54 - 41:01
    posts to somebody. And that, I think would
    be even worse. I imagine what dictators
  • 41:01 - 41:08
    would do with that. And so I think the
    best option would be to actually not show
  • 41:08 - 41:18
    off like, like counts anymore and to this,
    to actually make people not invest into
  • 41:18 - 41:25
    these counts if they become meaningless.
    Herald: I think I see a microphone 7, up
  • 41:25 - 41:28
    there.
    Question: Hello. So one question I had is
  • 41:28 - 41:37
    you are signed creation dates to IDs. How
    did you do this?
  • 41:37 - 41:52
    Phillip: So, we actually knew the creation
    date of some accounts. And then we kind of
  • 41:52 - 41:58
    interpolated between the creation dates
    and the IDs. So you see this black line
  • 41:58 - 42:04
    there. That's actually our, our
    interpolation. And with this black line,
  • 42:04 - 42:11
    we can then estimate the creation dates
    for IDs that we do not yet know because
  • 42:11 - 42:17
    it kind of fills in the gaps.
    Q: Follow up question, do you know why
  • 42:17 - 42:20
    there are some points outside of this
    graph?
  • 42:20 - 42:24
    Phillip: No.
    Q: No? Thank you.
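
    A minimal sketch of the interpolation Philip describes, estimating an account's
    creation date from its ID given a few accounts whose (ID, creation date) pairs are
    known. The anchor points below are made-up placeholders, not the real data behind
    the black line in the slide:

      from datetime import date, timedelta

      # Placeholder anchor points: accounts with a known ID and creation date.
      KNOWN = [
          (100000000000000, date(2009, 1, 1)),
          (100005000000000, date(2013, 6, 1)),
          (100020000000000, date(2017, 3, 1)),
          (100040000000000, date(2019, 7, 1)),
      ]

      def estimate_creation_date(fid):
          """Linearly interpolate the creation date for a Facebook ID between
          the two nearest known anchor points (clamping at the edges)."""
          pts = sorted(KNOWN)
          if fid <= pts[0][0]:
              return pts[0][1]
          if fid >= pts[-1][0]:
              return pts[-1][1]
          for (id_lo, d_lo), (id_hi, d_hi) in zip(pts, pts[1:]):
              if id_lo <= fid <= id_hi:
                  frac = (fid - id_lo) / (id_hi - id_lo)
                  return d_lo + timedelta(days=frac * (d_hi - d_lo).days)

      print(estimate_creation_date(100012500000000))  # lands between the 2013 and 2017 anchors
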
  • 42:24 - 42:26
    Herald: So there was a question from the
    Internet.
  • 42:26 - 42:34
    Question: Did you report your findings to
    Facebook? And did they do anything?
  • 42:34 - 42:42
    Svea: Because this research is very new,
    we, we just recently approached them and
  • 42:42 - 42:47
    showed them the research and we got an
    answer. But I think we also already showed
  • 42:47 - 42:54
    the answer. It was that they, I think that
    they only count and publish active users.
  • 42:54 - 43:00
    They did not want to tell us
    how many registered users they have, and
  • 43:00 - 43:04
    they say, oh, sometimes users register
    accounts, but don't use them or verify
  • 43:04 - 43:09
    them. And that they regularly delete fake
    accounts. But we hope that we get into a
  • 43:09 - 43:12
    closer discussion with them soon about
    this.
  • 43:12 - 43:19
    Herald: Microphone two.
    Question: When hunting down the bias of
  • 43:19 - 43:27
    the campaigns, did you dig out your own
    campaign, "Line below the line"?
  • 43:27 - 43:34
    Svea: No, because they stopped scraping in August.
    You stopped scraping in August. And
  • 43:34 - 43:39
    then I started, you know, the whole
    project started with them coming to us
  • 43:39 - 43:45
    with the list. And then we thought, oh,
    this is very interesting. And then the
  • 43:45 - 43:51
    whole journalistic research started. And,
    but I think if we, I think if we would do
  • 43:51 - 43:56
    it again, of course, I think we would find
    us. We also found there was another
  • 43:56 - 44:02
    magazine, and they also did a paid
    test a couple of years ago. And we found
  • 44:02 - 44:05
    their campaign.
    Phillip: So, we actually did another
  • 44:05 - 44:11
    test. And for the other test, I noted we
    also got this ID, I think. And it
  • 44:11 - 44:20
    worked to plug it into the URL and then we
    also got redirected to our own page. So
  • 44:20 - 44:23
    that worked.
    Q: Thank you.
  • 44:23 - 44:26
    Herald: Microphone three.
    Question: Hi. I'm Farhan, I'm a Pakistani
  • 44:26 - 44:31
    journalist. And first of all, I would like
    to say that you were right when you said
  • 44:31 - 44:35
    that there might be people sitting in
    Pakistan clicking on the likes. That does
  • 44:35 - 44:41
    happen. But my question would be that
    Facebook does have its own ad program that
  • 44:41 - 44:47
    it aggressively pushes. And in that ad
    program, there are also options whereby
  • 44:47 - 44:54
    people can buy likes and comments and
    impressions and reactions. Did you, would
  • 44:54 - 45:00
    you also consider those as a fake? I mean,
    that they're not fake, per se, but they're
  • 45:00 - 45:06
    still bought likes. So what's your view on
    those? Thank you.
  • 45:06 - 45:14
    Phillip: So, when you buy ads on Facebook,
    then what you actually want
  • 45:14 - 45:19
    to have is fans for your page that are
    actually interested in your page. So
  • 45:19 - 45:25
    that's kind of the difference, I think, to
    the paid likes system where the
  • 45:25 - 45:30
    people themselves, they get paid for
    liking stuff that they wouldn't normally
  • 45:30 - 45:36
    like. So I think that's the fundamental
    difference between the two programs. And
  • 45:36 - 45:41
    that's why I think that one is unethical.
    And one is not really that unethical.
  • 45:41 - 45:48
    Svea: The very problem is if you, if you
    buy these click workers, then you have
  • 45:48 - 45:53
    many people in your fan page. They are not
    interested in you. They don't care about
  • 45:53 - 45:57
    you. They don't look at your products.
    They don't look at your political party.
  • 45:57 - 46:04
    And then often the people, they
    additionally, they make Facebook ads, and
  • 46:04 - 46:08
    these ads, they are shown, again, to the
    click workers and they don't look at them.
  • 46:08 - 46:13
    So, you know, people, they are burning
    money and money and money with this whole
  • 46:13 - 46:18
    corrupt system.
    Herald: So, microphone two.
  • 46:18 - 46:22
    Question: Hi. Thanks. Thanks for the talk
    and thanks for the effort of going through
  • 46:22 - 46:28
    all of this project. From my
    understanding, this whole finding
  • 46:28 - 46:35
    basically undermines the trust in
    Facebook's likes in general, per se. So I
  • 46:35 - 46:42
    would expect now the price of likes to
    drop and the pay for click workers to drop
  • 46:42 - 46:49
    as well. Do you have any metrics on that?
    Svea: The research just went public. I
  • 46:49 - 46:56
    think one week ago. So, so what we have
    seen as an effect is that Facebook, they
  • 46:56 - 47:03
    excluded PaidLikes for a moment. So,
    yes, of course, one platform is down. But
  • 47:03 - 47:08
    I think there are so many outside. There
    are so many. So I think...
  • 47:08 - 47:14
    Q: I meant the phenomenon of paid likes,
    not the company itself. Like the value of
  • 47:14 - 47:19
    a like as a measure of credibility...
    Phillip: We didn't...
  • 47:19 - 47:23
    Q: ...is declining now. That's my, that's
    my...
  • 47:23 - 47:28
    Svea: Yes. That's why many people are
    buying Instagram hearts now. So, so, yes,
  • 47:28 - 47:33
    that's true. The like is not the fancy hot
    shit anymore. Yes. And we also saw in the
  • 47:33 - 47:41
    data that the likes for the fan pages,
    they rapidly went down and the likes for
  • 47:41 - 47:45
    the posts and the comments, they went up.
    So I think, yes, there is a shift. And
  • 47:45 - 47:52
    what we also saw in that data was that the
    Facebook likes, they went down from
  • 47:52 - 47:58
    2016. They went down rapidly. And what is
    growing and rising is YouTube and
  • 47:58 - 48:02
    Instagram. Now, everything is about,
    today, everything is about Instagram.
  • 48:02 - 48:05
    Q: Thanks.
    Herald: So let's go to number one.
  • 48:05 - 48:10
    Question: Hello and thank you very much
    for this fascinating talk, because I've
  • 48:10 - 48:15
    been following this whole topic for a
    while. And I was wondering if you were
  • 48:15 - 48:21
    looking also into the demographics, in
    terms of age groups and social class, not
  • 48:21 - 48:26
    of the people who were doing the actual
    liking, but actually, you know, buying
  • 48:26 - 48:31
    these likes. Because I think that what is
    changing is an entire social discourse on
  • 48:31 - 48:37
    social capital and, the Bourdieu kind of
    term, because it can now be quantified. As
  • 48:37 - 48:44
    a teacher, I hear of kids who buy likes to
    be more popular than their other
  • 48:44 - 48:48
    schoolmates. So I'm wondering if you're
    looking into that, because I think that's
  • 48:48 - 48:53
    fascinating, fascinating area to actually
    come up with numbers about it.
  • 48:53 - 48:59
    Svea: It definitely is. And we were all so
    fascinated by this data set of 90,000 data
  • 48:59 - 49:05
    points. And what we did, and this was
    very hard, was that we tried, first
  • 49:05 - 49:12
    of all, to look at who is buying likes, like
    automotive, you know,
  • 49:12 - 49:19
    what kind of industries? Who
    is in that? And this was
  • 49:19 - 49:25
    doable. But to get more into demographics,
    you would have had to crawl, to
  • 49:25 - 49:34
    click every page. And so we did not do
    this. What we did was, of course, that
  • 49:34 - 49:38
    we were a team of three to ten people
    manually looking into it. And what we,
  • 49:38 - 49:44
    of course, saw was that on Instagram and on
    YouTube, you have many of these very young
  • 49:44 - 49:47
    people. Some of them, I actually called
    them and they were like, Yes, I bought
  • 49:47 - 49:54
    likes. Very bad idea. So I think yes, I
    think there is a demographic shift away
  • 49:54 - 50:00
    from the companies and the automotive and
    industries buying Facebook fan page likes
  • 50:00 - 50:04
    to Instagram and YouTube wannabe-
    influencers.
  • 50:04 - 50:06
    Q: Influencers, influencer culture is
    obviously...
  • 50:06 - 50:13
    Svea: Yes. And I have to admit here, we
    showed you the political side, but we have
  • 50:13 - 50:20
    to admit that the political likes, they
    were like this small in the numbers. And
  • 50:20 - 50:26
    the very, very vast majority of this data
    set, it's about wedding planners,
  • 50:26 - 50:31
    photography, tattoo studios and
    influencers, influencers, influencers and
  • 50:31 - 50:34
    YouTubers, of course.
    Q: Yes. Thank you so much.
  • 50:34 - 50:37
    Herald: So we have a lot of questions in
    the room. I'm going to get to you as soon
  • 50:37 - 50:40
    as we can. I'd like to go to the Internet
    first.
  • 50:40 - 50:45
    Signal Angel: Do you think this will get
    bit better or worse if people move to more
  • 50:45 - 50:48
    decentralized platforms?
    Phillip: To more what?
  • 50:48 - 50:55
    Svea: If it get better or worse.
    Dennis: Can you repeat that, please?
  • 50:55 - 50:59
    Herald: Would this issue get better or
    worse if people move to a more
  • 50:59 - 51:01
    decentralized platform?
    Phillip: Decentralized. decentralized,
  • 51:01 - 51:12
    okay. So, I mean, we can look at
    this slide, I think, and think about
  • 51:12 - 51:18
    whether decentralized platforms would
    change any of these, any of these two
  • 51:18 - 51:26
    points here. And I fear, I don't think so,
    because they cannot solve the interactions
  • 51:26 - 51:30
    problem that people can be hyperactive.
    Actually, that's kind of a normal thing
  • 51:30 - 51:34
    with social media. A small portion of
    social media users is much more active
  • 51:34 - 51:40
    than everybody else. That's kind of. You
    have that without paying for it. So
  • 51:40 - 51:45
    without even having paid likes, you will
    have to consider if social media is really
  • 51:45 - 51:51
    kind of representative of the society.
    But the other thing is authenticity.
  • 51:51 - 51:57
    And also in a decentralized platform, you
    could have multiple accounts run by the
  • 51:57 - 52:01
    same person.
    Herald: So, microphone seven, all the way
  • 52:01 - 52:07
    back there.
    Question: Hi. Do you know if Facebook even
  • 52:07 - 52:10
    removes the likes when they delete fake
    accounts?
  • 52:10 - 52:17
    Svea: Do you know that?
    Phillip: No, we don't know that. No, we
  • 52:17 - 52:21
    don't. We don't know. We know they delete
    fake accounts, but we don't know if they
  • 52:21 - 52:28
    also delete the likes. I know from our
    research that the people we approached,
  • 52:28 - 52:31
    they did not delete the click workers.
    They get...
  • 52:31 - 52:36
    Herald: Microphone two.
    Question: Yeah. Hi. So I have a question
  • 52:36 - 52:41
    with respect to this, one out of four
    Facebook accounts are active in your, in
  • 52:41 - 52:47
    your test. Did you see any difference with
    respect to age of the accounts? So is it
  • 52:47 - 52:52
    always one out the four to the entire
    sample? Or does it maybe change over the,
  • 52:52 - 52:58
    over the like going from a zero ID to,
    well, 10 billion or 40 billion?
  • 52:58 - 53:02
    Phillip: So you're talking about the
    density of accounts in our ID?
  • 53:02 - 53:06
    Q: Kind of.
    Phillip: So, so there are changes over
  • 53:06 - 53:12
    time. Yeah. So I guess I think now it's
    less than it was before. So now there are
  • 53:12 - 53:19
    fewer than before, and before it was more.
    Yeah. I don't know.
  • 53:19 - 53:24
    Q: But you don't see anything specific
    that now, only in the new accounts, only
  • 53:24 - 53:28
    one out of 10 is active or valid and
    before it was one out of two or something
  • 53:28 - 53:31
    like that.
    Phillip: It's not that extreme. So it's
  • 53:31 - 53:35
    less than that. It's kind of...
    Dennis: We have to say we did not check
  • 53:35 - 53:41
    this, but there were no special cases.
    Phillip: But it changed over time. So
  • 53:41 - 53:47
    before it was more
    and now it is less. And so what we checked
  • 53:47 - 53:55
    was whether an ID actually corresponds to
    an account. And so this metric, yeah. And
  • 53:55 - 53:57
    it changed a little bit over time, but not
    much.
  • 53:57 - 54:02
    Herald: So, so number three, please.
    Question: Yeah. Thank you for a very
  • 54:02 - 54:07
    interesting talk. At the end, you gave
    some recommendations, how to fix the
  • 54:07 - 54:12
    metrics, right? And it's always nice to
    have some metrics because then, well, we
  • 54:12 - 54:15
    are the people who deal with the numbers.
    So we want the metrics. But I want to
  • 54:15 - 54:20
    raise the issue whether quantitative
    measure is actually the right thing to do.
  • 54:20 - 54:26
    So would you buy your furniture from store
    A with 300 likes against store B with 200
  • 54:26 - 54:32
    likes? Or would it not be better to have a
    more qualitative thing? And to what extent
  • 54:32 - 54:38
    is a quantitative measure maybe also the
    source of a lot of bad developments we see
  • 54:38 - 54:43
    in social media to begin with, even not
    with bot firms and anything, but just
  • 54:43 - 54:48
    people who go for the quick like and say
    Hooray for Trump and then get, whatever,
  • 54:48 - 54:52
    all the Trumpists is liking that and the
    others say Fuck Trump and you get all the
  • 54:52 - 54:57
    non Trumpists like that and you get all
    the polarization, right? So, Instagram, I
  • 54:57 - 55:03
    think they just don't just display their
    like equivalent anymore in order to
  • 55:03 - 55:05
    prevent that, so could you maybe comment
    on that?
  • 55:05 - 55:12
    Svea: I think this is a good idea, to
    hide the likes. Yes. But, you know, we
  • 55:12 - 55:18
    talked to many clickworkers and they do a
    lot of stuff. And what they also do is
  • 55:18 - 55:23
    taking comments and doing copy-paste in
    comment sections or for Amazon reviews.
  • 55:23 - 55:30
    So, you know, I think it's really hard to
    get them out of the system because maybe
  • 55:30 - 55:34
    if the likes are not shown and when
  • 55:34 - 55:41
    the comments are what counts, then you will
    have people who are copy pasting comments
    in the comments section. So I really think
  • 55:41 - 55:45
    that the networks, that they really have
    an issue here.
  • 55:45 - 55:50
    Herald: So let's try to squeeze the last
    three questions now. First, number seven,
  • 55:50 - 55:53
    really quick.
    Question: Very quick. Thank you for the
  • 55:53 - 55:59
    nice insights. And I have a question about
    the location of the users. So you made
  • 55:59 - 56:03
    your point that you can analyze by the
    metadata where, uh, when the account was
  • 56:03 - 56:09
    made. But how about the location of the
    followers? Is there any way to analyze
  • 56:09 - 56:12
    that as well?
    Phillip: So we can only analyze that if
  • 56:12 - 56:21
    the users agreed to share it publicly and
    not all of them do that. I think a
  • 56:21 - 56:26
    name check is often a very good way to
    check where somebody is from. For these
  • 56:26 - 56:32
    fake likes, for example. But as I said, it
    always depends on what the user himself is
  • 56:32 - 56:36
    willing to share.
    Herald: Internet?
  • 56:36 - 56:41
    Signal Angel: Isn't this just the western
    version of the Chinese social credit
  • 56:41 - 56:44
    system? Where do we go from here? What is
    the future of all this?
  • 56:44 - 56:54
    Svea: Yeah, it's dystopian, right? Oh,
    yeah. After this research, you
  • 56:54 - 57:01
    know, for me, I deleted my Facebook
    account like one or two years ago. So this,
  • 57:01 - 57:07
    you know, did not matter to me
    so much. But I stayed on Instagram and
  • 57:07 - 57:13
    when I saw all these bought likes and
    subscribers and followers and also YouTube,
  • 57:13 - 57:17
    all these views, because the click
    workers, they also watch YouTube videos.
  • 57:17 - 57:21
    They have to stay on them like 40 seconds,
    it's really funny because they hate
  • 57:21 - 57:27
    hearing like techno music, rap music, for
    40 seconds and then they go on. But when I
  • 57:27 - 57:35
    sat next to Harald for two hours, three
    hours, I was so disillusioned about all
  • 57:35 - 57:41
    the social network things. And I
    thought, OK, don't count on anything. Just
  • 57:41 - 57:46
    if you like the content, follow them and
    look at them. But don't believe anything.
  • 57:46 - 57:50
    That was my personal take away from this
    research.
  • 57:50 - 57:54
    Herald: So very last question, microphone
    two.
  • 57:54 - 57:59
    Question: A couple of days ago, The
    Independent reported that Facebook, the
  • 57:59 - 58:07
    Facebook App was activating the camera
    when reading a news feed. Could this be in
  • 58:07 - 58:11
    use in the context of detecting fake
    accounts?
  • 58:11 - 58:18
    Svea: I don't know.
    Phillip: So, I think that in this
  • 58:18 - 58:27
    particular instance it was probably a
    bug. So, I don't know, but I mean that the
  • 58:27 - 58:31
    people who work at Facebook, not all
    of them are like crooks or anything, that
  • 58:31 - 58:35
    they would deliberately program this kind
    of stuff. So they said that it was kind of
  • 58:35 - 58:41
    a bug from an update that they did.
    And the question is whether we can
  • 58:41 - 58:49
    actually detect fake accounts with the
    camera. And the problem is that I
  • 58:49 - 58:57
    don't think that current face recognition
    technology is enough to detect that you
  • 58:57 - 59:03
    are a unique person. So there are so many
    people on the planet that there is probably another
  • 59:03 - 59:09
    person who has the same face. And I think
    the new iPhone, they also have this much
  • 59:09 - 59:15
    more sophisticated version of this
    technology. And even they say, OK, there's
  • 59:15 - 59:19
    a chance of one in, I don't know, that
    there is somebody who can unlock your
  • 59:19 - 59:24
    phone. So I think it's really hard to do
    that with recording
  • 59:24 - 59:29
    technology, to actually prove that
    somebody is just one person.
  • 59:29 - 59:38
    Herald: So with that, would you please
    help me thank Svea, Dennis and Philip
  • 59:38 - 59:41
    one more time for this fantastic
    presentation! Very interesting and very,
  • 59:41 - 59:48
    very disturbing. Thank you very much.
    Applause
  • 59:48 - 59:52
    postroll music
  • 59:52 - 60:16
    Subtitles created by c3subtitles.de
    in the year 2020. Join, and help us!