
36C3 - No Body's Business But Mine, a dive into Menstruation Apps

  • 0:00 - 0:20
    36C3 preroll music
  • 0:20 - 0:25
    Herald: It is my honor to introduce you
    today to Eva and Chris. Eva, she is a
  • 0:25 - 0:29
    senior researcher at Privacy
    International. She works on gender,
  • 0:29 - 0:35
    economical and social rights and how they
    interplay with the right to privacy,
  • 0:35 - 0:40
    especially in marginalized communities.
    Chris, he is the
  • 0:40 - 0:46
    technology lead at Privacy International.
    And his day-to-day job is to expose
  • 0:46 - 0:51
    companies and how they profit from
    individuals and specifically today they
  • 0:51 - 0:59
    will tell us how these companies can even
    profit from your menstruation. Thank you.
  • 0:59 - 1:00
    Chris: Thank you.
  • 1:00 - 1:05
    applause
  • 1:05 - 1:14
    C: Hi, everyone. It's nice to be back at
    CCC. I was at CCC last year. If you heard
  • 1:14 - 1:19
    my talk from last year, this is going to
    be like a slightly vague part 2. And if
  • 1:19 - 1:22
    you haven't, I'm just gonna give you a very
    brief recap because there is a
  • 1:22 - 1:28
    relationship between the two. So, I will
    give you a little bit of background about
  • 1:28 - 1:33
    how this project started. Then we get to a
    little bit about menstruation apps and
  • 1:33 - 1:38
    what a menstruation app actually is. Let
    me talk a little bit through some of the
  • 1:38 - 1:42
    data that these apps are collecting
    and talk about how we did our research, our
  • 1:42 - 1:48
    research methodology and then what our
    findings are and our conclusions. So last
  • 1:48 - 1:55
    year, I and a colleague did a project
    around how Facebook collects data about
  • 1:55 - 2:04
    users on Android devices using the Android
    Facebook SDK. And this is whether you have
  • 2:04 - 2:10
    a Facebook account or not. And for that
    project, we really looked at when you first
  • 2:10 - 2:14
    opened apps and didn't really have to do
    very much interaction with them
  • 2:14 - 2:24
    particularly, about the automatic sending
    of data in a post-GDPR context. And so we
  • 2:24 - 2:30
    looked at a load of apps for that project,
    including a couple of period trackers. And
  • 2:30 - 2:37
    that kind of led onto this project because
    we were seeing loads of apps, across
  • 2:37 - 2:43
    different categories. So we
    thought we'd hone in a little bit on
  • 2:43 - 2:49
    period trackers to see what kind of data,
    because they're by far more sensitive than
  • 2:49 - 2:53
    many of the other apps on there, like you
    might consider your music history to be
  • 2:53 - 3:04
    very sensitive.... laughs So. Yeah. So,
    just a quick update on the previous work
  • 3:04 - 3:12
    from last year. We actually followed up
    with all of the companies from that, from
  • 3:12 - 3:17
    that report. And by the end of like going
    through multiple rounds of response, over
  • 3:17 - 3:22
    60 percent of them had changed practices
    either by disabling the Facebook SDK in
  • 3:22 - 3:31
    their app or by disabling it until you
    gave consent or removing it entirely. So I
  • 3:31 - 3:36
    pass over to Eva Blum-Dumontet. She's
    going to talk you through menstruation
  • 3:36 - 3:39
    apps.
    Eva: So I just want to make sure that
  • 3:39 - 3:42
    we're all on the same page. Although if
    you didn't know what a menstruation app is
  • 3:42 - 3:48
    and you still bothered coming to this
    talk, I'm extremely grateful. So how many
  • 3:48 - 3:54
    of you are using a menstruation app or
    have a partner, who's been using a
  • 3:54 - 3:58
    menstruation app? Oh my God. Oh, okay. I
    didn't expect that. I thought it was going
  • 3:58 - 4:03
    to be much less. Okay. Well, for the few
    of you who still might not know what a
  • 4:03 - 4:08
    menstruation app is, I'm still going to go
    quickly through what a menstruation app
  • 4:08 - 4:16
    is. The idea of a menstruation app -
    we also call them period trackers - is to
  • 4:16 - 4:22
    have an app that tracks your menstruation
    cycle, so that it tells you what days
  • 4:22 - 4:27
    you're most fertile. And you can
    obviously, if you're using them to try and
  • 4:27 - 4:33
    get pregnant or if you have, for example,
    a painful period, you can sort of plan
  • 4:33 - 4:40
    accordingly. So that's essentially the
    main two reasons users would be
  • 4:40 - 4:48
    looking into using menstruation apps:
    pregnancy, period tracking. Now, how did
  • 4:48 - 4:54
    this research start? As Chris said,
    obviously there was a whole research project that
  • 4:54 - 5:01
    had been done by Privacy International
    last year on various apps. And as Chris
  • 5:01 - 5:09
    also already said what I was particularly
    interested in was the kind of data that
  • 5:09 - 5:13
    menstruation apps are collecting, because
    as we'll explain in this talk, it's really
  • 5:13 - 5:22
    actually not just limited to menstruation
    cycle. And so I was interested in seeing
  • 5:22 - 5:27
    what actually happens to the data when it
    is being shared. So I should say we're
  • 5:27 - 5:32
    really standing on the shoulders of giants
    when it comes to this research. There was
  • 5:32 - 5:36
    previously existing research on
    menstruation apps that was done by a
  • 5:36 - 5:41
    partner organization, Coding Rights in
    Brazil. So they had done research on the
  • 5:41 - 5:47
    kind of data that was collected by
    menstruation apps and the granularity of
  • 5:47 - 5:52
    this data. A very interesting thing
    that they were looking at was the gender
  • 5:52 - 5:59
    normativity of those apps. Chris and I
    have been looking at, you know, dozens of
  • 5:59 - 6:03
    these apps and, you know, they have
    various data sharing practices, as we'll
  • 6:03 - 6:08
    explain in this talk. But one
    thing that all of them have in common is
  • 6:08 - 6:16
    that they are all pink. The other thing is
    that they talk to their users as women.
  • 6:16 - 6:21
    They, you know, don't even sort of
    compute the fact that maybe not all their
  • 6:21 - 6:30
    users are women. So there is a very sort
    of like narrow perspective of pregnancy
  • 6:30 - 6:41
    and female bodies and how female
    sexuality functions. Now, as I was saying,
  • 6:41 - 6:45
    when you're using a menstruation app, it's
    not just your menstruation cycle that
  • 6:45 - 6:55
    you're entering. So this is some of the
    questions that menstruation apps ask: So
  • 6:55 - 7:01
    sex. There is a lot about sex that they
    want to know: How often? Is it protected
  • 7:01 - 7:08
    or unprotected? Are you smoking? Are you
    drinking? Are you partying? How often? We
  • 7:08 - 7:17
    even had one app that was asking about
    masturbation, your sleeping pattern, your
  • 7:17 - 7:23
    coffee drinking habits. One thing that's
    really interesting is that - and we'll
  • 7:23 - 7:29
    talk a little bit more again about this
    later - but there's very strong data
  • 7:29 - 7:34
    protection laws in Europe called GDPR as
    most of you will know. And it says that
  • 7:34 - 7:38
    only data that's strictly necessary should
    be collected. So I'm still unclear what
  • 7:38 - 7:47
    masturbation has to do with tracking your
    menstruation cycle, but... Another thing
  • 7:47 - 7:56
    that was collected is about your health
    and the reason health is so important is
  • 7:56 - 8:00
    also related to data protection laws
    because when you're collecting health
  • 8:00 - 8:05
    data, you need to show that you're taking
    an extra step to collect this data because
  • 8:05 - 8:11
    it's considered sensitive personal data.
    So extra steps in terms of getting
  • 8:11 - 8:17
    explicit consent from the users but also
    through steps on behalf of the data
  • 8:17 - 8:22
    controller, in terms of showing that
    they're making extra steps for the
  • 8:22 - 8:29
    security of this data. So this is the type
    of question that was asked. There is so
  • 8:29 - 8:35
    much asked about vaginal discharge and
    what kind of vaginal discharge you get
  • 8:35 - 8:40
    with all sorts of weird adjectives for
    this: "Tiki, creamy". So yeah, they
  • 8:40 - 8:49
    clearly thought a lot about this. And it
    is a lot about mood as well. Even, yeah, I
  • 8:49 - 8:56
    didn't know 'romantic' was a mood but
    apparently it is. And what's interesting
  • 8:56 - 9:02
    obviously about mood in the context where,
    you know, we've seen stories like
  • 9:02 - 9:07
    Cambridge Analytica, for example. So we
    know how much companies, we know how much
  • 9:07 - 9:12
    political parties are trying to understand
    how we think, how we feel. So that's
  • 9:12 - 9:17
    actually quite significant that you have
    an app that's collecting information about
  • 9:17 - 9:24
    how we feel on a daily basis. And
    obviously, like when people enter all
  • 9:24 - 9:29
    these data, their expectation at that
    point is that the data stays between
  • 9:29 - 9:35
    them and the app. And actually,
    there is very little in the privacy policy
  • 9:35 - 9:42
    that would suggest
    otherwise. So this is the moment where I
  • 9:42 - 9:46
    actually should say we're not making this
    up; like literally everything in this list
  • 9:46 - 9:52
    of questions were things, literal terms,
    that they were asking. So we set out to
  • 9:52 - 9:55
    look at the most popular menstruation
    apps. Do you want to carry on?
  • 9:55 - 10:00
    Chris: Yeah. I forgot to introduce myself
    as well. Really? That's a terrible
  • 10:00 - 10:02
    speaking habit.
    Eva: Christopher Weatherhead..
  • 10:02 - 10:09
    Chris: .. Privacy International's
    technology lead. So yeah... As I said
  • 10:09 - 10:12
    about our previous research, we have
    actually looked at most of the very
  • 10:12 - 10:18
    popular menstruation apps, the ones that
    have hundreds of thousands of downloads.
  • 10:18 - 10:22
    And these apps - like, as we were saying,
    this kind of work has been done before. A
  • 10:22 - 10:26
    lot of these apps have come in for quite a
    lot of criticism - I'll spare you the free
  • 10:26 - 10:30
    advertising about which ones particularly
    but most of them don't do anything
  • 10:30 - 10:36
    particularly outrageous, at least between
    the app and the developers' servers. A lot
  • 10:36 - 10:39
    of them don't share with third parties at
    that stage. So you can't just look between the
  • 10:39 - 10:44
    app and the server to see what they're
    sharing. They might be sharing data from
  • 10:44 - 10:48
    the developers' server to Facebook or to
    other places but at least you can't see
  • 10:48 - 10:56
    in-between. But we're an international
    organization and we work around the globe.
  • 10:56 - 11:01
    And most of the apps that get the most
    downloads are particularly Western, U.S.,
  • 11:01 - 11:08
    European but they're not the most popular
    apps necessarily in a context like India
  • 11:08 - 11:13
    and the Philippines and Latin America. So
    we thought we'd have a look and see those
  • 11:13 - 11:17
    apps. They're all available in Europe but
    they're not necessarily the most popular
  • 11:17 - 11:23
    in Europe. And this is where things
    started getting interesting. So what
  • 11:23 - 11:30
    exactly did we do? Well, we started off by
    triaging through a large number of period
  • 11:30 - 11:36
    trackers. And as Eva said earlier: every
    logo must be pink. And we were just kind
  • 11:36 - 11:40
    of looking through to see how many
    trackers - this is using Exodus
  • 11:40 - 11:47
    Privacy. We have our own instance at PI
    and we just looked through to see how many
  • 11:47 - 11:51
    trackers and who the trackers were. So,
    for example, this is Maya, which is
  • 11:51 - 11:55
    exceptionally popular in India,
    predominantly - it's made by an Indian
  • 11:55 - 12:01
    company. And as you can see, it's got a
    large number of trackers in it:
  • 12:01 - 12:09
    CleverTap, Facebook, Flurry, Google and
    InMobi. So we went through this process and
  • 12:09 - 12:15
    this allowed us to cut down... There's
    hundreds of period trackers. Not all of
  • 12:15 - 12:19
    them are necessarily bad but it's nice to
    try to see which ones had the most
  • 12:19 - 12:24
    trackers, where they were used, and try to
    just triage them a little bit.
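A note on this triage step: a tracker lookup like the one described can be scripted against an Exodus Privacy instance. The sketch below is illustrative only - the endpoint path, auth header, response layout and the example package name are assumptions modelled on the public reports.exodus-privacy.eu.org API, not PI's actual tooling.

```python
# Sketch: ask an Exodus Privacy instance which trackers it found in an app.
# Assumptions: the endpoint, auth header and response layout follow the public
# reports.exodus-privacy.eu.org API and may differ on a self-hosted instance.
import requests

EXODUS_INSTANCE = "https://reports.exodus-privacy.eu.org"
API_TOKEN = "..."  # issued by the instance you are querying


def tracker_ids_for(handle: str) -> set:
    """Return the set of tracker IDs Exodus reports for the given app handle."""
    resp = requests.get(
        f"{EXODUS_INSTANCE}/api/search/{handle}",
        headers={"Authorization": f"Token {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    reports = resp.json().get(handle, {}).get("reports", [])
    ids = set()
    for report in reports:
        ids.update(report.get("trackers", []))
    return ids


if __name__ == "__main__":
    # Triage a period tracker by its Android package name (example handle).
    print(tracker_ids_for("in.plackal.lovecyclesfree"))
```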
  • 12:24 - 12:33
    From this, we then ran them through PI's
    interception environment, which is a VM that I've made.
  • 12:33 - 12:37
    I actually made it last year for the talk
    I gave last year. And I said I'd release
  • 12:37 - 12:41
    it after the talk and it took me like three
    months to release it but it's now
  • 12:41 - 12:45
    available. You can go onto PI's website
    and download it. It's a man in the middle
  • 12:45 - 12:53
    proxy with a few settings - mainly for
    looking at iOS and Android apps to do data
  • 12:53 - 12:59
    interception between them. And so we ran
    through that and we got to have a look at
  • 12:59 - 13:05
    all the data that's being sent to and from
    both the app developer and third parties.
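To illustrate the interception environment described here: mitmproxy, which the VM is built around, can be extended with a small addon that flags traffic from the app under test to known third-party hosts while you tap through it. This is a minimal sketch, not PI's actual tooling; the host list is illustrative, and the CleverTap/WizRocket domain in particular is an assumption.

```python
# Sketch of a mitmproxy addon that flags app traffic to known third-party
# analytics and advertising hosts. Run with: mitmproxy -s third_party_flagger.py
# The host list below is illustrative, not exhaustive.
from mitmproxy import http

THIRD_PARTY_HOSTS = (
    "graph.facebook.com",   # Facebook SDK app events
    "wzrkt.com",            # CleverTap / WizRocket (assumed endpoint)
    "appsflyer.com",        # AppsFlyer
    "flurry.com",           # Flurry
)


class ThirdPartyFlagger:
    def request(self, flow: http.HTTPFlow) -> None:
        host = flow.request.pretty_host
        if any(host == h or host.endswith("." + h) for h in THIRD_PARTY_HOSTS):
            # Log the endpoint and any form parameters the app is sending.
            print(f"[third-party] {host} {flow.request.path}")
            for key, value in flow.request.urlencoded_form.items(multi=True):
                print(f"    {key} = {value[:200]}")


addons = [ThirdPartyFlagger()]
```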
  • 13:05 - 13:11
    And here's what we found.
    Eva: So out of the six apps we looked at,
  • 13:11 - 13:18
    five shared data with Facebook. Out of
    those five, three pinged Facebook to let
  • 13:18 - 13:24
    them know when their users were
    downloading the app and opening the app.
  • 13:24 - 13:30
    And that's already quite significant
    information and we'll get to that later.
  • 13:30 - 13:37
    Now, what's actually interesting and the
    focus of our report was on the two apps that
  • 13:37 - 13:42
    shared every single piece of information
    that their users entered with Facebook and
  • 13:42 - 13:50
    other third parties. So just to brief you:
    the two apps we focused on are both called
  • 13:50 - 13:55
    Maya. So that's all very helpful. One is
    spelled Maya: M-a-y-a. The other one's
  • 13:55 - 14:01
    spelled Mia: M-i-a. So, yeah, just bear with
    me because this is actually quite
  • 14:01 - 14:10
    confusing. But so initially we'll focus on
    Maya, which is - as Chris mentioned - an
  • 14:10 - 14:16
    app that's based in India. They have a
    user base of several million. They are
  • 14:16 - 14:27
    based in India. User base mostly in India,
    also quite popular in the Philippines. So
  • 14:27 - 14:30
    what's interesting with Maya is that they
    start sharing data with Facebook before
  • 14:30 - 14:35
    you even get to agree to their privacy
    policy. So I should say already about the
  • 14:35 - 14:39
    privacy policy of a lot of those apps that
    we looked at is that they are literally
  • 14:39 - 14:48
    the definition of small print. It's very
    hard to read. It's legalese language. It
  • 14:48 - 14:54
    really puts into perspective the whole
    question of consent in GDPR because GDPR
  • 14:54 - 14:58
    says like the consents must be informed.
    So you must be able to understand what
  • 14:58 - 15:04
    you're consenting to. When you're reading
    this extremely long, extremely opaque
  • 15:04 - 15:09
    privacy policies of a lot - literally all
    the menstruation apps we've looked at,
  • 15:09 - 15:14
    excluding one that didn't even bother
    putting up a privacy policy, actually.
  • 15:14 - 15:20
    It's opaque. It's very hard to understand
    and they absolutely, definitely do not say
  • 15:20 - 15:25
    that they're sharing information with
    Facebook. As I said, data sharing happened
  • 15:25 - 15:30
    before you get to agree to their privacy
    policy. The other thing that's also worth
  • 15:30 - 15:33
    remembering is that when they share
    information with Facebook, it doesn't matter
  • 15:33 - 15:39
    if you have a Facebook account or not, the
    information is still being relayed. The other
  • 15:39 - 15:44
    interesting thing that you'll notice as
    well in several of the slides is that the
  • 15:44 - 15:49
    information that's being shared is tied to
    your identity through your unique
  • 15:49 - 15:55
    identifiers, also your email address. But
    basically most of the questions we got
  • 15:55 - 16:00
    when we released the research was like:
    oh, if I use a fake email address or if I
  • 16:00 - 16:06
    use a fake name, is that OK? Well, it's
    not because even if you have a Facebook
  • 16:06 - 16:13
    account through your unique identifier,
    they would definitely be able to trace you
  • 16:13 - 16:22
    back. There is no way to actually
    anonymize this process unless - well at
  • 16:22 - 16:27
    the end, unless you're deliberately trying to
    trick it and use a separate phone.
  • 16:27 - 16:34
    Basically, for regular users, it's quite
    difficult. So this is what it looks like
  • 16:34 - 16:42
    when you enter the data. So as I said, I
    didn't lie to you. This is the kind of
  • 16:42 - 16:49
    questions they're asking you. And this is
    what it looks like when it's being shared
  • 16:49 - 16:55
    with Facebook. So you see the symptoms
    changing, for example, like blood
  • 16:55 - 17:00
    pressure, swelling, acne, that's all being
    shipped through to graph.facebook.com,
  • 17:00 - 17:06
    through the Facebook SDK. This is what it
    looks like when they share your
  • 17:06 - 17:12
    contraceptive practice, so again, like
    we're talking health data. Here we're
  • 17:12 - 17:18
    talking sensitive data. We're talking
    about data that should normally require
  • 17:18 - 17:22
    extra steps in terms of collecting it, in
    terms of how it's being processed. But
  • 17:22 - 17:29
    nope, in this case it was shared exactly
    like the rest. This is what it looks like.
  • 17:29 - 17:34
    Well, so, yeah with sex life it was a
    little bit different. So that's what it
  • 17:34 - 17:38
    looks like when they're asking you about,
    you know, you just had sex, was it
  • 17:38 - 17:45
    protected? Was it unprotected? The way it
    was shared with Facebook was a little bit
  • 17:45 - 17:51
    cryptic, so to speak. So if you have
    protected sex, it was entered as love "2",
  • 17:51 - 17:58
    unprotected sex was entered as Love "3". I
    managed to figure that out pretty quickly.
  • 17:58 - 18:07
    So it's not so cryptic. That's also quite
    funny. So Maya had a diary section where
  • 18:07 - 18:13
    they encourage people to enter like their
    notes and their personal thoughts. And I
  • 18:13 - 18:19
    mean, it's a menstruation app so you can
    sort of get the idea of what people are
  • 18:19 - 18:22
    going to be writing down in there or
    expected to write on. It's not going to be
  • 18:22 - 18:26
    their shopping list, although shopping
    lists could also be personal, sensitive,
  • 18:26 - 18:33
    personal information, but.. So we were
    wondering what would happen if you were to
  • 18:33 - 18:38
    write in this diary and how this
    data would be processed. So we
  • 18:38 - 18:42
    literally entered something very
    sensitive in here. This is what we
  • 18:42 - 18:53
    wrote. And literally everything we wrote
    was shared with Facebook. Maya also shared
  • 18:53 - 18:58
    your health data, not just with Facebook,
    but with a company called CleverTap that's
  • 18:58 - 19:05
    based in California. So what's CleverTap?
    CleverTap is a data broker, basically.
  • 19:05 - 19:12
    It's a company that - sort of similar to
    Facebook with the Facebook SDK. They
  • 19:12 - 19:17
    expect app developers to hand over the data
    and in exchange app developers get
  • 19:17 - 19:24
    insights about like how people use the
    app, what time of day. You know, the age
  • 19:24 - 19:31
    of their users. They get all sorts of
    information and analytics out of the data
  • 19:31 - 19:39
    they share with this company. It took us
    some time to figure it out because it
  • 19:39 - 19:43
    showed up as... wicked wizard?
    Chris: WizRocket.
  • 19:43 - 19:50
    Eva: WizRocket, yeah. But that's
    exactly the same. Everything that was
  • 19:50 - 19:57
    shared with Facebook was also shared with
    CleverTap again, with the email address
  • 19:57 - 20:05
    that we were using - everything. Let's
    shift. Now, let's look at the other Mia.
  • 20:05 - 20:10
    It's not just the name that's similar,
    it's also the data sharing practices. Mia
  • 20:10 - 20:18
    is based in Cyprus, so in the European Union.
    I should say, in all cases, regardless of
  • 20:18 - 20:22
    where the company is based, the moment
    that they market the product in the European
  • 20:22 - 20:29
    Union, so like literally every app we
    looked at, they need to - well they should
  • 20:29 - 20:40
    respect GDPR, our European data protection
    law. Now, the first thing that Mia asked
  • 20:40 - 20:45
    when you started the app and again - I'll
    get to that later about the significance
  • 20:45 - 20:50
    of this - is why you're using the app: are
    you using it to try and get pregnant or
  • 20:50 - 20:56
    are you just using it to try to track your
    periods? Now, it's interesting because it
  • 20:56 - 21:00
    doesn't change at all the way you interact
    with the app eventually. The app stays
  • 21:00 - 21:05
    exactly the same. But this is actually the
    most important kind of data. This is
  • 21:05 - 21:11
    literally the gem of data
    collection. It's trying to know when a
  • 21:11 - 21:16
    woman is trying to get pregnant or not. So
    the reason this is the first question they
  • 21:16 - 21:21
    ask is, well my guess on this is - they
    want to make sure that like even if you
  • 21:21 - 21:26
    don't actually use the app that's at least
    that much information they can collect
  • 21:26 - 21:32
    about you. And so this information was
    shared immediately with Facebook and with
  • 21:32 - 21:37
    AppsFlyer. AppsFlyer is very similar to
    CleverTap in the way it works. It's also a
  • 21:37 - 21:44
    company that collects data from these apps
    and that offers services in terms of analytics
  • 21:44 - 21:54
    and insights into user behavior. It's
    based in Israel. So this is what it looks
  • 21:54 - 22:05
    like when you enter the information. Yeah,
    masturbation, pill. What kind of pill
  • 22:05 - 22:11
    you're taking, your lifestyle habits. Now
    where it's slightly different is that the
  • 22:11 - 22:16
    information doesn't immediately get shared
    with Facebook but based on the information
  • 22:16 - 22:23
    you enter, you get articles that are
    tailored for you. So, for example, like
  • 22:23 - 22:27
    when you select masturbation, you will
    get, you know, masturbation: what you want
  • 22:27 - 22:36
    to know but are ashamed to ask. Now,
    what's eventually shared with Facebook is
  • 22:36 - 22:43
    actually the kind of article that's being
    offered to you. So basically, yes, the
  • 22:43 - 22:50
    information is shared indirectly because
    then, you know, you have Facebook knowing
  • 22:50 - 22:53
    you've just entered masturbation because
    you're getting an article about
  • 22:53 - 22:59
    masturbation. So this is what happened
    when you enter alcohol. So expected
  • 22:59 - 23:03
    effects of alcohol on a woman's body.
    That's what happened when you enter
  • 23:03 - 23:06
    "unprotected sex". So effectively, all the
    information is still shared just
  • 23:06 - 23:14
    indirectly through the articles you're
    getting. Yeah. Last thing also, I should
  • 23:14 - 23:18
    say on this, in terms of the articles that
    you're getting, is that sometimes there
  • 23:18 - 23:23
    was sort of also kind of like cross-referencing
    the data... so the articles would be
  • 23:23 - 23:30
    about like: oh, you have cramps outside of
    your periods, for example, like during
  • 23:30 - 23:37
    your fertile phase. And so you will get
    the article specifically for this and the
  • 23:37 - 23:43
    information that's shared with Facebook
    and with AppsFlyer is that this person is
  • 23:43 - 23:49
    in their fertile period in this phase of
    their cycles and having cramps. Now, why
  • 23:49 - 23:52
    are menstruation apps so obsessed with
    finding out if you're trying to get
  • 23:52 - 24:00
    pregnant? And so, this goes back to a lot
    of the things I mentioned before that, you
  • 24:00 - 24:04
    know, about wanting to know in the very
    first place if you're trying to get
  • 24:04 - 24:10
    pregnant or not. And also, this is
    probably why a lot of those apps are
  • 24:10 - 24:17
    trying to really nail down in their
    language and discourse what you're using
  • 24:17 - 24:23
    the apps for. When a person is pregnant,
    their purchasing habits, their consumer
  • 24:23 - 24:30
    habits change. Obviously, you know, you
    buy not only for yourself but you start
  • 24:30 - 24:37
    buying for others as well. But also you're
    buying new things you've never purchased
  • 24:37 - 24:42
    before. So while for a regular person it will be
    quite difficult to change their purchasing
  • 24:42 - 24:48
    habits, for a person that's pregnant,
    advertisers will be really keen
  • 24:48 - 24:53
    to target them because this is a point of
    their life where their habits change and
  • 24:53 - 24:58
    where they can be more easily influenced
    one way or another. So in other words,
  • 24:58 - 25:04
    it's pink advertising time. In other more
    words and pictures, there's research done
  • 25:04 - 25:12
    in 2014 in the US that was trying to sort
    of evaluate the value of data for a
  • 25:12 - 25:19
    person. So an average American person
    that's not pregnant was 10 cents. A person
  • 25:19 - 25:29
    who's pregnant would be one dollar fifty.
    So you may have noticed we were using the past
  • 25:29 - 25:33
    tense when we talked about - well I hope I
    did when I was speaking, definitely in
    the slides at least - we used the past
    the lights at least - we used the past
    tense when we talked about data sharing of
  • 25:38 - 25:43
    these apps. That's because both Maya and
    MIA, which were the two apps we were
  • 25:43 - 25:48
    really targeting with this report, stopped
    using the Facebook SDK when we wrote to
  • 25:48 - 25:51
    them about our research
    before we published it.
  • 25:51 - 26:01
    applause
    So it was quite nice because they didn't
  • 26:01 - 26:06
    even like rely on actually us publishing
    the report. It was merely at a stage of
  • 26:06 - 26:10
    like, hey, this is our right of response.
    We're gonna be publishing this. Do you
  • 26:10 - 26:14
    have anything to say about this? And
    essentially what they had to say is like:
  • 26:14 - 26:21
    "Yep, sorry, apologies. We are stopping
    this." I think, you know.. What's really
  • 26:21 - 26:28
    interesting as well to me about like
    how quick the response was is.. it really
  • 26:28 - 26:34
    shows how this is not a vital service for
    them. This is a plus. This is something
  • 26:34 - 26:42
    that's a useful tool. But the fact that
    they immediately could just stop using it,
  • 26:42 - 26:48
    I think really shows that, you know, it
    was.. I wouldn't say a lazy practice, but
  • 26:48 - 26:53
    it's a case of like: as long as no one's
    complaining, then you are going to carry
  • 26:53 - 27:00
    on using it. And I think that was also the
    case with your research. There was
  • 27:00 - 27:03
    also a lot that changed
    their behaviors after.
  • 27:03 - 27:06
    Chris: A lot of the developers sometimes
    don't even realize necessarily what data
  • 27:06 - 27:12
    they end up sharing with people like
    Facebook, with people like CleverTap. They
  • 27:12 - 27:17
    just integrate the SDK and
    hope for the best.
  • 27:17 - 27:22
    Eva: We also got this interesting response
    from AppsFlyer, which is very
  • 27:22 - 27:27
    hypocritical. Essentially, what they're
    saying is like oh, like we specifically
  • 27:27 - 27:34
    ask our customers to not
    share health data with us, specifically for
  • 27:34 - 27:38
    the reason I mentioned earlier, which is
    that, because of GDPR, you're normally
  • 27:38 - 27:45
    expected to take extra steps when you
    process sensitive health data. So their
  • 27:45 - 27:49
    response is that they ask their customers to
    not share health data or sensitive
  • 27:49 - 27:55
    personal data so they don't become liable
    in terms of the law. So they were like,
  • 27:55 - 28:00
    oh, we're sorry, like this is a breach of
    contract. Now, the reason it's very
  • 28:00 - 28:04
    hypocritical is that obviously when you
    have contracts with menstruation apps - and
  • 28:04 - 28:08
    actually Maya was not the only
    menstruation app that they're working with -
  • 28:08 - 28:12
    I mean, you know, what can you generally
    expect in terms of the kind of data you're
  • 28:12 - 28:19
    gonna receive? So here's a conclusion for
    us: this research works. It's fun, it's
  • 28:19 - 28:27
    easy to do. You know, Chris has now
    published the environment. It doesn't
  • 28:27 - 28:33
    actually - once the environment is sort of
    set up it doesn't actually require
  • 28:33 - 28:37
    technical background, as you saw from the
    slides it's pretty straightforward to
  • 28:37 - 28:42
    actually understand how the data is being
    shared. So you should do it, too. But more
  • 28:42 - 28:47
    broadly, we think it's really important to
    do more research, not just at this stage
  • 28:47 - 28:54
    of the process, but generally about the
    security and the data sharing
  • 28:54 - 29:00
    practices of apps, because, you know, more
    and more people are
  • 29:00 - 29:06
    using or interacting with technology and
    using the Internet. So we need to think
  • 29:06 - 29:11
    much more carefully about the security
    implications of the apps we use and
  • 29:11 - 29:16
    obviously it works. Thank you.
  • 29:16 - 29:25
    applause
  • 29:25 - 29:30
    Herald: Thank you. So, yeah, please line
    up in front of the microphones. We can
  • 29:30 - 29:34
    start with microphone two.
    Mic 2: Hi. Thank you. So you mentioned
  • 29:34 - 29:39
    that now we can check whether our data is
    being shared with third parties on the
  • 29:39 - 29:42
    path between the user and the developer.
    But we cannot know for all the other apps
  • 29:42 - 29:46
    and for these, what if it's not being
    shared later from the developer, from the
  • 29:46 - 29:52
    company to other companies. Have you
    conceptualize some ways of testing that?
  • 29:52 - 29:56
    Is it possible?
    Chris: Yes. So you could do a data
  • 29:56 - 30:04
    subject access request under the GDPR, that
    would... like the problem is it's quite
  • 30:04 - 30:11
    hard to necessarily know how the processing
    - how the system outside of the app-to-
  • 30:11 - 30:16
    server relationship works. It's quite hard to know
    the processing of that data and so it is
  • 30:16 - 30:20
    quite opaque. They might apply a different
    identifier too, they might do other
  • 30:20 - 30:24
    manipulations to that data so trying to
    track down and prove this bit of data
  • 30:24 - 30:29
    belongs to you is quite challenging.
    Eva: This is something we're going to try.
  • 30:29 - 30:32
    We're going to be doing in 2020, actually.
    We're going to be doing data subject
  • 30:32 - 30:38
    access request of those apps that we've
    been looking at to see if we find anything
  • 30:38 - 30:44
    both under GDPR but also under different
    data protection laws in different
  • 30:44 - 30:50
    countries. To see basically what we get,
    how much we can obtain from that.
  • 30:50 - 30:55
    Herald: So I'd go with the signal angle.
    Signal: So what advice can you give us on
  • 30:55 - 31:00
    how we can make people understand that
    from a privacy perspective, it's better to
  • 31:00 - 31:05
    use pen and paper instead of entering
    sensitive data into any of these apps?
  • 31:05 - 31:10
    Eva: I definitely wouldn't advise that. I
    wouldn't advise pen and paper. I think for
  • 31:10 - 31:17
    us like really the key... The work we are
    doing is not actually targeting users.
  • 31:17 - 31:21
    It's targeting companies. We think it's
    companies that really need to do better.
  • 31:21 - 31:26
    We're often asked about, you know, advice to
    customers or advice to users and
  • 31:26 - 31:32
    consumers. But what I think and what we've
    been telling companies as well is that,
  • 31:32 - 31:36
    you know, your users trust you and they
    have the right to trust you. They also
  • 31:36 - 31:41
    have the right to expect that you're
    respecting the law. The European Union has
  • 31:41 - 31:47
    a very ambitious legislation when it comes
    to privacy with GDPR. And so the least
  • 31:47 - 31:56
    they can expect is that you're respecting
    the law. And so, no, I would ... and this
  • 31:56 - 32:00
    is the thing, I think people have the
    right to use those apps, they have the
  • 32:00 - 32:04
    right to say, well, this is a useful
    service for me. It's really companies that
  • 32:04 - 32:08
    need you. They need to up their game. They
    need to live up to the expectations of
  • 32:08 - 32:16
    their consumers. Not the other way around.
    Herald: Microphone 1.
  • 32:16 - 32:19
    Mic 1: Hi. So from the talk, it seems and
    I think that's what you get, you mostly
  • 32:19 - 32:23
    focused on Android based apps. Can you
    maybe comment on what the situation is
  • 32:23 - 32:27
    with iOS? Is there any technical
    difficulty or is it anything completely
  • 32:27 - 32:31
    different with respect to these apps and
    apps in general?
  • 32:31 - 32:34
    Chris: There's not really a technical
    difficulty, like the setup is a little bit
  • 32:34 - 32:39
    different, but functionally you can look
    at the same kind of data. The focus here,
  • 32:39 - 32:45
    though, is also.. So it's two-fold in some
    respects. Most of the places that these
  • 32:45 - 32:50
    apps are used are heavily Android-dominated
    territories, places like India,
  • 32:50 - 32:56
    the Philippines. iOS penetration there,
    uh, Apple device penetration there is very
  • 32:56 - 33:02
    low. There's no technical reason not to
    look at Apple devices. But like in this
  • 33:02 - 33:07
    particular context, it's not necessarily
    hugely relevant. So does that answer your
  • 33:07 - 33:09
    question?
    Mic 1: And technically with your set-up,
  • 33:09 - 33:12
    you could also do the same
    analysis with an iOS device?
  • 33:12 - 33:17
    Chris: Yeah. As I said it's a little bit
    of a change to how you... You have to
  • 33:17 - 33:22
    register the device as an MDM dev.. like a
    mobile profile device. Otherwise you can
  • 33:22 - 33:31
    do the exact same level of interception.
    Mic: Uh, hi. My question is actually
  • 33:31 - 33:33
    related to the last question
    and is a little bit technical.
  • 33:33 - 33:36
    Chris: Sure.
    Mic: I'm also doing some research on apps
  • 33:36 - 33:40
    and I've noticed with the newest versions
    of Android that they're making more
  • 33:40 - 33:44
    difficult to install custom certificates
    to have this pass- through and check what
  • 33:44 - 33:49
    the apps are actually communicating to
    their home servers. Have you find a way to
  • 33:49 - 33:54
    make this easier?
    Chris: Yes. So we actually hit the same
  • 33:54 - 34:02
    issue as you in some respects. So the
    installing of custom certificates was not
  • 34:02 - 34:06
    really an obstacle because you can add them to
    the user store; if it's a rooted device, you can
  • 34:06 - 34:14
    add them to the system store and they are
    trusted by all the apps on the device. The
  • 34:14 - 34:19
    problem we're now hitting is that Android 9
    and 10 have TLS 1.3 and TLS 1.3
  • 34:19 - 34:24
    detects a man in the middle, or at
    least it tries to, and might terminate the
  • 34:24 - 34:29
    connection. Uh, this is a bit of a
    problem. So currently all our research is
  • 34:29 - 34:37
    still running on Android 8.1 devices. This
    isn't going to be sustainable long term.
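For context on the certificate step Chris describes: on a rooted test device the proxy's CA can be copied into the system trust store, which is the store most apps actually honour since Android 7. The sketch below follows the widely documented generic procedure rather than PI's exact setup, and assumes adb and openssl on the PATH plus a remountable /system partition, as on the Android 8.1 devices mentioned.

```python
# Sketch: push a mitmproxy CA certificate into the Android *system* trust store
# on a rooted test device so that it is trusted by all apps.
import subprocess

CA_PEM = "mitmproxy-ca-cert.pem"


def sh(*cmd: str) -> str:
    """Run a command and return its stdout, raising if it fails."""
    return subprocess.run(cmd, check=True, capture_output=True, text=True).stdout.strip()


# System CA files are named after the certificate's old-style subject hash.
cert_hash = sh("openssl", "x509", "-inform", "PEM", "-subject_hash_old",
               "-noout", "-in", CA_PEM)
target = f"/system/etc/security/cacerts/{cert_hash}.0"

sh("adb", "root")
sh("adb", "remount")          # requires a rooted device with a writable /system
sh("adb", "push", CA_PEM, target)
sh("adb", "shell", "chmod", "644", target)
sh("adb", "reboot")
```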
  • 34:37 - 34:43
    Herald: Um, 4.
    Mic 4: Hey, thank you for the great talk.
  • 34:43 - 34:47
    Your research is obviously targeted in a
    constructive, critical way towards
  • 34:47 - 34:53
    companies that are making apps surrounding
    menstrual research. Did you learn anything
  • 34:53 - 34:57
    from this context that you would want to
    pass on to people who research this area
  • 34:57 - 35:03
    more generally? I'm thinking, for example,
    of Paramount Corp in the US, who've done
  • 35:03 - 35:08
    micro dosing research on LSD and are
    starting a breakout study on menstrual
  • 35:08 - 35:12
    issues.
    Eva: Well, I think this is what I was
  • 35:12 - 35:16
    concluding on. I think
    there's still a lot of research that needs
  • 35:16 - 35:21
    to be done in terms of the sharing. And
    obviously, I think anything that touches
  • 35:21 - 35:28
    on people's health is a key priority
    because it's something people relate very
  • 35:28 - 35:34
    strongly to. The consequences, especially
    in the US, for example, of sharing health
  • 35:34 - 35:39
    data like this, of having - you know -
    data, even like your blood pressure and so
  • 35:39 - 35:43
    on. Like what are the consequences if
    that information is gonna be shared,
  • 35:43 - 35:47
    for example, with like insurance companies
    and so on. This is why I think it's
  • 35:47 - 35:52
    absolutely essential to have a better
    understanding of the data collection and
  • 35:52 - 35:58
    sharing practices of the services. The
    moment when you have health data that's
  • 35:58 - 36:00
    being involved.
    Chris: .. yeah because we often focus
  • 36:00 - 36:06
    on this being an advertising issue. But
    in that sense as well, insurance and even
  • 36:06 - 36:10
    credit referencing and all sorts of other
    things become problematic, especially when
  • 36:10 - 36:15
    it comes to pregnancy-related data.
    Eva: Yeah, even employers could be after
  • 36:15 - 36:19
    this kind of information.
    Herald: Six.
  • 36:19 - 36:24
    Mic 6: Hi. I'm wondering if there is an
    easy way or a tool which we can use to
  • 36:24 - 36:33
    detect if apps are using our data or are
    reporting them to Facebook or whatever. Or
  • 36:33 - 36:40
    if we can even use those apps but block
    this data from being reported to Facebook.
  • 36:40 - 36:46
    Chris: Yes. So, you can firewall off
    graph.facebook.com and stop sending
  • 36:46 - 36:52
    data to that. There's a few issues here.
    Firstly, it doesn't really like.. This
  • 36:52 - 36:58
    audience can do this. Most users don't
    have the technical nuance to know what
  • 36:58 - 37:02
    needs to be blocked, what doesn't
    necessarily need to be blocked. It's on
  • 37:02 - 37:07
    the companies to be careful with users'
    data. It's not up to the users to try and
  • 37:07 - 37:14
    defend against.. It shouldn't be on the
    user to defend against malicious data
  • 37:14 - 37:17
    sharing or...
    Eva: You know... also one interesting
  • 37:17 - 37:22
    thing is that Facebook has put this in
    place, like, where you can opt out
  • 37:22 - 37:25
    from data sharing with the apps you're
    using, but that only works if you're a
  • 37:25 - 37:30
    Facebook user. And as I said, like this
    data has been collected whether you are a
  • 37:30 - 37:34
    user or not. So in a sense, for people who
    aren't Facebook users, they couldn't opt
  • 37:34 - 37:38
    out of this.
    Chris: With the Facebook SDK the developers are
  • 37:38 - 37:47
    integrating, the default state for sharing
    of data is on, the flag is true. And
  • 37:47 - 37:56
    although they have a long legal text on
    the help pages for the developer tools,
  • 37:56 - 38:01
    it's like unless you have a decent
    understanding of local data protection
  • 38:01 - 38:05
    practice or local data protection law, it's
    not something that most
  • 38:05 - 38:09
    developers are gonna be able to understand
    why this flag should be something
  • 38:09 - 38:16
    different from on. You know there's loads
    of flags in the SDK, which flags should be
  • 38:16 - 38:22
    on and off, depending on which
    jurisdiction you're selling to, or your users
  • 38:22 - 38:27
    are going to be in.
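To make the flag Chris mentions concrete: Facebook documents a manifest meta-data key, com.facebook.sdk.AutoLogAppEventsEnabled, that a developer can set to false to turn automatic app-event logging off. A small sketch of checking a decompiled APK for that flag during triage follows; the key name is Facebook's documented one, while the file path and output wording are illustrative assumptions.

```python
# Sketch: after decompiling an APK (e.g. with apktool), check whether the
# Facebook SDK's automatic app-event logging has been switched off.
import xml.etree.ElementTree as ET

ANDROID_NS = "{http://schemas.android.com/apk/res/android}"


def facebook_autolog_setting(manifest_path: str) -> str:
    """Return the value of the auto-logging flag, or a note if it is absent."""
    tree = ET.parse(manifest_path)
    for meta in tree.getroot().iter("meta-data"):
        if meta.get(f"{ANDROID_NS}name") == "com.facebook.sdk.AutoLogAppEventsEnabled":
            return meta.get(f"{ANDROID_NS}value", "unset")
    return "flag not present (the SDK default applies, i.e. logging is on)"


if __name__ == "__main__":
    print(facebook_autolog_setting("decompiled/AndroidManifest.xml"))
```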
    Herald: Signal Angel, again.
  • 38:27 - 38:32
    Signal: Do you know any good apps which
    don't share data and are privacy friendly?
  • 38:32 - 38:37
    Probably even one that is open source.
    Eva: So, I mean, the problem, which
  • 38:37 - 38:43
    is why I wouldn't want to vouch for any
    app is that even in the apps that, you
  • 38:43 - 38:48
    know, where in terms of like the traffic
    analysis we've done, we didn't see any
  • 38:48 - 38:53
    data sharing. As Chris was explaining, the
    data can be shared at a later stage and
  • 38:53 - 39:01
    it'd be impossible for us to really find
    out. So.. no, I can't be vouching for any
  • 39:01 - 39:05
    app. I don't know if you can...
    Chris: The problem is we only ever look
  • 39:05 - 39:11
    at one specific moment in time to see
    whether data is being shared, and what
  • 39:11 - 39:18
    was good today might be bad tomorrow. What
    was bad yesterday might be good today.
  • 39:18 - 39:25
    Although, I was in Argentina recently
    speaking to a group of feminist activists,
  • 39:25 - 39:32
    and they have been developing a
    menstruation tracking app. And the app was
  • 39:32 - 39:38
    removed from the Google Play store because
    it had illustrations that were deemed
  • 39:38 - 39:42
    pornographic. But they were illustrations
    around medical-related stuff. So even
  • 39:42 - 39:45
    people, who were trying to do the right
    thing, going through the open source
  • 39:45 - 39:50
    channels are still fighting a completely
    different issue when it comes to
  • 39:50 - 39:53
    menstruation tracking.
    It's a very fine line.
  • 39:53 - 39:57
    Herald: Um, three.
    inaudible
  • 39:57 - 40:02
    Eva: Sorry, can't hear - the mic's not
    working.
  • 40:02 - 40:05
    Herald: Microphone three.
    Mic 3: Test.
  • 40:05 - 40:10
    Eva: Yeah, it's great - perfect.
    Mic 3: I was wondering if the graph API
  • 40:10 - 40:17
    endpoint was actually in place to track
    menstruation data or is it more like a
  • 40:17 - 40:23
    general purpose advertisement
    tracking thing or. Yeah.
  • 40:23 - 40:29
    Chris: So my understanding is that there's
    two broad kinds of data that Facebook gets
  • 40:29 - 40:36
    - automated app events that Facebook is
    aware of: so app open, app close, app
  • 40:36 - 40:42
    install, relinking. Relinking is quite an
    important one for Facebook. That way they
  • 40:42 - 40:45
    check to see whether you already have a
    Facebook account logged in, to link the app
  • 40:45 - 40:50
    to your Facebook account.
    There's also a load of custom events that
  • 40:50 - 40:55
    the app developers can put in. These are
    then collated back to a data set - I would
  • 40:55 - 41:02
    imagine on the other side. So when it
    comes to things like whether it's nausea
  • 41:02 - 41:06
    or some of the other health issues, it is
    actually being cross-referenced by the
  • 41:06 - 41:12
    developer. Does that answer your question?
    Mic 3: Yes, thank you.
  • 41:12 - 41:16
    Herald: Five, microphone five.
    Mic 5: Can you repeat what you said in the
  • 41:16 - 41:23
    beginning about the menstruation apps used
    in Europe, especially Clue and the Period
  • 41:23 - 41:30
    Tracker?
    Eva: Yeah. So those are the most
    popular apps actually across the world,
  • 41:30 - 41:35
    not just in Europe and the US. A lot of
    them in terms of like the traffic analysis
  • 41:35 - 41:41
    stage, a lot of them have now cleaned up
    their app. So we can't see any data
  • 41:41 - 41:46
    sharing happening at that stage. But as I
    said, I can't be vouching for them and
  • 41:46 - 41:50
    saying, oh, yeah, those are safe and fine
    to use because we don't know what's
  • 41:50 - 41:54
    actually happening to the data once it's
    been collected by the app. All we can say
  • 41:54 - 42:02
    is that as far as the research we've done
    goes, we didn't see any data being shared.
  • 42:02 - 42:07
    Chris: Those apps you mentioned have been
    investigated by The Wall Street Journal
  • 42:07 - 42:12
    and The New York Times relatively
    recently. So they've been.. had quite like
  • 42:12 - 42:16
    a spotlight on them. So they've had to
    really up their game in a lot of ways,
  • 42:16 - 42:21
    which we would like everyone to do. But as
    Eva says, we don't know what else they
  • 42:21 - 42:25
    might be doing with that data on their
    side, not necessarily between the phone
  • 42:25 - 42:29
    and the server but from their server to
    another server.
  • 42:29 - 42:33
    Herald: Microphone one.
    Mic 1: Hi. Thank you for the insightful
  • 42:33 - 42:38
    talk. I have a question that goes in a
    similar direction. Do you know whether or
  • 42:38 - 42:44
    not these apps, even if they adhere to
    GDPR rules collect the data to then at a
  • 42:44 - 42:49
    later point at least sell it to the
    highest bidder? Because a lot of them are
  • 42:49 - 42:53
    free to use. And I wonder what is their
    main goal besides that?
  • 42:53 - 42:58
    Eva: I mean, the advertisement is how
    they make profit. And so, I mean, the
  • 42:58 - 43:04
    whole question about them trying to know
    if you're pregnant or not is that this
  • 43:04 - 43:12
    information can eventually be - you know -
    be monetized through, you know, through
  • 43:12 - 43:17
    how they target the advertisement at you.
    Actually when you're using those apps, you
  • 43:17 - 43:20
    can see in some of the slides, like you're
    constantly like being flooded with like
  • 43:20 - 43:26
    all sorts of advertisement on the app, you
    know, whether they are selling it
  • 43:26 - 43:31
    externally or not - I can't tell. But what
    I can tell is, yeah, your business model
  • 43:31 - 43:35
    is advertisement and so they are deriving
    profit from the data they collect.
  • 43:35 - 43:40
    Absolutely.
    Herald: Again, on microphone one.
  • 43:40 - 43:45
    Mic 1: Thank you. I was wondering if there
    was more of a big data kind of aspect to
  • 43:45 - 43:50
    it as well, because this is really
    interesting medical information on women’s
  • 43:50 - 43:55
    cycles in general.
    Eva: Yeah, and the answer is, like, I call
  • 43:55 - 43:58
    it—this is a bit of a black box and
    especially in the way, for example, that
  • 43:58 - 44:03
    Facebook is using this data like we don't
    know. We can assume that this is like part
  • 44:03 - 44:07
    of the … we could assume this is part of
    the profiling that Facebook does of both
  • 44:07 - 44:13
    their users and their non-users. But the
    way the way this data is actually
  • 44:13 - 44:20
    processed also by those apps through data
    brokers and so on, it’s a bit of a black
  • 44:20 - 44:28
    box.
    Herald: Microphone 1.
  • 44:28 - 44:32
    Mic 1: Yeah. Thank you a lot for your talk
    and I have two completely different
  • 44:32 - 44:38
    questions. The first one is: you've been
    focusing a lot on advertising and how this
  • 44:38 - 44:45
    data is used to sell to advertisers. But I
    mean, like whether you aim to be pregnant or not.
  • 44:45 - 44:49
    It's like it has to be the best kept
    secret, at least in Switzerland for any
  • 44:49 - 44:54
    female person, because like if you also
    want to get employed, your employer must
  • 44:54 - 45:00
    not know whether or not you want to get
    pregnant. And so I would like to ask,
  • 45:00 - 45:06
    like, how likely is it that this kind of
    data is also potentially sold to employers
  • 45:06 - 45:12
    who may want to poke into your health and
    reproductive situation? And then my other
  • 45:12 - 45:17
    question is entirely different, because we
    also know that female health is one of the
  • 45:17 - 45:22
    least researched topics around, and that's
    actually a huge problem. Like so little is
  • 45:22 - 45:28
    actually known about female health and the
    kind of data that these apps collect is
  • 45:28 - 45:34
    actually a gold mine to advance research
    on health issues that are specific for
  • 45:34 - 45:39
    certain bodies like female bodies. And so
    I would also like to know like how would
  • 45:39 - 45:44
    it be possible to still gather this kind
    of data and still to collect it, but use
  • 45:44 - 45:48
    it for like a beneficial purpose, like it
    to improve knowledge on these issues?
  • 45:48 - 45:54
    Eva: Sure. So to answer your first
    question, the answer will be similar to
  • 45:54 - 45:58
    the previous answer I gave, which is, you
    know, it's black box problem. It's like
  • 45:58 - 46:02
    it's very difficult to know exactly, you
    know, what's actually happening to this
  • 46:02 - 46:09
    data. Obviously, GDPR is there to prevent
    something from happening. But as we've
  • 46:09 - 46:18
    seen from these apps, like they were, you
    know, toeing a very blurry line. And so
  • 46:18 - 46:22
    the risk, obviously, of … this is
    something that can’t be relia…. I can't be
  • 46:22 - 46:26
    saying, oh, this is happening because I
    have no evidence that this is happening.
  • 46:26 - 46:32
    But obviously, the risks are multiple: the
    risk of like employers, as you say, the
  • 46:32 - 46:36
    insurance companies that could get it,
    that political parties could get it and
  • 46:36 - 46:41
    target their messages based on information
    they have about your mood, about, you
  • 46:41 - 46:45
    know, even the fact that you're trying to
    start a family. So, yeah, there is a very
  • 46:45 - 46:50
    broad range of risk. The advertisement we
    know for sure is happening because this is
  • 46:50 - 46:56
    like the basis of their business model.
    The risk, the range of risk is very, very
  • 46:56 - 47:00
    broad.
    Chris: To just expand on that: Again, as
  • 47:00 - 47:05
    Eva said, we can't point out a specific
    example of any of this. But if you look at
  • 47:05 - 47:10
    some of the other data brokers, Experian
    as a data broker: they collect...
  • 47:10 - 47:16
    They have a statutory role. In the UK it
    has a statutory job of being a credit
  • 47:16 - 47:24
    reference agency, but they also run what
    is believed to be a data enrichment service.
  • 47:24 - 47:29
    One of the things employers could do
    is buy Experian data when hiring
  • 47:29 - 47:36
    staff. Like, I can't say that this data
    ever ends up there. But, you know, as they
  • 47:36 - 47:41
    all collect, there are people collecting
    data and using it for some level of
  • 47:41 - 47:45
    auditing.
    Eva: And to answer your second question.
  • 47:45 - 47:50
    I think this is a very important problem
    you point out is the question of data
  • 47:50 - 47:56
    inequality and whose data gets collected
    for what purpose. I do quite a
  • 47:56 - 48:01
    lot of work on delivery of state services.
    For example, when there are populations
  • 48:01 - 48:06
    that are isolated, not using technology
    and so on. You might just be missing out
  • 48:06 - 48:12
    on people, for example, who should be in
    need of health care or state
  • 48:12 - 48:18
    support and so on. Just because you lack
    data about them. And so, female
  • 48:18 - 48:24
    health is obviously a very key issue. We
    just, we literally lack sufficient health
  • 48:24 - 48:31
    data about women, on women's health
    specifically. Now, in terms of how data is
  • 48:31 - 48:36
    processed in medical research, then
    there are actually protocols in place
  • 48:36 - 48:40
    normally to ensure consent, to
    ensure explicit consent, to ensure that
  • 48:40 - 48:47
    the data is properly collected. And so I
    think I wouldn't want... what I mean is that,
  • 48:47 - 48:52
    just because of the way these apps have been
    collecting data... You know, if there's
  • 48:52 - 48:57
    one thing to take out of this talk,
    it's that it's been nothing short of
  • 48:57 - 49:02
    horrifying, really, that data is being
    collected and shared before you
  • 49:02 - 49:06
    even get to consent to anything. I
    wouldn't trust any of these private
  • 49:06 - 49:16
    companies to really be the ones carrying out,
    well, taking part in medical research
  • 49:16 - 49:23
    or anything like that. So I agree with you that
    there is a need for better and more data
  • 49:23 - 49:29
    on women's health. But I don't think. I
    don't think any of these actors so far
  • 49:29 - 49:34
    have proved they can be trusted on this issue.
    Herald: Microphone 2.
  • 49:34 - 49:37
    Mic 2: Yeah. Thank you for this great
    talk. Um. Short question. What do you
  • 49:37 - 49:42
    think is the rationale of, uh, these
    menstruation apps to integrate the
  • 49:42 - 49:46
    Facebook SDK if they don't get money from
    Facebook? OK, uh. Being able to
  • 49:46 - 49:54
    commercialize this data.
    Chris: Good question. Um, it could be a
  • 49:54 - 50:01
    mix of things. So sometimes it's literally
    that the developers just have
  • 50:01 - 50:05
    this as part of their tool chain, their
    workflow when they're developing apps. I
  • 50:05 - 50:08
    don't necessarily know about these two
    period trackers whether other apps are
  • 50:08 - 50:14
    developed by these companies. But, uh, in
    our previous work, which I
  • 50:14 - 50:19
    presented last year, we found that some
    companies just produce a load of apps and
  • 50:19 - 50:23
    they just use the same tool chain every
    time. That includes, by default, the
  • 50:23 - 50:30
    Facebook SDK as part of the tool chain. Uh,
    some of them are like included for what I
  • 50:30 - 50:34
    would regard as genuine purposes. Like
    they want their users to share something
  • 50:34 - 50:38
    or they want their users to be able to log
    in with Facebook, and in those cases, they
  • 50:38 - 50:42
    include it for what would be regarded as a
    legitimate reason. But a lot of them just don't
  • 50:42 - 50:48
    ever actually... they haven't integrated it
    beyond that and they don't ever really
  • 50:48 - 50:52
    use anything of it other than that. I mean
    that there are a lot of developers simply
  • 50:52 - 51:02
    quite unaware of how verbose the default state
    is and how it sends data to Facebook.
  • 51:02 - 51:06
    Herald: Yeah. Maybe we can close with one
    last question from me. Um, you tested
  • 51:06 - 51:12
    a bunch of apps. How many of them
    do certificate pinning? Uh, do we see this as a
  • 51:12 - 51:17
    widespread policy or...
    Chris: Uh, they just don't really, yet. I
  • 51:17 - 51:22
    would have a problem doing an analysis
    where stuff could've been pinned. As I say,
  • 51:22 - 51:29
    TLS 1.3 is proving to be
    more problematic than pinning. Uh, yeah.
  • 51:29 - 51:32
    Herald: Ok, well, thank you so much. And,
    uh. Yeah.
  • 51:32 - 51:41
    Applause
  • 51:41 - 51:44
    36C3 Postroll music
  • 51:44 - 52:08
    Subtitles created by c3subtitles.de
    in the year 2020. Join, and help us!