0:00:00.000,0:00:19.770
36C3 preroll music
0:00:19.770,0:00:25.070
Herald: It is my honor to introduce you[br]today to Eva and Chris. Eva, she is a
0:00:25.070,0:00:29.440
senior researcher at Privacy[br]International. She works on gender,
0:00:29.440,0:00:34.680
economic and social rights and how they[br]interplay with the right to privacy,
0:00:34.680,0:00:40.430
especially in marginalized communities.[br]Chris, he is the
0:00:40.430,0:00:46.370
technology lead at Privacy International.[br]And his day-to-day job is to expose
0:00:46.370,0:00:51.290
companies and how they profit from[br]individuals, and specifically today they
0:00:51.290,0:00:59.230
will tell us how these companies can even[br]profit from your menstruation. Thank you.
0:00:59.230,0:01:00.470
Chris: Thank you.
0:01:00.470,0:01:05.200
applause
0:01:05.200,0:01:13.860
C: Hi, everyone. It's nice to be back at[br]CCC. I was at CCC last year. If you heard
0:01:13.860,0:01:18.580
my talk from last year, this is going to[br]be like a slightly vague part 2. And if
0:01:18.580,0:01:21.680
you haven't, I'm just gonna give you a very[br]brief recap because there is a
0:01:21.680,0:01:28.380
relationship between the two. So, I will[br]give you a little bit of background about
0:01:28.380,0:01:32.540
how this project started. Then we get to a[br]little bit about menstruation apps and
0:01:32.540,0:01:38.040
what a menstruation app actually is. Let[br]me talk a little bit through some of the
0:01:38.040,0:01:42.250
data that these apps are collecting,[br]and talk about how we did our research, our
0:01:42.250,0:01:48.390
research methodology and then what our[br]findings are and our conclusions. So last
0:01:48.390,0:01:54.640
year, I and a colleague did a project[br]around how Facebook collects data about
0:01:54.640,0:02:03.670
users on Android devices using the Android[br]Facebook SDK. And this is whether you have
0:02:03.670,0:02:09.540
a Facebook account or not. And for that[br]project, we really looked when you first
0:02:09.540,0:02:13.740
opened apps and didn't really have to do[br]very much interaction with them
0:02:13.740,0:02:23.560
particularly about the automatic sending[br]of data in a post-GDPR context. And so we
0:02:23.560,0:02:30.170
looked at a load of apps for that project,[br]including a couple of period trackers. And
0:02:30.170,0:02:36.820
that kind of led onto this project because[br]we were seeing loads of apps, across
0:02:36.820,0:02:42.820
different areas of categories. So we[br]thought we'd like hone in a little bit on
0:02:42.820,0:02:48.570
period trackers to see what kind of data,[br]because they're by far more sensitive than
0:02:48.570,0:02:52.600
many of the other apps on there, like you[br]might consider your music history to be
0:02:52.600,0:03:03.690
very sensitive.... laughs So. Yeah. So,[br]just a quick update on the previous work
0:03:03.690,0:03:11.850
from last year. We actually followed up[br]with all of the companies from that, from
0:03:11.850,0:03:17.450
that report. And by the end of like going[br]through multiple rounds of response, over
0:03:17.450,0:03:22.410
60 percent of them changed practices,[br]either by disabling the Facebook SDK in
0:03:22.410,0:03:30.699
their app or by disabling it until you[br]gave consent or removing it entirely. So I
0:03:30.699,0:03:35.690
pass over to Eva Blum-Dumontet. She's[br]going to talk you through menstruation
0:03:35.690,0:03:38.850
apps.[br]Eva: So I just want to make sure that
0:03:38.850,0:03:42.310
we're all on the same page. Although if[br]you didn't know what a menstruation app is
0:03:42.310,0:03:47.790
and you still bothered coming to this[br]talk, I'm extremely grateful. So how many
0:03:47.790,0:03:53.540
of you are using a menstruation app or[br]have a partner who's been using a
0:03:53.540,0:03:58.330
menstruation app? Oh my God. Oh, okay. I[br]didn't expect that. I thought it was going
0:03:58.330,0:04:03.440
to be much less. Okay. Well, for the few[br]of you who still might not know what a
0:04:03.440,0:04:07.670
menstruation app is, I'm still going to go[br]quickly through what a menstruation app
0:04:07.670,0:04:15.520
is. It's the idea of a menstruation app.[br]We also call them period tracker. It's to
0:04:15.520,0:04:21.500
have an app that tracks your menstruation[br]cycle. So that they tell you what days
0:04:21.500,0:04:26.720
you're most fertile. And you can[br]obviously, if you're using them to try and
0:04:26.720,0:04:32.840
get pregnant or if you have, for example,[br]a painful period, you can sort of plan
0:04:32.840,0:04:39.660
accordingly. So those are essentially the[br]two main reasons users would be
0:04:39.660,0:04:48.470
looking into using menstruation apps:[br]pregnancy, period tracking. Now, how did
0:04:48.470,0:04:53.880
this research start? As Chris said,[br]obviously there was a whole body of research that
0:04:53.880,0:05:01.270
had been done by Privacy International[br]last year on various apps. And as Chris
0:05:01.270,0:05:08.660
also already said what I was particularly[br]interested in was the kind of data that
0:05:08.660,0:05:13.220
menstruation apps are collecting, because[br]as we'll explain in this talk, it's really
0:05:13.220,0:05:21.800
actually not just limited to menstruation[br]cycle. And so I was interested in seeing
0:05:21.800,0:05:26.820
what actually happens to the data when it[br]is being shared. So I should say we're
0:05:26.820,0:05:31.530
really standing on the shoulders of giants[br]when it comes to this research. There was
0:05:31.530,0:05:35.660
previously existing research on[br]menstruation apps that was done by a
0:05:35.660,0:05:40.930
partner organization, Coding Rights in[br]Brazil. So they had done research on the
0:05:40.930,0:05:46.690
kind of data that was collected by[br]menstruation apps and the granularity of
0:05:46.690,0:05:52.080
this data. And a very interesting thing[br]that they were looking at was the gender
0:05:52.080,0:05:59.030
normativity of those apps. Chris and I[br]have been looking at, you know, dozens of
0:05:59.030,0:06:03.280
these apps and, you know, they have[br]various data sharing practices, as we'll
0:06:03.280,0:06:07.870
explain in this talk. But one[br]thing that all of them have in common is
0:06:07.870,0:06:16.150
that they are all pink. The other thing is[br]that they talk to their users as women.
0:06:16.150,0:06:20.550
They, you know, don't even sort of[br]contemplate the fact that maybe not all their
0:06:20.550,0:06:30.280
users are women. So there is a very sort[br]of narrow perspective of pregnancy
0:06:30.280,0:06:41.020
and female bodies and how female[br]sexuality functions. Now, as I was saying,
0:06:41.020,0:06:45.060
when you're using a menstruation app, it's[br]not just your menstruation cycle that
0:06:45.060,0:06:55.330
you're entering. So these are some of the[br]questions that menstruation apps ask.
0:06:55.330,0:07:01.090
Sex: there is a lot about sex that they[br]want to know. How often? Is it protected
0:07:01.090,0:07:08.420
or unprotected? Are you smoking? Are you[br]drinking? Are you partying? How often? We
0:07:08.420,0:07:16.880
even had one app that was asking about[br]masturbation, your sleeping pattern, your
0:07:16.880,0:07:22.930
coffee drinking habits. One thing that's[br]really interesting is that - and we'll
0:07:22.930,0:07:28.910
talk a little bit more again about this[br]later - but there's a very strong data
0:07:28.910,0:07:34.071
protection law in Europe called the GDPR, as[br]most of you will know. And it says that
0:07:34.071,0:07:38.419
only data that's strictly necessary should[br]be collected. So I'm still unclear what
0:07:38.419,0:07:46.980
masturbation has to do with tracking your[br]menstruation cycle, but... Another thing
0:07:46.980,0:07:56.480
that was collected is about your health[br]and the reason health is so important is
0:07:56.480,0:07:59.980
also related to data protection laws[br]because when you're collecting health
0:07:59.980,0:08:04.730
data, you need to show that you're taking[br]an extra step to collect this data because
0:08:04.730,0:08:11.460
it's considered sensitive personal data.[br]So extra steps in terms of getting
0:08:11.460,0:08:17.170
explicit consent from the users but also[br]through steps on behalf of the data
0:08:17.170,0:08:22.060
controller, in terms of showing that[br]they're making extra steps for the
0:08:22.060,0:08:28.790
security of this data. So this is the type[br]of question that was asked. There is so
0:08:28.790,0:08:34.560
much asked about vaginal discharge and[br]what kind of vaginal discharge you get
0:08:34.560,0:08:39.879
with all sorts of weird adjectives for[br]this: "sticky, creamy". So yeah, they
0:08:39.879,0:08:49.070
clearly thought a lot about this. And it[br]is a lot about mood as well. Even, yeah, I
0:08:49.070,0:08:56.190
didn't know 'romantic' was a mood but[br]apparently it is. And what's interesting
0:08:56.190,0:09:01.900
obviously about mood in the context where,[br]you know, we've seen stories like
0:09:01.900,0:09:07.000
Cambridge Analytica, for example. So we[br]know how much companies, we know how much
0:09:07.000,0:09:11.940
political parties are trying to understand[br]how we think, how we feel. So that's
0:09:11.940,0:09:17.490
actually quite significant that you have[br]an app that's collecting information about
0:09:17.490,0:09:24.110
how we feel on a daily basis. And[br]obviously, like when people enter all
0:09:24.110,0:09:29.200
these data, their expectation at that[br]point is that the data stays between
0:09:29.200,0:09:35.481
them and the app. And actually,[br]there is very little in the privacy policy
0:09:35.481,0:09:41.930
that would suggest[br]otherwise. So this is the moment where I
0:09:41.930,0:09:45.710
actually should say we're not making this[br]up; like literally everything in this list
0:09:45.710,0:09:51.750
of questions were things, literal terms,[br]that they were asking. So we set out to
0:09:51.750,0:09:55.400
look at the most popular menstruation[br]apps. Do you want to carry on?
0:09:55.400,0:09:59.840
Chris: Yeah. I forgot to introduce myself[br]as well. Really? That's a terrible
0:09:59.840,0:10:02.440
speaking habit.[br]Eva: Christopher Weatherhead..
0:10:02.440,0:10:08.740
Chris: .. Privacy International's[br]technology lead. So yeah.. What I said
0:10:08.740,0:10:11.580
about our previous research, we have[br]actually looked at most of the very
0:10:11.580,0:10:17.990
popular menstruation apps, the ones that[br]have hundreds of thousands of downloads.
0:10:17.990,0:10:21.910
And these apps - as we were saying,[br]this kind of work has been done before. A
0:10:21.910,0:10:25.560
lot of these apps have come in for quite a[br]lot of criticism - I'll spare you the free
0:10:25.560,0:10:30.460
advertising about which ones particularly[br]but most of them don't do anything
0:10:30.460,0:10:36.500
particularly outrageous, at least between[br]the app and the developers' servers. A lot
0:10:36.500,0:10:39.470
of them don't share with third parties at[br]that stage. So you can't look between the
0:10:39.470,0:10:43.850
app and the server to see what they're[br]sharing. They might be sharing data from
0:10:43.850,0:10:48.270
the developers' server to Facebook or to[br]other places but at least you can't see
0:10:48.270,0:10:55.600
in-between. But we're an international[br]organization and we work around the globe.
0:10:55.600,0:11:01.260
And most of the apps that get the most[br]downloads are particularly Western, U.S.,
0:11:01.260,0:11:07.700
European but they're not the most popular[br]apps necessarily in a context like India
0:11:07.700,0:11:12.810
and the Philippines and Latin America. So[br]we thought we'd have a look and see those
0:11:12.810,0:11:17.330
Apps. They're all available in Europe but[br]they're not necessarily the most popular
0:11:17.330,0:11:23.330
in Europe. And this is where things[br]started getting interesting. So what
0:11:23.330,0:11:29.520
exactly did we do? Well, we started off by[br]triaging through a large number of period
0:11:29.520,0:11:36.270
trackers. And as Eva said earlier: every[br]logo must be pink. And we were just kind
0:11:36.270,0:11:40.420
of looking through to see how many[br]trackers - this is using Exodus
0:11:40.420,0:11:46.600
Privacy. We have our own instance at PI[br]and we just looked through to see how many
0:11:46.600,0:11:50.780
trackers and who the trackers were. So,[br]for example, this is Maya, which is
0:11:50.780,0:11:54.519
exceptionally popular in India,[br]predominantly - it's made by an Indian
0:11:54.519,0:12:01.050
company. And as you can see, it's got a[br]large number of trackers in it:
0:12:01.050,0:12:09.230
CleverTap, Facebook, Flurry, Google and[br]InMobi. So we went through this process and
0:12:09.230,0:12:14.780
this allowed us to cut down... There's[br]hundreds of period trackers. Not all of
0:12:14.780,0:12:18.769
them are necessarily bad but it's nice to[br]try to see which ones had the most
0:12:18.769,0:12:24.500
trackers, where they were used, and try to[br]just triage them a little bit.
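For anyone who wants to reproduce that triage step, here is a minimal sketch in the spirit of what Exodus Privacy does, not its actual implementation: it simply looks for a handful of known tracker package names inside an APK's dex files. The signature list below is an illustrative assumption, a tiny subset of the real tracker database.

    # Rough triage: look for known tracker package names inside an APK's dex files.
    # The signature list is a tiny illustrative subset, not Exodus Privacy's database.
    import sys
    import zipfile

    TRACKER_SIGNATURES = {
        "Facebook SDK": b"com/facebook/appevents",
        "CleverTap":    b"com/clevertap/android",
        "AppsFlyer":    b"com/appsflyer",
        "Flurry":       b"com/flurry/android",
        "InMobi":       b"com/inmobi",
    }

    def find_trackers(apk_path):
        found = set()
        with zipfile.ZipFile(apk_path) as apk:
            for name in apk.namelist():
                # Dalvik bytecode lives in classes.dex, classes2.dex, ...
                if name.startswith("classes") and name.endswith(".dex"):
                    dex = apk.read(name)
                    for tracker, needle in TRACKER_SIGNATURES.items():
                        if needle in dex:
                            found.add(tracker)
        return sorted(found)

    if __name__ == "__main__":
        for apk in sys.argv[1:]:
            print(apk, "->", ", ".join(find_trackers(apk)) or "no known trackers")

Run it against one or more downloaded APKs, e.g. python find_trackers.py maya.apk, to get a quick list of embedded third-party SDKs before doing any traffic interception.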
0:12:24.500,0:12:33.190
From this, we then ran them through PI's[br]interception environment, which is a VM that I've made.
0:12:33.190,0:12:37.410
I actually made it last year for the talk[br]I gave last year. And I said I'd release
0:12:37.410,0:12:40.620
it after the talk, and it took me like three[br]months to release it, but it's now
0:12:40.620,0:12:45.420
available. You can go onto PI's website[br]and download it. It's a man in the middle
0:12:45.420,0:12:52.860
proxy with a few settings - mainly for[br]looking at iOS and Android apps to do data
0:12:52.860,0:12:59.210
interception between them. And so we ran them[br]through that and we got to have a look at
0:12:59.210,0:13:05.030
all the data that's being sent to and from[br]both the app developer and third parties.
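The interception side can be approximated with a small mitmproxy addon. This is a sketch of the idea, not PI's actual environment, and the third-party domain list is an assumption for illustration.

    # Minimal mitmproxy addon: print any request an app makes to known third-party
    # analytics/advertising endpoints, plus the form or body data it carries.
    # Run with:  mitmdump -s third_party_log.py
    from mitmproxy import http

    # Illustrative domain list, not PI's actual configuration.
    THIRD_PARTY_DOMAINS = (
        "graph.facebook.com",
        "appsflyer.com",
        "wzrkt.com",       # CleverTap (formerly WizRocket) endpoint
        "flurry.com",
    )

    def request(flow: http.HTTPFlow) -> None:
        host = flow.request.pretty_host
        if any(host.endswith(d) for d in THIRD_PARTY_DOMAINS):
            print(f"[3rd party] {flow.request.method} {flow.request.pretty_url}")
            if flow.request.urlencoded_form:
                # Form-encoded bodies, as the Facebook SDK typically sends
                for key, value in flow.request.urlencoded_form.items(multi=True):
                    print(f"    {key} = {value[:200]}")
            elif flow.request.text:
                print(f"    body: {flow.request.text[:500]}")

With the proxy's CA trusted on the test device and the device's traffic routed through mitmproxy, every matching request is printed along with the data it carries.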
0:13:05.030,0:13:10.810
And here's what we found.[br]Eva: So out of the six apps we looked at,
0:13:10.810,0:13:17.920
five shared data with Facebook. Out of[br]those five, three pinged Facebook to let
0:13:17.920,0:13:23.990
them know when their users were[br]downloading the app and opening the app.
0:13:23.990,0:13:29.759
And that's already quite significant[br]information and we'll get to that later.
0:13:29.759,0:13:37.060
Now, what's actually interesting and the[br]focus of our report was on the two apps that
0:13:37.060,0:13:42.040
shared every single piece of information[br]that their users entered with Facebook and
0:13:42.040,0:13:49.820
other third parties. So just to brief you:[br]the two apps we focused on are both called
0:13:49.820,0:13:55.330
Maya. So that's all very helpful. One is[br]spelled Maya: M-a-y-a. The other one's
0:13:55.330,0:14:01.100
spelled Mia: M-i-a. So, yeah, just bear with
0:14:01.100,0:14:09.800
confusing. But so initially we'll focus on[br]Maya, which is - as Chris mentioned - an
0:14:09.800,0:14:16.190
app that's based in India. They have a[br]user base of several million. They are
0:14:16.190,0:14:27.080
based in India. User base mostly in India,[br]also quite popular in the Philippines. So
0:14:27.080,0:14:30.470
what's interesting with Maya is that they[br]start sharing data with Facebook before
0:14:30.470,0:14:34.800
you even get to agree to their privacy[br]policy. So I should say already about the
0:14:34.800,0:14:39.320
privacy policy of a lot of those apps that[br]we looked at is that they are literally
0:14:39.320,0:14:48.380
the definition of small print. It's very[br]hard to read. It's legalese language. It
0:14:48.380,0:14:53.620
really puts into perspective the whole[br]question of consent in GDPR because GDPR
0:14:53.620,0:14:58.209
says the consent must be informed.[br]So you must be able to understand what
0:14:58.209,0:15:03.950
you're consenting to. When you're reading[br]this extremely long, extremely opaque
0:15:03.950,0:15:09.069
privacy policies of a lot - literally all[br]the menstruation apps we've looked at,
0:15:09.069,0:15:14.310
excluding one that didn't even bother[br]putting up a privacy policy, actually.
0:15:14.310,0:15:20.360
It's opaque. It's very hard to understand,[br]and they absolutely, definitely do not say
0:15:20.360,0:15:25.480
that they're sharing information with[br]Facebook. As I said, data sharing happened
0:15:25.480,0:15:29.740
before you get to agree to their privacy[br]policy. The other thing that's also worth
0:15:29.740,0:15:33.490
remembering is that when they share[br]information with Facebook, it doesn't matter
0:15:33.490,0:15:39.180
if you have a Facebook account or not, the[br]information is still being relayed. The other
0:15:39.180,0:15:43.720
interesting thing that you'll notice as[br]well in several of the slides is that the
0:15:43.720,0:15:48.760
information that's being shared is tied to[br]your identity through your unique ID
0:15:48.760,0:15:54.640
identifiers, also your email address. But[br]basically most of the questions we got
0:15:54.640,0:16:00.220
when we released the research was like:[br]oh, if I use a fake email address or if I
0:16:00.220,0:16:06.079
use a fake name, is that OK? Well, it's[br]not, because whether or not you have a Facebook
0:16:06.079,0:16:13.089
account, through your unique identifier[br]they would definitely be able to trace you
0:16:13.089,0:16:21.810
back. There is no way to actually[br]anonymize this process unless - well, in
0:16:21.810,0:16:27.420
the end, unless you're deliberately trying to[br]trick it and use a separate phone.
0:16:27.420,0:16:34.040
Basically, for regular users, it's quite[br]difficult. So this is what it looks like
0:16:34.040,0:16:41.620
when you enter the data. So as I said, I[br]didn't lie to you. This is the kind of
0:16:41.620,0:16:49.340
questions they're asking you. And this is[br]what it looks like when it's being shared
0:16:49.340,0:16:54.930
with Facebook. So you see the symptoms[br]changing, for example, like blood
0:16:54.930,0:17:00.339
pressure, swelling, acne - that's all being[br]shipped to graph.facebook.com,
0:17:00.339,0:17:06.350
through the Facebook SDK. This is what it[br]looks like when they show you
0:17:06.350,0:17:11.729
contraceptive practice, so again, like[br]we're talking health data. Here we're
0:17:11.729,0:17:17.890
talking sensitive data. We're talking[br]about data that should normally require
0:17:17.890,0:17:22.309
extra steps in terms of collecting it, in[br]terms of how it's being processed. But
0:17:22.309,0:17:28.840
nope, in this case it was shared exactly[br]like the rest. This is what it looks like.
0:17:28.840,0:17:33.709
Well, so, yeah with sex life it was a[br]little bit different. So that's what it
0:17:33.709,0:17:37.511
looks like when they're asking you about,[br]you know, you just had sex, was it
0:17:37.511,0:17:44.550
protected? Was it unprotected? The way it[br]was shared with Facebook was a little bit
0:17:44.550,0:17:51.490
cryptic, so to speak. So if you have[br]protected sex, it was entered as love "2",
0:17:51.490,0:17:57.779
unprotected sex was entered as Love "3". I[br]managed to figure that out pretty quickly.
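For context, the Facebook SDK delivers these entries as "app events" POSTed to graph.facebook.com. The snippet below is a rough reconstruction in the publicly documented App Events format; apart from the "Love" value taken from the slides, every field value is a made-up placeholder for illustration, not the app's exact payload.

    import json, time

    # Rough reconstruction of a Facebook SDK custom app event as it would be
    # POSTed to https://graph.facebook.com/<app_id>/activities (App Events API).
    # Only the "Love" value comes from the talk's slides; everything else is a placeholder.
    custom_events = [{
        "_eventName": "Love",        # the app's label for the sex entry
        "_logTime": int(time.time()),
        "value": "2",                # "2" = protected sex, "3" = unprotected sex
    }]

    payload = {
        "event": "CUSTOM_APP_EVENTS",
        "advertiser_id": "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",  # device advertising ID (placeholder)
        "advertiser_tracking_enabled": "1",
        "application_tracking_enabled": "1",
        "custom_events": json.dumps(custom_events),
    }

    print(json.dumps(payload, indent=2))

The point is that the event is tied to a persistent advertising identifier, which is why a fake name or email address does not make the entry anonymous.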
0:17:57.779,0:18:07.000
So it's not so cryptic. That's also quite[br]funny. So Maya had a diary section where
0:18:07.000,0:18:12.920
they encourage people to enter their[br]notes and personal thoughts. And I
0:18:12.920,0:18:18.680
mean, it's a menstruation app so you can[br]sort of get the idea of what people are
0:18:18.680,0:18:21.899
going to be writing down in there or[br]expected to write on. It's not going to be
0:18:21.899,0:18:26.429
their shopping list, although shopping[br]lists could also be personal, sensitive,
0:18:26.429,0:18:33.049
personal information, but.. So we were[br]wondering what would happen if you were to
0:18:33.049,0:18:38.429
write in this diary and how this[br]data would be processed. So we entered
0:18:38.429,0:18:42.379
literally - we entered something very[br]sensitive here. This is what we
0:18:42.379,0:18:53.409
wrote. And literally everything we wrote[br]was shared with Facebook. Maya also shared
0:18:53.409,0:18:58.080
your health data, not just with Facebook,[br]but with a company called CleverTap that's
0:18:58.080,0:19:05.440
based in California. So what's CleverTap?[br]CleverTap is a data broker, basically.
0:19:05.440,0:19:11.520
It's a company that - sort of similar to[br]Facebook with the Facebook SDK. They
0:19:11.520,0:19:16.950
expect of developers to hand over the data[br]and in exchange app developers get
0:19:16.950,0:19:23.679
insights about like how people use the[br]app, what time of day. You know, the age
0:19:23.679,0:19:30.789
of their users. They get all sorts of[br]information and analytics out of the data
0:19:30.789,0:19:38.889
they share with this company. It took us[br]some time to figure it out because it
0:19:38.889,0:19:43.020
shared as... wicked wizard?[br]Chris: WizRocket.
0:19:43.020,0:19:50.009
Eva: WizRocket, yeah. But that's[br]exactly the same. Everything that was
0:19:50.009,0:19:57.340
shared with Facebook was also shared with[br]CleverTap again, with the email address
0:19:57.340,0:20:04.989
that we were using - everything. Let's[br]shift. Now, let's look at the other Mia.
0:20:04.989,0:20:10.110
It's not just the name that's similar,[br]it's also the data sharing practices. Mia
0:20:10.110,0:20:18.320
is based in Cyprus, so in the European Union.[br]I should say, in all cases, regardless of
0:20:18.320,0:20:22.120
where the company is based, the moment[br]that they market the product in European
0:20:22.120,0:20:29.460
Union, so like literally every app we[br]looked at, they need to - well they should
0:20:29.460,0:20:40.479
respect GDPR. Our European data protection[br]law. Now, the first thing that Mia asked
0:20:40.479,0:20:44.940
when you started the app and again - I'll[br]get to that later about the significance
0:20:44.940,0:20:49.710
of this - is why you're using the app or[br]you using it to try and get pregnant or
0:20:49.710,0:20:55.879
are you just using it to try to track your[br]periods? Now, it's interesting because it
0:20:55.879,0:21:00.070
doesn't change at all the way you interact[br]with the app eventually. The app stays
0:21:00.070,0:21:05.179
exactly the same. But this is actually the[br]most important kind of data. This is
0:21:05.179,0:21:11.419
literally called the germ of data[br]collection. It's trying to know when a
0:21:11.419,0:21:15.970
woman is trying to get pregnant or not. So[br]the reason this is the first question they
0:21:15.970,0:21:21.389
ask is, well my guess on this is - they[br]want to make sure that like even if you
0:21:21.389,0:21:25.630
don't actually use the app that's at least[br]that much information they can collect
0:21:25.630,0:21:31.510
about you. And so this information was[br]shared immediately with Facebook and with
0:21:31.510,0:21:36.529
AppsFlyer. AppsFlyer is very similar to[br]CleverTap in the way it works. It's also a
0:21:36.529,0:21:44.470
company that collects data from these apps[br]and that as services in terms of analytics
0:21:44.470,0:21:54.479
and insights into user behavior. It's[br]based in Israel. So this is what it looks
0:21:54.479,0:22:04.710
like when you enter the information. Yeah,[br]masturbation, pill. What kind of pill
0:22:04.710,0:22:10.760
you're taking, your lifestyle habits. Now[br]where it's slightly different is that the
0:22:10.760,0:22:15.960
information doesn't immediately get shared[br]with Facebook but based on the information
0:22:15.960,0:22:22.559
you enter, you get articles that are[br]tailored for you. So, for example, like
0:22:22.559,0:22:27.359
when you select masturbation, you will[br]get, you know, masturbation: what you want
0:22:27.359,0:22:35.850
to know but are ashamed to ask. Now,[br]what's eventually shared with Facebook is
0:22:35.850,0:22:43.159
actually the kind of article that's being[br]offered to you. So basically, yes, the
0:22:43.159,0:22:50.220
information is shared indirectly because[br]then you know you have Facebook and...
0:22:50.220,0:22:52.929
You've just entered masturbation because[br]you're getting an article about
0:22:52.929,0:22:58.940
masturbation. So this is what happened[br]when you enter alcohol. So expected
0:22:58.940,0:23:02.630
effects of alcohol on a woman's body.[br]That's what happened when you enter
0:23:02.630,0:23:06.149
"unprotected sex". So effectively, all the[br]information is still shared just
0:23:06.149,0:23:14.440
indirectly through the articles you're[br]getting. Yeah. Last thing also, I should
0:23:14.440,0:23:18.449
say on this, in terms of the articles that[br]you're getting, is that sometimes there
0:23:18.449,0:23:23.489
was sort of also kind of like crossing the[br]data.. was like.. so the articles will be
0:23:23.489,0:23:30.479
about like: oh, you have cramps outside of[br]your periods, for example, like during
0:23:30.479,0:23:37.070
your fertile phase. And so you will get[br]the article specifically for this and the
0:23:37.070,0:23:42.559
information that's shared with Facebook[br]and with AppsFlyer is that this person is
0:23:42.559,0:23:49.470
in their fertile period in this phase of[br]their cycles and having cramps. Now, why
0:23:49.470,0:23:52.370
are menstruation apps so obsessed with[br]finding out if you're trying to get
0:23:52.370,0:23:59.840
pregnant? And so, this goes back to a lot[br]of the things I mentioned before that, you
0:23:59.840,0:24:04.039
know, about wanting to know in the very[br]first place if you're trying to get
0:24:04.039,0:24:10.260
pregnant or not. And also, this is[br]probably why a lot of those apps are
0:24:10.260,0:24:16.729
trying to really nail down in their[br]language and discourse how you're using
0:24:16.729,0:24:23.169
the apps for. When a person is pregnant,[br]their purchasing habits, their consumer
0:24:23.169,0:24:29.910
habits change. Obviously, you know, you[br]buy not only for yourself but you start
0:24:29.910,0:24:36.669
buying for others as well. But also you're[br]buying new things you've never purchased
0:24:36.669,0:24:41.549
before. So while for a regular person it would be[br]quite difficult to change their purchasing
0:24:41.549,0:24:47.549
habits, for a person that's pregnant,[br]advertisers will be really keen
0:24:47.549,0:24:52.869
to target them because this is a point of[br]their life where their habits change and
0:24:52.869,0:24:58.440
where they can be more easily influenced[br]one way or another. So in other words,
0:24:58.440,0:25:03.960
it's pink advertising time. In other more[br]words and pictures, there's research done
0:25:03.960,0:25:12.119
in 2014 in the US that was trying to sort[br]of evaluate the value of data for a
0:25:12.119,0:25:19.320
person. So an average American person[br]that's not pregnant was worth 10 cents. A person
0:25:19.320,0:25:29.250
who's pregnant would be worth one dollar fifty.[br]So you may have noticed we were using the past
0:25:29.250,0:25:33.020
tense when we talked about - well I hope I[br]did when I was speaking definitely into
0:25:33.020,0:25:38.359
the lights at least - we used the past[br]tense when we talked about the data sharing of
0:25:38.359,0:25:43.330
these apps. That's because both Maya and[br]MIA, which were the two apps we were
0:25:43.330,0:25:47.980
really targeting with this report, stopped[br]using the Facebook SDK when we wrote to
0:25:47.980,0:25:51.089
them about our research[br]before we published it.
0:25:51.089,0:26:00.789
applause[br]So it was quite nice because they didn't
0:26:00.789,0:26:05.690
even rely on us actually publishing[br]the report. It was merely at the stage of
0:26:05.690,0:26:09.979
like, hey, this is our right of reply.[br]We're gonna be publishing this. Do you
0:26:09.979,0:26:13.549
have anything to say about this? And[br]essentially what they had to say is like:
0:26:13.549,0:26:21.260
"Yep, sorry, apologies. We are stopping[br]this." I think, you know.. What's really
0:26:21.260,0:26:27.529
interesting as well to me about like the[br]how quick the response was is.. it really
0:26:27.529,0:26:34.159
shows how this is not a vital service for[br]them. This is a plus. This is something
0:26:34.159,0:26:41.679
that's a useful tool. But the fact that[br]they immediately could just stop using it,
0:26:41.679,0:26:48.269
I think really shows that, you know, it[br]was... I wouldn't say a lazy practice, but
0:26:48.269,0:26:53.169
it's a case of: as long as no one's[br]complaining, then you are going to carry
0:26:53.169,0:27:00.299
on using it. And I think that was also the[br]case with your research. There was
0:27:00.299,0:27:02.709
also a lot that changed[br]their behaviors after.
0:27:02.709,0:27:06.499
Chris: A lot of the developers sometimes[br]don't even realize necessarily what data
0:27:06.499,0:27:12.009
they end up sharing with people like[br]Facebook, with people like CleverTap. They
0:27:12.009,0:27:16.649
just integrate the SDK and[br]hope for the best.
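That default matters: unless a developer explicitly sets the documented manifest flag com.facebook.sdk.AutoLogAppEventsEnabled to false, the SDK logs app events automatically. A quick, crude way to check an APK for that flag, assuming the Android SDK's aapt tool is installed and on the PATH (a sketch, not how PI audited the apps):

    # Check whether an APK explicitly turns off Facebook SDK auto-logging by looking
    # for the documented manifest flag com.facebook.sdk.AutoLogAppEventsEnabled.
    # Assumes the Android build tool "aapt" is installed and on the PATH.
    import subprocess
    import sys

    FLAG = "com.facebook.sdk.AutoLogAppEventsEnabled"

    def autologging_disabled(apk_path: str) -> bool:
        manifest = subprocess.run(
            ["aapt", "dump", "xmltree", apk_path, "AndroidManifest.xml"],
            capture_output=True, text=True, check=True,
        ).stdout
        if FLAG not in manifest:
            return False   # flag absent: the SDK default (auto-logging on) applies
        # Crude check: look at the value printed right after the flag name;
        # boolean meta-data shows up as 0x0 (false) or as the string "false".
        tail = manifest.split(FLAG, 1)[1][:300].lower()
        return "false" in tail or "0x0" in tail

    if __name__ == "__main__":
        print("auto-logging explicitly disabled:", autologging_disabled(sys.argv[1]))

If the flag is missing, the developer has almost certainly left the SDK in its default, verbose state.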
0:27:16.649,0:27:22.249
Eva: We also got this interesting response[br]from AppsFlyer, which is very
0:27:22.249,0:27:26.899
hypocritical. Essentially, what they're[br]saying is like oh, like we specifically
0:27:26.899,0:27:33.549
ask our customers to not[br]share health data with us, specifically for
0:27:33.549,0:27:37.679
the reason I mentioned earlier, which is[br]that because of GDPR, you're normally
0:27:37.679,0:27:44.519
expected to take extra steps when you[br]process sensitive health data. So their
0:27:44.519,0:27:48.809
response is that they ask their customers to[br]not share health data or sensitive
0:27:48.809,0:27:54.900
personal data so they don't become liable[br]in terms of the law. So they were like,
0:27:54.900,0:27:59.909
oh, we're sorry, like this is a breach of[br]contract. Now, the reason it's very
0:27:59.909,0:28:04.289
hypocritical is that obviously when you[br]have contracts with menstruation apps and
0:28:04.289,0:28:07.860
actually Maya was not the only[br]menstruation app that they were working with -
0:28:07.860,0:28:12.230
I mean, you know, what can you generally[br]expect in terms of the kind of data you're
0:28:12.230,0:28:19.139
gonna receive? So here's a conclusion for[br]us: this research works. It's fun, it's
0:28:19.139,0:28:26.979
easy to do. You know, Chris has now[br]published the environment. It doesn't
0:28:26.979,0:28:32.539
actually - once the environment is sort of[br]set up it doesn't actually require
0:28:32.539,0:28:36.820
technical background, as you saw from the[br]slides it's pretty straightforward to
0:28:36.820,0:28:41.959
actually understand how the data is being[br]shared. So you should do it, too. But more
0:28:41.959,0:28:46.989
broadly, we think it's really important to[br]do more research, not just at this stage
0:28:46.989,0:28:54.269
of the process, but generally about the[br]security and the data sharing
0:28:54.269,0:29:00.139
practices of apps, because, you know, it's[br]how more and more people are
0:29:00.139,0:29:05.679
using or interacting with technology and[br]using the Internet. So we need to think
0:29:05.679,0:29:10.510
much more carefully about the security[br]implication of the apps we use and
0:29:10.510,0:29:15.639
obviously it works. Thank you.
0:29:15.639,0:29:25.369
applause
0:29:25.369,0:29:29.519
Herald: Thank you. So, yeah, please line[br]up in front of the microphones. We can
0:29:29.519,0:29:33.869
start with microphone two.[br]Mic 2: Hi. Thank you. So you mentioned
0:29:33.869,0:29:39.119
that now we can check whether our data is[br]being shared with third parties on the
0:29:39.119,0:29:42.460
path between the user and the developer.[br]But we cannot know for all the other apps
0:29:42.460,0:29:46.279
and for these, whether it's being[br]shared later from the developer, from the
0:29:46.279,0:29:51.859
company, to other companies. Have you[br]conceptualized some ways of testing that?
0:29:51.859,0:29:55.659
Is it possible?[br]Chris: Yes. So you could do a data
0:29:55.659,0:30:03.979
subject access request under the GDPR that[br]would... like the problem is it's quite
0:30:03.979,0:30:11.299
hard to necessarily know how[br]- how the system outside of the app-to-
0:30:11.299,0:30:16.139
server relationship processes that data,[br]and so it is
0:30:16.139,0:30:20.309
quite opaque. They might apply a different[br]identifier too, they might do other
0:30:20.309,0:30:23.859
manipulations to that data so trying to[br]track down and prove this bit of data
0:30:23.859,0:30:28.700
belongs to you is quite challenging.[br]Eva: This is something we're going to try.
0:30:28.700,0:30:32.070
We're going to be doing it in 2020, actually.[br]We're going to be doing data subject
0:30:32.070,0:30:38.330
access requests to those apps that we've[br]been looking at, to see if we find anything
0:30:38.330,0:30:43.549
both under GDPR but also under different[br]data protection laws in different
0:30:43.549,0:30:49.980
countries. To see basically what we get,[br]how much we can obtain from that.
0:30:49.980,0:30:54.960
Herald: So I'd go with the Signal Angel.[br]Signal Angel: So what advice can you give us on
0:30:54.960,0:31:00.330
how we can make people understand that[br]from a privacy perspective, it's better to
0:31:00.330,0:31:05.280
use pen and paper instead of entering[br]sensitive data into any of these apps?
0:31:05.280,0:31:10.440
Eva: I definitely wouldn't advise that. I[br]wouldn't advise pen and paper. I think for
0:31:10.440,0:31:17.359
us like really the key... The work we are[br]doing is not actually targeting users.
0:31:17.359,0:31:21.280
It's targeting companies. We think it's[br]companies that really need to do better.
0:31:21.280,0:31:26.269
We're often asked about, you know, advice to[br]customers or advice to users and
0:31:26.269,0:31:32.029
consumers. But what I think and what we've[br]been telling companies as well is that,
0:31:32.029,0:31:36.190
you know, their users trust you and they[br]have the right to trust you. They also
0:31:36.190,0:31:40.969
have the right to expect that you're[br]respecting the law. The European Union has
0:31:40.969,0:31:47.429
a very ambitious legislation when it comes[br]to privacy with GDPR. And so the least
0:31:47.429,0:31:55.950
they can expect is that you're respecting[br]the law. And so, no, I would ... and this
0:31:55.950,0:31:59.539
is the thing, I think people have the[br]right to use those apps, they have the
0:31:59.539,0:32:03.850
right to say, well, this is a useful[br]service for me. It's really companies that
0:32:03.850,0:32:08.210
need to change. They need to up their game. They[br]need to live up to the expectations of
0:32:08.210,0:32:15.600
their consumers. Not the other way around.[br]Herald: Microphone 1.
0:32:15.600,0:32:19.219
Mic 1: Hi. So from the talk, it seems and[br]I think that's what you get, you mostly
0:32:19.219,0:32:23.320
focused on Android based apps. Can you[br]maybe comment on what the situation is
0:32:23.320,0:32:27.219
with iOS? Is there any technical[br]difficulty or is it anything completely
0:32:27.219,0:32:30.719
different with respect to these apps and[br]apps in general?
0:32:30.719,0:32:33.669
Chris: There's not really a technical[br]difficulty like the setup a little bit
0:32:33.669,0:32:38.799
different, but functionally you can look[br]at the same kind of data. The focus here,
0:32:38.799,0:32:44.960
though, is also.. So it's two-fold in some[br]respects. Most of the places that these
0:32:44.960,0:32:49.940
apps are used are heavily dominated[br]Android territories, places like India,
0:32:49.940,0:32:55.529
the Philippines. iOS penetration there,[br]uh, Apple device penetration there is very
0:32:55.529,0:33:01.979
low. There's no technical reason not to[br]look at Apple devices. But like in this
0:33:01.979,0:33:06.779
particular context, it's not necessarily[br]hugely relevant. So does that answer your
0:33:06.779,0:33:08.989
question?[br]Mic 1: And technically with your set-up,
0:33:08.989,0:33:12.060
you could also do the same[br]analysis with an iOS device?
0:33:12.060,0:33:17.339
Chris: Yeah. As I said it's a little bit[br]of a change to how you... You have to
0:33:17.339,0:33:22.489
register the device as an MDM dev.. like a[br]mobile profile device. Otherwise you can
0:33:22.489,0:33:30.809
do the exact same level of interception.[br]Mic: Uh, hi. My question is actually
0:33:30.809,0:33:33.210
related to the last question[br]is a little bit technical.
0:33:33.210,0:33:35.619
Chris: Sure.[br]Mic: I'm also doing some research on apps
0:33:35.619,0:33:39.539
and I've noticed with the newest versions[br]of Android that they're making it more
0:33:39.539,0:33:44.289
difficult to install custom certificates[br]to have this pass- through and check what
0:33:44.289,0:33:49.070
the apps are actually communicating to[br]their home servers. Have you found a way to
0:33:49.070,0:33:54.029
make this easier?[br]Chris: Yes. So we actually hit the same
0:33:54.029,0:34:01.539
issue as you in some respects. So the[br]installing of custom certificates was not
0:34:01.539,0:34:05.550
really an obstacle because you can add them to[br]the user store or, if it's a rooted device, you can
0:34:05.550,0:34:13.510
add them to the system store and they are[br]trusted by all the apps on the device. The
0:34:13.510,0:34:19.330
problem we're now hitting is that Android 9[br]and 10 have TLS 1.3, and TLS 1.3
0:34:19.330,0:34:24.340
detects a man in the middle, or at[br]least it tries to, and might terminate the
0:34:24.340,0:34:28.760
connection. Uh, this is a bit of a[br]problem. So currently all our research is
0:34:28.760,0:34:37.490
still running on Android 8.1 devices. This[br]isn't going to be sustainable long term.
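For reference, the system-store installation Chris describes for rooted devices on those older Android versions usually boils down to renaming the mitmproxy CA to its OpenSSL subject hash and pushing it into /system/etc/security/cacerts. A sketch, assuming openssl and adb are available and adb root works on the test device; this is the commonly documented procedure, not necessarily PI's exact setup:

    # Install the mitmproxy CA into the Android *system* trust store on a rooted
    # device, so apps that only trust system CAs accept the interception proxy.
    import subprocess

    CA = "mitmproxy-ca-cert.pem"   # default mitmproxy CA, usually in ~/.mitmproxy/

    def sh(*cmd):
        return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout.strip()

    # Android names system CAs after the certificate's old-style subject hash.
    cert_hash = sh("openssl", "x509", "-inform", "PEM", "-subject_hash_old",
                   "-noout", "-in", CA)
    target = f"/system/etc/security/cacerts/{cert_hash}.0"

    sh("adb", "root")
    sh("adb", "remount")                        # make /system writable
    sh("adb", "push", CA, target)
    sh("adb", "shell", "chmod", "644", target)
    sh("adb", "reboot")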
0:34:37.490,0:34:43.210
Herald: Um, 4.[br]Mic 4: Hey, thank you for the great talk.
0:34:43.210,0:34:47.250
Your research is obviously targeted in a[br]constructive, critical way towards
0:34:47.250,0:34:53.250
companies that are making apps surrounding[br]menstrual research. Did you learn anything
0:34:53.250,0:34:57.210
from this context that you would want to[br]pass on to people who research this area
0:34:57.210,0:35:03.360
more generally? I'm thinking, for example,[br]of Paramount Corp in the US, who've done
0:35:03.360,0:35:07.700
micro dosing research on LSD and are[br]starting a breakout study on menstrual
0:35:07.700,0:35:12.080
issues.[br]Eva: Well, I think this is why I was
0:35:12.080,0:35:15.980
concluded on it. I think there's[br]still a lot of research that needs
0:35:15.980,0:35:21.090
to be done in terms of the sharing. And[br]obviously, I think anything that touches
0:35:21.090,0:35:27.830
on people's health is a key priority[br]because it's something people relate very
0:35:27.830,0:35:33.750
strongly to. The consequences, especially[br]in the US, for example, of sharing health
0:35:33.750,0:35:38.700
data like this, of having - you know -[br]data, even like your blood pressure and so
0:35:38.700,0:35:42.760
on. Like what are the consequences if[br]those informations are gonna be shared,
0:35:42.760,0:35:46.590
for example, with like insurance companies[br]and so on. This is what I think is
0:35:46.590,0:35:52.470
absolutely essential to have a better[br]understanding of the data collection and
0:35:52.470,0:35:57.570
sharing practices of the services. The[br]moments when you have health data that's
0:35:57.570,0:35:59.720
being involved.[br]Chris: .. yeah because we often focus
0:35:59.720,0:36:06.000
about this being an advertising issue. But[br]in that sense as well, insurance and even
0:36:06.000,0:36:09.950
credit referencing of all sorts of other[br]things become problematic, especially when
0:36:09.950,0:36:14.750
it comes to pregnancy related.[br]Eva: Yeah, even employers could be after
0:36:14.750,0:36:18.510
this kind of information.[br]Herald: Six.
0:36:18.510,0:36:24.450
Mic 6: Hi. I'm wondering if there is an[br]easy way or a tool which we can use to
0:36:24.450,0:36:32.580
detect if apps are using our data or are[br]reporting them to Facebook or whatever. Or
0:36:32.580,0:36:39.830
if we can even use those apps but block[br]this data from being reported to Facebook.
0:36:39.830,0:36:45.650
Chris: Yes. So, you can firewall off[br]graph.facebook.com and stop sending
0:36:45.650,0:36:51.770
data to that. There's a few issues here.[br]Firstly, it doesn't really like.. This
0:36:51.770,0:36:57.940
audience can do this. Most users don't[br]have the technical nuance to know what
0:36:57.940,0:37:02.390
needs to be blocked, what doesn't[br]necessarily need to be blocked. It's on
0:37:02.390,0:37:07.300
the companies to be careful with users[br]data. It's not up to the users to try and
0:37:07.300,0:37:13.500
defend against... It shouldn't be on the[br]user to defend against malicious data
0:37:13.500,0:37:17.490
sharing or...[br]Eva: You know... also one interesting
0:37:17.490,0:37:21.930
thing is that Facebook had put this in[br]place, like, where you could opt out
0:37:21.930,0:37:25.470
from data sharing with the apps you're[br]using, but that only works if you're a
0:37:25.470,0:37:29.840
Facebook user. And as I said, like this[br]data has been collected whether you are a
0:37:29.840,0:37:34.230
user or not. So in a sense, for people who[br]aren't Facebook users, they couldn't opt
0:37:34.230,0:37:37.720
out of this.[br]Chris: With the Facebook SDK the developers are
0:37:37.720,0:37:46.690
integrating, the default state for sharing[br]of data is on, the flag is true. And
0:37:46.690,0:37:56.480
although they have a long legal text on[br]the help pages for the developer tools,
0:37:56.480,0:38:00.540
it's like unless you have a decent[br]understanding of local data protection
0:38:00.540,0:38:04.890
practice or local data protection law, it's[br]not something that most
0:38:04.890,0:38:08.840
developers are gonna be able to understand[br]why this flag should be something
0:38:08.840,0:38:16.320
different from on. You know there's loads[br]of flags in the SDK, which flags should be
0:38:16.320,0:38:21.930
on and off, depending on which[br]jurisdiction you're selling to, or users
0:38:21.930,0:38:27.240
going to be in.[br]Herald: Signal Angel, again.
0:38:27.240,0:38:31.530
Signal Angel: Do you know any good apps which[br]don't share data and are privacy-friendly?
0:38:31.530,0:38:37.120
Probably even one that is open source.[br]Eva: So, I mean, as in the problem which
0:38:37.120,0:38:43.260
is why I wouldn't want to vouch for any[br]app is that even in the apps that, you
0:38:43.260,0:38:48.500
know, where in terms of like the traffic[br]analysis we've done, we didn't see any any
0:38:48.500,0:38:53.160
data sharing. As Chris was explaining, the[br]data can be shared at a later stage and
0:38:53.160,0:39:00.720
it'd be impossible for us to really find[br]out. So.. no, I can't be vouching for any
0:39:00.720,0:39:04.650
app. I don't know if you can...[br]Chris: The problem is we can't ever look
0:39:04.650,0:39:10.810
like one specific moment in time to see[br]whether data is being shared, unlike what
0:39:10.810,0:39:17.690
was good today might be bad tomorrow. What[br]was bad yesterday might be good today.
0:39:17.690,0:39:25.230
Although, I was in Argentina recently[br]speaking to a group of feminist activists,
0:39:25.230,0:39:31.860
and they have been developing a[br]menstruation tracking app. And the app was
0:39:31.860,0:39:37.800
removed from the Google Play store because[br]it had illustrations that were deemed
0:39:37.800,0:39:42.500
pornographic. But they were illustrations[br]around medical related stuff. So even
0:39:42.500,0:39:45.170
people, who were trying to do the right[br]thing, going through the open source
0:39:45.170,0:39:49.720
channels are still fighting a completely[br]different issue when it comes to
0:39:49.720,0:39:52.940
menstruation tracking.[br]It's a very fine line.
0:39:52.940,0:39:57.330
Herald: Um, three.[br]inaudible
0:39:57.330,0:40:01.770
Eva: Sorry, can't hear -the Mic's not[br]working.
0:40:01.770,0:40:04.790
Herald: Microphone three.[br]Mic 3: Test.
0:40:04.790,0:40:09.850
Eva: Yeah, it's great - perfect.[br]Mic 3: I was wondering if the graph API
0:40:09.850,0:40:16.560
endpoint was actually in place to track[br]menstruation data or is it more like a
0:40:16.560,0:40:22.970
general purpose advertisement[br]tracking thing or. Yeah.
0:40:22.970,0:40:29.360
Chris: So my understanding is that there's[br]two broad kinds of data that Facebook gets
0:40:29.360,0:40:35.970
as automated app events that Facebook were[br]aware of. So app open, app close, app
0:40:35.970,0:40:41.760
install, relinking. Relinking is quite an[br]important one for Facebook. That way they
0:40:41.760,0:40:44.940
check to see whether you already have a[br]Facebook account logged in to log the app
0:40:44.940,0:40:49.950
to your Facebook account when standing.[br]There's also a load of custom events that
0:40:49.950,0:40:55.400
the app developers can put in. There is[br]then collated back to a data set - I would
0:40:55.400,0:41:01.520
imagine on the other side. So when it[br]comes to things like whether it's nausea
0:41:01.520,0:41:06.390
or some of the other health issues, it is[br]actually being cross-referenced by the
0:41:06.390,0:41:11.820
developer. Does that answer your question?[br]Mic 3: Yes, thank you.
0:41:11.820,0:41:16.320
Herald: Five, microphone five.[br]Mic 5: Can you repeat what you said in the
0:41:16.320,0:41:23.290
beginning about the menstruation apps used[br]in Europe, especially Clue and the Period
0:41:23.290,0:41:29.860
Tracker? Yeah. So those are the most[br]popular apps actually across the world,
0:41:29.860,0:41:35.100
not just in Europe and the US. A lot of[br]them in terms of like the traffic analysis
0:41:35.100,0:41:40.980
stage, a lot of them have now cleaned up[br]their app. So we can't see any data
0:41:40.980,0:41:46.090
sharing happening at that stage. But as I[br]said, I can't be vouching for them and
0:41:46.090,0:41:49.680
saying, oh, yeah, those are safe and fine[br]to use because we don't know what's
0:41:49.680,0:41:54.310
actually happening to the data once it's[br]been collected by the app. All we can say
0:41:54.310,0:42:01.870
is that as far as the research we've done[br]goes, we didn't see any data being shed
0:42:01.870,0:42:06.750
Chris: Those apps you mentioned have been[br]investigated by The Wall Street Journal
0:42:06.750,0:42:11.790
and The New York Times relatively[br]recently. So they've been.. had quite like
0:42:11.790,0:42:15.720
a spotlight on them. So they've had to[br]really up their game and a lot of ways
0:42:15.720,0:42:20.590
which we would like everyone to do. But as[br]Eva says, we don't know what else they
0:42:20.590,0:42:24.740
might be doing with that data on their[br]side, not necessarily between the phone
0:42:24.740,0:42:29.150
and the server but from their server to[br]another server.
0:42:29.150,0:42:32.510
Herald: Microphone one.[br]Mic 1: Hi. Thank you for the insightful
0:42:32.510,0:42:37.620
talk. I have a question that goes in a[br]similar direction. Do you know whether or
0:42:37.620,0:42:44.080
not these apps, even if they adhere to[br]GDPR rules collect the data to then at a
0:42:44.080,0:42:48.850
later point at least sell it to the[br]highest bidder? Because a lot of them are
0:42:48.850,0:42:53.160
free to use. And I wonder what is their[br]main goal besides that?
0:42:53.160,0:42:58.440
Eva: I mean, the advertisement is how[br]they make profit. And so, I mean, the
0:42:58.440,0:43:04.450
whole question about them trying to know[br]if you're pregnant or not is that this
0:43:04.450,0:43:11.540
information can eventually be - you know -[br]be monetized through, you know, through
0:43:11.540,0:43:17.070
how they target the advertisement at you.[br]Actually when you're using those apps, you
0:43:17.070,0:43:20.340
can see in some of the slides, like you're[br]constantly like being flushed with like
0:43:20.340,0:43:25.630
all sorts of advertisement on the app, you[br]know, whether they are selling it
0:43:25.630,0:43:31.470
externally or not - I can't tell. But what[br]I can tell is, yeah, your business model
0:43:31.470,0:43:34.960
is advertisement and so they are deriving[br]profit from the data they collect.
0:43:34.960,0:43:40.410
Absolutely.[br]Herald: Again, on microphone one.
0:43:40.410,0:43:44.600
Mic 1: Thank you. I was wondering if there[br]was more of a big data kind of aspect to
0:43:44.600,0:43:50.080
it as well, because these are really[br]interesting medical information on women’s
0:43:50.080,0:43:54.560
cycles in general.[br]Eva: Yeah, and the answer is, like, I call
0:43:54.560,0:43:58.030
it—this is a bit of a black box and[br]especially in the way, for example, that
0:43:58.030,0:44:03.100
Facebook is using this data like we don't[br]know. We can assume that this is like part
0:44:03.100,0:44:07.280
of the … we could assume this is part of[br]the profiling that Facebook does of both
0:44:07.280,0:44:13.400
their users and their non-users. But the[br]way this data is actually
0:44:13.400,0:44:19.510
processed also by those apps through data[br]brokers and so on, it’s a bit of a black
0:44:19.510,0:44:27.530
box.[br]Herald: Microphone 1.
0:44:27.530,0:44:32.030
Mic 1: Yeah. Thank you a lot for your talk[br]and I have two completely different
0:44:32.030,0:44:37.630
questions. The first one is: you've been[br]focusing a lot on advertising and how this
0:44:37.630,0:44:44.940
data is used to sell to advertisers. But I[br]mean, like you aim to be pregnant or not.
0:44:44.940,0:44:48.810
It's like it has to be the best kept[br]secret, at least in Switzerland for any
0:44:48.810,0:44:54.430
female person, because like if you also[br]want to get employed, your employer must
0:44:54.430,0:44:59.740
not know whether or not you want to get[br]pregnant. And so I would like to ask,
0:44:59.740,0:45:06.230
like, how likely is it that this kind of[br]data is also potentially sold to employers
0:45:06.230,0:45:12.000
who may want to poke into your health and[br]reproductive situation? And then my other
0:45:12.000,0:45:17.290
question is entirely different, because we[br]also know that female health is one of the
0:45:17.290,0:45:22.220
least researched topics around, and that's[br]actually a huge problem. Like so little is
0:45:22.220,0:45:27.510
actually known about female health and the[br]kind of data that these apps collect is
0:45:27.510,0:45:34.310
actually a gold mine to advance research[br]on health issues that are specific for
0:45:34.310,0:45:38.920
certain bodies like female bodies. And so[br]I would also like to know like how would
0:45:38.920,0:45:43.860
it be possible to still gather this kind[br]of data and still to collect it, but use
0:45:43.860,0:45:48.490
it for like a beneficial purpose, like it[br]to improve knowledge on these issues?
0:45:48.490,0:45:53.690
Eva: Sure. So to answer your first[br]question, the answer will be similar to
0:45:53.690,0:45:58.300
the previous answer I gave, which is, you[br]know, it's black box problem. It's like
0:45:58.300,0:46:02.410
it's very difficult to know exactly, you[br]know, what's actually happening to this
0:46:02.410,0:46:08.570
data. Obviously, GDPR is there to prevent[br]something from happening. But as we've
0:46:08.570,0:46:17.890
seen from these apps, like they were, you[br]know, towing a very blurry line. And so
0:46:17.890,0:46:22.360
the risk, obviously, of … this is[br]something that can’t be relia…. I can't be
0:46:22.360,0:46:26.290
saying, oh, this is happening because I[br]have no evidence that this is happening.
0:46:26.290,0:46:31.760
But obviously, the risk of multiple, the[br]risk of like employers, as you say, the
0:46:31.760,0:46:36.490
insurance companies that could get it,[br]that political parties could get it and
0:46:36.490,0:46:40.960
target their messages based on information[br]they have about your mood, about, you
0:46:40.960,0:46:45.260
know, even the fact that you're trying to[br]start a family. So, yeah, there is a very
0:46:45.260,0:46:50.240
broad range of risk. The advertisement we[br]know for sure is happening because this is
0:46:50.240,0:46:55.850
like the basis of their business model.[br]The risk, the range of risk is very, very
0:46:55.850,0:46:59.940
broad.[br]Chris: To just expand on that: Again, as
0:46:59.940,0:47:05.430
Eva said, we can't point out a specific[br]example of any of this. But if you look at
0:47:05.430,0:47:10.260
some of the other data brokers - Experian,[br]as a data broker, they collect...
0:47:10.260,0:47:16.350
They have a statutory role. In the UK[br]it's a statutory job of being a credit
0:47:16.350,0:47:23.520
reference agency, but they also run what[br]is believed to be a data enrichment business.
0:47:23.520,0:47:29.200
One of the things employers could do[br]is buy Experian data when hiring
0:47:29.200,0:47:35.690
staff. Like, I can't say whether this data[br]ever ends up there. But, you know, as they
0:47:35.690,0:47:41.120
all collect, there are people collecting[br]data and using it for some level of
0:47:41.120,0:47:45.450
auditing.[br]Eva: And to answer your second question.
0:47:45.450,0:47:49.810
I think this is a very important problem[br]you point out is the question of data
0:47:49.810,0:47:56.230
inequality and whose data gets collected[br]for what purpose. There is I do quite a
0:47:56.230,0:48:01.100
lot of work on delivery of state services.[br]For example, when there are populations
0:48:01.100,0:48:05.940
that are isolated, not using technology[br]and so on. You might just be missing out
0:48:05.940,0:48:12.450
on people, for example, who should be in[br]need of health care or state
0:48:12.450,0:48:18.120
support and so on, just because you lack[br]data about them. And so, female
0:48:18.120,0:48:24.260
health is obviously a very key issue. We[br]just, we literally lack sufficient health
0:48:24.260,0:48:30.520
data about women, on women's health[br]specifically. Now, in terms of how data is
0:48:30.520,0:48:35.550
processed in medical research, then[br]there are actually protocols in place
0:48:35.550,0:48:40.470
normally to ensure, to ensure consent, to[br]ensure explicit consent, to ensure that
0:48:40.470,0:48:47.210
the data is properly collected. And so I[br]wouldn't want - just because these apps
0:48:47.210,0:48:52.010
have been collecting data... You know, if there's
0:48:52.010,0:48:56.980
one thing to take out of this talk,[br]it's that it's been nothing short of
0:48:56.980,0:49:02.370
horrifying, really: that data is being[br]collected and shared before you
0:49:02.370,0:49:06.320
even get to consent to anything. I[br]wouldn't trust any of these private
0:49:06.320,0:49:16.100
companies to really be the ones carrying out,[br]well, taking part in medical research
0:49:16.100,0:49:22.750
or on those. So I agree with you that[br]there is a need for better and more data
0:49:22.750,0:49:28.860
on women's health. But I don't think[br]any of these actors so far
0:49:28.860,0:49:33.900
have proved they can be trusted on this issue.[br]Herald: Microphone 2.
0:49:33.900,0:49:37.010
Mic 2: Yeah. Thank you for this great[br]talk. Um. Short question. What do you
0:49:37.010,0:49:42.280
think is the rationale of, uh, these[br]menstruation apps to integrate the
0:49:42.280,0:49:46.470
Facebook SDK if they don't get money from[br]Facebook? Or, uh, being able to
0:49:46.470,0:49:54.160
commercialize this data?[br]Chris: Good question. Um, it could be a
0:49:54.160,0:50:00.910
mix of things. So sometimes it's literally that[br]the developers just have
0:50:00.910,0:50:05.110
this as part of their tool chain their[br]workflow when they're developing apps. I
0:50:05.110,0:50:08.280
don't necessarily know about these two[br]period trackers whether other apps are
0:50:08.280,0:50:14.080
developed by these companies. But, uh, in[br]our previous work, which I
0:50:14.080,0:50:18.630
presented last year, we found that some[br]companies just produce a load of apps and
0:50:18.630,0:50:22.550
they just use the same tool chain every[br]time, and that includes by default the
0:50:22.550,0:50:29.550
Facebook SDK as part of the tool chain. Uh,[br]some of them are included for what I
0:50:29.550,0:50:34.270
would regard as genuine purposes. Like[br]they want their users to share something
0:50:34.270,0:50:37.780
or they want their users to be able to log[br]in with Facebook, and in those cases, it's
0:50:37.780,0:50:42.210
included for what would be regarded as a[br]legitimate reason, but then they just don't
0:50:42.210,0:50:47.760
ever actually... they haven't integrated it[br]beyond appearances and they don't ever really
0:50:47.760,0:50:52.070
use anything of it other than that. I mean[br]that there are a lot of developers simply
0:50:52.070,0:51:02.460
quite unaware that the default state is[br]verbose and of how it sends data to Facebook.
0:51:02.460,0:51:06.220
Herald: Yeah. Maybe we close with one[br]last question from me. Um, you did
0:51:06.220,0:51:12.120
quite a bunch of apps. How many of them[br]do certificate pinning? Uh, do we see this as a
0:51:12.120,0:51:16.920
widespread practice or...[br]Chris: They just don't, really, yet. I
0:51:16.920,0:51:21.930
would have a problem doing the analysis[br]if stuff had been pinned. As I say,
0:51:21.930,0:51:28.710
TLS 1.3 is proving to be[br]more problematic than pinning. Uh, yeah.
0:51:28.710,0:51:32.410
Herald: Ok, well, thank you so much. And,[br]uh. Yeah.
0:51:32.410,0:51:40.520
Applause
0:51:40.520,0:51:44.200
36C3 Postroll music
0:51:44.200,0:52:08.000
Subtitles created by c3subtitles.de[br]in the year 2020. Join, and help us!