36C3 preroll music
Herald: It is my honor to introduce you
today to Eva and Chris. Eva is a senior
researcher at Privacy International. She
works on gender, economic and social
rights and how they interplay with the
right to privacy, especially in
marginalized communities. Chris is the
technology lead at Privacy International,
and his day-to-day job is to expose
companies and how they profit from
individuals. Specifically, today they
will tell us how these companies can even
profit from your menstruation. Thank you.
Chris: Thank you.
applause
C: Hi, everyone. It's nice to be back at
CCC. I was at CCC last year. If you heard
my talk from last year, this is going to
be like a slightly vague part 2. And if
you didn't, I'm just gonna give you a very
brief recap, because there is a
relationship between the two. So, I will
give you a little bit of background about
how this project started. Then we get to a
little bit about menstruation apps and
what a menstruation app actually is. I'll
then talk a little bit through some of the
data that these apps are collecting, talk
about how we did our research, our
research methodology, and then what our
findings and conclusions are. So last
year, I and a colleague did a project
around how Facebook collects data about
users on Android devices using the Android
Facebook SDK - and this is whether you
have a Facebook account or not. For that
project, we really looked at when you
first open apps, without doing very much
interaction with them, particularly at
the automatic sending of data in a
post-GDPR context. We looked at a load of
apps for that project, including a couple
of period trackers. And that kind of led
onto this project, because we were seeing
loads of apps across different categories,
so we thought we'd hone in a little bit on
period trackers to see what kind of data
they share, because they're by far more
sensitive than many of the other apps on
there - like, you might consider your
music history to be very sensitive....
laughs So. Yeah. So, just a quick update
on the previous work from last year: we
actually followed up with all of the
companies from that report. And by the end
of going through multiple rounds of
responses, over 60 percent of them had
changed practices, either by disabling the
Facebook SDK in their app, or by disabling
it until you gave consent, or by removing
it entirely. So I pass over to Eva
Blum-Dumontet. She's going to talk you
through menstruation apps.
Eva: So I just want to make sure that
we're all on the same page. Although if
you didn't know what a menstruation app is
and you still bothered coming to this
talk, I'm extremely grateful. So how many
of you are using a menstruation app, or
have a partner who's been using a
menstruation app? Oh my God. Oh, okay. I
didn't expect that. I thought it was going
to be much less. Okay. Well, for the few
of you who still might not know what a
menstruation app is, I'm still going to go
quickly through what one is. The idea of a
menstruation app - we also call them
period trackers - is to have an app that
tracks your menstruation cycle, so that it
can tell you what days you're most
fertile. And if you're using them to try
and get pregnant, or if you have, for
example, painful periods, you can plan
accordingly. So those are essentially the
two main reasons users would be looking
into using menstruation apps: pregnancy,
period tracking. Now, how did this
research start? As Chris said, obviously
there was a whole piece of research that
had been done by Privacy International
last year on various apps. And as Chris
also already said, what I was particularly
interested in was the kind of data that
menstruation apps are collecting, because,
as we'll explain in this talk, it's really
actually not just limited to the
menstruation cycle. And so I was
interested in seeing what actually happens
to the data when it is being shared. I
should say we're really standing on the
shoulders of giants when it comes to this
research. There was previously existing
research on menstruation apps that was
done by a partner organization, Coding
Rights in Brazil. They had done research
on the kind of data that was collected by
menstruation apps and the granularity of
this data. And a very interesting thing
that they were looking at was the gender
normativity of those apps. Chris and I
have been looking at, you know, dozens of
these apps, and they have various data
sharing practices, as we'll explain in
this talk. But one thing that all of them
have in common is that they are all pink.
The other thing is that they talk to their
users as women. They don't even seem to
contemplate the fact that maybe not all
their users are women. So there is a very
narrow perspective of pregnancy and female
bodies and how female sexuality functions.
Now, as I was saying,
when you're using a menstruation app, it's
not just your menstruation cycle that
you're entering. So these are some of the
questions that menstruation apps ask.
Sex: there is a lot about sex that they
want to know. How often? Is it protected
or unprotected? Are you smoking? Are you
drinking? Are you partying? How often? We
even had one app that was asking about
masturbation, your sleeping patterns, your
coffee drinking habits. One thing that's
really interesting is that - and we'll
talk a little bit more again about this
later - but there's very strong data
protection laws in Europe called GDPR as
most of you will know. And it says that
only data that's strictly necessary should
be collected. So I'm still unclear what
masturbation has to do with tracking your
menstruation cycle, but... The other thing
that was collected is about your health,
and the reason health is so important is
also related to data protection laws,
because when you're collecting health
data, you need to show that you're taking
extra steps to collect this data, because
it's considered sensitive personal data.
So extra steps in terms of getting
explicit consent from the users, but also
steps on behalf of the data controller, in
terms of showing that they're taking extra
measures for the security of this data. So
this is the type of question that was
asked. There was so much asked about
vaginal discharge, and what kind of
vaginal discharge you get, with all sorts
of weird adjectives for this: "sticky,
creamy". So yeah, they clearly thought a
lot about this. And there is a lot about
mood as well. Even - yeah, I
didn't know 'romantic' was a mood but
apparently it is. And what's interesting
obviously about mood in the context where,
you know, we've seen stories like
Cambridge Analytica, for example. So we
know how much companies, we know how much
political parties are trying to understand
how we think, how we feel. So that's
actually quite significant that you have
an app that's collecting information about
how we feel on a daily basis. And
obviously, when people enter all this
data, their expectation at that point is
that the data stays between them and the
app. And actually, there is very little in
the privacy policies that would suggest
otherwise. So this is the moment where I
actually should say: we're not making this
up; literally everything in this list of
questions were things, literal terms, that
they were asking. So we set out to
look at the most popular menstruation
apps. Do you want to carry on?
Chris: Yeah. I forgot to introduce myself
as well. Really? That's a terrible
speaking habit.
Eva: Christopher Weatherhead..
Chris: .. Privacy International's
technology lead. So yeah, as I said about
our previous research, we have actually
looked at most of the very popular
menstruation apps, the ones that have
hundreds of thousands of downloads. And
these apps - as we were saying, this kind
of work has been done before, and a lot of
these apps have come in for quite a lot of
criticism. I'll spare you the free
advertising about which ones in
particular, but most of them don't do
anything particularly outrageous, at least
between the app and the developers'
servers. A lot of them don't share with
third parties at that stage, so you can't
look between the app and the server to see
what they're sharing: they might be
sharing data from the developers' server
to Facebook or to other places, but at
least you can't see in between. But we're
an international organization and we work
around the globe. And most of the apps
that get the most downloads are
particularly Western - U.S., European -
but they're not necessarily the most
popular apps in a context like India and
the Philippines and Latin America. So we
thought we'd have a look and see those
apps. They're all available in Europe, but
they're not necessarily the most popular
in Europe. And this is where things
started getting interesting. So what
exactly did we do? Well, we started off by
triaging a large number of period
trackers. And as Eva said earlier: every
logo must be pink. And we were just
looking through to see how many trackers
there were - this is using Exodus Privacy;
we have our own instance at PI - and we
just looked through to see how many
trackers and who the trackers were. So,
for example, this is Maya, which is
exceptionally popular in India
predominantly - it's made by an Indian
company. And as you can see, it's got a
large number of trackers in it: CleverTap,
Facebook, Flurry, Google and InMobi. So we
went through this process, and this
allowed us to cut down... There are
hundreds of period trackers. Not all of
them are necessarily bad, but it's nice to
try to see which ones had the most
trackers and where they were used, and to
just triage them a little bit.
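The triage step can be reproduced against an εxodus (Exodus Privacy) instance. A minimal sketch, assuming an instance exposing the public εxodus search API; the endpoint path, token header, and response shape are assumptions modeled on the public instance and may differ on a self-hosted one, and the package name is a placeholder:

```python
# Illustrative sketch of the triage step: querying an εxodus (Exodus
# Privacy) instance for the trackers found in an app. Endpoint path,
# auth scheme, and response shape are assumptions based on the public
# εxodus API and may differ per instance.
import requests

EXODUS = "https://reports.exodus-privacy.eu.org"  # or your own instance
TOKEN = "..."  # hypothetical API token for the instance

def tracker_ids(handle: str) -> list:
    """Return tracker IDs from the latest εxodus report for an app."""
    r = requests.get(
        f"{EXODUS}/api/search/{handle}",
        headers={"Authorization": f"Token {TOKEN}"},
        timeout=30,
    )
    r.raise_for_status()
    reports = r.json().get(handle, {}).get("reports", [])
    # Each report lists tracker IDs; resolving IDs to names would use
    # the instance's tracker listing endpoint (omitted for brevity).
    return reports[0]["trackers"] if reports else []

# Hypothetical package name; substitute the real app handles you triage.
print(tracker_ids("com.example.periodtracker"))
```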
From this, we then ran them through PI's
interception environment, which is a VM
that I've made. I actually made it last
year, for the talk I gave last year. And I
said I'd release it after the talk, and it
took me like three months to release it,
but it's now available: you can go onto
PI's website and download it. It's a
man-in-the-middle proxy with a few
settings, mainly for looking at iOS and
Android apps, to do data interception
between the app and its servers. And so we
ran them through that, and we got to have
a look at all the data that's being sent
to and from both the app developer and
third parties. And here's what we found.
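The interception environment is, at its core, a man-in-the-middle proxy. A minimal sketch of the same idea using mitmproxy - not PI's actual tooling, just an illustration, with example hostnames for the third parties discussed below:

```python
# Minimal sketch of the interception idea using mitmproxy - not PI's
# actual environment. The hostnames are examples of the third parties
# discussed in this talk.
# Run with:  mitmdump -s log_trackers.py
from mitmproxy import http

WATCHED = ("graph.facebook.com", "clevertap.com", "appsflyer.com")

def request(flow: http.HTTPFlow) -> None:
    """Log any request an app makes to a watched third-party host."""
    host = flow.request.pretty_host
    if host.endswith(WATCHED):
        print(f"--> {flow.request.method} {host}{flow.request.path}")
        body = flow.request.get_text(strict=False)
        if body:
            print(body[:2000])  # first part of the payload, if any
```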
Eva: So out of the six apps we looked at,
five shared data with Facebook. Out of
those five, three pinged Facebook to let
them know when their users were
downloading the app and opening the app.
And that's already quite significant
information, and we'll get to that later.
Now, what's actually interesting, and the
focus of our report, was the two apps that
shared every single piece of information
that their users entered with Facebook and
other third parties. So just to brief you:
the two apps we focused on are both called
"Maya". So that's all very helpful. One is
spelled Maya, M-a-y-a. The other one is
spelled Mia, M-i-a. So, yeah, just bear
with me, because this is actually quite
confusing. Initially we'll focus on Maya,
which is - as Chris mentioned - an app
that's based in India. They have a user
base of several million, mostly in India,
and they're also quite popular in the
Philippines. So what's interesting with
Maya is that they start sharing data with
Facebook before you even get to agree to
their privacy policy. And I should say
already, about the privacy policies of a
lot of those apps we looked at: they are
literally the definition of small print.
It's very
hard to read. It's legalese language. It
really puts into perspective the whole
question of consent in GDPR, because GDPR
says that consent must be informed: you
must be able to understand what you're
consenting to. These extremely long,
extremely opaque privacy policies - of
literally all the menstruation apps we've
looked at, excluding one that didn't even
bother having a privacy policy, actually -
are very hard to understand, and they
absolutely, definitely do not say that
they're sharing information with Facebook.
As I said, data sharing happened before
you get to agree to their privacy policy.
The other thing that's also worth
remembering is that when they share
information with Facebook, it doesn't
matter whether you have a Facebook account
or not: the information is still being
relayed. The other
interesting thing that you'll notice in
several of the slides is that the
information that's being shared is tied to
your identity through your unique
identifiers, and also your email address.
Basically, most of the questions we got
when we released the research were like:
oh, if I use a fake email address or if I
use a fake name, is that OK? Well, it's
not, because through your unique
identifier they would definitely be able
to trace you back, whether you have a
Facebook account or not. There is no way
to actually anonymize this process unless
you deliberately try to trick it and
basically use a separate phone; for
regular users, it's quite difficult. So
this is what it looks like
when you enter the data. So as I said, I
didn't lie to you: these are the kinds of
questions they're asking you. And this is
what it looks like when it's being shared
with Facebook. So you see the symptoms,
for example - blood pressure, swelling,
acne - that's all being shipped to
graph.facebook.com through the Facebook
SDK. This is what it looks like when they
share your contraceptive practice. So
again, we're talking health data here,
we're talking sensitive data. We're
talking about data that should normally
require extra steps in terms of collecting
it, in terms of how it's being processed.
But nope, in this case it was shared
exactly like the rest. This is what it
looks like.
Well, with sex life it was a little bit
different. So that's what it looks like
when they're asking you: you just had
sex - was it protected? Was it
unprotected? The way it was shared with
Facebook was a little bit cryptic, so to
speak. If you had protected sex, it was
entered as "Love 2"; unprotected sex was
entered as "Love 3". I managed to figure
that out pretty quickly, so it's not so
cryptic.
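A tiny illustrative decoder for the encoding described here; the two labels are the ones observed in the intercepted Maya traffic, while the helper itself is hypothetical:

```python
# The event labels described above, mapped back to their meanings.
# The labels come from the intercepted Maya traffic shown in the talk;
# this little decoder is purely illustrative.
LOVE_EVENTS = {
    "Love 2": "protected sex",
    "Love 3": "unprotected sex",
}

def decode(event: str) -> str:
    return LOVE_EVENTS.get(event, f"unknown event: {event!r}")

print(decode("Love 3"))  # -> unprotected sex
```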
That's also quite funny: Maya had a diary
section where they encourage people to
enter their notes and personal thoughts.
And I mean, it's a menstruation app, so
you can sort of get the idea of what
people are going to be writing down in
there, or are expected to write down. It's
not going to be their shopping list -
although shopping lists could also be
personal, sensitive information. So we
were wondering what would happen if you
were to write in this diary, and how this
data would be processed. So we entered
something very sensitive here. This is
what we wrote. And literally everything we
wrote was shared with Facebook. Maya also
shared
your health data not just with Facebook,
but with a company called CleverTap that's
based in California. So what's CleverTap?
CleverTap is a data broker, basically.
It's a company that - sort of similar to
Facebook with the Facebook SDK - expects
app developers to hand over the data, and
in exchange the app developers get
insights about how people use the app,
what time of day, you know, the age of
their users. They get all sorts of
information and analytics out of the data
they share with this company. It took us
some time to figure it out, because it was
shared as... WizRocket?
Chris: WizRocket.
Eva: WizRocket, yeah. But it's exactly the
same. Everything that was shared with
Facebook was also shared with CleverTap -
again with the email address that we were
using - everything. Let's
shift now and look at the other one, Mia.
It's not just the name that's similar,
it's also the data sharing practices. Mia
is based in Cyprus, so in the European
Union. I should say, in all cases,
regardless of where the company is based,
the moment they market the product in the
European Union - so literally every app we
looked at - they need to, well, they
should, respect GDPR, our European data
protection law. Now, the first thing that
Mia asks when you start the app - and
again, I'll get to the significance of
this later - is why you're using the app:
are you using it to try and get pregnant,
or are you just using it to track your
periods? Now, it's interesting, because it
doesn't change at all the way you interact
with the app eventually. The app stays
exactly the same. But this is actually the
most important kind of data. This is
literally the gem of data collection:
knowing whether a woman is trying to get
pregnant or not. So the reason this is the
first question they ask is - well, my
guess on this is - they want to make sure
that even if you don't actually use the
app, that's at least that much information
they can collect about you. And so this
information was
shared immediately with Facebook and with
AppsFlyer. AppsFlyer is very similar to
CleverTap in the way it works. It's also a
company that collects data from these apps
and that offers services in terms of
analytics and insights into user behavior.
It's based in Israel. So this is what it
looks like when you enter the information:
yeah, masturbation, pill - what kind of
pill you're taking - your lifestyle
habits. Now, where it's slightly different
is that the information doesn't
immediately get shared with Facebook;
instead, based on the information you
enter, you get articles that are tailored
for you. So, for example, when you select
masturbation, you will get, you know,
"Masturbation: what you want to know but
are ashamed to ask". Now, what's
eventually shared with Facebook is
actually the kind of article that's being
offered to you. So basically, yes, the
information is shared, just indirectly,
because then, you know, Facebook sees
you've just entered masturbation, because
you're getting an article about
masturbation. This is what happened when
you entered alcohol: "Expected effects of
alcohol on a woman's body". That's what
happened when you entered "unprotected
sex". So effectively, all the information
is still shared, just indirectly, through
the articles you're getting. Yeah. Last
thing also, I should
say on this, in terms of the articles
you're getting, is that sometimes there
was also a kind of cross-referencing of
the data. So the article will be about,
say: oh, you have cramps outside of your
period, for example during your fertile
phase. And so you will get the article
specifically for this, and the information
that's shared with Facebook and with
AppsFlyer is that this person is in the
fertile phase of their cycle and having
cramps. Now, why
are menstruation apps so obsessed with
finding out if you're trying to get
pregnant? This goes back to a lot of the
things I mentioned before, you know, about
wanting to know in the very first place
whether you're trying to get pregnant or
not. And this is probably also why a lot
of those apps are trying to really nail
down, in their language and discourse,
what you're using the app for. When a
person is pregnant, their purchasing
habits, their consumer habits, change.
Obviously, you know, you buy not only for
yourself, but you start buying for others
as well. But you're also buying new things
you've never purchased before. So while
for a regular person it would be quite
difficult to change their purchasing
habits, for a person that's pregnant -
advertisers will be really keen to target
them, because this is a point in their
life where their habits change and where
they can be more easily influenced one way
or another. So in other words, it's pink
advertising time. In other more words and
pictures: there's research done in 2014 in
the US that tried to evaluate the value of
a person's data. The data of an average
American who's not pregnant was worth 10
cents; a person who's pregnant would be
one dollar fifty.
So, you may have noticed we were using the
past tense when we talked about - well, I
hope I did when I was speaking; the slides
definitely did, at least - we used the
past tense when we talked about the data
sharing of these apps. That's because both
Maya and Mia, which were the two apps we
were really targeting with this report,
stopped using the Facebook SDK when we
wrote to them about our research,
before we published it.
applause
So that was quite nice, because they
didn't even wait for us to actually
publish the report. It was merely at the
stage of: hey, this is our right-of-
response letter, we're going to be
publishing this, do you have anything to
say about it? And essentially what they
had to say was: "Yep, sorry, apologies. We
are stopping this." What's really
interesting to me about how quick the
response was is that it really shows this
is not a vital service for them. It's a
plus, something that's a useful tool. But
the fact that they could immediately just
stop using it really shows that, you know,
it was... I wouldn't say a lazy practice,
but it's a case of: as long as no one's
complaining, you are going to carry on
using it. And I think that was also the
case with your research - a lot of them
also changed their behaviors after.
Chris: A lot of the developers sometimes
don't even necessarily realize what data
they end up sharing with people like
Facebook, with people like CleverTap. They
just integrate the SDK and
hope for the best.
Eva: We also got this interesting response
from AppsFlyer, which is very
hypocritical. Essentially, what they're
saying is: oh, we specifically ask our
customers to not share health data with
us - specifically for the reason I
mentioned earlier, which is that, because
of GDPR, you're normally expected to take
extra steps when you process sensitive
health data. So their response is that
they ask their customers to not share
health data or sensitive personal data, so
that they don't become liable under the
law. So they were like: oh, we're sorry,
this is a breach of contract. Now, the
reason this is very hypocritical is that,
obviously, when you have contracts with
menstruation apps - and Maya was not the
only menstruation app they were working
with - I mean, you know, what can you
generally expect in terms of the kind of
data you're going to receive?
gonna receive? So here's a conclusion for
us that research works. It's fun, it's
easy to do. You know, Chris has not
published the environment. It doesn't
actually - once the environment is sort of
set up it doesn't actually require
technical background, as you saw from the
slides it's pretty straightforward to
actually understand how the data is being
shared. So you should do it, too. But more
broadly, we think it's really important to
do more research, not just at this stage
of the process, but generally about the
security and the data and the data showing
practices of apps, because, you know, it's
hard law and more and more people are
using or interacting with technology and
using the Internet. So we need to do think
much more carefully about the security
implication of the apps we use and
obviously it works. Thank you.
applause
Herald: Thank you. So, yeah, please line
up in front of the microphones. We can
start with microphone two.
Mic 2: Hi. Thank you. So you mentioned
that now we can check whether our data is
being shared with third parties on the
path between the user and the developer.
But we cannot know, for all the other apps
and for these, whether it's being shared
later from the developer, from the
company, to other companies. Have you
conceptualized some ways of testing that?
Is it possible?
Chris: Yes. So you could do a data subject
access request under the GDPR. The problem
is it's quite hard to know how the data is
processed outside of the app-to-server
relationship; it is quite opaque. They
might apply a different identifier to it,
they might do other manipulations to that
data, so trying to track down and prove
that a bit of data belongs to you is quite
challenging.
Eva: This is something we're going to
try - we're going to be doing it in 2020,
actually. We're going to be doing data
subject access requests for those apps
that we've been looking at, to see if we
find anything, both under GDPR but also
under different data protection laws in
different countries - to see basically
what we get, how much we can obtain from
that.
Herald: So I'd go with the Signal Angel.
Signal Angel: So what advice can you give us on
how we can make people understand that
from a privacy perspective, it's better to
use pen and paper instead of entering
sensitive data into any of these apps?
Eva: I definitely wouldn't advise that. I
wouldn't advise pen and paper. I think for
us, really, the key... The work we are
doing is not actually targeting users;
it's targeting companies. We think it's
companies that really need to do better.
We're often asked about, you know, advice
to customers or advice to users and
consumers. But what I think, and what
we've been telling companies as well, is
that, you know, their users trust them,
and they have the right to trust them.
They also have the right to expect that
the companies are respecting the law. The
European Union has a very ambitious
legislation when it comes to privacy with
GDPR, and so the least users can expect is
that companies respect the law. And this
is the thing: I think people have the
right to use those apps; they have the
right to say, well, this is a useful
service for me. It's really the companies
that need to change. They need to up their
game. They need to live up to the
expectations of their consumers. Not the
other way around.
Herald: Microphone 1.
Mic 1: Hi. So from the talk, it seems -
and I think that's what you said - you
mostly focused on Android-based apps. Can
you maybe comment on what the situation is
with iOS? Is there any technical
difficulty, or is anything completely
different with respect to these apps and
apps in general?
Chris: There's not really a technical
difficulty - the setup is a little bit
different, but functionally you can look
at the same kind of data. The focus here,
though, is two-fold in some respects. Most
of the places where these apps are used
are heavily Android-dominated territories,
places like India and the Philippines.
Apple device penetration there is very
low. There's no technical reason not to
look at Apple devices, but in this
particular context it's not necessarily
hugely relevant. So, does that answer your
question?
Mic 1: And technically, with your setup,
you could also do the same
analysis with an iOS device?
Chris: Yeah. As I said, it's a little bit
of a change to how you... You have to
register the device as an MDM device, like
a mobile device management profile.
Otherwise you can do the exact same level
of interception.
Mic: Uh, hi. My question is actually
related to the last question;
it's a little bit technical.
Chris: Sure.
Mic: I'm also doing some research on apps,
and I've noticed with the newest versions
of Android that they're making it more
difficult to install custom certificates
to have this pass-through and check what
the apps are actually communicating to
their home servers. Have you found a way
to make this easier?
Chris: Yes. So we actually hit the same
issue as you, in some respects. The
installing of custom certificates was not
really an obstacle, because you can add
them to the user store, or - if it's a
rooted device - you can add them to the
system store, and then they are trusted by
all the apps on the device. The problem
we're now hitting is that Android 9 and 10
have TLS 1.3, and TLS 1.3 detects a man in
the middle - or at least it tries to - and
might terminate the connection. This is a
bit of a problem. So currently all our
research is still running on Android 8.1
devices. This isn't going to be
sustainable long-term.
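For reference, the rooted-device route relies on Android's system trust store in /system/etc/security/cacerts/, where each PEM file is named after OpenSSL's "old" subject-name hash. A sketch of computing that filename for a mitmproxy CA, assuming the Python cryptography package; the printed adb steps are the conventional ones and may vary by device and Android version:

```python
# Sketch of the rooted-device approach described above. Android's
# system CA store is /system/etc/security/cacerts/, with each PEM file
# named by OpenSSL's "old" subject-name hash. This computes that
# filename for a mitmproxy CA; requires the 'cryptography' package.
import hashlib
from cryptography import x509

with open("mitmproxy-ca-cert.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

# X509_NAME_hash_old: first 4 bytes of MD5 over the DER-encoded
# subject name, read little-endian, printed as 8 hex digits.
digest = hashlib.md5(cert.subject.public_bytes()).digest()
filename = f"{int.from_bytes(digest[:4], 'little'):08x}.0"

# Conventional steps; adjust for your device and Android version.
print("adb root && adb remount")
print(f"adb push mitmproxy-ca-cert.pem /system/etc/security/cacerts/{filename}")
print(f"adb shell chmod 644 /system/etc/security/cacerts/{filename}")
```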
Herald: Um, 4.
Mic 4: Hey, thank you for the great talk.
Your research is obviously targeted in a
constructive, critical way towards
companies that are making apps around
menstruation. Did you learn anything from
this context that you would want to pass
on to people who research this area more
generally? I'm thinking, for example, of
Paramount Corp in the US, who've done
microdosing research on LSD and are
starting a breakout study on menstrual
issues.
Eva: Well, I think this is why I concluded
on that point. There's still a lot of
research that needs to be done in terms of
the sharing. And obviously, I think
anything that touches on people's health
is a key priority, because it's something
people relate to very strongly. The
consequences, especially in the US, for
example, of sharing health data like this,
of having, you know, data even like your
blood pressure and so on... What are the
consequences if that information is going
to be shared, for example, with insurance
companies and so on? This is why I think
it's absolutely essential to have a better
understanding of the data collection and
sharing practices of these services, the
moment health data is involved.
Chris: Yeah, because we often focus on
this being an advertising issue. But in
that sense as well, insurance and even
credit referencing and all sorts of other
things become problematic, especially when
it comes to pregnancy-related data.
Eva: Yeah, even employers could be after
this kind of information.
Herald: Six.
Mic 6: Hi. I'm wondering if there is an
easy way, or a tool, which we can use to
detect if apps are using our data or are
reporting it to Facebook or whatever. Or
if we can even use those apps but block
this data from being reported to Facebook.
Chris: Yes. So, you can firewall off
graph.facebook.com and stop sending data
to that. There's a few issues here.
Firstly... this audience can do this, but
most users don't have the technical nuance
to know what needs to be blocked and what
doesn't necessarily need to be blocked.
It's on the companies to be careful with
users' data. It's not up to the users to
try and defend against... It shouldn't be
on the user to defend against malicious
data sharing.
Eva: You know, one interesting thing is
that Facebook has put something in place
where you can opt out from data sharing
with the apps you're using - but that only
works if you're a Facebook user. And as I
said, this data is being collected whether
you are a user or not. So, in a sense,
people who aren't Facebook users couldn't
opt out of this.
Chris: With the Facebook SDK the
developers are integrating, the default
state for sharing of data is on - the flag
is true. And although there is a long
legal text on the help pages for the
developer tools, unless you have a decent
understanding of local data protection
practice or local data protection law,
it's not something most developers are
going to be able to understand - why this
flag should be set to something other than
on. You know, there are loads of flags in
the SDK; which flags should be on and off
depends on which jurisdiction you're
selling to, or where your users are going
to be.
Herald: Signal Angel, again.
Signal Angel: Do you know any good apps
which don't share data and are
privacy-friendly? Preferably even one that
is open source.
Eva: So, I mean, the problem - which is
why I wouldn't want to vouch for any app -
is that even with the apps where, in terms
of the traffic analysis we've done, we
didn't see any data sharing: as Chris was
explaining, the data can be shared at a
later stage, and it would be impossible
for us to really find out. So, no, I can't
be vouching for any app. I don't know if
you can...
Chris: The problem is we only ever look at
one specific moment in time to see whether
data is being shared, and what was good
today might be bad tomorrow; what was bad
yesterday might be good today. Although, I
was in Argentina recently speaking to a
group of feminist activists, and they have
been developing a menstruation tracking
app. And the app was removed from the
Google Play store because it had
illustrations that were deemed
pornographic - but they were illustrations
around medical-related stuff. So even
people who are trying to do the right
thing, going through the open source
channels, are still fighting a completely
different issue when it comes to
menstruation tracking.
It's a very fine line.
Herald: Um, three.
inaudible
Eva: Sorry, we can't hear - the mic's not
working.
Herald: Microphone three.
Mic 3: Test.
Eva: Yeah, it's great - perfect.
Mic 3: I was wondering if the Graph API
endpoint was actually in place to track
menstruation data, or is it more like a
general-purpose advertisement
tracking thing? Yeah.
Chris: So my understanding is that there
are two broad kinds of data that Facebook
gets. There are the automated app events
that Facebook is aware of - app open, app
close, app install, relinking. Relinking
is quite an important one for Facebook:
that's where they check to see whether you
already have a Facebook account logged in,
to link the app to your Facebook account.
And then there's a load of custom events
that the app developers can put in. This
is then collated back into a data set, I
would imagine, on the other side. So when
it comes to things like whether it's
nausea or some of the other health issues,
those are actually events defined by the
developer. Does that answer your question?
Mic 3: Yes, thank you.
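To make the answer concrete: a hedged reconstruction of what such app-events traffic looks like on the wire, based on captures like those shown in the talk and on older Facebook Android SDK behavior. The endpoint and field names may differ across SDK versions, and all values here are invented placeholders:

```python
# Hedged reconstruction of Facebook SDK "app events" traffic, modeled
# on captures like those shown in this talk. Endpoint and field names
# follow older SDK versions and may differ today; values are invented.
import json

APP_ID = "000000000000000"  # hypothetical Facebook app ID
url = f"https://graph.facebook.com/{APP_ID}/activities"

payload = {
    "event": "CUSTOM_APP_EVENTS",
    # Advertising ID: the per-device identifier that ties events to a
    # person, Facebook account or not.
    "advertiser_id": "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
    "custom_events": json.dumps([
        {"_eventName": "Love 3"},  # i.e. "unprotected sex", per the talk
    ]),
}

print(url)
print(json.dumps(payload, indent=2))
```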
Herald: Five, microphone five.
Mic 5: Can you repeat what you said in the
beginning about the menstruation apps used
in Europe, especially Clue and Period
Tracker?
Eva: Yeah. So those are the most popular
apps, actually, across the world, not just
in Europe and the US. A lot of them, at
the traffic analysis stage - a lot of them
have now cleaned up their apps, so we
can't see any data sharing happening at
that stage. But as I said, I can't be
vouching for them and saying, oh yeah,
those are safe and fine to use, because we
don't know what's actually happening to
the data once it's been collected by the
app. All we can say is that, as far as the
research we've done goes, we didn't see
any data being shared.
Chris: Those apps you mentioned have been
investigated by The Wall Street Journal
and The New York Times relatively
recently, so they've had quite a spotlight
on them. So they've had to really up their
game in a lot of ways, which we would like
everyone to do. But as Eva says, we don't
know what else they might be doing with
that data on their side - not necessarily
between the phone and the server, but from
their server to another server.
Herald: Microphone one.
Mic 1: Hi. Thank you for the insightful
talk. I have a question that goes in a
similar direction. Do you know whether or
not these apps, even if they adhere to
GDPR rules, collect the data to then, at a
later point, sell it to the highest
bidder? Because a lot of them are free to
use, and I wonder what their main goal is
besides that.
Eva: I mean, advertisement is how they
make profit. And the whole question about
them trying to know whether you're
pregnant or not is that this information
can eventually be monetized through how
they target the advertisements at you.
Actually, when you're using those apps -
you can see it in some of the slides -
you're constantly being flooded with all
sorts of advertisements in the app.
Whether they are selling the data
externally or not, I can't tell. But what
I can tell is: yeah, their business model
is advertisement, and so they are deriving
profit from the data they collect.
Absolutely.
Herald: Again, on microphone one.
Mic 1: Thank you. I was wondering if there
was more of a big-data kind of aspect to
it as well, because this is really
interesting medical information on women's
cycles in general.
Eva: Yeah, and the answer is: this is a
bit of a black box, especially in the way,
for example, that Facebook is using this
data - we don't know. We can assume that
this is part of the profiling that
Facebook does of both their users and
their non-users. But the way this data is
actually processed, also by those apps,
through data brokers and so on - it's a
bit of a black box.
Herald: Microphone 1.
Mic 1: Yeah. Thank you a lot for your
talk. I have two completely different
questions. The first one is: you've been
focusing a lot on advertising and how this
data is used and sold to advertisers. But
whether you aim to be pregnant or not has
to be the best-kept secret, at least in
Switzerland, for any female person,
because if you want to get employed, your
employer must not know whether or not you
want to get pregnant. And so I would like
to ask: how likely is it that this kind of
data is also potentially sold to
employers, who may want to poke into your
health and reproductive situation? And
then my other question is entirely
different, because we also know that
female health is one of the least
researched topics around, and that's
actually a huge problem. So little is
actually known about female health, and
the kind of data that these apps collect
is actually a gold mine to advance
research on health issues that are
specific to certain bodies, like female
bodies. And so I would also like to know:
how would it be possible to still gather
this kind of data, to still collect it,
but use it for a beneficial purpose, like
improving knowledge on these issues?
Eva: Sure. So to answer your first
question: the answer will be similar to
the previous answer I gave, which is, you
know, the black-box problem. It's very
difficult to know exactly what's actually
happening to this data. Obviously, GDPR is
there to prevent such things from
happening, but as we've seen from these
apps, they were toeing a very blurry line.
And so the risk... I can't say, oh, this
is happening, because I have no evidence
that this is happening. But obviously
there are multiple risks: that employers,
as you say, or insurance companies could
get it, that political parties could get
it and target their messages based on
information they have about your mood,
about, you know, even the fact that you're
trying to start a family. So, yeah, there
is a very broad range of risks. The
advertisement we know for sure is
happening, because this is the basis of
their business model. But the range of
risks is very, very broad.
Chris: To just expand on that: again, as
Eva said, we can't point at a specific
example of any of this. But if you look at
some of the other data brokers - Experian,
for example, is a data broker. In the UK
it has the statutory job of being a credit
reference agency, but it also runs what is
believed to be a data enrichment arm. One
of the things employers can do is buy
Experian data when hiring staff. I can't
say whether this data ever ends up there.
But, you know, there are people collecting
data and using it for some level of
vetting.
Eva: And to answer your second question: I
think you point out a very important
problem, which is the question of data
inequality and whose data gets collected
for what purpose. I do quite a lot of work
on the delivery of state services, for
example, where there are populations that
are isolated, not using technology and so
on. You might just be missing out on
people who, for example, are in need of
health care or state support and so on,
just because you lack data about them. And
so female health is obviously a very key
issue. We literally lack sufficient health
data about women, on women's health
specifically. Now, in terms of how data is
processed in medical research, there are
actually protocols in place, normally, to
ensure consent - explicit consent - and to
ensure that the data is properly
collected. And so, given the way these
apps have been collecting data - if
there's one thing to take out of this
talk, it's that it's been nothing short of
horrifying, really; the data is being
collected and shared before you even get
to consent to anything - I wouldn't trust
any of these private companies to really
be the ones carrying out, or taking part
in, medical research. So I agree with you
that there is a need for better and more
data on women's health. But I don't think
any of these actors have so far proved
they can be trusted on this issue.
Herald: Microphone 2.
Mic 2: Yeah. Thank you for this great
talk. Um, short question: what do you
think is the rationale for these
menstruation apps to integrate the
Facebook SDK, if they don't get money from
Facebook for being able to commercialize
this data?
Chris: Good question. Um, it could be a
mix of things. Sometimes the developers
literally just have this as part of their
tool chain, their workflow, when they're
developing apps. I don't necessarily know,
for these two period trackers, whether
other apps are developed by these
companies. But in our previous work, which
I presented last year, we found that some
companies just produce a load of apps and
use the same tool chain every time, and
that tool chain includes the Facebook SDK
by default. Some of them include it for
what I would regard as genuine purposes -
like, they want their users to share
something, or they want their users to be
able to log in with Facebook - and in
those cases they included it for what
would be regarded as a legitimate reason,
but then they don't ever actually use
anything of it beyond that. And it means
that there are a lot of developers who are
simply quite unaware that the default
state is verbose, and of how much data it
sends to Facebook.
Herald: Yeah. Maybe we close with one last
question from me. You tested a whole bunch
of apps - how many of them do certificate
pinning? Do you see this as a widespread
practice, or...?
Chris: They just don't, really, yet. I
would have had a problem doing the
analysis where stuff had been pinned. As I
say, TLS 1.3 has proven to be more
problematic than pinning. Uh, yeah.
Herald: Ok, well, thank you so much. And,
uh. Yeah.
Applause
36C3 Postroll music
Subtitles created by c3subtitles.de
in the year 2020. Join, and help us!