0:00:00.000,0:00:09.520
32C3 preroll music
0:00:09.520,0:00:13.210
Herald: Our next talk is[br]called “Safe Harbor”.
0:00:13.210,0:00:18.019
Background is: back in October, in[br]the light of the Snowden revelations
0:00:18.019,0:00:22.829
the Court of Justice of the European[br]Union – that’s the “EuGH” in German
0:00:22.829,0:00:29.159
declared the Safe Harbor agreement[br]between the EU and the US invalid.
0:00:29.159,0:00:32.500
This talk is about how we got there[br]as well as further implications
0:00:32.500,0:00:37.470
of that decision. Please believe me when[br]I say our speaker is ideally suited
0:00:37.470,0:00:42.730
to talk about that topic. Please give it[br]up for the man actually suing Facebook
0:00:42.730,0:00:45.990
over Data Protection concerns:[br]Max Schrems!
0:00:45.990,0:00:50.940
applause and cheers
0:00:50.940,0:00:53.290
Max Schrems: Hallo! Hey![br]applause and cheers
0:00:53.290,0:01:01.770
applause
0:01:01.770,0:01:05.420
It’s cheerful like some Facebook Annual[br]conference where the newest things
0:01:05.420,0:01:09.920
are kind of presented. I’m doing a little[br]intro on basically how I got there.
0:01:09.920,0:01:13.330
This was my nice little university in[br]California. And I was studying there
0:01:13.330,0:01:15.259
for half a year and there were a[br]couple of people from Facebook
0:01:15.259,0:01:18.149
and other big companies and[br]they were talking about
0:01:18.149,0:01:21.289
European Data Protection law. And[br]the basic thing they said – it was
0:01:21.289,0:01:25.219
not an original quote but basically what[br]they said is: “Fuck the Europeans,
0:01:25.219,0:01:28.050
you can fuck their law as much as you[br]want and nothing is going to happen.”
0:01:28.050,0:01:31.919
And that was kind of the start of the[br]whole story because I thought: “Okay,
0:01:31.919,0:01:35.639
let’s just make a couple of[br]complaints and see where it goes.”
0:01:35.639,0:01:41.219
I originally got 1,300 pages of Facebook data[br]back then, because you can exercise
0:01:41.219,0:01:45.420
your right to access. And Facebook[br]actually sent me a CD with a PDF file
0:01:45.420,0:01:49.069
on it with all my Facebook data.[br]It was by far not everything
0:01:49.069,0:01:51.880
but it was the first time that someone[br]really got the data and I was asking
0:01:51.880,0:01:54.889
someone from Facebook why they were so[br]stupid to send me all this information.
0:01:54.889,0:01:59.100
Because a lot of it was obviously illegal.[br]And the answer was “We had internal
0:01:59.100,0:02:04.090
communications problems.” So someone was[br]just stupid enough to burn it on a CD and
0:02:04.090,0:02:09.158
send it on. One of the CDs actually was[br]first going to Sydney in Australia because
0:02:09.158,0:02:12.819
they put “Australia” instead of “Austria”[br]on the label which was one of the things
0:02:12.819,0:02:15.050
as well.[br]applause
0:02:15.050,0:02:20.010
Anyway, this was basically how[br]my interest in Facebook started;
0:02:20.010,0:02:23.450
and the media got crazy about it because[br]there is like a little guy that does
0:02:23.450,0:02:27.280
something against the big guy. And this[br]is basically how the whole thing got
0:02:27.280,0:02:30.980
this big. This is like a cartoon from my[br]Salzburg newspaper. This should be me,
0:02:30.980,0:02:34.640
and it’s like basically the reason why[br]the story got that big because it’s
0:02:34.640,0:02:37.790
a small guy doing something against[br]Facebook, not necessarily because
0:02:37.790,0:02:41.770
what I was doing was so especially smart.[br]But the story was just good for the media,
0:02:41.770,0:02:45.879
’cause data protection is generally a very[br]dry topic that they can’t report about
0:02:45.879,0:02:50.019
and there they had like[br]the guy that did something.
0:02:50.019,0:02:54.299
A couple of introductions. We actually[br]had 3 procedures. So if you heard about
0:02:54.299,0:02:58.290
what I was doing… There was originally[br]a procedure at the Irish Data Protection
0:02:58.290,0:03:02.310
Commission, on Facebook itself – so what[br]Facebook itself does with the data.
0:03:02.310,0:03:06.620
This procedure has ended after 3 years.[br]There’s a “Class Action” in Vienna
0:03:06.620,0:03:09.930
right now that’s still ongoing. It’s in[br]front of the Supreme Court in Austria
0:03:09.930,0:03:14.900
right now. And there is the procedure[br]that I’m talking about today which is
0:03:14.900,0:03:19.920
the procedure on Safe Harbor at the[br]Irish Data Protection Commission.
0:03:19.920,0:03:22.719
A couple of other pieces of background[br]information: I personally don’t think
0:03:22.719,0:03:25.159
Facebook is the issue. Facebook[br]is just a nice example for
0:03:25.159,0:03:29.609
an overall bigger issue. So I was never[br]personally concerned with Facebook but
0:03:29.609,0:03:32.969
for me the question is how we enforce[br]Data Protection and that kind of stuff.
0:03:32.969,0:03:35.909
applause[br]So it’s not a Facebook talk; Facebook is
0:03:35.909,0:03:39.290
applause[br]the example. And of course the whole thing
0:03:39.290,0:03:42.599
is just one puzzle piece. A lot of people[br]are saying: “This was one win but there
0:03:42.599,0:03:46.279
are so many other issues!” – Yes, you’re[br]totally right! This was just one issue.
0:03:46.279,0:03:49.439
But you got to start somewhere.[br]And the whole thing is also
0:03:49.439,0:03:53.209
not an ultimate solution. So I can[br]not present you the final solution
0:03:53.209,0:03:57.530
for everything, but probably a couple[br]of possibilities to do something.
0:03:57.530,0:04:00.969
If you’re interested in the documents[br]– we pretty much publish everything
0:04:00.969,0:04:05.469
on the web page. It’s a very old style web[br]page. But you can download the PDF files
0:04:05.469,0:04:10.379
and everything if you’re interested[br]in the facts and (?) the details.
0:04:10.379,0:04:15.779
Talking about facts, the whole thing[br]started with the Snowden case,
0:04:15.779,0:04:19.019
where we kind of for the first time had[br]documents proving who is actually
0:04:19.019,0:04:23.500
forwarding data to the NSA in this case.[br]And this is the interesting part, because
0:04:23.500,0:04:26.530
we have a lot of rumours but if you’re in[br]a Court room you actually have to prove
0:04:26.530,0:04:30.180
everything and you cannot just suspect[br]that very likely they’re doing it. But you
0:04:30.180,0:04:35.199
need actual proof. And thanks to Snowden[br]we had at least a bunch of information
0:04:35.199,0:04:40.550
that we could use. These are the slides,[br]you all know them. The first very
0:04:40.550,0:04:46.880
interesting thing was the FISA Act and we[br]mainly argued under 1881a as an example
0:04:46.880,0:04:51.509
for the overall surveillance in the US. So[br]we took this law as an example but it was
0:04:51.509,0:04:55.560
not the only thing we relied on. And I[br]think it’s interesting for Europeans to
0:04:55.560,0:05:01.030
understand how the law actually works.[br]The law actually goes after data and not
0:05:01.030,0:05:08.849
after people. We typically have laws in[br]criminal procedures that go after people.
0:05:08.849,0:05:13.289
This law goes after data. So it totally[br]falls outside of our normal thinking of
0:05:13.289,0:05:15.229
“we’re going after a suspect,[br]someone that
0:05:15.229,0:05:17.810
may have committed a crime”. Basically the
0:05:17.810,0:05:21.430
law says that there’s an electronic[br]communications service provider that holds
0:05:21.430,0:05:25.270
foreign intelligence information. That’s[br]much more than just terrorist prevention,
0:05:25.270,0:05:29.919
that’s also things that the US is[br]generally interested in.
0:05:29.919,0:05:33.739
And this is the level that’s publicly[br]known and everything else is basically
0:05:33.739,0:05:38.139
classified. So under the law the FISA[br]Court can do certification for one year
0:05:38.139,0:05:43.669
that basically says “the NSA can access[br]data”. In these certifications there are
0:05:43.669,0:05:46.240
these minimization and targeting[br]procedures that they have to describe.
0:05:46.240,0:05:48.099
But they’re not public.[br]We don’t know what
0:05:48.099,0:05:51.210
they look like. And basically they’re here
0:05:51.210,0:05:56.370
to separate data from US people out of[br]the data set. So it doesn’t really help
0:05:56.370,0:06:00.690
a European. And then there is a so called[br]Directive that goes to the individual
0:06:00.690,0:06:04.210
service provider which basically says:[br]“Give us the data in some technical
0:06:04.210,0:06:08.020
format.” So very likely it’s some kind[br]of API or some kind of possibility
0:06:08.020,0:06:12.569
that they can retrieve the data. That’s[br]what the law says. We don’t know
0:06:12.569,0:06:17.440
how it actually looks and we don’t[br]have perfect proof of it. So there are
0:06:17.440,0:06:20.530
a lot of things that are disputed and[br]still disputed by the US government.
0:06:20.530,0:06:24.809
So the exact technical implementations,[br]the amount of data that’s actually pulled,
0:06:24.809,0:06:28.830
all the review mechanisms they have[br]internally. That’s all stuff that was
0:06:28.830,0:06:34.110
not 100% sure, and not sure enough[br]to present it to a Court. Which was
0:06:34.110,0:06:38.940
the basic problem we had. First of[br]all after the Snowden thing broke
0:06:38.940,0:06:44.460
we had different reactions. And that was[br]kind of how I started the procedure.
0:06:44.460,0:06:48.020
The first reaction was demonstrations.[br]We were all walking in the streets.
0:06:48.020,0:06:51.080
Which is good and which is important,[br]but we all know that this is something
0:06:51.080,0:06:55.710
we have to do but not something that’s[br]gonna change the world. Second thing:
0:06:55.710,0:06:58.849
we had parliaments like the European[br]Parliament doing resolutions saying
0:06:58.849,0:07:03.110
that we should strike down the Safe Harbor[br]and this is all bad and evil. We had
0:07:03.110,0:07:07.030
the Commission pretty much saying the[br]same thing. We had national politicians
0:07:07.030,0:07:11.569
saying the same thing. And we all knew[br]that basically this means that they all
0:07:11.569,0:07:16.580
send an angry letter to the US. Then they[br]can walk in front of the media and say:
0:07:16.580,0:07:19.879
“Yes, we’ve done something, we sent[br]an angry letter to the US”, and the US
0:07:19.879,0:07:25.229
is just thrown basically in some trash bin[br]of crazy Europeans wanting strange things
0:07:25.229,0:07:31.960
and that was it. So I was actually called[br]by a journalist and asked if there’s
0:07:31.960,0:07:36.610
some other option. And I was then[br]starting to think about it and there’s
0:07:36.610,0:07:42.300
the so called Safe Harbor agreement. To[br]explain the “Safe Harbor”: In Europe
0:07:42.300,0:07:46.490
we have Data Protection law that is on[br]the papers but factually not enforced.
0:07:46.490,0:07:50.569
But at least, in theory, we have it. And[br]we have a couple of other countries
0:07:50.569,0:07:54.819
that have the same level of protection[br]or similar laws. And generally
0:07:54.819,0:07:58.460
Data Protection only works if you keep[br]the data within the protected sphere so
0:07:58.460,0:08:01.150
you’re not allowed to send personal[br]data to a third country that
0:08:01.150,0:08:04.819
doesn’t have adequate protection. There[br]are a couple of other countries that do;
0:08:04.819,0:08:09.740
and therefore you can transfer data e.g.[br]to Switzerland. This is what the law says.
0:08:09.740,0:08:13.460
And there are certain servers that[br]are outside these countries where
0:08:13.460,0:08:17.009
we can have contractual relationships. So[br]basically if you have a server in India,
0:08:17.009,0:08:20.090
you have a contract with your[br]Indian hosting provider saying:
0:08:20.090,0:08:24.199
“You apply proper Data Protection to it”.[br]So you can transfer data there, too.
0:08:24.199,0:08:27.740
All of this is approved by the European[br]Commission. This is how data
0:08:27.740,0:08:33.299
flows legally outside of the EU – personal[br]data. This all doesn’t apply
0:08:33.299,0:08:38.539
to any other kind of data, only personal[br]data. And we had a basic problem
0:08:38.539,0:08:42.469
with the US because there was this[br]Directive saying you can forward data
0:08:42.469,0:08:46.850
to other countries but there is no Data[br]Protection Law in the US. So basically
0:08:46.850,0:08:48.810
you wouldn’t be allowed to send[br]data there unless you have
0:08:48.810,0:08:52.870
some contractual relationship which[br]is always kind of complicated.
0:08:52.870,0:08:58.110
So the solution was to have a self[br]certification to EU principles
0:08:58.110,0:09:01.839
and this was put into an Executive[br]Decision by the European Commission.
0:09:01.839,0:09:06.940
So basically how Safe Harbor is working[br]is that e.g. Google can walk up and say:
0:09:06.940,0:09:12.540
“Hereby I pledge that I follow European[br]Data Protection Law. I solemnly swear!”.
0:09:12.540,0:09:15.110
And then they do whatever they[br]want to do. And basically
0:09:15.110,0:09:17.750
that’s the Safe Harbor system and the[br]Europeans can walk around saying:
0:09:17.750,0:09:22.250
“Yeah, there is some seal saying[br]that everything is fine, so don’t worry.”
0:09:22.250,0:09:25.380
Everybody knew that this is a fucked-up[br]system but for years and years
0:09:25.380,0:09:29.399
everyone was looking away because politics[br]is there and economics is there and
0:09:29.399,0:09:33.560
they just needed it. So basically Safe[br]Harbor works that way that a US company
0:09:33.560,0:09:38.360
can follow the Safe Harbor principles[br]and say: “We follow them”, then
0:09:38.360,0:09:41.519
the Federal Trade Commission and private[br]arbitrators are overseeing them
0:09:41.519,0:09:46.700
– in theory, in practice they never do –[br]and this whole thing was packaged
0:09:46.700,0:09:50.160
into decision by the European[br]Commission. And this is the so called
0:09:50.160,0:09:55.600
Safe Harbor system. So from a European[br]legal point of view it’s not an agreement
0:09:55.600,0:10:01.010
with the US, it’s a system that the US has[br]set up that we approved as adequate. So
0:10:01.010,0:10:04.300
there’s no binding thing between the US[br]and Europe, we can kind of trash it
0:10:04.300,0:10:10.790
any time. They’ve just never done that.[br]Which brings me to the legal argument.
0:10:10.790,0:10:14.930
Basically if I’m this little Smiley down[br]there, I’m sitting in Austria and
0:10:14.930,0:10:20.920
transfer my data to Facebook Ireland,[br]because worldwide – 82% of all users
0:10:20.920,0:10:23.890
have a contract with Facebook Ireland.[br]Anyone that lives outside the US
0:10:23.890,0:10:28.600
and Canada. So anyone from China,[br]South America, Africa has a contract
0:10:28.600,0:10:33.670
with Facebook in Ireland. And legally they[br]forward the data to Facebook in the US;
0:10:33.670,0:10:37.170
technically the data is directly[br]forwarded. So the data is actually flowing
0:10:37.170,0:10:41.529
right to the servers in the US. However[br]legally it goes through Ireland. And
0:10:41.529,0:10:46.029
my contract partner is an Irish company.[br]And under the law they can only transfer
0:10:46.029,0:10:49.839
data to the US if there is adequate[br]protection. At the same time we know
0:10:49.839,0:10:53.550
that the PRISM system is hooked up in[br]the end. So I was basically walking up
0:10:53.550,0:10:56.720
to the Court and saying: “Mass[br]Surveillance is very likely not
0:10:56.720,0:11:01.669
adequate protection, he?” And[br]that was basically the argument.
0:11:01.669,0:11:08.980
applause
0:11:08.980,0:11:12.250
The interesting thing in this situation[br]was actually the strategic approach.
0:11:12.250,0:11:17.899
So, we have the NSA and other surveillance[br]organizations that use private companies.
0:11:17.899,0:11:21.880
So we have kind of a public-private[br]surveillance partnership. It’s PPP in
0:11:21.880,0:11:28.540
a kind of surveillance way. Facebook is[br]subject to US law, so under US law they
0:11:28.540,0:11:32.110
have to forward all the data. At the same[br]time Facebook Ireland is subject to
0:11:32.110,0:11:35.040
European law so they’re not[br]allowed to forward all this data.
0:11:35.040,0:11:41.550
Which is interesting because[br]they’re split. The EU law regulates
0:11:41.550,0:11:45.570
how these third country transfers work.[br]And all of this has to be interpreted
0:11:45.570,0:11:49.959
under Fundamental Rights. So this was[br]basically the system we’re looking at.
0:11:49.959,0:11:53.459
And the really crucial thing is that we[br]have this public-private surveillance.
0:11:53.459,0:11:56.930
Because we do have jurisdiction over[br]a private company. We don’t have
0:11:56.930,0:12:00.680
jurisdiction over the NSA. We can[br]send angry letters to the NSA. But
0:12:00.680,0:12:04.070
we do have jurisdiction over Facebook,[br]Google etc. because they’re basically
0:12:04.070,0:12:08.769
based here. Mainly for tax reasons.[br]And this was the interesting thing that
0:12:08.769,0:12:11.720
in difference to the national surveillance[br]where we can pretty much just send
0:12:11.720,0:12:15.240
the angry letters we can do something[br]about the private companies. And
0:12:15.240,0:12:19.230
without the private companies there is[br]almost no mass surveillance in this scale
0:12:19.230,0:12:22.810
because the NSA is not in our phones,[br]it’s the Googles and Apples and whatever.
0:12:22.810,0:12:29.000
And without them you’re not really[br]able to get this mass surveillance.
0:12:29.000,0:12:32.579
This is like the legal chart. Basically[br]what we argued is: there’s 7 and 8
0:12:32.579,0:12:35.490
of the Charter of Fundamental Rights.[br]That’s your right to Privacy and
0:12:35.490,0:12:39.420
your right to Data Protection. There[br]is an article in the Directive that
0:12:39.420,0:12:44.149
has to be interpreted in light of it. Then[br]there’s the Executive Decision of the EU.
0:12:44.149,0:12:48.750
This is basically the Safe Harbor[br]decision which refers to Paragraph 4
0:12:48.750,0:12:52.990
of the Safe Harbor principles. And the[br]Safe Harbor principles basically say
0:12:52.990,0:12:56.540
that the FISA Act is okay. So[br]you have kind of this circle
0:12:56.540,0:13:01.579
of different legal layers which is getting[br]really crazy. I’ll try to break it down
0:13:01.579,0:13:05.339
a little bit. Basically 7 and 8 of the[br]Charter we basically compared
0:13:05.339,0:13:08.690
to Data Retention, so the[br]“Vorratsdatenspeicherung”.
0:13:08.690,0:13:13.820
We basically said PRISM is much worse. If[br]“Vorratsdatenspeicherung” (Data Retention)
0:13:13.820,0:13:18.300
was invalid then PRISM has to be 10 times[br]as bad. That was basically the argument.
0:13:18.300,0:13:22.240
Very simple. We just compared: the[br]one was content data – the other one
0:13:22.240,0:13:25.779
was meta data. The one is storage[br]– the other one is making available.
0:13:25.779,0:13:29.519
And the one is endless – the other[br]one is 24 months. So basically
0:13:29.519,0:13:33.540
in all these categories PRISM was much[br]worse. And if the one has to be illegal
0:13:33.540,0:13:37.639
the other one has to be as well. And[br]what’s interesting – and that’s something
0:13:37.639,0:13:42.740
that the US side is typically not getting[br]– is that Article 8 is already covering
0:13:42.740,0:13:47.459
“making available of data”. So the[br]fun thing is I only had to prove
0:13:47.459,0:13:51.190
that Facebook makes data available,[br]so basically it’s possible
0:13:51.190,0:13:55.420
the NSA is pulling it. I didn’t even have[br]to prove that the NSA is factually pulling
0:13:55.420,0:14:00.190
my personal data. And this was like the[br]relevant point because under US law
0:14:00.190,0:14:04.220
basically your Fundamental Rights only[br]kick in when they factually look at your
0:14:04.220,0:14:07.850
data and actually surveil you. So I was[br]only: “They’re making it available
0:14:07.850,0:14:11.740
– that’s good enough for me!” which[br]was making all these factual evidence
0:14:11.740,0:14:16.789
much easier. So basically I only had[br]to say: “Look at the XKeyscore slides
0:14:16.789,0:14:22.180
where they say ‘user name Facebook’[br]they can get somehow the data out of it.
0:14:22.180,0:14:26.860
It’s at least made available; that’s[br]all I need to prove”. And this is
0:14:26.860,0:14:30.360
the big difference between the US[br]– it’s very simplified, but basically
0:14:30.360,0:14:34.649
between the US approach and the European[br]approach; is that in the US you have to
0:14:34.649,0:14:38.639
prove that your data is actually pulled.[br]I only had to prove that my data is made
0:14:38.639,0:14:44.399
available. So I had to… I was able to[br]get out of all the factual questions.
0:14:44.399,0:14:48.100
This is a comparison – you basically…[br]in the US we have very strict laws
0:14:48.100,0:14:53.149
for certain types of surveillance while in[br]Europe we have a more flexible system
0:14:53.149,0:14:56.730
that covers much more. So it’s a[br]different approach that we just have
0:14:56.730,0:15:00.279
in the two legal spheres. We’re both[br]talking about your Fundamental
0:15:00.279,0:15:03.750
Right to Privacy, but in details it’s[br]very different. And that’s kind of
0:15:03.750,0:15:07.909
the differences what we used. The fun[br]thing is if you’re European you don’t have
0:15:07.909,0:15:11.509
any rights in the US anyways because[br]the Bill Of Rights only applies to people
0:15:11.509,0:15:15.829
that live in the US and US citizens so[br]you’re out of luck anyways. So you’re
0:15:15.829,0:15:20.209
only left with the European things.[br]Basically the law which is
0:15:20.209,0:15:23.550
the second level after the Fundamental[br]Rights is saying that there has to be
0:15:23.550,0:15:28.250
an adequate level of protection as I said[br]and this third country has to ensure it
0:15:28.250,0:15:33.279
by domestic law or international[br]commitments. And I was saying: “You know
0:15:33.279,0:15:36.180
there’s the FISA Act, you can read[br]it, it definitely doesn’t ensure
0:15:36.180,0:15:38.630
your fundamental rights and an[br]adequate protection. So we’re
0:15:38.630,0:15:45.110
kind of out of Article 25”. And there is[br]paragraph 4 of the Safe Harbor principles
0:15:45.110,0:15:50.829
which say that all these wonderful privacy[br]principles that US companies sign up to
0:15:50.829,0:15:56.579
do not apply whenever a national law[br]in the US is overruling it. So there are
0:15:56.579,0:16:01.690
principles that companies say: “We[br]follow!” but if there is a city in Texas
0:16:01.690,0:16:05.649
saying: “We have a local ordinance[br]saying: ‘You have to do differently!’”
0:16:05.649,0:16:09.029
all these Safe Harbor principles[br]don’t apply anymore. And this is
0:16:09.029,0:16:12.699
the fundamental flaw of the self[br]certification system that it only works
0:16:12.699,0:16:17.240
if there is no law around that conflicts[br]with it. And as there are tons of laws
0:16:17.240,0:16:22.519
that conflict with it you’re hardly[br]able to hold up a system like that.
0:16:22.519,0:16:26.220
So basically if you go through all these[br]different legal layers you end up with
0:16:26.220,0:16:29.730
a conflict between the US FISA Act[br]and the European Fundamental Rights.
0:16:29.730,0:16:33.069
So you’re going through different layers[br]of the system but you’re basically making
0:16:33.069,0:16:41.009
a circle. This is what we did which was[br]a little bit complicated but worked.
0:16:41.009,0:16:53.719
applause
0:16:53.719,0:16:57.430
Basically now to the procedure,[br]so how the whole thing happened.
0:16:57.430,0:17:01.639
First I went through the Safe Harbor. Safe[br]Harbor allows you to go to TRUSTe or
0:17:01.639,0:17:06.630
the Federal Trade Commission and there’s[br]an online form to make your complaint. And
0:17:06.630,0:17:10.270
I was making a complaint and I think you[br]were only allowed to put in 60 characters
0:17:10.270,0:17:13.650
to explain what your complaint is. Which[br]is a little bit complicated if you’re
0:17:13.650,0:17:17.779
trying to explain NSA mass surveillance.[br]So I only wrote: “Stop Facebook, Inc.’s
0:17:17.779,0:17:20.410
involvement in PRISM!”. That[br]was everything I could actually
0:17:20.410,0:17:24.280
put in the text box; that was[br]the absolute maximum.
0:17:24.280,0:17:27.690
And the answer I got back was: “TRUSTe[br]does not have the authority to address
0:17:27.690,0:17:31.280
the matter you raise.” Which is obvious,[br]it’s a private arbitration company
0:17:31.280,0:17:35.370
that can hardly tell Facebook to not[br]follow the NSA’s guidelines. So
0:17:35.370,0:17:39.310
this was the arbitration mechanism under[br]Safe Harbor. You can also go to the
0:17:39.310,0:17:43.390
Federal Trade Commission and have your[br]complaint filed there. But they basically
0:17:43.390,0:17:50.290
just ignore them. This was the letter I[br]got back, that they received it. But
0:17:50.290,0:17:53.760
I was talking to the people at the FTC and[br]they say: “Yeah, we get these complaints
0:17:53.760,0:17:59.130
but they’re ending up in a huge storage[br]system where they stay for ever after”.
0:17:59.130,0:18:02.530
So this was enforcement done by[br]Safe Harbor. And we knew that
0:18:02.530,0:18:05.640
in the private field already; but in this[br]case it was especially interesting.
0:18:05.640,0:18:08.720
To be fair, both of these institutions[br]have no power to do anything
0:18:08.720,0:18:11.260
about mass surveillance. So[br]there was really a reason why
0:18:11.260,0:18:14.330
they didn’t do anything.[br]The next step you have is
0:18:14.330,0:18:17.330
the national Data Protection Commissioner.[br]So we have 28 countries
0:18:17.330,0:18:21.640
with 28 [Commissioners]; plus Germany has[br]– I think – a Data Protection Commissioner
0:18:21.640,0:18:26.940
in every province. And you end up at[br]this. And this is my most favourite slide.
0:18:26.940,0:18:30.500
This is the Irish Data[br]Protection Commissioner.
0:18:30.500,0:18:33.910
applause
0:18:33.910,0:18:36.450
To be super precise[br]– I don’t know if you
0:18:36.450,0:18:38.100
can see the laser pointer. But this is a
0:18:38.100,0:18:41.570
super market. And this is the Irish Data[br]Protection Commissioner back there.
0:18:41.570,0:18:45.040
laughter, applause
0:18:45.040,0:18:48.370
To be a little more fair, actually they’re[br]up here and they’re like 20 people
0:18:48.370,0:18:51.720
when we filed it originally. The fun thing[br]is back at the times they didn’t have
0:18:51.720,0:18:54.730
a single lawyer and not a single[br]technician. So they were 20
0:18:54.730,0:18:58.470
public employees that were dealing[br]with Data Protection and no one
0:18:58.470,0:19:02.960
had any clue of the technical[br]or the legal things about it.
0:19:02.960,0:19:06.140
The fun thing is: this is Billy Hawkes,[br]the Data Protection Commissioner
0:19:06.140,0:19:09.720
at the time. He went on the[br]national radio in the morning.
0:19:09.720,0:19:12.960
And in Ireland radio is a really big[br]thing. So it was a morning show.
0:19:12.960,0:19:16.540
And he was asked about these complaints.[br]And he actually said on the radio:
0:19:16.540,0:19:21.170
“I don’t think it will come[br]as much of a surprise
0:19:21.170,0:19:24.880
that the US services have access[br]to all the US companies”.
0:19:24.880,0:19:28.510
And this was the craziest thing![br]I was sitting in front of the radio
0:19:28.510,0:19:34.560
and was like: “Strike! He just[br]acknowledged that all this is true!”.
0:19:34.560,0:19:38.040
And the second thing, he said: “This US[br]surveillance operation is not an issue
0:19:38.040,0:19:42.500
of Data Protection”. Interesting.
0:19:42.500,0:19:47.570
It’s actually online and you can listen[br]to it. But the fun thing was really that
0:19:47.570,0:19:52.180
the factual level is so hard to prove that[br]I was afraid that they would dispute:
0:19:52.180,0:19:55.380
“Hah, who knows if all this is true?[br]We don’t have any evidence!
0:19:55.380,0:19:58.710
The companies say we are[br]not engaging in all of this.”
0:19:58.710,0:20:02.770
So having the Data Protection Commissioner[br]saying: “Sure they surveil you!
0:20:02.770,0:20:06.330
Are you surprised?” was great[br]because we were kind of out of
0:20:06.330,0:20:10.180
the whole factual debate.
0:20:10.180,0:20:13.070
I actually got a letter back from them[br]saying that they’re not investigating
0:20:13.070,0:20:17.850
any of it. And I was asking them why. And[br]they were naming 2 sections of the law,
0:20:17.850,0:20:21.600
a combination thereof. So there was one[br]thing where it says they shall investigate
0:20:21.600,0:20:25.130
– which means they have to – or[br]they may investigate. And they say
0:20:25.130,0:20:28.060
they only “may” investigate complaints[br]and they just don’t feel like
0:20:28.060,0:20:32.710
investigating PRISM and Facebook[br]and all of this. Secondly they say
0:20:32.710,0:20:36.920
that a complaint could be “frivolous[br]and vexatious” – I love the word!
0:20:36.920,0:20:41.130
And therefore they’re not investigating[br]it. “A combination thereof or indeed
0:20:41.130,0:20:47.980
any other relevant matter.” So we[br]transferred this letter into a picture
0:20:47.980,0:20:51.630
which is basically what they said: “So[br]why did you not investigate PRISM?”
0:20:51.630,0:20:54.140
– “‘Shall’ means ‘may’, frivolous[br]or
0:20:54.140,0:20:55.700
vexatious, a combination of A and B
0:20:55.700,0:20:58.820
or any other reason.”[br]So this was the answer
0:20:58.820,0:21:01.450
by the Irish Data Protection Commissioner[br]why they wouldn’t want to investigate
0:21:01.450,0:21:05.710
the complaint. Just to give[br]you background information:
0:21:05.710,0:21:08.350
these are the complaints that the Irish[br]Data Protection Commissioner is receiving
0:21:08.350,0:21:10.510
– the blue line – and the red line is[br]all
0:21:10.510,0:21:13.380
of the complaints they’re not deciding.
0:21:13.380,0:21:17.420
Which is 96 to 98% of the complaints
0:21:17.420,0:21:20.680
they receive on an average year.[br]Which is interesting because you have
0:21:20.680,0:21:24.130
a right to get a decision but they don’t.
0:21:24.130,0:21:27.180
To give you the bigger picture: we[br]also made complaints on Apple
0:21:27.180,0:21:30.510
and all the other PRISM companies.[br]And Ireland basically said
0:21:30.510,0:21:35.310
what I just told you. Luxembourg, where[br]Skype and Microsoft are situated, said
0:21:35.310,0:21:38.760
that they do not have enough evidence for[br]the participation of Microsoft and Skype
0:21:38.760,0:21:43.070
[in PRISM]. And the funniest thing[br]about the answer was that they said
0:21:43.070,0:21:46.030
that they’re restricted by their[br]investigations to the territory
0:21:46.030,0:21:50.220
of Luxembourg. And since all of this is[br]happening in the US they have no way
0:21:50.220,0:21:54.050
of ever finding out what was going on.[br]So I was telling them: “You know,
0:21:54.050,0:21:57.890
most of this is online and if you’re not[br]able to download it I can print it out
0:21:57.890,0:22:02.890
for you and ship it to Luxembourg.” But[br]the problem is why we didn’t go down
0:22:02.890,0:22:06.460
in Luxembourg is because they went down[br]this factual kind of argument. They said:
0:22:06.460,0:22:10.010
“It’s all illegal but factually we[br]don’t believe it’s true”. And
0:22:10.010,0:22:15.280
then there was Germany that are[br]still investigating until today.
0:22:15.280,0:22:18.300
This was Yahoo. Actually that was[br]Yahoo in Munich but they now
0:22:18.300,0:22:21.140
moved to Ireland as well. So I don’t[br]know what happened to this complaint.
0:22:21.140,0:22:23.800
We never heard back. But whenever we sent[br]an email they were like: “Yeah, we’re
0:22:23.800,0:22:29.820
still investigating.” So what happened now[br]is that I went to the Irish High Court.
0:22:29.820,0:22:34.150
To challenge the non-decision of the[br]Irish Data Protection Commissioner.
0:22:34.150,0:22:37.350
This is the case that then went down as[br]“Schrems vs. the Data Protection
0:22:37.350,0:22:40.790
Commissioner” which is so strange because[br]I never wanted to have my name
0:22:40.790,0:22:43.690
on any of this and now the decision is[br]actually named after my surname
0:22:43.690,0:22:46.990
which is always freaking me out in a way.[br]Because you’re fighting for Privacy and
0:22:46.990,0:22:51.480
suddenly your name is all over the place.[br]applause and laughter
0:22:51.480,0:22:56.010
applause
0:22:56.010,0:23:00.160
And this is the Irish High Court. So you…
0:23:00.160,0:23:04.130
It’s very complicated to[br]get a procedure like that.
0:23:04.130,0:23:07.610
The biggest issue is that you need money.[br]If you’re in front of an Irish Court
0:23:07.610,0:23:10.100
and you lose a case you[br]end up with a legal bill of
0:23:10.100,0:23:13.280
a couple of hundred thousand[br]Euros. Which is the reason why
0:23:13.280,0:23:17.600
nobody has ever challenged the[br]Irish Data Protection Commissioner.
0:23:17.600,0:23:21.510
Because you just gonna[br]lose your house over it!
0:23:21.510,0:23:25.900
So what I did is: we did a little[br]bit of crowd-funding! And
0:23:25.900,0:23:29.420
we actually got about 70.000 Euros out[br]of it. This was a crowd-funding platform
0:23:29.420,0:23:32.180
that basically worked in a way[br]that people could donate
0:23:32.180,0:23:35.950
and if we don’t need the money we either[br]donate it to another Privacy cause
0:23:35.950,0:23:39.710
or we actually give people the money[br]back. Which we got to have to do
0:23:39.710,0:23:43.490
because we won the case. And all[br]our costs are paid by the other side.
0:23:43.490,0:23:49.960
applause
0:23:49.960,0:23:55.660
So the fun thing is you then have to[br]walk into this wonderful old Court here
0:23:55.660,0:23:59.710
on Mondays at 11:30. And[br]there’s a room where you can
0:23:59.710,0:24:03.990
make your application. And about 100 other[br]people making their application as well.
0:24:03.990,0:24:06.850
And there is no number. So there[br]are 100 lawyers sitting in a room,
0:24:06.850,0:24:11.240
waiting for the judge to call out your[br]case. So we were sitting there until 4 PM
0:24:11.240,0:24:15.430
or something until suddenly our case was[br]called up. And we actually got kind of
0:24:15.430,0:24:19.260
the possibility to bring our case and then[br]it’s postponed to another date and
0:24:19.260,0:24:24.400
blablablablabla. In the end you[br]end up with something like this.
0:24:24.400,0:24:28.380
Which is all the paperwork[br]because in Ireland the Courts
0:24:28.380,0:24:33.020
are not computerized so far. So you[br]have to bring all the paperwork,
0:24:33.020,0:24:38.200
anything you rely on, in 3 copies.[br]And it’s all paper, numbered by hand,
0:24:38.200,0:24:43.260
so all these copies have pages 1 to 1000.[br]Someone writes the number on every page.
0:24:43.260,0:24:46.010
And then they copy it 3 times and it’s[br]then in this wonderful little thing.
0:24:46.010,0:24:49.810
I thought it’s great. And[br]what happened is that
0:24:49.810,0:24:53.630
we walked into the judge’s room and you[br]get a judge assigned on the same day.
0:24:53.630,0:24:57.690
So you end up in front of a judge[br]that has never heard about Privacy,
0:24:57.690,0:25:00.570
never heard about Facebook and[br]never heard about Snowden and PRISM
0:25:00.570,0:25:03.710
and any of this. So you walk into the[br]room as like “We would like to debate
0:25:03.710,0:25:08.220
the Safe Harbor with you” and he was like[br]“What the fuck is the Safe Harbor?”.
0:25:08.220,0:25:12.250
So what happened is that he told us to[br]kind of explain what it is for 15 minutes.
0:25:12.250,0:25:16.280
And then he postponed the[br]whole thing for 2 hours I think
0:25:16.280,0:25:19.960
and we walked over to a pub and had a[br]beer. So that the judge could at least
0:25:19.960,0:25:24.540
skim what he’s about to look into.
0:25:24.540,0:25:27.150
And Ireland is very interesting because[br]you need a Solicitor and a Counsel
0:25:27.150,0:25:30.900
and then the Counsel is actually talking[br]to the Judge. So I actually had 2 filters.
0:25:30.900,0:25:34.010
If I’m the client down here I had to[br]talk to my Solicitor. The Solicitor
0:25:34.010,0:25:38.520
was telling the Counsel what to say to the[br]Judge. So half of it was lost on the way.
0:25:38.520,0:25:41.730
And when I was asking if I could[br]just address the Judge personally
0:25:41.730,0:25:44.770
they were like “No, no way that you could[br]possibly address the Judge personally
0:25:44.770,0:25:46.240
even though you’re the claimant”.[br]Which is
0:25:46.240,0:25:48.110
funny ’cause they talk about this “person”
0:25:48.110,0:25:52.060
in the room. It’s like “What’s the problem[br]of this Mr. Schrems?”. And you’re like
0:25:52.060,0:25:57.040
sitting right here, it’s like[br]“This would be me!”.
0:25:57.040,0:26:01.060
So what happened in Ireland is that we[br]had about 10 reasons why under Irish law
0:26:01.060,0:26:05.960
the Irish Data Protection Commissioner[br]would have to do its job but the Court
0:26:05.960,0:26:09.880
actually wiped all of this from the table[br]and said actually the Safe Harbor
0:26:09.880,0:26:12.580
is the issue, which legally they’re[br]not allowed to do but which politically
0:26:12.580,0:26:16.500
was very wise and forwarded this[br]wonderful easy-to-understand question
0:26:16.500,0:26:19.710
to the European Court of Justice.
0:26:19.710,0:26:23.480
The reason why they put this kind[br]of very random question is that
0:26:23.480,0:26:27.710
if you challenge a law in Ireland you[br]have to get the Attorney General engaged.
0:26:27.710,0:26:30.090
And they didn’t want to do that[br]so they kind of “asked a question
0:26:30.090,0:26:33.890
around the actual question”[br]to not really get them engaged.
0:26:33.890,0:26:35.720
Which was very complicated[br]because we didn’t know how
0:26:35.720,0:26:38.610
the European Court of Justice would kind of[br]react to this random question because
0:26:38.610,0:26:41.530
it was so broad that they could just walk[br]any other direction and not address
0:26:41.530,0:26:47.330
the real issue. What was wonderful is[br]that in the judgment by the Irish Court
0:26:47.330,0:26:50.640
they have actually said that[br]all of this is factually true.
0:26:50.640,0:26:54.170
All the mass surveillance is factually[br]true. And the fun thing to understand
0:26:54.170,0:26:58.720
is that the factual assessment is done by[br]the national Courts. So the European
0:26:58.720,0:27:02.000
Court of Justice is not engaging in[br]factual matters anymore. They only
0:27:02.000,0:27:06.800
ask legal questions: “Is this legal or[br]not”. So we had a split of responsibility.
0:27:06.800,0:27:10.680
The Irish Court only said that all of this[br]is true. And Luxembourg only said
0:27:10.680,0:27:15.050
that all of this would be legal if all of[br]this would be true. Which was kind of
0:27:15.050,0:27:19.530
an interesting situation. But to be[br]fair no one before the European
0:27:19.530,0:27:23.010
Court of Justice has ever questioned[br]that this is true. So even the UK
0:27:23.010,0:27:26.270
that was in front of the Court and that[br]should possibly know if all of this
0:27:26.270,0:27:30.270
is true or not, they have never[br]questioned the facts. laughs
0:27:30.270,0:27:34.930
There is a pretty good factual basis.[br]What was interesting as well is
0:27:34.930,0:27:39.150
that I said I’m not gonna go in front[br]of the European Court of Justice.
0:27:39.150,0:27:44.240
Because the cost is so high that even the[br]60 or 70.000 Euros I got in donations
0:27:44.240,0:27:47.970
wouldn’t cover it. And I knew the judge[br]wants to get this hot potato off his table
0:27:47.970,0:27:52.280
and down to Luxembourg. So I was asking[br]for a so called “protective cost order”
0:27:52.280,0:27:55.470
which kind of tells you beforehand that[br]there is a maximum amount you have to pay
0:27:55.470,0:27:57.960
if you lose a case. And it[br]was actually the first
0:27:57.960,0:28:01.960
protective cost order[br]ever granted in Ireland.
0:28:01.960,0:28:06.020
Which was really cool and the Irish[br]were like outraged about it, too.
0:28:06.020,0:28:09.760
applause
0:28:09.760,0:28:12.960
So we basically walked into the[br]European Court of Justice which is
0:28:12.960,0:28:17.150
a really hefty procedure.[br]In this room were…
0:28:17.150,0:28:20.520
13 judges are in front of you. The[br]European Court of Justice has assigned it
0:28:20.520,0:28:24.150
to the Grand Chamber. So there is a[br]Small, a Medium and a Grand Chamber.
0:28:24.150,0:28:28.480
Which is the highest formation you can[br]possibly end up before in Europe. And
0:28:28.480,0:28:31.490
it’s chaired by the President of the[br]European Court of Justice. And this is
0:28:31.490,0:28:36.350
kind of where the really really basic,[br]really important questions are dealt with.
0:28:36.350,0:28:38.890
So I was like: “Cool, I’m getting to the[br]European Court of Justice!”. And it’s
0:28:38.890,0:28:41.930
funny because all the lawyers that were in[br]the room, everyone was like “I can plead
0:28:41.930,0:28:44.580
in front of the European Court of[br]Justice!”. They all took pictures like
0:28:44.580,0:28:47.040
they were in Disneyland or something.[br]audience laughing
0:28:47.040,0:28:52.970
And it was – lawyers can be[br]very… kind of… interesting. And
0:28:52.970,0:28:57.250
we ended up in front of these 3 major[br]people. It was the President,
0:28:57.250,0:29:00.860
Thomas von Danwitz – who is the German[br]judge and he also wrote the lead decision.
0:29:00.860,0:29:03.710
He’s the Judge Rapporteur, so within[br]the 13 judges there’s one that is
0:29:03.710,0:29:07.130
the reporting judge and actually drafts[br]the whole case. And he was also
0:29:07.130,0:29:12.760
doing the Data Retention case. And then there[br]was Yves Bot as the Advocate General.
0:29:12.760,0:29:15.220
The hearing was interesting[br]because we got questions
0:29:15.220,0:29:18.480
from the European Court of Justice before[br]the hearing. And in these questions
0:29:18.480,0:29:21.600
they were actually digging down into[br]the core issues of mass surveillance
0:29:21.600,0:29:25.930
in the US. When I got the questions[br]I was like “We won the case!” because
0:29:25.930,0:29:30.810
there’s no way they can decide differently[br]as soon as they address the question.
0:29:30.810,0:29:34.790
There were participants from all over[br]Europe. These are the countries,
0:29:34.790,0:29:37.660
then there was the European Parliament,[br]the European Data Protection Supervisor
0:29:37.660,0:29:40.770
and the European Commission.[br]There was me – MS down there,
0:29:40.770,0:29:44.940
the Data Protection Commissioner[br]and Digital Rights Ireland. And
0:29:44.940,0:29:49.310
what was interesting was the countries[br]that were not there. Germany, for example,
0:29:49.310,0:29:53.000
was not there in this major procedure.[br]And as far as I’ve heard there were
0:29:53.000,0:30:00.300
reasons of not getting too engaged in[br]the Transatlantic Partnership problem.
0:30:00.300,0:30:04.800
So this was kind of interesting because[br]the UK walked up but Germany was like:
0:30:04.800,0:30:08.130
“No, we rather don’t want[br]to say anything about this.”
0:30:08.130,0:30:12.370
What was interesting as well is that there[br]were interventions by the US Government.
0:30:12.370,0:30:16.470
So I heard… on a Tuesday we[br]were actually in the Court. And on the Monday
0:30:16.470,0:30:20.430
I got text messages from people of[br]these different countries telling me that
0:30:20.430,0:30:25.040
the US just called them up. And[br]I was like: “This is interesting”
0:30:25.040,0:30:28.180
because I know a lot of these people from[br]conferences and stuff. So they were like
0:30:28.180,0:30:31.210
telling me: “The US just called me[br]up and said they wanna talk to my
0:30:31.210,0:30:34.520
lead lead lead supervisor and tell me[br]what to say tomorrow in the Court”.
0:30:34.520,0:30:38.050
It was like: “This is very interesting!”.
0:30:38.050,0:30:44.070
I was actually in the Court room and there[br]was the justice person from the US embassy
0:30:44.070,0:30:47.650
to the European Union. And he was actually[br]watching the procedure and watching
0:30:47.650,0:30:50.430
what everybody was arguing.[br]Where I had a feeling this is
0:30:50.430,0:30:54.130
like a watchdog situation. And someone[br]pointed out that this is the guy,
0:30:54.130,0:30:57.420
so I knew who it is. And he was walking up[br]to me and asked: “Are you the plaintiff?”
0:30:57.420,0:31:02.960
And I said: “Yeah, hey!” and he was[br]trying to talk to me and I said:
0:31:02.960,0:31:06.420
“Did you manage calling everybody by now[br]or do you still need a couple of numbers?”
0:31:06.420,0:31:06.870
audience laughing
0:31:06.870,0:31:11.650
And he was like: “(?) arrogant!”. He was[br]like: “Did he really just ask this question?”.
0:31:11.650,0:31:15.590
He said: “No, we kind of we’re in contact[br]with all of our colleagues and of course
0:31:15.590,0:31:19.280
we have to kind of push for the interest[br]of the US” and blablabla. I thought:
0:31:19.280,0:31:22.840
“This is very interesting!”. But[br]anyway, it didn’t help them.
0:31:22.840,0:31:27.720
No one of them was really kind[br]of arguing for the US, actually.
0:31:27.720,0:31:30.440
The findings of the European Court of[br]Justice, so what was in the judgment
0:31:30.440,0:31:36.050
in the end. First of all, Safe Harbor[br]is invalid. Which was the big news.
0:31:36.050,0:31:39.340
And this was over night. We were expecting[br]that they would have a grace period
0:31:39.340,0:31:43.700
so it’s invalid within 3 months or[br]something like this. But the minute
0:31:43.700,0:31:48.450
they said it, all your data[br]transfers to the US were suddenly illegal.
0:31:48.450,0:31:58.710
applause[br]Which was kind of big.
0:31:58.710,0:32:02.070
The second biggie was that they actually[br]said that the essence of your rights
0:32:02.070,0:32:04.850
is violated. Now this, for an average[br]person, doesn’t mean too much.
0:32:04.850,0:32:10.270
But for a lawyer it says: “Oh my[br]god, the essence is touched!!”.
0:32:10.270,0:32:14.170
To explain to you what the essence is and[br]why everybody is so excited about it is:
0:32:14.170,0:32:17.820
basically when you assess a violation of[br]your rights, the lowest level is no interference.
0:32:17.820,0:32:20.420
So if a policeman was walking[br]down the street and watching you
0:32:20.420,0:32:24.030
there’s no interference with any of your[br]rights. If they, say, tapped your phone
0:32:24.030,0:32:27.450
there is some kind of proportionality[br]issue which is what we typically debate
0:32:27.450,0:32:30.950
before a Court. There is a system how[br]you argue if something is proportionate
0:32:30.950,0:32:34.750
or not. So e.g. Data Retention[br]was not proportionate.
0:32:34.750,0:32:37.950
And Data Retention would be somewhere[br]here probably. points to slide
0:32:37.950,0:32:41.800
So not legal anymore but[br]still in a proportionality test.
0:32:41.800,0:32:44.320
And then there is “the essence”[br]which means whatever the fuck
0:32:44.320,0:32:47.190
you’re trying to do here is totally[br]illegal because what you’re doing
0:32:47.190,0:32:49.690
is so much out of the scale[br]of proportionality that
0:32:49.690,0:32:53.530
it will never be legal. And on Data[br]Retention it actually said that
0:32:53.530,0:32:56.590
for the first time…[br]applause
0:32:56.590,0:33:01.890
applause
0:33:01.890,0:33:04.160
…and this was actually the[br]first time as far as I saw
0:33:04.160,0:33:07.290
that the European Court of Justice has[br]ever said that under the Charter.
0:33:07.290,0:33:11.410
So the Charter is only[br]in place since 2008, I think.
0:33:11.410,0:33:13.640
But it’s the first time they actually[br]found that in a case which was
0:33:13.640,0:33:18.120
huge for law in general. There[br]was a couple of findings
0:33:18.120,0:33:21.660
on Data Protection powers that[br]are not too interesting for you.
0:33:21.660,0:33:26.450
What may be interesting is that
0:33:26.450,0:33:29.760
– there is a story to this picture[br]that’s the reason I put it in –
0:33:29.760,0:33:32.310
basically they said that a[br]third country doesn’t have
0:33:32.310,0:33:35.600
to provide adequate protection, as[br]I said before. So the story was
0:33:35.600,0:33:39.210
that third countries originally had[br]to provide equivalent protection.
0:33:39.210,0:33:42.270
But there was lobbying going on,[br]so the word “equivalent” was
0:33:42.270,0:33:47.090
changed to “adequate”. And[br]“adequate” means basically nothing.
0:33:47.090,0:33:51.120
Because anything and nothing can be[br]adequate. “Adequate” has no legal meaning.
0:33:51.120,0:33:55.910
I mean if you ask what adequate[br]dress is – you don’t really know.
0:33:55.910,0:33:59.500
So they actually changed that back to the[br]wording that was lobbied
0:33:59.500,0:34:02.880
out of the law and said it has to be[br]“essentially equivalent” and that’s how
0:34:02.880,0:34:05.980
we now understand “adequate”. Which is[br]cool because any third country now
0:34:05.980,0:34:10.110
has to provide more or less the same[br]level of protection as Europe has.
0:34:10.110,0:34:15.469
There have to be effective detection[br]and supervision mechanisms. And
0:34:15.469,0:34:18.199
there has to be legal redress. Just[br]a really short thing on the picture:
0:34:18.199,0:34:20.899
I was actually just pointing at two[br]people and they were taking a picture
0:34:20.899,0:34:24.530
from down there to make it a Victory[br]sign. And that’s how the media
0:34:24.530,0:34:27.929
is then doing: “Whoo”.[br]making short Victory gesture
0:34:27.929,0:34:31.899
I have to speed up a little bit.[br]Not too much but a little bit.
0:34:31.899,0:34:36.070
The future, and I think that’s probably[br]relevant for you guys as well…
0:34:36.070,0:34:40.820
First of all, what this whole judgment[br]means. First of all the US
0:34:40.820,0:34:44.070
basically lost its privileged[br]status as being a country
0:34:44.070,0:34:47.480
that provides adequate [data] protection.[br]Which is kind of the elephant in the room
0:34:47.480,0:34:50.860
that everyone knew anyway, that they’re[br]not providing it. And now, officially,
0:34:50.860,0:34:55.300
they’re not providing it anymore. And the[br]US is now like any third country.
0:34:55.300,0:35:00.320
So like China or Russia or India or[br]any country we usually transfer data to.
0:35:00.320,0:35:03.750
So it’s not like you cannot transfer[br]data to the US anymore.
0:35:03.750,0:35:06.840
But they lost their special status.[br]Basically what the judgment said:
0:35:06.840,0:35:09.080
“You can’t have mass surveillance[br]and be at the same time
0:35:09.080,0:35:13.490
an adequately [data] protecting country”.[br]Which is kind of logical anyway.
0:35:13.490,0:35:16.540
The consequence is that you have to[br]use the derogations that are in the law
0:35:16.540,0:35:20.100
that we have for other countries as well.[br]So a lot of people said: “You know,
0:35:20.100,0:35:23.610
the only result will be that there will be[br]a consent box saying ‘I consent that my
0:35:23.610,0:35:27.480
[personal] data is going to the US.’”[br]Now the problem is: consent has to be
0:35:27.480,0:35:32.160
freely given, informed, unambiguous[br]and specific; under European law.
0:35:32.160,0:35:34.350
Which is something all the Googles[br]and Facebooks in the world have
0:35:34.350,0:35:36.870
never understood. That’s the reason[br]why all these Privacy Policies are
0:35:36.870,0:35:41.270
typically invalid. But anyway. So if[br]you have any of these wordings that
0:35:41.270,0:35:45.190
they’re currently using, like “Your data[br]is subject to all applicable laws” it’s
0:35:45.190,0:35:48.300
very likely not “informed” and[br]“unambiguous”. Because you don’t have
0:35:48.300,0:35:52.120
any fucking idea that your data is[br]ending up at the NSA if you read this.
0:35:52.120,0:35:55.510
So what they would have to do is to have[br]some Policy saying: “I agree that all of
0:35:55.510,0:35:59.530
my personal data is made available to the[br]NSA, FBI and whatsoever – YES/NO”.
0:35:59.530,0:36:02.080
applause[br]Because it has to be “freely given”, so
0:36:02.080,0:36:04.410
applause[br]I have to have the option to say “No”.
0:36:04.410,0:36:09.050
Now this would theoretically be possible[br]but under US law they’re placed under
0:36:09.050,0:36:10.510
a “gag order”, so they’re[br]not allowed to
0:36:10.510,0:36:13.610
say this. So they’re in a legal kind of
0:36:13.610,0:36:17.030
Limbo because on the one hand they have to[br]say: “It’s this way” but on the other hand
0:36:17.030,0:36:22.210
they have to say “No it’s not”. So consent[br]is not going to give you any solution.
0:36:22.210,0:36:25.760
Then there are Standard Contractual[br]Clauses. That’s the one from Apple that
0:36:25.760,0:36:28.130
they’re using right now.
0:36:28.130,0:36:31.940
And Standard Contractual Clauses allow[br]you to have a contract with a provider
0:36:31.940,0:36:36.220
in a third country. And that[br]pledges to you in a contract
0:36:36.220,0:36:40.950
that all your data is safe. The problem[br]is that they have exception clauses.
0:36:40.950,0:36:45.140
That basically say: “If there’s mass[br]surveillance your whole contract is void”
0:36:45.140,0:36:48.860
because you cannot have a contract[br]saying: “Hereby I pledge full Privacy”
0:36:48.860,0:36:52.870
and at the same time be subject to these[br]laws. And this is the interesting thing:
0:36:52.870,0:36:56.130
all these companies are saying: “Now we’re[br]doing Standard Contractual Clauses”,
0:36:56.130,0:36:58.400
but none of them are going to hold[br]up in Courts and everybody knows,
0:36:58.400,0:37:00.450
but of course to their shareholders[br]they have to tell: “Oh we have
0:37:00.450,0:37:04.160
a wonderful solution for this.”
0:37:04.160,0:37:07.860
The big question here is if we have[br]a factual or legal assessment.
0:37:07.860,0:37:12.690
So do we have to look at factually what[br]data is actually processed by the NSA
0:37:12.690,0:37:16.470
and what are they actually doing. Or do we[br]just have to look at the laws in a country
0:37:16.470,0:37:21.550
and the possibility of mass access. So the[br]factual assessment works fine for Apple,
0:37:21.550,0:37:26.120
Google etc. who are all in these Snowden[br]slides. If you look at the abstract and
0:37:26.120,0:37:30.660
legal assessment which is legally the[br]thing that probably we have to do
0:37:30.660,0:37:34.950
you actually end up with questions like[br]Amazon. Amazon was not a huge
0:37:34.950,0:37:38.280
cloud provider when the Snowden slides[br]were actually drafted and written.
0:37:38.280,0:37:42.410
They’re huge now. And very likely[br]they’re subject to all of these laws.
0:37:42.410,0:37:45.130
So how do we deal with a company like[br]this? Can we still forward [personal] data
0:37:45.130,0:37:48.500
to an Amazon cloud? If we know[br]they’re subject to these US laws.
0:37:48.500,0:37:51.480
So this is the question of which[br]companies are actually falling
0:37:51.480,0:37:56.310
under this whole judgment.
0:37:56.310,0:37:59.510
Basically you still have a couple of[br]other exemptions. So this basic thing
0:37:59.510,0:38:03.130
that a couple of people say that you’re[br]not allowed to book a hotel [room]
0:38:03.130,0:38:06.580
in the US anymore is not true. There[br]are a lot of exceptions in the law e.g.
0:38:06.580,0:38:10.580
the performance of a contract. So if[br]I book a hotel [room] in New York online
0:38:10.580,0:38:14.540
my [personal] data has to go to New York[br]to actually book my hotel [room]. So
0:38:14.540,0:38:17.950
in all these cases you can still transfer[br][personal] data. The ruling is mainly
0:38:17.950,0:38:20.550
on outsourcing. So if you could[br]theoretically have your [personal] data
0:38:20.550,0:38:24.210
in Europe you’re just not choosing because[br]it’s cheaper to host it in the US or
0:38:24.210,0:38:29.280
it’s easier or it’s more convenient. In[br]these cases we actually get problems.
0:38:29.280,0:38:33.700
So what we did is we had a second round[br]of complaints. That is now taking
0:38:33.700,0:38:38.290
these judgments onboard. You can download[br]them on the web page as well. And there’s
0:38:38.290,0:38:43.440
also the deal that Facebook Ireland[br]has signed with Facebook US
0:38:43.440,0:38:48.760
to keep your data safe. And this is[br]currently under investigation in Ireland.
0:38:48.760,0:38:52.720
Basically I argued that they have a[br]contract but the contract is void because
0:38:52.720,0:38:56.650
US law says they have to do all this mass[br]surveillance. I just got the letter that
0:38:56.650,0:39:01.350
on November, 18th Facebook has actually[br]given them [to the DPC] a huge amount
0:39:01.350,0:39:06.650
of information on what they’re actually[br]doing with the data. This is now going
0:39:06.650,0:39:10.160
to be under investigation. The big[br]question is if the DPC in Ireland is
0:39:10.160,0:39:13.260
actually giving us access to this[br]information. Because so far all these
0:39:13.260,0:39:17.470
evidence that they had they said:[br]“it’s all secret and you cannot know
0:39:17.470,0:39:20.500
what Facebook is doing with your data[br]even though you’re fully informed about
0:39:20.500,0:39:24.220
what they’re doing with your data.”[br]Which is kind of interesting as well. But
0:39:24.220,0:39:30.260
– different issue. A big question was also[br]if there’s gonna be a Safe Harbor 2.0.
0:39:30.260,0:39:33.010
I already was told by everybody they’re[br]not gonna call it a Safe Harbor anymore
0:39:33.010,0:39:37.350
because they’re stuck with media[br]headlines like “Safe Harbor is sunk”
0:39:37.350,0:39:40.270
or something like this.
0:39:40.270,0:39:43.320
And what happened is that the US has done[br]a huge lobbying effort. They have said
0:39:43.320,0:39:46.490
right on the day that all of this is based[br]on wrong facts and they’ve never done
0:39:46.490,0:39:50.720
any of this; and all of this[br]is Trade War; and blablablabla.
0:39:50.720,0:39:53.510
So they put a lot of pressure on them.[br]I was actually talking to Jourová,
0:39:53.510,0:39:57.210
the Justice Commissioner. And I was[br]impressed by her. She actually took
0:39:57.210,0:40:00.650
a whole hour and she really knew what[br]was going on. And at the time they had
0:40:00.650,0:40:04.120
press releases saying: “We’re really[br]deeply working on the new Safe Harbor”.
0:40:04.120,0:40:07.390
And I was asking Jourová: “Did you get
0:40:07.390,0:40:10.850
such a finding?” And the answer[br]was: “Yeah, we’re still waiting for it.
0:40:10.850,0:40:13.770
We should get it next week”.[br]Which basically meant this
0:40:13.770,0:40:17.810
is never going to work out anymore. But[br]of course I think there’s a blame game
0:40:17.810,0:40:21.700
going on. The EU has to say: “We[br]tried everything to find a solution”
0:40:21.700,0:40:24.940
and the US is saying: “We tried[br]everything to find a solution, too”.
0:40:24.940,0:40:27.220
And then in the end they will blame[br]each other for not finding a solution.
0:40:27.220,0:40:30.530
That’s my guess. But[br]we’ll see what happens.
0:40:30.530,0:40:34.350
The basic problem with a Safe Harbor 2[br]is that in the government sector
0:40:34.350,0:40:38.380
they’d basically have to rewrite the whole[br]US legal system. Which they haven’t done
0:40:38.380,0:40:43.660
for their own citizens. So they will very[br]likely not do it for European citizens.
0:40:43.660,0:40:46.640
Like judicial redress. Not even an[br]American has judicial redress. So
0:40:46.640,0:40:50.110
they would never give that to a European.[br]And the private area: they actually
0:40:50.110,0:40:53.750
have to redraft the whole Safe Harbor[br]principles because they now have to be
0:40:53.750,0:40:57.460
essentially equivalent of[br]what Europe is doing.
0:40:57.460,0:41:00.380
So this would also protect people on[br]the private sphere much more but it
0:41:00.380,0:41:05.200
would really take a major overhaul of[br]the whole system. To give you an idea:
0:41:05.200,0:41:08.300
all of these processing operations[br]are covered by European law. So
0:41:08.300,0:41:12.580
from collection all the way[br]to really deleting the data.
0:41:12.580,0:41:16.610
This is what’s covered by the Safe[br]Harbor principles. Only 2 operations
0:41:16.610,0:41:20.250
which is the disclosure by “transmission”[br]and the “change of purpose”. Anything else
0:41:20.250,0:41:23.730
they can do as freely as they wanna do[br]under the current Safe Harbor things.
0:41:23.730,0:41:26.960
So if you talk about “essentially[br]equivalent” you see on these spaces
0:41:26.960,0:41:30.540
already points to slide [br]that this is miles apart.
0:41:30.540,0:41:35.350
So what is the future of EU-US data[br]flows? We will have massive problems
0:41:35.350,0:41:38.130
for the PRISM companies. Because[br]what they’re doing is just a violation
0:41:38.130,0:41:40.950
of our Fundamental Rights. Like it or[br]not – you can change the law as much
0:41:40.950,0:41:44.680
as you want but you cannot[br]change the Fundamental Rights.
0:41:44.680,0:41:47.770
And you’ll have serious problems[br]for businesses that are subject
0:41:47.770,0:41:51.490
to US surveillance law in[br]general. So I’m wondering
0:41:51.490,0:41:54.580
what the final solution is. And that[br]was part of the issue that I had
0:41:54.580,0:41:57.500
with the cases. Typically I like[br]to have a solution for all of this.
0:41:57.500,0:42:01.450
In this case I could only point at the[br]problems but I couldn’t really come up
0:42:01.450,0:42:05.980
with solutions. Because solutions are[br]something that has to be done politically.
0:42:05.980,0:42:11.190
An interesting question was: “How[br]about EU surveillance, actually?”
0:42:11.190,0:42:15.380
Because aren’t they doing more or[br]less the same thing? Which is true.
0:42:15.380,0:42:18.690
And the problem is that the Charter of[br]Fundamental Rights only applies
0:42:18.690,0:42:23.300
to anything that’s regulated by the EU.[br]And national surveillance is exempt
0:42:23.300,0:42:28.720
from any EU law. It’s something that[br]member states are doing all by themselves.
0:42:28.720,0:42:32.650
So you’re out of luck here. You[br]can possibly argue it through
0:42:32.650,0:42:37.840
a couple of circles; but it’s hard to[br]do. However, Articles 7 and 8 of the Charter
0:42:37.840,0:42:42.210
have exactly the same wording as the[br]European Convention on Human Rights.
0:42:42.210,0:42:46.270
And this applies to National Security[br]cases. So the relevant Court here
0:42:46.270,0:42:49.640
is actually in Strasbourg. So you[br]could probably end up at this Court
0:42:49.640,0:42:53.030
with the same argument and say: if they[br]already found that this is a violation
0:42:53.030,0:42:56.370
of your essence in Luxembourg – don’t[br]you want to give us the same rights
0:42:56.370,0:43:00.520
in Strasbourg as well? And these two[br]Courts are in kind of a fight about
0:43:00.520,0:43:04.370
kind of providing proper Privacy[br]protection and protection in general.
0:43:04.370,0:43:08.240
So very likely you can walk up with[br]a German case or with a UK case
0:43:08.240,0:43:11.240
or a French case and pretty much do[br]the same thing here. So the judgment
0:43:11.240,0:43:13.710
will be interesting for European[br]surveillance as well because
0:43:13.710,0:43:16.690
it’s a benchmark. And you can hardly[br]argue that the US is bad and we’re
0:43:16.690,0:43:21.200
not doing the same thing. Other solutions[br]are possibly technical solutions.
0:43:21.200,0:43:26.260
So what Microsoft did with the cloud[br]services: hosting them with
0:43:26.260,0:43:31.190
the German Telekom. And there[br]is really the issue that if you can get
0:43:31.190,0:43:36.090
a technical solution of not having any[br]access from the US side you can actually
0:43:36.090,0:43:39.440
get out of the whole problem. So you can[br]try with encryption or data localization;
0:43:39.440,0:43:43.210
all this kind of stuff. However none[br]of this is really a very sexy solution
0:43:43.210,0:43:48.530
to the whole issue. However it's[br]something that you can possibly do.
0:43:48.530,0:43:53.660
Last thing: enforcement. And this is a[br]little bit of a pitch, I got to confess.
0:43:53.660,0:43:57.170
We have the problem so far that
0:43:57.170,0:44:00.430
we have Data Protection law in Europe.
0:44:00.430,0:44:03.090
But we don’t really have enforcement. And[br]the problem is that the lawyers don’t know
0:44:03.090,0:44:06.940
what’s happening technically. The[br]technical people hardly know
0:44:06.940,0:44:10.550
what the law says. And then you[br]have a funding issue. So the idea
0:44:10.550,0:44:13.210
that I have right now is to create some[br]kind of an NGO or some kind of
0:44:13.210,0:44:17.570
a “Stiftung Warentest for Privacy”. To[br]kind of look into the devices we all have
0:44:17.570,0:44:20.880
and kind of have a structured system of[br]really looking into it. And then probably
0:44:20.880,0:44:24.670
do enforcement as well if your[br]stuff that you have on your device
0:44:24.670,0:44:28.750
is not following European law.[br]I think this is an approach that
0:44:28.750,0:44:31.860
probably changes a lot of the issues.[br]It’s not gonna change everything.
0:44:31.860,0:44:35.680
But this could possibly be a solution to[br]a lot of what we had. And that’s kind of
0:44:35.680,0:44:39.650
what we did in other fields of law as[br]well. That we have NGOs or organizations
0:44:39.650,0:44:42.780
that take care of these things. I think[br]that would be a solution and probably
0:44:42.780,0:44:48.090
helps a little bit. Last - before we[br]have a question/answer session –
0:44:48.090,0:44:51.910
a little Bullshit Bingo to probably get a[br]couple of questions answered right away.
0:44:51.910,0:44:55.990
So the first thing is that a lot[br]of questions are if the EU
0:44:55.990,0:44:59.100
does the same thing. I just answered it:[br]Of course they do the same thing and
0:44:59.100,0:45:01.720
we’ll have to do something about it[br]as well. And I hope that my case
0:45:01.720,0:45:06.850
is a good case to bring other cases[br]against member states of the EU.
0:45:06.850,0:45:10.460
The second question is these whole PRISM[br]companies are saying they don’t do this.
0:45:10.460,0:45:14.420
It’s absurd because they’re all placed[br]under gag orders. Or the people that are
0:45:14.420,0:45:17.360
talking to us don’t even have the[br]security clearance to talk about
0:45:17.360,0:45:20.990
the surveillance system. So it’s insane[br]when a PR person comes up and says:
0:45:20.990,0:45:25.040
“I hereby read the briefing from Facebook[br]that we’re not doing this!”. Which
0:45:25.040,0:45:28.620
basically is what we have right now.[br]And that’s what a lot of the media
0:45:28.620,0:45:32.530
is referring to as well. Another thing[br]that Facebook and the US government
0:45:32.530,0:45:36.110
have argued later is that they weren’t[br]asked. They were not invited to the Court
0:45:36.110,0:45:40.510
procedure. The fun thing is: both of them[br]totally knew about the Court procedure.
0:45:40.510,0:45:44.780
They just decided not to step in and not[br]to become a party to the procedure. So they
0:45:44.780,0:45:46.620
were like first: “Ouh,[br]we don’t wanna talk
0:45:46.620,0:45:49.720
about it” and then when the decision
0:45:49.720,0:45:53.520
came around they were like:[br]“Oh we weren’t asked”.
0:45:53.520,0:45:56.970
Of course it’s a win-on-paper mainly[br]but we’re trying to get it implemented
0:45:56.970,0:46:01.250
in practice as well. And there[br]is kind of this argument
0:46:01.250,0:46:05.490
“The EU has broken the Internet”[br]which I typically rebut with “No, the US
0:46:05.490,0:46:08.990
has broken the Internet and[br]the EU is reacting to it”.
0:46:08.990,0:46:16.150
applause
0:46:16.150,0:46:19.490
Another issue that was interesting[br]is that a lot of the US side said that
0:46:19.490,0:46:23.710
this is protectionism. So the EU is only[br]enforcing these Fundamental Rights
0:46:23.710,0:46:28.870
to hurt US companies. Which is funny[br]because I’m not involved in kind of
0:46:28.870,0:46:31.940
getting more trade to Europe. I’m[br]just like someone interested in
0:46:31.940,0:46:36.340
my Fundamental Rights. And secondly the[br]European politics has done everything
0:46:36.340,0:46:40.610
to kind of not get this case to cross.[br]So kind of this idea that this is
0:46:40.610,0:46:45.100
a protectionist thing is kind of strange,[br]too. And the last question which is:
0:46:45.100,0:46:49.200
“What about the Cables? What about all[br]the other types of surveillance we have?”
0:46:49.200,0:46:53.940
They’re an issue, too. In these cases you[br]just have more issues of actual hacking,
0:46:53.940,0:46:58.870
government hacking, basically. So[br]illegal access to servers and cables.
0:46:58.870,0:47:02.050
Which is harder to tackle than[br]these companies. Because we have
0:47:02.050,0:47:05.720
this private interference. So there are a[br]lot of other issues around here as well,
0:47:05.720,0:47:09.130
I was just happy to kind of get one thing[br]across. And I’m happy for questions,
0:47:09.130,0:47:13.470
as well.[br]applause
0:47:13.470,0:47:21.300
Herald: Alright…[br]applause
0:47:21.300,0:47:22.989
Max: at lowered voice, in German[br]How much longer do we have for questions?
0:47:22.989,0:47:30.000
Herald: We have about[br]10 minutes for questions.
0:47:30.000,0:47:33.960
I would ask you to please line up at[br]the microphones here in the hall.
0:47:33.960,0:47:37.260
We have 6 microphones. And we have also
0:47:37.260,0:47:40.420
questions from the IRC.[br]While you guys queue up
0:47:40.420,0:47:44.220
I would take one from the internet.
0:47:44.220,0:47:48.410
Signal Angel: Yeah, just[br]one – for the first time.
0:47:48.410,0:47:51.650
Does TTIP influence any of this?
0:47:51.650,0:47:55.600
Max: Basically, not really. Because[br]the judgment that was done was
0:47:55.600,0:47:58.240
on the Fundamental Rights. So if they[br]have some kind of wording in TTIP
0:47:58.240,0:48:02.740
it would again be illegal. And there was[br]actually a push to get something like that
0:48:02.740,0:48:08.750
into TTIP. And as far as I know this idea[br]was dead after the judgment. laughs
0:48:08.750,0:48:13.410
Just a little note: EDRi has organized[br]an ask-me-anything thing at 7 PM as well.
0:48:13.410,0:48:17.530
So if you got specific questions, you[br]can also go there. Just as a reminder.
0:48:17.530,0:48:20.980
Herald: OK, great.[br]Microphone No.2, please.
0:48:20.980,0:48:25.220
Question: Thank you for your[br]efforts. The question would be:
0:48:25.220,0:48:28.260
Could US businesses[br]under these findings ever
0:48:28.260,0:48:33.930
be again employed[br]in critical sectors? E.g.
0:48:33.930,0:48:39.380
public sector, Windows in the[br]Bundestag, e.g. and stuff like that?
0:48:39.380,0:48:44.230
Max: Yep, yep. That’s a huge problem.[br]And that’s a problem we had for a while.
0:48:44.230,0:48:48.650
I was mainly talking actually with people[br]in the business area. I’m mainly invited
0:48:48.650,0:48:51.460
to conferences there. And people[br]were telling me: “Yeah, we’re doing
0:48:51.460,0:48:55.400
all our bank data on Google[br]now”. And I was like: WTF?
0:48:55.400,0:48:58.670
Because this is not only Privacy.[br]That’s also trade secrets, all of
0:48:58.670,0:49:02.680
this kind of stuff. So there is this[br]huge issue and if you talk about
0:49:02.680,0:49:06.970
the new Windows that is talking home a[br]little more than the old did, you probably
0:49:06.970,0:49:11.010
have the same issue here because[br]Microsoft is falling under the same thing.
0:49:11.010,0:49:14.200
Q: No plausible deniability,[br]therefore culpability.
0:49:14.200,0:49:16.170
M: Yep, yep, yep.[br]Q: Thank you!
0:49:16.170,0:49:18.119
Max: Thank you!
0:49:18.119,0:49:20.970
Herald: OK, microphone No.3,[br]please, for your next question.
0:49:20.970,0:49:24.920
Question: How would you assess[br]Microsoft saying they put up
0:49:24.920,0:49:29.060
a huge fight that they… well,
0:49:29.060,0:49:32.680
they said they had customers’[br]data in Ireland and they said
0:49:32.680,0:49:36.070
they refuse to give it to the FBI.[br]What’s to think of that?
0:49:36.070,0:49:38.450
Max: I think to be fair a lot of[br]these companies have realized
0:49:38.450,0:49:42.780
that there is an issue. And that they have[br]a “Feuer am Arsch” (a fire at their backside).
0:49:42.780,0:49:47.170
And Microsoft… actually a couple of[br]Microsoft people are talking to me
0:49:47.170,0:49:50.780
and are like: “We’re actually not[br]unhappy about this case because
0:49:50.780,0:49:54.620
we have a good argument in the US[br]now that we’re getting troubles here…”
0:49:54.620,0:49:59.330
But the companies are between[br]these 2 chairs. The US law says:
0:49:59.330,0:50:02.500
“We kill you if you’re not giving us all[br]the data” and the problem so far is
0:50:02.500,0:50:06.170
that in the EU… e.g. in Austria
0:50:06.170,0:50:09.800
the maximum penalty is 25.000 Euro[br]if you don’t comply with this.
0:50:09.800,0:50:11.950
Q: Peanuts.[br]M: Which is absurd.
0:50:11.950,0:50:15.430
And in most other countries it’s the same.[br]We now have the Data Protection regulation
0:50:15.430,0:50:18.060
that is coming up which gives[br]you a penalty of a maximum
0:50:18.060,0:50:21.580
of 4% of the worldwide turnover,[br]which is a couple of millions.
0:50:21.580,0:50:25.060
And if you want to thank someone there’s[br]Jan Philipp Albrecht, probably in the room
0:50:25.060,0:50:28.520
or not anymore, who is the member of [EU][br]Parliament from the Green Party, that’s
0:50:28.520,0:50:31.920
actually from Hamburg who[br]has negotiated all of this.
0:50:31.920,0:50:34.500
And this actually could possibly[br]change a couple of these things.
0:50:34.500,0:50:38.290
But you have this conflict of laws[br]and solutions like the Telekom thing –
0:50:38.290,0:50:41.650
that you host the data with the Telekom –[br]could possibly allow them to argue
0:50:41.650,0:50:44.690
in the US that they don’t have any factual[br]access anymore so they can’t give the data
0:50:44.690,0:50:49.150
to the US Government. But we’re splitting[br]the internet here. And this is not really
0:50:49.150,0:50:53.270
something I like too much, but[br]apparently the only solution.
0:50:53.270,0:50:55.850
Herald: OK, thank you for your[br]question. We have another one
0:50:55.850,0:51:00.210
at microphone 4, please.
0:51:00.210,0:51:07.369
Q: Thank you very much for your[br]efforts, first of all. And great result!
0:51:07.369,0:51:09.630
M: Thank you.[br]Q: The question from me would also be:
0:51:09.630,0:51:12.800
Is there any change in the system[br]in Ireland now? So if somebody has
0:51:12.800,0:51:16.190
a similar struggle to yours – the[br]next round might be easier or not?
0:51:16.190,0:51:19.650
Max: Basically what the Irish DPC got[br]is a wonderful new building. And
0:51:19.650,0:51:22.590
the press release is too funny.[br]Because it says: “We have a very nice
0:51:22.590,0:51:27.540
Victorian building now downtown Dublin[br]in a very nice neighborhood“ and blablabla
0:51:27.540,0:51:30.020
and they get double the staff of what[br]they had before. The key problem
0:51:30.020,0:51:33.630
is none of this. I only took the picture[br]because it kind of shows what’s
0:51:33.630,0:51:37.630
inside the building. And the key[br]problem is that we have 2 countries
0:51:37.630,0:51:40.390
– Luxembourg and Ireland, where[br]all of these headquarters are – and
0:51:40.390,0:51:44.080
these 2 countries are not interested[br]in collecting taxes, they’re
0:51:44.080,0:51:48.260
not interested in enforcing Privacy Law,[br]they’re not interested in any of this. And
0:51:48.260,0:51:52.360
they’re basically getting a huge bunch of[br]money in the back of the rest of the EU.
0:51:52.360,0:51:55.869
And until this actually changes[br]and there’s a change of attitude
0:51:55.869,0:52:00.460
in the Irish DPC it doesn’t really[br]matter in which building they are.
0:52:00.460,0:52:03.710
So they got a lot of more money to kind[br]of – to the public – say: “Yes we have
0:52:03.710,0:52:05.660
more money and we have[br]more staff and dadadadada”…
0:52:05.660,0:52:07.700
Q: …but the system did not change!
0:52:07.700,0:52:10.680
M: The big question is what the system is[br]doing: they can prove it now! As they have
0:52:10.680,0:52:14.180
the new complaint on their table on Safe[br]Harbor and PRISM and Facebook.
0:52:14.180,0:52:16.740
They can prove whether they do something[br]about it or not – my guess is that
0:52:16.740,0:52:19.390
they’ll find “some” random reasons[br]why unfortunately they couldn’t do
0:52:19.390,0:52:23.229
anything about it. We’ll see.[br]Q: OK, thanks!
0:52:23.229,0:52:26.490
Herald: OK, thank you! It’s[br]your turn, microphone No.2.
0:52:26.490,0:52:32.490
Question: OK, thank you very much and also[br]thank you for your service for the public.
0:52:32.490,0:52:40.980
M: Thanks for the support![br]applause
0:52:40.980,0:52:45.540
Q: What that will…[br]Sorry about the English…
0:52:45.540,0:52:50.030
M: in German[br]Say it in German!
0:52:50.030,0:52:57.869
Q: in German[br]What does this actually mean for the
0:52:57.869,0:53:03.700
whole Data Retention story, if it comes[br]back now? And to what extent will
0:53:03.700,0:53:08.410
Social Media now be exempted from it[br]again, so to speak, or not?
0:53:08.410,0:53:12.060
M: To be honest I didn’t really look[br]into the German Data Retention thing
0:53:12.060,0:53:15.610
too much. To be honest, being an Austrian[br]I’m like “Our Supreme Cou… Constitu…”
0:53:15.610,0:53:17.060
Q: Me, too![br]audience laughing
0:53:17.060,0:53:21.210
M: Yeah, I heard. “Our Constitutional[br]Court kind of killed it”, so…
0:53:21.210,0:53:25.060
I don’t think we’ll see a Data[br]Retention in Austria too soon.
0:53:25.060,0:53:27.380
But for Germany it’s gonna be interesting[br]especially if you find a way to
0:53:27.380,0:53:31.490
go to Luxembourg in the end. Like if you[br]find some hook to say: “Actually,
0:53:31.490,0:53:34.930
this German law violates something[br]in the Data Protection Regulation
0:53:34.930,0:53:38.980
or in the Directive“. So we can probably[br]find a way to go back to Luxembourg.
0:53:38.980,0:53:41.290
Could help. The other thing is that just[br]the fact that the Luxembourg Court
0:53:41.290,0:53:46.100
has been so active has probably boosted[br]up a lot of the National Courts as well.
0:53:46.100,0:53:50.020
Because the German decision, I had[br]the feeling was like a “We don’t really
0:53:50.020,0:53:53.490
feel like we can fully say that this is[br]actually illegal and we kind of argued
0:53:53.490,0:53:56.230
that it’s somehow not illegal the way[br]it is, but possibly you can do it
0:53:56.230,0:54:00.090
in the future and uooah…“. And after[br]Luxembourg has really thrown
0:54:00.090,0:54:04.790
all of this right out of the door and was[br]like: “Get lost with your Data Retention
0:54:04.790,0:54:08.200
thing and especially with the PRISM thing”[br]you probably have better case law now,
0:54:08.200,0:54:11.240
as well. And that could be relevant[br]for National Courts as well. Because
0:54:11.240,0:54:16.160
of course these things are question of[br]proportionality. And if we ask everybody
0:54:16.160,0:54:18.430
here in the room what they[br]think is proportionate or not,
0:54:18.430,0:54:22.400
everyone has another opinion. And[br]therefore it’s relevant what our people
0:54:22.400,0:54:25.260
are saying and what other Courts are[br]saying to probably get the level of
0:54:25.260,0:54:28.780
what we feel as proportionate[br]somehow a little bit up.
0:54:28.780,0:54:31.640
Q: So thank you very much. And go on![br]M: Thank you!
0:54:31.640,0:54:35.610
Herald: OK, just for the record, in[br]case you couldn’t tell by the answer:
0:54:35.610,0:54:39.680
the question was about the implications[br]for the Data Retention Laws, like
0:54:39.680,0:54:44.690
in Germany and Austria. Microphone[br]No.1, we have another question.
0:54:44.690,0:54:48.750
Question: Hi! Two questions. One: could[br]you tell a little bit more about your idea
0:54:48.750,0:54:55.850
of “Stiftung Datenschutz” Europe-wide?[br]And how do we get funding to you…
0:54:55.850,0:54:58.690
M: Send me an email![br]Q: …if you don’t mind?
0:54:58.690,0:55:03.810
Second question: when I argue with people[br]about like “Let’s keep the personal data
0:55:03.810,0:55:08.300
of all activists within Europe!” I always[br]get this answer: “Yeah, are you so naive,
0:55:08.300,0:55:11.770
do you think it’s anything different[br]if the server stands in Frankfurt
0:55:11.770,0:55:14.170
instead of San Francisco?”[br]What do you say to that?
0:55:14.170,0:55:17.670
Max: The same problem, like pretty much[br]what we have is – and that’s the reason
0:55:17.670,0:55:20.410
why I said I hope this judgment is used[br]for National surveillance in Europe,
0:55:20.410,0:55:25.190
as well. Because we do have the same[br]issues. I mean when you are an Austrian
0:55:25.190,0:55:29.560
and the German “Untersuchungsausschuss”[br](inquiry committee) is basically saying: “Ah, we’re only
0:55:29.560,0:55:32.840
protecting Germans” I feel like my[br]fucking data is going through Frankfurt
0:55:32.840,0:55:36.750
all the times. And I’m kind of out of[br]the scope, apparently. So we do need
0:55:36.750,0:55:39.369
to take care of this as well. I hope[br]that this is a case showing that
0:55:39.369,0:55:42.580
you can actually take action. You[br]just have to poke long enough and
0:55:42.580,0:55:47.770
kind of poke at the right spot especially.[br]And I think this is something that…
0:55:47.770,0:55:51.280
there’s not an ultimate solution to it.[br]It’s just one of the kind of holes
0:55:51.280,0:55:54.869
that you have. The other thing that we[br]may see is that a lot of companies that
0:55:54.869,0:55:59.630
are holding this data are much more[br]questioning an order they get.
0:55:59.630,0:56:03.540
Because if they get legal problems from[br]an order they got by a German Court
0:56:03.540,0:56:08.210
or whatever it is they probably[br]are now more interested in – and
0:56:08.210,0:56:10.530
actually looking at it. Because[br]right now it’s cheaper for them
0:56:10.530,0:56:14.390
to just forward the data. You don’t need[br]a whole Legal Team, reviewing it all.
0:56:14.390,0:56:17.760
So I think to kind of split the private[br]companies that are helping them
0:56:17.760,0:56:21.880
from the Government and kind of get some[br]issue between them probably helps there,
0:56:21.880,0:56:25.010
as well. But of course it’s just like[br]little peanuts you put in there.
0:56:25.010,0:56:28.930
But in the end you have that[br]issue, in the end. Yeah.
0:56:28.930,0:56:34.660
On the “Stiftung Datenschutz” or whatever:[br]I think that’s kind of a thing I just
0:56:34.660,0:56:37.710
wanted to blow out to people here, because[br]I’m mainly in the legal sphere and in,
0:56:37.710,0:56:43.200
like the activist/consumer side. And[br]I think that’s the big problem we have
0:56:43.200,0:56:48.249
on the legal and consumer side: we[br]don’t understand the devices that much.
0:56:48.249,0:56:51.050
And we lack the evidence. We[br]don’t really have the evidence of
0:56:51.050,0:56:54.050
what’s actually going on on devices[br]and you need to have this evidence
0:56:54.050,0:56:56.869
if you go in front of Courts. I think[br]a lot of the people in the room probably
0:56:56.869,0:57:00.890
have this evidence somewhere on the[br]computer. So the idea of really getting
0:57:00.890,0:57:03.670
this connection at some point – it’s not[br]something I can pitch to you right away,
0:57:03.670,0:57:07.220
because it’s not… like I don’t wanna[br]start it tomorrow. But it’s something
0:57:07.220,0:57:10.560
I wanted to circulate to get feedback[br]as well, what you guys think of it.
0:57:10.560,0:57:15.559
So if there’s any feedback on it, send me[br]an email, or twitter. Or whatever it is.
0:57:15.559,0:57:21.520
applause
0:57:21.520,0:57:23.789
Herald: So we do have a bit time left,
0:57:23.789,0:57:26.640
microphone No.2 with the[br]next question, please.
0:57:26.640,0:57:31.390
Question: What can I do as an individual[br]person now? Can I sue Google
0:57:31.390,0:57:35.770
or can I sue other companies[br]just to stop this?
0:57:35.770,0:57:40.160
And would it create some[br]pressure if I do that?
0:57:40.160,0:57:43.040
So what can the ordinary[br]citizen do now?
0:57:43.040,0:57:47.160
Max: We’re right now… I already prepared[br]it but I didn’t have time to send it out
0:57:47.160,0:57:50.070
to have complaints against the Googles and[br]all the others that are on the PRISM list.
0:57:50.070,0:57:53.180
We started with Facebook because I kind[br]of know them the best. And, you know, so
0:57:53.180,0:57:57.430
it was a good start. And the idea[br]was really to have other people
0:57:57.430,0:58:00.790
probably copy-pasting this. The complaint[br]against Facebook we actually filed
0:58:00.790,0:58:05.109
with the Hamburg DPC, as well, and the[br]Belgian DPC. The idea behind it was
0:58:05.109,0:58:08.670
that the Irish now suddenly have 2 other[br]DPCs that are more interested in
0:58:08.670,0:58:12.940
enforcing the law in their boat. So[br]they’re not the only captains anymore.
0:58:12.940,0:58:17.330
And it’s interesting what’s gonna happen[br]here. If there are other people
0:58:17.330,0:58:21.660
that have other cases and just file a[br]complaint with your Data Protection
0:58:21.660,0:58:25.200
authority, a lot of them, especially the[br]German Data Protection authorities
0:58:25.200,0:58:28.940
– most of them – are really interested[br]in doing something about it, but
0:58:28.940,0:58:32.260
they oftentimes just need a case. They[br]need someone to complain about it and
0:58:32.260,0:58:37.070
someone giving them the evidence and[br]someone arguing it, to get things started.
0:58:37.070,0:58:41.760
So if anyone is using Google Drive[br]or something like that – let’s go.
0:58:41.760,0:58:44.750
And basically the wording is on our web[br]page. You just have to download it
0:58:44.750,0:58:50.300
and reword it. And we’re gonna probably[br]publish on the website the complaints
0:58:50.300,0:58:53.080
against the other companies, as soon as[br]they’re out. Probably in the next 2–3 weeks.
0:58:53.080,0:59:00.270
Something like this. So just[br]copy-paste and spread the love.
0:59:00.270,0:59:04.260
Herald: OK, thank you[br]very much, Max, again!
0:59:04.260,0:59:07.530
For your great talk. This is it!
0:59:07.530,0:59:10.430
postroll music
0:59:10.430,0:59:17.919
Subtitles created by c3subtitles.de[br]in 2016. Join and help us do more!