1
00:00:00,000 --> 00:00:09,520
32C3 preroll music
2
00:00:09,520 --> 00:00:13,210
Herald: Our next talk is
called “Safe Harbor”.
3
00:00:13,210 --> 00:00:18,019
Background is: back in October, in
the light of the Snowden revelations
4
00:00:18,019 --> 00:00:22,829
the Court of Justice of the European
Union – that’s the “EuGH” in German
5
00:00:22,829 --> 00:00:29,159
declared the Safe Harbor agreement
between the EU and the US invalid.
6
00:00:29,159 --> 00:00:32,500
This talk is about how we got there
as well as further implications
7
00:00:32,500 --> 00:00:37,470
of that decision. Please believe me when
I say our speaker is ideally suited
8
00:00:37,470 --> 00:00:42,730
to talk about that topic. Please give it
up for the man actually suing Facebook
9
00:00:42,730 --> 00:00:45,990
over Data Protection concerns:
Max Schrems!
10
00:00:45,990 --> 00:00:50,940
applause and cheers
11
00:00:50,940 --> 00:00:53,290
Max Schrems: Hallo! Hey!
applause and cheers
12
00:00:53,290 --> 00:01:01,770
applause
13
00:01:01,770 --> 00:01:05,420
It’s cheerful like some Facebook annual
conference where the newest things
14
00:01:05,420 --> 00:01:09,920
are kind of presented. I’m doing a little
intro, basically of how I got there.
15
00:01:09,920 --> 00:01:13,330
This was my nice little university in
California. And I was studying there
16
00:01:13,330 --> 00:01:15,259
for half a year and there were a
couple of people from Facebook
17
00:01:15,259 --> 00:01:18,149
and other big companies and
they were talking about
18
00:01:18,149 --> 00:01:21,289
European Data Protection law. And
the basic thing they said – it was
19
00:01:21,289 --> 00:01:25,219
not an original quote but basically what
they said is: “Fuck the Europeans,
20
00:01:25,219 --> 00:01:28,050
you can fuck their law as much as you
want and nothing is going to happen.”
21
00:01:28,050 --> 00:01:31,919
And that was kind of the start of the
whole story because I thought: “Okay,
22
00:01:31,919 --> 00:01:35,639
let’s just make a couple of
complaints and see where it goes.”
23
00:01:35,639 --> 00:01:41,219
I originally got 1,300 pages of Facebook data
back then, because you can exercise
24
00:01:41,219 --> 00:01:45,420
your right to access. And Facebook
actually sent me a CD with a PDF file
25
00:01:45,420 --> 00:01:49,069
on it with all my Facebook data.
It was far from everything
26
00:01:49,069 --> 00:01:51,880
but it was the first time that someone
really got the data and I was asking
27
00:01:51,880 --> 00:01:54,889
someone from Facebook why they were so
stupid to send me all this information.
28
00:01:54,889 --> 00:01:59,100
Because a lot of it was obviously illegal.
And the answer was “We had internal
29
00:01:59,100 --> 00:02:04,090
communications problems.” So someone was
just stupid enough to burn it on a CD and
30
00:02:04,090 --> 00:02:09,158
send it on. One of the CDs actually was
first going to Sydney in Australia because
31
00:02:09,158 --> 00:02:12,819
they put “Australia” instead of “Austria”
on the label which was one of the things
32
00:02:12,819 --> 00:02:15,050
as well.
applause
33
00:02:15,050 --> 00:02:20,010
Anyway, this was basically how
my interest in Facebook started;
34
00:02:20,010 --> 00:02:23,450
and the media went crazy about it because
there is like a little guy that does
35
00:02:23,450 --> 00:02:27,280
something against the big guy. And this
is basically how the whole thing got
36
00:02:27,280 --> 00:02:30,980
this big. This is like a cartoon from my
Salzburg newspaper. That’s supposed to be me,
37
00:02:30,980 --> 00:02:34,640
and it’s like basically the reason why
the story got that big because it’s
38
00:02:34,640 --> 00:02:37,790
a small guy doing something against
Facebook, not necessarily because
39
00:02:37,790 --> 00:02:41,770
what I was doing was so especially smart.
But the story was just good for the media,
40
00:02:41,770 --> 00:02:45,879
’cause data protection is generally a very
dry topic that they can’t report about
41
00:02:45,879 --> 00:02:50,019
and there they had like
the guy that did something.
42
00:02:50,019 --> 00:02:54,299
A couple of introductions. We actually
had 3 procedures. So if you heard about
43
00:02:54,299 --> 00:02:58,290
what I was doing… There was originally
a procedure at the Irish Data Protection
44
00:02:58,290 --> 00:03:02,310
Commission, on Facebook itself – so what
Facebook itself does with the data.
45
00:03:02,310 --> 00:03:06,620
This procedure ended after 3 years.
There’s a “Class Action” in Vienna
46
00:03:06,620 --> 00:03:09,930
right now that’s still ongoing. It’s in
front of the Supreme Court in Austria
47
00:03:09,930 --> 00:03:14,900
right now. And there is the procedure
that I’m talking about today which is
48
00:03:14,900 --> 00:03:19,920
the procedure on Safe Harbor at the
Irish Data Protection Commission.
49
00:03:19,920 --> 00:03:22,719
A couple of other pieces of background
information: I personally don’t think
50
00:03:22,719 --> 00:03:25,159
Facebook is the issue. Facebook
is just a nice example for
51
00:03:25,159 --> 00:03:29,609
an overall bigger issue. So I was never
personally concerned with Facebook but
52
00:03:29,609 --> 00:03:32,969
for me the question is how we enforce
Data Protection and that kind of stuff.
53
00:03:32,969 --> 00:03:35,909
applause
So it’s not a Facebook talk; Facebook is
54
00:03:35,909 --> 00:03:39,290
applause
the example. And of course the whole thing
55
00:03:39,290 --> 00:03:42,599
is just one puzzle piece. A lot of people
are saying: “This was one win but there
56
00:03:42,599 --> 00:03:46,279
are so many other issues!” – Yes, you’re
totally right! This was just one issue.
57
00:03:46,279 --> 00:03:49,439
But you got to start somewhere.
And the whole thing is also
58
00:03:49,439 --> 00:03:53,209
not an ultimate solution. So I can
not present you the final solution
59
00:03:53,209 --> 00:03:57,530
for everything, but probably a couple
of possibilities to do something.
60
00:03:57,530 --> 00:04:00,969
If you’re interested in the documents
– we pretty much publish everything
61
00:04:00,969 --> 00:04:05,469
on the web page. It’s a very old-style web
page. But you can download the PDF files
62
00:04:05,469 --> 00:04:10,379
and everything if you’re interested
in the facts and (?) the details.
63
00:04:10,379 --> 00:04:15,779
Talking about facts, the whole thing
started with the Snowden case,
64
00:04:15,779 --> 00:04:19,019
where we kind of for the first time had
documents proving who is actually
65
00:04:19,019 --> 00:04:23,500
forwarding data to the NSA in this case.
And this is the interesting part, because
66
00:04:23,500 --> 00:04:26,530
we have a lot of rumours but if you’re in
a Court room you actually have to prove
67
00:04:26,530 --> 00:04:30,180
everything and you cannot just suspect
that very likely they’re doing it. But you
68
00:04:30,180 --> 00:04:35,199
need actual proof. And thanks to Snowden
we had at least a bunch of information
69
00:04:35,199 --> 00:04:40,550
that we could use. These are the slides,
you all know them. The first very
70
00:04:40,550 --> 00:04:46,880
interesting thing was the FISA Act and we
mainly argued under 1881a as an example
71
00:04:46,880 --> 00:04:51,509
for the overall surveillance in the US. So
we took this law as an example but it was
72
00:04:51,509 --> 00:04:55,560
not the only thing we relied on. And I
think it’s interesting for Europeans to
73
00:04:55,560 --> 00:05:01,030
understand how the law actually works.
The law actually goes after data and not
74
00:05:01,030 --> 00:05:08,849
after people. We typically have laws in
criminal procedures that go after people.
75
00:05:08,849 --> 00:05:13,289
This law goes after data. So it totally
falls outside of our normal thinking of
76
00:05:13,289 --> 00:05:15,229
“we’re going after a suspect,
someone that
77
00:05:15,229 --> 00:05:17,810
may have committed a crime”. Basically the
78
00:05:17,810 --> 00:05:21,430
law says that there’s an electronic
communications service provider that holds
79
00:05:21,430 --> 00:05:25,270
foreign intelligence information. That’s
much more than just terrorism prevention,
80
00:05:25,270 --> 00:05:29,919
that’s also things that the US is
generally interested in.
81
00:05:29,919 --> 00:05:33,739
And this is the level that’s publicly
known and everything else is basically
82
00:05:33,739 --> 00:05:38,139
classified. So under the law the FISA
Court can do certification for one year
83
00:05:38,139 --> 00:05:43,669
that basically says “the NSA can access
data”. In these certifications there are
84
00:05:43,669 --> 00:05:46,240
these minimization and targeting
procedures that they have to describe.
85
00:05:46,240 --> 00:05:48,099
But they’re not public.
We don’t know what
86
00:05:48,099 --> 00:05:51,210
they look like. And basically they’re here
87
00:05:51,210 --> 00:05:56,370
to separate data from US people out of
the data set. So it doesn’t really help
88
00:05:56,370 --> 00:06:00,690
a European. And then there is a so-called
directive that goes to the individual
89
00:06:00,690 --> 00:06:04,210
service provider which basically says:
“Give us the data in some technical
90
00:06:04,210 --> 00:06:08,020
format.” So very likely it’s some kind
of API or some kind of possibility
91
00:06:08,020 --> 00:06:12,569
that they can retrieve the data. That’s
what the law says. We don’t know
92
00:06:12,569 --> 00:06:17,440
how it actually looks and we don’t
have perfect proof of it. So there are
93
00:06:17,440 --> 00:06:20,530
a lot of things that are disputed and
still disputed by the US government.
94
00:06:20,530 --> 00:06:24,809
So the exact technical implementations,
the amount of data that’s actually pulled,
95
00:06:24,809 --> 00:06:28,830
all the review mechanisms they have
internally. That’s all stuff that was
96
00:06:28,830 --> 00:06:34,110
not 100% sure, and not sure enough
to present it to a Court. Which was
97
00:06:34,110 --> 00:06:38,940
the basic problem we had. First of
all after the Snowden thing broke
98
00:06:38,940 --> 00:06:44,460
we had different reactions. And that was
kind of how I started the procedure.
99
00:06:44,460 --> 00:06:48,020
The first reaction was demonstrations.
We were all walking in the streets.
100
00:06:48,020 --> 00:06:51,080
Which is good and which is important,
but we all know that this is something
101
00:06:51,080 --> 00:06:55,710
we have to do but not something that’s
gonna change the world. Second thing:
102
00:06:55,710 --> 00:06:58,849
we had parliaments like the European
Parliament passing resolutions saying
103
00:06:58,849 --> 00:07:03,110
that we should strike down the Safe Harbor
and this is all bad and evil. We had
104
00:07:03,110 --> 00:07:07,030
the Commission pretty much saying the
same thing. We had national politicians
105
00:07:07,030 --> 00:07:11,569
saying the same thing. And we all knew
that basically this means that they all
106
00:07:11,569 --> 00:07:16,580
send an angry letter to the US. Then they
can walk in front of the media and say:
107
00:07:16,580 --> 00:07:19,879
“Yes, we’ve done something, we sent
an angry letter to the US”, and in the US
108
00:07:19,879 --> 00:07:25,229
it’s just thrown basically in some trash bin
of crazy Europeans wanting strange things
109
00:07:25,229 --> 00:07:31,960
and that was it. So I was actually called
by a journalist and asked if there’s
110
00:07:31,960 --> 00:07:36,610
some other option. And I was then
starting to think about it and there’s
111
00:07:36,610 --> 00:07:42,300
the so called Safe Harbor agreement. To
explain the “Safe Harbor”: In Europe
112
00:07:42,300 --> 00:07:46,490
we have Data Protection law that is on
paper but factually not enforced.
113
00:07:46,490 --> 00:07:50,569
But at least, in theory, we have it. And
we have a couple of other countries
114
00:07:50,569 --> 00:07:54,819
that have the same level of protection
or similar laws. And generally
115
00:07:54,819 --> 00:07:58,460
Data Protection only works if you keep
the data within the protected sphere so
116
00:07:58,460 --> 00:08:01,150
you’re not allowed to send personal
data to a third country that
117
00:08:01,150 --> 00:08:04,819
doesn’t have adequate protection. There
are a couple of other countries that do;
118
00:08:04,819 --> 00:08:09,740
and therefore you can transfer data e.g.
to Switzerland. This is what the law says.
119
00:08:09,740 --> 00:08:13,460
And there are certain servers that
are outside these countries where
120
00:08:13,460 --> 00:08:17,009
we can have contractual relationships. So
basically if you have a server in India,
121
00:08:17,009 --> 00:08:20,090
you have a contract with your
Indian hosting provider saying:
122
00:08:20,090 --> 00:08:24,199
“You apply proper Data Protection to it”.
So you can transfer data there, too.
123
00:08:24,199 --> 00:08:27,740
All of this is approved by the European
Commission. This is how data
124
00:08:27,740 --> 00:08:33,299
flows legally outside of the EU – personal
data. This all doesn’t apply
125
00:08:33,299 --> 00:08:38,539
to any other kind of data, only personal
data. And we had a basic problem
126
00:08:38,539 --> 00:08:42,469
with the US because there was this
Directive saying you can forward data
127
00:08:42,469 --> 00:08:46,850
to other countries but there is no Data
Protection Law in the US. So basically
128
00:08:46,850 --> 00:08:48,810
you wouldn’t be allowed to send
data there unless you have
129
00:08:48,810 --> 00:08:52,870
some contractual relationship which
is always kind of complicated.
130
00:08:52,870 --> 00:08:58,110
So the solution was to have a
self-certification to EU principles
131
00:08:58,110 --> 00:09:01,839
and this was put into an Executive
Decision by the European Commission.
132
00:09:01,839 --> 00:09:06,940
So basically how Safe Harbor is working
is that e.g. Google can walk up and say:
133
00:09:06,940 --> 00:09:12,540
“Hereby I pledge that I follow European
Data Protection Law. I solemnly swear!”.
134
00:09:12,540 --> 00:09:15,110
And then they do whatever they
want to do. And basically
135
00:09:15,110 --> 00:09:17,750
that’s the Safe Harbor system and the
Europeans can walk around saying:
136
00:09:17,750 --> 00:09:22,250
“Yeah, there is some seal saying
that everything is fine, so don’t worry.”
137
00:09:22,250 --> 00:09:25,380
Everybody knew that this is a fucked-up
system but for years and years
138
00:09:25,380 --> 00:09:29,399
everyone was looking away because politics
is there and economics is there and
139
00:09:29,399 --> 00:09:33,560
they just needed it. So basically Safe
Harbor works like this: a US company
140
00:09:33,560 --> 00:09:38,360
can follow the Safe Harbor principles
and say: “We follow them”, then
141
00:09:38,360 --> 00:09:41,519
the Federal Trade Commission and private
arbitrators are overseeing them
142
00:09:41,519 --> 00:09:46,700
– in theory, in practice they never do –
and this whole thing was packaged
143
00:09:46,700 --> 00:09:50,160
into a decision by the European
Commission. And this is the so-called
144
00:09:50,160 --> 00:09:55,600
Safe Harbor system. So from a European
legal point of view it’s not an agreement
145
00:09:55,600 --> 00:10:01,010
with the US, it’s a system that the US has
set up that we approved as adequate. So
146
00:10:01,010 --> 00:10:04,300
there’s no binding thing between the US
and Europe, we can kind of trash it
147
00:10:04,300 --> 00:10:10,790
any time. They’ve just never done that.
Which brings me to the legal argument.
148
00:10:10,790 --> 00:10:14,930
Basically if I’m this little Smiley down
there, I’m sitting in Austria and
149
00:10:14,930 --> 00:10:20,920
transfer my data to Facebook Ireland,
because worldwide 82% of all users
150
00:10:20,920 --> 00:10:23,890
have a contract with Facebook Ireland.
Anyone that lives outside the US
151
00:10:23,890 --> 00:10:28,600
and Canada. So anyone from China,
South America, Africa has a contract
152
00:10:28,600 --> 00:10:33,670
with Facebook in Ireland. And legally they
forward the data to Facebook in the US;
153
00:10:33,670 --> 00:10:37,170
technically the data is directly
forwarded. So the data is actually flowing
154
00:10:37,170 --> 00:10:41,529
right to the servers in the US. However
legally it goes through Ireland. And
155
00:10:41,529 --> 00:10:46,029
my contract partner is an Irish company.
And under the law they can only transfer
156
00:10:46,029 --> 00:10:49,839
data to the US if there is adequate
protection. At the same time we know
157
00:10:49,839 --> 00:10:53,550
that the PRISM system is hooked up in
the end. So I was basically walking up
158
00:10:53,550 --> 00:10:56,720
to the Court and saying: “Mass
Surveillance is very likely not
159
00:10:56,720 --> 00:11:01,669
adequate protection, eh?” And
that was basically the argument.
160
00:11:01,669 --> 00:11:08,980
applause
161
00:11:08,980 --> 00:11:12,250
The interesting thing in this situation
was actually the strategic approach.
162
00:11:12,250 --> 00:11:17,899
So, we have the NSA and other surveillance
organizations that use private companies.
163
00:11:17,899 --> 00:11:21,880
So we have kind of a public-private
surveillance partnership. It’s PPP in
164
00:11:21,880 --> 00:11:28,540
a kind of surveillance way. Facebook is
subject to US law, so under US law they
165
00:11:28,540 --> 00:11:32,110
have to forward all the data. At the same
time Facebook Ireland is subject to
166
00:11:32,110 --> 00:11:35,040
European law so they’re not
allowed to forward all this data.
167
00:11:35,040 --> 00:11:41,550
Which is interesting because
they’re split. The EU law regulates
168
00:11:41,550 --> 00:11:45,570
how these third country transfers work.
And all of this has to be interpreted
169
00:11:45,570 --> 00:11:49,959
under Fundamental Rights. So this was
basically the system we were looking at.
170
00:11:49,959 --> 00:11:53,459
And the really crucial thing is that we
have this public-private surveillance.
171
00:11:53,459 --> 00:11:56,930
Because we do have jurisdiction over
the private companies. We don’t have
172
00:11:56,930 --> 00:12:00,680
jurisdiction over the NSA. We can
send angry letters to the NSA. But
173
00:12:00,680 --> 00:12:04,070
we do have jurisdiction over Facebook,
Google etc. because they’re basically
174
00:12:04,070 --> 00:12:08,769
based here. Mainly for tax reasons.
And this was the interesting thing that
175
00:12:08,769 --> 00:12:11,720
in contrast to the national surveillance
where we can pretty much just send
176
00:12:11,720 --> 00:12:15,240
the angry letters we can do something
about the private companies. And
177
00:12:15,240 --> 00:12:19,230
without the private companies there is
almost no mass surveillance at this scale
178
00:12:19,230 --> 00:12:22,810
because the NSA is not in our phones,
it’s the Googles and Apples and whatever.
179
00:12:22,810 --> 00:12:29,000
And without them you’re not really
able to get this mass surveillance.
180
00:12:29,000 --> 00:12:32,579
This is like the legal chart. Basically
what we argued is: there’s Articles 7 and 8
181
00:12:32,579 --> 00:12:35,490
of the Charter of Fundamental Rights.
That’s your right to Privacy and
182
00:12:35,490 --> 00:12:39,420
your right to Data Protection. There
is an article in the Directive that
183
00:12:39,420 --> 00:12:44,149
has to be interpreted in light of it. Then
there’s the Executive Decision of the EU.
184
00:12:44,149 --> 00:12:48,750
This is basically the Safe Harbor
decision which refers to Paragraph 4
185
00:12:48,750 --> 00:12:52,990
of the Safe Harbor principles. And the
Safe Harbor principles basically say
186
00:12:52,990 --> 00:12:56,540
that the FISA Act is okay. So
you have kind of this circle
187
00:12:56,540 --> 00:13:01,579
of different legal layers which is getting
really crazy. I’ll try to break it down
188
00:13:01,579 --> 00:13:05,339
a little bit. Articles 7 and 8 of the
Charter we basically compared
189
00:13:05,339 --> 00:13:08,690
to Data Retention, so the
“Vorratsdatenspeicherung”.
190
00:13:08,690 --> 00:13:13,820
We basically said PRISM is much worse. If
“Vorratsdatenspeicherung” (Data Retention)
191
00:13:13,820 --> 00:13:18,300
was invalid then PRISM has to be 10 times
as bad. That was basically the argument.
192
00:13:18,300 --> 00:13:22,240
Very simple. We just compared: the
one was content data – the other one
193
00:13:22,240 --> 00:13:25,779
was meta data. The one is storage
– the other one is making available.
194
00:13:25,779 --> 00:13:29,519
And the one is endless – the other
one is 24 months. So basically
195
00:13:29,519 --> 00:13:33,540
in all these categories PRISM was much
worse. And if the one has to be illegal
196
00:13:33,540 --> 00:13:37,639
the other one has to be as well. And
what’s interesting – and that’s something
197
00:13:37,639 --> 00:13:42,740
that the US side is typically not getting
– is that Article 8 is already covering
198
00:13:42,740 --> 00:13:47,459
“making available of data”. So the
fun thing is I only had to prove
199
00:13:47,459 --> 00:13:51,190
that Facebook makes data available,
so basically it’s possible
200
00:13:51,190 --> 00:13:55,420
the NSA is pulling it. I didn’t even have
to prove that the NSA is factually pulling
201
00:13:55,420 --> 00:14:00,190
my personal data. And this was like the
relevant point because under US law
202
00:14:00,190 --> 00:14:04,220
basically your Fundamental Rights only
kick in when they factually look at your
203
00:14:04,220 --> 00:14:07,850
data and actually surveil you. So I was
only: “They’re making it available
204
00:14:07,850 --> 00:14:11,740
– that’s good enough for me!” which
made all this factual evidence
205
00:14:11,740 --> 00:14:16,789
much easier. So basically I only had
to say: “Look at the XKeyscore slides
206
00:14:16,789 --> 00:14:22,180
where they say ‘user name Facebook’
they can somehow get the data out of it.
207
00:14:22,180 --> 00:14:26,860
It’s at least made available; that’s
all I need to prove”. And this is
208
00:14:26,860 --> 00:14:30,360
the big difference between the US
– it’s very simplified, but basically
209
00:14:30,360 --> 00:14:34,649
between the US approach and the European
approach – is that in the US you have to
210
00:14:34,649 --> 00:14:38,639
prove that your data is actually pulled.
I only had to prove that my data is made
211
00:14:38,639 --> 00:14:44,399
available. So I had to… I was able to
get out of all the factual questions.
212
00:14:44,399 --> 00:14:48,100
This is a comparison – you basically…
in the US we have very strict laws
213
00:14:48,100 --> 00:14:53,149
for certain types of surveillance while in
Europe we have a more flexible system
214
00:14:53,149 --> 00:14:56,730
that covers much more. So it’s a
different approach that we just have
215
00:14:56,730 --> 00:15:00,279
in the two legal spheres. We’re both
talking about your Fundamental
216
00:15:00,279 --> 00:15:03,750
Right to Privacy, but in details it’s
very different. And that’s kind of
217
00:15:03,750 --> 00:15:07,909
the differences that we used. The fun
thing is if you’re European you don’t have
218
00:15:07,909 --> 00:15:11,509
any rights in the US anyways because
the Bill of Rights only applies to people
219
00:15:11,509 --> 00:15:15,829
that live in the US and US citizens so
you’re out of luck anyways. So you’re
220
00:15:15,829 --> 00:15:20,209
only left with the European things.
Basically the law which is
221
00:15:20,209 --> 00:15:23,550
the second level after the Fundamental
Rights is saying that there has to be
222
00:15:23,550 --> 00:15:28,250
an adequate level of protection as I said
and this third country has to ensure it
223
00:15:28,250 --> 00:15:33,279
by domestic law or international
commitments. And I was saying: “You know
224
00:15:33,279 --> 00:15:36,180
there’s the FISA Act, you can read
it, it definitely doesn’t ensure
225
00:15:36,180 --> 00:15:38,630
your fundamental rights and
adequate protection. So we’re
226
00:15:38,630 --> 00:15:45,110
kind of out of Article 25”. And there is
paragraph 4 of the Safe Harbor principles
227
00:15:45,110 --> 00:15:50,829
which says that all these wonderful privacy
principles that US companies sign up to
228
00:15:50,829 --> 00:15:56,579
do not apply whenever a national law
in the US overrides them. So there are
229
00:15:56,579 --> 00:16:01,690
principles that companies say: “We
follow!” but if there is a city in Texas
230
00:16:01,690 --> 00:16:05,649
saying: “We have a local ordinance
saying: ‘You have to do it differently!’”
231
00:16:05,649 --> 00:16:09,029
all these Safe Harbor principles
don’t apply anymore. And this is
232
00:16:09,029 --> 00:16:12,699
the fundamental flaw of the
self-certification system: it only works
233
00:16:12,699 --> 00:16:17,240
if there is no law around that conflicts
with it. And as there are tons of laws
234
00:16:17,240 --> 00:16:22,519
that conflict with it you’re hardly
able to hold up a system like that.
235
00:16:22,519 --> 00:16:26,220
So basically if you go through all these
different legal layers you end up with
236
00:16:26,220 --> 00:16:29,730
a conflict between the US FISA Act
and the European Fundamental Rights.
237
00:16:29,730 --> 00:16:33,069
So you’re going through different layers
of the system but you’re basically making
238
00:16:33,069 --> 00:16:41,009
a circle. This is what we did which was
a little bit complicated but worked.
239
00:16:41,009 --> 00:16:53,719
applause
240
00:16:53,719 --> 00:16:57,430
Basically now to the procedure,
so how the whole thing happened.
241
00:16:57,430 --> 00:17:01,639
First I went through the Safe Harbor. Safe
Harbor allows you to go to TRUSTe or
242
00:17:01,639 --> 00:17:06,630
the Federal Trade Commission and there’s
an online form to make your complaint. And
243
00:17:06,630 --> 00:17:10,270
I was making a complaint and I think you
were only allowed to put in 60 characters
244
00:17:10,270 --> 00:17:13,650
to explain what your complaint is. Which
is a little bit complicated if you’re
245
00:17:13,650 --> 00:17:17,779
trying to explain NSA mass surveillance.
So I only wrote: “Stop Facebook, Inc.’s
246
00:17:17,779 --> 00:17:20,410
involvement in PRISM!”. That
was everything I could actually
247
00:17:20,410 --> 00:17:24,280
put in the text box; that was
the absolute maximum.
248
00:17:24,280 --> 00:17:27,690
And the answer I got back was: “TRUSTe
does not have the authority to address
249
00:17:27,690 --> 00:17:31,280
the matter you raise.” Which is obvious,
it’s a private arbitration company
250
00:17:31,280 --> 00:17:35,370
that can hardly tell Facebook to not
follow the NSA’s guidelines. So
251
00:17:35,370 --> 00:17:39,310
this was the arbitration mechanism under
Safe Harbor. You can also go to the
252
00:17:39,310 --> 00:17:43,390
Federal Trade Commission and have your
complaint filed there. But they basically
253
00:17:43,390 --> 00:17:50,290
just ignore them. This was the letter I
got back, saying that they received it. But
254
00:17:50,290 --> 00:17:53,760
I was talking to the people at the FTC and
they said: “Yeah, we get these complaints
255
00:17:53,760 --> 00:17:59,130
but they’re ending up in a huge storage
system where they stay forever after”.
256
00:17:59,130 --> 00:18:02,530
So this was enforcement done by
Safe Harbor. And we knew that
257
00:18:02,530 --> 00:18:05,640
in the private field already; but in this
case it was especially interesting.
258
00:18:05,640 --> 00:18:08,720
To be fair, both of these institutions
have no power to do anything
259
00:18:08,720 --> 00:18:11,260
about mass surveillance. So
there was really a reason why
260
00:18:11,260 --> 00:18:14,330
they didn’t do anything.
The next step you have is
261
00:18:14,330 --> 00:18:17,330
the national Data Protection Commissioner.
So we have 28 countries
262
00:18:17,330 --> 00:18:21,640
with 28 [Commissioners]; plus Germany has
– I think – a Data Protection Commissioner
263
00:18:21,640 --> 00:18:26,940
in every province. And you end up at
this. And this is my favourite slide.
264
00:18:26,940 --> 00:18:30,500
This is the Irish Data
Protection Commissioner.
265
00:18:30,500 --> 00:18:33,910
applause
266
00:18:33,910 --> 00:18:36,450
To be super precise
– I don’t know if you
267
00:18:36,450 --> 00:18:38,100
can see the laser pointer. But this is a
268
00:18:38,100 --> 00:18:41,570
supermarket. And this is the Irish Data
Protection Commissioner back there.
269
00:18:41,570 --> 00:18:45,040
laughter, applause
270
00:18:45,040 --> 00:18:48,370
To be a little more fair, actually they’re
up here and they’re like 20 people
271
00:18:48,370 --> 00:18:51,720
when we filed it originally. The fun thing
is that back at the time they didn’t have
272
00:18:51,720 --> 00:18:54,730
a single lawyer and not a single
technician. So they were 20
273
00:18:54,730 --> 00:18:58,470
public employees that were dealing
with Data Protection and no one
274
00:18:58,470 --> 00:19:02,960
had any clue about the technical
or the legal side of it.
275
00:19:02,960 --> 00:19:06,140
The fun thing is: this is Billy Hawkes,
the Data Protection Commissioner
276
00:19:06,140 --> 00:19:09,720
at the time. He went on the
national radio in the morning.
277
00:19:09,720 --> 00:19:12,960
And in Ireland radio is a really big
thing. So it was a morning show.
278
00:19:12,960 --> 00:19:16,540
And he was asked about these complaints.
And he actually said on the radio:
279
00:19:16,540 --> 00:19:21,170
“I don’t think it will come
as much of a surprise
280
00:19:21,170 --> 00:19:24,880
that the US services have access
to all the US companies”.
281
00:19:24,880 --> 00:19:28,510
And this was the craziest thing!
I was sitting in front of the radio
282
00:19:28,510 --> 00:19:34,560
and was like: “Strike! He just
acknowledged that all this is true!”.
283
00:19:34,560 --> 00:19:38,040
And the second thing, he said: “This US
surveillance operation is not an issue
284
00:19:38,040 --> 00:19:42,500
of Data Protection”. Interesting.
285
00:19:42,500 --> 00:19:47,570
It’s actually online and you can listen
to it. But the fun thing was really that
286
00:19:47,570 --> 00:19:52,180
the factual level is so hard to prove that
I was afraid that they would dispute:
287
00:19:52,180 --> 00:19:55,380
“Hah, who knows if all this is true?
We don’t have any evidence!
288
00:19:55,380 --> 00:19:58,710
The companies say we are
not engaging in all of this.”
289
00:19:58,710 --> 00:20:02,770
So having the Data Protection Commissioner
saying: “Sure they surveil you!
290
00:20:02,770 --> 00:20:06,330
Are you surprised?” was great
because we were kind of out of
291
00:20:06,330 --> 00:20:10,180
the whole factual debate.
292
00:20:10,180 --> 00:20:13,070
I actually got a letter back from them
saying that they’re not investigating
293
00:20:13,070 --> 00:20:17,850
any of it. And I was asking them why. And
they were naming 2 sections of the law,
294
00:20:17,850 --> 00:20:21,600
a combination thereof. So there was one
thing where it says they shall investigate
295
00:20:21,600 --> 00:20:25,130
– which means they have to – or
they may investigate. And they say
296
00:20:25,130 --> 00:20:28,060
they only “may” investigate complaints
and they just don’t feel like
297
00:20:28,060 --> 00:20:32,710
investigating PRISM and Facebook
and all of this. Secondly they say
298
00:20:32,710 --> 00:20:36,920
that a complaint could be “frivolous
and vexatious” – I love the word!
299
00:20:36,920 --> 00:20:41,130
And therefore they’re not investigating
it. “A combination thereof or indeed
300
00:20:41,130 --> 00:20:47,980
any other relevant matter.” So we
turned this letter into a picture
301
00:20:47,980 --> 00:20:51,630
which is basically what they said: “So
why did you not investigate PRISM?”
302
00:20:51,630 --> 00:20:54,140
– “‘Shall’ means ‘may’, frivolous
or
303
00:20:54,140 --> 00:20:55,700
vexatious, a combination of A and B
304
00:20:55,700 --> 00:20:58,820
or any other reason.”
So this was the answer
305
00:20:58,820 --> 00:21:01,450
by the Irish Data Protection Commissioner
why they wouldn’t want to investigate
306
00:21:01,450 --> 00:21:05,710
the complaint. Just to give
you background information:
307
00:21:05,710 --> 00:21:08,350
these are the complaints that the Irish
Data Protection Commissioner is receiving
308
00:21:08,350 --> 00:21:10,510
– the blue line – and the red line is
all
309
00:21:10,510 --> 00:21:13,380
of the complaints they’re not deciding.
310
00:21:13,380 --> 00:21:17,420
Which is 96–98% of the complaints
311
00:21:17,420 --> 00:21:20,680
they receive in an average year.
Which is interesting because you have
312
00:21:20,680 --> 00:21:24,130
a right to get a decision but they just don’t decide.
313
00:21:24,130 --> 00:21:27,180
To give you the bigger picture: we
also made complaints on Apple
314
00:21:27,180 --> 00:21:30,510
and all the other PRISM companies.
And Ireland basically said
315
00:21:30,510 --> 00:21:35,310
what I just told you. Luxembourg, where
Skype and Microsoft are situated, said
316
00:21:35,310 --> 00:21:38,760
that they do not have enough evidence for
the participation of Microsoft and Skype
317
00:21:38,760 --> 00:21:43,070
[in PRISM]. And the funniest thing
about the answer was that they said
318
00:21:43,070 --> 00:21:46,030
that they’re restricted in their
investigations to the territory
319
00:21:46,030 --> 00:21:50,220
of Luxembourg. And since all of this is
happening in the US they have no way
320
00:21:50,220 --> 00:21:54,050
of ever finding out what was going on.
So I was telling them: “You know,
321
00:21:54,050 --> 00:21:57,890
most of this is online and if you’re not
able to download it I can print it out
322
00:21:57,890 --> 00:22:02,890
for you and ship it to Luxembourg.” But
the reason why we didn’t go further
323
00:22:02,890 --> 00:22:06,460
in Luxembourg is that they went down
this factual kind of argument. They said:
324
00:22:06,460 --> 00:22:10,010
“It’s all illegal but factually we
don’t believe it’s true”. And
325
00:22:10,010 --> 00:22:15,280
then there was Germany, which is
still investigating to this day.
326
00:22:15,280 --> 00:22:18,300
This was Yahoo. Actually that was
Yahoo in Munich but they now
327
00:22:18,300 --> 00:22:21,140
moved to Ireland as well. So I don’t
know what happened to this complaint.
328
00:22:21,140 --> 00:22:23,800
We never heard back. But whenever we sent
an email they were like: “Yeah, we’re
329
00:22:23,800 --> 00:22:29,820
still investigating.” So what happened now
is that I went to the Irish High Court.
330
00:22:29,820 --> 00:22:34,150
To challenge the non-decision of the
Irish Data Protection Commissioner.
331
00:22:34,150 --> 00:22:37,350
This is the case that then went down as
“Schrems vs. the Data Protection
332
00:22:37,350 --> 00:22:40,790
Commissioner” which is so strange because
I never wanted to have my name
333
00:22:40,790 --> 00:22:43,690
on any of this and now the decision is
actually named after my last name
334
00:22:43,690 --> 00:22:46,990
which is always freaking me out in a way.
Because you’re fighting for Privacy and
335
00:22:46,990 --> 00:22:51,480
suddenly your name is all over the place.
applause and laughter
336
00:22:51,480 --> 00:22:56,010
applause
337
00:22:56,010 --> 00:23:00,160
And this is the Irish High Court. So you…
338
00:23:00,160 --> 00:23:04,130
It’s very complicated to
get a procedure like that.
339
00:23:04,130 --> 00:23:07,610
The biggest issue is that you need money.
If you’re in front of an Irish Court
340
00:23:07,610 --> 00:23:10,100
and you lose a case you
end up with a legal bill of
341
00:23:10,100 --> 00:23:13,280
a couple of hundred thousand
Euros. Which is the reason why
342
00:23:13,280 --> 00:23:17,600
nobody ever challenged the
Irish Data Protection Commissioner.
343
00:23:17,600 --> 00:23:21,510
Because you’re just gonna
lose your house over it!
344
00:23:21,510 --> 00:23:25,900
So what I did is: we did a little
bit of crowd-funding! And
345
00:23:25,900 --> 00:23:29,420
we actually got about 70,000 Euros out
of it. This was a crowd-funding platform
346
00:23:29,420 --> 00:23:32,180
that basically worked in a way
that people could donate
347
00:23:32,180 --> 00:23:35,950
and if we don’t need the money we either
donate it to another Privacy cause
348
00:23:35,950 --> 00:23:39,710
or we actually give people the money
back. Which we’re now going to have to do
349
00:23:39,710 --> 00:23:43,490
because we won the case. And all
our costs are paid by the other side.
350
00:23:43,490 --> 00:23:49,960
applause
351
00:23:49,960 --> 00:23:55,660
So the fun thing is you then have to
walk into this wonderful old Court here
352
00:23:55,660 --> 00:23:59,710
on Mondays at 11:30. And
there’s a room where you can
353
00:23:59,710 --> 00:24:03,990
make your application. And about 100 other
people are making their applications as well.
354
00:24:03,990 --> 00:24:06,850
And there is no queue number. So there
are 100 lawyers sitting in a room,
355
00:24:06,850 --> 00:24:11,240
waiting for the judge to call out your
case. So we were sitting there until 4 PM
356
00:24:11,240 --> 00:24:15,430
or something until suddenly our case was
called up. And we actually got kind of
357
00:24:15,430 --> 00:24:19,260
the possibility to bring our case and then
it’s postponed to another date and
358
00:24:19,260 --> 00:24:24,400
blablablablabla. In the end you
end up with something like this.
359
00:24:24,400 --> 00:24:28,380
Which is all the paperwork
because in Ireland the Courts
360
00:24:28,380 --> 00:24:33,020
are not computerized so far. So you
have to bring all the paperwork,
361
00:24:33,020 --> 00:24:38,200
anything you rely on, in 3 copies.
And it’s all paper, with numbered pages,
362
00:24:38,200 --> 00:24:43,260
so all these copies have pages 1 to 1000.
Someone’s writing all of the numbers on the pages.
363
00:24:43,260 --> 00:24:46,010
And then they copy it 3 times and it’s
then in this wonderful little thing.
364
00:24:46,010 --> 00:24:49,810
I thought it was great. And
what happened is that
365
00:24:49,810 --> 00:24:53,630
we walked into the judge’s room and you
get a judge assigned on the same day.
366
00:24:53,630 --> 00:24:57,690
So you end up in front of a judge
that has never heard about Privacy,
367
00:24:57,690 --> 00:25:00,570
never heard about Facebook and
never heard about Snowden and PRISM
368
00:25:00,570 --> 00:25:03,710
and any of this. So you walk into the
room as like “We would like to debate
369
00:25:03,710 --> 00:25:08,220
the Safe Harbor with you” and he was like
“What the fuck is the Safe Harbor?”.
370
00:25:08,220 --> 00:25:12,250
So what happened is that he told us to
kind of explain what it is for 15 minutes.
371
00:25:12,250 --> 00:25:16,280
And then he postponed the
whole thing for 2 hours I think
372
00:25:16,280 --> 00:25:19,960
and we walked over to a pub and had a
beer. So that the judge could remotely
373
00:25:19,960 --> 00:25:24,540
read what he’s about to look into.
374
00:25:24,540 --> 00:25:27,150
And Ireland is very interesting because
you need a Solicitor and a Counsel
375
00:25:27,150 --> 00:25:30,900
and then the Counsel is actually talking
to the Judge. So I actually had 2 filters.
376
00:25:30,900 --> 00:25:34,010
If I’m the client down here I had to
talk to my Solicitor. The Solicitor
377
00:25:34,010 --> 00:25:38,520
was telling the Counsel what to say to the
Judge. So half of it was lost on the way.
378
00:25:38,520 --> 00:25:41,730
And when I was asking if I could
just address the Judge personally
379
00:25:41,730 --> 00:25:44,770
they were like “No, no way that you could
possibly address the Judge personally
380
00:25:44,770 --> 00:25:46,240
even though you’re the claimant”.
Which is
381
00:25:46,240 --> 00:25:48,110
funny ’cause they talk about this “person”
382
00:25:48,110 --> 00:25:52,060
in the room. It’s like “What’s the problem
of this Mr. Schrems?”. And you’re like
383
00:25:52,060 --> 00:25:57,040
sitting right here, it’s like
“This would be me!”.
384
00:25:57,040 --> 00:26:01,060
So what happened in Ireland is that we
had about 10 reasons why under Irish law
385
00:26:01,060 --> 00:26:05,960
the Irish Data Protection Commissioner
would have to do its job but the Court
386
00:26:05,960 --> 00:26:09,880
actually wiped all of this off the table
and said actually the Safe Harbor
387
00:26:09,880 --> 00:26:12,580
is the issue, which legally they’re
not allowed to do but which politically
388
00:26:12,580 --> 00:26:16,500
was very wise and forwarded this
wonderful easy-to-understand question
389
00:26:16,500 --> 00:26:19,710
to the European Court of Justice.
390
00:26:19,710 --> 00:26:23,480
The reason why they put this kind
of very random question is that
391
00:26:23,480 --> 00:26:27,710
if you challenge a law in Ireland you
have to get some Advocate General engaged.
392
00:26:27,710 --> 00:26:30,090
And they didn’t want to do that
so they kind of “asked a question
393
00:26:30,090 --> 00:26:33,890
around the actual question”
to not really get them engaged.
394
00:26:33,890 --> 00:26:35,720
Which was very complicated
because we didn’t know how
395
00:26:35,720 --> 00:26:38,610
the European Court of Justice would kind of
react to this random question because
396
00:26:38,610 --> 00:26:41,530
it was so broad that they could just walk
in any other direction and not address
397
00:26:41,530 --> 00:26:47,330
the real issue. What was wonderful is
that in the judgment by the Irish Court
398
00:26:47,330 --> 00:26:50,640
they have actually said that
all of this is factually true.
399
00:26:50,640 --> 00:26:54,170
All the mass surveillance is factually
true. And the fun thing to understand
400
00:26:54,170 --> 00:26:58,720
is that the factual assessment is done by
the national Courts. So the European
401
00:26:58,720 --> 00:27:02,000
Court of Justice is not engaging in
factual matters anymore. They only
402
00:27:02,000 --> 00:27:06,800
answer legal questions: “Is this legal or
not”. So we had a split of responsibility.
403
00:27:06,800 --> 00:27:10,680
The Irish Court only said that all of this
is true. And Luxembourg only said
404
00:27:10,680 --> 00:27:15,050
whether all of this would be legal if all of
this were true. Which was kind of
405
00:27:15,050 --> 00:27:19,530
an interesting situation. But to be
fair no one before the European
406
00:27:19,530 --> 00:27:23,010
Court of Justice has ever questioned
that this is true. So even the UK
407
00:27:23,010 --> 00:27:26,270
that was in front of the Court and that
should possibly know if all of this
408
00:27:26,270 --> 00:27:30,270
is true or not, they have never
questioned the facts. laughs
409
00:27:30,270 --> 00:27:34,930
There is a pretty good factual basis.
What was interesting as well is
410
00:27:34,930 --> 00:27:39,150
that I said I’m not gonna go in front
of the European Court of Justice.
411
00:27:39,150 --> 00:27:44,240
Because the cost is so high that even the
60,000 or 70,000 Euros I got in donations
412
00:27:44,240 --> 00:27:47,970
wouldn’t cover it. And I knew the judge
wanted to get this hot potato off his table
413
00:27:47,970 --> 00:27:52,280
and down to Luxembourg. So I was asking
for a so-called “protective cost order”
414
00:27:52,280 --> 00:27:55,470
which kind of tells you beforehand that
there is a maximum amount you have to pay
415
00:27:55,470 --> 00:27:57,960
if you lose a case. And it
was actually the first one
416
00:27:57,960 --> 00:28:01,960
to ever get a protective cost
order granted in Ireland.
417
00:28:01,960 --> 00:28:06,020
Which was really cool and the Irish
were like outraged about it, too.
418
00:28:06,020 --> 00:28:09,760
applause
419
00:28:09,760 --> 00:28:12,960
So we basically walked into the
European Court of Justice which is
420
00:28:12,960 --> 00:28:17,150
a really hefty procedure.
In this room were…
421
00:28:17,150 --> 00:28:20,520
13 judges are in front of you. The
European Court of Justice has assigned it
422
00:28:20,520 --> 00:28:24,150
to the Grand Chamber. So there is a
Small, a Medium and a Grand Chamber.
423
00:28:24,150 --> 00:28:28,480
Which is the highest thing you can
possibly end up in front of in Europe. And
424
00:28:28,480 --> 00:28:31,490
it’s chaired by the President of the
European Court of Justice. And this is
425
00:28:31,490 --> 00:28:36,350
kind of where the really really basic,
really important questions are dealt with.
426
00:28:36,350 --> 00:28:38,890
So I was like: “Cool, I’m getting to the
European Court of Justice!”. And it’s
427
00:28:38,890 --> 00:28:41,930
funny because all the lawyers that were in
the room, everyone was like “I can plead
428
00:28:41,930 --> 00:28:44,580
in front of the European Court of
Justice!”. They all took pictures like
429
00:28:44,580 --> 00:28:47,040
they were in Disneyland or something.
audience laughing
430
00:28:47,040 --> 00:28:52,970
And it was – lawyers can be
very… kind of… interesting. And
431
00:28:52,970 --> 00:28:57,250
we ended up in front of these 3 major
people. It was the President,
432
00:28:57,250 --> 00:29:00,860
Thomas von Danwitz – who is the German
judge and he also wrote the lead decision.
433
00:29:00,860 --> 00:29:03,710
He’s the Judge Rapporteur, so within
the 13 judges there’s one that is
434
00:29:03,710 --> 00:29:07,130
the reporting judge and actually drafts
the whole case. And he was also
435
00:29:07,130 --> 00:29:12,760
doing the Data Retention [case]. And then there
was Yves Bot as the Advocate General.
436
00:29:12,760 --> 00:29:15,220
The hearing was interesting
because we got questions
437
00:29:15,220 --> 00:29:18,480
from the European Court of Justice before
the hearing. And in these questions
438
00:29:18,480 --> 00:29:21,600
they were actually digging down into
the core issues of mass surveillance
439
00:29:21,600 --> 00:29:25,930
in the US. When I got the questions
I was like “We won the case!” because
440
00:29:25,930 --> 00:29:30,810
there’s no way they can decide differently
as soon as they address the question.
441
00:29:30,810 --> 00:29:34,790
There were participants from all over
Europe. These are the countries,
442
00:29:34,790 --> 00:29:37,660
then there was the European Parliament,
the European Data Protection Supervisor
443
00:29:37,660 --> 00:29:40,770
and the European Commission.
There was me – MS down there,
444
00:29:40,770 --> 00:29:44,940
the Data Protection Commissioner
and Digital Rights Ireland. And
445
00:29:44,940 --> 00:29:49,310
what was interesting was the countries
that were not there. Like Germany, e.g.
446
00:29:49,310 --> 00:29:53,000
was not there in this major procedure.
And as far as I’ve heard there were
447
00:29:53,000 --> 00:30:00,300
reasons for not getting too engaged in
the Transatlantic Partnership problem.
448
00:30:00,300 --> 00:30:04,800
So this was kind of interesting because
the UK walked up but Germany was like:
449
00:30:04,800 --> 00:30:08,130
“No, we’d rather not
say anything about this.”
450
00:30:08,130 --> 00:30:12,370
What was interesting as well is that there
were interventions by the US Government.
451
00:30:12,370 --> 00:30:16,470
So I heard… we were… on a Tuesday we
were actually in the Court. And on the Monday
452
00:30:16,470 --> 00:30:20,430
I got text messages from people in
these different countries telling me that
453
00:30:20,430 --> 00:30:25,040
the US just called them up. And
I was like: “This is interesting”
454
00:30:25,040 --> 00:30:28,180
because I know a lot of these people from
conferences and stuff. So they were like
455
00:30:28,180 --> 00:30:31,210
telling me: “The US just called me
up and said they wanna talk to my
456
00:30:31,210 --> 00:30:34,520
lead lead lead supervisor and tell me
what to say tomorrow in the Court”.
457
00:30:34,520 --> 00:30:38,050
It was like: “This is very interesting!”.
458
00:30:38,050 --> 00:30:44,070
I was actually in the Court room and there
was the justice person from the US embassy
459
00:30:44,070 --> 00:30:47,650
to the European Union. And he was actually
watching the procedure and watching
460
00:30:47,650 --> 00:30:50,430
what everybody was arguing.
So I had a feeling this was
461
00:30:50,430 --> 00:30:54,130
like a watchdog situation. And someone
pointed out that this is the guy,
462
00:30:54,130 --> 00:30:57,420
so I knew who it was. And he was walking up
to me and asked: “Are you the plaintiff?”
463
00:30:57,420 --> 00:31:02,960
And I said: “Yeah, hey!” and he was
trying to talk to me and I said:
464
00:31:02,960 --> 00:31:06,420
“Did you manage to call everybody by now
or do you still need a couple of numbers?”
465
00:31:06,420 --> 00:31:06,870
audience laughing
466
00:31:06,870 --> 00:31:11,650
And he was like: “(?) arrogant!”. He was
like: “He didn’t just ask this question?”.
467
00:31:11,650 --> 00:31:15,590
He said: “No, we kind of are in contact
with all of our colleagues and of course
468
00:31:15,590 --> 00:31:19,280
we have to kind of push for the interest
of the US” and blablabla. I thought:
469
00:31:19,280 --> 00:31:22,840
“This is very interesting!”. But
anyway, it didn’t help them.
470
00:31:22,840 --> 00:31:27,720
None of them was really kind
of arguing for the US, actually.
471
00:31:27,720 --> 00:31:30,440
The findings of the European Court of
Justice, so what was in the judgment
472
00:31:30,440 --> 00:31:36,050
in the end. First of all, Safe Harbor
is invalid. Which was the big news.
473
00:31:36,050 --> 00:31:39,340
And this was overnight. We were expecting
that they would have a grace period
474
00:31:39,340 --> 00:31:43,700
so it’s invalid within 3 months or
something like this. But the minute
475
00:31:43,700 --> 00:31:48,450
they said it, all your data
transfers to the US were suddenly illegal.
476
00:31:48,450 --> 00:31:58,710
applause
Which was kind of big.
477
00:31:58,710 --> 00:32:02,070
The second biggie was that they actually
said that the essence of your rights
478
00:32:02,070 --> 00:32:04,850
is violated. Now this, for an average
person, doesn’t mean too much.
479
00:32:04,850 --> 00:32:10,270
But for a lawyer it says: “Oh my
god, the essence is touched!!”.
480
00:32:10,270 --> 00:32:14,170
To explain to you what the essence is and
why everybody is so excited about it:
481
00:32:14,170 --> 00:32:17,820
basically with a violation of your rights
there are different levels. The lowest is no interference.
482
00:32:17,820 --> 00:32:20,420
So if a policeman was walking
down the street and watching you
483
00:32:20,420 --> 00:32:24,030
there’s no interference with any of your
rights. If they probably tapped your phone
484
00:32:24,030 --> 00:32:27,450
there is some kind of proportionality
issue which is what we typically debate
485
00:32:27,450 --> 00:32:30,950
before a Court. There is a system for how
you argue whether something is proportionate
486
00:32:30,950 --> 00:32:34,750
or not. So e.g. Data Retention
was not proportionate.
487
00:32:34,750 --> 00:32:37,950
And Data Retention would be somewhere
here probably. points to slide
488
00:32:37,950 --> 00:32:41,800
So not legal anymore but
still in a proportionality test.
489
00:32:41,800 --> 00:32:44,320
And then there is “the essence”
which means whatever the fuck
490
00:32:44,320 --> 00:32:47,190
you’re trying to do here is totally
illegal because what you’re doing
491
00:32:47,190 --> 00:32:49,690
is so much out of the scale
of proportionality that
492
00:32:49,690 --> 00:32:53,530
it will never be legal. And on Data
Retention it actually said that
493
00:32:53,530 --> 00:32:56,590
for the first time…
applause
494
00:32:56,590 --> 00:33:01,890
applause
495
00:33:01,890 --> 00:33:04,160
…and this was actually the
first time as far as I saw
496
00:33:04,160 --> 00:33:07,290
that the European Court of Justice has
ever said that under the convention.
497
00:33:07,290 --> 00:33:11,410
So the convention has only
been in place since 2008, I think.
498
00:33:11,410 --> 00:33:13,640
But it’s the first time they actually
found that in a case which was
499
00:33:13,640 --> 00:33:18,120
huge for law in general. There
were a couple of findings
500
00:33:18,120 --> 00:33:21,660
on Data Protection powers that
are not too interesting for you.
501
00:33:21,660 --> 00:33:26,450
What may be interesting is that
502
00:33:26,450 --> 00:33:29,760
– there is a story to this picture
that’s the reason I put it in –
503
00:33:29,760 --> 00:33:32,310
basically they said that a
third country does have
504
00:33:32,310 --> 00:33:35,600
to provide adequate protection, as
I said before. So the story was
505
00:33:35,600 --> 00:33:39,210
that third countries originally had
to provide equivalent protection.
506
00:33:39,210 --> 00:33:42,270
But there was lobbying going on,
so the word “equivalent” was
507
00:33:42,270 --> 00:33:47,090
changed to “adequate”. And
“adequate” means basically nothing.
508
00:33:47,090 --> 00:33:51,120
Because anything and nothing can be
adequate. “Adequate” has no legal meaning.
509
00:33:51,120 --> 00:33:55,910
I mean if you ask what an adequate
dressing is – you don’t really know.
510
00:33:55,910 --> 00:33:59,500
So they changed that actually back to the
law… to the wording that was lobbied
511
00:33:59,500 --> 00:34:02,880
out of the law and said it has to be
“essentially equivalent” and that’s how
512
00:34:02,880 --> 00:34:05,980
we now understand “adequate”. Which is
cool because any third country now
513
00:34:05,980 --> 00:34:10,110
has to provide more or less the same
level of protection as Europe has.
514
00:34:10,110 --> 00:34:15,469
There have to be effective detection
and supervision mechanisms. And
515
00:34:15,469 --> 00:34:18,199
there has to be legal redress. Just
a really short thing on the picture:
516
00:34:18,199 --> 00:34:20,899
I was actually just pointing at two
people and they were taking a picture
517
00:34:20,899 --> 00:34:24,530
from down there to make it a Victory
sign. And that’s what the media
518
00:34:24,530 --> 00:34:27,929
then makes of it: “Whoo”.
making short Victory gesture
519
00:34:27,929 --> 00:34:31,899
I have to speed up a little bit.
Not too much but a little bit.
520
00:34:31,899 --> 00:34:36,070
The future, and I think that’s probably
relevant for you guys as well…
521
00:34:36,070 --> 00:34:40,820
First of all, what this whole judgment
means. First of all the US
522
00:34:40,820 --> 00:34:44,070
basically lost its privileged
status as being a country
523
00:34:44,070 --> 00:34:47,480
that provides adequate [data] protection.
Which is kind of the elephant in the room
524
00:34:47,480 --> 00:34:50,860
that everyone knew anyway, that they’re
not providing it. And now, officially,
525
00:34:50,860 --> 00:34:55,300
they’re not providing it anymore. And the
US is now like any third country.
526
00:34:55,300 --> 00:35:00,320
So like China or Russia or India or
any country we usually transfer data to.
527
00:35:00,320 --> 00:35:03,750
So it’s not like you cannot transfer
data to the US anymore.
528
00:35:03,750 --> 00:35:06,840
But they lost their special status.
Basically what the judgment said:
529
00:35:06,840 --> 00:35:09,080
“You can’t have mass surveillance
and be at the same time
530
00:35:09,080 --> 00:35:13,490
an adequately [data] protecting country”.
Which is kind of logical anyway.
531
00:35:13,490 --> 00:35:16,540
The consequence is that you have to
use the derogations that are in the law
532
00:35:16,540 --> 00:35:20,100
that we have for other countries as well.
So a lot of people said: “You know,
533
00:35:20,100 --> 00:35:23,610
the only result will be that there will be
a consent box saying ‘I consent that my
534
00:35:23,610 --> 00:35:27,480
[personal] data is going to the US.’”
Now the problem is: consent has to be
535
00:35:27,480 --> 00:35:32,160
freely given, informed, unambiguous
and specific under European law.
536
00:35:32,160 --> 00:35:34,350
Which is something all the Googles
and Facebooks in the world have
537
00:35:34,350 --> 00:35:36,870
never understood. That’s the reason
why all these Privacy Policies are
538
00:35:36,870 --> 00:35:41,270
typically invalid. But anyway. So if
you have any of these wordings that
539
00:35:41,270 --> 00:35:45,190
they’re currently using, like “Your data
is subject to all applicable laws” it’s
540
00:35:45,190 --> 00:35:48,300
very likely not “informed” and
“unambiguous”. Because you don’t have
541
00:35:48,300 --> 00:35:52,120
any fucking idea that your data is
ending up at the NSA if you read this.
542
00:35:52,120 --> 00:35:55,510
So what they would have to do is to have
some Policy saying: “I agree that all of
543
00:35:55,510 --> 00:35:59,530
my personal data is made available to the
NSA, FBI and whatsoever – YES/NO”.
544
00:35:59,530 --> 00:36:02,080
applause
Because it has to be “freely given”, so
545
00:36:02,080 --> 00:36:04,410
applause
I have to have the option to say “No”.
546
00:36:04,410 --> 00:36:09,050
Now this would theoretically be possible
but under US law they’re placed under
547
00:36:09,050 --> 00:36:10,510
a “gag order”, so they’re
not allowed to
548
00:36:10,510 --> 00:36:13,610
say this. So they’re in a legal kind of
549
00:36:13,610 --> 00:36:17,030
limbo because on the one hand they have to
say: “It’s this way” but on the other hand
550
00:36:17,030 --> 00:36:22,210
they have to say “No it’s not”. So consent
is not going to give you any solution.
551
00:36:22,210 --> 00:36:25,760
Then there are Standard Contractual
Clauses. That’s the one from Apple that
552
00:36:25,760 --> 00:36:28,130
they’re using right now.
553
00:36:28,130 --> 00:36:31,940
And Standard Contractual Clauses allow
you to have a contract with a provider
554
00:36:31,940 --> 00:36:36,220
in a third country that
pledges to you in the contract
555
00:36:36,220 --> 00:36:40,950
that all your data is safe. The problem
is that they have exception clauses
556
00:36:40,950 --> 00:36:45,140
that basically say: “If there’s mass
surveillance your whole contract is void”
557
00:36:45,140 --> 00:36:48,860
because you cannot have a contract
saying: “Hereby I pledge full Privacy”
558
00:36:48,860 --> 00:36:52,870
and at the same time be subject to these
laws. And this is the interesting thing:
559
00:36:52,870 --> 00:36:56,130
all these companies are saying: “Now we’re
doing Standard Contractual Clauses”,
560
00:36:56,130 --> 00:36:58,400
but none of them are going to hold
up in Court, and everybody knows it,
561
00:36:58,400 --> 00:37:00,450
but of course they have to tell
their shareholders: “Oh we have
562
00:37:00,450 --> 00:37:04,160
a wonderful solution for this.”
563
00:37:04,160 --> 00:37:07,860
The big question here is whether we have
a factual or a legal assessment.
564
00:37:07,860 --> 00:37:12,690
So do we have to look at factually what
data is actually processed by the NSA
565
00:37:12,690 --> 00:37:16,470
and what they are actually doing. Or do we
just have to look at the laws in a country
566
00:37:16,470 --> 00:37:21,550
and the possibility of mass access. So the
factual assessment works fine for Apple,
567
00:37:21,550 --> 00:37:26,120
Google etc. who are all in these Snowden
slides. If you look at the abstract and
568
00:37:26,120 --> 00:37:30,660
legal assessment – which is probably
the thing we legally have to do –
569
00:37:30,660 --> 00:37:34,950
you actually end up with questions like
Amazon. Amazon was not a huge
570
00:37:34,950 --> 00:37:38,280
cloud provider when the Snowden slides
were actually drafted and written.
571
00:37:38,280 --> 00:37:42,410
They’re huge now. And very likely
they’re subject to all of these laws.
572
00:37:42,410 --> 00:37:45,130
So how do we deal with a company like
this? Can we still forward [personal] data
573
00:37:45,130 --> 00:37:48,500
to an Amazon cloud, if we know
they’re subject to these US laws?
574
00:37:48,500 --> 00:37:51,480
So this is the question of which
companies are actually falling
575
00:37:51,480 --> 00:37:56,310
under this whole judgment.
576
00:37:56,310 --> 00:37:59,510
Basically you still have a couple of
other exemptions. So this basic thing
577
00:37:59,510 --> 00:38:03,130
that a couple of people say that you’re
not allowed to book a hotel [room]
578
00:38:03,130 --> 00:38:06,580
in the US anymore is not true. There
are a lot of exceptions in the law e.g.
579
00:38:06,580 --> 00:38:10,580
the performance of a contract. So if
I book a hotel [room] in New York online
580
00:38:10,580 --> 00:38:14,540
my [personal] data has to go to New York
to actually book my hotel [room]. So
581
00:38:14,540 --> 00:38:17,950
in all these cases you can still transfer
[personal] data. The ruling is mainly
582
00:38:17,950 --> 00:38:20,550
on outsourcing. So if you could
theoretically have your [personal] data
583
00:38:20,550 --> 00:38:24,210
in Europe, you’re just choosing not to because
it’s cheaper to host it in the US or
584
00:38:24,210 --> 00:38:29,280
it’s easier or it’s more convenient. In
these cases we actually get problems.
585
00:38:29,280 --> 00:38:33,700
So what we did is we had a second round
of complaints. That is now taking
586
00:38:33,700 --> 00:38:38,290
these judgments on board. You can download
them on the web page as well. And there’s
587
00:38:38,290 --> 00:38:43,440
also the deal that Facebook Ireland
has signed with Facebook US
588
00:38:43,440 --> 00:38:48,760
to provide safety for your data. And this is
currently under investigation in Ireland.
589
00:38:48,760 --> 00:38:52,720
Basically I argued that they have a
contract but the contract is void because
590
00:38:52,720 --> 00:38:56,650
US law says they have to do all this mass
surveillance. I just got the letter that
591
00:38:56,650 --> 00:39:01,350
on November 18th Facebook actually
gave them [the DPC] a huge amount
592
00:39:01,350 --> 00:39:06,650
of information on what they’re actually
doing with the data. This is now going
593
00:39:06,650 --> 00:39:10,160
to be under investigation. The big
question is if the DPC in Ireland is
594
00:39:10,160 --> 00:39:13,260
actually giving us access to this
information. Because so far, about all the
595
00:39:13,260 --> 00:39:17,470
evidence that they had, they said:
“it’s all secret and you cannot know
596
00:39:17,470 --> 00:39:20,500
what Facebook is doing with your data
even though you’re fully informed about
597
00:39:20,500 --> 00:39:24,220
what they’re doing with your data.”
Which is kind of interesting as well. But
598
00:39:24,220 --> 00:39:30,260
– different issue. A big question was also
if there’s gonna be a Safe Harbor 2.0.
599
00:39:30,260 --> 00:39:33,010
I was already told by everybody they’re
not gonna call it a Safe Harbor anymore
600
00:39:33,010 --> 00:39:37,350
because they’re stuck with media
headlines like “Safe Harbor is sunk”
601
00:39:37,350 --> 00:39:40,270
or something like this.
602
00:39:40,270 --> 00:39:43,320
And what happened is that the US has done
a huge lobbying effort. They have said
603
00:39:43,320 --> 00:39:46,490
right on the day that all of this is based
on wrong facts and they’ve never done
604
00:39:46,490 --> 00:39:50,720
any of this; and all of this
is Trade War; and blablablabla.
605
00:39:50,720 --> 00:39:53,510
So they put a lot of pressure on them.
I was actually talking to Jourová,
606
00:39:53,510 --> 00:39:57,210
the Justice Commissioner. And I was
impressed by her. She actually took
607
00:39:57,210 --> 00:40:00,650
a whole hour and she really knew what
was going on. And at the time they had
608
00:40:00,650 --> 00:40:04,120
press releases saying: “We’re really
deeply working on the new Safe Harbor”.
609
00:40:04,120 --> 00:40:07,390
And I was asking Jourová: “Did you get
any of the evidence you need to make
610
00:40:07,390 --> 00:40:10,850
such a finding?” And the answer
was: “Yeah, we’re still waiting for it.
611
00:40:10,850 --> 00:40:13,770
We should get it next week”.
Which basically meant this
612
00:40:13,770 --> 00:40:17,810
is never going to work out anymore. But
of course I think there’s a blame game
613
00:40:17,810 --> 00:40:21,700
going on. The EU has to say: “We
tried everything to find a solution”
614
00:40:21,700 --> 00:40:24,940
and the US is saying: “We tried
everything to find a solution, too”.
615
00:40:24,940 --> 00:40:27,220
And then in the end they will blame
each other for not finding a solution.
616
00:40:27,220 --> 00:40:30,530
That’s my guess. But
we’ll see what happens.
617
00:40:30,530 --> 00:40:34,350
The basic problem with a Safe Harbor 2
is that in the government sector
618
00:40:34,350 --> 00:40:38,380
they’d basically have to rewrite the whole
US legal system. Which they haven’t done
619
00:40:38,380 --> 00:40:43,660
for their own citizens. So they will very
likely not do it for European citizens.
620
00:40:43,660 --> 00:40:46,640
Like judicial redress. Not even an
American has judicial redress. So
621
00:40:46,640 --> 00:40:50,110
they would never give that to a European.
And in the private area: they actually
622
00:40:50,110 --> 00:40:53,750
have to redraft the whole Safe Harbor
principles because they now have to be
623
00:40:53,750 --> 00:40:57,460
essentially equivalent to
what Europe is doing.
624
00:40:57,460 --> 00:41:00,380
So this would also protect people in
the private sphere much more but it
625
00:41:00,380 --> 00:41:05,200
would really take a major overhaul of
the whole system. To give you an idea:
626
00:41:05,200 --> 00:41:08,300
all of these processing operations
are covered by European law. So
627
00:41:08,300 --> 00:41:12,580
from collection all the way
to really deleting the data.
628
00:41:12,580 --> 00:41:16,610
This is what’s covered by the Safe
Harbor principles: only 2 operations,
629
00:41:16,610 --> 00:41:20,250
which are the disclosure by “transmission”
and the “change of purpose”. Anything else
630
00:41:20,250 --> 00:41:23,730
they can do as freely as they wanna do
under the current Safe Harbor things.
631
00:41:23,730 --> 00:41:26,960
So if you talk about “essentially
equivalent” you see on these spaces
632
00:41:26,960 --> 00:41:30,540
already points to slide
that this is miles apart.
633
00:41:30,540 --> 00:41:35,350
So what is the future of EU-US data
flows? We will have massive problems
634
00:41:35,350 --> 00:41:38,130
for the PRISM companies. Because
what they’re doing is just a violation
635
00:41:38,130 --> 00:41:40,950
of our Fundamental Rights. Like it or
not – you can change the law as much
636
00:41:40,950 --> 00:41:44,680
as you want but you cannot
change the Fundamental Rights.
637
00:41:44,680 --> 00:41:47,770
And you’ll have serious problems
for businesses that are subject
638
00:41:47,770 --> 00:41:51,490
to US surveillance law in
general. So I’m wondering
639
00:41:51,490 --> 00:41:54,580
what the final solution is. And that
was part of the issue that I had
640
00:41:54,580 --> 00:41:57,500
with the cases. Typically I like
to have a solution for all of this.
641
00:41:57,500 --> 00:42:01,450
In this case I could only point at the
problems but I couldn’t really come up
642
00:42:01,450 --> 00:42:05,980
with solutions. Because solutions are
something that has to be found politically.
643
00:42:05,980 --> 00:42:11,190
An interesting question was: “How
about EU surveillance, actually?”
644
00:42:11,190 --> 00:42:15,380
Because aren’t they doing more or
less the same thing? Which is true.
645
00:42:15,380 --> 00:42:18,690
And the problem is that the Charter of
Fundamental Rights only applies
646
00:42:18,690 --> 00:42:23,300
to anything that’s regulated by the EU.
And national surveillance is exempt
647
00:42:23,300 --> 00:42:28,720
from any EU law. It’s something that
member states are doing all by themselves.
648
00:42:28,720 --> 00:42:32,650
So you’re out of luck here. You
can possibly argue it through
649
00:42:32,650 --> 00:42:37,840
a couple of circles; but it’s hard to
do. However, Articles 7 and 8 of the Charter
650
00:42:37,840 --> 00:42:42,210
– exactly the same wording as the
European Convention on Human Rights.
651
00:42:42,210 --> 00:42:46,270
And this applies to National Security
cases. So the relevant Court here
652
00:42:46,270 --> 00:42:49,640
is actually in Strasbourg. So you
could probably end up at this Court
653
00:42:49,640 --> 00:42:53,030
with the same argument and say: if they
already found that this is a violation
654
00:42:53,030 --> 00:42:56,370
of the essence [of your rights] in Luxembourg – don’t
you want to give us the same rights
655
00:42:56,370 --> 00:43:00,520
in Strasbourg as well? And these two
Courts are in kind of a fight about
656
00:43:00,520 --> 00:43:04,370
kind of providing proper Privacy
protection and protection in general.
657
00:43:04,370 --> 00:43:08,240
So very likely you can walk up with
a German case or with a UK case
658
00:43:08,240 --> 00:43:11,240
or a French case and pretty much do
the same thing here. So the judgment
659
00:43:11,240 --> 00:43:13,710
will be interesting for European
surveillance as well because
660
00:43:13,710 --> 00:43:16,690
it’s a benchmark. And you can hardly
argue that the US is bad and we’re
661
00:43:16,690 --> 00:43:21,200
not doing the same thing. Other solutions
are possibly technical ones.
662
00:43:21,200 --> 00:43:26,260
So what Microsoft did with the cloud
services and hosting it with the Germans,
663
00:43:26,260 --> 00:43:31,190
with the German Telekom. And there
is really the issue that if you can get
664
00:43:31,190 --> 00:43:36,090
a technical solution of not having any
access from the US side you can actually
665
00:43:36,090 --> 00:43:39,440
get out of the whole problem. So you can
try with encryption or data localization;
666
00:43:39,440 --> 00:43:43,210
all this kind of stuff. However none
of this is really a very sexy solution
667
00:43:43,210 --> 00:43:48,530
to the whole issue. But it’s
something that you can possibly do.
668
00:43:48,530 --> 00:43:53,660
Last thing: enforcement. And this is a
little bit of a pitch, I got to confess.
669
00:43:53,660 --> 00:43:57,170
We have the problem so far that
670
00:43:57,170 --> 00:44:00,430
we have Data Protection law in Europe.
671
00:44:00,430 --> 00:44:03,090
But we don’t really have enforcement. And
the problem is that the lawyers don’t know
672
00:44:03,090 --> 00:44:06,940
what’s happening technically. The
technical people hardly know
673
00:44:06,940 --> 00:44:10,550
what the law says. And then you
have a funding issue. So the idea
674
00:44:10,550 --> 00:44:13,210
that I have right now is to create some
kind of an NGO or some kind of
675
00:44:13,210 --> 00:44:17,570
a “Stiftung Warentest [the German product-testing foundation] for Privacy”. To
kind of look into the devices we all have
676
00:44:17,570 --> 00:44:20,880
and kind of have a structured system of
really looking into it. And then probably
677
00:44:20,880 --> 00:44:24,670
do enforcement as well if the
stuff that you have on your device
678
00:44:24,670 --> 00:44:28,750
is not following European law.
I think this is an approach that
679
00:44:28,750 --> 00:44:31,860
probably changes a lot of the issues.
It’s not gonna change everything.
680
00:44:31,860 --> 00:44:35,680
But this could possibly be a solution to
a lot of what we had. And that’s kind of
681
00:44:35,680 --> 00:44:39,650
what we did in other fields of law as
well. That we have NGOs or organizations
682
00:44:39,650 --> 00:44:42,780
that take care of these things. I think
that would be a solution and probably
683
00:44:42,780 --> 00:44:48,090
helps a little bit. Last – before we
have a question/answer session –
684
00:44:48,090 --> 00:44:51,910
a little Bullshit Bingo to probably get a
couple of questions answered right away.
685
00:44:51,910 --> 00:44:55,990
So the first thing is that a lot
of questions are whether the EU
686
00:44:55,990 --> 00:44:59,100
does the same thing. I just answered it:
Of course they do the same thing and
687
00:44:59,100 --> 00:45:01,720
we’ll have to do something about it
as well. And I hope that my case
688
00:45:01,720 --> 00:45:06,850
is a good case to bring other cases
against member states of the EU.
689
00:45:06,850 --> 00:45:10,460
The second question is that all these PRISM
companies are saying they don’t do this.
690
00:45:10,460 --> 00:45:14,420
It’s absurd because they’re all placed
under gag orders. Or the people that are
691
00:45:14,420 --> 00:45:17,360
talking to us don’t even have the
security clearance to talk about
692
00:45:17,360 --> 00:45:20,990
the surveillance system. So it’s insane
when a PR person comes up and says:
693
00:45:20,990 --> 00:45:25,040
“I hereby read the briefing from Facebook
that we’re not doing this!”. Which
694
00:45:25,040 --> 00:45:28,620
basically is what we have right now.
And that’s what a lot of the media
695
00:45:28,620 --> 00:45:32,530
is referring to as well. Another thing
that Facebook and the US government
696
00:45:32,530 --> 00:45:36,110
have argued later is that they weren’t
asked. They were not invited to the Court
697
00:45:36,110 --> 00:45:40,510
procedure. The fun thing is: both of them
totally knew about the Court procedure.
698
00:45:40,510 --> 00:45:44,780
They just decided not to step in and not
to become a party to the procedure. So they
699
00:45:44,780 --> 00:45:46,620
were first like: “Ouh,
we don’t wanna talk
700
00:45:46,620 --> 00:45:49,720
about it” and then when the decision
701
00:45:49,720 --> 00:45:53,520
came around they were like:
“Oh we weren’t asked”.
702
00:45:53,520 --> 00:45:56,970
Of course it’s mainly a win on paper
but we’re trying to get it implemented
703
00:45:56,970 --> 00:46:01,250
in practice as well. And there
is kind of this argument
704
00:46:01,250 --> 00:46:05,490
“The EU has broken the Internet”
which I typically rebut with “No, the US
705
00:46:05,490 --> 00:46:08,990
has broken the Internet and
the EU is reacting to it”.
706
00:46:08,990 --> 00:46:16,150
applause
707
00:46:16,150 --> 00:46:19,490
Another issue that was interesting
is that a lot of the US side said that
708
00:46:19,490 --> 00:46:23,710
this is protectionism. So the EU is only
enforcing these Fundamental Rights
709
00:46:23,710 --> 00:46:28,870
to hurt US companies. Which is funny
because I’m not involved in kind of
710
00:46:28,870 --> 00:46:31,940
getting more trade to Europe. I’m
just like someone interested in
711
00:46:31,940 --> 00:46:36,340
my Fundamental Rights. And secondly,
European politics has done everything
712
00:46:36,340 --> 00:46:40,610
to kind of keep this case from getting through.
So kind of this idea that this is
713
00:46:40,610 --> 00:46:45,100
a protectionist thing is kind of strange,
too. And the last question is:
714
00:46:45,100 --> 00:46:49,200
“What about the Cables? What about all
the other types of surveillance we have?”
715
00:46:49,200 --> 00:46:53,940
They’re an issue, too. In these cases you
just have more issues of actual hacking,
716
00:46:53,940 --> 00:46:58,870
government hacking, basically. So
illegal access to servers and cables.
717
00:46:58,870 --> 00:47:02,050
Which is harder to tackle than
these companies. Because we have
718
00:47:02,050 --> 00:47:05,720
this private interference. So there are a
lot of other issues around here as well,
719
00:47:05,720 --> 00:47:09,130
I was just happy to kind of get one thing
across. And I’m happy to take questions,
720
00:47:09,130 --> 00:47:13,470
as well.
applause
721
00:47:13,470 --> 00:47:21,300
Herald: Alright…
applause
722
00:47:21,300 --> 00:47:22,989
Max: at lowered voice, in German
How long do we still have for questions?
723
00:47:22,989 --> 00:47:30,000
Herald: We have about
10 minutes for questions.
724
00:47:30,000 --> 00:47:33,960
I would ask you to please line up at
the microphones here in the hall.
725
00:47:33,960 --> 00:47:37,260
We have 6 microphones. And we have also
726
00:47:37,260 --> 00:47:40,420
questions from the IRC.
While you guys queue up
727
00:47:40,420 --> 00:47:44,220
I would take one from the internet.
728
00:47:44,220 --> 00:47:48,410
Signal Angel: Yeah, just
one – for the first time.
729
00:47:48,410 --> 00:47:51,650
Does TTIP influence any of this?
730
00:47:51,650 --> 00:47:55,600
Max: Basically, not really. Because
the judgment was based
731
00:47:55,600 --> 00:47:58,240
on the Fundamental Rights. So if they
have some kind of wording in TTIP
732
00:47:58,240 --> 00:48:02,740
it would again be illegal. And there was
actually a push to get something like that
733
00:48:02,740 --> 00:48:08,750
into TTIP. And as far as I know this idea
was dropped after the judgment. laughs
734
00:48:08,750 --> 00:48:13,410
Just a little info: EDRI has organized
an ask-me-anything thing at 7 PM as well.
735
00:48:13,410 --> 00:48:17,530
So if you got specific questions, you
can also go there. Just as a reminder.
736
00:48:17,530 --> 00:48:20,980
Herald: OK, great.
Microphone No.2, please.
737
00:48:20,980 --> 00:48:25,220
Question: Thank you for your
efforts. The question would be:
738
00:48:25,220 --> 00:48:28,260
Could US businesses
under these findings ever
739
00:48:28,260 --> 00:48:33,930
be employed again
in critical sectors? E.g.
740
00:48:33,930 --> 00:48:39,380
the public sector, Windows in the
Bundestag, and stuff like that?
741
00:48:39,380 --> 00:48:44,230
Max: Yep, yep. That’s a huge problem.
And that’s a problem we had for a while.
742
00:48:44,230 --> 00:48:48,650
I was mainly talking actually with people
in the business area. I’m mainly invited
743
00:48:48,650 --> 00:48:51,460
to conferences there. And people
were telling me: “Yeah, we’re doing
744
00:48:51,460 --> 00:48:55,400
all our bank data on Google
now”. And I was like: WTF?
745
00:48:55,400 --> 00:48:58,670
Because this is not only Privacy.
That’s also trade secrets, all of
746
00:48:58,670 --> 00:49:02,680
this kind of stuff. So there is this
huge issue and if you talk about
747
00:49:02,680 --> 00:49:06,970
the new Windows that is phoning home a
little more than the old one did, you probably
748
00:49:06,970 --> 00:49:11,010
have the same issue here because
Microsoft is falling under the same thing.
749
00:49:11,010 --> 00:49:14,200
Q: No plausible deniability,
therefore culpability.
750
00:49:14,200 --> 00:49:16,170
M: Yep, yep, yep.
Q: Thank you!
751
00:49:16,170 --> 00:49:18,119
Max: Thank you!
752
00:49:18,119 --> 00:49:20,970
Herald: OK, microphone No.3,
please, for your next question.
753
00:49:20,970 --> 00:49:24,920
Question: How would you assess
Microsoft saying they put up
754
00:49:24,920 --> 00:49:29,060
a huge fight that they… well,
755
00:49:29,060 --> 00:49:32,680
they said they had customers’
data in Ireland and they said
756
00:49:32,680 --> 00:49:36,070
they refuse to give it to the FBI.
What should we think of that?
757
00:49:36,070 --> 00:49:38,450
Max: I think to be fair a lot of
these companies have realized
758
00:49:38,450 --> 00:49:42,780
that there is an issue. And that
they have “Feuer am Arsch” [fire under their asses].
759
00:49:42,780 --> 00:49:47,170
And Microsoft… actually a couple of
Microsoft people are talking to me
760
00:49:47,170 --> 00:49:50,780
and are like: “We’re actually not
unhappy about this case because
761
00:49:50,780 --> 00:49:54,620
we have a good argument in the US
now that we’re getting in trouble here…”
762
00:49:54,620 --> 00:49:59,330
But the companies are caught between
these 2 chairs. The US law says:
763
00:49:59,330 --> 00:50:02,500
“We kill you if you’re not giving us all
the data” and the problem so far is
764
00:50:02,500 --> 00:50:06,170
that in the EU… e.g. in Austria
765
00:50:06,170 --> 00:50:09,800
the maximum penalty is 25.000 Euro
if you don’t comply with this.
766
00:50:09,800 --> 00:50:11,950
Q: Peanuts.
M: Which is absurd.
767
00:50:11,950 --> 00:50:15,430
And in most other countries it’s the same.
We now have the Data Protection regulation
768
00:50:15,430 --> 00:50:18,060
that is coming up which gives
you a penalty of a maximum
769
00:50:18,060 --> 00:50:21,580
of 4% of the worldwide turnover,
which is a couple of millions.
770
00:50:21,580 --> 00:50:25,060
And if you want to thank someone there’s
Jan Philipp Albrecht, probably in the room
771
00:50:25,060 --> 00:50:28,520
or not anymore, who is a member of the [EU]
Parliament from the Green Party –
772
00:50:28,520 --> 00:50:31,920
actually from Hamburg – who
has negotiated all of this.
773
00:50:31,920 --> 00:50:34,500
And this actually could possibly
change a couple of these things.
774
00:50:34,500 --> 00:50:38,290
But you have this conflict of laws
and solutions like the Telekom thing –
775
00:50:38,290 --> 00:50:41,650
that you host the data with the Telekom –
could possibly allow them to argue
776
00:50:41,650 --> 00:50:44,690
in the US that they don’t have any factual
access anymore so they can’t give the data
777
00:50:44,690 --> 00:50:49,150
to the US Government. But we’re splitting
the internet here. And this is not really
778
00:50:49,150 --> 00:50:53,270
something I like too much, but it’s
apparently the only solution.
779
00:50:53,270 --> 00:50:55,850
Herald: OK, thank you for your
question. We have another one
780
00:50:55,850 --> 00:51:00,210
at microphone 4, please.
781
00:51:00,210 --> 00:51:07,369
Q: Thank you very much for your
efforts, first of all. And great result!
782
00:51:07,369 --> 00:51:09,630
M: Thank you.
Q: The question from me would also be:
783
00:51:09,630 --> 00:51:12,800
Is there any change in the system
in Ireland now? So if somebody has
784
00:51:12,800 --> 00:51:16,190
a similar struggle to yours – would the
next round be easier or not?
785
00:51:16,190 --> 00:51:19,650
Max: Basically what the Irish DPC got
is a wonderful new building. And
786
00:51:19,650 --> 00:51:22,590
the press release is too funny.
Because it says: “We have a very nice
787
00:51:22,590 --> 00:51:27,540
Victorian building now downtown Dublin
in a very nice neighborhood” and blablabla
788
00:51:27,540 --> 00:51:30,020
and they get double the staff of what
they had before. The key problem
789
00:51:30,020 --> 00:51:33,630
is none of this. I only took the picture
because it kind of shows what’s
790
00:51:33,630 --> 00:51:37,630
inside the building. And the key
problem is that we have 2 countries
791
00:51:37,630 --> 00:51:40,390
– Luxembourg and Ireland, where
all of these headquarters are – and
792
00:51:40,390 --> 00:51:44,080
these 2 countries are not interested
in collecting taxes, they’re
793
00:51:44,080 --> 00:51:48,260
not interested in enforcing Privacy Law,
they’re not interested in any of this. And
794
00:51:48,260 --> 00:51:52,360
they’re basically getting a huge bunch of
money on the back of the rest of the EU.
795
00:51:52,360 --> 00:51:55,869
And until this actually changes
and there’s a change of attitude
796
00:51:55,869 --> 00:52:00,460
in the Irish DPC it doesn’t really
matter in which building they are.
797
00:52:00,460 --> 00:52:03,710
So they got a lot more money to kind
of say – to the public: “Yes we have
798
00:52:03,710 --> 00:52:05,660
more money and we have
more staff and dadadadada”…
799
00:52:05,660 --> 00:52:07,700
Q: …but the system did not change!
800
00:52:07,700 --> 00:52:10,680
M: The big question is what the system is
doing: they can prove it now! As they have
801
00:52:10,680 --> 00:52:14,180
the new complaint on their table, on Safe
Harbor and PRISM and Facebook,
802
00:52:14,180 --> 00:52:16,740
they can prove whether they do something
about it or not. My guess is that
803
00:52:16,740 --> 00:52:19,390
they’ll find “some” random reasons
why unfortunately they couldn’t do
804
00:52:19,390 --> 00:52:23,229
anything about it. We’ll see.
Q: OK, thanks!
805
00:52:23,229 --> 00:52:26,490
Herald: OK, thank you! It’s
your turn, microphone No.2.
806
00:52:26,490 --> 00:52:32,490
Question: OK, thank you very much and also
thank you for your service for the public.
807
00:52:32,490 --> 00:52:40,980
M: Thanks for the support!
applause
808
00:52:40,980 --> 00:52:45,540
Q: What that will…
Sorry about the English…
809
00:52:45,540 --> 00:52:50,030
M: Say it in German!
810
00:52:50,030 --> 00:52:57,869
Q: [in German] What does this actually mean
for the whole data retention story
811
00:52:57,869 --> 00:53:03,700
if it now comes back?
And to what extent is social media
812
00:53:03,700 --> 00:53:08,410
now sort of exempted from it
again, or not?
813
00:53:08,410 --> 00:53:12,060
M: To be honest I didn’t really look
into the German Data Retention thing
814
00:53:12,060 --> 00:53:15,610
too much. Being an Austrian
I’m like “Our Supreme Cou… Constitu…”
815
00:53:15,610 --> 00:53:17,060
Q: Me, too!
audience laughing
816
00:53:17,060 --> 00:53:21,210
M: Yeah, I heard. “Our Constitutional
Court kind of killed it”, so…
817
00:53:21,210 --> 00:53:25,060
I don’t think we’ll see Data
Retention in Austria any time soon.
818
00:53:25,060 --> 00:53:27,380
But for Germany it’s gonna be interesting
especially if you find a way to
819
00:53:27,380 --> 00:53:31,490
go to Luxembourg in the end. Like if you
find some hook to say: “Actually,
820
00:53:31,490 --> 00:53:34,930
this German law violates something
in the Data Protection Regulation
821
00:53:34,930 --> 00:53:38,980
or in the Directive”. So we can probably
find a way to go back to Luxembourg.
822
00:53:38,980 --> 00:53:41,290
Could help. The other thing is that just
the fact that the Luxembourg Court
823
00:53:41,290 --> 00:53:46,100
has been so active has probably boosted
up a lot of the National Courts as well.
824
00:53:46,100 --> 00:53:50,020
Because the German decision, I had
the feeling, was like a “We don’t really
825
00:53:50,020 --> 00:53:53,490
feel like we can fully say that this is
actually illegal and we kind of argued
826
00:53:53,490 --> 00:53:56,230
that it’s somehow not illegal the way
it is, but possibly you can do it
827
00:53:56,230 --> 00:54:00,090
in the future and uooah…”. And after
Luxembourg has really thrown
828
00:54:00,090 --> 00:54:04,790
all of this right out of the door and was
like: “Get lost with your Data Retention
829
00:54:04,790 --> 00:54:08,200
thing and especially with the PRISM thing”
you probably have better case law now,
830
00:54:08,200 --> 00:54:11,240
as well. And that could be relevant
for National Courts as well. Because
831
00:54:11,240 --> 00:54:16,160
of course these things are a question of
proportionality. And if we ask everybody
832
00:54:16,160 --> 00:54:18,430
here in the room what they
think is proportionate or not,
833
00:54:18,430 --> 00:54:22,400
everyone has another opinion. And
therefore it’s relevant what our people
834
00:54:22,400 --> 00:54:25,260
are saying and what other Courts are
saying to probably get the level of
835
00:54:25,260 --> 00:54:28,780
what we feel as proportionate
somehow a little bit up.
836
00:54:28,780 --> 00:54:31,640
Q: So thank you very much. And go on!
M: Thank you!
837
00:54:31,640 --> 00:54:35,610
Herald: OK, just for the record, in
case you couldn’t tell by the answer:
838
00:54:35,610 --> 00:54:39,680
the question was about the implications
for the Data Retention Laws, like
839
00:54:39,680 --> 00:54:44,690
in Germany and Austria. Microphone
No.1, we have another question.
840
00:54:44,690 --> 00:54:48,750
Question: Hi! Two questions. One: could
you tell a little bit more about your idea
841
00:54:48,750 --> 00:54:55,850
of a “Stiftung Datenschutz” [Data Protection Foundation] Europe-wide?
And how do we get funding to you…
842
00:54:55,850 --> 00:54:58,690
M: Send me an email!
Q: …if you don’t mind?
843
00:54:58,690 --> 00:55:03,810
Second question: when I argue with people
about like “Let’s keep the personal data
844
00:55:03,810 --> 00:55:08,300
of all activists within Europe!” I always
get this answer: “Yeah, are you so naive,
845
00:55:08,300 --> 00:55:11,770
do you think it’s anything different
if the server is in Frankfurt
846
00:55:11,770 --> 00:55:14,170
instead of San Francisco?”
What do you say to that?
847
00:55:14,170 --> 00:55:17,670
Max: It’s pretty much the same problem
that we have – and that’s the reason
848
00:55:17,670 --> 00:55:20,410
why I said I hope this judgment is used
for National surveillance cases in Europe,
849
00:55:20,410 --> 00:55:25,190
as well. Because we do have the same
issues. I mean when you are an Austrian
850
00:55:25,190 --> 00:55:29,560
and the German “Untersuchungsausschuss”
[inquiry committee] is basically saying: “Ah, we’re only
851
00:55:29,560 --> 00:55:32,840
protecting Germans” I feel like my
fucking data is going through Frankfurt
852
00:55:32,840 --> 00:55:36,750
all the time. And I’m kind of out of
the scope, apparently. So we do need
853
00:55:36,750 --> 00:55:39,369
to take care of this as well. I hope
that this is a case showing that
854
00:55:39,369 --> 00:55:42,580
you can actually take action. You
just have to poke long enough and
855
00:55:42,580 --> 00:55:47,770
kind of poke at the right spot especially.
And I think this is something that…
856
00:55:47,770 --> 00:55:51,280
there’s not an ultimate solution to it.
It’s just one of the kind of holes
857
00:55:51,280 --> 00:55:54,869
that you have. The other thing that we
may see is that a lot of companies that
858
00:55:54,869 --> 00:55:59,630
are holding this data will question
an order they get much more.
859
00:55:59,630 --> 00:56:03,540
Because if they get legal problems from
an order they got from a German Court
860
00:56:03,540 --> 00:56:08,210
or whatever it is, they are probably
now more interested in it – and
861
00:56:08,210 --> 00:56:10,530
actually looking at it. Because
right now it’s cheaper for them
862
00:56:10,530 --> 00:56:14,390
to just forward the data. You don’t need
a whole Legal Team reviewing it all.
863
00:56:14,390 --> 00:56:17,760
So I think to kind of split the private
companies that are helping them
864
00:56:17,760 --> 00:56:21,880
from the Government and kind of get some
issue between them probably helps there,
865
00:56:21,880 --> 00:56:25,010
as well. But of course it’s just like
little peanuts you put in there.
866
00:56:25,010 --> 00:56:28,930
But in the end you still have
that issue. Yeah.
867
00:56:28,930 --> 00:56:34,660
On the “Stiftung Datenschutz” or whatever:
I think that’s kind of a thing I just
868
00:56:34,660 --> 00:56:37,710
wanted to throw out to people here, because
I’m mainly in the legal sphere and in,
869
00:56:37,710 --> 00:56:43,200
like the activist/consumer side. And
I think the big problem we have
870
00:56:43,200 --> 00:56:48,249
on the legal and consumer side is that we
don’t understand the devices that much.
871
00:56:48,249 --> 00:56:51,050
And we lack the evidence. We
don’t really have the evidence of
872
00:56:51,050 --> 00:56:54,050
what’s actually going on on devices
and you need to have this evidence
873
00:56:54,050 --> 00:56:56,869
if you go in front of Courts. I think
a lot of the people in the room probably
874
00:56:56,869 --> 00:57:00,890
have this evidence somewhere on the
computer. So the idea of really getting
875
00:57:00,890 --> 00:57:03,670
this connection at some point – it’s not
something I can pitch to you right away,
876
00:57:03,670 --> 00:57:07,220
because it’s not… like I don’t wanna
start it tomorrow. But it’s something
877
00:57:07,220 --> 00:57:10,560
I wanted to circulate to get feedback
as well, what you guys think of it.
878
00:57:10,560 --> 00:57:15,559
So if there’s any feedback on it, send me
an email, or Twitter. Or whatever it is.
879
00:57:15,559 --> 00:57:21,520
applause
880
00:57:21,520 --> 00:57:23,789
Herald: So we do have a bit of time left,
881
00:57:23,789 --> 00:57:26,640
microphone No.2 with the
next question, please.
882
00:57:26,640 --> 00:57:31,390
Question: What can I do as an individual
person now? Can I sue Google
883
00:57:31,390 --> 00:57:35,770
or can I sue other companies
just to stop this?
884
00:57:35,770 --> 00:57:40,160
And would it create some
pressure if I do that?
885
00:57:40,160 --> 00:57:43,040
So what can the ordinary
citizen do now?
886
00:57:43,040 --> 00:57:47,160
Max: We’re right now… I already prepared
it but I didn’t have time to send it out
887
00:57:47,160 --> 00:57:50,070
to have complaints against the Googles and
all the others that are on the PRISM list.
888
00:57:50,070 --> 00:57:53,180
We started with Facebook because I kind
of know them the best. And, you know, so
889
00:57:53,180 --> 00:57:57,430
it was a good start. And the idea
was really to have other people
890
00:57:57,430 --> 00:58:00,790
probably copy-pasting this. The complaint
against Facebook we actually filed
891
00:58:00,790 --> 00:58:05,109
with the Hamburg DPC, as well, and the
Belgian DPC. The idea behind it was
892
00:58:05,109 --> 00:58:08,670
that the Irish now suddenly have 2 other
DPCs that are more interested in
893
00:58:08,670 --> 00:58:12,940
enforcing the law in their boat. So
they’re not the only captains anymore.
894
00:58:12,940 --> 00:58:17,330
And it’s interesting what’s gonna happen
here. If there are other people
895
00:58:17,330 --> 00:58:21,660
that have other cases – just file a
complaint with your Data Protection
896
00:58:21,660 --> 00:58:25,200
authority. A lot of them, especially the
German Data Protection authorities
897
00:58:25,200 --> 00:58:28,940
– most of them – are really interested
in doing something about it, but
898
00:58:28,940 --> 00:58:32,260
they oftentimes just need a case. They
need someone to complain about it and
899
00:58:32,260 --> 00:58:37,070
someone giving them the evidence and
someone arguing it, to get things started.
900
00:58:37,070 --> 00:58:41,760
So if anyone is using Google Drive
or something like that – let’s go.
901
00:58:41,760 --> 00:58:44,750
And basically the wording is on our web
page. You just have to download it
902
00:58:44,750 --> 00:58:50,300
and reword it. And we’re probably gonna
publish the complaints against
903
00:58:50,300 --> 00:58:53,080
the other companies on the website, as soon as
they’re out. Probably in the next 2–3 weeks.
904
00:58:53,080 --> 00:59:00,270
Something like this. So just
copy-paste and spread the love.
905
00:59:00,270 --> 00:59:04,260
Herald: OK, thank you
very much, Max, again!
906
00:59:04,260 --> 00:59:07,530
For your great talk. This is it!
907
00:59:07,530 --> 00:59:10,430
postroll music
908
00:59:10,430 --> 00:59:17,919
Subtitles created by c3subtitles.de
in 2016. Join and help us do more!