WEBVTT 00:00:00.000 --> 00:00:09.520 32C3 preroll music 00:00:09.520 --> 00:00:13.210 Herald: Our next talk is called “Safe Harbor”. 00:00:13.210 --> 00:00:18.019 Background is: back in October, in the light of the Snowden revelations 00:00:18.019 --> 00:00:22.829 the Court of Justice of the European Union – that’s the “EuGH” in German – 00:00:22.829 --> 00:00:29.159 declared the Safe Harbor agreement between the EU and the US invalid. 00:00:29.159 --> 00:00:32.500 This talk is about how we got there as well as further implications 00:00:32.500 --> 00:00:37.470 of that decision. Please believe me when I say our speaker is ideally suited 00:00:37.470 --> 00:00:42.730 to talk about that topic. Please give it up for the man actually suing Facebook 00:00:42.730 --> 00:00:45.990 over Data Protection concerns: Max Schrems! 00:00:45.990 --> 00:00:50.940 applause and cheers 00:00:50.940 --> 00:00:53.290 Max Schrems: Hallo! Hey! applause and cheers 00:00:53.290 --> 00:01:01.770 applause 00:01:01.770 --> 00:01:05.420 It’s cheerful like some Facebook Annual conference where the newest things 00:01:05.420 --> 00:01:09.920 are kind of presented. I’m doing a little intro, basically on how I got there. 00:01:09.920 --> 00:01:13.330 This was my nice little university in California. And I was studying there 00:01:13.330 --> 00:01:15.259 for half a year and there were a couple of people from Facebook 00:01:15.259 --> 00:01:18.149 and other big companies and they were talking about 00:01:18.149 --> 00:01:21.289 European Data Protection law. And the basic thing they said – it was 00:01:21.289 --> 00:01:25.219 not an original quote but basically what they said is: “Fuck the Europeans, 00:01:25.219 --> 00:01:28.050 you can fuck their law as much as you want and nothing is going to happen.” 00:01:28.050 --> 00:01:31.919 And that was kind of the start of the whole story because I thought: “Okay, 00:01:31.919 --> 00:01:35.639 let’s just make a couple of complaints and see where it goes.” 00:01:35.639 --> 00:01:41.219 I originally got 1,300 pages of Facebook data back then, because you can exercise 00:01:41.219 --> 00:01:45.420 your right to access. And Facebook actually sent me a CD with a PDF file 00:01:45.420 --> 00:01:49.069 on it with all my Facebook data. It was by far not everything 00:01:49.069 --> 00:01:51.880 but it was the first time that someone really got the data and I was asking 00:01:51.880 --> 00:01:54.889 someone from Facebook why they were so stupid to send me all this information. 00:01:54.889 --> 00:01:59.100 Because a lot of it was obviously illegal. And the answer was “We had internal 00:01:59.100 --> 00:02:04.090 communications problems.” So someone was just stupid enough to burn it on a CD and 00:02:04.090 --> 00:02:09.158 send it on. One of the CDs actually was first going to Sydney in Australia because 00:02:09.158 --> 00:02:12.819 they put “Australia” instead of “Austria” on the label, which was one of the things 00:02:12.819 --> 00:02:15.050 as well. applause 00:02:15.050 --> 00:02:20.010 Anyway, this was basically how my interest in Facebook started; 00:02:20.010 --> 00:02:23.450 and the media got crazy about it because there is like a little guy that does 00:02:23.450 --> 00:02:27.280 something against the big guy. And this is basically how the whole thing got 00:02:27.280 --> 00:02:30.980 this big. This is like a cartoon from my Salzburg newspaper.
This should be me, 00:02:30.980 --> 00:02:34.640 and it’s like basically the reason why the story got that big because it’s 00:02:34.640 --> 00:02:37.790 a small guy doing something against Facebook, not necessarily because 00:02:37.790 --> 00:02:41.770 what I was doing was so especially smart. But the story was just good for the media, 00:02:41.770 --> 00:02:45.879 ’cause data protection is generally a very dry topic that they can’t report about 00:02:45.879 --> 00:02:50.019 and there they had like the guy that did something. 00:02:50.019 --> 00:02:54.299 A couple of introductions. We actually had 3 procedures. So if you heard about 00:02:54.299 --> 00:02:58.290 what I was doing… There was originally a procedure at the Irish Data Protection 00:02:58.290 --> 00:03:02.310 Commission, on Facebook itself – so what Facebook itself does with the data. 00:03:02.310 --> 00:03:06.620 This procedure has ended after 3 years. There’s a “Class Action” in Vienna 00:03:06.620 --> 00:03:09.930 right now that’s still ongoing. It’s in front of the Supreme Court in Austria 00:03:09.930 --> 00:03:14.900 right now. And there is the procedure that I’m talking about today which is 00:03:14.900 --> 00:03:19.920 the procedure on Safe Harbor at the Irish Data Protection Commission. 00:03:19.920 --> 00:03:22.719 Some other background information: I personally don’t think 00:03:22.719 --> 00:03:25.159 Facebook is the issue. Facebook is just a nice example for 00:03:25.159 --> 00:03:29.609 an overall bigger issue. So I was never personally concerned with Facebook but 00:03:29.609 --> 00:03:32.969 for me the question is how we enforce Data Protection and that kind of stuff. 00:03:32.969 --> 00:03:35.909 applause So it’s not a Facebook talk; Facebook is 00:03:35.909 --> 00:03:39.290 applause the example. And of course the whole thing 00:03:39.290 --> 00:03:42.599 is just one puzzle piece. A lot of people are saying: “This was one win but there 00:03:42.599 --> 00:03:46.279 are so many other issues!” – Yes, you’re totally right! This was just one issue. 00:03:46.279 --> 00:03:49.439 But you got to start somewhere. And the whole thing is also 00:03:49.439 --> 00:03:53.209 not an ultimate solution. So I cannot present to you the final solution 00:03:53.209 --> 00:03:57.530 for everything, but probably a couple of possibilities to do something. 00:03:57.530 --> 00:04:00.969 If you’re interested in the documents – we pretty much publish everything 00:04:00.969 --> 00:04:05.469 on the web page. It’s a very old-style web page. But you can download the PDF files 00:04:05.469 --> 00:04:10.379 and everything if you’re interested in the facts and (?) the details. 00:04:10.379 --> 00:04:15.779 Talking about facts, the whole thing started with the Snowden case, 00:04:15.779 --> 00:04:19.019 where we kind of for the first time had documents proving who is actually 00:04:19.019 --> 00:04:23.500 forwarding data to the NSA in this case. And this is the interesting part, because 00:04:23.500 --> 00:04:26.530 we have a lot of rumours but if you’re in a Court room you actually have to prove 00:04:26.530 --> 00:04:30.180 everything and you cannot just suspect that very likely they’re doing it. But you 00:04:30.180 --> 00:04:35.199 need actual proof. And thanks to Snowden we had at least a bunch of information 00:04:35.199 --> 00:04:40.550 that we could use. These are the slides, you all know them.
The first very 00:04:40.550 --> 00:04:46.880 interesting thing was the FISA Act and we mainly argued under 1881a as an example 00:04:46.880 --> 00:04:51.509 for the overall surveillance in the US. So we took this law as an example but it was 00:04:51.509 --> 00:04:55.560 not the only thing we relied on. And I think it’s interesting for Europeans to 00:04:55.560 --> 00:05:01.030 understand how the law actually works. The law actually goes after data and not 00:05:01.030 --> 00:05:08.849 after people. We typically have laws in criminal procedures that go after people. 00:05:08.849 --> 00:05:13.289 This law goes after data. So it totally falls outside of our normal thinking of 00:05:13.289 --> 00:05:15.229 “we’re going after a suspect, someone that 00:05:15.229 --> 00:05:17.810 may have committed a crime”. Basically the 00:05:17.810 --> 00:05:21.430 law says that there’s an electronic communications service provider that holds 00:05:21.430 --> 00:05:25.270 foreign intelligence information. That’s much more than just terrorist prevention, 00:05:25.270 --> 00:05:29.919 that’s also things that the US is generally interested in. 00:05:29.919 --> 00:05:33.739 And this is the level that’s publicly known and everything else is basically 00:05:33.739 --> 00:05:38.139 classified. So under the law the FISA Court can do a certification for one year 00:05:38.139 --> 00:05:43.669 that basically says “the NSA can access data”. In these certifications there are 00:05:43.669 --> 00:05:46.240 these minimization and targeting procedures that they have to describe. 00:05:46.240 --> 00:05:48.099 But they’re not public. We don’t know what 00:05:48.099 --> 00:05:51.210 they look like. And basically they’re here 00:05:51.210 --> 00:05:56.370 to separate data from US people out of the data set. So it doesn’t really help 00:05:56.370 --> 00:06:00.690 a European. And then there is a so-called Directive that goes to the individual 00:06:00.690 --> 00:06:04.210 service provider which basically says: “Give us the data in some technical 00:06:04.210 --> 00:06:08.020 format.” So very likely it’s some kind of API or some kind of possibility 00:06:08.020 --> 00:06:12.569 that they can retrieve the data. That’s what the law says. We don’t know 00:06:12.569 --> 00:06:17.440 how it actually looks and we don’t have perfect proof of it. So there are 00:06:17.440 --> 00:06:20.530 a lot of things that are disputed and still disputed by the US government. 00:06:20.530 --> 00:06:24.809 So the exact technical implementations, the amount of data that’s actually pulled, 00:06:24.809 --> 00:06:28.830 all the review mechanisms they have internally. That’s all stuff that was 00:06:28.830 --> 00:06:34.110 not 100% sure, and not sure enough to present it to a Court. Which was 00:06:34.110 --> 00:06:38.940 the basic problem we had. First of all after the Snowden thing broke 00:06:38.940 --> 00:06:44.460 we had different reactions. And that was kind of how I started the procedure. 00:06:44.460 --> 00:06:48.020 The first reaction was demonstrations. We were all walking in the streets. 00:06:48.020 --> 00:06:51.080 Which is good and which is important, but we all know that this is something 00:06:51.080 --> 00:06:55.710 we have to do but not something that’s gonna change the world. Second thing: 00:06:55.710 --> 00:06:58.849 we had parliaments like the European Parliament doing resolutions saying 00:06:58.849 --> 00:07:03.110 that we should strike down the Safe Harbor and this is all bad and evil.
We had 00:07:03.110 --> 00:07:07.030 the Commission pretty much saying the same thing. We had national politicians 00:07:07.030 --> 00:07:11.569 saying the same thing. And we all knew that basically this means that they all 00:07:11.569 --> 00:07:16.580 send an angry letter to the US. Then they can walk in front of the media and say: 00:07:16.580 --> 00:07:19.879 “Yes, we’ve done something, we sent an angry letter to the US”, and in the US 00:07:19.879 --> 00:07:25.229 it’s just thrown basically in some trash bin of crazy Europeans wanting strange things 00:07:25.229 --> 00:07:31.960 and that was it. So I was actually called by a journalist and asked if there’s 00:07:31.960 --> 00:07:36.610 some other option. And I was then starting to think about it and there’s 00:07:36.610 --> 00:07:42.300 the so-called Safe Harbor agreement. To explain the “Safe Harbor”: In Europe 00:07:42.300 --> 00:07:46.490 we have Data Protection law that is on paper but factually not enforced. 00:07:46.490 --> 00:07:50.569 But at least, in theory, we have it. And we have a couple of other countries 00:07:50.569 --> 00:07:54.819 that have the same level of protection or similar laws. And generally 00:07:54.819 --> 00:07:58.460 Data Protection only works if you keep the data within the protected sphere so 00:07:58.460 --> 00:08:01.150 you’re not allowed to send personal data to a third country that 00:08:01.150 --> 00:08:04.819 doesn’t have adequate protection. There are a couple of other countries that do; 00:08:04.819 --> 00:08:09.740 and therefore you can transfer data e.g. to Switzerland. This is what the law says. 00:08:09.740 --> 00:08:13.460 And there are certain servers that are outside these countries where 00:08:13.460 --> 00:08:17.009 we can have contractual relationships. So basically if you have a server in India, 00:08:17.009 --> 00:08:20.090 you have a contract with your Indian hosting provider saying: 00:08:20.090 --> 00:08:24.199 “You apply proper Data Protection to it”. So you can transfer data there, too. 00:08:24.199 --> 00:08:27.740 All of this is approved by the European Commission. This is how data 00:08:27.740 --> 00:08:33.299 flows legally outside of the EU – personal data. This all doesn’t apply 00:08:33.299 --> 00:08:38.539 to any other kind of data, only personal data. And we had a basic problem 00:08:38.539 --> 00:08:42.469 with the US because there was this Directive saying you can forward data 00:08:42.469 --> 00:08:46.850 to other countries but there is no Data Protection Law in the US. So basically 00:08:46.850 --> 00:08:48.810 you wouldn’t be allowed to send data there unless you have 00:08:48.810 --> 00:08:52.870 some contractual relationship which is always kind of complicated. 00:08:52.870 --> 00:08:58.110 So the solution was to have a self-certification to EU principles 00:08:58.110 --> 00:09:01.839 and this was put into an Executive Decision by the European Commission. 00:09:01.839 --> 00:09:06.940 So basically how Safe Harbor is working is that e.g. Google can walk up and say: 00:09:06.940 --> 00:09:12.540 “Hereby I pledge that I follow European Data Protection Law. I solemnly swear!”. 00:09:12.540 --> 00:09:15.110 And then they do whatever they want to do.
And basically 00:09:15.110 --> 00:09:17.750 that’s the Safe Harbor system and the Europeans can walk around saying: 00:09:17.750 --> 00:09:22.250 “Yeah, there is some seal saying that everything is fine, so don’t worry.” 00:09:22.250 --> 00:09:25.380 Everybody knew that this is a fucked-up system but for years and years 00:09:25.380 --> 00:09:29.399 everyone was looking away because politics is there and economics is there and 00:09:29.399 --> 00:09:33.560 they just needed it. So basically Safe Harbor works in such a way that a US company 00:09:33.560 --> 00:09:38.360 can follow the Safe Harbor principles and say: “We follow them”, then 00:09:38.360 --> 00:09:41.519 the Federal Trade Commission and private arbitrators are overseeing them 00:09:41.519 --> 00:09:46.700 – in theory, in practice they never do – and this whole thing was packaged 00:09:46.700 --> 00:09:50.160 into a decision by the European Commission. And this is the so-called 00:09:50.160 --> 00:09:55.600 Safe Harbor system. So from a European legal point of view it’s not an agreement 00:09:55.600 --> 00:10:01.010 with the US, it’s a system that the US has set up that we approved as adequate. So 00:10:01.010 --> 00:10:04.300 there’s no binding thing between the US and Europe, we can kind of trash it 00:10:04.300 --> 00:10:10.790 any time. They’ve just never done that. Which brings me to the legal argument. 00:10:10.790 --> 00:10:14.930 Basically if I’m this little Smiley down there, I’m sitting in Austria and 00:10:14.930 --> 00:10:20.920 transfer my data to Facebook Ireland, because worldwide – 82% of all users 00:10:20.920 --> 00:10:23.890 have a contract with Facebook Ireland. Anyone that lives outside the US 00:10:23.890 --> 00:10:28.600 and Canada. So anyone from China, South America, Africa has a contract 00:10:28.600 --> 00:10:33.670 with Facebook in Ireland. And legally they forward the data to Facebook in the US; 00:10:33.670 --> 00:10:37.170 technically the data is directly forwarded. So the data is actually flowing 00:10:37.170 --> 00:10:41.529 right to the servers in the US. However legally it goes through Ireland. And 00:10:41.529 --> 00:10:46.029 my contract partner is an Irish company. And under the law they can only transfer 00:10:46.029 --> 00:10:49.839 data to the US if there is adequate protection. At the same time we know 00:10:49.839 --> 00:10:53.550 that the PRISM system is hooked up in the end. So I was basically walking up 00:10:53.550 --> 00:10:56.720 to the Court and saying: “Mass Surveillance is very likely not 00:10:56.720 --> 00:11:01.669 adequate protection, eh?” And that was basically the argument. 00:11:01.669 --> 00:11:08.980 applause 00:11:08.980 --> 00:11:12.250 The interesting thing in this situation was actually the strategic approach. 00:11:12.250 --> 00:11:17.899 So, we have the NSA and other surveillance organizations that use private companies. 00:11:17.899 --> 00:11:21.880 So we have kind of a public-private surveillance partnership. It’s PPP in 00:11:21.880 --> 00:11:28.540 a kind of surveillance way. Facebook is subject to US law, so under US law they 00:11:28.540 --> 00:11:32.110 have to forward all the data. At the same time Facebook Ireland is subject to 00:11:32.110 --> 00:11:35.040 European law so they’re not allowed to forward all this data. 00:11:35.040 --> 00:11:41.550 Which is interesting because they’re split. The EU law regulates 00:11:41.550 --> 00:11:45.570 how these third country transfers work.
And all of this has to be interpreted 00:11:45.570 --> 00:11:49.959 under Fundamental Rights. So this was basically the system we’re looking at. 00:11:49.959 --> 00:11:53.459 And the really crucial thing is that we have this public-private surveillance. 00:11:53.459 --> 00:11:56.930 Because we do have jurisdiction over a private company. We don’t have 00:11:56.930 --> 00:12:00.680 jurisdiction over the NSA. We can send angry letters to the NSA. But 00:12:00.680 --> 00:12:04.070 we do have jurisdiction over Facebook, Google etc. because they’re basically 00:12:04.070 --> 00:12:08.769 based here. Mainly for tax reasons. And this was the interesting thing that 00:12:08.769 --> 00:12:11.720 in contrast to the national surveillance where we can pretty much just send 00:12:11.720 --> 00:12:15.240 the angry letters we can do something about the private companies. And 00:12:15.240 --> 00:12:19.230 without the private companies there is almost no mass surveillance at this scale 00:12:19.230 --> 00:12:22.810 because the NSA is not in our phones, it’s the Googles and Apples and whatever. 00:12:22.810 --> 00:12:29.000 And without them you’re not really able to get this mass surveillance. 00:12:29.000 --> 00:12:32.579 This is like the legal chart. Basically what we argued is: there’s 7 and 8 00:12:32.579 --> 00:12:35.490 of the Charter of Fundamental Rights. That’s your right to Privacy and 00:12:35.490 --> 00:12:39.420 your right to Data Protection. There is an article in the Directive that 00:12:39.420 --> 00:12:44.149 has to be interpreted in light of it. Then there’s the Executive Decision of the EU. 00:12:44.149 --> 00:12:48.750 This is basically the Safe Harbor decision which refers to Paragraph 4 00:12:48.750 --> 00:12:52.990 of the Safe Harbor principles. And the Safe Harbor principles basically say 00:12:52.990 --> 00:12:56.540 that the FISA Act is okay. So you have kind of this circle 00:12:56.540 --> 00:13:01.579 of different legal layers which is getting really crazy. I’ll try to break it down 00:13:01.579 --> 00:13:05.339 a little bit. Basically, 7 and 8 of the Charter we compared 00:13:05.339 --> 00:13:08.690 to Data Retention, so the “Vorratsdatenspeicherung”. 00:13:08.690 --> 00:13:13.820 We basically said PRISM is much worse. If “Vorratsdatenspeicherung” (Data Retention) 00:13:13.820 --> 00:13:18.300 was invalid then PRISM has to be 10 times as bad. That was basically the argument. 00:13:18.300 --> 00:13:22.240 Very simple. We just compared: the one was content data – the other one 00:13:22.240 --> 00:13:25.779 was metadata. The one is storage – the other one is making available. 00:13:25.779 --> 00:13:29.519 And the one is endless – the other one is 24 months. So basically 00:13:29.519 --> 00:13:33.540 in all these categories PRISM was much worse. And if the one has to be illegal 00:13:33.540 --> 00:13:37.639 the other one has to be as well. And what’s interesting – and that’s something 00:13:37.639 --> 00:13:42.740 that the US side is typically not getting – is that Article 8 is already covering 00:13:42.740 --> 00:13:47.459 “making available of data”. So the fun thing is I only had to prove 00:13:47.459 --> 00:13:51.190 that Facebook makes data available, so basically it’s possible 00:13:51.190 --> 00:13:55.420 the NSA is pulling it. I didn’t even have to prove that the NSA is factually pulling 00:13:55.420 --> 00:14:00.190 my personal data.
And this was like the relevant point because under US law 00:14:00.190 --> 00:14:04.220 basically your Fundamental Rights only kick in when they factually look at your 00:14:04.220 --> 00:14:07.850 data and actually surveil you. So I was only: “They’re making it available 00:14:07.850 --> 00:14:11.740 – that’s good enough for me!” which made all this factual evidence 00:14:11.740 --> 00:14:16.789 much easier. So basically I only had to say: “Look at the XKeyscore slides 00:14:16.789 --> 00:14:22.180 where they say ‘user name Facebook’ and they can somehow get the data out of it. 00:14:22.180 --> 00:14:26.860 It’s at least made available; that’s all I need to prove”. And this is 00:14:26.860 --> 00:14:30.360 the big difference between the US – it’s very simplified, but basically 00:14:30.360 --> 00:14:34.649 between the US approach and the European approach; is that in the US you have to 00:14:34.649 --> 00:14:38.639 prove that your data is actually pulled. I only had to prove that my data is made 00:14:38.639 --> 00:14:44.399 available. So I had to… I was able to get out of all the factual questions. 00:14:44.399 --> 00:14:48.100 This is a comparison – you basically… in the US they have very strict laws 00:14:48.100 --> 00:14:53.149 for certain types of surveillance while in Europe we have a more flexible system 00:14:53.149 --> 00:14:56.730 that covers much more. So it’s a different approach that we just have 00:14:56.730 --> 00:15:00.279 in the two legal spheres. We’re both talking about your Fundamental 00:15:00.279 --> 00:15:03.750 Right to Privacy, but in the details it’s very different. And those are kind of 00:15:03.750 --> 00:15:07.909 the differences that we used. The fun thing is if you’re European you don’t have 00:15:07.909 --> 00:15:11.509 any rights in the US anyways because the Bill of Rights only applies to people 00:15:11.509 --> 00:15:15.829 that live in the US and US citizens so you’re out of luck anyways. So you’re 00:15:15.829 --> 00:15:20.209 only left with the European things. Basically the law which is 00:15:20.209 --> 00:15:23.550 the second level after the Fundamental Rights is saying that there has to be 00:15:23.550 --> 00:15:28.250 an adequate level of protection as I said and this third country has to ensure it 00:15:28.250 --> 00:15:33.279 by domestic law or international commitments. And I was saying: “You know 00:15:33.279 --> 00:15:36.180 there’s the FISA Act, you can read it, it definitely doesn’t ensure 00:15:36.180 --> 00:15:38.630 your fundamental rights and an adequate protection. So we’re 00:15:38.630 --> 00:15:45.110 kind of out of Article 25”. And there is paragraph 4 of the Safe Harbor principles 00:15:45.110 --> 00:15:50.829 which says that all these wonderful privacy principles that US companies sign up to 00:15:50.829 --> 00:15:56.579 do not apply whenever a national law in the US is overruling it. So there are 00:15:56.579 --> 00:16:01.690 principles that companies say: “We follow!” but if there is a city in Texas 00:16:01.690 --> 00:16:05.649 saying: “We have a local ordinance saying: ‘You have to do it differently!’” 00:16:05.649 --> 00:16:09.029 all these Safe Harbor principles don’t apply anymore. And this is 00:16:09.029 --> 00:16:12.699 the fundamental flaw of the self-certification system that it only works 00:16:12.699 --> 00:16:17.240 if there is no law around that conflicts with it. And as there are tons of laws 00:16:17.240 --> 00:16:22.519 that conflict with it you’re hardly able to hold up a system like that.
00:16:22.519 --> 00:16:26.220 So basically if you go through all these different legal layers you end up with 00:16:26.220 --> 00:16:29.730 a conflict between the US FISA Act and the European Fundamental Rights. 00:16:29.730 --> 00:16:33.069 So you’re going through different layers of the system but you’re basically making 00:16:33.069 --> 00:16:41.009 a circle. This is what we did which was a little bit complicated but worked. 00:16:41.009 --> 00:16:53.719 applause 00:16:53.719 --> 00:16:57.430 Basically now to the procedure, so how the whole thing happened. 00:16:57.430 --> 00:17:01.639 First I went through the Safe Harbor. Safe Harbor allows you to go to TRUSTe or 00:17:01.639 --> 00:17:06.630 the Federal Trade Commission and there’s an online form to make your complaint. And 00:17:06.630 --> 00:17:10.270 I was making a complaint and I think you were only allowed to put in 60 characters 00:17:10.270 --> 00:17:13.650 to explain what your complaint is. Which is a little bit complicated if you’re 00:17:13.650 --> 00:17:17.779 trying to explain NSA mass surveillance. So I only wrote: “Stop Facebook, Inc.’s 00:17:17.779 --> 00:17:20.410 involvement in PRISM!”. That was everything I could actually 00:17:20.410 --> 00:17:24.280 put in the text box; that was the absolute maximum. 00:17:24.280 --> 00:17:27.690 And the answer I got back was: “TRUSTe does not have the authority to address 00:17:27.690 --> 00:17:31.280 the matter you raise.” Which is obvious, it’s a private arbitration company 00:17:31.280 --> 00:17:35.370 that can hardly tell Facebook to not follow the NSA’s guidelines. So 00:17:35.370 --> 00:17:39.310 this was the arbitration mechanism under Safe Harbor. You can also go to the 00:17:39.310 --> 00:17:43.390 Federal Trade Commission and have your complaint filed there. But they basically 00:17:43.390 --> 00:17:50.290 just ignore them. This was the letter I got back, that they received it. But 00:17:50.290 --> 00:17:53.760 I was talking to the people at the FTC and they said: “Yeah, we get these complaints 00:17:53.760 --> 00:17:59.130 but they’re ending up in a huge storage system where they stay forever after”. 00:17:59.130 --> 00:18:02.530 So this was enforcement done by Safe Harbor. And we knew that 00:18:02.530 --> 00:18:05.640 in the private field already; but in this case it was especially interesting. 00:18:05.640 --> 00:18:08.720 To be fair, both of these institutions have no power to do anything 00:18:08.720 --> 00:18:11.260 about mass surveillance. So there was really a reason why 00:18:11.260 --> 00:18:14.330 they didn’t do anything. The next step you have is 00:18:14.330 --> 00:18:17.330 the national Data Protection Commissioner. So we have 28 countries 00:18:17.330 --> 00:18:21.640 with 28 [Commissioners]; plus Germany has – I think – a Data Protection Commissioner 00:18:21.640 --> 00:18:26.940 in every province. And you end up at this. And this is my most favourite slide. 00:18:26.940 --> 00:18:30.500 This is the Irish Data Protection Commissioner. 00:18:30.500 --> 00:18:33.910 applause 00:18:33.910 --> 00:18:36.450 To be super precise – I don’t know if you 00:18:36.450 --> 00:18:38.100 can see the laser pointer. But this is a 00:18:38.100 --> 00:18:41.570 supermarket. And this is the Irish Data Protection Commissioner back there. 00:18:41.570 --> 00:18:45.040 laughter, applause 00:18:45.040 --> 00:18:48.370 To be a little more fair, actually they’re up here and they were like 20 people 00:18:48.370 --> 00:18:51.720 when we filed it originally.
The fun thing is back at the time they didn’t have 00:18:51.720 --> 00:18:54.730 a single lawyer or a single technician. So they were 20 00:18:54.730 --> 00:18:58.470 public employees that were dealing with Data Protection and no one 00:18:58.470 --> 00:19:02.960 had any clue about the technical or the legal side of it. 00:19:02.960 --> 00:19:06.140 The fun thing is: this is Billy Hawkes, the Data Protection Commissioner 00:19:06.140 --> 00:19:09.720 at the time. He went on the national radio in the morning. 00:19:09.720 --> 00:19:12.960 And in Ireland radio is a really big thing. So it was a morning show. 00:19:12.960 --> 00:19:16.540 And he was asked about these complaints. And he actually said on the radio: 00:19:16.540 --> 00:19:21.170 “I don’t think it will come as much of a surprise 00:19:21.170 --> 00:19:24.880 that the US services have access to all the US companies”. 00:19:24.880 --> 00:19:28.510 And this was the craziest thing! I was sitting in front of the radio 00:19:28.510 --> 00:19:34.560 and was like: “Strike! He just acknowledged that all this is true!”. 00:19:34.560 --> 00:19:38.040 And the second thing, he said: “This US surveillance operation is not an issue 00:19:38.040 --> 00:19:42.500 of Data Protection”. Interesting. 00:19:42.500 --> 00:19:47.570 It’s actually online and you can listen to it. But the fun thing was really that 00:19:47.570 --> 00:19:52.180 the factual level is so hard to prove that I was afraid that they would dispute: 00:19:52.180 --> 00:19:55.380 “Hah, who knows if all this is true? We don’t have any evidence! 00:19:55.380 --> 00:19:58.710 The companies say we are not engaging in all of this.” 00:19:58.710 --> 00:20:02.770 So having the Data Protection Commissioner saying: “Sure they surveil you! 00:20:02.770 --> 00:20:06.330 Are you surprised?” was great because we were kind of out of 00:20:06.330 --> 00:20:10.180 the whole factual debate. 00:20:10.180 --> 00:20:13.070 I actually got a letter back from them saying that they’re not investigating 00:20:13.070 --> 00:20:17.850 any of it. And I was asking them why. And they were naming 2 sections of the law, 00:20:17.850 --> 00:20:21.600 a combination thereof. So there was one thing where it says they shall investigate 00:20:21.600 --> 00:20:25.130 – which means they have to – or they may investigate. And they say 00:20:25.130 --> 00:20:28.060 they only “may” investigate complaints and they just don’t feel like 00:20:28.060 --> 00:20:32.710 investigating PRISM and Facebook and all of this. Secondly they say 00:20:32.710 --> 00:20:36.920 that a complaint could be “frivolous and vexatious” – I love the word! 00:20:36.920 --> 00:20:41.130 And therefore they’re not investigating it. “A combination thereof or indeed 00:20:41.130 --> 00:20:47.980 any other relevant matter.” So we turned this letter into a picture 00:20:47.980 --> 00:20:51.630 which is basically what they said: “So why did you not investigate PRISM?” 00:20:51.630 --> 00:20:54.140 – “‘Shall’ means ‘may’, frivolous or 00:20:54.140 --> 00:20:55.700 vexatious, a combination of A and B 00:20:55.700 --> 00:20:58.820 or any other reason.” So this was the answer 00:20:58.820 --> 00:21:01.450 by the Irish Data Protection Commissioner why they wouldn’t want to investigate 00:21:01.450 --> 00:21:05.710 the complaint.
Just to give you background information: 00:21:05.710 --> 00:21:08.350 these are the complaints that the Irish Data Protection Commissioner is receiving 00:21:08.350 --> 00:21:10.510 – the blue line – and the red line is all 00:21:10.510 --> 00:21:13.380 of the complaints they’re not deciding. 00:21:13.380 --> 00:21:17.420 Which is 96–98% of the complaints 00:21:17.420 --> 00:21:20.680 they receive in an average year. Which is interesting because you have 00:21:20.680 --> 00:21:24.130 a right to get a decision but they don’t give you one. 00:21:24.130 --> 00:21:27.180 To give you the bigger picture: we also made complaints on Apple 00:21:27.180 --> 00:21:30.510 and all the other PRISM companies. And Ireland basically said 00:21:30.510 --> 00:21:35.310 what I just told you. Luxembourg, where Skype and Microsoft are situated, said 00:21:35.310 --> 00:21:38.760 that they do not have enough evidence for the participation of Microsoft and Skype 00:21:38.760 --> 00:21:43.070 [in PRISM]. And the funniest thing about the answer was that they said 00:21:43.070 --> 00:21:46.030 that they’re restricted in their investigations to the territory 00:21:46.030 --> 00:21:50.220 of Luxembourg. And since all of this is happening in the US they have no way 00:21:50.220 --> 00:21:54.050 of ever finding out what was going on. So I was telling them: “You know, 00:21:54.050 --> 00:21:57.890 most of this is online and if you’re not able to download it I can print it out 00:21:57.890 --> 00:22:02.890 for you and ship it to Luxembourg.” But the reason why we didn’t go further 00:22:02.890 --> 00:22:06.460 in Luxembourg is that they went down this factual kind of argument. They said: 00:22:06.460 --> 00:22:10.010 “It’s all illegal but factually we don’t believe it’s true”. And 00:22:10.010 --> 00:22:15.280 then there was Germany, which is still investigating to this day. 00:22:15.280 --> 00:22:18.300 This was Yahoo. Actually that was Yahoo in Munich but they now 00:22:18.300 --> 00:22:21.140 moved to Ireland as well. So I don’t know what happened to this complaint. 00:22:21.140 --> 00:22:23.800 We never heard back. But whenever we sent an email they were like: “Yeah, we’re 00:22:23.800 --> 00:22:29.820 still investigating.” So what happened now is that I went to the Irish High Court. 00:22:29.820 --> 00:22:34.150 To challenge the non-decision of the Irish Data Protection Commissioner. 00:22:34.150 --> 00:22:37.350 This is the case that then went down as “Schrems vs. the Data Protection 00:22:37.350 --> 00:22:40.790 Commissioner” which is so strange because I never wanted to have my name 00:22:40.790 --> 00:22:43.690 on any of this and now the decision is actually named after my last name 00:22:43.690 --> 00:22:46.990 which is always freaking me out in a way. Because you’re fighting for Privacy and 00:22:46.990 --> 00:22:51.480 suddenly your name is all over the place. applause and laughter 00:22:51.480 --> 00:22:56.010 applause 00:22:56.010 --> 00:23:00.160 And this is the Irish High Court. So you… 00:23:00.160 --> 00:23:04.130 It’s very complicated to get a procedure like that. 00:23:04.130 --> 00:23:07.610 The biggest issue is that you need money. If you’re in front of an Irish Court 00:23:07.610 --> 00:23:10.100 and you lose a case you end up with a legal bill of 00:23:10.100 --> 00:23:13.280 a couple of hundred thousand Euros. Which is the reason why 00:23:13.280 --> 00:23:17.600 nobody ever challenged the Irish Data Protection Commissioner.
00:23:17.600 --> 00:23:21.510 Because you’re just gonna lose your house over it! 00:23:21.510 --> 00:23:25.900 So what I did is: we did a little bit of crowd-funding! And 00:23:25.900 --> 00:23:29.420 we actually got about 70,000 Euros out of it. This was a crowd-funding platform 00:23:29.420 --> 00:23:32.180 that basically worked in a way that people could donate 00:23:32.180 --> 00:23:35.950 and if we don’t need the money we either donate it to another Privacy cause 00:23:35.950 --> 00:23:39.710 or we actually give people the money back. Which we now have to do, 00:23:39.710 --> 00:23:43.490 because we won the case. And all our costs are paid by the other side. 00:23:43.490 --> 00:23:49.960 applause 00:23:49.960 --> 00:23:55.660 So the fun thing is you then have to walk into this wonderful old Court here 00:23:55.660 --> 00:23:59.710 on Mondays at 11:30. And there’s a room where you can 00:23:59.710 --> 00:24:03.990 make your application. And there are about 100 other people making their applications as well. 00:24:03.990 --> 00:24:06.850 And there is no number. So there are 100 lawyers sitting in a room, 00:24:06.850 --> 00:24:11.240 waiting for the judge to call out your case. So we were sitting there until 4 PM 00:24:11.240 --> 00:24:15.430 or something until suddenly our case was called up. And we actually got kind of 00:24:15.430 --> 00:24:19.260 the possibility to bring our case and then it’s postponed to another date and 00:24:19.260 --> 00:24:24.400 blablablablabla. In the end you end up with something like this. 00:24:24.400 --> 00:24:28.380 Which is all the paperwork because in Ireland the Courts 00:24:28.380 --> 00:24:33.020 are not computerized so far. So you have to bring all the paperwork, 00:24:33.020 --> 00:24:38.200 anything you rely on, in 3 copies. And it’s all paper, numbered on the pages, 00:24:38.200 --> 00:24:43.260 so all these copies have pages 1 to 1000. Someone’s writing all of them on the pages. 00:24:43.260 --> 00:24:46.010 And then they copy it 3 times and it’s then in this wonderful little thing. 00:24:46.010 --> 00:24:49.810 I thought it was great. And what happened is that 00:24:49.810 --> 00:24:53.630 we walked into the judge’s room and you get a judge assigned on the same day. 00:24:53.630 --> 00:24:57.690 So you end up in front of a judge that has never heard about Privacy, 00:24:57.690 --> 00:25:00.570 never heard about Facebook and never heard about Snowden and PRISM 00:25:00.570 --> 00:25:03.710 and any of this. So you walk into the room as like “We would like to debate 00:25:03.710 --> 00:25:08.220 the Safe Harbor with you” and he was like “What the fuck is the Safe Harbor?”. 00:25:08.220 --> 00:25:12.250 So what happened is that he told us to kind of explain what it is for 15 minutes. 00:25:12.250 --> 00:25:16.280 And then he postponed the whole thing for 2 hours I think 00:25:16.280 --> 00:25:19.960 and we walked over to a pub and had a beer. So that the judge could remotely 00:25:19.960 --> 00:25:24.540 read what he’s about to look into. 00:25:24.540 --> 00:25:27.150 And Ireland is very interesting because you need a Solicitor and a Counsel 00:25:27.150 --> 00:25:30.900 and then the Counsel is actually talking to the Judge. So I actually had 2 filters. 00:25:30.900 --> 00:25:34.010 If I’m the client down here I had to talk to my Solicitor. The Solicitor 00:25:34.010 --> 00:25:38.520 was telling the Counsel what to say to the Judge. So half of it was lost on the way.
00:25:38.520 --> 00:25:41.730 And when I was asking if I could just address the Judge personally 00:25:41.730 --> 00:25:44.770 they were like “No, no way that you could possibly address the Judge personally 00:25:44.770 --> 00:25:46.240 even though you’re the claimant”. Which is 00:25:46.240 --> 00:25:48.110 funny ’cause they talk about this “person” 00:25:48.110 --> 00:25:52.060 in the room. It’s like “What’s the problem of this Mr. Schrems?”. And you’re like 00:25:52.060 --> 00:25:57.040 sitting right here, it’s like “This would be me!”. 00:25:57.040 --> 00:26:01.060 So what happened in Ireland is that we had about 10 reasons why under Irish law 00:26:01.060 --> 00:26:05.960 the Irish Data Protection Commissioner would have to do its job but the Court 00:26:05.960 --> 00:26:09.880 actually wiped all of this from the table and said actually the Safe Harbor 00:26:09.880 --> 00:26:12.580 is the issue, which legally they’re not allowed to do but which politically 00:26:12.580 --> 00:26:16.500 was very wise, and forwarded this wonderful easy-to-understand question 00:26:16.500 --> 00:26:19.710 to the European Court of Justice. 00:26:19.710 --> 00:26:23.480 The reason why they put this kind of very random question is that 00:26:23.480 --> 00:26:27.710 if you challenge a law in Ireland you have to get the Attorney General engaged. 00:26:27.710 --> 00:26:30.090 And they didn’t want to do that so they kind of “asked a question 00:26:30.090 --> 00:26:33.890 around the actual question” to not really get them engaged. 00:26:33.890 --> 00:26:35.720 Which was very complicated because we didn’t know how 00:26:35.720 --> 00:26:38.610 the European Court of Justice would kind of react to this random question because 00:26:38.610 --> 00:26:41.530 it was so broad that they could just walk in any other direction and not address 00:26:41.530 --> 00:26:47.330 the real issue. What was wonderful is that in the judgment by the Irish Court 00:26:47.330 --> 00:26:50.640 they actually said that all of this is factually true. 00:26:50.640 --> 00:26:54.170 All the mass surveillance is factually true. And the fun thing to understand 00:26:54.170 --> 00:26:58.720 is that the factual assessment is done by the national Courts. So the European 00:26:58.720 --> 00:27:02.000 Court of Justice is not engaging in factual matters anymore. They only 00:27:02.000 --> 00:27:06.800 ask legal questions: “Is this legal or not”. So we had a split of responsibility. 00:27:06.800 --> 00:27:10.680 The Irish Court only said that all of this is true. And Luxembourg only said 00:27:10.680 --> 00:27:15.050 whether all of this would be legal if all of this were true. Which was kind of 00:27:15.050 --> 00:27:19.530 an interesting situation. But to be fair no one before the European 00:27:19.530 --> 00:27:23.010 Court of Justice has ever questioned that this is true. So even the UK 00:27:23.010 --> 00:27:26.270 that was in front of the Court and that should possibly know if all of this 00:27:26.270 --> 00:27:30.270 is true or not, they have never questioned the facts. laughs 00:27:30.270 --> 00:27:34.930 There is a pretty good factual basis. What was interesting as well is 00:27:34.930 --> 00:27:39.150 that I said I’m not gonna go in front of the European Court of Justice. 00:27:39.150 --> 00:27:44.240 Because the cost is so high that even the 60 or 70,000 Euros I got in donations 00:27:44.240 --> 00:27:47.970 wouldn’t cover it. And I knew the judge wanted to get this hot potato off his table 00:27:47.970 --> 00:27:52.280 and down to Luxembourg.
So I was asking for a so-called “protective cost order” 00:27:52.280 --> 00:27:55.470 which kind of tells you beforehand that there is a maximum amount you have to pay 00:27:55.470 --> 00:27:57.960 if you lose a case. And I was actually the first one 00:27:57.960 --> 00:28:01.960 to ever get a protective cost order granted in Ireland. 00:28:01.960 --> 00:28:06.020 Which was really cool and the Irish were like outraged about it, too. 00:28:06.020 --> 00:28:09.760 applause 00:28:09.760 --> 00:28:12.960 So we basically walked into the European Court of Justice which is 00:28:12.960 --> 00:28:17.150 a really hefty procedure. In this room… 00:28:17.150 --> 00:28:20.520 13 judges are in front of you. The European Court of Justice has assigned it 00:28:20.520 --> 00:28:24.150 to the Grand Chamber. So there is a Small, a Medium and a Grand Chamber. 00:28:24.150 --> 00:28:28.480 Which is the highest thing you can possibly end up in Europe. And 00:28:28.480 --> 00:28:31.490 it’s chaired by the President of the European Court of Justice. And this is 00:28:31.490 --> 00:28:36.350 kind of where the really really basic, really important questions are dealt with. 00:28:36.350 --> 00:28:38.890 So I was like: “Cool, I’m getting to the European Court of Justice!”. And it’s 00:28:38.890 --> 00:28:41.930 funny because all the lawyers that were in the room, everyone was like “I can plead 00:28:41.930 --> 00:28:44.580 in front of the European Court of Justice!”. They all took pictures like 00:28:44.580 --> 00:28:47.040 they were in Disneyland or something. audience laughing 00:28:47.040 --> 00:28:52.970 And it was – lawyers can be very… kind of… interesting. And 00:28:52.970 --> 00:28:57.250 we ended up in front of these 3 major people. It was the President; 00:28:57.250 --> 00:29:00.860 Thomas von Danwitz, who is the German judge and also wrote the lead decision. 00:29:00.860 --> 00:29:03.710 He’s the Judge Rapporteur, so within the 13 judges there’s one that is 00:29:03.710 --> 00:29:07.130 the reporting judge and actually drafts the whole case. And he was also 00:29:07.130 --> 00:29:12.760 doing the Data Retention case. And then there was Yves Bot as the Advocate General. 00:29:12.760 --> 00:29:15.220 The hearing was interesting because we got questions 00:29:15.220 --> 00:29:18.480 from the European Court of Justice before the hearing. And in these questions 00:29:18.480 --> 00:29:21.600 they were actually digging down into the core issues of mass surveillance 00:29:21.600 --> 00:29:25.930 in the US. When I got the questions I was like “We won the case!” because 00:29:25.930 --> 00:29:30.810 there’s no way they can decide differently as soon as they address the question. 00:29:30.810 --> 00:29:34.790 There were participants from all over Europe. These are the countries, 00:29:34.790 --> 00:29:37.660 then there was the European Parliament, the European Data Protection Supervisor 00:29:37.660 --> 00:29:40.770 and the European Commission. There was me – MS down there, 00:29:40.770 --> 00:29:44.940 the Data Protection Commissioner and Digital Rights Ireland. And 00:29:44.940 --> 00:29:49.310 what was interesting was the countries that were not there. Like Germany, e.g. 00:29:49.310 --> 00:29:53.000 was not there in this major procedure. And as far as I’ve heard there were 00:29:53.000 --> 00:30:00.300 reasons for not getting too engaged in the Transatlantic Partnership problem.
00:30:00.300 --> 00:30:04.800 So this was kind of interesting because the UK walked up but Germany was like: 00:30:04.800 --> 00:30:08.130 “No, we’d rather not say anything about this.” 00:30:08.130 --> 00:30:12.370 What was interesting as well is that there were interventions by the US Government. 00:30:12.370 --> 00:30:16.470 So I heard… we were… on a Tuesday we were actually in the Court. And on Monday 00:30:16.470 --> 00:30:20.430 I got text messages from people from these different countries telling me that 00:30:20.430 --> 00:30:25.040 the US just called them up. And I was like: “This is interesting” 00:30:25.040 --> 00:30:28.180 because I know a lot of these people from conferences and stuff. So they were like 00:30:28.180 --> 00:30:31.210 telling me: “The US just called me up and said they wanna talk to my 00:30:31.210 --> 00:30:34.520 lead lead lead supervisor and tell me what to say tomorrow in the Court”. 00:30:34.520 --> 00:30:38.050 It was like: “This is very interesting!”. 00:30:38.050 --> 00:30:44.070 I was actually in the Court room and there was the justice person from the US embassy 00:30:44.070 --> 00:30:47.650 to the European Union. And he was actually watching the procedure and watching 00:30:47.650 --> 00:30:50.430 what everybody was arguing. Where I had a feeling this is 00:30:50.430 --> 00:30:54.130 like a watchdog situation. And someone pointed out that this is the guy, 00:30:54.130 --> 00:30:57.420 so I knew who it was. And he was walking up to me and asked: “Are you the plaintiff?” 00:30:57.420 --> 00:31:02.960 And I said: “Yeah, hey!” and he was trying to talk to me and I said: 00:31:02.960 --> 00:31:06.420 “Did you manage calling everybody by now or do you still need a couple of numbers?” 00:31:06.420 --> 00:31:06.870 audience laughing 00:31:06.870 --> 00:31:11.650 And he was like: “(?) arrogant!”. He was like: “He didn’t just ask this question?”. 00:31:11.650 --> 00:31:15.590 He said: “No, we kind of… we’re in contact with all of our colleagues and of course 00:31:15.590 --> 00:31:19.280 we have to kind of push for the interest of the US” and blablabla. I thought: 00:31:19.280 --> 00:31:22.840 “This is very interesting!”. But anyway, it didn’t help them. 00:31:22.840 --> 00:31:27.720 None of them was really arguing for the US, actually. 00:31:27.720 --> 00:31:30.440 The findings of the European Court of Justice, so what was in the judgment 00:31:30.440 --> 00:31:36.050 in the end. First of all, Safe Harbor is invalid. Which was the big news. 00:31:36.050 --> 00:31:39.340 And this was overnight. We were expecting that they would have a grace period 00:31:39.340 --> 00:31:43.700 so it’s invalid within 3 months or something like this. But the minute 00:31:43.700 --> 00:31:48.450 they said it, all your data transfers to the US were suddenly illegal. 00:31:48.450 --> 00:31:58.710 applause Which was kind of big. 00:31:58.710 --> 00:32:02.070 The second biggie was that they actually said that the essence of your rights 00:32:02.070 --> 00:32:04.850 is violated. Now this, for an average person, doesn’t mean too much. 00:32:04.850 --> 00:32:10.270 But for a lawyer it says: “Oh my god, the essence is touched!!”. 00:32:10.270 --> 00:32:14.170 To explain to you what the essence is and why everybody is so excited about it: 00:32:14.170 --> 00:32:17.820 basically, with a violation of your rights there are different levels. At the first level there is no interference.
00:32:17.820 --> 00:32:20.420 So if a policeman was walking down the street and watching you 00:32:20.420 --> 00:32:24.030 there’s no interference with any of your rights. If they, say, tapped your phone 00:32:24.030 --> 00:32:27.450 there is some kind of proportionality issue which is what we typically debate 00:32:27.450 --> 00:32:30.950 before a Court. There is a system for how you argue whether something is proportionate 00:32:30.950 --> 00:32:34.750 or not. So e.g. Data Retention was not proportionate. 00:32:34.750 --> 00:32:37.950 And Data Retention would be somewhere here probably. points to slide 00:32:37.950 --> 00:32:41.800 So not legal anymore but still in a proportionality test. 00:32:41.800 --> 00:32:44.320 And then there is “the essence” which means whatever the fuck 00:32:44.320 --> 00:32:47.190 you’re trying to do here is totally illegal because what you’re doing 00:32:47.190 --> 00:32:49.690 is so much out of the scale of proportionality that 00:32:49.690 --> 00:32:53.530 it will never be legal. And on Data Retention it actually said that 00:32:53.530 --> 00:32:56.590 for the first time… applause 00:32:56.590 --> 00:33:01.890 applause 00:33:01.890 --> 00:33:04.160 …and this was actually the first time as far as I saw 00:33:04.160 --> 00:33:07.290 that the European Court of Justice has ever said that under the convention. 00:33:07.290 --> 00:33:11.410 The convention has only been in place since 2008, I think. 00:33:11.410 --> 00:33:13.640 But it’s the first time they actually found that in a case which was 00:33:13.640 --> 00:33:18.120 huge for law in general. There were a couple of findings 00:33:18.120 --> 00:33:21.660 on Data Protection powers that are not too interesting for you. 00:33:21.660 --> 00:33:26.450 What may be interesting is that 00:33:26.450 --> 00:33:29.760 – there is a story to this picture that’s the reason I put it in – 00:33:29.760 --> 00:33:32.310 basically they said that a third country has 00:33:32.310 --> 00:33:35.600 to provide adequate protection, as I said before. So the story was 00:33:35.600 --> 00:33:39.210 that third countries originally had to provide equivalent protection. 00:33:39.210 --> 00:33:42.270 But there was lobbying going on, so the word “equivalent” was 00:33:42.270 --> 00:33:47.090 changed to “adequate”. And “adequate” means basically nothing. 00:33:47.090 --> 00:33:51.120 Because anything and nothing can be adequate. “Adequate” has no legal meaning. 00:33:51.120 --> 00:33:55.910 I mean if you ask what an adequate dressing is – you don’t really know. 00:33:55.910 --> 00:33:59.500 So they changed that actually back to the law… to the wording that was lobbied 00:33:59.500 --> 00:34:02.880 out of the law and said it has to be “essentially equivalent” and that’s how 00:34:02.880 --> 00:34:05.980 we now understand “adequate”. Which is cool because any third country now 00:34:05.980 --> 00:34:10.110 has to provide more or less the same level of protection as Europe has. 00:34:10.110 --> 00:34:15.469 There have to be effective detection and supervision mechanisms. And 00:34:15.469 --> 00:34:18.199 there has to be legal redress. Just a really short thing on the picture: 00:34:18.199 --> 00:34:20.899 I was actually just pointing at two people and they were taking a picture 00:34:20.899 --> 00:34:24.530 from down there to make it a Victory sign. And that’s what the media 00:34:24.530 --> 00:34:27.929 then makes of it: “Whoo”. making short Victory gesture 00:34:27.929 --> 00:34:31.899 I have to speed up a little bit.
Not too much but a little bit. 00:34:31.899 --> 00:34:36.070 The future, and I think that’s probably relevant for you guys as well… 00:34:36.070 --> 00:34:40.820 First of all, what this whole judgment means. First of all the US 00:34:40.820 --> 00:34:44.070 basically lost its privileged status as being a country 00:34:44.070 --> 00:34:47.480 that provides adequate [data] protection. Which is kind of the elephant in the room 00:34:47.480 --> 00:34:50.860 that everyone knew anyway, that they’re not providing it. And now, officially, 00:34:50.860 --> 00:34:55.300 they’re not providing it anymore. And the US is now like any third country. 00:34:55.300 --> 00:35:00.320 So like China or Russia or India or any country we usually transfer data to. 00:35:00.320 --> 00:35:03.750 So it’s not like you cannot transfer data to the US anymore. 00:35:03.750 --> 00:35:06.840 But they lost their special status. Basically what the judgment said: 00:35:06.840 --> 00:35:09.080 “You can’t have mass surveillance and be at the same time 00:35:09.080 --> 00:35:13.490 an adequately [data] protecting country”. Which is kind of logical anyway. 00:35:13.490 --> 00:35:16.540 The consequence is that you have to use the derogations that are in the law 00:35:16.540 --> 00:35:20.100 that we have for other countries as well. So a lot of people said: “You know, 00:35:20.100 --> 00:35:23.610 the only result will be that there will be a consent box saying ‘I consent that my 00:35:23.610 --> 00:35:27.480 [personal] data is going to the US.’” Now the problem is: consent has to be 00:35:27.480 --> 00:35:32.160 freely given, informed, unambiguous and specific; under European law. 00:35:32.160 --> 00:35:34.350 Which is something all the Googles and Facebooks in the world have 00:35:34.350 --> 00:35:36.870 never understood. That’s the reason why all these Privacy Policies are 00:35:36.870 --> 00:35:41.270 typically invalid. But anyway. So if you have any of these wordings that 00:35:41.270 --> 00:35:45.190 they’re currently using, like “Your data is subject to all applicable laws” it’s 00:35:45.190 --> 00:35:48.300 very likely not “informed” and “unambiguous”. Because you don’t have 00:35:48.300 --> 00:35:52.120 any fucking idea that your data is ending up at the NSA if you read this. 00:35:52.120 --> 00:35:55.510 So what they would have to do is to have some Policy saying: “I agree that all of 00:35:55.510 --> 00:35:59.530 my personal data is made available to the NSA, FBI and whatsoever – YES/NO”. 00:35:59.530 --> 00:36:02.080 applause Because it has to be “freely given”, so 00:36:02.080 --> 00:36:04.410 applause I have to have the option to say “No”. 00:36:04.410 --> 00:36:09.050 Now this would theoretically be possible but under US law they’re placed under 00:36:09.050 --> 00:36:10.510 a “gag order”, so they’re not allowed to 00:36:10.510 --> 00:36:13.610 say this. So they’re in a legal kind of 00:36:13.610 --> 00:36:17.030 Limbo because on the one hand they have to say: “It’s this way” but on the other hand 00:36:17.030 --> 00:36:22.210 they have to say “No it’s not”. So consent is not going to give you any solution. 00:36:22.210 --> 00:36:25.760 Then there are Standard Contractual Clauses. That’s the one from Apple that 00:36:25.760 --> 00:36:28.130 they’re using right now. 00:36:28.130 --> 00:36:31.940 And Standard Contractual Clauses allow you to have a contract with a provider 00:36:31.940 --> 00:36:36.220 in a third country. And that pledges to you in a contract 00:36:36.220 --> 00:36:40.950 that all your data is safe. 
The problem is that they have exception clauses 00:36:40.950 --> 00:36:45.140 that basically say: “If there’s mass surveillance your whole contract is void” 00:36:45.140 --> 00:36:48.860 because you cannot have a contract saying: “Hereby I pledge full Privacy” 00:36:48.860 --> 00:36:52.870 and at the same time be subject to these laws. And this is the interesting thing: 00:36:52.870 --> 00:36:56.130 all these companies are saying: “Now we’re doing Standard Contractual Clauses”, 00:36:56.130 --> 00:36:58.400 but none of them are going to hold up in Courts and everybody knows it, 00:36:58.400 --> 00:37:00.450 but of course to their shareholders they have to say: “Oh we have 00:37:00.450 --> 00:37:04.160 a wonderful solution for this.” 00:37:04.160 --> 00:37:07.860 The big question here is whether we have a factual or a legal assessment. 00:37:07.860 --> 00:37:12.690 So do we have to look at factually what data is actually processed by the NSA 00:37:12.690 --> 00:37:16.470 and what they are actually doing. Or do we just have to look at the laws in a country 00:37:16.470 --> 00:37:21.550 and the possibility of mass access. So the factual assessment works fine for Apple, 00:37:21.550 --> 00:37:26.120 Google etc. who are all in these Snowden slides. If you look at the abstract and 00:37:26.120 --> 00:37:30.660 legal assessment which is legally the thing that we probably have to do 00:37:30.660 --> 00:37:34.950 you actually end up with questions like Amazon. Amazon was not a huge 00:37:34.950 --> 00:37:38.280 cloud provider when the Snowden slides were actually drafted and written. 00:37:38.280 --> 00:37:42.410 They’re huge now. And very likely they’re subject to all of these laws. 00:37:42.410 --> 00:37:45.130 So how do we deal with a company like this? Can we still forward [personal] data 00:37:45.130 --> 00:37:48.500 to an Amazon cloud if we know they’re subject to these US laws? 00:37:48.500 --> 00:37:51.480 So this is the question of which companies are actually falling 00:37:51.480 --> 00:37:56.310 under this whole judgment. 00:37:56.310 --> 00:37:59.510 Basically you still have a couple of other exemptions. So this basic thing 00:37:59.510 --> 00:38:03.130 that a couple of people say, that you’re not allowed to book a hotel [room] 00:38:03.130 --> 00:38:06.580 in the US anymore, is not true. There are a lot of exceptions in the law e.g. 00:38:06.580 --> 00:38:10.580 the performance of a contract. So if I book a hotel [room] in New York online 00:38:10.580 --> 00:38:14.540 my [personal] data has to go to New York to actually book my hotel [room]. So 00:38:14.540 --> 00:38:17.950 in all these cases you can still transfer [personal] data. The ruling is mainly 00:38:17.950 --> 00:38:20.550 on outsourcing. So if you could theoretically have your [personal] data 00:38:20.550 --> 00:38:24.210 in Europe and you’re just choosing not to because it’s cheaper to host it in the US or 00:38:24.210 --> 00:38:29.280 it’s easier or it’s more convenient. In these cases we actually get problems. 00:38:29.280 --> 00:38:33.700 So what we did is we had a second round of complaints that is now taking 00:38:33.700 --> 00:38:38.290 these judgments on board. You can download them on the web page as well. And there’s 00:38:38.290 --> 00:38:43.440 also the deal that Facebook Ireland has signed with Facebook US, 00:38:43.440 --> 00:38:48.760 to provide safety for your data. And this is currently under investigation in Ireland.
00:38:48.760 --> 00:38:52.720 Basically I argued that they have a contract but the contract is void because 00:38:52.720 --> 00:38:56.650 US law says they have to do all this mass surveillance. I just got the letter that 00:38:56.650 --> 00:39:01.350 on November 18th Facebook actually gave them [the DPC] a huge amount 00:39:01.350 --> 00:39:06.650 of information on what they’re actually doing with the data. This is now going 00:39:06.650 --> 00:39:10.160 to be under investigation. The big question is whether the DPC in Ireland is 00:39:10.160 --> 00:39:13.260 actually giving us access to this information. Because so far, all this 00:39:13.260 --> 00:39:17.470 evidence that they had, they said: “it’s all secret and you cannot know 00:39:17.470 --> 00:39:20.500 what Facebook is doing with your data even though you’re fully informed about 00:39:20.500 --> 00:39:24.220 what they’re doing with your data.” Which is kind of interesting as well. But 00:39:24.220 --> 00:39:30.260 – different issue. A big question was also if there’s gonna be a Safe Harbor 2.0. 00:39:30.260 --> 00:39:33.010 I was already told by everybody they’re not gonna call it a Safe Harbor anymore 00:39:33.010 --> 00:39:37.350 because they’re stuck with media headlines like “Safe Harbor is sunk” 00:39:37.350 --> 00:39:40.270 or something like this. 00:39:40.270 --> 00:39:43.320 And what happened is that the US has done a huge lobbying effort. They have said 00:39:43.320 --> 00:39:46.490 right on the day that all of this is based on wrong facts and they’ve never done 00:39:46.490 --> 00:39:50.720 any of this; and all of this is a Trade War; and blablablabla. 00:39:50.720 --> 00:39:53.510 So they put a lot of pressure on them. I was actually talking to Jourová, 00:39:53.510 --> 00:39:57.210 the Justice Commissioner. And I was impressed by her. She actually took 00:39:57.210 --> 00:40:00.650 a whole hour and she really knew what was going on. And at the time they had 00:40:00.650 --> 00:40:04.120 press releases saying: “We’re really deeply working on the new Safe Harbor”. 00:40:04.120 --> 00:40:07.390 And I was asking Jourová: “Did you get any of the evidence you need to make 00:40:07.390 --> 00:40:10.850 such a finding?” And the answer was: “Yeah, we’re still waiting for it. 00:40:10.850 --> 00:40:13.770 We should get it next week”. Which basically meant this 00:40:13.770 --> 00:40:17.810 is never going to work out anymore. But of course I think there’s a blame game 00:40:17.810 --> 00:40:21.700 going on. The EU has to say: “We tried everything to find a solution” 00:40:21.700 --> 00:40:24.940 and the US is saying: “We tried everything to find a solution, too”. 00:40:24.940 --> 00:40:27.220 And then in the end they will blame each other for not finding a solution. 00:40:27.220 --> 00:40:30.530 That’s my guess. But we’ll see what happens. 00:40:30.530 --> 00:40:34.350 The basic problem with a Safe Harbor 2 is that in the government sector 00:40:34.350 --> 00:40:38.380 they’d basically have to rewrite the whole US legal system. Which they haven’t done 00:40:38.380 --> 00:40:43.660 for their own citizens. So they will very likely not do it for European citizens. 00:40:43.660 --> 00:40:46.640 Like judicial redress. Not even an American has judicial redress. So 00:40:46.640 --> 00:40:50.110 they would never give that to a European.
And in the private area: they actually 00:40:50.110 --> 00:40:53.750 have to redraft the whole Safe Harbor principles because they now have to be 00:40:53.750 --> 00:40:57.460 essentially equivalent to what Europe is doing. 00:40:57.460 --> 00:41:00.380 So this would also protect people in the private sphere much more but it 00:41:00.380 --> 00:41:05.200 would really take a major overhaul of the whole system. To give you an idea: 00:41:05.200 --> 00:41:08.300 all of these processing operations are covered by European law. So 00:41:08.300 --> 00:41:12.580 from collection all the way to really deleting the data. 00:41:12.580 --> 00:41:16.610 This is what’s covered by the Safe Harbor principles: only 2 operations, 00:41:16.610 --> 00:41:20.250 which is the disclosure by “transmission” and the “change of purpose”. Anything else 00:41:20.250 --> 00:41:23.730 they can do as fully as they wanna do under the current Safe Harbor things. 00:41:23.730 --> 00:41:26.960 So if you talk about “essentially equivalent” you see on these spaces 00:41:26.960 --> 00:41:30.540 already – points to slide – that this is miles apart. 00:41:30.540 --> 00:41:35.350 So what is the future of EU-US data flows? We will have massive problems 00:41:35.350 --> 00:41:38.130 for the PRISM companies. Because what they’re doing is just a violation 00:41:38.130 --> 00:41:40.950 of our Fundamental Rights. Give or take it – you can change the law as much 00:41:40.950 --> 00:41:44.680 as you want but you cannot change the Fundamental Rights. 00:41:44.680 --> 00:41:47.770 And you’ll have serious problems for businesses that are subject 00:41:47.770 --> 00:41:51.490 to US surveillance law in general. So I’m wondering 00:41:51.490 --> 00:41:54.580 what the final solution is. And that was part of the issue that I had 00:41:54.580 --> 00:41:57.500 with the cases. Typically I like to have a solution for all of this. 00:41:57.500 --> 00:42:01.450 In this case I could only point at the problems but I couldn’t really come up 00:42:01.450 --> 00:42:05.980 with solutions. Because solutions are something that has to be done politically. 00:42:05.980 --> 00:42:11.190 An interesting question was: “How about EU surveillance, actually?” 00:42:11.190 --> 00:42:15.380 Because aren’t they doing more or less the same thing? Which is true. 00:42:15.380 --> 00:42:18.690 And the problem is that the Charter of Fundamental Rights only applies 00:42:18.690 --> 00:42:23.300 to anything that’s regulated by the EU. And national surveillance is exempt 00:42:23.300 --> 00:42:28.720 from any EU law. It’s something that member states are doing all by themselves. 00:42:28.720 --> 00:42:32.650 So you’re out of luck here. You can possibly argue it through 00:42:32.650 --> 00:42:37.840 a couple of circles; but it’s hard to do. However, Articles 7 and 8 of the Charter 00:42:37.840 --> 00:42:42.210 – exactly the same wording as the European Convention on Human Rights. 00:42:42.210 --> 00:42:46.270 And this applies to National Security cases. So the relevant Court here 00:42:46.270 --> 00:42:49.640 is actually in Strasbourg. So you could probably end up at this Court 00:42:49.640 --> 00:42:53.030 with the same argument and say: if they already found that this is a violation 00:42:53.030 --> 00:42:56.370 of your essence in Luxembourg – don’t you want to give us the same rights 00:42:56.370 --> 00:43:00.520 in Strasbourg as well? And these two Courts are in kind of a fight about 00:43:00.520 --> 00:43:04.370 kind of providing proper Privacy protection and protection in general.
00:43:04.370 --> 00:43:08.240 So very likely you can walk up with a German case or with a UK case 00:43:08.240 --> 00:43:11.240 or a French case and pretty much do the same thing here. So the judgment 00:43:11.240 --> 00:43:13.710 will be interesting for European surveillance as well because 00:43:13.710 --> 00:43:16.690 it’s a benchmark. And you can hardly argue that the US is bad and we’re 00:43:16.690 --> 00:43:21.200 not doing the same thing. Other solutions are possibly technical solutions. 00:43:21.200 --> 00:43:26.260 So what Microsoft did with the cloud services and hosting it with the Germans. 00:43:26.260 --> 00:43:31.190 And the German Telekom. And there is really the issue that if you can get 00:43:31.190 --> 00:43:36.090 a technical solution of not having any access from the US side you can actually 00:43:36.090 --> 00:43:39.440 get out of the whole problem. So you can try with encryption or data localization; 00:43:39.440 --> 00:43:43.210 all this kind of stuff. However none of this is really a very sexy solution 00:43:43.210 --> 00:43:48.530 to the whole issue. But it’s something that you can possibly do. 00:43:48.530 --> 00:43:53.660 Last thing: enforcement. And this is a little bit of a pitch, I got to confess. 00:43:53.660 --> 00:43:57.170 We have the problem so far that 00:43:57.170 --> 00:44:00.430 we have Data Protection law in Europe. 00:44:00.430 --> 00:44:03.090 But we don’t really have enforcement. And the problem is that the lawyers don’t know 00:44:03.090 --> 00:44:06.940 what’s happening technically. The technical people hardly know 00:44:06.940 --> 00:44:10.550 what the law says. And then you have a funding issue. So the idea 00:44:10.550 --> 00:44:13.210 that I have right now is to create some kind of an NGO or some kind of 00:44:13.210 --> 00:44:17.570 a “Stiftung Warentest for Privacy” [a consumer-testing foundation for Privacy]. To kind of look into the devices we all have 00:44:17.570 --> 00:44:20.880 and kind of have a structured system of really looking into it. And then probably 00:44:20.880 --> 00:44:24.670 do enforcement as well if the stuff that you have on your device 00:44:24.670 --> 00:44:28.750 is not following European law. I think this is an approach that 00:44:28.750 --> 00:44:31.860 probably changes a lot of the issues. It’s not gonna change everything. 00:44:31.860 --> 00:44:35.680 But this could possibly be a solution to a lot of what we had. And that’s kind of 00:44:35.680 --> 00:44:39.650 what we did in other fields of law as well. That we have NGOs or organizations 00:44:39.650 --> 00:44:42.780 that take care of these things. I think that would be a solution and probably 00:44:42.780 --> 00:44:48.090 helps a little bit. Last – before we have a question/answer session – 00:44:48.090 --> 00:44:51.910 a little Bullshit Bingo to probably get a couple of questions answered right away. 00:44:51.910 --> 00:44:55.990 So the first thing is that a lot of questions are whether the EU 00:44:55.990 --> 00:44:59.100 does the same thing. I just answered it: Of course they do the same thing and 00:44:59.100 --> 00:45:01.720 we’ll have to do something about it as well. And I hope that my case 00:45:01.720 --> 00:45:06.850 is a good case to bring other cases against member states of the EU. 00:45:06.850 --> 00:45:10.460 The second question is that these whole PRISM companies are saying they don’t do this. 00:45:10.460 --> 00:45:14.420 It’s absurd because they’re all placed under gag orders.
Or the people that are 00:45:14.420 --> 00:45:17.360 talking to us don’t even have the security clearance to talk about 00:45:17.360 --> 00:45:20.990 the surveillance system. So it’s insane when a PR person comes up and says: 00:45:20.990 --> 00:45:25.040 “I hereby read the briefing from Facebook that we’re not doing this!”. Which 00:45:25.040 --> 00:45:28.620 basically is what we have right now. And that’s what a lot of the media 00:45:28.620 --> 00:45:32.530 is referring to as well. Another thing that Facebook and the US government 00:45:32.530 --> 00:45:36.110 have argued later is that they weren’t asked. They were not invited to the Court 00:45:36.110 --> 00:45:40.510 procedure. The fun thing is: both of them totally knew about the Court procedure. 00:45:40.510 --> 00:45:44.780 They just decided not to step in and not to become a party to the procedure. So they 00:45:44.780 --> 00:45:46.620 were like first: “Ouh, we don’t wanna talk 00:45:46.620 --> 00:45:49.720 about it” and then when the decision 00:45:49.720 --> 00:45:53.520 came around they were like: “Oh we weren’t asked”. 00:45:53.520 --> 00:45:56.970 Of course it’s a win-on-paper mainly but we’re trying to get it implemented 00:45:56.970 --> 00:46:01.250 in practice as well. And there is kind of this argument 00:46:01.250 --> 00:46:05.490 “The EU has broken the Internet” which I typically rebut with “No, the US 00:46:05.490 --> 00:46:08.990 has broken the Internet and the EU is reacting to it”. 00:46:08.990 --> 00:46:16.150 applause 00:46:16.150 --> 00:46:19.490 Another issue that was interesting is that a lot of the US side said that 00:46:19.490 --> 00:46:23.710 this is protectionism. So the EU is only enforcing these Fundamental Rights 00:46:23.710 --> 00:46:28.870 to hurt US companies. Which is funny because I’m not involved in kind of 00:46:28.870 --> 00:46:31.940 getting more trade to Europe. I’m just like someone interested in 00:46:31.940 --> 00:46:36.340 my Fundamental Rights. And secondly, European politics has done everything 00:46:36.340 --> 00:46:40.610 to kind of not let this case get through. So kind of this idea that this is 00:46:40.610 --> 00:46:45.100 a protectionist thing is kind of strange, too. And the last question which is: 00:46:45.100 --> 00:46:49.200 “What about the Cables? What about all the other types of surveillance we have?” 00:46:49.200 --> 00:46:53.940 They’re an issue, too. In these cases you just have more issues of actual hacking, 00:46:53.940 --> 00:46:58.870 government hacking, basically. So illegal access to servers and cables. 00:46:58.870 --> 00:47:02.050 Which is harder to tackle than these companies. Because [with the companies] we have 00:47:02.050 --> 00:47:05.720 this private interference. So there are a lot of other issues around here as well, 00:47:05.720 --> 00:47:09.130 I was just happy to kind of get one thing across. And I’m happy for questions, 00:47:09.130 --> 00:47:13.470 as well. applause 00:47:13.470 --> 00:47:21.300 Herald: Alright… applause 00:47:21.300 --> 00:47:22.989 Max: at lowered voice How much time do we have left for questions? 00:47:22.989 --> 00:47:30.000 Herald: We have about 10 minutes for questions. 00:47:30.000 --> 00:47:33.960 I would ask you to please line up at the microphones here in the hall. 00:47:33.960 --> 00:47:37.260 We have 6 microphones. And we also have 00:47:37.260 --> 00:47:40.420 questions from the IRC. While you guys queue up 00:47:40.420 --> 00:47:44.220 I would take one from the internet.
00:47:44.220 --> 00:47:48.410 Signal Angel: Yeah, just one – for the first time. 00:47:48.410 --> 00:47:51.650 Does TTIP influence any of this? 00:47:51.650 --> 00:47:55.600 Max: Basically, not really. Because the judgment that was done was 00:47:55.600 --> 00:47:58.240 on the Fundamental Rights. So if they have some kind of wording in TTIP 00:47:58.240 --> 00:48:02.740 it would again be illegal. And there was actually a push to get something like that 00:48:02.740 --> 00:48:08.750 into TTIP. And as far as I know this idea was dropped after the judgment. laughs 00:48:08.750 --> 00:48:13.410 Just a little intro: EDRI has organized an ask-me-anything thing at 7 PM as well. 00:48:13.410 --> 00:48:17.530 So if you got specific questions, you can also go there. Just as a reminder. 00:48:17.530 --> 00:48:20.980 Herald: OK, great. Microphone No.2, please. 00:48:20.980 --> 00:48:25.220 Question: Thank you for your efforts. The question would be: 00:48:25.220 --> 00:48:28.260 Could US businesses under these findings ever 00:48:28.260 --> 00:48:33.930 again be employed in critical sectors? E.g. 00:48:33.930 --> 00:48:39.380 the public sector – Windows in the Bundestag – and stuff like that? 00:48:39.380 --> 00:48:44.230 Max: Yep, yep. That’s a huge problem. And that’s a problem we had for a while. 00:48:44.230 --> 00:48:48.650 I was mainly talking actually with people in the business area. I’m mainly invited 00:48:48.650 --> 00:48:51.460 to conferences there. And people were telling me: “Yeah, we’re doing 00:48:51.460 --> 00:48:55.400 all our bank data on Google now”. And I was like: WTF? 00:48:55.400 --> 00:48:58.670 Because this is not only Privacy. That’s also trade secrets, all of 00:48:58.670 --> 00:49:02.680 this kind of stuff. So there is this huge issue and if you talk about 00:49:02.680 --> 00:49:06.970 the new Windows that is phoning home a little more than the old one did, you probably 00:49:06.970 --> 00:49:11.010 have the same issue here because Microsoft is falling under the same thing. 00:49:11.010 --> 00:49:14.200 Q: No plausible deniability, therefore culpability. 00:49:14.200 --> 00:49:16.170 M: Yep, yep, yep. Q: Thank you! 00:49:16.170 --> 00:49:18.119 Max: Thank you! 00:49:18.119 --> 00:49:20.970 Herald: OK, microphone No.3, please, for your next question. 00:49:20.970 --> 00:49:24.920 Question: How would you assess Microsoft saying they put up 00:49:24.920 --> 00:49:29.060 a huge fight that they… well, 00:49:29.060 --> 00:49:32.680 they said they had customers’ data in Ireland and they said 00:49:32.680 --> 00:49:36.070 they refused to give it to the FBI. What are we to think of that? 00:49:36.070 --> 00:49:38.450 Max: I think to be fair a lot of these companies have realized 00:49:38.450 --> 00:49:42.780 that there is an issue. And that they’ve got “Feuer am Arsch” [a fire under their ass]. 00:49:42.780 --> 00:49:47.170 And Microsoft… actually a couple of Microsoft people are talking to me 00:49:47.170 --> 00:49:50.780 and are like: “We’re actually not unhappy about this case because 00:49:50.780 --> 00:49:54.620 we have a good argument in the US now that we’re getting troubles here…” 00:49:54.620 --> 00:49:59.330 But the companies are caught between these 2 chairs. The US law says: 00:49:59.330 --> 00:50:02.500 “We kill you if you’re not giving us all the data” and the problem so far is 00:50:02.500 --> 00:50:06.170 that in the EU… e.g. in Austria 00:50:06.170 --> 00:50:09.800 the maximum penalty is 25.000 Euro if you don’t comply with this. 00:50:09.800 --> 00:50:11.950 Q: Peanuts. M: Which is absurd.
00:50:11.950 --> 00:50:15.430 And in most other countries it’s the same. We now have the Data Protection Regulation 00:50:15.430 --> 00:50:18.060 that is coming up which gives you a penalty of a maximum 00:50:18.060 --> 00:50:21.580 of 4% of the worldwide turnover, which is a couple of millions. 00:50:21.580 --> 00:50:25.060 And if you want to thank someone there’s Jan Philipp Albrecht, probably in the room 00:50:25.060 --> 00:50:28.520 or not anymore, who is a member of the [EU] Parliament from the Green Party – he’s 00:50:28.520 --> 00:50:31.920 actually from Hamburg – who has negotiated all of this. 00:50:31.920 --> 00:50:34.500 And this actually could possibly change a couple of these things. 00:50:34.500 --> 00:50:38.290 But you have this conflict of laws and solutions like the Telekom thing – 00:50:38.290 --> 00:50:41.650 that you host the data with the Telekom – could possibly allow them to argue 00:50:41.650 --> 00:50:44.690 in the US that they don’t have any factual access anymore so they can’t give the data 00:50:44.690 --> 00:50:49.150 to the US Government. But we’re splitting the internet here. And this is not really 00:50:49.150 --> 00:50:53.270 something I like too much, but it’s apparently the only solution. 00:50:53.270 --> 00:50:55.850 Herald: OK, thank you for your question. We have another one 00:50:55.850 --> 00:51:00.210 at microphone 4, please. 00:51:00.210 --> 00:51:07.369 Q: Thank you very much for your efforts, first of all. And great result! 00:51:07.369 --> 00:51:09.630 M: Thank you. Q: The question from me would also be: 00:51:09.630 --> 00:51:12.800 Is there any change in the system in Ireland now? So if somebody has 00:51:12.800 --> 00:51:16.190 a similar struggle to yours – might the next round be easier or not? 00:51:16.190 --> 00:51:19.650 Max: Basically what the Irish DPC got is a wonderful new building. And 00:51:19.650 --> 00:51:22.590 the press release is too funny. Because it says: “We have a very nice 00:51:22.590 --> 00:51:27.540 Victorian building now in downtown Dublin in a very nice neighborhood” and blablabla 00:51:27.540 --> 00:51:30.020 and they get double the staff of what they had before. The key problem 00:51:30.020 --> 00:51:33.630 is none of this. I only took the picture because it kind of shows what’s 00:51:33.630 --> 00:51:37.630 inside the building. And the key problem is that we have 2 countries 00:51:37.630 --> 00:51:40.390 – Luxembourg and Ireland, where all of these headquarters are – and 00:51:40.390 --> 00:51:44.080 these 2 countries are not interested in collecting taxes, they’re 00:51:44.080 --> 00:51:48.260 not interested in enforcing Privacy Law, they’re not interested in any of this. And 00:51:48.260 --> 00:51:52.360 they’re basically getting a huge bunch of money on the backs of the rest of the EU. 00:51:52.360 --> 00:51:55.869 And until this actually changes and there’s a change of attitude 00:51:55.869 --> 00:52:00.460 in the Irish DPC it doesn’t really matter in which building they are. 00:52:00.460 --> 00:52:03.710 So they got a lot more money to kind of – to the public – say: “Yes we have 00:52:03.710 --> 00:52:05.660 more money and we have more staff and dadadadada”… 00:52:05.660 --> 00:52:07.700 Q: …but the system did not change! 00:52:07.700 --> 00:52:10.680 M: The big question is what the system is doing: they can prove it now! As they have 00:52:10.680 --> 00:52:14.180 the new complaint on their table on Safe Harbor and PRISM and Facebook.
00:52:14.180 --> 00:52:16.740 They can prove it; whether they do something about it or not – my guess is that 00:52:16.740 --> 00:52:19.390 they’ll find “some” random reasons why unfortunately they couldn’t do 00:52:19.390 --> 00:52:23.229 anything about it. We’ll see. Q: OK, thanks! 00:52:23.229 --> 00:52:26.490 Herald: OK, thank you! It’s your turn, microphone No.2. 00:52:26.490 --> 00:52:32.490 Question: OK, thank you very much and also thank you for your service for the public. 00:52:32.490 --> 00:52:40.980 M: Thanks for the support! applause 00:52:40.980 --> 00:52:45.540 Q: What that will… Sorry about the English… 00:52:45.540 --> 00:52:50.030 M: Say it in German! 00:52:50.030 --> 00:52:57.869 Q: What does that actually mean for the whole story with data retention 00:52:57.869 --> 00:53:03.700 if it comes back now? And to what extent is Social Media 00:53:03.700 --> 00:53:08.410 now sort of exempted again by this, or not? 00:53:08.410 --> 00:53:12.060 M: To be honest I didn’t really look into the German Data Retention thing 00:53:12.060 --> 00:53:15.610 too much. To be honest, being an Austrian I’m like “Our Supreme Cou… Constitu…” 00:53:15.610 --> 00:53:17.060 Q: Me, too! audience laughing 00:53:17.060 --> 00:53:21.210 M: Yeah, I heard. “Our Constitutional Court kind of killed it”, so… 00:53:21.210 --> 00:53:25.060 I don’t think we’ll see a Data Retention [law] in Austria too soon. 00:53:25.060 --> 00:53:27.380 But for Germany it’s gonna be interesting especially if you find a way to 00:53:27.380 --> 00:53:31.490 go to Luxembourg in the end. Like if you find some hook to say: “Actually, 00:53:31.490 --> 00:53:34.930 this German law violates something in the Data Protection Regulation 00:53:34.930 --> 00:53:38.980 or in the Directive”. So we can probably find a way to go back to Luxembourg. 00:53:38.980 --> 00:53:41.290 Could help. The other thing is that just the fact that the Luxembourg Court 00:53:41.290 --> 00:53:46.100 has been so active has probably boosted up a lot of the National Courts as well. 00:53:46.100 --> 00:53:50.020 Because the German decision, I had the feeling, was like a “We don’t really 00:53:50.020 --> 00:53:53.490 feel like we can fully say that this is actually illegal and we kind of argued 00:53:53.490 --> 00:53:56.230 that it’s somehow not illegal the way it is, but possibly you can do it 00:53:56.230 --> 00:54:00.090 in the future and uooah…”. And after Luxembourg has really thrown 00:54:00.090 --> 00:54:04.790 all of this right out of the door and was like: “Get lost with your Data Retention 00:54:04.790 --> 00:54:08.200 thing and especially with the PRISM thing” you probably have better case law now, 00:54:08.200 --> 00:54:11.240 as well. And that could be relevant for National Courts as well. Because 00:54:11.240 --> 00:54:16.160 of course these things are a question of proportionality. And if we ask everybody 00:54:16.160 --> 00:54:18.430 here in the room what they think is proportionate or not, 00:54:18.430 --> 00:54:22.400 everyone has a different opinion. And therefore it’s relevant what our people 00:54:22.400 --> 00:54:25.260 are saying and what other Courts are saying to probably get the level of 00:54:25.260 --> 00:54:28.780 what we feel is proportionate somehow a little bit up. 00:54:28.780 --> 00:54:31.640 Q: So thank you very much. And go on! M: Thank you!
00:54:31.640 --> 00:54:35.610 Herald: OK, just for the record, in case you couldn’t tell by the answer: 00:54:35.610 --> 00:54:39.680 the question was about the implications for the Data Retention Laws, like 00:54:39.680 --> 00:54:44.690 in Germany and Austria. Microphone No.1, we have another question. 00:54:44.690 --> 00:54:48.750 Question: Hi! Two questions. One: could you tell a little bit more about your idea 00:54:48.750 --> 00:54:55.850 of “Stiftung Datenschutz” Europe-wide? And how do we get funding to you… 00:54:55.850 --> 00:54:58.690 M: Send me an email! Q: …if you don’t mind? 00:54:58.690 --> 00:55:03.810 Second question: when I argue with people about like “Let’s keep the personal data 00:55:03.810 --> 00:55:08.300 of all activists within Europe!” I always get this answer: “Yeah, are you so naive, 00:55:08.300 --> 00:55:11.770 do you think it’s anything different if the server is in Frankfurt 00:55:11.770 --> 00:55:14.170 instead of San Francisco?” What do you say to that? 00:55:14.170 --> 00:55:17.670 Max: It’s the same problem, like pretty much what we have – and that’s the reason 00:55:17.670 --> 00:55:20.410 why I said I hope this judgment is used for National surveillance in Europe, 00:55:20.410 --> 00:55:25.190 as well. Because we do have the same issues. I mean when you are an Austrian 00:55:25.190 --> 00:55:29.560 and the German “Untersuchungsausschuss” [inquiry committee] is basically saying: “Ah, we’re only 00:55:29.560 --> 00:55:32.840 protecting Germans” I feel like my fucking data is going through Frankfurt 00:55:32.840 --> 00:55:36.750 all the time. And I’m kind of out of the scope, apparently. So we do need 00:55:36.750 --> 00:55:39.369 to take care of this as well. I hope that this is a case showing that 00:55:39.369 --> 00:55:42.580 you can actually take action. You just have to poke long enough and 00:55:42.580 --> 00:55:47.770 kind of poke at the right spot especially. And I think this is something that… 00:55:47.770 --> 00:55:51.280 there’s not an ultimate solution to it. It’s just one of the kind of holes 00:55:51.280 --> 00:55:54.869 that you have. The other thing that we may see is that a lot of companies that 00:55:54.869 --> 00:55:59.630 are holding this data are questioning an order they get much more. 00:55:59.630 --> 00:56:03.540 Because if they get legal problems from an order they got from a German Court 00:56:03.540 --> 00:56:08.210 or whatever it is, they probably are now more interested in it – and 00:56:08.210 --> 00:56:10.530 actually looking at it. Because right now it’s cheaper for them 00:56:10.530 --> 00:56:14.390 to just forward the data. You don’t need a whole Legal Team reviewing it all. 00:56:14.390 --> 00:56:17.760 So I think to kind of split the private companies that are helping them 00:56:17.760 --> 00:56:21.880 from the Government and kind of get some issue between them probably helps there, 00:56:21.880 --> 00:56:25.010 as well. But of course it’s just like little peanuts you put in there. 00:56:25.010 --> 00:56:28.930 But in the end you still have that issue. Yeah. 00:56:28.930 --> 00:56:34.660 On the “Stiftung Datenschutz” or whatever: I think that’s kind of a thing I just 00:56:34.660 --> 00:56:37.710 wanted to blow out to people here, because I’m mainly in the legal sphere and in, 00:56:37.710 --> 00:56:43.200 like, the activist/consumer side. And I think the big problem we have 00:56:43.200 --> 00:56:48.249 on the legal and consumer side is that we don’t understand the devices that much.
00:56:48.249 --> 00:56:51.050 And we lack the evidence. We don’t really have the evidence of 00:56:51.050 --> 00:56:54.050 what’s actually going on on devices and you need to have this evidence 00:56:54.050 --> 00:56:56.869 if you go in front of Courts. I think a lot of the people in the room probably 00:56:56.869 --> 00:57:00.890 have this evidence somewhere on the computer. So the idea of really getting 00:57:00.890 --> 00:57:03.670 this connection at some point – it’s not something I can pitch to you right away, 00:57:03.670 --> 00:57:07.220 because it’s not… like I don’t wanna start it tomorrow. But it’s something 00:57:07.220 --> 00:57:10.560 I wanted to circulate to get feedback as well, what you guys think of it. 00:57:10.560 --> 00:57:15.559 So if there’s any feedback on it, send me an email, or Twitter. Or whatever it is. 00:57:15.559 --> 00:57:21.520 applause 00:57:21.520 --> 00:57:23.789 Herald: So we do have a bit of time left, 00:57:23.789 --> 00:57:26.640 microphone No.2 with the next question, please. 00:57:26.640 --> 00:57:31.390 Question: What can I do as an individual person now? Can I sue Google 00:57:31.390 --> 00:57:35.770 or can I sue other companies just to stop this? 00:57:35.770 --> 00:57:40.160 And would it create some pressure if I do that? 00:57:40.160 --> 00:57:43.040 So what can the ordinary citizen do now? 00:57:43.040 --> 00:57:47.160 Max: We’re right now… I already prepared it but I didn’t have time to send it out 00:57:47.160 --> 00:57:50.070 to have complaints against the Googles and all the others that are on the PRISM list. 00:57:50.070 --> 00:57:53.180 We started with Facebook because I kind of know them the best. And, you know, so 00:57:53.180 --> 00:57:57.430 it was a good start. And the idea was really to have other people 00:57:57.430 --> 00:58:00.790 probably copy-pasting this. The complaint against Facebook we actually filed 00:58:00.790 --> 00:58:05.109 with the Hamburg DPC, as well, and the Belgian DPC. The idea behind it was 00:58:05.109 --> 00:58:08.670 that the Irish now suddenly have 2 other DPCs that are more interested in 00:58:08.670 --> 00:58:12.940 enforcing the law in their boat. So they’re not the only captains anymore. 00:58:12.940 --> 00:58:17.330 And it’s interesting what’s gonna happen here. If there are other people 00:58:17.330 --> 00:58:21.660 that have other cases: just file a complaint with your Data Protection 00:58:21.660 --> 00:58:25.200 authority. A lot of them, especially the German Data Protection authorities 00:58:25.200 --> 00:58:28.940 – most of them – are really interested in doing something about it, but 00:58:28.940 --> 00:58:32.260 they oftentimes just need a case. They need someone to complain about it and 00:58:32.260 --> 00:58:37.070 someone giving them the evidence and someone arguing it, to get things started. 00:58:37.070 --> 00:58:41.760 So if anyone is using Google Drive or something like that – let’s go. 00:58:41.760 --> 00:58:44.750 And basically the wording is on our web page. You just have to download it 00:58:44.750 --> 00:58:50.300 and reword it. And we’re gonna probably publish on the website the complaints 00:58:50.300 --> 00:58:53.080 against the other companies, as soon as they’re out. Probably in the next 2–3 weeks. 00:58:53.080 --> 00:59:00.270 Something like this. So just copy-paste and spread the love. 00:59:00.270 --> 00:59:04.260 Herald: OK, thank you very much, Max, again! 00:59:04.260 --> 00:59:07.530 For your great talk. This is it!
00:59:07.530 --> 00:59:10.430 postroll music 00:59:10.430 --> 00:59:17.919 Subtitles created by c3subtitles.de in 2016. Join and help us do more!