
Max Schrems: Safe Harbor

  • 0:00 - 0:10
    32C3 preroll music
  • 0:10 - 0:13
    Herald: Our next talk is
    called “Safe Harbor”.
  • 0:13 - 0:18
    Background is: back in October, in
    the light of the Snowden revelations
  • 0:18 - 0:23
    the Court of Justice of the European
    Union – that’s the “EuGH” in German
  • 0:23 - 0:29
    declared the Safe Harbor agreement
    between the EU and the US invalid.
  • 0:29 - 0:32
    This talk is about how we got there
    as well as further implications
  • 0:32 - 0:37
    of that decision. Please believe me when
    I say our speaker is ideally suited
  • 0:37 - 0:43
    to talk about that topic. Please give it
    up for the man actually suing Facebook
  • 0:43 - 0:46
    over Data Protection concerns:
    Max Schrems!
  • 0:46 - 0:51
    applause and cheers
  • 0:51 - 0:53
    Max Schrems: Hello! Hey!
    applause and cheers
  • 0:53 - 1:02
    applause
  • 1:02 - 1:05
    It’s cheerful like some Facebook Annual
    conference where the newest things
  • 1:05 - 1:10
    are kind of presented. I’m doing a little
    intro, basically of how I got there.
  • 1:10 - 1:13
    This was my nice little university in
    California. And I was studying there
  • 1:13 - 1:15
    for half a year and there were a
    couple of people from Facebook
  • 1:15 - 1:18
    and other big companies and
    they were talking about
  • 1:18 - 1:21
    European Data Protection law. And
    the basic thing they said – it was
  • 1:21 - 1:25
    not an original quote but basically what
    they said is: “Fuck the Europeans,
  • 1:25 - 1:28
    you can fuck their law as much as you
    want and nothing is going to happen.”
  • 1:28 - 1:32
    And that was kind of the start of the
    whole story because I thought: “Okay,
  • 1:32 - 1:36
    let’s just make a couple of
    complaints and see where it goes.”
  • 1:36 - 1:41
    I originally got 1,300 pages of Facebook data
    back then, because you can exercise
  • 1:41 - 1:45
    your right to access. And Facebook
    actually sent me a CD with a PDF file
  • 1:45 - 1:49
    on it with all my Facebook data.
    It was by far not everything
  • 1:49 - 1:52
    but it was the first time that someone
    really got the data and I was asking
  • 1:52 - 1:55
    someone from Facebook why they were so
    stupid to send me all this information.
  • 1:55 - 1:59
    Because a lot of it was obviously illegal.
    And the answer was “We had internal
  • 1:59 - 2:04
    communications problems.” So someone was
    just stupid enough to burn it on a CD and
  • 2:04 - 2:09
    send it on. One of the CDs actually was
    first going to Sydney in Australia because
  • 2:09 - 2:13
    they put “Australia” instead of “Austria”
    on the label which was one of the things
  • 2:13 - 2:15
    as well.
    applause
  • 2:15 - 2:20
    Anyway, this was basically how
    my interest in Facebook started;
  • 2:20 - 2:23
    and the media got crazy about it because
    there is like a little guy that does
  • 2:23 - 2:27
    something against the big guy. And this
    is basically how the whole thing got
  • 2:27 - 2:31
    this big. This is like a cartoon from my
    Salzburg newspaper. This should be me,
  • 2:31 - 2:35
    and it’s like basically the reason why
    the story got that big because it’s
  • 2:35 - 2:38
    a small guy doing something against
    Facebook, not necessarily because
  • 2:38 - 2:42
    what I was doing was so especially smart.
    But the story was just good for the media,
  • 2:42 - 2:46
    ’cause data protection is generally a very
    dry topic that they can’t report about
  • 2:46 - 2:50
    and there they had like
    the guy that did something.
  • 2:50 - 2:54
    A couple of introductions. We actually
    had 3 procedures. So if you heard about
  • 2:54 - 2:58
    what I was doing… There was originally
    a procedure at the Irish Data Protection
  • 2:58 - 3:02
    Commission, on Facebook itself – so what
    Facebook itself does with the data.
  • 3:02 - 3:07
    This procedure has ended after 3 years.
    There’s a “Class Action” in Vienna
  • 3:07 - 3:10
    right now that’s still ongoing. It’s in
    front of the Supreme Court in Austria
  • 3:10 - 3:15
    right now. And there is the procedure
    that I’m talking about today which is
  • 3:15 - 3:20
    the procedure on Safe Harbor at the
    Irish Data Protection Commission.
  • 3:20 - 3:23
    A couple of other bits of background
    information: I personally don’t think
  • 3:23 - 3:25
    Facebook is the issue. Facebook
    is just a nice example for
  • 3:25 - 3:30
    an overall bigger issue. So I was never
    personally concerned with Facebook but
  • 3:30 - 3:33
    for me the question is how we enforce
    Data Protection and that kind of stuff.
  • 3:33 - 3:36
    applause
    So it’s not a Facebook talk; Facebook is
  • 3:36 - 3:39
    applause
    the example. And of course the whole thing
  • 3:39 - 3:43
    is just one puzzle piece. A lot of people
    are saying: “This was one win but there
  • 3:43 - 3:46
    are so many other issues!” – Yes, you’re
    totally right! This was just one issue.
  • 3:46 - 3:49
    But you got to start somewhere.
    And the whole thing is also
  • 3:49 - 3:53
    not an ultimate solution. So I can
    not present you with the final solution
  • 3:53 - 3:58
    for everything, but probably a couple
    of possibilities to do something.
  • 3:58 - 4:01
    If you’re interested in the documents
    – we pretty much publish everything
  • 4:01 - 4:05
    on the web page. It’s a very old style web
    page. But you can download the PDF files
  • 4:05 - 4:10
    and everything if you’re interested
    in the facts and (?) the details.
  • 4:10 - 4:16
    Talking about facts, the whole thing
    started with the Snowden case,
  • 4:16 - 4:19
    where we kind of for the first time had
    documents proving who is actually
  • 4:19 - 4:24
    forwarding data to the NSA in this case.
    And this is the interesting part, because
  • 4:24 - 4:27
    we have a lot of rumours but if you’re in
    a Court room you actually have to prove
  • 4:27 - 4:30
    everything and you cannot just suspect
    that very likely they’re doing it. But you
  • 4:30 - 4:35
    need actual proof. And thanks to Snowden
    we had at least a bunch of information
  • 4:35 - 4:41
    that we could use. These are the slides,
    you all know them. The first very
  • 4:41 - 4:47
    interesting thing was the FISA Act and we
    mainly argued under 1881a as an example
  • 4:47 - 4:52
    for the overall surveillance in the US. So
    we took this law as an example but it was
  • 4:52 - 4:56
    not the only thing we relied on. And I
    think it’s interesting for Europeans to
  • 4:56 - 5:01
    understand how the law actually works.
    The law actually goes after data and not
  • 5:01 - 5:09
    after people. We typically have laws in
    criminal procedures that go after people.
  • 5:09 - 5:13
    This law goes after data. So it totally
    falls outside of our normal thinking of
  • 5:13 - 5:15
    “we’re going after a suspect,
    someone that
  • 5:15 - 5:18
    may have committed a crime”. Basically the
  • 5:18 - 5:21
    law says that there’s an electronic
    communications service provider that holds
  • 5:21 - 5:25
    foreign intelligence information. That’s
    much more than just terrorist prevention,
  • 5:25 - 5:30
    that’s also things that the US is
    generally interested in.
  • 5:30 - 5:34
    And this is the level that’s publicly
    known and everything else is basically
  • 5:34 - 5:38
    classified. So under the law the FISA
    Court can do certification for one year
  • 5:38 - 5:44
    that basically says “the NSA can access
    data”. In these certifications there are
  • 5:44 - 5:46
    these minimization and targeting
    procedures that they have to describe.
  • 5:46 - 5:48
    But they’re not public.
    We don’t know what
  • 5:48 - 5:51
    they look like. And basically they’re here
  • 5:51 - 5:56
    to separate data from US people out of
    the data set. So it doesn’t really help
  • 5:56 - 6:01
    a European. And then there is a so called
    Directive that goes to the individual
  • 6:01 - 6:04
    service provider which basically says:
    “Give us the data in some technical
  • 6:04 - 6:08
    format.” So very likely it’s some kind
    of API or some kind of possibility
  • 6:08 - 6:13
    that they can retrieve the data. That’s
    what the law says. We don’t know
  • 6:13 - 6:17
    how it actually looks and we don't
    have perfect proof of it. So there are
  • 6:17 - 6:21
    a lot of things that are disputed and
    still disputed by the US government.
  • 6:21 - 6:25
    So the exact technical implementations,
    the amount of data that’s actually pulled,
  • 6:25 - 6:29
    all the review mechanisms they have
    internally. That’s all stuff that was
  • 6:29 - 6:34
    not 100% sure, and not sure enough
    to present it to a Court. Which was
  • 6:34 - 6:39
    the basic problem we had. First of
    all after the Snowden thing broke
  • 6:39 - 6:44
    we had different reactions. And that was
    kind of how I started the procedure.
  • 6:44 - 6:48
    The first reaction was demonstrations.
    We were all walking in the streets.
  • 6:48 - 6:51
    Which is good and which is important,
    but we all know that this is something
  • 6:51 - 6:56
    we have to do but not something that’s
    gonna change the world. Second thing:
  • 6:56 - 6:59
    we had parliaments like the European
    Parliament doing resolutions saying
  • 6:59 - 7:03
    that we should strike down the Safe Harbor
    and this is all bad and evil. We had
  • 7:03 - 7:07
    the Commission pretty much saying the
    same thing. We had national politicians
  • 7:07 - 7:12
    saying the same thing. And we all knew
    that basically this means that they all
  • 7:12 - 7:17
    send an angry letter to the US. Then they
    can walk in front of the media and say:
  • 7:17 - 7:20
    “Yes, we’ve done something, we sent
    an angry letter to the US”, and in the US
  • 7:20 - 7:25
    it’s just thrown basically into some trash bin
    for crazy Europeans wanting strange things
  • 7:25 - 7:32
    and that was it. So I was actually called
    by a journalist and asked if there’s
  • 7:32 - 7:37
    some other option. And I was then
    starting to think about it and there’s
  • 7:37 - 7:42
    the so called Safe Harbor agreement. To
    explain the “Safe Harbor”: In Europe
  • 7:42 - 7:46
    we have Data Protection law that is on
    paper but factually not enforced.
  • 7:46 - 7:51
    But at least, in theory, we have it. And
    we have a couple of other countries
  • 7:51 - 7:55
    that have the same level of protection
    or similar laws. And generally
  • 7:55 - 7:58
    Data Protection only works if you keep
    the data within the protected sphere so
  • 7:58 - 8:01
    you’re not allowed to send personal
    data to a third country that
  • 8:01 - 8:05
    doesn’t have adequate protection. There
    are a couple of other countries that do;
  • 8:05 - 8:10
    and therefore you can transfer data e.g.
    to Switzerland. This is what the law says.
  • 8:10 - 8:13
    And there are certain servers that
    are outside these countries where
  • 8:13 - 8:17
    we can have contractual relationships. So
    basically if you have a server in India,
  • 8:17 - 8:20
    you have a contract with your
    Indian hosting provider saying:
  • 8:20 - 8:24
    “You apply proper Data Protection to it”.
    So you can transfer data there, too.
  • 8:24 - 8:28
    All of this is approved by the European
    Commission. This is how data
  • 8:28 - 8:33
    flows legally outside of the EU – personal
    data. This all doesn’t apply
  • 8:33 - 8:39
    to any other kind of data, only personal
    data. And we had a basic problem
  • 8:39 - 8:42
    with the US because there was this
    Directive saying you can forward data
  • 8:42 - 8:47
    to other countries but there is no Data
    Protection Law in the US. So basically
  • 8:47 - 8:49
    you wouldn’t be allowed to send
    data there unless you have
  • 8:49 - 8:53
    some contractual relationship which
    is always kind of complicated.
  • 8:53 - 8:58
    So the solution was to have a self
    certification to EU principles
  • 8:58 - 9:02
    and this was put into an Executive
    Decision by the European Commission.
  • 9:02 - 9:07
    So basically how Safe Harbor is working
    is that e.g. Google can walk up and say:
  • 9:07 - 9:13
    “Hereby I pledge that I follow European
    Data Protection Law. I solemnly swear!”.
  • 9:13 - 9:15
    And then they do whatever they
    want to do. And basically
  • 9:15 - 9:18
    that’s the Safe Harbor system and the
    Europeans can walk around saying:
  • 9:18 - 9:22
    “Yeah, there is some seal saying
    that everything is fine, so don’t worry.”
  • 9:22 - 9:25
    Everybody knew that this is a fucked-up
    system but for years and years
  • 9:25 - 9:29
    everyone was looking away because politics
    is there and economics is there and
  • 9:29 - 9:34
    they just needed it. So basically Safe
    Harbor works in such a way that a US company
  • 9:34 - 9:38
    can follow the Safe Harbor principles
    and say: “We follow them”, then
  • 9:38 - 9:42
    the Federal Trade Commission and private
    arbitrators are overseeing them
  • 9:42 - 9:47
    – in theory, in practice they never do –
    and this whole thing was packaged
  • 9:47 - 9:50
    into a decision by the European
    Commission. And this is the so called
  • 9:50 - 9:56
    Safe Harbor system. So from a European
    legal point of view it’s not an agreement
  • 9:56 - 10:01
    with the US, it’s a system that the US has
    set up that we approved as adequate. So
  • 10:01 - 10:04
    there’s no binding thing between the US
    and Europe, we can kind of trash it
  • 10:04 - 10:11
    any time. They’ve just never done that.
    Which brings me to the legal argument.
  • 10:11 - 10:15
    Basically if I’m this little Smiley down
    there, I’m sitting in Austria and
  • 10:15 - 10:21
    transfer my data to Facebook Ireland,
    because worldwide – 82% of all users
  • 10:21 - 10:24
    have a contract with Facebook Ireland.
    Anyone that lives outside the US
  • 10:24 - 10:29
    and Canada. So anyone from China,
    South America, Africa has a contract
  • 10:29 - 10:34
    with Facebook in Ireland. And legally they
    forward the data to Facebook in the US;
  • 10:34 - 10:37
    technically the data is directly
    forwarded. So the data is actually flowing
  • 10:37 - 10:42
    right to the servers in the US. However
    legally it goes through Ireland. And
  • 10:42 - 10:46
    my contract partner is an Irish company.
    And under the law they can only transfer
  • 10:46 - 10:50
    data to the US if there is adequate
    protection. At the same time we know
  • 10:50 - 10:54
    that the PRISM system is hooked up in
    the end. So I was basically walking up
  • 10:54 - 10:57
    to the Court and saying: “Mass
    Surveillance is very likely not
  • 10:57 - 11:02
    adequate protection, eh?” And
    that was basically the argument.
  • 11:02 - 11:09
    applause
  • 11:09 - 11:12
    The interesting thing in this situation
    was actually the strategic approach.
  • 11:12 - 11:18
    So, we have the NSA and other surveillance
    organizations that use private companies.
  • 11:18 - 11:22
    So we have kind of a public-private
    surveillance partnership. It’s PPP in
  • 11:22 - 11:29
    a kind of surveillance way. Facebook is
    subject to US law, so under US law they
  • 11:29 - 11:32
    have to forward all the data. At the same
    time Facebook Ireland is subject to
  • 11:32 - 11:35
    European law so they’re not
    allowed to forward all this data.
  • 11:35 - 11:42
    Which is interesting because
    they’re split. The EU law regulates
  • 11:42 - 11:46
    how these third country transfers work.
    And all of this has to be interpreted
  • 11:46 - 11:50
    under Fundamental Rights. So this was
    basically the system we’re looking at.
  • 11:50 - 11:53
    And the really crucial thing is that we
    have this public-private surveillance.
  • 11:53 - 11:57
    Because we do have jurisdiction over
    private companies. We don’t have
  • 11:57 - 12:01
    jurisdiction over the NSA. We can
    send angry letters to the NSA. But
  • 12:01 - 12:04
    we do have jurisdiction over Facebook,
    Google etc. because they’re basically
  • 12:04 - 12:09
    based here. Mainly for tax reasons.
    And this was the interesting thing that
  • 12:09 - 12:12
    in contrast to the national surveillance
    where we can pretty much just send
  • 12:12 - 12:15
    the angry letters we can do something
    about the private companies. And
  • 12:15 - 12:19
    without the private companies there is
    almost no mass surveillance at this scale
  • 12:19 - 12:23
    because the NSA is not in our phones,
    it’s the Googles and Apples and whatever.
  • 12:23 - 12:29
    And without them you’re not really
    able to get this mass surveillance.
  • 12:29 - 12:33
    This is like the legal chart. Basically
    what we argued is: there are Articles 7 and 8
  • 12:33 - 12:35
    of the Charter of Fundamental Rights.
    That’s your right to Privacy and
  • 12:35 - 12:39
    your right to Data Protection. There
    is an article in the Directive that
  • 12:39 - 12:44
    has to be interpreted in light of it. Then
    there’s the Executive Decision of the EU.
  • 12:44 - 12:49
    This is basically the Safe Harbor
    decision which refers to Paragraph 4
  • 12:49 - 12:53
    of the Safe Harbor principles. And the
    Safe Harbor principles basically say
  • 12:53 - 12:57
    that the FISA Act is okay. So
    you have kind of this circle
  • 12:57 - 13:02
    of different legal layers which is getting
    really crazy. I’ll try to break it down
  • 13:02 - 13:05
    a little bit. Articles 7 and 8 of the
    Charter we basically compared
  • 13:05 - 13:09
    to Data Retention, so the
    “Vorratsdatenspeicherung”.
  • 13:09 - 13:14
    We basically said PRISM is much worse. If
    “Vorratsdatenspeicherung” (Data Retention)
  • 13:14 - 13:18
    was invalid then PRISM has to be 10 times
    as bad. That was basically the argument.
  • 13:18 - 13:22
    Very simple. We just compared: the
    one was content data – the other one
  • 13:22 - 13:26
    was meta data. The one is storage
    – the other one is making available.
  • 13:26 - 13:30
    And the one is endless – the other
    one is 24 months. So basically
  • 13:30 - 13:34
    in all these categories PRISM was much
    worse. And if the one has to be illegal
  • 13:34 - 13:38
    the other one has to be as well. And
    what’s interesting – and that’s something
  • 13:38 - 13:43
    that the US side is typically not getting
    – is that Article 8 is already covering
  • 13:43 - 13:47
    “making available of data”. So the
    fun thing is I only had to prove
  • 13:47 - 13:51
    that Facebook makes data available,
    so basically it’s possible
  • 13:51 - 13:55
    the NSA is pulling it. I didn’t even have
    to prove that the NSA is factually pulling
  • 13:55 - 14:00
    my personal data. And this was like the
    relevant point because under US law
  • 14:00 - 14:04
    basically your Fundamental Rights only
    kick in when they factually look at your
  • 14:04 - 14:08
    data and actually surveil you. So I was
    only: “They’re making it available
  • 14:08 - 14:12
    – that’s good enough for me!” which
    made all this factual evidence
  • 14:12 - 14:17
    much easier. So basically I only had
    to say: “Look at the XKeyscore slides
  • 14:17 - 14:22
    where they say ‘user name Facebook’
    so they can somehow get the data out of it.
  • 14:22 - 14:27
    It’s at least made available; that’s
    all I need to prove”. And this is
  • 14:27 - 14:30
    the big difference between the US
    – it’s very simplified, but basically
  • 14:30 - 14:35
    between the US approach and the European
    approach – is that in the US you have to
  • 14:35 - 14:39
    prove that your data is actually pulled.
    I only had to prove that my data is made
  • 14:39 - 14:44
    available. So I had to… I was able to
    get out of all the factual questions.
  • 14:44 - 14:48
    This is a comparison – you basically…
    in the US we have very strict laws
  • 14:48 - 14:53
    for certain types of surveillance while in
    Europe we have a more flexible system
  • 14:53 - 14:57
    that covers much more. So it’s a
    different approach that we just have
  • 14:57 - 15:00
    in the two legal spheres. We’re both
    talking about your Fundamental
  • 15:00 - 15:04
    Right to Privacy, but in details it’s
    very different. And that’s kind of
  • 15:04 - 15:08
    the differences that we used. The fun
    thing is if you’re European you don’t have
  • 15:08 - 15:12
    any rights in the US anyways because
    the Bill Of Rights only applies to people
  • 15:12 - 15:16
    that live in the US and US citizens so
    you’re out of luck anyways. So you’re
  • 15:16 - 15:20
    only left with the European things.
    Basically the law which is
  • 15:20 - 15:24
    the second level after the Fundamental
    Rights is saying that there has to be
  • 15:24 - 15:28
    an adequate level of protection as I said
    and this third country has to ensure it
  • 15:28 - 15:33
    by domestic law or international
    commitments. And I was saying: “You know
  • 15:33 - 15:36
    there’s the FISA Act, you can read
    it, it definitely doesn’t ensure
  • 15:36 - 15:39
    your fundamental rights and an
    adequate protection. So we're
  • 15:39 - 15:45
    kind of out of Article 25”. And there is
    paragraph 4 of the Safe Harbor principles
  • 15:45 - 15:51
    which says that all these wonderful privacy
    principles that US companies sign up to
  • 15:51 - 15:57
    do not apply whenever a national law
    in the US is overruling it. So there are
  • 15:57 - 16:02
    principles that companies say: “We
    follow!” but if there is a city in Texas
  • 16:02 - 16:06
    saying: “We have a local ordinance
    saying: ‘You have to do it differently!’”
  • 16:06 - 16:09
    all these Safe Harbor principles
    don’t apply anymore. And this is
  • 16:09 - 16:13
    the fundamental flaw of the self
    certification system that it only works
  • 16:13 - 16:17
    if there is no law around that conflicts
    with it. And as there are tons of laws
  • 16:17 - 16:23
    that conflict with it you’re hardly
    able to hold up a system like that.
  • 16:23 - 16:26
    So basically if you go through all these
    different legal layers you end up with
  • 16:26 - 16:30
    a conflict between the US FISA Act
    and the European Fundamental Rights.
  • 16:30 - 16:33
    So you’re going through different layers
    of the system but you’re basically making
  • 16:33 - 16:41
    a circle. This is what we did which was
    a little bit complicated but worked.
  • 16:41 - 16:54
    applause
  • 16:54 - 16:57
    Basically now to the procedure,
    so how the whole thing happened.
  • 16:57 - 17:02
    First I went through the Safe Harbor. Safe
    Harbor allows you to go to TRUSTe or
  • 17:02 - 17:07
    the Federal Trade Commission and there’s
    an online form to make your complaint. And
  • 17:07 - 17:10
    I was making a complaint and I think you
    were only allowed to put in 60 characters
  • 17:10 - 17:14
    to explain what your complaint is. Which
    is a little bit complicated if you’re
  • 17:14 - 17:18
    trying to explain NSA mass surveillance.
    So I only wrote: “Stop Facebook, Inc.’s
  • 17:18 - 17:20
    involvement in PRISM!”. That
    was everything I could actually
  • 17:20 - 17:24
    put in the text box; that was
    the absolute maximum.
  • 17:24 - 17:28
    And the answer I got back was: “TRUSTe
    does not have the authority to address
  • 17:28 - 17:31
    the matter you raise.” Which is obvious,
    it’s a private arbitration company
  • 17:31 - 17:35
    that can hardly tell Facebook to not
    follow the NSA’s guidelines. So
  • 17:35 - 17:39
    this was the arbitration mechanism under
    Safe Harbor. You can also go to the
  • 17:39 - 17:43
    Federal Trade Commission and have your
    complaint filed there. But they basically
  • 17:43 - 17:50
    just ignore them. This was the letter I
    got back, that they received it. But
  • 17:50 - 17:54
    I was talking to the people at the FTC and
    they say: “Yeah, we get these complaints
  • 17:54 - 17:59
    but they’re ending up in a huge storage
    system where they stay forever after”.
  • 17:59 - 18:03
    So this was enforcement done by
    Safe Harbor. And we knew that
  • 18:03 - 18:06
    in the private field already; but in this
    case it was especially interesting.
  • 18:06 - 18:09
    To be fair, both of these institutions
    have no power to do anything
  • 18:09 - 18:11
    about mass surveillance. So
    there was really a reason why
  • 18:11 - 18:14
    they didn’t do anything.
    The next step you have is
  • 18:14 - 18:17
    the national Data Protection Commissioner.
    So we have 28 countries
  • 18:17 - 18:22
    with 28 [Commissioners]; plus Germany has
    – I think – a Data Protection Commissioner
  • 18:22 - 18:27
    in every province. And you end up at
    this. And this is my favourite slide.
  • 18:27 - 18:30
    This is the Irish Data
    Protection Commissioner.
  • 18:30 - 18:34
    applause
  • 18:34 - 18:36
    To be super precise
    – I don’t know if you
  • 18:36 - 18:38
    can see the laser pointer. But this is a
  • 18:38 - 18:42
    supermarket. And this is the Irish Data
    Protection Commissioner back there.
  • 18:42 - 18:45
    laughter, applause
  • 18:45 - 18:48
    To be a little more fair, actually they’re
    up here and they’re like 20 people
  • 18:48 - 18:52
    when we filed it originally. The fun thing
    is back at the time they didn’t have
  • 18:52 - 18:55
    a single lawyer and not a single
    technician. So they were 20
  • 18:55 - 18:58
    public employees that were dealing
    with Data Protection and no one
  • 18:58 - 19:03
    had any clue of the technical
    or the legal things about it.
  • 19:03 - 19:06
    The fun thing is: this is Billy Hawkes,
    the Data Protection Commissioner
  • 19:06 - 19:10
    at the time. He went on the
    national radio in the morning.
  • 19:10 - 19:13
    And in Ireland radio is a really big
    thing. So it was a morning show.
  • 19:13 - 19:17
    And he was asked about these complaints.
    And he actually said on the radio:
  • 19:17 - 19:21
    “I don’t think it will come
    as much of a surprise
  • 19:21 - 19:25
    that the US services have access
    to all the US companies”.
  • 19:25 - 19:29
    And this was the craziest thing!
    I was sitting in front of the radio
  • 19:29 - 19:35
    and was like: “Strike! He just
    acknowledged that all this is true!”.
  • 19:35 - 19:38
    And the second thing, he said: “This US
    surveillance operation is not an issue
  • 19:38 - 19:42
    of Data Protection”. Interesting.
  • 19:42 - 19:48
    It’s actually online and you can listen
    to it. But the fun thing was really that
  • 19:48 - 19:52
    the factual level is so hard to prove that
    I was afraid that they would dispute:
  • 19:52 - 19:55
    “Hah, who knows if all this is true?
    We don’t have any evidence!
  • 19:55 - 19:59
    The companies say we are
    not engaging in all of this.”
  • 19:59 - 20:03
    So having the Data Protection Commissioner
    saying: “Sure they surveil you!
  • 20:03 - 20:06
    Are you surprised?” was great
    because we were kind of out of
  • 20:06 - 20:10
    the whole factual debate.
  • 20:10 - 20:13
    I actually got a letter back from them
    saying that they’re not investigating
  • 20:13 - 20:18
    any of it. And I was asking them why. And
    they were naming 2 sections of the law,
  • 20:18 - 20:22
    or a combination thereof. So there was one
    thing where it says they shall investigate
  • 20:22 - 20:25
    – which means they have to – or
    they may investigate. And they say
  • 20:25 - 20:28
    they only “may” investigate complaints
    and they just don’t feel like
  • 20:28 - 20:33
    investigating PRISM and Facebook
    and all of this. Secondly they say
  • 20:33 - 20:37
    that a complaint could be “frivolous
    and vexatious” – I love the word!
  • 20:37 - 20:41
    And therefore they’re not investigating
    it. “A combination thereof or indeed
  • 20:41 - 20:48
    any other relevant matter.” So we
    turned this letter into a picture
  • 20:48 - 20:52
    which is basically what they said: “So
    why did you not investigate PRISM?”
  • 20:52 - 20:54
    – “‘Shall’ means ‘may’, frivolous
    or
  • 20:54 - 20:56
    vexatious, a combination of A and B
  • 20:56 - 20:59
    or any other reason.”
    So this was the answer
  • 20:59 - 21:01
    by the Irish Data Protection Commissioner
    why they wouldn’t want to investigate
  • 21:01 - 21:06
    the complaint. Just to give
    you background information:
  • 21:06 - 21:08
    these are the complaints that the Irish
    Data Protection Commissioner is receiving
  • 21:08 - 21:11
    – the blue line – and the red line is
    all
  • 21:11 - 21:13
    of the complaints they’re not deciding.
  • 21:13 - 21:17
    Which is 96–98% of the complaints
  • 21:17 - 21:21
    they receive on an average year.
    Which is interesting because you have
  • 21:21 - 21:24
    a right to get a decision but they don’t decide.
  • 21:24 - 21:27
    To give you the bigger picture: we
    also made complaints on Apple
  • 21:27 - 21:31
    and all the other PRISM companies.
    And Ireland basically said
  • 21:31 - 21:35
    what I just told you. Luxembourg, where
    Skype and Microsoft are situated, said
  • 21:35 - 21:39
    that they do not have enough evidence for
    the participation of Microsoft and Skype
  • 21:39 - 21:43
    [in PRISM]. And the funniest thing
    about the answer was that they said
  • 21:43 - 21:46
    that they’re restricted by their
    investigations to the territory
  • 21:46 - 21:50
    of Luxembourg. And since all of this is
    happening in the US they have no way
  • 21:50 - 21:54
    of ever finding out what was going on.
    So I was telling them: “You know,
  • 21:54 - 21:58
    most of this is online and if you’re not
    able to download it I can print it out
  • 21:58 - 22:03
    for you and ship it to Luxembourg.” But
    the reason why we didn’t go further
  • 22:03 - 22:06
    in Luxembourg is that they went down
    this factual kind of argument. They said:
  • 22:06 - 22:10
    “It’s all illegal but factually we
    don’t believe it’s true”. And
  • 22:10 - 22:15
    then there was Germany, which is
    still investigating to this day.
  • 22:15 - 22:18
    This was Yahoo. Actually that was
    Yahoo in Munich but they now
  • 22:18 - 22:21
    moved to Ireland as well. So I don’t
    know what happened to this complaint.
  • 22:21 - 22:24
    We never heard back. But whenever we sent
    an email they were like: “Yeah, we’re
  • 22:24 - 22:30
    still investigating.” So what happened now
    is that I went to the Irish High Court.
  • 22:30 - 22:34
    To challenge the non-decision of the
    Irish Data Protection Commissioner.
  • 22:34 - 22:37
    This is the case that then went down as
    “Schrems vs. the Data Protection
  • 22:37 - 22:41
    Commissioner” which is so strange because
    I never wanted to have my name
  • 22:41 - 22:44
    on any of this and now the decision is
    actually named after my last name
  • 22:44 - 22:47
    which is always freaking me out in a way.
    Because you’re fighting for Privacy and
  • 22:47 - 22:51
    suddenly your name is all over the place.
    applause and laughter
  • 22:51 - 22:56
    applause
  • 22:56 - 23:00
    And this is the Irish High Court. So you…
  • 23:00 - 23:04
    It’s very complicated to
    get a procedure like that.
  • 23:04 - 23:08
    The biggest issue is that you need money.
    If you’re in front of an Irish Court
  • 23:08 - 23:10
    and you lose a case you
    end up with a legal bill of
  • 23:10 - 23:13
    a couple of hundred thousand
    Euros. Which is the reason why
  • 23:13 - 23:18
    nobody ever challenged the
    Irish Data Protection Commissioner.
  • 23:18 - 23:22
    Because you’re just gonna
    lose your house over it!
  • 23:22 - 23:26
    So what I did is: we did a little
    bit of crowd-funding! And
  • 23:26 - 23:29
    we actually got about 70,000 Euros out
    of it. This was a crowd-funding platform
  • 23:29 - 23:32
    that basically worked in a way
    that people could donate
  • 23:32 - 23:36
    and if we don’t need the money we either
    donate it to another Privacy cause
  • 23:36 - 23:40
    or we actually give people the money
    back. Which we now actually have to do
  • 23:40 - 23:43
    because we won the case. And all
    our costs are paid by the other side.
  • 23:43 - 23:50
    applause
  • 23:50 - 23:56
    So the fun thing is you then have to
    walk into this wonderful old Court here
  • 23:56 - 24:00
    on Mondays at 11:30. And
    there’s a room where you can
  • 24:00 - 24:04
    make your application. And about 100 other
    people making their application as well.
  • 24:04 - 24:07
    And there are no queue numbers. So there
    are 100 lawyers sitting in a room,
  • 24:07 - 24:11
    waiting for the judge to call out your
    case. So we were sitting there until 4 PM
  • 24:11 - 24:15
    or something until suddenly our case was
    called up. And we actually got kind of
  • 24:15 - 24:19
    the possibility to bring our case and then
    it’s postponed to another date and
  • 24:19 - 24:24
    blablablablabla. In the end you
    end up with something like this.
  • 24:24 - 24:28
    Which is all the paperwork
    because in Ireland the Courts
  • 24:28 - 24:33
    are not computerized so far. So you
    have to bring all the paperwork,
  • 24:33 - 24:38
    anything you rely on, in 3 copies.
    And it’s all paper, with numbered pages,
  • 24:38 - 24:43
    so all these copies have pages 1 to 1,000.
    Someone’s writing all of these numbers on the pages.
  • 24:43 - 24:46
    And then they copy it 3 times and it’s
    then in this wonderful little thing.
  • 24:46 - 24:50
    I thought it’s great. And
    what happened is that
  • 24:50 - 24:54
    we walked into the judge’s room and you
    get a judge assigned on the same day.
  • 24:54 - 24:58
    So you end up in front of a judge
    that has never heard about Privacy,
  • 24:58 - 25:01
    never heard about Facebook and
    never heard about Snowden and PRISM
  • 25:01 - 25:04
    and any of this. So you walk into the
    room as like “We would like to debate
  • 25:04 - 25:08
    the Safe Harbor with you” and he was like
    “What the fuck is the Safe Harbor?”.
  • 25:08 - 25:12
    So what happened is that he told us to
    kind of explain what it is for 15 minutes.
  • 25:12 - 25:16
    And then he postponed the
    whole thing for 2 hours I think
  • 25:16 - 25:20
    and we walked over to a pub and had a
    beer. So that the judge could remotely
  • 25:20 - 25:25
    read what he’s about to look into.
  • 25:25 - 25:27
    And Ireland is very interesting because
    you need a Solicitor and a Counsel
  • 25:27 - 25:31
    and then the Counsel is actually talking
    to the Judge. So I actually had 2 filters.
  • 25:31 - 25:34
    If I’m the client down here I had to
    talk to my Solicitor. The Solicitor
  • 25:34 - 25:39
    was telling the Counsel what to say to the
    Judge. So half of it was lost on the way.
  • 25:39 - 25:42
    And when I was asking if I could
    just address the Judge personally
  • 25:42 - 25:45
    they were like “No, no way that you could
    possibly address the Judge personally
  • 25:45 - 25:46
    even though you’re the claimant”.
    Which is
  • 25:46 - 25:48
    funny ’cause they talk about this “person”
  • 25:48 - 25:52
    in the room. It’s like “What’s the problem
    of this Mr. Schrems?”. And you’re like
  • 25:52 - 25:57
    sitting right here, it’s like
    “This would be me!”.
  • 25:57 - 26:01
    So what happened in Ireland is that we
    had about 10 reasons why under Irish law
  • 26:01 - 26:06
    the Irish Data Protection Commissioner
    would have to do its job but the Court
  • 26:06 - 26:10
    actually wiped all of this from the table
    and said actually the Safe Harbor
  • 26:10 - 26:13
    is the issue, which legally they’re
    not allowed to do, but which politically
  • 26:13 - 26:16
    was very wise and forwarded this
    wonderful easy-to-understand question
  • 26:16 - 26:20
    to the European Court of Justice.
  • 26:20 - 26:23
    The reason why they put this kind
    of very random question is that
  • 26:23 - 26:28
    if you challenge a law in Ireland you
    have to get some Advocate General engaged.
  • 26:28 - 26:30
    And they didn’t want to do that
    so they kind of “asked a question
  • 26:30 - 26:34
    around the actual question”
    to not really get them engaged.
  • 26:34 - 26:36
    Which was very complicated
    because we didn’t know how
  • 26:36 - 26:39
    the European Court of Justice would kind of
    react to this random question because
  • 26:39 - 26:42
    it was so broad that they could just walk
    any other direction and not address
  • 26:42 - 26:47
    the real issue. What was wonderful is
    that in the judgment by the Irish Court
  • 26:47 - 26:51
    they have actually said that
    all of this is factually true.
  • 26:51 - 26:54
    All the mass surveillance is factually
    true. And the fun thing to understand
  • 26:54 - 26:59
    is that the factual assessment is done by
    the national Courts. So the European
  • 26:59 - 27:02
    Court of Justice is not engaging in
    factual matters anymore. They only
  • 27:02 - 27:07
    ask legal questions: “Is this legal or
    not”. So we had a split of responsibility.
  • 27:07 - 27:11
    The Irish Court only said that all of this
    is true. And Luxembourg only said
  • 27:11 - 27:15
    that all of this would be legal if all of
    this would be true. Which was kind of
  • 27:15 - 27:20
    an interesting situation. But to be
    fair no one before the European
  • 27:20 - 27:23
    Court of Justice has ever questioned
    that this is true. So even the UK
  • 27:23 - 27:26
    that was in front of the Court and that
    should possibly know if all of this
  • 27:26 - 27:30
    is true or not, they have never
    questioned the facts. laughs
  • 27:30 - 27:35
    There is a pretty good factual basis.
    What was interesting as well is
  • 27:35 - 27:39
    that I said I’m not gonna go in front
    of the European Court of Justice.
  • 27:39 - 27:44
    Because the cost is so high that even the
    60,000 or 70,000 Euros I got in donations
  • 27:44 - 27:48
    wouldn’t cover it. And I knew the judge
    wants to get this hot potato off his table
  • 27:48 - 27:52
    and down to Luxembourg. So I was asking
    for a so called “protective cost order”
  • 27:52 - 27:55
    which kind of tells you beforehand that
    there is a maximum amount you have to pay
  • 27:55 - 27:58
    if you lose a case. And it
    was actually the first one
  • 27:58 - 28:02
    to ever get a protective cost
    order granted in Ireland.
  • 28:02 - 28:06
    Which was really cool and the Irish
    were like outraged about it, too.
  • 28:06 - 28:10
    applause
  • 28:10 - 28:13
    So we basically walked into the
    European Court of Justice which is
  • 28:13 - 28:17
    a really hefty procedure.
    In this room…
  • 28:17 - 28:21
    13 judges are in front of you. The
    European Court of Justice has assigned it
  • 28:21 - 28:24
    to the Great Chamber. So there is a
    Small, a Medium and a Great Chamber.
  • 28:24 - 28:28
    Which is the highest thing you can
    possibly end up in front of in Europe. And
  • 28:28 - 28:31
    it’s chaired by the President of the
    European Court of Justice. And this is
  • 28:31 - 28:36
    kind of where the really really basic,
    really important questions are dealt with.
  • 28:36 - 28:39
    So I was like: “Cool, I’m getting to the
    European Court of Justice!”. And it’s
  • 28:39 - 28:42
    funny because all the lawyers that were in
    the room, everyone was like “I can plead
  • 28:42 - 28:45
    in front of the European Court of
    Justice!”. They all took pictures like
  • 28:45 - 28:47
    they were in Disneyland or something.
    audience laughing
  • 28:47 - 28:53
    And it was – lawyers can be
    very… kind of… interesting. And
  • 28:53 - 28:57
    we ended up in front of these 3 major
    people. It was the President,
  • 28:57 - 29:01
    Thomas von Danwitz – who is the German
    judge and he also wrote the lead decision.
  • 29:01 - 29:04
    He’s the Judge Rapporteur, so within
    the 13 judges there’s one that is
  • 29:04 - 29:07
    the reporting judge and actually drafts
    the whole case. And he was also
  • 29:07 - 29:13
    doing the Data Retention case. And then there
    was Yves Bot as the Advocate General.
  • 29:13 - 29:15
    The hearing was interesting
    because we got questions
  • 29:15 - 29:18
    from the European Court of Justice before
    the hearing. And in these questions
  • 29:18 - 29:22
    they were actually digging down into
    the core issues of mass surveillance
  • 29:22 - 29:26
    in the US. When I got the questions
    I was like “We won the case!” because
  • 29:26 - 29:31
    there’s no way they can decide differently
    as soon as they address the question.
  • 29:31 - 29:35
    There were participants from all over
    Europe. These are the countries,
  • 29:35 - 29:38
    then there was the European Parliament,
    the European Data Protection Supervisor
  • 29:38 - 29:41
    and the European Commission.
    There was me – MS down there,
  • 29:41 - 29:45
    the Data Protection Commissioner
    and Digital Rights Ireland. And
  • 29:45 - 29:49
    what was interesting was the countries
    that were not there. Like Germany, e.g.
  • 29:49 - 29:53
    was not there in this major procedure.
    And as far as I’ve heard there were
  • 29:53 - 30:00
    reasons for not getting too engaged in
    the Transatlantic Partnership problem.
  • 30:00 - 30:05
    So this was kind of interesting because
    the UK walked up but Germany was like:
  • 30:05 - 30:08
    “No, we rather don’t want
    to say anything about this.”
  • 30:08 - 30:12
    What was interesting as well is that there
    were interventions by the US Government.
  • 30:12 - 30:16
    So I heard… we were… on a Tuesday we
    were actually in the Court. And on Monday
  • 30:16 - 30:20
    I got text messages from people from
    these different countries telling me that
  • 30:20 - 30:25
    the US just called them up. And
    I was like: “This is interesting”
  • 30:25 - 30:28
    because I know a lot of these people from
    conferences and stuff. So they were like
  • 30:28 - 30:31
    telling me: “The US just called me
    up and said they wanna talk to my
  • 30:31 - 30:35
    lead-lead-lead supervisor and tell me
    what to say tomorrow in the Court”.
  • 30:35 - 30:38
    It was like: “This is very interesting!”.
  • 30:38 - 30:44
    I was actually in the Court room and there
    was the justice person from the US embassy
  • 30:44 - 30:48
    to the European Union. And he was actually
    watching the procedure and watching
  • 30:48 - 30:50
    what everybody was arguing.
    And I had a feeling this is
  • 30:50 - 30:54
    like a watchdog situation. And someone
    pointed out that this is the guy,
  • 30:54 - 30:57
    so I knew who it is. And he was walking up
    to me and asked: “Are you the plaintiff?”
  • 30:57 - 31:03
    And I said: “Yeah, hey!” and he was
    trying to talk to me and I said:
  • 31:03 - 31:06
    “Did you manage calling everybody by now
    or do you still need a couple of numbers?”
  • 31:06 - 31:07
    audience laughing
  • 31:07 - 31:12
    And he was like: “(?) arrogant!”. He was
    like: “He didn’t just ask this question?”.
  • 31:12 - 31:16
    He said: “No, we kind of we’re in contact
    with all of our colleagues and of course
  • 31:16 - 31:19
    we have to kind of push for the interest
    of the US” and blablabla. I thought:
  • 31:19 - 31:23
    “This is very interesting!”. But
    anyway, it didn’t help them.
  • 31:23 - 31:28
    None of them was really kind
    of arguing for the US, actually.
  • 31:28 - 31:30
    The findings of the European Court of
    Justice, so what was in the judgment
  • 31:30 - 31:36
    in the end. First of all, Safe Harbor
    is invalid. Which was the big news.
  • 31:36 - 31:39
    And this was over night. We were expecting
    that they would have a grace period
  • 31:39 - 31:44
    so it’s invalid within 3 months or
    something like this. But in the minute
  • 31:44 - 31:48
    they were saying it there all your data
    transfers to the US were suddenly illegal.
  • 31:48 - 31:59
    applause
    Which was kind of big.
  • 31:59 - 32:02
    The second biggie was that they actually
    said that the essence of your rights
  • 32:02 - 32:05
    is violated. Now this, for an average
    person, doesn’t mean too much.
  • 32:05 - 32:10
    But for a lawyer it says: “Oh my
    god, the essence is touched!!”.
  • 32:10 - 32:14
    To explain to you what the essence is and
    why everybody is so excited about it is:
  • 32:14 - 32:18
    basically, on the scale of violations of
    your rights, the lowest level is no interference.
  • 32:18 - 32:20
    So if a policeman was walking
    down the street and watching you
  • 32:20 - 32:24
    there’s no interference with any of your
    rights. If they, say, tapped your phone
  • 32:24 - 32:27
    there is some kind of proportionality
    issue which is what we typically debate
  • 32:27 - 32:31
    before a Court. There is a system how
    you argue if something is proportionate
  • 32:31 - 32:35
    or not. So e.g. Data Retention
    was not proportionate.
  • 32:35 - 32:38
    And Data Retention would be somewhere
    here probably. points to slide
  • 32:38 - 32:42
    So not legal anymore but
    still in a proportionality test.
  • 32:42 - 32:44
    And then there is “the essence”
    which means whatever the fuck
  • 32:44 - 32:47
    you’re trying to do here is totally
    illegal because what you’re doing
  • 32:47 - 32:50
    is so much out of the scale
    of proportionality that
  • 32:50 - 32:54
    it will never be legal. And on Data
    Retention it actually said that
  • 32:54 - 32:57
    for the first time…
    applause
  • 32:57 - 33:02
    applause
  • 33:02 - 33:04
    …and this was actually the
    first time as far as I saw
  • 33:04 - 33:07
    that the European Court of Justice has
    ever said that under the Charter.
  • 33:07 - 33:11
    So the Charter is only
    in place since 2009, I think.
  • 33:11 - 33:14
    But it’s the first time they actually
    found that in a case which was
  • 33:14 - 33:18
    huge for law in general. There
    were a couple of findings
  • 33:18 - 33:22
    on Data Protection powers that
    are not too interesting for you.
  • 33:22 - 33:26
    What may be interesting is that
  • 33:26 - 33:30
    – there is a story to this picture
    that’s the reason I put it in –
  • 33:30 - 33:32
    basically they said that a
    third country has
  • 33:32 - 33:36
    to provide adequate protection, as
    I said before. So the story was
  • 33:36 - 33:39
    that third countries originally had
    to provide equivalent protection.
  • 33:39 - 33:42
    But there was lobbying going on,
    so the word “equivalent” was
  • 33:42 - 33:47
    changed to “adequate”. And
    “adequate” means basically nothing.
  • 33:47 - 33:51
    Because anything and nothing can be
    adequate. “Adequate” has no legal meaning.
  • 33:51 - 33:56
    I mean if you ask what an adequate
    dressing is – you don’t really know.
  • 33:56 - 34:00
    So they changed that actually back to the
    law… to the wording that was lobbied
  • 34:00 - 34:03
    out of the law and said it has to be
    “essentially equivalent” and that’s how
  • 34:03 - 34:06
    we now understand “adequate”. Which is
    cool because any third country now
  • 34:06 - 34:10
    has to provide more or less the same
    level of protection as Europe has.
  • 34:10 - 34:15
    There have to be effective detection
    and supervision mechanisms. And
  • 34:15 - 34:18
    there has to be legal redress. Just
    a really short thing on the picture:
  • 34:18 - 34:21
    I was actually just pointing at two
    people and they were taking a picture
  • 34:21 - 34:25
    from down there to make it a Victory
    sign. And that’s how the media
  • 34:25 - 34:28
    is then doing: “Whoo”.
    making short Victory gesture
  • 34:28 - 34:32
    I have to speed up a little bit.
    Not too much but a little bit.
  • 34:32 - 34:36
    The future, and I think that’s probably
    relevant for you guys as well…
  • 34:36 - 34:41
    First of all, what this whole judgment
    means. First, the US
  • 34:41 - 34:44
    basically lost its privileged
    status as being a country
  • 34:44 - 34:47
    that provides adequate [data] protection.
    Which is kind of the elephant in the room
  • 34:47 - 34:51
    that everyone knew anyway, that they’re
    not providing it. And now, officially,
  • 34:51 - 34:55
    they’re not providing it anymore. And the
    US is now like any third country.
  • 34:55 - 35:00
    So like China or Russia or India or
    any country we usually transfer data to.
  • 35:00 - 35:04
    So it’s not like you cannot transfer
    data to the US anymore.
  • 35:04 - 35:07
    But they lost their special status.
    Basically what the judgment said:
  • 35:07 - 35:09
    “You can’t have mass surveillance
    and be at the same time
  • 35:09 - 35:13
    an adequately [data] protecting country”.
    Which is kind of logical anyway.
  • 35:13 - 35:17
    The consequence is that you have to
    use the derogations that are in the law
  • 35:17 - 35:20
    that we have for other countries as well.
    So a lot of people said: “You know,
  • 35:20 - 35:24
    the only result will be that there will be
    a consent box saying ‘I consent that my
  • 35:24 - 35:27
    [personal] data is going to the US.’”
    Now the problem is: consent has to be
  • 35:27 - 35:32
    freely given, informed, unambiguous
    and specific under European law.
  • 35:32 - 35:34
    Which is something all the Googles
    and Facebooks in the world have
  • 35:34 - 35:37
    never understood. That’s the reason
    why all these Privacy Policies are
  • 35:37 - 35:41
    typically invalid. But anyway. So if
    you have any of these wordings that
  • 35:41 - 35:45
    they’re currently using, like “Your data
    is subject to all applicable laws” it’s
  • 35:45 - 35:48
    very likely not “informed” and
    “unambiguous”. Because you don’t have
  • 35:48 - 35:52
    any fucking idea that your data is
    ending up at the NSA if you read this.
  • 35:52 - 35:56
    So what they would have to do is to have
    some Policy saying: “I agree that all of
  • 35:56 - 36:00
    my personal data is made available to the
    NSA, FBI and whatsoever – YES/NO”.
  • 36:00 - 36:02
    applause
    Because it has to be “freely given”, so
  • 36:02 - 36:04
    applause
    I have to have the option to say “No”.
  • 36:04 - 36:09
    Now this would theoretically be possible
    but under US law they’re placed under
  • 36:09 - 36:11
    a “gag order”, so they’re
    not allowed to
  • 36:11 - 36:14
    say this. So they’re in a legal kind of
  • 36:14 - 36:17
    Limbo because on the one hand they have to
    say: “It’s this way” but on the other hand
  • 36:17 - 36:22
    they have to say “No it’s not”. So consent
    is not going to give you any solution.
  • 36:22 - 36:26
    Then there are Standard Contractual
    Clauses. That’s the one from Apple that
  • 36:26 - 36:28
    they’re using right now.
  • 36:28 - 36:32
    And Standard Contractual Clauses allow
    you to have a contract with a provider
  • 36:32 - 36:36
    in a third country. And that
    pledges to you in a contract
  • 36:36 - 36:41
    that all your data is safe. The problem
    is that they have exception clauses.
  • 36:41 - 36:45
    That basically say: “If there’s mass
    surveillance your whole contract is void”
  • 36:45 - 36:49
    because you cannot have a contract
    saying: “Hereby I pledge full Privacy”
  • 36:49 - 36:53
    and at the same time be subject to these
    laws. And this is the interesting thing:
  • 36:53 - 36:56
    all these companies are saying: “Now we’re
    doing Standard Contractual Clauses”,
  • 36:56 - 36:58
    but none of them are going to hold
    up in Courts and everybody knows,
  • 36:58 - 37:00
    but of course to their shareholders
    they have to say: “Oh we have
  • 37:00 - 37:04
    a wonderful solution for this.”
  • 37:04 - 37:08
    The big question here is if we have
    a factual or legal assessment.
  • 37:08 - 37:13
    So do we have to look at factually what
    data is actually processed by the NSA
  • 37:13 - 37:16
    and what are they actually doing. Or do we
    just have to look at the laws in a country
  • 37:16 - 37:22
    and the possibility of mass access. So the
    factual assessment works fine for Apple,
  • 37:22 - 37:26
    Google etc. who are all in these Snowden
    slides. If you look at the abstract and
  • 37:26 - 37:31
    legal assessment which is legally the
    thing that we probably have to do
  • 37:31 - 37:35
    you actually end up with questions like
    Amazon. Amazon was not a huge
  • 37:35 - 37:38
    cloud provider when the Snowden slides
    were actually drafted and written.
  • 37:38 - 37:42
    They’re huge now. And very likely
    they’re subject to all of these laws.
  • 37:42 - 37:45
    So how do we deal with a company like
    this? Can we still forward [personal] data
  • 37:45 - 37:48
    to an Amazon cloud? If we know
    they’re subject to these US laws.
  • 37:48 - 37:51
    So this is the question of which
    companies are actually falling
  • 37:51 - 37:56
    under this whole judgment.
  • 37:56 - 38:00
    Basically you still have a couple of
    other exemptions. So this basic thing
  • 38:00 - 38:03
    that a couple of people say – that you’re
    not allowed to book a hotel [room]
  • 38:03 - 38:07
    in the US anymore – is not true. There
    are a lot of exceptions in the law e.g.
  • 38:07 - 38:11
    the performance of a contract. So if
    I book a hotel [room] in New York online
  • 38:11 - 38:15
    my [personal] data has to go to New York
    to actually book my hotel [room]. So
  • 38:15 - 38:18
    in all these cases you can still transfer
    [personal] data. The ruling is mainly
  • 38:18 - 38:21
    on outsourcing. So if you could
    theoretically have your [personal] data
  • 38:21 - 38:24
    in Europe but you’re choosing not to, because
    it’s cheaper to host it in the US or
  • 38:24 - 38:29
    it’s easier or it’s more convenient. In
    these cases we actually get problems.
  • 38:29 - 38:34
    So what we did is we had a second round
    of complaints. That is now taking
  • 38:34 - 38:38
    these judgments onboard. You can download
    them on the web page as well. And there’s
  • 38:38 - 38:43
    also the deal that Facebook Ireland
    has signed with Facebook US.
  • 38:43 - 38:49
    To provide safety for your data. And this is
    currently under investigation in Ireland.
  • 38:49 - 38:53
    Basically I argued that they have a
    contract but the contract is void because
  • 38:53 - 38:57
    US law says they have to do all this mass
    surveillance. I just got the letter that
  • 38:57 - 39:01
    on November 18th Facebook has actually
    given them [to the DPC] a huge amount
  • 39:01 - 39:07
    of information on what they’re actually
    doing with the data. This is now going
  • 39:07 - 39:10
    to be under investigation. The big
    question is if the DPC in Ireland is
  • 39:10 - 39:13
    actually giving us access to this
    information. Because so far all these
  • 39:13 - 39:17
    evidence that they had they said:
    “it’s all secret and you cannot know
  • 39:17 - 39:20
    what Facebook is doing with your data
    even though you’re fully informed about
  • 39:20 - 39:24
    what they’re doing with your data.”
    Which is kind of interesting as well. But
  • 39:24 - 39:30
    – different issue. A big question was also
    whether there’s gonna be a Safe Harbor 2.0.
  • 39:30 - 39:33
    I was already told by everybody that they’re
    not gonna call it Safe Harbor anymore
  • 39:33 - 39:37
    because they’re stuck with media
    headlines like “Safe Harbor is sunk”
  • 39:37 - 39:40
    or something like this.
  • 39:40 - 39:43
    And what happened is that the US made
    a huge lobbying effort. They said
  • 39:43 - 39:46
    right on the day that all of this is based
    on wrong facts and they’ve never done
  • 39:46 - 39:51
    any of this; and all of this
    is a trade war; and blablablabla.
  • 39:51 - 39:54
    So they put a lot of pressure on them.
    I was actually talking to Jourová,
  • 39:54 - 39:57
    the Justice Commissioner. And I was
    impressed by her. She actually took
  • 39:57 - 40:01
    a whole hour and she really knew what
    was going on. And at the time they had
  • 40:01 - 40:04
    press releases saying: “We’re really
    deeply working on the new Safe Harbor”.
  • 40:04 - 40:07
    And I was asking Jourová: “Did you get
    any of the evidence you need to make
  • 40:07 - 40:11
    such a finding?” And the answer
    was: “Yeah, we’re still waiting for it.
  • 40:11 - 40:14
    We should get it next week”.
    Which basically meant this
  • 40:14 - 40:18
    is never going to work out anymore. But
    of course I think there’s a blame game
  • 40:18 - 40:22
    going on. The EU has to say: “We
    tried everything to find a solution”
  • 40:22 - 40:25
    and the US is saying: “We tried
    everything to find a solution, too”.
  • 40:25 - 40:27
    And then in the end they will blame
    each other for not finding a solution.
  • 40:27 - 40:31
    That’s my guess. But
    we’ll see what happens.
  • 40:31 - 40:34
    The basic problem with a Safe Harbor 2
    is that in the government sector
  • 40:34 - 40:38
    they’d basically have to rewrite the whole
    US legal system. Which they haven’t done
  • 40:38 - 40:44
    for their own citizens. So they will very
    likely not do it for European citizens.
  • 40:44 - 40:47
    Like judicial redress. Not even an
    American has judicial redress. So
  • 40:47 - 40:50
    they would never give that to a European.
    And in the private area: they actually
  • 40:50 - 40:54
    have to redraft the whole Safe Harbor
    principles because they now have to be
  • 40:54 - 40:57
    essentially equivalent to
    what Europe is doing.
  • 40:57 - 41:00
    So this would also protect people in
    the private sphere much more, but it
  • 41:00 - 41:05
    would really take a major overhaul of
    the whole system. To give you an idea:
  • 41:05 - 41:08
    all of these processing operations
    are covered by European law. So
  • 41:08 - 41:13
    from collection all the way
    to really deleting the data.
  • 41:13 - 41:17
    This is what’s covered by the Safe
    Harbor principles: only 2 operations,
  • 41:17 - 41:20
    which are the disclosure by “transmission”
    and the “change of purpose”. Anything else
  • 41:20 - 41:24
    they can do as freely as they wanna do
    under the current Safe Harbor principles.
  • 41:24 - 41:27
    So if you talk about “essentially
    equivalent” you see in these spaces
  • 41:27 - 41:31
    already – points to slide –
    that this is miles apart.
  • 41:31 - 41:35
    So what is the future of EU-US data
    flows? We will have massive problems
  • 41:35 - 41:38
    for the PRISM companies. Because
    what they’re doing is just a violation
  • 41:38 - 41:41
    of our Fundamental Rights. Like it or
    not – you can change the law as much
  • 41:41 - 41:45
    as you want but you cannot
    change the Fundamental Rights.
  • 41:45 - 41:48
    And you’ll have serious problems
    for businesses that are subject
  • 41:48 - 41:51
    to US surveillance law in
    general. So I’m wondering
  • 41:51 - 41:55
    what the final solution is. And that
    was part of the issue that I had
  • 41:55 - 41:58
    with the cases. Typically I like
    to have a solution for all of this.
  • 41:58 - 42:01
    In this case I could only point at the
    problems but I couldn’t really come up
  • 42:01 - 42:06
    with solutions. Because solutions are
    something that has to be done politically.
  • 42:06 - 42:11
    An interesting question was: “How
    about EU surveillance, actually?”
  • 42:11 - 42:15
    Because aren’t they doing more or
    less the same thing? Which is true.
  • 42:15 - 42:19
    And the problem is that the Charter of
    Fundamental Rights only applies
  • 42:19 - 42:23
    to anything that’s regulated by the EU.
    And national surveillance is exempt
  • 42:23 - 42:29
    from any EU law. It’s something that
    member states are doing all by themselves.
  • 42:29 - 42:33
    So you’re out of luck here. You
    can possibly argue it through
  • 42:33 - 42:38
    a couple of circles; but it’s hard to
    do. However, Articles 7 and 8 of the Charter
  • 42:38 - 42:42
    have exactly the same wording as the
    European Convention on Human Rights.
  • 42:42 - 42:46
    And this applies to National Security
    cases. So the relevant Court here
  • 42:46 - 42:50
    is actually in Strasbourg. So you
    could probably end up at this Court
  • 42:50 - 42:53
    with the same argument and say: if they
    already found that this is a violation
  • 42:53 - 42:56
    of the essence of your rights in Luxembourg – don’t
    you want to give us the same rights
  • 42:56 - 43:01
    in Strasbourg as well? And these two
    Courts are in kind of a fight about
  • 43:01 - 43:04
    kind of providing proper Privacy
    protection and protection in general.
  • 43:04 - 43:08
    So very likely you can walk up with
    a German case or with a UK case
  • 43:08 - 43:11
    or a French case and pretty much do
    the same thing here. So the judgment
  • 43:11 - 43:14
    will be interesting for European
    surveillance as well because
  • 43:14 - 43:17
    it’s a benchmark. And you can hardly
    argue that the US is bad and we’re
  • 43:17 - 43:21
    not doing the same thing. Other possible
    solutions are technical solutions.
  • 43:21 - 43:26
    So what Microsoft did with the cloud
    services, hosting them with the Germans,
  • 43:26 - 43:31
    with the German Telekom. And there
    is really the issue that if you can get
  • 43:31 - 43:36
    a technical solution of not having any
    access from the US side you can actually
  • 43:36 - 43:39
    get out of the whole problem. So you can
    try with encryption or data localization;
  • 43:39 - 43:43
    all this kind of stuff. However none
    of this is really a very sexy solution
  • 43:43 - 43:49
    to the whole issue. But it’s
    something that you can possibly do.
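    [A minimal sketch of the client-side-encryption idea just mentioned – not
    anything shown in the talk: encrypt in Europe so a US-hosted provider only
    ever stores ciphertext it cannot hand over. It assumes Python with the
    third-party “cryptography” package; the upload step is a hypothetical
    placeholder.]

        # Encrypt in Europe, ship only ciphertext to the US provider.
        from cryptography.fernet import Fernet

        def encrypt_for_us_hosting(plaintext: bytes) -> tuple[bytes, bytes]:
            key = Fernet.generate_key()             # key material stays in the EU
            token = Fernet(key).encrypt(plaintext)  # only this crosses the Atlantic
            return key, token

        key, ciphertext = encrypt_for_us_hosting(b"personal data")
        # upload_to_us_cloud(ciphertext)  # hypothetical upload; provider sees ciphertext only
        assert Fernet(key).decrypt(ciphertext) == b"personal data"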
  • 43:49 - 43:54
    Last thing: enforcement. And this is a
    little bit of a pitch, I’ve got to confess.
  • 43:54 - 43:57
    We have the problem so far that
  • 43:57 - 44:00
    we have Data Protection law in Europe.
  • 44:00 - 44:03
    But we don’t really have enforcement. And
    the problem is that the lawyers don’t know
  • 44:03 - 44:07
    what’s happening technically. The
    technical people hardly know
  • 44:07 - 44:11
    what the law says. And then you
    have a funding issue. So the idea
  • 44:11 - 44:13
    that I have right now is to create some
    kind of an NGO or some kind of
  • 44:13 - 44:18
    a “Stiftung Warentest for Privacy”. To
    kind of look into the devices we all have
  • 44:18 - 44:21
    and kind of have a structured system of
    really looking into it. And then probably
  • 44:21 - 44:25
    do enforcement as well if the
    stuff that you have on your device
  • 44:25 - 44:29
    is not following European law.
    I think this is an approach that
  • 44:29 - 44:32
    probably changes a lot of the issues.
    It’s not gonna change everything.
  • 44:32 - 44:36
    But this could possibly be a solution to
    a lot of what we had. And that’s kind of
  • 44:36 - 44:40
    what we did in other fields of law as
    well. That we have NGOs or organizations
  • 44:40 - 44:43
    that take care of these things. I think
    that would be a solution and probably
  • 44:43 - 44:48
    helps a little bit. Last – before we
    have a question/answer session –
  • 44:48 - 44:52
    a little Bullshit Bingo to probably get a
    couple of questions answered right away.
  • 44:52 - 44:56
    So the first thing is that a lot
    of questions are whether the EU
  • 44:56 - 44:59
    does the same thing. I just answered it:
    Of course they do the same thing and
  • 44:59 - 45:02
    we’ll have to do something about it
    as well. And I hope that my case
  • 45:02 - 45:07
    is a good case to bring other cases
    against member states of the EU.
  • 45:07 - 45:10
    The second question is that all these PRISM
    companies are saying they don’t do this.
  • 45:10 - 45:14
    It’s absurd because they’re all placed
    under gag orders. Or the people that are
  • 45:14 - 45:17
    talking to us don’t even have the
    security clearance to talk about
  • 45:17 - 45:21
    the surveillance system. So it’s insane
    when a PR person comes up and says:
  • 45:21 - 45:25
    “I hereby read the briefing from Facebook
    that we’re not doing this!”. Which
  • 45:25 - 45:29
    basically is what we have right now.
    And that’s what a lot of the media
  • 45:29 - 45:33
    is referring to as well. Another thing
    that Facebook and the US government
  • 45:33 - 45:36
    have argued later is that they weren’t
    asked. They were not invited to the Court
  • 45:36 - 45:41
    procedure. The fun thing is: both of them
    totally knew about the Court procedure.
  • 45:41 - 45:45
    They just decided not to step in and not
    to become a party to the procedure. So they
  • 45:45 - 45:47
    were like first: “Ouh,
    we don’t wanna talk
  • 45:47 - 45:50
    about it” and then when the decision
  • 45:50 - 45:54
    came around they were like:
    “Oh we weren’t asked”.
  • 45:54 - 45:57
    Of course it’s mainly a win on paper
    but we’re trying to get it implemented
  • 45:57 - 46:01
    in practice as well. And there
    is kind of this argument
  • 46:01 - 46:05
    “The EU has broken the Internet”
    which I typically rebut with “No, the US
  • 46:05 - 46:09
    has broken the Internet and
    the EU is reacting to it”.
  • 46:09 - 46:16
    applause
  • 46:16 - 46:19
    Another issue that was interesting
    is that a lot of the US side said that
  • 46:19 - 46:24
    this is protectionism. So the EU is only
    enforcing these Fundamental Rights
  • 46:24 - 46:29
    to hurt US companies. Which is funny
    because I’m not involved in kind of
  • 46:29 - 46:32
    getting more trade to Europe. I’m
    just like someone interested in
  • 46:32 - 46:36
    my Fundamental Rights. And secondly,
    European politics has done everything
  • 46:36 - 46:41
    to kind of keep this case from getting through.
    So kind of this idea that this is
  • 46:41 - 46:45
    a protectionist thing is kind of strange,
    too. And the last question which is:
  • 46:45 - 46:49
    “What about the Cables? What about all
    the other types of surveillance we have?”
  • 46:49 - 46:54
    They’re an issue, too. In these cases you
    just have more issues of actual hacking,
  • 46:54 - 46:59
    government hacking, basically. So
    illegal access to servers and cables.
  • 46:59 - 47:02
    Which is harder to tackle than
    these companies. Because we have
  • 47:02 - 47:06
    this private interference. So there are a
    lot of other issues around here as well,
  • 47:06 - 47:09
    I was just happy to kind of get one thing
    across. And I’m happy to take questions,
  • 47:09 - 47:13
    as well.
    applause
  • 47:13 - 47:21
    Herald: Alright…
    applause
  • 47:21 - 47:23
    Max: in a lowered voice
    How long do we still have for questions?
  • 47:23 - 47:30
    Herald: We have about
    10 minutes for questions.
  • 47:30 - 47:34
    I would ask you to please line up at
    the microphones here in the hall.
  • 47:34 - 47:37
    We have 6 microphones. And we also have
  • 47:37 - 47:40
    questions from the IRC.
    While you guys queue up
  • 47:40 - 47:44
    I would take one from the internet.
  • 47:44 - 47:48
    Signal Angel: Yeah, just
    one – for the first time.
  • 47:48 - 47:52
    Does TTIP influence any of this?
  • 47:52 - 47:56
    Max: Basically, not really. Because
    the judgment that was handed down was based
  • 47:56 - 47:58
    on the Fundamental Rights. So if they
    have some kind of wording in TTIP
  • 47:58 - 48:03
    it would again be illegal. And there was
    actually a push to get something like that
  • 48:03 - 48:09
    into TTIP. And as far as I know this idea
    was dropped after the judgment. laughs
  • 48:09 - 48:13
    Just a little side note: EDRI has organized
    an ask-me-anything thing at 7 PM as well.
  • 48:13 - 48:18
    So if you got specific questions, you
    can also go there. Just as a reminder.
  • 48:18 - 48:21
    Herald: OK, great.
    Microphone No.2, please.
  • 48:21 - 48:25
    Question: Thank you for your
    efforts. The question would be:
  • 48:25 - 48:28
    Could US businesses
    under these findings ever
  • 48:28 - 48:34
    be employed again
    in critical sectors? E.g. the
  • 48:34 - 48:39
    public sector, Windows in the
    Bundestag, and stuff like that?
  • 48:39 - 48:44
    Max: Yep, yep. That’s a huge problem.
    And that’s a problem we had for a while.
  • 48:44 - 48:49
    I was mainly talking actually with people
    in the business area. I’m mainly invited
  • 48:49 - 48:51
    to conferences there. And people
    were telling me: “Yeah, we’re doing
  • 48:51 - 48:55
    all our bank data on Google
    now”. And I was like: WTF?
  • 48:55 - 48:59
    Because this is not only Privacy.
    That’s also trade secrets, all of
  • 48:59 - 49:03
    this kind of stuff. So there is this
    huge issue and if you talk about
  • 49:03 - 49:07
    the new Windows that is phoning home a
    little more than the old one did, you probably
  • 49:07 - 49:11
    have the same issue here because
    Microsoft is falling under the same thing.
  • 49:11 - 49:14
    Q: No plausible deniability,
    therefore culpability.
  • 49:14 - 49:16
    M: Yep, yep, yep.
    Q: Thank you!
  • 49:16 - 49:18
    Max: Thank you!
  • 49:18 - 49:21
    Herald: OK, microphone No.3,
    please, for your next question.
  • 49:21 - 49:25
    Question: How would you assess
    Microsoft saying they put up
  • 49:25 - 49:29
    a huge fight that they… well,
  • 49:29 - 49:33
    they said they had customers’
    data in Ireland and they said
  • 49:33 - 49:36
    they refused to give it to the FBI.
    What should we think of that?
  • 49:36 - 49:38
    Max: I think to be fair a lot of
    these companies have realized
  • 49:38 - 49:43
    that there is an issue. And that
    they’ve got “Feuer am Arsch” [fire under their ass].
  • 49:43 - 49:47
    And Microsoft… actually a couple of
    Microsoft people are talking to me
  • 49:47 - 49:51
    and are like: “We’re actually not
    unhappy about this case because
  • 49:51 - 49:55
    we have a good argument in the US
    now that we’re getting into trouble here…”
  • 49:55 - 49:59
    But the companies are caught between
    these 2 chairs. The US law says:
  • 49:59 - 50:02
    “We kill you if you’re not giving us all
    the data” and the problem so far is
  • 50:02 - 50:06
    that in the EU… e.g. in Austria
  • 50:06 - 50:10
    the maximum penalty is 25.000 Euro
    if you don’t comply with this.
  • 50:10 - 50:12
    Q: Peanuts.
    M: Which is absurd.
  • 50:12 - 50:15
    And in most other countries it’s the same.
    We now have the Data Protection Regulation
  • 50:15 - 50:18
    that is coming up which gives
    you a penalty of a maximum
  • 50:18 - 50:22
    of 4% of the worldwide turnover,
    which is a couple of millions.
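    [A back-of-the-envelope illustration of the jump in penalties just
    described. The 25.000 Euro Austrian cap is from the talk; the turnover
    figure is a made-up example, not any company’s real number.]

        # Old Austrian cap vs. the upcoming 4%-of-turnover maximum.
        AUSTRIAN_CAP_EUR = 25_000              # pre-Regulation maximum, per the talk
        GDPR_RATE = 0.04                       # up to 4% of worldwide turnover

        example_turnover_eur = 10_000_000_000  # hypothetical 10 bn EUR turnover
        gdpr_max_fine_eur = GDPR_RATE * example_turnover_eur

        print(f"Old Austrian maximum: {AUSTRIAN_CAP_EUR:,} EUR")
        print(f"4% maximum at 10 bn EUR turnover: {gdpr_max_fine_eur:,.0f} EUR")  # 400,000,000 EUR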
  • 50:22 - 50:25
    And if you want to thank someone there’s
    Jan Philipp Albrecht, probably in the room
  • 50:25 - 50:29
    or not anymore, who is a member of the [EU]
    Parliament for the Green Party – he’s
  • 50:29 - 50:32
    actually from Hamburg – who
    negotiated all of this.
  • 50:32 - 50:34
    And this actually could possibly
    change a couple of these things.
  • 50:34 - 50:38
    But you have this conflict of laws
    and solutions like the Telekom thing –
  • 50:38 - 50:42
    that you host the data with the Telekom –
    could possibly allow them to argue
  • 50:42 - 50:45
    in the US that they don’t have any factual
    access anymore so they can’t give the data
  • 50:45 - 50:49
    to the US Government. But we’re splitting
    the internet here. And this is not really
  • 50:49 - 50:53
    something I like too much, but
    apparently it’s the only solution.
  • 50:53 - 50:56
    Herald: OK, thank you for your
    question. We have another one
  • 50:56 - 51:00
    at microphone 4, please.
  • 51:00 - 51:07
    Q: Thank you very much for your
    efforts, first of all. And great result!
  • 51:07 - 51:10
    M: Thank you.
    Q: The question from me would also be:
  • 51:10 - 51:13
    Is there any change in the system
    in Ireland now? So if somebody has
  • 51:13 - 51:16
    a similar struggle to yours – will the
    next round be easier, or not?
  • 51:16 - 51:20
    Max: Basically what the Irish DPC got
    is a wonderful new building. And
  • 51:20 - 51:23
    the press release is too funny.
    Because it says: “We have a very nice
  • 51:23 - 51:28
    Victorian building now in downtown Dublin,
    in a very nice neighborhood” and blablabla
  • 51:28 - 51:30
    and they get double the staff of what
    they had before. But none of this
  • 51:30 - 51:34
    is the key problem. I only took the picture
    because it kind of shows what’s
  • 51:34 - 51:38
    inside the building. And the key
    problem is that we have 2 countries
  • 51:38 - 51:40
    – Luxembourg and Ireland, where
    all of these headquarters are – and
  • 51:40 - 51:44
    these 2 countries are not interested
    in collecting taxes, they’re
  • 51:44 - 51:48
    not interested in enforcing Privacy Law,
    they’re not interested in any of this. And
  • 51:48 - 51:52
    they’re basically getting a huge bunch of
    money off the back of the rest of the EU.
  • 51:52 - 51:56
    And until this actually changes
    and there’s a change of attitude
  • 51:56 - 52:00
    in the Irish DPC it doesn’t really
    matter in which building they are.
  • 52:00 - 52:04
    So they got a lot more money to kind
    of – to the public – say: “Yes we have
  • 52:04 - 52:06
    more money and we have
    more staff and dadadadada”…
  • 52:06 - 52:08
    Q: …but the system did not change!
  • 52:08 - 52:11
    M: The big question is what the system is
    doing: they can prove it now! As they have
  • 52:11 - 52:14
    the new complaint on their table on Safe
    Harbor and PRISM and Facebook.
  • 52:14 - 52:17
    They can prove it – whether they do something
    about it or not. My guess is that
  • 52:17 - 52:19
    they’ll find “some” random reasons
    why unfortunately they couldn’t do
  • 52:19 - 52:23
    anything about it. We’ll see.
    Q: OK, thanks!
  • 52:23 - 52:26
    Herald: OK, thank you! It’s
    your turn, microphone No.2.
  • 52:26 - 52:32
    Question: OK, thank you very much and also
    thank you for your service for the public.
  • 52:32 - 52:41
    M: Thanks for the support!
    applause
  • 52:41 - 52:46
    Q: What that will…
    Sorry about the English…
  • 52:46 - 52:50
    M: Sag's auf Deutsch! [Say it in German!]
  • 52:50 - 52:58
    Q: [in German] What does this actually mean for
    the whole data retention story
  • 52:58 - 53:04
    if it now comes back?
    And to what extent will social media
  • 53:04 - 53:08
    now sort of be exempted
    again, or not?
  • 53:08 - 53:12
    M: To be honest I didn’t really look
    into the German Data Retention thing
  • 53:12 - 53:16
    too much. Being an Austrian
    I’m like “Our Supreme Cou… Constitu…”
  • 53:16 - 53:17
    Q: Me, too!
    audience laughing
  • 53:17 - 53:21
    M: Yeah, I heard. “Our Constitutional
    Court kind of killed it”, so…
  • 53:21 - 53:25
    I don’t think we’ll see Data
    Retention in Austria again too soon.
  • 53:25 - 53:27
    But for Germany it’s gonna be interesting
    especially if you find a way to
  • 53:27 - 53:31
    go to Luxembourg in the end. Like if you
    find some hook to say: “Actually,
  • 53:31 - 53:35
    this German law violates something
    in the Data Protection Regulation
  • 53:35 - 53:39
    or in the Directive“. So we can probably
    find a way to go back to Luxembourg.
  • 53:39 - 53:41
    Could help. The other thing is that just
    the fact that the Luxembourg Court
  • 53:41 - 53:46
    has been so active has probably boosted
    a lot of the National Courts as well.
  • 53:46 - 53:50
    Because the German decision, I had
    the feeling, was like a “We don’t really
  • 53:50 - 53:53
    feel like we can fully say that this is
    actually illegal and we kind of argued
  • 53:53 - 53:56
    that it’s somehow not illegal the way
    it is, but possibly you can do it
  • 53:56 - 54:00
    in the future and uooah…“. And after
    Luxembourg has really thrown
  • 54:00 - 54:05
    all of this right out of the door and was
    like: “Get lost with your Data Retention
  • 54:05 - 54:08
    thing and especially with the PRISM thing”
    you probably have better case law now,
  • 54:08 - 54:11
    as well. And that could be relevant
    for National Courts as well. Because
  • 54:11 - 54:16
    of course these things are a question of
    proportionality. And if we ask everybody
  • 54:16 - 54:18
    here in the room what they
    think is proportionate or not,
  • 54:18 - 54:22
    everyone has a different opinion. And
    therefore it’s relevant what people
  • 54:22 - 54:25
    are saying and what other Courts are
    saying to probably get the level of
  • 54:25 - 54:29
    what we feel is proportionate
    up a little bit.
  • 54:29 - 54:32
    Q: So thank you very much. And go on!
    M: Thank you!
  • 54:32 - 54:36
    Herald: OK, just for the record, in
    case you couldn’t tell by the answer:
  • 54:36 - 54:40
    the question was about the implications
    for the Data Retention Laws, like
  • 54:40 - 54:45
    in Germany and Austria. Microphone
    No.1, we have another question.
  • 54:45 - 54:49
    Question: Hi! Two questions. One: could
    you tell a little bit more about your idea
  • 54:49 - 54:56
    of “Stiftung Datenschutz” Europe-wide?
    And how do we get funding to you…
  • 54:56 - 54:59
    M: Send me an email!
    Q: …if you don’t mind?
  • 54:59 - 55:04
    Second question: when I argue with people
    about like “Let’s keep the personal data
  • 55:04 - 55:08
    of all activists within Europe!” I always
    get this answer: “Yeah, are you so naive,
  • 55:08 - 55:12
    do you think it’s anything different
    if the server stands in Frankfurt
  • 55:12 - 55:14
    instead of San Francisco?”
    What do you say to that?
  • 55:14 - 55:18
    Max: It’s pretty much the same problem
    that we have – and that’s the reason
  • 55:18 - 55:20
    why I said I hope this judgment is used
    for National surveillance in Europe,
  • 55:20 - 55:25
    as well. Because we do have the same
    issues. I mean when you are an Austrian
  • 55:25 - 55:30
    and the German “Untersuchungsausschuss”
    [committee of inquiry] is basically saying: “Ah, we’re only
  • 55:30 - 55:33
    protecting Germans” I feel like my
    fucking data is going through Frankfurt
  • 55:33 - 55:37
    all the time. And I’m kind of out of
    the scope, apparently. So we do need
  • 55:37 - 55:39
    to take care of this as well. I hope
    that this is a case showing that
  • 55:39 - 55:43
    you can actually take action. You
    just have to poke long enough and
  • 55:43 - 55:48
    kind of poke at the right spot especially.
    And I think this is something that…
  • 55:48 - 55:51
    there’s not an ultimate solution to it.
    It’s just one of the kind of holes
  • 55:51 - 55:55
    that you have. The other thing that we
    may see is that a lot of companies that
  • 55:55 - 56:00
    are holding this data are much more likely
    to question an order they get.
  • 56:00 - 56:04
    Because if they get legal problems from
    an order they got from a German Court
  • 56:04 - 56:08
    or whatever it is, they are probably
    now more interested in it – and
  • 56:08 - 56:11
    actually looking at it. Because
    right now it’s cheaper for them
  • 56:11 - 56:14
    to just forward the data. You don’t need
    a whole Legal Team reviewing it all.
  • 56:14 - 56:18
    So I think to kind of split the private
    companies that are helping them
  • 56:18 - 56:22
    from the Government and kind of get some
    issue between them probably helps there,
  • 56:22 - 56:25
    as well. But of course it’s just like
    little peanuts you put in there.
  • 56:25 - 56:29
    But in the end you still
    have that issue. Yeah.
  • 56:29 - 56:35
    On the “Stiftung Datenschutz” or whatever:
    I think that’s kind of a thing I just
  • 56:35 - 56:38
    wanted to throw out to people here, because
    I’m mainly in the legal sphere and on,
  • 56:38 - 56:43
    like, the activist/consumer side. And
    I think the big problem we have
  • 56:43 - 56:48
    on the legal and consumer side is that we
    don’t understand the devices that much.
  • 56:48 - 56:51
    And we lack the evidence. We
    don’t really have the evidence of
  • 56:51 - 56:54
    what’s actually going on on devices
    and you need to have this evidence
  • 56:54 - 56:57
    if you go in front of Courts. I think
    a lot of the people in the room probably
  • 56:57 - 57:01
    have this evidence somewhere on the
    computer. So the idea of really getting
  • 57:01 - 57:04
    this connection at some point – it’s not
    something I can pitch to you right away,
  • 57:04 - 57:07
    because it’s not… like I don’t wanna
    start it tomorrow. But it’s something
  • 57:07 - 57:11
    I wanted to circulate to get feedback
    as well on what you guys think of it.
  • 57:11 - 57:16
    So if there’s any feedback on it, send me
    an email, or a tweet. Or whatever it is.
  • 57:16 - 57:22
    applause
  • 57:22 - 57:24
    Herald: So we do have a bit of time left,
  • 57:24 - 57:27
    microphone No.2 with the
    next question, please.
  • 57:27 - 57:31
    Question: What can I do as an individual
    person now? Can I sue Google
  • 57:31 - 57:36
    or can I sue other companies
    just to stop this?
  • 57:36 - 57:40
    And would it create some
    pressure if I do that?
  • 57:40 - 57:43
    So what can the ordinary
    citizen do now?
  • 57:43 - 57:47
    Max: We’re right now… I already prepared
    it but I didn’t have time to send it out
  • 57:47 - 57:50
    to have complaints against the Googles and
    all the others that are on the PRISM list.
  • 57:50 - 57:53
    We started with Facebook because I kind
    of know them the best. And, you know, so
  • 57:53 - 57:57
    it was a good start. And the idea
    was really to have other people
  • 57:57 - 58:01
    probably copy-pasting this. The complaint
    against Facebook we actually filed
  • 58:01 - 58:05
    with the Hamburg DPC, as well, and the
    Belgian DPC. The idea behind it was
  • 58:05 - 58:09
    that the Irish now suddenly have 2 other
    DPCs that are more interested in
  • 58:09 - 58:13
    enforcing the law in their boat. So
    they’re not the only captains anymore.
  • 58:13 - 58:17
    And it’s interesting what’s gonna happen
    here. If there are other people
  • 58:17 - 58:22
    that have other cases: just file a
    complaint with your Data Protection
  • 58:22 - 58:25
    authority. A lot of them, especially the
    German Data Protection authorities
  • 58:25 - 58:29
    – most of them – are really interested
    in doing something about it, but
  • 58:29 - 58:32
    they oftentimes just need a case. They
    need someone to complain about it and
  • 58:32 - 58:37
    someone giving them the evidence and
    someone arguing it, to get things started.
  • 58:37 - 58:42
    So if anyone is using Google Drive
    or something like that – let’s go.
  • 58:42 - 58:45
    And basically the wording is on our web
    page. You just have to download it
  • 58:45 - 58:50
    and reword it. And we’re gonna probably
    publish on the website the complaints
  • 58:50 - 58:53
    against the other companies, as soon as
    they’re out. Probably in the next 2–3 weeks.
  • 58:53 - 59:00
    Something like this. So just
    copy-paste and spread the love.
  • 59:00 - 59:04
    Herald: OK, thank you
    very much, Max, again!
  • 59:04 - 59:08
    For your great talk. This is it!
  • 59:08 - 59:10
    postroll music
  • 59:10 - 59:18
    Subtitles created by c3subtitles.de
    in 2016. Join and help us do more!