Citizenfour Q&A Session

  • 0:00 - 0:01
    ... wanted to be able to use
  • 0:01 - 0:03
    Thunderbird and GnuPG together with Tor,
  • 0:03 - 0:05
    and so we thought:
  • 0:05 - 0:07
    oh, it would be really easy, I bet,
  • 0:07 - 0:10
    to configure Thunderbird to work with Tor
  • 0:10 - 0:12
    - hah - so a new Free software project
    was born.
  • 0:12 - 0:16
    It's a really simple thing, but basically
  • 0:16 - 0:18
    it's just a package
    that hooks it all together.
  • 0:18 - 0:21
    So a lot of people were using Thunderbird
  • 0:21 - 0:24
    and TorBirdy, and GnuPG, and Tor,
  • 0:24 - 0:26
    and Debian, together for email,
  • 0:26 - 0:30
    combined with Riseup as an email service.
  • 0:31 - 0:37
    So it's literally a real peer to peer,
    Free software driven set of things,
  • 0:37 - 0:40
    actually, that made it possible.
  • 0:49 - 0:50
    [question]:
    So one thing I never understood about this
  • 0:50 - 0:53
    process was exactly how the documents were
    handled, and maybe that's because nobody
  • 0:53 - 0:58
    wants to say, but, you know, did you leave
    them on a server somewhere and download
  • 0:58 - 1:01
    them, hand them over to people, and who
    took what where, and how do you...
  • 1:01 - 1:05
    in case I need to do something really
    dangerous with a load of documents,
  • 1:05 - 1:08
    what's the best way of doing it?
  • 1:08 - 1:11
    [laughter]
  • 1:13 - 1:15
    [Jacob]: Hmm!
  • 1:17 - 1:19
    [audience member]: It's a good thing
    this isn't being streamed.
  • 1:19 - 1:22
    I'm sorry, what?
  • 1:22 - 1:25
    There was a voice from god,
    what did she say?
  • 1:25 - 1:27
    [audience]:
    I said good we aren't streaming tonight.
  • 1:27 - 1:30
    Oh yeah, so hello to all of our friends
  • 1:30 - 1:34
    in domestic and international
    surveillance services.
  • 1:35 - 1:37
    Well, so I won't answer your question,
  • 1:37 - 1:40
    but since you asked the question,
    it's my turn to talk.
  • 1:40 - 1:42
    So what I would say is that...
  • 1:42 - 1:44
    if you want to do clandestine activities
  • 1:44 - 1:46
    that you fear for your life for,
  • 1:46 - 1:48
    you need to really think about
    the situation that you're in
  • 1:48 - 1:49
    very carefully.
  • 1:49 - 1:52
    And so a big part of this is
    operational security
  • 1:52 - 1:54
    and a big part of that is
    compartmentalization.
  • 1:54 - 1:56
    So certain people had access
    to certain things,
  • 1:56 - 1:58
    but maybe they couldn't decrypt them,
  • 1:58 - 2:01
    and certain things were moved around,
  • 2:01 - 2:03
    and that's on a need to know basis,
  • 2:03 - 2:05
    and those people who knew,
  • 2:05 - 2:09
    which is not me - I don't know anything,
    I don't know what you're talking about.
  • 2:10 - 2:12
    Those people knew, and then you know,
  • 2:12 - 2:13
    it'll go with them to their grave.
  • 2:13 - 2:16
    So if you're interested in being the next
    Edward Snowden,
  • 2:16 - 2:17
    you need to do your homework
  • 2:17 - 2:20
    in finding people that will be able to do
    the other part of it, let's say.
  • 2:20 - 2:23
    But just in general, I mean
  • 2:23 - 2:25
    compartmentalization is key, right.
  • 2:25 - 2:27
    So it's not just for AppArmor profiles.
  • 2:27 - 2:30
    So you need to think about
    what you want to do.
  • 2:30 - 2:34
    And I mean a big part of this
    is to consider that the network itself
  • 2:34 - 2:37
    is the enemy, even though it is useful
    for communicating.
  • 2:37 - 2:41
    So all the metadata that exists
    on the network
  • 2:41 - 2:43
    could have tipped people off,
    could have caused
  • 2:43 - 2:44
    this whole thing to fall apart.
  • 2:44 - 2:47
    It really is amazing, I feel like you know
  • 2:47 - 2:48
    two and a half, three years ago,
  • 2:48 - 2:50
    when you talk about Free software,
  • 2:50 - 2:52
    and you talk about the idea of
    Free software,
  • 2:52 - 2:55
    and you talk about issues relating to
    autonomy and privacy, and security
  • 2:55 - 2:58
    you have a really different reception now
    than you did then,
  • 2:58 - 2:59
    and that's really what it took
  • 2:59 - 3:02
    to turn the world half a degree,
    or something,
  • 3:02 - 3:04
    or a quarter of a degree or something.
  • 3:05 - 3:08
    So I'm not going to tell you about
    detailed plans for conspiracy,
  • 3:08 - 3:11
    but I highly encourage you to read about
    South African history,
  • 3:11 - 3:14
    in particular the history of
    Umkhonto we Sizwe.
  • 3:14 - 3:18
    They are the clandestine communications
    group for MK,
  • 3:18 - 3:21
    or rather the operation that lay inside of MK,
  • 3:21 - 3:23
    which is Umkhonto we Sizwe,
  • 3:23 - 3:25
    and they are sort of with
    the African National Congress,
  • 3:25 - 3:29
    and those people have published so many
    books about the revolutionary activities
  • 3:29 - 3:31
    to overthrow the apartheid state.
  • 3:31 - 3:34
    If you read these books, especially
    the books "Operation Vula"
  • 3:34 - 3:36
    and "Armed and Dangerous"
    by Ronnie Kasrils
  • 3:36 - 3:39
    they give you some idea about
    what you need to do
  • 3:39 - 3:40
    which is to compartmentalize,
  • 3:40 - 3:43
    how to find people to do various tasks,
    specific tasks,
  • 3:43 - 3:45
    how to work on building trust
    with each other, what that looks like,
  • 3:45 - 3:47
    how to identify political targets,
  • 3:47 - 3:50
    how you might use things
    like communications technology
  • 3:50 - 3:53
    to change the political topic,
  • 3:53 - 3:55
    and the discussion in general.
  • 3:56 - 4:00
    And I think the best way to learn about
    these things is to study previous people
  • 4:00 - 4:02
    who have tried to do that kind of stuff.
  • 4:02 - 4:05
    And the NSA is not the apartheid regime of
    South Africa,
  • 4:05 - 4:07
    but there are still lessons
    to be learned there,
  • 4:07 - 4:10
    so if you really want to know the answer
    to that, also Che Guevara's manual
  • 4:10 - 4:12
    on guerrilla warfare is very interesting,
  • 4:12 - 4:14
    and there's a lot of other books like that.
  • 4:14 - 4:16
    I'd be happy to talk about it
    with you later.
  • 4:16 - 4:18
    And I have nothing to do with anything
    that we may or may not have done.
  • 4:19 - 4:20
    [laughter]
  • 4:25 - 4:29
    [question]: Do you think there is a chance
    that things may get better
  • 4:29 - 4:34
    for example I know that publicly,
    some programs were not extended
  • 4:34 - 4:37
    but I don't know what is happening
    in the background
  • 4:37 - 4:43
    so maybe it's the same thing
    but they are pretending that it's not
  • 4:43 - 4:45
    How do you see this?
  • 4:45 - 4:48
    [Jacob]: Well I think a couple of things.
  • 4:48 - 4:54
    In general I think what happened, not just
    with this movie but with all of these things
  • 4:54 - 4:56
    is that it inspired hope,
  • 4:56 - 4:57
    and the hope is very important,
  • 4:57 - 5:01
    but hope is not a strategy for survival,
    or for building alternatives,
  • 5:01 - 5:03
    so what it has also done, is that it has
    allowed us to raise the profile
  • 5:04 - 5:06
    of the things which actually do
    make it better.
  • 5:06 - 5:09
    For example ridding ourselves of the
    chains of proprietary software
  • 5:09 - 5:12
    is something that's a serious discussion
    with people that wouldn't have previously
  • 5:12 - 5:15
    talked about Free software
    because they don't care about liberty,
  • 5:15 - 5:17
    they care about security.
  • 5:17 - 5:19
    And even though I think those are
    really similar things,
  • 5:19 - 5:21
    previously they just thought we were just
    Free software hippies,
  • 5:21 - 5:22
    in tie-dye shirts
  • 5:22 - 5:25
    and while that may be true on the weekends
    and evenings
  • 5:25 - 5:28
    or with Bdale every day
    [laughter]
  • 5:28 - 5:30
    I think that actually does make it better
  • 5:30 - 5:33
    And it also changes the dialogue, in
    the sense that it's no longer reasonable
  • 5:33 - 5:37
    to pretend that mass surveillance and
    surveillance issues don't matter,
  • 5:37 - 5:39
    because if you really go down the
    rabbit-hole
  • 5:39 - 5:42
    of thinking about what some of the
    security services are trying to do
  • 5:42 - 5:45
    it becomes obvious that we want to encrypt
    everything all the time
  • 5:45 - 5:48
    to beat selector-based surveillance
    and dragnet-based surveillance.
  • 5:48 - 5:50
    It doesn't matter if something is authenticated
  • 5:50 - 5:53
    You could still trigger some action
    to take place
  • 5:53 - 5:54
    with these kinds of surveillance machines
  • 5:54 - 5:57
    that could for example drone
    strike someone,
  • 5:57 - 5:58
    and so it raises that.
  • 5:58 - 6:00
    And that gives me a lot of hope too,
  • 6:00 - 6:03
    because people understand the root
    of the problem,
  • 6:03 - 6:05
    or the root of many problems
  • 6:05 - 6:07
    and the root of some violence
    in the world, actually.
  • 6:07 - 6:09
    And so it helps us to reduce that
    violence
  • 6:09 - 6:11
    by getting people to acknowledge
    that it's real
  • 6:11 - 6:12
    and also that they care about it
  • 6:12 - 6:14
    and that we care about each other.
  • 6:14 - 6:17
    So that really gives me a lot of hope,
    and part of that is Snowden
  • 6:17 - 6:18
    and part of that is the documents
  • 6:18 - 6:20
    but the other part of it is that..
  • 6:20 - 6:25
    I don't want to blow it up and make it
    sound like we did something
  • 6:25 - 6:27
    like a big deal,
  • 6:27 - 6:30
    but in a sense, Laura, Glenn, myself
    and a number of other people
  • 6:30 - 6:33
    were really not sure we would ever be able
    to travel home to our country
  • 6:33 - 6:34
    or that we wouldn't be arrested.
  • 6:34 - 6:36
    I actually haven't been home
    in over two and a half years,
  • 6:36 - 6:39
    well, two years and three months
    or something
  • 6:39 - 6:42
    I went out on a small business trip
    that was supposed to last two weeks
  • 6:42 - 6:43
    and then this happened
  • 6:43 - 6:45
    and I've been here ever since.
  • 6:45 - 6:47
    It's a really long, crazy trip.
  • 6:47 - 6:51
    But the point is that that's what was
    necessary to make some of these changes
  • 6:51 - 6:53
    and eventually it will turn around
  • 6:54 - 6:55
    and I will be able to go home,
  • 6:55 - 6:57
    and Laura and Glenn will be able to travel
    to the US again.
  • 6:57 - 7:00
    Obviously, Julian is still stuck in the
    Ecuadorian embassy
  • 7:00 - 7:02
    Sarah lives in exile in Berlin,
  • 7:02 - 7:03
    I live in exile in Berlin,
  • 7:03 - 7:05
    And Ed is in Moscow
  • 7:05 - 7:08
    So we're not finished with some of
    these things
  • 7:08 - 7:12
    and it's also possible that, for us,
    the set of people I mentioned,
  • 7:12 - 7:15
    the state we're in will stay that way
    forever.
  • 7:15 - 7:17
    But what matters is that the rest
    of the world
  • 7:17 - 7:19
    can actually move on and fix some of
    these problems,
  • 7:19 - 7:21
    and I have a lot of hope about that.
  • 7:21 - 7:24
    And I see a lot of change, that's the
    really big part.
  • 7:24 - 7:30
    Like I see the reproducible build stuff
    that Holger and Lunar are working on.
  • 7:30 - 7:33
    People really understand the root reason
    for needing to do that
  • 7:33 - 7:35
    and it actually seems quite reasonable
    to people
  • 7:35 - 7:38
    who would previously have expended energy
    against it,
  • 7:38 - 7:41
    in support of it, so I think that's
    really good.
  • 7:41 - 7:43
    And there's a lot of other hopeful things.
  • 7:43 - 7:45
    So I would try and be as uplifting
    as possible.
  • 7:45 - 7:48
    It's not just the rum!
  • 7:50 - 7:54
    [question]: Near the end of the film
    we saw something about another source.
  • 7:54 - 7:57
    I may have been missing some news
    or something
  • 7:57 - 8:01
    but I don't remember anything about that
    being public.
  • 8:01 - 8:03
    Do you know what happened to them?
  • 8:03 - 8:06
    [Jacob]: As far as I know any other
    source that was mentioned in the film
  • 8:06 - 8:08
    is still anonymous, and they're still free.
  • 8:08 - 8:11
    I'm not exactly sure because I was not
    involved in that part
  • 8:11 - 8:13
    but I also saw the end of the film
  • 8:13 - 8:16
    and I've seen a bunch of other reporting
    which wasn't attributed to anyone in particular
  • 8:17 - 8:21
    So the good news... there's an old slogan
    from the Dutch hacker community, right?
  • 8:22 - 8:23
    "Someone you trust is one of us,
  • 8:23 - 8:26
    and the leak is higher up in the chain of
    command than you"
  • 8:26 - 8:31
    And I feel like that might be true again,
    hopefully.
  • 8:33 - 8:35
    I think that guy has a question as well.
  • 8:35 - 8:39
    [question]: Part of the problem initially
    was that encryption software
  • 8:39 - 8:42
    was not so easy to use, right?
  • 8:42 - 8:44
    And I think part of the challenge
    for everyone
  • 8:44 - 8:48
    was to improve on that situation
    to make it better
  • 8:48 - 8:53
    so I'm asking you if you've observed
    any change and to the rest of the room
  • 8:53 - 8:56
    have we done anything to improve on that?
  • 8:57 - 9:01
    [Jacob]: I definitely think that there is
    a lot of free software
  • 9:01 - 9:03
    that makes encryption easier to use,
  • 9:03 - 9:06
    though not always on free platforms,
    which really is heart-breaking.
  • 9:06 - 9:09
    For example Moxie Marlinspike has done
    a really good job
  • 9:09 - 9:11
    with Signal, TextSecure and RedPhone
  • 9:11 - 9:14
    and making end-to-end, encrypted
    calling, texting, sexting,
  • 9:14 - 9:17
    and whatever apps,
  • 9:17 - 9:20
    sext-secure is what I think it's nicknamed
  • 9:20 - 9:22
    and I'm very impressed by that,
    and it works really well
  • 9:22 - 9:25
    and it's something which, especially
    in the last two years,
  • 9:25 - 9:28
    if you have a cell-phone,
    which I don't recommend
  • 9:28 - 9:31
    but if you have a cell-phone,
    and you put in everyone's phone number,
  • 9:31 - 9:34
    a lot of people that I would classify as
    non-technical people,
  • 9:34 - 9:37
    that don't care about Free software
    as a hobby or as a passion
  • 9:37 - 9:39
    or as a profession.
  • 9:39 - 9:41
    You see their names in those systems
  • 9:41 - 9:43
    often more than some of the
    Free software people,
  • 9:43 - 9:44
    and that's really impressive to me,
  • 9:44 - 9:48
    and I think there's been a huge shift
    just generally about those sorts of things
  • 9:48 - 9:51
    also about social responsibility,
  • 9:51 - 9:54
    or people understand they have a
    responsibility to other people
  • 9:54 - 9:58
    to encrypt communications,
    and not to put people in harm's way
  • 9:58 - 10:01
    by sending unsafe stuff over
    unsafe communication lines.
  • 10:01 - 10:05
    So I think in my personal view it's better.
  • 10:05 - 10:08
    But the original problem wasn't actually
    that the encryption was hard to use.
  • 10:08 - 10:11
    I think the main problem is people didn't
    understand the reason
  • 10:11 - 10:13
    that it needed to be done
  • 10:13 - 10:17
    and they believed the lie that is
    targeted versus mass surveillance.
  • 10:17 - 10:20
    And there's a big lie, and the lie is
    that there is such a thing
  • 10:20 - 10:22
    as targeted surveillance.
  • 10:22 - 10:25
    In the modern era, most so-called
    targeted surveillance actually happens
  • 10:25 - 10:26
    through mass surveillance.
  • 10:26 - 10:28
    They gather everything up, and then they
    look through the thing
  • 10:28 - 10:30
    they've already seized.
  • 10:30 - 10:33
    And of course there are targeted,
    focused attacks.
  • 10:33 - 10:36
    But the main thing is that the abuse of
    surveillance often happens
  • 10:36 - 10:38
    on an individual basis.
  • 10:38 - 10:40
    It also has a societal cost.
  • 10:40 - 10:42
    I think a lot of people really
    understand that.
  • 10:42 - 10:46
    It's probably because I also live in
    Germany now for the last two years
  • 10:46 - 10:50
    but I feel that German society in
    particular is extremely aware
  • 10:50 - 10:52
    of these abuses in the modern world
  • 10:52 - 10:55
    and they have a historical context
    that allows them to talk about it
  • 10:55 - 10:58
    with the rest of the world, where the
    world doesn't downplay it.
  • 10:58 - 11:00
    So this is how other people relate to
    Germany
  • 11:00 - 11:03
    not just about how Germans relate to
    each other.
  • 11:03 - 11:06
    And that has also been really good
    for just meeting regular people
  • 11:06 - 11:08
    who really care about it,
  • 11:08 - 11:09
    and who really want to do things.
  • 11:09 - 11:11
    So people's parents email me,
    and are like
  • 11:11 - 11:12
    "I want to protect my children,
  • 11:12 - 11:15
    what's the best way to use crypto
    with them?"
  • 11:15 - 11:17
    You know, things like that.
  • 11:17 - 11:19
    And I didn't ever receive emails like
    that in the past
  • 11:19 - 11:24
    and that to me is uplifting
    and very positive.
  • 11:25 - 11:28
    [question]: A quick organisational question.
  • 11:28 - 11:30
    Right now we're live-streaming the Q&A.
    Are you comfortable with that?
  • 11:31 - 11:33
    [Jacob]: I don't think in the last three
    years I've ever had a moment
  • 11:33 - 11:36
    that wasn't being recorded.
  • 11:37 - 11:39
    [laughter, applause]
  • 11:41 - 11:43
    [question]: If you're fine with it, moving on...
  • 11:44 - 11:48
    [Jacob]: That's fine, just don't do it
    when I'm trying to sleep.
  • 11:48 - 11:51
    [question]: I was wondering why Laura
    and you ended up in Germany
  • 11:51 - 11:55
    because what you said about people in
    Germany might be true
  • 11:55 - 12:01
    but I'm really ashamed about my Government,
    how they dealt with the spying on the chancellor,
  • 12:01 - 12:04
    and anything... they are doing nothing about this.
  • 12:04 - 12:08
    [Jacob]: The reason that we ended up in
    Germany
  • 12:08 - 12:11
    is that I'd been attending
    Chaos Computer Club events
  • 12:11 - 12:13
    for many years
  • 12:13 - 12:15
    and there are a bunch of people that are
    part of the Chaos Computer Club
  • 12:15 - 12:17
    who are really supportive,
    and good people,
  • 12:17 - 12:19
    who have a stable base,
    and an infrastructure.
  • 12:19 - 12:25
    The German hacker scene has this
    phenomenon which is that
  • 12:25 - 12:27
    it's a part of society.
  • 12:27 - 12:30
    So there are people in the CCC who will
    talk with the constitutional court
  • 12:30 - 12:32
    for example,
  • 12:32 - 12:34
    and that creates a much more stable
    civil society
  • 12:34 - 12:36
    and those people were willing to help us.
  • 12:36 - 12:39
    They were willing to hold footage,
    to hold encrypted data.
  • 12:39 - 12:42
    They were willing to help modify hardware.
  • 12:42 - 12:45
    There was a huge base of support where
    people, even if they had fear,
  • 12:45 - 12:47
    they did stuff anyway.
  • 12:47 - 12:50
    And that support went back a long time.
  • 12:50 - 12:53
    And so we knew that it would be safe
    to store footage for the film here.
  • 12:53 - 12:56
    In Berlin, not in Heidelberg, but here
    in Germany.
  • 12:56 - 13:01
    And we knew that, of course,
    there were people that would be helpful.
  • 13:01 - 13:03
    In the US there's a much bigger culture
    of fear.
  • 13:03 - 13:06
    People are afraid of having their houses
    raided by the police,
  • 13:06 - 13:08
    where there's lots of detainments at the
    borders,
  • 13:08 - 13:10
    where there's lots of speculative arrests,
  • 13:10 - 13:12
    journalists that are jailed,
  • 13:12 - 13:15
    so that was the situation, which is not
    to say that Germany was perfect.
  • 13:15 - 13:19
    I revealed in Der Spiegel with three other
    journalists that Merkel was spied on
  • 13:19 - 13:20
    by the NSA.
  • 13:20 - 13:22
    And it's clear that the German government
    was complicit
  • 13:22 - 13:24
    with some of this surveillance.
  • 13:24 - 13:27
    But in a sort of pyramid of surveillance
    there's a sort of colonialism
  • 13:27 - 13:28
    that takes place.
  • 13:28 - 13:31
    And the NSA and GCHQ are at the top.
  • 13:31 - 13:33
    And the Germans are a little bit below that.
  • 13:33 - 13:37
    The thing is that there's not a lot you
    can do about that.
  • 13:37 - 13:39
    And so even though we revealed this
    about Merkel,
  • 13:39 - 13:41
    it's not clear what she should do.
  • 13:41 - 13:42
    It's not clear what anyone should do.
  • 13:42 - 13:45
    But one thing that was clear was that
    if they wanted to break into our houses
  • 13:45 - 13:50
    they would do it in a way that would
    cost them a lot politically.
  • 13:50 - 13:51
    It would be very public.
  • 13:51 - 13:53
    The last time someone raided someone
    working with Der Spiegel
  • 13:53 - 13:56
    was in 1962 during the Spiegel affair,
  • 13:56 - 13:58
    and some ministers were kicked out.
  • 13:58 - 14:00
    You may have seen recently the
    Landesverrat thing
  • 14:00 - 14:02
    with Netzpolitik.
  • 14:02 - 14:04
    The charges against them now
    have been dropped.
  • 14:04 - 14:07
    That would never happen in the
    United States.
  • 14:07 - 14:08
    We would not be safe.
  • 14:08 - 14:10
    And I still, for my investigative
    journalism,
  • 14:10 - 14:11
    and my work with Wikileaks,
  • 14:11 - 14:13
    and my work with the Tor project,
  • 14:13 - 14:15
    I wouldn't even go back to the US,
  • 14:15 - 14:17
    because there's no chance that if they
    wanted to do something to me
  • 14:17 - 14:21
    that I would have any constitutional
    liberties, I think,
  • 14:21 - 14:23
    and the same is true of Snowden.
  • 14:23 - 14:24
    You just won't get that fair trial.
  • 14:24 - 14:28
    And we thought at least here we would
    have ground to stand and fight on.
  • 14:28 - 14:30
    And it's exactly what happened,
    and we won.
  • 14:34 - 14:36
    [question]: This is also about the fear
    stuff that you talk about
  • 14:36 - 14:42
    which is in the very old days we used to
    put red words at the end of every message
  • 14:42 - 14:46
    to make sure that it would be hard to find
    the actual subversive message
  • 14:46 - 14:48
    among all the noise.
  • 14:48 - 14:50
    And you can think about the same thing
    here.
  • 14:50 - 14:55
    Should we build our systems so that
    everything gets encrypted all the time?
  • 14:56 - 14:59
    [Jacob]: So I have a lot of radical
    suggestions for what to do,
  • 14:59 - 15:01
    but I'm going to talk about them tomorrow
    in the keynote mostly.
  • 15:01 - 15:04
    But to give you an example,
    when you install Debian,
  • 15:04 - 15:06
    you can give someone the ability to log
    into the machine
  • 15:06 - 15:08
    over a Tor hidden service for free.
  • 15:08 - 15:12
    You get a free .onion when you add two
    lines to a Tor configuration file.
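    [note]: The "two lines" referred to here are presumably the standard
    hidden service options in torrc; a minimal sketch, assuming the stock
    Debian tor package and an SSH daemon on port 22 (the directory path
    is only illustrative):

        # /etc/tor/torrc
        # publish local port 22 as an onion service
        HiddenServiceDir /var/lib/tor/ssh-onion/
        HiddenServicePort 22 127.0.0.1:22

    After restarting tor, the generated "hostname" file inside
    HiddenServiceDir contains the .onion address to log in to.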
  • 15:12 - 15:16
    We should make encryption not only easy
    to use but out of the box
  • 15:16 - 15:20
    we should make it possible to have
    end-to-end reachability and connectivity,
  • 15:20 - 15:24
    and we should reduce the total amount
    of metadata, to make it harder for people
  • 15:24 - 15:26
    who want to break the law, that want to
    break into computers.
  • 15:26 - 15:31
    We should solve the problem of adversarial
    versus non-adversarial forensics
  • 15:31 - 15:36
    so we can verify our systems with open
    hardware and Free software together.
  • 15:36 - 15:39
    And there's a lot to be done,
    but the main thing to do is to recognise
  • 15:39 - 15:43
    that if you have the ability to upload
    to Debian,
  • 15:43 - 15:46
    there are literally intelligence agencies
    that would like those keys.
  • 15:46 - 15:49
    And we have a great responsibility to
    humanity as Debian developers
  • 15:49 - 15:52
    to do the right thing: to build open
    systems,
  • 15:52 - 15:55
    to build them in a way where users don't
    need to understand this stuff.
  • 15:55 - 15:58
    There are a lot of people in the world
    that will never see this film.
  • 15:58 - 16:03
    And we can solve the problems that this
    film describes largely with Free software.
  • 16:03 - 16:05
    And we can do that without them knowing,
  • 16:05 - 16:07
    and they will be safe for us having
    done that.
  • 16:07 - 16:10
    And if we can do that, the world will be
    a better place, I think.
  • 16:10 - 16:12
    And I think the world is a better place
    because of the efforts that were
  • 16:12 - 16:15
    already done in that area, that made this
    possible.
  • 16:15 - 16:18
    The Tails project made it so that a bunch
    of people
  • 16:18 - 16:20
    who were good at investigative journalism,
  • 16:20 - 16:24
    but absolutely terrible with computers,
    were able to pull this off.
  • 16:24 - 16:27
    And that is entirely the product, in my
    opinion, of Free software.
  • 16:27 - 16:33
    And a little bit of Laura and Glenn, but
    I'd say a lot of Free software.
  • 16:34 - 16:36
    [question]: How many people do you think
    the NSA has
  • 16:36 - 16:39
    working within the Debian community?
  • 16:40 - 16:44
    [laughter, applause]
  • 16:45 - 16:49
    [Jacob]: Well, I looked in the Snowden
    archive about that actually.
  • 16:53 - 16:56
    [laughter, applause]
  • 16:57 - 17:03
    Yeah. And as far as I can tell Debian is
    not a high priority target for them.
  • 17:04 - 17:06
    I mean they write exploits for all sorts
    of stuff
  • 17:06 - 17:11
    but I never found any systematic attempt
    to compromise or harm the Debian project.
  • 17:11 - 17:15
    But obviously there are people who are
    paid by the NSA to infiltrate communities,
  • 17:15 - 17:17
    and that's why we have to have open, transparent
    processes
  • 17:17 - 17:21
    so that if those people behave badly,
    we have an audit trail.
  • 17:21 - 17:23
    We won't ever stop that kind of stuff,
  • 17:23 - 17:25
    but what matters
    is that people do good things.
  • 17:25 - 17:29
    It doesn't matter who they do bad things
    for as long as we can correct those things
  • 17:29 - 17:31
    and/or catch them and stop them before
    it happens.
  • 17:31 - 17:33
    But as far as I know there are only a
    couple of people that have ever
  • 17:33 - 17:36
    been associated with the NSA in the
    Debian community.
  • 17:36 - 17:40
    But I think we shouldn't get paranoid
    about it,
  • 17:40 - 17:42
    but we should just be prudent about our
    processes,
  • 17:42 - 17:44
    because there are lots of intelligence
    services around the world
  • 17:44 - 17:47
    that do not like the values of a
    universal operating system,
  • 17:47 - 17:51
    so I don't think it's super-important to
    look, but I did actually look,
  • 17:51 - 17:55
    very specifically for a whole bunch of
    people in the Debian community
  • 17:55 - 17:58
    to see if any of them also were being
    paid by the NSA
  • 17:58 - 18:02
    and I didn't find any serious thing that
    raised concern,
  • 18:02 - 18:04
    and if I did, I would have...
  • 18:04 - 18:08
    I mean, there were lots of things I found
    in the archive that I immediately
  • 18:08 - 18:09
    notified security teams about.
  • 18:09 - 18:14
    Where I worked along with many other
    people to actually fix those things.
  • 18:14 - 18:19
    And if we had found one of those things,
    like infiltrators in Debian,
  • 18:19 - 18:21
    I absolutely would have just told people
    about it.
  • 18:21 - 18:23
    The problem is that a lot of the
    journalists don't want to do that
  • 18:23 - 18:26
    because there's a ten year felony
    where you go to prison -
  • 18:26 - 18:28
    a federal American prison, for
    ten years -
  • 18:28 - 18:30
    if you reveal the name of an agent.
  • 18:31 - 18:32
    So there's a tension there,
  • 18:32 - 18:34
    but I think that there's something
    to be said,
  • 18:35 - 18:37
    if they're actually actively harming the
    community
  • 18:37 - 18:38
    and they're committing a crime,
  • 18:38 - 18:39
    I think there's something to be said
    about that.
  • 18:39 - 18:41
    So if I found that I think it would be
    worthwhile,
  • 18:41 - 18:43
    but just so you know, there's this
    high cost.
  • 18:43 - 18:45
    So if there were people in the agency
    now,
  • 18:45 - 18:49
    because they saw that we used Tails, and
    Debian, and they wanted to subvert it,
  • 18:49 - 18:52
    there's a really really high bar for
    punishment.
  • 18:52 - 18:55
    Which suggests that maybe people
    won't tell you.
  • 18:55 - 18:59
    So we need to sort of bank on the fact
    that we'll never know,
  • 18:59 - 19:03
    but we don't need to know, as long as we
    have good processes
  • 19:03 - 19:04
    that would catch bad behaviour.
  • 19:04 - 19:06
    And that's one of the strengths of Debian.
  • 19:06 - 19:09
    There are very few operating systems,
    I think,
  • 19:09 - 19:11
    and just in general Free software
    communities,
  • 19:11 - 19:15
    that are as diverse, and committed to the
    openness and the Free software nature
  • 19:15 - 19:18
    of this kind of a project,
  • 19:18 - 19:21
    and so it's very important to state that.
  • 19:22 - 19:25
    But I do think one of the things that will
    happen in the future at some point
  • 19:25 - 19:28
    is that you'll start to find people in the
    Debian community that are pressured
  • 19:28 - 19:30
    by other people to do bad things
  • 19:30 - 19:32
    so we need to set up processes that will
    stop that,
  • 19:32 - 19:34
    to create an incentive for that
    not happening.
  • 19:35 - 19:37
    But it's really tough,
  • 19:38 - 19:40
    so I think that openness, transparency
    and accountability are the ways that
  • 19:40 - 19:44
    we can combat that, because otherwise
    we won't really be able to solve it.
  • 19:45 - 19:47
    But don't be paranoid, is the other thing.
  • 19:47 - 19:50
    They really are out to get you,
    so be prepared.
  • 19:50 - 19:56
    [laughter, applause]
  • 20:01 - 20:06
    [question]: I'm just wondering how trust
    was established
  • 20:06 - 20:10
    because I'm just realizing that
    this community,
  • 20:10 - 20:15
    for you to verify your public key and even
    fingerprint is like,
  • 20:15 - 20:16
    you have to produce your passport,
  • 20:16 - 20:20
    so I'm wondering how Laura managed to
    exchange her keys with Snowden
  • 20:20 - 20:23
    and make sure that they were really
    talking to the right person.
  • 20:24 - 20:28
    [Jacob]: Well, they had a whole sort of
    dance for doing key exchange.
  • 20:28 - 20:33
    I think it was a little bit luck, and a
    little bit transitive trust,
  • 20:33 - 20:35
    there's a little bit of the web of trust,
  • 20:35 - 20:36
    and it worked pretty well.
  • 20:37 - 20:41
    I mean, I don't think that the key-signing
    stuff that Debian does is anything close
  • 20:41 - 20:43
    to what they were doing.
  • 20:43 - 20:46
    They just wanted to make sure that the
    keys they had were the right keys,
  • 20:46 - 20:48
    and that they weren't compromised,
  • 20:48 - 20:50
    and that then they would change things.
  • 20:50 - 20:51
    There was a point in the movie where they
    said:
  • 20:51 - 20:56
    "let's disassociate our meta-data
    one more time"
  • 20:56 - 20:59
    And what that means is they changed all
    of the identifiers that are visible
  • 20:59 - 21:04
    to the network, new keys, new email
    addresses, new Tor circuit, etc
  • 21:04 - 21:08
    and this is like a key consistency thing,
  • 21:08 - 21:11
    where they had the right key to begin with
    and they continued to rotate over to new keys.
  • 21:11 - 21:13
    This is also sometimes called TOFU.
  • 21:13 - 21:16
    This is, I think, weaker than the
    web of trust,
  • 21:16 - 21:19
    but a lot easier for people to do, and
    very easy to explain,
  • 21:19 - 21:21
    and it worked out pretty well.
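    [note]: TOFU is "trust on first use": pin the key you saw first, then
    check that later keys stay consistent with it. This is not a claim
    about the exact tooling used here; as one hedged illustration, recent
    GnuPG releases (2.1.10 and later) ship a TOFU trust model that can be
    combined with the usual key validity checks:

        # verify using first-use pinning alongside normal key validity
        gpg --trust-model tofu+pgp --verify message.asc
        # after checking a fingerprint out of band, mark that binding good
        gpg --tofu-policy good <fingerprint>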
  • 21:21 - 21:25
    It doesn't scale really well, but it has a
    separate good side
  • 21:25 - 21:29
    which is the web of trust explicitly names
    a web of co-conspirators.
  • 21:29 - 21:31
    And so you don't want that feature.
  • 21:31 - 21:33
    It's useful for something like Debian;
  • 21:33 - 21:36
    it's not useful for clandestine
    conspiracies to commit
  • 21:36 - 21:38
    investigative journalism.
  • 21:38 - 21:40
    [laughter]
  • 21:42 - 21:44
    Lots of questions, this is great.
  • 21:45 - 21:52
    [question]: Somebody working on Tails told
    me that the NSA has a file on every DD.
  • 21:52 - 21:54
    Is that true, do you know?
  • 21:55 - 21:57
    [Jacob]: Okay, so when you balance your
    check-book,
  • 21:57 - 21:59
    just to answer your question in a really
    strange way,
  • 21:59 - 22:01
    when you balance your check-book,
    or you balance your bank account,
  • 22:01 - 22:04
    and you think this is how much my rent is,
    this is how much food is,
  • 22:04 - 22:06
    this is how much I have to spend on some
    new hardware,
  • 22:06 - 22:10
    you think about money in an
    individual way.
  • 22:11 - 22:13
    But if you think about it as a state, the
    way a state thinks about money.
  • 22:13 - 22:16
    They don't balance budgets the same
    way that you do.
  • 22:16 - 22:18
    They think about long-term investments
    very differently.
  • 22:18 - 22:20
    They have other people's money.
  • 22:20 - 22:22
    It's a whole different way of managing it.
  • 22:22 - 22:27
    And the NSA is not the Stasi. So it's not
    that you have to worry about
  • 22:27 - 22:30
    them having a file on you, or every Debian
    developer,
  • 22:30 - 22:33
    but rather there exist some laws in the
    United States that say
  • 22:33 - 22:36
    for cyber-security purposes, you don't
    have constitutional rights
  • 22:36 - 22:38
    and based on your accent, you weren't
    an American anyway,
  • 22:38 - 22:40
    and you aren't in America,
  • 22:40 - 22:42
    so you don't have any rights at all,
    anyway, according to them.
  • 22:42 - 22:44
    They're just allowed to do whatever they
    want to you,
  • 22:44 - 22:46
    up to and including murdering you, with
    the CIA.
  • 22:46 - 22:49
    That's what they do with drones; that was
    at the very end of the movie.
  • 22:50 - 22:52
    So it's not that they have a file on you.
  • 22:52 - 22:56
    It's that they have giant databases full
    of information on all of us,
  • 22:56 - 23:00
    and then when they're interested in you,
    pull up all your data,
  • 23:00 - 23:01
    and associative data,
  • 23:01 - 23:03
    and then they use that, and sometimes
    they use it to target you,
  • 23:03 - 23:06
    to break into your machines,
    or to find people to exert pressure on,
  • 23:06 - 23:08
    or to do psychological manipulation on.
  • 23:08 - 23:11
    All that stuff, they do all of those
    things.
  • 23:11 - 23:13
    And so it's not that they have one file
    on you.
  • 23:13 - 23:16
    Though maybe, it depends, if you work on
    a critical package like the Linux kernel
  • 23:16 - 23:21
    they might be more interested in you
    than if you work on something else.
  • 23:21 - 23:25
    I don't want to denigrate anyone's work,
    but they have very specific focuses,
  • 23:25 - 23:29
    and so they definitely are interested in
    being able to compromise systems, right?
  • 23:30 - 23:36
    And so you may also have a file, but it's
    really the meta list that's the new way
  • 23:36 - 23:37
    of thinking about it.
  • 23:37 - 23:41
    And in some senses I think that's actually
    scarier, because they just hoover up
  • 23:41 - 23:43
    everything, all across the whole Internet,
  • 23:43 - 23:46
    and things that are interesting, then
    they have them.
  • 23:46 - 23:49
    And depending on what interesting
    things are there, they maybe
  • 23:49 - 23:52
    put those in a database that lasts
    for ever,
  • 23:52 - 23:53
    or maybe it's just around for 30 days,
  • 23:53 - 23:57
    or maybe it's full content for 9 days,
    or something like that.
  • 23:58 - 24:00
    And then of course if you are a person of
    interest
  • 24:00 - 24:03
    they do do the same stuff that the Stasi
    does,
  • 24:03 - 24:06
    they do that Zersetzung stuff, if you're
    familiar with this German term,
  • 24:06 - 24:11
    disintegration, they do that kind of
    stuff, along with JTRIG, from GCHQ,
  • 24:11 - 24:16
    so they harass people, blackmail them,
    do all sorts of really nasty stuff.
  • 24:17 - 24:20
    And they do that also, so both of those
    things.
  • 24:21 - 24:23
    So again, I don't think you should be
    paranoid, you should encrypt your stuff,
  • 24:23 - 24:25
    and help people do the same,
  • 24:25 - 24:29
    and know that in a democratic society with
    a secret political police,
  • 24:29 - 24:32
    the right place to be is in their
    database, right?
  • 24:32 - 24:34
    You should be proud of being surveilled
    by them,
  • 24:34 - 24:36
    it means you're doing the right thing.
  • 24:37 - 24:42
    [laughter, applause]
  • 24:43 - 24:45
    Nonetheless, we should stop them.
  • 24:49 - 24:54
    [question]: I'm curious about your views
    about Snowden actually coming out
  • 24:54 - 24:56
    and saying he was the whistleblower,
  • 24:56 - 24:59
    because I know, when he came out,
    I had some fierce discussion
  • 24:59 - 25:02
    with friends about it, so I wanted to know
    what you thought about it.
  • 25:02 - 25:03
    [Jacob]: What do you mean came out?
  • 25:03 - 25:07
    [question]: He said I'm Edward Snowden,
    I'm the whistle-blower, here I am,
  • 25:07 - 25:10
    instead of just being anonymous the
    whole way, just sending files to people.
  • 25:11 - 25:14
    [Jacob]: Well, I think the main thing is
    that it's about control of
  • 25:14 - 25:15
    your own narrative, right?
  • 25:15 - 25:20
    I mean if we could have done everything
    here anonymous, and gotten away with it,
  • 25:20 - 25:21
    would that have made the same impact
  • 25:21 - 25:25
    in getting other people to come forward
    even if they maintain their anonymity?
  • 25:25 - 25:28
    So I think that what Snowden did, what's
    beautiful about it,
  • 25:28 - 25:31
    is that he basically did enough,
  • 25:31 - 25:33
    where he could then survive.
  • 25:33 - 25:36
    Our job now for the most part, a very
    good friend told me,
  • 25:36 - 25:39
    he's a little bit of a fatalist, he said:
  • 25:39 - 25:43
    your job, Laura's job, Glenn's job,
    Snowden's job, your job now is
  • 25:43 - 25:45
    just to survive.
  • 25:45 - 25:47
    That's all that you need to do now.
    You don't need to do anything else.
  • 25:47 - 25:52
    You should go do other things, like
    drink a glass of wine, relax, be happy,
  • 25:52 - 25:55
    have a nice life, but just survive,
  • 25:55 - 25:59
    so other people can see that you do the
    right thing, and even though you could have
  • 25:59 - 26:02
    done more, you did enough,
    and you lived through it.
  • 26:02 - 26:06
    And so Snowden coming out and telling us
    all of these things, I mean,
  • 26:06 - 26:10
    there are really powerful people saying
    he should be assassinated, right,
  • 26:10 - 26:14
    hung by the neck until dead, was what one
    of the CIA people said.
  • 26:14 - 26:17
    So he probably could have continued to be
    anonymous for a while,
  • 26:17 - 26:20
    but imagine if the NSA had got to reveal
    his identity.
  • 26:20 - 26:24
    How would that have been framed, what
    would the first impression have been?
  • 26:24 - 26:28
    I think they called him a narcissist, and
    they called him all these terrible names.
  • 26:28 - 26:33
    And it didn't really stick, because he
    basically said "come at me bro',
  • 26:33 - 26:38
    I'm ready, and you can do your worst,
    but you can't get rid of the facts,
  • 26:38 - 26:39
    so let's talk about the facts."
  • 26:39 - 26:42
    And I think the timing of how he did that
    is good, because people really cared
  • 26:43 - 26:46
    about the issues, but he also recognized
    that it was a matter of time,
  • 26:46 - 26:51
    the NSA police went to his house, they
    really bothered his family,
  • 26:51 - 26:55
    they've done that with my family as well,
    other people's families have had trouble.
  • 26:55 - 27:00
    So I think it's tough, because I
    think he probably would have liked to have
  • 27:00 - 27:03
    been able to not have that happen, but
    there comes a point at which
  • 27:03 - 27:05
    you're the person who has access to all
    that information
  • 27:05 - 27:07
    and they're going to figure it out.
  • 27:07 - 27:12
    No amount of anonymity, I think, will
    last forever, but it can buy you time.
  • 27:12 - 27:15
    He got exactly the amount of time
    he needed.
  • 27:15 - 27:18
    The really sad part about him coming out
    in public when he did, though, was that
  • 27:18 - 27:21
    he got stuck in Russia, because my
    government cancelled his passport.
  • 27:21 - 27:24
    I think mostly for propaganda reasons.
  • 27:24 - 27:28
    Because in the United States, we denigrate
    all things relating to Russia.
  • 27:28 - 27:30
    And there are lots of problems with
    Russia,
  • 27:30 - 27:32
    and especially with Vladimir Putin,
  • 27:32 - 27:37
    but at the same time that seems to be the
    only country that was willing to uphold
  • 27:37 - 27:38
    his fundamental liberties.
  • 27:38 - 27:41
    I went to the Council of Europe, and to
    the European Parliament,
  • 27:41 - 27:45
    to the German Parliament, to the French,
    sort of to the French Parliament,
  • 27:45 - 27:48
    they didn't really want to meet with me,
    but also to the Austrian Parliament,
  • 27:48 - 27:50
    and to a number of other places,
  • 27:50 - 27:53
    and everyone said, oh, we would really
    love to help anybody who needs help,
  • 27:53 - 27:55
    oh it's Edward Snowden, never mind.
  • 27:56 - 27:58
    [laughter]
  • 27:58 - 28:03
    And so though I have a lot of critiques
    on Russia, the propaganda aspect of it
  • 28:03 - 28:05
    was very damaging for him to be stuck
    in Russia,
  • 28:05 - 28:08
    but on the other hand, he's still alive,
    and he's still mostly free.
  • 28:08 - 28:12
    And they recognized his right to
    seek and to receive asylum.
  • 28:13 - 28:15
    So there's a lot of trade-offs to think
    about when identifying one's self,
  • 28:15 - 28:18
    and if you were thinking about being
    the next Snowden,
  • 28:18 - 28:19
    or helping Snowden,
    or something like that,
  • 28:20 - 28:23
    you really have to think that, you really
    have to think this out many steps ahead,
  • 28:23 - 28:26
    and it's easy to say, oh he should have
    just stayed anonymous and
  • 28:26 - 28:28
    nobody would have figured it out,
  • 28:28 - 28:31
    but that's very clearly not planning for
    the case that they do figure it out,
  • 28:31 - 28:33
    and then they're going to be in control
    of the narrative,
  • 28:33 - 28:38
    and in that case, I think you are better
    off to do what he did,
  • 28:38 - 28:40
    and he did so quite reluctantly.
  • 28:40 - 28:43
    He's not an egoist, or a narcissist,
    he's actually a really shy guy
  • 28:43 - 28:45
    from what I can tell.
  • 28:45 - 28:49
    I don't know exactly what conversation
    you and your friend had,
  • 28:49 - 28:53
    but I would suspect that the notion is
    that people are more powerful
  • 28:53 - 28:54
    when anonymous.
  • 28:54 - 28:56
    And that is true sometimes,
    but not always,
  • 28:56 - 28:58
    and it's important to remember that
    the anonymity technology is there
  • 28:58 - 29:01
    so you have a choice, not a requirement.
  • 29:01 - 29:04
    And that choice is sometimes
    counter-intuitive,
  • 29:04 - 29:06
    but I think he did the right thing in
    this way, and I wish that my government
  • 29:06 - 29:09
    had done the right thing by him as well,
    but they did not.
  • 29:09 - 29:12
    [question]: So there are a lot of questions,
    do you want to keep going on,
  • 29:12 - 29:13
    shall we get in a little Mate?
  • 29:15 - 29:18
    [Jacob]: I would love some of that rum.
  • 29:18 - 29:23
    I think I have to say GRsec, right?
    GRsec kernel.
  • 29:23 - 29:24
    And then rum appears. Rum as a service.
  • 29:26 - 29:30
    [applause]
  • 29:33 - 29:37
    I'm really happy to keep taking questions,
    because to me, what I want is
  • 29:37 - 29:42
    for every person in this room to feel
    a part of this, because you really are.
  • 29:42 - 29:45
    A lot of the people I've met in this
    community really inspire me to action,
  • 29:45 - 29:49
    and it's important to understand that
    really, it would not have been possible
  • 29:49 - 29:50
    without Debian.
  • 29:50 - 29:54
    For example debootstrap - really important
    tool, right?
  • 29:54 - 29:59
    With weasel's packaging of Tor, it allowed
    us to have bootstraps of things,
  • 29:59 - 30:00
    it allowed us to build things,
  • 30:00 - 30:02
    and using Free software really was
    helpful,
  • 30:02 - 30:05
    so if you guys have any questions at all,
  • 30:05 - 30:08
    really each and every person that helps
    with Debian should just know
  • 30:08 - 30:10
    that you are a part of that,
  • 30:10 - 30:12
    and I'm just happy to talk for as long as
    you want, basically,
  • 30:12 - 30:14
    to answer all of your questions,
  • 30:14 - 30:16
    except the ones that put me in prison.
    Thanks.
  • 30:16 - 30:18
    [laughter]
  • 30:19 - 30:24
    [question]: I just wanted to make a quick
    note about the question
  • 30:24 - 30:26
    "do they have a file on me?"
  • 30:26 - 30:30
    From all I've read so far, it's just that
    they're doing the thing
  • 30:30 - 30:33
    that is in the commercial world called
    "big data".
  • 30:34 - 30:36
    [Jacob]: Yep. Absolutely.
  • 30:36 - 30:39
    Oh boy. GRsec again?
  • 30:41 - 30:45
    [orga]: it's not rum, but it's Bavarian
    whisky.
  • 30:46 - 30:50
    [Jacob]: Oh boy. It's going to be a
    heavy morning tomorrow.
  • 30:51 - 30:54
    I saw another couple of hands.
  • 30:56 - 31:00
    [question]: I was just wondering if
    there's anything you noticed throughout this
  • 31:00 - 31:05
    that you think we could improve in Debian
    to make the next people's lives easier.
  • 31:05 - 31:09
    [Jacob]: Oh my god, I'm so glad you asked
    that question, that's so fantastic.
  • 31:09 - 31:10
    I'm going to talk about that tomorrow
    in my keynote,
  • 31:10 - 31:12
    but let me tell you about one that I have.
  • 31:12 - 31:17
    I revealed a specific document about a
    wifi injection attack system.
  • 31:17 - 31:19
    It's a classified document, it's a
    top secret document,
  • 31:19 - 31:22
    for a thing called NIGHTSTAND, and what
    NIGHTSTAND is,
  • 31:22 - 31:26
    it's basically like car metasploit,
    it's a wifi injector...
  • 31:26 - 31:29
    cheers!
  • 31:35 - 31:37
    Danke schön.
  • 31:38 - 31:41
    It's a wifi injector device...
  • 31:41 - 31:43
    Whew, jesus!
  • 31:44 - 31:48
    [laughter, applause]
  • 31:53 - 31:57
    [orga]: Tonight's whisky sponsored by
    drunc-tank dot org.
  • 32:00 - 32:04
    [Jacob]: So this wifi injector device,
    what it does is it basically is able to
  • 32:04 - 32:09
    exploit the kernel of a device by sending
    malformed data over wifi.
  • 32:09 - 32:15
    Now I have a series of photographs, so
    all of us.. not all of us, but most of us
  • 32:15 - 32:20
    used these specially modified X60s where
    we removed the microphones, soldered
  • 32:20 - 32:22
    down things on the PCI bus,
  • 32:22 - 32:24
    we removed, like, firewire, really
    modified it, flashed coreboot onto it,
  • 32:24 - 32:27
    flipped the read pin so it was only
    read-only,
  • 32:27 - 32:30
    so you couldn't easily make a BIOS
    root kit and make it persistent,
  • 32:30 - 32:32
    we booted TAILS, did all this stuff,
  • 32:32 - 32:36
    often we could boot to RAM so that
    once the machine was powered off
  • 32:36 - 32:39
    basically it would be done, so if someone
    kicks down your door,
  • 32:39 - 32:41
    you just pull the power out,
  • 32:41 - 32:43
    and you don't have a battery, and
    when the power fails you have an
  • 32:43 - 32:45
    instant kill switch.
  • 32:45 - 32:48
    So things that are in TAILS that are
    really useful include this
  • 32:48 - 32:53
    wiping the kernel memory package
    which I hear is being packaged for Debian
  • 32:53 - 32:55
    soon, which is very exciting.
  • 32:55 - 32:57
    Because everyone should have access
    to that so we can tie it into something
  • 32:57 - 33:01
    like GNU panicd or these other things.
  • 33:01 - 33:08
    But one thing I kept having problems with
    is that this wifi injection device,
  • 33:08 - 33:10
    I'm pretty sure, was very close to my
    house.
  • 33:10 - 33:13
    There was a white van outside, it was
    vibrating a bit like there was a guy
  • 33:13 - 33:15
    walking around in it,
  • 33:15 - 33:18
    and then all of sudden, an X60 here,
    an X60 here, and an X60 here,
  • 33:18 - 33:22
    just booted into TAILS, not doing
    anything at all, but on the wifi network,
  • 33:22 - 33:24
    kernel panic, kernel panic, kernel panic.
  • 33:24 - 33:28
    All the same kernel panic, all the
    same memory offsets,
  • 33:28 - 33:32
    in the AppleTalk driver of the stock
    kernel for TAILS.
  • 33:32 - 33:37
    I think I filed a bug upstream with TAILS
    at the time,
  • 33:37 - 33:40
    but this is just incredible because
    it's clear that all the crap
  • 33:40 - 33:46
    in the default Debian kernel that you
    really want for your 1992 Apple network
  • 33:46 - 33:48
    makes operational security really hard,
  • 33:48 - 33:52
    and one thing that would be really great
    would be a GRsec enabled kernel...
  • 33:53 - 33:55
    [applause]
  • 33:55 - 33:58
    Yes, have to drink.
  • 34:01 - 34:07
    But as an example, we built different
    custom machines, and one of the things
  • 34:07 - 34:10
    that we did for some people and in some
    circumstances was
  • 34:10 - 34:12
    to build GRsec enabled kernels.
  • 34:12 - 34:15
    And I'm not going to drink again.
  • 34:19 - 34:21
    So we built those kernels
  • 34:21 - 34:23
    [audience]: Which ones?
  • 34:24 - 34:27
    [Jacob]: Yes, exactly, those ones.
  • 34:27 - 34:31
    And that was work which creates a problem
    for a bunch of reasons.
  • 34:31 - 34:34
    When you build custom kernels, and
    you only have a few people
  • 34:34 - 34:35
    that can build those kernels,
  • 34:35 - 34:38
    you actually build a chain of evidence of
    who helped who.
  • 34:38 - 34:40
    And if that was a stable, normal package,
  • 34:40 - 34:43
    that people could install in a Debian
    pure blend,
  • 34:43 - 34:45
    then it would have been easier to do that.
  • 34:45 - 34:49
    We built a lot more sandbox profiles for
    various different things,
  • 34:49 - 34:51
    we built some transparent Tor-ification
    stuff,
  • 34:51 - 34:54
    and that required a lot of bespoke
    knowledge,
  • 34:54 - 34:57
    and it required a lot of effort that a lot
    of people did not have,
  • 34:57 - 34:59
    because they had a different set of
    skills,
  • 34:59 - 35:01
    and it's good to have a division of
    labour,
  • 35:01 - 35:04
    but having that kind of stuff built into
    Debian by default, making a
  • 35:04 - 35:06
    Debian installer that could do that,
  • 35:06 - 35:09
    and also verification, would be great,
    right?
  • 35:09 - 35:12
    So I wrote some custom scripts
    where I could look at a TAILS disk,
  • 35:12 - 35:14
    or a Debian install,
  • 35:14 - 35:16
    and know if it had been tampered with.
  • 35:16 - 35:20
    And it would be nice if there was just
    a disk you could boot that did
  • 35:20 - 35:22
    verification of an installed system
  • 35:22 - 35:25
    very very easily, so easily that
    Glenn Greenwald could use it.
  • 35:25 - 35:30
    I love Glenn, I say that very politely,
  • 35:30 - 35:33
    but what I mean is it needs to be
    easier than that,
  • 35:33 - 35:36
    because Glenn at least knows that he
    has a reason to use it.
  • 35:36 - 35:40
    And so that was something that we really
    needed help with.
  • 35:40 - 35:42
    And we spent a lot of time on that.
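    [note]: Not the actual scripts mentioned above, but a hedged sketch of
    one coarse check a verification disk could run against an installed
    Debian system: comparing files on disk with the checksums recorded by
    their packages. This only covers dpkg-managed files and trusts the
    local checksum lists, so it is far from tamper-proof:

        # from a trusted live system, with the target root mounted at /mnt
        debsums --root /mnt --all --changed   # list files that differ from
                                              # their package checksums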
  • 35:42 - 35:44
    And there are lots of other little things
    like that,
  • 35:44 - 35:45
    and I'll talk about some of those things
    tomorrow,
  • 35:45 - 35:47
    but one of the really big problems is
    hardware,
  • 35:47 - 35:51
    which is that you cannot buy a modern
    Intel CPU which doesn't come
  • 35:51 - 35:52
    with a backdoor any more.
  • 35:52 - 35:57
    And that is a huge problem, and I'm not
    sure that the answer is to use ARM.
  • 35:57 - 35:59
    It seems like the answer is to use ARM.
  • 35:59 - 36:03
    But that's only if we assume that ARM didn't
    just add a backdoor that's obvious.
  • 36:03 - 36:08
    So we really need to think about how to,
    in moving forward,
  • 36:08 - 36:12
    how to have easy to use, easy to buy
    on the shelf, Debian hardware,
  • 36:12 - 36:15
    available everywhere, all the time,
  • 36:15 - 36:18
    so you can just go and buy this thing and
    verify it in some way
  • 36:18 - 36:20
    with some other machine,
  • 36:20 - 36:22
    to know that you would have the right
    thing.
  • 36:22 - 36:25
    And to that extent we didn't have X-rays
    for a lot of the circuit boards,
  • 36:25 - 36:28
    so that made it very difficult to know
    if when you buy something,
  • 36:28 - 36:30
    it's been tampered with.
  • 36:30 - 36:32
    I'll talk about some of that stuff
    tomorrow,
  • 36:32 - 36:36
    but basically, Debian does a lot of stuff
    right,
  • 36:36 - 36:39
    and that is also worth mentioning.
  • 36:39 - 36:44
    There's so many things that just work
    out of the box, that just work perfectly.
  • 36:44 - 36:48
    So the main thing is to keep the
    quality assurance at the level where
  • 36:48 - 36:50
    it is right now, or to exceed it.
  • 36:50 - 36:52
    Because it actually works super super
    well.
  • 36:52 - 36:56
    The exception being for very specific
    targeted attacks,
  • 36:56 - 36:59
    the kernel attack surface is pretty big,
    and pretty bad, I think.
  • 36:59 - 37:03
    And also, we rebuilt some binaries in
    order to..
  • 37:03 - 37:04
    sorry, I'll get to you in a second.
  • 37:04 - 37:09
    We rebuilt some binaries to make sure
    that we had address space randomisation
  • 37:09 - 37:12
    and linker hardening, and stack
    canary stuff,
  • 37:12 - 37:16
    and for some stuff lately we've been using
    address sanitizer,
  • 37:16 - 37:20
    so it would be really great if all the
    hardening stuff was turned on,
  • 37:20 - 37:23
    if there was PAX plus GRsec as a kernel.
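    [note]: For the userspace side, a minimal sketch of requesting such
    rebuilds through Debian's own build machinery; the ASan flag is only
    for test builds and is not part of the standard hardening set:

        # in debian/rules of a package being rebuilt; dpkg-buildflags then
        # emits PIE, stack-protector, fortify and relro/bindnow flags
        export DEB_BUILD_MAINT_OPTIONS = hardening=+all
        # roughly the flags this turns on when compiling by hand:
        #   -fstack-protector-strong -fPIE -pie -D_FORTIFY_SOURCE=2
        #   -Wl,-z,relro -Wl,-z,now
        # address sanitizer for debugging builds: -fsanitize=address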
  • 37:24 - 37:27
    [audience]: So the specific problem with
    grsecurity is that they don't really
  • 37:27 - 37:30
    want to work with distros.
  • 37:30 - 37:35
    So we could have a Linux kernel package
    with grsecurity applied,
  • 37:35 - 37:38
    but it wouldn't have any of the other
    Debian patches.
  • 37:39 - 37:41
    [Jacob]: So I talked with Brad Spengler
    about this,
  • 37:41 - 37:43
    and I'm so glad that you said that,
  • 37:43 - 37:47
    because what he said was that, as far
    as I can tell, he's totally interested in
  • 37:47 - 37:50
    helping Debian with this but thinks that
    Debian is not interested.
  • 37:50 - 37:53
    He actually runs a kernel building
    service where they do
  • 37:53 - 37:55
    individual kernel builds, and I think
    you'd be interested,
  • 37:55 - 37:57
    and when I told him we'd love to have
    this in TAILS, he said
  • 37:57 - 38:01
    what patches do I need to include in GRsec
    to make sure that it'll work?
  • 38:01 - 38:04
    And he offered to do the integration
    into the GRsec patch if there are not
  • 38:04 - 38:06
    too many things.
  • 38:06 - 38:08
    So I think what we should try and do
    is build a line of communication,
  • 38:08 - 38:10
    and if it costs money we should find a way
    to raise that money,
  • 38:10 - 38:12
    I'll put in some of my own personal money
    for this,
  • 38:12 - 38:14
    and I know other people would too.
  • 38:14 - 38:14
    [distant audience]: I will.
  • 38:14 - 38:16
    [Jacob]: Great.
  • 38:16 - 38:19
    So SecureDrop, for example, part of what
    they do for their leaking platform,
  • 38:19 - 38:22
    if you go to The Intercept's website
    and you want to leak them a document,
  • 38:22 - 38:26
    they actually use Free software
    everywhere, but there are a few things
  • 38:26 - 38:29
    they build specially, and one of those
    things is a GRsec kernel.
  • 38:29 - 38:32
    So the people at First Look, who helped
    make this movie,
  • 38:32 - 38:34
    and who work on SecureDrop,
  • 38:34 - 38:35
    they would probably also,
  • 38:35 - 38:37
    I'm not committing them, I don't
    know that they would actually do this,
  • 38:37 - 38:39
    but I think they would really like it if
    that was in there,
  • 38:39 - 38:42
    and I think if we could find the community
    will to do that,
  • 38:42 - 38:44
    I know I would volunteer and other people
    would,
  • 38:44 - 38:47
    I know that dkg in the back would love to
    help with this, I know the ACLU is just
  • 38:47 - 38:50
    totally behind funding this work, right?
    [audience]: I don't know.
  • 38:54 - 38:56
    I thought that you were there to protect
    my civil liberties, buddy.
  • 39:00 - 39:03
    But I really think that it's possible
    that we could do this,
  • 39:03 - 39:07
    and I definitely think Brad, the author of
    GRsec,
  • 39:07 - 39:10
    I think he would really love it if Debian
    shipped GRsec.
  • 39:10 - 39:12
    And it doesn't need to come by default,
  • 39:12 - 39:17
    but if it was possible to just have
    it at all, that would be great.
  • 39:17 - 39:20
    Maybe we could have an affinity group
    where everyone who is interested can
  • 39:20 - 39:23
    meet sometime tomorrow and we could
    talk about doing this.
  • 39:23 - 39:25
    I would love to have that conversation.
  • 39:25 - 39:27
    Who are you?
  • 39:28 - 39:29
    [audience]: Ben Hutchings.
  • 39:30 - 39:34
    [Jacob]: Oh, nice to meet you!
  • 39:35 - 39:39
    [laughter, applause]
  • 39:43 - 39:45
    That's awkward.
  • 39:47 - 39:51
    [question]: Hi. Sorry to interrupt the
    awkwardness,
  • 39:51 - 39:53
    and replace it with more awkwardness.
  • 39:53 - 39:54
    Nice to see you, Jake.
  • 39:54 - 39:58
    So, I remember reading the documents
    in 2013
  • 39:58 - 40:04
    and seeing the NSA's internal training
    guide for how to query their
  • 40:04 - 40:08
    Hadoop data store, aka XKeyscore,
  • 40:08 - 40:15
    and so I thought I would just ask you
    if you think Free software, on net, helps us
  • 40:15 - 40:17
    or helps them.
  • 40:17 - 40:19
    [Jacob]: I'm really glad you asked that
    question.
  • 40:19 - 40:23
    I think that Free software helps everyone
    on the planet, and I think that
  • 40:23 - 40:27
    purpose-based limitations... I understand
    why people want them.
  • 40:28 - 40:31
    I think we should try to build a world
    where we are free,
  • 40:31 - 40:34
    and so putting in purpose-based
    limitations is really problematic,
  • 40:35 - 40:38
    and I think what we should do is try to
    mitigate the harm that they can do
  • 40:38 - 40:39
    with those systems,
  • 40:39 - 40:42
    as opposed to pretending that they care
    about Free software licensing.
  • 40:42 - 40:45
    These guys kill people with flying robots,
  • 40:45 - 40:49
    it's illegal to murder people, and they
    do it.
  • 40:49 - 40:53
    Limiting their use with licenses, first
    of all, that just means they'll spend
  • 40:53 - 40:55
    your tax money to rewrite it if they care
    about the license,
  • 40:56 - 41:00
    and you won't get their bug-fixes or their
    improvements,
  • 41:00 - 41:02
    and then additionally they're still not
    going to obey your license anyway,
  • 41:02 - 41:05
    because literally some of these people
    work on assassinating people.
  • 41:05 - 41:08
    So it is better that we keep our integrity
    and take the high road,
  • 41:08 - 41:12
    and write Free software, and we give it to
    every single person on the planet
  • 41:12 - 41:14
    without exception.
  • 41:15 - 41:16
    It's just better. It's better for all of
    us, right?
  • 41:16 - 41:22
    So the fact that they have Hadoop, the
    fact that they, for example, use OpenSSL,
  • 41:22 - 41:25
    or maybe they use Tor, or whatever, right?
  • 41:25 - 41:27
    Or they use gdb to debug their exploits.
  • 41:30 - 41:32
    I kind of wish that on them.
  • 41:34 - 41:37
    [laughter, applause]
  • 41:38 - 41:39
    I think it's great, right?
  • 41:39 - 41:42
    So one of the things Che Guevara said
    in his manual about guerrilla warfare,
  • 41:42 - 41:45
    in chapter two, is that (oh, it was
    chapter three)
  • 41:45 - 41:48
    He talks about when you have to arm
    a guerrilla army,
  • 41:48 - 41:52
    this is not exactly related, but it's an
    analog.
  • 41:52 - 41:55
    He says that the most important thing
    is for the guerrilla army to
  • 41:55 - 41:58
    use the weapons of the people that
    they're fighting - the oppressor.
  • 41:58 - 42:02
    And the reason is that it allows you to
    resupply, essentially.
  • 42:02 - 42:05
    When you win a battle, you resupply.
  • 42:05 - 42:08
    When we all use the same Free software,
    and we're working on these things,
  • 42:08 - 42:11
    the fact that they have to contribute
    to the same projects and they often do
  • 42:11 - 42:13
    means there's a net win for us.
  • 42:13 - 42:16
    They do have some private things that they
    don't share, obviously,
  • 42:16 - 42:19
    with the exception of nice people like
    Edward Snowden,
  • 42:19 - 42:22
    and I think that it is a net positive
    thing,
  • 42:22 - 42:24
    and if we think of it as a struggle,
  • 42:24 - 42:26
    we are better off to take the high road,
  • 42:26 - 42:29
    and so I really think we should not
    pretend that we can stop them,
  • 42:29 - 42:32
    and instead we should work together
    to build solutions.
  • 42:32 - 42:34
    And I think that Debian is doing that,
    right?
  • 42:34 - 42:36
    I think Debian is much harder to
    compromise than
  • 42:36 - 42:38
    a lot of other operating systems,
  • 42:38 - 42:40
    and it's much much harder to coerce
    people,
  • 42:40 - 42:43
    and there's a strong ethos that comes
    with it that it's not just the technical
  • 42:43 - 42:45
    project, there's a social aspect to it.
  • 42:45 - 42:49
    I think I was in the New Maintainer
    queue for 11 years,
  • 42:49 - 42:50
    maybe that's a little too long,
  • 42:50 - 42:52
    but there's a huge hazing process,
  • 42:52 - 42:56
    so anyone who wants to help, really really
    wants to help,
  • 42:56 - 42:59
    and if they want to do something wrong
    there are processes to catch
  • 42:59 - 43:01
    people doing things wrong.
  • 43:01 - 43:03
    So we should really stay true to the
    Free software ethos,
  • 43:03 - 43:05
    and it really is a net benefit.
  • 43:08 - 43:12
    [question]: Hi Jake. Thanks a lot for
    saying "GRsec" so much.
  • 43:17 - 43:20
    Just wanted to give a shout out.
  • 43:20 - 43:25
    You mentioned possible backdoors in
    CPUs and so on,
  • 43:25 - 43:30
    that ARM might not be the next best thing
    because it's not so open either.
  • 43:30 - 43:33
    You might want to have a look at POWER8.
  • 43:33 - 43:39
    It's basically PowerPC 64, so Debian has
    support for it as far as I know,
  • 43:39 - 43:41
    and most of the stuff is actually open.
  • 43:41 - 43:45
    Not the actual designs that IBM is
    using,
  • 43:45 - 43:49
    but you can have, actually, an FPGA
    implementation of it,
  • 43:49 - 43:55
    and if you have the money make your own
    ASICs for it, without even knowing
  • 43:55 - 43:59
    how to do it, which is pretty good,
    I think.
  • 44:00 - 44:03
    [Jacob]: I think there are lots of things
    we can hack right?
  • 44:03 - 44:06
    I mean I had one of those weird RMS
    laptops, the Lemote,
  • 44:06 - 44:08
    or whatever it's called, for a while.
  • 44:08 - 44:11
    And I was definitely able to get some
    Free software running on it,
  • 44:11 - 44:13
    in theory it was a Free software laptop.
  • 44:13 - 44:16
    But getting other people to use this is
    the problem,
  • 44:16 - 44:18
    you need to get everybody to use it,
    right?
  • 44:18 - 44:21
    There's a sort of old anarchist cliché,
  • 44:21 - 44:23
    "None of us are free until all of us are
    free"
  • 44:23 - 44:25
    And that really applies here.
  • 44:25 - 44:28
    We really need to have Free software
    that's usable by everyone,
  • 44:28 - 44:31
    otherwise we're sort of bound by the
    lowest common denominator
  • 44:31 - 44:36
    of Free, or proprietary tools, depending
    on what people have to use.
  • 44:36 - 44:38
    So it'll be great when we have that,
  • 44:38 - 44:40
    and there's a thing called the Milkymist
  • 44:40 - 44:44
    which is a video mixing board that has an
    FPGA implementing a Free software CPU
  • 44:44 - 44:46
    that you can boot Debian on, or OpenWRT,
  • 44:46 - 44:48
    and it does work, and I have used it,
  • 44:48 - 44:51
    and in fact I used to use it as a shell,
  • 44:51 - 44:54
    and for a long time I used a Debian
    trick,
  • 44:54 - 44:56
    actually I've never talked about that in
    public,
  • 44:56 - 44:58
    let me think about that for a second.
  • 44:59 - 45:02
    So I used to use an IRC client that was
    really buggy,
  • 45:02 - 45:05
    and I couldn't figure out where all the
    bugs were,
  • 45:05 - 45:08
    but I knew that if I hung out in certain
    networks that someone else
  • 45:08 - 45:12
    would help me find those bugs by trying
    to exploit my client.
  • 45:12 - 45:14
    And I wanted to make it as hard as
    possible.
  • 45:14 - 45:19
    So I ran my IRC client inside of a Debian
    machine that was running an S390 emulator.
  • 45:19 - 45:25
    Who here uses Hercules? Thank you to
    whoever packaged it.
  • 45:25 - 45:28
    And so I would use Hercules, it was a
    very long install process.
  • 45:28 - 45:30
    Very slow.
  • 45:30 - 45:34
    And I would do this, and what I'd always
    dreamed of doing at some point
  • 45:34 - 45:37
    was using the Milkymist and the
    Hercules together
  • 45:37 - 45:41
    for maximum ridiculously difficult
    to exploit,
  • 45:41 - 45:42
    plus GRsec kernel.
  • 45:45 - 45:48
    But that's not a usable thing.
  • 45:48 - 45:50
    So what we need to do is take these kinds
    of prototypes
  • 45:50 - 45:53
    which actually do represent many steps
    forward,
  • 45:53 - 45:56
    and we need to make sure that they're
    produced on a scale where
  • 45:56 - 46:00
    you can go into a store and purchase them
    anonymously, with cash,
  • 46:00 - 46:02
    in a way that you can then verify.
  • 46:02 - 46:06
    And we're actually really close to that
    with software defined radios
  • 46:06 - 46:08
    and open hardware,
  • 46:08 - 46:10
    but we're not quite there yet.
  • 46:12 - 46:16
    [question]: What I meant is that POWER8
    is basically getting big, currently,
  • 46:16 - 46:18
    on the server market,
  • 46:18 - 46:21
    and it might get big for other stuff also.
  • 46:22 - 46:23
    [Jacob]: Hopefully.
  • 46:26 - 46:29
    [question]: I want to come back to the
    story about the panic
  • 46:29 - 46:32
    in the Appletalk driver.
  • 46:32 - 46:37
    The common approach against this is
    to compile your own kernel with
  • 46:37 - 46:40
    all this stuff not compiled in,
  • 46:40 - 46:44
    but on two of my systems I have a
    modprobe wrapper which has
  • 46:44 - 46:47
    a whitelist of modules which may be
    loaded,
  • 46:47 - 46:52
    and I install that wrapper as the thing
    that the kernel uses for loading modules.
  • 46:52 - 46:58
    Do you know if such a thing exists
    elsewhere, or if not,
  • 46:58 - 47:03
    I would be interested in developing it
    into something which is actually usable
  • 47:03 - 47:05
    for people.
  • 47:06 - 47:08
    [Jacob]: That would be great.
  • 47:08 - 47:12
    In this case we were using Tails.
  • 47:12 - 47:19
    And so, Tails is very finicky about what
    it will accept, and very reasonably so,
  • 47:19 - 47:23
    and so having that in Debian will make it
    a lot easier to get it into something
  • 47:23 - 47:25
    like Tails, I think.
  • 47:25 - 47:29
    But the main thing is really that we have
    to think about the attack surface
  • 47:29 - 47:30
    of the kernel very differently.
  • 47:30 - 47:33
    The problem is not Appletalk; the problem
    is the Linux kernel is filled with
  • 47:33 - 47:35
    a lot of code,
  • 47:35 - 47:39
    and you can autoload things: in certain
    cases, when certain things come in,
  • 47:39 - 47:40
    and certain things get autoloaded,
  • 47:40 - 47:43
    and I know Bdale loves his
    ham radio stuff,
  • 47:43 - 47:46
    but I never use ham radio on the machine
  • 47:46 - 47:49
    I used for clandestine conspiracies,
    you know?
  • 47:49 - 47:51
    That's a separate machine.
  • 47:51 - 47:52
    It's over here.
  • 47:52 - 47:54
    So we just need to find a way to think
    about that.
  • 47:54 - 47:57
    And part of that could be kernel stuff,
    but also part of it could be thinking
  • 47:57 - 48:00
    about solutions like that, where we
    don't need to change the kernel.
  • 48:00 - 48:02
    So if you could package that and develop
    that, it would be really fantastic.
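    [editor's note]: a sketch of the whitelisting wrapper described in
    this exchange, under the assumption that the kernel's module
    autoloading helper is pointed at it (the kernel execs whatever path is
    in /proc/sys/kernel/modprobe, i.e. the kernel.modprobe sysctl, when it
    wants a module loaded). The whitelist entries and the log path are
    illustrative; note that the kernel often requests alias strings such
    as "net-pf-10" or "fs-ext4" rather than final module names, so the
    whitelist has to contain whatever the kernel actually asks for.

        #!/usr/bin/env python3
        # Fail-closed whitelist wrapper for the kernel's module autoloading helper.
        import os, sys

        REAL_MODPROBE = "/sbin/modprobe"
        WHITELIST = {          # illustrative entries only
            "fs-ext4",         # alias the kernel requests when mounting ext4
            "net-pf-10",       # protocol family alias for IPv6 sockets
            "snd-hda-intel",   # a plain module name, for comparison
        }

        # The kernel invokes the helper roughly as: helper -q -- <name-or-alias>,
        # so the requested module is normally the last argument.
        requested = sys.argv[-1] if len(sys.argv) > 1 else ""

        if requested in WHITELIST:
            # Hand the unmodified arguments to the real modprobe.
            os.execv(REAL_MODPROBE, [REAL_MODPROBE] + sys.argv[1:])

        with open("/var/log/modprobe-denied.log", "a") as log:   # illustrative path
            log.write("denied autoload of %r\n" % (requested,))
        sys.exit(1)   # non-zero exit: the kernel just sees the load fail

    Activating it would then be a matter of writing the wrapper's
    (hypothetical) install path into the kernel.modprobe sysctl; modules
    loaded by hand with modprobe are unaffected, since those calls never
    go through this helper.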
  • 48:04 - 48:09
    [Ben]: Actually, some time ago, after
    I think it was the econet exploits,
  • 48:09 - 48:13
    no-one uses econet, it was broken anyway,
    but you could exploit it,
  • 48:13 - 48:15
    because it was autoloaded.
  • 48:15 - 48:23
    So I actually went through and turned off
    autoloading on a few of the more obscure
  • 48:23 - 48:25
    network protocols.
  • 48:25 - 48:29
    We could probably go further with that,
    even in the defaults.
  • 48:29 - 48:32
    [Jacob]: I think it would be great to
    change some of the kernel stuff so that
  • 48:32 - 48:36
    at least, I mean, Tails is a special use
    case, where, I think, it's very important,
  • 48:36 - 48:38
    and it doesn't work for everyone,
  • 48:38 - 48:41
    but we should just consider that there are
    certainly things which are really great,
  • 48:41 - 48:44
    and I want to use Debian for it, because
    Debian is a universal operating system.
  • 48:44 - 48:48
    But for a modern desktop system where
    you're using GNOME,
  • 48:48 - 48:54
    and you haven't set anything up,
    Appletalk for example,
  • 48:54 - 48:58
    maybe we would ask those people
    to load that module themselves.
  • 49:00 - 49:05
    [Ben]: Yeah, for example you could
    have, a lot of those things are going to
  • 49:05 - 49:07
    have supporting utilities,
  • 49:07 - 49:10
    so you could put something in the
    supporting utilities that loads it
  • 49:10 - 49:11
    at boot time.
  • 49:12 - 49:14
    And if you don't have those installed,
    you don't need it.
  • 49:15 - 49:17
    [Jacob]: Yep, totally. And I think there's
    lots of ways to do it where
  • 49:17 - 49:20
    the network can't trigger it,
    and that's important.
  • 49:21 - 49:24
    [Ben]: Yeah, that puzzled me,
    I can't understand,
  • 49:24 - 49:29
    the protocol module should get loaded
    when userland tries to open a socket
  • 49:29 - 49:32
    of that type,
  • 49:32 - 49:35
    it shouldn't happen in response to
    network traffic.
  • 49:37 - 49:45
    There are things like, I think if you
    run ifconfig that can autoload
  • 49:45 - 49:47
    a bunch of things, for example.
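    [editor's note]: a tiny illustration, not from the talk, of the
    userland-triggered autoloading being discussed: merely asking for a
    socket of an exotic protocol family can make the kernel request the
    matching module via its "net-pf-N" alias, even though nothing else on
    the system uses that protocol. Whether this actually pulls in the
    appletalk module depends on your kernel configuration and any
    blacklisting in place.

        # Demonstrate protocol-module autoload by opening an AppleTalk (DDP) socket.
        import socket

        try:
            # AF_APPLETALK is protocol family 5; the kernel may autoload "appletalk".
            s = socket.socket(socket.AF_APPLETALK, socket.SOCK_DGRAM, 0)
            print("AppleTalk socket created; check lsmod to see if the module loaded")
            s.close()
        except OSError as exc:
            print("socket refused (module blacklisted, denied, or not built):", exc)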
  • 49:48 - 49:50
    [Jacob]: Yeah, I think on either side
    it should be more explicit,
  • 49:50 - 49:53
    and in this case with Tails,
  • 49:53 - 49:55
    there was a time when you looked at
    the kernel module list
  • 49:55 - 49:57
    and it was pretty amazing,
  • 49:57 - 50:01
    like I think there was an X25 thing,
    an Appletalk thing,
  • 50:01 - 50:04
    wait, this is all about going over Tor,
    we don't support any of these
  • 50:04 - 50:05
    things at all.
  • 50:05 - 50:10
    So it's just the way that things are
    interdependent, right?
  • 50:10 - 50:11
    It's not a dig at the kernel itself.
  • 50:11 - 50:14
    I think the Linux kernel as it works
    in Debian today works really well
  • 50:14 - 50:15
    for a lot of people,
  • 50:15 - 50:18
    but there is definitely a high security
    use case,
  • 50:18 - 50:21
    and, for example, if I were a Debian
    developer, and I had a development
  • 50:21 - 50:23
    machine where I didn't run a web
    browser,
  • 50:23 - 50:25
    and I put a lot of effort into it,
  • 50:25 - 50:29
    it would be really nice if there were
    a kernel that provided the same
  • 50:29 - 50:32
    threshold of security.
  • 50:32 - 50:36
    And I think that the GRsec kernel with
    some stuff changed about it,
  • 50:36 - 50:38
    like getting rid of Appletalk and a few
    other things,
  • 50:38 - 50:40
    would be closer to that,
  • 50:40 - 50:42
    and combined with that guy's tool that
    he's talking about,
  • 50:42 - 50:47
    you could make it so that, for an
    autoloadable module, even if the system was
  • 50:47 - 50:50
    going to autoload it, you could stop it,
    in a fail-closed sort of way.
  • 50:50 - 50:53
    And I think there's a lot of stuff,
    practically, to do on that front,
  • 50:53 - 50:56
    and there's another project called
    Subgraph OS,
  • 50:56 - 51:02
    which is basically working on becoming
    in some ways a Debian derivative,
  • 51:02 - 51:05
    and they're going to do stuff like GRsec
    kernel,
  • 51:05 - 51:08
    and they have a whole sandboxing framework
    which uses AppArmor, seccomp
  • 51:08 - 51:11
    and xpra, and a few other things,
  • 51:11 - 51:14
    and I think that they'll make a lot of
    interesting security decisions,
  • 51:14 - 51:17
    which might make sense to adopt in
    Debian later.
  • 51:18 - 51:20
    [Ben]: I think Matthew Garrett has an
    interesting criticism about that and
  • 51:20 - 51:24
    how it wouldn't really work, and Wayland
    was a better way to go than xpra.
  • 51:25 - 51:27
    [Jacob]: Yeah, I've heard those
    criticisms,
  • 51:27 - 51:29
    but Matthew Garrett is wrong.
  • 51:30 - 51:33
    Not usually, but in this particular case.
  • 51:33 - 51:37
    For example, the sandboxing stuff,
    if you have a GNOME appstore,
  • 51:37 - 51:42
    essentially, that's for one set of users,
    but for a Debian developer
  • 51:42 - 51:44
    writing your own policies,
    it might be useful,
  • 51:44 - 51:47
    and if you need Wayland, you might
    not have a full solution,
  • 51:47 - 51:49
    we might want to have both for a while.
  • 51:49 - 51:51
    And I think it'd be great.
  • 51:51 - 51:54
    And the main thing is we just need to
    find people who will think about those
  • 51:54 - 51:56
    issues and try to integrate them,
  • 51:56 - 52:00
    because most people who write exploits,
    or who understand how to do offensive
  • 52:00 - 52:03
    security stuff, they don't want to help
    Free software projects,
  • 52:04 - 52:06
    they just want to exploit them.
  • 52:06 - 52:08
    And so some of the Subgraph guys,
    what I really like about them
  • 52:08 - 52:11
    is that they're trying to improve the
    Free software products we all use.
  • 52:11 - 52:13
    Even though they may make different
    design decisions,
  • 52:13 - 52:15
    they're making Free software all the same.
  • 52:19 - 52:24
    [question]: Maybe also, some other thing
    to keep in mind is actually
  • 52:24 - 52:39
    that there is also a social aspect to this:
    the pressure, if the NSA wants to put anything
  • 52:39 - 52:41
    inside Debian.
  • 52:41 - 52:48
    So we actually also need to make sure
    that if they put pressure on somebody
  • 52:48 - 52:57
    we have some way to help these people
    not to land in prison.
  • 52:57 - 53:04
    So is there also a social aspect of
    supporting people who get pressure
  • 53:04 - 53:06
    from anyone?
  • 53:07 - 53:10
    [Jacob]: Yep. I mean, if anyone is ever
    in that situation one thing I would say
  • 53:10 - 53:13
    is that it's your right to remain silent,
  • 53:13 - 53:16
    you have the right to remain silent
    I think is the phrase the police would say
  • 53:16 - 53:19
    but there are definitely communities
    of people who will help you.
  • 53:19 - 53:22
    There's a group called the Courage
    Foundation, for example,
  • 53:22 - 53:24
    which was started by Sarah Harrison,
  • 53:24 - 53:26
    and the job that the Courage Foundation
    has taken on
  • 53:26 - 53:30
    is essentially to help people who would be
    sources or who are in harm's way like this
  • 53:30 - 53:33
    and if you found yourself in that kind of
    a position there are people
  • 53:33 - 53:35
    who will try to help you.
  • 53:35 - 53:37
    I really don't think that is the next step
    in this,
  • 53:37 - 53:38
    though I think it could happen.
  • 53:38 - 53:42
    But I think it's much more likely someone
    is going to write an exploit for Firefox.
  • 53:42 - 53:45
    That's the way they're going to own
    Debian people in the future,
  • 53:45 - 53:47
    for the most part, that's how they own
    us today.
  • 53:47 - 53:52
    Firefox, number one enemy to security
    on your Debian machine, probably.
  • 53:52 - 53:55
    And that's not a dig at Firefox, it's just
    super-complicated software,
  • 53:55 - 53:57
    and these guys are really good at
    writing exploits,
  • 53:57 - 53:59
    and that's an easy target.
  • 53:59 - 54:01
    So we, I think, have to deal with the social
    thing,
  • 54:01 - 54:04
    but we also should look at some of the
    technical problems,
  • 54:04 - 54:07
    and then when and if people have that,
    you can contact me.
  • 54:07 - 54:10
    I'm super happy to put you in touch with
    people who will help.
  • 54:10 - 54:14
    And obviously, get a lawyer, get several
    lawyers if you can.
  • 54:14 - 54:17
    Contact the EFF, or the ACLU, depending
    on where you are.
  • 54:17 - 54:23
    At least in Germany, and in the United
    States, it isn't so bad yet
  • 54:23 - 54:26
    that they can put that kind of pressure
    on you openly,
  • 54:26 - 54:28
    in a Free software project.
  • 54:28 - 54:31
    If you write proprietary software you're
    in a very different situation,
  • 54:31 - 54:34
    and there are definitely people who are in
    that situation right now,
  • 54:34 - 54:38
    and I don't envy them. Their position is
    actually much harder.
  • 54:38 - 54:42
    So actually, writing Free software already
    means you're not at the very beginning
  • 54:42 - 54:44
    of the target list, I think.
  • 54:47 - 54:53
    Any other questions? Wow. Where's the rum?
  • 55:01 - 55:06
    [question]: How do you deliver the
    encrypted message without exposing
  • 55:06 - 55:08
    the connection to a third party?
  • 55:14 - 55:16
    [Jacob]: Which encrypted message do you
    mean?
  • 55:17 - 55:19
    Do you mean, like jabber?
  • 55:20 - 55:21
    [question]: Email, or jabber, yes.
  • 55:22 - 55:26
    [Jacob]: For the most part we use systems
    where Tor hidden services are available
  • 55:26 - 55:29
    to connect to them, so we never even left
    the Tor anonymity network,
  • 55:29 - 55:32
    so they're end-to-end encrypted and
    anonymized, you connect to a
  • 55:32 - 55:34
    .onion address,
  • 55:34 - 55:38
    and then using crypto on top of that, so
    TLS to a Jabber server,
  • 55:38 - 55:40
    and then OTR on top of that,
  • 55:40 - 55:45
    so you have, you could call it a
    composition of cryptographic systems,
  • 55:45 - 55:50
    and the core of that is Tor, along with
    using throwaway machines,
  • 55:50 - 55:53
    going to locations where you never go
    twice,
  • 55:53 - 55:57
    using open wifi plus Tor plus TLS plus
    OTR,
  • 55:57 - 56:02
    and for email, Riseup offers Tor hidden
    services, which allows you to do the same
  • 56:02 - 56:05
    thing, essentially, and then using PGP as
    well.
  • 56:06 - 56:12
    [question]: I mean, how about metadata,
    like the delivery address of the target?
  • 56:13 - 56:23
    [Jacob]: In some cases we use a system
    called Pond,
  • 56:23 - 56:26
    and Pond is a system that is completely
    Tor hidden service based,
  • 56:26 - 56:29
    pond.imperialviolet.org.
  • 56:29 - 56:33
    Adam Langley probably wouldn't want me to
    say, but I'll say it anyway,
  • 56:33 - 56:36
    it would be very useful to package this
    for Debian,
  • 56:36 - 56:39
    because it's a system where once you do
    key exchange with someone,
  • 56:39 - 56:44
    you have an end-to-end encrypted messaging
    system that's like email,
  • 56:44 - 56:48
    you can send files that are encrypted,
    you can send messages that are encrypted.
  • 56:48 - 56:51
    It's delay-based. You don't have
    usernames,
  • 56:51 - 56:54
    you just have a public key,
    and then you have group signatures,
  • 56:54 - 56:57
    so that people can send things to your
    mailbox by proving they are a member
  • 56:57 - 57:00
    of the group but not which member of
    the group they are.
  • 57:00 - 57:02
    And there's a lot of stuff like that.
  • 57:02 - 57:04
    So we use Jabber, we use email, and we use
    Pond.
  • 57:04 - 57:08
    And those three systems together also
    allowed us to build a clandestine
  • 57:08 - 57:10
    sneakernet.
  • 57:10 - 57:11
    So we have the ability to carry USB disks,
  • 57:11 - 57:13
    and a few of us carried them inside of
    our bodies,
  • 57:13 - 57:16
    and if you've never had that experience,
    lucky you.
  • 57:20 - 57:24
    You want to make sure you use post-quantum
    computer crypto for that, by the way.
  • 57:24 - 57:26
    It's more comfortable.
  • 57:29 - 57:31
    [orga]: Shall we relieve this man of his
    duties?
  • 57:31 - 57:33
    [Jacob]: Any more questions?
  • 57:34 - 57:35
    [orga]: One more question.
  • 57:36 - 57:39
    [question]: Okay, so when the Snowden
    leaks were first published it created
  • 57:39 - 57:42
    a lot of awareness, and people were
    talking about it,
  • 57:42 - 57:45
    and there was a huge media echo,
  • 57:45 - 57:48
    Now if some documents leak, people
    are saying: yeah, all this surveillance,
  • 57:48 - 57:51
    and we aren't dead yet, and we can still
    live our lives.
  • 57:51 - 57:55
    They basically care less. They still care
    a bit, but they care much less than
  • 57:55 - 57:59
    when the first documents were published,
  • 57:59 - 58:05
    so how can we maintain awareness of
    this issue among the world's population,
  • 58:05 - 58:06
    in your opinion?
  • 58:07 - 58:09
    [Jacob]: There's a really scary thing
    that's happening right now.
  • 58:09 - 58:14
    There was this idea in the 90s, we had
    the crypto wars.
  • 58:14 - 58:16
    Do any of you remember this idea of the
    crypto wars?
  • 58:16 - 58:19
    Okay, a few of you do, maybe not all of
    you do.
  • 58:19 - 58:22
    But we had the so-called crypto wars in
    the 90s, I encourage you to look this up
  • 58:22 - 58:25
    on DuckDuckGo, or whatever your
    favourite search engine is.
  • 58:25 - 58:29
    In theory we're in the second crypto
    wars now.
  • 58:29 - 58:32
    In reality what happened is the first
    crypto wars never ended.
  • 58:32 - 58:35
    We didn't actually win, like we thought
    we did.
  • 58:35 - 58:37
    But there are a bunch of things that are
    taking place.
  • 58:37 - 58:41
    For example, making a stand against
    backdoors.
  • 58:41 - 58:45
    Using end-to-end encrypted
    communications.
  • 58:45 - 58:48
    Actually pushing for that, being quite
    open about actually hosting
  • 58:48 - 58:51
    those kinds of services, and doing it
    from a principled perspective,
  • 58:51 - 58:53
    from a legal perspective.
  • 58:53 - 58:58
    I think you will find that the tension
    will continue to rise for a while,
  • 58:58 - 59:02
    and I think that it will continue to be
    a topic of public debate,
  • 59:02 - 59:07
    and an important aspect of this is that
    now regular journalists that don't
  • 59:07 - 59:10
    understand technology at least understand
    the importance of these things.
  • 59:10 - 59:13
    And if they don't do that, they at least
    perceive that they will be considered
  • 59:13 - 59:17
    unprofessional if they don't care, and
    think about those things,
  • 59:17 - 59:19
    or they'll be somehow negligent.
  • 59:19 - 59:21
    And I think that will keep some of the
    discussion going,
  • 59:21 - 59:24
    and it will allow us to build some
    breathing room,
  • 59:24 - 59:27
    and that breathing room will actually
    allow us to build some alternatives.
  • 59:27 - 59:30
    But there are some downsides, right?
  • 59:30 - 59:34
    One of the things that takes place when
    you reveal security service spying
  • 59:34 - 59:37
    is that it tends to get normalized, to a
    degree.
  • 59:37 - 59:39
    But then in some cases it does get pushed
    back.
  • 59:39 - 59:44
    In the 70s in the United States, it became
    illegal to do assassinations, for example.
  • 59:44 - 59:47
    Because what the CIA were doing was so
    atrocious that eventually there was
  • 59:47 - 59:49
    political pushback.
  • 59:49 - 59:52
    It turns out it only lasted 30 years, and
    then they started doing it again.
  • 59:52 - 59:58
    But there's a saying in my country which
    is that effectively the price of liberty
  • 59:58 - 59:59
    is eternal vigilance.
  • 59:59 - 60:01
    And that's what we are engaged in now.
  • 60:01 - 60:05
    And the liberty starts with software
    liberty, I think,
  • 60:05 - 60:07
    in the case of communications on networks.
  • 60:07 - 60:10
    And so we have to have Free software,
    and it has to be responsibly encoding
  • 60:10 - 60:12
    packets and data,
  • 60:12 - 60:14
    and if we think about it in this sense
    we'll find a lot of pressure,
  • 60:14 - 60:17
    and we'll have a lot of discussions
    about it,
  • 60:17 - 60:20
    and you'll start to see it be a part of
    policy debates,
  • 60:20 - 60:23
    like one of the presidential candidates
    in the United States
  • 60:23 - 60:24
    just came out against encryption.
  • 60:24 - 60:27
    I hope that sinks his presidential
    campaign.
  • 60:27 - 60:29
    I mean it's weird to be against
    encryption.
  • 60:29 - 60:31
    It's like I'm against prime numbers.
  • 60:31 - 60:33
    No modular arithmetic.
  • 60:34 - 60:37
    [laughter, applause]
  • 60:39 - 60:42
    I just want to say it's important to
    understand, you are right,
  • 60:42 - 60:44
    people will be normalized about it,
  • 60:44 - 60:46
    but each and every one of us that
    understands these issues
  • 60:46 - 60:48
    can actually keep it alive.
  • 60:48 - 60:50
    And the way we do that is when we
    communicate with people...
  • 60:50 - 60:52
    I'll give you an example which I
    like to give.
  • 60:52 - 60:55
    I grew up in San Francisco, in the Bay
    Area, in California,
  • 60:55 - 60:58
    and I did that in the 80s.
  • 60:58 - 61:02
    And so a lot of people that I knew had
    HIV and they died of AIDS.
  • 61:02 - 61:06
    And there was a huge discussion about
    this, and it was called GRID,
  • 61:06 - 61:09
    the Gay Related Immune Deficiency
    syndrome.
  • 61:09 - 61:11
    Before it was called HIV and AIDS.
  • 61:11 - 61:13
    And lots of people were sick, and a lot of
    people died,
  • 61:13 - 61:15
    and there was a sort of normalization
    process where people sort of
  • 61:15 - 61:18
    accepted this as their fate, especially
    if they were in the gay community.
  • 61:18 - 61:23
    And still, over years and years and years,
    people began to build a culture about
  • 61:23 - 61:26
    safe sex, and they started to talk about
    respecting their partners,
  • 61:26 - 61:29
    and about talking about these issues,
    and about getting tested,
  • 61:29 - 61:32
    and it took a lot of effort, to really go
    much further.
  • 61:32 - 61:34
    A lot of people actually died in that
    process.
  • 61:34 - 61:37
    It was a very sad, serious situation.
  • 61:37 - 61:40
    And I think we have similar discussions
    that are taking place now,
  • 61:40 - 61:42
    and some people don't take it seriously,
  • 61:42 - 61:45
    and if they happen to be Muslims living
    in Pakistan,
  • 61:45 - 61:48
    they might get a drone strike.
  • 61:48 - 61:51
    And there's a sort of survival mechanism
    that takes place there.
  • 61:51 - 61:54
    And it's an unfortunate parallel, I think,
  • 61:54 - 61:57
    but I would really consider that we can
    change this dialogue
  • 61:57 - 62:00
    by continuing to have it even though
    it's exhausting,
  • 62:00 - 62:02
    and by recognizing our responsibility,
  • 62:02 - 62:04
    and how we can make it better by
    continuing to do that,
  • 62:04 - 62:07
    and by building healthy alternatives,
    and by building new systems,
  • 62:07 - 62:10
    and by refusing to backdoor any
    system, ever,
  • 62:10 - 62:14
    completely committing to
    Free software,
  • 62:14 - 62:17
    and transparency of that software,
    and also of those processes.
  • 62:17 - 62:20
    And really really really sharing the
    knowledge about it,
  • 62:20 - 62:22
    to make it impossible to suppress.
  • 62:22 - 62:25
    And we should not accept the
    normalization of that.
  • 62:25 - 62:28
    We shouldn't make it fun to spy on people,
    we shouldn't make jokes about it
  • 62:28 - 62:30
    in a way that normalizes it,
  • 62:30 - 62:34
    and we should respect those people
    who are victims of surveillance,
  • 62:34 - 62:37
    and we should recognize that basically
    everyone here is a victim of surveillance
  • 62:37 - 62:38
    to some degree,
  • 62:38 - 62:40
    and we should care about that,
    and we should continue to be upset,
  • 62:40 - 62:43
    but not just upset; to channel that
    anger into something useful
  • 62:43 - 62:45
    like making Debian better.
  • 62:47 - 62:50
    [applause]
  • 62:56 - 63:00
    [orga]: Thanks Jake for such a long Q&A
    session,
  • 63:00 - 63:02
    I hope you enjoy the rum.
  • 63:02 - 63:05
    And I'm sure Jake's going to answer any more
    questions if he can still talk.
  • 63:08 - 63:10
    [Jacob]: Thanks.
Title:
Citizenfour QA Session
Description:

QA session given by Jacob Appelbaum after the projection of "Citizenfour" during DebConf15

Video Language:
English
Team:
Debconf
Project:
2015_debconf15
