
Snowden, the NSA, and Free Software - Bruce Schneier + Eben Moglen

  • 0:09 - 0:14
    It's conventional to give an introduction at a moment like this
  • 0:14 - 0:19
    and there are of course two things that make doing that difficult, if not pointless.
  • 0:19 - 0:26
    One of which is, everyone already knows the speaker, and the other is, his family is here, which means of course
  • 0:26 - 0:31
    introductions must be handled with particular care.
  • 0:31 - 0:40
    I've been trying to talk about Snowden and the future at this law school this fall, and I was hampered by two things:
  • 0:40 - 0:47
    I hadn't read the documents, and I wasn't a cryptographic expert.
  • 0:47 - 0:53
    Both of those problems have been solved because I'm not going to be doing the talking this evening.
  • 0:53 - 1:01
Bruce Schneier is (I think it's fair to say) the world's most important cryptographer and public intellectual--
  • 1:01 - 1:11
    most wonderful cryptographers being more introverted and less linguistically capable in high rhetorical form.
  • 1:11 - 1:18
    So that's why he doesn't need any introduction, but I should say that as another alumnus of
  • 1:18 - 1:21
    the Hunter College Elementary School System,
  • 1:21 - 1:24
Bruce is a graduate of Hunter High School in New York
  • 1:24 - 1:32
    and of the University of Rochester, and of American University, and holds honorary doctorates
  • 1:32 - 1:38
    arising from having done good work for the human race, which most people here know all about.
  • 1:38 - 1:46
Applied Cryptography is still a really good place to begin if you want to understand why you can trust the math.
  • 1:46 - 1:54
    The array of articles, interviews, and books on security, trust and modern technology
  • 1:54 - 2:03
is, for people like me who try to follow along doing the law of this, not just an inspiration but a godsend.
  • 2:03 - 2:12
    It's my great pleasure to have Bruce here at Columbia today because he knows what the rest of all those documents say
  • 2:12 - 2:18
which means he knows a great deal about how Snowden and the future are really going to turn out.
  • 2:18 - 2:27
    I hope here in conversation this evening we can hit some of the geeky high spots of all of that.
  • 2:27 - 2:37
    So, Bruce, welcome to Columbia Law School and thank you for being here.
  • 2:37 - 2:44
    Maybe a good place to begin would be to say whatever you can say about how you came to be
  • 2:44 - 2:49
involved with Glenn Greenwald and the project of publication of Mr. Snowden's disclosures.
  • 2:49 - 2:54
    Well, the one sentence answer is: "I was asked."
  • 2:54 - 3:01
    Greenwald had in his possession all these documents. They are very technical, very jargon-filled
  • 3:01 - 3:08
    and he needed an expert in the material to help understand them.
  • 3:08 - 3:14
    My name came up again and again until he called me.
  • 3:14 - 3:20
    Stuff happens, and I go down to Rio, and it's kind of a surreal experience to be handed
  • 3:20 - 3:27
reams of Top Secret classified material and be told "hey, read this and tell me what you think," but
  • 3:27 - 3:30
    that's what I did.
  • 3:30 - 3:36
    We worked on several stories, and the one story we published before Greenwald severed his relationship
  • 3:36 - 3:40
    with The Guardian was about Tor, the anonymity service.
  • 3:40 - 3:51
It's a good story; it talks about how Tor is secure, how the NSA does go after Tor users, what mechanisms they're using,
  • 3:51 - 3:59
    how they are attacking users on the Internet, both getting data, breaking anonymity, breaking into computers.
  • 3:59 - 4:06
    The story published in early October, and I think it was like two weeks later that Greenwald broke with The Guardian.
  • 4:06 - 4:13
    Presumably, when a new venture gets started up, I will be back doing stories.
  • 4:13 - 4:21
    There is a lot more to tell. Until then, you are in the very capable hands of Bart Gellman and Ashkan Soltani
  • 4:21 - 4:26
    who are writing for the Washington Post, doing a great job.
  • 4:26 - 4:35
    What do you think we should do with the fact that you probably weren't terribly surprised by what you read?
  • 4:35 - 4:41
There's a movement around the world that says, "Well, nobody is really surprised because everybody
  • 4:41 - 4:47
    knows it's going on, therefore there's nothing we need to do about it," and I find myself confronting that knowing that
  • 4:47 - 4:55
    the first part of the syllogism is, in fact, correct. We're not terribly surprised. Why are we not terribly surprised?
  • 4:55 - 4:58
    You know, I think we're both surprised and not surprised.
  • 4:58 - 5:05
    It's really interesting, if you've ever watched any movie with an NSA villain, this is exactly the sorta thing they would do.
  • 5:05 - 5:13
    There's nothing in the revelations-- I mean, sometimes they are a bit extreme, spying on gaming worlds--
  • 5:13 - 5:19
    but if you thought about it for half a minute, you would say "well of course you would, that's a place to communicate."
  • 5:19 - 5:25
    If the goal is to eavesdrop on all communications, you're going to eavesdrop on that channel just like
  • 5:25 - 5:31
    you'd eavesdrop on a little chat window in a Scrabble game, as well as Skype.
  • 5:31 - 5:41
    So, there is no surprise. But the details, the extent, I think it really is a surprise.
  • 5:41 - 5:47
    We kinda knew it, but we never actually fully thought about it, we never did the math. We never worked out the budgets.
  • 5:47 - 5:54
    And we were starting to, because we were seeing the Utah facility come up, and people were looking at
  • 5:54 - 6:02
    the square footage, the power, how many servers are there, what could be stored there, but still you're just guessing.
  • 6:02 - 6:09
    Seeing it for real is just surprising because it's there. You might know it's there, but seeing it...
  • 6:09 - 6:17
    The analogy I've been using, a crummy analogy but it's the best one I've got, is it's kinda like death.
  • 6:17 - 6:26
You all know death is coming, it's not a surprise, the story always ends this way, yet every time it happens it is a surprise.
  • 6:26 - 6:32
    Because you basically never really think about it. I think surveillance was like that; we never thought about it.
  • 6:32 - 6:40
    There are professionals in the world of cyber security who have thought about it. They are more surprised I think than you and me
  • 6:40 - 6:47
    because they trusted in what the listeners told them more. The gaps that have opened up in the documents you've read
  • 6:47 - 6:56
    between what is actually going on and what people were assured was going on must seem fairly large.
  • 6:56 - 7:01
We know that they promised the financial community that they were not going to break financial crypto.
  • 7:01 - 7:09
    We know they made all sorts of promises about how minimization worked in the United States.
  • 7:09 - 7:20
In that sense, is it true that those of us at the "cypherpunk" edge of the world were less surprised because we trusted The Listeners less?
  • 7:20 - 7:28
    I don't know, my guess is if you're right you're surprised even more, because, my god, it is actually that bad.
  • 7:28 - 7:38
    You know, around the edges... you sorta have a bell curve of beliefs of how bad this was.
  • 7:38 - 7:46
    Now we're seeing that even the more extreme beliefs of how much surveillance is going on
  • 7:46 - 7:59
    are true and actually conservative. We're seeing a surprising number of alliances, where it's very common for the NSA to spy on country A
  • 7:59 - 8:06
    and then partner with country A to spy on country B, and then partner with country B to spy on somebody else.
  • 8:06 - 8:15
We're seeing so many webs of that. Germany, which is one of the NSA's most trusted partners, is being spied on.
  • 8:15 - 8:22
    The only thing we haven't seen-- and I can't wait because it will be extraordinarily big-- is when we start seeing
  • 8:22 - 8:28
    the UK and the US start spying on each other. I think the odds of that not being true are very small.
  • 8:28 - 8:35
    Because why in the world, if you as the NSA are spying on your own country, why wouldn't you spy on the country of your closest ally?
  • 8:35 - 8:40
    You're spying on everyone else. There's just no exclusion.
  • 8:40 - 8:47
    What do you think is the biggest headline for you, as a technical thinker, that's come out of the documents you've seen?
  • 8:47 - 9:00
    I think the most important headline is that crypto works. We expected more cryptanalysis, we expected more "the NSA can break this code and that code and that code."
  • 9:00 - 9:08
    We know they're spending an enormous amount of money on this. But again, we learned from the documents that cryptography works.
  • 9:08 - 9:19
    That's the lesson of Tor. The NSA can't break Tor and that pisses them off. That's the lesson of the NSA eavesdropping on your buddy list and address books
  • 9:19 - 9:27
    from your connections between your browser, Gmail, and your ISP. You look at the data, and they get about 10 times the amount of data from
  • 9:27 - 9:37
Yahoo users as Google users even though Google is many times larger than Yahoo. It's because Google uses SSL client-side by default.
  • 9:37 - 9:41
    So, SSL works. And there's another slide from a program called MUSCULAR
  • 9:41 - 9:49
    which is an NSA program for getting data from Google's data centers and the connections between them, where they specifically point out
  • 9:49 - 10:03
    "this is the place where SSL is removed." So, we see again and again that cryptography is not much of a barrier, but they aren't breaking it by breaking the math.
  • 10:03 - 10:12
    They're breaking it by cheating, by going after the implementation, by stealing keys, by forging certificates
  • 10:12 - 10:22
    by doing all the non-cryptography things that we all know are important, but you don't get that mathematical benefit.
  • 10:22 - 10:33
    Just to take the paranoid side of this for a moment, how much of this do you think could turn out to be cognitive bias in our collecting?
  • 10:33 - 10:40
Did Mr. Snowden miss a whole other trove where the documents about crypto-breaking are?
  • 10:40 - 10:50
    That's certainly possible. We don't know a lot about how he collected and what he collected, but it's certainly possible.
  • 10:50 - 10:55
There may be documents about cryptanalysis that are completely separate, on separate networks, that he didn't have access to.
  • 10:55 - 11:07
    But what we are seeing is operational stuff. You'll see in the documents on BULLRUN, which is their program to subvert cryptography, it'll say
  • 11:07 - 11:18
    in several places, "don't speculate on how this works." They're talking to the analysts, the people doing the intelligence.
  • 11:18 - 11:31
    "You want to break into these circuits, we will do that for you. Don't speculate on how we're doing it. Just accept this windfall of data and be happy."
  • 11:31 - 11:42
But we do see, again and again, crypto stymieing them. The Tor story is really important. They [NSA] have seminars and workshops on "how do we break Tor?"
  • 11:42 - 11:53
    Again and again you see that they are unable to, that the cryptography is working, and that when they have breaks they're getting in around the edges--
  • 11:53 - 11:58
    attacking the user, they're trying to go after correlations.
  • 11:58 - 12:06
    There is something in the black budget, which we see the first pages of (and I think the Washington Post published those)
  • 12:06 - 12:17
there's a narrative by James Clapper, talking about what the NSA is doing. There is one sentence where he talks about cryptography--
  • 12:17 - 12:22
    I'm going to read the sentence because I think the wording is interesting.
  • 12:22 - 12:35
    He says, "We are investing in groundbreaking cryptanalytic capabilities to defeat adversarial cryptography and exploit Internet traffic."
  • 12:35 - 12:43
    So, that sentence doesn't sound like, "We got a bunch of smart people in a room and hope they get lucky."
  • 12:43 - 12:49
    That sounds like "We've got a piece of math but we need a bunch of engineering to make it work."
  • 12:49 - 13:03
    "We need a really big computer, a big database, we need some something that requires time and budget. Not new math."
  • 13:03 - 13:10
So, there's speculation that there is at least one piece of cryptanalysis that they have and we don't. Which is perfectly reasonable.
  • 13:10 - 13:20
    They tend to hire the top 10% of mathematicians every year. They go into the agency, they never come out. They get everything we do, we get nothing they do.
  • 13:20 - 13:27
There is going to be this differential in knowledge that will expand over the years.
  • 13:27 - 13:36
    Before this we all would say, "we think they're 10 years ahead of the state of the art." And we pull that number out of the air.
  • 13:36 - 13:43
    So there are speculations of what the NSA has. And these are just speculations; we know nothing.
  • 13:43 - 13:48
I'll give the three. The first is something about elliptic curves.
  • 13:48 - 13:58
Elliptic curves are a very complex area of mathematics used in public-key cryptography, and it is perfectly reasonable to believe that
  • 13:58 - 14:06
    the NSA has some ability to either break elliptic curves to a greater extent than we can, or to break certain classes of curves that we would
  • 14:06 - 14:16
    not be able to recognize. We know that the agency has tried to affect curve selection, which implies that there are some classes of curves
  • 14:16 - 14:23
they have an advantage over that we don't know about. The other reasonable assumption is general factoring.
  • 14:23 - 14:31
    If you look at the academic world, factoring gets better slowly every year-- a factor of 2 here, a factor of 10 there, a factor of 100 on a really good year.
  • 14:31 - 14:38
Every year it progresses. If you assume the NSA is 10 years ahead of the state of the art, you can do some math and you can assume they are
  • 14:38 - 14:45
    at some higher point than we are. And that'd be the sort of thing that would make sense from Clapper's quote.
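To make "do some math" concrete, here is the kind of back-of-the-envelope estimate he's gesturing at. The best public factoring algorithm is the general number field sieve; its heuristic cost formula is standard, but the "ten years ahead" interpretation below is an illustration, not anything from the documents:

```python
import math

def gnfs_cost(bits: int) -> float:
    # Heuristic GNFS work factor L_N[1/3, (64/9)^(1/3)], o(1) ignored.
    ln_n = bits * math.log(2)
    c = (64 / 9) ** (1 / 3)
    return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

# RSA-768 was factored publicly in 2009. By this estimate RSA-1024 is
# roughly a thousand times more work -- the kind of gap an agency that
# is "10 years ahead" in algorithms, hardware, and budget could close.
print(gnfs_cost(1024) / gnfs_cost(768))  # ~1.3e3
```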
  • 14:45 - 14:58
The third speculation is the RC4 algorithm, which is a symmetric cipher that has been teetering on the edge of "we can break it in academia" for quite a while.
  • 14:58 - 15:07
It was invented by Ron Rivest, who is actually the master of the too-good-to-be-true cipher: you can't imagine it being secure, yet you can't break it.
  • 15:07 - 15:11
    And maybe it's holding up, but maybe they have something.
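For context on "too good to be true": the entire RC4 keystream generator fits in a dozen lines, which is exactly why its longevity is so unnerving. A minimal sketch, for illustration only, not for production use:

```python
def rc4_keystream(key: bytes, n: int) -> bytes:
    # Key-scheduling algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA)
    i = j = 0
    out = bytearray()
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

# XOR the keystream with plaintext to encrypt; known statistical biases
# in the early keystream bytes are what keep RC4 "teetering".
```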
  • 15:11 - 15:20
Those are my guesses, but there is going to be at least one piece of math, and it will be extraordinarily hidden.
  • 15:20 - 15:33
Just like the names of the companies who are cooperating in BULLRUN, this stuff is ECI-- exceptionally controlled information-- pretty much never written down.
  • 15:33 - 15:39
    Now, sometimes you get lucky. And Snowden, because of his position, did get lucky.
  • 15:39 - 15:53
    I remember some of the early weeks of this, some congressman said a quote like, "Snowden couldn't have this data. You have to be on a list to get this data. He wasn't on the list!"
  • 15:53 - 16:01
    I'm listening to the person and I'm thinking, "no you don't understand, here's the man who typed the list into the computer. Don't you understand what root access means?"
  • 16:01 - 16:10
    Who has the most access to the secrets in a company? It's the janitorial staff, because they get access to everything.
  • 16:10 - 16:20
    The plumbers, the people who are doing the infrastructure, have enormous access, and that seems to be what happened.
  • 16:20 - 16:26
    Let's talk a little bit about the efforts against math. Standards corruption, for example.
  • 16:26 - 16:36
    We have one clearly documented case of NSA taking over a standards-making process, and choosing in the end, for NIST,
  • 16:36 - 16:42
    a random number generator which was not a random number generator exactly...
  • 16:42 - 16:51
How should we think about that process of not necessarily attacking the basic mathematics, but attacking how that math is applied
  • 16:51 - 16:58
in the standards process to produce weaknesses? How far should we take that?
  • 16:58 - 17:04
That's certainly worrying, something we're looking at over and over again. Note that the standard in question is a random number generator standard,
  • 17:04 - 17:11
    and if you are going to put a backdoor into a system, hacking a random number generator is a perfect place to do it.
  • 17:11 - 17:20
    You can do it imperceptibly. It doesn't affect the output at all. It's a really good place to put a secret backdoor.
  • 17:20 - 17:27
This is a case, again, where we knew. It was '06 or '07 when I wrote an essay saying
  • 17:27 - 17:32
    "Don't trust this random number generator because there could conceivably be a backdoor and here's how you'd put it in."
  • 17:32 - 17:41
    Again, there's no big surprise. It's one of four random number generators in the standard. The other three we think are good.
  • 17:41 - 17:49
    But this entered the standard and then it started being requested by governments, by US government contracts,
  • 17:49 - 17:56
    and it ended up as the default random number generator in some libraries, and no one is really sure how.
  • 17:56 - 18:06
So this is an example of how a hacked standard can slowly infiltrate the systems we're using.
  • 18:06 - 18:12
So now we're starting to look at everything else, and NIST, which is the government entity doing these standards,
  • 18:12 - 18:23
    who pretty much has been trusted, is coming under a lot of scrutiny, and they, I think rightfully, are very angry at the NSA for ruining their credibility.
  • 18:23 - 18:33
    I think a lot of the standards are still good. The AES is still strong. We have a new hash function standard that I'm happy with.
  • 18:33 - 18:40
    These were semi-public processes. These were not NSA-produced algorithms that became standards.
  • 18:40 - 18:48
These were open calls, and we in the community would look at them, and there was rough consensus, and NIST picked one.
  • 18:48 - 19:00
    It's possible that they're hacked, but I think it's really unlikely. I would more look at implementations.
  • 19:00 - 19:07
    We know that cellphone encryption-- and this is not just the NSA-- here you have an international cellphone standard
  • 19:07 - 19:18
and you've got three dozen countries that want to eavesdrop. So altogether there's pressure not to have real security in your cell networks.
  • 19:18 - 19:26
    So these are more overt, we know about them. I worry more about the private standards than the public standards.
  • 19:26 - 19:36
We have one example of an attempt to slip a backdoor into Linux, which we are almost positive is enemy action. We don't know who
  • 19:36 - 19:48
the enemy is in this case-- it could be anybody-- and we found it because we, I think, got lucky. So, certainly it is possible to slip backdoors into commercial software.
  • 19:48 - 19:54
    I worry less about the standards and more the private stuff we can't see.
  • 19:54 - 20:03
Do you think we're going to have to consider the possibility that the whole standardized family of ECC curves is one we should abandon?
  • 20:03 - 20:09
    I don't. I mean, they are curves that came from academia, they are curves that came from public processes.
  • 20:09 - 20:20
    Those are ones we can trust more. I would love to up our key lengths, for extra conservativeness, just because I'm now more leery-- especially with elliptic curves.
  • 20:20 - 20:27
    I think we just have to look at pedigree, and we have to mistrust things that can be tampered with.
  • 20:27 - 20:33
We know that the NSA has influenced curve selection, but we don't know how.
  • 20:33 - 20:43
We don't know if it's the NSA going to-- I'm making this up-- an engineer at some company and saying "here's some curves, why don't you suggest these?"
  • 20:43 - 20:53
We don't know if there's some vetting... we don't know what has happened. But I would like curves to be generated in an open, public manner.
  • 20:53 - 21:00
    I think for the world, if we're going to trust them in a global community, we have to do that.
  • 21:00 - 21:09
    We end up talking about the NSA, but this is not about the NSA. This is what any large nation state would do.
  • 21:09 - 21:17
    Snowden has given us some phenomenal insight into the NSA's activities in particular, but we know China uses a lot of these same techniques.
  • 21:17 - 21:29
    We know other countries do. This is going to be what cybercrime is going to look like in 3-5 years because technology democratizes.
  • 21:29 - 21:35
    We really need to get security for everyone against everything.
  • 21:35 - 21:45
    So, let's follow that up a little bit. We have talked so far, and most of the Snowden documents that we have seen publicly released so far have
  • 21:45 - 21:54
    talked primarily about passive listening activity of one sort or another-- hack, tap, steal, with respect to the backbones and the telecom networks.
  • 21:54 - 22:03
    But we haven't had much information about listener activity directed at subverting the security of individuals or businesses.
  • 22:03 - 22:12
    With the cookie-based material that started coming out 36 hours ago, things have begun to change a little bit about that.
  • 22:12 - 22:23
Have you seen things you can talk about that relate to how the American military listeners or others are directly subverting the security of individuals' computers?
  • 22:23 - 22:30
    So the first story on that was the Tor story from early October. And there, I wrote about two different programs
  • 22:30 - 22:40
    that the NSA uses for active attack. And there's been a great article in Foreign Policy on TAO, tailored access operations.
  • 22:40 - 22:46
    These are basically the NSA's black bag teams. So, we have seen these stories over the last several months.
  • 22:46 - 22:56
The two things I wrote about back in early October were QUANTUM and FOXACID. QUANTUM is an add-on to their eavesdropping platforms.
  • 22:56 - 23:03
    So, the NSA has large eavesdropping platforms on internet trunks, and these have names like TUMULT, TURBULENCE, and TURMOIL.
  • 23:03 - 23:12
    I'm not quite sure how those three relate to each other. They all begin with 'TU' so they seem to be a family. One is a superset of another.
  • 23:12 - 23:22
    TURMOIL seems to be the latest generation. TURMOIL is the device that gets the "firehose" and quickly decides
  • 23:22 - 23:29
    what needs further analysis, because the firehose is coming at you and you need to make very quick decisions about what to eavesdrop on.
  • 23:29 - 23:43
    So, sitting on TURMOIL is something called QUANTUM, and what QUANTUM does is it gives the NSA the ability to inject into the stream, to add bytes.
  • 23:43 - 23:51
    And this is what the NSA uses for things like packet injection. Packet injection injects packets into your data stream.
  • 23:51 - 23:56
Again, nothing new, this is how the Great Firewall of China works-- this is a hacker tool.
  • 23:56 - 24:05
    But if you're sitting on the backbone, if you're on the AT&T backbone, you can do some phenomenally interesting things with this.
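A toy model of why that position is so powerful, using plain DNS as the example (the latencies are made up; the real QUANTUM details are only partially public). The injector sees the query in flight, and an unauthenticated client simply believes whoever responds first:

```python
import random

def resolve(injector_ms: float, server_ms: float) -> str:
    # Plain DNS over UDP has no authentication: the client takes the
    # first well-formed answer and silently drops the late duplicate.
    return "injector" if injector_ms < server_ms else "server"

# The on-path injector is one hop from the victim; the real server is
# across the network, so the injector effectively always answers first.
wins = sum(
    resolve(random.uniform(1, 5), random.uniform(20, 100)) == "injector"
    for _ in range(10_000)
)
print(f"injector won the race in {wins / 100:.1f}% of lookups")
```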
  • 24:05 - 24:16
    You can do DNS hacking-- this is actually what China does for censorship. You can do frame injection, where you redirect users
  • 24:16 - 24:25
surreptitiously to other servers-- I'll get back to that later. We knew from the Tor story that there's something called QUANTUMCOOKIE.
  • 24:25 - 24:36
    We didn't really know how it worked until we just got the Cookie story from a few days ago, but the slide said "force users to divulge their cookies."
  • 24:36 - 24:46
    This is a way that the NSA would-- and think of this in terms of a Tor user, this is a user that is being anonymous on the network because of Tor--
  • 24:46 - 24:55
if you could force that user to divulge his cookies, if you've got a database of which cookies belong to whom, that de-anonymizes him.
  • 24:55 - 25:00
    So now we're seeing how these things link together. And there are other QUANTUM programs.
  • 25:00 - 25:12
Nicholas Weaver, who's at UC Berkeley, knows nothing about the documents but has written some great essays on how this works,
  • 25:12 - 25:18
because this is of course how it would work-- once you know about it, you start thinking about what you would do with this.
  • 25:18 - 25:26
    So we do know quite a lot about QUANTUM, and I think that's important. We also know about a program called FOXACID--
  • 25:26 - 25:34
    and by the way, the NSA has the coolest codenames in the world. If you're at lunch, you'd want to sit at the FOXACID table.
  • 25:34 - 25:40
Worst codename: EGOTISTICALGIRAFFE. You never want to sit with them. Ever.
  • 25:40 - 25:54
    So FOXACID is the NSA's multifaceted hacking tool. If you think about their problem, if you think about what they need to do,
  • 25:54 - 26:05
    they need to turn, basically, people off the street, into cyberwarriors, and the way they do that is not going to be through years of training-- that's expensive.
  • 26:05 - 26:14
    It's going to be through tools and procedural manuals, and automated/semi-automated ways to make hacking work.
  • 26:14 - 26:20
    And FOXACID-- if you know hacking tools, you know Metasploit?-- FOXACID is Metasploit with a budget.
  • 26:20 - 26:33
    So this is the server that, when you visit it, and you can be forced to visit it in many ways, one of them is through a QUANTUM-- we think it's called a
  • 26:33 - 26:45
    QUANTUM tip, QUANTUM inserts-- I mean, there are some codenames we don't understand. First the user is "tipped", tipped into FOXACID.
  • 26:45 - 26:52
    So you're visiting-- and I'm making this up-- Google, and the NSA sees you visit Google, and they do a frame injection,
  • 26:52 - 27:00
    and then some invisible packets go to the FOXACID server, which recognizes who you are through whatever systems they have,
  • 27:00 - 27:08
    and the server says "OK, it's this person." They're going to know if this person is high value vs. low value,
  • 27:08 - 27:17
    if this person is a sophisticated user vs. a naive user. And based on all these criteria, FOXACID will decide what exploit to serve.
  • 27:17 - 27:25
    ...and I forget the codename of the basic exploit, it's a cool codename too. Damn it...
  • 27:25 - 27:34
    And then if that works-- called a "shot"-- if that works then there's a series of other exploits that are run to figure out
  • 27:34 - 27:44
    "OK I'm on this computer, who is it? Where is it? What network is it on? What's it connected to? What's on it?"
  • 27:44 - 27:54
    And then we know there's a lot of specialized attack tools. There's a document that the French press published, and I don't know if they meant to but
  • 27:54 - 28:03
    at the bottom of this document is this glossary of codenames and attack codes. There was a special attack code for figuring out where the
  • 28:03 - 28:09
    geographical location is. There's a special attack code for jumping air gaps-- that was interesting. There's a special attack code for doing
  • 28:09 - 28:20
    various things you might want to do. So, we have quite a bit. We've seen nothing so far from US Cyber Command.
  • 28:20 - 28:30
And we don't know-- it's not public yet-- if the story is not yet out, or if the US Cyber Command documents were separate enough that
  • 28:30 - 28:38
    they're not in the trove. So, everything we're seeing is NSA and GCHQ-- that's the British counterpart. We're not seeing US Cyber Command,
  • 28:38 - 28:47
    which presumably does a lot more offensive operations. That's kind of their job. But the NSA does quite a lot, too.
  • 28:47 - 28:57
    We know that TAO will go in and steal keys. If there's a circuit they want to eavesdrop on but they can't break it, they'll go in and steal the key.
  • 28:57 - 29:13
    Let's just hang out a moment in this question of injection attacks from the backbone. That tip that put somebody's browsing activity
  • 29:13 - 29:24
    into a platform where they could try various exploits and see what's going on, that depended on a frame injection in the example you gave,
  • 29:24 - 29:30
which a browser could be smart enough to turn down altogether. If that browser were running NoScript, what would happen?
  • 29:30 - 29:46
It depends. NoScript is a really good way to deal with some of these, but in our normal browsing there are often quite a lot of redirects that don't all involve scripts.
  • 29:46 - 29:54
    It's not clear to me whether scripts are required for this attack. I think there's going to be some attacks where they are not.
  • 29:54 - 30:04
    You read the NSA documentation, they talk a lot about PSPs, personal security products, and these just piss them off ginormously.
  • 30:04 - 30:13
A lot of this action revolves around a couple of things-- there's also the fact that the Internet is very insecure out of the box,
  • 30:13 - 30:28
    and there is this sort of background radiation of script kiddies attacking things all the time, so when you are attacked it's the 30th time this millisecond, so what?
  • 30:28 - 30:39
    That will give an agency like the NSA, or somebody else, an enormous amount of cover. Because attacks happen so often.
  • 30:39 - 30:49
    There's certainly a lot of things we can do to make this much harder. Encrypting the backbone would do an enormous amount of good.
  • 30:49 - 30:57
You can't do frame injection in an SSL connection, because you can't see the frames. You can do a DNS redirect, and there are other things you can do,
  • 30:57 - 31:09
but there are things you can't do. Using the privacy tools we have, I think, gives us an enormous benefit. The fact that Tor works-- that might be
  • 31:09 - 31:23
    the biggest surprise we've seen so far-- that Tor does work. It's annoying to use, but it does work. That shows that a bunch of us can decide
  • 31:23 - 31:29
    that we're going to build a privacy tool that will defeat major governments. That's kind of awesome.
  • 31:29 - 31:35
--Kind of too awesome to be true.
  • 31:35 - 31:40
    You know, everything I've read tells me the NSA cannot break Tor. I believe the NSA cannot break Tor.
  • 31:40 - 31:46
When all the dust settles, how confident are you that they won't be able to break Tor? Is Tor going to be the exception, or are we going to be sitting there
  • 31:46 - 31:49
    saying, "the new GPG is also safe"?
  • 31:49 - 32:01
    I think most of the public domain privacy tools are going to be safe, yes. I think GPG is going to be safe. I think OTR is going to be safe.
  • 32:01 - 32:14
    I think that Tails is going to be safe. I do think that these systems, because they were not-- you know, the NSA has a big lever when
  • 32:14 - 32:28
a tool is written closed-source by a for-profit corporation. There are levers they have that they don't have in the open-source, international, altruistic community.
  • 32:28 - 32:41
And these are generally written by crypto-paranoids, so they're pretty well designed. We make mistakes, but we find them and we correct them,
  • 32:41 - 32:53
    and we're getting good at that. I think that the NSA is going after these tools, they're going after implementations. Everyone got their Microsoft
  • 32:53 - 33:01
    Update patches two days ago, you installed them, did that put a backdoor into your system? You have no idea. I mean, we hope not, we think not,
  • 33:01 - 33:10
    but we actually don't know. That's going to be a much more fruitful avenue of attack, and yes, you can actually break all of those tools that way.
  • 33:10 - 33:21
    Auto-update is great, but auto-update requires trust. But I think that the math and the protocols are fundamentally secure.
  • 33:21 - 33:31
So-- I will admit that I say this as a free software advocate-- I think that what you just said is: without Freedom Zero there is no freedom. If you can't read it, you can't trust it.
  • 33:31 - 33:35
    Is that where we're going to be when the dust settles?
    --I think it's where we always have been.
  • 33:35 - 33:45
    --But people didn't believe it?
--But we do believe it. We are all trusting the building codes here at Columbia University.
  • 33:45 - 33:56
    We don't think about "well, the roof could fall on our heads," but we are trusting. We're trusting the people around us, we're trusting all the tools we use,
  • 33:56 - 34:05
both tech and non-tech, but yes, we're trusting our hardware. I mean, a few days ago-- was it OpenBSD?-- they announced that they no longer
  • 34:05 - 34:15
trust the random number generator on the Intel chip. Not because we know it's broken, not because we have evidence that it's broken,
  • 34:15 - 34:24
    but because we know that Intel is susceptible, and if they were told "break your random number generator or we're not buying your stuff anymore," what are they going to do?
  • 34:24 - 34:36
And a researcher, whose name I forget right now, showed a really clever way to put a backdoor in a random number generator on a silicon chip that we would never in a million years find.
  • 34:36 - 34:46
    So we have a proven concept that it's possible, we have a company that could be susceptible, and we have mathematical fixes to this.
  • 34:46 - 34:54
    We can run the hardware outputs through an algorithm with some other input, and we know how to fix this.
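A minimal sketch of that fix, assuming a hypothetical hw_random() that reads the untrusted on-chip generator: hash its output together with an independent entropy source, so the result is unpredictable as long as either input is honest:

```python
import hashlib
import os

def hw_random(n: int) -> bytes:
    # Hypothetical stand-in for reading n bytes from the on-chip
    # generator (e.g., RDRAND); treat its output as possibly backdoored.
    return os.urandom(n)  # placeholder for the real hardware read

def mixed_random(n: int) -> bytes:
    # Hash untrusted hardware output together with independent OS
    # entropy: predicting the result requires compromising BOTH sources.
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(
            hw_random(32) + os.urandom(32) + counter.to_bytes(8, "big")
        ).digest()
        counter += 1
    return bytes(out[:n])
```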
  • 34:54 - 35:07
    So, we either have to trust them, or we have to do things to ensure we're still secure if they're not trustworthy, but we're still trusting
  • 35:07 - 35:18
those tools that are now fixing this. So in the end, you have to trust everyone up the chain-- the hardware, operating system, software, user, everything--
  • 35:18 - 35:30
-- to the room you're sitting in, which could have various listening devices-- and that's never going to change. In any technological society, you cannot examine everything.
  • 35:30 - 35:42
    You fundamentally must trust. This is why transparency of process is so important. We don't trust because we verify, we trust because we know someone else verified,
  • 35:42 - 35:47
    or a few people who mutually don't like each other have verified, and that sort of mechanism.
  • 35:47 - 35:53
Both Republicans and Democrats are counting the votes, therefore... that sort of thing.
  • 35:53 - 36:01
But in that, we would then say that the way that reduces out is: use software over hardware where you can,
  • 36:01 - 36:05
    and use software you can read over software that you can't.
    --Yes.
  • 36:05 - 36:13
    And so, we are pushing ourselves towards openness or freedom, depending on which word we happen to be using.
  • 36:13 - 36:18
And we're basically saying that hardware's definition in the 21st century is, "hardware is what the NSA is inside."
  • 36:18 - 36:21
    --Unless we have open source hardware, which we here talk about.
  • 36:21 - 36:28
    --Well, at that point we're going to have to go very far towards the chips themselves, aren't we?
    --That's right.
  • 36:28 - 36:40
We're not just talking about designs and layouts that are free to copy, modify, and reuse. We're talking about having to go from the masks up,
  • 36:40 - 36:42
    otherwise we wouldn't trust it.
  • 36:42 - 36:52
    Right, and the goal here is to reduce your trust footprint. I mean, I could trust 30 companies, if I could trust 5 that's better.
  • 36:52 - 36:59
Or, if I could figure out ways where I don't have to trust any one of them, but in order to break my security it has to be a collusion of two of them.
  • 36:59 - 37:03
    These things make it harder for the attacker.
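The simplest version of that "collusion of two" design is a two-of-two secret split: give one random-looking share to each provider, and neither alone learns anything. A sketch (a real system would add authentication and key management):

```python
import secrets

def split(secret: bytes) -> tuple[bytes, bytes]:
    # XOR secret sharing: each share alone is uniformly random noise.
    share_a = secrets.token_bytes(len(secret))
    share_b = bytes(x ^ y for x, y in zip(secret, share_a))
    return share_a, share_b

def combine(share_a: bytes, share_b: bytes) -> bytes:
    # Only someone holding BOTH shares can reconstruct the secret.
    return bytes(x ^ y for x, y in zip(share_a, share_b))

a, b = split(b"attack at dawn")
assert combine(a, b) == b"attack at dawn"
```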
  • 37:03 - 37:10
    OK, good. So I've got an apartment full of gear, like many of the people in this room, and there's a lot of boxes in there.
  • 37:10 - 37:20
    I think what I have learned from the documents I have seen so far and what I think they tell me about the context of the listeners I've always known--
  • 37:20 - 37:25
    maybe you agree with me about this-- is if I'm going to start distrusting some box in my apartment, I should start with my router.
  • 37:25 - 37:40
    I would. I believe-- and this story hasn't really been told, I think it will, I'm not sure where the details are-- that the routers, the network devices
  • 37:40 - 37:51
    are a much more fruitful avenue of attack than the computers. And I think we're just starting to see that. There have been a couple of stories in the past
  • 37:51 - 38:01
    couple of weeks about malware attacks against routers. The criminals are starting to notice this, but routers never get patched, basically.
  • 38:01 - 38:11
    They're running a 4 year old version of Linux, they've got a bunch of binary blobs around them for various device drivers, and they never ever get patched.
  • 38:11 - 38:24
Even if a patch were issued, you would have no idea how to install it. The margins are very slim, and the industry isn't really set up for security updates.
  • 38:24 - 38:36
They're always building the next thing. At a very small level, I think-- ignoring the NSA-- the next wave of cybercrime is going to come after these routers.
  • 38:36 - 38:46
We saw an attack on a point-of-sale system recently. There was a botnet that took over a gazillion routers in Brazil recently.
  • 38:46 - 38:58
    I think this is very much a danger for all of us. For the NSA, I think they've had better luck with the router companies.
  • 38:58 - 39:07
    I think this is very generational. You start to think about the history of the NSA and surveillance, and cooperating with US companies, telcos have
  • 39:07 - 39:16
been cooperating with the NSA since the NSA came into existence. My guess is this cooperation just carried through the Cold War,
  • 39:16 - 39:29
and afterwards it's no big deal for Level 3, or AT&T, or any telco company, or executive, or person to say, "Oh yeah, we give the NSA a copy, that's just what we do."
  • 39:29 - 39:38
    And that is a very different mentality that you'll get out of Google, or Microsoft, or Apple, or companies coming out of the computer space that
  • 39:38 - 39:47
    don't have this history of cooperation and collusion. The reactions you're getting from those companies are much more hostile.
  • 39:47 - 39:56
    "What do you mean you're doing this to us?" Not, "oh yeah we kinda assumed that, and we'll give you a room if you just ask. Don't be a stranger."
  • 39:56 - 40:04
    But isn't part of the outrage a result of the fact that they thought they had made deals as a result of which they weren't going to be troubled more?
  • 40:04 - 40:09
    They really just feel that the guys they bought didn't stay bought, right?
  • 40:09 - 40:21
You know, I'm not sure it's deals. Yes, I think it's a bit rich for the CEOs of Google to complain that the NSA is getting a copy of the data Google stole from you fair and square.
  • 40:21 - 40:31
    And certainly a lot of government surveillance piggy-backs on corporate surveillance. There's a whole story about cookies-- it's simply because these
  • 40:31 - 40:39
    companies want to identify you on the internet, and the NSA is just getting itself a copy.
--Right, the PREF cookie at Google, let's just
  • 40:39 - 40:47
    fingerprint all the browsers just in case we need all the browser fingerprints on Earth, and then, by god, that makes it easier to steal all the browser fingerprints.
  • 40:47 - 41:02
But these companies do have a huge PR problem. They did believe, I think, that the bulk of the NSA collection of their stuff came through the front door,
  • 41:02 - 41:10
    came through National Security Letters, came through subpoenas, came through warrants. I don't know it, but I assume Google has a room full of
  • 41:10 - 41:19
    lawyers that deal with the 30 or 50 countries that serve it with subpoenas or whatever they're called in that country, whether they're legal or not.
  • 41:19 - 41:32
    I believe these companies did think that that was primarily what the NSA was doing. I don't think they realized that was just a way to launder stuff they got surreptitiously previously.
  • 41:32 - 41:44
    I think a really important moral is that the NSA surveillance is robust. It's robust legally, it's robust technically, it's robust politically.
  • 41:44 - 41:54
I can name three different ways the NSA has access to your Gmail, under three different legal authorities.
  • 41:54 - 42:03
    And I worry about pending legislation in the United States that tends to focus on a particular program, or a particular authority,
  • 42:03 - 42:16
not realizing that they have backups and backups to backups. So, I do think that Google was legitimately surprised at the extent to which they were penetrated,
  • 42:16 - 42:27
    given that they were cooperating where they thought they had to. "We're giving you what you're asking for, under the extraordinarily draconian laws,
  • 42:27 - 42:36
    you mean you're getting it these other ways (plural) also? What, do you guys have money to burn?" "Yeah, we kinda do."
  • 42:36 - 42:46
    And so, the private dataminers who also have money to burn are gonna have to burn some making themselves more secure or people aren't going to use them?
  • 42:46 - 42:55
    We don't know. We're getting back to trust again. You are someone in some country somewhere and you've learned that the NSA is getting a copy of
  • 42:55 - 43:02
    everything. And Google has a press release saying, "Oh, we fixed that." Do you believe it? I sure don't. I think the companies have
  • 43:02 - 43:13
a serious problem right now with trust-- and this is an Internet problem-- the Internet used to be run as a basically benign U.S. dictatorship.
  • 43:13 - 43:23
    Under the assumption that the U.S. was generally behaving in the world's best interest. And I think that trust lost is a one-way function.
  • 43:23 - 43:36
We generally believed that Google, yeah, they were reading your Gmail and serving you ads, but that was it. And now that the cat's out of the bag, I'm not
  • 43:36 - 43:47
sure there's a way for these companies to convince the world that, "yes, we've contained the problem." That, "yes, we only give the NSA the data when they ask us with secret requests."
  • 43:47 - 43:56
    Which is the best they'll ever be able to say. I think this is why we're seeing these movements in Brazil and other countries that say,
  • 43:56 - 44:05
    "Now wait a second. We want this data in our country. There no longer exists these assurances you can give us."
  • 44:05 - 44:16
    Because maybe we've been deluding ourselves the past, you know, bunch of years-- and cloud computing is not going away for a whole bunch of other reasons--
  • 44:16 - 44:25
    I see coming some Internet balkanization, which I think is going to be very bad because a bunch of countries are going to be doing
  • 44:25 - 44:36
    way worse than we are. And a lot of countries are using our actions to justify their own actions. So, if this is fixed, I don't think it's coming from the companies.
  • 44:36 - 44:44
    I think it's coming from the tech community. It's coming from the IETF, it's coming from the open source movement,
  • 44:44 - 44:58
    it's coming from all the non-commercial entities that are going to try to build security back in. We'll never be able to trust Google, or Microsoft, or Apple,
  • 44:58 - 45:01
    or any of these companies ever again. I just don't think that's going to happen.
  • 45:01 - 45:13
    But most of the free communities that have been building crypto and security software, groups of hackers who, as you say, are knowledgeable
  • 45:13 - 45:22
    and extremely well motivated, they probably would have said 10 years ago, "Look, we are making software that we think creates security, but
  • 45:22 - 45:33
    if you're up against national means of intelligence, all bets are off." And now, if you're right about what we're going to be called upon to do, we're going to have to
  • 45:33 - 45:40
    raise our game substantially, because what you've really said is, "unless you're good against national means of intelligence, you're not good at all."
  • 45:40 - 45:51
    Yeah, but it's actually better than that. One of the things we've learned about the NSA is they might have more employees doing surveillance
  • 45:51 - 45:57
than the rest of the planet combined, and a bigger budget than the rest of the planet combined, but they are not made of magic.
  • 45:57 - 46:07
    They are subject to the same laws of mathematics, and physics, and economics that everyone else is. And what we've done is not--
  • 46:07 - 46:20
    the problem is we've made surveillance too cheap. We've made bulk surveillance too cheap. Fundamentally, if the NSA, or China,
  • 46:20 - 46:28
    or a dozen other countries I could name, or a bunch of really good hackers, want into your computer, they are in. Period.
  • 46:28 - 46:40
    We do not have the expertise, anywhere on this planet, to build that level of security. Right now, in the world, on computers, attack is much easier than defense.
  • 46:40 - 46:51
    But that's not what I'm trying to defend against. I'm trying to defend against bulk collection. And this is what we object to.
  • 46:51 - 47:01
If the Snowden documents revealed the NSA spied on the Taliban and North Korea, no one would care. If the NSA spied on Belgium--
  • 47:01 - 47:14
or, I guess, the UK spied on Belgium, which is like Connecticut spying on Nebraska-- that's the problem. It is easier to get everything than to target.
  • 47:14 - 47:23
    The economics are all wrong. Fixing the economics is a much more tractable problem, and something we can do.
  • 47:23 - 47:34
    So, it's not, "you need to be secure against the NSA." You need to be secure against NSA bulk collection, and that's an extremely important point.
  • 47:34 - 47:44
    If you're the financial industry, however, you might actually need to be secure against the NSA. Part of what is happening, it seems to me at the moment,
  • 47:44 - 47:54
    --and I'd be very interested to hear your view on this-- we're also living in a world after the end of money, where trust is all that sustains economic value.
  • 47:54 - 48:02
    Bars of gold have been replaced by bit streams signed by trusted parties. Signing is a cryptographic activity,
  • 48:02 - 48:10
    the consequence of which is that if we are to have the economics you are talking about, in a world where values are represented by digital
  • 48:10 - 48:21
    entities, signed by trusted parties using algorithms we believe in, there is actually, at the end of the day, a requirement to provide a level of
  • 48:21 - 48:29
    security in order to stave off chaotic risk in the world financial system, which it appears the American government has been deliberately undermining.
  • 48:29 - 48:38
    Isn't there a really hard choice out there for us now, about whether we're going to have security in the way the military listeners think
  • 48:38 - 48:42
about it, or are we going to have trust sufficient to run the world economic system?
  • 48:42 - 48:57
    I think that this is the fundamental choice that this whole story brings to light. A lot of people talk about this as "should the NSA be allowed to spy or not?"
  • 48:57 - 49:07
That's actually the wrong way to think about it. The way to think about it is: should we build an electronic infrastructure, an Internet, in the information age
  • 49:07 - 49:14
    where everyone is allowed to spy, or where nobody is. Do we choose surveillance or security,
  • 49:14 - 49:24
where security is defined not as "the NSA isn't listening" but as "nobody is listening." Because the NSA doesn't get the only ear.
  • 49:24 - 49:35
    It's the global financial industry, but it's everything else as well. And this is in the NSA's mission, the NSA has always had a dual mission:
  • 49:35 - 49:45
throughout the Cold War, it was to protect US communications and eavesdrop on Warsaw Pact communications.
  • 49:45 - 49:53
    That dual mission made a lot of sense during the cold war. You eavesdrop on the Soviet stuff and you protect the American stuff.
  • 49:53 - 50:07
    That fails when everyone starts using the same stuff. When the entire world uses TCP/IP, and Cisco routers, and Microsoft Windows, suddenly...
  • 50:07 - 50:12
    --Well, not the entire world...
    --Well, enough of the world, to a first approximation.
  • 50:12 - 50:26
    You now have a very real choice. You learn of a vulnerability against-- I'm making this up-- a Cisco router. You can use that vulnerability to
  • 50:26 - 50:33
    eavesdrop on the people you don't like, knowing full well that other people might discover that vulnerability and eavesdrop on you, or you can
  • 50:33 - 50:42
    close the vulnerability, reduce your ability to eavesdrop, and eliminate everyone else's ability to eavesdrop as well.
  • 50:42 - 50:51
    And maybe the financial industry is the tipping point for this, but I think we need to collectively recognize that it is in our
  • 50:51 - 50:59
    collective long-term interest to have actual security and not eavesdropping.
  • 50:59 - 51:03
    Does actual security imply anonymity?
    --Yes.
  • 51:03 - 51:12
And the destruction of anonymity has been pretty much their goal all the way along. Attribution is what they look for.
  • 51:12 - 51:18
    "Make it possible for us to attach an identity to every action."
    --Yes, and this is the metadata debate.
  • 51:18 - 51:28
When the first stories broke about Verizon and cellphone eavesdropping, one of the defenses was-- the President said this-- he said,
  • 51:28 - 51:38
    "Don't worry, it's all metadata. No one is listening to your conversations." I think this is an extremely... I don't know what word I want to use...
  • 51:38 - 51:45
    --Disingenuous.
    --Yeah, and it is. Because metadata equals surveillance.
  • 51:45 - 51:55
    And it's easy to understand this. Imagine you hired a private detective to eavesdrop on somebody. That detective would put a bug in
  • 51:55 - 52:04
    their home, and their car, and their office, and you would get a report of their conversations. That's what the data is. If you ask that same
  • 52:04 - 52:15
    detective to surveil somebody, you'd get a different report: where he went, who he spoke to, what he purchased, what he read. That's all metadata.
  • 52:15 - 52:25
    When the President says, "Don't worry, it's just metadata" I hear "Don't worry, you're all just under surveillance 24/7."
  • 52:25 - 52:35
Breaking anonymity is part of that, because it's one thing to know that this anonymous blob did these things. It's very different to
  • 52:35 - 52:43
attach a name to it or, if you can't do that, to establish continuity with other anonymous blobs by attaching a persistent pseudonym.
  • 52:43 - 52:54
    That's not just the goal of the NSA, that's the goal of Google. That's the goal of Facebook. When Google+ came up with a real names policy,
  • 52:54 - 53:06
    it was basically "we need to market to you better. We don't want anonymity on our system." When they're trying to tie your cellphone usage to your internet usage to
  • 53:06 - 53:14
    your real world usage, that's all about breaking anonymity. So it's for-profit and for-government. This is what's happening.
  • 53:14 - 53:24
    So, what is the sum of the economic thinking that lies behind the idea that we change the economics? It is obviously expensive to follow people.
  • 53:24 - 53:34
You've got to have someone out there tailing people, and they've got to know how not to get seen. But getting 5 billion cellphone location records a day... that's much simpler.
  • 53:34 - 53:44
    Isn't it a permanent economic fact that the way we live in the digital universe, following individual people is expensive and following everybody is much cheaper?
  • 53:44 - 53:55
    --Only if following everybody is cheap. And that's true because we have designed the cellphone system such that this location data
  • 53:55 - 54:01
    is transmitted in the clear, and easy to eavesdrop on. We could design a cellphone system that doesn't have that property.
  • 54:01 - 54:12
    We've designed an Internet economic architecture where surveillance is the fundamental business model. We could decide not to
  • 54:12 - 54:18
    design it that way.
    --Yes, but if you and I and everybody in this room who totally believes this goes and says
  • 54:18 - 54:25
    "we need to build an internet with anonymity built in from the beginning," it will be a complete political non-starter.
  • 54:25 - 54:34
    Because every policeman, every taxman, every other form of legitimate government agency on earth has now decided they can do a much better job
  • 54:34 - 54:38
governing us without anonymity, and they're never going back. Isn't that right?
  • 54:38 - 54:47
    --So, I tend to be long-term optimistic. I think that we as a species tend to solve these problems.
  • 54:47 - 54:58
    It might take us a generation, or two. We might have some pretty horrible world wars while we're doing it, but you know, the quote that actually
  • 54:58 - 55:07
    lets me sleep at night is Martin Luther King Jr. who says "the arc of history is long but bends towards justice." We do manage to have more
  • 55:07 - 55:19
    freedom, and more liberty, and more rights, century by century. Not year by year. So I do think that long term, wherever that is, we will have licked this.
  • 55:19 - 55:28
    --OK, but Martin Luther King can say that because his view of justice isn't path-dependent. His view of justice is it's absolute and it's always there.
  • 55:28 - 55:38
Technology, on the other hand, is path-dependent. When our friend Dan Geer at In-Q-Tel gave that talk on tradeoffs
  • 55:38 - 55:44
in cybersecurity that you and I both so admire, he said this is the last generation in which the human race gets a choice.
  • 55:44 - 55:52
    He's basically speaking to what you've just said. You said "if we have long enough we'll get this fixed" and he said "technology is path dependent
  • 55:52 - 56:01
    and once this is fastened on the human race it may not be unfastenable again, and we evolve forward from where we are in a dependent path."
  • 56:01 - 56:10
    So one of those lets me sleep and the other one keeps me awake, and between those two what you and I have to confront is our friends
  • 56:10 - 56:18
    out in the world who say "it's hopeless, there's nothing we can do," and "I'm not doing anything wrong, so why should I care?"
  • 56:18 - 56:25
    And those are the two arguments that we need to address. In the couple of minutes left to us before we open it up to all of these people,
  • 56:25 - 56:30
    what do you say to the people who say, "it's hopeless, there's nothing we can do"?
  • 56:30 - 56:36
    --I think there's a lot we can do. That's, I think, one of the most important morals from the Snowden documents, is that the NSA isn't
  • 56:36 - 56:45
made of magic, that they're not breaking cryptography anywhere near the extent that we kinda thought they were, that there are things we can do to make
  • 56:45 - 56:53
ourselves much more secure. I mean, if you are the one person they want, they're going to get in. But again, that's where the economics gives us leverage.
  • 56:53 - 56:59
Now they're back to tailing people individually. You've only got so many agents; you can only tail so many people.
  • 56:59 - 57:09
    If you eliminate the bulk, or make the bulk harder, or make us more able to hide in the noise, we are doing ourselves an enormous favor.
  • 57:09 - 57:15
    And if we give the tools to the dissidents around the world who are hiding from much worse regimes than we have,
  • 57:15 - 57:21
to do this, we are doing an enormous amount of good for the world. There are things we can do. It is nowhere near hopeless, and I think
  • 57:21 - 57:34
    we learned this again and again and again. And, the other half is "Why? I don't have anything to hide." The people who are speaking best to this
  • 57:34 - 57:45
    are the psychologists, who look at what it is like to live under constant gaze, or under the threat of-- that if you believe that you could
  • 57:45 - 57:55
    be watched at any moment, what does that do to you as a person? And what we learn is, it makes you different. It makes you more conformist.
  • 57:55 - 58:11
    It makes you less willing to think new thoughts or try new ideas. It stagnates society. It makes us all worse. Society improves because people dare to
  • 58:11 - 58:21
    think the unthinkable and then after 20 or 30 years everyone says, "well you know, that was kind of a good idea." It takes a while, but it has to start
  • 58:21 - 58:32
    with doing something that you don't want anyone else to know. So, it hurts us big and small. It hurts us in the big because society stagnates,
  • 58:32 - 58:43
    and it hurts us in the small because we are diminished as individuals, because we cannot fully be individuals. We have to be a member of the group.
  • 58:43 - 58:53
I mean, there are phenomenal writings, philosophical and psychological, that really look at how this works. It's a hard argument to make.
  • 58:53 - 59:06
The arguments on the other side are quite simple: "terrorists will kill your children." That's it. That argument pushes our very core buttons that will
  • 59:06 - 59:17
    make you scared. So I could spend an hour saying, "well this doesn't protect you from terrorism." That argument is happening at a higher intellectual
  • 59:17 - 59:36
    level than your fear. I'm going to lose that argument. So, the forces of surveillance are strong. This is an extremely difficult fight and I'm always amazed
  • 59:36 - 59:48
    at the resilience of our species to overcome intractable problems, to overcome futility. It amazes me again and again, and I'm not willing to
  • 59:48 - 59:57
count us out. It is possible that we've reached some theoretical limits here, and I could actually draw out that argument, that there's, you know, some Darwinian-level
  • 59:57 - 60:08
    limit in our species, that technology just makes bad things happen and we have no choice here. My guess is not, but it's going to require a lot of
  • 60:08 - 60:18
changing. I mean, the war has to end-- that's a phrase you used when we were talking earlier. If terrorists-- if General Alexander can get
• 60:18 - 60:28
in front of Congress and say, "if I had these powers I could have stopped 9/11," and no one looks at him and says, "you didn't stop Boston." And that was
  • 60:28 - 60:40
    one guy on a terrorist watch list, and the other guy with a sloppy Facebook trail. What are you talking about? We need that level of response.
  • 60:40 - 60:43
    But I'm still bullish on us.
  • 60:43 - 60:52
    --So, if I don't ask you someone else is going to ask you, I might as well save the time: what do you trust these days?
  • 60:52 - 61:05
    --So, I actually wrote an essay about that in The Guardian, and what do I trust? I trust OTR, I trust Tails, I trust GPG,
  • 61:05 - 61:19
I trust-- oh, what's the file encrypter-- TrueCrypt, which I consider the best of three bad alternatives. I do a few other things-- there's a file erasure program--
• 61:19 - 61:29
and I think they're all pretty good. But basically, I have an air-gapped computer I use for things I don't want on the Internet. And again, all these things we can
  • 61:29 - 61:38
    pick apart, but I'm just trying to make it harder.
    --We don't have to pick you apart, because other people will do that for us.
  • 61:38 - 61:48
    --If the NSA wanted me, I think they're in. If the FBI-- could the FBI get a warrant against my computer? Probably.
  • 61:48 - 61:59
    They haven't broken down my door yet.
    --[audience] That you know of...
    --That I know of, right. But what am I going to do?
  • 61:59 - 62:10
    I am not a nation state, I cannot protect my computer. My house is not tamper shielded, and it will never be tamper shielded. I will never have my computers
  • 62:10 - 62:20
in a secret-level [unintelligible] safe. I will never have guards; no one's home right now, and I won't be home for a couple of hours. So it is quite easy
  • 62:20 - 62:30
to grab an image of my hard drive. It is trivial to put a temporary receiver around, or to grab my keystrokes when I type in my password. If you are
  • 62:30 - 62:43
targeted, there's pretty much nothing you can do at that level. So, at some point you have to just say, "that's the way the world works" and you can't do anything.
  • 62:43 - 62:52
    But you can protect yourself against bulk surveillance, and that's largely what I'm trying to do. When I go in and out of the country right now,
  • 62:52 - 62:59
my secrets don't travel on my laptop. So when my laptop gets-- and it's interesting, I spend a lot of time now when I'm flying into the US erasing
• 62:59 - 63:07
all my free space, deleting data, encrypting archives, and all sorts of things. You spend a few hours doing it, you go through the US border and nothing happens,
  • 63:07 - 63:17
and you know, you're pissed off. I went through all this trouble, and you can't seize my laptop? What the hell are you guys doing? And I just know that after
• 63:17 - 63:24
four or five times I'll think, "I don't have to do this, they're not going to take my stuff..." and then they're going to take it. And security is a lot like that.
  • 63:24 - 63:33
    There's an essay I should write, on how hard it is to get opsec right. There's a nice story-- well there's a couple of stories--
  • 63:33 - 63:44
    there's a story of General Petraeus, and how his secret conversations were eavesdropped on. And also, the guy who was running Silk Road. And there's
  • 63:44 - 63:59
also a third one I'd use, from the Chinese hackers that Mandiant found. You have to be perfect: if you made a mistake sometime in the past 10 years,
  • 63:59 - 64:10
    your security has been compromised. And because there's never feedback, you never know. I can tell you that one time-- and I shouldn't say this--
  • 64:10 - 64:20
I spent a lot of time encrypting this archive: I had it encrypted, I zipped it, I encrypted it, I decrypted it, and encrypted it again just to make sure I had it right
  • 64:20 - 64:29
    because if you get it wrong the key doesn't work and you're screwed. I do this, and I throw the zips in the trash, I delete the trash, I erase the trash,
  • 64:29 - 64:42
    I go through the border, I come in, I open my computer, and I forgot to erase the originals... You never get any feedback as you do this.
  • 64:42 - 64:52
    You never know if you did it right. And it's easy to make a mistake, because security is always-- you never want to do security, it's always in the way.
  • 64:52 - 65:04
    "Oh, I have to remember when I use OTR to do the authentication step." It's not what I want to do, I want to talk to the guy. I have to remember--
  • 65:04 - 65:12
    and I'll do this: I'll close my laptop, boot it down, put in my USB stick, open up Tails, get it all ready and then "Oh, damn, the email address I wanted is on
  • 65:12 - 65:24
    the memory." I gotta close it all down, open it up again. It's always in the way. It's very easy to make a mistake. And the way the balance goes, if you make
  • 65:24 - 65:33
    a mistake, you're done. This makes it very hard. I'm not helping, am I?
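[A minimal sketch of the round-trip check Schneier describes-- encrypt, decrypt again to prove the passphrase works, and only then delete the plaintext. It drives the gpg command line from Python; the file names are hypothetical, and this is an illustration of the idea, not his actual workflow. The last step is exactly where his story goes wrong: a plain delete is not secure erasure, and nothing warns you if you skip it.]

```python
import filecmp
import os
import subprocess

ARCHIVE = "documents.zip"      # hypothetical plaintext archive
ENCRYPTED = ARCHIVE + ".gpg"
CHECK = ARCHIVE + ".check"

# Symmetric encryption; gpg prompts for a passphrase.
subprocess.run(["gpg", "--output", ENCRYPTED, "--symmetric", ARCHIVE], check=True)

# Decrypt to a scratch file to prove the passphrase round-trips.
subprocess.run(["gpg", "--output", CHECK, "--decrypt", ENCRYPTED], check=True)

# Remove the plaintext only if the decrypted copy is byte-identical.
if filecmp.cmp(ARCHIVE, CHECK, shallow=False):
    os.remove(CHECK)
    os.remove(ARCHIVE)  # a plain delete; secure erasure is a separate problem
    print("verified; plaintext removed")
else:
    raise SystemExit("round-trip failed; keeping the originals")
```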
  • 65:33 - 65:41
    --I felt very good, I was thinking to myself I need to design an exploit platform that gives positive reinforcement back to you for doing the
  • 65:41 - 65:47
    wrong thing, and pretty soon we'll have you trained to do the wrong thing--
    --And if it gets a cool codename, you're in.
  • 65:47 - 65:56
    --Absolutely. So, I think it's time we let some other people ask some questions.
  • 65:56 - 66:05
    --[audience] Bruce, thanks very much. I wanted to ask one question that's about the opposite side of bulk analysis, and that is about natural language
  • 66:05 - 66:12
    processing. What is your assessment of the state of it, and how much of a threat or problem do you think it really is? Because I've seen some of what's
  • 66:12 - 66:18
    going into it, and I wasn't particularly impressed, frankly.
    --Well, we don't know. There are a large number of
  • 66:18 - 66:25
    patents the NSA has in this area, so there's stuff in the public. My guess is they're extraordinarily good. This is something they've been working on
  • 66:25 - 66:34
since computers were invented, because this is not a new problem. Especially when you were dealing with radio, where you had to transcribe-- it was
• 66:34 - 66:44
the only thing you could do; the recording just couldn't keep up. So my guess is they are very good at natural language processing,
  • 66:44 - 66:56
and natural language translation. I would expect that most everything gets very quickly turned into text, that there are easy ways to annotate little
  • 66:56 - 67:08
    bits of voice, stuff you don't know that might need a person. And that voice printing is extremely advanced as well. Again, nothing is published in
  • 67:08 - 67:16
this, and I don't know if it will be. But I would expect this is an area they have devoted considerable resources to for a decade.
  • 67:16 - 67:19
    -- [audience] Are they better than Facebook, do you think, or Google?
    -- They would have to be. They've been doing this
  • 67:19 - 67:27
    for decades, and with way more budget. And again, it's going to be a one-way function. Anything that Google and Facebook can do is going to come out
  • 67:27 - 67:36
    of the academic community, and they're going to know about it. It's like cryptography: you have this-- information only flows in one direction, from the
  • 67:36 - 67:45
    academic community to the NSA. It never flows the other way. So the NSA can get the best of the world, plus what they have. They spend a lot of money
  • 67:45 - 67:51
    on linguists.
  • 67:51 - 68:00
-- [audience] I want to get your opinion about these third-party companies that are creating these commercial off-the-shelf products in order
• 68:00 - 68:14
to spy on and target people. I read this document "For Their Eyes Only: The Commercialization of Digital Spying" and at some point you posted something like
  • 68:14 - 68:18
    that on your blog, so I want to get your opinion on these companies, not only NSA but now these other people.
  • 68:18 - 68:35
    -- Right, I mean one problem-- I guess it is a problem-- a lot of government capabilities have corporate analogs. So we talk a lot
  • 68:35 - 68:41
    about surveillance, there's government surveillance and then there's corporate surveillance, and all these tools are being built for corporate surveillance,
  • 68:41 - 68:51
    some of it for legitimate reasons, some of it for reasons we may not like, and then this is also being used by governments.
  • 68:51 - 69:00
You know, propaganda tools-- we're seeing companies like Blue Coat and Sophos. These are commercial products being sold to
  • 69:00 - 69:09
    corporations that are also being sold to Syria to identify and arrest dissidents. A lot of these technologies are dual use, and I don't
  • 69:09 - 69:19
    think we can address one issue without also addressing the other. We cannot just say "governments can't do this and corporations can."
  • 69:19 - 69:31
    The tools lend themselves to abuse. And there's talk about putting a lot of these tools back under export control. Over the past month I've been seeing
  • 69:31 - 69:40
    more discussion about that. I'm not sure that it's possible anymore. It's a very different world than the 90s, when you actually could have export controls
  • 69:40 - 69:47
on cryptography, because everything was mailed around. It wasn't just downloaded. The connected international world makes it much harder. You end
• 69:47 - 70:01
up putting up national barriers like the Great Firewall of China, which works well but is also pretty porous. These are certainly important things to talk about,
  • 70:01 - 70:09
    the corporate analogs to these government tools.
    --I can't speak about the Snowden documents which Bruce has seen, and we're not ready at SFLC
  • 70:09 - 70:18
    to make any publications yet, but I can tell you for sure that there are national governments that have outsourced the process of penetrating
  • 70:18 - 70:28
    and listening to computer networks to commercial organizations whose contract work mixes government and commercial spying, but whose primary
  • 70:28 - 70:35
    bread and butter in this and other countries around the world is the conduct of governmental spying.
    --I think it's dangerous for us because
  • 70:35 - 70:43
    now you have an industry that is going to lobby. Just like you have a private prison industry lobbying for more draconian laws, you're going to have
  • 70:43 - 70:47
    a private surveillance industry lobbying for more surveillance, because more surveillance means more sales.
  • 70:47 - 71:04
--Well, wealthy database-making companies can be counted on to do that anyway, I should think.
  • 71:04 - 71:13
    -- [audience] Here's a real paranoid question for you. An important strategy in all espionage is the spread of disinformation. Is it possible with the spread
  • 71:13 - 71:22
    of the Snowden documents, or other types of leaks, that there is some tiny bit of disinformation there to make the community represented here trust
  • 71:22 - 71:32
    some type of technology that is actually vulnerable?
    --So that is actually the less paranoid version of that argument. The more paranoid one is
  • 71:32 - 71:42
    that Snowden is a government plant and that is all disinformation. You do hear that. I believe that is not true. I believe that Snowden is a legitimate
  • 71:42 - 71:54
whistleblower, that he has legitimate whistleblowing documents that he-- I guess, legitimately-- that he fair and square stole from the NSA and
  • 71:54 - 72:07
    went to China with, and that this is real, that this is not government disinformation. It would-- nah, it doesn't even pass the smell test. I do not
  • 72:07 - 72:17
    believe so.
    --It looks like we should hand the mic down.
    --Just throw it at them.
  • 72:17 - 72:22
    --If we owned it we would do that.
    --What could possibly go wrong with that?
    --[audience] Hi. I had a question about
  • 72:22 - 72:33
    the relationship between corporate and government surveillance. So, you were saying that one thing we could do to defend against the surveillance of
  • 72:33 - 72:40
the mobile phone network is that the location information could all be encrypted. But, the mobile phone companies actually need to know which
  • 72:40 - 72:48
    cell tower to send your signal to. So the mobile phone companies are going to know. They're in a position where they can't help but collude.
  • 72:48 - 72:55
    Do you have a vision for what a mobile phone-- what people here would consider to be a functional mobile phone network-- would look like that the government couldn't actually spy on?
  • 72:55 - 73:07
    --So I don't actually know. My guess is that it is possible, that you could have a distributed system that would hide location data from the central
  • 73:07 - 73:18
    nodes. And some of it is just leveraging small distribution. I think we were all much more secure when there were 100,000 ISPs than when there
  • 73:18 - 73:29
    were 100. That level of distribution-- again, the economics. We could force the NSA or the FBI to go after all of these companies rather than
  • 73:29 - 73:40
    just a few. So my guess is that there is a cell architecture that doesn't require centralized [unintelligible]. And just like in file sharing,
  • 73:40 - 73:49
    the original file sharing systems had a centralized network that knew who had what file. Those were gone after by the music industry, and then the
  • 73:49 - 73:56
    follow on systems were distributed. They were peer-to-peer. They didn't have that centralized command-and-control. We know how to do
  • 73:56 - 74:04
    this. The question is making it fast, making it scale. I'm not saying this is easy, but if we want to, yes, I think we can.
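[A toy sketch of the design point: replace one central index-- a single subpoena target-- with a ring of peers, each responsible for only a slice of the keyspace, in the style of a distributed hash table. The node names and lookup key are made up; this is not a real phone-network or file-sharing protocol.]

```python
import hashlib
from bisect import bisect_right

def key_id(s: str) -> int:
    # Map a name onto the ring by hashing it to a 64-bit integer.
    return int.from_bytes(hashlib.sha256(s.encode()).digest()[:8], "big")

class Ring:
    def __init__(self, node_names):
        # Each node sits on the ring at the hash of its own name.
        self.nodes = sorted((key_id(n), n) for n in node_names)

    def owner(self, key: str) -> str:
        # The first node clockwise from the key's position holds the record,
        # so no single operator ever sees the whole directory.
        ids = [i for i, _ in self.nodes]
        idx = bisect_right(ids, key_id(key)) % len(self.nodes)
        return self.nodes[idx][1]

ring = Ring([f"node{i}" for i in range(100)])
# To find where "alice" is registered, you must ask this one node--
# and it knows only its own slice of the keyspace.
print(ring.owner("alice"))
```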
  • 74:04 - 74:08
    --[audience] Do you know anyone who is working on that?
    --Not a soul.
  • 74:08 - 74:16
    --[audience] Hey, thanks guys. I've got two quick ones, maybe one for Eben and one for Bruce. One is, is it worth it to encrypt in a big corporate
  • 74:16 - 74:28
cloud, like Amazon or Rackspace, using their encryption? And two, what are your thoughts on Sibel Edmonds and Russ Tice and Cryptome, who were
  • 74:28 -
    just in a debate with Greenwald on Twitter.
    --I didn't follow the debate.
    --Nor I. Tell us about it.
  • Not Synced
--[audience] Well, basically, Sibel Edmonds was the FBI's [unintelligible] translator pre 9/11. She's wondering, "Where are all the documents?
  • Not Synced
Why are only 1% out, and what's going on with PayPal and Omidyar, and all that?"
  • Not Synced
--I think it turns out that starting a new media empire is harder than you think. That's my guess. It's just that things are happening slower than maybe people
  • Not Synced
    would like. You know, releasing documents is hard. There are legitimate secrets in there that you don't want released. There are, there really are.
  • Not Synced
    And it's good that the process is happening slowly and methodically, that a Wikileaks-style data dump would not be fun for anybody. So, that's good.
  • Not Synced
    And there's a lot there, and it's slow to look at, and all of these stories do end up with a negotiation with the government.
  • Not Synced
And this is the way journalism works. I didn't know this, but I got to see it: the reporters say, "We're releasing this story. Do you have
  • Not Synced
    anything that-- basically, do you mind?" And if the government says "Yes, don't release anything," of course no one is going to listen. But if they say
  • Not Synced
    "Look, this particular sentence, if you do this it will disrupt--," and there's a level of trust here, between the reporters and the government, and you know
  • Not Synced
    names are redacted, operational details are redacted. If the NSA is spying on North Korea and the Taliban we're not going to hear about it
  • Not Synced
because that would be a good thing. So, there is this long process, and figuring out what the stories and legitimate interests are is also a long process. So,
  • Not Synced
    this does take a long time. There's a lot of stuff.
    --[audience] People are also making a lot of money, though. That's kind of what's being conjectured.
  • Not Synced
    --You know-- the people who are doing it, are they? Is anyone reading this stuff except us?
    --[audience] Isn't Greenwald--
  • Not Synced
    --Greenwald does have a book deal, but there's way easier ways to make a living than living in exile. None of these people are making lots of money.
  • Not Synced
Laura Poitras-- or think of Bart Gellman at the Washington Post, and Ashkan Soltani who's working with him-- this is not a huge profit center.
  • Not Synced
    You would do way, way better calling up the Russian embassy and saying, "how much would you give me for the lot?"
  • Not Synced
    --[audience] I wanted to push back a little bit harder on that. You drew a dichotomy between, on the one hand, surveillance, and on the other hand,
  • Not Synced
security. And you've said that we have to choose between them, and between them you choose security. When you say there are legitimate secrets and
  • Not Synced
    there are operational details, those operational details are things that we would need in order to defend against this stuff.
  • Not Synced
    And if I have, for instance, a friend who is working on anti-censorship software, or on secrecy software, why would you not give my friend a copy of these
  • Not Synced
    documents so that my friend can actually make his or her software work?
  • Not Synced
    --So, the hope is that the documents your friend gets are enough, that what's eliminated is the name and the phone number of the guy who wrote the
  • Not Synced
    documents. Or, what's eliminated-- you'll see this in some of the documents-- they'll give a list of places we're eavesdropping on, and the IP addresses
  • Not Synced
    will be blacked out. So, my hope is, when I wrote the Tor story I wanted to give enough detail so that the people who design Tor, the people who are working
  • Not Synced
    on internet backbone security, had enough to figure out what the NSA is doing and to defend themselves. I didn't need to tell them-- and again, I'm making
  • Not Synced
    this up-- that FOXACID was implemented successfully against, you know, these guys in Yemen. Because that is not useful to the fixers. I tried very hard--
  • Not Synced
    and I think the Washington Post did as well, I'm very happy with their level of detail-- that it is enough for us to know what the vulnerabilities are, what the
  • Not Synced
capabilities are, how they're being used, the extent they're being used, and to give us the information we need, if we choose to, to fix them.
  • Not Synced
    Yes, I think this is-- in some ways it would be neat to say "here it all is," but it's just not going to happen. It just isn't. If it does, it will be a mistake,
  • Not Synced
like the Wikileaks release. It will be some confluence of bad things that shouldn't have happened.
  • Not Synced
--It's a little like ordinary vulnerability disclosure, isn't it, Dave? I mean, you might well want to tell people how to fix the problem without explaining which
  • Not Synced
    bank is vulnerable, right?
--[audience] Well, at a certain point, if the problems are not getting fixed-- and it seems like on a political
  • Not Synced
    level the problems are not getting fixed, because this is the other piece of it-- our ability to advocate for ourselves as citizens about what policies we
  • Not Synced
    do and do not want. The vulnerability is not being fixed, and at a certain point you do go public and say, "this is the vulnerability."
  • Not Synced
    --Well, I think Mr. Snowden has done that, and the likelihood that politics won't fix this is probably 1.0, no matter how many documents are disclosed.
  • Not Synced
    --Sad, but true.
    --I don't think politics is going to do it all by itself no matter what happens, precisely because I don't
  • Not Synced
think it's really going to turn out that the politics hinges on the technical details. It hinges, it seems to me, on what Bruce says it hinges on: whether people
  • Not Synced
    are going to buy arguments that they should be afraid. And we don't know how mad democracy is about that yet. Thanks to Mr. Snowden we're about
  • Not Synced
    to find out.
  • Not Synced
--[audience] Just want to ask a slightly more technical question. So you mentioned, basically, that open source systems are better in this case
• Not Synced
because we can look at them and see that there's no backdoor, and what immediately comes to mind for me is the Underhanded C Contest, or the
  • Not Synced
Trusting Trust attack-- these things where you can hide backdoors in systems that people are looking at. Give me ideas on how to defend against
• Not Synced
that sort of thing, make systems more auditable, etc.
--So, you can, but it's harder. The reason I like open source and free software and non-corporate is less
  • Not Synced
because you can look at it, and more because it is harder for someone to slip something in, because someone is looking at it. And yes, there's an
  • Not Synced
    Obfuscated C Contest, and if you showed up in the Unix kernel with C code that looked like it came from the Obfuscated C Contest, you'd be sent back
  • Not Synced
and be told to make it clearer.
    --[audience] Well, have you looked at OpenSSL recently?
  • Not Synced
    --I have not. Fair enough. I'm just trying to leverage the economics here, I want to make it harder, I want to increase the risk. Something that comes up
  • Not Synced
    again and again in the NSA documents is that they are amazingly risk-averse. They don't like risk. They don't want to take risks. They really take very safe
  • Not Synced
paths. And if you increase the risk, you're going to tip it to a point where they're not going to try, because the risk is too dangerous. And I think that's-- this is
  • Not Synced
    something that, without any legislation or any technical fixes, will change because of this summer, all these stories. And as amazing as it is,
  • Not Synced
the NSA, which has all sorts of contingencies for all sorts of things, never had a contingency for "all of our documents get released to the public."
  • Not Synced
    It took them, what, two, three months to get a PR firm with enough clearance to talk to. Now they have a blog, and a Twitter feed, and they respond quickly.
  • Not Synced
    But it took them a long time. So, when they were making decisions like, "should we eavesdrop on Belgium," they had all these benefits, costs, and
  • Not Synced
    risks, but the world finding out-- they never thought that was a possibility. Well that is now over. I think that, basically, every NSA operation from now on
  • Not Synced
is going to have, in big red letters underneath "should we do it or not?", "This is going to become public in three to five years."
  • Not Synced
    With high probability. "Are we OK with doing it?" And I think some things are just not going to happen, because the blowback has been real.
  • Not Synced
    --[audience] So you don't think that a Trusting Trust attack, a compiler--
    --It could, but you know that's not an easy attack.
  • Not Synced
    --[audience] And if it was discovered...
    --And if it's discovered, "Wow." Suddenly it's really bad. This has rocked the agency. This is not
  • Not Synced
    something they thought of.
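[For readers wondering what a defense against the Trusting Trust attack looks like: one published countermeasure is David A. Wheeler's "diverse double-compiling." A minimal sketch of its shape, with hypothetical compiler names and paths-- the point is the procedure, not a working harness.]

```python
import filecmp
import subprocess

def run(*cmd):
    subprocess.run(list(cmd), check=True)

# cc.c        -- the (audited) source of the compiler under suspicion
# suspect-cc  -- the distributed binary supposedly built from cc.c
# trusted-cc  -- an independent compiler, unlikely to share the same backdoor

# Stage 1: build the suspect compiler's source with the independent compiler.
run("trusted-cc", "-o", "stage1", "cc.c")

# Stage 2: use that result to rebuild the same source. If builds are
# deterministic, stage2 is the honest binary corresponding to cc.c.
run("./stage1", "-o", "stage2", "cc.c")

# A bit-identical suspect binary means it matches its source, so a
# compiler-resident backdoor would have to live in *both* toolchains.
if filecmp.cmp("suspect-cc", "stage2", shallow=False):
    print("suspect compiler matches its source")
else:
    print("mismatch: nondeterministic build, or something worse")
```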
    --But if I could just ask a technical question back, did I hear you say that it would be a really good idea
  • Not Synced
    if OpenSSL were rewritten to be clearer, and more modular, and easier for people to handle--
    --[audience] Absolutely, if you could rewrite the
  • Not Synced
    whole thing in Python, that would be--
    --I don't think we should necessarily expect it to get rewritten in Python. And I'm personally
  • Not Synced
    not sorry about that. But I'll hold out for rewriting it all in Perl if that'll make you feel better.
Title:
Snowden, the NSA, and Free Software - Bruce Schneier + Eben Moglen
Description:

A conversation with Bruce Schneier, hosted by Eben Moglen, at Columbia Law School NYC on December 12, 2013, about what we can learn from the Snowden documents, the NSA's efforts to weaken global cryptography, and how we can keep free software tools from being subverted. The talk was webcast live via the Internet Society Chapters Webcast Channel.

Audio: http://www.softwarefreedom.org/events/2013/a_conversation_with_bruce_schneier/
Download HD Video: https://archive.org/details/schneier

More Moglen: http://snowdenandthefuture.info/

More Schneier: https://www.schneier.com/

Video Sponsor: Internet Society New York Chapter http://isoc-ny.org

Webcast support: NYI http://nyi.net

Thanks: Software Freedom Law Center - http://www.softwarefreedom.org/

Video Language:
English
Duration:
01:31:45
