< Return to Video

#rC3 - What Price the Upload Filter?

  • 0:00 - 0:13
    rC3 preroll music
  • 0:13 - 0:18
    Herald: This is Ross Anderson, and he's
    giving a talk to us today, and the title
  • 0:18 - 0:24
    is What Price the Upload Filter? From Cold
    War to Crypto Wars and Back Again. And
  • 0:24 - 0:31
    we're very happy that he's here today. And
    for our non-English speaking public,
  • 0:31 - 0:36
    we have translations.
    speaks German
  • 0:36 - 0:41
    This talk is being translated into German.
    speaks French
  • 0:41 - 0:50
    This talk is also being translated
    into French.
  • 0:50 - 0:57
    Yeah. Um. Ross, ready to start? Let's go.
    Have a good time. Enjoy.
  • 0:57 - 1:10
    Ross: Yes, ready to go. Thanks. OK. As has
    been said, I'm Ross Anderson and I'm in
  • 1:10 - 1:14
    the position of being one of the old guys
    of this field and that I've been involved
  • 1:14 - 1:19
    in the crypto wars right from the start.
    And in fact, even since before the clipper
  • 1:19 - 1:28
    chip actually came out. If we could go to
    the slides, please.
  • 1:28 - 1:31
    Right, can we see the slides?
  • 1:31 - 2:10
    silence
  • 2:10 - 2:15
    surprised the U.S. armed
    forces. And guess what happened? Well, in
  • 2:15 - 2:21
    the 1950s, Boris Hagelin had set up that
    company, secretly sold it to the NSA and
  • 2:21 - 2:28
    for a number of years, quite a lot of years,
    countries as diverse as those in Latin America,
  • 2:28 - 2:33
    India and even NATO countries such as
    Italy were buying machines from Crypto AG,
  • 2:33 - 2:39
    which the NSA could decipher. And this had
    all sorts of consequences. For example,
  • 2:39 - 2:45
    it's been revealed fairly recently that
    Britain's success against Argentina in the
  • 2:45 - 2:51
    Falklands War in 1982 was to a large
    extent due to signals intelligence that
  • 2:51 - 2:59
    came from these machines. So, next slide,
    please. And in this prehistory of the
  • 2:59 - 3:04
    crypto wars, almost all the play was
    between governments. There was very little
  • 3:04 - 3:08
    role for civil society. There was one or
    two journalists who were engaged in trying
  • 3:08 - 3:14
    to map what the NSA and friends were up to.
    As far as industry was concerned, well, at
  • 3:14 - 3:18
    that time, I was working in banking and we
    found that encryption for confidentiality
  • 3:18 - 3:22
    was discouraged. If we tried to use line
    encryption, then faults mysteriously
  • 3:22 - 3:27
    appeared on the line. But authentication
    was OK. We were allowed to encrypt PIN
  • 3:27 - 3:32
    blocks. We were allowed to put
    MACs on messages. There was some minor
  • 3:32 - 3:37
    harassment. For example, when Rivest,
    Shamir and Adleman came up with their
  • 3:37 - 3:43
    encryption algorithm, the NSA tried to
    make it classified. But the Provost of
  • 3:43 - 3:48
    MIT, Jerome Wiesner, persuaded them not to
    make that fight. The big debate in the
  • 3:48 - 3:53
    1970s still, was whether the NSA affected
    the design of the data encryption standard
  • 3:53 - 3:58
    algorithm, and we know now that this was
    the case. It was designed to be only just
  • 3:58 - 4:04
    strong enough and Whit Diffie predicted
    back in the 1970s that 2 to the power of
  • 4:04 - 4:09
    56 key search would eventually be
    feasible. The EFF built a machine in 1998
  • 4:09 - 4:14
    and now of course that's fairly easy
    because each bitcoin block costs 2 to
  • 4:14 - 4:20
    the power of 68 calculations. Next slide,
    please. So where things get interesting is
  • 4:20 - 4:26
    that the NSA persuaded Bill Clinton in one
    of his first cabinet meetings in 1993 to
  • 4:26 - 4:30
    introduce key escrow, the idea that the
    NSA should have a copy of each of these
  • 4:30 - 4:37
    keys. And one of the people at that
    meeting admitted later that President Bush,
  • 4:37 - 4:41
    the elder, had been asked and had refused,
    but Clinton, when he came into office, was
  • 4:41 - 4:46
    naive and thought that this was an
    opportunity to fix the world. Now, the
  • 4:46 - 4:50
    clipper chip which we can see here, was
    tamper-resistant and had a secret block
  • 4:50 - 4:59
    cipher with an NSA backdoor key. And the
    launch product was an AT&T secure phone.
  • 4:59 - 5:05
    Next slide, please. Now the Clipper protocol
    was an interesting one in that each chip
  • 5:05 - 5:12
    had a unique secret key KU and a global
    secret family key KNSA burned in. And in
  • 5:12 - 5:18
    order to, say, send data to Bob, Alice had
    to send her Clipper chip a working key KW,
  • 5:18 - 5:23
    which was generated by some external means,
    such as a Diffie-Hellman key exchange. And
  • 5:23 - 5:28
    it made a Law Enforcement Access Field (LEAF),
    which was basically Alice and Bob's names
  • 5:28 - 5:34
    with the working key encrypted under the
    unit key and then a hash of the working
  • 5:34 - 5:38
    key encrypted under the NSA key. And that
    was sent along with the cipher text to make
  • 5:38 - 5:43
    authorized wiretapping easy. And the idea
    with the hash was that this would stop
  • 5:43 - 5:48
    cheating. Bob's Clipper Chip wouldn't use
    a working key unless it came with a valid
  • 5:48 - 5:54
    LEAF. And I can remember, a few of us can
    still remember, the enormous outcry that
  • 5:54 - 5:58
    this caused at the time. American
    companies in particular didn't like it
  • 5:58 - 6:02
    because they started losing business to
    foreign firms. And in fact, a couple of
  • 6:02 - 6:07
    our students here at Cambridge started a
    company, nCipher, that grew to be quite
  • 6:07 - 6:13
    large because they could sell worldwide,
    unlike US firms. People said, why don't we
  • 6:13 - 6:17
    use encryption software? Well, that's easy
    to write, but it's hard to deploy at
  • 6:17 - 6:23
    scale, as Phil Zimmermann found with PGP.
    And the big concern was whether key escrow
  • 6:23 - 6:29
    would kill electronic commerce. A
    secondary concern was: how on earth
  • 6:29 - 6:32
    will we know if government designs are
    secure? Why on earth should you trust the
  • 6:32 - 6:41
    NSA? Next slide, please. Well, the first
    serious fight back in the crypto wars came
  • 6:41 - 6:45
    when Matt Blaze at Bell Labs found an
    attack on Clipper. He found that Alice
  • 6:45 - 6:52
    could just try lots of LEAFs until one of
    them works, because the tag was only 16
  • 6:52 - 6:58
    bits long and it turned out that 2 to the
    power of 112 of the 2 to the power of
  • 6:58 - 7:04
    128 possibilities work. And this meant
    that Alice could generate a bogus LEAF
  • 7:04 - 7:08
    that would pass inspection, but which
    wouldn't decrypt the traffic, and Bob
  • 7:08 - 7:12
    could also generate a new LEAF on the fly.
    So you could write non-interoperable rogue
  • 7:12 - 7:17
    applications that the NSA had no access
    to. And with a bit more work, you could
  • 7:17 - 7:21
    make rogue applications interoperate with
    official ones. This was only the first of
  • 7:21 - 7:30
    many dumb ideas. Next slide, please. OK,
    so why don't people just use software?
  • 7:30 - 7:36
    Well, at that time, the US had export
    controls on intangible goods such as
  • 7:36 - 7:40
    software, although European countries
    generally didn't. And this meant that US
  • 7:40 - 7:46
    academics couldn't put crypto code online,
    although we Europeans could and we
  • 7:46 - 7:52
    did. And so Phil Zimmermann achieved fame
    by exporting PGP, Pretty Good Privacy, some
  • 7:52 - 7:56
    encryption software he had written, from
    America as a paper book. And this was
  • 7:56 - 8:01
    protected by the First Amendment. They
    sent it across the border to Canada. They
  • 8:01 - 8:04
    fed it into an optical character
    recognition scanner. They recompiled it
  • 8:04 - 8:09
    and the code had escaped. For this Phil
    was subjected to a grand jury
  • 8:09 - 8:14
    investigation. There was also the
    Bernstein case around code as free speech
  • 8:14 - 8:19
    and Bruce Schneier rose to fame with his
    book "Applied Cryptography", which had
  • 8:19 - 8:24
    protocols, algorithms and source code in C,
    which you could type in in order to get
  • 8:24 - 8:32
    cryptographic algorithms anywhere. And we
    saw export-controlled clothing. This
  • 8:32 - 8:36
    t-shirt was something that many people wore
    at the time. I've actually got one and I
  • 8:36 - 8:41
    planned to wear it for this. But
    unfortunately, I came into the lab in
  • 8:41 - 8:47
    order to get better connectivity and I
    left it at home. So this t-shirt was an
  • 8:47 - 8:53
    implementation of RSA written in Perl,
    plus a barcode so that you can scan it in.
  • 8:53 - 8:58
    And in theory, you should not walk across
    the border wearing this t-shirt. Or if
  • 8:58 - 9:03
    you're a US citizen, you shouldn't even
    let a non-US citizen look at it. So by
  • 9:03 - 9:09
    these means, people probed the outskirts of
    what was possible and, you know, an awful
  • 9:09 - 9:17
    lot of fun was had. It was a good laugh to
    tweak the Tyrannosaur's tail. Next slide.
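The talk doesn't reproduce the t-shirt's actual Perl, but as a rough illustration of how little code RSA needs — little enough to fit on clothing — here is a toy sketch in Python; the tiny textbook primes and the lack of padding are simplifying assumptions of mine, not the t-shirt's real code:

```python
# Toy RSA sketch: tiny primes, no padding -- illustrative only, not secure.

def egcd(a, b):
    # Extended Euclid: returns (g, x, y) with a*x + b*y == g.
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def make_keys(p, q, e=17):
    # Build a (public, private) key pair from two primes.
    n = p * q
    phi = (p - 1) * (q - 1)
    g, d, _ = egcd(e, phi)        # d is e's inverse mod phi
    assert g == 1, "e must be coprime to phi"
    return (n, e), (n, d % phi)

def crypt(key, m):
    # Both encryption and decryption are just modular exponentiation.
    n, exp = key
    return pow(m, exp, n)

pub, priv = make_keys(61, 53)     # n = 3233, the classic textbook example
c = crypt(pub, 42)                # encrypt
assert crypt(priv, c) == 42       # decrypt recovers the message
```

A few dozen lines like these were exactly the sort of "munition" the export rules tried to control.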
  • 9:17 - 9:25
    But this wasn't just something that was
    limited to the USA. The big and obvious
  • 9:25 - 9:30
    problem, if you try and do key escrow in
    Europe, is that there are dozens of
  • 9:30 - 9:36
    countries in Europe and what happens if
    someone from Britain, for example, has got
  • 9:36 - 9:40
    a mobile phone that they bought in France
    or a German SIM card and they're standing
  • 9:40 - 9:45
    on the streets in Stockholm and they phone
    somebody who's in Budapest, who's got a
  • 9:45 - 9:49
    Hungarian phone with the Spanish SIM card
    in it. Then which of these countries'
  • 9:49 - 9:54
    secret police forces should be able to
    listen to the call? And this was something
  • 9:54 - 10:00
    that stalled the progress of key escrow
    in Europe; that's a good way to describe it.
  • 10:00 - 10:07
    And in 1996 GCHQ got academic colleagues
    at Royal Holloway to come up with a
  • 10:07 - 10:12
    proposal for public sector email, which
    they believed would fix this. Now, at the
  • 10:12 - 10:19
    time, after Clipper had fallen into
    disrepute, the NSA's proposal was that
  • 10:19 - 10:24
    certification authorities should have
    to be licensed, and that this would enforce
  • 10:24 - 10:28
    a condition that all private keys would be
    escrowed, so you would only be able to get a
  • 10:28 - 10:34
    signature on your public key if the
    private key was held by the CA. And
  • 10:34 - 10:38
    the idea was that you'd have one CA for
    each government department and civilians
  • 10:38 - 10:42
    would use trusted firms like Barclays Bank
    or the post office, which would keep our
  • 10:42 - 10:48
    keys safe. And it would also work across
    other EU member states, so that somebody in
  • 10:48 - 10:54
    Britain calling somebody in Germany would
    end up in a situation where a trustworthy
  • 10:54 - 10:59
    CA, from the NSA's point of view, that is
    an untrustworthy CA from our point of view,
  • 10:59 - 11:04
    in Britain would be prepared to make a key
    and so would one in Germany. This, at
  • 11:04 - 11:11
    least, was the idea. So how do we do this,
    next slide, on the GCHQ protocol. So here's
  • 11:11 - 11:15
    how it was designed to work in the UK
    government. If Alice at the Department of
  • 11:15 - 11:20
    Agriculture wants to talk to Bob at the
    Department of Business, she asks her
  • 11:20 - 11:26
    Departmental Security Officer DA for a send
    key for herself and a receive key for Bob.
  • 11:26 - 11:35
    And DA and DB get a top level
    interoperability key KTAB from GCHQ and DA
  • 11:35 - 11:44
    calculates a secret send key of the day as
    a hash of KTAB and Alice's name and the
  • 11:44 - 11:50
    DA's own identity for Alice, which he gives
    to Alice and similarly a public receive
  • 11:50 - 11:55
    key of the day for Bob and Alice sends Bob
    her public send key along with the
  • 11:55 - 12:00
    encrypted message and Bob can go
    to his DSO and get his secret receive
  • 12:00 - 12:06
    key of the day. Now this is slightly
    complicated and there's all sorts of other
  • 12:06 - 12:11
    things wrong with it once you start to
    look at it. Next slide, please. The first
  • 12:11 - 12:15
    is that from the point of view of the
    overall effect, you could just as easily
  • 12:15 - 12:19
    have used Kerberos because you've
    basically got a key distribution center at
  • 12:19 - 12:25
    both ends, which knows everybody's keys. So
    you've not actually gained very much by
  • 12:25 - 12:31
    using complicated public key mechanisms,
    and the next problem is what's the law
  • 12:31 - 12:36
    enforcement access need for centrally
    generated signing keys? If this is
  • 12:36 - 12:40
    actually for law enforcement rather than
    intelligence? Well, the police want to be
  • 12:40 - 12:47
    able to read things, not forge things. A
    third problem is that keys involve hashing
  • 12:47 - 12:52
    department names and governments are
    changing the name of the departments all
  • 12:52 - 12:57
    the time, as the prime minister of the day
    moves his ministers around and they chop
  • 12:57 - 13:02
    and change departments. And this means, of
    course, that everybody has to get new
  • 13:02 - 13:06
    cryptographic keys and suddenly the old
    cryptographic keys don't work anymore. And
  • 13:06 - 13:11
    horrendous complexity comes from
    this. Now, there are about 10 other things
  • 13:11 - 13:15
    wrong with this protocol, but curiously
    enough, it's still used by the UK
  • 13:15 - 13:19
    government for the top secret stuff. It
    went through a number of iterations. It's
  • 13:19 - 13:24
    now called MIKEY-SAKKE; there are details in
    my Security Engineering book. And it
  • 13:24 - 13:28
    turned out to be such a pain that the
    stuff below top secret now just uses
  • 13:28 - 13:33
    a branded version of G Suite. So if what
    you want to do is to figure out what
  • 13:33 - 13:37
    speech Boris Johnson will be making
    tomorrow, you just have to guess the
  • 13:37 - 13:44
    password recovery questions for his
    private secretaries and officials. Next
  • 13:44 - 13:51
    slide, the global Internet Trust Register.
    This was an interesting piece of fun we
  • 13:51 - 13:56
    had around the 1997 election when Tony
    Blair took over and formed a Labour
  • 13:56 - 14:00
    government. Before the election, Labour
    had promised not to seize crypto keys in bulk
  • 14:00 - 14:04
    without a warrant. And one of the
    first things that happened to him once he
  • 14:04 - 14:10
    was in office was that Vice President Al Gore
    went to visit him and all of a sudden Tony
  • 14:10 - 14:13
    Blair decided that he wanted all
    certification authorities to be licensed
  • 14:13 - 14:19
    and they were about to rush this through
    parliament. So we put all the important
  • 14:19 - 14:23
    public keys in a paper book and we took it
    to the cultural secretary, Chris Smith,
  • 14:23 - 14:28
    and we said: you're the minister for books,
    why are you passing a law to ban this
  • 14:28 - 14:33
    book? And if you'll switch to the video
    shot, I've got the initial copy of the
  • 14:33 - 14:37
    book that we just put together on the
    photocopying machine in the department.
  • 14:37 - 14:42
    And then we sent the PDF off to MIT and
    they produced it as a proper book. And
  • 14:42 - 14:48
    this means that we had a book which was
    supposedly protected and this enabled us
  • 14:48 - 14:55
    to get the topic onto the agenda for
    cabinet discussion. So this at least forestalled
  • 14:55 - 15:00
    precipitous action, and we ended up with the
    Regulation of Investigatory Powers Bill in
  • 15:00 - 15:05
    2000. That was far from perfect, but that
    was a longer story. So what happened back
  • 15:05 - 15:10
    then is that we set up an NGO, a digital
    rights organization, the Foundation for
  • 15:10 - 15:17
    Information Policy Research. And the
    climate at the time was such that we had
  • 15:17 - 15:22
    no difficulty raising a couple of hundred
    thousand pounds from Microsoft and Hewlett
  • 15:22 - 15:28
    Packard and Redbus and other tech players.
    So we were able to hire Casper Bowden for
  • 15:28 - 15:32
    three years to basically be the director
    of FIPR and to lobby the government hard
  • 15:32 - 15:38
    on this. And if we can go back to the
    slides, please, and go to the next slide,
  • 15:38 - 15:47
    the slide on bringing it all together. So
    in 1997, a number of us, Hal Abelson and I
  • 15:47 - 15:55
    and Steve Bellovin and Josh Benaloh from
    Microsoft and Matt Blaze who had broken
  • 15:55 - 15:59
    Clipper and Whit Diffie, who invented
    digital signatures, and John Gilmore of
  • 15:59 - 16:06
    EFF, Peter Neumann of SRI, Ron Rivest,
    Jeff Schiller of MIT and Bruce Schneier
  • 16:06 - 16:09
    who had written "Applied Cryptography",
    got together and wrote a paper on the
  • 16:09 - 16:14
    risks of key recovery, key escrow and
    trusted third-party encryption, where we
  • 16:14 - 16:18
    discussed the system consequences of
    giving third party or government access to
  • 16:18 - 16:24
    both traffic data and content, without user
    notice or consent, deployed internationally
  • 16:24 - 16:28
    and available around the clock. We came to
    the conclusion that this was not really
  • 16:28 - 16:34
    doable. There were simply too many
    vulnerabilities and too many complexities.
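The kind of construction the paper was worried about is easier to see in code. Here is a minimal sketch of the Clipper-style LEAF described earlier, with stand-ins of my own: repeating-key XOR in place of the classified Skipjack cipher, a truncated SHA-256 in place of the real 16-bit checksum, and an invented field layout:

```python
# Minimal sketch of the Clipper LEAF idea -- NOT the real (classified)
# algorithms or field layout; XOR stands in for the Skipjack cipher.
import hashlib

def enc(key: bytes, data: bytes) -> bytes:
    # Stand-in cipher: repeating-key XOR (illustrative only).
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

def make_leaf(unit_id: bytes, k_unit: bytes,
              k_family: bytes, k_work: bytes) -> bytes:
    # LEAF = unit id + working key wrapped under the unit key,
    # plus a short checksum under the family (NSA) key.
    wrapped = enc(k_unit, k_work)
    tag = hashlib.sha256(k_family + k_work).digest()[:2]  # 16-bit tag
    return unit_id + wrapped + tag

def leaf_valid(leaf: bytes, k_family: bytes, k_work: bytes) -> bool:
    # Bob's chip can only check the 16-bit tag; it cannot see
    # whether 'wrapped' really contains the working key.
    return leaf[-2:] == hashlib.sha256(k_family + k_work).digest()[:2]

# Blaze's attack: with only a 16-bit tag, roughly 2^16 random candidate
# LEAFs yield one that verifies -- a LEAF that passes inspection but
# wraps garbage, so the escrow agent can decrypt nothing.
```

The structural flaw is visible at a glance: the receiving chip validates the short tag, not the escrowed payload.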
  • 16:34 - 16:39
    So how did it end? Well, if we go to the
    next slide, the victory in Europe wasn't
  • 16:39 - 16:44
    as a result of academic arguments. It was
    a result of industry pressure. And we owe
  • 16:44 - 16:49
    a debt to Commissioner Martin Bangemann
    and also to the German government who
  • 16:49 - 16:57
    backed him. And in 1994, Martin had put
    together a group of European CEOs to
  • 16:57 - 17:01
    advise him on internet policy. And they
    advised him: keep your hands off until
  • 17:01 - 17:04
    we can see which way it's going, what's
    just wrong with this thing, and what we
  • 17:04 - 17:11
    can do with it. And the thing that he
    developed in order to drive a stake
  • 17:11 - 17:15
    through the heart of key escrow was the
    Electronic Signatures Directive in 1999.
  • 17:15 - 17:20
    And this gave a rebuttable presumption of
    validity to qualifying electronic
  • 17:20 - 17:25
    signatures, but subject to a number of
    conditions. And one of these was that the
  • 17:25 - 17:30
    signing key must never be known to anybody
    other than the signer, and this killed
  • 17:30 - 17:37
    the idea of licensing CAs in such a way
    that the NSA had access to all the
  • 17:37 - 17:41
    private key material. The agencies had
    argued that without controlling
  • 17:41 - 17:45
    signatures, you couldn't control
    encryption. But of course, as intelligence
  • 17:45 - 17:49
    agencies, they were as much interested in
    manipulating information as they were in
  • 17:49 - 17:57
    listening in to it. And this created a
    really sharp conflict with businesses. In
  • 17:57 - 18:01
    the U.K., the Regulation of
    Investigatory Powers Bill went through the
  • 18:01 - 18:06
    following year. And there we got strong
    support from the banks who did not want
  • 18:06 - 18:11
    the possibility of intelligence and law
    enforcement personnel either getting hold
  • 18:11 - 18:16
    of bank keys or forging banking
    transactions. And so we managed, with
  • 18:16 - 18:21
    their help, to insert a number of
    conditions into the bill, which meant that
  • 18:21 - 18:26
    if a court or chief constable, for
    example, demands a key from a company,
  • 18:26 - 18:29
    they've got to demand it from somebody at
    the level of a director of the company.
  • 18:29 - 18:34
    And it's got to be signed by someone
    really senior such as the chief constable.
  • 18:34 - 18:39
    So there were some controls that we managed
    to get in there. Next slide! What did
  • 18:39 - 18:45
    victory in the USA look like? Well, in the
    middle of 2000, as a number of people had
  • 18:45 - 18:49
    predicted, Al Gore decided that he wanted
    to stop fighting the tech industry in
  • 18:49 - 18:54
    order to get elected president. And there
    was a deal done at the time which was
  • 18:54 - 19:01
    secret. It was done at the FBI
    headquarters at Quantico: US law
  • 19:01 - 19:05
    enforcement would rely on naturally
    occurring vulnerabilities rather than
  • 19:05 - 19:10
    compelling their insertion by companies
    like Intel or Microsoft. This was secret
  • 19:10 - 19:15
    at the time, and I happen to know about it
    because I was consulting for Intel and the
  • 19:15 - 19:21
    NDA I was under had a four-year time
    limit on it. So after 2004, I had the
  • 19:21 - 19:26
    ability to talk about this. And so this
    basically gave the NSA access to the CERT
  • 19:26 - 19:31
    feed. And so as part of this deal, the
    export rules were liberalized a bit, but
  • 19:31 - 19:38
    with various hooks and gotchas left so
    that the authorities could bully companies
  • 19:38 - 19:46
    who got too difficult. And in 2002, Robert
    Morris, senior, who had been the chief
  • 19:46 - 19:51
    scientist at the NSA at much of this
    period, admitted that the real policy goal
  • 19:51 - 19:55
    was to ensure that the many systems
    developed during the dot com boom were
  • 19:55 - 20:02
    deployed with weak protection or none. And
    there's a huge, long list of these. Next
  • 20:02 - 20:11
    slide, please. So what was the collateral
    damage from crypto war one? This is the
  • 20:11 - 20:15
    first new part of this talk, which
    I've put together as a result of spending
  • 20:15 - 20:21
    the last academic year writing the third
    edition of my book on security engineering
  • 20:21 - 20:27
    as I've gone through and updated all the
    chapters on car security, the role of
  • 20:27 - 20:33
    security and web security and so on and so
    forth, we find the damage everywhere. But there are
  • 20:33 - 20:38
    still very serious costs remaining from
    crypto war one, for example, almost all of
  • 20:38 - 20:43
    the remote keyless entry systems for cars use
    inadequate cryptography, weak random
  • 20:43 - 20:48
    number generators and so on and so forth.
    And car theft has almost doubled in the
  • 20:48 - 20:55
    past five years. This is not all due to
    weak crypto, but it's substantially due to
  • 20:55 - 21:01
    a wrong culture that was started off in
    the context of the crypto wars. Second,
  • 21:01 - 21:06
    there are millions of door locks still
    using Mifare Classic, even in the building
  • 21:06 - 21:12
    where I work. For example, the University
    of Cambridge changed its door locks around
  • 21:12 - 21:17
    2000. So we've still got a whole lot of
    Mifare Classic around. And it's very
  • 21:17 - 21:21
    difficult when you've got 100 buildings to
    change all the locks on them. And this is
  • 21:21 - 21:26
    the case with thousands of organizations
    worldwide, with universities, with banks,
  • 21:26 - 21:31
    with all sorts of people, simply because
    changing all the locks at once and dozens
  • 21:31 - 21:36
    of buildings is just too expensive. Then,
    of course, there's the CA in your
  • 21:36 - 21:41
    browser, most nations own or control
    certification authorities that your
  • 21:41 - 21:47
    browser trusts and the few nations that
    weren't allowed to own such CAs, such as
  • 21:47 - 21:53
    Iran, get up to mischief, as we find in
    the case of the DigiNotar hack a few years
  • 21:53 - 21:59
    ago. And this means that most nations have
    got a more or less guaranteed ability to
  • 21:59 - 22:06
    do man in the middle attacks on your Web
    log ons. Some companies like Google, of
  • 22:06 - 22:11
    course, started to fix that with various
    mechanisms such as certificate pinning.
  • 22:11 - 22:16
    But that was a deliberate vulnerability
    that was there for a long, long time and
  • 22:16 - 22:22
    is still very widespread. Phones. 2G is
    insecure. That actually goes back to the
  • 22:22 - 22:27
    Cold War rather than the crypto war. But
    thanks to the crypto wars, 4G and 5G are
  • 22:27 - 22:32
    not very much better. The details are
    slightly complicated and again, they're
  • 22:32 - 22:38
    described in the book, Bluetooth is easy
    to hack. That's another piece of legacy.
  • 22:38 - 22:43
    And as I mentioned, the agencies own the
    CERT's responsible disclosure pipeline,
  • 22:43 - 22:48
    which means that they got a free fire hose
    of zero days that they can exploit
  • 22:48 - 22:54
    for perhaps a month or three before these
    end up being patched. So next slide,
  • 22:54 - 23:03
    please. Last year when I talked at Chaos
    Communication Congress, the audience chose
  • 23:03 - 23:09
    this as the cover for my security
    engineering book, and that's now out. And
  • 23:09 - 23:13
    it's the process of writing this that
    brought home to me the scale of the damage
  • 23:13 - 23:18
    that we still suffer as a result of
    crypto war one. So let's move on to the
  • 23:18 - 23:25
    next slide and the next period of history,
    which we might call the war on terror. And
  • 23:25 - 23:31
    I've arbitrarily put this down as 2000 to
    2013, although some countries stopped using
  • 23:31 - 23:37
    the phrase war on terror in about 2008
    once we had got rid of George W. Bush and
  • 23:37 - 23:41
    Tony Blair. But as a historical
    convenience, this is, if you like, the
  • 23:41 - 23:46
    central period in our tale. And it starts
    off with a lot of harassment around the
  • 23:46 - 23:55
    edges of security and cryptography. For
    example, in 2000, Tony Blair promoted the
  • 23:55 - 24:02
    EU dual-use regulation number 1334 to
    extend export controls from tangible goods
  • 24:02 - 24:08
    such as rifles and tanks to intangibles
    such as crypto software. Despite the fact
  • 24:08 - 24:14
    that he had basically declared peace on
    the tech industry. Two years later, in
  • 24:14 - 24:18
    2002, the UK parliament balked at an
    export control bill that was going to
  • 24:18 - 24:24
    transpose this because it added controls
    on scientific speech, not just crypto
  • 24:24 - 24:29
    code, but even papers on cryptanalysis and
    even electron microscope scripts and
  • 24:29 - 24:33
    so parliament inserted a research
    exemption clause at the urging of the
  • 24:33 - 24:39
    then president of the Royal Society, Sir
    Robert May. But what then happened is that
  • 24:39 - 24:46
    GCHQ used EU regulations to frustrate
    Parliament and this pattern of extralegal
  • 24:46 - 24:52
    behavior was to continue. Next slide!
    Because after export control, the play
  • 24:52 - 24:57
    shifted to traffic data retention, another
    bad thing that I'm afraid to say, the UK
  • 24:57 - 25:03
    exported to Europe back in the days when
    we were, in effect, the Americans'
  • 25:03 - 25:09
    consigliere on the European Council. Sorry
    about that, folks, but all I can say is at
  • 25:09 - 25:16
    least we helped start EDRI a year after
    that. So one of the interesting aspects of
  • 25:16 - 25:21
    this was that our then home secretary,
    Jacqui Smith, started talking about the
  • 25:21 - 25:26
    need for a common database of all the
    metadata of who had phoned whom when, who
  • 25:26 - 25:31
    had sent an email to whom when, so that
    the police could continue to use the
  • 25:31 - 25:35
    traditional contact tracing techniques
    online. And the line that we got hammered
  • 25:35 - 25:40
    home to us again and again and again was
    if you've got nothing to hide, you've got
  • 25:40 - 25:47
    nothing to fear. What then happened in
    2008, is that a very bad person went into
  • 25:47 - 25:54
    Parliament and went to the PC where the
    expense claims of MPs were kept and they
  • 25:54 - 25:59
    copied all the expense claims onto a DVD
    and they sold it around Fleet Street. And
  • 25:59 - 26:03
    so The Daily Telegraph bought it from them
    for £400,000. And then for the best
  • 26:03 - 26:08
    part of a year, the Daily Telegraph was
    publishing scandalous stories about what
  • 26:08 - 26:12
    various members of parliament had claimed
    from the taxpayer. But it turned out that
  • 26:12 - 26:16
    Jacqui Smith herself may have been innocent.
    Her husband had been downloading
  • 26:16 - 26:21
    pornography and charging it to her
    parliamentary expenses. So she lost her
  • 26:21 - 26:26
    job as home secretary and she lost her
    seat in parliament and the communications
  • 26:26 - 26:33
    data bill was lost. So was this a victory?
    Well, in June 2013, we learned from Ed
  • 26:33 - 26:39
    Snowden that they just built it anyway,
    despite parliament. So maybe the victory
  • 26:39 - 26:43
    in parliament wasn't what it seemed to be
    at the time. But I'm getting ahead of
  • 26:43 - 26:52
    myself; anyway. Next slide, please. The
    other thing that we did in the 2000s is
  • 26:52 - 26:56
    that I spent maybe a third of my
    time, and about another hundred people
  • 26:56 - 27:01
    joined in, and we developed the economics of
    security as a discipline. We began to
  • 27:01 - 27:06
    realize that many of the things that went
    wrong happened because Alice was guarding
  • 27:06 - 27:11
    a system and Bob was paying the cost of
    failure. For example, if you've got a payment
  • 27:11 - 27:17
    system, then in order to prevent fraud,
    what you basically have to do is to get
  • 27:17 - 27:22
    the merchants, and the banks that buy
    transactions from them, to take care of
  • 27:22 - 27:26
    the costs of fraud, rather than the cardholders
    and the banks that issue them with cards.
  • 27:26 - 27:32
    And the two aren't the same. But it's this
    that causes the governance tensions and
  • 27:32 - 27:37
    causes governance to break down and makes
    fraud harder to stop than it should be. Now after
  • 27:37 - 27:42
    that, one of the early topics was
    patching and responsible disclosure. And
  • 27:42 - 27:45
    we worked through all the issues of
    whether you should not patch at all, which
  • 27:45 - 27:49
    some people in industry wanted to do, or
    whether you should just put all the bugs
  • 27:49 - 27:53
    on bug trackers which some hackers wanted
    to do or whether you would go through the
  • 27:53 - 27:57
    CERT system despite the NSA compromise,
    because they at least would give you legal
  • 27:57 - 28:04
    cover. And, you know, bully Microsoft into
    patching the bug the next Patch Tuesday
  • 28:04 - 28:10
    and then the disclosure after 90 days. And
    we eventually came to the conclusion, and
  • 28:10 - 28:16
    industry followed, that responsible
    disclosure was the way to go. Now, one of
  • 28:16 - 28:22
    the problems that arises here is the
    equities issue. Suppose you're the
  • 28:22 - 28:27
    director of the NSA and somebody comes to
    you with some super new innovative bug.
  • 28:27 - 28:33
    Say they have rediscovered Spectre,
    for example. And so you've got a bug which
  • 28:33 - 28:41
    can be used to penetrate any crypto
    software that's out there. Do you report
  • 28:41 - 28:46
    the bug to Microsoft and Intel to defend
    300 million Americans, or do you keep it
  • 28:46 - 28:51
    quiet so you can exploit 450 million
    Europeans and 1.4 billion Chinese
  • 28:51 - 28:55
    and so on and so forth? Well, once you put
    it that way, it's fairly obvious that the
  • 28:55 - 29:00
    NSA will favor attack over defense. And
    there are multiple models of attack and
  • 29:00 - 29:04
    defense. You can think of institutional
    factors and politics. For example, suppose you
  • 29:04 - 29:10
    are director of the NSA and you defend
    300 million Americans, defending the
  • 29:10 - 29:16
    White House against the Chinese hacking
    it. You know, the president will never
  • 29:16 - 29:20
    know if he's been hacked or not, because the
    Chinese will keep it quiet if they do. But
  • 29:20 - 29:25
    if, on the other hand, you manage to hack
    the Politburo LAN in Peking, you can put
  • 29:25 - 29:31
    some juicy intelligence every morning with
    the president's breakfast cereal. So
  • 29:31 - 29:37
    that's an even stronger argument for why
    you should do attack rather than defense.
  • 29:37 - 29:43
    And another thing I should mention in
    passing is that throughout the 2000s,
  • 29:43 - 29:47
    governments also scrambled to get more
    data on their citizens, for example, in
  • 29:47 - 29:52
    Britain there was a long debate about whether
    medical records should be centralized. In
  • 29:52 - 29:56
    the beginning, we said if you were to
    centralize all medical records, that would
  • 29:56 - 29:59
    be such a large target that the database
    would have to be top secret, and it would be too
  • 29:59 - 30:06
    inconvenient for doctors to use. Well,
    Blair decided in 2001 to do it anyway. We
  • 30:06 - 30:11
    wrote a report in 2009 saying that this
    was a red line and that this was a serious
  • 30:11 - 30:17
    hazard and then in 2014 we discovered that
    Cameron's buddy, who was the transparency
  • 30:17 - 30:22
    czar at the NHS, had sold the database to
    1200 researchers, including drug companies
  • 30:22 - 30:27
    in China. So that meant that all the
    sensitive personal health information
  • 30:27 - 30:31
    about a billion patient episodes had
    been sold around the world and was
  • 30:31 - 30:35
    available not just to medical
    researchers, but to foreign intelligence
  • 30:35 - 30:51
    services. This brings us on to Snowden. In
    June 2013, we had one of those game-
  • 30:51 - 30:57
    changing moments when Ed Snowden leaked a
    whole bunch of papers showing that the NSA
  • 30:57 - 31:02
    had been breaking the law in America and
    GCHQ had been breaking the law in Britain,
  • 31:02 - 31:06
    that we had been lied to, that parliament
    had been misled, and a whole lot of
  • 31:06 - 31:11
    collection and interception was going on,
    which supposedly shouldn't have been going
  • 31:11 - 31:16
    on. Now, one of the things that got
    industry attention was a system called
  • 31:16 - 31:22
    PRISM, which was in fact legal because
    this was done as a result of warrants
  • 31:22 - 31:28
    being served on the major Internet service
    providers. And if we could move to the
  • 31:28 - 31:34
    next slide, we can see that this started
    off with Microsoft in 2007. Yahoo! in
  • 31:34 - 31:38
    2008; they fought in court for a year, they
    lost and then Google and Facebook and so on
  • 31:38 - 31:45
    got added. This basically enabled the NSA
    to go to someone like Google and say
  • 31:45 - 31:50
    rossjanderson@gmail.com is a foreign
    national, we're therefore entitled to read
  • 31:50 - 31:55
    his traffic, kindly give us his Gmail. And
    Google would say, yes, sir. For Americans,
  • 31:55 - 31:58
    you have to show probable cause that
    they've committed a crime for foreigners
  • 31:58 - 32:06
    you simply have to show probable cause
    that they're a foreigner. The next slide.
  • 32:06 - 32:15
    This disclosure from Snowden revealed
    that PRISM, despite the fact that it only
  • 32:15 - 32:20
    cost about 20 million dollars a year, was
    generating something like half of all the
  • 32:20 - 32:27
    intelligence that the NSA was using by
  • 32:20 - 32:27
    the end of financial year 2012. But that
    was not all. Next slide, please. The thing
    that really annoyed Google was this slide
  • 32:33 - 32:39
    on the deck from a presentation at GCHQ
    showing how the NSA was not merely
  • 32:39 - 32:44
    collecting stuff through the front door by
    serving warrants on Google in Mountain
  • 32:44 - 32:49
    View, it was collecting stuff through the
    backdoor as well, because they were
  • 32:49 - 32:54
    harvesting the plaintext copies of Gmail
    and maps and docs and so on, which were
  • 32:54 - 32:59
    being sent backwards and forwards between
    Google's different data centers. And the
  • 32:59 - 33:05
    little smiley face, which you can see on
    the sticky, got Sergey and friends really,
  • 33:05 - 33:10
    really uptight. And they just decided,
    right, you know, we're not going to allow
  • 33:10 - 33:13
    this. They will have to knock and show
    warrants in the future. And there was a
  • 33:13 - 33:17
    crash program at all the major Internet
    service providers to encrypt all the
  • 33:17 - 33:25
    traffic so that in future things could
    only be got by means of a warrant. Next
  • 33:25 - 33:38
    slide, please. The EU was really annoyed
    by what was called Operation Socialist.
  • 33:38 - 33:50
    Operation Socialist was basically, the
    hack of Belgacom and the idea was that
  • 33:50 - 33:57
    GCHQ spearphished some technical staff at
    Belgacom and this enabled them to wiretap
  • 33:57 - 34:05
    all the traffic at the European Commission
    in Brussels and as well as mobile phone
  • 34:05 - 34:12
    traffic to and from various countries in
    Africa. And this is rather amazing. It's
  • 34:12 - 34:17
    as if Nicola Sturgeon, the first minister
    of Scotland, had tasked Police Scotland
  • 34:17 - 34:22
    with hacking BT so that she could watch
    out what was going on with the parliament
  • 34:22 - 34:31
    in London. So this annoyed a number of
    people. With the next slide, we can see
  • 34:31 - 34:40
    that Operation Bullrun, and Operation
    Edgehill, as GCHQ called their
  • 34:40 - 34:45
    version of it, were aggressive,
    multipronged efforts to break widely used
  • 34:45 - 34:50
    Internet encryption technologies. And we
    learned an awful lot about what was being
  • 34:50 - 34:56
    done to break VPNs worldwide and what had
    been done in terms of inserting
  • 34:56 - 35:02
    vulnerabilities into protocols, getting
    people to use vulnerable prime numbers for
  • 35:02 - 35:07
    Diffie Hellman key exchange and so on and
    so forth. Next slide, please.
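    An editor's aside on the vulnerable-primes point: Diffie-Hellman is
    only as strong as the discrete-logarithm problem in the group that
    everyone agrees to use. The sketch below is an illustrative toy, with
    a tiny made-up prime and made-up secrets, showing what a passive
    eavesdropper can do when the prime is far too small; the real attacks
    alluded to here, such as Logjam-style precomputation against widely
    shared 512-bit primes, are far more sophisticated, but the principle
    is the same:

    ```python
    # Toy demonstration of why weak Diffie-Hellman parameters matter.
    # All numbers here are illustrative and tiny; real DH needs primes of
    # 2048 bits or more, where this brute-force search is infeasible.

    p = 2039   # toy "safe" prime (p = 2*1019 + 1); far too small for real use
    g = 7      # generator

    # Alice and Bob pick secrets and exchange public values.
    a_secret, b_secret = 1234, 999
    A = pow(g, a_secret, p)        # Alice sends A = g^a mod p
    B = pow(g, b_secret, p)        # Bob sends B = g^b mod p
    shared = pow(B, a_secret, p)   # both sides derive g^(ab) mod p

    def brute_force_dlog(g, target, p):
        """Solve g^x = target (mod p) by exhaustive search.

        Feasible only because p is tiny; against a weak or widely shared
        real-world prime, attackers use far cleverer precomputation.
        """
        for x in range(p):
            if pow(g, x, p) == target:
                return x
        return None

    # A passive eavesdropper who only saw p, g, A and B recovers the key.
    x = brute_force_dlog(g, A, p)
    recovered = pow(B, x, p)
    assert recovered == shared
    ```

    The well-known defense is to use fresh, sufficiently large safe
    primes or standardized elliptic-curve groups, which is what the
    post-Snowden cleanup pushed vendors towards.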
  • 35:07 - 35:12
    The Bullrun and Edgehill SIGINT enabling
    projects actively engage the US and
  • 35:12 - 35:16
    foreign IT industries to covertly
    influence and/or overtly leverage their
  • 35:16 - 35:21
    commercial products' designs. These design
    changes make the systems in question
  • 35:21 - 35:25
    exploitable through SIGINT collection
    (endpoint, midpoint, et cetera) with
  • 35:25 - 35:28
    foreknowledge of the modification. To the
    consumer and other adversaries, however, the
  • 35:28 - 35:37
    systems' security remains intact. Next
    slide, please. So: insert vulnerabilities into
  • 35:37 - 35:41
    commercial encryption systems, IT systems, networks
    and endpoint communication devices used by
  • 35:41 - 35:49
    targets. Next slide. They also influence
    policies, standards and specifications for
  • 35:49 - 35:54
    commercial public key technologies, and
    this was the smoking gun that
  • 35:54 - 36:02
    crypto war 1 had not actually ended. It had
    just gone undercover. And so with this,
  • 36:02 - 36:08
    things came out into the open. Next slide,
    please. So we could perhaps date Crypto War 2 to
  • 36:08 - 36:13
    the Snowden disclosures and their aftermath.
    In America, it must be said that all three
  • 36:13 - 36:18
    arms of the US government showed at least
    mild remorse. Obama set up the NSA review
  • 36:18 - 36:24
    group and adopted most of what it said
    except on the equities issue. Congress cut
  • 36:24 - 36:28
    data retention, renewed the Patriot Act
    and the FISA court introduced an advocate
  • 36:28 - 36:33
    for targets. Tech companies, as I
    mentioned, started encrypting all their
  • 36:33 - 36:39
    traffic. In the UK on the other hand,
    governments expressed no remorse at all,
  • 36:39 - 36:43
    and they passed the Investigatory Powers
    Act to legalize all the unlawful things
  • 36:43 - 36:48
    they've already been doing. And they could
    now order firms to secretly do anything they
  • 36:48 - 36:57
    physically can. However, data retention
    was nixed by the European courts. The
  • 36:57 - 37:02
    academic response is on the next slide: Keys
    Under Doormats, by much the same authors as
  • 37:02 - 37:09
    before. We analyzed the new situation and
    came to much the same conclusions. Next
  • 37:09 - 37:15
    slide, the 2018 GCHQ
    proposals from Ian Levy and Crispin
  • 37:15 - 37:21
    Robinson proposed to add ghost users to
    WhatsApp and FaceTime calls in response to
  • 37:21 - 37:26
    warrants. The idea is that you've got an
    FBI key on your device, listening. You still
  • 37:26 - 37:30
    have end to end, so you just have an extra
    end. And this, of course, fails the Keys
  • 37:30 - 37:34
    Under Doormats tests. Your software would
    abandon best practice. It would create
  • 37:34 - 37:40
    targets and increase complexity and it
    would also have to lie about trust. Next
  • 37:40 - 37:49
    slide, please. This brings us to the
    upload filters which were proposed over
  • 37:49 - 37:56
    the past six months, they first surfaced
    in early 2020 at a Stanford think tank and
  • 37:56 - 38:01
    they were adopted by Commissioner Ylva
    Johansson on June the 9th at the start of
  • 38:01 - 38:06
    the German presidency. On the 20th of
    September we got a leaked tech paper whose
  • 38:06 - 38:12
    authors include our GCHQ friends Ian Levy
    and Crispin Robinson. The top options are
  • 38:12 - 38:18
    that you filter in client software
    assisted by a server, as client side only
  • 38:18 - 38:23
    filtering is too constrained and easy to
    compromise. The excuse is that you want to
  • 38:23 - 38:29
    stop illegal material such as child sex
    abuse images being shared over end to end
  • 38:29 - 38:34
    messaging system such as WhatsApp. Various
    NGOs objected, and we had a meeting with
  • 38:34 - 38:40
    the commission, which was a little bit
    like a Stockholm Syndrome event. We had
  • 38:40 - 38:44
    one official there on the child protection
    front fax by half a dozen officials from
  • 38:44 - 38:49
    various security bodies, departments and
    agencies who seemed to be clearly driving
  • 38:49 - 38:53
    the thing with child protection merely
    being an excuse to promote this lead.
  • 38:53 - 39:00
    Well, the obvious things to worry about
    are as a similar language in the new
  • 39:00 - 39:05
    terror regulation, you can expect the
    filter to extend from child sex abuse
  • 39:05 - 39:11
    material to terror. And static filtering
    won't work because if there's a bad list
  • 39:11 - 39:15
    of 100,000 forbidden images, then the bad
    people will just go out and make another
  • 39:15 - 39:23
    100,000 child sex abuse images. So the
    filtering will have to become dynamic. And
  • 39:23 - 39:27
    then the question is whether your phone
    will block it or report it. And there's an
  • 39:27 - 39:32
    existing legal duty in a number of
    countries, including the UK, although
  • 39:32 - 39:37
    obviously no longer a member state, to
    report terror stuff. And
  • 39:37 - 39:42
    the question is, who will be in charge of
    updating the filters? What's going to
  • 39:42 - 39:51
    happen then? Next slide. Well, we've seen
    an illustration during the lockdown in
  • 39:51 - 39:55
    April, the French and Dutch governments
    sent an update to all Encrochat mobile
  • 39:55 - 39:59
    phones with a rootkit which copied
    messages, crypto keys and lock screen
  • 39:59 - 40:04
    passwords. Encrochat was a brand of
    mobile phone that was sold through
  • 40:04 - 40:11
    underground channels to various criminal
    groups and others. And since this was
  • 40:11 - 40:18
    largely used by criminals of various
    kinds, the U.K. government justified bulk
  • 40:18 - 40:24
    intercept by passing it off as targeted
    equipment interference. In other
  • 40:24 - 40:29
    words, they obtained a targeted warrant for
    all forty five thousand Encrochat handsets
  • 40:29 - 40:33
    and of ten thousand users in the U.K.,
    eight hundred were arrested in June when
  • 40:33 - 40:40
    the wire tapping exercise was completed.
    Now, again, this appears to ignore the
  • 40:40 - 40:44
    laws that we have on the books because
    even our Investigatory Powers Act rules
  • 40:44 - 40:49
    out bulk interception of U.K.
    residents. And those who follow such
  • 40:49 - 40:53
    matters will know that there was a trial
    at Liverpool Crown Court, a hearing of
  • 40:53 - 40:59
    whether this stuff was admissible. And we
    should have a first verdict on that early
  • 40:59 - 41:05
    in the new year. And that will no doubt go
    to appeal. And if the material is held to
  • 41:05 - 41:10
    be admissible, then there will be a whole
    series of trials. So this brings me to my
  • 41:10 - 41:17
    final point. What can we expect going
    forward? China is emerging as a full-stack
  • 41:17 - 41:22
    competitor to the West, not like Russia in
    Cold War one, because Russia only ever
  • 41:22 - 41:27
    produced things like primary goods, like
    oil and weapons, and trouble, of course. But
  • 41:27 - 41:31
    China is trying to compete all the way up
    and down the stack from chips, through
  • 41:31 - 41:36
    software, up through services and
    everything else. And developments in China
  • 41:36 - 41:41
    don't exactly fill one with much
    confidence, because in March 2018,
  • 41:41 - 41:45
    President Xi declared himself to be ruler
    for life, basically tearing up the Chinese
  • 41:45 - 41:50
    constitution. There are large-scale state
    crimes being committed in Tibet and
  • 41:50 - 41:55
    Xinjiang and elsewhere. Just last week,
    Britain's chief rabbi described the
  • 41:55 - 42:04
    treatment of Uyghurs as an unfathomable
    mass atrocity. In my book, I describe
  • 42:04 - 42:09
    escalating cyber conflict and various
    hacks, such as the hack of the Office of
  • 42:09 - 42:15
    Personnel Management, which had clearance
    files on all Americans who work for the
  • 42:15 - 42:21
    federal government, the hack of Equifax,
    which got credit ratings and credit
  • 42:21 - 42:26
    histories of all Americans. And there are
    also growing tussles in standards. For
  • 42:26 - 42:33
    example, the draft ISO 27553 on biometric
    authentication for mobile phones is
  • 42:33 - 42:38
    introducing, at the insistence of Chinese
    delegates, a central database option. So
  • 42:38 - 42:43
    in future, your phone might not verify
    your faceprint or your fingerprint
  • 42:43 - 42:50
    locally. It might do it with a central
    database. Next slide, how could Cold War
  • 42:50 - 42:57
    2.0 be different? Well, there's a number
    of interesting things here, and the
  • 42:57 - 43:01
    purpose of this talk is to try and kick
    off a discussion of these issues. China
  • 43:01 - 43:06
    makes electronics, not just guns, the way
    the old USSR did. Can you have a separate
  • 43:06 - 43:14
    supply chain for China and one for
    everybody else? But hang on a minute,
  • 43:14 - 43:20
    consider the fact that China has now
    collected very substantial personal data
  • 43:20 - 43:25
    sets. From the Office of Personnel
    Management, data on US government employees;
  • 43:25 - 43:32
    by forcing Apple to set up its own data
    centers in China for iPhone users in
  • 43:32 - 43:39
    China, they get access to all the data
    for Chinese users of iPhones that America
  • 43:39 - 43:45
    gets for American users of iPhones, plus
    maybe more as well, if the Chinese can
  • 43:45 - 43:51
    break the HSMs in Chinese data centers, as
    we expect them to be able to. Equifax got
  • 43:51 - 43:57
    them data on all economically active
    people in the USA. care.data gave them
  • 43:57 - 44:02
    medical records of everybody in the UK.
    And this bulk personal data is already
  • 44:02 - 44:08
    being used in targeted intelligence work. When
    Western countries, for example, send
  • 44:08 - 44:14
    diplomats to countries in Africa or Latin
    America, the local Chinese counter-
  • 44:14 - 44:17
    intelligence people know whether they're
    bona fide diplomats or whether they're
  • 44:17 - 44:22
    intelligence agents undercover, all
    from exploitation of all this personal
  • 44:22 - 44:26
    information. Now, given that this
    information's already in efficient, targeted
  • 44:26 - 44:32
    use, the next question we have to ask is
    when will it be used at scale? And this is
  • 44:32 - 44:37
    the point at which we say that the
    equities issue now needs a serious rethink
  • 44:37 - 44:44
    and the whole structure of the conflict is
    going to have to move from more offensive
  • 44:44 - 44:50
    to more defensive because we depend on
    supply chains to which the Chinese have
  • 44:50 - 44:55
    access more than they depend on supply
    chains to which we have access. Now, it's
  • 44:55 - 45:01
    dreadful that we're headed towards a new
    Cold War, but as we head there, we have to
  • 45:01 - 45:06
    ask also about the respective roles of
    governments, industry, civil society and
  • 45:06 - 45:14
    academia. Next slide, please. And so
    looking forward, my point is this: if Cold
  • 45:14 - 45:19
    War 2.0 does happen, and I hope it doesn't,
    but we appear to be headed that way
  • 45:19 - 45:24
    despite the change of government in the
    White House, then we need to be able to
  • 45:24 - 45:31
    defend everybody, not just the elites. Now,
    it's not going to be easy because there
  • 45:31 - 45:35
    are more state players, the USA is a big
    block, the EU is a big block. There are
  • 45:35 - 45:40
    other players: other democracies, other
    non-democracies, and other failing
  • 45:40 - 45:45
    democracies. This is going to be complex
    and messy. It isn't going to be a
  • 45:45 - 45:50
    situation like last time where big tech
    reached out to civil society and academia
  • 45:50 - 45:56
    and we could see a united front against
    the agencies. And even in that case, of
  • 45:56 - 46:01
    course, the victory that we got was only
    an apparent victory, a superficial victory
  • 46:01 - 46:06
    that's only lasted for a while. So what
    could we do? Well, at this point, I think
  • 46:06 - 46:11
    we need to remind all the players that
    it's not just about strategy
  • 46:11 - 46:16
    and tactics, but about values, too.
    And so we need to be firmly on the side of
  • 46:16 - 46:21
    freedom, privacy and the rule of law. Now,
    for the old timers, you may remember that
  • 46:21 - 46:30
    there was a product called Tom-Skype,
    which was introduced in 2011 in China. The
  • 46:30 - 46:34
    Chinese wanted the citizens to be able to
    use Skype, but they wanted to be able to
  • 46:34 - 46:38
    wiretap as well, despite the fact that
    Skype at the time had end to end
  • 46:38 - 46:45
    encryption. And so people in China were
    compelled to download a client for Skype
  • 46:45 - 46:50
    called Tom-Skype. Tom was the company that
    distributed Skype in China and it
  • 46:50 - 46:55
    basically had built-in wiretapping. So
    you had end to end encryption using Skype
  • 46:55 - 47:01
    in those days. But in China, you ended up
    having a Trojan client, which you had to
  • 47:01 - 47:08
    use. And what we are doing at the moment
    is basically the EU is trying to copy Tom-
  • 47:08 - 47:13
    Skype and saying that we should be doing
    what China was doing eight years ago. And
  • 47:13 - 47:18
    I say we should reject that. We can't
    challenge President Xi by going down that
  • 47:18 - 47:22
    route. Instead, we've got to reset our
    values and we've got to think through the
  • 47:22 - 47:28
    equities issue and we've got to figure out
    how it is that we're going to deal with
  • 47:28 - 47:33
    the challenges of dealing with non-
    democratic countries when there is serious
  • 47:33 - 47:41
    conflict in a globalized world where we're
    sharing the same technology. Thanks. And
  • 47:41 - 47:52
    perhaps the last slide for my book can
    come now and I'm happy to take questions.
  • 47:52 - 47:58
    Herald: Yeah, thanks a lot, Ross, for your
    talk. It's a bit depressing to listen to
  • 47:58 - 48:10
    you. I have to admit let's have a look.
    OK, so I have a question. I'm wondering if
  • 48:10 - 48:15
    the export controls at EU level became
    worse than UK level export controls
  • 48:15 - 48:21
    because entities like GCHQ had more
    influence there or because there's a harmful
  • 48:21 - 48:27
    Franco German security culture or what it
    was. Do you have anything on that?
  • 48:27 - 48:31
    Ross: Well, the experience that we had
    with these export controls, once they were
  • 48:31 - 48:38
    in place, was as follows. It was about
    2015 I think, or 2016, when it came to our
  • 48:38 - 48:44
    attention that a British company, Sophos,
    was selling bulk surveillance equipment to
  • 48:44 - 48:49
    President al Assad of Syria, and he was
    using it to basically wiretap his entire
  • 48:49 - 48:54
    population and decide who he was going to
    arrest and kill the following day. And it
  • 48:54 - 48:59
    was sold by Sophos in fact, through a
    German subsidiary. And so we went along to
  • 48:59 - 49:07
    the export control office in Victoria
    Street. A number of NGOs, the Open Rights
  • 49:07 - 49:12
    Group went along, and Privacy International
    and us and one or two others. And we said,
  • 49:12 - 49:16
    look, according to the EU dual use
    regulation, bulk intercept equipment is
  • 49:16 - 49:20
    military equipment. It should be in the
    military list. Therefore, you should be
  • 49:20 - 49:25
    demanding an export license for this
    stuff. And they found every conceivable
  • 49:25 - 49:34
    excuse not to demand it. And it was the
    lady from GCHQ there in the room who was
  • 49:34 - 49:38
    clearly calling the shots. And she was
    absolutely determined that there should be
  • 49:38 - 49:44
    no export controls on the stuff being sold
    to Syria. And eventually I said, look,
  • 49:44 - 49:47
    it's fairly obvious what's going on here.
    If there's going to be black boxes and
  • 49:47 - 49:51
    President al-Assad's network, you want
    them to be British black boxes or German
  • 49:51 - 49:56
    black boxes, not Ukrainian or Israeli
    black boxes. And she said, I cannot
  • 49:56 - 50:01
    discuss classified matters in an open
    meeting, which is as close as you get to
  • 50:01 - 50:07
    an admission. And a couple of months
    later, Angela Merkel, to her great credit,
  • 50:07 - 50:13
    came out in public and said
    that allowing the equipment to be exported
  • 50:13 - 50:16
    from Utimaco to Syria was one of the
    hardest decisions she'd ever taken as
  • 50:16 - 50:22
    chancellor. And that was a very difficult
    tradeoff between maintaining intelligence
  • 50:22 - 50:27
    access, given the possibility that Western
    troops would be involved in Syria and the
  • 50:27 - 50:33
    fact that the kit was being used for very
    evil purposes. So that's an example of how
  • 50:33 - 50:38
    the export controls are used in practice.
    They are not used to control the harms
  • 50:38 - 50:44
    that we as voters are told that they're
    there to control. Right. They are used in
  • 50:44 - 50:50
    all sorts of dark and dismal games. And we
    really have to tackle the issue of export
  • 50:50 - 50:56
    controls with our eyes open.
    H: Yeah, yeah. There's a lot a lot to do.
  • 50:56 - 51:04
    And now Germany has left the UN
    Security Council. So let's see what
  • 51:04 - 51:13
    happens next. Yeah. We'll see, Ross.
    Anything else you'd like to add? We don't
  • 51:13 - 51:19
    have any more questions. Oh, no, we have
    another question. It's just come up
  • 51:19 - 51:25
    seconds ago. Do you think that refusal to
    accept back doors will create large
  • 51:25 - 51:35
    uncensorable applications?
    R: Well, if you get large applications
  • 51:35 - 51:42
    which are associated with significant
    economic power, then pressure gets
  • 51:42 - 51:51
    brought to bear on those economic players
    to do their social duty. And... this is what
  • 51:51 - 51:57
    we have seen with the platforms that
    intermediate content, that act as content
  • 51:57 - 52:00
    intermediaries such as Facebook and Google
    and so on, that they do a certain amount
  • 52:00 - 52:09
    of filtering. But if, on the other hand,
    you have wholesale surveillance before the
  • 52:09 - 52:14
    fact of end-to-end encrypted stuff, then
    are we moving into an environment where
  • 52:14 - 52:19
    private speech from one person to another
    is no longer permitted? You know, I don't
  • 52:19 - 52:24
    think that's the right trade off that we
    should be taking, because we all know from
  • 52:24 - 52:29
    hard experience that when governments say,
    think of the children, they're not
  • 52:29 - 52:32
    thinking of children at all. If they were
    thinking of children, they would not be
  • 52:32 - 52:36
    selling weapons to Saudi Arabia and the
    United Arab Emirates to kill children in
  • 52:36 - 52:42
    Yemen. And they say think about terrorism.
    But the censorship that we are supposed to
  • 52:42 - 52:48
    use in universities around terrorism, the
    so-called Prevent duty is known to be
  • 52:48 - 52:52
    counterproductive. It makes Muslim
    students feel alienated and marginalized.
  • 52:52 - 52:57
    So the arguments that governments use
    around this are not in any way honest. And
  • 52:57 - 53:02
    we now have 20 years' experience of these
    dishonest arguments. And for goodness
  • 53:02 - 53:06
    sake, let's have a more grown up
    conversation about these things.
  • 53:06 - 53:12
    H: Now, you're totally right, even if I
    have to admit, it took me a couple of
  • 53:12 - 53:25
    years, not 20, but a lot to finally
    understand. OK, I think that's it, we
  • 53:25 - 53:31
    just have another comment and I'm thanking
    you for your time and are you in an
  • 53:31 - 53:37
    assembly somewhere, hanging around
    in the next hour or so? Maybe if someone
  • 53:37 - 53:42
    wants to talk to you, they can just pop by,
    if you have used this 2d world
  • 53:42 - 53:45
    already.
    R: No, I haven't been using the 2d world.
  • 53:45 - 53:51
    I had some issues with my browser and
    getting into it. But I've got my
  • 53:51 - 53:55
    webpage and my email address is public and
    anybody who wants to discuss these things
  • 53:55 - 54:00
    is welcome to get in touch with me.
    Herald: All right. So thanks a lot.
  • 54:00 - 54:04
    R: Thank you for the invitation.
    H: Yeah. Thanks a lot.
  • 54:04 - 54:08
    rC3 postroll music
  • 54:08 - 54:43
    Subtitles created by c3subtitles.de
    in the year 2020. Join, and help us!
Title:
#rC3 - What Price the Upload Filter?
Video Language:
English
Duration:
54:44