0:00:00.000,0:00:12.590
rC3 preroll music
0:00:12.590,0:00:18.109
Herald: This is Ross Anderson, and he's[br]giving a talk to us today, and the title
0:00:18.109,0:00:24.079
is What Price the Upload Filter? From Cold[br]War to Crypto Wars and Back Again. And
0:00:24.079,0:00:31.489
we're very happy that he's here today. And[br]and for our non-English speaking public,
0:00:31.489,0:00:35.990
we have translations.[br]speaks german
0:00:35.990,0:00:41.300
This talk is being translated into German.[br]speaks french
0:00:41.300,0:00:49.839
This talk is also being[br]translated into French.
0:00:49.839,0:00:56.769
Yeah. Um. Ross, ready to start? Let's go. [br]Have a good time. Enjoy.
0:00:56.769,0:01:09.750
Ross: Yes, ready to go. Thanks. OK. As has[br]been said, I'm Ross Anderson and I'm in
0:01:09.750,0:01:13.620
the position of being one of the old guys[br]of this field, in that I've been involved
0:01:13.620,0:01:18.680
in the crypto wars right from the start.[br]And in fact, even from before the Clipper
0:01:18.680,0:01:27.510
chip actually came out. If we could go to[br]the slides, please.
0:01:27.510,0:01:31.407
Right, can we see the slides?
0:01:31.407,0:02:09.505
silence
0:02:09.505,0:02:14.790
surprised the U.S. armed[br]forces. And guess what happened? Well, in
0:02:14.790,0:02:21.060
the 1950s, Boris Hagelin had set up that[br]company, secretly sold it to the NSA and
0:02:21.060,0:02:28.110
for a number of years, quite a lot of years,[br]buyers as diverse as Latin America and
0:02:28.110,0:02:33.130
India and even NATO countries such as[br]Italy were buying machines from Crypto AG,
0:02:33.130,0:02:39.470
which the NSA could decipher. And this had[br]all sorts of consequences. For example,
0:02:39.470,0:02:44.780
it's been revealed fairly recently that[br]Britain's success against Argentina in the
0:02:44.780,0:02:50.680
Falklands War in 1982 was to a large[br]extent due to signals intelligence that
0:02:50.680,0:02:58.790
came from these machines. So, next slide,[br]please. And in this prehistory of the
0:02:58.790,0:03:03.600
crypto wars, almost all the play was[br]between governments. There was very little
0:03:03.600,0:03:08.180
role for civil society. There were one or[br]two journalists who were engaged in trying
0:03:08.180,0:03:13.870
to map what the NSA and friends were up to.[br]As far as industry was concerned, well, at
0:03:13.870,0:03:18.180
that time, I was working in banking and we[br]found that encryption for confidentiality
0:03:18.180,0:03:22.319
was discouraged. If we tried to use line[br]encryption, then faults mysteriously
0:03:22.319,0:03:26.880
appeared on the line. But authentication[br]was OK. We were allowed to encrypt the
0:03:22.319,0:03:26.880
PIN blocks from PIN pads. We were allowed to put[br]MACs on messages. There was some minor
0:03:32.380,0:03:37.250
harassment. For example, when Rivest,[br]Shamir and Adleman came up with their
0:03:37.250,0:03:42.650
encryption algorithm, the NSA tried to[br]make it classified. But the Provost of
0:03:42.650,0:03:47.840
MIT, Jerome Wiesner, persuaded them not to[br]pick that fight. The big debate in the
0:03:47.840,0:03:52.880
1970s, though, was whether the NSA affected[br]the design of the Data Encryption Standard
0:03:52.880,0:03:57.800
algorithm, and we know now that this was[br]the case. It was designed to be only just
0:03:57.800,0:04:04.330
strong enough and Whit Diffie predicted[br]back in the 1970s that 2 to the power of
0:04:04.330,0:04:08.920
56 key search would eventually be[br]feasible. The EFF built a machine in 1998
0:04:08.920,0:04:13.510
and now of course that's fairly easy[br]because each bitcoin block costs 2 to
0:04:13.510,0:04:20.169
the power of 68 calculations. Next slide,[br]please. So where things get interesting is
0:04:20.169,0:04:25.919
that the NSA persuaded Bill Clinton in one[br]of his first cabinet meetings in 1993 to
0:04:25.919,0:04:30.270
introduce key escrow, the idea that the[br]NSA should have a copy of everyone's
0:04:30.270,0:04:36.860
keys. And one of the people at that[br]meeting admitted later that President Bush,
0:04:36.860,0:04:41.280
the elder, had been asked and had refused,[br]but Clinton, newly in office, was
0:04:41.280,0:04:46.300
naive and thought that this was an[br]opportunity to fix the world. Now, the
0:04:46.300,0:04:50.159
Clipper chip, which we can see here, was[br]tamper-resistant and had a secret block
0:04:50.159,0:04:58.620
cipher with an NSA backdoor key. And the[br]launch product was an AT&T secure phone.
0:04:58.620,0:05:05.030
Next slide, please. Now the Clipper protocol[br]was an interesting one in that each chip
0:05:05.030,0:05:12.210
had a unique secret key KU and a global[br]secret family key kNSA burned in. And in
0:05:12.210,0:05:18.189
order to, say, send data to Bob, Alice had[br]to send her clipper chip a working key kW,
0:05:18.189,0:05:22.890
which was generated by some external means,[br]such as a Diffie-Hellman key exchange. And
0:05:22.890,0:05:28.430
it makes a Law Enforcement Access Field,[br]which was basically Alice and Bob's names
0:05:28.430,0:05:33.529
with the working key encrypted under the[br]unit key and then a hash of the working
0:05:33.529,0:05:38.449
key encrypted under the NSA key. And that[br]was sent along with the cipher text to make
0:05:38.449,0:05:43.209
authorized wiretapping easy. And the idea[br]with the hash was that this would stop
0:05:43.209,0:05:47.830
cheating. Bob's Clipper Chip wouldn't use[br]a working key unless it came with a valid
0:05:47.830,0:05:53.889
LEAF. And I can remember, a few of us can[br]still remember, the enormous outcry that
0:05:53.889,0:05:57.819
this caused at the time. American[br]companies in particular didn't like it
0:05:57.819,0:06:02.439
because they started losing business to[br]foreign firms. And in fact, a couple of
0:06:02.439,0:06:07.499
our students here at Cambridge started a[br]company nCipher, that grew to be quite
0:06:07.499,0:06:13.300
large because they could sell worldwide,[br]unlike US firms. People said, why don't we
0:06:13.300,0:06:17.019
use encryption software? Well, that's easy[br]to write, but it's hard to deploy at
0:06:17.019,0:06:22.749
scale, as Phil Zimmermann found with PGP.[br]And the big concern was whether key escrow
0:06:22.749,0:06:28.620
would kill electronic commerce. A[br]secondary concern was: how on earth
0:06:28.620,0:06:32.389
will we know if government designs are[br]secure? Why on earth should you trust the
0:06:32.389,0:06:40.669
NSA? Next slide, please. Well, the first[br]serious fight back in the crypto wars came
0:06:40.669,0:06:45.259
when Matt Blaze at Bell Labs found an[br]attack on Clipper. He found that Alice
0:06:45.259,0:06:52.339
could just try lots of candidate LEAFs[br]until one passed, because the tag was only 16
0:06:52.339,0:06:57.589
bits long, and it turned out that 2 to the[br]power of 112 of the 2 to the power of
0:06:57.589,0:07:03.749
128 possibilities work. And this meant[br]that Alice could generate a bogus LEAF
0:07:03.749,0:07:07.589
that would pass inspection, but which[br]wouldn't decrypt the traffic, and Bob
0:07:07.589,0:07:11.879
could also generate a new LEAF on the fly.[br]So you could write non-interoperable rogue
0:07:11.879,0:07:16.691
applications that the NSA had no access[br]to. And with a bit more work, you could
0:07:16.691,0:07:21.479
make rogue applications interoperate with[br]official ones. This was only the first of
0:07:21.479,0:07:30.089
many dumb ideas. Next slide, please. OK,[br]so why don't people just use software?
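As an aside for readers of this transcript: the LEAF mechanism and Blaze's brute-force attack described above can be sketched in a few lines of Python. This is a toy model only — the hash-based checksum, key lengths and field layout are invented stand-ins for the classified Skipjack-based design, and essentially the only thing that matches the real chip is the 16-bit tag length.

```python
# Toy sketch of the Clipper LEAF check and Matt Blaze's forgery attack.
# All primitives are stand-ins (hashlib instead of Skipjack); only the
# 16-bit tag length reflects the real design.
import hashlib, os

FAMILY_KEY = b"family key burned into every chip"  # stand-in for the global NSA family key

def checksum16(working_key, leaf_body):
    # 16-bit tag that binds the LEAF to the session working key
    digest = hashlib.sha256(FAMILY_KEY + working_key + leaf_body).digest()
    return int.from_bytes(digest[:2], "big")

def make_leaf(unit_id, working_key):
    # body stands in for the unit id plus the working key wrapped under the unit key KU
    body = unit_id + hashlib.sha256(unit_id + working_key).digest()[:10]
    return (body, checksum16(working_key, body))

def chip_accepts(working_key, leaf):
    # the receiving chip recomputes the 16-bit tag before using the working key
    body, tag = leaf
    return checksum16(working_key, body) == tag

kw = os.urandom(10)  # session working key, e.g. from a Diffie-Hellman exchange
assert chip_accepts(kw, make_leaf(b"alice-unit", kw))  # honest LEAF passes

# Blaze's attack: feed the chip random LEAFs. A random 16-bit tag is
# right about once in 2**16 tries, so forgery is entirely feasible.
tries = 0
while True:
    tries += 1
    bogus = (os.urandom(20), int.from_bytes(os.urandom(2), "big"))
    if chip_accepts(kw, bogus):
        break
print("forged a LEAF after", tries, "tries")
```

Because the tag is only 16 bits, a random LEAF passes roughly once in 65,536 tries — which is why the forgery was practical even on 1990s hardware.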
0:07:30.089,0:07:36.189
Well, at that time, the US had export[br]controls on intangible goods such as
0:07:36.189,0:07:40.430
software, although European countries[br]generally didn't. And this meant that US
0:07:40.430,0:07:45.610
academics couldn't put crypto code online,[br]although we Europeans could and we
0:07:45.610,0:07:52.059
did. And so Phil Zimmermann achieved fame[br]by exporting PGP, Pretty Good Privacy, some
0:07:52.059,0:07:56.099
encryption software he had written, from[br]America as a paper book. And this was
0:07:56.099,0:08:00.509
protected by the First Amendment. They[br]sent it across the border to Canada. They
0:08:00.509,0:08:04.360
fed it into an optical character[br]recognition scanner. They recompiled it
0:08:04.360,0:08:08.889
and the code had escaped. For this Phil[br]was subjected to a grand jury
0:08:08.889,0:08:13.979
investigation. There was also the[br]Bernstein case around code as free speech
0:08:13.979,0:08:19.059
and Bruce Schneier rose to fame with his[br]book "Applied Cryptography", which had
0:08:19.059,0:08:24.249
protocols, algorithms and source code in C,[br]which you could type in in order to get
0:08:24.249,0:08:31.960
cryptographic algorithms anywhere. And we[br]saw export-controlled clothing. This
0:08:31.960,0:08:36.389
t-shirt was something that many people wore[br]at the time. I've actually got one and I
0:08:36.389,0:08:41.089
planned to wear it for this. But[br]unfortunately, I came into the lab in
0:08:41.089,0:08:47.230
order to get better connectivity and I[br]left it at home. So this t-shirt was an
0:08:47.230,0:08:53.009
implementation of RSA written in Perl,[br]plus a barcode so that you could scan it in.
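For readers of the transcript: the point of the shirt was that a working public-key cipher fits in a few lines of code. The same idea can be shown as toy textbook RSA — here in Python rather than the shirt's actual Perl, with tiny primes, as an illustration only and in no way secure.

```python
# Textbook RSA in a few lines, in the spirit of the Perl t-shirt.
# Tiny primes for illustration; real keys are thousands of bits.
p, q = 61, 53
n = p * q                  # public modulus
phi = (p - 1) * (q - 1)
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: modular inverse (Python 3.8+)

m = 42                     # message, must be less than n
c = pow(m, e, n)           # encrypt: c = m^e mod n
assert pow(c, d, n) == m   # decrypt: m = c^d mod n
```

That the whole "munition" fits on a shirt (or in a signature block) was exactly what made export control on crypto code look absurd.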
0:08:53.009,0:08:58.350
And in theory, you should not walk across[br]the border wearing this t-shirt. Or if
0:08:58.350,0:09:03.270
you're a US citizen, you shouldn't even[br]let a non-US citizen look at it. So by
0:09:03.270,0:09:08.579
these means, people probed the outskirts of[br]what was possible and, you know, an awful
0:09:08.579,0:09:17.070
lot of fun was had. It was a good laugh to[br]tweak the Tyrannosaur's tail. Next slide.
0:09:17.070,0:09:25.030
But this wasn't just something that was[br]limited to the USA. The big and obvious
0:09:25.030,0:09:30.230
problem, if you try and do key escrow in[br]Europe, is that there are dozens of
0:09:30.230,0:09:35.810
countries in Europe and what happens if[br]someone from Britain, for example, has got
0:09:35.810,0:09:40.100
a mobile phone that they bought in France[br]or a German SIM card and they're standing
0:09:40.100,0:09:44.509
on the streets in Stockholm and they phone[br]somebody who's in Budapest, who's got a
0:09:44.509,0:09:48.699
Hungarian phone with a Spanish SIM card[br]in it. Then which of these countries'
0:09:48.699,0:09:54.260
secret police forces should be able to[br]listen to the call? And this was something
0:09:54.260,0:09:59.930
that stalled the progress of key escrow, [br]that's a good way to describe it, in Europe.
0:09:59.930,0:10:06.839
And in 1996 GCHQ got academic colleagues[br]at Royal Holloway to come up with a
0:10:06.839,0:10:11.860
proposal for public sector email, which[br]they believed would fix this. Now, at the
0:10:11.860,0:10:18.919
time, after Clipper had fallen into[br]disrepute, the NSA's proposal was that
0:10:18.919,0:10:24.130
certification authorities should have[br]to be licensed, and that this would enforce
0:10:24.130,0:10:27.879
a condition that all private keys would be[br]escrowed, so you would only be able to get a
0:10:27.879,0:10:33.530
signature on your public key if the[br]private key was held by the CA. And
0:10:33.530,0:10:38.440
the idea was that you'd have one CA for[br]each government department and civilians
0:10:38.440,0:10:42.190
would use trusted firms like Barclays Bank[br]or the Post Office, which would keep our
0:10:42.190,0:10:48.009
keys safe. And it would also work across[br]other EU member states, so that somebody in
0:10:48.009,0:10:53.870
Britain calling somebody in Germany would[br]end up in a situation where a trustworthy
0:10:53.870,0:10:59.220
CA, from the NSA's point of view, that is[br]an untrustworthy CA from our point of view,
0:10:59.220,0:11:03.680
in Britain would be prepared to make a key[br]and so would one in Germany. This, at
0:11:03.680,0:11:10.630
least, was the idea. So how do we do this,[br]next slide, on the GCHQ protocol. So here's
0:11:10.630,0:11:14.829
how it was designed to work in the UK[br]government. If Alice at the Department of
0:11:14.829,0:11:20.120
Agriculture wants to talk to Bob at the[br]Department of Business, she asks her
0:11:20.120,0:11:26.089
Departmental Security Officer DA for a send[br]key for herself and a receive key for Bob.
0:11:26.089,0:11:35.360
And DA and DB get a top level[br]interoperability key KTAB from GCHQ and DA
0:11:35.360,0:11:44.120
calculates a secret send key of the day as[br]a hash of KTAB and Alice's name and the
0:11:44.120,0:11:50.430
DA's own identity, which he gives[br]to Alice, and similarly a public receive
0:11:50.430,0:11:55.481
key of the day for Bob and Alice sends Bob[br]her public send key along with the
0:11:55.481,0:12:00.379
encrypted message and Bob can go[br]to his DSO and get his secret receive
0:12:00.379,0:12:06.120
key of the day. Now this is slightly[br]complicated and there's all sorts of other
0:12:06.120,0:12:11.299
things wrong with it once you start to[br]look at it. Next slide, please. The first
0:12:11.299,0:12:14.790
is that from the point of view of the[br]overall effect, you could just as easily
0:12:14.790,0:12:19.080
have used Kerberos because you've[br]basically got a key distribution center at
0:12:19.080,0:12:24.630
both ends, which knows everybody's keys. So[br]you've not actually gained very much by
0:12:24.630,0:12:31.081
using complicated public key mechanisms,[br]and the next problem is what's the law
0:12:31.081,0:12:36.390
enforcement access need for centrally[br]generated signing keys, if this is
0:12:36.390,0:12:40.480
actually for law enforcement rather than[br]intelligence? Well, the police want to be
0:12:40.480,0:12:47.480
able to read things, not forge things. A[br]third problem is that keys involve hashing
0:12:47.480,0:12:52.110
department names and governments are[br]changing the name of the departments all
0:12:52.110,0:12:56.980
the time, as the prime minister of the day[br]moves his ministers around and they chop
0:12:56.980,0:13:01.810
and change departments. And this means, of[br]course, that everybody has to get new
0:13:01.810,0:13:06.320
cryptographic keys and suddenly the old[br]cryptographic keys don't work anymore. And
0:13:06.320,0:13:10.800
horrendous complexity comes from[br]this. Now, there are about 10 other things
0:13:10.800,0:13:15.420
wrong with this protocol, but curiously[br]enough, it's still used by the UK
0:13:15.420,0:13:19.090
government for the top secret stuff. It[br]went through a number of iterations. It's
0:13:19.090,0:13:23.939
now called MIKEY-SAKKE; there are details in[br]my security engineering book. And it
0:13:23.939,0:13:28.129
turned out to be such a pain that the[br]stuff below top secret now just uses
0:13:28.129,0:13:32.959
a branded version of G Suite. So if what[br]you want to do is to figure out what
0:13:32.959,0:13:37.050
speech Boris Johnson will be making[br]tomorrow, we just have to guess the
0:13:37.050,0:13:44.459
password recovery questions for his[br]private secretaries and officials. Next
0:13:44.459,0:13:51.060
slide, the global Internet Trust Register.[br]This was an interesting piece of fun we
0:13:51.060,0:13:55.929
had around the 1997 election when Tony[br]Blair took over and formed the Labour
0:13:55.929,0:14:00.439
government. Before the election, Labour[br]had promised not to seize crypto keys in bulk
0:14:00.439,0:14:04.499
without a warrant. And one of the[br]first things that happened to him once he
0:14:04.499,0:14:09.569
was in office was that Vice President Al Gore[br]went to visit him, and all of a sudden Tony
0:14:09.569,0:14:13.449
Blair decided that he wanted all[br]certification authorities to be licensed
0:14:13.449,0:14:18.520
and they were about to rush this through[br]parliament. So we put all the important
0:14:18.520,0:14:23.290
public keys in a paper book and we took it[br]to the cultural secretary, Chris Smith,
0:14:23.290,0:14:27.790
and we said, you're the minister for books,[br]so why are you passing a law to ban this
0:14:27.790,0:14:32.580
book? And if you'll switch to the video[br]shot, I've got the initial copy of the
0:14:32.580,0:14:36.579
book that we just put together on the[br]photocopying machine in the department.
0:14:36.579,0:14:42.200
And then we sent the PDF off to MIT and[br]they produced it as a proper book. And
0:14:42.200,0:14:48.350
this means that we had a book which was[br]supposedly protected, and this enabled us
0:14:48.350,0:14:55.209
to get the topic onto the agenda for[br]cabinet discussion. So this at least
0:14:55.209,0:14:59.779
headed off precipitous action, and we ended up with the[br]Regulation of Investigatory Powers Bill in
0:14:59.779,0:15:04.830
2000. That was far from perfect, but that[br]was a longer story. So what happened back
0:15:04.830,0:15:09.860
then is that we set up an NGO, a digital[br]rights organization, the Foundation for
0:15:09.860,0:15:16.620
Information Policy Research. And the[br]climate at the time was such that we had
0:15:16.620,0:15:22.310
no difficulty raising a couple of hundred[br]thousand pounds from Microsoft and Hewlett
0:15:22.310,0:15:28.370
Packard and Redbus and other tech players.[br]So we were able to hire Caspar Bowden for
0:15:28.370,0:15:32.199
three years to basically be the director[br]of FIPR and to lobby the government hard
0:15:32.199,0:15:38.490
on this. And if we can go back to the[br]slides, please, and go to the next slide,
0:15:38.490,0:15:47.170
the slide on bringing it all together. So[br]in 1997, a number of us, Hal Abelson and I
0:15:47.170,0:15:54.569
and Steve Bellovin and Josh Benaloh from[br]Microsoft and Matt Blaze who had broken
0:15:54.569,0:15:59.430
Clipper and Whit Diffie, who invented[br]digital signatures, and John Gilmore of
0:15:59.430,0:16:05.689
EFF, Peter Neumann of SRI, Ron Rivest,[br]Jeff Schiller of MIT and Bruce Schneier,
0:16:05.689,0:16:09.339
who had written Applied Cryptography,[br]got together and wrote a paper on the
0:16:09.339,0:16:13.830
risks of key recovery, key escrow and[br]trusted third-party encryption, where we
0:16:13.830,0:16:18.470
discussed the system consequences of[br]giving third party or government access to
0:16:18.470,0:16:23.850
both traffic data and content, without user[br]notice or consent, deployed internationally
0:16:23.850,0:16:27.550
and available around the clock. We came to[br]the conclusion that this was not really
0:16:27.550,0:16:33.550
doable. There were simply too many[br]vulnerabilities and too many complexities.
0:16:33.550,0:16:38.899
So how did it end? Well, if we go to the[br]next slide, the victory in Europe wasn't
0:16:38.899,0:16:44.259
as a result of academic arguments. It was[br]a result of industry pressure. And we owe
0:16:44.259,0:16:48.750
a debt to Commissioner Martin Bangemann[br]and also to the German government who
0:16:48.750,0:16:56.699
backed him. And in 1994, Martin had put[br]together a group of European CEOs to
0:16:56.699,0:17:01.190
advise him on internet policy. And they[br]advised him to keep his hands off until
0:17:01.190,0:17:04.160
they could see which way this thing was[br]going and what could be done with it.
0:17:04.160,0:17:10.670
And the thing that he[br]developed in order to drive a stake
0:17:10.670,0:17:15.180
through the heart of key escrow was the[br]Electronic Signatures Directive in 1999.
0:17:15.180,0:17:20.330
And this gave a rebuttable presumption of[br]validity to qualifying electronic
0:17:20.330,0:17:24.640
signatures, but subject to a number of[br]conditions. And one of these was that the
0:17:24.640,0:17:29.570
signing key must never be known to anybody[br]other than the signer, and this killed
0:17:29.570,0:17:37.120
the idea of licensing CAs in such a way[br]that the NSA had access to all the
0:17:37.120,0:17:41.390
private key material. The agencies had[br]argued that without controlling
0:17:41.390,0:17:45.260
signatures, you couldn't control[br]encryption. But of course, as intelligence
0:17:45.260,0:17:49.440
agencies, they were as much interested in[br]manipulating information as they were in
0:17:49.440,0:17:57.280
listening into it. And this created a[br]really sharp conflict with businesses. In
0:17:57.280,0:18:00.770
the U.K., the Regulation of[br]Investigatory Powers Bill went through the
0:18:00.770,0:18:05.540
following year. And there we got strong[br]support from the banks who did not want
0:18:05.540,0:18:10.600
the possibility of intelligence and law[br]enforcement personnel either getting hold
0:18:10.600,0:18:16.380
of bank keys or forging banking[br]transactions. And so we managed, with
0:18:16.380,0:18:20.720
their help, to insert a number of[br]conditions into the bill, which meant that
0:18:20.720,0:18:25.520
if a court or chief constable, for[br]example, demands a key from a company,
0:18:25.520,0:18:29.400
they've got to demand it from somebody at[br]the level of a director of the company.
0:18:29.400,0:18:33.800
And it's got to be signed by someone[br]really senior such as the chief constable.
0:18:33.800,0:18:38.910
So there were some controls that we managed[br]to get in there. Next slide! What did
0:18:38.910,0:18:44.660
victory in the USA look like? Well, in the[br]middle of 2000 as a number of people had
0:18:44.660,0:18:49.170
predicted, Al Gore decided that he wanted[br]to stop fighting the tech industry in
0:18:49.170,0:18:54.410
order to get elected president. And there[br]was a deal done at the time which was
0:18:54.410,0:19:01.070
secret. It was done at the FBI[br]headquarters at Quantico, whereby US law
0:19:01.070,0:19:04.880
enforcement would rely on naturally[br]occurring vulnerabilities rather than
0:19:04.880,0:19:10.070
compelling their insertion by companies[br]like Intel or Microsoft. This was secret
0:19:10.070,0:19:15.017
at the time, and I happen to know about it[br]because I was consulting for Intel and the
0:19:15.017,0:19:20.800
NDA I was under had a four-year time[br]limit on it. So after 2004, I was
0:19:20.800,0:19:25.760
able to talk about this. And so this[br]basically gave the NSA access to the CERT
0:19:25.760,0:19:30.930
feed. And so as part of this deal, the[br]export rules were liberalized a bit, but
0:19:30.930,0:19:38.090
with various hooks and gotchas left so[br]that the authorities could bully companies
0:19:38.090,0:19:45.580
who got too difficult. And in 2002, Robert[br]Morris, senior, who had been the chief
0:19:45.580,0:19:50.740
scientist at the NSA for much of this[br]period, admitted that the real policy goal
0:19:50.740,0:19:54.540
was to ensure that the many systems[br]developed during the dot com boom were
0:19:54.540,0:20:02.190
deployed with weak protection or none. And[br]there's a huge, long list of these. Next
0:20:02.190,0:20:11.430
slide, please. So what was the collateral[br]damage from crypto war one? This is the
0:20:11.430,0:20:15.310
first new part of this talk, which[br]I've put together as a result of spending
0:20:15.310,0:20:20.920
the last academic year writing the third[br]edition of my book on security engineering
0:20:20.920,0:20:26.520
as I've gone through and updated all the[br]chapters on car security, phone
0:20:26.520,0:20:32.510
security and web security and so on and so[br]forth, we find the damage everywhere. But there are
0:20:32.510,0:20:38.050
still very serious costs remaining from[br]crypto war one, for example, almost all of
0:20:38.050,0:20:43.430
the remote key entry systems for cars use[br]inadequate cryptography and weak random
0:20:43.430,0:20:48.290
number generators and so on and so forth.[br]And car theft has almost doubled in the
0:20:48.290,0:20:55.380
past five years. This is not all due to[br]weak crypto, but it's substantially due to
0:20:55.380,0:21:01.323
a wrong culture that was started off in[br]the context of the crypto wars. Second,
0:21:01.323,0:21:06.430
there are millions of door locks still[br]using MIFARE Classic, even in the building
0:21:06.430,0:21:12.030
where I work. For example, the University[br]of Cambridge changed its door locks around
0:21:12.030,0:21:17.040
2000. So we've still got a whole lot of[br]MIFARE Classic around. And it's very
0:21:17.040,0:21:21.150
difficult when you've got 100 buildings to[br]change all the locks on them. And this is
0:21:21.150,0:21:26.250
the case with thousands of organizations[br]worldwide, with universities, with banks,
0:21:26.250,0:21:30.990
with all sorts of people, simply because[br]changing all the locks at once in dozens
0:21:30.990,0:21:35.770
of buildings is just too expensive. Then,[br]of course, there's the CA in your
0:21:35.770,0:21:40.990
browser, most nations own or control[br]certification authorities that your
0:21:40.990,0:21:47.380
browser trusts and the few nations that[br]weren't allowed to own such CAs, such as
0:21:47.380,0:21:53.040
Iran, get up to mischief, as we found in[br]the case of the DigiNotar hack a few years
0:21:53.040,0:21:59.000
ago. And this means that most nations have[br]got a more or less guaranteed ability to
0:21:59.000,0:22:05.770
do man-in-the-middle attacks on your web[br]logons. Some companies like Google, of
0:22:05.770,0:22:11.410
course, started to fix that with various[br]mechanisms such as certificate pinning.
0:22:11.410,0:22:16.160
But that was a deliberate vulnerability[br]that was there for a long, long time and
0:22:16.160,0:22:22.410
is still very widespread. Phones. 2G is[br]insecure. That actually goes back to the
0:22:22.410,0:22:27.281
Cold War rather than the crypto war. But[br]thanks to the crypto wars 4G and 5G are
0:22:27.281,0:22:32.450
not very much better. The details are[br]slightly complicated and again, they're
0:22:32.450,0:22:37.950
described in the book, Bluetooth is easy[br]to hack. That's another piece of legacy.
0:22:37.950,0:22:43.380
And as I mentioned, the agencies own the[br]CERT's responsible disclosure pipeline,
0:22:43.380,0:22:47.690
which means that they get a free firehose[br]of zero-days that they can exploit
0:22:47.690,0:22:53.780
for perhaps a month or three before these[br]end up being patched. So next slide,
0:22:53.780,0:23:02.600
please. Last year when I talked at Chaos[br]Communication Congress, the audience chose
0:23:02.600,0:23:08.900
this as the cover for my security[br]engineering book, and that's now out. And
0:23:08.900,0:23:12.730
it's the process of writing this that[br]brought home to me the scale of the damage
0:23:12.730,0:23:18.450
that we still suffer as a result of[br]crypto war one. So let's move on to the
0:23:18.450,0:23:24.610
next slide and the next period of history,[br]which we might call the war on terror. And
0:23:24.610,0:23:30.980
I've arbitrarily put this down as 2000 to[br]2013, although some countries stopped using
0:23:30.980,0:23:36.790
the phrase war on terror in about 2008,[br]once we had got rid of George W. Bush and
0:23:36.790,0:23:41.330
Tony Blair. But as a historical[br]convenience, this is, if you like, the
0:23:41.330,0:23:46.140
central period in our tale. And it starts[br]off with a lot of harassment around the
0:23:46.140,0:23:54.700
edges of security and cryptography. For[br]example, in 2000, Tony Blair promoted the
0:23:54.700,0:24:02.290
EU dual use regulation number 1334 to[br]extend export controls from tangible goods
0:24:02.290,0:24:07.810
such as rifles and tanks to intangibles[br]such as crypto software. Despite the fact
0:24:07.810,0:24:13.980
that he had basically declared peace on[br]the tech industry. Two years later, in
0:24:13.980,0:24:18.090
2002, the UK parliament balked at an[br]export control bill that was going to
0:24:18.090,0:24:24.140
transpose this because it added controls[br]on scientific speech, not just crypto
0:24:24.140,0:24:28.900
code, but even papers on cryptanalysis and[br]even electron microscope scripts and
0:24:28.900,0:24:33.300
so parliament inserted the research[br]exemption clause at the urging of the
0:24:33.300,0:24:39.420
then president of the Royal Society, Sir[br]Robert May. But what then happened is that
0:24:39.420,0:24:45.820
GCHQ used EU regulations to frustrate[br]Parliament and this pattern of extralegal
0:24:45.820,0:24:51.700
behavior was to continue. Next slide![br]Because after export control, the fight
0:24:51.700,0:24:57.310
shifted to traffic data retention, another[br]bad thing that I'm afraid to say, the UK
0:24:57.310,0:25:02.630
exported to Europe back in the days when[br]we were, in effect, the Americans'
0:25:02.630,0:25:08.530
consigliere on the European Council. Sorry[br]about that, folks, but all I can say is at
0:25:08.530,0:25:15.900
least we helped start EDRI a year after[br]that. So one of the interesting aspects of
0:25:15.900,0:25:20.590
this was that our then home secretary,[br]Jacqui Smith, started talking about the
0:25:20.590,0:25:26.080
need for a common database of all the[br]metadata of who had phoned whom when, who
0:25:26.080,0:25:30.720
had sent an email to whom when, so that[br]the police could continue to use the
0:25:30.720,0:25:35.340
traditional contact tracing techniques[br]online. And the line that got hammered
0:25:35.340,0:25:39.500
home to us again and again and again was:[br]if you've got nothing to hide, you've got
0:25:39.500,0:25:47.490
nothing to fear. What then happened in[br]2008, is that a very bad person went into
0:25:47.490,0:25:53.550
Parliament and went to the PC where the[br]expense claims of MPs were kept and they
0:25:53.550,0:25:58.630
copied all the expense claims onto a DVD[br]and they sold it around Fleet Street. And
0:25:58.630,0:26:03.450
so The Daily Telegraph bought it from them[br]for £400,000. And then for the best
0:26:03.450,0:26:07.500
part of a year, the Daily Telegraph was[br]revealing scandalous things about what
0:26:07.500,0:26:12.170
various members of parliament had claimed[br]from the taxpayer. But it turned out that
0:26:12.170,0:26:15.730
Jacqui Smith herself may have been innocent:[br]her husband had been downloading
0:26:15.730,0:26:21.010
pornography and charging it to her[br]parliamentary expenses. So she lost her
0:26:21.010,0:26:25.820
job as home secretary and she lost her[br]seat in parliament and the communications
0:26:25.820,0:26:32.950
data bill was lost. So was this a victory?[br]Well, in June 2013, we learned from Ed
0:26:32.950,0:26:39.310
Snowden that they just built it anyway,[br]despite parliament. So maybe the victory
0:26:39.310,0:26:43.400
in parliament wasn't what it seemed to be[br]at the time. But I'm getting ahead of
0:26:43.400,0:26:51.570
myself; anyway. Next slide, please. The[br]other thing that we did in the 2000s is
0:26:51.570,0:26:56.200
that I spent maybe a third of my[br]time on it, and about another hundred people
0:26:56.200,0:27:00.856
joined in, and we developed the economics of[br]security as a discipline. We began to
0:27:00.856,0:27:05.660
realize that many of the things that went[br]wrong happened because Alice was guarding
0:27:05.660,0:27:10.910
a system and Bob was paying the cost of[br]failure. For example, in a payment
0:27:10.910,0:27:17.480
system, the people best placed to prevent[br]fraud are the merchants and the banks that
0:27:17.480,0:27:21.920
acquire transactions from them, while the[br]costs of fraud fall on the cardholders
0:27:21.920,0:27:26.440
and the banks that issue them with cards.[br]And the two aren't the same. It's this
0:27:26.440,0:27:31.870
that causes the governance tensions,[br]causes governance to break down, and makes
0:27:31.870,0:27:36.906
fraud harder to stop than it should be. Now after
0:27:36.906,0:27:41.530
that, one of the early topics was[br]patching and responsible disclosure. And
0:27:41.530,0:27:45.340
we worked through all the issues of[br]whether you should not patch at all, which
0:27:45.340,0:27:48.860
some people in industry wanted to do, or[br]whether you should just put all the bugs
0:27:48.860,0:27:52.690
on bug trackers which some hackers wanted[br]to do or whether you would go through the
0:27:52.690,0:27:57.200
CERT system despite the NSA compromise,[br]because they at least would give you legal
0:27:57.200,0:28:03.720
cover. And, you know, bully Microsoft into[br]patching the bug the next Patch Tuesday
0:28:03.720,0:28:10.040
and then the disclosure after 90 days. And[br]we eventually came to the conclusion, and
0:28:10.040,0:28:16.270
industry followed, that responsible[br]disclosure was the way to go. Now, one of
0:28:16.270,0:28:21.530
the problems that arises here is the[br]equities issue. Suppose you're the
0:28:21.530,0:28:27.260
director of the NSA and somebody comes to[br]you with some super new innovative bug.
0:28:27.260,0:28:33.490
Say they have rediscovered Spectre,[br]for example. So you've got a bug which
0:28:33.490,0:28:40.640
can be used to penetrate any crypto[br]software that's out there. Do you report
0:28:40.640,0:28:45.640
the bug to Microsoft and Intel to defend[br]300 million Americans, or do you keep it
0:28:45.640,0:28:50.830
quiet so you can exploit 450 million[br]Europeans and over a billion Chinese
0:28:50.830,0:28:55.170
and so on and so forth? Well, once you put[br]it that way, it's fairly obvious that the
0:28:55.170,0:29:00.370
NSA will favor attack over defense. And[br]there are multiple models of attack and
0:29:00.370,0:29:04.420
defense. You can think of institutional[br]factors and politics, for example, if you
0:29:04.420,0:29:10.350
are director of the NSA, and you defend[br]300 million Americans. You defend the
0:29:10.350,0:29:15.720
White House against the Chinese hacking[br]it. You know, the president will never
0:29:15.720,0:29:19.970
know if he's hacked or not because the[br]Chinese will keep it quiet if they do. But
0:29:19.970,0:29:24.790
if, on the other hand, you manage to hack[br]the Politburo in Peking, you can put
0:29:24.790,0:29:31.040
some juicy intelligence every morning with[br]the president's breakfast cereal. So
0:29:31.040,0:29:37.150
that's an even stronger argument for why[br]you should do attack rather than defense.
0:29:37.150,0:29:43.360
And the other thing that I'll mention in[br]passing is that throughout the 2000s,
0:29:43.360,0:29:47.390
governments also scrambled to get more[br]data on their citizens; for example, in
0:29:47.390,0:29:51.930
Britain there was a long debate about whether[br]medical records should be centralized. In
0:29:51.930,0:29:56.030
the beginning, we said if you were to[br]centralize all medical records, that would
0:29:56.030,0:29:59.440
be such a large target that the database[br]would have to be treated as top secret and it would be too
0:29:59.440,0:30:06.480
inconvenient for doctors to use. Well,[br]Blair decided in 2001 to do it anyway. We
0:30:06.480,0:30:10.700
wrote a report in 2009 saying that this[br]was a red line and that this was a serious
0:30:10.700,0:30:17.030
hazard and then in 2014 we discovered that[br]Cameron's buddy, who was the transparency
0:30:17.030,0:30:22.440
czar at the NHS, had sold the database to[br]1200 researchers, including drug companies
0:30:22.440,0:30:26.740
in China. So that meant that all the[br]sensitive personal health information
0:30:26.740,0:30:31.480
about a billion patient episodes had[br]been sold around the world and was
0:30:31.480,0:30:35.240
available not just to medical[br]researchers, but to foreign intelligence
0:30:35.240,0:30:50.760
services. This brings us on to Snowden. In[br]June 2013, we had one of those game
0:30:50.760,0:30:57.280
changing moments when Ed Snowden leaked a[br]whole bunch of papers showing that the NSA
0:30:57.280,0:31:02.390
had been breaking the law in America and[br]GCHQ had been breaking the law in Britain,
0:31:02.390,0:31:06.320
that we had been lied to, that parliament[br]had been misled, and that a whole lot of
0:31:06.320,0:31:10.580
collection and interception was going on,[br]which supposedly shouldn't have been going
0:31:10.580,0:31:15.790
on. Now, one of the things that got[br]industry attention was a system called
0:31:15.790,0:31:22.500
PRISM, which was in fact legal because[br]this was done as a result of warrants
0:31:22.500,0:31:28.190
being served on the major Internet service[br]providers. And if we could move to the
0:31:28.190,0:31:33.500
next slide, we can see that this started[br]off with Microsoft in 2007. Yahoo! in
0:31:33.500,0:31:38.121
2008 (they fought in court for a year and[br]lost), and then Google and Facebook and so on
0:31:38.121,0:31:44.640
got added. This basically enabled the NSA[br]to go to someone like Google and say
0:31:44.640,0:31:49.590
rossjanderson@gmail.com is a foreign[br]national, we're therefore entitled to read
0:31:49.590,0:31:54.660
his traffic, kindly give us his Gmail. And[br]Google would say, yes, sir. For Americans,
0:31:54.660,0:31:58.240
you have to show probable cause that[br]they've committed a crime; for foreigners,
0:31:58.240,0:32:06.060
you simply have to show probable cause[br]that they're a foreigner. The next slide.
0:32:06.060,0:32:14.700
This disclosure from Snowden revealed[br]that PRISM, despite the fact that it only
0:32:14.700,0:32:20.400
costs about 20 million dollars a year, was[br]generating something like half of all the
0:32:20.400,0:32:27.160
intelligence that the NSA was using by[br]the end of financial year 2012. But that
0:32:27.160,0:32:33.100
was not all. Next slide, please. The thing[br]that really annoyed Google was this slide
0:32:33.100,0:32:38.820
in the deck from a presentation at GCHQ[br]showing how the NSA was not merely
0:32:38.820,0:32:44.480
collecting stuff through the front door by[br]serving warrants on Google in Mountain
0:32:44.480,0:32:48.590
View, it was collecting stuff through the[br]backdoor as well, because they were
0:32:48.590,0:32:53.840
harvesting the plaintext copies of Gmail[br]and maps and docs and so on, which were
0:32:53.840,0:32:59.350
being sent backwards and forwards between[br]Google's different data centers. And the
0:32:59.350,0:33:04.650
little smiley face, which you can see on[br]the sticky, got Sergey and friends really,
0:33:04.650,0:33:09.890
really uptight. And they just decided,[br]right, you know, we're not going to allow
0:33:09.890,0:33:13.310
this. They will have to knock and show[br]warrants in the future. And there was a
0:33:13.310,0:33:17.120
crash program at all the major Internet[br]service providers to encrypt all the
0:33:17.120,0:33:25.180
traffic so that in future things could[br]only be got by means of a warrant. Next
0:33:25.180,0:33:38.060
slide, please. The EU was really annoyed[br]by what was called Operation Socialist.
0:33:38.060,0:33:49.920
Operation Socialist was basically the[br]hack of Belgacom, and the idea was that
0:33:49.920,0:33:56.710
GCHQ spearphished some technical staff at[br]Belgacom, and this enabled them to wiretap
0:33:56.710,0:34:04.870
all the traffic at the European Commission[br]in Brussels, as well as mobile phone
0:34:04.870,0:34:11.910
traffic to and from various countries in[br]Africa. And this is rather amazing. It's
0:34:11.910,0:34:16.940
as if Nicola Sturgeon, the first minister[br]of Scotland, had tasked Police Scotland
0:34:16.940,0:34:21.781
with hacking BT so that she could watch[br]what was going on at the parliament
0:34:21.781,0:34:30.919
in London. So this annoyed a number of[br]people. With the next slide, we can see
0:34:30.919,0:34:40.149
that Operation Bullrun, or[br]Operation Edgehill as GCHQ called their
0:34:40.149,0:34:44.740
version of it, was an aggressive,[br]multipronged effort to break widely used
0:34:44.740,0:34:49.899
Internet encryption technologies. And we[br]learned an awful lot about what was being
0:34:49.899,0:34:55.929
done to break VPNs worldwide and what had[br]been done in terms of inserting
0:34:55.929,0:35:01.830
vulnerabilities in protocols, getting[br]people to use vulnerable prime numbers for
0:35:01.830,0:35:06.750
Diffie-Hellman key exchange and so on and[br]so forth. Next slide, please: the
0:35:06.750,0:35:11.870
Bullrun and Edgehill SIGINT enabling[br]projects actively engage the US and
0:35:11.870,0:35:16.310
foreign IT industries to covertly[br]influence and/or overtly leverage their
0:35:16.310,0:35:20.690
commercial products' designs. These design[br]changes make the systems in question
0:35:20.690,0:35:24.680
exploitable through SIGINT collection,[br]endpoint, midpoint, et cetera, with
0:35:24.680,0:35:28.400
foreknowledge of the modification. To the[br]consumer and other adversaries, however, the
0:35:28.400,0:35:36.510
system's security remains intact. Next[br]slide. So: insert vulnerabilities into
0:35:36.510,0:35:41.450
commercial encryption systems, IT systems, networks[br]and endpoint communication devices used by
0:35:41.450,0:35:49.160
targets. Next slide. They also influence[br]policies, standards and specifications for
0:35:49.160,0:35:54.270
commercial public key technologies, and[br]this was the smoking gun that
0:35:54.270,0:36:02.240
crypto war 1 had not actually ended. It had[br]just gone undercover. And so with this,
0:36:02.240,0:36:08.250
things come out into the open. Next slide:[br]so we could perhaps date crypto war 2 to
0:36:08.250,0:36:13.190
the Snowden disclosures and their aftermath.[br]In America, it must be said, all three
0:36:13.190,0:36:18.350
arms of the US government showed at least[br]mild remorse. Obama set up the NSA review
0:36:18.350,0:36:23.810
group and adopted most of what it said[br]except on the equities issue. Congress gutted
0:36:23.810,0:36:28.180
data retention, renewed the Patriot Act[br]and the FISA court introduced an advocate
0:36:28.180,0:36:33.340
for targets. Tech companies, as I[br]mentioned, started encrypting all their
0:36:33.340,0:36:39.220
traffic. In the UK, on the other hand, the[br]government expressed no remorse at all,
0:36:39.220,0:36:43.450
and they passed the Investigatory Powers[br]Act to legalize all the unlawful things
0:36:43.450,0:36:47.740
they'd already been doing. And they could[br]now order firms secretly to do anything they
0:36:47.740,0:36:56.730
physically can. However, data retention[br]was nixed by the European courts. The
0:36:56.730,0:37:01.920
academic response, on the next slide, was Keys[br]Under Doormats: much the same authors as
0:37:01.920,0:37:08.670
before. We analyzed the new situation and[br]came to much the same conclusions. Next
0:37:08.670,0:37:14.620
slide, the 2018 GCHQ[br]proposals from Ian Levy and Crispin
0:37:14.620,0:37:20.870
Robinson proposed to add ghost users to[br]WhatsApp and FaceTime calls in response to
0:37:20.870,0:37:25.860
warrants. The idea is that you've got an[br]FBI key on your device listening in. You still
0:37:25.860,0:37:30.110
have end-to-end, so you just have an extra[br]end. And this, of course, fails the Keys
0:27:30.110,0:27:34.380
Under Doormats tests: your software would[br]abandon best practice, it would create
0:37:34.380,0:37:39.690
targets and increase complexity and it[br]would also have to lie about trust. Next
0:37:39.690,0:37:49.310
slide, please. This brings us to the[br]upload filters which were proposed over
0:37:49.310,0:37:55.990
the past six months. They first surfaced[br]in early 2020 at a Stanford think tank and
0:37:55.990,0:38:00.960
they were adopted by Commissioner Ylva[br]Johansson on June the 9th at the start of
0:38:00.960,0:38:05.930
the German presidency. On the 20th of[br]September we got a leaked tech paper whose
0:38:05.930,0:38:11.650
authors include our GCHQ friends Ian Levy[br]and Crispin Robinson. The top option is
0:38:11.650,0:38:17.620
that you filter in client software[br]assisted by a server, as client side only
0:38:17.620,0:38:22.570
filtering is too constrained and easy to[br]compromise. The excuse is that you want to
0:38:22.570,0:38:28.520
stop illegal material such as child sex[br]abuse images being shared over end to end
0:38:28.520,0:38:34.210
messaging system such as WhatsApp. Various[br]NGOs objected, and we had a meeting with
0:38:34.210,0:38:39.580
the commission, which was a little bit[br]like a Stockholm Syndrome event. We had
0:38:39.580,0:38:43.750
one official there on the child protection[br]front, flanked by half a dozen officials from
0:38:43.750,0:38:48.610
various security bodies, departments and[br]agencies who seemed to be clearly driving
0:38:48.610,0:38:53.191
the thing, with child protection merely[br]being an excuse to push it.
0:38:53.191,0:39:00.360
Well, the obvious things to worry about[br]are these. As there's similar language in the new
0:39:00.360,0:39:04.730
terror regulation, you can expect the[br]filter to extend from child sex abuse
0:39:04.730,0:39:10.840
material to terror. And static filtering[br]won't work because if there's a bad list
0:39:10.840,0:39:15.380
of 100˙000 forbidden images, then the bad[br]people will just go out and make another
0:39:15.380,0:39:22.530
100˙000 child sex abuse images. So the[br]filtering will have to become dynamic. And
0:39:22.530,0:39:26.880
then the question is whether your phone[br]will block it or report it. And there's an
0:39:26.880,0:39:32.090
existing legal duty in a number of[br]countries, and in the UK too, although
0:39:32.090,0:39:37.310
obviously no longer a member state, to[br]report terror stuff. And
0:39:37.310,0:39:41.840
the question is, who will be in charge of[br]updating the filters? What's going to
0:39:41.840,0:39:50.750
happen then? Next slide. Well, we've seen[br]an illustration during the lockdown in
0:39:50.750,0:39:55.230
April, the French and Dutch governments[br]sent an update to all Encrochat mobile
0:39:55.230,0:39:59.450
phones with a rootkit which copied[br]messages, crypto keys and lock screen
0:39:59.450,0:40:04.460
passwords. Encrochat was a brand of[br]mobile phone that was sold through
0:40:04.460,0:40:11.001
underground channels to various criminal[br]groups and others. And since this was
0:40:11.001,0:40:18.119
largely used by criminals of various[br]kinds, the U.K. government justified bulk
0:40:18.119,0:40:24.160
intercept by passing it off as targeted[br]equipment interference. In other
0:40:24.160,0:40:28.600
words, they obtained a targeted warrant for[br]all forty-five thousand Encrochat handsets
0:40:28.600,0:40:33.400
and of the ten thousand users in the U.K.,[br]eight hundred were arrested in June when
0:40:33.400,0:40:39.680
the wire tapping exercise was completed.[br]Now, again, this appears to ignore the
0:40:39.680,0:40:44.450
laws that we have on the books because[br]even our Investigatory Powers Act rules
0:40:44.450,0:40:48.710
out bulk interception of U.K.[br]residents. And those who follow such
0:40:48.710,0:40:52.950
matters will know that there was a trial[br]at Liverpool Crown Court, a hearing on
0:40:52.950,0:40:59.369
whether this stuff was admissible. And we[br]should have a first verdict on that early
0:40:59.369,0:41:05.270
in the new year. And that will no doubt go[br]to appeal. And if the material is held to
0:41:05.270,0:41:09.820
be admissible, then there will be a whole[br]series of trials. So this brings me to my
0:41:09.820,0:41:17.050
final point. What can we expect going[br]forward? China is emerging as a full-stack
0:41:17.050,0:41:21.700
competitor to the West, not like Russia in[br]Cold War one, because Russia only ever
0:41:21.700,0:41:26.760
produced things like primary goods, like[br]oil, and weapons, and trouble, of course. But
0:41:26.760,0:41:30.690
China is trying to compete all the way up[br]and down the stack from chips, through
0:41:30.690,0:41:35.690
software, up through services and[br]everything else. And developments in China
0:41:35.690,0:41:40.850
don't exactly fill one with much[br]confidence, because in March 2018,
0:41:40.850,0:41:45.400
President Xi declared himself to be ruler[br]for life, basically tearing up the Chinese
0:41:45.400,0:41:50.280
constitution. There are large-scale state[br]crimes being committed in Tibet and
0:41:50.280,0:41:55.240
Xinjiang and elsewhere. Just last week,[br]Britain's chief rabbi described the
0:41:55.240,0:42:03.991
treatment of Uyghurs as an unfathomable[br]mass atrocity. In my book, I describe
0:42:03.991,0:42:09.280
escalating cyber conflict and various[br]hacks, such as the hack of the Office of
0:42:09.280,0:42:15.100
Personnel Management, which had clearance[br]files on all Americans who work for the
0:42:15.100,0:42:20.710
federal government, the hack of Equifax,[br]which got credit ratings and credit
0:42:20.710,0:42:25.560
histories of all Americans. And there are[br]also growing tussles in standards. For
0:42:25.560,0:42:32.840
example, the draft ISO 27553 on biometric[br]authentication for mobile phones is
0:42:32.840,0:42:38.080
introducing, at the insistence of Chinese[br]delegates, a central database option. So
0:42:38.080,0:42:43.480
in future, your phone might not verify[br]your faceprint or your fingerprint
0:42:43.480,0:42:50.440
locally. It might do it with a central[br]database. Next slide, how could Cold War
0:42:50.440,0:42:56.550
2.0 be different? Well, there's a number[br]of interesting things here, and the
0:42:56.550,0:43:00.960
purpose of this talk is to try and kick[br]off a discussion of these issues. China
0:43:00.960,0:43:06.120
makes electronics, not just guns, the way[br]the old USSR did. Can you have a separate
0:43:06.120,0:43:13.600
supply chain for China and one for[br]everybody else? But hang on a minute,
0:43:13.600,0:43:20.220
consider the fact that China has now[br]collected very substantial personal data
0:43:20.220,0:43:25.300
sets. From the Office of Personnel[br]Management, data on US government employees;
0:43:25.300,0:43:32.360
by forcing Apple to set up its own data[br]centers in China for iPhone users in
0:43:32.360,0:43:39.270
China, they get access to all the data[br]for Chinese users of iPhones that America
0:43:39.270,0:43:44.750
gets for American users of iPhones, plus[br]maybe more as well. If the Chinese can
0:43:44.750,0:43:50.690
break the HSMs in Chinese data centers as[br]we expect them to be able to. Equifax got
0:43:50.690,0:43:56.960
them data on all economically active[br]people in the USA. care.data gave them
0:43:56.960,0:44:02.390
medical records of everybody in the UK.[br]And this bulk personal data is already
0:44:02.390,0:44:08.470
being used in targeted intelligence work: when[br]Western countries, for example, send
0:44:08.470,0:44:13.640
diplomats to countries in Africa or Latin[br]America, the local Chinese counter-
0:44:13.640,0:44:16.870
intelligence people know whether they're[br]bona fide diplomats or whether they're
0:44:16.870,0:44:22.210
intelligence agents undercover, all[br]from exploitation of this personal
0:44:22.210,0:44:26.220
information. Now, given that this[br]information's already in effective targeted
0:44:26.220,0:44:31.970
use, the next question we have to ask is[br]when will it be used at scale? And this is
0:44:31.970,0:44:37.390
the point at which we say that the[br]equities issue now needs a serious rethink
0:44:37.390,0:44:43.830
and the whole structure of the conflict is[br]going to have to move from more offensive
0:44:43.830,0:44:49.540
to more defensive because we depend on[br]supply chains to which the Chinese have
0:44:49.540,0:44:55.460
access more than they depend on supply[br]chains to which we have access. Now, it's
0:44:55.460,0:45:01.190
dreadful that we're headed towards a new[br]Cold War, but as we head there, we have to
0:45:01.190,0:45:05.950
ask also about the respective roles of[br]governments, industry and civil society,
0:45:05.950,0:45:14.040
academia. Next slide, please. And so,[br]looking forward, my point is this: if Cold
0:45:14.040,0:45:18.860
War 2.0 does happen, and I hope it doesn't[br]but we appear to be headed that way
0:45:18.860,0:45:23.680
despite the change of government in the[br]White House, then we need to be able to
0:45:23.680,0:45:31.010
defend everybody, not just the elites. Now,[br]it's not going to be easy, because there
0:45:31.010,0:45:35.270
are more state players: the USA is a big[br]bloc, the EU is a big bloc. There are
0:45:35.270,0:45:39.650
other players: other democracies, there are[br]non-democracies, there are failing
0:45:39.650,0:45:45.210
democracies. This is going to be complex[br]and messy. It isn't going to be a
0:45:45.210,0:45:50.310
situation like last time, where big tech[br]reached out to civil society and academia
0:45:50.310,0:45:55.930
and we could see a united front against[br]the agencies. And even in that case, of
0:45:55.930,0:46:00.550
course, the victory that we got was only[br]an apparent victory, a superficial victory
0:46:00.550,0:46:06.410
that only lasted for a while. So what[br]could we do? Well, at this point, I think
0:46:06.410,0:46:10.960
we need to remind all the players that[br]it's not just about strategy
0:46:10.960,0:46:15.800
and tactics, but it's about values, too.[br]And so we need to be firmly on the side of
0:46:15.800,0:46:21.470
freedom, privacy and the rule of law. Now,[br]for the old timers, you may remember that
0:46:21.470,0:46:29.520
there was a product called Tom-Skype,[br]which was introduced in 2011 in China. The
0:46:29.520,0:46:34.470
Chinese wanted the citizens to be able to[br]use Skype, but they wanted to be able to
0:46:34.470,0:46:38.290
wiretap as well, despite the fact that[br]Skype at the time had end to end
0:46:38.290,0:46:44.520
encryption. And so people in China were[br]compelled to download a client for Skype
0:46:44.520,0:46:50.450
called Tom-Skype. Tom was the company that[br]distributed Skype in China and it
0:46:50.450,0:46:55.070
basically had built-in wiretapping. So[br]you had end-to-end encryption using Skype
0:46:55.070,0:47:01.240
in those days. But in China, you ended up[br]having a Trojan client, which you had to
0:47:01.240,0:47:08.240
use. And what is happening at the moment[br]is basically that the EU is trying to copy Tom-
0:47:08.240,0:47:13.440
Skype and saying that we should be doing[br]what China was doing eight years ago. And
0:47:13.440,0:47:17.540
I say we should reject that. We can't[br]challenge President Xi by going down that
0:47:17.540,0:47:21.970
route. Instead, we've got to reset our[br]values and we've got to think through the
0:47:21.970,0:47:27.600
equities issue and we've got to figure out[br]how it is that we're going to deal with
0:47:27.600,0:47:32.570
the challenges of dealing with non-[br]democratic countries when there is serious
0:47:32.570,0:47:40.620
conflict in a globalized world where we're[br]sharing the same technology. Thanks. And
0:47:40.620,0:47:52.230
perhaps the last slide for my book can[br]come now and I'm happy to take questions.
0:47:52.230,0:47:58.460
Herald: Yeah, thanks a lot, Ross, for your[br]talk. It's a bit depressing to listen to
0:47:58.460,0:48:09.510
you, I have to admit. Let's have a look.[br]OK, so I have a question. I'm wondering if
0:48:09.510,0:48:15.369
the export controls at EU level became[br]worse than UK level export controls
0:48:15.369,0:48:20.660
because entities like GCHQ had more[br]influence there or because there's a harmful
0:48:20.660,0:48:26.619
Franco-German security culture, or what it[br]was. Do you have anything on that?
0:48:26.619,0:48:30.890
Ross: Well, the experience that we had[br]with these export controls, once they were
0:48:30.890,0:48:38.260
in place, was as follows. It was about[br]2015 I think, or 2016. It came to our
0:48:38.260,0:48:43.800
attention that a British company, Sophos,[br]was selling bulk surveillance equipment to
0:48:43.800,0:48:49.330
President al Assad of Syria, and he was[br]using it to basically wiretap his entire
0:48:49.330,0:48:54.080
population and decide who he was going to[br]arrest and kill the following day. And it
0:48:54.080,0:48:58.530
was sold by Sophos in fact, through a[br]German subsidiary. And so we went along to
0:48:58.530,0:49:06.870
the export control office in Victoria[br]Street. A number of NGOs went along: the Open Rights
0:49:06.870,0:49:11.880
Group, Privacy International,[br]us, and one or two others. And we said,
0:49:11.880,0:49:16.480
look, according to the EU dual use[br]regulation, bulk intercept equipment is
0:49:16.480,0:49:19.950
military equipment. It should be in the[br]military list. Therefore, you should be
0:49:19.950,0:49:25.330
demanding an export license for this[br]stuff. And they found every conceivable
0:49:25.330,0:49:34.100
excuse not to demand it. And it was the[br]lady from GCHQ there in the room who was
0:49:34.100,0:49:38.280
clearly calling the shots. And she was[br]absolutely determined that there should be
0:49:38.280,0:49:44.040
no export controls on the stuff being sold[br]to Syria. And eventually I said, look,
0:49:44.040,0:49:47.260
it's fairly obvious what's going on here.[br]If there are going to be black boxes in
0:49:47.260,0:49:51.110
President al-Assad's network, you want[br]them to be British black boxes or German
0:49:51.110,0:49:55.960
black boxes, not Ukrainian or Israeli[br]black boxes. And she said, I cannot
0:49:55.960,0:50:00.830
discuss classified matters in an open[br]meeting, which is as close as you get to
0:50:00.830,0:50:06.840
an admission. And a couple of months[br]later, Angela Merkel, to her great credit,
0:50:06.840,0:50:12.640
actually came out in public and said[br]that allowing the equipment to be exported
0:50:12.640,0:50:16.440
from Utimaco to Syria was one of the[br]hardest decisions she'd ever taken as
0:50:16.440,0:50:21.770
chancellor. And that was a very difficult[br]tradeoff between maintaining intelligence
0:50:21.770,0:50:27.470
access, given the possibility that Western[br]troops would be involved in Syria and the
0:50:27.470,0:50:33.300
fact that the kit was being used for very[br]evil purposes. So that's an example of how
0:50:33.300,0:50:38.280
the export controls are used in practice.[br]They are not used to control the harms
0:50:38.280,0:50:44.330
that we as voters are told that they're[br]there to control. Right. They are used in
0:50:44.330,0:50:49.940
all sorts of dark and dismal games. And we[br]really have to tackle the issue of export
0:50:49.940,0:50:55.980
controls with our eyes open.[br]H: Yeah, yeah. There's a lot, a lot to do.
0:50:55.980,0:51:03.800
And now Germany has left the UN[br]Security Council. So let's see what
0:51:03.800,0:51:13.000
happens next. Yeah. We'll see, Ross.[br]Anything else you'd like to add? We don't
0:51:13.000,0:51:19.350
have any more questions. Oh, no, we have[br]another question. It's just come up
0:51:19.350,0:51:24.510
seconds ago. Do you think that refusal to[br]accept back doors will create large
0:51:24.510,0:51:35.300
uncensorable applications?[br]R: Well, if you get large applications
0:51:35.300,0:51:41.619
which are associated with significant[br]economic power, then pressure gets
0:51:41.619,0:51:51.450
brought to bear on those economic players[br]to do their social duty. And... this is what
0:51:51.450,0:51:56.520
we have seen with the platforms that[br]intermediate content, that act as content
0:51:56.520,0:52:00.220
intermediaries such as Facebook and Google[br]and so on, that they do a certain amount
0:52:00.220,0:52:08.510
of filtering. But if, on the other hand,[br]you have wholesale surveillance before the
0:52:08.510,0:52:13.690
fact of end-to-end encrypted stuff, then[br]are we moving into an environment where
0:52:13.690,0:52:19.200
private speech from one person to another[br]is no longer permitted? You know, I don't
0:52:19.200,0:52:24.490
think that's the right trade off that we[br]should be taking, because we all know from
0:52:24.490,0:52:28.780
hard experience that when governments say,[br]think of the children, they're not
0:52:28.780,0:52:32.090
thinking of children at all. If they were[br]thinking of children, they would not be
0:52:32.090,0:52:36.280
selling weapons to Saudi Arabia and the[br]United Arab Emirates to kill children in
0:52:36.280,0:52:41.850
Yemen. And they say think about terrorism.[br]But the censorship that we are supposed to
0:52:41.850,0:52:47.880
use in universities around terrorism, the[br]so-called Prevent duty, is known to be
0:52:47.880,0:52:52.280
counterproductive. It makes Muslim[br]students feel alienated and marginalized.
0:52:52.280,0:52:57.480
So the arguments that governments use[br]around this are not in any way honest. And
0:52:57.480,0:53:01.810
we now have 20 years' experience of these[br]dishonest arguments. And for goodness
0:53:01.810,0:53:05.550
sake, let's have a more grown up[br]conversation about these things.
0:53:05.550,0:53:11.700
H: Now, you're totally right, even if I[br]have to admit, it took me a couple of
0:53:11.700,0:53:24.660
years, not 20, but a long time, to finally[br]understand. OK, I think that's it; we
0:53:24.660,0:53:31.230
just have another comment and I'm thanking[br]you for your time and are you in an
0:53:31.230,0:53:36.680
assembly somewhere, hanging around[br]in the next hour or so? Maybe if someone
0:53:36.680,0:53:41.860
wants to talk to you, they can just pop by,[br]if you have used this 2D world
0:53:41.860,0:53:45.260
already.[br]R: No, I haven't been using the 2d world.
0:53:45.260,0:53:50.590
I had some issues with my browser[br]getting into it. But I've got my
0:53:50.590,0:53:55.380
webpage and my email address is public and[br]anybody who wants to discuss these things
0:53:55.380,0:53:59.740
is welcome to get in touch with me.[br]Herald: All right. So thanks a lot.
0:53:59.740,0:54:04.195
R: Thank you for the invitation.[br]H: Yeah. Thanks a lot.
0:54:04.195,0:54:07.800
rC3 postroll music
0:54:07.800,0:54:43.050
Subtitles created by c3subtitles.de[br]in the year 2020. Join, and help us!