1
00:00:00,000 --> 00:00:12,590
rC3 preroll music
2
00:00:12,590 --> 00:00:18,109
Herald: This is Ross Anderson, and he's
giving a talk to us today, and the title
3
00:00:18,109 --> 00:00:24,079
is What Price the Upload Filter? From Cold
War to Crypto Wars and Back Again. And
4
00:00:24,079 --> 00:00:31,489
we're very happy that he's here today. And
for our non-English speaking public,
5
00:00:31,489 --> 00:00:35,990
we have translations.
speaks german
6
00:00:35,990 --> 00:00:41,300
This talk is being translated into German.
speaks french
7
00:00:41,300 --> 00:00:49,839
This talk is also being translated
into French.
8
00:00:49,839 --> 00:00:56,769
Yeah. Um. Ross, ready to start? Let's go.
Have a good time. Enjoy.
9
00:00:56,769 --> 00:01:09,750
Ross: Yes, ready to go. Thanks. OK. As has
been said, I'm Ross Anderson and I'm in
10
00:01:09,750 --> 00:01:13,620
the position of being one of the old guys
of this field, in that I've been involved
11
00:01:13,620 --> 00:01:18,680
in the crypto wars right from the start,
and in fact even from before the Clipper
12
00:01:18,680 --> 00:01:27,510
chip actually came out. If we could go to
the slides, please.
13
00:01:27,510 --> 00:01:31,407
Right, can we see the slides?
14
00:01:31,407 --> 00:02:09,505
silence
15
00:02:09,505 --> 00:02:14,790
surprised the U.S. armed
forces. And guess what happened? Well, in
16
00:02:14,790 --> 00:02:21,060
the 1950s, Boris Hagelin had set up that
company, secretly sold it to the NSA and
17
00:02:21,060 --> 00:02:28,110
for a number of years, quite a lot of years,
countries as diverse as those of Latin America and
18
00:02:28,110 --> 00:02:33,130
India and even NATO countries such as
Italy were buying machines from Crypto AG,
19
00:02:33,130 --> 00:02:39,470
which the NSA could decipher. And this had
all sorts of consequences. For example,
20
00:02:39,470 --> 00:02:44,780
it's been revealed fairly recently that
Britain's success against Argentina in the
21
00:02:44,780 --> 00:02:50,680
Falklands War in 1982 was to a large
extent due to signals intelligence that
22
00:02:50,680 --> 00:02:58,790
came from these machines. So, next slide,
please. And in this prehistory of the
23
00:02:58,790 --> 00:03:03,600
crypto wars, almost all the play was
between governments. There was very little
24
00:03:03,600 --> 00:03:08,180
role for civil society. There were one or
two journalists who were engaged in trying
25
00:03:08,180 --> 00:03:13,870
to map what the NSA and friends were up to.
As far as industry was concerned, well, at
26
00:03:13,870 --> 00:03:18,180
that time, I was working in banking and we
found that encryption for confidentiality
27
00:03:18,180 --> 00:03:22,319
was discouraged. If we tried to use line
encryption, then faults mysteriously
28
00:03:22,319 --> 00:03:26,880
appeared on the line. But authentication
was OK. We were allowed to encrypt PIN
29
00:03:26,880 --> 00:03:32,380
blocks, and we were allowed to put
MACs on messages. There was some minor
30
00:03:32,380 --> 00:03:37,250
harassment. For example, when Rivest,
Shamir and Adleman came up with their
31
00:03:37,250 --> 00:03:42,650
encryption algorithm, the NSA tried to
make it classified. But the Provost of
32
00:03:42,650 --> 00:03:47,840
MIT, Jerome Wiesner, persuaded them not to
make that fight. The big debate in the
33
00:03:47,840 --> 00:03:52,880
1970s, though, was whether the NSA had affected
the design of the Data Encryption Standard
34
00:03:52,880 --> 00:03:57,800
algorithm, and we know now that this was
the case. It was designed to be only just
35
00:03:57,800 --> 00:04:04,330
strong enough and Whit Diffie predicted
back in the 1970s that 2 to the power of
36
00:04:04,330 --> 00:04:08,920
56 key search would eventually be
feasible. The EFF built a machine in 1998
37
00:04:08,920 --> 00:04:13,510
and now of course that's fairly easy
because each bitcoin block costs 2 to
38
00:04:13,510 --> 00:04:20,169
the power of 68 calculations.
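As a quick sanity check on those figures, here's a sketch in Python; the two exponents are simply the ones quoted in the talk.

```python
# Figures quoted in the talk: DES has a 2^56 keyspace, and one bitcoin
# block costs about 2^68 hash computations.
des_keyspace = 2 ** 56
bitcoin_block_work = 2 ** 68

# The hashing in a single block would cover the DES keyspace 4096 times.
print(bitcoin_block_work // des_keyspace)  # 4096
```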
39
00:04:20,169 --> 00:04:25,919
Next slide, please. So where things get
interesting is that the NSA persuaded Bill Clinton,
in one of his first cabinet meetings in 1993, to
40
00:04:25,919 --> 00:04:30,270
introduce key escrow, the idea that the
NSA should have a copy of everybody's
41
00:04:30,270 --> 00:04:36,860
keys. And one of the people at that
meeting admitted later that President Bush,
42
00:04:36,860 --> 00:04:41,280
the elder, had been asked and had refused,
but Clinton, when he came into office, was
43
00:04:41,280 --> 00:04:46,300
naive and thought that this was an
opportunity to fix the world. Now, the
44
00:04:46,300 --> 00:04:50,159
Clipper chip, which we can see here, was
tamper-resistant and had a secret block
45
00:04:50,159 --> 00:04:58,620
cipher with an NSA backdoor key. And the
launch product was an AT&T secure phone.
46
00:04:58,620 --> 00:05:05,030
Next slide, please. Now the Clipper protocol
was an interesting one in that each chip
47
00:05:05,030 --> 00:05:12,210
had a unique secret key KU and a global
secret family key KNSA burned in. And in
48
00:05:12,210 --> 00:05:18,189
order to, say, send data to Bob, Alice had
to send her Clipper chip a working key KW,
49
00:05:18,189 --> 00:05:22,890
which was generated by some external means,
such as a Diffie-Hellman key exchange. And
50
00:05:22,890 --> 00:05:28,430
it makes a Law Enforcement Access Field,
or LEAF, which was basically Alice and Bob's names
51
00:05:28,430 --> 00:05:33,529
with the working key encrypted under the
unit key and then a hash of the working
52
00:05:33,529 --> 00:05:38,449
key encrypted under the NSA key. And that
was sent along with the cipher text to make
53
00:05:38,449 --> 00:05:43,209
authorized wiretapping easy. And the idea
with the hash was that this would stop
54
00:05:43,209 --> 00:05:47,830
cheating. Bob's Clipper chip wouldn't use
a working key unless it came with a valid LEAF.
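To make the construction concrete, here is a minimal Python sketch of a LEAF as just described. It's a toy model under stated assumptions: the real chip used the then-classified Skipjack cipher, and the field sizes and helper names (toy_encrypt, make_leaf, leaf_valid) are illustrative, not the actual specification.

```python
import hashlib

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # Stand-in for Skipjack, which was classified at the time: XOR the
    # data with a hash-derived keystream. Illustration only, not secure.
    stream, counter = b"", 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def make_leaf(names: bytes, ku: bytes, k_nsa: bytes, kw: bytes) -> bytes:
    """Toy Law Enforcement Access Field as the talk describes it: Alice
    and Bob's names, the working key KW encrypted under the chip's unit
    key KU, and a short hash of KW encrypted under the family key KNSA."""
    tag = hashlib.sha256(kw).digest()[:2]     # only 16 bits, as in Clipper
    return names + toy_encrypt(ku, kw) + toy_encrypt(k_nsa, tag)

def leaf_valid(leaf: bytes, k_nsa: bytes, kw: bytes) -> bool:
    """What Bob's chip can check: every chip holds the family key KNSA,
    so it decrypts the 16-bit tag and compares it with a hash of the
    working key it was given."""
    tag = toy_encrypt(k_nsa, leaf[-2:])       # XOR stream: decrypt = encrypt
    return tag == hashlib.sha256(kw).digest()[:2]
```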
55
00:05:47,830 --> 00:05:53,889
And a few of us can still remember the
enormous outcry this caused at the time.
56
00:05:53,889 --> 00:05:57,819
American companies in particular didn't like it
57
00:05:57,819 --> 00:06:02,439
because they started losing business to
foreign firms. And in fact, a couple of
58
00:06:02,439 --> 00:06:07,499
our students here at Cambridge started a
company, nCipher, that grew to be quite
59
00:06:07,499 --> 00:06:13,300
large because they could sell worldwide,
unlike US firms. People said, why don't we
60
00:06:13,300 --> 00:06:17,019
use encryption software? Well, that's easy
to write, but it's hard to deploy at
61
00:06:17,019 --> 00:06:22,749
scale, as Phil Zimmermann found with PGP.
And the big concern was whether key escrow
62
00:06:22,749 --> 00:06:28,620
would kill electronic commerce. A
secondary concern was how on earth we
63
00:06:28,620 --> 00:06:32,389
would know if government designs were
secure. Why on earth should you trust the
64
00:06:32,389 --> 00:06:40,669
NSA? Next slide, please. Well, the first
serious fight back in the crypto wars came
65
00:06:40,669 --> 00:06:45,259
when Matt Blaze at Bell Labs found an
attack on Clipper. He found that Alice
66
00:06:45,259 --> 00:06:52,339
could just try lots of LEAFs until one of
them worked, because the tag was only 16
67
00:06:52,339 --> 00:06:57,589
bits long, and it turned out that 2 to the
power of 112 of the 2 to the power of
68
00:06:57,589 --> 00:07:03,749
128 possibilities worked. And this meant
that Alice could generate a bogus LEAF
69
00:07:03,749 --> 00:07:07,589
that would pass inspection, but which
wouldn't decrypt the traffic, and Bob
70
00:07:07,589 --> 00:07:11,879
could also generate a new LEAF on the fly.
So you could write non-interoperable rogue
71
00:07:11,879 --> 00:07:16,691
applications that the NSA had no access
to. And with a bit more work, you could
72
00:07:16,691 --> 00:07:21,479
make rogue applications interoperate with
official ones. This was only the first of many dumb ideas.
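Here is the brute-force idea as a hedged sketch, reusing the toy leaf_valid check from the sketch above: because only 16 bits are verified, a random blob passes after about 2^16 tries, which even 1990s hardware could do quickly.

```python
import secrets

def forge_leaf(k_nsa: bytes, kw: bytes) -> tuple[bytes, int]:
    """Keep generating random LEAF-sized blobs until one happens to carry
    a valid 16-bit tag: expected work is about 2**16 = 65536 tries. The
    forged LEAF passes the chip's check, but its encrypted-unit-key field
    is garbage, so it tells the NSA nothing about the real working key."""
    tries = 0
    while True:
        tries += 1
        candidate = secrets.token_bytes(34)    # sized like our toy LEAF
        if leaf_valid(candidate, k_nsa, kw):   # from the sketch above
            return candidate, tries
```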
73
00:07:21,479 --> 00:07:30,089
Next slide, please. OK,
so why don't people just use software?
74
00:07:30,089 --> 00:07:36,189
Well, at that time, the US had export
controls on intangible goods such as
75
00:07:36,189 --> 00:07:40,430
software, although European countries
generally didn't. And this meant that US
76
00:07:40,430 --> 00:07:45,610
academics couldn't put crypto code online,
although we Europeans could and we
77
00:07:45,610 --> 00:07:52,059
did. And so Phil Zimmermann achieved fame
by exporting PGP, Pretty Good Privacy, some
78
00:07:52,059 --> 00:07:56,099
encryption software he had written, from
America as a paper book. And this was
79
00:07:56,099 --> 00:08:00,509
protected by the First Amendment. They
sent it across the border to Canada. They
80
00:08:00,509 --> 00:08:04,360
fed it into an optical character
recognition scanner. They recompiled it
81
00:08:04,360 --> 00:08:08,889
and the code had escaped. For this Phil
was subjected to a grand jury
82
00:08:08,889 --> 00:08:13,979
investigation. There was also the
Bernstein case around code as free speech
83
00:08:13,979 --> 00:08:19,059
and Bruce Schneier rose to fame with his
book "Applied Cryptography", which had
84
00:08:19,059 --> 00:08:24,249
protocols, algorithms and source code in C,
which you could type in in order to get
85
00:08:24,249 --> 00:08:31,960
cryptographic algorithms anywhere. And we
saw export-controlled clothing. This
86
00:08:31,960 --> 00:08:36,389
t-shirt was something that many people wore
at the time. I've actually got one and I
87
00:08:36,389 --> 00:08:41,089
planned to wear it for this. But
unfortunately, I came into the lab in
88
00:08:41,089 --> 00:08:47,230
order to get better connectivity and I
left it at home. So this t-shirt was an
89
00:08:47,230 --> 00:08:53,009
implementation of RSA written in Perl,
plus a barcode so that you could scan it in.
90
00:08:53,009 --> 00:08:58,350
And in theory, you should not walk across
the border wearing this t-shirt. Or if
91
00:08:58,350 --> 00:09:03,270
you're a US citizen, you shouldn't even
let a non-US citizen look at it. So by
92
00:09:03,270 --> 00:09:08,579
these means, people probed the outskirts of
what was possible and, you know, an awful
93
00:09:08,579 --> 00:09:17,070
lot of fun was had. It was a good laugh to
tweak the Tyrannosaur's tail. Next slide.
94
00:09:17,070 --> 00:09:25,030
But this wasn't just something that was
limited to the USA. The big and obvious
95
00:09:25,030 --> 00:09:30,230
problem, if you try and do key escrow in
Europe, is that there are dozens of
96
00:09:30,230 --> 00:09:35,810
countries in Europe and what happens if
someone from Britain, for example, has got
97
00:09:35,810 --> 00:09:40,100
a mobile phone that they bought in France
or a German SIM card and they're standing
98
00:09:40,100 --> 00:09:44,509
on the streets in Stockholm and they phone
somebody who's in Budapest, who's got a
99
00:09:44,509 --> 00:09:48,699
Hungarian phone with a Spanish SIM card
in it, then which of these countries'
100
00:09:48,699 --> 00:09:54,260
secret police forces should be able to
listen to the call? And this was something
101
00:09:54,260 --> 00:09:59,930
that stalled the progress of key escrow,
that's a good way to describe it, in Europe.
102
00:09:59,930 --> 00:10:06,839
And in 1996 GCHQ got academic colleagues
at Royal Holloway to come up with a
103
00:10:06,839 --> 00:10:11,860
proposal for public sector email, which
they believed would fix this. Now, at the
104
00:10:11,860 --> 00:10:18,919
time, after Clipper had fallen into
disrepute, the NSA's proposal was that
105
00:10:18,919 --> 00:10:24,130
certification authorities should also have
to be licensed, and that this would enforce
106
00:10:24,130 --> 00:10:27,879
a condition that all private keys would be
escrowed, so you would only be able to get a
107
00:10:27,879 --> 00:10:33,530
signature on your public key if the
private key was held by the CA. And
108
00:10:33,530 --> 00:10:38,440
the idea was that you'd have one CA for
each government department and civilians
109
00:10:38,440 --> 00:10:42,190
would use trusted firms like Barclays Bank
or the Post Office, which would keep our
110
00:10:42,190 --> 00:10:48,009
keys safe. And it would also work across
other EU member states, so that somebody in
111
00:10:48,009 --> 00:10:53,870
Britain calling somebody in Germany would
end up in a situation where a trustworthy
112
00:10:53,870 --> 00:10:59,220
CA from the NSA's point of view, that is,
an untrustworthy CA from our point of view,
113
00:10:59,220 --> 00:11:03,680
in Britain would be prepared to make a key
and so would one in Germany. This, at
114
00:11:03,680 --> 00:11:10,630
least, was the idea. So how did they do this?
Next slide, on the GCHQ protocol. So here's
115
00:11:10,630 --> 00:11:14,829
how it was designed to work in the UK
government. If Alice at the Department of
116
00:11:14,829 --> 00:11:20,120
Agriculture wants to talk to Bob at the
Department of Business, she asks her
117
00:11:20,120 --> 00:11:26,089
Departmental Security Officer, DA, for a send
key for herself and a receive key for Bob.
118
00:11:26,089 --> 00:11:35,360
And DA and DB get a top-level
interoperability key KTAB from GCHQ, and DA
119
00:11:35,360 --> 00:11:44,120
calculates a secret send key of the day as
a hash of KTAB and Alice's name and the
120
00:11:44,120 --> 00:11:50,430
DA's own identity for Alice, which he gives
to Alice, and similarly a public receive
121
00:11:50,430 --> 00:11:55,481
key of the day for Bob. Alice sends Bob
her public send key along with the
122
00:11:55,481 --> 00:12:00,379
encrypted message and Bob can go
to his DSO and get his secret receive
123
00:12:00,379 --> 00:12:06,120
key of the day.
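Here is a minimal sketch of how such a key-of-the-day scheme could work, assuming, as the talk's description suggests, that each secret key is a hash of KTAB, the user's name and the date, with the matching public key being the corresponding Diffie-Hellman value. The group parameters, names and dates are toy assumptions, not the actual Royal Holloway specification.

```python
import hashlib

# Toy Diffie-Hellman group: a Mersenne prime, far too small for real use.
P = 2 ** 127 - 1
G = 3

def secret_key_of_day(ktab: bytes, name: str, day: str) -> int:
    """Hedged reading of the derivation described in the talk: a user's
    secret key of the day is a hash of the interdepartmental key KTAB,
    the user's name and the date."""
    h = hashlib.sha256(ktab + name.encode() + day.encode()).digest()
    return int.from_bytes(h, "big") % P

ktab = b"top-level key from GCHQ, shared by DA and DB"   # illustrative

# Either DSO can derive anyone's key, since both hold KTAB; that is
# exactly the escrow property the agencies wanted.
s_alice = secret_key_of_day(ktab, "alice@agriculture", "1996-07-01")
s_bob = secret_key_of_day(ktab, "bob@business", "1996-07-01")

# Alice gets her secret send key and Bob's public receive key of the day;
# Bob later fetches his secret receive key and derives the same secret.
shared_alice = pow(pow(G, s_bob, P), s_alice, P)
shared_bob = pow(pow(G, s_alice, P), s_bob, P)
assert shared_alice == shared_bob
```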
124
00:12:06,120 --> 00:12:11,299
Now, this is slightly complicated, and there's
all sorts of other things wrong with it once
you start to look at it. Next slide, please. The first
125
00:12:11,299 --> 00:12:14,790
is that from the point of view of the
overall effect, you could just as easily
126
00:12:14,790 --> 00:12:19,080
have used Kerberos because you've
basically got a key distribution center at
127
00:12:19,080 --> 00:12:24,630
both ends, which knows everybody's keys. So
you've not actually gained very much by
128
00:12:24,630 --> 00:12:31,081
using complicated public key mechanisms,
and the next problem is what's the law
129
00:12:31,081 --> 00:12:36,390
enforcement access need for centrally
generated signing keys? If this is
130
00:12:36,390 --> 00:12:40,480
actually for law enforcement rather than
intelligence? Well, the police want to be
131
00:12:40,480 --> 00:12:47,480
able to read things, not forge things. A
third problem is that keys involve hashing
132
00:12:47,480 --> 00:12:52,110
department names, and governments are
changing the names of departments all
133
00:12:52,110 --> 00:12:56,980
the time, as the prime minister of the day
moves his ministers around and they chop
134
00:12:56,980 --> 00:13:01,810
and change departments. And this means, of
course, that everybody has to get new
135
00:13:01,810 --> 00:13:06,320
cryptographic keys and suddenly the old
cryptographic keys don't work anymore. And
136
00:13:06,320 --> 00:13:10,800
horrendous complexity comes from
this. Now, there are about 10 other things
137
00:13:10,800 --> 00:13:15,420
wrong with this protocol, but curiously
enough, it's still used by the UK
138
00:13:15,420 --> 00:13:19,090
government for the top secret stuff. It
went through a number of iterations. It's
139
00:13:19,090 --> 00:13:23,939
now called MIKEY-SAKKE; there are details in
my Security Engineering book. And it
140
00:13:23,939 --> 00:13:28,129
turned out to be such a pain that the
stuff below top secret now just uses
141
00:13:28,129 --> 00:13:32,959
a branded version of G Suite. So if what
you want to do is to figure out what
142
00:13:32,959 --> 00:13:37,050
speech Boris Johnson will be making
tomorrow, we just have to guess the
143
00:13:37,050 --> 00:13:44,459
password recovery questions for his
private secretaries and officials. Next
144
00:13:44,459 --> 00:13:51,060
slide, the Global Internet Trust Register.
This was an interesting piece of fun we
145
00:13:51,060 --> 00:13:55,929
had around the 1997 election, when Tony
Blair took over and brought in the Labour
146
00:13:55,929 --> 00:14:00,439
government. Before the election, Labour had
promised not to seize crypto keys in bulk
147
00:14:00,439 --> 00:14:04,499
without a warrant. And one of the
first things that happened to him once he
148
00:14:04,499 --> 00:14:09,569
was in office was that Vice President Al Gore
went to visit him, and all of a sudden Tony
149
00:14:09,569 --> 00:14:13,449
Blair decided that he wanted all
certification authorities to be licensed
150
00:14:13,449 --> 00:14:18,520
and they were about to rush this through
parliament. So we put all the important
151
00:14:18,520 --> 00:14:23,290
public keys in a paper book and we took it
to the cultural secretary, Chris Smith,
152
00:14:23,290 --> 00:14:27,790
and we said, you're the minister for books,
why are you passing a law to ban this
153
00:14:27,790 --> 00:14:32,580
book? And if you'll switch to the video
shot, I've got the initial copy of the
154
00:14:32,580 --> 00:14:36,579
book that we just put together on the
photocopying machine in the department.
155
00:14:36,579 --> 00:14:42,200
And then we sent the PDF off to MIT and
they produced it as a proper book. And
156
00:14:42,200 --> 00:14:48,350
this meant that we had a book which was
supposedly protected, and this enabled us
157
00:14:48,350 --> 00:14:55,209
to get the topic onto the agenda for
cabinet discussion. So this at least
158
00:14:55,209 --> 00:14:59,779
prevented precipitous action; we ended up with the
Regulation of Investigatory Powers Bill in
159
00:14:59,779 --> 00:15:04,830
2000. That was far from perfect, but that
was a longer story. So what happened back
160
00:15:04,830 --> 00:15:09,860
then is that we set up an NGO, a digital
rights organization, the Foundation for
161
00:15:09,860 --> 00:15:16,620
Information Policy Research. And the
climate at the time was such that we had
162
00:15:16,620 --> 00:15:22,310
no difficulty raising a couple of hundred
thousand pounds from Microsoft and Hewlett
163
00:15:22,310 --> 00:15:28,370
Packard and Redbus and other tech players.
So we were able to hire Caspar Bowden for
164
00:15:28,370 --> 00:15:32,199
three years to basically be the director
of FIPR and to lobby the government hard
165
00:15:32,199 --> 00:15:38,490
on this. And if we can go back to the
slides, please, and go to the next slide,
166
00:15:38,490 --> 00:15:47,170
the slide on bringing it all together. So
in 1997, a number of us, Hal Abelson and I
167
00:15:47,170 --> 00:15:54,569
and Steve Bellovin and Josh Benaloh from
Microsoft and Matt Blaze who had broken
168
00:15:54,569 --> 00:15:59,430
Clipper and Whit Diffie, who invented
digital signatures, and John Gilmore of
169
00:15:59,430 --> 00:16:05,689
EFF, Peter Neumann of SRI, Ron Rivest,
Jeff Schiller of MIT and Bruce Schneier
170
00:16:05,689 --> 00:16:09,339
who had written Applied Cryptography,
got together and wrote a paper on the
171
00:16:09,339 --> 00:16:13,830
risks of key recovery, key escrow and
trusted third-party encryption, where we
172
00:16:13,830 --> 00:16:18,470
discussed the system consequences of
giving third party or government access to
173
00:16:18,470 --> 00:16:23,850
both traffic data and content without user
notice or consent, deployed internationally
174
00:16:23,850 --> 00:16:27,550
and available around the clock. We came to
the conclusion that this was not really
175
00:16:27,550 --> 00:16:33,550
doable. There were simply too many
vulnerabilities and too many complexities.
176
00:16:33,550 --> 00:16:38,899
So how did it end? Well, if we go to the
next slide, the victory in Europe wasn't
177
00:16:38,899 --> 00:16:44,259
as a result of academic arguments. It was
a result of industry pressure. And we owe
178
00:16:44,259 --> 00:16:48,750
a debt to Commissioner Martin Bangemann
and also to the German government who
179
00:16:48,750 --> 00:16:56,699
backed him. And in 1994, Martin had put
together a group of European CEOs to
180
00:16:56,699 --> 00:17:01,190
advise him on internet policy. And they
advised him: keep your hands off until
181
00:17:01,190 --> 00:17:04,160
we can see which way it's going; let's see what's
wrong with this thing and see what we
182
00:17:04,160 --> 00:17:10,670
can do with it. And the thing that he
developed in order to drive a stake
183
00:17:10,670 --> 00:17:15,180
through the heart of key escrow was the
Electronic Signatures Directive in 1999.
184
00:17:15,180 --> 00:17:20,330
And this gave a rebuttable presumption of
validity to qualifying electronic
185
00:17:20,330 --> 00:17:24,640
signatures, but subject to a number of
conditions. And one of these was that the
186
00:17:24,640 --> 00:17:29,570
signing key must never be known to anybody
other than the signer, and this killed
187
00:17:29,570 --> 00:17:37,120
the idea of licensing CAs in such a way
that the NSA had access to all the
188
00:17:37,120 --> 00:17:41,390
private key material. The agencies had
argued that without controlling
189
00:17:41,390 --> 00:17:45,260
signatures, you couldn't control
encryption. But of course, as intelligence
190
00:17:45,260 --> 00:17:49,440
agencies, they were as much interested in
manipulating information as they were in
191
00:17:49,440 --> 00:17:57,280
listening in to it. And this created a
really sharp conflict with businesses. In
192
00:17:57,280 --> 00:18:00,770
the U.K., the Regulation of
Investigatory Powers Bill went through the
193
00:18:00,770 --> 00:18:05,540
following year. And there we got strong
support from the banks who did not want
194
00:18:05,540 --> 00:18:10,600
the possibility of intelligence and law
enforcement personnel either getting hold
195
00:18:10,600 --> 00:18:16,380
of bank keys or forging banking
transactions. And so we managed to, with
196
00:18:16,380 --> 00:18:20,720
their help, to insert a number of
conditions into the bill, which meant that
197
00:18:20,720 --> 00:18:25,520
if a court or chief constable, for
example, demands a key from a company,
198
00:18:25,520 --> 00:18:29,400
they've got to demand it from somebody at
the level of a director of the company.
199
00:18:29,400 --> 00:18:33,800
And it's got to be signed by someone
really senior such as the chief constable.
200
00:18:33,800 --> 00:18:38,910
So there were some controls that we managed
to get in there. Next slide! What did
201
00:18:38,910 --> 00:18:44,660
victory in the USA look like? Well, in the
middle of 2000 as a number of people had
202
00:18:44,660 --> 00:18:49,170
predicted, Al Gore decided that he wanted
to stop fighting the tech industry in
203
00:18:49,170 --> 00:18:54,410
order to get elected president. And there
was a deal done at the time which was
204
00:18:54,410 --> 00:19:01,070
secret. It was done at the FBI
headquarters at Quantico, whereby US law
205
00:19:01,070 --> 00:19:04,880
enforcement would rely on naturally
occurring vulnerabilities rather than
206
00:19:04,880 --> 00:19:10,070
compelling their insertion by companies
like Intel or Microsoft. This was secret
207
00:19:10,070 --> 00:19:15,017
at the time, and I happened to know about it
because I was consulting for Intel and the
208
00:19:15,017 --> 00:19:20,800
NDA I was under had a four-year time
limit on it. So after 2004, I was at
209
00:19:20,800 --> 00:19:25,760
liberty to talk about this. And so this
basically gave the NSA access to the CERT
210
00:19:25,760 --> 00:19:30,930
feed. And so as part of this deal, the
export rules were liberalized a bit, but
211
00:19:30,930 --> 00:19:38,090
with various hooks and gotchas left so
that the authorities could bully companies
212
00:19:38,090 --> 00:19:45,580
who got too difficult. And in 2002, Robert
Morris, senior, who had been the chief
213
00:19:45,580 --> 00:19:50,740
scientist at the NSA for much of this
period, admitted that the real policy goal
214
00:19:50,740 --> 00:19:54,540
was to ensure that the many systems
developed during the dot com boom were
215
00:19:54,540 --> 00:20:02,190
deployed with weak protection or none. And
there's a huge, long list of these. Next
216
00:20:02,190 --> 00:20:11,430
slide, please. So what was the collateral
damage from crypto war one? This is the
217
00:20:11,430 --> 00:20:15,310
first new part of this talk, which
I've put together as a result of spending
218
00:20:15,310 --> 00:20:20,920
the last academic year writing the third
edition of my book on security engineering
219
00:20:20,920 --> 00:20:26,520
as I've gone through and updated all the
chapters on car security, the role of
220
00:20:26,520 --> 00:20:32,510
security and web security and so on and so
forth, we find everywhere that there are
221
00:20:32,510 --> 00:20:38,050
still very serious costs remaining from
crypto war one. For example, almost all of
222
00:20:38,050 --> 00:20:43,430
the remote key entry systems for cars use
inadequate cryptography, poor random
223
00:20:43,430 --> 00:20:48,290
number generators and so on and so forth.
And car theft has almost doubled in the
224
00:20:48,290 --> 00:20:55,380
past five years. This is not all due to
weak crypto, but it's substantially due to
225
00:20:55,380 --> 00:21:01,323
a wrong culture that was started off in
the context of the crypto wars. Second,
226
00:21:01,323 --> 00:21:06,430
there are millions of door locks still
using MIFARE Classic, even in the building
227
00:21:06,430 --> 00:21:12,030
where I work. For example, the University
of Cambridge changed its door locks around
228
00:21:12,030 --> 00:21:17,040
2000. So we've still got a whole lot of
MIFARE Classic around. And it's very
229
00:21:17,040 --> 00:21:21,150
difficult when you've got 100 buildings to
change all the locks on them. And this is
230
00:21:21,150 --> 00:21:26,250
the case with thousands of organizations
worldwide, with universities, with banks,
231
00:21:26,250 --> 00:21:30,990
with all sorts of people, simply because
changing all the locks at once in dozens
232
00:21:30,990 --> 00:21:35,770
of buildings is just too expensive. Then,
of course, there's the CA in your
233
00:21:35,770 --> 00:21:40,990
browser, most nations own or control
certification authorities that your
234
00:21:40,990 --> 00:21:47,380
browser trusts and the few nations that
weren't allowed to own such CAs, such as
235
00:21:47,380 --> 00:21:53,040
Iran, get up to mischief, as we find in
the case of the DigiNotar hack a few years
236
00:21:53,040 --> 00:21:59,000
ago. And this means that most nations have
got a more or less guaranteed ability to
237
00:21:59,000 --> 00:22:05,770
do man in the middle attacks on your Web
log ons. Some companies like Google, of
238
00:22:05,770 --> 00:22:11,410
course, started to fix that with various
mechanisms such as certificate pinning.
239
00:22:11,410 --> 00:22:16,160
But that was a deliberate vulnerability
that was there for a long, long time and
240
00:22:16,160 --> 00:22:22,410
is still very widespread. Phones. 2G is
insecure. That actually goes back to the
241
00:22:22,410 --> 00:22:27,281
Cold War rather than the crypto war. But
thanks to the crypto wars 4G and 5G are
242
00:22:27,281 --> 00:22:32,450
not very much better. The details are
slightly complicated and again, they're
243
00:22:32,450 --> 00:22:37,950
described in the book, Bluetooth is easy
to hack. That's another piece of legacy.
244
00:22:37,950 --> 00:22:43,380
And as I mentioned, the agencies own the
CERT's responsible disclosure pipeline,
245
00:22:43,380 --> 00:22:47,690
which means that they get a free firehose
of zero-days that they can exploit
246
00:22:47,690 --> 00:22:53,780
for perhaps a month or three before these
end up being patched. So next slide,
247
00:22:53,780 --> 00:23:02,600
please. Last year when I talked at Chaos
Communication Congress, the audience chose
248
00:23:02,600 --> 00:23:08,900
this as the cover for my security
engineering book, and that's now out. And
249
00:23:08,900 --> 00:23:12,730
it's the process of writing this that
brought home to me the scale of the damage
250
00:23:12,730 --> 00:23:18,450
that we still suffer as a result of
crypto war one. So let's move on to the
251
00:23:18,450 --> 00:23:24,610
next slide and the next period of history,
which we might call the war on terror. And
252
00:23:24,610 --> 00:23:30,980
I've arbitrarily put this down as 2000 to
2013, although some countries stopped using
253
00:23:30,980 --> 00:23:36,790
the phrase war on terror in about 2008
once we had got rid of George W. Bush and
254
00:23:36,790 --> 00:23:41,330
Tony Blair. But as a historical
convenience, this is, if you like, the
255
00:23:41,330 --> 00:23:46,140
central period in our tale. And it starts
off with a lot of harassment around the
256
00:23:46,140 --> 00:23:54,700
edges of security and cryptography. For
example, in 2000, Tony Blair promoted the
257
00:23:54,700 --> 00:24:02,290
EU dual use regulation number 1334 to
extend export controls from tangible goods
258
00:24:02,290 --> 00:24:07,810
such as rifles and tanks to intangibles
such as crypto software. Despite the fact
259
00:24:07,810 --> 00:24:13,980
that he had basically declared peace on
the tech industry. Two years later, in
260
00:24:13,980 --> 00:24:18,090
2002, the UK parliament balked at an
export control bill that was going to
261
00:24:18,090 --> 00:24:24,140
transpose this because it added controls
on scientific speech, not just crypto
262
00:24:24,140 --> 00:24:28,900
code, but even papers on cryptanalysis and
even electron microscope scripts and
263
00:24:28,900 --> 00:24:33,300
so parliament inserted the research
exemption clause at the urging of the
264
00:24:33,300 --> 00:24:39,420
then president of the Royal Society, Sir
Robert May. But what then happened is that
265
00:24:39,420 --> 00:24:45,820
GCHQ used EU regulations to frustrate
Parliament and this pattern of extralegal
266
00:24:45,820 --> 00:24:51,700
behavior was to continue. Next slide!
Because after export control, the play
267
00:24:51,700 --> 00:24:57,310
shifted to traffic data retention, another
bad thing that I'm afraid to say, the UK
268
00:24:57,310 --> 00:25:02,630
exported to Europe back in the days when
we were, in effect, the Americans'
269
00:25:02,630 --> 00:25:08,530
consigliere on the European Council. Sorry
about that, folks, but all I can say is at
270
00:25:08,530 --> 00:25:15,900
least we helped start EDRi a year after
that. So one of the interesting aspects of
271
00:25:15,900 --> 00:25:20,590
this was that our then home secretary,
Jacqui Smith, started talking about the
272
00:25:20,590 --> 00:25:26,080
need for a common database of all the
metadata of who had phoned whom when, who
273
00:25:26,080 --> 00:25:30,720
had sent an email to whom when, so that
the police could continue to use the
274
00:25:30,720 --> 00:25:35,340
traditional contact tracing techniques
online. And the line that we got hammered
275
00:25:35,340 --> 00:25:39,500
home to us again and again and again was:
if you've got nothing to hide, you've got
276
00:25:39,500 --> 00:25:47,490
nothing to fear. What then happened in
2008, is that a very bad person went into
277
00:25:47,490 --> 00:25:53,550
Parliament and went to the PC where the
expense claims of MPs were kept and they
278
00:25:53,550 --> 00:25:58,630
copied all the expense claims onto a DVD
and they sold it around Fleet Street. And
279
00:25:58,630 --> 00:26:03,450
so The Daily Telegraph bought it from them
for £400,000. And then for the best
280
00:26:03,450 --> 00:26:07,500
part of a year, the Daily Telegraph was
telling scandalous things about what
281
00:26:07,500 --> 00:26:12,170
various members of parliament had claimed
from the taxpayer. But it turned out that
282
00:26:12,170 --> 00:26:15,730
although Jacqui Smith may have been innocent,
her husband had been downloading
283
00:26:15,730 --> 00:26:21,010
pornography and charging it to her
parliamentary expenses. So she lost her
284
00:26:21,010 --> 00:26:25,820
job as home secretary and she lost her
seat in parliament and the communications
285
00:26:25,820 --> 00:26:32,950
data bill was lost. So was this a victory?
Well, in June 2013, we learned from Ed
286
00:26:32,950 --> 00:26:39,310
Snowden that they just built it anyway,
despite parliament. So maybe the victory
287
00:26:39,310 --> 00:26:43,400
in parliament wasn't what it seemed to be
at the time. But I'm getting ahead of
288
00:26:43,400 --> 00:26:51,570
myself, anyway. Next slide, please. The
other thing that we did in the 2000s is
289
00:26:51,570 --> 00:26:56,200
that I spent maybe a third of my
time, and about another hundred people
290
00:26:56,200 --> 00:27:00,856
joined in, and we developed the economics of
security as a discipline. We began to
291
00:27:00,856 --> 00:27:05,660
realize that many of the things that went
wrong happened because Alice was guarding
292
00:27:05,660 --> 00:27:10,910
a system and Bob was paying the cost of
failure. For example, if you've got a payment
293
00:27:10,910 --> 00:27:17,480
system, then in order to prevent fraud,
what you basically have to do is to get
294
00:27:17,480 --> 00:27:21,920
the merchants, and the banks that acquire
transactions from them, to take care of
295
00:27:21,920 --> 00:27:26,440
the costs of fraud, which fall on the cardholders
and the banks that issue them with cards.
296
00:27:26,440 --> 00:27:31,870
And the two aren't the same. But it's this
that causes the governance tensions and
297
00:27:31,870 --> 00:27:36,906
causes governance to break down and makes
fraud harder to stop than it should be. Now after
298
00:27:36,906 --> 00:27:41,530
that, one of the early topics was
patching and responsible disclosure. And
299
00:27:41,530 --> 00:27:45,340
we worked through all the issues of
whether you should not patch at all, which
300
00:27:45,340 --> 00:27:48,860
some people in industry wanted to do, or
whether you should just put all the bugs
301
00:27:48,860 --> 00:27:52,690
on bug trackers which some hackers wanted
to do or whether you would go through the
302
00:27:52,690 --> 00:27:57,200
CERT system despite the NSA compromise,
because they at least would give you legal
303
00:27:57,200 --> 00:28:03,720
cover. And, you know, bully Microsoft into
patching the bug the next Patch Tuesday
304
00:28:03,720 --> 00:28:10,040
and then disclosure after 90 days. And
we eventually came to the conclusion, and the
305
00:28:10,040 --> 00:28:16,270
industry followed, that responsible
disclosure was the way to go. Now, one of
306
00:28:16,270 --> 00:28:21,530
the problems that arises here is the
equities issue. Suppose you're the
307
00:28:21,530 --> 00:28:27,260
director of the NSA and somebody comes to
you with some super new innovative bug.
308
00:28:27,260 --> 00:28:33,490
Say they have rediscovered Spectre,
for example. And so you've got a bug which
309
00:28:33,490 --> 00:28:40,640
can be used to penetrate any crypto
software that's out there. Do you report
310
00:28:40,640 --> 00:28:45,640
the bug to Microsoft and Intel to defend
300 million Americans, or do you keep it
311
00:28:45,640 --> 00:28:50,830
quiet so you can exploit 450 million
Europeans and a thousand million Chinese
312
00:28:50,830 --> 00:28:55,170
and so on and so forth? Well, once you put
it that way, it's fairly obvious that the
313
00:28:55,170 --> 00:29:00,370
NSA will favor attack over defense. And
there are multiple models of attack and
314
00:29:00,370 --> 00:29:04,420
defense. You can think of institutional
factors and politics, for example, if you
315
00:29:04,420 --> 00:29:10,350
are director of the NSA and you defend
300 million Americans, say you defend the
316
00:29:10,350 --> 00:29:15,720
White House against the Chinese hacking
it. You know, the president will never
317
00:29:15,720 --> 00:29:19,970
know if he's hacked or not because the
Chinese will keep it quiet if they do. But
318
00:29:19,970 --> 00:29:24,790
if, on the other hand, you manage to hack
the Politburo in Peking, you can put
319
00:29:24,790 --> 00:29:31,040
some juicy intelligence every morning with
the president's breakfast cereal. So
320
00:29:31,040 --> 00:29:37,150
that's an even stronger argument for why
you should do attack rather than defense.
321
00:29:37,150 --> 00:29:43,360
And one thing that I'll mention in
passing is that throughout the 2000s,
322
00:29:43,360 --> 00:29:47,390
governments also scrambled to get more
data on their citizens, for example, in
323
00:29:47,390 --> 00:29:51,930
Britain with a long debate about whether
medical records should be centralized. In
324
00:29:51,930 --> 00:29:56,030
the beginning, we said if you were to
centralize all medical records, that would
325
00:29:56,030 --> 00:29:59,440
be such a large target that the database
should be top secret and it would be too
326
00:29:59,440 --> 00:30:06,480
inconvenient for doctors to use. Well,
Blair decided in 2001 to do it anyway. We
327
00:30:06,480 --> 00:30:10,700
wrote a report in 2009 saying that this
was a red line and that this was a serious
328
00:30:10,700 --> 00:30:17,030
hazard and then in 2014 we discovered that
Cameron's buddy, who was the transparency
329
00:30:17,030 --> 00:30:22,440
czar at the NHS, had sold the database to
1200 researchers, including drug companies
330
00:30:22,440 --> 00:30:26,740
in China. So that meant that all the
sensitive personal health information
331
00:30:26,740 --> 00:30:31,480
about one billion patient episodes had
been sold around the world and was
332
00:30:31,480 --> 00:30:35,240
available not just to medical
researchers, but to foreign intelligence
333
00:30:35,240 --> 00:30:50,760
services. This brings us on to Snowden. In
June 2013, we had one of those game-
334
00:30:50,760 --> 00:30:57,280
changing moments when Ed Snowden leaked a
whole bunch of papers showing that the NSA
335
00:30:57,280 --> 00:31:02,390
had been breaking the law in America and
GCHQ had been breaking the law in Britain,
336
00:31:02,390 --> 00:31:06,320
that we had been lied to, that parliament
had been misled, and a whole lot of
337
00:31:06,320 --> 00:31:10,580
collection and interception was going on,
which supposedly shouldn't have been going
338
00:31:10,580 --> 00:31:15,790
on. Now, one of the things that got
industry attention was a system called
339
00:31:15,790 --> 00:31:22,500
PRISM, which was in fact legal because
this was done as a result of warrants
340
00:31:22,500 --> 00:31:28,190
being served on the major Internet service
providers. And if we could move to the
341
00:31:28,190 --> 00:31:33,500
next slide, we can see that this started
off with Microsoft in 2007. Yahoo! in
342
00:31:33,500 --> 00:31:38,121
2008; they fought in court for a year,
lost, and then Google and Facebook and so on
343
00:31:38,121 --> 00:31:44,640
got added. This basically enabled the NSA
to go to someone like Google and say
344
00:31:44,640 --> 00:31:49,590
rossjanderson@gmail.com is a foreign
national, we're therefore entitled to read
345
00:31:49,590 --> 00:31:54,660
his traffic, kindly give us his Gmail. And
Google would say, yes, sir. For Americans,
346
00:31:54,660 --> 00:31:58,240
you have to show probable cause that
they've committed a crime for foreigners
347
00:31:58,240 --> 00:32:06,060
you simply have to show probable cause
that they're a foreigner. The next slide.
348
00:32:06,060 --> 00:32:14,700
This disclosure from Snowden showed
that PRISM, despite the fact that it only
349
00:32:14,700 --> 00:32:20,400
cost about 20 million dollars a year, was
generating something like half of all the
350
00:32:20,400 --> 00:32:27,160
intelligence that the NSA was using by
the end of financial year 2012. But that
351
00:32:27,160 --> 00:32:33,100
was not all. Next slide, please. The thing
that really annoyed Google was this slide
352
00:32:33,100 --> 00:32:38,820
in the deck from a presentation at GCHQ
showing how the NSA was not merely
353
00:32:38,820 --> 00:32:44,480
collecting stuff through the front door by
serving warrants on Google in Mountain
354
00:32:44,480 --> 00:32:48,590
View, it was collecting stuff through the
backdoor as well, because they were
355
00:32:48,590 --> 00:32:53,840
harvesting the plaintext copies of Gmail
and maps and docs and so on, which were
356
00:32:53,840 --> 00:32:59,350
being sent backwards and forwards between
Google's different data centers. And the
357
00:32:59,350 --> 00:33:04,650
little smiley face, which you can see on
the sticky, got Sergey and friends really,
358
00:33:04,650 --> 00:33:09,890
really uptight. And they just decided,
right, you know, we're not going to allow
359
00:33:09,890 --> 00:33:13,310
this. They will have to knock and show
warrants in the future. And there was a
360
00:33:13,310 --> 00:33:17,120
crash program at all the major Internet
service providers to encrypt all the
361
00:33:17,120 --> 00:33:25,180
traffic so that in future things could
only be got by means of a warrant. Next
362
00:33:25,180 --> 00:33:38,060
slide, please. The EU was really annoyed
by what was called Operation Socialist.
363
00:33:38,060 --> 00:33:49,920
Operation Socialist was basically the
hack of Belgacom, and the idea was that
364
00:33:49,920 --> 00:33:56,710
GCHQ spearphished some technical staff at
Belgacom and this enabled them to wiretap
365
00:33:56,710 --> 00:34:04,870
all the traffic at the European Commission
in Brussels, as well as mobile phone
366
00:34:04,870 --> 00:34:11,910
traffic to and from various countries in
Africa. And this is rather amazing. It's
367
00:34:11,910 --> 00:34:16,940
as if Nicola Sturgeon, the first minister
of Scotland, had tasked Police Scotland
368
00:34:16,940 --> 00:34:21,781
with hacking BT so that she could watch
what was going on with the parliament
369
00:34:21,781 --> 00:34:30,919
in London. So this annoyed a number of
people. With the next slide, we can see
370
00:34:30,919 --> 00:34:40,149
that Operation Bullrun, and Operation
Edgehill, as GCHQ called their
371
00:34:40,149 --> 00:34:44,740
version of it, were an aggressive,
multipronged effort to break widely used
372
00:34:44,740 --> 00:34:49,899
Internet encryption technologies. And we
learned an awful lot about what was being
373
00:34:49,899 --> 00:34:55,929
done to break VPNs worldwide and what had
been done in terms of inserting
374
00:34:55,929 --> 00:35:01,830
vulnerabilities in protocols, getting
people to use vulnerable prime numbers for
375
00:35:01,830 --> 00:35:06,750
Diffie-Hellman key exchange and so on and
so forth. Next slide, the first slide on
376
00:35:06,750 --> 00:35:11,870
Bullrun and Edgehill: the SIGINT enabling
project actively engages the US and
377
00:35:11,870 --> 00:35:16,310
foreign IT industries to covertly
influence and/or overtly leverage their
378
00:35:16,310 --> 00:35:20,690
commercial products' designs. These design
changes make the systems in question
379
00:35:20,690 --> 00:35:24,680
exploitable through SIGINT collection,
endpoint, midpoint, et cetera, with
380
00:35:24,680 --> 00:35:28,400
foreknowledge of the modification. To the
consumer and other adversaries, however, the
381
00:35:28,400 --> 00:35:36,510
systems' security remains intact. Next
slide: so, insert vulnerabilities into
382
00:35:36,510 --> 00:35:41,450
commercial encryption systems, IT systems, networks
and endpoint communication devices used by
383
00:35:41,450 --> 00:35:49,160
targets. Next slide. They also influence
policies, standards and specifications for
384
00:35:49,160 --> 00:35:54,270
commercial public key technologies, and
this was the smoking gun that
385
00:35:54,270 --> 00:36:02,240
crypto war 1 had not actually ended. It had
just gone undercover. And so with this,
386
00:36:02,240 --> 00:36:08,250
things came out into the open. Next slide,
so we could perhaps date crypto war 2 to
387
00:36:08,250 --> 00:36:13,190
the Snowden disclosures and their aftermath.
In America, it must be said that all three
388
00:36:13,190 --> 00:36:18,350
arms of the US government showed at least
mild remorse. Obama set up the NSA review
389
00:36:18,350 --> 00:36:23,810
group and adopted most of what it said,
except on the equities issue. Congress cut
390
00:36:23,810 --> 00:36:28,180
data retention, renewed the Patriot Act
and the FISA court introduced an advocate
391
00:36:28,180 --> 00:36:33,340
for targets. Tech companies, as I
mentioned, started encrypting all their
392
00:36:33,340 --> 00:36:39,220
traffic. In the UK on the other hand,
governments expressed no remorse at all,
393
00:36:39,220 --> 00:36:43,450
and they passed the Investigatory Powers
Act to legalize all the unlawful things
394
00:36:43,450 --> 00:36:47,740
they've already been doing. And they could
now order firms secretly do anything they
395
00:36:47,740 --> 00:36:56,730
physically can. However, data retention
was nixed by the European courts. The
396
00:36:56,730 --> 00:37:01,920
academic response in the next slide, keys
under doormats, much the same authors as
397
00:37:01,920 --> 00:37:08,670
before. We analyzed the new situation and
came to much the same conclusions. Next
398
00:37:08,670 --> 00:37:14,620
slide, the 2018 GCHQ
proposals from Ian Levy and Crispin
399
00:37:14,620 --> 00:37:20,870
Robinson proposed to add ghost users to
WhatsApp and FaceTime calls in response to
400
00:37:20,870 --> 00:37:25,860
warrants. The idea is that you've got an
FBI key on your device, listening. You still
401
00:37:25,860 --> 00:37:30,110
have end-to-end, you just have an extra
end. And this, of course, fails the keys
402
00:37:30,110 --> 00:37:34,380
under doormats tests: your software would
abandon best practice, it would create
403
00:37:34,380 --> 00:37:39,690
targets and increase complexity and it
would also have to lie about trust. Next
404
00:37:39,690 --> 00:37:49,310
slide, please. This brings us to the
upload filters which were proposed over
405
00:37:49,310 --> 00:37:55,990
the past six months. They first surfaced
in early 2020 at a Stanford think tank and
406
00:37:55,990 --> 00:38:00,960
they were adopted by Commissioner Ylva
Johansson on June the 9th at the start of
407
00:38:00,960 --> 00:38:05,930
the German presidency. On the 20th of
September we got a leaked tech paper whose
408
00:38:05,930 --> 00:38:11,650
authors include our GCHQ friends Ian Levy
and Crispin Robinson. The top options are
409
00:38:11,650 --> 00:38:17,620
that you filter in client software
assisted by a server, as client side only
410
00:38:17,620 --> 00:38:22,570
filtering is too constrained and easy to
compromise. The excuse is that you want to
411
00:38:22,570 --> 00:38:28,520
stop illegal material such as child sex
abuse images being shared over end to end
412
00:38:28,520 --> 00:38:34,210
messaging systems such as WhatsApp. Various
NGOs objected, and we had a meeting with
413
00:38:34,210 --> 00:38:39,580
the commission, which was a little bit
like a Stockholm Syndrome event. We had
414
00:38:39,580 --> 00:38:43,750
one official there on the child protection
front, flanked by half a dozen officials from
415
00:38:43,750 --> 00:38:48,610
various security bodies, departments and
agencies who seemed to be clearly driving
416
00:38:48,610 --> 00:38:53,191
the thing, with child protection merely
being an excuse to promote it.
417
00:38:53,191 --> 00:39:00,360
Well, the obvious things to worry about
are these. There's similar language in the new
418
00:39:00,360 --> 00:39:04,730
terror regulation, so you can expect the
filter to extend from child sex abuse
419
00:39:04,730 --> 00:39:10,840
material to terror. And static filtering
won't work because if there's a bad list
420
00:39:10,840 --> 00:39:15,380
of 100,000 forbidden images, then the bad
people will just go out and make another
421
00:39:15,380 --> 00:39:22,530
100,000 child sex abuse images. So the
filtering will have to become dynamic. And
422
00:39:22,530 --> 00:39:26,880
then the question is whether your firm
will block it or report it. And there's an
423
00:39:26,880 --> 00:39:32,090
existing legal duty in a number of
countries, and in the UK too, although
424
00:39:32,090 --> 00:39:37,310
obviously no longer a member state, an
existing duty to report terror stuff. And
425
00:39:37,310 --> 00:39:41,840
the question is, who will be in charge of
updating the filters? What's going to
426
00:39:41,840 --> 00:39:50,750
happen then? Next slide. Well, we've seen
an illustration during the lockdown in
427
00:39:50,750 --> 00:39:55,230
April: the French and Dutch governments
sent an update to all EncroChat mobile
428
00:39:55,230 --> 00:39:59,450
phones with a rootkit which copied
messages, crypto keys and lock screen
429
00:39:59,450 --> 00:40:04,460
passwords. EncroChat was a brand of
mobile phone that was sold through
430
00:40:04,460 --> 00:40:11,001
underground channels to various criminal
groups and others. And since this was
431
00:40:11,001 --> 00:40:18,119
largely used by criminals of various
kinds, the U.K. government justified bulk
432
00:40:18,119 --> 00:40:24,160
intercept by passing it off as targeted
equipment interference. In other
433
00:40:24,160 --> 00:40:28,600
words, they got a targeted warrant for
all forty-five thousand EncroChat handsets
434
00:40:28,600 --> 00:40:33,400
and of the ten thousand users in the U.K.,
eight hundred were arrested in June when
435
00:40:33,400 --> 00:40:39,680
the wiretapping exercise was completed.
Now, again, this appears to ignore the
436
00:40:39,680 --> 00:40:44,450
laws that we have on the books because
even our Investigatory Powers Act rules
437
00:40:44,450 --> 00:40:48,710
out bulk interception of U.K.
residents. And those who follow such
438
00:40:48,710 --> 00:40:52,950
matters will know that there was a trial
at Liverpool Crown Court, a hearing of
439
00:40:52,950 --> 00:40:59,369
whether this stuff was admissible. And we
should have a first verdict on that early
440
00:40:59,369 --> 00:41:05,270
in the new year. And that will no doubt go
to appeal. And if the material is held to
441
00:41:05,270 --> 00:41:09,820
be admissible, then there will be a whole
series of trials. So this brings me to my
442
00:41:09,820 --> 00:41:17,050
final point. What can we expect going
forward? China is emerging as a full-stack
443
00:41:17,050 --> 00:41:21,700
competitor to the West, not like Russia in
Cold War one, because Russia only ever
444
00:41:21,700 --> 00:41:26,760
produced primary goods, like
oil, and weapons of course. But
445
00:41:26,760 --> 00:41:30,690
China is trying to compete all the way up
and down the stack from chips, through
446
00:41:30,690 --> 00:41:35,690
software, up through services and
everything else. And developments in China
447
00:41:35,690 --> 00:41:40,850
don't exactly fill one with much
confidence, because in March 2018,
448
00:41:40,850 --> 00:41:45,400
President Xi declared himself to be ruler
for life, basically tearing up the Chinese
449
00:41:45,400 --> 00:41:50,280
constitution. There are large-scale state
crimes being committed in Tibet and
450
00:41:50,280 --> 00:41:55,240
Xinjiang and elsewhere. Just last week,
Britain's chief rabbi described the
451
00:41:55,240 --> 00:42:03,991
treatment of Uyghurs as an unfathomable
mass atrocity. In my book, I describe
452
00:42:03,991 --> 00:42:09,280
escalating cyber conflict and various
hacks, such as the hack of the Office of
453
00:42:09,280 --> 00:42:15,100
Personnel Management, which had clearance
files on all Americans who work for the
454
00:42:15,100 --> 00:42:20,710
federal government, the hack of Equifax,
which got credit ratings and credit
455
00:42:20,710 --> 00:42:25,560
histories of all Americans. And there are
also growing tussles and standards. For
456
00:42:25,560 --> 00:42:32,840
example, the draft ISO 27553 on biometric
authentication for mobile phones is
457
00:42:32,840 --> 00:42:38,080
introducing, at the insistence of Chinese
delegates, a central database option. So
458
00:42:38,080 --> 00:42:43,480
in future, your phone might not verify
your faceprint or your fingerprint
459
00:42:43,480 --> 00:42:50,440
locally. It might do it with a central
database. Next slide, how could Cold War
460
00:42:50,440 --> 00:42:56,550
2.0 be different? Well, there's a number
of interesting things here, and the
461
00:42:56,550 --> 00:43:00,960
purpose of this talk is to try and kick
off a discussion of these issues. China
462
00:43:00,960 --> 00:43:06,120
makes electronics, not just guns, the way
the old USSR did. Can you have a separate
463
00:43:06,120 --> 00:43:13,600
supply chain for China and one for
everybody else? But hang on a minute,
464
00:43:13,600 --> 00:43:20,220
consider the fact that China has now
collected very substantial personal data
465
00:43:20,220 --> 00:43:25,300
sets. From the Office of Personnel
Management, US government employees.
466
00:43:25,300 --> 00:43:32,360
By forcing Apple to set up its own data
centers in China for iPhone users in
467
00:43:32,360 --> 00:43:39,270
China, they get access to all the data
for Chinese users of iPhones that America
468
00:43:39,270 --> 00:43:44,750
gets for American users of iPhones, plus
maybe more as well. If the Chinese can
469
00:43:44,750 --> 00:43:50,690
break the HSMs in Chinese data centers, as
we expect them to be able to. Equifax got
470
00:43:50,690 --> 00:43:56,960
them data on all economically active
people in the USA. care.data gave them
471
00:43:56,960 --> 00:44:02,390
medical records of everybody in the UK.
And this bulk personal data is already
472
00:44:02,390 --> 00:44:08,470
being used in targeted intelligence: when
Western countries, for example, send
473
00:44:08,470 --> 00:44:13,640
diplomats to countries in Africa or Latin
America, the local Chinese counter-
474
00:44:13,640 --> 00:44:16,870
intelligence people know whether they're
bona fide diplomats or whether they're
475
00:44:16,870 --> 00:44:22,210
intelligence agents, undercover, all
from exploitation of all this personal
476
00:44:22,210 --> 00:44:26,220
information. Now, given that this
information's already in effective targeted
477
00:44:26,220 --> 00:44:31,970
use, the next question we have to ask is
when will it be used at scale? And this is
478
00:44:31,970 --> 00:44:37,390
the point at which we say that the
equities issue now needs a serious rethink
479
00:44:37,390 --> 00:44:43,830
and the whole structure of the conflict is
going to have to move from more offensive
480
00:44:43,830 --> 00:44:49,540
to more defensive because we depend on
supply chains to which the Chinese have
481
00:44:49,540 --> 00:44:55,460
access more than they depend on supply
chains to which we have access. Now, it's
482
00:44:55,460 --> 00:45:01,190
dreadful that we're headed towards a new
Cold War, but as we head there, we have to
483
00:45:01,190 --> 00:45:05,950
ask about the respective roles of
governments, industry, civil society and
484
00:45:05,950 --> 00:45:14,040
academia. Next slide, please. And so,
looking forward, my point is this: if Cold
485
00:45:14,040 --> 00:45:18,860
War 2.0 does happen, I hope it doesn't,
but we appear to be headed that way
486
00:45:18,860 --> 00:45:23,680
despite the change of government in the
White House, then we need to be able to
487
00:45:23,680 --> 00:45:31,010
defend everybody, not just the elites. Now,
it's not going to be easy because there
488
00:45:31,010 --> 00:45:35,270
are more state players: the USA is a big
bloc, the EU is a big bloc. There are
489
00:45:35,270 --> 00:45:39,650
other players, other democracies, other
non-democracies, and some failing
490
00:45:39,650 --> 00:45:45,210
democracies. This is going to be complex
and messy. It isn't going to be a
491
00:45:45,210 --> 00:45:50,310
situation like last time, where big tech
reached out to civil society and academia
492
00:45:50,310 --> 00:45:55,930
and we saw a united front against
the agencies. And even in that case, of
493
00:45:55,930 --> 00:46:00,550
course, the victory that we got was only
an apparent victory, a superficial victory
494
00:46:00,550 --> 00:46:06,410
that's only lasted for a while. So what
could we do? Well, at this point, I think
495
00:46:06,410 --> 00:46:10,960
we need to remind all the players that
it's not just about strategy
496
00:46:10,960 --> 00:46:15,800
and tactics, but about values, too.
And so we need to be firmly on the side of
497
00:46:15,800 --> 00:46:21,470
freedom, privacy and the rule of law. Now,
for the old timers, you may remember that
498
00:46:21,470 --> 00:46:29,520
there was a product called Tom-Skype,
which was introduced in 2011 in China. The
499
00:46:29,520 --> 00:46:34,470
Chinese wanted the citizens to be able to
use Skype, but they wanted to be able to
500
00:46:34,470 --> 00:46:38,290
wiretap as well, despite the fact that
Skype at the time had end to end
501
00:46:38,290 --> 00:46:44,520
encryption. And so people in China were
compelled to download a client for Skype
502
00:46:44,520 --> 00:46:50,450
called Tom-Skype. Tom was the company that
distributed Skype in China and it
503
00:46:50,450 --> 00:46:55,070
basically had built-in wiretapping. So
you had end-to-end encryption using Skype
504
00:46:55,070 --> 00:47:01,240
in those days. But in China, you ended up
having a Trojan client, which you had to
505
00:47:01,240 --> 00:47:08,240
use. And what is happening at the moment
is that the EU is basically trying to copy Tom-
506
00:47:08,240 --> 00:47:13,440
Skype and saying that we should be doing
what China was doing eight years ago. And
507
00:47:13,440 --> 00:47:17,540
I say we should reject that. We can't
challenge President Xi by going down that
508
00:47:17,540 --> 00:47:21,970
route. Instead, we've got to reset our
values and we've got to think through the
509
00:47:21,970 --> 00:47:27,600
equities issue and we've got to figure out
how we're going to deal with
510
00:47:27,600 --> 00:47:32,570
the challenges of non-
democratic countries when there is serious
511
00:47:32,570 --> 00:47:40,620
conflict in a globalized world where we're
sharing the same technology. Thanks. And
512
00:47:40,620 --> 00:47:52,230
perhaps the last slide for my book can
come now and I'm happy to take questions.
513
00:47:52,230 --> 00:47:58,460
Herald: Yeah, thanks a lot, Ross, for your
talk. It's a bit depressing to listen to
514
00:47:58,460 --> 00:48:09,510
you, I have to admit. Let's have a look.
OK, so I have a question. I'm wondering if
515
00:48:09,510 --> 00:48:15,369
the export controls at EU level became
worse than UK-level export controls
516
00:48:15,369 --> 00:48:20,660
because entities like GCHQ had more
influence there or because there's a harmful
517
00:48:20,660 --> 00:48:26,619
Franco-German security culture, or whatever it
was. Do you have anything on that?
518
00:48:26,619 --> 00:48:30,890
Ross: Well, the experience that we had
with these export controls, once they were
519
00:48:30,890 --> 00:48:38,260
in place, was as follows. It was about
2015, I think, or 2016, that it came to our
520
00:48:38,260 --> 00:48:43,800
attention that a British company, Sophos,
was selling bulk surveillance equipment to
521
00:48:43,800 --> 00:48:49,330
President al-Assad of Syria, and he was
using it to basically wiretap his entire
522
00:48:49,330 --> 00:48:54,080
population and decide who he was going to
arrest and kill the following day. And it
523
00:48:54,080 --> 00:48:58,530
was sold by Sophos, in fact, through a
German subsidiary. And so we went along to
524
00:48:58,530 --> 00:49:06,870
the export control office in Victoria
Street. A number of NGOs went along: the Open Rights
525
00:49:06,870 --> 00:49:11,880
Group, Privacy International, us,
and one or two others. And we said,
526
00:49:11,880 --> 00:49:16,480
look, according to the EU dual-use
regulation, bulk intercept equipment is
527
00:49:16,480 --> 00:49:19,950
military equipment. It should be in the
military list. Therefore, you should be
528
00:49:19,950 --> 00:49:25,330
demanding an export license for this
stuff. And they found every conceivable
529
00:49:25,330 --> 00:49:34,100
excuse not to demand it. And it was the
lady from GCHQ there in the room who was
530
00:49:34,100 --> 00:49:38,280
clearly calling the shots. And she was
absolutely determined that there should be
531
00:49:38,280 --> 00:49:44,040
no export controls on the stuff being sold
to Syria. And eventually I said, look,
532
00:49:44,040 --> 00:49:47,260
it's fairly obvious what's going on here.
If there are going to be black boxes in
533
00:49:47,260 --> 00:49:51,110
President al-Assad's network, you want
them to be British black boxes or German
534
00:49:51,110 --> 00:49:55,960
black boxes, not Ukrainian or Israeli
black boxes. And she said, I cannot
535
00:49:55,960 --> 00:50:00,830
discuss classified matters in an open
meeting, which is as close as you get to
536
00:50:00,830 --> 00:50:06,840
an admission. And a couple of months
later, Angela Merkel, to her great credit,
537
00:50:06,840 --> 00:50:12,640
actually came out in public and said
that allowing the equipment to be exported
538
00:50:12,640 --> 00:50:16,440
from Utimaco to Syria was one of the
hardest decisions she'd ever taken as
539
00:50:16,440 --> 00:50:21,770
Chancellor. It was a very difficult
tradeoff between maintaining intelligence
540
00:50:21,770 --> 00:50:27,470
access, given the possibility that Western
troops would be involved in Syria, and the
541
00:50:27,470 --> 00:50:33,300
fact that the kit was being used for very
evil purposes. So that's an example of how
542
00:50:33,300 --> 00:50:38,280
the export controls are used in practice.
They are not used to control the harms
543
00:50:38,280 --> 00:50:44,330
that we as voters are told they're
there to control. Right. They are used in
544
00:50:44,330 --> 00:50:49,940
all sorts of dark and dismal games. And we
really have to tackle the issue of export
545
00:50:49,940 --> 00:50:55,980
controls with our eyes open.
H: Yeah, yeah. There's a lot to do.
546
00:50:55,980 --> 00:51:03,800
And now Germany has left the UN
Security Council. So let's see what
547
00:51:03,800 --> 00:51:13,000
happens next. Yeah. We'll see, Ross.
Anything else you'd like to add? We don't
548
00:51:13,000 --> 00:51:19,350
have any more questions. Oh, no, we have
another question. It's just come up
549
00:51:19,350 --> 00:51:24,510
seconds ago. Do you think that refusal to
accept back doors will create large
550
00:51:24,510 --> 00:51:35,300
uncensorable applications?
R: Well, if you get large applications
551
00:51:35,300 --> 00:51:41,619
which are associated with significant
economic power, then pressure gets
552
00:51:41,619 --> 00:51:51,450
brought to bear on those economic players
to do their social duty. And... this is what
553
00:51:51,450 --> 00:51:56,520
we have seen with the platforms that
act as content
554
00:51:56,520 --> 00:52:00,220
intermediaries, such as Facebook and Google
and so on: they do a certain amount
555
00:52:00,220 --> 00:52:08,510
of filtering. But if, on the other hand,
you have wholesale surveillance before the
556
00:52:08,510 --> 00:52:13,690
fact of end-to-end encrypted stuff, then
are we moving into an environment where
557
00:52:13,690 --> 00:52:19,200
private speech from one person to another
is no longer permitted? You know, I don't
558
00:52:19,200 --> 00:52:24,490
think that's the right tradeoff that we
should be making, because we all know from
559
00:52:24,490 --> 00:52:28,780
hard experience that when governments say,
think of the children, they're not
560
00:52:28,780 --> 00:52:32,090
thinking of children at all. If they were
thinking of children, they would not be
561
00:52:32,090 --> 00:52:36,280
selling weapons to Saudi Arabia and the
United Arab Emirates to kill children in
562
00:52:36,280 --> 00:52:41,850
Yemen. And they say, think about terrorism.
But the censorship that we are supposed to
563
00:52:41,850 --> 00:52:47,880
use in universities around terrorism, the
so-called Prevent duty, is known to be
564
00:52:47,880 --> 00:52:52,280
counterproductive. It makes Muslim
students feel alienated and marginalized.
565
00:52:52,280 --> 00:52:57,480
So the arguments that governments use
around this are not in any way honest. And
566
00:52:57,480 --> 00:53:01,810
we now have 20 years' experience of these
dishonest arguments. And for goodness
567
00:53:01,810 --> 00:53:05,550
sake, let's have a more grown-up
conversation about these things.
568
00:53:05,550 --> 00:53:11,700
H: You're totally right, even if I
have to admit it took me a couple of
569
00:53:11,700 --> 00:53:24,660
years, not 20, but quite a few, to finally
understand. OK, I think that's it; we
570
00:53:24,660 --> 00:53:31,230
just have another comment. I thank
you for your time. Are you in an
571
00:53:31,230 --> 00:53:36,680
assembly somewhere, hanging around
in the next hour or so? Maybe if someone
572
00:53:36,680 --> 00:53:41,860
wants to talk to you, they can just pop by,
if you have used this 2D world
573
00:53:41,860 --> 00:53:45,260
already.
R: No, I haven't been using the 2D world.
574
00:53:45,260 --> 00:53:50,590
I had some issues with my browser
getting into it. But I've got my
575
00:53:50,590 --> 00:53:55,380
webpage, and my email address is public, and
anybody who wants to discuss these things
576
00:53:55,380 --> 00:53:59,740
is welcome to get in touch with me.
Herald: All right. So thanks a lot.
577
00:53:59,740 --> 00:54:04,195
R: Thank you for the invitation.
H: Yeah. Thanks a lot.
578
00:54:04,195 --> 00:54:07,800
rC3 postroll music
579
00:54:07,800 --> 00:54:43,050
Subtitles created by c3subtitles.de
in the year 2020. Join, and help us!