rC3 preroll music
Herald: This is Ross Anderson, and he's
giving a talk to us today, and the title
is What Price the Upload Filter? From Cold
War to Crypto Wars and Back Again. And
we're very happy that he's here today. And for our non-English-speaking public, we have translations.
speaks german
This talk is being translated into German.
speaks french
This talk is also being translated into French.
Yeah. Um. Ross, ready to start? Let's go.
Have a good time. Enjoy.
Ross: Yes, ready to go. Thanks. OK. As has
been said, I'm Ross Anderson and I'm in
the position of being one of the old guys of this field, in that I've been involved in the crypto wars right from the start, and in fact even since before the Clipper chip actually came out. If we could go to
the slides, please.
Right, can we see the slides?
silence
surprised the U.S. armed
forces. And guess what happened? Well, in
the 1950s, Boris Hagelin had set up that
company, secretly sold it to the NSA and
for a number of years, quite a lot of years, countries as diverse as India, the countries of Latin America, and even NATO members such as Italy were buying machines from Crypto AG,
which the NSA could decipher. And this had
all sorts of consequences. For example,
it's been revealed fairly recently that
Britain's success against Argentina in the
Falklands War in 1982 was to a large
extent due to signals intelligence that
came from these machines. So, next slide,
please. And in this prehistory of the
crypto wars, almost all the play was
between governments. There was very little
role for civil society. There were one or two journalists engaged in trying
to map what the NSA and friends were up to.
As far as industry was concerned, well, at
that time, I was working in banking and we
found that encryption for confidentiality
was discouraged. If we tried to use line encryption, then faults mysteriously appeared on the line. But authentication
was OK. We were allowed to encrypt PIN blocks from PIN pads. We were allowed to put
MACs on messages. There was some minor
harassment. For example, when Rivest,
Shamir and Adleman came up with their
encryption algorithm, the NSA tried to
make it classified. But the Provost of MIT, Jerome Wiesner, persuaded them not to pick that fight. The big debate in the 1970s was whether the NSA had affected the design of the Data Encryption Standard algorithm, and we now know that this was the case. It was designed to be only just strong enough, and Whit Diffie predicted
back in the 1970s that 2 to the power of
56 key search would eventually be
feasible. The EFF built a machine in 1998
and now of course that's fairly easy
because each bitcoin block costs 2 to
the power of 68 calculations. Next slide,
please. So where things get interesting is
that the NSA persuaded Bill Clinton in one of his first cabinet meetings in 1993 to introduce key escrow, the idea that the NSA should have a copy of everyone's keys. And one of the people at that meeting admitted later that President Bush the elder had been asked and had refused, but Clinton, when he came into office, was naive and thought that this was an opportunity to fix the world. Now, the
Clipper chip, which we can see here, was tamper-resistant and had a secret block cipher with an NSA backdoor key. And the launch product was an AT&T secure phone.
Next slide, please. Now the Clipper protocol
was an interesting one in that each chip
had a unique secret key KU and a global
secret family key kNSA burned in. And in
order to, say, send data to Bob, Alice had
to send her clipper chip a working key kW,
which is generated by some external means,
such as a Diffie Hellman Key exchange. And
the chip makes a Law Enforcement Access Field (LEAF), which was basically Alice's and Bob's names, with the working key encrypted under the unit key, and then a hash of the working key encrypted under the NSA family key. And that
was sent along with the cipher text to make
authorized wiretapping easy. And the idea
with the hash was that this would stop
cheating. Bob's Clipper Chip wouldn't use
a working key unless it came with a valid
LEAF. And I can remember, a few of us can
still remember, the enormous outcry that
this caused at the time. American
companies in particular didn't like it
because they started losing business to
foreign firms. And in fact, a couple of
our students here at Cambridge started a
company nCipher, that grew to be quite
large because they could sell worldwide,
unlike US firms. People said, why don't we
use encryption software? Well, that's easy
to write, but it's hard to deploy at
scale, as Phil Zimmermann found with PGP.
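The LEAF construction just described can be sketched in a few lines. This is a toy model under loud assumptions: the key sizes are made up and a truncated SHA-256 stands in for the classified Skipjack-based functions; the one detail carried over faithfully is the 16-bit checksum, which matters shortly.

```python
import hashlib
import os

# Toy model of the Clipper LEAF: invented key sizes, truncated SHA-256
# standing in for the classified Skipjack-based encryption and checksum.
def checksum16(k_family: bytes, k_work: bytes) -> bytes:
    # The real LEAF checksum was only 16 bits long.
    return hashlib.sha256(k_family + k_work).digest()[:2]

k_family = os.urandom(10)  # kNSA: the family key burned into every chip
k_unit = os.urandom(10)    # KU: the unique unit key of Alice's chip
k_work = os.urandom(10)    # kW: the working key, e.g. from Diffie-Hellman

def make_leaf(k_unit: bytes, k_family: bytes, k_work: bytes) -> bytes:
    # Working key wrapped under the unit key, plus the 16-bit tag.
    wrapped = hashlib.sha256(k_unit + k_work).digest()[:10]
    return wrapped + checksum16(k_family, k_work)

def chip_accepts(leaf: bytes, k_family: bytes, k_work: bytes) -> bool:
    # Bob's chip checks only that the 16-bit tag matches the working key.
    return leaf[-2:] == checksum16(k_family, k_work)

assert chip_accepts(make_leaf(k_unit, k_family, k_work), k_family, k_work)

# Because there are only 2**16 possible tags, a rogue sender who treats the
# receiving chip as an accept/reject oracle can forge a LEAF whose wrapped
# field is garbage, so a wiretapper learns nothing from it.
bogus = None
for tag in range(2**16):
    candidate = os.urandom(10) + tag.to_bytes(2, "big")
    if chip_accepts(candidate, k_family, k_work):
        bogus = candidate
        break
assert bogus is not None and chip_accepts(bogus, k_family, k_work)
```

The final loop shows why such a short tag is dangerous: 65536 trials is no obstacle at all.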
And the big concern was whether key escrow
would kill electronic commerce. A
secondary concern was how on earth we would know whether government designs were secure. Why on earth should you trust the
NSA? Next slide, please. Well, the first
serious fight back in the crypto wars came
when Matt Blaze at Bell Labs found an
attack on Clipper. He found that Alice could just try lots of LEAFs until one of them passed, because the checksum tag was only 16 bits long: 2 to the power of 112 of the 2 to the power of 128 possibilities work. And this meant
that Alice could generate a bogus LEAF
that would pass inspection, but which
wouldn't decrypt the traffic, and Bob
could also generate a new LEAF on the fly.
So you could write non-interoperable rogue
applications that the NSA has no access
to. And with a bit more work, you could
make rogue applications interoperate with
official ones. This was only the first of
many dumb ideas. Next slide, please. OK,
so why don't people just use software?
Well, at that time, the US had export
controls on intangible goods such as
software, although European countries
generally didn't. And this meant that US
academics couldn't put crypto code online,
although we Europeans could and we
did. And so Phil Zimmermann achieved fame by exporting PGP, Pretty Good Privacy, some encryption software he had written, from America as a paper book. And this was protected by the First Amendment. People sent it across the border to Canada, fed it into an optical character recognition scanner, recompiled it, and the code had escaped. For this Phil
was subjected to a grand jury
investigation. There was also the Bernstein case establishing code as free speech, and Bruce Schneier rose to fame with his book "Applied Cryptography", which had protocols, algorithms and source code in C that you could type in to get cryptographic algorithms anywhere. And we
saw export-controlled clothing. This
t-shirt was something that many people wore at the time. I've actually got one, and I had planned to wear it for this, but unfortunately I came into the lab to get better connectivity and left it at home. So this t-shirt was an implementation of RSA written in perl, plus a barcode so that you could scan it in. And in theory, you should not walk across the border wearing this t-shirt. Or if
you're a US citizen, you shouldn't even
let a non-US citizen look at it. So by
these means, people probed the outskirts of
what was possible and, you know an awful
lot of fun was had. It was a good laugh to
tweak the Tyrannosaur's tail. Next slide.
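As an aside, the mathematics on that shirt really is tiny. Here is a toy RSA round trip, in Python rather than the shirt's perl, with textbook-sized primes; real RSA of course needs large random primes and padding.

```python
# Toy RSA with textbook primes: illustrative only, not secure.
p, q = 61, 53
n = p * q                  # public modulus: 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: modular inverse (Python 3.8+)

m = 42                     # a message, encoded as a number smaller than n
c = pow(m, e, n)           # encrypt: c = m^e mod n
assert pow(c, d, n) == m   # decrypt: m = c^d mod n
```

The whole scheme is two modular exponentiations, which is why it fits on clothing.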
But this wasn't just something that was
limited to the USA. The big and obvious
problem, if you try to do key escrow in Europe, is that there are dozens of countries in Europe. What happens if someone from Britain has a mobile phone that they bought in France, with a German SIM card, and they're standing on a street in Stockholm phoning somebody in Budapest who's got a Hungarian phone with a Spanish SIM card in it? Which of these countries' secret police forces should be able to listen to the call? That is a good way to describe what stalled the progress of key escrow in Europe.
And in 1996 GCHQ got academic colleagues
at Royal Holloway to come up with a
proposal for public sector email, which
they believed would fix this. Now, at the time, after Clipper had fallen into disrepute, the NSA's proposal was that certification authorities should also have to be licensed, and that licensing would enforce a condition that all private keys be escrowed, so you would only be able to get a signature on your public key if the private key was held by the CA. And
the idea was that you'd have one CA for each government department, and civilians would use trusted firms like Barclays Bank or the Post Office, which would keep our keys safe. And it would also work across other EU member states, so that if somebody in Britain called somebody in Germany, a CA in Britain that was trustworthy from the NSA's point of view, which is to say untrustworthy from our point of view, would be prepared to make a key, and so would one in Germany. This, at
least, was the idea. So how do we do this,
next slide, on the GCHQ protocol. So here's
how it was designed to work in the UK
government. If Alice at the Department of
Agriculture wants to talk to Bob at the
Department of Business, she asks her departmental security officer DA for a send key for herself and a receive key for Bob. DA and Bob's officer DB get a top-level interoperability key KTAB from GCHQ. DA calculates Alice's secret send key of the day as a hash of KTAB, Alice's name and DA's own identity, which he gives to Alice, and similarly a public receive key of the day for Bob. Alice sends Bob her public send key along with the encrypted message, and Bob can go to his DSO and get his secret receive key of the day. Now this is slightly
complicated and there's all sorts of other
things wrong with it once you start to
look at it. Next slide, please. The first
is that from the point of view of the
overall effect, you could just as easily
have used Kerberos because you've
basically got a key distribution center at
both ends, which knows everybody's keys. So you've not actually gained very much by using complicated public-key mechanisms. The next problem is: what's the law-enforcement need for access to centrally generated signing keys, if this is actually for law enforcement rather than intelligence? The police want to be able to read things, not forge things. A
third problem is that keys involve hashing
department names, and governments change the names of departments all the time as the prime minister of the day moves ministers around and chops and changes departments. This means, of course, that everybody has to get new cryptographic keys, and suddenly the old keys don't work anymore. Horrendous complexity comes from this. Now, there are about 10 other things
wrong with this protocol, but curiously
enough, it's still used by the UK government for the top secret stuff. It went through a number of iterations; it's now called MIKEY-SAKKE, and there are details in my security engineering book. And it turned out to be such a pain that everything below top secret now just uses a branded version of G Suite. So if what you want to do is figure out what speech Boris Johnson will be making tomorrow, you just have to guess the password recovery questions for his private secretaries and officials. Next
slide, the global Internet Trust Register.
This was an interesting piece of fun we
had around the 1997 election, when Tony Blair took over and formed the Labour government. Before the election, Labour had promised not to seize crypto keys in bulk without a warrant. But one of the first things that happened once he was in office is that Vice President Al Gore went to visit him, and all of a sudden Tony Blair decided that he wanted all certification authorities to be licensed, and they were about to rush this through parliament. So we put all the important public keys in a paper book and took it to the culture secretary, Chris Smith, and we said: you're the minister for books; why are you passing a law to ban this book? And if you'll switch to the video
shot, I've got the initial copy of the
book that we just put together on the
photocopying machine in the department.
And then we sent the PDF off to MIT and
they produced it as a proper book. This meant that we had a book which was supposedly protected, and this enabled us to get the topic onto the agenda for cabinet discussion. So this at least forestalled precipitate action, and we ended up with the Regulation of Investigatory Powers Bill in 2000. That was far from perfect, but that's a longer story. So what happened back
then is that we set up an NGO, a digital
rights organization, the Foundation for
Information Policy Research. And the
climate at the time was such that we had
no difficulty raising a couple of hundred
thousand pounds from Microsoft and Hewlett
Packard and Redbus and other tech players.
So we were able to hire Caspar Bowden for
three years to basically be the director
of FIPR and to lobby the government hard
on this. And if we can go back to the
slides, please, and go to the next slide,
the slide on bringing it all together. So
in 1997 a number of us, Hal Abelson and I and Steve Bellovin, and Josh Benaloh from Microsoft, and Matt Blaze, who had broken Clipper, and Whit Diffie, who invented digital signatures, and John Gilmore of EFF, Peter Neumann of SRI, Ron Rivest, Jeff Schiller of MIT, and Bruce Schneier, who had written Applied Cryptography, got together and wrote a paper on the risks of key recovery, key escrow and trusted third-party encryption, where we
discussed the system consequences of
giving third party or government access to
both traffic data and content, without user notice or consent, deployed internationally and available around the clock. We came to the conclusion that this was not really doable: there were simply too many vulnerabilities and too many complexities.
So how did it end? Well, if we go to the
next slide, the victory in Europe wasn't
as a result of academic arguments. It was
a result of industry pressure. And we owe
a debt to Commissioner Martin Bangemann
and also to the German government who
backed him. And in 1994, Martin had put
together a group of European CEOs to
advise him on internet policy. And they advised him: keep your hands off until we can see which way it's going and what we can do with it. And the thing that he
developed in order to drive a stake
through the heart of key escrow was the
Electronic Signatures Directive in 1999.
And this gave a rebuttable presumption of
validity to qualifying electronic
signatures, but subject to a number of
conditions. And one of these was that the signing key must never be known to anybody other than the signer, and this killed the idea of licensing CAs in such a way that the NSA had access to all the private key material. The agencies had
argued that without controlling
signatures, you couldn't control
encryption. But of course, as intelligence agencies, they were as much interested in manipulating information as they were in listening in to it. And this created a really sharp conflict with businesses. In the U.K., the Regulation of Investigatory Powers Bill went through the
following year. And there we got strong
support from the banks who did not want
the possibility of intelligence and law
enforcement personnel either getting hold
of bank keys or forging banking
transactions. And so, with their help, we managed to insert a number of conditions into the bill, which meant that if a court or chief constable, for example, demands a key from a company, they've got to demand it from somebody at the level of a director of the company, and the demand has got to be signed by someone really senior, such as the chief constable. So there were some controls that we managed to get in there. Next slide! What did
victory in the USA look like? Well, in the
middle of 2000, as a number of people had predicted, Al Gore decided that he wanted to stop fighting the tech industry in order to get elected president. And there was a deal done at the time which was secret. It was done at the FBI headquarters at Quantico, whereby US law enforcement would rely on naturally occurring vulnerabilities rather than compelling their insertion by companies like Intel or Microsoft. This was secret at the time, and I happened to know about it because I was consulting for Intel, and the NDA I was under had a four-year time limit on it, so after 2004 I was able to talk about it. This deal basically gave the NSA access to the CERT feed. And as part of the deal, the
export rules were liberalized a bit, but
with various hooks and gotchas left so
that the authorities could bully companies
who got too difficult. And in 2002, Robert Morris senior, who had been chief scientist at the NSA for much of this period, admitted that the real policy goal was to ensure that the many systems developed during the dot-com boom were deployed with weak protection or none. And
there's a huge, long list of these. Next
slide, please. So what was the collateral
damage from crypto war one? This is the first new part of this talk, which I've put together as a result of spending the last academic year writing the third edition of my book on security engineering. As I've gone through and updated all the chapters, on car security, lock security, web security and so on, we find the damage everywhere. There are still very serious costs remaining from crypto war one. For example, almost all of the remote key entry systems for cars use inadequate cryptography, poor random number generators and so on.
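To see why a weak generator matters, here is a hypothetical keyfob whose rolling code comes from a 16-bit linear feedback shift register; the taps are a standard textbook choice, and no particular car is implied. An eavesdropper who captures one code can search the whole state space and then predict every future code.

```python
# Hypothetical remote keyless entry fob using a 16-bit maximal-length LFSR
# (taps 16, 14, 13, 11) as its "random" code generator.
def lfsr16(state: int) -> int:
    bit = (state ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
    return ((state >> 1) | (bit << 15)) & 0xFFFF

secret_state = 0xACE1                # hidden inside the fob
sniffed_code = lfsr16(secret_state)  # one transmission captured over the air

# A thief can brute-force all 65536 possible states in microseconds...
recovered = next(s for s in range(1, 2**16) if lfsr16(s) == sniffed_code)
assert recovered == secret_state

# ...and then predict every code the fob will send from now on.
predicted_next = lfsr16(lfsr16(recovered))
assert predicted_next == lfsr16(lfsr16(secret_state))
```

With a keyed 64-bit or 128-bit design this search would be hopeless; the point is that many deployed generators were nowhere near that.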
And car theft has almost doubled in the past five years. This is not all due to weak crypto, but it's substantially due to a wrong culture that got started in the context of the crypto wars. Second,
there are millions of door locks still
using Mifare Classic, even in the building where I work: the University of Cambridge changed its door locks around 2000, so we've still got a whole lot of Mifare Classic around, and it's very difficult when you've got 100 buildings to change all the locks on them. And this is the case with thousands of organizations worldwide, with universities, with banks, with all sorts of people, simply because changing all the locks at once in dozens of buildings is just too expensive. Then,
of course, there are the CAs in your browser: most nations own or control certification authorities that your browser trusts, and the few nations that weren't allowed to own such CAs, such as Iran, get up to mischief, as we found in the case of the DigiNotar hack a few years ago. And this means that most nations have got a more or less guaranteed ability to do man-in-the-middle attacks on your web logons. Some companies like Google, of course, started to fix that with mechanisms such as certificate pinning. But that was a deliberate vulnerability that was there for a long, long time and is still very widespread. Phones: 2G is
insecure. That actually goes back to the Cold War rather than the crypto wars. But thanks to the crypto wars, 4G and 5G are not very much better; the details are slightly complicated and, again, they're described in the book. Bluetooth is easy to hack. That's another piece of legacy. And as I mentioned, the agencies own the CERT responsible disclosure pipeline, which means that they get a free firehose of zero-days that they can exploit for perhaps a month or three before these end up being patched. So next slide,
please. Last year when I talked at Chaos
Communication Congress, the audience chose
this as the cover for my security
engineering book, and that's now out. And it's the process of writing it that brought home to me the scale of the damage that we still suffer as a result of crypto war one. So let's move on to the
next slide and the next period of history,
which we might call the war on terror. And
I've arbitrarily put this down as 2000 to 2013, although some countries stopped using the phrase war on terror in about 2008, once we had got rid of George W. Bush and Tony Blair. But as a historical
convenience, this is, if you like, the
central period in our tale. And it starts
off with a lot of harassment around the
edges of security and cryptography. For
example, in 2000, Tony Blair promoted the
EU dual use regulation number 1334 to
extend export controls from tangible goods
such as rifles and tanks to intangibles
such as crypto software, despite the fact that he had basically declared peace on the tech industry. Two years later, in 2002, the UK parliament balked at an export control bill that was going to transpose this, because it added controls on scientific speech: not just crypto code, but even papers on cryptanalysis and even electron microscopy. So parliament inserted a research exemption clause at the urging of the then president of the Royal Society, Sir Robert May. But what then happened is that GCHQ used EU regulations to frustrate Parliament, and this pattern of extralegal behavior was to continue. Next slide!
Because after export control, the play shifted to traffic data retention, another bad thing that, I'm afraid to say, the UK exported to Europe back in the days when we were, in effect, the Americans' consigliere on the European Council. Sorry about that, folks, but all I can say is at least we helped start EDRi a year after that. So one of the interesting aspects of
this was that our then home secretary,
Jacqui Smith, started talking about the
need for a common database of all the
metadata of who had phoned whom when, who
had sent an email to whom when, so that
the police could continue to use the
traditional contact tracing techniques
online. And the line that got hammered home to us again and again and again was: if you've got nothing to hide, you've got nothing to fear. What then happened in
2008 is that a very bad person went into Parliament, went to the PC where the expense claims of MPs were kept, copied all the expense claims onto a DVD and hawked it around Fleet Street. The Daily Telegraph bought it from them for £400,000. And then, for the best part of a year, the Daily Telegraph was revealing scandalous things about what various members of parliament had claimed from the taxpayer. But it turned out that Jacqui Smith herself may have been innocent: her husband had been downloading pornography and charging it to her parliamentary expenses. So she lost her
job as home secretary and she lost her
seat in parliament and the communications
data bill was lost. So was this a victory?
Well, in June 2013 we learned from Ed Snowden that they had just built it anyway, despite parliament. So maybe the victory in parliament wasn't what it seemed to be at the time. But I'm getting ahead of myself. Next slide, please. The
other thing that we did in the 2000s is that I spent maybe a third of my time, and about another hundred people joined in, and we developed the economics of security as a discipline. We began to realize that many of the things that went wrong happened because Alice was guarding a system and Bob was paying the cost of failure. For example, in a payment system, the costs of preventing fraud fall on the merchants and the banks that acquire their transactions, while the costs of fraud follow the cardholders and the banks that issue them with cards. And the two aren't the same. It's this that causes governance tensions, causes governance to break down, and makes fraud harder to prevent than it should be. Now after
that, one of the early topics was
patching and responsible disclosure. And
we worked through all the issues of whether you should not patch at all, which some people in industry wanted; whether you should just put all the bugs on bug trackers, which some hackers wanted; or whether you should go through the CERT system despite the NSA compromise, because they at least would give you legal cover and, you know, bully Microsoft into patching the bug the next Patch Tuesday, with disclosure after 90 days. And we eventually came to the conclusion, and industry followed, that responsible disclosure was the way to go. Now, one of
the problems that arises here is the
equities issue. Suppose you're the
director of the NSA and somebody comes to you with some super new innovative bug. Say they have rediscovered Spectre, for example. So you've got a bug which can be used to penetrate any crypto software that's out there. Do you report the bug to Microsoft and Intel to defend 300 million Americans, or do you keep it quiet so you can exploit 450 million Europeans and a billion-plus Chinese and so on? Well, once you put it that way, it's fairly obvious that the NSA will favor attack over defense. And
there are multiple models of attack and
defense. You can think of institutional factors and politics. For example, if you are director of the NSA and you defend 300 million Americans, say by defending the White House against the Chinese hacking it, the president will never know whether he's been hacked or not, because the Chinese will keep it quiet if they succeed. But if, on the other hand, you manage to hack the Politburo's LAN in Peking, you can put some juicy intelligence every morning beside the president's breakfast cereal. So that's an even stronger argument for doing attack rather than defense.
And one other thing that I'll mention in passing is that throughout the 2000s, governments also scrambled to get more data on their citizens. For example, in Britain there was a long debate about whether medical records should be centralized. In the beginning, we said that if you were to centralize all medical records, that would be such a large target that the database would have to be top secret, and it would then be too inconvenient for doctors to use. Well,
Blair decided in 2001 to do it anyway. We
wrote a report in 2009 saying that this was a red line and a serious hazard, and then in 2014 we discovered that Cameron's buddy, the transparency czar at the NHS, had sold the database to 1200 researchers, including drug companies in China. So that meant that all the sensitive personal health information in about a billion patient episodes had been sold around the world and was available not just to medical researchers, but to foreign intelligence services. This brings us on to Snowden. In
June 2013 we had one of those game-changing moments, when Ed Snowden leaked a whole bunch of papers showing that the NSA had been breaking the law in America and GCHQ had been breaking the law in Britain, that we had been lied to, that parliament had been misled, and that a whole lot of collection and interception was going on which supposedly shouldn't have been. Now, one of the things that got
industry attention was a system called
PRISM, which was in fact legal because
this was done as a result of warrants
being served on the major Internet service
providers. And if we could move to the
next slide, we can see that this started
off with Microsoft in 2007. Yahoo! in 2008; they fought in court for a year, they lost, and then Google and Facebook and so on got added. This basically enabled the NSA to go to someone like Google and say: rossjanderson@gmail.com is a foreign national, we're therefore entitled to read his traffic, kindly give us his Gmail. And Google would say: yes, sir. For Americans, you have to show probable cause that they've committed a crime; for foreigners, you simply have to show probable cause that they're a foreigner. The next slide.
This Snowden disclosure revealed that PRISM, despite the fact that it only cost about 20 million dollars a year, was generating something like half of all the intelligence that the NSA was using by the end of financial year 2012. But that was not all. Next slide, please. The thing
that really annoyed Google was this slide from a GCHQ presentation deck, showing how the NSA was not merely collecting stuff through the front door, by serving warrants on Google in Mountain View, but collecting stuff through the back door as well, because they were harvesting the plaintext copies of Gmail and Maps and Docs and so on which were being sent backwards and forwards between Google's different data centers. And the little smiley face, which you can see on the sticky note, got Sergey and friends really, really uptight. And they just decided: right, you know, we're not going to allow this; they will have to knock and show warrants in the future. And there was a crash program at all the major Internet service providers to encrypt all their traffic, so that in future things could only be got by means of a warrant. Next
slide, please. The EU was really annoyed
by what was called Operation Socialist.
Operation Socialist was basically the hack of Belgacom: GCHQ spearphished some technical staff at Belgacom, and this enabled them to wiretap all the traffic at the European Commission in Brussels, as well as mobile phone traffic to and from various countries in Africa. And this is rather amazing. It's as if Nicola Sturgeon, the first minister of Scotland, had tasked Police Scotland with hacking BT so that she could watch what was going on in the parliament in London. So this annoyed a number of
people. With the next slide, we can see that Operation BULLRUN, or Operation Edgehill as GCHQ called their version of it, was an aggressive, multipronged effort to break widely used Internet encryption technologies. And we learned an awful lot about what was being done to break VPNs worldwide, and what had been done in terms of inserting vulnerabilities into protocols, getting people to use vulnerable prime numbers for Diffie-Hellman key exchange, and so on. Next slide. The Bullrun and Edgehill SIGINT enabling projects actively engage the US and foreign IT industries to covertly influence and/or overtly leverage their commercial products' designs. These design changes make the systems in question exploitable through SIGINT collection (endpoint, midpoint, et cetera) with foreknowledge of the modification. To the consumer and other adversaries, however, the systems' security remains intact. Next
slide: so they insert vulnerabilities into commercial encryption systems, IT systems, networks and endpoint communication devices used by targets. Next slide. They also influence policies, standards and specifications for commercial public key technologies. And this was the smoking gun that crypto war 1 had not actually ended; it had just gone undercover. And so with this, things came out into the open. Next slide:
so we could perhaps date crypto war 2 to
the Snowden disclosures and their aftermath. In America, it must be said, all three arms of the US government showed at least mild remorse. Obama set up the NSA review group and adopted most of what it said, except on the equities issue. Congress curbed data retention when it renewed the Patriot Act, and the FISA court introduced an advocate for targets. Tech companies, as I mentioned, started encrypting all their traffic. In the UK, on the other hand,
the government expressed no remorse at all, and passed the Investigatory Powers Act to legalize all the unlawful things they'd already been doing. They can now order firms secretly to do anything they physically can. However, data retention was nixed by the European courts. The academic response is on the next slide: Keys Under Doormats, by much the same authors as before. We analyzed the new situation and came to much the same conclusions. Next
slide: the 2018 GCHQ proposal from Ian Levy and Crispin Robinson was to add ghost users to WhatsApp and FaceTime calls in response to
warrants. The idea is that you've got an FBI key on your device keyring. You still have end-to-end encryption, you just have an extra end. And this, of course, fails the Keys Under Doormats tests: your software would abandon best practice, it would create targets and increase complexity, and it would also have to lie about trust. Next
slide, please. This brings us to the
upload filters which were proposed over
the past six months, they first surfaced
in early 2020 to a Stanford think tank and
they were adopted by Commissioner Ylva
Johansson on June the 9th at the start of
the German presidency. On the 20th of
September we got a leaked tech paper whose
authors include our GCHQ friends Ian Levie
and Crispin Robinson. The top options are
that you filter in client software
assisted by a server, as client-side-only
filtering is too constrained and easy to
compromise. The excuse is that you want to
stop illegal material such as child sex
abuse images being shared over end to end
messaging systems such as WhatsApp.
Various NGOs objected, and we had a
meeting with the Commission, which was a
little bit like a Stockholm Syndrome
event: we had one official there on the
child protection front, flanked by half a
dozen officials from various security
bodies, departments and agencies, who
seemed to be clearly driving the thing,
with child protection merely being an
excuse to promote it.
Well, the obvious things to worry about
are these. Given the similar language in
the new terror regulation, you can expect
the filter to extend from child sex abuse
material to terror. And static filtering
won't work, because if there's a list
of 100˙000 forbidden images, then the bad
people will just go out and make another
100˙000 child sex abuse images. So the
filtering will have to become dynamic. And
then the question is whether your phone
will block the material or report it. And
there's an existing legal duty in a number
of countries, including the UK (though
obviously no longer a member state), to
report terrorist material. And the
question is, who will be in charge of
updating the filters? What's going to
happen then? Next slide. Well, we've seen
an illustration during the lockdown: in
April, the French and Dutch governments
sent an update to all Encrochat mobile
phones with a rootkit which copied
messages, crypto keys and lock screen
passwords. Encrochat was a brand of
mobile phone that was sold through
underground channels to various criminal
groups and others. And since it was
largely used by criminals of various
kinds, the UK government justified bulk
intercept by passing it off as targeted
equipment interference. In other
words, they obtained a targeted warrant for
all forty-five thousand Encrochat handsets,
and of the ten thousand users in the UK,
eight hundred were arrested in June when
the wiretapping exercise was completed.
Now, again, this appears to ignore the
laws that we have on the books because
even our Investigatory Powers Act rules
out bulk interception of UK
residents. And those who follow such
matters will know that there was a trial
at Liverpool Crown Court, a hearing of
whether this stuff was admissible. And we
should have a first verdict on that early
in the new year. And that will no doubt go
to appeal. And if the material is held to
be admissible, then there will be a whole
series of trials. So this brings me to my
final point. What can we expect going
forward? China is emerging as a full-stack
competitor to the West, not like Russia in
Cold War one, because Russia only ever
produced primary goods like oil, and
weapons and trouble, of course. But
China is trying to compete all the way up
and down the stack from chips, through
software, up through services and
everything else. And developments in China
don't exactly fill one with much
confidence, because in March 2018,
President Xi declared himself to be ruler
for life, basically tearing up the Chinese
constitution. There are large-scale state
crimes being committed in Tibet and
Xinjiang and elsewhere. Just last week,
Britain's chief rabbi described the
treatment of Uyghurs as an unfathomable
mass atrocity. In my book, I describe
escalating cyber conflict and various
hacks, such as the hack of the Office of
Personnel Management, which had clearance
files on all Americans who work for the
federal government; and the hack of Equifax,
which got credit ratings and credit
histories of all Americans. And there are
also growing tussles in standards. For
example, the draft ISO 27553 on biometric
authentication for mobile phones is
introducing, at the insistence of Chinese
delegates, a central database option. So
in future, your phone might not verify
your faceprint or your fingerprint
locally. It might do it with a central
database. Next slide: how could Cold War
2.0 be different? Well, there are a number
of interesting things here, and the
purpose of this talk is to try and kick
off a discussion of these issues. China
makes electronics, not just guns, the way
the old USSR did. Can you have a separate
supply chain for China and one for
everybody else? But hang on a minute:
consider the fact that China has now
collected very substantial personal data
sets. The Office of Personnel Management
hack gave them data on US government
employees. By forcing Apple to set up its
own data centers in China for iPhone
users in China, they get access to all the
data for Chinese users of iPhones that
America gets for American users of
iPhones, plus maybe more as well, if the
Chinese can break the HSMs in Chinese data
centers, as we expect them to be able to.
Equifax got them data on all economically
active people in the USA. care.data gave
them the medical records of everybody in the UK.
And this bulk personal data is already
being used in targeted intelligence work:
when Western countries, for example, send
diplomats to countries in Africa or Latin
America, the local Chinese counter-
intelligence people know whether they're
bona fide diplomats or whether they're
undercover intelligence agents, all
from exploitation of this personal
information. Now, given that this
information is already in efficient targeted
use, the next question we have to ask is
when will it be used at scale? And this is
the point at which we say that the
equities issue now needs a serious rethink
and the whole structure of the conflict is
going to have to move from more offensive
to more defensive because we depend on
supply chains to which the Chinese have
access more than they depend on supply
chains to which we have access. Now, it's
dreadful that we're headed towards a new
Cold War, but as we head there, we also
have to ask about the respective roles of
governments, industry, civil society and
academia. Next slide, please. And so
looking forward, my point is this: if Cold
War 2.0 does happen (I hope it doesn't,
but we appear to be headed that way
despite the change of government in the
White House), then we need to be able to
defend everybody, not just the elites. Now,
it's not going to be easy, because there
are more state players: the USA is a big
bloc, the EU is a big bloc, and there are
other players, other democracies, other
non-democracies, and other failing
democracies. This is going to be complex
and messy. It isn't going to be a
situation like last time, where big tech
reached out to civil society and academia
and we saw a united front against
the agencies. And even in that case, of
course, the victory that we got was only
an apparent victory, a superficial one
that lasted only for a while. So what
could we do? Well, at this point, I think
we need to remind all the players that
it's not just about strategy and tactics;
it's about values, too.
And so we need to be firmly on the side of
freedom, privacy and the rule of law. Now,
for the old timers, you may remember that
there was a product called Tom-Skype,
which was introduced in 2011 in China. The
Chinese wanted the citizens to be able to
use Skype, but they wanted to be able to
wiretap as well, despite the fact that
Skype at the time had end-to-end
encryption. And so people in China were
compelled to download a client for Skype
called Tom-Skype. Tom was the company that
distributed Skype in China, and it
basically had built-in wiretapping. So
you had end-to-end encryption using Skype
in those days, but in China you ended up
having to use a Trojan client. And what's
happening at the moment is that the EU is
basically trying to copy Tom-Skype,
saying that we should be doing
what China was doing eight years ago. And
I say we should reject that. We can't
challenge President Xi by going down that
route. Instead, we've got to reset our
values and we've got to think through the
equities issue and we've got to figure out
how it is that we're going to deal with
the challenges of dealing with non-
democratic countries when there is serious
conflict in a globalized world where we're
sharing the same technology. Thanks. And
perhaps the last slide for my book can
come now and I'm happy to take questions.
Herald: Yeah, thanks a lot, Ross, for your
talk. It's a bit depressing to listen to
you, I have to admit. Let's have a look.
OK, so I have a question: I'm wondering if
the export controls at EU level became
worse than UK-level export controls
because entities like GCHQ had more
influence there, or because there's a
harmful Franco-German security culture, or
what it was. Do you have anything on that?
Ross: Well, the experience that we had
with these export controls, once they were
in place, was as follows. It was about
2015, I think, or 2016. It came to our
attention that a British company, Sophos,
was selling bulk surveillance equipment to
President al-Assad of Syria, and he was
using it to basically wiretap his entire
population and decide who he was going to
arrest and kill the following day. And it
was sold by Sophos, in fact, through a
German subsidiary. And so we went along to
the export control office in Victoria
Street; a number of NGOs went along: the
Open Rights Group, Privacy International,
us and one or two others. And we said,
look, according to the EU dual use
regulation, bulk intercept equipment is
military equipment. It should be in the
military list. Therefore, you should be
demanding an export license for this
stuff. And they found every conceivable
excuse not to demand it. And it was the
lady from GCHQ there in the room who was
clearly calling the shots. And she was
absolutely determined that there should be
no export controls on the stuff being sold
to Syria. And eventually I said, look,
it's fairly obvious what's going on here.
If there are going to be black boxes in
President al-Assad's network, you want
them to be British black boxes or German
black boxes, not Ukrainian or Israeli
black boxes. And she said, I cannot
discuss classified matters in an open
meeting, which is as close as you get to
an admission. And a couple of months
later, Angela Merkel, to her great credit,
actually came out in public and said
that allowing the equipment to be exported
from Utimaco to Syria was one of the
hardest decisions she'd ever taken as
chancellor. And that was a very difficult
tradeoff between maintaining intelligence
access, given the possibility that Western
troops would be involved in Syria and the
fact that the kit was being used for very
evil purposes. So that's an example of how
the export controls are used in practice.
They are not used to control the harms
that we as voters are told that they're
there to control. Right. They are used in
all sorts of dark and dismal games. And we
really have to tackle the issue of export
controls with our eyes open.
H: Yeah, yeah. There's a lot to do.
And now Germany has left the UN
Security Council, so let's see what
happens next. Yeah. We'll see, Ross.
Anything else you'd like to add? We don't
have any more questions. Oh, no, we have
another question. It's just come up
seconds ago. Do you think that refusal to
accept back doors will create large
uncensorable applications?
R: Well, if you get large applications
which are associated with significant
economic power, then pressure gets
brought to bear on those economic players
to do their social duty. And... this is what
we have seen with the platforms that act
as content intermediaries, such as
Facebook and Google and so on: they do a
certain amount of filtering. But if, on
the other hand,
you have wholesale surveillance, before the
fact, of end-to-end encrypted stuff, then
are we moving into an environment where
private speech from one person to another
is no longer permitted? You know, I don't
think that's the right trade off that we
should be taking, because we all know from
hard experience that when governments say,
think of the children, they're not
thinking of children at all. If they were
thinking of children, they would not be
selling weapons to Saudi Arabia and the
United Arab Emirates to kill children in
Yemen. And they say think about terrorism.
But the censorship that we are supposed to
use in universities around terrorism, the
so-called Prevent duty is known to be
counterproductive. It makes Muslim
students feel alienated and marginalized.
So the arguments that governments use
around this are not in any way honest. And
we now have 20 years' experience of these
dishonest arguments. And for goodness
sake, let's have a more grown up
conversation about these things.
H: Now, you're totally right, even if I
have to admit it took me a couple of
years, not 20, but a lot, to finally
understand. OK, I think that's it. We
just have another comment, and I'm thanking
you for your time. Are you in an
assembly somewhere, hanging around
in the next hour or so? Maybe if someone
wants to talk to you, they can just pop by,
if you have used this 2D world
already.
R: No, I haven't been using the 2D world;
I had some issues with my browser
getting into it. But I've got my
webpage and my email address is public and
anybody who wants to discuss these things
is welcome to get in touch with me.
Herald: All right. So thanks a lot.
R: Thank you for the invitation.
H: Yeah. Thanks a lot.
rC3 postroll music
Subtitles created by c3subtitles.de
in the year 2020. Join, and help us!