[rC3 preroll music]

Herald: This is Ross Anderson, and he's giving a talk to us today, and the title is What Price the Upload Filter? From Cold War to Crypto Wars and Back Again. And we're very happy that he's here today. And for our non-English-speaking public, we have translations. [speaks German] This talk is being translated into German. [speaks French] This talk is also being translated into French. Yeah. Um. Ross, ready to start? Let's go. Have a good time. Enjoy.

Ross: Yes, ready to go. Thanks. OK. As has been said, I'm Ross Anderson, and I'm in the position of being one of the old guys of this field, in that I've been involved in the crypto wars right from the start, and in fact even since before the Clipper chip actually came out. If we could go to the slides, please. Right, can we see the slides?

[silence]

…surprised the U.S. armed forces. And guess what happened? Well, in the 1950s, Boris Hagelin had set up that company, secretly sold it to the NSA, and for a number of years, quite a lot of years, countries as diverse as Latin America and India, and even NATO countries such as Italy, were buying machines from Crypto AG which the NSA could decipher. And this had all sorts of consequences. For example, it's been revealed fairly recently that Britain's success against Argentina in the Falklands War in 1982 was to a large extent due to signals intelligence that came from these machines. So, next slide, please. In this prehistory of the crypto wars, almost all the play was between governments. There was very little role for civil society.
There were one or two journalists who were engaged in trying to map what the NSA and friends were up to. As far as industry was concerned, well, at that time I was working in banking, and we found that encryption for confidentiality was discouraged: if we tried to use line encryption, then faults mysteriously appeared on the line. But authentication was OK. We were allowed to encrypt PIN pad PIN blocks; we were allowed to put MACs on messages. There was some minor harassment. For example, when Rivest, Shamir and Adleman came up with their encryption algorithm, the NSA tried to make it classified, but the Provost of MIT, Jerome Wiesner, persuaded them not to make that fight. The big debate in the 1970s, still, was whether the NSA affected the design of the Data Encryption Standard algorithm, and we know now that this was the case. It was designed to be only just strong enough, and Whit Diffie predicted back in the 1970s that a 2 to the power of 56 key search would eventually be feasible. The EFF built a machine in 1998, and now of course that's fairly easy, because each bitcoin block costs 2 to the power of 68 calculations.

Next slide, please. So where things get interesting is that the NSA persuaded Bill Clinton in one of his first cabinet meetings in 1993 to introduce key escrow, the idea that the NSA should have a copy of everybody's keys. And one of the people at that meeting admitted later that President Bush the elder had been asked and had refused, but Clinton, when he came into office, was naive and thought that this was an opportunity to fix the world. Now, the Clipper chip, which we can see here, was tamper-resistant and had a secret block cipher with an NSA backdoor key. And the launch product was an AT&T secure phone. Next slide, please. Now the Clipper protocol was an interesting one, in that each chip had a unique secret key KU and a global secret family key KNSA burned in.
And in order to, say, send data to Bob, Alice had to send her Clipper chip a working key KW, which was generated by some external means such as a Diffie-Hellman key exchange. And it made a law enforcement access field, or LEAF, which was basically Alice and Bob's names with the working key encrypted under the unit key, and then a hash of the working key encrypted under the NSA key. And that was sent along with the ciphertext to make authorized wiretapping easy. The idea with the hash was that this would stop cheating: Bob's Clipper chip wouldn't use a working key unless it came with a valid LEAF. And I can remember, a few of us can still remember, the enormous outcry that this caused at the time. American companies in particular didn't like it because they started losing business to foreign firms. And in fact, a couple of our students here at Cambridge started a company, nCipher, that grew to be quite large because they could sell worldwide, unlike US firms. People said, why don't we use encryption software? Well, that's easy to write, but it's hard to deploy at scale, as Phil Zimmermann found with PGP. And the big concern was whether key escrow would kill electronic commerce. A secondary concern was: how on earth will we know if government designs are secure? Why on earth should you trust the NSA?

Next slide, please. Well, the first serious fight back in the crypto wars came when Matt Blaze at Bell Labs found an attack on Clipper. He found that Alice could just try lots of these LEAFs until one of them worked, because the tag was only 16 bits long, and it turned out that 2 to the power of 112 of the 2 to the power of 128 possibilities work. And this meant that Alice could generate a bogus LEAF that would pass inspection but which wouldn't decrypt the traffic, and Bob could also generate a new LEAF on the fly. So you could write non-interoperable rogue applications that the NSA had no access to. And with a bit more work, you could make rogue applications interoperate with official ones.
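To make the LEAF construction and Blaze's observation concrete, here is a toy sketch in Python. The real chip used the classified SKIPJACK cipher with 80-bit keys and a defined LEAF layout; the hash-based stand-ins, field widths and identifiers below are illustrative assumptions only, not the actual Clipper format.

```python
# Toy model of the Clipper LEAF and the 16-bit checksum attack (not the real format).
import hashlib, os

def prf(key: bytes, data: bytes) -> bytes:
    """Stand-in keyed function for 'encrypt under this key' (not real crypto)."""
    return hashlib.sha256(key + data).digest()

class ClipperChip:
    def __init__(self, unit_key: bytes, family_key: bytes):
        self.ku, self.knsa = unit_key, family_key      # both burned in at manufacture

    def make_leaf(self, ids: bytes, working_key: bytes) -> bytes:
        wrapped_kw = prf(self.ku, working_key)[:10]    # working key under the unit key
        checksum = prf(self.knsa, working_key)[:2]     # 16-bit tag under the NSA family key
        return ids + wrapped_kw + checksum

    def accepts(self, leaf: bytes, working_key: bytes) -> bool:
        """The receiving chip only verifies the 16-bit tag before using the working key."""
        return leaf[-2:] == prf(self.knsa, working_key)[:2]

family_key = os.urandom(16)                    # KNSA, common to every chip
alice_chip = ClipperChip(os.urandom(16), family_key)
kw = os.urandom(10)                            # working key agreed externally, e.g. Diffie-Hellman
ids = b"Alice>Bob"
legit_leaf = alice_chip.make_leaf(ids, kw)     # what an honest sender would transmit

# Blaze's attack: use your own chip as an oracle and feed it random LEAFs until one
# passes the 16-bit check, which takes about 2**16 tries. The result looks valid to
# the wiretap infrastructure but does not wrap the real working key.
tries = 0
while True:
    tries += 1
    bogus_leaf = ids + os.urandom(12)
    if alice_chip.accepts(bogus_leaf, kw):
        break
print(f"bogus LEAF accepted after {tries} tries")
```

As the sketch suggests, the weak point is the 16-bit tag rather than the classified cipher: a forger can use her own chip's LEAF check as the oracle and expects to succeed after roughly 65,000 attempts.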
This was only the first of many dumb ideas. Next slide, please. OK, so why don't people just use software? Well, at that time the US had export controls on intangible goods such as software, although European countries generally didn't. And this meant that US academics couldn't put crypto code online, although we Europeans could, and we did. And so Phil Zimmermann achieved fame by exporting PGP, Pretty Good Privacy, some encryption software he had written, from America as a paper book, and this was protected by the First Amendment. They sent it across the border to Canada, they fed it into an optical character recognition scanner, they recompiled it, and the code had escaped. For this Phil was subjected to a grand jury investigation. There was also the Bernstein case around code as free speech, and Bruce Schneier rose to fame with his book "Applied Cryptography", which had protocols, algorithms and source code in C, which you could type in in order to get cryptographic algorithms anywhere. And we saw export-controlled clothing. This t-shirt was something that many people wore at the time. I've actually got one and I had planned to wear it for this, but unfortunately I came into the lab in order to get better connectivity and I left it at home. So this t-shirt was an implementation of RSA written in Perl, plus a barcode so that you could scan it in. And in theory, you should not walk across the border wearing this t-shirt; or if you're a US citizen, you shouldn't even let a non-US citizen look at it. So by these means, people probed the outskirts of what was possible and, you know, an awful lot of fun was had. It was a good laugh to tweak the Tyrannosaur's tail. Next slide. But this wasn't just something that was limited to the USA.
The big and obvious problem, if you try and do key escrow in Europe, is that there are dozens of countries in Europe. What happens if someone from Britain, for example, has got a mobile phone that they bought in France, or a German SIM card, and they're standing on the streets in Stockholm and they phone somebody who's in Budapest, who's got a Hungarian phone with a Spanish SIM card in it? Which of these countries' secret police forces should be able to listen to the call? And this was something that stalled the progress of key escrow, that's a good way to describe it, in Europe. And in 1996 GCHQ got academic colleagues at Royal Holloway to come up with a proposal for public sector email, which they believed would fix this. Now, at the time, after Clipper had fallen into disrepute, the NSA's proposal was that all the certification authorities should have to be licensed, and that this would enforce a condition that all private keys would be escrowed, so you would only be able to get a signature on your public key if the private key was held by the CA. And the idea was that you'd have one CA for each government department, and civilians would use trusted firms like Barclays Bank or the Post Office, which would keep our keys safe. And it would also work across other EU member states, so that somebody in Britain calling somebody in Germany would end up in a situation where a CA in Britain that was trustworthy from the NSA's point of view (that is, an untrustworthy CA from our point of view) would be prepared to make a key, and so would one in Germany. This, at least, was the idea.

So how do we do this? Next slide, on the GCHQ protocol. Here's how it was designed to work in the UK government. If Alice at the Department of Agriculture wants to talk to Bob at the Department of Business, she asks her Departmental Security Officer, DA, for a send key for herself and a receive key for Bob.
And DA and DB get a top-level interoperability key KTAB from GCHQ. DA calculates a secret send key of the day for Alice, as a hash of KTAB, Alice's name and DA's own identity, which he gives to Alice, and similarly a public receive key of the day for Bob. Alice sends Bob her public send key along with the encrypted message, and Bob can go to his DSO and get his secret receive key of the day. Now this is slightly complicated, and there are all sorts of other things wrong with it once you start to look at it. Next slide, please. The first is that, from the point of view of the overall effect, you could just as easily have used Kerberos, because you've basically got a key distribution center at both ends which knows everybody's keys. So you've not actually gained very much by using complicated public key mechanisms. The next problem is: what's the law enforcement access need for centrally generated signing keys, if this is actually for law enforcement rather than intelligence? The police want to be able to read things, not forge things. A third problem is that the keys involve hashing department names, and governments are changing the names of departments all the time, as the prime minister of the day moves his ministers around and they chop and change departments. And this means, of course, that everybody has to get new cryptographic keys, and suddenly the old cryptographic keys don't work anymore, and so horrendous complexity comes from this. Now, there are about 10 other things wrong with this protocol, but curiously enough, it's still used by the UK government for the top secret stuff. It went through a number of iterations; it's now called MIKEY-SAKKE, and there are details in my security engineering book. And it turned out to be such a pain that the stuff below top secret now just uses a branded version of G Suite.
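To make the Kerberos comparison concrete, here is a minimal sketch of the key-of-the-day idea just described. The group parameters, names, date string and hash are illustrative assumptions rather than the published Royal Holloway design; the point is simply that anyone holding the interoperability key KTAB can regenerate every key the users ever receive.

```python
# Minimal sketch: keys of the day derived by hashing KTAB with names, so the
# key distribution centres (and GCHQ, which issued KTAB) can reconstruct them all.
import hashlib

P = 2**127 - 1     # toy prime modulus, far too small for real use
G = 5              # toy generator

def kdf(*parts: str) -> int:
    """Derive a secret key of the day from KTAB plus identifying names."""
    return int.from_bytes(hashlib.sha256("|".join(parts).encode()).digest(), "big") % P

KTAB = "top-level interoperability key issued by GCHQ"

# DA derives Alice's secret send key of the day; DB derives Bob's secret receive key.
alice_secret_send = kdf(KTAB, "Alice", "DeptAgriculture", "2020-12-28")
bob_secret_recv = kdf(KTAB, "Bob", "DeptBusiness", "2020-12-28")
alice_public_send = pow(G, alice_secret_send, P)   # sent to Bob along with the message
bob_public_recv = pow(G, bob_secret_recv, P)       # handed to Alice by her DSO

# Alice and Bob each combine their own secret with the other's public value ...
k_alice = pow(bob_public_recv, alice_secret_send, P)
k_bob = pow(alice_public_send, bob_secret_recv, P)
assert k_alice == k_bob

# ... but whoever holds KTAB can simply rerun both kdf() calls and recover the
# same message key, which is why this buys nothing over a Kerberos-style KDC.
k_escrow = pow(pow(G, kdf(KTAB, "Bob", "DeptBusiness", "2020-12-28"), P),
               kdf(KTAB, "Alice", "DeptAgriculture", "2020-12-28"), P)
assert k_escrow == k_alice
```

The final assertion is the whole critique in one line: the public-key dressing changes nothing, since the escrow holders can rebuild both halves of the exchange from KTAB and the names.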
So if what you want to do is to figure out what speech Boris Johnson will be making tomorrow, you just have to guess the password recovery questions for his private secretaries and officials.

Next slide, the Global Internet Trust Register. This was an interesting piece of fun we had around the 1997 election, when Tony Blair took over and brought in the Labour government. Before the election, Labour had promised not to seize crypto keys in bulk without a warrant. And one of the first things that happened to him once he was in office was that Vice President Al Gore went to visit him, and all of a sudden Tony Blair decided that he wanted all certification authorities to be licensed, and they were about to rush this through parliament. So we put all the important public keys in a paper book and we took it to the culture secretary, Chris Smith, and we said: you're the minister for books, so why are you passing a law to ban this book? And if you'll switch to the video shot, I've got the initial copy of the book that we just put together on the photocopying machine in the department. And then we sent the PDF off to MIT and they produced it as a proper book. And this meant that we had a book which was supposedly protected, and this enabled us to get the topic onto the agenda for cabinet discussion. So this at least headed off precipitous action; we ended up with the Regulation of Investigatory Powers Bill in 2000, which was far from perfect, but that's a longer story. So what happened back then is that we set up an NGO, a digital rights organization, the Foundation for Information Policy Research. And the climate at the time was such that we had no difficulty raising a couple of hundred thousand pounds from Microsoft and Hewlett-Packard and Redbus and other tech players. So we were able to hire Caspar Bowden for three years to basically be the director of FIPR and to lobby the government hard on this. And if we can go back to the slides, please, and go to the next slide, the slide on bringing it all together.
So in 1997 a number of us, Hal Abelson and I, and Steve Bellovin, and Josh Benaloh from Microsoft, and Matt Blaze who had broken Clipper, and Whit Diffie who invented digital signatures, and John Gilmore of EFF, Peter Neumann of SRI, Ron Rivest, Jeff Schiller of MIT, and Bruce Schneier who had written Applied Cryptography, got together and wrote a paper on the risks of key recovery, key escrow and trusted third-party encryption, where we discussed the systems consequences of giving third parties or governments access to both traffic data and content, without user notice or consent, deployed internationally and available around the clock. We came to the conclusion that this was not really doable: there were simply too many vulnerabilities and too many complexities.

So how did it end? Well, if we go to the next slide, the victory in Europe wasn't a result of academic arguments; it was a result of industry pressure. And we owe a debt to Commissioner Martin Bangemann, and also to the German government who backed him. In 1994 Martin had put together a group of European CEOs to advise him on internet policy, and they advised him to keep his hands off until we could see which way it was going, what was wrong with this thing, and what we could do with it. And the thing that he developed in order to drive a stake through the heart of key escrow was the Electronic Signatures Directive in 1999. This gave a rebuttable presumption of validity to qualifying electronic signatures, but subject to a number of conditions, and one of these was that the signing key must never be known to anybody other than the signer. And this killed the idea of licensing CAs in such a way that the NSA had access to all the private key material. The agencies had argued that without controlling signatures, you couldn't control encryption. But of course, as intelligence agencies, they were as much interested in manipulating information as they were in listening in to it. And this created a really sharp conflict with businesses.
In the UK, the Regulation of Investigatory Powers Bill went through the following year. And there we got strong support from the banks, who did not want the possibility of intelligence and law enforcement personnel either getting hold of bank keys or forging banking transactions. And so we managed, with their help, to insert a number of conditions into the bill, which meant that if a court or a chief constable, for example, demands a key from a company, they've got to demand it from somebody at the level of a director of the company, and it's got to be signed by someone really senior, such as the chief constable. So there were some controls that we managed to get in there.

Next slide! What did victory in the USA look like? Well, in the middle of 2000, as a number of people had predicted, Al Gore decided that he wanted to stop fighting the tech industry in order to get elected president. And there was a deal done at the time which was secret. It was done at the FBI headquarters at Quantico, whereby US law enforcement would rely on naturally occurring vulnerabilities rather than compelling their insertion by companies like Intel or Microsoft. This was secret at the time, and I happen to know about it because I was consulting for Intel, and the NDA I was under had a four-year time limit on it. So after 2004 I was at liberty to talk about this. And so this basically gave the NSA access to the CERT feed. As part of this deal, the export rules were liberalized a bit, but with various hooks and gotchas left so that the authorities could bully companies who got too difficult. And in 2002, Robert Morris senior, who had been the chief scientist at the NSA for much of this period, admitted that the real policy goal was to ensure that the many systems developed during the dotcom boom were deployed with weak protection or none. And there's a huge, long list of these. Next slide, please. So what was the collateral damage from crypto war one?
This is the first novel part of this talk, which I've put together as a result of spending the last academic year writing the third edition of my book on security engineering. As I've gone through and updated all the chapters, on car security, on web security and so on and so forth, we find the damage everywhere: there are still very serious costs remaining from crypto war one. For example, almost all of the remote key entry systems for cars use inadequate cryptography, poor random number generators and so on and so forth, and car theft has almost doubled in the past five years. This is not all due to weak crypto, but it's substantially due to a wrong culture that got started in the context of the crypto wars. Second, there are millions of door locks still using Mifare Classic, even in the building where I work: the University of Cambridge changed its door locks around 2000, so we've still got a whole lot of Mifare Classic around. And it's very difficult when you've got 100 buildings to change all the locks on them. And this is the case with thousands of organizations worldwide, with universities, with banks, with all sorts of people, simply because changing all the locks at once in dozens of buildings is just too expensive. Then, of course, there's the CA in your browser: most nations own or control certification authorities that your browser trusts, and the few nations that weren't allowed to own such CAs, such as Iran, get up to mischief, as we found in the case of the DigiNotar hack a few years ago. And this means that most nations have got a more or less guaranteed ability to do man-in-the-middle attacks on your web logons. Some companies like Google, of course, started to fix that with various mechanisms such as certificate pinning, but that was a deliberate vulnerability that was there for a long, long time and is still very widespread. Phones: 2G is insecure. That actually goes back to the Cold War rather than the crypto war.
But thanks to the crypto wars, 4G and 5G are not very much better. The details are slightly complicated and, again, they're described in the book. Bluetooth is easy to hack; that's another piece of legacy. And as I mentioned, the agencies own the CERTs' responsible disclosure pipeline, which means that they get a free firehose of zero-days that they can exploit for perhaps a month or three before these end up being patched. So next slide, please. Last year, when I talked at Chaos Communication Congress, the audience chose this as the cover for my security engineering book, and that's now out. And it's the process of writing this that brought home to me the scale of the damage that we still suffer as a result of crypto war one.

So let's move on to the next slide and the next period of history, which we might call the war on terror. I've arbitrarily put this down as 2000 to 2013, although some countries stopped using the phrase "war on terror" in about 2008, once we had got rid of George W. Bush and Tony Blair. But as a historical convenience, this is, if you like, the central period in our tale. And it starts off with a lot of harassment around the edges of security and cryptography. For example, in 2000 Tony Blair promoted the EU dual-use regulation, number 1334, to extend export controls from tangible goods such as rifles and tanks to intangibles such as crypto software, despite the fact that he had basically declared peace on the tech industry. Two years later, in 2002, the UK parliament balked at an export control bill that was going to transpose this, because it added controls on scientific speech: not just crypto code, but even papers on cryptanalysis and even electron microscope scripts. And so parliament inserted a research exemption clause, at the urging of the then president of the Royal Society, Sir Robert May. But what then happened is that GCHQ used EU regulations to frustrate Parliament, and this pattern of extralegal behavior was to continue.
Next slide! Because after export control, the play shifted to traffic data retention, another bad thing that, I'm afraid to say, the UK exported to Europe back in the days when we were, in effect, the Americans' consigliere on the European Council. Sorry about that, folks, but all I can say is at least we helped start EDRi a year after that. So one of the interesting aspects of this was that our then home secretary, Jacqui Smith, started talking about the need for a common database of all the metadata, of who had phoned whom when and who had sent an email to whom when, so that the police could continue to use their traditional contact tracing techniques online. And the line that got hammered home to us again and again and again was: if you've got nothing to hide, you've got nothing to fear. What then happened in 2008 is that a very bad person went into Parliament, went to the PC where the expense claims of MPs were kept, copied all the expense claims onto a DVD and sold it around Fleet Street. The Daily Telegraph bought it from them for £400,000, and then for the best part of a year the Daily Telegraph was telling scandalous things about what various members of parliament had claimed from the taxpayer. And it turned out that, although Jacqui Smith herself may have been innocent, her husband had been downloading pornography and charging it to her parliamentary expenses. So she lost her job as home secretary, and she lost her seat in parliament, and the communications data bill was lost. So was this a victory? Well, in June 2013 we learned from Ed Snowden that they had just built it anyway, despite parliament. So maybe the victory in parliament wasn't what it seemed to be at the time. But I'm getting ahead of myself; anyway, next slide, please. The other thing that we did in the 2000s: I spent maybe a third of my time on it, and about another hundred people joined in, and we developed the economics of security as a discipline.
We began to realize that many of the things that went wrong happened because Alice was guarding a system and Bob was paying the cost of failure. For example, if you've got a payment system, then the people who are in a position to prevent fraud are basically the merchants and the banks that acquire transactions from them, while the costs of fraud fall on the cardholders and the banks that issue them with cards. And the two aren't the same. It's this that causes the governance tensions, causes governance to break down, and makes fraud harder to prevent than it should be. Now after that, one of the early topics was patching and responsible disclosure. We worked through all the issues of whether you should not patch at all, which some people in industry wanted to do; or whether you should just put all the bugs on bug trackers, which some hackers wanted to do; or whether you should go through the CERT system, despite the NSA compromise, because they at least would give you legal cover and, you know, bully Microsoft into patching the bug the next Patch Tuesday and then disclosing after 90 days. And we eventually came to the conclusion, and industry followed, that responsible disclosure was the way to go.

Now, one of the problems that arises here is the equities issue. Suppose you're the director of the NSA and somebody comes to you with some super new innovative bug. Say they have rediscovered Spectre, for example. So you've got a bug which can be used to penetrate any crypto software that's out there. Do you report the bug to Microsoft and Intel to defend 300 million Americans, or do you keep it quiet so you can exploit 450 million Europeans and a thousand million Chinese and so on and so forth? Well, once you put it that way, it's fairly obvious that the NSA will favor attack over defense. And there are multiple models of attack and defense. You can think of institutional factors and politics. For example, suppose you are the director of the NSA and you defend 300 million Americans.
You defend the White House against the Chinese hacking it, but the president will never know whether he's been hacked or not, because the Chinese will keep it quiet if they do. If, on the other hand, you manage to hack the Politburo's LAN in Peking, you can serve up some juicy intelligence every morning with the president's breakfast cereal. So that's an even stronger argument for why you should do attack rather than defense. And one other thing that I'll mention in passing is that throughout the 2000s, governments also scrambled to get more data on their citizens; for example, in Britain there was a long debate about whether medical records should be centralized. In the beginning, we said that if you were to centralize all medical records, that would be such a large target that the database would have to be top secret, and it would be too inconvenient for doctors to use. Well, Blair decided in 2001 to do it anyway. We wrote a report in 2009 saying that this was a red line and that this was a serious hazard, and then in 2014 we discovered that Cameron's buddy, who was the transparency czar at the NHS, had sold the database to 1200 researchers, including drug companies in China. So that meant that all the sensitive personal health information about a billion patient episodes had been sold around the world and was available not just to medical researchers, but to foreign intelligence services.

This brings us on to Snowden. In June 2013 we had one of those game-changing moments, when Ed Snowden leaked a whole bunch of papers showing that the NSA had been breaking the law in America and GCHQ had been breaking the law in Britain, that we had been lied to, that parliament had been misled, and that a whole lot of collection and interception was going on which supposedly shouldn't have been going on. Now, one of the things that got industry attention was a system called PRISM, which was in fact legal, because this was done as a result of warrants being served on the major Internet service providers.
And if we could move to the next slide, we can see that this started off with Microsoft in 2007. Yahoo! came in 2008; they fought in court for a year and they lost; and then Google and Facebook and so on got added. This basically enabled the NSA to go to someone like Google and say: rossjanderson@gmail.com is a foreign national, we're therefore entitled to read his traffic, kindly give us his Gmail. And Google would say, yes, sir. For Americans, you have to show probable cause that they've committed a crime; for foreigners, you simply have to show probable cause that they're a foreigner.

The next slide, this disclosure from Snowden, revealed that PRISM, despite the fact that it only cost about 20 million dollars a year, was generating something like half of all the intelligence that the NSA was using by the end of financial year 2012. But that was not all. Next slide, please. The thing that really annoyed Google was this slide, on the deck from a presentation at GCHQ, showing how the NSA was not merely collecting stuff through the front door by serving warrants on Google in Mountain View; it was collecting stuff through the back door as well, because they were harvesting the plaintext copies of Gmail and maps and docs and so on which were being sent backwards and forwards between Google's different data centers. And the little smiley face, which you can see on the sticky, got Sergey and friends really, really uptight. And they just decided: right, you know, we're not going to allow this, they will have to knock and show warrants in the future. And there was a crash program at all the major Internet service providers to encrypt all the traffic, so that in future things could only be got by means of a warrant. Next slide, please. The EU was really annoyed by what was called Operation Socialist.
Operation Socialist was basically the hack of Belgacom: GCHQ spearphished some technical staff at Belgacom, and this enabled them to wiretap all the traffic at the European Commission in Brussels, as well as mobile phone traffic to and from various countries in Africa. And this is rather amazing. It's as if Nicola Sturgeon, the First Minister of Scotland, had tasked Police Scotland with hacking BT so that she could watch what was going on with the parliament in London. So this annoyed a number of people. With the next slide, we can see that Operation Bullrun, or Operation Edgehill as GCHQ called their version of it, was an aggressive, multipronged effort to break widely used Internet encryption technologies. And we learned an awful lot about what was being done to break VPNs worldwide, and what had been done in terms of inserting vulnerabilities into protocols, getting people to use vulnerable prime numbers for Diffie-Hellman key exchange, and so on and so forth.

Next slide. The Bullrun and Edgehill SIGINT enabling projects actively engage the US and foreign IT industries to covertly influence and/or overtly leverage their commercial products' designs. These design changes make the systems in question exploitable through SIGINT collection (endpoint, midpoint, et cetera) with foreknowledge of the modification; to the consumer and other adversaries, however, the systems' security remains intact. Next slide: they insert vulnerabilities into commercial systems, IT systems, networks and endpoint communication devices used by targets. Next slide: they also influence policies, standards and specifications for commercial public key technologies. And this was the smoking gun that crypto war 1 had not actually ended; it had just gone undercover. And so with this, things came out into the open. Next slide. So we could perhaps date crypto war 2 to the Snowden disclosures and their aftermath.
In America, it must be said that all three arms of the US government showed at least mild remorse. Obama set up the NSA review group and adopted most of what it said, except on the equities issue. Congress curbed data retention when it renewed the Patriot Act, and the FISA court introduced an advocate for targets. Tech companies, as I mentioned, started encrypting all their traffic. In the UK, on the other hand, the government expressed no remorse at all, and they passed the Investigatory Powers Act to legalize all the unlawful things they had already been doing. And they could now order firms to secretly do anything they physically can. However, data retention was nixed by the European courts. The academic response is on the next slide: keys under doormats, by much the same authors as before. We analyzed the new situation and came to much the same conclusions.

Next slide: the 2018 GCHQ proposal from Ian Levy and Crispin Robinson was to add ghost users to WhatsApp and FaceTime calls in response to warrants. The idea is that you've got an FBI key on your device keyring; you still have end-to-end, you just have an extra end. And this, of course, fails the keys-under-doormats tests: your software would abandon best practice, it would create targets and increase complexity, and it would also have to lie about trust.

Next slide, please. This brings us to the upload filters which were proposed over the past six months. They first surfaced in early 2020 at a Stanford think tank, and they were adopted by Commissioner Ylva Johansson on June the 9th, at the start of the German presidency. On the 20th of September we got a leaked tech paper whose authors include our GCHQ friends Ian Levy and Crispin Robinson. The top options are that you filter in client software, assisted by a server, as client-side-only filtering is too constrained and easy to compromise. The excuse is that you want to stop illegal material such as child sex abuse images being shared over end-to-end messaging systems such as WhatsApp.
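As a concrete illustration of the filtering shape being discussed (not anything specified in the leaked paper), here is a minimal client-side sketch: hash an outgoing attachment and check it against a server-supplied blocklist before end-to-end encryption happens. Deployed proposals would use perceptual hashing rather than the exact hash used here, and every name in the sketch is hypothetical.

```python
# Minimal sketch of client-side filtering before end-to-end encryption.
# Everything here is hypothetical: real schemes use perceptual hashes (so
# near-duplicates still match) and a server-assisted matching step.
import hashlib

def fetch_blocklist() -> set:
    # Hypothetical server-supplied list of forbidden-image hashes. Who curates
    # and updates this list is exactly the governance question raised below.
    return {"placeholder-hash-1", "placeholder-hash-2"}

def screen_outgoing_attachment(image_bytes: bytes, blocklist: set) -> str:
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in blocklist:
        # Policy fork pushed onto the client: silently block, or report the
        # user to some authority, before the message is ever encrypted.
        return "block"
    return "send"
```

Note that an exact-match list like this is trivially evaded by changing a single pixel, which is the point made next about static filtering; making it dynamic means shipping an updatable matcher whose contents users cannot inspect.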
Various\NNGOs objected, and we had a meeting with Dialogue: 0,0:38:34.21,0:38:39.58,Default,,0000,0000,0000,,the commission, which was a little bit\Nlike a Stockholm Syndrome event. We had Dialogue: 0,0:38:39.58,0:38:43.75,Default,,0000,0000,0000,,one official there on the child protection\Nfront, flanked by half a dozen officials from Dialogue: 0,0:38:43.75,0:38:48.61,Default,,0000,0000,0000,,various security bodies, departments and\Nagencies, who seemed to be clearly driving Dialogue: 0,0:38:48.61,0:38:53.19,Default,,0000,0000,0000,,the thing, with child protection merely\Nbeing an excuse to promote it. Dialogue: 0,0:38:53.19,0:39:00.36,Default,,0000,0000,0000,,Well, the obvious things to worry about\Nare these: as there's similar language in the new Dialogue: 0,0:39:00.36,0:39:04.73,Default,,0000,0000,0000,,terror regulation, you can expect the\Nfilter to extend from child sex abuse Dialogue: 0,0:39:04.73,0:39:10.84,Default,,0000,0000,0000,,material to terror. And static filtering\Nwon't work, because if there's a banned list Dialogue: 0,0:39:10.84,0:39:15.38,Default,,0000,0000,0000,,of 100,000 forbidden images, then the bad\Npeople will just go out and make another Dialogue: 0,0:39:15.38,0:39:22.53,Default,,0000,0000,0000,,100,000 child sex abuse images. So the\Nfiltering will have to become dynamic. And Dialogue: 0,0:39:22.53,0:39:26.88,Default,,0000,0000,0000,,then the question is whether your phone\Nwill block it or report it. And there's an Dialogue: 0,0:39:26.88,0:39:32.09,Default,,0000,0000,0000,,existing legal duty in a number of\Ncountries, and in the UK too, although Dialogue: 0,0:39:32.09,0:39:37.31,Default,,0000,0000,0000,,obviously no longer a member state, an\Nexisting duty to report terror material. And Dialogue: 0,0:39:37.31,0:39:41.84,Default,,0000,0000,0000,,the question is, who will be in charge of\Nupdating the filters? What's going to Dialogue: 0,0:39:41.84,0:39:50.75,Default,,0000,0000,0000,,happen then? Next slide. Well, we've seen\Nan illustration: during the lockdown in Dialogue: 0,0:39:50.75,0:39:55.23,Default,,0000,0000,0000,,April, the French and Dutch governments\Nsent an update to all Encrochat mobile Dialogue: 0,0:39:55.23,0:39:59.45,Default,,0000,0000,0000,,phones with a rootkit which copied\Nmessages, crypto keys and lock screen Dialogue: 0,0:39:59.45,0:40:04.46,Default,,0000,0000,0000,,passwords. Encrochat was a brand of\Nmobile phone that was sold through Dialogue: 0,0:40:04.46,0:40:11.00,Default,,0000,0000,0000,,underground channels to various criminal\Ngroups and others. And since this was Dialogue: 0,0:40:11.00,0:40:18.12,Default,,0000,0000,0000,,largely used by criminals of various\Nkinds, the U.K. government justified bulk Dialogue: 0,0:40:18.12,0:40:24.16,Default,,0000,0000,0000,,intercept by passing it off as targeted\Nequipment interference. In other Dialogue: 0,0:40:24.16,0:40:28.60,Default,,0000,0000,0000,,words, they got a targeted warrant for\Nall forty-five thousand Encrochat handsets, Dialogue: 0,0:40:28.60,0:40:33.40,Default,,0000,0000,0000,,and of the ten thousand users in the U.K.,\Neight hundred were arrested in June when Dialogue: 0,0:40:33.40,0:40:39.68,Default,,0000,0000,0000,,the wiretapping exercise was completed.\NNow, again, this appears to ignore the Dialogue: 0,0:40:39.68,0:40:44.45,Default,,0000,0000,0000,,laws that we have on the books, because\Neven our Investigatory Powers Act rules Dialogue: 0,0:40:44.45,0:40:48.71,Default,,0000,0000,0000,,out bulk interception of U.K.\Nresidents. 
And those who follow such Dialogue: 0,0:40:48.71,0:40:52.95,Default,,0000,0000,0000,,matters will know that there was a trial\Nat Liverpool Crown Court, a hearing on Dialogue: 0,0:40:52.95,0:40:59.37,Default,,0000,0000,0000,,whether this stuff was admissible. And we\Nshould have a first verdict on that early Dialogue: 0,0:40:59.37,0:41:05.27,Default,,0000,0000,0000,,in the new year. And that will no doubt go\Nto appeal. And if the material is held to Dialogue: 0,0:41:05.27,0:41:09.82,Default,,0000,0000,0000,,be admissible, then there will be a whole\Nseries of trials. So this brings me to my Dialogue: 0,0:41:09.82,0:41:17.05,Default,,0000,0000,0000,,final point. What can we expect going\Nforward? China is emerging as a full-stack Dialogue: 0,0:41:17.05,0:41:21.70,Default,,0000,0000,0000,,competitor to the West, not like Russia in\NCold War one, because Russia only ever Dialogue: 0,0:41:21.70,0:41:26.76,Default,,0000,0000,0000,,produced primary goods, like\Noil, and weapons and trouble, of course. But Dialogue: 0,0:41:26.76,0:41:30.69,Default,,0000,0000,0000,,China is trying to compete all the way up\Nand down the stack from chips, through Dialogue: 0,0:41:30.69,0:41:35.69,Default,,0000,0000,0000,,software, up through services and\Neverything else. And developments in China Dialogue: 0,0:41:35.69,0:41:40.85,Default,,0000,0000,0000,,don't exactly fill one with much\Nconfidence, because in March 2018, Dialogue: 0,0:41:40.85,0:41:45.40,Default,,0000,0000,0000,,President Xi declared himself to be ruler\Nfor life, basically tearing up the Chinese Dialogue: 0,0:41:45.40,0:41:50.28,Default,,0000,0000,0000,,constitution. There are large-scale state\Ncrimes being committed in Tibet and Dialogue: 0,0:41:50.28,0:41:55.24,Default,,0000,0000,0000,,Xinjiang and elsewhere. Just last week,\NBritain's chief rabbi described the Dialogue: 0,0:41:55.24,0:42:03.99,Default,,0000,0000,0000,,treatment of Uyghurs as an unfathomable\Nmass atrocity. In my book, I describe Dialogue: 0,0:42:03.99,0:42:09.28,Default,,0000,0000,0000,,escalating cyber conflict and various\Nhacks, such as the hack of the Office of Dialogue: 0,0:42:09.28,0:42:15.10,Default,,0000,0000,0000,,Personnel Management, which had clearance\Nfiles on all Americans who work for the Dialogue: 0,0:42:15.10,0:42:20.71,Default,,0000,0000,0000,,federal government, the hack of Equifax,\Nwhich got credit ratings and credit Dialogue: 0,0:42:20.71,0:42:25.56,Default,,0000,0000,0000,,histories of all Americans. And there are\Nalso growing tussles in standards. For Dialogue: 0,0:42:25.56,0:42:32.84,Default,,0000,0000,0000,,example, the draft ISO 27553 on biometric\Nauthentication for mobile phones is Dialogue: 0,0:42:32.84,0:42:38.08,Default,,0000,0000,0000,,introducing, at the insistence of Chinese\Ndelegates, a central database option. So Dialogue: 0,0:42:38.08,0:42:43.48,Default,,0000,0000,0000,,in future, your phone might not verify\Nyour faceprint or your fingerprint Dialogue: 0,0:42:43.48,0:42:50.44,Default,,0000,0000,0000,,locally. It might do it against a central\Ndatabase. Next slide: how could Cold War Dialogue: 0,0:42:50.44,0:42:56.55,Default,,0000,0000,0000,,2.0 be different? Well, there are a number\Nof interesting things here, and the Dialogue: 0,0:42:56.55,0:43:00.96,Default,,0000,0000,0000,,purpose of this talk is to try and kick\Noff a discussion of these issues. China Dialogue: 0,0:43:00.96,0:43:06.12,Default,,0000,0000,0000,,makes electronics, not just guns, the way\Nthe old USSR did. 
Can you have a separate Dialogue: 0,0:43:06.12,0:43:13.60,Default,,0000,0000,0000,,supply chain for China and one for\Neverybody else? But hang on a minute, Dialogue: 0,0:43:13.60,0:43:20.22,Default,,0000,0000,0000,,consider the fact that China has now\Ncollected very substantial personal data Dialogue: 0,0:43:20.22,0:43:25.30,Default,,0000,0000,0000,,sets: from the Office of Personnel\NManagement hack, data on US government employees; Dialogue: 0,0:43:25.30,0:43:32.36,Default,,0000,0000,0000,,by forcing Apple to set up its own data\Ncenters in China for iPhone users in Dialogue: 0,0:43:32.36,0:43:39.27,Default,,0000,0000,0000,,China, they get access to all the data\Nfor Chinese users of iPhones that America Dialogue: 0,0:43:39.27,0:43:44.75,Default,,0000,0000,0000,,gets for American users of iPhones, plus\Nmaybe more as well, if the Chinese can Dialogue: 0,0:43:44.75,0:43:50.69,Default,,0000,0000,0000,,break the HSMs in Chinese data centers, as\Nwe expect them to be able to. Equifax got Dialogue: 0,0:43:50.69,0:43:56.96,Default,,0000,0000,0000,,them data on all economically active\Npeople in the USA. care.data gave them Dialogue: 0,0:43:56.96,0:44:02.39,Default,,0000,0000,0000,,medical records of everybody in the UK.\NAnd this bulk personal data is already Dialogue: 0,0:44:02.39,0:44:08.47,Default,,0000,0000,0000,,in targeted intelligence use: when\NWestern countries, for example, send Dialogue: 0,0:44:08.47,0:44:13.64,Default,,0000,0000,0000,,diplomats to countries in Africa or Latin\NAmerica, the local Chinese counter- Dialogue: 0,0:44:13.64,0:44:16.87,Default,,0000,0000,0000,,intelligence people know whether they're\Nbona fide diplomats or whether they're Dialogue: 0,0:44:16.87,0:44:22.21,Default,,0000,0000,0000,,intelligence agents under cover, all\Nfrom exploitation of this personal Dialogue: 0,0:44:22.21,0:44:26.22,Default,,0000,0000,0000,,information. Now, given that this\Ninformation is already in efficient targeted Dialogue: 0,0:44:26.22,0:44:31.97,Default,,0000,0000,0000,,use, the next question we have to ask is\Nwhen will it be used at scale? And this is Dialogue: 0,0:44:31.97,0:44:37.39,Default,,0000,0000,0000,,the point at which we say that the\Nequities issue now needs a serious rethink Dialogue: 0,0:44:37.39,0:44:43.83,Default,,0000,0000,0000,,and the whole structure of the conflict is\Ngoing to have to move from more offensive Dialogue: 0,0:44:43.83,0:44:49.54,Default,,0000,0000,0000,,to more defensive, because we depend on\Nsupply chains to which the Chinese have Dialogue: 0,0:44:49.54,0:44:55.46,Default,,0000,0000,0000,,access more than they depend on supply\Nchains to which we have access. Now, it's Dialogue: 0,0:44:55.46,0:45:01.19,Default,,0000,0000,0000,,dreadful that we're headed towards a new\NCold War, but as we head there, we have to Dialogue: 0,0:45:01.19,0:45:05.95,Default,,0000,0000,0000,,ask also about the respective roles of\Ngovernments, industry, civil society and Dialogue: 0,0:45:05.95,0:45:14.04,Default,,0000,0000,0000,,academia. Next slide, please. And so\Nmy point is this: if Cold Dialogue: 0,0:45:14.04,0:45:18.86,Default,,0000,0000,0000,,War 2.0 does happen, and I hope it doesn't,\Nbut we appear to be headed that way Dialogue: 0,0:45:18.86,0:45:23.68,Default,,0000,0000,0000,,despite the change of government in the\NWhite House, then we need to be able to Dialogue: 0,0:45:23.68,0:45:31.01,Default,,0000,0000,0000,,defend everybody, not just the elites. 
Now,\Nit's not going to be easy, because there Dialogue: 0,0:45:31.01,0:45:35.27,Default,,0000,0000,0000,,are more state players: the USA is a big\Nbloc, the EU is a big bloc. There are Dialogue: 0,0:45:35.27,0:45:39.65,Default,,0000,0000,0000,,other players, other democracies, there are\Nother non-democracies, there are other failing Dialogue: 0,0:45:39.65,0:45:45.21,Default,,0000,0000,0000,,democracies. This is going to be complex\Nand messy. It isn't going to be a Dialogue: 0,0:45:45.21,0:45:50.31,Default,,0000,0000,0000,,situation like last time, where big tech\Nreached out to civil society and academia Dialogue: 0,0:45:50.31,0:45:55.93,Default,,0000,0000,0000,,and we saw a united front against\Nthe agencies. And even in that case, of Dialogue: 0,0:45:55.93,0:46:00.55,Default,,0000,0000,0000,,course, the victory that we got was only\Nan apparent victory, a superficial victory Dialogue: 0,0:46:00.55,0:46:06.41,Default,,0000,0000,0000,,that only lasted for a while. So what\Ncould we do? Well, at this point, I think Dialogue: 0,0:46:06.41,0:46:10.96,Default,,0000,0000,0000,,we need to remind all the players to\Nlisten. It's not just about strategy Dialogue: 0,0:46:10.96,0:46:15.80,Default,,0000,0000,0000,,and tactics, it's about values, too.\NAnd so we need to be firmly on the side of Dialogue: 0,0:46:15.80,0:46:21.47,Default,,0000,0000,0000,,freedom, privacy and the rule of law. Now,\Nfor the old timers, you may remember that Dialogue: 0,0:46:21.47,0:46:29.52,Default,,0000,0000,0000,,there was a product called Tom-Skype,\Nwhich was introduced in 2011 in China. The Dialogue: 0,0:46:29.52,0:46:34.47,Default,,0000,0000,0000,,Chinese wanted their citizens to be able to\Nuse Skype, but they wanted to be able to Dialogue: 0,0:46:34.47,0:46:38.29,Default,,0000,0000,0000,,wiretap as well, despite the fact that\NSkype at the time had end-to-end Dialogue: 0,0:46:38.29,0:46:44.52,Default,,0000,0000,0000,,encryption. And so people in China were\Ncompelled to download a client for Skype Dialogue: 0,0:46:44.52,0:46:50.45,Default,,0000,0000,0000,,called Tom-Skype. Tom was the company that\Ndistributed Skype in China, and it Dialogue: 0,0:46:50.45,0:46:55.07,Default,,0000,0000,0000,,basically had built-in wiretapping. So\Nyou had end-to-end encryption using Skype Dialogue: 0,0:46:55.07,0:47:01.24,Default,,0000,0000,0000,,in those days, but in China, you ended up\Nhaving a Trojan client, which you had to Dialogue: 0,0:47:01.24,0:47:08.24,Default,,0000,0000,0000,,use. And what we are doing at the moment\Nis basically that the EU is trying to copy Tom- Dialogue: 0,0:47:08.24,0:47:13.44,Default,,0000,0000,0000,,Skype and saying that we should be doing\Nwhat China was doing eight years ago. And Dialogue: 0,0:47:13.44,0:47:17.54,Default,,0000,0000,0000,,I say we should reject that. We can't\Nchallenge President Xi by going down that Dialogue: 0,0:47:17.54,0:47:21.97,Default,,0000,0000,0000,,route. Instead, we've got to reset our\Nvalues and we've got to think through the Dialogue: 0,0:47:21.97,0:47:27.60,Default,,0000,0000,0000,,equities issue, and we've got to figure out\Nhow we're going to deal with Dialogue: 0,0:47:27.60,0:47:32.57,Default,,0000,0000,0000,,the challenges of dealing with non-\Ndemocratic countries when there is serious Dialogue: 0,0:47:32.57,0:47:40.62,Default,,0000,0000,0000,,conflict in a globalized world where we're\Nsharing the same technology. Thanks. 
And Dialogue: 0,0:47:40.62,0:47:52.23,Default,,0000,0000,0000,,perhaps the last slide, for my book, can\Ncome now, and I'm happy to take questions. Dialogue: 0,0:47:52.23,0:47:58.46,Default,,0000,0000,0000,,Herald: Yeah, thanks a lot, Ross, for your\Ntalk. It's a bit depressing to listen to Dialogue: 0,0:47:58.46,0:48:09.51,Default,,0000,0000,0000,,you, I have to admit. Let's have a look.\NOK, so I have a question. I'm wondering if Dialogue: 0,0:48:09.51,0:48:15.37,Default,,0000,0000,0000,,the export controls at EU level became\Nworse than UK-level export controls Dialogue: 0,0:48:15.37,0:48:20.66,Default,,0000,0000,0000,,because entities like GCHQ had more\Ninfluence there, or because there's a harmful Dialogue: 0,0:48:20.66,0:48:26.62,Default,,0000,0000,0000,,Franco-German security culture, or whatever it\Nwas. Do you have anything on that? Dialogue: 0,0:48:26.62,0:48:30.89,Default,,0000,0000,0000,,Ross: Well, the experience that we had\Nwith these export controls, once they were Dialogue: 0,0:48:30.89,0:48:38.26,Default,,0000,0000,0000,,in place, was as follows. It was about\N2015, I think, or 2016, that it came to our Dialogue: 0,0:48:38.26,0:48:43.80,Default,,0000,0000,0000,,attention that a British company, Sophos,\Nwas selling bulk surveillance equipment to Dialogue: 0,0:48:43.80,0:48:49.33,Default,,0000,0000,0000,,President al-Assad of Syria, and he was\Nusing it to basically wiretap his entire Dialogue: 0,0:48:49.33,0:48:54.08,Default,,0000,0000,0000,,population and decide who he was going to\Narrest and kill the following day. And it Dialogue: 0,0:48:54.08,0:48:58.53,Default,,0000,0000,0000,,was sold by Sophos, in fact, through a\NGerman subsidiary. And so we went along to Dialogue: 0,0:48:58.53,0:49:06.87,Default,,0000,0000,0000,,the export control office in Victoria\NStreet. A number of NGOs went along: the Open Rights Dialogue: 0,0:49:06.87,0:49:11.88,Default,,0000,0000,0000,,Group, and Privacy International,\Nand us, and one or two others. And we said, Dialogue: 0,0:49:11.88,0:49:16.48,Default,,0000,0000,0000,,look, according to the EU dual-use\Nregulation, bulk intercept equipment is Dialogue: 0,0:49:16.48,0:49:19.95,Default,,0000,0000,0000,,military equipment. It should be on the\Nmilitary list. Therefore, you should be Dialogue: 0,0:49:19.95,0:49:25.33,Default,,0000,0000,0000,,demanding an export license for this\Nstuff. And they found every conceivable Dialogue: 0,0:49:25.33,0:49:34.10,Default,,0000,0000,0000,,excuse not to demand it. And it was the\Nlady from GCHQ there in the room who was Dialogue: 0,0:49:34.10,0:49:38.28,Default,,0000,0000,0000,,clearly calling the shots. And she was\Nabsolutely determined that there should be Dialogue: 0,0:49:38.28,0:49:44.04,Default,,0000,0000,0000,,no export controls on the stuff being sold\Nto Syria. And eventually I said, look, Dialogue: 0,0:49:44.04,0:49:47.26,Default,,0000,0000,0000,,it's fairly obvious what's going on here.\NIf there are going to be black boxes in Dialogue: 0,0:49:47.26,0:49:51.11,Default,,0000,0000,0000,,President al-Assad's network, you want\Nthem to be British black boxes or German Dialogue: 0,0:49:51.11,0:49:55.96,Default,,0000,0000,0000,,black boxes, not Ukrainian or Israeli\Nblack boxes. And she said, I cannot Dialogue: 0,0:49:55.96,0:50:00.83,Default,,0000,0000,0000,,discuss classified matters in an open\Nmeeting, which is as close as you get to Dialogue: 0,0:50:00.83,0:50:06.84,Default,,0000,0000,0000,,an admission. 
And a couple of months\Nlater, Angela Merkel, to her great credit, Dialogue: 0,0:50:06.84,0:50:12.64,Default,,0000,0000,0000,,actually came out in public and said\Nthat allowing the equipment to be exported Dialogue: 0,0:50:12.64,0:50:16.44,Default,,0000,0000,0000,,from Utimaco to Syria was one of the\Nhardest decisions she'd ever taken as Dialogue: 0,0:50:16.44,0:50:21.77,Default,,0000,0000,0000,,chancellor. And that was a very difficult\Ntradeoff between maintaining intelligence Dialogue: 0,0:50:21.77,0:50:27.47,Default,,0000,0000,0000,,access, given the possibility that Western\Ntroops would be involved in Syria, and the Dialogue: 0,0:50:27.47,0:50:33.30,Default,,0000,0000,0000,,fact that the kit was being used for very\Nevil purposes. So that's an example of how Dialogue: 0,0:50:33.30,0:50:38.28,Default,,0000,0000,0000,,the export controls are used in practice.\NThey are not used to control the harms Dialogue: 0,0:50:38.28,0:50:44.33,Default,,0000,0000,0000,,that we as voters are told they're\Nthere to control. Right? They are used in Dialogue: 0,0:50:44.33,0:50:49.94,Default,,0000,0000,0000,,all sorts of dark and dismal games. And we\Nreally have to tackle the issue of export Dialogue: 0,0:50:49.94,0:50:55.98,Default,,0000,0000,0000,,controls with our eyes open.\NH: Yeah, yeah. There's a lot, a lot to do. Dialogue: 0,0:50:55.98,0:51:03.80,Default,,0000,0000,0000,,And now Germany has left the UN\NSecurity Council. So let's see what Dialogue: 0,0:51:03.80,0:51:13.00,Default,,0000,0000,0000,,happens next. Yeah. We'll see, Ross.\NAnything else you'd like to add? We don't Dialogue: 0,0:51:13.00,0:51:19.35,Default,,0000,0000,0000,,have any more questions. Oh, no, we have\Nanother question. It's just come up Dialogue: 0,0:51:19.35,0:51:24.51,Default,,0000,0000,0000,,seconds ago. Do you think that refusal to\Naccept back doors will create large Dialogue: 0,0:51:24.51,0:51:35.30,Default,,0000,0000,0000,,uncensorable applications?\NR: Well, if you get large applications Dialogue: 0,0:51:35.30,0:51:41.62,Default,,0000,0000,0000,,which are associated with significant\Neconomic power, then pressure gets Dialogue: 0,0:51:41.62,0:51:51.45,Default,,0000,0000,0000,,brought to bear on those economic players\Nto do their social duty. And... this is what Dialogue: 0,0:51:51.45,0:51:56.52,Default,,0000,0000,0000,,we have seen with the platforms that\Nintermediate content, that act as content Dialogue: 0,0:51:56.52,0:52:00.22,Default,,0000,0000,0000,,intermediaries, such as Facebook and Google\Nand so on: they do a certain amount Dialogue: 0,0:52:00.22,0:52:08.51,Default,,0000,0000,0000,,of filtering. But if, on the other hand,\Nyou have wholesale surveillance, before the Dialogue: 0,0:52:08.51,0:52:13.69,Default,,0000,0000,0000,,fact, of end-to-end encrypted stuff, then\Nare we moving into an environment where Dialogue: 0,0:52:13.69,0:52:19.20,Default,,0000,0000,0000,,private speech from one person to another\Nis no longer permitted? You know, I don't Dialogue: 0,0:52:19.20,0:52:24.49,Default,,0000,0000,0000,,think that's the right trade-off that we\Nshould be making, because we all know from Dialogue: 0,0:52:24.49,0:52:28.78,Default,,0000,0000,0000,,hard experience that when governments say,\Nthink of the children, they're not Dialogue: 0,0:52:28.78,0:52:32.09,Default,,0000,0000,0000,,thinking of children at all. 
If they were\Nthinking of children, they would not be Dialogue: 0,0:52:32.09,0:52:36.28,Default,,0000,0000,0000,,selling weapons to Saudi Arabia and the\NUnited Arab Emirates to kill children in Dialogue: 0,0:52:36.28,0:52:41.85,Default,,0000,0000,0000,,Yemen. And they say, think about terrorism.\NBut the censorship that we are supposed to Dialogue: 0,0:52:41.85,0:52:47.88,Default,,0000,0000,0000,,use in universities around terrorism, the\Nso-called Prevent duty, is known to be Dialogue: 0,0:52:47.88,0:52:52.28,Default,,0000,0000,0000,,counterproductive. It makes Muslim\Nstudents feel alienated and marginalized. Dialogue: 0,0:52:52.28,0:52:57.48,Default,,0000,0000,0000,,So the arguments that governments use\Naround this are not in any way honest. And Dialogue: 0,0:52:57.48,0:53:01.81,Default,,0000,0000,0000,,we now have 20 years' experience of these\Ndishonest arguments. And for goodness' Dialogue: 0,0:53:01.81,0:53:05.55,Default,,0000,0000,0000,,sake, let's have a more grown-up\Nconversation about these things. Dialogue: 0,0:53:05.55,0:53:11.70,Default,,0000,0000,0000,,H: Now, you're totally right, even if I\Nhave to admit it took me a couple of Dialogue: 0,0:53:11.70,0:53:24.66,Default,,0000,0000,0000,,years, not 20, but a lot, to finally\Nunderstand. OK, I think that's it. We Dialogue: 0,0:53:24.66,0:53:31.23,Default,,0000,0000,0000,,just have another comment, and I'm thanking\Nyou for your time. Are you in an Dialogue: 0,0:53:31.23,0:53:36.68,Default,,0000,0000,0000,,assembly somewhere, hanging around\Nin the next hour or so? Maybe if someone Dialogue: 0,0:53:36.68,0:53:41.86,Default,,0000,0000,0000,,wants to talk to you, they can just pop by,\Nif you have used this 2D world Dialogue: 0,0:53:41.86,0:53:45.26,Default,,0000,0000,0000,,already.\NR: No, I haven't been using the 2D world. Dialogue: 0,0:53:45.26,0:53:50.59,Default,,0000,0000,0000,,I had some issues with my browser and\Ngetting into it. But my Dialogue: 0,0:53:50.59,0:53:55.38,Default,,0000,0000,0000,,webpage and my email address are public, and\Nanybody who wants to discuss these things Dialogue: 0,0:53:55.38,0:53:59.74,Default,,0000,0000,0000,,is welcome to get in touch with me.\NHerald: All right. So thanks a lot. Dialogue: 0,0:53:59.74,0:54:04.20,Default,,0000,0000,0000,,R: Thank you for the invitation.\NH: Yeah. Thanks a lot. Dialogue: 0,0:54:04.20,0:54:07.80,Default,,0000,0000,0000,,{\i1}rC3 postroll music{\i0} Dialogue: 0,0:54:07.80,0:54:43.05,Default,,0000,0000,0000,,Subtitles created by c3subtitles.de\Nin the year 2020. Join, and help us!