0:00:00.000,0:00:19.220
preroll music
0:00:19.220,0:00:25.180
Herald: Our next speaker is a professor[br]of security engineering at Cambridge
0:00:25.180,0:00:31.250
University. He is the author of the book[br]Security Engineering. He has done a lot of
0:00:31.250,0:00:39.890
things already. He has been inventing semi-[br]invasive attacks based on inducing photo-
0:00:39.890,0:00:45.580
currents. He has done API attacks. He[br]has done a lot of stuff. If you read his
0:00:45.580,0:00:50.520
bio, it feels like he's involved in[br]almost everything we like related to
0:00:50.520,0:00:57.084
security. So please give a huge round of applause[br]and a warm welcome to Ross Anderson and his
0:00:57.084,0:01:01.496
talk, The Sustainability of Safety,[br]Security and Privacy.
0:01:01.496,0:01:02.746
applause
0:01:02.746,0:01:16.125
Ross Anderson: Thanks. Right. It's great[br]to be here, and I'm going to tell a story
0:01:16.125,0:01:23.981
that starts a few years ago and it's about[br]the regulation of safety. Just to set the
0:01:23.981,0:01:31.405
scene, you may recall that in February[br]this year this watch, the ENOX
0:01:31.405,0:01:37.709
Safe-KID-One, suddenly got recalled. And[br]why? Well, it had unencrypted
0:01:37.709,0:01:42.790
communications with the backend server,[br]allowing unauthenticated access. And
0:01:42.790,0:01:47.006
translated into layman's language, that meant[br]that hackers could track and call your
0:01:47.006,0:01:52.260
kids, change the device ID and do[br]arbitrary bad things. So this was
0:01:52.260,0:01:57.447
immediately recalled by the European Union[br]using powers that it had under the Radio
0:01:57.447,0:02:02.388
Equipment Directive. And this was a bit of[br]a wake up call for industry, because up
0:02:02.388,0:02:07.514
until then, people active in the so-called[br]Internet of Things didn't have any idea
0:02:07.514,0:02:11.470
that, you know, if they produced an unsafe[br]device, then they could suddenly be
0:02:11.470,0:02:20.374
ordered to take it off the market. Anyway,[br]back in 2015, the European Union's
0:02:20.374,0:02:25.835
research department asked Eireann Leverett,[br]Richard Clayton and me to examine what the
0:02:25.835,0:02:32.327
IoT implied for the regulation of[br]safety, because the European institutions
0:02:32.327,0:02:36.855
regulate all sorts of things, from toys to[br]railway signals and from cars through
0:02:36.855,0:02:41.071
drugs to aircraft. And if you start having[br]software in everything, does this mean
0:02:41.071,0:02:46.310
that all these dozens of agencies suddenly[br]have to have software safety experts and
0:02:46.310,0:02:51.604
software security experts? So what does[br]this mean in institutional terms? We
0:02:51.604,0:02:57.512
produced a report for them in 2016, which[br]the Commission sat on for a year. A
0:02:57.512,0:03:03.000
version of the report came out in 2017 and[br]later that year the full report. And the
0:03:03.000,0:03:07.351
gist of our report was that once you get[br]software everywhere, safety and security
0:03:07.351,0:03:12.721
become entangled. And in fact, when you[br]think about it, the two are the same in
0:03:12.721,0:03:19.287
pretty well all the languages spoken by EU[br]citizens.
0:03:19.287,0:03:23.170
It's only English that distinguishes[br]between the two. And with
0:03:23.170,0:03:28.264
Britain leaving the EU, of course you will[br]have languages in which safety and
0:03:28.264,0:03:33.578
security become the same, throughout[br]Brussels and throughout the continent. But
0:03:33.578,0:03:38.191
anyway, how are we going to update safety[br]regulation in order to cope? This was the
0:03:38.191,0:03:44.185
problem that Brussels was trying to get[br]its head around. So one of the things that
0:03:44.185,0:03:50.619
we had been looking at over the past 15,[br]20 years is the economics of information
0:03:50.619,0:03:56.381
security, because often big, complex[br]systems fail because the incentives are
0:03:56.381,0:04:01.530
wrong. If Alice guards the system and Bob[br]bears the cost of failure, you can expect
0:04:01.530,0:04:08.374
trouble. And many of these ideas go across[br]to safety as well. Now, it's already well
0:04:08.374,0:04:13.203
known that markets do safety in some[br]industries, such as aviation, way better
0:04:13.203,0:04:18.903
than others, such as medicine. And cars[br]were dreadful for many years: for the first
0:04:18.903,0:04:23.245
80 years of the car industry, people[br]didn't bother with things like seatbelts,
0:04:23.245,0:04:28.643
and it was only after Ralph Nader's book,[br]Unsafe at Any Speed, led the Americans to
0:04:28.643,0:04:32.767
set up the National Highway Traffic[br]Safety Administration
0:04:32.767,0:04:37.410
and various court cases brought this[br]forcefully to public attention that car
0:04:37.410,0:04:42.900
safety started to become a thing. Now in[br]the EU, we've got a whole series of broad
0:04:42.900,0:04:49.292
frameworks and specific directives and[br]detailed rules, with overall 20 EU
0:04:49.292,0:04:55.074
agencies plus the UNECE in play here. So[br]how can we navigate this? Well, what we
0:04:55.074,0:05:00.035
were asked to do was to look at three[br]specific verticals and study them in some
0:05:00.035,0:05:06.507
detail so that the lessons from them could[br]be then taken to the other verticals in
0:05:06.507,0:05:17.967
which the EU operates. And cars were one[br]of those. And some of you may remember the
0:05:17.967,0:05:26.601
CarShark paper in 2011. Four guys from[br]San Diego and the University of Washington
0:05:26.601,0:05:30.720
figured out how to hack a vehicle and[br]control it remotely. And I used to have a
0:05:30.720,0:05:34.477
lovely little video of this that the[br]researchers gave me. But my Mac got
0:05:34.477,0:05:41.370
upgraded to Catalina last week and it[br]doesn't play anymore. So, verschlimmbessern,
0:05:41.370,0:05:44.354
as one says in German (improving something for the worse), right?[br]Yeah.
0:05:44.354,0:05:49.316
applause
0:05:49.316,0:05:53.717
Okay. We'll get it going sooner or later.[br]Anyway, this was largely ignored because
0:05:53.717,0:05:59.976
one little video didn't make the biscuit.[br]But in 2015, it suddenly came to the
0:05:59.976,0:06:04.643
attention of the industry because Charlie[br]Miller and Chris Valasek, two guys who had
0:06:04.643,0:06:10.870
been in the NSA's hacking team, hacked a[br]Jeep Cherokee using Chrysler's Uconnect.
0:06:10.870,0:06:14.171
And this meant that they could go down[br]through all the Chrysler vehicles in
0:06:14.171,0:06:18.548
America and look at them one by one and[br]ask, where are you? And then when they
0:06:18.548,0:06:21.676
found the vehicle that was somewhere[br]interesting, they could go in and do
0:06:21.676,0:06:26.878
things to it. And what they found was that[br]to hack a vehicle, suddenly you just
0:06:26.878,0:06:34.539
needed the vehicle's IP address. And so[br]they got a journalist into a vehicle and
0:06:34.539,0:06:38.649
they got it to slow down and had trucks[br]behind them hooting away, and eventually
0:06:38.649,0:06:43.102
they ran the vehicle off the road. And[br]when the TV footage of this got out,
0:06:43.102,0:06:47.505
suddenly, people cared. It made the front[br]pages of the press in the USA, and
0:06:47.505,0:06:52.359
Chrysler had to recall 1.4 million[br]vehicles for a software fix, which meant
0:06:52.359,0:06:58.268
actually reflashing the firmware of the[br]devices. And it cost them billions and
0:06:58.268,0:07:02.170
billions of dollars. So all of a sudden,[br]this is something to which people paid
0:07:02.170,0:07:10.675
attention. Some of you may know this chap[br]here, at least by sight. This is Martin
0:07:10.675,0:07:15.852
Winterkorn, who used to run Volkswagen.[br]And when it turned out that he had hacked
0:07:15.852,0:07:20.292
millions and millions of Volkswagen[br]vehicles by putting in evil software that
0:07:20.292,0:07:26.780
defeated emissions controls. That's what[br]happened to Volkswagen's stock price. Oh,
0:07:26.780,0:07:33.770
and he lost his job and got prosecuted. So[br]this is an important point about vehicles
0:07:33.770,0:07:37.668
and in fact, about many things in the[br]Internet of Things, or Internet of
0:07:37.668,0:07:42.246
Targets, whatever you want to call it. The[br]threat model isn't just external, it is
0:07:42.246,0:07:47.105
internal as well. There are bad people all[br]the way up and down the supply chain. Even
0:07:47.105,0:07:54.605
at the OEM. So that's the state of play in[br]cars. And we investigated that and wrote a
0:07:54.605,0:08:03.785
bit about it. Now, here's medicine. This[br]was the second thing that we looked at.
0:08:03.785,0:08:08.789
These are some pictures of the scene in[br]the intensive care unit in Swansea
0:08:08.789,0:08:13.335
Hospital. So after your car gets hacked[br]and you go off the road, this is where you
0:08:13.335,0:08:19.918
end up. And just as a car has got about 50[br]computers in it, you're now going to see
0:08:19.918,0:08:34.040
that there's quite a few computers at your[br]bedside. How many CPUs can you see? You
0:08:34.040,0:08:39.807
see, there's quite a few, about a[br]comparable number to the number of CPUs in
0:08:39.807,0:08:47.235
your car. Only here the systems[br]integration is done by the nurse, not by
0:08:47.235,0:08:55.528
the engineers at Volkswagen or Mercedes.[br]And does this cause safety problems? Oh,
0:08:55.528,0:09:06.723
sure. Here are pictures of the user[br]interface of infusion pumps taken from
0:09:06.723,0:09:13.500
Swansea's intensive care unit. And as you[br]can see, they're all different. This is a
0:09:13.500,0:09:17.736
little bit like if you suddenly had to[br]drive a car from the 1930s, an old
0:09:17.736,0:09:22.452
Lanchester, for example, and then you find[br]that the accelerator is between the brake
0:09:22.452,0:09:27.416
and the clutch, right? Honestly, there[br]used to be such cars. You can still find
0:09:27.416,0:09:33.325
them at antique car fairs. Or a Model T[br]Ford, for example, where the accelerator is
0:09:33.325,0:09:39.003
actually a lever on the dashboard and one[br]of the pedals is a gear change. And yet
0:09:39.003,0:09:44.330
you're asking nurses to operate a variety[br]of different pieces of equipment and look,
0:09:44.330,0:09:50.645
for example, at the Bodyguard 545, the one[br]on the top. To increase the dose, right,
0:09:50.645,0:09:54.527
this is the morphine that is being dripped[br]into your vein once you've had your car
0:09:54.527,0:09:58.949
crash, to increase the dose you have to[br]press 2 and to decrease it, you have to
0:09:58.949,0:10:06.882
press 0. On the Bodyguard 545 at the[br]bottom right, to increase the dose you
0:10:06.882,0:10:14.367
press 5 and to decrease it, you press 0.[br]And this leads to accidents, to fatal
0:10:14.367,0:10:21.179
accidents, a significant number of them.[br]Okay. So you might say, well, why not have
0:10:21.179,0:10:25.576
standards? Well, we have standards. We've[br]got standards which say that liter should
0:10:25.576,0:10:30.510
always be a capital L, so it is not[br]confused with a one. And then you see that
0:10:30.510,0:10:37.522
on the Bodyguard on the bottom right,[br]MILLILITERS has a capital L in green. Okay.
0:10:37.522,0:10:43.285
Well done, Mr. Bodyguard. The problem is,[br]if you look up two lines, you see 500
0:10:43.285,0:10:49.172
milliliters is in small letters. So[br]there's a standards problem. There's an
0:10:49.172,0:10:53.785
enforcement problem and there are[br]externalities, because each of these vendors
0:10:53.785,0:10:58.285
will say, well, everybody else should[br]standardize on my kit. And there are also
0:10:58.285,0:11:04.745
various other market failures. So the[br]expert who's been investigating this is my
0:11:04.745,0:11:09.515
friend Harold Thimbleby, who's a professor[br]of computer science at Swansea. And his
0:11:09.515,0:11:14.603
research shows that in hospitals, safety[br]usability failures kill about 2000 people
0:11:14.603,0:11:22.207
every year in the UK, which is about the[br]same as road accidents. And safety
0:11:22.207,0:11:29.572
usability, in other words, gets ignored[br]because the incentives are wrong. In
0:11:29.572,0:11:33.486
Britain and indeed in the European[br]institutions, people tend to follow the
0:11:33.486,0:11:39.190
FDA in America and that is captured by the[br]large medical device makers over there.
0:11:39.190,0:11:45.150
They only have two engineers. They're not[br]allowed to play with pumps, etc, etc, etc.
0:11:45.150,0:11:50.322
The curious thing here is that safety and[br]security come together. The safety of
0:11:50.322,0:11:55.316
medical devices may improve because as[br]soon as it becomes possible to hack a
0:11:55.316,0:12:02.577
medical device, then people suddenly take[br]care. So the first instance of this was when Kevin
0:12:02.577,0:12:07.334
Fu and researchers at the University of[br]Michigan showed that they could hack the
0:12:07.334,0:12:12.270
Hospira Symbiq infusion pump over[br]Wi-Fi. And this led the FDA to immediately
0:12:12.270,0:12:17.244
panic and blacklist the pump, recalling it[br]from service. But then, Kevin asked, what
0:12:17.244,0:12:21.108
about the 200 other infusion pumps that[br]are unsafe because of the things on the
0:12:21.108,0:12:27.760
previous slide? Oh, said the FDA, we couldn't[br]possibly recall all those. Then two years
0:12:27.760,0:12:33.118
ago, there was an even bigger recall. It[br]turned out that 450 000 pacemakers made by
0:12:33.118,0:12:38.939
St. Jude could similarly be hacked over[br]Wi-Fi. And so the recall was ordered. And
0:12:38.939,0:12:42.590
this is quite serious, because if you've[br]got a heart pacemaker, right, it's
0:12:42.590,0:12:47.681
implanted surgically in the muscle next to[br]your shoulder blade. And to remove that
0:12:47.681,0:12:51.740
and replace it with a new one, which they[br]do every 10 years to change the battery,
0:12:51.740,0:12:54.950
you know, is a day-case surgery procedure.[br]You have to go in there, get an
0:12:54.950,0:12:58.256
anesthetic. They have to have a[br]cardiologist ready in case you have a
0:12:58.256,0:13:05.340
heart attack. It's a big deal, right? It[br]costs maybe 3000 pounds in the UK. And so
0:13:05.340,0:13:11.000
3000 pounds times 450 000 pacemakers.[br]Multiply it by two for American health
0:13:11.000,0:13:18.510
care costs and you're talking real money.[br]So what should Europe do about this? Well,
0:13:18.510,0:13:22.970
thankfully, the European institutions have[br]been getting off their butts on this and
0:13:22.970,0:13:27.650
the medical device directives have been[br]revised. And from next year, medical
0:13:27.650,0:13:31.170
devices will have post-market[br]surveillance, a risk management plan and
0:13:31.170,0:13:37.460
ergonomic design. And here's perhaps the[br]driver for software engineering for
0:13:37.460,0:13:41.600
devices that incorporate software: "The[br]software shall be developed in accordance
0:13:41.600,0:13:45.680
with the state of the art, taking into[br]account the principles of development,
0:13:45.680,0:13:50.810
life cycle, risk management, including[br]information security, verification and
0:13:50.810,0:13:57.470
validation." So there at least we have a[br]foothold, and it continues: "Devices shall
0:13:57.470,0:14:02.150
be designed and manufactured in such a way[br]as to protect as far as possible against
0:14:02.150,0:14:06.620
unauthorized access that could hamper the[br]device from functioning as intended." Now
0:14:06.620,0:14:11.040
it's still not perfect. There's various[br]things that the manufacturers can do to
0:14:11.040,0:14:17.090
wriggle. But it's still a huge[br]improvement. The third thing that we
0:14:17.090,0:14:20.990
looked at was energy: electricity[br]substations and electrotechnical
0:14:20.990,0:14:25.700
equipment in general; there have been one[br]or two talks at this conference on that.
0:14:25.700,0:14:30.480
Basically, the problem is that you've got[br]a 40 year life cycle for these devices.
0:14:30.480,0:14:35.750
Protocols such as Modbus and DNP3 don't[br]support authentication. And the fact that
0:14:35.750,0:14:41.180
everything has gone to IP networks means[br]that, as with the Chrysler Jeeps, anybody
0:14:41.180,0:14:45.750
who knows a sensor's IP address can read from[br]it, and with an actuator's IP address, you can
0:14:45.750,0:14:51.200
activate it.
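[Editorial illustration, not from the talk: a minimal C sketch of what "you can activate it" means in practice for Modbus/TCP. The target address and register number below are invented, but the framing is the real protocol, which carries no credentials, keys or signatures of any kind.]

#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdint.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void) {
    /* Modbus/TCP has no authentication at all: reaching port 502
     * is the whole access control. */
    struct sockaddr_in dst = {0};
    dst.sin_family = AF_INET;
    dst.sin_port = htons(502);                        /* standard Modbus/TCP port */
    inet_pton(AF_INET, "192.0.2.10", &dst.sin_addr);  /* invented example address */

    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0 || connect(fd, (struct sockaddr *)&dst, sizeof dst) < 0)
        return 1;

    /* MBAP header plus "write single register" (function code 0x06). */
    uint8_t req[12] = {
        0x00, 0x01,  /* transaction id */
        0x00, 0x00,  /* protocol id: 0 means Modbus */
        0x00, 0x06,  /* length of the remaining bytes */
        0x01,        /* unit (device) id */
        0x06,        /* function: write single register */
        0x00, 0x2A,  /* register address 42 (invented) */
        0xFF, 0xFF   /* value to write */
    };
    write(fd, req, sizeof req);
    close(fd);
    return 0;
}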
0:14:51.200,0:14:56.300
So the only practical fix there is to re-perimeterise, and the[br]entrepreneurs who noticed this 10 to 15 years ago and set up companies like Belden
0:14:56.300,0:15:00.980
have now made lots and lots of money.[br]Companies like BP now have thousands of
0:15:00.980,0:15:06.050
such firewalls which isolate their[br]chemical and other plants from the
0:15:06.050,0:15:11.480
internet. So one way in which you can deal[br]with this is having one component that
0:15:11.480,0:15:14.900
connects you to the network, and you replace[br]it every five years. That's one way of
0:15:14.900,0:15:20.270
doing, if you like, sustainable security[br]for your oil refinery. But this is a lot
0:15:20.270,0:15:25.280
harder for cars, which have got multiple[br]RF interfaces. A modern car has maybe 10
0:15:25.280,0:15:31.600
interfaces in all. There is the[br]internal phone. There's the short-range radio
0:15:31.600,0:15:37.310
link for remote key entry. Those things.[br]There are links to the devices that
0:15:37.310,0:15:41.030
monitor your tire pressure. There's all[br]sorts of other things and every single one
0:15:41.030,0:15:48.350
of these has been exploited at least once.[br]And there are particular difficulties in
0:15:48.350,0:15:53.180
the auto industry because of the[br]fragmented responsibility in the supply
0:15:53.180,0:15:57.530
chain between the OEM, the tier ones and[br]the specialists who produce all the
0:15:57.530,0:16:03.380
various bits and pieces that get glued[br]together. Anyway, so the broad questions
0:16:03.380,0:16:08.480
that arise from this include who will[br]investigate incidents and to whom will
0:16:08.480,0:16:15.890
they be reported? Right? How do we embed[br]responsible disclosure? How do we bring
0:16:15.890,0:16:21.500
safety engineers and security engineers[br]together? This is an enormous project
0:16:21.500,0:16:25.580
because security engineers and safety[br]engineers use different languages. We have
0:16:25.580,0:16:31.040
different university degree programs. We[br]go to different conferences. And the world
0:16:31.040,0:16:35.450
of safety is similarly fragmented between[br]the power people, the car people, the
0:16:35.450,0:16:40.680
naval people, the signal people and so on[br]and so forth. Some companies are beginning
0:16:40.680,0:16:44.940
to get this together. The first is Bosch,[br]which brought together their safety
0:16:44.940,0:16:48.960
engineering and security engineering[br]professions. But even once you have done
0:16:48.960,0:16:53.640
that in organizational terms, how do you[br]teach a security engineer to think safety
0:16:53.640,0:16:58.950
and vice versa? Then the problem that[br]bothered the European Union, are the
0:16:58.950,0:17:04.350
regulators all going to need security[br]engineers? Right. I mean, many of these
0:17:04.350,0:17:10.250
organizations in Brussels don't even have[br]an engineer on staff, right? They are
0:17:10.250,0:17:16.260
mostly full of lawyers and policy people.[br]And then, of course, for this audience,
0:17:16.260,0:17:21.280
how do you prevent abuse of lock-in? You[br]know, in America if you've got a tractor
0:17:21.280,0:17:25.200
from John Deere? And then if you don't[br]take it to a John Deere dealer every six
0:17:25.200,0:17:29.790
months or so, it stops working. Right. And[br]if you try and hack it so you can fix it
0:17:29.790,0:17:34.740
yourself, then John Deere will try to get[br]you prosecuted. We just don't want that
0:17:34.740,0:17:41.100
kind of stuff coming over the Atlantic[br]into Europe. So we ended up with a number
0:17:41.100,0:17:46.770
of recommendations. We thought that we[br]would get vendors to self-certify for the
0:17:46.770,0:17:52.160
CE mark that products could be patched if[br]need be. That turned out to be not viable.
0:17:52.160,0:17:57.100
We then came up with another idea that[br]things should be secure by default for the
0:17:57.100,0:18:00.630
update to the Radio Equipment Directive.[br]And that didn't get through the European
0:18:00.630,0:18:06.980
Parliament either. In fact, it was Mozilla[br]that lobbied against it. Eventually we got
0:18:06.980,0:18:11.850
something through which I'll discuss in a[br]minute. We talked about requiring a secure
0:18:11.850,0:18:15.210
development lifecycle with vulnerability[br]management because we've already got
0:18:15.210,0:18:21.330
standards for that. We talked about[br]creating a European security engineering
0:18:21.330,0:18:25.830
agency, so that there would be people in[br]Brussels to support policymakers. And the
0:18:25.830,0:18:30.540
reaction to that, a year and a half ago,[br]was to arrange for ENISA to be allowed to
0:18:30.540,0:18:35.040
open an office in Brussels so that they[br]can hopefully build a capability there
0:18:35.040,0:18:40.200
with some technical people who can support[br]policymakers. We recommended extending the
0:18:40.200,0:18:45.830
product liability directive to services.[br]There is enormous pushback on that.
0:18:45.830,0:18:50.430
Companies like Google and Facebook and so[br]on don't like the idea that they should be
0:18:50.430,0:18:55.620
as liable for mistakes made by Google[br]Maps, as for example, Garmin is liable for
0:18:55.620,0:19:00.930
mistakes made by the navigators. And then[br]there's the whole business of how do you
0:19:00.930,0:19:05.220
take the information that European[br]institutions already have on breaches and
0:19:05.220,0:19:10.140
vulnerabilities and report this not just[br]to ENISA, but to safety regulators and
0:19:10.140,0:19:14.160
users, because somehow you've got to[br]create a learning system. And this is
0:19:14.160,0:19:19.050
perhaps one of the big pieces of work to[br]be done. How do you take, I mean, once all
0:19:19.050,0:19:23.550
cars are sort of semi-intelligent, once[br]everybody's got telemetry and once there
0:19:23.550,0:19:28.050
are, you know, gigabytes of data[br]everywhere, then whenever there's a car
0:19:28.050,0:19:34.050
crash, the data have to go to all sorts of[br]places, to the police, to the insurers, to
0:19:34.050,0:19:40.350
courts, and then, of course, up to the car[br]makers and regulators and component
0:19:40.350,0:19:45.060
suppliers and so on. How do you design the[br]system that will cause the right data to
0:19:45.060,0:19:49.680
get to the right place, which will still[br]respect people's privacy rights and all
0:19:49.680,0:19:54.900
the various other legal obligations? This[br]is a huge project and nobody has really
0:19:54.900,0:19:59.880
started to think yet about how it's going[br]to be done, right. At present, if you've
0:19:59.880,0:20:03.780
got a crash in a car like a Tesla, which[br]has got very good telemetry, you basically
0:20:03.780,0:20:07.200
have to take Tesla to court to get the[br]data because otherwise they won't hand it
0:20:07.200,0:20:13.320
over. Right. We need a better regime for[br]this. And that at present is a blank
0:20:13.320,0:20:18.910
slate. It's up to us, I suppose, to figure[br]out how such a system should be designed
0:20:18.910,0:20:23.870
and built, and it will take many years to[br]do it, right. If you want a safe system, a
0:20:23.870,0:20:32.940
system that learns, this is what it is going[br]to involve. But there's one thing that
0:20:32.940,0:20:37.920
struck us after we'd done this work, after[br]we delivered this to the European
0:20:37.920,0:20:41.940
Commission, after I'd gone to Brussels and[br]given the talk to dozens and dozens of
0:20:41.940,0:20:49.060
security guys. Richard Clayton and I went[br]to Schloss Dagstuhl for a weeklong seminar
0:20:49.060,0:20:53.010
on some other security topic. And we were[br]just chatting one evening and we said,
0:20:53.010,0:21:00.250
well, you know, what did we actually learn[br]from this whole exercise on
0:21:00.250,0:21:07.090
standardization and certification? Well,[br]it's basically this: there are two
0:21:07.090,0:21:12.790
types of secure things that we currently[br]know how to make. The first is stuff like
0:21:12.790,0:21:17.890
your phone or your laptop, which is secure[br]because you patch it every month. Right.
0:21:17.890,0:21:22.180
But then you have to throw it away after[br]three years because Larry and Sergey don't
0:21:22.180,0:21:35.920
have enough money to maintain three[br]versions of Android. And then we've got
0:21:35.920,0:21:41.460
things like cars and medical devices where[br]we test them to death before release and
0:21:41.460,0:21:46.600
we don't connect them to the Internet, and[br]we almost never patch them unless Charlie
0:21:46.600,0:21:52.750
Miller and Chris Valasek get a go at[br]your car, that is. So what's gonna happen
0:21:52.750,0:21:59.050
to support costs? Now that we're starting[br]to patch cars and you have to patch cars
0:21:59.050,0:22:02.890
because they're online. And once things are[br]online, right, anybody in the world can
0:22:02.890,0:22:06.760
attack us. If a vulnerability is[br]discovered, it can be scaled and something
0:22:06.760,0:22:11.150
that you could previously ignore suddenly[br]becomes something that you have to fix.
0:22:11.150,0:22:14.650
And if you have to pull all your cars[br]into a garage to patch them, that costs
0:22:14.650,0:22:18.490
real money. So you need to be able to[br]patch them over the air. So all of a
0:22:18.490,0:22:26.920
sudden cars become like computers or[br]phones. So what is this going to mean? So
0:22:26.920,0:22:34.030
this is the trilemma. If you've got a[br]standard safety life cycle, there's no
0:22:34.030,0:22:38.150
patching. You get safety and[br]sustainability, but you can't go online
0:22:38.150,0:22:43.600
because you'll get hacked. And if you've got[br]the standard security lifecycle, you're
0:22:43.600,0:22:50.650
patching, but that breaks the safety[br]certification, so that's a problem. And if
0:22:50.650,0:22:54.730
you get patching plus redoing safety[br]certification with current methods, then
0:22:54.730,0:22:58.930
the cost of maintaining your safety rating[br]can be sky high. So here's the big
0:22:58.930,0:23:09.770
problem. How do you get safety, security[br]and sustainability at the same time? Now
0:23:09.770,0:23:13.040
this brings us to another thing that a[br]number of people at this congress are
0:23:13.040,0:23:17.960
interested in: the right to repair. This[br]is the Centennial Light, right? It's been
0:23:17.960,0:23:24.230
running since 1901. Right. It's in[br]Livermore in California. It's kind of dim,
0:23:24.230,0:23:30.200
but you can go there and you can see it.[br]Still there. In 1924, the three firms that
0:23:30.200,0:23:34.790
dominated the light bulb business, GE, Osram[br]and Philips, agreed to reduce average bulb
0:23:34.790,0:23:39.590
lifetime from some 2500 hours to 1000[br]hours. Why? In order to sell more of
0:23:39.590,0:23:46.430
them. And one of the things that's come[br]along with CPUs and communications and so
0:23:46.430,0:23:52.360
on, with smart stuff, to use that horrible[br]word, is that firms are now using online
0:23:52.360,0:23:58.340
mechanisms, software and cryptographic[br]mechanisms in order to make it hard or
0:23:58.340,0:24:03.860
even illegal to fix products. And I[br]believe that there's a case against Apple
0:24:03.860,0:24:16.790
going on in France about this. Now, you[br]might not think it's something that
0:24:16.790,0:24:20.780
politicians will get upset about, that you[br]have to throw away your phone after three
0:24:20.780,0:24:25.070
years instead of after five years. But[br]here's something you really should worry
0:24:25.070,0:24:31.640
about. Vehicle life cycle economics,[br]because the lifetimes of cars in Europe
0:24:31.640,0:24:36.990
have about doubled in the last 40 years.[br]And the average age of a car in Britain
0:24:36.990,0:24:46.530
when it is scrapped is now almost 15 years.[br]So what's going to happen once you've got,
0:24:46.530,0:24:54.110
you know, wonderful self-driving software[br]in all the cars. Well, a number of big car
0:24:54.110,0:25:00.200
companies, including in this country, were[br]taking the view two years ago that they
0:25:00.200,0:25:06.320
wanted people to scrap their cars after[br]six years and buy a new one. Hey, makes
0:25:06.320,0:25:10.100
business sense, doesn't it? If you're Mr.[br]Mercedes, your business model is if the
0:25:10.100,0:25:13.790
customer is rich, you sell him a three[br]year lease on a new car. And if the
0:25:13.790,0:25:18.370
customer is not quite so rich, you sell[br]him a three year lease on a Mercedes
0:25:18.370,0:25:23.715
approved used car. And if somebody drives a[br]seven year old Mercedes, that's thought
0:25:23.715,0:25:31.620
crime. You know, they should emigrate to[br]Africa or something. So this was the view
0:25:31.620,0:25:38.070
of the vehicle makers. But here's the rub.[br]The embedded CO2 costs of a car often
0:25:38.070,0:25:43.380
exceed its lifetime fuel burn. My best[br]estimate for the embedded CO2 cost of an
0:25:43.380,0:25:48.030
E-Class Mercedes is 35 tons. So go and[br]work out, you know, how many liters per
0:25:48.030,0:25:53.760
100 kilometers and how many kilometers[br]it's gonna run in 15 years. And you come
0:25:53.760,0:25:59.710
to the conclusion that if you get a six[br]year lifetime, then maybe you are
0:25:59.710,0:26:07.180
decreasing the range of the car from 300[br]000 kilometers to 100 000 kilometers. And
0:26:07.180,0:26:13.080
so you're approximately doubling the[br]overall CO2 emissions, taking the whole
0:26:13.080,0:26:16.710
life cycle: not just scope one, but[br]scope two and scope three, the
0:26:16.710,0:26:22.320
embedded stuff as well.
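[Editorial note: a rough worked version of that calculation, purely illustrative. The fuel-consumption and emission factors below are assumed round numbers, not figures from the talk. Per-kilometer emissions are the embedded CO2 spread over the lifetime mileage, plus the fuel burn:

    CO2 per km = E/L + f * c

With E = 35,000 kg embedded CO2, f = 7 liters per 100 km and c = 2.3 kg CO2 per liter of petrol:

    L = 300,000 km:  35,000/300,000 + 0.07 x 2.3 = 0.12 + 0.16 = 0.28 kg/km
    L = 100,000 km:  35,000/100,000 + 0.07 x 2.3 = 0.35 + 0.16 = 0.51 kg/km

That is a factor of about 1.8, roughly the doubling the talk describes.]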
0:26:22.320,0:26:26.820
And then there are other consequences. What about Africa,[br]where most vehicles are imported second-hand? If you go to Nairobi, all the cars
0:26:26.820,0:26:31.110
are between 10 and 20 years old, right?[br]They arrive in the docks in Mombasa when
0:26:31.110,0:26:35.310
they're already 10 years old and people[br]drive them for 10 years and then they end
0:26:35.310,0:26:39.090
up in Uganda or Chad or somewhere like[br]that. And they're repaired for as long as
0:26:39.090,0:26:43.560
they're repairable. What's going to happen[br]to road transport in Africa if all of a
0:26:43.560,0:26:48.660
sudden there's a software time bomb that[br]causes cars to self-destruct ten years
0:26:48.660,0:26:56.040
after they leave the showroom? And if there[br]isn't, what about safety? I don't know
0:26:56.040,0:27:00.420
what the rules are here, but in Britain I[br]have to get my car through a safety
0:27:00.420,0:27:05.010
examination every year, once it's more[br]than three years old. And it's entirely
0:27:05.010,0:27:09.510
foreseeable that within two or three years[br]the mechanic will want to check that the
0:27:09.510,0:27:15.880
software is up to date. So once the[br]software updates are no longer available,
0:27:15.880,0:27:24.580
that's basically saying this car must now[br]be exported or scrapped. I couldn't resist
0:27:24.580,0:27:29.120
the temptation to put in a cartoon:[br]"My engine's making a weird noise."
0:27:29.120,0:27:32.490
"Can you take a look?"[br]"Sure. Just pop the hood. Oh, the hood
0:27:32.490,0:27:36.600
latch is also broken. Okay, just pull up[br]to that big pit and push the car in. We'll
0:27:36.600,0:27:41.400
go get a new one."[br]Right? This is if we start treating cars
0:27:41.400,0:27:53.250
the way we treat consumer electronics. So[br]what's a reasonable design lifetime? Well,
0:27:53.250,0:27:58.260
with cars, the way it is going is maybe 18[br]years, say 10 years from the sale of the
0:27:58.260,0:28:03.660
last products in a model range. Domestic[br]appliances: 10 years because of the spares
0:28:03.660,0:28:09.720
obligation plus store life, say 15.[br]Medical devices: If a pacemaker lives for
0:28:09.720,0:28:16.410
10 years, then maybe you need 20 years. For[br]electricity substations, even more. So
0:28:16.410,0:28:22.500
from the point of view of engineers, the[br]question is, how can you see to it that
0:28:22.500,0:28:27.690
your software will be patchable for 20[br]years? So as we put it in the abstract, if
0:28:27.690,0:28:34.830
you are writing software now for a car[br]that will go on sale in 2023, what sort of
0:28:34.830,0:28:39.090
languages, what sort of toolchain should[br]you use? What sort of crypto should you
0:28:39.090,0:28:46.390
use so that you're sure you'll still be[br]able to patch that software in 2043? And
0:28:46.390,0:28:50.040
that isn't just about the languages and[br]compilers and linkers and so on. That's
0:28:50.040,0:28:59.490
about the whole ecosystem. So what did the[br]EU do? Well, I'm pleased to say that at
0:28:59.490,0:29:05.800
the third attempt, the EU managed to get[br]some law through on this. Directive 2019/771,
0:29:05.800,0:29:10.440
on smart goods, passed this year, says that buyers[br]of goods with digital elements are
0:29:10.440,0:29:15.570
entitled to necessary updates for two[br]years or for a longer period of time if
0:29:15.570,0:29:20.880
this is a reasonable expectation of the[br]customer. This is what they managed to get
0:29:20.880,0:29:24.990
through the parliament. And what we would[br]expect is that this will mean at least 10
0:29:24.990,0:29:29.520
years for cars, ovens, fridges, air[br]conditioning and so on because of existing
0:29:29.520,0:29:35.100
provisions about physical spares. And[br]what's more, the trader has got the burden
0:29:35.100,0:29:39.720
of proof in the first couple of years if[br]there are disputes. So there is now the
0:29:39.720,0:29:48.160
legal framework there to create the demand[br]for long term patching of software. And
0:29:48.160,0:29:54.570
now it's kind of up to us. If the durable[br]goods we're designing today are still
0:29:54.570,0:30:00.030
working in 2039, then a whole bunch of[br]things are gonna have to change. Computer
0:30:00.030,0:30:04.650
science has always been about managing[br]complexity ever since the very first high
0:30:04.650,0:30:09.780
level languages and the history goes on[br]from there through types and objects and
0:30:09.780,0:30:14.730
tools like git and Jenkins and Coverity.[br]So here's a question for the computer
0:30:14.730,0:30:19.560
scientists here: what else is going to be[br]needed for sustainable computing once we
0:30:19.560,0:30:31.440
have software in just about everything? So[br]research topics to support 20-year
0:30:31.440,0:30:35.670
patching include a more stable and[br]powerful toolchain. We know how complex
0:30:35.670,0:30:41.730
this can be from crypto, looking at the[br]history of the last 20 years of TLS. Cars
0:30:41.730,0:30:45.480
teach that it's difficult and expensive to[br]sustain all the different test
0:30:45.480,0:30:50.790
environments you have for different models[br]of cars. Control systems teach that
0:30:50.790,0:30:54.480
you can make small changes to the[br]architecture, which will then limit what
0:30:54.480,0:30:59.640
you have to patch. Android teaches: how do[br]you go about motivating OEMs to patch
0:30:59.640,0:31:04.140
products that they no longer sell. In this[br]case, it's European law, but there's maybe
0:31:04.140,0:31:10.840
other things you can do too. What does it[br]mean for those of us who teach and
0:31:10.840,0:31:15.090
research in universities? Well, since[br]2016, I've been teaching safety and
0:31:15.090,0:31:20.490
security together in the same course to[br]first-year undergraduates, because
0:31:20.490,0:31:25.560
presenting these ideas together in[br]lockstep will help people to think in more
0:31:25.560,0:31:30.300
unified terms about how it all holds[br]together. In research terms, we have
0:31:30.300,0:31:34.590
been starting to look at what we can do to[br]make the tool chain more sustainable. For
0:31:34.590,0:31:39.750
example, one of the problems that you have[br]if you maintain crypto software is that
0:31:39.750,0:31:44.550
every so often the compiler writers get[br]a little bit smarter, and the compiler
0:31:44.550,0:31:48.450
figures out that these extra padding[br]instructions that you put in, to make
0:31:48.450,0:31:53.970
the loops of your crypto routines run in[br]constant time and to scrub the contents of
0:31:53.970,0:31:58.130
round keys once they are no longer in use,[br]are not doing any real work, and it
0:31:58.130,0:32:02.840
removes them. And all of a sudden from one[br]day to the next, you find that your crypto
0:32:02.840,0:32:07.520
has sprung a huge timing leak, and then[br]you have to rush to get somebody out of
0:32:07.520,0:32:11.900
bed to fix the tool chain.
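[Editorial illustration: a minimal C sketch of the key-scrubbing half of that problem. generate_key and do_crypto are hypothetical placeholders, and the volatile function pointer shown is one widespread general workaround, not necessarily the mechanism of the paper mentioned below.]

#include <stddef.h>
#include <string.h>

/* Hypothetical stand-ins so the sketch is self-contained. */
static void generate_key(unsigned char *buf, size_t len) {
    for (size_t i = 0; i < len; i++) buf[i] = (unsigned char)i;
}
static void do_crypto(const unsigned char *key) { (void)key; }

void naive(void) {
    unsigned char key[32];
    generate_key(key, sizeof key);
    do_crypto(key);
    /* 'key' is never read again, so dead-store elimination lets an
     * optimising compiler delete this memset entirely; the key
     * material then stays in memory. */
    memset(key, 0, sizeof key);
}

/* One widespread workaround: call memset through a volatile function
 * pointer. The compiler can no longer prove the call has no effect,
 * so it must keep it. C11's memset_s and BSD's explicit_bzero exist
 * for the same reason. */
static void *(*const volatile memset_v)(void *, int, size_t) = memset;

void scrubbed(void) {
    unsigned char key[32];
    generate_key(key, sizeof key);
    do_crypto(key);
    memset_v(key, 0, sizeof key);
}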
0:32:11.900,0:32:17.360
So one of the things that we thought was that better[br]ways for programmers to communicate intent might help. And so there's a paper by
0:32:17.360,0:32:21.800
Laurent Simon, David Chisnall and me,[br]where we looked at zeroising sensitive
0:32:21.800,0:32:27.830
variables and doing constant-time loops[br]with a plugin in LLVM. And that led to a
0:32:27.830,0:32:32.810
EuroS&P paper a year and a half ago: "What[br]you get is what you C", and there's a plug
0:32:32.810,0:32:40.770
in that you can download and play[br]with. Macro-scale sustainable security is
0:32:40.770,0:32:45.980
going to require a lot more. Despite the[br]problems in the aviation industry with the
0:32:45.980,0:32:51.800
737Max, the aerospace industry still has[br]got a better feedback loop of learning
0:32:51.800,0:32:59.280
from incidents and accidents. And we don't[br]have that yet in any of the fields like
0:32:59.280,0:33:05.360
cars and so on. It's going to be needed.[br]What can we use as a guide? Security
0:33:05.360,0:33:13.070
economics is one set of intellectual tools[br]that can be applied. We've known for
0:33:13.070,0:33:18.020
almost 20 years now that complex socio-[br]technical systems often fail because of
0:33:18.020,0:33:22.490
poor incentives. If Alice guards a system[br]and Bob pays the cost of failure, you can
0:33:22.490,0:33:27.740
expect trouble. And so security economics[br]researchers can explain platform security
0:33:27.740,0:33:34.040
problems, patching cycles, liability games[br]and so on. And the same principles apply
0:33:34.040,0:33:38.750
to safety and will become even more[br]important as safety and security become
0:33:38.750,0:33:43.940
entangled. Also, we'll get even more data[br]and we'll be able to do more research and
0:33:43.940,0:33:51.080
get more insights from the data. So where[br]does this lead? Well, our papers, Making
0:33:51.080,0:33:56.240
Security Sustainable, and the thing that[br]we did for the EU, Standardisation and
0:33:56.240,0:34:00.500
Certification of the Internet of Things,[br]are on my web page together with other
0:34:00.500,0:34:04.910
relevant papers on topics around[br]sustainability from, you know, smart
0:34:04.910,0:34:11.280
metering to pushing back on wildlife[br]crime. And that's the first place to go if
0:34:11.280,0:34:15.540
you're interested in this stuff. And[br]there's also our blog. And if you're
0:34:15.540,0:34:20.790
interested in these kinds of issues at the[br]interface between technology and policy of
0:34:20.790,0:34:25.980
how incentives work and how they very[br]often fail when it comes to complex socio-
0:34:25.980,0:34:31.240
technical systems, then the Workshop[br]on the Economics of Information Security
0:34:31.240,0:34:36.750
in Brussels next June is the place where[br]academics interested in these topics tend
0:34:36.750,0:34:47.400
to meet up. So perhaps we'll see a few of[br]you there in June. And with that, there's
0:34:47.400,0:34:53.250
a book on security engineering which goes[br]over some of these things and there's a
0:34:53.250,0:34:56.127
third edition in the pipeline.
0:34:56.127,0:34:58.577
H: Thank you very much,[br]Ross Anderson, for the talk.
0:34:58.577,0:35:08.787
applause
0:35:08.787,0:35:13.290
We will start the Q&A session a little bit[br]differently than you are used to: Ross has a
0:35:13.290,0:35:18.807
question for you. So he told me there will[br]be a third edition of his book and he is
0:35:18.807,0:35:24.745
not yet sure about the cover he wants to[br]have. So you are going to choose. And so
0:35:24.745,0:35:29.545
that the people on the stream also can[br]hear your choice, I would like you to make
0:35:29.545,0:35:36.610
a humming noise for the cover which you[br]like more. You will first see both covers.
0:35:36.610,0:35:43.570
R: Cover 1, and cover 2.[br]H: So, who of you would prefer the
0:35:43.570,0:35:52.510
first cover?[br]applause Come on.
0:35:52.510,0:36:01.850
And the second choice. louder applause[br]OK. I think we have a clear favorite here
0:36:01.850,0:36:04.517
from the audience, so it would[br]be the second cover.
0:36:04.517,0:36:08.690
R: Thanks.[br]H: And we will look forward to seeing this
0:36:08.690,0:36:13.727
cover next year then. So if you now have[br]questions yourself, you can line up in
0:36:13.727,0:36:18.867
front of the microphones. You will find[br]eight distributed in the hall, three in
0:36:18.867,0:36:27.070
the middle, two on the sides. Signal Angel[br]has the first question from the Internet.
0:36:27.070,0:36:31.560
Person1: The first question is, is there a[br]reason why you didn't include aviation
0:36:31.560,0:36:36.278
in your research?[br]R: We were asked to choose three fields,
0:36:36.278,0:36:40.649
and the three fields I chose were the ones[br]in which we'd worked most recently.
0:36:40.649,0:36:46.413
I did some work in avionics, but that was[br]40 years ago, so I'm no longer current.
0:36:46.413,0:36:49.096
H: Alright, a question from microphone[br]number two, please.
0:36:49.096,0:36:54.097
Person2: Hi. Thanks for your talk. What[br]I'm wondering most about is where do you
0:36:54.097,0:37:00.750
believe the balance will fall in the fight[br]between privacy, the desire of the
0:37:00.750,0:37:06.582
manufacturer to prove that it wasn't their[br]fault and the right to repair?
0:37:06.582,0:37:10.120
R: Well, this is an immensely complex[br]question and it's one that we'll be
0:37:10.120,0:37:15.104
fighting about for the next 20 years. But[br]all I can suggest is that we study the
0:37:15.104,0:37:19.670
problems in detail, that we collect the[br]data that we need to say coherent things
0:37:19.670,0:37:24.279
to policymakers and that we use the[br]intellectual tools that we have, such as
0:37:24.279,0:37:28.760
the economics of security in order to[br]inform these arguments. That's the best
0:37:28.760,0:37:32.601
way that we can fight these fights, you[br]know, by being clearheaded and by being
0:37:32.601,0:37:35.873
informed.[br]H: Thank you. A question from microphone
0:37:35.873,0:37:44.836
number four, please. Can you switch on[br]microphone number four?
0:37:44.836,0:37:51.380
Person3: Oh, sorry. Hello. Thank you for[br]the talk. As a software engineer, arguably
0:37:51.380,0:37:57.049
I can cause much more damage than a single[br]medical professional simply because of the
0:37:57.049,0:38:04.043
multiplication of my work. Why is it that[br]there is still no conversation about
0:38:04.043,0:38:09.236
software engineers carrying liability[br]insurance and being liable for the
0:38:09.236,0:38:13.485
work they do?[br]R: Well, that again is a complex question.
0:38:13.485,0:38:16.874
And there are some countries like Canada[br]where being a professional engineer gives
0:38:16.874,0:38:21.705
you a particular status. I think it's[br]cultural as much as anything else, because
0:38:21.705,0:38:27.365
our trade has always been freewheeling,[br]it's always been growing very quickly. And
0:38:27.365,0:38:31.969
throughout my lifetime it's been sucking[br]up a fair proportion of science graduates.
0:38:31.969,0:38:35.058
If you were to restrict software[br]engineering to people with degrees in
0:38:35.058,0:38:38.377
computer science, then we would have an[br]awful lot fewer people. I wouldn't be
0:38:38.377,0:38:43.193
here, for example, because my first[br]degree was in pure math.
0:38:43.193,0:38:46.744
H: All right, the question from microphone[br]number one, please.
0:38:46.744,0:38:52.646
Person4: Hi. Thank you for the talk. My[br]question is also about aviation, because
0:38:52.646,0:38:59.399
as I understand it, a lot of the[br]retired aircraft and other equipment is
0:38:59.399,0:39:06.313
dumped into the so-called developing[br]countries. And with the modern technology
0:39:06.313,0:39:12.180
and the modern aircraft, the issue of[br]maintaining software, or patching it, would
0:39:12.180,0:39:19.092
still be in question. But how do we see[br]that rolling out also for the so-called
0:39:19.092,0:39:24.630
third world countries? Because I am a[br]Pakistani journalist, but this worries me
0:39:24.630,0:39:31.925
a lot because we get so many devices[br]dumped into Pakistan after they're retired
0:39:31.925,0:39:36.706
and people just use them. I mean, it's a[br]country that cannot even afford a license
0:39:36.706,0:39:41.464
for an operating system. So maybe you could[br]shed some light on that. Thank you.
0:39:41.464,0:39:45.547
R: Well, there are some positive things[br]that can be done. Development IT is
0:39:45.547,0:39:50.841
something in which we are engaged. You can[br]find the details on my web site, but good
0:39:50.841,0:39:55.808
things don't necessarily have to involve[br]IT. One of my school friends became an
0:39:55.808,0:40:00.695
anesthetist and after he retired, he[br]devoted his energies to developing an
0:40:00.695,0:40:05.693
infusion pump for use in less developed[br]countries, which was very much cheaper
0:40:05.693,0:40:09.339
than the ones that we saw on the screen[br]there. And it's also safe, rugged,
0:40:09.339,0:40:16.082
reliable and designed for use in[br]places like Pakistan and Africa and South
0:40:16.082,0:40:22.183
America. So the appropriate technology[br]doesn't always have to be the whizziest,
0:40:22.183,0:40:29.192
right. And if you've got very bad roads,[br]as in India, in Africa, and relatively
0:40:29.192,0:40:33.883
cheap labor, then perhaps autonomous[br]cars should not be a priority.
0:40:33.883,0:40:35.801
Person4: Thank you.[br]H: All right. We have another question
0:40:35.801,0:40:40.694
from the Internet, the Signal Angel, please?[br]Person5: Why force updates by law?
0:40:40.694,0:40:45.355
Wouldn't it be better to prohibit the[br]important things from accessing the
0:40:45.355,0:40:50.348
Internet by law?[br]R: Well, politics is the art of the
0:40:50.348,0:40:56.635
possible. And you can only realistically[br]talk about a certain number of things at
0:40:56.635,0:41:00.895
any one time in any political culture, the[br]so-called Overton window. Now, if
0:41:00.895,0:41:05.931
you talked about banning technology,[br]banning cars that are connected to the
0:41:05.931,0:41:10.288
Internet, as a minister you would be[br]immediately shouted out of office as being
0:41:10.288,0:41:14.422
a Luddite, right. So it's just not[br]possible to go down that path. What is
0:41:14.422,0:41:19.574
possible is to go down the path of saying,[br]look, if you've got a company that imports
0:41:19.574,0:41:24.323
lots of dangerous toys that harm kids or[br]dangerous CCTV cameras that get recruited into
0:41:24.323,0:41:28.380
a botnet, and if you don't meet European[br]regulations, we'll put the containers on
0:41:28.380,0:41:32.009
the boat back to China. That's just[br]something that can be solved politically.
0:41:32.009,0:41:36.940
And given the weakness of the car industry[br]after the emission standard scandal, it
0:41:36.940,0:41:40.775
was possible for Brussels to push through[br]something that the car industry really
0:41:40.775,0:41:46.376
didn't like. And even then, that[br]was the third attempt to do something
0:41:46.376,0:41:52.309
about it. So, again, it's what you can[br]practically achieve in real-world politics.
0:41:52.309,0:41:56.364
H: All right. We have more questions.[br]Microphone number four, please.
0:41:56.364,0:42:01.189
Person6: Hi, I'm an automotive cyber security[br]analyst and embedded software engineer.
0:42:01.189,0:42:06.895
My question is about the ISO 21434 Automotive[br]Cyber Security Standard. Are you aware of
0:42:06.895,0:42:09.995
the standard that's coming[br]out next year? Hopefully.
0:42:09.995,0:42:13.588
R: I've not done any significant work with[br]it. Friends in the motor industry have
0:42:13.588,0:42:17.589
talked about it, but it's not something[br]we've engaged with in detail.
0:42:17.589,0:42:21.484
Person6: So I guess my point is not so[br]much a question, but a little bit of a
0:42:21.484,0:42:25.830
pushback: a lot of the things you[br]talked about are being worked on and are
0:42:25.830,0:42:32.990
being considered. Over the years, updating[br]is going to be mandated. A 30- or 40-
0:42:32.990,0:42:38.220
year lifecycle of the vehicle is being[br]considered by engineers. Why not? Nobody I
0:42:38.220,0:42:44.634
know talks about a six-year lifecycle.[br]You know, that's back in the 80s,
0:42:44.634,0:42:49.010
maybe when we talked about planned[br]obsolescence. But that's just not a thing.
0:42:49.010,0:42:53.695
So I'm not really sure where that language[br]is coming from, to be honest with you.
0:42:53.695,0:42:57.590
R: Well, I've been to closed motor industry[br]conferences where senior executives have
0:42:57.590,0:43:02.990
been talking about just that in terms of[br]autonomous vehicles. So, yeah, it's
0:43:02.990,0:43:09.860
something that we've disabused them of.[br]H: All right. So time is unfortunately up,
0:43:09.860,0:43:14.570
but I think Ross will be available[br]afterwards for questions as well, so you can
0:43:14.570,0:43:19.300
meet him here on the side. Please give a[br]huge round of applause for Ross Anderson.
0:43:19.300,0:43:20.780
applause
0:43:20.780,0:43:24.211
R: Thanks. And thank you[br]for choosing the cover.
0:43:24.211,0:43:26.381
36c3 postroll music
0:43:26.381,0:43:52.000
Subtitles created by c3subtitles.de[br]in the year 2021. Join, and help us!