1
00:00:00,000 --> 00:00:19,220
preroll music
2
00:00:19,220 --> 00:00:25,180
Herald: Our next speaker is a professor
of security engineering at Cambridge
3
00:00:25,180 --> 00:00:31,250
University. He is the author of the book
Security Engineering. He has done a lot of
4
00:00:31,250 --> 00:00:39,890
things already. He has been inventing semi-
invasive attacks based on inducing
5
00:00:39,890 --> 00:00:45,580
photocurrents. He has done API attacks. He
has done a lot of stuff. If you read his
6
00:00:45,580 --> 00:00:50,520
bio, it feels like he's involved in
almost everything we like related to
7
00:00:50,520 --> 00:00:57,084
security. So please give a huge round of applause and
a warm welcome to Ross Anderson and his
8
00:00:57,084 --> 00:01:01,496
talk, The Sustainability of Safety,
Security and Privacy.
9
00:01:01,496 --> 00:01:02,746
applause
10
00:01:02,746 --> 00:01:16,125
Ross Anderson: Thanks. Right. It's great
to be here, and I'm going to tell a story
11
00:01:16,125 --> 00:01:23,981
that starts a few years ago and it's about
the regulation of safety. Just to set the
12
00:01:23,981 --> 00:01:31,405
scene, you may recall that in February
this year this watch, the Enox
13
00:01:31,405 --> 00:01:37,709
Safe-KID-One, suddenly got recalled. And
why? Well, it had unencrypted
14
00:01:37,709 --> 00:01:42,790
communications with the backend server
allowing unauthenticated access. And
15
00:01:42,790 --> 00:01:47,006
translated into layman's language, that meant
that hackers could track and call your
16
00:01:47,006 --> 00:01:52,260
kids, change the device ID and do
arbitrary bad things. So this was
17
00:01:52,260 --> 00:01:57,447
immediately recalled by the European Union
using powers that it had under the Radio
18
00:01:57,447 --> 00:02:02,388
Equipment Directive. And this was a bit of
a wake up call for industry, because up
19
00:02:02,388 --> 00:02:07,514
until then, people active in the so-called
Internet of Things didn't have any idea
20
00:02:07,514 --> 00:02:11,470
that, you know, if they produced an unsafe
device, then they could suddenly be
21
00:02:11,470 --> 00:02:20,374
ordered to take it off the market. Anyway,
back in 2015, the European Union's
22
00:02:20,374 --> 00:02:25,835
research department asked Eireann Leverett,
Richard Clayton and me to examine what
23
00:02:25,835 --> 00:02:32,327
IoT would imply for the regulation of
safety, because the European institutions
24
00:02:32,327 --> 00:02:36,855
regulate all sorts of things, from toys to
railway signals and from cars through
25
00:02:36,855 --> 00:02:41,071
drugs to aircraft. And if you start having
software in everything, does this mean
26
00:02:41,071 --> 00:02:46,310
that all these dozens of agencies suddenly
start to have software safety experts and
27
00:02:46,310 --> 00:02:51,604
software security experts? So what does
this mean in institutional terms? We
28
00:02:51,604 --> 00:02:57,512
produced a report for them in 2016, which
the commission sat on for a year. A
29
00:02:57,512 --> 00:03:03,000
version of the report came out in 2017 and
later that year the full report. And the
30
00:03:03,000 --> 00:03:07,351
gist of our report was once you get
software everywhere, safety and security
31
00:03:07,351 --> 00:03:12,721
become entangled. And in fact, when you
think about it, the two are the same in
32
00:03:12,721 --> 00:03:19,287
pretty well all the languages spoken by EU
citizens. speaks other languages.
33
00:03:19,287 --> 00:03:23,170
It's only English that distinguishes
between the two. And with
34
00:03:23,170 --> 00:03:28,264
Britain leaving the EU, of course, you will
only have languages in which safety and
35
00:03:28,264 --> 00:03:33,578
security are the same, throughout
Brussels and throughout the continent. But
36
00:03:33,578 --> 00:03:38,191
anyway, how are we going to update safety
regulation in order to cope? This was the
37
00:03:38,191 --> 00:03:44,185
problem that Brussels was trying to get
its head around. So one of the things that
38
00:03:44,185 --> 00:03:50,619
we had been looking at over the past 15,
20 years is the economics of information
39
00:03:50,619 --> 00:03:56,381
security, because often big complex
systems fail because the incentives are
40
00:03:56,381 --> 00:04:01,530
wrong. If Alice guards the system and Bob
pays the cost of failure, you can expect
41
00:04:01,530 --> 00:04:08,374
trouble. And many of these ideas go across
to safety as well. Now, it's already well
42
00:04:08,374 --> 00:04:13,203
known that markets do safety in some
industries, such as aviation, way better
43
00:04:13,203 --> 00:04:18,903
than others, such as medicine. And cars
were dreadful for many years. For the first
44
00:04:18,903 --> 00:04:23,245
80 years of the car industry, people
didn't bother with things like seatbelts,
45
00:04:23,245 --> 00:04:28,643
and it was only when Ralph Nader's book,
Unsafe at Any Speed, led the Americans to
46
00:04:28,643 --> 00:04:32,767
set up the National Highway Traffic
Safety Administration
47
00:04:32,767 --> 00:04:37,410
and various court cases brought this
forcefully to public attention that car
48
00:04:37,410 --> 00:04:42,900
safety started to become a thing. Now in
the EU, we've got a whole series of broad
49
00:04:42,900 --> 00:04:49,292
frameworks and specific directives and
detailed rules, and thus overall 20 EU
50
00:04:49,292 --> 00:04:55,074
agencies plus the UNECE in play here. So
how can we navigate this? Well, what we
51
00:04:55,074 --> 00:05:00,035
were asked to do was to look at three
specific verticals and study them in some
52
00:05:00,035 --> 00:05:06,507
detail so that the lessons from them could
be then taken to the other verticals in
53
00:05:06,507 --> 00:05:17,967
which the EU operates. And cars were one
of those. And some of you may remember the
54
00:05:17,967 --> 00:05:26,601
CarShark paper in 2011. Four guys from
San Diego and the University of Washington
55
00:05:26,601 --> 00:05:30,720
figured out how to hack a vehicle and
control it remotely. And I used to have a
56
00:05:30,720 --> 00:05:34,477
lovely little video of this that the
researchers gave me. But my Mac got
57
00:05:34,477 --> 00:05:41,370
upgraded to Catalina last week and it
doesn't play anymore. So, "verschlimmbessern",
58
00:05:41,370 --> 00:05:44,354
as they say in German, right?
Yeah.
59
00:05:44,354 --> 00:05:49,316
applause
60
00:05:49,316 --> 00:05:53,717
Okay. We'll get it going sooner or later.
Anyway, this was largely ignored because
61
00:05:53,717 --> 00:05:59,976
one little video didn't make much of an impact.
But in 2015, this suddenly came to the
62
00:05:59,976 --> 00:06:04,643
attention of the industry, because Charlie
Miller and Chris Valasek, two guys who had
63
00:06:04,643 --> 00:06:10,870
been in the NSA's hacking team, hacked a
Jeep Cherokee using Chrysler's Uconnect.
64
00:06:10,870 --> 00:06:14,171
And this meant that they could go down
through all the Chrysler vehicles in
65
00:06:14,171 --> 00:06:18,548
America and look at them one by one and
ask, where are you? And then when they
66
00:06:18,548 --> 00:06:21,676
found the vehicle that was somewhere
interesting, they could go in and do
67
00:06:21,676 --> 00:06:26,878
things to it. And what they found was that
to hack a vehicle, suddenly you just
68
00:06:26,878 --> 00:06:34,539
needed the vehicle's IP address. And so
they got a journalist into a vehicle and
69
00:06:34,539 --> 00:06:38,649
they got it to slow down and had trucks
behind them hooting away, and eventually
70
00:06:38,649 --> 00:06:43,102
they ran the vehicle off the road. And
when the TV footage of this got out,
71
00:06:43,102 --> 00:06:47,505
suddenly, people cared. It made the front
pages of the press in the USA, and
72
00:06:47,505 --> 00:06:52,359
Chrysler had to recall 1.4 million
vehicles for a software fix, which meant
73
00:06:52,359 --> 00:06:58,268
actually reflashing the firmware of the
devices. And it cost them billions and
74
00:06:58,268 --> 00:07:02,170
billions of dollars. So all of a sudden,
this is something to which people paid
75
00:07:02,170 --> 00:07:10,675
attention. Some of you may know this chap
here, at least by sight. This is Martin
76
00:07:10,675 --> 00:07:15,852
Winterkorn, who used to run Volkswagen.
And when it turned out that he had hacked
77
00:07:15,852 --> 00:07:20,292
millions and millions of Volkswagen
vehicles by putting in evil software that
78
00:07:20,292 --> 00:07:26,780
defeated emissions controls. That's what
happened to Volkswagen's stock price. Oh,
79
00:07:26,780 --> 00:07:33,770
and he lost his job and got prosecuted. So
this is an important point about vehicles
80
00:07:33,770 --> 00:07:37,668
and in fact, about many things in the
Internet of Things or Internet of
81
00:07:37,668 --> 00:07:42,246
Targets, whatever you want to call it. The
threat model isn't just external, it is
82
00:07:42,246 --> 00:07:47,105
internal as well. There are bad people all
the way up and down the supply chain. Even
83
00:07:47,105 --> 00:07:54,605
at the OEM. So that's the state of play in
cars. And we investigated that and wrote a
84
00:07:54,605 --> 00:08:03,785
bit about it. Now, here's medicine. This
was the second thing that we looked at.
85
00:08:03,785 --> 00:08:08,789
These are some pictures of the scene in
the intensive care unit in Swansea
86
00:08:08,789 --> 00:08:13,335
Hospital. So after your car gets hacked
and you go off the road, this is where you
87
00:08:13,335 --> 00:08:19,918
end up. And just as a car has got about 50
computers in it, you're now going to see
88
00:08:19,918 --> 00:08:34,040
that there's quite a few computers at your
bedside. How many CPUs can you see? You
89
00:08:34,040 --> 00:08:39,807
see, there's quite a few, about a
comparable number to the number of CPUs in
90
00:08:39,807 --> 00:08:47,235
your car. Only here the systems
integration is done by the nurse, not by
91
00:08:47,235 --> 00:08:55,528
the engineers at Volkswagen or Mercedes.
And does this cause safety problems? Oh,
92
00:08:55,528 --> 00:09:06,723
sure. Here are pictures of the user
interface of infusion pumps taken from
93
00:09:06,723 --> 00:09:13,500
Swansea's intensive care unit. And as you
can see, they're all different. This is a
94
00:09:13,500 --> 00:09:17,736
little bit like if you suddenly had to
drive a car from the 1930s, an old
95
00:09:17,736 --> 00:09:22,452
Lanchester, for example, and then you find
that the accelerator is between the brake
96
00:09:22,452 --> 00:09:27,416
and the clutch, right? Honestly, there
used to be such cars. You can still find
97
00:09:27,416 --> 00:09:33,325
them in antique car fairs or a Model T
Ford, for example, where the accelerator is
98
00:09:33,325 --> 00:09:39,003
actually a lever on the dashboard and one
of the pedals is a gear change. And yet
99
00:09:39,003 --> 00:09:44,330
you're asking nurses to operate a variety
of different pieces of equipment and look,
100
00:09:44,330 --> 00:09:50,645
for example, at the Bodyguard 545, the one
on the top. Right,
101
00:09:50,645 --> 00:09:54,527
this is the morphine that is being dripped
into your vein once you've had your car
102
00:09:54,527 --> 00:09:58,949
crash, to increase the dose you have to
press 2 and to decrease it, you have to
103
00:09:58,949 --> 00:10:06,882
press 0. On the Bodyguard 545 at the
bottom right, to increase the dose you
104
00:10:06,882 --> 00:10:14,367
press 5 and to decrease it, you press 0.
And this leads to accidents, to fatal
105
00:10:14,367 --> 00:10:21,179
accidents, a significant number of them.
Okay. So you might say, well, why not have
106
00:10:21,179 --> 00:10:25,576
standards? Well, we have standards. We've
got standards which say that liter should
107
00:10:25,576 --> 00:10:30,510
always be a capital L, so it is not
confused with a one. And then you see that
108
00:10:30,510 --> 00:10:37,522
on the Bodyguard at the bottom right,
milliliters is written with a capital L in green. Okay.
109
00:10:37,522 --> 00:10:43,285
Well done, Mr. Bodyguard. The problem is,
if you look up two lines, you see 500
110
00:10:43,285 --> 00:10:49,172
milliliters is in small letters. So
there's a standards problem. There's an
111
00:10:49,172 --> 00:10:53,785
enforcement problem, and there are
externalities, because each of these vendors
112
00:10:53,785 --> 00:10:58,285
will say, well, everybody else should
standardize on my kit. And there are also
113
00:10:58,285 --> 00:11:04,745
various other market failures. So the
expert who's been investigating this is my
114
00:11:04,745 --> 00:11:09,515
friend Harold Thimbleby, who's a professor
of computer science at Swansea. And his
115
00:11:09,515 --> 00:11:14,603
research shows that hospital safety
usability failures kill about 2,000 people
116
00:11:14,603 --> 00:11:22,207
every year in the UK, which is about the
same as road accidents. And safety
117
00:11:22,207 --> 00:11:29,572
usability, in other words, gets ignored
because the incentives are wrong. In
118
00:11:29,572 --> 00:11:33,486
Britain and indeed in the European
institutions, people tend to follow the
119
00:11:33,486 --> 00:11:39,190
FDA in America and that is captured by the
large medical device makers over there.
120
00:11:39,190 --> 00:11:45,150
They only have two engineers. They're not
allowed to play with pumps, etc, etc, etc.
121
00:11:45,150 --> 00:11:50,322
The curious thing here is that safety and
security come together. The safety of
122
00:11:50,322 --> 00:11:55,316
medical devices may improve because as
soon as it becomes possible to hack a
123
00:11:55,316 --> 00:12:02,577
medical device, then people suddenly take
care. The first instance of this was when Kevin
124
00:12:02,577 --> 00:12:07,334
Fu and researchers at the University of
Michigan showed that they could hack the
125
00:12:07,334 --> 00:12:12,270
Hospira Symbiq infusion pump over
Wi-Fi. And this led the FDA to immediately
126
00:12:12,270 --> 00:12:17,244
panic and blacklist the pump, recalling it
from service. But then Kevin said, what
127
00:12:17,244 --> 00:12:21,108
about the 200 other infusion pumps that
are unsafe because of the things on the
128
00:12:21,108 --> 00:12:27,760
previous slide? And the FDA said, we couldn't
possibly recall all those. Then two years
129
00:12:27,760 --> 00:12:33,118
ago, there was an even bigger recall. It
turned out that 450,000 pacemakers made by
130
00:12:33,118 --> 00:12:38,939
St. Jude could similarly be hacked over
Wi-Fi. And so the recall was ordered. And
131
00:12:38,939 --> 00:12:42,590
this is quite serious, because if you've
got a heart pacemaker, right, it's
132
00:12:42,590 --> 00:12:47,681
implanted surgically in the muscle next to
your shoulder blade. And to remove that
133
00:12:47,681 --> 00:12:51,740
and replace it with a new one, which they
do every 10 years to change the battery,
134
00:12:51,740 --> 00:12:54,950
you know, is a day-case surgery procedure.
You have to go in there, get an
135
00:12:54,950 --> 00:12:58,256
anesthetic. They have to have a
cardiologist ready in case you have a
136
00:12:58,256 --> 00:13:05,340
heart attack. It's a big deal, right? It
costs maybe 3,000 pounds in the UK. And so
137
00:13:05,340 --> 00:13:11,000
3,000 pounds times 450,000 pacemakers.
Multiply it by two for American health
138
00:13:11,000 --> 00:13:18,510
care costs and you're talking real money.
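(A quick back-of-envelope sketch, using only the figures just
quoted; the rounding is mine: 3,000 pounds × 450,000 pacemakers
≈ 1.35 billion pounds, and doubled for American health care
costs, on the order of 3 billion.)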
So what should Europe do about this? Well,
139
00:13:18,510 --> 00:13:22,970
thankfully, the European institutions have
been getting off their butts on this and
140
00:13:22,970 --> 00:13:27,650
the medical device directives have been
revised. And from next year, medical
141
00:13:27,650 --> 00:13:31,170
devices will have post-market
surveillance, a risk management plan,
142
00:13:31,170 --> 00:13:37,460
ergonomic design. And here's perhaps the
driver for software engineering for
143
00:13:37,460 --> 00:13:41,600
devices that incorporate software. The
software shall be developed in accordance
144
00:13:41,600 --> 00:13:45,680
with the state of the art, taking into
account the principles of development
145
00:13:45,680 --> 00:13:50,810
life cycle, risk management, including
information security, verification and
146
00:13:50,810 --> 00:13:57,470
validation. So there at least we have a
foothold and it continues. Devices shall
147
00:13:57,470 --> 00:14:02,150
be designed and manufactured in such a way
as to protect as far as possible against
148
00:14:02,150 --> 00:14:06,620
unauthorized access that could hamper the
device from functioning as intended. Now
149
00:14:06,620 --> 00:14:11,040
it's still not perfect. There's various
things that the manufacturers can do to
150
00:14:11,040 --> 00:14:17,090
wriggle. But it's still a huge
improvement. The third thing that we
151
00:14:17,090 --> 00:14:20,990
looked at was energy, electricity
substations and electrotechnical
152
00:14:20,990 --> 00:14:25,700
equipment in general; there have been one
or two talks at this conference on that.
153
00:14:25,700 --> 00:14:30,480
Basically, the problem is that you've got
a 40 year life cycle for these devices.
154
00:14:30,480 --> 00:14:35,750
Protocols such as Modbus and DNP3 don't
support authentication. And the fact that
155
00:14:35,750 --> 00:14:41,180
everything has gone to IP networks means
that, as with the Chrysler Jeeps, anybody
156
00:14:41,180 --> 00:14:45,750
who knows a sensor's IP address can read from
it, and with an actuator's IP address, you can activate it.
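To make that concrete, here is a minimal C sketch (an
illustration, not from the talk) of the entire content of a
Modbus/TCP "read holding registers" request. Note what is absent:
there is no password, key or session field anywhere in the frame,
so any host that can reach TCP port 502 on the device can issue
reads, and writes work the same way with function codes 0x05 and
0x06.

    /* Complete Modbus/TCP read request; all multi-byte fields are
       big-endian on the wire. The struct name is illustrative. */
    #include <stdint.h>

    #pragma pack(push, 1)
    struct modbus_read_request {
        /* MBAP header */
        uint16_t transaction_id;  /* echoed back by the device     */
        uint16_t protocol_id;     /* always 0 for Modbus           */
        uint16_t length;          /* bytes that follow: here 6     */
        uint8_t  unit_id;         /* slave address                 */
        /* PDU: the protocol has no authentication field at all    */
        uint8_t  function;        /* 0x03 = read holding registers */
        uint16_t start_register;  /* first register to read        */
        uint16_t register_count;  /* how many registers to read    */
    };
    #pragma pack(pop)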
157
00:14:45,750 --> 00:14:51,200
So the only practical fix
there is to re-perimeterise, and the
158
00:14:51,200 --> 00:14:56,300
entrepreneurs who noticed this 10 to 15
years ago and set up companies like Belden
159
00:14:56,300 --> 00:15:00,980
have now made lots and lots of money.
Companies like BP now have thousands of
160
00:15:00,980 --> 00:15:06,050
such firewalls which isolate their
chemical and other plants from the
161
00:15:06,050 --> 00:15:11,480
internet. So one way in which you can deal
with this is having one component that
162
00:15:11,480 --> 00:15:14,900
connects you to the network, you replace
it every five years. That's one way of
163
00:15:14,900 --> 00:15:20,270
doing it, if you'd like sustainable security
for your oil refinery. But this is a lot
164
00:15:20,270 --> 00:15:25,280
harder for cars, which have got multiple
RF interfaces. A modern car has maybe 10
165
00:15:25,280 --> 00:15:31,600
interfaces in all. There is the
internal phone. There's the short range radio
166
00:15:31,600 --> 00:15:37,310
link for remote key entry, those things.
There are links to the devices that
167
00:15:37,310 --> 00:15:41,030
monitor your tire pressure. There's all
sorts of other things and every single one
168
00:15:41,030 --> 00:15:48,350
of these has been exploited at least once.
And there are particular difficulties in
169
00:15:48,350 --> 00:15:53,180
the auto industry because of the
fragmented responsibility in the supply
170
00:15:53,180 --> 00:15:57,530
chain between the OEM, the tier ones and
the specialists who produce all the
171
00:15:57,530 --> 00:16:03,380
various bits and pieces that get glued
together. Anyway, so the broad questions
172
00:16:03,380 --> 00:16:08,480
that arise from this include who will
investigate incidents and to whom will
173
00:16:08,480 --> 00:16:15,890
they be reported? Right? How do we embed
responsible disclosure? How do we bring
174
00:16:15,890 --> 00:16:21,500
safety engineers and security engineers
together? This is an enormous project
175
00:16:21,500 --> 00:16:25,580
because security engineers and safety
engineers use different languages. We have
176
00:16:25,580 --> 00:16:31,040
different university degree programs. We
go to different conferences. And the world
177
00:16:31,040 --> 00:16:35,450
of safety is similarly fragmented between
the power people, the car people, the
178
00:16:35,450 --> 00:16:40,680
naval people, the signal people and so on
and so forth. Some companies are beginning
179
00:16:40,680 --> 00:16:44,940
to get this together. The first is Bosch,
which put together their safety
180
00:16:44,940 --> 00:16:48,960
engineering and security engineering
professions. But even once you have done
181
00:16:48,960 --> 00:16:53,640
that in organizational terms, how do you
teach a security engineer to think safety
182
00:16:53,640 --> 00:16:58,950
and vice versa? Then there's the problem that
bothered the European Union: are the
183
00:16:58,950 --> 00:17:04,350
regulators all going to need security
engineers? Right. I mean, many of these
184
00:17:04,350 --> 00:17:10,250
organizations in Brussels don't even have
an engineer on staff, right? They are
185
00:17:10,250 --> 00:17:16,260
mostly full of lawyers and policy people.
And then, of course, for this audience,
186
00:17:16,260 --> 00:17:21,280
how do you prevent abuse of lock-in, you
know, in America if you've got a tractor
187
00:17:21,280 --> 00:17:25,200
from John Deere and you don't
take it to a John Deere dealer every six
188
00:17:25,200 --> 00:17:29,790
months or so, it stops working. Right. And
if you try and hack it so you can fix it
189
00:17:29,790 --> 00:17:34,740
yourself, then John Deere will try to get
you prosecuted. We just don't want that
190
00:17:34,740 --> 00:17:41,100
kind of stuff coming over the Atlantic
into Europe. So we ended up with a number
191
00:17:41,100 --> 00:17:46,770
of recommendations. We thought that we
would get vendors to self-certify for the
192
00:17:46,770 --> 00:17:52,160
CE mark that products could be patched if
need be. That turned out to be not viable.
193
00:17:52,160 --> 00:17:57,100
We then came up with another idea that
things should be secure by default for the
194
00:17:57,100 --> 00:18:00,630
update to the Radio Equipment Directive.
And that didn't get through the European
195
00:18:00,630 --> 00:18:06,980
Parliament either. In fact, it was Mozilla
that lobbied against it. Eventually we got
196
00:18:06,980 --> 00:18:11,850
something through which I'll discuss in a
minute. We talked about requiring a secure
197
00:18:11,850 --> 00:18:15,210
development lifecycle with vulnerability
management because we've already got
198
00:18:15,210 --> 00:18:21,330
standards for that. We talked about
creating a European security engineering
199
00:18:21,330 --> 00:18:25,830
agency, so that there would be people in
Brussels to support policymakers. And the
200
00:18:25,830 --> 00:18:30,540
reaction to that, a year and a half ago,
was to arrange for ENISA to be allowed to
201
00:18:30,540 --> 00:18:35,040
open an office in Brussels so that they
can hopefully build a capability there,
202
00:18:35,040 --> 00:18:40,200
with some technical people who can support
policymakers. We recommended extending the
203
00:18:40,200 --> 00:18:45,830
product liability directive to services.
There is enormous pushback on that.
204
00:18:45,830 --> 00:18:50,430
Companies like Google and Facebook and so
on don't like the idea that they should be
205
00:18:50,430 --> 00:18:55,620
as liable for mistakes made by Google
Maps, as for example, Garmin is liable for
206
00:18:55,620 --> 00:19:00,930
mistakes made by its navigators. And then
there's the whole business of how do you
207
00:19:00,930 --> 00:19:05,220
take the information that European
institutions already have on breaches and
208
00:19:05,220 --> 00:19:10,140
vulnerabilities and report this not just
to ENISA, but the safety regulators and
209
00:19:10,140 --> 00:19:14,160
users, because somehow you've got to
create a learning system. And this is
210
00:19:14,160 --> 00:19:19,050
perhaps one of the big pieces of work to
be done. How do you take, I mean, once all
211
00:19:19,050 --> 00:19:23,550
cars are sort of semi-intelligent, once
everybody's got telemetry and once there
212
00:19:23,550 --> 00:19:28,050
are, you know, gigabytes of data
everywhere, then whenever there's a car
213
00:19:28,050 --> 00:19:34,050
crash, the data have to go to all sorts of
places, to the police, to the insurers, to
214
00:19:34,050 --> 00:19:40,350
courts, and then, of course, up to the car
makers and regulators and component
215
00:19:40,350 --> 00:19:45,060
suppliers and so on. How do you design the
system that will cause the right data to
216
00:19:45,060 --> 00:19:49,680
get to the right place, which will still
respect people's privacy rights and all
217
00:19:49,680 --> 00:19:54,900
the various other legal obligations? This
is a huge project and nobody has really
218
00:19:54,900 --> 00:19:59,880
started to think yet about how it's going
to be done, right. At present, if you've
219
00:19:59,880 --> 00:20:03,780
got a crash in a car like a Tesla, which
has got very good telemetry, you basically
220
00:20:03,780 --> 00:20:07,200
have to take Tesla to court to get the
data because otherwise they won't hand it
221
00:20:07,200 --> 00:20:13,320
over. Right. We need a better regime for
this. And that at present is a blank
222
00:20:13,320 --> 00:20:18,910
slate. It's up to us, I suppose, to figure
out how such a system should be designed
223
00:20:18,910 --> 00:20:23,870
and built, and it will take many years to
do it, right. If you want a safe system, a
224
00:20:23,870 --> 00:20:32,940
system that learns, this is what it is going
to involve. But there's one thing that
225
00:20:32,940 --> 00:20:37,920
struck us after we'd done this work, after
we delivered this to the European
226
00:20:37,920 --> 00:20:41,940
Commission, that I'd gone to Brussels and
given a talk to dozens and dozens of
227
00:20:41,940 --> 00:20:49,060
security guys. Richard Clayton and I went
to Schloss Dagstuhl for a weeklong seminar
228
00:20:49,060 --> 00:20:53,010
on some other security topic. And we were
just chatting one evening and we said,
229
00:20:53,010 --> 00:21:00,250
well, you know, what did we actually learn
from this whole exercise on
230
00:21:00,250 --> 00:21:07,090
standardization and certification? Well,
it's basically this: there are two
231
00:21:07,090 --> 00:21:12,790
types of secure things that we currently
know how to make. The first is stuff like
232
00:21:12,790 --> 00:21:17,890
your phone or your laptop, which is secure
because you patch it every month. Right.
233
00:21:17,890 --> 00:21:22,180
But then you have to throw it away after
three years because Larry and Sergei don't
234
00:21:22,180 --> 00:21:35,920
have enough money to maintain three
versions of Android. And then we've got
235
00:21:35,920 --> 00:21:41,460
things like cars and medical devices where
we test them to death before release and
236
00:21:41,460 --> 00:21:46,600
we don't connect them to the Internet, and
we almost never patch them unless Charlie
237
00:21:46,600 --> 00:21:52,750
Miller and Chris Valasek get a go at
your car, that is. So what's gonna happen
238
00:21:52,750 --> 00:21:59,050
to support costs? Now that we're starting
to patch cars and you have to patch cars
239
00:21:59,050 --> 00:22:02,890
because they're online. And once something's
online, right, anybody in the world can
240
00:22:02,890 --> 00:22:06,760
attack it. If a vulnerability is
discovered, it can be scaled and something
241
00:22:06,760 --> 00:22:11,150
that you could previously ignore suddenly
becomes something that you have to fix.
242
00:22:11,150 --> 00:22:14,650
And if you have to pull all your cars
into a garage to patch them, that costs
243
00:22:14,650 --> 00:22:18,490
real money. So you need to be able to
patch them over the air. So all of a
244
00:22:18,490 --> 00:22:26,920
sudden cars become like computers or
phones. So what is this going to mean? So
245
00:22:26,920 --> 00:22:34,030
this is the trilemma. If you've got a
standard safety life cycle, there's no
246
00:22:34,030 --> 00:22:38,150
patching. You get safety and
sustainability, but you can't go online
247
00:22:38,150 --> 00:22:43,600
because you'll get hacked. And if you get
the standard security lifecycle, you're
248
00:22:43,600 --> 00:22:50,650
patching, but that breaks the safety
certification, so that's a problem. And if
249
00:22:50,650 --> 00:22:54,730
you get patching plus redoing safety
certification with current methods, then
250
00:22:54,730 --> 00:22:58,930
the cost of maintaining your safety rating
can be sky high. So here's the big
251
00:22:58,930 --> 00:23:09,770
problem. How do you get safety, security
and sustainability at the same time? Now
252
00:23:09,770 --> 00:23:13,040
this brings us to another thing that a
number of people at this congress are
253
00:23:13,040 --> 00:23:17,960
interested in: the right to repair. This
is the Centennial Light, right? It's been
254
00:23:17,960 --> 00:23:24,230
running since 1901. Right. It's in
Livermore in California. It's kind of dim,
255
00:23:24,230 --> 00:23:30,200
but you can go there and you can see it.
Still there. In 1924, the three firms that
256
00:23:30,200 --> 00:23:34,790
dominated the light bulb business, GE, Osram
and Philips, agreed to reduce average bulb
257
00:23:34,790 --> 00:23:39,590
lifetime from some 2,500 hours to 1,000
hours. Why? In order to sell more of
258
00:23:39,590 --> 00:23:46,430
them. And one of the things that's come
along with CPUs and communications and so
259
00:23:46,430 --> 00:23:52,360
on with smart stuff, to use that horrible
word, is that firms are now using online
260
00:23:52,360 --> 00:23:58,340
mechanisms, software and cryptographic
mechanisms in order to make it hard or
261
00:23:58,340 --> 00:24:03,860
even illegal to fix products. And I
believe that there's a case against Apple
262
00:24:03,860 --> 00:24:16,790
going on in France about this. Now, you
might not think it's something that
263
00:24:16,790 --> 00:24:20,780
politicians will get upset about, that you
have to throw away your phone after three
264
00:24:20,780 --> 00:24:25,070
years instead of after five years. But
here's something you really should worry
265
00:24:25,070 --> 00:24:31,640
about. Vehicle life cycle economics,
because the lifetimes of cars in Europe
266
00:24:31,640 --> 00:24:36,990
have about doubled in the last 40 years.
And the average age at which a car in Britain
267
00:24:36,990 --> 00:24:46,530
is scrapped is now almost 15 years.
So what's going to happen once you've got,
268
00:24:46,530 --> 00:24:54,110
you know, wonderful self-driving software
in all the cars. Well, a number of big car
269
00:24:54,110 --> 00:25:00,200
companies, including in this country, were
taking the view two years ago that they
270
00:25:00,200 --> 00:25:06,320
wanted people to scrap their cars after
six years and buy a new one. Hey, makes
271
00:25:06,320 --> 00:25:10,100
business sense, doesn't it? If you're Mr.
Mercedes, your business model is if the
272
00:25:10,100 --> 00:25:13,790
customer is rich, you sell him a three
year lease on a new car. And if the
273
00:25:13,790 --> 00:25:18,370
customer is not quite so rich, you sell
him a three year lease on a Mercedes
274
00:25:18,370 --> 00:25:23,715
approved used car. And if somebody drives a
seven year old Mercedes, that's thought
275
00:25:23,715 --> 00:25:31,620
crime. You know, they should emigrate to
Africa or something. So this was the view
276
00:25:31,620 --> 00:25:38,070
of the vehicle makers. But here's the rub.
The embedded CO2 cost of a car often
277
00:25:38,070 --> 00:25:43,380
exceeds its lifetime fuel burn. My best
estimate for the embedded CO2 cost of an
278
00:25:43,380 --> 00:25:48,030
E-Class Mercedes is 35 tons. So go and
work out, you know, how many liters per
279
00:25:48,030 --> 00:25:53,760
100 kilometers and how many kilometers
it's gonna run in 15 years. And you come
280
00:25:53,760 --> 00:25:59,710
to the conclusion that if you get a six
year lifetime, then maybe you are
281
00:25:59,710 --> 00:26:07,180
decreasing the range of the car from
300,000 kilometers to 100,000 kilometers. And
282
00:26:07,180 --> 00:26:13,080
so you're approximately doubling the
overall CO2 emissions. Taking the whole
283
00:26:13,080 --> 00:26:16,710
life cycle, not just the Scope 1, but
the Scope 2, and the Scope 3, the
284
00:26:16,710 --> 00:26:22,320
embedded stuff as well.
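As a hedged back-of-envelope in C (the 35-ton embedded figure and
the 300,000 km versus 100,000 km lifetimes are from the talk; the
7 l/100 km fuel economy and 2.4 kg CO2 per litre of petrol are my
assumptions):

    #include <stdio.h>

    int main(void) {
        const double embedded_t   = 35.0; /* embedded CO2, tonnes (talk)  */
        const double l_per_100km  = 7.0;  /* assumed fuel economy         */
        const double kg_per_litre = 2.4;  /* assumed CO2 per litre burned */
        const int lifetimes_km[]  = { 100000, 300000 };

        for (int i = 0; i < 2; i++) {
            int km = lifetimes_km[i];
            double fuel_t = km / 100.0 * l_per_100km * kg_per_litre / 1000.0;
            double per_km = (embedded_t + fuel_t) * 1000.0 / km; /* kg/km */
            printf("%6d km: fuel %4.1f t + embedded %4.1f t -> %.2f kg CO2/km\n",
                   km, fuel_t, embedded_t, per_km);
        }
        return 0; /* prints about 0.52 vs 0.28 kg/km: roughly double */
    }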
And then there are other consequences. What about Africa,
285
00:26:22,320 --> 00:26:26,820
where most vehicles are imported second
hand? If you go to Nairobi, all the cars
286
00:26:26,820 --> 00:26:31,110
are between 10 and 20 years old, right?
They arrive in the docks in Mombasa when
287
00:26:31,110 --> 00:26:35,310
they're already 10 years old and people
drive them for 10 years and then they end
288
00:26:35,310 --> 00:26:39,090
up in Uganda or Chad or somewhere like
that. And they're repaired for as long as
289
00:26:39,090 --> 00:26:43,560
they're repairable. What's going to happen
to road transport in Africa if all of a
290
00:26:43,560 --> 00:26:48,660
sudden there's a software time bomb that
causes cars to self-destruct ten years
291
00:26:48,660 --> 00:26:56,040
after they leave the showroom? And if there
isn't, what about safety? I don't know
292
00:26:56,040 --> 00:27:00,420
what the rules are here, but in Britain I
have to get my car through a safety
293
00:27:00,420 --> 00:27:05,010
examination every year, once it's more
than three years old. And it's entirely
294
00:27:05,010 --> 00:27:09,510
foreseeable that within two or three years
the mechanic will want to check that the
295
00:27:09,510 --> 00:27:15,880
software is up to date. So once the
software update is no longer available,
296
00:27:15,880 --> 00:27:24,580
that's basically saying this car must now
be exported or scrapped. I couldn't resist
297
00:27:24,580 --> 00:27:29,120
the temptation to put in a cartoon:
"My engine's making a weird noise."
298
00:27:29,120 --> 00:27:32,490
"Can you take a look?"
"Sure. Just pop the hood. Oh, the hood
299
00:27:32,490 --> 00:27:36,600
latch is also broken. Okay, just pull up
to that big pit and push the car in. We'll
300
00:27:36,600 --> 00:27:41,400
go get a new one."
Right? This is if we start treating cars
301
00:27:41,400 --> 00:27:53,250
the way we treat consumer electronics. So
what's a reasonable design lifetime? Well,
302
00:27:53,250 --> 00:27:58,260
with cars, the way it is going is maybe 18
years, say 10 years from the sale of the
303
00:27:58,260 --> 00:28:03,660
last product in a model range. Domestic
appliances: 10 years because of the spares
304
00:28:03,660 --> 00:28:09,720
obligation, plus store life, say 15.
Medical devices: if a pacemaker lives for
305
00:28:09,720 --> 00:28:16,410
10 years, then maybe you need 20 years. For
electricity substations, even more. So
306
00:28:16,410 --> 00:28:22,500
from the point of view of engineers, the
question is, how can you see to it that
307
00:28:22,500 --> 00:28:27,690
your software will be patchable for 20
years? So as we put it in the abstract, if
308
00:28:27,690 --> 00:28:34,830
you are writing software now for a car
that will go on sale in 2023, what sort of
309
00:28:34,830 --> 00:28:39,090
languages, what sort of toolchain should
you use? What sort of crypto should you
310
00:28:39,090 --> 00:28:46,390
use so that you're sure you'll still be
able to patch that software in 2043? And
311
00:28:46,390 --> 00:28:50,040
that isn't just about the languages and
compilers and linkers and so on. That's
312
00:28:50,040 --> 00:28:59,490
about the whole ecosystem. So what did the
EU do? Well, I'm pleased to say that at
313
00:28:59,490 --> 00:29:05,800
the third attempt, the EU managed to get
some law through on this. Directive 771
314
00:29:05,800 --> 00:29:10,440
this year on smart goods says that buyers
of goods with digital elements are
315
00:29:10,440 --> 00:29:15,570
entitled to necessary updates for two
years or for a longer period of time if
316
00:29:15,570 --> 00:29:20,880
this is a reasonable expectation of the
customer. This is what they managed to get
317
00:29:20,880 --> 00:29:24,990
through the parliament. And what we would
expect is that this will mean at least 10
318
00:29:24,990 --> 00:29:29,520
years for cars, ovens, fridges, air
conditioning and so on because of existing
319
00:29:29,520 --> 00:29:35,100
provisions about physical spares. And
what's more, the trader has got the burden
320
00:29:35,100 --> 00:29:39,720
of proof in the first couple of years if
there are disputes. So there is now the
321
00:29:39,720 --> 00:29:48,160
legal framework there to create the demand
for long term patching of software. And
322
00:29:48,160 --> 00:29:54,570
now it's kind of up to us. If the durable
goods we're designing today are still
323
00:29:54,570 --> 00:30:00,030
working in 2039, then a whole bunch of
things are gonna have to change. Computer
324
00:30:00,030 --> 00:30:04,650
science has always been about managing
complexity ever since the very first high
325
00:30:04,650 --> 00:30:09,780
level languages and the history goes on
from there through types and objects and
326
00:30:09,780 --> 00:30:14,730
tools like git and Jenkins and Coverity.
So here's a question for the computer
327
00:30:14,730 --> 00:30:19,560
scientists here. What else is going to be
needed for sustainable computing? Once we
328
00:30:19,560 --> 00:30:31,440
have software in just about everything. So
research topics to support 20 year
329
00:30:31,440 --> 00:30:35,670
patching include a more stable and
powerful toolchain. We know how complex
330
00:30:35,670 --> 00:30:41,730
this can be from crypto, looking at the
history of the last 20 years of TLS. Cars
331
00:30:41,730 --> 00:30:45,480
teach that it's difficult and expensive to
sustain all the different test
332
00:30:45,480 --> 00:30:50,790
environments you have for different models
of cars. Control systems teach that
333
00:30:50,790 --> 00:30:54,480
you can make small changes to the
architecture, which will then limit what
334
00:30:54,480 --> 00:30:59,640
you have to patch. Android teaches how
you go about motivating OEMs to patch
335
00:30:59,640 --> 00:31:04,140
products that they no longer sell. In this
case, it's European law, but there's maybe
336
00:31:04,140 --> 00:31:10,840
other things you can do too. What does it
mean for those of us who teach and
337
00:31:10,840 --> 00:31:15,090
research in universities? Well, since
2016, I've been teaching safety and
338
00:31:15,090 --> 00:31:20,490
security together in the same course to
first-year undergraduates, because
339
00:31:20,490 --> 00:31:25,560
presenting these ideas together in
lockstep will help people to think in more
340
00:31:25,560 --> 00:31:30,300
unified terms about how it all holds
together. In research terms, we have
341
00:31:30,300 --> 00:31:34,590
been starting to look at what we can do to
make the toolchain more sustainable. For
342
00:31:34,590 --> 00:31:39,750
example, one of the problems that you have
if you maintain crypto software is that
343
00:31:39,750 --> 00:31:44,550
every so often the compiler writers get
a little bit smarter, and the compiler
344
00:31:44,550 --> 00:31:48,450
figures out that these extra padding
instructions that you put in to make the
345
00:31:48,450 --> 00:31:53,970
loops of your crypto routines run in
constant time and to scrub the contents of
346
00:31:53,970 --> 00:31:58,130
round keys once they are no longer in use,
are not doing any real work, and it
347
00:31:58,130 --> 00:32:02,840
removes them. And all of a sudden from one
day to the next, you find that your crypto
348
00:32:02,840 --> 00:32:07,520
has sprung a huge big timing leak and then
you have to rush to get somebody out of
349
00:32:07,520 --> 00:32:11,900
bed to fix the toolchain. So one of the
things that we thought was that better
350
00:32:11,900 --> 00:32:17,360
ways for programmers to communicate intent
might help. And so there's a paper by
351
00:32:17,360 --> 00:32:21,800
Laurent Simon, David Chisnall and me,
where we looked at zeroising sensitive
352
00:32:21,800 --> 00:32:27,830
variables and doing constant-time loops
with an LLVM plugin. And that led to a
353
00:32:27,830 --> 00:32:32,810
EuroS&P paper a year and a half ago: "What
you get is what you C", and there's a plug
354
00:32:32,810 --> 00:32:40,770
in that you can download and play with.
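For a concrete picture of the underlying problem, here is a
minimal C sketch of the general dead-store issue (not the paper's
actual plugin; encrypt_with is a hypothetical stand-in for a
cipher call):

    #include <string.h>

    void encrypt_with(unsigned char key[32]); /* hypothetical cipher call */

    void session(void) {
        unsigned char key[32] = { 0 };
        /* ... load key material and use it ... */
        encrypt_with(key);
        memset(key, 0, sizeof key); /* dead store: the optimizer may
                                       legally delete it, leaving the
                                       key material in memory */
    }

    /* One common workaround: route the write through a volatile
       function pointer, so the compiler cannot prove the store dead. */
    static void *(*const volatile memset_v)(void *, int, size_t) = memset;

    void session_scrubbed(void) {
        unsigned char key[32] = { 0 };
        encrypt_with(key);
        memset_v(key, 0, sizeof key); /* survives dead-store elimination */
    }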
355
00:32:40,770 --> 00:32:45,980
Macro-scale sustainable security is
going to require a lot more. Despite the
problems in the airline industry with the
356
00:32:45,980 --> 00:32:51,800
737Max, the aerospace industry still has
got a better feedback loop of learning
357
00:32:51,800 --> 00:32:59,280
from incidents and accidents. And we don't
have that yet in any of the fields like
358
00:32:59,280 --> 00:33:05,360
cars and so on. It's going to be needed.
What can we use as a guide? Security
359
00:33:05,360 --> 00:33:13,070
economics is one set of intellectual tools
that can be applied. We've known for
360
00:33:13,070 --> 00:33:18,020
almost 20 years now that complex socio-
technical systems often fail because of
361
00:33:18,020 --> 00:33:22,490
poor incentives. If Alice guards a system
and Bob pays the cost of failure, you can
362
00:33:22,490 --> 00:33:27,740
expect trouble. And so security economics
researchers can explain platform security
363
00:33:27,740 --> 00:33:34,040
problems, patching cycles, liability games
and so on. And the same principles apply
364
00:33:34,040 --> 00:33:38,750
to safety and will become even more
important as safety and security become
365
00:33:38,750 --> 00:33:43,940
entangled. Also, we'll get even more data
and we'll be able to do more research and
366
00:33:43,940 --> 00:33:51,080
get more insights from the data. So where
does this lead? Well, our papers "Making
367
00:33:51,080 --> 00:33:56,240
Security Sustainable", and the thing that
we did for the EU, "Standardisation and
368
00:33:56,240 --> 00:34:00,500
Certification of the Internet of Things",
are on my web page together with other
369
00:34:00,500 --> 00:34:04,910
relevant papers on topics around
sustainability from, you know, smart
370
00:34:04,910 --> 00:34:11,280
metering to pushing back on wildlife
crime. And that's the first place to go if
371
00:34:11,280 --> 00:34:15,540
you're interested in this stuff. And
there's also our blog. And if you're
372
00:34:15,540 --> 00:34:20,790
interested in these kinds of issues at the
interface between technology and policy of
373
00:34:20,790 --> 00:34:25,980
how incentives work and how they very
often fail when it comes to complex socio-
374
00:34:25,980 --> 00:34:31,240
technical systems, then the Workshop
on the Economics of Information Security
375
00:34:31,240 --> 00:34:36,750
in Brussels next June is the place where
academics interested in these topics tend
376
00:34:36,750 --> 00:34:47,400
to meet up. So perhaps we'll see a few of
you there in June. And with that, there's
377
00:34:47,400 --> 00:34:53,250
a book on security engineering which goes
over some of these things and there's a
378
00:34:53,250 --> 00:34:56,127
third edition in the pipeline.
379
00:34:56,127 --> 00:34:58,577
H: Thank you very much,
Ross Anderson, for the talk.
380
00:34:58,577 --> 00:35:08,787
applause
381
00:35:08,787 --> 00:35:13,290
We will start the Q&A session a little bit
differently than you are used to: Ross has a
382
00:35:13,290 --> 00:35:18,807
question to you. So he told me there will
be a third edition of his book and he is
383
00:35:18,807 --> 00:35:24,745
not yet sure about the cover he wants to
have. So you are going to choose. And so
384
00:35:24,745 --> 00:35:29,545
that the people on the stream also can
hear your choice, I would like you to make
385
00:35:29,545 --> 00:35:36,610
a humming noise for the cover which you
like more. You will first see both covers.
386
00:35:36,610 --> 00:35:43,570
R: Cover 1, and cover 2.
H: So, who of you would prefer the
387
00:35:43,570 --> 00:35:52,510
first cover?
applause Come on.
388
00:35:52,510 --> 00:36:01,850
And the second choice. louder applause
OK. I think we have a clear favorite here
389
00:36:01,850 --> 00:36:04,517
from the audience, so it would
be the second cover.
390
00:36:04,517 --> 00:36:08,690
R: Thanks.
H: And we will look forward to seeing this
391
00:36:08,690 --> 00:36:13,727
cover next year then. So if you now have
questions yourself, you can line up in
392
00:36:13,727 --> 00:36:18,867
front of the microphones. You will find
eight distributed in the hall, three in
393
00:36:18,867 --> 00:36:27,070
the middle, two on the sides. Signal Angel
has the first question from the Internet.
394
00:36:27,070 --> 00:36:31,560
Person1: The first question is, is there a
reason why you didn't include aviation
395
00:36:31,560 --> 00:36:36,278
in your research?
R: We were asked to choose three fields,
396
00:36:36,278 --> 00:36:40,649
and the three fields I chose were the ones
in which we'd worked most recently.
397
00:36:40,649 --> 00:36:46,413
I did some work in avionics, but that was
40 years ago, so I'm no longer current.
398
00:36:46,413 --> 00:36:49,096
H: Alright, a question from microphone
number two, please.
399
00:36:49,096 --> 00:36:54,097
Person2: Hi. Thanks for your talk. What
I'm wondering most about is where do you
400
00:36:54,097 --> 00:37:00,750
believe the balance will fall in the fight
between privacy, the want of the
401
00:37:00,750 --> 00:37:06,582
manufacturer to prove that it wasn't their
fault and the right to repair?
402
00:37:06,582 --> 00:37:10,120
R: Well, this is an immensely complex
question and it's one that we'll be
403
00:37:10,120 --> 00:37:15,104
fighting about for the next 20 years. But
all I can suggest is that we study the
404
00:37:15,104 --> 00:37:19,670
problems in detail, that we collect the
data that we need to say coherent things
405
00:37:19,670 --> 00:37:24,279
to policymakers and that we use the
intellectual tools that we have, such as
406
00:37:24,279 --> 00:37:28,760
the economics of security in order to
inform these arguments. That's the best
407
00:37:28,760 --> 00:37:32,601
way that we can fight these fights, you
know, by being clearheaded and by being
408
00:37:32,601 --> 00:37:35,873
informed.
H: Thank you. A question from microphone
409
00:37:35,873 --> 00:37:44,836
number four, please. Can you switch on
microphone number four?
410
00:37:44,836 --> 00:37:51,380
Person3: Oh, sorry. Hello. Thank you for
the talk. As a software engineer, arguably
411
00:37:51,380 --> 00:37:57,049
I can cause much more damage than a single
medical professional simply because of the
412
00:37:57,049 --> 00:38:04,043
multiplication of my work. Why is it that
there is still no conversation about
413
00:38:04,043 --> 00:38:09,236
software engineers carrying liability
insurance and being liable for the
414
00:38:09,236 --> 00:38:13,485
work they do?
R: Well, that again is a complex question.
415
00:38:13,485 --> 00:38:16,874
And there are some countries like Canada
where being a professional engineer gives
416
00:38:16,874 --> 00:38:21,705
you a particular status. I think it's
cultural as much as anything else, because
417
00:38:21,705 --> 00:38:27,365
our trade has always been freewheeling,
it's always been growing very quickly. And
418
00:38:27,365 --> 00:38:31,969
throughout my lifetime it's been sucking
up a fair proportion of science graduates.
419
00:38:31,969 --> 00:38:35,058
If you were to restrict software
engineering to people with degrees in
420
00:38:35,058 --> 00:38:38,377
computer science, then we would have an
awful lot fewer people. I wouldn't be
421
00:38:38,377 --> 00:38:43,193
here, for example, because my first
degree was in pure math.
422
00:38:43,193 --> 00:38:46,744
H: All right, the question from microphone
number one, please.
423
00:38:46,744 --> 00:38:52,646
Person4: Hi. Thank you for the talk. My
question is also about aviation, because
424
00:38:52,646 --> 00:38:59,399
as I understand it, a lot of the
retired aircraft and other equipment is
425
00:38:59,399 --> 00:39:06,313
dumped into the so-called developing
countries. And with the modern technology
426
00:39:06,313 --> 00:39:12,180
and the modern aircraft, the issue of
maintaining or updating software would
427
00:39:12,180 --> 00:39:19,092
still be in question. But how do we see
that rolling out also for the so-called
428
00:39:19,092 --> 00:39:24,630
third world countries? Because I am a
Pakistani journalist, but this worries me
429
00:39:24,630 --> 00:39:31,925
a lot because we get so many devices
dumped into Pakistan after they're retired
430
00:39:31,925 --> 00:39:36,706
and people just use them. I mean, it's a
country that cannot even afford a license
431
00:39:36,706 --> 00:39:41,464
to an operating system. So maybe you could
shed a light on that. Thank you.
432
00:39:41,464 --> 00:39:45,547
R: Well, there are some positive things
that can be done. Development IT is
433
00:39:45,547 --> 00:39:50,841
something in which we are engaged. You can
find the details on my website, but good
434
00:39:50,841 --> 00:39:55,808
things don't necessarily have to involve
IT. One of my school friends became an
435
00:39:55,808 --> 00:40:00,695
anesthetist and after he retired, he
devoted his energies to developing an
436
00:40:00,695 --> 00:40:05,693
infusion pump for use in less developed
countries, which was very much cheaper
437
00:40:05,693 --> 00:40:09,339
than the ones that we saw on the screen
there. And it's also safe, rugged,
438
00:40:09,339 --> 00:40:16,082
reliable and designed for use in
places like Pakistan and Africa and South
439
00:40:16,082 --> 00:40:22,183
America. So the appropriate technology
doesn't always have to be the whizziest,
440
00:40:22,183 --> 00:40:29,192
right. And if you've got very bad roads,
as in India, in Africa, and relatively
441
00:40:29,192 --> 00:40:33,883
cheap labor, then perhaps autonomous
cars should not be a priority.
442
00:40:33,883 --> 00:40:35,801
Person4: Thank you.
H: All right. We have another question
443
00:40:35,801 --> 00:40:40,694
from the Internet, the Signal Angel, please?
Person5: Why force updates by law?
444
00:40:40,694 --> 00:40:45,355
Wouldn't it be better to prohibit the
important things from accessing the
445
00:40:45,355 --> 00:40:50,348
Internet by law?
R: Well, politics is the art of the
446
00:40:50,348 --> 00:40:56,635
possible. And you can only realistically
talk about a certain number of things at
447
00:40:56,635 --> 00:41:00,895
any one time in any political culture,
the so-called Overton Window. Now, if
448
00:41:00,895 --> 00:41:05,931
you talked about banning technology,
banning cars that are connected to the
449
00:41:05,931 --> 00:41:10,288
Internet as a minister, you will be
immediately shouted out of office as being
450
00:41:10,288 --> 00:41:14,422
a Luddite, right. So it's just not
possible to go down that path. What is
451
00:41:14,422 --> 00:41:19,574
possible is to go down the path of saying,
look, if you've got a company that imports
452
00:41:19,574 --> 00:41:24,323
lots of dangerous toys that harm kids or
dangerous CCTV cameras that get recruited into
453
00:41:24,323 --> 00:41:28,380
a botnet, then if you don't meet European
regulations, we'll put the containers on
454
00:41:28,380 --> 00:41:32,009
the boat back to China. That's just
something that can be sold politically.
455
00:41:32,009 --> 00:41:36,940
And given the weakness of the car industry
after the emission standard scandal, it
456
00:41:36,940 --> 00:41:40,775
was possible for Brussels to push through
something that the car industry really
457
00:41:40,775 --> 00:41:46,376
didn't like. And even then, that
was the third attempt to do something
458
00:41:46,376 --> 00:41:52,309
about it. So, again, it's what you can
practically achieve in real world politics
459
00:41:52,309 --> 00:41:56,364
H: All right. We have more questions.
Microphone number four, please.
460
00:41:56,364 --> 00:42:01,189
Person6: Hi, I'm an automotive cyber security
analyst and embedded software engineer.
461
00:42:01,189 --> 00:42:06,895
I'm part of the ISO 21434 Automotive
Cyber Security Standard. Are you aware of
462
00:42:06,895 --> 00:42:09,995
the standard that's coming
out next year, hopefully?
463
00:42:09,995 --> 00:42:13,588
R: I've not done any significant work with
it. Friends in the motor industry have
464
00:42:13,588 --> 00:42:17,589
talked about it, but it's not something
we've engaged with in detail.
465
00:42:17,589 --> 00:42:21,484
Person6: So I guess my point is not so
much a question, but a little bit of a
466
00:42:21,484 --> 00:42:25,830
pushback, because a lot of the things you
talked about are being worked on and are
467
00:42:25,830 --> 00:42:32,990
being considered. Over the years, updating
is going to be mandated. A 30 or 40
468
00:42:32,990 --> 00:42:38,220
year lifecycle of the vehicle is being
considered by engineers. Why not? Nobody I
469
00:42:38,220 --> 00:42:44,634
know talks about a six year lifecycle;
you know, that's back in the 80s,
470
00:42:44,634 --> 00:42:49,010
maybe when we talked about planned
obsolescence. But that's just not a thing.
471
00:42:49,010 --> 00:42:53,695
So I'm not really sure where that language
is coming from, to be honest with you.
472
00:42:53,695 --> 00:42:57,590
R: Well, I've been to closed motor industry
conferences where senior executives have
473
00:42:57,590 --> 00:43:02,990
been talking about just that in terms of
autonomous vehicles. So, yeah, it's
474
00:43:02,990 --> 00:43:09,860
something that we've disabused them of.
H: All right. So time is unfortunately up,
475
00:43:09,860 --> 00:43:14,570
but I think Ross will be available after
to talk as well for questions so you can
476
00:43:14,570 --> 00:43:19,300
meet him here on the side. Please give a
huge round of applause for Ross Anderson.
477
00:43:19,300 --> 00:43:20,780
applause
478
00:43:20,780 --> 00:43:24,211
R: Thanks. And thank you
for choosing the cover.
479
00:43:24,211 --> 00:43:26,381
36c3 postroll music
480
00:43:26,381 --> 00:43:52,000
Subtitles created by c3subtitles.de
in the year 2021. Join, and help us!