-
preroll music
-
Herald: Our next speaker is a professor of security engineering at Cambridge University. He is the author of the book Security Engineering. He has done a lot of things already. He has invented semi-invasive attacks based on inducing photocurrents. He has done API attacks. He has done a lot of stuff. If you read his bio, it feels like he's involved in almost everything we like related to security. So please give a huge round of applause and a warm welcome to Ross Anderson and his talk, The Sustainability of Safety, Security and Privacy.
-
applause
-
Ross Anderson: Thanks. Right. It's great to be here, and I'm going to tell a story that starts a few years ago, and it's about the regulation of safety. Just to set the scene, you may recall that in February this year this watch, the Enox Safe-KID-One, suddenly got recalled. And why? Well, it had unencrypted communications with the backend server allowing unauthenticated access, and translated into layman's language that meant that hackers could track and call your kids, change the device ID and do arbitrary bad things. So it was immediately recalled by the European Union using powers that it had under the Radio Equipment Directive. And this was a bit of a wake-up call for industry, because up until then, people active in the so-called Internet of Things didn't have any idea that, you know, if they produced an unsafe device, then they could suddenly be ordered to take it off the market. Anyway,
back in 2015, the European Union's
-
research department asked Eireann Leverett, Richard Clayton and me to examine what the Internet of Things implied for the regulation of safety, because the European institutions regulate all sorts of things, from toys to railway signals, and from cars through drugs to aircraft. And if you start having software in everything, does this mean that all these dozens of agencies suddenly start to need software safety experts and software security experts? So what does this mean in institutional terms? We produced a report for them in 2016, which the Commission sat on for a year. A version of the report came out in 2017, and later that year the full report. And the gist of our report was: once you get software everywhere, safety and security become entangled. And in fact, when you think about it, the two are the same word in pretty well all the languages spoken by EU citizens.
-
It's only English that distinguishes
between the two. And with
-
Britain leaving the EU, of course, you will only have languages in which safety and security are the same word, throughout Brussels and throughout the continent. But
-
anyway, how are we going to update safety
regulation in order to cope? This was the
-
problem that Brussels was trying to get
its head around. So one of the things that
-
we had been looking at over the past 15,
20 years is the economics of information
-
security, because often big complex systems fail because the incentives are wrong. If Alice guards the system and Bob pays the cost of failure, you can expect trouble. And many of these ideas carry across to safety as well. Now, it's already well
-
known that markets do safety in some
industries, such as aviation, way better
-
than others, such as medicine. And cars were dreadful for many years: for the first 80 years of the car industry, people didn't bother with things like seatbelts, and it was only when Ralph Nader's book, Unsafe at Any Speed, led the Americans to set up the National Highway Traffic Safety Administration, and various court cases brought this forcefully to public attention, that car safety started to become a thing. Now in
the EU, we've got a whole series of broad
-
frameworks and specific directives and
detailed rules, with overall some 20 EU agencies plus the UNECE in play here. So
how can we navigate this? Well, what we
-
were asked to do was to look at three
specific verticals and study them in some
-
detail so that the lessons from them could
be then taken to the other verticals in
-
which the EU operates. And, cars were one
of those. And some of you may remember the
-
CarShark paper in 2011. Four guys from San Diego and the University of Washington figured out how to hack a vehicle and control it remotely. I used to have a lovely little video of this that the researchers gave me, but my Mac got upgraded to Catalina last week and it doesn't play anymore. So, "verschlimmbessern", improving something for the worse, as you say in German, right? Yeah.
-
applause
-
Okay. We'll get it going sooner or later. Anyway, this was largely ignored, because one little video didn't make much of a splash. But in 2015, it suddenly came to the attention of the industry, because Charlie Miller and Chris Valasek, two guys who had been on the NSA's hacking team, hacked a Jeep Cherokee using Chrysler's Uconnect.
-
And this meant that they could go down
through all the Chrysler vehicles in
-
America and look at them one by one and
ask, where are you? And then when they
-
found the vehicle that was somewhere
interesting, they could go in and do
-
things to it. And what they found was that
to hack a vehicle, suddenly you just
-
needed the vehicle's IP address. And so
they got a journalist into a vehicle and
-
they got it to slow down, and had trucks behind them hooting away, and eventually
-
they ran the vehicle off the road. And
when the TV footage of this got out,
-
suddenly, people cared. It made the front
pages of the press in the USA, and
-
Chrysler had to recall 1.4 million
vehicles for a software fix, which meant
-
actually reflashing the firmware of the
devices. And it cost them billions and
-
billions of dollars. So all of a sudden,
this is something to which people paid
-
attention. Some of you may know this chap
here, at least by sight. This is Martin
-
Winterkorn, who used to run Volkswagen.
And when it turned out that he had hacked
-
millions and millions of Volkswagen
vehicles by putting in evil software that
-
defeated emissions controls. That's what
happened to Volkswagen stock price. Oh,
-
and he lost his job and got prosecuted. So
this is an important point about vehicles
-
and in fact, about many things in the Internet of Things, or Internet of Targets, whatever you want to call it. The threat model isn't just external, it is internal as well. There are bad people all the way up and down the supply chain. Even
-
at the OEM. So that's the state of play in
cars. And we investigated that and wrote a
-
bit about it. Now, here's medicine. This
was the second thing that we looked at.
-
These are some pictures of the scene in
the intensive care unit in Swansea
-
Hospital. So after your car gets hacked
and you go off the road, this is where you
-
end up. And just as a car has got about 50
computers in it, you're now going to see
-
that there's quite a few computers at your
bedside. How many CPUs can you see? You
-
see, there's quite a few, about a
comparable number to the number of CPUs in
-
your car. Only here the systems
integration is done by the nurse, not by
-
the engineers at Volkswagen or Mercedes.
And does this cause safety problems? Oh,
-
sure. Here are pictures of the user
interface of infusion pumps taken from
-
Swansea's intensive care unit. And as you
can see, they're all different. This is a
-
little bit like if you suddenly had to drive a car from the 1930s, an old Lanchester, for example, where you find that the accelerator is between the brake and the clutch, right? Honestly, there used to be such cars; you can still find them at antique car fairs. Or a Model T Ford, for example, where the accelerator is actually a lever on the dashboard and one of the pedals is a gear change. And yet
-
you're asking nurses to operate a variety
of different pieces of equipment and look,
-
for example, at the Bodyguard 545, the one on the top. To increase the dose (right, this is the morphine that is being dripped into your vein once you've had your car crash), you have to press 2, and to decrease it, you have to press 0. On the Bodyguard 545 at the bottom right, to increase the dose you press 5, and to decrease it, you press 0.
And this leads to accidents, to fatal
-
accidents, a significant number of them.
Okay. So you might say, well, why not have
-
standards? Well, we have standards. We've got standards which say that litre should always be a capital L, so it is not confused with a 1. And then you see that on the Bodyguard on the bottom right, milliliters is written with a capital L in green. Okay, well done, Mr. Bodyguard. The problem is, if you look up two lines, you see 500 milliliters in small letters. So there's a standards problem, there's an enforcement problem, and there are externalities, because each of these vendors will say, well, everybody else should standardize on my kit. And there are also
-
various other market failures. So the
expert who's been investigating this is my
-
friend Harold Thimbleby, who's a professor
of computer science at Swansea. And his
-
research shows that hospital safety usability failures kill about 2,000 people every year in the UK, which is about the same as road accidents. And safety usability, in other words, gets ignored because the incentives are wrong. In Britain, and indeed in the European institutions, people tend to follow the FDA in America, and that is captured by the large medical device makers over there. They only have two engineers, they're not allowed to play with pumps, etc., etc., etc.
-
The curious thing here is that safety and
security come together. The safety of
-
medical devices may improve because as
soon as it becomes possible to hack a
-
medical device, then people suddenly take
care. So the first of these was when Kevin Fu and researchers at the University of Michigan showed that they could hack the Hospira Symbiq infusion pump over Wi-Fi. And this led the FDA to immediately panic and blacklist the pump, recalling it from service. But then, said Kevin, what about the 200 other infusion pumps that are unsafe because of the things on the previous slide? Oh, said the FDA, we couldn't possibly recall all those. Then two years
-
ago, there's an even bigger recall. It
turned out that 450 000 pacemakers made by
-
St. Jude could similarly be hacked over
Wi-Fi. And so the recall was ordered. And
-
this is quite serious, because if you've
got a heart pacemaker, right, it's
-
implanted surgically in the muscle next to
your shoulder blade. And to remove that
-
and replace it with a new one, which they
do every 10 years to change the battery,
-
you know, is a day-case surgery procedure.
You have to go in there, get an
-
anesthetic. They have to have a
cardiologist ready in case you have a
-
heart attack. It's a big deal, right? It
costs maybe 3000 pounds in the UK. And so
-
3000 pounds times 450 000 pacemakers.
Multiply it by two for American health
-
care costs and you're talking real money.
So what should Europe do about this? Well,
-
thankfully, the European institutions have
been getting off their butts on this and
-
the medical device directives have been revised. And from next year, medical devices will have post-market surveillance, risk management plans,
-
ergonomic design. And here's perhaps the
driver for software engineering for
-
devices that incorporate software. The
software shall be developed in accordance
-
with the state of the art, taking into
account the principles of development,
-
life cycle risk management, including
information security, verification and
-
validation. So there at least we have a
foothold and it continues. Devices shall
-
be designed and manufactured in such a way
as to protect as far as possible against
-
unauthorized access that could hamper the
device from functioning as intended. Now
-
it's still not perfect. There's various
things that the manufacturers can do to
-
wriggle. But it's still a huge
improvement. The third thing that we
-
looked at was energy, electricity
substations and electrotechnical equipment in general; there have been one
or two talks at this conference on that.
-
Basically, the problem is that you've got
a 40 year life cycle for these devices.
-
Protocols such as Modbus and DNP3 don't
support authentication. And the fact that
-
everything has gone to IP networks means
that, as with the Chrysler Jeeps, anybody who knows a sensor's IP address can read from it, and with an actuator's IP address, you can
-
activate it. So the only practical fix
there is to re-perimeterise and the
-
entrepreneurs who noticed this 10 to 15
years ago and set up companies like Belden
-
have now made lots and lots of money.
Companies like BP now have thousands of
-
such firewalls which isolate their
chemical and other plants from the
-
internet. So one way in which you can deal
with this is having one component that
-
connects you to the network, you replace
it every five years. That's one way of getting, if you like, sustainable security
for your oil refinery. But this is a lot
-
harder for cars, which have got multiple
RF interfaces. A modern car has maybe 10 interfaces in all. There's the internal phone; there's the short-range radio link for remote keyless entry, those things.
There are links to the devices that
-
monitor your tire pressure. There's all
sorts of other things and every single one
-
of these has been exploited at least once.
And there are particular difficulties in
-
the auto industry because of the
fragmented responsibility in the supply
-
chain between the OEM, the tier ones and
the specialists who produce all the
-
various bits and pieces that get glued
together. Anyway, so the broad questions
-
that arise from this include who will
investigate incidents and to whom will
-
they be reported? Right? How do we embed
responsible disclosure? How do we bring
-
safety engineers and security engineers
together? This is an enormous project
-
because security engineers and safety
engineers use different languages. We have
-
different university degree programs. We
go to different conferences. And the world
-
of safety is similarly fragmented between
the power people, the car people, the
-
naval people, the signal people and so on
and so forth. Some companies are beginning
-
to get this together. The first is Bosch,
which put together their safety,
-
engineering and security engineering
professions. But even once you have done
-
that in organizational terms, how do you
teach a security engineer to think safety
-
and vice versa? Then the problem that
bothered the European Union, are the
-
regulators all going to need security
engineers? Right. I mean, many of these
-
organizations in Brussels don't even have
an engineer on staff, right? They are
-
mostly full of lawyers and policy people.
And then, of course, for this audience,
-
how do you prevent abuse of lock-in? You know, in America, if you've got a tractor from John Deere, then if you don't take it to a John Deere dealer every six months or so, it stops working. Right. And
if you try and hack it so you can fix it
-
yourself, then John Deere will try to get
you prosecuted. We just don't want that
-
kind of stuff coming over the Atlantic
into Europe. So we ended up with a number
-
of recommendations. We thought that we
would get vendors to self-certify for the
-
CE mark that products could be patched if
need be. That turned out to be not viable.
-
We then came up with another idea that
things should be secure by default for the
-
update to the Radio Equipment Directive.
And that didn't get through the European
-
Parliament either. In fact, it was Mozilla
that lobbied against it. Eventually we got
-
something through which I'll discuss in a
minute. We talked about requiring a secure
-
development lifecycle with vulnerability
management because we've already got
-
standards for that. We talked about creating a European security engineering agency, so that there would be people in Brussels to support policymakers. And the reaction to that, a year and a half ago, was to arrange for ENISA to be allowed to open an office in Brussels, so that they can hopefully build a capability there, with some technical people who can support policymakers. We recommended extending the
-
product liability directive to services.
There is enormous pushback on that.
-
Companies like Google and Facebook and so
on don't like the idea that they should be
-
as liable for mistakes made by Google
Maps, as for example, Garmin is liable for
-
mistakes made by its navigators. And then
there's the whole business of how do you
-
take the information that European
institutions already have on breaches and
-
vulnerabilities, and report this not just to ENISA, but also to the safety regulators and
-
users, because somehow you've got to
create a learning system. And this is
-
perhaps one of the big pieces of work to
be done. How do you take, I mean, once all
-
cars are sort of semi intelligent, once
everybody's got telemetry, and once there are, you know, gigabytes of data
everywhere, then whenever there's a car
-
crash, the data have to go to all sorts of
places, to the police, to the insurers, to
-
courts, and then, of course, up to the car
makers and regulators and component
-
suppliers and so on. How do you design the
system that will cause the right data to
-
get to the right place, which will still
respect people's privacy rights and all
-
the various other legal obligations? This
is a huge project and nobody has really
-
started to think yet about how it's going
to be done, right. At present, if you've
-
got a crash in a car like a Tesla, which
has got very good telemetry, you basically
-
have to take Tesla to court to get the
data because otherwise they won't hand it
-
over. Right. We need a better regime for
this. And that at present is a blank
-
slate. It's up to us, I suppose, to figure
out how such a system should be designed
-
and built, and it will take many years to
do it, right. If you want a safe system, a system that learns, this is what it's going to involve. But there's one thing that
-
struck us after we'd done this work, after
we delivered this to the European
-
Commission, that I'd gone to Brussels and
given a talk to dozens and dozens of
-
security guys. Richard Clayton and I went
to Schloss Dagstuhl for a weeklong seminar
-
on some other security topic. And we were
just chatting one evening and we said,
-
well, you know, what did we actually learn
from this whole exercise on
-
standardization and certification? Well,
it's basically this: there are two types of secure thing that we currently know how to make. The first is stuff like
-
your phone or your laptop, which is secure
because you patch it every month. Right.
-
But then you have to throw it away after
three years because Larry and Sergei don't
-
have enough money to maintain three
versions of Android. And then we've got
-
things like cars and medical devices where
we test them to death before release and
-
we don't connect them to the Internet, and
we almost never patch them, unless Charlie Miller and Chris Valasek get a go at your car, that is. So what's gonna happen
-
to support costs? Now that we're starting to patch cars, and you have to patch cars because they're online. And once something's online, right, anybody in the world can attack it. If a vulnerability is discovered, the attack can be scaled, and something that you could previously ignore suddenly becomes something that you have to fix. And if you have to pull all your cars into a garage to patch them, that costs
-
real money. So you need to be able to
patch them over the air. So all of a
-
sudden cars become like computers or
phones. So what is this going to mean? So
-
this is the trilemma. If you've got a
standard safety life cycle, there's no
-
patching. You get safety and
sustainability, but you can't go online
-
because you'll get hacked. And if you get
the standard security lifecycle you're
-
patching, but that breaks the safety
certification, so that's a problem. And if
-
you get patching plus redoing safety
certification with current methods, then
-
the cost of maintaining your safety rating
can be sky high. So here's the big
-
problem. How do you get safety, security
and sustainability at the same time? Now
-
this brings us to another thing that a
number of people at this congress are
-
interested in: the right to repair. This
is the Centennial Light, right? It's been
-
running since 1901. Right. It's in
Livermore in California. It's kind of dim,
-
but you can go there and you can see it.
Still there. In 1924, the three firms that dominated the lightbulb business, GE, Osram and Philips, agreed to reduce average bulb lifetime from some 2,500 hours to 1,000 hours. Why? In order to sell more of
-
them. And one of the things that's come
along with CPUs and communications and so
-
on with smart stuff, to use that horrible word, is that firms are now using online
-
mechanisms, software and cryptographic
mechanisms in order to make it hard or
-
even illegal to fix products. And I
believe that there's a case against Apple
-
going on in France about this. Now, you
might not think it's something that
-
politicians will get upset about, that you
have to throw away your phone after three
-
years instead of after five years. But
here's something you really should worry
-
about. Vehicle life cycle economics,
because the lifetimes of cars in Europe
-
have about doubled in the last 40 years.
And the average age of a car in Britain,
-
which is scrapped, is now almost 15 years.
So what's going to happen once you've got,
-
you know, wonderful self-driving software
in all the cars. Well, a number of big car
-
companies, including in this country, were
taking the view two years ago that they
-
wanted people to scrap their cars after
six years and buy a new one. Hey, makes
-
business sense, doesn't it? If you're Mr.
Mercedes, your business model is if the
-
customer is rich, you sell him a three
year lease on a new car. And if the
-
customer is not quite so rich, you sell
him a three year lease on a Mercedes
-
approved used car. And if somebody drives a
seven year old Mercedes, that's thought
-
crime. You know, they should emigrate to
Africa or something. So this was the view
-
of the vehicle makers. But here's the rub.
The embedded CO2 cost of a car often exceeds its lifetime fuel burn. My best estimate for the embedded CO2 cost of an E-Class Mercedes is 35 tons. So go and
work out, you know, how many liters per
-
100 kilometers and how many kilometers
it's gonna run in 15 years. And you come
-
to the conclusion that if you get a six
year lifetime, then maybe you are
-
decreasing the range of the car from 300
000 kilometers to 100 000 kilometers. And
-
so you're approximately doubling the
overall CO2 emissions. Taking the whole
-
life cycle, not just the scope one, but
the scope two, and the scope three, the
-
embedded stuff as well. And then there are
other consequences. What about Africa,
-
where most vehicles are imported second
hand? If you go to Nairobi, all the cars
-
are between 10 and 20 years old, right?
They arrive in the docks in Mombasa when
-
they're already 10 years old and people
drive them for 10 years and then they end
-
up in Uganda or Chad or somewhere like
that. And they're repaired for as long as
-
they're repairable. What's going to happen
to road transport in Africa if all of a
-
sudden there's a software time bomb that causes cars to self-destruct ten years after they leave the showroom? And if there
isn't, what about safety? I don't know
-
what the rules are here, but in Britain I
have to get my car through a safety
-
examination every year, once it's more
than three years old. And it's entirely
-
foreseeable that within two or three years
the mechanic will want to check that the
-
software is up to date. So once the
software update is no longer available,
-
that's basically saying this car must now
be exported or scrapped. I couldn't resist
-
the temptation to put in a cartoon:
"My engine's making a weird noise."
-
"Can you take a look?"
"Sure. Just pop the hood. Oh, the hood
-
latch is also broken. Okay, just pull up
to that big pit and push the car in. We'll
-
go get a new one."
Right? This is if we start treating cars
-
the way we treat consumer electronics. So
what's a reasonable design lifetime? Well,
-
with cars, the way it's going, maybe 18 years, say 10 years from the sale of the last product in a model range. Domestic appliances: 10 years because of the spares obligation, plus store life, say 15.
Medical devices: If a pacemaker lives for
-
10 years, then maybe you need 20 years. For electricity substations, even more. So
-
from the point of view of engineers, the
question is, how can you see to it that
-
your software will be patchable for 20
years? So as we put it in the abstract, if
-
you are writing software now for a car
that will go on sale in 2023, what sort of
-
languages, what sort of toolchain should
you use? What sort of crypto should you
-
use so that you're sure you'll still be
able to patch that software in 2043? And
-
that isn't just about the languages and
compilers and linkers and so on. That's
-
about the whole ecosystem. So what did the
EU do? Well, I'm pleased to say that at
-
the third attempt, the EU managed to get
some law through on this. Their Directive 771
-
this year on smart goods says that buyers
of goods with digital elements are
-
entitled to necessary updates for two
years or for a longer period of time if
-
this is a reasonable expectation of the
customer. This is what they managed to get
-
through the parliament. And what we would
expect is that this will mean at least 10
-
years for cars, ovens, fridges, air
conditioning and so on because of existing
-
provisions about physical spares. And
what's more, the trader has got the burden
-
of proof in the first couple of years if
there's disputes. So there is now the
-
legal framework there to create the demand
for long term patching of software. And
-
now it's kind of up to us. If the durable goods we're designing today are still
-
working in 2039, then a whole bunch of
things are gonna have to change. Computer
-
science has always been about managing
complexity ever since the very first high
-
level languages and the history goes on
from there through types and objects and
-
tools like git and Jenkins and Coverity.
So here's a question for the computer
-
scientists here. What else is going to be
needed for sustainable computing? Once we
-
have software in just about everything. So
research topics to support 20 year
-
patching include a more stable and
powerful toolchain. We know how complex
-
this can be from crypto, looking at the history of the last 20 years of TLS. Cars teach that it's difficult and expensive to sustain all the different test environments you have for different models of cars. Control systems teach that you can make small changes to the architecture which will then limit what you have to patch. Android teaches us to ask how you go about motivating OEMs to patch
-
products that they no longer sell. In this
case, it's European law, but there's maybe
-
other things you can do too. What does it
mean for those of us who teach and
-
research in universities? Well, since
2016, I've been teaching safety and
-
security together in the same course to first-year undergraduates, because presenting these ideas together in lockstep will help people to think in more unified terms about how it all holds together. In research terms, we have
-
been starting to look at what we can do to
make the tool chain more sustainable. For
-
example, one of the problems that you have
if you maintain crypto software is that
-
every so often the compiler writers get a little bit smarter, and the compiler figures out that those extra padding instructions that you put in to make the loops of your crypto routines run in constant time, and to scrub the contents of round keys once they are no longer in use, are not doing any real work, and it removes them. And all of a sudden, from one day to the next, you find that your crypto has sprung a huge big timing leak, and then you have to rush to get somebody out of bed to fix the toolchain. So one of the things that we thought was that better ways for programmers to communicate intent might help. And so there's a paper by Laurent Simon, David Chisnall and me, where we looked at zeroising sensitive variables and doing constant-time loops with a plugin in LLVM. And that led to a EuroS&P paper a year and a half ago, "What you get is what you C", and there's a plugin that you can download and play with. Macro-scale sustainable security is
-
going to require a lot more. Despite the
problems in the aerospace industry with the 737 MAX, that industry still has
got a better feedback loop of learning
-
from incidents and accidents. And we don't
have that yet in any of the fields like
-
cars and so on. It's going to be needed.
What can we use as a guide? Security
-
economics is one set of intellectual tools
that can be applied. We've known for
-
almost 20 years now that complex socio-
technical systems often fail because of
-
poor incentives. If Alice guards a system
and Bob pays the cost of failure, you can
-
expect trouble. And so security economics
researchers can explain platform security
-
problems, patching cycles, liability games
and so on. And the same principles apply
-
to safety and will become even more
important as safety and security become
-
entangled. Also, we'll get even more data
and we'll be able to do more research and
-
get more insights from the data. So where
does this lead? Well, our paper "Making Security Sustainable", and the thing that we did for the EU, "Standardisation and Certification of the Internet of Things",
are on my web page together with other
-
relevant papers on topics around
sustainability from, you know, smart
-
metering to pushing back on wildlife
crime. And that's the first place to go if
-
you're interested in this stuff. And
there's also our blog. And if you're
-
interested in these kinds of issues at the
interface between technology and policy of
-
how incentives work and how they very
often fail when it comes to complex socio-
-
technical systems, then the Workshop on the Economics of Information Security, in Brussels next June, is the place where
academics interested in these topics tend
-
to meet up. So perhaps we'll see a few of
you there in June. And with that, there's
-
a book on security engineering which goes
over some of these things and there's a
-
third edition in the pipeline.
-
H: Thank you very much,
Ross Anderson, for the talk.
-
applause
-
We will start the Q&A session a little bit differently than you are used to. Ross has a
-
question to you. So he told me there will
be a third edition of his book and he is
-
not yet sure about the cover he wants to
have. So you are going to choose. And so
-
that the people on the stream also can
hear your choice, I would like you to make
-
a humming noise for the cover which you
like more. You will first see both covers.
-
R: Cover 1, and cover 2.
H: So, who of you would prefer the
-
first cover?
applause Come on.
-
And the second choice. louder applause
OK. I think we have a clear favorite here
-
from the audience, so it would
be the second cover.
-
R: Thanks.
H: And we will look forward to seeing this
-
cover next year then. So if you now have
questions yourself, you can line up in
-
front of the microphones. You will find
eight distributed in the hall, three in
-
the middle, two on the sides. Signal Angel
has the first question from the Internet.
-
Person1: The first question is, is there a
reason why you didn't include aviation
-
into your research?
R: We were asked to choose three fields,
-
and the three fields I chose were the ones in which we'd worked most recently. I did some work in avionics, but that was 40 years ago, so I'm no longer current.
-
H: Alright, a question from microphone
number two, please.
-
Person2: Hi. Thanks for your talk. What
I'm wondering most about is where do you
-
believe the balance will fall in the fight
between privacy, the want of the
-
manufacturer to prove that it wasn't their
fault and the right to repair?
-
R: Well, this is an immensely complex
question and it's one that we'll be
-
fighting about for the next 20 years. But
all I can suggest is that we study the
-
problems in detail, that we collect the
data that we need to say coherent things
-
to policymakers and that we use the
intellectual tools that we have, such as
-
the economics of security in order to
inform these arguments. That's the best
-
way that we can fight these fights, you
know, by being clearheaded and by being
-
informed.
H: Thank you. A question from microphone
-
number four, please. Can you switch on the
microphone number four.
-
Person3: Oh, sorry. Hello. Thank you for
the talk. As a software engineer, arguably
-
I can cause much more damage than a single
medical professional simply because of the
-
multiplication of my work. Why is it that
there is still no conversation about
-
software engineers carrying liability
insurance and being held liable for the
-
work they do?
R: Well, that again is a complex question.
-
And there are some countries like Canada
where being a professional engineer gives
-
you a particular status. I think it's
cultural as much as anything else, because
-
our trade has always been freewheeling,
it's always been growing very quickly. And
-
throughout my lifetime it's been sucking
up a fair proportion of science graduates.
-
If you were to restrict software
engineering to people with degrees in
-
computer science, then we would have an
awful lot fewer people. I wouldn't be
-
here, for example, because my first
degree was in pure math.
-
H: All right, the question from microphone
number one, please.
-
Person4: Hi. Thank you for the talk. My
question is also about aviation, because
-
as I understand it, a lot of retired
aircraft and other equipment is
-
dumped into the so-called developing
countries. And with the modern technology
-
and the modern aircraft, the issue of
maintaining or updating software would
-
still be in question. But how do we see
that rolling out also for the so-called
-
third world countries? Because I am a
Pakistani journalist, but this worries me
-
a lot because we get so many devices
dumped into Pakistan after they're retired
-
and people just use them. I mean, it's a
country that cannot even afford a license
-
for an operating system. So maybe you could
shed a light on that. Thank you.
-
R: Well, there are some positive things
that can be done. Development IT is
-
something in which we are engaged. You can
find the details on my website, but good
-
things don't necessarily have to involve
IT. One of my school friends became an
-
anesthetist and after he retired, he
devoted his energies to developing an
-
infusion pump for use in less developed
countries, which was very much cheaper
-
than the ones that we saw on the screen
there. And it's also safe, rugged,
-
reliable and designed for use in
places like Pakistan and Africa and South
-
America. So the appropriate technology
doesn't always have to be the whizziest,
-
right. And if you've got very bad roads,
as in India, in Africa, and relatively
-
cheap labor, then perhaps autonomous
cars should not be a priority.
-
Person4: Thank you.
H: All right. We have another question
-
from the Internet, the Signal Angel, please?
Person5: Why force updates by law?
-
Wouldn't it be better to prohibit the
important things from accessing the
-
Internet by law?
R: Well, politics is the art of the
-
possible. And you can only realistically
talk about a certain number of things at
-
any one time in any political culture,
the so-called Overton window. Now, if
-
you talked about banning technology,
banning cars that are connected to the
-
Internet as a minister, you will be
immediately shouted out of office as being
-
a Luddite, right. So it's just not
possible to go down that path. What is
-
possible is to go down the path of saying,
look, if you've got a company that imports
-
lots of dangerous toys that harm kids or
dangerous CCTV cameras that are recruited into
-
a botnet, and if you don't meet European
regulations, we'll put the containers on
-
the boat back to China. That's just
something that can be sold politically.
-
And given the weakness of the car industry
after the emission standard scandal, it
-
was possible for Brussels to push through
something that the car industry really
-
didn't like. And even then, that
was the third attempt to do something
-
about it. So, again, it's what you can
practically achieve in real-world politics.
H: All right. We have more questions.
Microphone number four, please.
-
Person6: Hi, I'm an automotive cyber security
analyst and embedded software engineer.
-
I'm part of the ISO 21434 Automotive
Cyber Security Standard. Are you aware of
-
the standard that's coming
out next year? Hopefully.
-
R: I've not done any significant work with
it. Friends in the motor industry have
-
talked about it, but it's not something
we've engaged with in detail.
-
Person6: So I guess my point is not so
much a question, but a little bit of a
-
pushback, because a lot of the things you
talked about are being worked on and have
-
been considered over the years. Updating
is going to be mandated. A 30 to 40
-
year lifecycle of the vehicle is being
considered by engineers. Why not? Nobody I
-
know talks about a six-year lifecycle; you
know, that's back in the 80s,
-
maybe, when we talked about planned
obsolescence. But that's just not a thing.
-
So I'm not really sure where that language
is coming from, to be honest with you.
-
R: Well, I've been to closed motor industry
conferences where senior executives have
-
been talking about just that in terms of
autonomous vehicles. So, yeah, it's
-
something that we've disabused them of.
H: All right. So time is unfortunately up,
-
but I think Ross will be available after
the talk as well for questions, so you can
-
meet him here on the side. Please give a
huge round of applause for Ross Anderson.
-
applause
-
R: Thanks. And thank you
for choosing the cover.
-
36c3 postroll music
-
Subtitles created by c3subtitles.de
in the year 2021. Join, and help us!