< Return to Video

36C3 - The sustainability of safety, security and privacy

  • 0:00 - 0:19
    preroll music
  • 0:19 - 0:25
    Herald: Our next speaker, he's a professor
    of security engineering at Cambridge
  • 0:25 - 0:31
    University. He is the author of the book
    Security Engineering. He has done a lot of
  • 0:31 - 0:40
    things already. He invented semi-invasive
    attacks based on inducing photocurrents.
  • 0:40 - 0:46
    He has done API attacks. He
    has done a lot of stuff. If you read his
  • 0:46 - 0:51
    bio, it feels like he's involved in
    almost everything we like related to
  • 0:51 - 0:57
    security. So please give a huge round and
    a warm welcome to Ross Anderson and his
  • 0:57 - 1:01
    talk, The Sustainability of safety,
    security and privacy.
  • 1:01 - 1:03
    applause
  • 1:03 - 1:16
    Ross Anderson: Thanks. Right. It's great
    to be here, and I'm going to tell a story
  • 1:16 - 1:24
    that starts a few years ago and it's about
    the regulation of safety. Just to set the
  • 1:24 - 1:31
    scene, you may recall that in February
    this year this watch, the ENOX Safe-KID-One,
  • 1:31 - 1:38
    suddenly got recalled. And why? Well, it
    had unencrypted communications with the
  • 1:38 - 1:43
    backend server, allowing unauthenticated
    access. Translated into layman's language,
  • 1:43 - 1:47
    that meant that hackers could track and
  • 1:47 - 1:52
    call your kids, change the device ID and do
    arbitrary bad things. So this was
  • 1:52 - 1:57
    immediately recalled by the European Union
    using powers that it had under the Radio
  • 1:57 - 2:02
    Equipment Directive. And this was a bit of
    a wake up call for industry, because up
  • 2:02 - 2:08
    until then, people active in the so-called
    Internet of Things didn't have any idea
  • 2:08 - 2:11
    that, you know, if they produced an unsafe
    device, then they could suddenly be
  • 2:11 - 2:20
    ordered to take it off the market. Anyway,
    back in 2015, the European Union's
  • 2:20 - 2:26
    research department asked Eireann Leverett,
    Richard Clayton and me to examine what the
  • 2:26 - 2:32
    Internet of Things would imply for the regulation of
    safety, because the European institutions
  • 2:32 - 2:37
    regulate all sorts of things, from toys to
    railway signals and from cars through
  • 2:37 - 2:41
    drugs to aircraft. And if you start having
    software and everything, does this mean
  • 2:41 - 2:46
    that all these dozens of agencies suddenly
    start to have software safety experts and
  • 2:46 - 2:52
    software security experts? So what does
    this mean in institutional terms? We
  • 2:52 - 2:58
    produced a report for them in 2016, which
    the commission sat on for a year. A
  • 2:58 - 3:03
    version of the report came out in 2017 and
    later that year the full report. And the
  • 3:03 - 3:07
    gist of our report was once you get
    software everywhere, safety and security
  • 3:07 - 3:13
    become entangled. And in fact, when you
    think about it, the two are the same in
  • 3:13 - 3:19
    pretty well all the languages spoken by EU
    citizens.
  • 3:19 - 3:23
    It's only English that distinguishes
    between the two. And with
  • 3:23 - 3:28
    Britain leaving the EU, of course you will
    only have languages in which safety and
  • 3:28 - 3:34
    security are the same word, throughout
    Brussels and throughout the continent. But
  • 3:34 - 3:38
    anyway, how are we going to update safety
    regulation in order to cope? This was the
  • 3:38 - 3:44
    problem that Brussels was trying to get
    its head around. So one of the things that
  • 3:44 - 3:51
    we had been looking at over the past 15,
    20 years is the economics of information
  • 3:51 - 3:56
    security, because often big complex
    systems fail because the incentives are
  • 3:56 - 4:02
    wrong. If Alice guards the system and Bob
    bears the cost of failure, you can expect
  • 4:02 - 4:08
    trouble. And many of these ideas carry
    across to safety as well. Now, it's already well
  • 4:08 - 4:13
    known that markets do safety in some
    industries, such as aviation, way better
  • 4:13 - 4:19
    than others, such as medicine. And cars
    were dreadful: for the first
  • 4:19 - 4:23
    80 years of the car industry. People
    didn't bother with things like seatbelts,
  • 4:23 - 4:29
    and it was only until Ralph Nader's book,
    Unsafe at Any Speed, led the Americans to
  • 4:29 - 4:33
    set up the National Highways,
    Transportation and Safety Administration
  • 4:33 - 4:37
    and various court cases brought this
    forcefully to public attention that car
  • 4:37 - 4:43
    safety started to become a thing. Now in
    the EU, we've got a whole series of broad
  • 4:43 - 4:49
    frameworks and specific directives and
    detailed rules, and thus overall 20 EU
  • 4:49 - 4:55
    agencies plus the UNECE in play here. So
    how can we navigate this? Well, what we
  • 4:55 - 5:00
    were asked to do was to look at three
    specific verticals and study them in some
  • 5:00 - 5:07
    detail so that the lessons from them could
    be then taken to the other verticals in
  • 5:07 - 5:18
    which the EU operates. And, cars were one
    of those. And some of you may remember the
  • 5:18 - 5:27
    CarShark paper in 2011. Four guys from
    San Diego and the University of Washington
  • 5:27 - 5:31
    figured out how to hack a vehicle and
    control it remotely. And I used to have a
  • 5:31 - 5:34
    lovely little video of this that the
    researchers gave me. But my Mac got
  • 5:34 - 5:41
    upgraded to Catalina last week and it
    doesn't play anymore. So, "verschlimmbessern",
  • 5:41 - 5:44
    as they say in German: improving something for the worse, right?
    Yeah.
  • 5:44 - 5:49
    applause
  • 5:49 - 5:54
    Okay. We'll get it going sooner or later.
    Anyway, this was largely ignored because
  • 5:54 - 6:00
    one little video didn't make much of a splash.
    But in 2015, this suddenly came to the
  • 6:00 - 6:05
    attention of the industry, because Charlie
    Miller and Chris Valasek, two guys who had
  • 6:05 - 6:11
    been on the NSA's hacking team, hacked a
    Jeep Cherokee using Chrysler's Uconnect.
  • 6:11 - 6:14
    And this meant that they could go down
    through all the Chrysler vehicles in
  • 6:14 - 6:19
    America and look at them one by one and
    ask, where are you? And then when they
  • 6:19 - 6:22
    found the vehicle that was somewhere
    interesting, they could go in and do
  • 6:22 - 6:27
    things to it. And what they found was that
    to hack a vehicle, suddenly you just
  • 6:27 - 6:35
    needed the vehicle's IP address. And so
    they got a journalist into a vehicle and
  • 6:35 - 6:39
    made it slow down, with trucks
    behind him hooting away, and eventually
  • 6:39 - 6:43
    they ran the vehicle off the road. And
    when the TV footage of this got out,
  • 6:43 - 6:48
    suddenly, people cared. It made the front
    pages of the press in the USA, and
  • 6:48 - 6:52
    Chrysler had to recall 1.4 million
    vehicles for a software fix, which meant
  • 6:52 - 6:58
    actually reflashing the firmware of the
    devices. And it cost them billions and
  • 6:58 - 7:02
    billions of dollars. So all of a sudden,
    this is something to which people paid
  • 7:02 - 7:11
    attention. Some of you may know this chap
    here, at least by sight. This is Martin
  • 7:11 - 7:16
    Winterkorn, who used to run Volkswagen.
    And when it turned out that he had hacked
  • 7:16 - 7:20
    millions and millions of Volkswagen
    vehicles by putting in evil software that
  • 7:20 - 7:27
    defeated emissions controls. That's what
    happened to Volkswagen's stock price. Oh,
  • 7:27 - 7:34
    and he lost his job and got prosecuted. So
    this is an important point about vehicles
  • 7:34 - 7:38
    and in fact, about many things in the
    Internet of Things or Internet of
  • 7:38 - 7:42
    Targets, whatever you want to call it. The
    threat model isn't just external, it is
  • 7:42 - 7:47
    internal as well. There are bad people all
    the way up and down the supply chain. Even
  • 7:47 - 7:55
    at the OEM. So that's the state of play in
    cars. And we investigated that and wrote a
  • 7:55 - 8:04
    bit about it. Now, here's medicine. This
    was the second thing that we looked at.
  • 8:04 - 8:09
    These are some pictures of the scene in
    the intensive care unit in Swansea
  • 8:09 - 8:13
    Hospital. So after your car gets hacked
    and you go off the road, this is where you
  • 8:13 - 8:20
    end up. And just as a car has got about 50
    computers in it, you're now going to see
  • 8:20 - 8:34
    that there's quite a few computers at your
    bedside. How many CPUs can you see? You
  • 8:34 - 8:40
    see, there's quite a few, about a
    comparable number to the number of CPUs in
  • 8:40 - 8:47
    your car. Only here the systems
    integration is done by the nurse, not by
  • 8:47 - 8:56
    the engineers at Volkswagen or Mercedes.
    And does this cause safety problems? Oh,
  • 8:56 - 9:07
    sure. Here are pictures of the user
    interface of infusion pumps taken from
  • 9:07 - 9:14
    Swansea's intensive care unit. And as you
    can see, they're all different. This is a
  • 9:14 - 9:18
    little bit like if you suddenly had to
    drive a car from the 1930s, an old
  • 9:18 - 9:22
    Lanchester, for example, and then you find
    that the accelerator is between the brake
  • 9:22 - 9:27
    and the clutch, right? Honestly, there
    used to be such cars. You can still find
  • 9:27 - 9:33
    them at antique car fairs. Or a Model T
    Ford, for example, where the accelerator is
  • 9:33 - 9:39
    actually a lever on the dashboard and one
    of the pedals is the gear change. And yet
  • 9:39 - 9:44
    you're asking nurses to operate a variety
    of different pieces of equipment and look,
  • 9:44 - 9:51
    for example, at the Bodyguard 545, the one
    at the top, to increase the dose. Right,
  • 9:51 - 9:55
    this is the morphine that is being dripped
    into your vein once you've had your car
  • 9:55 - 9:59
    crash, to increase the dose you have to
    press 2 and to decrease it, you have to
  • 9:59 - 10:07
    press 0. On the Bodyguard 545 at the
    bottom right, to increase the dose you
  • 10:07 - 10:14
    press 5 and to decrease it, you press 0.
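The keypad inconsistency just described can be made concrete with a tiny sketch. The key assignments (2/0 on one pump, 5/0 on the other) are the ones quoted in the talk; the device labels and the dictionary structure are purely illustrative:

```python
# Key-to-action maps as quoted above: on the pump at the top you press 2
# to increase the dose and 0 to decrease it; on the one at the bottom
# right you press 5 to increase and 0 to decrease.
PUMP_TOP = {"2": "increase dose", "0": "decrease dose"}
PUMP_BOTTOM = {"5": "increase dose", "0": "decrease dose"}

def press(pump, key):
    # On a real pump an unmapped key may beep or do nothing; here we just
    # surface the mismatch.
    return pump.get(key, "no effect: operator confusion")

# A nurse whose habits were trained on the top pump moves to the bottom one:
print(press(PUMP_TOP, "2"))     # increase dose
print(press(PUMP_BOTTOM, "2"))  # no effect: operator confusion
```

The point is not the code but the interface economics: each vendor's mapping is internally consistent, so the hazard only appears when one operator has to integrate several devices at the bedside.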
    And this leads to accidents, to fatal
  • 10:14 - 10:21
    accidents, a significant number of them.
    Okay. So you might say, well, why not have
  • 10:21 - 10:26
    standards? Well, we have standards. We've
    got standards which say that liter should
  • 10:26 - 10:31
    always be a capital L, so it is not
    confused with a one. And then you see that
  • 10:31 - 10:38
    and the Bodyguard on the bottom right.
    MILLILITERS is a capital L in green. Okay.
  • 10:38 - 10:43
    Well done, Mr. Bodyguard. The problem is,
    if you look up two lines, you see 500
  • 10:43 - 10:49
    milliliters is in small letters. So
    there's a standard problem. There's an
  • 10:49 - 10:54
    enforcement problem and there's extra
    inanities because each of these vendors
  • 10:54 - 10:58
    will say, well, everybody else should
    standardize on my kit. And there are also
  • 10:58 - 11:05
    various other market failures. So the
    expert who's been investigating this is my
  • 11:05 - 11:10
    friend Harold Thimbleby, who's a professor
    of computer science at Swansea. And his
  • 11:10 - 11:15
    research shows that hospital safety
    usability failures kill about 2,000 people
  • 11:15 - 11:22
    every year in the UK, which is about the
    same as road accidents. And safety
  • 11:22 - 11:30
    usability, in other words, gets ignored
    because the incentives are wrong. In
  • 11:30 - 11:33
    Britain and indeed in the European
    institutions, people tend to follow the
  • 11:33 - 11:39
    FDA in America and that is captured by the
    large medical device makers over there.
  • 11:39 - 11:45
    They only have two engineers. They're not
    allowed to play with pumps, etc., etc.
  • 11:45 - 11:50
    The curious thing here is that safety and
    security come together. The safety of
  • 11:50 - 11:55
    medical devices may improve because as
    soon as it becomes possible to hack a
  • 11:55 - 12:03
    medical device, then people suddenly take
    care. So the first instance of this was when Kevin
  • 12:03 - 12:07
    Fu and researchers at the University of
    Michigan showed that they could hack the
  • 12:07 - 12:12
    Hospira Symbiq infusion pump over
    Wi-Fi. And this led the FDA to immediately
  • 12:12 - 12:17
    panic and blacklist the pump, recalling it
    from service. But then Kevin said, what
  • 12:17 - 12:21
    about the 200 other infusion pumps that
    are unsafe because of the things on the
  • 12:21 - 12:28
    previous slide? And the FDA said, we couldn't
    possibly recall all those. Then two years
  • 12:28 - 12:33
    ago, there was an even bigger recall. It
    turned out that 450 000 pacemakers made by
  • 12:33 - 12:39
    St. Jude could similarly be hacked over
    Wi-Fi. And so the recall was ordered. And
  • 12:39 - 12:43
    this is quite serious, because if you've
    got a heart pacemaker, right, it's
  • 12:43 - 12:48
    implanted surgically in the muscle next to
    your shoulder blade. And to remove that
  • 12:48 - 12:52
    and replace it with a new one, which they
    do every 10 years to change the battery,
  • 12:52 - 12:55
    you know, is a day-case surgery procedure.
    You have to go in there, get an
  • 12:55 - 12:58
    anesthetic. They have to have a
    cardiologist ready in case you have a
  • 12:58 - 13:05
    heart attack. It's a big deal, right? It
    costs maybe 3000 pounds in the UK. And so
  • 13:05 - 13:11
    3000 pounds times 450 000 pacemakers.
    Multiply it by two for American health
  • 13:11 - 13:19
    care costs and you're talking real money.
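The recall arithmetic just mentioned works out roughly as follows, using only the figures given in the talk (the doubling for American costs is the talk's own rule of thumb):

```python
# Rough cost of surgically replacing every recalled pacemaker.
COST_PER_REPLACEMENT_GBP = 3_000   # day-surgery procedure cost in the UK
PACEMAKERS_RECALLED = 450_000      # St. Jude devices
US_COST_MULTIPLIER = 2             # the talk's rule of thumb for US health care

uk_basis_gbp = COST_PER_REPLACEMENT_GBP * PACEMAKERS_RECALLED
print(f"at UK costs: £{uk_basis_gbp / 1e9:.2f} bn")                       # £1.35 bn
print(f"at US costs: £{uk_basis_gbp * US_COST_MULTIPLIER / 1e9:.2f} bn")  # £2.70 bn
```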
    So what should Europe do about this? Well,
  • 13:19 - 13:23
    thankfully, the European institutions have
    been getting off their butts on this and
  • 13:23 - 13:28
    the medical device directives have been
    revised. And from next year, medical
  • 13:28 - 13:31
    devices will have post-market
    surveillance, a risk management plan and
  • 13:31 - 13:37
    ergonomic design. And here's perhaps the
    driver for software engineering for
  • 13:37 - 13:42
    devices that incorporate software. The
    software shall be developed in accordance
  • 13:42 - 13:46
    with the state of the art, taking into
    account the principles of development,
  • 13:46 - 13:51
    life cycle risk management, including
    information security, verification and
  • 13:51 - 13:57
    validation. So there at least we have a
    foothold and it continues. Devices shall
  • 13:57 - 14:02
    be designed and manufactured in such a way
    as to protect as far as possible against
  • 14:02 - 14:07
    unauthorized access that could hamper the
    device from functioning as intended. Now
  • 14:07 - 14:11
    it's still not perfect. There's various
    things that the manufacturers can do to
  • 14:11 - 14:17
    wriggle. But it's still a huge
    improvement. The third thing that we
  • 14:17 - 14:21
    looked at was energy, electricity
    substations and electrotechnical
  • 14:21 - 14:26
    equipment in general; there have been one
    or two talks at this conference on that.
  • 14:26 - 14:30
    Basically, the problem is that you've got
    a 40 year life cycle for these devices.
  • 14:30 - 14:36
    Protocols such as Modbus and DNP3 don't
    support authentication. And the fact that
  • 14:36 - 14:41
    everything has gone to IP networks means
    that, as with the Chrysler Jeeps, anybody
  • 14:41 - 14:46
    who knows a sensor's IP address can read from it,
    and with an actuator's IP address, you can
  • 14:46 - 14:51
    activate it. So the only practical fix
    there is to re-perimeterise and the
  • 14:51 - 14:56
    entrepreneurs who noticed this 10 to 15
    years ago and set up companies like Belden
  • 14:56 - 15:01
    have now made lots and lots of money.
    Companies like BP now have thousands of
  • 15:01 - 15:06
    such firewalls which isolate their
    chemical and other plants from the
  • 15:06 - 15:11
    internet. So one way in which you can deal
    with this is having one component that
  • 15:11 - 15:15
    connects you to the network, you replace
    it every five years. That's one way of
  • 15:15 - 15:20
    doing, if you'd like sustainable security
    for your oil refinery. But this is a lot
  • 15:20 - 15:25
    harder for cars, which have got multiple
    RF interfaces. A modern car has maybe 10
  • 15:25 - 15:32
    interfaces in all. Among those there is the
    internal phone. There's the short range radio
  • 15:32 - 15:37
    link for remote keyless entry, those key-fob things.
    There are links to the devices that
  • 15:37 - 15:41
    monitor your tire pressure. There's all
    sorts of other things and every single one
  • 15:41 - 15:48
    of these has been exploited at least once.
    And there are particular difficulties in
  • 15:48 - 15:53
    the auto industry because of the
    fragmented responsibility in the supply
  • 15:53 - 15:58
    chain between the OEM, the tier ones and
    the specialists who produce all the
  • 15:58 - 16:03
    various bits and pieces that get glued
    together. Anyway, so the broad questions
  • 16:03 - 16:08
    that arise from this include who will
    investigate incidents and to whom will
  • 16:08 - 16:16
    they be reported? Right? How do we embed
    responsible disclosure? How do we bring
  • 16:16 - 16:22
    safety engineers and security engineers
    together? This is an enormous project
  • 16:22 - 16:26
    because security engineers and safety
    engineers use different languages. We have
  • 16:26 - 16:31
    different university degree programs. We
    go to different conferences. And the world
  • 16:31 - 16:35
    of safety is similarly fragmented between
    the power people, the car people, the
  • 16:35 - 16:41
    naval people, the signal people and so on
    and so forth. Some companies are beginning
  • 16:41 - 16:45
    to get this together. The first is Bosch,
    which put together their safety,
  • 16:45 - 16:49
    engineering and security engineering
    professions. But even once you have done
  • 16:49 - 16:54
    that in organizational terms, how do you
    teach a security engineer to think safety
  • 16:54 - 16:59
    and vice versa? Then the problem that
    bothered the European Union, are the
  • 16:59 - 17:04
    regulators all going to need security
    engineers? Right. I mean, many of these
  • 17:04 - 17:10
    organizations in Brussels don't even have
    an engineer on staff, right? They are
  • 17:10 - 17:16
    mostly full of lawyers and policy people.
    And then, of course, for this audience,
  • 17:16 - 17:21
    how do you prevent abuse of lock-in, you
    know, in America if you've got a tractor
  • 17:21 - 17:25
    from John Deere, then if you don't
    take it to a John Deere dealer every six
  • 17:25 - 17:30
    months or so, it stops working. Right. And
    if you try and hack it so you can fix it
  • 17:30 - 17:35
    yourself, then John Deere will try to get
    you prosecuted. We just don't want that
  • 17:35 - 17:41
    kind of stuff coming over the Atlantic
    into Europe. So we ended up with a number
  • 17:41 - 17:47
    of recommendations. We thought that we
    would get vendors to self-certify for the
  • 17:47 - 17:52
    CE mark that products could be patched if
    need be. That turned out to be not viable.
  • 17:52 - 17:57
    We then came up with another idea that
    things should be secure by default for the
  • 17:57 - 18:01
    update to the Radio Equipment Directive.
    And that didn't get through the European
  • 18:01 - 18:07
    Parliament either. In fact, it was Mozilla
    that lobbied against it. Eventually we got
  • 18:07 - 18:12
    something through which I'll discuss in a
    minute. We talked about requiring a secure
  • 18:12 - 18:15
    development lifecycle with vulnerability
    management because we've already got
  • 18:15 - 18:21
    standards for that. We talked about
    creating a European security engineering
  • 18:21 - 18:26
    agency, so that there would be people in
    Brussels to support policymakers. And the
  • 18:21 - 18:26
    reaction to that, a year and a half ago,
    was to arrange for ENISA to be allowed to
  • 18:26 - 18:31
    open an office in Brussels so that they
    can hopefully build a capability there
  • 18:31 - 18:35
    with some technical people who can support
    policymakers. We recommended extending the
  • 18:40 - 18:46
    product liability directive to services.
    There is enormous pushback on that.
  • 18:46 - 18:50
    Companies like Google and Facebook and so
    on don't like the idea that they should be
  • 18:50 - 18:56
    as liable for mistakes made by Google
    Maps, as for example, Garmin is liable for
  • 18:56 - 19:01
    mistakes made by its navigators. And then
    there's the whole business of how do you
  • 19:01 - 19:05
    take the information that European
    institutions already have on breaches and
  • 19:05 - 19:10
    vulnerabilities and report this not just
    to ENISA, but the safety regulators and
  • 19:10 - 19:14
    users, because somehow you've got to
    create a learning system. And this is
  • 19:14 - 19:19
    perhaps one of the big pieces of work to
    be done. How do you take, I mean, once all
  • 19:19 - 19:24
    cars are sort of semi intelligent, once
    everybody's got telemetry and once that
  • 19:24 - 19:28
    are, you know, gigabytes of data
    everywhere, then whenever there's a car
  • 19:28 - 19:34
    crash, the data have to go to all sorts of
    places, to the police, to the insurers, to
  • 19:34 - 19:40
    courts, and then, of course, up to the car
    makers and regulators and component
  • 19:40 - 19:45
    suppliers and so on. How do you design the
    system that will cause the right data to
  • 19:45 - 19:50
    get to the right place, which will still
    respect people's privacy rights and all
  • 19:50 - 19:55
    the various other legal obligations? This
    is a huge project and nobody has really
  • 19:55 - 20:00
    started to think yet about how it's going
    to be done, right. At present, if you've
  • 20:00 - 20:04
    got a crash in a car like a Tesla, which
    has got very good telemetry, you basically
  • 20:04 - 20:07
    have to take Tesla to court to get the
    data because otherwise they won't hand it
  • 20:07 - 20:13
    over. Right. We need a better regime for
    this. And that at present is a blank
  • 20:13 - 20:19
    slate. It's up to us, I suppose, to figure
    out how such a system should be designed
  • 20:19 - 20:24
    and built, and it will take many years to
    do it, right. If you want a safe system, a
  • 20:24 - 20:33
    system that learns, this is what it is going
    to involve. But there's one thing that
  • 20:33 - 20:38
    struck us after we'd done this work, after
    we delivered this to the European
  • 20:38 - 20:42
    Commission, that I'd gone to Brussels and
    given a talk to dozens and dozens of
  • 20:42 - 20:49
    security guys. Richard Clayton and I went
    to Schloss Dagstuhl for a weeklong seminar
  • 20:49 - 20:53
    on some other security topic. And we were
    just chatting one evening and we said,
  • 20:53 - 21:00
    well, you know, what did we actually learn
    from this whole exercise on
  • 21:00 - 21:07
    standardization and certification? Well,
    it's basically this: there are two
  • 21:07 - 21:13
    types of secure things that we currently
    know how to make. The first is stuff like
  • 21:13 - 21:18
    your phone or your laptop, which is secure
    because you patch it every month. Right.
  • 21:18 - 21:22
    But then you have to throw it away after
    three years because Larry and Sergey don't
  • 21:22 - 21:36
    have enough money to maintain three
    versions of Android. And then we've got
  • 21:36 - 21:41
    things like cars and medical devices where
    we test them to death before release and
  • 21:41 - 21:47
    we don't connect them to the Internet, and
    we almost never patch them unless Charlie
  • 21:47 - 21:53
    Miller and Chris Fellowship get to go at
    your car that is. So what's gonna happen
  • 21:53 - 21:59
    to support costs? Now that we're starting
    to patch cars and you have to patch cars
  • 21:59 - 22:03
    because they're online, I want some things
    online, right? Anybody in the world can
  • 22:03 - 22:07
    attack us. If a vulnerability is
    discovered, it can be scaled and something
  • 22:07 - 22:11
    that you can previously ignore suddenly
    becomes something that you have to fix.
  • 22:11 - 22:15
    And if you have to pull all your cars
    into a garage to patch them, that costs
  • 22:15 - 22:18
    real money. So you need to be able to
    patch them over the air. So all of a
  • 22:18 - 22:27
    sudden cars become like computers or
    phones. So what is this going to mean? So
  • 22:27 - 22:34
    this is the trilemma. If you've got a
    standard safety life cycle, there's no
  • 22:34 - 22:38
    patching. You get safety and
    sustainability, but you can't go online
  • 22:38 - 22:44
    because you'll get hacked. And if you've got
  • 22:38 - 22:44
    the standard security lifecycle, you're
    patching, but that breaks the safety
    certification, so that's a problem. And if
  • 22:51 - 22:55
    you get patching plus redoing safety
    certification with current methods, then
  • 22:55 - 22:59
    the cost of maintaining your safety rating
    can be sky high. So here's the big
  • 22:59 - 23:10
    problem. How do you get safety, security
    and sustainability at the same time? Now
  • 23:10 - 23:13
    this brings us to another thing that a
    number of people at this congress are
  • 23:13 - 23:18
    interested in: the right to repair. This
    is the Centennial Light, right? It's been
  • 23:18 - 23:24
    running since 1901. Right. It's in
    Livermore in California. It's kind of dim,
  • 23:24 - 23:30
    but you can go there and you can see it.
    Still there. In 1924, the three firms that
  • 23:30 - 23:35
    dominated the light bulb business, GE, Osram
    and Philips, agreed to reduce average bulb
  • 23:35 - 23:40
    lifetime from some 2500 hours to 1000
    hours. Why? In order to sell more of
  • 23:40 - 23:46
    them. And one of the things that's come
    along with CPUs and communications and so
  • 23:46 - 23:52
    on with smart stuff to use, that horrible
    word, is that firms are now using online
  • 23:52 - 23:58
    mechanisms, software and cryptographic
    mechanisms in order to make it hard or
  • 23:58 - 24:04
    even illegal to fix products. And I
    believe that there's a case against Apple
  • 24:04 - 24:17
    going on in France about this. Now, you
    might not think it's something that
  • 24:17 - 24:21
    politicians will get upset about, that you
    have to throw away your phone after three
  • 24:21 - 24:25
    years instead of after five years. But
    here's something you really should worry
  • 24:25 - 24:32
    about. Vehicle life cycle economics,
    because the lifetimes of cars in Europe
  • 24:32 - 24:37
    have about doubled in the last 40 years.
    And the average age at which a car in Britain
  • 24:37 - 24:47
    is scrapped is now almost 15 years.
    So what's going to happen once you've got,
  • 24:47 - 24:54
    you know, wonderful self-driving software
    in all the cars. Well, a number of big car
  • 24:54 - 25:00
    companies, including in this country, were
    taking the view two years ago that they
  • 25:00 - 25:06
    wanted people to scrap their cars after
    six years and buy a new one. Hey, makes
  • 25:06 - 25:10
    business sense, doesn't it? If you're Mr.
    Mercedes, your business model is if the
  • 25:10 - 25:14
    customer is rich, you sell him a three
    year lease on a new car. And if the
  • 25:14 - 25:18
    customer is not quite so rich, you sell
    him a three year lease on a Mercedes
  • 25:18 - 25:24
    approved used car. And if somebody drives a
    seven year old Mercedes, that's thought
  • 25:24 - 25:32
    crime. You know, they should emigrate to
    Africa or something. So this was the view
  • 25:32 - 25:38
    of the vehicle makers. But here's the rub.
    The embedded CO2 cost of a car often
  • 25:38 - 25:43
    exceeds its lifetime fuel burn. My best
    estimate for the embedded CO2 cost of an
  • 25:43 - 25:48
    E-class Mercedes is 35 tons. So go and
    work out, you know, how many liters per
  • 25:48 - 25:54
    100 kilometers and how many kilometers
    it's gonna run in 15 years. And you come
  • 25:54 - 26:00
    to the conclusion that if you get a six
    year lifetime, then maybe you are
  • 26:00 - 26:07
    decreasing the range of the car from 300
    000 kilometers to 100 000 kilometers. And
  • 26:07 - 26:13
    so you're approximately doubling the
    overall CO2 emissions. Taking the whole
  • 26:13 - 26:17
    life cycle, not just the scope one, but
    the scope two, and the scope three, the
  • 26:17 - 26:22
    embedded stuff as well. And then there are
    other consequences. What about Africa,
  • 26:22 - 26:27
    where most vehicles are imported second
    hand? If you go to Nairobi, all the cars
  • 26:27 - 26:31
    are between 10 and 20 years old, right?
    They arrive in the docks in Mombasa when
  • 26:31 - 26:35
    they're already 10 years old and people
    drive them for 10 years and then they end
  • 26:35 - 26:39
    up in Uganda or Chad or somewhere like
    that. And they're repaired for as long as
  • 26:39 - 26:44
    they're repairable. What's going to happen
    to road transport in Africa if all of a
  • 26:44 - 26:49
    sudden there's a software time bomb that
    causes cars to self-destruct? Ten years
  • 26:49 - 26:56
    after we leave the showroom. And if there
    isn't, what about safety? I don't know
  • 26:56 - 27:00
    what the rules are here, but in Britain I
    have to get my car through a safety
  • 27:00 - 27:05
    examination every year, once it's more
    than three years old. And it's entirely
  • 27:05 - 27:10
    foreseeable that within two or three years
    the mechanic will want to check that the
  • 27:10 - 27:16
    software is up to date. So once the
    software update is no longer available,
  • 27:16 - 27:25
    that's basically saying this car must now
    be exported or scrapped. I couldn't resist
  • 27:25 - 27:29
    the temptation to put in a cartoon:
    "My engine's making a weird noise."
  • 27:29 - 27:32
    "Can you take a look?"
    "Sure. Just pop the hood. Oh, the hood
  • 27:32 - 27:37
    latch is also broken. Okay, just pull up
    to that big pit and push the car in. We'll
  • 27:37 - 27:41
    go get a new one."
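The embedded-CO2 arithmetic from a couple of minutes ago can be sketched numerically. The 35-tonne embedded figure is the talk's estimate; the fuel consumption and per-litre CO2 factor are assumed, illustrative values:

```python
EMBEDDED_CO2_T = 35.0   # embedded CO2 of an E-class, tonnes (the talk's estimate)
L_PER_100KM = 7.0       # assumed fuel consumption, litres per 100 km
KG_CO2_PER_LITRE = 2.3  # assumed tailpipe CO2 per litre of petrol

def lifecycle_g_per_km(lifetime_km):
    """Total lifecycle CO2 (embedded plus fuel) per kilometre driven."""
    fuel_t = lifetime_km * (L_PER_100KM / 100) * KG_CO2_PER_LITRE / 1000
    return (EMBEDDED_CO2_T + fuel_t) * 1e6 / lifetime_km

print(f"{lifecycle_g_per_km(300_000):.0f} g/km over a 300 000 km life")  # 278 g/km
print(f"{lifecycle_g_per_km(100_000):.0f} g/km over a 100 000 km life")  # 511 g/km
```

Cutting the lifetime to a third spreads the same 35 embedded tonnes over far fewer kilometres, so per-kilometre emissions nearly double, which is the talk's point about six-year scrappage.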
    Right? This is if we start treating cars
  • 27:41 - 27:53
    the way we treat consumer electronics. So
    what's a reasonable design lifetime? Well,
  • 27:53 - 27:58
    with cars, the way it is going is maybe 18
    years, say 10 years from the sale of the
  • 27:58 - 28:04
    last products in a model range, domestic
    appliances, 10 years because of spares
  • 28:04 - 28:10
    obligation plus store life, say 15.
    Medical devices: If a pacemaker lives for
  • 28:10 - 28:16
    10 years, then maybe you need 20 years. For
    electricity substations, even more. So
  • 28:16 - 28:22
    from the point of view of engineers, the
    question is, how can you see to it that
  • 28:22 - 28:28
    your software will be patchable for 20
    years? So as we put it in the abstract, if
  • 28:28 - 28:35
    you are writing software now for a car
    that will go on sale in 2023, what sort of
  • 28:35 - 28:39
    languages, what sort of toolchain should
    you use? What sort of crypto should you
  • 28:39 - 28:46
    use so that you're sure you'll still be
    able to patch that software in 2043? And
  • 28:46 - 28:50
    that isn't just about the languages and
    compilers and linkers and so on. That's
  • 28:50 - 28:59
    about the whole ecosystem. So what did the
    EU do? Well, I'm pleased to say that at
  • 28:59 - 29:06
    the third attempt, the EU managed to get
    some law through on this. Directive 771
  • 29:06 - 29:10
    this year on smart goods says that buyers
    of goods with digital elements are
  • 29:10 - 29:16
    entitled to necessary updates for two
    years or for a longer period of time if
  • 29:16 - 29:21
    this is a reasonable expectation of the
    customer. This is what they managed to get
  • 29:21 - 29:25
    through the parliament. And what we would
    expect is that this will mean at least 10
  • 29:25 - 29:30
    years for cars, ovens, fridges, air
    conditioning and so on because of existing
  • 29:30 - 29:35
    provisions about physical spares. And
    what's more, the trader has got the burden
  • 29:35 - 29:40
    of proof in the first couple of years if
    there are disputes. So there is now the
  • 29:40 - 29:48
    legal framework there to create the demand
    for long term patching of software. And
  • 29:48 - 29:55
    now it's kind of up to us. If the durable
    goods we're designing today are still
  • 29:55 - 30:00
    working in 2039, then a whole bunch of
    things are gonna have to change. Computer
  • 30:00 - 30:05
    science has always been about managing
    complexity ever since the very first high
  • 30:05 - 30:10
    level languages and the history goes on
    from there through types and objects and
  • 30:10 - 30:15
    tools like git and Jenkins and Coverity.
    So here's a question for the computer
  • 30:15 - 30:20
    scientists here. What else is going to be
    needed for sustainable computing, once we
  • 30:20 - 30:31
    have software in just about everything? So
    research topics to support 20 year
  • 30:31 - 30:36
    patching include a more stable and
    powerful toolchain. We know how complex
  • 30:36 - 30:42
    this can be from crypto with looking at
    history of the last 20 years of TLS. Cars
  • 30:42 - 30:45
    teach that it's difficult and expensive to
    sustain all the different test
  • 30:45 - 30:51
    environments you have for different models
    of cars. Control systems teach that
  • 30:51 - 30:54
    you can make small changes to the
    architecture, which will then limit what
  • 30:54 - 31:00
    you have to patch. Android teaches how
    you go about motivating OEMs to patch
  • 31:00 - 31:04
    products that they no longer sell. In this
    case, it's European law, but there's maybe
  • 31:04 - 31:11
    other things you can do too. What does it
    mean for those of us who teach and
  • 31:11 - 31:15
    research in universities? Well, since
    2016, I've been teaching safety and
  • 31:15 - 31:20
    security together in the same course to
    first-year undergraduates, because
  • 31:20 - 31:26
    presenting these ideas together in
    lockstep will help people to think in more
  • 31:26 - 31:30
    unified terms about how it all holds
    together. In research terms, we have
  • 31:30 - 31:35
    been starting to look at what we can do to
    make the tool chain more sustainable. For
  • 31:35 - 31:40
    example, one of the problems that you have
    if you maintain crypto software is that
  • 31:40 - 31:45
    every so often the compiler writers get
    a little bit smarter and the compiler
  • 31:45 - 31:48
    figures out that these extra padding
    instructions that you put in to make the
  • 31:48 - 31:54
    loops of your crypto routines run in
    constant time and to scrub the contents of
  • 31:54 - 31:58
    round keys once they are no longer in use,
    are not doing any real work, and it
  • 31:58 - 32:03
    removes them. And all of a sudden from one
    day to the next, you find that your crypto
  • 32:03 - 32:08
    has sprung a huge big timing leak and then
    you have to rush to get somebody out of
  • 32:08 - 32:12
    bed to fix the tool chain. So one of the
    things that we thought was that better
  • 32:12 - 32:17
    ways for programmers to communicate intent
    might help. And so there's a paper by
  • 32:17 - 32:22
    Laurent Simon and David Chisnall and I
    where we looked at zeroising sensitive
  • 32:22 - 32:28
    variables and doing constant-time loops
    with an LLVM plugin. And that led to a
  • 32:28 - 32:33
    EuroS&P paper a year and a half ago: "What
    You Get is What You C", and there's a
  • 32:33 - 32:41
    plugin that you can download and play
    with. Macro scale sustainable security is
  • 32:41 - 32:46
    going to require a lot more. Despite the
    problems in the aerospace industry with
  • 32:46 - 32:52
    the 737 Max, it still has
    got a better feedback loop of learning
  • 32:52 - 32:59
    from incidents and accidents. And we don't
    have that yet in any of the fields like
  • 32:59 - 33:05
    cars and so on. It's going to be needed.
    What can we use as a guide? Security
  • 33:05 - 33:13
    economics is one set of intellectual tools
    that can be applied. We've known for
  • 33:13 - 33:18
    almost 20 years now that complex socio-
    technical systems often fail because of
  • 33:18 - 33:22
    poor incentives. If Alice guards a system
    and Bob pays the cost of failure, you can
  • 33:22 - 33:28
    expect trouble. And so security economics
    researchers can explain platform security
  • 33:28 - 33:34
    problems, patching cycles, liability games
    and so on. And the same principles apply
  • 33:34 - 33:39
    to safety and will become even more
    important as safety and security become
  • 33:39 - 33:44
    entangled. Also, we'll get even more data
    and we'll be able to do more research and
  • 33:44 - 33:51
    get more insights from the data. So where
    does this lead? Well, our paper "Making
  • 33:51 - 33:56
    Security Sustainable", and the thing that
    we did for the EU standardization and
  • 33:56 - 34:00
    certification of the Internet of Things
    are on my web page together with other
  • 34:00 - 34:05
    relevant papers on topics around
    sustainability from, you know, smart
  • 34:05 - 34:11
    metering to pushing back on wildlife
    crime. And that's the first place to go if
  • 34:11 - 34:16
    you're interested in this stuff. And
    there's also our blog. And if you're
  • 34:16 - 34:21
    interested in these kinds of issues at the
    interface between technology and policy of
  • 34:21 - 34:26
    how incentives work and how they very
    often fail when it comes to complex socio-
  • 34:26 - 34:31
    technical systems, then the Workshop
    on the Economics of Information Security
  • 34:31 - 34:37
    in Brussels next June is the place where
    academics interested in these topics tend
  • 34:37 - 34:47
    to meet up. So perhaps we'll see a few of
    you there in June. And with that, there's
  • 34:47 - 34:53
    a book on security engineering which goes
    over some of these things and there's a
  • 34:53 - 34:56
    third edition in the pipeline.
  • 34:56 - 34:59
    H: Thank you very much,
    Ross Anderson, for the talk.
  • 34:59 - 35:09
    applause
  • 35:09 - 35:13
    We will start the Q&A session a little bit
    differently than you are used to. Ross has a
  • 35:13 - 35:19
    question for you. So he told me there will
    be a third edition of his book and he is
  • 35:19 - 35:25
    not yet sure about the cover he wants to
    have. So you are going to choose. And so
  • 35:25 - 35:30
    that the people on the stream also can
    hear your choice, I would like you to make
  • 35:30 - 35:37
    a humming noise for the cover which you
    like more. You will first see both covers.
  • 35:37 - 35:44
    R: Cover 1, and cover 2.
    H: So, who of you would prefer the
  • 35:44 - 35:53
    first cover?
    applause Come on.
  • 35:53 - 36:02
    And the second choice. louder applause
    OK. I think we have a clear favorite here
  • 36:02 - 36:05
    from the audience, so it would
    be the second cover.
  • 36:05 - 36:09
    R: Thanks.
    H: And we will look forward to see this
  • 36:09 - 36:14
    cover next year then. So if you now have
    questions yourself, you can line up in
  • 36:14 - 36:19
    front of the microphones. You will find
    eight distributed in the hall, three in
  • 36:19 - 36:27
    the middle, two on the sides. Signal Angel
    has the first question from the Internet.
  • 36:27 - 36:32
    Person1: The first question is, is there a
    reason why you didn't include aviation
  • 36:32 - 36:36
    into your research?
    R: We were asked to choose three fields,
  • 36:36 - 36:41
    and the three fields I chose were the ones
    in which we've worked most recently.
  • 36:41 - 36:46
    I did some work in avionics, but that was
    40 years ago, so I'm no longer current.
  • 36:46 - 36:49
    H: Alright, a question from microphone
    number two, please.
  • 36:49 - 36:54
    Person2: Hi. Thanks for your talk. What
    I'm wondering most about is where do you
  • 36:54 - 37:01
    believe the balance will fall in the fight
    between privacy, the want of the
  • 37:01 - 37:07
    manufacturer to prove that it wasn't their
    fault and the right to repair?
  • 37:07 - 37:10
    R: Well, this is an immensely complex
    question and it's one that we'll be
  • 37:10 - 37:15
    fighting about for the next 20 years. But
    all I can suggest is that we study the
  • 37:15 - 37:20
    problems in detail, that we collect the
    data that we need to say coherent things
  • 37:20 - 37:24
    to policymakers and that we use the
    intellectual tools that we have, such as
  • 37:24 - 37:29
    the economics of security in order to
    inform these arguments. That's the best
  • 37:29 - 37:33
    way that we can fight these fights, you
    know, by being clearheaded and by being
  • 37:33 - 37:36
    informed.
    H: Thank you. A question from microphone
  • 37:36 - 37:45
    number four, please. Can you switch on the
    microphone number four.
  • 37:45 - 37:51
    Person3: Oh, sorry. Hello. Thank you for
    the talk. As a software engineer, arguably
  • 37:51 - 37:57
    I can cause much more damage than a single
    medical professional simply because of the
  • 37:57 - 38:04
    multiplication of my work. Why is it that
    there is still no conversation about
  • 38:04 - 38:09
    software engineers carrying liability
    insurance and being liable for the
  • 38:09 - 38:13
    work they do?
    R: Well, that again is a complex question.
  • 38:13 - 38:17
    And there are some countries like Canada
    where being a professional engineer gives
  • 38:17 - 38:22
    you a particular status. I think it's
    cultural as much as anything else, because
  • 38:22 - 38:27
    our trade has always been freewheeling,
    it's always been growing very quickly. And
  • 38:27 - 38:32
    throughout my lifetime it's been sucking
    up a fair proportion of science graduates.
  • 38:32 - 38:35
    If you were to restrict software
    engineering to people with degrees in
  • 38:35 - 38:38
    computer science, then we would have an
    awful lot fewer people. I wouldn't be
  • 38:38 - 38:43
    here, for example, because my first
    degree was in pure math.
  • 38:43 - 38:47
    H: All right, the question from microphone
    number one, please.
  • 38:47 - 38:53
    Person4: Hi. Thank you for the talk. My
    question is also about aviation, because
  • 38:53 - 38:59
    as I understand it, a lot of the
    retired aircraft and other equipment is
  • 38:59 - 39:06
    dumped into the so-called developing
    countries. And with the modern technology
  • 39:06 - 39:12
    and the modern aircraft, the issue of
    maintenance or software updating would
  • 39:12 - 39:19
    still be in question. But how do we see
    that rolling out also for the so-called
  • 39:19 - 39:25
    third world countries? Because I am a
    Pakistani journalist, but this worries me
  • 39:25 - 39:32
    a lot because we get so many devices
    dumped into Pakistan after they're retired
  • 39:32 - 39:37
    and people just use them. I mean, it's a
    country that cannot even afford a license
  • 39:37 - 39:41
    for an operating system. So maybe you could
    shed a light on that. Thank you.
  • 39:41 - 39:46
    R: Well, there are some positive things
    that can be done. Development IT is
  • 39:46 - 39:51
    something in which we are engaged. You can
    find the details on my Web site, but good
  • 39:51 - 39:56
    things don't necessarily have to involve
    IT. One of my school friends became an
  • 39:56 - 40:01
    anesthetist and after he retired, he
    devoted his energies to developing an
  • 40:01 - 40:06
    infusion pump for use in less developed
    countries, which was very much cheaper
  • 40:06 - 40:09
    than the ones that we saw on the screen
    there. And it's also safe, rugged,
  • 40:09 - 40:16
    reliable and designed for use in
    places like Pakistan and Africa and South
  • 40:16 - 40:22
    America. So the appropriate technology
    doesn't always have to be the whizziest,
  • 40:22 - 40:29
    right. And if you've got very bad roads,
    as in India, in Africa, and relatively
  • 40:29 - 40:34
    cheap labor, then perhaps autonomous
    cars should not be a priority.
  • 40:34 - 40:36
    Person4: Thank you.
    H: All right. We have another question
  • 40:36 - 40:41
    from the Internet, the Signal Angel, please?
    Person5: Why force updates by law?
  • 40:41 - 40:45
    Wouldn't it be better to prohibit the
    important things from accessing the
  • 40:45 - 40:50
    Internet by law?
    R: Well, politics is the art of the
  • 40:50 - 40:57
    possible. And you can only realistically
    talk about a certain number of things at
  • 40:57 - 41:01
    any one time in any political culture,
    the so-called Overton window. Now, if
  • 41:01 - 41:06
    you talked about banning technology,
    banning cars that are connected to the
  • 41:06 - 41:10
    Internet as a minister, you will be
    immediately shouted out of office as being
  • 41:10 - 41:14
    a Luddite, right. So it's just not
    possible to go down that path. What is
  • 41:14 - 41:20
    possible is to go down the path of saying,
    look, if you've got a company that imports
  • 41:20 - 41:24
    lots of dangerous toys that harm kids or
    dangerous CCTV cameras that get recruited into
  • 41:24 - 41:28
    a botnet, and if you don't meet European
    regulations, we'll put the containers on
  • 41:28 - 41:32
    the boat back to China. That's just
    something that can be solved politically.
  • 41:32 - 41:37
    And given the weakness of the car industry
    after the emission standard scandal, it
  • 41:37 - 41:41
    was possible for Brussels to push through
    something that the car industry really
  • 41:41 - 41:46
    didn't like. And even then, that
    was the third attempt to do something
  • 41:46 - 41:52
    about it. So, again, it's what you can
    practically achieve in real-world politics.
  • 41:52 - 41:56
    H: All right. We have more questions.
    Microphone number four, please.
  • 41:56 - 42:01
    Person6: Hi, I'm an automotive cyber security
    analyst and embedded software engineer.
  • 42:01 - 42:07
    I'm part of the ISO 21434 Automotive
    Cyber Security Standard, are you aware of
  • 42:07 - 42:10
    the standard that's coming
    out next year? Hopefully.
  • 42:10 - 42:14
    R: I've not done any significant work with
    it. Friends in the motor industry have
  • 42:14 - 42:18
    talked about it, but it's not something
    we've engaged with in detail.
  • 42:18 - 42:21
    Person6: So I guess my point is not so
    much a question, but a little bit of a
  • 42:21 - 42:26
    pushback. A lot of the things you
    talked about are being worked on and are
  • 42:26 - 42:33
    being considered. Over the years, updating
    is going to be mandated. A 30 or 40
  • 42:33 - 42:38
    year lifecycle of the vehicle is being
    considered by engineers. Why not? Nobody I
  • 42:38 - 42:45
    know talks about a six-year lifecycle.
    That's, you know, back in the 80s,
  • 42:45 - 42:49
    maybe when we talked about planned
    obsolescence. But that's just not a thing.
  • 42:49 - 42:54
    So I'm not really sure where that language
    is coming from, to be honest with you.
  • 42:54 - 42:58
    R: Well, I've been to closed motor industry
    conferences where senior executives have
  • 42:58 - 43:03
    been talking about just that in terms of
    autonomous vehicles. So, yeah, it's
  • 43:03 - 43:10
    something that we've disabused them of.
    H: All right. So time is unfortunately up,
  • 43:10 - 43:15
    but I think Ross will be available after
    to talk as well for questions so you can
  • 43:15 - 43:19
    meet him here on the side. Please give a
    huge round of applause for Ross Anderson.
  • 43:19 - 43:21
    applause
  • 43:21 - 43:24
    R: Thanks. And thank you
    for choosing the cover.
  • 43:24 - 43:26
    36C3 postroll music
  • 43:26 - 43:52
    Subtitles created by c3subtitles.de
    in the year 2021. Join, and help us!