
34C3 - BootStomp: On the Security of Bootloaders in Mobile Devices

  • 0:00 - 0:15
    34c3 intro
  • 0:15 - 0:19
    Herald: This is Audrey from California and
  • 0:19 - 0:27
    she's from the University of California
    Santa Barbara, security lab, if I'm
  • 0:27 - 0:38
    informed correctly, and it is about
    automated discovery of vulnerabilities in
  • 0:38 - 0:51
    the Android bootloader. Wow, not really my
    problem but definitely Audrey's. So here
  • 0:51 - 0:57
    we go. Please let's have a big hand for
    Audrey Dutcher. Thank you.
  • 0:57 - 1:07
    applause
  • 1:07 - 1:11
    Audrey: Good evening everybody. Today
  • 1:11 - 1:17
    we're talking about Android boot loaders.
    As a brief aside, I didn't actually work on
  • 1:17 - 1:22
    this work, I just sit across from the
    people who worked on it, and I was
  • 1:22 - 1:27
    the only one who could make it to Germany.
    I do work on some of the stuff that
  • 1:27 - 1:32
    it depends on, so this is my field but
    this is not my project. Just a brief
  • 1:32 - 1:41
    disclaimer, thanks. So today we're talking
    about Android boot loaders. Phones are
  • 1:41 - 1:46
    complicated, bootloaders are complicated,
    processors are complicated and trying to
  • 1:46 - 1:51
    get to the bottom of these is a very
    difficult subject; if you've ever done any
  • 1:51 - 1:55
    homebrew kernel dev or homebrew retro-
    gaming, you know that interacting with
  • 1:55 - 2:00
    hardware is really complicated and trying
    to do this in a phone, a system connected
  • 2:00 - 2:04
    to a touchscreen and a modem and lots of
    complicated, money-sensitive things - it's
  • 2:04 - 2:11
    really complicated. But every single one
    of you
  • 2:11 - 2:16
    probably has a phone in your pocket, and
    all of these are immensely valuable
  • 2:16 - 2:22
    targets for attacks, so we want to be able
    to detect bugs in them
  • 2:22 - 2:27
    automatically, that's the name of the
    game. So the bootloader in a device: it
  • 2:27 - 2:32
    takes - it's the it's the job of "oh,
    we've powered on, we need to get
  • 2:32 - 2:38
    everything initialized", so we initialize
    the device and the peripherals and then
  • 2:38 - 2:44
    the final gasp of breath of the
    bootloader is to take the kernel and
  • 2:44 - 2:49
    execute it. And the kernel obviously needs
    to be loaded from storage somewhere. For
  • 2:49 - 2:54
    Android specifically, this is what we
    worked on; most Android devices are ARM,
  • 2:54 - 2:57
    there's no particular standard for what an
    ARM bootloader should look like but the
  • 2:57 - 3:00
    ARM people do give you some guidelines.
    There's an open-source implementation of
  • 3:00 - 3:04
    what a secure bootloader should look
    like. There are in fact several boot
  • 3:04 - 3:09
    loaders on ARM, we'll go over this later.
    But it's a complicated
  • 3:09 - 3:14
    affair that needs to preserve several
    security properties along the way. And
  • 3:14 - 3:19
    above all, the whole goal of this process
    is to make sure that things are secure and
  • 3:19 - 3:26
    to make sure the user data is protected.
    That's what we're trying to do. Like we
  • 3:26 - 3:31
    said the phones in your pockets are
    valuable targets. If you can attack the
  • 3:31 - 3:35
    bootloader you can get a
    rootkit on the device, which is even more
  • 3:35 - 3:40
    powerful than just getting root on it. If
    an attacker were to compromise your phone,
  • 3:40 - 3:46
    he could brick your device or establish a
    rootkit - I talked about rootkits already - but
  • 3:46 - 3:51
    additionally you might want to circumvent
    the security properties of your phone's
  • 3:51 - 3:57
    bootloader in order to customize it:
    rooting, jailbreaking. "Unlocking" is the
  • 3:57 - 4:05
    key word in this situation.
    The Android bootloader establishes
  • 4:05 - 4:11
    cryptographic integrity over basically
    what's happening at all times, so on your
  • 4:11 - 4:17
    phone there is a master key, and it knows
  • 4:17 - 4:21
    that it should only run code that has been
    signed with the key associated with the
  • 4:21 - 4:25
    hardware, and then the next stage of the
    bootloader has a key with which it will verify
  • 4:25 - 4:28
    that the next stage of the bootloader
    hasn't been tampered with. And this is
  • 4:28 - 4:34
    where we get the term "chain of trust",
    where each part establishes "oh, I'm very
  • 4:34 - 4:39
    very sure, cryptographically sure,
    assuming RSA hasn't been broken yet, that
  • 4:39 - 4:45
    the next bit of code is going to be doing
    something that I authorized."
  • 4:45 - 4:49
    Circumventing this is valuable, as we've
    talked about, phones have to have a way to
  • 4:49 - 4:56
    do that built in, unless you're
    Apple, but obviously protecting this
  • 4:56 - 5:03
    mechanism from attackers is a point of
    contention, so really you need to make
  • 5:03 - 5:10
    sure that only the real owner of the
    device can actually unlock the phone. So
  • 5:10 - 5:16
    this project is about
    discovering vulnerabilities that let us
  • 5:16 - 5:20
    circumvent this process, so the threat
    model that we use for this project is that
  • 5:20 - 5:25
    the phone is rooted and the
    attacker has root control. This is
  • 5:25 - 5:32
    pretty out there - no, not that out there,
    root vulnerabilities exist - but it's
  • 5:32 - 5:36
    enough to make you scoff "Oh what's the
    point of this", well, the security
  • 5:36 - 5:40
    properties of the phone are supposed to
    extend above the hypervisor level.
  • 5:40 - 5:42
    You're supposed to have these guarantees
    that things should always work, assuming
  • 5:42 - 5:48
    the chain of trust works, regardless of
    what's happening in the kernel.
  • 5:49 - 5:55
    So today we are going to be talking about
    BootStomp, which is a tool that
  • 5:55 - 6:00
    automatically verifies these properties
    and discovers bugs. I'm going a little
  • 6:00 - 6:04
    slow, I'll speed up.
    So, first off, the booting process in Android
  • 6:04 - 6:09
    ecosystems is pretty complicated and
    multi-stage; there's the base
  • 6:09 - 6:13
    bootloader BL1, which loads and verifies
    another bootloader, which loads and
  • 6:13 - 6:17
    verifies another bootloader and this is
    important, because the first one's in a ROM
  • 6:17 - 6:23
    and is very small and the second one is
    probably by the
  • 6:23 - 6:27
    hardware vendor and the third one is
    probably by the OS vendor, for example,
  • 6:27 - 6:33
    and they all need to do different things.
    So the important part here is these EL
  • 6:33 - 6:37
    things; those are the ARM exception
    levels, which are basically the global
  • 6:37 - 6:42
    permission levels for an ARM
    processor. EL3 is basically god
  • 6:42 - 6:46
    mode. There's EL2 for hypervisors, it
    isn't in this chart, there's EL1, which is
  • 6:46 - 6:51
    the kernel, and EL0, which is user
    space. So when we boot you're obviously in
  • 6:51 - 6:54
    the highest execution level and gradually,
    as we establish more and more
  • 6:54 - 6:59
    initialization of the device, we're going
    to cede control to less privileged
  • 6:59 - 7:05
    components. So the bootloaders operate
    with very high privilege, and one of the things
  • 7:05 - 7:11
    they need to do is set up the
    ARM TrustZone, the trusted execution
  • 7:11 - 7:19
    environment that lets people do really
    secure things on Android phones.
  • 7:19 - 7:26
    This is something that is set up
    by the BL31 bootloader, and in the
  • 7:26 - 7:29
    secure world you need to do things like
    initialize hardware and
  • 7:29 - 7:34
    peripherals, and in the non-secure world
    you're running the
  • 7:34 - 7:39
    normal kernel and the normal user apps.
    And on some phones you actually have a
  • 7:39 - 7:46
    final bootloader, which runs in EL1, BL33
    or the aboot executable, and that's
  • 7:46 - 7:53
    the one that we're generally
    targeting for this stuff. So this is
  • 7:53 - 7:56
    what I was talking about, the chain of
    trust: each of those arrows represents
  • 7:56 - 8:00
    cryptographic integrity, so the next stage
    only gets loaded if there's a valid
  • 8:00 - 8:09
    signature indicating that we really trust
    what's going on here. And that's the
  • 8:09 - 8:14
    unlocking process that we were talking
    about; if you, the verified, physical
  • 8:14 - 8:18
    owner of the device, want to, you can
    disable that last bit and allow
  • 8:18 - 8:22
    untrusted code to run as the kernel.
    That's totally fine, if you own the
  • 8:22 - 8:28
    device.
    The unlocking process is supposed to
  • 8:28 - 8:31
    really specifically verify these two
    things: that you have physical access to
  • 8:31 - 8:38
    the device and that you actually own it,
    like you know the PIN to it; that's what
  • 8:38 - 8:46
    establishes ownership of your device. And so
    specifically, when you go through
  • 8:46 - 8:51
    that process it does set some specific
    flags on your persistent storage, saying
  • 8:51 - 8:58
    this is an unlocked device now, you can do
    whatever - but making sure that that can
  • 8:58 - 9:05
    only happen when it's authorized is the
    point of contention here.
  • 9:05 - 9:09
    Typically, what happens is this security
    state is itself cryptographically signed,
  • 9:09 - 9:15
    so you can't just set unlocked, you have
    to set unlocked but signed by the people
  • 9:15 - 9:23
    that we really trust. But
    generally you probably shouldn't be able
  • 9:23 - 9:31
    to write to it just from the normal user
    space. So the question is: we saw that the
  • 9:31 - 9:35
    operating system is separate from the
    bootloader. So what we want to be able to
  • 9:35 - 9:40
    do is get from the Android OS to affecting
    the bootloader. And can this
  • 9:40 - 9:48
    happen? Well, of course, that's why we're
    here. So, let's see. Oh, I didn't
  • 9:48 - 9:52
    realize there were animations on the
    slides, that's unfortunate. So this is
  • 9:52 - 9:59
    sort of the normal flow chart of how these
    things normally come about.
  • 9:59 - 10:04
    You've got the bootloader, which has to
    read from persistent storage in order to
  • 10:04 - 10:07
    initialize the operating system. Like, of
    course you have to read, for example,
  • 10:07 - 10:11
    whether or not the device is unlocked, you
    have to load the kernel itself. There are
  • 10:11 - 10:17
    lots of inputs to the bootloader, and the
    intuition is that these
  • 10:17 - 10:23
    just serve as normal inputs to a program,
    which can be analyzed for vulnerabilities.
  • 10:23 - 10:31
    Oh lord, this is a mess. So, from the
    OS - if you have
  • 10:31 - 10:36
    root privileges in the operating system
    you can write to this persistent storage,
  • 10:36 - 10:47
    which means that this serves
    as another input to the bootloader and
  • 10:47 - 10:53
    this can cause bad things to happen. So we
    need some sort of tool - that's the point
  • 10:53 - 10:58
    of this project - to automatically verify
    the safety properties of these bootloaders.
  • 10:58 - 11:04
    That's BootStomp. Bootloaders are
    complicated. There's a lot of stuff, which
  • 11:04 - 11:08
    means you have to automate - the analysis
    has to be automated in order to really
  • 11:08 - 11:12
    sift through something as big and
    complicated as a bootloader, with the care
  • 11:12 - 11:17
    necessary to actually find bugs that are
    sitting there.
  • 11:17 - 11:20
    But these things aren't usually things
    that you have source code for, so it needs
  • 11:20 - 11:25
    to be a binary analysis and furthermore
    you can't really do dynamic execution on
  • 11:25 - 11:30
    something that needs to run on the highest
    privilege level of a processor, so you
  • 11:30 - 11:33
    have to step back - and it has
    to be static as well. And furthermore this
  • 11:33 - 11:37
    needs to be a fully free-standing analysis
    that doesn't assume anything other than
  • 11:37 - 11:42
    "oh, we're executing code on a system",
    because there are no known syscalls or APIs
  • 11:42 - 11:47
    to checkpoint the process or say "oh, we know
    what this means, we don't really have to
  • 11:47 - 11:56
    analyze it." So it's a tall order but you
    can do it with enough work. So BootStomp
  • 11:56 - 12:03
    specifically is the tool that we built. It
    will automatically detect these inputs,
  • 12:03 - 12:10
    that we talked about, to the bootloader
    and then it will determine if these inputs
  • 12:10 - 12:14
    can be used to compromise various security
    properties of the device.
  • 12:14 - 12:20
    One such example is if you can use this to
    just achieve memory corruption for example
  • 12:20 - 12:28
    or more abstract forms of vulnerability,
    such as code flows that will result in
  • 12:28 - 12:33
    unwanted data being written by the more
    privileged bootloader somewhere. And the
  • 12:33 - 12:40
    important thing about this analysis is
    that its results are easily verifiable and
  • 12:40 - 12:45
    traceable and it's very easy to like look
    at the outputs and say "oh, well, this is
  • 12:45 - 12:49
    what's happening and this is why I think
    this happened and therefore I can
  • 12:49 - 12:59
    reproduce this, possibly?" This happens
    through symbolic taint analysis. This is
  • 12:59 - 13:06
    the part that I know about, because I work
    on angr, which is the symbolic execution
  • 13:06 - 13:11
    and static analysis tool that
    BootStomp uses in order to do its taint
  • 13:11 - 13:19
    analysis. "Taint analysis" and "symbolic
    execution" are kind of loaded words, so
  • 13:19 - 13:24
    what's specifically meant is this:
    we discover these sources and sinks
  • 13:24 - 13:29
    of behavior through
    static analysis and some
  • 13:29 - 13:32
    heuristics, of course. And then we
    propagate these taints through symbolic
  • 13:32 - 13:35
    execution, while maintaining tractability,
    and note wherever
  • 13:35 - 13:40
    we can find paths from
    taint sources to behavior sinks that we
  • 13:40 - 13:47
    think are vulnerable. Specifically, we
    think these behavior sinks are
  • 13:47 - 13:52
    vulnerable if you can arbitrarily
    write to memory or read from memory.
  • 13:52 - 13:55
    Like, really arbitrary, if there's a
    pointer which is controlled by user input
  • 13:55 - 14:00
    - memory corruption stuff. And
    additionally, if you can control loop
  • 14:00 - 14:04
    iterations through your input, that
    indicates a denial-of-service attack.
  • 14:04 - 14:12
    And finally, the unlocking mechanism - the
    bootloader unlocking mechanism: if we can
  • 14:12 - 14:18
    detect specific code paths
    which indicate bypasses, those are
  • 14:18 - 14:26
    valuable.
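As a rough illustration of what "tainted" means here - a minimal sketch rather than BootStomp's actual code, with the TAINT_ naming convention assumed purely for the example - untrusted reads become claripy symbols, and you can later ask whether a pointer or loop bound depends on one of them:

```python
import claripy

# Hypothetical convention: anything read from persistent storage becomes a
# symbol whose name starts with "TAINT_".
def taint_symbol(name, bits=64):
    return claripy.BVS("TAINT_" + name, bits)

def is_tainted(expr):
    # claripy ASTs expose the names of the symbols they contain.
    return any(v.startswith("TAINT_") for v in expr.variables)

# Example: a pointer computed from data read off the eMMC.
entry = taint_symbol("partition_entry")
pointer = entry + 0x40

print(is_tainted(pointer))                   # True: dereferencing it is a red flag
print(is_tainted(claripy.BVV(0x1000, 64)))   # False: a constant address is fine
```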
  • 14:26 - 14:31
    So, yeah, this is the specific
    architecture of the tool. There are two
    main modules: one is written as an IDA
  • 14:31 - 14:35
    analysis - you know, the big tool that
    everyone probably doesn't pay enough money
  • 14:35 - 14:42
    for. And then there's the other component,
    written in angr, which is the symbolic
  • 14:42 - 14:52
    taint analysis. And this is probably the
    point where I break out of here and
    actually start the live demo.
  • 14:52 - 14:59
    That's big enough.
    Okay, so we're working on a Huawei boot
  • 14:59 - 15:07
    image here, the fastboot image. We're
    going to load it up in IDA real quick. So
  • 15:07 - 15:15
    here, IDA understands, oh, this is what
    the executable is. So if we just sort of
  • 15:15 - 15:24
    run the initial script, find taints, it'll
    think real hard for a little bit. There's
  • 15:24 - 15:27
    no real reason this part
    couldn't have been done in angr or Binary
  • 15:27 - 15:34
    Ninja or r2 or (???), god forbid. But,
    this is a collaborative project if you saw
  • 15:34 - 15:37
    the huge author list, and people write
    stuff in whatever they're comfortable
  • 15:37 - 15:42
    with. So it's IDA in this case.
    Realistically, because this is just a
  • 15:42 - 15:47
    binary blob, when you load it in IDA it
    doesn't immediately know where everything
  • 15:47 - 15:54
    is, so you have to sort of nudge it into,
    "oh, here's where all the functions are."
  • 15:54 - 16:05
    Okay, we finished, and what it's done is:
    we've got this taint source/sink .txt file
  • 16:05 - 16:12
    which shows us, "oh, here are all the sources
    of tainted information, and here's a few of
  • 16:12 - 16:16
    the sinks that we established." Obviously
    you don't need a sink analysis to
  • 16:16 - 16:20
    determine if you've got memory corruption
    or not but we like knowing where the
  • 16:20 - 16:24
    writes to persistent storage are, and
    specifically the memcpy functions
  • 16:24 - 16:35
    are valuable for analysis. And then, if we
    run our taint analysis, bootloader taint,
  • 16:35 - 16:39
    on the - oh, this configuration file is
    real simple. It just says, "oh, here's what
  • 16:39 - 16:42
    we're analyzing: it's a 64-bit
    architecture, don't bother analyzing Thumb
  • 16:42 - 16:52
    mode, et cetera, simple stuff."
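Purely for flavor, the options being described amount to something like the following illustrative Python dict - not BootStomp's real configuration format or key names:

```python
# Hypothetical illustration of the kind of knobs such a config carries.
config = {
    "bootloader": "fastboot.img",   # the binary blob to analyze
    "arch": "aarch64",              # 64-bit ARM
    "analyze_thumb": False,         # don't bother with Thumb mode
    "timeout_minutes": 20,          # give up eventually
}
```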
  • 16:52 - 16:58
    And it'll do this for about 20 minutes. I
    hope it finishes before the demo is over.
  • 16:58 - 17:06
    If not, I'll do some magic and we'll use a
    pre-prepared solution. But, so, we talked
  • 17:06 - 17:10
    about these seeds: the seeds used
    for our taint analysis are the reads from
  • 17:10 - 17:18
    persistent storage and the data used by the
    unlocking procedure. So the heuristics I
  • 17:18 - 17:21
    was talking about - we want to identify
    the reads from persistent storage through
  • 17:21 - 17:26
    log messages, keyword analysis and
    long distances. So the eMMC is a
  • 17:26 - 17:30
    specific memory module used by
    the bootloader for secure purposes.
  • 17:30 - 17:35
    It's the persistent storage device,
    basically. And you can identify these log
  • 17:35 - 17:40
    messages, and then we just do a def-use
    analysis back from the guard condition on
  • 17:40 - 17:44
    that block to its source and you say, "oh
    that function must be the read." It's
  • 17:44 - 17:52
    pretty simple. It works surprisingly
    often. Of course, if this isn't enough you
  • 17:52 - 17:56
    can just manually analyze the firmware and
    provide, "oh here's where we read from
  • 17:56 - 17:58
    persistent storage, here's what you
    should taint."
  • 18:08 - 18:12
    Cool. So the taint
  • 18:12 - 18:15
    analysis: our taints are specifically -
    this is specifically symbolic taint
  • 18:15 - 18:20
    analysis so it's not just like what Triton
    does where you've got a concrete value
  • 18:20 - 18:25
    that has metadata attached. This is a real
    symbol being used for symbolic execution.
  • 18:25 - 18:30
    If you're not familiar with symbolic
    execution, it's a form of static
  • 18:30 - 18:38
    analysis in which you emulate the code,
    but instead of having concrete values for
  • 18:38 - 18:41
    some of the things you just have symbols. And
    then when you perform operations on those
  • 18:41 - 18:46
    symbols you construct an abstract syntax
    tree of the behavior. And then when you
  • 18:46 - 18:52
    run into branch conditions based on those
    things you can say, "oh, well, in order to
  • 18:52 - 19:00
    get from point A to point B this
    constraint must be satisfied." And of
  • 19:00 - 19:05
    course, now you can just add z3 and stir,
    and you have the inputs that generate
  • 19:05 - 19:12
    paths through the program.
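If you have not seen that in practice, here is a tiny self-contained angr example - the binary name, the addresses and the "interesting" target are placeholders, but the API calls are standard angr:

```python
import angr
import claripy

# Load a raw firmware blob (no ELF headers), 64-bit ARM.
proj = angr.Project("fastboot.img", auto_load_libs=False,
                    main_opts={"backend": "blob", "arch": "aarch64",
                               "base_addr": 0})

# A symbolic buffer standing in for data read from persistent storage.
data = claripy.BVS("TAINT_storage", 64 * 8)

state = proj.factory.blank_state(addr=0x1000)   # hypothetical code of interest
state.memory.store(0x40000, data)               # hypothetical input buffer

simgr = proj.factory.simulation_manager(state)
simgr.explore(find=0x2000)                      # hypothetical "interesting" address

if simgr.found:
    # claripy hands the collected path constraints to z3 and produces a
    # concrete input that drives execution down the discovered path.
    print(simgr.found[0].solver.eval(data, cast_to=bytes))
```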
    So for the sinks of the taint analysis,
    we want to
  • 19:12 - 19:19
    say, "oh, if tainted data comes in and is
    an argument to memcpy, then
  • 19:19 - 19:23
    that's a vulnerability." I don't mean that
    the
  • 19:23 - 19:28
    tainted data is the subject of the memcpy;
    I mean it's one of the values passed to memcpy.
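One way to express that sink in angr is to hook the memcpy routine with a SimProcedure that flags calls whose arguments depend on tainted symbols - again only a sketch, with a made-up memcpy address:

```python
import angr

def is_tainted(expr):
    return any(v.startswith("TAINT_") for v in expr.variables)

class MemcpySink(angr.SimProcedure):
    def run(self, dst, src, n):
        # If the destination, the source or the length is influenced by
        # tainted input, report a potential memory-corruption sink.
        if any(is_tainted(arg) for arg in (dst, src, n)):
            print("tainted memcpy reached at", hex(self.state.addr))
        return dst

# proj.hook(0x1234, MemcpySink())   # hypothetical address of the memcpy routine
```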
  • 19:28 - 19:34
    That's a memory corruption vulnerability
    generally. Yeah, we talked about memory
  • 19:34 - 19:36
    corruption, and we talked about loop
    conditions, and we talked about writes to
  • 19:36 - 19:41
    persistent storage with the unlocking
    stuff. Cool. For taint checking
  • 19:41 - 19:46
    specifically -- oh this is exactly what I
    just said. Yeah, and what I was
  • 19:46 - 19:50
    talking about with the symbolic predicates
    and trace analysis means
  • 19:50 - 19:54
    that when you see something,
    you automatically have
  • 19:54 - 20:00
    the input that will generate that
    behavior. So the output is inherently
  • 20:00 - 20:06
    traceable. Unfortunately, symbolic
    execution has some issues. I was actually
  • 20:06 - 20:12
    at CCC two years ago talking about the
    exact same problem. You have this problem
  • 20:12 - 20:19
    where, oh, you generate paths between
    different states and
  • 20:19 - 20:24
    there can be too many of them. It
    overwhelms your analysis. So you can use
  • 20:24 - 20:29
    some heuristics to say, "oh, we don't want
    to we can; because it's the static
  • 20:29 - 20:34
    analysis we have a more powerful step over
    than what a debugger can do." We don't
  • 20:34 - 20:37
    have to actually analyze the function, we
    can just take the instruction pointer and
  • 20:37 - 20:44
    move it over there. And, this does cause
    some unsoundness, but it's not a problem
  • 20:44 - 20:49
    if you like make sure that the arguments
    aren't tainted, for example. Or sometimes
  • 20:49 - 20:54
    you just accept the unsoundness as part of
    the tractability of the problem. Limiting
  • 20:54 - 20:59
    loop iterations: that's a classic
    technique from static analysis. And the
  • 20:59 - 21:06
    timeout, of course.
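In angr terms, those tractability tricks look roughly like this - the addresses and the loop bound are placeholders, and this is a sketch of the ideas rather than BootStomp's code:

```python
import angr

proj = angr.Project("fastboot.img", auto_load_libs=False,
                    main_opts={"backend": "blob", "arch": "aarch64"})

# "Step over" a function statically: hook it so the engine just returns an
# unconstrained value instead of exploring the function body.
proj.hook(0x4000, angr.SIM_PROCEDURES["stubs"]["ReturnUnconstrained"]())

cfg = proj.analyses.CFGFast()                   # needed to identify loops
state = proj.factory.blank_state(addr=0x1000)
simgr = proj.factory.simulation_manager(state)

# Bound how many times any single loop may be revisited.
simgr.use_technique(angr.exploration_techniques.LoopSeer(cfg=cfg, bound=5))

simgr.run()   # in practice you would also wrap this in a wall-clock timeout
```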
    So, what are the bugs we found? We
    evaluated this on four bootloaders and we
  • 21:06 - 21:15
    found several bugs, six of which were
    zero-days. So that's pretty good. It's like,
  • 21:15 - 21:19
    okay, so you found some bugs but it could
    just be, oh, there are some errors in
  • 21:19 - 21:23
    an initialization that don't really
    matter. But on the other hand you can
  • 21:23 - 21:32
    crash it at 0x41414141. That's pretty serious.
    So as we saw, some of the bootloaders
  • 21:32 - 21:36
    do work in ARM EL3, so this is pretty
    significant. You can do whatever you want
  • 21:36 - 21:40
    in the device if you actually have
    sufficient control over it. This is
  • 21:40 - 21:48
    rootkit territory. You could break
    anything you wanted. Then there's another
  • 21:48 - 21:52
    component in the analysis that says, "can
    we find bypasses to the unlocking
  • 21:52 - 21:58
    procedure." For example, this is this is
    basically one of the ones that we found:
  • 21:58 - 22:04
    BootStomp detected
    this flow from data that was read from the
  • 22:04 - 22:09
    device to data that was written to the
    device, and what this code is supposed to
  • 22:09 - 22:13
    do - do I have animations? yes - it's
    supposed to,
  • 22:13 - 22:15
    like, take some input
    and verify that it hashes
  • 22:15 - 22:20
    to a certain value. And if so, hash
    that value and write it back to disk, and
  • 22:20 - 22:27
    that constitutes the cryptographically
    secure unlocking thing. However, the thing
  • 22:27 - 22:33
    that we write to disk is
    identical to the thing that was read from
  • 22:33 - 22:40
    the disk. So what
    BootStomp reported was the code flow from
  • 22:40 - 22:44
    the disk back to the disk, indicating that
    if you can read from the disk, you know how
  • 22:44 - 22:53
    to produce the thing that will unlock the
    phone. So this isn't secure.
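Schematically, the broken logic is something like this loose Python paraphrase of what the slide shows - the helper names are invented, and this is not the vendor's actual code:

```python
import hashlib

def flawed_unlock(read_from_disk, write_to_disk, user_token: bytes) -> bool:
    # The value the token is checked against is itself read from the disk,
    # which the attacker in this threat model can already read.
    expected = read_from_disk("unlock_secret")
    if hashlib.sha256(user_token).digest() == expected:
        # Everything written back to mark the device "unlocked" is derived
        # from data that came off the disk in the first place, so anyone who
        # can read the disk can reproduce the whole exchange.
        write_to_disk("unlock_state", hashlib.sha256(expected).digest())
        return True
    return False
```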
  • 22:53 - 22:59
    Mitigations. So, the thing that Google does
    in order to prevent attacks of this class is that
  • 22:59 - 23:04
    the secure encryption key that
    decrypts the userland
  • 23:04 - 23:13
    data has the unlock state embedded in it.
    So clearly, if you change the
  • 23:13 - 23:17
    unlock state you brick the entire phone.
    Well, not brick, but you have to reset it
  • 23:17 - 23:27
    and lose all your data.
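The effect is roughly that the data-encryption key is derived from a hardware-bound secret plus the lock state, so flipping the lock state changes the key and the old userdata can no longer be decrypted - a conceptual sketch with HKDF, not Google's actual key hierarchy:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def userdata_key(hardware_secret: bytes, lock_state: bytes) -> bytes:
    # Mixing the lock state into the derivation means that toggling
    # locked <-> unlocked yields a different key, so the existing
    # userdata becomes unreadable instead of silently exposed.
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"userdata-key|" + lock_state).derive(hardware_secret)
```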
  • 23:27 - 23:33
    That's still not really good enough, but
    realistically we should probably be using a
  • 23:33 - 23:37
    more trusted form of storage - not just the
    normal partitions on the SD card - to
  • 23:37 - 23:41
    store this state. It should probably be
    part of the eMMC, or specifically the Replay Protected Memory
  • 23:41 - 23:47
    Block, which uses cryptographic mechanisms
    to synchronize, what's it called,
  • 23:47 - 23:55
    writes to the memory with
    an authenticated process. And so that
  • 23:55 - 23:58
    would make sure that
    only the bootloader could unlock it.
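RPMB-style writes work roughly like this - an HMAC over the data plus a monotonically increasing write counter, keyed with a secret that only the trusted side knows; a conceptual sketch, not the real eMMC frame format:

```python
import hashlib
import hmac

class RpmbLikeStore:
    def __init__(self, shared_key: bytes):
        self._key = shared_key
        self._counter = 0      # monotonic write counter defeats replayed writes
        self._data = b""

    def write(self, data: bytes, counter: int, mac: bytes) -> bool:
        msg = data + counter.to_bytes(4, "big")
        expected = hmac.new(self._key, msg, hashlib.sha256).digest()
        if counter != self._counter + 1 or not hmac.compare_digest(mac, expected):
            return False       # unauthenticated or replayed write is rejected
        self._data, self._counter = data, counter
        return True
```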
  • 23:58 - 24:02
    But of course, that wouldn't protect against
    memory corruption vulnerabilities and
  • 24:02 - 24:05
    there's nothing really to be said about
    that other than, "hey, this is a serious
  • 24:05 - 24:12
    problem." In conclusion, all these bugs
    have been reported, most of them have been
  • 24:12 - 24:18
    fixed. As far as I'm aware this is the
    first study to really explore and
  • 24:18 - 24:22
    develop analyses for Android boot loaders
    and in it we developed an automated
  • 24:22 - 24:27
    technique to analyze boot loaders with
    tractable alerts. We found six 0-days in
  • 24:27 - 24:32
    various boot loaders and our
    implementation is open source. I will be
  • 24:32 - 24:35
    taking questions, thank you for listening.
  • 24:35 - 24:44
    applause
  • 24:44 - 24:46
    Herald: That was quite amazing.
  • 24:50 - 24:53
    Okay we'll be taking some
    questions from people
  • 24:53 - 24:57
    that understood exactly
    what it was all about. Yes
  • 24:57 - 24:59
    I see somebody walking up to microphone
    one.
  • 24:59 - 25:02
    Mic 1: Thank you very much for talk--
  • 25:02 - 25:05
    Herald: Please talk into the mic, otherwise
    we can't record it.
  • 25:05 - 25:07
    Mic 1: Okay, thank you very much for that
  • 25:07 - 25:12
    work, that was really cool. Your mitigations
    didn't include devising the
  • 25:12 - 25:16
    code better. Do you think it's possible to
    write the code so that your tools can
  • 25:16 - 25:21
    analyze it and maybe it would be secure?
    Or not yet?
  • 25:21 - 25:28
    Audrey: Well, there are certainly things to
    be said for having things in open source,
  • 25:28 - 25:33
    because necessarily doing analysis on
    source code is a much better
  • 25:33 - 25:41
    defined field than doing analysis
    on binary code. Additionally, you can
  • 25:41 - 25:48
    write your stuff in languages that are
    safer than C. I don't know if it's
  • 25:48 - 25:56
    safe to talk about
    Rust yet, but Rust is cool. Yeah, there's
  • 25:56 - 26:01
    lots of things that you can do. I just
    realized I didn't show off
  • 26:01 - 26:06
    the still-running analysis: the
    automated results. It did not finish in
  • 26:06 - 26:13
    time so I will run some magic, and now we
    have some results. Which..
  • 26:13 - 26:19
    applause
  • 26:19 - 26:22
    So, here's one of the analysis
  • 26:22 - 26:28
    results. We found at this location in the
    program a tainted variable, specifically
  • 26:28 - 26:33
    tainted at offset 261 into the tainted
    buffer. This variable was used as a
  • 26:33 - 26:41
    pointer. And that involved following the
    path along this way. So there
  • 26:41 - 26:46
    is a vulnerability that it discovered for you.
    So we can go on with questions - sorry, that
  • 26:46 - 26:47
    was a bit.
  • 26:47 - 26:49
    Herald: Any more questions from the
  • 26:49 - 26:58
    audience? There is no question from
    the internet. Okay, one question, go
  • 26:58 - 27:00
    ahead: talk into the mic please.
  • 27:00 - 27:03
    Question: You said that the bugs you found
  • 27:03 - 27:08
    were responsibly disclosed and fixed.
    Were they actually fixed in real existing
  • 27:08 - 27:12
    devices or did the vendors just say, "oh,
    we'll fix it in future devices."
  • 27:12 - 27:17
    Audrey: I wish I knew the answer to that
    question. I wasn't on the team that did this.
  • 27:17 - 27:23
    Yeah, I can't speak to that. That was just
    a slide in the slides that I
  • 27:23 - 27:30
    was given. I sure hope they were really
    responsibly disclosed. It's real hard to
  • 27:30 - 27:36
    push updates to the bootloader!
  • 27:36 - 27:40
    Herald: Okay, any more questions? Okay, so
  • 27:40 - 27:44
    let's conclude this talk. People, when you
    leave the hall, please take all your stuff
  • 27:44 - 27:50
    with you. Your bottles, your cups. Don't
    forget anything, have a last check. Thank
  • 27:50 - 27:55
    you very much, let's have one final hand
    for Audrey Dutcher, from California!
  • 27:55 - 28:01
    applause
  • 28:01 - 28:06
    34c3 outro
  • 28:06 - 28:23
    subtitles created by c3subtitles.de
    in the year 2018. Join, and help us!