
Marie Moe, Eireann Leverett: Unpatchable

  • 0:00 - 0:04
    ♪ preroll music ♪
  • 0:04 - 0:11
    Angel: The next talk will start now
  • 0:11 - 0:13
    and will be 'Unpatchable -
  • 0:13 - 0:15
    living with a vulnerable
    implanted device'
  • 0:15 - 0:18
    by Dr. Marie Moe and Eireann Leverett.
  • 0:18 - 0:22
    Give them a warm round
    of applause please.
  • 0:22 - 0:29
    applause
  • 0:33 - 0:39
    heart monitor beep sounds start
  • 0:39 - 0:40
    So, we are here today
  • 0:40 - 0:42
    to talk to you about a subject
  • 0:42 - 0:45
    that is really close to my heart.
  • 0:45 - 0:46
    I have a medical implant.
  • 0:46 - 0:49
    A pacemaker that is generating
  • 0:49 - 0:52
    every single beat of my heart.
  • 0:52 - 0:56
    But how can I trust my own heart,
  • 0:56 - 0:58
    when it's being controlled by a machine,
  • 0:58 - 1:00
    running proprietary code,
  • 1:00 - 1:04
    and there is no transparency?
  • 1:04 - 1:06
    So I'm a patient,
  • 1:06 - 1:09
    but I'm also a security researcher.
  • 1:09 - 1:11
    I'm a hacker, because I like
  • 1:11 - 1:13
    to figure out how things work.
  • 1:13 - 1:15
    That's why I started a project
  • 1:15 - 1:16
    on breaking my own heart,
  • 1:16 - 1:17
    together with Eireann
  • 1:17 - 1:20
    and a couple of friends.
  • 1:20 - 1:23
    Because I really want to know
  • 1:23 - 1:24
    what protocols are running
  • 1:24 - 1:27
    in this machine inside my body.
  • 1:27 - 1:29
    Is the crypto correctly implemented?
  • 1:29 - 1:33
    Does it even have crypto?
  • 1:35 - 1:38
    So I'm here to inspire you today.
  • 1:38 - 1:41
    I want more people
    to hack to save lives.
  • 1:41 - 1:44
    Because we are all becoming
  • 1:44 - 1:48
    more and more dependent on machines.
  • 1:48 - 1:50
    Maybe some of you in the audience
  • 1:50 - 1:52
    also have medical implants,
  • 1:52 - 1:53
    maybe you know someone
  • 1:53 - 1:58
    that's also depending on
    medical implants.
  • 1:58 - 2:00
    Imagine that this is your heartbeat
  • 2:00 - 2:04
    and it's being controlled by a device.
  • 2:04 - 2:06
    A device that might fail.
  • 2:06 - 2:10
    Due to software bugs,
  • 2:10 - 2:12
    due to hardware failures.
  • 2:12 - 2:14
    additional background sound:
    real heartbeat
  • 2:14 - 2:18
    Wouldn't you also like to know
  • 2:18 - 2:21
    if it has security vulnerabilities?
  • 2:21 - 2:24
    If it can be trusted?
  • 2:27 - 2:32
    sounds stop
    beeeeep
  • 2:32 - 2:36
    E: Something to think about, right?
  • 2:36 - 2:37
    M: Yeah.
  • 2:37 - 2:40
    E: Marie is an incredibly
    brave woman.
  • 2:40 - 2:43
    When she asked me to give this talk
  • 2:43 - 2:45
    it made me nervous, right?
  • 2:45 - 2:47
    It's such a personal story.
  • 2:47 - 2:49
    Such a journey as well.
  • 2:49 - 2:50
    And she's gonna talk to you
  • 2:50 - 2:51
    about a lot of things, right?
  • 2:51 - 2:54
    Not just hacking medical devices
  • 2:54 - 2:55
    from a safety point of view
  • 2:55 - 2:58
    but also some of the
    privacy concerns,
  • 2:58 - 2:59
    some of the transparency concerns,
  • 2:59 - 3:01
    some of the consent concerns.
  • 3:01 - 3:03
    So, there's a lot to get through
  • 3:03 - 3:05
    in the next hour.
  • 3:05 - 3:07
    But I think you're gonna enjoy it
  • 3:07 - 3:08
    quite a lot.
  • 3:08 - 3:11
    M: So, let me tell you
  • 3:11 - 3:13
    the story about my heart.
  • 3:13 - 3:15
    So, 4 years ago
  • 3:15 - 3:18
    I got my medical implant.
  • 3:18 - 3:21
    It was a kind of emergency situation
  • 3:21 - 3:23
    because my heart was starting to beat
  • 3:23 - 3:24
    really slow,
  • 3:24 - 3:26
    so I needed to have the pacemaker.
  • 3:26 - 3:29
    I had no choice.
  • 3:29 - 3:31
    After I got the implant,
  • 3:31 - 3:33
    since I was a security researcher,
  • 3:33 - 3:34
    of course I started to
  • 3:34 - 3:37
    look up information about how it worked.
  • 3:37 - 3:38
    And I googled for information.
  • 3:38 - 3:40
    I found a technical manual
  • 3:40 - 3:41
    of my pacemaker
  • 3:41 - 3:44
    and I started to read it.
  • 3:44 - 3:46
    And I was quite surprised
  • 3:46 - 3:48
    when I learned that
  • 3:48 - 3:52
    my pacemaker has 2 wireless interfaces.
  • 3:52 - 3:55
    There is one interface that is really
  • 3:55 - 3:56
    close field communication,
  • 3:56 - 3:59
    near field communication
  • 3:59 - 4:01
    that is being used when I'm at checkups
  • 4:01 - 4:03
    at the hospital,
  • 4:03 - 4:06
    where the technician,
  • 4:06 - 4:08
    the pacemaker technician or doctor
  • 4:08 - 4:10
    uses a programming device
  • 4:10 - 4:12
    and places it
  • 4:12 - 4:14
    really close to my pacemaker.
  • 4:14 - 4:17
    And it's possible to use that
  • 4:17 - 4:20
    communication to adjust the settings.
  • 4:20 - 4:22
    But it also has another
  • 4:22 - 4:23
    wireless interface,
  • 4:23 - 4:25
    that I was not aware of,
  • 4:25 - 4:28
    that I was not informed of
    as a patient.
  • 4:28 - 4:31
    It has a possibility for remote monitoring
  • 4:31 - 4:32
    or telemetry,
  • 4:32 - 4:36
    where you can have an
    access point in your house
  • 4:36 - 4:37
    that will communicate
  • 4:37 - 4:39
    with the pacemaker
  • 4:39 - 4:42
    at a couple of meters distance.
  • 4:42 - 4:44
    And it can collect logs from the pacemaker
  • 4:44 - 4:46
    and send them to a server
  • 4:46 - 4:48
    at the vendor.
  • 4:48 - 4:49
    And there is a web interface
  • 4:49 - 4:50
    where the doctor can log in
  • 4:50 - 4:53
    and retrieve my information.
  • 4:53 - 4:55
    And I have no access to the data
  • 4:55 - 4:56
    that is being collected
  • 4:56 - 4:58
    by my device.
  • 4:58 - 5:00
    E: So imagine for a moment
  • 5:00 - 5:02
    that you are buying a new phone
  • 5:02 - 5:04
    or buying a new laptop.
  • 5:04 - 5:05
    You would do your homework, right?
  • 5:05 - 5:07
    You would understand
    what interfaces where there.
  • 5:07 - 5:10
    But in Marie's case she's just
  • 5:10 - 5:12
    given a device,
    and then later she gets
  • 5:12 - 5:14
    to go and read the manual, right?
  • 5:14 - 5:17
    So she's the epitome
    of an informed consumer
  • 5:17 - 5:18
    in this space
  • 5:18 - 5:20
    and we want a lot more
    informed consumers
  • 5:20 - 5:21
    in this space,
  • 5:21 - 5:22
    which is why we are giving this talk.
  • 5:22 - 5:24
    Now, I don't know about you,
  • 5:24 - 5:26
    but I'm used to hacking
  • 5:26 - 5:27
    industrial systems.
  • 5:27 - 5:29
    I haven't done as
    much medical research
  • 5:29 - 5:30
    in the past.
  • 5:30 - 5:32
    So, when I first
    started this project
  • 5:32 - 5:33
    I knew literally nothing
  • 5:33 - 5:35
    about Marie's heart.
  • 5:35 - 5:36
    Or even my own.
  • 5:36 - 5:39
    And she had to teach me
    how the heart works
  • 5:39 - 5:40
    and how her pacemaker works.
  • 5:40 - 5:43
    So, would you mind explaining
  • 5:43 - 5:45
    some details to the audience
    that will be relevant
  • 5:45 - 5:46
    through the rest of the presentation?
  • 5:46 - 5:48
    M: Actually I think
    we're going to show you
  • 5:48 - 5:50
    a video of
    how the heart works.
  • 5:50 - 5:53
    So, it's a little bit of
    biology introduction here
  • 5:53 - 5:58
    before we start
    with the technical details.
  • 5:58 - 6:01
    So, this.. play the video.
  • 6:01 - 6:03
    Video: A normal heart beat rate
  • 6:03 - 6:07
    and rhythm is called
    'Normal Sinus Rhythm'.
  • 6:07 - 6:09
    The heart's pumping action
  • 6:09 - 6:11
    is driven by electrical stimulation
  • 6:11 - 6:14
    within the heart muscle.
  • 6:14 - 6:15
    The heart's electrical system
  • 6:15 - 6:17
    allows it to beat in an
  • 6:17 - 6:20
    organized, synchronized pattern.
  • 6:20 - 6:21
    Every normal heart beat
  • 6:21 - 6:23
    has 4 steps.
  • 6:23 - 6:25
    Step 1:
  • 6:25 - 6:27
    As blood flows into the heart
  • 6:27 - 6:28
    an electrical impulse
  • 6:28 - 6:31
    from an upper area of the right atrium
  • 6:31 - 6:34
    also known as the sinus node
  • 6:34 - 6:36
    causes the atria to contract.
  • 6:36 - 6:38
    Step 2:
    When the atria contract
  • 6:38 - 6:39
    they squeeze the blood
  • 6:39 - 6:42
    into the ventricles.
  • 6:42 - 6:43
    Step 3:
  • 6:43 - 6:45
    There is a very short pause
  • 6:45 - 6:48
    only about a fraction of a second.
  • 6:48 - 6:49
    and Step 4:
  • 6:49 - 6:51
    The ventricles contract
  • 6:51 - 6:56
    pumping the blood to the body.
  • 6:56 - 6:57
    A heart normally beats
  • 6:57 - 7:01
    between 60-100 times/min.
  • 7:01 - 7:02
    Electrical signals in your heart
  • 7:02 - 7:05
    can become blocked or irregular,
  • 7:05 - 7:06
    causing a disruption
  • 7:06 - 7:08
    in your heart's normal rhythm.
  • 7:08 - 7:10
    When the heart's rhythm is too fast,
  • 7:10 - 7:13
    too slow or out of order,
  • 7:13 - 7:14
    an arrhythmia,
  • 7:14 - 7:19
    also called a rhythm disorder, occurs.
  • 7:19 - 7:21
    When your heart beats out of rhythm,
  • 7:21 - 7:22
    it may not deliver enough blood
  • 7:22 - 7:25
    to your body.
  • 7:25 - 7:26
    Rhythm disorders can be caused
  • 7:26 - 7:28
    by a number of factors
  • 7:28 - 7:31
    including disease, heredity,
  • 7:31 - 7:34
    medications or other factors.
  • 7:34 - 7:37
    E: So for those of you
    who are already aware of that,
  • 7:37 - 7:38
    apologies.
  • 7:38 - 7:39
    But I needed to learn that.
  • 7:39 - 7:40
    I needed to learn the basics
  • 7:40 - 7:42
    before we even got started, right?
  • 7:42 - 7:44
    So...
  • 7:44 - 7:47
    M: So this is a diagram of the
  • 7:47 - 7:50
    electrical system of the heart.
  • 7:50 - 7:52
    So, as you see,
    this is the sinus node
  • 7:52 - 7:54
    that is generating the pulse.
  • 7:54 - 7:56
    And in my case
  • 7:56 - 7:59
    I had a problem with the signal
  • 7:59 - 8:02
    being generated by the sinus node
  • 8:02 - 8:05
    not reaching the lower
    heart chamber.
  • 8:05 - 8:11
    It's something called an AV block
    or a heart block.
  • 8:11 - 8:14
    So, occasionally this will cause
  • 8:14 - 8:17
    an arrhythmia that makes
    the heart pause.
  • 8:17 - 8:18
    If you don't have a heart beat
  • 8:18 - 8:20
    for, like ... 8-10 seconds,
  • 8:20 - 8:22
    you lose your consciousness.
  • 8:22 - 8:24
    And that was, what happened to me.
  • 8:24 - 8:26
    I just suddenly found myself
  • 8:26 - 8:27
    lying on the floor
  • 8:27 - 8:29
    and I didn't remember how I got there.
  • 8:29 - 8:31
    And it turned out that it was my heart
  • 8:31 - 8:34
    that had taken a break.
  • 8:34 - 8:37
    So that's how I discovered
  • 8:37 - 8:39
    that I had this issue.
  • 8:39 - 8:41
    So, this is where the signal is blocked
  • 8:41 - 8:44
    on the way down to the lower heart chamber
  • 8:44 - 8:46
    But there's a backup function
  • 8:46 - 8:51
    in the heart that can make
  • 8:51 - 8:52
    a so called backup pulse.
  • 8:52 - 8:55
    And I had that backup pulse
  • 8:55 - 8:57
    when I went to the
    emergency room.
  • 8:57 - 9:00
    So I had a pulse
    around 30-40 beats/min.
  • 9:00 - 9:03
    And that's generated by some cells
  • 9:03 - 9:05
    in the lower heart chamber.
  • 9:05 - 9:08
    So, after I got the pacemaker
  • 9:08 - 9:09
    my heart started to become
  • 9:09 - 9:10
    a little bit more lazy.
  • 9:10 - 9:12
    So it is not certain,
  • 9:12 - 9:14
    that I will have this backup pulse
  • 9:14 - 9:17
    anymore if the pacemaker
    stops working.
  • 9:17 - 9:18
    So currently
  • 9:18 - 9:22
    my heart is 100% running
    on the pacemaker.
  • 9:22 - 9:27
    So, let's also look at
    how the pacemaker works.
  • 9:27 - 9:30
    I have another video of that.
  • 9:30 - 9:32
    So, this is my little friend
  • 9:32 - 9:34
    that is running my heart.
  • 9:34 - 9:38
    Video: A pacemaker
    is a miniaturized computer
  • 9:38 - 9:41
    that is used to treat
    a slow heart beat.
  • 9:41 - 9:43
    It is about the size
  • 9:43 - 9:45
    of a couple of stacked silver dollars
  • 9:45 - 9:49
    and weighs approximately 17-25 grams.
  • 9:49 - 9:52
    It is usually surgically placed
  • 9:52 - 9:54
    or implanted just under the skin
  • 9:54 - 9:57
    in the chest area.
  • 9:57 - 10:00
    The device sends
    a tiny electrical pulse
  • 10:00 - 10:02
    down a thin coated wire,
  • 10:02 - 10:05
    called a lead, into your heart.
  • 10:05 - 10:07
    This stimulates the heart to beat.
  • 10:07 - 10:09
    These impulses are very tiny
  • 10:09 - 10:12
    and most people
    do not feel them.
  • 10:12 - 10:14
    While the device
    helps your heart
  • 10:14 - 10:16
    maintain its rhythm,
  • 10:16 - 10:17
    it also stores information
  • 10:17 - 10:18
    about your heart that can be
  • 10:18 - 10:20
    retrieved by your doctor
  • 10:20 - 10:22
    to program the device.
  • 10:22 - 10:24
    E: Remember that!
  • 10:24 - 10:26
    M: Yeah... Did you see
  • 10:26 - 10:29
    the ones and zeros at the end
  • 10:29 - 10:29
    of the video?
  • 10:29 - 10:31
    That's what we want
    to know more about.
  • 10:31 - 10:33
    Because this information
  • 10:33 - 10:35
    that is being collected
    by the pacemaker,
  • 10:35 - 10:37
    how it works,
  • 10:37 - 10:39
    what the code looks like,
  • 10:39 - 10:40
    it's all closed source,
  • 10:40 - 10:42
    it's all proprietary information.
  • 10:42 - 10:45
    And that's why we need more
  • 10:45 - 10:46
    security researchers,
  • 10:46 - 10:49
    we need more 3rd party testing,
  • 10:49 - 10:52
    to be sure that we can trust this code.
  • 10:52 - 10:54
    E: And you can imagine that
  • 10:54 - 10:56
    we're doing some of
    this research as well.
  • 10:56 - 10:58
    But I'm not gonna break
    Marie's heart on stage,
  • 10:58 - 10:59
    I'm not gonna drop 0-day
  • 10:59 - 11:01
    on some medical devices,
  • 11:01 - 11:03
    so if you came for that,
  • 11:03 - 11:04
    it's not worth staying.
  • 11:04 - 11:05
    The rest of the presentation
  • 11:05 - 11:07
    will be about some of
    the things we found
  • 11:07 - 11:08
    and how this works and
  • 11:08 - 11:10
    how you might approach this research.
  • 11:10 - 11:12
    And some of the people
    who did this research before,
  • 11:12 - 11:12
    because there's plenty of others,
  • 11:12 - 11:13
    and we like to give a shout-out
  • 11:13 - 11:16
    to those who've done
    great research before us.
  • 11:16 - 11:19
    But essentially this point is
  • 11:19 - 11:20
    very relevant.
  • 11:20 - 11:21
    That the internet
    of medical things
  • 11:21 - 11:23
    is already here.
  • 11:23 - 11:25
    And Marie is wired into it.
  • 11:25 - 11:27
    She's a bit younger than the average
  • 11:27 - 11:30
    pacemaker patient, but, you know,
  • 11:30 - 11:32
    she was thrust into this situation
  • 11:32 - 11:33
    where she had to think about things
  • 11:33 - 11:34
    in a very different way.
  • 11:34 - 11:36
    Like, you did a Master's,
    breaking crypto,
    breaking crypto,
  • 11:36 - 11:39
    and also a PhD in Information Security.
  • 11:39 - 11:41
    Did you imagine that
    the things you learned
  • 11:41 - 11:43
    about SSH and
    network security
  • 11:43 - 11:47
    might one day apply to your
    heart and your own body?
  • 11:47 - 11:50
    M: No, I never
    imagined that
  • 11:50 - 11:53
    my research would eventually
    end up inside my own body.
  • 11:53 - 11:55
    That's something I never
    thought about.
  • 11:55 - 11:58
    And also, there's a lot of
  • 11:58 - 12:00
    people that don't think about
  • 12:00 - 12:03
    how the medical devices
    actually work.
  • 12:03 - 12:05
    So, when I asked this question
  • 12:05 - 12:06
    to health care professionals
  • 12:06 - 12:09
    they look at me like I'm crazy,
  • 12:09 - 12:11
    they don't ... they have never
    thought about this before.
  • 12:11 - 12:15
    That there's actually code
    inside my body
  • 12:15 - 12:16
    and someone has
    programmed it,
  • 12:16 - 12:18
    someone has
    written this code.
  • 12:18 - 12:20
    And did they think
    about the fact that this
  • 12:20 - 12:23
    would actually control
    someone's life,
  • 12:23 - 12:27
    and be my own personal
    critical infrastructure?
  • 12:29 - 12:31
    E: Yeah, personal
    infrastructure, right?
  • 12:31 - 12:33
    On a physical level.
  • 12:33 - 12:35
    And also, I think, it's...
  • 12:35 - 12:38
    You know, the point that you made
    is important to reiterate,
  • 12:38 - 12:39
    that you go and see your doctor
  • 12:39 - 12:40
    and you ask these questions about
  • 12:40 - 12:42
    whether anyone can hack into my heart
  • 12:42 - 12:44
    and they probably look
    at you and go like
  • 12:44 - 12:47
    'Don't you worry your pretty
    little head about that', right?
  • 12:47 - 12:48
    But Marie used to head up
  • 12:48 - 12:50
    the Norwegian computer
    emergency response team
  • 12:50 - 12:51
    for a couple of years
  • 12:51 - 12:53
    and knows a lot of hackers
  • 12:53 - 12:55
    and knows what she's
    talking about, right?
  • 12:55 - 12:57
    So, when she asked her doctor
    these questions,
  • 12:57 - 12:59
    they're very legitimate questions.
  • 12:59 - 13:01
    And the doctors probably
    don't know anything about code,
  • 13:01 - 13:03
    but they need to move
    towards a place
  • 13:03 - 13:05
    where they can answer
    those questions with some
  • 13:05 - 13:08
    honesty and certainty and
    treat them with the dignity
  • 13:08 - 13:11
    that they deserve.
  • 13:11 - 13:12
    Should we show them
    a little bit more
  • 13:12 - 13:14
    about the total ecosystem
    of devices
  • 13:14 - 13:17
    that we are talking about,
    at least in this particular talk?
  • 13:17 - 13:19
    M: Yeah.
  • 13:19 - 13:22
    E: So, this was
    all new to me.
  • 13:22 - 13:25
    I mean I've moved around
    in networks and done some
  • 13:25 - 13:28
    penetration testing and
    some stuff in the past,
  • 13:28 - 13:32
    but I didn't know much about
    implantable medical devices.
  • 13:32 - 13:34
    So, we've got a couple
    of them there.
  • 13:34 - 13:38
    The ICD, which is the
    implantable cardioverter-defibrillator,
  • 13:38 - 13:40
    that's some of the work
    that you saw from Barnaby Jack
  • 13:40 - 13:42
    which we will mention later,
  • 13:42 - 13:43
    was on those particular devices,
  • 13:43 - 13:45
    We've got the pacemakers
    and of course other devices
  • 13:45 - 13:47
    could be in this diagram as well.
  • 13:47 - 13:49
    Like, we could be talking
    about insulin pumps
  • 13:49 - 13:51
    or other things in the future.
  • 13:51 - 13:55
    The device itself speaks
    to box number 2,
  • 13:55 - 13:56
    which we will tell you a little bit
    more about in a moment,
  • 13:56 - 14:00
    using a protocol, commonly
    referred to as 'MICS'.
  • 14:00 - 14:02
    A number of different
    devices use this
  • 14:02 - 14:06
    Medical Implant
    Communication Service.
  • 14:06 - 14:09
    And Marie shocked me yesterday
  • 14:09 - 14:11
    when she found
    a couple of devices
  • 14:11 - 14:16
    that potentially use Bluetooth. sighing
    laughter
  • 14:16 - 14:20
    So, would you like to tell them
    a little bit more about the access point,
  • 14:20 - 14:21
    and I'll join in?
  • 14:21 - 14:24
    M: Yeah, so, the access
    point is the device
  • 14:24 - 14:27
    that you can typically have
    on your nightstand
  • 14:27 - 14:32
    and that will, depending
    on your configuration,
  • 14:32 - 14:35
    contact your pacemaker
    at regular intervals,
  • 14:35 - 14:38
    e.g. once during the night.
  • 14:38 - 14:41
    It will start a communication
    with the pacemaker,
  • 14:41 - 14:43
    couple of meters distance,
  • 14:43 - 14:44
    and will start
    collecting logs.
  • 14:44 - 14:47
    And these logs will
    then be sent,
  • 14:47 - 14:52
    it can be via SMS
    or other means,
  • 14:52 - 14:54
    to a server.
  • 14:54 - 14:59
    So, there's a lot of my
    personal information
  • 14:59 - 15:02
    that can end up in different
    places in this diagram.
  • 15:02 - 15:06
    So, of course it's
    in my own device,
  • 15:06 - 15:10
    it will be then communicated
    via this access point
  • 15:10 - 15:11
    and also then
  • 15:11 - 15:14
    via the cellular network.
  • 15:14 - 15:20
    And then it will also be stored
    in the telemetry server.
  • 15:20 - 15:25
    Potentially when I go
    for the checkups
  • 15:25 - 15:29
    my personal information will
    also end up in my
  • 15:29 - 15:30
    doctor's workstation
  • 15:30 - 15:37
    or in the electronic
    patient records.
  • 15:37 - 15:40
    And there's a lot of things
    that can go wrong there.
  • 15:40 - 15:42
    E: Yeah, you
    can see, it's using
  • 15:42 - 15:47
    famously secure methods
    of communication
  • 15:47 - 15:52
    that have never been backdoored or
    compromised by anyone ever before,
  • 15:52 - 15:56
    even here at this conference,
    probably even this time around.
  • 15:56 - 16:00
    So these are some things
    that are concerning.
  • 16:00 - 16:03
    The data also travels often
    to other countries
  • 16:03 - 16:05
    and so there are questions
    about the jurisdiction
  • 16:05 - 16:10
    in terms of privacy laws
    for some of this data.
  • 16:10 - 16:13
    And some of you can go and
    look deeper into that as well.
  • 16:13 - 16:15
    The telemetry store thing
    I think is important,
  • 16:15 - 16:20
    part of this system is a telemetry store,
    such as the server at the vendor.
  • 16:20 - 16:22
    So the vendor owns some
    machines somewhere
  • 16:22 - 16:24
    that collect data
    from Marie's heart.
  • 16:24 - 16:27
    So you can imagine she goes to see her
    doctor and the doctor is like:
  • 16:27 - 16:31
    'Hey, Marie, last weekend, did you, ...
    run a half marathon or something?'
  • 16:31 - 16:33
    And she hasn't told him, right?
  • 16:33 - 16:35
    Like, he can just look
    at the data and see,
  • 16:35 - 16:39
    that her heart rate was up
    for a couple hours.
  • 16:39 - 16:41
    That's true though, right? You
    did actually run a half marathon.
  • 16:41 - 16:44
    M: Yeah, I did run a half marathon.
    laughing
  • 16:44 - 16:47
    E: So, the telemetry
    store is one part,
  • 16:47 - 16:48
    but there's also the
    doctors work station
  • 16:48 - 16:51
    which contains a lot of
    this medical data.
  • 16:51 - 16:54
    So, from a privacy perspective
    that's part of the attack surface.
  • 16:54 - 16:55
    But there's also the programmers, right?
  • 16:55 - 16:58
    There are the device programmers.
  • 16:58 - 17:01
    So that's an interesting point, that
    I hope a lot of you are interested in
  • 17:01 - 17:05
    already, that there
    is a programmer
  • 17:05 - 17:06
    for these devices.
  • 17:06 - 17:10
    M: So, we actually
    went shopping on eBay
  • 17:10 - 17:12
    and we found some
    of these devices.
  • 17:12 - 17:13
    E: You can buy them on eBay?
  • 17:13 - 17:14
    M: Yeah.
    E: laughing
  • 17:14 - 17:17
    M: So, I found
    a programmer
  • 17:17 - 17:19
    that can program
    my device, on eBay
  • 17:19 - 17:21
    and I bought it.
  • 17:21 - 17:22
    And I also found a couple of
    these access points.
  • 17:22 - 17:26
    So, that's what we're
    now starting to look at.
  • 17:26 - 17:29
    E: We just wanna give
    you an overview of this system,
  • 17:29 - 17:32
    and it's fairly similar across the
    different device vendors,
  • 17:32 - 17:35
    and we're not going to talk
    about individual vendors.
  • 17:35 - 17:37
    But if you're gonna go and
    do this kind of research
  • 17:37 - 17:40
    you can see that some of the research
    you've already done in the past
  • 17:40 - 17:43
    applies to different parts
    of this process.
  • 17:43 - 17:47
    M: And talking about
    patient privacy,
  • 17:47 - 17:51
    when we got the
    programmer from eBay
  • 17:51 - 17:54
    it actually contained
    patient information.
  • 17:54 - 17:57
    So, that's a
    really bad thing.
  • 17:57 - 17:59
    E: So, I found
    this very odd.
  • 17:59 - 18:01
    I had a similar reaction
    to yourselves because
  • 18:01 - 18:03
    I usually do industrial
    system stuff.
  • 18:03 - 18:06
    One of my friends picked up
    some PLCs recently and
  • 18:06 - 18:10
    they had data from the nuclear plant
    that the PLCs had been used in.
  • 18:10 - 18:14
    So, decommissioning is a problem
    in industrial systems
  • 18:14 - 18:18
    but it turns out also
    in medical devices, right?
  • 18:18 - 18:20
    I guess that's a useful point
    to make as well,
  • 18:20 - 18:23
    about the costs of doing
    this kind of research.
  • 18:23 - 18:26
    It is possible to get some
    devices, some implants
  • 18:26 - 18:29
    from people who have sadly
    passed on,
  • 18:29 - 18:33
    but that comes with a very high
    cost of biomedical decontamination.
  • 18:33 - 18:36
    So that raises the cost
    of doing this research
  • 18:36 - 18:38
    on the implants themselves,
    not necessarily on the rest
  • 18:38 - 18:39
    of the devices.
  • 18:39 - 18:43
    M: Yeah, so, I also want
    to say that in this research
  • 18:43 - 18:44
    I have not tinkered
    with my own device.
  • 18:44 - 18:47
    So, that would not be a good thing ...
  • 18:47 - 18:50
    E: You're not gonna let me,
    like, SSH into your heart and just ...
  • 18:50 - 18:52
    M: Um.. No.
    E: ... just delete some stuff.. No?
  • 18:52 - 18:55
    M: No.
    E: I wouldn't do it anyway,
  • 18:55 - 18:57
    but it's an interesting point, right?
  • 18:57 - 18:59
    So, like, there are a lot of
    safety precautions
  • 18:59 - 19:01
    that we and the rest
    of the team have to take
  • 19:01 - 19:02
    when we are doing this research.
  • 19:02 - 19:06
    And one of them is
    not pairing Marie's pacemaker
  • 19:06 - 19:09
    with any of the devices
    that are under test.
  • 19:09 - 19:14
    Do you wanna say a bit more
    about connectivity and vulnerability?
  • 19:14 - 19:15
    M: Yeah, so...
  • 19:15 - 19:19
    I was worried
    when I discovered that
  • 19:19 - 19:24
    I had this possible connectivity
    to the medical internet of things.
  • 19:24 - 19:29
    In my case this is switched off
    in the configurations
  • 19:29 - 19:30
    but it's there.
  • 19:30 - 19:33
    It's possible to turn it on,
    it's possible for me to be
  • 19:33 - 19:37
    hooked up to
    this internet of medical things.
  • 19:37 - 19:40
    And for some patients
    this is a real benefit.
  • 19:40 - 19:43
    So you always have to make
    a risk-based decision
  • 19:43 - 19:48
    on whether or not to
    make use of this
  • 19:48 - 19:49
    connectivity.
  • 19:49 - 19:52
    But I think it's really important
    that you make an informed decision
  • 19:52 - 19:55
    about that and that the patient
  • 19:55 - 20:02
    is informed and has given
    his or her consent
  • 20:02 - 20:04
    to have this feature.
  • 20:04 - 20:08
    The battery lifetime of my pacemaker
    is around 10 years.
  • 20:08 - 20:10
    So in 6 years time
  • 20:10 - 20:13
    I will have to have a
    replacement surgery
  • 20:13 - 20:16
    and I'm going to be
    a really difficult patient laughing
  • 20:16 - 20:18
    laughter
  • 20:18 - 20:24
    So, ...
    applause
  • 20:24 - 20:25
    E: Right on.
  • 20:25 - 20:28
    M: I really want to know
  • 20:28 - 20:30
    how the devices work
    by then and
  • 20:30 - 20:34
    I want to make an informed
    decision on whether or not
  • 20:34 - 20:36
    to have this connectivity.
  • 20:36 - 20:39
    But of course for lot of patients
    the benefit of having this
  • 20:39 - 20:41
    outweighs the risk.
  • 20:41 - 20:45
    Because people who have other
    heart problems than mine,
  • 20:45 - 20:47
    they have to go for more
    frequent checkups.
  • 20:47 - 20:50
    I only have to go once a year.
  • 20:50 - 20:53
    So, for patients that need to go
    frequently for checkups,
  • 20:53 - 20:56
    it's really good for them
    to have the possibility
  • 20:56 - 20:58
    of having telemetry and
    having connectivity to
  • 20:58 - 21:00
    have remote patient monitoring.
  • 21:00 - 21:04
    E: Yeah, imagine you
    have mobility problems or
  • 21:04 - 21:06
    you even just live far
  • 21:06 - 21:09
    from a major city.
  • 21:09 - 21:11
    And making the journey
    to the hospital is quite arduous,
  • 21:11 - 21:15
    then this kind of remote
    telemetry allows your doctor
  • 21:15 - 21:17
    to keep track of
    what's going on.
  • 21:17 - 21:20
    And that's very important,
    we don't wanna, like...
  • 21:20 - 21:22
    have a big scary testosterone-filled
    talk where we, like,
  • 21:22 - 21:23
    hack some pacemakers.
  • 21:23 - 21:27
    We wanna talk about
    how there's a dual use thing
  • 21:27 - 21:28
    going on here.
  • 21:28 - 21:32
    And that there is a lot of value
    in having these devices
  • 21:32 - 21:36
    but we also want them to be safe
    and secure and preserve our privacy
  • 21:36 - 21:39
    and a lot of other things.
  • 21:39 - 21:44
    So, these are some
    of the issues.
  • 21:44 - 21:46
    Of course the last one,
    the remote assassination scenario,
  • 21:46 - 21:49
    that's everyone's favorite one
    to fantasize about
  • 21:49 - 21:53
    or talk about, or make
    movies about, but
  • 21:53 - 21:55
    we think there's a lot of
    other issues in here
  • 21:55 - 21:57
    that are more interesting,
  • 21:57 - 21:59
    some quality issues even, right,
  • 21:59 - 22:02
    that we'll talk about
    in a little bit.
  • 22:02 - 22:03
    Battery exhaustion,
  • 22:03 - 22:07
    again something many people
    don't think about. But...
  • 22:07 - 22:09
    I'm very interested in
    cyber-physical exploitation
  • 22:09 - 22:13
    and so some of these elements
    were interesting to me
  • 22:13 - 22:16
    that you might use the device
    in a way that wasn't expected.
  • 22:16 - 22:21
    M: So personally I'm not afraid
    of being remotely assassinated.
  • 22:21 - 22:23
    E: I've actually never known
    you to be afraid of anything
  • 22:23 - 22:25
    M: laughing
  • 22:25 - 22:29
    I'm more worried about
    software bugs in my device,
  • 22:29 - 22:32
    the things that can malfunction,
  • 22:32 - 22:34
    E: Is that just theoretical?
  • 22:34 - 22:37
    M: No, actually software bugs
  • 22:37 - 22:39
    have killed people.
  • 22:39 - 22:41
    So, think about that!
  • 22:41 - 22:42
    People that are not here,
  • 22:42 - 22:45
    they don't have their voice
    and they can't really
  • 22:45 - 22:46
    tell their story.
  • 22:46 - 22:51
    But there are stories about people
    depending on medical devices
  • 22:51 - 22:54
    dying because their
    device malfunctioned.
  • 22:54 - 22:58
    E: There's even some
    great research
  • 22:58 - 23:02
    from academics about
    how the user interface design
  • 23:02 - 23:05
    of medical devices can have
    an impact on patient safety
  • 23:05 - 23:07
    and how designing UX
  • 23:07 - 23:10
    much more clearly
    and concisely
  • 23:10 - 23:12
    specifically for the
    medical profession
  • 23:12 - 23:18
    might improve
    the care of patients.
  • 23:18 - 23:20
    Do you wanna say more
    about this slide or should we
  • 23:20 - 23:22
    go on to the previous work,
    should we... go ahead!
  • 23:22 - 23:25
    M: Yeah, I think it's really
    important also to...
  • 23:25 - 23:28
    the issue of trusting the vendors.
  • 23:28 - 23:31
    So, as a patient I'm
    expected to just, you know,
  • 23:31 - 23:35
    trust, that my device
    is working correctly,
  • 23:35 - 23:39
    every security vulnerability
    has been corrected by the vendor
  • 23:39 - 23:40
    and it's safe.
  • 23:40 - 23:43
    But I want to have more
    third party testing,
  • 23:43 - 23:48
    I want to have more security
    research on medical implants.
  • 23:48 - 23:52
    And as with a lot of things, like ...
    history has shown
  • 23:52 - 23:58
    we can't always trust that
    the vendors do the right thing.
  • 23:58 - 24:00
    E: I think this is a good
    opportunity for us to ask
  • 24:00 - 24:03
    a very fun question, which is:
  • 24:03 - 24:06
    Any fans of DMCA in the room?
  • 24:06 - 24:08
    laughter
  • 24:08 - 24:09
    No? No fans? Alright.
  • 24:09 - 24:13
    Well, then you'll really enjoy this.
  • 24:13 - 24:17
    Marie has some very exciting news
    about DMCA exemptions.
  • 24:17 - 24:21
    M: Yeah, so... October, this year
  • 24:21 - 24:28
    there was a ruling of
    a DMCA exemption for
  • 24:28 - 24:31
    security research
    on medical devices
  • 24:31 - 24:34
    also for automotive security research.
  • 24:34 - 24:35
    So, this means, that
  • 24:35 - 24:39
    as researchers you can
  • 24:39 - 24:42
    actually do reverse engineering
    of medical implants
  • 24:42 - 24:46
    without infringing copyright laws.
  • 24:46 - 24:48
    It will take effect
    I think October next year.
  • 24:48 - 24:51
    E: Yeah.
    M: That is really a big
  • 24:51 - 24:54
    step forward in my opinion.
  • 24:54 - 24:56
    And I hope that this will
    encourage more research.
  • 24:56 - 25:00
    And I also want to mention
    that there are
  • 25:00 - 25:03
    fellow activist patients
    like myself
  • 25:03 - 25:07
    who were behind the proposal
    for these exemptions.
  • 25:07 - 25:12
    So, Jay Radcliff who hacked
    his own insulin pump,
  • 25:12 - 25:16
    Karen Sandler, who is a free and
    open-source software advocate.
  • 25:16 - 25:21
    And Hugo Campos, who has
    an ICD implant, he is very ...
  • 25:21 - 25:25
    he wants to have access
    to his own data
  • 25:25 - 25:28
    for quantified self reasons.
  • 25:28 - 25:31
    So these patients,
    they actually
  • 25:31 - 25:36
    made this happen,
    that you're allowed to do
  • 25:36 - 25:39
    security research
    on medical devices.
  • 25:39 - 25:41
    I think that's really great.
  • 25:41 - 25:48
    applause
  • 25:48 - 25:52
    E: Do you wanna say something
    about Scott Erven's presentation
  • 25:52 - 25:52
    that you saw at DEF CON?
  • 25:52 - 25:54
    M: Yeah, that was a really
    interesting presentation about
  • 25:54 - 26:00
    how medical devices have
    really poor security.
  • 26:00 - 26:02
    And they have, like,
    hard coded credentials,
  • 26:02 - 26:06
    and you can find them
    using Shodan on the internet.
  • 26:06 - 26:10
    These were not pacemakers,
    but other types of
  • 26:10 - 26:11
    different medical devices.
  • 26:11 - 26:17
    There are, like, hospital networks
    that are completely open
  • 26:17 - 26:21
    and you can access
    the medical equipment
  • 26:21 - 26:26
    using default passwords that
    you can find in the manuals.
  • 26:26 - 26:27
    And the vendors claim that
  • 26:27 - 26:30
    no, these are not hard coded,
    these are default,
  • 26:30 - 26:34
    but then the manuals say:
    Do not change this password...
  • 26:34 - 26:37
    E: Because they want to
    integrate with other stuff, right? So...
  • 26:37 - 26:41
    I've heard that excuse from SCADA,
    so I wasn't having it.
  • 26:41 - 26:44
    M: They also put up some
    medical device honeypots
  • 26:44 - 26:49
    to see if there were
    targeted hacking attempts
  • 26:49 - 26:55
    but they only picked up regular malware
    on them, which is also ...
  • 26:55 - 26:57
    E: Only!
    M: ... of course a concern laughing
  • 26:57 - 27:01
    E: Anything else,
    about prior art, Kevin?
  • 27:01 - 27:05
    M: I guess we should mention
    that the academic research
  • 27:05 - 27:08
    on hacking pacemakers,
    which was started by
  • 27:08 - 27:11
    a group led by Kevin Fu
  • 27:11 - 27:14
    and they had their
    first paper in 2008
  • 27:14 - 27:15
    that they also followed up
    with more academic research
  • 27:15 - 27:18
    and they showed that it's
    possible to hack a pacemaker.
  • 27:18 - 27:21
    They showed that...
    this was possible at, like,
  • 27:21 - 27:23
    a couple of centimeters
    distance only,
  • 27:23 - 27:28
    so, like, the attack scenario
    would be, if you have a
  • 27:28 - 27:30
    device similar to the
    programmers device
  • 27:30 - 27:34
    and you attack me with it
    you can laughing
  • 27:34 - 27:34
    turn off my pacemaker.
  • 27:34 - 27:36
    That's not really scary,
  • 27:36 - 27:40
    but then we have the research
    by Barnaby Jack
  • 27:40 - 27:46
    where this range of the attack
    is extended to several meters
  • 27:46 - 27:49
    so you have someone with
    an antenna in a room
  • 27:49 - 27:51
    scanning for pacemakers
  • 27:51 - 27:54
    and starting to program them.
  • 27:54 - 28:00
    E: We have a saying
    at Cambridge about that.
  • 28:00 - 28:02
    Some of the other people at the
    university have been doing attacks
  • 28:02 - 28:05
    a lot longer than I have, and
    one of the things they say is:
  • 28:05 - 28:07
    'Attacks only get worse,
    they never get better.'
  • 28:07 - 28:11
    So, the range might be short one year,
    then a couple of years later it's worse.
  • 28:11 - 28:16
    M: The worst case scenario
    I think would be remotely,
  • 28:16 - 28:20
    via the internet being able to
    hack pacemakers,
  • 28:20 - 28:24
    but there's no research so far
    indicating that that's possible.
  • 28:24 - 28:27
    E: And we don't wanna
    hype that up. We don't wanna...
  • 28:27 - 28:29
    M: No.
    E: ... get that kind of an angle
  • 28:29 - 28:32
    on this talk. We wanna make the
    point that hacking can save lives,
  • 28:32 - 28:39
    that hackers are a global citizens'
    resource to save lives, right? So...
  • 28:39 - 28:45
    M: Yeah, so, this is the result
    of hacking the drug infusion pumps.
  • 28:45 - 28:49
    Earlier this year
  • 28:49 - 28:55
    the FDA actually issued the first ever
    recall of a medical device
  • 28:55 - 28:58
    based on cyber security concerns.
  • 28:58 - 29:02
    E: I think that's amazing, right?
    They've recalled products
  • 29:02 - 29:06
    because of cyber security concerns. They
    used to have to wait until someone died.
  • 29:06 - 29:10
    In fact, they had to show
    something like 500 deaths
  • 29:10 - 29:13
    before you could recall a product.
    So now they can ...
  • 29:13 - 29:16
    the FDA, at least in the US,
    they can recall products
  • 29:16 - 29:19
    just based on security
    considerations.
  • 29:19 - 29:21
    M: So, this is also,
  • 29:21 - 29:27
    I guess the first example
    of that type of pro-active
  • 29:27 - 29:29
    security research,
    where you can
  • 29:29 - 29:33
    make a proof of concept
    without killing any patients
  • 29:33 - 29:37
    and then that closes
    the security holes.
  • 29:37 - 29:38
    And that potentially
    saves lives.
  • 29:38 - 29:41
    And no one has been hurt
    in the research.
  • 29:41 - 29:42
    I think that's great.
  • 29:42 - 29:45
    E: I'm also really excited
    because we give a lot of presentations
  • 29:45 - 29:49
    about security that are filled with
    doom and gloom and depression,
  • 29:49 - 29:52
    so it's nice to have two major victories
    in medical device research
  • 29:52 - 29:55
    in the last few years.
    One being the DMCA exemptions
  • 29:55 - 29:57
    and the other being
    actual product recalls.
  • 29:57 - 30:02
    M: Yeah, and the FDA are starting
    to take these issues seriously and
  • 30:02 - 30:06
    they are really focusing on the cyber
    security of medical implants now.
  • 30:06 - 30:10
    I'm going to go to a workshop
    arranged by the FDA in January
  • 30:10 - 30:16
    and participate on a panel discussing
    cyber security of medical implants.
  • 30:16 - 30:19
    And it's great to have this
    type of interaction between
  • 30:19 - 30:23
    the security committee, medical
    device vendors and the regulators.
  • 30:23 - 30:25
    So, things are happening.
  • 30:25 - 30:27
    E: Yeah. How do you feel
    as an audience,
  • 30:27 - 30:30
    are you glad that she's going to be
    your representative in Washington
  • 30:30 - 30:32
    for some of these issues?
  • 30:32 - 30:39
    applause
  • 30:39 - 30:41
    And we want you to get
    involved as well, right?
  • 30:41 - 30:45
    This is not just about Marie
    and myself and the other people
  • 30:45 - 30:47
    who worked on this
    project; it's meant to say
  • 30:47 - 30:50
    you too can do this research.
    And you should.
  • 30:50 - 30:53
    You have to be a little sensitive,
    a little bit precise and articulate
  • 30:53 - 30:55
    about concerns.
  • 30:55 - 30:59
    We take some inspiration from the
    early research around hygiene.
  • 30:59 - 31:01
    Imagine the first time some scientist
    went to some other scientist and said
  • 31:01 - 31:05
    'There is this invisible stuff,
    and it's on your hands,
  • 31:05 - 31:07
    and if you don't wash your hands
    people get infections!'
  • 31:07 - 31:08
    And everyone thought
    they were crazy.
  • 31:08 - 31:12
    Well, it's kind of the same with us
    talking about industrial systems
  • 31:12 - 31:16
    or talking about medical devices
    or talking about hacking in general.
  • 31:16 - 31:18
    People just didn't, sort of,
    believe it was possible at first.
  • 31:18 - 31:21
    And so we have to articulate ourselves
    very, very carefully.
  • 31:21 - 31:25
    So, we draw inspiration from
    that early hygiene movement
  • 31:25 - 31:29
    where they had a couple simple rules
    that started to save people's lives
  • 31:29 - 31:32
    while they explained germ theory
    to the masses.
  • 31:32 - 31:38
    M: Yeah, so, this type of research
    is kind of low-hanging fruit
  • 31:38 - 31:41
    where you just, so...
  • 31:41 - 31:46
    what we show here is an example,
  • 31:46 - 31:50
    where there's a lot of medical
    device networks in hospitals
  • 31:50 - 31:54
    that are open to the internet
    and that can get infected
  • 31:54 - 31:59
    by normal type of malware,
    like banking trojans or whatever.
  • 31:59 - 32:03
    And this is potentially a safety issue.
  • 32:03 - 32:08
    So, if your MR scanner or some other
  • 32:08 - 32:13
    more life-critical device
    is unavailable because of
  • 32:13 - 32:17
    a virus on it,
  • 32:17 - 32:21
    that's a real concern for patient
    security and safety.
  • 32:21 - 32:26
    So we need to think more about
    the hygiene also in terms of
  • 32:26 - 32:30
    computer viruses, not just
    normal viruses.
  • 32:30 - 32:33
    E: Yeah. So, you know, sometimes
    people will treat you like
  • 32:33 - 32:36
    this is an entirely theoretical
    concern, but
  • 32:36 - 32:39
    I think this is one of the best
    illustrations that we've found
  • 32:39 - 32:42
    of how that should
    be a concern,
  • 32:42 - 32:44
    and I think all of you will get it,
  • 32:44 - 32:47
    but I wanna give you a moment to kind of
    read what's about to come up on the slides.
  • 32:47 - 32:59
    So I'll just let you enjoy
    that for a moment.
  • 32:59 - 33:02
    So if it's not clear or it's not your
    first language or something,
  • 33:02 - 33:08
    this guy basically sharded patient data
    across a bunch of Amazon clusters.
  • 33:08 - 33:11
    And then it was unavailable.
    And they were very concerned
  • 33:11 - 33:14
    about the unavailability of their
    customer patient data
  • 33:14 - 33:18
    sharded across Amazon instances.
  • 33:18 - 33:23
    He was complaining to support, like
    'Can I get support to fix this?' laughing
  • 33:23 - 33:27
    M: So, all the data of the ...
  • 33:27 - 33:32
    ... the monitoring data of the cardiac
    patients is unavailable to them
  • 33:32 - 33:35
    because of the service
    being down.
  • 33:35 - 33:43
    And, well, do you want to outsource your
    patient's safety to the cloud? Really?
  • 33:43 - 33:45
    I don't want that.
    Okay.
  • 33:45 - 33:50
    E: I wanna get into some other details.
    We have sort of 10 min left if we can ...
  • 33:50 - 33:53
    so we can have a lot of questions,
    and I'm sure there will be some.
  • 33:53 - 33:58
    But I want you to talk to them about
    this very personal story.
  • 33:58 - 34:01
    This is... Remember before, when we
    said, is this stuff theoretical?
  • 34:01 - 34:02
    I want you to pay a lot of
    attention to this story.
  • 34:02 - 34:04
    It really moved me
    when she first told me.
  • 34:04 - 34:09
    M: I know how it feels to have
    my body controlled by a device
  • 34:09 - 34:12
    that is not working correctly.
  • 34:12 - 34:18
    So, I think it was around 2 or 3
    weeks after I had the surgery.
  • 34:18 - 34:19
    I felt fine.
  • 34:19 - 34:23
    But I hadn't really done
    any exercise yet.
  • 34:23 - 34:28
    The surgery was pretty easy,
    I only had 2 weeks sick leave
  • 34:28 - 34:30
    and then I came back to work
  • 34:30 - 34:31
    and I went to London
  • 34:31 - 34:35
    to participate in a course
    in ethical hacking and
  • 34:35 - 34:40
    I did take the London Underground
    together with some of my colleagues
  • 34:40 - 34:43
    and we got off at this station
    at Covent Garden
  • 34:43 - 34:46
    And I don't know if you
    have been there but
  • 34:46 - 34:49
    that particular station is
    really deep underground.
  • 34:49 - 34:52
    They have elevators that you
    can use to get up,
  • 34:52 - 34:55
    but usually there are, like,
    long queues to the elevators...
  • 34:55 - 34:57
    E: You always have to do
    things the hard way, right?
  • 34:57 - 34:58
    M: You had to take the stairs, or
  • 34:58 - 35:01
    they were just heading for the stairs
    and I was following them and
  • 35:01 - 35:06
    we were starting to climb the stairs and
    I didn't read this warning sign, which is:
  • 35:06 - 35:10
    'Those with luggage, pushchairs & heart
    conditions, please use the lift' laughing
  • 35:10 - 35:12
    Because I was feeling fine,
  • 35:12 - 35:16
    and this was the first time that I
    figured out there's something wrong
  • 35:16 - 35:18
    with my pacemaker or with my heart.
  • 35:18 - 35:20
    Because I came like
    halfway up these stairs
  • 35:20 - 35:23
    and I felt like I was going to die.
  • 35:23 - 35:25
    It was a really horrible feeling.
  • 35:25 - 35:26
    I didn't have any more breath left,
  • 35:26 - 35:31
    I felt like I wasn't able
    to complete the stairs.
  • 35:31 - 35:34
    I didn't know what was
    happening to me, but
  • 35:34 - 35:37
    somehow I managed to
    drag myself up the stairs
  • 35:37 - 35:39
    and my heart was really...
  • 35:39 - 35:41
    it didn't feel right.
  • 35:41 - 35:45
    So, first thing when I came
    back from this course
  • 35:45 - 35:46
    I went to my doctor
  • 35:46 - 35:49
    and we started to try
    to debug me, to find out
  • 35:49 - 35:52
    what was wrong with my pacemaker.
  • 35:52 - 35:55
    And this is what that looks like.
    E: laughing
  • 35:55 - 35:58
    M: So, there's a stack
    of different programmers
  • 35:58 - 36:02
    - this is not me by the way, but it's
    a very similar situation.
  • 36:02 - 36:04
    E: And we'll come back to those
    programmers in a moment.
  • 36:04 - 36:05
    M: Yeah.
    E: But the bit I want you
  • 36:05 - 36:09
    to focus on is, like, they're
    debugging your pacemaker?
  • 36:09 - 36:12
    Inside you?
    M: Yeah, I didn't know
  • 36:12 - 36:13
    what was happening
    at the time.
  • 36:13 - 36:15
    We were just trying to
    get the settings right
  • 36:15 - 36:19
    and it took like 2 or 3 months before
    we figured out what was wrong.
  • 36:19 - 36:24
    And what happened was, that my
    upper rate limit was set too low for me,
  • 36:24 - 36:30
    for my age. So, the normal pacemaker
    patient is maybe around 80 years old
  • 36:30 - 36:34
    and the default upper rate
    limit was 160 beats/min.
  • 36:34 - 36:37
    And that's pretty low for
    a young person.
  • 36:37 - 36:40
    E: So, imagine, like, you're younger
    and you're really fit and you know
  • 36:40 - 36:44
    how to do something really well,
    like swimming or skiing or skateboarding
  • 36:44 - 36:47
    or whatever. You're fantastic at it.
    And then a couple years go past
  • 36:47 - 36:50
    and you know, you gain some weight
    and you're not as good at it, right?
  • 36:50 - 36:53
    But now imagine that
    happens in 3 seconds.
  • 36:53 - 36:55
    While you're walking
    up a set of stairs.
  • 36:55 - 36:57
    M: So, what happens is that
    the pacemaker detects
  • 36:57 - 37:02
    'Oh, you have a really high pulse'.
    And there's a safety mechanism
  • 37:02 - 37:05
    that will cut your pulse in half ...
    E: In half!
  • 37:05 - 37:07
    laughter
    M: laughing So in my case it went
  • 37:07 - 37:11
    from 160 beats/min to 80 beats/min.
    In a second, or less than a second,
  • 37:11 - 37:14
    and that felt really, really horrible.
  • 37:14 - 37:16
    And it took a long time
    to figure out what was wrong.
  • 37:16 - 37:21
    It wasn't until they put me on
    an exercise bike and
  • 37:21 - 37:25
    had me on monitoring that they
    figured out what was wrong, because
  • 37:25 - 37:31
    the thing was, that what was displayed
    on the pacemaker technician's view
  • 37:31 - 37:36
    was not the same settings that
    my pacemaker actually had.
  • 37:36 - 37:41
    There was a software bug in the
    programmer, that caused this problem.
  • 37:41 - 37:46
    E: So they thought they had updated
    her settings to be those of a young person.
  • 37:46 - 37:47
    They were like
    'Oh, we've already changed it'.
  • 37:47 - 37:51
    But they lost the view. They couldn't
    see the actual state of the pacemaker.
  • 37:51 - 37:54
    And the only way to figure that out
    was to put her on a bike
  • 37:54 - 37:57
    and let her cycle until her
    heart rate was high enough.
  • 37:57 - 38:00
    You know, literally physically
    debugging her to figure out
  • 38:00 - 38:01
    what was wrong.
  • 38:01 - 38:04
    Now stop and think about whether or not
    you would trust your doctor
  • 38:04 - 38:07
    to debug software.
  • 38:07 - 38:11
    laughter
  • 38:11 - 38:14
    So, say a little bit more about those
    programmers and then we'll move on
  • 38:14 - 38:15
    towards the future.
  • 38:15 - 38:19
    M: Yeah, so, we got hold of one of these
    programmers, as mentioned
  • 38:19 - 38:20
    and looked inside it.
  • 38:20 - 38:24
    And, well, we named this talk
    'Unpatchable', because
  • 38:24 - 38:30
    originally my hypothesis was that,
    if you find a bug in a pacemaker
  • 38:30 - 38:33
    it will be hard to patch it.
  • 38:33 - 38:35
    Maybe it would require surgery.
  • 38:35 - 38:37
    But then when we looked
    inside the programmer
  • 38:37 - 38:43
    and we saw that it contained firmware
    for pacemakers we realized that
  • 38:43 - 38:46
    it's possible to actually patch the
    pacemaker via this programmer.
  • 38:46 - 38:50
    E: One of the other researchers
    finds these firmware blobs inside
  • 38:50 - 38:53
    the programmer code and, like,
    my heart stopped at that point, right?
  • 38:53 - 39:00
    I was just going 'Really, you can just
    update the code on someones pacemaker?'
  • 39:00 - 39:02
    We also wanna say something
    about standardization.
  • 39:02 - 39:03
    Look at all those
    different programmers.
  • 39:03 - 39:06
    Someone goes into a hospital
    with one of these devices
  • 39:06 - 39:09
    they have many different programmers
    so they have to make an estimation
  • 39:09 - 39:13
    of which... you know, which
    programmer for which device.
  • 39:13 - 39:14
    Like, which one are you running.
  • 39:14 - 39:18
    And, so, some standardization
    would be an option laughing
  • 39:18 - 39:20
    perhaps, in this case.
    M: Yeah.
  • 39:20 - 39:23
    E: Alright. So, we gonna need
    to move quickly through
  • 39:23 - 39:25
    the next few slides to talk
    to you about the future,
  • 39:25 - 39:29
    but I hope that drives home that
    this is a very real issue for real people.
  • 39:29 - 39:33
    M: So, pacemakers are evolving and
    they are getting smaller
  • 39:33 - 39:36
    and this is the type of pacemaker
    that you can actually implant
  • 39:36 - 39:37
    inside the heart.
  • 39:37 - 39:42
    So, the pacemaker I have today
    is outside the heart and it has
  • 39:42 - 39:44
    leads that are wired to my heart.
  • 39:44 - 39:51
    But in the future they are getting
    smaller and more sophisticated and
  • 39:51 - 39:53
    I think this is exciting!
  • 39:53 - 39:55
    I think that a lot of you,
    also in the audience will
  • 39:55 - 39:58
    benefit from having this type of
    technology when you grow older
  • 39:58 - 40:02
    and we can have longer lives and
    we can live healthier lives
  • 40:02 - 40:05
    because of the technology
    E: And keep in mind, right?
  • 40:05 - 40:07
    Some of you may already have devices
    and already have these issues,
  • 40:07 - 40:10
    but others of you will think 'Ah, that
    won't happen to me for quite a long time'
  • 40:10 - 40:13
    But it can be a sudden thing, that,
    you know, you don't necessarily
  • 40:13 - 40:17
    have a choice to run code
    inside your body.
  • 40:17 - 40:21
    Which OS do you wanna implant?
    laughing
  • 40:21 - 40:25
    You wanna tell them about the..
  • 40:25 - 40:27
    M: This is also a quite exciting
  • 40:27 - 40:30
    maybe future type of implants
    that you can have.
  • 40:30 - 40:34
    So, this is actually a cardiac sock,
    it's 3D-printed and it's making
  • 40:34 - 40:38
    a rabbit's heart beat outside
    the body of the rabbit.
  • 40:38 - 40:41
    So, there's a lot of technology
    and sensors and things that
  • 40:41 - 40:44
    are going to be implanted
    in our bodies
  • 40:44 - 40:47
    and I think more of you will become
    cyborgs like me in the future
  • 40:47 - 40:50
    E: And there's a lot of work
    that you could be doing.
  • 40:50 - 40:51
    You know, 3D-printing
    these devices,
  • 40:51 - 40:57
    and open sourcing as much
    of this as possible.
  • 40:57 - 40:59
    There's a lot to say here, right?
  • 40:59 - 41:03
    I think it's time to address
    the really scary issue.
  • 41:03 - 41:08
    The informed consent issue
    around patching, right?
  • 41:08 - 41:10
    Remember earlier we were
    talking about the programmers
  • 41:10 - 41:12
    and we pointed out that there
    were firmware blobs in there
  • 41:12 - 41:14
    and that these people,
    you know, your doctor or nurse
  • 41:14 - 41:19
    could upgrade the code
    running on your medical implant.
  • 41:19 - 41:24
    Now, is there a legal requirement
    for them to inform you,
  • 41:24 - 41:27
    before they alter the code
    that's running inside your body?
  • 41:27 - 41:27
    As far as we can tell
  • 41:27 - 41:30
    - and we need to look at a lot of
    different countries at the same time,
  • 41:30 - 41:32
    so we gonna ask you to help us -
  • 41:32 - 41:35
    as far as we can tell there are not
    laws requiring your doctor
  • 41:35 - 41:40
    to tell you that they are upgrading
    the firmware in your device.
  • 41:40 - 41:44
    M: Yeah, think about that laughs
  • 41:44 - 41:45
    It's a quite scary thing.
  • 41:45 - 41:49
    I want to know what's happening
    to my implant, the code,
  • 41:49 - 41:53
    if someone wants to alter the code
    inside my body, I would like to know
  • 41:53 - 41:57
    and I would like to make
    an informed decision on that
  • 41:57 - 41:59
    and give my consent
    before it happens.
  • 41:59 - 42:02
    E: You might even choose a device
    where that's possible or not possible
  • 42:02 - 42:06
    because you're making a risk-based
    decision and you're an informed consumer
  • 42:06 - 42:08
    but how do we help people,
    who don't wanna understand
  • 42:08 - 42:11
    software and firmware and upgrades
    make those decisions in the future as well.
  • 42:11 - 42:16
    Alright.
  • 42:16 - 42:17
    M: So now, we're not going to go through
  • 42:17 - 42:22
    all of this, but there are a lot of reasons
    why we're in the situation of having
  • 42:22 - 42:24
    insecure medical devices.
  • 42:24 - 42:29
    There's a lot of legacy technology because
    these devices have a long lifetime
  • 42:29 - 42:32
    and it takes a long time
    to get them on the market.
  • 42:32 - 42:36
    And they can be patched,
    but in some cases
  • 42:36 - 42:41
    they are not patched or there are
    no software updates applied to them.
  • 42:41 - 42:48
    We don't have any third party
    security testing of the devices,
  • 42:48 - 42:49
    and that's really needed in my opinion.
  • 42:49 - 42:51
    E: Right, an underwriters laboratory
  • 42:51 - 42:55
    or consumer laboratory that's there
    to check some of these details.
  • 42:55 - 42:59
    And I don't think that's unreasonable,
    right? That sort of approach.
  • 42:59 - 43:02
    M: And there's a lack of regulations,
    also. So there's a lot of things
  • 43:02 - 43:05
    that should be worked on.
  • 43:05 - 43:07
    E: So, there's a lot of
    ways to solve this
  • 43:07 - 43:10
    and we're not gonna give you
    the answer, because we're not
  • 43:10 - 43:13
    geniuses, so we're
    gonna say that
  • 43:13 - 43:16
    these are some different
    approaches that we see all
  • 43:16 - 43:20
    playing in a solution space.
  • 43:20 - 43:22
    So, vendor awareness is
    obviously important, but
  • 43:22 - 43:24
    that's not the only thing.
    A lot of the vendors have been
  • 43:24 - 43:28
    very supportive and
    very open to discussion,
  • 43:28 - 43:32
    and to transparency; that needs to
    happen more in the future, right?
  • 43:32 - 43:34
    Security risk monitoring,
    I've been working in the field
  • 43:34 - 43:39
    of cyber insurance, which I'm sure
    sounds like insanity to the rest of you,
  • 43:39 - 43:43
    and it is, there are bad days.
    But that could play a part
  • 43:43 - 43:46
    in this risk equation in the future.
  • 43:46 - 43:50
    What about medical incident response,
    right? Or medical device forensics.
  • 43:50 - 43:54
    M: If I suddenly drop dead
    I really would like to have
  • 43:54 - 43:57
    a forensic analysis
    of my pacemaker, to ...
  • 43:57 - 44:01
    E: Please remember that, all of you!
    Like, if anything is going to happen
  • 44:01 - 44:05
    to Marie... everyone asked that, right?
    Like, 'Aren't you afraid of giving this talk?'
  • 44:05 - 44:07
    And we thought about it,
    we talked about it a lot and
  • 44:07 - 44:10
    she's got a lot of support
    from her husband and her son
  • 44:10 - 44:13
    and her family and a bunch of us.
    If anything happens to this woman
  • 44:13 - 44:15
    I hope that we will all be doing
    forensic analysis
  • 44:15 - 44:17
    of everything.
  • 44:17 - 44:25
    applause
  • 44:25 - 44:32
    Cool. So, we'll say a little bit about
    'I Am The Cavalry' and social contract
  • 44:32 - 44:35
    and then we'll wrap it up, okay?
  • 44:35 - 44:38
    So, 'I Am The Cavalry' does
    a lot of grassroots research
  • 44:38 - 44:41
    and support and lobbying and
    tries to articulate these messages.
  • 44:41 - 44:44
    They have a medical implant
    arm that has a bunch of
  • 44:44 - 44:46
    different researchers doing
    this kind of stuff.
  • 44:46 - 44:49
    Do you wanna say more about them?
  • 44:49 - 44:52
    M: Yeah, so we are both
    part of the Cavalry,
  • 44:52 - 44:56
    because no one is coming
    to save us from the future
  • 44:56 - 45:00
    of being more dependent on
    trusting our lives to machines
  • 45:00 - 45:04
    so, that's why we need to step up
    and do the research and
  • 45:04 - 45:07
    encourage and inspire the research.
  • 45:07 - 45:09
    So, that's why I joined
    'I Am The Cavalry'
  • 45:09 - 45:13
    and I think it's a
    good thing to have
  • 45:13 - 45:16
    a collaboration effort between
    researchers, between the vendors
  • 45:16 - 45:21
    and the regulators, as they are,
    or we are working with.
  • 45:21 - 45:25
    E: We also think that even if you
    don't do reverse engineering
  • 45:25 - 45:28
    or you're not interested in
    security details or the opcodes
  • 45:28 - 45:30
    that are inside the firmwares
    or whatever,
  • 45:30 - 45:33
    this question is a question that
    any of you here can talk about
  • 45:33 - 45:36
    for the rest of the congress and
    going forward into the future.
  • 45:36 - 45:37
    Right?
  • 45:37 - 45:40
    This is Marie's, so go ahead.
  • 45:40 - 45:48
    M: Yeah, so, I really want to know
    what code is running inside my body.
  • 45:48 - 45:49
    And I want to know ...
  • 45:49 - 45:55
    or I want to have a social contract
    with my medical doctors and
  • 45:55 - 45:59
    my physician who is giving me
    these implants.
  • 45:59 - 46:06
    It needs to be based on a
    patient-to-doctor trust relationship.
  • 46:06 - 46:09
    And also between
    me and the vendors.
  • 46:09 - 46:13
    So I really want to know that
    I can trust this machine inside...
  • 46:13 - 46:16
    E: And we think many of you will
    be facing similar questions
  • 46:16 - 46:17
    to these in the future.
  • 46:17 - 46:20
    I have questions.
    Some of my questions are serious,
  • 46:20 - 46:25
    some of my questions are
    not serious, like this one:
  • 46:25 - 46:28
    Is the code on your dress
    from your pacemaker?
  • 46:28 - 46:32
    M: No, actually it's from the
    computer game 'Doom'.
  • 46:32 - 46:33
    But ...
    laughter
  • 46:33 - 46:36
    once I have the laughing
    code of my pacemaker
  • 46:36 - 46:39
    I'm going to make a custom-
    ordered dress and get it...
  • 46:39 - 46:45
    E: Which is pretty cool, right?
    M: ... get it with my own code.
  • 46:45 - 46:49
    applause
  • 46:49 - 46:54
    So, let's wrap up with... what we
    want to see in future research.
  • 46:54 - 46:57
    So, we encourage more research,
    and these are some things that
  • 46:57 - 46:59
    could be looked into.
  • 46:59 - 47:03
    Like open source medical devices,
    which don't really exist,
  • 47:03 - 47:05
    at least not for pacemakers.
  • 47:05 - 47:09
    But I think that's one way
    of going forward.
  • 47:09 - 47:14
    E: I think it's also an opportunity
    for us to mention a really scary idea,
  • 47:14 - 47:18
    which is, you know, should anyone
    have a golden key to Marie's heart,
  • 47:18 - 47:22
    should there be backdoored
    encryption inside of her heart?
  • 47:22 - 47:25
    We think no laughing
    but that...
  • 47:25 - 47:28
    M: I don't see any reason why
    the NSA should be able to
  • 47:28 - 47:31
    have a back door to my heart,
    do you?
  • 47:31 - 47:34
    E: You would be an extremist,
    that's why you don't want them
  • 47:34 - 47:37
    to have a back door to your heart.
    But this is a serious question, right?
  • 47:37 - 47:39
    If you start backdooring
    any kind of crypto anywhere,
  • 47:39 - 47:41
    how do you know
    where it's gonna end up?
  • 47:41 - 47:47
    It might end up in medical devices
    and we think that's unacceptable.
  • 47:47 - 47:58
    applause
  • 47:58 - 48:05
    M: And we should also mention
    that we're not doing this alone,
  • 48:05 - 48:09
    we have other researchers
    helping us move this forward.
  • 48:09 - 48:12
    Angel: So, thank you very much
    for this thrilling talk,
  • 48:12 - 48:15
    we're now doing a little
    Q&A for 10 minutes,
  • 48:15 - 48:20
    and for the Q&A please keep in mind
    to respect Marie's privacy, so
  • 48:20 - 48:23
    don't ask for details about
  • 48:23 - 48:25
    the implant or
    something like that.
  • 48:25 - 48:27
    E: Yeah, the brands and stuff.
  • 48:27 - 48:30
    We're gonna tell you what OS
    she's running.
  • 48:30 - 48:35
    Angel: People who are now leaving
    the room will not be able
  • 48:35 - 48:41
    to come back in, because
  • 48:41 - 48:43
    of measures. laughing
    laughter
  • 48:43 - 48:48
    So, let's start with the Q&A!
    Let's start with this microphone there.
  • 48:48 - 48:54
    Q: Hi, first of all thank you very much
    for a very fascinating talk.
  • 48:54 - 48:57
    I'm not going to ask you
    about specific vendors.
  • 48:57 - 49:01
    However, I thought it was very
    interesting what you said, that
  • 49:01 - 49:06
    most vendors were really supportive.
    I would like to know whether
  • 49:06 - 49:09
    there have been
    exceptions to that rule,
  • 49:09 - 49:14
    not who it was or anything like that
    but what kind of arguments
  • 49:14 - 49:19
    you may have heard from vendors,
    e.g. have they referred to anything
  • 49:19 - 49:24
    such as trade secrets or copyright
    or any other legal reasons
  • 49:24 - 49:28
    why not to give you,
    or not to give public access
  • 49:28 - 49:33
    to information about devices?
    Thank you.
  • 49:33 - 49:42
    E: So, we haven't had any legal
    issues so far in this research.
  • 49:42 - 49:45
    And in general they haven't been
    concerned about copyright.
  • 49:45 - 49:48
    I think they're more concerned
    about press, bad press,
  • 49:48 - 49:51
    and a hype, you know, what
    they would see as hype.
  • 49:51 - 49:55
    They don't wanna see us scaring
    people away from these things
  • 49:55 - 49:56
    with, you know, these stories.
  • 49:56 - 50:00
    M: Yeah, that's also something
    I'm concerned about, of course,
  • 50:00 - 50:03
    as a patient. I don't want to
    scare my fellow patients
  • 50:03 - 50:06
    from having life-critical
    implants in their body.
  • 50:06 - 50:11
    Because a lot of people need
    them, like me, to survive.
  • 50:11 - 50:16
    So, the benefit clearly
    outweighs the risk in my case.
  • 50:16 - 50:19
    E: But that seems to be their
    main concern, like, you know,
  • 50:19 - 50:20
    'Don't give us too
    much bad press'
  • 50:20 - 50:25
    Angel: Ok, next question
    from over there.
  • 50:25 - 50:32
    Q: Hello. I wanted to ask you, if you
    know about any existing initiatives
  • 50:32 - 50:35
    on open sourcing
    the medical devices,
  • 50:35 - 50:40
    on mandating the open sourcing
    of the software and firmware
  • 50:40 - 50:44
    through the legal system,
    in the European Union or the United States,
  • 50:44 - 50:48
    because I think I've read
    about such initiatives
  • 50:48 - 50:51
    about 1 year ago or so,
    but it was just a glimpse.
  • 50:51 - 50:56
    M: So, there are some patients
    that have reverse engineered their
  • 50:56 - 50:58
    no audio
  • 50:58 - 51:04
    insulin pumps. I know that
    there are groups of patients
  • 51:04 - 51:08
    like the parents of children
    with insulin pumps.
  • 51:08 - 51:11
    They have created
    software to be able...
  • 51:11 - 51:14
    to have an app on their
    mobile phone to be able
  • 51:14 - 51:17
    to monitor their child's
    blood sugar levels.
  • 51:17 - 51:21
    So that's one way of
    doing this open source
  • 51:21 - 51:23
    and I think that's great.
  • 51:23 - 51:27
    Q: But nothing
    in the legal systems,
  • 51:27 - 51:33
    no initiatives to mandate this,
    e.g. on European level?
  • 51:33 - 51:34
    E: Not so far that we've seen,
  • 51:34 - 51:36
    but that's something that
    can be discussed now, right?
  • 51:36 - 51:39
    M: I think it's really interesting,
    you could look into the legal
  • 51:39 - 51:42
    aspects and the regulations
    around this, yeah.
  • 51:42 - 51:43
    Q: Thank you.
  • 51:43 - 51:46
    Angel: Ok, can we have
    a question from the internet?
  • 51:46 - 51:49
    Q: Yes, from the IRC someone asks:
  • 51:49 - 51:53
    'Does your pacemaker
    have a biofeedback,
  • 51:53 - 51:56
    so in case something bad
    happens it starts to defibrillate?
  • 51:56 - 52:03
    M: No, I don't have an ICD,
    so in my case I'm not getting a shock
  • 52:03 - 52:06
    in case my heart stops.
    Because I have a different condition
  • 52:06 - 52:09
    I only need to have
    my rhythm corrected.
  • 52:09 - 52:11
    But there are other
    types of conditions,
  • 52:11 - 52:14
    that require pacemakers
    that can deliver shocks.
  • 52:14 - 52:18
    Angel: Ok, one question
    from that microphone there.
  • 52:18 - 52:20
    Q: Thank you very much.
    At one point you mentioned
  • 52:20 - 52:25
    that the connectivity in your
    pacemaker is off. For now.
  • 52:25 - 52:29
    And is that something that patients
    are asked during the process,
  • 52:29 - 52:32
    or is that something
    patients have to request?
  • 52:32 - 52:36
    And generally: What role
    do you see for the choice
  • 52:36 - 52:39
    not to have any connectivity
    or any security for that matter,
  • 52:39 - 52:42
    that technology would
    make available to you?
  • 52:42 - 52:47
    So, how do you see the possibility
    to choose a more risky life
  • 52:47 - 52:50
    in terms of trading in
    for privacy, whatever?
  • 52:50 - 52:52
    M: Yeah, I think that's
    really a relevant question.
  • 52:52 - 52:58
    As we mentioned
    in the social contract,
  • 52:58 - 53:04
    I really would like the doctors
    to inform patients about
  • 53:04 - 53:08
    their different wireless interfaces
    and that there's an informed decision
  • 53:08 - 53:11
    whether or not to switch it on.
  • 53:11 - 53:15
    So, in my case, I don't
    have it switched on and ...
  • 53:15 - 53:18
    I don't need it, so there's no reason
    why I need to have it switched on.
  • 53:18 - 53:22
    But then again, why did I get
    an implant that has this capability?
  • 53:22 - 53:29
    I should have had the option of
    opting out of it, but I didn't get that.
  • 53:29 - 53:32
    They didn't ask me, or they
    didn't inform me of that,
  • 53:32 - 53:35
    before I got the implant.
    It was chosen for me.
  • 53:35 - 53:41
    And at that time I hadn't looked
    into the security of medical devices,
  • 53:41 - 53:43
    and I needed to
    have the implant,
  • 53:43 - 53:46
    so I couldn't really make
    an informed decision.
  • 53:46 - 53:49
    A lot of patients that are,
    like, older and not so...
  • 53:49 - 53:55
    that don't really understand
    the technology,
  • 53:55 - 54:00
    they can't make that
    informed decision, like I can.
  • 54:00 - 54:03
    So, it's really a
    complex issue
  • 54:03 - 54:06
    and something that we
    need to discuss more.
  • 54:06 - 54:09
    Angel: Ok, another
    question from there.
  • 54:09 - 54:11
    Q: Yeah, thanks.
  • 54:11 - 54:14
    As a hacker, connected personally
  • 54:14 - 54:19
    and professionally
    to the medical world:
  • 54:19 - 54:25
    How can I educate doctors,
    nurses, medical people
  • 54:25 - 54:31
    about the security risks presented
    by connected medical devices?
  • 54:31 - 54:35
    What can I tell them?
    Do you have something
  • 54:35 - 54:38
    from your own experience
    I could somehow ...
  • 54:38 - 54:42
    M: Yeah, so, the issue of
    software bugs in the devices
  • 54:42 - 54:48
    I think is a real scenario
    that can happen and ...
  • 54:48 - 54:50
    E: Yeah, if you can repeat
    that story of debugging her,
  • 54:50 - 54:54
    like, I think, that makes the point.
    And then try in adopt that
  • 54:54 - 54:57
    hygiene-metaphor that we
    had before, where, you know,
  • 54:57 - 55:00
    people didn't believe in germs,
    and these problems before,
  • 55:00 - 55:02
    we're in that sort of era,
    and we're still figuring out
  • 55:02 - 55:05
    what the scope of potential
    security and privacy problems are
  • 55:05 - 55:07
    for medical devices.
    In the meantime
  • 55:07 - 55:10
    please be open to new research
    on this subject, right?
  • 55:10 - 55:12
    And that story is
    a fantastic illustration,
  • 55:12 - 55:17
    that we don't need an evil hacker
    typer, you know, a Bond villain,
  • 55:17 - 55:22
    we just need a failure to debug
    a programming station properly, right?
  • 55:22 - 55:24
    Q: Thank you very much.
  • 55:24 - 55:26
    Angel: Ok, another question
    from the internet.
  • 55:26 - 55:29
    Q: Yes, from the IRC:
  • 55:29 - 55:34
    '20 years ago it was common,
    that a magnet had to be placed
  • 55:34 - 55:40
    on the patient's chest to activate the
    pacemakers remote configuration interface.
  • 55:40 - 55:42
    Is that no longer the case today?'
  • 55:42 - 55:46
    E: It's still the case with some devices,
    but not with all of them I think.
  • 55:46 - 55:52
    M: Yeah, it varies between the devices,
    how they are programmed and
  • 55:52 - 55:58
    how far away you
    can be from the device.
  • 55:58 - 56:03
    Q: Thank you for the talk.
    I have some medical devices
  • 56:03 - 56:10
    in myself too, an insulin pump and
    sensors to measure blood sugar levels.
  • 56:10 - 56:16
    I'm busy with hacking that and
    writing the software for myself,
  • 56:16 - 56:18
    because the *** doesn't
    have the software.
  • 56:18 - 56:25
    Have you ever thought about writing
    your own software for your pacemaker?
  • 56:25 - 56:27
    E: laughing
    M: laughing
  • 56:27 - 56:34
    M: No, I haven't thought about
    that until now. No. laughing
  • 56:34 - 56:38
    E: Fantastic, I think that deserves
    a round of applause, though,
  • 56:38 - 56:40
    because that's exactly
    what we're talking about.
  • 56:40 - 56:42
    applause
  • 56:42 - 56:46
    Angel: Another question
    from there.
  • 56:46 - 56:53
    Q: First off, I want to say thank you
    for giving this talk, because
  • 56:53 - 56:56
    for one, it's quite interesting,
    but it's not a talk
  • 56:56 - 57:00
    that anyone who is affected could hold,
  • 57:00 - 57:05
    so it takes quite some courage, and
  • 57:05 - 57:07
    I want to say thank you. So
  • 57:07 - 57:12
    applause
  • 57:12 - 57:15
    Secondly, thank you for giving me the
  • 57:15 - 57:18
    update. I studied medical technology but
  • 57:18 - 57:22
    I finished ten years ago and I didn't work
  • 57:22 - 57:22
    in the area, and it's quite interesting to
  • 57:22 - 57:24
    see what happened in the meantime, but
  • 57:24 - 57:25
    now for my actual question:
  • 57:25 - 57:28
    You said you got devices on eBay, is it
  • 57:28 - 57:30
    possible to get the whole
  • 57:30 - 57:31
    communication chain?
  • 57:31 - 57:35
    So you can make a sandbox test or ...
  • 57:35 - 57:38
    M: Yes, it's possible to get devices,
  • 57:38 - 57:40
    it's not so easy to get the pacemaker
  • 57:40 - 57:42
    itself, it's quite expensive.
  • 57:42 - 57:44
    E: And even when we get one,
  • 57:44 - 57:46
    we have some pairing issues, and like
  • 57:46 - 57:48
    Marie can't be in the same room when
  • 57:48 - 57:50
    we were doing certain types of testing,
  • 57:50 - 57:53
    right, so that last piece is difficult
  • 57:53 - 57:55
    but the rest of the chain is pretty
  • 57:55 - 57:56
    available for the research.
  • 57:56 - 57:57
    Q: Ok, thank you.
  • 57:57 - 58:00
    Angel: So, time is running out, so we
  • 58:00 - 58:02
    only have time left for one question, from
  • 58:02 - 58:03
    there please.
  • 58:03 - 58:06
    Q: Thank you. I'm also involved in
  • 58:06 - 58:10
    software quality checks and software QA
  • 58:10 - 58:13
    here in Germany, also
    with medical developments,
  • 58:13 - 58:16
    and as far as I know, it is the most
  • 58:16 - 58:19
    restricted area of product development,
  • 58:19 - 58:21
    I think, in the world,
  • 58:21 - 58:25
    it's just easier to manipulate software
  • 58:25 - 58:28
    in a car X-source system or braking guard
  • 58:28 - 58:30
    or something like this, where you don't
  • 58:30 - 58:34
    have to show any testing certificate or
  • 58:34 - 58:36
    something like this, the FDA is a very
  • 58:36 - 58:38
    high-regulation part there.
  • 58:38 - 58:42
    Do you have the feeling that it's a
  • 58:42 - 58:45
    general issue that patients do not have
  • 58:45 - 58:48
    access to these FDA-compliant tests and
  • 58:48 - 58:49
    software QA systems?
  • 58:49 - 58:53
    M: Yeah, I think that we should have
  • 58:53 - 58:56
    more openness and more transparency
  • 58:56 - 58:58
    around these issues, really.
  • 58:58 - 59:02
    E: I mean, it's fantastic you do quality
  • 59:02 - 59:03
    assurance, I used to be in quality assurance
  • 59:03 - 59:06
    at a large corporation and I got tired
  • 59:06 - 59:09
    and landed in strategy and pen testing, and
  • 59:09 - 59:10
    then I just thought of myself as paramilitary
  • 59:10 - 59:11
    quality assurance ...
  • 59:11 - 59:16
    now I just do it on whatever I wanna test, so
  • 59:16 - 59:18
    thank you for doing QA and keep doing it,
  • 59:18 - 59:20
    and hopefully you don't have too many regulations,
  • 59:20 - 59:22
    but companies sharing more of this
  • 59:22 - 59:24
    information, it's really the transparency
  • 59:24 - 59:25
    and the discussion, the open dialogue
  • 59:25 - 59:28
    with patients and doctors and vendors that is
  • 59:28 - 59:31
    really what we wanna focus on and make
  • 59:31 - 59:33
    our final note?
    M: Yeah.
  • 59:33 - 59:36
    M: We have seen some progress already:
  • 59:36 - 59:38
    the last year, the MI Undercover Group has
  • 59:38 - 59:42
    made some great progress on having good
  • 59:42 - 59:46
    discussions with the FDA and also involving
  • 59:46 - 59:49
    the medical device vendors in the discussions
  • 59:49 - 59:51
    about cyber security of medical devices
  • 59:51 - 59:53
    and implants. So that's great and I hope
  • 59:53 - 59:55
    that this will be even better next year.
  • 59:55 - 59:57
    E: And I think you wanna say
  • 59:57 - 59:59
    one more thing to congress before we leave
  • 59:59 - 59:59
    which is:
  • 59:59 - 60:01
    M: Hack to save lives!
  • 60:01 - 60:05
    applause
  • 60:05 - 60:09
    ♪ postroll music ♪
  • 60:09 - 60:16
    subtitles created by c3subtitles.de
    Join, and help us!