Listen to Your Heart: Security and Privacy of Implantable Cardio Foo

  • 0:23 - 0:26
    Preroll music
  • 0:26 - 0:31
    Herald: Good evening, everybody. The
    upcoming talk is titled "Listen to Your
  • 0:31 - 0:36
    Heart: Security and Privacy of Implantable
    Cardio Foo" and will be delivered by e7p,
  • 0:36 - 0:42
    who is a Ph.D. student at the Max Planck
    Institute, and Christoph Saatjohann, who is
  • 0:42 - 0:47
    also a Ph.D. student at the University of
    Applied Sciences Münster, where he also
  • 0:47 - 0:54
    teaches security in medical devices. This
    talk will also be translated into German.
  • 0:54 - 1:01
    Dieser Vortrag wird auch simultan
    übersetzt in Deutsch. [This talk will also be
    simultaneously interpreted into German.]
    And that is also the
  • 1:01 - 1:05
    extent of my German. I would say, e7p, over
    to you.
  • 1:05 - 1:10
    e7p: Yeah, thanks a lot for the nice
    introduction. I hope you can all hear and
  • 1:10 - 1:15
    see us. OK, so yeah, welcome to the talk
    "Listen to Your Heart: Security and
  • 1:15 - 1:23
    Privacy of Implantable Cardio Foo". I'm
    Endres and as said I'm a Ph.D. student at
  • 1:23 - 1:27
    the Max Planck Institute for Security and
    Privacy. My main topic is embedded
  • 1:27 - 1:34
    security. This work is part of a funded
    project called MediSec. It's a
  • 1:34 - 1:38
    cooperation between cyber security
    researchers in Bochum and Münster, as well
  • 1:38 - 1:43
    as medical technology researchers and
    staff of the University Hospital in
  • 1:43 - 1:50
    Münster. So
    to start off, we want to give a quick
  • 1:50 - 1:59
    motivation for what our topic is about. So we
    chose these implantable cardiac
  • 1:59 - 2:05
    defibrillators or other heart related
    implantable devices. And there are
  • 2:05 - 2:12
    different kinds of these: there are the
    classical heart pacemakers, which
  • 2:12 - 2:16
    every one of you has probably already heard
    about. Then there's also implantable
  • 2:16 - 2:25
    defibrillators, which have other
    applications actually, and there are also
  • 2:25 - 2:31
    heart monitors just for diagnosis. And
    yeah, as these implants are inside your
  • 2:31 - 2:37
    body, threats to them pose a high risk,
    and also they have communication
  • 2:37 - 2:45
    interfaces that are similar to these of
    the Internet of Things. Also, we want to
  • 2:45 - 2:51
    talk a bit about the ethical arguments. So
    when asking: Why hack medical devices?
  • 2:51 - 2:57
    Well, first, the obvious thing is, yeah,
    there's a long device lifecycle in the
  • 2:57 - 3:03
    medical sector, presumably because of the
    strong regulations and required
  • 3:03 - 3:12
    certifications for medical products. So
    it's more economical to keep these devices as
  • 3:12 - 3:24
    long as possible on the market. And for
    this reason, it's almost certain that
  • 3:24 - 3:39
    there are open bugs from old
    hardware or software, but the
  • 3:39 - 3:43
    manufacturers need to know about the
    issues to be able to do something about
  • 3:43 - 3:51
    it. That's a disclaimer for affected
    patients: what we talk about is independent
  • 3:51 - 3:58
    of the decision for or against such a
    device. Because after all, these devices
  • 3:58 - 4:09
    can save your life. OK, let's get a bit
    more into the details. Also, we want to talk
  • 4:09 - 4:15
    briefly about the responsible disclosure
    process. When we found out some bugs and
  • 4:15 - 4:23
    vulnerabilities, we informed all the
    involved companies at least six months
  • 4:23 - 4:33
    ago. By now, it's maybe a year or so.
    The companies took us seriously. They
  • 4:33 - 4:39
    acknowledged our results, and both our and
    their goal is not to worry any affected
  • 4:39 - 4:46
    people but to improve the product
    security. Vulnerable devices are or will
  • 4:46 - 4:53
    soon be replaced, or at least the firmware
    gets updated. And yeah, whenever we do
  • 4:53 - 4:58
    independent security research, it helps to
    keep the quality of the products higher,
  • 4:58 - 5:07
    which is in both ours and their interests.
    And one note is, if you ever find out any
  • 5:07 - 5:13
    bugs or vulnerabilities in some product,
    please inform the companies first before
  • 5:13 - 5:20
    publishing anything of it online or
    anywhere else. OK, let's get started into
  • 5:20 - 5:27
    the topic. First of all, I want to talk
    about all the devices in the environment
  • 5:27 - 5:35
    around the implantable devices. First of
    all, we have the implants themselves. These
  • 5:35 - 5:41
    little devices, they are not so heavy.
    They are placed, I think, under the skin
  • 5:41 - 5:47
    near the heart. I don't know. I'm not from
    the medical sector, but yeah, inside the
  • 5:47 - 5:58
    body and they have one or multiple
    contacts which electrodes connect to. And
  • 5:58 - 6:05
    these connect them to the muscles or
    organs, where they do their thing. But as there
  • 6:05 - 6:11
    is no outside connection for configuring
    or receiving test data or something like
  • 6:11 - 6:17
    this or events. There is a programing
    device which is usually located in the
  • 6:17 - 6:26
    hospital or in the heart clinics. Then
    there's also a home monitoring station,
  • 6:26 - 6:32
    which the patient takes home and puts on
    the bedside table, for instance, so it can
  • 6:32 - 6:38
    receive all the relevant data from the
    implant every day and transmit relevant
  • 6:38 - 6:45
    data then to the doctor. This does not
    happen directly, but over the
  • 6:45 - 6:50
    manufacturer's infrastructure, and the
    transmission here is
  • 6:50 - 6:58
    usually over the internet. But then the
    doctor can receive all the data over the
  • 6:58 - 7:06
    internet again, and yeah, so that's all
    the four big spots where data is
  • 7:06 - 7:14
    transmitted to and from. And if we have an
    attacker here, he could try to attack or
  • 7:14 - 7:19
    find vulnerabilities in any of these four
    devices, as well as their communication
  • 7:19 - 7:27
    interfaces and protocols. OK, getting a bit
    more concrete. So in total, there are
  • 7:27 - 7:35
    around five major vendors worldwide that
    develop these implants and other devices
  • 7:35 - 7:46
    around them, and we tried to
    analyze these three on top here. And yeah.
  • 7:46 - 7:53
    So we go a bit more into detail on what we
    found out and what we tried to analyze here.
  • 7:53 - 8:01
    So going back to the implants, I already
    showed it to you. That's maybe how it
  • 8:01 - 8:08
    looks from the inside. Not very easy
    to see, I think, with the camera, but
  • 8:08 - 8:15
    there's also some picture in the slides.
    And yeah, first of all, these implants
  • 8:15 - 8:21
    contain the desired functionality. For
    instance, defibrillator, pacemaker, heart
  • 8:21 - 8:26
    recorder. And these features have
    different requirements. For instance, a
  • 8:26 - 8:34
    defibrillator probably needs more power,
    and so needs a larger battery or a huge
  • 8:34 - 8:41
    capacitor, for instance. And a heart
    monitor doesn't need anything of these.
  • 8:41 - 8:45
    And of course, all of these need this
    communication interface, which is realized
  • 8:45 - 8:55
    over radio frequency. But sometimes it's
    also only over the inductive coupling,
  • 8:55 - 9:03
    which is maybe known from RFID. And when
    looking inside these devices, we see there
  • 9:03 - 9:10
    are highly customized parts inside, which
    means there are unlabeled chips, even
  • 9:10 - 9:16
    unpackaged chips that are directly bonded
    onto the PCBs. So analysis is quite
  • 9:16 - 9:22
    hard. But all of these have in
    common that there's a small microcomputer
  • 9:22 - 9:31
    inside that handles everything and also
    the communication. Yeah. Then there are
  • 9:31 - 9:39
    these home monitoring units, I just got
    one here, it looks like this, and as I said,
  • 9:39 - 9:48
    they sit on the bedside table and transmit the
    data on a daily basis to the doctors. They
  • 9:48 - 9:55
    also need some wireless communication
    interface to the implant. And when they
  • 9:55 - 10:00
    have the data, they need to transmit it
    to the doctor. And this is usually done
  • 10:00 - 10:09
    over a machine-to-machine GSM or UMTS
    network. And then the data is sent to the
  • 10:09 - 10:14
    manufacturer server. And compared to the
    implants these are based on standard
  • 10:14 - 10:20
    multipurpose hardware. That means we
    will often find Linux operating
  • 10:20 - 10:28
    systems and lots of debug ports like serial
    interfaces or USB. So they are easily
  • 10:28 - 10:35
    accessible for us. OK. And then we have
    this hospital programmer. They are used in
  • 10:35 - 10:41
    the cardiology clinics, and are able to
    configure the implants and also use test
  • 10:41 - 10:47
    modes in the implants. But also, like the
    home monitoring, they can read out the
  • 10:47 - 10:54
    stored events or live data. Yeah. So
    they are in the heart clinic and operated
  • 10:54 - 11:00
    by doctors. And usually these are rented
    to the hospitals or leased from the
  • 11:00 - 11:06
    manufacturer. But, however, we could find
    out that individuals could buy these
  • 11:06 - 11:14
    second hand on specialized
    platforms, similar to eBay, I
  • 11:14 - 11:22
    would say, for medical devices. And now on
    to our methodology when analyzing the
  • 11:22 - 11:29
    devices. So first of all, we thought about
    goals of a potential attacker. First of
  • 11:29 - 11:40
    all, he mainly would like to influence the
    implantable device itself. This can
  • 11:40 - 11:46
    mostly be done over
    the interface that the programming device
  • 11:46 - 11:54
    uses; injecting malicious firmware into
    the implant could be one goal of the
  • 11:54 - 12:00
    potential attacker. Another goal could be
    to GSM spoof the connection of the home
  • 12:00 - 12:08
    monitoring box and then dump some
    medical data or also privacy-related data.
  • 12:08 - 12:15
    And yeah, when looking at the programming
    device, one could also think about direct
  • 12:15 - 12:25
    misuse, for example using the test modes
    already included in the device. So what
  • 12:25 - 12:34
    research questions result from this? So
    the first question is: What is possible
  • 12:34 - 12:40
    only with the direct interaction with
    genuine devices, which means non-
  • 12:40 - 12:49
    invasively? And the second question is: How
    secure are these in general? Like, when
  • 12:49 - 12:55
    also invasive attacks are allowed. And the
    third question is: Can we finally
  • 12:55 - 13:03
    understand all communication protocols or
    is this rather difficult to do? OK. So now
  • 13:03 - 13:10
    we look more concretely at these attack
    vectors, and we do this with the devices
  • 13:10 - 13:17
    which we got from our partner hospital.
    And yeah, to start off, we looked at the
  • 13:17 - 13:24
    Biotronik home monitoring unit. And what
    we did there was to run a rogue GSM cell.
  • 13:24 - 13:38
    So we did GSM spoofing with OpenBTS. And
    this allowed us to intercept data. So this
  • 13:38 - 13:43
    data, which we got then, was not
    encrypted, so we could reveal the access
  • 13:43 - 13:50
    credentials. And the same credentials you
    could also find by dumping the
  • 13:50 - 13:57
    firmware from the microcontroller. And
    this was then done over the JTAG
  • 13:57 - 14:03
    interface. This firmware, which we got
    there, could be reverse engineered with
  • 14:03 - 14:10
    Ghidra, which is an open source tool for
    reverse engineering. And what we found
  • 14:10 - 14:16
    there was also an AES cipher
    implementation, which is mainly used for
  • 14:16 - 14:23
    authentication steps. Also, the firmware
    contained the credentials and also the
  • 14:23 - 14:31
    internet domain cm3-homemonitoring.de. But
    according to the manufacturer, this domain
  • 14:31 - 14:38
    is only used as an authentication realm.
    However, they were kind of surprised when
  • 14:38 - 14:45
    we told them that actually they are using
    the same domain for other services. But I
  • 14:45 - 14:54
    hope they won't do this anymore. So, yeah,
    this should be fine. OK. Next up is the
  • 14:54 - 14:58
    Medtronic home monitoring unit, the
    approach there was similar to the
  • 14:58 - 15:05
    Biotronik home monitoring unit. What we
    found there was that a spoofing attack was
  • 15:05 - 15:13
    not quite possible because this Medtronic
    home monitoring unit uses a machine-to-
  • 15:13 - 15:22
    machine SIM card, which is on an intranet,
    not on the internet; you could
  • 15:22 - 15:30
    think of it as a VPN. And so we couldn't get a
    connection to the original service. And
  • 15:30 - 15:40
    however, we found also on a web blog post
    a documented method to find out about the
  • 15:40 - 15:46
    encryption password because the firmware
    of the device is encrypted. And, yeah,
  • 15:46 - 15:52
    turned out that it was a Linux embedded
    Linux system, which we could also
  • 15:52 - 15:58
    influence when opening up and tearing down
    the device. Taking out, I think it was an
  • 15:58 - 16:04
    SD card, and then overwriting some
    files. And here in the picture, you can
  • 16:04 - 16:12
    also see where we could display an image
    on the screen of the device. This was done
  • 16:12 - 16:21
    using some D-Bus messages because it was
    really an embedded Linux. So here we've got
  • 16:21 - 16:29
    the server backend addresses also in
    the firmware. But more to that later. The
  • 16:29 - 16:35
    third device which we analyzed was this
    Boston Scientific programming device. You
  • 16:35 - 16:43
    can switch the camera so we can see it
    more clearly here. Yeah. So this rather
  • 16:43 - 16:54
    huge device we could buy for around 2,000
    U.S. dollars from this auction platform.
  • 16:54 - 17:00
    And we could also tear this down because
    it's no longer used for real
  • 17:00 - 17:10
    productive cases. And there we found a
    hard disk inside. And there is also a
  • 17:10 - 17:18
    Linux system on it, which I think is from
    2002, as shown in the slides. And the device itself
  • 17:18 - 17:24
    is an Intel Pentium-based device, which was
    designed in 2004, and the software is from
  • 17:24 - 17:34
    2011. So quite old, right? But yeah. So
    that's, I think, the thing about this
  • 17:34 - 17:41
    device. The Linux kernel, sorry, the Linux
    system in the device also contained a
  • 17:41 - 17:52
    window manager, twm,
    and modifying files or shell
  • 17:52 - 17:59
    scripts on the hard disk allowed us to
    also start the twm window
  • 17:59 - 18:06
    manager on certain actions. So from there
    on, we could simply open up an xterm shell
  • 18:06 - 18:16
    and then have root rights. OK. So maybe I
    will show it in the live demonstration
  • 18:16 - 18:23
    later. Also, we found some region lock
    configuration file, which we could also
  • 18:23 - 18:33
    alter to allow connecting on the radio
    frequencies which implants use around the
  • 18:33 - 18:40
    world. And as the Linux kernel is so old,
    further exploits are likely
  • 18:40 - 18:48
    possible. But the device is luckily being
    replaced right now. That's what Boston
  • 18:48 - 18:57
    Scientific told us. One nice thing about
    this device is that we also found an x86
  • 18:57 - 19:05
    binary called "checkDongle" on
    the hard disk, and this checkDongle
  • 19:05 - 19:11
    binary just looks for direct
    connections on the printer port. And when
  • 19:11 - 19:15
    reverse engineering the direct
    connections, we could rebuild a genuine
  • 19:15 - 19:24
    dongle. OK. So with this dongle we were
    then able to also change this RF region
  • 19:24 - 19:34
    setting inside the general menu of the
    device, but we could also boot into either
  • 19:34 - 19:41
    the integrated firmware upgrade utility
    over some special USB drive,
  • 19:41 - 19:51
    or we could access the BIOS configuration
    and boot from different devices. This, of
  • 19:51 - 19:57
    course, could leak stored treatment data
    and personal data of the patients that are
  • 19:57 - 20:05
    stored, maybe on the hard disk itself or
    later on when something is modified. OK,
  • 20:05 - 20:12
    so now I have prepared a little live
    demonstration with this programming device.
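As a side note before the demo: the checkDongle mechanism described above, a binary that just looks for direct wire connections on the printer (parallel) port, can be sketched as a simple loopback check. This is a hypothetical reconstruction in Python, not the actual binary; the real pin wiring of the Boston Scientific dongle is not public, so the mapping below is invented.

```python
# Hypothetical sketch of a parallel-port "wired dongle" check in the
# style of the checkDongle binary from the talk: the dongle is just a
# plug that loops certain data (output) pins of the printer port back
# to certain status (input) pins. The wiring below is made up.

# Made-up wiring: data pin -> status pin it is looped back to.
DONGLE_WIRING = {0: 3, 1: 6, 2: 7}

def read_status_pins(data_value, wiring):
    """Simulate reading the status pins while data_value is driven on
    the data pins of a port with the given loopback wiring."""
    status = 0
    for data_pin, status_pin in wiring.items():
        if data_value & (1 << data_pin):
            status |= 1 << status_pin
    return status

def check_dongle(wiring):
    """Drive test patterns and verify each one echoes back exactly as
    the expected wiring predicts; any mismatch means no dongle (or a
    wrongly wired one) is plugged in."""
    for pattern in (0b000, 0b001, 0b010, 0b100, 0b111):
        expected = read_status_pins(pattern, DONGLE_WIRING)
        observed = read_status_pins(pattern, wiring)
        if observed != expected:
            return False
    return True
```

With the expected wiring the check passes; with no dongle (empty wiring) or a wrongly wired one it fails. Once the expected connections were reverse engineered, rebuilding a passing dongle is exactly this easy.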
  • 20:12 - 20:19
    Maybe the guys from the camera operation
    can switch to the live feed from the
  • 20:19 - 20:26
    device itself. You will see what I see
    here. And first of all, I will quickly
  • 20:26 - 20:37
    show how the device itself works and I
    just put this antenna on one implant and
  • 20:37 - 20:44
    then the interrogate button leads to
    starting the specific software for the
  • 20:44 - 20:51
    implant. As you can already see, the twm
    window manager also started, so when we
  • 20:51 - 20:57
    want to, we can start an xterm terminal and
    when connected to a keyboard, we can also
  • 20:57 - 21:08
    type in something and we are root. Also in
    this standard interface now we can access
  • 21:08 - 21:14
    some test modes or settings of the
    implant, but I'm not really into it, so
  • 21:14 - 21:22
    let's skip this part of setting or doing
    some stuff here. But what else we could do
  • 21:22 - 21:32
    now is this security dongle. I just plug
    it in and then start it again with an
  • 21:32 - 21:45
    attached flash drive. Then it starts this
    normal BIOS POST. And when this is done it
  • 21:45 - 21:50
    simply boots from this USB flash drive.
    One special thing about this flash drive
  • 21:50 - 21:58
    is I had to find one which supports USB
    1.1 because the hardware is so old. But
  • 21:58 - 22:07
    finally, I got it working to boot from
    this USB drive. And after some while, when
  • 22:07 - 22:14
    loading all the data, you see, it simply
    starts a FreeDOS operating system and then
  • 22:14 - 22:23
    starts Doom. Now we can simply play Doom
    on this programming device from a hospital.
  • 22:23 - 22:31
    Quite interesting, right? OK, I think
    you can switch back to the slides, please.
  • 22:31 - 22:42
    OK. So now that was this programming
    computer. What else is missing is the
  • 22:42 - 22:49
    server infrastructure between the home
    monitoring and the doctor. First of all,
  • 22:49 - 22:56
    we looked at the home monitoring access to
    the manufacturer and when looking at the
  • 22:56 - 23:03
    credentials or rather the HTTP domain or
    IP address, I don't know, in the home
  • 23:03 - 23:10
    monitoring system of Medtronic, we were
    able to access the HTTP web server to which
  • 23:10 - 23:16
    the data is, I think, transmitted
    using a POST request. However,
  • 23:16 - 23:26
    whatever we sent to the server resulted in
    a blank page with the status code 200. So
  • 23:26 - 23:30
    no matter what we sent, right? We could
    also send some really, really incorrect
  • 23:30 - 23:38
    data. But it doesn't matter. It just keeps
    this blank page. So this seems to be a
  • 23:38 - 23:46
    measure against this misuse. And maybe
    it's not so bad like this. However, I
  • 23:46 - 23:53
    don't know if we looked for any encrypted
    stuff there. Probably it's only TLS
  • 23:53 - 24:01
    encrypted or something. OK, and then the
    doctor also gets the data from the
  • 24:01 - 24:07
    manufacturer server. So this is also
    usually done over a Web interface, which
  • 24:07 - 24:14
    we learned from our partnering hospital.
    And when looking around there, we thought
  • 24:14 - 24:20
    it's not that bad because there's a
    typical username and password login
  • 24:20 - 24:28
    authentication included. And then we
    stopped there because these are productive
  • 24:28 - 24:33
    systems and we wouldn't want to do some
    SQL injections or something like this
  • 24:33 - 24:39
    because it's a really productive system
    and probably life-critical monitoring is
  • 24:39 - 24:45
    running there. So we didn't want to risk
    anything. Right. So better stop there and
  • 24:45 - 24:56
    let it be. OK. But from the first look,
    it looked quite okayish. OK, so a quick
  • 24:56 - 25:02
    summary about all these findings on the
    technical aspect. There are several
  • 25:02 - 25:10
    security vulnerabilities in different
    devices. Yeah, sure, patients could be
  • 25:10 - 25:16
    harmed if therapy-relevant data were
    manipulated. However, usually there is a
  • 25:16 - 25:22
    doctor in between. So whenever the doctor
    gets some information that something's
  • 25:22 - 25:29
    wrong or something like this, he would
    probably look and find out what is wrong.
  • 25:29 - 25:37
    And yeah, we also found out that it's
    possible to access medical devices. So,
  • 25:37 - 25:43
    yeah, we got this programming computer for
    2,000 U.S. dollars, which clearly shows
  • 25:43 - 25:52
    that it's maybe not a good practice to simply
    rent or lease these devices; better to
  • 25:52 - 26:00
    design these to be as secure as possible.
    And now some countermeasures: what can
  • 26:00 - 26:09
    be done to make it better? First of all,
    regular software update maintenance could
  • 26:09 - 26:14
    resolve most of these issues. Also, it
    would be nice to include some medical
  • 26:14 - 26:19
    professionals in the product engineering
    phase, because some test modes maybe
  • 26:19 - 26:25
    aren't that relevant when the implant is
    finally inserted in the body after
  • 26:25 - 26:34
    surgery, so then nobody needs these test
    modes anymore, for example. And last
  • 26:34 - 26:42
    but not least: please make use of
    state of the art cryptography and PKIs and
  • 26:42 - 26:49
    maybe also open protocols to improve the
    security and develop something that is as
  • 26:49 - 26:55
    secure as it gets. OK, so this is, I
    think, the technical part, and I would
  • 26:55 - 27:01
    like to hand over to Christoph, who will
    tell us something about the GDPR requests
  • 27:01 - 27:08
    and responses and nightmares.
    Christoph Saatjohann: Yeah, thank you. So
  • 27:08 - 27:11
    my name is Christoph Saatjohann from
    Münster University of Applied Sciences,
  • 27:11 - 27:16
    and I will tell you something about the
    privacy part, so privacy stuff, because
  • 27:16 - 27:20
    as you already heard, there is a lot of
    data included in this complete ecosystem.
  • 27:20 - 27:24
    So there is some data flowing from the
    implantable device to the home monitoring
  • 27:24 - 27:29
    service and then going further over the
    internet to the device companies here.
  • 27:29 - 27:36
    Now my question was, OK, what can we do
    here? How can we take a look into the data
  • 27:36 - 27:39
    processing here? How can we look into the
    processes of the company? What would they
  • 27:39 - 27:45
    do with our data or with the patient data?
    And we used the GDPR for this, so the GDPR
  • 27:45 - 27:50
    is the General Data Protection Regulation,
    it came into force in 2018. So it's not
  • 27:50 - 27:54
    so new. During our study it was two or
    three years old. So we thought the
  • 27:54 - 28:03
    companies are still, are already prepared
    for such stuff. Mrs. GDPR, the user in our
  • 28:03 - 28:07
    case, or the patient, can obtain some
    information about the processed data and
  • 28:07 - 28:13
    with Article 15 of the GDPR the
    patient can ask about the purpose of the
  • 28:13 - 28:18
    processing, the categories of the data and
    the recipients. So, for example, some
  • 28:18 - 28:24
    subcontractors who will get the data from
    the patient and compute something,
  • 28:24 - 28:29
    or just convert it to some PDF or put it
    on the web interface for the doctors. So
  • 28:29 - 28:35
    those are some typical tasks
    for some subcontractors, so some other
  • 28:35 - 28:41
    recipients who will get the data. With
    Article 20, it is possible to get a copy
  • 28:41 - 28:46
    of the data itself. The patient could ask
    the company: Yeah, please give me a copy
  • 28:46 - 28:50
    of all my own data. I want to look into it
    or I want to move it to a different
  • 28:50 - 28:57
    company. And for this moving from one
    company to a different company, which is
  • 28:57 - 29:04
    called data portability, the data must be
    provided in a commonly used, machine-
  • 29:04 - 29:09
    readable format. And machine-readable
    format does not mean PDF, for example. For
  • 29:09 - 29:17
    our topic here, for the measurable things,
    commonly used formats may be DICOM or HL7
  • 29:17 - 29:24
    and not PDF. The GDPR also defines the
    maximum answer time. So every request from
  • 29:24 - 29:30
    a customer should be answered in a maximum
    of four weeks. If it's a really complex
  • 29:30 - 29:34
    thing, a complex request or something like
    this, it might be extended up to three
  • 29:34 - 29:40
    months in total, but the customer has to
    be informed about the extension and also
  • 29:40 - 29:46
    the reasons for the extension. The last
    point here: the GDPR defines two
  • 29:46 - 29:50
    important roles for the following part of
    the talk. First is the data
  • 29:50 - 29:55
    controller. That means the data controller
    is responsible for the complete data
  • 29:55 - 30:00
    process. He might want to share this data
    with some other recipients or
  • 30:00 - 30:06
    subcontractors or subsidiaries of the
    company. And then the other recipient is
  • 30:06 - 30:12
    called the data processor and he processes
    the data. But responsible for this process
  • 30:12 - 30:16
    is the data controller. So the important
    thing here is that the data controller is
  • 30:16 - 30:22
    responsible. Whatever happens, he has
    to answer the request. So
  • 30:22 - 30:31
    with these GDPR, yeah, methods here, we
    thought about: What can we do? And our
  • 30:31 - 30:36
    approach was that we acquired some patients
    with some implanted devices and we sent
  • 30:36 - 30:42
    some GDPR inquiries in their names per
    company, so we told the company: OK, we
  • 30:42 - 30:46
    are patient xy and we want to know
    something about our data and we want to
  • 30:46 - 30:51
    have a copy of our own data. And of
    course, now you can argue, OK, now we are
  • 30:51 - 30:57
    handling some very sensitive medical data
    here, so we have to keep our study here,
  • 30:57 - 31:02
    our case study itself GDPR compliant. So
    we talked to our data protection officer,
  • 31:02 - 31:07
    told them about our study design. We set up some
    contracts with the patients so that we are
  • 31:07 - 31:15
    ourselves GDPR compliant. Hopefully, it worked
    out. So far we haven't got sued. So
  • 31:15 - 31:21
    I think that worked out. At the end we
    were waiting for the answers of the
  • 31:21 - 31:24
    companies and the hospitals and of course,
    analyzed the results. So we looked at the
  • 31:24 - 31:30
    completeness, we thought about: This
    dataset, is it complete? Or do the
  • 31:30 - 31:34
    companies have some other data which was not
    provided? We also looked at the data
  • 31:34 - 31:40
    security, especially: How is the data
    transmitted? Do we get this via plain text
  • 31:40 - 31:47
    email, perhaps like ZIP files or some CD-
    ROMs? So we looked at this process here,
  • 31:47 - 31:50
    and of course, we looked at the time of the
    answer. So remember, four weeks is the
  • 31:50 - 31:55
    maximum time. In some cases, perhaps three
    months, but standard would be four weeks.
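The deadlines just described (four weeks by default, up to three months in total for complex requests, with the requester informed about the extension) can be sketched as a small deadline calculator. This is our own illustrative sketch, not code from the talk, and it simplifies "three months" to 13 weeks:

```python
# Sketch of the GDPR answer-time rule described above: a request
# should normally be answered within four weeks, and for complex
# requests the deadline can be extended to three months in total,
# provided the requester is informed of the extension and reasons.
from datetime import date, timedelta

def answer_deadline(received: date, extended: bool = False) -> date:
    """Latest date by which the controller must answer: four weeks by
    default; roughly three months (13 weeks here, a simplification)
    when an extension was properly communicated."""
    weeks = 13 if extended else 4
    return received + timedelta(weeks=weeks)

def is_late(received: date, answered: date, extended: bool = False) -> bool:
    """True if the answer arrived after the applicable deadline."""
    return answered > answer_deadline(received, extended)
```

For example, a request received on 2021-01-04 must normally be answered by 2021-02-01, so an answer arriving two months later is late unless a properly communicated extension applies.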
  • 31:55 - 32:02
    And of course, if required, we sent some
    follow-up queries. Yes. And, as already
  • 32:02 - 32:07
    said, we are responsible researchers
    here, so we also do this responsible
  • 32:07 - 32:11
    disclosure stuff. So we talked to the
    companies and discussed some methods, some
  • 32:11 - 32:17
    process improvements, what can they do or
    at least what should they do or must do to
  • 32:17 - 32:24
    be GDPR compliant. So let's take a look at
    the results here, what we got from our
  • 32:24 - 32:30
    case study. First vendor was Biotronik
    and we sent the first inquiry to the
  • 32:30 - 32:36
    Biotronik subsidiary, but we learned that
    we had the wrong contact. So we just took a
  • 32:36 - 32:40
    data privacy contact from some documents
    from the hospital. But they wrote back:
  • 32:40 - 32:43
    Ah, sorry, we are the wrong company, we're
    just the sales company of Biotronik.
  • 32:43 - 32:48
    Please refer to a different company.
    Then we wrote a second letter to the
  • 32:48 - 32:51
    other company. And we got an answer
    after two months. Now remember: four
  • 32:51 - 32:56
    weeks, not two months, so it was
    delayed, sure. But the answer itself was
  • 32:56 - 33:02
    also a bit unsatisfying for us because
    Biotronik told us: The device was never
  • 33:02 - 33:07
    connected to any home monitoring system
    here, so no personal data is stored at
  • 33:07 - 33:12
    Biotronik. We asked the patient: Did you
    ever have one of these home monitoring
  • 33:12 - 33:17
    devices? And he told us: No, never got
    this device. So this is a classic example
  • 33:17 - 33:22
    of a good study design or, in this case, a
    bad study design. So first of all, get
  • 33:22 - 33:29
    your contact right. And secondly, choose
    good participants for your study. So this
  • 33:29 - 33:36
    might be a future work item here. Perhaps
    choose a different patient.
  • 33:36 - 33:41
    Next company we wrote to was Medtronic.
    You already know it from the other devices
  • 33:41 - 33:47
    here. And the answer was that we have to
    send a copy of the ID card, so they wanted
  • 33:47 - 33:55
    to have an identity verification.
    The GDPR does not define a really strict
  • 33:55 - 33:59
    method for the verification or when it is
    required, when it's mandatory or when not,
  • 33:59 - 34:06
    but the GDPR says it is possible in
    some cases. And we think here we are
  • 34:06 - 34:10
    dealing with very sensitive medical
    personal data. We think this is totally
  • 34:10 - 34:14
    fine. So identity verification is
    fine for us, and we think this is a good
  • 34:14 - 34:20
    thing here that they really check that
    we are the person we claim
  • 34:20 - 34:27
    to be. They also recommended we use the
    Medtronic secure email system, and first
  • 34:27 - 34:32
    of all, we had a good impression because
    it's much better than having some plaintext
    email, and if they are hosting some secure
    email and if they are hosting some secure
    email system on their servers, we said:
  • 34:36 - 34:40
    OK, that's a good idea, right? We have a
    TLS-secured connection here. Looks
  • 34:40 - 34:45
    perfectly fine. So we sent some test
    emails, but we saw in the email headers
  • 34:45 - 34:51
    that the email is routed through US servers
    of the Proofpoint company.
  • 34:51 - 34:55
    And here we would say, OK, that's not
    really good because normally if I'm a
  • 34:55 - 35:01
    German customer or a European customer and
    sending some GDPR request to the
  • 35:01 - 35:08
    Medtronic Germany or any other EU
    Medtronic subsidiary, I am not aware,
  • 35:08 - 35:13
    I have no knowledge, that my email is
    routed through the US. And also for the
  • 35:13 - 35:20
    GDPR compliance we are not sure if this is
    actually allowed because there are some
  • 35:20 - 35:24
    discussions about the Safe Harbor agreement.
    So that might not really be GDPR
  • 35:24 - 35:29
    compliant. At the least, it's not good for
    the user. It's a bad user experience here.
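As an aside, anyone can check where their mail was routed: every relay adds a `Received:` header to the raw message, newest hop on top. A minimal sketch in Python's standard `email` library — the sample message below is constructed for illustration, not the actual Medtronic mail:

```python
import email
from email import policy

def mail_hops(raw_bytes):
    """Return the relay info from each Received: header, newest hop first."""
    msg = email.message_from_bytes(raw_bytes, policy=policy.default)
    # Each Received: header looks like "from <host> (<ip>) ...; <date>";
    # the part before the semicolon names the relay.
    return [str(h).split(";")[0].strip() for h in msg.get_all("Received", [])]

# Constructed sample: the topmost (newest) hop reveals a US-based relay.
sample = (
    b"Received: from mx1.us-relay.example (192.0.2.1);"
    b" Mon, 1 Mar 2021 10:00:02 +0000\r\n"
    b"Received: from mail.vendor.example (192.0.2.2);"
    b" Mon, 1 Mar 2021 10:00:01 +0000\r\n"
    b"Subject: GDPR request\r\n"
    b"\r\n"
    b"Hello\r\n"
)
for hop in mail_hops(sample):
    print(hop)
```

In a mail client, "show original" or "view source" exposes the same headers without any code.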
  • 35:29 - 35:37
    But OK, we used it anyway, because
    we think such a secure platform is
  • 35:37 - 35:41
    better than plaintext email, so we sent
    our inquiry via this system. And the next
  • 35:41 - 35:45
    point was really surprising to us. So we
    were waiting for the results. We had
  • 35:45 - 35:51
    created a free GMX web email account for
    this communication, and suddenly I got an
  • 35:51 - 35:56
    email there, in the GMX mailbox. So the
  • 35:56 - 36:01
    response from Medtronic was not sent via
    the secure channel. It was just sent as a
  • 36:01 - 36:06
    plaintext email, and we said: OK, what's
    the point here? You recommend us to use
  • 36:06 - 36:10
    the secure system, but then you use plaintext
    email yourselves. So this is a thing they really have
  • 36:10 - 36:17
    to change, sure. Then it went a bit back
    and forth: we wrote some emails, and they
  • 36:17 - 36:22
    wanted some more information about
    us: what device we have, what serial
  • 36:22 - 36:28
    number, and what services we use. And in
    the end, we got a doc file, a standard
  • 36:28 - 36:33
    Word file as an email attachment, which we
    were supposed to fill with information and
  • 36:33 - 36:38
    send it back. And from a security point of
    view, from our security researcher side
  • 36:38 - 36:43
    here, we would say that's not the best way
    to do it, because doc files as email
  • 36:43 - 36:48
    attachments are a classic vector for
    ransomware or phishing emails. So we did
  • 36:48 - 36:54
    this to get a final
    answer, but we would propose to change the
  • 36:54 - 36:59
    system here. This is where we would now
    get the final data. So we thought, OK, now
  • 36:59 - 37:04
    we will really get some data, the point
    where we get some really good stuff.
  • 37:04 - 37:09
    But in the end, after this back and forth
    and creating some accounts on some
  • 37:09 - 37:13
    systems, they just stated: Oh, we are
    the wrong contact. The hospital is
  • 37:13 - 37:18
    responsible; we are just the data processor
    in this case. And of course, this was a
  • 37:18 - 37:23
    bit unsatisfying, because we thought: OK,
    we are now close to getting the data.
  • 37:23 - 37:31
    But it never happened. Yeah, so,
    analyzing this: in the end, it might be
  • 37:31 - 37:36
    GDPR compliant. We have no insight into
    this relationship between Medtronic and
  • 37:36 - 37:41
    the hospital. So it might be that they
    have an agreement: Who is the controller,
  • 37:41 - 37:45
    who is responsible? We couldn't check it
    as a patient. But of course, the user
  • 37:45 - 37:49
    experience is not good, because you write so
    many emails and in the end you get
  • 37:49 - 37:57
    nothing. But the third one is Boston
    Scientific. You know Boston Scientific
  • 37:57 - 38:06
    from the nice Doom device here. And we
    sent an inquiry to BSC, and we got a
  • 38:06 - 38:10
    response that they want to do an
    identity verification, so the same
  • 38:10 - 38:15
    as Medtronic. So we said, yeah, that's
    fine, sounds legit. They also said, yeah,
  • 38:15 - 38:20
    you can use plaintext email. Just send an
    email to this email address or you can use
  • 38:20 - 38:26
    our online tool. And our patient chose
    email. From the security side, we would
  • 38:26 - 38:32
    have used the secure platform, of course.
    But I can totally understand the patient
  • 38:32 - 38:38
    because it was a hard copy letter, a
    real letter by snail mail. And he
  • 38:38 - 38:43
    would have had to type this really long link,
    a long URL with some random characters. And if you
  • 38:43 - 38:48
    type one character wrong, you have to do
    it again and so on. And from the customer
  • 38:48 - 38:52
    point of view, the user experience is
    also very bad here. No one wants to
  • 38:52 - 38:57
    really type this into their browser and check
    if it's correct or not. So Boston
  • 38:57 - 39:02
    Scientific should use a different system
    here, some short link system or just have
  • 39:02 - 39:11
    a short domain and a very short access
    code, anyway something better than this one
  • 39:11 - 39:16
    here. But then we got an email. It was
    a plaintext email, so not good, of
  • 39:16 - 39:21
    course; medical data via plaintext email is
    not good. Some could now argue: OK, but
  • 39:21 - 39:28
    our patient started it. Our patient
    started by writing a plaintext email. But
  • 39:28 - 39:33
    the common understanding for the GDPR is
    that even if the customer is asking via
  • 39:33 - 39:38
    plain text email, the company cannot
    respond in plain text email. You have to
  • 39:38 - 39:43
    do something special, something secure. So
    this is also a thing Boston Scientific
  • 39:43 - 39:47
    should surely change. But hey, we got seven PDF
    reports, our first data in this
  • 39:47 - 39:52
    case study, after I don't know how many
    emails and letters. But we got some data.
  • 39:52 - 39:57
    Then we thought: OK, seven PDF reports, and
    the device has been active for three years. That
  • 39:57 - 40:03
    does not sound right, that sounds a bit
    little, seven reports for three years. So we
  • 40:03 - 40:08
    contacted the doctor, of course, with the
    consent of the patient, and the doctor
  • 40:08 - 40:13
    looked into this online portal and saw a
    lot of data: there was a lot of raw data,
  • 40:13 - 40:18
    a lot of PDF reports and graphs, full
    of information. And so we thought: OK,
  • 40:18 - 40:22
    that's not enough. We got seven reports,
    but the system is full of other data.
  • 40:22 - 40:28
    So of course we went to BSC and told them: OK,
    we want to have all the data. BSC
  • 40:28 - 40:35
    apologized: OK, we didn't look into
    the home monitoring system, but you
  • 40:35 - 40:41
    can have the data, but we need two extra
    months. So, as I said before,
  • 40:41 - 40:47
    that might be OK if it's a really complex
    thing. My
  • 40:47 - 40:53
    understanding is, or I have a feeling, that
    they had to implement some export
    mechanism to fulfill our request. But OK,
    they wanted two months, and we got
  • 40:57 - 41:04
    the data, so we were fine with this. Yeah,
    and now the final data. Within this extended
  • 41:04 - 41:09
    deadline, so within
    the three months, we got all
  • 41:09 - 41:13
    the data. And by all the data, I mean
    really a ton of it. It was a large
  • 41:13 - 41:19
    zip file with a lot of HL7 data. So it's a
    raw, digital, machine readable format. And
  • 41:19 - 41:23
    we got some episode data, also digital, as
    an Excel sheet. And we were now really
  • 41:23 - 41:29
    happy because this was really satisfying.
    The patient, or in this case we, got all
  • 41:29 - 41:36
    the data, which is really GDPR compliant.
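HL7, the format of that export, deserves a note: HL7 v2 messages are plain text, one segment per line, with fields separated by pipes, so "machine readable" really holds. A minimal parsing sketch — the sample message and field positions here are invented for illustration, not taken from the actual export:

```python
# Minimal HL7 v2 parsing sketch. The sample message is invented; a real
# export would carry many more segments and repetitions.
sample = "\r".join([
    "MSH|^~\\&|HOMEMON|VENDOR|||20210101120000||ORU^R01|42|P|2.6",
    "PID|1||PAT-0001||Doe^John",
    "OBX|1|NM|HR^HeartRate||72|bpm",
])

def parse_hl7(message):
    """Group the pipe-delimited segments of an HL7 v2 message by segment ID."""
    segments = {}
    for line in message.split("\r"):       # segments are CR-separated
        fields = line.split("|")           # fields are pipe-separated
        segments.setdefault(fields[0], []).append(fields[1:])
    return segments

obx = parse_hl7(sample)["OBX"][0]
print(obx[4], obx[5])  # observation value and unit: 72 bpm
```

Real-world use would reach for a dedicated library rather than hand-splitting, but the format itself is this simple at its core.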
    So that was the first and only vendor
  • 41:36 - 41:42
    where we really got our GDPR request
    fulfilled. Yeah, but as a last point, we have
  • 41:42 - 41:48
    to mention the security. The data was not
    sent directly by email, but we just got
  • 41:48 - 41:52
    the download link via email. And from the
    security perspective, that's more or less
  • 41:52 - 41:57
    the same. So if you have a man in the
    middle who can read the plaintext email, he
  • 41:57 - 42:02
    can also click on the link in the download
    email. So, OK, we got the data but the
  • 42:02 - 42:10
    process here with the security must be
    improved. We also got one hospital patient
  • 42:10 - 42:18
    and we sent one inquiry to a hospital.
    There were also some things we were not
  • 42:18 - 42:22
    informed of, that we were not aware of.
    The hospital that did the implantation of
  • 42:22 - 42:28
    the device in our inquiry five years ago
    had since gone bankrupt, and we were told
  • 42:28 - 42:33
    that we have to contact a different person
    at the old owner of the hospital. So the
  • 42:33 - 42:38
    bankrupt company. We also think
    that this
  • 42:38 - 42:43
    might not be correct, because the
    explantation of the device was done during
  • 42:43 - 42:49
    the time where the hospital was under the
    control of the new owner, which we
  • 42:49 - 42:54
    contacted. So there might be some data for
    the explantation. But we also wrote a
  • 42:54 - 43:00
    letter to the old company, and we got an
    answer, but again only after two months, so
  • 43:00 - 43:08
    here too we had to wait two months. A
    delay: the GDPR time frame was not met.
  • 43:08 - 43:12
    And also about the final answer: we
    really got some data, as you can see
  • 43:12 - 43:17
    here, this handwritten stuff, but we
    were missing, for example, the surgery
  • 43:17 - 43:24
    report. Normally a hospital has to do a
    lot of documentation for a surgery, but we
  • 43:24 - 43:31
    didn't get this information here, so we
    only got a part of it. But in summary, I
  • 43:31 - 43:35
    won't go into all at this point, but you
    can see a lot of red here, so we had some
  • 43:35 - 43:40
    plaintext data via email, which is not
    correct and also, legally, not GDPR
  • 43:40 - 43:45
    compliant. We had some deadlines missed.
    We had some incomplete data, or in the end
  • 43:45 - 43:50
    it was complete data, but we had to ask a
    lot and had to ask a doctor who would
  • 43:50 - 43:56
    double check if the data was correct, and we
    often needed more than one request. And
  • 43:56 - 44:00
    finally, for the patient: if you want to
    have your data, if you want to exercise
  • 44:00 - 44:04
    your right, it's a really hard way, as you
    can see here. And you need a lot of
  • 44:04 - 44:09
    emails, a lot of time for this. And
    sometimes it's not really possible because
  • 44:09 - 44:13
    they say: OK, just go to the hospital, go
    somewhere else. We are not
  • 44:13 - 44:20
    responsible for this. So it's not a good
    user experience here. And our
  • 44:20 - 44:24
    recommendation for the companies, and
    also for the hospitals, is that they should be
  • 44:24 - 44:30
    prepared for such requests, because the GDPR
    has now been active for more than three years. And
  • 44:30 - 44:37
    sooner or later, they will
    get some requests, and perhaps not from the
  • 44:37 - 44:40
    friendly researchers but from real
    patients. And, yeah, if the patients
  • 44:40 - 44:45
    don't get an answer, they can go to the
    local authorities, and the local
  • 44:45 - 44:50
    authorities can really sue them,
    can punish the company with a large
  • 44:50 - 44:57
    fine. So our recommendation: Be prepared
    for such requests. And with this slide, I
  • 44:57 - 45:01
    want to thank you for your attention and
    want to close the talk, and we are happy
  • 45:01 - 45:08
    to answer some questions in the Q&A
    session. Thanks.
  • 45:08 - 45:14
    Herald: Thank you very much. I have a
    bunch of questions from the internet.
  • 45:14 - 45:22
    First one is, did you analyze binaries
    only or did you have any access to source
  • 45:22 - 45:26
    codes? And did you ask any
    manufacturers...
  • 45:26 - 45:35
    e7p: Please repeat the question. I didn't
    get it because of...
  • 45:35 - 45:43
    Herald: Some technician, please mute the
    room. I can also (inaudible) Ah, the
  • 45:43 - 45:50
    question is: Did you analyze binaries only
    or did you obtain a degree of access to
  • 45:50 - 46:00
    the source codes?
    e7p: I'm not sure if the check dongle is
  • 46:00 - 46:06
    meant but for this one, it was very small
    and we could analyze it easily using
  • 46:06 - 46:13
    Ghidra to decompile it and then just see
    which data needs to be at which position in
  • 46:13 - 46:18
    the parallel port. If that was the
    question. I think the other binaries from
  • 46:18 - 46:27
    the home monitoring system or, if you
    could be more specific... From the home
  • 46:27 - 46:33
    monitoring systems, we first had just a
    look for strings mainly, for some access
  • 46:33 - 46:42
    credentials or domain names. And I think
    we did not do that much decompilation on
  • 46:42 - 46:49
    the other stuff, but the whole software of
    the programming computer is in obfuscated
  • 46:49 - 46:58
    Java. And this is, I don't know,
    just-in-time compiled obfuscated Java, and
  • 46:58 - 47:04
    I didn't find a good way to deal with
    this. Other questions?
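The quick triage mentioned above — scanning a firmware dump for strings such as access credentials or domain names — can be sketched like this. The firmware blob and keyword list below are hypothetical, purely to show the technique:

```python
import re

def ascii_strings(data, min_len=6):
    """Yield printable-ASCII runs of at least min_len bytes, like `strings`."""
    for m in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, data):
        yield m.group().decode("ascii")

def interesting(blob, keywords=("http", "ftp", "passw", "user", ".com", ".de")):
    """Keep only strings worth a second look: URLs, hostnames, credentials."""
    return [s for s in ascii_strings(blob)
            if any(k in s.lower() for k in keywords)]

# Hypothetical firmware blob, not from a real device.
blob = b"\x00\x01GET https://monitor.example.com/upload\x00user=admin\x00\xff"
print(interesting(blob))
```

This is essentially what the Unix `strings` tool does, plus a keyword filter; on a real dump you would then pivot from the hits into a disassembler.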
  • 47:04 - 47:16
    Herald: Thank you. Another question was:
    How many CVEs did you file from this
  • 47:16 - 47:24
    research?
    e7p: So I'm not sure, but some of these
  • 47:24 - 47:30
    findings were already found by others, and
    we just didn't realize that they were already
  • 47:30 - 47:37
    reported as CVEs or in CISA advisories. But I
    think one or two or three.
  • 47:37 - 47:44
    Christoph: Yes, there were some CVEs and
    also our results were linked to some other
  • 47:44 - 47:50
    CVEs which were already published. The
    company did some other internal research
  • 47:50 - 47:55
    thing and internal research got some CVEs.
    But if you look into the CVE, you cannot
  • 47:55 - 48:01
    always trace it back to the actual
    vulnerability. But at least for the
  • 48:01 - 48:06
    programmer here, for the Boston Scientific
    programmer, there was a CISA advisory:
  • 48:06 - 48:11
    a few months back, I think in September or
    just end of August, there was a CISA
  • 48:11 - 48:17
    advisory for this programmer, and it
    stated, I don't know, four or five CVEs.
  • 48:17 - 48:22
    Yeah, and it will be replaced in the next
    months, hopefully, because it's also
  • 48:22 - 48:27
    pretty old. And it's the old generation
    and the hospitals have to switch to the newer
  • 48:27 - 48:39
    generation in the next months.
    Herald: You mentioned the cooperativeness
  • 48:39 - 48:48
    of the manufacturers on the subject access
    requests, but how cooperative were they on
  • 48:48 - 48:52
    the technical side, when you tried to
    report and disclose?
  • 48:52 - 48:59
    e7p: Yeah, actually, they were quite
    cooperative when we simply wrote: Hey, we
  • 48:59 - 49:05
    found this and that. We wrote it to them,
    I think, first to the press address or
  • 49:05 - 49:13
    something. And then we were redirected to
    the internal security group, or, would you
  • 49:13 - 49:20
    say, the experts. And then we had, I
    think, Zoom meetings with them and it was
  • 49:20 - 49:26
    a good cooperation, I would say.
    Christoph: Really, I think it was a good
  • 49:26 - 49:31
    communication on the same level here, so
    it was really everyone's goal that we,
  • 49:31 - 49:37
    of course, don't threaten the patients,
    but make the product more secure. I think
  • 49:37 - 49:41
    there is also some pressure from regulation,
    like the CISA in the US. If they have
  • 49:41 - 49:46
    vulnerabilities, they have to fix them, so
    they also get some pressure from the
  • 49:46 - 49:52
    regulators, and they really want to change
    some things. So that's my impression and
  • 49:52 - 49:56
    the discussions were really well
    organized, well structured with a lot of
  • 49:56 - 50:00
    people who were really deep into the
    topic. So we asked some questions and we
  • 50:00 - 50:04
    got really deep insights into the products
    here. They were very helpful.
  • 50:04 - 50:09
    e7p: So I think all of the companies
    offered some jobs for security analysts.
  • 50:09 - 50:12
    Christoph: Oh yeah!
    e7p: So anyone who's interested in jobs at
  • 50:12 - 50:15
    Boston Scientific or Medtronic or
    Biotronik must...
  • 50:15 - 50:25
    Christoph: Just hack a device and you will
    get a job for it. I don't know.
  • 50:25 - 50:33
    Herald: And the last question I have for
    you is how difficult this was in terms of
  • 50:33 - 50:40
    technical skills needed. Did it
    require really high-tech exploits, or was
  • 50:40 - 50:45
    this just unbelievably easy? Where in
    the spectrum was it?
  • 50:45 - 50:53
    e7p: So with the programming device it was,
    for me, rather difficult because I'm not
  • 50:53 - 51:00
    so much into 90s or 2000s PC architecture,
    so I had to learn some things, and I
  • 51:00 - 51:08
    found out about old custom hardware which
    communicates over PCI. And also with the
  • 51:08 - 51:16
    home monitoring I had to learn some stuff
    about embedded Linux and find out where the
  • 51:16 - 51:24
    network stack is and how it all works, but
    all in all, I think in maybe one month or
  • 51:24 - 51:30
    something like this, I could have done
    it all.
  • 51:30 - 51:36
    Christoph: I mean, it actually depends on
    your existing knowledge. So some stuff was
  • 51:36 - 51:39
    pretty standard, like sniffing some
    busses on the hardware with a logic
  • 51:39 - 51:44
    analyzer. If you have done this in the past, you can
    also do this on these devices, for
  • 51:44 - 51:48
    example. But if you never did this, then
    you have to figure out which pins are for
  • 51:48 - 51:52
    which bus, how can I identify which bus is
    used here? How can I read out the EPROM?
  • 51:52 - 51:57
    How can I read out the memory chips? It
    highly depends on your previous work and
  • 51:57 - 52:06
    previous knowledge. For Endres here it's
    easy. laughs
  • 52:06 - 52:17
    e7p: laughs Maybe not.
    Herald: OK, thank you. I think this
  • 52:17 - 52:24
    concludes the session. Thank you both for
    this interesting presentation. So we'll
  • 52:24 - 52:28
    see how your research develops further.
  • 52:28 - 52:34
    e7p: Thank you. Thanks for being here.
    Christoph: For being remote there. And
  • 52:34 - 52:39
    next time, hopefully live and in person.
  • 52:39 - 52:42
    Herald: Yes, that would be so much better
    than this. Bye bye.
  • 52:42 - 53:03
    postroll music
  • 53:03 - 53:06
    Subtitles created by c3subtitles.de
    in the year 2021. Join, and help us!
Title:
Listen to Your Heart: Security and Privacy of Implantable Cardio Foo
Video Language:
English
Duration:
53:06
