0:00:23.180,0:00:26.470
Preroll music
0:00:26.470,0:00:30.590
Herald: Good evening, everybody. The[br]upcoming talk is titled "Listen to Your
0:00:30.590,0:00:36.150
Heart: Security and Privacy of Implantable[br]Cardio Foo" and will be delivered by e7p,
0:00:36.150,0:00:42.490
who is a Ph.D. student at the Max Planck[br]Institute, and Christoph Saatjohann, who is
0:00:42.490,0:00:47.190
also a Ph.D. student at the University of[br]Applied Sciences Münster where he also
0:00:47.190,0:00:53.540
teaches security in medical devices. This[br]talk will also be translated into German.
0:00:53.540,0:01:00.660
Dieser Vortrag wird auch simultan[br]übersetzt in Deutsch. [This talk will also be simultaneously translated into German.] And that is also the
0:01:00.660,0:01:05.030
extent of my German. I would say, e7p over[br]to you.
0:01:05.030,0:01:09.560
e7p: Yeah, thanks a lot for the nice[br]introduction. I hope you can all hear and
0:01:09.560,0:01:15.330
see us. OK, so yeah, welcome to the talk[br]"Listen to Your Heart: Security and
0:01:15.330,0:01:22.890
Privacy of Implantable Cardio Foo". I'm[br]Endres and as said I'm a Ph.D. student at
0:01:22.890,0:01:27.280
the Max Planck Institute for Security and[br]Privacy. My main topic is embedded
0:01:27.280,0:01:33.900
security. This work is part of a funded[br]project, which is called MediSec. It's a
0:01:33.900,0:01:37.860
cooperation between cyber security[br]researchers in Bochum and Münster, as well
0:01:37.860,0:01:42.630
as medical technology researchers and[br]staff of the University Hospital in
0:01:42.630,0:01:49.650
Münster. And this is funded research. So[br]to start off, we want to give a quick
0:01:49.650,0:01:58.630
motivation for what our topic is about. So[br]we chose these implantable cardiac
0:01:58.630,0:02:05.140
defibrillators or other heart related[br]implantable devices. And there are
0:02:05.140,0:02:11.650
different kinds of these: there are the[br]classical heart pacemakers, which
0:02:11.650,0:02:16.050
every one of you should have heard[br]about already. Then there are also implantable
0:02:16.050,0:02:25.079
defibrillators, which have other[br]applications actually, and there are also
0:02:25.079,0:02:31.000
heart monitors just for diagnosis. And[br]yeah, as these implants are inside your
0:02:31.000,0:02:37.250
body, attacks on them pose a high risk,[br]and they also have communication
0:02:37.250,0:02:44.549
interfaces similar to those of[br]the Internet of Things. Also, we want to
0:02:44.549,0:02:51.170
talk a bit about the ethical arguments. So[br]when asking: Why hack medical devices?
0:02:51.170,0:02:56.579
Well, first, the obvious thing is, yeah,[br]there's a long device lifecycle in the
0:02:56.579,0:03:02.520
medical sector, presumably because of the[br]strong regulations and required
0:03:02.520,0:03:11.780
certifications for medical products. So[br]it's economical to keep these devices on
0:03:02.520,0:03:11.780
the market as long as possible. And for[br]this reason, it's also quite certain that
0:03:11.780,0:03:23.769
there are unpatched bugs from old[br]hardware or software, but the
0:03:23.769,0:03:38.570
manufacturers need to know about the[br]issues to be able to do something about
0:03:38.570,0:03:43.450
them. A disclaimer for affected[br]patients: what we talk about is independent
0:03:43.450,0:03:50.840
of the decision for or against such a[br]device. Because after all, these devices
0:03:58.090,0:04:09.060
can save your life. OK, let's get a bit[br]more into the details. We also want to talk
0:04:09.060,0:04:15.310
shortly about the responsible disclosure[br]process. When we found out some bugs and
0:04:15.310,0:04:22.889
vulnerabilities, we informed all the[br]involved companies at least six months
0:04:22.889,0:04:32.930
ago, by now it's maybe a year or so. The[br]companies took us seriously. They
0:04:32.930,0:04:39.130
acknowledged our results, and both our and[br]their goal is not to worry any affected
0:04:39.130,0:04:46.289
people but to improve the product[br]security. Vulnerable devices are or will
0:04:46.289,0:04:53.099
soon be replaced, or at least the firmware[br]gets updated. And yeah, whenever we do
0:04:53.099,0:04:58.310
independent security research, it helps to[br]keep the quality of the products higher,
0:04:58.310,0:05:06.710
which is in both our and their interest.[br]And one note is, if you ever find out any
0:05:06.710,0:05:12.520
bugs or vulnerabilities in some product[br]please inform the companies first before
0:05:12.520,0:05:19.560
publishing anything about it online or[br]anywhere else. OK, let's get started with
0:05:19.560,0:05:27.080
the topic. First of all, I want to talk[br]about all the devices in the environment
0:05:27.080,0:05:34.580
around the implantable devices. First of[br]all, we have the implants themselves. These
0:05:34.580,0:05:40.919
little devices are not so heavy.[br]They are placed, I think, under the skin
0:05:40.919,0:05:47.150
near the heart. I don't know, I'm not from[br]the medical sector, but yeah, inside the
0:05:47.150,0:05:57.919
body. And they have one or multiple[br]contacts which electrodes connect to. And
0:05:57.919,0:06:04.599
these electrodes connect them to the[br]muscles or the organs. But as there
0:06:04.599,0:06:10.949
is no outside connection for configuring[br]or receiving test data or events,
0:06:10.949,0:06:17.480
there is a programming device[br]which is usually located in the
0:06:17.480,0:06:25.789
hospital or in the heart clinics. Then[br]there's also a home monitoring station,
0:06:25.789,0:06:32.229
which the patient takes home and puts at[br]the bed table, for instance, so it can
0:06:32.229,0:06:37.540
receive all the relevant data from the[br]implant every day and transmit relevant
0:06:37.540,0:06:44.569
data then to the doctor. This does not[br]happen directly, but over the
0:06:44.569,0:06:49.889
manufacturer's infrastructure, and the[br]transmission here is usually
0:06:49.889,0:06:57.860
over the internet. The[br]doctor can then receive all the data over the
0:06:57.860,0:07:06.410
internet again. And yeah, so those are[br]the four big spots where data is
0:07:06.410,0:07:14.010
transmitted to and from. And if we have an[br]attacker here, he could try to attack or
0:07:14.010,0:07:19.340
find vulnerabilities in any of these four[br]devices, as well as their communication
0:07:19.340,0:07:26.770
interfaces and protocols. OK, getting a[br]bit more concrete: in total, there are
0:07:26.770,0:07:34.840
around five major vendors worldwide that[br]develop these implants and other devices
0:07:34.840,0:07:46.449
around them, and we tried to[br]analyze the three on top here. And yeah,
0:07:46.449,0:07:53.050
So we go a bit more in detail what we[br]found out and what we try to analyze here.
0:07:53.050,0:08:01.409
So going back to the implants, I already[br]showed one to you. That's how it may
0:08:01.409,0:08:08.099
look from the inside. Not very easy to[br]see with the camera, I think, but
0:08:08.099,0:08:15.490
there's also some picture in the slides.[br]And yeah, first of all, these implants
0:08:15.490,0:08:20.750
contain the desired functionality. For[br]instance, defibrillator, pacemaker, heart
0:08:20.750,0:08:26.400
recorder. And these features have[br]different requirements. For instance, a
0:08:26.400,0:08:34.190
defibrillator probably needs more power,[br]and so needs a larger battery or a huge
0:08:34.190,0:08:40.910
capacitor, for instance. And a heart[br]monitor doesn't need any of this.
0:08:40.910,0:08:45.400
And of course, all of these need this[br]communication interface, which is realized
0:08:45.400,0:08:54.730
over radio frequency. But sometimes it's[br]only over inductive coupling,
0:08:54.730,0:09:03.029
which is maybe known from RFID. And when[br]looking inside these devices, we see there
0:09:03.029,0:09:09.790
are highly customized parts inside, which[br]means there are unlabeled chips, even
0:09:09.790,0:09:16.149
unpackaged chips that are directly bonded[br]onto the PCBs, so analysis is quite
0:09:16.149,0:09:21.990
difficult. But all of these have in[br]common that there's a small microcomputer
0:09:21.990,0:09:30.880
inside that handles everything and also[br]the communication. Yeah. Then there are
0:09:30.880,0:09:38.610
these home monitoring units, I've got[br]one here, it looks like this, and as I said,
0:09:38.610,0:09:48.310
they sit on the bed table and transmit the[br]data on a daily basis to the doctors. They
0:09:48.310,0:09:54.709
also need some wireless communication[br]interface to the implant. And when they
0:09:54.709,0:09:59.690
got the data, they need to transmit them[br]to the doctor. And this is usually done
0:09:59.690,0:10:08.760
over a machine-to-machine GSM or UMTS[br]network. And then the data is sent to the
0:10:08.760,0:10:14.459
manufacturer server. And compared to the[br]implants these are based on standard
0:10:14.459,0:10:19.850
multipurpose hardware. That means we will[br]often find Linux operating
0:10:19.850,0:10:28.350
systems and lots of debug ports like[br]serial interfaces or USB. So they are easily
0:10:28.350,0:10:34.980
accessible for us. OK. And then we have[br]this hospital programmer. It is used in
0:10:34.980,0:10:41.310
the cardiology clinics and is able to[br]configure the implants and also use test
0:10:41.310,0:10:47.459
modes in the implants. But also, like the[br]home monitoring unit, it can read out the
0:10:47.459,0:10:53.930
stored events or live data. Yeah. So[br]they are in the heart clinic and operated
0:10:53.930,0:10:59.579
by doctors. And usually these are rented[br]to the hospitals or leased from the
0:10:59.579,0:11:05.550
manufacturer. However, we found[br]out that individuals can buy these
0:11:05.550,0:11:13.649
second hand on specialized[br]platforms similar to eBay, I
0:11:13.649,0:11:21.550
would say, for medical devices. And now on[br]to our methodology when analyzing the
0:11:21.550,0:11:28.789
devices. So first of all, we thought about[br]the goals of a potential attacker. First of
0:11:28.789,0:11:39.600
all, he would mainly like to influence the[br]implantable device itself. This can
0:11:39.600,0:11:45.860
mostly be done over[br]the interface that the programming device
0:11:45.860,0:11:54.140
uses, so injecting malicious firmware into[br]the implant could be one goal of the
0:11:54.140,0:11:59.839
potential attacker. Another goal could be[br]to GSM spoof the connection of the home
0:11:59.839,0:12:08.170
monitoring box and then dump some[br]medical data or other privacy-related data.
0:12:08.170,0:12:15.010
And yeah, when looking at the programming[br]device, one could also think about direct
0:12:15.010,0:12:25.190
misuse, for example of the test modes[br]already included in the device. So what
0:12:25.190,0:12:34.259
research questions result from this? So[br]the first question is: What is possible
0:12:34.259,0:12:40.120
only with direct interaction with[br]genuine devices, which means non-
0:12:40.120,0:12:48.980
invasively? And the second question is: How[br]secure are these in general? Like, when
0:12:48.980,0:12:54.600
also invasive attacks are allowed. And the[br]third question is: Can we finally
0:12:54.600,0:13:03.000
understand all communication protocols or[br]is this rather difficult to do? OK. So now
0:13:03.000,0:13:10.430
we look more concretely at these attack[br]vectors, and we do this with the devices
0:13:10.430,0:13:17.199
which we got from our partner hospital.[br]And yeah, to start off, we looked at the
0:13:17.199,0:13:24.370
Biotronik home monitoring unit. And what[br]we did there was to run a rogue GSM cell.
0:13:24.370,0:13:38.019
So we did GSM spoofing with OpenBTS. And[br]this allowed us to intercept data. So this
0:13:38.019,0:13:42.890
data, which we got there, was not[br]encrypted, so we could reveal the access
0:13:42.890,0:13:50.480
credentials. The same credentials could[br]also be found by dumping the
0:13:50.480,0:13:57.279
firmware from the microcontroller, which[br]we did over the JTAG
0:13:57.279,0:14:03.420
interface. This firmware, which we got[br]there, could be reverse engineered with
0:14:03.420,0:14:10.110
Ghidra, which is an open source tool for[br]reverse engineering. And what we found
0:14:10.110,0:14:16.000
there was also an AES cipher[br]implementation, which is mainly used for
0:14:16.000,0:14:23.410
authentication steps. The firmware also[br]contained the credentials and the
0:14:23.410,0:14:31.120
internet domain cm3-homemonitoring.de. But[br]according to the manufacturer, this domain
0:14:31.120,0:14:37.670
is only used as an authentication realm.[br]However, they were kind of surprised when
0:14:37.670,0:14:44.980
we told them that actually they are using[br]the same domain for other services. But I
0:14:44.980,0:14:54.129
hope they won't do this anymore. So, yeah,[br]this should be fine. OK. Next up is the
0:14:54.129,0:14:58.239
Medtronic home monitoring unit, the[br]approach there was similar to the
0:14:58.239,0:15:05.400
Biotronik home monitoring unit. What we[br]found there was that a spoofing attack was
0:15:05.400,0:15:13.000
not quite possible because this Medtronic[br]home monitoring unit uses a machine-to-
0:15:13.000,0:15:21.829
machine SIM card, which is on an intranet,[br]not on the public internet; you can think
0:15:21.829,0:15:30.500
of it like a VPN. So we couldn't get a[br]connection to the original service. And
0:15:30.500,0:15:39.589
however, we found a method documented in[br]a blog post to find out the
0:15:39.589,0:15:45.690
encryption password because the firmware[br]of the device is encrypted. And, yeah,
0:15:45.690,0:15:51.670
it turned out that it was an embedded[br]Linux system, which we could also
0:15:51.670,0:15:58.389
influence when opening up and tearing down[br]the device: taking out, I think it was an
0:15:58.389,0:16:04.279
SD card, and then overwriting some[br]files. And here in the picture, you can
0:16:04.279,0:16:12.199
also see how we could display an image[br]on the screen of the device. This was done
0:16:12.199,0:16:20.720
using some D-Bus messages, because it[br]really was an embedded Linux. So here we've got
0:16:20.720,0:16:28.830
also the server backend addresses in[br]the firmware. But more on that later. The
0:16:28.830,0:16:34.560
third device which we analyzed was this[br]Boston Scientific programming device. You
0:16:34.560,0:16:43.089
can switch the camera so we can see it[br]more clearly here. Yeah. This rather
0:16:43.089,0:16:53.799
huge device we could buy for around 2,000[br]U.S. dollars on an auction platform.
0:16:53.799,0:17:00.000
And we could also tear this down because[br]it's no longer used for real
0:17:00.000,0:17:09.829
productive cases. Inside, we found a[br]hard disk, and there is a
0:17:09.829,0:17:17.640
Linux system on it, which I think is from[br]2002, as in the slides. The device itself
0:17:17.640,0:17:24.360
is an Intel Pentium based device,[br]designed in 2004, and the software is from
0:17:24.360,0:17:34.060
2011. So quite old, right? But yeah, that's,[br]I think, the situation with this
0:17:34.060,0:17:41.440
device. The Linux kernel, sorry, the Linux[br]system in the device also contained a
0:17:41.440,0:17:51.940
window manager, twm, and[br]modifying files or shell
0:17:51.940,0:17:59.080
scripts on the hard disk allowed us to[br]start twm, the window
0:17:59.080,0:18:05.930
manager, on certain actions. From there[br]on, we could simply open up an xterm shell
0:18:05.930,0:18:15.710
and then have root rights. OK. So maybe I[br]will show it in the live demonstration
0:18:15.710,0:18:23.370
later. Also, we found a region lock[br]configuration file, which we could
0:18:23.370,0:18:33.140
alter to allow connecting on the radio[br]frequencies which implants use around the
0:18:33.140,0:18:39.920
world. And as the Linux kernel is so old,[br]further exploits are likely
0:18:39.920,0:18:47.870
possible. But the device is luckily being[br]replaced right now. That's what Boston
0:18:47.870,0:18:57.450
Scientific told us. One nice thing about[br]this device is that we found also a x86
0:18:57.450,0:19:04.620
binary, which is called "checkDongle" on[br]the hard disk and this checkDongle is more
0:19:04.620,0:19:10.880
binary. It just looks for direct[br]connections on the printer port. And when
0:19:10.880,0:19:15.130
reverse engineering the direct[br]connections, we could rebuild a genuine
0:19:15.130,0:19:24.260
dongle. OK. So with this dongle we were[br]then able to also change this RF region
0:19:24.260,0:19:34.030
setting inside the general menu of the[br]device. But additionally, we could boot into
0:19:34.030,0:19:41.010
the integrated firmware upgrade utility[br]over some special USB drive,
0:19:41.010,0:19:51.420
or access the BIOS configuration[br]and boot from different devices. This, of
0:19:51.420,0:19:56.770
course, could leak stored treatment data[br]and personal data of the patients that are
0:19:56.770,0:20:05.450
stored, maybe on the hard disk itself or[br]later on when something is modified. OK,
0:20:05.450,0:20:11.590
so now I have prepared a little live[br]demonstration with this programming device.
0:20:11.590,0:20:18.950
Maybe the guys from the camera operation[br]can switch to the live feed from the
0:20:18.950,0:20:26.210
device itself. We will see what I see[br]here. And first of all, I will quickly
0:20:26.210,0:20:37.250
show how the device itself works and I[br]just put this antenna on one implant and
0:20:37.250,0:20:44.390
then the interrogate button leads to[br]starting the specific software for the
0:20:44.390,0:20:50.990
implant. As you already see, the[br]window manager also started, so when we
0:20:50.990,0:20:56.580
want to, we can start an xterm terminal and,[br]when connected to a keyboard, we can also
0:20:56.580,0:21:08.370
type in something and we are root. Also in[br]this standard interface now we can access
0:21:08.370,0:21:13.521
some test modes or settings of the[br]implant, but I'm not really into it, so
0:21:13.521,0:21:22.060
let's skip this part of setting or doing[br]some stuff here. But what else we can do
0:21:22.060,0:21:31.790
now is to use this security dongle. I just plug[br]it in and then start the device again with an
0:21:31.790,0:21:44.710
attached flash drive. Then it does the[br]normal BIOS POST. And when this is done, it
0:21:44.710,0:21:50.120
simply boots from this USB flash drive.[br]One special thing about this flash drive
0:21:50.120,0:21:58.310
is I had to find one which supports USB[br]1.1 because the hardware is so old. But
0:21:58.310,0:22:06.850
finally, I got it working to boot from[br]this USB drive. And after some while, when
0:22:06.850,0:22:13.990
loading all the data, you see, it simply[br]starts a FreeDOS operating system and then
0:22:13.990,0:22:22.600
starts Doom. Now we can simply play Doom[br]on this programming device from a hospital,
0:22:22.600,0:22:31.130
so quite interesting, right? OK, I think[br]you can switch back to the slides, please.
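As an aside, the parallel-port dongle check described earlier follows a classic pattern that can be sketched in a few lines. Everything below is hypothetical: the real pin wiring of the genuine dongle is not public, so the loopback map is an invented example of how such a check typically works (drive the data pins, read the pattern back on the status pins through the dongle's hard-wired connections).

```python
# Hypothetical sketch of a parallel-port dongle check, in the spirit of
# the checkDongle binary described in the talk: write patterns to the
# data pins and expect them looped back on the status pins by hard-wired
# connections in the dongle. The wiring map is invented for illustration.

# Invented wiring: data-register bit -> status-register bit
LOOPBACK = {0: 3, 1: 4, 2: 5, 3: 6}

class FakeDongle:
    """Simulates a dongle that loops data-register bits to status bits.
    On real hardware these would be outb/inb on the printer port."""
    def __init__(self, wiring):
        self.wiring = wiring
        self.data = 0

    def write_data(self, value):
        self.data = value

    def read_status(self):
        status = 0
        for d, s in self.wiring.items():
            if self.data & (1 << d):
                status |= 1 << s
        return status

def check_dongle(port, wiring=LOOPBACK):
    """Walk a single 1-bit over the data pins, verifying each loopback."""
    for d, s in wiring.items():
        port.write_data(1 << d)
        if port.read_status() != (1 << s):
            return False
    port.write_data(0)
    return port.read_status() == 0

print(check_dongle(FakeDongle(LOOPBACK)))   # correctly wired dongle -> True
print(check_dongle(FakeDongle({0: 3})))     # wrong wiring -> False
```

Once the expected wiring is known, soldering those connections into a plug is enough to pass the check, which is why such dongles offer little real protection.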
0:22:31.130,0:22:41.680
OK. So now that was this programming[br]computer. What is still missing is the
0:22:41.680,0:22:48.640
server infrastructure between the home[br]monitoring and the doctor. First of all,
0:22:48.640,0:22:55.830
we looked at the home monitoring access to[br]the manufacturer and when looking at the
0:22:55.830,0:23:03.140
credentials, or rather the HTTP domain or[br]IP address, I don't know, in the home
0:23:03.140,0:23:09.920
monitoring system of Medtronic, we were[br]able to access the HTTP web server to
0:23:09.920,0:23:16.360
which the data is transmitted, I think[br]using a POST request. However,
0:23:16.360,0:23:25.510
whatever we sent to the server resulted in[br]a blank page with the status code 200. So
0:23:25.510,0:23:29.710
no matter what we sent, right? We could[br]also send some really, really incorrect
0:23:29.710,0:23:38.030
data. But it doesn't matter, it just returns[br]this blank page. So this seems to be a
0:23:38.030,0:23:46.200
measure against misuse. And maybe[br]it's not so bad like this. However, I
0:23:46.200,0:23:52.950
don't know if we looked for any encrypted[br]stuff there. Probably it's only TLS
0:23:52.950,0:24:01.340
encrypted or something. OK, and then the[br]doctor also gets the data from the
0:24:01.340,0:24:07.430
manufacturer server. So this is also[br]usually done over a Web interface, which
0:24:07.430,0:24:13.890
we learned from our partnering hospital.[br]And when looking around there, we thought
0:24:13.890,0:24:19.980
it's not that bad because there's a[br]typical login username password
0:24:19.980,0:24:27.610
authentication included. And then we[br]stopped there because these are productive
0:24:27.610,0:24:32.760
systems and we wouldn't want to do some[br]SQL injections or something like this
0:24:32.760,0:24:39.470
because it's a really productive system[br]and probably life-critical monitoring is
0:24:39.470,0:24:45.390
running there. So we didn't want to risk[br]anything. Right. So better stop there and
0:24:45.390,0:24:56.050
let it be. OK. But from the first look,[br]it looked quite okayish. OK, so a quick
0:24:56.050,0:25:02.410
summary about all these findings on the[br]technical aspect. There are several
0:25:02.410,0:25:10.090
security vulnerabilities in different[br]devices. Yeah, sure, patients could be
0:25:10.090,0:25:16.110
harmed if therapy-relevant data were[br]manipulated. However, usually there is a
0:25:16.110,0:25:21.990
doctor in between. So whenever the doctor[br]gets some information that something's
0:25:21.990,0:25:29.420
wrong or something like this, he would[br]probably look and find out what is wrong.
0:25:29.420,0:25:37.300
And yeah, we also found out that it's[br]possible to access medical devices. So,
0:25:37.300,0:25:43.110
yeah, we got this programming computer for[br]2,000 U.S. dollars, which clearly shows
0:25:43.110,0:25:51.540
that it's maybe not good practice to rely on[br]simply renting or leasing these devices, but
0:25:51.540,0:26:00.491
they should rather be designed as secure as[br]possible. And now some countermeasures, what can
0:26:00.491,0:26:08.650
be done to make it better? First of all,[br]regular software update maintenance could
0:26:08.650,0:26:14.390
resolve most of these issues. Also, it[br]would be nice to include some medical
0:26:14.390,0:26:19.010
professionals in the product engineering[br]phase, because some test modes maybe
0:26:19.010,0:26:25.040
aren't that relevant when the implant is[br]finally inserted in the body after
0:26:25.040,0:26:33.800
surgery, so then nobody needs these test[br]modes anymore, for example. And last of
0:26:33.800,0:26:42.130
all, but not least: please make use of[br]state-of-the-art cryptography and PKIs and
0:26:42.130,0:26:49.120
maybe also open protocols to improve the[br]security and develop something that is as
0:26:49.120,0:26:55.030
secure as it gets. OK, so this is, I[br]think, the technical part, and I would
0:26:55.030,0:27:00.751
like to hand over to Christoph, who will[br]tell us something about the GDPR request
0:27:00.751,0:27:07.910
and responses and nightmares.[br]Christoph Saatjohann: Yeah, thank you. So
0:27:07.910,0:27:11.030
my name is Christoph Saatjohann from[br]Münster University of Applied Sciences,
0:27:11.030,0:27:16.031
and I will tell you something about the[br]privacy part, so privacy stuff, because
0:27:16.031,0:27:20.130
as you already heard, there is a lot of[br]data included in this complete ecosystem.
0:27:20.130,0:27:24.130
So there is some data flowing from the[br]implantable device to the home monitoring
0:27:24.130,0:27:28.920
service and then going further over the[br]internet to the manufacturer of the devices.
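The data path just described can be captured in a tiny model. The station names are our own labels; the path simply mirrors the talk's description (implant to home monitoring unit to manufacturer server to doctor's web interface).

```python
# A minimal model of the data path described in the talk. Every station
# handles the patient data, and every hop is a link an attacker could
# target. Station names are our own labels, not vendor terminology.

PATH = ["implant", "home monitoring unit", "manufacturer server", "doctor"]

def hops(path):
    """Pair up adjacent stations: the communication links on the way."""
    return list(zip(path, path[1:]))

def parties_processing_data(path):
    """Everyone on the path is in scope for a GDPR inquiry about
    processing purposes and recipients."""
    return set(path)

print(hops(PATH))   # three links between the four stations
```

This is also why the second half of the talk asks all parties on the path, not just the implant vendor, what they do with the data.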
0:27:28.920,0:27:35.520
Now my question was, OK, what can we do[br]here? How can we take a look into the data
0:27:35.520,0:27:39.110
processing here? How can we look into the[br]processes of the company? What would they
0:27:39.110,0:27:44.590
do with our data or with the patient data?[br]And we used the GDPR for this, so the GDPR
0:27:44.590,0:27:49.540
is the General Data Protection Regulation,[br]it was put in force in 2018. So it's not
0:27:49.540,0:27:54.460
so new. During our study it was two or[br]three years old. So we thought the
0:27:54.460,0:28:02.631
companies are already prepared[br]for such stuff. With the GDPR, the user, in our
0:28:02.631,0:28:07.380
case, or the patient, can obtain some[br]information about the processed data and
0:28:07.380,0:28:12.780
with the Article 15 of the GDPR the[br]patient can ask about the purpose of the
0:28:12.780,0:28:17.630
processing, the categories of the data and[br]the recipients. So, for example, some
0:28:17.630,0:28:23.580
subcontractors who will get the data from[br]the patient and compute something,
0:28:23.580,0:28:29.420
for example convert it to some PDF or put it[br]on the web interface for the doctors. So
0:28:29.420,0:28:35.280
those are some typical tasks[br]for subcontractors, so some other
0:28:35.280,0:28:41.270
recipients who will get the data. With[br]Article 20, it is possible to get a copy
0:28:41.270,0:28:46.320
of the data itself. The patient could ask[br]the company: Yeah, please give me a copy
0:28:46.320,0:28:50.290
of all my own data. I want to look into it[br]or I want to move it to a different
0:28:50.290,0:28:57.141
company. This moving from one[br]company to a different company is
0:28:57.141,0:29:03.860
called data portability, and the data must be[br]provided in a commonly used, machine
0:29:03.860,0:29:08.950
readable format and machine readable[br]format does not mean PDF, for example. For
0:29:08.950,0:29:17.490
our topic here, for the measurement data,[br]commonly used formats may be DICOM or HL7
0:29:17.490,0:29:24.350
and not PDF. The GDPR also defines the[br]maximum answer time. So every request from
0:29:24.350,0:29:29.920
a customer should be answered in a maximum[br]of four weeks. If it's a real complex
0:29:29.920,0:29:34.420
thing, a complex request or something like[br]this, it might be extended up to three
0:29:34.420,0:29:39.670
months in total, but the customer has to[br]be informed about the extension and also
0:29:39.670,0:29:46.150
the reasons for the extension. The last[br]point: the GDPR defines two
0:29:46.150,0:29:50.460
important roles for the following part of[br]the talk here. First is the data
0:29:50.460,0:29:55.090
controller. That means the data controller[br]is responsible for the complete data
0:29:55.090,0:29:59.860
process. He might want to share this data[br]with some other recipients or
0:29:59.860,0:30:05.590
subcontractors or subsidiaries of the[br]company. And then the other recipient is
0:30:05.590,0:30:11.690
called the data processor and he processes[br]the data. But responsible for this process
0:30:11.690,0:30:15.660
is the data controller. So the important[br]thing here, the data controller is
0:30:15.660,0:30:22.180
responsible. Whatever happens, he[br]has to answer the request. So
0:30:22.180,0:30:31.170
with these GDPR methods here, we[br]thought about: What can we do? And our
0:30:31.170,0:30:35.730
plan was that we recruited some patients[br]with implanted devices and we sent
0:30:35.730,0:30:41.661
some GDPR inquiries in their names to the[br]companies. So we told the company: OK, we
0:30:41.661,0:30:46.270
are patient xy and we want to know[br]something about our data and we want to
0:30:46.270,0:30:51.380
have a copy of our own data. And of[br]course, now you can argue, OK, now we are
0:30:51.380,0:30:57.331
handling some very sensitive medical data[br]here, so we have to keep our study here,
0:30:57.331,0:31:02.080
our case study itself GDPR compliant. So[br]we talked to our data protection officer,
0:31:02.080,0:31:07.370
told him our study design. We set up some[br]contracts with the patients so that we are
0:31:07.370,0:31:15.040
ourselves GDPR compliant. Hopefully, that[br]worked out; so far we haven't been sued. So
0:31:15.040,0:31:20.630
I think that worked out. At the end we[br]were waiting for the answers of the
0:31:20.630,0:31:24.410
companies and the hospitals and, of course,[br]analyzed the results. So we looked at the
0:31:24.410,0:31:29.800
completeness, we thought about: Is this[br]dataset complete? Or do the
0:31:29.800,0:31:33.610
companies have some other data which was not[br]provided? We also looked at the data
0:31:33.610,0:31:40.180
security, especially: How is the data[br]transmitted? Do we get this via plain text
0:31:40.180,0:31:46.590
email, perhaps as ZIP files or some CD-[br]ROMs? So we looked at this process here,
0:31:46.590,0:31:50.340
and of course, we looked at the time of the[br]answer. So remember, four weeks is the
0:31:50.340,0:31:55.220
maximum time. In some cases, perhaps three[br]months, but standard would be four weeks.
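The deadline arithmetic just described is simple enough to sketch. The four-week and three-month figures are the ones from the talk; treating "three months" as 13 weeks is our own simplification for the example.

```python
from datetime import date, timedelta

# Sketch of the GDPR answer deadlines as described in the talk: a request
# must normally be answered within four weeks; for complex requests this
# may be extended to three months in total, but only if the customer is
# informed about the extension and its reasons.

def answer_deadline(request_date, complex_request=False, customer_informed=False):
    if complex_request:
        if not customer_informed:
            raise ValueError("extension requires informing the customer")
        return request_date + timedelta(weeks=13)   # roughly three months
    return request_date + timedelta(weeks=4)

d = date(2020, 1, 1)
print(answer_deadline(d))                              # 2020-01-29
print(answer_deadline(d, complex_request=True,
                      customer_informed=True))         # 2020-04-01
```

Measured against this yardstick, the two-month Biotronik answer described below was already out of bounds.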
0:31:55.220,0:32:01.850
And of course, if required, we sent some[br]follow up queries. Yes. And, as already
0:32:01.850,0:32:06.540
said, we are responsible researchers[br]here, so we also did this responsible
0:32:06.540,0:32:10.830
disclosure stuff. So we talked to the[br]companies and discussed some methods, some
0:32:10.830,0:32:16.580
process improvements, what can they do or[br]at least what should they do or must do to
0:32:16.580,0:32:23.760
be GDPR compliant. So let's take a look at[br]the results here, what we got from our
0:32:23.760,0:32:29.670
case study. The first vendor was Biotronik,[br]and we sent the first inquiry to a
0:32:29.670,0:32:35.570
Biotronik subsidiary, but we learned that[br]we had a wrong contact. We had just taken a
0:32:35.570,0:32:39.830
data privacy contact from some documents[br]from the hospital. But they wrote back:
0:32:39.830,0:32:43.270
Ah, sorry, we are the wrong company, we're[br]just the sales company of Biotronik.
0:32:43.270,0:32:47.690
Please refer to the other company.[br]Then we wrote a second letter to the
0:32:47.690,0:32:51.401
other company. And we got an answer[br]after two months, and, now remember, four
0:32:51.401,0:32:56.460
weeks is the limit, not two months, so it was[br]delayed, sure. But the answer itself was
0:32:56.460,0:33:02.200
also a bit unsatisfying for us because[br]Biotronik told us: The device was never
0:33:02.200,0:33:06.600
connected to any home monitoring system[br]here, so no personal data is stored at
0:33:06.600,0:33:11.710
Biotronik. We asked the patient: Did you[br]ever have one of these home monitoring
0:33:11.710,0:33:17.200
devices? And he told us: No, I never got[br]this device. So this is a classic example
0:33:17.200,0:33:22.340
of a good study design or, in this case, a[br]bad study design. So first of all, get
0:33:22.340,0:33:29.280
your contacts right. And secondly, choose[br]good participants for your study. So this
0:33:29.280,0:33:36.010
might be a future work item here: perhaps[br]choose a different patient.
0:33:36.010,0:33:41.120
The next company we wrote to was Medtronic.[br]You already know it from the other devices
0:33:41.120,0:33:47.451
here. And the answer was that we had to[br]send a copy of the ID card, so they wanted
0:33:47.451,0:33:54.610
to have an identification verification.[br]The GDPR does not define a really strict
0:33:54.610,0:33:59.350
method for the verification or when it is[br]required, when it's mandatory or not,
0:33:59.350,0:34:06.230
but the GDPR says it is possible in[br]some cases, and we are
0:34:06.230,0:34:10.000
dealing here with very sensitive medical[br]personal data. We think this is totally
0:34:10.000,0:34:14.250
fine. So identity verification is[br]fine for us, and we think it is a good
0:34:14.250,0:34:20.000
thing that they really check that[br]we are the person we claim
0:34:20.000,0:34:27.110
to be. They also recommended using the[br]Medtronic secure email system, and first
0:34:27.110,0:34:32.030
of all, we had a good impression because[br]it's much better than plain text
0:34:32.030,0:34:36.169
email. And if they host a secure[br]email system on their own servers, we said:
0:34:36.169,0:34:39.779
OK, that's a good idea, right? We have a[br]TLS-secured connection here. Looks
0:34:39.779,0:34:44.540
perfectly fine. Let me send some test[br]emails. But we saw in the email headers
0:34:44.540,0:34:50.780
that the email is routed through US servers[br]of the Proofpoint company.
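The routing the speakers spotted can be checked by anyone: each mail server that handles a message prepends a "Received" header, so reading these headers bottom-up reveals the path a mail took. A minimal sketch in Python; the message below is fabricated for illustration, not real Medtronic or Proofpoint mail data:

```python
# Sketch: follow the relay path recorded in an email's "Received" headers,
# the way the speakers noticed their mail passing through US servers.
# The raw message is invented example data.
from email import message_from_string

RAW = """\
Received: from mx1.us-filter.example.com (mx1.us-filter.example.com [203.0.113.7])
\tby mail.recipient.example.de; Mon, 27 Dec 2021 12:00:02 +0000
Received: from smtp.sender.example.com (smtp.sender.example.com [198.51.100.3])
\tby mx1.us-filter.example.com; Mon, 27 Dec 2021 12:00:01 +0000
Subject: test

hello
"""

def relay_hosts(raw_message):
    """Return the relay hostnames in delivery order (oldest hop first)."""
    msg = message_from_string(raw_message)
    hosts = []
    # Each handling server prepends its own Received header, so the newest
    # hop is listed first; reverse to follow the actual delivery path.
    for hop in reversed(msg.get_all("Received", [])):
        tokens = hop.strip().split()
        if len(tokens) >= 2 and tokens[0] == "from":
            hosts.append(tokens[1])
    return hosts

for host in relay_hosts(RAW):
    print("hop:", host)
```

A hop whose hostname resolves to a non-EU provider is exactly the kind of detail the speakers describe finding.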
0:34:50.780,0:34:55.399
And here we would say, OK, that's not[br]really good because normally if I'm a
0:34:55.399,0:35:00.540
German customer or a European customer and[br]sending some GDPR request to the
0:35:00.540,0:35:07.880
Medtronic Germany or any other EU[br]Medtronic subsidiary, I'm not aware
0:35:07.880,0:35:13.289
that my email will be[br]routed through the US. And also for the
0:35:13.289,0:35:19.559
GDPR compliance we are not sure if this is[br]actually allowed because there are some
0:35:19.559,0:35:23.510
discussions about the Safe Harbor ruling.[br]So that might not really be GDPR
0:35:23.510,0:35:28.710
compliant. At the very least it's not good for[br]the user. It's a bad user experience.
0:35:28.710,0:35:36.520
But OK, we used it anyway, because[br]we think such a platform is
0:35:36.520,0:35:40.700
better than plaintext email, so we sent[br]our inquiry via the system. And the next
0:35:40.700,0:35:45.089
point was really surprising to us. So we[br]were waiting for the results. And I looked
0:35:45.089,0:35:50.900
into my, yeah, we had created a free GMX web[br]email account for this
0:35:50.900,0:35:56.380
communication, and suddenly I got an email[br]there, in the GMX inbox. So the
0:35:56.380,0:36:00.670
response from Medtronic was not sent via[br]the secure channel. It was just sent as
0:36:00.670,0:36:05.880
plaintext email, and we said: OK, what's[br]the point here? You recommend we use
0:36:05.880,0:36:10.010
the secure system, but then use plaintext[br]email yourselves. This is a thing they really have
0:36:10.010,0:36:17.390
to change, for sure. Then it went back[br]and forth a bit: we wrote some emails and they
0:36:17.390,0:36:21.630
wanted some more information about[br]us: what device we have, what serial
0:36:21.630,0:36:28.109
number, and what services we use. And in[br]the end, we got a doc file, a standard
0:36:28.109,0:36:32.809
Word file as an email attachment that we[br]should fill with some information and
0:36:32.809,0:36:38.020
send it back. And from a security point of[br]view, from our security researcher side
0:36:38.020,0:36:43.250
here, we would say that's not the best way[br]to do it, because doc files as email
0:36:43.250,0:36:48.390
attachments are a classic vector for[br]ransomware or phishing emails. So we did
0:36:48.390,0:36:53.670
this to get the final data and answer,[br]but we would propose changing the
0:36:53.670,0:36:59.320
system here. With that done, we thought: OK,[br]now we will really get some data.
0:36:59.320,0:37:04.430
So that's the point[br]where we expected the really good stuff.
0:37:04.430,0:37:09.099
But in the end, after this back and forth[br]and creating accounts on various
0:37:09.099,0:37:13.000
systems, they just stated: Oh, we are[br]the wrong contact. The hospital is
0:37:13.000,0:37:17.980
responsible, we are just the data processor[br]in this case. And of course, this was a
0:37:17.980,0:37:23.010
bit unsatisfying because we thought: OK,[br]we are now close to getting the data.
0:37:23.010,0:37:30.869
But it never happened. Yeah, so,[br]analyzing this: in the end, it might be GDPR
0:37:30.869,0:37:36.319
compliant. We have no insight into[br]the relationship between Medtronic and
0:37:36.319,0:37:41.069
the hospital. So it might be that they[br]have an agreement: Who is the controller,
0:37:41.069,0:37:45.260
who is responsible? We couldn't check that[br]as a patient, but of course, the user
0:37:45.260,0:37:49.490
experience is not good because you write so[br]many emails and in the end you get
0:37:49.490,0:37:56.840
nothing. The third company is Boston[br]Scientific. You know Boston Scientific
0:37:56.840,0:38:05.620
from the nice Doom device here. And we[br]sent an inquiry to BSC, and we got a
0:38:05.620,0:38:09.569
response that they want to verify[br]our identity, so the same
0:38:09.569,0:38:15.400
as Medtronic. So we said, yeah, that's[br]fine, sounds legit. They also said, yeah,
0:38:15.400,0:38:19.779
you can use plaintext email. Just send an[br]email to this email address or you can use
0:38:19.779,0:38:25.530
our online tool. Our patient chose[br]email. From the security side, we would
0:38:25.530,0:38:31.980
have used the secure platform.[br]But I can totally understand the patient
0:38:31.980,0:38:37.640
because it was a hard copy letter, a[br]real letter by snail mail. And he
0:38:37.640,0:38:43.339
had to type this really long link, a[br]long URL with random characters in it. And if you
0:38:43.339,0:38:47.920
type one character wrong, you have to do[br]it again and so on. And from the customer
0:38:47.920,0:38:52.299
point of view, the user experience is[br]also very bad here. No one wants to
0:38:52.299,0:38:57.460
type this into their browser and check[br]whether it's correct. So Boston
0:38:57.460,0:39:02.460
Scientific should use a different system[br]here, some short link system or just have
0:39:02.460,0:39:11.260
a short domain and a very short access[br]code, anything better than this one
0:39:11.260,0:39:16.349
here. Then we got an email. It was[br]a plaintext email, so not good, of
0:39:16.349,0:39:21.090
course; medical data via plaintext email is[br]not good. Now some may argue: OK, but
0:39:21.090,0:39:27.750
our patient started it, our patient[br]wrote a plaintext email first. But
0:39:27.750,0:39:32.589
the common understanding of the GDPR is[br]that even if the customer asks via
0:39:32.589,0:39:37.569
plaintext email, the company cannot[br]respond in plaintext. They have to
0:39:37.569,0:39:42.520
do something special, something secure. So[br]this is also a thing Boston Scientific
0:39:42.520,0:39:47.009
should surely change. But hey, we got seven PDF[br]reports. Our first data in this
0:39:47.009,0:39:52.279
case study after I don't know how many[br]emails and letters, but we got some data.
0:39:52.279,0:39:57.119
Then we thought: OK, seven PDF reports, and[br]the device has been active for three years. That
0:39:57.119,0:40:02.950
doesn't sound right, that seems a bit[br]little for three years. So we
0:40:02.950,0:40:08.450
contacted the doctor, of course, with the[br]consent of the patient, and the doctor
0:40:08.450,0:40:12.660
looked into this online portal and saw a[br]lot of data: a lot of raw data,
0:40:12.660,0:40:18.069
a lot of PDF reports and graphs, full[br]of information. And so we thought: OK,
0:40:18.069,0:40:22.440
that's not enough. We got seven reports,[br]but the system is full of other data.
0:40:22.440,0:40:27.960
Of course, we went back to BSC and told them: OK,[br]we want all the data. BSC
0:40:27.960,0:40:35.019
apologized: OK, we didn't look[br]into the home monitoring system, but you
0:40:35.019,0:40:41.119
can have the data, but we need two extra[br]months. So, as I said in the introduction,
0:40:41.119,0:40:47.390
that might be OK if it's a really complex[br]request. My
0:40:47.390,0:40:53.019
feeling is that[br]they had to implement some export
0:40:53.019,0:40:57.309
mechanism to fulfill our request. But OK,[br]if they need two extra months and we get
0:40:57.309,0:41:04.049
the data, we are fine with that. Yeah,[br]and now the final data. Within this extended
0:41:04.049,0:41:08.980
deadline, so they met the[br]deadline of three months, we got all
0:41:08.980,0:41:13.180
the data. And by all the data, I mean[br]really, we got a ton of it. It was a large
0:41:13.180,0:41:19.170
zip file with a lot of HL7 data. So it's a[br]raw, digital, machine-readable format. And
0:41:19.170,0:41:23.450
we got some episode data, also digital, as[br]an Excel sheet. And we were now really
0:41:23.450,0:41:28.960
happy because this was really satisfying:[br]the patient, or in this case we, got all
0:41:28.960,0:41:35.559
the data, fully GDPR-compliant.[br]So that was the first and only vendor
0:41:35.559,0:41:41.590
where we really got our GDPR request[br]fulfilled. Yeah, but as a last point, we have
0:41:41.590,0:41:48.240
to mention the security. The data was not[br]sent directly by email, we just got
0:41:48.240,0:41:52.369
the download link via email. And from the[br]security perspective, that's more or less
0:41:52.369,0:41:56.890
the same. If a man in the[br]middle can read the plaintext email, he
0:41:56.890,0:42:02.190
can also click on the link in the download[br]email. So, OK, we got the data but the
0:42:02.190,0:42:10.059
process here, regarding security, must be[br]improved. We also had one hospital patient
0:42:10.059,0:42:17.669
and we sent one inquiry to a hospital.[br]There were some things we were not
0:42:17.669,0:42:22.099
informed of, not aware of:[br]the hospital that did the implantation of
0:42:22.099,0:42:28.059
the device, five years before our inquiry,[br]had gone bankrupt, and we were told
0:42:28.059,0:42:33.290
that we have to contact a person[br]at the old owner of the hospital, so the
0:42:33.290,0:42:38.390
bankrupt company. We also think, yeah,[br]we also think that this
0:42:38.390,0:42:42.750
might not be correct because the[br]explantation of the device was done during
0:42:42.750,0:42:49.049
the time when the hospital was under the[br]control of the new owner, which we
0:42:49.049,0:42:54.010
contacted. So there might be some data from[br]the explantation. But we also wrote a
0:42:54.010,0:43:00.240
letter to the old company, and we got an[br]answer, but also after two months, so
0:43:00.240,0:43:08.200
again, here we had to wait two months, a[br]delay; the GDPR time frame was not met.
0:43:08.200,0:43:12.220
And the final answer: we really[br]did get some data, as you can see
0:43:12.220,0:43:16.749
here, this handwritten stuff here, but we[br]were missing, for example, the surgery
0:43:16.749,0:43:24.309
report. Normally a hospital has to do a[br]lot of documentation for surgery, but we
0:43:24.309,0:43:31.170
didn't get this information here, so we[br]just got a part of it. But in summary, I
0:43:31.170,0:43:35.170
won't go into every point here, but you[br]can see a lot of red: we had
0:43:35.170,0:43:40.369
plaintext data via email, which is not[br]correct and, legally, not GDPR
0:43:40.369,0:43:45.390
compliant. We had some missed deadlines.[br]We had incomplete data, or in the end
0:43:45.390,0:43:49.930
it was complete, but we had to ask a[br]lot and had to ask a doctor to
0:43:49.930,0:43:55.720
double-check whether the data was correct, and we[br]often needed more than one request. And
0:43:55.720,0:44:00.299
finally, for the patient: if you want[br]your data, if you want to exercise
0:44:00.299,0:44:03.960
your right, it's a really hard way, as you[br]can see here. And you need a lot of
0:44:03.960,0:44:08.980
emails, a lot of time for this. And[br]sometimes it's not really possible because
0:44:08.980,0:44:12.939
they say: OK, just go to the hospital, go[br]to some other party. We are not
0:44:12.939,0:44:20.320
responsible for this. So it's not a good[br]user experience here. And our
0:44:20.320,0:44:23.589
recommendation for the companies, and[br]also for the hospitals, is: be
0:44:23.589,0:44:30.390
prepared for such requests, because the GDPR has[br]now been in force for more than three years. And
0:44:30.390,0:44:36.779
sooner or later, they will[br]get such requests, and perhaps not from
0:44:36.779,0:44:40.420
friendly researchers but from real[br]patients. And, yeah, if the patients
0:44:40.420,0:44:44.619
don't get an answer, they can go to the[br]local authorities, and the local
0:44:44.619,0:44:49.879
authorities can really take action and[br]punish the company with a large
0:44:49.879,0:44:57.330
fine. So our recommendation: be prepared[br]for such requests. And with this slide, I
0:44:57.330,0:45:01.080
want to thank you for your attention and[br]want to close the talk, and we are happy
0:45:01.080,0:45:08.079
to answer some questions in the Q&A[br]session. Thanks.
0:45:08.079,0:45:13.799
Herald: Thank you very much. I have a[br]bunch of questions from the internet.
0:45:13.799,0:45:22.490
First one is, did you analyze binaries[br]only or did you have any access to source
0:45:22.490,0:45:26.259
codes? And did you ask any[br]manufacturers...
0:45:26.259,0:45:35.140
e7p: Please repeat the question. I didn't[br]get it because of...
0:45:35.140,0:45:42.740
Herald: Some technician, please mute the[br]room. I can also (inaudible) Ah, the
0:45:42.740,0:45:50.190
question is: Did you analyze binaries only[br]or did you obtain a degree of access to
0:45:50.190,0:46:00.480
the source codes?[br]e7p: I'm not sure if the check dongle is
0:46:00.480,0:46:06.180
meant but for this one, it was very small[br]and we could analyze it easily using
0:46:06.180,0:46:13.249
Ghidra to decompile it and then just see[br]which data needs to be at which position on
0:46:13.249,0:46:17.760
the parallel port. If that was the[br]question. I think the other binaries from
0:46:17.760,0:46:27.109
the home monitoring system or, if you[br]could be more concrete... From the home
0:46:27.109,0:46:32.559
monitoring systems, we mainly just had a[br]look for strings, for some access
0:46:32.559,0:46:41.609
credentials or domain names. And I think[br]we did not do that much decompilation on
0:46:41.609,0:46:49.190
the other stuff. But the whole software of[br]the programming computer is in obfuscated
0:46:49.190,0:46:57.960
Java. And this is, I don't know,[br]just-in-time compiled obfuscated Java, and
0:46:57.960,0:47:04.339
I didn't find a good way to deal with[br]this. Other questions?
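The string triage e7p describes, scanning a firmware dump for printable runs that look like credentials or domain names, works much like the Unix `strings` tool piped into grep. A minimal Python sketch; the blob, keyword list, and function names below are invented for illustration:

```python
# Sketch of the string-triage step: extract printable ASCII runs from a
# firmware blob and flag ones hinting at credentials or network endpoints.
# The sample blob and keywords are fabricated example data.
import re

def extract_strings(blob, min_len=4):
    """Return runs of printable ASCII of at least min_len bytes."""
    return [m.group().decode("ascii")
            for m in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, blob)]

def interesting(strings):
    """Keep strings that hint at access credentials or domains."""
    keywords = ("passw", "user", "key", "http", ".com", ".de")
    return [s for s in strings if any(k in s.lower() for k in keywords)]

# Fabricated blob standing in for a home-monitoring firmware dump.
blob = (b"\x00\x01boot\x00admin_password=hunter2"
        b"\xff\x10update.vendor.example.com\x00\x02\x03")

for s in interesting(extract_strings(blob)):
    print(s)
```

On a real dump, the flagged strings are only leads; each hit still has to be confirmed in context, as with any `strings`-based triage.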
0:47:04.339,0:47:15.760
Herald: Thank you. Another question was:[br]How many CVEs did you file from this
0:47:15.760,0:47:24.410
research?[br]e7p: So I'm not sure, but some of these
0:47:24.410,0:47:30.079
findings had already been found by others, and[br]we just didn't realize they were already
0:47:30.079,0:47:37.200
reported as CVEs or in CISA reports. But I[br]think one, two or three.
0:47:37.200,0:47:43.800
Christoph: Yes, there were some CVEs and[br]also our results were linked to some other
0:47:43.800,0:47:49.849
CVEs which were already published. The[br]company did some other internal research
0:47:49.849,0:47:55.070
and that internal research got some CVEs.[br]But if you look into a CVE, you cannot
0:47:55.070,0:48:01.320
always trace it back to the actual[br]vulnerability. But at least for the
0:48:01.320,0:48:05.970
programmer here, for the Boston Scientific[br]programmer, there was a CISA advisory, and
0:48:05.970,0:48:11.440
a few months back, I think in September or[br]just end of August, there was a CISA
0:48:11.440,0:48:16.630
advisory for this programmer and it[br]listed, I don't know, four or five CVEs.
0:48:16.630,0:48:21.660
Yeah, and it will be replaced in the coming[br]months, hopefully, because it's also
0:48:21.660,0:48:27.239
pretty old. It's the old generation,[br]and the hospitals have to use the newer
0:48:27.239,0:48:38.920
generation in the next months.[br]Herald: You mentioned the cooperativeness
0:48:38.920,0:48:47.549
of the manufacturers on the subject access[br]requests, but how cooperative were they on
0:48:47.549,0:48:51.940
the technical side, when you tried to[br]report and disclose?
0:48:51.940,0:48:59.290
e7p: Yeah, actually, they were quite[br]cooperative when we simply wrote: Hey, we
0:48:59.290,0:49:04.940
found this and that. We wrote it to them,[br]I think, first to the press address or
0:49:04.940,0:49:13.289
something. And then we were redirected to[br]the internal security group, or, you could
0:49:13.289,0:49:20.119
say, the experts. And then we had, I[br]think, Zoom meetings with them and it was
0:49:20.119,0:49:25.660
a good cooperation, I would say.[br]Christoph: Really, I think it was a good
0:49:25.660,0:49:31.220
communication at eye level here, so[br]the goal for everyone was,
0:49:31.220,0:49:36.749
of course, not to endanger the patients,[br]but to make the product more secure. I think
0:49:36.749,0:49:41.420
it is also partly down to regulation, like[br]CISA in the US. If they have
0:49:41.420,0:49:45.900
vulnerabilities, they have to fix them, so[br]they also get some pressure from the
0:49:45.900,0:49:51.799
regulations and they really want to change[br]some things. So that's my impression and
0:49:51.799,0:49:56.150
the discussions were really well[br]organized, well structured with a lot of
0:49:56.150,0:49:59.730
people who were really deep into the[br]topic. So we asked some questions and we
0:49:59.730,0:50:04.019
got really deep insights into the products[br]here. They were very helpful.
0:50:04.019,0:50:08.660
e7p: So I think all of the companies[br]offered some jobs for security analysts.
0:50:08.660,0:50:12.160
Christoph: Oh yeah![br]e7p: So anyone who's interested in jobs at
0:50:12.160,0:50:14.680
Boston Scientific or Medtronic or[br]Biotronik must...
0:50:14.680,0:50:24.589
Christoph: Just hack a device and you will[br]get a job for it. I don't know.
0:50:24.589,0:50:33.080
Herald: And the last question I have for[br]you is how difficult this was in terms of
0:50:33.080,0:50:39.539
technical skills needed. Was this, did it[br]require really high tech exploits or was
0:50:39.539,0:50:45.029
this just unbelievably easy? How, where in[br]the spectrum was it?
0:50:45.029,0:50:53.059
e7p: So with the programming device it was,[br]for me, rather difficult because I'm not
0:50:53.059,0:51:00.269
so much into 90s or 2000s PC architecture,[br]so I had to learn some things, and I
0:51:00.269,0:51:07.839
found out about old custom hardware which[br]interacts over PCI. And also, for the
0:51:07.839,0:51:15.650
home monitoring, I had to learn some stuff[br]about embedded Linux and find out where the
0:51:15.650,0:51:23.759
network stack is and how it all works. But[br]all in all, I think in maybe one month or
0:51:23.759,0:51:29.569
so, I could get it[br]all done and figured out.
0:51:29.569,0:51:35.789
Christoph: I mean, it actually depends on[br]your existing knowledge. Some stuff was
0:51:35.789,0:51:39.369
pretty standard, like sniffing some[br]buses on the hardware with a logic
0:51:39.369,0:51:43.660
analyzer. If you have done this in the past, you can[br]also do this on these devices, for
0:51:43.660,0:51:47.609
example. But if you never did, then[br]you have to figure out which pins are for
0:51:47.609,0:51:51.579
which bus: how can I identify which bus is[br]used here? How can I read out the EPROM?
0:51:51.579,0:51:57.410
How can I read out the memory chips? It[br]highly depends on your previous work and
0:51:57.410,0:52:05.819
previous knowledge. For Endres here it's[br]easy. laughs
0:52:05.819,0:52:17.329
e7p: laughs Maybe not.[br]Herald: OK, thank you. I think this
0:52:17.329,0:52:24.220
concludes the session. Thank you both for[br]this interesting presentation. So we'll
0:52:24.220,0:52:28.119
see how your research will be further[br]improved.
0:52:28.119,0:52:33.570
e7p: Thank you. Thanks for being here.[br]Christoph: For being remote there. And
0:52:33.570,0:52:39.376
next time, hopefully live and in person.
0:52:39.376,0:52:42.170
Herald: Yes, that would be so much better[br]than this. Bye bye.
0:52:42.170,0:53:03.050
postroll music
0:53:03.050,0:53:06.000
Subtitles created by c3subtitles.de[br]in the year 2021. Join, and help us!