0:00:00.000,0:00:19.259
35C3 preroll music
0:00:19.259,0:00:24.429
Herald angel: Welcome everybody to our[br]next Talk. It's the talk “Wallet.fail”.
0:00:24.429,0:00:28.380
As you all know, when you have something[br]valuable you put it somewhere safe. But as
0:00:28.380,0:00:33.710
we as hackers also know there is no place[br]that is really safe and our three speakers
0:00:33.710,0:00:39.059
Thomas, Dmitry and Josh are now going to[br]demonstrate in the next hour the art of
0:00:39.059,0:00:43.590
completely breaking something apart. So[br]please give a big round of applause for
0:00:43.590,0:00:47.970
Thomas, Dmitry and Josh and have a lot of[br]fun.
0:00:47.970,0:00:51.790
Applause
0:00:51.790,0:00:55.359
Dmitry: So, just to start, I'm[br]curious how many people here actually own
0:00:55.359,0:01:02.300
cryptocurrency. Raise your hand. And how[br]many of you store it on a hardware wallet?
0:01:02.300,0:01:09.420
So we're very sorry to everyone who has[br]their hand up. OK. So it's not just me.
0:01:09.420,0:01:15.390
It's me, Josh and Thomas. So we're all[br]hardware people. We do low level hardware
0:01:15.390,0:01:21.200
stuff in varying degrees and we got into[br]cryptocurrency. So I can recommend it to
0:01:21.200,0:01:25.369
everyone sitting in this room if you're a[br]security person: there are not a lot of
0:01:25.369,0:01:31.340
people doing security in cryptocurrency,[br]as painful as that is to hear. So yeah,
0:01:31.340,0:01:36.110
I mean a lot of this is based on reverse[br]engineering. We love cryptocurrency.
0:01:36.110,0:01:40.660
I mean for us crypto also stands for[br]cryptography, not just cryptocurrency, but
0:01:40.660,0:01:45.729
no offense to anyone with this talk. It's[br]just a category that
0:01:45.729,0:01:49.869
we looked at. And so the results kind of[br]speak for themselves. And again this
0:01:49.869,0:01:53.700
wouldn't be possible alone. So we have a[br]lot of people to thank. I'm not going to
0:01:53.700,0:01:57.840
go through all of them individually. Just[br]know that we're thankful to everyone
0:01:57.840,0:02:03.800
on this, on the slide. So yes, so we[br]started this about six months ago. So we
0:02:03.800,0:02:07.289
wanted to take a look at cryptocurrency[br]because we own some cryptocurrency
0:02:07.289,0:02:12.890
ourselves and we saw that everyone's using[br]cryptocurrency wallets. It's more and more
0:02:12.890,0:02:18.709
the thing that you do. So we started a[br]group chat as you do nowadays. And we have
0:02:18.709,0:02:26.349
50000 messages now and 1100 images. And I[br]had my son in the meantime
0:02:26.349,0:02:30.820
as well. So it's a really long time that[br]we've been looking at this.
0:02:30.820,0:02:33.170
Applause
0:02:33.170,0:02:37.480
OK, so what do we want to achieve[br]though? Because people don't give
0:02:37.480,0:02:40.780
the kinds of attacks you can[br]actually perform against
0:02:40.780,0:02:44.349
cryptocurrency wallets enough credit.[br]So the first attack is supply chain attacks,
0:02:44.349,0:02:47.910
where you are able to manipulate the[br]devices before they get
0:02:47.910,0:02:51.409
to the end customer.[br]Firmware vulnerabilities, where you find a
0:02:51.409,0:02:55.169
vulnerability in the firmware and can[br]somehow either infect or do something else
0:02:55.169,0:02:58.879
with the device. Side-channel attacks of[br]course. I think that's one of the more
0:02:58.879,0:03:02.660
obvious ones that people are familiar[br]with. And also chip-level vulnerabilities.
0:03:02.660,0:03:06.380
So we were able to find one of each of[br]these. And in this talk we're
0:03:06.380,0:03:10.750
going to talk about each one of these[br]individually. But first, what's a wallet?
0:03:10.750,0:03:15.159
Just in case you are not 100 percent[br]familiar with them. So a wallet, and in
0:03:15.159,0:03:19.460
general cryptocurrency, how does it work?[br]It's just asymmetric cryptography. So you
0:03:19.460,0:03:24.270
have a private key and a public key. The[br]public key, basically, it gives you the
0:03:24.270,0:03:28.939
address. You can derive the address from[br]this. The address is nothing other than
0:03:28.939,0:03:33.020
the public key of the wallet and you have[br]the private key and you need this to send
0:03:33.020,0:03:37.659
transactions, so to actually operate with[br]the cryptocurrency. So this, the private
0:03:37.659,0:03:41.626
key, is what needs to be kept secret. The[br]public key is something that everyone can
0:03:41.626,0:03:45.830
know so that they can send cryptocurrency[br]to you. But it kind of sucks to have a
0:03:45.830,0:03:50.189
separate key pair for each cryptocurrency, or[br]for each wallet, maybe even multiple
0:03:50.189,0:03:55.829
wallets. It sucks to generate a new[br]cryptographic pair for each one of them.
0:03:55.829,0:04:00.390
So the people, the wonderful people,[br]behind bitcoin have thought of something
0:04:00.390,0:04:06.830
for this and it's called BIP32/BIP44. And,[br]so, what it is is you have a cryptographic
0:04:06.830,0:04:13.810
seed and you can actually derive the[br]accounts from a single seed. So you
0:04:13.810,0:04:18.160
basically store one seed and you're able[br]to derive an unlimited number of
0:04:18.160,0:04:23.480
wallets. Okay. So basically you do key[br]derivation, you add some data, do key
0:04:23.480,0:04:27.329
derivation and you can have an unlimited[br]number of wallets while storing a single
0:04:27.329,0:04:31.290
seed. And this is what you're using when[br]you're using a hardware wallet. So and of
0:04:31.290,0:04:35.235
course for each key derivation there will[br]be a new private key and a public key, but
0:04:35.235,0:04:38.759
it will be generated in a predictable[br]manner and you only need to store one
0:04:38.759,0:04:42.630
secret seed. So you only have to store the[br]seed. You can write it down, and that's
0:04:42.630,0:04:46.730
the advantage. But it's difficult to write[br]down because it's binary data. So come
0:04:46.730,0:04:52.160
BIP39, which is what you're most used to,[br]which is a format in which you can take
0:04:52.160,0:04:55.880
that cryptographic seed, that binary data,[br]and actually convert it to a set of
0:04:55.880,0:05:00.020
dictionary words that you can then easily[br]write down on a piece of paper and store
0:05:00.020,0:05:03.889
it at your mother's house, or store half[br]of it at your mother's house and half of
0:05:03.889,0:05:07.710
it at your grandmother's house. And that[br]way somebody would have to go into both
0:05:07.710,0:05:13.820
houses simultaneously to get your words.[br]So yeah. So what's a hardware wallet?
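The derivation scheme just described can be sketched in Python. This is an illustrative model of the BIP32 master-key and hardened-child steps only (the function names and the all-zero placeholder seed are ours, and the hardened-index offset is omitted for brevity), not any wallet's actual code:

```python
import hashlib
import hmac

def master_key(seed: bytes):
    # BIP32: master key material and chain code come from a single
    # HMAC-SHA512 of the seed with the fixed key "Bitcoin seed"
    digest = hmac.new(b"Bitcoin seed", seed, hashlib.sha512).digest()
    return digest[:32], digest[32:]  # (private key material, chain code)

def derive_child(key: bytes, chain_code: bytes, index: int):
    # Hardened derivation: mix the parent key, chain code and index,
    # so the whole tree can be rebuilt from the one stored seed
    data = b"\x00" + key + index.to_bytes(4, "big")
    digest = hmac.new(chain_code, data, hashlib.sha512).digest()
    return digest[:32], digest[32:]

seed = bytes(32)  # placeholder; a real wallet uses the BIP39-derived seed
k, c = master_key(seed)
k0, c0 = derive_child(k, c, 0)
k1, c1 = derive_child(k, c, 1)
# Same seed always yields the same keys; a new index yields a new key
assert (k0, c0) == derive_child(*master_key(seed), 0)
assert k0 != k1
```

Because every child key is recomputed from the parent key, chain code, and index, backing up the single seed really is enough to restore an unlimited number of wallets.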
0:05:13.820,0:05:17.800
So we just talked about what's a wallet.[br]So why do you even need a hardware wallet?
0:05:17.800,0:05:22.370
Well, the problem is, of course, computers[br]can get backdoored, have malware running
0:05:22.370,0:05:26.380
on them and this is what you want to[br]prevent. How do you do this? You have
0:05:26.380,0:05:30.139
a secure device, you store your seeds[br]externally. Usually, this is a USB-
0:05:30.139,0:05:35.180
connected device that you store your[br]crypto on and so you can trust this even
0:05:35.180,0:05:39.099
if you can't trust your computer. This is[br]the idea. So what happens is the computer
0:05:39.099,0:05:42.940
sends the transaction to the device. The[br]device gets the transaction, it can
0:05:42.940,0:05:46.660
actually confirm or deny the transaction.[br]It also displays the transaction. So
0:05:46.660,0:05:50.430
before you do any cryptographic signing,[br]you can see is that actually what I was
0:05:50.430,0:05:55.370
doing or was my computer owned and is it[br]initiating the transaction for me? So you
0:05:55.370,0:06:00.410
sign the transaction and also, yeah, the[br]seed never leaves the device, but the
0:06:00.410,0:06:04.160
hardware signs the transaction for you. You[br]send it back to the computer and the
0:06:04.160,0:06:09.610
computer can actually take that and send[br]it to the Internet. OK? So that's a quick
0:06:09.610,0:06:14.920
rundown of how crypto or, sorry, how[br]hardware wallets work. So the first thing
0:06:14.920,0:06:19.889
that we looked at was supply chain attacks,[br]which is where Josh is gonna pick up. You
0:06:19.889,0:06:26.410
have a mic. Oh sorry.[br]Josh: Ok, so the three big things I want
0:06:26.410,0:06:30.030
to leave you with as we go through the[br]supply chain attacks are, stickers for
0:06:30.030,0:06:34.210
laptops, they are not for security. So[br]we're going to be talking about stickers
0:06:34.210,0:06:39.330
today. They're there for laptop[br]decorations, they are not for security.
0:06:39.330,0:06:43.221
Supply chain attacks are easy to perform,[br]but they're quite hard to perform at
0:06:43.221,0:06:47.620
scale. And the last takeaway that I will[br]leave you with is that, the vendor's
0:06:47.620,0:06:52.720
threat model may not actually be your[br]threat model. So security stickers, some
0:06:52.720,0:06:56.460
of the wallet vendors are using them. I[br]have seen them on other products, they
0:06:56.460,0:07:01.290
seem to be quite popular. I have a friend[br]and colleague named Joe Fitzpatrick, he
0:07:01.290,0:07:06.690
also likes stickers. So the stickers that[br]he makes are the same as we find on his
0:07:06.690,0:07:11.289
security product. They have holograms.[br]They have unique serial numbers. And they
0:07:11.289,0:07:16.380
leave you with that nice warm fuzzy[br]security feeling. Joe makes some funny
0:07:16.380,0:07:21.690
ones. You can get a Fitz 140-2 approved[br]sticker. You don't have to pay the money
0:07:21.690,0:07:27.410
for the FIPS one, just get the Fitz one.[br]So the first device I looked at was the
0:07:27.410,0:07:34.270
Trezor One. The Trezor One actually has[br]two levels of protection on the packaging.
0:07:34.270,0:07:40.449
There's the hologram sticker, and then the[br]actual box is sealed with an adhesive.
0:07:40.449,0:07:44.349
So it's supposed to be that you actually[br]have to rip open the box to get into it.
0:07:44.349,0:07:47.830
But if you use a hot air gun or a[br]hairdryer it's actually quite easy to
0:07:47.830,0:07:51.949
remove. And so if you see on the left[br]there that's the original package and on
0:07:51.949,0:07:57.490
the right this is a box that I opened and[br]put everything back into. And if you look
0:07:57.490,0:08:01.069
closely there is a little bit of gap[br]there. The sticker has a little bit of
0:08:01.069,0:08:05.491
break but this was the first try. And it's[br]pretty close. So trust me taking a sticker
0:08:05.491,0:08:09.780
off is not very hard. Now remember[br]this picture of the sticker, because we're
0:08:09.780,0:08:14.370
going to come back to it. For the[br]vendor this is actually a real problem, so
0:08:14.370,0:08:18.379
Trezor did put a blog post out that one of[br]the challenges they face is that they're
0:08:18.379,0:08:22.819
facing counterfeiting of their devices. So[br]this is from their blog post. They say hey
0:08:22.819,0:08:26.400
you know we've noticed that there's[br]counterfeit devices. You have to look at
0:08:26.400,0:08:30.669
the stickers to see that they are legit.[br]So I said remember look at that sticker.
0:08:30.669,0:08:35.010
So I bought that case about a year and a[br]half ago for my previous DEF CON talk and
0:08:35.010,0:08:39.290
it's the same sticker that they're saying[br]is fake here. So then on their wiki it's
0:08:39.290,0:08:43.159
very confusing because there's three sets[br]of stickers so basically, yeah, stickers
0:08:43.159,0:08:47.890
are very confusing. They cause problems[br]for end users. And I was not even sure if
0:08:47.890,0:08:54.680
I bought a real Trezor or a cloned one. So[br]this morning I got out a new case. And
0:08:54.680,0:08:59.010
just to make sure I took off the sticker[br]using very sophisticated equipment
0:08:59.010,0:09:04.290
including a very expensive Dyson hairdryer[br]that was included in the AirBnB and I was
0:09:04.290,0:09:08.174
able to remove the sticker.[br]So it comes off
0:09:08.174,0:09:14.390
with zero residue. So yes stickers do[br]not provide any security. On the Trezor T
0:09:14.390,0:09:18.350
they switched it from the box and now the[br]box can be opened easily. But now there's
0:09:18.350,0:09:24.140
a sticker on the USB-C port. Again as you[br]would expect use hot air and you can
0:09:24.140,0:09:28.940
easily remove it. Pro tip: don't set the[br]hot air rework that high. I had it set for
0:09:28.940,0:09:33.320
lead-free reworking and I actually melted[br]the enclosure. So if you're going to do
0:09:33.320,0:09:37.711
this kind of supply chain attack, maybe[br]set the heat a little lower, but if you
0:09:37.711,0:09:44.690
just google how to remove stickers the[br]same attack methods work. So this causes
0:09:44.690,0:09:49.750
a bit of confusion, because the Ledger[br]device has a very, I will say, in-your-face
0:09:49.750,0:09:55.210
piece of paper; when you open the[br]box it says there are no stickers in this
0:09:55.210,0:10:02.000
box. However I combed through about 250[br]1-star Amazon reviews and a lot of them
0:10:02.000,0:10:06.220
have to do with confusion about the[br]stickers. So some of them are actually
0:10:06.220,0:10:10.730
quite funny. So this one started out[br]"Note to wallet hackers", so I was really
0:10:10.730,0:10:15.410
into this. So I was like, OK, pro tip[br]what's this guy have to say? And basically
0:10:15.410,0:10:18.940
he was complaining that there's[br]fingerprints on the device. That's how he
0:10:18.940,0:10:23.880
knew it was hacked. Another one complained[br]that the fingerprints were on the wallet
0:10:23.880,0:10:27.694
and there was a hair underneath. So if[br]you're doing supply chain attacks be sure
0:10:27.694,0:10:32.390
to remove any evidence of your[br]fingerprints or hair. So anyway stickers
0:10:32.390,0:10:36.613
don't work. That's all I want to say about[br]that. Once you get through this enclosure
0:10:36.613,0:10:40.860
though, you then have the challenge[br]of actually opening the enclosure. These
0:10:40.860,0:10:44.770
are three different wallet devices: Ledger[br]Nano on the left, the Trezor One and the
0:10:44.770,0:10:49.240
Trezor T on the bottom all of which[br]actually open pretty easily. So the Trezor
0:10:49.240,0:10:53.580
One, even, so, I'm still not sure if[br]that's the counterfeit or the real one,
0:10:53.580,0:10:57.930
but I did it on the real one today. I[br]was able to pop open the enclosure. So it is
0:10:57.930,0:11:01.660
ultrasonically welded, but you can pry it[br]in there and open it. The Ledger Nano
0:11:01.660,0:11:06.070
opens very easily, like, without any[br]equipment. But once you do this, you know
0:11:06.070,0:11:09.690
what do you do once it's opened? So the[br]attack basically is you take the
0:11:09.690,0:11:13.340
microcontroller and you rework it. So you[br]remove the microcontroller from the
0:11:13.340,0:11:17.260
printed circuit board and you put on a new[br]one that you bought from a distributor.
0:11:17.260,0:11:20.660
Once you've done that on the Trezor[br]devices you can put your compromised
0:11:20.660,0:11:24.610
bootloader on there. So this is, I did not[br]go as far to make the compromised
0:11:24.610,0:11:28.385
bootloader, but I did confirm that once I[br]switched the microcontroller, I could
0:11:28.385,0:11:33.200
connect with a debugger over SWD and I[br]have free access to the chip. So some of
0:11:33.200,0:11:39.190
the parts got blown off when I was[br]reworking but the SWD works fine. So yeah.
0:11:39.190,0:11:43.140
So you just rework, reflash and then you[br]put everything back together. So next I
0:11:43.140,0:11:47.360
want to talk about hardware implants. So[br]you may remember the story that came out
0:11:47.360,0:11:51.390
there was this big hack by Bloomberg about[br]hardware implants. I wanted to make a
0:11:51.390,0:11:55.570
hardware implant. I also wanted to have a[br]little bit of fun with this. So, we, in
0:11:55.570,0:12:00.390
honor of the Bloomberg story which, you[br]may know, has some issues with it.
0:12:00.390,0:12:06.010
We're about to talk about the BloomBurglar[br]which is a super micro fun implant. So the
0:12:06.010,0:12:10.010
goals for this implant: I wanted this[br]implant to happen after receipt. So it is
0:12:10.010,0:12:14.325
both a supply chain attack and a physical[br]one like a red team can perform this. A
0:12:14.325,0:12:18.970
malicious insider could also perform this[br]attack. Zero firmware, because more fun.
0:12:18.970,0:12:22.980
It has to fit inside of a hardware wallet,[br]so it has to be small. It has to also
0:12:22.980,0:12:26.803
bypass the core security function,[br]otherwise it's not an implant. Very few
0:12:26.803,0:12:31.870
components. I have a thousand of them with[br]me. So I wanted to let Makers
0:12:31.870,0:12:37.927
and DIYers participate in the hardware[br]implant fun. So what kind of implant did I
0:12:37.927,0:12:42.100
end up with? Well, I decided to do,[br]basically, an RF-triggered switch, and so
0:12:42.100,0:12:47.115
the idea is on these devices there's a[br]button and the button is the last line of
0:12:47.115,0:12:51.441
defense. So all the vendors assume that[br]the host is going to be compromised. They
0:12:51.441,0:12:55.162
just assume that's going to be easy[br]because that's software. And so once you
0:12:55.162,0:12:59.030
have a compromised host you have to send[br]it to the device and then the human -- so
0:12:59.030,0:13:03.130
humans are still needed -- humans have to[br]look at it and say "Is this the right
0:13:03.130,0:13:07.620
transaction or not?" They have to say yes[br]or no. So now with this implant I can,
0:13:07.620,0:13:11.700
through RF, I can trigger the yes button.[br]So a human is not required to send
0:13:11.700,0:13:15.366
transactions, I can remotely trigger it.[br]Basically the RF comes in through the
0:13:15.366,0:13:19.040
antenna it goes through a single[br]transistor which is the main component and
0:13:19.040,0:13:22.890
it pulls the button low. And I'm sorry to[br]say that the bill of materials is quite
0:13:22.890,0:13:28.360
expensive at three dollars and 16 cents.[br]Two dollars and 61 cents of that is this
0:13:28.360,0:13:33.780
potentiometer I had to use. So it is a[br]little bit expensive. I'm sorry. Also, why
0:13:33.780,0:13:39.250
is this so big? I mean this is an American[br]dime, I can fit two of them on it. What's the
0:13:39.250,0:13:43.110
deal? Why is it so big? Well I optimized[br]it for hand assembly. So it would be, you
0:13:43.110,0:13:47.020
know, more fun to use, but basically you[br]put the antenna in and then there's an out
0:13:47.020,0:13:51.056
button and, like I said, I have a thousand[br]with me. So just for scale. This is how it
0:13:51.056,0:13:55.720
fits on the Ledger Nano. This is how it[br]fits on the Trezor. It is also because
0:13:55.720,0:13:59.580
breadboard-friendly is a thing. So we made[br]it breadboard-friendly. So you can also
0:13:59.580,0:14:04.230
play along very easily at home. So then[br]the last challenge with an RF implant is
0:14:04.230,0:14:07.920
how do you design an antenna to fit in there.[br]So the first prototype I did was with an
0:14:07.920,0:14:11.850
SMA connector. I experimented[br]with a few antenna
0:14:11.850,0:14:15.920
designs, but remember, it all has[br]to fit inside the Ledger. So that's
0:14:15.920,0:14:20.520
actually quite easy because a Ledger Nano[br]has plenty of room to insert extra
0:14:20.520,0:14:26.630
circuitry, and so it fits quite easily in[br]the Ledger Nano. And then I did the
0:14:26.630,0:14:30.066
implant and then I started to go through[br]the wallet process. I got to a
0:14:30.066,0:14:34.680
check that said, you know, is the[br]Ledger device genuine. And here I actually
0:14:34.680,0:14:39.626
got a little bit nervous because it[br]wasn't working. I was
0:14:39.626,0:14:43.569
like, maybe they were checking this, you[br]know how did they detect it. Don't worry,
0:14:43.569,0:14:47.956
it's only Linux. So it just doesn't work[br]on Linux. So that was no problem. I did it
0:14:47.956,0:14:52.492
on Windows and no problems. The device was[br]genuine, I was able to move on. So the
0:14:52.492,0:14:56.480
thing is, this is a very crude receiver,[br]but the attacker can always use more
0:14:56.480,0:15:02.200
power. So here, this is my antenna[br]setup in the basement, and with a 50W
0:15:02.200,0:15:06.710
transmitter I can remotely trigger the[br]button at 11 meters, and at this point I'm
0:15:06.710,0:15:10.589
just limited by my basement size. I'm[br]pretty confident that I'd be able to
0:15:10.589,0:15:16.550
remotely trigger this thing further. Yeah.[br]So here we're going to see a demo of what
0:15:16.550,0:15:20.380
it looks like, and the other problem[br]you have with hardware implants is how do
0:15:20.380,0:15:24.356
you know you have the implanted device. So[br]you have to label it some way. Ledger has
0:15:24.356,0:15:29.290
this kind of Latin phrase that scrolls.[br]I wanted my own Latin phrase, and so this is
0:15:29.290,0:15:33.310
how I know this is my implanted device. So[br]what we're going to see is that the
0:15:33.310,0:15:37.188
transaction screen is gonna show up. This[br]is, and I'm basically going to trigger
0:15:37.188,0:15:40.810
this remotely, so I'm going to show that[br]radio come in and then it's going to
0:15:40.810,0:15:47.589
approve the transaction without any hands.[br]So this is the transaction. There is the
0:15:47.589,0:15:51.949
screen going. This is the way it's supposed[br]to verify. There's the radio coming in at
0:15:51.949,0:15:56.395
433 MHz and then it's going to proceed to[br]the next screen without me touching the
0:15:56.395,0:16:02.259
button. There you go. So this is remotely[br]triggered, and that would have sent
0:16:02.259,0:16:06.270
transactions. So if you think about the[br]context that you have a malicious software
0:16:06.270,0:16:10.636
implant that sent it to a wrong address,[br]the attacker now can remotely accept that
0:16:10.636,0:16:19.610
and bypass the security module.[br]Applause
0:16:19.610,0:16:25.646
So, yeah, on the recaps, stickers are for[br]laptops, not for security. Supply chain
0:16:25.646,0:16:29.967
attacks are very easy to do at a hardware[br]level, but they're quite hard to do at
0:16:29.967,0:16:33.716
scale. And when the vendor says the device[br]is genuine, that may mean different
0:16:33.716,0:16:42.778
things.[br]Thomas: To segue to the next part: so six
0:16:42.778,0:16:47.790
months ago, Josh Datko said something that[br]I found kind of funny and it's almost
0:16:47.790,0:16:52.620
correct: "If you put funny constants in[br]your code, they will end up on DEFCON
0:16:52.620,0:16:56.740
slides, and they won't be laughing with[br]you." Small mistake, they won't end up at
0:16:56.740,0:17:03.410
DEF CON, they will be at CCC. And so[br]introducing the f00dbabe vulnerability,
0:17:03.410,0:17:09.359
it's a bootloader vulnerability in a[br]Ledger Nano S. We did not come up with
0:17:09.359,0:17:14.050
this constant. It's literally in the code[br]as we'll see later. So the name was not
0:17:14.050,0:17:19.109
ours, but we like it. So we also bought[br]the domain foodba.be.
0:17:19.109,0:17:23.934
Laughter[br]Ledger Nano S is a very simple wallet. It
0:17:23.934,0:17:28.383
simply has a small display, it has a USB[br]port and two buttons. That's really all
0:17:28.383,0:17:33.170
there is. And if you take it apart,[br]you see it's just some pieces of plastic,
0:17:33.170,0:17:38.570
the display and the PCB. And looking at[br]the PCB, it kind of has an interesting
0:17:38.570,0:17:44.770
architecture where you have an STM32, which[br]is just a general purpose microcontroller,
0:17:44.770,0:17:50.250
and an ST31, which is a secure element that[br]is for example used in pay-TV and so on.
0:17:50.250,0:17:56.290
And is regarded as a very high security[br]chip, basically. And if you turn the PCB
0:17:56.290,0:18:00.070
around, you'll see that they were nice[br]enough to leave the programming port for
0:18:00.070,0:18:06.770
the STM32 open to us, ENABLED.[br]Laughter
0:18:06.770,0:18:13.440
And this has been suspected by other[br]people, and we verified it. But you know,
0:18:13.440,0:18:17.810
you have to go through it. And obviously[br]Ledger is aware of this. And so let's look
0:18:17.810,0:18:23.430
at the security model that the Ledger Nano[br]S has. The basic idea is that if we look
0:18:23.430,0:18:28.700
at this device, we kind of have this[br]schematic of the STM32 being on the left
0:18:28.700,0:18:33.640
and the ST31 on the right. And as you can[br]see, all peripherals are connected to the
0:18:33.640,0:18:39.040
STM32. That is because the ST31 does not[br]have enough pins to connect peripherals.
0:18:39.040,0:18:43.814
It literally only has a one pin interface,[br]which is for the smartcard protocols
0:18:43.814,0:18:50.560
basically. And so all the heavy lifting is[br]done by the STM32. And Ledger splits it up
0:18:50.560,0:18:56.590
into the unsecure part and the secure[br]part. And the idea is that the STM32 acts
0:18:56.590,0:19:00.860
as a proxy. So it's basically the hardware[br]driver for the button, for the display,
0:19:00.860,0:19:06.216
for the USB, similar to a northbridge in[br]your standard computer. And when you take
0:19:06.216,0:19:11.160
a computer and want to make a transaction,[br]you create your transaction on the
0:19:11.160,0:19:18.770
computer, it goes through USB to the[br]STM32, and the STM32 then forwards it to
0:19:18.770,0:19:25.480
the ST31. The ST31 then says: oh, a new[br]transaction, I want the user to
0:19:25.480,0:19:31.010
confirm it. So it sends a display command[br]to the STM32 which in turn displays that
0:19:31.010,0:19:36.190
on the screen. And then you press the[br]"yes" button again it goes the same route
0:19:36.190,0:19:41.430
to the ST31, which then internally signs[br]the transaction. So the seed never leaves
0:19:41.430,0:19:47.470
the device and the signed transaction[br]goes back through the STM, through USB to
0:19:47.470,0:19:54.400
the computer. To us, this means if this[br]chip is compromised, we can send malicious
0:19:54.400,0:20:01.500
transactions to the ST31 and confirm them[br]ourselves. Or we can even go and show a
0:20:01.500,0:20:06.970
different transaction on the screen than[br]we are actually sending to the ST31. And
0:20:06.970,0:20:11.570
Ledger is aware of this and we'll talk[br]about how they try to mitigate this later.
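Why a compromised proxy is fatal can be shown with a tiny Python model (entirely our own illustration; the class names and transaction strings are made up, and the real protocol is far more involved):

```python
class ST31:
    """Secure element: signs whatever the proxy reports as confirmed."""
    def sign(self, tx: str, confirmed: bool):
        return f"signed({tx})" if confirmed else None

class HonestSTM32:
    # Displays the real transaction and relays the user's button press
    def relay(self, se, tx, user_presses_yes):
        shown = tx  # what actually appears on the screen
        return se.sign(tx, user_presses_yes(shown))

class CompromisedSTM32:
    # Shows the transaction the user expects, but confirms a different
    # one all by itself -- the secure element cannot tell the difference
    def relay(self, se, tx, user_presses_yes):
        user_presses_yes("send 1 BTC to you")        # lie on the display
        return se.sign("send 1 BTC to attacker", True)  # auto-confirm

se = ST31()
user = lambda shown: shown == "send 1 BTC to you"  # user checks the screen
assert HonestSTM32().relay(se, "send 1 BTC to you", user) == \
    "signed(send 1 BTC to you)"
assert CompromisedSTM32().relay(se, "send 1 BTC to you", user) == \
    "signed(send 1 BTC to attacker)"
```

The user's careful check of the display is worthless here, because the display and the button both sit on the wrong side of the trust boundary.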
0:20:11.570,0:20:15.720
But first we have to find an exploit,[br]because while we do have debugging access
0:20:15.720,0:20:22.250
to the chip, hardware access is sometimes[br]kind of buggy. No offence. So we wanted to
0:20:22.250,0:20:26.550
have a software bug. And so we started[br]reverse engineering the firmware upgrade
0:20:26.550,0:20:34.180
process. And when you look at the[br]bootloader, the bootloader for the Ledger
0:20:34.180,0:20:38.530
used to be open-source, and back then they[br]didn't have any verification of the
0:20:38.530,0:20:42.780
firmware. So you could basically boot the[br]device into bootloader mode, flash
0:20:42.780,0:20:47.610
whatever from where you want, and then it[br]would run it. After someone, Saleem in
0:20:47.610,0:20:51.730
this case, wrote about this, they changed[br]it, and they changed it to do some
0:20:51.730,0:20:56.070
cryptographic measure. And we were too[br]lazy to reverse engineer the cryptographic
0:20:56.070,0:21:00.730
measure because it's very time consuming,[br]very hard. So we looked more at the parts
0:21:00.730,0:21:06.140
surrounding it and how we can maybe find a[br]bug in the bootloader to break it. And it
0:21:06.140,0:21:14.130
turns out that when you try to upgrade[br]your Ledger, it accepts four different
0:21:14.130,0:21:18.820
commands. One is select segment, which[br]allows you to select the address base at
0:21:18.820,0:21:22.730
which your firmware will be flashed. One[br]is the load command, which allows you to
0:21:22.730,0:21:27.570
write data to flash. Then you have the[br]flush command, which is basically like
0:21:27.570,0:21:32.875
fsync on Linux and writes your changes to[br]the non-volatile memory. And you have the
0:21:32.875,0:21:38.546
boot command, which verifies the flash[br]code and starts booting it. So to us the
0:21:38.546,0:21:43.723
boot command is the most interesting,[br]because it performs all verification and
0:21:43.723,0:21:50.010
it attempts to ensure that no malicious[br]image is booted. And it turns out that if
0:21:50.010,0:21:54.020
you issue the boot command, it verifies[br]the whole image with whatever
0:21:54.020,0:21:59.421
cryptographic function they use, and[br]if it's successfully verified, they write
0:21:59.421,0:22:08.640
a constant to the address 0x0800_3000, and[br]that constant is 0xF00DBABE. And so, to
0:22:08.640,0:22:14.690
not have to verify the entire flash on[br]each boot, they just do this once, so only
0:22:14.690,0:22:22.100
after firmware upgrade. So basically if[br]you boot up the Ledger, it boots, it waits
0:22:22.100,0:22:25.990
500 milliseconds. It checks if you have a[br]button pressed. If yes, it goes to
0:22:25.990,0:22:32.584
bootloader. Otherwise it loads the[br]constant at 0x08003000. And if it's
0:22:32.584,0:22:36.597
0xF00DBABE, it boots the firmware. So our[br]goal is to write a 0xF00DBABE to that
0:22:36.597,0:22:43.600
address. First attempt, we just issue a[br]select segment command to exactly that
0:22:43.600,0:22:51.564
address. We just write 0xF00DBABE to it,[br]flush and reset the device. Didn't work
0:22:51.564,0:22:57.049
unfortunately. So we had to do more[br]reverse engineering. It turns out that
0:22:57.049,0:23:02.100
they use an interesting approach to ensure[br]that you don't accidentally flash over the
0:23:02.100,0:23:06.690
bootloader. So they basically blacklist a[br]whole memory region. So if you try to
0:23:06.690,0:23:15.249
flash from 0x0800_0000 up to 0x0800_3000.[br]It returns an error. If you try to
0:23:15.249,0:23:19.300
write 0xF00DBABE directly, they thought[br]about it, and they have a very specific
0:23:19.300,0:23:25.736
code path to prevent that. So they memset[br]it to zero and you're screwed again. And
0:23:25.736,0:23:31.740
then finally it writes assuming you didn't[br]error out. But it turns out that the STM32
0:23:31.740,0:23:36.950
has kind of an interesting memory map and[br]on a lot of chips, you cannot only map
0:23:36.950,0:23:41.690
your flash to one address, but you can[br]also have it mapped to another address.
0:23:41.690,0:23:50.990
And in this case the flash is indeed also[br]mapped to the address 0. And so the
0:23:50.990,0:23:57.170
bootloader uses a blacklisting approach,[br]so it only excludes certain memory areas.
0:23:57.170,0:24:01.220
But it doesn't use whitelisting where you[br]could only explicitly write to this memory
0:24:01.220,0:24:08.700
region. So they do not block writing to[br]0x0000_0000. Profit! Second attempt. We
0:24:08.700,0:24:15.401
just select the segment at 0x0000_3000,[br]which maps to 0x0800_3000, we write
0:24:15.401,0:24:23.100
0xF00DBABE to it, we flush, reset, and we[br]can flash custom firmware! Awesome!
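The blacklist bug just described can be modeled in a few lines of Python (a sketch of the checks as presented in the talk; the constants come from the slides, the function name is ours, and the real bootloader logic is of course more involved):

```python
F00DBABE_ADDR = 0x0800_3000
BOOTLOADER_START, BOOTLOADER_END = 0x0800_0000, 0x0800_3000
ALIAS_OFFSET = 0x0800_0000  # on the STM32, flash is also mapped at 0x0000_0000

def write_allowed(addr: int) -> bool:
    # Blacklist as described: block the bootloader region and the magic
    # address, but only at their 0x0800_xxxx addresses -- the identical
    # flash cells reachable via the 0x0000_xxxx alias are never checked
    if BOOTLOADER_START <= addr <= BOOTLOADER_END:
        return False
    if addr == F00DBABE_ADDR:
        return False
    return True

# Direct write to the magic address: rejected by the bootloader
assert not write_allowed(0x0800_3000)
# The very same flash cell through the alias at address 0: accepted
alias = F00DBABE_ADDR - ALIAS_OFFSET  # 0x0000_3000
assert write_allowed(alias)
```

This is the classic blacklist-versus-whitelist failure: the check enumerates forbidden addresses instead of permitting only known-good ones, so any second path to the same memory slips through.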
0:24:23.100,0:24:32.520
Applause[br]So what do you do when you have a device
0:24:32.520,0:24:40.182
that, where the display is not big enough[br]to run DOOM, with a custom firmware? So in
0:24:40.182,0:24:44.090
this case it's an original Ledger: press[br]the button, put it into bootloader mode,
0:24:44.090,0:24:59.960
which is part of the normal operation, and[br]Laughter and Applause
0:24:59.960,0:25:06.520
If you want to play a bit of snake, come[br]by later. How are they protecting against
0:25:06.520,0:25:11.667
this? I've mentioned before that Ledger is[br]aware that you can reflash this STM32. And
0:25:11.667,0:25:16.400
they put in some measures to[br]prevent you from doing malicious stuff.
0:25:16.400,0:25:20.870
And basically what they do and this is[br]very simplified, and we did not bother to
0:25:20.870,0:25:26.630
fully reverse engineer because we didn't[br]need to, basically. When the chip boots,
0:25:26.630,0:25:31.490
it sends its entire firmware to the ST31,[br]which then performs some kind of hashing
0:25:31.490,0:25:36.700
and verifies that the firmware is[br]authentic. And it also measures the time
0:25:36.700,0:25:40.760
it takes to send the firmware. This is to[br]prevent you from just running a
0:25:40.760,0:25:48.772
compression algorithm on the STM32 and[br]sending it very slowly. How do we bypass
0:25:48.772,0:25:55.716
this? So our idea was, because we not only[br]have flash, we also have RAM. So what if
0:25:55.716,0:26:04.161
we create a compromised and compressed[br]firmware that copies itself to RAM? We
0:26:04.161,0:26:10.241
jump to it, and then it writes the entire[br]firmware back to flash,
0:26:10.241,0:26:14.961
uncompressed in this case, and then we[br]just call the original code on the secure
0:26:14.961,0:26:21.290
element. It would verify the firmware, it[br]would run with the real timing, and boot up
0:26:21.290,0:26:28.000
regularly. And so we attempted this. It[br]took quite a while to achieve.
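The "compressed firmware" step can be illustrated with the simplest possible scheme, plain run-length encoding; this is a toy stand-in for the idea, not the tool they built.

```python
# Toy run-length encoder: runs of a repeated byte become a (count, value)
# pair, so long padding runs (lots of 0x00 or 0xFF in firmware images)
# shrink dramatically, freeing room for a payload. Illustrative only.
def rle_compress(data, max_run=255):
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < max_run:
            run += 1
        out += bytes([run, data[i]])  # one run costs exactly two bytes
        i += run
    return bytes(out)

def rle_decompress(blob):
    out = bytearray()
    for k in range(0, len(blob), 2):
        count, value = blob[k], blob[k + 1]
        out += bytes([value]) * count
    return bytes(out)

# toy "firmware" with padding runs on both ends
firmware = b"\x00" * 200 + b"\x12\x34\x56" + b"\xff" * 60
packed = rle_compress(firmware)
assert rle_decompress(packed) == firmware
print(len(firmware) - len(packed))  # 253 bytes freed for a payload
```

The decompressor is a handful of instructions in assembly, which matches the constraint described next: no room for a real ZIP or LZMA decoder.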
0:26:28.000,0:26:31.570
Because basically, you can't do ZIP, you[br]can't do LZMA, because even if you
0:26:31.570,0:26:36.850
compress the image you don't have enough[br]space for a complex compressor. So our
0:26:36.850,0:26:41.960
attempt was to find duplicate bytes,[br]squeeze them together and make space for a
0:26:41.960,0:26:46.390
custom payload. And basically we just have[br]a table that says, okay, now you will have
0:26:46.390,0:26:52.590
six zeros or something. And each table[br]entry only takes a single byte. And
0:26:52.590,0:26:56.610
it's only like 10 instructions in[br]assembler to run this decompressor, so you
0:26:56.610,0:27:01.000
don't have the large code base. It's very[br]easy to use. And it turns out that even
0:27:01.000,0:27:05.330
with a very simple detector, like in this[br]case we rerun the script to find the
0:27:05.330,0:27:10.750
longest duplicate data, and you can see on[br]the first try, we get like 260 bytes of
0:27:10.750,0:27:17.219
space for our payload, which is enough for[br]a lot of things, let's say. And we have a
0:27:17.219,0:27:22.330
working proof of concept of this, and we[br]would go into a lot of detail, but we
0:27:22.330,0:27:27.450
only got an hour. And so we will release,[br]after this talk, a non-offensive
0:27:27.450,0:27:31.850
example of this, so you can look at how[br]it works and what you can do even if
0:27:31.850,0:27:37.170
your firmware is attempting to be[br]verified. And also, and this is very
0:27:37.170,0:27:41.391
exciting, we are working with the YouTuber[br]LiveOverflow, and he created a 20 minute
0:27:41.391,0:27:46.919
video walking through this entire[br]F00DBABE vulnerability, how the
0:27:46.919,0:27:51.877
verification works and how to bypass it to[br]a certain degree. We don't want to
0:27:51.877,0:27:56.920
weaponize it, so we will not[br]release the full thing, but
0:27:56.920,0:28:02.676
yeah, very excited for this. Stay tuned on[br]our Twitter and we'll link it for sure. As
0:28:02.676,0:28:06.393
part of this, we also have a lot of[br]software that we will release. So public
0:28:06.393,0:28:10.320
release, we'll release the snake firmware.[br]So hopefully this evening you'll be able
0:28:10.320,0:28:15.250
to play snake on your Ledger. If you[br]bought some bitcoin at twenty thousand now
0:28:15.250,0:28:21.070
you're bankrupt, you can at least play[br]snake. We will opensource the compressor
0:28:21.070,0:28:26.350
and the extractor. We built a logic[br]analyzer plugin for this markup protocol
0:28:26.350,0:28:31.330
and we built software that analyzes the[br]communication between the STM32 and the
0:28:31.330,0:28:36.900
ST31 on the Ledger specific data, and you[br]can just dump it. So if you guys are
0:28:36.900,0:28:45.500
interested in for example trying to break[br]into the ST31, please have a go. And
0:28:45.500,0:28:50.302
Ledger has a second device, which is[br]called the Ledger Blue. We assume the
0:28:50.302,0:28:55.300
reason it's called the Ledger Blue is[br]because it contains Bluetooth. But they
0:28:55.300,0:29:00.060
never enable Bluetooth. So it's basically[br]just a regular Ledger with a color display
0:29:00.060,0:29:06.240
and a big battery in it. And we call this[br]part "Fantastic Signals and how to find
0:29:06.240,0:29:10.210
them".[br]Laughter
0:29:10.210,0:29:14.980
Because when we opened up this device and[br]we were chatting, we have this nice
0:29:14.980,0:29:20.530
telegram chat room where we're chatting[br]24/7 while doing this. And we opened up
0:29:20.530,0:29:24.460
the device, and the first thing, like[br]literally five minutes after opening it, I
0:29:24.460,0:29:29.653
saw that you have the secure element on[br]the left and the STM32 on the right. You
0:29:29.653,0:29:36.220
have some other stuff like the Bluetooth[br]module and so on. The trace between the
0:29:36.220,0:29:42.445
secure element and the microcontroller is[br]pretty long and contains a pretty fast
0:29:42.445,0:29:50.590
signal. So what is a long conductor with a[br]fast changing current? Anyone got a clue?
0:29:50.590,0:29:54.844
Interjection[br]Correct. It's an antenna.
0:29:54.844,0:30:02.550
So I pulled out my HackRF [br]software defined radio, this
0:30:02.550,0:30:08.470
is just a more sophisticated RTL-[br]SDR, so you can just sniff arbitrary
0:30:08.470,0:30:13.600
signals with it. I got a random shitty[br]telescope antenna on Amazon, and there we have
0:30:13.600,0:30:20.330
my Ledger Blue. So on this screen, you can[br]see the blue thing is the radio spectrum
0:30:20.330,0:30:26.790
around 169 MHz and if we start entering[br]our pin we can see that there's a weak
0:30:26.790,0:30:29.960
signal.[br]Laughter
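For intuition on why that trace radiates at all: a conductor becomes a reasonably efficient antenna at around a quarter of the wavelength, and at the observed 169 MHz that is tens of centimeters, roughly the scale of a long PCB trace plus an attached cable. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope antenna math: quarter-wavelength at the observed
# 169 MHz emission. The quarter-wave rule of thumb is an approximation.
C = 299_792_458  # speed of light in m/s

def quarter_wave_m(freq_hz):
    return C / freq_hz / 4

print(round(quarter_wave_m(169e6), 3))  # 0.443 meters, about 44 cm
```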
0:30:29.960,0:30:37.670
You guys see where this is going. On the[br]radio. Unfortunately that signal is pretty
0:30:37.670,0:30:46.410
weak. Luckily they included an antenna.[br]They call it a USB cable, but I'm not so
0:30:46.410,0:30:53.610
sure about it. So this time with USB[br]connected, and we do the same thing again.
0:30:53.610,0:31:00.004
You can see like crazy radio spikes and[br]this is right next to each other. But even
0:31:00.004,0:31:05.820
if you go a couple of meters. I was[br]limited, as Josh was, by my living room space.
0:31:05.820,0:31:11.950
You get a couple of meters of decent[br]reception. So our goal was to find out
0:31:11.950,0:31:17.970
what is this signal and if we just look at[br]the recorded amplitude of the signal, we
0:31:17.970,0:31:23.291
get this. And if you do a lot of probing[br]and so on, you immediately see, ok, there
0:31:23.291,0:31:29.380
are spikes, and there are 11 of them, and[br]then there's a pause, and then there are small
0:31:29.380,0:31:34.760
spikes. So this is probably some kind of[br]protocol that first sends 11 bytes of data
0:31:34.760,0:31:39.415
then pauses, and then sends more data. So[br]we looked at the back of the device,
0:31:39.415,0:31:43.571
started probing every single connection[br]and tried to figure out is this the secure
0:31:43.571,0:31:50.210
element? Is this whatever? And it turned[br]out to be the display bus. So we can sniff
0:31:50.210,0:31:56.830
information on what is sent to the display[br]remotely. And if we look at the
0:31:56.830,0:32:01.040
signal that gets sent in blue, it's the[br]signal that gets sent when you press the
0:32:01.040,0:32:07.010
digit zero on the pin pad, and in orange[br]when you press the digit seven. So we can
0:32:07.010,0:32:11.304
see a very clear difference at certain[br]points on the signal which confirmed our
0:32:11.304,0:32:16.850
suspicion. But building software for this[br]is kind of boring, like digital signal
0:32:16.850,0:32:22.380
processing is not really my thing. So what[br]do we do? And we wanted to increase the
0:32:22.380,0:32:29.140
buzzword load in our talk a bit. And so we[br]are hacking blockchain IoT devices, using
0:32:29.140,0:32:40.934
artificial intelligence, in the cloud.[br]Applause and Laughter
0:32:40.934,0:32:47.660
So our idea was: we record training[br]signals, we use some kind of prefiltering,
0:32:47.660,0:32:55.130
we train AI on it. Profit! Literally.[br]Problem is, getting training data really
0:32:55.130,0:32:59.309
sucks, because you don't want to sit there[br]for 10 hours pressing the same key on a
0:32:59.309,0:33:06.094
pin pad. It really doesn't sound like fun.[br]And so this needs automation. So,
0:33:06.094,0:33:11.700
Laughter[br]So we took an Arduino, we took a roll of
0:33:11.700,0:33:17.560
masking tape, a piece of acrylic glass, a[br]PCB vice, and this is a Huawei pen for the
0:33:17.560,0:33:24.937
extra amount of Chinese backdoor. And we[br]let this run for a couple of hours. And
0:33:24.937,0:33:32.360
you can actually see that every time it[br]presses down, you can see that the digit
0:33:32.360,0:33:37.480
that you pressed is highlighted and the[br]difference in the signal we saw earlier is
0:33:37.480,0:33:42.600
probably the x and y coordinate, of where[br]it highlights the button. And that's the
0:33:42.600,0:33:51.065
difference. We can see in the trace. And[br]so we had a lot of recorded data. Now we
0:33:51.065,0:33:58.241
created a training set and a test set,[br]did preprocessing, and built a TensorFlow AI model.
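The real pipeline used a TensorFlow model; as a self-contained stand-in, a nearest-centroid matcher shows the same idea: average the recorded traces per digit, then label a new trace by the closest average. All traces here are synthetic toy data, not the actual recordings.

```python
# Nearest-centroid stand-in for the trace classifier. Hypothetical toy
# data; the real system trained a TensorFlow model on recorded signals.
def train_centroids(traces_by_digit):
    # one centroid per digit: the element-wise mean of its training traces
    return {d: [sum(col) / len(col) for col in zip(*traces)]
            for d, traces in traces_by_digit.items()}

def classify(centroids, trace):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda d: dist(centroids[d], trace))

training = {
    7: [[0.9, 0.1, 0.8], [1.0, 0.2, 0.7]],   # toy "digit 7" traces
    0: [[0.1, 0.9, 0.1], [0.2, 1.0, 0.2]],   # toy "digit 0" traces
}
model = train_centroids(training)
print(classify(model, [0.95, 0.15, 0.75]))  # 7
```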
0:33:58.241,0:34:05.188
It's surprisingly easy. And we[br]ran our test set and did a prediction. And
0:34:05.188,0:34:10.360
so the big question: how accurate is it?[br]And it turns out... So this is the
0:34:10.360,0:34:16.530
result of a cut of the test set. And if we[br]zoom in on this, it basically tells you:
0:34:16.530,0:34:21.594
we have the signal, this gray thing,[br]which is just a picture representation of the
0:34:21.594,0:34:28.550
signal and it tells you how sure it is,[br]what digit it is. In this case it's 7 with
0:34:28.550,0:34:35.168
98 percent likelihood. So pretty good. In[br]our test set we only have one wrong result
0:34:35.168,0:34:40.760
and overall we get around 90 percent[br]accuracy. And to move this into the cloud, we
0:34:40.760,0:34:46.699
are hosting this on the Google cloud. As[br]the LedgerAI for you guys to play with and
0:34:46.699,0:34:51.415
we'll publish it online with a limited[br]dataset that is trained on a very close
0:34:51.415,0:34:56.020
space. You cannot do something super[br]malicious with it but it's nice to play
0:34:56.020,0:35:01.510
around and see how this was done. And this[br]brings us to the next part, glitch me if
0:35:01.510,0:35:11.770
you can. Thank you.[br]Applause
0:35:11.770,0:35:17.024
Josh: So now we're going to talk about the[br]silicon level vulnerability with glitching
0:35:17.024,0:35:21.342
attacks, fault injection. So to review,[br]I will be talking about the
0:35:21.342,0:35:25.530
Trezor One. And so I just want to go over[br]very quickly what the architecture is of
0:35:25.530,0:35:31.910
the trezor one and some previous work that[br]is done. So the Trezor One is quite a
0:35:31.910,0:35:37.800
simple embedded device. It consists of[br]only a few components. It has an OLED
0:35:37.800,0:35:44.062
display it has some buttons and has a USB[br]connector that are all externally facing.
0:35:44.062,0:35:53.619
Internally it has its main brain, if you[br]will, the STM32F205 microcontroller, which
0:35:53.619,0:35:57.923
controls all the other operations on the[br]Trezor, that display, the USB, and the two
0:35:57.923,0:36:05.130
buttons. So last year we gave a talk at[br]DEFCON "Breaking Bitcoin Hardware Wallets"
0:36:05.130,0:36:09.549
here we used the ChipWhisperer mainly to do[br]the glitching attacks. The conclusion
0:36:09.549,0:36:16.400
from last year was that the F205 was[br]vulnerable to fault injection, but it was
0:36:16.400,0:36:21.498
inconclusive if we could do an exploit via[br]the fault. So this year we have a
0:36:21.498,0:36:27.468
different result. But the output of that[br]work was this board,
0:36:27.468,0:36:29.200
called the Breaking Bitcoin board.
0:36:29.200,0:36:34.130
Basically it was a Trezor clone that just[br]made it easy to attach wires and probes
0:36:34.130,0:36:38.517
and so we made this board. The design[br]schematics are all online. It's open
0:36:38.517,0:36:42.970
source hardware. This is the chip[br]whisperer set up that we were using so we
0:36:42.970,0:36:47.257
made the board specifically to fit on the[br]ChipWhisperer target board. And this is
0:36:47.257,0:36:51.739
just what it looks like when you use the[br]ChipWhisperer GUI to perform a glitch. And
0:36:51.739,0:36:56.442
here we were doing application level code[br]so it's very different but I gave that
0:36:56.442,0:37:07.020
talk and then I met Dmitry and Thomas.[br]Dmitry: Fortunately we had Josh to do the
0:37:07.020,0:37:11.690
talk last year and to kind of exhaust a[br]lot of the firmware vulnerabilities that
0:37:11.690,0:37:15.570
were actually hardware vulnerabilities in[br]the firmware that might have been there.
0:37:15.570,0:37:19.750
So we immediately knew that we could[br]exclude this. And so you can start looking
0:37:19.750,0:37:23.990
at the underlying microcontrollers. So in[br]this case it's STM32 microcontroller that
0:37:23.990,0:37:28.556
they use inside of it and it controls[br]everything. So compromising the STM32
0:37:28.556,0:37:33.430
microcontroller means that you can[br]compromise the device.
0:37:33.430,0:37:38.690
So I mean so there's a couple of papers[br]that have covered some of the
0:37:38.690,0:37:42.800
vulnerabilities in the STM32 specifically[br]there's one which describes a UV attack
0:37:42.800,0:37:49.090
which lets you downgrade the security on[br]the STM32. So we determined that paper
0:37:49.090,0:37:53.880
unfortunately does not apply in our case,[br]because the Trezor is smart enough, when
0:37:53.880,0:37:58.670
it boots, to check the value stored in[br]flash, and if it has been altered in any
0:37:58.670,0:38:03.516
way, to set it correctly. So they actually[br]even protect against this kind of attack.
0:38:03.516,0:38:08.190
But nevertheless you can see that there are[br]some vulnerabilities. So there is another
0:38:08.190,0:38:12.310
paper which unfortunately has not been[br]published yet and we couldn't get in touch
0:38:12.310,0:38:15.880
with the authors yet. That should be[br]coming out in January hopefully which
0:38:15.880,0:38:23.380
describes glitches against the STM32 F1[br]and STM32 F3. So now we have the F0, the
0:38:23.380,0:38:30.250
F1, and the F3 and so basically here's the[br]product matrix. So three of them are
0:38:30.250,0:38:37.530
already vulnerable. So what we're looking[br]at is the STM32 F2, and potentially the STM32 F4 if
0:38:37.530,0:38:43.160
we're talking about the Trezor model T so[br]those we do not have vulnerabilities for
0:38:43.160,0:38:49.738
yet. So let's take a look at how it[br]works really quickly. So the way that STM
0:38:49.738,0:38:56.010
implements security on the STM32 is that[br]they store an option byte and the option
0:38:56.010,0:39:02.119
byte. The thing to remember is that on a[br]Cortex-M3 or M4 microcontroller you
0:39:02.119,0:39:06.010
don't have anything other than flash. So[br]even though they call it an option byte, or
0:39:06.010,0:39:10.130
refer to it as being fused or[br]permanent in hardware, it's still stored
0:39:10.130,0:39:14.180
in flash, just like the user application[br]is stored in flash. So it's the exact
0:39:14.180,0:39:19.390
same non-volatile memory that's otherwise[br]used. So basically when you get a new STM32,
0:39:19.390,0:39:24.400
it's shipped in a state where you have[br]full access. So that's how Josh was able
0:39:24.400,0:39:30.530
to rework a board and flash it with new[br]firmware. And then there is the ultimate
0:39:30.530,0:39:35.650
security is what's called RDP2. So there[br]you have no access but you can see that
0:39:35.650,0:39:43.550
basically if you have a value other than[br]0xAA or 0xCC, which correspond to RDP0 and RDP2
0:39:43.550,0:39:48.670
respectively then you have what's called[br]RDP1 and this is interesting because it
0:39:48.670,0:39:52.590
doesn't give you access to the flash which[br]is actually where the cryptographic seed
0:39:52.590,0:39:57.050
is stored on the Trezor but it gives you[br]access to RAM, it gives you access to the
0:39:57.050,0:40:01.440
registers but it doesn't give you flash[br]access like I said and it doesn't give you
0:40:01.440,0:40:05.350
single stepping either; connecting a[br]debugger in this mode will actually cause
0:40:05.350,0:40:10.480
the hardware to hard fault, which we'll see[br]in a second. So basically what we want
0:40:10.480,0:40:16.099
to try to do is to downgrade RDP2, which is[br]what the Trezor is set to. And we want
0:40:16.099,0:40:24.450
to be able to access the device at RDP1,[br]which is a somewhat vulnerable state.
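The option-byte logic just described can be written down in a few lines. The encoding follows the talk (0xAA is RDP0, 0xCC is RDP2, anything else is RDP1); the function itself is a sketch, not ST's actual BootROM code.

```python
# RDP level from the option byte, as described in the talk. Any corrupted
# read that is neither 0xAA nor 0xCC lands in RDP1, which is why a single
# glitched bit is enough to downgrade a device shipped at RDP2.
def rdp_level(option_byte):
    if option_byte == 0xAA:
        return 0  # RDP0: full access
    if option_byte == 0xCC:
        return 2  # RDP2: no debug access at all
    return 1      # RDP1: RAM and register access, but no flash readout

print(rdp_level(0xCC))  # 2: the level the Trezor is set to
print(rdp_level(0xCD))  # 1: one corrupted bit is a downgrade
```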
0:40:24.450,0:40:29.160
So I should say that this is the[br]correct way to approach this, and it's
0:40:29.160,0:40:35.270
great for doing an educational talk. But[br]in all honesty there's three of us. And so
0:40:35.270,0:40:39.859
we did this completely in the dark over[br]3 months, trying different
0:40:39.859,0:40:44.330
parameters on our glitch setups,[br]and were later able to find
0:40:44.330,0:40:49.740
this. But I'm here to explain it to all of[br]you so that it's easy to reproduce. So if
0:40:49.740,0:40:53.640
you actually watch the STM32F2 boot,[br]you'll see that it's relatively slow, and
0:40:53.640,0:40:57.780
it's only this slow after you power cycle[br]the board. So it takes approximately
0:40:57.780,0:41:02.340
1.8 milliseconds to boot, which in[br]microcontroller terms is pretty slow, so you
0:41:02.340,0:41:06.170
can see there's the power supply there's[br]the IO pin and that's approximately how
0:41:06.170,0:41:10.720
long it takes to boot the firmware so you[br]can see that's where the IO actually
0:41:10.720,0:41:16.250
toggles so 120 milliseconds later. So we[br]just wrote some firmware to basically
0:41:16.250,0:41:20.130
toggle one of the pins measured within an[br]oscilloscope. Now we have the timing of
0:41:20.130,0:41:24.830
how long that takes. So that's not super[br]interesting because that's not really a
0:41:24.830,0:41:29.010
trigger. And each one of these[br]microcontrollers internally it has a boot
0:41:29.010,0:41:34.790
ROM, so it has some ROM, read-only[br]memory. It's not the regular non-volatile memory, it's
0:41:34.790,0:41:40.619
not the flash. It's literally a ROM which[br]is inside the chip itself. It's hard
0:41:40.619,0:41:45.589
coded; it cannot be fixed or patched, and it[br]gets executed first. So we wanted to
0:41:45.589,0:41:49.510
actually attack that because anything else[br]is the user application and that's what
0:41:49.510,0:41:54.340
Josh did last year. So you can kind of[br]start to whittle this down. So you see that
0:41:54.340,0:41:58.600
for the first 1.4 milliseconds of the reboot[br]nothing actually happens, because this is
0:41:58.600,0:42:02.450
now the reset line. And so the reset line[br]goes high after 1.4 millisecond
0:42:02.450,0:42:06.060
so you can ignore the first[br]1.4 milliseconds after you
0:42:06.060,0:42:10.560
cycle the power. So now the next step that[br]you can actually do is you can connect
0:42:10.560,0:42:15.870
what's called a shunt resistor. So[br]oscilloscopes are there to measure
0:42:15.870,0:42:19.349
voltage and so you want to actually[br]measure current to be able to know how
0:42:19.349,0:42:23.450
much power is being consumed[br]by the device. So you do what's called
0:42:23.450,0:42:26.721
a shunt measurement and that's[br]what I have on this slide right here.
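The shunt measurement boils down to one line of Ohm's law: the scope records the voltage drop across a small series resistor, and the current follows as I = V/R. The 10 Ω value below is illustrative, not the resistor used on the actual rig.

```python
# Current through a shunt resistor from the measured voltage drop (Ohm's
# law). The 10-ohm default is an illustrative value, not the real rig's.
def shunt_current(v_drop, r_shunt=10.0):
    return v_drop / r_shunt  # amps

print(round(shunt_current(0.033), 4))  # 0.0033 A, i.e. 3.3 mA from a 33 mV drop
```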
0:42:26.721,0:42:30.839
So you have the blue signal is now[br]actually the power consumption. And so now
0:42:30.839,0:42:34.640
you can actually look and see what's[br]happening. So the first thing that happens
0:42:34.640,0:42:38.859
is we have the execution of the BootROM.[br]You can see in the power consumption curve
0:42:38.859,0:42:44.440
you can clearly see this moment in time.[br]Then you have basically where the flash
0:42:44.440,0:42:49.209
and option bytes actually get read[br]somewhat at least within the BootROM. And
0:42:49.209,0:42:53.620
finally the third distinctive moment in[br]time is where the application actually
0:42:53.620,0:42:58.240
begins to execute. So now we've taken this[br]1.8 milliseconds which is a
0:42:58.240,0:43:03.200
relatively long time and reduced it to the 200[br]microseconds we're actually interested
0:42:58.240,0:43:03.200
in. And not only that, we know that we're[br]actually interested in having slightly
0:43:07.910,0:43:12.650
higher power consumption than the normal[br]execution of the bootloader or the BootROM
0:43:12.650,0:43:19.230
rather and this is somewhere between[br]let's say 170 microseconds and 200
0:43:19.230,0:43:23.760
microseconds. So this is the time at which[br]we actually need to glitch and this is
0:43:23.760,0:43:28.460
also reasonable parameters. If you're[br]trying to reproduce this at home. So what
0:43:28.460,0:43:33.570
do you need to reproduce this thing?[br]The greatest thing that came out in the
0:43:33.570,0:43:38.660
last couple of years is these cheap[br]Chinese power supplies, where you take a
0:43:38.660,0:43:43.600
cheap you know old wall wart from one of[br]your old Linksys routers you plug it in
0:43:43.600,0:43:48.860
and then you actually have a controllable[br]power supply with voltage and current,
0:43:48.860,0:43:53.460
and you can adjust this and control this.[br]And so that's what we're using here. The
0:43:53.460,0:43:56.645
second thing that I have to actually
0:43:56.645,0:44:01.390
control the timing is an FPGA. I mean I[br]use FPGA's for everything and this is
0:44:01.390,0:44:05.849
something that was easiest to put together[br]with an FPGA because FPGAs have constant
0:44:05.849,0:44:11.471
timing. So finally we have a multiplexer[br]there as well and the multiplexers are
0:44:11.471,0:44:16.750
actually switching between two voltages:[br]ground, so completely cutting the
0:44:16.750,0:44:21.260
voltage off and the normal operating[br]voltage of the microcontroller. And
0:44:21.260,0:44:27.310
finally we have a debugger, the J-Link,[br]which is highly advised if you want to
0:44:27.310,0:44:33.310
ever do JTAG stuff. So it's just a JTAG[br]debugger, and basically what happens is
0:44:33.310,0:44:39.540
you let this run for a while and it looks[br]like this. It's not really super eventful
0:44:39.540,0:44:43.590
so you can see that the voltage the yellow[br]signal is actually the voltage and you can
0:44:43.590,0:44:46.770
see we're just dipping the voltage at[br]different points in time and
0:44:46.770,0:44:51.630
simultaneously we have a Python script[br]checking if we have JTAG access or not.
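The search loop just described can be sketched as a parameter sweep: step the glitch offset through the interesting window, try a few pulse widths, fire, then ask the debugger whether JTAG came up. `fire_glitch` and `jtag_alive` below stand in for the FPGA trigger and the debugger check; both are hypothetical stubs, not the real tooling.

```python
# Hedged sketch of a glitch parameter sweep. fire_glitch / jtag_alive are
# hypothetical stand-ins for the FPGA trigger and the JTAG-liveness check.
def sweep(offsets_us, widths_us, fire_glitch, jtag_alive, tries_per_point=3):
    for offset in offsets_us:
        for width in widths_us:
            for _ in range(tries_per_point):
                fire_glitch(offset, width)
                if jtag_alive():
                    return offset, width  # success: record these parameters
    return None

# Dry run with stubs, so the loop can be exercised without any hardware.
hits = {(180, 6)}            # pretend this parameter pair succeeds
state = {}
def fake_fire(offset, width): state["last"] = (offset, width)
def fake_alive(): return state.get("last") in hits

result = sweep(range(170, 201, 5), [2, 4, 6], fake_fire, fake_alive)
print(result)  # (180, 6)
```

The 170 to 200 range mirrors the microsecond window identified from the power trace; in practice the loop runs unattended for hours or months, which is exactly the setup the speakers describe.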
0:44:51.630,0:44:56.680
Protip to all the new dads if you do this[br]at home you can turn your oscilloscope
0:44:56.680,0:45:00.499
towards the door, so that when you get up[br]at night because the baby's crying, you
0:45:00.499,0:45:06.060
can see if it's still running or not. So[br]it's very, it's highly advised. So now
0:45:06.060,0:45:10.900
Thomas is going to tell us how to get the[br]seed into into RAM.
0:45:10.900,0:45:17.500
Thomas: So we had this thing running for[br]3 months roughly across 3
0:45:17.500,0:45:22.020
continents because Josh is in America,[br]Dmitry is in Russia and I'm in Germany and
0:45:22.020,0:45:26.610
so it took us 3 months to get a[br]successful glitch and even then we didn't
0:45:26.610,0:45:32.090
believe it at first because we exhausted[br]everything basically. And the only reason
0:45:32.090,0:45:39.290
we finally got it working is that we made a[br]mistake where we mistook 70 ms for
0:45:39.290,0:45:43.710
170 ms and had it run for a long time. And[br]that's how we found out that the BootROM
0:45:43.710,0:45:48.920
is actually super slow to boot on this[br]device. And once we had this downgrade
0:45:48.920,0:45:56.810
from RDP2 to RDP1, we were able to read[br]the RAM, but we could not read the flash,
0:45:56.810,0:46:03.700
which actually contains the seed. And so[br]how do we find this? And our idea was we
0:46:03.700,0:46:08.540
start reviewing the upgrade procedure[br]because on the Trezor, the way the
0:46:08.540,0:46:12.780
bootloader works is, it doesn't require a[br]PIN or anything to upgrade the firmware,
0:46:12.780,0:46:16.390
which makes sense, because let's say you[br]have a bug in the pin function you want to
0:46:16.390,0:46:21.609
somehow be able to get rid of it, right?[br]And the other thing is if you flash a
0:46:21.609,0:46:28.970
fully valid firmware, it retains the data,[br]it retains your seed. If you flash a non-
0:46:28.970,0:46:35.170
genuine one, it will actually erase your[br]seed and so on. And they do a
0:46:35.170,0:46:38.990
really good job on the firmware[br]verification. We reviewed it for days and
0:46:38.990,0:46:43.640
days and days and didn't find anything.[br]But so how does this upgrade procedure
0:46:43.640,0:46:48.220
work? How is this seed retained? And so[br]when you review the relevant code you
0:46:48.220,0:46:54.310
see that there is a call to backup[br]metadata which sounds like it's going to
0:46:54.310,0:46:59.859
retain somehow your data. And indeed you[br]can see that it's literally a mem-copy
0:46:59.859,0:47:06.030
from the flash data we're interested in[br]to RAM. And so our basic procedure
0:47:06.030,0:47:11.630
was, we go into bootloader we start the[br]firmware upgrade and we stop it before the
0:47:11.630,0:47:16.599
RAM gets cleared. Because if you finish[br]the upgrade procedure, the Trezor actually
0:47:16.599,0:47:22.390
clears its memory again, which is a very[br]decent way to do it. But we've found a way
0:47:22.390,0:47:25.810
to retain it in RAM. So it turns out that[br]when you start the firmware upgrade
0:47:25.810,0:47:32.880
process, it eventually asks you to verify[br]the checksum of what you just flashed, and
0:47:32.880,0:47:38.540
it turns out that at this point in time,[br]the seed is still in RAM and we can read
0:47:38.540,0:47:47.410
it out via RDP1. And this is relatively[br]simple to do once you actually manage to
0:47:47.410,0:47:51.780
glitch the device. You basically just run[br]openocd dump_image, you get an image of
0:47:51.780,0:47:57.060
the SRAM, and you have the whole RAM[br]contents, and so on.
0:47:57.060,0:48:04.330
Dmitry: What are we going to do, Thomas?[br]What high-tech hacking tool will we be using
0:48:04.330,0:48:09.630
today to extract the seed?[br]Thomas: So actually, before we were
0:48:04.330,0:48:09.630
successful, we had hours of talks on[br]how this seed is stored and so
0:48:14.310,0:48:18.880
on. But we've found this super[br]sophisticated seed extraction tool that
0:48:18.880,0:48:26.010
only runs on POSIX and POSIX-like systems,[br]it's called strings.
0:48:26.010,0:48:30.140
Laughter[br]And so basically it turns out that when
0:48:30.140,0:48:37.640
you have a firmware dump, or a RAM[br]dump as we do now, we just
0:48:37.640,0:48:43.550
run strings on the dump. We get a couple[br]of really nice words and I don't know if
0:48:43.550,0:48:49.340
you remember the intro, but this is your[br]seed.
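What `strings` did for them can be re-implemented in a few lines: pull printable ASCII runs out of a RAM dump and keep the tokens that look like recovery words. The wordlist below is a stub; a real check would use the full BIP39 list, and the dump here is synthetic.

```python
# Tiny strings-style seed scan: find printable ASCII runs in a dump, then
# keep tokens that appear in the wordlist. WORDLIST is a stub, not the
# full BIP39 list; the RAM dump below is synthetic toy data.
import re

WORDLIST = {"abandon", "ability", "zoo", "wisdom", "tape"}  # stub only

def candidate_words(dump, min_len=4):
    runs = re.findall(rb"[\x20-\x7e]{%d,}" % min_len, dump)
    words = []
    for run in runs:
        for token in run.decode("ascii").lower().split():
            if token in WORDLIST:
                words.append(token)
    return words

ram = b"\x00\x13garbage\x00wisdom tape zoo\x00\xff\xfe"
print(candidate_words(ram))  # ['wisdom', 'tape', 'zoo']
```

Because BIP39 seeds are stored as plain mnemonic words, no parsing of the wallet's data structures is needed at all, which is exactly the joke the speakers are making.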
0:48:49.340,0:48:55.600
Applause[br]And you might be wondering what this
0:48:55.600,0:48:59.900
little number is. This is your pin to the[br]device.
0:48:59.900,0:49:09.020
Laughter[br]That was a great day. And so Josh, or one
0:49:09.020,0:49:16.369
of Josh's employees took all this mess we[br]created on all desks and made it into this
0:49:16.369,0:49:23.600
nice device which is basically a socket[br]where you put in your chip and then we can
0:49:23.600,0:49:28.480
read out the seed and so on.[br]Dmitry: And all of this stuff including
0:49:28.480,0:49:32.600
the board design, FPGA codes, and the[br]Verilog code that we use, I mean if
0:49:32.600,0:49:36.810
somebody wants to, they can apply it and[br]do the same thing with one of the iCEsticks
0:49:36.810,0:49:41.060
or one of the more open source friendly[br]FPGA boards. This just happens to be the
0:49:41.060,0:49:46.710
one that we all had lying around and could[br]reproduce the work with. You can go ahead
0:49:46.710,0:49:50.848
and do it. I mean we suspect, I think[br]Thomas said, we suspect you might be able
0:49:50.848,0:49:54.910
to do with Arduino as well, because the[br]actual glitch pulse is only approximately
0:49:54.910,0:50:02.061
60 μs or sorry, 6 μs in time. So it's a[br]relatively slow signal as well, so it
0:50:02.061,0:50:08.310
should be relatively repeatable even with[br]something cheaper than this. But this is a
0:50:08.310,0:50:12.200
way to automate this even better and to[br]not have dangling wires or any of the
0:50:12.200,0:50:16.870
small soldering that was required to do it[br]in situ in the device which we had on the
0:50:16.870,0:50:22.109
previous slide. So all of that we're going[br]to have it on GIthub. And so I think the
0:50:22.109,0:50:28.080
final, the final thing.[br]Thomas: one more thing before we are,
0:50:28.080,0:50:35.550
sorry. One more thing. So this breaks a[br]lot of the Trezor security, but there is
0:50:35.550,0:50:41.060
a way to protect your seed against this,[br]So if you use a passphrase on your device,
0:50:41.060,0:50:46.520
the way we understood it, it basically[br]doesn't allow somebody with hardware
0:50:46.520,0:50:51.850
access to steal all your funds. So if you[br]add a passphrase to your Trezor, a good
0:50:51.850,0:50:57.760
passphrase, and your machine is not already[br]owned, you can somewhat protect
0:50:57.760,0:51:03.220
against this. But a lot of people don't.[br]So we are really sorry we didn't mean any
0:51:03.220,0:51:08.520
harm.[br]Dmitry: So yeah, that's the conclusion I
0:51:08.520,0:51:14.310
would say. So yeah I mean, so all the[br]stuff we're going to put online, I guess I
0:51:14.310,0:51:20.590
said, so you can follow us for the links[br]online. wallet.fail, it's a domain
0:51:20.590,0:51:26.420
name, believe it or not, fail is a TLD. So[br]you can go to github.com/walletfail,
0:51:26.420,0:51:32.600
twitter.com/walletfail. You can follow me,[br]Thomas, and Josh on Twitter as well and
0:51:32.600,0:51:36.710
like I said, we'll be releasing all this[br]stuff so it will go up slowly. Just
0:51:36.710,0:51:40.730
because I think when we set out six months[br]ago we did not expect to have 100
0:51:40.730,0:51:45.619
percent success in everything that we were[br]planning to do. so that's a first for me
0:51:45.619,0:51:48.420
at the very least.[br]Thomas: The saddest part is that we have
0:51:48.420,0:51:54.950
more vulnerabilities to other wallets,[br]but, only one hour. And so we also have
0:51:54.950,0:51:58.720
some stuff to give out so we have the[br]hardware implant PCBs, we have thousands
0:51:58.720,0:52:01.810
of them if you want to get some.[br]Dmitry: Off to Josh.
0:52:01.810,0:52:08.810
Thomas: We even have components for them[br]for like 100 devices so hit us up and we
0:52:08.810,0:52:11.020
can do something. Thank you.
0:52:11.020,0:52:21.960
Applause
0:52:21.960,0:52:25.650
Herald: Thank you guys, it's an amazing[br]talk. I feel really inspired to break
0:52:25.650,0:52:30.630
things apart in a very creative way. We[br]have some time left for questions. So if
0:52:30.630,0:52:34.720
you have questions, please line up at the[br]microphones. But first we're going to
0:52:34.720,0:52:37.299
start with a question from the Internet.
0:52:37.299,0:52:40.239
Signal Angel: Thank you,[br]I've got two related
0:52:40.239,0:52:44.240
questions from the internet. First one,[br]how hard did you guys laugh when Bitfi
0:52:44.240,0:52:50.599
announced that their Android-based wallet[br]was unhackable? And second question, have
0:52:50.599,0:52:55.510
you tried to attack larger processors,[br]like ARM-based processors?
0:52:55.510,0:53:00.900
Thomas: So maybe let's start with Bitfi.[br]So we only talk about somewhat secure
0:53:00.900,0:53:06.720
wallets, we didn't want to use a Chinese[br]phone in this talk. So we laughed pretty
0:53:06.720,0:53:13.892
hard and we ordered some, but yeah.[br]Dmitry: And I mean this was covered
0:53:13.892,0:53:17.780
extensively. So another guy who you should[br]follow on Twitter @cybergibbons gave a
0:53:17.780,0:53:22.165
talk at hardwear.io on the topic of the[br]Bitfi. He was summarizing research that
0:53:22.165,0:53:25.568
he did in conjunction with a bunch of[br]other people as well. So if you're
0:53:25.568,0:53:27.970
interested in the Bitfi you should go look[br]at them.
0:53:27.970,0:53:30.170
So the second question was about ARM-based
0:53:30.170,0:53:35.420
controllers. I mean all of these were[br]ARM-based. Every single chip as far as I
0:53:35.420,0:53:38.989
know that we looked at was ARM-based[br]in one way or another.
0:53:38.989,0:53:40.210
Thomas: Yeah and there's,
0:53:40.210,0:53:44.060
so if you're interested in this, look at[br]glitching the Nintendo Switch where they
0:53:44.060,0:53:48.359
glitch the Tegra used in the Nintendo[br]Switch, which is very interesting and will
0:53:48.359,0:53:52.924
give a lot of inspiration in that[br]regard, basically.
0:53:52.924,0:53:57.206
Herald: Thank you. A question for[br]microphone 4 please.
0:53:57.206,0:54:01.755
Mic 4: Hi, Trezor CPO here, first thank[br]you for the talk, we worked with you to
0:54:01.755,0:54:06.010
fix the issues, which were pushed to[br]prod, and if anyone is interested in hacking
0:54:06.010,0:54:13.733
hardware wallets, we are really interested[br]in working with the hardware hackers
0:54:13.733,0:54:17.940
community and we have a[br]responsible disclosure program.
0:54:17.940,0:54:24.065
You mentioned problems with supply chain[br]attacks but gave no solutions, so let me
0:54:24.065,0:54:30.109
give you one. Trezor is open source[br]hardware, so you can build your own
0:54:30.109,0:54:32.235
from locally sourced components
0:54:32.235,0:54:37.949
if you are paranoid and don't want to[br]deal with these kinds of attacks.
0:54:37.949,0:54:44.139
But my question is: is there any[br]other solution except for building
0:54:44.139,0:54:47.259
your own wallet or inspecting[br]the code it runs and
0:54:47.259,0:54:50.380
interrogating it, basically?
0:54:50.380,0:54:55.240
Thomas: First, thank you. One thing we[br]should mention is that when we looked at
0:54:55.240,0:54:59.920
the Trezor code, the reason we had to end[br]up glitching this chip for three months is
0:54:59.920,0:55:04.080
that we couldn't break the firmware[br]otherwise. So they do a great job. And
0:55:04.080,0:55:08.480
it's really awesome.[br]Applause
0:55:08.480,0:55:15.570
Dmitry: Yes. The firmware on the Trezor is[br]something to look at. I mean I recommend
0:55:15.570,0:55:19.580
that, I mean we all do consulting work as[br]well. And so it's something that I
0:55:19.580,0:55:24.740
recommend to people who are interested[br]in looking at how to implement certain
0:55:24.740,0:55:28.445
mitigations in hardware. It's an[br]excellent project to look at. And so
0:55:28.445,0:55:32.390
Trezor should be commended on that. But at[br]the end of the day it doesn't mean that
0:55:32.390,0:55:37.270
the chip that the Trezor uses is secure[br]against these kinds of attacks. And that's
0:55:37.270,0:55:41.369
where we had to fall back to looking for[br]silicon vulnerabilities against a chip
0:55:41.369,0:55:44.928
or, sorry, a wallet like the Trezor.
0:55:45.814,0:55:48.032
Josh: I would say on this hygiene side,
0:55:48.032,0:55:53.002
this is a very difficult problem,[br]governments especially have this issue.
0:55:53.002,0:55:57.192
You can do cryptographic attestation, but[br]as we saw with the Ledger Nano,
0:55:57.192,0:56:01.105
that cryptographic attestation didn't help[br]verify that the requests were legitimate
0:56:01.105,0:56:05.110
against a hardware attack, so there's been[br]talk about X-raying the board and all this
0:56:05.110,0:56:08.540
stuff, but this is still very much an[br]open problem in hardware security.
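Josh's point can be made concrete with a toy challenge-response flow. This sketch uses a symmetric HMAC purely for illustration; real wallets use asymmetric signatures chained to a vendor CA, and every name here is hypothetical, not any vendor's actual protocol:

```python
import hashlib
import hmac
import secrets

# Simplified symmetric stand-in for device attestation: real secure
# elements sign the challenge with a key certified by the vendor.
DEVICE_KEY = secrets.token_bytes(32)  # provisioned at the factory

def device_attest(challenge: bytes) -> bytes:
    """What the genuine secure element computes over a host challenge."""
    return hmac.new(DEVICE_KEY, challenge, hashlib.sha256).digest()

def host_verify(challenge: bytes, response: bytes) -> bool:
    """Host-side check that a genuine chip answered the challenge."""
    return hmac.compare_digest(device_attest(challenge), response)

challenge = secrets.token_bytes(16)
ok = host_verify(challenge, device_attest(challenge))
# The catch Josh describes: a hardware implant sitting next to a
# genuine secure element still passes this check. Attestation proves
# the chip is authentic, not that the board around it is unmodified.
```

The final comment is the whole problem: the check only proves that some genuine chip answered, which is exactly why it did not stop the hardware-implant attack.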
0:56:08.540,0:56:11.020
Herald: Another question from microphone[br]3.
0:56:11.020,0:56:16.310
Mic: Actually I have a suggestion.[br]Herald: Make it short, though. Because
0:56:16.310,0:56:19.390
usually we just take questions. One[br]sentence.
0:56:19.390,0:56:25.203
Mic: A few MCUs actually have JTAG[br]connected via hardware fuses.
0:56:25.665,0:56:28.530
So this might be useful to
0:56:28.530,0:56:35.600
at least slow down glitching attacks.[br]Dmitry: Thanks. I agree. But these are
0:56:35.600,0:56:40.764
not Cortex-M microcontrollers I can tell[br]you that with 100% certainty. It has to do
0:56:40.764,0:56:44.109
a lot with the fact that the[br]microcontrollers that are being used in
0:56:44.109,0:56:48.460
these devices, they're built to the spec[br]that ARM specified, the set of features ARM
0:56:48.460,0:56:53.800
thinks would be good for this class of[br]device, or rather for
0:56:53.800,0:56:57.990
the CPUs for the class of device that they[br]ended up getting put in. So anything
0:56:57.990,0:57:02.950
Cortex-M is going to have vulnerabilities[br]that are more or less like the silicon
0:57:02.950,0:57:06.859
vulnerabilities that we have. It's just I[br]mean if you ask me I think it's a matter
0:57:06.859,0:57:11.050
of time just to sit there. I mean[br]fortunately we had something like 3 months
0:57:11.050,0:57:16.770
of just glitching to be able to find[br]these bugs. But if you can apply that much
0:57:11.050,0:57:16.770
time to finding a silicon attack, you might[br]be able to find this kind of vulnerability as
0:57:22.200,0:57:25.579
well in other Cortex-M products. Only[br]three minutes.
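The "three months of just glitching" Dmitry mentions is, at its core, a brute-force sweep over glitch timing parameters. A minimal sketch of that search loop, with `fire_glitch` standing in for a hypothetical hardware trigger callback; none of this is the speakers' actual tooling:

```python
import itertools

def glitch_campaign(offsets_ns, widths_ns, attempts_per_point, fire_glitch):
    """Sweep voltage-glitch timing parameters until the target misbehaves.

    fire_glitch(offset, width) -> True when the target skipped its
    security check (hypothetical hardware callback). Each parameter
    pair is retried several times because glitches are probabilistic.
    """
    for offset, width in itertools.product(offsets_ns, widths_ns):
        for _ in range(attempts_per_point):
            if fire_glitch(offset, width):
                return offset, width  # successful parameter pair
    return None  # exhausted the search space without a hit

# Dry run against a stub "target" that is vulnerable at one setting.
hit = glitch_campaign(
    offsets_ns=range(0, 1000, 50),   # delay after trigger, nanoseconds
    widths_ns=range(10, 60, 10),     # glitch pulse width, nanoseconds
    attempts_per_point=3,
    fire_glitch=lambda o, w: (o, w) == (350, 40),
)
```

In a real campaign the search space is far larger and the success probability per point is tiny, which is what turns this loop into months of wall-clock time.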
0:57:25.579,0:57:28.940
Herald: All good. Another question from[br]microphone 4 please.
0:57:28.940,0:57:33.933
Mic 4: So obviously as part of your work[br]you analyzed the firmware of these
0:57:33.933,0:57:40.150
devices. Did you find that the firmware [br]is in any way obfuscated or encrypted?
0:57:40.150,0:57:45.380
Thomas: So basically yep, on these chips[br]you cannot really encrypt the firmware. On
0:57:45.380,0:57:51.000
the ST31 you can encrypt it. But we didn't[br]have to look at it because the ST31 is not
0:57:51.000,0:57:54.710
something you have to break. So no,[br]there was no real obfuscation that we
0:57:54.710,0:57:58.780
could see. But we also don't have the code[br]in the case of Ledger, so I just stared at
0:57:58.780,0:58:05.220
IDA Pro for hours and yeah.[br]Herald: The next person on microphone 4.
0:58:05.220,0:58:11.310
Mic 4: Hello, did you have a look at the[br]entropy chip that generates the master
0:58:11.310,0:58:14.880
seeds on both of these hardware devices,[br]and what's your take on that?
0:58:14.880,0:58:21.530
Dmitry: I mean, so we already covered how[br]the Trezor works. There is only one chip
0:58:21.530,0:58:26.760
and it's the STM32 so I know that there[br]was a known issue with Trezor back in the
0:58:26.760,0:58:32.950
day where they weren't seeding[br]the RNG correctly, basically. But this was
0:58:32.950,0:58:37.820
fixed. But for our attacks this[br]wasn't an issue. I mean if you were
0:58:37.820,0:58:42.710
concerned about how strong[br]the random number generators are
0:58:42.710,0:58:48.560
for creating a seed you could actually[br]create a BIP39 wallet outside of any
0:58:48.560,0:58:53.539
one of these and then just use them for[br]their hardware features and get the seed
0:58:53.539,0:58:56.326
from outside.[br]Herald: And if you have a question, do
0:58:56.326,0:59:00.170
move to the microphone if you're able to.[br]But first we have another question from
0:59:00.170,0:59:03.937
the Internet.[br]SA: Thank you. Did you guys see the
0:59:03.937,0:59:09.472
dinosaur hiphop zero wallet?[br]Thomas: No but if you send it to us
0:59:09.472,0:59:11.663
we are happy to look at it.[br]Thomas: Oh you did.
0:59:11.663,0:59:14.626
Dmitry: Yeah, we had it.[br]Josh: The dinosaur hiphop wallet -
0:59:14.626,0:59:18.380
Thank you for the kind of trick question[br]- so the design of the dinosaur hiphop
0:59:18.380,0:59:21.784
wallet was a Trezor clone[br]that we looked at last year.
0:59:21.784,0:59:23.928
Thomas: Ah[br]Josh: Called the breaking bitcoin board.
0:59:23.928,0:59:27.365
Otherwise, functionally it's a Trezor clone,
0:59:27.365,0:59:30.125
but we stole a lot of the instructions[br]from dinosaur hiphop to
0:59:30.125,0:59:33.444
make the breaking bitcoin board[br]and then prepared the operating system.
0:59:33.444,0:59:37.307
Dmitry: I mean, and maybe on that note[br]I would say that in terms of looking at
0:59:37.307,0:59:42.319
what wallets are actually being used you'll[br]find that, so the Ledger is a very popular
0:59:42.319,0:59:44.266
wallet, the Trezor is a very popular
0:59:44.266,0:59:50.080
wallet. But since the Trezor is open source[br]there are a lot of clones and forks of the
0:59:50.080,0:59:55.930
Trezor. And I would say that not all of[br]them run the latest security patches that
0:59:55.930,1:00:00.430
have been applied to the Trezor code base.[br]So that's also something that you can do
1:00:00.430,1:00:04.830
is basically diff the projects and see[br]which ones are staying
1:00:04.830,1:00:06.169
up to date and which aren't.
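Dmitry's suggestion of diffing the forks can be sketched with nothing but the standard library. The two directory trees below are tiny stand-ins for an upstream Trezor checkout and a fork checkout:

```python
import filecmp
import pathlib
import tempfile

def files_differing(upstream, fork):
    """Report files that differ between an upstream tree and a fork,
    e.g. sources where the fork never picked up a security patch."""
    return sorted(filecmp.dircmp(upstream, fork).diff_files)

# Demo with two tiny fake source trees standing in for real checkouts.
with tempfile.TemporaryDirectory() as tmp:
    up = pathlib.Path(tmp, "upstream")
    fk = pathlib.Path(tmp, "fork")
    up.mkdir()
    fk.mkdir()
    # Upstream carries a fix the fork never merged.
    (up / "bootloader.c").write_text("verify_signature();  /* patched */\n")
    (fk / "bootloader.c").write_text("verify_signature();\n")
    # This file is identical in both trees.
    (up / "ui.c").write_text("draw();\n")
    (fk / "ui.c").write_text("draw();\n")
    stale = files_differing(up, fk)
```

Against real repositories you would diff full checkouts (or compare git histories), but the idea is the same: any file that drifted from upstream is a candidate for a missing security patch.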
1:00:06.169,1:00:08.822
Herald: Your question has to be the very[br]last one today.
1:00:08.822,1:00:15.783
Please speak directly into the microphone.[br]Even closer to the mic.
1:00:15.783,1:00:25.322
Mic: Seeing as this is the first CCC for[br]many of us and some of us might not have
1:00:25.322,1:00:29.939
that much experience in hardware hacking.[br]Do you have any tips for beginners?
1:00:29.939,1:00:39.210
Thomas: Yeah, lots of them. Buy an Arduino,[br]learn from the mistakes you make with it
1:00:39.210,1:00:44.100
learn how hardware works, basically. Watch[br]a lot of online videos and I think you
1:00:44.100,1:00:48.530
gave presentations, you gave[br]presentations. I gave some presentations.
1:00:48.530,1:00:53.520
So just watch talks, watch LiveOverflow.[br]LiveOverflow, great YouTube channel on
1:00:53.520,1:00:59.619
exactly this stuff. And also don't[br]hesitate to reach out to us. If you have a
1:00:59.619,1:01:04.580
question. Always contact us[br]info@wallet.fail, on Twitter, wherever. we
1:01:04.580,1:01:07.491
are happy to talk to you. It might take a[br]while.
1:01:07.491,1:01:12.043
Josh: On non-security electronics, if you[br]go to Sparkfun or Adafruit, they have lots
1:01:12.043,1:01:15.840
of free material on how electronics works[br]and how to get started. It's not security
1:01:15.840,1:01:18.140
related, but it's a very good[br]electronics program.
1:01:18.140,1:01:21.142
Dmitry: But I'll say I started[br]with Arduino too.
1:01:23.642,1:01:27.487
Herald: All right thank you guys so much[br]for the very nice questions and you guys
1:01:27.487,1:01:30.150
for the amazing and inspiring talk.[br]Thank you so much.
1:01:30.150,1:01:31.670
Applause
1:01:31.670,1:01:58.000
Subtitles created by c3subtitles.de[br]in the years 2018-2020. Join, and help us!