WEBVTT
00:00:00.000 --> 00:00:19.259
35C3 preroll music
00:00:19.259 --> 00:00:24.429
Herald angel: Welcome everybody to our
next talk. It's the talk “Wallet.fail”.
00:00:24.429 --> 00:00:28.380
As you all know, when you have something
valuable you put it somewhere safe. But as
00:00:28.380 --> 00:00:33.710
we as hackers also know there is no place
that is really safe and our three speakers
00:00:33.710 --> 00:00:39.059
Thomas, Dmitry and Josh are now going to
demonstrate in the next hour the art of
00:00:39.059 --> 00:00:43.590
completely breaking something apart. So
please give a big round of applause for
00:00:43.590 --> 00:00:47.970
Thomas, Dmitry and Josh and have a lot of
fun.
00:00:47.970 --> 00:00:51.790
Applause
00:00:51.790 --> 00:00:55.359
Dmitry: So, just to start, I'm
curious how many people here actually own
00:00:55.359 --> 00:01:02.300
cryptocurrency. Raise your hand. And how
many of you store it on a hardware wallet?
00:01:02.300 --> 00:01:09.420
So we're very sorry to everyone who has
their hand up. OK. So it's not just me.
00:01:09.420 --> 00:01:15.390
It's me, Josh and Thomas. So we're all
hardware people. We do low level hardware
00:01:15.390 --> 00:01:21.200
stuff in varying degrees and we got into
cryptocurrency. So I can recommend it to
00:01:21.200 --> 00:01:25.369
everyone sitting in this room if you're a
security person: there are not a lot of
00:01:25.369 --> 00:01:31.340
people doing security in cryptocurrency,
as painful as that is to hear. So yeah,
00:01:31.340 --> 00:01:36.110
I mean a lot of this is based on reverse
engineering. We love cryptocurrency.
00:01:36.110 --> 00:01:40.660
I mean, for us, crypto also stands for
cryptography, not just cryptocurrency, but
00:01:40.660 --> 00:01:45.729
no offense to anyone with this talk. It's
just a category that
00:01:45.729 --> 00:01:49.869
we looked at. And so the results kind of
speak for themselves. And again this
00:01:49.869 --> 00:01:53.700
wouldn't be possible alone. So we have a
lot of people to thank. I'm not going to
00:01:53.700 --> 00:01:57.840
go through all of them individually. Just
know that we're thankful to everyone
00:01:57.840 --> 00:02:03.800
on this, on the slide. So yes, so we
started this about six months ago. So we
00:02:03.800 --> 00:02:07.289
wanted to take a look at cryptocurrency
because we own some cryptocurrency
00:02:07.289 --> 00:02:12.890
ourselves and we saw that everyone's using
cryptocurrency wallets. It's more and more
00:02:12.890 --> 00:02:18.709
the thing that you do. So we started a
group chat as you do nowadays. And we have
00:02:18.709 --> 00:02:26.349
50,000 messages now and 1,100 images. And I
had my first, I had my son in the meantime
00:02:26.349 --> 00:02:30.820
as well. So it's a really long time that
we've been looking at this.
00:02:30.820 --> 00:02:33.170
Applause
00:02:33.170 --> 00:02:37.480
OK, so what do we want to achieve
though? Because people don't give
00:02:37.480 --> 00:02:40.780
the kinds of attacks you can
actually perform against
00:02:40.780 --> 00:02:44.349
cryptocurrency wallets enough credit.
So first attack is supply chain attacks
00:02:44.349 --> 00:02:47.910
where you are able to manipulate the
devices before they get
00:02:47.910 --> 00:02:51.409
to the end customer.
Firmware vulnerabilities, where you find a
00:02:51.409 --> 00:02:55.169
vulnerability in the firmware and can
somehow either infect or do something else
00:02:55.169 --> 00:02:58.879
with the device. Side-channel attacks of
course. I think that's one of the more
00:02:58.879 --> 00:03:02.660
obvious ones that people are familiar
with. And also chip-level vulnerabilities.
00:03:02.660 --> 00:03:06.380
So we were able to find one of each of
these. And so in this talk we're
00:03:06.380 --> 00:03:10.750
going to talk about each one of these
individually. But first, what's a wallet?
00:03:10.750 --> 00:03:15.159
Just in case you are not 100 percent
familiar with them. So a wallet, and in
00:03:15.159 --> 00:03:19.460
general cryptocurrency how do you do this,
it's just asymmetric cryptography. So you
00:03:19.460 --> 00:03:24.270
have a private key and a public key. The
public key, basically, it gives you the
00:03:24.270 --> 00:03:28.939
address. You can derive the address from
this. The address is nothing other than
00:03:28.939 --> 00:03:33.020
the public key of the wallet and you have
the private key and you need this to send
00:03:33.020 --> 00:03:37.659
transactions, so to actually operate with
the cryptocurrency. So this, the private
00:03:37.659 --> 00:03:41.626
key, is what needs to be kept secret. The
public key is something that everyone can
00:03:41.626 --> 00:03:45.830
know so that they can send cryptocurrency
to you. But it kind of sucks to have a
separate key pair for each cryptocurrency,
or for each wallet, maybe even multiple
for each wallet maybe even multiple
00:03:50.189 --> 00:03:55.829
wallets. It sucks to generate a new
cryptographic pair for each one of them.
00:03:55.829 --> 00:04:00.390
So the people, the wonderful people,
behind bitcoin have thought of something
00:04:00.390 --> 00:04:06.830
for this and it's called BIP32/BIP44. And,
so, what it is is you have a cryptographic
00:04:06.830 --> 00:04:13.810
seed and you can actually derive the
accounts from a single seed. So you
00:04:13.810 --> 00:04:18.160
basically store one seed and you're able
to derive an unlimited number of
00:04:18.160 --> 00:04:23.480
wallets. Okay. So basically you do key
derivation, you add some data, do key
00:04:23.480 --> 00:04:27.329
derivation and you can have an unlimited
number of wallets while storing a single
00:04:27.329 --> 00:04:31.290
seed. And this is what you're using when
you're using a hardware wallet. So and of
00:04:31.290 --> 00:04:35.235
course for each key derivation there will
be a new private key and a public key, but
00:04:35.235 --> 00:04:38.759
it will be generated in a predictable
manner and you only need to store one
00:04:38.759 --> 00:04:42.630
secret seed. So you only have to store the
seed. You can write it down, and that's
00:04:42.630 --> 00:04:46.730
the advantage. But it's difficult to write
down because it's binary data. So come
00:04:46.730 --> 00:04:52.160
BIP39, which is what you're most used to,
which is a format in which you can take
00:04:52.160 --> 00:04:55.880
that cryptographic seed, that binary data,
and actually convert it to a set of
00:04:55.880 --> 00:05:00.020
dictionary words that you can then easily
write down on a piece of paper and store
00:05:00.020 --> 00:05:03.889
it at your mother's house, or store half
of it at your mother's house and half of
00:05:03.889 --> 00:05:07.710
it at your grandmother's house. And that
way somebody would have to go into both
00:05:07.710 --> 00:05:13.820
houses simultaneously to get your words.
So yeah. So what's a hardware wallet?
00:05:13.820 --> 00:05:17.800
So we just talked about what's a wallet.
So why do you even need a hardware wallet?
00:05:17.800 --> 00:05:22.370
Well, the problem is, of course, computers
can get backdoored, have malware running
00:05:22.370 --> 00:05:26.380
on them and this is what you want to
prevent. How do you do this? You have
00:05:26.380 --> 00:05:30.139
a secure device, you store your seeds
externally. Usually, this is a USB-
00:05:30.139 --> 00:05:35.180
connected device that you store your
crypto on and so you can trust this even
00:05:35.180 --> 00:05:39.099
if you can't trust your computer. This is
the idea. So what happens is the computer
00:05:39.099 --> 00:05:42.940
sends the transaction to the device. The
device gets the transaction, it can
00:05:42.940 --> 00:05:46.660
actually confirm or deny the transaction.
It also displays the transaction. So
00:05:46.660 --> 00:05:50.430
before you do any cryptographic signing,
you can see is that actually what I was
00:05:50.430 --> 00:05:55.370
doing or was my computer owned and is it
initiating the transaction for me? So you
00:05:55.370 --> 00:06:00.410
sign the transaction and also, yeah, the
seed never leaves the device, but the
00:06:00.410 --> 00:06:04.160
hardware signs a transaction for you. You
send it back to the computer and the
00:06:04.160 --> 00:06:09.610
computer can actually take that and send
it to the Internet. OK? So that's a quick
00:06:09.610 --> 00:06:14.920
rundown of how crypto or, sorry, how
hardware wallets work. So the first thing
00:06:14.920 --> 00:06:19.889
that we looked at was supply chain attacks
which is where Josh is gonna pick up. You
00:06:19.889 --> 00:06:26.410
have a mic. Oh sorry.
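The host-to-wallet flow Dmitry just described can be modeled in a few lines. This is a toy sketch, not any vendor's actual protocol: HMAC stands in for the real ECDSA transaction signature, and `user_confirms` stands in for the physical button press after reading the device's screen.

```python
import hashlib
import hmac

class HardwareWallet:
    """Toy model of the flow above: the seed stays inside the device;
    the host only ever sees transactions and signatures."""

    def __init__(self, seed):
        self._seed = seed  # never leaves the device

    def sign(self, tx, user_confirms):
        # The device shows the transaction on its own screen, and the
        # human pressing the button is the last line of defense.
        if not user_confirms(tx):
            return None  # user rejected what was displayed
        # Stand-in for the real ECDSA signature over the transaction.
        return hmac.new(self._seed, tx, hashlib.sha256).digest()

wallet = HardwareWallet(seed=b"\x01" * 32)
approve_alice = lambda tx: b"alice" in tx  # the user's eyeballs

sig = wallet.sign(b"send 1 BTC to alice", approve_alice)
assert sig is not None  # the legitimate transaction gets signed

# A compromised host swapping the recipient is caught on the display.
assert wallet.sign(b"send 1 BTC to mallory", approve_alice) is None
```

Even with a fully owned computer, the attacker only gets a signature if the on-device display and button are honest, which is exactly what the attacks in this talk target.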
Josh: Ok, so the three big things I want
00:06:26.410 --> 00:06:30.030
to leave you with as we go through the
supply chain attacks are, stickers for
00:06:30.030 --> 00:06:34.210
laptops, they are not for security. So
we're going to be talking about stickers
00:06:34.210 --> 00:06:39.330
today. They're there for laptop
decorations, they are not for security.
00:06:39.330 --> 00:06:43.221
Supply chain attacks are easy to perform,
but they're quite hard to perform at
00:06:43.221 --> 00:06:47.620
scale. And the last takeaway that I will
leave you with is that, the vendor's
00:06:47.620 --> 00:06:52.720
threat model may not actually be your
threat model. So security stickers, some
00:06:52.720 --> 00:06:56.460
of the wallet vendors are using them. I
have seen them on other products, they
00:06:56.460 --> 00:07:01.290
seem to be quite popular. I have a friend
and colleague named Joe Fitzpatrick, he
00:07:01.290 --> 00:07:06.690
also likes stickers. So the stickers that
he makes are the same as you find on a
00:07:06.690 --> 00:07:11.289
security product. They have holograms.
They have unique serial numbers. And they
00:07:11.289 --> 00:07:16.380
leave you with that nice warm fuzzy
security feeling. Joe makes some funny
00:07:16.380 --> 00:07:21.690 line:1
ones. You can get Fitz 140-2 approved
stickers. You don't have to pay the money
00:07:21.690 --> 00:07:27.410
for the FIPS one, just get the Fitz one.
So the first device I looked at was the
00:07:27.410 --> 00:07:34.270
Trezor One. The Trezor One actually has
two levels of protection on the packaging.
00:07:34.270 --> 00:07:40.449
There's the hologram sticker, and then the
actual box is sealed with an adhesive.
00:07:40.449 --> 00:07:44.349
So it's supposed to be that you actually
have to rip open the box to get into it.
00:07:44.349 --> 00:07:47.830
But if you use a hot air gun or a
hairdryer it's actually quite easy to
00:07:47.830 --> 00:07:51.949
remove. And so if you see on the left
there that's the original package and on
00:07:51.949 --> 00:07:57.490
the right this is a box that I opened and
put everything back into. And if you look
00:07:57.490 --> 00:08:01.069
closely there is a little bit of gap
there. The sticker has a little bit of
00:08:01.069 --> 00:08:05.491
break but this was the first try. And it's
pretty close. So trust me, taking a sticker
00:08:05.491 --> 00:08:09.780
off is not very hard. Now remember
this picture of the sticker, 'cause we're
00:08:09.780 --> 00:08:14.370
going to come back to it. But for the
vendor this is actually a real problem so
00:08:14.370 --> 00:08:18.379
Trezor did put a blog post out that one of
the challenges they face is that they're
00:08:18.379 --> 00:08:22.819
facing counterfeiting of their devices. So
this is from their blog post. They say hey
00:08:22.819 --> 00:08:26.400
you know we've noticed that there's
counterfeit devices. You have to look at
00:08:26.400 --> 00:08:30.669
the stickers to see that they are legit.
So I said remember look at that sticker.
00:08:30.669 --> 00:08:35.010
So I bought that case about a year and a
half ago for my previous DEF CON talk and
00:08:35.010 --> 00:08:39.290
it's the same sticker that they're saying
is fake here. So then on their wiki it's
00:08:39.290 --> 00:08:43.159
very confusing because there's three sets
of stickers so basically, yeah, stickers
00:08:43.159 --> 00:08:47.890
are very confusing. They cause problems
for end users. And I was not even sure if
00:08:47.890 --> 00:08:54.680
I bought a real Trezor or a cloned one. So
this morning I got out a new case. And
00:08:54.680 --> 00:08:59.010
just to make sure I took off the sticker
using very sophisticated equipment
00:08:59.010 --> 00:09:04.290
including a very expensive Dyson hairdryer
that was included in the Airbnb and I was
00:09:04.290 --> 00:09:08.174
able to remove the sticker.
So it comes off
00:09:08.174 --> 00:09:14.390
with zero residue. So yes stickers do
not provide any security. On the Trezor T
00:09:14.390 --> 00:09:18.350
they switched it from the box and now the
box can be opened easily. But now there's
00:09:18.350 --> 00:09:24.140
a sticker on the USB-C port. Again, as you
would expect, use hot air and you can
00:09:24.140 --> 00:09:28.940
easily remove it. Pro tip: don't set the
hot air rework that high. I had it set for
00:09:28.940 --> 00:09:33.320
lead-free reworking and I actually melted
the enclosure. So if you're going to do
00:09:33.320 --> 00:09:37.711
this kind of supply chain attack, maybe,
no, set the heat a little lower but if you
00:09:37.711 --> 00:09:44.690
just google how to remove stickers the
same attack methods work. So this causes
00:09:44.690 --> 00:09:49.750
a bit of confusion because the Ledger
device has a very, I will say, in your
00:09:49.750 --> 00:09:55.210
face piece of paper when you open the
box: it says there are no stickers in this
00:09:55.210 --> 00:10:02.000
box. However I combed through about 250
1-star Amazon reviews and a lot of them
00:10:02.000 --> 00:10:06.220
have to do with confusion about the
stickers. So some of them are actually
00:10:06.220 --> 00:10:10.730
quite funny. So this one started out
"Note to wallet hackers", so I was really
00:10:10.730 --> 00:10:15.410
into this. So I was like, OK, pro tip
what's this guy have to say? And basically
00:10:15.410 --> 00:10:18.940
he was complaining that there's
fingerprints on the device. That's how he
00:10:18.940 --> 00:10:23.880
knew it was hacked. Another one complained
that the fingerprints were on the wallet
00:10:23.880 --> 00:10:27.694
and there was a hair underneath. So if
you're doing supply chain attacks be sure
00:10:27.694 --> 00:10:32.390
to remove any evidence of your
fingerprints or hair. So anyway stickers
00:10:32.390 --> 00:10:36.613
don't work. That's all I want to say about
that. Once you get through this enclosure
00:10:36.613 --> 00:10:40.860
though you then have to have the challenge
of actually opening the enclosure. These
00:10:40.860 --> 00:10:44.770
are three different wallet devices: Ledger
Nano on the left, the Trezor One and the
00:10:44.770 --> 00:10:49.240
Trezor T on the bottom all of which
actually open pretty easily. So the Trezor
00:10:49.240 --> 00:10:53.580
One, even, so, I'm still not sure if
that's the counterfeit or the real one,
00:10:53.580 --> 00:10:57.930
but I did it on the real one today. I
was able to pop open the enclosure. So it is
00:10:57.930 --> 00:11:01.660
ultrasonically welded, but you can pry it
in there and open it. The Ledger Nano
00:11:01.660 --> 00:11:06.070
opens very easily, like, without any
equipment. But once you do this, you know
00:11:06.070 --> 00:11:09.690
what do you do once it's opened? So the
attack basically is you take the
00:11:09.690 --> 00:11:13.340
microcontroller and you rework it. So you
remove the microcontroller from the
00:11:13.340 --> 00:11:17.260
printed circuit board and you put on a new
one that you bought from a distributor.
00:11:17.260 --> 00:11:20.660
Once you've done that on the Trezor
devices you can put your compromised
00:11:20.660 --> 00:11:24.610
bootloader on there. So this is, I did not
go as far to make the compromised
00:11:24.610 --> 00:11:28.385
bootloader, but I did confirm that once I
switched the microcontroller, I could
00:11:28.385 --> 00:11:33.200
connect with a debugger over SWD and I
have free access to the chip. So some of
00:11:33.200 --> 00:11:39.190
the parts got blown off when I was
reworking but the SWD works fine. So yeah.
00:11:39.190 --> 00:11:43.140
So you just rework, reflash and then you
put everything back together. So next I
00:11:43.140 --> 00:11:47.360
want to talk about hardware implants. So
you may remember the story that came out
00:11:47.360 --> 00:11:51.390
there was this big hack by Bloomberg about
hardware implants. I wanted to make a
00:11:51.390 --> 00:11:55.570
hardware implant. I also wanted to have a
little bit of fun with this. So, we, in
00:11:55.570 --> 00:12:00.390
honor of the Bloomberg story which, as
you may know, has some issues.
00:12:00.390 --> 00:12:06.010
We're about to talk about the BloomBurglar
which is a super micro fun implant. So the
00:12:06.010 --> 00:12:10.010
goals for this implant: I wanted this
implant to happen after receipt. So it is
00:12:10.010 --> 00:12:14.325
both a supply chain attack and a physical
one like a red team can perform this. A
00:12:14.325 --> 00:12:18.970
malicious insider could also perform this
attack. Zero firmware, because more fun.
00:12:18.970 --> 00:12:22.980
It has to fit inside of a hardware wallet,
so it has to be small it has to also
00:12:22.980 --> 00:12:26.803
bypass the core security function,
otherwise it's not an implant. Very few
00:12:26.803 --> 00:12:31.870
components. I have a thousand of them with
me. So I wanted to enable makers
00:12:31.870 --> 00:12:37.927
and DIYers to participate in the hardware
implant fun. So what kind of implant did I
00:12:37.927 --> 00:12:42.100
end up with? Well, I decided to do a
basically, an RF-triggered switch and so
00:12:42.100 --> 00:12:47.115
the idea is on these devices there's a
button and the button is the last line of
00:12:47.115 --> 00:12:51.441
defense. So all the vendors assume that
the host is going to be compromised. They
00:12:51.441 --> 00:12:55.162
just assume that's going to be easy
because that's software. And so once you
00:12:55.162 --> 00:12:59.030
have a compromised host you have to send
it to the device and then the human -- so
00:12:59.030 --> 00:13:03.130
humans are still needed -- humans have to
look at it and say "Is this the right
00:13:03.130 --> 00:13:07.620
transaction or not?" They have to say yes
or no. So now with this implant I can,
00:13:07.620 --> 00:13:11.700
through RF, I can trigger the yes button.
So a human is not required to send
00:13:11.700 --> 00:13:15.366
transactions, I can remotely trigger it.
Basically the RF comes in through the
00:13:15.366 --> 00:13:19.040
antenna it goes through a single
transistor which is the main component and
00:13:19.040 --> 00:13:22.890
it pulls the button low. And I'm sorry to
say that the bill of materials is quite
00:13:22.890 --> 00:13:28.360
expensive at three dollars and 16 cents.
Two dollars and 61 cents of that is this
00:13:28.360 --> 00:13:33.780
potentiometer I had to use. So it is a
little bit expensive. I'm sorry. Also, why
00:13:33.780 --> 00:13:39.250
is this so big? I mean, this is an American
dime, I can fit two on it. What's the
00:13:39.250 --> 00:13:43.110
deal? Why is it so big? Well, I optimized
it for hand assembly. So it would be, you
00:13:43.110 --> 00:13:47.020
know, more fun to use, but basically you
put the antenna in and then there's an out
00:13:47.020 --> 00:13:51.056
button and, like I said, I have a thousand
with me. So just for scale. This is how it
00:13:51.056 --> 00:13:55.720
fits on the Ledger Nano. This is how it
fits on the Trezor. It is also because
00:13:55.720 --> 00:13:59.580
breadboard-friendly is a thing. So we made
it breadboard-friendly. So you can also
00:13:59.580 --> 00:14:04.230
play along very easily at home. So then
the last challenge with an RF implant is
00:14:04.230 --> 00:14:07.920
how do you design an antenna to fit in there?
And so the big thing there with an
00:14:07.920 --> 00:14:11.850
SMA connector is the first prototype
I did. I experimented with a few antenna
00:14:11.850 --> 00:14:15.920
designs, but remember, it all has
to fit inside the Ledger. So that's
00:14:15.920 --> 00:14:20.520
actually quite easy because a Ledger Nano
has plenty of room to insert extra
00:14:20.520 --> 00:14:26.630
circuitry, and so it fits quite easily in
the Ledger Nano. And then I did the
00:14:26.630 --> 00:14:30.066
implant and then I started to go through
the wallet process. I got to a
00:14:30.066 --> 00:14:34.680
check that said, you know, is the
Ledger device genuine. And here I actually
00:14:34.680 --> 00:14:39.626
got a little bit nervous because it wasn't
working. I was
00:14:39.626 --> 00:14:43.569
like, maybe they were checking this, you
know, how did they detect it? Don't worry,
00:14:43.569 --> 00:14:47.956
it's only Linux. So it just doesn't work
on Linux. So that was no problem. I did it
00:14:47.956 --> 00:14:52.492
on Windows and no problems. The device was
genuine, I was able to move on. So the
00:14:52.492 --> 00:14:56.480
thing is, this is a very crude receiver,
but the attacker can always use more
00:14:56.480 --> 00:15:02.200
power. So here I have this is my antenna
setup in the basement, and with a 50W
00:15:02.200 --> 00:15:06.710
transmitter I can remotely trigger the
button at 11 meters, and at this point I'm
00:15:06.710 --> 00:15:10.589
just limited by my basement size. I'm
pretty confident that I'd be able to
00:15:10.589 --> 00:15:16.550
remotely trigger this thing further. Yeah.
So here we're going to see a demo of what
00:15:16.550 --> 00:15:20.380
it looks like. And the other problem
you have with hardware implants is: how do
00:15:20.380 --> 00:15:24.356
you know which device has the implant? So
you have to label it some way. Ledger has
00:15:24.356 --> 00:15:29.290
this kind of Latin phrase that scrolls. I
wanted my own Latin phrase. And so this is
00:15:29.290 --> 00:15:33.310
how I know this is my implanted device. So
what we're going to see is that the
00:15:33.310 --> 00:15:37.188
transaction screen is gonna show up. This
is, and I'm basically going to trigger
00:15:37.188 --> 00:15:40.810
this remotely, so I'm going to show that
radio come in and then it's going to
00:15:40.810 --> 00:15:47.589
approve the transaction without any hands.
So this is the transaction. There is the
00:15:47.589 --> 00:15:51.949
screen going. This is the way it's supposed
to verify. There's the radio coming in at
00:15:51.949 --> 00:15:56.395
433 MHz and then it's going to proceed to
the next screen without me touching the
00:15:56.395 --> 00:16:02.259
button. There you go. So this is remotely
triggered, and that would have sent
00:16:02.259 --> 00:16:06.270
transactions. So if you think about the
context that you have a malicious software
00:16:06.270 --> 00:16:10.636
implant that sent it to a wrong address,
the attacker now can remotely accept that
00:16:10.636 --> 00:16:19.610
and bypass the security module.
Applause
00:16:19.610 --> 00:16:25.646
So, yeah, on the recaps, stickers are for
laptops, not for security. Supply chain
00:16:25.646 --> 00:16:29.967
attacks are very easy to do at a hardware
level, but they're quite hard to do at
00:16:29.967 --> 00:16:33.716
scale. And when the vendor says the device
is genuine, that may mean different
00:16:33.716 --> 00:16:42.778
things.
Thomas: To segue to the next part, so six
00:16:42.778 --> 00:16:47.790
months ago, Josh Datko said something that
I found kind of funny and it's almost
00:16:47.790 --> 00:16:52.620
correct: "If you put funny constants in
your code, they will end up on DEFCON
00:16:52.620 --> 00:16:56.740
slides, and they won't be laughing with
you." Small mistake, they won't end up at
00:16:56.740 --> 00:17:03.410
DEF CON, they will be at CCC. And so,
introducing the f00dbabe vulnerability,
00:17:03.410 --> 00:17:09.359
it's a bootloader vulnerability in a
Ledger Nano S. We did not come up with
00:17:09.359 --> 00:17:14.050
this constant. It's literally in the code
as we'll see later. So the name was not
00:17:14.050 --> 00:17:19.109
ours, but we like it. So we also bought
the domain foodba.be.
00:17:19.109 --> 00:17:23.934
Laughter
Ledger Nano S is a very simple wallet. It
00:17:23.934 --> 00:17:28.383
simply has a small display, it has a USB
port and two buttons. That's really all
00:17:28.383 --> 00:17:33.170
there is. And when you take it apart,
you see it's just some pieces of plastic,
00:17:33.170 --> 00:17:38.570
the display and the PCB. And looking at
the PCB, it kind of has an interesting
00:17:38.570 --> 00:17:44.770
architecture where you have a STM32, which
is just a general purpose microcontroller,
00:17:44.770 --> 00:17:50.250
and an ST31, which is a secure element that
is for example used in pay-TV and so on.
00:17:50.250 --> 00:17:56.290
It is regarded as a very high-security
chip, basically. And if you turn the PCB
00:17:56.290 --> 00:18:00.070
around, you'll see that they were nice
enough to leave the programming port for
00:18:00.070 --> 00:18:06.770
the STM32 open to us, ENABLED.
Laughter
00:18:06.770 --> 00:18:13.440
This had been suspected by other
people, and we verified it. But you know,
00:18:13.440 --> 00:18:17.810
you have to go through it. And obviously
Ledger is aware of this. And so let's look
00:18:17.810 --> 00:18:23.430
at the security model that the Ledger Nano
S has. The basic idea is that if we look
00:18:23.430 --> 00:18:28.700
at this device, we kind of have this
schematic of the STM32 being on the left
00:18:28.700 --> 00:18:33.640
and the ST31 on the right. And as you can
see, all peripherals are connected to the
00:18:33.640 --> 00:18:39.040
STM32. That is because the ST31 does not
have enough pins to connect peripherals.
00:18:39.040 --> 00:18:43.814
It literally only has a one-pin interface,
which is for the smartcard protocol,
00:18:43.814 --> 00:18:50.560
basically. And so all the heavy lifting is
done by the STM32. And Ledger splits it up
00:18:50.560 --> 00:18:56.590
into the unsecure part and the secure
part. And the idea is that the STM32 acts
00:18:56.590 --> 00:19:00.860
as a proxy. So it's basically the hardware
driver for the button, for the display,
00:19:00.860 --> 00:19:06.216
for the USB, similar to a northbridge in
your standard computer. And when you take
00:19:06.216 --> 00:19:11.160
a computer and want to make a transaction,
you create your transaction on the
00:19:11.160 --> 00:19:18.770
computer, it goes through USB to the
STM32, and the STM32 then forwards it to
00:19:18.770 --> 00:19:25.480
the ST31. The ST31 then says: oh, a new
transaction, I want the user to
00:19:25.480 --> 00:19:31.010
confirm it. So it sends a display command
to the STM32 which in turn displays that
00:19:31.010 --> 00:19:36.190
on the screen. And then you press the
"yes" button again it goes the same route
00:19:36.190 --> 00:19:41.430
to the ST31, which then internally signs
the transaction. So the seed never leaves
00:19:41.430 --> 00:19:47.470
the device, and the signed transaction
goes back through the STM, through USB to
00:19:47.470 --> 00:19:54.400
the computer. To us, this means if this
chip is compromised, we can send malicious
00:19:54.400 --> 00:20:01.500
transactions to the ST31 and confirm them
ourselves. Or we can even go and show a
00:20:01.500 --> 00:20:06.970
different transaction on the screen than
we are actually sending to the ST31. And
00:20:06.970 --> 00:20:11.570
Ledger is aware of this and we'll talk
about how they try to mitigate this later.
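The trust gap just described, where the ST31 only sees the outside world through the STM32, can be reduced to a tiny model. This is purely illustrative and not Ledger's firmware: the point is that the secure element signs the transaction it received, while the user approves the transaction they saw, and only the proxy knows whether those two are the same.

```python
def secure_element(tx, proxy_display):
    # Toy ST31: it can only reach the screen and buttons via the proxy,
    # so it must trust that proxy_display showed the real transaction.
    shown = proxy_display(tx)            # what the user actually sees
    user_approves = "yourself" in shown  # user OKs the benign-looking tx
    return "signed(%s)" % tx if user_approves else None

honest = lambda tx: tx                              # displays what it forwards
compromised = lambda tx: "pay 0.1 BTC to yourself"  # lies about the real tx

real_tx = "pay 10 BTC to attacker"
assert secure_element(real_tx, honest) is None  # user refuses what they see
assert secure_element(real_tx, compromised) == "signed(pay 10 BTC to attacker)"
```

With a reflashed STM32, the signature the ST31 produces is over a transaction the user never saw.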
00:20:11.570 --> 00:20:15.720
But first we have to find an exploit,
because while we do have debugging access
00:20:15.720 --> 00:20:22.250
to the chip, hardware access is sometimes
kind of buggy. No offence. So we wanted to
00:20:22.250 --> 00:20:26.550
have a software bug. And so we started
reverse engineering the firmware upgrade
00:20:26.550 --> 00:20:34.180
process. And when you look at the
bootloader, the bootloader for the Ledger
00:20:34.180 --> 00:20:38.530
used to be open-source, and back then they
didn't have any verification of the
00:20:38.530 --> 00:20:42.780
firmware. So you could basically boot the
device into bootloader mode, flash
00:20:42.780 --> 00:20:47.610
whatever from where you want, and then it
would run it. After someone, Saleem in
00:20:47.610 --> 00:20:51.730
this case, wrote about this, they changed
it, and they changed it to do some
00:20:51.730 --> 00:20:56.070
cryptographic measure. And we were too
lazy to reverse engineer the cryptographic
00:20:56.070 --> 00:21:00.730
measure because it's very time consuming,
very hard. So we looked more at the parts
00:21:00.730 --> 00:21:06.140
surrounding it and how we can maybe find a
bug in the bootloader to break it. And it
00:21:06.140 --> 00:21:14.130
turns out that when you try to upgrade
your Ledger, it accepts four different
00:21:14.130 --> 00:21:18.820
commands. One is select segment, which
allows you to select the address base at
00:21:18.820 --> 00:21:22.730
which your firmware will be flashed. One
is the load command, which allows you to
00:21:22.730 --> 00:21:27.570
write data to flash. Then you have the
flush command, which is basically like
00:21:27.570 --> 00:21:32.875
fsync on Linux and writes your changes to
the non-volatile memory. And you have the
00:21:32.875 --> 00:21:38.546
boot command, which verifies the flash
code and starts booting it. So to us the
00:21:38.546 --> 00:21:43.723
boot command is the most interesting,
because it performs all the verification and
00:21:43.723 --> 00:21:50.010
it attempts to ensure that no malicious
image is booted. And it turns out that if
00:21:50.010 --> 00:21:54.020
you issue the boot command, it verifies
the whole image with whatever
00:21:54.020 --> 00:21:59.421
cryptographic function they use, and
if it's successfully verified, they write
00:21:59.421 --> 00:22:08.640
a constant to the address 0x0800_3000, and
that constant is 0xF00DBABE. And so, to
00:22:08.640 --> 00:22:14.690
not have to verify the entire flash on
each boot, they just do this once, so only
00:22:14.690 --> 00:22:22.100
after firmware upgrade. So basically if
you boot up the ledger, it boots, it waits
00:22:22.100 --> 00:22:25.990
500 milliseconds. It checks if you have a
button pressed. If yes, it goes to
00:22:25.990 --> 00:22:32.584
bootloader. Otherwise it loads the
constant at 0x08003000. And if it's
00:22:32.584 --> 00:22:36.597
0xF00DBABE, it boots the firmware. So our
goal is to write a 0xF00DBABE to that
00:22:36.597 --> 00:22:43.600
address. First attempt, we just issue a
select segment command to exactly that
00:22:43.600 --> 00:22:51.564
address. We just write 0xF00DBABE to it,
flush and reset the device. Didn't work
00:22:51.564 --> 00:22:57.049
unfortunately. So we had to do more
reverse engineering. It turns out that
00:22:57.049 --> 00:23:02.100
they use an interesting approach to ensure
that you don't accidentally flash over the
00:23:02.100 --> 00:23:06.690
bootloader. So they basically blacklist a
whole memory region. So if you try to
00:23:06.690 --> 00:23:15.249
flash from 0x0800_0000 up to 0x0800_3000,
it returns an error. If you try to
00:23:15.249 --> 00:23:19.300
directly write 0xF00DBABE, they thought
about it, and they have a very specific
00:23:19.300 --> 00:23:25.736
code path to prevent that. So they memset
it to zero and you're screwed again. And
00:23:25.736 --> 00:23:31.740
then finally it writes, assuming you didn't
error out. But it turns out that the STM32
00:23:31.740 --> 00:23:36.950
has kind of an interesting memory map and
on a lot of chips, you cannot only map
00:23:36.950 --> 00:23:41.690
your flash to one address, but you can
also have it mapped to another address.
00:23:41.690 --> 00:23:50.990
And in this case the flash is indeed also
mapped to the address 0. And so the
00:23:50.990 --> 00:23:57.170
bootloader uses a blacklisting approach,
so it only excludes certain memory areas.
00:23:57.170 --> 00:24:01.220
But it doesn't use whitelisting where you
could only explicitly write to this memory
00:24:01.220 --> 00:24:08.700
region. So they do not block writing to
0x0000_0000. Profit! Second attempt. We
00:24:08.700 --> 00:24:15.401
just select the segment at 0x0000_3000,
which maps to 0x0800_3000, we write
00:24:15.401 --> 00:24:23.100
0xF00DBABE to it, we flush, reset, and we
can flash custom firmware! Awesome!
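The bypass just described can be summarized in a toy model. This is a hedged sketch, not Ledger's actual bootloader code: the addresses come from the talk, and the two checks are heavily simplified.

```python
# Toy model of the F00DBABE bypass described above. Addresses are
# from the talk; the checks are simplified and purely illustrative.

FLASH_BASE = 0x0800_0000   # STM32 flash as normally addressed
APP_BASE   = 0x0800_3000   # first application word holds the magic
MAGIC      = 0xF00DBABE

def bootloader_write(addr: int, value: int):
    """Model of the two protections: a blacklist over the bootloader
    region, plus a special case that zeroes the magic constant."""
    if FLASH_BASE <= addr < APP_BASE:      # blacklist, not whitelist
        return "rejected"
    if addr == APP_BASE and value == MAGIC:
        return 0                           # the memset-to-zero path
    return value                           # write goes through

# Direct write of the magic: caught and zeroed.
assert bootloader_write(APP_BASE, MAGIC) == 0
# Same flash word via the alias at 0x0000_3000 (flash is mirrored
# at address 0 on many STM32s): neither check fires.
assert bootloader_write(0x0000_3000, MAGIC) == MAGIC
```

Because the bootloader blacklists addresses rather than whitelisting them, the aliased view of the same physical flash word slips past both protections.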
00:24:23.100 --> 00:24:32.520
Applause
So what do you do when you have a device
00:24:32.520 --> 00:24:40.182
where the display is not big enough
to run Doom with a custom firmware? So in
00:24:40.182 --> 00:24:44.090
this case it's an original Ledger: press
the button, put it into bootloader mode,
00:24:44.090 --> 00:24:59.960
which is part of the normal operation, and
Laughter and Applause
00:24:59.960 --> 00:25:06.520
If you want to play a bit of snake, come
by later. How are they protecting against
00:25:06.520 --> 00:25:11.667
this? As I've mentioned before, Ledger is
aware that you can reflash this STM32. And
00:25:11.667 --> 00:25:16.400
they put in some measures to
prevent you from doing malicious stuff.
00:25:16.400 --> 00:25:20.870
And basically what they do and this is
very simplified, and we did not bother to
00:25:20.870 --> 00:25:26.630
fully reverse engineer because we didn't
need to, basically. When the chip boots,
00:25:26.630 --> 00:25:31.490
it sends its entire firmware to the ST31,
which then performs some kind of hashing
00:25:31.490 --> 00:25:36.700
and verifies that the firmware is
authentic. And it also measures the time
00:25:36.700 --> 00:25:40.760
it takes to send the firmware. This is to
prevent you from just running a
00:25:40.760 --> 00:25:48.772
compression algorithm on the STM32 and
sending it very slowly. How do we bypass
00:25:48.772 --> 00:25:55.716
this? So our idea was: we not only
have flash, we also have RAM. So what if
00:25:55.716 --> 00:26:04.161
we create a compromised and compressed
firmware that copies itself to RAM? We
00:26:04.161 --> 00:26:10.241
jump to it, and then it writes the entire
firmware back to flash,
00:26:10.241 --> 00:26:14.961
uncompressed this time, and then we
just call the original code on the secure
00:26:14.961 --> 00:26:21.290
element. It would verify the firmware, it
would run with real timing, and boot up
00:26:21.290 --> 00:26:28.000
normally. And so we attempted this. It
took quite a while to achieve.
00:26:28.000 --> 00:26:31.570
Because basically, you can't do ZIP, you
can't do LZMA, because even if you
00:26:31.570 --> 00:26:36.850
compress the image you don't have enough
space for a complex decompressor. So our
00:26:36.850 --> 00:26:41.960
attempt was to find duplicate bytes,
squeeze them together and make space for a
00:26:41.960 --> 00:26:46.390
custom payload. And basically we just have
a table that says, okay, now you will have
00:26:46.390 --> 00:26:52.590
six zeros or something. And each table
entry only takes a single byte. And
00:26:52.590 --> 00:26:56.610
it's only like 10 instructions in
assembler to run this decompressor, so you
00:26:56.610 --> 00:27:01.000
don't have the large code base. It's very
easy to use. And it turns out that even
00:27:01.000 --> 00:27:05.330
with a very simple detector, like in this
case we ran the script to find the
00:27:05.330 --> 00:27:10.750
longest duplicate data, and you can see on
the first try, we get like 260 bytes of
00:27:10.750 --> 00:27:17.219
space for our payload, which is enough for
a lot of things, let's say. And we have a
00:27:17.219 --> 00:27:22.330
working proof of concept of this, and we
would go into a lot of detail, but we
00:27:22.330 --> 00:27:27.450
only got an hour. And so we will release
after this talk a non-offensive
00:27:27.450 --> 00:27:31.850
example of this, so you can look at how
it works, what you can do even if
00:27:31.850 --> 00:27:37.170
your firmware is attempting to be
verified. And, this is very
00:27:37.170 --> 00:27:41.391
exciting, we are working with the YouTuber
LiveOverflow and he created a 20 minute
00:27:41.391 --> 00:27:46.919
video on walking through this entire
F00DBABE vulnerability, how the
00:27:46.919 --> 00:27:51.877
verification works and how to bypass it to
a certain degree. We don't want to
00:27:51.877 --> 00:27:56.920
weaponize it, so we will not
release the full thing, but
00:27:56.920 --> 00:28:02.676
yeah, very excited for this. Stay tuned on
our Twitter and we'll link it for sure. As
00:28:02.676 --> 00:28:06.393
part of this, we also have a lot of
software that we will release. So public
00:28:06.393 --> 00:28:10.320
release, we'll release the snake firmware.
So hopefully this evening you'll be able
00:28:10.320 --> 00:28:15.250
to play snake on your Ledger. If you
bought some bitcoin at twenty thousand now
00:28:15.250 --> 00:28:21.070
you're bankrupt, you can at least play
snake. We will open-source the compressor
00:28:21.070 --> 00:28:26.350
and the extractor. We built a logic
analyzer plugin for their custom protocol
00:28:26.350 --> 00:28:31.330
and we built software that analyzes the
communication between the STM32 and the
00:28:31.330 --> 00:28:36.900
ST31 on the Ledger into its specific data, and you
can just dump it. So if you guys are
00:28:36.900 --> 00:28:45.500
interested in, for example, trying to break
into the ST31, please have a go. And
00:28:45.500 --> 00:28:50.302
Ledger has a second device, which is
called the Ledger Blue. We assume the
00:28:50.302 --> 00:28:55.300
reason it's called the Ledger Blue is
because it contains Bluetooth. But they
00:28:55.300 --> 00:29:00.060
never enable Bluetooth. So it's basically
just a regular Ledger with a color display
00:29:00.060 --> 00:29:06.240
and a big battery in it. And we call this
part "Fantastic Signals and how to find
00:29:06.240 --> 00:29:10.210
them".
Laughter
00:29:10.210 --> 00:29:14.980
Because when we opened up this device and
we were chatting, we have this nice
00:29:14.980 --> 00:29:20.530
telegram chat room where we're chatting
24/7 while doing this. And we opened up
00:29:20.530 --> 00:29:24.460
the device and the first thing, like
literally five minutes after opening it, I
00:29:24.460 --> 00:29:29.653
saw that you have the secure element on
the left and the STM32 on the right. You
00:29:29.653 --> 00:29:36.220
have some other stuff like the Bluetooth
module and so on. The trace between the
00:29:36.220 --> 00:29:42.445
secure element and the microcontroller is
pretty long and contains a pretty fast
00:29:42.445 --> 00:29:50.590
signal. So what is a long conductor with a
fast-changing current? Anyone got a clue?
00:29:50.590 --> 00:29:54.844
Interjection
Correct. It's an antenna.
00:29:54.844 --> 00:30:02.550
So I pulled out my HackRF
software defined radio, this
00:30:02.550 --> 00:30:08.470
is just a more sophisticated RTL-
SDR, so you can just sniff arbitrary
00:30:08.470 --> 00:30:13.600
signals with it. I got a random shitty
telescope antenna on Amazon, and here we have
00:30:13.600 --> 00:30:20.330
my Ledger Blue. So on this screen, you can
see the blue thing is the radio spectrum
00:30:20.330 --> 00:30:26.790
around 169 MHz and if we start entering
our pin we can see that there's a weak
00:30:26.790 --> 00:30:29.960
signal.
Laughter
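As a rough sanity check on why these conductors radiate: at 169 MHz the wavelength is about 1.8 m, so a quarter-wave conductor is about 44 cm, which is USB-cable territory. A quick back-of-the-envelope calculation (rough numbers only):

```python
# Why a long trace or a USB cable radiates at 169 MHz: the
# wavelength is ~1.8 m, so a quarter-wave conductor is ~44 cm,
# i.e. cable-sized. Rough numbers only.

C = 299_792_458.0  # speed of light in m/s

def wavelength_m(freq_hz: float) -> float:
    return C / freq_hz

lam = wavelength_m(169e6)
assert 1.7 < lam < 1.8        # roughly 1.77 m
assert 0.40 < lam / 4 < 0.45  # quarter wave, about 44 cm
```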
00:30:29.960 --> 00:30:37.670
You guys see where this is going. On the
radio. Unfortunately that signal is pretty
00:30:37.670 --> 00:30:46.410
weak. Luckily they included an antenna.
They call it a USB cable, but I'm not so
00:30:46.410 --> 00:30:53.610
sure about it. So this time with USB
connected, and we do the same thing again.
00:30:53.610 --> 00:31:00.004
You can see like crazy radio spikes and
this is right next to each other. But even
00:31:00.004 --> 00:31:05.820
if you go a couple of meters. I was
limited, like Josh, by my living room space.
00:31:05.820 --> 00:31:11.950
You get a couple of meters of decent
reception. So our goal was to find out
00:31:11.950 --> 00:31:17.970
what is this signal and if we just look at
the recorded amplitude of the signal, we
00:31:17.970 --> 00:31:23.291
get this. And if you do a lot of probing
and so on, you immediately see, ok, there
00:31:23.291 --> 00:31:29.380
are spikes and there are 11 of them and
then there's a pause and then there are small
00:31:29.380 --> 00:31:34.760
spikes. So this is probably some kind of
protocol that first sends 11 bytes of data
00:31:34.760 --> 00:31:39.415
then pauses, and then sends more data. So
we looked at the back of the device,
00:31:39.415 --> 00:31:43.571
started probing every single connection
and tried to figure out is this the secure
00:31:43.571 --> 00:31:50.210
element? Is this whatever? And it turned
out to be the display bus. So we can sniff
00:31:50.210 --> 00:31:56.830
information on what is sent to the display
remotely. And if we look at the
00:31:56.830 --> 00:32:01.040
signal that gets sent in blue, it's the
signal that gets sent when you press the
00:32:01.040 --> 00:32:07.010
digit zero on the pin pad, and in orange
when you press the digit seven. So we can
00:32:07.010 --> 00:32:11.304
see a very clear difference at certain
points on the signal which confirmed our
00:32:11.304 --> 00:32:16.850
suspicion. But building software for this
is kind of boring, like digital signal
00:32:16.850 --> 00:32:22.380
processing is not really my thing. So what
do we do? And we wanted to increase the
00:32:22.380 --> 00:32:29.140
buzzword load in our talk a bit. And so we
are hacking blockchain IoT devices, using
00:32:29.140 --> 00:32:40.934
artificial intelligence, in the cloud.
Applause and Laughter
00:32:40.934 --> 00:32:47.660
So our idea was: we record training
signals, we use some kind of prefiltering,
00:32:47.660 --> 00:32:55.130
we train AI on it. Profit! Literally.
Problem is, getting training data really
00:32:55.130 --> 00:32:59.309
sucks, because you don't want to sit there
for 10 hours pressing the same key on a
00:32:59.309 --> 00:33:06.094
pin pad. It really doesn't sound like fun.
And so this needs automation. So,
00:33:06.094 --> 00:33:11.700
Laughter
So we took an Arduino, we took a roll of
00:33:11.700 --> 00:33:17.560
masking tape, a piece of acrylic glass, a
PCB vice and this is a Huawei pen for the
00:33:17.560 --> 00:33:24.937
extra amount of Chinese backdoor. And we
let this run for a couple of hours. And
00:33:24.937 --> 00:33:32.360
you can actually see that every time it
presses down, you can see that the digit
00:33:32.360 --> 00:33:37.480
that you pressed is highlighted and the
difference in the signal we saw earlier is
00:33:37.480 --> 00:33:42.600
probably the x and y coordinate, of where
it highlights the button. And that's the
00:33:42.600 --> 00:33:51.065
difference. We can see in the trace. And
so we had a lot of recorded data. Now we
00:33:51.065 --> 00:33:58.241
created a training set. We created a test
set, did preprocessing, and a TensorFlow model.
00:33:58.241 --> 00:34:05.188
It's surprisingly easy. And we
ran our test set through a prediction. And
00:34:05.188 --> 00:34:10.360
so the big question: how accurate is it?
And it turns out. So this is the
00:34:10.360 --> 00:34:16.530
result of part of the test set. And if we
zoom in on this, it basically tells you
00:34:16.530 --> 00:34:21.594
we have the signal, this grey thing,
it's just a picture representation of the
00:34:21.594 --> 00:34:28.550
signal and it tells you how sure it is,
what digit it is. In this case it's 7 with
00:34:28.550 --> 00:34:35.168
98 percent likelihood. So pretty good. In
our test set we only have one wrong result
00:34:35.168 --> 00:34:40.760
and overall we get around 90 percent
accuracy. And to move this into the cloud, we
00:34:40.760 --> 00:34:46.699
are hosting this on Google Cloud as
the LedgerAI for you guys to play with and
00:34:46.699 --> 00:34:51.415
we'll publish it online with a limited
dataset that is trained on a very confined
00:34:51.415 --> 00:34:56.020
space. You cannot do something super
malicious with it but it's nice to play
00:34:56.020 --> 00:35:01.510
around and see how this was done. And this
brings us to the next part, glitch me if
00:35:01.510 --> 00:35:11.770
you can. Thank you.
Applause
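Their actual model was TensorFlow running in the cloud; as a hedged, dependency-free stand-in, a nearest-centroid classifier over amplitude traces (toy data, illustrative only, not their pipeline) captures the core idea of matching a captured trace against per-digit templates:

```python
# Dependency-free stand-in for the cloud TensorFlow classifier
# (toy data, illustrative only): average training traces per digit
# into templates, then label a new trace by the nearest template.

def train(traces_by_digit):
    templates = {}
    for digit, traces in traces_by_digit.items():
        n = len(traces)
        # column-wise mean over all training traces for this digit
        templates[digit] = [sum(col) / n for col in zip(*traces)]
    return templates

def classify(templates, trace):
    def sqdist(tpl):
        return sum((a - b) ** 2 for a, b in zip(tpl, trace))
    return min(templates, key=lambda d: sqdist(templates[d]))

# Hypothetical amplitude traces for two PIN-pad digits.
training = {
    7: [[0.1, 0.9, 0.1], [0.2, 0.8, 0.2]],
    0: [[0.9, 0.1, 0.9], [0.8, 0.2, 0.8]],
}
templates = train(training)
assert classify(templates, [0.15, 0.85, 0.15]) == 7
assert classify(templates, [0.85, 0.15, 0.85]) == 0
```

The real traces differ per key because the display bus carries the x/y coordinates of the highlighted button, as described above.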
00:35:11.770 --> 00:35:17.024
Josh: So now we're going to talk about the
silicon-level vulnerability with glitching
00:35:17.024 --> 00:35:21.342
attacks, fault injection. So to review,
I will be talking about the
00:35:21.342 --> 00:35:25.530
Trezor One. And so I just want to go over
very quickly what the architecture is of
00:35:25.530 --> 00:35:31.910
the trezor one and some previous work that
was done. So the Trezor One is quite a
00:35:31.910 --> 00:35:37.800
simple embedded device. It consists of
only a few components. It has an OLED
00:35:37.800 --> 00:35:44.062
display, it has some buttons, and a USB
connector that are all externally facing.
00:35:44.062 --> 00:35:53.619
Internally it has its main brain, if you
will, the STM32F205 microcontroller which
00:35:53.619 --> 00:35:57.923
controls all the other operations on the
Trezor: the display, the USB, and the two
00:35:57.923 --> 00:36:05.130
buttons. So last year we gave a talk at
DEFCON "Breaking Bitcoin Hardware Wallets"
00:36:05.130 --> 00:36:09.549
here we used the ChipWhisperer to mainly do
the glitching attacks, the conclusions
00:36:09.549 --> 00:36:16.400
from last year is that the F205 was
vulnerable to fault injection but it was
00:36:16.400 --> 00:36:21.498
inconclusive if we could do an exploit via
the fault. So this year we have a
00:36:21.498 --> 00:36:27.468
different result but the output of that
work was this board, which was
00:36:27.468 --> 00:36:29.200
called the Breaking Bitcoin board.
00:36:29.200 --> 00:36:34.130
Basically it was a Trezor clone that just
made it easy to attach wires and probes
00:36:34.130 --> 00:36:38.517
and so we made this board. The design
schematics are all online. It's open
00:36:38.517 --> 00:36:42.970
source hardware. This is the ChipWhisperer
setup that we were using, so we
00:36:42.970 --> 00:36:47.257
made the board specifically to fit on the
ChipWhisperer target board. And this is
00:36:47.257 --> 00:36:51.739
just what it looks like when you use the
ChipWhisperer GUI to perform a glitch. And
00:36:51.739 --> 00:36:56.442
here we were doing application-level code,
so it's very different but I gave that
00:36:56.442 --> 00:37:07.020
talk and then I met Dmitry and Thomas.
Dmitry: Fortunately we had Josh to do the
00:37:07.020 --> 00:37:11.690
talk last year and to kind of exhaust a
lot of the firmware vulnerabilities that
00:37:11.690 --> 00:37:15.570
were actually hardware vulnerabilities in
the firmware that might have been there.
00:37:15.570 --> 00:37:19.750
So we immediately knew that we could
exclude this. And so you can start looking
00:37:19.750 --> 00:37:23.990
at the underlying microcontrollers. So in
this case it's STM32 microcontroller that
00:37:23.990 --> 00:37:28.556
they use inside of it and it controls
everything. So compromising the STM32
00:37:28.556 --> 00:37:33.430
microcontroller means that you can
compromise, you can compromise the device.
00:37:33.430 --> 00:37:38.690
So I mean so there's a couple of papers
that have covered some of the
00:37:38.690 --> 00:37:42.800
vulnerabilities in the STM32 specifically
there's one which describes a UV attack
00:37:42.800 --> 00:37:49.090
which lets you downgrade the security on
the STM32. So we determined that paper
00:37:49.090 --> 00:37:53.880
unfortunately does not apply to our result
because the Trezor is smart enough when
00:37:53.880 --> 00:37:58.670
it boots to check the value stored in
Flash. And if it has been altered in any
00:37:58.670 --> 00:38:03.516
way, to set it back correctly. So they actually
even protect against this kind of attack.
00:38:03.516 --> 00:38:08.190
But nevertheless you can see that there are
some vulnerabilities. So there is another
00:38:08.190 --> 00:38:12.310
paper which unfortunately has not been
published yet and we couldn't get in touch
00:38:12.310 --> 00:38:15.880
with the authors yet. That should be
coming out in January hopefully which
00:38:15.880 --> 00:38:23.380
describes glitches against the STM32 F1
and STM32 F3. So now we have the F0, the
00:38:23.380 --> 00:38:30.250
F1, and the F3 and so basically here's the
product matrix. So three of them are
00:38:30.250 --> 00:38:37.530
already vulnerable. So what we're looking
at is the STM32 F2 and potentially the STM32 F4 if
00:38:37.530 --> 00:38:43.160
we're talking about the Trezor model T so
those we do not have vulnerabilities for
00:38:43.160 --> 00:38:49.738
yet. So let's take a look at how it
works really quickly. So the way that ST
00:38:49.738 --> 00:38:56.010
implements security on the STM32 is that
they store an option byte, and the option
00:38:56.010 --> 00:39:02.119
byte, the thing to remember is, on a
Cortex-M3 or M4 microcontroller you
00:39:02.119 --> 00:39:06.010
don't have anything other than flash. So
even though they call it an option byte, or
00:39:06.010 --> 00:39:10.130
refer to this as fused or being
permanent in hardware, it's still stored
00:39:10.130 --> 00:39:14.180
in flash, just like the user application
is stored in flash. So it's the exact
00:39:14.180 --> 00:39:19.390
same non-volatile memory that's otherwise
used. So basically when you get a new
00:39:19.390 --> 00:39:24.400
STM32, it's shipped in a state where you have
full access. So that's how Josh was able
00:39:24.400 --> 00:39:30.530
to rework a board and flash it with new
firmware. And the ultimate
00:39:30.530 --> 00:39:35.650
security is what's called RDP2. So there
you have no access but you can see that
00:39:35.650 --> 00:39:43.550
basically if you have a value other than
0xAA or 0xCC, which correspond to RDP0 and RDP2
00:39:43.550 --> 00:39:48.670
respectively then you have what's called
RDP1 and this is interesting because it
00:39:48.670 --> 00:39:52.590
doesn't give you access to the flash which
is actually where the cryptographic seed
00:39:52.590 --> 00:39:57.050
is stored on the Trezor but it gives you
access to RAM, it gives you access to the
00:39:57.050 --> 00:40:01.440
registers but it doesn't give you flash
access like I said and it doesn't give you
00:40:01.440 --> 00:40:05.350
single stepping as well so connecting a
debugger in this mode will actually cause
00:40:05.350 --> 00:40:10.480
the hardware to hard fault which we'll see
in a second. So basically what we want
00:40:10.480 --> 00:40:16.099
to try to do is to downgrade RDP2 which is
what the Trezor is set to. And we want
00:40:16.099 --> 00:40:24.450
to be able to access the device at RDP1
which is a somewhat vulnerable state. This
00:40:24.450 --> 00:40:29.160
so I should say that this is the
correct way to approach this and it's
00:40:29.160 --> 00:40:35.270
great for doing an educational talk. But
in all honesty there's three of us. And so
00:40:35.270 --> 00:40:39.859
we did this completely in the dark, over
3 months, trying different
00:40:39.859 --> 00:40:44.330
parameters on our glitch setups,
and we were later able to find
00:40:44.330 --> 00:40:49.740
this. But I'm here to explain it to all of
you so that it's easy to reproduce. So if
00:40:49.740 --> 00:40:53.640
you actually watch the STM32F2 boot
you'll see that it's relatively slow and
00:40:53.640 --> 00:40:57.780
it's only this slow after you power cycle
the board. So it takes approximately
00:40:57.780 --> 00:41:02.340
1.8 milliseconds to boot which is
in microcontroller terms pretty slow, so you
00:41:02.340 --> 00:41:06.170
can see there's the power supply there's
the IO pin and that's approximately how
00:41:06.170 --> 00:41:10.720
long it takes to boot the firmware so you
can see that's where the IO actually
00:41:10.720 --> 00:41:16.250
toggles so 120 milliseconds later. So we
just wrote some firmware to basically
00:41:16.250 --> 00:41:20.130
toggle one of the pins measured within an
oscilloscope. Now we have the timing of
00:41:20.130 --> 00:41:24.830
how long that takes. So that's not super
interesting because that's not really a
00:41:24.830 --> 00:41:29.010
trigger. And each one of these
microcontrollers internally has a boot
00:41:29.010 --> 00:41:34.790
ROM, so it has some ROM, read-only
memory. It's not the flash; it's
00:41:34.790 --> 00:41:40.619
literally a ROM which is inside the chip
itself. It's hard-coded, it cannot be
00:41:40.619 --> 00:41:45.589
fixed or patched, and it
gets executed first. So we wanted to
00:41:45.589 --> 00:41:49.510
actually attack that because anything else
is the user application and that's what
00:41:49.510 --> 00:41:54.340
Josh did last year. So you can kind of
start to whittle this down. So you see that
00:41:54.340 --> 00:41:58.600
for 1.4 milliseconds of the boot
nothing actually happens because this is
00:41:58.600 --> 00:42:02.450
now the reset line. And so the reset line
goes high after 1.4 milliseconds,
00:42:02.450 --> 00:42:06.060
so you can ignore the first
1.4 milliseconds after you
00:42:06.060 --> 00:42:10.560
cycle the power. So now the next step that
you can actually do is you can connect
00:42:10.560 --> 00:42:15.870
what's called a shunt resistor. So
oscilloscopes are there to measure
00:42:15.870 --> 00:42:19.349
voltage and so you want to actually
measure current to be able to know how
00:42:19.349 --> 00:42:23.450
much power is being consumed
by the device. So you do what's called
00:42:23.450 --> 00:42:26.721
a shunt measurement and that's
what I have on this slide right here.
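The shunt measurement boils down to Ohm's law: the scope reads the voltage drop across a small known resistor placed in series with the target, and I = V/R recovers the current. The values below are illustrative, not taken from the talk:

```python
# A shunt measurement is just Ohm's law: the scope measures the
# voltage drop across a small series resistor, and I = V / R gives
# the current the target draws. Example values are made up.

def shunt_current(v_drop_volts: float, r_shunt_ohms: float) -> float:
    return v_drop_volts / r_shunt_ohms

# e.g. 22 mV across a 10-ohm shunt means the chip draws 2.2 mA
assert abs(shunt_current(0.022, 10.0) - 0.0022) < 1e-12
```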
00:42:26.721 --> 00:42:30.839
So you have the blue signal is now
actually the power consumption. And so now
00:42:30.839 --> 00:42:34.640
you can actually look and see what's
happening. So the first thing that happens
00:42:34.640 --> 00:42:38.859
is we have the execution of the BootROM.
You can see in the power consumption curve
00:42:38.859 --> 00:42:44.440
you can clearly see this moment in time.
Then you have basically where the flash
00:42:44.440 --> 00:42:49.209
and option bytes actually get read
somewhat at least within the BootROM. And
00:42:49.209 --> 00:42:53.620
finally the third distinctive moment in
time is where the application actually
00:42:53.620 --> 00:42:58.240
begins to execute. So now we've taken this
1.8 milliseconds which is a
00:42:58.240 --> 00:43:03.200
relatively long time and reduced it to the 200
microseconds we're actually interested
00:43:03.200 --> 00:43:07.910
in. And not only that, we know that we're
actually interested in having slightly
00:43:07.910 --> 00:43:12.650
higher power consumption than the normal
execution of the bootloader or the BootROM
00:43:12.650 --> 00:43:19.230
rather and this is somewhere between
let's say 170 microseconds and 200
00:43:19.230 --> 00:43:23.760
microseconds. So this is the time at which
we actually need to glitch and this is
00:43:23.760 --> 00:43:28.460
also reasonable parameters. If you're
trying to reproduce this at home. So what
00:43:28.460 --> 00:43:33.570
do you need to reproduce this thing?
The greatest thing that came out in the
00:43:33.570 --> 00:43:38.660
last couple of years is these cheap
Chinese power supplies where you take a
00:43:38.660 --> 00:43:43.600
cheap you know old wall wart from one of
your old Linksys routers you plug it in
00:43:43.600 --> 00:43:48.860
and then you actually have a controllable
power supply with voltage and current
00:43:48.860 --> 00:43:53.460
and you can adjust this and control this.
And so that's what we're using here. The
00:43:53.460 --> 00:43:56.645
second thing, which I use to actually
00:43:56.645 --> 00:44:01.390
control the timing is an FPGA. I mean I
use FPGAs for everything and this is
00:44:01.390 --> 00:44:05.849
something that was easiest to put together
with an FPGA because FPGAs have constant
00:44:05.849 --> 00:44:11.471
timing. So finally we have a multiplexer
there as well and the multiplexers are
00:44:11.471 --> 00:44:16.750
actually switching between two voltages:
ground, so completely cutting the
00:44:16.750 --> 00:44:21.260
voltage off and the normal operating
voltage of the microcontroller. And
00:44:21.260 --> 00:44:27.310
finally we have a debugger, the J-Link,
which is highly advised if you want to
00:44:27.310 --> 00:44:33.310
ever do JTAG stuff. So it's just a JTAG
debugger and basically what happens is
00:44:33.310 --> 00:44:39.540
you let this run for a while and it looks
like this. It's not really super eventful
00:44:39.540 --> 00:44:43.590
so you can see that the voltage the yellow
signal is actually the voltage and you can
00:44:43.590 --> 00:44:46.770
see we're just dipping the voltage at
different points in time and
00:44:46.770 --> 00:44:51.630
simultaneously we have a python script
checking if we have JTAG access or not.
00:44:51.630 --> 00:44:56.680
Protip to all the new dads if you do this
at home you can turn your oscilloscope
00:44:56.680 --> 00:45:00.499
towards the door, so that when you get up
at night because the baby's crying, you
00:45:00.499 --> 00:45:06.060
can see if it's still running or not. So
it's very, it's highly advised. So now
00:45:06.060 --> 00:45:10.900
Thomas is going to tell us how to get the
seed into RAM.
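The RDP scheme described above can be summarized in code. Per ST's reference manuals, 0xAA selects level 0 and 0xCC level 2; any other option-byte value falls back to level 1, which is why a glitch only needs to corrupt the read somehow rather than hit a precise value. This is a simplified model, not the BootROM's actual logic:

```python
# Simplified model of STM32 readout protection (per ST's reference
# manuals, not the BootROM's actual code): 0xAA selects RDP level 0,
# 0xCC selects level 2, and *any* other option-byte value is level 1,
# which still exposes RAM and registers over the debug port.

def rdp_level(option_byte: int) -> int:
    if option_byte == 0xAA:
        return 0        # fully open
    if option_byte == 0xCC:
        return 2        # fully locked (the Trezor's setting)
    return 1            # everything else: the exploitable middle state

assert rdp_level(0xAA) == 0
assert rdp_level(0xCC) == 2
# A glitch only has to corrupt the option-byte read; all 254 other
# values land in RDP1:
assert all(rdp_level(b) == 1 for b in range(256) if b not in (0xAA, 0xCC))
```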
00:45:10.900 --> 00:45:17.500
Thomas: So we had this thing running for
3 months roughly across 3
00:45:17.500 --> 00:45:22.020
continents because Josh is in America,
Dmitry is in Russia and I'm in Germany and
00:45:22.020 --> 00:45:26.610
so it took us 3 months to get a
successful glitch and even then we didn't
00:45:26.610 --> 00:45:32.090
believe it at first because we exhausted
everything basically. And the only reason
00:45:32.090 --> 00:45:39.290
we finally got it working is that we made a
mistake where we mistook 70 µs for
00:45:39.290 --> 00:45:43.710
170 µs and had it run for a long time. And
that's how we found out that the BootROM
00:45:43.710 --> 00:45:48.920
is actually super slow to boot on this
device. And once we had this downgrade
00:45:48.920 --> 00:45:56.810
from RDP2 to RDP1, we were able to read
the RAM, but we could not read the flash
00:45:56.810 --> 00:46:03.700
which actually contains the seed. And so
how do we find this? And our idea was we
00:46:03.700 --> 00:46:08.540
start reviewing the upgrade procedure
because on the Trezor, the way the
00:46:08.540 --> 00:46:12.780
bootloader works is, it doesn't require a
PIN or anything to upgrade the firmware,
00:46:12.780 --> 00:46:16.390
which makes sense, because let's say you
have a bug in the PIN function, you want to
00:46:16.390 --> 00:46:21.609
somehow be able to get rid of it, right?
And the other thing is if you flash a
00:46:21.609 --> 00:46:28.970
fully valid firmware it retains the data,
it retains your seed. If you flash a non-
00:46:28.970 --> 00:46:35.170
genuine one, it actually will erase your
seed and so on. And they do a
00:46:35.170 --> 00:46:38.990
really good job on the firmware
verification. We reviewed it for days and
00:46:38.990 --> 00:46:43.640
days and days and didn't find anything.
But so how does this upgrade procedure
00:46:43.640 --> 00:46:48.220
work? How is this seed retained? And so
when you review the relevant code you
00:46:48.220 --> 00:46:54.310
see that there is a call to backup
metadata which sounds like it's going to
00:46:54.310 --> 00:46:59.859
retain somehow your data. And indeed you
can see that it's literally a memcpy
00:46:59.859 --> 00:47:06.030
from the data in flash we're interested in
into RAM. And so our basic procedure
00:47:06.030 --> 00:47:11.630
was, we go into bootloader we start the
firmware upgrade and we stop it before the
00:47:11.630 --> 00:47:16.599
RAM gets cleared. Because if you finish
the upgrade procedure, the Trezor actually
00:47:16.599 --> 00:47:22.390
clears its memory again, which is a very
decent way to do it. But we've found a way
00:47:22.390 --> 00:47:25.810
to retain it in RAM. So it turns out that
when you start the firmware upgrade
00:47:25.810 --> 00:47:32.880
process, it eventually asks you to verify
the checksum of what you just flashed, and
00:47:32.880 --> 00:47:38.540
it turns out that at this point in time,
the seed is still in RAM and we can read
00:47:38.540 --> 00:47:47.410
it out via RDP1. And this is relatively
simple to do once you actually manage to
00:47:47.410 --> 00:47:51.780
glitch the device. You basically just run
openocd dump_image, you get an image of
00:47:51.780 --> 00:47:57.060
the SRAM and you have the whole RAM
contents and so on.
00:47:57.060 --> 00:48:04.330
Dmitry: What are we going to do, Thomas?
What high-tech hacking tool will we be using
00:48:04.330 --> 00:48:09.630
today to extract the seed?
Thomas: So actually, before we were
00:48:09.630 --> 00:48:14.310
successful, we had hours of talks on
how this seed is stored and so
00:48:14.310 --> 00:48:18.880
on. But we've found this super
sophisticated seed extraction tool that
00:48:18.880 --> 00:48:26.010
only runs on POSIX and POSIX-like systems,
it's called strings.
00:48:26.010 --> 00:48:30.140
Laughter
And so basically it turns out that when
00:48:30.140 --> 00:48:37.640
you have a firmware dump, or a RAM
00:48:30.140 --> 00:48:37.640
dump as we do now, we just
run strings on the dump. We get a couple
of really nice words and I don't know if
00:48:43.550 --> 00:48:49.340
you remember the intro, but this is your
seed.
00:48:49.340 --> 00:48:55.600
Applause
And you might be wondering what this
00:48:55.600 --> 00:48:59.900
little number is. This is your PIN to the
device.
00:48:59.900 --> 00:49:09.020
Laughter
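Why `strings` suffices: the seed words sit in the SRAM dump as plain ASCII. A minimal reimplementation of what the tool does, run on made-up sample bytes rather than a real dump:

```python
# Minimal reimplementation of the `strings` idea: pull printable
# ASCII runs of four or more characters out of a binary blob.
# Sample bytes below are invented, not from a real SRAM dump.

def extract_strings(blob: bytes, min_len: int = 4):
    found, run = [], bytearray()
    for b in blob:
        if 0x20 <= b < 0x7F:          # printable ASCII, incl. space
            run.append(b)
        else:
            if len(run) >= min_len:
                found.append(run.decode("ascii"))
            run = bytearray()
    if len(run) >= min_len:           # flush a trailing run
        found.append(run.decode("ascii"))
    return found

dump = b"\x00\x13abandon ability zoo\xff\x02pin:1234\x00"
assert "abandon ability zoo" in extract_strings(dump)
assert "pin:1234" in extract_strings(dump)
```

Because BIP39 seed words are space-separated dictionary words, they come out of a memory dump as one long, very recognizable printable run.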
That was a great day. And so Josh, or one
00:49:09.020 --> 00:49:16.369
of Josh's employees took all this mess we
created on our desks and made it into this
00:49:16.369 --> 00:49:23.600
nice device which is basically a socket
where you put in your chip and then we can
00:49:23.600 --> 00:49:28.480
read out the seed and so on.
Dmitry: And all of this stuff including
00:49:28.480 --> 00:49:32.600
the board design, FPGA codes, and the
Verilog code that we used, I mean if
00:49:32.600 --> 00:49:36.810
somebody wants to, they can apply it and
do the same thing with one of the ICEPICKs
00:49:36.810 --> 00:49:41.060
or one of the more open source friendly
FPGA boards. This just happens to be the
00:49:41.060 --> 00:49:46.710
one that we all had lying around and could
reproduce the work with. You can go ahead
00:49:46.710 --> 00:49:50.848
and do it. I mean we suspect, I think
Thomas said, we suspect you might be able
00:49:50.848 --> 00:49:54.910
to do with Arduino as well, because the
actual glitch pulse is only approximately
00:49:54.910 --> 00:50:02.061
60 μs or sorry, 6 μs in time. So it's a
relatively slow signal as well, so it
00:50:02.061 --> 00:50:08.310
should be relatively repeatable even with
something cheaper than this. But this is a
00:50:08.310 --> 00:50:12.200
way to automate this even better and to
not have dangling wires or any of the
00:50:12.200 --> 00:50:16.870
small soldering that was required to do it
in situ in the device which we had on the
00:50:16.870 --> 00:50:22.109
previous slide. So all of that we're going
to have it on GitHub. And so I think the
00:50:22.109 --> 00:50:28.080
final, the final thing.
Thomas: One more thing before we are,
00:50:28.080 --> 00:50:35.550
sorry. One more thing. So this breaks a
lot of the Trezor security, but there is
00:50:35.550 --> 00:50:41.060
a way to protect your seed against this.
So if you use a passphrase on your device,
00:50:41.060 --> 00:50:46.520
the way we understood it, it basically
doesn't allow somebody with hardware
00:50:46.520 --> 00:50:51.850
access to steal all your funds. So if you
add a passphrase to your Trezor, a good
00:50:51.850 --> 00:50:57.760
passphrase and your machine is not already
owned, you can somewhat protect
00:50:57.760 --> 00:51:03.220
against this. But a lot of people don't.
So we are really sorry we didn't mean any
00:51:03.220 --> 00:51:08.520
harm.
Dmitry: So yeah, that's the conclusion I
00:51:08.520 --> 00:51:14.310
would say. So yeah, I mean, all the
stuff we're going to put online, like I
00:51:14.310 --> 00:51:20.590
said, so you can follow us for the links
online. wallet.fail, it's a domain
00:51:20.590 --> 00:51:26.420
name; believe it or not, .fail is a TLD. So
you can go to github.com/walletfail,
00:51:26.420 --> 00:51:32.600
twitter.com/walletfail. You can follow me,
Thomas, and Josh on Twitter as well and
00:51:32.600 --> 00:51:36.710
like I said, we'll be releasing all this
stuff so it will go up slowly. Just
00:51:36.710 --> 00:51:40.730
because I think when we set out six months
ago we did not expect to have 100
00:51:40.730 --> 00:51:45.619
percent success in everything that we were
planning to do. So that's a first for me
00:51:45.619 --> 00:51:48.420
at the very least.
Thomas: The saddest part is that we have
00:51:48.420 --> 00:51:54.950
more vulnerabilities for other wallets,
but only one hour. And so we also have
00:51:54.950 --> 00:51:58.720
some stuff to give out so we have the
hardware implant PCBs, we have thousands
00:51:58.720 --> 00:52:01.810
of them if you want to get some.
Dmitry: Off to Josh.
00:52:01.810 --> 00:52:08.810
Thomas: We even have components for them
for like 100 devices so hit us up and we
00:52:08.810 --> 00:52:11.020
can do something. Thank you.
00:52:11.020 --> 00:52:21.960
Applause
00:52:21.960 --> 00:52:25.650
Herald: Thank you guys, that was an amazing
talk. I feel really inspired to break
00:52:25.650 --> 00:52:30.630
things apart in a very creative way. We
have some time left for questions. So if
00:52:30.630 --> 00:52:34.720
you have questions, please line up at the
microphones. But first we're going to
00:52:34.720 --> 00:52:37.299
start with a question from the Internet.
00:52:37.299 --> 00:52:40.239
Signal Angel: Thank you,
I've got two related
00:52:40.239 --> 00:52:44.240
questions from the internet. First one,
how hard did you guys laugh when Bitfi
00:52:44.240 --> 00:52:50.599
announced that their Android-based wallet
was unhackable? And second question, have
00:52:50.599 --> 00:52:55.510
you tried to attack larger processors,
like ARM-based processors?
00:52:55.510 --> 00:53:00.900
Thomas: So maybe let's start with Bitfi.
So we only talk about somewhat secure
00:53:00.900 --> 00:53:06.720
wallets, we didn't want to use a Chinese
phone in this talk. So we laughed pretty
00:53:06.720 --> 00:53:13.892
hard and we ordered some, but yeah.
Dmitry: And I mean this was covered
00:53:13.892 --> 00:53:17.780
extensively. So another guy who you should
follow on Twitter, @cybergibbons, gave a
00:53:17.780 --> 00:53:22.165
talk at hardwear.io on the topic of the
Bitfi. He was summarizing research that
00:53:22.165 --> 00:53:25.568
he did in conjunction with a bunch of
other people as well. So if you're
00:53:25.568 --> 00:53:27.970
interested in the Bitfi you should go look
at them.
00:53:27.970 --> 00:53:30.170
So the second question was about ARM-based
00:53:30.170 --> 00:53:35.420
controllers. I mean all of these were
ARM-based. Every single chip as far as I
00:53:35.420 --> 00:53:38.989
know that we looked at was ARM-based
in one way or another.
00:53:38.989 --> 00:53:40.210
Thomas: Yeah and there's,
00:53:40.210 --> 00:53:44.060
so if you're interested in this, look at
glitching the Nintendo Switch where they
00:53:44.060 --> 00:53:48.359
glitch the Tegra used in the Nintendo
Switch, which is very interesting and will
00:53:48.359 --> 00:53:52.924
give a lot of inspiration in that
regard, basically.
00:53:52.924 --> 00:53:57.206
Herald: Thank you. A question for
microphone 4 please.
00:53:57.206 --> 00:54:01.755
Mic 4: Hi, Trezor CPO here. First, thank
you for the talk. We worked with you to
00:54:01.755 --> 00:54:06.010
fix the issues as soon as they were reported,
and if anyone is interested in hacking
00:54:06.010 --> 00:54:13.733
hardware wallets, we are really interested
in working with the hardware hackers
00:54:13.733 --> 00:54:17.940
community and we have a
responsible disclosure program.
00:54:17.940 --> 00:54:24.065
You mentioned problems with supply chain
attacks, but gave no solutions, so let me
00:54:24.065 --> 00:54:30.109
give you one. Trezor is open source
hardware so you can build your own
00:54:30.109 --> 00:54:32.235
from locally sourced components
00:54:32.235 --> 00:54:37.949
if you are paranoid and don't want to
deal with these kinds of attacks.
00:54:37.949 --> 00:54:44.139
But my question is, is there any
other solution except for building
00:54:44.139 --> 00:54:47.259
your own wallet or inspecting
the code it runs and
00:54:47.259 --> 00:54:50.380
interrogating the device, basically?
00:54:50.380 --> 00:54:55.240
Thomas: First, thank you. One thing we
should mention is that when we looked at
00:54:55.240 --> 00:54:59.920
the Trezor code, the reason we had to end
up glitching this chip for three months is
00:54:59.920 --> 00:55:04.080
that we couldn't break the firmware
otherwise. So they do a great job. And
00:55:04.080 --> 00:55:08.480
it's really awesome.
Applause
00:55:08.480 --> 00:55:15.570
Dmitry: Yes. The firmware on the Trezor is
something to look at. I mean I recommend
00:55:15.570 --> 00:55:19.580
that, I mean we all do consulting work as
well. And so it's something that I
00:55:19.580 --> 00:55:24.740
recommend to people who are interested
in looking at how to implement certain
00:55:24.740 --> 00:55:28.445
mitigations in hardware. It's an
excellent project to look at. And so
00:55:28.445 --> 00:55:32.390
Trezor should be commended on that. But at
the end of the day it doesn't mean that
00:55:32.390 --> 00:55:37.270
the chip that the Trezor uses is secure
against these kinds of attacks. And that's
00:55:37.270 --> 00:55:41.369
where we had to fall back to looking for
silicon vulnerabilities against a chip
00:55:41.369 --> 00:55:44.928
or, sorry, a wallet like the Trezor.
00:55:45.814 --> 00:55:48.032
Josh: I would say on the hygiene side,
00:55:48.032 --> 00:55:53.002
this is a very difficult problem,
governments especially have this issue.
00:55:53.002 --> 00:55:57.192
You can do cryptographic attestation, but
as we saw with the Ledger Nano,
00:55:57.192 --> 00:56:01.105
that cryptographic attestation didn't help
verify that the requests were legitimate
00:56:01.105 --> 00:56:05.110
against a hardware attack, so there's been
talk about X-raying the board and all this
00:56:05.110 --> 00:56:08.540
stuff, but this is still very much an
open problem in hardware security.
00:56:08.540 --> 00:56:11.020
Herald: Another question from microphone
3.
00:56:11.020 --> 00:56:16.310
Mic: Actually I have a suggestion.
Herald: Make it short, though. Because
00:56:16.310 --> 00:56:19.390
usually we just take questions. One
sentence.
00:56:19.390 --> 00:56:25.203
Mic: A few MCUs actually have JTAG
connected via hardware fuses.
00:56:25.665 --> 00:56:28.530
So this might be useful
00:56:28.530 --> 00:56:35.600
to at least slow down glitching attacks.
Dmitry: Thanks. I agree. But these are
00:56:35.600 --> 00:56:40.764
not Cortex-M microcontrollers, I can tell
you that with 100% certainty. It has to do
00:56:40.764 --> 00:56:44.109
a lot with the fact that the
microcontrollers that are being used in
00:56:44.109 --> 00:56:48.460
these devices, they're built to
the spec that ARM specified, what ARM
00:56:48.460 --> 00:56:53.800
thinks would be a good set of features for
this class of device, or rather for
00:56:53.800 --> 00:56:57.990
the CPUs in the class of device that they
ended up getting put in. So anything
00:56:57.990 --> 00:57:02.950
Cortex-M is going to have vulnerabilities
that are more or less like the silicon
00:57:02.950 --> 00:57:06.859
vulnerabilities that we have. It's just I
mean if you ask me I think it's a matter
00:57:06.859 --> 00:57:11.050
of time just to sit there. I mean
fortunately we had something like 3 months
00:57:11.050 --> 00:57:16.770
of just glitching to be able to find
these bugs. But if you can apply that much
00:57:16.770 --> 00:57:22.200
time to finding a silicon attack, you might be
able to find this kind of vulnerability as
00:57:22.200 --> 00:57:25.579
well in other Cortex-M products. Only
three minutes.
00:57:25.579 --> 00:57:28.940
Herald: All good. Another question from
microphone 4 please.
00:57:28.940 --> 00:57:33.933
Mic 4: So obviously as part of your work
you analyzed the firmware of these
00:57:33.933 --> 00:57:40.150
devices. Did you find that the firmware
is in any way obfuscated or encrypted?
00:57:40.150 --> 00:57:45.380
Thomas: So basically yep, on these chips
you cannot really encrypt the firmware. On
00:57:45.380 --> 00:57:51.000
the ST31 you can encrypt it. But we didn't
have to look at it because the ST31 is not
00:57:51.000 --> 00:57:54.710
something you have to break. So no,
there was no real obfuscation that we
00:57:54.710 --> 00:57:58.780
could see. But we also don't have the code
in the case of Ledger, so I just stared at
00:57:58.780 --> 00:58:05.220
IDA Pro for hours and yeah.
Herald: The next person on microphone 4.
00:58:05.220 --> 00:58:11.310
Mic 4: Hello, did you have a look at the
entropy chip that generates the master
00:58:11.310 --> 00:58:14.880
seeds on both of these hardware devices,
and what's your take on that?
00:58:14.880 --> 00:58:21.530
Dmitry: I mean, so we already covered how
the Trezor works. There is only one chip
00:58:21.530 --> 00:58:26.760
and it's the STM32 so I know that there
was a known issue with Trezor back in the
00:58:26.760 --> 00:58:32.950
day where they weren't seeding the
RNG correctly, basically. But this was
00:58:32.950 --> 00:58:37.820
fixed. But for our attacks this
wasn't an issue. I mean if you were
00:58:37.820 --> 00:58:42.710
concerned about how strong these are, how
strong the random number generators are
00:58:42.710 --> 00:58:48.560
for creating a seed you could actually
create a BIP39 wallet outside of any
00:58:48.560 --> 00:58:53.539
one of these and then just use them for
their hardware features and get the seed
00:58:53.539 --> 00:58:56.326
from outside.
Herald: And if you have a question, do
00:58:56.326 --> 00:59:00.170
move to the microphone if you're able to.
But first we have another question from
00:59:00.170 --> 00:59:03.937
the Internet.
SA: Thank you. Did you guys see the
00:59:03.937 --> 00:59:09.472
dinosaur hiphop zero wallet?
Thomas: No, but if you send it to us
00:59:09.472 --> 00:59:11.663
we are happy to look at it.
Thomas: Oh you did.
00:59:11.663 --> 00:59:14.626
Dmitry: Yeah, we had it.
Josh: The dinosaur hiphop wallet -
00:59:14.626 --> 00:59:18.380
Thank you for the kind of trick question
- So the design of the dinosaur hiphop
00:59:18.380 --> 00:59:21.784
wallet was a Trezor clone
that we looked at last year.
00:59:21.784 --> 00:59:23.928
Thomas: Ah
Josh: Called the Breaking Bitcoin board,
00:59:23.928 --> 00:59:27.365
so, otherwise
functionally it's a Trezor clone,
00:59:27.365 --> 00:59:30.125
but we took a lot of the instructions
from dinosaur hiphop to
00:59:30.125 --> 00:59:33.444
make the Breaking Bitcoin board
and then prepare the operating system.
00:59:33.444 --> 00:59:37.307
Dmitry: I mean, and maybe on that note
I would say that in terms of looking at
00:59:37.307 --> 00:59:42.319
what wallets are actually being used, you'll
find that, so the Ledger is a very popular
00:59:42.319 --> 00:59:44.266
wallet, the Trezor is a very popular
00:59:44.266 --> 00:59:50.080
wallet. But since the Trezor is open source,
there are a lot of clones and forks of the
00:59:50.080 --> 00:59:55.930
Trezor. And I can tell you that not all of
them run the latest security patches that
00:59:55.930 --> 01:00:00.430
have been applied to the Trezor code base.
So that's also something that you can do
01:00:00.430 --> 01:00:04.830
is basically diff the projects and see
which ones are staying
01:00:04.830 --> 01:00:06.169
up to date and which aren't.
01:00:06.169 --> 01:00:08.822
Herald: Your question has to be the very
last one today.
01:00:08.822 --> 01:00:15.783
Please speak directly into the microphone.
Even closer to the mic.
01:00:15.783 --> 01:00:25.322
Mic: Seeing as this is the first CCC for
many of us and some of us might not have
01:00:25.322 --> 01:00:29.939
that much experience in hardware hacking,
do you have any tips for beginners?
01:00:29.939 --> 01:00:39.210
Thomas: Yeah, lots of them. Buy an Arduino,
learn from the mistakes you make with it, and
01:00:39.210 --> 01:00:44.100
learn how hardware works, basically. Watch
a lot of online videos and I think you
01:00:44.100 --> 01:00:48.530
gave presentations, you gave
presentations. I gave some presentations.
01:00:48.530 --> 01:00:53.520
So just watch talks, watch LiveOverflow.
LiveOverflow is a great YouTube channel on
01:00:53.520 --> 01:00:59.619
exactly this stuff. And also don't
hesitate to reach out to us. If you have a
01:00:59.619 --> 01:01:04.580
question, always contact us at
info@wallet.fail, on Twitter, wherever. We
01:01:04.580 --> 01:01:07.491
are happy to talk to you. It might take a
while.
01:01:07.491 --> 01:01:12.043
Josh: On the non-security electronics side, if you
go to SparkFun or Adafruit, they have lots
01:01:12.043 --> 01:01:15.840
of free material on how electronics work,
how to get started. It's not security
01:01:15.840 --> 01:01:18.140
related, but it's a very good
electronics program.
01:01:18.140 --> 01:01:21.142
Dmitry: But I'll say, I started
with an Arduino too.
01:01:23.642 --> 01:01:27.487
Herald: All right thank you guys so much
for the very nice questions and you guys
01:01:27.487 --> 01:01:30.150
for the amazing and inspiring talk.
Thank you so much.
01:01:30.150 --> 01:01:31.670
Applause
01:01:31.670 --> 01:01:58.000
Subtitles created by c3subtitles.de
in the years 2018-2020. Join, and help us!