0:00:18.209,0:00:23.874
Herald-Angel: The Layman's Guide to Zero-[br]Day Engineering is our next talk, and
0:00:23.874,0:00:31.050
my colleagues out in Austin who run the[br]PWN2OWN contest assure me that our next
0:00:31.050,0:00:35.700
speakers are really very much top of their[br]class, and I'm really looking forward to
0:00:35.700,0:00:40.430
this talk for that. A capture the flag[br]contest like that requires having done a
0:00:40.430,0:00:46.309
lot of your homework upfront so that you[br]have the tools at your disposal, at the
0:00:46.309,0:00:50.920
time, so that you can win. And Marcus and[br]Amy are here to tell us something way more
0:00:50.920,0:00:56.309
valuable than the actual tools they found:[br]how they actually arrived at those
0:00:56.309,0:01:00.659
tools, and, you know, the process of[br]getting there. And I think that is going to be a
0:01:00.659,0:01:08.250
very valuable recipe or lesson for us. So,[br]please help me welcome Marcus and Amy to a
0:01:08.250,0:01:16.768
very much anticipated talk.[br]Applause
0:01:16.768,0:01:22.600
Marcus: All right. Hi everyone. Thank you[br]for making it out to our talk this evening.
0:01:22.600,0:01:26.409
So, I'd like to start by thanking the CCC[br]organizers for inviting us out here to
0:01:26.409,0:01:30.329
give this talk. This was a unique[br]opportunity for us to share some of our
0:01:30.329,0:01:35.810
experience with the community, and we're[br]really happy to be here. So yeah, I hope
0:01:35.810,0:01:42.729
you guys enjoy. OK, so who are we? Well,[br]my name is Marcus Gaasedelen. I sometimes
0:01:42.729,0:01:48.070
go by the handle @gaasedelen which is my[br]last name. And I'm joined here by my co-
0:01:48.070,0:01:52.950
worker Amy, who is also a good friend and[br]longtime collaborator. We work for a
0:01:52.950,0:01:56.391
company called Ret2 Systems. Ret2 is best[br]known publicly for its security research
0:01:56.391,0:01:59.740
and development. Behind the scenes we do[br]consulting and have been pushing to
0:01:59.740,0:02:04.040
improve the availability of security[br]education, and specialized security
0:02:04.040,0:02:08.100
training, as well as raising awareness[br]and sharing information like you're going
0:02:08.100,0:02:14.069
to see today. So this talk has been[br]structured roughly to show our approach in
0:02:14.069,0:02:18.370
breaking some of the world's most hardened[br]consumer software. In particular, we're
0:02:18.370,0:02:23.349
going to talk about one of the Zero-Days[br]that we produced at Ret2 in 2018. And over
0:02:23.349,0:02:28.019
the course of the talk, we hope to break[br]some common misconceptions about the
0:02:28.019,0:02:31.260
process of Zero-Day Engineering. We're[br]going to highlight some of the
0:02:31.260,0:02:36.799
observations that we've gathered and built[br]up about this industry and this trade over
0:02:36.799,0:02:41.269
the course of many years now. And we're[br]going to try to offer some advice on how
0:02:41.269,0:02:46.220
to get started doing this kind of work as[br]an individual. So, we're calling this talk
0:02:46.220,0:02:51.580
a non-technical commentary about the[br]process of Zero-Day Engineering. At times,
0:02:51.580,0:02:55.740
it may seem like we're stating the[br]obvious. But the point is to show that
0:02:55.740,0:03:00.879
there's less magic behind the curtain than[br]most of the spectators probably realize.
0:03:00.879,0:03:05.350
So let's talk about PWN2OWN 2018. For[br]those that don't know, PWN2OWN is an
0:03:05.350,0:03:09.099
industry level security competition,[br]organized annually by Trend Micro's Zero-
0:03:09.099,0:03:15.000
Day Initiative. PWN2OWN invites the top[br]security researchers from around the world
0:03:15.000,0:03:19.810
to showcase Zero-Day exploits against high[br]value software targets such as premier web
0:03:19.810,0:03:23.310
browsers, operating systems and[br]virtualization solutions, such as Hyper-V,
0:03:23.310,0:03:28.540
VMware, VirtualBox, Xen, whatever.[br]So at Ret2, we thought it would be fun to
0:03:28.540,0:03:32.560
play on PWN2OWN this year. Specifically we[br]wanted to target the competition's browser
0:03:32.560,0:03:38.480
category. We chose to attack Apple's[br]Safari web browser on MacOS because it was
0:03:38.480,0:03:44.700
new, it was mysterious. But also to avoid[br]any prior conflicts of interest. And so
0:03:44.700,0:03:48.159
for this competition, we ended up[br]developing a type of Zero-Day, known
0:03:48.159,0:03:56.019
typically as a single-click RCE or 'Safari[br]remote', to use some industry language.
0:03:56.019,0:04:00.219
So what this means is that we could gain[br]remote, root level access to your Macbook,
0:04:00.219,0:04:05.059
should you click a single malicious link[br]of ours. That's kind of terrifying. You
0:04:05.059,0:04:10.390
know, a lot of you might feel like you're[br]very prone to not clicking malicious
0:04:10.390,0:04:15.091
links, or not getting spearphished. But it's[br]so easy. Maybe you're in a coffee shop, maybe
0:04:15.091,0:04:19.970
I just man-in-the-middle your connection.[br]It's pretty, yeah, it's a pretty scary
0:04:19.970,0:04:23.360
world. So this is actually a picture that[br]we took on stage at PWN2OWN 2018,
0:04:23.360,0:04:27.940
directly following our exploit attempt.[br]This is actually Joshua Smith from ZDI
0:04:27.940,0:04:33.020
holding the competition machine after our[br]exploit had landed, unfortunately, a
0:04:33.020,0:04:37.020
little bit too late. But the payload at[br]the end of our exploit would pop Apple's
0:04:37.020,0:04:41.370
calculator app and a root shell on the[br]victim machine. This is usually used to
0:04:41.370,0:04:45.930
demonstrate code execution. So, for fun we[br]also made the payload change the desktop's
0:04:45.930,0:04:50.666
background to the Ret2 logo. So that's[br]what you're seeing there. So, what makes this
0:04:50.666,0:04:54.620
Zero-Day a fun case study is that we had[br]virtually no prior experience with Safari
0:04:54.620,0:04:58.750
or MacOS going into this event. We[br]literally didn't even have a single
0:04:58.750,0:05:03.220
Macbook in the office. We actually had to[br]go out and buy one. And, so as a result
0:05:03.220,0:05:07.050
you get to see, how we as expert[br]researchers approach new and unknown
0:05:07.050,0:05:12.060
software targets. So I promised that this[br]was a non-technical talk which is mostly
0:05:12.060,0:05:16.710
true. That's because we actually published[br]all the nitty-gritty details for the
0:05:16.710,0:05:21.530
entire exploit chain as a verbose six part[br]blog series on our blog this past summer.
0:05:21.530,0:05:26.534
It's hard to make highly technical talks[br]fun and accessible to all audiences. So
0:05:26.534,0:05:31.210
we've reserved much of the truly technical[br]stuff for you to read at your own leisure.
0:05:31.210,0:05:34.550
It's not a prerequisite for this talk, so[br]don't feel bad if you haven't read those.
0:05:34.550,0:05:39.310
So with that in mind, we're ready to[br]introduce you to the very 1st step of what
0:05:39.310,0:05:44.870
we're calling, The Layman's Guide to Zero-[br]Day Engineering. So, at the start of this
0:05:44.870,0:05:48.730
talk, I said we'd be attacking some of the[br]most high value and well protected
0:05:48.730,0:05:54.543
consumer software. This is no joke, right?[br]This is a high stakes game. So before any
0:05:54.543,0:05:58.759
of you even think about looking at code,[br]or searching for vulnerabilities in these
0:05:58.759,0:06:02.590
products, you need to set some[br]expectations about what you're going to be
0:06:02.590,0:06:08.190
up against. So this is a picture of you.[br]You might be a security expert, a software
0:06:08.190,0:06:11.757
engineer, or even just an enthusiast. But[br]through some odd twist of self-
0:06:11.757,0:06:16.009
loathing, you find yourself interested in[br]Zero-Days, with the desire to break some
0:06:16.009,0:06:22.380
high-impact software, like a web browser.[br]But it's important to recognize that
0:06:22.380,0:06:25.759
you're looking to outsmart some of the[br]largest, most successful organizations of
0:06:25.759,0:06:29.720
our generation. These types of companies have[br]every interest in securing their products
0:06:29.720,0:06:33.410
and building trust with consumers. These[br]vendors have steadily been growing their
0:06:33.410,0:06:36.449
investments in software and device[br]security, and that trend will only
0:06:36.449,0:06:41.300
continue. You see cyber security in[br]headlines every day, hacking, you know,
0:06:41.300,0:06:45.220
these systems compromised. It's only[br]getting more popular. You know, there's
0:06:45.220,0:06:49.319
more money than ever in this space. This[br]is a beautiful mountain peak that
0:06:49.319,0:06:53.330
represents your mission of: I want to[br]craft a Zero-Day. But your ascent up this
0:06:53.330,0:06:58.410
mountain is not going to be an easy task.[br]As an individual, the odds are not really
0:06:58.410,0:07:02.940
in your favor. This game is sort of a free[br]for all, and everyone is at each other's
0:07:02.940,0:07:06.770
throats. So, in one corner is the vendor,[br]who might as well have infinite money and
0:07:06.770,0:07:10.729
infinite experience. In another corner, is[br]the rest of the security research
0:07:10.729,0:07:16.199
community, fellow enthusiasts, other[br]threat actors. So, all of you are going to
0:07:16.199,0:07:21.120
be fighting over the same terrain, which[br]is the code. This is unforgiving terrain
0:07:21.120,0:07:25.669
in and of itself. But the vendor has home[br]field advantage. So these obstacles are
0:07:25.669,0:07:30.419
not fun, but it's only going to get worse[br]for you. Newcomers often don't prepare
0:07:30.419,0:07:34.810
themselves for understanding what kind of[br]time scale they should expect when working
0:07:34.810,0:07:39.840
on these types of projects. So, for those[br]of you who are familiar with the Capture
0:07:39.840,0:07:44.550
The Flag circuit. These competitions[br]usually are time boxed from 36 to 48
0:07:44.550,0:07:49.191
hours. Normally, they're over a weekend. We[br]came out of that circuit. We love the
0:07:49.191,0:07:54.880
sport. We still play. But how long does it[br]take to develop a Zero-Day? Well, it can
0:07:54.880,0:07:58.780
vary a lot. Sometimes, you get really[br]lucky. I've seen someone produce a
0:07:58.780,0:08:05.340
Chrome/V8 bug in 2 days. Other times,[br]it's taken two weeks. Sometimes, it takes
0:08:05.340,0:08:10.360
a month. But sometimes, it can actually[br]take a lot longer to study and exploit new
0:08:10.360,0:08:13.960
targets. You need to be thinking, you[br]know, you need to be looking at time in
0:08:13.960,0:08:19.509
these kind of scales. And so it could take[br]3.5 months. It could take maybe even 6
0:08:19.509,0:08:23.021
months for some targets. The fact of the[br]matter is that it's almost impossible to
0:08:23.021,0:08:28.370
tell how long the process is going to take[br]you. And so unlike a CTF challenge,
0:08:28.370,0:08:33.140
there's no upper bound to this process of[br]Zero-Day Engineering. There's no guarantee
0:08:33.140,0:08:37.270
that the exploitable bugs you need to make[br]a Zero-Day, even exist in the software
0:08:37.270,0:08:43.400
you're targeting. You also don't always[br]know what you're looking for, and you're
0:08:43.400,0:08:47.540
working on projects that are many orders[br]of magnitude larger than any sort of
0:08:47.540,0:08:51.150
educational resource. We're talking[br]millions of lines of code where your
0:08:51.150,0:08:56.780
average CTF challenge might be a couple[br]hundred lines of C at most. So I can
0:08:56.780,0:09:01.673
already see the tears and self-doubt in[br]some of your eyes. But I really want to
0:09:01.673,0:09:06.640
stress that you shouldn't be too hard on[br]yourself about this stuff. As a novice,
0:09:06.640,0:09:11.470
you need to keep these caveats in mind and[br]accept that failure is not unlikely in the
0:09:11.470,0:09:15.746
journey. All right? So please check this[br]box before watching the rest of the talk.
0:09:17.406,0:09:21.010
So having built some psychological[br]foundation for the task at hand, the next
0:09:21.010,0:09:28.130
step in the Layman's Guide is what we call[br]reconnaissance. So this is kind of a goofy
0:09:28.130,0:09:33.566
slide, but yes, even Metasploit reminds[br]you to start out doing recon. So with
0:09:33.566,0:09:36.530
regard to Zero-Day Engineering,[br]discovering vulnerabilities against large
0:09:36.530,0:09:40.606
scale software can be an absolutely[br]overwhelming experience. Like that
0:09:40.606,0:09:44.790
mountain, it's like, where do I start?[br]What hill do I go up? Like, where do I
0:09:44.790,0:09:49.330
even go from there? So to overcome this,[br]it's vital to build foundational knowledge
0:09:49.330,0:09:53.470
about the target. It's also one of the[br]least glamorous parts of the Zero-Day
0:09:53.470,0:09:57.540
development process. And it's often[br]skipped by many. You don't see any of the
0:09:57.540,0:10:01.000
other speakers really talking about this[br]so much. You don't see blog posts where
0:10:01.000,0:10:05.480
people are like, I googled for eight hours[br]about Apple Safari before writing a Zero-
0:10:05.480,0:10:11.160
Day for it. So you want to aggregate and[br]review all existing research related to
0:10:11.160,0:10:17.266
your target. This is super, super[br]important. So how do we do our recon? Well
0:10:17.266,0:10:21.790
the simple answer is Google everything.[br]This is literally us Googling something,
0:10:21.790,0:10:25.210
and what we do is we go through and we[br]click, and we download, and we bookmark
0:10:25.210,0:10:29.580
every single thing for about five pages.[br]And you see all those buttons that you
0:10:29.580,0:10:33.542
never click at the bottom of Google. Those[br]are all related searches that you
0:10:33.542,0:10:37.370
might want to look at. You should[br]definitely click all of those. You should
0:10:37.370,0:10:40.960
also go through at least four or five[br]pages and keep downloading and saving
0:10:40.960,0:10:48.040
everything that looks remotely relevant.[br]So you just keep doing this over, and
0:10:48.040,0:10:53.621
over, and over again. And you just Google,[br]and Google, and Google everything that you
0:10:53.621,0:10:58.730
think could possibly be related. And the[br]idea is, you know, you just want to grab
0:10:58.730,0:11:02.260
all this information, you want to[br]understand everything you can about this
0:11:02.260,0:11:07.766
target. Even if it's not Apple Safari[br]specific. I mean, look into V8, look into
0:11:07.766,0:11:14.200
Chrome, look into Opera, look into Chakra, [br]look into whatever you want. So the goal
0:11:14.200,0:11:19.370
is to build up a library of security[br]literature related to your target and its
0:11:19.370,0:11:26.010
ecosystem. And then, I want you to read[br]all of it. But I
0:11:26.010,0:11:29.120
don't want you to force yourself to understand[br]everything in your stack of
0:11:29.120,0:11:32.720
literature. The point of this exercise is[br]to build additional context about the
0:11:32.720,0:11:36.996
software, its architecture and its[br]security track record. By the end of the
0:11:36.996,0:11:40.120
reconnaissance phase, you should aim to be[br]able to answer these kinds of questions
0:11:40.120,0:11:45.727
about your target. What is the purpose of[br]the software? How is it architected? Can
0:11:45.727,0:11:50.640
anyone describe what WebKit's architecture[br]is to me? What are its major components?
0:11:50.640,0:11:55.880
Is there a sandbox around it? How do you[br]debug it? How did the developers debug it?
0:11:55.880,0:12:00.550
Are there any tips and tricks, are there[br]special flags? What does its security
0:12:00.550,0:12:04.265
track record look like? Does it have[br]historically vulnerable components? Are
0:12:04.265,0:12:10.630
there existing writeups, exploits, or[br]research on it? Etc. All right, we're
0:12:10.630,0:12:16.450
through reconnaissance. Step 2 is going to[br]be target selection. So, there's actually
0:12:16.450,0:12:20.190
a few different names that you could maybe[br]call this. Technically, we're targeting
0:12:20.190,0:12:24.684
Apple's Safari, but you want to try and[br]narrow your scope. So what we're looking
0:12:24.684,0:12:32.520
at here is a TreeMap visualization of the[br]WebKit source. So Apple's Safari web
0:12:32.520,0:12:36.030
browser is actually built on top of the[br]WebKit framework, which is essentially a
0:12:36.030,0:12:42.000
browser engine. This is Open Source. So[br]yeah, this is a TreeMap visualization of
0:12:42.000,0:12:47.000
the source directory, where files are[br]sorted by size. So each of those boxes
0:12:47.000,0:12:53.020
is essentially a file, while the big gray[br]boxes are directories.
0:12:53.020,0:13:02.240
All the sub-squares are files. Each[br]file is sized based on its lines of code.
0:13:02.240,0:13:07.490
The blue hues represent the approximate[br]maximum cyclomatic complexity detected in
0:13:07.490,0:13:10.810
each source file.
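As an illustration of the data behind such a treemap (a minimal Python sketch, not the speakers' actual tooling; the WebKit/Source path and file extensions are assumptions):

```python
import sys
from pathlib import Path

# Tally lines of code per source file; the sorted counts are exactly the
# box sizes a treemap tool would consume.
ROOT = Path(sys.argv[1] if len(sys.argv) > 1 else "WebKit/Source")
EXTS = {".c", ".cpp", ".h", ".mm"}

sizes = {}
for path in ROOT.rglob("*"):
    if path.is_file() and path.suffix in EXTS:
        with open(path, errors="ignore") as f:
            sizes[path] = sum(1 for _ in f)

# Largest files first: the big boxes in the visualization.
for path, loc in sorted(sizes.items(), key=lambda kv: -kv[1])[:20]:
    print(f"{loc:8d}  {path}")
```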
0:13:10.810,0:13:14.191
Anyway, you might be getting flashbacks to[br]that picture of that mountain peak. How do you even start to
0:13:14.191,0:13:17.546
hunt for security vulnerabilities in a[br]product or codebase of this size?
0:13:17.546,0:13:22.070
3 million lines of code. You know, I've[br]maybe written like, I don't know, like a
0:13:22.070,0:13:28.687
100,000 lines of C or C++ in my life, let[br]alone read or reviewed 3 million. So the
0:13:28.687,0:13:33.963
short answer to breaking this problem down[br]is that you need to reduce your scope of
0:13:33.963,0:13:39.950
evaluation, and focus on depth over[br]breadth. And this is most critical when
0:13:39.950,0:13:44.980
attacking extremely well picked over[br]targets. Maybe you're probing an IoT
0:13:44.980,0:13:47.600
device? You can probably just sneeze at[br]that thing and you are going to find
0:13:47.600,0:13:52.424
vulnerabilities. But you know, you're[br]fighting on a very different landscape here.
0:13:52.424,0:13:59.820
And so you need to be very detailed[br]with your review. So reduce your scope.
0:13:59.820,0:14:03.950
Our reconnaissance and past experience[br]with exploiting browsers had led us to
0:14:03.950,0:14:09.090
focus on WebKit's JavaScript engine,[br]highlighted up here in orange. So, bugs in
0:14:09.090,0:14:14.120
JS engines, when it comes to browsers, are[br]generally regarded as extremely powerful
0:14:14.120,0:14:18.460
bugs. But they're also few and far[br]between, and they're kind of becoming more
0:14:18.460,0:14:23.550
rare, as more of you are looking for bugs.[br]More people are colliding, and bugs are dying
0:14:23.550,0:14:28.920
quicker. And so, anyway, let's try to[br]reduce our scope. So we reduce our scope
0:14:28.920,0:14:33.820
from 3 million down to 350,000 lines of[br]code. Here, we'll zoom into that orange.
0:14:33.820,0:14:37.200
So now we're looking at the JavaScript[br]directory, specifically the JavaScriptCore
0:14:37.200,0:14:42.490
directory. So this is the JavaScript[br]engine within WebKit, as used by Safari,
0:14:42.490,0:14:47.820
on MacOS. And specifically, to further[br]reduce our scope, we chose to focus on the
0:14:47.820,0:14:52.821
highest-level interface of JavaScriptCore,[br]which is the runtime folder. So this
0:14:52.821,0:14:57.800
contains code with almost 1:1[br]mappings to JavaScript objects and methods
0:14:57.800,0:15:05.791
in the interpreter. So, for example,[br]Array.reverse, or concat, or whatever.
0:15:05.791,0:15:10.901
It's very close to what JavaScript[br]authors are familiar with. And so this is
0:15:10.901,0:15:16.836
what the runtime folder looks like, at[br]approximately 70,000 lines of code. When
0:15:16.836,0:15:21.630
we were spinning up for PWN2OWN, we said,[br]okay, we are going to find a bug in this
0:15:21.630,0:15:25.680
directory in one of these files, and we're[br]not going to leave it until we have, you
0:15:25.680,0:15:30.784
know, walked away with something. So if we[br]take a step back now, this is what we
0:15:30.784,0:15:34.592
started with, and this is what we've done:[br]we've reduced our scope. It helps
0:15:34.592,0:15:39.490
illustrate this, you know, whittling[br]process. It was almost a little bit
0:15:39.490,0:15:44.310
arbitrary. Previously, there have been a[br]lot of bugs in the runtime
0:15:44.310,0:15:51.380
directory. But it's really been cleaned up[br]the past few years. So anyway, this is
0:15:51.380,0:15:56.870
what we chose for our RCE. So having spent[br]a number of years going back and forth
0:15:56.870,0:16:00.930
between attacking and defending, I've come[br]to recognize that bad components do not
0:16:00.930,0:16:05.200
get good fast. Usually researchers are[br]able to hammer away at these components
0:16:05.200,0:16:10.520
for years before they reach some level of[br]acceptable security. So to escape Safari's
0:16:10.520,0:16:15.084
sandbox, we simply looked at the security[br]trends covered during the reconnaissance phase.
0:16:15.084,0:16:18.790
So, this observation, that historically[br]bad components will often take years to
0:16:18.790,0:16:23.542
improve, meant that we chose to look at[br]WindowServer. And for those that don't
0:16:23.542,0:16:29.390
know, WindowServer is a root level system[br]service that runs on MacOS. Our research
0:16:29.390,0:16:35.570
turned up a trail of ugly bugs in MacOS,[br]essentially from the WindowServer,
0:16:35.570,0:16:42.910
which is accessible from the Safari sandbox.[br]And in particular, when we were doing our
0:16:42.910,0:16:47.110
research, we were looking at ZDI's website,[br]where you can just search all the
0:16:47.110,0:16:52.910
advisories that they've disclosed. And in[br]particular, in 2016, there were over 10
0:16:52.910,0:16:57.220
vulnerabilities reported to ZDI that were[br]used as sandbox escapes or privilege
0:16:57.220,0:17:03.406
escalation-style issues. And so, these are[br]only the vulnerabilities that were reported to
0:17:03.406,0:17:09.500
ZDI. If you look at 2017, there were 4, all,[br]again, used for the same purpose. I think
0:17:09.500,0:17:15.600
all of these were actually, probably used[br]at PWN2OWN both years. And then in 2018,
0:17:15.600,0:17:19.720
there was just one. And so, this is 3[br]years. Over the span of 3 years,
0:17:19.720,0:17:25.010
people were hitting the same exact[br]component, and Apple or researchers around
0:17:25.010,0:17:28.810
the world could have been watching, or[br]listening and finding bugs, and fighting
0:17:28.810,0:17:36.230
over this land right here. And so, it's[br]pretty interesting. I mean, it gives some
0:17:36.230,0:17:42.200
perspective. The fact of the matter is[br]that it's really hard
0:17:42.200,0:17:46.250
for bad components to improve quickly.[br]Nobody wants to try and sit down and
0:17:46.250,0:17:50.497
rewrite bad code. And vendors are[br]terrified, absolutely terrified of
0:17:50.497,0:17:55.400
shipping regressions. Most vendors will[br]patch or modify old bad code only
0:17:55.400,0:18:02.370
when they absolutely must. For example,[br]when a vulnerability is reported to them.
0:18:02.370,0:18:07.930
And so, as listed on this slide, there's a[br]number of reasons why a certain module or
0:18:07.930,0:18:12.780
component has a terrible security track[br]record. Just try to keep in mind, that's
0:18:12.780,0:18:17.740
usually a good place to look for more[br]bugs. So if you see a waterfall of bugs
0:18:17.740,0:18:22.660
this year in some component, like, wasm or[br]JIT, maybe you should be looking there,
0:18:22.660,0:18:27.401
right? Because that might be good for a[br]few more years. All right.
0:18:28.060,0:18:32.240
Step three. So after all this talk, we are finally[br]getting to a point where we can start
0:18:32.240,0:18:38.470
probing and exploring the codebase in[br]greater depth. This step is all about bug hunting.
0:18:38.470,0:18:45.280
So as an individual researcher or[br]small organization, the hardest part of
0:18:45.280,0:18:48.957
the Zero-Day engineering process is[br]usually discovering and exploiting a
0:18:48.957,0:18:53.400
vulnerability. That's just kind of from[br]our perspective. This can maybe vary from
0:18:53.400,0:18:58.040
person to person. But you know, we don't[br]have a hundred million dollars to spend on
0:18:58.040,0:19:06.360
fuzzers, for example. And so we literally[br]have one Macbook, right? So it's kind of
0:19:06.360,0:19:10.920
like looking for a needle in a haystack.[br]We're also well versed in the exploitation
0:19:10.920,0:19:14.938
process itself. And so those end up being[br]a little bit more formulaic for us.
0:19:14.938,0:19:18.500
So there are two core strategies for[br]finding exploitable vulnerabilities.
0:19:18.500,0:19:22.133
There's a lot of pros and cons to both of[br]these approaches. But I don't want to
0:19:22.133,0:19:25.448
spend too much time talking about their[br]strengths or weaknesses. So they're all
0:19:25.448,0:19:30.610
listed here; the short summary is that[br]fuzzing is the main go-to strategy for
0:19:30.610,0:19:36.970
many security enthusiasts. Some of the key[br]perks are that it's scalable and almost
0:19:36.970,0:19:41.841
always yields results. And so, spoiler[br]alert, but later in the talk you'll see
0:19:41.841,0:19:48.640
that we fuzzed both of our bugs, both the[br]bugs that we used for our full chain. And, you know,
0:19:48.640,0:19:53.806
it's 2018, and these things are still falling[br]out via some very trivial means. OK. So,
0:19:53.806,0:19:58.860
source review is the other main strategy.[br]Source review is often much harder for
0:19:58.860,0:20:02.746
novices, but it can produce some high[br]quality bugs when performed diligently.
0:20:02.746,0:20:06.200
You know if you're looking to just get[br]into this stuff, I would say, start real
0:20:06.200,0:20:12.930
simple, start with fuzzing and see how[br]far you get. So, yeah, for the purpose of
0:20:12.930,0:20:16.490
this talk, we are mostly going to focus on[br]fuzzing. This is a picture from the
0:20:16.490,0:20:21.080
dashboard of a simple, scalable fuzzing[br]harness we built for JavaScriptCore. This
0:20:21.080,0:20:25.457
is when we were ramping up for PWN2OWN and[br]trying to build our chain. It was a
0:20:25.457,0:20:30.310
grammar-based JavaScript fuzzer, based on[br]Mozilla's Dharma. There is nothing fancy
0:20:30.310,0:20:34.723
about it. This is a snippet of what some[br]of its output looked like. We
0:20:34.723,0:20:37.860
had only started building it out when we[br]actually found the exploitable
0:20:37.860,0:20:42.520
vulnerability that we ended up using. So[br]we haven't really played with this much
0:20:42.520,0:20:48.030
since then, but, I mean, it shows kind of[br]how easy it was to get where we needed to go.
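For a sense of how simple grammar-based generation can be, here is a toy Python sketch of the core idea (not the speakers' Dharma-based fuzzer; this grammar is made up and far smaller than anything real):

```python
import random

# Toy grammar: each <symbol> expands to one of its alternatives, and
# nested <symbols> are recursively expanded until only JS text remains.
GRAMMAR = {
    "<stmt>":   ["var a = <expr>;", "a.<method>;", "gc();"],
    "<expr>":   ["[<num>, <num>, <num>]", "new Array(<num>)", "a.concat(<expr>)"],
    "<method>": ["reverse()", "fill(<num>)", "slice(<num>)"],
    "<num>":    ["0", "1", "-1", "0x7fffffff", "1e9"],
}

def expand(symbol: str) -> str:
    production = random.choice(GRAMMAR[symbol])
    for token in GRAMMAR:
        while token in production:
            production = production.replace(token, expand(token), 1)
    return production

# Emit one small JavaScript test case: a handful of random statements.
print("\n".join(expand("<stmt>") for _ in range(5)))
```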
0:20:48.030,0:20:55.221
So, something we'd like[br]to stress heavily to the folks who fuzz,
0:20:55.221,0:20:59.720
is that it really must be treated as a[br]science for these competitive targets.
0:20:59.720,0:21:04.180
Guys, I know code coverage isn't the best[br]metric, but you absolutely must use some
0:21:04.180,0:21:08.552
form of introspection to quantify the[br]progress and reach of your fuzzing. Please
0:21:08.552,0:21:13.730
don't just fuzz blindly.
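As one concrete (and hypothetical) form of that introspection, a minimal Python sketch that summarizes per-file line coverage from an lcov-style .info dump; the coverage.info filename is just an example:

```python
# Summarize line coverage per source file from an lcov .info trace.
# "SF:" starts a file record, "DA:<line>,<hits>" marks one instrumented line.
def lcov_summary(path="coverage.info"):
    stats, current = {}, None
    for line in open(path):
        line = line.strip()
        if line.startswith("SF:"):
            current = line[3:]
            stats[current] = [0, 0]        # [instrumented lines, lines hit]
        elif line.startswith("DA:") and current:
            hits = int(line.split(",")[1])
            stats[current][0] += 1
            stats[current][1] += hits > 0
    for f, (total, hit) in sorted(stats.items()):
        if total:
            print(f"{hit / total:6.1%}  {f}")

lcov_summary()
```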
0:21:13.730,0:21:18.160
So our fuzzer would generate web-based code[br]coverage reports of our grammars every 15 minutes or so. This allowed us to quickly iterate
0:21:18.160,0:21:22.594
upon our fuzzer, helping generate more[br]interesting complex test cases. A good
0:21:22.594,0:21:26.050
target is 60 percent code coverage. So you[br]can see that in the upper right hand
0:21:26.050,0:21:29.050
corner. That's kind of what we were[br]shooting for. Again, it really varies from
0:21:29.050,0:21:33.849
target to target. This was also just us[br]focusing on the runtime folder, as you can see
0:21:33.849,0:21:39.400
in the upper left-hand corner. And so,[br]something that we have observed, again
0:21:39.400,0:21:45.910
over many targets and exotic, exotic[br]targets, is that bugs almost always fall
0:21:45.910,0:21:51.624
out of what we call the hard fought final[br]coverage percentages. And so, what this
0:21:51.624,0:21:55.960
means is, you might work for a while,[br]trying to build up your coverage, trying
0:21:55.960,0:22:02.030
to, you know, build a good set of test[br]cases, or grammars for fuzzing, and then
0:22:02.030,0:22:06.030
you'll hit that 60 percent, and you'll be,[br]okay, what am I missing now? Like everyone
0:22:06.030,0:22:09.550
gets that 60 percent, let's say. But then,[br]once you start inching a little bit
0:22:09.550,0:22:15.150
further is when you start finding a lot of[br]bugs. So, for example, we will pull up
0:22:15.150,0:22:19.190
code, and we'll be like, why did we not[br]hit those blocks up there? Why are those
0:22:19.190,0:22:22.630
grey boxes? Why did we never hit those in[br]our millions of test cases? And we'll go
0:22:22.630,0:22:26.020
find that that's some weird edge case, or[br]some unoptimized condition, or something
0:22:26.020,0:22:32.160
like that, and we will modify our test[br]cases to hit that code. Other times we'll
0:22:32.160,0:22:36.230
actually sit down, pull it up on our[br]projector, and talk through some of that,
0:22:36.230,0:22:39.420
and we'll be like: What the hell is going[br]on there? This is actually, it's funny,
0:22:39.420,0:22:44.341
this is actually a live photo that I took[br]during our PWN2OWN hunt. You know, as
0:22:44.341,0:22:48.430
cliche as this picture is of hackers[br]standing in front of like a dark screen in
0:22:48.430,0:22:52.090
a dark room, this was absolutely real. You[br]know, we were just reading some
0:22:52.090,0:23:02.190
code. And so it's good to rubber-duck[br]among co-workers and to hash out ideas to
0:23:02.190,0:23:10.520
help confirm theories or discard them. And[br]so, yeah, this kind of leads us to the next
0:23:10.520,0:23:15.220
piece of advice, for when you're doing[br]source review. This applies to both
0:23:15.220,0:23:21.060
debugging and assessing those corner cases[br]and whatnot. If you're ever unsure about
0:23:21.060,0:23:24.800
the code that you're reading you[br]absolutely should be using debuggers and
0:23:24.800,0:23:30.250
dynamic analysis. So as painful as it can[br]maybe be to set up JavaScriptCore, or to
0:23:30.250,0:23:35.500
debug this massive C++ application that's[br]dumping these massive call stacks that are
0:23:35.500,0:23:39.900
100 [steps] deep you need to learn those[br]tools or you are never going to be able to
0:23:39.900,0:23:46.700
understand the amount of context necessary[br]for some of these bugs and complex code.
0:23:46.700,0:23:55.010
So, for example, one of our blog posts makes[br]extensive use of rr to reverse, or rather root-
0:23:55.010,0:23:58.831
cause, the vulnerability that we ended up[br]exploiting. It was a race condition in the
0:23:58.831,0:24:03.270
garbage collector - totally wild bug.[br]There's probably, I'd say there's probably
0:24:03.270,0:24:07.830
3 people on earth that could have spotted[br]this bug through source review. It
0:24:07.830,0:24:12.690
required immense knowledge of the codebase,[br]in my opinion, to be able to recognize this as
0:24:12.690,0:24:16.250
a vulnerability. We found it through[br]fuzzing; we had to root-cause it using time-
0:24:16.250,0:24:23.630
travel debugging: Mozilla's rr, which is an[br]amazing project. And so, yeah. Absolutely
0:24:23.630,0:24:28.100
use debuggers. This is an example of a[br]call stack again, just using a debugger to
0:24:28.100,0:24:32.043
dump the callstack from a function that[br]you are auditing can give you an insane
0:24:32.043,0:24:36.500
amount of context as to how that function[br]is used, what kind of data it's operating
0:24:36.500,0:24:42.294
on. Maybe, you know, what kind of areas of[br]the codebase it's called from. You're not
0:24:42.294,0:24:46.490
actually supposed to be able to read the[br]slide, but it's a
0:24:46.490,0:24:56.370
backtrace from GDB that is 40 or 50 frames[br]deep. All right. So there is this
0:24:56.370,0:25:01.420
huge misconception by novices that new[br]code is inherently more secure and that
0:25:01.420,0:25:07.054
vulnerabilities are only being removed[br]from code bases, not added. This is almost
0:25:07.054,0:25:11.980
patently false and this is something that[br]I've observed over the course of several
0:25:11.980,0:25:18.377
years. Countless targets you know, code[br]from all sorts of vendors. And there's
0:25:18.377,0:25:24.340
this really great blog post put out by[br]Ivan Fratric from GPZ this past fall, and in his
0:25:24.340,0:25:29.469
blog post he basically ... so one year ago[br]he fuzzed WebKit using his fuzzer
0:25:29.469,0:25:33.250
called Domato. He found a bunch of[br]vulnerabilities, he reported them and then
0:25:33.250,0:25:39.680
he open-sourced the fuzzer. But then this[br]year, this fall, he downloaded his fuzzer,
0:25:39.680,0:25:43.553
ran it again with little to no changes,[br]just to get things up and running. And
0:25:43.553,0:25:47.680
then he found another eight plus[br]exploitable use after free vulnerabilities.
0:25:47.680,0:25:51.400
So what's really amazing[br]about this, is when you look at these last
0:25:51.400,0:25:55.590
two columns that I have highlighted in[br]red, virtually all the bugs he found had
0:25:55.590,0:26:03.950
been introduced or regressed in the past[br]12 months. So yes, new vulnerabilities get
0:26:03.950,0:26:11.110
introduced every single day. The biggest[br]reason new code is considered harmful, is
0:26:11.110,0:26:16.270
simply that it's not had years to sit in[br]market. This means it hasn't had time to
0:26:16.270,0:26:20.843
mature, it hasn't been tested exhaustively[br]like the rest of the code base. As soon as
0:26:20.843,0:26:24.786
that developer pushes it, whenever it hits[br]release, whenever it hits stable that's
0:26:24.786,0:26:29.000
when you have a billion users pounding at[br]it - let's say on Chrome. I don't know how
0:26:29.000,0:26:33.020
big that user base is, but it's massive, and[br]that's countless users around the world
0:26:33.020,0:26:37.537
just using the browser who are effectively[br]fuzzing it just by browsing the web. And
0:26:37.537,0:26:41.120
so of course you're going to manifest[br]interesting conditions that will cover
0:26:41.120,0:26:47.046
things that are not in your test cases and[br]unit testing.
0:26:47.046,0:26:50.490
The second point down here is that it's[br]not uncommon for new code to break
0:26:50.490,0:26:55.360
assumptions made elsewhere in the code[br]base. And this is also actually extremely
0:26:55.360,0:26:59.720
common. The complexity of these code bases[br]can be absolutely insane and it can be
0:26:59.720,0:27:04.263
extremely hard to tell if let's say some[br]new code that Joe Schmoe, the new
0:27:04.263,0:27:10.420
developer, adds breaks some paradigm held[br]by let's say the previous owner of the
0:27:10.420,0:27:14.870
codebase. He maybe doesn't understand it[br]as well - you know, maybe it could be an
0:27:14.870,0:27:22.996
expert developer who just made a mistake.[br]It's super common. Now a piece of advice.
0:27:22.996,0:27:27.200
This should be a no brainer for bug[br]hunting but novices often grow impatient
0:27:27.200,0:27:30.320
and start hopping around between code and[br]functions and getting lost or trying to
0:27:30.320,0:27:35.920
chase use-after-frees or bug classes[br]without really truly understanding what
0:27:35.920,0:27:41.780
they're looking for. So a great starting[br]point is always identify the sources of
0:27:41.780,0:27:45.700
user input or the way that you can[br]interface with the program and then just
0:27:45.700,0:27:50.890
follow the data, follow it down. You know[br]what functions parse it, what manipulates
0:27:50.890,0:27:58.611
your data, what reads it, what writes to[br]it. You know just keep it simple. And so
0:27:58.611,0:28:03.570
when we were looking for our sandbox escape,[br]we were looking at WindowServer. And
0:28:03.570,0:28:07.370
we didn't know anything about Mac, but our[br]research turned up this blog post from
0:28:07.370,0:28:12.000
Keen Lab that was like, "Oh, there are all[br]these functions that you can send data to
0:28:12.000,0:28:15.700
in WindowServer", and apparently there are[br]about six hundred of them, all
0:28:15.700,0:28:21.090
prefixed with double[br]underscore X (__X).
0:28:21.090,0:28:28.250
And so these 600 endpoints[br]will parse and operate upon data that we
0:28:28.250,0:28:33.430
send to them. And so to draw a rough[br]diagram, there is essentially this big red
0:28:33.430,0:28:37.760
data tube from the Safari sandbox to the[br]WindowServer system service. This tube
0:28:37.760,0:28:43.651
can deliver arbitrary data that we control[br]to all those six hundred endpoints. We
0:28:43.651,0:28:48.001
immediately thought let's just try to man-[br]in-the-middle this data pipe, so that we
0:28:48.001,0:28:52.260
can see what's going on. And so that's[br]exactly what we did. We just hooked up
0:28:52.260,0:28:58.898
Frida to it, an open-source DBI. It's[br]on GitHub. It's pretty cool. And we're
0:28:58.898,0:29:04.590
able to stream all of the messages flowing[br]over this pipe so we can see all this data
0:29:04.590,0:29:08.870
just being sent into the window server[br]from all sorts of applications - actually
0:29:08.870,0:29:12.800
everything on Mac OS talks to this. The[br]window server is responsible for drawing
0:29:12.800,0:29:17.490
all your windows on the desktop, your[br]mouse clicks, your whatever. It's kind of
0:29:17.490,0:29:23.820
like explorer.exe on Windows. So you know[br]we see all this data coming through, we
0:29:23.820,0:29:29.050
see all these crazy messages, all these[br]unique message formats, all these data
0:29:29.050,0:29:33.948
buffers that it's sending in, and this is[br]just begging to be fuzzed.
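As a sketch of the man-in-the-middle idea (not the speakers' exact Frida script), something like this can attach to WindowServer as root with Frida's Python bindings and dump the first bytes of every mach message it receives:

```python
import frida

# Hook mach_msg inside WindowServer and log received messages.
# MACH_RCV_MSG is bit 2 of the options argument; the message buffer
# (args[0]) is only filled in by the time mach_msg returns.
JS = """
Interceptor.attach(Module.findExportByName(null, 'mach_msg'), {
    onEnter(args) {
        this.msg = args[0];
        this.receiving = args[1].toInt32() & 0x2;  // MACH_RCV_MSG
    },
    onLeave(retval) {
        if (this.receiving && retval.toInt32() === 0) {
            // Dump the first 64 bytes of the message (header + payload).
            send(hexdump(this.msg, { length: 64 }));
        }
    }
});
"""

def on_message(message, data):
    if message["type"] == "send":
        print(message["payload"])

session = frida.attach("WindowServer")   # attaching requires root
script = session.create_script(JS)
script.on("message", on_message)
script.load()
input("Streaming mach messages; press enter to stop...\n")
```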
0:29:33.948,0:29:38.080
"OK, let's fuzz it" and we're getting all[br]hyped and I distinctly remember thinking
0:29:38.080,0:29:42.440
maybe we can jerry-rig AFL into the[br]window server or let's mutate these
0:29:42.440,0:29:51.059
buffers with Radamsa, or why don't we just[br]try flipping some bits. So that's what we did.
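And flipping bits really can be that dumb; a minimal Python sketch of such a mutator (the flip count and sample message bytes are arbitrary):

```python
import random

def flip_bits(buf: bytes, flips: int = 8) -> bytes:
    """Return a copy of buf with a few randomly chosen bits inverted."""
    out = bytearray(buf)
    for _ in range(flips):
        pos = random.randrange(len(out))
        out[pos] ^= 1 << random.randrange(8)
    return bytes(out)

# e.g. mutate a captured message before replaying it at the target
original = bytes.fromhex("1f00001300000000")
print(flip_bits(original).hex())
```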
0:29:51.059,0:29:57.030
So there was actually a very timely tweet[br]just a few weeks back that echoed this
0:29:57.030,0:30:02.590
exact experience. It said that "Looking at[br]my Security / Vulnerability research
0:30:02.590,0:30:06.753
career, my biggest mistakes were almost[br]always trying to be too clever. Success
0:30:06.753,0:30:11.587
hides behind what is the dumbest thing[br]that could possibly work." The takeaway
0:30:11.587,0:30:18.030
here is that you should always start[br]simple and iterate. So this is our Fuzz
0:30:18.030,0:30:22.520
farm. It's a single 13 inch MacBook Pro. I[br]don't know if this is actually going to
0:30:22.520,0:30:26.220
work, it's not a big deal if it doesn't.[br]I'm only gonna play a few seconds of it.
0:30:26.220,0:30:31.953
This is me literally placing my wallet on[br]the enter key and you can see this box
0:30:31.953,0:30:35.490
popping up and we're fuzzing - our fuzzer[br]is running now and flipping bits in the
0:30:35.490,0:30:39.366
messages. And the screen is changing[br]colors. You're going to start seeing the
0:30:39.366,0:30:42.905
boxes freaking out. It's going all over[br]the place. This is because the bits are
0:30:42.905,0:30:46.660
being flipped, it's corrupting stuff, it's[br]changing the messages. Normally, this
0:30:46.660,0:30:51.330
little box is supposed to show your[br]password hint. But the thing is, by holding
0:30:51.330,0:30:56.240
the enter key on the lock screen, all[br]this traffic was being generated to the
0:30:56.240,0:31:00.490
window server, and every time the window[br]server crashed - you know where it brings
0:31:00.490,0:31:03.809
you? It brings you right back to your lock[br]screen. So we had this awesome fuzzing
0:31:03.809,0:31:16.003
setup by just holding the enter key.[br]Applause
0:31:16.003,0:31:21.810
And, you know, we lovingly titled that[br]picture "Advanced Persistent Threat" in
0:31:21.810,0:31:29.630
our blog. So this is a crash that we got[br]out of the fuzzer. This occurred very
0:31:29.630,0:31:34.299
quickly after ... this was probably within[br]the first 24 hours. So we found a ton of
0:31:34.299,0:31:38.670
crashes, we didn't even explore all of[br]them. There are probably a few still
0:31:38.670,0:31:44.380
sitting on our server. But most of the rest[br]was ... lots of garbage. But
0:31:44.380,0:31:48.780
then this one stands out in particular:[br]Anytime you see this thing up here that says
0:31:48.780,0:31:53.679
"EXC_BAD_ACCESS" with a big number up[br]there with address equals blah blah blah.
0:31:53.679,0:31:57.520
That's a really bad place to be. And so[br]this is the bug that we ended up using at
0:31:57.520,0:32:01.250
PWN2OWN to perform our sandbox escape. If[br]you want to read about it again, it's on
0:32:01.250,0:32:05.710
the blog, we're not going to go too deep[br]into it here. So maybe some of you have
0:32:05.710,0:32:12.590
seen the infosec comic. You know it's all[br]about how people try to do these really
0:32:12.590,0:32:17.320
cool, clever things. People[br]can get too caught up trying to inject so
0:32:17.320,0:32:21.930
much science and technology into these[br]problems that they often miss the forest
0:32:21.930,0:32:26.750
for the trees. And so here we are in the[br]second panel. We just wrote this really
0:32:26.750,0:32:31.832
crappy little fuzzer and we found our bug[br]pretty quickly. And this guy's really
0:32:31.832,0:32:38.670
upset. Which brings us to the[br]misconception that only expert researchers
0:32:38.670,0:32:42.950
with blank tools can find bugs. And so you[br]can fill in the blank with whatever you
0:32:42.950,0:32:51.000
want. It can be cutting edge tools, state[br]of the art, state sponsored, magic bullet.
0:32:51.000,0:32:58.720
This is not true. There are very few[br]secrets. So the next observation: you
0:32:58.720,0:33:03.350
should be very wary of any bugs that you[br]find quickly. A good mantra is that an
0:33:03.350,0:33:08.840
easy to find bug is just as easily found[br]by others. And so what this means is that
0:33:08.840,0:33:13.260
soon after our blog post went out ...[br]actually, at PWN2OWN 2018, we knew
0:33:13.260,0:33:18.640
we had collided with fluorescence, one of[br]the other competitors. We both struggled
0:33:18.640,0:33:24.580
with exploiting this issue ... it is a[br]difficult bug to exploit. And we were ...
0:33:24.580,0:33:29.884
we had some very creative exploit, it was[br]very strange. But there was some
0:33:29.884,0:33:33.490
discussion after the fact on Twitter,[br]started by Ned - he's probably
0:33:33.490,0:33:36.299
out here, actually speaking tomorrow. You[br]guys should go see his talk about the
0:33:36.299,0:33:42.610
Chrome IPC. That should be really good.[br]But there is some discussion on Twitter,
0:33:42.610,0:33:47.170
that Ned had started, and Niklas, who is[br]also here, said "well, at least three
0:33:47.170,0:33:52.130
teams found it separately". So at least[br]us, fluorescence and Niklas had found
0:33:52.130,0:33:57.320
this bug. And we were all at PWN2OWN, so[br]you can think how many people out there
0:33:57.320,0:34:01.200
might have also found this. There's[br]probably at least a few. How many people
0:34:01.200,0:34:07.160
actually tried to weaponize this thing?[br]Maybe not many. It is kind of a difficult
0:34:07.160,0:34:14.790
bug. And so there are probably at least a[br]few other researchers who are aware of
0:34:14.790,0:34:21.070
this bug. So yeah, that kind of closes this[br]out: if you found a bug very quickly,
0:34:21.070,0:34:25.360
especially with fuzzing, you can almost[br]guarantee that someone else has found it.
0:34:25.360,0:34:31.040
So I want to pass over the next section to[br]Amy to continue.
0:34:31.040,0:34:38.360
Amy: So we just talked a bunch about, you[br]know, techniques and expectations when
0:34:38.360,0:34:42.070
you're actually looking for the bug. Let[br]me take over here and talk a little bit
0:34:42.070,0:34:48.179
about what to expect when trying to[br]exploit whatever bug you end up finding.
0:34:48.179,0:34:53.600
Yeah and so we have the exploit[br]development as the next step. So OK, you
0:34:53.600,0:34:57.221
found a bug right, you've done the hard[br]part. You were looking at whatever your
0:34:57.221,0:35:01.090
target is, maybe it's a browser maybe it's[br]the window server or the kernel or
0:35:01.090,0:35:05.784
whatever you're trying to target. But the[br]question is how do you actually do the rest?
0:35:05.784,0:35:10.500
How do you go from the bug to[br]actually popping a calculator onto the
0:35:10.500,0:35:15.650
screen? The systems that you're working[br]with have such a high level of complexity
0:35:15.650,0:35:19.880
that even just understanding, you know,[br]enough to know how your bug works might
0:35:19.880,0:35:23.650
not be enough to actually know how to[br]exploit it. So do we try to, like, brute-force
0:35:23.650,0:35:28.560
our way to an exploit? Is that a good[br]idea? Well, all right, before we try to
0:35:28.560,0:35:34.390
tackle your bug let's take a step back and[br]ask a slightly different question. How do
0:35:34.390,0:35:39.410
we actually write an exploit like this in[br]general? Now I feel like a lot of people
0:35:39.410,0:35:44.461
consider these kinds of exploits maybe to be[br]in their own league, at least when you
0:35:44.461,0:35:49.750
compare them to something like maybe what[br]you'd do at a CTF competition or something
0:35:49.750,0:35:55.859
simpler like that. And if you were for[br]example to be given a browser exploit
0:35:55.859,0:36:00.340
challenge at a CTF competition it may seem[br]like an impossibly daunting task has just
0:36:00.340,0:36:04.620
been laid in front of you if you've never[br]done this stuff before. So how can we work
0:36:04.620,0:36:09.350
to sort of change that view? And you know[br]it might be kind of cliche but I actually
0:36:09.350,0:36:14.090
think the best way to do it is through[br]practice. And I know everyone says "oh how
0:36:14.090,0:36:19.540
do you get good", "oh, practice". But I[br]think that this is actually very valuable
0:36:19.540,0:36:24.930
for this. And the way that practice[br]actually plays out is that, well, earlier we
0:36:24.930,0:36:29.010
talked a lot about consuming everything[br]you could about your targets, like
0:36:29.010,0:36:33.740
searching for everything you could that's[br]public, downloading it, trying to read it
0:36:33.740,0:36:36.960
even if you don't understand it, because[br]you'll hopefully glean something from it;
0:36:36.960,0:36:41.550
it doesn't hurt but maybe your goal now[br]could be actually trying to understand it
0:36:41.550,0:36:46.810
at least as much as you can. You know,[br]it's going to be... it's not going to be
0:36:46.810,0:36:53.130
easy. These are very intricate systems[br]that we're attacking here. And so it will
0:36:53.130,0:36:57.150
be a lot of work to understand this stuff.[br]But for every old exploit you can work
0:36:57.150,0:37:02.240
your way through, the path will become[br]clearer for actually exploiting these
0:37:02.240,0:37:10.600
targets. So, because I focused mostly on[br]browser work, I did the browser part
0:37:10.600,0:37:16.540
of our chain, at least the exploitation[br]part. I have done a lot of exploits and
0:37:16.540,0:37:21.080
read a ton of browser exploits and one[br]thing that I have found is that a lot of
0:37:21.080,0:37:26.170
them have very, very similar structure.[br]They'll have similar techniques in them,
0:37:26.170,0:37:30.923
they'll have similar sort of primitives[br]that are being used to build up the
0:37:30.923,0:37:37.660
exploit. And so that's one observation.[br]And to actually illustrate that I have an
0:37:37.660,0:37:42.770
example. So alongside us at[br]PWN2OWN this spring we had Samuel Groß
0:37:42.770,0:37:48.920
of phoenhex. He's probably here right now.[br]So he was targeting Safari just like we
0:37:48.920,0:37:53.550
were. But his bug was in the just in time[br]compiler, the JIT, which converts
0:37:53.550,0:38:00.061
JavaScript to the machine code. Our bug[br]was nowhere near that. It was over in the
0:38:00.061,0:38:06.250
garbage collector, so a completely[br]different kind of bug. But his bug was
0:38:06.250,0:38:11.014
super reliable. It was very very clean. I[br]recommend you go look at it online. It's a
0:38:11.014,0:38:18.850
very good resource. And then, a few months[br]later, at PWN2OWN Mobile, another pwning
0:38:18.850,0:38:24.430
event, we had Fluoroacetate, an[br]amazing team who managed to pretty much
0:38:24.430,0:38:28.071
pwn everything they could get their hands[br]on at that competition, including an
0:38:28.071,0:38:33.103
iPhone, which of course uses Safari,[br]so they needed a Safari bug. The Safari
0:38:33.103,0:38:37.690
bug that they had was very similar in[br]structure to the previous bug earlier that
0:38:37.690,0:38:43.320
year, at least in terms of how the bug[br]worked and what you could do with it. So
0:38:43.320,0:38:47.760
now you could exploit both of these bugs[br]with very similar exploit code almost in
0:38:47.760,0:38:52.950
the same way. There were a few tweaks you[br]had to do because Apple added a few things
0:38:52.950,0:39:01.100
since then. But the path between bug and[br]code execution was very similar. Then,
0:39:01.100,0:39:07.070
even a few months after that, there was a[br]CTF called "Real World CTF" which took
0:39:07.070,0:39:11.050
place in China and as the title suggests[br]they had a lot of realistic challenges
0:39:11.050,0:39:18.280
including Safari. So of course my team[br]RPISEC was there and they woke me up in
0:39:18.280,0:39:23.000
the middle of the night and tasked me with[br]solving it. And so I was like "Okay, okay
0:39:23.000,0:39:27.760
I'll look at this". And I looked at it and[br]it was a JIT bug and I've never actually
0:39:27.760,0:39:34.720
before that looked at the Safari JIT. And[br]so, you know, I didn't have much previous
0:39:34.720,0:39:40.490
experience doing that, but I had[br]taken the time to read all the public
0:39:40.490,0:39:45.300
exploits. I read all the other PWN2OWN[br]competitors' exploits and I read all the
0:39:45.300,0:39:49.470
other things that people were releasing[br]for different CVEs. I had seen a bug like
0:39:49.470,0:39:54.790
this before, very similar, and I knew how to[br]exploit it, so I was able to
0:39:54.790,0:39:59.460
quickly build the path from bug to code[br]exec and we actually managed to get first
0:39:59.460,0:40:03.350
blood on the challenge which was really[br]really cool.
0:40:03.350,0:40:12.020
Applause[br]So... So what does this actually mean?
0:40:12.020,0:40:19.160
Well, I think not every bug is going to be[br]that easy to swap into an exploit, but I
0:40:19.160,0:40:23.350
do think that understanding old exploits[br]is extremely valuable if you're trying to
0:40:23.350,0:40:28.840
exploit new bugs. A good place to start if[br]you're interested in looking at old bugs
0:40:28.840,0:40:34.170
is places like js-vuln-db,[br]which is basically a repository of a
0:40:34.170,0:40:39.200
whole bunch of JavaScript bugs and proof[br]of concepts and sometimes even exploits
0:40:39.200,0:40:43.690
for them. And so if you were to go through[br]all of those, I guarantee by the end you'd
0:40:43.690,0:40:49.560
have a great understanding of the types of[br]bugs that are showing up these days and
0:40:49.560,0:40:55.430
probably how to exploit most of them.[br]And... But there aren't that many bugs
0:40:55.430,0:41:02.430
that get published that are full exploits.[br]There's only a couple a year maybe. So
0:41:02.430,0:41:05.260
what do you do from there, once you've read[br]all those and you need to learn more?
0:41:05.260,0:41:12.540
Well maybe start trying to exploit other[br]bugs yourself so you can go... For
0:41:12.540,0:41:16.260
example, I like Chrome because they have a[br]very nice list of all their
0:41:16.260,0:41:19.510
vulnerabilities that they post every time[br]they have an update and they even link you
0:41:19.510,0:41:24.980
to the issue, so you can go and see[br]exactly what was wrong and so take some of
0:41:24.980,0:41:30.500
these, for example: at the very top you[br]have an out-of-bounds write in V8. So we
0:41:30.500,0:41:34.000
could click on that and go and see what[br]the bug was and then could try to write an
0:41:34.000,0:41:38.130
exploit for it. And then by the end we'd[br]have a much better idea of how to
0:41:38.130,0:41:43.970
exploit an out-of-bounds write in V8 and[br]we've now done it ourselves too. So this
0:41:43.970,0:41:47.650
is a chance to sort of apply what you've[br]learned. But you say OK that's a lot of
0:41:47.650,0:41:53.030
work. You know I have to do all kinds of[br]other stuff, I'm still in school or I have
0:41:53.030,0:41:59.690
a full time job. Can't I just play CTFs?[br]Well it's a good question. The question is
0:41:59.690,0:42:03.310
how much do CTFs actually help you with[br]these kinds of exploits? I do think that
0:42:03.310,0:42:06.310
you can build a very good mindset for this[br]because you need a very adversarial
0:42:06.310,0:42:13.410
mindset to do this sort of work. But a lot[br]of the times the challenges don't really
0:42:13.410,0:42:17.270
represent the real world exploitation.[br]There was a good tweet just the other day,
0:42:17.270,0:42:23.890
like a few days ago, where someone was saying[br]that, yeah, random libc
0:42:23.890,0:42:28.660
challenges - yes, it's libc here - are
0:42:28.660,0:42:33.330
often very artificial and don't carry much[br]value to the real world because they're very
0:42:33.330,0:42:38.570
specific. Some people love these sort of[br]very specific CTF challenges, but I don't
0:42:38.570,0:42:43.410
think that there's as much value as there[br]could be. However, there have been
0:42:43.410,0:42:48.040
a couple CTFs recently and historically as[br]well that have had pretty realistic
0:42:48.040,0:42:56.580
challenges in them. In fact, right now the[br]35C3 CTF is running, and they have 3
0:42:56.580,0:43:00.250
browser exploit challenges, they have a[br]full-chain Safari challenge, they have a
0:43:00.250,0:43:05.950
VirtualBox challenge. It's[br]pretty crazy, and it's crazy to see people
0:43:05.950,0:43:10.810
solve those challenges in such a short[br]time span too. But I think it's definitely
0:43:10.810,0:43:15.020
something that you can look at afterwards[br]even if you don't manage to get through
0:43:15.020,0:43:19.940
one of those challenges today. But[br]something to try to work on. And so these
0:43:19.940,0:43:24.990
are the sort of new CTFs that are actually[br]pretty good for people who want to jump
0:43:24.990,0:43:31.690
off into this kind of real[br]exploit development work. However, it can
0:43:31.690,0:43:36.310
be kind of scary for newcomers to[br]the CTF scene, because suddenly, you know,
0:43:36.310,0:43:39.681
it's your first CTF and they're asking you[br]to exploit Chrome and you're like what...
0:43:39.681,0:43:45.730
what is going on here. So there, it is a[br]double-edged sword sometimes. All right, so
0:43:45.730,0:43:50.590
now we found the bug and we have[br]experience, so what do we actually do?
0:43:50.590,0:43:54.610
Well, you kind of have to get lucky,[br]though, because even if you've had a ton of
0:43:54.610,0:43:58.820
experience that doesn't necessarily mean[br]that you can instantly write an exploit
0:43:58.820,0:44:03.390
for a bug. Our JavaScript exploit was kind[br]of like that, it was kind of nice, we knew
0:44:03.390,0:44:09.170
what to do right away. But our[br]sandbox exploit did not
0:44:09.170,0:44:14.110
fit into a nice box of a previous exploit[br]that we had seen. So it took a lot of
0:44:14.110,0:44:19.020
effort. Quickly I'll show... So this was[br]the actual bug that we exploited for the
0:44:19.020,0:44:25.340
sandbox. It's a pretty simple bug: an[br]integer issue where the index is signed, which
0:44:25.340,0:44:30.160
means it can be negative. So normally it[br]expects a value like 4, but we could give
0:44:30.160,0:44:34.920
it a value like negative 3, and that would[br]make it go out of bounds and let us corrupt memory.
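To make the shape of such a bug concrete, here is a toy Python model (an invented layout, not Apple's actual code) of a signed index that only gets an upper-bounds check:

```python
# Toy model of the signed-index bug: the bounds check only rejects
# indices that are too large, so a negative index slips through and
# the write lands *before* the array.

memory = bytearray(64)          # pretend this is the process heap
BASE = 32                       # a 4-entry "array" lives at offset 32
LENGTH = 4

def set_entry(index: int, value: int) -> None:
    # The flawed check: index is signed, but only the upper bound is tested.
    if index >= LENGTH:
        raise ValueError("out of bounds")
    offset = BASE + index * 4   # index = -3 -> offset = 20, before the array
    memory[offset:offset + 4] = value.to_bytes(4, "little")

set_entry(3, 0x41414141)        # fine: last valid entry
set_entry(-3, 0x41414141)       # "passes" the check, corrupts adjacent memory
```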
0:44:34.920,0:44:39.830
corrupt memory.
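[Editor's sketch: a minimal C illustration of this bug class. The struct, names, and bounds are invented for illustration; this is not the actual code under discussion.]

    #include <stdint.h>

    /* Hypothetical message handler with a signed-index bug. */
    typedef struct {
        uint64_t slots[8];
    } table_t;

    void set_slot(table_t *t, int32_t index, uint64_t value) {
        /* The bounds check only guards the upper end. Because 'index'
         * is signed, a caller-supplied value like -3 passes the check... */
        if (index >= 8)
            return;
        /* ...and writes out of bounds, before the start of the array. */
        t->slots[index] = value;
    }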
0:44:39.830,0:44:44.340
So it's a very simple bug, not a crazy complex one[br]like some of the other ones we've seen. But does that mean that this exploit is going to be really
0:44:44.340,0:44:52.440
simple? Well let's see... That's a lot of[br]code. So our exploit for this bug ended up
0:44:52.440,0:44:59.400
being about 1300 lines. And so that's[br]pretty crazy. And you're probably
0:44:59.400,0:45:04.750
wondering how it got there, but I do want[br]to say: just be aware that when you do find
0:45:04.750,0:45:09.410
a simple-looking bug, it might not be[br]that easy to solve or to exploit. And it
0:45:09.410,0:45:14.654
might take a lot of effort but don't get[br]discouraged if it happens to you. It just
0:45:14.654,0:45:19.420
means it's time to ride the exploit[br]development rollercoaster. And basically
0:45:19.420,0:45:24.150
what that means is there's a lot of ups[br]and downs to an exploit and we have to
0:45:24.150,0:45:28.010
basically ride the rollercoaster until[br]hopefully we have the exploit
0:45:28.010,0:45:36.020
finished. We had to do that for our[br]sandbox escape. So, say we found the
0:45:36.020,0:45:41.880
bug, and we had a bunch of great ideas: we'd[br]previously seen a bug exploited like this
0:45:41.880,0:45:47.250
by Keen, and we read their papers and we[br]had a great idea. But then we were like:
0:45:47.250,0:45:51.422
OK, it's going to work, we just have to make[br]sure this one bit is not set. And it was
0:45:51.422,0:45:55.750
in a random-looking value, so we[br]assumed it would be fine. But it turns out
0:45:55.750,0:46:00.690
that bit is always set and we have no idea[br]why and no one else knows why, so thank
0:46:00.690,0:46:06.450
you Apple for that. And so OK maybe we can[br]work around it, maybe we can figure out a
0:46:06.450,0:46:11.190
way to unset it and we're like oh yes we[br]can delete it. It's going to work again!
0:46:11.190,0:46:14.670
Everything will be great! Until we realized[br]that it actually broke the rest of the
0:46:14.670,0:46:20.900
exploit. So it's this back and forth, it's[br]an up and down. And you know sometimes
0:46:20.900,0:46:26.920
when you solve one issue you think you've[br]got what you need and then another issue
0:46:26.920,0:46:30.930
shows up.[br]So it's all about making incremental
0:46:30.930,0:46:35.870
progress towards removing all the issues[br]that are in your way, getting at least
0:46:35.870,0:46:38.770
something that works.[br]Marcus: And so just as a quick aside, this
0:46:38.770,0:46:41.450
all happened within like 60 minutes one[br]night.
0:46:41.450,0:46:44.920
Amy: Yeah.[br]Marcus: Amy saw me just like I was
0:46:44.920,0:46:49.609
walking around, out of breath, like, are you[br]kidding me? There were two bugs that tripped
0:46:49.609,0:46:54.110
us up that made this bug much more[br]difficult to exploit. And there was no good
0:46:54.110,0:46:58.320
reason why those issues were there, and[br]it was a horrible experience.
0:46:58.320,0:47:04.260
Amy: But it's still one I'd recommend. And[br]so this rollercoaster actually applies
0:47:04.260,0:47:10.950
to the entire process, not just, you[br]know, the exploit development, because
0:47:10.950,0:47:15.940
you'll have it when you find crashes that[br]don't actually lead to vulnerabilities or
0:47:15.940,0:47:20.340
unexplainable crashes or super unreliable[br]exploits. You just have to keep pushing
0:47:20.340,0:47:24.450
your way through until eventually,[br]hopefully, you get to the end of the ride
0:47:24.450,0:47:32.740
and you've got yourself a nice exploit.[br]OK. So now, assume we've
0:47:32.740,0:47:36.400
written an exploit at this point. Maybe[br]it's not the most reliable thing but it
0:47:36.400,0:47:42.060
works like I can get to my code exec every[br]now and then. So we have to start talking
0:47:42.060,0:47:46.721
about the payload. So what is the payload[br]exactly? A payload is whatever your
0:47:46.721,0:47:51.430
exploit is actually trying to do. It could[br]be trying to open up a calculator on the
0:47:51.430,0:47:56.870
screen, it could be trying to launch your[br]sandbox escape exploit, it could be trying
0:47:56.870,0:48:01.859
to clean up your system after your[br]exploit. And by that, I mean fix the
0:48:01.859,0:48:05.870
program that you're actually exploiting.
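[Editor's sketch: the first case above, the classic "pop a calculator" payload, in plain C. The Calculator path assumes a stock macOS install; in a real exploit this logic would run inside the exploited process, not as its own program.]

    #include <spawn.h>
    #include <stdio.h>
    #include <sys/types.h>

    extern char **environ;

    /* Spawning Calculator is harmless, but visibly proves code exec. */
    int main(void) {
        pid_t pid;
        char *argv[] = {
            "/Applications/Calculator.app/Contents/MacOS/Calculator", NULL
        };
        int rc = posix_spawn(&pid, argv[0], NULL, NULL, argv, environ);
        if (rc != 0) {
            fprintf(stderr, "posix_spawn failed: %d\n", rc);
            return 1;
        }
        printf("Calculator spawned as pid %d\n", (int)pid);
        return 0;
    }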
0:48:05.870,0:48:11.240
So in CTF we don't get a lot of practice[br]with this, because we're so used to doing 'system', you know, 'cat flag', and then
0:48:11.240,0:48:14.910
it doesn't matter if the entire program is[br]crashing down in flames around us because
0:48:14.910,0:48:19.670
we got the flag. So in this case yeah you[br]cat the flag and then it crashes right
0:48:19.670,0:48:24.410
away because you didn't have anything[br]after your action. But in the real world
0:48:24.410,0:48:27.760
it matters all the more. So here's[br]an example of what would happen if your
0:48:27.760,0:48:32.640
exploit didn't clean up after itself, and[br]just crashed and went back to the login
0:48:32.640,0:48:37.590
screen. This doesn't look very good. If[br]you're at a competition like Pwn2Own this
0:48:37.590,0:48:44.390
won't work, I don't think that they would[br]let you win if this happened. And so, it's
0:48:44.390,0:48:48.589
very important to try to go back and fix[br]up any damage that you've done to the
0:48:48.589,0:48:55.339
system before it crashes, right after you've[br]finished. Right. And so, actually running
0:48:55.339,0:49:00.981
your payload. So, a lot of times,[br]in the exploits we see, you'll get to
0:49:00.981,0:49:05.560
the code exec here, which is just CC bytes,[br]meaning INT3, which just tells the
0:49:05.560,0:49:10.700
program to stop, or trap to a breakpoint. In[br]all the exploits you see, most of the time
0:49:10.700,0:49:13.580
they just stop there. They don't tell you[br]any more. And to be fair, you know, they've
0:49:13.580,0:49:16.820
gotten their code exec and they're just[br]talking about the exploit. But you, you
0:49:16.820,0:49:20.349
still have to figure out how to do your[br]payload because unless you want to write
0:49:20.349,0:49:26.349
those 1300 lines of code in handwritten[br]assembly and then make it into shellcode,
0:49:26.349,0:49:32.230
you're not going to have a good time. And[br]so, we had to figure out a way to actually
0:49:32.230,0:49:37.570
take our payload, write it to the file[br]system in the only place that the sandbox
0:49:37.570,0:49:42.560
would let us, and then we could run it[br]as a library, and then it would go
0:49:42.560,0:49:50.460
and actually do our exploit.
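[Editor's sketch: the staging idea described above, as a stand-alone C program. The payload path is hypothetical; the point is that a tiny stub can dlopen a full compiled payload instead of hand-writing it as shellcode.]

    #include <dlfcn.h>
    #include <stdio.h>

    int main(void) {
        /* Hypothetical sandbox-writable location for the staged payload. */
        void *handle = dlopen("/tmp/payload.dylib", RTLD_NOW);
        if (handle == NULL) {
            fprintf(stderr, "dlopen failed: %s\n", dlerror());
            return 1;
        }
        /* Any __attribute__((constructor)) code in the library has now
         * run, so the real payload executed on load. */
        return 0;
    }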
0:49:50.460,0:49:56.430
And so now that you've assembled everything,[br]you're almost done here. You have your exploit working. You get a
0:49:56.430,0:50:00.020
calculator pops up. This was actually our[br]sandbox escape running and popping a
0:50:00.020,0:50:04.320
calculator, proving that we had root[br]code exec. But we're not completely done
0:50:04.320,0:50:09.630
yet because we need to do a little bit[br]more, which is exploit reliability. We
0:50:09.630,0:50:12.740
need to make sure that our exploit is[br]actually as reliable as we want it to be,
0:50:12.740,0:50:17.270
because if it only works 1 in 100 times[br]that's not going to be very good. For
0:50:17.270,0:50:22.310
Pwn2Own, we ended up building a harness[br]for our Mac which would let us run the
0:50:22.310,0:50:26.160
exploit multiple times, and then collect[br]information about it so we could look
0:50:26.160,0:50:30.500
here, and we could see, very easily how[br]often it would fail and how often it would
0:50:30.500,0:50:36.130
succeed, and then we could go and get more[br]information, maybe a log, and other stuff
0:50:36.130,0:50:41.760
like how long it ran. And this made it[br]very easy to iterate over our exploit, and
0:50:41.760,0:50:48.290
try to correct issues and make it better[br]and more reliable.
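[Editor's sketch: a rough harness along these lines, assuming the exploit is packaged as a ./exploit binary that exits 0 on success. This is not the team's actual Pwn2Own harness.]

    #include <stdio.h>
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void) {
        int runs = 50, ok = 0, fail = 0, crash = 0;
        for (int i = 0; i < runs; i++) {
            pid_t pid = fork();
            if (pid == 0) {
                execl("./exploit", "./exploit", (char *)NULL);
                _exit(127);                  /* exec itself failed */
            }
            int status = 0;
            waitpid(pid, &status, 0);
            if (WIFSIGNALED(status))
                crash++;                     /* e.g. segfault mid-run */
            else if (WEXITSTATUS(status) == 0)
                ok++;
            else
                fail++;
        }
        printf("ok=%d fail=%d crash=%d (%.1f%% reliable)\n",
               ok, fail, crash, 100.0 * ok / runs);
        return 0;
    }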
0:50:48.290,0:50:52.589
We found that most of our failures were[br]coming from our heap groom, which is where we try to line up all
0:50:52.589,0:50:57.100
your memory in certain ways, but there's[br]not much that you can do there in our
0:50:57.100,0:51:00.570
situation, so we made it as good[br]as we could and then accepted the
0:51:00.570,0:51:06.680
reliability that we got. Something[br]else you might want to test on is a bunch of
0:51:06.680,0:51:11.410
multiple devices. For example, our[br]JavaScript exploit was a race condition,
0:51:11.410,0:51:15.300
so that means the number of CPUs on the[br]device and the speed of the CPU actually
0:51:15.300,0:51:20.349
might matter when you're running your[br]exploit. You might want to try different
0:51:20.349,0:51:23.990
operating systems or different operating[br]system versions, because even if they're
0:51:23.990,0:51:28.180
all vulnerable, they may have different[br]quirks, or tweaks, that you have to do to
0:51:28.180,0:51:32.981
actually make your exploit work reliably[br]on all of them. We wanted to
0:51:32.981,0:51:37.340
test on the macOS beta as well as the[br]normal macOS at least, so that we could
0:51:37.340,0:51:42.089
make sure it worked in case Apple pushed[br]an update right before the competition. So
0:51:42.089,0:51:44.359
we did make sure that some parts of our[br]code, and our exploit, could be
0:51:44.359,0:51:49.090
interchanged. So for example, we have[br]addresses here that are specific to the
0:51:49.090,0:51:52.500
operating system version, and we could[br]swap those out very easily by changing
0:51:52.500,0:51:58.619
which part of the code is used here.
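[Editor's sketch: one common way to make those per-version addresses swappable, a small target table keyed on the OS build. Build strings and offsets here are placeholders, not the real exploit's values.]

    #include <stdint.h>
    #include <string.h>

    typedef struct {
        const char *build;       /* OS build string */
        uint64_t    gadget_off;  /* offset of a gadget in the target lib */
        uint64_t    victim_off;  /* offset of the object we corrupt */
    } target_t;

    static const target_t targets[] = {
        { "18C54", 0x123456, 0x0abcde },  /* release build (placeholder) */
        { "18D21", 0x123998, 0x0ac010 },  /* beta build (placeholder) */
    };

    const target_t *pick_target(const char *build) {
        for (size_t i = 0; i < sizeof(targets) / sizeof(targets[0]); i++)
            if (strcmp(targets[i].build, build) == 0)
                return &targets[i];
        return NULL;  /* unknown build: bail out rather than crash */
    }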
0:51:58.619,0:52:01.610
And then also, if you're targeting some[br]browsers, you might be interested in testing them on mobile too, even if you're
0:52:01.610,0:52:06.400
not targeting the mobile device. Because a[br]lot of times the bugs might also work on a
0:52:06.400,0:52:10.250
phone, or at least the initial bug will.[br]And so that's another interesting target
0:52:10.250,0:52:17.030
you might be interested in if you weren't[br]thinking about it originally. So in
0:52:17.030,0:52:20.170
general what I'm trying to say is try[br]throwing your exploit at everything you
0:52:20.170,0:52:25.690
can, and hopefully you will be able to[br]gather some reliability percentages or
0:52:25.690,0:52:30.640
figure out things that you overlooked in[br]your initial testing. Alright. I'm gonna
0:52:30.640,0:52:35.250
throw it back over for the final section.[br]Marcus: We're in the final section here.
0:52:35.250,0:52:39.210
So I didn't get to spend as much time as I[br]would have liked on this section, but I
0:52:39.210,0:52:43.250
think it's an important discussion to have[br]here. And so the very last step of our
0:52:43.250,0:52:50.849
layman's guide is about responsibilities.[br]And so this is critical. And so you've
0:52:50.849,0:52:54.940
listened to our talk. You've seen us[br]develop the skeleton keys to computers and
0:52:54.940,0:53:01.710
systems and devices. We can create doors[br]into computers and servers and people's
0:53:01.710,0:53:06.490
machines, you can invade privacy, you can[br]deal damage to people's lives and
0:53:06.490,0:53:12.610
companies and systems and countries. So[br]you have to be very
0:53:12.610,0:53:17.870
careful with these. And so everyone in[br]this room, you know if you take any of our
0:53:17.870,0:53:22.869
advice going into this stuff, you know,[br]please acknowledge what you're getting
0:53:22.869,0:53:27.869
into and what can be done to people. And[br]so, there's at least one example that's
0:53:27.869,0:53:31.090
kind of related, that I pulled[br]out quickly, that quickly came
0:53:31.090,0:53:35.860
to mind. It was in 2016. I have to say I[br]remember this day, actually, sitting at
0:53:35.860,0:53:42.820
work. And there was this massive DDoS that[br]plagued the Internet, at least in the
0:53:42.820,0:53:48.110
U.S., and it took down all your favorite[br]sites: Twitter, Amazon, Netflix, Etsy,
0:53:48.110,0:53:54.000
GitHub, Spotify, Reddit. I remember the[br]whole Internet came to a crawl in the U.S.
0:53:54.000,0:54:01.310
This is the Level 3 outage map. This was[br]absolutely insane. And I remember people
0:54:01.310,0:54:05.240
were bouncing off the walls like crazy,[br]you know, after the fact, and they were all
0:54:05.240,0:54:09.490
referencing Bruce Schneier's blog, and[br]on Twitter there was all this
0:54:09.490,0:54:13.800
discussion popping up that "this was[br]likely a state actor". This was a new,
0:54:13.800,0:54:19.370
sophisticated DDoS attack. Bruce suggested[br]it was China or Russia, or, you know, some
0:54:19.370,0:54:23.160
nation state, and the blog post was[br]specifically titled "Someone Is Learning
0:54:23.160,0:54:28.430
How to Take Down the Internet". But then a[br]few months later we found out that this
0:54:28.430,0:54:32.260
was the Mirai botnet, and it was[br]actually just a bunch of kids trying to
0:54:32.260,0:54:38.500
DDoS each other's Minecraft servers. And so[br]it's scary because, you know, I
0:54:38.500,0:54:45.630
have a lot of respect for the young people[br]and how talented they are. But
0:54:45.630,0:54:51.690
people need to be very conscious about the[br]damage that can be caused by these things.
0:54:51.690,0:54:57.780
Mirai, they weren't using 0days per se.[br]Nowadays they are using 0days,
0:54:57.780,0:55:00.747
but back then they weren't; they were just[br]using an IoT-based botnet, one of the
0:55:00.747,0:55:04.977
biggest in the world, with the highest[br]throughput. But it was incredibly
0:55:04.977,0:55:11.710
damaging. And you know,[br]it's hard to recognize the power of an
0:55:11.710,0:55:17.660
0day until you are wielding it. And so[br]that's why it's not the first step of the
0:55:17.660,0:55:20.930
layman's guide. Once you finish this[br]process you will come to realize the
0:55:20.930,0:55:25.510
danger that you can cause, but also the[br]danger that you might be putting yourself
0:55:25.510,0:55:33.070
in. And so I kind of want to close on that:[br]please be very careful. All right. And so,
0:55:33.070,0:55:36.940
that's all we have. This is the[br]conclusion. The layman's guide, that is
0:55:36.940,0:55:41.536
the summary. And if you have any questions[br]we'll take them now. Otherwise if we run
0:55:41.536,0:55:44.890
out of time you can catch us after the talk,[br]and we'll have some cool stickers too,
0:55:44.890,0:55:51.329
so...[br]Applause
0:55:51.329,0:55:59.385
Herald-Angel: Wow, great talk. Thank you.[br]We have very very little time for
0:55:59.385,0:56:03.480
questions. If somebody is very quick they[br]can come up to one of the microphones in
0:56:03.480,0:56:08.070
the front, we will handle one but[br]otherwise, will you guys be available
0:56:08.070,0:56:10.460
after the talk?[br]Marcus: We will be available after the
0:56:10.460,0:56:14.600
talk, if you want to come up and chat. We[br]might get swarmed but we also have some
0:56:14.600,0:56:17.680
cool Ret2 stickers, so come grab them if[br]you want them.
0:56:17.680,0:56:21.339
Herald-angel: And where can we find you?[br]Marcus: We'll be over here. We'll try to
0:56:21.339,0:56:22.720
head out to the back.[br]Herald-angel: Yeah, yeah, because we have
0:56:22.720,0:56:25.390
another talk coming up in a moment or so.[br]Marcus: OK.
0:56:25.390,0:56:29.859
Herald-angel: I don't see any questions.[br]So I'm going to wrap it up at this point,
0:56:29.859,0:56:34.210
but as I said the speakers will be[br]available. Let's give this great talk
0:56:34.210,0:56:35.360
another round of applause.
0:56:35.360,0:56:40.266
Applause
0:56:40.266,0:56:42.251
Outro
0:56:42.251,0:57:04.000
subtitles created by c3subtitles.de[br]in the year 2020. Join, and help us!