1
00:00:00,000 --> 00:00:14,985
34c3 intro
2
00:00:14,985 --> 00:00:23,670
Herald: So, on to our next talk... Sit and
relax, you know what that means. A glass of
3
00:00:23,670 --> 00:00:30,599
wine or mate, your favorite easy chair,
and of course your latest WiFi-enabled toy
4
00:00:30,599 --> 00:00:36,099
compromising your intimate moments.
Barbara Wimmer, a freelance author and
5
00:00:36,099 --> 00:00:40,649
journalist, will tell you more about the
Internet of Fails,
6
00:00:40,649 --> 00:00:47,190
will tell you more about where IoT goes
wrong. She's a freelance author and journalist
7
00:00:47,190 --> 00:00:57,440
at futurezone.at, (DORF?), and will in the
near future release one or two public
8
00:00:57,440 --> 00:01:11,769
stories and a book. Applause!
applause
9
00:01:11,769 --> 00:01:15,780
Barbara Wimmer: Hello everybody. I'm
waiting for my slides to appear on the
10
00:01:15,780 --> 00:01:23,740
screen. Where are my slides, please? Those
are not my slides.
11
00:01:37,420 --> 00:01:48,630
Oh, thank you very much. So welcome to the
talk Internet of Fails when IoT has gone
12
00:01:48,630 --> 00:01:59,140
wrong. This is a very negative topic title
actually and you're getting a lot of
13
00:01:59,140 --> 00:02:06,710
negative stories in this next hour but I
don't want to talk only about negative
14
00:02:06,710 --> 00:02:13,610
things so you can see "FAIL" as a "first
attempt in learning". So actually at the
15
00:02:13,610 --> 00:02:19,030
end of the talk I want to talk about
solutions as well and I don't want to
16
00:02:19,030 --> 00:02:27,290
provide only bad and negative examples
because that's what we hear every day. And
17
00:02:27,290 --> 00:02:33,500
this is perfect for the congress motto
"tuwat" because this is all about let's
18
00:02:33,500 --> 00:02:44,770
tuwat together. So, most of you in
this room will not know me. So I'm
19
00:02:44,770 --> 00:02:51,850
going to introduce myself a little bit and
why I'm talking to you about this topic,
20
00:02:51,850 --> 00:02:58,040
because that's probably what everybody
asks me when I appear somewhere and say oh
21
00:02:58,040 --> 00:03:07,490
I give talks about IoT. So actually I have
worked as an IT journalist for more than 12
22
00:03:07,490 --> 00:03:17,490
years. And I first got in contact with the
Internet of Things in 2014 when I talked to the
23
00:03:17,490 --> 00:03:26,430
local CERT.at team in Austria. I'm from
Vienna. And they told me that the
24
00:03:26,430 --> 00:03:32,420
first refrigerator had been caught
sending out spam mails, and that was in
25
00:03:32,420 --> 00:03:42,470
2014 and actually that was really a funny
story back then and we were laughing about
26
00:03:42,470 --> 00:03:48,530
it but at the same time we already knew
that there is something coming up which is
27
00:03:48,530 --> 00:03:59,870
going to be quite a huge development, and
so from back then I have watched the whole IoT
28
00:03:59,870 --> 00:04:09,150
development in terms of security and
privacy. And in the next 45 minutes you will
29
00:04:09,150 --> 00:04:19,219
hear a lot of stuff about IoT, and where
the problem with IoT is currently and
30
00:04:19,219 --> 00:04:26,400
examples of fails in terms of security and
privacy. But like I mentioned before I
31
00:04:26,400 --> 00:04:31,760
wanna talk about solutions and when we
talk about solutions it will not be like
32
00:04:31,760 --> 00:04:38,019
only one side, like only the consumer,
only the IT-security, only developers.
33
00:04:38,019 --> 00:04:46,740
Actually, what I'm not going to provide is
detailed IT security stuff. So if you
34
00:04:46,740 --> 00:04:53,789
wanna focus more on any story that I'm
talking about I'm mentioning most of the
35
00:04:53,789 --> 00:05:01,709
sources in the slides, and if you
really wanna know how an example came about,
36
00:05:01,709 --> 00:05:06,559
please look it up if you're really
deeply interested in it. I'm a
37
00:05:06,559 --> 00:05:12,889
journalist and not an IT-security person
so please don't expect me to go into
38
00:05:12,889 --> 00:05:19,770
details in this talk. That's why it's also
in the ethics section of the
39
00:05:19,770 --> 00:05:28,759
congress and not the security part. So
coming to the internet of things I want to
40
00:05:28,759 --> 00:05:39,759
start with a few numbers because these
numbers show the development of IoT. In
41
00:05:39,759 --> 00:05:48,700
2016 we had 6.3 billion devices out
there. This year we already had 8.3
42
00:05:48,700 --> 00:05:58,830
billion devices, and in 2020 we
are going to have 20.4 billion
43
00:05:58,830 --> 00:06:05,159
connected devices out there. So the
numbers are from Gartner, from
44
00:06:05,159 --> 00:06:13,699
January and I have one more slide with
more accurate data from June this year and
45
00:06:13,699 --> 00:06:23,400
actually this slide shows that the
development is actually really growing.
46
00:06:23,400 --> 00:06:32,400
17% more compared to the previous year.
And by 2021 global IoT spending is
47
00:06:32,400 --> 00:06:42,389
expected to reach about 1.4 trillion
dollars. So maybe some of you are asking
48
00:06:42,389 --> 00:06:49,809
yourself: What is the internet of things?
Maybe some of you expected I'm only
49
00:06:49,809 --> 00:06:59,669
talking about a smart home, because IoT is
often related to the smart home. And we're
50
00:06:59,669 --> 00:07:06,139
having all the smart devices that we put
into our living rooms, but that's actually
51
00:07:06,139 --> 00:07:12,740
not the main focus because it's more about
the connected everything. Which means
52
00:07:12,740 --> 00:07:19,239
toys, sex toys, home automation,
lightbulbs, surveillance cameras,
53
00:07:19,239 --> 00:07:28,569
thermostats, but also digital assistants
and wearables. So I wanna start with a few
54
00:07:28,569 --> 00:07:37,580
examples of classical internet of things
stuff which is actually a smart coffee
55
00:07:37,580 --> 00:07:45,430
maker. That's ... so what is smart about a
coffee maker? It only gets ... it doesn't
56
00:07:45,430 --> 00:07:51,429
get smart when you regulate your coffee
machine by app because what's smart about
57
00:07:51,429 --> 00:07:58,189
that? You can just press the button on the
machine. But when you connect your coffee
58
00:07:58,189 --> 00:08:05,750
machine with fitness and sleep trackers,
the coffee machine already knows when you
59
00:08:05,750 --> 00:08:13,179
get up and whether you need a strong or mild
coffee in the morning, and so that might sound
60
00:08:13,179 --> 00:08:20,469
comfortable for some of us, but it also
has a lot of dangers inside, because you
61
00:08:20,469 --> 00:08:25,709
never know whether the data is really safe
and only stays with you. Maybe your
62
00:08:25,709 --> 00:08:37,429
insurance company gets it one day. So you
probably all know the film Cars, and
63
00:08:37,429 --> 00:08:46,040
this is Lightning McQueen, and it has become
a toy nowadays which is sold for 350 dollars -
64
00:08:46,040 --> 00:08:55,490
no sorry, euros - and this car is able to
sit next to you and watch the film with
65
00:08:55,490 --> 00:09:02,310
you and is going to comment on the film.
laughter
66
00:09:02,310 --> 00:09:09,740
And it is - this sounds very funny - but -
and it is funny - but it means that it has
67
00:09:09,740 --> 00:09:15,130
a microphone integrated which is listening
for certain terms in the film at the right
68
00:09:15,130 --> 00:09:22,750
moments and then it makes comments. And
the microphone can only be turned off by
69
00:09:22,750 --> 00:09:30,810
app so there's no physical button to turn
it off and actually another thing is when
70
00:09:30,810 --> 00:09:36,410
you first ... when you actually got this
present for Christmas, which is a really
71
00:09:36,410 --> 00:09:46,589
expensive present at 350 euros, it
actually first updates for more than
72
00:09:46,589 --> 00:10:01,230
35 minutes before you can even use it. The next
example - you're already laughing - is
73
00:10:01,230 --> 00:10:09,120
internet of ... I call it internet of shit
because you can't say anything else about
74
00:10:09,120 --> 00:10:16,350
that example. It's a toilet IoT sensor
which is actually a smart, small little
75
00:10:16,350 --> 00:10:25,269
box which is put into the toilet. And this
box has sensors. It's an Intel box but I
76
00:10:25,269 --> 00:10:34,760
am not sure - and these sensors help
analyze the stool.
77
00:10:34,760 --> 00:10:44,360
And the data that is collected is
sent into the cloud. And actually this
78
00:10:44,360 --> 00:10:49,550
could be very useful for people who
have chronic diseases like Colitis
79
00:10:49,550 --> 00:10:59,319
Ulcerosa or other chronic diseases of
the digestive tract, but it is mainly
80
00:10:59,319 --> 00:11:05,480
designed for healthy people who want to
improve their nutrition and reduce their
81
00:11:05,480 --> 00:11:13,870
stress levels with the stool analysis. And
maybe it sounds good at the beginning but
82
00:11:13,870 --> 00:11:21,709
this data that is collected could also be
used for other things in the future. So
83
00:11:21,709 --> 00:11:30,889
it's a perfect example for internet of
shit. But there is another internet of
84
00:11:30,889 --> 00:11:37,970
shit, which is a Twitter account that
collects all these funny little stories.
85
00:11:37,970 --> 00:11:44,920
It's not from me, so I'm not behind that.
I tried to reach the person but I never
86
00:11:44,920 --> 00:11:50,730
got a reply, so I can't tell you anything
about them, but they collect examples - if
87
00:11:50,730 --> 00:11:55,579
you don't follow them yet and are
interested in this topic, you might do so
88
00:11:55,579 --> 00:12:05,410
after this talk - so after presenting a
couple of IoT examples with the good and a
89
00:12:05,410 --> 00:12:13,089
bit of the bad sides I first wanna focus a
little bit on the problem because as I
90
00:12:13,089 --> 00:12:20,149
said before you might now think that
everything is nice, comfortable, why
91
00:12:20,149 --> 00:12:26,690
shouldn't we do that and stuff like that.
So the problem is that most of the vendors
92
00:12:26,690 --> 00:12:33,730
that are doing IoT stuff now, that start
to connect everything, have been creating
93
00:12:33,730 --> 00:12:41,350
manually operated devices without
connectivity for many years, and they have a
94
00:12:41,350 --> 00:12:48,060
lot of knowledge in terms of materials,
ergonomics, mechanical engineering but
95
00:12:48,060 --> 00:12:58,199
almost zero in the field of IT security.
Actually, I don't say that without having
96
00:12:58,199 --> 00:13:06,959
talked to vendors that have said exactly
that when I interviewed them. Like there
97
00:13:06,959 --> 00:13:14,509
was a lightbulb vendor from Austria, a
really big vendor that has been making
98
00:13:14,509 --> 00:13:22,399
lightbulbs for years and years and years
and actually they started to make
99
00:13:22,399 --> 00:13:34,610
connected lightbulbs in 2015, and when they
did that ... I asked them: "Oh, how
100
00:13:34,610 --> 00:13:44,959
big is your IT security department?" "One
person." So they didn't actually have the
101
00:13:44,959 --> 00:13:51,579
knowledge that IT security might be more
important when they connect - when they
102
00:13:51,579 --> 00:14:00,079
start to connect things. And actually the
result is that these vendors are making
103
00:14:00,079 --> 00:14:05,519
the same sort of security errors that the
high-tech industry was dealing with 15
104
00:14:05,519 --> 00:14:14,269
years ago. So the early 2000s called and
want their web security, their lack of
105
00:14:14,269 --> 00:14:23,700
security back. So there are all kinds of
problems we already know from the past:
106
00:14:23,700 --> 00:14:28,709
hardcoded passwords, insecure Bluetooth
connections, permanent cloud server
107
00:14:28,709 --> 00:14:38,920
connections, and a lot of other stuff. So
among all these 20
108
00:14:38,920 --> 00:14:45,709
billion devices out there, there will be a
lot of insecure devices, and the problem is
109
00:14:45,709 --> 00:14:53,410
that they can be collected into a botnet
and start DDoS attacks, and we are going
110
00:14:53,410 --> 00:15:02,579
to have internet outages. For those who
are not familiar with the terms I made a
111
00:15:02,579 --> 00:15:07,550
really, really, really short explanation so
that you also understand what I am
112
00:15:07,550 --> 00:15:14,709
talking about. A botnet is a network of
private computers infected with malicious
113
00:15:14,709 --> 00:15:21,749
software and controlled as a group without
the owners' knowledge. Like the example of
114
00:15:21,749 --> 00:15:29,060
the refrigerator that was sending out spam
I told you about earlier. This
115
00:15:29,060 --> 00:15:35,870
refrigerator sent out ... one refrigerator
was sending out 750,000 spam mails, by the
116
00:15:35,870 --> 00:15:43,029
way. So the botnet, that has a botnet
owner of course, because it's not only a
117
00:15:43,029 --> 00:15:50,430
zombie botnet, and the botnet owner can
control this network of infected computers
118
00:15:50,430 --> 00:15:57,611
by issuing commands to perform malicious
activities like DDoS attacks. So DDoS is a
119
00:15:57,611 --> 00:16:04,300
distributed denial of service attack, and
actually that's an attempt to stop
120
00:16:04,300 --> 00:16:10,459
legitimate users from accessing the data
normally available on a website. And this
121
00:16:10,459 --> 00:16:19,590
actually can lead to a complete shutdown
of a service. And we had this already, so
122
00:16:19,590 --> 00:16:30,070
I'm not talking about something in the far
future but we had this in 2016 and most
123
00:16:30,070 --> 00:16:37,639
people already noticed it, but they didn't
realize why - their Twitter accounts
124
00:16:37,639 --> 00:16:43,750
did not work, they couldn't use Reddit, or
Spotify, or they couldn't pay with PayPal
125
00:16:43,750 --> 00:16:52,850
at the moment. And behind that attack was
Mirai, so several other major services were
126
00:16:52,850 --> 00:17:03,230
offline because an infrastructure provider
was attacked by zombie IoT devices. And
127
00:17:03,230 --> 00:17:11,579
this was one year ago and now one year
later Mirai botnet infections are still
128
00:17:11,579 --> 00:17:21,400
widespread, so not every zombie device has
been secured yet; there are still some
129
00:17:21,400 --> 00:17:26,829
around, and not so few. And actually
there is a study saying that every
130
00:17:26,829 --> 00:17:35,800
unsecured - no, every botnet infection
that's there - every security hole that's
131
00:17:35,800 --> 00:17:42,910
there stays there for at least 7
years, which means that all the insecure
132
00:17:42,910 --> 00:17:50,890
devices which are out now could get
infected and could stay infected for 7
133
00:17:50,890 --> 00:17:56,680
years. So that's why it's very important
that we are going to do something really
134
00:17:56,680 --> 00:18:10,170
quickly and not starting like in 2020. So
Mirai was supposed to continue in 2017 and
135
00:18:10,170 --> 00:18:20,220
actually a lot of DDoS attacks similar
to Mirai happened in 2017. This
136
00:18:20,220 --> 00:18:29,870
as an example could be unleashed at any
moment - that was in November - and a few days later
137
00:18:29,870 --> 00:18:41,650
exactly this attack was unleashed, so it
happened. In 2017 we also had a huge
138
00:18:41,650 --> 00:18:54,400
increase in DDoS attacks - a 91% increase from
Q1 - and it's going to increase more. I have
139
00:18:54,400 --> 00:19:09,290
to take a short sip, sorry.
Now we're coming back to examples. One
140
00:19:09,290 --> 00:19:15,720
really good example is the university that
was attacked by its own vending machines
141
00:19:15,720 --> 00:19:26,250
and smart lightbulbs and 5000 other IoT
devices. This was very very difficult to
142
00:19:26,250 --> 00:19:31,740
get fixed because they couldn't take the
university network down, so they had to
143
00:19:31,740 --> 00:19:38,260
find a really difficult solution to get it
back up. And actually how did they even
144
00:19:38,260 --> 00:19:42,650
notice it? Because the students
complained that the internet was going so
145
00:19:42,650 --> 00:19:53,240
slow. Another example which has nothing to
do with DDoS attacks anymore but with IoT
146
00:19:53,240 --> 00:20:03,480
sensors. Actually, in a fishtank in a
North American casino
147
00:20:03,480 --> 00:20:12,140
there were sensors measuring the
temperature of the aquarium and the
148
00:20:12,140 --> 00:20:18,900
fishtank - so that the fish didn't die -
and these sensors were sending the data to
149
00:20:18,900 --> 00:20:28,500
a PC of this casino, and this PC was
using the same network as the
150
00:20:28,500 --> 00:20:37,870
sensors, so actually the cybercriminals
could access this data of the casino
151
00:20:37,870 --> 00:20:43,210
and were stealing it and sending it to
their own servers in Finland. And the
152
00:20:43,210 --> 00:20:56,500
amount was about 10 GB of data. Another
example which is actually one of my most -
153
00:20:56,500 --> 00:21:03,490
I don't know why, but it's the example I
personally like most of all the examples
154
00:21:03,490 --> 00:21:11,190
that I collected in 2017. So there was a
surveillance camera bought by a
155
00:21:11,190 --> 00:21:22,060
Dutch woman. Actually she wanted to
surveil her dog when she was out at work,
156
00:21:22,060 --> 00:21:29,840
but what did this camera do? It did
surveil the dog when she was out at work,
157
00:21:29,840 --> 00:21:37,260
but when she was at home the camera
followed her through the room and was
158
00:21:37,260 --> 00:21:44,410
watching her all over the place. And it
had a microphone integrated and one day it
159
00:21:44,410 --> 00:21:51,680
started to talk with her and it said "hola
señorita". And this woman was so
160
00:21:51,680 --> 00:21:59,890
frightened that she actually started to
record that because she thought that
161
00:21:59,890 --> 00:22:08,290
nobody would believe this story; everyone would
think she's crazy. But this camera actually did not
162
00:22:08,290 --> 00:22:15,580
surveil the dog but was hacked and
surveilled her. And it was a very cheap
163
00:22:15,580 --> 00:22:21,870
camera by the way. She bought it in a
supermarket but we don't know the name of
164
00:22:21,870 --> 00:22:29,330
the vendor in this case. So coming from a
very cheap camera to a very high-tech
165
00:22:29,330 --> 00:22:40,140
camera: the camera you see here is one
that is actually built into a lot of
166
00:22:40,140 --> 00:22:48,180
companies, and there was a security hole
found by Viennese security specialists
167
00:22:48,180 --> 00:22:53,240
from SEC Consult, and actually they
demonstrated to me how they could actually
168
00:22:53,240 --> 00:23:03,450
hack into this camera and how they could
make it possible that this camera shows
169
00:23:03,450 --> 00:23:13,240
pictures of an empty room in a bank so the
pictures from the empty room in the bank
170
00:23:13,240 --> 00:23:20,240
were shown to me and in reality the bank
was robbed - ok, not in reality. But it
171
00:23:20,240 --> 00:23:29,210
could have been robbed. So that actually
sounds a little bit like a movie scene,
172
00:23:29,210 --> 00:23:37,530
and actually this camera which is sold as
a security camera is kind of useless when
173
00:23:37,530 --> 00:23:42,840
it doesn't have security and it doesn't
really show the picture. And the problem
174
00:23:42,840 --> 00:23:53,970
with this camera was hardcoded passwords.
And the hardcoded password got fixed -
175
00:23:53,970 --> 00:24:02,690
it was a responsible disclosure process -
and this camera is safe now. So I'm coming
176
00:24:02,690 --> 00:24:11,800
to a different example now. And this now
finally explains why this toy is sitting
177
00:24:11,800 --> 00:24:19,670
here. Before my talk everybody was telling
me "Ah, you brought your favorite toy, to
178
00:24:19,670 --> 00:24:26,140
protect you during your talk." And I was
laughing: "Oh no. No no no no, it's one of
179
00:24:26,140 --> 00:24:36,570
the most insecure devices out there." But
before we come to this one in particular, I'm
180
00:24:36,570 --> 00:24:46,790
going to talk a little bit about connected
toys. So the German Stiftung Warentest
181
00:24:46,790 --> 00:24:54,650
made a study regarding connected toys.
They tested them, and actually
182
00:24:54,650 --> 00:25:04,820
all of the tested bears, robot dogs and
dolls were very, very insecure, and some of
183
00:25:04,820 --> 00:25:12,779
them were even rated extremely
critical, and others critical. And
184
00:25:12,779 --> 00:25:22,370
actually what was the problem with the
toys and also with this? They were using -
185
00:25:22,370 --> 00:25:28,210
they are using Bluetooth connections. And
these Bluetooth connections are not
186
00:25:28,210 --> 00:25:34,360
secured by a password or a PIN code. So
every smartphone user close enough could
187
00:25:34,360 --> 00:25:42,630
connect to the toy and listen to children
or ask questions or threaten them and
188
00:25:42,630 --> 00:25:49,670
another problem is the data-collecting
apps related to this stuff. So actually
189
00:25:49,670 --> 00:25:58,640
this little unicorn has an app where you
can send messages. So what does this
190
00:25:58,640 --> 00:26:07,790
actually do? It can play messages, and
- as a child - you can record messages and
191
00:26:07,790 --> 00:26:17,460
send them to your mom or your dad. And when
a message is waiting for you - the heart
192
00:26:17,460 --> 00:26:24,690
blinks. So actually there's a message
waiting for you now. And I'm not sure if
193
00:26:24,690 --> 00:26:32,710
it's the same one that I recorded
earlier. Maybe now it is, maybe at the end
194
00:26:32,710 --> 00:26:42,730
of the talk when I press the button again
it might not be. And so everybody can - so
195
00:26:42,730 --> 00:26:49,840
this - err, sorry - this device does have
an app where you can send messages to.
196
00:26:49,840 --> 00:26:55,730
And it also has a children's interface, and
when you are using the children's interface
197
00:26:55,730 --> 00:27:02,660
you see that there are ads
integrated. And in the children's
198
00:27:02,660 --> 00:27:13,230
interface there were ads for porn and
ehm... ...other stuff, which are not
199
00:27:13,230 --> 00:27:20,320
really suitable for a child. And
this is also what Stiftung Warentest has
200
00:27:20,320 --> 00:27:31,140
actually - yeah, has actually found out.
The data is also sent to third-
201
00:27:31,140 --> 00:27:35,700
party companies, and they put in trackers to
monitor the online behavior of the
202
00:27:35,700 --> 00:27:42,700
parents. This is also done with this
device. So Stiftung Warentest advises that
203
00:27:42,700 --> 00:27:51,290
a non-connectable dumb teddy might be the
smarter choice in the future. And before I
204
00:27:51,290 --> 00:27:56,530
finally press this button - you're
probably curious now - but first I'm going
205
00:27:56,530 --> 00:28:07,420
to talk a little bit about Cayla. You
probably have heard of Cayla as a very
206
00:28:07,420 --> 00:28:14,880
insecure doll. Actually it got banned
in Germany by law. It is classified as a
207
00:28:14,880 --> 00:28:22,080
prohibited transmitting device. And
parents who do not destroy it will be
208
00:28:22,080 --> 00:28:28,710
actually fined. And I tried to buy Cayla
in Austria and didn't get the doll. So
209
00:28:28,710 --> 00:28:35,050
actually it should be really off the
market in the German speaking area. And
210
00:28:35,050 --> 00:28:43,500
actually that is also a result of a
campaign from Norway called Toyfail, which
211
00:28:43,500 --> 00:28:49,800
is a Norwegian consumer organization who
are actually - this is Cayla. You can see
212
00:28:49,800 --> 00:29:00,110
her now. They actually went to the
European Parliament to make them
213
00:29:00,110 --> 00:29:07,830
understand how insecure toys do a
lot of harm and how we should put more
214
00:29:07,830 --> 00:29:17,130
security into toys. And I've brought a
short little video and I hope we can hear
215
00:29:17,130 --> 00:29:27,810
the audio here as well. We will see.
No. You don't hear anything.
216
00:29:27,810 --> 00:29:31,660
But this doesn't matter because they
have...
217
00:29:31,660 --> 00:29:35,960
Sign Language Interpreter: subtitles
Barbara: subtitles.
218
00:29:35,960 --> 00:29:40,530
Person (Video): There's not added any kind
of security. With simple steps I can talk
219
00:29:40,530 --> 00:29:44,990
through the doll and listen to other
people.
220
00:29:44,990 --> 00:29:47,740
Person through doll (Video): No one wants
others to speak directly through the doll.
221
00:29:47,740 --> 00:29:56,790
Barbara: He's speaking now at the moment.
Doll: inaudible
222
00:29:56,790 --> 00:30:38,900
Person: And you may think... [see video
subs] ... Cayla, can I trust you?
223
00:30:38,900 --> 00:30:44,010
Doll: I don't know.
laughter
224
00:30:44,010 --> 00:30:58,150
applause
Barbara: Yeah and we don't trust Cayla and
225
00:30:58,150 --> 00:31:07,910
we also don't trust our little unicorn.
button clicking
226
00:31:07,910 --> 00:31:25,040
laughter
crying baby in background
227
00:31:25,040 --> 00:31:34,810
Barbara: Ok, somebody has hacked it.
laughter
228
00:31:34,810 --> 00:31:42,920
Yes.
Unicorn Toy: Hello, Chaos Communication
229
00:31:42,920 --> 00:31:48,000
Congress.
Barbara: Ok, that's what I recorded
230
00:31:48,000 --> 00:31:57,140
earlier. But there is some time left.
Maybe, maybe... but you're all sitting too
231
00:31:57,140 --> 00:32:04,120
far away actually, and none of you brought
your computer, so... but we will see, we
232
00:32:04,120 --> 00:32:10,040
will try it later on. So but actually you
shouldn't trust this unicorn, because this
233
00:32:10,040 --> 00:32:22,360
unicorn is from the company called
CloudPets, which is a - no, sorry, it's a
234
00:32:22,360 --> 00:32:29,680
toy called CloudPets and the company is
Spiral Toys from the US. So this is
235
00:32:29,680 --> 00:32:39,110
a CloudPets toy, and there are cats and dogs
and unicorns, and it's very ugly, but it's a
236
00:32:39,110 --> 00:32:48,640
unicorn. And actually now I'm already
talking a lot about this. Why I'm
237
00:32:48,640 --> 00:32:57,550
explaining this to you now: there already was a
data breach with this toy, so the
238
00:32:57,550 --> 00:33:05,610
children's messages in the CloudPets data
were actually stolen and were public on the
239
00:33:05,610 --> 00:33:13,740
internet. 2 million voice messages
recorded on the cuddly toys have been
240
00:33:13,740 --> 00:33:25,060
discovered freely accessible on the internet. And
actually Spiral Toys says that there was no
241
00:33:25,060 --> 00:33:33,631
data breach, but the data was there, so...
That's also why I brought this: it was
242
00:33:33,631 --> 00:33:40,360
still very easily available. And actually,
as I said before, the app's children's
243
00:33:40,360 --> 00:33:51,250
interface shows porn ads, so I would not
recommend that for your child. Actually
244
00:33:51,250 --> 00:33:55,600
there are already a lot of institutions
out there which are warning about connected
245
00:33:55,600 --> 00:34:03,490
toys: the consumer group Which?, which
actually did a study about this and other
246
00:34:03,490 --> 00:34:10,000
toys - they also analyzed the Furby
Connect - the German Stiftung Warentest,
247
00:34:10,000 --> 00:34:13,949
the Austrian Verein für
Konsumenteninformation, the Norwegian
248
00:34:13,949 --> 00:34:22,429
Consumer Council, and the FBI. The list
goes on. So consider whether you really
249
00:34:22,429 --> 00:34:31,480
need a connected toy for your child or
yourself because the next section is about
250
00:34:31,480 --> 00:34:37,979
sex toys.
laughter
251
00:34:37,979 --> 00:34:49,900
applause
squeaky horn
252
00:34:49,900 --> 00:34:57,170
more laughter and applause
I am not... It's not necessary to say a lot
253
00:34:57,170 --> 00:35:04,330
about this example. It's actually a
connected vibrator that has a built-in
254
00:35:04,330 --> 00:35:18,870
camera and this camera is very very very
unsafe. Also this toy is really expensive,
255
00:35:18,870 --> 00:35:24,670
so you can't say "Eh, it's only the cheap
stuff that is so insecure." Also the high-
256
00:35:24,670 --> 00:35:32,480
tech stuff can be really insecure. I mean,
this vibrator costs 250 dollars, so it's
257
00:35:32,480 --> 00:35:42,610
very expensive, and it has a built-in web-
connected endoscope, and they found out
258
00:35:42,610 --> 00:35:55,640
that it's massively insecure. The password
of this... and if you forget to change it,
259
00:35:55,640 --> 00:36:01,740
a few more viewers than expected
might be watching your newest video about
260
00:36:01,740 --> 00:36:09,950
your private sex adventures. There was
another example actually in this - sorry
261
00:36:09,950 --> 00:36:14,640
go back one more time to this example -
there's a very funny video on it on
262
00:36:14,640 --> 00:36:20,490
YouTube about it; maybe you wanna watch
it. I didn't bring it because I couldn't
263
00:36:20,490 --> 00:36:31,600
reach the makers of it. So I'm going to
the next example which is about a case of
264
00:36:31,600 --> 00:36:39,040
a sex toy company that actually admitted to
recording users' remote sex sessions and it
265
00:36:39,040 --> 00:36:48,110
called it a "minor bug". It was this Lovense
Remote app, you can see the icon
266
00:36:48,110 --> 00:36:56,050
here and actually this is a vibrator and
an app and the vibrator controlling app
267
00:36:56,050 --> 00:37:03,080
was recording all the sex sounds, all the
sounds you're making when you're using
268
00:37:03,080 --> 00:37:09,610
this vibrator, and storing them on the phone
without your knowledge. And the company
269
00:37:09,610 --> 00:37:15,600
says that no information or data was sent
to the servers so this audio file exists
270
00:37:15,600 --> 00:37:21,570
only temporarily and only on your device. And
they already released an update, so actually
271
00:37:21,570 --> 00:37:28,280
this is not as funny as the other story
but still it's an example of how unsecure
272
00:37:28,280 --> 00:37:38,450
sex stuff can be. So there are a lot
more sex toy examples out there. One you
273
00:37:38,450 --> 00:37:45,780
should actually definitely search for
after - please don't search for now, but
274
00:37:45,780 --> 00:37:55,250
after this talk. You could google or
DuckDuckGo or whatever you use, the terms
275
00:37:55,250 --> 00:38:04,280
"blowjob injection". And please add
"security" because otherwise you will land
276
00:38:04,280 --> 00:38:07,920
on other sites.
laughter
277
00:38:07,920 --> 00:38:18,360
And this was a female security expert who
was doing this research about a device
278
00:38:18,360 --> 00:38:24,760
which was actually supposed to work so your
girlfriend could make you a special
279
00:38:24,760 --> 00:38:31,050
blowjob program, your special blowjob and
this could be hacked so somebody else's
280
00:38:31,050 --> 00:38:39,120
blowjob might appear instead of your own.
laughter
281
00:38:39,120 --> 00:38:47,520
So there's also a story about a map of
buttplugs in Berlin that are insecure.
282
00:38:47,520 --> 00:38:56,460
Also if you're interested in that please
also search for that story. Because it's
283
00:38:56,460 --> 00:39:01,450
funny to talk about this, but I also wanna
talk a little bit about things that we could
284
00:39:01,450 --> 00:39:08,890
actually do. And one of the projects in
this area is actually doing something
285
00:39:08,890 --> 00:39:14,480
that's called the "Internet of Dongs
Project - hacking sex toys for security
286
00:39:14,480 --> 00:39:22,190
and privacy". And as you can see it's
supported by PornHub, which in this case
287
00:39:22,190 --> 00:39:29,030
means that they get money from PornHub
so that they can buy sex toys for their
288
00:39:29,030 --> 00:39:41,680
research. So PornHub is sponsoring them.
Actually, I did talk to the guy who is
289
00:39:41,680 --> 00:39:49,510
behind this project. He's called
RenderMan. That's a render of him and this
290
00:39:49,510 --> 00:39:57,210
is the website by the way. So he told me
he's currently - they're currently a team
291
00:39:57,210 --> 00:40:05,600
of about 15-20 people out there that are
doing their security research in their own
292
00:40:05,600 --> 00:40:10,980
spare time. And they are not getting any
money for it and they also don't want to
293
00:40:10,980 --> 00:40:17,670
get any money but they are already looking
for more security experts that wanna join
294
00:40:17,670 --> 00:40:24,440
the team. They also have an ethical
code and stuff like that, and
295
00:40:24,440 --> 00:40:32,180
actually one of the most important things
that he was telling me is that he doesn't
296
00:40:32,180 --> 00:40:41,110
want people to stay away from connected
sex toys altogether, but to find the
297
00:40:41,110 --> 00:40:54,760
security holes so that we can all use them
if we want, without any fear. So yeah, you can
298
00:40:54,760 --> 00:41:02,710
get in contact with him if you're
interested. Coming to a different section
299
00:41:02,710 --> 00:41:14,110
now. You can see I'm switching from
security to security and privacy and now
300
00:41:14,110 --> 00:41:23,900
I've landed on the privacy section. This is
Google Home. And we all know that there is
301
00:41:23,900 --> 00:41:32,869
also Amazon Echo and digital assistants
are also smart IoT devices and this is why
302
00:41:32,869 --> 00:41:38,810
I wanna talk very briefly about
them because I'm sure a lot of people got
303
00:41:38,810 --> 00:41:46,290
those devices for Christmas. Actually
there was a big increase in sales of digital
304
00:41:46,290 --> 00:41:56,630
assistants in the last year: in
quarter 3 of 2016 there were only 900,000
305
00:41:56,630 --> 00:42:11,040
such devices sold, and in quarter 3 of
2017 we had more than 7.4 million of those
306
00:42:11,040 --> 00:42:17,180
devices sold. So there's a huge increase
and we don't even have the numbers of the
307
00:42:17,180 --> 00:42:29,110
Christmas time. Yeah, you have seen it. So
why do I wanna talk about it? Because when
308
00:42:29,110 --> 00:42:36,510
you put this kind of stuff in your home it
might be very comfortable at the beginning
309
00:42:36,510 --> 00:42:41,520
because you don't have to look up the
weather information, you don't have to
310
00:42:41,520 --> 00:42:47,250
read your emails yourself - you can make the
device read your emails to you, you can use
311
00:42:47,250 --> 00:42:55,880
them to manage the list of what you're
going to buy, and stuff like that. But
312
00:42:55,880 --> 00:43:02,380
that's how they learn a lot about the
users' habits and their personalities, and
313
00:43:02,380 --> 00:43:07,480
those devices will learn more and more
information about you and this information
314
00:43:07,480 --> 00:43:16,350
does not stay in your own home; it is
actually sent to the servers of Amazon
315
00:43:16,350 --> 00:43:22,720
and Google, and I don't need to tell you
what Amazon and Google are doing with this
316
00:43:22,720 --> 00:43:31,170
data. At least currently they are
only collecting it, but that's very
317
00:43:31,170 --> 00:43:39,760
valuable, and they may turn around and use
it or sell it in various ways to monetize
318
00:43:39,760 --> 00:43:48,760
that information one of these days.
So all digital assistants send the
319
00:43:48,760 --> 00:43:54,440
voice commands that are made after "Ok,
Google" or "Alexa" to their servers and
320
00:43:54,440 --> 00:44:00,850
the data will be saved there and it was
not possible for me to find out for how
321
00:44:00,850 --> 00:44:07,460
long and at which servers. It's not in
their terms and conditions and I couldn't
322
00:44:07,460 --> 00:44:15,600
find it anywhere. Also the German data
protection commissioner Andrea Voßhoff didn't
323
00:44:15,600 --> 00:44:21,580
find this information. She criticized that
"It is not easy for users to understand
324
00:44:21,580 --> 00:44:28,340
how, to what extent and where the
information collected is processed. Also,
325
00:44:28,340 --> 00:44:37,300
it is not clear how long the data will be
stored." So if you still want those
326
00:44:37,300 --> 00:44:45,369
devices in your home now, there is at
least a physical mute button on Google
327
00:44:45,369 --> 00:44:52,150
Home and Amazon Echo, and you can also
change the settings to control the data so
328
00:44:52,150 --> 00:45:00,400
all the data that is collected is regularly
deleted from the servers. But of course you
329
00:45:00,400 --> 00:45:08,490
never know in how many backups it's
stored as well. So yes, it's only
330
00:45:08,490 --> 00:45:22,480
recording after this voice command, but
both devices already got hacked:
331
00:45:22,480 --> 00:45:32,370
Amazon Echo got hacked in 2016 and
Google Home Mini got hacked in 2017. Of course
332
00:45:32,370 --> 00:45:39,610
both problems got fixed, and when I say got
hacked it means that the devices in your
333
00:45:39,610 --> 00:45:54,000
home were listening to the conversations
all the time. So I'm coming -
334
00:45:54,000 --> 00:46:01,110
unfortunately the funny examples are over.
I'm coming to the part where I wanna speak
335
00:46:01,110 --> 00:46:09,960
about what we can do against the lack of
security and lack of privacy with the
336
00:46:09,960 --> 00:46:18,560
internet of things. So currently we have a
status quo with an
337
00:46:18,560 --> 00:46:23,510
information asymmetry between the vendor
and the customer. Currently the
338
00:46:23,510 --> 00:46:29,100
manufacturers do not need to provide
any information about the security of
339
00:46:29,100 --> 00:46:36,900
a device, such as how long it will receive
security updates. So when we buy a device
340
00:46:36,900 --> 00:46:52,150
we never know... oh is it going to be safe
or not. So what we need ... actually what
341
00:46:52,150 --> 00:47:00,300
we need. I wrote down a couple of
things here which
342
00:47:00,300 --> 00:47:10,410
are partly stolen from the Green MEP Jan
Philipp Albrecht from his program because
343
00:47:10,410 --> 00:47:18,300
he's dealing a lot in his work with the
question of what we can do, and
344
00:47:18,300 --> 00:47:27,590
I also stole some of
those suggestions from RenderMan from
345
00:47:27,590 --> 00:47:34,520
the Internet of Dongs project, he also had
some helpful tips. And I also stole some
346
00:47:34,520 --> 00:47:40,000
of the information from security experts I
talk to in interviews all the time
347
00:47:40,000 --> 00:47:45,080
because we never talk only about the bad
things - we all want to make the
348
00:47:45,080 --> 00:47:52,690
internet of things safer at the end. So
some of them suggested that we could need
349
00:47:52,690 --> 00:48:01,070
a security star rating system similar to
energy labeling. And when we talk
350
00:48:01,070 --> 00:48:13,130
about security star ratings that could
mean that we use a label. When a device
351
00:48:13,130 --> 00:48:19,551
gets security updates for free for the
next five years it gets the A++ label, if
352
00:48:19,551 --> 00:48:24,900
there are no updates at all and it stays
insecure it gets the worst rating, or
353
00:48:24,900 --> 00:48:32,330
such things. Actually vendors should also
be forced to close security holes instead
354
00:48:32,330 --> 00:48:39,620
of ignoring them. And they should provide
the security researchers with email
355
00:48:39,620 --> 00:48:45,850
addresses where we can easily report
security flaws because sometimes the
356
00:48:45,850 --> 00:48:52,330
hardest part of the game is to actually
find the right contact to send out the
357
00:48:52,330 --> 00:49:01,450
information about what is insecure and
what's not. What we also need is a
358
00:49:01,450 --> 00:49:09,480
mandatory offline mode for electronic
devices so this device at least has a
359
00:49:09,480 --> 00:49:19,710
button where you can turn it off, so it
doesn't listen to you permanently. And we
360
00:49:19,710 --> 00:49:28,090
need that for all devices - all connected
devices. Also an airbag and seatbelt for
361
00:49:28,090 --> 00:49:35,160
the digital age and we also have to talk
about product liability and a clear update
362
00:49:35,160 --> 00:49:46,090
policy. So there are also good examples
that we have now. Actually everything
363
00:49:46,090 --> 00:49:54,920
I was talking about here is regulation.
Regulation that does not exist at the
364
00:49:54,920 --> 00:50:05,080
moment. But there is some existing
regulation in the area of data, which is
365
00:50:05,080 --> 00:50:12,870
the GDPR the General Data Protection
Regulation which is coming up in May 2018
366
00:50:12,870 --> 00:50:20,170
and it includes some really, really
helpful things: privacy by design
367
00:50:20,170 --> 00:50:27,750
and privacy by default. And more
possibilities for law enforcement. And
368
00:50:27,750 --> 00:50:36,090
this is very, very important, because it
doesn't follow that, because we are going to
369
00:50:36,090 --> 00:50:43,330
have a regulation about privacy by design
and privacy by default, this is really done
370
00:50:43,330 --> 00:50:47,800
by the vendors. Actually when I was
interviewing some of them they already
371
00:50:47,800 --> 00:50:55,270
told me that it's not their plan to
integrate that in their products; they are
372
00:50:55,270 --> 00:51:03,820
going to wait until they are sued. They
say "Oh, we don't need it, why should we
373
00:51:03,820 --> 00:51:16,090
do it? It worked until now - nope." So that's
why law enforcement comes into play and
374
00:51:16,090 --> 00:51:21,430
maybe some of you know Max Schrems, he's
also speaking here in two days about
375
00:51:21,430 --> 00:51:28,490
something else though, and he is a data
protection activist. And he says that
376
00:51:28,490 --> 00:51:33,780
everything that is possible will be done in
this phase we are in now, but if vendors won't
377
00:51:33,780 --> 00:51:44,601
observe the law we have to remind them to
do it. So this is what he looks like and he
378
00:51:44,601 --> 00:51:51,770
says that with this new regulation we can,
as customers, ask for compensation when
379
00:51:51,770 --> 00:51:57,790
data breaches occur. We can't do that
so easily now, but with this new regulation
380
00:51:57,790 --> 00:52:05,160
it will get a lot easier. And if 4
billion people sue a company and ask for
381
00:52:05,160 --> 00:52:16,160
compensation that could be a bit expensive
at the end. So if you are not able to sue
382
00:52:16,160 --> 00:52:24,590
anybody yourself - which is not cheap, so
not everybody will sue
383
00:52:24,590 --> 00:52:32,140
companies - you can support organizations
that help you with that, like the new
384
00:52:32,140 --> 00:52:39,150
organization from Max Schrems called "None
of Your Business" maybe you have seen this
385
00:52:39,150 --> 00:52:45,980
already. I'm not saying that you should
especially support this
386
00:52:45,980 --> 00:52:52,020
organization but his plan is to actually
do that stuff I explained earlier: sue
387
00:52:52,020 --> 00:52:59,270
companies that are not abiding by the law.
So if you wanna visit the website, they are
388
00:52:59,270 --> 00:53:13,350
currently collecting money. What else can
consumers do? There are no easy tips - we
389
00:53:13,350 --> 00:53:20,280
can't do much except a few easy things.
Does this product really need an internet
390
00:53:20,280 --> 00:53:28,000
connection? Is it possible to turn it off?
Is it still working after that? What do we
391
00:53:28,000 --> 00:53:36,590
find about it on the internet? Can we
reach the vendor? Does the vendor reply
392
00:53:36,590 --> 00:53:45,030
when I have a question? Do we get more
information? Sometimes also clicktivism
393
00:53:45,030 --> 00:53:53,179
helps to stop vendors making stupid
decisions. Here is another example from
394
00:53:53,179 --> 00:54:00,010
the robot vacuum cleaner Roomba, whose
maker wanted to sell the data that is
395
00:54:00,010 --> 00:54:08,350
collected from the home by the vacuum
cleaner, and actually there was a huge
396
00:54:08,350 --> 00:54:14,080
shitstorm after the CEO announced
that.
397
00:54:14,080 --> 00:54:20,270
And after the shitstorm the CEO said "Ok,
no nono. We're not collecting. We're not
398
00:54:20,270 --> 00:54:28,490
selling your data. No no." So sometimes
this helps as well and of course follow
399
00:54:28,490 --> 00:54:35,940
the basics in IT security: please update
everything that has updates, separate
400
00:54:35,940 --> 00:54:45,270
your networks from IoT products and use
strong passwords. Support open hardware, open
401
00:54:45,270 --> 00:54:50,890
software; a product where the data is
stored locally is always better than in
402
00:54:50,890 --> 00:54:58,050
the cloud. And if you're tech-savvy enough
- which I think you are here - start
403
00:54:58,050 --> 00:55:09,110
building your own tools. Because you have
the control. And what can developers do?
404
00:55:09,110 --> 00:55:14,710
Support privacy by design, security by
design, think about it from the beginning
405
00:55:14,710 --> 00:55:22,150
because you can change it and take
responsibility. And IT security people can also
406
00:55:22,150 --> 00:55:30,010
do some stuff or continue to do some
stuff. Point the vendor to the problems,
407
00:55:30,010 --> 00:55:36,240
help make IT security stronger, keep
reporting the flaws, publish your
408
00:55:36,240 --> 00:55:43,270
research, help develop standards, labels
and seat belts, and support each other's
409
00:55:43,270 --> 00:55:52,100
work to get a stronger voice about this. So
I'm coming to the end of my talk now and
410
00:55:52,100 --> 00:55:57,920
back to the topic of the internet of
fails: How many must be killed in the
411
00:55:57,920 --> 00:56:04,730
Internet of Deadly Things train wrecks?
This is actually an article I was reading
412
00:56:04,730 --> 00:56:12,750
with huge interest myself, because it was
making comparisons
413
00:56:12,750 --> 00:56:17,550
to the great age of railway construction
that was likewise riddled with decades of
414
00:56:17,550 --> 00:56:25,820
disasters before the introduction of
effective signaling and failsafe brakes.
415
00:56:25,820 --> 00:56:30,140
And it was also compared with the
automotive industry, where the mandatory
416
00:56:30,140 --> 00:56:36,650
fitting of seatbelts, designing the bodies
of cars to reduce injury to pedestrians,
417
00:56:36,650 --> 00:56:42,330
airbags and measures to reduce air
pollution were not introduced early
418
00:56:42,330 --> 00:56:51,369
enough. So this guy was asked: Do we
really need to kill a few people first?
419
00:56:51,369 --> 00:56:58,400
And he said: Unfortunately that will happen.
So he says: Safety and security standards
420
00:56:58,400 --> 00:57:06,349
for the internet of things can't come soon
enough. I agree with that - we
421
00:57:06,349 --> 00:57:15,960
need standards really soon. So I am at the
end of my talk and if we have some time
422
00:57:15,960 --> 00:57:22,210
left I'm waiting for your questions,
ideas, and input now. Otherwise I will
423
00:57:22,210 --> 00:57:25,370
thank you very much for your attention.
424
00:57:25,370 --> 00:57:28,370
applause
425
00:57:28,370 --> 00:57:33,890
Herald: Thank you Barbara. A very warm
applause.
426
00:57:33,890 --> 00:57:37,630
So a small announcement: If you want to
exit the room please exit the room to your
427
00:57:37,630 --> 00:57:47,770
left over there. So, questions?
I see one question from the Signal Angel.
428
00:57:47,770 --> 00:57:54,040
Q: Hello, ok. The internet wants to know,
well those companies don't have any IoT
429
00:57:54,040 --> 00:58:03,370
security whatsoever or basically none, so
what can we do to make them have more?
430
00:58:03,370 --> 00:58:07,710
B: "We" as who - as consumers?
Q: Yeah, basically.
431
00:58:07,710 --> 00:58:15,220
B: Yeah, actually I would - what I said
was I would write them and ask for
432
00:58:15,220 --> 00:58:25,720
standards. I think the first step can be
that we write emails or
433
00:58:25,720 --> 00:58:32,851
call them and say "Well, what kind of
security is built into this device, can you
434
00:58:32,851 --> 00:58:40,139
tell me? Otherwise I won't buy your
product."
435
00:58:40,139 --> 00:58:50,270
Herald: Thank you. Any other question? Ok,
in this case again: Thank you Barbara for
436
00:58:50,270 --> 00:58:53,250
your nice talk.
applause
437
00:58:53,250 --> 00:58:59,774
A very warm round of applause. Thanks.
438
00:58:59,774 --> 00:59:05,287
34c3 outro
439
00:59:05,287 --> 00:59:20,741
subtitles created by c3subtitles.de
in the year 2018. Join, and help us!