
34C3 - Internet of Fails

  • 0:00 - 0:15
    34c3 intro
  • 0:15 - 0:24
    Herald: So, to our next talk... Sit and
    relax, you know what that means. Glass of
  • 0:24 - 0:31
    wine or mate, your favorite easy chair,
    and of course your latest WiFi-enabled toy
  • 0:31 - 0:36
    compromising your intimate moments.
    Barbara Wimmer, a freelance author and
  • 0:36 - 0:41
    journalist, will tell you more about the
    Internet of Fails,
  • 0:41 - 0:47
    will tell you more about where IoT goes
    wrong. She's a freelance author and journalist
  • 0:47 - 0:57
    at futurezone.at, (DORF?), and will in the
    near future release one or two public
  • 0:57 - 1:12
    stories and a book. Applause!
    applause
  • 1:12 - 1:16
    Barbara Wimmer: Hello everybody. I'm
    waiting for my slides to appear on the
  • 1:16 - 1:24
    screen. Where are my slides, please? Those
    are not my slides.
  • 1:37 - 1:49
    Oh, thank you very much. So welcome to the
    talk Internet of Fails when IoT has gone
  • 1:49 - 1:59
    wrong. This is a very negative topic title
    actually and you're getting a lot of
  • 1:59 - 2:07
    negative stories in this next hour but I
    don't want to talk only about negative
  • 2:07 - 2:14
    things so you can see "FAIL" as a "first
    attempt in learning". So actually at the
  • 2:14 - 2:19
    end of the talk I want to talk about
    solutions as well and I don't want to
  • 2:19 - 2:27
    provide only bad and negative examples
    because that's what we hear every day. And
  • 2:27 - 2:34
    this is perfect for the congress motto
    "tuwat" because this is all about let's
  • 2:34 - 2:45
    tuwat together. Most of you in this room
    will not know me, so I'm
  • 2:45 - 2:52
    going to introduce myself a little bit and
    why I'm talking to you about this topic,
  • 2:52 - 2:58
    because that's probably what everybody
    asks me when I appear somewhere and say oh
  • 2:58 - 3:07
    I give talks about IoT. Actually, I have
    worked as an IT journalist for more than 12
  • 3:07 - 3:17
    years. And I got in contact with internet
    of things in 2014 when I talked to the
  • 3:17 - 3:26
    local CERT.at team in Austria. I'm from
    Vienna. And they first told me that the
  • 3:26 - 3:32
    first refrigerator had been caught sending
    out spam mails, and that was in
  • 3:32 - 3:42
    2014 and actually that was really a funny
    story back then and we were laughing about
  • 3:42 - 3:49
    it but at the same time we already knew
    that there was something coming up which was
  • 3:49 - 4:00
    going to be a huge development, and so from
    back then on I watched the whole IoT
  • 4:00 - 4:09
    development in terms of security and
    privacy. And in the next 45 minutes you will
  • 4:09 - 4:19
    hear a lot of stuff about IoT, and where
    the problem with IoT is currently and
  • 4:19 - 4:26
    examples of fails in terms of security and
    privacy. But like I mentioned before I
  • 4:26 - 4:32
    wanna talk about solutions and when we
    talk about solutions it will not be like
  • 4:32 - 4:38
    only one side, like only the consumer,
    only the IT-security, only developers.
  • 4:38 - 4:47
    Actually what I'm not going to provide is
    detailed IT-security stuff. So if you
  • 4:47 - 4:54
    wanna focus more on any story that I'm
    talking about I'm mentioning most of the
  • 4:54 - 5:02
    sources in the slides, and if you
    really wanna know how an example came up,
  • 5:02 - 5:07
    please look it up if you're deeply
    interested in it. I'm a
  • 5:07 - 5:13
    journalist and not an IT-security person
    so please don't expect me to go into
  • 5:13 - 5:20
    details in this talk. That's why it's also
    in the ethics section of the
  • 5:20 - 5:29
    congress and not the security part. So
    coming to the internet of things I want to
  • 5:29 - 5:40
    start with a few numbers because these
    numbers show the development of IoT. In
  • 5:40 - 5:49
    2016 we had 6.3 billion devices out
    there. This year we already had 8.3
  • 5:49 - 5:59
    billion devices, and in 2020 we are
    going to have 20.4 billion
  • 5:59 - 6:05
    connected devices out there. So the
    numbers are from Gartner Institute from
  • 6:05 - 6:14
    January and I have one more slide with
    more accurate data from June this year and
  • 6:14 - 6:23
    actually this slide shows that the
    development is actually really growing.
  • 6:23 - 6:32
    17% more compared to the previous year.
    And by 2021 global IoT spending is
  • 6:32 - 6:42
    expected to reach about 1.4 trillion
    dollars. So maybe some of you are asking
  • 6:42 - 6:50
    yourself: What is the internet of things?
    Maybe some of you expected I'm only
  • 6:50 - 7:00
    talking about a smart home, because IoT is
    often related to the smart home. And we're
  • 7:00 - 7:06
    having all the smart devices that we put
    into our living rooms, but that's actually
  • 7:06 - 7:13
    not the main focus because it's more about
    the connected everything. Which means
  • 7:13 - 7:19
    toys, sex toys, home automation,
    lightbulbs, surveillance cameras,
  • 7:19 - 7:29
    thermostats, but also digital assistants
    and wearables. So I wanna start with a few
  • 7:29 - 7:38
    examples of classical internet of things
    stuff which is actually a smart coffee
  • 7:38 - 7:45
    maker. So what is smart about a coffee
    maker? It doesn't
  • 7:45 - 7:51
    get smart just because you control your
    coffee machine by app, because what's smart about
  • 7:51 - 7:58
    that? You can just press the button on the
    machine. But when you connect your coffee
  • 7:58 - 8:06
    machine with fitness and sleep trackers,
    the coffee machine already knows when you
  • 8:06 - 8:13
    get up whether you need a strong or a soft
    coffee in the morning, and so that might sound
  • 8:13 - 8:20
    comfortable for some of us, but it also
    has a lot of dangers inside, because you
  • 8:20 - 8:26
    never know whether the data is really safe
    and whether it only stays with you. Maybe your
  • 8:26 - 8:37
    insurance company gets it one day. So you
    all know Cars -probably-, the film, and
  • 8:37 - 8:46
    this is Lightning McQueen, and there is a toy
    of it nowadays which is sold for 350 dollars -
  • 8:46 - 8:55
    no sorry, euros - and this car is able to
    sit next to you and watch the film with
  • 8:55 - 9:02
    you and is going to comment on the film.
    laughter
  • 9:02 - 9:10
    This sounds very funny - and it is funny -
    but it means that it has
  • 9:10 - 9:15
    a microphone integrated which is waiting
    for certain terms in the film at the right
  • 9:15 - 9:23
    points of the story and then it makes comments. And
    the microphone can only be turned off by
  • 9:23 - 9:31
    app so there's no physical button to turn
    it off and actually another thing is when
  • 9:31 - 9:36
    you actually get this
    present for Christmas, which is a really
  • 9:36 - 9:47
    expensive present at 350 euros, it
    actually first updates for more than
  • 9:47 - 10:01
    35 minutes before you can even use it. The next
    example - you're already laughing - is
  • 10:01 - 10:09
    internet of ... I call it internet of shit
    because you can't say anything else to
  • 10:09 - 10:16
    that example. It's a toilet IoT sensor
    which is actually a smart, small little
  • 10:16 - 10:25
    box which is put into the toilet. And this
    box has sensors. It's an Intel box, but I
  • 10:25 - 10:35
    don't know for sure, and these sensors
    help analyze the stool.
  • 10:35 - 10:44
    And the data that is collected is sent
    to the cloud. And actually this
  • 10:44 - 10:50
    could be very useful for people who are
    having chronic diseases like colitis
  • 10:50 - 10:59
    ulcerosa or other chronic digestive
    diseases, but it is mainly
  • 10:59 - 11:05
    designed for healthy people who want to
    improve their nutrition and reduce their
  • 11:05 - 11:14
    stress levels with the stool analysis. And
    maybe it sounds good at the beginning but
  • 11:14 - 11:22
    this data that is collected could also be
    used for other things in the future. So
  • 11:22 - 11:31
    it's a perfect example for internet of
    shit. But there is another internet of
  • 11:31 - 11:38
    shit which is a twitter account that
    collects all these funny little stories.
  • 11:38 - 11:45
    It's not from me, so I'm not behind that.
    I tried to reach the person but I never
  • 11:45 - 11:51
    got a reply, so I can't tell you anything
    about them but they collect examples - if
  • 11:51 - 11:56
    you don't follow them now and are
    interested in this topic you might do so
  • 11:56 - 12:05
    after this talk - so after presenting a
    couple of IoT examples with the good and a
  • 12:05 - 12:13
    bit of the bad sides I first wanna focus a
    little bit on the problem because as I
  • 12:13 - 12:20
    said before you might now think that
    everything is nice, comfortable, why
  • 12:20 - 12:27
    shouldn't we do that and stuff like that.
    So the problem is that most of the vendors
  • 12:27 - 12:34
    that are doing IoT stuff now, that start
    to connect everything - they have been creating
  • 12:34 - 12:41
    manually operated devices without
    connectivity for many years, and they had a
  • 12:41 - 12:48
    lot of knowledge in terms of materials,
    ergonomics, mechanical engineering but
  • 12:48 - 12:58
    almost zero in the field of IT security.
    Actually I don't say that without having
  • 12:58 - 13:07
    talked to vendors that have said exactly
    that when I interviewed them. Like there
  • 13:07 - 13:15
    was a lightbulb vendor from Austria who is
    a really big vendor who is making
  • 13:15 - 13:22
    lightbulbs for years and years and years
    and actually they started to make
  • 13:22 - 13:35
    connected lightbulbs in 2015, and when they
    did that, I asked them "Oh, how
  • 13:35 - 13:45
    big is your IT security department?" "1
    Person". So they didn't actually have the
  • 13:45 - 13:52
    knowledge that IT security might be more
    important when they connect - when they
  • 13:52 - 14:00
    start to connect things. And actually the
    result is that these vendors are making
  • 14:00 - 14:06
    the same sort of security errors that the
    high-tech industry was dealing with 15
  • 14:06 - 14:14
    years ago. So the early 2000s called and
    want their web security - their lack of
  • 14:14 - 14:24
    security back. So there are all kinds of
    problems we already know from the past:
  • 14:24 - 14:29
    hardcoded passwords, insecure Bluetooth
    connections, permanent cloud server
  • 14:29 - 14:39
    connections and a lot of other stuff. So,
    of all these 20
  • 14:39 - 14:46
    billion devices out there, there will be a
    lot of insecure devices, and the problem is
  • 14:46 - 14:53
    that they are being collected into botnets and
    are starting DDoS attacks, and we are going
  • 14:53 - 15:03
    to have internet outages. For those who
    are not familiar with the terms I made a
  • 15:03 - 15:08
    really really really short explanation so
    that you are also understanding what I am
  • 15:08 - 15:15
    talking about. A botnet is a network of
    private computers infected with malicious
  • 15:15 - 15:22
    software and controlled as a group without
    the owners' knowledge. Like the example of
  • 15:22 - 15:29
    the refrigerator that was sending out spam
    I told you about earlier. This
  • 15:29 - 15:36
    refrigerator - one single refrigerator - was
    sending out 750,000 spam mails, by the
  • 15:36 - 15:43
    way. So the botnet has a botnet
    owner, of course - it's not only a
  • 15:43 - 15:50
    zombie botnet, and the botnet owner can
    control this network of infected computers
  • 15:50 - 15:58
    by issuing commands to perform malicious
    activities like DDoS attacks. So DDoS is a
  • 15:58 - 16:04
    distributed denial-of-service attack, and
    actually that's an attempt to stop
  • 16:04 - 16:10
    legitimate users from accessing the data
    normally available on a website. And this
  • 16:10 - 16:20
    actually can lead to a complete shutdown
    of a service. And we had this already, so
  • 16:20 - 16:30
    I'm not talking about something in the far
    future but we had this in 2016 and most
  • 16:30 - 16:38
    people already noticed it but they didn't
    recognize why - their Twitter accounts
  • 16:38 - 16:44
    did not work, they couldn't use Reddit, or
    Spotify, or they couldn't pay with PayPal
  • 16:44 - 16:53
    at that moment. And behind that attack was
    Mirai so several other major services were
  • 16:53 - 17:03
    offline because an infrastructure provider
    was attacked by zombie IoT devices.
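    (Mirai-style botnets spread mostly by logging in to devices that still
    expose telnet with factory-default credentials - a detail the talk does
    not go into. Purely as a hedged, defensive illustration, the Python
    sketch below - the IP address is a made-up placeholder - checks whether
    one of your own devices still answers on the classic ports Mirai scanned.)

        # Defensive sketch only: check whether one of *your own* devices still
        # exposes telnet (port 23) or the alternate port 2323 that Mirai scanned.
        # The IP address below is a made-up placeholder for a device on your LAN.
        import socket

        DEVICE = "192.168.0.42"   # hypothetical address of your own IoT device
        PORTS = [23, 2323]        # classic telnet ports probed by Mirai

        for port in PORTS:
            try:
                with socket.create_connection((DEVICE, port), timeout=3):
                    print(f"{DEVICE}:{port} is open - disable telnet and change "
                          "any factory-default password")
            except OSError:
                print(f"{DEVICE}:{port} appears closed")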
  • 17:03 - 17:12
    And this was one year ago. Now, one year
    later, Mirai botnet infections are still
  • 17:12 - 17:21
    widespread so not every zombie device is
    already secured so there are still some
  • 17:21 - 17:27
    around, and not so few. And actually
    there is a study saying that every
  • 17:27 - 17:36
    unsecured - no, every botnet infection
    that's there - every security hole that's
  • 17:36 - 17:43
    there stays there for at least 7
    years, which means that all the insecure
  • 17:43 - 17:51
    devices which are out now could get
    infected and could stay infected for 7
  • 17:51 - 17:57
    years. So that's why it's very important
    that we are going to do something really
  • 17:57 - 18:10
    quickly and not starting like in 2020. So
    Mirai was supposed to continue in 2017 and
  • 18:10 - 18:20
    actually a lot of DDoS attacks similar
    to Mirai happened in 2017. This
  • 18:20 - 18:30
    as an example "could unleash at any moment" -
    that was in November - and a few days later
  • 18:30 - 18:42
    exactly this attack was unleashed, so it
    happened. In 2017 we also had a huge
  • 18:42 - 18:54
    increase in DDoS attacks - a 91% increase from
    Q1 - and it's going to increase more. I have
  • 18:54 - 19:09
    to take a short sip, sorry.
    Now we're coming back to examples. One
  • 19:09 - 19:16
    really good example is the university that
    was attacked by its own vending machines
  • 19:16 - 19:26
    and smart lightbulbs and 5000 other IoT
    devices. This was very very difficult to
  • 19:26 - 19:32
    get fixed because they couldn't take the
    university network down, so they had to
  • 19:32 - 19:38
    find a really difficult solution to get it
    back up. And actually how did they even
  • 19:38 - 19:43
    notice it? Because the students
    complained that the internet was going so
  • 19:43 - 19:53
    slow. Another example which has nothing to
    do with DDoS attacks anymore but with IoT
  • 19:53 - 20:03
    sensors - actually - in a fishtank in an
    American casino - north American casino
  • 20:03 - 20:12
    there were sensors measuring the
    temperature of the aquarium and the
  • 20:12 - 20:19
    fishtank - so that the fish didn't die -
    and these sensors were sending the data to
  • 20:19 - 20:28
    a PC of this casino, and this PC was
    using the same network as the
  • 20:28 - 20:38
    sensors, so actually the cybercriminals
    could access this data of the casino
  • 20:38 - 20:43
    and were stealing it and sending it to
    their own servers in Finland. And the
  • 20:43 - 20:56
    amount was about 10GB of data. Another
    example which is actually one of my most -
  • 20:56 - 21:03
    I don't know why but it's the example I
    personally like most of all the examples
  • 21:03 - 21:11
    that were collected in 2017. So there was a
    surveillance camera bought by a
  • 21:11 - 21:22
    Dutch woman. Actually she wanted to
    surveil her dog when she was out at work
  • 21:22 - 21:30
    but what did this camera do? It did
    surveil the dog when she was out at work,
  • 21:30 - 21:37
    but when she was at home the camera
    followed her through the room and was
  • 21:37 - 21:44
    watching her all over the place. And it
    had a microphone integrated and one day it
  • 21:44 - 21:52
    started to talk with her and it said "hola
    señorita". And this woman was so
  • 21:52 - 22:00
    frightened that she actually started to
    record that because she thought that
  • 22:00 - 22:08
    nobody would believe this story and everyone
    would think she was crazy. But this camera did not
  • 22:08 - 22:16
    only surveil the dog but was hacked and
    surveilled her. And it was a very cheap
  • 22:16 - 22:22
    camera by the way. She bought it in a
    supermarket but we don't know the name of
  • 22:22 - 22:29
    the vendor in this case. So coming from a
    very cheap camera to a very high-tech
  • 22:29 - 22:40
    camera: the camera you see here is one
    that is actually installed in a lot of
  • 22:40 - 22:48
    companies and there was a security hole
    found by some Vienna security specialists
  • 22:48 - 22:53
    from SEC Consult, and actually they
    demonstrated to me how they could
  • 22:53 - 23:03
    hack into this camera and how they could
    make it possible that this camera shows
  • 23:03 - 23:13
    pictures of an empty room in a bank so the
    pictures from the empty room in the bank
  • 23:13 - 23:20
    were shown to me and in reality the bank
    was robbed - ok, not in reality. But it
  • 23:20 - 23:29
    could have been robbed. So that actually
    sounds a little bit like a movie scene,
  • 23:29 - 23:38
    and actually this camera which is sold as
    a security camera is kind of useless when
  • 23:38 - 23:43
    it doesn't have security and it doesn't
    really show the picture. And the problem
  • 23:43 - 23:54
    with this camera was hardcoded passwords.
    And the hardcoded password got fixed
  • 23:54 - 24:03
    afterwards; it was a responsible disclosure
    process and this camera is safe now.
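    (Hardcoded credentials like these are often visible as plain ASCII
    strings inside a firmware image. The talk does not show how SEC Consult
    found them; as a rough, hedged illustration only, the sketch below -
    the firmware filename is a placeholder - lists printable strings in a
    firmware dump that look credential-related.)

        # Rough illustration only: list printable strings in a firmware image
        # that look credential-related. "firmware.bin" is a placeholder name.
        import re
        import sys

        path = sys.argv[1] if len(sys.argv) > 1 else "firmware.bin"
        data = open(path, "rb").read()

        # Runs of 6+ printable ASCII characters, like the classic `strings` tool.
        for match in re.finditer(rb"[ -~]{6,}", data):
            text = match.group().decode("ascii")
            if any(hint in text.lower() for hint in ("passw", "admin", "login", "secret")):
                print(text)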
  • 24:03 - 24:12
    So I'm coming to a different example now, and
    this finally explains why this toy is sitting
  • 24:12 - 24:20
    here. Before my talk everybody was telling
    me "Ah, you brought your favorite toy, to
  • 24:20 - 24:26
    protect you during your talk." and I was
    laughing "Oh no. No no no no, it one of
  • 24:26 - 24:37
    the most unsecure devices out there." But
    before we come to this in special I'm
  • 24:37 - 24:47
    going to talk a little bit about connected
    toys. So the German Stiftung Warentest
  • 24:47 - 24:55
    made a study regarding connected toys.
    The people were testing them and actually
  • 24:55 - 25:05
    all of their tested bears, robot dogs and
    dolls were very, very insecure, and some of
  • 25:05 - 25:13
    them were rated critical and some even
    extremely critical. And
  • 25:13 - 25:22
    actually what was the problem with the
    toys and also with this one? They are
  • 25:22 - 25:28
    using Bluetooth connections. And
    these Bluetooth connections are not
  • 25:28 - 25:34
    secured by a password or a PIN code. So
    every smartphone user close enough could
  • 25:34 - 25:43
    connect to the toy and listen to children,
    or ask them questions, or threaten them.
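    (To make "no password, no PIN" concrete: on Linux, a few lines of Python
    can open a plain Bluetooth RFCOMM connection without any pairing step.
    This is only a sketch under assumptions - the address and channel are
    made up, and the real toys speak their own protocol on top of such a
    connection.)

        # Sketch only: open an unauthenticated Bluetooth RFCOMM connection (Linux).
        # The address and channel are made-up placeholders; the point is simply
        # that no pairing and no PIN are required by such toys.
        import socket

        TOY_ADDR = "00:11:22:33:44:55"   # hypothetical Bluetooth address of a toy
        CHANNEL = 1                      # hypothetical RFCOMM channel

        sock = socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM,
                             socket.BTPROTO_RFCOMM)
        sock.connect((TOY_ADDR, CHANNEL))
        print("connected without any PIN or pairing")
        sock.close()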
  • 25:43 - 25:50
    Another problem is the data-collecting
    apps related to this stuff. So actually
  • 25:50 - 25:59
    this little unicorn has an app where you
    can send messages. So what does this do,
  • 25:59 - 26:08
    actually? It can play messages, and as a
    child you can record messages and
  • 26:08 - 26:17
    send them to your mom or your dad. And when
    a message is waiting for you, the heart
  • 26:17 - 26:25
    blinks. So actually there's a message
    waiting for you now. And I'm not sure if
  • 26:25 - 26:33
    it's the same one that I recorded earlier.
    Maybe now it is, maybe at the end
  • 26:33 - 26:43
    of the talk when I press the button again
    it might not be. And so everybody can - so
  • 26:43 - 26:50
    this - err, sorry - this device does have
    an app where you can send messages to it.
  • 26:50 - 26:56
    And it also has a children's interface, and
    when you are using the children's interface
  • 26:56 - 27:03
    you see that there are ads
    integrated. And in the children's
  • 27:03 - 27:13
    interface there were ads for porn and
    ehm... other stuff, which is not
  • 27:13 - 27:20
    really something to put in the hands of a child. And
    this is also what Stiftung Warentest has
  • 27:20 - 27:31
    actually - yeah has actually found out.
    The data is also sent to third
  • 27:31 - 27:36
    party companies, and they put in trackers to
    track the online behavior of the children's
  • 27:36 - 27:43
    parents. This is also done with this
    device. So the Stiftung Warentest advises that
  • 27:43 - 27:51
    a non-connected, dumb teddy might be the
    smarter choice in the future. And before I
  • 27:51 - 27:57
    finally press this button - you're
    probably curious now - but first I'm going
  • 27:57 - 28:07
    to talk a little bit about Cayla. You
    probably have heard of Cayla as a very
  • 28:07 - 28:15
    insecure doll. Actually it got banned
    in Germany by law. It is judged as a
  • 28:15 - 28:22
    prohibited broadcasting station. And
    parents who do not destroy it will be
  • 28:22 - 28:29
    actually fined. And I tried to buy Cayla
    in Austria and didn't get the doll. So
  • 28:29 - 28:35
    actually it should be really off the
    market in the German-speaking area. And
  • 28:35 - 28:44
    actually that is also a result of a
    campaign from Norway called Toyfail, run by
  • 28:44 - 28:50
    a Norwegian consumer organization, who
    actually - this is Cayla, you can see
  • 28:50 - 29:00
    her now - went to the
    European Parliament to make them
  • 29:00 - 29:08
    understand how insecure toys are doing a
    lot of harm and how we should put more
  • 29:08 - 29:17
    security into toys. And I've brought a
    short little video and I hope we can hear
  • 29:17 - 29:28
    the audio here as well. We will see.
    No. You don't hear anything.
  • 29:28 - 29:32
    But this doesn't matter because they
    have...
  • 29:32 - 29:36
    Sign Language Interpreter: subtitles
    Barbara: subtitles.
  • 29:36 - 29:41
    Person (Video): There's not added any kind
    of security. With simple steps I can talk
  • 29:41 - 29:45
    through the doll and listen to other
    people.
  • 29:45 - 29:48
    Person through doll (Video): No one wants
    others to speak directly through the doll.
  • 29:48 - 29:57
    Barbara: He's speaking now at the moment.
    Doll: inaudible
  • 29:57 - 30:39
    Person: And you may think... [see video
    subs] ... Cayla, can I trust you?
  • 30:39 - 30:44
    Doll: I don't know.
    laughter
  • 30:44 - 30:58
    applause
    Barbara: Yeah and we don't trust Cayla and
  • 30:58 - 31:08
    we also don't trust our little unicorn.
    button clicking
  • 31:08 - 31:25
    laughter
    crying baby in background
  • 31:25 - 31:35
    Barbara: Ok, somebody has hacked it.
    laughter
  • 31:35 - 31:43
    Yes.
    Unicorn Toy: Hello, Chaos Communication
  • 31:43 - 31:48
    Congress.
    Barbara: Ok, that's what I recorded
  • 31:48 - 31:57
    earlier. But there is some time left.
    Maybe, maybe... but you're all sitting too
  • 31:57 - 32:04
    far away actually and none of you brought
    your computer, so... but we will see, we
  • 32:04 - 32:10
    will try it later on. So but actually you
    shouldn't trust this unicorn, because this
  • 32:10 - 32:22
    unicorn is from the company called
    CloudPets, which is a - no, sorry, it's a
  • 32:22 - 32:30
    toy called CloudPets and the company is
    Spiral Toys from the US. So this is
  • 32:30 - 32:39
    a CloudPets toy, and there are cats and dogs and
    unicorns and it's very ugly but it's a
  • 32:39 - 32:49
    unicorn. And actually now I'm already
    talking a lot about this. Here's why I'm
  • 32:49 - 32:58
    explaining it to you now: there already was a
    data breach with this toy so the
  • 32:58 - 33:06
    children's messages in the CloudPets data
    were actually stolen and made public on the
  • 33:06 - 33:14
    internet. 2 million voice messages
    recorded on the cuddly toys have been
  • 33:14 - 33:25
    discovered freely accessible on the internet. And
    actually Spiral Toys says that there was no
  • 33:25 - 33:34
    data breach but the data was there, so...
    That's also why I brought this; it was
  • 33:34 - 33:40
    still very easily available and actually
    as I said before, the app's child
  • 33:40 - 33:51
    interface shows porn ads, so I would not
    recommend that for your child. Actually
  • 33:51 - 33:56
    there are already a lot of institutions
    out there which are warning about connected
  • 33:56 - 34:03
    toys: the consumer group Which?, which
    actually did a study about this and
  • 34:03 - 34:10
    also analyzed others like the Furby Connect;
    the German Stiftung Warentest,
  • 34:10 - 34:14
    the Austrian Verein für
    Konsumenteninformation, the Norwegian
  • 34:14 - 34:22
    consumer council, and the FBI. The list is
    to be continued. So consider if you really
  • 34:22 - 34:31
    need a connected toy for your child or
    yourself because the next section is about
  • 34:31 - 34:38
    sex toys.
    laughter
  • 34:38 - 34:50
    applause
    squeaky horn
  • 34:50 - 34:57
    more laughter and applause
    I am not... It's not necessary to say a lot
  • 34:57 - 35:04
    about this example. It's actually a
    connected vibrator that has a built-in
  • 35:04 - 35:19
    camera and this camera is very very very
    unsafe. Also this toy is really expensive,
  • 35:19 - 35:25
    so you can't say "Eh, it's only the cheap
    stuff that is so unsecure." Also the high-
  • 35:25 - 35:32
    tech stuff can be really insecure. I mean
    this vibrator costs 250 dollars so it's
  • 35:32 - 35:43
    very expensive, and it has a built-in web-
    connected endoscope and they found out
  • 35:43 - 35:56
    that it's massively insecure. The password
    of this... And if you forget to change it,
  • 35:56 - 36:02
    there might be a few more viewers than expected
    watching your newest video about
  • 36:02 - 36:10
    your private sex adventures. There was
    another example actually in this - sorry
  • 36:10 - 36:15
    go back one more time to this example -
    there's a very funny video about it on
  • 36:15 - 36:20
    YouTube, maybe you wanna watch
    it. I didn't bring it because I couldn't
  • 36:20 - 36:32
    reach the makers of it. So I'm going to
    the next example, which is about the case of a
  • 36:32 - 36:39
    sex toy company that actually admitted to
    recording users' remote sex sessions and
  • 36:39 - 36:48
    called it a "minor bug". It was this love
    sensor remote app you can see the icon
  • 36:48 - 36:56
    here and actually this is a vibrator and
    an app and the vibrator controlling app
  • 36:56 - 37:03
    was recording all the sex sounds, all the
    sounds you're making when you're using
  • 37:03 - 37:10
    this vibrator, and stored them on the phone
    without your knowledge. And the company
  • 37:10 - 37:16
    says that no information or data was sent
    to the servers so this audio file exists
  • 37:16 - 37:22
    only temporarily and only on your device. And
    they already had an update so actually
  • 37:22 - 37:28
    this is not as funny as the other story
    but still it's an example of how unsecure
  • 37:28 - 37:38
    sex stuff can be. So there are lots and lots
    more sex toy examples out there. One you
  • 37:38 - 37:46
    should actually definitely search for
    after - please don't search for now, but
  • 37:46 - 37:55
    after this talk. You could google or
    DuckDuckGo or whatever you use for the terms
  • 37:55 - 38:04
    "blowjob injection". And please add
    "security" because otherwise you will land
  • 38:04 - 38:08
    on other sites.
    laughter
  • 38:08 - 38:18
    And this was a female security expert who
    was doing this research about a device
  • 38:18 - 38:25
    with which your girlfriend was supposed
    to be able to make you a special
  • 38:25 - 38:31
    blowjob program - your special blowjob - and
    this could be hacked so somebody else's
  • 38:31 - 38:39
    blowjob might appear instead of your own.
    laughter
  • 38:39 - 38:48
    So there's also a story about a map of
    buttplugs in Berlin that are insecure.
  • 38:48 - 38:56
    Also if you're interested in that please
    also search for that story. Because it's
  • 38:56 - 39:01
    funny to talk about this, but I also wanna
    talk a little bit about things that we could
  • 39:01 - 39:09
    actually do. And one of the projects in
    this part is actually doing something
  • 39:09 - 39:14
    thats called the "internet of dongs
    project - hacking sex toys for security
  • 39:14 - 39:22
    and privacy". And as you can see it's
    supported by PornHub, which in this case
  • 39:22 - 39:29
    means that they get money from PornHub
    so that they can buy sex toys for their
  • 39:29 - 39:42
    research. So PornHub is sponsoring them.
    Actually I did talk to the guy who is
  • 39:42 - 39:50
    behind this project. He's called
    RenderMan. That's a render of him, and this
  • 39:50 - 39:57
    is the website by the way. So he told me
    he's currently - they're currently a team
  • 39:57 - 40:06
    of about 15-20 people out there that are
    doing their security research in their own
  • 40:06 - 40:11
    spare time. And they are not getting any
    money for it and they also don't want to
  • 40:11 - 40:18
    get any money but they are already looking
    for more security experts that wanna join
  • 40:18 - 40:24
    the team. They also have an
    ethical codex and stuff like that, and
  • 40:24 - 40:32
    actually one of the most important things
    that he was telling me is that he doesn't
  • 40:32 - 40:41
    want you to stay away from connected
    sex toys altogether, but to find the security
  • 40:41 - 40:55
    holes so that we are all able to use them if
    we want without any fear. So yeah, you can
  • 40:55 - 41:03
    get in contact with him if you're
    interested. Coming to a different section
  • 41:03 - 41:14
    now. You can see I'm switching from
    security to security and privacy and now
  • 41:14 - 41:24
    I've landed on the privacy section. This is
    Google Home. And we all know that there is
  • 41:24 - 41:33
    also Amazon Echo and digital assistants
    are also smart IoT devices and this is why
  • 41:33 - 41:39
    I wanna talk for a very, very short time about
    them because I'm sure a lot of people got
  • 41:39 - 41:46
    those devices for Christmas. Actually
    there was a big increase of digital
  • 41:46 - 41:57
    assistants in the last year: in
    quarter 3 of 2016 there were only 900,000
  • 41:57 - 42:11
    of such devices sold, and in quarter 3 of
    2017 we had more than 7.4 million of those
  • 42:11 - 42:17
    devices sold. So there's a huge increase
    and we don't even have the numbers of the
  • 42:17 - 42:29
    Christmas time. Yeah, you have seen it. So
    why do I wanna talk about it? Because when
  • 42:29 - 42:37
    you put this kind of stuff in your home it
    might be very comfortable at the beginning
  • 42:37 - 42:42
    because you don't have to look up the
    weather information yourself, you don't
  • 42:42 - 42:47
    have to read your emails - you can make the
    device read your emails to you - you can use
  • 42:47 - 42:56
    them to program your list of what you're
    going to buy and stuff like that but
  • 42:56 - 43:02
    that's how they learn a lot about the
    users' habits and their personalities, and
  • 43:02 - 43:07
    those devices will learn more and more
    information about you and this information
  • 43:07 - 43:16
    does not stay in your own home; it actually
    is sent to the servers of Amazon
  • 43:16 - 43:23
    and Google, and I don't need to tell you
    what Amazon and Google are doing with this
  • 43:23 - 43:31
    data. At least currently they are
    only collecting it but that's very
  • 43:31 - 43:40
    valuable and they turn around and use it
    or sell it in various ways to monetize
  • 43:40 - 43:49
    that information some day in the future.
    So all digital assistants send the
  • 43:49 - 43:54
    voice controls that are made after "Ok,
    Google" or "Alexa" to their servers and
  • 43:54 - 44:01
    the data will be saved there and it was
    not possible for me to find out for how
  • 44:01 - 44:07
    long and on which servers. It's not in
    their terms and conditions, and I couldn't
  • 44:07 - 44:16
    find it anywhere. Also the German federal
    data protection commissioner Andrea Voßhoff didn't
  • 44:16 - 44:22
    find this information. She criticized that
    "It is not easy for users to understand
  • 44:22 - 44:28
    how, to what extent and where the
    information collected is processed. Also,
  • 44:28 - 44:37
    it is not clear how long the data will be
    stored." So if you still want those
  • 44:37 - 44:45
    devices in your home now there are at
    least physical mute buttons on Google
  • 44:45 - 44:52
    Home and Amazon Echo, and you can also
    change the settings to control the data so
  • 44:52 - 45:00
    all the data that is collected is regularly
    deleted from the servers but of course you
  • 45:00 - 45:08
    never know in how many backups it's
    kept as well. So yes, it's only
  • 45:08 - 45:22
    recording after this voice command, but
    both devices already got hacked - yeah -
  • 45:22 - 45:32
    Amazon Echo got hacked in 2016 and
    the Google Home Mini got hacked in 2017. Of course
  • 45:32 - 45:40
    both problems got fixed and when I say got
    hacked it means that the devices in your
  • 45:40 - 45:54
    home were listening to the conversations
    all the time. So I'm coming -
  • 45:54 - 46:01
    unfortunately the funny examples are over.
    I'm coming to the part where I wanna speak
  • 46:01 - 46:10
    about what we can do against the lack of
    security and lack of privacy with the
  • 46:10 - 46:19
    internet of things. So we currently
    have a status quo with an
  • 46:19 - 46:24
    information asymmetry between the vendor
    and the customer. Currently the
  • 46:24 - 46:29
    manufacturers do not need to provide simple
    information about the security of
  • 46:29 - 46:37
    a device such as how long it will receive
    security updates. so when we buy a device
  • 46:37 - 46:52
    we never know... oh is it going to be safe
    or not. So what we need ... actually what
  • 46:52 - 47:00
    we need: I wrote down a couple of
    things here which
  • 47:00 - 47:10
    are partly stolen from the Green MEP Jan
    Philipp Albrecht from his program because
  • 47:10 - 47:18
    he's dealing a lot with that kind of
    question - what we can do - in his work, and
  • 47:18 - 47:28
    I also stole some of
    those suggestions from RenderMan from
  • 47:28 - 47:35
    the Internet of Dongs project, he also had
    some helpful tips. And I also stole some
  • 47:35 - 47:40
    of the information from security experts I
    talked to in interviews all the time,
  • 47:40 - 47:45
    because we never talk only about the bad
    things we always - we all want to make the
  • 47:45 - 47:53
    internet of things safer in the end. So
    some of them suggested that we could need
  • 47:53 - 48:01
    a security star rating system similar to
    the energy labeling. And when we talk
  • 48:01 - 48:13
    about security star ratings that could
    mean that we use a label. When a device
  • 48:13 - 48:20
    gets security updates for free for the
    next five years it gets the A++ label, if
  • 48:20 - 48:25
    there are no updates at all and it stays
    insecure it gets the worst rating, or
  • 48:25 - 48:32
    such things. Actually vendors should also
    be forced to close security holes instead
  • 48:32 - 48:40
    of ignoring them. And they should provide
    the security researchers with email
  • 48:40 - 48:46
    addresses where we can easily report
    security flaws because sometimes the
  • 48:46 - 48:52
    hardest part of the game is to actually
    find the right contact to send out the
  • 48:52 - 49:01
    information about what is insecure and
    what's not. What we also need is a
  • 49:01 - 49:09
    mandatory offline mode for electronic
    devices so this device at least has a
  • 49:09 - 49:20
    button where you can turn it off, so it
    doesn't listen to you permanently. And we
  • 49:20 - 49:28
    need that for all devices - all connected
    devices. Also an airbag and seatbelt for
  • 49:28 - 49:35
    the digital age and we also have to talk
    about product liability and a clear update
  • 49:35 - 49:46
    policy. So there are also good examples
    that we have now. Actually, everything
  • 49:46 - 49:55
    I was talking about here is regulation.
    Regulation that does not exist at the
  • 49:55 - 50:05
    moment. But there is some regulation that
    already exists in the area of data, which is
  • 50:05 - 50:13
    the GDPR the General Data Protection
    Regulation which is coming up in May 2018
  • 50:13 - 50:20
    and it includes some really, really
    really helpful things: privacy by design
  • 50:20 - 50:28
    and privacy by default. And more
    possibilities for law enforcement. And
  • 50:28 - 50:36
    this is very very important because it
    doesn't mean that just because we are going to
  • 50:36 - 50:43
    have a regulation about privacy by design
    and privacy by default this is really done
  • 50:43 - 50:48
    by the vendors. Actually, when I was
    interviewing some of them they already
  • 50:48 - 50:55
    told me that it's not their plan to
    integrate that in their products they are
  • 50:55 - 51:04
    going to wait until they are sued. They
    say "Oh, we don't need it. why should we
  • 51:04 - 51:16
    do it worked now - nope." So that's why
    the law enforcement comes into place and
  • 51:16 - 51:21
    maybe some of you know Max Schrems, he's
    also speaking here in two days about
  • 51:21 - 51:28
    something else though, and he is a data
    protection activist. And he says that
  • 51:28 - 51:34
    everything that goes will be done in this
    phase we are in now, but if vendors won't
  • 51:34 - 51:45
    observe the law we have to remind them to
    do it. So this is what he looks like, and he
  • 51:45 - 51:52
    says that with this new regulation we can,
    as a customer, ask for compensation when
  • 51:52 - 51:58
    data breaches occur. We couldn't do that
    so easily now but with this new regulation
  • 51:58 - 52:05
    it will get a lot easier. And if 4
    billion people sue a company and ask for
  • 52:05 - 52:16
    compensation that could be a bit expensive
    at the end. So if you are not able to sue
  • 52:16 - 52:25
    anybody yourself - which is not cheap, so
    not everybody will sue
  • 52:25 - 52:32
    companies - you can support organizations
    that help you with that like the new
  • 52:32 - 52:39
    organization from Max Schrems called "None
    of Your Business" maybe you have seen this
  • 52:39 - 52:46
    already, I'm not saying that you should
    support especially (???) this
  • 52:46 - 52:52
    organization but his plan is to actually
    do that stuff I explained earlier: sue
  • 52:52 - 52:59
    companies that are not abiding by the law.
    So if you wanna visit the website they
  • 52:59 - 53:13
    are currently collecting money. What else can
    consumers do? There are no easy tips, but we
  • 53:13 - 53:20
    can't do much except a few easy things.
    Does this product really need an internet
  • 53:20 - 53:28
    connection? Is it possible to turn it off?
    Is it still working after that? What do we
  • 53:28 - 53:37
    find about it on the internet? Can we
    reach the vendor? Does the vendor reply
  • 53:37 - 53:45
    when I have a question? Do we get more
    information? Sometimes also clicktivism
  • 53:45 - 53:53
    helps to stop vendors making stupid
    decisions. Here is another example from
  • 53:53 - 54:00
    the robot vacuum cleaner Roomba,
    whose maker wanted to sell the data that is
  • 54:00 - 54:08
    collected from the home from the vacuum
    cleaner and actually there was a huge huge
  • 54:08 - 54:14
    huge shitstorm after the CEO
    announced that.
  • 54:14 - 54:20
    And after the shitstrorm the CEO said "Ok,
    no nono. We're not collecting. We're not
  • 54:20 - 54:28
    selling your data. No no." So sometimes
    this helps as well and of course follow
  • 54:28 - 54:36
    the basics in IT security: please update
    everything that has updates, use separate
  • 54:36 - 54:45
    networks for IoT products and use safe
    passwords, support open hardware and open
  • 54:45 - 54:51
    software - products where the data is
    stored locally are always better than
  • 54:51 - 54:58
    the cloud - and if you're tech-savvy enough -
    which I think you are here - start
  • 54:58 - 55:09
    building your own tools, because you have
    the control.
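    ("Use safe passwords" is the easiest of these tips to act on; for
    instance, a small Python sketch like the one below generates a long
    random password per device instead of reusing the same one everywhere.)

        # Tiny helper for the "use safe passwords" advice: generate a long random
        # password for each device instead of reusing one everywhere.
        import secrets
        import string

        alphabet = string.ascii_letters + string.digits
        password = "".join(secrets.choice(alphabet) for _ in range(24))
        print(password)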
  • 55:09 - 55:15
    And what can developers do? Support privacy
    by design and security by design; think about it from the beginning
  • 55:15 - 55:22
    because you can change it and take
    responsibility. And IT security people can
    also do some stuff, or continue to do some
    do some stuff or continue to do some
    stuff. Point the vendor to the problems,
  • 55:30 - 55:36
    help make IT security stronger, keep
    reporting the flaws, publish your
  • 55:36 - 55:43
    research, help develop standards, labels
    and seat belts, and support each other's
    work to get a stronger voice about this. So
    work to a stronger voice about this. So
    I'm coming to the end of my talk now and
  • 55:52 - 55:58
    back to the topic of the internet of
    fails: How many must be killed in the
  • 55:58 - 56:05
    Internet of Deadly Things train wrecks?
    This is actually an article I was reading
  • 56:05 - 56:13
    with a huge interest myself because it was
    starting to deal with making comparisons
  • 56:13 - 56:18
    to the great age of railway construction
    that was likewise riddled with decades of
  • 56:18 - 56:26
    disasters before the introduction of
    effective signaling and failsafe brakes.
  • 56:26 - 56:30
    And it was also compared with the
    automotive industry where the mandatory
  • 56:30 - 56:37
    fitting of seatbelts, designing the bodies
    of cars to reduce injury to pedestrians,
  • 56:37 - 56:42
    airbags and measures to reduce air
    pollution were not introduced early
  • 56:42 - 56:51
    enough. So this guy was asked: Do we
    really need to kill a few people first?
  • 56:51 - 56:58
    And he said: Unfortunately that will happen.
    So he says: Safety and security standards
  • 56:58 - 57:06
    for the internet of things can't come soon
    enough. I agree with that - we
  • 57:06 - 57:16
    need standards really soon. So I am at the
    end of my talk and if we have some time
  • 57:16 - 57:22
    left I'm waiting for your questions,
    ideas, and input now. Otherwise I will
  • 57:22 - 57:25
    thank you very much for your attention.
  • 57:25 - 57:28
    applause
  • 57:28 - 57:34
    Herald: Thank you Barbara. A very warm
    applause.
  • 57:34 - 57:38
    So a small piece of information: If you want to
    exit the room please exit the room to your
  • 57:38 - 57:48
    left over there. So, questions?
    I see one question from the Signal Angel.
  • 57:48 - 57:54
    Q: Hello, ok. The internet wants to know,
    well those companies don't have any IoT
  • 57:54 - 58:03
    security whatsoever or basically none, so
    what can we do to make them have more?
  • 58:03 - 58:08
    B: What we as who, as consumers?
    Q: Yeah, basically.
  • 58:08 - 58:15
    B: Yeah, actually I would - what I said
    was I would write them and ask for
  • 58:15 - 58:26
    standards. I would - I think it can be the
    first step that we can write emails or
  • 58:26 - 58:33
    call them and say "Well, what kind of
    security is built into this device, can you
  • 58:33 - 58:40
    tell me? Otherwise I won't buy your
    product."
  • 58:40 - 58:50
    Herald: Thank you. Any other question? Ok,
    in this case again: Thank you Barbara for
  • 58:50 - 58:53
    your nice talk.
    applause
  • 58:53 - 59:00
    A very warm round of applause. Thanks.
  • 59:00 - 59:05
    34c3 outro
  • 59:05 - 59:21
    subtitles created by c3subtitles.de
    in the year 2018. Join, and help us!