
Talking Behind Your Back (33c3)

  • 0:00 - 0:14
    preroll music
  • 0:14 - 0:19
    Vasilios: Hello, everyone, thanks for coming
    today. I'm going to introduce the ultrasound
  • 0:19 - 0:24
    ecosystem, which is an exotic and kind of
    little known ecosystem. So I would like to
  • 0:24 - 0:30
    start with a short story about the
    product, which is also our motivation for
  • 0:30 - 0:40
    this work. So some time ago, there was a
    product that worked in the ultrasound
  • 0:40 - 0:44
    spectrum that cannot be perceived by
    humans. And the product was actually an
  • 0:44 - 0:51
    interesting idea. It was very promising
    and everything, but it also had a fatal
  • 0:51 - 0:57
    flaw. So now that I've done this
    introduction, I would like to tell you
  • 0:57 - 1:01
    more about the story of the product, how
    it came to be, and what its
  • 1:01 - 1:08
    lifecycle was. So in 2012, a company
    called SilverPush was founded as a startup
  • 1:08 - 1:15
    in India, and they had this ultrasound
    cross-device tracking product.
  • 1:15 - 1:19
    I'll go more into the technical details
    later. So for a couple of years, they were
  • 1:19 - 1:27
    working on that product. And it wasn't
    until 2014 that they basically got some
  • 1:27 - 1:33
    serious funding, from venture capital and
    angel investors, for millions. In
  • 1:33 - 1:39
    2014, a few months after
    they got funded, they also got some press
  • 1:39 - 1:45
    coverage about the product and they got
    some pretty good reviews in
  • 1:45 - 1:49
    newspapers and articles about what the
    product could do. And at the same time,
  • 1:49 - 1:53
    they were doing what most companies
    do, like publishing patents about
  • 1:53 - 2:00
    their technology and everything. Then, a
    year and a half or so later, things
  • 2:00 - 2:06
    started to go not so well
    for them. The security community noticed
  • 2:06 - 2:11
    and there was some press coverage about
    the product that was not so positive
  • 2:11 - 2:19
    anymore. So this is one of the very first
    emails that appeared on the Web regarding
  • 2:19 - 2:25
    the product. So it's from a W3C
    working group. So a researcher there is
  • 2:25 - 2:31
    basically notifying the other members of
    the group that, OK, there is this product,
  • 2:31 - 2:36
    maybe there are transparency issues, and
    certainly the users are not aware of
  • 2:36 - 2:42
    what exactly is going on there. So let's
    keep an eye on it. And so this was
  • 2:42 - 2:48
    one of the very first things published
    about the product from the privacy and
  • 2:48 - 2:54
    security perspective. So what happened
    then was the press took notice and they
  • 2:54 - 3:01
    got all those headlines urging users to be
    very careful. And, oh, this is
  • 3:01 - 3:09
    evil, take care. People are eavesdropping
    on you and stuff. So, of course, this
  • 3:09 - 3:15
    also led the FTC to take action. They
    organized a workshop on cross device tracking
  • 3:15 - 3:22
    in general, I think, and they made specific
    mentions of ultrasound cross-device tracking.
  • 3:22 - 3:28
    Don't worry if you're not familiar with these terms,
    I'm going to define everything later. So
  • 3:28 - 3:33
    what they were basically raising were
    transparency issues. How do we
  • 3:33 - 3:40
    protect ourselves? How does that thing
    working? So, then the users, of course,
  • 3:40 - 3:44
    started to react. And there were like many
    people who were unhappy, they were
  • 3:44 - 3:48
    complaining, what is this? I don't want
    that thing. So people were actually
  • 3:48 - 3:53
    suggesting solutions, solutions
    that made sense. And of course, you
  • 3:53 - 4:01
    always have the users that are completely
    indifferent to all of this. So what
  • 4:01 - 4:11
    happened then is that, about five months later,
    the FTC took much more serious action
  • 4:11 - 4:16
    regarding this specific product. So it
    sent a letter to all the developers. And
  • 4:16 - 4:20
    the letter was essentially saying, you
    know, you're using this framework in
  • 4:20 - 4:27
    your apps. We've seen it in the Google Play
    Store. It's not enough that you are asking
  • 4:27 - 4:31
    for the microphone permission. You should
    let the users know that you are tracking
  • 4:31 - 4:36
    them if you are doing so. Otherwise, you
    are violating rule X, Y, Z, and you're not
  • 4:36 - 4:43
    allowed to do that. So this was pretty
    serious, I would say. And what happened
  • 4:43 - 4:47
    next is basically the company withdrew
    from the US market and said, you know, we
  • 4:47 - 4:51
    have nothing to do with the U.S. market
    and this product is not active there. You
  • 4:51 - 4:57
    shouldn't be concerned. So, end of story:
    the product is no longer out there, in the
  • 4:57 - 5:04
    US at least. Are we safe? So it
    seemed to us that it was assumed that this
  • 5:04 - 5:10
    was an isolated security incident. And to
    be fair, very little became known about
  • 5:10 - 5:15
    the technology. At this point, the press
    moved on to other hot topics at the time,
  • 5:15 - 5:21
    people went quiet, like if people are not
    using it, it's fine. So everyone
  • 5:21 - 5:26
    seemed happy. But we're curious people. So
    we had lots of questions that were not
  • 5:26 - 5:35
    answered. Our main questions were: why
    were they using ultrasounds? We'll see
  • 5:35 - 5:42
    that what they are doing, you can do with
    other technologies. How do such frameworks
  • 5:42 - 5:47
    work? We had no idea; there was no
    technical coverage of it out there.
  • 5:47 - 5:53
    Are there
    other such products? Because we were
  • 5:53 - 5:59
    aware of one. All the articles
    were referring to that one product, but we
  • 5:59 - 6:03
    were not sure if there are others doing
    the same thing. And of course, we were
  • 6:03 - 6:08
    looking for a report about the whole
    ecosystem and how it works. And there was
  • 6:08 - 6:13
    nothing. So what do you do then if
    there are no technical resources?
  • 6:13 - 6:20
    Basically, we decided to do our own
    research and come up with this report that
  • 6:20 - 6:25
    we were lacking. So we're done with
    motivation so far. We were pretty pumped
  • 6:25 - 6:30
    up about looking into it. OK, what's
    there? The rest of the presentation will
  • 6:30 - 6:34
    go as follows. Like first I'm going to
    introduce ultrasound tracking and other
  • 6:34 - 6:41
    terminology, then I'm going to go on with
    the attack details. And indeed, we have an
  • 6:41 - 6:48
    attack against the Tor browser. Then
    we're doing a formal security analysis of
  • 6:48 - 6:53
    the ecosystem and try to pinpoint the
    things that went wrong. And then we'll try
  • 6:53 - 7:00
    to introduce our countermeasures and
    advocate for proper practices. So to begin
  • 7:00 - 7:07
    with, I'm Vasilis. I've done this work
    with other curious people. These are
  • 7:07 - 7:13
    Shuang Hao, Yanick Fratantonio, Christopher
    Kruegel and Giovanni Vigna from UCSB, and also
  • 7:13 - 7:19
    Federico Maggi from Politecnico di
    Milano. Let's now start with the
  • 7:19 - 7:26
    ecosystem, so apparently ultrasounds are
    used in a lot of places and they can be
  • 7:26 - 7:31
    utilized for different purposes: among
    them are cross-device tracking, which I
  • 7:31 - 7:37
    already referred to, audience analytics,
    synchronized content, proximity marketing,
  • 7:37 - 7:41
    and device pairing. You can do some other
    things, but you will see them later. So to
  • 7:41 - 7:49
    begin with what cross device tracking is,
    cross device tracking is basically the holy
  • 7:49 - 7:54
    grail for marketers right now because
    you're using your multiple devices
  • 7:54 - 7:58
    smartphone, laptop, computer, maybe your
    TV, and to them, you appear as different
  • 7:58 - 8:02
    people. And they all want to be able to
    link those devices, to know that
  • 8:02 - 8:07
    you're the same person so that they can
    build their profiles more accurately. So,
  • 8:07 - 8:13
    for instance, if you're watching an ad on
    the TV, they want to be able to know that
  • 8:13 - 8:20
    it's you so that they can push relevant
    ads from your smartphone or follow up ads.
  • 8:20 - 8:28
    So this is employed by major
    advertising networks, and there are two
  • 8:28 - 8:33
    ways to do it, either deterministically or
    probabilistically. The deterministic
  • 8:33 - 8:39
    approach is much more reliable. You get
    100 percent accuracy and works as follows.
  • 8:39 - 8:43
    If you are Facebook, the users are heavily
    incentivized to log in from all their
  • 8:43 - 8:49
    devices. So what happens is that you can
    immediately know that, OK, this user has
  • 8:49 - 8:55
    these three devices and I can put relevant
    content to all of them. However, if you
  • 8:55 - 8:59
    are not Facebook or Google, it's much
    less likely that the users would want to
  • 8:59 - 9:04
    log into your platform from their
    different devices. So you have to look for
  • 9:04 - 9:10
    alternatives. And one tool to come up with
    those alternatives are ultrasound beacons.
  • 9:10 - 9:18
    So, ultrasound tracking products are
    using ultrasound beacons. They may sound
  • 9:18 - 9:23
    exotic, but what
    they are basically doing is encoding a
  • 9:23 - 9:29
    sequence of symbols in a very high
    frequency that is inaudible to humans.
  • 9:29 - 9:36
    That's the first key feature. The second one
    is they can be emitted by most commercial
  • 9:36 - 9:39
    speakers and they can be captured by most
    commercial microphones, for instance,
  • 9:39 - 9:49
    found on your smartphone. So the technical
    details are the following. I know there
  • 9:49 - 9:54
    are a lot of experts in these kinds of
    things here, so I'm roughly averaging out how
  • 9:54 - 9:58
    the companies are doing it right now. I'm
    not saying that this is the best way to do
  • 9:58 - 10:02
    it, but this is more or less what they're
    doing. Of course, they have patents, so
  • 10:02 - 10:06
    each one of them is doing a slightly
    different thing so they don't overlap.
  • 10:06 - 10:13
    They're using the near ultrasound spectrum
    between 18 kilohertz and 20
  • 10:13 - 10:19
    kilohertz, which is usually inaudible
    to adults. They divide it into smaller
  • 10:19 - 10:27
    chunks. So if you divide it into chunks
    of 75 Hertz each, you get about 26
  • 10:27 - 10:34
    chunks, and then you can assign a letter of
    the alphabet to each one of them. And then
  • 10:34 - 10:38
    what they are doing is, usually within four
    to five seconds, they emit sequences of
  • 10:38 - 10:45
    characters. Usually these contain four
    to six characters, and they use
  • 10:45 - 10:52
    them to encode a unique ID
    corresponding to the source they attach
  • 10:52 - 10:57
    the beacon to. So there is no ultrasound
    beacon standard, as I said previously, but
  • 10:57 - 11:00
    there are lots of patents, so each one of
    them is doing a slightly different thing.
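As a rough illustration of the scheme just described, here is a minimal encoder/decoder sketch. The 18-20 kHz band and 75 Hz chunks come from the talk; the sample rate, symbol duration, and the exact chunk-to-letter mapping are assumptions made for the example:

```python
import numpy as np

SAMPLE_RATE = 44_100            # common commodity-hardware rate (assumed)
BAND_START = 18_000.0           # near-ultrasound band starts at 18 kHz
CHUNK_HZ = 75.0                 # (20000 - 18000) / 75 gives ~26 chunks
SYMBOL_SECONDS = 0.2            # per-symbol duration (assumed)
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def symbol_freq(ch: str) -> float:
    """Center frequency of the 75 Hz chunk assigned to a letter."""
    return BAND_START + ALPHABET.index(ch) * CHUNK_HZ + CHUNK_HZ / 2

def encode(beacon_id: str) -> np.ndarray:
    """Emit one pure tone per symbol, concatenated back to back."""
    t = np.arange(int(SAMPLE_RATE * SYMBOL_SECONDS)) / SAMPLE_RATE
    return np.concatenate([np.sin(2 * np.pi * symbol_freq(c) * t)
                           for c in beacon_id])

def decode(signal: np.ndarray) -> str:
    """Recover each symbol from the dominant FFT peak of its window."""
    n = int(SAMPLE_RATE * SYMBOL_SECONDS)
    symbols = []
    for start in range(0, len(signal) - n + 1, n):
        spectrum = np.abs(np.fft.rfft(signal[start:start + n]))
        peak_hz = np.fft.rfftfreq(n, d=1 / SAMPLE_RATE)[np.argmax(spectrum)]
        idx = int((peak_hz - BAND_START) // CHUNK_HZ)
        symbols.append(ALPHABET[max(0, min(idx, len(ALPHABET) - 1))])
    return "".join(symbols)

print(decode(encode("adcx")))   # round-trips the 4-symbol beacon ID
```

A real framework would add noise robustness, synchronization, and error detection on top of this; the point is just that encoding and decoding need nothing beyond ordinary speakers, microphones, and an FFT.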
  • 11:00 - 11:06
    But this is a basic principle. We did some
    experiments and we found out that within
  • 11:06 - 11:14
    seven meters, you get pretty good accuracy
    and a low error rate. Of course, this depends
  • 11:14 - 11:20
    on exactly how you encode things. But with
    applications found on Google Play, this
  • 11:20 - 11:25
    worked up to seven meters. Um, we couldn't
    find computer speakers that were not able
  • 11:25 - 11:33
    to emit near ultrasound frequencies and
    work with this technology. And, as is
  • 11:33 - 11:37
    pretty well known for these kinds of frequencies,
    they cannot penetrate through physical
  • 11:37 - 11:41
    objects, but this is not a problem for
    their purposes. And we did some
  • 11:41 - 11:47
    experiments with our research assistant
    and we can say that they are audible to
  • 11:47 - 11:54
    animals. So if you combine cross device
    tracking and ultrasound beacons, you get
  • 11:54 - 12:02
    ultrasound cross-device tracking. Now, what
    you can do with this... this is a
  • 12:02 - 12:07
    pretty good idea, actually, because it
    offers high accuracy and you don't ask the
  • 12:07 - 12:17
    users to log in, which is a very
    demanding thing to ask for. You can embed
  • 12:17 - 12:23
    those beacons in websites or TV ads. This
    technology, however, requires some
  • 12:23 - 12:26
    sort of sophisticated backend
    infrastructure. We're going to see more
  • 12:26 - 12:30
    about it later. And you also need a
    network of publishers who are willing to
  • 12:30 - 12:37
    incorporate beacons in their
    content, whatever this content is. And
  • 12:37 - 12:41
    then, of course, you need an ultrasound
    cross device tracking framework that is going
  • 12:41 - 12:47
    to run on the user's mobile device, a
    smartphone. So these frameworks are
  • 12:47 - 12:53
    essentially shipped as advertising SDKs that
    developers can use to display
  • 12:53 - 12:57
    ads in their free apps. So it's not that
    the developers deliberately incorporate
  • 12:57 - 13:04
    the ultrasound framework; they
    incorporate an advertising SDK with
  • 13:04 - 13:10
    varying degrees of understanding of what
    it does. So here is how ultrasound cross device
  • 13:10 - 13:16
    tracking works. On step one, basically, we
    have the advertising client. He just wants
  • 13:16 - 13:20
    to advertise his products. He
    goes to the ultrasound cross-device
  • 13:20 - 13:25
    tracking provider, who has the
    infrastructure, and sets up a campaign,
  • 13:25 - 13:32
    and the provider associates a unique
    ultrasound beacon with this campaign and
  • 13:32 - 13:38
    then pushes this beacon to content
    publishers, who
  • 13:38 - 13:44
    incorporate it into their content, depending
    on what the advertising client
  • 13:44 - 13:49
    is trying to achieve. So this is step
    three. On step four, a user is basically
  • 13:49 - 13:57
    accessing one of those pieces of content,
    either a TV ad or a website on
  • 13:57 - 14:03
    the Internet, and once this
    content is loaded or displayed by your TV,
  • 14:03 - 14:08
    at the same time the device's
    speakers are emitting the ultrasounds. And
  • 14:08 - 14:14
    if you have the ultrasound cross device tracking
    framework on your phone, which is usually
  • 14:14 - 14:19
    listening in the background, then it picks
    up the ultrasound and, on step six,
  • 14:19 - 14:25
    submits it back to the service provider,
    which now knows that, OK, this guy has
  • 14:25 - 14:32
    watched this TV ad or whatever it is, and
    I'm going to add this to his profile and
  • 14:32 - 14:38
    push targeted ads back to his device.
    So, of course, by doing this, they're just
  • 14:38 - 14:46
    trying to improve their
    conversion rate and get more customers.
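The six steps just described can be condensed into a toy end-to-end simulation. Every class and field name here is illustrative, not taken from any real provider's API:

```python
import itertools

class Provider:
    """Toy stand-in for the uXDT provider's backend (illustrative only)."""

    def __init__(self):
        self._ids = itertools.count(1)
        self.campaigns = {}   # beacon ID -> campaign name (steps 1-2)
        self.profiles = {}    # device ID -> campaigns seen (step 7)

    def create_campaign(self, name: str) -> str:
        """Steps 1-3: register a campaign and hand the advertising client
        a unique beacon ID for publishers to embed in their content."""
        beacon_id = f"beacon-{next(self._ids):04d}"
        self.campaigns[beacon_id] = name
        return beacon_id

    def report(self, device_id: str, beacon_id: str):
        """Step 6: the phone-side framework submits a beacon it heard;
        the provider links the sighting to the device's profile."""
        campaign = self.campaigns.get(beacon_id)
        if campaign is not None:
            self.profiles.setdefault(device_id, set()).add(campaign)
        return campaign

provider = Provider()
bid = provider.create_campaign("shoe-ad-tv-spot")
# Steps 4-5: the user watches the ad; the phone's framework decodes the
# inaudible beacon from the audio and reports it.
provider.report("phone-123", bid)
print(provider.profiles)  # {'phone-123': {'shoe-ad-tv-spot'}}
```

In reality, step 6 would be an HTTPS request from the in-app SDK and the profile store would be far richer, but the linking logic is essentially this lookup.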
  • 14:46 - 14:53
    Another use of ultrasounds currently in
    practice is proximity marketing, so venues
  • 14:53 - 14:59
    basically set up multiple
    ultrasound emitters. This is kind of a fancy
  • 14:59 - 15:05
    name for speakers, and this is the
    nice thing about the ultrasound. You just
  • 15:05 - 15:11
    need speakers. So they put this in
    multiple locations in their venue, either
  • 15:11 - 15:18
    a supermarket or a stadium, for instance,
    and then there is a customer app. If you're
  • 15:18 - 15:23
    a supermarket, there is a supermarket app.
    If you're an NBA team, which we'll see
  • 15:23 - 15:30
    later, you have this fun application that
    the fans of your team can download
  • 15:30 - 15:35
    and install on their smartphones. And then
    this app listens in
  • 15:35 - 15:41
    the background, picks up the
    ultrasounds, and submits them back to the
  • 15:41 - 15:48
    company. So the main purpose of this is
    basically to study in-store user
  • 15:48 - 15:55
    behavior, provide real
    time notifications like, OK, you are in
  • 15:55 - 16:00
    this aisle of the supermarket, but if you
    just walk two meters down, you're going to
  • 16:00 - 16:06
    see this product at a discount. Or the third
    point, which kind of incentivizes the
  • 16:06 - 16:11
    users more, is basically you're offering
    reward points for users visiting your
  • 16:11 - 16:18
    store. And actually there is a product
    doing exactly that on the market. So some
  • 16:18 - 16:24
    other uses are device pairing. And this
    basically relies on the fact that
  • 16:24 - 16:29
    ultrasounds do not penetrate through
    objects. So if you have a smart TV, say,
  • 16:29 - 16:37
    or a Chromecast, for instance, it can
    emit a random PIN through ultrasound. Your
  • 16:37 - 16:41
    device picks it up and submits it back to
    the device through the Internet. And now
  • 16:41 - 16:44
    you've proved that you are in the same
    physical location as the Chromecast
  • 16:44 - 16:51
    or whatever your TV is. Also, Google
    recently acquired SlickLogin. They are
  • 16:51 - 16:56
    also using ultrasounds for authentication.
    It's not entirely clear what their product
  • 16:56 - 17:00
    is about, though. And also you have
    audience measurement and analytics. So
  • 17:00 - 17:07
    what they are doing is basically, if
    you incorporate multiple beacons in the
  • 17:07 - 17:12
    ad, then you can basically track the
    reactions and the behavior
  • 17:12 - 17:18
    of the audience, in the sense that,
    first, you know how many people have
  • 17:18 - 17:21
    watched your ad; second, you know what
    happened. So if they switched channels in
  • 17:21 - 17:27
    between, their device submits only the
    first beacon of the two, if you have two,
  • 17:27 - 17:34
    and then you also track their behavior. OK, so
    we've seen all these technologies and then
  • 17:34 - 17:41
    we started wondering how secure is that
    thing? Like, OK, what security measures
  • 17:41 - 17:46
    are applied by the companies and
    everything? So I'm going to immediately
  • 17:46 - 17:51
    start with the exploitation of the
    technology. So to do that, we just need
  • 17:51 - 17:59
    a computer with speakers and the Tor browser,
    a smartphone with an ultrasound app,
  • 17:59 - 18:03
    and a state-level adversary. I'm going to
    say more about the state-level adversary
  • 18:03 - 18:12
    later, but just keep in mind that it's in
    the Tor threat model. So, I have a
  • 18:12 - 18:15
    video of the attack. I'm going to stop it,
    I'm going to pause it in different places
  • 18:15 - 18:25
    to explain some more stuff. Yeah, OK, so
    I'm going to set up the scene before that.
  • 18:25 - 18:28
    So let's make the assumption that we have
    a whistleblower who wants to leak some
  • 18:28 - 18:34
    documents to a journalist, but he doesn't
    know that the journalist is working with
  • 18:34 - 18:39
    the government, and the journalist's main intent is
    basically to deanonymize him. So the
  • 18:39 - 18:43
    journalist does the following: he asks the
    whistleblower to upload the documents to a
  • 18:43 - 18:48
    Tor hidden service or a website that he owns.
    And the whistleblower basically thinking
  • 18:48 - 18:55
    that he's safe to do that through Tor
    loads the page. So now I have the
  • 18:55 - 19:07
    demo, which implements
    exactly that scenario. So the
  • 19:07 - 19:13
    whistleblower opens the Tor browser. The setup is
    the following: we have the phone next to
  • 19:13 - 19:17
    the computer. This can be up to seven
    meters away, but for practical purposes
  • 19:17 - 19:21
    here, it's next to the computer. So we
    have the Tor browser. What are we going to do
  • 19:21 - 19:29
    first? For the purposes of the demo, we use
    a smartphone listening framework that's
  • 19:29 - 19:37
    visible to the user; this is
    basically just for the demo. Real ultrasound
  • 19:37 - 19:41
    cross-device tracking apps run in the background.
    So now we set it to listening
  • 19:41 - 19:46
    mode so that it starts listening. Of
    course, in a normal framework, the user
  • 19:46 - 19:53
    doesn't have to do that part. But we want
    to show what's
  • 19:53 - 20:03
    happening. So now the whistleblower is
    going to load the innocuous web page
  • 20:03 - 20:13
    suggested by the journalist and see what
    happens. OK, now we've loaded the page
  • 20:13 - 20:20
    and the phone is listening in reality in
    the background, so let's see what happens.
  • 20:31 - 20:36
    OK, this looks pretty bad. We have lots
    of information about the user visiting our
  • 20:36 - 20:44
    hidden service. I assume you already have some
    clues about how this happened. The
  • 20:44 - 20:55
    information that we have is the following.
    First of all, we have his IP address,
  • 20:55 - 21:02
    phone number (don't call this phone
    number, it isn't real), the IMEI
  • 21:02 - 21:11
    and his Google account email, and
    his location, of
  • 21:11 - 21:15
    course. And this is enough to say that we
    essentially deanonymized him; even if we
  • 21:15 - 21:22
    had only the IP address, that would have been
    enough. So before I explain exactly how
  • 21:22 - 21:26
    the attack worked, I'm going to introduce
    some tools that the attackers have at
  • 21:26 - 21:32
    their disposal. The first one is beacon
    injection. So what you can essentially do
  • 21:32 - 21:37
    is basically craft your own ultrasound
    beacons and push them to devices,
  • 21:37 - 21:41
    listening for beacons, and then their
    devices are going to treat them like valid
  • 21:41 - 21:45
    beacons and submit them back to the
    company's backend. Along the same
  • 21:45 - 21:50
    lines, you can also replay
    ultrasound beacons, meaning that you can
  • 21:50 - 21:55
    capture them from various locations. And this
    is actually happening in the wild at a
  • 21:55 - 22:04
    large scale for a specific application.
    And then once you capture those beacons,
  • 22:04 - 22:11
    you can replay them back to the company's
    backend through users' devices. To
  • 22:11 - 22:17
    give you a clue: there is a company that
    incentivizes users to visit stores by
  • 22:17 - 22:23
    providing them offers and points when
    they are visiting stores and people are
  • 22:23 - 22:28
    capturing the beacons and are replaying them
    back to their devices from home. So they
  • 22:28 - 22:31
    are selling the beacons through the
    Internet so that they don't have to go to
  • 22:31 - 22:40
    the actual stores. OK, the problem here is
    basically that the framework is handling
  • 22:40 - 22:43
    every beacon. It doesn't have a way to
    distinguish between the valid and
  • 22:43 - 22:48
    maliciously crafted beacons. And my favorite
    tool for the attackers is basically a beacon
  • 22:48 - 22:55
    trap, which is a code snippet that,
    once loaded, reproduces
  • 22:55 - 23:02
    one or more inaudible beacons that the
    attacker chose. This can happen in
  • 23:02 - 23:07
    lots of ways; in the demo, I use the first
    one. So you build a website and you have
  • 23:07 - 23:12
    some JavaScript there just playing the
    ultrasounds in the background. What else you can
  • 23:12 - 23:18
    do is basically use a cross-site scripting
    vulnerability. Just exploit it on any
  • 23:18 - 23:23
    random website and then you can inject
    beacons to the visitors of this website
  • 23:23 - 23:30
    or a man-in-the-middle attack, just
    adding your JavaScript snippet to the
  • 23:30 - 23:38
    user's traffic, or you send an audio
    message to the victim. So how did
  • 23:38 - 23:42
    the Tor deanonymization attack work? It's the
    following. So first the adversary needs to
  • 23:42 - 23:50
    set up a campaign, and then, once
    he captures the beacon associated with
  • 23:50 - 23:56
    that campaign, he builds a beacon trap and
    essentially, on step three, lures the user
  • 23:56 - 24:01
    to visit it. This is what the journalist
    basically did to the whistleblower in our
  • 24:01 - 24:06
    scenario. Then the user loads the
    resource. He has no idea this is possible.
  • 24:06 - 24:12
    And the beacon trap emits the ultrasound
    beacon. If your smartphone has such a
  • 24:12 - 24:17
    framework, it's going to pick it up and
    submit it back to the provider and I don't
  • 24:17 - 24:22
    know about you, but when I'm using Tor,
    I'm not connecting my phone to the
  • 24:22 - 24:26
    Internet through the Tor network. My
    phone is connected through my normal Wi-
  • 24:26 - 24:34
    Fi. So now the ultrasound service provider
    knows that, you know, this smartphone
  • 24:34 - 24:38
    device picked up that specific beacon. And
    then on step seven, basically the
  • 24:38 - 24:43
    adversary, which is a state-level adversary,
    can simply subpoena the provider for the
  • 24:43 - 24:49
    IP or other identifiers, which, from what
    we've seen, they collect plenty of them.
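The beacon trap used in step three can be as simple as a page that autoplays an inaudible audio file. Here is a sketch of generating such a file, reusing the illustrative 18-20 kHz tone plan from earlier; the specific frequencies, amplitude, and symbol length are assumptions:

```python
import math
import struct
import wave

SAMPLE_RATE = 44_100

def write_beacon_wav(path: str, freqs, symbol_seconds: float = 0.2) -> None:
    """Write back-to-back near-ultrasound tones as 16-bit mono PCM;
    an autoplaying <audio> tag pointing at this file turns a page into
    a beacon trap for any framework listening nearby."""
    n = int(SAMPLE_RATE * symbol_seconds)
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)          # 16-bit samples
        w.setframerate(SAMPLE_RATE)
        for f in freqs:
            frames = b"".join(
                struct.pack("<h",
                            int(16_000 * math.sin(2 * math.pi * f * i / SAMPLE_RATE)))
                for i in range(n)
            )
            w.writeframes(frames)

# Three symbols at assumed chunk-center frequencies near 18 kHz.
write_beacon_wav("trap.wav", [18_037.5, 18_262.5, 18_487.5])
```

The cross-site scripting and man-in-the-middle variants mentioned earlier would deliver the same audio through injected JavaScript rather than a static file.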
  • 24:49 - 24:55
    OK, so the first two elements we have
    already: the Tor browser and
  • 24:55 - 25:03
    computer, fine; a smartphone
    with an ultrasound-tracking-enabled
  • 25:03 - 25:08
    framework, fine. What about the state-
    level adversary? So we didn't have a state-
  • 25:08 - 25:13
    level adversary handy. So what we did is
    basically we redirected the
  • 25:13 - 25:19
    traffic from step six to the adversary's
    backend. And I want to stress a point
  • 25:19 - 25:29
    here. This is not a long-shot
    assumption. So what we've seen in October
  • 25:29 - 25:33
    is the following. I don't know how many of
    you realize, but AT&T was running a spy
  • 25:33 - 25:41
    program, a thing called Hemisphere, and it
    was providing paid access to governments
  • 25:41 - 25:45
    only with an administrative subpoena,
    which doesn't even need to be
  • 25:45 - 25:51
    obtained from a judge. So it's pretty easy
    for them to get access to this kind of
  • 25:51 - 25:56
    data, especially when we're talking about an IP
    address. It's very easy for them
  • 25:56 - 26:02
    to get it. So we also came up with some
    more attacks. The first one is profile
  • 26:02 - 26:08
    corruption. Advertisers really like to
    build profiles about you, your interests
  • 26:08 - 26:15
    and your behavior. So what you can
    basically do is inject beacons
  • 26:15 - 26:21
    into other people's phones, or even your own,
    and then you can malform their profiles.
  • 26:21 - 26:28
    The exact impact of this attack depends
    on how the backend of the advertising
  • 26:28 - 26:33
    company and the infrastructure works, but
    the attack is definitely possible. And
  • 26:33 - 26:40
    then there is the information leakage attack,
    which works under a similar assumption. You
  • 26:40 - 26:45
    can eavesdrop on beacons and replay
    them on your own phone to make your
  • 26:45 - 26:50
    profile similar to that of the victims.
    And then based on how recommendation
  • 26:50 - 26:56
    systems work, you're very likely to get
    similar ads and similar content to that
  • 26:56 - 27:01
    of the victims. So of course, this also
    depends on exactly how the
  • 27:01 - 27:07
    recommendation system is implemented, but
    it's definitely possible. OK, so we've
  • 27:07 - 27:12
    seen certain things that make us think
    that, OK, the ecosystem is not very
  • 27:12 - 27:19
    secure. We tried to find out exactly why
    this happened. So we did a security
  • 27:19 - 27:25
    evaluation and came up with four points.
    The first one is that we
  • 27:25 - 27:32
    realized that the threat model of
    ultrasound beacons is inaccurate. Second, none
  • 27:32 - 27:39
    of the implementations we've seen had any
    security features. Third, they violated
  • 27:39 - 27:44
    a fundamental security principle, and fourth,
    they lacked transparency
  • 27:44 - 27:49
    when it came to the user interface. So let's
    go through them one by one. So, inaccurate
  • 27:49 - 27:53
    threat model. Basically, what they do is
    they rely on the fact that
  • 27:53 - 27:58
    ultrasounds cannot penetrate the walls and
    they travel up to seven meters reliably.
  • 27:58 - 28:06
    However, as a matter of fact,
    they also assume that you cannot capture
  • 28:06 - 28:11
    and replay beacons because of that.
    But what's happening in
  • 28:11 - 28:15
    practice is that you can get really close
    using beacon traps. So their assumption
  • 28:15 - 28:22
    is not that accurate. Also, the
    security capabilities of beacons are
  • 28:22 - 28:30
    heavily constrained by the low bandwidth
    of the channel, the limited time that
  • 28:30 - 28:34
    you have to reach the users. So if someone
    is in a supermarket, he's not going to
  • 28:34 - 28:37
    stop somewhere for a very long time. So
    you have limited time and a noisy
  • 28:37 - 28:42
    environment. So you want a very low error
    rate, and adding crypto to the beacons
  • 28:42 - 28:49
    may not be a good idea. But this also
    results in replay and
  • 28:49 - 28:54
    injection attacks being possible. We
    also have the violation of
  • 28:54 - 29:00
    the principle of least privilege.
    So what happens is basically all these
  • 29:00 - 29:05
    apps need full access to the microphone.
    And based on the way it works, it's
  • 29:05 - 29:10
    completely unnecessary for them to gain
    access to the audible frequencies.
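To make that point concrete, here is a rough sketch of what handing an app only the ultrasound part of the spectrum could look like; it uses naive FFT zeroing rather than a proper real-time filter, and the 18 kHz cutoff is an assumption:

```python
import numpy as np

SAMPLE_RATE = 44_100
CUTOFF_HZ = 18_000.0   # assumed boundary between audible and near-ultrasound

def ultrasound_only(samples: np.ndarray) -> np.ndarray:
    """Zero every spectral component below the cutoff, so speech becomes
    unrecoverable while beacons in the 18-20 kHz band pass through."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1 / SAMPLE_RATE)
    spectrum[freqs < CUTOFF_HZ] = 0
    return np.fft.irfft(spectrum, n=len(samples))

t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
speech = np.sin(2 * np.pi * 440 * t)        # stand-in for audible content
beacon = np.sin(2 * np.pi * 19_000 * t)     # stand-in for a beacon tone
filtered = ultrasound_only(speech + beacon)
# The 440 Hz component is removed; the 19 kHz component passes through.
```

An OS-level permission could expose exactly this kind of filtered stream, so an analytics SDK never needs raw microphone access.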
  • 29:10 - 29:15
    However, even if they want to, there's no
    way to gain access only to the ultrasound
  • 29:15 - 29:21
    spectrum, both in Android and iOS. You
    have to gain either access to the whole
  • 29:21 - 29:27
    spectrum or no access at all. So this, of
    course, results in the following: first, malicious
  • 29:27 - 29:32
    developers can at any time start using
    their access to the microphone. And of
  • 29:32 - 29:39
    course, all the benign ultrasound-enabled
    apps are perceived as malicious by the
  • 29:39 - 29:45
    users. And I'll actually say more
    about it later. So, lack of transparency.
  • 29:45 - 29:51
    This is a bad combination with
    what we've seen previously,
  • 29:51 - 29:56
    because we've observed large
    discrepancies between apps when it comes
  • 29:56 - 30:01
    to informing the users and also lots of
    discrepancies when it comes to providing
  • 30:01 - 30:06
    opt out options. And there is a conflict
    of interest there, because if you're a
  • 30:06 - 30:13
    framework developer, you want
    to advise for proper practices for your
  • 30:13 - 30:18
    customers, but you're not
    going to enforce them or kind of blackmail
  • 30:18 - 30:22
    them. Either you do it properly or you're
    not using my framework. So there is a
  • 30:22 - 30:27
    conflict of interest there. So what
    happened because of a lack of
  • 30:27 - 30:33
    transparency is the following. Signal360 is
    one of those frameworks. An NBA team
  • 30:33 - 30:40
    started using this in May. And then a few
    months after, there is a lawsuit, and someone
  • 30:40 - 30:44
    claims, you know, that thing is listening
    in the background. And what's interesting
  • 30:44 - 30:49
    is on the claim, what they are saying is,
    OK, I gave permission through the Android
  • 30:49 - 30:54
    permission system for them to access the
    microphone, but it was not explained to me
  • 30:54 - 30:59
    exactly what they were doing. And this is
    closely tied to what the FTC was saying
  • 30:59 - 31:09
    in the letter a few months ago. Also,
    again, the same story, um, football team
  • 31:09 - 31:14
    starts using such a framework, and a few months
    later people are complaining that they are
  • 31:14 - 31:22
    being eavesdropped on. Um, I think what
    happened here is that when the team was
  • 31:22 - 31:26
    playing a match, the application started
    listening for ultrasounds, but not all
  • 31:26 - 31:30
    your fans are going to be in the stadium,
    so you end up listening for ultrasounds in
  • 31:30 - 31:37
    a church and other places. So, yeah,
    people were also pissed. Um, OK, just to
  • 31:37 - 31:42
    put it into perspective how prevalent
    these technologies are, the ecosystem is
  • 31:42 - 31:48
    growing. Even though that one company
    withdrew, other companies in the
  • 31:48 - 31:55
    ecosystem are coming up with new products
    as well. So the number of users is
  • 31:55 - 32:00
    relatively low, but it's also very hard to
    estimate right now. We could find around
  • 32:00 - 32:05
    10 companies offering ultrasound related
    products, and the majority of them are
  • 32:05 - 32:10
    gathered around proximity marketing. There
    was only one company doing ultrasound
  • 32:10 - 32:17
    cross device tracking. At least we found
    one. Um, and this is mainly due to
  • 32:17 - 32:21
    infrastructure complexity. It's not easy
    to do all those things. And secondly, I
  • 32:21 - 32:26
    also believe that the whole backlash from
    the security community has disincentivized
  • 32:26 - 32:33
    other companies from joining because they
    don't want a tarnished reputation. OK, so
  • 32:33 - 32:37
    we have this situation right now.
    Companies are using ultrasound. What are
  • 32:37 - 32:48
    we going to do? So this was our initial
    idea. This is what we thought first. But
  • 32:48 - 32:55
    we want to fix things. So we tried to come
    up with certain steps that we need to take
  • 32:55 - 33:02
    to actually fix that thing and make it
    usable, but not dangerous. Um, so we
  • 33:02 - 33:07
    listed what's wrong with it. We did it
    already. We developed some quick fixes
  • 33:07 - 33:12
    that I'm going to present later and medium
    term solutions as well. And then we
  • 33:12 - 33:17
    started advocating for long term changes
    that are going to make the ecosystem
  • 33:17 - 33:24
    reliable. And we need the involvement from
    the community there. Definitely. So. We
  • 33:24 - 33:30
    developed some short and medium term
    solutions, um, the first one is a browser
  • 33:30 - 33:38
    extension. Our browser extension basically
    does the following: it's based on HTML5, the
  • 33:38 - 33:46
    Web audio API. Um, it filters all audio
    sources and places a filter between the
  • 33:46 - 33:51
    audio source and the destination on the
    Web page and filters out ultrasounds. To
  • 33:51 - 33:55
    do that, we use a high-shelf filter that
    attenuates all frequencies above 18kHz
  • 33:55 - 34:05
    and it works pretty reliably. And
    we leave all audible frequencies intact.
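    The filtering idea can be sketched as follows. This is a rough
    illustration only, not the extension's actual code: the cutoff,
    sample rate, cascade depth and the use of a low-pass (rather than
    high-shelf) biquad are all assumptions made for the sketch.

```python
import cmath
import math

def lowpass_biquad(fs, f0, q=0.7071):
    """Standard low-pass biquad coefficients, normalized so a[0] == 1."""
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b = [(1 - math.cos(w0)) / 2, 1 - math.cos(w0), (1 - math.cos(w0)) / 2]
    a = [1 + alpha, -2 * math.cos(w0), 1 - alpha]
    return [bi / a[0] for bi in b], [ai / a[0] for ai in a]

def gain(b, a, f, fs):
    """Magnitude response of one biquad stage at frequency f."""
    z = cmath.exp(-2j * math.pi * f / fs)
    num = b[0] + b[1] * z + b[2] * z * z
    den = a[0] + a[1] * z + a[2] * z * z
    return abs(num / den)

fs = 48_000                        # assumed sample rate
b, a = lowpass_biquad(fs, 18_000)  # attenuate everything above ~18 kHz
stages = 4                         # cascading stages steepens the roll-off

speech = gain(b, a, 1_000, fs) ** stages   # audible band: passes ~unchanged
beacon = gain(b, a, 20_000, fs) ** stages  # ultrasound band: suppressed
print(f"gain at 1 kHz:  {speech:.3f}")
print(f"gain at 20 kHz: {beacon:.3f}")
```

    In the browser, the same effect is achieved by wiring a filter node
    between each audio source and the destination via the Web Audio API;
    the numbers above only show that a cascade can suppress the beacon
    band while leaving speech frequencies essentially untouched.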
  • 34:05 - 34:10
    But it's not going to work with
    obsolete legacy technologies such as
  • 34:10 - 34:17
    flash. OK, we also have an Android
    permission. I think this is a somewhat more
  • 34:17 - 34:23
    medium term solution. What we did is we
    developed a patch for the
  • 34:23 - 34:29
    Android permission system. This allows for
    fine grained control over the audio channel,
  • 34:29 - 34:35
    it basically separates the permission needed
    for listening to audible sound and the
  • 34:35 - 34:40
    permission needed for listening to the
    ultrasound spectrum. So at least we force the
  • 34:40 - 34:45
    applications to specifically declare that
    they are going to listen for
  • 34:45 - 34:49
    ultrasounds. And of course, users can, on
    the latest Android versions, also
  • 34:49 - 34:54
    disable this permission and it can act as
    an opt out option if the app is not
  • 34:54 - 35:03
    providing it. We also initiated discussion
    on the Tor bug tracker, but, um, we have,
  • 35:03 - 35:09
    um, we are advocating for some long term
    solutions, so we really need some
  • 35:09 - 35:16
    standardization here. Um, let's agree on
    an ultrasound beacon format and decide what
  • 35:16 - 35:20
    security features can be there. I mean, we
    need to figure out what's technically
  • 35:20 - 35:25
    possible there because it's not clear. And
    then once we have a standard, we can start
  • 35:25 - 35:32
    building some APIs. And the APIs are a very
    nice idea because, um, they will work as
  • 35:32 - 35:37
    the Bluetooth APIs work, meaning that they
    will provide some methods to discover,
  • 35:37 - 35:42
    process, generate and emit the sound
    beacons through a new API related
  • 35:42 - 35:49
    permission. And this means that we will
    stop having overprivileged apps. We won't
  • 35:49 - 35:54
    need access to the microphone anymore,
    which is a huge problem right now. And of
  • 35:54 - 35:59
    course, the applications will not be
    considered spying anymore. And there is
  • 35:59 - 36:04
    also another problem that we found out
    while we were playing with those apps.
  • 36:04 - 36:08
    Um, if you have a framework listening
    through the microphone, other apps cannot
  • 36:08 - 36:12
    access it. So we were trying to open the
    camera app to record a video in the app.
  • 36:12 - 36:17
    The camera app was crashing because the framework
    was locking the access to the
  • 36:17 - 36:22
    microphone. Now we may have some
    developers from frameworks saying, you
  • 36:22 - 36:26
    know, I'm not going to use your API. I'm
    going to keep asking for access to the
  • 36:26 - 36:34
    microphone. But we can force them to use
    this API if we somehow, um, by default
  • 36:34 - 36:39
    filter out the ultrasound frequencies
    from the microphone and
  • 36:39 - 36:45
    provide a way for the user to enable them
    on a per-application basis from his
  • 36:45 - 36:56
    phone. OK, so. Here's what we did, um, we
    analyzed multiple ultrasound
  • 36:56 - 37:00
    tracking technologies, we saw what's
    out there in the real world and reverse
  • 37:00 - 37:08
    engineered such frameworks. We identified,
    um, quite a few security shortcomings. We
  • 37:08 - 37:16
    introduced our attacks and proposed some,
    um, usable countermeasures. Um, and
  • 37:16 - 37:22
    hopefully we initiated the discussion
    about standardizing ultrasound.
  • 37:22 - 37:28
    Um, but there are still things left to do.
    So for the application developers, please,
  • 37:28 - 37:33
    um, explicitly notify the users about what
    your app is doing. Many of them would
  • 37:33 - 37:41
    appreciate knowing that. Um, also, we need
    to improve transparency in the data
  • 37:41 - 37:47
    collection process, because they're collecting
    lots of data and very little information was
  • 37:47 - 37:52
    available about what kind of data the
    frameworks collect. Um, we also think
  • 37:52 - 37:57
    it's a good idea to have an opt in option
    if it's not too much to ask, at least an
  • 37:57 - 38:08
    opt out and standard security practices,
    um, as always. So framework providers
  • 38:08 - 38:14
    basically need to make sure that the
    developers inform the users and also make
  • 38:14 - 38:21
    sure that the users consent regularly to
    listening for because like it's not enough
  • 38:21 - 38:26
    if you consent once and then a month after
    the app is still listening for ultrasound beacons
  • 38:26 - 38:33
    you have to periodically ask the user if it's
    still okay to do that. Um. Ideally, every time
  • 38:33 - 38:40
    you are going to listen and then, of
    course, we need to work on standardizing
  • 38:40 - 38:44
    ultrasound because this is going to be a
    long process and then building the
  • 38:44 - 38:48
    specialized API. Hopefully
    this is going to be easier once we have a
  • 38:48 - 38:57
    standard and see what kind of
    authentication mechanisms we can have in
  • 38:57 - 39:04
    this kind of constrained transmission
    channel. So..
  • 39:04 - 39:17
    applause
  • 39:17 - 39:21
    Herald: Thank you Vasilios. If you have any
    questions, please do line up at the four
  • 39:21 - 39:27
    microphones here in the walkways and the
    first question will be the front
  • 39:27 - 39:31
    microphone here.
    Mic: Hello and thank you for your
  • 39:31 - 39:35
    presentation. And I have a couple of
    questions to ask that are technical and
  • 39:35 - 39:41
    they are very related. First of all, do
    you think that blocking out at the system
  • 39:41 - 39:48
    level the high frequencies for either the
    microphone or the speakers as well is
  • 39:48 - 39:53
    something that is technically feasible and
    will not put a very high latency in the
  • 39:53 - 39:57
    processing?
    Vasilios: So we did that through the
  • 39:57 - 39:59
    permission. You are talking
    about the smartphone right?
  • 39:59 - 40:04
    Mic: Yeah, basically, because you have to
    have a real time sound and microphone
  • 40:04 - 40:07
    feedback.
    Vasilios: So we did that with the
  • 40:07 - 40:14
    permission. And I think it's not
    too resource demanding, if that's
  • 40:14 - 40:17
    your question. So it's
    definitely possible to do that.
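    The split permission he describes can be sketched as a toy model.
    The permission names and the delivery function below are invented
    for illustration; this is not Android's actual permission API, only
    the logic the patch argues for: audio reaching an app is stripped of
    ultrasound unless a separate permission is held.

```python
# Toy model of a split audio permission (all names invented; this is
# not the real Android permission system, just the proposed logic).
RECORD_AUDIO = "RECORD_AUDIO"            # audible spectrum
RECORD_ULTRASOUND = "RECORD_ULTRASOUND"  # components above the cutoff

CUTOFF_HZ = 18_000  # boundary between audible and ultrasound bands

def deliver_audio(frequencies_hz, granted_permissions):
    """Return the spectral components an app is allowed to receive."""
    if RECORD_AUDIO not in granted_permissions:
        return []  # no microphone access at all
    if RECORD_ULTRASOUND in granted_permissions:
        return list(frequencies_hz)  # app declared ultrasound use
    # Default: strip the ultrasound band before delivery.
    return [f for f in frequencies_hz if f <= CUTOFF_HZ]

captured = [440, 1_000, 18_500, 20_000]  # mixed speech and beacon tones

assert deliver_audio(captured, {RECORD_AUDIO}) == [440, 1_000]
assert deliver_audio(captured, {RECORD_AUDIO, RECORD_ULTRASOUND}) == captured
assert deliver_audio(captured, set()) == []
```

    The point of the design is that an app must opt in to the ultrasound
    band explicitly, and the user can revoke that one permission without
    losing ordinary microphone functionality.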
  • 40:17 - 40:22
    Mic: And the second part is, so
    there is a new market maybe for some
  • 40:22 - 40:28
    companies producing microphones and
    speakers that explicitly block out
  • 40:28 - 40:34
    ultrasounds, right?
    Vasilios: Possibly. Possibly. Um, I'm not
  • 40:34 - 40:39
    sure if you can do this from the
    application level. We developed patches for
  • 40:39 - 40:44
    the Android system. I think our first
    approach back then was basically to try to
  • 40:44 - 40:48
    build an app to do that from the
    application, from the user land. And
  • 40:48 - 40:53
    basically, I'm not sure if you can; I actually
    doubt that in Android you can filter out
  • 40:53 - 40:59
    ultrasounds. But from a browser, we have
    our extension. It works on Chrome. You can
  • 40:59 - 41:04
    easily use our code to do the
    same thing on Firefox.
  • 41:04 - 41:07
    Mic: Thanks.
    Herald: The next question is from the
  • 41:07 - 41:10
    front right microphone.
    Mic: Thank you for your talk. I have a
  • 41:10 - 41:15
    question about the attack requirements
    against the whistleblower using Tor.
  • 41:15 - 41:24
    I'm curious, the attacker has access to
    the app on the smartphone and also access
  • 41:24 - 41:33
    to the smartphone microphone. Wouldn't the
    attacker then be able to just listen in on
  • 41:33 - 41:37
    the conversation of the whistleblower and
    thereby identify him?
  • 41:37 - 41:41
    Vasilios: Yeah, absolutely. Absolutely.
    It's a major problem. The problem is that
  • 41:41 - 41:48
    they have access to the microphone. So
    this is very real and it's
  • 41:48 - 41:53
    not going to be resolved even if we had
    access only to the ultrasound spectrum.
  • 41:53 - 41:57
    What we're saying is basically, if we only
    had access to the ultrasound spectrum,
  • 41:57 - 42:05
    you're still uhm you are still vulnerable
    to these attacks unless you incorporate
  • 42:05 - 42:10
    some crypto mechanisms that prevent these
    things from happening. Is this your
  • 42:10 - 42:16
    question or?
    Mic: Um, well, I can still pull off the
  • 42:16 - 42:19
    same attack if I don't
    use ultrasound right?
  • 42:19 - 42:22
    Vasilios: Through the audible spectrum?
    Mic: Yes,
  • 42:22 - 42:29
    Vasilios: You can absolutely do that. There is
    one company doing tracking in the audible
  • 42:29 - 42:36
    spectrum. This is much harder to mitigate.
    We're looking into ways to do it, but
  • 42:36 - 42:39
    there are so many ways to incorporate
    beacons in the audible spectrum. The thing
  • 42:39 - 42:47
    is that there is not much of an ecosystem
    in this area right now, so you don't
    have as many frameworks there
    have lots of frameworks are there as many
    as you have for ultrasounds.
  • 42:53 - 42:56
    Mic: Thank you.
    Herald: Our next question will be from
  • 42:56 - 43:01
    the Internet via our signal angel
    Signal Angel: $Username is asking, have
  • 43:01 - 43:08
    you heard about exploiting parasitic
    ultrasound emitters like IC components?
  • 43:08 - 43:10
    Vasilios: Can you please
    repeat the question?
  • 43:10 - 43:15
    Signal Angel: Yes, sure. The question is,
    can you use other components on the main
  • 43:15 - 43:24
    board or maybe the hard disk to emit
    ultrasounds and then broadcast the beacon
  • 43:24 - 43:29
    via this?
    Vasilios: Uh. So that's a
  • 43:29 - 43:35
    very good question. The answer is I don't
    know, possibly, and it's very scary. Um,
  • 43:35 - 43:42
    hopefully not, but I doubt it. I think
    there should be a way to do it. Um, maybe
  • 43:42 - 43:47
    the problem is that you cannot do this
    in a completely inaudible way.
  • 43:47 - 43:52
    Like, you may be able to emit ultrasounds,
    but you will also emit some sort of sound
  • 43:52 - 43:58
    in the audible spectrum so that the user
    will know that something is going on.
  • 43:58 - 44:03
    Herald: The next question
    from the left microphone.
  • 44:03 - 44:07
    Mic: Thank you for your talk and
    especially thanks for the research. So,
  • 44:07 - 44:13
    uh, do you know of any frameworks or, uh,
    SDKs that cache the beacons they find?
  • 44:13 - 44:18
    Because for my use case, my phone is
    mostly offline. I just make it online when
  • 44:18 - 44:22
    I have to check
    something. So I'm not that concerned. But
  • 44:22 - 44:25
    do you know if they, like, cache the
    beacons and submit them later,
  • 44:25 - 44:32
    something like this?
    Vasilios: Of course they do.
    Mic: I'm not surprised, unfortunately. Yeah.
  • 44:32 - 44:39
    Thanks.
    Herald: Next question from the rear right.
    Mic: Oh, what is the data rate you can
  • 44:39 - 44:44
    send in the ultrasound?
    Vasilios: Very good question. And it's totally relevant to the
  • 44:44 - 44:51
    cryptographic mechanisms we want to
    incorporate from our experiments. Um, in
  • 44:51 - 44:58
    four seconds you can basically send like
    five to six alphabet characters if you're
  • 44:58 - 45:04
    willing to kind of reduce the range a lot
    less in less than seven meters, you may be
  • 45:04 - 45:12
    able to send more. But the standard is not
    very robust in this sense. But these
  • 45:12 - 45:16
    experiments were done with this kind of
    naive encoding that most of the companies
  • 45:16 - 45:23
    are using. So if you do the encoding in a
    very smart way, possibly you can increase
  • 45:23 - 45:29
    that.
    Mic: And a small second part: what's the
    energy consumption on the phone if that is
  • 45:29 - 45:35
    running all the time? Wouldn't I detect
    that? So it's not, uh, it's not good. We
  • 45:35 - 45:39
    saw that it was draining the battery and
    actually in the comments, I don't know if
  • 45:39 - 45:44
    I had that comment here. Some people were
    complaining that, um, I tried and it was
  • 45:44 - 45:53
    draining my battery. And, um, there is an
    impact. Absolutely.
    Mic: Amazon and Google Nest
  • 45:53 - 45:58
    and all the other products, aren't you more
    worried about that? You know, the always
  • 45:58 - 46:02
    listening thing from Google and Amazon and
    everyone is coming up with something
  • 46:02 - 46:10
    like that that's always on.
    Vasilios: So it's kind of strange, because users
  • 46:10 - 46:18
    consent. But at the same time, they don't
    completely understand. So there is a gray
  • 46:18 - 46:23
    line there. Like, you can say to the
    users, OK, you consented to that app
  • 46:23 - 46:29
    starting with your phone and
    listening in the background. But at the
  • 46:29 - 46:35
    same time, the users don't always have the best
    understanding.
    Herald: Thank you. Next
  • 46:35 - 46:39
    question from the front left microphone
    Mic: Thank you for the talk. I would be
  • 46:39 - 46:44
    interested in how you selected your real
    world applications and how many you found
  • 46:44 - 46:51
    that already use such a framework.
    Vasilios: So, what was the first part of the question?
  • 46:51 - 46:57
    Mic: How you selected your real world applications,
    from the marketplace stuff, if you had any.
  • 46:57 - 47:04
    Vasilios: So we tried to do a systematic scan of
    the whole market, but it's not easy. So we
  • 47:04 - 47:09
    were not able to do that. There are resources
    on the Internet. Luckily, the companies
  • 47:09 - 47:16
    need to advertise their product. So they
    basically publish press releases saying,
  • 47:16 - 47:22
    you know, this NBA team started using our
    product. We did some sort of scanning
  • 47:22 - 47:28
    through alternative datasets, but
    definitely we don't have an exhaustive
  • 47:28 - 47:33
    list of applications. What I can say,
    though, is that there are applications
  • 47:33 - 47:40
    using such frameworks with, if I
    remember correctly, up to one
  • 47:40 - 47:49
    million installations. One notable
    example... OK, I'm not entirely sure which one I
  • 47:49 - 47:55
    wanted, but up to a million we definitely
    saw.
    Mic: OK, thanks.
    Herald: Do we have more questions
  • 47:55 - 48:02
    from the Internet?
    Signal Angel: Yes, E.F. is asking: is
    he aware of, or are you aware, sorry, are
  • 48:02 - 48:06
    you aware of any framework available by
    Google or Apple? In other words, how do we
  • 48:06 - 48:12
    know that it's not, for instance,
    Siri snitching on us? How do we know
  • 48:12 - 48:20
    that it's not, you know, Siri, or
    maybe Alexa, snitching on us?
    Vasilios: We don't. I
  • 48:20 - 48:24
    think that's a very large
    discussion, right. So it's the same problem
  • 48:24 - 48:34
    that these companies are having. Because
    if I go back here, basically the users are
  • 48:34 - 48:44
    accusing them of eavesdropping. Especially
    here from reverse engineering those
  • 48:44 - 48:50
    frameworks, we couldn't find any such
    activity, but again, it's very hard to
  • 48:50 - 48:54
    convince the users that you are listening
    only to the ultrasound spectrum. If you're
  • 48:54 - 49:00
    accessing the whole audible frequencies
    through the microphone,
  • 49:00 - 49:04
    you will always find yourself in this
    position. So I guess it's the same problem
  • 49:04 - 49:09
    that Alexa has from Amazon. But in this
    case, you can actually solve it by
  • 49:09 - 49:15
    constraining the spectrum that you gain
    access to.
    Herald: Next question from the front
  • 49:15 - 49:21
    left microphone, please.
    Mic: Has anybody done
    an audible demonstration of these beacons,
  • 49:21 - 49:26
    perhaps by transposing them down an
    octave or two? I think it might be useful for
  • 49:26 - 49:34
    your talk or something like that.
    Vasilios: So you mean a demo, but using audible
  • 49:34 - 49:41
    frequencies? Essentially, there is this
    one company. But, well,
  • 49:41 - 49:45
    all of these companies are being pretty
    secretive with their technology. So they
  • 49:45 - 49:51
    publish what's needed for marketing
    purposes, like accuracy sometimes, but
  • 49:51 - 49:57
    very limited technical details. But apart
    from these, you have to get your hands on
  • 49:57 - 50:05
    the framework somehow and analyze it
    yourself. So for this kind of overview we
  • 50:05 - 50:08
    needed of the ecosystem, we had to do
    everything by ourselves. The
  • 50:08 - 50:16
    resources out there were very limited.
    Mic: Um, or recording it and playing it back and
  • 50:16 - 50:23
    transposing it down yourself, if you know where
    a beacon is?
    Vasilios: Possibly. I'm not
  • 50:23 - 50:32
    entirely sure you could. Yeah.
    Herald: Another
    question from our signal angel.
    Signal Angel: Mestas is
  • 50:32 - 50:38
    again asking, um, would it be possible,
    even if you have a low pass filter, to use,
  • 50:38 - 50:45
    uh, for instance, aliasing effects
    to transmit the beacon via
  • 50:45 - 50:54
    ultrasound, but in a regime which is still
    free for the app? So it's basically the
  • 50:54 - 51:00
    question: can I somehow, via aliasing, use
    an ultrasound signal to make a normal signal
  • 51:00 - 51:08
    out of it?
    Vasilios: Possibly, I don't know. I think
    you are much more creative than I am, so
  • 51:08 - 51:17
    maybe I should add more bullet points on
    this countermeasures list here. Apparently,
  • 51:17 - 51:23
    there are many more ways to do this,
    possibly like hardware emissions. This one
  • 51:23 - 51:30
    sounds like a good idea, too.
    Herald: So, next
    question from the rear right microphone.
    Mic: I
  • 51:30 - 51:34
    apologize if you explained this already and I
    didn't understand, but is sort of
  • 51:34 - 51:39
    drowning out the signals, like jamming,
    just broadcasting white noise in that
  • 51:39 - 51:44
    spectrum, an effective countermeasure? And
    as a follow up, if it is, would it
  • 51:44 - 51:57
    terrorize my dog?
    Vasilios: So absolutely, it's
    effective. I mean, this it works up to
  • 51:57 - 52:02
    seven meters, but we're not saying it's
    not fragile, so you can do that, but it's
  • 52:02 - 52:06
    noise pollution. And my dog, I don't think
    it was happy. I did it for a very limited
  • 52:06 - 52:10
    time. I could see her ears moving, but I
    don't think she would appreciate it if I
  • 52:10 - 52:17
    had the device at home doing this all the
    time.
    Herald: Do we have any more questions from
  • 52:17 - 52:22
    the Internet?
    Signal Angel: Yes, EULEX is asking: to what
    extent could we use these for our own
  • 52:22 - 52:27
    needs? For example, people in repressive
    situations, for example, activists could
  • 52:27 - 52:31
    use it to transmit secret encrypted
    messages. Are there any efforts in this
  • 52:31 - 52:41
    area?
    Vasilios: Yes, there are. People are
    developing ultrasound modems. I think
  • 52:41 - 52:51
    there is even a talk about it. And yes, of
    course there is. So I would say, yes, I'm
  • 52:51 - 52:57
    not entirely sure about the capabilities
    of this channel in terms of bandwidth, but
  • 52:57 - 53:02
    this is why we are not advocating to
    kill the technology just to make it secure
  • 53:02 - 53:07
    and know its limitations. So you can do
    good stuff with it. And this is what we
  • 53:07 - 53:14
    want.
    Herald: Next question from the rear right.
    Mic:
    Yeah, I'm wondering if you could transfer
  • 53:14 - 53:20
    that technique from the ultrasound range
    also to the audible range, for example by
  • 53:20 - 53:27
    using audio watermarks, and
    then, well, your permission thingy with
  • 53:27 - 53:32
    the ultrasound permissions would be
    ineffective and you could still track the
  • 53:32 - 53:38
    user. How about this? Is it possible?
    Vasilios: Audio watermarks in the audible spectrum? Yeah,
  • 53:38 - 53:43
    it's absolutely possible. Um, our
    countermeasures are not effective against
  • 53:43 - 53:50
    this. Um, it's just that, from our
    research, there is just one company doing this. Uh,
  • 53:50 - 53:57
    so this one, um, I think technically it's
    a bit more challenging to do that.
  • 53:57 - 54:03
    Instead, they're just emitting; they are
    doing it in a very basic way. So
  • 54:03 - 54:08
    hopefully, um, if there is a clear way to
    do it through ultrasounds, they are not
  • 54:08 - 54:15
    going to resort to the audible
    spectrum. But our countermeasures are not
  • 54:15 - 54:23
    effective against the audible, um,
    watermarks.
    Mic: Yeah, thanks.
    Herald: Next question
  • 54:23 - 54:29
    from the front left microphone.
    Mic: I've heard,
    and I don't think it's very credible, but
  • 54:29 - 54:34
    I've heard that there are some sounds in
    the subsonic spectrum. There were some
  • 54:34 - 54:41
    experiments showing that they can
    influence our mood, the mood of humans. Is
  • 54:41 - 54:48
    there any relevant information about how
    ultrasounds could affect us?
    Vasilios: So, without
  • 54:48 - 54:55
    being an expert in this particular area,
    I've read similar articles when I was
  • 54:55 - 54:59
    looking into it. I can tell you it's very
    annoying, especially if you're listening
  • 54:59 - 55:06
    to it through headphones. You cannot
    really hear the sound, but you can if
  • 55:06 - 55:12
    you're using headphones, you can feel the
    pressure. So, I don't know what kind of
  • 55:12 - 55:20
    medical condition you may develop, but you
    won't be very sane after.
    Herald: Do we have any
  • 55:20 - 55:27
    more questions?
    Signal Angel: Yes. One further question:
    um, would it be possible to, um, use a
  • 55:27 - 55:34
    jamming solution to get rid of the
    signals?
  • 55:34 - 55:38
    Vasilios: Yes, but, you know, it's going to result
    in noise pollution, but if you are being
  • 55:38 - 55:47
    paranoid about it, yes, and it's, I
    think, a straightforward thing to do.
    Herald: Any
  • 55:47 - 55:53
    more questions? One more on the front left
    microphone.
    Mic: You said that physical
  • 55:53 - 55:59
    objects will block the ultrasound. How
    solid do the physical objects need to be?
  • 55:59 - 56:05
    So, for example, does my pocket block the
    ultrasound and thus prevent my phone to
  • 56:05 - 56:12
    call the environment and vice versa? OK,
    well, that's a good question. I don't
  • 56:12 - 56:17
    think that clothes can actually do that
    unless it's very thick. Thin girls
  • 56:17 - 56:27
    definitely block it. Um. Thick glass, I
    would say it reduce the transmission rate,
  • 56:27 - 56:36
    the signal to noise ratio by a lot, but it
    could go through it, so. You need
  • 56:36 - 56:43
    something quite concrete, metal. I don't
    think it goes through it. So are there any
  • 56:43 - 56:48
    more? Doesn't look like it, maybe, maybe
    one more sorry. Oh, good signal, good bye.
  • 56:48 - 57:02
    Kitty is asking, could you name or compile
    a list of tracking programs and apps? So.
  • 57:02 - 57:07
    That's a good question. We're trying to
    make an exhaustive list and try to resolve
  • 57:07 - 57:17
    this in a systematic way. I've already
    listed two Macenta frameworks. One is the
  • 57:17 - 57:20
    Silverbush one three actually. One is the
    Silver Paswan. There is another one used
  • 57:20 - 57:33
    by single 360. So developed the signal
    360, and then there is a listener one.
  • 57:33 - 57:40
    These are very popular. Um, and then each
    developer is incorporating them into their
  • 57:40 - 57:49
    applications in different ways, offering
    varying levels of transparency for the
  • 57:49 - 57:54
    users. So it's better if you start knowing
    what the frameworks are and then trying to
  • 57:54 - 57:59
    find the applications using them, because
    you know what you're looking for in the code,
  • 57:59 - 58:06
    and you can develop some queries
    enabling you to
  • 58:06 - 58:14
    track which applications are using them.
    What we observed for SilverPush is
  • 58:14 - 58:19
    basically after the company announced that
    they are moving out of the US and because
  • 58:19 - 58:24
    of the whole backslash, maybe even before
    that, um, companies started to drop the
  • 58:24 - 58:30
    framework. So all their versions had the
    framework, but they are not using it
  • 58:30 - 58:53
    anymore. I think that's it.
    Herald: Thank you very
    much, Vasilios Mavroudis.
  • 58:53 - 59:03
    Subtitles created by c3subtitles.de
    in the year 2021. Join, and help us!
Title:
Talking Behind Your Back (33c3)
Description:

Video Language:
English
Duration:
59:04
