
Tor onion services: more useful than you think

  • 0:00 - 0:11
    32C3 preroll music
  • 0:11 - 0:16
Herald: I welcome you to “Tor Onion
    Services – more useful than you think!”
  • 0:16 - 0:19
    This talk is presented by George
  • 0:19 - 0:25
– who is a core developer of Tor and also
    a developer of the Hidden Services –
  • 0:25 - 0:30
by David Goulet – who is a developer
    of the Tor Hidden Services –
  • 0:30 - 0:35
and by Roger, who is a founder of the
    Tor Project, an MIT graduate, and
  • 0:35 - 0:38
someone whom Foreign Policy Magazine
    has named one of the
  • 0:38 - 0:43
Top 100 Global Thinkers.
    I think that speaks for itself.
  • 0:43 - 0:50
    applause
  • 0:50 - 0:54
    Today we will hear examples of Hidden
    Services for really cool use cases,
  • 0:54 - 0:58
but we will also hear about security
    fixes that make Hidden Services
  • 0:58 - 1:01
    even safer and
    stronger for all of us.
  • 1:01 - 1:06
The stage is open for “Tor Onion Services –
    more useful than all of us think”
  • 1:06 - 1:11
    applause
    Roger: Great!
  • 1:11 - 1:15
    Hi, I’m going to pause while
    we get our slides up, I hope…
  • 1:15 - 1:18
    Hopefully that will be a quick
    and easy event – perfect!
  • 1:18 - 1:22
    Okay, so. Hi, I’m Roger,
    this is George, this is David.
  • 1:22 - 1:27
We’re going to tell you about Tor Hidden
    Services, or Tor Onion Services.
  • 1:27 - 1:31
They’re basically synonyms; originally
    they were called Tor Hidden Services
  • 1:31 - 1:36
    because the original idea was that
    you hide the location of the service;
  • 1:36 - 1:40
    and then a lot of people started
    using them for other features,
  • 1:40 - 1:44
    other security properties, so we’ve been
    shifting to the name Onion Services.
  • 1:44 - 1:46
    So we’ll switch back and
    forth in what we call them.
  • 1:46 - 1:51
    So, a spoiler before we start.
    This is not a talk about the dark web,
  • 1:51 - 1:54
    there is no dark web.
    It’s just a couple of
  • 1:54 - 1:59
    websites out there that are protected
    by other security properties.
  • 1:59 - 2:03
    So we’ll talk a lot more about
    that. You can think of it as:
  • 2:03 - 2:09
    HTTPS is a way of getting encryption and
    security when you’re going to a website
  • 2:09 - 2:13
    and .onion is another way of getting
    encryption and security when you’re
  • 2:13 - 2:18
    going to a website. So, journalists
    like writing articles about the
  • 2:18 - 2:23
huge iceberg where 95% of
    the web is the dark web.
  • 2:23 - 2:26
    That’s nonsense! So we’re gonna try
    and tell you a little bit more about
  • 2:26 - 2:31
what Hidden Services and
    Onion Services actually are, and
  • 2:31 - 2:35
    who uses them and what
    things they can do with them.
  • 2:35 - 2:38
    How many people here know
    quite a bit about how Tor works
  • 2:38 - 2:41
    and what Tor is and so on? I’m
    hoping everybody raises their hand!
  • 2:41 - 2:44
    Awesome, okay. We can
    skip all of that, perfect!
  • 2:44 - 2:47
So Tor is a US non-profit,
  • 2:47 - 2:52
    it’s open-source, it’s a
    network of about 8000 relays.
  • 2:52 - 2:56
One of the fun things about Tor is the
    community of researchers, and developers,
  • 2:56 - 3:00
    and users, and activists, and
    advocates, all around the world.
  • 3:00 - 3:04
    Every time I go to a new city, there’s
    a research group at the university
  • 3:04 - 3:07
    who wants me to teach them what
    the open research questions are
  • 3:07 - 3:11
    and how they can improve, how
they can help you improve Tor.
  • 3:11 - 3:14
    So, the basic idea is,
    you have your software,
  • 3:14 - 3:16
    it pulls down a list of the 8000 relays,
  • 3:16 - 3:18
    and it builds a path through 3 of them
  • 3:18 - 3:21
    so that no single relay gets to learn
  • 3:21 - 3:24
    where you’re coming from
    and where you’re going.
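[Editor's note: the path-building idea just described can be sketched as a toy. This is illustrative only; real Tor path selection is far more involved, with bandwidth weighting, long-lived guard nodes, and relay-family constraints.]

```python
# Toy illustration of building a 3-hop path so that no single relay
# sees both who you are and where you are going. Real Tor path
# selection is far more involved; this only shows the shape of the idea.
import random

relays = [f"relay{i}" for i in range(8000)]  # stand-ins for the ~8000 relays
guard, middle, exit_relay = random.sample(relays, 3)  # three distinct hops

print(guard, middle, exit_relay)
```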
  • 3:24 - 3:28
    And you can see up at the top
    here, there is a .onion address.
  • 3:28 - 3:32
    So basically, with Hidden Services,
    or Onion Services,
  • 3:32 - 3:37
    in your browser – in your Tor Browser –
    you go to an alternate type of
  • 3:37 - 3:40
domain name that ends in .onion,
    and then you end up at that website.
  • 3:40 - 3:44
    So here’s an example of
    a riseup.net website,
  • 3:44 - 3:48
    which we are reaching using
    the onion address for it
  • 3:48 - 3:53
    rather than black.riseup.net.
  • 3:53 - 3:57
    Okay, so, I talked about the building
block before, how you use Tor normally
  • 3:57 - 4:01
    to build a 3-hop circuit through the
    network. Once you have that building block,
  • 4:01 - 4:04
    then you can glue two of them together.
  • 4:04 - 4:08
    So you’ve got Alice over here,
    connecting into the Tor network,
  • 4:08 - 4:11
    and you’ve got Bob, the website,
    connecting into the Tor network,
  • 4:11 - 4:13
and they rendezvous in the middle.
  • 4:13 - 4:17
    So Alice is getting her anonymity,
her 3 hops inside Tor,
  • 4:17 - 4:20
    Bob is getting his anonymity,
his 3 hops inside of Tor,
  • 4:20 - 4:21
    and they meet in the middle.
  • 4:21 - 4:26
    So Alice doesn’t know where Bob is,
    Bob doesn’t know where Alice is,
  • 4:26 - 4:28
    and the point in the middle
    doesn’t know either of them,
  • 4:28 - 4:32
    yet they can reach each other, and
    get some cool security properties.
  • 4:32 - 4:35
    So, some of these cool
    security properties:
  • 4:35 - 4:38
    One of the really cool ones is that
    that .onion name that you saw
  • 4:38 - 4:42
    with its base32
  • 4:42 - 4:44
    big pile of 16 characters:
  • 4:44 - 4:48
    that is the hash of the public
    key of the Onion Service,
  • 4:48 - 4:52
    and that hash is the onion address.
    So they’re self-authenticating, meaning
  • 4:52 - 4:54
    if I have the right onion address,
  • 4:54 - 4:57
    I can be sure that I’m
    connecting to the website,
  • 4:57 - 5:00
    to the service, that’s
    associated with that key.
  • 5:00 - 5:03
    So I don’t need some sort
    of Certificate Authority model
  • 5:03 - 5:06
    where I trust Turkish
    Telecom to not lie to me.
  • 5:06 - 5:10
    It’s all built-in, self-authenticating,
  • 5:10 - 5:12
    I don’t need any external resources
  • 5:12 - 5:16
    to convince myself that I’m
    going to the right place.
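[Editor's note: the self-authentication described here can be sketched concretely. For the generation of onion services discussed in this talk, the 16-character name is the base32 encoding of the first 80 bits of the SHA-1 hash of the service's DER-encoded RSA public key. A minimal sketch; the key bytes below are a stand-in, not a real key.]

```python
import base64
import hashlib

def onion_address(pubkey_der: bytes) -> str:
    # base32 of the first 80 bits (10 bytes) of SHA-1 over the DER key
    digest = hashlib.sha1(pubkey_der).digest()
    return base64.b32encode(digest[:10]).decode("ascii").lower() + ".onion"

# Any byte string stands in for a real DER-encoded RSA key here:
addr = onion_address(b"not a real key, just an illustration")
print(addr)  # 16 base32 characters, then ".onion"
```

Because the name is derived from the key, knowing the address is enough to verify you reached the holder of that key, with no certificate authority involved.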
  • 5:16 - 5:19
    Along with that, is, they’re
    end-to-end encrypted.
  • 5:19 - 5:23
    So I know that nobody
    between my Tor client
  • 5:23 - 5:26
    and the Tor client on the
    Service side is able to read,
  • 5:26 - 5:29
    or intercept, or
    man-in-the-middle the traffic.
  • 5:29 - 5:32
    So there are some other
    interesting features also,
  • 5:32 - 5:35
    one of them is the NAT punching feature.
  • 5:35 - 5:37
    If you offer an Onion Service,
  • 5:37 - 5:40
    there’s no reason to allow
    incoming connections to it.
  • 5:40 - 5:44
    So I can run an Onion Service
    deep inside the corporate firewall,
  • 5:44 - 5:48
    or behind Comcast’s firewall,
    or wherever I want to,
  • 5:48 - 5:50
    and people are able to reach it.
  • 5:50 - 5:55
    So there are a lot of people from
    the systems administration side
  • 5:55 - 6:01
    who say: “I’m going to offer an Onion
    address for my home SSH server,
  • 6:01 - 6:04
    and now the only way that I can
    connect back into my home box
  • 6:04 - 6:08
is via the Tor network.
    I get end-to-end encryption,
  • 6:08 - 6:12
    I get self-authentication,
    and there’s no other way in.
  • 6:12 - 6:15
    I just firewall all incoming connections
  • 6:15 - 6:18
    and so the only surface area
    that I expose to the world
  • 6:18 - 6:22
    is, if you’re using my onion
    address, you reach my SSH port.
  • 6:22 - 6:25
    I don’t allow any other
    packets in of any sort.”
  • 6:25 - 6:29
    So that’s a cool example
    of how security people
  • 6:29 - 6:32
    use Onion Services.
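[Editor's note: the SSH setup described above comes down to a two-line torrc fragment. The directory path here is illustrative; adapt it to your system.]

```
# torrc fragment: expose a local SSH daemon as an onion service.
HiddenServiceDir /var/lib/tor/ssh_onion/
HiddenServicePort 22 127.0.0.1:22
```

After restarting tor, the generated onion address can be read from the hostname file inside the HiddenServiceDir. The firewall can then drop all inbound traffic, since the service only makes outbound connections into the Tor network.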
  • 6:32 - 6:36
    George: So, hello, we have some
    statistics for you to show you,
  • 6:36 - 6:39
    to give you an idea of the
    current maturity of the system.
  • 6:39 - 6:44
    We got these statistics by asking
    relays to send us information
  • 6:44 - 6:47
    about the Hidden Service
    activity they see.
  • 6:47 - 6:50
    Only a small fraction of relays
    is reporting these statistics,
  • 6:50 - 6:54
    so we extrapolate
    from this small fraction.
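[Editor's note: the extrapolation step amounts to dividing by the reporting fraction. All numbers below are made up for illustration; the real pipeline, described in Tor proposal 238, also adds noise to each relay's report before publishing, to protect individual clients.]

```python
# Back-of-the-envelope version of the extrapolation described above.
# All numbers are illustrative, not real measurements.
reporting_fraction = 0.05          # share of relay capacity that reports
observed_onion_services = 1500     # services seen by the reporting relays

estimated_total = observed_onion_services / reporting_fraction
print(estimated_total)  # → 30000.0
```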
  • 6:54 - 6:58
    So that’s why these statistics can
    have lots of ups and downs,
  • 6:58 - 7:01
    and noise, and everything, but anyway,
    they can give you a basic idea.
  • 7:01 - 7:05
    So, this first statistic is the number of
    Hidden Services on the network,
  • 7:05 - 7:10
    and you can see that it’s about
30,000 Hidden Services, give or take,
  • 7:10 - 7:15
    and it’s a pretty small number if you
    compare it to the whole Internet,
  • 7:15 - 7:19
    I don’t even know, it’s basically
    in the early adoption stages.
  • 7:19 - 7:22
    And we also have
    another statistic, this one,
  • 7:22 - 7:28
    which is the traffic that the Hidden
    Services are generating, basically.
  • 7:28 - 7:33
    On the top, you can see the total traffic
    that the whole network is pushing.
  • 7:33 - 7:36
It’s about, I don’t know, 60,000 megabits,
  • 7:36 - 7:40
    and the bottom graph is the
    Hidden-Service-specific traffic,
  • 7:40 - 7:42
    and you can see that
    it’s like 1000 megabits.
  • 7:42 - 7:46
    Like, a very small
    fraction, basically. So,
  • 7:46 - 7:50
    Hidden Services are still a
    very small part of TOR. And,
  • 7:50 - 7:54
    if you don’t understand this
    number thing very well,
  • 7:54 - 7:58
    we did some calculations and stuff,
    and we have this new figure for you,
  • 7:58 - 8:03
    which is that basically 5% of
    client traffic is Hidden Services.
  • 8:03 - 8:07
    Of the whole of Tor, 5% is
    Hidden Services, basically.
  • 8:07 - 8:10
    You can make of that what you want.
  • 8:10 - 8:14
    So, and, we did this whole
    thing like a year ago,
  • 8:14 - 8:19
    and we spent lots of time like figuring
    out how to collect statistics,
  • 8:19 - 8:22
    how to get from the values
    themselves to those graphs,
  • 8:22 - 8:27
    how to obfuscate the statistics in
    such a way that we don’t reveal
  • 8:27 - 8:30
    any information about any clients,
  • 8:30 - 8:34
    and we wrote a tech report about
    it, that you can find in this link
  • 8:34 - 8:38
    if you’re interested in looking more
    [at] how the whole thing works,
  • 8:38 - 8:43
    and we even wrote a proposal, so if you
google for Tor Project proposal 238,
  • 8:43 - 8:50
    you can find more information,
    and, yeah, that’s it.
  • 8:51 - 8:55
    Roger: Okay, so, how did
    this whole thing start?
  • 8:55 - 8:58
    We’re going to go through a
    couple of years at the beginning.
  • 8:58 - 9:01
    In 2004, I wrote the original
    Hidden Service code,
  • 9:01 - 9:04
    and I basically wrote it as a toy.
  • 9:04 - 9:07
    It was an example: “We
have this thing called Tor,
  • 9:07 - 9:10
    and use it as a building
    block, look what we can do!
  • 9:10 - 9:12
We can connect two Tor circuits together,
  • 9:12 - 9:16
    and then you can run a service like this.”
  • 9:16 - 9:19
    Basically nobody used it for a few years.
  • 9:19 - 9:21
    One of my friends set
    up a hidden wiki where,
  • 9:21 - 9:24
    if you run an Onion Service,
    then you can go to the wiki,
  • 9:24 - 9:27
    and sign up your address
    so that people can find it.
  • 9:27 - 9:31
    And there were some example services.
    But for the first couple of years,
  • 9:31 - 9:35
    it basically wasn’t used,
    wasn’t interesting.
  • 9:35 - 9:40
    The first really interesting use
    case was the Zyprexa documents.
  • 9:40 - 9:42
    So this was in 2005, 2006.
  • 9:42 - 9:46
    There’s a huge pharmaceutical
    company called Eli Lilly
  • 9:46 - 9:50
    and they have an antipsychotic
    drug called Zyprexa
  • 9:50 - 9:54
    and it turns out that it was
    giving people diabetes
  • 9:54 - 9:58
    and harming them, and killing
    them, and they knew about it.
  • 9:58 - 10:02
And somebody leaked 11,000
    documents onto the Internet
  • 10:02 - 10:05
    showing that this drug
    company knew about the fact
  • 10:05 - 10:07
    that they were harming their customers.
  • 10:07 - 10:11
    And of course the drug company sent
    a cease and desist to the website,
  • 10:11 - 10:14
    and it went away, and it
    came up somewhere else,
  • 10:14 - 10:16
    and they sent a cease and desist,
    and it was bouncing around,
  • 10:16 - 10:21
and suddenly somebody set up a Tor
    Hidden Service with all of the documents,
  • 10:21 - 10:25
    and Eli Lilly had no idea how to send
    a cease and desist to that address,
  • 10:25 - 10:28
    and a lot of people were able to read
  • 10:28 - 10:31
    the corruption and problems
    with this drug company.
  • 10:31 - 10:35
    So that was… on the one hand, yay!
  • 10:35 - 10:39
    applause
  • 10:39 - 10:43
    On the one hand, that’s really cool. Here
    we are, we have a censorship-resistant
  • 10:43 - 10:47
    privacy thing, somebody
    used it to get information out
  • 10:47 - 10:51
    about a huge company that
    was hurting people, great!
  • 10:51 - 10:55
    On the other hand, it set
    us up where ever after,
  • 10:55 - 10:58
    people looked at Hidden Services and said:
  • 10:58 - 11:01
    “Well, how do I find a document
    that some large organization’s
  • 11:01 - 11:06
    going to be angry about? I’m going
    to set up a website for leaking things,
  • 11:06 - 11:10
    I’m gonna set up a website for something
    else that the Man wants to shut down.”
  • 11:10 - 11:14
    So the first example of
    Hidden Services pointed us
  • 11:14 - 11:18
    in a direction where, after
    that, a lot of people
  • 11:18 - 11:22
    thought that that’s what
    Hidden Services were about.
  • 11:22 - 11:25
    So, that leads to the next year,
  • 11:25 - 11:28
WikiLeaks set up a Hidden Service
    for their submission engine,
  • 11:28 - 11:32
    and it’s not that they wanted to
    hide the location of the server.
  • 11:32 - 11:36
    The server was in Sweden, everybody
    knew the server was in Sweden.
  • 11:36 - 11:40
    But they wanted to give extra security
    to users who were trying to get there.
  • 11:40 - 11:45
    One of the really interesting properties
    that they used from Hidden Services
  • 11:45 - 11:48
    is the fact that if you
    go to the .onion site
  • 11:48 - 11:52
    from your normal browser,
    it totally doesn’t work.
  • 11:52 - 11:54
    And this was a security feature for them.
  • 11:54 - 11:58
    Because they wanted to make
    sure that if you’re a leaker,
  • 11:58 - 12:01
    and you’re doing it wrong,
    you’re configuring things wrong,
  • 12:01 - 12:02
    then it totally fails from the beginning.
  • 12:02 - 12:06
    They wanted to completely remove
    the chance that you accidentally think
  • 12:06 - 12:10
    that you’re using Tor
    correctly and being safe
  • 12:10 - 12:13
    when actually you screwed
    up and you’re not using Tor.
  • 12:13 - 12:15
    So they wanted to use Onion Services
  • 12:15 - 12:18
    as another layer of security for the user,
  • 12:18 - 12:21
    to protect the user from screwing up.
  • 12:21 - 12:25
    Now fast forward a couple of more years,
    there’s another organization in Italy
  • 12:25 - 12:29
    called GlobaLeaks, where they’ve
    set up basically a mechanism where,
  • 12:29 - 12:32
    if you have something you
    want to share with the world,
  • 12:32 - 12:36
    then you can be connected to a journalist
    through this GlobaLeaks platform.
  • 12:36 - 12:40
    And they actually have been
    going around to governments,
  • 12:40 - 12:43
    convincing them to set
    up GlobaLeaks platforms.
  • 12:43 - 12:45
    So they’ve gone to the Italian government,
  • 12:45 - 12:48
    they’ve gone to the Philippine government,
  • 12:48 - 12:52
    and basically they say:
    “Look, this is a way for you
  • 12:52 - 12:55
    to report on corruption,
    to hear about corruption
  • 12:55 - 12:58
    inside your country.” Now, if you
    go to a government, and you say:
  • 12:58 - 13:01
    “I hear there is corruption,
    here’s a way to report on it.”
  • 13:01 - 13:04
    not everybody in the government
    will be happy with that.
  • 13:04 - 13:08
    But one of the features is,
    you can very easily say:
  • 13:08 - 13:12
    “Can you help me set up an
    anti-corruption whistleblowing site
  • 13:12 - 13:17
    for the country next door? I would be
    happy to… you know they’ve got corruption,
  • 13:17 - 13:20
    so how about they provide
    the corruption site?”
  • 13:20 - 13:24
    applause
  • 13:24 - 13:27
    So it’s really cool that GlobaLeaks
    is playing the political game,
  • 13:27 - 13:33
    trying to demonstrate that making
    these things public is worthwhile.
  • 13:33 - 13:37
    And, of course, here’s a picture of a
    cute cat, we have to have one of those,
  • 13:37 - 13:42
    and WildLeaks is a really
good example of a positive use case,
  • 13:42 - 13:46
    I mean, this is a way where if you
    see somebody killing a rhinoceros
  • 13:46 - 13:49
    or elephant or something in
    Africa, and you know about it,
  • 13:49 - 13:52
    upload it to WildLeaks, and
    then they can learn more
  • 13:52 - 13:56
    about poaching and
    extinction events and so on.
  • 13:56 - 14:00
    So, it’s hard to argue with anti-poaching,
  • 14:00 - 14:03
    anti-corruption sites like that.
  • 14:03 - 14:07
    And that moves us to SecureDrop,
    there’s a group in the US
  • 14:07 - 14:11
    that is working on another
    example of how to connect
  • 14:11 - 14:15
    people with interesting information
    to journalists who want to write about it.
  • 14:15 - 14:18
    And they’ve actually connected
    with the New Yorker and a lot of
  • 14:18 - 14:21
    high-profile newspapers,
  • 14:21 - 14:26
    to be able to provide a way for people
    to securely provide information
  • 14:26 - 14:32
    to those journalists. And they say that
    it has been used in high-profile events,
  • 14:32 - 14:34
    and they won’t tell us which
    events, which is great!
  • 14:34 - 14:38
    That’s exactly how it’s
    supposed to work.
  • 14:38 - 14:43
    applause
  • 14:43 - 14:46
    David: Hello. So, continuing
    our timeline here,
  • 14:46 - 14:50
    this very cool thing happened
    in 2014, where Aphex Twin,
  • 14:50 - 14:53
    this electronic experimental guy,
  • 14:53 - 14:57
    released his album Syro through
    an onion address on Twitter,
  • 14:57 - 15:00
and he got 4,000 retweets.
    So we encourage you guys
  • 15:00 - 15:04
    to consider this method
    of releasing all your stuff,
  • 15:04 - 15:09
and a complementary way to
    release it would be the open web.
  • 15:09 - 15:12
    So, onion addresses.
  • 15:12 - 15:16
    Following that, we got
    Blockchain, recently,
  • 15:16 - 15:20
    in 2014, let’s say two years ago.
  • 15:20 - 15:22
    They discovered
    a security concern:
  • 15:22 - 15:26
    when you’re using Tor, some exit nodes –
  • 15:26 - 15:28
    malicious exit nodes –
  • 15:28 - 15:31
    were rewriting the Bitcoin addresses.
  • 15:31 - 15:33
    So for security reasons, they changed it:
  • 15:33 - 15:37
    if you come to
    blockchain.info from Tor,
  • 15:37 - 15:39
    they tell you to use the onion address
  • 15:39 - 15:42
    so you get all the fancy properties
    of end-to-end encryption,
  • 15:42 - 15:45
    and so on, and so forth.
  • 15:45 - 15:50
Still today, we know
    that malicious exit nodes exist,
  • 15:50 - 15:54
    and they do rewrite Bitcoin addresses.
  • 15:54 - 15:56
    Don’t be alarmed, it’s not like HAL3000’s,
  • 15:56 - 16:00
the thing is that we at the Tor
    Project are actively monitoring
  • 16:00 - 16:05
    the network at the exit nodes
  • 16:05 - 16:07
for this kind of craziness.
  • 16:07 - 16:11
    And we need more help from
    everyone, from the community,
  • 16:11 - 16:13
    to find those, so we can block them,
  • 16:13 - 16:16
    remove them, so fuck
    those. Fuck those guys.
  • 16:16 - 16:21
    And Blockchain took action
    with Onion Services. So, great.
  • 16:21 - 16:24
    Roger: And Facebook set up a
    Hidden Service recently as well,
  • 16:24 - 16:26
    an onion address for their website.
  • 16:26 - 16:29
    So, the first thing many of
    you might be thinking is:
  • 16:29 - 16:33
    “Wait a minute, I don’t understand,
    Facebook is a website on the Internet,
  • 16:33 - 16:36
    why do they need a Hidden Service,
    why do they need an onion address?”
  • 16:36 - 16:42
    So, the first answer is, they worry
    about users in interesting countries.
  • 16:42 - 16:46
    Say you’ve got a Facebook user in
    Turkey or Tunisia or something like that,
  • 16:46 - 16:48
    and they try to go to Facebook,
  • 16:48 - 16:51
    and the local DNS server lies to them
    and sends them somewhere else,
  • 16:51 - 16:55
    or Turkish Telecom, which is a certificate
    authority that everybody trusts,
  • 16:55 - 16:58
    ends up pretending to be Facebook.
  • 16:58 - 17:00
They man-in-the-middle you;
    now there’s certificate pinning
  • 17:00 - 17:04
    and other challenges like that,
    and maybe those are good starts.
  • 17:04 - 17:08
    But wouldn’t it be cool just to skip the
    whole certificate authority infrastructure
  • 17:08 - 17:12
    and say “Here’s an address”, where
if you go to this in your Tor Browser,
  • 17:12 - 17:15
    you don’t have to worry
about BGP hijacking,
  • 17:15 - 17:17
    you don’t have to worry
    about certificate authorities,
  • 17:17 - 17:20
    you don’t have to worry about DNS,
it’s all inside the Tor network,
  • 17:20 - 17:23
    and it takes care of the security
    properties I talked about before.
  • 17:23 - 17:26
    So, that’s a really cool
    way that they can switch.
  • 17:26 - 17:29
    I was talking to one of the
    Facebook people earlier.
  • 17:29 - 17:34
    He doesn’t want me to tell the number of
users who are using Facebook over Tor,
  • 17:34 - 17:38
    but it’s many hundreds of thousands.
    It’s a shockingly high number of users.
  • 17:38 - 17:42
    So, wouldn’t it be cool if we
    can switch many of those users
  • 17:42 - 17:45
from connecting to Facebook.com over Tor,
  • 17:45 - 17:47
    to connecting to Facebook’s onion address,
  • 17:47 - 17:51
    and then reduce the
    load on the exit relays,
  • 17:51 - 17:54
    so that it’s faster and
    easier and scales better
  • 17:54 - 17:59
    for the people connecting to websites
    that aren’t onion addresses?
  • 17:59 - 18:02
    So, I was thinking about
    this at the very beginning
  • 18:02 - 18:05
    and I was thinking: “Wait
    a minute, I don’t get it,
  • 18:05 - 18:08
    Facebook has an onion address,
    but they have a real address,
  • 18:08 - 18:12
    why do we need the other one?”
    And then I was thinking back.
  • 18:12 - 18:15
    So, you remember 10 years ago,
    when people were running websites
  • 18:15 - 18:18
and the website administrators said:
  • 18:18 - 18:23
    “I don’t need to offer HTTPS for
    my website, because my users…”
  • 18:23 - 18:26
    and then they had some bullshit excuse
    about how their users didn’t need security
  • 18:26 - 18:29
    or didn’t need encryption,
    or something like that.
  • 18:29 - 18:32
    And now, 10 years later, we all think
  • 18:32 - 18:35
    that the people saying: “I don’t
    need HTTPS for my website”…
  • 18:35 - 18:38
    we think they’re greedy and
    short-sighted, and selfish,
  • 18:38 - 18:40
    and they’re not thinking
    about their users.
  • 18:40 - 18:44
    I think the Onion Service thing
    is exactly the same thing.
  • 18:44 - 18:46
    Right now, there are
    plenty of people saying:
  • 18:46 - 18:51
    “I already have HTTPS, I don’t need
    an onion address for my website
  • 18:51 - 18:55
    because my users…” and then
    they have some lame explanation.
  • 18:55 - 18:59
    So hopefully in a couple of years,
    it will be self-evident to everybody
  • 18:59 - 19:02
    that users should be the ones to choose
  • 19:02 - 19:05
    what sort of security
    properties they want.
  • 19:05 - 19:09
    It shouldn’t be about what the
    website thinks the user should have.
  • 19:09 - 19:11
    I should have the choice
    when I’m going to Facebook.
  • 19:11 - 19:16
    Do I go to the HTTP version,
    do I go to the HTTPS version,
  • 19:16 - 19:18
do I go to the onion version?
  • 19:18 - 19:21
    It should be up to me to
    decide what my situation is
  • 19:21 - 19:23
    and get the security
    properties that I want.
  • 19:23 - 19:27
    The other challenge here, I’ve talked
    to some researchers a while ago
  • 19:27 - 19:31
    who said: “I found a copy of
    Facebook on the dark web”
  • 19:31 - 19:34
    and I was thinking: “Wait a minute,
  • 19:34 - 19:37
    you didn’t find a copy of
    Facebook on the dark web,
  • 19:37 - 19:41
    there’s a mechanism for securely
    getting to the website called Facebook,
  • 19:41 - 19:43
    and it’s called Onion Services.
  • 19:43 - 19:47
    There’s no separate dark web,
    it’s about transport encryption,
  • 19:47 - 19:52
    it’s about a way of reaching
    the destination more safely.”
  • 19:52 - 19:54
    One of the other really cool things,
  • 19:54 - 19:57
    Facebook didn’t just set
    up an onion address,
  • 19:57 - 20:01
    they got an HTTPS certificate
    for their onion address.
  • 20:01 - 20:03
    They got an EV cert,
    the kind that shows you
  • 20:03 - 20:06
    the green little bar that says:
    “This is Facebook”
  • 20:06 - 20:09
    for their onion address.
    They went to DigiCert,
  • 20:09 - 20:13
    and DigiCert gave them an SSL certificate
  • 20:13 - 20:18
    for their onion address, so now you can
    get both of them at once. Which is
  • 20:18 - 20:23
    an amazing new step that we hadn’t
    even been thinking about at the time.
  • 20:23 - 20:25
    So, what does this give them?
    Why is this valuable?
  • 20:25 - 20:28
    One of them is, on the browser side,
  • 20:28 - 20:32
    when you’re going to an HTTPS URL,
  • 20:32 - 20:35
    the browser knows to
    treat those cookies better,
  • 20:35 - 20:37
    and to not leak certain things,
  • 20:37 - 20:40
    and there’s all sorts of security
    and privacy improvements
  • 20:40 - 20:42
    that browsers do when you’re going there.
  • 20:42 - 20:45
    And we don’t want to have to teach the browser
  • 20:45 - 20:49
    a new rule: “if it’s HTTPS or it’s .onion, then be safe.”
  • 20:49 - 20:50
    The other nice thing, on the server side,
  • 20:50 - 20:53
    Facebook didn’t have to change anything.
  • 20:53 - 20:55
    This is another way of reaching
    the Facebook server.
  • 20:55 - 20:59
    That’s all there is to it.
  • 20:59 - 21:03
    And then, another cool thing:
  • 21:03 - 21:08
    It turns out that the only way
    to get a wildcard EV certificate
  • 21:08 - 21:10
    is for an onion domain.
  • 21:10 - 21:14
    It’s actually written into, like,
    the certificate authority world,
  • 21:14 - 21:18
    that there is a grand exception
    for onion addresses.
  • 21:18 - 21:23
    You can’t get a wildcard EV cert
    unless it’s for an onion address.
  • 21:23 - 21:26
    So this is super duper
    endorsement of Onion Services
  • 21:26 - 21:30
    from the certificate authority people.
  • 21:30 - 21:35
    applause
  • 21:35 - 21:38
    But let’s take a step even further.
  • 21:38 - 21:42
    Wouldn’t it be cool if we take
    the Let’s Encrypt project
  • 21:42 - 21:45
and they bundle a Tor client in
  • 21:45 - 21:48
    with each web server that’s
    offering the Let’s Encrypt system?
  • 21:48 - 21:52
    So every time you sign up for Let’s
    Encrypt, you also click the button
  • 21:52 - 21:55
    saying: “And I want an onion
    address for my website”,
  • 21:55 - 21:57
    and they automatically,
    in the same certificate…
  • 21:57 - 22:02
    you get one for riseup.net,
    and as an alternate name,
  • 22:02 - 22:05
    it’s blahblahblah.onion.
    It’s in the same certificate.
  • 22:05 - 22:08
    So users can go to your website directly
  • 22:08 - 22:10
    or they can go there
    over the onion address
  • 22:10 - 22:13
    and either way you
    provide the SSL certificate
  • 22:13 - 22:14
    that keeps everybody safe.
  • 22:14 - 22:18
    Wouldn’t it be cool if every time
    somebody signs up for Let’s Encrypt,
  • 22:18 - 22:22
    they get an onion address
    for free for their website,
  • 22:22 - 22:26
    so that everybody can choose how
    they want to reach that website?
  • 22:26 - 22:33
    applause, cheering
  • 22:33 - 22:37
    Now, there are a few problems
    with that. One of the big ones is,
  • 22:37 - 22:41
    we want some way of binding
    the riseup.net address
  • 22:41 - 22:45
    to the onion address,
    so that when I go to riseup.net
  • 22:45 - 22:47
    I know that I’m going to
    the correct onion address.
  • 22:47 - 22:50
    So we need some way to vouch for them
  • 22:50 - 22:53
    and connect them through
    signatures or something.
  • 22:53 - 22:56
    It can be done, but somebody
    needs to work out the details.
  • 22:56 - 22:57
    The other policy barrier is,
  • 22:57 - 23:01
    right now, the certificate
    authority people
  • 23:01 - 23:04
    say you cannot get a DV cert –
  • 23:04 - 23:08
    the normal kind of cert – for an onion
    address. You can only get an EV cert.
  • 23:08 - 23:11
    And Alec over here is leading the charge
  • 23:11 - 23:13
    to convince them that that makes no sense,
  • 23:13 - 23:16
    so hopefully in the next couple
    of years, with all of your help,
  • 23:16 - 23:18
    they will realize that onion addresses are
  • 23:18 - 23:22
    just like all the other
    addresses in the world.
  • 23:22 - 23:26
    Which leads to another really
    cool feature from this year.
  • 23:26 - 23:32
We got the IETF to publicly
    specify, in a real RFC,
  • 23:32 - 23:35
    that the .onion domain is a special case,
  • 23:35 - 23:38
    and they’re not going to give
    it out in any other way. So…
  • 23:38 - 23:40
    applause
    yeah!
  • 23:40 - 23:46
    applause
  • 23:46 - 23:49
    So the first effect here is
    that we have actual approval
  • 23:49 - 23:54
    of Onion Services from the IETF
    and other standards committees.
  • 23:54 - 23:57
    But the second effect, which
    is a second-order effect, is,
  • 23:57 - 24:01
    now we can go to the browsers,
    and the DNS resolvers,
  • 24:01 - 24:05
    and say: whenever you
    see a .onion name being resolved,
  • 24:05 - 24:07
    cut it right there, because you
    know that it’s not going into Tor,
  • 24:07 - 24:10
    and you know that it shouldn’t
    go out onto the network.
  • 24:10 - 24:13
    So now, when you’re in your
    normal Internet Explorer,
  • 24:13 - 24:15
    and you accidentally click
    on an onion address,
  • 24:15 - 24:18
    Internet Explorer knows
    that that’s a local address,
  • 24:18 - 24:20
    that shouldn’t go out onto the network.
  • 24:20 - 24:23
    So we can keep people
    safer, in ordinary browsers
  • 24:23 - 24:29
    that otherwise wouldn’t
    even care that we exist.
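The behavior Roger describes is what RFC 7686 asks of non-TOR-aware software: treat .onion as a special-use domain and never leak it to the DNS. A minimal sketch of that check, where `do_dns_lookup` is a hypothetical stand-in for the ordinary resolution path:

```python
def do_dns_lookup(hostname):
    # Hypothetical stand-in for ordinary DNS resolution (not implemented here).
    raise NotImplementedError

def resolve(hostname: str):
    # RFC 7686: .onion is a special-use TLD; never send it out to the DNS.
    if hostname.lower().rstrip(".").endswith(".onion"):
        raise ValueError("refusing to resolve %s: .onion is reserved "
                         "for TOR (RFC 7686)" % hostname)
    return do_dns_lookup(hostname)
```

A browser or resolver with this check cuts the request "right there": the onion name fails locally instead of leaking onto the network.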
  • 24:29 - 24:33
    George: OK, so, so far we’ve been talking
    about websites and Hidden Services,
  • 24:33 - 24:36
    but this is not all that
    Hidden Services can do.
  • 24:36 - 24:39
    Basically, you can do any sort of TCP
  • 24:39 - 24:41
    thing you want to do over Hidden Services.
  • 24:41 - 24:46
    We’re going to show you a few
    examples of third-party applications
  • 24:46 - 24:48
    that have been developed
    for Hidden Services
  • 24:48 - 24:50
    and do various interesting things.
  • 24:50 - 24:55
    First of all, OnionShare is
    a file transfer application
  • 24:55 - 24:59
    where you basically download this
    thing, and then you feed it a file,
  • 24:59 - 25:03
    and then it exposes an HTTP server
  • 25:03 - 25:07
    at an onion address
    that hosts your file,
  • 25:07 - 25:10
    and you can give that URL
    to your friends,
  • 25:10 - 25:14
    and they can just put it into their TOR
    Browser and download the file easily.
  • 25:14 - 25:17
    It’s quite convenient, nicely made,
  • 25:17 - 25:20
    and I think various organizations
  • 25:20 - 25:24
    like the Intercept and stuff, are
    using it to transfer files internally.
  • 25:24 - 25:28
    And it works fine so far,
    as far as we know.
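The OnionShare idea can be sketched in plain Python: a local HTTP server that serves one file at an unguessable URL. This is an illustration under stated assumptions, not OnionShare's actual code, and the TOR side (a `HiddenServiceDir` pointing at this port, which is what gives it an onion address) is outside the sketch:

```python
import http.server
import os
import secrets

def make_server(path, port=0):
    """Serve a single file at an unguessable URL on localhost.

    TOR, configured separately, would expose this port as an onion
    address; friends then fetch http://<onion-address>/<slug>.
    """
    slug = secrets.token_urlsafe(16)        # hard-to-guess download path

    class OneFileHandler(http.server.BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path != "/" + slug:
                self.send_error(404)        # wrong slug: reveal nothing
                return
            with open(path, "rb") as f:
                data = f.read()
            self.send_response(200)
            self.send_header("Content-Disposition",
                             'attachment; filename="%s"'
                             % os.path.basename(path))
            self.send_header("Content-Length", str(len(data)))
            self.end_headers()
            self.wfile.write(data)

        def log_message(self, *args):       # keep the demo quiet
            pass

    server = http.server.HTTPServer(("127.0.0.1", port), OneFileHandler)
    return server, slug
```

Because the server only ever binds to localhost and is reached through the onion address, the recipient never learns where the file is hosted.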
  • 25:28 - 25:33
    Then, Hidden Services are
    quite good for doing messaging
  • 25:33 - 25:37
    because basically both sides are anonymous
  • 25:37 - 25:41
    and this gives a nice twist
    to talking with another person,
  • 25:41 - 25:44
    and one way to do so is Ricochet,
  • 25:44 - 25:47
    which is an application that allows you
  • 25:47 - 25:49
    to talk one-to-one to other people.
  • 25:49 - 25:54
    It’s decentralized, because
    it works over Hidden Services.
  • 25:54 - 25:58
    That’s actually quite useful,
    because, for example,
  • 25:58 - 26:02
    a few months ago, the Jabber CCC
    server got shut down for a few days
  • 26:02 - 26:05
    and you couldn’t talk
    to anyone, basically,
  • 26:05 - 26:07
    but if you used Ricochet, you were fine,
  • 26:07 - 26:10
    because they can shut down the CCC server
  • 26:10 - 26:14
    but they probably can’t shut down
    the whole TOR network so easily.
  • 26:14 - 26:19
    It also has a nice slick UI,
    which is quite refreshing
  • 26:19 - 26:23
    if you’re used to the usual
    UIs of the open-source world.
  • 26:23 - 26:26
    And, anyway, you can…
  • 26:26 - 26:29
    you can download it from
    that web server there.
  • 26:29 - 26:31
    And then there is Pond,
  • 26:31 - 26:35
    which is a more experimental
    avant-garde messaging application,
  • 26:35 - 26:41
    which is basically a mix between
    messaging and mix nets.
  • 26:41 - 26:45
    It basically uses a server which
    delays your messages and stuff,
  • 26:45 - 26:48
    which makes it much harder
    for a network adversary
  • 26:48 - 26:51
    to know when you’re sending
    or receiving messages,
  • 26:51 - 26:55
    because you also send chaff,
    and fake traffic and stuff.
  • 26:55 - 26:57
    It’s super-experimental,
  • 26:57 - 26:59
    the author doesn’t even
    want us to really endorse it,
  • 26:59 - 27:05
    but information is free and
    you can visit that website
  • 27:05 - 27:09
    to learn more about it.
  • 27:09 - 27:12
    David: So, there’s also,
    for many years now,
  • 27:12 - 27:16
    plenty of services and tools that exist.
    George just showed us some tools,
  • 27:16 - 27:19
    but now there’s services like
    Jabber, SMTP, and IMAP,
  • 27:19 - 27:23
    from the Riseup guys, but not
    only Riseup, but Systemli,
  • 27:23 - 27:25
    Autistici, and Calyx Institute for Jabber.
  • 27:25 - 27:28
    And there are more and more
    Jabber servers right now
  • 27:28 - 27:31
    that are federating over TOR through the Onions
  • 27:31 - 27:33
    for server-to-server
    and also client-to-server.
  • 27:33 - 27:37
    And this has been around for a long time,
    and it serves many, many, many users.
  • 27:37 - 27:41
    I think Riseup has more than
    30,000 users on their Jabbers.
  • 27:41 - 27:43
    Another neat thing is,
  • 27:43 - 27:48
    very recently, Debian set up their
    package repository as an Onion Service,
  • 27:48 - 27:52
    and now you can use an onion address
    to just update your Debian system.
  • 27:52 - 27:56
    And you use this amazing package,
    which is apt-transport-tor, and, hop!,
  • 27:56 - 28:01
    you can update everything through an onion
    address, it will detect it automatically.
  • 28:01 - 28:05
    But then, there’s also much
    more that happened recently.
  • 28:05 - 28:08
    The GPG key servers also
    exist at an onion address.
  • 28:08 - 28:12
    So you can update your GPG key,
  • 28:12 - 28:14
    and download a signature,
    and so on, and so forth,
  • 28:14 - 28:18
    which in a way hides
    your social graph
  • 28:18 - 28:21
    from global observers
    – because we know they exist –
  • 28:21 - 28:26
    since, well, you’re in an
    end-to-end encrypted channel.
  • 28:26 - 28:28
    Of course the traffic still goes to
    the GPG key servers, that’s true,
  • 28:28 - 28:32
    but still at least on the
    wire, it’s hidden. Very nice.
  • 28:32 - 28:36
    Now, DuckDuckGo, of course – they have
    Jabber and they also have a Hidden Service.
  • 28:36 - 28:40
    And I talked to the DuckDuckGo
    people a few months ago maybe,
  • 28:40 - 28:44
    I don’t remember. But the point is,
    they have many, many, many users
  • 28:44 - 28:47
    coming through their onion
    addresses. The Pirate Bay also.
  • 28:47 - 28:52
    So, the point of all this is that,
    with Facebook and Blockchain,
  • 28:52 - 28:55
    and all those we’ve seen
    – and we actually know that
  • 28:55 - 28:59
    several of the Alexa top 500 websites
  • 28:59 - 29:03
    are currently deploying onion addresses.
  • 29:03 - 29:08
    And the point here is, between
    the TOR network, the onion space
  • 29:08 - 29:14
    and the open Internet…
    If all sites are on both sides, well,
  • 29:14 - 29:18
    it becomes one side. It’s just different
    ways of accessing the information.
  • 29:18 - 29:21
    So please, please, please go to your
    companies, go to your organization,
  • 29:21 - 29:24
    deploy onion addresses,
    and make them public.
  • 29:24 - 29:28
    Help us have many more.
  • 29:28 - 29:31
    Roger: Let me, before I get to
    the next one, re-emphasize
  • 29:31 - 29:34
    the point that George was
    making about Ricochet.
  • 29:34 - 29:37
    So Ricochet is an alternate chat program
  • 29:37 - 29:40
    where every user is
    their own onion address.
  • 29:40 - 29:42
    Every user is their own Onion Service.
  • 29:42 - 29:46
    And you talk from one Onion
    Service to the other Onion Service.
  • 29:46 - 29:50
    You don’t have to know where the person is
    or necessarily even who the person is.
  • 29:50 - 29:52
    And there’s no middle,
    there’s no central point
  • 29:52 - 29:55
    to go and learn all the accounts,
  • 29:55 - 29:57
    and who’s friends with who, and so on.
  • 29:57 - 30:01
    There’s nothing to break into in the
    middle where you can spy on everybody.
  • 30:01 - 30:04
    Everything is decentralized,
    everybody is their own onion address.
  • 30:04 - 30:09
    So I think that’s a key point
    as an alternate chat paradigm
  • 30:09 - 30:13
    where hopefully we can switch away
    from the centralization model.
  • 30:13 - 30:19
    applause
  • 30:19 - 30:21
    Okay. So, on to phase 2,
    a brief diversion.
  • 30:21 - 30:25
    We’ve been talking to a bunch of
    researchers over the past few years
  • 30:25 - 30:28
    who want to do research
    on TOR, to study
  • 30:28 - 30:32
    how many users there are,
    or how many people go to Facebook,
  • 30:32 - 30:35
    or all sorts of other
    research questions, and
  • 30:35 - 30:40
    sometimes they do it in dangerous
    ways, or inappropriate ways.
  • 30:40 - 30:44
    So we’ve been working on guidelines
    to help people who want to do it safely
  • 30:44 - 30:48
    actually be able to not harm
    people or minimize the harm.
  • 30:48 - 30:50
    So here are some of the guidelines.
  • 30:50 - 30:54
    First one is, try to attack your
    own traffic, try to attack yourself,
  • 30:54 - 30:58
    so, if you have a question and you
    need to do it on the real TOR network,
  • 30:58 - 31:01
    you should be the one to generate
    your traffic and then try to attack that.
  • 31:01 - 31:05
    You shouldn’t just pick an
    arbitrary user and attack them
  • 31:05 - 31:07
    because who knows if that’s a
    person in Syria who needs help
  • 31:07 - 31:12
    or a person in Germany who’s trying to
    get out from oppression, and so on.
  • 31:12 - 31:17
    Another approach: only collect data that
    you’re willing to publish to the world.
  • 31:17 - 31:19
    So, too many researchers say:
  • 31:19 - 31:22
    “Well I’m going to learn all
    of this interesting stuff,
  • 31:22 - 31:24
    and I’m going to write it
    down on my hard drive,
  • 31:24 - 31:27
    and I’ll keep it very safe.
    Nobody will break in.”
  • 31:27 - 31:29
    That approach fails every time.
  • 31:29 - 31:32
    Somebody breaks in, you lose
    the data, you forget about it,
  • 31:32 - 31:35
    so the ethical way to do this is,
  • 31:35 - 31:38
    only collect stuff that you’re
    willing to make public.
  • 31:38 - 31:41
    Only collect stuff that’s
    safe to make public.
  • 31:41 - 31:45
    And then the other piece, that’s part
    of what we’re talking about there,
  • 31:45 - 31:47
    don’t collect data that you don’t need,
  • 31:47 - 31:50
    so figure out what your
    research question is,
  • 31:50 - 31:54
    figure out the minimum that you
    can collect to answer that question.
  • 31:54 - 31:57
    For example, if I want to know
    how many people connect to Facebook,
  • 31:57 - 32:01
    I should not collect every destination
    that everybody goes to,
  • 32:01 - 32:05
    and then afterwards count up
    how many of them were Facebook.
  • 32:05 - 32:08
    I should have a counter that
    says: Facebook, increment.
  • 32:08 - 32:10
    And then at the end it outputs a number,
  • 32:10 - 32:13
    and that’s the only thing that
    I need to know in that case.
  • 32:13 - 32:15
    So limit the granularity of data.
  • 32:15 - 32:19
    If you’re counting how many users
    are connecting from different countries,
  • 32:19 - 32:23
    and there are very few
    users coming from Mauritania,
  • 32:23 - 32:25
    consider rounding that down to zero,
  • 32:25 - 32:27
    so that you don’t accidentally harm
  • 32:27 - 32:30
    the five people in Mauritania
    who’re using it today.
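The two ideas above, counting in aggregate instead of storing destinations and limiting the granularity of small counts, can be sketched like this. It is an illustration, not actual TOR measurement code, and the threshold of 8 is an arbitrary example value:

```python
def count_matches(destinations, target="facebook.com"):
    """Keep a single counter; never store the destination list itself."""
    n = 0
    for dest in destinations:
        if dest == target:
            n += 1                      # "Facebook, increment"
    return n

def publish_counts(per_country, threshold=8):
    """Round counts below the threshold down to zero, so a handful of
    users in one country cannot be singled out by the published data."""
    return {cc: (n if n >= threshold else 0)
            for cc, n in per_country.items()}
```

For example, `publish_counts({"de": 120, "mr": 5})` would publish 120 for Germany and 0 for Mauritania, protecting the five users there.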
  • 32:30 - 32:32
    So, the approach to do this is:
  • 32:32 - 32:34
    figure out what you’re trying to learn,
  • 32:34 - 32:37
    describe the benefits to
    the world of learning that,
  • 32:37 - 32:41
    describe the risks to
    people around the world
  • 32:41 - 32:44
    in TOR of the approach that you’re taking,
  • 32:44 - 32:47
    and then argue that the
    benefits outweigh the risks.
  • 32:47 - 32:50
    And one of the key ways
    of looking at this is,
  • 32:50 - 32:55
    if you’re collecting something
    interesting, some interesting dataset,
  • 32:55 - 32:57
    think about whether there
    could be somebody else,
  • 32:57 - 33:01
    somewhere out there in the world,
    who has some other dataset,
  • 33:01 - 33:04
    and when you combine
    their dataset with yours,
  • 33:04 - 33:07
    somebody learns something new,
    somebody gets harmed.
  • 33:07 - 33:10
    If you can imagine any other dataset
  • 33:10 - 33:12
    that, when it’s combined
    with yours, harms people,
  • 33:12 - 33:15
    then you need to think harder about that.
  • 33:15 - 33:19
    And then the last point:
    use a test network, when possible.
  • 33:19 - 33:22
    There’s a tool called chutney,
    there’s a tool called Shadow,
  • 33:22 - 33:25
    where you can run your own internal
    TOR network on one computer,
  • 33:25 - 33:29
    and if you can do your research
    that way, it’s even better.
  • 33:29 - 33:32
    So, those are great
    guidelines, sounds good.
  • 33:32 - 33:33
    We need to encourage
    more people to follow them,
  • 33:33 - 33:36
    and, to be fair, this is not going to
  • 33:36 - 33:38
    stop bad people from doing bad things.
  • 33:38 - 33:41
    But if you want to do
    research responsibly,
  • 33:41 - 33:44
    and ethically, without harming TOR users,
  • 33:44 - 33:46
    then we need to help everybody learn
  • 33:46 - 33:49
    what the guidelines are
    to do it more safely.
  • 33:49 - 33:51
    So here’s an example of a tricky edge case
  • 33:51 - 33:54
    where we really want to think
    harder about these things.
  • 33:54 - 33:57
    One of them is, there are people out there
  • 33:57 - 34:01
    who want to build a list of every
    onion address that they can find.
  • 34:01 - 34:04
    So, you can learn about
    that by going to google
  • 34:04 - 34:08
    and doing a google search on .onion
    and they give you some addresses.
  • 34:08 - 34:12
    That’s okay, that seems reasonable.
    It’s a public dataset, okay, fine.
  • 34:12 - 34:15
    There’s a more complicated
    one, where you are Verisign,
  • 34:15 - 34:18
    and you run some of the DNS root servers,
  • 34:18 - 34:23
    and you spy on the DNS
    queries of the whole Internet.
  • 34:23 - 34:27
    And anybody who accidentally
    sends a .onion DNS query
  • 34:27 - 34:30
    to your root server, you write it down.
  • 34:30 - 34:32
    So now you learn all the side-channel
  • 34:32 - 34:35
    accidentally leaked addresses.
  • 34:35 - 34:38
    Is that, does that follow our guidelines?
    Is that okay, is that ethical?
  • 34:38 - 34:41
    It’s kind of complicated,
    but I don’t know a way to stop it,
  • 34:41 - 34:46
    and they already have
    the dataset. So, okay, fine.
  • 34:46 - 34:48
    Now a more complicated one.
  • 34:48 - 34:52
    What if you’re Comcast, and
    you spy on all of your users,
  • 34:52 - 34:56
    to find out what their DNS queries are,
    to learn about accidental leakage there.
  • 34:56 - 34:59
    Again, I’m going to say it’s
    complicated, but it’s probably fine,
  • 34:59 - 35:02
    they’re already seeing it,
    there’s nothing we, TOR, can do
  • 35:02 - 35:06
    to change our protocol, to make
    people accidentally leak these things.
  • 35:06 - 35:08
    But then option four:
  • 35:08 - 35:11
    what if you want to learn
    a bunch of onion addresses,
  • 35:11 - 35:13
    so you run new relays in the TOR network,
  • 35:13 - 35:16
    and you sign them up and
    they get into a position
  • 35:16 - 35:18
    where they can learn about onion addresses
  • 35:18 - 35:21
    that are being published,
    and then you make a list internally.
  • 35:21 - 35:25
    So that’s actually not cool, because
  • 35:25 - 35:27
    it’s part of the TOR protocol
  • 35:27 - 35:30
    that people providing onion addresses
  • 35:30 - 35:32
    don’t expect those to become public,
  • 35:32 - 35:34
    and we’ll talk later about
    ways that we have
  • 35:34 - 35:36
    for being able to fix this.
  • 35:36 - 35:40
    So, if it’s a protocol problem
    that we know how to fix,
  • 35:40 - 35:43
    and it’s inside TOR, and
    you’re misbehaving as a relay,
  • 35:43 - 35:46
    then that’s not cool.
    So this is an example where,
  • 35:46 - 35:50
    it’s sort of hard to reason through
    where we should draw the line,
  • 35:50 - 35:52
    and I’d love to chat more
    with you all afterwards
  • 35:52 - 35:56
    about where the line should be.
  • 35:56 - 35:59
    And that leads to an example
  • 35:59 - 36:01
    of some research that was done last year,
  • 36:01 - 36:05
    by, we think, some folks at CMU,
  • 36:05 - 36:07
    who attacked the TOR network,
  • 36:07 - 36:10
    and, as far as I can tell,
    collected a dataset,
  • 36:10 - 36:12
    and they collected more than they needed
  • 36:12 - 36:14
    to answer their research questions.
  • 36:14 - 36:16
    They didn’t do minimization,
  • 36:16 - 36:20
    they didn’t attack only their own traffic,
    they didn’t use a test network,
  • 36:20 - 36:24
    they basically violated every one of
    the guidelines from two slides ago.
  • 36:24 - 36:26
    So that’s sort of a sad story,
  • 36:26 - 36:28
    and that leads to the next question:
  • 36:28 - 36:31
    Should we have some sort
    of TOR ethics review board?
  • 36:31 - 36:34
    Wouldn’t it be cool if, as a researcher,
  • 36:34 - 36:37
    you write up what you’re trying to learn
  • 36:37 - 36:39
    and why it’s safe,
    and how you’re going to do it,
  • 36:39 - 36:43
    and you show that to other professors
    who help you decide whether you’re right,
  • 36:43 - 36:48
    and then we go to the academic
    review journals and conferences,
  • 36:48 - 36:51
    and we get them to expect,
    in your research paper,
  • 36:51 - 36:55
    a little section on why this
    is responsible research.
  • 36:55 - 36:59
    And, at that point, it’s expected
    that you have thought through that,
  • 36:59 - 37:03
    and anybody who writes a
    paper without that section,
  • 37:03 - 37:06
    everybody knows that they haven’t
    thought it through as much as they should.
  • 37:06 - 37:10
    Wouldn’t that be a cool future world
    where research is done more responsibly
  • 37:10 - 37:14
    around TOR, and around
    security more generally?
  • 37:14 - 37:19
    applause
  • 37:19 - 37:22
    Okay. So, there are a couple of problems
  • 37:22 - 37:26
    in TOR Onion Service security right
    now. I’m gonna zip through them briefly
  • 37:26 - 37:29
    so we can actually get to talk about
    them in more detail. The first one is,
  • 37:29 - 37:32
    the onion identity keys are RSA 1024 bits.
  • 37:32 - 37:36
    So, that is too short.
    We need to switch to ECC.
  • 37:36 - 37:39
    Another one is, you, the adversary,
  • 37:39 - 37:42
    can run relays, and
    choose your identity key
  • 37:42 - 37:44
    so that you end up in the
    right place in the network
  • 37:44 - 37:48
    in order to target – censor,
    surveil, whatever –
  • 37:48 - 37:51
    certain onion addresses.
    And we’ll talk more about that also.
  • 37:51 - 37:54
    Another one, I talked about
    that a few slides ago,
  • 37:54 - 37:58
    you can run relays in order to
    learn about new onion addresses,
  • 37:58 - 38:00
    and we’ve got some fixes for that.
  • 38:00 - 38:02
    So those are 3 that
    are onion-address specific,
  • 38:02 - 38:06
    onion-service specific, that we can solve.
  • 38:06 - 38:08
    And then there are 3 issues
    that are much more broad.
  • 38:08 - 38:10
    One of them is,
  • 38:10 - 38:13
    bad guys can run hundreds of relays,
  • 38:13 - 38:16
    and we need to learn how to
    notice that and protect against it.
  • 38:16 - 38:18
    Another one is,
  • 38:18 - 38:20
    you can run relays to learn more about
  • 38:20 - 38:22
    the path selection that
    clients are going to do,
  • 38:22 - 38:26
    and then there’s website fingerprinting.
    All of those are separate talks.
  • 38:26 - 38:28
    I wanted to mention them here,
    I’m happy to talk about them later,
  • 38:28 - 38:32
    but we don’t have time
    to get into them in detail.
  • 38:32 - 38:35
    Okay, phase three, how
    do TOR Hidden Services,
  • 38:35 - 38:38
    TOR Onion Services work right now?
    Just to give you some background,
  • 38:38 - 38:41
    so that when we talk about
    the design improvements,
  • 38:41 - 38:43
    you have a handle on what’s going on.
  • 38:43 - 38:47
    So, we’ve got Alice over here, she
    wants to visit some Hidden Service Bob.
  • 38:47 - 38:50
    The first step is, Bob generates a key,
  • 38:50 - 38:53
    and he establishes 3 introduction points,
  • 38:53 - 38:55
    3 circuits into the TOR network.
  • 38:55 - 38:59
    And then he publishes, to
    the big database in the sky,
  • 38:59 - 39:00
    “Hi, this is my key,
  • 39:00 - 39:03
    and these are my 3 introduction points.”
  • 39:03 - 39:06
    And at that point, Alice somehow
    learns his onion address,
  • 39:06 - 39:10
    and she goes to the database
    and pulls down the descriptor
  • 39:10 - 39:13
    that has his key and the
    3 introduction points
  • 39:13 - 39:16
    and in parallel to that,
    she connects to the –
  • 39:16 - 39:19
    she picks her own rendezvous point,
    and she builds a TOR circuit there.
  • 39:19 - 39:24
    So at this point, Bob has 3 introduction
    points open in the TOR network,
  • 39:24 - 39:28
    and Alice has one rendezvous
    point open in the TOR network.
  • 39:28 - 39:31
    And at that point, Alice connects
    to one of the introduction points
  • 39:31 - 39:35
    and says: “Hey, I want to connect
    to you, and I’m waiting over here.
  • 39:35 - 39:37
    This is the address for
    my rendezvous point.
  • 39:37 - 39:40
    If you want to talk back to me,
    I’m waiting right here.”
  • 39:40 - 39:43
    And then at that point,
    Bob, if he wants to,
  • 39:43 - 39:45
    makes a connection to
    the rendezvous point.
  • 39:45 - 39:47
    So now Alice has a connection
    to the rendezvous point,
  • 39:47 - 39:50
    and Bob has a connection
    to the rendezvous point,
  • 39:50 - 39:52
    and at that point they do
    the crypto handshake
  • 39:52 - 39:55
    so that they get end-to-end encryption,
  • 39:55 - 39:58
    and then, as the last step,
    they’re able to send traffic
  • 39:58 - 40:01
    over that circuit, where Alice has 3 hops,
  • 40:01 - 40:03
    and Bob has three hops,
  • 40:03 - 40:06
    and they’re able to provide
    security from that point.
  • 40:06 - 40:07
    So that’s a very brief summary
  • 40:07 - 40:09
    of how the handshake works,
  • 40:09 - 40:12
    in Hidden Service land.
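The handshake just described can be followed step by step in a toy model. Plain Python stands in for circuits and relays here; none of this is TOR's real code or wire format, and all names are illustrative:

```python
import secrets

HSDIR_DB = {}   # the "big database in the sky": onion address -> descriptor

def bob_publish():
    """Bob generates a key, opens 3 introduction points, publishes both."""
    key = secrets.token_hex(16)
    onion = key[:16] + ".onion"
    intro_points = ["relay-%d" % secrets.randbelow(8000) for _ in range(3)]
    HSDIR_DB[onion] = {"key": key, "intro_points": intro_points}
    return onion

def alice_introduce(onion):
    """Alice fetches the descriptor, picks her own rendezvous point,
    and leaves a one-time cookie at one of Bob's introduction points."""
    desc = HSDIR_DB[onion]
    return {
        "intro_point": desc["intro_points"][0],
        "rendezvous_point": "relay-%d" % secrets.randbelow(8000),
        "cookie": secrets.token_hex(8),
    }

def bob_rendezvous(request):
    """If Bob wants to talk, he builds his own 3-hop circuit to Alice's
    rendezvous point and presents the same cookie; after the crypto
    handshake, traffic flows over the resulting 6-hop path."""
    return request["rendezvous_point"], request["cookie"]
```

The cookie is what lets the rendezvous point match Alice's waiting circuit with Bob's incoming one without learning anything about either side.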
  • 40:12 - 40:16
    So, in the previous slide, I was talking
    about this database in the sky.
  • 40:16 - 40:19
    Once upon a time, that
    was just 3 computers.
  • 40:19 - 40:21
    I ran 2 of them.
  • 40:21 - 40:24
    Another directory authority
    operator ran the third.
  • 40:24 - 40:28
    And then, we switched to distributing
    that over the entire TOR network,
  • 40:28 - 40:31
    so there are 8000 relays…
  • 40:31 - 40:37
    So imagine the hash of each relay’s
    identity key on this hash ring.
  • 40:37 - 40:40
    There are 6 relays at any given point
  • 40:40 - 40:43
    that are responsible for knowing
  • 40:43 - 40:45
    where a given Onion Service is.
  • 40:45 - 40:48
    So, when I’m running my own Onion Service,
  • 40:48 - 40:50
    I compute which 6 relays they are,
  • 40:50 - 40:52
    and I publish my descriptor to it,
  • 40:52 - 40:55
    and when I’m the client
    and I want to go there,
  • 40:55 - 40:57
    I compute which 6 relays they are,
  • 40:57 - 41:00
    and I go to any one of them,
    and I can fetch the descriptor.
  • 41:00 - 41:03
    So, the way that we actually generate
  • 41:03 - 41:08
    the predictable set of
    which relays they are,
  • 41:08 - 41:11
    is this hash function up on the top.
    So you look at the onion address,
  • 41:11 - 41:14
    you look at what day
    it is, the time period,
  • 41:14 - 41:16
    and other things that are pretty static.
  • 41:16 - 41:19
    So that’s how both sides can compute
  • 41:19 - 41:21
    which relays they should go to
  • 41:21 - 41:26
    when they’re publishing
    or fetching a descriptor.
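That computation can be sketched as follows. The exact fields of TOR's real formula differ (it also hashes things like a replica number and descriptor cookie), but the shape is the same: hash the onion address together with the time period, land on a point of the hash ring, and take the next relays clockwise, with two replicas of 3 relays giving the 6:

```python
import hashlib
from bisect import bisect_right

def ring_position(identity: bytes) -> int:
    """A relay's fixed position on the hash ring."""
    return int.from_bytes(hashlib.sha1(identity).digest(), "big")

def responsible_hsdirs(onion_address, time_period, relay_identities):
    """Deterministically pick the 6 responsible relays (2 replicas x 3)."""
    ring = sorted(relay_identities, key=ring_position)
    positions = [ring_position(r) for r in ring]
    chosen = []
    for replica in (0, 1):
        h = hashlib.sha1(("%s|%d|%d" % (onion_address, time_period,
                                        replica)).encode()).digest()
        start = bisect_right(positions, int.from_bytes(h, "big"))
        for k in range(3):              # next 3 relays clockwise
            chosen.append(ring[(start + k) % len(ring)])
    return chosen
```

Both the service (publishing) and the client (fetching) run this over the same consensus, so they agree on the same 6 relays without ever coordinating directly.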
  • 41:27 - 41:32
    George: Okay, so, back to
    the security issues again.
  • 41:32 - 41:34
    A few years ago, like 2 or 3 years ago,
  • 41:34 - 41:36
    we started looking into Hidden Services
  • 41:36 - 41:39
    and enumerating the various problems.
  • 41:39 - 41:41
    Various people wrote papers about
  • 41:41 - 41:44
    some open issues of security and stuff.
  • 41:44 - 41:48
    So in 2013 we wrote the first proposal
  • 41:48 - 41:51
    called next generation Hidden Services,
  • 41:51 - 41:55
    which basically details various
    ways for improving security,
  • 41:55 - 41:58
    better crypto, blah blah
    blah, blah blah blah.
  • 41:58 - 42:02
    This happened like 2 years ago,
    and we still have not
  • 42:02 - 42:06
    started heavily developing,
    because of the lack of,
  • 42:06 - 42:09
    basically, developers, since
    Hidden Services had been
  • 42:09 - 42:14
    a largely volunteer-driven project
    until like a year ago or so.
  • 42:14 - 42:17
    So everything was done on
    our spare time, basically.
  • 42:17 - 42:21
    But we’ve been writing proposals,
  • 42:21 - 42:24
    we’ve been active anyway.
  • 42:24 - 42:28
    We’re going to start looking over
    the various security issues
  • 42:28 - 42:31
    and the ways to fix them, let’s say.
  • 42:31 - 42:35
    The first one, we call it
    HSDir predictability,
  • 42:35 - 42:39
    and it touches the subject
    that Roger mentioned,
  • 42:39 - 42:40
    the database in the sky,
  • 42:40 - 42:43
    which is basically that
    when a Hidden Service…
  • 42:43 - 42:47
    A Hidden Service at any given time has
    6 Hidden Service directories,
  • 42:47 - 42:51
    6 relays of the network
    being responsible for it.
  • 42:51 - 42:55
    So every day, each Hidden
    Service has 6 relays
  • 42:55 - 42:58
    responsible for it, and it chooses them
  • 42:58 - 43:01
    using this weird hash formula there.
  • 43:01 - 43:04
    Which, if you can see, it’s deterministic
  • 43:04 - 43:07
    so all of its inputs are static,
    apart from the time period
  • 43:07 - 43:09
    which rotates every day.
  • 43:09 - 43:12
    But the problem is that,
    because it’s deterministic,
  • 43:12 - 43:16
    you can basically plug in
    the onion address you want
  • 43:16 - 43:17
    and the time period in the future
  • 43:17 - 43:20
    and you can basically predict
    the result of that function,
  • 43:20 - 43:22
    in like, 2 months from now.
  • 43:22 - 43:25
    So, you can basically know
  • 43:25 - 43:29
    which relays will be responsible
    for a Hidden Service in 2 months
  • 43:29 - 43:33
    and, if you’re a bad guy, maybe you’ll
    go and inject yourself into that place
  • 43:33 - 43:36
    and you will become the
    HSDir of a Hidden Service
  • 43:36 - 43:39
    and then you can,
    like, monitor its activity
  • 43:39 - 43:42
    or you can do DoS attacks.
  • 43:42 - 43:44
    So this is not something we like, and
  • 43:44 - 43:48
    we will attempt to fix it.
  • 43:48 - 43:51
    Our idea for fixing it is that
    we will have to turn it
  • 43:51 - 43:55
    from a deterministic into
    a probabilistic thing,
  • 43:55 - 43:59
    and we do this by adding
    a random value in it.
  • 43:59 - 44:02
    And this random value is
    basically a fresh random value
  • 44:02 - 44:06
    that the network is going to
    be generating every day.
  • 44:06 - 44:11
    So, how we do this is we use these 9
    directory authorities from the network,
  • 44:11 - 44:14
    they’re these 9 computers, they’re
    hardcoded in the source code
  • 44:14 - 44:17
    and they’re considered semi-trusted.
  • 44:17 - 44:19
    And basically we wrote a protocol
  • 44:19 - 44:23
    where all these 9 directory
    authorities do a little dance
  • 44:23 - 44:28
    and at the end of the day, they have
    a fresh random value every day.
  • 44:28 - 44:32
    It’s not something new, it uses
    a commit-and-reveal protocol,
  • 44:32 - 44:35
    which is some way to do some sort of
  • 44:35 - 44:38
    distributed random number generation.
  • 44:38 - 44:41
    And then every day they
    make this random value,
  • 44:41 - 44:42
    they put it in the consensus,
  • 44:42 - 44:46
    and then Hidden Services
    and Hidden Service users
  • 44:46 - 44:48
    take the random value from the consensus,
  • 44:48 - 44:51
    plug it into that formula,
    and they use it.
  • 44:51 - 44:54
    And since this is a
    supposedly secure protocol,
  • 44:54 - 44:58
    I shouldn’t be able to predict what the
    random value is going to be in 2 months.
  • 44:58 - 45:00
    Hence, I cannot go and inject myself
  • 45:00 - 45:04
    in that position in the
    database, basically.
  • 45:04 - 45:09
    And this is the way we fix this problem.
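A toy version of such a commit-and-reveal round looks like this. TOR's actual protocol (described in proposal 250) additionally handles aborts, missing reveals, and the voting schedule; this sketch keeps only the core idea:

```python
import hashlib
import secrets

def commit_phase(n_authorities=9):
    """Each authority picks a secret and publishes only its hash, so
    nobody can change their value after seeing the others' values."""
    reveals = [secrets.token_bytes(32) for _ in range(n_authorities)]
    commits = [hashlib.sha256(r).digest() for r in reveals]
    return commits, reveals

def reveal_phase(commits, reveals):
    """Check every reveal against its commitment, then mix them all
    into the day's shared random value."""
    for c, r in zip(commits, reveals):
        if hashlib.sha256(r).digest() != c:
            raise ValueError("authority cheated: reveal != commit")
    h = hashlib.sha256()
    for r in sorted(reveals):   # fixed order: everyone derives one value
        h.update(r)
    return h.digest()
```

The resulting value goes into the consensus every day, and as long as at least one authority's contribution is unpredictable, so is the final value, which is what breaks the HSDir prediction attack.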
  • 45:11 - 45:16
    David: Right, continuing now on the
    next generation Hidden Services.
  • 45:16 - 45:18
    The key thing is better crypto.
  • 45:18 - 45:21
    Right now we have RSA-1024 and SHA-1,
  • 45:21 - 45:24
    which are considered completely bad to use
  • 45:24 - 45:26
    and of course we’re going to use
  • 45:26 - 45:30
    the work of this fancy man there, Daniel.
  • 45:30 - 45:34
    So it’s basically Ed25519
    for encrypting and signing
  • 45:34 - 45:37
    and of course, using SHA-256.
  • 45:37 - 45:42
    Right now, in TOR, we are experimenting with
    an upstream implementation of SHA-3.
  • 45:42 - 45:47
    Although, apparently, we’re not sure
    if SHA-256 or SHA-3 will be used, but
  • 45:47 - 45:51
    right now SHA-256 is a contender
    because it goes way faster than SHA-3.
  • 45:51 - 45:56
    SHA-3 has more things. Still pending.
    But, for now, elliptic curves.
  • 45:56 - 46:00
    So, what to take of that is:
    next generation Hidden Services,
  • 46:00 - 46:04
    which we are actively
    working on right now,
  • 46:04 - 46:08
    will drop all dead crypto.
  • 46:08 - 46:10
    One of the big changes that’s coming up
  • 46:10 - 46:13
    is the onion addresses.
  • 46:13 - 46:16
    On top you have the current onion address
  • 46:16 - 46:18
    and it’s going to move to 52 characters,
  • 46:18 - 46:21
    so, basically, your public key.
  • 46:21 - 46:24
    This is maybe considered
    painful for you guys,
  • 46:24 - 46:26
    and for us it’s extremely painful
  • 46:26 - 46:30
    to enter this address or just to
    type in an address like that,
  • 46:30 - 46:32
    so, you know, there’s an open proposal right now,
  • 46:32 - 46:35
    or I think there’s an email
    thread on our mailing list
  • 46:35 - 46:39
    on coming up with some more fancy way
  • 46:39 - 46:44
    to remember an onion address that size.
  • 46:44 - 46:49
    Like remembering words that just
    mash together and create that address.
  • 46:49 - 46:51
    So, yeah.
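The 52 characters are simply the base32 encoding of a 32-byte public key: 256 bits at 5 bits per character rounds up to 52 once the padding is stripped. A sketch (note that the v3 format as later deployed also appends a checksum and a version byte, making real addresses somewhat longer):

```python
import base64
import secrets

def onion_label(public_key: bytes) -> str:
    """Base32-encode a 32-byte key into the 52-character label."""
    assert len(public_key) == 32
    return base64.b32encode(public_key).decode().rstrip("=").lower()

# A random stand-in for an Ed25519 public key:
address = onion_label(secrets.token_bytes(32)) + ".onion"
```

Since the address is the key itself, there is no separate lookup step that could hand out the wrong key, which is exactly why it cannot be made shorter for free.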
  • 46:51 - 46:55
    Now, as Hidden Services evolve,
  • 46:55 - 46:59
    one of the things we really
    want to do is make them faster.
  • 46:59 - 47:02
    The big difference between
    Hidden Services in the network
  • 47:02 - 47:04
    and a normal TOR circuit
    in the network is that
  • 47:04 - 47:06
    normal TOR circuits usually have 3 hops,
  • 47:06 - 47:08
    while in Hidden Services you
    have 3 hops from the client
  • 47:08 - 47:12
    and 3 hops from the Service.
    Thus you have 6 hops.
  • 47:12 - 47:16
    So of course it takes much more time
    to go through all the relays
  • 47:16 - 47:20
    than the normal circuit. Now we have
    this proposal going on which is
  • 47:20 - 47:24
    Rendezvous Single Onion Services. And
    the point is, you’re gonna do the dance,
  • 47:24 - 47:28
    the introduction, the rendezvous, and
    then once you go to the rendezvous,
  • 47:28 - 47:33
    instead of the service going 3 hops,
    you’re gonna go 1 hop to the service.
  • 47:33 - 47:38
    So in here we have this artist wanting
    to update their Debian machine,
  • 47:38 - 47:41
    let’s say that, and the Debian
    server doesn’t care really much
  • 47:41 - 47:44
    about anonymity, because we know
    where the Debian servers are,
  • 47:44 - 47:48
    and that’s fine. Thus, clients still
    have anonymity with the 3 hops,
  • 47:48 - 47:53
    then the service doesn’t care, so
    only 1 hop. And it goes way faster.
  • 47:53 - 47:58
    So that’s something we hopefully
    will end up deploying soon.
  • 47:58 - 48:02
    Now, the second one is the Single
    Onion Service. It’s roughly the same,
  • 48:02 - 48:07
    so we have this chef here wanting to go to
    his Fairtrade website, you know, whatever,
  • 48:07 - 48:12
    and the difference here is that
    we are going to skip completely
  • 48:12 - 48:15
    the introduction and rendezvous dance.
  • 48:15 - 48:18
    And you’re going to do a 3 hop
    circuit to a rendezvous point
  • 48:18 - 48:24
    where the Hidden Service connects to,
    let’s say, let’s call it a node,
  • 48:24 - 48:27
    an introduction point,
    which is the yellow line,
  • 48:27 - 48:30
    and then the client will go
    to that introduction point,
  • 48:30 - 48:33
    and instead of having this current dance
  • 48:33 - 48:35
    the client extends to the service.
  • 48:35 - 48:38
    And now we have a 3 hop thing,
  • 48:38 - 48:43
    no prior work being done
    for introduction or rendezvous,
  • 48:43 - 48:45
    and it goes way faster.
  • 48:45 - 48:49
    Again, those 2 here and here
  • 48:49 - 48:54
    are optimizations for services
    that do not care about anonymity.
  • 48:54 - 48:58
    And there are plenty of use cases
    for that. Facebook, for instance,
  • 48:58 - 49:02
    or Debian repositories, and so on.
  • 49:02 - 49:05
    Roger: Am I still on? Great. Facebook
    and Debian are really excited about
  • 49:05 - 49:09
    having one of these options so that
    they can have all of their users
  • 49:09 - 49:13
    reach their service with all the cool
    security properties we talked about
  • 49:13 - 49:16
    but also a lot faster and more scalable
  • 49:16 - 49:19
    than the current design.
  • 49:19 - 49:24
    David: Precisely. So, one of the
    very cool things we did this summer
  • 49:24 - 49:28
    was the TOR summer of privacy.
    We got some people in,
  • 49:28 - 49:32
    which we can consider
    interns or students, whatever,
  • 49:32 - 49:38
    and one of these projects – that came
    out of this cool person, Donncha –
  • 49:38 - 49:42
    created OnionBalance. So it’s a way
    of load-balancing Hidden Services.
  • 49:42 - 49:46
    So as you create a Hidden
    Service with the top key,
  • 49:46 - 49:48
    then you copy that key
    to multiple machines
  • 49:48 - 49:52
    so all those servers will
    start creating a descriptor.
  • 49:52 - 49:56
    Basically the descriptor, if you
    can remember, is how to reach me.
  • 49:56 - 50:00
    And we’re going to cherry-pick
    introduction points from each
  • 50:00 - 50:04
    and create a master descriptor
    that you can see in that picture;
  • 50:04 - 50:07
    and that master descriptor
    is what clients will use.
  • 50:07 - 50:08
    Thus you load-balance the network
  • 50:08 - 50:12
    depending on introduction
    points and the instance.
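A hypothetical sketch of that cherry-picking step. The instance names and introduction-point labels are made up; real OnionBalance operates on parsed Tor descriptors, not strings:

```python
# Hypothetical sketch of OnionBalance's merging step: each backend
# instance publishes a descriptor listing its introduction points,
# and the frontend cherry-picks a few from each instance into one
# master descriptor that clients will fetch. All names are made up.
instance_intro_points = {
    "backend-1": ["intro-A", "intro-B", "intro-C"],
    "backend-2": ["intro-D", "intro-E", "intro-F"],
}

master_descriptor = []
for instance, intros in sorted(instance_intro_points.items()):
    # Take a couple of introduction points from each instance so
    # client traffic spreads across the backends.
    master_descriptor.extend(intros[:2])

print(master_descriptor)  # ['intro-A', 'intro-B', 'intro-D', 'intro-E']
```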
  • 50:12 - 50:14
    And this is great! And we actually know
  • 50:14 - 50:18
    that Facebook will actively
    start beta-testing this thing
  • 50:18 - 50:21
    so we can have load-balancing and CDNs
  • 50:21 - 50:27
    and make things much easier for onion addresses.
  • 50:27 - 50:30
    So, just before I give
    these slides to Roger,
  • 50:30 - 50:33
    the next generation Onion
    Services is something
  • 50:33 - 50:35
    that has been around for 2 years now
  • 50:35 - 50:40
    and now we’re going to start
    working on it actively in 2016,
  • 50:40 - 50:43
    with almost 4 full-time
    developers on that.
  • 50:43 - 50:46
    It’s still not enough, we need more… we
    need resources because we need to get away
  • 50:46 - 50:49
    from funding that restricts us
    from working on Onion Services,
  • 50:49 - 50:52
    which we have a bit of now.
    So resources are very, very important
  • 50:52 - 50:55
    so we can build this thing
    that’s extremely important.
  • 50:55 - 51:00
    And in the next year, we hope
    to get this thing here done.
  • 51:00 - 51:07
    applause
  • 51:07 - 51:11
    Roger: Great, so there are a couple
    of important takeaways from
  • 51:11 - 51:13
    what we’ve been describing to you today.
  • 51:13 - 51:16
    One big one is, there are a lot of
  • 51:16 - 51:18
    different types of Onion Services out there.
  • 51:18 - 51:20
    There are a lot more than people think.
  • 51:20 - 51:23
    Everybody looks at Hidden
    Services and says: “Oh,
  • 51:23 - 51:26
    they’re for websites that the government
    hates” or something like that.
  • 51:26 - 51:30
    But there are examples
    like Ricochet, like Facebook,
  • 51:30 - 51:32
    like GlobaLeaks, like SecureDrop.
  • 51:32 - 51:35
    All of these different examples are
  • 51:35 - 51:38
    cool things you can do
    with better security properties
  • 51:38 - 51:43
    for your communication. So it’s not
    about hiding where the website is,
  • 51:43 - 51:45
    it’s about getting more secure ways
  • 51:45 - 51:49
    of reaching websites and
    other services around the world.
  • 51:49 - 51:53
    So another key point: this is still a
    tiny fraction of the overall TOR network.
  • 51:53 - 51:55
    We have millions of people
    using TOR every day,
  • 51:55 - 51:59
    and something like 5% of the
    traffic through the TOR network
  • 51:59 - 52:02
    is Hidden-Service,
    or Onion-Service related.
  • 52:02 - 52:05
    So it was 3% last year, it’s 5% now,
  • 52:05 - 52:08
    it’s going up, sounds good,
    but it’s still a tiny fraction.
  • 52:08 - 52:12
    And maybe that’s good, because when
    you’re using an Onion Service right now,
  • 52:12 - 52:14
    you put double load on the TOR network,
  • 52:14 - 52:17
    because both sides add their own circuit.
  • 52:17 - 52:21
    Whereas if we switch to some of these
    designs that David was talking about,
  • 52:21 - 52:24
    then it will be much more scalable
    and much more efficient
  • 52:24 - 52:29
    and it would be really cool to
    have Amazon, and Facebook,
  • 52:29 - 52:32
    and Twitter, and Wikipedia, and so on,
  • 52:32 - 52:37
    all allowing people to get more
    security, while using TOR,
  • 52:37 - 52:42
    while protecting places like Facebook
    from learning where they are today.
  • 52:42 - 52:46
    Another key point, we got all these
    cool designs that we touched on briefly,
  • 52:46 - 52:49
    we’d be happy to tell you
    more about them after the talk,
  • 52:49 - 52:53
    and then the last point: if you
    run a cool service out there,
  • 52:53 - 52:55
    please set up an onion version of it,
  • 52:55 - 52:58
    please set up an onion
    address for your cool service,
  • 52:58 - 53:01
    so that the typical average
    onion service in the world
  • 53:01 - 53:05
    becomes a totally normal
    website or other service
  • 53:05 - 53:09
    that totally ordinary people go to. And
    that’s how we will mainstream this thing
  • 53:09 - 53:12
    and take over the world.
  • 53:12 - 53:19
    applause
  • 53:19 - 53:22
    applause
  • 53:22 - 53:24
    And then, as a final point,
  • 53:24 - 53:27
    we are in the middle of our
    first ever donation campaign.
  • 53:27 - 53:30
    We are actually trying to grow
  • 53:30 - 53:33
    a base of people who want to support TOR
  • 53:33 - 53:35
    in the same way that EFF has done.
  • 53:35 - 53:39
    So it would be wonderful… I don’t want to
    throw away all of our Government funders,
  • 53:39 - 53:41
    at least not right now, but I
    would like to get to the point
  • 53:41 - 53:45
    where we have other options,
    more sustainability,
  • 53:45 - 53:48
    and we don’t have to look at
    each new funding proposal
  • 53:48 - 53:51
    and wonder if we have to
    un-fund people if we don’t get it.
  • 53:51 - 53:53
    So I’d love to have much more diversity
  • 53:53 - 53:56
    in the type of people who
    are helping TOR to exist
  • 53:56 - 53:59
    and thrive and help save the world.
  • 53:59 - 54:01
    So, please consider helping.
  • 54:01 - 54:08
    applause
  • 54:08 - 54:12
    Herald: Thanks a lot for this awesome
    talk, we have 6 minutes left for questions
  • 54:12 - 54:14
    and please line up at the microphones
  • 54:14 - 54:17
    1, 2, 3, 4, 5, and 6 down here.
  • 54:17 - 54:19
    And while you are doing that,
  • 54:19 - 54:22
    we would like to hear a
    question from the Internet.
  • 54:22 - 54:24
    Signal Angel: Thank you. I have
    a bunch of questions regarding
  • 54:24 - 54:28
    compromised onion addresses.
    Herald: Start with one.
  • 54:28 - 54:31
    Signal: Do we have a
    kind of evil twin problem
  • 54:31 - 54:36
    and what can I do if my onion
    address is compromised?
  • 54:36 - 54:38
    When there are widespread services
  • 54:38 - 54:42
    on the TOR net, like Amazon,
  • 54:42 - 54:46
    how do I know which
    one is the official service?
  • 54:46 - 54:47
    Roger: So the first question, of,
  • 54:47 - 54:51
    if your onion key gets stolen
    or something like that,
  • 54:51 - 54:54
    that’s the same as the SSL problem.
  • 54:54 - 54:57
    How do you keep your certificate
    for your web server safe?
  • 54:57 - 55:01
    The answer is: you should keep it safe
    just like you keep everything else safe.
  • 55:01 - 55:05
    And if somebody gets the key for
    your onion address, sucks to be you.
  • 55:05 - 55:08
    Don’t let them do that.
    For the second question,
  • 55:08 - 55:11
    how do you know that a given
    onion address is Amazon’s?
  • 55:11 - 55:13
    That ties into the certificate authority,
  • 55:13 - 55:18
    the https, the EV cert discussion
    that we talked about
  • 55:18 - 55:23
    where we need to somehow bind
    in Amazon’s SSL certificate
  • 55:23 - 55:28
    the fact that it’s Amazon, and
    this is their alternate onion address.
  • 55:28 - 55:30
    We need to put those
    in the same certificate,
  • 55:30 - 55:32
    so that everybody knows
    if you’re getting one
  • 55:32 - 55:35
    then you know it’s really Amazon.
  • 55:35 - 55:39
    H: Thank you. I would like to hear
    the question from microphone 1
  • 55:39 - 55:41
    and remember to keep it short and concise.
  • 55:41 - 55:44
    Q: Again, the addressing issue,
  • 55:44 - 55:47
    switching from 16 to 52 characters is nice
  • 55:47 - 55:52
    but if we have to change
    the algorithm again
  • 55:52 - 55:55
    wouldn't it be nice
    to have, like, a prefix
  • 55:55 - 56:00
    to determine the
    algorithm for the address?
  • 56:00 - 56:02
    Roger: Yes, we actually have
    a couple of extra bits
  • 56:02 - 56:05
    in those 52 characters and we could use them
  • 56:05 - 56:07
    for versioning or all sorts of things.
  • 56:07 - 56:11
    And there are some examples
    of that in the proposals,
  • 56:11 - 56:13
    I don’t think we’ve fixed
    on any answer yet.
  • 56:13 - 56:17
    So, we’d love to have your help.
  • 56:17 - 56:20
    H: Thank you. Please, microphone number 3.
  • 56:20 - 56:22
    Q: Hey, you gave us a couple of examples
  • 56:22 - 56:25
    from Facebook, and I was just wondering
  • 56:25 - 56:28
    if there’s any sort of affiliation between
  • 56:28 - 56:32
    Facebook and TOR, or Facebook
    just happens to be really keen
  • 56:32 - 56:37
    on offering their services
    in less democratic jurisdictions?
  • 56:37 - 56:39
    Roger: There’s a nice
    guy named Alec up here,
  • 56:39 - 56:43
    who on his own thought of making
    Facebook more secure using TOR,
  • 56:43 - 56:46
    and he went and did it, and then
    we realized that he was right,
  • 56:46 - 56:49
    so we’ve been trying
    to help him ever since.
  • 56:49 - 56:53
    applause
  • 56:53 - 56:58
    H: Thanks a lot. Microphone number 4.
  • 56:58 - 57:01
    Q: You said that you want
    more and more people
  • 57:01 - 57:04
    to run Hidden Services.
  • 57:04 - 57:06
    My question for this is,
  • 57:06 - 57:10
    are there any guidelines
    on how to do that?
  • 57:10 - 57:13
    Examples of how to do
    it with specific services?
  • 57:13 - 57:17
    Because from what I’ve seen, I tried
    doing this for some of the services I run,
  • 57:17 - 57:21
    that it’s not… it’s painful,
    most of the time.
  • 57:21 - 57:24
    And with the increase of
    the addresses right now,
  • 57:24 - 57:28
    the size of the addresses is going to
    become more and more painful. And
  • 57:28 - 57:32
    one of the things that
    I’d love to see for release,
  • 57:32 - 57:36
    for example, a DNS record
    type that’s .onion
  • 57:36 - 57:40
    that you can add for your
    normal DNS record
  • 57:40 - 57:43
    so people can just look into that and
  • 57:43 - 57:47
    choose between A, AAAA,
    and onion to connect to it.
  • 57:47 - 57:49
    Roger: Yeah. For that last
    one, for the DNS side,
  • 57:49 - 57:52
    if we have DNSSEC, thumbs up.
  • 57:52 - 57:55
    If we don’t have DNSSEC,
    I don’t want to have
  • 57:55 - 57:59
    that terrible security link
    as one of the first steps.
  • 57:59 - 58:01
    I don’t want to trust
    the local DNS resolver
  • 58:01 - 58:03
    to tell me I can go somewhere else
  • 58:03 - 58:06
    and then after that, if I get
    the right address, I’m safe.
  • 58:06 - 58:08
    That sounds terrible.
  • 58:08 - 58:12
    George: So, on how to set up
    Hidden Services correctly,
  • 58:12 - 58:16
    I think Riseup recently published
    some sort of guidelines
  • 58:16 - 58:20
    with various ways you can tweak
    it and make it more secure.
  • 58:20 - 58:23
    I think there are also
    some on the TOR Wiki.
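For the basic case, the torrc side is just two directives; the directory path and ports below are examples, assuming a web server already listening locally on port 8080:

```
HiddenServiceDir /var/lib/tor/my_service/
HiddenServicePort 80 127.0.0.1:8080
```

On restart, Tor generates the key and writes the onion address into `hostname` inside that directory. The hard part the questioner is pointing at is everything around this: not leaking the real IP through the service behind it.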
  • 58:23 - 58:27
    But in general you’re right that there are
    various ways you can mess up this thing
  • 58:27 - 58:30
    and it’s not super easy for anyone here
  • 58:30 - 58:33
    to set up a Hidden Service right now,
  • 58:33 - 58:35
    and hopefully in the
    future we will be able
  • 58:35 - 58:39
    to have an easier way for people
    to set up Hidden Services,
  • 58:39 - 58:42
    maybe provide a bundle
    that you double-click
  • 58:42 - 58:45
    and it spins up a
    blog, or a Docker image,
  • 58:45 - 58:50
    I don’t know. It’s still one of the
    things we really need to look into.
  • 58:50 - 58:53
    Roger: It would be really cool to
    have a server version of Tails
  • 58:53 - 58:56
    that has all of this built in
    with a, like, Python web server
  • 58:56 - 59:00
    that’s hard to break into and
    automatically configured safely.
  • 59:00 - 59:04
    That would be something that would
    make a lot of people able to do this
  • 59:04 - 59:08
    more conveniently and not
    screw up when they’re doing it.
  • 59:08 - 59:13
    H: Okay!
    applause
  • 59:13 - 59:17
    So we have a bit of less than 1 minute
    left, so I would say “Last question”
  • 59:17 - 59:20
    and let’s say microphone
    number 3 for that.
  • 59:20 - 59:23
    Q: Hello, I have two small questions.
    H: No, one.
  • 59:23 - 59:25
    Q: One, okay.
    laughter
  • 59:25 - 59:28
    So, I noticed you
    have some semi-trusted
  • 59:28 - 59:31
    assumptions for the random
    number generation
  • 59:31 - 59:33
    for your relays. Did you consider,
  • 59:33 - 59:37
    or do you think there’s some merit
    in using the Bitcoin blockchain
  • 59:37 - 59:39
    to generate randomness?
  • 59:39 - 59:42
    George: We considered it. We considered
    using the Bitcoin blockchain,
  • 59:42 - 59:47
    the NIST beacon, all these things, but
    there are various engineering issues
  • 59:47 - 59:51
    like to use the blockchain thing you
    need to have 2 verified Merkle trees.
  • 59:51 - 59:54
    This needs to be coded
    on the TOR codebase.
  • 59:54 - 59:58
    You also depend on Bitcoin, which is
    quite a powerful system, to be honest,
  • 59:58 - 60:02
    but you probably don’t want to
    depend on outside systems, so…
  • 60:02 - 60:04
    We really considered it, though.
  • 60:04 - 60:06
    Q: Thank you.
  • 60:06 - 60:09
    H: Thanks a lot for this
    Q&A, I think you will…
  • 60:09 - 60:14
    applause
  • 60:14 - 60:17
    applause
  • 60:17 - 60:21
    I think you will stick around and be
    available for another question-and-answer,
  • 60:21 - 60:25
    more personal, after the
    next upcoming talk, which is
  • 60:25 - 60:28
    “State of the Onion”, in 15 minutes.
  • 60:28 - 60:33
    postroll music
  • 60:33 - 60:38
    Subtitles created by c3subtitles.de
    in 2016. Join and help us do more!