1
00:00:00,000 --> 00:00:10,019
silent 31C3 preroll titles
2
00:00:10,019 --> 00:00:15,509
applause
3
00:00:15,509 --> 00:00:19,400
Roger: Okay, hi everybody! I’m Roger
Dingledine, and this is Jake Appelbaum.
4
00:00:19,400 --> 00:00:21,910
And we’re here to tell you more
about what’s going on with Tor
5
00:00:21,910 --> 00:00:26,070
over the past year. We actually wanted
to start out asking Laura to give us
6
00:00:26,070 --> 00:00:29,540
a little bit of context from her
perspective, about Citizenfour,
7
00:00:29,540 --> 00:00:33,510
and the value of these sorts
of tools to journalists.
8
00:00:33,510 --> 00:00:39,530
applause
9
00:00:39,530 --> 00:00:46,150
Laura: So. Am I live? Okay. Roger and Jake
asked me to say a few things about Tor,
10
00:00:46,150 --> 00:00:49,660
and what it means for investigative
journalists. And I can say that certainly
11
00:00:49,660 --> 00:00:54,020
the work that I’ve done working with
disclosures by Edward Snowden, and
12
00:00:54,020 --> 00:00:58,600
first communicating with him, would not
have been possible without the work
13
00:00:58,600 --> 00:01:03,340
that these 2 people do. And that everybody
[does] who contributes to the Tor network.
14
00:01:03,340 --> 00:01:06,460
So I’m deeply grateful to everyone here.
15
00:01:06,460 --> 00:01:12,950
applause
16
00:01:12,950 --> 00:01:17,050
When I was communicating with Snowden
for several months before I met him
17
00:01:17,050 --> 00:01:21,520
in Hong Kong, we talked often about the Tor
network, and it’s something that actually
18
00:01:21,520 --> 00:01:26,780
he feels is vital for online
privacy. And, to sort of
19
00:01:26,780 --> 00:01:31,299
defeat surveillance, it’s really our
only tool to be able to do that. And
20
00:01:31,299 --> 00:01:35,610
I just wanted to tell one story about what
happens when journalists don’t use it.
21
00:01:35,610 --> 00:01:39,159
I can’t go into lots of details, but
there’s a very well known investigative
22
00:01:39,159 --> 00:01:42,969
journalist who was working on a story.
He had a source. And the source was
23
00:01:42,969 --> 00:01:48,060
in the Intelligence community. And he had
done some research on his computer,
24
00:01:48,060 --> 00:01:53,460
not using Tor. And I was with him when
he got a phone call. And on the phone,
25
00:01:53,460 --> 00:01:57,490
the person was saying: “What the fuck were
you doing looking up this, this and this?”
26
00:01:57,490 --> 00:02:00,640
And this is an example of what
happens when Intelligence agencies
27
00:02:00,640 --> 00:02:05,030
target journalists. So without Tor
we literally can’t do the work that
28
00:02:05,030 --> 00:02:09,218
we need to do. So thank you,
and please support Tor! Thanks!
29
00:02:09,218 --> 00:02:11,058
applause
30
00:02:11,058 --> 00:02:17,180
Roger: Well, thank you!
continued applause
31
00:02:17,180 --> 00:02:22,100
Jacob: So to follow up on what Laura
has just said: We think it’s important
32
00:02:22,100 --> 00:02:25,970
to really expand, not just into the
technical world, or to talk about
33
00:02:25,970 --> 00:02:31,000
the political issues in some abstract
sense. But also to reach out to culture.
34
00:02:31,000 --> 00:02:34,100
So in this case, this is a picture in the
Reina Sofia, which is one of the largest
35
00:02:34,100 --> 00:02:37,650
museums in Spain. And in the middle
is Mason Juday, and Trevor Paglen,
36
00:02:37,650 --> 00:02:41,730
and that’s me on the right. And the only
time you’ll ever find me on the right!
37
00:02:41,730 --> 00:02:46,930
And so it is the case that this is
a Tor relay. It’s actually 2 Tor relays
38
00:02:46,930 --> 00:02:51,350
running on the open hardware device
Novena, made by bunnie and Sean.
39
00:02:51,350 --> 00:02:55,360
And it’s actually running as a middle
relay now, but it may at some point
40
00:02:55,360 --> 00:02:59,659
with one configuration change become
an exit relay. And it is the case that
41
00:02:59,659 --> 00:03:06,260
the Reina Sofia is hosting this Tor relay.
So, now, if… so we live in capitalism…
42
00:03:06,260 --> 00:03:11,209
applause
43
00:03:11,209 --> 00:03:14,900
So it is the case that if the Police wanna
seize this relay they’ve got to buy it
44
00:03:14,900 --> 00:03:17,220
like every other piece
of art in the museum.
45
00:03:17,220 --> 00:03:24,340
laughter and applause
46
00:03:24,340 --> 00:03:27,299
And part of the reason that we’re
doing this kind of stuff – at least
47
00:03:27,299 --> 00:03:31,480
that piece of art which I did with Trevor
and Mason and Leif Ryge who is also
48
00:03:31,480 --> 00:03:35,519
in this room, and Aaron Gibson, also in
this room – is because we think that
49
00:03:35,519 --> 00:03:39,780
culture is important. And we think that
it’s important to treat the issue of anonymity
50
00:03:39,780 --> 00:03:43,459
not just as an abstract idea but as an
actual thing that is representative
51
00:03:43,459 --> 00:03:47,490
not only of our culture but of the world
we want to live in, overall. For all the
52
00:03:47,490 --> 00:03:51,930
cultures of the world. And so, for that
reason we also have quite recently
53
00:03:51,930 --> 00:03:56,920
been thinking a lot about social norms.
And it is the case that there’s a person
54
00:03:56,920 --> 00:04:01,040
in our community, and many people in our
community, that have come under attack.
55
00:04:01,040 --> 00:04:05,440
And have been deeply harassed.
And we think that that sucks!
56
00:04:05,440 --> 00:04:09,319
And we don’t like that. Even though we
promote anonymity without any question,
57
00:04:09,319 --> 00:04:12,859
i.e. no backdoors ever, and we’ll
get back to that in a minute,
58
00:04:12,859 --> 00:04:16,070
it is the case that we really
want to promote ‘being
59
00:04:16,070 --> 00:04:18,940
excellent to each other’. In the
sort of spirit of Noisebridge!
60
00:04:18,940 --> 00:04:25,679
applause
61
00:04:25,679 --> 00:04:28,830
And it’s still a little bit American-centric
but you can get the basic idea.
62
00:04:28,830 --> 00:04:33,099
It applies to Europe as well. Just replace
‘First Amendment’ with one of your local laws.
63
00:04:33,099 --> 00:04:36,950
Or a local constitutional right. It isn’t
the case that we’re saying that you
64
00:04:36,950 --> 00:04:39,820
shouldn’t have the right to say things.
But we are saying “Get the fuck out
65
00:04:39,820 --> 00:04:43,169
of our community if you’re going
to be abusive to women!”
66
00:04:43,169 --> 00:04:50,539
applause and cheers
67
00:04:50,539 --> 00:04:55,150
And you’ll note that I used the word
‘Fuck’ to say it. And I’m sorry about that.
68
00:04:55,150 --> 00:04:58,630
Because the point is we all make mistakes.
And we want to make sure that while
69
00:04:58,630 --> 00:05:03,130
it’s true that we have transgressions we
want to make sure that we can find
70
00:05:03,130 --> 00:05:07,050
a place of reconciliation, and we can
work towards conflict resolution.
71
00:05:07,050 --> 00:05:10,880
And it’s important at the same time to
recognize that there are people whose
72
00:05:10,880 --> 00:05:15,659
real lives are harmed by harassment
online. In this case one of the people
73
00:05:15,659 --> 00:05:19,830
is in this audience. And I hope that they
won’t mind being named. But we want
74
00:05:19,830 --> 00:05:23,740
to give her a shoutout and say
that we stand behind her 100%.
75
00:05:23,740 --> 00:05:25,260
Roger: Yeah, so, …
76
00:05:25,260 --> 00:05:33,059
applause
77
00:05:33,059 --> 00:05:37,690
So one of our developers on core Tor,
Andrea, has been harassed on Twitter
78
00:05:37,690 --> 00:05:42,440
and elsewhere, really a lot more
than should happen to anybody.
79
00:05:42,440 --> 00:05:45,880
And there are a couple of points
to make here. One of them is:
80
00:05:45,880 --> 00:05:49,919
She’s a woman, and women online
have been harassed basically
81
00:05:49,919 --> 00:05:54,440
since ‘online’ has existed. Not just
women, other minorities, pretty much
82
00:05:54,440 --> 00:05:59,110
all over the place. Especially recently
things have been getting worse.
83
00:05:59,110 --> 00:06:04,820
The other important point to realize:
she’s not just being attacked because
84
00:06:04,820 --> 00:06:08,390
she happens to be there. She’s being
attacked because they’re trying to attack
85
00:06:08,390 --> 00:06:13,310
the Tor Project and all the other people
in Tor. So, yes, she may be the focus
86
00:06:13,310 --> 00:06:17,409
of some of the attacks but we - the rest
of the Tor community, the rest of the
87
00:06:17,409 --> 00:06:22,270
security community - need to stand up
and take on some of this burden of
88
00:06:22,270 --> 00:06:26,090
communicating and interacting,
and talking about these issues.
89
00:06:26,090 --> 00:06:29,130
We can’t just leave it
to her to defend herself.
90
00:06:29,130 --> 00:06:38,240
applause
91
00:06:38,240 --> 00:06:41,770
Jacob: And so we want to set a particular
standard which is that there are
92
00:06:41,770 --> 00:06:44,209
lots of journalists that have a lot of
questions. And we really think that
93
00:06:44,209 --> 00:06:47,970
there are a lot of legitimate questions to
ask. E.g. I think it sucks that we take
94
00:06:47,970 --> 00:06:52,050
Department of Defense money, sometimes.
And sometimes I also think it’s good that
95
00:06:52,050 --> 00:06:55,140
people have the ability to feed
themselves, and have the ability
96
00:06:55,140 --> 00:06:59,270
to actually have a home and a family. Now,
I don’t have those things, really. I mean
97
00:06:59,270 --> 00:07:02,510
I can feed myself, but I don’t have a home
or a family in the same way that, say,
98
00:07:02,510 --> 00:07:07,520
the people with families on the Tor side do.
And they need to be paid. It is the case that
99
00:07:07,520 --> 00:07:12,060
that is true. And that raises questions.
Like I, personally, wouldn’t ever take
100
00:07:12,060 --> 00:07:16,710
CIA money. And I think that nobody should.
And I don’t think the CIA should exist.
101
00:07:16,710 --> 00:07:18,260
But we have a diversity…
102
00:07:18,260 --> 00:07:22,300
applause
103
00:07:22,300 --> 00:07:26,150
…we have a diversity of funding because
we have a diversity of users. And so that
104
00:07:26,150 --> 00:07:29,500
raises a lot of questions. And I think
people should ask those questions.
105
00:07:29,500 --> 00:07:32,489
And Roger, and the rest of the Tor
community feels that way, too. But
106
00:07:32,489 --> 00:07:36,700
it’s important that we don’t single out
a specific person. And, in particular,
107
00:07:36,700 --> 00:07:40,270
to single out Andrea, again. She
does not deserve all the heat about
108
00:07:40,270 --> 00:07:43,969
some of the decisions that the
Tor Project as a non-profit makes.
109
00:07:43,969 --> 00:07:48,510
She is a developer who is integral to
Tor. If it were not for her, a significant
110
00:07:48,510 --> 00:07:52,039
portion of Tor would not exist. It
would not be as bug-free as it is.
111
00:07:52,039 --> 00:07:56,839
And it would not be getting better all the
time. So we want people to reach out
112
00:07:56,839 --> 00:08:02,019
to this alias, if they actually want
to talk, and have a forum where
113
00:08:02,019 --> 00:08:06,020
the whole of Tor can really respond, and
think about these things in a positive way,
114
00:08:06,020 --> 00:08:10,029
and really engage with the press. In a way
that we can manage; because at the moment
115
00:08:10,029 --> 00:08:14,419
we get, I would say, 5 press
requests every day, on average.
116
00:08:14,419 --> 00:08:19,140
That’s really a lot. And it is also the
case that 4 of those requests
117
00:08:19,140 --> 00:08:23,950
are very well phrased, extremely
reasonable questions. And one of them is,
118
00:08:23,950 --> 00:08:28,339
you know: “Why do you
choose to run Tor?” And
119
00:08:28,339 --> 00:08:32,089
we should address all of them. We
really should. And at the same time
120
00:08:32,089 --> 00:08:35,190
we have to recognize that some of these
people that are kind of harassing,
121
00:08:35,190 --> 00:08:37,840
they might trigger me. That one will
trigger me, and I would probably
122
00:08:37,840 --> 00:08:40,650
write back with something kind of shitty.
So we want to distribute the work in a way
123
00:08:40,650 --> 00:08:44,049
where people will be nice. Even to the
people that are unreasonable. Because
124
00:08:44,049 --> 00:08:48,460
at the core – we need to be held to
account, and we need people to look to us
125
00:08:48,460 --> 00:08:52,090
about these things, and to ask us these
hard questions. And so this is the address
126
00:08:52,090 --> 00:08:55,580
to reach out to: [press@torproject.org].
Not harassing Andrea online on Twitter.
127
00:08:55,580 --> 00:09:00,690
Not coming after individual developers.
Not posting crazy stuff on the mailing list.
128
00:09:00,690 --> 00:09:04,520
Wait until we’ve actually talked to you,
then post the crazy stuff on the mailing list.
129
00:09:04,520 --> 00:09:07,510
Or wherever you’re going to post it. And
then hopefully we can actually answer
130
00:09:07,510 --> 00:09:11,830
the questions in a good-faith, helpful
way. There’s no reason to talk about
131
00:09:11,830 --> 00:09:15,210
conspiracy theories, we can just
talk about the business plans.
132
00:09:15,210 --> 00:09:19,380
And to that point, we wanna make it clear:
133
00:09:19,380 --> 00:09:23,330
stop being an asshole to people in the
community. And this is not negotiable.
134
00:09:23,330 --> 00:09:27,210
We’re not saying because we don’t want
you to harass people that we’re going
135
00:09:27,210 --> 00:09:30,800
to backdoor Tor. That will never happen.
You will find a bullet in the back of my head
136
00:09:30,800 --> 00:09:35,410
before that happens. And maybe Roger’s,
too. Depending on the order of operations.
137
00:09:35,410 --> 00:09:44,550
laughter and applause
138
00:09:44,550 --> 00:09:48,240
Roger: Okay, so we’re going to talk
a little bit about the various things
139
00:09:48,240 --> 00:09:52,760
we’ve done over the past year. To
give you a very brief introduction to Tor:
140
00:09:52,760 --> 00:09:56,820
Tor is an anonymity system. You’ve got
Alice, the client over there. She builds
141
00:09:56,820 --> 00:10:00,760
a path through 3 different relays
around the world. And the idea is
142
00:10:00,760 --> 00:10:04,030
that somebody watching her local
network connection can’t figure out
143
00:10:04,030 --> 00:10:07,830
what destination she’s going to. And
somebody watching the destinations
144
00:10:07,830 --> 00:10:11,750
can’t figure out where she’s coming
from. And we have quite a few relays
145
00:10:11,750 --> 00:10:16,440
at this point. Here’s a… the red line is
the graph of the number of relays
146
00:10:16,440 --> 00:10:20,560
we’ve had over the past year. For those
of you who remember ‘Heartbleed’
147
00:10:20,560 --> 00:10:24,450
you can see the big drop in April when
we removed a bunch of relays that
148
00:10:24,450 --> 00:10:29,390
had insecure keys. But this is not the
interesting graph. The interesting graph
149
00:10:29,390 --> 00:10:35,870
is ‘capacity over the past year’. And
we’ve gone from a little over 6 GB/s
150
00:10:35,870 --> 00:10:39,990
of capacity up to more
than 12 GB/s of capacity.
151
00:10:39,990 --> 00:10:48,140
applause
152
00:10:48,140 --> 00:10:51,820
And as long as we can make the difference
between those 2 lines big enough then
153
00:10:51,820 --> 00:10:56,200
Tor performance is pretty good. But we rely
on all of you to keep on running relays,
154
00:10:56,200 --> 00:11:01,180
and make them faster etc. so that we
can handle all the users who need Tor.
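Roger’s three-relay description can be illustrated with a toy layering sketch. This shows only the layering idea, with made-up relay names and base64 standing in for real per-hop encryption; it is nothing like Tor’s actual cryptography or cell format:

```python
import base64

def build_onion(message: bytes, path: list[str], dest: str) -> bytes:
    # The client wraps one layer per hop, innermost first. The guard
    # (path[0]) is contacted directly, so its address is never inside
    # the onion; each layer names only the *next* hop.
    hops = path[1:] + [dest]
    data = message
    for next_hop in reversed(hops):
        data = base64.b64encode(next_hop.encode() + b"|" + data)
    return data

def peel(onion: bytes) -> tuple[str, bytes]:
    # A relay removes exactly one layer: it learns the next hop and an
    # opaque blob, never the origin and destination together (only the
    # exit learns the destination).
    next_hop, inner = base64.b64decode(onion).split(b"|", 1)
    return next_hop.decode(), inner

onion = build_onion(b"GET /", ["guard", "middle", "exit"], "example.com")
hop, onion = peel(onion)    # what the guard sees
assert hop == "middle"
hop, onion = peel(onion)    # what the middle relay sees
assert hop == "exit"
hop, payload = peel(onion)  # only the exit sees the destination
assert (hop, payload) == ("example.com", b"GET /")
```

This mirrors the property described in the talk: a watcher of Alice’s local connection sees only the first relay, and the destination sees only the exit.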
155
00:11:01,180 --> 00:11:06,350
Okay, another topic. Deterministic
builds. Mike Perry and Seth Schoen
156
00:11:06,350 --> 00:11:10,250
did a great talk a few days ago. So you
should go watch the stream on that!
157
00:11:10,250 --> 00:11:15,100
The very short version is: We have
a way of building Tor Browser so that
158
00:11:15,100 --> 00:11:19,230
everybody can build Tor Browser
and produce the same binary.
159
00:11:19,230 --> 00:11:22,940
And that way you don’t have to worry about
problems on your build machine and you can
160
00:11:22,940 --> 00:11:27,660
actually check that the program we give
you, really is based on the source code
161
00:11:27,660 --> 00:11:29,470
that we say that it is.
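The verification Roger describes reduces to a simple check: independent builders produce byte-identical artifacts, so their digests match. A minimal sketch with a stand-in “build” function (the real Tor Browser process uses a full deterministic toolchain, which this does not attempt to model):

```python
import hashlib

def deterministic_build(source: bytes) -> bytes:
    # Stand-in for a real build: the key property is that the output
    # depends only on the input, never on timestamps, file paths, or
    # the builder's machine.
    return b"BINARY:" + source[::-1]

def digest(artifact: bytes) -> str:
    return hashlib.sha256(artifact).hexdigest()

source = b"tor-browser-source"
builder_a = digest(deterministic_build(source))  # the project's build
builder_b = digest(deterministic_build(source))  # your own rebuild
assert builder_a == builder_b  # a mismatch would expose a tampered binary
```

Because anyone can rebuild and compare, no single build machine (or coerced maintainer) can silently ship a different binary.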
162
00:11:29,470 --> 00:11:33,910
Jacob: And this is of course important
because we really don’t want to be
163
00:11:33,910 --> 00:11:37,940
a focal point where someone comes
after us and says: “You have to produce
164
00:11:37,940 --> 00:11:41,670
a backdoored version”. So it’s very
important because we do receive
165
00:11:41,670 --> 00:11:46,430
a lot of pressure, from a lot of different
groups. And we never want to cave.
166
00:11:46,430 --> 00:11:50,310
And here’s how we think it is the
case that we will never cave:
167
00:11:50,310 --> 00:11:54,470
Free Software, open specifications,
reproducible builds,
168
00:11:54,470 --> 00:11:57,920
things that can be verified
with cryptographic signatures.
169
00:11:57,920 --> 00:12:01,650
That will not only keep us honest
against the – what do you call it –
170
00:12:01,650 --> 00:12:04,870
the angels of our better nature.
I don’t believe in angels. But anyway.
171
00:12:04,870 --> 00:12:09,100
The point is that it will keep us honest.
But it will also keep other people at bay.
172
00:12:09,100 --> 00:12:13,360
From trying to do something harmful to
us. Because when something happens
173
00:12:13,360 --> 00:12:17,860
you will be able to immediately find it.
And Mike Perry, by the way, is incredible.
174
00:12:17,860 --> 00:12:24,730
He probably hates that I’m saying his name
right now. Sorry, Mike! Are you here?
175
00:12:24,730 --> 00:12:27,340
laughter
Bastard! laughs
176
00:12:27,340 --> 00:12:32,020
But Mike Perry is a machine. He also
has a heart! But he’s a machine.
177
00:12:32,020 --> 00:12:36,010
And he’s incredible. And he has been
working non-stop on this. And he is really
178
00:12:36,010 --> 00:12:40,420
ground-breaking in not only doing
this for Firefox but really thinking
179
00:12:40,420 --> 00:12:43,720
about these hard problems, and
understanding that if he was just building
180
00:12:43,720 --> 00:12:47,810
this browser by himself, and he was
doing it in a non-verifiable way
181
00:12:47,810 --> 00:12:51,210
that it would really, actually be
a serious problem. Because we distribute
182
00:12:51,210 --> 00:12:55,930
this software. And so, I mean
there is a reason that the NSA
183
00:12:55,930 --> 00:12:59,890
calls Mike Perry a “worthy adversary”.
And it is because he’s amazing!
184
00:12:59,890 --> 00:13:02,260
applause
So let’s give it up for Mike Perry!
185
00:13:02,260 --> 00:13:07,600
ongoing applause
186
00:13:07,600 --> 00:13:11,870
Roger: Not only that, but his work, along
with Bitcoin’s work, has pushed Debian
187
00:13:11,870 --> 00:13:16,500
and Fedora, and other groups to work
on reproducible builds as well. So,
188
00:13:16,500 --> 00:13:20,520
hopefully the whole security
community will get better!
189
00:13:20,520 --> 00:13:24,810
applause
190
00:13:24,810 --> 00:13:28,700
Jacob: And to the point about Citizenfour.
One of the things that’s been happening
191
00:13:28,700 --> 00:13:33,390
quite recently is that really respectable
nice people like the people at Mozilla
192
00:13:33,390 --> 00:13:37,440
have decided that they really want
us to work together. Which is great.
193
00:13:37,440 --> 00:13:41,250
Because we wanted to, and we have
respected their work for a very long time.
194
00:13:41,250 --> 00:13:46,779
And so Tor is now partnering with Mozilla.
And that means that Mozilla, as a group,
195
00:13:46,779 --> 00:13:50,400
will be running Tor relays. At first
middle nodes, and then, hopefully,
196
00:13:50,400 --> 00:13:56,770
we believe, exit relays. And that is
huge because Mozilla is at the forefront
197
00:13:56,770 --> 00:14:02,370
of doing a lot of work for end users. Just
everyday regular people wanting privacy.
198
00:14:02,370 --> 00:14:07,589
Things like Do Not Track, e.g.,
are a way to try to experiment.
199
00:14:07,589 --> 00:14:11,510
Things like the Tor Browser are a way to
experiment even further. To really bring
200
00:14:11,510 --> 00:14:16,400
Privacy-by-Design. And it’s amazing
that Mozilla is doing that. And
201
00:14:16,400 --> 00:14:20,500
we’ve made a partnership with them, and
we’re hopeful, cautiously optimistic even,
202
00:14:20,500 --> 00:14:23,680
that this is going to produce some very
good results where our communities can
203
00:14:23,680 --> 00:14:28,450
sort of fuse, and give Privacy-by-Design
software to every person on the planet
204
00:14:28,450 --> 00:14:31,210
with no exceptions whatsoever.
205
00:14:31,210 --> 00:14:37,860
applause
206
00:14:37,860 --> 00:14:42,010
Now we also have a couple of things
that we would like to talk about,
207
00:14:42,010 --> 00:14:44,960
just generally, that are a little bit
technical. But at the same time
208
00:14:44,960 --> 00:14:49,070
we wanna keep it accessible because
we think that this talk, well, it’s useful
209
00:14:49,070 --> 00:14:51,841
to talk about technical details. The most
important thing is somebody who has
210
00:14:51,841 --> 00:14:55,260
never heard of the Tor community before,
who watches this video, we want them
211
00:14:55,260 --> 00:15:00,730
to understand some of the
details, and enough, let’s say,
212
00:15:00,730 --> 00:15:04,650
technical understanding that they’ll be
able to go and look it up if they want to,
213
00:15:04,650 --> 00:15:07,510
but they’ll also understand we’re not
just glossing over, completely.
214
00:15:07,510 --> 00:15:10,320
So, pluggable transports are very
important. Right now, the way
215
00:15:10,320 --> 00:15:15,540
that Tor works is that we connect with an
SSL/TLS connection. The protocol SSL/TLS,
216
00:15:15,540 --> 00:15:19,750
one of the 2, depending on the client
library, and the server library. And
217
00:15:19,750 --> 00:15:23,180
that looks like an SSL connection, for
the most part. But as some of you know
218
00:15:23,180 --> 00:15:28,470
there are people on this planet
who collect SSL and TLS data,
219
00:15:28,470 --> 00:15:32,440
about everything flowing across the
internet. That’s really a problem.
220
00:15:32,440 --> 00:15:36,550
It turns out we thought in some cases
that it was just censorship that mattered.
221
00:15:36,550 --> 00:15:40,410
But it turns out broad classification
of traffic is really, actually, a problem
222
00:15:40,410 --> 00:15:44,960
not just for blocking but also for later
doing identification of traffic flows.
223
00:15:44,960 --> 00:15:47,740
So I’ve already lost the non-technical
people in the audience, so, let me
224
00:15:47,740 --> 00:15:51,580
rephrase that and say: We have these other
ways of connecting to the Tor network.
225
00:15:51,580 --> 00:15:55,740
And they don’t look just like a secure
banking transaction. They look instead
226
00:15:55,740 --> 00:16:01,250
like DNS, or HTTP – that is your regular
web browsing or name resolution.
227
00:16:01,250 --> 00:16:04,990
And we have a lot of different pluggable
transports. And some of them are cool.
228
00:16:04,990 --> 00:16:08,040
Some of them make it look like you’re
connecting to Google. When in fact you’re
229
00:16:08,040 --> 00:16:11,180
connecting to the Tor Project. And it’s
because you, in fact, are connecting
230
00:16:11,180 --> 00:16:16,660
to Google. Leif Ryge, are you
in the room, here? Maybe, no?
231
00:16:16,660 --> 00:16:19,960
This is really… you guys,
and your anonymity!
232
00:16:19,960 --> 00:16:23,890
laughter
It is the case…
233
00:16:23,890 --> 00:16:27,070
he showed this to me, I mentioned this to
some other people and David Fifield,
234
00:16:27,070 --> 00:16:30,870
I think, independently rediscovered
it. There’s also the GoAgent people
235
00:16:30,870 --> 00:16:35,390
that discovered this. You can connect
to Google with an SSL connection,
236
00:16:35,390 --> 00:16:38,430
and the certificate will say:
dadada.google.com. And you of course
237
00:16:38,430 --> 00:16:42,740
verify it. And it is of course signed,
probably by Adam Langley, personally.
238
00:16:42,740 --> 00:16:48,120
And… maybe it’s just the Google
CAs. And then you give it a different
239
00:16:48,120 --> 00:16:53,270
HTTP Host header. So you say: actually
I wanna talk to Appspot. I wanna talk
240
00:16:53,270 --> 00:16:58,470
to torbridge.appspot.com.
And inside of the TLS connection,
241
00:16:58,470 --> 00:17:01,149
which looks like it’s a connection to
Google which is one of the most popular
242
00:17:01,149 --> 00:17:05,119
websites on the internet you then make
essentially an encrypted connection
243
00:17:05,119 --> 00:17:09,980
through that. And then from there
to the Tor network. Using Google,
244
00:17:09,980 --> 00:17:13,859
but also Cloudflare – they don’t
just provide you with captchas!
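The trick Jacob walks through, often called domain fronting, splits the name a network observer sees (the TLS SNI and certificate) from the name the server routes on (the HTTP Host header inside the tunnel). A no-network sketch; torbridge.appspot.com is the example name from the talk, and the helper names here are illustrative:

```python
FRONT = "www.google.com"          # visible to the network: SNI + certificate
HIDDEN = "torbridge.appspot.com"  # named only inside the encrypted tunnel

def client_hello_sni() -> str:
    # The unencrypted part of the TLS handshake carries only the
    # front domain, so a censor sees a connection to Google.
    return FRONT

def fronted_request(path: str) -> bytes:
    # Sent after encryption begins: the Host header steers the CDN or
    # app platform to the real backend.
    return (f"GET {path} HTTP/1.1\r\n"
            f"Host: {HIDDEN}\r\n"
            "Connection: close\r\n\r\n").encode()

assert HIDDEN not in client_hello_sni()   # the observer never sees it
assert b"Host: torbridge.appspot.com\r\n" in fronted_request("/")
```

Blocking such a bridge means blocking the front domain itself, which is exactly the cost to the censor that the talk describes.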
245
00:17:13,859 --> 00:17:19,329
laughter and applause
laughs
246
00:17:19,329 --> 00:17:23,170
Poor Cloudflare guy! We were joking
we should stand outside his office
247
00:17:23,170 --> 00:17:26,009
and make him answer
captchas to get in the door!
248
00:17:26,009 --> 00:17:30,460
laughter and applause
249
00:17:30,460 --> 00:17:34,260
All of those people clapping wish you
would solve the Cloudflare captcha issue!
250
00:17:34,260 --> 00:17:39,810
So it also works with other compute
clusters. And other CDNs.
251
00:17:39,810 --> 00:17:43,300
And so this is really awesome because
it means that now you can connect
252
00:17:43,300 --> 00:17:47,280
through those CDNs to the Tor network,
using meek and other pluggable transports
253
00:17:47,280 --> 00:17:52,620
like that. So that’s a huge win.
And deploying it by default
254
00:17:52,620 --> 00:17:54,140
– I think we have another slide for that…
255
00:17:54,140 --> 00:17:58,270
Roger: Nope, that’s it!
We’ve got a different one, yes.
256
00:17:58,270 --> 00:18:03,440
So, one of the neat things about meek is:
because it works on all these different
257
00:18:03,440 --> 00:18:07,910
sorts of providers – Akamai
and all the CDNs out there –
258
00:18:07,910 --> 00:18:12,840
a lot of those are still reachable from
places like China. Lots of our pluggable
259
00:18:12,840 --> 00:18:16,370
transports don’t work so well in China,
but meek does, at this point.
260
00:18:16,370 --> 00:18:20,200
So there are a lot of happy users.
Here’s a graph of an earlier
261
00:18:20,200 --> 00:18:24,230
pluggable transport that we had,
called ‘obfs3’. It still works in China,
262
00:18:24,230 --> 00:18:28,030
and Iran, and Syria and lots
of places around the world.
263
00:18:28,030 --> 00:18:31,920
But the sort of blue/aqua line is
264
00:18:31,920 --> 00:18:36,540
how much use we’ve seen of
obfs3. And you can tell exactly
265
00:18:36,540 --> 00:18:41,590
when we put out the new Tor Browser
release that had obfs3 built in
266
00:18:41,590 --> 00:18:46,890
and easy to use by ordinary people.
So one of the really important pushes
267
00:18:46,890 --> 00:18:50,850
we’ve been doing is trying to make
– rather than trying to explain
268
00:18:50,850 --> 00:18:54,090
how pluggable transports work, and
teach you everything – just make them
269
00:18:54,090 --> 00:18:57,370
really simple. Make them part of Tor
Browser; you just click on “My Tor
270
00:18:57,370 --> 00:19:01,690
isn’t working so I wanna use some
other way to make my Tor work”.
271
00:19:01,690 --> 00:19:06,260
And we’ve got 10,000 people at this
point who are happily using obfs3.
272
00:19:06,260 --> 00:19:10,930
I think a lot of them are in
Syria and Iran at this point.
273
00:19:10,930 --> 00:19:17,640
applause
274
00:19:17,640 --> 00:19:21,150
Something else we’ve been doing over
the past year is working really hard
275
00:19:21,150 --> 00:19:26,020
on improving the robustness,
and testing infrastructure,
276
00:19:26,020 --> 00:19:29,960
and unit tests for the core Tor
source code. So Nick Mathewson
277
00:19:29,960 --> 00:19:34,230
and Andrea Shepard in particular
have been really working on robustness
278
00:19:34,230 --> 00:19:39,700
to make this something we can rely
on, as a building block in Tails,
279
00:19:39,700 --> 00:19:43,770
in Tor Browser, in all the other
applications that rely on Tor.
280
00:19:43,770 --> 00:19:47,190
So in the background things were
getting a lot stronger. Hopefully that
281
00:19:47,190 --> 00:19:51,960
will serve us very well
in the battles to come.
282
00:19:51,960 --> 00:19:59,220
applause
283
00:19:59,220 --> 00:20:02,280
Jacob: So this fine gentleman
who was a teen heartthrob
284
00:20:02,280 --> 00:20:03,980
on Italian television many years ago…
285
00:20:03,980 --> 00:20:06,530
Arturo: Thank you for doxing me!
Jacob: Sorry.
286
00:20:06,530 --> 00:20:08,260
both laugh
287
00:20:08,260 --> 00:20:10,450
If only you’d been using Tor!
288
00:20:10,450 --> 00:20:16,020
Arturo: Yeah, TV over Tor. So…
A project that we started a couple
289
00:20:16,020 --> 00:20:23,620
of years ago with Jake is sort of related,
I guess, to the Tor Project’s goals of
290
00:20:23,620 --> 00:20:29,740
increasing privacy and having a better
understanding of how people’s lives
291
00:20:29,740 --> 00:20:35,380
are impacted through technology. And this
project is called OONI, or the ‘Open
292
00:20:35,380 --> 00:20:40,010
Observatory of Network Interference’. And
what it is, before being a piece of software,
293
00:20:40,010 --> 00:20:46,080
is a set of principles, and best practices
and specifications written in English
294
00:20:46,080 --> 00:20:52,860
for how best to conduct network-related
measurements. The sort of
295
00:20:52,860 --> 00:20:57,510
measurements that we’re interested in
running have to do with identifying
296
00:20:57,510 --> 00:21:04,130
network irregularities. These are symptoms
that can be a sign of the presence of
297
00:21:04,130 --> 00:21:10,710
surveillance or censorship, on the network
that you’re testing. And we use
298
00:21:10,710 --> 00:21:15,860
a methodology that has been peer-reviewed,
and on which we have published a paper.
299
00:21:15,860 --> 00:21:21,200
It’s implemented using free software. And
all of the data that we collect is made
300
00:21:21,200 --> 00:21:26,800
available to the public. So that you can
look at it, analyze it and draw your
301
00:21:26,800 --> 00:21:33,160
own conclusions from it.
applause
302
00:21:33,160 --> 00:21:37,560
And so we believe that this effort is
something that is helpful and useful
303
00:21:37,560 --> 00:21:43,179
to people such as journalists, researchers,
activists, or just ordinary citizens that are
304
00:21:43,179 --> 00:21:48,679
interested in being more aware, and have
a better understanding that is based
305
00:21:48,679 --> 00:21:55,559
on facts instead of just anecdotes, of
what the reality of internet censorship is
306
00:21:55,559 --> 00:22:00,059
in their country. And we believe that
historical data is especially important
307
00:22:00,059 --> 00:22:05,660
because it gives us an understanding of
how these censorship and surveillance
308
00:22:05,660 --> 00:22:12,670
apparatuses evolve over time. So
I would like to invite you all to run
309
00:22:12,670 --> 00:22:21,730
Ooniprobe today, if you copy and paste
this command line inside of a Debian-based
310
00:22:21,730 --> 00:22:26,730
system. Obviously… perhaps you should
read what is inside it before running it.
311
00:22:26,730 --> 00:22:31,310
applause
312
00:22:31,310 --> 00:22:34,630
But once you do that you will have
an Ooniprobe setup and you will be
313
00:22:34,630 --> 00:22:40,570
collecting measurements for your country.
If instead you would like to have
314
00:22:40,570 --> 00:22:46,890
an actual hardware device we have a very
limited number of them. But if you’re
315
00:22:46,890 --> 00:22:49,799
from an interesting country and you’re
interested in running Ooniprobe
316
00:22:49,799 --> 00:22:54,420
we can give you a little Raspberry Pi with
an LCD screen that you can take home,
317
00:22:54,420 --> 00:23:00,860
connect to your network and adopt
an Ooniprobe in your home network.
318
00:23:00,860 --> 00:23:09,130
To learn more about this you should come
to Noisy Square later today, at 6 P.M.
319
00:23:09,130 --> 00:23:11,750
to learn more about it.
320
00:23:11,750 --> 00:23:13,020
Roger: Thank you!
321
00:23:13,020 --> 00:23:17,500
applause
322
00:23:17,500 --> 00:23:20,570
Jacob: And, just to finish up here,
I mean, OONI is a human rights
323
00:23:20,570 --> 00:23:26,070
observation project which Arturo and
Aaron Gibson – also somewhere in the room,
324
00:23:26,070 --> 00:23:32,130
I’m sure he won’t stand up so I won’t even
ask him. It’s great! Because we went from
325
00:23:32,130 --> 00:23:35,400
a world where there was no open
measurement, with only secret tools,
326
00:23:35,400 --> 00:23:39,110
essentially, where people acted like
secret agents, going into countries
327
00:23:39,110 --> 00:23:42,320
to do measurements. There wasn’t really
an understanding of the risks that
328
00:23:42,320 --> 00:23:45,860
were involved, how the tests function,
where non-technical people could have
329
00:23:45,860 --> 00:23:50,830
reasonable explanations. And now we have
open measurement tools, we have open data
330
00:23:50,830 --> 00:23:55,080
standards, we have really like a framework
for understanding this as a human right
331
00:23:55,080 --> 00:23:59,250
to observe the world around you. And then
also to share that data, and to actually
332
00:23:59,250 --> 00:24:03,290
discuss that data, what it means. And to
be able to set standards for it.
333
00:24:03,290 --> 00:24:06,330
And hopefully that means that people have
informed consent when they engage
334
00:24:06,330 --> 00:24:10,600
in something that could be risky, like running
Ooni in a place like… that is dangerous
335
00:24:10,600 --> 00:24:13,030
like the United States or Cuba,
or something like China.
336
00:24:13,030 --> 00:24:18,000
applause
And so, Arturo personally though, is
337
00:24:18,000 --> 00:24:21,610
the heart and soul of Ooni. And it is
really important that we see that
338
00:24:21,610 --> 00:24:25,580
the Tor community is huge. It’s really
huge, it’s made up of a lot of people
339
00:24:25,580 --> 00:24:29,670
doing a lot of different things. And part
of Ooni is Tor. We need Tor to be able
340
00:24:29,670 --> 00:24:33,929
to have a secure communications channel
back to another system, we need that
341
00:24:33,929 --> 00:24:38,230
so that people can log into these
Ooniprobes e.g. over Tor Hidden Services.
342
00:24:38,230 --> 00:24:42,610
That kind of fusion of things where we
have anonymity but at the same time
343
00:24:42,610 --> 00:24:45,980
we have this data set that is in some
cases identifying, in some cases
344
00:24:45,980 --> 00:24:49,910
it’s not identifying, depending on the
test. We need an anonymous communications
345
00:24:49,910 --> 00:24:53,630
channel to do that kind of human rights
observation. And so… just so we can
346
00:24:53,630 --> 00:24:57,070
make Arturo a little… feel a little
appreciated I just wanna give him
347
00:24:57,070 --> 00:25:00,500
another round of applause, for making this
human rights observation project.
348
00:25:00,500 --> 00:25:08,240
applause
Jacob joins the applause
349
00:25:08,240 --> 00:25:12,990
Roger: So I encourage all of you not only
to run Ooniprobe in interesting places,
350
00:25:12,990 --> 00:25:17,660
and in boring places because they might
become interesting. But also to help write
351
00:25:17,660 --> 00:25:22,500
new tests, and work on the design of these
things, so that we can detect and notice
352
00:25:22,500 --> 00:25:27,289
new problems on the internet more quickly.
Something else we’ve been up to over
353
00:25:27,289 --> 00:25:32,920
the past year is Tor Weekly News. We were
really excited by Linux Weekly News etc.
354
00:25:32,920 --> 00:25:37,990
and… so every week there’s a new
blog post and mail that summarizes
355
00:25:37,990 --> 00:25:41,820
what’s happened over the past week.
We encourage you to look at all these.
356
00:25:41,820 --> 00:25:45,870
A special shout-out to harmony and
lunar for helping to make this happen
357
00:25:45,870 --> 00:25:47,950
over the past year. Thank you!
358
00:25:47,950 --> 00:25:52,679
applause
359
00:25:52,679 --> 00:25:57,370
Jacob: Finally there’s a Tor list you can
be on, that you really wanna be on!
360
00:25:57,370 --> 00:26:01,460
Roger: Being on lists is good. One of the
other features we’ve been really excited
361
00:26:01,460 --> 00:26:06,590
about over the past year: EFF has been
helping with outreach. EFF ran
362
00:26:06,590 --> 00:26:10,820
a Tor relay challenge to try to get a lot
of people running relays. And I think
363
00:26:10,820 --> 00:26:17,150
they have several thousand relays that
signed up because of the relay challenge.
364
00:26:17,150 --> 00:26:19,549
Pushing a lot of traffic.
So that’s really great!
365
00:26:19,549 --> 00:26:23,339
applause
366
00:26:23,339 --> 00:26:27,040
And at the same time not only did they
get a lot more people running relays
367
00:26:27,040 --> 00:26:31,750
but they also did some great advocacy
and outreach for getting more exit relays
368
00:26:31,750 --> 00:26:36,440
in universities, and basically teaching
people why Tor is important. We all need
369
00:26:36,440 --> 00:26:40,200
to be doing more of that! We’ll
touch on that a little bit more later.
370
00:26:40,200 --> 00:26:44,190
So you all I hope remember what was
going on in Turkey, earlier this year.
371
00:26:44,190 --> 00:26:48,419
Here’s a cool graph of Tor use in Turkey
when they started to block YouTube
372
00:26:48,419 --> 00:26:52,170
and other things. Then people realized,
I need to get some tools to get around
373
00:26:52,170 --> 00:26:56,830
that censorship. But you probably
weren’t paying attention when Iraq
374
00:26:56,830 --> 00:27:01,430
filtered Facebook, and suddenly a lot of
people in Iraq needed to get some sort
375
00:27:01,430 --> 00:27:05,570
of way to get around their censorship. So
there are a bunch of interesting graphs
376
00:27:05,570 --> 00:27:10,470
like this on the Tor Metrics project, of
what’s been going on over the past year.
377
00:27:10,470 --> 00:27:13,290
Jacob: And we actually…
– if you could go back, yeah.
378
00:27:13,290 --> 00:27:17,510
One thing that’s really interesting about
this is: Karsten Loesing who is, I think,
379
00:27:17,510 --> 00:27:20,530
also not going to stand up, maybe you
will? Are you here? I don’t see you,
380
00:27:20,530 --> 00:27:25,929
Karsten? No? No, okay. He does all
the metrics, this anonymous, shadowy
381
00:27:25,929 --> 00:27:29,650
metrics figure. And if you go to
metrics.torproject.org you’ll see
382
00:27:29,650 --> 00:27:33,830
open data that is properly anonymized
– you would expect that from us –
383
00:27:33,830 --> 00:27:38,539
as well as actual documents that explain
the anonymity, the counting techniques,
384
00:27:38,539 --> 00:27:42,140
that explain the privacy-conserving
statistics. And you can see these graphs,
385
00:27:42,140 --> 00:27:46,050
you can generate them based on certain
parameters. If you are interested
386
00:27:46,050 --> 00:27:50,320
in seeing e.g. geopolitical events,
and how they tie in to the internet,
387
00:27:50,320 --> 00:27:54,870
this project is part of what inspired
Ooni. This is how we get statistics
388
00:27:54,870 --> 00:27:58,289
and interesting things about the Tor
network itself. From Tor clients,
389
00:27:58,289 --> 00:28:02,150
from Tor relays, from Tor bridges.
And it tells you all sorts of things.
390
00:28:02,150 --> 00:28:08,700
Platform information, version number of
the software, which country someone
391
00:28:08,700 --> 00:28:13,440
might be connecting from etc. Where
they’re hosted… If you are interested
392
00:28:13,440 --> 00:28:17,900
in looking at this website and finding spikes
like this you may in fact be able to
393
00:28:17,900 --> 00:28:22,590
find out that there is a censorship event
in that country, and we haven’t noticed it.
394
00:28:22,590 --> 00:28:26,410
There are a lot of countries in the world
if we split it up by country. And sometimes
395
00:28:26,410 --> 00:28:31,460
50,000 Tor users fall off the Tor network
because another American company has sold
396
00:28:31,460 --> 00:28:36,780
that country censorship equipment. We
need help finding these events, and then
397
00:28:36,780 --> 00:28:41,340
understanding their context. So if in your
country something like that happens
398
00:28:41,340 --> 00:28:45,830
looking at this data can help us not only
to advocate for anonymity in such a place
399
00:28:45,830 --> 00:28:48,910
but it can help us to also technically
realize we need to fix a thing,
400
00:28:48,910 --> 00:28:51,799
change a thing… And it’s through this
data that we can have a dialog
401
00:28:51,799 --> 00:28:55,550
about those things. So if you have no
technical ability at all but you’re
402
00:28:55,550 --> 00:28:59,260
interested in understanding where you
come from – look at this data set, try
403
00:28:59,260 --> 00:29:03,450
to understand it, and then reach out to us
and hopefully we can learn about that.
404
00:29:03,450 --> 00:29:06,289
That’s how we learn about this, that’s how
we learned about the previous thing.
405
00:29:06,289 --> 00:29:09,860
And many years ago we gave a Tor talk
about how countries and governments
406
00:29:09,860 --> 00:29:15,470
and corporations try to censor Tor. And
of course, a lot has happened since then.
407
00:29:15,470 --> 00:29:18,510
There’s a lot of those things, and it’s very
difficult to keep up with them. So
408
00:29:18,510 --> 00:29:22,820
we really need the community’s help to
contextualize, to explain and define
409
00:29:22,820 --> 00:29:25,750
these things.
410
00:29:25,750 --> 00:29:30,970
Roger: Okay. Next section of the talk,
‘things that excited journalists over
411
00:29:30,970 --> 00:29:35,270
the past year’. That actually turned out
to be not-so-big a deal. And we’re gonna
412
00:29:35,270 --> 00:29:39,220
try to blow through a lot of them quickly,
so that we can get to the stuff that
413
00:29:39,220 --> 00:29:45,669
actually was a big deal. So I guess in
August or something there was going to be
414
00:29:45,669 --> 00:29:50,190
a Blackhat talk about how you can
just totally break Tor, and then
415
00:29:50,190 --> 00:29:55,080
the Blackhat talk got pulled. Turns out
that it was a group at CMU who were
416
00:29:55,080 --> 00:30:00,200
doing some research on Tor. And I begged
them for a long time to get a little bit
417
00:30:00,200 --> 00:30:04,720
of information about what attack they had.
Eventually they sent me a little bit of
418
00:30:04,720 --> 00:30:08,510
information. And then we were all
thinking about how to fix it. And then
419
00:30:08,510 --> 00:30:12,280
Nick Mathewson, one of the Tor developers,
said: “Why don’t I just deploy
420
00:30:12,280 --> 00:30:17,490
a detection thing on the real Tor network,
just in case somebody is doing this?” And
421
00:30:17,490 --> 00:30:21,210
then it turns out somebody was doing this.
And then I sent mail to the CERT people
422
00:30:21,210 --> 00:30:25,789
saying: “Hey, are you, like, are you like
running those 100 relays that are doing
423
00:30:25,789 --> 00:30:31,690
this attack on Tor users right now?” And
I never heard back from them after that.
424
00:30:31,690 --> 00:30:36,570
So that’s sort of a… this is a sad
story for a lot of different reasons.
425
00:30:36,570 --> 00:30:41,070
But I guess the good news is we identified
the relays that were doing the attack,
426
00:30:41,070 --> 00:30:45,020
we cut them out of the network, and we
deployed a defense that will first of all
427
00:30:45,020 --> 00:30:49,000
make that particular attack not
work anymore. And also detect it
428
00:30:49,000 --> 00:30:52,010
when somebody else is trying
to do an attack like this.
429
00:30:52,010 --> 00:30:53,610
Jacob: This, of course, is…
430
00:30:53,610 --> 00:30:59,720
applause
431
00:30:59,720 --> 00:31:05,020
This is a hard lesson, for 2 reasons.
The first reason is that it’s awful
432
00:31:05,020 --> 00:31:07,750
to do those kinds of attacks on the real
Tor network. And there’s a question about
433
00:31:07,750 --> 00:31:12,530
responsibility. But the second lesson is
that when these kinds of things happen,
434
00:31:12,530 --> 00:31:17,179
and we have the ability to actually
understand them we can respond to them.
435
00:31:17,179 --> 00:31:21,370
It’s really awful that the talk
was pulled, and it is really awful
436
00:31:21,370 --> 00:31:24,640
that these people were not able to give
us more information. And it’s also really
437
00:31:24,640 --> 00:31:28,030
awful that they were apparently carrying
out the attack. And there were lots
438
00:31:28,030 --> 00:31:31,831
of open questions about it. But in general
we believe that we’ve mitigated the attack
439
00:31:31,831 --> 00:31:36,450
which is important. But we also
advocated for that talk to go forward.
440
00:31:36,450 --> 00:31:40,549
Because we think that, of course, the
answer to even really frustrating speech
441
00:31:40,549 --> 00:31:45,710
is more speech! So we wanna know more
about it. It somehow is very disturbing
442
00:31:45,710 --> 00:31:49,090
that that talk was pulled. And they should
be able to present their research,
443
00:31:49,090 --> 00:31:52,650
even if there’s anger on our face it’s
important for our users to know as much
444
00:31:52,650 --> 00:31:57,520
as we can, so that we can move
forward with protecting Tor users.
445
00:31:57,520 --> 00:32:02,500
Roger: Okay, so, another exciting
topic from a couple of months ago:
446
00:32:02,500 --> 00:32:04,630
Russia apparently put out
a call-for-research work…
447
00:32:04,630 --> 00:32:06,990
loud splashing noise from Jake
opening a loaded water bottle
448
00:32:06,990 --> 00:32:10,580
…to come up with attacks on Tor.
Jacob: It’s another attack on Tor!
449
00:32:10,580 --> 00:32:14,970
Roger: Enjoy your water, Jake.
I hope that was worth it. laughs
450
00:32:14,970 --> 00:32:16,530
Jacob: laughs It was really
worth it. Was very thirsty.
451
00:32:16,530 --> 00:32:19,919
Roger: So Russia put out a
call-for-research proposals
452
00:32:19,919 --> 00:32:25,930
on attacking Tor. Somebody mistranslated
that phrase from Russian into ‘prize’,
453
00:32:25,930 --> 00:32:31,200
or ‘bounty’, or ‘contest’. And then we had
all these articles, saying “Russia is
454
00:32:31,200 --> 00:32:36,080
holding a contest to break Tor” when
actually, no, they just wanted somebody
455
00:32:36,080 --> 00:32:41,560
to work on research on Tor attacks.
So this would be like the U.S. National
456
00:32:41,560 --> 00:32:46,730
Science Foundation holds a contest
for Tor research. That’s not actually
457
00:32:46,730 --> 00:32:50,280
how government funding works.
Mistranslations cause a lot of
458
00:32:50,280 --> 00:32:55,030
exciting journalist articles but as
far as I can tell it turned out to be
459
00:32:55,030 --> 00:32:59,850
basically nothing. Also it was basically
‘no money’. So, maybe something
460
00:32:59,850 --> 00:33:03,069
will come of this, we’ll see. Something
else that’s been bothering me a lot,
461
00:33:03,069 --> 00:33:08,260
lately: Cryptowall, now called
‘Cryptolocker’. So, there are jerks
462
00:33:08,260 --> 00:33:12,230
out there who break into your
mobile phone of some sort,
463
00:33:12,230 --> 00:33:17,159
give you malware, viruses, something
like that. They encrypt your files,
464
00:33:17,159 --> 00:33:22,050
and then they send you basically a ransom
note saying “We’ve encrypted your file,
465
00:33:22,050 --> 00:33:27,320
if you want it back send some Bitcoin over
here!” So this is bad, so far. But then
466
00:33:27,320 --> 00:33:31,320
the part that really upsets me is they
say: “And if you don’t know how to do this
467
00:33:31,320 --> 00:33:35,960
go to our website torproject.org and
download the Tor Browser in order
468
00:33:35,960 --> 00:33:42,620
to pay us”. Fuck them! I do not want
people doing this with our software!
469
00:33:42,620 --> 00:33:49,220
applause
470
00:33:49,220 --> 00:33:51,890
Jacob: Yeah, fuck them. I mean I don’t
really have a lot to contribute to that.
471
00:33:51,890 --> 00:33:56,510
I mean it’s really… Hidden Services have
a really bad rap, and it’s frustrating,
472
00:33:56,510 --> 00:33:59,900
right? There’s a… of course this
quantitative and qualitative analysis
473
00:33:59,900 --> 00:34:03,890
that we can have here. And the reality
of the situation is that one Globaleaks
474
00:34:03,890 --> 00:34:08,270
leaking interface is one .onion, for
example. What is the value of that?
475
00:34:08,270 --> 00:34:13,540
Versus 10,000 Hidden Services run by these
jerks? And it’s very hard to understand
476
00:34:13,540 --> 00:34:16,989
the social value of these things, except
to say that we really need things like
477
00:34:16,989 --> 00:34:21,710
Hidden Services. And jackasses like this
are really making it hard for us to defend
478
00:34:21,710 --> 00:34:26,199
the right to publish anonymously. And so,
if you know who these people are please
479
00:34:26,199 --> 00:34:30,549
ask them to stop! I don’t even know
what the ask is there. But they really
480
00:34:30,549 --> 00:34:33,109
should stop. Or maybe there’s some
interesting things that you can do.
481
00:34:33,109 --> 00:34:37,159
I don’t know. But we really, really
don’t like that this is someone’s
482
00:34:37,159 --> 00:34:41,229
first introduction to Tor! That they think
that we’re responsible for this. We
483
00:34:41,229 --> 00:34:44,549
most certainly are not responsible for
these things. We certainly do not deploy
484
00:34:44,549 --> 00:34:51,000
malware. And Hidden Services are actually
very important for a lot of people.
485
00:34:51,000 --> 00:34:53,930
These people are not those people!
486
00:34:53,930 --> 00:34:59,949
applause
487
00:34:59,949 --> 00:35:03,539
Roger: Another ‘exciting’ story,
a month or 2 ago, was,
488
00:35:03,539 --> 00:35:08,289
“81% of Tor users can be de-anonymized…”
and then some more words, depending on
489
00:35:08,289 --> 00:35:13,210
which article you read. So it turns out
that one of our friends, Sambuddho, who is
490
00:35:13,210 --> 00:35:19,309
a professor in India now, did some work
on analyzing traffic correlation attacks
491
00:35:19,309 --> 00:35:24,210
in the lab. He found, in the lab, that
some of his attacks worked sometime,
492
00:35:24,210 --> 00:35:29,410
great… And then some journalists found it,
and said: “Ah! This must be the reason why
493
00:35:29,410 --> 00:35:33,849
Tor is insecure today”. So one of them wrote
an article; it got Slashdotted, it got
494
00:35:33,849 --> 00:35:38,210
all the other news stories. And suddenly
everybody knew that Tor was broken
495
00:35:38,210 --> 00:35:43,759
because “81% of Tor users…”.
So it turns out that Sambuddho himself
496
00:35:43,759 --> 00:35:47,699
stood up and said actually: “No, you
misunderstood my article”. But
497
00:35:47,699 --> 00:35:51,910
that didn’t matter because nobody listened
to the author of the paper at that point.
498
00:35:51,910 --> 00:35:57,390
So I guess there’s a broader issue that
we’re struggling with here, in terms of
499
00:35:57,390 --> 00:36:02,430
how to explain the details of these
things because traffic correlation attacks
500
00:36:02,430 --> 00:36:08,560
are a big deal. They probably do work
if you have enough traffic around
501
00:36:08,560 --> 00:36:12,079
the internet, and you’re looking at the
right places. You probably can do
502
00:36:12,079 --> 00:36:17,549
the attack. But that paper did not do the
attack. So I keep finding myself saying:
503
00:36:17,549 --> 00:36:21,880
“No no no, you’re misunderstanding the
paper, the paper doesn’t tell us anything,
504
00:36:21,880 --> 00:36:25,749
but the attack is real! But the paper
doesn’t tell us anything”. And this is
505
00:36:25,749 --> 00:36:30,049
really confusing to journalists because
it sounds like I’m disagreeing with myself
506
00:36:30,049 --> 00:36:35,059
with these 2 different sentences. So we
need to come up with some way to
507
00:36:35,059 --> 00:36:39,770
be able to explain: “Here are all of the
real attacks, that are really actually
508
00:36:39,770 --> 00:36:44,979
worrisome, and it’s great that researchers
are working on them. And they probably
509
00:36:44,979 --> 00:36:50,839
are a big deal, in some way. But no, that
paper that you’re pointing at right now
510
00:36:50,839 --> 00:36:55,839
is not the reason why they’re a big
deal”. We also saw this in the context
511
00:36:55,839 --> 00:36:59,790
of an NSA paper which was published
a couple of days ago, thanks to
512
00:36:59,790 --> 00:37:02,690
some other folks.
Jacob: Sad, ‘some other folks’!
513
00:37:02,690 --> 00:37:04,950
Roger: ‘Some other folks’. I won’t specify
514
00:37:04,950 --> 00:37:10,020
exactly which other folks. And they
similarly had a traffic correlation attack.
515
00:37:10,020 --> 00:37:15,579
And in the paper it’s really a bad one.
It’s the same as the paper that was
516
00:37:15,579 --> 00:37:20,140
published in 2003, in the open literature.
There was a much better paper
517
00:37:20,140 --> 00:37:25,309
published in 2004, in the open literature,
that apparently these folks didn’t read.
518
00:37:25,309 --> 00:37:29,619
So I don’t wanna say traffic correlation
attacks don’t work, but all these papers
519
00:37:29,619 --> 00:37:35,609
that we’re looking at don’t show…
aren’t very good papers.
520
00:37:35,609 --> 00:37:39,120
Jacob: So one of the solutions to a lot
of journalists who don’t understand
521
00:37:39,120 --> 00:37:42,710
technology is that it’s actually quite
easy to be a journalist by comparison
522
00:37:42,710 --> 00:37:47,319
to being a technologist. It’s possible
to write about things in a factually
523
00:37:47,319 --> 00:37:51,359
correct way; sometimes you don’t always
reach the right audiences, that can
524
00:37:51,359 --> 00:37:55,489
actually be difficult. It depends. So you
have to write for different reading
525
00:37:55,489 --> 00:37:59,390
comprehension levels, e.g. And we tried
to write for people who understand
526
00:37:59,390 --> 00:38:03,249
the internet. At least when I write as
a journalist. And so, when I sometimes
527
00:38:03,249 --> 00:38:07,210
take off my Tor hat I put on my journalistic
hat. And part of the reason is that
528
00:38:07,210 --> 00:38:10,369
in order to even tell you about some
of the things that we learn, if I don’t
529
00:38:10,369 --> 00:38:14,599
put on my journalistic hat I get a nice
pair of handcuffs. So it’s very important
530
00:38:14,599 --> 00:38:17,719
to have journalistic protection so that we
can inform you about these things.
531
00:38:17,719 --> 00:38:23,430
So e.g. it is the case that XKeyscore
rules – we published some of them.
532
00:38:23,430 --> 00:38:28,589
Not ‘we’, Tor. But me and this set of
people at the top, of this by-line here.
533
00:38:28,589 --> 00:38:33,420
In NDR. Some of you know NDR, it’s a very
large German publication. I also publish
534
00:38:33,420 --> 00:38:37,730
with Der Spiegel, as a journalist. In this
case we published XKeyscore rules.
535
00:38:37,730 --> 00:38:41,609
Where we specifically learned an important
lesson. And the important lesson was,
536
00:38:41,609 --> 00:38:44,660
even if you’re a journalist explaining
things exactly technically correctly
537
00:38:44,660 --> 00:38:47,739
– people will still get it wrong. It’s just
not the journalists that get it wrong.
538
00:38:47,739 --> 00:38:50,640
It’s the readers. Very frustrating.
539
00:38:50,640 --> 00:38:55,079
People decided that because the NSA
definitely has XKeyscore rules, that is,
rules for surveilling the internet, where
00:38:55,079 --> 00:38:58,529
rules for surveilling the internet, where
they’re looking at big traffic buffers.
541
00:38:58,529 --> 00:39:03,890
TEMPORA e.g. the British surveillance
system that is built on XKeyscore.
542
00:39:03,890 --> 00:39:08,190
With a – probably – week-long buffer of
all internet traffic. That’s a big buffer,
543
00:39:08,190 --> 00:39:15,249
by the way. Running these XKeyscore
rules across that traffic set,
544
00:39:15,249 --> 00:39:18,130
they would find that people were
connecting to directory authorities.
545
00:39:18,130 --> 00:39:20,959
One of those directory authorities is
mine, actually, quite ironically. And
546
00:39:20,959 --> 00:39:25,760
then Sebastian Hahn, and other people
in this audience. And some people said:
547
00:39:25,760 --> 00:39:30,839
“Oh, don’t use Tor because the NSA will
be monitoring you!” That is exactly
548
00:39:30,839 --> 00:39:35,660
the wrong take-away. Because there are
XKeyscore rules on the order of tens of
549
00:39:35,660 --> 00:39:39,890
thousands, from what we can tell.
So everything you do is going through
550
00:39:39,890 --> 00:39:43,000
these giant surveillance systems. And
what you’ll learn when you monitor
551
00:39:43,000 --> 00:39:48,579
someone using Tor is that they’re
using Tor potentially, in that buffer.
552
00:39:48,579 --> 00:39:51,299
Which is different than ‘they learn
for sure that you were going to
553
00:39:51,299 --> 00:39:55,769
the Chaos Computer Club’s web site’,
or that you were going to a dating site.
554
00:39:55,769 --> 00:39:59,430
So it’s the difference between ‘they learn
some teeny bit of information about you’,
555
00:39:59,430 --> 00:40:02,920
that you’re using an anonymity
system, versus ‘they learned exactly
556
00:40:02,920 --> 00:40:06,469
what you were doing on the internet’. Now
if there were only a few XKeyscore rules
557
00:40:06,469 --> 00:40:10,849
at all, and it was just the one about Tor,
then the conclusion people reached
558
00:40:10,849 --> 00:40:15,260
would be correct. But it’s exactly not
true. The XKeyscore system is so powerful
559
00:40:15,260 --> 00:40:18,900
that if you have a logo for a company,
so anyone here that runs a company,
560
00:40:18,900 --> 00:40:23,440
and you put a logo inside of a document,
the XKeyscore system can find that logo
561
00:40:23,440 --> 00:40:28,489
in all of the documents flowing across the
internet in real-time. And alert someone
562
00:40:28,489 --> 00:40:34,079
that someone has sent a .DOC or a PDF with
that image inside of it. And alert them.
563
00:40:34,079 --> 00:40:38,229
So that they can intercept it. So the
lesson is not “Don’t use Tor because
564
00:40:38,229 --> 00:40:43,200
XKeyscore may put your metadata into
a database, in the so-called ‘corporate
565
00:40:43,200 --> 00:40:47,930
repositories’”. The lesson is “Holy shit,
there’s this gigantic buffering system
566
00:40:47,930 --> 00:40:52,259
which has search capabilities that even
allow you to search inside of documents.
567
00:40:52,259 --> 00:40:55,740
Really, really advanced capabilities where
they can select that traffic and put it
568
00:40:55,740 --> 00:41:00,069
somewhere else”. “Use an anonymity
system!” And also: “Look, they’re
569
00:41:00,069 --> 00:41:04,789
targeting anonymity systems, even in the
United States, which, at least for the NSA
570
00:41:04,789 --> 00:41:08,239
they’re not supposed to be doing those
kinds of things”. They literally were
571
00:41:08,239 --> 00:41:11,369
caught lying here. They’re doing
bulk internet surveillance even
572
00:41:11,369 --> 00:41:16,109
in the United States. Using these
kinds of systems. That’s really scary.
573
00:41:16,109 --> 00:41:19,680
But the real big lesson to take away from
that is, actually, that they’re doing this
574
00:41:19,680 --> 00:41:22,440
for all the protocols that they can
write fingerprints for. And they have
575
00:41:22,440 --> 00:41:28,770
a generic language where they can actually
describe protocols. And so we published
576
00:41:28,770 --> 00:41:32,529
a number of those, we = NDR. And I would
really recommend you read and understand
577
00:41:32,529 --> 00:41:35,749
that. But the lesson, again, is not
“Oh no, they’re going to detect you’re
578
00:41:35,749 --> 00:41:40,190
using Tor”. We have never said that Tor
can e.g. protect you against someone
579
00:41:40,190 --> 00:41:45,130
seeing that you’re using it. Especially in
the long term. But rather the point is
580
00:41:45,130 --> 00:41:49,509
exactly the scariest point. This mass
internet surveillance is real. And
581
00:41:49,509 --> 00:41:55,579
it is the case that it is real-time.
And it’s a real problem.
582
00:41:55,579 --> 00:42:02,540
applause
583
00:42:02,540 --> 00:42:05,910
Roger: If you’re using Tor they see that
you’re using Tor. If you’re not using Tor
584
00:42:05,910 --> 00:42:09,630
they see exactly where you’re going.
You end up in a list of people who went
585
00:42:09,630 --> 00:42:13,150
to ‘this’ website, or ‘this’ website,
or used ‘this’ service, or sent
586
00:42:13,150 --> 00:42:18,589
‘this’ document. And the diversity of
Tor users is part of the safety, where,
587
00:42:18,589 --> 00:42:21,779
just because they know you’re using
Tor doesn’t tell them that much.
588
00:42:21,779 --> 00:42:24,890
One of the other things I’ve been
wrestling with after looking at a bunch
589
00:42:24,890 --> 00:42:29,039
of these documents lately is the whole
‘how do we protect against pervasive
590
00:42:29,039 --> 00:42:33,079
surveillance’. And this is an entire talk
on its own. We’ve been doing some
591
00:42:33,079 --> 00:42:39,380
design changes. We pushed out some changes
in Tor that protect you more against
592
00:42:39,380 --> 00:42:42,980
pervasive surveillance. We – for the
technical people out there – we’ve reduced
593
00:42:42,980 --> 00:42:47,799
the number of guard relays that you use
by default from 3 to 1. So there are
594
00:42:47,799 --> 00:42:52,609
fewer places on the internet that get to
see your Tor traffic. That’s a good start.
595
00:42:52,609 --> 00:42:56,009
One of the other lessons we’ve been
realizing: The internet is more centralized
596
00:42:56,009 --> 00:43:01,479
than we’d like. So it’s easy to say
“Oh, we just need more exit relays,
597
00:43:01,479 --> 00:43:05,230
and then we’ll have more protection
against these things”. But if we put
598
00:43:05,230 --> 00:43:09,450
another exit relay in that same data
center in Frankfurt that they’re
599
00:43:09,450 --> 00:43:13,719
already watching that’s not actually going
to give us more safety against these
600
00:43:13,719 --> 00:43:18,950
pervasive surveillance adversaries.
Something else I realized: so we used
601
00:43:18,950 --> 00:43:23,119
to talk about how Tor does these two
different things. We’ve got anonymity,
602
00:43:23,119 --> 00:43:27,059
we’re trying to protect against somebody
trying to learn what you’re doing, and
603
00:43:27,059 --> 00:43:30,470
we’ve got circumvention, censorship
circumvention. We’re trying to protect
604
00:43:30,470 --> 00:43:33,589
against somebody trying to prevent
you from going somewhere.
605
00:43:33,589 --> 00:43:37,980
But it turns out in the surveillance
case they do deep packet inspection
606
00:43:37,980 --> 00:43:42,200
to figure out what protocol you’re
using, to find out what you’re up to.
607
00:43:42,200 --> 00:43:45,710
And in the censorship case they do
deep packet inspection to figure out
608
00:43:45,710 --> 00:43:49,730
what protocol you’re using, to decide
whether to block it. So it’s actually…
609
00:43:49,730 --> 00:43:55,049
these fields are much more related
than we had realized before. And
610
00:43:55,049 --> 00:43:59,140
it took us a while, I’m really happy that
we have these documents to look at,
611
00:43:59,140 --> 00:44:03,599
so that we have a better understanding
of how this global surveillance
612
00:44:03,599 --> 00:44:10,660
and censorship works. Long ago, so in
2007, I ended up doing a talk at the NSA,
613
00:44:10,660 --> 00:44:14,619
to try to convince them that we were not
the bad guys. And you can read the notes
614
00:44:14,619 --> 00:44:18,530
that they took about my talk at the
NSA. Because they’re published
615
00:44:18,530 --> 00:44:22,660
in the Washington Post. So I encourage you
to go read what the NSA thought of my talk
616
00:44:22,660 --> 00:44:28,440
to them. That same year I ended up going
to GCHQ, to give a talk to them, to try
617
00:44:28,440 --> 00:44:31,799
to convince them that we were not the
bad people. And I thought to myself:
618
00:44:31,799 --> 00:44:35,230
“I don’t want to give them anything
useful. I don’t want to talk about
619
00:44:35,230 --> 00:44:39,599
anonymity, because I know they’re going
to try to break anonymity. So I’m going
620
00:44:39,599 --> 00:44:43,270
to give them a talk that has nothing to do
with anything that they should care about.
621
00:44:43,270 --> 00:44:48,359
I’m going to talk about the censorship
arms race in China, and DPI, and stuff
622
00:44:48,359 --> 00:44:53,509
like that, that they shouldn’t care
about at all”. Boy, were we wrong!
623
00:44:53,509 --> 00:44:59,420
applause
624
00:44:59,420 --> 00:45:03,389
So the other thing to think about here,
there are a bunch of different pluggable
625
00:45:03,389 --> 00:45:08,300
transports that could come in handy
against the surveillance adversary.
626
00:45:08,300 --> 00:45:12,380
We have, so far, been thinking of
pluggable transports in terms of
627
00:45:12,380 --> 00:45:16,140
‘there’s somebody trying to censor your
connection, they’re doing DPI, or they’re
628
00:45:16,140 --> 00:45:20,590
looking for addresses, and they’re trying
to block things’. One of the things
629
00:45:20,590 --> 00:45:24,680
we learned from this past summer’s
documents: imagine an adversary
630
00:45:24,680 --> 00:45:29,140
who builds a list of all the public Tor
relays. And then they build a list of
631
00:45:29,140 --> 00:45:33,059
all of the IP addresses that connect
to those Tor relays. Now they know
632
00:45:33,059 --> 00:45:36,421
all the bridges, and many of the users.
And now they build a list of all the
633
00:45:36,421 --> 00:45:41,400
IP addresses that connect to those IP
addresses. And they go a few hops out,
634
00:45:41,400 --> 00:45:46,610
and now they know all the public relays,
all the bridges, all the users, all of
635
00:45:46,610 --> 00:45:50,079
the other things that are connected to
Tor. And they can keep track of which ones
636
00:45:50,079 --> 00:45:55,849
they should log traffic for, for the next
6 months, rather than the next week.
637
00:45:55,849 --> 00:46:00,599
That’s a really scary adversary. Some of
the pluggable transports we’ve been
638
00:46:00,599 --> 00:46:06,009
working on could actually come in handy
here. So ‘Flash proxy’ is one of the ones
639
00:46:06,009 --> 00:46:10,709
you heard about in last year’s talk. The
basic idea of a Flash proxy is to get
640
00:46:10,709 --> 00:46:16,940
users running web browsers to volunteer
running WebRTC, or something like that
641
00:46:16,940 --> 00:46:22,150
to basically be a short-lived bridge
between the censored user and
642
00:46:22,150 --> 00:46:26,979
the Tor Network. So the idea is that you
get millions of people running browsers,
643
00:46:26,979 --> 00:46:31,450
and then you can proxy from inside China,
or Syria, or America, or wherever
644
00:46:31,450 --> 00:46:36,650
the problem is, through the browser into
the Tor Network. But from the surveillance
645
00:46:36,650 --> 00:46:42,170
perspective suddenly they end up with
an enormous list of millions of people
646
00:46:42,170 --> 00:46:46,209
around the world that are
basically buffering the Tor user
647
00:46:46,209 --> 00:46:50,089
from the Tor Network. So if they
start with this list of IP addresses,
648
00:46:50,089 --> 00:46:52,710
and they’re trying to build a list of
everything, now they end up
649
00:46:52,710 --> 00:46:56,210
with millions of IP addresses
that have nothing to do with Tor.
650
00:46:56,210 --> 00:46:59,640
And they have to realize, at the time
they’re watching, that they want to go
651
00:46:59,640 --> 00:47:03,769
one more hop out. So I don’t
know if that will work. But this is
652
00:47:03,769 --> 00:47:08,680
an interesting research area that more
people need to look at: How can we,
653
00:47:08,680 --> 00:47:12,880
against an adversary who’s trying to build
a list of everybody who has anything to do
654
00:47:12,880 --> 00:47:17,749
with Tor, how can we have
Tor users not end up on that list.
655
00:47:17,749 --> 00:47:22,729
What sort of transports or tunneling
through Google appspot,
656
00:47:22,729 --> 00:47:27,440
or other tools like that can we use
to break that chain, so it’s not as easy
657
00:47:27,440 --> 00:47:32,709
for them to track down
where all the users are.
658
00:47:32,709 --> 00:47:36,500
Okay, Silk Road 2, we’ve had a lot
of questions about. I think it’s called
659
00:47:36,500 --> 00:47:41,099
Operation Onymous. I actually talked
to an American law enforcement person
660
00:47:41,099 --> 00:47:46,250
who was involved in this. And he
told me, from his perspective, exactly
661
00:47:46,250 --> 00:47:50,720
how it happened. Apparently the
Silk Road 2 guy wrote his name down
662
00:47:50,720 --> 00:47:54,979
somewhere. So they brought him in,
and started asking him questions. And
663
00:47:54,979 --> 00:47:58,760
as soon as they started asking him
questions he started naming names.
664
00:47:58,760 --> 00:48:02,479
And they counted up to 16 names, and
they went and arrested all those people,
665
00:48:02,479 --> 00:48:05,730
and collected their computers. And then
they put out a press release, saying
666
00:48:05,730 --> 00:48:10,140
that they had an amazing Tor attack.
667
00:48:10,140 --> 00:48:13,019
applause
668
00:48:13,019 --> 00:48:18,069
So there are a couple of lessons here. One
of them is: Yes, it’s another case where
669
00:48:18,069 --> 00:48:25,250
opsec failed. But the other lesson that
we learn is: These large law enforcement
670
00:48:25,250 --> 00:48:32,729
adversaries are happy to use press spin
and lies, and whatever else it takes
671
00:48:32,729 --> 00:48:36,779
to try to scare people away from
having safety on the internet.
672
00:48:36,779 --> 00:48:40,390
Jacob: This is a really… to me,
especially, if I take off my Tor hat
673
00:48:40,390 --> 00:48:44,820
and put on my journalistic hat, as if
I can actually take off hats etc., but
674
00:48:44,820 --> 00:48:49,019
it’s really terrifying that journalists
don’t actually ask hard questions
675
00:48:49,019 --> 00:48:54,950
about that. You know, the Europol people
that spoke to the press, they talked
676
00:48:54,950 --> 00:48:59,119
about this as if they had some incredible
attack, they talked about 0-day,
677
00:48:59,119 --> 00:49:02,999
they talked about how, you know,
they had broken Tor, “You’re not safe
678
00:49:02,999 --> 00:49:05,750
on the Dark Web”. We don’t even use the
term ‘Dark Web’. That’s how you know
679
00:49:05,750 --> 00:49:13,509
that they’re full of shit. But it’s…
applause
680
00:49:13,509 --> 00:49:18,480
That’s sort of like when people write Tor
in all caps, ‘TOR’, or ‘dark web’,
681
00:49:18,480 --> 00:49:22,809
that kind of stuff, this is a bad sign. But
the way they talk about it, it was clear
682
00:49:22,809 --> 00:49:27,230
that they, as far as we can tell, they
don’t have that. But they really hyped it.
683
00:49:27,230 --> 00:49:32,529
As much as they possibly could. I mean,
it is, effectively, and I think it is even
684
00:49:32,529 --> 00:49:36,970
technically a psychological operation
against the civilian population. They
685
00:49:36,970 --> 00:49:41,589
want to scare you into believing that Tor
doesn’t work. Because, in fact, it does work,
686
00:49:41,589 --> 00:49:45,999
and it is a problem for them. So any time
they can ever have some kind of win at all
687
00:49:45,999 --> 00:49:49,489
they always spin it as if they’re great,
powerful adversaries, and it’s
688
00:49:49,489 --> 00:49:54,189
us-versus-them. And that’s exactly wrong.
It is not us-versus-them. Because we all
689
00:49:54,189 --> 00:49:57,900
need anonymity. We all absolutely need
that. And they shouldn’t be treating us
690
00:49:57,900 --> 00:50:02,819
as adversaries. They, in fact, are also
Tor users, quite ironically. So it is
691
00:50:02,819 --> 00:50:06,150
interesting though, because they know that
they haven’t done that. But they don’t
692
00:50:06,150 --> 00:50:09,059
want you to know that they haven’t done
that. In fact, they want you to know
693
00:50:09,059 --> 00:50:11,529
the opposite. Of course we could be
wrong. They could have some
694
00:50:11,529 --> 00:50:17,989
super-secret exploit, but as far as we can
tell that just is not the case. So, what’s
695
00:50:17,989 --> 00:50:20,920
to be learned from this? We used to think
it was just American law enforcement
696
00:50:20,920 --> 00:50:24,709
that were scary jerks. Now it’s also
European. I don’t know if that’s
697
00:50:24,709 --> 00:50:28,670
the right way to phrase it. But hopefully some
of you will go and work at Europol,
698
00:50:28,670 --> 00:50:31,929
and tell us what’s really going on.
699
00:50:31,929 --> 00:50:37,739
applause
700
00:50:37,739 --> 00:50:42,799
Roger: Speaking of Hidden Services. We
have a new design in mind, that will have
701
00:50:42,799 --> 00:50:47,839
some stronger crypto properties, and make
it harder to enumerate Hidden Services.
702
00:50:47,839 --> 00:50:52,059
It won’t solve some of the big anonymity
questions that are still open research
703
00:50:52,059 --> 00:50:55,640
questions. But there are a lot of
improvements we’d like to make,
704
00:50:55,640 --> 00:50:59,789
to make the crypto more secure, and
performance changes etc. And we’d been
705
00:50:59,789 --> 00:51:04,529
thinking about doing some sort of crowd
funding, kickstarter-like thing, to make
706
00:51:04,529 --> 00:51:08,630
Hidden Services work better. We’ve got
a funder who cares about understanding
707
00:51:08,630 --> 00:51:12,790
Hidden Services, but that’s not the same
as actually making them more secure.
708
00:51:12,790 --> 00:51:17,329
So we’d love to chat with you after this
about how to make one of those
709
00:51:17,329 --> 00:51:19,839
kickstarters actually work.
710
00:51:19,839 --> 00:51:25,529
Jacob: Right, so, if you have questions
we have some amount of time for questions.
711
00:51:25,529 --> 00:51:28,489
And while you line up at the microphone
I’ll tell you a quick story. So if you
712
00:51:28,489 --> 00:51:31,120
have questions please line up at the
microphone, so we can do this.
713
00:51:31,120 --> 00:51:34,010
This is a picture of a man who was
assassinated in San Francisco.
714
00:51:34,010 --> 00:51:36,509
His name is Harvey Milk. Anybody
here – ever hear of Harvey Milk?
715
00:51:36,509 --> 00:51:38,809
applause
716
00:51:38,809 --> 00:51:43,319
Great. Harvey Milk was basically the
first out-gay politician in, I think,
717
00:51:43,319 --> 00:51:47,569
the United States. He was a city council
member in San Francisco. And this was
718
00:51:47,569 --> 00:51:52,059
during a huge fever-pitch era where…
basically it was the battle between:
719
00:51:52,059 --> 00:51:56,999
“Are people who are gay people or not?”
And what he said is: Go home and
720
00:51:56,999 --> 00:52:00,190
tell your brothers, your mothers, your
sisters, your family members and
721
00:52:00,190 --> 00:52:03,890
your co-workers that you’re gay. Tell
them that, so that when they advocate
722
00:52:03,890 --> 00:52:08,549
for violence against gay people, when
they advocate for harm against you
723
00:52:08,549 --> 00:52:13,609
that they know they’re talking about you.
Not an abstract boogieman. But someone
724
00:52:13,609 --> 00:52:18,790
that they actually know, and that they
love. We need every person in this room,
725
00:52:18,790 --> 00:52:22,699
every person watching this video later to
go home and talk about how you needed
726
00:52:22,699 --> 00:52:26,749
anonymity, for 5 or 10 minutes. How you
needed it every day to do your job.
727
00:52:26,749 --> 00:52:30,949
We need people to reach out. Now that’s
a sad story with Harvey Milk which is
728
00:52:30,949 --> 00:52:33,760
that he and Mayor Moscone of San
Francisco were actually killed by
729
00:52:33,760 --> 00:52:38,539
a very crazy person who was also in city
government, in the American tradition of
730
00:52:38,539 --> 00:52:43,549
extreme gun violence. He was shot and
killed. And that person actually got away
731
00:52:43,549 --> 00:52:48,049
with it. The so-called ‘Twinkie defense’.
So we’re not trying to draw that parallel.
732
00:52:48,049 --> 00:52:53,220
Just to be clear please don’t shoot us and
kill us! Not even funny, unfortunately.
733
00:52:53,220 --> 00:52:57,890
But to understand that we are really
under threat, a lot of pressure. There’s
734
00:52:57,890 --> 00:53:02,410
a lot of pressure. We get pressure from
law enforcement investigation agencies
735
00:53:02,410 --> 00:53:08,239
to backdoor Tor, and we tell them:
“No”, and that takes a lot of stress
736
00:53:08,239 --> 00:53:12,079
and dumps it on us. And we need support
from a lot of people, to tell them
737
00:53:12,079 --> 00:53:16,459
to back off. It can’t just be us that
say that. Or we will lose some day.
738
00:53:16,459 --> 00:53:20,499
And there are also very scary adversaries
that do not care at all about the law.
739
00:53:20,499 --> 00:53:25,000
Not that those guys care about the law but
really don’t care about the law at all.
740
00:53:25,000 --> 00:53:29,430
And we need people to understand how
important anonymity is, and make sure
741
00:53:29,430 --> 00:53:35,040
that that goes into every conversation.
So really, go home and teach your friends
742
00:53:35,040 --> 00:53:38,489
and your family members about your
need for anonymity. This lesson
743
00:53:38,489 --> 00:53:42,299
from Harvey Milk was very useful. It is
the case that now, in California where
744
00:53:42,299 --> 00:53:46,180
there was a huge fever-pitch battle about
this, you can e.g. be gay and be
745
00:53:46,180 --> 00:53:50,760
a school teacher. That was one of the
battles that Harvey Milk helped win.
746
00:53:50,760 --> 00:53:58,759
applause
747
00:53:58,759 --> 00:54:02,520
So, with that I think
that we have time for…
748
00:54:02,520 --> 00:54:06,200
Herald: Yeah, we have like 10 minutes left
for questions. So, thank you so much
749
00:54:06,200 --> 00:54:09,689
for the talk! It’s really inspiring.
Thank you for keeping up the work!
750
00:54:09,689 --> 00:54:17,259
applause
751
00:54:17,259 --> 00:54:20,233
Really! Although you do this every year
it never gets old. And I think your…
752
00:54:20,233 --> 00:54:24,119
every year you give people the chance to
leave the Congress with a feeling of hope
753
00:54:24,119 --> 00:54:26,869
and purpose. So, thank you so much for
everything you do and every minute
754
00:54:26,869 --> 00:54:30,489
you spend on this project. So we start
with a question from the internet.
755
00:54:30,489 --> 00:54:32,339
applause
756
00:54:32,339 --> 00:54:34,739
Jacob: We’d like to take a few questions
from the internet all at once,
757
00:54:34,739 --> 00:54:36,889
if possible, so we can try to answer
them as quickly as possible.
758
00:54:36,889 --> 00:54:38,469
Signal Angel: Okay.
Herald: Alright.
759
00:54:38,469 --> 00:54:41,569
Signal Angel: So, the first one: Yesterday
you said that SSH is broken. So
760
00:54:41,569 --> 00:54:45,719
what should we use to safely
administrate our Tor relays?
761
00:54:45,719 --> 00:54:49,950
Jacob: Hah! That’s great. So,
first of all! Next set of questions!
762
00:54:49,950 --> 00:54:53,259
Signal Angel: So the next one is: How much
money would be needed to become independent
763
00:54:53,259 --> 00:54:56,170
of government funding,
and is that even desired?
764
00:54:56,170 --> 00:54:59,229
Jacob: Ah, do you want me to do both?
Roger: Sure.
765
00:54:59,229 --> 00:55:00,529
Jacob: Okay.
Signal Angel: Hope so.
766
00:55:00,529 --> 00:55:05,579
Jacob: Okay. First question: Consider
using a Tor Hidden Service, and then
767
00:55:05,579 --> 00:55:09,079
SSH’ing into that Tor Hidden Service.
Composition of cryptographic components
768
00:55:09,079 --> 00:55:15,680
is probably very important. A detail about
SSH: We don’t know what is going on.
769
00:55:15,680 --> 00:55:19,299
We only know what was claimed in those
documents. That’s a really scary claim.
770
00:55:19,299 --> 00:55:24,170
This creates a political problem. The U.S.
Congress and other political bodies
771
00:55:24,170 --> 00:55:27,680
should really be asking the secret
services if they really have a database
772
00:55:27,680 --> 00:55:31,160
called CAPRI OS where they store
SSH decrypts. And how they populate
773
00:55:31,160 --> 00:55:35,209
that database. Because that is critical
infrastructure. We can’t solve that problem
774
00:55:35,209 --> 00:55:39,259
with the knowledge that we have right now.
But we know now: There is a problem.
775
00:55:39,259 --> 00:55:42,520
What is that problem? So, composition
of those systems: It seems to be,
776
00:55:42,520 --> 00:55:45,899
the documents say that they haven’t broken
the crypto in Tor Hidden Services. So
777
00:55:45,899 --> 00:55:51,499
put those two together. And also consider
that cryptography only buys you time.
778
00:55:51,499 --> 00:55:55,640
It really isn’t the case that all the
crypto we have today is going to be good
779
00:55:55,640 --> 00:55:59,579
maybe in 150 years. If Sci-Fi quantum
computers ever come out, and they
780
00:55:59,579 --> 00:56:03,119
actually work, Shor’s algorithm and
other things really seem to suggest
781
00:56:03,119 --> 00:56:07,160
we have a lot of trouble ahead. And the
second part, about money: Yeah, we would
782
00:56:07,160 --> 00:56:10,999
love to replace Government funding. I mean
at least I would. But that isn’t to say
783
00:56:10,999 --> 00:56:14,549
that we don’t respect that there are
people that do fund us to do good things.
784
00:56:14,549 --> 00:56:20,099
We do take money from agencies like, e.g.,
the [Bureau of] Democracy, Human Rights and Labor,
785
00:56:20,099 --> 00:56:22,470
at the State Department. They’re sort of
like the advertising arm for the
786
00:56:22,470 --> 00:56:26,519
gun-running part of the State Department,
as Julian Assange would say. And they
787
00:56:26,519 --> 00:56:30,029
actually care about Human Rights. They
care that you have access to anonymity.
788
00:56:30,029 --> 00:56:35,039
It’s weird because the State Department
– the rest of it – might not care. But,
789
00:56:35,039 --> 00:56:38,670
we really, really would like to off-set
that money. But we’d like to grow.
790
00:56:38,670 --> 00:56:42,959
We’d like to be able to hire 100 people
in this room to work on this full-time.
791
00:56:42,959 --> 00:56:47,999
Because the planet needs anonymity. But
that requires that we find that money.
792
00:56:47,999 --> 00:56:52,219
And the best way at the moment is by
writing grant proposals. And that is how
793
00:56:52,219 --> 00:56:55,539
we have in fact done that. And that
allows us also to operate openly.
794
00:56:55,539 --> 00:56:59,599
So we don’t have e.g. clearances. And we
try to publish everything we can about it.
795
00:56:59,599 --> 00:57:03,539
And if you ever write a FOIA we always
tell the agency that has received the
796
00:57:03,539 --> 00:57:09,480
Freedom Of Information request: Give the
requestor everything. Give it all to them.
797
00:57:09,480 --> 00:57:13,280
We have nothing to hide about this, we
want you to see that. We want you to see
798
00:57:13,280 --> 00:57:17,059
that when a government agency has paid
us money that we have done it for THIS
799
00:57:17,059 --> 00:57:20,700
line item, and THIS line item. And we’ve
done it as well as we could do it, and
800
00:57:20,700 --> 00:57:24,420
it is in line with the open research, and
we have really done a good thing,
801
00:57:24,420 --> 00:57:26,250
that helps people.
802
00:57:26,250 --> 00:57:30,979
Roger: So I’d love to diversify our
funding. I’d love to have foundations,
803
00:57:30,979 --> 00:57:37,929
I’d love to have the EFF model where
individuals fund us because we do great things
804
00:57:37,929 --> 00:57:42,839
– look at what we did over the past year –
and in fact, right here: Look at what we
805
00:57:42,839 --> 00:57:46,660
did over the past year. We’ve done some
amazing things, we’re gonna do some more
806
00:57:46,660 --> 00:57:50,849
amazing things next year. We need your
help to actually make all of this happen.
807
00:57:50,849 --> 00:57:55,229
Jacob: Anybody here
a Bitcoin millionaire?
808
00:57:55,229 --> 00:57:57,340
Because we now take Bitcoin!
809
00:57:57,340 --> 00:58:02,630
applause
810
00:58:02,630 --> 00:58:05,260
Herald: Alright, let’s take
a question from microphone 1.
811
00:58:05,260 --> 00:58:09,180
Question: Just a short question:
is there a follow-up on the
812
00:58:09,180 --> 00:58:14,539
Thomas White tor-talk mailing list thing?
813
00:58:14,539 --> 00:58:18,579
Roger: So, Thomas White runs a few exit
relays. Some of them are quite large,
814
00:58:18,579 --> 00:58:24,519
I’m very happy he does that. It is quite
normal for exit relays to come and go.
815
00:58:24,519 --> 00:58:29,470
He is in England, and as far as I can tell
England is not a very good place to be
816
00:58:29,470 --> 00:58:36,249
these days. But he’s trying to fix his
country from inside which is really great.
817
00:58:36,249 --> 00:58:40,920
Basically the short version is: It’s not
a big deal. He runs some exit relays,
818
00:58:40,920 --> 00:58:45,160
somebody tries to take them down, there
are 6000 relays in the network right now,
819
00:58:45,160 --> 00:58:48,609
they go up and down, it’s normal.
820
00:58:48,609 --> 00:58:52,630
Question: Is this related to the Tor
blog post, that Thomas White thing,
821
00:58:52,630 --> 00:58:55,380
where you said there’s an upcoming…
822
00:58:55,380 --> 00:58:59,630
Roger: It is unrelated, except for the
fact that everybody was watching.
823
00:58:59,630 --> 00:59:03,130
So then, when he wrote a tor-talk mail
saying “Hey, I’m concerned about my
824
00:59:03,130 --> 00:59:06,760
exit relays”, suddenly all the journalists
said: “Oh my god, they must be
825
00:59:06,760 --> 00:59:09,069
the same thing!” So, no, unrelated!
826
00:59:09,069 --> 00:59:11,180
Jacob: There are a lot of people that
have been attacking the Tor network.
827
00:59:11,180 --> 00:59:13,940
You’ve probably seen there’ve been
Denial-of-Service attacks, and things
828
00:59:13,940 --> 00:59:18,029
like that on the Tor directory
authorities. This is what I was saying
829
00:59:18,029 --> 00:59:22,319
one or two slides ago when I said “Please
tell people the value of Tor, and that
830
00:59:22,319 --> 00:59:26,789
you need it”. Because when people do
Denial-of-Service attacks, when they see
831
00:59:26,789 --> 00:59:30,709
servers, we really need, in a peer-to-peer
network way, to drum up more relays
832
00:59:30,709 --> 00:59:34,449
to actually increase the bandwidth
capacity, to increase the exit capacity.
833
00:59:34,449 --> 00:59:38,609
And it’s very important to do that. Right?
I mean it’s very, very serious that
834
00:59:38,609 --> 00:59:41,670
those things happen. But it’s also
important that the network
835
00:59:41,670 --> 00:59:45,099
is designed with the expectation that
thieves will steal computer systems,
836
00:59:45,099 --> 00:59:50,749
that jerks will denial-of-service them
etc. So if you can run an exit relay,
837
00:59:50,749 --> 00:59:53,789
thank you! Thank you for doing that.
Next question?
838
00:59:53,789 --> 00:59:55,869
applause
Herald: Yeah. Let’s take a question
839
00:59:55,869 --> 00:59:56,890
from microphone 2.
840
00:59:56,890 --> 01:00:00,979
Question: First of all a quick shoutout to
your Ooni friend. Please don’t ask people
841
01:00:00,979 --> 01:00:06,299
to run arbitrary code over the internet.
Curl-piping into bash is not good style.
842
01:00:06,299 --> 01:00:09,829
Roger: There’s a Debian package that we’re working
on also that should be a lot better.
843
01:00:09,829 --> 01:00:13,000
Jacob: Yeah, ‘apt-get install ooniprobe’
will also work.
844
01:00:13,000 --> 01:00:18,510
Question: Do you have any plans
of implementing IPv6, finally?
845
01:00:18,510 --> 01:00:24,839
Jacob: So there is IPv6, so Linus
Nordberg, one of the finest Tor people
846
01:00:24,839 --> 01:00:32,029
I’ve ever met, he, in fact, helped add
IPv6 support, initial IPv6 support
847
01:00:32,029 --> 01:00:36,809
to the Tor network. So, e.g. you can,
in fact, exit through the Tor network
848
01:00:36,809 --> 01:00:42,660
with IPv4 or IPv6. It is the case that the
Tor relays in the network still all need
849
01:00:42,660 --> 01:00:48,559
IPv4, not just IPv6. My Tor directory
authority which runs in California,
850
01:00:48,559 --> 01:00:52,619
it has an IPv4 and an IPv6 address,
so if you have an IPv6 address you can
851
01:00:52,619 --> 01:00:55,799
bootstrap, you can connect to that.
You could do some interesting
852
01:00:55,799 --> 01:00:59,469
pluggable-transport stuff as well. So
that is on the road map. This is another
853
01:00:59,469 --> 01:01:03,460
example of: If you really care about that
issue please send us your Bitcoins!
854
01:01:03,460 --> 01:01:07,619
And it would be really fantastic because
we really want that! But right now,
855
01:01:07,619 --> 01:01:12,640
you can use Tor as a v4-v6 gateway.
You really can do that, and we would
856
01:01:12,640 --> 01:01:15,980
encourage that. It’s another example
of some kind of neat feature of Tor
857
01:01:15,980 --> 01:01:18,289
which you would never think an
anonymity system would have.
858
01:01:18,289 --> 01:01:23,079
Roger: And in Iran, right now, where IPv6
is not censored because the soft…
859
01:01:23,079 --> 01:01:26,931
the censorship stuff they have from
America and Europe didn’t think
860
01:01:26,931 --> 01:01:30,779
to censor IPv6…
laughter and applause
861
01:01:30,779 --> 01:01:34,989
applause
862
01:01:34,989 --> 01:01:41,079
so you can use a bridge right now in Iran
that connects over IPv6. Works great.
863
01:01:41,079 --> 01:01:43,769
Jacob: Yeah. Next question?
Herald: Alright, microphone 4!
864
01:01:43,769 --> 01:01:46,869
Question: So we heard lots of really
encouraging success stories about Tor
865
01:01:46,869 --> 01:01:50,890
working against a global passive
adversary. But we know that Tor
866
01:01:50,890 --> 01:01:54,819
wasn’t designed for this use case.
The question is: What needs to happen
867
01:01:54,819 --> 01:01:59,099
in order for Tor to actually be
able to handle this, officially?
868
01:01:59,099 --> 01:02:01,890
Is this just research, or some
more development work?
869
01:02:01,890 --> 01:02:06,779
Roger: There’s a lot of really hard open
research questions there. So if you’re…
870
01:02:06,779 --> 01:02:10,890
so, I get… basically one of the
issues is what we call the
871
01:02:10,890 --> 01:02:15,190
end-to-end traffic correlation attack. So
if you can see the flow over here coming
872
01:02:15,190 --> 01:02:18,699
into the Tor network, and you can see the
corresponding flow over here, coming out
873
01:02:18,699 --> 01:02:23,020
of it, then you do some simple statistics,
and you say: “Hey, wait a minute, these
874
01:02:23,020 --> 01:02:27,359
line up!” And there are a bunch of
different directions on how to make that
875
01:02:27,359 --> 01:02:32,680
harder. Basically what you want to
do is drive up the false-positive rate.
876
01:02:32,680 --> 01:02:37,660
So you see a flow here, and there are
actually 1000 flows that look like they
877
01:02:37,660 --> 01:02:41,779
sort of match. And maybe you can do
that by adding a little bit of padding,
878
01:02:41,779 --> 01:02:46,619
or delays, or batching or something. The
research, as we understand it right now,
879
01:02:46,619 --> 01:02:51,049
means that you have to add hours
of delay, not seconds of delay.
880
01:02:51,049 --> 01:02:56,739
That’s kind of crummy. So another way
of phrasing that: Imagine a graph,
881
01:02:56,739 --> 01:03:02,670
the X axis is how much overhead
we’re adding. And the Y axis is
882
01:03:02,670 --> 01:03:06,739
how much security we get against the
end-to-end correlation attack. We have
883
01:03:06,739 --> 01:03:13,049
zero data points on that graph. We have
no idea what the curve looks like.
884
01:03:13,049 --> 01:03:16,249
Jacob: There’s also another point which
is: Roger has an assumption. He says
885
01:03:16,249 --> 01:03:20,809
if we have a high false-positive rate,
that that’s a good thing. Well, maybe,
886
01:03:20,809 --> 01:03:23,440
maybe actually, that’s exactly the
wrong thing. Maybe the result is
887
01:03:23,440 --> 01:03:27,630
that 1000 people get rounded up instead
of 1. The reality is that there is
888
01:03:27,630 --> 01:03:31,030
no system that – as far as we know –
is actually safer than that. Of course
889
01:03:31,030 --> 01:03:34,300
we would say that, we work on Tor. But as
an example: One of the XKeyscore things
890
01:03:34,300 --> 01:03:37,890
that I’ve seen in this research which
we published in the NDR story is that
891
01:03:37,890 --> 01:03:41,180
they were doing an attack on Hotspot Shield
where they were actually doing
892
01:03:41,180 --> 01:03:45,299
traffic correlation where they were able
to de-anonymize VPN users because
893
01:03:45,299 --> 01:03:49,190
it’s a single hop. And then they were
also able to do Quantuminsert to attack
894
01:03:49,190 --> 01:03:54,390
specific users using the VPN. We haven’t
seen evidence of them doing that to Tor.
895
01:03:54,390 --> 01:03:57,680
That also doesn’t mean that every VPN
is broken. It just means that VPN
896
01:03:57,680 --> 01:04:00,729
has a different threat model. There’s
lots of attacks that are like that, and
897
01:04:00,729 --> 01:04:05,400
the problem is the internet is a dangerous
place. So, I mean, Banksy said it best:
898
01:04:05,400 --> 01:04:09,229
He said, in the future people will be
anonymous for 15 minutes. And
899
01:04:09,229 --> 01:04:13,249
I think he may have over-estimated
that. Depending on the attacker.
900
01:04:13,249 --> 01:04:17,209
Roger: There’s a conference called the
Privacy Enhancing Technology Symposium,
901
01:04:17,209 --> 01:04:21,390
petsymposium.org where all of the
Anonymous Communications researchers
902
01:04:21,390 --> 01:04:26,619
get together each year to consider exactly
these sorts of research questions. So,
903
01:04:26,619 --> 01:04:30,359
it’s not just an engineering question,
there’s a lot of basic science left
904
01:04:30,359 --> 01:04:33,199
in terms of how to make
these things harder.
905
01:04:33,199 --> 01:04:35,219
Herald: Alright, the last question
is one from the internet.
906
01:04:35,219 --> 01:04:40,259
Signal Angel: Okay, so, does running
an Ooniprobe involve any legal risks?
907
01:04:40,259 --> 01:04:43,249
Jacob: Okay, so, great! We can take
different questions, cause we’re gonna say
908
01:04:43,249 --> 01:04:44,519
“Talk to Arturo!”
909
01:04:44,519 --> 01:04:46,899
Herald: Alright, so, microphone 3!
910
01:04:46,899 --> 01:04:51,549
Question: Okay, as a new
Tor relay operator I’ve got…
911
01:04:51,549 --> 01:04:57,829
applause
Jacob: Take a bow!
912
01:04:57,829 --> 01:05:04,209
Question: So, for about 2 months now I’ve run
3 relays, rather high bandwidth, and
913
01:05:04,209 --> 01:05:10,380
on 2 of these I had quite strange things
happen. One case: A kernel crash in the
914
01:05:10,380 --> 01:05:16,640
Intel e1000 driver, the other one having
the top-of-the-rack switch just reboot,
915
01:05:16,640 --> 01:05:22,199
which is by the way a Juniper switch.
So I’m kind of concerned about this
916
01:05:22,199 --> 01:05:26,390
operational security. You
know, could you trust that?
917
01:05:26,390 --> 01:05:31,779
Jacob: Yeah, absolutely. So the short
version of it is: Agencies like the NSA,
918
01:05:31,779 --> 01:05:34,920
depending on where you’re located, might
compromise something like your Juniper
919
01:05:34,920 --> 01:05:38,859
switch upstream. They sit on zero-days
for critical infrastructure, that includes
920
01:05:38,859 --> 01:05:44,740
core routers, and switches. But
it may not be such a big thing.
921
01:05:44,740 --> 01:05:49,670
It really depends on where you’re located.
It could also be that the hardware sucks.
922
01:05:49,670 --> 01:05:52,790
laughter
And that the software is not good. And
923
01:05:52,790 --> 01:05:56,839
when you, of course, are pushing,
let’s say gigabits of traffic through it
924
01:05:56,839 --> 01:06:01,789
it falls over. It’s really hard to know.
That’s a really good question,
925
01:06:01,789 --> 01:06:07,080
which is very specific, and kind of
hard for us to address without data.
926
01:06:07,080 --> 01:06:13,070
Question: Sorry, I’m concerned that the
attack, like this, you know, they could,
927
01:06:13,070 --> 01:06:17,939
actually, compromise the machine without
you knowing, or compromise the exact uplink.
928
01:06:17,939 --> 01:06:21,650
And this would actually be a viable
attack, like very low-key,
929
01:06:21,650 --> 01:06:24,489
you don’t see it, as [an] operator,
maybe, if you’re not very careful.
930
01:06:24,489 --> 01:06:28,079
And you can watch all the traffic
going inside, going outside the box.
931
01:06:28,079 --> 01:06:32,769
Jacob: It would be fantastic
if you can prove that theory.
932
01:06:32,769 --> 01:06:36,959
Because, of course, if you can, maybe we
can find other information that allows us
933
01:06:36,959 --> 01:06:41,019
to stop those types of things from
happening, or that can in some way
934
01:06:41,019 --> 01:06:45,660
allow us to fix the problems that are
being exploited. The reality is that
935
01:06:45,660 --> 01:06:48,630
general purpose computers
are quite frankly not very secure,
936
01:06:48,630 --> 01:06:51,759
and special purpose computers
aren’t doing much better.
937
01:06:51,759 --> 01:06:55,140
Roger: I worry not only about active
attacks like that but about passive attacks
938
01:06:55,140 --> 01:06:59,269
where they already have some sort of
surveillance device upstream from you
939
01:06:59,269 --> 01:07:03,939
in your co-location facility, or something
like that. So, yes. These are all
940
01:07:03,939 --> 01:07:09,859
really big concerns. One of the defenses
that Tor has is diversity around the world.
941
01:07:09,859 --> 01:07:14,199
So, hopefully they won’t be able to do
that to all of the relays. But yeah,
942
01:07:14,199 --> 01:07:16,769
this is a big issue. We should
keep talking about it.
943
01:07:16,769 --> 01:07:20,589
Herald: Alright, I just wanna come back
to the question before, for a second.
944
01:07:20,589 --> 01:07:22,719
Because there was a question from the
internet. So the people are not able
945
01:07:22,719 --> 01:07:27,949
to talk. Ooniprobe guy, hey, could you
maybe answer the question, like,
946
01:07:27,949 --> 01:07:30,640
right now, or maybe on Twitter,
or post a link or something?
947
01:07:30,640 --> 01:07:33,390
Because I happen to believe that
it’s a very important question.
948
01:07:33,390 --> 01:07:35,640
You remember the question?
If there are legal restric…
949
01:07:35,640 --> 01:07:40,809
Arturo: Yeah well, I mean the thing is
that we don’t really know like what are
950
01:07:40,809 --> 01:07:43,049
the… who was it that
was asking the question?
951
01:07:43,049 --> 01:07:46,049
Jacob: The internet?
Arturo: Ah, the internet. Okay.
952
01:07:46,049 --> 01:07:51,099
laughter and applause
Jacob laughs
953
01:07:51,099 --> 01:07:58,660
So I guess we can’t know all of the
legal risks involved in every country.
954
01:07:58,660 --> 01:08:02,609
It is definitely the case that in some
countries you may get in trouble
955
01:08:02,609 --> 01:08:11,039
for visiting some websites that are
considered illegal. So, I can go
956
01:08:11,039 --> 01:08:16,189
in more detail into this if you
come later to Noisy Square at 6.
957
01:08:16,189 --> 01:08:17,670
Herald: The internet can’t
come, that’s the problem!
958
01:08:17,670 --> 01:08:20,240
Arturo: Ah, the internet can’t come, shit!
Okay! laughter
959
01:08:20,240 --> 01:08:26,790
So,… laughs
applause
960
01:08:26,790 --> 01:08:29,440
Jacob: There’re a lot of jokes in that!
961
01:08:29,440 --> 01:08:33,770
Arturo: The short answer is that you
should look at the test specifications,
962
01:08:33,770 --> 01:08:38,920
that are written in English, and they have
at the bottom some notes that detail
963
01:08:38,920 --> 01:08:46,190
what can be some of the risks involved.
But we are not lawyers. So we don’t know
964
01:08:46,190 --> 01:08:50,939
what are the risks for all of the
countries. So you should probably speak
965
01:08:50,939 --> 01:08:56,399
to somebody that knows about these things
in your country. And it’s experimental
966
01:08:56,399 --> 01:09:03,069
software, and there are not many people
that are doing this. So we generally can’t
967
01:09:03,069 --> 01:09:08,209
say. Hope that answers your question.
Question: Thanks a lot, yeah, thanks.
968
01:09:08,209 --> 01:09:11,420
Herald: Alright, I guess, just to sum
it up: Be careful whatever you do.
969
01:09:11,420 --> 01:09:15,719
laughter and applause
Alright, so, Jake was just asking
970
01:09:15,719 --> 01:09:19,740
if maybe we could just gather a couple
of questions, and then ask about them
971
01:09:19,740 --> 01:09:21,730
outside. Did I get that right?
Jacob: Yeah, so if everyone who is
972
01:09:21,730 --> 01:09:25,459
at a microphone, disperse to the correct
microphone, if you could just ask all your
973
01:09:25,459 --> 01:09:29,080
questions, then everyone else who’s here
that wants to hear the answers will know
974
01:09:29,080 --> 01:09:32,040
that you should stick around and talk
to us afterwards. We won’t answer
975
01:09:32,040 --> 01:09:34,660
all these questions unless there’s
a really burning one. But that way
976
01:09:34,660 --> 01:09:37,000
the guys that are standing at the
microphone, or the gals that are
977
01:09:37,000 --> 01:09:40,470
standing at the microphone or other, can
actually ask them right now, and if you’re
978
01:09:40,470 --> 01:09:43,399
interested come and find us right
afterwards. We’re going to probably
979
01:09:43,399 --> 01:09:46,880
go to the tea house upstairs, or
maybe I shouldn’t have said that.
980
01:09:46,880 --> 01:09:49,089
laughter
Herald: Alright, so we’re gonna do it
981
01:09:49,089 --> 01:09:51,449
like this. We’re gonna rush through this.
And we’re just gonna hear a lot of
982
01:09:51,449 --> 01:09:55,920
interesting questions, but no answers. If
you wanna hear the answers stay tuned
983
01:09:55,920 --> 01:10:00,090
and don’t switch the channel. So we take
a couple of questions. Microphone 5.
984
01:10:00,090 --> 01:10:03,600
And be quick about it.
Question: In regards to robustness and
985
01:10:03,600 --> 01:10:07,190
the Mozilla partnership: Are there any
thoughts about incrementally replacing
986
01:10:07,190 --> 01:10:10,540
the C++ infrastructure
with Rust? Eventually?
987
01:10:10,540 --> 01:10:14,680
Herald: Microphone 4!
Is it open, microphone 4?
988
01:10:14,680 --> 01:10:22,980
Question: Can you compare Tor with JAP
from TU Dresden in aspects of anonymity?
989
01:10:22,980 --> 01:10:25,790
Herald: Okay, the other
guy at microphone 4!
990
01:10:25,790 --> 01:10:29,740
Question: To your knowledge has anyone got
into trouble for running a non-exit relay?
991
01:10:29,740 --> 01:10:32,950
And do you have any tips for people that
wanna help by running a non-exit relay?
992
01:10:32,950 --> 01:10:34,860
Herald: Okay, microphone 1, 2 guys.
993
01:10:34,860 --> 01:10:39,020
Question: I have a question, or
a suggestion, for the funding problem.
994
01:10:39,020 --> 01:10:43,660
Have you… you’re teaming up with Mozilla,
have you ever considered like producing
995
01:10:43,660 --> 01:10:47,960
your own smartphones, because there’s a huge
margin. I also think there’s a problem
996
01:10:47,960 --> 01:10:55,500
like… why most people don’t use
cryptography is because there’s no
997
01:10:55,500 --> 01:11:01,010
easy-to-use, out-of-the-box, cool product
that’s like… that goes out and has a story
998
01:11:01,010 --> 01:11:02,810
or anything, like the marketing on Apple.
999
01:11:02,810 --> 01:11:05,310
Herald: Alright, the other
guy at microphone 1.
1000
01:11:05,310 --> 01:11:09,900
Question: So a couple of minutes before
the talk started someone did a Sybil
1001
01:11:09,900 --> 01:11:14,110
attack on Tor. And we should fix that
a.s.a.p. So please don’t disappear
1002
01:11:14,110 --> 01:11:17,450
for the next few hours.
Jacob rages, laughing, theatrically
1003
01:11:17,450 --> 01:11:19,030
Thanks!
1004
01:11:19,030 --> 01:11:21,840
Roger: It never ends.
Jacob: It never ends!
1005
01:11:21,840 --> 01:11:24,320
Herald: Alright. Two questions
from microphone 3.
1006
01:11:24,320 --> 01:11:27,870
Question: So when they took
down Silkroad they took
1007
01:11:27,870 --> 01:11:31,670
a lot of Bitcoins with them. I wonder
what the [U.S.] Government is doing
1008
01:11:31,670 --> 01:11:34,690
with the large amount of anonymized cash.
1009
01:11:34,690 --> 01:11:37,220
Roger: They auctioned it off.
Jacob: They sell it. Next question.
1010
01:11:37,220 --> 01:11:39,240
Question: And I think they
should give it to you.
1011
01:11:39,240 --> 01:11:41,810
Herald: Alright. Last question!
Jacob: I fully agree!
1012
01:11:41,810 --> 01:11:45,810
Question: So to combat against the
‘misinformed journalists’ thing
1013
01:11:45,810 --> 01:11:50,550
why not have a dashboard, very
prominently displayed on the Tor Project
1014
01:11:50,550 --> 01:11:54,730
listing all of the academic, open
like known problems with Tor,
1015
01:11:54,730 --> 01:11:58,290
and always have the journalists go there
first to get the source of information,
1016
01:11:58,290 --> 01:12:00,400
rather than misunderstanding
academic research.
1017
01:12:00,400 --> 01:12:02,760
Jacob: Fantastic, so if you wanna know…
1018
01:12:02,760 --> 01:12:04,790
Herald: Alright, if you found any of these
questions interesting, and you’re also
1019
01:12:04,790 --> 01:12:08,940
interested in the answers stick around, go
to Noisy Square, speak to these two guys,
1020
01:12:08,940 --> 01:12:12,100
and get all your answers. Other than
that, you heard it a billion times, but:
1021
01:12:12,100 --> 01:12:15,980
go home, start a relay! My friends and I
did two years ago, after Jake’s keynote.
1022
01:12:15,980 --> 01:12:18,760
It’s really not that hard. You can make
a difference. And thank you so much,
1023
01:12:18,760 --> 01:12:20,300
to Roger and Jake, as every year!
1024
01:12:20,300 --> 01:12:27,500
applause
1025
01:12:27,500 --> 01:12:29,250
silent postroll titles
1026
01:12:29,250 --> 01:12:38,826
subtitles created by c3subtitles.de
in the year 2017. Join, and help us!