1
00:00:11,380 --> 00:00:17,699
Applause
2
00:00:17,699 --> 00:00:20,060
So, good morning everyone
3
00:00:20,060 --> 00:00:23,529
my name is Arne and today
4
00:00:23,529 --> 00:00:25,529
I'll be hoping to entertain you
5
00:00:25,529 --> 00:00:32,189
a bit with some GPG usability issues.
6
00:00:32,189 --> 00:00:33,980
Thanks for being here this early in the morning.
7
00:00:33,980 --> 00:00:36,750
I know, some of you have had a short night
8
00:00:36,750 --> 00:00:43,210
In short, for the impatient ones:
9
00:00:43,210 --> 00:00:46,660
Why is GnuPG damn near unusable?
10
00:00:46,660 --> 00:00:51,690
Well, actually, I don’t know
11
00:00:51,690 --> 00:00:52,699
Laughter
12
00:00:52,699 --> 00:00:57,940
So more research is needed … as always.
13
00:00:57,940 --> 00:00:59,969
Because it's not like using a thermometer.
14
00:00:59,969 --> 00:01:03,699
We're doing something between social science and security
15
00:01:03,699 --> 00:01:10,699
But I will present some interesting perspectives
16
00:01:11,729 --> 00:01:16,720
or at least what I hope you'll find interesting perspectives.
17
00:01:16,720 --> 00:01:20,340
This talk is about some possible explanations
18
00:01:20,340 --> 00:01:25,000
that usable security research can offer to the question
19
00:01:25,000 --> 00:01:27,340
Now some context, something about myself,
20
00:01:27,340 --> 00:01:34,020
so you have a bit of an idea where I'm coming from
21
00:01:34,020 --> 00:01:39,200
and what coloured glasses I have on.
22
00:01:39,200 --> 00:01:44,030
So pretty much my background is in Mathematics,
23
00:01:44,030 --> 00:01:48,500
Computer science, and—strangely enough—International relations
24
00:01:48,500 --> 00:01:51,860
My professional background is that I've been doing
25
00:01:51,860 --> 00:01:57,319
embedded system security evaluations and training
26
00:01:57,319 --> 00:02:02,890
and I've also been a PhD student, studying the usability of security.
27
00:02:02,890 --> 00:02:07,729
Currently, I teach the new generation,
28
00:02:07,729 --> 00:02:14,729
hoping to bring some new blood into the security world.
29
00:02:15,030 --> 00:02:17,910
I want to do some expectation setting
30
00:02:17,910 --> 00:02:21,319
I want to say what this talk is not about.
31
00:02:21,319 --> 00:02:23,660
I will also give some helpful pointers for
32
00:02:23,660 --> 00:02:29,510
those of you that are interested in these other areas.
33
00:02:29,510 --> 00:02:34,269
I will not go into too much detail about the issue of truth
34
00:02:34,269 --> 00:02:37,510
in security science.
35
00:02:37,510 --> 00:02:40,469
Here are some links to some interesting papers that cover this
36
00:02:40,469 --> 00:02:42,930
in a lot of detail.
37
00:02:42,930 --> 00:02:45,650
Neither will I be giving a security primer.
38
00:02:45,650 --> 00:02:50,219
There are some nice links to books on the slide.
39
00:02:50,219 --> 00:02:55,340
I'll also not be giving a cryptography primer or a history lesson.
40
00:02:55,340 --> 00:02:57,819
Neither will I be giving an introduction to PGP
41
00:02:57,819 --> 00:03:02,749
And, interestingly enough, even though the talk is titled
42
00:03:02,749 --> 00:03:06,579
“why is GPG damn near unusable”, I will
43
00:03:06,579 --> 00:03:10,159
not really be doing much PGP bashing
44
00:03:10,159 --> 00:03:15,290
I think it's actually quite a wonderful effort and other people
45
00:03:15,290 --> 00:03:20,739
have pretty much done the PGP/GnuPG bashing for me.
46
00:03:20,739 --> 00:03:25,560
And, as I've already mentioned, I will not be giving any definite answers
47
00:03:25,560 --> 00:03:29,329
but rather a lot of “it depends.”
48
00:03:29,329 --> 00:03:33,739
But then you might ask “well, it depends. What does it depend on?”
49
00:03:33,739 --> 00:03:37,319
Well, for one: What users you’re looking at
50
00:03:37,319 --> 00:03:39,689
which goals they have in mind and
51
00:03:39,689 --> 00:03:43,609
in what context, what environment they’re doing these things.
52
00:03:43,609 --> 00:03:47,819
So, instead I want to kindle your inspiration
53
00:03:47,819 --> 00:03:54,489
I want to offer you a new view on the security environment
54
00:03:54,489 --> 00:03:59,700
and I'll also give you some concrete exercises that you can try out
55
00:03:59,700 --> 00:04:01,859
at home or at the office.
56
00:04:01,859 --> 00:04:07,739
Some “dos” and “don’ts” and pointers for further exploration:
57
00:04:07,739 --> 00:04:10,249
This is a short overview of the talk
58
00:04:10,249 --> 00:04:16,250
I'll start with the background story to why I’m giving this talk
59
00:04:16,250 --> 00:04:21,298
then an overview of the usable security research area,
60
00:04:21,298 --> 00:04:24,630
some principles and methods for usability,
61
00:04:24,630 --> 00:04:28,590
some case studies, and then some open questions that remain.
62
00:04:28,590 --> 00:04:35,590
So, the story. Well. It all started with this book.
63
00:04:36,810 --> 00:04:41,260
When I was reading about the Snowden revelations,
64
00:04:41,260 --> 00:04:46,690
I read how Snowden tried to contact Glenn Greenwald.
65
00:04:46,690 --> 00:04:53,270
On December 1st, he sent an email, saying, well, writing to Glenn:
66
00:04:53,270 --> 00:05:00,120
“If you don’t use PGP, some people will never be able to contact you.”
67
00:05:00,120 --> 00:05:05,320
“Please install this helpful tool and if you need any help,
68
00:05:05,320 --> 00:05:08,200
please request so.”
69
00:05:08,200 --> 00:05:11,850
Three days later, Glenn Greenwald says: “Sorry, I don’t
70
00:05:11,850 --> 00:05:16,970
know how to do that, but I’ll look into it.”
71
00:05:16,970 --> 00:05:21,910
and Snowden writes back: “Okay, well, sure. And again:
72
00:05:21,910 --> 00:05:23,980
If you need any help, I can facilitate contact.”
73
00:05:23,980 --> 00:05:28,720
Now, a mere seven weeks later,
74
00:05:28,720 --> 00:05:30,050
Laughter
75
00:05:30,050 --> 00:05:37,050
Glenn is like “okay, well, I’ll do it within the next days or so.”
76
00:05:37,320 --> 00:05:38,290
Okay, sure …
77
00:05:38,290 --> 00:05:42,780
Snowden’s like “my sincerest thanks”.
78
00:05:42,780 --> 00:05:46,440
But actually in the meantime, Snowden was growing a bit impatient
79
00:05:46,440 --> 00:05:51,080
’cause, okay, “why are you not encrypting?”
80
00:05:51,080 --> 00:05:55,050
So he sent an email to Micah Lee, saying, “okay, well, hello,
81
00:05:55,050 --> 00:06:01,820
I’m a friend, can you help me get in contact with Laura Poitras?”
82
00:06:01,820 --> 00:06:06,260
In addition to that, he made a ten-minute video for Glenn Greenwald
83
00:06:06,260 --> 00:06:09,270
Laughter
84
00:06:09,270 --> 00:06:11,340
… describing how to use GPG.
85
00:06:11,340 --> 00:06:17,250
And actually I have quite a lot of screenshots of that video
86
00:06:17,250 --> 00:06:19,880
and it's quite entertaining.
87
00:06:19,880 --> 00:06:24,160
’cause, of course, Snowden was getting increasingly
88
00:06:24,160 --> 00:06:27,550
bothered by the whole situation.
89
00:06:27,550 --> 00:06:33,090
Now, this is the video that Snowden made
90
00:06:33,090 --> 00:06:40,090
“GPG for Journalists. For Windows.”
91
00:06:40,990 --> 00:06:46,860
Laughter
92
00:06:48,800 --> 00:06:51,340
I’ll just click through it, because I think,
93
00:06:51,340 --> 00:06:53,960
the slides speak for themselves.
94
00:06:53,960 --> 00:07:00,960
Take notes of all the usability issues you can identify.
95
00:07:01,860 --> 00:07:06,030
So just click through the wizard, generate a new key,
96
00:07:06,030 --> 00:07:11,710
enable “expert settings”, ’cause we want 3000-bit keys
97
00:07:11,710 --> 00:07:15,960
We want a very long password, etc.
98
00:07:15,960 --> 00:07:19,260
And now, of course, we also wanna go and find keys
99
00:07:19,260 --> 00:07:21,840
on the keyserver.
100
00:07:21,840 --> 00:07:24,790
And we need to make sure that we don’t
101
00:07:24,790 --> 00:07:28,670
write our draft messages in GMail
102
00:07:28,670 --> 00:07:31,740
or Thunderbird with Enigmail, for that matter.
103
00:07:31,740 --> 00:07:37,340
Although that issue has been solved.
104
00:07:39,580 --> 00:07:42,230
So, I think you can start seeing
105
00:07:42,230 --> 00:07:45,380
why Glenn Greenwald
—even if he did open this video—
106
00:07:45,380 --> 00:07:52,850
was like
“okay, well, I’m not gonna bother.”
107
00:07:52,850 --> 00:07:56,820
And Snowden is so kind to say, after 12 minutes,
108
00:07:56,820 --> 00:08:01,580
“if you have any remaining questions, please contact me.”
109
00:08:01,580 --> 00:08:05,870
At this year’s HOPE conference,
110
00:08:05,870 --> 00:08:12,390
Snowden actually made a call to arms and he said
111
00:08:12,390 --> 00:08:16,740
“Okay, we need people to evaluate our security systems.
112
00:08:16,740 --> 00:08:23,740
we need people to go and do red teaming. But in addition to that,
113
00:08:25,530 --> 00:08:30,970
we also need to look at the user experience issue.”
114
00:08:30,970 --> 00:08:37,089
So this is a transcript of his kind of manifesto
115
00:08:37,089 --> 00:08:42,190
and he says: “GPG is really damn near unusable”
116
00:08:42,190 --> 00:08:46,810
because, well, of course, you might know the command line
117
00:08:46,810 --> 00:08:49,910
and then, okay, you might be okay
118
00:08:49,910 --> 00:08:53,470
but “Gam Gam at the home”, she is never going to be able
119
00:08:53,470 --> 00:08:59,490
to use GnuPG.
120
00:08:59,490 --> 00:09:09,129
And he also notes that, okay, we were part of a technical elite
121
00:09:09,129 --> 00:09:15,749
and he calls on us to work on the technical literacy of people
122
00:09:15,749 --> 00:09:18,339
because what he explicitly warns against is
123
00:09:18,339 --> 00:09:22,680
a high priesthood of technology.
124
00:09:22,680 --> 00:09:27,850
Okay, that’s a nice call to arms
125
00:09:27,850 --> 00:09:32,950
but are we actually up for a new dawn?
126
00:09:32,950 --> 00:09:40,489
Well, I wanna go into the background of usable security
127
00:09:40,489 --> 00:09:44,959
and I wanna show you that we’ve actually been
128
00:09:44,959 --> 00:09:48,009
in a pretty dark time.
129
00:09:48,009 --> 00:09:55,999
So, back in 1999, there was this paper:
“Why Johnny can’t encrypt”
130
00:09:56,000 --> 00:10:01,219
which described mostly the same broken interface
131
00:10:01,219 --> 00:10:09,999
so if you remember, if you go back to the video of which
132
00:10:09,999 --> 00:10:15,370
I showed some screenshots, then, well, if you look at
133
00:10:15,370 --> 00:10:21,899
these screenshots from 1999, well,
is there a lot of difference?
134
00:10:21,899 --> 00:10:24,470
Not really! Nothing much has changed.
135
00:10:24,470 --> 00:10:25,800
There are still the same
136
00:10:25,800 --> 00:10:31,759
conceptual barriers,
and the same crappy defaults.
137
00:10:31,759 --> 00:10:35,980
And most astonishingly, in the paper there
138
00:10:35,980 --> 00:10:39,460
is a description of a user study where
139
00:10:39,460 --> 00:10:44,830
users were given 90 minutes to encrypt an email
140
00:10:44,830 --> 00:10:49,860
and most were unable to do so.
141
00:10:49,860 --> 00:10:55,380
I think, that pretty much describes “damn near unusable.”
142
00:10:55,380 --> 00:11:02,550
A timeline from, well, before 1999 to now
143
00:11:02,550 --> 00:11:06,100
of usable security research.
144
00:11:06,100 --> 00:11:09,920
So, quite a lot has happened
145
00:11:09,920 --> 00:11:15,180
although it is still a growing field.
146
00:11:15,180 --> 00:11:20,949
It started—the idea of usable security,
it was explicitly defined first—
147
00:11:20,949 --> 00:11:29,760
in 1975, but it was only in 1989 that
148
00:11:29,760 --> 00:11:33,380
the first usability tests were carried out.
149
00:11:33,380 --> 00:11:38,430
And it was only in 1996 that
150
00:11:38,430 --> 00:11:44,660
the concept of “user-centered security”
was described.
151
00:11:44,660 --> 00:11:49,269
An interesting paper, also from 1999, shows how
152
00:11:49,269 --> 00:11:55,239
contrary to the general description of users as lazy
153
00:11:55,239 --> 00:12:00,829
and basically as the weakest link in security,
154
00:12:00,829 --> 00:12:06,009
this paper describes users as pretty rational beings
155
00:12:06,009 --> 00:12:09,660
who see security as an overhead and who don’t
156
00:12:09,660 --> 00:12:16,589
understand the usefulness of what they’re doing.
157
00:12:16,589 --> 00:12:21,309
The study of PGP 5.0, I’ve talked about that already,
158
00:12:21,309 --> 00:12:27,550
and there was also a study of the
Kazaa network in 2002.
159
00:12:27,550 --> 00:12:29,550
And it was found that a lot of users were
160
00:12:29,550 --> 00:12:35,339
accidentally sharing files: personal pictures,
161
00:12:35,339 --> 00:12:41,749
who knows, maybe credit-card details,
you never know, right?
162
00:12:41,749 --> 00:12:48,209
In 2002, a lot of the knowledge of usable security design
163
00:12:48,209 --> 00:12:51,180
was concretised in ten key principles
164
00:12:51,180 --> 00:12:54,469
and if you’re interested,
165
00:12:54,469 --> 00:13:03,999
I do recommend you to look at the paper.
166
00:13:03,999 --> 00:13:09,939
A solution to the PGP problem was proposed in
167
00:13:09,939 --> 00:13:11,679
2004, well, actually,
168
00:13:11,679 --> 00:13:15,050
it was proposed earlier
but it was tested in 2005.
169
00:13:15,050 --> 00:13:19,529
And it was found that actually
if we automate encryption
170
00:13:19,529 --> 00:13:23,869
and if we automate key exchange then, well,
171
00:13:23,869 --> 00:13:26,059
things are pretty workable, except that
172
00:13:26,059 --> 00:13:30,290
users still fall for
phishing attacks, of course.
173
00:13:30,290 --> 00:13:38,399
But last year, another study
identified that, well,
174
00:13:38,399 --> 00:13:41,839
making security transparent is all well and good
175
00:13:41,839 --> 00:13:46,309
but it's also dangerous because users
176
00:13:46,309 --> 00:13:49,879
are less likely to trust the system and
177
00:13:49,879 --> 00:13:54,259
are less likely to understand
what’s really happening.
178
00:13:54,259 --> 00:13:59,489
So a paper this year also identified another issue:
179
00:13:59,489 --> 00:14:04,209
Users generally have very bad understanding
180
00:14:04,209 --> 00:14:05,980
of the email architecture.
181
00:14:05,980 --> 00:14:08,579
An email goes from point A to point B.
182
00:14:08,579 --> 00:14:15,630
And what happens in-between is unknown.
183
00:14:15,630 --> 00:14:22,570
So, before I go on to general usability principles
184
00:14:22,570 --> 00:14:27,550
from the founding pillar of the usable security field
185
00:14:27,550 --> 00:14:33,670
I wanna give some examples of usability failures.
186
00:14:33,670 --> 00:14:37,470
You might be familiar with project VENONA.
187
00:14:37,470 --> 00:14:41,649
This was an effort by the US intelligence agencies
188
00:14:41,649 --> 00:14:45,290
to try and decrypt Soviet communications.
189
00:14:45,290 --> 00:14:49,389
And they actually were pretty successful.
190
00:14:49,389 --> 00:14:53,980
They uncovered a lot of spying and, well,
191
00:14:53,980 --> 00:14:56,189
how did they do this?
192
00:14:56,189 --> 00:14:59,869
The Soviets were using one-time pads and,
193
00:14:59,869 --> 00:15:02,619
well, if you reuse a one-time pad,
194
00:15:02,619 --> 00:15:08,069
then you leak a lot of information about the plaintext.
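A minimal illustration in Python (an assumed example, not from the talk): XORing two ciphertexts that were encrypted with the same pad cancels the pad entirely and leaves the XOR of the two plaintexts, which is why pad reuse is fatal.

  # Illustrative sketch of one-time-pad reuse (hypothetical data, not the speaker's code).
  import os

  pad = os.urandom(16)                          # the "one-time" pad
  p1 = b"ATTACK AT DAWN!!"                      # two 16-byte plaintexts
  p2 = b"RETREAT AT DUSK!"
  c1 = bytes(a ^ b for a, b in zip(p1, pad))    # both encrypted with the same pad
  c2 = bytes(a ^ b for a, b in zip(p2, pad))
  leak = bytes(a ^ b for a, b in zip(c1, c2))   # the pad cancels out completely
  assert leak == bytes(a ^ b for a, b in zip(p1, p2))  # attacker gets p1 XOR p2 without the key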
195
00:15:08,069 --> 00:15:10,269
Well, what we also see happening a lot is
196
00:15:10,269 --> 00:15:12,600
low password entropy.
197
00:15:12,600 --> 00:15:19,440
We have people choosing the password “123456”, etc.
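A back-of-the-envelope illustration (assumed numbers, not quoted from the talk): for a randomly chosen password, entropy is roughly length times log2 of the alphabet size, which shows how little a short digits-only password offers.

  # Rough entropy estimate (assumes characters/words are chosen uniformly at random;
  # "123456" is far weaker still, since it tops every leaked-password list).
  import math

  def entropy_bits(length, alphabet_size):
      return length * math.log2(alphabet_size)

  print(entropy_bits(6, 10))       # six random digits: ~19.9 bits
  print(entropy_bits(12, 94))      # twelve random printable ASCII characters: ~78.7 bits
  print(6 * math.log2(7776))       # six Diceware words: ~77.5 bits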
198
00:15:19,440 --> 00:15:25,479
And what I just described, the study looking into
199
00:15:25,479 --> 00:15:29,249
the mental models of users,
200
00:15:29,249 --> 00:15:32,300
of the email architecture and how it works
201
00:15:32,300 --> 00:15:34,819
well, at the top you have
202
00:15:34,819 --> 00:15:37,879
still a pretty simplified description of how things work
203
00:15:37,879 --> 00:15:40,519
and at the bottom we have an actual drawing
204
00:15:40,519 --> 00:15:43,300
of a research participant when asked:
205
00:15:43,300 --> 00:15:49,779
“Can you draw how an email
goes from point A to point B?”
206
00:15:49,779 --> 00:15:53,779
And it’s like:
“Well, it goes from one place to the other.”
207
00:15:53,779 --> 00:15:58,829
Okay …
208
00:15:58,829 --> 00:16:01,039
clicking sounds
209
00:16:01,039 --> 00:16:03,669
Okay, so this died.
210
00:16:12,569 --> 00:16:18,689
So, these are two screenshots of enigmail
211
00:16:18,689 --> 00:16:21,939
Well, if I hadn’t marked them
212
00:16:21,939 --> 00:16:27,199
as the plaintext and encrypted email that would be sent
213
00:16:27,199 --> 00:16:30,899
you probably wouldn’t have spotted which was which
214
00:16:30,899 --> 00:16:34,610
this is a pretty big failure in
215
00:16:34,610 --> 00:16:39,839
the visibility of the system.
216
00:16:39,839 --> 00:16:44,680
You don’t see anything? Ah.
217
00:16:44,680 --> 00:16:47,509
Audience: “That’s the point!”
That's the point, yes!
218
00:16:47,509 --> 00:16:58,109
laughter
applause
219
00:16:58,109 --> 00:17:02,209
On the left we have a screenshot of GPG and
220
00:17:02,209 --> 00:17:04,299
as I’ve already described,
221
00:17:04,299 --> 00:17:08,530
command line people, we like command lines
222
00:17:08,530 --> 00:17:11,300
but normal people don’t.
223
00:17:11,300 --> 00:17:13,720
And what we also see is a lot of the jargon that is
224
00:17:13,720 --> 00:17:17,030
currently being used even in GUI applications
225
00:17:17,030 --> 00:17:23,409
so on the right there is PGP 10.0.
226
00:17:23,409 --> 00:17:25,979
Now I wanna close these examples with
227
00:17:25,979 --> 00:17:28,529
well, you might be wondering: “what is this?”
228
00:17:28,529 --> 00:17:32,589
This is actually an example of a security device
229
00:17:32,589 --> 00:17:36,649
from, I think it’s around 4000 years ago.
230
00:17:36,649 --> 00:17:38,360
Like, people could use this.
231
00:17:38,360 --> 00:17:42,870
Why can’t we get it right today?
232
00:17:42,870 --> 00:17:46,360
Something that you should,
233
00:17:46,360 --> 00:17:48,869
this is a little homework exercise,
234
00:17:48,869 --> 00:17:52,419
take a laptop to your grandma, show her PGP,
235
00:17:52,419 --> 00:17:55,110
can she use it—yes or no?
236
00:17:55,110 --> 00:18:02,440
Probably not, but who knows?
237
00:18:02,450 --> 00:18:03,740
Now I wanna go into
238
00:18:03,740 --> 00:18:09,840
the usability cornerstones of usable security.
239
00:18:09,840 --> 00:18:13,049
I wanna start with heuristics
240
00:18:13,049 --> 00:18:15,519
some people call them “rules of thumb,” other people
241
00:18:15,519 --> 00:18:19,059
call them “the ten holy commandments”
242
00:18:19,059 --> 00:18:23,299
For example, the ten commandments of Dieter Rams,
243
00:18:23,299 --> 00:18:27,059
there are ten commandments of Jakob Nielsen,
244
00:18:27,059 --> 00:18:28,250
of Don Norman
245
00:18:28,250 --> 00:18:35,110
and it really depends on who you believe in, etc.
246
00:18:35,110 --> 00:18:37,380
But at the cornerstone of all of these is that
247
00:18:37,380 --> 00:18:40,270
design is made for people.
248
00:18:40,270 --> 00:18:45,800
And, well, actually, Google says it quite well
249
00:18:45,800 --> 00:18:48,559
in their guiding mission:
250
00:18:48,559 --> 00:18:52,740
“Focus on the user and all else will follow.”
251
00:18:52,740 --> 00:18:54,809
Or, as a usability maxim:
252
00:18:54,809 --> 00:18:57,350
“thou shalt test with thy user”
253
00:18:57,350 --> 00:19:01,209
Don’t just give them the thing.
254
00:19:01,209 --> 00:19:03,200
But there is one problem with these heuristics
255
00:19:03,200 --> 00:19:06,510
and with this advice of just going with your user.
256
00:19:06,510 --> 00:19:10,889
Because it’s pretty abstract advice.
257
00:19:10,889 --> 00:19:12,090
What do you do?
258
00:19:12,090 --> 00:19:13,940
You go out into the world to get practice.
259
00:19:13,940 --> 00:19:17,870
You start observing people.
260
00:19:17,870 --> 00:19:20,169
One nice exercise to try is:
261
00:19:20,169 --> 00:19:21,289
go to the vending machine,
262
00:19:21,289 --> 00:19:24,539
for example the ones at the S-Bahn.
263
00:19:24,539 --> 00:19:26,049
Just stand next to it
264
00:19:26,049 --> 00:19:28,269
and observe people buying tickets.
265
00:19:28,269 --> 00:19:30,860
It’s quite entertaining, actually.
266
00:19:30,860 --> 00:19:33,500
Laughter
267
00:19:33,500 --> 00:19:36,010
And something you can also do is
268
00:19:36,010 --> 00:19:37,890
search for usability failures.
269
00:19:37,890 --> 00:19:39,750
This is what you already do when
270
00:19:39,750 --> 00:19:41,269
you’re observing people.
271
00:19:41,269 --> 00:19:45,320
But even just google for “usability failure”,
272
00:19:45,320 --> 00:19:47,870
“GUI fail”, etc., and you will find
273
00:19:47,870 --> 00:19:53,400
lots of entertaining stuff.
274
00:19:53,400 --> 00:19:54,950
Those were some heuristics
275
00:19:54,950 --> 00:19:56,250
but what about the principles
276
00:19:56,250 --> 00:20:01,740
that lie behind those?
277
00:20:01,740 --> 00:20:05,799
Usability or interaction design
278
00:20:05,799 --> 00:20:09,299
is a cycle between the user and the system.
279
00:20:09,299 --> 00:20:10,470
The user and the world.
280
00:20:10,470 --> 00:20:12,090
The user acts on the world
281
00:20:12,090 --> 00:20:13,590
and gets feedback.
282
00:20:13,590 --> 00:20:17,000
They interpret that.
283
00:20:17,000 --> 00:20:18,480
One important concept is
284
00:20:18,480 --> 00:20:20,070
for things to be visible.
285
00:20:20,070 --> 00:20:21,110
For the underlying system state
286
00:20:21,110 --> 00:20:22,320
to be visible and
287
00:20:22,320 --> 00:20:23,760
you get appropriate feedback
288
00:20:23,760 --> 00:20:26,169
from the system.
289
00:20:26,169 --> 00:20:31,230
So these are Don Norman’s gulfs
of execution and evaluation
290
00:20:31,230 --> 00:20:34,940
sort of yin and yang.
291
00:20:34,940 --> 00:20:38,549
And there are two concrete problems
292
00:20:38,549 --> 00:20:39,830
to illustrate.
293
00:20:39,830 --> 00:20:41,850
For example, the button problem
294
00:20:41,850 --> 00:20:45,840
that “how do you know what happens
when you push the button?”
295
00:20:45,840 --> 00:20:50,620
and “how do you know how to push it?”
296
00:20:50,620 --> 00:20:52,750
I unfortunately don’t have a picture of it
297
00:20:52,750 --> 00:20:58,360
but at Oxford station, the taps in the bathrooms
298
00:20:58,360 --> 00:21:01,550
they say “push” and you need to turn.
299
00:21:01,550 --> 00:21:05,439
Laughter
300
00:21:05,439 --> 00:21:08,510
Then there is the toilet door problem.
301
00:21:08,510 --> 00:21:11,740
The problem of “how do you know
what state a system is in”.
302
00:21:11,740 --> 00:21:15,730
How do you know whether an email will be encrypted?
303
00:21:15,730 --> 00:21:20,269
This is a picture …
304
00:21:20,269 --> 00:21:21,980
basically there are two locks.
305
00:21:21,980 --> 00:21:26,120
One is actually broken and it’s …
when pushing the button that's on the
306
00:21:26,120 --> 00:21:29,049
door handle, you usually lock the door.
307
00:21:29,049 --> 00:21:31,620
But … well … it broke. So that must have been
308
00:21:31,620 --> 00:21:36,200
an entertaining accident.
309
00:21:36,200 --> 00:21:39,080
Another, as I’ve already described,
310
00:21:39,080 --> 00:21:44,339
another important concept is that of mental models.
311
00:21:44,339 --> 00:21:47,860
It’s a question of what idea does the user have
312
00:21:47,860 --> 00:21:52,589
of the system by interacting with it?
313
00:21:52,589 --> 00:21:55,880
How do they acquire knowledge?
314
00:21:55,880 --> 00:21:59,250
For example, how to achieve discoverability
315
00:21:59,250 --> 00:22:00,769
of the system?
316
00:22:00,769 --> 00:22:05,710
And how to ensure that while
a user is discovering the system
317
00:22:05,710 --> 00:22:09,480
that they are less likely to make mistakes?
318
00:22:09,480 --> 00:22:13,880
So this is the concept of poka-yoke
319
00:22:13,880 --> 00:22:18,429
and it’s … here is an example
you also see with floppy disks,
320
00:22:18,429 --> 00:22:22,089
with USB sticks, etc.
321
00:22:22,089 --> 00:22:24,309
It’s engineered such that users are
322
00:22:24,309 --> 00:22:27,020
less likely to make a mistake.
323
00:22:27,020 --> 00:22:30,720
Then there’s also the idea
of enabling knowledge transfer
324
00:22:30,720 --> 00:22:33,339
So how can we do this?
325
00:22:33,339 --> 00:22:35,480
One thing is metaphors.
326
00:22:35,480 --> 00:22:39,919
And I’m not sure how many of you recognise this,
327
00:22:39,919 --> 00:22:44,030
this is Microsoft BOB.
328
00:22:44,030 --> 00:22:46,399
Traditionally, PC systems have been built
329
00:22:46,399 --> 00:22:51,909
on the desktop metaphor.
330
00:22:51,909 --> 00:22:58,169
Laughter
Microsoft BOB had a little too much.
331
00:22:58,169 --> 00:23:04,510
To enable knowledge transfer,
you can also standardise systems.
332
00:23:04,510 --> 00:23:08,519
And one important tool for this is design languages
333
00:23:08,519 --> 00:23:12,159
so if you’re designing for iOS, go look at
334
00:23:12,159 --> 00:23:15,970
the design language, the Human
Interface Guidelines of iOS.
335
00:23:15,970 --> 00:23:19,690
The same for Windows – go look
at the Metro Design Guidelines.
336
00:23:19,690 --> 00:23:26,159
As for Android, look at Material Design.
337
00:23:26,159 --> 00:23:30,409
Because, another interesting exercise
to try out
338
00:23:30,409 --> 00:23:33,230
relating to design languages
339
00:23:33,230 --> 00:23:38,250
and also to get familiar with how designers try to
340
00:23:38,250 --> 00:23:40,519
communicate with users is to
341
00:23:40,519 --> 00:23:44,639
look at an interface and try to decode
342
00:23:44,639 --> 00:23:48,599
what the designer is trying to say to the user.
343
00:23:48,599 --> 00:23:53,669
And another interesting exercise is to look at
344
00:23:53,669 --> 00:23:58,840
not usability but UNusability.
345
00:23:58,840 --> 00:24:00,909
So there is this pretty interesting book
346
00:24:00,909 --> 00:24:04,940
called “Evil by Design” and it goes into
347
00:24:04,940 --> 00:24:08,450
all the various techniques that designers use
348
00:24:08,450 --> 00:24:16,760
to fool users, to get them to buy an extra hotel, car, etc.
349
00:24:16,760 --> 00:24:21,750
and, well, Ryanair is pretty much the worst offender
350
00:24:21,750 --> 00:24:24,970
so a good example to study.
351
00:24:24,970 --> 00:24:30,519
Applause
352
00:24:30,519 --> 00:24:34,480
So, what if you wanna go out into the world
353
00:24:34,480 --> 00:24:40,220
and are gonna apply these principles, these heuristics?
354
00:24:40,220 --> 00:24:42,350
The first thing to know is that
355
00:24:42,350 --> 00:24:45,219
design has to be a process
356
00:24:45,219 --> 00:24:50,739
whereby the first part is actually defining your problem.
357
00:24:50,739 --> 00:24:53,729
You first brainstorm
358
00:24:53,729 --> 00:24:58,230
then you try to narrow down to concrete requirements
359
00:24:58,230 --> 00:25:02,870
after which you go and try out
the various approaches
360
00:25:02,870 --> 00:25:05,850
and test these.
361
00:25:05,850 --> 00:25:09,279
What materials do usability experts actually use?
362
00:25:09,279 --> 00:25:15,630
Well, of course there’s expensive tools, Axure, etc.
363
00:25:15,630 --> 00:25:19,220
but I think one of the most used materials
364
00:25:19,220 --> 00:25:25,490
is still the post-it note. Just paper and pens.
365
00:25:25,490 --> 00:25:28,980
And, okay, where do you wanna go and test?
366
00:25:28,980 --> 00:25:32,090
Well, actually, go out into the field.
367
00:25:32,090 --> 00:25:35,950
Go to the ticket machine of the S-Bahn.
368
00:25:35,950 --> 00:25:39,019
But also go and test in the lab, so that you have
369
00:25:39,019 --> 00:25:42,179
a controlled environment.
370
00:25:42,179 --> 00:25:45,309
And then you ask “okay, how do I test?”
371
00:25:45,309 --> 00:25:49,919
Well, first thing is: Go and get some real people.
372
00:25:49,919 --> 00:25:54,630
Of course, it’s … it can be difficult to actually
373
00:25:54,630 --> 00:26:00,620
get people into the lab but it’s not impossible.
374
00:26:00,620 --> 00:26:02,429
So once you have people in the lab,
375
00:26:02,429 --> 00:26:05,020
here are some methods.
376
00:26:05,020 --> 00:26:07,279
There are so many usability evaluation methods
377
00:26:07,279 --> 00:26:09,409
that I’m not gonna list them all and
378
00:26:09,409 --> 00:26:13,200
I encourage you to go and look them up yourself
379
00:26:13,200 --> 00:26:15,360
’cause it’s also very personal what works for you
380
00:26:15,360 --> 00:26:20,500
and what works in your situation.
381
00:26:20,500 --> 00:26:23,050
When using these methods you wanna
382
00:26:23,050 --> 00:26:25,780
evaluate how well a solution works
383
00:26:25,780 --> 00:26:29,809
So you’re gonna look at some metrics
384
00:26:29,809 --> 00:26:31,409
so that at the end of your evaluation
385
00:26:31,409 --> 00:26:35,100
you can say “okay, we’ve done a good job”,
386
00:26:35,100 --> 00:26:40,100
“this can go better”,
“Okay, maybe we can move that”, …
387
00:26:40,100 --> 00:26:44,069
So these are the standardised ones, so
388
00:26:44,069 --> 00:26:47,690
how effective people are, etc.
389
00:26:47,690 --> 00:26:52,909
You can read …
390
00:26:52,909 --> 00:26:55,759
For a quick start guide on how to
391
00:26:55,759 --> 00:26:59,159
perform usability studies, this is quite a nice one.
392
00:26:59,159 --> 00:27:00,480
And the most important thing to remember
393
00:27:00,480 --> 00:27:04,529
is that preparation is half the work.
394
00:27:04,529 --> 00:27:08,120
First thing: check that everything is working,
395
00:27:08,120 --> 00:27:17,180
make sure that you have everyone
you need in the room, etc.
396
00:27:17,180 --> 00:27:23,249
And maybe most importantly,
usability and usable security,
397
00:27:23,249 --> 00:27:26,380
well, usable security is still a growing field, but
398
00:27:26,380 --> 00:27:30,630
usability is a very large field and most likely
399
00:27:30,630 --> 00:27:34,720
all the problems that you are going to face
400
00:27:34,720 --> 00:27:36,970
or at least a large percentage, other people
401
00:27:36,970 --> 00:27:39,080
have faced before.
402
00:27:39,080 --> 00:27:43,529
So this book is, well, it describes a lot of the stories
403
00:27:43,529 --> 00:27:47,529
of user experience professionals and the things
404
00:27:47,529 --> 00:27:52,040
that they’ve come up against.
405
00:27:52,040 --> 00:27:56,189
A homework exercise, if you feel like it,
406
00:27:56,189 --> 00:28:00,990
is basically analysing who your user is
407
00:28:00,990 --> 00:28:06,760
and where they’re going to use the application.
408
00:28:06,760 --> 00:28:10,409
And also something to think about is
409
00:28:10,409 --> 00:28:12,649
how might you involve your user?
410
00:28:12,649 --> 00:28:16,889
Not just during the usability testing,
411
00:28:16,889 --> 00:28:21,070
but also afterwards.
412
00:28:21,070 --> 00:28:28,450
Now I wanna go into some case
studies of encryption systems.
413
00:28:28,450 --> 00:28:30,230
Now there’s quite a lot, and these are not all,
414
00:28:30,230 --> 00:28:34,769
it’s just a small selection but I wanna focus on three.
415
00:28:34,769 --> 00:28:40,229
I wanna focus on the OpenPGP standard,
Cryptocat and TextSecure.
416
00:28:40,229 --> 00:28:42,769
So, OpenPGP, well …
417
00:28:42,769 --> 00:28:46,230
email is now almost 50 years old,
418
00:28:46,230 --> 00:28:52,190
we have an encryption standard—S/MIME,
it is widely available,
419
00:28:52,190 --> 00:28:56,039
well, it’s widely usable but it’s not widely used …
420
00:28:56,039 --> 00:29:03,679
and GnuPG is used widely but is not installed by default
421
00:29:03,679 --> 00:29:09,939
and when usability teaches us one thing
422
00:29:09,939 --> 00:29:14,129
it’s that defaults rule.
423
00:29:14,129 --> 00:29:18,190
Because users don’t change defaults.
424
00:29:18,190 --> 00:29:23,360
Now you might ask “Okay,
PGP is not installed by default,
425
00:29:23,360 --> 00:29:26,560
so is there actually still a future for OpenPGP?”
426
00:29:26,560 --> 00:29:30,179
Well, I’d argue: Yes.
We have browser plug-ins
427
00:29:30,179 --> 00:29:33,059
which make it easier for users
428
00:29:33,059 --> 00:29:37,850
JavaScript crypto … I’ll come back to that later …
429
00:29:37,850 --> 00:29:43,420
But when we look at Mailvelope, we see, well,
430
00:29:43,420 --> 00:29:48,040
the EFF scorecard, it has a pretty decent rating
431
00:29:48,040 --> 00:29:55,790
at least compared to that of native PGP implementations.
432
00:29:55,790 --> 00:29:58,629
And also Google has announced and has been working
433
00:29:58,629 --> 00:30:01,409
for quite some time on their own plug-in for
434
00:30:01,409 --> 00:30:03,379
end-to-end encryption.
435
00:30:03,379 --> 00:30:07,950
And Yahoo! is also involved in that.
436
00:30:07,950 --> 00:30:11,389
And after the Snowden revelations there has been
437
00:30:11,389 --> 00:30:15,009
a widespread surge in interest
438
00:30:15,009 --> 00:30:18,460
in encrypted communications
439
00:30:18,460 --> 00:30:23,320
and this is one website where a lot of these are listed.
440
00:30:23,320 --> 00:30:27,889
And one project that I’d especially like to emphasise
441
00:30:27,889 --> 00:30:31,910
is Mailpile because I think it looks
442
00:30:31,910 --> 00:30:35,300
like a very interesting approach
443
00:30:35,300 --> 00:30:37,820
whereby the question is:
444
00:30:37,820 --> 00:30:41,080
Can we use OpenPGP as a stepping stone?
445
00:30:41,080 --> 00:30:46,620
OpenPGP is not perfect, meta-data is not protected,
446
00:30:46,620 --> 00:30:48,299
headers are not protected, etc.
447
00:30:48,299 --> 00:30:51,870
But maybe when we get people into the ecosystem,
448
00:30:51,870 --> 00:30:56,169
we can try and gradually move
them to more secure options.
449
00:30:56,169 --> 00:30:58,899
Now, what about Cryptocat?
450
00:30:58,899 --> 00:31:04,070
So, Cryptocat is an online chat platform
451
00:31:04,070 --> 00:31:06,900
that … yes … uses JavaScript.
452
00:31:06,900 --> 00:31:10,909
And of course, JavaScript crypto is bad
453
00:31:10,909 --> 00:31:14,620
but it can be made better.
454
00:31:14,620 --> 00:31:20,160
And I think JavaScript crypto is not the worst problem.
455
00:31:20,160 --> 00:31:22,860
Cryptocat had a pretty disastrous problem
456
00:31:22,860 --> 00:31:26,809
whereby all messages that were sent
457
00:31:26,809 --> 00:31:30,610
were pretty easily decryptable.
458
00:31:30,610 --> 00:31:33,169
But actually, this is just history repeating itself
459
00:31:33,169 --> 00:31:39,090
’cause PGP 1.0 used something called BassOmatic,
460
00:31:39,090 --> 00:31:44,620
the BassOmatic cipher, which is also pretty weak.
461
00:31:44,620 --> 00:31:49,509
And Cryptocat is improving, which is the important thing.
462
00:31:49,509 --> 00:31:51,179
There is now a browser plug-in and
463
00:31:51,179 --> 00:31:53,890
of course, there’s an app for that and
464
00:31:53,890 --> 00:31:56,570
actually, Cryptocat is doing really, really well
465
00:31:56,570 --> 00:31:59,280
in the EFF benchmarks.
466
00:31:59,280 --> 00:32:04,539
And Cryptocat is asking the one question that a lot
467
00:32:04,539 --> 00:32:06,659
of other applications are not asking, which is:
468
00:32:06,659 --> 00:32:09,459
“How can we actually make crypto fun?”
469
00:32:09,459 --> 00:32:12,429
When you start Cryptocat, there are noises
470
00:32:12,429 --> 00:32:15,340
and there’s interesting facts about cats
471
00:32:15,340 --> 00:32:18,759
Laughter
472
00:32:18,759 --> 00:32:21,249
… depends on whether you like cats, but still!
473
00:32:21,249 --> 00:32:23,450
Keeps you busy!
474
00:32:23,450 --> 00:32:28,989
Now, the last case: TextSecure
also gets pretty good marks
475
00:32:28,989 --> 00:32:32,299
and actually just like CryptoCat,
476
00:32:32,299 --> 00:32:35,860
the App store distribution model is something that
477
00:32:35,860 --> 00:32:38,820
I think is a valuable one for usability.
478
00:32:38,820 --> 00:32:41,910
It makes it easy to install.
479
00:32:41,910 --> 00:32:46,049
And something that TextSecure is also looking at is
480
00:32:46,049 --> 00:32:52,180
synchronisation options for your address book.
481
00:32:52,180 --> 00:32:56,980
And I think the most interesting development is
482
00:32:56,980 --> 00:33:00,490
on the one side, the CyanogenMod integration,
483
00:33:00,490 --> 00:33:05,029
so that people will have encryption enabled by default.
484
00:33:05,029 --> 00:33:09,879
’Cause as I mentioned: People don’t change defaults.
485
00:33:09,879 --> 00:33:14,709
And this one is a bit more controversial, but
486
00:33:14,709 --> 00:33:18,180
there’s also the WhatsApp partnership.
487
00:33:18,180 --> 00:33:21,130
And of course people will say “it’s not secure”,
488
00:33:21,130 --> 00:33:22,960
we know, we know,
489
00:33:22,960 --> 00:33:24,389
EFF knows!
490
00:33:24,389 --> 00:33:28,740
But at least, it’s more secure than nothing at all.
491
00:33:28,740 --> 00:33:31,330
Because: Doesn’t every little bit help?
492
00:33:31,330 --> 00:33:32,680
Well, I’d say: yes.
493
00:33:32,680 --> 00:33:35,590
And at least, it’s one stepping stone.
494
00:33:35,590 --> 00:33:40,110
And, well, all of these are open-source,
495
00:33:40,110 --> 00:33:41,639
so you can think for yourself:
496
00:33:41,639 --> 00:33:45,120
How can I improve these?
497
00:33:45,120 --> 00:33:50,450
Now, there are still some open questions remaining
498
00:33:50,450 --> 00:33:52,620
in the usable security field and in the
499
00:33:52,620 --> 00:33:56,320
wider security field as well.
500
00:33:56,320 --> 00:33:58,860
I won’t go into all of these,
501
00:33:58,860 --> 00:34:03,210
I wanna focus on the issues that developers have,
502
00:34:03,210 --> 00:34:05,730
issues of end user understanding
503
00:34:05,730 --> 00:34:09,459
and of identity management.
504
00:34:09,459 --> 00:34:14,059
Because in the development environment
505
00:34:14,059 --> 00:34:18,179
there’s the crypto-plumbing problem, as some people call it.
506
00:34:18,179 --> 00:34:20,820
How do we standardise on a cryptographic algorithm?
507
00:34:20,820 --> 00:34:25,250
How do we make everyone use the same system?
508
00:34:25,250 --> 00:34:29,179
Because, again, it’s history repeating itself.
509
00:34:29,179 --> 00:34:35,668
With PGP, we had RSA, changed for
DSA because of patent issues
510
00:34:35,668 --> 00:34:39,540
IDEA changed for CAST5 because of patent issues
511
00:34:39,540 --> 00:34:41,729
and now we have something similar:
512
00:34:41,729 --> 00:34:43,599
’cause for PGP the question is:
513
00:34:43,599 --> 00:34:45,619
Which curve do we choose?
514
00:34:45,619 --> 00:34:51,089
’cause this is from Bernstein, who has got a whole list
515
00:34:51,089 --> 00:34:56,229
of, well not all the curves,
but a large selection of them
516
00:34:56,229 --> 00:34:57,920
analysing the security
517
00:34:57,920 --> 00:35:01,210
but how do you make, well, pretty much
518
00:35:01,210 --> 00:35:06,460
the whole world agree on a single standard?
519
00:35:06,460 --> 00:35:11,110
And also, can we move toward safer languages?
520
00:35:11,110 --> 00:35:18,270
And I’ve been talking about the
usability of encryption systems
521
00:35:18,270 --> 00:35:21,770
for users, but what about for developers?
522
00:35:21,770 --> 00:35:25,770
So, API usability, and as I’ve mentioned:
523
00:35:25,770 --> 00:35:28,000
Language usability.
524
00:35:28,000 --> 00:35:31,640
And on top of that, it is not just a technical issue,
525
00:35:31,640 --> 00:35:34,959
because, of course, we secure microchips,
526
00:35:34,959 --> 00:35:41,560
but we also wanna secure social systems.
527
00:35:41,560 --> 00:35:45,000
Because, in principle, we live in an open system,
528
00:35:45,000 --> 00:35:51,230
in an open society and a system cannot audit itself.
529
00:35:51,230 --> 00:35:55,709
So, okay, what do we do, right?
I don’t know.
530
00:35:55,709 --> 00:35:58,489
I mean, that’s why it’s an open question!
531
00:35:58,489 --> 00:36:00,970
’Cause how do we ensure the authenticity of,
532
00:36:00,970 --> 00:36:06,300
I don’t know, my Intel processor in my laptop?
533
00:36:06,300 --> 00:36:07,000
How do I know that the
534
00:36:07,000 --> 00:36:09,089
random number generator isn’t bogus?
535
00:36:09,089 --> 00:36:13,589
Well, I know it is, but …
laughter
536
00:36:13,589 --> 00:36:17,140
Then, there’s the issue of identity management
537
00:36:17,140 --> 00:36:19,309
related to key management, like
538
00:36:19,309 --> 00:36:25,329
who has the keys to the kingdom?
539
00:36:25,329 --> 00:36:27,160
One approach, as I’ve already mentioned, is
540
00:36:27,160 --> 00:36:28,760
key continuity management.
541
00:36:28,760 --> 00:36:32,400
Whereby we automate both key exchange
542
00:36:32,400 --> 00:36:35,520
and encryption.
543
00:36:35,520 --> 00:36:38,289
So one principle is trust on first use,
544
00:36:38,289 --> 00:36:43,569
whereby, well, one approach will be to attach your key
545
00:36:43,569 --> 00:36:46,010
to any email you send out and anyone who receives
546
00:36:46,010 --> 00:36:50,020
this email just assumes it’s the proper key.
547
00:36:50,020 --> 00:36:52,819
Of course, it’s not fully secure,
548
00:36:52,819 --> 00:36:56,319
but at least, it’s something.
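A hypothetical sketch of this trust-on-first-use idea in Python (the names and data are made up for illustration and are not taken from any of the tools mentioned): pin the first key you see for an address and warn when a later message carries a different one.

  # Hypothetical trust-on-first-use (key continuity) sketch; names are illustrative only.
  known_keys = {}  # address -> pinned fingerprint; a real client would persist this

  def check_sender(address, fingerprint):
      pinned = known_keys.get(address)
      if pinned is None:
          known_keys[address] = fingerprint   # first contact: accept and pin the key
          return "pinned on first use"
      if pinned == fingerprint:
          return "matches pinned key"         # continuity holds
      return "WARNING: key changed"           # possible MITM, or a legitimate key rollover

  print(check_sender("glenn@example.org", "A1B2 C3D4"))  # pinned on first use
  print(check_sender("glenn@example.org", "A1B2 C3D4"))  # matches pinned key
  print(check_sender("glenn@example.org", "FFFF 0000"))  # WARNING: key changed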
549
00:36:56,319 --> 00:36:59,109
And this is really, I think, the major question
550
00:36:59,109 --> 00:37:00,809
in interoperability:
551
00:37:00,809 --> 00:37:05,210
How do you ensure that you can access your email
552
00:37:05,210 --> 00:37:08,690
from multiple devices?
553
00:37:08,690 --> 00:37:10,880
Now, of course, there is meta-data leakage,
554
00:37:10,880 --> 00:37:14,230
PGP doesn’t protect meta-data,
555
00:37:14,230 --> 00:37:16,890
and, you know, your friendly security agency knows
556
00:37:16,890 --> 00:37:18,319
where you went last summer …
557
00:37:18,319 --> 00:37:19,300
So, what do we do?
558
00:37:19,300 --> 00:37:23,649
We do anonymous routing, we send over Tor, but
559
00:37:23,649 --> 00:37:26,150
I mean, how do we roll that out?
560
00:37:26,150 --> 00:37:27,500
I think the approach that
561
00:37:27,500 --> 00:37:30,240
Mailpile is trying to do is interesting
562
00:37:30,240 --> 00:37:33,000
and, of course still an open question, but
563
00:37:33,000 --> 00:37:36,800
interesting research nonetheless.
564
00:37:36,800 --> 00:37:39,049
Then there’s the introduction problem of
565
00:37:39,049 --> 00:37:43,730
okay, how, I meet someone here, after the talk,
566
00:37:43,730 --> 00:37:45,990
they tell me who they are,
567
00:37:45,990 --> 00:37:49,829
but either I get their card—which is nice—or
568
00:37:49,829 --> 00:37:52,260
they say what their name is.
569
00:37:52,260 --> 00:37:55,869
But they’re not gonna tell me, they’re not gonna spell out
570
00:37:55,869 --> 00:37:57,820
their fingerprint.
571
00:37:57,820 --> 00:38:02,400
So the idea of Zooko’s triangle is that identifiers
572
00:38:02,400 --> 00:38:07,630
are either human-meaningful,
secure or decentralised.
573
00:38:07,630 --> 00:38:09,270
Pick two.
574
00:38:09,270 --> 00:38:13,140
So here’s some examples of identifiers,
575
00:38:13,140 --> 00:38:16,270
so for Bitcoin: Lots of random garbage.
576
00:38:16,270 --> 00:38:19,009
For OpenPGP: Lots of random garbage
577
00:38:19,009 --> 00:38:22,330
For miniLock: Lots of random garbage
578
00:38:22,330 --> 00:38:26,390
So, I think an interesting research problem is:
579
00:38:26,390 --> 00:38:29,719
Can we actually make these things memorable?
580
00:38:29,719 --> 00:38:32,200
You know, I can memorise email addresses,
581
00:38:32,200 --> 00:38:34,359
I can memorise phone numbers,
582
00:38:34,359 --> 00:38:39,839
I cannot memorise these. I can try, but …
583
00:38:39,839 --> 00:38:45,390
Then, the last open question I wanna focus on
584
00:38:45,390 --> 00:38:48,780
is that of end-user understanding.
585
00:38:48,780 --> 00:38:53,599
So of course, we all know that all devices are monitored.
586
00:38:53,599 --> 00:39:00,420
But does the average user?
587
00:39:00,420 --> 00:39:04,750
Do they know what worms can do?
588
00:39:04,750 --> 00:39:09,280
Have they read these books?
589
00:39:09,280 --> 00:39:15,089
Do they know where GCHQ is?
590
00:39:15,089 --> 00:39:20,970
Do they know that Cupertino has
pretty much the same powers?
591
00:39:20,970 --> 00:39:23,880
Laughter
592
00:39:23,880 --> 00:39:28,980
Do they know they’re living in a panopticon to come?
593
00:39:28,980 --> 00:39:32,160
Laughter
594
00:39:32,160 --> 00:39:37,800
Do they know that people are
killed based on meta-data?
595
00:39:37,800 --> 00:39:40,829
Well, I think not.
596
00:39:40,829 --> 00:39:45,550
And actually this is a poster from the university
597
00:39:45,550 --> 00:39:47,069
where I did my Master’s
598
00:39:47,069 --> 00:39:50,940
and interestingly enough, it was founded by a guy
599
00:39:50,940 --> 00:39:56,279
who made a fortune selling sugar pills.
600
00:39:56,279 --> 00:40:02,649
You know, snake oil, we also have this in crypto.
601
00:40:02,649 --> 00:40:06,079
And how is the user to know
602
00:40:06,079 --> 00:40:08,130
whether something is secure or not?
603
00:40:08,130 --> 00:40:10,609
Of course, we have the secure messaging scorecard
604
00:40:10,609 --> 00:40:15,210
but can users find these?
605
00:40:15,210 --> 00:40:21,190
Well, I think, there’s three aspects
to end-user understanding
606
00:40:21,190 --> 00:40:24,250
which is knowledge acquisition,
knowledge transfer,
607
00:40:24,250 --> 00:40:27,220
and the verification and updating of this knowledge.
608
00:40:27,220 --> 00:40:30,950
So, as I’ve already mentioned,
we can do dummy-proofing
609
00:40:30,950 --> 00:40:38,110
and we can create transparent systems.
610
00:40:38,110 --> 00:40:41,160
For knowledge transfer, we can
611
00:40:41,160 --> 00:40:44,400
look at appropriate metaphors and design languages.
612
00:40:44,400 --> 00:40:46,829
And for verification we can
613
00:40:46,829 --> 00:40:50,590
try approaches from advertising.
614
00:40:50,590 --> 00:40:56,500
And, last but not least, we can do user-testing.
615
00:40:56,500 --> 00:41:02,770
Because all these open questions that I’ve described
616
00:41:02,770 --> 00:41:05,549
and all this research that has been done,
617
00:41:05,549 --> 00:41:11,089
I think it’s missing one key issue, which is that
618
00:41:11,089 --> 00:41:13,640
the usability people and the security people
619
00:41:13,640 --> 00:41:17,480
tend not to really talk to one another.
620
00:41:17,480 --> 00:41:21,440
The open-source developers and the users:
621
00:41:21,440 --> 00:41:23,490
Are they talking enough?
622
00:41:23,490 --> 00:41:26,760
I think that’s something, if we want a new dawn,
623
00:41:26,760 --> 00:41:30,970
that’s something that I think we should approach.
624
00:41:30,970 --> 00:41:35,110
Yeah, so, from my side, that’s it.
625
00:41:35,110 --> 00:41:37,490
I’m open for any questions.
626
00:41:37,490 --> 00:41:49,320
Applause
627
00:41:49,320 --> 00:41:52,270
Herald: Arne, thank you very much for your brilliant talk
628
00:41:52,270 --> 00:41:55,030
Now, if you have any questions to ask, would you please
629
00:41:55,030 --> 00:41:57,920
line up at the microphones in the aisles?!
630
00:41:57,920 --> 00:42:00,470
The others who’d like to leave now,
631
00:42:00,470 --> 00:42:04,240
I’d ask you kindly to please leave very quietly
632
00:42:04,240 --> 00:42:09,270
so we can hear what the people
asking questions will tell us.
633
00:42:09,270 --> 00:42:14,279
And those at the microphones,
if you could talk slowly,
634
00:42:14,279 --> 00:42:19,099
then those translating have no problems in translating
635
00:42:19,099 --> 00:42:21,490
what is being asked. Thank you very much.
636
00:42:21,490 --> 00:42:27,460
And I think we’ll start with mic #4 on the left-hand side.
637
00:42:27,460 --> 00:42:32,000
Mic#4: Yes, so, if you’ve been to any successful
638
00:42:32,000 --> 00:42:36,500
crypto party, you know that crypto parties very quickly
639
00:42:36,500 --> 00:42:41,430
turn into not about discussing software,
how to use software,
640
00:42:41,430 --> 00:42:43,780
but into threat model discussions.
641
00:42:43,780 --> 00:42:46,930
And to actually get users
to think about what they’re
642
00:42:46,930 --> 00:42:49,420
trying to protect themselves from, and if a certain
643
00:42:49,420 --> 00:42:52,710
messaging app is secure, that still means nothing.
644
00:42:52,710 --> 00:42:55,810
’Cause there is lots of other stuff that’s going on.
645
00:42:55,810 --> 00:42:57,240
Can you talk a little bit about that and
646
00:42:57,240 --> 00:43:00,130
how that runs into this model about, you know,
647
00:43:00,130 --> 00:43:02,260
how we need to educate users and, while we’re at it,
648
00:43:02,260 --> 00:43:03,640
what we want to educate them about.
649
00:43:03,640 --> 00:43:05,930
And what they actually need to be using.
650
00:43:05,930 --> 00:43:09,640
Arne: Well, I think that’s an interesting point
651
00:43:09,640 --> 00:43:14,210
and I think, one issue, one big issue is:
652
00:43:14,210 --> 00:43:17,180
okay, we can throw lots of crypto parties
653
00:43:17,180 --> 00:43:20,809
but we’re never gonna be able to throw enough parties.
654
00:43:20,809 --> 00:43:22,970
I … with one party, if you’re very lucky,
655
00:43:22,970 --> 00:43:24,609
you’re gonna educate 100 people.
656
00:43:24,609 --> 00:43:28,950
I mean, just imagine how many parties
you’d need to throw. Right?
657
00:43:28,950 --> 00:43:32,980
I mean, it’s gonna be a heck of party, but … yeah.
658
00:43:32,980 --> 00:43:38,730
And I think, secondly, the question of threat modeling,
659
00:43:38,730 --> 00:43:43,000
I think, sure, that’s helpful to do, but
660
00:43:43,000 --> 00:43:47,760
I think, users do first need an understanding of,
661
00:43:47,760 --> 00:43:49,290
for example, the email architecture.
662
00:43:49,290 --> 00:43:51,520
Cause, how can they do threat
modeling when they think
663
00:43:51,520 --> 00:43:55,260
that an email magically pops
from one computer to the next?
664
00:43:55,260 --> 00:43:59,250
I think, that is pretty much impossible.
665
00:43:59,250 --> 00:44:01,250
I hope that …
666
00:44:01,250 --> 00:44:04,890
Herald: Thank you very much, so …
Microphone #3, please.
667
00:44:04,890 --> 00:44:07,439
Mic#3: Arne, thank you very much for your talk.
668
00:44:07,439 --> 00:44:10,430
There’s one aspect that I didn’t see in your slides.
669
00:44:10,430 --> 00:44:13,049
And that is the aspect of the language that we use
670
00:44:13,049 --> 00:44:16,940
to describe concepts in PGP—and GPG, for that matter.
671
00:44:16,940 --> 00:44:19,510
And I know that there was a paper last year
672
00:44:19,510 --> 00:44:21,890
about why King George can’t encrypt and
673
00:44:21,890 --> 00:44:23,960
they were trying to propose a new language.
674
00:44:23,960 --> 00:44:26,109
Do you think that such initiatives are worthwhile
675
00:44:26,109 --> 00:44:28,650
or are we stuck with this language and should we make
676
00:44:28,650 --> 00:44:31,720
as good use of it as we can?
677
00:44:31,720 --> 00:44:37,849
Arne: I think that’s a good point
and actually the question
678
00:44:37,849 --> 00:44:44,649
of “okay, what metaphors do you wanna use?” … I think
679
00:44:44,649 --> 00:44:46,799
we’re pretty much stuck with the language
680
00:44:46,799 --> 00:44:49,710
that we’re using for the moment but
681
00:44:49,710 --> 00:44:54,130
I think it does make sense
to go and look into the future
682
00:44:54,130 --> 00:44:58,289
at alternative models.
683
00:44:58,289 --> 00:45:00,990
Yeah, so I actually wrote a paper that also
684
00:45:00,990 --> 00:45:04,970
goes into that a bit, looking at
685
00:45:04,970 --> 00:45:08,630
the metaphor of handshakes to exchange keys.
686
00:45:08,630 --> 00:45:09,790
So, for example, you could have
687
00:45:09,790 --> 00:45:15,520
an embedded device as a ring or wristband,
688
00:45:15,520 --> 00:45:19,000
it could even be a smartwatch, for that matter.
689
00:45:19,000 --> 00:45:21,569
Could you use that shaking of hands to
690
00:45:21,569 --> 00:45:24,470
build trust-relationships?
691
00:45:24,470 --> 00:45:29,740
And that might be a better
metaphor than key-signing,
692
00:45:29,740 --> 00:45:31,469
webs of trust, etc.
693
00:45:31,469 --> 00:45:34,559
’Cause I think, that is horribly broken
694
00:45:34,559 --> 00:45:39,990
as a concept, I mean, trying
to explain that to users.
695
00:45:39,990 --> 00:45:43,430
Herald: Thank you. And … at the back in the middle.
696
00:45:43,430 --> 00:45:44,980
Signal angel: Thanks. A question from the internet:
697
00:45:44,980 --> 00:45:47,000
[username?] from the Internet wants to know if you’re
698
00:45:47,000 --> 00:45:51,839
aware of the PEP project, the “pretty easy privacy”
699
00:45:51,839 --> 00:45:53,059
and your opinions on that.
700
00:45:53,059 --> 00:45:54,710
And another question is:
701
00:45:54,710 --> 00:46:01,520
How important is the trust level of the crypto to you?
702
00:46:01,520 --> 00:46:04,420
Arne: Well, yes, actually, there’s this screenshot
703
00:46:04,420 --> 00:46:09,729
of the PEP project in the slides
704
00:46:09,729 --> 00:46:15,149
… in the part on why WhatsApp is horribly insecure and
705
00:46:15,149 --> 00:46:18,720
of course, I agree, and yeah.
706
00:46:18,720 --> 00:46:21,680
I’ve looked into the PEP project for a bit
707
00:46:21,680 --> 00:46:24,549
and I think, yeah, I think it’s an interesting
708
00:46:24,549 --> 00:46:28,480
approach but I still have to read up on it a bit more.
709
00:46:28,480 --> 00:46:31,369
Then, for the second question,
710
00:46:31,369 --> 00:46:38,039
“how important is the trust in the crypto?”:
711
00:46:38,039 --> 00:46:41,749
I think that’s an important one.
712
00:46:41,749 --> 00:46:43,220
Especially the question of
713
00:46:43,220 --> 00:46:52,780
“how do we build social systems
to ensure reliable cryptography?”
714
00:46:52,780 --> 00:46:56,830
So one example is the Advanced
Encryption Standard competition.
715
00:46:56,830 --> 00:46:59,559
Everyone was free to send in entries,
716
00:46:59,559 --> 00:47:01,990
their design principles were open
717
00:47:01,990 --> 00:47:06,219
and this is in complete contrast
to the Data Encryption Standard
718
00:47:06,219 --> 00:47:11,920
for which, I think, the design principles are still Top Secret.
719
00:47:11,920 --> 00:47:16,290
So yeah, I think, the crypto is
something we need to build on
720
00:47:16,290 --> 00:47:22,059
but, well, actually, the crypto is
again built on other systems
721
00:47:22,059 --> 00:47:28,040
where the trust in these systems
is even more important.
722
00:47:28,040 --> 00:47:30,720
Herald: Okay, thank you, microphone #2, please.
723
00:47:30,720 --> 00:47:34,270
Mic#2: Yes, Arne, thank you very
much for your excellent talk.
724
00:47:34,270 --> 00:47:37,710
I wonder about what to do with feedback
725
00:47:37,710 --> 00:47:40,899
on usability in open-source software.
726
00:47:40,899 --> 00:47:42,329
So, you publish something on GitHub
727
00:47:42,329 --> 00:47:44,049
and you’re with a group of people
728
00:47:44,049 --> 00:47:45,450
who don’t know each other and
729
00:47:45,450 --> 00:47:48,089
one publishes something,
the other publishes something,
730
00:47:48,089 --> 00:47:51,349
and then: How do we know
this software is usable?
731
00:47:51,349 --> 00:47:53,660
In commercial software, there’s all kind of hooks
732
00:47:53,660 --> 00:47:55,780
on the website, on the app,
733
00:47:55,780 --> 00:47:59,059
to send feedback to the commercial vendor.
734
00:47:59,059 --> 00:48:02,270
But in open-source software,
how do you gather this information?
735
00:48:02,270 --> 00:48:04,630
How do you use it, is there any way to do this
736
00:48:04,630 --> 00:48:05,890
in an anonymised way?
737
00:48:05,890 --> 00:48:08,589
I haven’t seen anything related to this.
738
00:48:08,589 --> 00:48:11,480
Is this one of the reasons why
open-source software is maybe
739
00:48:11,480 --> 00:48:15,249
less usable than commercial software?
740
00:48:15,249 --> 00:48:19,889
Arne: It might be. It might be.
741
00:48:19,889 --> 00:48:22,599
But regarding your question, like, how do you know
742
00:48:22,599 --> 00:48:29,559
whether commercial software is usable, well,
743
00:48:29,559 --> 00:48:32,279
you … one way is looking at:
744
00:48:32,279 --> 00:48:34,840
Okay, what kind of statistics do you get back?
745
00:48:34,840 --> 00:48:37,720
But if you push out something totally unusable
746
00:48:37,720 --> 00:48:39,920
and then, I mean, you’re going to expect
747
00:48:39,920 --> 00:48:44,599
that the statistics come back looking like shit.
748
00:48:44,599 --> 00:48:49,829
So, the best approach is to
design usability in from the start.
749
00:48:49,829 --> 00:48:51,230
The same with security.
750
00:48:51,230 --> 00:48:54,950
And I think, that is also …
so even if
751
00:48:54,950 --> 00:48:58,670
you want privacy for end users, I think it’s still possible
752
00:48:58,670 --> 00:49:01,530
to get people into the lab and look at:
753
00:49:01,530 --> 00:49:03,270
Okay, how are they using the system?
754
00:49:03,270 --> 00:49:05,760
What things can we improve?
755
00:49:05,760 --> 00:49:08,289
And what things are working well?
756
00:49:08,289 --> 00:49:10,740
Mic#2: So you’re saying, you should only publish
757
00:49:10,740 --> 00:49:19,010
open-source software for users
if you’ve also tested it in a lab?
758
00:49:19,010 --> 00:49:22,599
Arne: Well, I think, this is a bit of a discussion of:
759
00:49:22,599 --> 00:49:25,740
Do we just allow people to build
houses however they want to
760
00:49:25,740 --> 00:49:28,410
or do we have building codes?
761
00:49:28,410 --> 00:49:32,130
And … I think … well, actually, this proposal of holding
762
00:49:32,130 --> 00:49:35,730
software developers responsible for what they produce,
763
00:49:35,730 --> 00:49:40,299
if it’s commercial software, I mean,
that proposal has been
764
00:49:40,299 --> 00:49:41,970
made a long time ago.
765
00:49:41,970 --> 00:49:43,130
And the question is:
766
00:49:43,130 --> 00:49:47,950
How would that work in an
open-source software community?
767
00:49:47,950 --> 00:49:50,460
Well, actually, I don’t have an answer to that.
768
00:49:50,460 --> 00:49:52,660
But I think, it’s an interesting question.
769
00:49:52,660 --> 00:49:54,490
Mic#2: Thank you.
770
00:49:54,490 --> 00:49:57,990
Herald: Thank you very much. #1, please.
771
00:49:57,990 --> 00:50:01,130
Mic#1: You said that every little bit helps,
772
00:50:01,130 --> 00:50:04,039
so if we have systems that
don’t provide a lot of … well …
773
00:50:04,039 --> 00:50:06,680
are almost insecure, they
provide just a bit of security, then
774
00:50:06,680 --> 00:50:09,869
that is still better than no security.
775
00:50:09,869 --> 00:50:12,970
My question is: Isn’t that actually worse because
776
00:50:12,970 --> 00:50:15,150
this promotes a false sense of security and
777
00:50:15,150 --> 00:50:19,920
that makes people just use
the insecure broken systems
778
00:50:19,920 --> 00:50:23,559
just to say “we have some security with us”?
779
00:50:23,559 --> 00:50:26,210
Arne: I completely agree but
780
00:50:26,210 --> 00:50:29,339
I think that currently people … I mean …
781
00:50:29,339 --> 00:50:30,920
when you think an email goes
782
00:50:30,920 --> 00:50:33,680
from one system to the other directly
783
00:50:33,680 --> 00:50:40,920
and I mean … from these studies
that I’ve done, I’ve met
784
00:50:40,920 --> 00:50:46,060
quite a few people who still think email is secure.
785
00:50:46,060 --> 00:50:49,589
So, of course, you might give
them a false sense of security
786
00:50:49,589 --> 00:50:52,640
when you give them a
more secure program but
787
00:50:52,640 --> 00:50:54,480
at least it’s more secure than email—right?
788
00:50:54,480 --> 00:50:56,070
I mean …
789
00:50:56,070 --> 00:50:57,339
Mic#1: Thank you.
790
00:50:57,339 --> 00:50:59,520
Herald: Thank you. There’s another
question on the Internet.
791
00:50:59,520 --> 00:51:02,559
Signal angel: Yes, thank you. Question from the Internet:
792
00:51:02,559 --> 00:51:06,199
What crypto would you finally
recommend your grandma to use?
793
00:51:06,199 --> 00:51:10,260
Arne: laughs
794
00:51:10,260 --> 00:51:15,500
Well … Unfortunately, my grandma
has already passed away.
795
00:51:15,500 --> 00:51:19,520
I mean … her secrets will be safe …
796
00:51:27,420 --> 00:51:32,030
Actually, I think something like where
797
00:51:32,030 --> 00:51:37,349
crypto is enabled by default, say …
iMessage, I mean
798
00:51:37,349 --> 00:51:42,059
of course, there’s backdoors,
etc., but at least
799
00:51:42,059 --> 00:51:45,339
it is more secure than plain SMS.
800
00:51:45,339 --> 00:51:53,249
So I would advise my grandma
to use … well, to look at …
801
00:51:53,249 --> 00:51:56,289
actually, I’d first analyse
what she has available
802
00:51:56,289 --> 00:51:58,880
and then I would look at: okay,
which is the most secure
803
00:51:58,880 --> 00:52:03,680
and still usable?
804
00:52:03,680 --> 00:52:07,339
Herald: Thank you very much, so mic #3, please.
805
00:52:07,339 --> 00:52:10,880
Mic#3: So, just wondering:
806
00:52:10,880 --> 00:52:14,950
You said that there is
a problem with the missing
807
00:52:14,950 --> 00:52:20,329
default installation of GPG
on operating systems but
808
00:52:20,329 --> 00:52:24,890
I think, this is more of a
problem of which OS you choose
809
00:52:24,890 --> 00:52:28,220
because at least I don’t
know any Linux system which
810
00:52:28,220 --> 00:52:33,599
doesn’t have GPG installed today by default.
811
00:52:33,599 --> 00:52:39,539
If you use … at least I’ve used
the normal workstation setup.
812
00:52:39,539 --> 00:52:42,550
Arne: Yes, I think you already
answered your own question:
813
00:52:42,550 --> 00:52:47,230
Linux.
Laughter
814
00:52:47,230 --> 00:52:50,690
Unfortunately, Linux is not yet the default for most people.
815
00:52:50,690 --> 00:52:53,270
I mean, I’d love it to be, but … yeah.
816
00:52:53,270 --> 00:52:57,730
Mic#3: But if I send an email to Microsoft and say:
817
00:52:57,730 --> 00:53:02,539
Well, install GPG by default, they’re not gonna
818
00:53:02,539 --> 00:53:04,150
listen to me.
819
00:53:04,150 --> 00:53:07,530
And I think, for all of us, we should do
820
00:53:07,530 --> 00:53:08,740
a lot more of that.
821
00:53:08,740 --> 00:53:13,780
Even if Microsoft is the devil for most of us.
822
00:53:13,780 --> 00:53:15,609
Thank you.
823
00:53:15,609 --> 00:53:19,599
Arne: Well … We should be doing more of what?
824
00:53:19,599 --> 00:53:26,430
Mic#3: Making more demands
to integrate GPG by default
825
00:53:26,430 --> 00:53:29,210
in Microsoft products, for example.
826
00:53:29,210 --> 00:53:31,059
Arne: Yes, I completely agree.
827
00:53:31,059 --> 00:53:33,869
Well, what you already see happening …
828
00:53:33,869 --> 00:53:36,140
or I mean, it’s not very high-profile yet,
829
00:53:36,140 --> 00:53:39,020
but for example, I mean … I’ve referred to
830
00:53:39,020 --> 00:53:42,700
the EFF scorecard a couple of times but
831
00:53:42,700 --> 00:53:49,750
that is some pressure to encourage developers
832
00:53:49,750 --> 00:53:53,010
to include security by default.
833
00:53:53,010 --> 00:53:56,940
But, I think I’ve also mentioned, one of the big problems
834
00:53:56,940 --> 00:54:01,049
is: users at the moment … I mean …
835
00:54:01,049 --> 00:54:04,079
developers might say: my system is secure.
836
00:54:04,079 --> 00:54:06,549
I mean … what does that mean?
837
00:54:06,549 --> 00:54:09,510
Do we hold developers and
commercial entities …
838
00:54:09,510 --> 00:54:12,339
do we hold them to, well,
839
00:54:12,339 --> 00:54:14,039
truthful advertising standards or not?
840
00:54:14,039 --> 00:54:17,200
I mean, I would say: Let’s look at
841
00:54:17,200 --> 00:54:21,289
what companies are claiming and
842
00:54:21,289 --> 00:54:22,849
do they actually live up to that?
843
00:54:22,849 --> 00:54:26,079
And if not: Can we actually sue them?
844
00:54:26,079 --> 00:54:27,720
Can we make them tell the truth about
845
00:54:27,720 --> 00:54:30,759
what is happening and what is not?
846
00:54:30,759 --> 00:54:32,960
Herald: So, we’ve got about 2 more minutes left …
847
00:54:32,960 --> 00:54:37,049
So it’s a maximum of two more questions, #2, please.
848
00:54:37,049 --> 00:54:43,440
Mic#2: Yeah, so … Every security system fails.
849
00:54:43,440 --> 00:54:50,010
So I’m interested in what sort of work has been done on
850
00:54:50,010 --> 00:54:56,999
how do users recover from failure?
851
00:54:56,999 --> 00:55:00,660
Everything will get subverted,
852
00:55:00,660 --> 00:55:04,190
your best friend will sneak
your key off your computer,
853
00:55:04,190 --> 00:55:06,099
something will go wrong with that, you know …
854
00:55:06,099 --> 00:55:09,510
your kids will grab it …
855
00:55:09,510 --> 00:55:13,450
and just, in general, has somebody looked at
856
00:55:13,450 --> 00:55:17,000
these sorts of issues?
857
00:55:17,000 --> 00:55:18,559
Is there research on it?
858
00:55:18,559 --> 00:55:21,930
Arne: Of various aspects of the problem but
859
00:55:21,930 --> 00:55:25,640
as far as I’m aware not for the general issue
860
00:55:25,640 --> 00:55:30,170
and not any field studies specifically looking at
861
00:55:30,170 --> 00:55:34,269
“Okay, what happens when a key is compromised, etc.”
862
00:55:34,269 --> 00:55:37,520
I mean, we do have certain cases of things happening
863
00:55:37,520 --> 00:55:41,789
but nothing structured.
864
00:55:41,789 --> 00:55:44,720
No structured studies, as far as I’m aware.
865
00:55:44,720 --> 00:55:46,810
Herald: Thank you. #3?
866
00:55:46,810 --> 00:55:51,539
Mic#3: Yeah, you mentioned
Mailpile as a stepping stone
867
00:55:51,539 --> 00:55:56,380
for people to start using GnuPG and stuff, but
868
00:55:56,380 --> 00:56:04,820
you also talked about an
average user seeing mail as just
869
00:56:04,820 --> 00:56:08,789
coming from one place and
then ending up in another place.
870
00:56:08,789 --> 00:56:12,250
Shouldn’t we actually talk about
871
00:56:12,250 --> 00:56:17,880
how to make encryption transparent for the users?
872
00:56:17,880 --> 00:56:21,430
Why should they actually care about these things?
873
00:56:21,430 --> 00:56:24,980
Shouldn’t it be embedded in the protocols?
874
00:56:24,980 --> 00:56:28,869
Shouldn’t we actually talk about
embedding them in the protocols,
875
00:56:28,869 --> 00:56:31,510
stop using insecure protocols
876
00:56:31,510 --> 00:56:36,109
and having all of these,
you talked a little bit about it,
877
00:56:36,109 --> 00:56:38,720
as putting it in the defaults.
878
00:56:38,720 --> 00:56:42,549
But shouldn’t we emphasise that a lot more?
879
00:56:42,549 --> 00:56:46,730
Arne: Yeah, I think we should
certainly be working towards
880
00:56:46,730 --> 00:56:50,200
“How do we get security by default?”
881
00:56:50,200 --> 00:56:54,270
But I think … I’ve mentioned briefly that
882
00:56:54,270 --> 00:56:57,519
making things transparent also has a danger.
883
00:56:57,519 --> 00:57:01,000
I mean, this whole, it’s a bit like …
884
00:57:01,000 --> 00:57:03,380
“a system should be transparent” is a bit like
885
00:57:03,380 --> 00:57:05,880
marketing speak, because actually
886
00:57:05,880 --> 00:57:09,140
we don’t want systems to be completely transparent,
887
00:57:09,140 --> 00:57:13,430
’cause we also wanna be able
to engage with the systems.
888
00:57:13,430 --> 00:57:16,410
Are the systems working as they should be?
889
00:57:16,410 --> 00:57:20,380
So, I mean, this is a difficult balance to find, but yeah …
890
00:57:20,380 --> 00:57:24,730
Something that you achieve through usability studies,
891
00:57:24,730 --> 00:57:29,069
security analysis, etc.
892
00:57:29,069 --> 00:57:31,450
Herald: All right, Arne,
thank you very much for giving
893
00:57:31,450 --> 00:57:33,640
us your very inspiring talk,
894
00:57:33,640 --> 00:57:36,010
thank you for sharing your information with us.
895
00:57:36,010 --> 00:57:38,479
Please give him a round of applause.
896
00:57:38,479 --> 00:57:41,319
Thank you very much.
applause
897
00:57:41,319 --> 00:57:52,000
subtitles created by c3subtitles.de
Join, and help us!