0:00:11.380,0:00:17.699
Applause
0:00:17.699,0:00:20.060
So, good morning everyone
0:00:20.060,0:00:23.529
my name is Arne and today
0:00:23.529,0:00:25.529
I'll be hoping to entertain you
0:00:25.529,0:00:32.189
a bit with some GPG usability issues.
0:00:32.189,0:00:33.980
Thanks for being here this early in the morning.
0:00:33.980,0:00:36.750
I know, some of you have had a short night
0:00:36.750,0:00:43.210
In short for the impatient ones:
0:00:43.210,0:00:46.660
Why is GnuPG damn near unusable?
0:00:46.660,0:00:51.690
Well, actually, I don’t know
0:00:51.690,0:00:52.699
Laughter
0:00:52.699,0:00:57.940
So more research is needed … as always.
0:00:57.940,0:00:59.969
Because it's not like using a thermometer.
0:00:59.969,0:01:03.699
We're doing something between social science and security
0:01:03.699,0:01:10.699
But I will present some interesting perspectives
0:01:11.729,0:01:16.720
or at least what I hope you'll find interesting perspectives.
0:01:16.720,0:01:20.340
This talk is about some possible explanations
0:01:20.340,0:01:25.000
that usable security research can offer to the question
0:01:25.000,0:01:27.340
Now some context, something about myself,
0:01:27.340,0:01:34.020
so you have a bit of an idea where I'm coming from
0:01:34.020,0:01:39.200
and what coloured glasses I have on.
0:01:39.200,0:01:44.030
So pretty much my background is in Mathematics,
0:01:44.030,0:01:48.500
Computer science, and—strangely enough—International relations
0:01:48.500,0:01:51.860
My professional background is that I've been doing
0:01:51.860,0:01:57.319
embedded system security evaluations and training
0:01:57.319,0:02:02.890
and I've also been a PhD student, studying the usability of security.
0:02:02.890,0:02:07.729
Currently, I teach the new generation,
0:02:07.729,0:02:14.729
hoping to bring some new blood into the security world.
0:02:15.030,0:02:17.910
I want to do some expectation setting
0:02:17.910,0:02:21.319
I want to say, what this talk is not about.
0:02:21.319,0:02:23.660
I will also give some helpful pointers for
0:02:23.660,0:02:29.510
those of you that are interested in these other areas.
0:02:29.510,0:02:34.269
I will not go into too much detail about the issue of truth
0:02:34.269,0:02:37.510
in security science.
0:02:37.510,0:02:40.469
Here are some links to some interesting papers that cover this
0:02:40.469,0:02:42.930
in a lot of detail.
0:02:42.930,0:02:45.650
Neither will I be giving a security primer.
0:02:45.650,0:02:50.219
There are some nice links to books on the slide.
0:02:50.219,0:02:55.340
I'll also not be giving a cryptography primer or a history lesson.
0:02:55.340,0:02:57.819
Neither will I be giving an introduction to PGP
0:02:57.819,0:03:02.749
And, interestingly enough, even though the talk is titled
0:03:02.749,0:03:06.579
“why is GPG damn near unusable”, I will
0:03:06.579,0:03:10.159
not really be doing much PGP bashing
0:03:10.159,0:03:15.290
I think it's quite, actually, a wonderful effort and other people
0:03:15.290,0:03:20.739
have pretty much done the PGP/GnuPG bashing for me.
0:03:20.739,0:03:25.560
And, as I've already mentioned, I will not be giving any definite answers
0:03:25.560,0:03:29.329
and a lot of “it depends.”
0:03:29.329,0:03:33.739
But then you might ask “well, it depends. What does it depend on?”
0:03:33.739,0:03:37.319
Well, for one: What users you’re looking at
0:03:37.319,0:03:39.689
which goals they have in mind and
0:03:39.689,0:03:43.609
in what context, what environment they’re doing these things.
0:03:43.609,0:03:47.819
So, instead I want to kindle your inspiration
0:03:47.819,0:03:54.489
I want to offer you a new view on the security environment
0:03:54.489,0:03:59.700
and I'll also give you some concrete exercises that you can try out
0:03:59.700,0:04:01.859
at home or at the office.
0:04:01.859,0:04:07.739
Some “do’s” and “don’ts” and pointers for further exploration:
0:04:07.739,0:04:10.249
This is a short overview of the talk
0:04:10.249,0:04:16.250
I'll start with the background story to why I’m giving this talk
0:04:16.250,0:04:21.298
then an overview of the usable security research area,
0:04:21.298,0:04:24.630
some principles and methods for usability,
0:04:24.630,0:04:28.590
some case studies, and then some open questions that remain.
0:04:28.590,0:04:35.590
So, the story. Well. It all started with this book.
0:04:36.810,0:04:41.260
When I was reading about the Snowden revelations,
0:04:41.260,0:04:46.690
I read, how Snowden tried to contact Glenn Greenwald.
0:04:46.690,0:04:53.270
On December 1st, he sent an email, saying, well, writing to Glenn:
0:04:53.270,0:05:00.120
“If you don’t use PGP, some people will never be able to contact you.”
0:05:00.120,0:05:05.320
“Please install this helpful tool and if you need any help,
0:05:05.320,0:05:08.200
please request so.”
0:05:08.200,0:05:11.850
Three days later, Glenn Greenwald says: “Sorry, I don’t
0:05:11.850,0:05:16.970
know how to do that, but I’ll look into it.”
0:05:16.970,0:05:21.910
and Snowden writes back: “Okay, well, sure. And again:
0:05:21.910,0:05:23.980
If you need any help, I can facilitate contact.”
0:05:23.980,0:05:28.720
Now, a mere seven weeks later,
0:05:28.720,0:05:30.050
Laughter
0:05:30.050,0:05:37.050
Glenn is like “okay, well, I’ll do it within the next days or so.”
0:05:37.320,0:05:38.290
Okay, sure …
0:05:38.290,0:05:42.780
Snowden’s like “my sincerest thanks”.
0:05:42.780,0:05:46.440
But actually in the meantime, Snowden was growing a bit impatient
0:05:46.440,0:05:51.080
’cause, okay, “why are you not encrypting?”
0:05:51.080,0:05:55.050
So he sent an email to Micah Lee, saying, “okay, well, hello,
0:05:55.050,0:06:01.820
I’m a friend, can you help me getting contact with Laura Poitras?”
0:06:01.820,0:06:06.260
In addition to that, he made a ten-minute video for Glenn Greenwald
0:06:06.260,0:06:09.270
Laughter
0:06:09.270,0:06:11.340
… describing how to use GPG.
0:06:11.340,0:06:17.250
And actually I have quite a lot of screenshots of that video
0:06:17.250,0:06:19.880
and it's quite entertaining.
0:06:19.880,0:06:24.160
’cause, of course, Snowden was getting increasingly
0:06:24.160,0:06:27.550
bothered by the whole situation.
0:06:27.550,0:06:33.090
Now, this is the video that Snowden made
0:06:33.090,0:06:40.090
“GPG for Journalists. For Windows.”
0:06:40.990,0:06:46.860
Laughter
0:06:48.800,0:06:51.340
I’ll just click through it, because I think,
0:06:51.340,0:06:53.960
the slides speak for themselves.
0:06:53.960,0:07:00.960
Take notes of all the usability issues you can identify.
0:07:01.860,0:07:06.030
So just click through the wizard, generate a new key,
0:07:06.030,0:07:11.710
enable “expert settings”, ’cause we want 3000-bit keys
0:07:11.710,0:07:15.960
We want a very long password, etc.
0:07:15.960,0:07:19.260
And now, of course, we also wanna go and find keys
0:07:19.260,0:07:21.840
on the keyserver.
0:07:21.840,0:07:24.790
And we need to make sure that we don’t
0:07:24.790,0:07:28.670
write our draft messages in GMail
0:07:28.670,0:07:31.740
or Thunderbird and enigmail, for that matter.
0:07:31.740,0:07:37.340
Although that issue has been solved.
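For reference, the rough command-line equivalents of the steps shown in the video look something like this. This is a sketch only: exact flags vary by GnuPG version, and the keyserver address and email address are placeholders, not from the video.
    gpg --full-generate-key                                                   # interactive wizard: key type, size, expiry, passphrase
    gpg --keyserver hkps://keys.openpgp.org --search-keys "Glenn Greenwald"   # find a key on a keyserver
    gpg --armor --encrypt --recipient glenn@example.org draft.txt             # encrypt a draft locally, outside the webmail window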
0:07:39.580,0:07:42.230
So, I think you can start seeing
0:07:42.230,0:07:45.380
why Glenn Greenwald[br]—even if he did open this video—
0:07:45.380,0:07:52.850
was like[br]“okay, well, I’m not gonna bother.”
0:07:52.850,0:07:56.820
And Snowden is so kind to say, after 12 minutes,
0:07:56.820,0:08:01.580
“if you have any remaining questions, please contact me.”
0:08:01.580,0:08:05.870
At this year’s HOPE conference,
0:08:05.870,0:08:12.390
Snowden actually made a call to arms and he said
0:08:12.390,0:08:16.740
“Okay, we need people to evaluate our security systems.
0:08:16.740,0:08:23.740
we need people to go and do red team. But in addition to that,
0:08:25.530,0:08:30.970
we also need to look at the user experience issue.”
0:08:30.970,0:08:37.089
So this is a transcript of his kind of manifesto
0:08:37.089,0:08:42.190
and he says: “GPG is really damn near unusable”
0:08:42.190,0:08:46.810
because, well, of course, you might know command line
0:08:46.810,0:08:49.910
and then, okay, you might be okay
0:08:49.910,0:08:53.470
but “Gam Gam at the home”, she is never going to be able
0:08:53.470,0:08:59.490
to use GnuPG.
0:08:59.490,0:09:09.129
And he also notes that, okay, we were part of a technical elite
0:09:09.129,0:09:15.749
and he calls on us to work on the technical literacy of people
0:09:15.749,0:09:18.339
because what he explicitly warns against is
0:09:18.339,0:09:22.680
a high priesthood of technology.
0:09:22.680,0:09:27.850
Okay, that’s a nice call to arms
0:09:27.850,0:09:32.950
but are we actually up for a new dawn?
0:09:32.950,0:09:40.489
Well, I wanna go into the background of usable security
0:09:40.489,0:09:44.959
and I wanna show you that we’ve actually been
0:09:44.959,0:09:48.009
in a pretty dark time.
0:09:48.009,0:09:55.999
So, back in 1999, there was this paper:[br]“Why Johnny can’t encrypt”
0:09:56.000,0:10:01.219
which described mostly the same broken interface
0:10:01.219,0:10:09.999
so if you remember, if you go back to the video of which
0:10:09.999,0:10:15.370
I showed some screenshots, then, well, if you look at
0:10:15.370,0:10:21.899
these screenshots from 1999, well,[br]is there a lot of difference?
0:10:21.899,0:10:24.470
Not really! Nothing much has changed.
0:10:24.470,0:10:25.800
There are still the same
0:10:25.800,0:10:31.759
conceptual barriers,[br]and same crappy defaults.
0:10:31.759,0:10:35.980
And most astonishingly, in the paper there
0:10:35.980,0:10:39.460
is a description of a user study where
0:10:39.460,0:10:44.830
users were given 90 minutes to encrypt an email
0:10:44.830,0:10:49.860
and most were unable to do so.
0:10:49.860,0:10:55.380
I think, that pretty much describes “damn near unusable.”
0:10:55.380,0:11:02.550
A timeline from, well, before 1999 to now
0:11:02.550,0:11:06.100
of usable security research.
0:11:06.100,0:11:09.920
So, quite a lot has happened
0:11:09.920,0:11:15.180
although it is still a growing field.
0:11:15.180,0:11:20.949
It started—the idea of usable security,[br]it was explicitly defined first—
0:11:20.949,0:11:29.760
in 1975, but it was only in 1989 that
0:11:29.760,0:11:33.380
the first usability tests were carried out.
0:11:33.380,0:11:38.430
And only in 1996 that
0:11:38.430,0:11:44.660
the concept of “user-centered security”[br]was described.
0:11:44.660,0:11:49.269
An interesting paper, also from 1999, shows how
0:11:49.269,0:11:55.239
contrary to the general description of users as lazy
0:11:55.239,0:12:00.829
and basically as the weakest link in security
0:12:00.829,0:12:06.009
this paper describes users as pretty rational beings
0:12:06.009,0:12:09.660
who see security as an overhead and[br]who don't
0:12:09.660,0:12:16.589
understand the usefulness of what they’re doing.
0:12:16.589,0:12:21.309
The study of PGP 5.0, I’ve talked about that already,
0:12:21.309,0:12:27.550
and there was also a study of the[br]Kazaa network in 2002.
0:12:27.550,0:12:29.550
And it was found out that a lot of users were
0:12:29.550,0:12:35.339
accidentally sharing files from personal pictures,
0:12:35.339,0:12:41.749
who knows, maybe credit-card details,[br]you never know, right?
0:12:41.749,0:12:48.209
In 2002, a lot of the knowledge of usable security design
0:12:48.209,0:12:51.180
was concretised in ten key principles
0:12:51.180,0:12:54.469
and if you’re interested,
0:12:54.469,0:13:03.999
I do recommend you to look at the paper.
0:13:03.999,0:13:09.939
A solution to the PGP problem was proposed in
0:13:09.939,0:13:11.679
2004, well, actually,
0:13:11.679,0:13:15.050
it was proposed earlier [br]but it was tested in 2005.
0:13:15.050,0:13:19.529
And it was found that actually[br]if we automate encryption
0:13:19.529,0:13:23.869
and if we automate key exchange then, well,
0:13:23.869,0:13:26.059
things are pretty workable, except that
0:13:26.059,0:13:30.290
users still fall for[br]phishing attacks, of course.
0:13:30.290,0:13:38.399
But last year, another study[br]identified that, well,
0:13:38.399,0:13:41.839
making security transparent is all nice and well
0:13:41.839,0:13:46.309
but it's also dangerous because users
0:13:46.309,0:13:49.879
are less likely to trust the system and
0:13:49.879,0:13:54.259
are less likely to really understand[br]what’s really happening.
0:13:54.259,0:13:59.489
So a paper this year also identified another issue:
0:13:59.489,0:14:04.209
Users generally have very bad understanding
0:14:04.209,0:14:05.980
of the email architecture.
0:14:05.980,0:14:08.579
An email goes from point A to point B.
0:14:08.579,0:14:15.630
And what happens in-between is unknown.
0:14:15.630,0:14:22.570
So, before I go on to general usability principles
0:14:22.570,0:14:27.550
from the founding pillar of the usable security field
0:14:27.550,0:14:33.670
I wanna give some examples of usability failures.
0:14:33.670,0:14:37.470
You might be familiar with project VENONA.
0:14:37.470,0:14:41.649
This was an effort by the US intelligence agencies
0:14:41.649,0:14:45.290
to try and decrypt Soviet communications.
0:14:45.290,0:14:49.389
And they actually were pretty successful.
0:14:49.389,0:14:53.980
They uncovered a lot of spying and, well,
0:14:53.980,0:14:56.189
how did they do this?
0:14:56.189,0:14:59.869
The Soviets were using one-time pads and,
0:14:59.869,0:15:02.619
well, if you reuse a one-time pad,
0:15:02.619,0:15:08.069
then you leak a lot of information about plain-text.
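A minimal sketch of why reuse is fatal, in Python: XORing two ciphertexts that were encrypted with the same pad cancels the pad out and leaves the XOR of the two plaintexts. The messages here are made up, illustrative only.
    import os

    p1 = b"ATTACK AT DAWN"
    p2 = b"RETREAT AT TEN"                      # two equal-length messages
    pad = os.urandom(len(p1))                   # one-time pad, wrongly reused
    c1 = bytes(a ^ b for a, b in zip(p1, pad))
    c2 = bytes(a ^ b for a, b in zip(p2, pad))
    # An eavesdropper XORs the two ciphertexts: the pad drops out,
    # leaving p1 XOR p2, which frequency analysis can start to unravel.
    assert bytes(a ^ b for a, b in zip(c1, c2)) == \
           bytes(a ^ b for a, b in zip(p1, p2))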
0:15:08.069,0:15:10.269
Well, what we also see happening a lot is
0:15:10.269,0:15:12.600
low password entropy.
0:15:12.600,0:15:19.440
We have people choosing the password “123456”, etc.
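A back-of-the-envelope check of what that means, assuming characters are drawn uniformly at random; real user-chosen passwords are far weaker than this bound.
    import math

    def entropy_bits(alphabet_size, length):
        # log2 of the number of equally likely passwords
        return length * math.log2(alphabet_size)

    print(entropy_bits(10, 6))    # 6-digit PIN like "123456": ~19.9 bits
    print(entropy_bits(62, 12))   # random 12-char alphanumeric: ~71.5 bits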
0:15:19.440,0:15:25.479
And what I just described, the study looking into
0:15:25.479,0:15:29.249
the mental models of users,
0:15:29.249,0:15:32.300
of the email architecture and how it works
0:15:32.300,0:15:34.819
well, at the top you have
0:15:34.819,0:15:37.879
still a pretty simplified description of how things work
0:15:37.879,0:15:40.519
and at the bottom we have an actual drawing
0:15:40.519,0:15:43.300
of a research participant when asked:
0:15:43.300,0:15:49.779
“Can you draw how an email[br]goes from point A to point B?”
0:15:49.779,0:15:53.779
And it’s like:[br]“Well, it goes from one place to the other.”
0:15:53.779,0:15:58.829
Okay …
0:15:58.829,0:16:01.039
clicking sounds
0:16:01.039,0:16:03.669
Okay, so this died.
0:16:12.569,0:16:18.689
So, these are two screenshots of Enigmail
0:16:18.689,0:16:21.939
Well, if I hadn’t marked them
0:16:21.939,0:16:27.199
as the plaintext and encrypted email that would be sent
0:16:27.199,0:16:30.899
you probably wouldn’t have spotted which was which
0:16:30.899,0:16:34.610
this is a pretty big failure in
0:16:34.610,0:16:39.839
the visibility of the system.
0:16:39.839,0:16:44.680
You don’t see anything? Ah.
0:16:44.680,0:16:47.509
Audience: “That’s the point!”[br]That's the point, yes!
0:16:47.509,0:16:58.109
laughter[br]applause
0:16:58.109,0:17:02.209
On the left we have a screenshot of GPG and
0:17:02.209,0:17:04.299
as I’ve already described,
0:17:04.299,0:17:08.530
command line people, we like command lines
0:17:08.530,0:17:11.300
but normal people don’t.
0:17:11.300,0:17:13.720
And what we also see is a lot of the jargon that is
0:17:13.720,0:17:17.030
currently being used even in GUI applications
0:17:17.030,0:17:23.409
so on the right there is PGP 10.0.
0:17:23.409,0:17:25.979
Now I wanna close these examples with
0:17:25.979,0:17:28.529
well, you might be wondering: “what is this?”
0:17:28.529,0:17:32.589
This is actually an example of a security device
0:17:32.589,0:17:36.649
from, I think it’s around 4000 years ago.
0:17:36.649,0:17:38.360
Like, people could use this.
0:17:38.360,0:17:42.870
Why can’t we get it right today?
0:17:42.870,0:17:46.360
Something that you should try,
0:17:46.360,0:17:48.869
this is a little homework exercise,
0:17:48.869,0:17:52.419
take a laptop to your grandma, show her PGP,
0:17:52.419,0:17:55.110
can she use it—yes or no?
0:17:55.110,0:18:02.440
Probably not, but who knows?
0:18:02.450,0:18:03.740
Now I wanna go into
0:18:03.740,0:18:09.840
the usability cornerstones of usable security.
0:18:09.840,0:18:13.049
I wanna start with heuristics
0:18:13.049,0:18:15.519
some people call them “rules of thumb,” other people
0:18:15.519,0:18:19.059
call them “the ten holy commandments”
0:18:19.059,0:18:23.299
For example, the ten commandments of Dieter Rams,
0:18:23.299,0:18:27.059
there are the ten commandments of Jakob Nielsen,
0:18:27.059,0:18:28.250
of Don Norman
0:18:28.250,0:18:35.110
and it really depends on who you believe in, etc.
0:18:35.110,0:18:37.380
But at the cornerstone of all of these is that
0:18:37.380,0:18:40.270
design is made for people.
0:18:40.270,0:18:45.800
And, well, actually, Google says it quite well
0:18:45.800,0:18:48.559
in their guiding mission:
0:18:48.559,0:18:52.740
“Focus on the user and all else will follow.”
0:18:52.740,0:18:54.809
Or, as a usability maxim:
0:18:54.809,0:18:57.350
“thou shalt test with thy user”
0:18:57.350,0:19:01.209
Don’t just give them the thing.
0:19:01.209,0:19:03.200
But there is one problem with these heuristics
0:19:03.200,0:19:06.510
and with this advice of going with your user.
0:19:06.510,0:19:10.889
Because it’s pretty abstract advice.
0:19:10.889,0:19:12.090
What do you do?
0:19:12.090,0:19:13.940
You go out into the world to get practice.
0:19:13.940,0:19:17.870
You start observing people.
0:19:17.870,0:19:20.169
One nice exercise to try is:
0:19:20.169,0:19:21.289
go to the vending machine,
0:19:21.289,0:19:24.539
for example the ones at the S-Bahn.
0:19:24.539,0:19:26.049
Just stand next to it
0:19:26.049,0:19:28.269
and observe people buying tickets.
0:19:28.269,0:19:30.860
It’s quite entertaining, actually.
0:19:30.860,0:19:33.500
Laughter
0:19:33.500,0:19:36.010
And something you can also do is
0:19:36.010,0:19:37.890
search for usability failures.
0:19:37.890,0:19:39.750
This is what you already do when
0:19:39.750,0:19:41.269
you’re observing people.
0:19:41.269,0:19:45.320
But even just google for “usability failure”,
0:19:45.320,0:19:47.870
“GUI fail”, etc., and you will find
0:19:47.870,0:19:53.400
lots of entertaining stuff.
0:19:53.400,0:19:54.950
Those were some heuristics
0:19:54.950,0:19:56.250
but what about the principles
0:19:56.250,0:20:01.740
that lie behind those?
0:20:01.740,0:20:05.799
Usability or interaction design
0:20:05.799,0:20:09.299
is a cycle between the user and the system.
0:20:09.299,0:20:10.470
The user and the world.
0:20:10.470,0:20:12.090
The user acts on the world
0:20:12.090,0:20:13.590
and gets feedback.
0:20:13.590,0:20:17.000
They interpret that.
0:20:17.000,0:20:18.480
One important concept is
0:20:18.480,0:20:20.070
for things to be visible.
0:20:20.070,0:20:21.110
For the underlying system state
0:20:21.110,0:20:22.320
to be visible and
0:20:22.320,0:20:23.760
you get appropriate feedback
0:20:23.760,0:20:26.169
from the system.
0:20:26.169,0:20:31.230
So these are Don Norman’s gulfs[br]of execution and evaluation
0:20:31.230,0:20:34.940
sort of yin and yang.
0:20:34.940,0:20:38.549
And there are two concrete problems
0:20:38.549,0:20:39.830
to illustrate.
0:20:39.830,0:20:41.850
For example, the button problem
0:20:41.850,0:20:45.840
that “how do you know what happens[br]when you push the button?”
0:20:45.840,0:20:50.620
and “how do you know how to push it?”
0:20:50.620,0:20:52.750
I unfortunately don’t have a picture of it
0:20:52.750,0:20:58.360
but at Oxford station, the taps in the bathrooms
0:20:58.360,0:21:01.550
they say “push” and you need to turn.
0:21:01.550,0:21:05.439
Laughter
0:21:05.439,0:21:08.510
Then there is the toilet door problem.
0:21:08.510,0:21:11.740
The problem of “how do you know[br]what state a system is in”.
0:21:11.740,0:21:15.730
How do you know whether an email will be encrypted?
0:21:15.730,0:21:20.269
This is a picture …
0:21:20.269,0:21:21.980
basically there are two locks.
0:21:21.980,0:21:26.120
One is actually broken and it’s …[br]when pushing the button that's on the
0:21:26.120,0:21:29.049
door handle, you usually lock the door.
0:21:29.049,0:21:31.620
But … well … it broke. So that must have been
0:21:31.620,0:21:36.200
an entertaining accident.
0:21:36.200,0:21:39.080
Another, as I’ve already described,
0:21:39.080,0:21:44.339
another important concept is that of mental models.
0:21:44.339,0:21:47.860
It’s a question of what idea does the user have
0:21:47.860,0:21:52.589
of the system by interacting with it?
0:21:52.589,0:21:55.880
How do they acquire knowledge?
0:21:55.880,0:21:59.250
For example, how to achieve discoverability
0:21:59.250,0:22:00.769
of the system?
0:22:00.769,0:22:05.710
And how to ensure that while[br]a user is discovering the system
0:22:05.710,0:22:09.480
that they are less likely to make mistakes?
0:22:09.480,0:22:13.880
So this is the concept of poka-yoke
0:22:13.880,0:22:18.429
and it’s … here is an example[br]you also see with floppy disks,
0:22:18.429,0:22:22.089
with USB sticks, etc.
0:22:22.089,0:22:24.309
It’s engineered such that users are
0:22:24.309,0:22:27.020
less likely to make a mistake.
0:22:27.020,0:22:30.720
Then there’s also the idea[br]of enabling knowledge transfer
0:22:30.720,0:22:33.339
So how can we do this?
0:22:33.339,0:22:35.480
One thing is metaphors.
0:22:35.480,0:22:39.919
And I’m not sure how many of you recognise this,
0:22:39.919,0:22:44.030
this is Microsoft BOB.
0:22:44.030,0:22:46.399
Traditionally, PC systems have been built
0:22:46.399,0:22:51.909
on the desktop metaphor.
0:22:51.909,0:22:58.169
Laughter[br]Microsoft BOB had a little too much.
0:22:58.169,0:23:04.510
To enable knowledge transfer,[br]you can also standardise systems.
0:23:04.510,0:23:08.519
And one important tool for this is design languages
0:23:08.519,0:23:12.159
so if you’re designing for iOS, go look at
0:23:12.159,0:23:15.970
the design language, the Human[br]Interface Guidelines of iOS.
0:23:15.970,0:23:19.690
The same for Windows – go look[br]at the Metro Design Guidelines.
0:23:19.690,0:23:26.159
As for Android, look at Material Design.
0:23:26.159,0:23:30.409
Because, another interesting exercise[br]to try out
0:23:30.409,0:23:33.230
relating to design languages
0:23:33.230,0:23:38.250
and also to get familiar with how designers try to
0:23:38.250,0:23:40.519
communicate with users is to
0:23:40.519,0:23:44.639
look at an interface and try to decode
0:23:44.639,0:23:48.599
what the designer is trying to say to the user.
0:23:48.599,0:23:53.669
And another interesting exercise is to look at
0:23:53.669,0:23:58.840
not usability but UNusability.
0:23:58.840,0:24:00.909
So there is this pretty interesting book
0:24:00.909,0:24:04.940
called “evil by design” and it goes into
0:24:04.940,0:24:08.450
all the various techniques that designers use
0:24:08.450,0:24:16.760
to fool users, to get them to buy an extra hotel, car, etc.
0:24:16.760,0:24:21.750
and, well, RyanAir is pretty much the worst offender
0:24:21.750,0:24:24.970
so a good example to study.
0:24:24.970,0:24:30.519
Applause
0:24:30.519,0:24:34.480
So, what if you wanna go out into the world
0:24:34.480,0:24:40.220
and are gonna apply these principles, these heuristics?
0:24:40.220,0:24:42.350
The first thing to know is that
0:24:42.350,0:24:45.219
design has to be a process
0:24:45.219,0:24:50.739
whereby the first part is actually defining your problem.
0:24:50.739,0:24:53.729
You first brain-storm
0:24:53.729,0:24:58.230
then you try to narrow down to concrete requirements
0:24:58.230,0:25:02.870
after which you go and try out[br]the various approaches
0:25:02.870,0:25:05.850
and test these.
0:25:05.850,0:25:09.279
What materials do usability experts actually use?
0:25:09.279,0:25:15.630
Well, of course there’s expensive tools, Axure, etc.
0:25:15.630,0:25:19.220
but I think one of the most used materials
0:25:19.220,0:25:25.490
is still the post-it note. Just paper and pens.
0:25:25.490,0:25:28.980
And, okay, where do you wanna go and test?
0:25:28.980,0:25:32.090
Well, actually, go out into the field.
0:25:32.090,0:25:35.950
Go to the ticket machine of the S-Bahn.
0:25:35.950,0:25:39.019
But also go and test in the lab, so that you have
0:25:39.019,0:25:42.179
a controlled environment.
0:25:42.179,0:25:45.309
And then you ask “okay, how do I test?”
0:25:45.309,0:25:49.919
Well, first thing is: Go and get some real people.
0:25:49.919,0:25:54.630
Of course, it’s … it can be difficult to actually
0:25:54.630,0:26:00.620
get people into the lab but it’s not impossible.
0:26:00.620,0:26:02.429
So once you have people in the lab,
0:26:02.429,0:26:05.020
here are some methods.
0:26:05.020,0:26:07.279
There are so many usability evaluation methods
0:26:07.279,0:26:09.409
that I’m not gonna list them all and
0:26:09.409,0:26:13.200
I encourage you to go and look them up yourself
0:26:13.200,0:26:15.360
’cause it’s also very personal what works for you
0:26:15.360,0:26:20.500
and what works in your situation.
0:26:20.500,0:26:23.050
When using these methods you wanna
0:26:23.050,0:26:25.780
evaluate how well a solution works
0:26:25.780,0:26:29.809
So you’re gonna look at some metrics
0:26:29.809,0:26:31.409
so that at the end of your evaluation
0:26:31.409,0:26:35.100
you can say “okay, we’ve done a good job”,
0:26:35.100,0:26:40.100
“this can go better”,[br]“Okay, maybe we can move that”, …
0:26:40.100,0:26:44.069
So these are the standardised ones, so
0:26:44.069,0:26:47.690
how effective are people, etc.
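As a sketch, the usual standardised trio is effectiveness, efficiency and satisfaction; the numbers below are invented, and satisfaction is scored with the common System Usability Scale formula.
    completed = [True, True, False, True, True]      # effectiveness: task success
    seconds   = [310, 205, 440, 180]                 # efficiency: time on task (successful runs)

    success_rate = sum(completed) / len(completed)   # 0.8
    mean_time = sum(seconds) / len(seconds)          # 283.75 s

    answers = [4, 2, 5, 1, 4, 2, 5, 1, 4, 2]         # ten SUS items, answered 1..5
    # SUS: items 1,3,5,7,9 (i even, 0-indexed) score a-1; items 2,4,6,8,10 score 5-a;
    # the sum is scaled by 2.5 onto a 0..100 range.
    sus = 2.5 * sum((a - 1) if i % 2 == 0 else (5 - a)
                    for i, a in enumerate(answers))  # 85.0 here

    print(success_rate, mean_time, sus)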
0:26:47.690,0:26:52.909
You can read …
0:26:52.909,0:26:55.759
For a quick start guide on how to
0:26:55.759,0:26:59.159
perform usability studies, this is quite a nice one.
0:26:59.159,0:27:00.480
And the most important thing to remember
0:27:00.480,0:27:04.529
is that preparation is half the work.
0:27:04.529,0:27:08.120
First thing: check that everything is working,
0:27:08.120,0:27:17.180
make sure that you have everyone[br]you need in the room, etc.
0:27:17.180,0:27:23.249
And maybe most importantly,[br]usability and usable security,
0:27:23.249,0:27:26.380
well, usable security is still a growing field, but
0:27:26.380,0:27:30.630
usability is a very large field and most likely
0:27:30.630,0:27:34.720
all the problems that you are going to face
0:27:34.720,0:27:36.970
or at least a large percentage, other people
0:27:36.970,0:27:39.080
have faced before.
0:27:39.080,0:27:43.529
So this book is, well, it describes a lot of the stories
0:27:43.529,0:27:47.529
of user experience professionals and the things
0:27:47.529,0:27:52.040
that they’ve come up against.
0:27:52.040,0:27:56.189
A homework exercise, if you feel like it,
0:27:56.189,0:28:00.990
is basically analysing who your user is
0:28:00.990,0:28:06.760
and where they’re going to use the application.
0:28:06.760,0:28:10.409
And also something to think about is
0:28:10.409,0:28:12.649
how might you involve your user?
0:28:12.649,0:28:16.889
Not just during the usability testing,
0:28:16.889,0:28:21.070
but also afterwards.
0:28:21.070,0:28:28.450
Now I wanna go into some case[br]studies of encryption systems.
0:28:28.450,0:28:30.230
Now there’s quite a lot, and these are not all,
0:28:30.230,0:28:34.769
it’s just a small selection but I wanna focus on three.
0:28:34.769,0:28:40.229
I wanna focus at the OpenPGP standard,[br]Cryptocat and TextSecure.
0:28:40.229,0:28:42.769
So, OpenPGP, well …
0:28:42.769,0:28:46.230
email is now almost 50 years old,
0:28:46.230,0:28:52.190
we have an encryption standard—S/MIME,[br]it is widely used
0:28:52.190,0:28:56.039
well, it’s widely usable but it’s not widely used …
0:28:56.039,0:29:03.679
and GnuPG is used widely but is not installed by default
0:29:03.679,0:29:09.939
and when usability teaches us one thing
0:29:09.939,0:29:14.129
it’s that defaults rule.
0:29:14.129,0:29:18.190
Because users don’t change defaults.
0:29:18.190,0:29:23.360
Now you might ask “Okay,[br]PGP is not installed by default,
0:29:23.360,0:29:26.560
so is there actually still a future for OpenPGP?”
0:29:26.560,0:29:30.179
Well, I’d argue: Yes.[br]We have browser plug-ins
0:29:30.179,0:29:33.059
which make it easier for users
0:29:33.059,0:29:37.850
JavaScript crypto … I’ll come back to that later …
0:29:37.850,0:29:43.420
But when we look at Mailvelope, we see, well,
0:29:43.420,0:29:48.040
the EFF scorecard, it has a pretty decent rating
0:29:48.040,0:29:55.790
at least compared to that of native PGP implementations.
0:29:55.790,0:29:58.629
And also Google has announced and has been working
0:29:58.629,0:30:01.409
for quite some time on their own plug-in for
0:30:01.409,0:30:03.379
end-to-end encryption.
0:30:03.379,0:30:07.950
And Yahoo! is also involved in that.
0:30:07.950,0:30:11.389
And after the Snowden revelations there has been
0:30:11.389,0:30:15.009
a widespread surge in the interest
0:30:15.009,0:30:18.460
in encrypted communications
0:30:18.460,0:30:23.320
and this is one website where a lot of these are listed.
0:30:23.320,0:30:27.889
And one project that I’d especially like to emphasise
0:30:27.889,0:30:31.910
is mailpile because I think it looks
0:30:31.910,0:30:35.300
like a very interesting approach
0:30:35.300,0:30:37.820
whereby the question is:
0:30:37.820,0:30:41.080
Can we use OpenPGP as a stepping stone?
0:30:41.080,0:30:46.620
OpenPGP is not perfect, meta-data is not protected,
0:30:46.620,0:30:48.299
headers are not protected, etc.
0:30:48.299,0:30:51.870
But maybe when we get people into the ecosystem,
0:30:51.870,0:30:56.169
we can try and gradually move[br]them to more secure options.
0:30:56.169,0:30:58.899
Now, what about Cryptocat?
0:30:58.899,0:31:04.070
So, Cryptocat is an online chat platform
0:31:04.070,0:31:06.900
that … yes … uses JavaScript.
0:31:06.900,0:31:10.909
And of course, JavaScript crypto is bad
0:31:10.909,0:31:14.620
but it can be made better.
0:31:14.620,0:31:20.160
And I think JavaScript crypto is not the worst problem.
0:31:20.160,0:31:22.860
Cryptocat had a pretty disastrous problem
0:31:22.860,0:31:26.809
whereby all messages that were sent
0:31:26.809,0:31:30.610
were pretty easily decryptable.
0:31:30.610,0:31:33.169
But actually, this is just history repeating itself
0:31:33.169,0:31:39.090
’cause PGP 1.0 used something called BassOmatic,
0:31:39.090,0:31:44.620
the BassOmatic cypher which is also pretty weak.
0:31:44.620,0:31:49.509
And Cryptocat is improving, which is the important thing.
0:31:49.509,0:31:51.179
There is now a browser plug-in and
0:31:51.179,0:31:53.890
of course, there’s an app for that and
0:31:53.890,0:31:56.570
actually, Cryptocat is doing really, really well
0:31:56.570,0:31:59.280
in the EFF benchmarks.
0:31:59.280,0:32:04.539
And Cryptocat is asking the one question that a lot
0:32:04.539,0:32:06.659
of other applications are not asking, which is:
0:32:06.659,0:32:09.459
“How can we actually make crypto fun?”
0:32:09.459,0:32:12.429
When you start Cryptocat, there’s noises
0:32:12.429,0:32:15.340
and there’s interesting facts about cats
0:32:15.340,0:32:18.759
Laughter
0:32:18.759,0:32:21.249
… depends on whether you like cats, but still!
0:32:21.249,0:32:23.450
Keeps you busy!
0:32:23.450,0:32:28.989
Now, the last case: TextSecure[br]also gets pretty good marks
0:32:28.989,0:32:32.299
and actually just like CryptoCat,
0:32:32.299,0:32:35.860
the App store distribution model is something that
0:32:35.860,0:32:38.820
I think is a valuable one for usability.
0:32:38.820,0:32:41.910
It makes it easy to install.
0:32:41.910,0:32:46.049
And something that TextSecure is also looking at is
0:32:46.049,0:32:52.180
synchronisation options for your address book.
0:32:52.180,0:32:56.980
And I think the most interesting development is
0:32:56.980,0:33:00.490
on the one side, the CyanogenMod integration,
0:33:00.490,0:33:05.029
so that people will have encryption enabled by default.
0:33:05.029,0:33:09.879
’Cause as I mentioned: People don’t change defaults.
0:33:09.879,0:33:14.709
And this one is a bit more controversial, but
0:33:14.709,0:33:18.180
there’s also the WhatsApp partnership.
0:33:18.180,0:33:21.130
And of course people will say “it’s not secure”,
0:33:21.130,0:33:22.960
we know, we know,
0:33:22.960,0:33:24.389
EFF knows!
0:33:24.389,0:33:28.740
But at least, it’s more secure than nothing at all.
0:33:28.740,0:33:31.330
Because: Doesn’t every little bit help?
0:33:31.330,0:33:32.680
Well, I’d say: yes.
0:33:32.680,0:33:35.590
And at least, it’s one stepping stone.
0:33:35.590,0:33:40.110
And, well, all of these are open-source,
0:33:40.110,0:33:41.639
so you can think for yourself:
0:33:41.639,0:33:45.120
How can I improve these?
0:33:45.120,0:33:50.450
Now, there’s still some open questions remaining
0:33:50.450,0:33:52.620
in the usable security field and in the
0:33:52.620,0:33:56.320
wider security field as well.
0:33:56.320,0:33:58.860
I won’t go into all of these,
0:33:58.860,0:34:03.210
I wanna focus on the issues that developers have,
0:34:03.210,0:34:05.730
issues of end user understanding
0:34:05.730,0:34:09.459
and of identity management.
0:34:09.459,0:34:14.059
Because in the development environment
0:34:14.059,0:34:18.179
there’s the crypto-plumbing problem, as some people call it.
0:34:18.179,0:34:20.820
How do we standardise on a cryptographic algorithm?
0:34:20.820,0:34:25.250
How do we make everyone use the same system?
0:34:25.250,0:34:29.179
Because, again, it’s history repeating itself.
0:34:29.179,0:34:35.668
With PGP, we had RSA, changed for[br]DSA because of patent issues
0:34:35.668,0:34:39.540
IDEA changed for CAST5 because of patent issues
0:34:39.540,0:34:41.729
and now we have something similar:
0:34:41.729,0:34:43.599
’cause for PGP the question is:
0:34:43.599,0:34:45.619
Which curve do we choose?
0:34:45.619,0:34:51.089
’cause this is from Bernstein, who has got a whole list
0:34:51.089,0:34:56.229
of, well not all the curves,[br]but a large selection of them
0:34:56.229,0:34:57.920
analysing the security
0:34:57.920,0:35:01.210
but how do you make, well, pretty much
0:35:01.210,0:35:06.460
the whole world agree on a single standard?
0:35:06.460,0:35:11.110
And also, can we move toward safer languages?
0:35:11.110,0:35:18.270
And I’ve been talking about the[br]usability of encryption systems
0:35:18.270,0:35:21.770
for users, but what about for developers?
0:35:21.770,0:35:25.770
So, API usability, and as I’ve mentioned:
0:35:25.770,0:35:28.000
Language usability.
0:35:28.000,0:35:31.640
And on top of that, it is not just a technical issue,
0:35:31.640,0:35:34.959
because, of course, we secure microchips,
0:35:34.959,0:35:41.560
but we also wanna secure social systems.
0:35:41.560,0:35:45.000
Because, in principle, we live in an open system,
0:35:45.000,0:35:51.230
in an open society and a system cannot audit itself.
0:35:51.230,0:35:55.709
So, okay, what do we do, right?[br]I don’t know.
0:35:55.709,0:35:58.489
I mean, that’s why it’s an open question!
0:35:58.489,0:36:00.970
’Cause how do we ensure the authenticity of,
0:36:00.970,0:36:06.300
I don’t know, my Intel processor in my laptop?
0:36:06.300,0:36:07.000
How do I know that the
0:36:07.000,0:36:09.089
random number generator isn’t bogus?
0:36:09.089,0:36:13.589
Well, I know it is, but …[br]laughter
0:36:13.589,0:36:17.140
Then, there’s the issue of identity management
0:36:17.140,0:36:19.309
related to key management, like
0:36:19.309,0:36:25.329
who has the keys to the kingdom?
0:36:25.329,0:36:27.160
One approach, as I’ve already mentioned, is
0:36:27.160,0:36:28.760
key continuity management.
0:36:28.760,0:36:32.400
Whereby we automate both key exchange and
0:36:32.400,0:36:35.520
whereby we automate encryption.
0:36:35.520,0:36:38.289
So one principle is trust on first use,
0:36:38.289,0:36:43.569
whereby, well, one approach would be to attach your key
0:36:43.569,0:36:46.010
to any email you send out and anyone who receives
0:36:46.010,0:36:50.020
this email just assumes it’s the proper key.
0:36:50.020,0:36:52.819
Of course, it’s not fully secure,
0:36:52.819,0:36:56.319
but at least, it’s something.
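A minimal sketch of that idea, assuming we simply pin the first fingerprint seen per address and flag any later change; the names, values, and in-memory storage are illustrative only.
    pinned = {}   # address -> fingerprint pinned on first contact

    def check_key(address, fingerprint):
        # Trust on first use: accept the first key, warn if it later changes.
        if address not in pinned:
            pinned[address] = fingerprint
            return "accepted (first use)"
        if pinned[address] == fingerprint:
            return "ok (matches pinned key)"
        return "WARNING: key changed since first contact"

    print(check_key("alice@example.org", "A1B2C3"))  # accepted (first use)
    print(check_key("alice@example.org", "A1B2C3"))  # ok (matches pinned key)
    print(check_key("alice@example.org", "FFFFFF"))  # WARNING: key changed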
0:36:56.319,0:36:59.109
And this is really, I think, the major question
0:36:59.109,0:37:00.809
in interoperability:
0:37:00.809,0:37:05.210
How do you ensure that you can access your email
0:37:05.210,0:37:08.690
from multiple devices?
0:37:08.690,0:37:10.880
Now, of course, there is meta-data leakage,
0:37:10.880,0:37:14.230
PGP doesn’t protect meta-data,
0:37:14.230,0:37:16.890
and, you know, your friendly security agency knows
0:37:16.890,0:37:18.319
where you went last summer …
0:37:18.319,0:37:19.300
So, what do we do?
0:37:19.300,0:37:23.649
We do anonymous routing, we send over Tor, but
0:37:23.649,0:37:26.150
I mean, how do we roll that out?
0:37:26.150,0:37:27.500
I think the approach that
0:37:27.500,0:37:30.240
mailpile is trying to do is interesting
0:37:30.240,0:37:33.000
and, of course still an open question, but
0:37:33.000,0:37:36.800
interesting research nonetheless.
0:37:36.800,0:37:39.049
Then there’s the introduction problem of
0:37:39.049,0:37:43.730
okay, how, I meet someone here, after the talk,
0:37:43.730,0:37:45.990
they tell me who they are,
0:37:45.990,0:37:49.829
but either I get their card—which is nice—or
0:37:49.829,0:37:52.260
they say what their name is.
0:37:52.260,0:37:55.869
But they’re not gonna tell me, they’re not gonna spell out
0:37:55.869,0:37:57.820
their fingerprint.
0:37:57.820,0:38:02.400
So the idea of Zooko’s triangle is that identifiers
0:38:02.400,0:38:07.630
are either human-meaningful,[br]secure or decentralised.
0:38:07.630,0:38:09.270
Pick two.
0:38:09.270,0:38:13.140
So here’s some examples of identifiers,
0:38:13.140,0:38:16.270
so for Bitcoin: Lots of random garbage.
0:38:16.270,0:38:19.009
For OpenPGP: Lots of random garbage
0:38:19.009,0:38:22.330
For miniLock: Lots of random garbage
0:38:22.330,0:38:26.390
So, I think an interesting research problem is:
0:38:26.390,0:38:29.719
Can we actually make these things memorable?
0:38:29.719,0:38:32.200
You know, I can memorise email addresses,
0:38:32.200,0:38:34.359
I can memorise phone numbers,
0:38:34.359,0:38:39.839
I can not memorise these. I can try, but …
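One direction that has been explored is encoding key material as words, as the old PGP word list did; here is a toy sketch with a hypothetical four-word alphabet (real schemes use much larger word lists, one word per byte or more).
    WORDS = ["apple", "bridge", "candle", "dragon"]   # toy list; real ones have 256+

    def memorable(fingerprint):
        # Map each 2-bit chunk of the fingerprint bytes to a word.
        out = []
        for byte in fingerprint:
            for shift in (6, 4, 2, 0):
                out.append(WORDS[(byte >> shift) & 0b11])
        return " ".join(out)

    print(memorable(b"\x1b"))   # "apple bridge candle dragon"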
0:38:39.839,0:38:45.390
Then, the last open question I wanna focus on
0:38:45.390,0:38:48.780
is that of end-user understanding.
0:38:48.780,0:38:53.599
So of course, we all know that all devices are monitored.
0:38:53.599,0:39:00.420
But does the average user?
0:39:00.420,0:39:04.750
Do they know what worms can do?
0:39:04.750,0:39:09.280
Have they read these books?
0:39:09.280,0:39:15.089
Do they know where GCHQ is?
0:39:15.089,0:39:20.970
Do they know that Cupertino has[br]pretty much the same powers?
0:39:20.970,0:39:23.880
Laughter
0:39:23.880,0:39:28.980
Do they know they’re living in a panopticon to come?
0:39:28.980,0:39:32.160
Laughter
0:39:32.160,0:39:37.800
Do they know that people are[br]killed based on meta-data?
0:39:37.800,0:39:40.829
Well, I think not.
0:39:40.829,0:39:45.550
And actually this is a poster from the university
0:39:45.550,0:39:47.069
where I did my Master’s
0:39:47.069,0:39:50.940
and interestingly enough, it was founded by a guy
0:39:50.940,0:39:56.279
who made a fortune selling sugar pills.
0:39:56.279,0:40:02.649
You know, snake oil, we also have this in crypto.
0:40:02.649,0:40:06.079
And how is the user to know
0:40:06.079,0:40:08.130
whether something is secure or not?
0:40:08.130,0:40:10.609
Of course, we have the secure messaging scorecard
0:40:10.609,0:40:15.210
but can users find these?
0:40:15.210,0:40:21.190
Well, I think, there’s three aspects[br]to end-user understanding
0:40:21.190,0:40:24.250
which is knowledge acquisition,[br]knowledge transfer,
0:40:24.250,0:40:27.220
and the verification and updating of this knowledge.
0:40:27.220,0:40:30.950
So, as I’ve already mentioned,[br]we can do dummy-proofing
0:40:30.950,0:40:38.110
and we can create transparent systems.
0:40:38.110,0:40:41.160
For knowledge transfer, we can
0:40:41.160,0:40:44.400
look at appropriate metaphors and design languages.
0:40:44.400,0:40:46.829
And for verification we can
0:40:46.829,0:40:50.590
try an approach from advertising.
0:40:50.590,0:40:56.500
And, last but not least, we can do user-testing.
0:40:56.500,0:41:02.770
Because all these open questions that I’ve described
0:41:02.770,0:41:05.549
and all this research that has been done,
0:41:05.549,0:41:11.089
I think it’s missing one key issue, which is that
0:41:11.089,0:41:13.640
the usability people and the security people
0:41:13.640,0:41:17.480
tend not to really talk to one another.
0:41:17.480,0:41:21.440
The open-source developers and the users:
0:41:21.440,0:41:23.490
Are they talking enough?
0:41:23.490,0:41:26.760
I think that’s something, if we want a new dawn,
0:41:26.760,0:41:30.970
that’s something that I think we should approach.
0:41:30.970,0:41:35.110
Yeah, so, from my side, that’s it.
0:41:35.110,0:41:37.490
I’m open for any questions.
0:41:37.490,0:41:49.320
Applause
0:41:49.320,0:41:52.270
Herald: Arne, thank you very much for your brilliant talk
0:41:52.270,0:41:55.030
Now, if you have any questions to ask, would you please
0:41:55.030,0:41:57.920
line up at the microphones in the aisles?!
0:41:57.920,0:42:00.470
The others who’d like to leave now,
0:42:00.470,0:42:04.240
I’d ask you kindly to please leave very quietly
0:42:04.240,0:42:09.270
so we can hear what the people[br]asking questions will tell us.
0:42:09.270,0:42:14.279
And those at the microphones,[br]if you could talk slowly,
0:42:14.279,0:42:19.099
then those translating have no problems in translating
0:42:19.099,0:42:21.490
what is being asked. Thank you very much.
0:42:21.490,0:42:27.460
And I think we’ll start with mic #4 on the left-hand side.
0:42:27.460,0:42:32.000
Mic#4: Yes, so, if you’ve been to any successful
0:42:32.000,0:42:36.500
crypto party, you know that crypto parties very quickly
0:42:36.500,0:42:41.430
turn not into discussions about[br]how to use software,
0:42:41.430,0:42:43.780
but into threat model discussions.
0:42:43.780,0:42:46.930
And to actually get users[br]to think about what they’re
0:42:46.930,0:42:49.420
trying to protect themselves from, and if a certain
0:42:49.420,0:42:52.710
messaging app is secure, that still means nothing.
0:42:52.710,0:42:55.810
’Cause there is lots of other stuff that’s going on.
0:42:55.810,0:42:57.240
Can you talk a little bit about that and
0:42:57.240,0:43:00.130
how that runs into this model about, you know,
0:43:00.130,0:43:02.260
how we need to educate users and, while we’re at it,
0:43:02.260,0:43:03.640
what we want to educate them about.
0:43:03.640,0:43:05.930
And what they actually need to be using.
0:43:05.930,0:43:09.640
Arne: Well, I think that’s an interesting point
0:43:09.640,0:43:14.210
and I think, one issue, one big issue is:
0:43:14.210,0:43:17.180
okay, we can throw lots of crypto parties
0:43:17.180,0:43:20.809
but we’re never gonna be able to throw enough parties.
0:43:20.809,0:43:22.970
I mean, with one party, if you’re very lucky,
0:43:22.970,0:43:24.609
you’re gonna educate 100 people.
0:43:24.609,0:43:28.950
I mean, just imagine how many parties[br]you’d need to throw. Right?
0:43:28.950,0:43:32.980
I mean, it’s gonna be a heck of party, but … yeah.
0:43:32.980,0:43:38.730
And I think, secondly, the question of threat modeling,
0:43:38.730,0:43:43.000
I think, sure, that’s helpful to do, but
0:43:43.000,0:43:47.760
I think, users do first need an understanding of,
0:43:47.760,0:43:49.290
for example, the email architecture.
0:43:49.290,0:43:51.520
Cause, how can they do threat[br]modeling when they think
0:43:51.520,0:43:55.260
that an email magically pops[br]from one computer to the next?
0:43:55.260,0:43:59.250
I think, that is pretty much impossible.
0:43:59.250,0:44:01.250
I hope that …
0:44:01.250,0:44:04.890
Herald: Thank you very much, so …[br]Microphone #3, please.
0:44:04.890,0:44:07.439
Mic#3: Arne, thank you very much for your talk.
0:44:07.439,0:44:10.430
There’s one aspect that I didn’t see in your slides.
0:44:10.430,0:44:13.049
And that is the aspect of the language that we use
0:44:13.049,0:44:16.940
to describe concepts in PGP—and GPG, for that matter.
0:44:16.940,0:44:19.510
And I know that there was a paper last year
0:44:19.510,0:44:21.890
about why King George can’t encrypt and
0:44:21.890,0:44:23.960
they were trying to propose a new language.
0:44:23.960,0:44:26.109
Do you think that such initiatives are worthwhile
0:44:26.109,0:44:28.650
or are we stuck with this language and should we make
0:44:28.650,0:44:31.720
as good use of it as we can?
0:44:31.720,0:44:37.849
Arne: I think that’s a good point[br]and actually the question
0:44:37.849,0:44:44.649
of “okay, what metaphors do you wanna use?” … I think
0:44:44.649,0:44:46.799
we’re pretty much stuck with the language
0:44:46.799,0:44:49.710
that we’re using for the moment but
0:44:49.710,0:44:54.130
I think it does make sense[br]to go and look into the future
0:44:54.130,0:44:58.289
at alternative models.
0:44:58.289,0:45:00.990
Yeah, so I actually wrote a paper that also
0:45:00.990,0:45:04.970
goes into that a bit, looking at
0:45:04.970,0:45:08.630
the metaphor of handshakes to exchange keys.
0:45:08.630,0:45:09.790
So, for example, you could have
0:45:09.790,0:45:15.520
an embedded device as a ring or wristband,
0:45:15.520,0:45:19.000
it could even be a smartwatch, for that matter.
0:45:19.000,0:45:21.569
Could you use that shaking of hands to
0:45:21.569,0:45:24.470
build trust-relationships?
0:45:24.470,0:45:29.740
And that might be a better[br]metaphor than key-signing,
0:45:29.740,0:45:31.469
webs of trust, etc.
0:45:31.469,0:45:34.559
’Cause I think, that is horribly broken
0:45:34.559,0:45:39.990
for I mean the concept, trying[br]to explain that to users.
0:45:39.990,0:45:43.430
Herald: Thank you. And … at the back in the middle.
0:45:43.430,0:45:44.980
Signal angel: Thanks. A question from the internet:
0:45:44.980,0:45:47.000
[username?] from the Internet wants to know if you’re
0:45:47.000,0:45:51.839
aware of the PEP project, the “pretty easy privacy”
0:45:51.839,0:45:53.059
and your opinions on that.
0:45:53.059,0:45:54.710
And another question is:
0:45:54.710,0:46:01.520
How important is the trust level of the crypto to you?
0:46:01.520,0:46:04.420
Arne: Well, yes, actually, there’s this screenshot
0:46:04.420,0:46:09.729
of the PEP project in the slides
0:46:09.729,0:46:15.149
… in the part about why WhatsApp is horribly insecure and
0:46:15.149,0:46:18.720
of course, I agree, and yeah.
0:46:18.720,0:46:21.680
I’ve looked into the PEP project for a bit
0:46:21.680,0:46:24.549
and I think, yeah, I think it’s an interesting
0:46:24.549,0:46:28.480
approach but I still have to read up on it a bit more.
0:46:28.480,0:46:31.369
Then, for the second question,
0:46:31.369,0:46:38.039
“how important is the trust in the crypto?”:
0:46:38.039,0:46:41.749
I think that’s an important one.
0:46:41.749,0:46:43.220
Especially the question of
0:46:43.220,0:46:52.780
“how do we build social systems[br]to ensure reliable cryptography?”
0:46:52.780,0:46:56.830
So one example is the Advanced[br]Encryption Standard competition.
0:46:56.830,0:46:59.559
Everyone was free to send in entries,
0:46:59.559,0:47:01.990
their design principles were open
0:47:01.990,0:47:06.219
and this is in complete contrast[br]to the Data Encryption Standard
0:47:06.219,0:47:11.920
for which, I think, the design principles are still Top Secret.
0:47:11.920,0:47:16.290
So yeah, I think, the crypto is[br]something we need to build on
0:47:16.290,0:47:22.059
but, well, actually, the crypto is[br]again built on other systems
0:47:22.059,0:47:28.040
where the trust in these systems[br]is even more important.
0:47:28.040,0:47:30.720
Herald: Okay, thank you, microphone #2, please.
0:47:30.720,0:47:34.270
Mic#2: Yes, Arne, thank you very[br]much for your excellent talk.
0:47:34.270,0:47:37.710
I wonder about what to do with feedback
0:47:37.710,0:47:40.899
on usability in open-source software.
0:47:40.899,0:47:42.329
So, you publish something on GitHub
0:47:42.329,0:47:44.049
and you’re with a group of people
0:47:44.049,0:47:45.450
who don’t know each other and
0:47:45.450,0:47:48.089
one publishes something,[br]the other publishes something,
0:47:48.089,0:47:51.349
and then: How do we know[br]this software is usable?
0:47:51.349,0:47:53.660
In commercial software, there’s all kind of hooks
0:47:53.660,0:47:55.780
on the website, on the app,
0:47:55.780,0:47:59.059
to send feedback to the commercial vendor.
0:47:59.059,0:48:02.270
But in open-source software,[br]how do you gather this information?
0:48:02.270,0:48:04.630
How do you use it, is there any way to do this
0:48:04.630,0:48:05.890
in an anonymised way?
0:48:05.890,0:48:08.589
I haven’t seen anything related to this.
0:48:08.589,0:48:11.480
Is this one of the reasons why[br]open-source software is maybe
0:48:11.480,0:48:15.249
less usable than commercial software?
0:48:15.249,0:48:19.889
Arne: It might be. It might be.
0:48:19.889,0:48:22.599
But regarding your question, like, how do you know
0:48:22.599,0:48:29.559
whether commercial software is usable, well,
0:48:29.559,0:48:32.279
you … one way is looking at:
0:48:32.279,0:48:34.840
Okay, what kind of statistics do you get back?
0:48:34.840,0:48:37.720
But if you push out something totally unusable
0:48:37.720,0:48:39.920
and then, I mean, you’re going to expect
0:48:39.920,0:48:44.599
that the statistics come back looking like shit.
0:48:44.599,0:48:49.829
So, the best approach is to[br]design usability in from the start.
0:48:49.829,0:48:51.230
The same with security.
0:48:51.230,0:48:54.950
And I think, that is also …[br]so even if you have …
0:48:54.950,0:48:58.670
you want privacy for end users, I think it’s still possible
0:48:58.670,0:49:01.530
to get people into the lab and look at:
0:49:01.530,0:49:03.270
Okay, how are they using the system?
0:49:03.270,0:49:05.760
What things can we improve?
0:49:05.760,0:49:08.289
And what things are working well?
0:49:08.289,0:49:10.740
Mic#2: So you’re saying, you should only publish
0:49:10.740,0:49:19.010
open-source software for users[br]if you also tested in a lab?
0:49:19.010,0:49:22.599
Arne: Well, I think, this is a bit of a discussion of:
0:49:22.599,0:49:25.740
Do we just allow people to build[br]houses however they want to
0:49:25.740,0:49:28.410
or do we have building codes?
0:49:28.410,0:49:32.130
And … I think … well, actually, this proposal of holding
0:49:32.130,0:49:35.730
software developers responsible for what they produce,
0:49:35.730,0:49:40.299
if it’s commercial software, I mean,[br]that proposal has been
0:49:40.299,0:49:41.970
made a long time ago.
0:49:41.970,0:49:43.130
And the question is:
0:49:43.130,0:49:47.950
How would that work in an[br]open-source software community?
0:49:47.950,0:49:50.460
Well, actually, I don’t have an answer to that.
0:49:50.460,0:49:52.660
But I think, it’s an interesting question.
0:49:52.660,0:49:54.490
Mic#2: Thank you.
0:49:54.490,0:49:57.990
Herald: Thank you very much. #1, please.
0:49:57.990,0:50:01.130
Mic#1: You said that every little bit helps,
0:50:01.130,0:50:04.039
so if we have systems that[br]don’t provide a lot of … well …
0:50:04.039,0:50:06.680
are almost insecure, they[br]provide just a bit of security, then
0:50:06.680,0:50:09.869
that is still better than no security.
0:50:09.869,0:50:12.970
My question is: Isn’t that actually worse because
0:50:12.970,0:50:15.150
this promotes a false sense of security and
0:50:15.150,0:50:19.920
that makes people just use[br]the insecure broken systems
0:50:19.920,0:50:23.559
just to say “we have some security with us”?
0:50:23.559,0:50:26.210
Arne: I completely agree but
0:50:26.210,0:50:29.339
I think that currently people … I mean …
0:50:29.339,0:50:30.920
when you think an email goes
0:50:30.920,0:50:33.680
from one system to the other directly
0:50:33.680,0:50:40.920
and I mean … from these studies[br]that I’ve done, I’ve met
0:50:40.920,0:50:46.060
quite some people who still think email is secure.
0:50:46.060,0:50:49.589
So, of course, you might give[br]them a false sense of security
0:50:49.589,0:50:52.640
when you give them a[br]more secure program but
0:50:52.640,0:50:54.480
at least it’s more secure than email—right?
0:50:54.480,0:50:56.070
I mean …
0:50:56.070,0:50:57.339
Mic#1: Thank you.
0:50:57.339,0:50:59.520
Herald: Thank you. There’s another[br]question on the Internet.
0:50:59.520,0:51:02.559
Signal angel: Yes, thank you. Question from the Internet:
0:51:02.559,0:51:06.199
What crypto would you finally[br]recommend your grandma to use?
0:51:06.199,0:51:10.260
Arne: laughs
0:51:10.260,0:51:15.500
Well … Unfortunately, my grandma[br]has already passed away.
0:51:15.500,0:51:19.520
I mean … her secrets will be safe …
0:51:27.420,0:51:32.030
Actually, I think something like where
0:51:32.030,0:51:37.349
crypto is enabled by default, say …[br]iMessage, I mean
0:51:37.349,0:51:42.059
of course, there’s backdoors,[br]etc., but at least
0:51:42.059,0:51:45.339
it is more secure than plain SMS.
0:51:45.339,0:51:53.249
So I would advise my grandma[br]to use … well, to look at …
0:51:53.249,0:51:56.289
actually I’d first analyse[br]what does she have available
0:51:56.289,0:51:58.880
and then I would look at okay[br]which is the most secure
0:51:58.880,0:52:03.680
and still usable?
0:52:03.680,0:52:07.339
Herald: Thank you very much, so mic #3, please.
0:52:07.339,0:52:10.880
Mic#3: So, just wondering:
0:52:10.880,0:52:14.950
You said that there is[br]a problem with the missing
0:52:14.950,0:52:20.329
default installation of GPG[br]on operating systems but
0:52:20.329,0:52:24.890
I think, this is more of a[br]problem of which OS you choose
0:52:24.890,0:52:28.220
because at least I don’t[br]know any Linux system which
0:52:28.220,0:52:33.599
doesn’t have GPG installed today by default.
0:52:33.599,0:52:39.539
If you use … at least I’ve used[br]the normal workstation setup.
0:52:39.539,0:52:42.550
Arne: Yes, I think you already[br]answered your own question:
0:52:42.550,0:52:47.230
Linux.[br]Laughter
0:52:47.230,0:52:50.690
Unfortunately, Linux is not yet widely default.
0:52:50.690,0:52:53.270
I mean, I’d love it to be, but … yeah.
0:52:53.270,0:52:57.730
Mic#3: But if I send an email to Microsoft and say:
0:52:57.730,0:53:02.539
Well, install GPG by default, they’re not gonna
0:53:02.539,0:53:04.150
listen to me.
0:53:04.150,0:53:07.530
And I think, for all of us, we should do
0:53:07.530,0:53:08.740
a lot more of that.
0:53:08.740,0:53:13.780
Even if Microsoft is the devil for most of us.
0:53:13.780,0:53:15.609
Thank you.
0:53:15.609,0:53:19.599
Arne: Well … We should be doing more of what?
0:53:19.599,0:53:26.430
Mic#3: Making more demands[br]to integrate GPG by default
0:53:26.430,0:53:29.210
in Microsoft products, for example.
0:53:29.210,0:53:31.059
Arne: Yes, I completely agree.
0:53:31.059,0:53:33.869
Well, what you already see happening …
0:53:33.869,0:53:36.140
or I mean, it’s not very high-profile yet,
0:53:36.140,0:53:39.020
but for example I mean … I’ve referred to
0:53:39.020,0:53:42.700
the EFF scorecard a couple of times but
0:53:42.700,0:53:49.750
that is some pressure to encourage developers
0:53:49.750,0:53:53.010
to include security by default.
0:53:53.010,0:53:56.940
But, I think I’ve also mentioned, one of the big problems
0:53:56.940,0:54:01.049
is: users at the moment … I mean …
0:54:01.049,0:54:04.079
developers might say: my system is secure.
0:54:04.079,0:54:06.549
I mean … what does that mean?
0:54:06.549,0:54:09.510
Do we hold developers and[br]commercial entities …
0:54:09.510,0:54:12.339
do we hold them to, well,
0:54:12.339,0:54:14.039
truthful advertising standards or not?
0:54:14.039,0:54:17.200
I mean, I would say: Let’s go and look at
0:54:17.200,0:54:21.289
what companies are claiming and
0:54:21.289,0:54:22.849
do they actually stand up to that?
0:54:22.849,0:54:26.079
And if not: Can we actually sue them?
0:54:26.079,0:54:27.720
Can we make them tell the truth about
0:54:27.720,0:54:30.759
what is happening and what is not?
0:54:30.759,0:54:32.960
Herald: So, we’ve got about 2 more minutes left …
0:54:32.960,0:54:37.049
So it’s a maximum of two more questions, #2, please.
0:54:37.049,0:54:43.440
Mic#2: Yeah, so … Every security system fails.
0:54:43.440,0:54:50.010
So I’m interested in what sort of work has been done on
0:54:50.010,0:54:56.999
how do users recover from failure?
0:54:56.999,0:55:00.660
Everything will get subverted,
0:55:00.660,0:55:04.190
your best friend will sneak[br]your key off your computer,
0:55:04.190,0:55:06.099
something will go wrong with that, you know …
0:55:06.099,0:55:09.510
your kids will grab it …
0:55:09.510,0:55:13.450
and just, is there, in general, has somebody looked at
0:55:13.450,0:55:17.000
these sorts of issues?
0:55:17.000,0:55:18.559
Is there research on it?
0:55:18.559,0:55:21.930
Arne: Of various aspects of the problem but
0:55:21.930,0:55:25.640
as far as I’m aware not for the general issue
0:55:25.640,0:55:30.170
and not any field studies specifically looking at
0:55:30.170,0:55:34.269
“Okay, what happens when a key is compromised, etc.”
0:55:34.269,0:55:37.520
I mean, we do have certain cases of things happening
0:55:37.520,0:55:41.789
but nothing structured.
0:55:41.789,0:55:44.720
No structured studies, as far as I’m aware.
0:55:44.720,0:55:46.810
Herald: Thank you. #3?
0:55:46.810,0:55:51.539
Mic#3: Yeah, you mentioned[br]mailpile as a stepping stone
0:55:51.539,0:55:56.380
for people to start using GnuPG and stuff, but
0:55:56.380,0:56:04.820
you also talked about an[br]average user seeing mail as just
0:56:04.820,0:56:08.789
coming from one place and[br]then ending up in another place.
0:56:08.789,0:56:12.250
Shouldn’t we actually talk about
0:56:12.250,0:56:17.880
how to make encryption transparent for the users?
0:56:17.880,0:56:21.430
Why should they actually care about these things?
0:56:21.430,0:56:24.980
Shouldn’t it be embedded in the protocols?
0:56:24.980,0:56:28.869
Shouldn’t we actually talk about[br]embedding them in the protocols,
0:56:28.869,0:56:31.510
stop using insecure protocols
0:56:31.510,0:56:36.109
and having all of these,[br]you talked a little bit about it,
0:56:36.109,0:56:38.720
as putting it in the defaults.
0:56:38.720,0:56:42.549
But shouldn’t we emphasise that a lot more?
0:56:42.549,0:56:46.730
Arne: Yeah, I think we should[br]certainly be working towards
0:56:46.730,0:56:50.200
“How do we get security by default?”
0:56:50.200,0:56:54.270
But I think … I’ve mentioned it shortly that
0:56:54.270,0:56:57.519
making things transparent also has a danger.
0:56:57.519,0:57:01.000
I mean, this whole, it’s a bit like …
0:57:01.000,0:57:03.380
“a system should be transparent” is a bit like
0:57:03.380,0:57:05.880
marketing speak, because actually
0:57:05.880,0:57:09.140
we don’t want systems to be completely transparent,
0:57:09.140,0:57:13.430
’cause we also wanna be able[br]to engage with the systems.
0:57:13.430,0:57:16.410
Are the systems working as they should be?
0:57:16.410,0:57:20.380
So, I mean, this is a difficult balance to find, but yeah …
0:57:20.380,0:57:24.730
Something that you achieve through usability studies,
0:57:24.730,0:57:29.069
security analysis, etc.
0:57:29.069,0:57:31.450
Herald: All right, Arne,[br]thank you very much for giving
0:57:31.450,0:57:33.640
us your very inspiring talk,
0:57:33.640,0:57:36.010
thank you for sharing your information with us.
0:57:36.010,0:57:38.479
Please give him a round of applause.
0:57:38.479,0:57:41.319
Thank you very much.[br]applause
0:57:41.319,0:57:52.000
subtitles created by c3subtitles.de[br]Join, and help us!