WEBVTT
00:00:11.380 --> 00:00:17.699
Applause
00:00:17.699 --> 00:00:20.060
So, good morning everyone
00:00:20.060 --> 00:00:23.529
my name is Arne and today
00:00:23.529 --> 00:00:25.529
I hope to entertain you
00:00:25.529 --> 00:00:32.189
a bit with some GPG usability issues.
00:00:32.189 --> 00:00:33.980
Thanks for being here this early in the morning.
00:00:33.980 --> 00:00:36.750
I know, some of you have had a short night
00:00:36.750 --> 00:00:43.210
In short for the impatient ones:
00:00:43.210 --> 00:00:46.660
Why is GnuPG damn near unusable?
00:00:46.660 --> 00:00:51.690
Well, actually, I don’t know
00:00:51.690 --> 00:00:52.699
Laughter
00:00:52.699 --> 00:00:57.940
So more research is needed … as always.
00:00:57.940 --> 00:00:59.969
Because it's not like using a thermometer.
00:00:59.969 --> 00:01:03.699
We're doing something between social science and security
00:01:03.699 --> 00:01:10.699
But I will present some interesting perspectives
00:01:11.729 --> 00:01:16.720
or at least what I hope you'll find interesting perspectives.
00:01:16.720 --> 00:01:20.340
This talk is about some possible explanations
00:01:20.340 --> 00:01:25.000
that usable security research can offer to that question.
00:01:25.000 --> 00:01:27.340
Now some context, something about myself,
00:01:27.340 --> 00:01:34.020
so you have a bit of an idea where I'm coming from
00:01:34.020 --> 00:01:39.200
and what coloured glasses I have on.
00:01:39.200 --> 00:01:44.030
So pretty much my background is in mathematics,
00:01:44.030 --> 00:01:48.500
computer science, and—strangely enough—international relations.
00:01:48.500 --> 00:01:51.860
My professional background is that I've been doing
00:01:51.860 --> 00:01:57.319
embedded system security evaluations and training
00:01:57.319 --> 00:02:02.890
and I've also been a PhD student, studying the usability of security.
00:02:02.890 --> 00:02:07.729
Currently, I teach the new generation,
00:02:07.729 --> 00:02:14.729
hoping to bring some new blood into the security world.
00:02:15.030 --> 00:02:17.910
I want to do some expectation setting
00:02:17.910 --> 00:02:21.319
I want to say, what this talk is not about.
00:02:21.319 --> 00:02:23.660
I will also give some helpful pointers for
00:02:23.660 --> 00:02:29.510
those of you who are interested in these other areas.
00:02:29.510 --> 00:02:34.269
I will not go into too much detail about the issue of truth
00:02:34.269 --> 00:02:37.510
in security science.
00:02:37.510 --> 00:02:40.469
Here are some links to some interesting papers that cover this
00:02:40.469 --> 00:02:42.930
in a lot of detail.
00:02:42.930 --> 00:02:45.650
Neither will I be giving a security primer.
00:02:45.650 --> 00:02:50.219
There are some nice links to books on the slide.
00:02:50.219 --> 00:02:55.340
I'll also not be giving a cryptography primer or a history lesson.
00:02:55.340 --> 00:02:57.819
Neither will I be giving an introduction to PGP
00:02:57.819 --> 00:03:02.749
And, interestingly enough, even though the talk is titled
00:03:02.749 --> 00:03:06.579
“why is GPG damn near unusable”, I will
00:03:06.579 --> 00:03:10.159
not really be doing much PGP bashing
00:03:10.159 --> 00:03:15.290
I actually think it's quite a wonderful effort, and other people
00:03:15.290 --> 00:03:20.739
have pretty much done the PGP/GnuPG bashing for me.
00:03:20.739 --> 00:03:25.560
And, as I've already mentioned, I will not be giving any definite answers
00:03:25.560 --> 00:03:29.329
but rather a lot of “it depends.”
00:03:29.329 --> 00:03:33.739
But then you might ask “well, it depends. What does it depend on?”
00:03:33.739 --> 00:03:37.319
Well, for one: What users you’re looking at
00:03:37.319 --> 00:03:39.689
which goals they have in mind and
00:03:39.689 --> 00:03:43.609
in what context, what environment they’re doing these things.
00:03:43.609 --> 00:03:47.819
So, instead I want to kindle your inspiration
00:03:47.819 --> 00:03:54.489
I want to offer you a new view on the security environment
00:03:54.489 --> 00:03:59.700
and I'll also give you some concrete exercises that you can try out
00:03:59.700 --> 00:04:01.859
at home or at the office.
00:04:01.859 --> 00:04:07.739
Some “dos” and “don’ts” and pointers for further exploration:
00:04:07.739 --> 00:04:10.249
This is a short overview of the talk
00:04:10.249 --> 00:04:16.250
I'll start with the background story to why I’m giving this talk
00:04:16.250 --> 00:04:21.298
then an overview of the usable security research area,
00:04:21.298 --> 00:04:24.630
some principles and methods for usability,
00:04:24.630 --> 00:04:28.590
some case studies, and then some open questions that remain.
00:04:28.590 --> 00:04:35.590
So, the story. Well. It all started with this book.
00:04:36.810 --> 00:04:41.260
When I was reading about the Snowden revelations,
00:04:41.260 --> 00:04:46.690
I read, how Snowden tried to contact Glenn Greenwald.
00:04:46.690 --> 00:04:53.270
On December 1st, he sent an email, saying, well, writing to Glenn:
00:04:53.270 --> 00:05:00.120
“If you don’t use PGP, some people will never be able to contact you.”
00:05:00.120 --> 00:05:05.320
“Please install this helpful tool and if you need any help,
00:05:05.320 --> 00:05:08.200
please request so.”
00:05:08.200 --> 00:05:11.850
Three days later, Glenn Greenwald says: “Sorry, I don’t
00:05:11.850 --> 00:05:16.970
know how to do that, but I’ll look into it.”
00:05:16.970 --> 00:05:21.910
and Snowden writes back: “Okay, well, sure. And again:
00:05:21.910 --> 00:05:23.980
If you need any help, I can facilitate contact.”
00:05:23.980 --> 00:05:28.720
Now, a mere seven weeks later,
00:05:28.720 --> 00:05:30.050
Laughter
00:05:30.050 --> 00:05:37.050
Glenn is like “okay, well, I’ll do it within the next days or so.”
00:05:37.320 --> 00:05:38.290
Okay, sure …
00:05:38.290 --> 00:05:42.780
Snowden’s like “my sincerest thanks”.
00:05:42.780 --> 00:05:46.440
But actually in the meantime, Snowden was growing a bit impatient
00:05:46.440 --> 00:05:51.080
’cause, okay, “why are you not encrypting?”
00:05:51.080 --> 00:05:55.050
So he sent an email to Micah Lee, saying, “okay, well, hello,
00:05:55.050 --> 00:06:01.820
I’m a friend, can you help me get in contact with Laura Poitras?”
00:06:01.820 --> 00:06:06.260
In addition to that, he made a ten-minute video for Glenn Greenwald
00:06:06.260 --> 00:06:09.270
Laughter
00:06:09.270 --> 00:06:11.340
… describing how to use GPG.
00:06:11.340 --> 00:06:17.250
And actually I have quite a lot of screenshots of that video
00:06:17.250 --> 00:06:19.880
and it's quite entertaining.
00:06:19.880 --> 00:06:24.160
’cause, of course, Snowden was getting increasingly
00:06:24.160 --> 00:06:27.550
bothered by the whole situation.
00:06:27.550 --> 00:06:33.090
Now, this is the video that Snowden made
00:06:33.090 --> 00:06:40.090
“GPG for Journalists. For Windows.”
00:06:40.990 --> 00:06:46.860
Laughter
00:06:48.800 --> 00:06:51.340
I’ll just click through it, because I think,
00:06:51.340 --> 00:06:53.960
the slides speak for themselves.
00:06:53.960 --> 00:07:00.960
Take notes of all the usability issues you can identify.
00:07:01.860 --> 00:07:06.030
So just click through the wizard, generate a new key,
00:07:06.030 --> 00:07:11.710
enable “expert settings”, ’cause we want 3000-bit keys
00:07:11.710 --> 00:07:15.960
We want a very long password, etc.
00:07:15.960 --> 00:07:19.260
And now, of course, we also wanna go and find keys
00:07:19.260 --> 00:07:21.840
on the keyserver.
00:07:21.840 --> 00:07:24.790
And we need to remember that we shouldn’t
00:07:24.790 --> 00:07:28.670
write our draft messages in GMail
00:07:28.670 --> 00:07:31.740
or in Thunderbird with Enigmail, for that matter.
00:07:31.740 --> 00:07:37.340
Although that issue has been solved.
00:07:39.580 --> 00:07:42.230
So, I think you can start seeing
00:07:42.230 --> 00:07:45.380
why Glenn Greenwald
—even if he did open this video—
00:07:45.380 --> 00:07:52.850
was like
“okay, well, I’m not gonna bother.”
00:07:52.850 --> 00:07:56.820
And Snowden is so kind to say, after 12 minutes,
00:07:56.820 --> 00:08:01.580
“if you have any remaining questions, please contact me.”
00:08:01.580 --> 00:08:05.870
At this year’s HOPE conference,
00:08:05.870 --> 00:08:12.390
Snowden actually issued a call to arms and he said
00:08:12.390 --> 00:08:16.740
“Okay, we need people to evaluate our security systems.
00:08:16.740 --> 00:08:23.740
we need people to go and do red-teaming. But in addition to that,
00:08:25.530 --> 00:08:30.970
we also need to look at the user experience issue.”
00:08:30.970 --> 00:08:37.089
So this is a transcript of his kind of manifesto
00:08:37.089 --> 00:08:42.190
and he says: “GPG is really damn near unusable”
00:08:42.190 --> 00:08:46.810
because, well, of course, you might know command line
00:08:46.810 --> 00:08:49.910
and then, okay, you might be okay
00:08:49.910 --> 00:08:53.470
but “Gam Gam at the home”, she is never going to be able
00:08:53.470 --> 00:08:59.490
to use GnuPG.
00:08:59.490 --> 00:09:09.129
And he also notes that, okay, we are part of a technical elite
00:09:09.129 --> 00:09:15.749
and he calls on us to work on the technical literacy of people
00:09:15.749 --> 00:09:18.339
because what he explicitly warns against is
00:09:18.339 --> 00:09:22.680
a high priesthood of technology.
00:09:22.680 --> 00:09:27.850
Okay, that’s a nice call to arms
00:09:27.850 --> 00:09:32.950
but are we actually up for a new dawn?
00:09:32.950 --> 00:09:40.489
Well, I wanna go into the background of usable security
00:09:40.489 --> 00:09:44.959
and I wanna show you that we’ve actually been
00:09:44.959 --> 00:09:48.009
in a pretty dark time.
00:09:48.009 --> 00:09:55.999
So, back in 1999, there was this paper:
“Why Johnny Can’t Encrypt”
00:09:56.000 --> 00:10:01.219
which described mostly the same broken interface
00:10:01.219 --> 00:10:09.999
so if you go back to the video of which
00:10:09.999 --> 00:10:15.370
I showed some screenshots, and then look at
00:10:15.370 --> 00:10:21.899
these screenshots from 1999, well,
is there a lot of difference?
00:10:21.899 --> 00:10:24.470
Not really! Nothing much has changed.
00:10:24.470 --> 00:10:25.800
There are still the same
00:10:25.800 --> 00:10:31.759
conceptual barriers,
and the same crappy defaults.
00:10:31.759 --> 00:10:35.980
And most astonishingly, in the paper there
00:10:35.980 --> 00:10:39.460
is a description of a user study where
00:10:39.460 --> 00:10:44.830
users were given 90 minutes to encrypt an email
00:10:44.830 --> 00:10:49.860
and most were unable to do so.
00:10:49.860 --> 00:10:55.380
I think, that pretty much describes “damn near unusable.”
00:10:55.380 --> 00:11:02.550
A timeline from, well, before 1999 to now
00:11:02.550 --> 00:11:06.100
of usable security research.
00:11:06.100 --> 00:11:09.920
So, quite a lot has happened
00:11:09.920 --> 00:11:15.180
although it is still a growing field.
00:11:15.180 --> 00:11:20.949
It started—the idea of usable security
was first explicitly defined—
00:11:20.949 --> 00:11:29.760
in 1975, but it was only in 1989 that
00:11:29.760 --> 00:11:33.380
the first usability tests were carried out.
00:11:33.380 --> 00:11:38.430
And it was only in 1996 that
00:11:38.430 --> 00:11:44.660
the concept of “user-centered security”
was described.
00:11:44.660 --> 00:11:49.269
An interesting paper, also from 1999, shows how
00:11:49.269 --> 00:11:55.239
contrary to the general description of users as lazy
00:11:55.239 --> 00:12:00.829
and basically as the weakest link in security,
00:12:00.829 --> 00:12:06.009
this paper describes users as pretty rational beings
00:12:06.009 --> 00:12:09.660
who see security as an overhead and …
who don't
00:12:09.660 --> 00:12:16.589
understand the usefulness of what they’re doing.
00:12:16.589 --> 00:12:21.309
The study of PGP 5.0, I’ve talked about that already,
00:12:21.309 --> 00:12:27.550
and there was also a study of the
Kazaa network in 2002.
00:12:27.550 --> 00:12:29.550
And it was found that a lot of users were
00:12:29.550 --> 00:12:35.339
accidentally sharing files, from personal pictures
00:12:35.339 --> 00:12:41.749
to, who knows, maybe credit-card details,
you never know, right?
00:12:41.749 --> 00:12:48.209
In 2002, a lot of the knowledge of usable security design
00:12:48.209 --> 00:12:51.180
was concretised in ten key principles
00:12:51.180 --> 00:12:54.469
and if you’re interested,
00:12:54.469 --> 00:13:03.999
I do recommend you to look at the paper.
00:13:03.999 --> 00:13:09.939
A solution to the PGP problem was proposed in
00:13:09.939 --> 00:13:11.679
2004, well, actually,
00:13:11.679 --> 00:13:15.050
it was proposed earlier
but it was tested in 2005.
00:13:15.050 --> 00:13:19.529
And it was found that actually
if we automate encryption
00:13:19.529 --> 00:13:23.869
and if we automate key exchange then, well,
00:13:23.869 --> 00:13:26.059
things are pretty workable, except that
00:13:26.059 --> 00:13:30.290
users still fall for
phishing attacks, of course.
00:13:30.290 --> 00:13:38.399
But last year, another study
identified that, well,
00:13:38.399 --> 00:13:41.839
making security transparent is all nice and well
00:13:41.839 --> 00:13:46.309
but it's also dangerous because users then
00:13:46.309 --> 00:13:49.879
are less likely to trust the system and
00:13:49.879 --> 00:13:54.259
are less likely to really understand
what’s really happening.
00:13:54.259 --> 00:13:59.489
So a paper this year also identified another issue:
00:13:59.489 --> 00:14:04.209
Users generally have very bad understanding
00:14:04.209 --> 00:14:05.980
of the email architecture.
00:14:05.980 --> 00:14:08.579
An email goes from point A to point B.
00:14:08.579 --> 00:14:15.630
And what happens in-between is unknown.
00:14:15.630 --> 00:14:22.570
So, before I go on to general usability principles
00:14:22.570 --> 00:14:27.550
from the founding pillar of the usable security field
00:14:27.550 --> 00:14:33.670
I wanna give some examples of usability failures.
00:14:33.670 --> 00:14:37.470
You might be familiar with project VENONA.
00:14:37.470 --> 00:14:41.649
This was an effort by the US intelligence agencies
00:14:41.649 --> 00:14:45.290
to try and decrypt Soviet communication.
00:14:45.290 --> 00:14:49.389
And they actually were pretty successful.
00:14:49.389 --> 00:14:53.980
They uncovered a lot of spying and, well,
00:14:53.980 --> 00:14:56.189
how did they do this?
00:14:56.189 --> 00:14:59.869
The Soviets were using one-time pads and,
00:14:59.869 --> 00:15:02.619
well, if you reuse a one-time pad,
00:15:02.619 --> 00:15:08.069
then you leak a lot of information about the plaintext.
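
NOTE
A minimal sketch (illustrative code, not from the talk) of why pad
reuse leaks, assuming a simple XOR-based one-time pad: XOR-ing two
ciphertexts made with the same pad cancels the pad and leaves the XOR
of the two plaintexts, which analysts can then peel apart.
    # Python: reusing a one-time pad
    import os
    pad = os.urandom(16)                        # the "one-time" pad
    p1 = b"ATTACK AT DAWN!!"                    # two 16-byte plaintexts
    p2 = b"RETREAT AT DUSK!"
    c1 = bytes(a ^ b for a, b in zip(p1, pad))  # both encrypted with
    c2 = bytes(a ^ b for a, b in zip(p2, pad))  # the SAME pad
    # The pad cancels out: c1 XOR c2 equals p1 XOR p2.
    leak = bytes(a ^ b for a, b in zip(c1, c2))
    assert leak == bytes(a ^ b for a, b in zip(p1, p2))
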
00:15:08.069 --> 00:15:10.269
Well, what we also see happening a lot is
00:15:10.269 --> 00:15:12.600
low password entropy.
00:15:12.600 --> 00:15:19.440
We have people choosing the password “123456”, etc.
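
NOTE
A rough sketch of what low entropy means here, assuming the naive
estimate of length times log2 of the alphabet size; the numbers are
illustrative only.
    # Python: naive password entropy estimate in bits
    import math
    def naive_entropy_bits(length: int, alphabet: int) -> float:
        # assumes each character is chosen uniformly at random
        return length * math.log2(alphabet)
    print(naive_entropy_bits(6, 10))   # six digits: ~19.9 bits
    print(naive_entropy_bits(12, 94))  # random printable ASCII: ~78.7 bits
    # "123456" isn't even random: as one of the most common passwords,
    # its real guessing entropy is close to zero.
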
00:15:19.440 --> 00:15:25.479
And the study I just described, looking into
00:15:25.479 --> 00:15:29.249
the mental models of users,
00:15:29.249 --> 00:15:32.300
of the email architecture and how it works
00:15:32.300 --> 00:15:34.819
well, at the top you have
00:15:34.819 --> 00:15:37.879
still a pretty simplified description of how things work
00:15:37.879 --> 00:15:40.519
and at the bottom we have an actual drawing
00:15:40.519 --> 00:15:43.300
of a research participant when asked:
00:15:43.300 --> 00:15:49.779
“Can you draw how an email
goes from point A to point B?”
00:15:49.779 --> 00:15:53.779
And it’s like:
“Well, it goes from one place to the other.”
00:15:53.779 --> 00:15:58.829
Okay …
00:15:58.829 --> 00:16:01.039
Clicking sounds
00:16:01.039 --> 00:16:03.669
Okay, so this died.
00:16:12.569 --> 00:16:18.689
So, these are two screenshots of Enigmail.
00:16:18.689 --> 00:16:21.939
Well, if I hadn’t marked them
00:16:21.939 --> 00:16:27.199
as the plaintext and encrypted email that would be sent
00:16:27.199 --> 00:16:30.899
you probably wouldn’t have spotted which was which
00:16:30.899 --> 00:16:34.610
this is a pretty big failure in
00:16:34.610 --> 00:16:39.839
the visibility of the system.
00:16:39.839 --> 00:16:44.680
You don’t see anything? Ah.
00:16:44.680 --> 00:16:47.509
Audience: “That’s the point!”
That's the point, yes!
00:16:47.509 --> 00:16:58.109
Laughter
Applause
00:16:58.109 --> 00:17:02.209
On the left we have a screenshot of GPG and
00:17:02.209 --> 00:17:04.299
as I’ve already described,
00:17:04.299 --> 00:17:08.530
command line people, we like command lines
00:17:08.530 --> 00:17:11.300
but normal people don’t.
00:17:11.300 --> 00:17:13.720
And what we also see is a lot of the jargon that is
00:17:13.720 --> 00:17:17.030
currently being used even in GUI applications
00:17:17.030 --> 00:17:23.409
so on the right there is PGP 10.0.
00:17:23.409 --> 00:17:25.979
Now I wanna close these examples with
00:17:25.979 --> 00:17:28.529
well, you might be wondering: “what is this?”
00:17:28.529 --> 00:17:32.589
This is actually an example of a security device
00:17:32.589 --> 00:17:36.649
from, I think it’s around 4000 years ago.
00:17:36.649 --> 00:17:38.360
Like, people could use this.
00:17:38.360 --> 00:17:42.870
Why can’t we get it right today?
00:17:42.870 --> 00:17:46.360
Something that you should try,
00:17:46.360 --> 00:17:48.869
as a little homework exercise:
00:17:48.869 --> 00:17:52.419
take a laptop to your grandma, show her PGP,
00:17:52.419 --> 00:17:55.110
can she use it—yes or no?
00:17:55.110 --> 00:18:02.440
Probably not, but who knows?
00:18:02.450 --> 00:18:03.740
Now I wanna go into
00:18:03.740 --> 00:18:09.840
the usability cornerstones of usable security.
00:18:09.840 --> 00:18:13.049
I wanna start with heuristics
00:18:13.049 --> 00:18:15.519
some people call them “rules of thumb,” other people
00:18:15.519 --> 00:18:19.059
call them “the ten holy commandments”
00:18:19.059 --> 00:18:23.299
For example, the ten commandments of Dieter Rams,
00:18:23.299 --> 00:18:27.059
there are the ten commandments of Jakob Nielsen,
00:18:27.059 --> 00:18:28.250
of Don Norman
00:18:28.250 --> 00:18:35.110
and it really depends on who you believe in, etc.
00:18:35.110 --> 00:18:37.380
But the cornerstone of all of these is that
00:18:37.380 --> 00:18:40.270
design is made for people.
00:18:40.270 --> 00:18:45.800
And, well, actually, Google says it quite well
00:18:45.800 --> 00:18:48.559
in their guiding mission:
00:18:48.559 --> 00:18:52.740
“Focus on the user and all else will follow.”
00:18:52.740 --> 00:18:54.809
Or, as a usability maxim:
00:18:54.809 --> 00:18:57.350
“thou shalt test with thy user”
00:18:57.350 --> 00:19:01.209
Don’t just give them the thing.
00:19:01.209 --> 00:19:03.200
But there is one problem with these heuristics
00:19:03.200 --> 00:19:06.510
and with this advice of just testing with your user.
00:19:06.510 --> 00:19:10.889
Because it’s pretty abstract advice.
00:19:10.889 --> 00:19:12.090
What do you do?
00:19:12.090 --> 00:19:13.940
You go out into the world to get practice.
00:19:13.940 --> 00:19:17.870
You start observing people.
00:19:17.870 --> 00:19:20.169
One nice exercise to try is:
00:19:20.169 --> 00:19:21.289
go to the vending machine,
00:19:21.289 --> 00:19:24.539
for example the ones at the S-Bahn.
00:19:24.539 --> 00:19:26.049
Just stand next to it
00:19:26.049 --> 00:19:28.269
and observe people buying tickets.
00:19:28.269 --> 00:19:30.860
It’s quite entertaining, actually.
00:19:30.860 --> 00:19:33.500
Laughter
00:19:33.500 --> 00:19:36.010
And something you can also do is
00:19:36.010 --> 00:19:37.890
search for usability failures.
00:19:37.890 --> 00:19:39.750
This is what you already do when
00:19:39.750 --> 00:19:41.269
you’re observing people.
00:19:41.269 --> 00:19:45.320
But even just google for “usability failure”,
00:19:45.320 --> 00:19:47.870
“GUI fail”, etc., and you will find
00:19:47.870 --> 00:19:53.400
lots of entertaining stuff.
00:19:53.400 --> 00:19:54.950
Those were some heuristics
00:19:54.950 --> 00:19:56.250
but what about the principles
00:19:56.250 --> 00:20:01.740
that lie behind those?
00:20:01.740 --> 00:20:05.799
Usability or interaction design
00:20:05.799 --> 00:20:09.299
is a cycle between the user and the system.
00:20:09.299 --> 00:20:10.470
The user and the world.
00:20:10.470 --> 00:20:12.090
The user acts on the world
00:20:12.090 --> 00:20:13.590
and gets feedback.
00:20:13.590 --> 00:20:17.000
They interpret that.
00:20:17.000 --> 00:20:18.480
One important concept is
00:20:18.480 --> 00:20:20.070
for things to be visible.
00:20:20.070 --> 00:20:21.110
For the underlying system state
00:20:21.110 --> 00:20:22.320
to be visible and
00:20:22.320 --> 00:20:23.760
you get appropriate feedback
00:20:23.760 --> 00:20:26.169
from the system.
00:20:26.169 --> 00:20:31.230
So these are Don Norman’s gulfs
of execution and evaluation
00:20:31.230 --> 00:20:34.940
sort of a yin and yang.
00:20:34.940 --> 00:20:38.549
And there are two concrete problems
00:20:38.549 --> 00:20:39.830
to illustrate.
00:20:39.830 --> 00:20:41.850
For example, the button problem
00:20:41.850 --> 00:20:45.840
of “how do you know what happens
when you push the button?”
00:20:45.840 --> 00:20:50.620
and “how do you know how to push it?”
00:20:50.620 --> 00:20:52.750
I unfortunately don’t have a picture of it,
00:20:52.750 --> 00:20:58.360
but at Oxford station, the taps in the bathrooms,
00:20:58.360 --> 00:21:01.550
they say “push” and you need to turn.
00:21:01.550 --> 00:21:05.439
Laughter
00:21:05.439 --> 00:21:08.510
Then there is the toilet door problem.
00:21:08.510 --> 00:21:11.740
The problem of “how do you know
what state a system is in”.
00:21:11.740 --> 00:21:15.730
How do you know whether an email will be encrypted?
00:21:15.730 --> 00:21:20.269
This is a picture …
00:21:20.269 --> 00:21:21.980
basically there are two locks.
00:21:21.980 --> 00:21:26.120
One is actually broken and it’s …
when pushing the button that's on the
00:21:26.120 --> 00:21:29.049
door handle, you usually lock the door.
00:21:29.049 --> 00:21:31.620
But … well … it broke. So that must have been
00:21:31.620 --> 00:21:36.200
an entertaining accident.
00:21:36.200 --> 00:21:39.080
Another, as I’ve already described,
00:21:39.080 --> 00:21:44.339
another important concept is that of mental models.
00:21:44.339 --> 00:21:47.860
It’s a question of what idea does the user have
00:21:47.860 --> 00:21:52.589
of the system by interacting with it?
00:21:52.589 --> 00:21:55.880
How do they acquire knowledge?
00:21:55.880 --> 00:21:59.250
For example, how to achieve discoverability
00:21:59.250 --> 00:22:00.769
of the system?
00:22:00.769 --> 00:22:05.710
And how to ensure that while
a user is discovering the system
00:22:05.710 --> 00:22:09.480
that they are less likely to make mistakes?
00:22:09.480 --> 00:22:13.880
So this is the concept of poka-yoke
00:22:13.880 --> 00:22:18.429
and it’s … here is an example
you also see it with floppy disks,
00:22:18.429 --> 00:22:22.089
with USB sticks, etc.
00:22:22.089 --> 00:22:24.309
It’s engineered such that users are
00:22:24.309 --> 00:22:27.020
less likely to make a mistake.
00:22:27.020 --> 00:22:30.720
Then there’s also the idea
of enabling knowledge transfer
00:22:30.720 --> 00:22:33.339
So how can we do this?
00:22:33.339 --> 00:22:35.480
One thing is metaphors.
00:22:35.480 --> 00:22:39.919
And I’m not sure how many of you recognise this,
00:22:39.919 --> 00:22:44.030
this is Microsoft BOB.
00:22:44.030 --> 00:22:46.399
Traditionally, PC systems have been built
00:22:46.399 --> 00:22:51.909
on the desktop metaphor.
00:22:51.909 --> 00:22:58.169
Laughter
Microsoft BOB had a little too much.
00:22:58.169 --> 00:23:04.510
To enable knowledge transfer,
you can also standardise systems.
00:23:04.510 --> 00:23:08.519
And one important tool for this is design languages
00:23:08.519 --> 00:23:12.159
so if you’re designing for iOS, go look at
00:23:12.159 --> 00:23:15.970
the design language, the Human
Interface Guidelines of iOS.
00:23:15.970 --> 00:23:19.690
The same for Windows – go look
at the Metro Design Guidelines.
00:23:19.690 --> 00:23:26.159
As for Android, look at Material Design.
00:23:26.159 --> 00:23:30.409
Because, another interesting exercise
to try out
00:23:30.409 --> 00:23:33.230
relating to design languages
00:23:33.230 --> 00:23:38.250
and also to get familiar with how designers try to
00:23:38.250 --> 00:23:40.519
communicate with users is to
00:23:40.519 --> 00:23:44.639
look at an interface and try to decode
00:23:44.639 --> 00:23:48.599
what the designer is trying to say to the user.
00:23:48.599 --> 00:23:53.669
And another interesting exercise is to look at
00:23:53.669 --> 00:23:58.840
not usability but UNusability.
00:23:58.840 --> 00:24:00.909
So there is this pretty interesting book
00:24:00.909 --> 00:24:04.940
called “Evil by Design” and it goes into
00:24:04.940 --> 00:24:08.450
all the various techniques that designers use
00:24:08.450 --> 00:24:16.760
to fool users, to get them to buy an extra hotel, car, etc.
00:24:16.760 --> 00:24:21.750
and, well, Ryanair is pretty much the worst offender,
00:24:21.750 --> 00:24:24.970
so a good example to study.
00:24:24.970 --> 00:24:30.519
Applause
00:24:30.519 --> 00:24:34.480
So, what if you wanna go out into the world
00:24:34.480 --> 00:24:40.220
and apply these principles, these heuristics?
00:24:40.220 --> 00:24:42.350
The first thing to know is that
00:24:42.350 --> 00:24:45.219
design has to be a process
00:24:45.219 --> 00:24:50.739
whereby the first part is actually defining your problem.
00:24:50.739 --> 00:24:53.729
You first brain-storm
00:24:53.729 --> 00:24:58.230
then you try to narrow down to concrete requirements
00:24:58.230 --> 00:25:02.870
after which you go and try out
the various approaches
00:25:02.870 --> 00:25:05.850
and test these.
00:25:05.850 --> 00:25:09.279
What materials do usability experts actually use?
00:25:09.279 --> 00:25:15.630
Well, of course there are expensive tools, Axure, etc.,
00:25:15.630 --> 00:25:19.220
but I think one of the most used materials
00:25:19.220 --> 00:25:25.490
is still the post-it note. Just paper and pens.
00:25:25.490 --> 00:25:28.980
And, okay, where do you wanna go and test?
00:25:28.980 --> 00:25:32.090
Well, actually, go out into the field.
00:25:32.090 --> 00:25:35.950
Go to the ticket machine of the S-Bahn.
00:25:35.950 --> 00:25:39.019
But also go and test in the lab, so that you have
00:25:39.019 --> 00:25:42.179
a controlled environment.
00:25:42.179 --> 00:25:45.309
And then you ask “okay, how do I test?”
00:25:45.309 --> 00:25:49.919
Well, first thing is: Go and get some real people.
00:25:49.919 --> 00:25:54.630
Of course, it’s … it can be difficult to actually
00:25:54.630 --> 00:26:00.620
get people into the lab but it’s not impossible.
00:26:00.620 --> 00:26:02.429
So once you have people in the lab,
00:26:02.429 --> 00:26:05.020
here are some methods.
00:26:05.020 --> 00:26:07.279
There are so many usability evaluation methods
00:26:07.279 --> 00:26:09.409
that I’m not gonna list them all and
00:26:09.409 --> 00:26:13.200
I encourage you to go and look them up yourself
00:26:13.200 --> 00:26:15.360
’cause it’s also very personal what works for you
00:26:15.360 --> 00:26:20.500
and what works in your situation.
00:26:20.500 --> 00:26:23.050
When using these methods you wanna
00:26:23.050 --> 00:26:25.780
evaluate how well a solution works
00:26:25.780 --> 00:26:29.809
So you’re gonna look at some metrics
00:26:29.809 --> 00:26:31.409
so that at the end of your evaluation
00:26:31.409 --> 00:26:35.100
you can say “okay, we’ve done a good job”,
00:26:35.100 --> 00:26:40.100
“this can go better”,
“Okay, maybe we can move that”, …
00:26:40.100 --> 00:26:44.069
So these are the standardised ones, so
00:26:44.069 --> 00:26:47.690
how effective people are, etc.
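
NOTE
A minimal sketch of such metrics, assuming the usual trio of
effectiveness (task completion), efficiency (time on task), and
satisfaction (a survey score); the session data is made up for
illustration.
    # Python: summarising a small usability test
    sessions = [  # (completed?, seconds on task, satisfaction 1-5)
        (True, 310, 4), (False, 900, 2), (True, 450, 3), (True, 280, 5),
    ]
    completed_times = [t for done, t, _ in sessions if done]
    effectiveness = len(completed_times) / len(sessions)           # 0.75
    efficiency = sum(completed_times) / len(completed_times)       # ~347 s
    satisfaction = sum(s for _, _, s in sessions) / len(sessions)  # 3.5
    print(effectiveness, efficiency, satisfaction)
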
00:26:47.690 --> 00:26:52.909
You can read …
00:26:52.909 --> 00:26:55.759
For a quick start guide on how to
00:26:55.759 --> 00:26:59.159
perform usability studies, this is quite a nice one.
00:26:59.159 --> 00:27:00.480
And the most important thing to remember
00:27:00.480 --> 00:27:04.529
is that preparation is half the work.
00:27:04.529 --> 00:27:08.120
First thing: check that everything is working,
00:27:08.120 --> 00:27:17.180
make sure that you have everyone
you need in the room, etc.
00:27:17.180 --> 00:27:23.249
And maybe most importantly,
usability and usable security,
00:27:23.249 --> 00:27:26.380
well, usable security is still a growing field, but
00:27:26.380 --> 00:27:30.630
usability is a very large field and most likely
00:27:30.630 --> 00:27:34.720
all the problems that you are going to face
00:27:34.720 --> 00:27:36.970
or at least a large percentage of them, other people
00:27:36.970 --> 00:27:39.080
have faced before.
00:27:39.080 --> 00:27:43.529
So this book is, well, it describes a lot of the stories
00:27:43.529 --> 00:27:47.529
of user experience professionals and the things
00:27:47.529 --> 00:27:52.040
that they’ve come up against.
00:27:52.040 --> 00:27:56.189
A homework exercise, if you feel like it,
00:27:56.189 --> 00:28:00.990
is basically analysing who your user is
00:28:00.990 --> 00:28:06.760
and where they’re going to use the application.
00:28:06.760 --> 00:28:10.409
And also something to think about is
00:28:10.409 --> 00:28:12.649
how might you involve your user?
00:28:12.649 --> 00:28:16.889
Not just during the usability testing,
00:28:16.889 --> 00:28:21.070
but also afterwards.
00:28:21.070 --> 00:28:28.450
Now I wanna go into some case
studies of encryption systems.
00:28:28.450 --> 00:28:30.230
Now there’s quite a lot, and these are not all,
00:28:30.230 --> 00:28:34.769
it’s just a small selection but I wanna focus on three.
00:28:34.769 --> 00:28:40.229
I wanna focus on the OpenPGP standard,
Cryptocat and TextSecure.
00:28:40.229 --> 00:28:42.769
So, OpenPGP, well …
00:28:42.769 --> 00:28:46.230
email is now almost 50 years old,
00:28:46.230 --> 00:28:52.190
we have an encryption standard—S/MIME,
it is widely used
00:28:52.190 --> 00:28:56.039
well, it’s widely usable but it’s not widely used …
00:28:56.039 --> 00:29:03.679
and GnuPG is used widely but is not installed by default
00:29:03.679 --> 00:29:09.939
and when usability teaches us one thing
00:29:09.939 --> 00:29:14.129
it’s that defaults rule.
00:29:14.129 --> 00:29:18.190
Because users don’t change defaults.
00:29:18.190 --> 00:29:23.360
Now you might ask “Okay,
PGP is not installed by default,
00:29:23.360 --> 00:29:26.560
so is there actually still a future for OpenPGP?”
00:29:26.560 --> 00:29:30.179
Well, I’d argue: Yes.
We have browser plug-ins
00:29:30.179 --> 00:29:33.059
which make it easier for users
00:29:33.059 --> 00:29:37.850
JavaScript crypto … I’ll come back to that later …
00:29:37.850 --> 00:29:43.420
But when we look at Mailvelope, we see, well,
00:29:43.420 --> 00:29:48.040
the EFF scorecard, it has a pretty decent rating
00:29:48.040 --> 00:29:55.790
at least compared to that of native PGP implementations.
00:29:55.790 --> 00:29:58.629
And also Google has announced and has been working
00:29:58.629 --> 00:30:01.409
for quite some time on their own plug-in for
00:30:01.409 --> 00:30:03.379
end-to-end encryption.
00:30:03.379 --> 00:30:07.950
And Yahoo! is also involved in that.
00:30:07.950 --> 00:30:11.389
And after the Snowden revelations there has been
00:30:11.389 --> 00:30:15.009
a widespread surge in the interest
00:30:15.009 --> 00:30:18.460
in encrypted communications
00:30:18.460 --> 00:30:23.320
and this is one website where a lot of these are listed.
00:30:23.320 --> 00:30:27.889
And one project that I’d especially like to emphasise
00:30:27.889 --> 00:30:31.910
is Mailpile because I think it looks
00:30:31.910 --> 00:30:35.300
like a very interesting approach
00:30:35.300 --> 00:30:37.820
whereby the question is:
00:30:37.820 --> 00:30:41.080
Can we use OpenPGP as a stepping stone?
00:30:41.080 --> 00:30:46.620
OpenPGP is not perfect, meta-data is not protected,
00:30:46.620 --> 00:30:48.299
headers are not protected, etc.
00:30:48.299 --> 00:30:51.870
But maybe when we get people into the ecosystem,
00:30:51.870 --> 00:30:56.169
we can try and gradually move
them to more secure options.
00:30:56.169 --> 00:30:58.899
Now, what about Cryptocat?
00:30:58.899 --> 00:31:04.070
So, Cryptocat is an online chat platform
00:31:04.070 --> 00:31:06.900
that … yes … uses JavaScript.
00:31:06.900 --> 00:31:10.909
And of course, JavaScript crypto is bad
00:31:10.909 --> 00:31:14.620
but it can be made better.
00:31:14.620 --> 00:31:20.160
And I think JavaScript crypto is not the worst problem.
00:31:20.160 --> 00:31:22.860
Cryptocat had a pretty disastrous problem
00:31:22.860 --> 00:31:26.809
whereby all messages that were sent
00:31:26.809 --> 00:31:30.610
were pretty easily decryptable.
00:31:30.610 --> 00:31:33.169
But actually, this is just history repeating itself
00:31:33.169 --> 00:31:39.090
’cause PGP 1.0 used something called BassOmatic,
00:31:39.090 --> 00:31:44.620
the BassOmatic cipher, which was also pretty weak.
00:31:44.620 --> 00:31:49.509
And Cryptocat is improving, which is the important thing.
00:31:49.509 --> 00:31:51.179
There is now a browser plug-in and
00:31:51.179 --> 00:31:53.890
of course, there’s an app for that and
00:31:53.890 --> 00:31:56.570
actually, Cryptocat is doing really, really well
00:31:56.570 --> 00:31:59.280
in the EFF benchmarks.
00:31:59.280 --> 00:32:04.539
And Cryptocat is asking the one question that a lot
00:32:04.539 --> 00:32:06.659
of other applications are not asking, which is:
00:32:06.659 --> 00:32:09.459
“How can we actually make crypto fun?”
00:32:09.459 --> 00:32:12.429
When you start Cryptocat, there’s noises
00:32:12.429 --> 00:32:15.340
and there’s interesting facts about cats
00:32:15.340 --> 00:32:18.759
Laughter
00:32:18.759 --> 00:32:21.249
… depends on whether you like cats, but still!
00:32:21.249 --> 00:32:23.450
Keeps you busy!
00:32:23.450 --> 00:32:28.989
Now, the last case: TextSecure
also has pretty good marks,
00:32:28.989 --> 00:32:32.299
and actually just like Cryptocat,
00:32:32.299 --> 00:32:35.860
the App store distribution model is something that
00:32:35.860 --> 00:32:38.820
I think is a valuable one for usability.
00:32:38.820 --> 00:32:41.910
It makes it easy to install.
00:32:41.910 --> 00:32:46.049
And something that TextSecure is also looking at is
00:32:46.049 --> 00:32:52.180
synchronisation options for your address book.
00:32:52.180 --> 00:32:56.980
And I think the most interesting development is
00:32:56.980 --> 00:33:00.490
on the one side, the CyanogenMod integration,
00:33:00.490 --> 00:33:05.029
so that people will have encryption enabled by default.
00:33:05.029 --> 00:33:09.879
’Cause as I mentioned: People don’t change defaults.
00:33:09.879 --> 00:33:14.709
And this one is a bit more controversial, but
00:33:14.709 --> 00:33:18.180
there’s also the WhatsApp partnership.
00:33:18.180 --> 00:33:21.130
And of course people will say “it’s not secure”,
00:33:21.130 --> 00:33:22.960
we know, we know,
00:33:22.960 --> 00:33:24.389
EFF knows!
00:33:24.389 --> 00:33:28.740
But at least, it’s more secure than nothing at all.
00:33:28.740 --> 00:33:31.330
Because: Doesn’t every little bit help?
00:33:31.330 --> 00:33:32.680
Well, I’d say: yes.
00:33:32.680 --> 00:33:35.590
And at least, it’s one stepping stone.
00:33:35.590 --> 00:33:40.110
And, well, all of these are open-source,
00:33:40.110 --> 00:33:41.639
so you can think for yourself:
00:33:41.639 --> 00:33:45.120
How can I improve these?
00:33:45.120 --> 00:33:50.450
Now, there’s still some open questions remaining
00:33:50.450 --> 00:33:52.620
in the usable security field and in the
00:33:52.620 --> 00:33:56.320
wider security field as well.
00:33:56.320 --> 00:33:58.860
I won’t go into all of these,
00:33:58.860 --> 00:34:03.210
I wanna focus on the issues that developers have,
00:34:03.210 --> 00:34:05.730
issues of end user understanding
00:34:05.730 --> 00:34:09.459
and of identity management.
00:34:09.459 --> 00:34:14.059
Because in the development environment
00:34:14.059 --> 00:34:18.179
there’s what some people call the crypto-plumbing problem.
00:34:18.179 --> 00:34:20.820
How do we standardise on a cryptographic algorithm?
00:34:20.820 --> 00:34:25.250
How do we make everyone use the same system?
00:34:25.250 --> 00:34:29.179
Because, again, it’s history repeating itself.
00:34:29.179 --> 00:34:35.668
With PGP, we had RSA, changed for
DSA because of patent issues
00:34:35.668 --> 00:34:39.540
IDEA was changed for CAST5 because of patent issues,
00:34:39.540 --> 00:34:41.729
and now we have something similar:
00:34:41.729 --> 00:34:43.599
’cause for PGP the question is:
00:34:43.599 --> 00:34:45.619
Which curve do we choose?
00:34:45.619 --> 00:34:51.089
’cause this is from Bernstein, who has got a whole list
00:34:51.089 --> 00:34:56.229
of, well, not all the curves,
but a large selection of them
00:34:56.229 --> 00:34:57.920
analysing their security.
00:34:57.920 --> 00:35:01.210
but how do you make, well, pretty much
00:35:01.210 --> 00:35:06.460
the whole world agree on a single standard?
00:35:06.460 --> 00:35:11.110
And also, can we move toward safer languages?
00:35:11.110 --> 00:35:18.270
And I’ve been talking about the
usability of encryption systems
00:35:18.270 --> 00:35:21.770
for users, but what about for developers?
00:35:21.770 --> 00:35:25.770
So, API usability, and as I’ve mentioned:
00:35:25.770 --> 00:35:28.000
Language usability.
00:35:28.000 --> 00:35:31.640
And on top of that, it is not just a technical issue,
00:35:31.640 --> 00:35:34.959
because, of course, we secure microchips,
00:35:34.959 --> 00:35:41.560
but we also wanna secure social systems.
00:35:41.560 --> 00:35:45.000
Because, in principle, we live in an open system,
00:35:45.000 --> 00:35:51.230
in an open society and a system cannot audit itself.
00:35:51.230 --> 00:35:55.709
So, okay, what do we do, right?
I don’t know.
00:35:55.709 --> 00:35:58.489
I mean, that’s why it’s an open question!
00:35:58.489 --> 00:36:00.970
’Cause how do we ensure the authenticity of,
00:36:00.970 --> 00:36:06.300
I don’t know, my Intel processor in my laptop?
00:36:06.300 --> 00:36:07.000
How do I know that the
00:36:07.000 --> 00:36:09.089
random number generator isn’t bogus?
00:36:09.089 --> 00:36:13.589
Well, I know it is, but …
Laughter
00:36:13.589 --> 00:36:17.140
Then, there’s the issue of identity management
00:36:17.140 --> 00:36:19.309
related to key management, like
00:36:19.309 --> 00:36:25.329
who has the keys to the kingdom?
00:36:25.329 --> 00:36:27.160
One approach, as I’ve already mentioned, is
00:36:27.160 --> 00:36:28.760
key continuity management.
00:36:28.760 --> 00:36:32.400
Whereby we automate both key exchange and
00:36:32.400 --> 00:36:35.520
encryption.
00:36:35.520 --> 00:36:38.289
So one principle is trust on first use,
00:36:38.289 --> 00:36:43.569
whereby, well, one approach would be to attach your key
00:36:43.569 --> 00:36:46.010
to any email you send out and anyone who receives
00:36:46.010 --> 00:36:50.020
this email just assumes it’s the proper key.
00:36:50.020 --> 00:36:52.819
Of course, it’s not fully secure,
00:36:52.819 --> 00:36:56.319
but at least, it’s something.
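
NOTE
A minimal sketch of trust-on-first-use, in the spirit of SSH's
known_hosts; the pinning store and function names here are
illustrative assumptions, not any particular mail client's API.
    # Python: trust-on-first-use (TOFU) pinning of sender keys
    known_keys = {}  # email address -> key fingerprint
    def check_sender(address: str, fingerprint: str) -> str:
        if address not in known_keys:
            known_keys[address] = fingerprint  # first contact: pin it
            return "new sender, key pinned"
        if known_keys[address] == fingerprint:
            return "key matches the pinned one"
        # A changed key is the one event TOFU can detect:
        # maybe a reinstall, maybe a man-in-the-middle.
        return "WARNING: key changed since first use!"
    print(check_sender("glenn@example.org", "A1B2C3"))  # pins
    print(check_sender("glenn@example.org", "A1B2C3"))  # matches
    print(check_sender("glenn@example.org", "FFFFFF"))  # warns
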
00:36:56.319 --> 00:36:59.109
And this is really, I think, the major question
00:36:59.109 --> 00:37:00.809
in interoperability:
00:37:00.809 --> 00:37:05.210
How do you ensure that you can access your email
00:37:05.210 --> 00:37:08.690
from multiple devices?
00:37:08.690 --> 00:37:10.880
Now, of course, there is meta-data leakage,
00:37:10.880 --> 00:37:14.230
PGP doesn’t protect meta-data,
00:37:14.230 --> 00:37:16.890
and, you know, your friendly security agency knows
00:37:16.890 --> 00:37:18.319
where you went last summer …
00:37:18.319 --> 00:37:19.300
So, what do we do?
00:37:19.300 --> 00:37:23.649
We do anonymous routing, we send over Tor, but
00:37:23.649 --> 00:37:26.150
I mean, how do we roll that out?
00:37:26.150 --> 00:37:27.500
I think the approach that
00:37:27.500 --> 00:37:30.240
Mailpile is trying to do is interesting
00:37:30.240 --> 00:37:33.000
and, of course still an open question, but
00:37:33.000 --> 00:37:36.800
interesting research nonetheless.
00:37:36.800 --> 00:37:39.049
Then there’s the introduction problem of
00:37:39.049 --> 00:37:43.730
okay, how, I meet someone here, after the talk,
00:37:43.730 --> 00:37:45.990
they tell me who they are,
00:37:45.990 --> 00:37:49.829
but either I get their card—which is nice—or
00:37:49.829 --> 00:37:52.260
they say what their name is.
00:37:52.260 --> 00:37:55.869
But they’re not gonna tell me, they’re not gonna spell out
00:37:55.869 --> 00:37:57.820
their fingerprint.
00:37:57.820 --> 00:38:02.400
So the idea of Zooko’s triangle is that identifiers
00:38:02.400 --> 00:38:07.630
are either human-meaningful,
secure or decentralised.
00:38:07.630 --> 00:38:09.270
Pick two.
00:38:09.270 --> 00:38:13.140
So here’s some examples of identifiers,
00:38:13.140 --> 00:38:16.270
so for Bitcoin: Lots of random garbage.
00:38:16.270 --> 00:38:19.009
For OpenPGP: Lots of random garbage
00:38:19.009 --> 00:38:22.330
For miniLock: Lots of random garbage
00:38:22.330 --> 00:38:26.390
So, I think an interesting research problem is:
00:38:26.390 --> 00:38:29.719
Can we actually make these things memorable?
00:38:29.719 --> 00:38:32.200
You know, I can memorise email addresses,
00:38:32.200 --> 00:38:34.359
I can memorise phone numbers,
00:38:34.359 --> 00:38:39.839
I cannot memorise these. I can try, but …
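
NOTE
A minimal sketch of one direction, roughly in the spirit of the PGP
word list: render fingerprint bytes as words. The eight-word list here
is a toy stand-in; a real scheme needs a full 256-entry list per byte
so the mapping stays reversible.
    # Python: render fingerprint bytes as (toy) words
    WORDS = ["apple", "brick", "cloud", "delta",
             "ember", "frost", "grape", "harbor"]
    def words_for(fingerprint_hex: str) -> str:
        data = bytes.fromhex(fingerprint_hex)
        # b % 8 discards information; real lists index the whole byte
        return " ".join(WORDS[b % len(WORDS)] for b in data)
    print(words_for("0c24a1f3"))  # "ember ember brick delta"
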
00:38:39.839 --> 00:38:45.390
Then, the last open question I wanna focus on
00:38:45.390 --> 00:38:48.780
is that of end-user understanding.
00:38:48.780 --> 00:38:53.599
So of course, we all know that all devices are monitored.
00:38:53.599 --> 00:39:00.420
But does the average user?
00:39:00.420 --> 00:39:04.750
Do they know what worms can do?
00:39:04.750 --> 00:39:09.280
Have they read these books?
00:39:09.280 --> 00:39:15.089
Do they know where GCHQ is?
00:39:15.089 --> 00:39:20.970
Do they know that Cupertino has
pretty much the same powers?
00:39:20.970 --> 00:39:23.880
Laughter
00:39:23.880 --> 00:39:28.980
Do they know they’re living in a panopticon to come?
00:39:28.980 --> 00:39:32.160
Laughter
00:39:32.160 --> 00:39:37.800
Do they know that people are
killed based on meta-data?
00:39:37.800 --> 00:39:40.829
Well, I think not.
00:39:40.829 --> 00:39:45.550
And actually this is a poster from the university
00:39:45.550 --> 00:39:47.069
where I did my Master’s
00:39:47.069 --> 00:39:50.940
and interestingly enough, it was founded by a guy
00:39:50.940 --> 00:39:56.279
who made a fortune selling sugar pills.
00:39:56.279 --> 00:40:02.649
You know, snake oil, we also have this in crypto.
00:40:02.649 --> 00:40:06.079
And how is the user to know
00:40:06.079 --> 00:40:08.130
whether something is secure or not?
00:40:08.130 --> 00:40:10.609
Of course, we have the secure messaging scorecard
00:40:10.609 --> 00:40:15.210
but can users find these?
00:40:15.210 --> 00:40:21.190
Well, I think, there’s three aspects
to end-user understanding
00:40:21.190 --> 00:40:24.250
which is knowledge acquisition,
knowledge transfer,
00:40:24.250 --> 00:40:27.220
and the verification and updating of this knowledge.
00:40:27.220 --> 00:40:30.950
So, as I’ve already mentioned,
we can do dummy-proofing
00:40:30.950 --> 00:40:38.110
and we can create transparent systems.
00:40:38.110 --> 00:40:41.160
For knowledge transfer, we can
00:40:41.160 --> 00:40:44.400
look at appropriate metaphors and design languages.
00:40:44.400 --> 00:40:46.829
And for verification we can
00:40:46.829 --> 00:40:50.590
try an approach chosen from advertising.
00:40:50.590 --> 00:40:56.500
And, last but not least, we can do user-testing.
00:40:56.500 --> 00:41:02.770
Because all these open questions that I’ve described
00:41:02.770 --> 00:41:05.549
and all this research that has been done,
00:41:05.549 --> 00:41:11.089
I think it’s missing one key issue, which is that
00:41:11.089 --> 00:41:13.640
the usability people and the security people
00:41:13.640 --> 00:41:17.480
tend not to really talk to one another.
00:41:17.480 --> 00:41:21.440
The open-source developers and the users:
00:41:21.440 --> 00:41:23.490
Are they talking enough?
00:41:23.490 --> 00:41:26.760
I think that’s something, if we want a new dawn,
00:41:26.760 --> 00:41:30.970
that’s something that I think we should approach.
00:41:30.970 --> 00:41:35.110
Yeah, so, from my side, that’s it.
00:41:35.110 --> 00:41:37.490
I’m open for any questions.
00:41:37.490 --> 00:41:49.320
Applause
00:41:49.320 --> 00:41:52.270
Herald: Arne, thank you very much for your brilliant talk
00:41:52.270 --> 00:41:55.030
Now, if you have any questions to ask, would you please
00:41:55.030 --> 00:41:57.920
line up at the microphones in the aisles?!
00:41:57.920 --> 00:42:00.470
The others who’d like to leave now,
00:42:00.470 --> 00:42:04.240
I’d ask you kindly to please leave very quietly
00:42:04.240 --> 00:42:09.270
so we can hear what the people
asking questions will tell us.
00:42:09.270 --> 00:42:14.279
And those at the microphones,
if you could talk slowly,
00:42:14.279 --> 00:42:19.099
then those translating have no problems in translating
00:42:19.099 --> 00:42:21.490
what is being asked. Thank you very much.
00:42:21.490 --> 00:42:27.460
And I think we’ll start with mic #4 on the left-hand side.
00:42:27.460 --> 00:42:32.000
Mic#4: Yes, so, if you’ve been to any successful
00:42:32.000 --> 00:42:36.500
crypto party, you know that crypto parties very quickly
00:42:36.500 --> 00:42:41.430
turn not into discussions about software,
how to use software,
00:42:41.430 --> 00:42:43.780
but into threat model discussions.
00:42:43.780 --> 00:42:46.930
And to actually get users
to think about what they’re
00:42:46.930 --> 00:42:49.420
trying to protect themselves from. And if a certain
00:42:49.420 --> 00:42:52.710
messaging app is secure, that still means nothing.
00:42:52.710 --> 00:42:55.810
’Cause there is lots of other stuff that’s going on.
00:42:55.810 --> 00:42:57.240
Can you talk a little bit about that and
00:42:57.240 --> 00:43:00.130
how that runs into this model about, you know,
00:43:00.130 --> 00:43:02.260
how we need to educate users and, while we’re at it,
00:43:02.260 --> 00:43:03.640
what we want to educate them about.
00:43:03.640 --> 00:43:05.930
And what they actually need to be using.
00:43:05.930 --> 00:43:09.640
Arne: Well, I think that’s an interesting point
00:43:09.640 --> 00:43:14.210
and I think, one issue, one big issue is:
00:43:14.210 --> 00:43:17.180
okay, we can throw lots of crypto parties
00:43:17.180 --> 00:43:20.809
but we’re never gonna be able to throw enough parties.
00:43:20.809 --> 00:43:22.970
I … with one party, if you’re very lucky,
00:43:22.970 --> 00:43:24.609
you’re gonna educate 100 people.
00:43:24.609 --> 00:43:28.950
I mean, just imagine how many parties
you’d need to throw. Right?
00:43:28.950 --> 00:43:32.980
I mean, it’s gonna be a heck of party, but … yeah.
00:43:32.980 --> 00:43:38.730
And I think, secondly, the question of threat modeling,
00:43:38.730 --> 00:43:43.000
I think, sure, that’s helpful to do, but
00:43:43.000 --> 00:43:47.760
I think, users do first need an understanding of,
00:43:47.760 --> 00:43:49.290
for example, the email architecture.
00:43:49.290 --> 00:43:51.520
’Cause, how can they do threat
modeling when they think
00:43:51.520 --> 00:43:55.260
that an email magically pops
from one computer to the next?
00:43:55.260 --> 00:43:59.250
I think, that is pretty much impossible.
00:43:59.250 --> 00:44:01.250
I hope that …
00:44:01.250 --> 00:44:04.890
Herald: Thank you very much, so …
Microphone #3, please.
00:44:04.890 --> 00:44:07.439
Mic#3: Arne, thank you very much for your talk.
00:44:07.439 --> 00:44:10.430
There’s one aspect that I didn’t see in your slides.
00:44:10.430 --> 00:44:13.049
And that is the aspect of the language that we use
00:44:13.049 --> 00:44:16.940
to describe concepts in PGP—and GPG, for that matter.
00:44:16.940 --> 00:44:19.510
And I know that there was a paper last year
00:44:19.510 --> 00:44:21.890
about why King George can’t encrypt and
00:44:21.890 --> 00:44:23.960
they were trying to propose a new language.
00:44:23.960 --> 00:44:26.109
Do you think that such initiatives are worthwhile
00:44:26.109 --> 00:44:28.650
or are we stuck with this language and should we make
00:44:28.650 --> 00:44:31.720
as good use of it as we can?
00:44:31.720 --> 00:44:37.849
Arne: I think that’s a good point
and actually the question
00:44:37.849 --> 00:44:44.649
of “okay, what metaphors do you wanna use?” … I think
00:44:44.649 --> 00:44:46.799
we’re pretty much stuck with the language
00:44:46.799 --> 00:44:49.710
that we’re using for the moment but
00:44:49.710 --> 00:44:54.130
I think it does make sense
to go and look into the future
00:44:54.130 --> 00:44:58.289
at alternative models.
00:44:58.289 --> 00:45:00.990
Yeah, so I actually wrote a paper that also
00:45:00.990 --> 00:45:04.970
goes into that a bit, looking at
00:45:04.970 --> 00:45:08.630
the metaphor of handshakes to exchange keys.
00:45:08.630 --> 00:45:09.790
So, for example, you could have
00:45:09.790 --> 00:45:15.520
an embedded device as a ring or wristband,
00:45:15.520 --> 00:45:19.000
it could even be a smartwatch, for that matter.
00:45:19.000 --> 00:45:21.569
Could you use that shaking of hands to
00:45:21.569 --> 00:45:24.470
build trust-relationships?
00:45:24.470 --> 00:45:29.740
And that might be a better
metaphor than key-signing,
00:45:29.740 --> 00:45:31.469
webs of trust, etc.
00:45:31.469 --> 00:45:34.559
’Cause I think, that is horribly broken
00:45:34.559 --> 00:45:39.990
I mean the concept, trying
to explain that to users.
00:45:39.990 --> 00:45:43.430
Herald: Thank you. And … at the back in the middle.
00:45:43.430 --> 00:45:44.980
Signal angel: Thanks. A question from the internet:
00:45:44.980 --> 00:45:47.000
[username?] from the Internet wants to know if you’re
00:45:47.000 --> 00:45:51.839
aware of the PEP project, the “pretty easy privacy”
00:45:51.839 --> 00:45:53.059
and your opinions on that.
00:45:53.059 --> 00:45:54.710
And another question is:
00:45:54.710 --> 00:46:01.520
How important is the trust level of the crypto to you?
00:46:01.520 --> 00:46:04.420
Arne: Well, yes, actually, there’s this screenshot
00:46:04.420 --> 00:46:09.729
of the PEP project in the slides
00:46:09.729 --> 00:46:15.149
… in the “why WhatsApp is horribly insecure” slide, and
00:46:15.149 --> 00:46:18.720
of course, I agree, and yeah.
00:46:18.720 --> 00:46:21.680
I’ve looked into the PEP project for a bit
00:46:21.680 --> 00:46:24.549
and I think, yeah, I think it’s an interesting
00:46:24.549 --> 00:46:28.480
approach but I still have to read up on it a bit more.
00:46:28.480 --> 00:46:31.369
Then, for the second question,
00:46:31.369 --> 00:46:38.039
“how important is the trust in the crypto?”:
00:46:38.039 --> 00:46:41.749
I think that’s an important one.
00:46:41.749 --> 00:46:43.220
Especially the question of
00:46:43.220 --> 00:46:52.780
“how do we build social systems
to ensure reliable cryptography?”
00:46:52.780 --> 00:46:56.830
So one example is the Advanced
Encryption Standard competition.
00:46:56.830 --> 00:46:59.559
Everyone was free to send in entries,
00:46:59.559 --> 00:47:01.990
their design principles were open,
00:47:01.990 --> 00:47:06.219
and this is in complete contrast
to the Data Encryption Standard
00:47:06.219 --> 00:47:11.920
for which, I think, the design principles are still Top Secret.
00:47:11.920 --> 00:47:16.290
So yeah, I think, the crypto is
something we need to build on
00:47:16.290 --> 00:47:22.059
but, well, actually, the crypto is
again built on other systems
00:47:22.059 --> 00:47:28.040
where the trust in these systems
is even more important.
00:47:28.040 --> 00:47:30.720
Herald: Okay, thank you, microphone #2, please.
00:47:30.720 --> 00:47:34.270
Mic#2: Yes, Arne, thank you very
much for your excellent talk.
00:47:34.270 --> 00:47:37.710
I wonder about what to do with feedback
00:47:37.710 --> 00:47:40.899
on usability in open-source software.
00:47:40.899 --> 00:47:42.329
So, you publish something on GitHub
00:47:42.329 --> 00:47:44.049
and you’re with a group of people
00:47:44.049 --> 00:47:45.450
who don’t know each other and
00:47:45.450 --> 00:47:48.089
one publishes something,
the other publishes something,
00:47:48.089 --> 00:47:51.349
and then: How do we know
this software is usable?
00:47:51.349 --> 00:47:53.660
In commercial software, there’s all kind of hooks
00:47:53.660 --> 00:47:55.780
on the website, on the app,
00:47:55.780 --> 00:47:59.059
to send feedback to the commercial vendor.
00:47:59.059 --> 00:48:02.270
But in open-source software,
how do you gather this information?
00:48:02.270 --> 00:48:04.630
How do you use it, is there any way to do this
00:48:04.630 --> 00:48:05.890
in an anonymised way?
00:48:05.890 --> 00:48:08.589
I haven’t seen anything related to this.
00:48:08.589 --> 00:48:11.480
Is this one of the reasons why
open-source software is maybe
00:48:11.480 --> 00:48:15.249
less usable than commercial software?
00:48:15.249 --> 00:48:19.889
Arne: It might be. It might be.
00:48:19.889 --> 00:48:22.599
But regarding your question, like, how do you know
00:48:22.599 --> 00:48:29.559
whether commercial software is usable, well,
00:48:29.559 --> 00:48:32.279
you … one way is looking at:
00:48:32.279 --> 00:48:34.840
Okay, what kind of statistics do you get back?
00:48:34.840 --> 00:48:37.720
But if you push out something totally unusable
00:48:37.720 --> 00:48:39.920
and then, I mean, you’re going to expect
00:48:39.920 --> 00:48:44.599
that the statistics come back looking like shit.
00:48:44.599 --> 00:48:49.829
So, the best approach is to
design usability in from the start.
00:48:49.829 --> 00:48:51.230
The same with security.
00:48:51.230 --> 00:48:54.950
And I think, that is also …
so even if you have …
00:48:54.950 --> 00:48:58.670
you want privacy for end users, I think it’s still possible
00:48:58.670 --> 00:49:01.530
to get people into the lab and look at:
00:49:01.530 --> 00:49:03.270
Okay, how are they using the system?
00:49:03.270 --> 00:49:05.760
What things can we improve?
00:49:05.760 --> 00:49:08.289
And what things are working well?
00:49:08.289 --> 00:49:10.740
Mic#2: So you’re saying, you should only publish
00:49:10.740 --> 00:49:19.010
open-source software for users
if you also tested in a lab?
00:49:19.010 --> 00:49:22.599
Arne: Well, I think, this is a bit of a discussion of:
00:49:22.599 --> 00:49:25.740
Do we just allow people to build
houses however they want to
00:49:25.740 --> 00:49:28.410
or do we have building codes?
00:49:28.410 --> 00:49:32.130
And … I think … well, actually, this proposal of holding
00:49:32.130 --> 00:49:35.730
software developers responsible for what they produce,
00:49:35.730 --> 00:49:40.299
if it’s commercial software, I mean,
that proposal has been
00:49:40.299 --> 00:49:41.970
made a long time ago.
00:49:41.970 --> 00:49:43.130
And the question is:
00:49:43.130 --> 00:49:47.950
How would that work in an
open-source software community?
00:49:47.950 --> 00:49:50.460
Well, actually, I don’t have an answer to that.
00:49:50.460 --> 00:49:52.660
But I think, it’s an interesting question.
00:49:52.660 --> 00:49:54.490
Mic#2: Thank you.
00:49:54.490 --> 00:49:57.990
Herald: Thank you very much. #1, please.
00:49:57.990 --> 00:50:01.130
Mic#1: You said that every little bit helps,
00:50:01.130 --> 00:50:04.039
so if we have systems that
don’t provide a lot of … well …
00:50:04.039 --> 00:50:06.680
are almost insecure, they
provide just a bit of security, then
00:50:06.680 --> 00:50:09.869
that is still better than no security.
00:50:09.869 --> 00:50:12.970
My question is: Isn’t that actually worse because
00:50:12.970 --> 00:50:15.150
this promotes a false sense of security and
00:50:15.150 --> 00:50:19.920
that makes people just use
the insecure broken systems
00:50:19.920 --> 00:50:23.559
just to say “we have some security with us”?
00:50:23.559 --> 00:50:26.210
Arne: I completely agree but
00:50:26.210 --> 00:50:29.339
I think that currently people … I mean …
00:50:29.339 --> 00:50:30.920
when you think an email goes
00:50:30.920 --> 00:50:33.680
from one system to the other directly
00:50:33.680 --> 00:50:40.920
and I mean … from these studies
that I’ve done, I’ve met
00:50:40.920 --> 00:50:46.060
quite a few people who still think email is secure.
00:50:46.060 --> 00:50:49.589
So, of course, you might give
them a false sense of security
00:50:49.589 --> 00:50:52.640
when you give them a
more secure program but
00:50:52.640 --> 00:50:54.480
at least it’s more secure than email—right?
00:50:54.480 --> 00:50:56.070
I mean …
00:50:56.070 --> 00:50:57.339
Mic#1: Thank you.
00:50:57.339 --> 00:50:59.520
Herald: Thank you. There’s another
question on the Internet.
00:50:59.520 --> 00:51:02.559
Signal angel: Yes, thank you. Question from the Internet:
00:51:02.559 --> 00:51:06.199
What crypto would you finally
recommend your grandma use?
00:51:06.199 --> 00:51:10.260
Arne: laughs
00:51:10.260 --> 00:51:15.500
Well … Unfortunately, my grandma
has already passed away.
00:51:15.500 --> 00:51:19.520
I mean … her secrets will be safe …
00:51:27.420 --> 00:51:32.030
Actually, I think something like where
00:51:32.030 --> 00:51:37.349
crypto is enabled by default, say …
iMessage, I mean
00:51:37.349 --> 00:51:42.059
of course, there are backdoors,
etc., but at least
00:51:42.059 --> 00:51:45.339
it is more secure than plain SMS.
00:51:45.339 --> 00:51:53.249
So I would advise my grandma
to use … well, to look at …
00:51:53.249 --> 00:51:56.289
actually I’d first analyse
what she has available
00:51:56.289 --> 00:51:58.880
and then I would look at: okay,
which is the most secure
00:51:58.880 --> 00:52:03.680
and still usable?
00:52:03.680 --> 00:52:07.339
Herald: Thank you very much, so mic #3, please.
00:52:07.339 --> 00:52:10.880
Mic#3: So, just wondering:
00:52:10.880 --> 00:52:14.950
You said that there is
a problem with the missing
00:52:14.950 --> 00:52:20.329
default installation of GPG
on operating systems but
00:52:20.329 --> 00:52:24.890
I think this is more of a
problem of which OS you choose
00:52:24.890 --> 00:52:28.220
because at least I don’t
know of any Linux system which
00:52:28.220 --> 00:52:33.599
doesn’t have GPG installed today by default.
00:52:33.599 --> 00:52:39.539
If you use … at least I’ve used
the normal workstation setup.
00:52:39.539 --> 00:52:42.550
Arne: Yes, I think you already
answered your own question:
00:52:42.550 --> 00:52:47.230
Linux.
Laughter
00:52:47.230 --> 00:52:50.690
Unfortunately, Linux is not yet the default for most people.
00:52:50.690 --> 00:52:53.270
I mean, I’d love it to be, but … yeah.
00:52:53.270 --> 00:52:57.730
Mic#3: But if I send an email to Microsoft and say:
00:52:57.730 --> 00:53:02.539
Well, install GPG by default, they’re not gonna
00:53:02.539 --> 00:53:04.150
listen to me.
00:53:04.150 --> 00:53:07.530
And I think, for all of us, we should do
00:53:07.530 --> 00:53:08.740
a lot more of that.
00:53:08.740 --> 00:53:13.780
Even if Microsoft is the devil for most of us.
00:53:13.780 --> 00:53:15.609
Thank you.
00:53:15.609 --> 00:53:19.599
Arne: Well … We should be doing more of what?
00:53:19.599 --> 00:53:26.430
Mic#3: Making more demands
to integrate GPG by default
00:53:26.430 --> 00:53:29.210
in Microsoft products, for example.
00:53:29.210 --> 00:53:31.059
Arne: Yes, I completely agree.
00:53:31.059 --> 00:53:33.869
Well, what you already see happening …
00:53:33.869 --> 00:53:36.140
or I mean, it’s not very high-profile yet,
00:53:36.140 --> 00:53:39.020
but for example, I mean … I’ve referred to
00:53:39.020 --> 00:53:42.700
the EFF scorecard a couple of times but
00:53:42.700 --> 00:53:49.750
that is some pressure to encourage developers
00:53:49.750 --> 00:53:53.010
to include security by default.
00:53:53.010 --> 00:53:56.940
But, I think I’ve also mentioned, one of the big problems
00:53:56.940 --> 00:54:01.049
is: users at the moment … I mean …
00:54:01.049 --> 00:54:04.079
developers might say: my system is secure.
00:54:04.079 --> 00:54:06.549
I mean … what does that mean?
00:54:06.549 --> 00:54:09.510
Do we hold developers and
commercial entities …
00:54:09.510 --> 00:54:12.339
do we hold them to, well,
00:54:12.339 --> 00:54:14.039
truthful advertising standards or not?
00:54:14.039 --> 00:54:17.200
I mean, I would say: let’s look at
00:54:17.200 --> 00:54:21.289
what companies are claiming and
00:54:21.289 --> 00:54:22.849
do they actually live up to that?
00:54:22.849 --> 00:54:26.079
And if not: Can we actually sue them?
00:54:26.079 --> 00:54:27.720
Can we make them tell the truth about
00:54:27.720 --> 00:54:30.759
what is happening and what is not?
00:54:30.759 --> 00:54:32.960
Herald: So, we’ve got about 2 more minutes left …
00:54:32.960 --> 00:54:37.049
So it’s a maximum of two more questions, #2, please.
00:54:37.049 --> 00:54:43.440
Mic#2: Yeah, so … Every security system fails.
00:54:43.440 --> 00:54:50.010
So I’m interested in what sort of work has been done on
00:54:50.010 --> 00:54:56.999
how users recover from failure?
00:54:56.999 --> 00:55:00.660
Everything will get subverted,
00:55:00.660 --> 00:55:04.190
your best friend will sneak
your key off your computer,
00:55:04.190 --> 00:55:06.099
something will go wrong with that, you know …
00:55:06.099 --> 00:55:09.510
your kids will grab it …
00:55:09.510 --> 00:55:13.450
and just, in general, has somebody looked at
00:55:13.450 --> 00:55:17.000
these sorts of issues?
00:55:17.000 --> 00:55:18.559
Is there research on it?
00:55:18.559 --> 00:55:21.930
Arne: On various aspects of the problem, but
00:55:21.930 --> 00:55:25.640
as far as I’m aware not for the general issue
00:55:25.640 --> 00:55:30.170
and not any field studies specifically looking at
00:55:30.170 --> 00:55:34.269
“Okay, what happens when a key is compromised, etc.”
00:55:34.269 --> 00:55:37.520
I mean, we do have certain cases of things happening
00:55:37.520 --> 00:55:41.789
but nothing structured.
00:55:41.789 --> 00:55:44.720
No structured studies, as far as I’m aware.
00:55:44.720 --> 00:55:46.810
Herald: Thank you. #3?
00:55:46.810 --> 00:55:51.539
Mic#3: Yeah, you mentioned
Mailpile as a stepping stone
00:55:51.539 --> 00:55:56.380
for people to start using GnuPG and stuff, but
00:55:56.380 --> 00:56:04.820
you also talked about an
average user seeing mail as just
00:56:04.820 --> 00:56:08.789
coming from one place and
then ending up in another place.
00:56:08.789 --> 00:56:12.250
Shouldn’t we actually talk about
00:56:12.250 --> 00:56:17.880
how to make encryption transparent to users?
00:56:17.880 --> 00:56:21.430
Why should they actually care about these things?
00:56:21.430 --> 00:56:24.980
Shouldn’t it be embedded in the protocols?
00:56:24.980 --> 00:56:28.869
Shouldn’t we actually talk about
embedding them in the protocols,
00:56:28.869 --> 00:56:31.510
stop using insecure protocols
00:56:31.510 --> 00:56:36.109
and having all of these,
you talked a little bit about it,
00:56:36.109 --> 00:56:38.720
about putting it in the defaults.
00:56:38.720 --> 00:56:42.549
But shouldn’t we emphasise that a lot more?
00:56:42.549 --> 00:56:46.730
Arne: Yeah, I think we should
certainly be working towards
00:56:46.730 --> 00:56:50.200
“How do we get security by default?”
00:56:50.200 --> 00:56:54.270
But I think … I’ve mentioned briefly that
00:56:54.270 --> 00:56:57.519
making things transparent also has a danger.
00:56:57.519 --> 00:57:01.000
I mean, this whole, it’s a bit like …
00:57:01.000 --> 00:57:03.380
saying a system should be transparent is a bit like
00:57:03.380 --> 00:57:05.880
marketing speak, because actually
00:57:05.880 --> 00:57:09.140
we don’t want systems to be completely transparent,
00:57:09.140 --> 00:57:13.430
’cause we also wanna be able
to engage with the systems.
00:57:13.430 --> 00:57:16.410
Are the systems working as they should be?
00:57:16.410 --> 00:57:20.380
So, I mean, this is a difficult balance to find, but yeah …
00:57:20.380 --> 00:57:24.730
That’s something you achieve through usability studies,
00:57:24.730 --> 00:57:29.069
security analysis, etc.
00:57:29.069 --> 00:57:31.450
Herald: All right, Arne,
thank you very much for giving
00:57:31.450 --> 00:57:33.640
us your very inspiring talk,
00:57:33.640 --> 00:57:36.010
thank you for sharing your information with us.
00:57:36.010 --> 00:57:38.479
Please give him a round of applause.
00:57:38.479 --> 00:57:41.319
Thank you very much.
Applause
00:57:41.319 --> 00:57:52.000
subtitles created by c3subtitles.de
Join, and help us!