Arne Padmos: Why is GPG "damn near unusable"?
-
0:11 - 0:18Applause
-
0:18 - 0:20So, good morning everyone
-
0:20 - 0:24my name is Arne and today
-
0:24 - 0:26I'll be hoping to entertain you
-
0:26 - 0:32a bit with some GPG usability issues.
-
0:32 - 0:34Thanks for being here this early in the morning.
-
0:34 - 0:37I know some of you have had a short night
-
0:37 - 0:43In short for the impatient ones:
-
0:43 - 0:47Why is GnuPG damn near unusable?
-
0:47 - 0:52Well, actually, I don’t know
-
0:52 - 0:53Laughter
-
0:53 - 0:58So more research is needed … as always.
-
0:58 - 1:00Because it's not like using a thermometer.
-
1:00 - 1:04We're doing something between social science and security
-
1:04 - 1:11But I will present some interesting perspectives
-
1:12 - 1:17or at least what I hope you'll find interesting perspectives.
-
1:17 - 1:20This talk is about some possible explanations
-
1:20 - 1:25that usable security research can offer to the question
-
1:25 - 1:27Now some context, something about myself,
-
1:27 - 1:34so you have a bit of an idea where I'm coming from
-
1:34 - 1:39and what coloured glasses I have on.
-
1:39 - 1:44So pretty much my background is in mathematics,
-
1:44 - 1:48computer science, and—strangely enough—international relations
-
1:48 - 1:52My professional background is that I've been doing
-
1:52 - 1:57embedded system security evaluations and training
-
1:57 - 2:03and I've also been a PhD student, studying the usability of security.
-
2:03 - 2:08Currently, I teach the new generation,
-
2:08 - 2:15hoping to bring some new blood into the security world.
-
2:15 - 2:18I want to do some expectation setting
-
2:18 - 2:21I want to say, what this talk is not about.
-
2:21 - 2:24I will also give some helpful pointers for
-
2:24 - 2:30those of you that are interested in these other areas.
-
2:30 - 2:34I will not go into too much detail about the issue of truth
-
2:34 - 2:38in security science.
-
2:38 - 2:40Here are some links to some interesting papers that cover this
-
2:40 - 2:43in a lot of detail.
-
2:43 - 2:46Neither will I be giving a security primer.
-
2:46 - 2:50There are some nice links to books on the slide.
-
2:50 - 2:55I'll also not be giving a cryptography primer or a history lesson.
-
2:55 - 2:58Neither will I be giving an introduction to PGP
-
2:58 - 3:03And, interestingly enough, even though the talk is titled
-
3:03 - 3:07“why is GPG damn near unusable”, I will
-
3:07 - 3:10not really be doing much PGP bashing
-
3:10 - 3:15I think it's quite, actually, a wonderful effort and other people
-
3:15 - 3:21have pretty much done the PGP/GnuPG bashing for me.
-
3:21 - 3:26And, as I've already mentioned, I will not be giving any definite answers
-
3:26 - 3:29and a lot of “it depends.”
-
3:29 - 3:34But then you might ask “well, it depends. What does it depend on?”
-
3:34 - 3:37Well, for one: What users you’re looking at
-
3:37 - 3:40which goals they have in mind and
-
3:40 - 3:44in what context, what environment they’re doing these things.
-
3:44 - 3:48So, instead I want to kindle your inspiration
-
3:48 - 3:54I want to offer you a new view on the security environment
-
3:54 - 4:00and I'll also give you some concrete exercises that you can try out
-
4:00 - 4:02at home or at the office.
-
4:02 - 4:08Some “do’s” and “don’t’s” and pointers for further exploration:
-
4:08 - 4:10This is a short overview of the talk
-
4:10 - 4:16I'll start with the background story to why I’m giving this talk
-
4:16 - 4:21then an overview over usable security research area,
-
4:21 - 4:25some principles and methods for usability,
-
4:25 - 4:29some case studies, then some open questions remain
-
4:29 - 4:36So, the story. Well. It all started with this book.
-
4:37 - 4:41When I was reading about the Snowden revelations,
-
4:41 - 4:47I read, how Snowden tried to contact Glenn Greenwald.
-
4:47 - 4:53On December 1st, he sent an email to Glenn, writing:
-
4:53 - 5:00“If you don’t use PGP, some people will never be able to contact you.”
-
5:00 - 5:05“Please install this helpful tool and if you need any help,
-
5:05 - 5:08please request so.”
-
5:08 - 5:12Three days later, Glenn Greenwald says: “Sorry, I don’t
-
5:12 - 5:17know how to do that, but I’ll look into it.”
-
5:17 - 5:22and Snowden writes back: “Okay, well, sure. And again:
-
5:22 - 5:24If you need any help, I can facilitate contact.”
-
5:24 - 5:29Now, a mere seven weeks later,
-
5:29 - 5:30Laughter
-
5:30 - 5:37Glenn is like “okay, well, I’ll do it within the next days or so.”
-
5:37 - 5:38Okay, sure …
-
5:38 - 5:43Snowden’s like “my sincerest thanks”.
-
5:43 - 5:46But actually in the meantime, Snowden was growing a bit impatient
-
5:46 - 5:51’cause, okay, “why are you not encrypting?”
-
5:51 - 5:55So he sent an email to Micah Lee, saying, “okay, well, hello,
-
5:55 - 6:02I’m a friend, can you help me get in contact with Laura Poitras?”
-
6:02 - 6:06In addition to that, he made a ten-minute video for Glenn Greenwald
-
6:06 - 6:09Laughter
-
6:09 - 6:11… describing how to use GPG.
-
6:11 - 6:17And actually I have quite a lot of screenshots of that video
-
6:17 - 6:20and it's quite entertaining.
-
6:20 - 6:24’cause, of course, Snowden was getting increasingly
-
6:24 - 6:28bothered by the whole situation.
-
6:28 - 6:33Now, this is the video that Snowden made
-
6:33 - 6:40“GPG for Journalists. For Windows.”
-
6:41 - 6:47Laughter
-
6:49 - 6:51I’ll just click through it, because I think,
-
6:51 - 6:54the slides speak for themselves.
-
6:54 - 7:01Take notes of all the usability issues you can identify.
-
7:02 - 7:06So just click through the wizard, generate a new key,
-
7:06 - 7:12enable “expert settings”, ’cause we want 3000-bit keys
-
7:12 - 7:16We want a very long password, etc.
-
7:16 - 7:19And now, of course, we also wanna go and find keys
-
7:19 - 7:22on the keyserver.
-
7:22 - 7:25And we need to make sure that we shouldn’t
-
7:25 - 7:29write our draft messages in GMail
-
7:29 - 7:32or Thunderbird and Enigmail, for that matter.
-
7:32 - 7:37Although that issue has been solved.
-
7:40 - 7:42So, I think you can start seeing
-
7:42 - 7:45why Glenn Greenwald
—even if he did open this video— -
7:45 - 7:53was like
“okay, well, I’m not gonna bother.” -
7:53 - 7:57And Snowden is so kind to say, after 12 minutes,
-
7:57 - 8:02“if you have any remaining questions, please contact me.”
-
8:02 - 8:06At this year’s HOPE conference,
-
8:06 - 8:12Snowden actually did a call to arms and he said
-
8:12 - 8:17“Okay, we need people to evaluate our security systems.
-
8:17 - 8:24we need people to go and do red team. But in addition to that,
-
8:26 - 8:31we also need to look at the user experience issue.”
-
8:31 - 8:37So this is a transcript of his kind of manifesto
-
8:37 - 8:42and he says: “GPG is really damn near unusable”
-
8:42 - 8:47because, well, of course, you might know command line
-
8:47 - 8:50and then, okay, you might be okay
-
8:50 - 8:53but “Gam Gam at the home”, she is never going to be able
-
8:53 - 8:59to use GnuPG.
-
8:59 - 9:09And he also notes that, okay, we were part of a technical elite
-
9:09 - 9:16and he calls on us to work on the technical literacy of people
-
9:16 - 9:18because what he explicitly warns against is
-
9:18 - 9:23a high priesthood of technology.
-
9:23 - 9:28Okay, that’s a nice call to arms
-
9:28 - 9:33but are we actually up for a new dawn?
-
9:33 - 9:40Well, I wanna go into the background of usable security
-
9:40 - 9:45and I wanna show you that we’ve actually been
-
9:45 - 9:48in a pretty dark time.
-
9:48 - 9:56So, back in 1999, there was this paper:
“Why Johnny can’t encrypt” -
9:56 - 10:01which described mostly the same broken interface
-
10:01 - 10:10so if you remember, if you go back to the video of which
-
10:10 - 10:15I showed some screenshots, then, well, if you look at
-
10:15 - 10:22these screenshots from 1999, well,
is there a lot of difference? -
10:22 - 10:24Not really! Nothing much has changed.
-
10:24 - 10:26There are still the same
-
10:26 - 10:32conceptual barriers,
and same crappy defaults. -
10:32 - 10:36And most astonishingly, in the paper there
-
10:36 - 10:39is a description of a user study where
-
10:39 - 10:45users were given 90 minutes to encrypt an email
-
10:45 - 10:50and most were unable to do so.
-
10:50 - 10:55I think, that pretty much describes “damn near unusable.”
-
10:55 - 11:03A timeline from, well, before 1999 to now
-
11:03 - 11:06of usable security research.
-
11:06 - 11:10So, quite a lot has happened
-
11:10 - 11:15although it is still a growing field.
-
11:15 - 11:21It started—the idea of usable security,
it was explicitly defined first— -
11:21 - 11:30in 1975, but it was only in 1989 that
-
11:30 - 11:33the first usability tests were carried out.
-
11:33 - 11:38And only in 1996 that
-
11:38 - 11:45the concept of “user-centered security”
was described. -
11:45 - 11:49An interesting paper, also from 1999, shows how
-
11:49 - 11:55contrary to the general description of users as lazy
-
11:55 - 12:01and basically as the weakest link in security
-
12:01 - 12:06this paper describes users as pretty rational beings
-
12:06 - 12:10who see security as an overhead and …
where they don't -
12:10 - 12:17understand the usefulness of what they’re doing.
-
12:17 - 12:21The study of PGP 5.0, I’ve talked about that already,
-
12:21 - 12:28and there was also a study of the
Kazaa network in 2002. -
12:28 - 12:30And it was found out that a lot of users were
-
12:30 - 12:35accidentally sharing files from personal pictures,
-
12:35 - 12:42who knows, maybe credit-card details,
you never know, right? -
12:42 - 12:48In 2002, a lot of the knowledge of usable security design
-
12:48 - 12:51was concretised in ten key principles
-
12:51 - 12:54and if you’re interested,
-
12:54 - 13:04I do recommend you to look at the paper.
-
13:04 - 13:10A solution to the PGP problem was proposed in
-
13:10 - 13:122004, well, actually,
-
13:12 - 13:15it was proposed earlier
but it was tested in 2005. -
13:15 - 13:20And it was found that actually
if we automate encryption -
13:20 - 13:24and if we automate key exchange then, well,
-
13:24 - 13:26things are pretty workable, except that
-
13:26 - 13:30users still fall for
phishing attacks, of course. -
13:30 - 13:38But last year, another research
identified that, well, -
13:38 - 13:42making security transparent is all nice and well
-
13:42 - 13:46but it's also dangerous because users
-
13:46 - 13:50are less likely to trust the system and
-
13:50 - 13:54are less likely to really understand
what’s really happening. -
13:54 - 13:59So a paper this year also identified another issue:
-
13:59 - 14:04Users generally have very bad understanding
-
14:04 - 14:06of the email architecture.
-
14:06 - 14:09An email goes from point A to point B.
-
14:09 - 14:16And what happens in-between is unknown.
-
14:16 - 14:23So, before I go on to general usability principles
-
14:23 - 14:28from the founding pillar of the usable security field
-
14:28 - 14:34I wanna give some examples of usability failures.
-
14:34 - 14:37You might be familiar with project VENONA.
-
14:37 - 14:42This was an effort by the US intelligence agencies
-
14:42 - 14:45to try and decrypt soviet communication.
-
14:45 - 14:49And they actually were pretty successful.
-
14:49 - 14:54They uncovered a lot of spying and, well,
-
14:54 - 14:56how did they do this?
-
14:56 - 15:00The soviets were using one-time pads and,
-
15:00 - 15:03well, if you reuse a one-time pad,
-
15:03 - 15:08then you leak a lot of information about plain-text.
-
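The VENONA point can be made concrete with a short sketch (illustrative, not from the talk; the plaintexts are made up): if the same pad encrypts two messages, XORing the ciphertexts cancels the pad entirely.

```python
# Why reusing a one-time pad leaks plaintext:
# c1 = p1 XOR pad, c2 = p2 XOR pad, so c1 XOR c2 == p1 XOR p2.
import os

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

p1 = b"ATTACK AT DAWN"
p2 = b"RETREAT AT SIX"          # same length as p1
pad = os.urandom(len(p1))       # a proper pad: random, same length

c1, c2 = xor(p1, pad), xor(p2, pad)
assert xor(c1, c2) == xor(p1, p2)   # the pad has vanished

# If the attacker guesses one plaintext (a "crib"), the other falls out:
recovered = xor(xor(c1, c2), p1)
assert recovered == p2
```

This is exactly the weakness the US cryptanalysts exploited: with enough reused pads, plaintext-vs-plaintext relationships become readable.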
15:08 - 15:10Well, what we also see happening a lot is
-
15:10 - 15:13low password entropy.
-
15:13 - 15:19We have people choosing password “123456”, etc.
-
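A quick back-of-the-envelope sketch (mine, not from the talk) of why "123456" is so weak: password entropy is roughly the base-2 log of the search space, and that is before accounting for attackers trying common passwords first, which makes the effective entropy of "123456" close to zero.

```python
# Entropy as log2 of the search space (an upper bound for chosen passwords).
from math import log2

digits_6 = log2(10 ** 6)     # six digits, like "123456"
random_12 = log2(62 ** 12)   # twelve random letters+digits

assert round(digits_6) == 20   # ~20 bits: trivially brute-forced
assert round(random_12) == 71  # ~71 bits: far harder to guess
```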
15:19 - 15:25And what I just described, the study looking into
-
15:25 - 15:29the mental models of users,
-
15:29 - 15:32of the email architecture and how it works
-
15:32 - 15:35well, at the top you have
-
15:35 - 15:38still a pretty simplified description of how things work
-
15:38 - 15:41and at the bottom we have an actual drawing
-
15:41 - 15:43of a research participant when asked:
-
15:43 - 15:50“Can you draw how an email
goes from point A to point B?” -
15:50 - 15:54And it’s like:
“Well, it goes from one place to the other.” -
15:54 - 15:59Okay …
-
15:59 - 16:01clicking sounds
-
16:01 - 16:04Okay, so this died.
-
16:13 - 16:19So, these are two screenshots of enigmail
-
16:19 - 16:22Well, if I hadn’t marked them
-
16:22 - 16:27as the plaintext and encrypted email that would be sent
-
16:27 - 16:31you probably wouldn’t have spotted which was which
-
16:31 - 16:35this is a pretty big failure in
-
16:35 - 16:40the visibility of the system.
-
16:40 - 16:45You don’t see anything? Ah.
-
16:45 - 16:48Audience: “That’s the point!”
That's the point, yes! -
16:48 - 16:58laughter
applause -
16:58 - 17:02On the left we have a screenshot of GPG and
-
17:02 - 17:04as I’ve already described,
-
17:04 - 17:09command line people, we like command lines
-
17:09 - 17:11but normal people don’t.
-
17:11 - 17:14And what we also see is a lot of the jargon that is
-
17:14 - 17:17currently being used even in GUI applications
-
17:17 - 17:23so on the right there is PGP 10.0.
-
17:23 - 17:26Now I wanna close these examples with
-
17:26 - 17:29well, you might be wondering: “what is this?”
-
17:29 - 17:33This is actually an example of a security device
-
17:33 - 17:37from, I think it’s around 4000 years ago.
-
17:37 - 17:38Like, people could use this.
-
17:38 - 17:43Why can’t we get it right today?
-
17:43 - 17:46Something that you should,
-
17:46 - 17:49this is a little homework exercise,
-
17:49 - 17:52take a laptop to your grandma, show her PGP,
-
17:52 - 17:55can she use it—yes or no?
-
17:55 - 18:02Probably not, but who knows?
-
18:02 - 18:04Now I wanna go into
-
18:04 - 18:10the usability cornerstones of usable security.
-
18:10 - 18:13I wanna start with heuristics
-
18:13 - 18:16some people call them “rules of thumb,” other people
-
18:16 - 18:19call them “the ten holy commandments”
-
18:19 - 18:23For example, the ten commandments of Dieter Rams,
-
18:23 - 18:27there is ten commandments of Jakob Nielsen,
-
18:27 - 18:28of Don Norman
-
18:28 - 18:35and it really depends on who you believe in, etc.
-
18:35 - 18:37But at the cornerstone of all of these is that
-
18:37 - 18:40design is made for people.
-
18:40 - 18:46And, well, actually, Google says it quite well
-
18:46 - 18:49in their guiding mission:
-
18:49 - 18:53“Focus on the user and all else will follow.”
-
18:53 - 18:55Or, as a usability maxim:
-
18:55 - 18:57“thou shalt test with thy user”
-
18:57 - 19:01Don’t just give them the thing.
-
19:01 - 19:03But there is one problem with these heuristics
-
19:03 - 19:07and with this advice of just testing with your user.
-
19:07 - 19:11Because it’s pretty abstract advice.
-
19:11 - 19:12What do you do?
-
19:12 - 19:14You go out into the world to get practice.
-
19:14 - 19:18You start observing people.
-
19:18 - 19:20One nice exercise to try is:
-
19:20 - 19:21go to the vending machine,
-
19:21 - 19:25for example the ones at the S-Bahn.
-
19:25 - 19:26Just stand next to it
-
19:26 - 19:28and observe people buying tickets.
-
19:28 - 19:31It’s quite entertaining, actually.
-
19:31 - 19:34Laughter
-
19:34 - 19:36And something you can also do is
-
19:36 - 19:38search for usability failures.
-
19:38 - 19:40This is what you already do when
-
19:40 - 19:41you’re observing people.
-
19:41 - 19:45But even just google for “usability failure”,
-
19:45 - 19:48“GUI fail”, etc., and you will find
-
19:48 - 19:53lots of entertaining stuff.
-
19:53 - 19:55Those were some heuristics
-
19:55 - 19:56but what about the principles
-
19:56 - 20:02that lie behind those?
-
20:02 - 20:06Usability or interaction design
-
20:06 - 20:09is a cycle between the user and the system.
-
20:09 - 20:10The user and the world.
-
20:10 - 20:12The user acts on the world
-
20:12 - 20:14and gets feedback.
-
20:14 - 20:17They interpret that.
-
20:17 - 20:18One important concept is
-
20:18 - 20:20for things to be visible.
-
20:20 - 20:21For the underlying system state
-
20:21 - 20:22to be visible and
-
20:22 - 20:24you get appropriate feedback
-
20:24 - 20:26from the system.
-
20:26 - 20:31So these are Don Norman’s gulfs
of execution and evaluation -
20:31 - 20:35sort of yin and yang.
-
20:35 - 20:39And there is two concrete problems
-
20:39 - 20:40to illustrate.
-
20:40 - 20:42For example, the button problem
-
20:42 - 20:46that “how do you know what happens
when you push the button?” -
20:46 - 20:51and “how do you know how to push it?”
-
20:51 - 20:53I unfortunately don’t have a picture of it
-
20:53 - 20:58but at Oxford station, the taps in the bathrooms
-
20:58 - 21:02they say “push” and you need to turn.
-
21:02 - 21:05Laughter
-
21:05 - 21:09Then there is the toilet door problem.
-
21:09 - 21:12The problem of “how do you know
what state a system is in”. -
21:12 - 21:16How do you know whether an email will be encrypted?
-
21:16 - 21:20This is a picture …
-
21:20 - 21:22basically there is two locks.
-
21:22 - 21:26One is actually broken and it’s …
when pushing the button that's on the -
21:26 - 21:29door handle, you usually lock the door.
-
21:29 - 21:32But … well … it broke. So that must have been
-
21:32 - 21:36an entertaining accident.
-
21:36 - 21:39Another, as I’ve already described,
-
21:39 - 21:44another important concept is that of mental models.
-
21:44 - 21:48It’s a question of what idea does the user have
-
21:48 - 21:53of the system by interacting with it?
-
21:53 - 21:56How do they acquire knowledge?
-
21:56 - 21:59For example, how to achieve discoverability
-
21:59 - 22:01of the system?
-
22:01 - 22:06And how to ensure that while
a user is discovering the system -
22:06 - 22:09that they are less likely to make mistakes?
-
22:09 - 22:14So this is the concept of poka-yoke
-
22:14 - 22:18and it’s … here is an example
you also see with floppy disks, -
22:18 - 22:22with USB sticks, etc.
-
22:22 - 22:24It’s engineered such that users are
-
22:24 - 22:27less likely to make a mistake.
-
22:27 - 22:31Then there’s also the idea
of enabling knowledge transfer -
22:31 - 22:33So how can we do this?
-
22:33 - 22:35One thing is metaphors.
-
22:35 - 22:40And I’m not sure how many of you recognise this,
-
22:40 - 22:44this is Microsoft BOB.
-
22:44 - 22:46Traditionally, PC systems have been built
-
22:46 - 22:52on the desktop metaphor.
-
22:52 - 22:58Laughter
Microsoft BOB took the metaphor a little too far. -
22:58 - 23:05To enable knowledge transfer,
you can also standardise systems. -
23:05 - 23:09And one important tool for this is design languages
-
23:09 - 23:12so if you’re designing for iOS, go look at
-
23:12 - 23:16the design language, the Human
Interface Guidelines of iOS. -
23:16 - 23:20The same for Windows – go look
at the Metro Design Guidelines. -
23:20 - 23:26As for Android, look at Material Design.
-
23:26 - 23:30Because, another interesting exercise
to try out -
23:30 - 23:33relating to design languages
-
23:33 - 23:38and also to get familiar with how designers try to
-
23:38 - 23:41communicate with users is to
-
23:41 - 23:45look at an interface and trying to decode
-
23:45 - 23:49what the designer is trying to say to the user.
-
23:49 - 23:54And another interesting exercise is to look at
-
23:54 - 23:59not usability but UNusability.
-
23:59 - 24:01So there is this pretty interesting book
-
24:01 - 24:05called “evil by design” and it goes into
-
24:05 - 24:08all the various techniques that designers use
-
24:08 - 24:17to fool users, to get them to buy an extra hotel, car, etc.
-
24:17 - 24:22and, well, RyanAir is pretty much the worst offender
-
24:22 - 24:25so a good example to study.
-
24:25 - 24:31Applause
-
24:31 - 24:34So, what if you wanna go out into the world
-
24:34 - 24:40and are gonna apply these principles, these heuristics?
-
24:40 - 24:42The first thing to know is that
-
24:42 - 24:45design has to be a process
-
24:45 - 24:51whereby, first part is actually defining your problem.
-
24:51 - 24:54You first brain-storm
-
24:54 - 24:58then you try to narrow down to concrete requirements
-
24:58 - 25:03after which you go and try out
the various approaches -
25:03 - 25:06and test these.
-
25:06 - 25:09What materials do usability experts actually use?
-
25:09 - 25:16Well, of course there’s expensive tools, Axure, etc.
-
25:16 - 25:19but I think one of the most used materials
-
25:19 - 25:25is still the post-it note. Just paper and pens.
-
25:25 - 25:29And, okay, where do you wanna go and test?
-
25:29 - 25:32Well, actually, go out into the field.
-
25:32 - 25:36Go to the ticket machine of the S-Bahn.
-
25:36 - 25:39But also go and test in the lab, so that you have
-
25:39 - 25:42a controlled environment.
-
25:42 - 25:45And then you ask “okay, how do I test?”
-
25:45 - 25:50Well, first thing is: Go and get some real people.
-
25:50 - 25:55Of course, it’s … it can be difficult to actually
-
25:55 - 26:01get people into the lab but it’s not impossible.
-
26:01 - 26:02So once you have people in the lab,
-
26:02 - 26:05here are some methods.
-
26:05 - 26:07There are so many usability evaluation methods
-
26:07 - 26:09that I’m not gonna list them all and
-
26:09 - 26:13I encourage you to go and look them up yourself
-
26:13 - 26:15’cause it’s also very personal what works for you
-
26:15 - 26:20and what works in your situation.
-
26:20 - 26:23When using these methods you wanna
-
26:23 - 26:26evaluate how well a solution works
-
26:26 - 26:30So you’re gonna look at some metrics
-
26:30 - 26:31so that at the end of your evaluation
-
26:31 - 26:35you can say “okay, we’ve done a good job”,
-
26:35 - 26:40“this can go better”,
“Okay, maybe we can move that”, … -
26:40 - 26:44So these are the standardised ones, so
-
26:44 - 26:48how effective people are, etc.
-
26:48 - 26:53You can read …
-
26:53 - 26:56For a quick start guide on how to
-
26:56 - 26:59perform usability studies, this is quite a nice one.
-
26:59 - 27:00And the most important thing to remember
-
27:00 - 27:05is that preparation is half the work.
-
27:05 - 27:08First thing to check that everything is working,
-
27:08 - 27:17make sure that you have everyone
you need in the room, etc. -
27:17 - 27:23And maybe most importantly,
usability and usable security, -
27:23 - 27:26well, usable security is still a growing field, but
-
27:26 - 27:31usability is a very large field and most likely
-
27:31 - 27:35all the problems that you are going to face
-
27:35 - 27:37or at least a large percentage, other people
-
27:37 - 27:39have faced before.
-
27:39 - 27:44So this book is, well, it describes a lot of the stories
-
27:44 - 27:48of user experience professionals and the things
-
27:48 - 27:52that they’ve come up against.
-
27:52 - 27:56A homework exercise, if you feel like it,
-
27:56 - 28:01is basically analysing who your user is
-
28:01 - 28:07and where they’re going to use the application.
-
28:07 - 28:10And also something to think about is
-
28:10 - 28:13how might you involve your user?
-
28:13 - 28:17Not just during the usability testing,
-
28:17 - 28:21but also afterwards.
-
28:21 - 28:28Now I wanna go into some case
studies of encryption systems. -
28:28 - 28:30Now there’s quite a lot, and these are not all,
-
28:30 - 28:35it’s just a small selection but I wanna focus on three.
-
28:35 - 28:40I wanna focus at the OpenPGP standard,
Cryptocat and TextSecure. -
28:40 - 28:43So, OpenPGP, well …
-
28:43 - 28:46email is now almost 50 years old,
-
28:46 - 28:52we have an encryption standard—S/MIME,
it is widely used -
28:52 - 28:56well, it’s widely usable but it’s not widely used …
-
28:56 - 29:04and GnuPG is used widely but is not installed by default
-
29:04 - 29:10and when usability teaches us one thing
-
29:10 - 29:14it’s that defaults rule.
-
29:14 - 29:18Because users don’t change defaults.
-
29:18 - 29:23Now you might ask “Okay,
PGP is not installed by default, -
29:23 - 29:27so is there actually still a future for OpenPGP?”
-
29:27 - 29:30Well, I’d argue: Yes.
We have browser plug-ins -
29:30 - 29:33which make it easier for users
-
29:33 - 29:38JavaScript crypto … I’ll come back to that later …
-
29:38 - 29:43But when we look at Mailvelope, we see, well,
-
29:43 - 29:48the EFF scorecard, it has a pretty decent rating
-
29:48 - 29:56at least compared to that of native PGP implementations.
-
29:56 - 29:59And also Google has announced and has been working
-
29:59 - 30:01for quite some time on their own plug-in for
-
30:01 - 30:03end-to-end encryption.
-
30:03 - 30:08And Yahoo! is also involved in that.
-
30:08 - 30:11And after the Snowden revelations there has been
-
30:11 - 30:15a widespread surge in the interest
-
30:15 - 30:18in encrypted communications
-
30:18 - 30:23and this is one website where a lot of these are listed.
-
30:23 - 30:28And one project that I’d especially like to emphasise
-
30:28 - 30:32is mailpile because I think it looks
-
30:32 - 30:35like a very interesting approach
-
30:35 - 30:38whereby the question is:
-
30:38 - 30:41Can we use OpenPGP as a stepping stone?
-
30:41 - 30:47OpenPGP is not perfect, meta-data is not protected,
-
30:47 - 30:48header is not protected, etc.
-
30:48 - 30:52But maybe when we get people into the ecosystem,
-
30:52 - 30:56we can try and gradually move
them to more secure options. -
30:56 - 30:59Now, what about Cryptocat?
-
30:59 - 31:04So, Cryptocat is an online chat platform
-
31:04 - 31:07that … yes … uses JavaScript.
-
31:07 - 31:11And of course, JavaScript crypto is bad
-
31:11 - 31:15but it can be made better.
-
31:15 - 31:20And I think JavaScript crypto is not the worst problem.
-
31:20 - 31:23Cryptocat had a pretty disastrous problem
-
31:23 - 31:27whereby all messages that were sent
-
31:27 - 31:31were pretty easily decryptable.
-
31:31 - 31:33But actually, this is just history repeating itself
-
31:33 - 31:39’cause PGP 1.0 used something called BassOmatic,
-
31:39 - 31:45the BassOmatic cypher which is also pretty weak.
-
31:45 - 31:50And Cryptocat is improving, which is the important thing.
-
31:50 - 31:51There is now a browser plug-in and
-
31:51 - 31:54of course, there’s an app for that and
-
31:54 - 31:57actually, Cryptocat is doing really, really well
-
31:57 - 31:59in the EFF benchmarks.
-
31:59 - 32:05And Cryptocat is asking the one question that a lot
-
32:05 - 32:07of other applications are not asking, which is:
-
32:07 - 32:09“How can we actually make crypto fun?”
-
32:09 - 32:12When you start Cryptocat, there’s noises
-
32:12 - 32:15and there’s interesting facts about cats
-
32:15 - 32:19Laughter
-
32:19 - 32:21… depends on whether you like cats, but still!
-
32:21 - 32:23Keeps you busy!
-
32:23 - 32:29Now, the last case: TextSecure
also has pretty good marks -
32:29 - 32:32and actually just like CryptoCat,
-
32:32 - 32:36the App store distribution model is something that
-
32:36 - 32:39I think is a valuable one for usability.
-
32:39 - 32:42It makes it easy to install.
-
32:42 - 32:46And something that TextSecure is also looking at is
-
32:46 - 32:52synchronisation options for your address book.
-
32:52 - 32:57And I think the most interesting development is
-
32:57 - 33:00on the one side, the CyanogenMod integration,
-
33:00 - 33:05so that people will have encryption enabled by default.
-
33:05 - 33:10’Cause as I mentioned: People don’t change defaults.
-
33:10 - 33:15And this one is a bit more controversial, but
-
33:15 - 33:18there’s also the WhatsApp partnership.
-
33:18 - 33:21And of course people will say “it’s not secure”,
-
33:21 - 33:23we know, we know,
-
33:23 - 33:24EFF knows!
-
33:24 - 33:29But at least, it’s more secure than nothing at all.
-
33:29 - 33:31Because: Doesn’t every little bit help?
-
33:31 - 33:33Well, I’d say: yes.
-
33:33 - 33:36And at least, it’s one stepping stone.
-
33:36 - 33:40And, well, all of these are open-source,
-
33:40 - 33:42so you can think for yourself:
-
33:42 - 33:45How can I improve these?
-
33:45 - 33:50Now, there’s still some open questions remaining
-
33:50 - 33:53in the usable security field and in the
-
33:53 - 33:56wider security field as well.
-
33:56 - 33:59I won’t go into all of these,
-
33:59 - 34:03I wanna focus on the issues that developers have,
-
34:03 - 34:06issues of end user understanding
-
34:06 - 34:09and of identity management.
-
34:09 - 34:14Because in the development environment
-
34:14 - 34:18there’s the crypto-plumbing problem, as some people call it.
-
34:18 - 34:21How do we standardise on a cryptographic algorithm?
-
34:21 - 34:25How do we make everyone use the same system?
-
34:25 - 34:29Because, again, it’s history repeating itself.
-
34:29 - 34:36With PGP, we had RSA, changed for
DSA because of patent issues -
34:36 - 34:40IDEA changed for CAST5 because of patent issues
-
34:40 - 34:42and now we have something similar:
-
34:42 - 34:44’cause for PGP the question is:
-
34:44 - 34:46Which curve do we choose?
-
34:46 - 34:51’cause this is from Bernstein, who has got a whole list
-
34:51 - 34:56of, well not all the curves,
but a large selection of them -
34:56 - 34:58analysing the security
-
34:58 - 35:01but how do you make, well, pretty much
-
35:01 - 35:06the whole world agree on a single standard?
-
35:06 - 35:11And also, can we move toward safer languages?
-
35:11 - 35:18And I’ve been talking about the
usability of encryption systems -
35:18 - 35:22for users, but what about for developers?
-
35:22 - 35:26So, API usability, and as I’ve mentioned:
-
35:26 - 35:28Language usability.
-
35:28 - 35:32And on top of that, it is not just a technical issue,
-
35:32 - 35:35because, of course, we secure microchips,
-
35:35 - 35:42but we also wanna secure social systems.
-
35:42 - 35:45Because, in principle, we live in an open system,
-
35:45 - 35:51in an open society and a system cannot audit itself.
-
35:51 - 35:56So, okay, what do we do, right?
I don’t know. -
35:56 - 35:58I mean, that’s why it’s an open question!
-
35:58 - 36:01’Cause how do we ensure the authenticity of,
-
36:01 - 36:06I don’t know, my Intel processor in my laptop?
-
36:06 - 36:07How do I know that the
-
36:07 - 36:09random number generator isn’t bogus?
-
36:09 - 36:14Well, I know it is, but …
laughter -
36:14 - 36:17Then, there’s the issue of identity management
-
36:17 - 36:19related to key management, like
-
36:19 - 36:25who has the keys to the kingdom?
-
36:25 - 36:27One approach, as I’ve already mentioned, is
-
36:27 - 36:29key continuity management.
-
36:29 - 36:32Whereby we automate both key exchange and
-
36:32 - 36:36whereby we automate encryption.
-
36:36 - 36:38So one principle is trust on first use,
-
36:38 - 36:44whereby, well, one approach will be to attach your key
-
36:44 - 36:46to any email you send out and anyone who receives
-
36:46 - 36:50this email just assumes it’s the proper key.
-
36:50 - 36:53Of course, it’s not fully secure,
-
36:53 - 36:56but at least, it’s something.
-
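The trust-on-first-use idea described above can be sketched in a few lines (a minimal illustrative model of the concept, not any real implementation's logic; the addresses and fingerprints are made up): pin the first key you see for a sender, accept matches, and warn on change.

```python
# Minimal trust-on-first-use (TOFU) key pinning sketch.
pinned: dict[str, str] = {}  # sender address -> pinned key fingerprint

def check_key(sender: str, fingerprint: str) -> str:
    if sender not in pinned:
        pinned[sender] = fingerprint    # first contact: trust and pin
        return "pinned"
    if pinned[sender] == fingerprint:   # same key as before: accept
        return "ok"
    return "warn"                       # key changed: possible MITM

assert check_key("glenn@example.com", "AB12") == "pinned"
assert check_key("glenn@example.com", "AB12") == "ok"
assert check_key("glenn@example.com", "FFFF") == "warn"
```

The usability win is that the common case needs no user decision at all; the user is only interrupted in the rare (and genuinely suspicious) case of a key change.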
36:56 - 36:59And this is really, I think, the major question
-
36:59 - 37:01in interoperability:
-
37:01 - 37:05How do you ensure that you can access your email
-
37:05 - 37:09from multiple devices?
-
37:09 - 37:11Now, of course, there is meta-data leakage,
-
37:11 - 37:14PGP doesn’t protect meta-data,
-
37:14 - 37:17and, you know, your friendly security agency knows
-
37:17 - 37:18where you went last summer …
-
37:18 - 37:19So, what do we do?
-
37:19 - 37:24We do anonymous routing, we send over Tor, but
-
37:24 - 37:26I mean, how do we roll that out?
-
37:26 - 37:28I think the approach that
-
37:28 - 37:30mailpile is trying to do is interesting
-
37:30 - 37:33and, of course still an open question, but
-
37:33 - 37:37interesting research nonetheless.
-
37:37 - 37:39Then there’s the introduction problem of
-
37:39 - 37:44okay, how, I meet someone here, after the talk,
-
37:44 - 37:46they tell me who they are,
-
37:46 - 37:50but either I get their card—which is nice—or
-
37:50 - 37:52they say what their name is.
-
37:52 - 37:56But they’re not gonna tell me, they’re not gonna spell out
-
37:56 - 37:58their fingerprint.
-
37:58 - 38:02So the idea of Zooko’s triangle is that identifiers
-
can be human-meaningful,
secure, or decentralised. -
38:08 - 38:09Pick two.
-
38:09 - 38:13So here’s some examples of identifiers,
-
38:13 - 38:16so for Bitcoin: Lots of random garbage.
-
38:16 - 38:19For OpenPGP: Lots of random garbage
-
38:19 - 38:22For miniLock: Lots of random garbage
-
38:22 - 38:26So, I think an interesting research problem is:
-
38:26 - 38:30Can we actually make these things memorable?
-
38:30 - 38:32You know, I can memorise email addresses,
-
38:32 - 38:34I can memorise phone numbers,
-
38:34 - 38:40I cannot memorise these. I can try, but …
-
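One direction for making such identifiers memorable is encoding fingerprint bytes as words, in the spirit of the PGP word list. The word list and mapping below are invented purely for illustration; a real scheme would need a list large enough to map each byte value uniquely:

```python
# Illustrative sketch: mapping fingerprint bytes to short words so
# that "lots of random garbage" becomes something closer to
# memorable. WORDS is a made-up example list, not the PGP word list.

WORDS = ["acid", "bark", "cave", "dust", "echo", "fern", "gale", "hill"]


def words_from_fingerprint(fp_hex: str) -> list:
    """Map each fingerprint byte to a word (here: byte mod len(WORDS))."""
    data = bytes.fromhex(fp_hex)
    return [WORDS[b % len(WORDS)] for b in data]
```

With only eight example words the mapping loses information (many bytes share a word), which is exactly the kind of usability/security trade-off the open research question is about.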
38:40 - 38:45Then, the last open question I wanna focus on
-
38:45 - 38:49is that of end-user understanding.
-
38:49 - 38:54So of course, we’ll know that all devices are monitored.
-
38:54 - 39:00But does the average user?
-
39:00 - 39:05Do they know what worms can do?
-
39:05 - 39:09Have they read these books?
-
39:09 - 39:15Do they know where GCHQ is?
-
39:15 - 39:21Do they know that Cupertino has
pretty much the same powers? -
39:21 - 39:24Laughter
-
39:24 - 39:29Do they know they’re living in a panopticon to come?
-
39:29 - 39:32Laughter
-
39:32 - 39:38Do they know that people are
killed based on meta-data? -
39:38 - 39:41Well, I think not.
-
39:41 - 39:46And actually this is a poster from the university
-
39:46 - 39:47where I did my Master’s
-
39:47 - 39:51and interestingly enough, it was founded by a guy
-
39:51 - 39:56who made a fortune selling sugar pills.
-
39:56 - 40:03You know, snake oil, we also have this in crypto.
-
40:03 - 40:06And how is the user to know
-
40:06 - 40:08whether something is secure or not?
-
40:08 - 40:11Of course, we have the secure messaging scorecard
-
40:11 - 40:15but can users find these?
-
40:15 - 40:21Well, I think, there’s three aspects
to end-user understanding -
40:21 - 40:24which is knowledge acquisition,
knowledge transfer, -
and the verification and updating of this knowledge.
-
40:27 - 40:31So, as I’ve already mentioned,
we can do dummy-proofing -
40:31 - 40:38and we can create transparent systems.
-
40:38 - 40:41For knowledge transfer, we can
-
40:41 - 40:44look at appropriate metaphors and design languages.
-
40:44 - 40:47And for verification we can
-
40:47 - 40:51try a truth-in-advertising approach.
-
40:51 - 40:56And, last but not least, we can do user-testing.
-
40:56 - 41:03Because all these open questions that I’ve described
-
41:03 - 41:06and all this research that has been done,
-
41:06 - 41:11I think it’s missing one key issue, which is that
-
41:11 - 41:14the usability people and the security people
-
41:14 - 41:17tend not to really talk to one another.
-
41:17 - 41:21The open-source developers and the users:
-
41:21 - 41:23Are they talking enough?
-
41:23 - 41:27I think that’s something, if we want a new dawn,
-
41:27 - 41:31that’s something that I think we should approach.
-
41:31 - 41:35Yeah, so, from my side, that’s it.
-
41:35 - 41:37I’m open for any questions.
-
41:37 - 41:49Applause
-
41:49 - 41:52Herald: Arne, thank you very much for your brilliant talk
-
41:52 - 41:55Now, if you have any questions to ask, would you please
-
41:55 - 41:58line up at the microphones in the aisles?!
-
41:58 - 42:00The others who’d like to leave now,
-
42:00 - 42:04I’d ask you kindly to please leave very quietly
-
42:04 - 42:09so we can hear what the people
asking questions will tell us. -
42:09 - 42:14And those at the microphones,
if you could talk slowly, -
42:14 - 42:19then those translating have no problems in translating
-
42:19 - 42:21what is being asked. Thank you very much.
-
42:21 - 42:27And I think we’ll start with mic #4 on the left-hand side.
-
42:27 - 42:32Mic#4: Yes, so, if you’ve been to any successful
-
42:32 - 42:36crypto party, you know that crypto parties very quickly
-
turn from discussing software,
how to use software, -
42:41 - 42:44into threat-model discussions.
-
42:44 - 42:47And to actually get users
to think about what they’re -
trying to protect themselves from. And even if a certain
-
42:49 - 42:53messaging app is secure, that still means nothing.
-
42:53 - 42:56’Cause there is lots of other stuff that’s going on.
-
42:56 - 42:57Can you talk a little bit about that and
-
42:57 - 43:00how that runs into this model about, you know,
-
43:00 - 43:02how we need to educate users and, while we’re at it,
-
what we want to educate them about.
-
43:04 - 43:06And what they actually need to be using.
-
43:06 - 43:10Arne: Well, I think that’s an interesting point
-
43:10 - 43:14and I think, one issue, one big issue is:
-
43:14 - 43:17okay, we can throw lots of crypto parties
-
43:17 - 43:21but we’re never gonna be able to throw enough parties.
-
43:21 - 43:23I … with one party you’re very lucky
-
43:23 - 43:25you’re gonna educate 100 people.
-
43:25 - 43:29I mean, just imagine how many parties
you’d need to throw. Right? -
43:29 - 43:33I mean, it’s gonna be a heck of a party, but … yeah.
-
43:33 - 43:39And I think, secondly, the question of threat modeling,
-
43:39 - 43:43I think, sure, that’s helpful to do, but
-
43:43 - 43:48I think, users do first need an understanding of,
-
43:48 - 43:49for example, the email architecture.
-
’Cause, how can they do threat
modeling when they think -
43:52 - 43:55that an email magically pops
from one computer to the next? -
43:55 - 43:59I think, that is pretty much impossible.
-
43:59 - 44:01I hope that …
-
44:01 - 44:05Herald: Thank you very much, so …
Microphone #3, please. -
44:05 - 44:07Mic#3: Arne, thank you very much for your talk.
-
44:07 - 44:10There’s one aspect that I didn’t see in your slides.
-
44:10 - 44:13And that is the aspect of the language that we use
-
44:13 - 44:17to describe concepts in PGP—and GPG, for that matter.
-
44:17 - 44:20And I know that there was a paper last year
-
44:20 - 44:22about why King George can’t encrypt and
-
44:22 - 44:24they were trying to propose a new language.
-
44:24 - 44:26Do you think that such initiatives are worthwhile
-
44:26 - 44:29or are we stuck with this language and should we make
-
44:29 - 44:32as good use of it as we can?
-
44:32 - 44:38Arne: I think that’s a good point
and actually the question -
44:38 - 44:45of “okay, what metaphors do you wanna use?” … I think
-
44:45 - 44:47we’re pretty much stuck with the language
-
44:47 - 44:50that we’re using for the moment but
-
44:50 - 44:54I think it does make sense
to go and look into the future -
44:54 - 44:58at alternative models.
-
44:58 - 45:01Yeah, so I actually wrote a paper that also
-
45:01 - 45:05goes into that a bit, looking at
-
45:05 - 45:09the metaphor of handshakes to exchange keys.
-
45:09 - 45:10So, for example, you could have
-
45:10 - 45:16an embedded device as a ring or wristband,
-
45:16 - 45:19it could even be a smartwatch, for that matter.
-
45:19 - 45:22Could you use that shaking of hands to
-
45:22 - 45:24build trust-relationships?
-
45:24 - 45:30And that might be a better
metaphor than key-signing, -
45:30 - 45:31webs of trust, etc.
-
45:31 - 45:35’Cause I think, that is horribly broken
-
45:35 - 45:40as a concept, when trying
to explain it to users. -
45:40 - 45:43Herald: Thank you. And … at the back in the middle.
-
45:43 - 45:45Signal angel: Thanks. A question from the internet:
-
45:45 - 45:47[username?] from the Internet wants to know if you’re
-
45:47 - 45:52aware of the PEP project, the “pretty easy privacy”
-
45:52 - 45:53and your opinions on that.
-
45:53 - 45:55And another question is:
-
45:55 - 46:02How important is the trust level of the crypto to you?
-
46:02 - 46:04Arne: Well, yes, actually, there’s this screenshot
-
46:04 - 46:10of the PEP project in the slides
-
… in the part about why WhatsApp is horribly insecure and
-
46:15 - 46:19of course, I agree, and yeah.
-
46:19 - 46:22I’ve looked into the PEP project for a bit
-
46:22 - 46:25and I think, yeah, I think it’s an interesting
-
46:25 - 46:28approach but I still have to read up on it a bit more.
-
46:28 - 46:31Then, for the second question,
-
“how important is the trust in the crypto?”:
-
46:38 - 46:42I think that’s an important one.
-
46:42 - 46:43Especially the question of
-
46:43 - 46:53“how do we build social systems
to ensure reliable cryptography?” -
46:53 - 46:57So one example is the Advanced
Encryption Standard competition. -
46:57 - 47:00Everyone was free to send in entries,
-
47:00 - 47:02their design principles were open
-
47:02 - 47:06and this is in complete contrast
to the Data Encryption Standard -
47:06 - 47:12whose design principles, I think, are still Top Secret.
-
47:12 - 47:16So yeah, I think, the crypto is
something we need to build on -
47:16 - 47:22but, well, actually, the crypto is
again built on other systems -
47:22 - 47:28where the trust in these systems
is even more important. -
47:28 - 47:31Herald: Okay, thank you, microphone #2, please.
-
47:31 - 47:34Mic#2: Yes, Arne, thank you very
much for your excellent talk. -
47:34 - 47:38I wonder about what to do with feedback
-
47:38 - 47:41on usability in open-source software.
-
47:41 - 47:42So, you publish something on GitHub
-
47:42 - 47:44and you’re with a group of people
-
47:44 - 47:45who don’t know each other and
-
47:45 - 47:48one publishes something,
the other publishes something, -
47:48 - 47:51and then: How do we know
this software is usable? -
47:51 - 47:54In commercial software, there’s all kind of hooks
-
47:54 - 47:56on the website, on the app,
-
47:56 - 47:59to send feedback to the commercial vendor.
-
47:59 - 48:02But in open-source software,
how do you gather this information? -
48:02 - 48:05How do you use it, is there any way to do this
-
48:05 - 48:06in an anonymised way?
-
48:06 - 48:09I haven’t seen anything related to this.
-
48:09 - 48:11Is this one of the reasons why
open-source software is maybe -
48:11 - 48:15less usable than commercial software?
-
48:15 - 48:20Arne: It might be. It might be.
-
48:20 - 48:23But regarding your question, like, how do you know
-
48:23 - 48:30whether a commercial software is usable, well,
-
48:30 - 48:32you … one way is looking at:
-
48:32 - 48:35Okay, what kind of statistics do you get back?
-
48:35 - 48:38But if you push out something totally unusable
-
48:38 - 48:40and then, I mean, you’re going to expect
-
48:40 - 48:45that the statistics come back looking like shit.
-
48:45 - 48:50So, the best approach is to
design usability in from the start. -
48:50 - 48:51The same with security.
-
48:51 - 48:55And I think, that is also …
so even if you have … -
48:55 - 48:59you want privacy for end users, I think it’s still possible
-
48:59 - 49:02to get people into their lab and look at:
-
49:02 - 49:03Okay, how are they using the system?
-
49:03 - 49:06What things can we improve?
-
49:06 - 49:08And what things are working well?
-
49:08 - 49:11Mic#2: So you’re saying, you should only publish
-
49:11 - 49:19open-source software for users
if you also tested in a lab? -
49:19 - 49:23Arne: Well, I think, this is a bit of a discussion of:
-
49:23 - 49:26Do we just allow people to build
houses however they want to -
49:26 - 49:28or do we have building codes?
-
49:28 - 49:32And … I think … well, actually, this proposal of holding
-
49:32 - 49:36software developers responsible for what they produce,
-
49:36 - 49:40if it’s commercial software, I mean,
that proposal has been -
49:40 - 49:42made a long time ago.
-
49:42 - 49:43And the question is:
-
49:43 - 49:48How would that work in an
open-source software community? -
49:48 - 49:50Well, actually, I don’t have an answer to that.
-
49:50 - 49:53But I think, it’s an interesting question.
-
49:53 - 49:54Mic#2: Thank you.
-
49:54 - 49:58Herald: Thank you very much. #1, please.
-
49:58 - 50:01Mic#1: You said that every little bit helps,
-
50:01 - 50:04so if we have systems that
don’t provide a lot of … well … -
50:04 - 50:07are almost insecure, they
provide just a bit of security, then -
50:07 - 50:10that is still better than no security.
-
50:10 - 50:13My question is: Isn’t that actually worse because
-
50:13 - 50:15this promotes a false sense of security and
-
50:15 - 50:20that makes people just use
the insecure broken systems -
50:20 - 50:24just to say “we have some security with us”?
-
50:24 - 50:26Arne: I completely agree but
-
50:26 - 50:29I think that currently people … I mean …
-
50:29 - 50:31when you think an email goes
-
50:31 - 50:34from one system to the other directly
-
50:34 - 50:41and I mean … from these studies
that I’ve done, I’ve met -
quite a few people who still think email is secure.
-
50:46 - 50:50So, of course, you might give
them a false sense of security -
50:50 - 50:53when you give them a
more secure program but -
50:53 - 50:54at least it’s more secure than email—right?
-
50:54 - 50:56I mean …
-
50:56 - 50:57Mic#1: Thank you.
-
50:57 - 51:00Herald: Thank you. There’s another
question on the Internet. -
51:00 - 51:03Signal angel: Yes, thank you. Question from the Internet:
-
51:03 - 51:06What crypto would you finally
recommend your grandma to use? -
51:06 - 51:10Arne: laughs
-
51:10 - 51:16Well … Unfortunately, my grandma
has already passed away. -
51:16 - 51:20I mean … her secrets will be safe …
-
51:27 - 51:32Actually, I think something like where
-
51:32 - 51:37crypto is enabled by default, say …
iMessage, I mean -
51:37 - 51:42of course, there’s backdoors,
etc., but at least -
51:42 - 51:45it is more secure than plain SMS.
-
51:45 - 51:53So I would advise my grandma
to use … well, to look at … -
51:53 - 51:56actually I’d first analyse
what does she have available -
51:56 - 51:59and then I would look at okay
which is the most secure -
51:59 - 52:04and still usable?
-
52:04 - 52:07Herald: Thank you very much, so mic #3, please.
-
52:07 - 52:11Mic#3: So, just wondering:
-
52:11 - 52:15You told that there is
a problem with the missing -
52:15 - 52:20default installation of GPG
on operating systems but -
52:20 - 52:25I think, this is more of a
problem of which OS you choose -
52:25 - 52:28because at least I don’t
know any Linux system which -
52:28 - 52:34doesn’t have GPG installed today by default.
-
52:34 - 52:40If you use … at least I’ve used
the normal workstation setup. -
52:40 - 52:43Arne: Yes, I think you already
answered your own question: -
52:43 - 52:47Linux.
Laughter -
52:47 - 52:51Unfortunately, Linux is not yet widely default.
-
52:51 - 52:53I mean, I’d love it to be, but … yeah.
-
52:53 - 52:58Mic#3: But if I send an email to Microsoft and say:
-
52:58 - 53:03Well, install GPG by default, they’re not gonna
-
53:03 - 53:04listen to me.
-
53:04 - 53:08And I think, for all of us, we should do
-
53:08 - 53:09a lot more of that.
-
53:09 - 53:14Even if Microsoft is the devil for most of us.
-
53:14 - 53:16Thank you.
-
53:16 - 53:20Arne: Well … We should be doing more of what?
-
53:20 - 53:26Mic#3: Making more demands
to integrate GPG by default -
53:26 - 53:29in Microsoft products, for example.
-
53:29 - 53:31Arne: Yes, I completely agree.
-
53:31 - 53:34Well, what you already see happening …
-
53:34 - 53:36or I mean, it’s not very high-profile yet,
-
53:36 - 53:39but for example I mean … I’ve refered to
-
53:39 - 53:43the EFF scorecard a couple of times but
-
53:43 - 53:50that is some pressure to encourage developers
-
53:50 - 53:53to include security by default.
-
53:53 - 53:57But, I think I’ve also mentioned, one of the big problems
-
53:57 - 54:01is: users at the moment … I mean …
-
54:01 - 54:04developers might say: my system is secure.
-
54:04 - 54:07I mean … what does that mean?
-
54:07 - 54:10Do we hold developers and
commercial entities … -
54:10 - 54:12do we hold them to, well,
-
truthful advertising standards or not?
-
54:14 - 54:17I mean, I would say: Let’s go and look at
-
54:17 - 54:21what are companies claiming and
-
54:21 - 54:23do they actually stand up to that?
-
54:23 - 54:26And if not: Can we actually sue them?
-
54:26 - 54:28Can we make them tell the truth about
-
54:28 - 54:31what is happening and what is not?
-
54:31 - 54:33Herald: So, we’ve got about 2 more minutes left …
-
54:33 - 54:37So it’s a maximum of two more questions, #2, please.
-
54:37 - 54:43Mic#2: Yeah, so … Every security system fails.
-
54:43 - 54:50So I’m interested in what sort of work has been done on
-
54:50 - 54:57how do users recover from failure?
-
54:57 - 55:01Everything will get subverted,
-
your best friend will sneak
your key off your computer, -
55:04 - 55:06something will go wrong with that, you know …
-
55:06 - 55:10your kids will grab it …
-
55:10 - 55:13and just, is there, in general, has somebody looked at
-
55:13 - 55:17these sorts of issues?
-
55:17 - 55:19Is there research on it?
-
55:19 - 55:22Arne: Of various aspects of the problem but
-
55:22 - 55:26as far as I’m aware not for the general issue
-
55:26 - 55:30and not any field studies specifically looking at
-
55:30 - 55:34“Okay, what happens when a key is compromised, etc.”
-
55:34 - 55:38I mean, we do have certain cases of things happening
-
55:38 - 55:42but nothing structured.
-
55:42 - 55:45No structured studies, as far as I’m aware.
-
55:45 - 55:47Herald: Thank you. #3?
-
55:47 - 55:52Mic#3: Yeah, you mentioned
Mailpile as a stepping stone -
55:52 - 55:56for people to start using GnuPG and stuff, but
-
55:56 - 56:05you also talked about an
average user seeing mail as just -
56:05 - 56:09coming from one place and
then ending up in another place. -
56:09 - 56:12Shouldn’t we actually talk about
-
56:12 - 56:18how to make encryption transparent for the users?
-
56:18 - 56:21Why should they actually care about these things?
-
56:21 - 56:25Shouldn’t it be embedded in the protocols?
-
56:25 - 56:29Shouldn’t we actually talk about
embedding them in the protocols, -
56:29 - 56:32stop using insecure protocols
-
56:32 - 56:36and having all of these,
you talked a little bit about it, -
56:36 - 56:39as putting it in the defaults.
-
56:39 - 56:43But shouldn’t we emphasise that a lot more?
-
56:43 - 56:47Arne: Yeah, I think we should
certainly be working towards -
56:47 - 56:50“How do we get security by default?”
-
56:50 - 56:54But I think … I’ve mentioned it shortly that
-
56:54 - 56:58making things transparent also has a danger.
-
56:58 - 57:01I mean, this whole, it’s a bit like …
-
57:01 - 57:03a system should be transparent is a bit like
-
57:03 - 57:06marketing speak, because actually
-
57:06 - 57:09we don’t want systems to be completely transparent,
-
57:09 - 57:13’cause we also wanna be able
to engage with the systems. -
57:13 - 57:16Are the systems working as they should be?
-
57:16 - 57:20So, I mean, this is a difficult balance to find, but yeah …
-
57:20 - 57:25Something that you achieve through usability studies,
-
57:25 - 57:29security analysis, etc.
-
57:29 - 57:31Herald: All right, Arne,
thank you very much for giving -
57:31 - 57:34us your very inspiring talk,
-
57:34 - 57:36thank you for sharing your information with us.
-
57:36 - 57:38Please give him a round of applause.
-
57:38 - 57:41Thank you very much.
applause -
57:41 - 57:52subtitles created by c3subtitles.de
Join, and help us!
- Title:
- Arne Padmos: Why is GPG "damn near unusable"?
- Description:
-
http://media.ccc.de/browse/congress/2014/31c3_-_6021_-_en_-_saal_g_-_201412281130_-_why_is_gpg_damn_near_unusable_-_arne_padmos.html
GPG has been correctly described as "damn near unusable". Why is this so? What does research into usable security tell us? This talk covers the history, methods, and findings of the research field, as well as proposed solutions and open questions.
Arne Padmos
- Video Language:
- English
- Duration:
- 57:52