The security mirage

So security is two different things: it's a feeling, and it's a reality. And they're different. You could feel secure even if you're not. And you can be secure even if you don't feel it. Really, we have two separate concepts mapped onto the same word. And what I want to do in this talk is to split them apart -- figuring out when they diverge and how they converge. And language is actually a problem here. There aren't a lot of good words for the concepts we're going to talk about.

So if you look at security in economic terms, it's a trade-off. Every time you get some security, you're always trading off something. Whether this is a personal decision -- whether you're going to install a burglar alarm in your home -- or a national decision -- whether you're going to invade some foreign country -- you're going to trade off something: either money or time, convenience, capabilities, maybe fundamental liberties. And the question to ask when you look at a security anything is not whether this makes us safer, but whether it's worth the trade-off.

You've heard in the past several years that the world is safer because Saddam Hussein is not in power. That might be true, but it's not terribly relevant. The question is: was it worth it? And you can make your own decision, and then you'll decide whether the invasion was worth it. That's how you think about security -- in terms of the trade-off.

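One way to make that "worth the trade-off" question concrete is as an expected-value comparison. The sketch below is not from the talk; the burglar-alarm figures in it are invented purely for illustration.

```python
# A toy expected-value reading of "is it worth the trade-off?".
# All numbers are hypothetical, invented only to make the idea concrete.

def expected_loss(p_incident: float, cost_of_incident: float) -> float:
    """Expected annual loss from a risk: probability times impact."""
    return p_incident * cost_of_incident

def worth_it(p_before: float, p_after: float,
             cost_of_incident: float, cost_of_measure: float) -> bool:
    """A measure is worth it if the risk it removes outweighs what you
    trade away for it (money, time, convenience, liberties...)."""
    risk_removed = (expected_loss(p_before, cost_of_incident)
                    - expected_loss(p_after, cost_of_incident))
    return risk_removed > cost_of_measure

# Hypothetical burglar-alarm decision: 2% annual burglary risk, $5,000 loss,
# and an alarm that halves the risk but costs $300/year all-in.
print(worth_it(p_before=0.02, p_after=0.01,
               cost_of_incident=5_000, cost_of_measure=300))
# -> False: the alarm removes only $50/year of expected loss but costs $300.
```

The hard part, as the rest of the talk argues, is that the probabilities we plug in are felt, not measured.
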
Now there's often no right or wrong here. Some of us have a burglar alarm system at home, and some of us don't. And it'll depend on where we live, whether we live alone or have a family, how much cool stuff we have, how much we're willing to accept the risk of theft. In politics also, there are different opinions. A lot of times, these trade-offs are about more than just security, and I think that's really important.

Now people have a natural intuition about these trade-offs. We make them every day -- last night in my hotel room, when I decided to double-lock the door, or you in your car when you drove here, or when we go eat lunch and decide the food's not poison and we'll eat it. We make these trade-offs again and again, multiple times a day. We often won't even notice them. They're just part of being alive; we all do it. Every species does it.

Imagine a rabbit in a field, eating grass, and the rabbit sees a fox. That rabbit will make a security trade-off: "Should I stay, or should I flee?" And if you think about it, the rabbits that are good at making that trade-off will tend to live and reproduce, and the rabbits that are bad at it will get eaten or starve. So you'd think that we, as a successful species on the planet -- you, me, everybody -- would be really good at making these trade-offs. Yet it seems, again and again, that we're hopelessly bad at it. And I think that's a fundamentally interesting question.

I'll give you the short answer. The answer is, we respond to the feeling of security and not the reality. Now most of the time, that works. Most of the time, feeling and reality are the same. Certainly that's true for most of human prehistory. We've developed this ability because it makes evolutionary sense. One way to think of it is that we're highly optimized for risk decisions that are endemic to living in small family groups in the East African highlands in 100,000 B.C. 2010 New York, not so much.

Now there are several biases in risk perception, and a lot of good experiments on this. You can see certain biases that come up again and again, so I'll give you four.

First, we tend to exaggerate spectacular and rare risks and downplay common risks -- so flying versus driving. Second, the unknown is perceived to be riskier than the familiar. One example would be that people fear kidnapping by strangers, when the data show that kidnapping by relatives is much more common. This is for children. Third, personified risks are perceived to be greater than anonymous risks -- so Bin Laden is scarier because he has a name. And fourth, people underestimate risks in situations they do control and overestimate them in situations they don't control. So once you take up skydiving or smoking, you downplay the risks. If a risk is thrust upon you -- terrorism is a good example -- you'll overplay it, because you don't feel like it's in your control.

There are a bunch of other cognitive biases that affect our risk decisions. There's the availability heuristic, which basically means we estimate the probability of something by how easy it is to bring instances of it to mind. So you can imagine how that works. If you hear a lot about tiger attacks, there must be a lot of tigers around. You don't hear about lion attacks, so there aren't a lot of lions around. This works until you invent newspapers. Because what newspapers do is repeat rare risks again and again. I tell people: if it's in the news, don't worry about it. Because by definition, news is something that almost never happens.

(Laughter)

When something is so common it's no longer news -- car crashes, domestic violence -- those are the risks you worry about.

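The mechanics of the availability heuristic fit in a few lines of code. This is only a sketch; the event rates and "newsworthiness" factors below are invented, and only the shape of the effect matters.

```python
# A toy model of the availability heuristic: we judge frequency by how
# easily examples come to mind, and the media supplies examples of rare
# events far more often than common ones. All numbers are invented.

true_rate = {"car crash": 1e-2, "shark attack": 1e-6}         # events per person-year
stories_per_event = {"car crash": 1e-3, "shark attack": 500}  # media amplification

population = 1_000_000
for risk in true_rate:
    events = true_rate[risk] * population         # expected events this year
    headlines = events * stories_per_event[risk]  # expected news stories
    print(f"{risk:12s} events={events:10.0f} headlines={headlines:8.0f}")

# car crash    events=     10000 headlines=      10
# shark attack events=         1 headlines=     500
# Ranking risks by headlines (ease of recall) inverts the true ranking.
```
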
We're also a species of storytellers. We respond to stories more than data. And there's some basic innumeracy going on. I mean, the joke "one, two, three, many" is kind of right. We're really good at small numbers. One mango, two mangoes, three mangoes; 10,000 mangoes, 100,000 mangoes -- it's still more mangoes than you can eat before they rot. So one half, one quarter, one fifth -- we're good at that. One in a million, one in a billion -- they're both almost never. So we have trouble with the risks that aren't very common.

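A quick worked example of that innumeracy, using invented exposure numbers rather than anything from the talk: over a lifetime, "one in a million" and "one in a billion" diverge by a factor of a thousand, even though both feel identical.

```python
# "One in a million, one in a billion -- they're both almost never."
# A quick check with invented exposure numbers (not data from the talk).

days = 80 * 365  # roughly a lifetime of daily exposures

for name, p_per_day in [("one in a million", 1e-6), ("one in a billion", 1e-9)]:
    expected = p_per_day * days  # expected lifetime occurrences
    print(f"{name}: {expected:.6f}")

# one in a million: 0.029200  (roughly a 3% lifetime chance)
# one in a billion: 0.000029  (roughly 0.003%)
# Both feel like "almost never," yet one is a thousand times more likely.
```
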
And what these cognitive biases do is act as filters between us and reality. And the result is that feeling and reality get out of whack; they get different. Either you feel more secure than you are, which is a false sense of security, or the other way around, which is a false sense of insecurity. I write a lot about "security theater" -- products that make people feel secure but don't actually do anything. There's no real word for stuff that makes us secure but doesn't make us feel secure. Maybe it's what the CIA's supposed to do for us.

So back to economics. If economics, if the market, drives security, and if people make trade-offs based on the feeling of security, then the smart thing for companies to do, given the economic incentives, is to make people feel secure. And there are two ways to do this. One, you can make people actually secure and hope they notice. Or two, you can make people just feel secure and hope they don't notice.

So what makes people notice? Well, a couple of things: an understanding of the security -- of the risks, the threats, the countermeasures, and how they work. If you know stuff, you're more likely to have your feelings match reality. Having enough real-world examples helps. We all know the crime rate in our neighborhood, because we live there, and we get a feeling about it that basically matches reality. Security theater's exposed when it's obvious that it's not working properly.

Okay, so what makes people not notice? Well, a poor understanding. If you don't understand the risks and you don't understand the costs, you're likely to get the trade-off wrong, and your feeling doesn't match reality. Not enough examples. There's an inherent problem with low-probability events. If, for example, terrorism almost never happens, it's really hard to judge the efficacy of counter-terrorist measures. This is why you keep sacrificing virgins, and why your unicorn defenses are working just great. There aren't enough examples of failures.

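That point is statistical at heart, and a tiny simulation makes it vivid. The base rate and effectiveness figures below are invented; the punch line is that at very low base rates, a useless defense and an effective one generate the same evidence: nothing happens either way.

```python
import random

# Why rare events make defenses hard to evaluate: a sketch with invented
# numbers (not data from the talk). If attacks almost never happen, a
# useless "unicorn defense" and a genuinely effective one produce the
# same observable evidence -- zero attacks -- almost every time.
random.seed(1)

BASE_RATE = 1e-4  # assumed chance of an attempted attack in a given year
YEARS = 50        # observation window

def observed_attacks(effectiveness: float) -> int:
    """Attacks seen over the window; the defense blocks each attempt
    with probability `effectiveness`."""
    return sum(random.random() < BASE_RATE * (1 - effectiveness)
               for _ in range(YEARS))

print("unicorn defense (0% effective):", observed_attacks(0.0))
print("real defense (90% effective): ", observed_attacks(0.9))
# Both almost certainly print 0 -- no failures to learn from either way.
```
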
There are also feelings that cloud the issues -- the cognitive biases I talked about earlier, fears, folk beliefs -- basically, an inadequate model of reality.

So let me complicate things. I have feeling and reality. I want to add a third element: I want to add model. Feeling and model are in our head; reality is the outside world. It doesn't change; it's real. Feeling is based on our intuition. Model is based on reason. That's basically the difference.

In a primitive and simple world, there's really no reason for a model, because feeling is close to reality. You don't need a model. But in a modern and complex world, you need models to understand a lot of the risks we face. There's no feeling about germs; you need a model to understand them. So this model is an intelligent representation of reality. It's, of course, limited by science and by technology. We couldn't have a germ theory of disease before we invented the microscope to see germs. The model is limited by our cognitive biases. But it has the ability to override our feelings.

Where do we get these models? We get them from others. We get them from religion, from culture, from teachers and elders. A couple of years ago, I was in South Africa on safari. The tracker I was with grew up in Kruger National Park. He had some very complex models of how to survive. And it depended on whether you were attacked by a lion or a leopard or a rhino or an elephant -- when you had to run away, when you couldn't run away, when you had to climb a tree, when you could never climb a tree. I would have died in a day, but he was born there, and he understood how to survive. I was born in New York City. I could have taken him to New York, and he would have died in a day.

(Laughter)

Because we had different models based on our different experiences.

Models can come from the media, from our elected officials. Think of models of terrorism, child kidnapping, airline safety, car safety. Models can come from industry. The two I'm following are surveillance cameras and ID cards; quite a lot of our computer security models come from there. A lot of models come from science. Health models are a great example. Think of cancer, bird flu, swine flu, SARS. All of our feelings of security about those diseases come from models given to us, really, by science filtered through the media.

So models can change. Models are not static. As we become more comfortable in our environments, our model can move closer to our feelings. An example might be electricity: if you go back 100 years, when electricity was first becoming common, there were a lot of fears about it. There were people who were afraid to push doorbells, because there was electricity in there, and that was dangerous. For us, we're very facile around electricity. We change light bulbs without even thinking about it. Our model of security around electricity is something we were born into. It didn't change as we were growing up. And we're good at it.

Or think of the risks on the Internet across generations -- how your parents approach Internet security, versus how you do, versus how our kids will. Models eventually fade into the background. "Intuitive" is just another word for familiar. So when your model is close to reality and it converges with your feelings, you often don't know it's there.

A nice example of this came last year, with swine flu. When swine flu first appeared, the initial news caused a lot of overreaction. Now it had a name, which made it scarier than the regular flu, even though the regular flu was more deadly. And people thought doctors should be able to deal with it, so there was that feeling of lack of control. And those two things made the risk seem greater than it was. As the novelty wore off and the months went by, there was some amount of tolerance; people got used to it. There was no new data, but there was less fear. By autumn, people thought the doctors should have solved this already. And there's kind of a bifurcation: people had to choose between fear and acceptance -- actually, fear and indifference -- and they kind of chose suspicion. And when the vaccine appeared last winter, there were a lot of people -- a surprising number -- who refused to get it. It's a nice example of how people's feelings of security change, how their model changes, sort of wildly, with no new information, with no new input. This kind of thing happens a lot.

I'm going to give you one more complication. We have feeling, model, reality. I have a very relativistic view of security: I think it depends on the observer. And most security decisions have a variety of people involved. Stakeholders with specific trade-offs will try to influence the decision. I call that their agenda. And you see agendas -- this is marketing, this is politics -- trying to convince you to have one model versus another, trying to convince you to ignore a model and trust your feelings, marginalizing people with models you don't like. This is not uncommon.

An example, a great example, is the risk of smoking. The history of the smoking risk over the past 50 years shows how a model changes, and it also shows how an industry fights against a model it doesn't like. Compare that to the secondhand smoke debate -- probably about 20 years behind. Think about seat belts. When I was a kid, no one wore a seat belt. Nowadays, no kid will let you drive if you're not wearing a seat belt. Compare that to the airbag debate -- probably about 30 years behind. All examples of models changing.

What we learn is that changing models is hard. Models are hard to dislodge. If they equal your feelings, you don't even know you have a model. And there's another cognitive bias I'll call confirmation bias, where we tend to accept data that confirms our beliefs and reject data that contradicts our beliefs. So evidence against our model, we're likely to ignore, even if it's compelling. It has to get very compelling before we'll pay attention.

New models that span long periods of time are hard to accept. Global warming is a great example. We're terrible at models that span 80 years. We can manage the next harvest. We can often manage until our kids grow up. But 80 years, we're just not good at. So it's a very hard model to accept. We can have both models in our head simultaneously -- that problem where we hold two conflicting beliefs at the same time: cognitive dissonance.

Eventually, the new model will replace the old model. Strong feelings can create a model. September 11th created a security model in a lot of people's heads. Personal experiences with crime can do it too, or a personal health scare, or a health scare in the news. You'll see these called "flashbulb events" by psychiatrists. They can create a model instantaneously, because they're very emotive.

So in the technological world, we don't have experience to judge models. And we rely on others. We rely on proxies. I mean, this works as long as those others get it right. We rely on government agencies to tell us what pharmaceuticals are safe. I flew here yesterday. I didn't check the airplane. I relied on some other group to determine whether my plane was safe to fly. We're here, and none of us fears the roof is going to collapse on us -- not because we checked, but because we're pretty sure the building codes here are good. It's a model we just accept pretty much by faith. And that's okay.

Now, what we want is for people to get familiar enough with better models -- to have them reflected in their feelings -- to allow them to make security trade-offs. When feeling and model go out of whack, you have two options. One, you can fix people's feelings, directly appeal to feelings. It's manipulation, but it can work. The second, more honest way is to actually fix the model.

Change happens slowly. The smoking debate took 40 years, and that was an easy one. Some of this stuff is hard. Really, though, information seems like our best hope. And I lied. Remember I said feeling, model, reality? I said reality doesn't change. It actually does. We live in a technological world; reality changes all the time. So we might have, for the first time in our species: feeling chases model, model chases reality, reality's moving -- they might never catch up. We don't know. But in the long term, both feeling and reality are important.

And I want to close with two quick stories to illustrate this. 1982 -- I don't know if people will remember this -- there was a short epidemic of Tylenol poisonings in the United States. It's a horrific story: someone took a bottle of Tylenol, put poison in it, closed it up, put it back on the shelf, and someone else bought it and died. This terrified people. There were a couple of copycat attacks. There wasn't any real risk, but people were scared. And this is how the tamper-proof drug industry was invented. Those tamper-proof caps? That came from this. It's complete security theater. As a homework assignment, think of 10 ways to get around it. I'll give you one: a syringe. But it made people feel better. It made their feeling of security more closely match the reality.

Last story: a few years ago, a friend of mine gave birth. I visited her in the hospital. It turns out that when a baby's born now, they put an RFID bracelet on the baby and a corresponding one on the mother, so if anyone other than the mother takes the baby out of the maternity ward, an alarm goes off. I said, "Well, that's kind of neat. I wonder how rampant baby snatching is out of hospitals." I went home and looked it up. It basically never happens. But if you think about it, if you are a hospital, and you need to take a baby away from its mother, out of the room, to run some tests, you'd better have some good security theater, or she's going to rip your arm off.

(Laughter)

So this is important for us -- those of us who design security, who look at security policy, or even look at public policy in ways that affect security: it's not just reality; it's feeling and reality. What's important is that they be about the same: if our feelings match reality, we make better security trade-offs. Thank you.

(Applause)

Title: The security mirage
Speaker: Bruce Schneier
Description: The feeling of security and the reality of security don't always match, says computer-security expert Bruce Schneier. At TEDxPSU, he explains why we spend billions addressing news story risks, like the "security theater" now playing at your local airport, while neglecting more probable risks -- and how we can break this pattern.
Video Language: English
Team: closed TED
Project: TEDTalks
Duration: 20:44