0:00:23.000,0:00:25.000 So security is two different things: 0:00:25.000,0:00:27.736 it's a feeling, and it's a reality. 0:00:27.976,0:00:29.310 And they're different. 0:00:29.310,0:00:31.000 You could feel secure 0:00:31.000,0:00:33.000 even if you're not. 0:00:33.000,0:00:35.000 And you can be secure 0:00:35.000,0:00:37.000 even if you don't feel it. 0:00:37.000,0:00:39.000 Really, we have two separate concepts 0:00:39.000,0:00:41.000 mapped onto the same word. 0:00:41.000,0:00:43.000 And what I want to do in this talk 0:00:43.000,0:00:45.000 is to split them apart -- 0:00:45.000,0:00:47.000 figuring out when they diverge 0:00:47.000,0:00:49.000 and how they converge. 0:00:49.000,0:00:51.000 And language is actually a problem here. 0:00:51.000,0:00:53.000 There aren't a lot of good words 0:00:53.000,0:00:56.000 for the concepts[br]we're going to talk about. 0:00:56.000,0:00:58.000 So if you look at security 0:00:58.000,0:01:00.000 from economic terms, 0:01:00.000,0:01:02.000 it's a trade-off. 0:01:02.000,0:01:04.000 Every time you get some security, 0:01:04.000,0:01:06.000 you're always trading off something. 0:01:06.000,0:01:08.000 Whether this is a personal decision -- 0:01:08.000,0:01:10.960 whether you're going to install[br]a burglar alarm in your home -- 0:01:10.960,0:01:14.610 or a national decision -- whether you're[br]going to invade some foreign country -- 0:01:14.610,0:01:16.450 you're going to trade off something, 0:01:16.450,0:01:18.730 either money or time,[br]convenience, capabilities, 0:01:18.730,0:01:21.000 maybe fundamental liberties. 0:01:21.000,0:01:24.000 And the question to ask[br]when you look at a security anything 0:01:24.000,0:01:27.000 is not whether this makes us safer, 0:01:27.000,0:01:30.000 but whether it's worth the trade-off. 0:01:30.000,0:01:32.000 You've heard in the past several years, 0:01:32.000,0:01:34.780 the world is safer because[br]Saddam Hussein is not in power. 0:01:34.780,0:01:37.210 That might be true,[br]but it's not terribly relevant. 0:01:37.210,0:01:40.000 The question is, was it worth it? 0:01:40.000,0:01:43.000 And you can make your own decision, 0:01:43.000,0:01:45.730 and then you'll decide[br]whether the invasion was worth it. 0:01:45.730,0:01:47.610 That's how you think about security -- 0:01:47.610,0:01:49.000 in terms of the trade-off. 0:01:49.000,0:01:52.000 Now there's often no right or wrong here. 0:01:52.000,0:01:54.240 Some of us have[br]a burglar alarm system at home, 0:01:54.240,0:01:56.000 and some of us don't. 0:01:56.000,0:01:58.000 And it'll depend on where we live, 0:01:58.000,0:02:00.000 whether we live alone or have a family, 0:02:00.000,0:02:02.000 how much cool stuff we have, 0:02:02.000,0:02:04.000 how much we're willing to accept 0:02:04.000,0:02:06.000 the risk of theft. 0:02:06.000,0:02:08.000 In politics also, 0:02:08.000,0:02:10.000 there are different opinions. 0:02:10.000,0:02:12.000 A lot of times, these trade-offs 0:02:12.000,0:02:14.000 are about more than just security, 0:02:14.000,0:02:16.000 and I think that's really important. 0:02:16.000,0:02:18.000 Now people have a natural intuition 0:02:18.000,0:02:20.000 about these trade-offs. 0:02:20.000,0:02:22.000 We make them every day -- 0:02:22.000,0:02:24.000 last night in my hotel room, 0:02:24.000,0:02:26.000 when I decided to double-lock the door, 0:02:26.000,0:02:28.000 or you in your car when you drove here, 0:02:28.000,0:02:30.000 when we go eat lunch 0:02:30.000,0:02:33.000 and decide the food's not poison[br]and we'll eat it. 
0:02:33.000,0:02:35.000 We make these trade-offs[br]again and again, 0:02:35.000,0:02:37.000 multiple times a day. 0:02:37.000,0:02:39.000 We often don't even notice them. 0:02:39.000,0:02:41.380 They're just part of being alive;[br]we all do it. 0:02:41.380,0:02:44.000 Every species does it. 0:02:44.000,0:02:46.000 Imagine a rabbit in a field, eating grass, 0:02:46.000,0:02:49.000 and the rabbit's going to see a fox. 0:02:49.000,0:02:51.000 That rabbit will make[br]a security trade-off: 0:02:51.000,0:02:53.000 "Should I stay, or should I flee?" 0:02:53.000,0:02:55.000 And if you think about it, 0:02:55.000,0:02:58.000 the rabbits that are good[br]at making that trade-off 0:02:58.000,0:03:00.000 will tend to live and reproduce, 0:03:00.000,0:03:02.000 and the rabbits that are bad at it 0:03:02.000,0:03:04.000 will get eaten or starve. 0:03:04.000,0:03:06.000 So you'd think 0:03:06.000,0:03:09.000 that we, as a successful species[br]on the planet -- 0:03:09.000,0:03:11.000 you, me, everybody -- 0:03:11.000,0:03:14.000 would be really good[br]at making these trade-offs. 0:03:14.000,0:03:16.000 Yet it seems, again and again, 0:03:16.000,0:03:19.000 that we're hopelessly bad at it. 0:03:19.000,0:03:22.000 And I think that's a fundamentally[br]interesting question. 0:03:22.000,0:03:24.000 I'll give you the short answer. 0:03:24.000,0:03:26.470 The answer is, we respond[br]to the feeling of security 0:03:26.470,0:03:29.000 and not the reality. 0:03:29.000,0:03:31.849 Now most of the time, that works. 0:03:33.000,0:03:34.511 Most of the time, 0:03:34.511,0:03:36.997 feeling and reality are the same. 0:03:38.000,0:03:40.000 Certainly that's true 0:03:40.000,0:03:43.000 for most of human prehistory. 0:03:43.000,0:03:45.666 We've developed this ability 0:03:45.666,0:03:48.000 because it makes evolutionary sense. 0:03:49.267,0:03:50.560 One way to think of it 0:03:50.560,0:03:52.000 is that we're highly optimized 0:03:52.000,0:03:54.000 for risk decisions 0:03:54.000,0:03:57.000 that are endemic to living[br]in small family groups 0:03:57.000,0:04:00.000 in the East African highlands[br]in 100,000 B.C. 0:04:00.000,0:04:03.000 2010 New York, not so much. 0:04:06.485,0:04:09.560 Now there are several biases[br]in risk perception. 0:04:09.560,0:04:11.250 A lot of good experiments in this. 0:04:11.250,0:04:15.020 And you can see certain biases[br]that come up again and again. 0:04:15.020,0:04:16.780 So I'll give you four. 0:04:16.780,0:04:20.084 We tend to exaggerate[br]spectacular and rare risks 0:04:20.084,0:04:21.563 and downplay common risks -- 0:04:21.563,0:04:23.637 so flying versus driving. 0:04:24.412,0:04:26.465 The unknown is perceived 0:04:26.465,0:04:28.207 to be riskier than the familiar. 0:04:31.047,0:04:32.492 One example would be, 0:04:32.492,0:04:35.210 people fear kidnapping by strangers 0:04:35.210,0:04:38.665 when the data shows that kidnapping[br]by relatives is much more common. 0:04:38.665,0:04:40.586 This is for children. 0:04:40.586,0:04:42.862 Third, personified risks 0:04:42.862,0:04:45.830 are perceived to be greater[br]than anonymous risks -- 0:04:45.830,0:04:48.458 so Bin Laden is scarier[br]because he has a name. 0:04:49.713,0:04:51.138 And the fourth 0:04:51.138,0:04:54.095 is people underestimate risks 0:04:54.095,0:04:55.942 in situations they do control 0:04:55.942,0:04:59.046 and overestimate them[br]in situations they don't control. 0:04:59.046,0:05:01.855 So once you take up skydiving or smoking, 0:05:01.855,0:05:04.210 you downplay the risks. 
0:05:04.210,0:05:07.692 If a risk is thrust upon you[br]-- terrorism was a good example -- 0:05:07.692,0:05:10.990 you'll overplay it because you don't feel[br]like it's in your control. 0:05:10.990,0:05:15.130 There are a bunch of other[br]cognitive biases 0:05:15.130,0:05:16.900 that affect our risk decisions. 0:05:17.780,0:05:20.240 There's the availability heuristic, 0:05:20.240,0:05:22.096 which basically means 0:05:22.096,0:05:24.720 we estimate the probability of something 0:05:24.720,0:05:28.100 by how easy it is[br]to bring instances of it to mind. 0:05:29.560,0:05:31.187 So you can imagine how that works. 0:05:31.187,0:05:34.830 If you hear a lot about tiger attacks,[br]there must be a lot of tigers around. 0:05:34.830,0:05:38.137 You don't hear about lion attacks,[br]there aren't a lot of lions around. 0:05:38.137,0:05:40.722 This works until you invent newspapers. 0:05:40.722,0:05:42.314 Because what newspapers do 0:05:42.314,0:05:44.890 is they repeat again and again 0:05:44.890,0:05:46.409 rare risks. 0:05:46.409,0:05:49.209 I tell people, if it's in the news,[br]don't worry about it. 0:05:49.209,0:05:50.490 Because by definition, 0:05:50.490,0:05:53.260 news is something[br]that almost never happens. 0:05:53.260,0:05:55.297 (Laughter) 0:05:55.297,0:05:58.600 When something is so common,[br]it's no longer news -- 0:05:58.600,0:06:00.656 car crashes, domestic violence -- 0:06:00.656,0:06:02.909 those are the risks you worry about. 0:06:03.375,0:06:05.416 We're also a species of storytellers. 0:06:05.416,0:06:07.849 We respond to stories more than data. 0:06:07.849,0:06:10.574 And there's some basic[br]innumeracy going on. 0:06:10.574,0:06:13.649 I mean, the joke[br]"One, Two, Three, Many" is kind of right. 0:06:13.804,0:06:16.222 We're really good at small numbers. 0:06:16.222,0:06:18.443 One mango, two mangoes, three mangoes, 0:06:18.443,0:06:20.313 10,000 mangoes, 100,000 mangoes -- 0:06:20.313,0:06:23.386 it's still more mangoes than[br]you can eat before they rot. 0:06:23.386,0:06:26.530 So one half, one quarter, one fifth[br]-- we're good at that. 0:06:26.530,0:06:28.714 One in a million, one in a billion -- 0:06:28.714,0:06:30.769 they're both almost never. 0:06:31.334,0:06:33.232 So we have trouble with the risks 0:06:33.232,0:06:34.888 that aren't very common. 0:06:34.888,0:06:37.390 And what these cognitive biases do 0:06:37.390,0:06:39.889 is they act as filters[br]between us and reality. 0:06:41.152,0:06:42.753 And the result 0:06:42.753,0:06:45.051 is that feeling and reality[br]get out of whack, 0:06:45.051,0:06:46.770 they get different. 0:06:47.034,0:06:50.896 Now you either have a feeling[br]-- you feel more secure than you are. 0:06:50.896,0:06:52.714 There's a false sense of security. 0:06:52.714,0:06:53.837 Or the other way, 0:06:53.837,0:06:56.460 and that's a false sense of insecurity. 0:06:56.460,0:06:58.878 I write a lot about "security theater," 0:06:58.878,0:07:02.236 which are products[br]that make people feel secure, 0:07:02.236,0:07:04.309 but don't actually do anything. 0:07:04.309,0:07:06.814 There's no real word[br]for stuff that makes us secure, 0:07:06.814,0:07:08.891 but doesn't make us feel secure. 0:07:08.891,0:07:11.870 Maybe it's what the CIA's supposed to do[br]for us. 0:07:12.784,0:07:15.057 So back to economics. 
0:07:15.057,0:07:18.259 If economics, if the market,[br]drives security, 0:07:18.259,0:07:21.260 and if people make trade-offs 0:07:21.260,0:07:23.240 based on the feeling of security, 0:07:23.240,0:07:26.977 then the smart thing for companies to do 0:07:26.977,0:07:28.583 for the economic incentives 0:07:28.583,0:07:30.886 is to make people feel secure. 0:07:32.060,0:07:33.782 And there are two ways to do this. 0:07:33.782,0:07:36.531 One, you can make people actually secure 0:07:36.531,0:07:38.360 and hope they notice. 0:07:38.360,0:07:41.060 Or two, you can make people[br]just feel secure 0:07:41.060,0:07:43.070 and hope they don't notice. 0:07:45.569,0:07:48.281 So what makes people notice? 0:07:49.022,0:07:50.701 Well, a couple of things: 0:07:50.701,0:07:53.037 understanding of the security, 0:07:53.037,0:07:54.500 of the risks, the threats, 0:07:54.500,0:07:56.474 the countermeasures, how they work. 0:07:56.474,0:07:57.992 If you know stuff, 0:07:57.992,0:08:01.848 you're more likely to have[br]your feelings match reality. 0:08:02.558,0:08:05.560 Enough real-world examples help. 0:08:05.560,0:08:08.480 Now we all know the crime rate[br]in our neighborhood, 0:08:08.480,0:08:11.000 because we live there,[br]and we get a feeling about it 0:08:11.000,0:08:13.240 that basically matches reality. 0:08:14.879,0:08:16.862 Security theater's exposed 0:08:16.862,0:08:19.350 when it's obvious[br]that it's not working properly. 0:08:21.078,0:08:23.460 Okay, so what makes people not notice? 0:08:23.460,0:08:25.878 Well, a poor understanding. 0:08:25.878,0:08:28.973 If you don't understand the risks,[br]you don't understand the costs, 0:08:28.973,0:08:31.498 you're likely to get the trade-off wrong, 0:08:31.498,0:08:34.036 and your feeling doesn't match reality. 0:08:34.036,0:08:36.559 Not enough examples. 0:08:36.559,0:08:38.127 There's an inherent problem 0:08:38.127,0:08:40.258 with low-probability events. 0:08:40.284,0:08:41.759 If, for example, 0:08:41.759,0:08:44.142 terrorism almost never happens, 0:08:44.142,0:08:46.000 it's really hard to judge 0:08:46.000,0:08:49.250 the efficacy of counter-terrorist[br]measures. 0:08:50.489,0:08:53.300 This is why you keep sacrificing virgins, 0:08:53.300,0:08:56.060 and why your unicorn defenses[br]are working just great. 0:08:56.060,0:08:58.746 There aren't enough examples of failures. 0:09:00.079,0:09:03.130 Also, feelings that are clouding[br]the issues -- 0:09:03.130,0:09:05.550 the cognitive biases[br]I talked about earlier, 0:09:05.550,0:09:07.450 fears, folk beliefs, 0:09:08.531,0:09:11.456 basically an inadequate model of reality. 0:09:13.330,0:09:15.163 So let me complicate things. 0:09:15.163,0:09:17.420 I have feeling and reality. 0:09:17.420,0:09:19.930 I want to add a third element.[br]I want to add model. 0:09:20.711,0:09:22.856 Feeling and model are in our head; 0:09:22.856,0:09:24.755 reality is the outside world. 0:09:24.755,0:09:26.460 It doesn't change; it's real. 0:09:27.687,0:09:29.512 So feeling is based on our intuition. 0:09:29.512,0:09:31.827 Model is based on reason. 0:09:32.209,0:09:33.972 That's basically the difference. 0:09:33.972,0:09:35.999 In a primitive and simple world, 0:09:35.999,0:09:38.040 there's really no reason for a model, 0:09:40.096,0:09:42.480 because feeling is close to reality. 0:09:42.480,0:09:44.327 You don't need a model. 0:09:44.327,0:09:46.559 But in a modern and complex world, 0:09:46.559,0:09:48.380 you need models 0:09:48.380,0:09:50.879 to understand a lot of the risks we face. 
0:09:52.317,0:09:54.924 There's no feeling about germs. 0:09:54.924,0:09:57.275 You need a model to understand them. 0:09:57.275,0:09:58.971 So this model 0:09:58.971,0:10:01.563 is an intelligent representation[br]of reality. 0:10:01.563,0:10:04.614 It's, of course, limited by science, 0:10:04.614,0:10:06.566 by technology. 0:10:08.108,0:10:10.304 We couldn't have a germ theory of disease 0:10:10.304,0:10:12.892 before we invented[br]the microscope to see them. 0:10:13.770,0:10:16.308 It's limited by our cognitive biases. 0:10:17.701,0:10:19.280 But it has the ability 0:10:19.280,0:10:21.400 to override our feelings. 0:10:21.400,0:10:24.450 Where do we get these models?[br]We get them from others. 0:10:24.450,0:10:27.499 We get them from religion, from culture, 0:10:27.499,0:10:29.492 teachers, elders. 0:10:29.492,0:10:30.796 A couple years ago, 0:10:30.796,0:10:33.165 I was in South Africa on safari. 0:10:33.165,0:10:36.115 The tracker I was with[br]grew up in Kruger National Park. 0:10:36.115,0:10:39.143 He had some very complex models[br]of how to survive. 0:10:39.143,0:10:41.004 And it depended on whether you were attacked 0:10:41.004,0:10:43.450 by a lion or a leopard[br]or a rhino or an elephant -- 0:10:43.450,0:10:46.376 and when you had to run away,[br]and when you couldn't run away, 0:10:46.376,0:10:49.740 and when you had to climb a tree --[br]when you could never climb a tree. 0:10:49.740,0:10:51.666 I would have died in a day, 0:10:51.666,0:10:53.518 but he was born there, 0:10:53.518,0:10:55.883 and he understood how to survive. 0:10:55.883,0:10:57.540 I was born in New York City. 0:10:57.540,0:11:00.787 I could have taken him to New York,[br]and he would have died in a day. 0:11:00.787,0:11:02.790 (Laughter) 0:11:02.790,0:11:04.478 Because we had different models 0:11:04.478,0:11:06.875 based on our different experiences. 0:11:07.873,0:11:09.916 Models can come from the media, 0:11:09.916,0:11:12.261 from our elected officials. 0:11:13.137,0:11:15.800 Think of models of terrorism, 0:11:15.800,0:11:18.120 child kidnapping, 0:11:18.120,0:11:21.108 airline safety, car safety. 0:11:21.108,0:11:23.097 Models can come from industry. 0:11:24.560,0:11:27.090 The two I'm following[br]are surveillance cameras 0:11:27.090,0:11:28.606 and ID cards -- 0:11:28.606,0:11:31.882 quite a lot of our computer security[br]models come from there. 0:11:31.882,0:11:34.241 A lot of models come from science. 0:11:34.241,0:11:36.076 Health models are a great example. 0:11:36.076,0:11:39.250 Think of cancer, of bird flu,[br]swine flu, SARS. 0:11:39.916,0:11:42.316 All of our feelings of security 0:11:42.316,0:11:44.320 about those diseases 0:11:44.320,0:11:45.761 come from models 0:11:45.761,0:11:48.839 given to us, really, by science filtered[br]through the media. 0:11:51.001,0:11:53.490 So models can change. 0:11:53.490,0:11:55.479 Models are not static. 0:11:55.479,0:11:58.434 As we become more comfortable[br]in our environments, 0:11:58.434,0:12:01.658 our model can move closer to our feelings. 0:12:03.630,0:12:05.961 So an example might be, 0:12:05.961,0:12:07.992 if you go back 100 years, 0:12:07.992,0:12:10.578 when electricity was first[br]becoming common, 0:12:10.578,0:12:12.691 there were a lot of fears about it. 0:12:12.691,0:12:15.600 I mean, there were people[br]who were afraid to push doorbells, 0:12:15.600,0:12:18.708 because there was electricity in there,[br]and that was dangerous. 0:12:18.708,0:12:21.197 For us, we're very facile[br]around electricity. 
0:12:21.197,0:12:22.610 We change light bulbs 0:12:22.610,0:12:24.828 without even thinking about it. 0:12:24.828,0:12:28.178 Our model of security around electricity 0:12:28.178,0:12:30.780 is something we were born into. 0:12:31.757,0:12:34.076 It hasn't changed as we were growing up. 0:12:34.076,0:12:36.225 And we're good at it. 0:12:36.870,0:12:38.675 Or think of the risks 0:12:38.675,0:12:41.250 on the Internet across generations -- 0:12:41.396,0:12:44.044 how your parents approach[br]Internet security, 0:12:44.044,0:12:45.503 versus how you do, 0:12:45.503,0:12:47.519 versus how our kids will. 0:12:47.750,0:12:50.320 Models eventually fade[br]into the background. 0:12:52.283,0:12:54.812 Intuitive is just another word[br]for familiar. 0:12:55.660,0:12:58.233 So as your model is close to reality, 0:12:58.233,0:12:59.878 and it converges with feelings, 0:12:59.878,0:13:01.959 you often don't know it's there. 0:13:03.092,0:13:04.541 So a nice example of this 0:13:04.541,0:13:06.572 came from last year and swine flu. 0:13:07.753,0:13:09.916 When swine flu first appeared, 0:13:09.916,0:13:12.903 the initial news caused[br]a lot of overreaction. 0:13:13.449,0:13:15.405 Now it had a name, 0:13:15.405,0:13:17.693 which made it scarier[br]than the regular flu, 0:13:17.693,0:13:19.769 even though the regular flu[br]was more deadly. 0:13:19.769,0:13:22.911 And people thought doctors[br]should be able to deal with it. 0:13:23.235,0:13:25.820 So there was that feeling[br]of lack of control. 0:13:25.820,0:13:27.175 And those two things 0:13:27.175,0:13:28.933 made the risk more than it was. 0:13:28.933,0:13:32.320 As the novelty wore off,[br]the months went by, 0:13:32.320,0:13:33.988 there was some amount of tolerance, 0:13:33.988,0:13:35.636 people got used to it. 0:13:36.199,0:13:38.964 There was no new data,[br]but there was less fear. 0:13:38.964,0:13:40.674 By autumn, 0:13:40.674,0:13:42.724 people thought 0:13:42.724,0:13:45.458 the doctors should have solved this[br]already. 0:13:45.458,0:13:47.215 And there's kind of a bifurcation -- 0:13:47.215,0:13:48.588 people had to choose 0:13:48.588,0:13:51.430 between fear and acceptance -- 0:13:54.120,0:13:55.867 actually fear and indifference -- 0:13:55.867,0:13:58.274 they kind of chose suspicion. 0:13:58.923,0:14:01.786 And when the vaccine appeared last winter, 0:14:01.786,0:14:04.418 there were a lot of people[br]-- a surprising number -- 0:14:04.418,0:14:06.578 who refused to get it -- 0:14:08.420,0:14:10.140 a nice example 0:14:10.140,0:14:13.929 of how people's feelings of security[br]change, how their model changes, 0:14:13.929,0:14:15.603 sort of wildly, 0:14:15.603,0:14:17.436 with no new information, 0:14:17.436,0:14:19.086 with no new input. 0:14:20.217,0:14:22.584 This kind of thing happens a lot. 0:14:22.584,0:14:24.810 I'm going to give one more complication. 0:14:24.810,0:14:27.250 We have feeling, model, reality. 0:14:28.227,0:14:30.902 I have a very relativistic view[br]of security. 0:14:30.902,0:14:33.095 I think it depends on the observer. 0:14:33.095,0:14:35.050 And most security decisions 0:14:35.050,0:14:38.287 have a variety of people involved. 0:14:39.310,0:14:41.229 And stakeholders 0:14:41.229,0:14:43.937 with specific trade-offs 0:14:43.937,0:14:46.261 will try to influence the decision. 0:14:46.261,0:14:48.311 And I call that their agenda. 
0:14:49.339,0:14:50.732 And you see agenda -- 0:14:50.732,0:14:52.837 this is marketing, this is politics -- 0:14:52.837,0:14:56.235 trying to convince you to have[br]one model versus another, 0:14:56.235,0:14:58.325 trying to convince you to ignore a model 0:14:58.325,0:15:00.894 and trust your feelings, 0:15:00.894,0:15:04.110 marginalizing people[br]with models you don't like. 0:15:04.600,0:15:06.515 This is not uncommon. 0:15:07.234,0:15:10.538 An example, a great example,[br]is the risk of smoking. 0:15:11.521,0:15:14.427 In the history of the past 50 years,[br]the smoking risk 0:15:14.427,0:15:16.586 shows how a model changes, 0:15:16.586,0:15:18.968 and it also shows[br]how an industry fights against 0:15:18.968,0:15:20.680 a model it doesn't like. 0:15:21.872,0:15:24.962 Compare that[br]to the secondhand smoke debate -- 0:15:24.962,0:15:27.455 probably about 20 years behind. 0:15:27.455,0:15:29.542 Think about seat belts. 0:15:29.542,0:15:31.696 When I was a kid, no one wore a seat belt. 0:15:31.696,0:15:33.525 Nowadays, no kid will let you drive 0:15:33.525,0:15:35.900 if you're not wearing a seat belt. 0:15:36.570,0:15:39.210 Compare that to the airbag debate -- 0:15:39.210,0:15:41.646 probably about 30 years behind. 0:15:42.320,0:15:44.790 All examples of models changing. 0:15:46.942,0:15:50.168 What we learn is that[br]changing models is hard. 0:15:50.168,0:15:52.699 Models are hard to dislodge. 0:15:52.699,0:15:54.581 If they equal your feelings, 0:15:54.581,0:15:56.518 you don't even know you have a model. 0:15:57.343,0:15:59.250 And there's another cognitive bias 0:15:59.250,0:16:01.243 I'll call confirmation bias, 0:16:01.243,0:16:03.310 where we tend to accept data 0:16:03.310,0:16:05.688 that confirms our beliefs 0:16:05.688,0:16:08.496 and reject data[br]that contradicts our beliefs. 0:16:13.833,0:16:16.236 So evidence against our model, 0:16:16.236,0:16:18.942 we're likely to ignore,[br]even if it's compelling. 0:16:18.942,0:16:21.905 It has to get very compelling[br]before we'll pay attention. 0:16:22.975,0:16:25.912 New models that extend[br]over long periods of time are hard. 0:16:25.912,0:16:27.530 Global warming is a great example. 0:16:27.530,0:16:28.887 We're terrible 0:16:28.887,0:16:30.945 at models that span 80 years. 0:16:31.187,0:16:33.172 We can do the next harvest. 0:16:33.172,0:16:35.987 We can often do until our kids grow up. 0:16:35.987,0:16:38.524 But 80 years, we're just not good at. 0:16:39.150,0:16:41.436 So it's a very hard model to accept. 0:16:42.255,0:16:45.506 We can have both models[br]in our head simultaneously -- 0:16:46.010,0:16:49.004 that kind of problem 0:16:49.649,0:16:52.529 where we're holding both beliefs together, 0:16:52.830,0:16:54.584 the cognitive dissonance. 0:16:54.584,0:16:55.909 Eventually, 0:16:55.909,0:16:58.211 the new model will replace the old model. 0:16:58.211,0:17:01.048 Strong feelings can create a model. 0:17:01.758,0:17:04.748 September 11th created a security model 0:17:04.750,0:17:06.670 in a lot of people's heads. 0:17:06.670,0:17:10.223 Also, personal experiences[br]with crime can do it, 0:17:10.223,0:17:11.775 a personal health scare, 0:17:11.775,0:17:14.294 a health scare in the news. 0:17:14.294,0:17:16.259 You'll see these called flashbulb events 0:17:16.259,0:17:18.007 by psychiatrists. 0:17:18.603,0:17:20.843 They can create a model instantaneously, 0:17:20.843,0:17:23.050 because they're very emotive. 
0:17:23.800,0:17:25.694 So in the technological world, 0:17:25.694,0:17:27.907 we don't have experience 0:17:27.907,0:17:29.662 to judge models. 0:17:29.662,0:17:32.363 And we rely on others. We rely on proxies. 0:17:32.363,0:17:35.638 I mean, this works[br]as long as it's the correct others. 0:17:35.638,0:17:38.330 We rely on government agencies 0:17:38.330,0:17:41.650 to tell us what pharmaceuticals are safe. 0:17:42.698,0:17:44.475 I flew here yesterday. 0:17:44.475,0:17:47.002 I didn't check the airplane. 0:17:47.002,0:17:49.155 I relied on some other group 0:17:49.155,0:17:51.664 to determine whether[br]my plane was safe to fly. 0:17:51.664,0:17:55.196 We're here, none of us fear[br]the roof is going to collapse on us, 0:17:55.196,0:17:57.184 not because we checked, 0:17:57.184,0:17:59.390 but because we're pretty sure 0:17:59.390,0:18:01.534 the building codes here are good. 0:18:02.890,0:18:05.000 It's a model we just accept 0:18:05.000,0:18:07.218 pretty much by faith. 0:18:07.637,0:18:09.593 And that's okay. 0:18:12.016,0:18:14.290 Now, what we want 0:18:14.290,0:18:16.378 is people to get familiar enough 0:18:16.378,0:18:18.274 with better models -- 0:18:18.274,0:18:20.363 have it reflected in their feelings -- 0:18:20.363,0:18:23.030 to allow them to make security trade-offs. 0:18:24.011,0:18:26.217 Now when these go out of whack, 0:18:26.217,0:18:27.897 you have two options. 0:18:27.897,0:18:30.225 One, you can fix people's feelings, 0:18:30.225,0:18:32.245 directly appeal to feelings. 0:18:32.245,0:18:34.690 It's manipulation, but it can work. 0:18:35.456,0:18:37.231 The second, more honest way 0:18:37.231,0:18:39.629 is to actually fix the model. 0:18:40.918,0:18:42.956 Change happens slowly. 0:18:42.956,0:18:45.504 The smoking debate took 40 years, 0:18:45.504,0:18:47.480 and that was an easy one. 0:18:49.703,0:18:51.762 Some of this stuff is hard. 0:18:51.762,0:18:53.037 I mean really, though, 0:18:53.037,0:18:55.310 information seems like our best hope. 0:18:55.310,0:18:56.836 And I lied. 0:18:56.836,0:18:59.623 Remember I said feeling, model, reality; 0:18:59.623,0:19:02.036 I said reality doesn't change.[br]It actually does. 0:19:02.036,0:19:03.917 We live in a technological world; 0:19:03.917,0:19:06.317 reality changes all the time. 0:19:07.176,0:19:09.882 So we might have[br]-- for the first time in our species -- 0:19:09.882,0:19:13.170 feeling chases model,[br]model chases reality, reality's moving -- 0:19:13.170,0:19:15.360 they might never catch up. 0:19:16.502,0:19:18.297 We don't know. 0:19:19.860,0:19:21.710 But in the long term, 0:19:21.710,0:19:23.623 both feeling and reality are important. 0:19:23.623,0:19:26.890 And I want to close with two quick stories[br]to illustrate this. 0:19:26.890,0:19:29.496 1982 -- I don't know if people[br]will remember this -- 0:19:29.496,0:19:31.571 there was a short epidemic 0:19:31.571,0:19:33.706 of Tylenol poisonings[br]in the United States. 0:19:33.706,0:19:36.767 It's a horrific story.[br]Someone took a bottle of Tylenol, 0:19:36.767,0:19:40.368 put poison in it, closed it up,[br]put it back on the shelf. 0:19:40.368,0:19:42.333 Someone else bought it and died. 0:19:42.333,0:19:43.982 This terrified people. 0:19:43.982,0:19:45.972 There were a couple of copycat attacks. 0:19:45.972,0:19:48.988 There wasn't any real risk,[br]but people were scared. 0:19:48.988,0:19:50.311 And this is how 0:19:50.311,0:19:52.944 the tamper-proof drug industry[br]was invented. 
0:19:52.944,0:19:55.101 Those tamper-proof caps?[br]That came from this. 0:19:55.101,0:19:56.804 It's complete security theater. 0:19:56.804,0:19:59.684 As a homework assignment,[br]think of 10 ways to get around it. 0:19:59.684,0:20:01.565 I'll give you one, a syringe. 0:20:01.565,0:20:04.493 But it made people feel better. 0:20:05.024,0:20:06.929 It made their feeling of security 0:20:06.929,0:20:08.590 more match the reality. 0:20:09.632,0:20:12.466 Last story, a few years ago,[br]a friend of mine gave birth. 0:20:12.466,0:20:13.921 I visit her in the hospital. 0:20:13.921,0:20:16.095 It turns out when a baby's born now, 0:20:16.095,0:20:17.929 they put an RFID bracelet on the baby, 0:20:17.929,0:20:19.778 put a corresponding one on the mother, 0:20:19.778,0:20:23.400 so if anyone other than the mother[br]takes the baby out of the maternity ward, 0:20:23.400,0:20:24.340 an alarm goes off. 0:20:24.340,0:20:26.195 I said, "Well, that's kind of neat. 0:20:26.195,0:20:28.808 I wonder how rampant baby snatching is 0:20:28.808,0:20:30.000 out of hospitals." 0:20:30.000,0:20:31.612 I go home, I look it up. 0:20:31.612,0:20:34.000 It basically never happens. 0:20:34.616,0:20:36.000 But if you think about it, 0:20:36.000,0:20:38.000 if you are a hospital, 0:20:38.000,0:20:40.410 and you need to take a baby[br]away from its mother, 0:20:40.410,0:20:42.070 out of the room to run some tests, 0:20:42.070,0:20:44.130 you better have some[br]good security theater, 0:20:44.130,0:20:46.000 or she's going to rip your arm off. 0:20:46.000,0:20:48.000 (Laughter) 0:20:48.000,0:20:50.000 So it's important for us, 0:20:50.000,0:20:52.000 those of us who design security, 0:20:52.000,0:20:55.000 who look at security policy, 0:20:55.000,0:20:57.000 or even look at public policy 0:20:57.000,0:20:59.000 in ways that affect security. 0:20:59.000,0:21:02.000 It's not just reality;[br]it's feeling and reality. 0:21:02.000,0:21:04.000 What's important 0:21:04.000,0:21:06.000 is that they be about the same. 0:21:06.000,0:21:08.590 It's important that[br]if our feelings match reality, 0:21:08.590,0:21:10.960 we make better security trade-offs. 0:21:10.960,0:21:12.000 Thank you. 0:21:12.000,0:21:13.525 (Applause)