WEBVTT 00:00:23.000 --> 00:00:25.000 So security is two different things: 00:00:25.000 --> 00:00:27.736 it's a feeling, and it's a reality. 00:00:27.976 --> 00:00:29.310 And they're different. 00:00:29.310 --> 00:00:31.000 You can feel secure 00:00:31.000 --> 00:00:33.000 even if you're not. 00:00:33.000 --> 00:00:35.000 And you can be secure 00:00:35.000 --> 00:00:37.000 even if you don't feel it. 00:00:37.000 --> 00:00:39.000 Really, we have two separate concepts 00:00:39.000 --> 00:00:41.000 mapped onto the same word. 00:00:41.000 --> 00:00:43.000 And what I want to do in this talk 00:00:43.000 --> 00:00:45.000 is to split them apart -- 00:00:45.000 --> 00:00:47.000 figuring out when they diverge 00:00:47.000 --> 00:00:49.000 and how they converge. 00:00:49.000 --> 00:00:51.000 And language is actually a problem here. 00:00:51.000 --> 00:00:53.000 There aren't a lot of good words 00:00:53.000 --> 00:00:56.000 for the concepts we're going to talk about. 00:00:56.000 --> 00:00:58.000 So if you look at security 00:00:58.000 --> 00:01:00.000 in economic terms, 00:01:00.000 --> 00:01:02.000 it's a trade-off. 00:01:02.000 --> 00:01:04.000 Every time you get some security, 00:01:04.000 --> 00:01:06.000 you're always trading off something. 00:01:06.000 --> 00:01:08.000 Whether this is a personal decision -- 00:01:08.000 --> 00:01:10.960 whether you're going to install a burglar alarm in your home -- 00:01:10.960 --> 00:01:14.610 or a national decision -- whether you're going to invade some foreign country -- 00:01:14.610 --> 00:01:16.450 you're going to trade off something, 00:01:16.450 --> 00:01:18.730 either money or time, convenience, capabilities, 00:01:18.730 --> 00:01:21.000 maybe fundamental liberties. 00:01:21.000 --> 00:01:24.000 And the question to ask when you look at a security anything 00:01:24.000 --> 00:01:27.000 is not whether this makes us safer, 00:01:27.000 --> 00:01:30.000 but whether it's worth the trade-off. 00:01:30.000 --> 00:01:32.000 You've heard in the past several years 00:01:32.000 --> 00:01:34.780 that the world is safer because Saddam Hussein is not in power. 00:01:34.780 --> 00:01:37.210 That might be true, but it's not terribly relevant. 00:01:37.210 --> 00:01:40.000 The question is, was it worth it? 00:01:40.000 --> 00:01:43.000 And you can make your own decision, 00:01:43.000 --> 00:01:45.730 and then you'll decide whether the invasion was worth it. 00:01:45.730 --> 00:01:47.610 That's how you think about security -- 00:01:47.610 --> 00:01:49.000 in terms of the trade-off. 00:01:49.000 --> 00:01:52.000 Now there's often no right or wrong here. 00:01:52.000 --> 00:01:54.240 Some of us have a burglar alarm system at home, 00:01:54.240 --> 00:01:56.000 and some of us don't. 00:01:56.000 --> 00:01:58.000 And it'll depend on where we live, 00:01:58.000 --> 00:02:00.000 whether we live alone or have a family, 00:02:00.000 --> 00:02:02.000 how much cool stuff we have, 00:02:02.000 --> 00:02:04.000 how much we're willing to accept 00:02:04.000 --> 00:02:06.000 the risk of theft. 00:02:06.000 --> 00:02:08.000 In politics also, 00:02:08.000 --> 00:02:10.000 there are different opinions. 00:02:10.000 --> 00:02:12.000 A lot of times, these trade-offs 00:02:12.000 --> 00:02:14.000 are about more than just security, 00:02:14.000 --> 00:02:16.000 and I think that's really important. 00:02:16.000 --> 00:02:18.000 Now people have a natural intuition 00:02:18.000 --> 00:02:20.000 about these trade-offs. 
00:02:20.000 --> 00:02:22.000 We make them every day -- 00:02:22.000 --> 00:02:24.000 last night in my hotel room, 00:02:24.000 --> 00:02:26.000 when I decided to double-lock the door, 00:02:26.000 --> 00:02:28.000 or you in your car when you drove here, 00:02:28.000 --> 00:02:30.000 when we go eat lunch 00:02:30.000 --> 00:02:33.000 and decide the food's not poison and we'll eat it. 00:02:33.000 --> 00:02:35.000 We make these trade-offs again and again, 00:02:35.000 --> 00:02:37.000 multiple times a day. 00:02:37.000 --> 00:02:39.000 We often don't even notice them. 00:02:39.000 --> 00:02:41.380 They're just part of being alive; we all do it. 00:02:41.380 --> 00:02:44.000 Every species does it. 00:02:44.000 --> 00:02:46.000 Imagine a rabbit in a field, eating grass, 00:02:46.000 --> 00:02:49.000 and the rabbit's going to see a fox. 00:02:49.000 --> 00:02:51.000 That rabbit will make a security trade-off: 00:02:51.000 --> 00:02:53.000 "Should I stay, or should I flee?" 00:02:53.000 --> 00:02:55.000 And if you think about it, 00:02:55.000 --> 00:02:58.000 the rabbits that are good at making that trade-off 00:02:58.000 --> 00:03:00.000 will tend to live and reproduce, 00:03:00.000 --> 00:03:02.000 and the rabbits that are bad at it 00:03:02.000 --> 00:03:04.000 will get eaten or starve. 00:03:04.000 --> 00:03:06.000 So you'd think 00:03:06.000 --> 00:03:09.000 that we, as a successful species on the planet -- 00:03:09.000 --> 00:03:11.000 you, me, everybody -- 00:03:11.000 --> 00:03:14.000 would be really good at making these trade-offs. 00:03:14.000 --> 00:03:16.000 Yet it seems, again and again, 00:03:16.000 --> 00:03:19.000 that we're hopelessly bad at it. 00:03:19.000 --> 00:03:22.000 And I think that's a fundamentally interesting question. 00:03:22.000 --> 00:03:24.000 I'll give you the short answer. 00:03:24.000 --> 00:03:26.470 The answer is, we respond to the feeling of security 00:03:26.470 --> 00:03:29.000 and not the reality. 00:03:29.000 --> 00:03:31.849 Now most of the time, that works. 00:03:33.000 --> 00:03:34.511 Most of the time, 00:03:34.511 --> 00:03:36.997 feeling and reality are the same. 00:03:38.000 --> 00:03:40.000 Certainly that's true 00:03:40.000 --> 00:03:43.000 for most of human prehistory. 00:03:43.000 --> 00:03:45.666 We've developed this ability 00:03:45.666 --> 00:03:48.000 because it makes evolutionary sense. 00:03:49.267 --> 00:03:50.560 One way to think of it 00:03:50.560 --> 00:03:52.000 is that we're highly optimized 00:03:52.000 --> 00:03:54.000 for risk decisions 00:03:54.000 --> 00:03:57.000 that are endemic to living in small family groups 00:03:57.000 --> 00:04:00.000 in the East African highlands in 100,000 B.C. 00:04:00.000 --> 00:04:03.000 2010 New York, not so much. 00:04:06.485 --> 00:04:09.560 Now there are several biases in risk perception. 00:04:09.560 --> 00:04:11.250 There are a lot of good experiments on this. 00:04:11.250 --> 00:04:15.020 And you can see certain biases that come up again and again. 00:04:15.020 --> 00:04:16.780 So I'll give you four. 00:04:16.780 --> 00:04:20.084 First, we tend to exaggerate spectacular and rare risks 00:04:20.084 --> 00:04:21.563 and downplay common risks -- 00:04:21.563 --> 00:04:23.637 so flying versus driving. 00:04:24.412 --> 00:04:26.465 Second, the unknown is perceived 00:04:26.465 --> 00:04:28.207 to be riskier than the familiar. 
00:04:31.047 --> 00:04:32.492 One example would be, 00:04:32.492 --> 00:04:35.210 people fear kidnapping by strangers 00:04:35.210 --> 00:04:38.665 when the data shows that kidnapping by relatives is much more common. 00:04:38.665 --> 00:04:40.586 This is for children. 00:04:40.586 --> 00:04:42.862 Third, personified risks 00:04:42.862 --> 00:04:45.830 are perceived to be greater than anonymous risks -- 00:04:45.830 --> 00:04:48.458 so Bin Laden is scarier because he has a name. 00:04:49.713 --> 00:04:51.138 And the fourth 00:04:51.138 --> 00:04:54.095 is that people underestimate risks 00:04:54.095 --> 00:04:55.942 in situations they do control 00:04:55.942 --> 00:04:59.046 and overestimate them in situations they don't control. 00:04:59.046 --> 00:05:01.855 So once you take up skydiving or smoking, 00:05:01.855 --> 00:05:04.210 you downplay the risks. 00:05:04.210 --> 00:05:07.692 If a risk is thrust upon you -- terrorism is a good example -- 00:05:07.692 --> 00:05:10.990 you'll overplay it, because you don't feel like it's in your control. 00:05:10.990 --> 00:05:15.130 There are a bunch of other cognitive biases 00:05:15.130 --> 00:05:16.900 that affect our risk decisions. 00:05:17.780 --> 00:05:20.240 There's the availability heuristic, 00:05:20.240 --> 00:05:22.096 which basically means 00:05:22.096 --> 00:05:24.720 we estimate the probability of something 00:05:24.720 --> 00:05:28.100 by how easy it is to bring instances of it to mind. 00:05:29.560 --> 00:05:31.187 So you can imagine how that works. 00:05:31.187 --> 00:05:34.830 If you hear a lot about tiger attacks, there must be a lot of tigers around. 00:05:34.830 --> 00:05:38.137 If you don't hear about lion attacks, there aren't a lot of lions around. 00:05:38.137 --> 00:05:40.722 This works until you invent newspapers. 00:05:40.722 --> 00:05:42.314 Because what newspapers do 00:05:42.314 --> 00:05:44.890 is repeat again and again 00:05:44.890 --> 00:05:46.409 the rare risks. 00:05:46.409 --> 00:05:49.209 I tell people, if it's in the news, don't worry about it. 00:05:49.209 --> 00:05:50.490 Because by definition, 00:05:50.490 --> 00:05:53.260 news is something that almost never happens. 00:05:53.260 --> 00:05:55.297 (Laughter) 00:05:55.297 --> 00:05:58.600 When something is so common, it's no longer news -- 00:05:58.600 --> 00:06:00.656 car crashes, domestic violence -- 00:06:00.656 --> 00:06:02.909 those are the risks you worry about. 00:06:03.375 --> 00:06:05.416 We're also a species of storytellers. 00:06:05.416 --> 00:06:07.849 We respond to stories more than data. 00:06:07.849 --> 00:06:10.574 And there's some basic innumeracy going on. 00:06:10.574 --> 00:06:13.649 I mean, the joke "One, Two, Three, Many" is kind of right. 00:06:13.804 --> 00:06:16.222 We're really good at small numbers. 00:06:16.222 --> 00:06:18.443 One mango, two mangoes, three mangoes -- 00:06:18.443 --> 00:06:20.313 10,000 mangoes, 100,000 mangoes -- 00:06:20.313 --> 00:06:23.386 it's still more mangoes than you can eat before they rot. 00:06:23.386 --> 00:06:26.530 So one half, one quarter, one fifth -- we're good at that. 00:06:26.530 --> 00:06:28.714 One in a million, one in a billion -- 00:06:28.714 --> 00:06:30.769 they're both almost never. 00:06:31.334 --> 00:06:33.232 So we have trouble with the risks 00:06:33.232 --> 00:06:34.888 that aren't very common. 00:06:34.888 --> 00:06:37.390 And what these cognitive biases do 00:06:37.390 --> 00:06:39.889 is act as filters between us and reality. 
00:06:41.152 --> 00:06:42.753 And the result 00:06:42.753 --> 00:06:45.051 is that feeling and reality get out of whack -- 00:06:45.051 --> 00:06:46.770 they get different. 00:06:47.034 --> 00:06:50.896 Now either you feel more secure than you are -- 00:06:50.896 --> 00:06:52.714 that's a false sense of security -- 00:06:52.714 --> 00:06:53.837 or the other way, 00:06:53.837 --> 00:06:56.460 and that's a false sense of insecurity. 00:06:56.460 --> 00:06:58.878 I write a lot about "security theater," 00:06:58.878 --> 00:07:02.236 meaning products that make people feel secure 00:07:02.236 --> 00:07:04.309 but don't actually do anything. 00:07:04.309 --> 00:07:06.814 There's no real word for stuff that makes us secure 00:07:06.814 --> 00:07:08.891 but doesn't make us feel secure. 00:07:08.891 --> 00:07:11.870 Maybe it's what the CIA's supposed to do for us. 00:07:12.784 --> 00:07:15.057 So back to economics. 00:07:15.057 --> 00:07:18.259 If economics, if the market, drives security, 00:07:18.259 --> 00:07:21.260 and if people make trade-offs 00:07:21.260 --> 00:07:23.240 based on the feeling of security, 00:07:23.240 --> 00:07:26.977 then the smart thing for companies to do, 00:07:26.977 --> 00:07:28.583 given the economic incentives, 00:07:28.583 --> 00:07:30.886 is to make people feel secure. 00:07:32.060 --> 00:07:33.782 And there are two ways to do this. 00:07:33.782 --> 00:07:36.531 One, you can make people actually secure 00:07:36.531 --> 00:07:38.360 and hope they notice. 00:07:38.360 --> 00:07:41.060 Or two, you can make people just feel secure 00:07:41.060 --> 00:07:43.070 and hope they don't notice. 00:07:45.569 --> 00:07:48.281 So what makes people notice? 00:07:49.022 --> 00:07:50.701 Well, a couple of things: 00:07:50.701 --> 00:07:53.037 understanding of the security, 00:07:53.037 --> 00:07:54.500 of the risks, the threats, 00:07:54.500 --> 00:07:56.474 the countermeasures, how they work. 00:07:56.474 --> 00:07:57.992 And if you know stuff, 00:07:57.992 --> 00:08:01.848 you're more likely to have your feelings match reality. 00:08:02.558 --> 00:08:05.560 Enough real-world examples help too. 00:08:05.560 --> 00:08:08.480 Now we all know the crime rate in our neighborhood, 00:08:08.480 --> 00:08:11.000 because we live there, and we get a feeling about it 00:08:11.000 --> 00:08:13.240 that basically matches reality. 00:08:14.879 --> 00:08:16.862 Security theater's exposed 00:08:16.862 --> 00:08:19.350 when it's obvious that it's not working properly. 00:08:21.078 --> 00:08:23.460 Okay, so what makes people not notice? 00:08:23.460 --> 00:08:25.878 Well, a poor understanding. 00:08:25.878 --> 00:08:28.973 If you don't understand the risks and you don't understand the costs, 00:08:28.973 --> 00:08:31.498 you're likely to get the trade-off wrong, 00:08:31.498 --> 00:08:34.036 and your feeling doesn't match reality. 00:08:34.036 --> 00:08:36.559 Not enough examples. 00:08:36.559 --> 00:08:38.127 There's an inherent problem 00:08:38.127 --> 00:08:40.258 with low-probability events. 00:08:40.284 --> 00:08:41.759 If, for example, 00:08:41.759 --> 00:08:44.142 terrorism almost never happens, 00:08:44.142 --> 00:08:46.000 it's really hard to judge 00:08:46.000 --> 00:08:49.250 the efficacy of counter-terrorist measures. 00:08:50.489 --> 00:08:53.300 This is why you keep sacrificing virgins, 00:08:53.300 --> 00:08:56.060 and why your unicorn defenses are working just great. 00:08:56.060 --> 00:08:58.746 There aren't enough examples of failures. 
00:09:00.079 --> 00:09:03.130 Also, feelings that cloud the issues -- 00:09:03.130 --> 00:09:05.550 the cognitive biases I talked about earlier, 00:09:05.550 --> 00:09:07.450 fears, folk beliefs -- 00:09:08.531 --> 00:09:11.456 basically, an inadequate model of reality. 00:09:13.330 --> 00:09:15.163 So let me complicate things. 00:09:15.163 --> 00:09:17.420 I have feeling and reality. 00:09:17.420 --> 00:09:19.930 I want to add a third element. I want to add model. 00:09:20.711 --> 00:09:22.856 Feeling and model are in our head; 00:09:22.856 --> 00:09:24.755 reality is the outside world. 00:09:24.755 --> 00:09:26.460 It doesn't change; it's real. 00:09:27.687 --> 00:09:29.512 So feeling is based on our intuition. 00:09:29.512 --> 00:09:31.827 Model is based on reason. 00:09:32.209 --> 00:09:33.972 That's basically the difference. 00:09:33.972 --> 00:09:35.999 In a primitive and simple world, 00:09:35.999 --> 00:09:38.040 there's really no reason for a model, 00:09:40.096 --> 00:09:42.480 because feeling is close to reality. 00:09:42.480 --> 00:09:44.327 You don't need a model. 00:09:44.327 --> 00:09:46.559 But in a modern and complex world, 00:09:46.559 --> 00:09:48.380 you need models 00:09:48.380 --> 00:09:50.879 to understand a lot of the risks we face. 00:09:52.317 --> 00:09:54.924 There's no feeling about germs. 00:09:54.924 --> 00:09:57.275 You need a model to understand them. 00:09:57.275 --> 00:09:58.971 So this model 00:09:58.971 --> 00:10:01.563 is an intelligent representation of reality. 00:10:01.563 --> 00:10:04.614 It's, of course, limited by science, 00:10:04.614 --> 00:10:06.566 by technology. 00:10:08.108 --> 00:10:10.304 We couldn't have a germ theory of disease 00:10:10.304 --> 00:10:12.892 before we invented the microscope to see germs. 00:10:13.770 --> 00:10:16.308 It's limited by our cognitive biases. 00:10:17.701 --> 00:10:19.280 But it has the ability 00:10:19.280 --> 00:10:21.400 to override our feelings. 00:10:21.400 --> 00:10:24.450 Where do we get these models? We get them from others. 00:10:24.450 --> 00:10:27.499 We get them from religion, from culture, 00:10:27.499 --> 00:10:29.492 from teachers, from elders. 00:10:29.492 --> 00:10:30.796 A couple years ago, 00:10:30.796 --> 00:10:33.165 I was in South Africa on safari. 00:10:33.165 --> 00:10:36.115 The tracker I was with grew up in Kruger National Park. 00:10:36.115 --> 00:10:39.143 He had some very complex models of how to survive. 00:10:39.143 --> 00:10:41.004 And it depended on whether you were attacked 00:10:41.004 --> 00:10:43.450 by a lion or a leopard or a rhino or an elephant -- 00:10:43.450 --> 00:10:46.376 and when you had to run away, and when you couldn't run away, 00:10:46.376 --> 00:10:49.740 and when you had to climb a tree -- and when you could never climb a tree. 00:10:49.740 --> 00:10:51.666 I would have died in a day, 00:10:51.666 --> 00:10:53.518 but he was born there, 00:10:53.518 --> 00:10:55.883 and he understood how to survive. 00:10:55.883 --> 00:10:57.540 I was born in New York City. 00:10:57.540 --> 00:11:00.787 I could have taken him to New York, and he would have died in a day. 00:11:00.787 --> 00:11:02.790 (Laughter) 00:11:02.790 --> 00:11:04.478 Because we had different models 00:11:04.478 --> 00:11:06.875 based on our different experiences. 00:11:07.873 --> 00:11:09.916 Models can come from the media, 00:11:09.916 --> 00:11:12.261 from our elected officials. 
00:11:13.137 --> 00:11:15.800 Think of models of terrorism, 00:11:15.800 --> 00:11:18.120 child kidnapping, 00:11:18.120 --> 00:11:21.108 airline safety, car safety. 00:11:21.108 --> 00:11:23.097 Models can come from industry. 00:11:24.560 --> 00:11:27.090 The two I'm following are surveillance cameras 00:11:27.090 --> 00:11:28.606 and ID cards; 00:11:28.606 --> 00:11:31.882 quite a lot of our computer security models come from there. 00:11:31.882 --> 00:11:34.241 A lot of models come from science. 00:11:34.241 --> 00:11:36.076 Health models are a great example. 00:11:36.076 --> 00:11:39.250 Think of cancer, of bird flu, swine flu, SARS. 00:11:39.916 --> 00:11:42.316 All of our feelings of security 00:11:42.316 --> 00:11:44.320 about those diseases 00:11:44.320 --> 00:11:45.761 come from models 00:11:45.761 --> 00:11:48.839 given to us, really, by science filtered through the media. 00:11:51.001 --> 00:11:53.490 So models can change. 00:11:53.490 --> 00:11:55.479 Models are not static. 00:11:55.479 --> 00:11:58.434 As we become more comfortable in our environments, 00:11:58.434 --> 00:12:01.658 our model can move closer to our feelings. 00:12:03.630 --> 00:12:05.961 So an example might be: 00:12:05.961 --> 00:12:07.992 if you go back 100 years, 00:12:07.992 --> 00:12:10.578 when electricity was first becoming common, 00:12:10.578 --> 00:12:12.691 there were a lot of fears about it. 00:12:12.691 --> 00:12:15.600 I mean, there were people who were afraid to push doorbells, 00:12:15.600 --> 00:12:18.708 because there was electricity in there, and that was dangerous. 00:12:18.708 --> 00:12:21.197 We, on the other hand, are very facile around electricity. 00:12:21.197 --> 00:12:22.610 We change light bulbs 00:12:22.610 --> 00:12:24.828 without even thinking about it. 00:12:24.828 --> 00:12:28.178 Our model of security around electricity 00:12:28.178 --> 00:12:30.780 is something we were born into. 00:12:31.757 --> 00:12:34.076 It didn't change as we were growing up. 00:12:34.076 --> 00:12:36.225 And we're good at it. 00:12:36.870 --> 00:12:38.675 Or think of the risks 00:12:38.675 --> 00:12:41.250 on the Internet across generations -- 00:12:41.396 --> 00:12:44.044 how your parents approach Internet security, 00:12:44.044 --> 00:12:45.503 versus how you do, 00:12:45.503 --> 00:12:47.519 versus how our kids will. 00:12:47.750 --> 00:12:50.320 Models eventually fade into the background. 00:12:52.283 --> 00:12:54.812 "Intuitive" is just another word for "familiar." 00:12:55.660 --> 00:12:58.233 So when your model is close to reality 00:12:58.233 --> 00:12:59.878 and converges with your feelings, 00:12:59.878 --> 00:13:01.959 you often don't know it's there. 00:13:03.092 --> 00:13:04.541 So a nice example of this 00:13:04.541 --> 00:13:06.572 came last year, from swine flu. 00:13:07.753 --> 00:13:09.916 When swine flu first appeared, 00:13:09.916 --> 00:13:12.903 the initial news caused a lot of overreaction. 00:13:13.449 --> 00:13:15.405 Now it had a name, 00:13:15.405 --> 00:13:17.693 which made it scarier than the regular flu, 00:13:17.693 --> 00:13:19.769 even though the regular flu was more deadly. 00:13:19.769 --> 00:13:22.911 And people thought doctors should be able to deal with it. 00:13:23.235 --> 00:13:25.820 So there was that feeling of lack of control. 00:13:25.820 --> 00:13:27.175 And those two things 00:13:27.175 --> 00:13:28.933 made the risk seem greater than it was. 
00:13:28.933 --> 00:13:32.320 As the novelty wore off and the months went by, 00:13:32.320 --> 00:13:33.988 there was some amount of tolerance; 00:13:33.988 --> 00:13:35.636 people got used to it. 00:13:36.199 --> 00:13:38.964 There was no new data, but there was less fear. 00:13:38.964 --> 00:13:40.674 By autumn, 00:13:40.674 --> 00:13:42.724 people thought 00:13:42.724 --> 00:13:45.458 the doctors should have solved this already. 00:13:45.458 --> 00:13:47.215 And there's kind of a bifurcation -- 00:13:47.215 --> 00:13:48.588 people had to choose 00:13:48.588 --> 00:13:51.430 between fear and acceptance -- 00:13:54.120 --> 00:13:55.867 actually, fear and indifference -- 00:13:55.867 --> 00:13:58.274 and they kind of chose suspicion. 00:13:58.923 --> 00:14:01.786 And when the vaccine appeared last winter, 00:14:01.786 --> 00:14:04.418 there were a lot of people -- a surprising number -- 00:14:04.418 --> 00:14:06.578 who refused to get it -- 00:14:08.420 --> 00:14:10.140 a nice example 00:14:10.140 --> 00:14:13.929 of how people's feelings of security change, how their model changes, 00:14:13.929 --> 00:14:15.603 sort of wildly, 00:14:15.603 --> 00:14:17.436 with no new information, 00:14:17.436 --> 00:14:19.086 with no new input. 00:14:20.217 --> 00:14:22.584 This kind of thing happens a lot. 00:14:22.584 --> 00:14:24.810 I'm going to give one more complication. 00:14:24.810 --> 00:14:27.250 We have feeling, model, reality. 00:14:28.227 --> 00:14:30.902 I have a very relativistic view of security. 00:14:30.902 --> 00:14:33.095 I think it depends on the observer. 00:14:33.095 --> 00:14:35.050 And most security decisions 00:14:35.050 --> 00:14:38.287 have a variety of people involved. 00:14:39.310 --> 00:14:41.229 And stakeholders 00:14:41.229 --> 00:14:43.937 with specific trade-offs 00:14:43.937 --> 00:14:46.261 will try to influence the decision. 00:14:46.261 --> 00:14:48.311 And I call that their agenda. 00:14:49.339 --> 00:14:50.732 And you see agendas -- 00:14:50.732 --> 00:14:52.837 this is marketing, this is politics -- 00:14:52.837 --> 00:14:56.235 trying to convince you to have one model versus another, 00:14:56.235 --> 00:14:58.325 trying to convince you to ignore a model 00:14:58.325 --> 00:15:00.894 and trust your feelings, 00:15:00.894 --> 00:15:04.110 marginalizing people with models they don't like. 00:15:04.600 --> 00:15:06.515 This is not uncommon. 00:15:07.234 --> 00:15:10.538 An example, a great example, is the risk of smoking. 00:15:11.521 --> 00:15:14.427 The history of the smoking risk over the past 50 years 00:15:14.427 --> 00:15:16.586 shows how a model changes, 00:15:16.586 --> 00:15:18.968 and it also shows how an industry fights against 00:15:18.968 --> 00:15:20.680 a model it doesn't like. 00:15:21.872 --> 00:15:24.962 Compare that to the secondhand smoke debate -- 00:15:24.962 --> 00:15:27.455 probably about 20 years behind. 00:15:27.455 --> 00:15:29.542 Think about seat belts. 00:15:29.542 --> 00:15:31.696 When I was a kid, no one wore a seat belt. 00:15:31.696 --> 00:15:33.525 Nowadays, no kid will let you drive 00:15:33.525 --> 00:15:35.900 if you're not wearing a seat belt. 00:15:36.570 --> 00:15:39.210 Compare that to the airbag debate -- 00:15:39.210 --> 00:15:41.646 probably about 30 years behind. 00:15:42.320 --> 00:15:44.790 All examples of models changing. 00:15:46.942 --> 00:15:50.168 What we learn is that changing models is hard. 00:15:50.168 --> 00:15:52.699 Models are hard to dislodge. 
00:15:52.699 --> 00:15:54.581 If they equal your feelings, 00:15:54.581 --> 00:15:56.518 you don't even know you have a model. 00:15:57.343 --> 00:15:59.250 And there's another cognitive bias 00:15:59.250 --> 00:16:01.243 I'll call confirmation bias, 00:16:01.243 --> 00:16:03.310 where we tend to accept data 00:16:03.310 --> 00:16:05.688 that confirms our beliefs 00:16:05.688 --> 00:16:08.496 and reject data that contradicts our beliefs. 00:16:13.833 --> 00:16:16.236 So evidence against our model, 00:16:16.236 --> 00:16:18.942 we're likely to ignore, even if it's compelling. 00:16:18.942 --> 00:16:21.905 It has to get very compelling before we'll pay attention. 00:16:22.975 --> 00:16:25.912 New models that extend over long periods of time are hard. 00:16:25.912 --> 00:16:27.530 Global warming is a great example. 00:16:27.530 --> 00:16:28.887 We're terrible 00:16:28.887 --> 00:16:30.945 at models that span 80 years. 00:16:31.187 --> 00:16:33.172 We can do the next harvest. 00:16:33.172 --> 00:16:35.987 We can often do until our kids grow up. 00:16:35.987 --> 00:16:38.524 But 80 years, we're just not good at. 00:16:39.150 --> 00:16:41.436 So it's a very hard model to accept. 00:16:42.255 --> 00:16:45.506 We can have both models in our head simultaneously -- 00:16:46.010 --> 00:16:49.004 that kind of problem 00:16:49.649 --> 00:16:52.529 where we're holding both beliefs together -- 00:16:52.830 --> 00:16:54.584 the cognitive dissonance. 00:16:54.584 --> 00:16:55.909 Eventually, 00:16:55.909 --> 00:16:58.211 the new model will replace the old model. 00:16:58.211 --> 00:17:01.048 Strong feelings can create a model. 00:17:01.758 --> 00:17:04.748 September 11th created a security model 00:17:04.750 --> 00:17:06.670 in a lot of people's heads. 00:17:06.670 --> 00:17:10.223 Also, personal experiences with crime can do it, 00:17:10.223 --> 00:17:11.775 a personal health scare, 00:17:11.775 --> 00:17:14.294 a health scare in the news. 00:17:14.294 --> 00:17:16.259 You'll see these called flashbulb events 00:17:16.259 --> 00:17:18.007 by psychiatrists. 00:17:18.603 --> 00:17:20.843 They can create a model instantaneously, 00:17:20.843 --> 00:17:23.050 because they're very emotive. 00:17:23.800 --> 00:17:25.694 So in the technological world, 00:17:25.694 --> 00:17:27.907 we don't have experience 00:17:27.907 --> 00:17:29.662 to judge models. 00:17:29.662 --> 00:17:32.363 And we rely on others. We rely on proxies. 00:17:32.363 --> 00:17:35.638 I mean, this works as long as it's the correct others. 00:17:35.638 --> 00:17:38.330 We rely on government agencies 00:17:38.330 --> 00:17:41.650 to tell us what pharmaceuticals are safe. 00:17:42.698 --> 00:17:44.475 I flew here yesterday. 00:17:44.475 --> 00:17:47.002 I didn't check the airplane. 00:17:47.002 --> 00:17:49.155 I relied on some other group 00:17:49.155 --> 00:17:51.664 to determine whether my plane was safe to fly. 00:17:51.664 --> 00:17:55.196 We're here; none of us fears the roof is going to collapse on us, 00:17:55.196 --> 00:17:57.184 not because we checked, 00:17:57.184 --> 00:17:59.390 but because we're pretty sure 00:17:59.390 --> 00:18:01.534 the building codes here are good. 00:18:02.890 --> 00:18:05.000 It's a model we just accept 00:18:05.000 --> 00:18:07.218 pretty much by faith. 00:18:07.637 --> 00:18:09.593 And that's okay. 
00:18:12.016 --> 00:18:14.290 Now, what we want 00:18:14.290 --> 00:18:16.378 is for people to get familiar enough 00:18:16.378 --> 00:18:18.274 with better models -- 00:18:18.274 --> 00:18:20.363 to have them reflected in their feelings -- 00:18:20.363 --> 00:18:23.030 to allow them to make security trade-offs. 00:18:24.011 --> 00:18:26.217 Now when these go out of whack, 00:18:26.217 --> 00:18:27.897 you have two options. 00:18:27.897 --> 00:18:30.225 One, you can fix people's feelings, 00:18:30.225 --> 00:18:32.245 directly appeal to feelings. 00:18:32.245 --> 00:18:34.690 It's manipulation, but it can work. 00:18:35.456 --> 00:18:37.231 The second, more honest way 00:18:37.231 --> 00:18:39.629 is to actually fix the model. 00:18:40.918 --> 00:18:42.956 Change happens slowly. 00:18:42.956 --> 00:18:45.504 The smoking debate took 40 years, 00:18:45.504 --> 00:18:47.480 and that was an easy one. 00:18:49.703 --> 00:18:51.762 Some of this stuff is hard. 00:18:51.762 --> 00:18:53.037 But really, 00:18:53.037 --> 00:18:55.310 information seems like our best hope. 00:18:55.310 --> 00:18:56.836 And I lied. 00:18:56.836 --> 00:18:59.623 Remember I said feeling, model, reality; 00:18:59.623 --> 00:19:02.036 I said reality doesn't change. It actually does. 00:19:02.036 --> 00:19:03.917 We live in a technological world; 00:19:03.917 --> 00:19:06.317 reality changes all the time. 00:19:07.176 --> 00:19:09.882 So we might have -- for the first time in our species -- 00:19:09.882 --> 00:19:13.170 feeling chases model, model chases reality, reality's moving -- 00:19:13.170 --> 00:19:15.360 they might never catch up. 00:19:16.502 --> 00:19:18.297 We don't know. 00:19:19.860 --> 00:19:21.710 But in the long term, 00:19:21.710 --> 00:19:23.623 both feeling and reality are important. 00:19:23.623 --> 00:19:26.890 And I want to close with two quick stories to illustrate this. 00:19:26.890 --> 00:19:29.496 1982 -- I don't know if people will remember this -- 00:19:29.496 --> 00:19:31.571 there was a short epidemic 00:19:31.571 --> 00:19:33.706 of Tylenol poisonings in the United States. 00:19:33.706 --> 00:19:36.767 It's a horrific story. Someone took a bottle of Tylenol, 00:19:36.767 --> 00:19:40.368 put poison in it, closed it up, put it back on the shelf. 00:19:40.368 --> 00:19:42.333 Someone else bought it and died. 00:19:42.333 --> 00:19:43.982 This terrified people. 00:19:43.982 --> 00:19:45.972 There were a couple of copycat attacks. 00:19:45.972 --> 00:19:48.988 There wasn't any real risk, but people were scared. 00:19:48.988 --> 00:19:50.311 And this is how 00:19:50.311 --> 00:19:52.944 the tamper-proof drug industry was invented. 00:19:52.944 --> 00:19:55.101 Those tamper-proof caps -- that came from this. 00:19:55.101 --> 00:19:56.804 It's complete security theater. 00:19:56.804 --> 00:19:59.684 As a homework assignment, think of 10 ways to get around it. 00:19:59.684 --> 00:20:01.565 I'll give you one: a syringe. 00:20:01.565 --> 00:20:04.493 But it made people feel better. 00:20:05.024 --> 00:20:06.929 It made their feeling of security 00:20:06.929 --> 00:20:08.590 better match the reality. 00:20:09.632 --> 00:20:12.466 Last story: a few years ago, a friend of mine gave birth. 00:20:12.466 --> 00:20:13.921 I visited her in the hospital. 
00:20:13.921 --> 00:20:16.095 It turns out when a baby's born now, 00:20:16.095 --> 00:20:17.929 they put an RFID bracelet on the baby 00:20:17.929 --> 00:20:19.778 and a corresponding one on the mother, 00:20:19.778 --> 00:20:23.400 so if anyone other than the mother takes the baby out of the maternity ward, 00:20:23.400 --> 00:20:24.340 an alarm goes off. 00:20:24.340 --> 00:20:26.195 I said, "Well, that's kind of neat. 00:20:26.195 --> 00:20:28.808 I wonder how rampant baby snatching is 00:20:28.808 --> 00:20:30.000 out of hospitals." 00:20:30.000 --> 00:20:31.612 I go home, I look it up. 00:20:31.612 --> 00:20:34.000 It basically never happens. 00:20:34.616 --> 00:20:36.000 But if you think about it, 00:20:36.000 --> 00:20:38.000 if you are a hospital, 00:20:38.000 --> 00:20:40.410 and you need to take a baby away from its mother, 00:20:40.410 --> 00:20:42.070 out of the room to run some tests, 00:20:42.070 --> 00:20:44.130 you'd better have some good security theater, 00:20:44.130 --> 00:20:46.000 or she's going to rip your arm off. 00:20:46.000 --> 00:20:48.000 (Laughter) 00:20:48.000 --> 00:20:50.000 So it's important for us -- 00:20:50.000 --> 00:20:52.000 those of us who design security, 00:20:52.000 --> 00:20:55.000 who look at security policy, 00:20:55.000 --> 00:20:57.000 or even look at public policy 00:20:57.000 --> 00:20:59.000 in ways that affect security -- to remember: 00:20:59.000 --> 00:21:02.000 it's not just reality; it's feeling and reality. 00:21:02.000 --> 00:21:04.000 What's important 00:21:04.000 --> 00:21:06.000 is that they be about the same. 00:21:06.000 --> 00:21:08.590 Because if our feelings match reality, 00:21:08.590 --> 00:21:10.960 we make better security trade-offs. 00:21:10.960 --> 00:21:12.000 Thank you. 00:21:12.000 --> 00:21:13.525 (Applause)