WEBVTT

00:00:00.000 --> 00:00:00.778
(Cory Doctorow) Thank you very much

00:00:01.066 --> 00:00:05.420
So I'd like to start with something of a benediction or permission.

00:00:05.420 --> 00:00:07.740
I am one of nature's fast talkers

00:00:07.740 --> 00:00:10.620
and many of you are not native English speakers, or

00:00:10.620 --> 00:00:12.919
maybe not accustomed to my harsh Canadian accent

00:00:12.919 --> 00:00:15.919
in addition I've just come in from Australia

00:00:15.919 --> 00:00:19.166
and so like many of you I am horribly jetlagged and have drunk enough coffee

00:00:19.166 --> 00:00:20.736
this morning to kill a rhino.

00:00:22.112 --> 00:00:23.587
When I used to be at the United Nations

00:00:23.587 --> 00:00:27.023
I was known as the scourge of the simultaneous translation corps

00:00:27.157 --> 00:00:29.528
I would stand up and speak as slowly as I could

00:00:29.576 --> 00:00:32.229
and turn around, and there they would be in their booths doing this

00:00:32.229 --> 00:00:35.136
(laughter) When I start to speak too fast,

00:00:35.136 --> 00:00:37.557
this is the universal symbol -- my wife invented it --

00:00:37.557 --> 00:00:41.518
for "Cory, you are talking too fast". Please, don't be shy.

00:00:42.078 --> 00:00:45.790
So, I'm a parent, like many of you, and like I'm sure all of you

00:00:45.790 --> 00:00:48.850
who are parents, parenting kicks my ass all the time.
00:00:50.030 --> 00:00:54.820
And there are many regrets I have about the mere seven and a half years

00:00:54.820 --> 00:00:57.284
that I've been a parent but none are so keenly felt

00:00:57.784 --> 00:01:00.878
as my regrets over what's happened when I've been wandering

00:01:00.878 --> 00:01:04.766
around the house and seen my daughter working on something

00:01:04.766 --> 00:01:08.697
that was beyond her abilities, that was right at the edge of what she could do

00:01:08.697 --> 00:01:12.759
and where she was doing something that she didn't have competence in yet

00:01:12.759 --> 00:01:17.019
and you know it's that amazing thing to see that frowning concentration,

00:01:17.019 --> 00:01:20.038
tongue stuck out: as a parent, your heart swells with pride

00:01:20.038 --> 00:01:21.332
and you can't help but go over

00:01:21.332 --> 00:01:23.684
and sort of peer over their shoulder at what they are doing

00:01:23.684 --> 00:01:27.329
and those of you who are parents know what happens when you look too closely

00:01:27.581 --> 00:01:30.205
at someone who is working beyond the edge of their competence.

00:01:30.565 --> 00:01:32.678
They go back to doing something they're already good at.

00:01:32.973 --> 00:01:35.042
You interrupt a moment of genuine learning

00:01:35.295 --> 00:01:38.491
and you replace it with a kind of embarrassment

00:01:38.859 --> 00:01:42.384
about what you're good at and what you're not.

00:01:42.979 --> 00:01:48.051
So, it matters a lot that our schools are increasingly surveilled environments,

00:01:48.051 --> 00:01:52.401
environments in which everything that our kids do is watched and recorded.

00:01:52.791 --> 00:01:56.439
Because when you do that, you interfere with those moments of real learning.

00:01:56.781 --> 00:02:00.665
Our ability to do things that we are not good at yet, that we are not proud of yet,

00:02:00.665 --> 00:02:03.524
is negatively impacted by that kind of scrutiny.
00:02:03.924 --> 00:02:06.022
And that scrutiny comes from a strange place.

00:02:06.731 --> 00:02:10.567
We have decided that there are some programmatic means

00:02:10.567 --> 00:02:13.678
by which we can find all the web pages children shouldn't look at

00:02:14.136 --> 00:02:18.321
and we will filter our networks to be sure that they don't see them.

00:02:18.719 --> 00:02:21.883
Anyone who has ever paid attention knows that this doesn't work.

00:02:22.351 --> 00:02:25.524
There are more web pages that kids shouldn't look at

00:02:25.524 --> 00:02:29.043
than can ever be cataloged, and any attempt to catalog them

00:02:29.043 --> 00:02:32.031
will always catch pages that kids must be looking at.

00:02:32.031 --> 00:02:34.795
Any of you who have ever taught a unit on reproductive health

00:02:35.122 --> 00:02:37.835
know the frustration of trying to get around a school network.

00:02:38.597 --> 00:02:42.429
Now, this is done in the name of digital protection

00:02:42.643 --> 00:02:46.342
but it flies in the face of digital literacy and of real learning.

00:02:46.613 --> 00:02:50.332
Because the only way to stop kids from looking at web pages

00:02:50.332 --> 00:02:51.607
they shouldn't be looking at

00:02:51.607 --> 00:02:55.771
is to take all of the clicks that they make, all of the messages that they send,

00:02:55.771 --> 00:02:59.736
all of their online activity and offshore it to a firm

00:02:59.736 --> 00:03:04.006
that has some nonsensically arrived at list of the bad pages.

00:03:04.370 --> 00:03:07.815
And so, what we are doing is that we're exfiltrating all of our students' data

00:03:08.279 --> 00:03:10.256
to unknown third parties.

00:03:10.256 --> 00:03:12.529
Now, most of these firms, their primary business isn't

00:03:12.529 --> 00:03:14.237
serving the education sector.

00:03:14.237 --> 00:03:16.347
Most of them service the government sector.

00:03:16.540 --> 00:03:21.328
They primarily service governments in repressive autocratic regimes.
00:03:21.328 --> 00:03:23.600
They help them ensure that their citizens aren't looking at

00:03:23.600 --> 00:03:25.376
Amnesty International web pages.

00:03:25.636 --> 00:03:29.433
They repackage those tools and sell them to our educators.

00:03:29.835 --> 00:03:32.975
So we are offshoring our children's clicks to war criminals.

00:03:33.791 --> 00:03:37.197
And what our kids do, now, is they just get around it,

00:03:37.197 --> 00:03:38.773
because it's not hard to get around it.

00:03:38.773 --> 00:03:43.940
You know, never underestimate the power of a kid who is time-rich and cash-poor

00:03:43.940 --> 00:03:46.234
to get around our technological blockades.

00:03:47.508 --> 00:03:51.006
But when they do this, they don't acquire the kind of digital literacy

00:03:51.006 --> 00:03:54.075
that we want them to have, they don't acquire real digital agency

00:03:54.400 --> 00:03:57.783
and moreover, they risk exclusion and in extreme cases,

00:03:57.783 --> 00:03:59.366
they risk criminal prosecution.

00:04:00.220 --> 00:04:04.085
So what if instead, those of us who are trapped in this system of teaching kids,

00:04:04.085 --> 00:04:07.850
where we're required to subject them to this kind of surveillance

00:04:07.850 --> 00:04:10.167
that flies in the face of their real learning,

00:04:10.167 --> 00:04:13.247
what if instead, we invented curricular units

00:04:13.247 --> 00:04:16.459
that made them real first-class digital citizens,

00:04:16.459 --> 00:04:19.590
in charge of trying to influence real digital problems?

00:04:19.590 --> 00:04:22.833
Like what if we said to them: "We want you to catalog the web pages

00:04:22.833 --> 00:04:25.271
that this vendor lets through that you shouldn't be seeing.

00:04:25.499 --> 00:04:29.424
We want you to catalog those pages that you should be seeing, that are blocked.
00:04:29.426 --> 00:04:31.948
We want you to go and interview every teacher in the school

00:04:31.948 --> 00:04:35.306
about all those lesson plans that were carefully laid out before lunch

00:04:35.306 --> 00:04:37.966
with a video and a web page, and over lunch,

00:04:37.966 --> 00:04:41.424
the unaccountable distant center blocked these critical resources

00:04:41.424 --> 00:04:44.796
and left them handing out photocopied worksheets in the afternoon

00:04:44.796 --> 00:04:47.073
instead of the unit they prepared.

00:04:47.073 --> 00:04:50.837
We want you to learn how to file Freedom of Information Act requests

00:04:50.837 --> 00:04:53.377
and find out what your school authority is spending

00:04:53.377 --> 00:04:56.371
to censor your internet access and surveil your activity.

00:04:56.371 --> 00:04:59.855
We want you to learn to use the internet to research these companies

00:04:59.855 --> 00:05:04.329
and we want you to present this to your parent-teacher association,

00:05:04.329 --> 00:05:06.570
to your school authority, to your local newspaper."

00:05:06.981 --> 00:05:08.917
Because that's the kind of digital literacy

00:05:09.187 --> 00:05:11.310
that makes kids into first-class digital citizens,

00:05:11.310 --> 00:05:16.083
that prepares them for a future in which they can participate fully

00:05:16.083 --> 00:05:18.080
in a world that's changing.

00:05:19.019 --> 00:05:22.677
Kids are the beta-testers of the surveillance state.

00:05:22.919 --> 00:05:26.713
The path of surveillance technology starts with prisoners,

00:05:27.157 --> 00:05:30.409
moves to asylum seekers, people in mental institutions

00:05:30.409 --> 00:05:33.919
and then to its first non-incarcerated population: children

00:05:34.258 --> 00:05:37.032
and then moves to blue-collar workers, government workers

00:05:37.032 --> 00:05:38.488
and white-collar workers.
00:05:38.488 --> 00:05:41.575
And so, what we do to kids today is what we did to prisoners yesterday

00:05:41.575 --> 00:05:44.078
and what we're going to be doing to you tomorrow.

00:05:44.078 --> 00:05:46.737
And so it matters, what we teach our kids.

00:05:47.054 --> 00:05:51.039
If you want to see where this goes, this is a kid named Blake Robbins

00:05:51.039 --> 00:05:55.124
and he attended Lower Merion High School in Lower Merion, Pennsylvania,

00:05:55.124 --> 00:05:56.090
outside of Philadelphia.

00:05:56.091 --> 00:05:59.903
It's the most affluent public school district in America, so affluent

00:05:59.903 --> 00:06:02.521
that all the kids were issued MacBooks at the start of the year

00:06:02.521 --> 00:06:04.665
and they had to do their homework on their MacBooks,

00:06:04.665 --> 00:06:07.778
and bring them to school every day and bring them home every night.

00:06:07.778 --> 00:06:11.472
And the MacBooks had been fitted with laptop theft recovery software,

00:06:11.472 --> 00:06:15.687
which is a fancy word for a rootkit, that let the school administration

00:06:16.494 --> 00:06:19.759
covertly (check) operate the cameras and microphones on these computers

00:06:20.266 --> 00:06:23.207
and harvest files off of their hard drives,

00:06:23.657 --> 00:06:25.666
view all their clicks, and so on.

00:06:26.280 --> 00:06:30.802
Now Blake Robbins found out that the software existed

00:06:30.802 --> 00:06:33.787
and how it was being used because he and the head teacher

00:06:33.787 --> 00:06:37.056
had been knocking heads for years, since he first got into the school,

00:06:37.056 --> 00:06:39.864
and one day, the head teacher summoned him to his office

00:06:39.864 --> 00:06:41.462
and said: "Blake, I've got you now."

00:06:41.670 --> 00:06:45.284
and handed him a print-out of Blake in his bedroom the night before,

00:06:45.620 --> 00:06:48.985
taking what looked like a pill, and he said: "You're taking drugs."
00:06:48.985 --> 00:06:53.988
And Blake Robbins said: "That's a candy, it's a Mike and Ike's candy, I take them

00:06:53.988 --> 00:06:55.652
when I -- I eat them when I'm studying.

00:06:55.973 --> 00:06:58.156
How did you get a picture of me in my bedroom?"

00:06:58.656 --> 00:07:02.644
This head teacher had taken over 6000 photos of Blake Robbins:

00:07:02.644 --> 00:07:06.015
awake and asleep, dressed and undressed, in the presence of his family.

00:07:06.630 --> 00:07:10.376
And in the ensuing lawsuit, the school settled for a large amount of money

00:07:10.376 --> 00:07:12.476
and promised that they wouldn't do it again

00:07:12.916 --> 00:07:15.560
without informing the students that it was going on.

00:07:16.107 --> 00:07:18.521
And increasingly, the practice is now

00:07:18.808 --> 00:07:22.110
that school administrations hand out laptops, because they're getting cheaper,

00:07:22.505 --> 00:07:24.683
with exactly the same kind of software,

00:07:24.886 --> 00:07:27.681
but they let the students know, and they find that that works even better

00:07:27.964 --> 00:07:29.682
at curbing the students' behavior,

00:07:29.682 --> 00:07:33.196
because the students know that they're always on camera.

00:07:33.769 --> 00:07:38.010
Now, the surveillance state is moving from kids to the rest of the world.

00:07:38.010 --> 00:07:39.528
It's metastasizing.

00:07:39.528 --> 00:07:44.149
Our devices are increasingly designed to treat us as attackers,

00:07:44.149 --> 00:07:46.890
as suspicious parties who can't be trusted

00:07:46.890 --> 00:07:50.577
because our devices' job is to do things that we don't want them to do.

00:07:50.923 --> 00:07:53.758
Now that's not because the vendors who make our technology

00:07:53.758 --> 00:07:55.839
want to spy on us necessarily,

00:07:55.839 --> 00:08:01.000
but they want to take the ink-jet printer business model

00:08:01.000 --> 00:08:03.768
and bring it into every other realm of the world.
00:08:03.768 --> 00:08:08.724
So the ink-jet printer business model is where you sell someone a device

00:08:08.724 --> 00:08:11.689
and then you get a continuing revenue stream from that device

00:08:11.689 --> 00:08:15.844
by making sure that competitors can't make consumables or parts

00:08:15.844 --> 00:08:19.084
or additional features or plugins for that device,

00:08:19.084 --> 00:08:22.111
without paying rent to the original manufacturer.

00:08:22.111 --> 00:08:25.911
And that allows you to maintain monopoly margins on your devices.

00:08:26.266 --> 00:08:30.629
Now, in 1998, the American government passed a law called

00:08:30.629 --> 00:08:32.416
the Digital Millennium Copyright Act,

00:08:32.416 --> 00:08:35.328
in 2001 the European Union introduced its own version,

00:08:35.328 --> 00:08:37.327
the European Union Copyright Directive.

00:08:37.593 --> 00:08:40.157
And these two laws, along with laws all around the world,

00:08:40.157 --> 00:08:45.681
in Australia, Canada and elsewhere, these laws prohibit removing digital locks

00:08:45.681 --> 00:08:48.885
that are used to restrict access to copyrighted works

00:08:49.220 --> 00:08:52.134
and they were originally envisioned as a way of making sure that Europeans didn't

00:08:52.134 --> 00:08:54.640
bring cheap DVDs in from America,

00:08:54.640 --> 00:08:58.215
or making sure that Australians didn't import cheap DVDs from China.

00:08:58.215 --> 00:09:03.741
And so you have a digital work, a DVD, and it has a lock on it and to unlock it,

00:09:03.741 --> 00:09:05.363
you have to buy an authorized player

00:09:05.363 --> 00:09:07.357
and the player checks to make sure you are in region

00:09:07.357 --> 00:09:10.339
and making your own player that doesn't make that check

00:09:10.339 --> 00:09:12.409
is illegal because you'd have to remove the digital lock.
00:09:12.409 --> 00:09:13.824
And that was the original intent,

00:09:13.824 --> 00:09:18.552
it was to allow high rents to be maintained on removable media,

00:09:18.552 --> 00:09:20.515
DVDs and other entertainment content.

00:09:20.726 --> 00:09:24.229
But it very quickly spread into new realms.

00:09:24.765 --> 00:09:28.415
So, for example, auto manufacturers now lock up

00:09:28.415 --> 00:09:30.863
all of their cars' telemetry with digital locks.

00:09:30.863 --> 00:09:33.153
If you're a mechanic and want to fix a car,

00:09:33.153 --> 00:09:36.796
you have to get a reader from the manufacturer

00:09:36.796 --> 00:09:40.240
to make sure that you can see the telemetry

00:09:40.240 --> 00:09:42.759
and then know what parts to order and how to fix it.

00:09:43.154 --> 00:09:46.380
And in order to get this reader, you have to promise the manufacturer

00:09:46.380 --> 00:09:50.069
that you will only buy parts from that manufacturer

00:09:50.069 --> 00:09:51.366
and not from third parties.

00:09:51.366 --> 00:09:53.829
So the manufacturers can keep the repair costs high

00:09:53.829 --> 00:09:56.852
and get a secondary revenue stream out of the cars.

00:09:57.388 --> 00:10:04.532
This year, the Chrysler corporation filed comments with the US Copyright Office,

00:10:04.532 --> 00:10:07.580
to say that they believed that this was the right way to do it

00:10:07.580 --> 00:10:10.278
and that it should be a felony, punishable by 5 years in prison

00:10:10.278 --> 00:10:12.218
and a $500,000 fine,

00:10:12.218 --> 00:10:16.242
to change the locks on a car that you own, so that you can choose who fixes it.

00:10:16.531 --> 00:10:19.556
It turned out that when they advertised

00:10:19.556 --> 00:10:21.855
-- well, where is my slide here?
Oh, there we go --

00:10:21.855 --> 00:10:25.166
when they advertised that it wasn't your father's Oldsmobile,

00:10:25.606 --> 00:10:28.531
they weren't speaking metaphorically, they literally meant

00:10:28.531 --> 00:10:30.486
that even though your father bought the Oldsmobile,

00:10:30.486 --> 00:10:33.157
it remained their property in perpetuity.

00:10:33.713 --> 00:10:36.430
And it's not just cars, it's every kind of device,

00:10:36.430 --> 00:10:39.283
because every kind of device today has a computer in it.

00:10:39.614 --> 00:10:43.032
The John Deere Company, the world's leading seller of heavy equipment

00:10:43.032 --> 00:10:45.036
and agricultural equipment technologies,

00:10:45.528 --> 00:10:49.224
they now view their tractors as information gathering platforms

00:10:49.225 --> 00:10:50.950
and they view the people who use them

00:10:50.950 --> 00:10:55.886
as the kind of inconvenient gut flora of their ecosystem.

00:10:56.217 --> 00:10:58.386
So if you are a farmer and you own a John Deere tractor,

00:10:58.386 --> 00:11:02.645
when you drive it around your fields, the torque sensors on the wheels

00:11:02.645 --> 00:11:07.680
conduct a centimeter-accurate soil density survey of your agricultural land.

00:11:08.077 --> 00:11:11.122
That would be extremely useful to you when you're planting your seed

00:11:11.122 --> 00:11:12.642
but that data is not available to you

00:11:12.642 --> 00:11:15.372
unless you remove the digital lock from your John Deere tractor

00:11:15.372 --> 00:11:17.586
which again, is against the law everywhere in the world.

00:11:17.955 --> 00:11:20.075
Instead, in order to get that data

00:11:20.075 --> 00:11:23.056
you have to buy a bundle with seeds from Monsanto,

00:11:23.056 --> 00:11:24.555
who are John Deere's seed partners.
00:11:25.017 --> 00:11:28.639
John Deere then takes this data that they aggregate across whole regions

00:11:28.639 --> 00:11:31.447
and they use it to gain insight into regional crop yields

00:11:31.447 --> 00:11:33.364
that they use to play the futures market.

00:11:33.665 --> 00:11:37.464
John Deere's tractors are really just a way of gathering information

00:11:37.464 --> 00:11:39.132
and the farmers are secondary to it.

00:11:39.132 --> 00:11:41.873
Just because you own it doesn't mean it's yours.

00:11:42.245 --> 00:11:45.349
And it's not just the computers that we put our bodies into

00:11:45.520 --> 00:11:47.035
that have this business model.

00:11:47.041 --> 00:11:49.285
It's the computers that we put inside of our bodies.

00:11:49.746 --> 00:11:51.288
If you're someone who is diabetic

00:11:51.288 --> 00:11:55.037
and you're fitted with a continuous glucose-measuring insulin pump,

00:11:55.365 --> 00:11:58.227
that insulin pump is designed with a digital lock

00:11:58.227 --> 00:12:02.277
that makes sure that your doctor can only use the manufacturer's software

00:12:02.277 --> 00:12:04.100
to read the data coming off of it

00:12:04.100 --> 00:12:07.142
and that software is resold on a rolling annual license

00:12:07.142 --> 00:12:09.754
and it can't be just bought outright.

00:12:09.754 --> 00:12:12.028
And the digital locks are also used to make sure

00:12:12.028 --> 00:12:14.295
that you only buy the insulin that the vendor has approved

00:12:14.295 --> 00:12:16.941
and not generic insulin that might be cheaper.

00:12:16.941 --> 00:12:20.514
We've literally turned human beings into ink-jet printers.

00:12:21.165 --> 00:12:27.626
Now, this has really deep implications beyond the economic implications.
00:12:27.952 --> 00:12:30.857
Because the rules that prohibit breaking these digital locks

00:12:30.857 --> 00:12:35.091
also prohibit telling people about flaws that programmers made

00:12:35.091 --> 00:12:37.485
because if you know about a flaw that a programmer made,

00:12:37.845 --> 00:12:39.650
you can use it to break the digital lock.

00:12:40.154 --> 00:12:43.599
And that means that the errors, the vulnerabilities,

00:12:43.599 --> 00:12:49.587
the mistakes in our devices, they fester in them, they go on and on and on

00:12:49.587 --> 00:12:53.727
and our devices become these long-lived reservoirs of digital pathogens.

00:12:54.345 --> 00:12:56.070
And we've seen how that plays out.

00:12:56.070 --> 00:12:58.644
One of the reasons that Volkswagen was able to get away

00:12:58.644 --> 00:13:00.881
with their Diesel cheating for so long

00:13:00.881 --> 00:13:03.927
is because no one could independently audit their firmware.

00:13:05.159 --> 00:13:07.223
It's happening all over the place.

00:13:07.647 --> 00:13:11.405
You may have seen -- you may have seen this summer

00:13:11.405 --> 00:13:14.710
that Chrysler had to recall 1.4 million Jeeps

00:13:14.710 --> 00:13:18.605
because it turned out that they could be remotely controlled over the internet

00:13:18.605 --> 00:13:21.655
while driving down a motorway and have their brakes and steering

00:13:21.655 --> 00:13:25.766
commandeered by anyone, anywhere in the world, over the internet.

00:13:27.056 --> 00:13:31.195
We only have one methodology for determining whether security works

00:13:31.499 --> 00:13:33.988
and that's to subject it to public scrutiny,

00:13:33.988 --> 00:13:37.785
to allow for other people to see what assumptions you've made.
00:13:38.200 --> 00:13:39.961
Anyone can design a security system

00:13:39.961 --> 00:13:42.177
that he himself can't think of a way of breaking,

00:13:42.547 --> 00:13:44.704
but all that means is that you've designed a security system

00:13:44.704 --> 00:13:47.363
that works against people who are stupider than you.

00:13:47.997 --> 00:13:50.184
And in this regard, security is no different

00:13:50.184 --> 00:13:52.209
from any other kind of knowledge creation.

00:13:52.434 --> 00:13:55.431
You know, before we had contemporary science and scholarship,

00:13:55.431 --> 00:13:57.941
we had something that looked a lot like it, called alchemy.

00:13:58.282 --> 00:14:01.927
And for 500 years, alchemists kept what they thought they knew a secret.

00:14:02.307 --> 00:14:05.710
And that meant that every alchemist was capable of falling prey

00:14:05.710 --> 00:14:12.538
to that most urgent of human frailties, which is our ability to fool ourselves.

00:14:12.538 --> 00:14:16.227
And so, every alchemist discovered for himself in the hardest way possible

00:14:16.227 --> 00:14:18.635
that drinking mercury was a bad idea.

00:14:19.178 --> 00:14:21.608
We call that 500-year period the Dark Ages

00:14:21.793 --> 00:14:24.614
and we call the moment at which they started publishing

00:14:24.614 --> 00:14:27.926
and subjecting themselves to adversarial peer review,

00:14:27.926 --> 00:14:30.548
which is when your friends tell you about the mistakes that you've made

00:14:30.548 --> 00:14:32.985
and your enemies call you an idiot for having made them,

00:14:32.985 --> 00:14:35.528
we call that moment the Enlightenment.

00:14:37.119 --> 00:14:39.825
Now, this has profound implications.

00:14:39.825 --> 00:14:46.236
The restriction of our ability to alter the security of our devices

00:14:46.236 --> 00:14:49.449
matters for our own security, for our surveillance society,

00:14:49.449 --> 00:14:52.070
for our ability to be free people in society.
00:14:52.434 --> 00:14:55.920
At the height of the GDR, in 1989,

00:14:56.360 --> 00:15:01.216
the Stasi had one snitch for every 60 people in East Germany,

00:15:01.216 --> 00:15:03.283
in order to surveil the entire country.

00:15:03.546 --> 00:15:06.548
A couple of decades later, we found out through Edward Snowden

00:15:06.548 --> 00:15:09.208
that the NSA was spying on everybody in the world.

00:15:09.603 --> 00:15:13.294
And the ratio of people who work at the NSA to people they are spying on

00:15:13.294 --> 00:15:14.963
is more like 1 in 10,000.

00:15:15.340 --> 00:15:18.165
They've achieved a two-and-a-half order of magnitude

00:15:18.165 --> 00:15:20.069
productivity gain in surveillance.

00:15:20.405 --> 00:15:23.492
And the way that they got there is in part by the fact that

00:15:23.492 --> 00:15:26.002
we use devices that we're not allowed to alter,

00:15:26.002 --> 00:15:28.452
that are designed to treat us as attackers

00:15:28.452 --> 00:15:31.435
and that gather an enormous amount of information on us.

00:15:31.727 --> 00:15:34.405
If the government told you that you're required to carry around

00:15:34.405 --> 00:15:37.993
a small electronic rectangle that recorded all of your social relationships,

00:15:37.993 --> 00:15:39.327
all of your movements,

00:15:39.327 --> 00:15:42.241
all of your transient thoughts that you made known or ever looked up, (check)

00:15:43.208 --> 00:15:45.829
and would make that available to the state,

00:15:45.836 --> 00:15:48.316
and you would have to pay for it, you would revolt.

00:15:48.914 --> 00:15:51.726
But the phone companies have managed to convince us,

00:15:51.726 --> 00:15:53.991
along with the mobile vendors,

00:15:53.991 --> 00:15:56.545
that we should foot the bill for our own surveillance.
00:15:56.545 --> 00:15:58.994
It's a bit like during the Cultural Revolution,

00:15:58.994 --> 00:16:01.213
where, after your family members were executed,

00:16:01.213 --> 00:16:02.806
they sent you a bill for the bullet.

00:16:04.981 --> 00:16:11.499
So, this has big implications, as I said, for where we go as a society.

00:16:11.499 --> 00:16:14.981
Because just as our kids have a hard time functioning

00:16:14.981 --> 00:16:17.155
in the presence of surveillance, and learning,

00:16:17.155 --> 00:16:18.829
and advancing their own knowledge,

00:16:18.829 --> 00:16:21.123
we as a society have a hard time progressing

00:16:21.123 --> 00:16:22.612
in the presence of surveillance.

00:16:23.039 --> 00:16:29.152
In our own living memory, people who are today thought of as normal and right

00:16:29.152 --> 00:16:31.451
were doing something that a generation ago

00:16:31.451 --> 00:16:33.851
would have been illegal and landed them in jail.

00:16:33.851 --> 00:16:35.499
For example, you probably know someone

00:16:35.499 --> 00:16:37.801
who's married to a partner of the same sex.

00:16:38.303 --> 00:16:41.342
If you live in America, you may know someone who takes medical marijuana,

00:16:41.342 --> 00:16:42.719
or if you live in the Netherlands.

00:16:43.054 --> 00:16:47.674
And not that long ago, people who undertook these activities

00:16:47.674 --> 00:16:49.151
could have gone to jail for them,

00:16:49.151 --> 00:16:52.086
could have faced enormous social exclusion for them.

00:16:52.086 --> 00:16:56.306
The way that we got from there to here was by having a private zone,

00:16:56.306 --> 00:16:57.842
a place where people weren't surveilled,

00:16:57.842 --> 00:17:00.808
in which they could advance their interests and ideas,

00:17:00.808 --> 00:17:03.862
do things that were thought of as socially unacceptable

00:17:03.862 --> 00:17:06.399
and slowly change our social attitudes.

00:17:06.710 --> 00:17:08.721
And in ........
(check) few things that in 50 years,

00:17:08.721 --> 00:17:12.710
your grandchildren will sit around the Christmas table, in 2065, and say:

00:17:12.710 --> 00:17:14.615
"How was it, grandma, how was it, grandpa,

00:17:14.615 --> 00:17:17.101
that in 2015, you got it all right,

00:17:17.101 --> 00:17:19.574
and we haven't had any social changes since then?"

00:17:19.818 --> 00:17:21.798
Then you have to ask yourself how, in a world

00:17:21.798 --> 00:17:24.536
in which we are all under continuous surveillance,

00:17:24.536 --> 00:17:27.291
we are going to find a way to improve this.

00:17:27.922 --> 00:17:30.417
So, our kids need ICT literacy,

00:17:30.417 --> 00:17:35.305
but ICT literacy isn't just typing skills or learning how to use PowerPoint.

00:17:35.305 --> 00:17:36.768
It's learning how to think critically

00:17:36.768 --> 00:17:39.510
about how they relate to the means of information,

00:17:39.834 --> 00:17:42.980
about whether they are its masters or servants.

00:17:43.810 --> 00:17:47.366
Our networks are not the most important issue that we have.

00:17:47.366 --> 00:17:51.965
There are much more important issues in society and in the world today.

00:17:51.965 --> 00:17:54.502
The future of the internet is way less important

00:17:54.502 --> 00:17:57.640
than the future of our climate, the future of gender equity,

00:17:57.640 --> 00:17:59.550
the future of racial equity,

00:17:59.550 --> 00:18:03.434
the future of the wage gap and the wealth gap in the world,

00:18:03.434 --> 00:18:07.171
but every one of those fights is going to be fought and won or lost

00:18:07.171 --> 00:18:10.614
on the internet: it's our most foundational fight.

00:18:10.956 --> 00:18:15.290
So weakened (check) computers can make us more free

00:18:15.572 --> 00:18:17.618
or they can take away our freedom.

00:18:17.618 --> 00:18:20.821
It all comes down to how we regulate them and how we use them.
00:18:21.282 --> 00:18:24.618
And it's our job, as people who are training the next generation,

00:18:25.004 --> 00:18:28.785
and whose next generation is beta-testing

00:18:28.785 --> 00:18:31.278
the surveillance technology that will be coming to us,

00:18:31.278 --> 00:18:34.807
it's our job to teach them to seize the means of information,

00:18:34.807 --> 00:18:38.732
to make themselves self-determinant in the way that they use their networks

00:18:38.732 --> 00:18:42.844
and to find ways to show them how to be critical and how to be smart

00:18:43.092 --> 00:18:45.262
and how to be, above all, subversive

00:18:45.262 --> 00:18:48.420
and how to use the technology around them. Thank you.

00:18:49.240 --> 00:18:57.445
(Applause)

00:18:57.445 --> 00:18:59.011
(Moderator) Cory, thank you very much indeed.

00:18:59.011 --> 00:19:00.164
(Doctorow) Thank you (Moderator) And I've got a bundle of

00:19:00.164 --> 00:19:04.649
points which you've stimulated from many in the audience, which sent --

00:19:04.649 --> 00:19:06.805
(Doctorow) I'm shocked to hear that that was at all controversial,

00:19:06.805 --> 00:19:08.340
but go on. (Moderator) I didn't say "controversial",

00:19:08.340 --> 00:19:11.014
you stimulated thinking, which is great. (Doctorow laughs)

00:19:11.014 --> 00:19:16.219
But a lot of them resonate around violation of secrecy and security.

00:19:16.814 --> 00:19:20.799
And this, for example, from Annika Burgess:

00:19:20.808 --> 00:19:23.120
"Is there a way for students to protect themselves

00:19:23.419 --> 00:19:27.374
from privacy violations by institutions they are supposed to trust?"

00:19:27.374 --> 00:19:30.427
I think this is probably a question for Ian Goldin (check) as well, as

00:19:30.427 --> 00:19:33.571
someone who is a senior figure in a major university, but

00:19:33.571 --> 00:19:36.524
this issue of privacy violations and trust.

00:19:37.251 --> 00:19:40.293
(Doctorow) Well, I think that computers have a curious dual nature.
00:19:40.293 --> 00:19:43.897
So on the one hand, they do expose us to an enormous amount of scrutiny,

00:19:43.897 --> 00:19:45.290
depending on how they are configured.

00:19:45.646 --> 00:19:48.713
But on the other hand, computers have brought new powers to us

00:19:48.713 --> 00:19:52.012
that are literally new on the face of the world, right?

00:19:52.012 --> 00:19:56.478
We have never had a reality in which normal people could have secrets

00:19:56.478 --> 00:19:57.653
from powerful people.

00:19:57.950 --> 00:20:00.523
But with the computer in your pocket, with that, [shows a smartphone]

00:20:00.523 --> 00:20:02.873
you can encrypt a message so thoroughly

00:20:02.873 --> 00:20:05.986
that if every hydrogen atom in the universe were turned into a computer

00:20:05.986 --> 00:20:08.552
and it did nothing until the heat death of the universe

00:20:08.552 --> 00:20:10.294
but try to guess what your key was,

00:20:10.309 --> 00:20:13.406
we would run out of universe before we ran out of possible keys.

00:20:13.716 --> 00:20:16.253
So, computers do give us the power to have secrets.

00:20:16.723 --> 00:20:21.361
The problem is that institutions prohibit the use of technology

00:20:21.361 --> 00:20:23.512
that allows you to take back your own privacy.

00:20:23.809 --> 00:20:26.080
It's funny, right? Because we take kids and we say to them:

00:20:26.080 --> 00:20:29.063
"Your privacy is like your virginity:

00:20:29.063 --> 00:20:31.448
once you've lost it, you'll never get it back.

00:20:31.719 --> 00:20:33.723
Watch out what you're putting on Facebook"

00:20:33.723 --> 00:20:36.298
-- and I think they should watch what they're putting on Facebook,

00:20:36.298 --> 00:20:40.247
I'm a Facebook vegan, I don't even -- I don't use it, but we say:

00:20:40.247 --> 00:20:41.808
"Watch what you're putting on Facebook,

00:20:41.808 --> 00:20:45.124
don't send out dirty pictures of yourself on SnapChat."

00:20:45.124 --> 00:20:46.500
All good advice.
00:20:46.500 --> 00:20:50.230 But we do it while we are taking away all the private information 00:20:50.230 --> 00:20:53.623 that they have, all of their privacy and all of their agency. 00:20:53.887 --> 00:20:55.580 You know, if a parent says to a kid: 00:20:55.580 --> 00:20:57.688 "You mustn't smoke because you'll get sick" 00:20:57.688 --> 00:21:00.393 and the parent says it while lighting a new cigarette 00:21:00.393 --> 00:21:02.877 off the one that she's just put down in the ashtray, 00:21:02.877 --> 00:21:05.872 the kid knows that what you're doing matters more than what you're saying. 00:21:05.872 --> 00:21:07.136 (Moderator) The point is a deficit of trust. 00:21:07.136 --> 00:21:09.423 It builds on the kind of work that David has been doing as well, 00:21:09.423 --> 00:21:11.896 this deficit of trust and privacy. 00:21:11.896 --> 00:21:13.291 And there is another point here: 00:21:13.291 --> 00:21:15.429 "Is the battle for privacy already lost? 00:21:15.429 --> 00:21:19.108 Are we already too comfortable with giving away our data?" 00:21:19.108 --> 00:21:20.376 (Doctorow) No, I don't think so at all. 00:21:20.376 --> 00:21:22.717 In fact, I think that if anything, we've reached 00:21:22.717 --> 00:21:25.071 peak indifference to surveillance, right? 00:21:25.071 --> 00:21:29.086 The surveillance itself is not over, not by a long shot. 00:21:29.086 --> 00:21:31.297 There will be more surveillance before there is less. 
00:21:31.547 --> 00:21:34.477 But there'll never be fewer people who care about surveillance 00:21:34.477 --> 00:21:35.791 than there are today, 00:21:35.791 --> 00:21:39.216 because, as privacy advocates, we spectacularly failed, 00:21:39.216 --> 00:21:42.452 over the last 20 years, to get people to take privacy seriously 00:21:42.792 --> 00:21:46.491 and now we have firms that are frankly incompetent, 00:21:46.491 --> 00:21:50.070 retaining huge amounts of our personally identifying sensitive information 00:21:50.070 --> 00:21:53.166 and those firms are leaking that information at speed. 00:21:53.166 --> 00:21:57.957 So, it started this year with things like (phone rings) -- is that me? that's me! -- 00:21:57.957 --> 00:22:00.708 It started this year with things like Ashley Madison 00:22:00.708 --> 00:22:01.581 (Moderator) Do you want to take it? 00:22:01.581 --> 00:22:04.652 (Doctorow) No no, that was my timer going off telling me I've had my 22 minutes 00:22:04.652 --> 00:22:06.490 (Moderator) It may be someone who doesn't trust my.... 00:22:06.490 --> 00:22:08.419 (Doctorow) No, that was -- that was my 22 minutes. 00:22:08.425 --> 00:22:14.033 So, it started with Ashley Madison, the Office of Personnel Management -- 00:22:14.033 --> 00:22:17.911 everyone who ever applied for security clearance in America 00:22:17.911 --> 00:22:20.746 had all of their sensitive information, everything you had to give them 00:22:20.746 --> 00:22:22.559 about why you shouldn't have security clearance, 00:22:22.559 --> 00:22:25.275 everything that's potentially compromising about you, 00:22:25.275 --> 00:22:28.906 all of it exfiltrated by what's believed to have been a Chinese spy ring. 00:22:30.056 --> 00:22:32.044 Something like one in twenty Americans now 00:22:32.044 --> 00:22:35.069 have had their data captured and exfiltrated from the United States. 
00:22:35.810 --> 00:22:40.003 This week, VTech, the largest electronic toy manufacturer in the world, 00:22:40.003 --> 00:22:43.764 leaked the personal information of at least five million children, 00:22:43.764 --> 00:22:47.148 including potentially photos that they took with their electronic toys, 00:22:47.148 --> 00:22:51.723 as well as their parents' information, their home addresses, their passwords, 00:22:51.723 --> 00:22:54.493 their parents' passwords and their password hints. 00:22:54.493 --> 00:22:56.532 So every couple of weeks, from now on, 00:22:56.532 --> 00:23:01.084 a couple of million people are going to show up 00:23:01.084 --> 00:23:01.376 at the door of people who care about privacy and say: 00:23:01.376 --> 00:23:03.477 "You were right all along, what do we do?" 00:23:03.764 --> 00:23:06.208 And the challenge is to give them something useful 00:23:06.208 --> 00:23:07.603 they can do about privacy. 00:23:07.603 --> 00:23:09.375 And there are steps that you can take personally. 00:23:09.637 --> 00:23:12.699 If you go to the Surveillance Defense Kit 00:23:12.699 --> 00:23:14.655 at the Electronic Frontier Foundation's website, 00:23:14.655 --> 00:23:16.824 you'll find a set of tools you can use. 00:23:16.824 --> 00:23:19.735 But ultimately, it's not an individual choice, it's a social one. 00:23:19.735 --> 00:23:23.638 Privacy is a team sport, because if you keep your information private 00:23:23.638 --> 00:23:26.796 and you send it to someone else who is in your social network, 00:23:26.796 --> 00:23:29.514 who doesn't keep it private, well, then it leaks out their back door. 00:23:29.514 --> 00:23:34.771 And so we need social movements to improve our privacy. 00:23:35.201 --> 00:23:36.886 We also need better tools. 00:23:36.886 --> 00:23:40.786 It's really clear that our privacy tools have fallen short of the mark 00:23:40.786 --> 00:23:43.636 in terms of being accessible to normal people. 
00:23:43.639 --> 00:23:44.730 (Moderator) Cory -- (Doctorow) I sense you want 00:23:44.730 --> 00:23:45.508 to say something: go on (Moderator) Yes. 00:23:45.508 --> 00:23:49.197 Yes, but what I want to do is keep pushing this down the route of education as well. 00:23:49.197 --> 00:23:53.184 There are two or three questions here which are very specific about that. 00:23:53.631 --> 00:23:55.820 Could you tell us more about how we help students 00:23:55.820 --> 00:23:57.264 become digital citizens? 00:23:57.609 --> 00:24:01.516 It links in important ways to the German and Nordic pre-digital concept 00:24:01.786 --> 00:24:03.841 of Bildung -- that's from Gertrud Fullend (check) -- 00:24:03.841 --> 00:24:07.871 and if you could give your daughter just one piece of advice for living 00:24:07.871 --> 00:24:11.107 in an age of digital surveillance, what would it be? 00:24:12.042 --> 00:24:14.801 (Doctorow) So, the one piece of advice I would give her is: 00:24:14.801 --> 00:24:17.960 "Don't settle for anything less than total control 00:24:17.960 --> 00:24:19.237 of the means of information. 00:24:19.522 --> 00:24:23.429 Anytime someone tells you that the computer says you must do something, 00:24:23.429 --> 00:24:27.159 you have to ask yourself why the computer isn't doing what you want it to do. 00:24:27.159 --> 00:24:28.546 Who is in charge here?" 00:24:30.086 --> 00:24:31.286 -- Thank you -- 00:24:31.286 --> 00:24:34.702 In terms of how we improve digital citizenship, 00:24:34.702 --> 00:24:38.088 I think we are at odds with our institutions in large part here 00:24:38.088 --> 00:24:41.655 and I think that our institutions have decreed that privacy 00:24:41.655 --> 00:24:43.663 is a luxury we can't afford to give our students 00:24:43.663 --> 00:24:45.990 because if we do, they might do something we wouldn't like. 
00:24:46.350 --> 00:24:50.796 And so, as teachers, the only thing that we can do without risking our jobs, 00:24:50.796 --> 00:24:54.454 without risking our students' education and their ability to stay in school, 00:24:54.454 --> 00:24:57.318 is to teach them how to do judo, to teach them 00:24:57.318 --> 00:25:00.704 how to ask critical questions, to gather data, 00:25:00.704 --> 00:25:02.764 to demand evidence-based policy. 00:25:02.764 --> 00:25:04.598 And ultimately, I actually think that's better. 00:25:04.598 --> 00:25:07.036 I think that teaching kids how to evade firewalls 00:25:07.036 --> 00:25:10.172 is way less interesting than teaching them how to identify 00:25:10.172 --> 00:25:11.838 who is providing that firewall, 00:25:11.838 --> 00:25:14.506 who made the decision to buy that firewall, 00:25:14.506 --> 00:25:15.682 how much they're spending, 00:25:16.372 --> 00:25:19.374 and what the process is for getting that person fired. 00:25:19.374 --> 00:25:22.161 That's a much more interesting piece of digital citizenship 00:25:22.535 --> 00:25:24.449 than any piece of firewall hacking. 00:25:24.449 --> 00:25:26.080 (Moderator) But Cory, you're telling kids to get hold of their information, 00:25:26.080 --> 00:25:29.308 but picking up on your analogy with the auto, 00:25:29.308 --> 00:25:33.070 where there's very clear lock-in, so much which has to be done 00:25:33.070 --> 00:25:36.828 has to be done through the manufacturers -- Martin Siebel (check) picking up -- 00:25:36.828 --> 00:25:39.915 How busy is the hacker community with breaking digital locks? 00:25:39.915 --> 00:25:43.106 In other words, digital locks are being put on so much, 00:25:43.546 --> 00:25:46.871 can the next generation, can the kids work out where these locks are, 00:25:46.871 --> 00:25:48.556 to keep control of their information? 
00:25:48.911 --> 00:25:52.492 (Doctorow) So, getting the digital locks off of our devices 00:25:52.492 --> 00:25:55.512 is not just a matter of technology, it's also a matter of policy. 00:25:55.512 --> 00:25:59.046 Laws like the EUCD and new treaty instruments like TTIP 00:25:59.046 --> 00:26:00.797 and the Transpacific Partnership 00:26:00.797 --> 00:26:04.633 specify that these digital locks can't be removed legally, 00:26:04.633 --> 00:26:08.722 that firms can use them to extract monopoly rents, to harm our security. 00:26:08.722 --> 00:26:12.960 And these treaties will make it impossible to remove these locks legally. 00:26:13.308 --> 00:26:15.612 It's not enough for there to be a demi-monde, in which 00:26:15.612 --> 00:26:18.552 people who are technologically savvy know how to break their iPhones. 00:26:18.985 --> 00:26:22.964 The problem with that is that it produces this world in which you can't know 00:26:22.964 --> 00:26:27.691 whether the software that you're using has anything bad lurking in it, right? 00:26:27.691 --> 00:26:32.656 To say to people who have iPhones, who want them reconfigured 00:26:32.656 --> 00:26:35.398 to accept software from parties that aren't approved by Apple, 00:26:35.398 --> 00:26:39.128 "well, all you need to do is find a random piece of software on the internet 00:26:39.128 --> 00:26:41.913 that claims it will allow you to do it, get it (check) on your phone, 00:26:41.913 --> 00:26:44.997 and then run this software on it" is like saying to people: 00:26:44.997 --> 00:26:48.199 "Well, all you need to do is just find some heroin lying around 00:26:48.199 --> 00:26:51.332 and put it in your arm: I'm sure it'll work out fine." Right? 00:26:51.992 --> 00:26:56.061 The ways in which a phone that is compromised can harm you 00:26:56.061 --> 00:26:57.725 are really without limits, right? 
00:26:57.725 --> 00:27:00.992 If you think about this, this is a rectangle with a camera and a microphone 00:27:00.992 --> 00:27:02.945 that you take into the bedroom and the toilet. 00:27:02.945 --> 00:27:05.769 And the only way you know whether that camera or microphone are on or off 00:27:05.769 --> 00:27:09.616 is if the software is doing what it says it's doing, right? 00:27:09.616 --> 00:27:13.257 So, I think that we have this policy dimension that's way more important 00:27:13.257 --> 00:27:14.648 than the technical dimension, 00:27:14.648 --> 00:27:17.170 and that the only way we're going to change that policy 00:27:17.170 --> 00:27:20.006 is by making kids know what they can't do, 00:27:20.391 --> 00:27:23.200 and then making them know why they're not allowed to do it, 00:27:23.200 --> 00:27:25.529 and then setting them loose. (Moderator) Two final points then, 00:27:25.529 --> 00:27:27.613 building on that: Ian ...........(check) 00:27:27.613 --> 00:27:30.027 but what do you think of recent laws, say, in the Netherlands, 00:27:30.357 --> 00:27:33.966 allowing police to spy on criminals and terrorists with webcams 00:27:33.966 --> 00:27:35.793 on their own laptops? 00:27:35.793 --> 00:27:38.611 In other words, the extension implicit in this question is, 00:27:38.611 --> 00:27:43.264 what about keeping an eye on others, including the next generation coming up, 00:27:43.264 --> 00:27:45.827 and this from Peter Andreesen (check), which is slightly connected: 00:27:45.827 --> 00:27:49.125 But when the predominant management tool in our business is still based 00:27:49.125 --> 00:27:55.367 on the assumption of control, how do we change that psychology and that mindset? 00:27:55.367 --> 00:27:58.247 And David is nodding at those views, you can see it at the back. 00:27:58.505 --> 00:28:02.336 (Doctorow) You know, there's not an operating system or suite of applications 00:28:02.336 --> 00:28:05.617 that bad guys use, and another set that the rest of us use. 
00:28:05.617 --> 00:28:09.252 And so, when our security services discover vulnerabilities 00:28:09.252 --> 00:28:13.197 in the tools that we rely on, and then hoard those vulnerabilities, 00:28:13.197 --> 00:28:15.957 so that they can weaponize them to attack criminals or terrorists 00:28:15.957 --> 00:28:18.104 or whoever they're after, 00:28:18.624 --> 00:28:22.596 they ensure that those vulnerabilities remain unpatched on our devices too. 00:28:22.596 --> 00:28:24.768 So that the people who are our adversaries, 00:28:24.768 --> 00:28:26.225 the criminals who want to attack us, 00:28:26.225 --> 00:28:30.097 the spies who want to exfiltrate our corporate secrets to their countries, 00:28:30.777 --> 00:28:35.799 the griefers, the autocratic regimes who use those vulnerabilities 00:28:35.799 --> 00:28:37.413 to spy on their population -- 00:28:37.413 --> 00:28:40.490 at Electronic Frontier Foundation, we have a client from Ethiopia, 00:28:40.490 --> 00:28:42.493 Mr. Kadani, who is a dissident journalist 00:28:42.493 --> 00:28:45.617 -- Ethiopia imprisons more journalists than any other country in the world -- 00:28:45.924 --> 00:28:49.503 he fled to America, where the Ethiopian government hacked his computer 00:28:49.769 --> 00:28:52.875 with a bug in Skype that they had discovered and not reported. 00:28:52.875 --> 00:28:56.399 They'd bought the tools to do this from Italy, from a company called Hacking Team, 00:28:56.399 --> 00:28:59.675 they hacked his computer, they found out who his colleagues were in Ethiopia 00:28:59.675 --> 00:29:02.118 and they arrested them and subjected them to torture. 00:29:02.501 --> 00:29:06.568 So, when our security services take a bug and weaponize it 00:29:06.568 --> 00:29:10.610 so they can attack their adversaries, they leave that bug intact 00:29:10.610 --> 00:29:12.935 so that our adversaries can attack us. 
00:29:12.935 --> 00:29:15.521 You know, this is -- I'm giving a talk later today, 00:29:15.521 --> 00:29:19.856 no matter what side you're on in the war on general-purpose computing, 00:29:19.856 --> 00:29:20.963 you're losing, 00:29:20.963 --> 00:29:23.516 because all we have right now is attack and not defense. 00:29:23.516 --> 00:29:27.844 Our security services are so concerned with making sure that their job is easy 00:29:27.844 --> 00:29:31.308 when they attack their adversaries, that they are neglecting the fact 00:29:31.308 --> 00:29:34.146 that they are making their adversaries' job easy in attacking us. 00:29:34.146 --> 00:29:36.138 If this were a football match, 00:29:36.138 --> 00:29:39.116 the score would be tied 400 to 400 after 10 minutes. 00:29:39.116 --> 00:29:40.664 (Moderator) Cory, I've got to stop you there. 00:29:40.664 --> 00:29:42.561 Thank you very much indeed. (Doctorow) Thank you very much. 00:29:42.561 --> 00:29:45.631 (Applause)