(Cory Doctorow) Thank you very much.

So I'd like to start with something of a benediction or permission. I am one of nature's fast talkers, and many of you are not native English speakers, or maybe not accustomed to my harsh Canadian accent. In addition, I've just come in from Australia, and so like many of you I am horribly jetlagged and have drunk enough coffee this morning to kill a rhino.

When I used to be at the United Nations, I was known as the scourge of the simultaneous translation corps. I would stand up and speak as slowly as I could, and turn around, and there they would be in their booths doing this. (laughter) When I start to speak too fast, this is the universal symbol -- my wife invented it -- for "Cory, you are talking too fast". Please, don't be shy.

So, I'm a parent, like many of you, and as I'm sure is true of all of you who are parents, parenting kicks my ass all the time.
And there are many regrets I have about the mere seven and a half years that I've been a parent, but none is so keenly felt as my regret over what's happened when I've been wandering around the house and seen my daughter working on something that was beyond her abilities, that was right at the edge of what she could do, where she was doing something she didn't have competence in yet. And you know, it's that amazing thing to see: that frowning concentration, tongue stuck out. As a parent, your heart swells with pride, and you can't help but go over and sort of peer over their shoulder at what they are doing. And those of you who are parents know what happens when you look too closely at someone who is working beyond the edge of their competence. They go back to doing something they're already good at. You interrupt a moment of genuine learning and replace it with a kind of embarrassment about what you're good at and what you're not.

So it matters a lot that our schools are increasingly surveilled environments, environments in which everything that our kids do is watched and recorded. Because when you do that, you interfere with those moments of real learning. Our ability to do things that we are not good at yet, that we are not proud of yet, is negatively impacted by that kind of scrutiny.
And that scrutiny comes from a strange place. We have decided that there are some programmatic means by which we can find all the web pages children shouldn't look at, and we will filter our networks to be sure that they don't see them. Anyone who has ever paid attention knows that this doesn't work. There are more web pages that kids shouldn't look at than can ever be cataloged, and any attempt to catalog them will always catch pages that kids must be looking at. Any of you who have ever taught a unit on reproductive health know the frustration of trying to get around a school network.

Now, this is done in the name of digital protection, but it flies in the face of digital literacy and of real learning. Because the only way to stop kids from looking at web pages they shouldn't be looking at is to take all of the clicks that they make, all of the messages that they send, all of their online activity, and offshore it to a firm that has some nonsensically arrived-at list of the bad pages. And so what we are doing is exfiltrating all of our students' data to unknown third parties. Now, most of these firms, their primary business isn't serving the education sector. Most of them serve the government sector. They primarily serve governments in repressive, autocratic regimes.
They help them ensure that their citizens aren't looking at Amnesty International web pages. They repackage those tools and sell them to our educators. So we are offshoring our children's clicks to war criminals.

And what our kids do now is they just get around it, because it's not hard to get around it. You know, never underestimate the power of a kid who is time-rich and cash-poor to get around our technological blockades. But when they do this, they don't acquire the kind of digital literacy that we want them to have, they don't acquire real digital agency, and moreover, they risk exclusion and, in extreme cases, they risk criminal prosecution.

So what if instead, those of us who are trapped in this system of teaching kids, where we're required to subject them to this kind of surveillance that flies in the face of their real learning, what if instead we invented curricular units that made them real first-class digital citizens, in charge of trying to influence real digital problems? Like, what if we said to them: "We want you to catalog the web pages that this vendor lets through that you shouldn't be seeing. We want you to catalog those pages that you should be seeing, that are blocked.
We want you to go and interview every teacher in the school about all those lesson plans that were carefully laid out before lunch, with a video and a web page, and over lunch, the unaccountable, distant censor blocked these critical resources and left them handing out photocopied worksheets in the afternoon instead of the unit they prepared. We want you to learn how to file Freedom of Information Act requests and find out what your school authority is spending to censor your internet access and surveil your activity. We want you to learn to use the internet to research these companies, and we want you to present this to your parent-teacher association, to your school authority, to your local newspaper." Because that's the kind of digital literacy that makes kids into first-class digital citizens, that prepares them for a future in which they can participate fully in a world that's changing.

Kids are the beta testers of the surveillance state. The path of surveillance technology starts with prisoners, moves to asylum seekers, people in mental institutions, and then to its first non-incarcerated population: children. And then it moves to blue-collar workers, government workers and white-collar workers.
And so, what we do to kids today is what we did to prisoners yesterday, and what we're going to be doing to you tomorrow. And so it matters what we teach our kids.

If you want to see where this goes, this is a kid named Blake Robbins, and he attended Lower Merion High School in Lower Merion, Pennsylvania, outside of Philadelphia. It's the most affluent public school district in America, so affluent that all the kids were issued MacBooks at the start of the year, and they had to do their homework on their MacBooks, and bring them to school every day and bring them home every night. And the MacBooks had been fitted with laptop theft recovery software, which is a fancy word for a rootkit, that let the school administration covertly operate the cameras and microphones on these computers, harvest files off of their hard drives, view all their clicks, and so on.

Now, Blake Robbins found out that the software existed, and how it was being used, because he and the head teacher had been knocking heads for years, since he first got into the school. And one day, the head teacher summoned him to his office and said: "Blake, I've got you now," and handed him a print-out of Blake in his bedroom the night before, taking what looked like a pill, and he said: "You're taking drugs."
And Blake Robbins said: "That's a candy, it's a Mike and Ike's candy, I take them -- I eat them when I'm studying. How did you get a picture of me in my bedroom?" This head teacher had taken over 6,000 photos of Blake Robbins: awake and asleep, dressed and undressed, in the presence of his family. And in the ensuing lawsuit, the school settled for a large amount of money and promised that they wouldn't do it again without informing the students that it was going on.

And increasingly, the practice now is that school administrations hand out laptops, because they're getting cheaper, with exactly the same kind of software, but they let the students know, and they find that that works even better at curbing the students' behavior, because the students know that they're always on camera.

Now, the surveillance state is moving from kids to the rest of the world. It's metastasizing. Our devices are increasingly designed to treat us as attackers, as suspicious parties who can't be trusted, because our devices' job is to do things that we don't want them to do. Now, that's not because the vendors who make our technology necessarily want to spy on us, but they want to take the ink-jet printer business model and bring it into every other realm of the world.
So the ink-jet printer business model is where you sell someone a device, and then you get a continuing revenue stream from that device by making sure that competitors can't make consumables or parts or additional features or plugins for that device without paying rent to the original manufacturer. And that allows you to maintain monopoly margins on your devices.

Now, in 1998, the American government passed a law called the Digital Millennium Copyright Act; in 2001, the European Union introduced its own version, the European Union Copyright Directive. And these two laws, along with laws all around the world, in Australia, Canada and elsewhere, prohibit removing the digital locks that are used to restrict access to copyrighted works. And they were originally envisioned as a way of making sure that Europeans didn't bring cheap DVDs in from America, or making sure that Australians didn't import cheap DVDs from China. And so you have a digital work, a DVD, and it has a lock on it, and to unlock it, you have to buy an authorized player, and the player checks to make sure you are in region, and making your own player that doesn't make that check is illegal, because you'd have to remove the digital lock.
And that was the original intent: it was to allow high rents to be maintained on removable media, DVDs and other entertainment content. But it very quickly spread into new realms.

So, for example, auto manufacturers now lock up all of their cars' telemetry with digital locks. If you're a mechanic and want to fix a car, you have to get a reader from the manufacturer to make sure that you can see the telemetry, and then know what parts to order and how to fix it. And in order to get this reader, you have to promise the manufacturer that you will only buy parts from that manufacturer and not from third parties. So the manufacturers can keep the repair costs high and get a secondary revenue stream out of the cars.

This year, the Chrysler corporation filed comments with the US Copyright Office to say that they believed that this was the right way to do it, and that it should be a felony, punishable by 5 years in prison and a $500,000 fine, to change the locks on a car that you own, so that you can choose who fixes it. It turned out that when they advertised -- well, where is my slide here?
Oh, there we go -- when they advertised that it wasn't your father's Oldsmobile, they weren't speaking metaphorically, they literally meant that even though your father bought the Oldsmobile, it remained their property in perpetuity.

And it's not just cars, it's every kind of device, because every kind of device today has a computer in it. The John Deere Company, the world's leading seller of heavy equipment and agricultural equipment technologies, they now view their tractors as information-gathering platforms, and they view the people who use them as the kind of inconvenient gut flora of their ecosystem. So if you are a farmer and you own a John Deere tractor, when you drive it around your fields, the torque sensors on the wheels conduct a centimeter-accurate soil density survey of your agricultural land. That would be extremely useful to you when you're planting your seeds, but that data is not available to you unless you remove the digital lock from your John Deere tractor, which, again, is against the law everywhere in the world. Instead, in order to get that data, you have to buy a bundle with seeds from Monsanto, who are John Deere's seed partners.
John Deere then takes this data that they aggregate across whole regions, and they use it to gain insight into regional crop yields, which they use to play the futures market. John Deere's tractors are really just a way of gathering information, and the farmers are secondary to it. Just because you own it doesn't mean it's yours.

And it's not just the computers that we put our bodies into that have this business model. It's the computers that we put inside of our bodies. If you're someone who is diabetic and you're fitted with a continuous glucose-measuring insulin pump, that insulin pump is designed with a digital lock that makes sure that your doctor can only use the manufacturer's software to read the data coming off of it, and that software is resold on a rolling annual license and can't just be bought outright. And the digital locks are also used to make sure that you only buy the insulin that the vendor has approved, and not generic insulin that might be cheaper. We've literally turned human beings into ink-jet printers.

Now, this has really deep implications beyond the economic implications.
Because the rules that prohibit breaking these digital locks also prohibit telling people about flaws that programmers made, because if you know about a flaw that a programmer made, you can use it to break the digital lock. And that means that the errors, the vulnerabilities, the mistakes in our devices fester in them, they go on and on and on, and our devices become these long-lived reservoirs of digital pathogens. And we've seen how that plays out. One of the reasons that Volkswagen was able to get away with their diesel cheating for so long is because no one could independently audit their firmware. It's happening all over the place. You may have seen this summer that Chrysler had to recall 1.4 million Jeeps, because it turned out that they could be remotely controlled over the internet while driving down a motorway, and have their brakes and steering commandeered by anyone, anywhere in the world, over the internet.

We only have one methodology for determining whether security works, and that's to submit it to public scrutiny, to allow other people to see what assumptions you've made.
Anyone can design a security system that he himself can't think of a way of breaking, but all that means is that you've designed a security system that works against people who are stupider than you. And in this regard, security is no different from any other kind of knowledge creation. You know, before we had contemporary science and scholarship, we had something that looked a lot like it, called alchemy. And for 500 years, alchemists kept what they thought they knew a secret. And that meant that every alchemist was capable of falling prey to that most urgent of human frailties, which is our ability to fool ourselves. And so every alchemist discovered for himself, in the hardest way possible, that drinking mercury was a bad idea. We call that 500-year period the Dark Ages, and we call the moment at which they started publishing and submitting themselves to adversarial peer review -- which is when your friends tell you about the mistakes that you've made, and your enemies call you an idiot for having made them -- we call that moment the Enlightenment.

Now, this has profound implications. The restriction of our ability to alter the security of our devices matters for our surveillance society, for our ability to be free people in society.
At the height of the GDR, in 1989, the Stasi had one snitch for every 60 people in East Germany, in order to surveil the entire country. A couple of decades later, we found out through Edward Snowden that the NSA was spying on everybody in the world. And the ratio of people who work at the NSA to people they are spying on is more like 1 in 10,000. They've achieved a two-and-a-half-order-of-magnitude productivity gain in surveillance. And the way that they got there is, in part, by the fact that we use devices that we're not allowed to alter, that are designed to treat us as attackers, and that gather an enormous amount of information on us.

If the government told you that you were required to carry around a small electronic rectangle that recorded all of your social relationships, all of your movements, all of your transient thoughts that you made known or ever looked up, and would make that available to the state, and you would have to pay for it, you would revolt. But the phone companies, along with the mobile phone vendors, have managed to convince us that we should foot the bill for our own surveillance. It's a bit like during the Cultural Revolution, where, after your family members were executed, they sent you a bill for the bullet.
So, this has big implications, as I said, for where we go as a society. Because just as our kids have a hard time functioning in the presence of surveillance, and learning, and advancing their own knowledge, we as a society have a hard time progressing in the presence of surveillance. In our own living memory, people who are today thought of as normal and right were doing something that a generation ago would have been illegal and landed them in jail. For example, you probably know someone who's married to a partner of the same sex. If you live in America, you may know someone who takes medical marijuana, or if you live in the Netherlands. And not that long ago, people who undertook these activities could have gone to jail for them, could have faced enormous social exclusion for them. The way that we got from there to here was by having a private zone, a place where people weren't surveilled, in which they could advance their interests and ideas, do things that were thought of as socially unacceptable, and slowly change our social attitudes. And in...
(check) ... that in 50 years, your grandchildren will sit around the Christmas table, in 2065, and say: "How was it, grandma, how was it, grandpa, that in 2015 you got it all right, and we haven't had any social changes since then?" Then you have to ask yourself how, in a world in which we are all under continuous surveillance, we are going to find a way to improve this.

So, our kids need ICT literacy, but ICT literacy isn't just typing skills or learning how to use PowerPoint. It's learning how to think critically about how they relate to the means of information, about whether they are its masters or its servants. Our networks are not the most important issue that we have. There are much more important issues in society and in the world today. The future of the internet is way less important than the future of our climate, the future of gender equity, the future of racial equity, the future of the wage gap and the wealth gap in the world. But every one of those fights is going to be fought -- and won or lost -- on the internet: it's our most foundational fight. So, you know (check), computers can make us more free, or they can take away our freedom. It all comes down to how we regulate them and how we use them.
And it's our job, as people who are training the next generation, and whose next generation is beta-testing the surveillance technology that will be coming to us, it's our job to teach them to seize the means of information, to make themselves self-determinant in the way that they use their networks, and to find ways to show them how to be critical and how to be smart and how to be, above all, subversive, and how to use the technology around them.

Thank you.

(Applause)

(Moderator) Cory, thank you very much indeed.

(Doctorow) Thank you.

(Moderator) And I've got a bundle of points which you've stimulated from many in the audience, which sent --

(Doctorow) I'm shocked to hear that that was at all controversial, but go on.

(Moderator) I didn't say "controversial", you stimulated thinking, which is great. But a lot of them resonate around violation of secrecy and security. And this, for example, from Anneke Burgess (check): "Is there a way for students to protect themselves from privacy violations by institutions they are supposed to trust?" I think this is probably a question for William Golding (check) as well, someone who is a senior figure in a major university, but this issue of privacy violations and trust.

(Doctorow) Well, I think that computers have a curious dual nature.
So on the one hand, they do expose us to an enormous amount of scrutiny, depending on how they are configured. But on the other hand, computers have brought new powers to us that are literally new on the face of the world, right? We have never had a reality in which normal people could have secrets from powerful people. But with the computer in your pocket, you can encrypt a message so thoroughly that if every hydrogen atom in the universe were turned into a computer, and it did nothing until the heat death of the universe but try to guess what your key was, we would run out of universe before we ran out of possible keys. So computers do give us the power to have secrets. The problem is that institutions prohibit the use of the technology that allows you to take back your own privacy.

It's funny, right? Because we take kids and we say to them: "Your privacy is like your virginity: once you've lost it, you'll never get it back. Watch out what you're putting on Facebook" -- and I think they should watch what they're putting on Facebook, I'm a Facebook vegan, I don't even -- I don't use it, but we say: "Watch what you're putting on Facebook, don't send out dirty pictures of yourself on SnapChat." All good advice.
99:59:59.999 --> 99:59:59.999 But we do it while we are taking away all the private information 99:59:59.999 --> 99:59:59.999 that they have, all of their privacy and all of their agency. 99:59:59.999 --> 99:59:59.999 You know, if a parent says to a kid: 99:59:59.999 --> 99:59:59.999 "You mustn't smoke because you'll get sick" 99:59:59.999 --> 99:59:59.999 and the parent says it while lighting a new cigarette 99:59:59.999 --> 99:59:59.999 off the one that she's just put down in the ashtray, 99:59:59.999 --> 99:59:59.999 the kid knows that what you're doing matters more than what you're saying. 99:59:59.999 --> 99:59:59.999 (Moderator) The point is deficit of trust. 99:59:59.999 --> 99:59:59.999 It builds on the kind of work that David has been doing as well, 99:59:59.999 --> 99:59:59.999 this deficit of trust and privacy. 99:59:59.999 --> 99:59:59.999 And there is another point here: 99:59:59.999 --> 99:59:59.999 "Is the battle for privacy already lost? 99:59:59.999 --> 99:59:59.999 Are we already too comfortable with giving away our data?" 99:59:59.999 --> 99:59:59.999 (Doctorow) No, I don't think so at all. 99:59:59.999 --> 99:59:59.999 In fact, I think that if anything, we've reached 99:59:59.999 --> 99:59:59.999 peak indifference to surveillance, right? 99:59:59.999 --> 99:59:59.999 Surveillance itself is nowhere near its peak, by a long shot. 99:59:59.999 --> 99:59:59.999 There will be more surveillance before there is less. 
99:59:59.999 --> 99:59:59.999 But there'll never be fewer people who care about surveillance 99:59:59.999 --> 99:59:59.999 than there are today, 99:59:59.999 --> 99:59:59.999 because, as privacy advocates, we spectacularly failed, 99:59:59.999 --> 99:59:59.999 over the last 20 years, to get people to take privacy seriously 99:59:59.999 --> 99:59:59.999 and now we have firms that are frankly incompetent, 99:59:59.999 --> 99:59:59.999 retaining huge amounts of our personally identifying sensitive information 99:59:59.999 --> 99:59:59.999 and those firms are leaking that information at speed. 99:59:59.999 --> 99:59:59.999 So, it started this year with things like (phone rings) -- is that me? it's me! -- 99:59:59.999 --> 99:59:59.999 It started this year with things like Ashley Madison 99:59:59.999 --> 99:59:59.999 (Moderator) Do you want to take it? 99:59:59.999 --> 99:59:59.999 (Doctorow) No no, that was my timer going off telling me I've had my 22 minutes 99:59:59.999 --> 99:59:59.999 (Moderator) It may be someone who doesn't trust my.... 99:59:59.999 --> 99:59:59.999 (Doctorow) No, no, that was 22 minutes. 99:59:59.999 --> 99:59:59.999 So, it started with Ashley Madison, the Office of Personnel Management -- 99:59:59.999 --> 99:59:59.999 everyone who ever applied for security clearance in America 99:59:59.999 --> 99:59:59.999 had all of their sensitive information, everything you had to give them 99:59:59.999 --> 99:59:59.999 about why you shouldn't have security clearance, 99:59:59.999 --> 99:59:59.999 everything that's potentially compromising about you, 99:59:59.999 --> 99:59:59.999 all of it exfiltrated by what's believed to have been a Chinese spy ring. 99:59:59.999 --> 99:59:59.999 Something like one in twenty Americans now 99:59:59.999 --> 99:59:59.999 have had their data captured and exfiltrated from the United States. 
99:59:59.999 --> 99:59:59.999 This week, VTech, the largest electronic toy manufacturer in the world, 99:59:59.999 --> 99:59:59.999 leaked the personal information of at least five million children, 99:59:59.999 --> 99:59:59.999 including potentially photos that they took with their electronic toys, 99:59:59.999 --> 99:59:59.999 as well as their parents' information, their home addresses, their passwords, 99:59:59.999 --> 99:59:59.999 their parents' passwords and their password hints. 99:59:59.999 --> 99:59:59.999 So every couple of weeks, from now on, 99:59:59.999 --> 99:59:59.999 a couple of million people are going to show up 99:59:59.999 --> 99:59:59.999 at the door of people who care about privacy and say: 99:59:59.999 --> 99:59:59.999 "You were right all along, what do we do?" 99:59:59.999 --> 99:59:59.999 And the challenge is to give them something useful 99:59:59.999 --> 99:59:59.999 they can do about privacy. 99:59:59.999 --> 99:59:59.999 And there are steps that you can take personally. 99:59:59.999 --> 99:59:59.999 If you go to the Surveillance Self-Defense kit 99:59:59.999 --> 99:59:59.999 at the Electronic Frontier Foundation's website, 99:59:59.999 --> 99:59:59.999 you'll find a set of tools you can use. 99:59:59.999 --> 99:59:59.999 But ultimately, it's not an individual choice. 99:59:59.999 --> 99:59:59.999 It's a social one. 99:59:59.999 --> 99:59:59.999 Privacy is a team sport, because if you keep your information private 99:59:59.999 --> 99:59:59.999 and you send it to someone else who is in your social network, 99:59:59.999 --> 99:59:59.999 who doesn't keep it private, well, then it leaks out their back door. 99:59:59.999 --> 99:59:59.999 And so we need social movements to improve our privacy. 99:59:59.999 --> 99:59:59.999 We also need better tools. 99:59:59.999 --> 99:59:59.999 It's really clear that our privacy tools have fallen short of the mark 99:59:59.999 --> 99:59:59.999 in terms of being accessible to normal people. 
99:59:59.999 --> 99:59:59.999 (Moderator) Cory -- (Doctorow) I sense you want 99:59:59.999 --> 99:59:59.999 to say something: go on. (Moderator) Yes. 99:59:59.999 --> 99:59:59.999 But what I want to do is keep pushing this down the route of education as well. 99:59:59.999 --> 99:59:59.999 There are two or three questions here which are very specific about that. 99:59:59.999 --> 99:59:59.999 Could you tell us more about how we help students 99:59:59.999 --> 99:59:59.999 become digital citizens? 99:59:59.999 --> 99:59:59.999 It links in important ways to the German and Nordic pre-digital concept 99:59:59.999 --> 99:59:59.999 of Bildung ............... (check) 99:59:59.999 --> 99:59:59.999 And if you could give your daughter just one piece of advice for living 99:59:59.999 --> 99:59:59.999 in an age of digital surveillance, what would it be? 99:59:59.999 --> 99:59:59.999 (Doctorow) So, the one piece of advice I would give her is: 99:59:59.999 --> 99:59:59.999 "Don't settle for anything less than total control 99:59:59.999 --> 99:59:59.999 of the means of information. 99:59:59.999 --> 99:59:59.999 Anytime someone tells you that the computer says you must do something, 99:59:59.999 --> 99:59:59.999 you have to ask yourself why the computer isn't doing what you want it to do. 99:59:59.999 --> 99:59:59.999 Who is in charge here?" 99:59:59.999 --> 99:59:59.999 In terms of how we improve digital citizenship, 99:59:59.999 --> 99:59:59.999 I think we are at odds with our institutions in large part here. 99:59:59.999 --> 99:59:59.999 I think that our institutions have decreed that privacy 99:59:59.999 --> 99:59:59.999 is a luxury we can't afford to give our students, 99:59:59.999 --> 99:59:59.999 because if we do, they might do something we wouldn't like. 
99:59:59.999 --> 99:59:59.999 And so, as teachers, the only thing that we can do without risking our jobs, 99:59:59.999 --> 99:59:59.999 without risking our students' education and their ability to stay in school, 99:59:59.999 --> 99:59:59.999 is to teach them how to do judo, to teach them 99:59:59.999 --> 99:59:59.999 how to ask critical questions, to gather data, 99:59:59.999 --> 99:59:59.999 to demand evidence-based policy. 99:59:59.999 --> 99:59:59.999 And ultimately, I actually think that's better. 99:59:59.999 --> 99:59:59.999 I think that teaching kids how to evade firewalls 99:59:59.999 --> 99:59:59.999 is way less interesting than teaching them how to identify 99:59:59.999 --> 99:59:59.999 who is providing that firewall, 99:59:59.999 --> 99:59:59.999 who made the decision to buy that firewall, 99:59:59.999 --> 99:59:59.999 how much they're spending, 99:59:59.999 --> 99:59:59.999 and what the process is for getting that person fired. 99:59:59.999 --> 99:59:59.999 That's a much more interesting piece of digital citizenship 99:59:59.999 --> 99:59:59.999 than any piece of firewall hacking. 99:59:59.999 --> 99:59:59.999 (Moderator) But Cory, you're teaching kids to get hold of their information, 99:59:59.999 --> 99:59:59.999 but picking up on your analogy with the auto (check), 99:59:59.999 --> 99:59:59.999 where there's a very clear lock on, so much which has to be done 99:59:59.999 --> 99:59:59.999 has to be done through the manufacturers -- Martin Siebel (check) picking up -- 99:59:59.999 --> 99:59:59.999 How busy is the hacker community with breaking digital locks? 99:59:59.999 --> 99:59:59.999 In other words, digital locks are being put on so much; 99:59:59.999 --> 99:59:59.999 can the next generation, can the kids work out where these locks are, 99:59:59.999 --> 99:59:59.999 to keep control of their information? 
99:59:59.999 --> 99:59:59.999 (Doctorow) So, getting the digital locks off of our devices 99:59:59.999 --> 99:59:59.999 is not just a matter of technology, it's also a matter of policy. 99:59:59.999 --> 99:59:59.999 Laws like the EUCD and new treaty instruments like TTIP 99:59:59.999 --> 99:59:59.999 and the Trans-Pacific Partnership 99:59:59.999 --> 99:59:59.999 specify that these digital locks can't be removed legally, 99:59:59.999 --> 99:59:59.999 that firms can use them to extract monopoly ransom, to harm our security. 99:59:59.999 --> 99:59:59.999 And these treaties will make it impossible to remove these locks legally. 99:59:59.999 --> 99:59:59.999 It's not enough for there to be a demi-monde in which 99:59:59.999 --> 99:59:59.999 people who are technologically savvy know how to break their iPhones. 99:59:59.999 --> 99:59:59.999 The problem with that is that it produces this world in which you can't know 99:59:59.999 --> 99:59:59.999 whether the software that you're using has anything bad lurking in it, right? 99:59:59.999 --> 99:59:59.999 To say to people who have iPhones that they want reconfigured 99:59:59.999 --> 99:59:59.999 to accept software from parties that aren't approved by Apple, 99:59:59.999 --> 99:59:59.999 "well, all you need to do is find a random piece of software on the internet 99:59:59.999 --> 99:59:59.999 that claims it will allow you to do it, get it (check) on your phone, 99:59:59.999 --> 99:59:59.999 and then run this software on it," is like saying to people: 99:59:59.999 --> 99:59:59.999 "Well, all you need to do is just find some heroin lying around 99:59:59.999 --> 99:59:59.999 and put it in your arm: I'm sure it'll work out fine." Right? 99:59:59.999 --> 99:59:59.999 The ways in which a phone that is compromised can harm you 99:59:59.999 --> 99:59:59.999 are really without limits, right? 
99:59:59.999 --> 99:59:59.999 If you think about this, this is a rectangle with a camera and a microphone 99:59:59.999 --> 99:59:59.999 that you take into the bedroom and the toilet, 99:59:59.999 --> 99:59:59.999 and the only way you know whether that camera or microphone are on or off 99:59:59.999 --> 99:59:59.999 is if the software is doing what it says it's doing, right? 99:59:59.999 --> 99:59:59.999 So, I think that we have this policy dimension that's way more important 99:59:59.999 --> 99:59:59.999 than the technical dimension, 99:59:59.999 --> 99:59:59.999 and that the only way we're going to change that policy 99:59:59.999 --> 99:59:59.999 is by making kids know what they can't do, 99:59:59.999 --> 99:59:59.999 and then making them know why they're not allowed to do it, 99:59:59.999 --> 99:59:59.999 and then setting them loose. (Moderator) Two final points then, 99:59:59.999 --> 99:59:59.999 building on their yen ............... (check) 99:59:59.999 --> 99:59:59.999 But what do you think of recent laws, say, in the Netherlands, 99:59:59.999 --> 99:59:59.999 allowing police to spy on criminals and terrorists with webcams 99:59:59.999 --> 99:59:59.999 on their own laptops? 99:59:59.999 --> 99:59:59.999 In other words, the extension implicit in this question is: 99:59:59.999 --> 99:59:59.999 what about keeping an eye on others, including the next generation coming up? 99:59:59.999 --> 99:59:59.999 And this from Peter Andreesen (check), which is slightly connected: 99:59:59.999 --> 99:59:59.999 When the predominant management tool in our business is still based 99:59:59.999 --> 99:59:59.999 on the assumption of control, how do we change that psychology and that mindset? 99:59:59.999 --> 99:59:59.999 And David is nodding for those views ........ (check) at the back. 99:59:59.999 --> 99:59:59.999 (Doctorow) You know, there's not an operating system or suite of applications 99:59:59.999 --> 99:59:59.999 that bad guys use, and another set that the rest of us use. 
99:59:59.999 --> 99:59:59.999 And so, when our security services discover vulnerabilities 99:59:59.999 --> 99:59:59.999 in the tools that we rely on, and then hoard those vulnerabilities 99:59:59.999 --> 99:59:59.999 so that they can weaponize them to attack criminals or terrorists 99:59:59.999 --> 99:59:59.999 or whoever they're after, 99:59:59.999 --> 99:59:59.999 they ensure that those vulnerabilities remain unpatched on our devices too. 99:59:59.999 --> 99:59:59.999 So that the people who are our adversaries, 99:59:59.999 --> 99:59:59.999 the criminals who want to attack us, 99:59:59.999 --> 99:59:59.999 the spies who want to exfiltrate our corporate secrets to their countries, 99:59:59.999 --> 99:59:59.999 the griefers, the autocratic regimes who use those vulnerabilities 99:59:59.999 --> 99:59:59.999 to spy on their population -- 99:59:59.999 --> 99:59:59.999 at Electronic Frontier Foundation, we have a client from Ethiopia, 99:59:59.999 --> 99:59:59.999 Mr. Kadani, who is a dissident journalist 99:59:59.999 --> 99:59:59.999 -- Ethiopia imprisons more journalists than any other country in the world -- 99:59:59.999 --> 99:59:59.999 he fled to America, where the Ethiopian government hacked his computer 99:59:59.999 --> 99:59:59.999 with a bug in Skype that they had discovered and not reported. 99:59:59.999 --> 99:59:59.999 They'd bought the tools to do this from Italy, from a company called Hacking Team; 99:59:59.999 --> 99:59:59.999 they hacked his computer, they found out who his colleagues were in Ethiopia 99:59:59.999 --> 99:59:59.999 and they arrested them and subjected them to torture. 99:59:59.999 --> 99:59:59.999 So, when our security services take a bug and weaponize it 99:59:59.999 --> 99:59:59.999 so they can attack their adversaries, they leave that bug intact 99:59:59.999 --> 99:59:59.999 so that our adversaries can attack us. 
99:59:59.999 --> 99:59:59.999 You know, this is -- I'm giving a talk later today -- 99:59:59.999 --> 99:59:59.999 no matter what side you're on in the war on general-purpose computing, 99:59:59.999 --> 99:59:59.999 you're losing, 99:59:59.999 --> 99:59:59.999 because all we have right now is attack and not defense. 99:59:59.999 --> 99:59:59.999 Our security services are so concerned with making sure that their job is easy 99:59:59.999 --> 99:59:59.999 when they attack their adversaries, that they are neglecting the fact 99:59:59.999 --> 99:59:59.999 that they are making their adversaries' job easy in attacking us. 99:59:59.999 --> 99:59:59.999 If this were a football match, 99:59:59.999 --> 99:59:59.999 the score would be tied 400 to 400 after 10 minutes. 99:59:59.999 --> 99:59:59.999 (Moderator) Cory, I've got to stop you there. 99:59:59.999 --> 99:59:59.999 Thank you very much indeed. (Doctorow) Thank you very much.