0:00:00.000,0:00:00.778 (Cory Doctorow) Thank you very much 0:00:01.066,0:00:05.420 So I'd like to start with something of a[br]benediction or permission. 0:00:05.420,0:00:07.740 I am one of nature's fast talkers 0:00:07.740,0:00:10.620 and many of you are not [br]native English speakers, or 0:00:10.620,0:00:12.919 maybe not accustomed [br]to my harsh Canadian accent 0:00:12.919,0:00:15.919 in addition I've just come in [br]from Australia 0:00:15.919,0:00:19.166 and so like many of you I am horribly[br]jetlagged and have drunk enough coffee 0:00:19.166,0:00:20.736 this morning to kill a rhino. 0:00:22.112,0:00:23.587 When I used to be at the United Nations 0:00:23.587,0:00:27.023 I was known as the scourge of the[br]simultaneous translation corps 0:00:27.157,0:00:29.528 I would stand up and speak [br]as slowly as I could 0:00:29.576,0:00:32.229 and turn around, and there they[br]would be in their booths doing this 0:00:32.229,0:00:35.136 (laughter)[br]When I start to speak too fast, 0:00:35.136,0:00:37.557 this is the universal symbol --[br]my wife invented it -- 0:00:37.557,0:00:41.518 for "Cory, you are talking too fast".[br]Please, don't be shy. 0:00:42.078,0:00:45.790 So, I'm a parent, like many of you [br]and I'm like I'm sure all of you 0:00:45.790,0:00:48.850 who are parents, parenting kicks my ass [br]all the time. 0:00:50.030,0:00:54.820 And there are many regrets I have[br]about the mere seven and a half years 0:00:54.820,0:00:57.284 that I've been a parent [br]but none are so keenly felt 0:00:57.784,0:01:00.878 as my regrets over what's happened [br]when I've been wandering 0:01:00.878,0:01:04.766 around the house and seen my[br]daughter working on something 0:01:04.766,0:01:08.697 that was beyond her abilities, that was [br]right at the edge of what she could do 0:01:08.697,0:01:12.759 and where she was doing something [br]that she didn't have competence in yet 0:01:12.759,0:01:17.019 and you know it's that amazing thing [br]to see that frowning concentration, 0:01:17.019,0:01:20.038 tongue stuck out: as a parent, your[br]heart swells with pride 0:01:20.038,0:01:21.332 and you can't help but go over 0:01:21.332,0:01:23.684 and sort of peer over their shoulder[br]at what they are doing 0:01:23.684,0:01:27.329 and those of you who are parents know [br]what happens when you look too closely 0:01:27.581,0:01:30.205 at someone who is working [br]beyond the edge of their competence. 0:01:30.565,0:01:32.678 They go back to doing something[br]they're already good at. 0:01:32.973,0:01:35.042 You interrupt a moment [br]of genuine learning 0:01:35.295,0:01:38.491 and you replace it with [br]a kind of embarrassment 0:01:38.859,0:01:42.384 about what you're good at [br]and what you're not. 0:01:42.979,0:01:48.051 So, it matters a lot that our schools are[br]increasingly surveilled environments, 0:01:48.051,0:01:52.401 environments in which everything that [br]our kids do is watched and recorded. 0:01:52.791,0:01:56.439 Because when you do that, you interfere[br]with those moments of real learning. 0:01:56.781,0:02:00.665 Our ability to do things that we are not [br]good at yet, that we are not proud of yet, 0:02:00.665,0:02:03.524 is negatively impacted [br]by that kind of scrutiny. 0:02:03.924,0:02:06.022 And that scrutiny comes [br]from a strange place.
0:02:06.731,0:02:10.567 We have decided that there are [br]some programmatic means 0:02:10.567,0:02:13.678 by which we can find all the web pages[br]children shouldn't look at 0:02:14.136,0:02:18.321 and we will filter our networks[br]to be sure that they don't see them. 0:02:18.719,0:02:21.883 Anyone who has ever paid attention[br]knows that this doesn't work. 0:02:22.351,0:02:25.524 There are more web pages [br]that kids shouldn't look at 0:02:25.524,0:02:29.043 than can ever be cataloged, [br]and any attempt to catalog them 0:02:29.043,0:02:32.031 will always catch pages that kids[br]must be able to look at. 0:02:32.031,0:02:34.795 Any of you who have ever taught [br]a unit on reproductive health 0:02:35.122,0:02:37.835 know the frustration of trying [br]to get round a school network. 0:02:38.597,0:02:42.429 Now, this is done in the name of[br]digital protection 0:02:42.643,0:02:46.342 but it flies in the face of digital[br]literacy and of real learning. 0:02:46.613,0:02:50.332 Because the only way to stop kids [br]from looking at web pages 0:02:50.332,0:02:51.607 they shouldn't be looking at 0:02:51.607,0:02:55.771 is to take all of the clicks that they [br]make, all of the messages that they send, 0:02:55.771,0:02:59.736 all of their online activity [br]and offshore it to a firm 0:02:59.736,0:03:04.006 that has some nonsensically arrived at[br]list of the bad pages. 0:03:04.370,0:03:07.815 And so, what we are doing is that we're [br]exfiltrating all of our students' data 0:03:08.279,0:03:10.256 to unknown third parties. 0:03:10.256,0:03:12.529 Now, most of these firms, [br]their primary business isn't 0:03:12.529,0:03:14.237 serving the education sector. 0:03:14.237,0:03:16.347 Most of them service [br]the government sector. 0:03:16.540,0:03:21.328 They primarily service governments in[br]repressive autocratic regimes. 0:03:21.328,0:03:23.600 They help them ensure that [br]their citizens aren't looking at 0:03:23.600,0:03:25.376 Amnesty International web pages. 0:03:25.636,0:03:29.433 They repackage those tools [br]and sell them to our educators. 0:03:29.835,0:03:32.975 So we are offshoring our children's clicks[br]to war criminals. 0:03:33.791,0:03:37.197 And what our kids do, now,[br]is they just get around it, 0:03:37.197,0:03:38.773 because it's not hard to get around it. 0:03:38.773,0:03:43.940 You know, never underestimate the power[br]of a kid who is time-rich and cash-poor 0:03:43.940,0:03:46.234 to get around our [br]technological blockades. 0:03:47.508,0:03:51.006 But when they do this, they don't acquire[br]the kind of digital literacy 0:03:51.006,0:03:54.075 that we want them to, they don't [br]acquire real digital agency 0:03:54.400,0:03:57.783 and moreover, they risk exclusion [br]and in extreme cases, 0:03:57.783,0:03:59.366 they risk criminal prosecution. 0:04:00.220,0:04:04.085 So what if instead, those of us who are[br]trapped in this system of teaching kids 0:04:04.085,0:04:07.850 where we're required to subject them[br]to this kind of surveillance 0:04:07.850,0:04:10.167 that flies in the face [br]of their real learning, 0:04:10.167,0:04:13.247 what if instead, we invented [br]curricular units 0:04:13.247,0:04:16.459 that made them real first-class[br]digital citizens, 0:04:16.459,0:04:19.590 in charge of trying to influence[br]real digital problems? 0:04:19.590,0:04:22.833 Like what if we said to them: [br]"We want you to catalog the web pages 0:04:22.833,0:04:25.271 that this vendor lets through [br]that you shouldn't be seeing.
0:04:25.499,0:04:29.424 We want you to catalog those pages that [br]you should be seeing, that are blocked. 0:04:29.426,0:04:31.948 We want you to go and interview [br]every teacher in the school 0:04:31.948,0:04:35.306 about all those lesson plans that were[br]carefully laid out before lunch 0:04:35.306,0:04:37.966 with a video and a web page, [br]and over lunch, 0:04:37.966,0:04:41.424 the unaccountable distant censor[br]blocked these critical resources 0:04:41.424,0:04:44.796 and left them handing out photocopied [br]worksheets in the afternoon 0:04:44.796,0:04:47.073 instead of the unit they prepared. 0:04:47.073,0:04:50.837 We want you to learn how to do Freedom[br]of Information Act requests 0:04:50.837,0:04:53.377 and find out what your [br]school authority is spending 0:04:53.377,0:04:56.371 to censor your internet access [br]and surveil your activity. 0:04:56.371,0:04:59.855 We want you to learn to use the internet [br]to research these companies 0:04:59.855,0:05:04.329 and we want you to present this [br]to your parent-teacher association, 0:05:04.329,0:05:06.570 to your school authority,[br]to your local newspaper." 0:05:06.981,0:05:08.917 Because that's the kind [br]of digital literacy 0:05:09.187,0:05:11.310 that makes kids into first-class[br]digital citizens, 0:05:11.310,0:05:16.083 that prepares them for a future [br]in which they can participate fully 0:05:16.083,0:05:18.080 in a world that's changing. 0:05:19.019,0:05:22.677 Kids are the beta-testers [br]of the surveillance state. 0:05:22.919,0:05:26.713 The path of surveillance technology[br]starts with prisoners, 0:05:27.157,0:05:30.409 moves to asylum seekers, [br]people in mental institutions 0:05:30.409,0:05:33.919 and then to its first non-incarcerated [br]population: children 0:05:34.258,0:05:37.032 and then moves to blue-collar workers,[br]government workers 0:05:37.032,0:05:38.488 and white-collar workers. 0:05:38.488,0:05:41.575 And so, what we do to kids today[br]is what we did to prisoners yesterday 0:05:41.575,0:05:44.078 and what we're going to be doing [br]to you tomorrow. 0:05:44.078,0:05:46.737 And so it matters, what we teach our kids. 0:05:47.054,0:05:51.039 If you want to see where this goes, this[br]is a kid named Blake Robbins 0:05:51.039,0:05:55.124 and he attended Lower Merion High School [br]in Lower Merion, Pennsylvania 0:05:55.124,0:05:56.090 outside of Philadelphia. 0:05:56.091,0:05:59.903 It's the most affluent public school [br]district in America, so affluent 0:05:59.903,0:06:02.521 that all the kids were issued MacBooks[br]at the start of the year 0:06:02.521,0:06:04.665 and they had to do their homework on[br]their MacBooks, 0:06:04.665,0:06:07.778 and bring them to school every day[br]and bring them home every night. 0:06:07.778,0:06:11.472 And the MacBooks had been fitted with[br]Laptop Theft Recovery Software, 0:06:11.472,0:06:15.687 which is a fancy word for a rootkit, that[br]let the school administration 0:06:16.494,0:06:19.759 covertly operate the cameras[br]and microphones on these computers 0:06:20.266,0:06:23.207 and harvest files off [br]of their hard drives, 0:06:23.657,0:06:25.666 view all their clicks, and so on.
0:06:26.280,0:06:30.802 Now Blake Robbins found out [br]that the software existed 0:06:30.802,0:06:33.787 and how it was being used [br]because he and the head teacher 0:06:33.787,0:06:37.056 had been knocking heads for years, [br]since he first got into the school, 0:06:37.056,0:06:39.864 and one day, the head teacher [br]summoned him to his office 0:06:39.864,0:06:41.462 and said: "Blake, I've got you now." 0:06:41.670,0:06:45.284 and handed him a print-out of Blake[br]in his bedroom the night before, 0:06:45.620,0:06:48.985 taking what looked like a pill, [br]and he said: "You're taking drugs." 0:06:48.985,0:06:53.988 And Blake Robbins said: "That's a candy,[br]it's a Mike and Ike's candy, I take them 0:06:53.988,0:06:55.652 when I -- I eat them when I'm studying. 0:06:55.973,0:06:58.156 How did you get a picture [br]of me in my bedroom?" 0:06:58.656,0:07:02.644 This head teacher had taken [br]over 6,000 photos of Blake Robbins: 0:07:02.644,0:07:06.015 awake and asleep, dressed and undressed,[br]in the presence of his family. 0:07:06.630,0:07:10.376 And in the ensuing lawsuit, the school[br]settled for a large amount of money 0:07:10.376,0:07:12.476 and promised that [br]they wouldn't do it again 0:07:12.916,0:07:15.560 without informing the students [br]that it was going on. 0:07:16.107,0:07:18.521 And increasingly, the practice is now 0:07:18.808,0:07:22.110 that school administrations hand out [br]laptops, because they're getting cheaper, 0:07:22.505,0:07:24.683 with exactly the same kind of software, 0:07:24.886,0:07:27.681 but they let the students know and [br]they find that that works even better 0:07:27.964,0:07:29.682 at curbing the students' behavior, 0:07:29.682,0:07:33.196 because the students know that [br]they're always on camera. 0:07:33.769,0:07:38.010 Now, the surveillance state is moving[br]from kids to the rest of the world. 0:07:38.010,0:07:39.528 It's metastasizing. 0:07:39.528,0:07:44.149 Our devices are increasingly designed[br]to treat us as attackers, 0:07:44.149,0:07:46.890 as suspicious parties [br]who can't be trusted 0:07:46.890,0:07:50.577 because our devices' job is to do things[br]that we don't want them to do. 0:07:50.923,0:07:53.758 Now that's not because the vendors [br]who make our technology 0:07:53.758,0:07:55.839 want to spy on us necessarily, 0:07:55.839,0:08:01.000 but they want to take [br]the ink-jet printer business model 0:08:01.000,0:08:03.768 and bring it into every other realm[br]of the world. 0:08:03.768,0:08:08.724 So the ink-jet printer business model[br]is where you sell someone a device 0:08:08.724,0:08:11.689 and then you get a continuing [br]revenue stream from that device 0:08:11.689,0:08:15.844 by making sure that competitors can't make [br]consumables or parts 0:08:15.844,0:08:19.084 or additional features [br]or plugins for that device, 0:08:19.084,0:08:22.111 without paying rent [br]to the original manufacturer. 0:08:22.111,0:08:25.911 And that allows you to maintain [br]monopoly margins on your devices. 0:08:26.266,0:08:30.629 Now, in 1998, the American government [br]passed a law called 0:08:30.629,0:08:32.416 the Digital Millennium Copyright Act; 0:08:32.416,0:08:35.328 in 2001 the European Union [br]introduced its own version, 0:08:35.328,0:08:37.327 the European Union Copyright Directive.
0:08:37.593,0:08:40.157 And these two laws, along with laws[br]all around the world, 0:08:40.157,0:08:45.681 in Australia, Canada and elsewhere,[br]these laws prohibit removing digital locks 0:08:45.681,0:08:48.885 that are used to restrict [br]access to copyrighted works 0:08:49.220,0:08:52.134 and they were originally envisioned as a way[br]of making sure that Europeans didn't 0:08:52.134,0:08:54.640 bring cheap DVDs in from America, 0:08:54.640,0:08:58.215 or making sure that Australians didn't [br]import cheap DVDs from China. 0:08:58.215,0:09:03.741 And so you have a digital work, a DVD,[br]and it has a lock on it and to unlock it, 0:09:03.741,0:09:05.363 you have to buy an authorized player 0:09:05.363,0:09:07.357 and the player checks to make sure [br]you are in region 0:09:07.357,0:09:10.339 and making your own player [br]that doesn't make that check 0:09:10.339,0:09:12.409 is illegal because you'd have [br]to remove the digital lock. 0:09:12.409,0:09:13.824 And that was the original intent, 0:09:13.824,0:09:18.552 it was to allow high rents to be [br]maintained on removable media, 0:09:18.552,0:09:20.515 DVDs and other entertainment content. 0:09:20.726,0:09:24.229 But it very quickly spread [br]into new realms. 0:09:24.765,0:09:28.415 So, for example, auto manufacturers [br]now lock up 0:09:28.415,0:09:30.863 all of their cars' telemetry[br]with digital locks. 0:09:30.863,0:09:33.153 If you're a mechanic [br]and want to fix a car, 0:09:33.153,0:09:36.796 you have to get a reader [br]from the manufacturer 0:09:36.796,0:09:40.240 to make sure that you can [br]see the telemetry 0:09:40.240,0:09:42.759 and then know what parts to order [br]and how to fix it. 0:09:43.154,0:09:46.380 And in order to get this reader, [br]you have to promise the manufacturer 0:09:46.380,0:09:50.069 that you will only buy parts [br]from that manufacturer 0:09:50.069,0:09:51.366 and not from third parties. 0:09:51.366,0:09:53.829 So the manufacturers can keep [br]the repair costs high 0:09:53.829,0:09:56.852 and get a secondary revenue stream[br]out of the cars. 0:09:57.388,0:10:04.532 This year, the Chrysler Corporation filed [br]comments with the US Copyright Office, 0:10:04.532,0:10:07.580 to say that they believed that [br]this was the right way to do it 0:10:07.580,0:10:10.278 and that it should be a felony, [br]punishable by 5 years in prison 0:10:10.278,0:10:12.218 and a $500,000 fine, 0:10:12.218,0:10:16.242 to change the locks on a car that you own,[br]so that you can choose who fixes it. 0:10:16.531,0:10:19.556 It turned out that when they advertised 0:10:19.556,0:10:21.855 -- well, where is my slide here?[br]Oh, there we go -- 0:10:21.855,0:10:25.166 when they advertised that [br]it wasn't your father's Oldsmobile, 0:10:25.606,0:10:28.531 they weren't speaking metaphorically, [br]they literally meant 0:10:28.531,0:10:30.486 that even though your father [br]bought the Oldsmobile, 0:10:30.486,0:10:32.642 it remained their property in perpetuity. 9:59:59.000,9:59:59.000 And it's not just cars, [br]it's every kind of device, 9:59:59.000,9:59:59.000 because every kind of device today [br]has a computer in it. 9:59:59.000,9:59:59.000 The John Deere Company, the world's leading seller of heavy equipment 9:59:59.000,9:59:59.000 and agricultural equipment technologies, 9:59:59.000,9:59:59.000 they now view their tractors as [br]information gathering platforms 9:59:59.000,9:59:59.000 and they view the people who use them 9:59:59.000,9:59:59.000 as the kind of inconvenient gut flora[br]of their ecosystem.
9:59:59.000,9:59:59.000 So if you are a farmer [br]and you own a John Deere tractor, 9:59:59.000,9:59:59.000 when you drive it around your fields, [br]the torque centers on the wheels 9:59:59.000,9:59:59.000 conduct a centimeter-accurate soil [br]density survey of your agricultural land. 9:59:59.000,9:59:59.000 That would be extremely useful to you[br]when you're planting your seeds 9:59:59.000,9:59:59.000 but that data is not available to you 9:59:59.000,9:59:59.000 unless you remove the digital lock [br]from your John Deere tractor 9:59:59.000,9:59:59.000 which again, is against the law [br]everywhere in the world. 9:59:59.000,9:59:59.000 Instead, in order to get that data 9:59:59.000,9:59:59.000 you have to buy a bundle with seeds [br]from Monsanto, 9:59:59.000,9:59:59.000 who are John Deere's seed partners. 9:59:59.000,9:59:59.000 John Deere then takes this data that they [br]aggregate across whole regions 9:59:59.000,9:59:59.000 and they use it to gain insight [br]into regional crop yields 9:59:59.000,9:59:59.000 that they use to play the futures market. 9:59:59.000,9:59:59.000 John Deere's tractors are really just [br]a way of gathering information 9:59:59.000,9:59:59.000 and the farmers are secondary to it. 9:59:59.000,9:59:59.000 Just because you own it [br]doesn't mean it's yours. 9:59:59.000,9:59:59.000 And it's not just the computers [br]that we put our bodies into 9:59:59.000,9:59:59.000 that have this business model. 9:59:59.000,9:59:59.000 It's the computers that we put [br]inside of our bodies. 9:59:59.000,9:59:59.000 If you're someone who is diabetic 9:59:59.000,9:59:59.000 and you're fitted with a continuous [br]glucose-measuring insulin pump, 9:59:59.000,9:59:59.000 that insulin pump is designed [br]with a digital lock 9:59:59.000,9:59:59.000 that makes sure that your doctor [br]can only use the manufacturer's software 9:59:59.000,9:59:59.000 to read the data coming off of it 9:59:59.000,9:59:59.000 and that software is resold [br]on a rolling annual license 9:59:59.000,9:59:59.000 and it can't be just bought outright. 9:59:59.000,9:59:59.000 And the digital locks are also [br]used to make sure 9:59:59.000,9:59:59.000 that you only buy the insulin [br]that the vendor has approved 9:59:59.000,9:59:59.000 and not generic insulin [br]that might be cheaper. 9:59:59.000,9:59:59.000 We've literally turned human beings [br]into ink-jet printers. 9:59:59.000,9:59:59.000 Now, this has really deep implications[br]beyond the economic implications. 9:59:59.000,9:59:59.000 Because the rules that prohibit [br]breaking these digital locks 9:59:59.000,9:59:59.000 also prohibit telling people [br]about flaws that programmers made 9:59:59.000,9:59:59.000 because if you know about a flaw[br]that a programmer made, 9:59:59.000,9:59:59.000 you can use it to break the digital lock. 9:59:59.000,9:59:59.000 And that means that the errors,[br]the vulnerabilities, 9:59:59.000,9:59:59.000 the mistakes in our devices, they fester [br]in them, they go on and on and on 9:59:59.000,9:59:59.000 and our devices become these long-lived [br]reservoirs of digital pathogens. 9:59:59.000,9:59:59.000 And we've seen how that plays out. 9:59:59.000,9:59:59.000 One of the reasons that Volkswagen [br]was able to get away 9:59:59.000,9:59:59.000 with their Diesel cheating for so long 9:59:59.000,9:59:59.000 is because no one could independently[br]audit their firmware. 9:59:59.000,9:59:59.000 It's happening all over the place.
9:59:59.000,9:59:59.000 You may have seen -- [br]you may have seen this summer 9:59:59.000,9:59:59.000 that Chrysler had to recall [br]1.4 million Jeeps 9:59:59.000,9:59:59.000 because it turned out that they could be[br]remotely controlled over the internet 9:59:59.000,9:59:59.000 while driving down a motorway [br]and have their brakes and steering 9:59:59.000,9:59:59.000 commandeered by anyone, anywhere [br]in the world, over the internet. 9:59:59.000,9:59:59.000 We only have one methodology [br]for determining whether security works 9:59:59.000,9:59:59.000 and that's to submit it [br]to public scrutiny, 9:59:59.000,9:59:59.000 to allow for other people to see [br]what assumptions you've made. 9:59:59.000,9:59:59.000 Anyone can design a security system 9:59:59.000,9:59:59.000 that he himself can't think [br]of a way of breaking, 9:59:59.000,9:59:59.000 but all that means is that you've [br]designed a security system 9:59:59.000,9:59:59.000 that works against people [br]who are stupider than you. 9:59:59.000,9:59:59.000 And in this regard, security [br]is no different 9:59:59.000,9:59:59.000 from any other kind of knowledge creation. 9:59:59.000,9:59:59.000 You know, before we had [br]contemporary science and scholarship, 9:59:59.000,9:59:59.000 we had something that looked [br]a lot like it, called alchemy. 9:59:59.000,9:59:59.000 And for 500 years, alchemists kept[br]what they thought they knew a secret. 9:59:59.000,9:59:59.000 And that meant that every alchemist[br]was capable of falling prey 9:59:59.000,9:59:59.000 to that most urgent of human frailties,[br]which is our ability to fool ourselves. 9:59:59.000,9:59:59.000 And so, every alchemist discovered [br]for himself in the hardest way possible 9:59:59.000,9:59:59.000 that drinking mercury was a bad idea. 9:59:59.000,9:59:59.000 We call that 500-year period the Dark Ages 9:59:59.000,9:59:59.000 and we call the moment at which [br]they started publishing 9:59:59.000,9:59:59.000 and submitting themselves [br]to adversarial peer review, 9:59:59.000,9:59:59.000 which is when your friends tell you [br]about the mistakes that you've made 9:59:59.000,9:59:59.000 and your enemies call you an idiot[br]for having made them, 9:59:59.000,9:59:59.000 we call that moment the Enlightenment. 9:59:59.000,9:59:59.000 Now, this has profound implications. 9:59:59.000,9:59:59.000 The restriction of our ability to alter [br]the security of our devices 9:59:59.000,9:59:59.000 matters for our surveillance society, 9:59:59.000,9:59:59.000 for our ability to be free people[br]in society. 9:59:59.000,9:59:59.000 At the height of the GDR, in 1989, 9:59:59.000,9:59:59.000 the Stasi had one snitch for every [br]60 people in East Germany, 9:59:59.000,9:59:59.000 in order to surveil the entire country. 9:59:59.000,9:59:59.000 A couple of decades later, we found out [br]through Edward Snowden 9:59:59.000,9:59:59.000 that the NSA was spying [br]on everybody in the world. 9:59:59.000,9:59:59.000 And the ratio of people who work [br]at the NSA to people they are spying on 9:59:59.000,9:59:59.000 is more like 1 in 10,000. 9:59:59.000,9:59:59.000 They've achieved a two-and-a-half [br]order-of-magnitude 9:59:59.000,9:59:59.000 productivity gain in surveillance. 9:59:59.000,9:59:59.000 And the way that they got there [br]is in part by the fact that 9:59:59.000,9:59:59.000 we use devices that [br]we're not allowed to alter, 9:59:59.000,9:59:59.000 that are designed to treat us as attackers 9:59:59.000,9:59:59.000 and that gather an enormous [br]amount of information on us.
9:59:59.000,9:59:59.000 If the government told you that you're [br]required to carry around 9:59:59.000,9:59:59.000 a small electronic rectangle that [br]recorded all of your social relationships, 9:59:59.000,9:59:59.000 all of your movements, 9:59:59.000,9:59:59.000 all of your transient thoughts that[br]you made known or ever looked up, 9:59:59.000,9:59:59.000 and would make that [br]available to the state, 9:59:59.000,9:59:59.000 and you would have to pay for it,[br]you would revolt. 9:59:59.000,9:59:59.000 But the phone companies have [br]managed to convince us, 9:59:59.000,9:59:59.000 along with the mobile phone vendors, 9:59:59.000,9:59:59.000 that we should foot the bill [br]for our own surveillance. 9:59:59.000,9:59:59.000 It's a bit like during [br]the Cultural Revolution, 9:59:59.000,9:59:59.000 where, after your family members [br]were executed, 9:59:59.000,9:59:59.000 they sent you a bill for the bullet. 9:59:59.000,9:59:59.000 So, this has big implications, as I said, [br]for where we go as a society. 9:59:59.000,9:59:59.000 Because just as our kids have [br]a hard time functioning 9:59:59.000,9:59:59.000 in the presence of surveillance,[br]and learning, 9:59:59.000,9:59:59.000 and advancing their own knowledge, 9:59:59.000,9:59:59.000 we as a society have a hard time [br]progressing 9:59:59.000,9:59:59.000 in the presence of surveillance. 9:59:59.000,9:59:59.000 In our own living memory, people who are [br]today thought of as normal and right 9:59:59.000,9:59:59.000 were doing something that [br]a generation ago 9:59:59.000,9:59:59.000 would have been illegal [br]and landed them in jail. 9:59:59.000,9:59:59.000 For example, you probably know someone 9:59:59.000,9:59:59.000 who's married to a partner [br]of the same sex. 9:59:59.000,9:59:59.000 If you live in America, you may know [br]someone who takes medical marijuana, 9:59:59.000,9:59:59.000 or if you live in the Netherlands. 9:59:59.000,9:59:59.000 And not that long ago, people [br]who undertook these activities 9:59:59.000,9:59:59.000 could have gone to jail for them, 9:59:59.000,9:59:59.000 could have faced enormous [br]social exclusion for them. 9:59:59.000,9:59:59.000 The way that we got from there to here[br]was by having a private zone, 9:59:59.000,9:59:59.000 a place where people weren't surveilled, 9:59:59.000,9:59:59.000 in which they could advance [br]their interests and ideas, 9:59:59.000,9:59:59.000 do things that were thought of as[br]socially unacceptable 9:59:59.000,9:59:59.000 and slowly change our social attitudes. 9:59:59.000,9:59:59.000 And in ........ (check) few things [br]that in 50 years, 9:59:59.000,9:59:59.000 your grandchildren will sit around [br]the Christmas table, in 2065, and say: 9:59:59.000,9:59:59.000 "How was it, grandma, [br]how was it, grandpa, 9:59:59.000,9:59:59.000 that in 2015, you got it all right, 9:59:59.000,9:59:59.000 and we haven't had [br]any social changes since then?" 9:59:59.000,9:59:59.000 Then you have to ask yourself [br]how, in a world 9:59:59.000,9:59:59.000 in which we are all [br]under continuous surveillance, 9:59:59.000,9:59:59.000 we are going to find a way [br]to improve this. 9:59:59.000,9:59:59.000 So, our kids need ICT literacy, 9:59:59.000,9:59:59.000 but ICT literacy isn't just typing skills[br]or learning how to use PowerPoint. 9:59:59.000,9:59:59.000 It's learning how to think critically 9:59:59.000,9:59:59.000 about how they relate [br]to the means of information, 9:59:59.000,9:59:59.000 about whether they are its masters [br]or servants.
9:59:59.000,9:59:59.000 Our networks are not [br]the most important issue that we have. 9:59:59.000,9:59:59.000 There are much more important issues[br]in society and in the world today. 9:59:59.000,9:59:59.000 The future of the internet is [br]way less important 9:59:59.000,9:59:59.000 than the future of our climate,[br]the future of gender equity, 9:59:59.000,9:59:59.000 the future of racial equity, 9:59:59.000,9:59:59.000 the future of the wage gap [br]and the wealth gap in the world, 9:59:59.000,9:59:59.000 but every one of those fights is going[br]to be fought and won or lost 9:59:59.000,9:59:59.000 on the internet:[br]it's our most foundational fight. 9:59:59.000,9:59:59.000 So weakened (check) computers[br]can make us more free 9:59:59.000,9:59:59.000 or they can take away our freedom. 9:59:59.000,9:59:59.000 It all comes down to how we regulate them [br]and how we use them. 9:59:59.000,9:59:59.000 And it's our job, as people who are [br]training the next generation, 9:59:59.000,9:59:59.000 and whose next generation [br]is beta-testing 9:59:59.000,9:59:59.000 the surveillance technology [br]that will be coming to us, 9:59:59.000,9:59:59.000 it's our job to teach them to seize [br]the means of information, 9:59:59.000,9:59:59.000 to make themselves self-determined[br]in the way that they use their networks 9:59:59.000,9:59:59.000 and to find ways to show them [br]how to be critical and how to be smart 9:59:59.000,9:59:59.000 and how to be, above all, subversive[br]and how to use the technology around them. 9:59:59.000,9:59:59.000 Thank you. (18:49) 9:59:59.000,9:59:59.000 (Applause) 9:59:59.000,9:59:59.000 (Moderator) Cory, thank you very much [br]indeed. 9:59:59.000,9:59:59.000 (Doctorow) Thank you[br](Moderator) And I've got a bundle 9:59:59.000,9:59:59.000 of points which you've stimulated [br]from many in the audience, which sent -- 9:59:59.000,9:59:59.000 (Doctorow) I'm shocked to hear that [br]that was at all controversial, 9:59:59.000,9:59:59.000 but go on.[br](Moderator) I didn't say "controversial", 9:59:59.000,9:59:59.000 you stimulated thinking, which is great. 9:59:59.000,9:59:59.000 But a lot of them revolve around[br]violations of secrecy and security. 9:59:59.000,9:59:59.000 And this, for example, [br]from Anneke Burgess (check) 9:59:59.000,9:59:59.000 "Is there a way for students [br]to protect themselves 9:59:59.000,9:59:59.000 from privacy violations by institutions[br]they are supposed to trust?" 9:59:59.000,9:59:59.000 I think this is probably a question[br]for William Golding (check) as well, 9:59:59.000,9:59:59.000 someone who is a senior figure[br]in a major university, but 9:59:59.000,9:59:59.000 this issue of privacy violations and trust. 9:59:59.000,9:59:59.000 (Doctorow) Well, I think that computers[br]have a curious dual nature. 9:59:59.000,9:59:59.000 So on the one hand, they do expose us[br]to an enormous amount of scrutiny, 9:59:59.000,9:59:59.000 depending on how they are configured. 9:59:59.000,9:59:59.000 But on the other hand, computers [br]have brought new powers to us 9:59:59.000,9:59:59.000 that are literally new [br]on the face of the world, right? 9:59:59.000,9:59:59.000 We have never had a reality in which[br]normal people could have secrets 9:59:59.000,9:59:59.000 from powerful people.
9:59:59.000,9:59:59.000 But with the computer in your pocket,[br]with that, 9:59:59.000,9:59:59.000 you can encrypt a message so thoroughly 9:59:59.000,9:59:59.000 that if every hydrogen atom in the [br]universe were turned into a computer 9:59:59.000,9:59:59.000 and it did nothing until [br]the heat death of the universe, 9:59:59.000,9:59:59.000 but try to guess what your key was, 9:59:59.000,9:59:59.000 we would run out of universe [br]before we ran out of possible keys. 9:59:59.000,9:59:59.000 So, computers do give us [br]the power to have secrets. 9:59:59.000,9:59:59.000 The problem is that institutions [br]prohibit the use of technology 9:59:59.000,9:59:59.000 that allows you to take back [br]your own privacy. 9:59:59.000,9:59:59.000 It's funny, right? Because we take kids[br]and we say to them: 9:59:59.000,9:59:59.000 "Your privacy is like your virginity: 9:59:59.000,9:59:59.000 once you've lost it, [br]you'll never get it back. 9:59:59.000,9:59:59.000 Watch out what you're [br]putting on Facebook" 9:59:59.000,9:59:59.000 -- and I think they should watch[br]what they're putting on Facebook, 9:59:59.000,9:59:59.000 I'm a Facebook vegan, I don't even --[br]I don't use it, but we say: 9:59:59.000,9:59:59.000 "Watch what you're putting on Facebook, 9:59:59.000,9:59:59.000 don't send out dirty pictures [br]of yourself on Snapchat." 9:59:59.000,9:59:59.000 All good advice. 9:59:59.000,9:59:59.000 But we do it while we are taking away [br]all the private information 9:59:59.000,9:59:59.000 that they have, all of their privacy[br]and all of their agency. 9:59:59.000,9:59:59.000 You know, if a parent says to a kid: 9:59:59.000,9:59:59.000 "You mustn't smoke [br]because you'll get sick" 9:59:59.000,9:59:59.000 and the parent says it [br]while lighting a new cigarette 9:59:59.000,9:59:59.000 off the one that she's just put down[br]in the ashtray, 9:59:59.000,9:59:59.000 the kid knows that what you're doing[br]matters more than what you're saying. 9:59:59.000,9:59:59.000 (Moderator) The point is a deficit of trust. 9:59:59.000,9:59:59.000 It builds on the kind of work that [br]David has been doing as well, 9:59:59.000,9:59:59.000 this deficit of trust and privacy. 9:59:59.000,9:59:59.000 And there is another point here: 9:59:59.000,9:59:59.000 "Is the battle for privacy already lost? 9:59:59.000,9:59:59.000 Are we already too comfortable [br]with giving away our data?" 9:59:59.000,9:59:59.000 (Doctorow) No, I don't think so at all. 9:59:59.000,9:59:59.000 In fact, I think that if anything, [br]we've reached 9:59:59.000,9:59:59.000 peak indifference to surveillance, right? 9:59:59.000,9:59:59.000 We are nowhere near peak surveillance, not by a long shot. 9:59:59.000,9:59:59.000 There will be more surveillance[br]before there is less. 9:59:59.000,9:59:59.000 But there'll never be fewer people [br]who care about surveillance 9:59:59.000,9:59:59.000 than there are today, 9:59:59.000,9:59:59.000 because, as privacy advocates, [br]we spectacularly failed, 9:59:59.000,9:59:59.000 over the last 20 years, to get people[br]to take privacy seriously 9:59:59.000,9:59:59.000 and now we have firms that are [br]frankly incompetent, 9:59:59.000,9:59:59.000 retaining huge amounts of our personally[br]identifying sensitive information 9:59:59.000,9:59:59.000 and those firms are leaking [br]that information at speed. 9:59:59.000,9:59:59.000 So, it started this year with things like[br](phone rings) -- is that me? it's me!
-- 9:59:59.000,9:59:59.000 It started this year with things like[br]Ashley Madison 9:59:59.000,9:59:59.000 (Moderator) Do you want to take it? 9:59:59.000,9:59:59.000 (Doctorow) No, no, that was my timer going [br]off telling me I've had my 22 minutes 9:59:59.000,9:59:59.000 (Moderator) It may be someone who [br]doesn't trust my.... 9:59:59.000,9:59:59.000 (Doctorow) No, no, that was 22 minutes. 9:59:59.000,9:59:59.000 So, it started with Ashley Madison, [br]the Office of Personnel Management, 9:59:59.000,9:59:59.000 everyone who ever applied for security[br]clearance in America 9:59:59.000,9:59:59.000 had all of their sensitive information,[br]everything you had to give them 9:59:59.000,9:59:59.000 about why you shouldn't have [br]security clearance, 9:59:59.000,9:59:59.000 everything that's potentially[br]compromising about you, 9:59:59.000,9:59:59.000 all of it exfiltrated by what's believed[br]to have been a Chinese spy ring. 9:59:59.000,9:59:59.000 Something like [br]one in twenty Americans now 9:59:59.000,9:59:59.000 have had their data captured and[br]exfiltrated from the United States. 9:59:59.000,9:59:59.000 This week, VTech, the largest electronic[br]toy manufacturer in the world, 9:59:59.000,9:59:59.000 leaked the personal information of[br]at least five million children, 9:59:59.000,9:59:59.000 including potentially photos that[br]they took with their electronic toys, 9:59:59.000,9:59:59.000 as well as their parents' information,[br]their home addresses, their passwords, 9:59:59.000,9:59:59.000 their parents' passwords [br]and their password hints. 9:59:59.000,9:59:59.000 So every couple of weeks, [br]from now on, 9:59:59.000,9:59:59.000 a couple of million people [br]are going to show up 9:59:59.000,9:59:59.000 on the door of people who [br]care about privacy and say: 9:59:59.000,9:59:59.000 "You were right all along, [br]what do we do?" 9:59:59.000,9:59:59.000 And the challenge is to give them[br]something useful 9:59:59.000,9:59:59.000 they can do about privacy. 9:59:59.000,9:59:59.000 And there are steps that [br]you can take personally. 9:59:59.000,9:59:59.000 If you go to the Surveillance Self-Defense kit 9:59:59.000,9:59:59.000 at the Electronic Frontier Foundation's[br]website, 9:59:59.000,9:59:59.000 you'll find a set of tools you can use. 9:59:59.000,9:59:59.000 But ultimately, it's not [br]an individual choice. 9:59:59.000,9:59:59.000 It's a social one. 9:59:59.000,9:59:59.000 Privacy is a team sport, because if you[br]keep your information private 9:59:59.000,9:59:59.000 and you send it to someone else[br]who is in your social network, 9:59:59.000,9:59:59.000 who doesn't keep it private, well, then[br]it leaks out their back door. 9:59:59.000,9:59:59.000 And so we need social movements[br]to improve our privacy. 9:59:59.000,9:59:59.000 We also need better tools. 9:59:59.000,9:59:59.000 It's really clear that our privacy tools[br]have fallen short of the mark 9:59:59.000,9:59:59.000 in terms of being accessible[br]to normal people. 9:59:59.000,9:59:59.000 (Moderator) Cory --[br](Doctorow) I sense you want 9:59:59.000,9:59:59.000 to say something: go on[br](Moderator) Yes. 9:59:59.000,9:59:59.000 But what I want to do is keep pushing this[br]down the route of education as well. 9:59:59.000,9:59:59.000 There are two or three questions here[br]which are very specific about that. 9:59:59.000,9:59:59.000 Could you tell us more about [br]how we help students 9:59:59.000,9:59:59.000 become digital citizens?
9:59:59.000,9:59:59.000 It links in important ways to the [br]German and Nordic pre-digital concept 9:59:59.000,9:59:59.000 of Bildung ............... (check) 9:59:59.000,9:59:59.000 and if you could give your daughter [br]just one piece of advice for living 9:59:59.000,9:59:59.000 in an age of digital surveillance,[br]what would it be? 9:59:59.000,9:59:59.000 (Doctorow) So, the one piece of advice[br]I would give her is 9:59:59.000,9:59:59.000 "Don't settle for anything less [br]than total control 9:59:59.000,9:59:59.000 of the means of information. 9:59:59.000,9:59:59.000 Anytime someone tells you that [br]the computer says you must do something, 9:59:59.000,9:59:59.000 you have to ask yourself why the computer[br]isn't doing what you want it to do. 9:59:59.000,9:59:59.000 Who is in charge here?" 9:59:59.000,9:59:59.000 In terms of how we improve [br]digital citizenship, 9:59:59.000,9:59:59.000 I think we are at odds with[br]our institutions in large part here. 9:59:59.000,9:59:59.000 I think that our institutions [br]have decreed that privacy 9:59:59.000,9:59:59.000 is a luxury we can't afford [br]to give our students 9:59:59.000,9:59:59.000 because if we do, they might do[br]something we wouldn't like. 9:59:59.000,9:59:59.000 And so, as teachers, the only thing that [br]we can do without risking our jobs, 9:59:59.000,9:59:59.000 without risking our students' education[br]and their ability to stay in school 9:59:59.000,9:59:59.000 is to teach them how to do judo, [br]to teach them 9:59:59.000,9:59:59.000 how to ask critical questions, [br]to gather data, 9:59:59.000,9:59:59.000 to demand evidence-based policy. 9:59:59.000,9:59:59.000 And ultimately, I actually think[br]that's better. 9:59:59.000,9:59:59.000 I think that teaching kids [br]how to evade firewalls 9:59:59.000,9:59:59.000 is way less interesting than [br]teaching them how to identify 9:59:59.000,9:59:59.000 who is providing that firewall, 9:59:59.000,9:59:59.000 who made the decision [br]to buy that firewall, 9:59:59.000,9:59:59.000 how much they're spending, 9:59:59.000,9:59:59.000 and what the process is [br]for getting that person fired. 9:59:59.000,9:59:59.000 That's a much more interesting [br]piece of digital citizenship 9:59:59.000,9:59:59.000 than any piece of firewall hacking. 9:59:59.000,9:59:59.000 (Moderator) But Cory, you're teaching[br]kids to get hold of their information, 9:59:59.000,9:59:59.000 but picking up on your analogy [br]with the auto (check)[br] 9:59:59.000,9:59:59.000 where there's a very clear lock on,[br]so much which has to be done 9:59:59.000,9:59:59.000 has to be done through the manufacturers[br]-- Martin Siebel (check) picking up -- 9:59:59.000,9:59:59.000 How busy is the hacker community [br]with breaking digital locks? 9:59:59.000,9:59:59.000 In other words, digital locks [br]are being put on so much, 9:59:59.000,9:59:59.000 can the next generation, can the kids [br]work out where these locks are, 9:59:59.000,9:59:59.000 to keep control of their information? 9:59:59.000,9:59:59.000 (Doctorow) So, getting the digital locks[br]off of our devices 9:59:59.000,9:59:59.000 is not just a matter of technology, [br]it's also a matter of policy. 9:59:59.000,9:59:59.000 Laws like the EUCD and new treaty[br]instruments like TTIP 9:59:59.000,9:59:59.000 and the Trans-Pacific Partnership 9:59:59.000,9:59:59.000 specify that these digital locks [br]can't be removed legally, 9:59:59.000,9:59:59.000 that firms can use them to extract[br]monopoly rents, to harm our security.
9:59:59.000,9:59:59.000 And these treaties will make it impossible[br]to remove these locks legally. 9:59:59.000,9:59:59.000 It's not enough for there to be [br]a demi-monde, in which 9:59:59.000,9:59:59.000 people who are technologically savvy[br]know how to break their iPhones. 9:59:59.000,9:59:59.000 The problem with that is that it produces[br]this world in which you can't know 9:59:59.000,9:59:59.000 whether the software that you're using[br]has anything bad lurking in it, right? 9:59:59.000,9:59:59.000 To say to people who have iPhones that[br]they want reconfigured 9:59:59.000,9:59:59.000 to accept software from parties that[br]aren't approved by Apple, 9:59:59.000,9:59:59.000 "well, all you need to do is find a random [br]piece of software on the internet 9:59:59.000,9:59:59.000 that claims it will allow you to do it,[br]get it (check) on your phone, 9:59:59.000,9:59:59.000 and then run this software on it"[br]is like saying to people: 9:59:59.000,9:59:59.000 "Well, all you need to do is just [br]find some heroin lying around 9:59:59.000,9:59:59.000 and put it in your arm: I'm sure[br]it'll work out fine." Right? 9:59:59.000,9:59:59.000 The ways in which a phone that is [br]compromised can harm you 9:59:59.000,9:59:59.000 are really without limits, right? 9:59:59.000,9:59:59.000 If you think about this, this is a [br]rectangle with a camera and a microphone 9:59:59.000,9:59:59.000 that you take into the bedroom [br]and the toilet, 9:59:59.000,9:59:59.000 and the only way you know whether[br]that camera or microphone are on or off 9:59:59.000,9:59:59.000 is if the software is doing [br]what it says it's doing, right? 9:59:59.000,9:59:59.000 So, I think that we have this policy[br]dimension that's way more important 9:59:59.000,9:59:59.000 than the technical dimension, 9:59:59.000,9:59:59.000 and that the only way we're going to [br]change that policy 9:59:59.000,9:59:59.000 is by making kids know what they can't do 9:59:59.000,9:59:59.000 and then making them know [br]why they're not allowed to do it, 9:59:59.000,9:59:59.000 and then setting them loose.[br](Moderator) Two final points then, 9:59:59.000,9:59:59.000 building on their yen ............... (check) 9:59:59.000,9:59:59.000 but what do you think of recent laws,[br]say, in the Netherlands, 9:59:59.000,9:59:59.000 allowing police to spy on criminals[br]and terrorists with webcams 9:59:59.000,9:59:59.000 on their own laptops? 9:59:59.000,9:59:59.000 In other words, the extension implicit[br]in this question is, 9:59:59.000,9:59:59.000 what about keeping an eye on others,[br]including the next generation coming up, 9:59:59.000,9:59:59.000 and this from Peter Andreesen (check),[br]which is slightly connected: 9:59:59.000,9:59:59.000 When the predominant management tool[br]in our business is still based 9:59:59.000,9:59:59.000 on the assumption of control, how do we[br]change that psychology and that mindset? 9:59:59.000,9:59:59.000 And David is nodding for those views[br]........ (check) at the back. 9:59:59.000,9:59:59.000 (Doctorow) You know, there's not an[br]operating system or suite of applications 9:59:59.000,9:59:59.000 that bad guys use, and another set[br]that the rest of us use.
9:59:59.000,9:59:59.000 And so, when our security services [br]discover vulnerabilities 9:59:59.000,9:59:59.000 in the tools that we rely on, and then[br]hoard those vulnerabilities, 9:59:59.000,9:59:59.000 so that they can weaponize them[br]to attack criminals or terrorists 9:59:59.000,9:59:59.000 or whoever they're after, 9:59:59.000,9:59:59.000 they ensure that those vulnerabilities[br]remain unpatched on our devices too. 9:59:59.000,9:59:59.000 So that the people who are [br]our adversaries, 9:59:59.000,9:59:59.000 the criminals who want to attack us, 9:59:59.000,9:59:59.000 the spies who want to exfiltrate [br]our corporate secrets to their countries, 9:59:59.000,9:59:59.000 the griefers, the autocratic regimes [br]who use those vulnerabilities 9:59:59.000,9:59:59.000 to spy on their population -- 9:59:59.000,9:59:59.000 at the Electronic Frontier Foundation, [br]we have a client from Ethiopia, 9:59:59.000,9:59:59.000 Mr. Kidane, who is a dissident journalist 9:59:59.000,9:59:59.000 -- Ethiopia imprisons more journalists [br]than any other country in the world -- 9:59:59.000,9:59:59.000 he fled to America, where the Ethiopian[br]government hacked his computer 9:59:59.000,9:59:59.000 with a bug in Skype that they had [br]discovered and not reported. 9:59:59.000,9:59:59.000 They'd bought the tools to do this from[br]Italy, from a company called Hacking Team, 9:59:59.000,9:59:59.000 they hacked his computer, they found out[br]who his colleagues were in Ethiopia 9:59:59.000,9:59:59.000 and they arrested them and [br]subjected them to torture. 9:59:59.000,9:59:59.000 So, when our security services take a bug[br]and weaponize it 9:59:59.000,9:59:59.000 so they can attack their adversaries,[br]they leave that bug intact 9:59:59.000,9:59:59.000 so that our adversaries can attack us. 9:59:59.000,9:59:59.000 You know, this is -- I'm giving a talk[br]later today, 9:59:59.000,9:59:59.000 no matter what side you're on in the war [br]on general-purpose computing, 9:59:59.000,9:59:59.000 you're losing, 9:59:59.000,9:59:59.000 because all we have right now [br]is attack and not defense. 9:59:59.000,9:59:59.000 Our security services are so concerned[br]with making sure that their job is easy 9:59:59.000,9:59:59.000 when they attack their adversaries,[br]that they are neglecting the fact 9:59:59.000,9:59:59.000 that they are making their adversaries'[br]job easy in attacking us. 9:59:59.000,9:59:59.000 If this were a football match, 9:59:59.000,9:59:59.000 the score would be tied [br]400 to 400 after 10 minutes. 9:59:59.000,9:59:59.000 (Moderator) Cory, [br]I've got to stop you there. 9:59:59.000,9:59:59.000 Thank you very much indeed.[br](Doctorow) Thank you very much.