1 00:00:00,000 --> 00:00:00,778 (Cory Doctorow) Thank you very much 2 00:00:01,066 --> 00:00:05,420 So I'd like to start with something of a benediction or permission. 3 00:00:05,420 --> 00:00:07,740 I am one of nature's fast talkers 4 00:00:07,740 --> 00:00:10,620 and many of you are not native English speakers, or 5 00:00:10,620 --> 00:00:12,919 maybe not accustomed to my harsh Canadian accent 6 00:00:12,919 --> 00:00:15,919 in addition I've just come in from Australia 7 00:00:15,919 --> 00:00:19,166 and so like many of you I am horribly jetlagged and have drunk enough coffee 8 00:00:19,166 --> 00:00:20,736 this morning to kill a rhino. 9 00:00:22,112 --> 00:00:23,587 When I used to be at the United Nations 10 00:00:23,587 --> 00:00:27,023 I was known as the scourge of the simultaneous translation corps 11 00:00:27,157 --> 00:00:29,528 I would stand up and speak as slowly as I could 12 00:00:29,576 --> 00:00:32,229 and turn around, and there they would be in their booths doing this 13 00:00:32,229 --> 00:00:35,136 (laughter) When I start to speak too fast, 14 00:00:35,136 --> 00:00:37,557 this is the universal symbol -- my wife invented it -- 15 00:00:37,557 --> 00:00:41,518 for "Cory, you are talking too fast". Please, don't be shy. 16 00:00:42,078 --> 00:00:45,790 So, I'm a parent, like many of you, and like I'm sure all of you 17 00:00:45,790 --> 00:00:48,850 who are parents, parenting kicks my ass all the time. 18 00:00:50,030 --> 00:00:54,820 And there are many regrets I have about the mere seven and a half years 19 00:00:54,820 --> 00:00:57,284 that I've been a parent but none are so keenly felt 20 00:00:57,784 --> 00:01:00,878 as my regrets over what's happened when I've been wandering 21 00:01:00,878 --> 00:01:04,766 around the house and seen my daughter working on something 22 00:01:04,766 --> 00:01:08,697 that was beyond her abilities, that was right at the edge of what she could do 23 00:01:08,697 --> 00:01:12,759 and where she was doing something that she didn't have competence in yet 24 00:01:12,759 --> 00:01:17,019 and you know it's that amazing thing to see that frowning concentration, 25 00:01:17,019 --> 00:01:20,038 tongue stuck out: as a parent, your heart swells with pride 26 00:01:20,038 --> 00:01:21,332 and you can't help but go over 27 00:01:21,332 --> 00:01:23,684 and sort of peer over their shoulder at what they are doing 28 00:01:23,684 --> 00:01:27,329 and those of you who are parents know what happens when you look too closely 29 00:01:27,581 --> 00:01:30,205 at someone who is working beyond the edge of their competence. 30 00:01:30,565 --> 00:01:32,678 They go back to doing something they're already good at. 31 00:01:32,973 --> 00:01:35,042 You interrupt a moment of genuine learning 32 00:01:35,295 --> 00:01:38,491 and you replace it with a kind of embarrassment 33 00:01:38,859 --> 00:01:42,384 about what you're good at and what you're not. 34 00:01:42,979 --> 00:01:48,051 So, it matters a lot that our schools are increasingly surveilled environments, 35 00:01:48,051 --> 00:01:52,401 environments in which everything that our kids do is watched and recorded. 36 00:01:52,791 --> 00:01:56,439 Because when you do that, you interfere with those moments of real learning. 37 00:01:56,781 --> 00:02:00,665 Our ability to do things that we are not good at yet, that we are not proud of yet, 38 00:02:00,665 --> 00:02:03,524 is negatively impacted by that kind of scrutiny. 39 00:02:03,924 --> 00:02:06,022 And that scrutiny comes from a strange place.
40 00:02:06,731 --> 00:02:10,567 We have decided that there are some programmatic means 41 00:02:10,567 --> 00:02:13,678 by which we can find all the web pages children shouldn't look at 42 00:02:14,136 --> 00:02:18,321 and we will filter our networks to be sure that they don't see them. 43 00:02:18,719 --> 00:02:21,883 Anyone who has ever paid attention knows that this doesn't work. 44 00:02:22,351 --> 00:02:25,524 There are more web pages that kids shouldn't look at 45 00:02:25,524 --> 00:02:29,043 than can ever be cataloged, and any attempt to catalog them 46 00:02:29,043 --> 00:02:32,031 will always catch pages that kids should be looking at. 47 00:02:32,031 --> 00:02:34,795 Any of you who have ever taught a unit on reproductive health 48 00:02:35,122 --> 00:02:37,835 know the frustration of trying to get around a school network. 49 00:02:38,597 --> 00:02:42,429 Now, this is done in the name of digital protection 50 00:02:42,643 --> 00:02:46,342 but it flies in the face of digital literacy and of real learning. 51 00:02:46,613 --> 00:02:50,332 Because the only way to stop kids from looking at web pages 52 00:02:50,332 --> 00:02:51,607 they shouldn't be looking at 53 00:02:51,607 --> 00:02:55,771 is to take all of the clicks that they make, all of the messages that they send, 54 00:02:55,771 --> 00:02:59,736 all of their online activity and offshore it to a firm 55 00:02:59,736 --> 00:03:04,006 that has some nonsensically arrived-at list of the bad pages. 56 00:03:04,370 --> 00:03:07,815 And so, what we are doing is that we're exfiltrating all of our students' data 57 00:03:08,279 --> 00:03:10,256 to unknown third parties. 58 00:03:10,256 --> 00:03:12,529 Now, most of these firms, their primary business isn't 59 00:03:12,529 --> 00:03:14,237 serving the education sector. 60 00:03:14,237 --> 00:03:16,347 Most of them service the government sector. 61 00:03:16,540 --> 00:03:21,328 They primarily service governments in repressive autocratic regimes. 62 00:03:21,328 --> 00:03:23,600 They help them ensure that their citizens aren't looking at 63 00:03:23,600 --> 00:03:25,376 Amnesty International web pages. 64 00:03:25,636 --> 00:03:29,433 They repackage those tools and sell them to our educators. 65 00:03:29,835 --> 00:03:32,975 So we are offshoring our children's clicks to war criminals. 66 00:03:33,791 --> 00:03:37,197 And what our kids do, now, is they just get around it, 67 00:03:37,197 --> 00:03:38,773 because it's not hard to get around it. 68 00:03:38,773 --> 00:03:43,940 You know, never underestimate the power of a kid who is time-rich and cash-poor 69 00:03:43,940 --> 00:03:46,234 to get around our technological blockades. 70 00:03:47,508 --> 00:03:51,006 But when they do this, they don't acquire the kind of digital literacy 71 00:03:51,006 --> 00:03:54,075 that we want them to have, they don't acquire real digital agency 72 00:03:54,400 --> 00:03:57,783 and moreover, they risk exclusion and in extreme cases, 73 00:03:57,783 --> 00:03:59,366 they risk criminal prosecution.
74 00:04:00,220 --> 00:04:04,085 So what if instead, those of us who are trapped in this system of teaching kids 75 00:04:04,085 --> 00:04:07,850 where we're required to subject them to this kind of surveillance 76 00:04:07,850 --> 00:04:10,167 that flies in the face of their real learning, 77 00:04:10,167 --> 00:04:13,247 what if instead, we invented curricular units 78 00:04:13,247 --> 00:04:16,459 that made them real first-class digital citizens, 79 00:04:16,459 --> 00:04:19,590 in charge of trying to influence real digital problems? 80 00:04:19,590 --> 00:04:22,833 Like what if we said to them: "We want you to catalog the web pages 81 00:04:22,833 --> 00:04:25,271 that this vendor lets through that you shouldn't be seeing. 82 00:04:25,499 --> 00:04:29,424 We want you to catalog those pages that you should be seeing, that are blocked. 83 00:04:29,426 --> 00:04:31,948 We want you to go and interview every teacher in the school 84 00:04:31,948 --> 00:04:35,306 about all those lesson plans that were carefully laid out before lunch 85 00:04:35,306 --> 00:04:37,966 with a video and a web page, and over lunch, 86 00:04:37,966 --> 00:04:41,424 the unaccountable, distant censor blocked these critical resources 87 00:04:41,424 --> 00:04:44,796 and left them handing out photocopied worksheets in the afternoon 88 00:04:44,796 --> 00:04:47,073 instead of the unit they prepared. 89 00:04:47,073 --> 00:04:50,837 We want you to learn how to file Freedom of Information Act requests 90 00:04:50,837 --> 00:04:53,377 and find out what your school authority is spending 91 00:04:53,377 --> 00:04:56,371 to censor your internet access and surveil your activity. 92 00:04:56,371 --> 00:04:59,855 We want you to learn to use the internet to research these companies 93 00:04:59,855 --> 00:05:04,329 and we want you to present this to your parent-teacher association, 94 00:05:04,329 --> 00:05:06,570 to your school authority, to your local newspaper." 95 00:05:06,981 --> 00:05:08,917 Because that's the kind of digital literacy 96 00:05:09,187 --> 00:05:11,310 that makes kids into first-class digital citizens, 97 00:05:11,310 --> 00:05:16,083 that prepares them for a future in which they can participate fully 98 00:05:16,083 --> 00:05:18,080 in a world that's changing. 99 00:05:19,019 --> 00:05:22,677 Kids are the beta-testers of the surveillance state. 100 00:05:22,919 --> 00:05:26,713 The path of surveillance technology starts with prisoners, 101 00:05:27,157 --> 00:05:30,409 moves to asylum seekers, people in mental institutions 102 00:05:30,409 --> 00:05:33,919 and then to its first non-incarcerated population: children 103 00:05:34,258 --> 00:05:37,032 and then moves to blue-collar workers, government workers 104 00:05:37,032 --> 00:05:38,488 and white-collar workers. 105 00:05:38,488 --> 00:05:41,575 And so, what we do to kids today is what we did to prisoners yesterday 106 00:05:41,575 --> 00:05:44,078 and what we're going to be doing to you tomorrow. 107 00:05:44,078 --> 00:05:46,737 And so it matters, what we teach our kids. 108 00:05:47,054 --> 00:05:51,039 If you want to see where this goes, this is a kid named Blake Robbins 109 00:05:51,039 --> 00:05:55,124 and he attended Lower Merion High School in Lower Merion, Pennsylvania, 110 00:05:55,124 --> 00:05:56,090 outside of Philadelphia.
111 00:05:56,091 --> 00:05:59,903 It's the most affluent public school district in America, so affluent 112 00:05:59,903 --> 00:06:02,521 that all the kids were issued Macbooks at the start of the year 113 00:06:02,521 --> 00:06:04,665 and they had to do their homework on their Macbooks, 114 00:06:04,665 --> 00:06:07,778 and bring them to school every day and bring them home every night. 115 00:06:07,778 --> 00:06:11,472 And the Macbooks had been fitted with Laptop Theft Recovery Software, 116 00:06:11,472 --> 00:06:15,687 which is a fancy word for a rootkit, that let the school administration 117 00:06:16,494 --> 00:06:19,759 covertly operate the cameras and microphones on these computers 118 00:06:20,266 --> 00:06:23,207 and harvest files off of their hard drives, 119 00:06:23,657 --> 00:06:25,666 view all their clicks, and so on. 120 00:06:26,280 --> 00:06:30,802 Now Blake Robbins found out that the software existed 121 00:06:30,802 --> 00:06:33,787 and how it was being used because he and the head teacher 122 00:06:33,787 --> 00:06:37,056 had been knocking heads for years, since he first got into the school, 123 00:06:37,056 --> 00:06:39,864 and one day, the head teacher summoned him to his office 124 00:06:39,864 --> 00:06:41,462 and said: "Blake, I've got you now." 125 00:06:41,670 --> 00:06:45,284 and handed him a print-out of Blake in his bedroom the night before, 126 00:06:45,620 --> 00:06:48,873 taking what looked like a pill, and said: "You're taking drugs." 127 99:59:59,999 --> 99:59:59,999 And Blake Robbins said: "That's a candy, it's a Mike and Ike candy, I take them -- 128 99:59:59,999 --> 99:59:59,999 I eat them when I'm studying. 129 99:59:59,999 --> 99:59:59,999 How did you get a picture of me in my bedroom?" 130 99:59:59,999 --> 99:59:59,999 This head teacher had taken over 6000 photos of Blake Robbins: 131 99:59:59,999 --> 99:59:59,999 awake and asleep, dressed and undressed, in the presence of his family. 132 99:59:59,999 --> 99:59:59,999 And in the ensuing lawsuit, the school settled for a large amount of money 133 99:59:59,999 --> 99:59:59,999 and promised that they wouldn't do it again 134 99:59:59,999 --> 99:59:59,999 without informing the students that it was going on. 135 99:59:59,999 --> 99:59:59,999 And increasingly, the practice is now 136 99:59:59,999 --> 99:59:59,999 that school administrations hand out laptops, because they're getting cheaper, 137 99:59:59,999 --> 99:59:59,999 with exactly the same kind of software, 138 99:59:59,999 --> 99:59:59,999 but they let the students know, and they find that that works even better 139 99:59:59,999 --> 99:59:59,999 at curbing the students' behavior, 140 99:59:59,999 --> 99:59:59,999 because the students know that they're always on camera. 141 99:59:59,999 --> 99:59:59,999 Now, the surveillance state is moving from kids to the rest of the world. 142 99:59:59,999 --> 99:59:59,999 It's metastasizing. 143 99:59:59,999 --> 99:59:59,999 Our devices are increasingly designed to treat us as attackers, 144 99:59:59,999 --> 99:59:59,999 as suspicious parties who can't be trusted 145 99:59:59,999 --> 99:59:59,999 because our devices' job is to do things that we don't want them to do. 146 99:59:59,999 --> 99:59:59,999 Now that's not because the vendors who make our technology 147 99:59:59,999 --> 99:59:59,999 want to spy on us necessarily, 148 99:59:59,999 --> 99:59:59,999 but they want to take the ink-jet printer business model 149 99:59:59,999 --> 99:59:59,999 and bring it into every other realm of the world.
150 99:59:59,999 --> 99:59:59,999 So the ink-jet printer business model is where you sell someone a device 151 99:59:59,999 --> 99:59:59,999 and then you get a continuing revenue stream from that device 152 99:59:59,999 --> 99:59:59,999 by making sure that competitors can't make consumables or parts 153 99:59:59,999 --> 99:59:59,999 or additional features or plugins for that device, 154 99:59:59,999 --> 99:59:59,999 without paying rent to the original manufacturer. 155 99:59:59,999 --> 99:59:59,999 And that allows you to maintain monopoly margins on your devices. 156 99:59:59,999 --> 99:59:59,999 Now, in 1998, the American government passed a law called 157 99:59:59,999 --> 99:59:59,999 the Digital Millennium Copyright Act, 158 99:59:59,999 --> 99:59:59,999 in 2001 the European Union introduced its own version, 159 99:59:59,999 --> 99:59:59,999 the European Union Copyright Directive. 160 99:59:59,999 --> 99:59:59,999 And these two laws, along with laws all around the world, 161 99:59:59,999 --> 99:59:59,999 in Australia, Canada and elsewhere, prohibit removing the digital locks 162 99:59:59,999 --> 99:59:59,999 that are used to restrict access to copyrighted works 163 99:59:59,999 --> 99:59:59,999 and they were originally envisioned as a way of making sure that Europeans didn't 164 99:59:59,999 --> 99:59:59,999 bring cheap DVDs in from America, 165 99:59:59,999 --> 99:59:59,999 or making sure that Australians didn't import cheap DVDs from China. 166 99:59:59,999 --> 99:59:59,999 And so you have a digital work, a DVD, and it has a lock on it and to unlock it, 167 99:59:59,999 --> 99:59:59,999 you have to buy an authorized player 168 99:59:59,999 --> 99:59:59,999 and the player checks to make sure you are in the right region 169 99:59:59,999 --> 99:59:59,999 and making your own player that doesn't make that check 170 99:59:59,999 --> 99:59:59,999 is illegal because you'd have to remove the digital lock. 171 99:59:59,999 --> 99:59:59,999 And that was the original intent, 172 99:59:59,999 --> 99:59:59,999 it was to allow high prices to be maintained on removable media, 173 99:59:59,999 --> 99:59:59,999 DVDs and other entertainment content. 174 99:59:59,999 --> 99:59:59,999 But it very quickly spread into new realms. 175 99:59:59,999 --> 99:59:59,999 So, for example, auto manufacturers now lock up all of their cars' telemetry 176 99:59:59,999 --> 99:59:59,999 with digital locks. 177 99:59:59,999 --> 99:59:59,999 If you're a mechanic and want to fix a car, 178 99:59:59,999 --> 99:59:59,999 you have to get a reader from the manufacturer 179 99:59:59,999 --> 99:59:59,999 to make sure that you can see the telemetry 180 99:59:59,999 --> 99:59:59,999 and know what parts to order and how to fix it. 181 99:59:59,999 --> 99:59:59,999 And in order to get this reader, you have to promise the manufacturer 182 99:59:59,999 --> 99:59:59,999 that you will only buy parts from that manufacturer 183 99:59:59,999 --> 99:59:59,999 and not from third parties. 184 99:59:59,999 --> 99:59:59,999 So the manufacturers can keep the repair costs high 185 99:59:59,999 --> 99:59:59,999 and get a secondary revenue stream out of the cars.
186 99:59:59,999 --> 99:59:59,999 This year, the Chrysler Corporation filed comments with the US Copyright Office, 187 99:59:59,999 --> 99:59:59,999 to say that they believed that this was the right way to do it 188 99:59:59,999 --> 99:59:59,999 and that it should be a felony, punishable by 5 years in prison 189 99:59:59,999 --> 99:59:59,999 and a $500,000 fine, 190 99:59:59,999 --> 99:59:59,999 to change the locks on a car that you own, so that you can choose who fixes it. 191 99:59:59,999 --> 99:59:59,999 It turned out that when they advertised 192 99:59:59,999 --> 99:59:59,999 -- well, where is my slide here? Oh, there we go -- 193 99:59:59,999 --> 99:59:59,999 when they advertised that it wasn't your father's Oldsmobile, 194 99:59:59,999 --> 99:59:59,999 they weren't speaking metaphorically, they really meant 195 99:59:59,999 --> 99:59:59,999 that even though your father bought the Oldsmobile, 196 99:59:59,999 --> 99:59:59,999 it remained their property in perpetuity. 197 99:59:59,999 --> 99:59:59,999 And it's not just cars, it's every kind of device, 198 99:59:59,999 --> 99:59:59,999 because every kind of device today has a computer in it. 199 99:59:59,999 --> 99:59:59,999 The John Deere Company, the world's leading seller of heavy equipment 200 99:59:59,999 --> 99:59:59,999 and agricultural equipment technologies, 201 99:59:59,999 --> 99:59:59,999 they now view their tractors as information-gathering platforms 202 99:59:59,999 --> 99:59:59,999 and they view the people who use them 203 99:59:59,999 --> 99:59:59,999 as the kind of inconvenient gut flora of their ecosystem. 204 99:59:59,999 --> 99:59:59,999 So if you are a farmer and you own a John Deere tractor, 205 99:59:59,999 --> 99:59:59,999 when you drive it around your fields, the torque sensors on the wheels 206 99:59:59,999 --> 99:59:59,999 conduct a centimeter-accurate soil density survey of your agricultural land. 207 99:59:59,999 --> 99:59:59,999 That would be extremely useful to you when you're planting your seeds 208 99:59:59,999 --> 99:59:59,999 but that data is not available to you 209 99:59:59,999 --> 99:59:59,999 unless you remove the digital lock from your John Deere tractor 210 99:59:59,999 --> 99:59:59,999 which again, is against the law everywhere in the world. 211 99:59:59,999 --> 99:59:59,999 Instead, in order to get that data 212 99:59:59,999 --> 99:59:59,999 you have to buy a bundle with seeds from Monsanto, 213 99:59:59,999 --> 99:59:59,999 who are John Deere's seed partners. 214 99:59:59,999 --> 99:59:59,999 John Deere then takes this data that they aggregate across whole regions 215 99:59:59,999 --> 99:59:59,999 and they use it to gain insight into regional crop yields 216 99:59:59,999 --> 99:59:59,999 that they use to play the futures market. 217 99:59:59,999 --> 99:59:59,999 John Deere's tractors are really just a way of gathering information 218 99:59:59,999 --> 99:59:59,999 and the farmers are secondary to it. 219 99:59:59,999 --> 99:59:59,999 Just because you own it doesn't mean it's yours. 220 99:59:59,999 --> 99:59:59,999 And it's not just the computers that we put our bodies into 221 99:59:59,999 --> 99:59:59,999 that have this business model. 222 99:59:59,999 --> 99:59:59,999 It's the computers that we put inside of our bodies.
223 99:59:59,999 --> 99:59:59,999 If you're someone who is diabetic 224 99:59:59,999 --> 99:59:59,999 and you're fitted with a continuous glucose-measuring insulin pump, 225 99:59:59,999 --> 99:59:59,999 that insulin pump is designed with a digital lock 226 99:59:59,999 --> 99:59:59,999 that makes sure that your doctor can only use the manufacturer's software 227 99:59:59,999 --> 99:59:59,999 to read the data coming off of it 228 99:59:59,999 --> 99:59:59,999 and that software is resold on a rolling annual license 229 99:59:59,999 --> 99:59:59,999 and it can't be just bought outright. 230 99:59:59,999 --> 99:59:59,999 And the digital locks are also used to make sure 231 99:59:59,999 --> 99:59:59,999 that you only buy the insulin that the vendor has approved 232 99:59:59,999 --> 99:59:59,999 and not generic insulin that might be cheaper. 233 99:59:59,999 --> 99:59:59,999 We've literally turned human beings into ink-jet printers. 234 99:59:59,999 --> 99:59:59,999 Now, this has really deep implications beyond the economic implications. 235 99:59:59,999 --> 99:59:59,999 Because the rules that prohibit breaking these digital locks 236 99:59:59,999 --> 99:59:59,999 also prohibit telling people about flaws that programmers made 237 99:59:59,999 --> 99:59:59,999 because if you know about a flaw that a programmer made, 238 99:59:59,999 --> 99:59:59,999 you can use it to break the digital lock. 239 99:59:59,999 --> 99:59:59,999 And that means that the errors, the vulnerabilities, 240 99:59:59,999 --> 99:59:59,999 the mistakes in our devices, they fester in them, they go on and on and on 241 99:59:59,999 --> 99:59:59,999 and our devices become these long-lived reservoirs of digital pathogens. 242 99:59:59,999 --> 99:59:59,999 And we've seen how that plays out. 243 99:59:59,999 --> 99:59:59,999 One of the reasons that Volkswagen was able to get away 244 99:59:59,999 --> 99:59:59,999 with their diesel cheating for so long 245 99:59:59,999 --> 99:59:59,999 is because no one could independently audit their firmware. 246 99:59:59,999 --> 99:59:59,999 It's happening all over the place. 247 99:59:59,999 --> 99:59:59,999 You may have seen -- you may have seen this summer 248 99:59:59,999 --> 99:59:59,999 that Chrysler had to recall 1.4 million Jeeps 249 99:59:59,999 --> 99:59:59,999 because it turned out that they could be remotely controlled over the internet 250 99:59:59,999 --> 99:59:59,999 while driving down a motorway and have their brakes and steering 251 99:59:59,999 --> 99:59:59,999 commandeered by anyone, anywhere in the world, over the internet. 252 99:59:59,999 --> 99:59:59,999 We only have one methodology for determining whether security works 253 99:59:59,999 --> 99:59:59,999 and that's to submit it to public scrutiny, 254 99:59:59,999 --> 99:59:59,999 to allow for other people to see what assumptions you've made. 255 99:59:59,999 --> 99:59:59,999 Anyone can design a security system 256 99:59:59,999 --> 99:59:59,999 that he himself can't think of a way of breaking, 257 99:59:59,999 --> 99:59:59,999 but all that means is that you've designed a security system 258 99:59:59,999 --> 99:59:59,999 that works against people who are stupider than you. 259 99:59:59,999 --> 99:59:59,999 And in this regard, security is no different 260 99:59:59,999 --> 99:59:59,999 from any other kind of knowledge creation. 261 99:59:59,999 --> 99:59:59,999 You know, before we had contemporary science and scholarship, 262 99:59:59,999 --> 99:59:59,999 we had something that looked a lot like it, called alchemy.
263 99:59:59,999 --> 99:59:59,999 And for 500 years, alchemists kept what they thought they knew a secret. 264 99:59:59,999 --> 99:59:59,999 And that meant that every alchemist was capable of falling prey 265 99:59:59,999 --> 99:59:59,999 to that most urgent of human frailties, which is our ability to fool ourselves. 266 99:59:59,999 --> 99:59:59,999 And so, every alchemist discovered for himself in the hardest way possible 267 99:59:59,999 --> 99:59:59,999 that drinking mercury was a bad idea. 268 99:59:59,999 --> 99:59:59,999 We call that 500-year period the Dark Ages 269 99:59:59,999 --> 99:59:59,999 and we call the moment at which they started publishing 270 99:59:59,999 --> 99:59:59,999 and submitting themselves to adversarial peer review, 271 99:59:59,999 --> 99:59:59,999 which is when your friends tell you about the mistakes that you've made 272 99:59:59,999 --> 99:59:59,999 and your enemies call you an idiot for having made them, 273 99:59:59,999 --> 99:59:59,999 we call that moment the Enlightenment. 274 99:59:59,999 --> 99:59:59,999 Now, this has profound implications. 275 99:59:59,999 --> 99:59:59,999 The restriction of our ability to alter the security of our devices 276 99:59:59,999 --> 99:59:59,999 matters for our surveillance society, 277 99:59:59,999 --> 99:59:59,999 for our ability to be free people in society. 278 99:59:59,999 --> 99:59:59,999 At the height of the GDR, in 1989, 279 99:59:59,999 --> 99:59:59,999 the Stasi had one snitch for every 60 people in East Germany, 280 99:59:59,999 --> 99:59:59,999 in order to surveil the entire country. 281 99:59:59,999 --> 99:59:59,999 A couple of decades later, we found out through Edward Snowden 282 99:59:59,999 --> 99:59:59,999 that the NSA was spying on everybody in the world. 283 99:59:59,999 --> 99:59:59,999 And the ratio of people who work at the NSA to people they are spying on 284 99:59:59,999 --> 99:59:59,999 is more like 1 in 10,000. 285 99:59:59,999 --> 99:59:59,999 They've achieved a two-and-a-half order of magnitude 286 99:59:59,999 --> 99:59:59,999 productivity gain in surveillance. 287 99:59:59,999 --> 99:59:59,999 And the way that they got there is in part by the fact that 288 99:59:59,999 --> 99:59:59,999 we use devices that we're not allowed to alter, 289 99:59:59,999 --> 99:59:59,999 that are designed to treat us as attackers 290 99:59:59,999 --> 99:59:59,999 and that gather an enormous amount of information on us. 291 99:59:59,999 --> 99:59:59,999 If the government told you that you're required to carry around 292 99:59:59,999 --> 99:59:59,999 a small electronic rectangle that recorded all of your social relationships, 293 99:59:59,999 --> 99:59:59,999 all of your movements, 294 99:59:59,999 --> 99:59:59,999 all of your transient thoughts that you made known or ever looked up, 295 99:59:59,999 --> 99:59:59,999 and would make that available to the state, 296 99:59:59,999 --> 99:59:59,999 and you would have to pay for it, you would revolt. 297 99:59:59,999 --> 99:59:59,999 But the phone companies have managed to convince us, 298 99:59:59,999 --> 99:59:59,999 along with the mobile phone vendors, 299 99:59:59,999 --> 99:59:59,999 that we should foot the bill for our own surveillance. 300 99:59:59,999 --> 99:59:59,999 It's a bit like during the Cultural Revolution, 301 99:59:59,999 --> 99:59:59,999 where, after your family members were executed, 302 99:59:59,999 --> 99:59:59,999 they sent you a bill for the bullet. 303 99:59:59,999 --> 99:59:59,999 So, this has big implications, as I said, for where we go as a society.
304 99:59:59,999 --> 99:59:59,999 Because just as our kids have a hard time functioning 305 99:59:59,999 --> 99:59:59,999 in the presence of surveillance, and learning, 306 99:59:59,999 --> 99:59:59,999 and advancing their own knowledge, 307 99:59:59,999 --> 99:59:59,999 we as a society have a hard time progressing 308 99:59:59,999 --> 99:59:59,999 in the presence of surveillance. 309 99:59:59,999 --> 99:59:59,999 In our own living memory, people who are today thought of as normal and right 310 99:59:59,999 --> 99:59:59,999 were doing something that a generation ago 311 99:59:59,999 --> 99:59:59,999 would have been illegal and landed them in jail. 312 99:59:59,999 --> 99:59:59,999 For example, you probably know someone 313 99:59:59,999 --> 99:59:59,999 who's married to a partner of the same sex. 314 99:59:59,999 --> 99:59:59,999 If you live in America, you may know someone who takes medical marijuana, 315 99:59:59,999 --> 99:59:59,999 or if you live in the Netherlands. 316 99:59:59,999 --> 99:59:59,999 And not that long ago, people who undertook these activities 317 99:59:59,999 --> 99:59:59,999 could have gone to jail for them, 318 99:59:59,999 --> 99:59:59,999 could have faced enormous social exclusion for them. 319 99:59:59,999 --> 99:59:59,999 The way that we got from there to here was by having a private zone, 320 99:59:59,999 --> 99:59:59,999 a place where people weren't surveilled, 321 99:59:59,999 --> 99:59:59,999 in which they could advance their interesting ideas, 322 99:59:59,999 --> 99:59:59,999 do things that were thought of as socially unacceptable 323 99:59:59,999 --> 99:59:59,999 and slowly change our social attitudes. 324 99:59:59,999 --> 99:59:59,999 And unless you think that in 50 years, 325 99:59:59,999 --> 99:59:59,999 your grandchildren will sit around the Christmas table, in 2065, and say: 326 99:59:59,999 --> 99:59:59,999 "How was it, grandma, how was it, grandpa, 327 99:59:59,999 --> 99:59:59,999 that in 2015, you got it all right, 328 99:59:59,999 --> 99:59:59,999 and we haven't had any social changes since then?" 329 99:59:59,999 --> 99:59:59,999 then you have to ask yourself how, in a world 330 99:59:59,999 --> 99:59:59,999 in which we are all under continuous surveillance, 331 99:59:59,999 --> 99:59:59,999 we are going to find a way to improve this. 332 99:59:59,999 --> 99:59:59,999 So, our kids need ICT literacy, 333 99:59:59,999 --> 99:59:59,999 but ICT literacy isn't just typing skills or learning how to use PowerPoint. 334 99:59:59,999 --> 99:59:59,999 It's learning how to think critically 335 99:59:59,999 --> 99:59:59,999 about how they relate to the means of information, 336 99:59:59,999 --> 99:59:59,999 about whether they are its masters or servants. 337 99:59:59,999 --> 99:59:59,999 Our networks are not the most important issue that we have. 338 99:59:59,999 --> 99:59:59,999 There are much more important issues in society and in the world today.
339 99:59:59,999 --> 99:59:59,999 The future of the internet is way less important 340 99:59:59,999 --> 99:59:59,999 than the future of our climate, the future of gender equity, 341 99:59:59,999 --> 99:59:59,999 the future of racial equity, 342 99:59:59,999 --> 99:59:59,999 the future of the wage gap and the wealth gap in the world, 343 99:59:59,999 --> 99:59:59,999 but every one of those fights is going to be fought and won or lost 344 99:59:59,999 --> 99:59:59,999 on the internet: it's our most foundational fight. 345 99:59:59,999 --> 99:59:59,999 So computers can make us more free 346 99:59:59,999 --> 99:59:59,999 or they can take away our freedom. 347 99:59:59,999 --> 99:59:59,999 It all comes down to how we regulate them and how we use them. 348 99:59:59,999 --> 99:59:59,999 And it's our job, as people who are training the next generation, 349 99:59:59,999 --> 99:59:59,999 and whose next generation is beta-testing 350 99:59:59,999 --> 99:59:59,999 the surveillance technology that will be coming to us, 351 99:59:59,999 --> 99:59:59,999 it's our job to teach them to seize the means of information, 352 99:59:59,999 --> 99:59:59,999 to make themselves self-determinant in the way that they use their networks 353 99:59:59,999 --> 99:59:59,999 and to find ways to show them how to be critical and how to be smart 354 99:59:59,999 --> 99:59:59,999 and how to be, above all, subversive, and how to use the technology around them. 355 99:59:59,999 --> 99:59:59,999 Thank you. 356 99:59:59,999 --> 99:59:59,999 (Applause) 357 99:59:59,999 --> 99:59:59,999 (Moderator) Cory, thank you very much indeed. 358 99:59:59,999 --> 99:59:59,999 (Doctorow) Thank you. (Moderator) And I've got a bundle 359 99:59:59,999 --> 99:59:59,999 of points which you've stimulated from many in the audience, which sent -- 360 99:59:59,999 --> 99:59:59,999 (Doctorow) I'm shocked to hear that that was at all controversial, 361 99:59:59,999 --> 99:59:59,999 but go on. (Moderator) I didn't say "controversial", 362 99:59:59,999 --> 99:59:59,999 you stimulated thinking, which is great. 363 99:59:59,999 --> 99:59:59,999 But a lot of them resonate around violation of secrecy and security. 364 99:59:59,999 --> 99:59:59,999 And this, for example, from Anneke Burgess: 365 99:59:59,999 --> 99:59:59,999 "Is there a way for students to protect themselves 366 99:59:59,999 --> 99:59:59,999 from privacy violations by institutions they are supposed to trust?" 367 99:59:59,999 --> 99:59:59,999 I think this is probably a question for William Golding as well, 368 99:59:59,999 --> 99:59:59,999 someone who is a senior figure in a major university, but 369 99:59:59,999 --> 99:59:59,999 this issue of privacy violations and trust. 370 99:59:59,999 --> 99:59:59,999 (Doctorow) Well, I think that computers have a curious dual nature. 371 99:59:59,999 --> 99:59:59,999 So on the one hand, they do expose us to an enormous amount of scrutiny, 372 99:59:59,999 --> 99:59:59,999 depending on how they are configured. 373 99:59:59,999 --> 99:59:59,999 But on the other hand, computers have brought new powers to us 374 99:59:59,999 --> 99:59:59,999 that are literally new on the face of the world, right? 375 99:59:59,999 --> 99:59:59,999 We have never had a reality in which normal people could have secrets 376 99:59:59,999 --> 99:59:59,999 from powerful people.
377 99:59:59,999 --> 99:59:59,999 But with the computer in your pocket, with that, 378 99:59:59,999 --> 99:59:59,999 you can encrypt a message so thoroughly 379 99:59:59,999 --> 99:59:59,999 that if every hydrogen atom in the universe were turned into a computer 380 99:59:59,999 --> 99:59:59,999 and it did nothing until the heat death of the universe, 381 99:59:59,999 --> 99:59:59,999 but try to guess what your key was, 382 99:59:59,999 --> 99:59:59,999 we would run out of universe before we ran out of possible keys. 383 99:59:59,999 --> 99:59:59,999 So, computers do give us the power to have secrets. 384 99:59:59,999 --> 99:59:59,999 The problem is that institutions prohibit the use of technology 385 99:59:59,999 --> 99:59:59,999 that allows you to take back your own privacy. 386 99:59:59,999 --> 99:59:59,999 It's funny, right? Because we take kids and we say to them: 387 99:59:59,999 --> 99:59:59,999 "Your privacy is like your virginity: 388 99:59:59,999 --> 99:59:59,999 once you've lost it, you'll never get it back. 389 99:59:59,999 --> 99:59:59,999 Watch out what you're putting on Facebook" 390 99:59:59,999 --> 99:59:59,999 -- and I think they should watch what they're putting on Facebook, 391 99:59:59,999 --> 99:59:59,999 I'm a Facebook vegan, I don't even -- I don't use it, but we say: 392 99:59:59,999 --> 99:59:59,999 "Watch what you're putting on Facebook, 393 99:59:59,999 --> 99:59:59,999 don't send out dirty pictures of yourself on Snapchat." 394 99:59:59,999 --> 99:59:59,999 All good advice. 395 99:59:59,999 --> 99:59:59,999 But we do it while we are taking away all the private information 396 99:59:59,999 --> 99:59:59,999 that they have, all of their privacy and all of their agency. 397 99:59:59,999 --> 99:59:59,999 You know, if a parent says to a kid: 398 99:59:59,999 --> 99:59:59,999 "You mustn't smoke because you'll get sick" 399 99:59:59,999 --> 99:59:59,999 and the parent says it while lighting a new cigarette 400 99:59:59,999 --> 99:59:59,999 off the one that she's just put down in the ashtray, 401 99:59:59,999 --> 99:59:59,999 the kid knows that what you're doing matters more than what you're saying. 402 99:59:59,999 --> 99:59:59,999 (Moderator) The point is a deficit of trust. 403 99:59:59,999 --> 99:59:59,999 It builds on the kind of work that David has been doing as well, 404 99:59:59,999 --> 99:59:59,999 this deficit of trust and privacy. 405 99:59:59,999 --> 99:59:59,999 And there is another point here: 406 99:59:59,999 --> 99:59:59,999 "Is the battle for privacy already lost? 407 99:59:59,999 --> 99:59:59,999 Are we already too comfortable with giving away our data?" 408 99:59:59,999 --> 99:59:59,999 (Doctorow) No, I don't think so at all. 409 99:59:59,999 --> 99:59:59,999 In fact, I think that if anything, we've reached 410 99:59:59,999 --> 99:59:59,999 peak indifference to surveillance, right?