(Cory Doctorow) Thank you very much.

So I'd like to start with something of a benediction or permission. I am one of nature's fast talkers, and many of you are not native English speakers, or maybe not accustomed to my harsh Canadian accent; in addition, I've just come in from Australia, and so, like many of you, I am horribly jetlagged and have drunk enough coffee this morning to kill a rhino.

When I used to be at the United Nations, I was known as the scourge of the simultaneous translation corps. I would stand up and speak as slowly as I could, and turn around, and there they would be in their booths, doing this. (laughter) When I start to speak too fast, this is the universal symbol -- my wife invented it -- for "Cory, you are talking too fast". Please, don't be shy.

So, I'm a parent, like many of you, and like, I'm sure, all of you who are parents, parenting kicks my ass all the time. And there are many regrets I have about the mere seven and a half years that I've been a parent, but none are so keenly felt as my regrets over what's happened when I've been wandering around the house and seen my daughter working on something that was beyond her abilities, that was right at the edge of what she could do, where she was doing something that she didn't have competence in yet. And you know, it's that amazing thing to see that frowning concentration, tongue stuck out: as a parent, your heart swells with pride and you can't help but go over and sort of peer over their shoulder at what they are doing. And those of you who are parents know what happens when you look too closely at someone who is working beyond the edge of their competence: they go back to doing something they're already good at. You interrupt a moment of genuine learning and you replace it with a kind of embarrassment about what you're good at and what you're not.

So, it matters a lot that our schools are increasingly surveilled environments, environments in which everything that our kids do is watched and recorded. Because when you do that, you interfere with those moments of real learning. Our ability to do things that we are not good at yet, that we are not proud of yet, is negatively impacted by that kind of scrutiny. And that scrutiny comes from a strange place.
We have decided that there are some programmatic means by which we can find all the web pages children shouldn't look at, and we will filter our networks to be sure that they don't see them. Anyone who has ever paid attention knows that this doesn't work. There are more web pages that kids shouldn't look at than can ever be cataloged, and any attempt to catalog them will always catch pages that kids must be looking at. Any of you who have ever taught a unit on reproductive health know the frustration of trying to get around a school network.

Now, this is done in the name of digital protection, but it flies in the face of digital literacy and of real learning. Because the only way to stop kids from looking at web pages they shouldn't be looking at is to take all of the clicks that they make, all of the messages that they send, all of their online activity, and offshore it to a firm that has some nonsensically arrived-at list of the bad pages. And so, what we are doing is exfiltrating all of our students' data to unknown third parties. Now, most of these firms, their primary business isn't in serving the education sector: most of them service the government sector. They primarily service governments in repressive, autocratic regimes; they help them make sure that their citizens aren't looking at Amnesty International web pages. They repackage those tools and sell them to our educators. So we are offshoring our children's clicks to war criminals.

And what our kids do, we know, is they just get around it, because it's not hard to get around it. You know, never underestimate the power of a kid who is time-rich and cash-poor to get around our technological blockades. But when they do this, they don't acquire the kind of digital literacy that we want them to have, they don't acquire real digital agency, and moreover, they risk exclusion and, in extreme cases, they risk criminal prosecution.
So what if, instead, those of us who are trapped in this system of teaching kids, where we're required to subject them to this kind of surveillance that flies in the face of their real learning -- what if, instead, we invented curricular units that made them real first-class digital citizens, in charge of trying to influence real digital problems? Like, what if we said to them: "We want you to catalog the web pages that this vendor lets through that you shouldn't be seeing. We want you to catalog those pages that you should be seeing, that are blocked. We want you to go and interview every teacher in the school about all those lesson plans that were carefully laid out before lunch, with a video and a web page, and over lunch, the unaccountable, distant censor blocked these critical resources and left them handing out photocopied worksheets in the afternoon instead of the unit they prepared. We want you to learn how to do Freedom of Information Act requests and find out what your school authority is spending to censor your internet access and surveil your activity. We want you to learn to use the internet to research these companies, and we want you to present this to your parent-teacher association, to your school authority, to your local newspaper." Because that's the kind of digital literacy that makes kids into first-class digital citizens, that prepares them for a future in which they can participate fully in a world that's changing.

Kids are the beta-testers of the surveillance state. The path of surveillance technology starts with prisoners, moves to asylum seekers and people in mental institutions, and then to its first non-incarcerated population: children, and then moves to blue-collar workers, government workers and white-collar workers. And so, what we do to kids today is what we did to prisoners yesterday and what we're going to be doing to you tomorrow. And so it matters what we teach our kids.

If you want to see where this goes, this is a kid named Blake Robbins, and he attended Lower Merion High School in Lower Merion, Pennsylvania, outside of Philadelphia.
It's the most affluent school district in America, so affluent that all the kids were issued MacBooks at the start of the year, and they had to do their homework on their MacBooks; they had to bring them to school every day and bring them home every night. And the MacBooks had been fitted with laptop theft recovery software, which is a fancy word for a rootkit, that let the school administration covertly operate the cameras and microphones on these computers, and harvest files off of their hard drives, and view all their clicks, and so on.

Now, Blake Robbins found out that the software existed, and how it was being used, because he and the head teacher had been knocking heads for years, since he first got into the school, and one day the head teacher summoned him to his office and said: "Blake, I've got you now," and handed him a print-out of Blake in his bedroom the night before, taking what looked like a pill, and said: "You're taking drugs." And Blake Robbins said: "That's a candy, it's a Mike and Ike candy -- I eat them when I'm studying. How did you get a picture of me in my bedroom?" This head teacher had taken over 6,000 photos of Blake Robbins: awake and asleep, dressed and undressed, in the presence of his family. And in the ensuing lawsuit, the school settled for a large amount of money and promised that they wouldn't do it again without informing the students that it was going on. And increasingly, the practice is now that school administrations hand out laptops, because they're getting cheaper, with exactly the same kind of software, but they let the students know, and they find that that works even better at curbing the students' behavior, because the students know that they're always on camera.

Now, the surveillance state is moving from kids to the rest of the world. It's metastasizing. Our devices are increasingly designed to treat us as attackers, as suspicious parties who can't be trusted, because our devices' job is to do things that we don't want them to do. Now, that's not because the vendors who make our technology want to spy on us, necessarily, but they want to take the ink-jet printer business model and bring it into every other realm of the world.
So the ink-jet printer business model is where you sell someone a device and then you get a continuing revenue stream from that device, by making sure that competitors can't make consumables or parts or additional features or plugins for that device without paying rent to the original manufacturer. And that allows you to maintain monopoly margins on your devices.

Now, in 1998, the American government passed a law called the Digital Millennium Copyright Act; in 2001, the European Union introduced its own version, the European Union Copyright Directive. And these two laws, along with laws all around the world, in Australia, Canada and elsewhere, prohibit removing the digital locks that are used to restrict access to copyrighted works, and they were originally envisioned as a way of making sure that Europeans didn't bring cheap DVDs in from America, or making sure that Australians didn't import cheap DVDs from China. And so you have a digital work, a DVD, and it has a lock on it, and to unlock it, you have to buy an authorized player, and the player checks to make sure you are in region, and making your own player that doesn't make that check is illegal, because you'd have to remove the digital lock. And that was the original intent: it was to allow high prices to be maintained on removable media, DVDs and other entertainment content.

But it very quickly spread into new realms. So, for example, auto manufacturers now lock up all of their cars' telemetry with digital locks. If you're a mechanic and want to fix a car, you have to get a reader from the manufacturer to make sure that you can see the telemetry and know what parts to order and how to fix it. And in order to get this reader, you have to promise the manufacturer that you will only buy parts from that manufacturer and not from third parties. So the manufacturers can keep the repair costs high and get a secondary revenue stream out of the cars.
This year, the Chrysler corporation filed comments with the US Copyright Office to say that they believed that this was the right way to do it, and that it should be a felony, punishable by five years in prison and a $500,000 fine, to change the locks on a car that you own, so that you can choose who fixes it. It turned out that when they advertised -- well, where is my slide here? Oh, there we go -- when they advertised that it wasn't your father's Oldsmobile, they weren't speaking metaphorically: they really meant that even though your father bought the Oldsmobile, it remained their property in perpetuity.

And it's not just cars, it's every kind of device, because every kind of device today has a computer in it. The John Deere Company, the world's leading seller of heavy equipment and agricultural equipment technologies, they now view their tractors as information-gathering platforms, and they view the people who use them as the kind of inconvenient gut flora of their ecosystem. So if you are a farmer and you own a John Deere tractor, when you drive it around your fields, the torque sensors on the wheels conduct a centimeter-accurate soil density survey of your agricultural land. That would be extremely useful to you when you're planting your seeds, but that data is not available to you unless you remove the digital lock from your John Deere tractor, which, again, is against the law everywhere in the world. Instead, in order to get that data, you have to buy a bundle with seeds from Monsanto, who are John Deere's seed partners. John Deere then takes this data that they aggregate across whole regions, and they use it to gain insight into regional crop yields, which they use to play the futures market. John Deere's tractors are really just a way of gathering information, and the farmers are secondary to it.

Just because you own it doesn't mean it's yours. And it's not just the computers that we put our bodies into that have this business model. It's the computers that we put inside of our bodies.
If you're someone who is diabetic and you're fitted with a continuous glucose-measuring insulin pump, that insulin pump is designed with a digital lock that makes sure that your doctor can only use the manufacturer's software to read the data coming off of it, and that software is resold on a rolling annual license and can't just be bought outright. And the digital locks are also used to make sure that you only buy the insulin that the vendor has approved, and not generic insulin that might be cheaper. We've literally turned human beings into ink-jet printers.

Now, this has really deep implications beyond the economic implications. Because the rules that prohibit breaking these digital locks also prohibit telling people about flaws that programmers made, because if you know about a flaw that a programmer made, you can use it to break the digital lock. And that means that the errors, the vulnerabilities, the mistakes in our devices, they fester in them, they go on and on and on, and our devices become these long-lived reservoirs of digital pathogens. And we've seen how that plays out. One of the reasons that Volkswagen was able to get away with their diesel cheating for so long is because no one could independently audit their firmware. It's happening all over the place. You may have seen this summer that Chrysler had to recall 1.4 million Jeeps, because it turned out that they could be remotely controlled over the internet while driving down a motorway and have their brakes and steering commandeered by anyone, anywhere in the world, over the internet.

We only have one methodology for determining whether security works, and that's to submit it to public scrutiny, to allow other people to see what assumptions you've made. Anyone can design a security system that he himself can't think of a way of breaking, but all that means is that you've designed a security system that works against people who are stupider than you. And in this regard, security is no different from any other kind of knowledge creation. You know, before we had contemporary science and scholarship, we had something that looked a lot like it, called alchemy.
And for 500 years, alchemists kept what they thought they knew a secret. And that meant that every alchemist was capable of falling prey to that most urgent of human frailties, which is our ability to fool ourselves. And so, every alchemist discovered for himself, in the hardest way possible, that drinking mercury was a bad idea. We call that 500-year period the Dark Ages, and we call the moment at which they started publishing and submitting themselves to adversarial peer review -- which is when your friends tell you about the mistakes that you've made and your enemies call you an idiot for having made them -- we call that moment the Enlightenment.

Now, this has profound implications. The restriction of our ability to alter the security of our devices has profound implications for our surveillance society, for our ability to be free people in society. At the height of the GDR, in 1989, the Stasi had one snitch for every 60 people in East Germany, in order to surveil the entire country. A couple of decades later, we found out through Edward Snowden that the NSA was spying on everybody in the world, and the ratio of people who work at the NSA to people they are spying on is more like 1 in 10,000. They've achieved a two-and-a-half-order-of-magnitude productivity gain in surveillance.

And the way that they got there is, in part, by the fact that we use devices that we're not allowed to alter, that are designed to treat us as attackers and that gather an enormous amount of information on us. If the government told you that you were required to carry around a small electronic rectangle that recorded all of your social relationships, all of your movements, all of your transient thoughts that you made known or ever looked up, and would make that available to the state, and you would have to pay for it, you would revolt. But the phone companies have managed to convince us, along with the mobile phone vendors, that we should foot the bill for our own surveillance. It's a bit like during the Cultural Revolution, where, after your family members were executed, they sent you a bill for the bullet. So, this has big implications, as I said, for where we go as a society.
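(A quick back-of-envelope check of the ratio claim above, as a Python sketch. Only the 1:60 and 1:10,000 figures come from the talk; the arithmetic puts the gain a bit over two orders of magnitude, roughly the scale Doctorow cites.)

```python
import math

# Figures quoted in the talk.
STASI_RATIO = 60       # one Stasi informant per 60 East Germans (GDR, 1989)
NSA_RATIO = 10_000     # roughly one NSA employee per 10,000 people surveilled

gain = NSA_RATIO / STASI_RATIO
print(f"per-watcher productivity gain: ~{gain:.0f}x")    # ~167x
print(f"orders of magnitude: ~{math.log10(gain):.1f}")   # ~2.2
```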
Because just as our kids have a hard time functioning in the presence of surveillance, and learning, and advancing their own knowledge, we as a society have a hard time progressing in the presence of surveillance. In our own living memory, people who are today thought of as normal and right were doing something that a generation ago would have been illegal and landed them in jail. For example, you probably know someone who's married to a partner of the same sex. If you live in America, or if you live in the Netherlands, you may know someone who takes medical marijuana. And not that long ago, people who undertook these activities could have gone to jail for them, could have faced enormous social exclusion for them. The way that we got from there to here was by having a private zone, a place where people weren't surveilled, in which they could advance their interests and ideas, do things that were thought of as socially unacceptable, and slowly change our social attitudes.

And unless you think (check) that in 50 years, your grandchildren will sit around the Christmas table, in 2065, and say: "How was it, grandma, how was it, grandpa, that in 2015, you got it all right, and we haven't had any social changes since then?", then you have to ask yourself how, in a world in which we are all under continuous surveillance, we are going to find a way to improve this.

So, our kids need ICT literacy, but ICT literacy isn't just typing skills or learning how to use PowerPoint. It's learning how to think critically about how they relate to the means of information, about whether they are its masters or servants. Our networks are not the most important issue that we have. There are much more important issues in society and in the world today.
The future of the internet is way less important than the future of our climate, the future of gender equity, the future of racial equity, the future of the wage gap and the wealth gap in the world, but every one of those fights is going to be fought -- and won or lost -- on the internet: it's our most foundational fight. So, you know (check), computers can make us more free or they can take away our freedom. It all comes down to how we regulate them and how we use them. And it's our job, as people who are training the next generation, and whose next generation is beta-testing the surveillance technology that will be coming to us, it's our job to teach them to seize the means of information, to make themselves self-determinant in the way that they use their networks, and to find ways to show them how to be critical and how to be smart and how to be, above all, subversive in how they use the technology around them.

Thank you. (18:49)

(Applause)

(Moderator) Cory, thank you very much indeed. (Doctorow) Thank you. (Moderator) And I've got a bundle of points which you've stimulated from many in the audience, which sent -- (Doctorow) I'm shocked to hear that that was at all controversial, but go on. (Moderator) I didn't say "controversial", you stimulated thinking, which is great. But a lot of them resonate around violation of secrecy and security. And this, for example, from Anneke Burgess (check): "Is there a way for students to protect themselves from privacy violations by institutions they are supposed to trust?" I think this is probably a question for William Golding (check) as well, someone who is a senior figure in a major university, but this issue of privacy violations and trust.

(Doctorow) Well, I think that computers have a curious dual nature. So on the one hand, they do expose us to an enormous amount of scrutiny, depending on how they are configured. But on the other hand, computers have brought new powers to us that are literally new on the face of the world, right? We have never had a reality in which normal people could have secrets from powerful people.
But with the computer in your pocket, with that, you can encrypt a message so thoroughly that if every hydrogen atom in the universe were turned into a computer, and it did nothing until the heat death of the universe but try to guess what your key was, we would run out of universe before we ran out of possible keys. So, computers do give us the power to have secrets. The problem is that institutions prohibit the use of the technology that allows you to take back your own privacy.

It's funny, right? Because we take kids and we say to them: "Your privacy is like your virginity: once you've lost it, you'll never get it back. Watch out what you're putting on Facebook" -- and I think they should watch what they're putting on Facebook; I'm a Facebook vegan, I don't even -- I don't use it -- but we say: "Watch what you're putting on Facebook, don't send out dirty pictures of yourself on SnapChat." All good advice. But we do it while we are taking away all the private information that they have, all of their privacy and all of their agency. You know, if a parent says to a kid: "You mustn't smoke, because you'll get sick," and the parent says it while lighting a new cigarette off the one that she's just put down in the ashtray, the kid knows that what you're doing matters more than what you're saying.

(Moderator) The point is deficit of trust. It builds in the kind of work that David has been doing as well, this deficit of trust and privacy. And there is another point here: "Is the battle for privacy already lost? Are we already too comfortable with giving away our data?" (Doctorow) No, I don't think so at all. In fact, I think that, if anything, we've reached peak indifference to surveillance, right? (21:25)
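(To put rough numbers on the key-guessing claim near the start of this exchange: the every-hydrogen-atom image is rhetorical, but even a tame, standard back-of-envelope version makes the point. This Python sketch assumes a 256-bit key and a trillion guesses per second; the talk names neither figure.)

```python
import math

# Assumptions (the talk names no cipher, key size, or guessing rate):
KEY_BITS = 256                    # e.g. an AES-256 key
GUESSES_PER_SECOND = 10 ** 12     # a generous trillion guesses per second
AGE_OF_UNIVERSE_YEARS = 1.38e10   # current age of the universe

keyspace = 2 ** KEY_BITS
seconds = keyspace / 2 / GUESSES_PER_SECOND   # expected time: half the keyspace
years = seconds / (60 * 60 * 24 * 365.25)

print(f"expected brute-force time: ~10^{math.log10(years):.0f} years")
print(f"that is ~10^{math.log10(years / AGE_OF_UNIVERSE_YEARS):.0f} "
      f"times the current age of the universe")
```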