0:00:00.000,0:00:00.778 (Cory Doctorow) Thank you very much 0:00:01.066,0:00:05.420 So I'd like to start with something of a[br]benediction or permission. 0:00:05.420,0:00:07.740 I am one of nature's fast talkers 0:00:07.740,0:00:10.620 and many of you are not [br]native English speakers, or 0:00:10.620,0:00:12.919 maybe not accustomed [br]to my harsh Canadian accent. 0:00:12.919,0:00:15.919 In addition, I've just come in [br]from Australia 0:00:15.919,0:00:19.166 and so like many of you I am horribly[br]jetlagged and have drunk enough coffee 0:00:19.166,0:00:20.736 this morning to kill a rhino. 0:00:22.112,0:00:23.587 When I used to be at the United Nations 0:00:23.587,0:00:27.023 I was known as the scourge of the[br]simultaneous translation corps 0:00:27.157,0:00:29.528 I would stand up and speak [br]as slowly as I could 0:00:29.576,0:00:32.229 and turn around, and there they[br]would be in their booths doing this 0:00:32.229,0:00:35.136 (laughter)[br]When I start to speak too fast, 0:00:35.136,0:00:37.557 this is the universal symbol --[br]my wife invented it -- 0:00:37.557,0:00:41.518 for "Cory, you are talking too fast".[br]Please, don't be shy. 0:00:42.078,0:00:45.790 So, I'm a parent, like many of you, [br]and like I'm sure all of you 0:00:45.790,0:00:48.850 who are parents, parenting kicks my ass [br]all the time. 0:00:50.030,0:00:54.820 And there are many regrets I have[br]about the mere seven and a half years 0:00:54.820,0:00:57.284 that I've been a parent [br]but none are so keenly felt 0:00:57.784,0:01:00.878 as my regrets over what's happened [br]when I've been wandering 0:01:00.878,0:01:04.766 around the house and seen my[br]daughter working on something 0:01:04.766,0:01:08.697 that was beyond her abilities, that was [br]right at the edge of what she could do 0:01:08.697,0:01:12.759 and where she was doing something [br]that she didn't have competence in yet 0:01:12.759,0:01:17.019 and you know it's that amazing thing [br]to see that frowning concentration, 0:01:17.019,0:01:20.038 tongue stuck out: as a parent, your[br]heart swells with pride 0:01:20.038,0:01:21.332 and you can't help but go over 0:01:21.332,0:01:23.280 and sort of peer over their shoulder[br]at what they are doing 9:59:59.000,9:59:59.000 and those of you who are parents know [br]what happens when you look too closely 9:59:59.000,9:59:59.000 at someone who is working [br]beyond the edge of their competence. 9:59:59.000,9:59:59.000 They go back to doing something[br]they're already good at. 9:59:59.000,9:59:59.000 You interrupt a moment [br]of genuine learning 9:59:59.000,9:59:59.000 and you replace it with [br]a kind of embarrassment 9:59:59.000,9:59:59.000 about what you're good at [br]and what you're not. 9:59:59.000,9:59:59.000 So, it matters a lot that our schools are[br]increasingly surveilled environments, 9:59:59.000,9:59:59.000 environments in which everything that [br]our kids do is watched and recorded. 9:59:59.000,9:59:59.000 Because when you do that, you interfere[br]with those moments of real learning. 9:59:59.000,9:59:59.000 Our ability to do things that we are not [br]good at yet, that we are not proud of yet, 9:59:59.000,9:59:59.000 is negatively impacted [br]by that kind of scrutiny. 9:59:59.000,9:59:59.000 And that scrutiny comes [br]from a strange place. 
9:59:59.000,9:59:59.000 We have decided that there are [br]some programmatic means 9:59:59.000,9:59:59.000 by which we can find all the web pages[br]children shouldn't look at 9:59:59.000,9:59:59.000 and we will filter our networks[br]to be sure that they don't see them. 9:59:59.000,9:59:59.000 Anyone who has ever paid attention[br]knows that this doesn't work. 9:59:59.000,9:59:59.000 There are more web pages [br]that kids shouldn't look at 9:59:59.000,9:59:59.000 than can ever be cataloged, [br]and any attempt to catalog them 9:59:59.000,9:59:59.000 will always catch pages that kids[br]should be looking at. 9:59:59.000,9:59:59.000 Any of you who have ever taught [br]a unit on reproductive health 9:59:59.000,9:59:59.000 know the frustration of trying [br]to get round a school network. 9:59:59.000,9:59:59.000 Now, this is done in the name of[br]digital protection 9:59:59.000,9:59:59.000 but it flies in the face of digital[br]literacy and of real learning. 9:59:59.000,9:59:59.000 Because the only way to stop kids [br]from looking at web pages 9:59:59.000,9:59:59.000 they shouldn't be looking at 9:59:59.000,9:59:59.000 is to take all of the clicks that they [br]make, all of the messages that they send, 9:59:59.000,9:59:59.000 all of their online activity [br]and offshore it to a firm 9:59:59.000,9:59:59.000 that has some nonsensically arrived at[br]list of the bad pages. 9:59:59.000,9:59:59.000 And so, what we are doing is that we're [br]exfiltrating all of our students' data 9:59:59.000,9:59:59.000 to unknown third parties. 9:59:59.000,9:59:59.000 Now, most of these firms, [br]their primary business isn't 9:59:59.000,9:59:59.000 serving the education sector. 9:59:59.000,9:59:59.000 Most of them service [br]the government sector. 9:59:59.000,9:59:59.000 They primarily service governments in[br]repressive autocratic regimes. 9:59:59.000,9:59:59.000 They help them make sure that [br]their citizens aren't looking at 9:59:59.000,9:59:59.000 Amnesty International web pages. 9:59:59.000,9:59:59.000 They repackage those tools [br]and sell them to our educators. 9:59:59.000,9:59:59.000 So we are offshoring our children's clicks[br]to war criminals. 9:59:59.000,9:59:59.000 And what our kids do, we know,[br]is they just get around it, 9:59:59.000,9:59:59.000 because it's not hard to get around it. 9:59:59.000,9:59:59.000 You know, never underestimate the power[br]of a kid who is time-rich and cash-poor 9:59:59.000,9:59:59.000 to get around our [br]technological blockades. 9:59:59.000,9:59:59.000 But when they do this, they don't acquire[br]the kind of digital literacy 9:59:59.000,9:59:59.000 that we want them to have, they don't [br]acquire real digital agency 9:59:59.000,9:59:59.000 and moreover, they risk exclusion [br]and in extreme cases, 9:59:59.000,9:59:59.000 they risk criminal prosecution. 9:59:59.000,9:59:59.000 So what if instead, those of us who are[br]trapped in this system of teaching kids 9:59:59.000,9:59:59.000 where we're required to subject them[br]to this kind of surveillance 9:59:59.000,9:59:59.000 that flies in the face [br]of their real learning, 9:59:59.000,9:59:59.000 what if instead, we invented [br]curricular units 9:59:59.000,9:59:59.000 that made them real first-class[br]digital citizens, 9:59:59.000,9:59:59.000 in charge of trying to influence[br]real digital problems? 9:59:59.000,9:59:59.000 Like what if we said to them: [br]"We want you to catalog the web pages 9:59:59.000,9:59:59.000 that this vendor lets through [br]that you shouldn't be seeing. 
9:59:59.000,9:59:59.000 We want you to catalog those pages that [br]you should be seeing, that are blocked. 9:59:59.000,9:59:59.000 We want you to go and interview [br]every teacher in the school 9:59:59.000,9:59:59.000 about all those lesson plans that were[br]carefully laid out before lunch 9:59:59.000,9:59:59.000 with a video and a web page, [br]and over lunch, 9:59:59.000,9:59:59.000 the unaccountable distant censor[br]blocked these critical resources 9:59:59.000,9:59:59.000 and left them handing out photocopied [br]worksheets in the afternoon 9:59:59.000,9:59:59.000 instead of the unit they prepared. 9:59:59.000,9:59:59.000 We want you to learn how to do Freedom[br]of Information Act requests 9:59:59.000,9:59:59.000 and find out what your [br]school authority is spending 9:59:59.000,9:59:59.000 to censor your internet access [br]and surveil your activity. 9:59:59.000,9:59:59.000 We want you to learn to use the internet [br]to research these companies and 9:59:59.000,9:59:59.000 we want you to present this [br]to your parent-teacher association, 9:59:59.000,9:59:59.000 to your school authority,[br]to your local newspaper." 9:59:59.000,9:59:59.000 Because that's the kind [br]of digital literacy 9:59:59.000,9:59:59.000 that makes kids into first-class[br]digital citizens, 9:59:59.000,9:59:59.000 that prepares them for a future [br]in which they can participate fully 9:59:59.000,9:59:59.000 in a world that's changing. 9:59:59.000,9:59:59.000 Kids are the beta-testers [br]of the surveillance state. 9:59:59.000,9:59:59.000 The path of surveillance technology[br]starts with prisoners, 9:59:59.000,9:59:59.000 moves to asylum seekers, [br]people in mental institutions 9:59:59.000,9:59:59.000 and then to its first non-incarcerated [br]population: children 9:59:59.000,9:59:59.000 and then moves to blue-collar workers,[br]government workers 9:59:59.000,9:59:59.000 and white-collar workers. 9:59:59.000,9:59:59.000 And so, what we do to kids today[br]is what we did to prisoners yesterday 9:59:59.000,9:59:59.000 and what we're going to be doing [br]to you tomorrow. 9:59:59.000,9:59:59.000 And so it matters, what we teach our kids. 9:59:59.000,9:59:59.000 If you want to see where this goes, this[br]is a kid named Blake Robbins 9:59:59.000,9:59:59.000 and he attended Lower Merion High School [br]in Lower Merion, Pennsylvania, 9:59:59.000,9:59:59.000 outside of Philadelphia. 9:59:59.000,9:59:59.000 It's the most affluent school district[br]in America, so affluent 9:59:59.000,9:59:59.000 that all the kids were issued MacBooks[br]at the start of the year 9:59:59.000,9:59:59.000 and they had to do their homework on[br]their MacBooks, 9:59:59.000,9:59:59.000 they had to bring them to school every day[br]and bring them home every night. 9:59:59.000,9:59:59.000 And the MacBooks had been fitted with[br]Laptop Theft Recovery Software, 9:59:59.000,9:59:59.000 which is a fancy word for a rootkit, that[br]let the school administration 9:59:59.000,9:59:59.000 covertly operate the cameras[br]and microphones on these computers 9:59:59.000,9:59:59.000 and harvest files off [br]of their hard drives, 9:59:59.000,9:59:59.000 view all their clicks, and so on. 
9:59:59.000,9:59:59.000 Now Blake Robbins found out [br]that the software existed 9:59:59.000,9:59:59.000 and how it was being used [br]because he and the head teacher 9:59:59.000,9:59:59.000 had been knocking heads for years, [br]since he first got into the school, 9:59:59.000,9:59:59.000 and one day, the head teacher [br]summoned him to his office 9:59:59.000,9:59:59.000 and said: "Blake, I've got you now." 9:59:59.000,9:59:59.000 and handed him a print-out of Blake[br]in his bedroom the night before, 9:59:59.000,9:59:59.000 taking what looked like a pill, [br]and said: "You're taking drugs." 9:59:59.000,9:59:59.000 And Blake Robbins said: "That's a candy,[br]it's a Mike and Ike candy, I take them -- 9:59:59.000,9:59:59.000 I eat them when I'm studying. 9:59:59.000,9:59:59.000 How did you get a picture [br]of me in my bedroom?" 9:59:59.000,9:59:59.000 This head teacher had taken [br]over 6000 photos of Blake Robbins: 9:59:59.000,9:59:59.000 awake and asleep, dressed and undressed,[br]in the presence of his family. 9:59:59.000,9:59:59.000 And in the ensuing lawsuit, the school[br]settled for a large amount of money 9:59:59.000,9:59:59.000 and promised that [br]they wouldn't do it again 9:59:59.000,9:59:59.000 without informing the students [br]that it was going on. 9:59:59.000,9:59:59.000 And increasingly, the practice is now 9:59:59.000,9:59:59.000 that school administrations hand out [br]laptops, because they're getting cheaper, 9:59:59.000,9:59:59.000 with exactly the same kind of software, 9:59:59.000,9:59:59.000 but they let the students know, and [br]they find that that works even better 9:59:59.000,9:59:59.000 at curbing the students' behavior, 9:59:59.000,9:59:59.000 because the students know that [br]they're always on camera. 9:59:59.000,9:59:59.000 Now, the surveillance state is moving[br]from kids to the rest of the world. 9:59:59.000,9:59:59.000 It's metastasizing. 9:59:59.000,9:59:59.000 Our devices are increasingly designed[br]to treat us as attackers, 9:59:59.000,9:59:59.000 as suspicious parties [br]who can't be trusted 9:59:59.000,9:59:59.000 because our devices' job is to do things[br]that we don't want them to do. 9:59:59.000,9:59:59.000 Now that's not because the vendors [br]who make our technology 9:59:59.000,9:59:59.000 want to spy on us necessarily, 9:59:59.000,9:59:59.000 but they want to take [br]the ink-jet printer business model 9:59:59.000,9:59:59.000 and bring it into every other realm[br]of the world. 9:59:59.000,9:59:59.000 So the ink-jet printer business model[br]is where you sell someone a device 9:59:59.000,9:59:59.000 and then you get a continuing [br]revenue stream from that device 9:59:59.000,9:59:59.000 by making sure that competitors can't make [br]consumables or parts 9:59:59.000,9:59:59.000 or additional features [br]or plugins for that device, 9:59:59.000,9:59:59.000 without paying rent [br]to the original manufacturer. 9:59:59.000,9:59:59.000 And that allows you to maintain [br]monopoly margins on your devices. 9:59:59.000,9:59:59.000 Now, in 1998, the American government [br]passed a law called 9:59:59.000,9:59:59.000 the Digital Millennium Copyright Act, 9:59:59.000,9:59:59.000 and in 2001 the European Union [br]introduced its own version, 9:59:59.000,9:59:59.000 the European Union Copyright Directive. 
9:59:59.000,9:59:59.000 And these two laws, along with laws[br]all around the world, 9:59:59.000,9:59:59.000 in Australia, Canada and elsewhere.[br]These laws prohibit removing digital locks 9:59:59.000,9:59:59.000 that are used to restrict [br]access to copyrighted works 9:59:59.000,9:59:59.000 and they were originally envisioned as a way[br]of making sure that Europeans didn't 9:59:59.000,9:59:59.000 bring cheap DVDs in from America, 9:59:59.000,9:59:59.000 or making sure that Australians didn't [br]import cheap DVDs from China. 9:59:59.000,9:59:59.000 And so you have a digital work, a DVD,[br]and it has a lock on it and to unlock it, 9:59:59.000,9:59:59.000 you have to buy an authorized player 9:59:59.000,9:59:59.000 and the player checks to make sure [br]you are in region 9:59:59.000,9:59:59.000 and making your own player [br]that doesn't make that check 9:59:59.000,9:59:59.000 is illegal because you'd have [br]to remove the digital lock. 9:59:59.000,9:59:59.000 And that was the original intent, 9:59:59.000,9:59:59.000 it was to allow high rates to be [br]maintained on removable media, 9:59:59.000,9:59:59.000 DVDs and other entertainment content. 9:59:59.000,9:59:59.000 But it very quickly spread [br]into new realms. 9:59:59.000,9:59:59.000 So, for example, auto manufacturers now [br]lock up all of their cars' telemetry 9:59:59.000,9:59:59.000 with digital locks. 9:59:59.000,9:59:59.000 If you're a mechanic [br]and want to fix a car, 9:59:59.000,9:59:59.000 you have to get a reader [br]from the manufacturer 9:59:59.000,9:59:59.000 to make sure that you can [br]see the telemetry 9:59:59.000,9:59:59.000 and know what parts to order [br]and how to fix it. 9:59:59.000,9:59:59.000 And in order to get this reader, [br]you have to promise the manufacturer 9:59:59.000,9:59:59.000 that you will only buy parts [br]from that manufacturer 9:59:59.000,9:59:59.000 and not from third parties. 9:59:59.000,9:59:59.000 So the manufacturers can keep [br]the repair costs high 9:59:59.000,9:59:59.000 and get a secondary revenue stream[br]out of the cars. 9:59:59.000,9:59:59.000 This year, the Chrysler corporation filed [br]comments with the US Copyright Office, 9:59:59.000,9:59:59.000 to say that they believed that [br]this was the right way to do it 9:59:59.000,9:59:59.000 and that it should be a felony, [br]punishable by 5 years in prison 9:59:59.000,9:59:59.000 and a $500'000 fine, 9:59:59.000,9:59:59.000 to change the locks on a car that you own,[br]so that you can choose who fixes it. 9:59:59.000,9:59:59.000 It turned out that when they advertised 9:59:59.000,9:59:59.000 -- well, where is my slide here?[br]Oh, there we go -- 9:59:59.000,9:59:59.000 when they advertised that [br]it wasn't your father's Oldsmobile, 9:59:59.000,9:59:59.000 they weren't speaking metaphorically, [br]they really meant 9:59:59.000,9:59:59.000 that even though your father [br]bought the Oldsmobile, 9:59:59.000,9:59:59.000 it remained their property in perpetuity. 9:59:59.000,9:59:59.000 And it's not just cars, [br]it's every kind of device, 9:59:59.000,9:59:59.000 because every kind of device today [br]has a computer in it. 9:59:59.000,9:59:59.000 The John Deere Company, the world's leading seller of heavy equipment 9:59:59.000,9:59:59.000 and agricultural equipment technologies, 9:59:59.000,9:59:59.000 they now view their tractors as [br]information gathering platforms 9:59:59.000,9:59:59.000 and they view the people who use them 9:59:59.000,9:59:59.000 as the kind of inconvenient gut flora[br]of their ecosystem. 
9:59:59.000,9:59:59.000 So if you are a farmer [br]and you own a John Deere tractor, 9:59:59.000,9:59:59.000 when you drive it around your fields, [br]the torque sensors on the wheels 9:59:59.000,9:59:59.000 conduct a centimeter-accurate soil [br]density survey of your agricultural land. 9:59:59.000,9:59:59.000 That would be extremely useful to you[br]when you're planting your seeds 9:59:59.000,9:59:59.000 but that data is not available to you 9:59:59.000,9:59:59.000 unless you remove the digital lock [br]from your John Deere tractor 9:59:59.000,9:59:59.000 which again, is against the law [br]everywhere in the world. 9:59:59.000,9:59:59.000 Instead, in order to get that data 9:59:59.000,9:59:59.000 you have to buy a bundle with seeds [br]from Monsanto, 9:59:59.000,9:59:59.000 who are John Deere's seed partners. 9:59:59.000,9:59:59.000 John Deere then takes this data that they [br]aggregate across whole regions 9:59:59.000,9:59:59.000 and they use it to gain insight [br]into regional crop yields 9:59:59.000,9:59:59.000 that they use to play the futures market. 9:59:59.000,9:59:59.000 John Deere's tractors are really just [br]a way of gathering information 9:59:59.000,9:59:59.000 and the farmers are secondary to it. 9:59:59.000,9:59:59.000 Just because you own it [br]doesn't mean it's yours. 9:59:59.000,9:59:59.000 And it's not just the computers [br]that we put our bodies into 9:59:59.000,9:59:59.000 that have this business model. 9:59:59.000,9:59:59.000 It's the computers that we put [br]inside of our bodies. 9:59:59.000,9:59:59.000 If you're someone who is diabetic 9:59:59.000,9:59:59.000 and you're fitted with a continuous [br]glucose-measuring insulin pump, 9:59:59.000,9:59:59.000 that insulin pump is designed [br]with a digital lock 9:59:59.000,9:59:59.000 that makes sure that your doctor [br]can only use the manufacturer's software 9:59:59.000,9:59:59.000 to read the data coming off of it 9:59:59.000,9:59:59.000 and that software is resold [br]on a rolling annual license 9:59:59.000,9:59:59.000 and it can't just be bought outright. 9:59:59.000,9:59:59.000 And the digital locks are also [br]used to make sure 9:59:59.000,9:59:59.000 that you only buy the insulin [br]that the vendor approves 9:59:59.000,9:59:59.000 and not generic insulin [br]that might be cheaper. 9:59:59.000,9:59:59.000 We've literally turned human beings [br]into ink-jet printers. 9:59:59.000,9:59:59.000 Now, this has really deep implications[br]beyond the economic implications. 9:59:59.000,9:59:59.000 Because the rules that prohibit [br]breaking these digital locks 9:59:59.000,9:59:59.000 also prohibit telling people [br]about flaws that programmers made 9:59:59.000,9:59:59.000 because if you know about a flaw[br]that a programmer made, 9:59:59.000,9:59:59.000 you can use it to break the digital lock. 9:59:59.000,9:59:59.000 And that means that the errors,[br]the vulnerabilities, 9:59:59.000,9:59:59.000 the mistakes in our devices, they fester [br]in them, they go on and on and on 9:59:59.000,9:59:59.000 and our devices become these long-lived [br]reservoirs of digital pathogens. 9:59:59.000,9:59:59.000 And we've seen how that plays out. 9:59:59.000,9:59:59.000 One of the reasons that Volkswagen [br]was able to get away 9:59:59.000,9:59:59.000 with their Diesel cheating for so long 9:59:59.000,9:59:59.000 is because no one could independently[br]audit their firmware. 9:59:59.000,9:59:59.000 It's happening all over the place. 
9:59:59.000,9:59:59.000 You may have seen -- [br]you may have seen this summer 9:59:59.000,9:59:59.000 that Chrysler had to recall [br]1.4 million Jeeps 9:59:59.000,9:59:59.000 because it turned out that they could be[br]remotely controlled over the internet 9:59:59.000,9:59:59.000 while driving down a motorway [br]and have their brakes and steering 9:59:59.000,9:59:59.000 commandeered by anyone, anywhere [br]in the world, over the internet. 9:59:59.000,9:59:59.000 We only have one methodology [br]for determining whether security works 9:59:59.000,9:59:59.000 and that's to submit it [br]to public scrutiny, 9:59:59.000,9:59:59.000 to allow for other people to see [br]what assumptions you've made. 9:59:59.000,9:59:59.000 Anyone can design a security system 9:59:59.000,9:59:59.000 that he himself can't think [br]of a way of breaking, 9:59:59.000,9:59:59.000 but all that means is that you've [br]designed a security system 9:59:59.000,9:59:59.000 that works against people [br]who are stupider than you. 9:59:59.000,9:59:59.000 And in this regard, security [br]is no different 9:59:59.000,9:59:59.000 from any other kind of knowledge creation. 9:59:59.000,9:59:59.000 You know, before we had [br]contemporary science and scholarship, 9:59:59.000,9:59:59.000 we had something that looked [br]a lot like it, called alchemy. 9:59:59.000,9:59:59.000 And for 500 years, alchemists kept[br]what they thought they knew a secret. 9:59:59.000,9:59:59.000 And that meant that every alchemist[br]was capable of falling prey 9:59:59.000,9:59:59.000 to that most urgent of human frailties,[br]which is our ability to fool ourselves. 9:59:59.000,9:59:59.000 And so, every alchemist discovered [br]for himself in the hardest way possible 9:59:59.000,9:59:59.000 that drinking mercury was a bad idea. 9:59:59.000,9:59:59.000 We call that 500-year period the Dark Ages 9:59:59.000,9:59:59.000 and we call the moment at which [br]they started publishing 9:59:59.000,9:59:59.000 and submitting themselves [br]to adversarial peer review, 9:59:59.000,9:59:59.000 which is when your friends tell you [br]about the mistakes that you've made 9:59:59.000,9:59:59.000 and your enemies call you an idiot[br]for having made them, 9:59:59.000,9:59:59.000 we call that moment the Enlightenment. 9:59:59.000,9:59:59.000 Now, this has profound implications. 9:59:59.000,9:59:59.000 The restriction of our ability to alter [br]the security of our devices 9:59:59.000,9:59:59.000 matters for our surveillance society, 9:59:59.000,9:59:59.000 for our ability to be free people[br]in society. 9:59:59.000,9:59:59.000 At the height of the GDR, in 1989, 9:59:59.000,9:59:59.000 the Stasi had one snitch for every [br]60 people in East Germany, 9:59:59.000,9:59:59.000 in order to surveil the entire country. 9:59:59.000,9:59:59.000 A couple of decades later, we found out [br]through Edward Snowden 9:59:59.000,9:59:59.000 that the NSA was spying [br]on everybody in the world. 9:59:59.000,9:59:59.000 And the ratio of people who work [br]at the NSA to people they are spying on 9:59:59.000,9:59:59.000 is more like 1 in 10'000. 9:59:59.000,9:59:59.000 They've achieved a two and a half [br]order of magnitude 9:59:59.000,9:59:59.000 productivity gain in surveillance. 9:59:59.000,9:59:59.000 And the way that they got there [br]is in part by the fact that 9:59:59.000,9:59:59.000 we use devices that [br]we're not allowed to alter, 9:59:59.000,9:59:59.000 that are designed to treat us as attackers 9:59:59.000,9:59:59.000 and that gather an enormous [br]amount of information on us. 
9:59:59.000,9:59:59.000 If the government told you that you're [br]required to carry around 9:59:59.000,9:59:59.000 a small electronic rectangle that [br]recorded all of your social relationships, 9:59:59.000,9:59:59.000 all of your movements, 9:59:59.000,9:59:59.000 all of your transient thoughts that[br]you made known or ever looked up, 9:59:59.000,9:59:59.000 and would make that [br]available to the state, 9:59:59.000,9:59:59.000 and you would have to pay for it,[br]you would revolt. 9:59:59.000,9:59:59.000 But the phone companies have [br]managed to convince us, 9:59:59.000,9:59:59.000 along with the mobile phone vendors, 9:59:59.000,9:59:59.000 that we should foot the bill [br]for our own surveillance. 9:59:59.000,9:59:59.000 It's a bit like during [br]the Cultural Revolution, 9:59:59.000,9:59:59.000 where, after your family members [br]were executed, 9:59:59.000,9:59:59.000 they sent you a bill for the bullet. 9:59:59.000,9:59:59.000 So, this has big implications, as I said, [br]for where we go as a society. 9:59:59.000,9:59:59.000 Because just as our kids have [br]a hard time functioning 9:59:59.000,9:59:59.000 in the presence of surveillance,[br]and learning, 9:59:59.000,9:59:59.000 and advancing their own knowledge, 9:59:59.000,9:59:59.000 we as a society have a hard time [br]progressing 9:59:59.000,9:59:59.000 in the presence of surveillance. 9:59:59.000,9:59:59.000 In our own living memory, people who are [br]today thought of as normal and right 9:59:59.000,9:59:59.000 were doing something that [br]a generation ago 9:59:59.000,9:59:59.000 would have been illegal [br]and landed them in jail. 9:59:59.000,9:59:59.000 For example, you probably know someone 9:59:59.000,9:59:59.000 who's married to a partner [br]of the same sex. 9:59:59.000,9:59:59.000 If you live in America, you may know [br]someone who takes medical marijuana, 9:59:59.000,9:59:59.000 or if you live in the Netherlands. 9:59:59.000,9:59:59.000 And not that long ago, people [br]who undertook these activities 9:59:59.000,9:59:59.000 could have gone to jail for them, 9:59:59.000,9:59:59.000 could have faced enormous [br]social exclusion for them. 9:59:59.000,9:59:59.000 The way that we got from there to here[br]was by having a private zone, 9:59:59.000,9:59:59.000 a place where people weren't surveilled, 9:59:59.000,9:59:59.000 in which they could advance [br]their interests and ideas, 9:59:59.000,9:59:59.000 do things that were thought of as[br]socially unacceptable 9:59:59.000,9:59:59.000 and slowly change our social attitudes. 9:59:59.000,9:59:59.000 And unless you think [br]that in 50 years, 9:59:59.000,9:59:59.000 your grandchildren will sit around [br]the Christmas table, in 2065, and say: 9:59:59.000,9:59:59.000 "How was it, grandma, [br]how was it, grandpa, 9:59:59.000,9:59:59.000 that in 2015, you got it all right, 9:59:59.000,9:59:59.000 and we haven't had [br]any social changes since then?" 9:59:59.000,9:59:59.000 Then you have to ask yourself [br]how, in a world 9:59:59.000,9:59:59.000 in which we are all [br]under continuous surveillance, 9:59:59.000,9:59:59.000 we are going to find a way [br]to improve this. 9:59:59.000,9:59:59.000 So, our kids need ICT literacy, 9:59:59.000,9:59:59.000 but ICT literacy isn't just typing skills[br]or learning how to use PowerPoint. 9:59:59.000,9:59:59.000 It's learning how to think critically 9:59:59.000,9:59:59.000 about how they relate [br]to the means of information, 9:59:59.000,9:59:59.000 about whether they are its masters [br]or servants. 
9:59:59.000,9:59:59.000 Our networks are not [br]the most important issue that we have. 9:59:59.000,9:59:59.000 There are much more important issues[br]in society and in the world today. 9:59:59.000,9:59:59.000 The future of the internet is [br]way less important 9:59:59.000,9:59:59.000 than the future of our climate,[br]the future of gender equity, 9:59:59.000,9:59:59.000 the future of racial equity, 9:59:59.000,9:59:59.000 the future of the wage gap [br]and the wealth gap in the world, 9:59:59.000,9:59:59.000 but every one of those fights is going[br]to be fought and won or lost 9:59:59.000,9:59:59.000 on the internet:[br]it's our most foundational fight. 9:59:59.000,9:59:59.000 So, computers[br]can make us more free 9:59:59.000,9:59:59.000 or they can take away our freedom. 9:59:59.000,9:59:59.000 It all comes down to how we regulate them [br]and how we use them. 9:59:59.000,9:59:59.000 And it's our job, as people who are [br]training the next generation, 9:59:59.000,9:59:59.000 and whose next generation [br]is beta-testing 9:59:59.000,9:59:59.000 the surveillance technology [br]that will be coming to us, 9:59:59.000,9:59:59.000 it's our job to teach them to seize [br]the means of information, 9:59:59.000,9:59:59.000 to make themselves self-determinant[br]in the way that they use their networks 9:59:59.000,9:59:59.000 and to find ways to show them [br]how to be critical and how to be smart 9:59:59.000,9:59:59.000 and how to be, above all, subversive[br]and how to use the technology around them. 9:59:59.000,9:59:59.000 Thank you. (18:49) 9:59:59.000,9:59:59.000 (Applause) 9:59:59.000,9:59:59.000 (Moderator) Cory, thank you very much [br]indeed. 9:59:59.000,9:59:59.000 (Doctorow) Thank you[br](Moderator) And I've got a bundle 9:59:59.000,9:59:59.000 of points which you've stimulated [br]from many in the audience, which sent -- 9:59:59.000,9:59:59.000 (Doctorow) I'm shocked to hear that [br]that was at all controversial, 9:59:59.000,9:59:59.000 but go on.[br](Moderator) I didn't say "controversial", 9:59:59.000,9:59:59.000 you stimulated thinking, which is great. 9:59:59.000,9:59:59.000 But a lot of them resonate around[br]violation of secrecy and security. 9:59:59.000,9:59:59.000 And this, for example, [br]from Anneke Burgess: 9:59:59.000,9:59:59.000 "Is there a way for students [br]to protect themselves 9:59:59.000,9:59:59.000 from privacy violations by institutions[br]they are supposed to trust?" 9:59:59.000,9:59:59.000 I think this is probably a question[br]for William Golding as well, 9:59:59.000,9:59:59.000 someone who is a senior figure[br]in a major university, but 9:59:59.000,9:59:59.000 this issue of privacy violations and trust. 9:59:59.000,9:59:59.000 (Doctorow) Well, I think that computers[br]have a curious dual nature. 9:59:59.000,9:59:59.000 So on the one hand, they do expose us[br]to an enormous amount of scrutiny, 9:59:59.000,9:59:59.000 depending on how they are configured. 9:59:59.000,9:59:59.000 But on the other hand, computers [br]have brought new powers to us 9:59:59.000,9:59:59.000 that are literally new [br]on the face of the world, right? 9:59:59.000,9:59:59.000 We have never had a reality in which[br]normal people could have secrets 9:59:59.000,9:59:59.000 from powerful people. 
9:59:59.000,9:59:59.000 But with the computer in your pocket,[br]with that, 9:59:59.000,9:59:59.000 you can encrypt a message so thoroughly 9:59:59.000,9:59:59.000 that if every hydrogen atom in the [br]universe were turned into a computer 9:59:59.000,9:59:59.000 and it did nothing until [br]the heat death of the universe 9:59:59.000,9:59:59.000 but try to guess what your key was, 9:59:59.000,9:59:59.000 we would run out of universe [br]before we ran out of possible keys. 9:59:59.000,9:59:59.000 So, computers do give us [br]the power to have secrets. 9:59:59.000,9:59:59.000 The problem is that institutions [br]prohibit the use of technology 9:59:59.000,9:59:59.000 that allows you to take back [br]your own privacy. 9:59:59.000,9:59:59.000 It's funny, right? Because we take kids[br]and we say to them: 9:59:59.000,9:59:59.000 "Your privacy is like your virginity: 9:59:59.000,9:59:59.000 once you've lost it, [br]you'll never get it back. 9:59:59.000,9:59:59.000 Watch out what you're [br]putting on Facebook" 9:59:59.000,9:59:59.000 -- and I think they should watch[br]what they're putting on Facebook, 9:59:59.000,9:59:59.000 I'm a Facebook vegan, I don't even --[br]I don't use it, but we say: 9:59:59.000,9:59:59.000 "Watch what you're putting on Facebook, 9:59:59.000,9:59:59.000 don't send out dirty pictures [br]of yourself on SnapChat." 9:59:59.000,9:59:59.000 All good advice. 9:59:59.000,9:59:59.000 But we do it while we are taking away [br]all the private information 9:59:59.000,9:59:59.000 that they have, all of their privacy[br]and all of their agency. 9:59:59.000,9:59:59.000 You know, if a parent says to a kid: 9:59:59.000,9:59:59.000 "You mustn't smoke [br]because you'll get sick" 9:59:59.000,9:59:59.000 and the parent says it [br]while lighting a new cigarette 9:59:59.000,9:59:59.000 off the one that she's just put down[br]in the ashtray, 9:59:59.000,9:59:59.000 the kid knows that what you're doing[br]matters more than what you're saying. 9:59:59.000,9:59:59.000 (Moderator) The point is a deficit of trust. 9:59:59.000,9:59:59.000 It builds on the kind of work that [br]David has been doing as well, 9:59:59.000,9:59:59.000 this deficit of trust and privacy. 9:59:59.000,9:59:59.000 And there is another point here: 9:59:59.000,9:59:59.000 "Is the battle for privacy already lost? 9:59:59.000,9:59:59.000 Are we already too comfortable [br]with giving away our data?" 9:59:59.000,9:59:59.000 (Doctorow) No, I don't think so at all. 9:59:59.000,9:59:59.000 In fact, I think that if anything, [br]we've reached 9:59:59.000,9:59:59.000 peak indifference to surveillance, right? (21:25)