(Cory Doctorow) Thank you very much. So I'd like to start with something of a benediction or permission. I am one of nature's fast talkers, and many of you are not native English speakers, or maybe not accustomed to my harsh Canadian accent. In addition, I've just come in from Australia, and so, like many of you, I am horribly jetlagged and have drunk enough coffee this morning to kill a rhino. When I used to be at the United Nations, I was known as the scourge of the simultaneous translation corps. I would stand up and speak as slowly as I could, and I'd turn around, and there they would be in their booths, doing this. (laughter) When I start to speak too fast, this is the universal symbol -- my wife invented it -- for "Cory, you are talking too fast". Please, don't be shy.

So, I'm a parent, like many of you, and, like I'm sure is true of all of you who are parents, parenting kicks my ass all the time. And there are many regrets I have about the mere seven and a half years that I've been a parent, but none are so keenly felt as my regrets over what's happened when I've been wandering around the house and seen my daughter working on something that was beyond her abilities, that was right at the edge of what she could do, something she didn't have competence in yet. You know, it's that amazing thing to see: that frowning concentration, tongue stuck out. As a parent, your heart swells with pride and you can't help but go over and peer over her shoulder at what she's doing. And those of you who are parents know what happens when you look too closely at someone who is working beyond the edge of their competence: they go back to doing something they're already good at. You interrupt a moment of genuine learning and you replace it with a kind of embarrassment about what you're good at and what you're not.

So, it matters a lot that our schools are increasingly surveilled environments, environments in which everything that our kids do is watched and recorded. Because when you do that, you interfere with those moments of real learning. Our ability to do things that we are not good at yet, that we are not proud of yet, is negatively impacted by that kind of scrutiny.

And that scrutiny comes from a strange place. We have decided that there is some programmatic means by which we can find all the web pages children shouldn't look at, and we will filter our networks to be sure that they don't see them. Anyone who has ever paid attention knows that this doesn't work. There are more web pages that kids shouldn't look at than can ever be cataloged, and any attempt to catalog them will always catch pages that kids should be able to look at. Any of you who have ever taught a unit on reproductive health know the frustration of trying to get around a school network's filter.

Now, this is done in the name of digital protection, but it flies in the face of digital literacy and of real learning. Because the only way to stop kids from looking at web pages they shouldn't be looking at is to take all of the clicks that they make, all of the messages that they send, all of their online activity, and offshore it to a firm that has some nonsensically arrived-at list of the bad pages. And so what we are doing is exfiltrating all of our students' data to unknown third parties. Now, most of these firms, their primary business isn't serving the education sector. Most of them serve the government sector. They primarily serve governments in repressive autocratic regimes.
They help them ensure that their citizens aren't looking at Amnesty International web pages. They repackage those tools and sell them to our educators. So we are offshoring our children's clicks to war criminals. And what our kids do, now, is they just get around it, because it's not hard to get around it. You know, never underestimate the power of a kid who is time-rich and cash-poor to get around our technological blockades. But when they do this, they don't acquire the kind of digital literacy that we want them to have, they don't acquire real digital agency, and moreover, they risk exclusion and, in extreme cases, they risk criminal prosecution.

So what if, instead, those of us who are trapped in this system of teaching kids, where we're required to subject them to this kind of surveillance that flies in the face of their real learning -- what if, instead, we invented curricular units that made them real first-class digital citizens, in charge of trying to influence real digital problems? What if we said to them: "We want you to catalog the web pages that this vendor lets through that you shouldn't be seeing. We want you to catalog those pages that you should be seeing, that are blocked. We want you to go and interview every teacher in the school about all those lesson plans that were carefully laid out before lunch, with a video and a web page, and over lunch the unaccountable, distant censor blocked these critical resources and left them handing out photocopied worksheets in the afternoon instead of the unit they had prepared. We want you to learn how to file Freedom of Information Act requests and find out what your school authority is spending to censor your internet access and surveil your activity. We want you to learn to use the internet to research these companies, and we want you to present this to your parent-teacher association, to your school authority, to your local newspaper." Because that's the kind of digital literacy that makes kids into first-class digital citizens, that prepares them for a future in which they can participate fully in a world that's changing.

Kids are the beta-testers of the surveillance state. The path of surveillance technology starts with prisoners, moves to asylum seekers and people in mental institutions, and then to its first non-incarcerated population: children. Then it moves to blue-collar workers, government workers and white-collar workers. And so what we do to kids today is what we did to prisoners yesterday and what we're going to be doing to you tomorrow. And so it matters what we teach our kids.

If you want to see where this goes, this is a kid named Blake Robbins, and he attended Lower Merion High School in Lower Merion, Pennsylvania, outside of Philadelphia. It's the most affluent public school district in America, so affluent that all the kids were issued MacBooks at the start of the year, and they had to do their homework on their MacBooks, and bring them to school every day and bring them home every night. And the MacBooks had been fitted with laptop theft recovery software, which is a fancy word for a rootkit, that let the school administration covertly operate the cameras and microphones on these computers, and harvest files off of their hard drives, and view all their clicks, and so on.
Now, Blake Robbins found out that the software existed, and how it was being used, because he and the head teacher had been knocking heads for years, since he first got into the school, and one day the head teacher summoned him to his office and said: "Blake, I've got you now," and handed him a print-out of Blake in his bedroom the night before, taking what looked like a pill, and he said: "You're taking drugs." And Blake Robbins said: "That's a candy, it's a Mike and Ike candy, I take them when I -- I eat them when I'm studying. How did you get a picture of me in my bedroom?" This head teacher had taken over 6,000 photos of Blake Robbins: awake and asleep, dressed and undressed, in the presence of his family. And in the ensuing lawsuit, the school settled for a large amount of money and promised that they wouldn't do it again without informing the students that it was going on. And increasingly, the practice now is that school administrations hand out laptops, because they're getting cheaper, with exactly the same kind of software, but they let the students know, and they find that it works even better at curbing the students' behavior, because the students know that they're always on camera.

Now, the surveillance state is moving from kids to the rest of the world. It's metastasizing. Our devices are increasingly designed to treat us as attackers, as suspicious parties who can't be trusted, because our devices' job is to do things that we don't want them to do. Now, that's not because the vendors who make our technology necessarily want to spy on us, but they want to take the ink-jet printer business model and bring it into every other realm of the world. The ink-jet printer business model is where you sell someone a device and then you get a continuing revenue stream from that device, by making sure that competitors can't make consumables or parts or additional features or plugins for that device without paying rent to the original manufacturer. And that allows you to maintain monopoly margins on your devices.

Now, in 1998, the American government passed a law called the Digital Millennium Copyright Act, and in 2001, the European Union introduced its own version, the European Union Copyright Directive. And these two laws, along with laws all around the world, in Australia, Canada and elsewhere, prohibit removing the digital locks that are used to restrict access to copyrighted works. They were originally envisioned as a way of making sure that Europeans didn't bring cheap DVDs in from America, or making sure that Australians didn't import cheap DVDs from China. So you have a digital work, a DVD, and it has a lock on it, and to unlock it, you have to buy an authorized player, and the player checks to make sure you are in-region, and making your own player that doesn't make that check is illegal, because you'd have to remove the digital lock. And that was the original intent: it was to allow high rents to be maintained on removable media, DVDs and other entertainment content.

But it very quickly spread into new realms. So, for example, auto manufacturers now lock up all of their cars' telemetry with digital locks. If you're a mechanic and want to fix a car, you have to get a reader from the manufacturer to make sure that you can see the telemetry and then know what parts to order and how to fix it. And in order to get this reader, you have to promise the manufacturer that you will only buy parts from that manufacturer and not from third parties.
So the manufacturers can keep the repair costs high and get a secondary revenue stream out of the cars. This year, the Chrysler corporation filed comments with the US Copyright Office to say that they believed this was the right way to do it, and that it should be a felony, punishable by 5 years in prison and a $500,000 fine, to change the locks on a car that you own, so that you can choose who fixes it. It turned out that when they advertised -- well, where is my slide here? Oh, there we go -- when they advertised that it wasn't your father's Oldsmobile, they weren't speaking metaphorically: they literally meant that even though your father bought the Oldsmobile, it remained their property in perpetuity.

And it's not just cars, it's every kind of device, because every kind of device today has a computer in it. The John Deere Company, the world's leading seller of heavy equipment and agricultural equipment technologies, now views its tractors as information-gathering platforms, and it views the people who use them as a kind of inconvenient gut flora of its ecosystem. So if you are a farmer and you own a John Deere tractor, when you drive it around your fields, the torque sensors on the wheels conduct a centimeter-accurate soil density survey of your agricultural land. That would be extremely useful to you when you're planting your seeds, but that data is not available to you unless you remove the digital lock from your John Deere tractor, which, again, is against the law everywhere in the world. Instead, in order to get that data, you have to buy a bundle with seeds from Monsanto, who are John Deere's seed partners. John Deere then takes this data, which they aggregate across whole regions, and they use it to gain insight into regional crop yields, which they use to play the futures market. John Deere's tractors are really just a way of gathering information, and the farmers are secondary to it. Just because you own it doesn't mean it's yours.

And it's not just the computers that we put our bodies into that have this business model. It's the computers that we put inside of our bodies. If you're someone who is diabetic and you're fitted with a continuous glucose-measuring insulin pump, that insulin pump is designed with a digital lock that makes sure that your doctor can only use the manufacturer's software to read the data coming off of it, and that software is sold on a rolling annual license and can't just be bought outright. And the digital locks are also used to make sure that you only buy the insulin that the vendor has approved, and not generic insulin that might be cheaper. We've literally turned human beings into ink-jet printers.

Now, this has really deep implications beyond the economic implications. Because the rules that prohibit breaking these digital locks also prohibit telling people about flaws that programmers made, because if you know about a flaw that a programmer made, you can use it to break the digital lock. And that means that the errors, the vulnerabilities, the mistakes in our devices fester in them, they go on and on and on, and our devices become these long-lived reservoirs of digital pathogens. And we've seen how that plays out. One of the reasons that Volkswagen was able to get away with their diesel cheating for so long is that no one could independently audit their firmware. It's happening all over the place.
You may have seen -- you may have seen this summer that Chrysler had to recall 1.4 million Jeeps, because it turned out that they could be remotely controlled over the internet while driving down a motorway, and have their brakes and steering commandeered by anyone, anywhere in the world, over the internet. We only have one methodology for determining whether security works, and that's to submit it to public scrutiny, to allow other people to see what assumptions you've made. Anyone can design a security system that he himself can't think of a way of breaking, but all that means is that you've designed a security system that works against people who are stupider than you.

And in this regard, security is no different from any other kind of knowledge creation. You know, before we had contemporary science and scholarship, we had something that looked a lot like it, called alchemy. And for 500 years, alchemists kept what they thought they knew a secret. And that meant that every alchemist was capable of falling prey to that most urgent of human frailties, which is our ability to fool ourselves. And so every alchemist discovered for himself, in the hardest way possible, that drinking mercury was a bad idea. We call that 500-year period the Dark Ages, and we call the moment at which they started publishing and submitting themselves to adversarial peer review -- which is when your friends tell you about the mistakes that you've made and your enemies call you an idiot for having made them -- we call that moment the Enlightenment.

Now, this has profound implications. The restriction of our ability to alter the security of our devices has implications for our surveillance society, for our ability to be free people in society. At the height of the GDR, in 1989, the Stasi had one snitch for every 60 people in East Germany, in order to surveil the entire country. A couple of decades later, we found out through Edward Snowden that the NSA was spying on everybody in the world, and the ratio of people who work at the NSA to people they are spying on is more like 1 in 10,000. They've achieved a two-and-a-half-order-of-magnitude productivity gain in surveillance. And the way that they got there is, in part, by the fact that we use devices that we're not allowed to alter, that are designed to treat us as attackers, and that gather an enormous amount of information on us.

If the government told you that you were required to carry around a small electronic rectangle that recorded all of your social relationships, all of your movements, all of the transient thoughts that you made known or ever looked up, and would make all of that available to the state, and that you would have to pay for it, you would revolt. But the phone companies, along with the mobile phone vendors, have managed to convince us that we should foot the bill for our own surveillance. It's a bit like during the Cultural Revolution, where, after your family members were executed, they sent you a bill for the bullet.

So, this has big implications, as I said, for where we go as a society. Because just as our kids have a hard time functioning, and learning, and advancing their own knowledge in the presence of surveillance, we as a society have a hard time progressing in the presence of surveillance. In our own living memory, people who are today thought of as normal and right were doing things that a generation ago would have been illegal and landed them in jail. For example, you probably know someone who's married to a partner of the same sex.
If you live in America, you may know someone who takes medical marijuana; likewise if you live in the Netherlands. And not that long ago, people who undertook these activities could have gone to jail for them, could have faced enormous social exclusion for them. The way that we got from there to here was by having a private zone, a place where people weren't surveilled, in which they could advance their ideas and interests, do things that were thought of as socially unacceptable, and slowly change our social attitudes. And unless you think (check) that in 50 years, your grandchildren will sit around the Christmas table, in 2065, and say: "How was it, grandma, how was it, grandpa, that in 2015, you got it all right, and we haven't had any social changes since then?", then you have to ask yourself how, in a world in which we are all under continuous surveillance, we are going to find a way to improve things.

So, our kids need ICT literacy, but ICT literacy isn't just typing skills or learning how to use PowerPoint. It's learning how to think critically about how they relate to the means of information, about whether they are its masters or its servants. Our networks are not the most important issue that we have. There are much more important issues in society and in the world today. The future of the internet is way less important than the future of our climate, the future of gender equity, the future of racial equity, the future of the wage gap and the wealth gap in the world. But every one of those fights is going to be fought, and won or lost, on the internet: it's our most foundational fight. Computers (check) can make us more free or they can take away our freedom. It all comes down to how we regulate them and how we use them. And it's our job, as people who are training the next generation, and whose next generation is beta-testing the surveillance technology that will be coming to us, it's our job to teach them to seize the means of information, to make themselves self-determining in the way that they use their networks, and to find ways to show them how to be critical and how to be smart and how to be, above all, subversive in how they use the technology around them. Thank you. (18:49) (Applause)

(Moderator) Cory, thank you very much indeed.

(Doctorow) Thank you.

(Moderator) And I've got a bundle of points which you've stimulated from many in the audience, which sent --

(Doctorow) I'm shocked to hear that that was at all controversial, but go on.

(Moderator) I didn't say "controversial", you stimulated thinking, which is great. But a lot of them resonate around violation of secrecy and security. And this, for example, from Anneke Burgess (check): "Is there a way for students to protect themselves from privacy violations by institutions they are supposed to trust?" I think this is probably a question for William Golding (check) as well, someone who is a senior figure in a major university, but this issue of privacy violations and trust.

(Doctorow) Well, I think that computers have a curious dual nature. So on the one hand, they do expose us to an enormous amount of scrutiny, depending on how they are configured. But on the other hand, computers have brought new powers to us that are literally new on the face of the world, right? We have never had a reality in which normal people could have secrets from powerful people.
But with the computer in your pocket, you can encrypt a message so thoroughly that if every hydrogen atom in the universe were turned into a computer, and did nothing but try to guess what your key was until the heat death of the universe, we would run out of universe before we ran out of possible keys. (A short code sketch illustrating this appears below.) So, computers do give us the power to have secrets. The problem is that institutions prohibit the use of the technologies that allow you to take back your own privacy.

It's funny, right? Because we take kids and we say to them: "Your privacy is like your virginity: once you've lost it, you'll never get it back. Watch out what you're putting on Facebook" -- and I think they should watch what they're putting on Facebook; I'm a Facebook vegan, I don't even -- I don't use it -- but we say: "Watch what you're putting on Facebook, don't send out dirty pictures of yourself on Snapchat." All good advice. But we do it while we are taking away all the private information that they have, all of their privacy and all of their agency. You know, if a parent says to a kid: "You mustn't smoke, because you'll get sick," and the parent says it while lighting a new cigarette off the one that she's just put down in the ashtray, the kid knows that what you're doing matters more than what you're saying.

(Moderator) The point is a deficit of trust. It builds on the kind of work that David has been doing as well, this deficit of trust and privacy. And there is another point here: "Is the battle for privacy already lost? Are we already too comfortable with giving away our data?"

(Doctorow) No, I don't think so at all. In fact, I think that, if anything, we've reached peak indifference to surveillance, right? That's not peak surveillance, not by a long shot: there will be more surveillance before there is less. But there will never be fewer people who care about surveillance than there are today, because, as privacy advocates, we spectacularly failed, over the last 20 years, to get people to take privacy seriously, and now we have firms that are frankly incompetent retaining huge amounts of our personally identifying, sensitive information, and those firms are leaking that information at speed. So, it started this year with things like (phone rings) -- is that me? It's me! -- it started this year with things like Ashley Madison --

(Moderator) Do you want to take it?

(Doctorow) No, no, that was my timer going off, telling me I've had my 22 minutes.

(Moderator) It may be someone who doesn't trust my....

(Doctorow) No, no, that was 22 minutes. So, it started with Ashley Madison, then the Office of Personnel Management: everyone who ever applied for security clearance in America had all of their sensitive information -- everything you had to give them about why you shouldn't have security clearance, everything that's potentially compromising about you -- all of it exfiltrated by what's believed to have been a Chinese spy ring. Something like one in twenty Americans have now had their data captured and exfiltrated from the United States. This week, VTech, the largest electronic toy manufacturer in the world, leaked the personal information of at least five million children, including, potentially, photos that they took with their electronic toys, as well as their parents' information, their home addresses, their passwords, their parents' passwords and their password hints.
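[Note on the encryption point above: as a concrete illustration, here is a minimal sketch of the kind of strong encryption any pocket computer can perform. The choice of the PyNaCl library and a 256-bit random key is an assumption for illustration; no specific tool is named in the talk.]

    # A minimal sketch of strong authenticated encryption using the
    # PyNaCl library (libsodium bindings) -- an illustrative choice,
    # not a tool named in the talk. Requires: pip install pynacl
    import nacl.secret
    import nacl.utils

    # A random 256-bit key: 2**256 possible values, far too many to
    # guess by brute force.
    key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)
    box = nacl.secret.SecretBox(key)

    # encrypt() adds a random nonce and an authentication tag, so any
    # tampering with the ciphertext is detected on decryption.
    ciphertext = box.encrypt(b"a message only the key holder can read")
    assert box.decrypt(ciphertext) == b"a message only the key holder can read"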
So, every couple of weeks from now on, a couple of million people are going to show up at the door of the people who care about privacy and say: "You were right all along. What do we do?" And the challenge is to give them something useful they can do about privacy. And there are steps that you can take personally. If you go to the Surveillance Self-Defense kit at the Electronic Frontier Foundation's website, you'll find a set of tools you can use. But ultimately, it's not an individual choice, it's a social one. Privacy is a team sport, because if you keep your information private and you send it to someone else in your social network who doesn't keep it private, well, then it leaks out their back door. And so we need social movements to improve our privacy. We also need better tools. It's really clear that our privacy tools have fallen short of the mark in terms of being accessible to normal people.

(Moderator) Cory --

(Doctorow) I sense you want to say something: go on.

(Moderator) Yes. But what I want to do is keep pushing this down the route of education as well. There are two or three questions here which are very specific about that. Could you tell us more about how we help students become digital citizens? It links in important ways to the German and Nordic pre-digital concept of Bildung [...] (check). And if you could give your daughter just one piece of advice for living in an age of digital surveillance, what would it be?

(Doctorow) So, the one piece of advice I would give her is: "Don't settle for anything less than total control of the means of information. Any time someone tells you that the computer says you must do something, you have to ask yourself why the computer isn't doing what you want it to do. Who is in charge here?"

In terms of how we improve digital citizenship, I think we are at odds with our institutions in large part here. I think that our institutions have decreed that privacy is a luxury we can't afford to give our students, because if we do, they might do something we wouldn't like. And so, as teachers, the only thing that we can do without risking our jobs, without risking our students' education and their ability to stay in school, is to teach them how to do judo: to teach them how to ask critical questions, to gather data, to demand evidence-based policy. And ultimately, I actually think that's better. I think that teaching kids how to evade firewalls is way less interesting than teaching them how to identify who is providing that firewall, who made the decision to buy that firewall, how much they're spending, and what the process is for getting that person fired. That's a much more interesting piece of digital citizenship than any piece of firewall hacking.

(Moderator) But Cory, you're teaching kids to get hold of their information, but picking up on your analogy with the auto industry (check), where there's a very clear lock on, so much of what has to be done has to be done through the manufacturers -- Martin Siebel (check) picking up: how busy is the hacker community with breaking digital locks? In other words, digital locks are being put on so much; can the next generation, can the kids, work out where these locks are, to keep control of their information?

(Doctorow) So, getting the digital locks off of our devices is not just a matter of technology, it's also a matter of policy.
Laws like the EUCD, and new treaty instruments like TTIP and the Trans-Pacific Partnership, specify that these digital locks can't be removed legally, even when firms use them to extract monopoly rents or to harm our security. And these treaties will make it impossible to remove these locks legally. It's not enough for there to be a demi-monde in which people who are technologically savvy know how to break their iPhones. The problem with that is that it produces this world in which you can't know whether the software that you're using has anything bad lurking in it, right? To say to people who have iPhones that they want reconfigured to accept software from parties that aren't approved by Apple: "All you need to do is find a random piece of software on the internet that claims it will allow you to do it, put it (check) on your phone, and then run this software on it," is like saying to people: "Well, all you need to do is just find some heroin lying around and put it in your arm: I'm sure it'll work out fine." Right? The ways in which a phone that is compromised can harm you are really without limits. If you think about it, this is a rectangle with a camera and a microphone that you take into the bedroom and the toilet, and the only way you know whether that camera or microphone is on or off is if the software is doing what it says it's doing, right? So, I think that this policy dimension is way more important than the technical dimension, and the only way we're going to change that policy is by making kids know what they can't do, and then making them know why they're not allowed to do it, and then setting them loose.

(Moderator) Two final points then, building on [...] (check): what do you think of recent laws, say, in the Netherlands, allowing police to spy on criminals and terrorists with the webcams on their own laptops? In other words, the extension implicit in this question is: what about keeping an eye on others, including the next generation coming up? And this from Peter Andreesen (check), which is slightly connected: when the predominant management tool in our business is still based on the assumption of control, how do we change that psychology and that mindset? And David is nodding at those views (check) at the back.

(Doctorow) You know, there's not one operating system or suite of applications that bad guys use and another set that the rest of us use. And so, when our security services discover vulnerabilities in the tools that we rely on, and then hoard those vulnerabilities so that they can weaponize them to attack criminals or terrorists or whoever they're after, they ensure that those vulnerabilities remain unpatched on our devices too, so that the people who are our adversaries -- the criminals who want to attack us, the spies who want to exfiltrate our corporate secrets to their countries, the griefers, the autocratic regimes who use those vulnerabilities to spy on their populations -- can use them as well. At the Electronic Frontier Foundation, we have a client from Ethiopia, Mr. Kidane (check), who is a dissident journalist -- Ethiopia imprisons more journalists than any other country in the world -- and he fled to America, where the Ethiopian government hacked his computer with a bug in Skype that they had discovered and not reported. They'd bought the tools to do this from Italy, from a company called Hacking Team. They hacked his computer, they found out who his colleagues were in Ethiopia, and they arrested them and subjected them to torture.
So, when our security services take a bug and weaponize it so they can attack their adversaries, they leave that bug intact so that our adversaries can attack us. You know -- and I'm giving a talk later today about this -- no matter what side you're on in the war on general-purpose computing, you're losing, because all we have right now is attack and no defense. Our security services are so concerned with making sure that their job is easy when they attack their adversaries that they are neglecting the fact that they are making their adversaries' job easy in attacking us. If this were a football match, the score would be tied 400 to 400 after 10 minutes.

(Moderator) Cory, I've got to stop you there. Thank you very much indeed.

(Doctorow) Thank you very much.