1 00:00:14,745 --> 00:00:20,643 Good morning, it's a pleasure to be here and an honor to be at re:publica. 2 00:00:20,643 --> 00:00:33,692 For the last thousand years, we, our mothers, and our fathers have been struggling for freedom of thought. 3 00:00:33,692 --> 00:00:48,576 We have sustained many horrible losses and won some immense victories, and we are now at a very serious time. 4 00:00:48,576 --> 00:00:58,538 From the adoption of printing by Europeans in the 15th century, we began to be concerned primarily 5 00:00:58,538 --> 00:01:02,671 with access to printed material. 6 00:01:02,671 --> 00:01:10,240 The right to read and the right to publish were the central subjects of our struggle for freedom of 7 00:01:10,240 --> 00:01:15,767 thought for most of the last half millennium. 8 00:01:15,767 --> 00:01:32,067 The basic concern was for the right to read in private and to think and speak and act on the basis of 9 00:01:32,067 --> 00:01:32,567 a free and uncensored will. 10 00:01:32,567 --> 00:01:46,649 The primary antagonist for freedom of thought in the beginning of our struggle was the universal Catholic 11 00:01:46,649 --> 00:01:53,987 Church, an institution directed at the control of thought in the European world: 12 00:01:53,987 --> 00:02:06,317 based around weekly surveillance of the conduct and thoughts of every human being, 13 00:02:06,317 --> 00:02:14,002 based around the censorship of all reading material, and, in the end, based upon the ability to predict 14 00:02:14,002 --> 00:02:20,806 and to punish unorthodox thought. 15 00:02:20,806 --> 00:02:30,280 The tools available for thought control in early modern Europe were poor, even by 20th-century standards. 16 00:02:30,280 --> 00:02:32,439 But they worked. 17 00:02:32,439 --> 00:02:40,868 And for hundreds of years, the struggle primarily centered around that increasingly important first 18 00:02:40,868 --> 00:02:49,088 mass-manufactured article in Western culture: the book. 19 00:02:49,088 --> 00:03:00,024 Whether you could print them, possess them, traffic in them, read them, teach from them, without the permission 20 00:03:00,024 --> 00:03:10,404 or control of an entity empowered to punish thought. 21 00:03:10,404 --> 00:03:20,713 By the end of the 17th century, censorship of written material in Europe had begun to break down. 22 00:03:20,713 --> 00:03:30,721 First in the Netherlands, then in the UK, then afterwards in waves throughout the European world. 23 00:03:30,721 --> 00:03:43,121 And the book became an article of subversive commerce and began eating away at the control of thought. 24 00:03:43,121 --> 00:03:52,525 By the late 18th century, that struggle for the freedom of reading had begun to attack the substance 25 00:03:52,525 --> 00:04:07,803 of Christianity itself. 26 00:04:07,803 --> 00:04:17,370 And the European world trembled on the brink of the first great revolution of the mind. It spoke of "Liberté, égalité, fraternité," 27 00:04:17,370 --> 00:04:18,020 but actually it meant freedom to think differently. 28 00:04:18,020 --> 00:04:26,008 The Ancien Régime began to struggle against thinking, and we moved into the next phase of the struggle 29 00:04:26,008 --> 00:04:36,132 for freedom of thought, which presumed the possibility of unorthodox thinking and revolutionary acting. 30 00:04:36,132 --> 00:04:43,725 And for 200 years we struggled with the consequences of those changes. 31 00:04:43,725 --> 00:04:46,627 That was then, and this is now.
32 00:04:46,627 --> 00:04:57,215 Now we begin a new phase in the history of the human race. We are building a single nervous system which 33 00:04:57,215 --> 00:05:01,232 will embrace every human mind. 34 00:05:01,232 --> 00:05:12,239 We are less than two generations now from the moment at which every human being will be connected to a single 35 00:05:12,239 --> 00:05:23,199 network in which all thoughts, plans, dreams, and actions will flow as nervous impulses in the network. 36 00:05:23,199 --> 00:05:30,374 And the fate of freedom of thought, indeed the fate of human freedom altogether, everything that we 37 00:05:30,374 --> 00:05:39,429 have fought for for a thousand years, will depend upon the neuroanatomy of that network. 38 00:05:39,429 --> 00:05:49,553 Ours is the last generation of human brains that will be formed without contact with the net. 39 00:05:49,553 --> 00:05:57,518 From here on out, by two generations from now, every single human brain will be formed 40 00:05:57,518 --> 00:06:03,392 from early life in direct connection to the network. 41 00:06:03,392 --> 00:06:15,838 Humanity will become a super-organism in which each of us is but a neuron in the brain, and we are deciding 42 00:06:15,838 --> 00:06:24,499 now, now, all of us now, this generation, unique in the history of the human race, in this generation 43 00:06:24,499 --> 00:06:28,841 we will decide how that network is organized. 44 00:06:28,841 --> 00:06:32,766 Unfortunately, we are beginning badly. 45 00:06:32,766 --> 00:06:40,103 Here's the problem: we grew up to be consumers of media. That's what they taught us: we are consumers 46 00:06:40,103 --> 00:06:41,473 of media. 47 00:06:41,473 --> 00:06:49,646 Now media is consuming us. 48 00:06:49,646 --> 00:06:59,074 The things we read watch us read them. The things we listen to listen to us listen to them. 49 00:06:59,074 --> 00:07:08,431 We are tracked, we are monitored, we are predicted by the media we use. 50 00:07:08,431 --> 00:07:18,184 The process of the building of the network institutionalizes basic principles of information flow. 51 00:07:18,184 --> 00:07:25,498 It determines whether there is such a thing as anonymous reading. 52 00:07:25,498 --> 00:07:31,930 And it is determining against anonymous reading. 53 00:07:31,930 --> 00:07:42,193 20 years ago I began working as a lawyer for a man called Philip Zimmermann, who had created a form of 54 00:07:42,193 --> 00:07:47,093 public key encryption for mass use called "Pretty Good Privacy". 55 00:07:47,093 --> 00:07:53,246 The effort to create Pretty Good Privacy was the effort to retain the possibility of secrets in the late 56 00:07:53,246 --> 00:08:00,258 20th century. Phil was trying to prevent government from reading everything. 57 00:08:00,258 --> 00:08:07,015 And, as a result, he was at least threatened with prosecution by the United States government for sharing 58 00:08:07,015 --> 00:08:12,727 military secrets, which is what we called public key encryption back then. 59 00:08:12,727 --> 00:08:18,022 We said you shouldn't do this, there will be trillions of dollars of electronic commerce if everybody 60 00:08:18,022 --> 00:08:21,899 has strong encryption. Nobody was interested. 61 00:08:21,899 --> 00:08:30,723 But what was important about Pretty Good Privacy, about the struggle for freedom that public key encryption 62 00:08:30,723 --> 00:08:39,361 in civil society represented, what was crucial, became clear when we began to win.
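As an aside for readers: the idea Zimmermann put into everyone's hands is easy to see in miniature. Here is a minimal sketch of public key encryption, assuming the modern Python cryptography library rather than PGP itself: anyone can lock a message with the published public key, but only the holder of the private key can unlock it.

```python
# Minimal public-key encryption sketch (RSA with OAEP padding).
# Illustrates the principle behind PGP, not Zimmermann's actual code.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# The recipient generates a key pair and publishes only the public half.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Anyone who has the public key can encrypt...
ciphertext = public_key.encrypt(b"the right to read in private", oaep)

# ...but only the private-key holder can decrypt.
plaintext = private_key.decrypt(ciphertext, oaep)
assert plaintext == b"the right to read in private"
```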
63 00:08:39,361 --> 00:08:48,370 In 1995 there was a debate at Harvard Law School. Four of us discussing the future of public key encryption 64 00:08:48,370 --> 00:08:50,924 and its control. 65 00:08:50,924 --> 00:08:59,469 I was on the side, I suppose, of freedom. It is where I try to be. With me at that debate was a man called 66 00:08:59,469 --> 00:09:06,273 Daniel Weitzner, who now works in the White House making internet policy for the Obama Administration. 67 00:09:06,273 --> 00:09:14,167 On the other side were the then-Deputy Attorney General of the United States and a lawyer in private 68 00:09:14,167 --> 00:09:20,924 practice named Stewart Baker, who had been chief counsel to the National Security Agency, our listeners, 69 00:09:20,924 --> 00:09:29,678 and who was then in private life helping businesses to deal with the listeners. He later became 70 00:09:29,678 --> 00:09:35,994 the deputy for policy planning in the Department of Homeland Security in the United States, and has 71 00:09:35,994 --> 00:09:43,285 had much to do with what happened in our network after 2001. At any rate, the four of us spent two pleasant 72 00:09:43,285 --> 00:09:50,692 hours debating the right to encrypt, and at the end there was a little dinner party in the Harvard faculty 73 00:09:50,692 --> 00:09:56,335 club, and at the end, after all the food had been taken away and just the port and the walnuts were left 74 00:09:56,335 --> 00:10:03,278 on the table, Stewart said, "Alright, among us, now that we are all in private, just us girls, I'll let 75 00:10:03,278 --> 00:10:11,358 my hair down." He didn't have much hair even then, but he let it down. "We're not going to prosecute 76 00:10:11,358 --> 00:10:20,391 your client, Mr. Zimmermann," he said. "Public key encryption will become available. We fought a long losing 77 00:10:20,391 --> 00:10:29,029 battle against it, but it was just a delaying tactic." And then he looked around the room and he said, 78 00:10:29,029 --> 00:10:34,857 "But nobody cares about anonymity, do they?" 79 00:10:34,857 --> 00:10:39,803 And a cold chill went up my spine and I thought, alright, Stewart, now I know: you're going to spend the 80 00:10:39,803 --> 00:10:47,047 next 20 years trying to eliminate anonymity in human society and I'm going to try and stop you and we'll 81 00:10:47,047 --> 00:10:49,485 see how it goes. 82 00:10:49,485 --> 00:10:52,875 And it is going badly. 83 00:10:52,875 --> 00:10:59,377 We didn't build the net with anonymity built in. That was a mistake. 84 00:10:59,377 --> 00:11:02,280 Now we're paying for it. 85 00:11:02,280 --> 00:11:07,899 Our network assumes that you can be tracked everywhere. 86 00:11:07,899 --> 00:11:15,283 And we've taken the web and we've made Facebook out of it. 87 00:11:15,283 --> 00:11:19,764 We put one man in the middle of everything. 88 00:11:19,764 --> 00:11:27,473 We live our social lives, our private lives, in the web, and we share everything with our friends, and 89 00:11:27,473 --> 00:11:38,363 also with our super-friend, the one who reports to anybody who makes him, who pays him, who helps him, 90 00:11:38,363 --> 00:11:44,610 or who gives him the 100 billion dollars he desires. 91 00:11:44,610 --> 00:11:53,108 We are creating media that consume us, and the media love it. 92 00:11:53,108 --> 00:12:02,907 The primary purpose of 21st-century commerce is to predict how we can be made to buy. 93 00:12:02,907 --> 00:12:08,573 And the thing that people most want us to buy is debt.
94 00:12:08,573 --> 00:12:19,556 So we are going into debt. We're getting heavier, heavier with debt, heavier with doubt, heavier with 95 00:12:19,556 --> 00:12:26,777 all the things we didn't know we needed until they told us we were thinking about them. 96 00:12:26,777 --> 00:12:32,652 Because they own the search box and we put our dreams in it. 97 00:12:32,652 --> 00:12:39,989 Everything we want, everything we hope, everything we'd like, everything we wish we knew about is in 98 00:12:39,989 --> 00:12:44,958 the search box, and they own it. 99 00:12:44,958 --> 00:12:48,557 We are reported everywhere, all the time. 100 00:12:48,557 --> 00:12:56,382 In the 20th century you had to build the Lubyanka, you had to torture people, you had to threaten people, 101 00:12:56,382 --> 00:13:05,508 you had to press people to inform on their friends. I don't need to talk about that in Berlin. 102 00:13:05,508 --> 00:13:08,968 In the 21st century, why bother? 103 00:13:08,968 --> 00:13:15,121 You just build social networking and everybody informs on everybody else for you. 104 00:13:15,121 --> 00:13:21,437 Why waste time and money having buildings full of little men who check who is in which photographs? 105 00:13:21,437 --> 00:13:27,845 Just tell everybody to tag their friends and bing, you're done. 106 00:13:27,845 --> 00:13:31,677 Oh, did I use that word, "bing"? You're done. 107 00:13:31,677 --> 00:13:39,525 There's a search box, and they own it, and we put our dreams in it, and they eat them. 108 00:13:39,525 --> 00:13:44,401 And they tell us who we are right back. 109 00:13:44,401 --> 00:13:47,327 "If you like that, you'll love this." 110 00:13:47,327 --> 00:13:50,508 And we do. 111 00:13:50,508 --> 00:13:55,593 They figure us out, the machines do. 112 00:13:55,593 --> 00:13:59,703 Every time you make a link, you're teaching the machine. 113 00:13:59,703 --> 00:14:05,671 Every time you make a link about someone else, you're teaching the machine about someone else. 114 00:14:05,671 --> 00:14:11,383 We need to build that network, we need to make that brain. 115 00:14:11,383 --> 00:14:19,347 This is humanity's highest purpose; we're fulfilling it. But we mustn't do it wrong. 116 00:14:19,347 --> 00:14:24,665 Once upon a time the technological mistakes were mistakes, we made them. 117 00:14:24,665 --> 00:14:34,069 They were the unintended consequences of our thoughtful behavior. That's not the way it is right now. 118 00:14:34,069 --> 00:14:37,946 The things that are going on are not mistakes, they're designs. 119 00:14:37,946 --> 00:14:49,510 They have purpose, and the purpose is to make the human population readable. 120 00:14:49,510 --> 00:14:54,061 I was talking to a senior government official in the United States a few weeks ago; 121 00:14:54,061 --> 00:14:58,961 our government has been misbehaving. 122 00:14:58,961 --> 00:15:09,154 We had rules, we made them after 9/11. They said, "We will keep databases about people, and some of those people will be innocent; 123 00:15:09,154 --> 00:15:14,820 they won't be suspected of anything." The rules we made in 2001 said, 124 00:15:14,820 --> 00:15:22,877 "We will keep information about people not suspected of anything for a maximum of 180 days, 125 00:15:22,877 --> 00:15:27,289 then we will discard it."
126 00:15:27,289 --> 00:15:37,157 In March, in the middle of the night, on a Wednesday, after everything shut down, when it was raining, 127 00:15:37,157 --> 00:15:42,266 the Department of Justice and the Director of National Intelligence in the United States said, "Oh, we're 128 00:15:42,266 --> 00:15:51,972 changing those rules. This small change: we used to say we would keep information on people not suspected 129 00:15:51,972 --> 00:15:59,634 of anything for only 180 days maximum; we're changing that a little bit, to five years." 130 00:15:59,634 --> 00:16:02,003 Which is infinity. 131 00:16:02,003 --> 00:16:07,970 I joked with the lawyers I work with in New York, they only wrote five years in the press release because they couldn't get the 132 00:16:07,970 --> 00:16:12,103 sideways 8 into the font for the press release. 133 00:16:12,103 --> 00:16:16,492 Otherwise they'd have just said "infinity," which is what they mean. 134 00:16:16,492 --> 00:16:21,809 So I was having a conversation with a senior government official I have known all these many years who 135 00:16:21,809 --> 00:16:31,167 works in the White House, and I said, "You're changing American society." He said, "Well, we realized that 136 00:16:31,167 --> 00:16:37,715 we need a robust social graph of the United States." 137 00:16:37,715 --> 00:16:42,289 I said, "You need a robust social graph of the United States?" 138 00:16:42,289 --> 00:16:43,474 "Yes," he said. 139 00:16:43,474 --> 00:16:50,440 I said, "You mean the United States government is, from now on, going to keep a list of everybody every 140 00:16:50,440 --> 00:16:59,170 American knows? Do you think, by any chance, that should require a law?" 141 00:16:59,170 --> 00:17:01,515 And he just laughed. 142 00:17:01,515 --> 00:17:08,830 Because they did it in a press release in the middle of the night, on a Wednesday, when it was raining. 143 00:17:08,830 --> 00:17:17,909 We're going to live in a world, unless we do something quickly, in which our media consume us and spit 144 00:17:17,909 --> 00:17:22,437 in the government's cup. 145 00:17:22,437 --> 00:17:28,567 There will never have been any place like it before. 146 00:17:28,567 --> 00:17:36,113 And if we let it happen, there will never ever be any place different from it again. 147 00:17:36,113 --> 00:17:46,516 Humanity will all have been wired together and media will consume us and spit in the government's cup. 148 00:17:46,516 --> 00:17:51,206 And the state will own our minds. 149 00:17:51,206 --> 00:17:59,890 The soon-to-be-ex-president of France campaigned, as you will recall, last month on the proposition that 150 00:17:59,890 --> 00:18:09,434 there should be criminal penalties for repeat visits to jihadi websites. That was a threat to 151 00:18:09,434 --> 00:18:14,774 criminalize reading in France. 152 00:18:14,774 --> 00:18:28,196 Well, he will soon be the ex-president of France, but that doesn't mean that it will be an ex-idea in France at all. 153 00:18:28,196 --> 00:18:33,652 The criminalization of reading is well advanced. 154 00:18:33,652 --> 00:18:41,361 In the United States, in what we call terrorism prosecutions, we now routinely see evidence of people's 155 00:18:41,361 --> 00:18:48,374 Google searches submitted as proof of their conspiratorial behavior. 156 00:18:48,374 --> 00:18:57,012 The act of seeking knowledge has become an overt act in conspiracy prosecutions. 157 00:18:57,012 --> 00:19:03,606 We are criminalizing thinking, reading, and research.
158 00:19:03,606 --> 00:19:10,409 We are doing this in so-called free societies. We are doing this in a place with a First Amendment. 159 00:19:10,409 --> 00:19:19,512 We are doing this despite everything our history teaches us, because we are forgetting even as we learn. 160 00:19:19,512 --> 00:19:23,668 We don't have much time. 161 00:19:23,668 --> 00:19:34,976 The generation that grew up outside the net is the last generation that can fix it without force. 162 00:19:34,976 --> 00:19:45,448 Governments all over the world are falling in love with the idea of datamining their populations. 163 00:19:45,448 --> 00:19:52,948 I used to think that we were going to be fighting the Chinese Communist Party in the third decade 164 00:19:52,948 --> 00:19:56,385 of the 21st century. 165 00:19:56,385 --> 00:20:02,724 I didn't anticipate that we were going to be fighting the United States government and the government 166 00:20:02,724 --> 00:20:07,716 of the People's Republic of China. 167 00:20:07,716 --> 00:20:17,167 And when Ms. Kroes is here on Friday, perhaps you'll ask her whether we're going to be fighting her too. 168 00:20:17,167 --> 00:20:21,021 Governments are falling in love with datamining because it really, really works. 169 00:20:21,021 --> 00:20:26,153 It's good. It's good for good things as well as evil things. 170 00:20:26,153 --> 00:20:32,190 It's good for helping government understand how to deliver services; it is good for government to understand 171 00:20:32,190 --> 00:20:41,525 what the problems are going to be; it is good for politicians to understand how voters are going to think. 172 00:20:41,525 --> 00:20:48,235 But it creates the possibility of kinds of social control that were previously very difficult, very 173 00:20:48,235 --> 00:20:55,573 expensive, and very cumbersome, in very simple and efficient ways. 174 00:20:55,573 --> 00:21:03,467 It is no longer necessary to maintain enormous networks of informants, as I have pointed out. 175 00:21:03,467 --> 00:21:11,919 The Stasi gets a bargain now, if it comes back, because Zuckerberg does its work for it. 176 00:21:11,919 --> 00:21:19,814 But it is more than just the ease of surveillance, it is more than just the permanence of data; 177 00:21:19,814 --> 00:21:25,085 it's the relentlessness of living after the end of forgetting. 178 00:21:25,085 --> 00:21:30,286 Nothing ever goes away any more. 179 00:21:30,286 --> 00:21:37,647 What isn't understood today will be understood tomorrow. The encrypted traffic you use today in relative 180 00:21:37,647 --> 00:21:44,381 security is simply waiting until there is enough of it for the cryptanalysis to work. 181 00:21:44,381 --> 00:21:47,609 For the breakers to succeed in breaking it. 182 00:21:47,609 --> 00:21:55,689 We're going to have to re-do all our security all the time, forever, because no encrypted packet is ever 183 00:21:55,689 --> 00:21:59,149 lost again. 184 00:21:59,149 --> 00:22:08,158 Nothing is unconnected infinitely, only finitely. Every piece of information can be retained, and everything 185 00:22:08,158 --> 00:22:11,595 eventually gets linked to something else. 186 00:22:11,595 --> 00:22:18,514 That's the rationale for the government official who says we need a robust social 187 00:22:18,514 --> 00:22:19,884 graph of the United States. 188 00:22:19,884 --> 00:22:21,719 Why do you need it? 189 00:22:21,719 --> 00:22:29,962 So the dots you don't connect today, you can connect tomorrow, or next year, or the year after next.
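As an aside for readers: "connecting the dots tomorrow" is cheap once relationship records are retained. A minimal sketch, assuming only the networkx graph library and invented names and records, of how links that meant nothing when collected can answer a new question years later:

```python
# Sketch: why a retained "social graph" never loses value.
# All names and records here are hypothetical.
import networkx as nx

G = nx.Graph()

# Records collected over years, each innocuous on its own.
contacts = [
    ("alice", "bob", 2009),    # shared a flight manifest
    ("bob", "carol", 2011),    # tagged in the same photo
    ("carol", "dave", 2012),   # exchanged email
]
for a, b, year in contacts:
    G.add_edge(a, b, year=year)

# Years later, a new question arrives: how is alice linked to dave?
# The dots not connected at collection time connect now.
path = nx.shortest_path(G, "alice", "dave")
print(" -> ".join(path))  # alice -> bob -> carol -> dave
```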
190 00:22:29,962 --> 00:22:38,553 Nothing is ever lost, nothing ever goes away, nothing is forgotten any more. 191 00:22:38,553 --> 00:22:48,631 So the primary form of collection that should concern us most is media that spy on us while we use them. 192 00:22:48,631 --> 00:22:57,454 Books that watch us read them; music that listens to us listen to it; search boxes that report what we 193 00:22:57,454 --> 00:23:04,374 are searching for to whoever is searching for us and doesn't know us yet. 194 00:23:04,374 --> 00:23:10,155 There is a lot of talk about data coming out of Facebook. 195 00:23:10,155 --> 00:23:16,077 Is it coming to me? Is it coming to him? Is it coming to them? 196 00:23:16,077 --> 00:23:20,744 They want you to think that the threat is data coming out. 197 00:23:20,744 --> 00:23:26,897 You should know that the threat is code going in. 198 00:23:26,897 --> 00:23:34,420 For the last 15 years, what has been happening in enterprise computing 199 00:23:34,420 --> 00:23:42,663 is the addition of that layer of analytics on top of the data warehouse that mostly goes, in enterprise computing, 200 00:23:42,663 --> 00:23:46,077 by the name of "business intelligence." 201 00:23:46,077 --> 00:23:54,088 What it means is: you've been building these vast data warehouses in your company for a decade or two now, 202 00:23:54,088 --> 00:24:01,239 you have all the information about your own operations, your suppliers, your competitors, your customers, 203 00:24:01,239 --> 00:24:08,879 and now you want to make that data start to do tricks, by adding it to all the open source data 204 00:24:08,879 --> 00:24:14,823 out there in the world and using it to tell you the answers to questions you didn't know you had. 205 00:24:14,823 --> 00:24:17,958 That's business intelligence. 206 00:24:17,958 --> 00:24:23,438 The real threat of Facebook is the BI layer on top of the Facebook warehouse. 207 00:24:23,438 --> 00:24:26,874 The Facebook data warehouse contains the behavior, 208 00:24:26,874 --> 00:24:31,449 not just the thinking but also the behavior, 209 00:24:31,449 --> 00:24:35,907 of somewhere nearing a billion people. 210 00:24:35,907 --> 00:24:42,757 The business intelligence layer on top of it is just all that code they get to run, 211 00:24:42,757 --> 00:24:47,401 covered by the terms of service that say they can run any code they want 212 00:24:47,401 --> 00:24:51,487 for "improvement of the experience." 213 00:24:51,487 --> 00:24:58,546 The business intelligence layer on top of Facebook is where every intelligence service in the world wants to go. 214 00:24:58,546 --> 00:25:06,534 Imagine that you're a tiny little secret police organization in some not very important country. 215 00:25:06,534 --> 00:25:14,150 Let's put ourselves in their position; let's call them, I don't know, you know, Kyrgyzstan. 216 00:25:14,150 --> 00:25:21,000 You're the secret police; you're in the people business. Secret policing is a people business. 217 00:25:21,000 --> 00:25:28,871 You have classes of people that you want. You want agents, you want sources, you have adversaries, 218 00:25:28,871 --> 00:25:33,492 and you have "influenceables," that is, people you can torture who are related to adversaries: 219 00:25:33,492 --> 00:25:38,786 wives, husbands, fathers, daughters, you know, those people. 220 00:25:38,786 --> 00:25:45,915 So you're looking for classes of people. You don't know their names, but you know what they're like.
221 00:25:45,915 --> 00:25:51,279 You know who is recruitable for you as an agent, you know who are likely sources. 222 00:25:51,279 --> 00:25:55,714 You can give the social characteristics of your adversaries. 223 00:25:55,714 --> 00:25:59,940 And once you know your adversaries, you can find the influenceables. 224 00:25:59,940 --> 00:26:04,398 So what you want to do is run code inside Facebook. 225 00:26:04,398 --> 00:26:07,254 It will help you find the people that you want. 226 00:26:07,254 --> 00:26:13,779 It will show you the people whose behavior and whose social circles tell you that they are what you want 227 00:26:13,779 --> 00:26:22,347 by way of agents, sources, who the adversaries are, and who you can torture to get to them. 228 00:26:22,347 --> 00:26:26,921 So you don't want data out of Facebook; the minute you take data out of Facebook, it is dead. 229 00:26:26,921 --> 00:26:32,169 You want to put code into Facebook, and run it there, and get the results. 230 00:26:32,169 --> 00:26:35,327 You want to cooperate. 231 00:26:35,327 --> 00:26:38,647 Facebook wants to be a media company. 232 00:26:38,647 --> 00:26:40,760 It wants to own the web. 233 00:26:40,760 --> 00:26:44,778 It wants you to punch "like" buttons. 234 00:26:44,778 --> 00:26:50,675 "Like" buttons are terrific even if you don't punch them, because they're web bugs, 235 00:26:50,675 --> 00:26:56,968 because they show Facebook every other webpage that you touch that has a "like" button on it; 236 00:26:56,968 --> 00:27:00,126 whether you punch it or you don't, they still get a record. 237 00:27:00,126 --> 00:27:09,205 The record is: you read a page which had a "like" button on it, and either you said yes or you said no, 238 00:27:09,205 --> 00:27:15,637 and either way you made data, you taught the machine. 239 00:27:15,637 --> 00:27:24,577 So media want to know you better than you know yourself, and we shouldn't let anybody do that. 240 00:27:24,577 --> 00:27:28,152 We fought for a thousand years for the internal space, 241 00:27:28,152 --> 00:27:35,536 the space where we read, think, reflect, and become unorthodox, 242 00:27:35,536 --> 00:27:38,625 inside our own minds. 243 00:27:38,625 --> 00:27:44,035 That's the space that everybody wants to take away. 244 00:27:44,035 --> 00:27:46,519 Tell us your dreams. 245 00:27:46,519 --> 00:27:49,399 Tell us your thoughts. 246 00:27:49,399 --> 00:27:50,838 Tell us what you hope. 247 00:27:50,838 --> 00:27:52,278 Tell us what you fear. 248 00:27:52,278 --> 00:28:00,521 This is not weekly auricular confession, this is confession 24 by 7. 249 00:28:00,521 --> 00:28:03,098 The mobile robot that you carry around with you, 250 00:28:03,098 --> 00:28:07,371 the one that knows where you are all the time and listens to all your conversations. 251 00:28:07,371 --> 00:28:12,224 The one that you hope isn't reporting in at headquarters, but that's only a hope? 252 00:28:12,224 --> 00:28:19,933 The one that runs all that software you can't read, can't study, can't see, can't modify, and can't understand? 253 00:28:19,933 --> 00:28:20,792 That one. 254 00:28:20,792 --> 00:28:27,363 That one is taking your confession all the time. 255 00:28:27,363 --> 00:28:31,357 When you hold it up to your face from now on, it is going to know your heartbeat. 256 00:28:31,357 --> 00:28:34,469 That's an Android app right now. 257 00:28:34,469 --> 00:28:39,507 Micro-changes in the color of your face reveal your heart rate. 258 00:28:39,507 --> 00:28:44,801 That's a little lie detector you're carrying around with you.
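As an aside for readers: that lie detector is ordinary signal processing, not magic. A minimal sketch of the principle behind such camera-based heart-rate apps (remote photoplethysmography; not any particular product's code), assuming a NumPy array of face-region video frames: blood flow modulates skin color, so the dominant frequency of the averaged green channel is the pulse.

```python
# Sketch: remote photoplethysmography (rPPG) in a few lines.
# Assumes `frames` is a (num_frames, height, width, 3) RGB array of a
# face region, sampled at `fps` frames per second.
import numpy as np

def estimate_heart_rate(frames: np.ndarray, fps: float) -> float:
    # Blood volume changes modulate skin color; green is absorbed most.
    green = frames[:, :, :, 1].mean(axis=(1, 2))
    green = green - green.mean()          # remove the DC component

    spectrum = np.abs(np.fft.rfft(green))
    freqs = np.fft.rfftfreq(len(green), d=1.0 / fps)

    # Keep only plausible pulse frequencies (40-180 beats per minute).
    band = (freqs >= 40 / 60) & (freqs <= 180 / 60)
    pulse_hz = freqs[band][np.argmax(spectrum[band])]
    return pulse_hz * 60.0                # beats per minute

# e.g., for thirty seconds of video at 30 fps:
# print(estimate_heart_rate(frames, fps=30.0))
```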
259 00:28:44,801 --> 00:28:52,836 Pretty soon I'll be able to sit in a classroom and watch the blood pressure of my students go up and down. 260 00:28:52,836 --> 00:29:00,707 In a law school classroom in the United States, that is really important information. 261 00:29:00,707 --> 00:29:02,797 But it's not just me, of course, it's everybody, right? 262 00:29:02,797 --> 00:29:06,768 Because it's just data, and people will have access to it. 263 00:29:06,768 --> 00:29:12,456 The inside of your head becomes the outside of your face, becomes the inside of your smart phone, 264 00:29:12,456 --> 00:29:21,512 becomes the inside of the network, becomes the front of the file at headquarters. 265 00:29:21,512 --> 00:29:27,782 So we need free media or we lose freedom of thought. It's that simple. 266 00:29:27,782 --> 00:29:30,034 What does free media mean? 267 00:29:30,034 --> 00:29:34,190 Media that you can read, that you can think about, that you can add to, 268 00:29:34,190 --> 00:29:40,199 that you can participate in, without being monitored. 269 00:29:40,199 --> 00:29:47,147 Without being surveilled, without being reported on. That's free media. 270 00:29:47,147 --> 00:29:57,016 If we don't have it, we lose freedom of thought, possibly forever. 271 00:29:57,016 --> 00:30:07,232 Having free media means having a network that behaves according to the needs of the people at the edge, 272 00:30:07,232 --> 00:30:14,802 not according to the needs of the servers in the middle. 273 00:30:14,802 --> 00:30:22,302 Making free media requires a network of peers, not a network of masters and servants, 274 00:30:22,302 --> 00:30:34,260 not a network of clients and servers, not a network where network operators control all the packets they move. 275 00:30:34,260 --> 00:30:40,994 This is not simple, but it is still possible. 276 00:30:40,994 --> 00:30:46,567 We require free technology. 277 00:30:46,567 --> 00:30:55,507 The last time I gave a political speech in Berlin was in 2004. It was called "Die Gedanken sind frei." 278 00:30:55,507 --> 00:31:04,330 I said we need three things: free software, free hardware, free bandwidth. Now we need them more. 279 00:31:04,330 --> 00:31:09,230 It is eight years later; we've made some mistakes; we're in more trouble. 280 00:31:09,230 --> 00:31:14,199 We haven't come forward, we've gone back. 281 00:31:14,199 --> 00:31:20,143 We need free software; that means software you can copy, modify, and redistribute. 282 00:31:20,143 --> 00:31:31,522 We need that because we need the software that runs the network to be modifiable by the people the network embraces. 283 00:31:31,522 --> 00:31:38,742 The death of Mr. Jobs is a positive event. I am sorry to break it to you like that. 284 00:31:38,742 --> 00:31:42,875 He was a great artist and a moral monster, 285 00:31:42,875 --> 00:31:53,301 and he brought us closer to the end of freedom every single time he put something out, because he hated sharing. 286 00:31:53,301 --> 00:31:55,855 It wasn't his fault, he was an artist. 287 00:31:55,855 --> 00:32:02,752 He hated sharing because he believed he invented everything, even though he didn't. 288 00:32:02,752 --> 00:32:07,326 Inside those fine little boxes with the lit-up apples on them I see all around the room 289 00:32:07,326 --> 00:32:13,526 is a bunch of free software, tailored to give him control. 290 00:32:13,526 --> 00:32:18,448 Nothing illegal, nothing wrong; he obeyed the licenses.
291 00:32:18,448 --> 00:32:23,232 He screwed us every time he could, and he took everything we gave him, 292 00:32:23,232 --> 00:32:29,478 and he made beautiful stuff that controlled its users. 293 00:32:29,478 --> 00:32:36,003 Once upon a time there was a man here who built stuff in Berlin, for Albert Speer; 294 00:32:36,003 --> 00:32:41,227 his name was Philip Johnson, and he was a wonderful artist and a moral monster. 295 00:32:41,227 --> 00:32:49,006 And he said he went to work building buildings for the Nazis because they had all the best graphics. 296 00:32:49,006 --> 00:32:53,046 And he meant it, because he was an artist. 297 00:32:53,046 --> 00:32:56,970 As Mr. Jobs was an artist. 298 00:32:56,970 --> 00:33:00,569 But artistry is no guarantee of morality. 299 00:33:00,569 --> 00:33:04,981 We need free software. 300 00:33:04,981 --> 00:33:11,576 The tablets that you use, that Mr. Jobs designed, are made to control you. 301 00:33:11,576 --> 00:33:18,774 You can't change the software. It's hard even to do ordinary programming. 302 00:33:18,774 --> 00:33:26,692 It doesn't really matter, they're just tablets, we just use them, we're just consuming the glories of what they give us. 303 00:33:26,692 --> 00:33:31,591 But they're consuming you too. 304 00:33:31,591 --> 00:33:38,580 We live, as the science fiction we read when we were children suggested we would, among robots now. 305 00:33:38,580 --> 00:33:45,918 We live commensally with robots. But they don't have hands and feet; we're their hands and feet. 306 00:33:45,918 --> 00:33:52,071 We carry the robots around with us; they know everywhere we go, they see everything we see; 307 00:33:52,071 --> 00:33:58,735 everything we say they listen to, and there is no first law of robotics. 308 00:33:58,735 --> 00:34:05,067 They hurt us every day and there is no programming to prevent it. 309 00:34:05,067 --> 00:34:08,209 So we need free software. 310 00:34:08,209 --> 00:34:16,707 Unless we control the software in the network, the network will, in the end, control us. 311 00:34:16,707 --> 00:34:18,403 We need free hardware. 312 00:34:18,403 --> 00:34:27,273 What that means is that when we buy an electronic something, it should be ours, not someone else's. 313 00:34:27,273 --> 00:34:32,102 We should be free to change it, to use it our way, 314 00:34:32,102 --> 00:34:37,234 to assure that it is not working for anyone other than ourselves. 315 00:34:37,234 --> 00:34:41,646 Of course most of us will never change anything. 316 00:34:41,646 --> 00:34:48,588 But the fact that we can change it will keep us safe. 317 00:34:48,588 --> 00:34:56,437 Of course we will never be the people that they most want to surveil. 318 00:34:56,437 --> 00:35:04,750 The man who will not be president of France, for sure, but who thought he would, now says that he was 319 00:35:04,750 --> 00:35:11,530 trapped and his political career was destroyed, not because he raped a hotel housekeeper, 320 00:35:11,530 --> 00:35:17,288 but because he was set up by spying inside his smart phone. 321 00:35:17,288 --> 00:35:21,213 Maybe he's telling the truth and maybe he isn't. 322 00:35:21,213 --> 00:35:24,092 But he's not wrong about the smart phone. 323 00:35:24,092 --> 00:35:28,596 Maybe it happened, maybe it didn't, but it will. 324 00:35:28,596 --> 00:35:32,033 We carry dangerous stuff around with us everywhere we go. 325 00:35:32,033 --> 00:35:35,028 It doesn't work for us, it works for someone else. 326 00:35:35,028 --> 00:35:39,719 We put up with it; we have to stop.
327 00:35:39,719 --> 00:35:42,319 We need free bandwidth. 328 00:35:42,319 --> 00:35:46,801 That means we need network operators who are common carriers, 329 00:35:46,801 --> 00:35:50,354 whose only job is to move the packet from A to B. 330 00:35:50,354 --> 00:35:56,275 They're merely pipes; they're not allowed to get involved. 331 00:35:56,275 --> 00:35:59,386 It used to be that when you shipped a thing from point A to point B, 332 00:35:59,386 --> 00:36:04,587 if the guy in the middle opened it up and looked inside it, he was committing a crime. 333 00:36:04,587 --> 00:36:07,002 Not any more. 334 00:36:07,002 --> 00:36:10,810 In the United States, the House of Representatives voted last week 335 00:36:10,810 --> 00:36:17,892 that the network operators in the United States should be completely immunized against lawsuits 336 00:36:17,892 --> 00:36:27,807 for cooperating with illegal government spying so long as they do it "in good faith." 337 00:36:27,807 --> 00:36:32,916 And capitalism means never having to say you're sorry; you're always doing it in good faith. 338 00:36:32,916 --> 00:36:37,838 In good faith, all we wanted to do was make money, Your Honor, let us out. 339 00:36:37,838 --> 00:36:40,323 Ok, you're gone. 340 00:36:40,323 --> 00:36:43,434 We must have free bandwidth. 341 00:36:43,434 --> 00:36:48,566 We still own the electromagnetic spectrum; it still belongs to all of us. 342 00:36:48,566 --> 00:36:54,650 It doesn't belong to anyone else. Government is a trustee, not an owner. 343 00:36:54,650 --> 00:37:01,082 We have to have spectrum we control, equal for everybody. 344 00:37:01,082 --> 00:37:09,928 Nobody's allowed to listen to anybody else; no inspecting, no checking, no record keeping. 345 00:37:09,928 --> 00:37:12,668 Those have to be the rules. 346 00:37:12,668 --> 00:37:18,961 Those have to be the rules in the same way that censorship had to go. 347 00:37:18,961 --> 00:37:24,162 If we don't have rules for free communication, we are re-introducing censorship, 348 00:37:24,162 --> 00:37:28,226 whether we know it or not. 349 00:37:28,226 --> 00:37:31,082 So we have very little choice now; 350 00:37:31,082 --> 00:37:39,696 our space has gotten smaller, our opportunity for change has gotten less. 351 00:37:39,696 --> 00:37:48,497 We have to have free software. We have to have free hardware. We have to have free bandwidth. 352 00:37:48,497 --> 00:37:53,373 Only from them can we make free media. 353 00:37:53,373 --> 00:37:56,949 But we have to work on media too, directly. 354 00:37:56,949 --> 00:38:01,639 Not intermittently, not offhand. 355 00:38:01,639 --> 00:38:10,858 We need to demand of media organizations that they obey primary ethics, a first law of media robotics: 356 00:38:10,858 --> 00:38:12,854 do no harm. 357 00:38:12,854 --> 00:38:19,778 The first rule is: "Do not surveil the reader." 358 00:38:19,778 --> 00:38:24,441 We can't live in a world where every book reports every reader. 359 00:38:24,441 --> 00:38:31,059 If we do, we're living in libraries operated by the KGB. 360 00:38:31,059 --> 00:38:35,169 Well, amazon.com. 361 00:38:35,169 --> 00:38:40,161 Or the KGB, or both; you'll never know. 362 00:38:40,161 --> 00:38:47,174 The book, that wonderful printed article, that first commodity of mass capitalism, 363 00:38:47,174 --> 00:38:48,729 the book is dying. 364 00:38:48,729 --> 00:38:51,585 It's a shame, but it's dying. 365 00:38:51,585 --> 00:38:58,644 And the replacement is a box which either surveils the reader or it doesn't.
366 00:38:58,644 --> 00:39:01,849 You will remember that amazon.com decided 367 00:39:01,849 --> 00:39:08,559 that a book by George Orwell could not be distributed in the United States for copyright reasons. 368 00:39:08,559 --> 00:39:14,016 They went and erased it out of all the little amazon book-reading devices 369 00:39:14,016 --> 00:39:18,399 where customers had "purchased" copies of Animal Farm. 370 00:39:18,399 --> 00:39:25,324 Oh, you may have bought it, but that doesn't mean that you're allowed to read it. 371 00:39:25,324 --> 00:39:28,041 That's censorship. 372 00:39:28,041 --> 00:39:30,757 That's book burning. 373 00:39:30,757 --> 00:39:36,725 That's what we all lived through in the 20th century. 374 00:39:36,725 --> 00:39:40,997 We burned people, places, and art. 375 00:39:40,997 --> 00:39:42,391 We fought. 376 00:39:42,391 --> 00:39:49,960 We killed tens of millions of people to bring an end to a world in which the state would burn books. 377 00:39:49,960 --> 00:39:53,095 And then we watched as it was done again and again. 378 00:39:53,095 --> 00:39:58,366 And now we are preparing to allow it to be done without matches. 379 00:39:58,366 --> 00:40:02,267 Everywhere, any time. 380 00:40:02,267 --> 00:40:08,838 We must have media ethics, and we have the power to enforce those ethics 381 00:40:08,838 --> 00:40:11,764 because we're still the people who pay the freight. 382 00:40:11,764 --> 00:40:18,474 We should not deal with people who sell surveilled books. 383 00:40:18,474 --> 00:40:24,767 We should not deal with people who sell surveilled music. 384 00:40:24,767 --> 00:40:33,521 We should not deal with movie companies that sell surveilled movies. 385 00:40:33,521 --> 00:40:39,442 We are going to have to say that, even as we work on the technology, 386 00:40:39,442 --> 00:40:48,428 because otherwise capitalism will move as fast as possible to make our efforts at freedom irrelevant, 387 00:40:48,428 --> 00:40:54,837 and there are children growing up who will never know what freedom means. 388 00:40:54,837 --> 00:40:57,646 So we have to make a point about it. 389 00:40:57,646 --> 00:41:00,665 It will cost us a little bit. 390 00:41:00,665 --> 00:41:02,917 Not much, but a little bit. 391 00:41:02,917 --> 00:41:11,253 We will have to forgo a little and make a few sacrifices in our lives to enforce ethics on media. 392 00:41:11,253 --> 00:41:13,645 But that's our role. 393 00:41:13,645 --> 00:41:17,012 Along with making free technology, that's our role. 394 00:41:17,012 --> 00:41:23,467 We are the last generation capable of understanding directly what the changes are, 395 00:41:23,467 --> 00:41:28,320 because we have lived on both sides of them and we know. 396 00:41:28,320 --> 00:41:31,455 So we have a responsibility. 397 00:41:31,455 --> 00:41:35,634 You understand that. 398 00:41:35,634 --> 00:41:38,514 It's always a surprise to me, though it is deeply true, 399 00:41:38,514 --> 00:41:44,411 that of all the cities in the world I travel to, Berlin is the freest. 400 00:41:44,411 --> 00:41:48,034 You cannot wear a hat in the Hong Kong airport any more, 401 00:41:48,034 --> 00:41:52,376 I found out last month, trying to wear my hat in the Hong Kong airport. 402 00:41:52,376 --> 00:41:58,436 You're not allowed; it disrupts the facial recognition. 403 00:41:58,436 --> 00:42:03,057 There will be a new airport here. Will it be so heavily surveilled 404 00:42:03,057 --> 00:42:08,955 that you won't be allowed to wear a hat because it disrupts the facial recognition?
405 00:42:08,955 --> 00:42:12,763 We have a responsibility. We know. 406 00:42:12,763 --> 00:42:16,478 That's how Berlin became the freest city that I go to. 407 00:42:16,478 --> 00:42:19,729 Because we know. Because we have a responsibility. 408 00:42:19,729 --> 00:42:26,765 Because we remember, because we've been on both sides of the wall. 409 00:42:26,765 --> 00:42:30,085 That must not be lost now. 410 00:42:30,085 --> 00:42:35,054 If we forget, no other forgetting will ever happen. 411 00:42:35,054 --> 00:42:37,121 Everything will be remembered. 412 00:42:37,121 --> 00:42:42,020 Everything you read, all through life, everything you listened to, 413 00:42:42,020 --> 00:42:46,200 everything you watched, everything you searched for. 414 00:42:46,200 --> 00:42:53,560 Surely we can pass along to the next generation a world freer than that. 415 00:42:53,560 --> 00:42:56,091 Surely we must. 416 00:42:56,091 --> 00:42:58,599 What if we don't? 417 00:42:58,599 --> 00:43:08,096 What will they say when they realize that we lived at the end of a thousand years 418 00:43:08,096 --> 00:43:12,392 of struggling for freedom of thought, at the end? 419 00:43:12,392 --> 00:43:19,126 When we had almost everything, we gave it away. 420 00:43:19,126 --> 00:43:23,654 For convenience. For social networking. 421 00:43:23,654 --> 00:43:28,205 Because Mr. Zuckerberg asked us to. 422 00:43:28,205 --> 00:43:33,150 Because we couldn't find a better way to talk to our friends. 423 00:43:33,150 --> 00:43:40,558 Because we loved the beautiful pretty things that felt so warm in the hand. 424 00:43:40,558 --> 00:43:46,525 Because we didn't really care about the future of freedom of thought. 425 00:43:46,525 --> 00:43:50,240 Because we considered that to be someone else's business. 426 00:43:50,240 --> 00:43:52,168 Because we thought it was over. 427 00:43:52,168 --> 00:43:55,767 Because we believed we were free. 428 00:43:55,767 --> 00:43:59,250 Because we didn't think there was any struggling left to do. 429 00:43:59,250 --> 00:44:01,177 That's why we gave it all away. 430 00:44:01,177 --> 00:44:03,940 Is that what we're going to tell them? 431 00:44:03,940 --> 00:44:07,400 Is that what we're going to tell them? 432 00:44:07,400 --> 00:44:13,298 Free thought requires free media. 433 00:44:13,298 --> 00:44:19,498 Free media requires free technology. 434 00:44:19,498 --> 00:44:29,714 We require ethical treatment when we go to read, to write, to listen, and to watch. 435 00:44:29,714 --> 00:44:38,445 Those are the hallmarks of our politics. We need to keep those politics until we die. 436 00:44:38,445 --> 00:44:43,368 Because, if we don't, something else will die, 437 00:44:43,368 --> 00:44:50,473 something so precious that many, many, many of our fathers and mothers gave their lives for it. 438 00:44:50,473 --> 00:44:56,278 Something so precious that we understood it to define what it meant to be human. 439 00:44:56,278 --> 00:45:00,713 It will die if we don't keep those politics for the rest of our lives. 440 00:45:00,713 --> 00:45:09,049 And if we do, then all the things we struggled for, we'll get. 441 00:45:09,049 --> 00:45:15,434 Because everywhere on earth, everybody will be able to read freely. 442 00:45:15,434 --> 00:45:21,634 Because all the Einsteins in the street will be allowed to learn. 443 00:45:21,634 --> 00:45:27,973 Because all the Stravinskys will become composers. 444 00:45:27,973 --> 00:45:31,735 Because all the Salks will become research physicians.
445 00:45:31,735 --> 00:45:37,308 Because humanity will be connected and every brain will be allowed to learn 446 00:45:37,308 --> 00:45:43,159 and no brain will be crushed for thinking wrong. 447 00:45:43,159 --> 00:45:46,619 We're at the moment where we get to pick. 448 00:45:46,619 --> 00:45:52,075 Whether we carry through that great revolution we've been making 449 00:45:52,075 --> 00:45:57,393 bit by bloody bit for a thousand years, 450 00:45:57,393 --> 00:46:02,501 or whether we give it away for convenience, 451 00:46:02,501 --> 00:46:07,424 for simplicity of talking to our friends, for speed in search, 452 00:46:07,424 --> 00:46:12,904 and other really important stuff. 453 00:46:12,904 --> 00:46:19,847 I said in 2004, when I was here, and I say now: we can win. 454 00:46:19,847 --> 00:46:28,182 We can be the generation of people who completed the work of building freedom of thought. 455 00:46:28,182 --> 00:46:35,079 I didn't say then, and I must say now, that we are also potentially the generation that can lose. 456 00:46:35,079 --> 00:46:42,347 We can slip back into an inquisition worse than any inquisition that ever existed. 457 00:46:42,347 --> 00:46:49,731 It may not use as much torture, it may not be as bloody, but it will be more effective. 458 00:46:49,731 --> 00:46:53,167 And we mustn't, mustn't let that happen. 459 00:46:53,167 --> 00:46:57,695 Too many people fought for us. Too many people died for us. 460 00:46:57,695 --> 00:47:02,432 Too many people hoped and dreamed for what we can still make possible. 461 00:47:02,432 --> 00:47:05,737 We must not fail. 462 00:47:05,737 --> 00:47:06,577 Thank you very much. 463 00:47:06,577 --> 00:47:59,561 [Applause] 464 00:47:59,561 --> 00:48:03,196 Let's learn how to take questions here. 465 00:48:03,196 --> 00:48:08,227 It's not going to be simple, but let's set a good example. 466 00:48:08,227 --> 00:48:20,828 [pause] 467 00:48:20,828 --> 00:48:21,794 [Questioner 1] Thank you. 468 00:48:21,794 --> 00:48:26,136 [Questioner 1] You put forward a very gruesome picture of the possible future. 469 00:48:26,136 --> 00:48:29,154 [Questioner 1] Could you name some organizations or groups 470 00:48:29,154 --> 00:48:35,586 [Questioner 1] in the United States that put forward actions in your way, 471 00:48:35,586 --> 00:48:40,764 [Questioner 1] in your positive way of transforming society? 472 00:48:40,764 --> 00:48:44,503 Not only in the United States but around the world, we have organizations 473 00:48:44,503 --> 00:48:48,032 that are concerned with electronic civil liberties. 474 00:48:48,032 --> 00:48:51,353 The EFF, the Electronic Frontier Foundation, in the United States. 475 00:48:51,353 --> 00:48:53,760 La Quadrature du Net in France. 476 00:48:53,760 --> 00:48:57,274 Bits of Freedom in the Netherlands, and so on. 477 00:48:57,274 --> 00:49:01,593 Electronic civil liberties agitation is extraordinarily important. 478 00:49:01,593 --> 00:49:05,229 Pressure on governments to obey rules that came down from 479 00:49:05,229 --> 00:49:09,559 the 18th century regarding protection of human dignity 480 00:49:09,559 --> 00:49:14,729 and the prevention of state surveillance is crucially important. 481 00:49:14,729 --> 00:49:20,393 Unfortunately, electronic civil liberties work against governments is not enough.
482 00:49:20,393 --> 00:49:26,047 The free software movement, the FSF, the Free Software Foundation in the United States, 483 00:49:26,047 --> 00:49:29,224 and the Free Software Foundation Europe, headquartered in Germany, 484 00:49:29,224 --> 00:49:35,607 are working in an important way to maintain that system of 485 00:49:35,607 --> 00:49:41,520 the anarchistic creation of software which has brought us so much technology we can control. 486 00:49:41,520 --> 00:49:43,850 That's crucially important. 487 00:49:43,850 --> 00:49:48,099 The Creative Commons movement, which is strongly entrenched 488 00:49:48,099 --> 00:49:52,991 not only in the United States and Germany but in more than 40 countries around the world, 489 00:49:52,991 --> 00:49:59,732 is also extraordinarily important, because Creative Commons gives to creative workers 490 00:49:59,732 --> 00:50:05,977 alternatives to the kind of massive over-control in the copyright system 491 00:50:05,977 --> 00:50:10,916 which makes surveillance media profitable. 492 00:50:10,916 --> 00:50:15,342 The Wikipedia is an extraordinarily important human institution, 493 00:50:15,342 --> 00:50:21,132 and we need to continue to support the Wikimedia Foundation as deeply as we can. 494 00:50:21,132 --> 00:50:26,479 Of the 100 most visited websites in the United States, 495 00:50:26,479 --> 00:50:29,357 in a study conducted by the Wall Street Journal, 496 00:50:29,357 --> 00:50:32,917 of the 100 most visited websites in the United States, 497 00:50:32,917 --> 00:50:36,826 only one does not surveil its users. 498 00:50:36,826 --> 00:50:40,600 You can guess which one it is: it's Wikipedia. 499 00:50:40,600 --> 00:50:47,681 We have enormously important developments now going on throughout the world of higher education, 500 00:50:47,681 --> 00:50:54,232 as universities begin to realize that the costs of higher education must come down 501 00:50:54,232 --> 00:50:57,624 and that brains will grow in the web. 502 00:50:57,624 --> 00:51:04,324 The Universitat Oberta de Catalunya, the UOC, is the most extraordinary 503 00:51:04,324 --> 00:51:07,850 online-only university in the world right now. 504 00:51:07,850 --> 00:51:13,500 It will soon be competing with more extraordinary universities still. 505 00:51:13,500 --> 00:51:19,113 MITx, the Massachusetts Institute of Technology's new program 506 00:51:19,113 --> 00:51:24,111 for web education, will provide the highest quality technical education on earth, 507 00:51:24,111 --> 00:51:30,343 for free, to everybody, everywhere, all the time, building on existing MIT OpenCourseWare. 508 00:51:30,343 --> 00:51:35,902 Stanford is about to spin off a proprietary web learning structure 509 00:51:35,902 --> 00:51:39,933 which will be the Google of higher education if Stanford gets lucky. 510 00:51:39,933 --> 00:51:44,493 We need to support free higher education on the web. 511 00:51:44,493 --> 00:51:49,166 Every European national ministry of education should be working on it. 512 00:51:49,166 --> 00:51:56,658 There are many places to look for free software, free hardware, free bandwidth, and free media. 513 00:51:56,658 --> 00:52:02,419 There's no better place to look for free media right now on earth than this room. 514 00:52:02,419 --> 00:52:06,299 Everybody knows what they can do; they're doing it. 515 00:52:06,299 --> 00:52:11,811 We just have to make everybody else understand that if we stop, or if we fail, 516 00:52:11,811 --> 00:52:16,284 freedom of thought will be the casualty, and we will regret it forever.
517 00:52:17,641 --> 00:52:21,751 [Organizer] We've had three more questions in the meantime. 518 00:52:21,751 --> 00:52:23,701 [Organizer] The gentleman with the microphone over here will begin. 519 00:52:23,701 --> 00:52:26,998 [Organizer] And then one, two, I'm sure there are more of you in the back, 520 00:52:26,998 --> 00:52:28,902 [Organizer] so raise your hands high. 521 00:52:28,902 --> 00:52:32,153 [Organizer] We'll take maybe your first please. 522 00:52:32,153 --> 00:52:35,683 [Questioner 2] Thank you very much, I just wanted to ask a short question. 523 00:52:35,683 --> 00:52:42,161 [Questioner 2] Can Facebook, can iPhone, and free media coexist in the long run? 524 00:52:42,161 --> 00:52:44,065 Probably not. 525 00:52:44,065 --> 00:52:46,596 But we don't have to worry too much. 526 00:52:46,596 --> 00:52:52,587 iPhone is just a product and Facebook is just a commercial version of a service. 527 00:52:52,587 --> 00:52:56,139 I said recently to a newspaper in New York that I thought Facebook 528 00:52:56,139 --> 00:53:00,087 would continue to exist for somewhere between 12 and 120 months. 529 00:53:00,087 --> 00:53:02,153 I still think that is correct. 530 00:53:02,153 --> 00:53:05,915 Federated social networking will become available. 531 00:53:05,915 --> 00:53:10,791 Federated social networking in a form which allows you to leave Facebook 532 00:53:10,791 --> 00:53:14,623 without leaving your friends will become available. 533 00:53:14,623 --> 00:53:19,801 Better forms of communication without a man in the middle will become available. 534 00:53:19,801 --> 00:53:22,494 The question will be: will people use them? 535 00:53:22,494 --> 00:53:26,883 FreedomBox is an attempt to produce a stack of software 536 00:53:26,883 --> 00:53:31,434 that will fit in a new generation of low-power, low-cost hardware servers 537 00:53:31,434 --> 00:53:34,545 the size of mobile phone chargers. 538 00:53:34,545 --> 00:53:37,355 And if we do that work right, we will be able to give 539 00:53:37,355 --> 00:53:40,281 billions of web servers to the net, 540 00:53:40,281 --> 00:53:44,530 which will serve the purpose of providing competing services 541 00:53:44,530 --> 00:53:49,754 that don't invade privacy and are compatible with existing services. 542 00:53:49,754 --> 00:53:54,213 But mobile phones get changed very frequently, so if iPhone goes away 543 00:53:54,213 --> 00:53:55,792 it's no big deal, 544 00:53:55,792 --> 00:53:58,230 and web services are much less unique 545 00:53:58,230 --> 00:54:00,459 than they appear right now. 546 00:54:00,459 --> 00:54:03,802 Facebook's a brand; it's not a thing that we need to worry about 547 00:54:03,802 --> 00:54:05,381 in any great particular. 548 00:54:05,381 --> 00:54:08,191 We just have to do it as quickly as possible. 549 00:54:08,191 --> 00:54:10,397 Co-existence? 550 00:54:10,397 --> 00:54:12,463 Well, all I have to say about that is: 551 00:54:12,463 --> 00:54:15,621 they're not going to co-exist with freedom, 552 00:54:15,621 --> 00:54:18,013 so I'm not sure why I should co-exist with them. 553 00:54:18,013 --> 00:54:25,536 [Applause] 554 00:54:25,536 --> 00:54:27,603 [Questioner 3] Hi, I'm Trey Gulalm(sp) from Bangladesh. 555 00:54:27,603 --> 00:54:32,108 [Questioner 3] Thank you for that wonderfully lucid, scintillating, 556 00:54:32,108 --> 00:54:35,358 [Questioner 3] and hugely informative presentation.
557 00:54:35,358 --> 00:54:39,120 [Questioner 3] I was involved in introducing email to Bangladesh 558 00:54:39,120 --> 00:54:43,718 [Questioner 3] in the early '90s and, at that time, connectivity was very expensive: 559 00:54:43,718 --> 00:54:46,829 [Questioner 3] we were spending 30 US cents per kilobyte, 560 00:54:46,829 --> 00:54:51,380 [Questioner 3] so a megabyte of data would be $300. 561 00:54:51,380 --> 00:54:55,119 [Questioner 3] It's changed since then, but it is still very tightly constrained 562 00:54:55,119 --> 00:54:59,159 [Questioner 3] by the regulatory bodies. So we on the ground find it 563 00:54:59,159 --> 00:55:01,156 [Questioner 3] very difficult, because the powers that be, 564 00:55:01,156 --> 00:55:05,150 [Questioner 3] the gatekeepers, have a vested interest in maintaining that. 565 00:55:05,150 --> 00:55:09,515 [Questioner 3] But in that gatekeeper nexus there is also a nexus 566 00:55:09,515 --> 00:55:13,625 [Questioner 3] between governments in my country and governments in yours, 567 00:55:13,625 --> 00:55:18,919 [Questioner 3] and right now the largest biometric database in the world 568 00:55:18,919 --> 00:55:21,589 [Questioner 3] is the census of Bangladesh, 569 00:55:21,589 --> 00:55:25,119 [Questioner 3] and the company that's providing it is a company 570 00:55:25,119 --> 00:55:27,719 [Questioner 3] that's directly linked to the CIA. 571 00:55:27,719 --> 00:55:30,227 [Questioner 3] So what do we as practitioners do 572 00:55:30,227 --> 00:55:33,222 [Questioner 3] to overcome very, very powerful entities? 573 00:55:33,222 --> 00:55:38,958 This is why I began by speaking about the United States government's recent behaviors. 574 00:55:38,958 --> 00:55:43,718 My colleagues at the Software Freedom Law Center in India 575 00:55:43,718 --> 00:55:48,803 have been spending a lot of time this past month trying to get a motion 576 00:55:48,803 --> 00:55:51,613 through the upper house of the Indian Parliament 577 00:55:51,613 --> 00:55:56,628 to nullify Department of IT regulations on the censorship of the Indian net. 578 00:55:56,628 --> 00:56:00,320 And of course the good news is that the largest biometric database in the world 579 00:56:00,320 --> 00:56:04,082 will soon be the retinal scans that the Indian government 580 00:56:04,082 --> 00:56:07,542 is going to require if you want to have a propane gas cylinder, 581 00:56:07,542 --> 00:56:10,885 or anything else like energy for your home. 582 00:56:10,885 --> 00:56:16,992 And the difficulty that we've been having in talking to Indian government officials this month 583 00:56:16,992 --> 00:56:22,008 is that they say, "Well, if the Americans can do it, why can't we?" 584 00:56:22,008 --> 00:56:25,003 Which is, unfortunately, true. 585 00:56:25,003 --> 00:56:29,392 The United States government has this winter lowered the bar around the world 586 00:56:29,392 --> 00:56:35,359 on internet freedom, in the sense of datamining your society, to the Chinese level. 587 00:56:35,359 --> 00:56:37,426 They've fundamentally agreed. 588 00:56:37,426 --> 00:56:40,514 They're going to datamine the hell out of their populations, 589 00:56:40,514 --> 00:56:45,042 and they're going to encourage every other state on earth to do the same. 590 00:56:45,042 --> 00:56:49,012 So I'm entirely with you about the definition of the problem. 591 00:56:49,012 --> 00:56:54,771 We are not now any longer living in a place, in a stage in our history, 592 00:56:54,771 --> 00:56:58,231 where we can think in terms of a country at a time.
593 00:56:58,231 --> 00:57:01,365 Globalization has reached the point at which these questions 594 00:57:01,365 --> 00:57:05,292 of surveillance of society are now global questions, 595 00:57:05,292 --> 00:57:09,993 and we have to work on them under the assumption that no government 596 00:57:09,993 --> 00:57:14,660 will decide to be more virtuous than the superpowers. 597 00:57:14,660 --> 00:57:17,958 I don't know how we're going to deal with the Chinese Communist Party. 598 00:57:17,958 --> 00:57:19,978 I do not know. 599 00:57:19,978 --> 00:57:22,230 I know how we're going to deal with the American government. 600 00:57:22,230 --> 00:57:26,154 We're going to insist on our rights. 601 00:57:26,154 --> 00:57:30,496 We're going to do what it makes sense to do in the United States. 602 00:57:30,496 --> 00:57:32,145 We're going to litigate about it. 603 00:57:32,145 --> 00:57:33,283 We're going to push. 604 00:57:33,283 --> 00:57:34,560 We're going to shove. 605 00:57:34,560 --> 00:57:38,786 We're going to be everywhere, including in the street, about it. 606 00:57:38,786 --> 00:57:43,035 And I suspect that's what's going to happen here too. 607 00:57:43,035 --> 00:57:46,147 Unless we move the biggest of the societies on earth, 608 00:57:46,147 --> 00:57:49,839 we will have no hope of convincing smaller governments 609 00:57:49,839 --> 00:57:53,368 that they have to let go of their controls. 610 00:57:53,368 --> 00:57:55,272 So far as bandwidth is concerned, of course, 611 00:57:55,272 --> 00:57:58,337 we're going to have to use unregulated bandwidth. 612 00:57:58,337 --> 00:58:01,449 That is, we're going to have to build around 802.11 Wi-Fi 613 00:58:01,449 --> 00:58:06,836 and any other thing that the rules don't prevent us from using. 614 00:58:06,836 --> 00:58:10,621 And how is that going to reach the poorest of the poor, 615 00:58:10,621 --> 00:58:13,987 when the mobile phone system can be shaped to reach 616 00:58:13,987 --> 00:58:16,913 the poorest of the poor? I don't know. 617 00:58:16,913 --> 00:58:20,837 But I've got a little project with street children in Bangalore trying to figure it out. 618 00:58:20,837 --> 00:58:22,347 We have to. 619 00:58:22,347 --> 00:58:24,065 We have to work everywhere. 620 00:58:24,065 --> 00:58:30,149 If we don't, we're going to screw it up for humanity, and we can't afford the risk. 621 00:58:30,149 --> 00:58:32,308 [Organizer] Thank you. The gentleman over here, please. 622 00:58:32,308 --> 00:58:35,373 [Questioner 4] Yes, Professor Moglen, I also want to thank you. 623 00:58:35,373 --> 00:58:40,063 [Questioner 4] I can tell you that I am from transformingfreedom.org in Vienna, 624 00:58:40,063 --> 00:58:45,311 [Questioner 4] and some years ago I saw you talking on a web video 625 00:58:45,311 --> 00:58:48,817 [Questioner 4] at FOSDEM, and there I saw you pointing out 626 00:58:48,817 --> 00:58:54,019 [Questioner 4] the role of Zimmerman, Fredrick, and we tried to help him as well. 627 00:58:54,019 --> 00:58:58,709 [Questioner 4] But listening to you today, I see that this is just too slow, 628 00:58:58,709 --> 00:59:04,189 [Questioner 4] too little, and I'm a bit amazed at two things. 629 00:59:04,189 --> 00:59:08,717 [Questioner 4] The first is that the academic system, let's say the European one, 630 00:59:08,717 --> 00:59:12,502 [Questioner 4] was founded by Plato and was closed down by force 631 00:59:12,502 --> 00:59:15,590 [Questioner 4] about a thousand years later.
632 00:59:15,590 --> 00:59:20,164 [Questioner 4] The second start of the European university was 633 00:59:20,164 --> 00:59:23,392 [Questioner 4] in the last few centuries, and let's see if we get there, 634 00:59:23,392 --> 00:59:28,477 [Questioner 4] to have it running as long as a thousand years. 635 00:59:28,477 --> 00:59:32,355 [Questioner 4] So my question is: why is it not deeply in the cell structure 636 00:59:32,355 --> 00:59:36,859 [Questioner 4] of academia to help the cause that you have talked about today, 637 00:59:36,859 --> 00:59:41,875 [Questioner 4] and why don't we have philanthropists helping 638 00:59:41,875 --> 00:59:46,077 [Questioner 4] our little projects, running for three or five 639 00:59:46,077 --> 00:59:49,281 [Questioner 4] thousand euros here and there, much more, let's say, 640 00:59:49,281 --> 00:59:53,159 [Questioner 4] efficiently, like maybe you would agree that Mr. Soros 641 00:59:53,159 --> 00:59:56,131 [Questioner 4] tries to do? 642 00:59:56,131 --> 01:00:01,704 Some years ago, at Columbia, we tried to interest faculty 643 01:00:01,704 --> 01:00:07,361 in the state of preservation of the libraries, and I saw 644 01:00:07,361 --> 01:00:13,476 more distinguished scholars at my own university than at any other time 645 01:00:13,476 --> 01:00:17,493 in my 25 years there engaged politically. 646 01:00:17,493 --> 01:00:22,393 Their primary concern was the aging of the paper 647 01:00:22,393 --> 01:00:26,154 on which were printed the 19th-century German doctorates 648 01:00:26,154 --> 01:00:32,099 that conserved more philological research than any other literature on earth, 649 01:00:32,099 --> 01:00:33,190 right? 650 01:00:33,190 --> 01:00:36,511 But it was 19th-century books that they needed to preserve. 651 01:00:36,511 --> 01:00:40,946 The problem with academic life is that it is inherently conservative, 652 01:00:40,946 --> 01:00:44,011 because it preserves the wisdom of the old. 653 01:00:44,011 --> 01:00:45,752 And that is a good thing to do, 654 01:00:45,752 --> 01:00:48,237 but the wisdom of the old is old, 655 01:00:48,237 --> 01:00:52,904 and it doesn't necessarily embrace the issues of the moment perfectly. 656 01:00:52,904 --> 01:00:56,456 I mentioned the walk because I think it is so important 657 01:00:56,456 --> 01:01:02,192 to support the university as it maneuvers itself towards the net 658 01:01:02,192 --> 01:01:06,279 and away from the forms of learning that have characterized 659 01:01:06,279 --> 01:01:10,063 the matriculating university of the past. 660 01:01:10,063 --> 01:01:15,543 For the last thousand years, mostly, we moved scholars to books, 661 01:01:15,543 --> 01:01:19,142 and the university grew up around that principle. 662 01:01:19,142 --> 01:01:22,068 It grew up around the principle that books are hard to move 663 01:01:22,068 --> 01:01:23,461 and people are easy. 664 01:01:23,461 --> 01:01:25,505 So you bring everybody to the books. 665 01:01:25,505 --> 01:01:28,546 Now we live in a world in which it is much simpler 666 01:01:28,546 --> 01:01:30,915 to move knowledge to people. 667 01:01:30,915 --> 01:01:36,697 But the continuance of ignorance is the desire of businesses that sell knowledge. 668 01:01:36,697 --> 01:01:39,878 What we really need is to bring ourselves to help 669 01:01:39,878 --> 01:01:42,943 to turn the university system into something else: 670 01:01:42,943 --> 01:01:45,822 something which allows everybody to learn 671 01:01:45,822 --> 01:01:49,746 and which demands unsurveilled learning.
672 01:01:49,746 --> 01:01:52,347 The Commissioner for Information Society will be here; 673 01:01:52,347 --> 01:01:56,546 she should speak to that. 674 01:01:56,546 --> 01:02:00,590 That should be the great question of the European Commission. 675 01:02:00,590 --> 01:02:03,516 They know; they printed a report 18 months ago 676 01:02:03,516 --> 01:02:06,256 that said for the cost of 100 km of road you 677 01:02:06,256 --> 01:02:10,273 can scan 1/6 of all the books in European libraries. 678 01:02:10,273 --> 01:02:14,824 That means for the cost of 600 km of road we could get 'em all. 679 01:02:14,824 --> 01:02:17,424 We built a lot of roads in a lot of places, 680 01:02:17,424 --> 01:02:19,328 including Greece, in the last 10 years, 681 01:02:19,328 --> 01:02:22,068 and we could've scanned all the books in Europe at the same time 682 01:02:22,068 --> 01:02:28,407 and made them available to all humanity on an unsurveilled basis. 683 01:02:28,407 --> 01:02:31,960 If Mrs. Kroes wants to build a monument to herself, 684 01:02:31,960 --> 01:02:35,815 it isn't going to be as a fief-a-day politician. 685 01:02:35,815 --> 01:02:38,601 She's going to do it this way, and you're going to ask her. 686 01:02:38,601 --> 01:02:41,109 I'm going to be on a plane on my way back across the Atlantic, 687 01:02:41,109 --> 01:02:43,593 or I promise you I'd ask her myself. 688 01:02:43,593 --> 01:02:45,637 Ask her for me. 689 01:02:45,637 --> 01:02:50,745 Say, "It's not our fault, Eben wants to know. If you want to hurt somebody, hurt him. You should be changing 690 01:02:50,745 --> 01:02:55,993 the European university. You should be breaking it up into unsurveilled reading. You should be putting 691 01:02:55,993 --> 01:03:04,027 Google Books and Amazon out of business. That's some North American, Anglo-Saxon, elbow-capitalism thing. 692 01:03:04,027 --> 01:03:07,185 Why aren't we making knowledge free in Europe 693 01:03:07,185 --> 01:03:10,273 and ensuring that it's unsurveilled?" 694 01:03:10,273 --> 01:03:15,648 That would be the biggest step possible, and it is within their power. 695 01:03:15,648 --> 01:03:27,933 [Organizer] Thank you so much. [Applause] Brilliant. Thank you.
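The road comparison in that last answer is a single proportion; a worked version of the arithmetic, assuming only the ratio Moglen attributes to the Commission report (the report's actual per-kilometre costs are not given in the talk):

# scanning_cost_sketch.py -- restates the quoted ratio, nothing more.
cost_in_road_km = 100      # cost that scans 1/6 of the books, expressed in road-km
fraction_scanned = 1 / 6   # share of European library holdings for that cost
road_km_for_all = cost_in_road_km / fraction_scanned
print(road_km_for_all)     # 600.0 -> "for the cost of 600 km of road we could get 'em all"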