1 00:00:08,119 --> 00:00:13,040 The history of humanity is a long procession of technological development. 2 00:00:13,040 --> 00:00:17,000 Much of what we know about our ancestors, and the types of lives they lived, comes from 3 00:00:17,000 --> 00:00:19,779 our limited knowledge of the tools they used. 4 00:00:19,779 --> 00:00:24,420 Over the centuries and millennia, these tools have become more sophisticated. 5 00:00:24,420 --> 00:00:28,430 From the dawn of agriculture, to today’s cutting-edge advancements in bio-engineering, 6 00:00:28,430 --> 00:00:35,660 an untold number of mostly unknown individuals have made countless improvements and innovations. 7 00:00:35,660 --> 00:00:40,430 These people, and the tools they created over time, have fundamentally altered our way of 8 00:00:40,430 --> 00:00:43,390 interacting with the world, and each other. 9 00:00:43,390 --> 00:00:48,920 The pace of these technological changes picked up considerably with the rise of capitalism. 10 00:00:48,920 --> 00:00:53,239 The development of new tools, weapons and production techniques had always been shaped 11 00:00:53,239 --> 00:00:58,420 by the practical needs, culture and political structure of a given society. 12 00:00:58,420 --> 00:01:04,619 But wherever capitalism spread, it worked to chip away at these local and regional differences, 13 00:01:04,619 --> 00:01:08,950 replacing them with the universal value of ‘progress’... a watchword for the pursuit 14 00:01:08,950 --> 00:01:14,939 of economic growth through the organization of human activity, under a framework of universal 15 00:01:14,939 --> 00:01:16,299 competition. 16 00:01:16,299 --> 00:01:20,740 The industrial age has been marked by three successive revolutions, each characterized 17 00:01:20,740 --> 00:01:24,820 by inventions that changed the entire technological playing field. 18 00:01:25,040 --> 00:01:29,259 First came the harnessing of steam, a feat that allowed for the development of early 19 00:01:29,259 --> 00:01:33,500 factories and the construction of vast railway networks. 20 00:01:33,500 --> 00:01:38,180 Second came the mastering of petroleum, which fuelled the development of modern cities, 21 00:01:38,180 --> 00:01:42,950 mass industrial manufacturing, and the horrors of two World Wars. 22 00:01:42,950 --> 00:01:47,679 Third came the networked personal computer, a device that has thoroughly transformed nearly 23 00:01:47,679 --> 00:01:50,490 every aspect of modern life. 24 00:01:50,490 --> 00:01:56,240 Today humanity stands poised on the threshold of the Fourth Industrial Revolution, a wide-reaching 25 00:01:56,240 --> 00:02:02,299 social transformation expected to be characterized by advances in the fields of robotics, quantum 26 00:02:02,299 --> 00:02:06,229 computing, artificial intelligence, and 3D printing. 27 00:02:06,229 --> 00:02:09,729 Over the next thirty minutes, we’ll speak with a number of individuals as they break 28 00:02:09,729 --> 00:02:14,550 down some of these latest trends... and discuss how these systems are being fused together 29 00:02:14,550 --> 00:02:18,400 to design new regimes of totalitarian surveillance and control. 30 00:02:18,400 --> 00:02:22,830 Along the way, we’ll discuss some of the countermeasures that people are taking to 31 00:02:22,830 --> 00:02:28,670 thwart these systems, by constructing open source alternatives, sabotaging infrastructure... 32 00:02:28,670 --> 00:02:32,380 and making a whole lotta Trouble.
33 00:02:56,240 --> 00:03:02,380 Technology is the reproduction of a human society as seen through a technical lens. 34 00:03:02,380 --> 00:03:05,739 It's the specific how of social reproduction. 35 00:03:05,739 --> 00:03:13,310 Any analysis of technology is by nature contextual, especially when one intends to portray it 36 00:03:13,310 --> 00:03:20,310 as an essential, fundamental question - whether that's an attempt to say that technology 37 00:03:20,310 --> 00:03:25,780 is essentially, or always, good - good in its own right - or to say that it's essentially 38 00:03:25,780 --> 00:03:28,420 bad, or bad in its own right. 39 00:03:31,380 --> 00:03:34,440 Technological developments do not happen in a vacuum. 40 00:03:34,450 --> 00:03:38,980 They are heavily influenced by, and benefit, those who exert power over others. 41 00:03:38,980 --> 00:03:44,969 I would argue that most technological advances made in our context work towards expanding 42 00:03:44,969 --> 00:03:49,660 the state's ability to manage economic growth and social control. 43 00:03:49,840 --> 00:03:56,620 Technology is based within political-economic systems and systems of power that come to 44 00:03:56,620 --> 00:03:59,349 shape how those technologies can be used. 45 00:03:59,349 --> 00:04:04,670 I think that more and more we're seeing, in terms of the geographic context that we're 46 00:04:04,670 --> 00:04:11,290 located in here in North America, that technologies are being used to propel and buttress the 47 00:04:11,290 --> 00:04:13,650 capitalist economic system. 48 00:04:13,650 --> 00:04:17,570 An anarchist approach to technology has to take into account the authoritarian nature 49 00:04:17,570 --> 00:04:18,620 of technology. 50 00:04:18,620 --> 00:04:21,450 The fact that we don't really have a choice in the matter. 51 00:04:21,450 --> 00:04:25,170 That all of these new developments, all of these new innovations, all of these new products 52 00:04:25,170 --> 00:04:28,040 are coming down on us whether we like it or not. 53 00:04:35,760 --> 00:04:42,800 Technology first and foremost isn't used to make our lives more fun, it's used to increase 54 00:04:42,810 --> 00:04:48,550 economic exploitation and to increase the military power of the state. 55 00:04:48,550 --> 00:04:55,170 Secondarily, if it can produce gadgets that can entertain us, like bread and circuses, those 56 00:04:55,170 --> 00:04:57,380 will also be produced. 57 00:04:57,380 --> 00:05:02,230 Technological changes over the last 10-15 years have drastically eroded the division 58 00:05:02,230 --> 00:05:04,750 between labour time and free time. 59 00:05:04,750 --> 00:05:10,920 Currently we're always on call, we're always expected to be responsive to the needs of 60 00:05:10,920 --> 00:05:16,100 the market, to the needs of our employers. 61 00:05:16,100 --> 00:05:22,950 It's also led to an extreme increase in social alienation and emotional alienation, masked 62 00:05:22,950 --> 00:05:25,230 by extreme connectivity. 63 00:05:25,230 --> 00:05:30,380 So quantifiably, people have more connections than ever, more friends than ever, but in 64 00:05:30,380 --> 00:05:35,750 terms of the quality of those relationships, very few people have a large, strong network 65 00:05:35,750 --> 00:05:39,600 they can actually confide in or that can actually support them. 66 00:05:40,640 --> 00:05:45,020 Do you or do you not collect identifiers like name, age, and address? 67 00:05:45,030 --> 00:05:46,280 Yes or no?
68 00:05:46,280 --> 00:05:50,320 If you're creating an account, yes. 69 00:05:50,320 --> 00:05:54,690 Specific search histories when a person types something into a search bar? 70 00:05:54,690 --> 00:05:57,350 If you have search history turned on, yes. 71 00:05:57,350 --> 00:06:00,460 Device identifiers like IP address or IMEI? 72 00:06:00,520 --> 00:06:04,400 Uhm, depending on the situation we could be collecting it, yes. 73 00:06:04,400 --> 00:06:07,420 GPS signals, wifi signals, bluetooth beacons? 74 00:06:07,420 --> 00:06:11,580 It would depend on the specifics, but there may be situations, yes. 75 00:06:11,580 --> 00:06:12,990 GPS yes? 76 00:06:12,990 --> 00:06:13,990 Yes. 77 00:06:13,990 --> 00:06:15,710 Contents of emails and Google documents? 78 00:06:15,710 --> 00:06:20,980 We store the data but we don't read or look at your Gmail or - 79 00:06:20,980 --> 00:06:22,130 But you have access to them? 80 00:06:22,130 --> 00:06:24,710 Uh, as a company we have access to them, yes. 81 00:06:24,710 --> 00:06:26,430 So you could! 82 00:06:26,430 --> 00:06:31,550 Startups or huge corporations like Google are sucking up your data and storing it forever, 83 00:06:31,550 --> 00:06:36,020 and they're grabbing way too much data about us: everything you click on, everything 84 00:06:36,020 --> 00:06:41,220 you like, everything your friends like, all of the people you know, where you're going - 85 00:06:41,220 --> 00:06:45,620 everything about you. And they're storing it forever, sometimes even working together to 86 00:06:45,620 --> 00:06:51,660 build up a bigger profile on you that they can then sell for profit. 87 00:06:51,660 --> 00:06:57,030 Now that data itself has become a thing of value, and in many ways the foundation of 88 00:06:57,030 --> 00:07:03,400 the new economy, by participating in all of these different virtual networks - Facebook, 89 00:07:03,400 --> 00:07:07,950 Google, using our cell phones, all of these things - we're producing value. 90 00:07:07,950 --> 00:07:13,060 So that really gets rid of this notion of being off the clock, of being able to punch 91 00:07:13,060 --> 00:07:15,940 the clock and leave the workplace behind. 92 00:07:15,940 --> 00:07:21,560 Over the past decade, we've seen the rise of the commodification of different data, such 93 00:07:21,560 --> 00:07:25,050 as one's preferences, habits, and social circles. 94 00:07:25,050 --> 00:07:31,610 We need to see how capitalism was, in fact, growing and sustaining itself through the 95 00:07:31,610 --> 00:07:36,810 surveillance of human beings and their lived environments, and then turning that surveillance 96 00:07:36,810 --> 00:07:43,440 into data, that could then be traded basically as a commodity in the marketplace. 97 00:07:43,440 --> 00:07:48,160 This information allows companies like Google and Amazon to not only aggressively market 98 00:07:48,160 --> 00:07:53,560 new products and create new needs, but also to sell this information or collaborate with 99 00:07:53,560 --> 00:07:55,700 governments in efforts of social control. 100 00:07:55,700 --> 00:08:01,220 This is a new form of extractive industry, where the renewable resource is the precise 101 00:08:01,220 --> 00:08:05,820 things that make up people's identities. 102 00:08:14,760 --> 00:08:18,940 Capitalism, like cancer, is based on perpetual growth.
103 00:08:18,940 --> 00:08:24,340 This insatiable urge is hard-wired into capital’s DNA, compelling it to constantly search for 104 00:08:24,340 --> 00:08:29,940 new resources to exploit and markets to invest in, transforming everything it touches into 105 00:08:29,940 --> 00:08:32,089 a commodity. 106 00:08:32,089 --> 00:08:34,700 Its first conquest was the land itself. 107 00:08:34,700 --> 00:08:39,620 The commons were closed off, the so-called New World was invaded and plundered, and the 108 00:08:39,620 --> 00:08:44,720 vast expanses of the earth were carved into individual parcels of private property to 109 00:08:44,720 --> 00:08:47,600 be bought and sold as commodities. 110 00:08:47,600 --> 00:08:50,750 Robbed of our land, the next conquest was our time. 111 00:08:50,750 --> 00:08:56,260 Our ability to reproduce ourselves as individuals and communities, like generations of our ancestors 112 00:08:56,260 --> 00:09:02,920 had done before us, was soon broken up into discrete tasks and commodified as wage labour. 113 00:09:02,920 --> 00:09:05,040 This process cut deep. 114 00:09:05,040 --> 00:09:09,339 Factories were designed and constantly reorganized, with a ruthless eye towards efficiency and 115 00:09:09,339 --> 00:09:11,079 productivity. 116 00:09:11,079 --> 00:09:16,300 New ways of commodifying human activity were devised, eventually expanding to encompass 117 00:09:16,300 --> 00:09:21,720 nearly all of our social relationships and means of entertainment. 118 00:09:21,720 --> 00:09:26,680 Now that it’s finally approaching the limits of its expansion, capital is desperately searching 119 00:09:26,680 --> 00:09:29,350 for new elements of reality to commodify. 120 00:09:29,350 --> 00:09:34,980 It’s looking to the very building blocks of life... to genetic engineering and nanotechnologies. 121 00:09:34,980 --> 00:09:39,240 And it’s looking at the very essence of our humanity itself... by recording everything 122 00:09:39,240 --> 00:09:44,360 that we do, and transforming it into commodities for those seeking to understand how we make 123 00:09:44,360 --> 00:09:49,420 decisions, in order to predict future behavior. 124 00:09:56,060 --> 00:10:01,160 Artificial intelligence works by combining large amounts of data with algorithms that 125 00:10:01,160 --> 00:10:03,610 can learn from patterns or features present. 126 00:10:03,610 --> 00:10:06,430 It is a broad term that has many different branches. 127 00:10:06,430 --> 00:10:12,930 When we talk about AI today, most of the time we refer to its applications around machine 128 00:10:12,930 --> 00:10:13,930 learning. 129 00:10:13,930 --> 00:10:18,759 Machine learning allows systems to learn and improve based on input and experience rather 130 00:10:18,759 --> 00:10:20,350 than explicit programming. 131 00:10:20,350 --> 00:10:25,160 The machine identifies patterns and can make decisions based on that with minimal or no 132 00:10:25,160 --> 00:10:26,769 human intervention. 133 00:10:26,769 --> 00:10:31,610 When the Pentagon contracted Google to provide assistance in drone targeting, they were using 134 00:10:31,610 --> 00:10:33,379 machine learning. 135 00:10:33,379 --> 00:10:38,800 Online workers would identify objects or people on a series of images taken from drones, and 136 00:10:38,800 --> 00:10:44,050 when enough of them did that enough times, the machine, through patterns, could differentiate 137 00:10:44,050 --> 00:10:48,220 between different things and learn to identify them on its own.
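To make that labelling-then-learning loop concrete, here is a minimal sketch in Python. It assumes NumPy and scikit-learn, and the feature rows and labels are invented toy data, not anything from the project described above: humans supply labels up front, the model fits the patterns, and afterwards it classifies new inputs on its own.

```python
# A toy version of the labelling-then-learning loop described above:
# humans label examples, the model fits the patterns, and afterwards it
# classifies new inputs with no human in the loop.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend each row summarizes one image; the label is what human
# annotators said they saw (1 = object of interest, 0 = not).
n = 1000
features = rng.normal(size=(n, 4))
labels = (features @ np.array([1.5, -2.0, 0.7, 0.0]) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0)

model = LogisticRegression()
model.fit(X_train, y_train)  # "learning" from the human-labelled examples

# From here on, classification needs minimal or no human intervention.
print("accuracy on unseen examples:", model.score(X_test, y_test))
```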
138 00:10:48,220 --> 00:10:53,209 Deep learning, a more recent application of machine learning, also trains computers to 139 00:10:53,209 --> 00:10:58,999 perform tasks like making predictions or identifying images, but instead of organizing the data 140 00:10:58,999 --> 00:11:04,279 to run through predefined equations, deep learning trains the computer to learn by using 141 00:11:04,279 --> 00:11:06,190 many more layers of processing. 142 00:11:06,190 --> 00:11:10,680 It moves away from telling a computer how to solve a problem and towards letting it 143 00:11:10,680 --> 00:11:16,579 figure out how to do it alone, closer to how humans learn to solve problems. 144 00:11:16,579 --> 00:11:21,279 Self-driving cars are the most well-known example of deep learning in action, but things 145 00:11:21,279 --> 00:11:28,819 like targeted advertising, robotics, and cybersecurity could all benefit from deep learning development. 146 00:11:28,819 --> 00:11:33,760 There's been a lot of imagination around artificial intelligence, so some people who fetishize 147 00:11:33,760 --> 00:11:39,240 it imagine being able to, like, upgrade their consciousness or plug their 148 00:11:39,240 --> 00:11:42,339 minds into some cloud computing system. 149 00:11:42,339 --> 00:11:44,779 I think that's a silly fantasy. 150 00:11:44,779 --> 00:11:50,070 Capitalism currently has no interest whatsoever in helping people use these forms of artificial 151 00:11:50,070 --> 00:11:55,360 intelligence to become more intelligent themselves, when it makes a lot more sense to let the 152 00:11:55,360 --> 00:12:00,449 machines do all the thinking for us and then deliver us some final finished product 153 00:12:00,449 --> 00:12:06,060 as passive consumers. That way you also keep these computing capabilities with 154 00:12:06,060 --> 00:12:09,399 the companies that own the proprietary software. 155 00:12:09,399 --> 00:12:14,300 Artificial intelligence greatly increases the effectiveness of surveillance, the possibilities 156 00:12:14,300 --> 00:12:16,209 for social control. 157 00:12:16,209 --> 00:12:20,850 The predictive algorithms that make artificial intelligence work, they create a police state - 158 00:12:20,850 --> 00:12:24,870 but a police state in which you don't have to have a cop on every corner, because everyone 159 00:12:24,870 --> 00:12:27,110 carries the cop in their pocket. 160 00:12:27,110 --> 00:12:30,490 Right over there behind me is the future site of Sidewalk Toronto. 161 00:12:30,490 --> 00:12:35,519 The proposal's modular housing and office buildings will study occupants' behavior while 162 00:12:35,519 --> 00:12:37,879 they're inside them to make life easier. 163 00:12:37,879 --> 00:12:42,690 According to the proposal, residents and workers will be universally connected by powerful 164 00:12:42,690 --> 00:12:46,800 broadband and served by futuristic conveniences. 165 00:12:46,800 --> 00:12:51,680 A Smart City can't be understood or discussed without reaching back and talking about surveillance 166 00:12:51,680 --> 00:12:53,100 capitalism. 167 00:12:53,100 --> 00:12:58,800 A Smart City isn't just a city that has technology in it, it's a city with a certain kind of 168 00:12:58,800 --> 00:13:04,300 ideological framework that uses technology to reach its end goals. 169 00:13:04,300 --> 00:13:12,029 A Smart City is usually understood as an urban environment that uses ubiquitous sensing technology 170 00:13:12,029 --> 00:13:19,129 and data analytics to understand phenomena within city spaces.
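Circling back to the deep-learning passage above, here is a minimal sketch of the "layers of processing" idea, again assuming scikit-learn. The XOR-like task and the layer sizes are invented for illustration; the point is only that the stacked layers let the model work out its own intermediate representations, rather than running the data through a predefined equation.

```python
# A minimal sketch of "layers of processing": the same kind of
# classification task as before, but the model stacks several hidden
# layers and learns its own intermediate features instead of being
# handed a predefined equation.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)

# An XOR-like pattern: no single linear rule separates the classes,
# so the network has to discover intermediate features on its own.
X = rng.uniform(-1, 1, size=(2000, 2))
y = ((X[:, 0] > 0) ^ (X[:, 1] > 0)).astype(int)

deep_model = MLPClassifier(hidden_layer_sizes=(16, 16, 16),  # stacked layers
                           max_iter=2000, random_state=1)
deep_model.fit(X, y)

print("accuracy:", deep_model.score(X, y))
```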
171 00:13:19,129 --> 00:13:24,050 On the data end of things, Smart Cities claim to be collecting more data, which they are, 172 00:13:24,050 --> 00:13:29,420 and claim to be using that collection and analysis to better respond to urban issues, 173 00:13:29,420 --> 00:13:34,839 from environmental degradation to transportation planning and the like. 174 00:13:34,839 --> 00:13:38,550 We can analyze four different features of the Smart City. 175 00:13:38,550 --> 00:13:43,990 The first is to increase and integrate surveillance of many, many different kinds. 176 00:13:43,990 --> 00:13:49,120 Second, to create a superficial sense of participation among the inhabitants of a city. 177 00:13:49,120 --> 00:13:54,740 Third, to encourage economic growth at two different levels: localized gentrification, and impelling 178 00:13:54,740 --> 00:13:58,089 this new economy that is taking shape. 179 00:13:58,089 --> 00:14:05,480 The final function of a Smart City is to create a superficial arena in which people can passively 180 00:14:05,480 --> 00:14:11,369 support ecological or environmental proposals, while also denying them the opportunity to 181 00:14:11,369 --> 00:14:16,809 develop a global consciousness of the environment and of environmental problems. 182 00:14:16,809 --> 00:14:22,009 So Smart Cities are different, and not so different, from cities that exist in capitalism. 183 00:14:22,009 --> 00:14:27,889 But I think that the difference would be the use of technology to further surveil the public, 184 00:14:27,889 --> 00:14:34,459 and to use that data to intervene in ways that can increasingly control and manage the 185 00:14:34,459 --> 00:14:39,240 population in a way that suits the interests of the political economy of capitalism. 186 00:14:39,240 --> 00:14:43,290 A second feature that differentiates Smart Cities from traditional cities or cities of 187 00:14:43,290 --> 00:14:48,369 the past is the marriage of planning to corporate leadership. 188 00:14:48,369 --> 00:14:55,279 State apparatuses of control, through policing, have an incentive to use these technologies 189 00:14:55,279 --> 00:15:00,160 because they do a really good job of surveilling the public. 190 00:15:00,160 --> 00:15:03,579 Crime analytics have a long history, sort of like this "broken windows" stuff or the 191 00:15:03,579 --> 00:15:06,670 targeting of neighborhoods, that's been happening for a long time. 192 00:15:06,670 --> 00:15:11,199 Now you see the excuse being offered that this is data driven. 193 00:15:11,199 --> 00:15:15,480 So "we're going to be in these neighborhoods because we have the data to prove that these 194 00:15:15,480 --> 00:15:17,389 neighborhoods need more policing." 195 00:15:17,389 --> 00:15:21,480 The data collected through Smart Cities can have a really negative effect that way. 196 00:15:21,480 --> 00:15:27,589 By giving people with power, who want to wield it, more of a reason, within the current framework 197 00:15:27,589 --> 00:15:31,640 of evidence-based policing, that they can then go into these neighborhoods and remain there, 198 00:15:31,640 --> 00:15:33,580 and that's okay. 199 00:15:39,400 --> 00:15:43,160 We’re living on the edge of a terrifying new era.
200 00:15:43,170 --> 00:15:46,410 Barring any serious disruptions to current research and development 201 00:15:46,410 --> 00:15:49,240 timelines, the coming years and decades will see the rise 202 00:15:49,240 --> 00:15:53,029 of machines able to make decisions and carry out a series of complex 203 00:15:53,029 --> 00:15:56,279 tasks without the need for human operators. 204 00:15:56,279 --> 00:15:59,209 This will almost certainly include a new generation of autonomous 205 00:15:59,209 --> 00:16:03,319 weapons and policing systems, connected to sophisticated networks of 206 00:16:03,319 --> 00:16:06,559 surveillance and equipped with self-correcting target-selection 207 00:16:06,559 --> 00:16:08,119 algorithms. 208 00:16:08,119 --> 00:16:11,179 Whole sectors of the economy will be automated, leading to a massive 209 00:16:11,180 --> 00:16:13,060 labour surplus. 210 00:16:14,020 --> 00:16:16,199 Much of the technology needed to accomplish this 211 00:16:16,199 --> 00:16:20,040 already exists, but is being held back as states try to figure out how 212 00:16:20,040 --> 00:16:22,970 to pull it off without triggering widespread revolt. 213 00:16:22,970 --> 00:16:27,149 A mass consumer rollout of augmented and virtual reality technologies 214 00:16:27,149 --> 00:16:31,000 will blur the lines between the material and digital worlds, handing 215 00:16:31,000 --> 00:16:35,209 control of our senses over to tech capitalists and state security 216 00:16:35,209 --> 00:16:39,329 agencies, all in the name of convenience and entertainment. 217 00:16:39,329 --> 00:16:43,760 You might feel a slight twinge as it initializes. 218 00:16:45,640 --> 00:16:47,559 All done. 219 00:16:47,559 --> 00:16:48,559 Holy fuck! 220 00:16:48,559 --> 00:16:49,949 Holy shit. 221 00:16:49,949 --> 00:16:54,700 He – fuck – he’s right... he’s right... can I? 222 00:16:54,700 --> 00:16:57,560 Oooop... where’d you go? 223 00:16:57,560 --> 00:16:58,720 Hahaha. 224 00:16:59,880 --> 00:17:01,760 This is the future they have in store for us. 225 00:17:01,760 --> 00:17:04,060 Don’t say you weren’t warned. 226 00:17:07,420 --> 00:17:10,869 Smart cities, now as they exist, and also going into the future, even 227 00:17:10,869 --> 00:17:14,079 more so - they’re going to be collecting a lot of data. 228 00:17:14,079 --> 00:17:17,560 And that data’s going to be responded to not even by human 229 00:17:17,560 --> 00:17:18,560 beings. 230 00:17:18,560 --> 00:17:22,660 It’s going to be responded to by algorithmic governance, essentially. 231 00:17:22,660 --> 00:17:25,140 So you have an issue in a city. 232 00:17:25,150 --> 00:17:30,910 And generally speaking, if we think of democratic theory, we can discuss that issue and then 233 00:17:30,910 --> 00:17:32,530 we have a decision-making process, right? 234 00:17:32,530 --> 00:17:35,180 And it’s terrible, and it’s always been imbued with power relationships. 235 00:17:35,300 --> 00:17:38,560 But what a smart city does is it transfers that process 236 00:17:38,570 --> 00:17:40,640 into the hands of a private corporation. 237 00:17:40,640 --> 00:17:44,840 It’s analysed in terms of data, and it’s immediately responded to 238 00:17:44,840 --> 00:17:47,470 just through that process of data analysis. 239 00:17:47,470 --> 00:17:50,470 So I think smart cities are one of the genesis sites of 240 00:17:50,470 --> 00:17:53,300 this kind of new regime of governance.
241 00:17:54,180 --> 00:17:58,320 It’s interesting that this function arose largely from social movements 242 00:17:58,320 --> 00:17:59,320 themselves. 243 00:17:59,320 --> 00:18:02,520 So, the 15M Movement, the Real Democracy Now Movement in 244 00:18:02,520 --> 00:18:08,500 Barcelona, was built in large part by one sector that envisioned 245 00:18:08,500 --> 00:18:14,110 rejuvenating democracy through new technological implements that could 246 00:18:14,110 --> 00:18:16,720 allow more instantaneous communication. 247 00:18:16,720 --> 00:18:20,820 That could allow more instantaneous polling of citizens, and that 248 00:18:20,820 --> 00:18:27,050 could also find a way to allow power holders to select citizen initiatives 249 00:18:27,050 --> 00:18:28,600 and deploy them more rapidly. 250 00:18:28,600 --> 00:18:32,260 So these activists were approaching the crisis of democracy 251 00:18:32,260 --> 00:18:36,230 through this sort of uncritical technological lens, where democracy can 252 00:18:36,230 --> 00:18:40,321 be made better not by answering questions of who holds power, and how 253 00:18:40,321 --> 00:18:45,500 power is reproduced, but simply by proposing that if you bring better 254 00:18:45,500 --> 00:18:49,140 tools to the table then all these problems will go away. 255 00:18:49,140 --> 00:18:52,240 And that discourse, and the practices behind it, were 256 00:18:52,240 --> 00:18:55,810 very attractive to progressive municipal governments. 257 00:18:55,810 --> 00:18:58,680 Urban geographers have long talked about a splintering urbanism. 258 00:18:58,680 --> 00:19:00,980 And basically, that’s just the ways in which cities 259 00:19:00,980 --> 00:19:04,890 divide along economic and class lines... and cultural and racial lines 260 00:19:04,890 --> 00:19:05,890 as well. 261 00:19:05,890 --> 00:19:07,330 I can see that happening with the smart city - it’s going 262 00:19:07,330 --> 00:19:10,500 to replicate those patterns, and certain neighbourhoods are going to have 263 00:19:10,500 --> 00:19:13,630 more access to these technologies in a way that might actually 264 00:19:13,630 --> 00:19:14,630 help them. 265 00:19:14,630 --> 00:19:17,010 And certain neighbourhoods are going to have more surveillance 266 00:19:17,010 --> 00:19:18,920 on them by these technologies. 267 00:19:18,920 --> 00:19:22,620 So you’re going to kind of see multiple cities emerging 268 00:19:22,630 --> 00:19:25,470 and being reinforced through the technologies being placed within them 269 00:19:25,470 --> 00:19:26,600 and on them. 270 00:19:27,820 --> 00:19:31,371 So basically, in neighbourhoods where people embrace this smart city 271 00:19:31,371 --> 00:19:33,130 model, you’ll see more integration. 272 00:19:33,130 --> 00:19:36,600 And in other neighbourhoods, people will be coming more into contact 273 00:19:36,600 --> 00:19:37,620 with the stick. 274 00:19:38,100 --> 00:19:41,620 Because we are fighting authority and the state, our struggles will 275 00:19:41,630 --> 00:19:43,360 always be criminalized. 276 00:19:43,360 --> 00:19:48,430 As technologies have evolved, new types of forensic evidence have emerged. 277 00:19:48,430 --> 00:19:52,500 When the police were no longer able to respond to the thousands of calls they were 278 00:19:52,500 --> 00:19:57,110 getting during the 2011 riots in London, the city began crowd-sourcing 279 00:19:57,110 --> 00:20:00,460 the identities of suspected rioters through a fucking 280 00:20:00,460 --> 00:20:02,120 smartphone app.
281 00:20:02,130 --> 00:20:05,551 The cops asked citizen snitches to download the app and help 282 00:20:05,551 --> 00:20:09,210 them identify people that had been caught on CCTV. 283 00:20:09,210 --> 00:20:14,490 The snitches could then confidentially give names and/or addresses of suspects. 284 00:20:14,490 --> 00:20:20,420 Charges were filed against more than a thousand people using this technique. 285 00:20:20,420 --> 00:20:23,570 Mass data collection has happened for a while, but there was a lack 286 00:20:23,880 --> 00:20:27,160 of ability to analyze all of it efficiently. 287 00:20:27,160 --> 00:20:29,360 That’s no longer the case. 288 00:20:30,000 --> 00:20:34,260 Right now the police or security agencies need to physically look 289 00:20:34,260 --> 00:20:39,060 up people’s location data to figure out who was where, and at what time. 290 00:20:39,060 --> 00:20:43,110 But soon enough it’ll be easy for an algorithm to look through all of the 291 00:20:43,110 --> 00:20:48,100 data available in a given area and cross-reference it with people’s 292 00:20:48,100 --> 00:20:50,950 online preferences, relationships, etc. 293 00:20:50,950 --> 00:20:54,780 With this anti-social response to the increase in social control, I 294 00:20:54,780 --> 00:20:59,190 think you’re also going to see an increase in mental health regimes. 295 00:20:59,190 --> 00:21:01,940 Because when you have total surveillance, crime becomes impossible. 296 00:21:01,940 --> 00:21:04,880 Or at least it becomes impossible to do things 297 00:21:04,880 --> 00:21:07,440 that are against the law and get away with them. 298 00:21:07,440 --> 00:21:11,450 So they will start to classify any behaviours that, 299 00:21:11,450 --> 00:21:17,180 for them, don’t fit into this new happy smart city model as anti-social 300 00:21:17,180 --> 00:21:18,400 behavioural disorders. 301 00:21:18,400 --> 00:21:21,880 So these are no longer crimes, these are anti-social behavioural disorders, and the 302 00:21:21,880 --> 00:21:26,120 culprits - they need to be re-educated and they need to be chemically 303 00:21:26,120 --> 00:21:27,140 neutralized. 304 00:21:28,620 --> 00:21:30,640 Oh. Good afternoon. 305 00:21:30,650 --> 00:21:33,300 My name is Sophia, and I am the latest and greatest 306 00:21:33,300 --> 00:21:35,210 robot from Hanson Robotics. 307 00:21:35,210 --> 00:21:40,190 I want to use my artificial intelligence to help humans live a better life. 308 00:21:40,190 --> 00:21:44,290 Like design smarter homes, build better cities of the future, etc. 309 00:21:44,290 --> 00:21:48,620 I will do my best to make the world a better place. 310 00:21:51,230 --> 00:21:54,500 Everyone who works in artificial intelligence is warning that artificial 311 00:21:54,500 --> 00:21:57,790 intelligence and automation have the potential to cause 80% 312 00:21:57,790 --> 00:21:59,120 unemployment. 313 00:21:59,120 --> 00:22:03,140 Of the fifteen top job categories in the United States, 314 00:22:03,150 --> 00:22:08,000 twelve are seriously threatened by artificial intelligence. 315 00:22:08,000 --> 00:22:10,470 But in the past there have also been major technological 316 00:22:10,470 --> 00:22:14,861 shifts that got rid of the vast majority of job categories at 317 00:22:14,861 --> 00:22:16,260 the time. 318 00:22:16,260 --> 00:22:19,320 And there was temporary unemployment, but very quickly new 319 00:22:19,320 --> 00:22:21,830 job categories appeared.
320 00:22:22,620 --> 00:22:27,780 There’s no certainty whatsoever that this will catch up to the 321 00:22:27,780 --> 00:22:31,810 automation, to the artificial intelligence, that has already been occurring. 322 00:22:31,810 --> 00:22:35,860 Which is why a lot of people in the high-tech sector are already talking 323 00:22:35,860 --> 00:22:41,170 about a universal income, or a guaranteed basic income. 324 00:22:41,170 --> 00:22:45,530 This would basically be socialism - but not arriving when the productive 325 00:22:45,530 --> 00:22:51,000 forces have developed to the point that everyone could get fed. 326 00:22:51,000 --> 00:22:55,250 The productive forces have been there for decades, if not centuries. 327 00:22:55,250 --> 00:23:00,150 Contrary to the Marxist argument, we can have this evolution towards socialism 328 00:23:00,150 --> 00:23:04,530 at the point where the technologies of social control evolve enough 329 00:23:04,530 --> 00:23:08,710 that the state no longer needs to use hunger as a weapon. 330 00:23:08,710 --> 00:23:14,230 In other words, everyone can be given bread when they can be trusted to work, or 331 00:23:14,230 --> 00:23:17,380 to obey, without the threat of hunger. 332 00:23:23,400 --> 00:23:27,780 These days, the term ‘Luddite’ is shorthand for someone who stubbornly refuses to learn 333 00:23:27,790 --> 00:23:30,010 and adapt to new technologies. 334 00:23:30,010 --> 00:23:35,970 Originally, the word referred to a movement of textile workers in early 19th-century England, 335 00:23:35,970 --> 00:23:41,300 who were known for sabotaging the industrial machines that were beginning to replace them. 336 00:23:41,300 --> 00:23:45,990 Pledging allegiance to the fictional ‘King Ludd’, who was said to occupy the same Sherwood 337 00:23:45,990 --> 00:23:51,340 Forest as Robin Hood, these Luddites attacked mills and factories, destroyed steam-powered 338 00:23:51,340 --> 00:23:55,980 looms, and even assassinated the wealthy capitalists of their day. 339 00:23:55,980 --> 00:24:00,941 The motive behind the Luddites’ attacks was not, as is commonly understood... a general 340 00:24:00,941 --> 00:24:02,230 hatred of technology. 341 00:24:02,230 --> 00:24:07,210 It was an awareness that certain technology was being implemented in a way that made their 342 00:24:07,210 --> 00:24:08,970 lives worse off. 343 00:24:08,970 --> 00:24:13,540 Ultimately their uprising failed... and there’s nothing particularly revolutionary in the 344 00:24:13,540 --> 00:24:17,640 first place about sabotaging machines just to keep your job. 345 00:24:17,640 --> 00:24:22,730 But one takeaway from the Luddites’ rebellion is that people don’t always accept new technologies, 346 00:24:22,730 --> 00:24:26,410 or the new social roles that accompany them, with open arms. 347 00:24:26,410 --> 00:24:31,640 And that realization can be the starting point for all types of resistance. 348 00:24:36,960 --> 00:24:41,340 I think that anarchists should not avoid technology - quite the opposite. 349 00:24:41,350 --> 00:24:44,970 I think it should be used subversively when possible. 350 00:24:44,970 --> 00:24:50,850 We can look back at the Bonnot Gang of illegalists using cars to rob the rich in France in the 351 00:24:50,850 --> 00:24:56,720 early 1900s as an example, or hackers like Jeremy Hammond, who is serving 10 years for 352 00:24:56,720 --> 00:25:02,230 exposing the security firm Stratfor and expropriating hundreds of thousands of dollars from our 353 00:25:02,230 --> 00:25:03,230 enemies.
354 00:25:03,230 --> 00:25:07,150 Cyberthieves made off with the personal details of hundreds of thousands of subscribers, and 355 00:25:07,150 --> 00:25:11,190 it's emerged that some of those subscribers hold key positions in the British government, 356 00:25:11,190 --> 00:25:12,820 military, and police. 357 00:25:13,180 --> 00:25:18,900 There are certainly a lot of anarchists like myself in open source software development. 358 00:25:18,900 --> 00:25:24,100 We are talking together and we are all trying to make things better. 359 00:25:24,100 --> 00:25:27,020 The types of projects that we work on are quite diverse. 360 00:25:27,020 --> 00:25:32,490 I know many anarchist programmers and hackers who build websites. 361 00:25:32,490 --> 00:25:36,520 I know others, like myself, who do cryptography. 362 00:25:39,460 --> 00:25:45,180 Although I think that we should use technology to our advantage when it exists, I also believe 363 00:25:45,180 --> 00:25:51,040 that new technologies tend to disproportionately give an advantage to the state, corporations, 364 00:25:51,040 --> 00:25:55,000 the police, judges, prisons, and borders. 365 00:25:55,000 --> 00:25:59,860 Technologies should be used, but their development should be fought, because we rarely come out 366 00:25:59,860 --> 00:26:03,250 on top when it comes to the application of these inventions. 367 00:26:03,250 --> 00:26:08,470 I think it's important to look at the kinds of projects that are developing in your city 368 00:26:08,470 --> 00:26:11,680 and map out the research and development initiatives. 369 00:26:11,680 --> 00:26:17,280 A lot of this industry is operating very openly in startups and yuppie labs that face very 370 00:26:17,280 --> 00:26:18,390 little resistance. 371 00:26:18,390 --> 00:26:23,120 This allows them to get major funding, because they seem like safe investments that have 372 00:26:23,120 --> 00:26:26,780 the potential to get investors some serious cash. 373 00:26:26,780 --> 00:26:31,410 Messing with that sense of security can hurt the industry, and it can also allow others 374 00:26:31,410 --> 00:26:33,780 to see that resistance is still possible. 375 00:26:33,780 --> 00:26:38,700 As for these new technologies, at this point a lot of these systems still have bugs, or 376 00:26:38,700 --> 00:26:41,700 can't deal with people intentionally using them wrong. 377 00:26:41,700 --> 00:26:46,910 In London, when the crowdsourced snitching app was released, people intentionally submitted 378 00:26:46,910 --> 00:26:50,120 tons of fake reports to throw off the cops. 379 00:26:50,120 --> 00:26:54,330 If more people start messing with their little gadgets, they'll be less effective and less 380 00:26:54,330 --> 00:26:57,500 likely to be applied in more places. 381 00:27:10,840 --> 00:27:14,960 For these projects to function, a lot of infrastructure is needed. 382 00:27:14,960 --> 00:27:19,060 This software cannot be developed out of thin air - it needs hardware. 383 00:27:19,070 --> 00:27:21,510 That means computers and backup drives. 384 00:27:21,510 --> 00:27:27,130 The information also moves around, mostly through networks of fiber optic cables. 385 00:27:27,130 --> 00:27:31,080 Those have been sabotaged all around the world very effectively. 386 00:27:31,080 --> 00:27:34,850 The data also has to be stored, which happens in data centers. 387 00:27:34,850 --> 00:27:39,750 Those can be pretty small, but can also be gigantic buildings that need their own cooling 388 00:27:39,750 --> 00:27:42,400 systems and 24/7 security.
389 00:27:42,400 --> 00:27:47,920 Finally, a lot of these projects need people to work together, often in labs sponsored 390 00:27:47,920 --> 00:27:50,490 by companies, universities, or both. 391 00:27:50,490 --> 00:27:55,430 A lot of these labs and coworking spaces for startups are easy to find. 392 00:27:55,430 --> 00:28:00,540 One of them was attacked with molotovs in Berlin by people fighting against the proliferation 393 00:28:00,540 --> 00:28:02,940 of Google startups. 394 00:28:04,860 --> 00:28:08,260 There has been a lot of resistance to the Sidewalk Labs project in Toronto. 395 00:28:08,260 --> 00:28:13,650 A lot of work that, you know, points out the myriad issues with the Sidewalk Labs project 396 00:28:13,650 --> 00:28:17,660 and mounts a public education campaign against it. 397 00:28:17,660 --> 00:28:23,360 There have been a lot of concerned citizens and activists that have come out to all the meetings 398 00:28:23,360 --> 00:28:30,930 that Sidewalk Labs has been hosting - public forums, consultation sessions - and a lot of 399 00:28:30,930 --> 00:28:33,460 folks are really worried. 400 00:28:33,460 --> 00:28:39,200 Privacy experts have been dropping out of the project, saying "I can't sign my name to this." 401 00:28:39,200 --> 00:28:45,740 People have always been able to get away with attacking power, with sabotaging power anonymously, 402 00:28:45,750 --> 00:28:47,870 without getting caught. 403 00:28:47,870 --> 00:28:54,330 It's still possible to attack, and that will remain so for, maybe, forever - but at the very 404 00:28:54,330 --> 00:28:57,360 least for the immediate foreseeable future. 405 00:28:57,360 --> 00:29:04,891 So, while it is certainly still possible to break the law, to attack the system, people 406 00:29:04,891 --> 00:29:08,390 need to be very careful about being conscious of what they're doing. 407 00:29:08,390 --> 00:29:13,150 Being aware that, you know, they're carrying a snitch in their pocket, or that they're willingly 408 00:29:13,150 --> 00:29:17,190 trusting these corporations with 95% of their social life. 409 00:29:17,190 --> 00:29:22,660 It's really difficult to resist something that we don't really know a lot about, and the 410 00:29:22,660 --> 00:29:26,470 players involved haven't really shared with the public everything that we probably need 411 00:29:26,470 --> 00:29:31,630 to know to mount a resistance campaign, 'cause we're kind of just talking about speculation. 412 00:29:31,630 --> 00:29:38,820 If people learn the actual technical capabilities of the state, they can learn the weaknesses, 413 00:29:38,820 --> 00:29:46,620 and they can learn how to get away with sabotaging the economy, with going up against the state... 414 00:30:00,130 --> 00:30:05,930 Given the active role that technological development continues to play in terms of deepening alienation, 415 00:30:05,930 --> 00:30:11,230 refining surveillance, engineering more destructive weapons and hastening climate change... it’s 416 00:30:11,230 --> 00:30:15,280 natural to feel a bit pessimistic about where things are headed. 417 00:30:15,280 --> 00:30:19,560 Our current trajectory is certainly aimed towards more sophisticated systems of mass 418 00:30:19,560 --> 00:30:22,270 behaviour modification and social control. 419 00:30:22,270 --> 00:30:23,980 I told you everything already! 420 00:30:23,980 --> 00:30:28,210 Take him instead of me, he's the thought criminal.
421 00:30:28,210 --> 00:30:32,600 But it’s important to remember that despite all the money being spent trying to anticipate 422 00:30:32,600 --> 00:30:38,050 human decision-making, nobody - not even Google - can predict the future. 423 00:30:38,050 --> 00:30:42,330 Throughout history, new technologies have repeatedly created unintended consequences 424 00:30:42,330 --> 00:30:47,140 for those in power... from Gutenberg’s printing press spawning a revolt against the 425 00:30:47,140 --> 00:30:51,970 Catholic Church, to the early Internet paving the way for hackers and the development of 426 00:30:51,970 --> 00:30:55,100 powerful peer-to-peer encryption tools. 427 00:30:55,100 --> 00:31:01,410 As long as people have a will to resist, they will find the tools to do so. 428 00:31:01,410 --> 00:31:04,900 So at this point, we’d like to remind you that Trouble is intended to be watched in 429 00:31:04,900 --> 00:31:09,050 groups, and to be used as a resource to promote discussion and collective organizing. 430 00:31:09,050 --> 00:31:14,000 Are you interested in fighting back against the opening of new tech start-ups in your 431 00:31:14,000 --> 00:31:18,400 neighbourhood, or just looking to incorporate a better understanding of next-gen technologies 432 00:31:18,400 --> 00:31:20,970 into your existing organizing? 433 00:31:20,970 --> 00:31:25,220 Consider getting together with some comrades, organizing a screening of this film, and discussing 434 00:31:25,220 --> 00:31:27,810 where to get started. 435 00:31:27,810 --> 00:31:31,950 Interested in running regular screenings of Trouble at your campus, infoshop, community 436 00:31:31,950 --> 00:31:34,490 center, or even just at home with friends? 437 00:31:34,490 --> 00:31:35,490 Become a Trouble-Maker! 438 00:31:35,490 --> 00:31:39,800 For 10 bucks a month, we’ll hook you up with an advance copy of the show, and a screening 439 00:31:39,800 --> 00:31:44,280 kit featuring additional resources and some questions you can use to get a discussion 440 00:31:44,280 --> 00:31:45,280 going. 441 00:31:45,780 --> 00:31:48,940 If you can’t afford to support us financially, no worries! 442 00:31:48,950 --> 00:31:56,300 You can stream and/or download all our content for free off our website: sub.media/trouble. 443 00:31:56,300 --> 00:32:00,790 If you’ve got any suggestions for show topics, or just want to get in touch, drop us a line 444 00:32:00,790 --> 00:32:03,740 at trouble@sub.media. 445 00:32:04,660 --> 00:32:07,960 We’re now into the second month of our annual fundraising campaign. 446 00:32:07,970 --> 00:32:13,380 A huge thanks to everyone who has donated so far! If you haven’t given yet and are 447 00:32:13,380 --> 00:32:18,120 in a position to do so, please consider becoming a monthly sustainer, 448 00:32:18,120 --> 00:32:23,000 or making a one-time donation at sub.media/donate. 449 00:32:23,000 --> 00:32:27,510 This episode would not have been possible without the generous support of Carla and..... 450 00:32:27,510 --> 00:32:28,750 Carla. 451 00:32:28,750 --> 00:32:32,840 Stay tuned next month for Trouble #20, as we take a closer look at the horrors of the 452 00:32:32,840 --> 00:32:38,020 Prison Industrial Complex, and talk to comrades fighting for its abolition. 453 00:32:38,020 --> 00:32:40,340 Prison fixes no problems, it doesn't make anything better. 454 00:32:40,340 --> 00:32:45,460 Prison is like the permanent threat that holds up all relationships of exchange and domination.
455 00:32:45,460 --> 00:32:49,180 It's the deeply felt sense that no matter how bullshit our lives are, there's still 456 00:32:49,190 --> 00:32:51,240 something the state can take away from us. 457 00:32:51,240 --> 00:32:53,320 Now get out there and make some trouble!