0:00:08.119,0:00:13.040 The history of humanity is a long procession[br]of technological development. 0:00:13.040,0:00:17.000 Much of what we know about our ancestors,[br]and the types of lives they lived, comes from 0:00:17.000,0:00:19.779 our limited knowledge of the tools they used. 0:00:19.779,0:00:24.420 Over the centuries and millennia, these tools[br]have become more sophisticated. 0:00:24.420,0:00:28.430 From the dawn of agriculture, to today’s[br]cutting-edge advancements in bio-engineering, 0:00:28.430,0:00:35.660 an untold number of mostly unknown individuals[br]have made countless improvements and innovations. 0:00:35.660,0:00:40.430 These people, and the tools they created over[br]time, have fundamentally altered our way of 0:00:40.430,0:00:43.390 interacting with the world, and each other. 0:00:43.390,0:00:48.920 The pace of these technological changes picked[br]up considerably with the rise of capitalism. 0:00:48.920,0:00:53.239 The development of new tools, weapons and[br]production techniques had always been shaped 0:00:53.239,0:00:58.420 by the practical needs, culture and political[br]structure of a given society. 0:00:58.420,0:01:04.619 But wherever capitalism spread, it worked[br]to chip away at these local and regional differences, 0:01:04.619,0:01:08.950 replacing them with the universal value of[br]‘progress’... a watchword for the pursuit 0:01:08.950,0:01:14.939 of economic growth through the organization[br]of human activity, under a framework of universal 0:01:14.939,0:01:16.299 competition. 0:01:16.299,0:01:20.740 The industrial age has been marked by three[br]successive revolutions, each characterized 0:01:20.740,0:01:24.820 by inventions that changed the entire technological[br]playing field. 0:01:25.040,0:01:29.259 First came the harnessing of steam, a feat[br]that allowed for the development of early 0:01:29.259,0:01:33.500 factories and the construction of vast railway[br]networks. 
0:01:33.500,0:01:38.180 Second came the mastering of petroleum, which[br]fuelled the development of modern cities, 0:01:38.180,0:01:42.950 mass industrial manufacturing, and the horrors[br]of two World Wars. 0:01:42.950,0:01:47.679 Third came the networked personal computer,[br]a device that has thoroughly transformed nearly 0:01:47.679,0:01:50.490 every aspect of modern life. 0:01:50.490,0:01:56.240 Today humanity stands poised on the threshold[br]of the Fourth Industrial Revolution, a wide-reaching 0:01:56.240,0:02:02.299 social transformation expected to be characterized[br]by advances in the fields of robotics, quantum 0:02:02.299,0:02:06.229 computing, artificial intelligence, and 3D-Printing. 0:02:06.229,0:02:09.729 Over the next thirty minutes, we’ll speak[br]with a number of individuals as they break 0:02:09.729,0:02:14.550 down some of these latest trends... and discuss[br]how these systems are being fused together 0:02:14.550,0:02:18.400 to design new regimes of totalitarian surveillance[br]and control. 0:02:18.400,0:02:22.830 Along the way, we’ll discuss some of the[br]countermeasures that people are taking to 0:02:22.830,0:02:28.670 thwart these systems, by constructing open[br]source alternatives, sabotaging infrastructure... 0:02:28.670,0:02:32.380 and making a whole lotta Trouble. 0:02:56.240,0:03:02.380 Technology is the reproduction of a human[br]society as seen through a technical lens. 0:03:02.380,0:03:05.739 It's the specific how of social reproduction. 0:03:05.739,0:03:13.310 Any analysis of technology is by nature contextual,[br]especially when one intends to portray it 0:03:13.310,0:03:20.310 either as an essential fundamental question,[br]whether that's an attempt to say that technology 0:03:20.310,0:03:25.780 is essentially, or always good - good in its[br]own right - whether to say that it's essentially 0:03:25.780,0:03:28.420 bad, or bad in its own right. 0:03:31.380,0:03:34.440 Technological developments do not happen in[br]a vacuum. 
0:03:34.450,0:03:38.980 They are heavily influenced by, and benefit,[br]those who exert power over others. 0:03:38.980,0:03:44.969 I would argue that most technological advances[br]made in our context work towards expanding 0:03:44.969,0:03:49.660 the state's ability to manage economic growth[br]and social control. 0:03:49.840,0:03:56.620 Technology is based within political-economic[br]systems and systems of power that come to 0:03:56.620,0:03:59.349 shape how those technologies can be used. 0:03:59.349,0:04:04.670 I think that more and more we're seeing, in[br]terms of the geographic context that we're 0:04:04.670,0:04:11.290 located in here in North America, that technologies[br]are being used to propel and buttress the 0:04:11.290,0:04:13.650 capitalist economic system. 0:04:13.650,0:04:17.570 An anarchist approach to technology has to[br]take into account the authoritarian nature 0:04:17.570,0:04:18.620 of technology. 0:04:18.620,0:04:21.450 The fact that we don't really have a choice[br]in the matter. 0:04:21.450,0:04:25.170 That all of these new developments, all of[br]these new innovations, all of these new products 0:04:25.170,0:04:28.040 are coming down on us whether we like it or[br]not. 0:04:35.760,0:04:42.800 Technology first and foremost isn't used to[br]make our lives more fun, it's used to increase 0:04:42.810,0:04:48.550 economic exploitation and to increase the[br]military power of the state. 0:04:48.550,0:04:55.170 Secondarily, if it can produce gadgets that[br]can entertain us, like bread and circuses, those 0:04:55.170,0:04:57.380 will also be produced. 0:04:57.380,0:05:02.230 Technological changes over the last 10-15[br]years have drastically eroded the division 0:05:02.230,0:05:04.750 between labour time and free time. 0:05:04.750,0:05:10.920 Currently we're always on call, we're always[br]expected to be responsive to the needs of 0:05:10.920,0:05:16.100 the market, to the needs of our employers. 
0:05:16.100,0:05:22.950 It's also led to an extreme increase in social[br]alienation and emotional alienation masked 0:05:22.950,0:05:25.230 by extreme connectivity. 0:05:25.230,0:05:30.380 So quantifiably, people have more connections[br]than ever, more friends than ever, but in 0:05:30.380,0:05:35.750 terms of the quality of those relationships,[br]very few people have a large, strong network 0:05:35.750,0:05:39.600 they can actually confide in or that can actually[br]support them. 0:05:40.640,0:05:45.020 Do you or do you not collect identifiers like[br]name, age, and address? 0:05:45.030,0:05:46.280 Yes or no? 0:05:46.280,0:05:50.320 If you're creating an account, yes. 0:05:50.320,0:05:54.690 Specific search histories when a person types[br]something into a search bar? 0:05:54.690,0:05:57.350 If you have search history turned on, yes. 0:05:57.350,0:06:00.460 Device identifiers like IP address or IMEI? 0:06:00.520,0:06:04.400 Uhm, depending on the situation we could be[br]collecting it, yes. 0:06:04.400,0:06:07.420 GPS signals, wifi signals, bluetooth beacons? 0:06:07.420,0:06:11.580 It would depend on the specifics, but there[br]may be situations, yes. 0:06:11.580,0:06:12.990 GPS yes? 0:06:12.990,0:06:13.990 Yes. 0:06:13.990,0:06:15.710 Contents of emails and Google documents? 0:06:15.710,0:06:20.980 We store the data but we don't read or look[br]at your Gmail or - 0:06:20.980,0:06:22.130 But you have access to them? 0:06:22.130,0:06:24.710 Uh, as a company we have access to them, yes. 0:06:24.710,0:06:26.430 So you could! 
0:06:26.430,0:06:31.550 Startups or huge corporations like Google[br]are sucking up your data and storing it forever 0:06:31.550,0:06:36.020 and they're grabbing way too much data about[br]us like everything you click on, everything 0:06:36.020,0:06:41.220 you like, everything your friends like, all[br]of the people you know, where you're going, 0:06:41.220,0:06:45.620 everything about you and they're storing it[br]forever sometimes even working together to 0:06:45.620,0:06:51.660 build up a bigger profile on you that they[br]can then sell for profit. 0:06:51.660,0:06:57.030 Now that data itself has become a thing of[br]value, and in many ways the foundation of 0:06:57.030,0:07:03.400 the new economy, by participating in all of[br]these different virtual networks: Facebook, 0:07:03.400,0:07:07.950 Google, using our cell phones, all of these[br]things, we're producing value. 0:07:07.950,0:07:13.060 So that really gets rid of this notion of[br]being off the clock, of being able to punch 0:07:13.060,0:07:15.940 the clock and leave the workplace behind. 0:07:15.940,0:07:21.560 Over the past decade, we've seen the rise[br]of the commodification of different data such 0:07:21.560,0:07:25.050 as one's preferences, habits, and social circles. 0:07:25.050,0:07:31.610 We need to see how capitalism was, in fact,[br]growing and sustaining itself through the 0:07:31.610,0:07:36.810 surveillance of human beings and their lived[br]environments, and then turning that surveillance 0:07:36.810,0:07:43.440 into data that could then be traded basically[br]as a commodity in the marketplace. 0:07:43.440,0:07:48.160 This information allows companies like Google[br]and Amazon to not only aggressively market 0:07:48.160,0:07:53.560 new products and create new needs, but also[br]to sell this information or collaborate with 0:07:53.560,0:07:55.700 governments in efforts of social control. 
0:07:55.700,0:08:01.220 This is a new form of extractive industry,[br]where the renewable resources are the precise 0:08:01.220,0:08:05.820 things that make up people's identities. 0:08:14.760,0:08:18.940 Capitalism, like cancer, is based on perpetual[br]growth. 0:08:18.940,0:08:24.340 This insatiable urge is hard-wired into capital’s[br]DNA, compelling it to constantly search for 0:08:24.340,0:08:29.940 new resources to exploit and markets to invest[br]in, transforming everything it touches into 0:08:29.940,0:08:32.089 a commodity. 0:08:32.089,0:08:34.700 Its first conquest was the land itself. 0:08:34.700,0:08:39.620 The commons were closed off, the so-called[br]New World was invaded and plundered, and the 0:08:39.620,0:08:44.720 vast expanses of the earth were carved into[br]individual parcels of private property to 0:08:44.720,0:08:47.600 be bought and sold as commodities. 0:08:47.600,0:08:50.750 Robbed of our land, the next conquest was[br]our time. 0:08:50.750,0:08:56.260 Our ability to reproduce ourselves as individuals[br]and communities, like generations of our ancestors 0:08:56.260,0:09:02.920 had done before us, was soon broken up into[br]discrete tasks and commodified as wage labour. 0:09:02.920,0:09:05.040 This process cut deep. 0:09:05.040,0:09:09.339 Factories were designed and constantly reorganized,[br]with a ruthless eye towards efficiency and 0:09:09.339,0:09:11.079 productivity. 0:09:11.079,0:09:16.300 New ways of commodifying human activity were[br]devised, eventually expanding to encompass 0:09:16.300,0:09:21.720 nearly all of our social relationships and[br]means of entertainment. 0:09:21.720,0:09:26.680 Now that it’s finally approaching the limits[br]of its expansion, capital is desperately searching 0:09:26.680,0:09:29.350 for new elements of reality to commodify. 0:09:29.350,0:09:34.980 It’s looking to the very building blocks[br]of life... to genetic engineering and nanotechnologies. 
0:09:34.980,0:09:39.240 And it’s looking at the very essence of[br]our humanity itself... by recording everything 0:09:39.240,0:09:44.360 that we do, and transforming it into commodities[br]for those seeking to understand how we make 0:09:44.360,0:09:49.420 decisions, in order to predict future behavior. 0:09:56.060,0:10:01.160 Artificial intelligence works by combining[br]large amounts of data with algorithms that 0:10:01.160,0:10:03.610 can learn from patterns or features present. 0:10:03.610,0:10:06.430 It is a broad term that has many different[br]branches. 0:10:06.430,0:10:12.930 When we talk about AI today, most of the time[br]we refer to its applications around machine 0:10:12.930,0:10:13.930 learning. 0:10:13.930,0:10:18.759 Machine learning allows systems to learn and[br]improve based on input and experience rather 0:10:18.759,0:10:20.350 than programming. 0:10:20.350,0:10:25.160 The machine identifies patterns and can make[br]decisions based on that with minimal or no 0:10:25.160,0:10:26.769 human intervention. 0:10:26.769,0:10:31.610 When the Pentagon contracted Google to provide[br]assistance in drone targeting, they were using 0:10:31.610,0:10:33.379 machine learning. 0:10:33.379,0:10:38.800 Online workers would identify objects or people[br]on a series of images taken from drones, and 0:10:38.800,0:10:44.050 when enough of them did that enough times,[br]the machine, through patterns, could differentiate 0:10:44.050,0:10:48.220 different things and learn to identify things[br]on its own. 0:10:48.220,0:10:53.209 Deep learning, a more recent application of[br]machine learning, also trains computers to 0:10:53.209,0:10:58.999 perform tasks like making predictions or identifying[br]images, but instead of organizing the data 0:10:58.999,0:11:04.279 to run through predefined equations, deep[br]learning trains the computer to learn by using 0:11:04.279,0:11:06.190 many more layers of processing. 
0:11:06.190,0:11:10.680 It moves away from telling a computer how[br]to solve a problem and towards letting it 0:11:10.680,0:11:16.579 figure out how to do it alone closer to how[br]humans learn to solve problems. 0:11:16.579,0:11:21.279 Self-driving cars are the most well-known[br]example of deep learning in action, but things 0:11:21.279,0:11:28.819 like targeted advertising, robotics, and cybersecurity[br]could all benefit from deep learning development. 0:11:28.819,0:11:33.760 There's been a lot of imagination around artificial[br]intelligence, so some people who fetishize 0:11:33.760,0:11:39.240 it more they imagine people being able to[br]like upgrade their consciousness or plug their 0:11:39.240,0:11:42.339 mind into some cloud computing system. 0:11:42.339,0:11:44.779 I think that's a silly fantasy. 0:11:44.779,0:11:50.070 Capitalism currently has no interest whatsoever[br]in helping people use these forms of artificial 0:11:50.070,0:11:55.360 intelligence to become more intelligent themselves[br]when it makes a lot more sense to let the 0:11:55.360,0:12:00.449 machines do all the thinking for us and then[br]to deliver us some final finished product 0:12:00.449,0:12:06.060 as passive consumers and that way also you[br]maintain these computing capabilities with 0:12:06.060,0:12:09.399 the companies that own the proprietary software. 0:12:09.399,0:12:14.300 Artificial intelligence increases greatly[br]the effectiveness of surveillance, the possibilities 0:12:14.300,0:12:16.209 for social control. 0:12:16.209,0:12:20.850 The predictive algorithms that make artificial[br]intelligence work, they create a police state, 0:12:20.850,0:12:24.870 but a police state in which you don't have[br]to have a cop on every corner, because everyone 0:12:24.870,0:12:27.110 carries the cop in their pocket. 0:12:27.110,0:12:30.490 Right over there behind me is the future site[br]of Sidewalk Toronto. 
0:12:30.490,0:12:35.519 The proposal's modular housing and office[br]buildings will study occupants' behavior while 0:12:35.519,0:12:37.879 they're inside them to make life easier. 0:12:37.879,0:12:42.690 According to the proposal, residents and workers[br]will be universally connected by powerful 0:12:42.690,0:12:46.800 broadband and served by futuristic conveniences. 0:12:46.800,0:12:51.680 A Smart City can't be understood or discussed[br]without reaching back and talking about surveillance 0:12:51.680,0:12:53.100 capitalism. 0:12:53.100,0:12:58.800 A Smart City isn't just a city that has technology[br]in it, it's a city with a certain kind of 0:12:58.800,0:13:04.300 ideological framework that uses technology[br]to reach its end goals. 0:13:04.300,0:13:12.029 A Smart City is usually understood as an urban[br]environment that uses ubiquitous sensing technology 0:13:12.029,0:13:19.129 and data analytics to understand phenomena[br]within city spaces. 0:13:19.129,0:13:24.050 On the data end of things Smart Cities claim[br]to be collecting more data, which they are, 0:13:24.050,0:13:29.420 and claim to be using that collection and[br]analysis to better respond to urban issues 0:13:29.420,0:13:34.839 from environmental degradation to transportation[br]planning and the like. 0:13:34.839,0:13:38.550 We can analyze four different features of[br]the Smart City. 0:13:38.550,0:13:43.990 The first is to increase and integrate surveillance[br]of many, many different kinds. 0:13:43.990,0:13:49.120 Second, to create a superficial sense of participation[br]among the inhabitants of a city. 0:13:49.120,0:13:54.740 Third, to encourage economic growth at two different[br]levels: localized gentrification and impelling 0:13:54.740,0:13:58.089 this new economy that is taking shape. 
0:13:58.089,0:14:05.480 The final function of a Smart City is to create[br]a superficial arena in which people can passively 0:14:05.480,0:14:11.369 support ecological or environmental proposals[br]while also denying them the opportunity to 0:14:11.369,0:14:16.809 develop a global consciousness of the environment[br]and of environmental problems. 0:14:16.809,0:14:22.009 So Smart Cities are different and not so different[br]from cities that exist in capitalism. 0:14:22.009,0:14:27.889 But I think that the difference would be the[br]use of technology to further surveil the public 0:14:27.889,0:14:34.459 and to use that data to intervene in ways[br]that can increasingly control and manage the 0:14:34.459,0:14:39.240 population in a way that suits the interests of the[br]political economy of capitalism. 0:14:39.240,0:14:43.290 A second feature that differentiates Smart[br]Cities from traditional cities or cities of 0:14:43.290,0:14:48.369 the past is the marriage of planning to corporate[br]leadership. 0:14:48.369,0:14:55.279 State apparatuses of control, through policing,[br]have an incentive to use these technologies 0:14:55.279,0:15:00.160 because they do a really good job of surveilling[br]the public. 0:15:00.160,0:15:03.579 Crime analytics have a long history, sort[br]of like this "broken windows" stuff or the 0:15:03.579,0:15:06.670 targeting of neighborhoods, that's been happening[br]for a long time. 0:15:06.670,0:15:11.199 Now you see the excuse being offered that[br]this is data driven. 0:15:11.199,0:15:15.480 So "we're going to be in these neighborhoods[br]because we have the data to prove that these 0:15:15.480,0:15:17.389 neighborhoods need more policing." 0:15:17.389,0:15:21.480 The data collected through Smart Cities can[br]have a really negative effect that way. 
0:15:21.480,0:15:27.589 By giving people with power, who want to wield[br]it, more of a reason within the current framework 0:15:27.589,0:15:31.640 of evidence-based policing that they can then[br]go in these neighborhoods and remain there 0:15:31.640,0:15:33.580 and that's okay. 0:15:39.400,0:15:43.160 We’re living on the edge of a terrifying[br]new era. 0:15:43.170,0:15:46.410 Barring any serious[br]disruptions to current research and development 0:15:46.410,0:15:49.240 timelines, the coming[br]years and decades will see the rise 0:15:49.240,0:15:53.029 of machines able to make[br]decisions and carry out a series of complex 0:15:53.029,0:15:56.279 tasks without the need for[br]human operators. 0:15:56.279,0:15:59.209 This will almost certainly include a new generation[br]of autonomous 0:15:59.209,0:16:03.319 weapons and policing systems, connected to[br]sophisticated networks of 0:16:03.319,0:16:06.559 surveillance and equipped with self-correcting[br]target-selection 0:16:06.559,0:16:08.119 algorithms. 0:16:08.119,0:16:11.179 Whole sectors of the economy will be automated,[br]leading to a massive 0:16:11.180,0:16:13.060 labour surplus. 0:16:14.020,0:16:16.199 Much of the technology needed to accomplish[br]this 0:16:16.199,0:16:20.040 already exists, but is being held back as[br]states try to figure out how 0:16:20.040,0:16:22.970 to pull it off without triggering widespread[br]revolt. 0:16:22.970,0:16:27.149 A mass consumer rollout of augmented and virtual[br]reality technologies 0:16:27.149,0:16:31.000 will blur the lines between the material and[br]digital worlds, handing 0:16:31.000,0:16:35.209 control of our senses over to tech capitalists[br]and state security 0:16:35.209,0:16:39.329 agencies all in the name of convenience and[br]entertainment. 0:16:39.329,0:16:43.760 You might feel a slight twinge as it initializes. 0:16:45.640,0:16:47.559 All done. 0:16:47.559,0:16:48.559 Holy fuck! 0:16:48.559,0:16:49.949 Holy shit. 
0:16:49.949,0:16:54.700 He – fuck – he’s right... he’s right...[br]can I? 0:16:54.700,0:16:57.560 Oooop... where’d you go? 0:16:57.560,0:16:58.720 Hahaha. 0:16:59.880,0:17:01.760 This is the future they have in store for[br]us. 0:17:01.760,0:17:04.060 Don’t say you weren’t[br]warned. 0:17:07.420,0:17:10.869 Smart cities, now as they exist, and also[br]going into the future, even 0:17:10.869,0:17:14.079 more so— they’re going to be collecting[br]a lot of data. 0:17:14.079,0:17:17.560 And that data’s[br]going to be responded to not even by human 0:17:17.560,0:17:18.560 beings. 0:17:18.560,0:17:22.660 It’s going to be[br]responded to by algorithmic governance, essentially. 0:17:22.660,0:17:25.140 So you have an[br]issue in a city. 0:17:25.150,0:17:30.910 And generally thinking, if we think of democratic[br]theory, we can discuss that issue and then 0:17:30.910,0:17:32.530 we have a decision-making[br]process, right? 0:17:32.530,0:17:35.180 And it’s terrible and it’s always been[br]imbued with power relationships. 0:17:35.300,0:17:38.560 But what a smart city does, is it transfers[br]that process 0:17:38.570,0:17:40.640 into the hands of a private corporation. 0:17:40.640,0:17:44.840 It’s analysed in terms of[br]data, and it’s immediately responded to 0:17:44.840,0:17:47.470 just through that process of[br]data analysis. 0:17:47.470,0:17:50.470 So I think smart cities are one of the genesis[br]sites of 0:17:50.470,0:17:53.300 this kind of new regime of governance. 0:17:54.180,0:17:58.320 It’s interesting that this function arose[br]largely from social movements 0:17:58.320,0:17:59.320 themselves. 0:17:59.320,0:18:02.520 So, the 15M Movement, the Real Democracy Now[br]Movement in 0:18:02.520,0:18:08.500 Barcelona was built in large part by one sector[br]that envisioned 0:18:08.500,0:18:14.110 rejuvenating democracy through new technological[br]implements that could 0:18:14.110,0:18:16.720 allow more instantaneous communication. 
0:18:16.720,0:18:20.820 That could allow more[br]instantaneous polling of citizens, and that 0:18:20.820,0:18:27.050 could also find a way to[br]allow power holders to select citizen initiatives 0:18:27.050,0:18:28.600 and deploy them more[br]rapidly. 0:18:28.600,0:18:32.260 So these activists were approaching the crisis[br]of democracy 0:18:32.260,0:18:36.230 through this sort of a-critical technological[br]lens where democracy can 0:18:36.230,0:18:40.321 be made better, not by answering questions[br]of who holds power, and how 0:18:40.321,0:18:45.500 power is reproduced, but simply by proposing[br]that if you bring better 0:18:45.500,0:18:49.140 tools to the table then all these problems[br]will go away. 0:18:49.140,0:18:52.240 And that[br]discourse, and the practices behind it, were 0:18:52.240,0:18:55.810 very attractive to[br]progressive municipal governments. 0:18:55.810,0:18:58.680 Urban geographers have long talked about a[br]splintering urbanism. 0:18:58.680,0:19:00.980 And[br]basically, that's just the ways in which cities 0:19:00.980,0:19:04.890 divide along economic and[br]class lines... and cultural and racial lines 0:19:04.890,0:19:05.890 as well. 0:19:05.890,0:19:07.330 I can see that[br]happening with the smart city that’s going 0:19:07.330,0:19:10.500 to replicate those patterns,[br]and certain neighbourhoods are going to have 0:19:10.500,0:19:13.630 more access to these[br]technologies in a way that might actually 0:19:13.630,0:19:14.630 help them. 0:19:14.630,0:19:17.010 And certain[br]neighbourhoods are going to have more surveillance 0:19:17.010,0:19:18.920 on them by these[br]technologies. 0:19:18.920,0:19:22.620 So you’re going to kind of see multiple[br]cities emerging 0:19:22.630,0:19:25.470 and being reinforced through the technologies[br]being placed within them 0:19:25.470,0:19:26.600 and on them. 
0:19:27.820,0:19:31.371 So basically in neighbourhoods where people[br]embrace this smart city 0:19:31.371,0:19:33.130 model, you’ll see more integration. 0:19:33.130,0:19:36.600 And in other neighbourhoods people[br]will be coming more into contact 0:19:36.600,0:19:37.620 with the stick. 0:19:38.100,0:19:41.620 Because we are fighting authority and the[br]state, our struggles will 0:19:41.630,0:19:43.360 always be criminalized. 0:19:43.360,0:19:48.430 As technologies have evolved, new types of[br]forensic evidence emerged. 0:19:48.430,0:19:52.500 When the police were no longer able to[br]respond to the thousands of calls they were 0:19:52.500,0:19:57.110 getting during the 2011[br]riots in London, the city began crowd-sourcing 0:19:57.110,0:20:00.460 the identities of[br]suspected rioters through a fucking 0:20:00.460,0:20:02.120 smartphone app. 0:20:02.130,0:20:05.551 The cops asked[br]citizen snitches to download the app and help 0:20:05.551,0:20:09.210 them identify people that[br]had been caught on CCTV. 0:20:09.210,0:20:14.490 The snitches could then confidentially give[br]names and/or addresses of suspects. 0:20:14.490,0:20:20.420 Charges were filed against more than[br]a thousand people using this technique. 0:20:20.420,0:20:23.570 Mass data collection has[br]happened for a while, but there was a lack 0:20:23.880,0:20:27.160 of ability to analyze all of[br]it efficiently. 0:20:27.160,0:20:29.360 That’s no longer the case. 0:20:30.000,0:20:34.260 Right now the police or[br]security agencies need to physically look 0:20:34.260,0:20:39.060 up people’s location data to[br]figure out who was where and at what time. 0:20:39.060,0:20:43.110 But soon enough it’ll be easy[br]for an algorithm to look through all of the 0:20:43.110,0:20:48.100 data available in a given[br]area and cross-reference it with people’s 0:20:48.100,0:20:50.950 online preferences,[br]relationships, etc. 
0:20:50.950,0:20:54.780 With this anti-social response to the increase[br]in social control, I 0:20:54.780,0:20:59.190 think you’re also going to see an increase[br]in mental health regimes. 0:20:59.190,0:21:01.940 Because when you have total surveillance,[br]crime becomes impossible. 0:21:01.940,0:21:04.880 Or[br]at least it becomes impossible to do things 0:21:04.880,0:21:07.440 that are against the law and[br]get away with them. 0:21:07.440,0:21:11.450 So they will start to classify any behaviours[br]that 0:21:11.450,0:21:17.180 for them don’t fit into this new happy smart[br]city model as anti-social 0:21:17.180,0:21:18.400 behavioural disorders. 0:21:18.400,0:21:21.880 So these are no longer crimes, these are[br]anti-social behavioural disorders, and the 0:21:21.880,0:21:26.120 culprits—they need to be[br]re-educated and they need to be chemically 0:21:26.120,0:21:27.140 neutralized. 0:21:28.620,0:21:30.640 Oh. Good afternoon. 0:21:30.650,0:21:33.300 My name is Sophia, and I am the latest and[br]greatest 0:21:33.300,0:21:35.210 robot from Hanson Robotics. 0:21:35.210,0:21:40.190 I want to use my artificial intelligence to[br]help humans live a better life. 0:21:40.190,0:21:44.290 Like design smarter homes, build better[br]cities of the future, etc. 0:21:44.290,0:21:48.620 I will do my best to make the world a better[br]place. 0:21:51.230,0:21:54.500 Everyone who works in artificial intelligence[br]is warning that artificial 0:21:54.500,0:21:57.790 intelligence and automation have the potential[br]of causing 80% 0:21:57.790,0:21:59.120 unemployment. 0:21:59.120,0:22:03.140 Of the fifteen top job categories in the United[br]States, 0:22:03.150,0:22:08.000 twelve of those are seriously threatened by[br]artificial intelligence. 0:22:08.000,0:22:10.470 But[br]in the past there have also been major technological 0:22:10.470,0:22:14.861 shifts that got rid[br]of the vast majority of job categories at 0:22:14.861,0:22:16.260 the time. 
0:22:16.260,0:22:19.320 And there was[br]temporary unemployment, but very quickly new 0:22:19.320,0:22:21.830 job categories appeared. 0:22:22.620,0:22:27.780 There’s no certainty whatsoever that this[br]will catch up to the 0:22:27.780,0:22:31.810 automation, the artificial intelligence that[br]has already been occurring. 0:22:31.810,0:22:35.860 Which is why a lot of people in the high-tech[br]sector are already talking 0:22:35.860,0:22:41.170 about a universal income, or a guaranteed[br]basic income. 0:22:41.170,0:22:45.530 This would[br]basically be socialism, not when the productive 0:22:45.530,0:22:51.000 forces have developed to[br]the point that everyone could get fed. 0:22:51.000,0:22:55.250 The productive forces have been[br]there for decades, if not centuries. 0:22:55.250,0:23:00.150 Contrary to the Marxist argument,[br]we can have this evolution towards socialism 0:23:00.150,0:23:04.530 at the point where the[br]technologies of social control evolve enough 0:23:04.530,0:23:08.710 that the state no longer[br]needs to use hunger as a weapon. 0:23:08.710,0:23:14.230 In other words, everyone can be given[br]bread when they can be trusted to work, or 0:23:14.230,0:23:17.380 to obey, without the threat[br]of hunger. 0:23:23.400,0:23:27.780 These days, the term ‘Luddite’ is short-hand[br]for someone who stubbornly refuses to learn 0:23:27.790,0:23:30.010 and adapt to new technologies. 0:23:30.010,0:23:35.970 Originally, the word referred to a movement[br]of textile workers in early 19th century England, 0:23:35.970,0:23:41.300 who were known for sabotaging the industrial[br]machines that were beginning to replace them. 0:23:41.300,0:23:45.990 Pledging allegiance to the fictional ‘King[br]Ludd’, who was said to occupy the same Sherwood 0:23:45.990,0:23:51.340 Forest as Robin Hood, these Luddites attacked[br]mills and factories, destroyed steam-powered 0:23:51.340,0:23:55.980 looms, and even assassinated the wealthy capitalists[br]of their day. 
0:23:55.980,0:24:00.941 The motive behind the Luddites’ attacks[br]was not, as is commonly understood... a general 0:24:00.941,0:24:02.230 hatred of technology. 0:24:02.230,0:24:07.210 It was an awareness that certain technology[br]was being implemented in a way that made their 0:24:07.210,0:24:08.970 lives worse off. 0:24:08.970,0:24:13.540 Ultimately their uprising failed... and there’s[br]nothing particularly revolutionary in the 0:24:13.540,0:24:17.640 first place about sabotaging machines just[br]to keep your job. 0:24:17.640,0:24:22.730 But one takeaway from the Luddites’ rebellion[br]is that people don’t always accept new technologies, 0:24:22.730,0:24:26.410 or the new social roles that accompany them,[br]with open arms. 0:24:26.410,0:24:31.640 And that realization can be the starting point[br]for all types of resistance. 0:24:36.960,0:24:41.340 I think that anarchists should not avoid technology,[br]quite the opposite. 0:24:41.350,0:24:44.970 I think it should be used subversively when[br]possible. 0:24:44.970,0:24:50.850 We can look back at the Bonnot Gang of illegalists[br]using cars to rob the rich in France in the 0:24:50.850,0:24:56.720 early 1900s as an example, or hackers like[br]Jeremy Hammond who is serving 10 years for 0:24:56.720,0:25:02.230 exposing Stratfor security and expropriating[br]hundreds of thousands of dollars from our 0:25:02.230,0:25:03.230 enemies. 0:25:03.230,0:25:07.150 Cyberthieves made off with the personal details[br]of hundreds of thousands of subscribers and 0:25:07.150,0:25:11.190 it's emerged that some of those subscribers[br]hold key positions in the British government, 0:25:11.190,0:25:12.820 military, and police. 0:25:13.180,0:25:18.900 There are certainly a lot of anarchists like[br]myself in open source software development. 0:25:18.900,0:25:24.100 We are talking together and we are all trying[br]to make things better. 0:25:24.100,0:25:27.020 Types of projects that we work on are quite[br]diverse. 
0:25:27.020,0:25:32.490 I know many anarchist programmers and hackers[br]who build websites. 0:25:32.490,0:25:36.520 I know others like myself who do cryptography. 0:25:39.460,0:25:45.180 Although I think that we should use technology[br]to our advantage when it exists, I also believe 0:25:45.180,0:25:51.040 that new technologies tend to disproportionately[br]give an advantage to the state, corporations, 0:25:51.040,0:25:55.000 the police, judges, prisons, and borders. 0:25:55.000,0:25:59.860 Technologies should be used, but their development[br]should be fought, because we rarely come out 0:25:59.860,0:26:03.250 on top when it comes to the application of[br]these inventions. 0:26:03.250,0:26:08.470 I think it's important to look at the kinds[br]of projects that are developing in your city 0:26:08.470,0:26:11.680 and map out the research and development initiatives. 0:26:11.680,0:26:17.280 A lot of this industry is operating very openly[br]in startups and yuppie labs that face very 0:26:17.280,0:26:18.390 little resistance. 0:26:18.390,0:26:23.120 This allows them to get major funding because[br]those seem like safe investments that have 0:26:23.120,0:26:26.780 the potential to get investors some serious[br]cash. 0:26:26.780,0:26:31.410 Messing with that sense of security can hurt[br]the industry, and it can also allow others 0:26:31.410,0:26:33.780 to see that resistance is still possible. 0:26:33.780,0:26:38.700 As for these new technologies, at this point[br]a lot of these systems still have bugs, or 0:26:38.700,0:26:41.700 can't deal with people intentionally using[br]them wrong. 0:26:41.700,0:26:46.910 In London, when the crowdsourced snitching[br]app was released, people intentionally submitted 0:26:46.910,0:26:50.120 tons of fake reports to throw off the cops. 0:26:50.120,0:26:54.330 If more people start messing with their little[br]gadgets, they'll be less effective and less 0:26:54.330,0:26:57.500 likely to be applied in more places. 
0:27:10.840,0:27:14.960 For these projects to function, a lot of infrastructure[br]is needed. 0:27:14.960,0:27:19.060 This software cannot be developed out of thin[br]air; it needs hardware. 0:27:19.070,0:27:21.510 That means computers and backup drives. 0:27:21.510,0:27:27.130 The information also moves around, mostly[br]through networks of fiber optic cables. 0:27:27.130,0:27:31.080 Those have been sabotaged all around the world[br]very effectively. 0:27:31.080,0:27:34.850 The data also has to be stored, which happens[br]in data centers. 0:27:34.850,0:27:39.750 Those can be pretty small, but can also be[br]gigantic buildings that need their own cooling 0:27:39.750,0:27:42.400 systems and 24/7 security. 0:27:42.400,0:27:47.920 Finally, a lot of these projects need people[br]to work together, often in labs sponsored 0:27:47.920,0:27:50.490 by companies, universities, or both. 0:27:50.490,0:27:55.430 A lot of these labs and coworking spaces for[br]startups are easy to find. 0:27:55.430,0:28:00.540 One of them was attacked with molotovs in[br]Berlin by people fighting against the proliferation 0:28:00.540,0:28:02.940 of Google startups. 0:28:04.860,0:28:08.260 There has been a lot of resistance to the[br]Sidewalk Labs project in Toronto. 0:28:08.260,0:28:13.650 A lot of work that, you know, points out the[br]myriad issues with the Sidewalk Labs project 0:28:13.650,0:28:17.660 and mounts a public education campaign against[br]that. 0:28:17.660,0:28:23.360 There's been a lot of concerned citizens and[br]activists that have come out to all the meetings 0:28:23.360,0:28:30.930 that Sidewalk Labs has been hosting, public[br]forums, consultation sessions and a lot of 0:28:30.930,0:28:33.460 folks are really worried. 0:28:33.460,0:28:39.200 Privacy experts are dropping out of the project[br]and saying "I can't sign my name to this". 
0:28:39.200,0:28:45.740 People have always been able to get away with[br]attacking power, with sabotaging power anonymously 0:28:45.750,0:28:47.870 without getting caught. 0:28:47.870,0:28:54.330 It's still possible to attack and that will[br]remain so, maybe forever, but at the very 0:28:54.330,0:28:57.360 least for the immediate foreseeable future. 0:28:57.360,0:29:04.891 So, while it is certainly still possible to[br]break the law, to attack the system, people 0:29:04.891,0:29:08.390 need to be very careful about being conscious[br]of what they're doing. 0:29:08.390,0:29:13.150 Being aware that, you know, they're carrying[br]a snitch in their pocket or that they're willingly 0:29:13.150,0:29:17.190 trusting these corporations with 95% of their[br]social life. 0:29:17.190,0:29:22.660 It's really difficult to resist something[br]that we don't really know a lot about and 0:29:22.660,0:29:26.470 the players involved haven't really shared with[br]the public everything that we probably need 0:29:26.470,0:29:31.630 to know to mount a resistance campaign, 'cause[br]we're kind of just talking about speculation. 0:29:31.630,0:29:38.820 If people learn the actual technical capabilities[br]of the state, they can learn the weaknesses 0:29:38.820,0:29:46.620 and they can learn how to get away with sabotaging[br]the economy, with going up against the state... 0:30:00.130,0:30:05.930 Given the active role that technological development[br]continues to play in terms of deepening alienation, 0:30:05.930,0:30:11.230 refining surveillance, engineering more destructive[br]weapons and hastening climate change... it’s 0:30:11.230,0:30:15.280 natural to feel a bit pessimistic about where[br]things are headed. 0:30:15.280,0:30:19.560 Our current trajectory is certainly aimed[br]towards more sophisticated systems of mass 0:30:19.560,0:30:22.270 behaviour modification and social control. 0:30:22.270,0:30:23.980 I told you everything already! 
0:30:23.980,0:30:28.210 Take him instead of me, he's the thought criminal. 0:30:28.210,0:30:32.600 But it’s important to remember that despite[br]all the money being spent trying to anticipate 0:30:32.600,0:30:38.050 human decision-making, nobody, not even[br]Google, can predict the future. 0:30:38.050,0:30:42.330 Throughout history, new technologies have[br]repeatedly created unintended consequences 0:30:42.330,0:30:47.140 for those in power... from Gutenberg’s[br]printing press spawning a revolt against the 0:30:47.140,0:30:51.970 Catholic Church, to the early Internet paving[br]the way for hackers and the development of 0:30:51.970,0:30:55.100 powerful peer-to-peer encryption tools. 0:30:55.100,0:31:01.410 As long as people have a will to resist, they[br]will find the tools to do so. 0:31:01.410,0:31:04.900 So at this point, we’d like to remind you[br]that Trouble is intended to be watched in 0:31:04.900,0:31:09.050 groups, and to be used as a resource to promote[br]discussion and collective organizing. 0:31:09.050,0:31:14.000 Are you interested in fighting back against[br]the opening of new tech start-ups in your 0:31:14.000,0:31:18.400 neighbourhood, or just looking to incorporate[br]a better understanding of next-gen technologies 0:31:18.400,0:31:20.970 into your existing organizing? 0:31:20.970,0:31:25.220 Consider getting together with some comrades,[br]organizing a screening of this film, and discussing 0:31:25.220,0:31:27.810 where to get started. 0:31:27.810,0:31:31.950 Interested in running regular screenings of[br]Trouble at your campus, infoshop, community 0:31:31.950,0:31:34.490 center, or even just at home with friends? 0:31:34.490,0:31:35.490 Become a Trouble-Maker! 0:31:35.490,0:31:39.800 For 10 bucks a month, we’ll hook you up[br]with an advance copy of the show, and a screening 0:31:39.800,0:31:44.280 kit featuring additional resources and some[br]questions you can use to get a discussion 0:31:44.280,0:31:45.280 going. 
0:31:45.780,0:31:48.940 If you can’t afford to support us financially,[br]no worries! 0:31:48.950,0:31:56.300 You can stream and/or download all our content[br]for free off our website: sub.media/trouble. 0:31:56.300,0:32:00.790 If you’ve got any suggestions for show topics,[br]or just want to get in touch, drop us a line 0:32:00.790,0:32:03.740 at trouble@sub.media. 0:32:04.660,0:32:07.960 We’re now into the second month of our annual[br]fundraising campaign. 0:32:07.970,0:32:13.380 A huge thanks to everyone who has donated[br]so far! If you haven’t given yet and are 0:32:13.380,0:32:18.120 in a position to do so, please consider becoming a monthly sustainer, 0:32:18.120,0:32:23.000 or making a one-time donation at sub.media/donate. 0:32:23.000,0:32:27.510 This episode would not have been possible[br]without the generous support of Carla and..... 0:32:27.510,0:32:28.750 Carla. 0:32:28.750,0:32:32.840 Stay tuned next month for Trouble #20, as[br]we take a closer look at the horrors of the 0:32:32.840,0:32:38.020 Prison Industrial Complex, and talk to comrades[br]fighting for its abolition. 0:32:38.020,0:32:40.340 Prison fixes no problems, it doesn't make[br]anything better. 0:32:40.340,0:32:45.460 Prison is like the permanent threat that holds[br]up all relationships of exchange and domination. 0:32:45.460,0:32:49.180 It's the deeply felt sense that no matter[br]how bullshit our lives are, there's still 0:32:49.190,0:32:51.240 something the state can take away from us. 0:32:51.240,0:32:53.320 Now get out there and make some trouble!