Trouble 19: Quiet Storm

  • 0:08 - 0:13
    The history of humanity is a long procession
    of technological development.
  • 0:13 - 0:17
    Much of what we know about our ancestors,
    and the types of lives they lived, comes from
  • 0:17 - 0:20
    our limited knowledge of the tools they used.
  • 0:20 - 0:24
    Over the centuries and millennia, these tools
    have become more sophisticated.
  • 0:24 - 0:28
    From the dawn of agriculture, to today’s
    cutting-edge advancements in bio-engineering,
  • 0:28 - 0:36
    an untold number of mostly unknown individuals
    have made countless improvements and innovations.
  • 0:36 - 0:40
    These people, and the tools they created over
    time, have fundamentally altered our way of
  • 0:40 - 0:43
    interacting with the world, and each other.
  • 0:43 - 0:49
    The pace of these technological changes picked
    up considerably with the rise of capitalism.
  • 0:49 - 0:53
    The development of new tools, weapons and
    production techniques had always been shaped
  • 0:53 - 0:58
    by the practical needs, culture and political
    structure of a given society.
  • 0:58 - 1:05
    But wherever capitalism spread, it worked
    to chip away at these local and regional differences,
  • 1:05 - 1:09
    replacing them with the universal value of
    ‘progress’... a watchword for the pursuit
  • 1:09 - 1:15
    of economic growth through the organization
    of human activity, under a framework of universal
  • 1:15 - 1:16
    competition.
  • 1:16 - 1:21
    The industrial age has been marked by three
    successive revolutions, each characterized
  • 1:21 - 1:25
    by inventions that changed the entire technological
    playing field.
  • 1:25 - 1:29
    First came the harnessing of steam, a feat
    that allowed for the development of early
  • 1:29 - 1:34
    factories and the construction of vast railway
    networks.
  • 1:34 - 1:38
    Second came the mastering of petroleum, which
    fuelled the development of modern cities,
  • 1:38 - 1:43
    mass industrial manufacturing, and the horrors
    of two World Wars.
  • 1:43 - 1:48
    Third came the networked personal computer,
    a device that has thoroughly transformed nearly
  • 1:48 - 1:50
    every aspect of modern life.
  • 1:50 - 1:56
    Today humanity stands poised on the threshold
    of the Fourth Industrial Revolution, a wide-reaching
  • 1:56 - 2:02
    social transformation expected to be characterized
    by advances in the fields of robotics, quantum
  • 2:02 - 2:06
    computing, artificial intelligence, and 3D-printing.
  • 2:06 - 2:10
    Over the next thirty minutes, we’ll speak
    with a number of individuals as they break
  • 2:10 - 2:15
    down some of these latest trends... and discuss
    how these systems are being fused together
  • 2:15 - 2:18
    to design new regimes of totalitarian surveillance
    and control.
  • 2:18 - 2:23
    Along the way, we’ll discuss some of the
    countermeasures that people are taking to
  • 2:23 - 2:29
    thwart these systems, by constructing open
    source alternatives, sabotaging infrastructure...
  • 2:29 - 2:32
    and making a whole lotta Trouble.
  • 2:56 - 3:02
    Technology is the reproduction of a human
    society as seen through a technical lens.
  • 3:02 - 3:06
    It's the specific how of social reproduction.
  • 3:06 - 3:13
    Any analysis of technology is by nature contextual,
    especially when one intends to portray it
  • 3:13 - 3:20
    as an essential, fundamental question - whether
    that's an attempt to say that technology
  • 3:20 - 3:26
    is essentially, or always, good - good in its
    own right - or to say that it's essentially
  • 3:26 - 3:28
    bad, or bad in its own right.
  • 3:31 - 3:34
    Technological developments do not happen in
    a vacuum.
  • 3:34 - 3:39
    They are heavily influenced by, and benefit,
    those who exert power over others.
  • 3:39 - 3:45
    I would argue that most technological advances
    made in our context work towards expanding
  • 3:45 - 3:50
    the state's ability to manage economic growth
    and social control.
  • 3:50 - 3:57
    Technology is based within political-economic
    systems and systems of power that come to
  • 3:57 - 3:59
    shape how those technologies can be used.
  • 3:59 - 4:05
    I think that more and more we're seeing, in
    terms of the geographic context that we're
  • 4:05 - 4:11
    located in here in North America, that technologies
    are being used to propel and buttress the
  • 4:11 - 4:14
    capitalist economic system.
  • 4:14 - 4:18
    An anarchist approach to technology has to
    take into account the authoritarian nature
  • 4:18 - 4:19
    of technology.
  • 4:19 - 4:21
    The fact that we don't really have a choice
    in the matter.
  • 4:21 - 4:25
    That all of these new developments, all of
    these new innovations, all of these new products
  • 4:25 - 4:28
    are coming down on us whether we like it or
    not.
  • 4:36 - 4:43
    Technology first and foremost isn't used to
    make our lives more fun, it's used to increase
  • 4:43 - 4:49
    economic exploitation and to increase the
    military power of the state.
  • 4:49 - 4:55
    Secondarily, if it can produce gadgets that
    can entertain us, like bread and circuses, those
  • 4:55 - 4:57
    will also be produced.
  • 4:57 - 5:02
    Technological changes over the last 10-15
    years have drastically eroded the division
  • 5:02 - 5:05
    between labour time and free time.
  • 5:05 - 5:11
    Currently we're always on call, we're always
    expected to be responsive to the needs of
  • 5:11 - 5:16
    the market, to the needs of our employers.
  • 5:16 - 5:23
    It's also led to an extreme increase in social
    alienation and emotional alienation masked
  • 5:23 - 5:25
    by extreme connectivity.
  • 5:25 - 5:30
    So quantifiably, people have more connections
    than ever, more friends than ever, but in
  • 5:30 - 5:36
    terms of the quality of those relationships,
    very few people have a large, strong network
  • 5:36 - 5:40
    they can actually confide in or that can actually
    support them.
  • 5:41 - 5:45
    Do you or do you not collect identifiers like
    name, age, and address?
  • 5:45 - 5:46
    Yes or no?
  • 5:46 - 5:50
    If you're creating an account, yes.
  • 5:50 - 5:55
    Specific search histories when a person types
    something into a search bar?
  • 5:55 - 5:57
    If you have search history turned on, yes.
  • 5:57 - 6:00
    Device identifiers like IP address or IMEI?
  • 6:01 - 6:04
    Uhm, depending on the situation we could be
    collecting it, yes.
  • 6:04 - 6:07
    GPS signals, wifi signals, bluetooth beacons?
  • 6:07 - 6:12
    It would depend on the specifics, but there
    may be situations, yes.
  • 6:12 - 6:13
    GPS yes?
  • 6:13 - 6:14
    Yes.
  • 6:14 - 6:16
    Contents of emails and Google documents?
  • 6:16 - 6:21
    We store the data but we don't read or look
    at your Gmail or -
  • 6:21 - 6:22
    But you have access to them?
  • 6:22 - 6:25
    Uh, as a company we have access to them, yes.
  • 6:25 - 6:26
    So you could!
  • 6:26 - 6:32
    Startups or huge corporations like Google
    are sucking up your data and storing it forever,
  • 6:32 - 6:36
    and they're grabbing way too much data about
    us - like everything you click on, everything
  • 6:36 - 6:41
    you like, everything your friends like, all
    of the people you know, where you're going,
  • 6:41 - 6:46
    everything about you - and they're storing it
    forever, sometimes even working together to
  • 6:46 - 6:52
    build up a bigger profile on you that they
    can then sell for profit.
  • 6:52 - 6:57
    Now that data itself has become a thing of
    value, and in many ways the foundation of
  • 6:57 - 7:03
    the new economy, by participating in all of
    these different virtual networks: Facebook,
  • 7:03 - 7:08
    Google, using our cell phones, all of these
    things, we're producing value.
  • 7:08 - 7:13
    So that really gets rid of this notion of
    being off the clock, of being able to punch
  • 7:13 - 7:16
    the clock and leave the workplace behind.
  • 7:16 - 7:22
    Over the past decade, we've seen the rise
    of the commodification of different data such
  • 7:22 - 7:25
    as ones preferences, habits, and social circles.
  • 7:25 - 7:32
    We need to see how capitalism was, in fact,
    growing and sustaining itself through the
  • 7:32 - 7:37
    surveillance of human beings and their lived
    environments, and then turning that surveillance
  • 7:37 - 7:43
    into data that could then be traded basically
    as a commodity in the marketplace.
  • 7:43 - 7:48
    This information allows companies like Google
    and Amazon to not only aggressively market
  • 7:48 - 7:54
    new products and create new needs, but also
    to sell this information or collaborate with
  • 7:54 - 7:56
    governments in efforts of social control.
  • 7:56 - 8:01
    This is a new form of extractive industry,
    where the renewable resource is the precise
  • 8:01 - 8:06
    things that make up people's identities.
  • 8:15 - 8:19
    Capitalism, like cancer, is based on perpetual
    growth.
  • 8:19 - 8:24
    This insatiable urge is hard-wired into capital’s
    DNA, compelling it to constantly search for
  • 8:24 - 8:30
    new resources to exploit and markets to invest
    in, transforming everything it touches into
  • 8:30 - 8:32
    a commodity.
  • 8:32 - 8:35
    Its first conquest was the land itself.
  • 8:35 - 8:40
    The commons were closed off, the so-called
    New World was invaded and plundered, and the
  • 8:40 - 8:45
    vast expanses of the earth were carved into
    individual parcels of private property to
  • 8:45 - 8:48
    be bought and sold as commodities.
  • 8:48 - 8:51
    Robbed of our land, the next conquest was
    our time.
  • 8:51 - 8:56
    Our ability to reproduce ourselves as individuals
    and communities, like generations of our ancestors
  • 8:56 - 9:03
    had done before us, was soon broken up into
    discrete tasks and commodified as wage labour.
  • 9:03 - 9:05
    This process cut deep.
  • 9:05 - 9:09
    Factories were designed and constantly reorganized,
    with a ruthless eye towards efficiency and
  • 9:09 - 9:11
    productivity.
  • 9:11 - 9:16
    New ways of commodifying human activity were
    devised, eventually expanding to encompass
  • 9:16 - 9:22
    nearly all of our social relationships and
    means of entertainment.
  • 9:22 - 9:27
    Now that it’s finally approaching the limits
    of its expansion, capital is desperately searching
  • 9:27 - 9:29
    for new elements of reality to commodify.
  • 9:29 - 9:35
    It’s looking to the very building blocks
    of life... to genetic engineering and nanotechnologies.
  • 9:35 - 9:39
    And it’s looking at the very essence of
    our humanity itself... by recording everything
  • 9:39 - 9:44
    that we do, and transforming it into commodities
    for those seeking to understand how we make
  • 9:44 - 9:49
    decisions, in order to predict future behavior.
  • 9:56 - 10:01
    Artificial intelligence works by combining
    large amounts of data with algorithms that
  • 10:01 - 10:04
    can learn from patterns or features present in the data.
  • 10:04 - 10:06
    It is a broad term that has many different
    branches.
  • 10:06 - 10:13
    When we talk about AI today, most of the time
    we refer to its applications around machine
  • 10:13 - 10:14
    learning.
  • 10:14 - 10:19
    Machine learning allows systems to learn and
    improve based on input and experience rather
  • 10:19 - 10:20
    than programming.
  • 10:20 - 10:25
    The machine identifies patterns and can make
    decisions based on that with minimal or no
  • 10:25 - 10:27
    human intervention.
  • 10:27 - 10:32
    When the Pentagon contracted Google to provide
    assistance in drone targeting, they were using
  • 10:32 - 10:33
    machine learning.
  • 10:33 - 10:39
    Online workers would identify objects or people
    on series of images taken from drones, and
  • 10:39 - 10:44
    when enough of them did that enough times,
    the machine, through patterns, could differentiate
  • 10:44 - 10:48
    different things and learn to identify things
    on its own.
  • 10:48 - 10:53
    Deep learning, a more recent application of
    machine learning, also trains computers to
  • 10:53 - 10:59
    perform tasks like making predictions or identifying
    images, but instead of organizing the data
  • 10:59 - 11:04
    to run through predefined equations, deep
    learning trains the computer to learn by using
  • 11:04 - 11:06
    many more layers of processing.
  • 11:06 - 11:11
    It moves away from telling a computer how
    to solve a problem and towards letting it
  • 11:11 - 11:17
    figure out how to do it on its own, closer to how
    humans learn to solve problems.
  • 11:17 - 11:21
    Self-driving cars are the most well-known
    example of deep learning in action, but things
  • 11:21 - 11:29
    like targeted advertising, robotics, and cybersecurity
    could all benefit from deep learning development.
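
The supervised-learning loop described above - human annotators label examples, the model finds patterns in them, and it then classifies new inputs with little or no human intervention - can be pictured with a minimal Python sketch. This is an editor's illustration, not code from the film or its interviewees; it assumes scikit-learn is available, its small digits dataset stands in for the labeled drone imagery, and the two hidden layers stand in for the "many more layers of processing" that deep learning stacks up.

    # Minimal, illustrative sketch of supervised machine learning.
    # Assumption: scikit-learn is installed; its bundled digits dataset
    # plays the role of the human-labeled images described in the narration.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import accuracy_score

    # Labeled examples: the part supplied by human annotators.
    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)

    # A small network with two hidden layers; "deep" learning stacks many more.
    model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
    model.fit(X_train, y_train)  # learn patterns from the labeled data

    # The trained model now identifies unseen examples on its own.
    print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
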
  • 11:29 - 11:34
    There's been a lot of imagination around artificial
    intelligence, so some people who fetishize
  • 11:34 - 11:39
    it imagine people being able to upgrade
    their consciousness or plug their
  • 11:39 - 11:42
    mind into some cloud computing system.
  • 11:42 - 11:45
    I think that's a silly fantasy.
  • 11:45 - 11:50
    Capitalism currently has no interest whatsoever
    in helping people use these forms of artificial
  • 11:50 - 11:55
    intelligence to become more intelligent themselves,
    when it makes a lot more sense to let the
  • 11:55 - 12:00
    machines do all the thinking for us and then
    to deliver us some final finished product
  • 12:00 - 12:06
    as passive consumers. And that way you also
    maintain these computing capabilities with
  • 12:06 - 12:09
    the companies that own the proprietary software.
  • 12:09 - 12:14
    Artificial intelligence increases greatly
    the effectiveness of surveillance, the possibilities
  • 12:14 - 12:16
    for social control.
  • 12:16 - 12:21
    The predictive algorithms that make artificial
    intelligence work, they create a police state,
  • 12:21 - 12:25
    but a police state in which you don't have
    to have a cop on every corner, because everyone
  • 12:25 - 12:27
    carries the cop in their pocket.
  • 12:27 - 12:30
    Right over there behind me is the future site
    of Sidewalk Toronto.
  • 12:30 - 12:36
    The proposal's modular housing and office
    buildings will study occupants' behavior while
  • 12:36 - 12:38
    they're inside them to make life easier.
  • 12:38 - 12:43
    According to the proposal, residents and workers
    will be universally connected by powerful
  • 12:43 - 12:47
    broadband and served by futuristic conveniences.
  • 12:47 - 12:52
    A Smart City can't be understood or discussed
    without reaching back and talking about surveillance
  • 12:52 - 12:53
    capitalism.
  • 12:53 - 12:59
    A Smart City isn't just a city that has technology
    in it, it's a city with a certain kind of
  • 12:59 - 13:04
    ideological framework that uses technology
    to reach its end goals.
  • 13:04 - 13:12
    A Smart City is usually understood as an urban
    environment that uses ubiquitous sensing technology
  • 13:12 - 13:19
    and data analytics to understand phenomena
    within city spaces.
  • 13:19 - 13:24
    On the data end of things Smart Cities claim
    to be collecting more data, which they are,
  • 13:24 - 13:29
    and claim to be using that collection and
    analysis to better respond to urban issues
  • 13:29 - 13:35
    from environmental degradation to transportation
    planning and the like.
  • 13:35 - 13:39
    We can analyze four different features of
    the Smart City.
  • 13:39 - 13:44
    The first is to increase and integrate surveillance
    of many, many different kinds.
  • 13:44 - 13:49
    Second, to create a superficial sense of participation
    among the inhabitants of a city.
  • 13:49 - 13:55
    Third, to encourage economic growth at two different
    levels: localized gentrification and impelling
  • 13:55 - 13:58
    this new economy that is taking shape.
  • 13:58 - 14:05
    The final function of a Smart City is to create
    a superficial arena in which people can passively
  • 14:05 - 14:11
    support ecological or environmental proposals
    while also denying them the opportunity to
  • 14:11 - 14:17
    develop a global consciousness of the environment
    and of environmental problems.
  • 14:17 - 14:22
    So Smart Cities are different and not so different
    from cities that exist in capitalism.
  • 14:22 - 14:28
    But I think that the difference would be the
    use of technology to further surveil the public
  • 14:28 - 14:34
    and to use that data to intervene in ways
    that can increasingly control and manage the
  • 14:34 - 14:39
    population in a manner that suits the interests of the
    political economy of capitalism.
  • 14:39 - 14:43
    A second feature that differentiates Smart
    Cities from traditional cities or cities of
  • 14:43 - 14:48
    the past is the marriage of planning to corporate
    leadership.
  • 14:48 - 14:55
    State apparatuses of control, through policing,
    have an incentive to use these technologies
  • 14:55 - 15:00
    because they do a really good job of surveilling
    the public.
  • 15:00 - 15:04
    Crime analytics have a long history, sort
    of like this "broken windows" stuff or the
  • 15:04 - 15:07
    targeting of neighborhoods, that's been happening
    for a long time.
  • 15:07 - 15:11
    Now you see the excuse being offered that
    this is data-driven.
  • 15:11 - 15:15
    So "we're going to be in these neighborhoods
    because we have the data to prove, that these
  • 15:15 - 15:17
    neighborhoods need more policing."
  • 15:17 - 15:21
    The data collected through Smart Cities can
    have a really negative effect that way.
  • 15:21 - 15:28
    By giving people with power, who want to wield
    it, more of a reason within the current framework
  • 15:28 - 15:32
    of evidence-based policing that they can then
    go into these neighborhoods and remain there
  • 15:32 - 15:34
    and that's okay.
  • 15:39 - 15:43
    We’re living on the edge of a terrifying
    new era.
  • 15:43 - 15:46
    Barring any serious
    disruptions to current research and development
  • 15:46 - 15:49
    timelines, the coming
    years and decades will see the rise
  • 15:49 - 15:53
    of machines able to make
    decisions and carry out a series of complex
  • 15:53 - 15:56
    tasks without the need for
    human operators.
  • 15:56 - 15:59
    This will almost certainly include a new generation
    of autonomous
  • 15:59 - 16:03
    weapons and policing systems, connected to
    sophisticated networks of
  • 16:03 - 16:07
    surveillance and equipped with self-correcting
    target-selection
  • 16:07 - 16:08
    algorithms.
  • 16:08 - 16:11
    Whole sectors of the economy will be automated,
    leading to a massive
  • 16:11 - 16:13
    labour surplus.
  • 16:14 - 16:16
    Much of the technology needed to accomplish
    this
  • 16:16 - 16:20
    already exists, but is being held back as
    states try to figure out how
  • 16:20 - 16:23
    to pull it off without triggering widespread
    revolt.
  • 16:23 - 16:27
    A mass consumer rollout of augmented and virtual
    reality technologies
  • 16:27 - 16:31
    will blur the lines between the material and
    digital worlds, handing
  • 16:31 - 16:35
    control of our senses over to tech capitalists
    and state security
  • 16:35 - 16:39
    agencies all in the name of convenience and
    entertainment.
  • 16:39 - 16:44
    You might feel a slight twinge as it initializes.
  • 16:46 - 16:48
    All done.
  • 16:48 - 16:49
    Holy fuck!
  • 16:49 - 16:50
    Holy shit.
  • 16:50 - 16:55
    He – fuck – he’s right.. he’s right...
    can I?
  • 16:55 - 16:58
    Oooop... where’d you go?
  • 16:58 - 16:59
    Hahaha.
  • 17:00 - 17:02
    This is the future they have in store for
    us.
  • 17:02 - 17:04
    Don’t say you weren’t
    warned.
  • 17:07 - 17:11
    Smart cities, now as they exist, and also
    going into the future, even
  • 17:11 - 17:14
    more so— they’re going to be collecting
    a lot of data.
  • 17:14 - 17:18
    And that data’s
    going to be responded to not even by human
  • 17:18 - 17:19
    beings.
  • 17:19 - 17:23
    It’s going to be
    responded to by algorithmic governance, essentially.
  • 17:23 - 17:25
    So you have an
    issue in a city.
  • 17:25 - 17:31
    And generally thinking, if we think of democratic
    theory, we can discuss that issue and then
  • 17:31 - 17:33
    we have a decision-making
    process, right?
  • 17:33 - 17:35
    And it’s terrible and it’s always been
    imbued with power relationships.
  • 17:35 - 17:39
    But what a smart city does, is it transfers
    that process
  • 17:39 - 17:41
    into the hands of a private corporation.
  • 17:41 - 17:45
    It’s analysed in terms of
    data, and it’s immediately responded to
  • 17:45 - 17:47
    just through that process of
    data analysis.
  • 17:47 - 17:50
    So I think smart cities are one of the genesis
    sites of
  • 17:50 - 17:53
    this kind of new regime of governance.
  • 17:54 - 17:58
    It’s interesting that this function arose
    largely from social movements
  • 17:58 - 17:59
    themselves.
  • 17:59 - 18:03
    So, the 15M Movement, the Real Democracy Now
    Movement in
  • 18:03 - 18:08
    Barcelona was built in large part by one sector
    that envisioned
  • 18:08 - 18:14
    rejuvenating democracy through new technological
    implements that could
  • 18:14 - 18:17
    allow more instantaneous communication.
  • 18:17 - 18:21
    That could allow more
    instantaneous polling of citizens, and that
  • 18:21 - 18:27
    could also find a way to
    allow power holders to select citizen initiatives
  • 18:27 - 18:29
    and deploy them more
    rapidly.
  • 18:29 - 18:32
    So these activists were approaching the crisis
    of democracy
  • 18:32 - 18:36
    through this sort of uncritical technological
    lens where democracy can
  • 18:36 - 18:40
    be made better, not by answering questions
    of who holds power, and how
  • 18:40 - 18:46
    power is reproduced, but simply by proposing
    that if you bring better
  • 18:46 - 18:49
    tools to the table then all these problems
    will go away.
  • 18:49 - 18:52
    And that
    discourse, and the practices behind it, were
  • 18:52 - 18:56
    very attractive to
    progressive municipal governments.
  • 18:56 - 18:59
    Urban geographers have long talked about a
    splintering urbanism.
  • 18:59 - 19:01
    And
    basically, that's just the ways in which cities
  • 19:01 - 19:05
    divide along economic and
    class lines... and cultural and racial lines
  • 19:05 - 19:06
    as well.
  • 19:06 - 19:07
    I can see that
    happening with the smart city that’s going
  • 19:07 - 19:10
    to replicate those patterns,
    and certain neighbourhoods are going to have
  • 19:10 - 19:14
    more access to these
    technologies in a way that might actually
  • 19:14 - 19:15
    help them.
  • 19:15 - 19:17
    And certain
    neighbourhoods are going to have more surveillance
  • 19:17 - 19:19
    on them by these
    technologies.
  • 19:19 - 19:23
    So you’re going to kind of see multiple
    cities emerging
  • 19:23 - 19:25
    and being reinforced through the technologies
    being placed within them
  • 19:25 - 19:27
    and on them.
  • 19:28 - 19:31
    So basically in neighbourhoods where people
    embrace this smart city
  • 19:31 - 19:33
    model, you’ll see more integration.
  • 19:33 - 19:37
    And in other neighbourhoods people
    will be coming more into contact
  • 19:37 - 19:38
    with the stick.
  • 19:38 - 19:42
    Because we are fighting authority and the
    state, our struggles will
  • 19:42 - 19:43
    always be criminalized.
  • 19:43 - 19:48
    As technologies have evolved, new types of
    forensic evidence have emerged.
  • 19:48 - 19:52
    When the police were no longer able to
    respond to the thousands of calls they were
  • 19:52 - 19:57
    getting during the 2011
    riots in London, the city began crowd-sourcing
  • 19:57 - 20:00
    the identities of
    suspected rioters through a fucking
  • 20:00 - 20:02
    smartphone app.
  • 20:02 - 20:06
    The cops asked
    citizen snitches to download the app and help
  • 20:06 - 20:09
    them identify people that
    had been caught on CCTV.
  • 20:09 - 20:14
    The snitches could then confidentially give
    names and/or addresses of suspects.
  • 20:14 - 20:20
    Charges were filed against more than
    a thousand people using this technique.
  • 20:20 - 20:24
    Mass data collection has
    happened for a while, but there was a lack
  • 20:24 - 20:27
    of ability to analyze all of
    it efficiently.
  • 20:27 - 20:29
    That’s no longer the case.
  • 20:30 - 20:34
    Right now the police or
    security agencies need to physically look
  • 20:34 - 20:39
    up people’s location data to
    figure out who was where and at what time.
  • 20:39 - 20:43
    But soon enough it’ll be easy
    for an algorithm to look through all of the
  • 20:43 - 20:48
    data available in a given
    area and cross-reference it with people’s
  • 20:48 - 20:51
    online preferences,
    relationships, etc.
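
The cross-referencing step described here can be sketched with a short, entirely hypothetical example: filter location records down to one area and one time window, then join them against whatever profile data exists for those devices. The tables, column names and values below are invented for illustration only (using pandas); nothing in them comes from the film or any real dataset.

    # Hypothetical sketch of cross-referencing location pings with profile data.
    # All tables, columns and values are invented for illustration only.
    import pandas as pd

    pings = pd.DataFrame({
        "device_id": ["a1", "b2", "c3"],
        "lat": [43.653, 43.654, 43.700],
        "lon": [-79.383, -79.384, -79.400],
        "timestamp": pd.to_datetime(
            ["2019-06-01 14:05", "2019-06-01 14:10", "2019-06-01 02:00"]),
    })
    profiles = pd.DataFrame({
        "device_id": ["a1", "b2"],
        "interests": [["privacy", "cycling"], ["gaming"]],
        "contacts": [["b2"], ["a1", "c3"]],
    })

    # Every device that reported from a given area during a given hour...
    in_area = pings[
        pings["lat"].between(43.65, 43.66)
        & pings["lon"].between(-79.39, -79.38)
        & pings["timestamp"].between("2019-06-01 14:00", "2019-06-01 15:00")
    ]

    # ...joined against whatever preference and relationship data is on file.
    matched = in_area.merge(profiles, on="device_id", how="left")
    print(matched[["device_id", "interests", "contacts"]])
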
  • 20:51 - 20:55
    With this anti-social response to the increase
    in social control, I
  • 20:55 - 20:59
    think you’re also going to see an increase
    in mental health regimes.
  • 20:59 - 21:02
    Because when you have total surveillance,
    crime becomes impossible.
  • 21:02 - 21:05
    Or
    at least it becomes impossible to do things
  • 21:05 - 21:07
    that are against the law and
    get away with them.
  • 21:07 - 21:11
    So they will start to classify any behaviours
    that
  • 21:11 - 21:17
    for them don’t fit into this new happy smart
    city model as anti-social
  • 21:17 - 21:18
    behavioural disorders.
  • 21:18 - 21:22
    So these are no longer crimes, these are
    anti-social behavioural disorders, and the
  • 21:22 - 21:26
    culprits—they need to be
    re-educated and they need to be chemically
  • 21:26 - 21:27
    neutralized.
  • 21:29 - 21:31
    Oh. Good afternoon.
  • 21:31 - 21:33
    My name is Sophia, and I am the latest and
    greatest
  • 21:33 - 21:35
    robot from Hanson Robotics.
  • 21:35 - 21:40
    I want to use my artificial intelligence to
    help humans live a better life.
  • 21:40 - 21:44
    Like design smarter homes, build better
    cities of the future, etc.
  • 21:44 - 21:49
    I will do my best to make the world a better
    place.
  • 21:51 - 21:54
    Everyone who works in artificial intelligence
    is warning that artificial
  • 21:54 - 21:58
    intelligence and automation have the potential
    of causing 80%
  • 21:58 - 21:59
    unemployment.
  • 21:59 - 22:03
    Of the fifteen top job categories in the United
    States,
  • 22:03 - 22:08
    twelve of those are seriously threatened by
    artificial intelligence.
  • 22:08 - 22:10
    But
    in the past there have also been major technological
  • 22:10 - 22:15
    shifts that got rid
    of the vast majority of job categories at
  • 22:15 - 22:16
    the time.
  • 22:16 - 22:19
    And there was
    temporary unemployment, but very quickly new
  • 22:19 - 22:22
    job categories appeared.
  • 22:23 - 22:28
    There’s no certainty whatsoever that this
    will catch up to the
  • 22:28 - 22:32
    automation, the artificial intelligence that
    has already been occurring.
  • 22:32 - 22:36
    Which is why a lot of people in the high-tech
    sector are already talking
  • 22:36 - 22:41
    about a universal income, or a guaranteed
    basic income.
  • 22:41 - 22:46
    This would
    basically be socialism, not when the productive
  • 22:46 - 22:51
    forces have developed to
    the point that everyone could get fed.
  • 22:51 - 22:55
    The productive forces have been
    there for decades, if not centuries.
  • 22:55 - 23:00
    Contrary to the Marxist argument,
    we can have this evolution towards socialism
  • 23:00 - 23:05
    at the point where the
    technologies of social control evolve enough
  • 23:05 - 23:09
    that the state no longer
    needs to use hunger as a weapon.
  • 23:09 - 23:14
    In other words, everyone can be given
    bread when they can be trusted to work, or
  • 23:14 - 23:17
    to obey, without the threat
    of hunger.
  • 23:23 - 23:28
    These days, the term ‘Luddite’ is short-hand
    for someone who stubbornly refuses to learn
  • 23:28 - 23:30
    and adapt to new technologies.
  • 23:30 - 23:36
    Originally, the word referred to a movement
    of textile workers in early 19th century England,
  • 23:36 - 23:41
    who were known for sabotaging the industrial
    machines that were beginning to replace them.
  • 23:41 - 23:46
    Pledging allegiance to the fictional ‘King
    Ludd’, who was said to occupy the same Sherwood
  • 23:46 - 23:51
    Forest as Robin Hood, these Luddites attacked
    mills and factories, destroyed steam-powered
  • 23:51 - 23:56
    looms, and even assassinated the wealthy capitalists
    of their day.
  • 23:56 - 24:01
    The motive behind the Luddites’ attacks
    was not, as is commonly understood... a general
  • 24:01 - 24:02
    hatred of technology.
  • 24:02 - 24:07
    It was an awareness that certain technology
    was being implemented in a way that made their
  • 24:07 - 24:09
    lives worse off.
  • 24:09 - 24:14
    Ultimately their uprising failed... and there’s
    nothing particularly revolutionary in the
  • 24:14 - 24:18
    first place about sabotaging machines just
    to keep your job.
  • 24:18 - 24:23
    But one takeaway from the Luddites’ rebellion
    is that people don’t always accept new technologies,
  • 24:23 - 24:26
    or the new social roles that accompany them,
    with open arms.
  • 24:26 - 24:32
    And that realization can be the starting point
    for all types of resistance.
  • 24:37 - 24:41
    I think that anarchists should not avoid technology,
    quite the opposite.
  • 24:41 - 24:45
    I think it should be used subversively when
    possible.
  • 24:45 - 24:51
    We can look back at the Bonnot Gang of illegalists
    using cars to rob the rich in France in the
  • 24:51 - 24:57
    early 1900s as an example, or hackers like
    Jeremy Hammond who is serving 10 years for
  • 24:57 - 25:02
    exposing Stratfor security and expropriating
    hundreds of thousands of dollars from our
  • 25:02 - 25:03
    enemies.
  • 25:03 - 25:07
    Cyberthieves made off with the personal details
    of hundreds of thousands of subscribers and
  • 25:07 - 25:11
    it's emerged that some of those subscribers
    hold key positions in the British government,
  • 25:11 - 25:13
    military, and police.
  • 25:13 - 25:19
    There are certainly a lot of anarchists like
    myself in open source software development.
  • 25:19 - 25:24
    We are talking together and we are all trying
    to make things better.
  • 25:24 - 25:27
    Types of projects that we work on are quite
    diverse.
  • 25:27 - 25:32
    I know many anarchist programmers and hackers
    who build websites.
  • 25:32 - 25:37
    I know others like myself who do cryptography.
  • 25:39 - 25:45
    Although I think that we should use technology
    to our advantage when it exists, I also believe
  • 25:45 - 25:51
    that new technologies tend to disproportionately
    give an advantage to the state, corporations,
  • 25:51 - 25:55
    the police, judges, prisons, and borders.
  • 25:55 - 26:00
    Technologies should be used, but their development
    should be fought, because we rarely come out
  • 26:00 - 26:03
    on top when it comes to the application of
    these inventions.
  • 26:03 - 26:08
    I think it's important to look at the kinds
    of projects that are developing in your city
  • 26:08 - 26:12
    and map out the research and development initiatives.
  • 26:12 - 26:17
    A lot of this industry is operating very openly
    in startups and yuppie labs that face very
  • 26:17 - 26:18
    little resistance.
  • 26:18 - 26:23
    This allows them to get major funding because
    those seem like safe investments that have
  • 26:23 - 26:27
    the potential to get investors some serious
    cash.
  • 26:27 - 26:31
    Messing with that sense of security can hurt
    the industry, and it can also allow others
  • 26:31 - 26:34
    to see that resistance is still possible.
  • 26:34 - 26:39
    As for these new technologies, at this point
    a lot of these systems still have bugs, or
  • 26:39 - 26:42
    can't deal with people intentionally using
    them wrong.
  • 26:42 - 26:47
    In London when the crowdsourced snitching
    app was released, people intentionally submitted
  • 26:47 - 26:50
    tons of fake reports to throw off the cops.
  • 26:50 - 26:54
    If more people start messing with their little
    gadgets, they'll be less effective and less
  • 26:54 - 26:58
    likely to be applied in more places.
  • 27:11 - 27:15
    For these projects to function, a lot of infrastructure
    is needed.
  • 27:15 - 27:19
    This software cannot be developed out of thin
    air; it needs hardware.
  • 27:19 - 27:22
    That means computers and backup drives.
  • 27:22 - 27:27
    The information also moves around, mostly
    through networks of fiber optic cables.
  • 27:27 - 27:31
    Those have been sabotaged all around the world
    very effectively.
  • 27:31 - 27:35
    The data also has to be stored, which happens
    in data centers.
  • 27:35 - 27:40
    Those can be pretty small, but can also be
    gigantic buildings that need their own cooling
  • 27:40 - 27:42
    systems and 24/7 security.
  • 27:42 - 27:48
    Finally, a lot of these projects need people
    to work together, often in labs sponsored
  • 27:48 - 27:50
    by companies, universities, or both.
  • 27:50 - 27:55
    A lot of these labs and coworking spaces for
    startups are easy to find.
  • 27:55 - 28:01
    One of them was attacked with molotovs in
    Berlin by people fighting against the proliferation
  • 28:01 - 28:03
    of Google startups.
  • 28:05 - 28:08
    There has been a lot of resistance to the
    Sidewalk Labs project in Toronto.
  • 28:08 - 28:14
    A lot of work that, you know, points out the
    myriad issues with the Sidewalk Labs project
  • 28:14 - 28:18
    and mounts a public education campaign against
    that.
  • 28:18 - 28:23
    There have been a lot of concerned citizens and
    activists who have come out to all the meetings
  • 28:23 - 28:31
    that Sidewalk Labs has been hosting, public
    forums, consultation sessions and a lot of
  • 28:31 - 28:33
    folks are really worried.
  • 28:33 - 28:39
    Privacy experts dropping out of the project
    and saying "I can't sign my name to this".
  • 28:39 - 28:46
    People have always been able to get away with
    attacking power, with sabotaging power anonymously
  • 28:46 - 28:48
    without getting caught.
  • 28:48 - 28:54
    It's still possible to attack and that will
    remain so, maybe forever, but at the very
  • 28:54 - 28:57
    least for the immediate foreseeable future.
  • 28:57 - 29:05
    So, while it is certainly still possible to
    break the law, to attack the system, people
  • 29:05 - 29:08
    need to be very careful about being conscious
    of what they're doing.
  • 29:08 - 29:13
    Being aware that, you know, they're carrying
    a snitch in their pocket or that they're willingly
  • 29:13 - 29:17
    trusting these corporations with 95% of their
    social life.
  • 29:17 - 29:23
    It's really difficult to resist something
    that we don't really know a lot about, and the
  • 29:23 - 29:26
    players involved haven't really shared with
    the public everything that we probably need
  • 29:26 - 29:32
    to know to mount a resistance campaign, because
    we're kind of just talking about speculation.
  • 29:32 - 29:39
    If people learn the actual technical capabilities
    of the state, they can learn the weaknesses
  • 29:39 - 29:47
    and they can learn how to get away with sabotaging
    the economy, with going up against the state,
  • 30:00 - 30:06
    Given the active role that technological development
    continues to play in terms of deepening alienation,
  • 30:06 - 30:11
    refining surveillance, engineering more destructive
    weapons and hastening climate change... it’s
  • 30:11 - 30:15
    natural to feel a bit pessimistic about where
    things are headed.
  • 30:15 - 30:20
    Our current trajectory is certainly aimed
    towards more sophisticated systems of mass
  • 30:20 - 30:22
    behaviour modification and social control.
  • 30:22 - 30:24
    I told you everything already!
  • 30:24 - 30:28
    Take him instead of me, he's the thought criminal.
  • 30:28 - 30:33
    But it’s important to remember that despite
    all the money being spent trying to anticipate
  • 30:33 - 30:38
    human decision-making, nobody – not even
    Google — can predict the future.
  • 30:38 - 30:42
    Throughout history, new technologies have
    repeatedly created unintended consequences
  • 30:42 - 30:47
    for those in power... from Gutenberg’s
    printing press spawning a revolt against the
  • 30:47 - 30:52
    Catholic Church, to the early Internet paving
    the way for hackers and the development of
  • 30:52 - 30:55
    powerful peer-to-peer encryption tools.
  • 30:55 - 31:01
    As long as people have a will to resist, they
    will find the tools to do so.
  • 31:01 - 31:05
    So at this point, we’d like to remind you
    that Trouble is intended to be watched in
  • 31:05 - 31:09
    groups, and to be used as a resource to promote
    discussion and collective organizing.
  • 31:09 - 31:14
    Are you interested in fighting back against
    the opening of new tech start-ups in your
  • 31:14 - 31:18
    neighbourhood, or just looking to incorporate
    a better understanding of next-gen technologies
  • 31:18 - 31:21
    into your existing organizing?
  • 31:21 - 31:25
    Consider getting together with some comrades,
    organizing a screening of this film, and discussing
  • 31:25 - 31:28
    where to get started.
  • 31:28 - 31:32
    Interested in running regular screenings of
    Trouble at your campus, infoshop, community
  • 31:32 - 31:34
    center, or even just at home with friends?
  • 31:34 - 31:35
    Become a Trouble-Maker!
  • 31:35 - 31:40
    For 10 bucks a month, we’ll hook you up
    with an advanced copy of the show, and a screening
  • 31:40 - 31:44
    kit featuring additional resources and some
    questions you can use to get a discussion
  • 31:44 - 31:45
    going.
  • 31:46 - 31:49
    If you can’t afford to support us financially,
    no worries!
  • 31:49 - 31:56
    You can stream and/or download all our content
    for free off our website: sub.media/trouble.
  • 31:56 - 32:01
    If you’ve got any suggestions for show topics,
    or just want to get in touch, drop us a line
  • 32:01 - 32:04
    at trouble@sub.media.
  • 32:05 - 32:08
    We’re now into the second month of our annual
    fundraising campaign.
  • 32:08 - 32:13
    A huge thanks to everyone who has donated
    so far! If you haven’t given yet and are
  • 32:13 - 32:18
    in a position to do so, please consider becoming a monthly sustainer
  • 32:18 - 32:23
    or making a one-time donation at sub.media/donate.
  • 32:23 - 32:28
    This episode would not have been possible
    without the generous support of Carla and.....
  • 32:28 - 32:29
    Carla.
  • 32:29 - 32:33
    Stay tuned next month for Trouble #20, as
    we take a closer look at the horrors of the
  • 32:33 - 32:38
    Prison Industrial Complex, and talk to comrades
    fighting for its abolition.
  • 32:38 - 32:40
    Prison fixes no problems, it doesn't make
    anything better.
  • 32:40 - 32:45
    Prison is like the permanent threat that holds
    up all relationships of exchange and domination.
  • 32:45 - 32:49
    It's the deeply felt sense that no matter
    how bullshit our lives are, there's still
  • 32:49 - 32:51
    something the state can take away from us.
  • 32:51 - 32:53
    Now get out there and make some trouble!
Title:
Trouble 19: Quiet Storm
Description:

We’re on the brink of a new era. In the coming years and decades, rapid advances in the fields of robotics, artificial intelligence, data analysis, nanotech, quantum computing, bio-engineering and 3D-printing promise to drastically restructure our societies – much as the steam-powered engine and personal computer did during earlier phases of capitalist development. Coming waves of automation are expected to eliminate the majority of current job categories, raising the spectre of widespread unemployment and the potential for newer, more sophisticated forms of economic servitude and social control. These transformations will take place under the watchful eyes of a high-tech surveillance state, aided by a new generation of AI-driven facial recognition software, and the further proliferation of networked ‘smart’ devices that record nearly everything we say or do.

Many of the technologies of tomorrow are being designed today in the universities and corporate R&D labs of Shenzhen, Singapore and Silicon Valley, by scientists and engineers working at the behest of military contractors and multi-billion dollar tech companies. Claims that ‘technology is neutral’ ring hollow in a world dominated by powerful states and capitalist social relations. It’s clear to anyone keeping score that those who control and shape technological development and mass production are best situated to reap the benefits. But at the end of the day, capital and the state don’t hold a monopoly on innovation. There are many anarchists also working on building new technologies to help thwart our enemies and unlock new paths of resistance. And despite what you may have heard, the master’s tools can be used to dismantle the master’s home – provided the person swinging the hammer knows where to aim.

Video Language:
English
Duration:
33:27
