
vimeo.com/.../244383308

  • 0:02 - 0:05
    Greetings Troublemakers... welcome to Trouble.
  • 0:05 - 0:07
    My name is not important.
  • 0:07 - 0:11
    At 10:30pm on October 29th, 1969,
  • 0:11 - 0:13
    Charlie Kline, a student programmer at UCLA,
  • 0:13 - 0:16
    successfully sent the first digital message
  • 0:16 - 0:18
    from one computer terminal to another via the ARPANET,
  • 0:18 - 0:20
    a top-secret research project
  • 0:20 - 0:22
    run by the US Department of Defense.
  • 0:22 - 0:26
    The transmission of this single word, 'login'
  • 0:26 - 0:28
    was a pivotal moment in human history,
  • 0:28 - 0:30
    as it represents the official birth of the Internet.
  • 0:30 - 0:33
    And it was from here that the first message was sent.
  • 0:33 - 0:35
    A revolution began!
  • 0:35 - 0:37
    In the nearly fifty years that have followed,
  • 0:37 - 0:39
    this invention has thoroughly transformed our world
  • 0:39 - 0:42
    and come to dominate virtually all aspects of our lives.
  • 0:42 - 0:45
    It has restructured and rejuvenated capitalism,
  • 0:45 - 0:47
    by revolutionizing finance and transforming the globe
  • 0:47 - 0:49
    into a single interconnected marketplace.
  • 0:49 - 0:52
    It has provided new methods of interacting with one another
  • 0:52 - 0:56
    and helped shape the ways that we receive and process information.
  • 0:56 - 0:58
    And it has provided a place for people to upload
  • 0:58 - 1:00
    terabytes of videos of their cats.
  • 1:00 - 1:03
    This is Pinky... he's a male. He's available for adoption.
  • 1:03 - 1:04
    He's pet of the week.
  • 1:11 - 1:14
    The Internet has also become the central pillar
  • 1:14 - 1:15
    of a new form of social control
  • 1:15 - 1:17
    based around mass data collection
  • 1:17 - 1:19
    and the construction of algorithms
  • 1:19 - 1:21
    aimed at better predicting and manipulating human behavior.
  • 1:21 - 1:24
    But while states and digital capitalists have used the Internet
  • 1:24 - 1:27
    as a platform for mass surveillance and pacification,
  • 1:27 - 1:29
    it has also been a site of subversion
  • 1:29 - 1:30
    and created new possibilities
  • 1:30 - 1:33
    for decentralized attacks on the dominant order.
  • 1:33 - 1:34
    We've got a problem.
  • 1:34 - 1:35
    What?
  • 1:35 - 1:37
    Someone synched a RAT to one of my servers.
  • 1:37 - 1:39
    A remote access tool – we're being hacked!
  • 1:39 - 1:41
    On the front-lines of this war are hackers.
  • 1:41 - 1:45
    Those who use curiosity, programming skills and problem solving
  • 1:45 - 1:49
    to unlock closed systems and bend powerful forces to their will.
  • 1:49 - 1:50
    Over the next thirty minutes,
  • 1:50 - 1:53
    we'll share the voices of a number of these individuals
  • 1:53 - 1:55
    as they share their experiences of defacing web sites,
  • 1:55 - 1:59
    leaking emails, developing tools to thwart digital surveillance
  • 1:59 - 2:00
    ... and making a whole lot of trouble.
  • 2:26 - 2:30
    Hacking is one of those terms that I think has become
  • 2:30 - 2:31
    a little bit nebulous.
  • 2:31 - 2:35
    I would define it as using technology in a way that wasn't intended,
  • 2:35 - 2:38
    by finding bugs and oversights in designs
  • 2:38 - 2:41
    to make it produce results that were never supposed to happen.
  • 2:41 - 2:43
    Creative subversion of technical systems.
  • 2:43 - 2:47
    You take software and you modify it to get another result.
  • 2:47 - 2:50
    For example, accessing information on a system
  • 2:50 - 2:52
    that you shouldn't be able to access.
  • 2:52 - 2:57
    Or making the system do something that it shouldn't be able to do
  • 2:57 - 2:59
    – or that you shouldn't be able to make it do.
  • 2:59 - 3:03
    There's a lot of different definitions of hacking, depending on who you ask.
  • 3:03 - 3:07
    US criminal law defines computer hacking as unauthorized access
  • 3:07 - 3:10
    to obtain information, transmitting destructive code, etc.
  • 3:10 - 3:12
    I mean, they've basically expanded the definition
  • 3:12 - 3:16
    in their ongoing efforts to criminalize everyday Internet activity.
  • 3:16 - 3:17
    If you ask someone like Richard Stallman,
  • 3:17 - 3:21
    he'll tell you that it's really just a creative solution to a problem.
  • 3:21 - 3:22
    But hackers also do like to break into systems.
  • 3:22 - 3:24
    There are all kinds of systems,
  • 3:24 - 3:26
    and there's all kinds of access
  • 3:26 - 3:28
    and all kinds of ways to gain access.
  • 3:28 - 3:32
    Some hackers choose to fix and protect these systems.
  • 3:32 - 3:35
    They work for the government, Microsoft etc.
  • 3:35 - 3:36
    They call themselves White Hats.
  • 3:36 - 3:38
    They're not even really hackers.
  • 3:38 - 3:40
    They're seen in the hacking scene as sellouts.
  • 3:40 - 3:43
    They do it for the paycheck... or maybe because of the flag.
  • 3:43 - 3:46
    But there are those, of course, who don't do it for employment.
  • 3:46 - 3:47
    They don't do it for a paycheck,
  • 3:47 - 3:49
    they do it for the love of solving complex puzzles.
  • 3:49 - 3:52
    For the thrill of breaking through whatever artificial borders
  • 3:52 - 3:54
    that these people decide to erect.
  • 3:54 - 3:56
    Everything that's built can be broken.
  • 3:57 - 4:00
    I don't think hacking has changed all that much in the last few years.
  • 4:00 - 4:02
    What really has changed is the scope
  • 4:02 - 4:04
    of things that can be affected with hacking.
  • 4:04 - 4:08
    Before, in the 90's, most of it was just practical jokes
  • 4:08 - 4:10
    because none of it had a lot of impact on real life.
  • 4:10 - 4:12
    And in popular culture,
  • 4:12 - 4:16
    you start to have hackers appear in movies, in television series,
  • 4:16 - 4:19
    where there's this whole figure of these hackers
  • 4:19 - 4:21
    that have these super powers.
  • 4:21 - 4:24
    That can invade computer systems in any way, shape or form.
  • 4:24 - 4:26
    There's a new virus in the database.
  • 4:26 - 4:27
    What's happening?
  • 4:27 - 4:30
    It's replicating... eating up memory.... uhh, what do I do?
  • 4:30 - 4:32
    Type 'cookie' you idiot!
  • 4:32 - 4:34
    Then it gets a lot more popularized.
  • 4:34 - 4:36
    Since the dot-com boom at the end of the 90's,
  • 4:36 - 4:38
    things now have a huge impact
  • 4:38 - 4:39
    and everything tends to be connected to the Internet,
  • 4:39 - 4:40
    or some sort of network.
  • 4:40 - 4:43
    As digital information networks have evolved,
  • 4:43 - 4:46
    a lot of personal information is being stored.
  • 4:46 - 4:50
    Y'know, big data corporations and industries are relying on computers
  • 4:50 - 4:54
    ... so hackers have access to this kind of information
  • 4:54 - 4:55
    that these big companies have as well.
  • 4:55 - 4:58
    Hacking can be very simple and very complex.
  • 4:58 - 5:00
    But most times hacking is very simple.
  • 5:00 - 5:04
    By supplying input in a certain way,
  • 5:04 - 5:07
    you're able to make the back-end system
  • 5:07 - 5:10
    believe that what you're supplying
  • 5:10 - 5:12
    is actually part of its own code.
  • 5:12 - 5:16
    Which, in a lot of cases, can give you full access to that system.
  • 5:16 - 5:21
    That's not just limited to computers or telecommunication systems.
  • 5:21 - 5:25
    We can really kind of apply this idea to all kinds of technical systems.
  • 5:25 - 5:28
    So, for example, something like social engineering
  • 5:28 - 5:31
    is a human form of hacking.
  • 5:31 - 5:34
    Y'know, you can pretend to be somebody that you're not
  • 5:34 - 5:38
    and ask another person questions about themselves
  • 5:38 - 5:41
    in order to get them to reveal private information.
  • 5:41 - 5:43
    It's possible that there is software
  • 5:43 - 5:45
    in theory that doesn't have vulnerabilities.
  • 5:45 - 5:47
    But in practice, that's impossible to have.
  • 5:47 - 5:49
    If an application or a system
  • 5:49 - 5:53
    performs queries to a database based on your input,
  • 5:53 - 5:57
    you could possibly alter your input
  • 5:57 - 6:00
    to be able to then alter the database query,
  • 6:00 - 6:04
    and possibly gain access to information that you shouldn't be able to.
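[Editor's illustration] The query-altering idea the speaker describes is classic SQL injection. A minimal sketch in Python, using the standard-library sqlite3 module — the `users` table, the stored values, and the payload are all invented for illustration, not taken from the film:

```python
import sqlite3

# A hypothetical back-end with one table of users and their secrets.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'hunter2')")

def lookup_unsafe(name):
    # User input is pasted straight into the query text -- the back-end
    # now treats the input as part of its own code.
    sql = "SELECT secret FROM users WHERE name = '%s'" % name
    return conn.execute(sql).fetchall()

def lookup_safe(name):
    # A placeholder keeps the input as data, never as query code.
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)
    ).fetchall()

# The closing quote ends the string early; OR '1'='1' matches every row.
payload = "nobody' OR '1'='1"
print(lookup_unsafe(payload))  # [('hunter2',)] -- data leaks out
print(lookup_safe(payload))    # [] -- the payload is just a nonexistent name
```

The only difference between the two lookups is whether the input can change the shape of the query — which is exactly the "altering the database query" the speaker is describing.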
  • 6:04 - 6:05
    Mostly what an exploit does,
  • 6:05 - 6:09
    it's a small tool that you run to get access to a special sector
  • 6:09 - 6:11
    of the software you want to get into.
  • 6:12 - 6:14
    A lot of exploits and vulnerabilities are discussed publicly
  • 6:14 - 6:16
    and are being used in the wild.
  • 6:16 - 6:19
    If you pay attention to lists like Full Disclosure or Security Focus,
  • 6:19 - 6:22
    they'll tell you some of the latest tricks that are being used.
  • 6:22 - 6:24
    Of course, those are the ones that are already publicly known,
  • 6:24 - 6:27
    and that the vendors have often already released patches
  • 6:27 - 6:30
    ... but a lot of companies don't always patch.
  • 6:30 - 6:33
    They're not as on-top of it as they'd like to think that they are.
  • 6:33 - 6:36
    For example, the Equifax hack a couple of weeks ago
  • 6:36 - 6:39
    – the company was running outdated versions of Apache software.
  • 6:39 - 6:42
    Most people don't really do updates regularly.
  • 6:42 - 6:46
    So most people will actually get hacked by something very simple.
  • 6:46 - 6:47
    Denial of service attacks
  • 6:47 - 6:51
    ... basically coming up with ways to create an enormous amount
  • 6:51 - 6:53
    of traffic to your server,
  • 6:53 - 6:56
    to the point where it can't continue to provide those services.
  • 6:56 - 6:59
    There's such a thing as Distributed Denial of Service attacks,
  • 6:59 - 7:03
    where that traffic is coming from many places at the same time.
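[Editor's illustration] The saturation logic behind a denial of service can be shown with a toy model — no real networking, and the capacity number is invented. A "server" with a fixed pool of connection slots gets flooded with long-lived bogus connections, and a legitimate request then finds no capacity left:

```python
import threading
import time

CAPACITY = 4                          # hypothetical number of connection slots
slots = threading.Semaphore(CAPACITY)
hang_up = threading.Event()

def bogus_connection():
    # Grab a slot and just hold it open, doing nothing useful.
    if slots.acquire(blocking=False):
        hang_up.wait()
        slots.release()

# Flood: enough junk connections to occupy every slot.
flood = [threading.Thread(target=bogus_connection) for _ in range(CAPACITY)]
for t in flood:
    t.start()
time.sleep(0.2)                       # let the flood saturate the pool

# A legitimate user now gets turned away.
served = slots.acquire(blocking=False)
print("served" if served else "refused")  # prints "refused"

hang_up.set()                         # attackers disconnect; service recovers
for t in flood:
    t.join()
```

The distributed variant just means those bogus connections arrive from many machines at once, which makes the flood both larger and much harder to filter out.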
  • 7:03 - 7:07
    The most serious techniques are what they call 'undisclosed vulnerabilities',
  • 7:07 - 7:08
    what they call a 'zero day.'
  • 7:08 - 7:12
    When someone discovers a vulnerability, instead of reporting it
  • 7:12 - 7:13
    – which is the White Hat way –
  • 7:13 - 7:15
    they continue using it privately.
  • 7:15 - 7:18
    And they don't report it publicly,
  • 7:18 - 7:20
    so that way there's no way for anyone to
  • 7:20 - 7:23
    really adequately protect themselves against it.
  • 7:23 - 7:26
    I think a useful way to think about this is
  • 7:26 - 7:28
    that the Internet is a really hostile place.
  • 7:28 - 7:32
    It was never designed with privacy or security in mind.
  • 7:32 - 7:36
    State actors and corporations control the entire thing.
  • 7:36 - 7:38
    And so when you talk about their ability to exploit it
  • 7:38 - 7:43
    ... I mean, to me, so many of the basic services that we use
  • 7:43 - 7:45
    on the Internet are exploitative
  • 7:45 - 7:48
    without thinking about a hacker getting into it,
  • 7:48 - 7:50
    or malware or something like that.
  • 7:50 - 7:52
    State actors like the US government
  • 7:52 - 7:56
    have the ability to observe all Internet traffic in real time,
  • 7:56 - 7:57
    collect it and store it,
  • 7:57 - 8:00
    and then use it later at their discretion.
  • 8:00 - 8:03
    And they work very closely with the digital capitalists
  • 8:03 - 8:05
    – facebook, google and all these other entities
  • 8:05 - 8:08
    – who are already storing that information anyway.
  • 8:13 - 8:16
    The Internet has long been a tool used by social movements
  • 8:16 - 8:17
    of various political stripes,
  • 8:17 - 8:20
    both as a means of disseminating information
  • 8:20 - 8:22
    and a fertile ground for recruitment.
  • 8:22 - 8:25
    Back in the 1990's, the anti-globalization movement
  • 8:25 - 8:28
    arose alongside the open-media publishing platform, Indymedia,
  • 8:28 - 8:31
    which allowed for the virtual coordination of many localized fronts
  • 8:31 - 8:34
    in the global fight against neoliberal capitalism.
  • 8:38 - 8:39
    I need 50,000 people.
  • 8:39 - 8:42
    50,000? You're gonna have to give me some time.
  • 8:42 - 8:43
    And drums.
  • 8:44 - 8:46
    You want drums? OK, I can do that.
  • 8:46 - 8:48
    And what about the Italians?
  • 8:51 - 8:52
    The Italians? Man, they're stuck on the border.
  • 8:52 - 8:54
    They're gonna be with you tomorrow.
  • 8:54 - 8:54
    And the black bloc?
  • 8:54 - 8:56
    The black bloc are already there.
  • 8:56 - 8:58
    You're gonna see black and red like there ain't no tomorrow, kid.
  • 8:58 - 8:59
    You just sit tight.
  • 9:07 - 9:10
    These days, social media platforms like facebook
  • 9:10 - 9:13
    have given rise to a new form of online activity known as 'clicktivism',
  • 9:13 - 9:16
    in which likes, shares and the signing of online petitions
  • 9:16 - 9:19
    have become a popular way for liberals and so-called 'progressives'
  • 9:19 - 9:21
    to project an image of ostensible participation
  • 9:21 - 9:25
    in campaigns centered around a variety of social justice-related issues,
  • 9:25 - 9:28
    while often masking their lack of participation in real world struggles.
  • 9:28 - 9:32
    Real change requires real action.
  • 9:32 - 9:36
    That’s why I always share political articles on facebook,
  • 9:36 - 9:37
    whenever I see them.
  • 9:37 - 9:40
    But not everyone has been lulled into this comforting delusion
  • 9:40 - 9:41
    of how social change works.
  • 9:41 - 9:43
    On both sides of the political spectrum,
  • 9:43 - 9:45
    groups and individuals have continued
  • 9:45 - 9:47
    to use the Internet pragmatically,
  • 9:47 - 9:51
    both to spread their ideologies and coordinate their IRL activities.
  • 9:52 - 9:55
    Anonymous is a decentralized network of hackers and activists
  • 9:55 - 9:57
    that exists in places like IRC and Twitter,
  • 9:57 - 9:59
    and anyone is free to become Anonymous
  • 9:59 - 10:02
    and start their own operations within the network.
  • 10:02 - 10:03
    It's kinda similar to the black bloc tactic
  • 10:03 - 10:06
    used as cover and collective identity.
  • 10:06 - 10:09
    I'm doing ten years in the fence for computer hacking charges
  • 10:09 - 10:11
    related to my involvement in Anonymous.
  • 10:11 - 10:14
    I was hacking police departments, military contractors
  • 10:14 - 10:16
    ... defacing their websites,
  • 10:16 - 10:19
    releasing their emails and databases to the public.
  • 10:19 - 10:22
    One of the bigger targets was a company known as Strategic Forecasting
  • 10:22 - 10:25
    – Stratfor – which is a private intelligence firm
  • 10:25 - 10:28
    made up of mostly former State Department and CIA agents.
  • 10:28 - 10:29
    We took down their websites.
  • 10:29 - 10:32
    We went on donation sprees with all their clients' credit cards,
  • 10:32 - 10:34
    and gave their email archives to Wikileaks.
  • 10:34 - 10:36
    And Wikileaks published them,
  • 10:36 - 10:38
    showed that they had been spying on activist groups
  • 10:38 - 10:41
    on behalf of corporations like Dow Chemical.
  • 10:41 - 10:44
    Groups like Anonymous got really really famous defacing websites.
  • 10:44 - 10:46
    Other groups attacked police websites,
  • 10:46 - 10:51
    getting all the data they have about current police members.
  • 10:51 - 10:55
    There are also groups that were blocking huge institutions,
  • 10:55 - 10:58
    like credit card companies or banks.
  • 10:58 - 11:00
    If they block their transactions, they lose money.
  • 11:00 - 11:04
    So there's a bunch of stuff you can do with hacking.
  • 11:04 - 11:05
    Anonymous, they were really famous for
  • 11:05 - 11:09
    really getting that kind of popular participation in a hacking movement
  • 11:09 - 11:12
    that really didn't mean you had to be an expert to use it.
  • 11:12 - 11:14
    You could download a piece of software,
  • 11:14 - 11:16
    and you could just run it on your computer
  • 11:16 - 11:19
    and you would enter in the target URL
  • 11:19 - 11:21
    and you could begin to participate
  • 11:21 - 11:25
    in what was effectively like a virtual sit-in.
  • 11:25 - 11:27
    Now as far as Anonymous, or hacktivists in general
  • 11:27 - 11:29
    playing a role in revolutionary movements...
  • 11:29 - 11:31
    Anonymous was very active during
  • 11:31 - 11:32
    Occupy Wall Street and the Arab Spring.
  • 11:32 - 11:35
    In general, an overall revolutionary strategy
  • 11:35 - 11:37
    benefits from a diversity of tactics.
  • 11:37 - 11:40
    Multiple attacks converging from all angles,
  • 11:40 - 11:43
    from street protests to smashed windows to hacked websites.
  • 11:43 - 11:47
    So Anonymous, y'know, revealing scandalous personal information
  • 11:47 - 11:51
    on individuals associated with a company that is the current target of protests
  • 11:51 - 11:53
    – timed well, it could be very effective.
  • 11:53 - 11:55
    It's a really interesting concept to me.
  • 11:55 - 11:58
    And a lot of people who are members of Anonymous
  • 11:58 - 12:00
    use tools that I work on every day.
  • 12:00 - 12:02
    And I hope they will use them for good.
  • 12:02 - 12:10
    I think the unifying idea is just using anonymity to achieve some end.
  • 12:10 - 12:11
    And doing it with other people.
  • 12:11 - 12:14
    And I think that that speaks to some of their internal contradictions too,
  • 12:14 - 12:18
    because they're not unified by a political ideology.
  • 12:18 - 12:21
    Members of Anonymous fight with each other about that.
  • 12:21 - 12:26
    And I think when you have no political ideology motivating work like that
  • 12:26 - 12:31
    – work that has the potential to impact the whole globe,
  • 12:31 - 12:33
    and has before
  • 12:33 - 12:34
    - it can be really dangerous.
  • 12:34 - 12:38
    We of Anonymous declare total fucking war on antifa,
  • 12:38 - 12:40
    and all who support their criminal and violent actions
  • 12:40 - 12:42
    towards innocent civilians.
  • 12:42 - 12:46
    I've seen Anonymous operations go after people
  • 12:46 - 12:50
    in a kind of y'know, right-wing, Pizzagate-type style.
  • 12:50 - 12:53
    You know... I mean it originated on 4Chan.
  • 12:53 - 12:57
    Historically, the hacker community has been very inclusive.
  • 12:57 - 12:59
    When everything started,
  • 12:59 - 13:03
    nobody really knew who was on the other side of the line.
  • 13:03 - 13:06
    Everyone was just green text on a black background.
  • 13:06 - 13:10
    With that said, there is a lot of sexism in tech generally,
  • 13:10 - 13:15
    and I'd say that the people who are recruited from
  • 13:15 - 13:19
    places like Reddit and 4Chan are like, y'know,
  • 13:19 - 13:21
    your typical tech bros.
  • 13:21 - 13:23
    Every community on the Internet,
  • 13:23 - 13:25
    and every sub-community within those sites,
  • 13:25 - 13:27
    whether it's 4Chan or Reddit or whatever,
  • 13:27 - 13:29
    has a dog in the fight in Gamergate.
  • 13:29 - 13:35
    Gamergate and 4Chan, and the origins of the alt-right,
  • 13:35 - 13:38
    I think are one of the most obvious confirmations
  • 13:38 - 13:43
    of something that many of us who are radicals already knew
  • 13:43 - 13:47
    ... which is that toxic masculinity, misogyny, whatever you wanna call it,
  • 13:47 - 13:51
    is an incredibly dangerous and violent force.
  • 13:51 - 13:53
    And it never ends there.
  • 13:53 - 13:55
    Beyond the origins in 4Chan,
  • 13:55 - 13:59
    I don't really know exactly where a lot of these young men came from.
  • 13:59 - 14:02
    I imagine that it's probably not any more interesting
  • 14:02 - 14:07
    than that they are a result of late-capitalist alienation.
  • 14:07 - 14:11
    But I think that they started out with, y'know,
  • 14:11 - 14:14
    your just, like, garden variety misogyny.
  • 14:14 - 14:18
    And then actual literal fascists went to their forums
  • 14:18 - 14:21
    and whispered fascist poison into the ears
  • 14:21 - 14:23
    of all these impressionable men.
  • 14:23 - 14:28
    And because they already were prone to violence and bigotry,
  • 14:28 - 14:31
    then it was just the natural conclusion.
  • 14:32 - 14:37
    Doxxing is the practice of exposing information about your opponent
  • 14:37 - 14:40
    that they'd rather have kept secret.
  • 14:40 - 14:45
    Typically, doxxing happens from information
  • 14:45 - 14:47
    that is already somehow readily available
  • 14:47 - 14:49
    and maybe just a little bit hidden.
  • 14:49 - 14:53
    If someone is doing their activism under a pseudonym,
  • 14:53 - 14:57
    attackers will search for any kind of connection
  • 14:57 - 15:01
    to their real physical persona and put that information online.
  • 15:01 - 15:04
    And then whoever the target is,
  • 15:04 - 15:06
    all the people who wanna go after that target
  • 15:06 - 15:09
    will work collectively to terrorize them.
  • 15:09 - 15:11
    The result of it can be, y'know, something like
  • 15:11 - 15:14
    getting 50 pizzas delivered to your house
  • 15:14 - 15:17
    ... or it can be a SWAT team
  • 15:17 - 15:20
    showing up in response to a fake bomb threat.
  • 15:20 - 15:23
    Protection against this is best done by
  • 15:23 - 15:26
    compartmentalization of your online activities.
  • 15:26 - 15:31
    So keep your activist activities and your regular activities separate.
  • 15:31 - 15:35
    Use different email accounts when you sign up for services.
  • 15:35 - 15:39
    Doxxing's also been used by hacker collectives
  • 15:39 - 15:44
    to expose lists of police officers, members of fascist organizations...
  • 15:44 - 15:48
    A lot of people were doxxed after the Charlottesville rally
  • 15:48 - 15:50
    out of just public open-source knowledge,
  • 15:50 - 15:52
    and had to back-track on their beliefs
  • 15:52 - 15:55
    and actually had to go out in public and offer apologies.
  • 15:57 - 16:00
    In June of 2010, a malicious computer worm
  • 16:00 - 16:02
    called the Stuxnet virus was first discovered
  • 16:02 - 16:06
    by a small Belarusian software company, VBA32.
  • 16:06 - 16:08
    It was soon shared with cyber-security experts
  • 16:08 - 16:10
    at Kaspersky Labs, in Moscow,
  • 16:10 - 16:12
    and Symantec in Silicon Valley,
  • 16:12 - 16:16
    who quickly realized that it was unlike any virus ever seen before.
  • 16:16 - 16:18
    Far from your run-of-the mill malware,
  • 16:18 - 16:20
    Stuxnet was a sophisticated weapon,
  • 16:20 - 16:21
    comprising millions of lines of code
  • 16:21 - 16:23
    and believed to have been jointly developed
  • 16:23 - 16:25
    by the cyber-warfare divisions
  • 16:25 - 16:27
    of the American and Israeli military.
  • 16:27 - 16:30
    Its target was the Natanz nuclear enrichment facility, in Iran.
  • 16:30 - 16:34
    For Natanz, it was a CIA-led operation.
  • 16:34 - 16:36
    So we had to have agency sign-off.
  • 16:37 - 16:44
    Someone from the agency stood behind the operator and the analyst
  • 16:44 - 16:47
    and gave the order to launch every attack.
  • 16:47 - 16:49
    For months, the virus had lain hidden
  • 16:49 - 16:51
    within the plant's Programmable Logic Controllers,
  • 16:51 - 16:53
    machines that are commonly used to regulate and control
  • 16:53 - 16:56
    a wide variety of industrial processes.
  • 16:56 - 16:58
    Running commands that were completely untraceable
  • 16:58 - 16:59
    to workers in the plant,
  • 16:59 - 17:02
    Stuxnet targeted centrifuges for sabotage,
  • 17:02 - 17:05
    causing them to explode, seemingly without cause.
  • 17:05 - 17:08
    The virus was only discovered due to an error in an upgrade patch,
  • 17:08 - 17:11
    which allowed it to jump out of the secured military facility
  • 17:11 - 17:13
    and onto the world wide web
  • 17:13 - 17:15
    ... otherwise we would have never even known it existed.
  • 17:15 - 17:20
    The Israelis took our code for the delivery system and changed it.
  • 17:21 - 17:24
    Then, on their own, without our agreement
  • 17:24 - 17:26
    they just fucked up the code.
  • 17:27 - 17:30
    Instead of hiding, the code started shutting down computers
  • 17:30 - 17:33
    ... so naturally people noticed.
  • 17:33 - 17:36
    Because they were in a hurry, they opened Pandora's Box.
  • 17:37 - 17:41
    They let it out and it spread all over the world.
  • 17:41 - 17:44
    The Stuxnet virus set an important historical precedent,
  • 17:44 - 17:46
    as it heralded the beginnings
  • 17:46 - 17:48
    of a dangerous new chapter in modern warfare.
  • 17:48 - 17:52
    Still in their relative infancy, state-led cyber military campaigns
  • 17:52 - 17:54
    are now being conducted under conditions of total secrecy,
  • 17:54 - 17:57
    shrouded from public scrutiny, or even knowledge.
  • 17:57 - 18:00
    And given the widespread incorporation of digital systems
  • 18:00 - 18:02
    into all aspects of industrial civilization,
  • 18:02 - 18:05
    from electrical grids to emergency management systems
  • 18:05 - 18:07
    and even missile launch sites,
  • 18:07 - 18:09
    the potential consequences of these types of attacks
  • 18:09 - 18:11
    could include truly catastrophic loss of life.
  • 18:11 - 18:14
    And while states have been the first to reach this stage
  • 18:14 - 18:16
    in the development of offensive cyber warfare,
  • 18:16 - 18:18
    corporations and other sub-state actors
  • 18:18 - 18:20
    are already charting their own courses
  • 18:20 - 18:22
    in the militarization of digital systems.
  • 18:23 - 18:26
    A lot of what we have as the Internet now
  • 18:26 - 18:27
    - a lot of the building blocks of the Internet -
  • 18:27 - 18:28
    were created by hackers.
  • 18:28 - 18:31
    Experimenting with the technology, coming up with new uses
  • 18:31 - 18:36
    for a communications system that was originally designed
  • 18:36 - 18:40
    to sustain military communication in times of war.
  • 18:40 - 18:43
    And all of these really talented young programmers
  • 18:43 - 18:45
    started to found these Internet start-ups
  • 18:45 - 18:48
    and these companies that become Silicon Valley.
  • 18:48 - 18:51
    So hackers suddenly go from being criminals
  • 18:51 - 18:54
    to these billionaire entrepreneurs.
  • 18:54 - 18:59
    These corporations are gathering data at an impressive scale.
  • 18:59 - 19:02
    People are naturally communicative beings.
  • 19:02 - 19:06
    So we're constantly emitting information.
  • 19:06 - 19:10
    And that information is captured by the social media companies
  • 19:10 - 19:11
    and search engines.
  • 19:11 - 19:14
    And that information is then taken
  • 19:14 - 19:17
    and analyzed using algorithms to find patterns.
  • 19:17 - 19:20
    Facebook actually records everything that you type in the status message
  • 19:20 - 19:22
    – even if you don't send it.
  • 19:22 - 19:25
    And maybe you're just thinking out loud when you're doing this.
  • 19:25 - 19:27
    You're not thinking that you're actually thinking out loud
  • 19:27 - 19:29
    in a really crowded room,
  • 19:29 - 19:31
    with everybody having a recorder on them
  • 19:31 - 19:32
    ... but that's actually what you're doing.
  • 19:32 - 19:34
    I think about our right to privacy
  • 19:34 - 19:37
    the way that I think about a lot of our other rights,
  • 19:37 - 19:41
    in that if we actually had them they would be a good start
  • 19:41 - 19:42
    ... but we don't.
  • 19:42 - 19:45
    Privacy essentially is the right to keep thoughts to ourselves,
  • 19:45 - 19:48
    and the right to decide whom we share them with,
  • 19:48 - 19:50
    and who can actually see them.
  • 19:50 - 19:53
    We have a guaranteed right to privacy in the US Constitution,
  • 19:53 - 19:55
    which says that the state can't just come in
  • 19:55 - 19:59
    and, like, look around at our stuff and do whatever it wants.
  • 19:59 - 20:01
    But they do. And they can.
  • 20:01 - 20:04
    Because the state has a monopoly on power.
  • 20:04 - 20:08
    Protecting your social information and your personal data
  • 20:08 - 20:10
    is also defending your self-determination.
  • 20:10 - 20:12
    There's this notion that's often used
  • 20:12 - 20:14
    by the state and private companies
  • 20:14 - 20:17
    that if you don't have anything to hide,
  • 20:17 - 20:19
    then you don't have to worry about privacy,
  • 20:19 - 20:21
    and you don't need security, and you don't need encryption.
  • 20:22 - 20:24
    If you hear anybody saying
  • 20:24 - 20:27
    “I have nothing to hide. I don't care about my privacy.”
  • 20:27 - 20:29
    I recommend asking them for their credit card information
  • 20:29 - 20:31
    or their social security number.
  • 20:32 - 20:35
    The biggest concern I have with the “I have nothing to hide”
  • 20:35 - 20:38
    is that today it seems really, really easy to say.
  • 20:38 - 20:41
    But in the past we have lived in darker times.
  • 20:41 - 20:45
    And the information we provide is really, really useful for targeting our groups
  • 20:45 - 20:47
    ... or any kind of political activity.
  • 20:47 - 20:50
    For example, surveillance cameras at universities
  • 20:50 - 20:52
    – when you face a bigger threat
  • 20:52 - 20:56
    ... let's say we have a coup d'etat in my country.
  • 20:56 - 21:00
    That surveillance camera information becomes really really different
  • 21:00 - 21:02
    from having just a couple of eyes watching them.
  • 21:02 - 21:04
    What people are really saying, I think,
  • 21:04 - 21:06
    when they don't care about their right to privacy is
  • 21:06 - 21:09
    “I'm not like those bad people.
  • 21:09 - 21:12
    "I'm a good person. I'm a law-abiding citizen.”
  • 21:12 - 21:13
    Which is a meaningless concept.
  • 21:13 - 21:14
    Everybody has secrets.
  • 21:14 - 21:16
    Everybody keeps things to themselves.
  • 21:16 - 21:19
    Whether or not they like to admit it, everybody puts pants on.
  • 21:22 - 21:24
    And then we have these new tendencies
  • 21:24 - 21:28
    like the Internet Research Agency and Cambridge Analytica
  • 21:28 - 21:32
    finding ways to use our communication and social media
  • 21:32 - 21:34
    and create these fake interactions
  • 21:34 - 21:37
    where they can quickly create a profile of us
  • 21:37 - 21:40
    of who we are, where we are in the political spectrum,
  • 21:40 - 21:42
    what our tendencies are,
  • 21:42 - 21:45
    and try to push us in new directions.
  • 21:45 - 21:47
    And kind of control our view of the world.
  • 21:47 - 21:50
    And you know, cyber is becoming so big today.
  • 21:50 - 21:53
    It's becoming something that, a number of years ago,
  • 21:53 - 21:55
    a short number of years ago, wasn't even a word.
  • 21:55 - 21:58
    And now the cyber is so big.
  • 21:58 - 22:01
    We conduct full-spectrum military cyberspace operations
  • 22:01 - 22:03
    to enable actions in all domains,
  • 22:03 - 22:06
    ensure the US and allied freedom of action in cyberspace,
  • 22:06 - 22:08
    and deny the same to any adversary.
  • 22:09 - 22:11
    Breaking news about Russian interference in our election.
  • 22:11 - 22:14
    FBI now investigating Vladimir Putin.
  • 22:14 - 22:17
    And as President Obama promises to retaliate for the cyber attack,
  • 22:17 - 22:19
    the Russian President continues to deny he ordered it.
  • 22:22 - 22:24
    Cyber warfare is really cheap.
  • 22:24 - 22:26
    It requires very little equipment.
  • 22:26 - 22:30
    It's very quiet. It's easily deniable.
  • 22:30 - 22:32
    And so it becomes a really powerful tool
  • 22:32 - 22:34
    for state actors and corporations to use,
  • 22:34 - 22:37
    because it's very easy for them to just brush it off after
  • 22:37 - 22:40
    and say “we never did this” or “we don't know who did this.”
  • 22:42 - 22:46
    Nation states are actively at cyber war with each other.
  • 22:46 - 22:49
    They each have their own dedicated cyber armies
  • 22:49 - 22:52
    for purposes of espionage, for purposes of sabotage.
  • 22:52 - 22:54
    It goes from intelligence gathering
  • 22:54 - 22:58
    to, really, destroying nuclear programs - as they've done in Iran.
  • 22:58 - 23:01
    And probably a whole bunch of other things we don't know about.
  • 23:01 - 23:04
    Because they are so secretive... because they're so easy to hide.
  • 23:04 - 23:08
    Everything run by a state in this area is run in a very military,
  • 23:08 - 23:10
    or corporate way.
  • 23:10 - 23:13
    There are people, y'know, doing shift work.
  • 23:13 - 23:17
    There are very clear plans and strategies.
  • 23:17 - 23:19
    Which means that they'll be working
  • 23:19 - 23:23
    more efficiently towards an actual goal.
  • 23:23 - 23:27
    The state can also use all of these techniques that it's developed
  • 23:27 - 23:29
    against the civilian population.
  • 23:29 - 23:31
    Against any actors it feels are a threat.
  • 23:31 - 23:34
    Since hacking and compromising someone digitally
  • 23:34 - 23:37
    is such an abstract thing,
  • 23:37 - 23:41
    it will probably be easier to pull the trigger on someone,
  • 23:41 - 23:44
    even if you're not exactly sure if they're
  • 23:44 - 23:46
    the person you're looking for.
  • 23:46 - 23:48
    This isn't like conventional warfare either.
  • 23:48 - 23:51
    They can act in a way that obscures the origins of the attack,
  • 23:51 - 23:54
    and they're not held to any standards of transparency
  • 23:54 - 23:56
    or accountability on the world stage.
  • 23:56 - 23:58
    If we're talking about a government that has no problems
  • 23:58 - 23:59
    sending drones into a country,
  • 23:59 - 24:00
    I mean... obviously they're not going to
  • 24:00 - 24:04
feel any need to answer for their hacking activities.
  • 24:04 - 24:06
    It's very likely that we'll see a rise
  • 24:06 - 24:10
    in groups using cyber-warfare to advance their own political gains,
  • 24:10 - 24:12
    or to counter-attack repression.
  • 24:14 - 24:17
    It’s often said that there’s no such thing as perfect security.
  • 24:17 - 24:20
    All systems contain potential vulnerabilities
  • 24:20 - 24:23
    that can be exploited by determined and capable adversaries.
  • 24:23 - 24:25
    And when you choose to go up against the state,
  • 24:25 - 24:28
    you’ve chosen an adversary that is both.
  • 24:28 - 24:30
    Dozens of FBI agents targeted alleged members
  • 24:30 - 24:32
    of a loose-knit hacking group.
  • 24:32 - 24:35
    Armed with search warrants, agents hit six homes in New York,
  • 24:35 - 24:37
    along with locations across the country.
  • 24:37 - 24:40
    The best we can do is develop security protocols
  • 24:40 - 24:42
    that are adequate for the task at hand.
  • 24:42 - 24:45
    This means being constantly aware of the risks involved
  • 24:45 - 24:47
    with the actions that we carry out,
  • 24:47 - 24:50
and understanding what steps we can take to mitigate those risks.
  • 24:50 - 24:51
    When it comes to communication,
  • 24:51 - 24:54
    this means using methods and tools that are available
  • 24:54 - 24:57
    to thwart interception and mass data collection
  • 24:57 - 24:59
    in order to at least make things as difficult and expensive
  • 24:59 - 25:01
    for our enemies as possible.
  • 25:01 - 25:05
    How would they tell you to access the material on this phone?
  • 25:06 - 25:07
    I think they would say what they've said,
  • 25:07 - 25:09
    which I believe is in good faith.
  • 25:09 - 25:12
    That we have designed this in response to what we believe to be
  • 25:12 - 25:17
    the demands of our customers to be immune to any government warrant,
  • 25:17 - 25:20
    or our (the manufacturer's) efforts to get into that phone.
  • 25:20 - 25:24
    It’s also important to remember that this truth cuts both ways.
  • 25:24 - 25:27
    As infallible as the systems of social control may appear,
  • 25:27 - 25:30
    they too have vulnerabilities that are just waiting to be exploited
  • 25:30 - 25:32
    by determined and capable adversaries.
  • 25:32 - 25:34
    Let’s hope we can rise to the challenge.
  • 25:35 - 25:39
    Someone has broken into the national bank – the Federal Reserve.
  • 25:39 - 25:44
    A Twenty-First Century thief breaking into files, not into metal safes.
  • 25:45 - 25:47
    I think there's a lot of interesting things
  • 25:47 - 25:50
    that anarchist or anti-fascist collectives
  • 25:50 - 25:53
    could do with hacking for their movements.
  • 25:53 - 25:55
    But something I think is more interesting to me is:
  • 25:55 - 25:58
    how can we use technology,
  • 25:58 - 26:01
    and use hacking skills to come up with new ways
  • 26:01 - 26:04
    to connect to each other, in a global movement
  • 26:04 - 26:08
    where we can come to agreements together in a way that's safe
  • 26:08 - 26:10
    ... that doesn't expose us,
  • 26:10 - 26:12
    that doesn't put us at risk of surveillance?
  • 26:13 - 26:16
    We should begin with the assumption that the Internet is hostile territory.
  • 26:16 - 26:18
    It's an ongoing state of war.
  • 26:18 - 26:23
    Military and law enforcement are using it as a tool for social control.
  • 26:23 - 26:24
    But it doesn't have to be this way.
  • 26:24 - 26:27
    And hackers and activists, we could use it to undermine
  • 26:27 - 26:29
    and subvert these systems of power.
  • 26:29 - 26:31
    We could create secure communication networks
  • 26:31 - 26:33
    to coordinate the next big demonstration.
  • 26:33 - 26:38
    But you certainly would have to be aware of encryption,
  • 26:38 - 26:41
    of using proxy servers... of using software like Tor.
  • 26:41 - 26:42
    You have to be able to protect yourself.
  • 26:42 - 26:45
    Because if not, they're going to use it against us.
  • 26:45 - 26:49
    I think the first step for any radical to protect themselves on the Internet
  • 26:49 - 26:52
    is to understand their threat model.
  • 26:52 - 26:53
    There's a really great resource
  • 26:53 - 26:55
    that the Electronic Frontier Foundation has
  • 26:55 - 26:58
    on figuring out your threat model.
  • 26:58 - 27:01
    And they cover a lot of what you need to be thinking about with these things.
  • 27:02 - 27:07
There's this software called Tor that is very good for anonymizing yourself
  • 27:07 - 27:09
    if you want to do something over the Internet.
  • 27:09 - 27:13
    People are recommending a messaging app for cell phones named Signal.
  • 27:14 - 27:18
    The Tor browser, which when you install it and when you start it up,
  • 27:18 - 27:23
    it sets up a connection to a decentralized network.
  • 27:23 - 27:26
    And then your communication will go via this network
  • 27:26 - 27:28
    in a number of different hops,
  • 27:28 - 27:31
    so that if you're browsing to a website,
  • 27:31 - 27:33
    it's not possible for that website
  • 27:33 - 27:35
    to actually tell where you're coming from.
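The multi-hop routing just described can be sketched with a toy model of onion routing. This is purely illustrative: real Tor uses layered public-key cryptography and a directory of volunteer-run relays, not the reversible base64 encoding used here.

```python
# Toy illustration of onion routing: the client wraps a message in one
# "layer" per relay; each relay peels exactly one layer, so no single
# relay sees both the sender and the final destination.
# NOTE: real Tor encrypts each layer to a specific relay's public key;
# base64 here just stands in for "a layer the next hop can remove".
import base64

def wrap(message: str, relays: list[str]) -> str:
    """Client side: add one layer per relay, innermost layer last."""
    payload = message
    for relay in reversed(relays):
        # Each layer names only the *next* hop, then hides the rest.
        layer = f"{relay}|{payload}"
        payload = base64.b64encode(layer.encode()).decode()
    return payload

def peel(payload: str) -> tuple[str, str]:
    """Relay side: remove one layer, learning only the next hop."""
    layer = base64.b64decode(payload).decode()
    next_hop, rest = layer.split("|", 1)
    return next_hop, rest

onion = wrap("GET example.org", ["guard", "middle", "exit"])
hop, onion = peel(onion)   # the guard relay peels its layer...
assert hop == "guard"
hop, onion = peel(onion)   # ...the middle relay peels the next...
assert hop == "middle"
hop, onion = peel(onion)   # ...and only the exit sees the request.
assert (hop, onion) == ("exit", "GET example.org")
```

The point the speaker is making falls out of the structure: the exit relay sees the request but not who sent it, and the guard sees the sender but not the destination.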
  • 27:35 - 27:38
    One of the most important things that we really need to do
  • 27:38 - 27:40
    is segment our identities online.
  • 27:40 - 27:42
    And so don't re-use identities.
  • 27:42 - 27:43
    Don't make them last for a long time.
  • 27:43 - 27:46
    You might have a public identity, which you carefully curate,
  • 27:46 - 27:50
    and then create yourself new identities to really target
  • 27:50 - 27:52
    special operations that you wanna do.
  • 27:52 - 27:54
    Special events that happen.
  • 27:54 - 27:56
    If there's some protest going on,
  • 27:56 - 27:58
    maybe make a new identity.
  • 27:58 - 27:59
    It makes things a little bit difficult
  • 27:59 - 28:02
    because we do tend to operate on a trust basis,
  • 28:02 - 28:03
    and you need to re-build these connections.
  • 28:03 - 28:08
    But definitely it's the only way to make sure that you stay protected.
  • 28:08 - 28:12
    If you have for some reason linked
  • 28:12 - 28:17
    your regular identity with your Internet persona,
  • 28:17 - 28:20
    it's going to bring problems if some kind of neo-nazi
  • 28:20 - 28:25
    wants to publish that information on a board or whatever.
  • 28:25 - 28:29
    Within the hacker community there's a strong ethos of
  • 28:29 - 28:30
    shutting the hell up.
  • 28:30 - 28:32
    Don't talk about things that you've done.
  • 28:32 - 28:36
    And it comes back to this idea that your identity is really not that important.
  • 28:36 - 28:40
Forget the fame... do it for the actual purpose of the thing you wanna do.
  • 28:40 - 28:42
    But don't actually care about
  • 28:42 - 28:44
    whether or not people really will know that it's you.
  • 28:44 - 28:46
    Don't tell your friends... don't talk.
  • 28:46 - 28:48
    Shut the hell up.
  • 28:48 - 28:51
    Use really strong passwords, and a password manager.
  • 28:51 - 28:54
    That's a really typical avenue that doxxing happens through,
  • 28:54 - 28:56
    is just people finding your simple password
  • 28:56 - 28:58
    and gaining access to your accounts.
  • 28:58 - 29:03
    Use two-factor authentication on all accounts that you can.
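The password advice above can be made concrete with Python's standard-library `secrets` module, which does essentially what a password manager's generator does. The short word list in the passphrase example is a stand-in; a real setup would use a full Diceware-style list.

```python
# Generating high-entropy credentials with Python's standard-library
# `secrets` module (cryptographically secure randomness, unlike `random`).
import secrets
import string

def random_password(length: int = 24) -> str:
    """Uniformly random password drawn from letters, digits, punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

def random_passphrase(words: list[str], n: int = 6) -> str:
    """Diceware-style passphrase: easier to memorize at similar entropy."""
    return " ".join(secrets.choice(words) for _ in range(n))

pw = random_password()
# 24 chars from a ~94-symbol alphabet is roughly 150 bits of entropy --
# far beyond what the "simple password" doxxing avenue can guess.
print(len(pw))
```

Either form defeats the attack described above; the passphrase variant trades length for memorability when a manager isn't available.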
  • 29:03 - 29:04
    If you're trying to learn how to hack,
  • 29:04 - 29:06
    you are going to have to learn how to program.
  • 29:06 - 29:08
    How to computer program.
  • 29:08 - 29:12
    And if you can learn how to develop websites and run servers,
  • 29:12 - 29:14
    you could look at other people's code
  • 29:14 - 29:16
    and spot the mistakes that they've made
  • 29:16 - 29:19
    that allow you to exploit and leverage that vulnerability.
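"Spotting the mistakes" usually means recognizing classic patterns. A minimal sketch of one of the most common, SQL built by string formatting, next to its parameterized fix, using Python's built-in `sqlite3` with an in-memory database:

```python
# A textbook example of the kind of mistake you learn to spot when
# reading other people's code: SQL assembled by string formatting.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
db.execute("INSERT INTO users VALUES ('alice', 0), ('root', 1)")

def lookup_vulnerable(name: str):
    # BUG: attacker-controlled input is spliced straight into the query.
    return db.execute(
        f"SELECT name FROM users WHERE name = '{name}'").fetchall()

def lookup_safe(name: str):
    # FIX: parameterized query -- the driver escapes the value for us.
    return db.execute(
        "SELECT name FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"                 # classic injection string
assert lookup_vulnerable(payload) == [("alice",), ("root",)]  # leaks all rows
assert lookup_safe(payload) == []       # the same input finds nothing
```

The vulnerable version turns the WHERE clause into a tautology and dumps the whole table; the fixed version treats the payload as an ordinary (non-matching) string.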
  • 29:20 - 29:22
    Re-install your computer as often as you can.
  • 29:22 - 29:25
    Try out new operating systems.
  • 29:25 - 29:27
    If you've never used Linux, install it.
  • 29:27 - 29:31
    If you've used Linux before, install another distribution of Linux.
  • 29:31 - 29:35
    If you've always used Mac OS, maybe try Windows.
  • 29:35 - 29:38
    Just try new things and don't be afraid to break it,
  • 29:38 - 29:40
    because in the end if the computer breaks
  • 29:40 - 29:43
    you can always just reinstall the operating system.
  • 29:43 - 29:47
    Try the different tutorials you can find on Kali Linux,
  • 29:47 - 29:49
    and really try and attack your own systems.
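Attacking your own systems can start very small. Here is a minimal sketch of what a port scanner does at its core, demonstrated against a listener the script itself opens on localhost, so nothing outside your own machine is touched:

```python
# A minimal sketch of what a port scanner (e.g. nmap) does at its core:
# attempt a TCP connection and see whether anything answers.
# Only ever point this at machines you own or have permission to test.
import socket

def scan(host: str, ports: range, timeout: float = 0.2) -> list[int]:
    """Return the subset of `ports` accepting TCP connections on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means "connected"
                open_ports.append(port)
    return open_ports

# Demo against a listener we control: bind a throwaway server socket...
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))            # the OS picks a free port
server.listen(1)
port = server.getsockname()[1]

# ...and confirm the scanner finds exactly that port in its range.
found = scan("127.0.0.1", range(port, port + 1))
assert found == [port]
server.close()
```

Real scanners add parallelism, UDP probes, and service fingerprinting, but the connect-and-check loop above is the essential move.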
  • 29:50 - 29:52
    Try to think outside the box.
  • 29:52 - 29:55
    In that way, y'know, hacking is a lot like anarchy
  • 29:55 - 29:57
    – things are what you make of it.
  • 29:58 - 30:02
    It's not only about us getting more involved
  • 30:02 - 30:04
    in technology and using technology.
  • 30:04 - 30:06
    We also have to bring the politics to technology.
  • 30:06 - 30:08
    We have to connect with the spaces
  • 30:08 - 30:11
    where free software is being developed
  • 30:11 - 30:15
    and make our politics a part of that space.
  • 30:15 - 30:17
    And I think that's something that's happening, right?
  • 30:17 - 30:21
    We can see that in a lot of the free software communities.
  • 30:21 - 30:23
    But it's something we need more of.
  • 30:24 - 30:26
    Decentralized, ideologically-driven hacker collectives,
  • 30:26 - 30:29
    if we unite our efforts we could, without any resources whatsoever
  • 30:29 - 30:32
dismantle a corporation, humiliate a politician.
  • 30:32 - 30:35
    And independent hackers, we have the advantage
  • 30:35 - 30:37
    since we're not doing it for a paycheck.
  • 30:37 - 30:40
    We're not doing it for any kind of allegiance to a country.
  • 30:40 - 30:42
    We're up all night.
  • 30:42 - 30:44
    We're breaking into systems because we love it.
  • 30:44 - 30:47
    Because the thrill of breaking into anything that they can build
  • 30:47 - 30:52
    while being able to undermine their systems of control is a better driver,
  • 30:52 - 30:55
    a better incentive for hackers than a paycheck
  • 30:55 - 30:57
    or... America.
  • 30:57 - 30:58
    If you take the offensive and hack,
  • 30:58 - 31:01
    expose and destroy these systems of the rich and powerful,
  • 31:01 - 31:03
    we could drive them offline.
  • 31:03 - 31:07
    Hack the planet! Hack the planet!
  • 31:10 - 31:12
    As a deepening awareness has emerged of the role
  • 31:12 - 31:13
    that Russian hackers played
  • 31:13 - 31:16
    in swaying the 2016 US Presidential election,
  • 31:16 - 31:19
and Facebook has been pressured to release information
  • 31:19 - 31:22
    on the Kremlin’s widespread usage of its targeted ads function
  • 31:22 - 31:23
    as a means of exacerbating tensions
  • 31:23 - 31:26
and sowing political discord among the American public,
  • 31:26 - 31:28
    hacking has moved from the margins of popular culture
  • 31:28 - 31:30
    to the center of mainstream political discourse.
  • 31:30 - 31:33
    If our movements of resistance have any hope of remaining relevant
  • 31:33 - 31:35
    in this rapidly shifting political climate,
  • 31:35 - 31:38
    it is vitally important that we understand the ways in which
  • 31:38 - 31:41
    power is restructuring itself in our current digital age,
  • 31:41 - 31:44
    and adapt our theory and practice accordingly.
  • 31:44 - 31:46
    So at this point, we’d like to remind you
  • 31:46 - 31:48
    that Trouble is intended to be watched in groups,
  • 31:48 - 31:49
    and to be used as a resource
  • 31:49 - 31:51
    to promote discussion and collective organizing.
  • 31:51 - 31:54
    Are you interested in upping your digital security,
  • 31:54 - 31:56
    or exploring the ways in which you can better incorporate
  • 31:56 - 32:00
an offensive online strategy into your organizing campaigns?
  • 32:00 - 32:02
    Consider getting together with some comrades,
  • 32:02 - 32:05
    screening this film, discussing how this might be done,
  • 32:05 - 32:08
    and possibly pairing it with an info-session on how to use Tor,
  • 32:08 - 32:10
    and how to encrypt your communication devices.
  • 32:11 - 32:14
    Interested in running regular screenings of Trouble at your campus,
  • 32:14 - 32:16
    infoshop, community center,
  • 32:16 - 32:18
    or even just at your home with friends?
  • 32:18 - 32:19
    Become a Trouble-Maker!
  • 32:19 - 32:20
    For 10 bucks a month,
  • 32:20 - 32:23
    we’ll hook you up with an advanced copy of the show
  • 32:23 - 32:25
    and a screening kit featuring additional resources
  • 32:25 - 32:28
    and some questions you can use to get a discussion going.
  • 32:28 - 32:31
    If you can’t afford to support us financially, no worries!
  • 32:31 - 32:35
    You can stream and/or download all our content for free off our website:
  • 32:37 - 32:41
    If you’ve got any suggestions for show topics, or just want to get in touch,
  • 32:41 - 32:43
    drop us a line at trouble@sub.media.
  • 32:43 - 32:45
    We’d like to remind you that our fundraiser
  • 32:45 - 32:48
    to grow our subMedia collective is still ongoing.
  • 32:48 - 32:50
    We will be doing one final push in December,
  • 32:50 - 32:53
    and hope to reach our goals by the end of the year.
  • 32:53 - 32:56
    To help make sure this happens, go to sub.media/donate
  • 32:56 - 33:00
    and become a monthly sustainer for as little as $2 per month.
  • 33:00 - 33:03
    As always, we’re excited to see that people have been
  • 33:03 - 33:04
    supporting and screening our work,
  • 33:04 - 33:07
    and wanna give a big shout out to new troublemaker chapters
  • 33:07 - 33:11
    in Vancouver, Prince George, Seattle, Bloomington, Brighton,
  • 33:11 - 33:14
    Ithaca, Quebec City, Prescott and Edinburgh.
  • 33:14 - 33:16
    If you’ve been organizing screenings in your town
  • 33:16 - 33:19
    and we haven’t given you a shout-out, let us know!
  • 33:19 - 33:21
    We will be taking the month of December off,
  • 33:21 - 33:23
    and will be back with a fresh season of Trouble,
  • 33:23 - 33:27
    plus a ton of fresh new subMedia content, starting in January.
  • 33:27 - 33:31
    This episode would not have been possible without the generous support of
  • 33:31 - 33:34
    Nicholas, Josh, Avispa Midia, Peter and Biella.
  • 33:34 - 33:36
    Now get out there, and make some trouble!
Video Language:
English
Duration:
34:06
