
Last Week Tonight with John Oliver: Government Surveillance (HBO)

  • 0:05 - 0:06
    Our main story tonight
  • 0:06 - 0:08
    is government surveillance.
  • 0:08 - 0:09
    And I realize most people
  • 0:09 - 0:10
    would rather have a conversation
  • 0:10 - 0:13
    about literally any other topic.
  • 0:13 - 0:14
    Including: 'Is my smartphone
  • 0:14 - 0:15
    giving me cancer?'
  • 0:15 - 0:17
    To which the answer is: Probably.
  • 0:17 - 0:19
    Or: 'Do goldfish suffer from depression?'
  • 0:19 - 0:20
    To which the answer is:
  • 0:20 - 0:22
    Yes, but very briefly.
  • 0:23 - 0:24
    But the fact is:
  • 0:24 - 0:25
    It is vital
  • 0:25 - 0:27
    that we have a discussion about this now.
  • 0:27 - 0:29
    Because an important date
  • 0:29 - 0:31
    is just around the corner.
  • 0:31 - 0:32
    One big day to circle on the calendar
  • 0:32 - 0:33
    when it comes to
  • 0:33 - 0:35
    a very controversial subject.
  • 0:35 - 0:37
    The re-authorization of the Patriot Act
  • 0:37 - 0:37
    and all of the
  • 0:37 - 0:39
    controversial provisions therein.
  • 0:39 - 0:41
    June 1 they've got to come to an agreement
  • 0:41 - 0:42
    to re-authorize
  • 0:42 - 0:44
    or curtail those programs.
  • 0:44 - 0:46
    Yes. Some controversial provisions
  • 0:46 - 0:47
    within the Patriot Act
  • 0:47 - 0:49
    are set to expire on June 1.
  • 0:49 - 0:51
    So circle that date
  • 0:51 - 0:52
    on your calendars, everyone.
  • 0:52 - 0:53
    And while you're at it:
  • 0:53 - 0:55
    Circle June 2 as well.
  • 0:55 - 0:57
    Because that's Justin Long's birthday.
  • 0:57 - 0:59
    You all forgot last year...
  • 0:59 - 1:00
    ...and he f*cking noticed.
  • 1:01 - 1:04
    Now, over the last couple of years
  • 1:04 - 1:06
    you've probably heard a lot about
  • 1:06 - 1:07
    strange-sounding programs. Such as:
  • 1:07 - 1:11
    X-Keyscore, Muscular, Prism, and Mystic.
  • 1:11 - 1:13
    Which are -coincidentally- also the names
  • 1:13 - 1:14
    of some of Florida's
  • 1:14 - 1:15
    least popular strip clubs.
  • 1:16 - 1:17
    "Welcome to X-Keyscore!"
  • 1:17 - 1:19
    "Our dancers are fully un-redacted
  • 1:19 - 1:22
    and Tuesday is wing-night!"
  • 1:23 - 1:25
    But if you don't mind, I would like
  • 1:25 - 1:27
    to refresh your memory over some of this.
  • 1:27 - 1:28
    And let's start by focusing on the most
  • 1:28 - 1:30
    controversial portion of the Patriot Act
  • 1:30 - 1:32
    that is up for renewal:
  • 1:32 - 1:33
    Section 215.
  • 1:33 - 1:35
    Which, I'm aware, sounds like the name
  • 1:35 - 1:37
    of an Eastern European boy-band.
  • 1:38 - 1:39
    "We are Section 215."
  • 1:39 - 1:41
    "Prepare to have your hearts...
  • 1:41 - 1:42
    throbbed."
  • 1:43 - 1:47
    There's the cute one, the bad-boy,
  • 1:47 - 1:49
    the one who strangled a potato-farmer,
  • 1:49 - 1:51
    and the one without an iron-deficiency.
  • 1:51 - 1:52
    They're incredible.
  • 1:53 - 1:56
    But the content of the real Section 215
  • 1:56 - 1:59
    is actually even more sinister.
  • 1:59 - 2:00
    It's called Section 215.
  • 2:00 - 2:03
    Nicknamed: the library records provision.
  • 2:03 - 2:05
    Which allows the government to require
  • 2:05 - 2:06
    businesses to hand over records of,
  • 2:06 - 2:09
    quote, 'any tangible things'
  • 2:09 - 2:11
    including: books, records, papers,
  • 2:11 - 2:13
    documents, and other items.
  • 2:13 - 2:15
    If that sounds broad, it's because
  • 2:15 - 2:17
    it was very much written that way.
  • 2:17 - 2:19
    Section 215 says the government can ask
  • 2:19 - 2:22
    for 'any tangible things' so long as it's
  • 2:22 - 2:24
    'for an investigation to protect
  • 2:24 - 2:25
    against international terrorism'.
  • 2:25 - 2:27
    Which is basically a blank cheque.
  • 2:27 - 2:29
    It's letting a teenager borrow the car
  • 2:29 - 2:31
    under the strict condition that they
  • 2:31 - 2:33
    only use it for 'car-related activities'.
  • 2:33 - 2:35
    "Okay, mom and dad, I'm gonna use this
  • 2:35 - 2:37
    for a hand-job in a Wendy's parking lot
  • 2:37 - 2:39
    but that is car-related,
  • 2:39 - 2:40
    so I think I'm covered."
  • 2:41 - 2:43
    Section 215 is overseen
  • 2:43 - 2:45
    by a secret intelligence-court
  • 2:45 - 2:46
    known as the FISA-court.
  • 2:46 - 2:48
    And they've interpreted it to mean
  • 2:48 - 2:50
    the government could basically collect and
  • 2:50 - 2:52
    store phone-records for every American.
  • 2:52 - 2:54
    The vast majority of whom, of course
  • 2:54 - 2:56
    have no connection to terrorism.
  • 2:56 - 2:59
    Unless, Aunt Cheryl has been gravely
  • 2:59 - 3:00
    mis-characterizing the activities
  • 3:00 - 3:02
    of her needle-point club.
  • 3:02 - 3:03
    "It's a sleeper-cell,
  • 3:03 - 3:05
    isn't it, Aunt Cheryl?"
  • 3:05 - 3:07
    "You will hang for this, Aunt Cheryl."
  • 3:07 - 3:09
    "You're a traitor and a terrible aunt."
  • 3:09 - 3:10
    "Not in that order."
  • 3:12 - 3:14
    Now, the government will point out
  • 3:14 - 3:17
    that under 215, they hold phone-records
  • 3:17 - 3:19
    and not the calls themselves.
  • 3:19 - 3:24
    What the intelligence-community is doing
  • 3:24 - 3:29
    is looking at phone-numbers
  • 3:29 - 3:31
    and durations of calls
  • 3:31 - 3:34
    they are not looking at people's names
  • 3:34 - 3:36
    and they are not looking at content.
  • 3:37 - 3:40
    Yes, but that's not entirely reassuring.
  • 3:40 - 3:42
    Because you can extrapolate a lot
  • 3:42 - 3:43
    from that information.
  • 3:43 - 3:45
    If they knew that you'd called your ex
  • 3:45 - 3:48
    12 times last night, between 1 and 4 AM
  • 3:48 - 3:51
    for a duration of 15 minutes each time
  • 3:51 - 3:53
    they can be fairly sure that you left
  • 3:53 - 3:55
    some pretty pathetic voice-mails.
  • 3:55 - 3:56
    "I don't care who's monitoring
  • 3:56 - 3:57
    this call, Vicky."
  • 3:57 - 3:59
    "We should be together!"
  • 3:59 - 4:00
    Pick up the phone, dammit!
  • 4:00 - 4:03
    I'm a human being, not an animal!"
  • 4:04 - 4:05
    Now, the Patriot Act was written
  • 4:05 - 4:07
    just after 9/11.
  • 4:07 - 4:08
    And for years it was extended
  • 4:08 - 4:10
    and re-authorized
  • 4:10 - 4:11
    with barely a passing thought.
  • 4:11 - 4:13
    In fact, it became so routine
  • 4:13 - 4:16
    that when it was extended in 2011
  • 4:16 - 4:18
    one newscast just tacked it
  • 4:18 - 4:19
    onto the end of a report
  • 4:19 - 4:21
    about a Presidential trip abroad.
  • 4:21 - 4:22
    Chip Reid, CBS News.
  • 4:22 - 4:24
    Travelling with the President
  • 4:24 - 4:25
    in Deauville, France.
  • 4:25 - 4:27
    Also in France, by the way
  • 4:27 - 4:28
    President Obama signed into law a
  • 4:28 - 4:30
    4-year extension
  • 4:30 - 4:32
    of the terrorism-fighting Patriot Act.
  • 4:32 - 4:34
    Also in France, by the way?
  • 4:35 - 4:36
    By the way?
  • 4:36 - 4:37
    He threw that in
  • 4:37 - 4:38
    like a mother telling her grown daughter
  • 4:38 - 4:41
    that her childhood pet just died.
  • 4:41 - 4:42
    "Oh, nice talking to you, sweetie.
  • 4:42 - 4:44
    Also, by the way, Mr. Peppers is dead.
  • 4:44 - 4:46
    See you at Christmas." BANG
  • 4:47 - 4:49
    But all of that was before the public
  • 4:49 - 4:51
    was made aware of what the government's
  • 4:51 - 4:54
    capabilities actually were.
  • 4:54 - 4:57
    'Cause that all ended in June of 2013.
  • 4:57 - 4:58
    Edward Snowden has just taken
  • 4:58 - 5:00
    responsibility for one of the
  • 5:00 - 5:02
    biggest government leaks in US history.
  • 5:02 - 5:04
    We learned that the government has
  • 5:04 - 5:05
    the capacity to track
  • 5:05 - 5:07
    virtually every American phone-call
  • 5:07 - 5:08
    and to scoop up impossibly vast
  • 5:08 - 5:10
    quantities of data across the internet.
  • 5:10 - 5:13
    Revelations that the NSA eavesdropped
  • 5:13 - 5:14
    on world leaders.
  • 5:14 - 5:15
    If you've ever been to the Bahamas
  • 5:15 - 5:17
    the NSA could've recorded your phone-calls
  • 5:17 - 5:19
    and stored them for up to a month.
  • 5:19 - 5:22
    All that information was exposed
  • 5:22 - 5:23
    by Edward Snowden.
  • 5:23 - 5:25
    And it is still kind of incredible
  • 5:25 - 5:26
    that a 29-year-old contractor
  • 5:26 - 5:28
    was able to steal top-secret documents
  • 5:28 - 5:30
    from an organization
  • 5:30 - 5:32
    that LITERALLY has the word 'security'
  • 5:32 - 5:33
    in its name.
  • 5:34 - 5:36
    Clearly, that was not great for them.
  • 5:36 - 5:37
    The only place where it should be THAT
  • 5:37 - 5:39
    easy for employees in their 20s to steal
  • 5:39 - 5:41
    is a Lids store.
  • 5:42 - 5:43
    "Dude, you sure I should take this?"
  • 5:43 - 5:45
    "Relax, dude, it's a Miami Marlins-cap,
  • 5:45 - 5:46
    we're not exactly selling
  • 5:46 - 5:48
    Fabergé eggs here."
  • 5:49 - 5:51
    It is still unclear
  • 5:51 - 5:53
    exactly how many documents
  • 5:53 - 5:54
    Edward Snowden stole.
  • 5:54 - 5:56
    Although he's consistently tried
  • 5:56 - 5:57
    to re-assure people
  • 5:57 - 5:59
    that he put them in good hands.
  • 5:59 - 6:01
    Honestly, I don't want to be the person
  • 6:01 - 6:03
    making the decisions on what should
  • 6:03 - 6:05
    be public and what shouldn't.
  • 6:05 - 6:05
    Which is why
  • 6:05 - 6:08
    rather than publishing these on my own
  • 6:08 - 6:10
    or putting them out openly
  • 6:10 - 6:12
    I'm running them through journalists.
  • 6:12 - 6:14
    Well, that sounds great.
  • 6:14 - 6:16
    But of course it's not a fail-safe plan.
  • 6:16 - 6:17
    As was proven when the New York Times
  • 6:17 - 6:19
    published this slide
  • 6:19 - 6:20
    but did such a sloppy job
  • 6:20 - 6:22
    of blocking out redacted information
  • 6:22 - 6:24
    that some people were able to read
  • 6:24 - 6:26
    the information behind that black bar
  • 6:26 - 6:28
    which concerned how the US was monitoring
  • 6:28 - 6:29
    Al Qaida in Mosul.
  • 6:29 - 6:31
    A group now known as ISIS.
  • 6:31 - 6:33
    So essentially a national security secret
  • 6:33 - 6:35
    was leaked because no-one at the Times
  • 6:35 - 6:38
    knows how to use Microsoft Paint.
  • 6:39 - 6:41
    And look, you can think
  • 6:41 - 6:43
    that Snowden did the wrong thing.
  • 6:43 - 6:45
    Or did it in the wrong way.
  • 6:45 - 6:46
    But the fact is:
  • 6:46 - 6:47
    we have this information now
  • 6:47 - 6:49
    and we no longer get the luxury
  • 6:49 - 6:51
    of pleading ignorance.
  • 6:51 - 6:53
    It's like you can't go to Sea World
  • 6:53 - 6:55
    and pretend that Shamu's happy anymore.
  • 6:55 - 6:56
    When we now know
  • 6:56 - 6:58
    at least half the water in her tank
  • 6:58 - 6:59
    is whale-tears.
  • 6:59 - 7:01
    We know that now.
  • 7:01 - 7:03
    You can't un-know that information.
  • 7:04 - 7:06
    So you have to bear that in mind.
  • 7:06 - 7:08
    But here's the thing:
  • 7:08 - 7:09
    It's now 2 years later
  • 7:09 - 7:11
    and it seems like we've kind of forgotten
  • 7:11 - 7:12
    to have a debate
  • 7:12 - 7:14
    over the content of what Snowden leaked.
  • 7:14 - 7:16
    A recent Pew-report found that nearly
  • 7:16 - 7:18
    half of Americans say that they're
  • 7:18 - 7:19
    'not very concerned'
  • 7:19 - 7:21
    or 'not at all concerned'
  • 7:21 - 7:22
    about government surveillance.
  • 7:22 - 7:24
    Which is fine.
  • 7:24 - 7:25
    If that's an informed opinion.
  • 7:25 - 7:27
    But I'm not sure that it is.
  • 7:27 - 7:29
    Because we actually sent a camera-crew to
  • 7:29 - 7:32
    Times Square to ask some random passers-by
  • 7:32 - 7:35
    who Edward Snowden was and what he did.
  • 7:35 - 7:37
    And these are the responses that we got.
  • 7:37 - 7:39
    I have no idea who Edward Snowden is.
  • 7:39 - 7:42
    Have no idea who Edward Snowden is.
  • 7:42 - 7:44
    I've heard the name, I just can't picture
  • 7:44 - 7:47
    think... right now exactly what it is.
  • 7:47 - 7:49
    Edward Snowden...
  • 7:49 - 7:50
    No. I do not.
  • 7:50 - 7:52
    Just for the record:
  • 7:52 - 7:53
    that wasn't cherry-picking.
  • 7:53 - 7:55
    That was entirely reflective
  • 7:55 - 7:56
    of everyone we spoke to.
  • 7:56 - 7:57
    Although, to be fair:
  • 7:57 - 7:59
    some people did remember his name
  • 7:59 - 8:01
    they just couldn't remember why.
  • 8:01 - 8:04
    He sold some information to people.
  • 8:04 - 8:06
    He revealed some information
  • 8:06 - 8:09
    that shouldn't have been revealed.
  • 8:09 - 8:10
    I think from what I remember is
  • 8:10 - 8:12
    the information that he shared was
  • 8:12 - 8:14
    detrimental to our military secrets?
  • 8:14 - 8:16
    And keeping our soldiers and our country
  • 8:16 - 8:17
    safe?
  • 8:17 - 8:19
    He leaked documents about the US Army's
  • 8:19 - 8:21
    operations in Iraq.
  • 8:21 - 8:23
    Edward Snowden revealed a bunch of
  • 8:23 - 8:25
    of secrets, I guess, or information
  • 8:25 - 8:26
    into Wiki... Wikileaks?
  • 8:26 - 8:28
    Edward Snowden leaked...
  • 8:28 - 8:29
    Ah, he's in charge of Wikileaks?
  • 8:29 - 8:31
    Edward Snowden revealed a lot of
  • 8:31 - 8:33
    documents through Wikileaks...?
  • 8:35 - 8:36
    Okay, so here's the thing:
  • 8:36 - 8:39
    Edward Snowden is NOT the Wikileaks guy.
  • 8:39 - 8:41
    The Wikileaks guy is Julian Assange.
  • 8:41 - 8:43
    And you do not want
  • 8:43 - 8:44
    to be confused with him.
  • 8:44 - 8:46
    Partly because he was far less careful
  • 8:46 - 8:48
    than Snowden in what he released and how.
  • 8:48 - 8:50
    And partly because he resembles
  • 8:50 - 8:51
    a sandwich-bag full of biscuit-dough
  • 8:51 - 8:53
    wearing a Stevie Nicks-wig.
  • 8:54 - 8:57
    And that is, that is critical.
  • 8:57 - 8:59
    Julian Assange is not a likeable man.
  • 8:59 - 9:02
    Even Benedict Cumberbatch could not
  • 9:02 - 9:03
    make him likeable.
  • 9:03 - 9:05
    He's un-Cumberbatch-able.
  • 9:06 - 9:07
    That was supposed to be
  • 9:07 - 9:09
    physically impossible.
  • 9:09 - 9:11
    But I don't blame people
  • 9:11 - 9:12
    for being confused.
  • 9:12 - 9:13
    We've been looking at this story
  • 9:13 - 9:15
    for the last 2 weeks
  • 9:15 - 9:17
    and it is hard to get your head around.
  • 9:17 - 9:19
    Not just because there are so many
  • 9:19 - 9:21
    complicated programs to keep track of
  • 9:21 - 9:22
    but also because
  • 9:22 - 9:24
    there are no easy answers here.
  • 9:24 - 9:26
    We all naturally want perfect privacy
  • 9:26 - 9:29
    and perfect safety.
  • 9:29 - 9:31
    But those 2 things cannot coexist.
  • 9:31 - 9:33
    It's like how you can't have
  • 9:33 - 9:35
    a badass pet falcon...
  • 9:35 - 9:37
    and an adorable pet vole named Herbert.
  • 9:39 - 9:41
    Either you have to lose one of them
  • 9:41 - 9:43
    -which obviously you don't want to do-
  • 9:43 - 9:44
    or you have to accept some
  • 9:44 - 9:47
    reasonable restrictions on both of them.
  • 9:47 - 9:50
    Now to be fair, the NSA will argue
  • 9:50 - 9:52
    that just because they CAN do something
  • 9:52 - 9:54
    doesn't mean they DO do it.
  • 9:54 - 9:55
    And, that there are restrictions
  • 9:55 - 9:57
    on their operations
  • 9:57 - 9:58
    such as the FISA-court
  • 9:58 - 10:00
    which must approve requests
  • 10:00 - 10:01
    for foreign surveillance.
  • 10:01 - 10:02
    But.
  • 10:02 - 10:05
    In 34 years, that court has approved
  • 10:05 - 10:07
    over 35,000 applications
  • 10:07 - 10:09
    and only rejected 12.
  • 10:10 - 10:13
    Yes. Much like Robert Durst's second wife.
  • 10:13 - 10:16
    The FISA-court is alarmingly accepting.
  • 10:16 - 10:18
    "Listen, Robert, I'm not
  • 10:18 - 10:20
    gonna ask you too many questions."
  • 10:20 - 10:22
    I'm just gonna give you the benefit of
  • 10:22 - 10:24
    the doubt that you clearly don't deserve."
  • 10:24 - 10:28
    At least tell him to blink and burp less.
  • 10:29 - 10:32
    The burping might be the most troubling
  • 10:32 - 10:34
    thing about that show.
  • 10:35 - 10:38
    So maybe it's time for us to talk.
  • 10:38 - 10:40
    About where the limits should be.
  • 10:40 - 10:42
    And the best place to start would be
  • 10:42 - 10:44
    Section 215.
  • 10:44 - 10:45
    Not just because it's the easiest
  • 10:45 - 10:46
    to understand
  • 10:46 - 10:48
    but there is wide-spread agreement
  • 10:48 - 10:50
    it needs to be reformed.
  • 10:50 - 10:52
    From the President, to Ted Cruz,
  • 10:52 - 10:54
    to both the ACLU and the NRA,
  • 10:54 - 10:55
    to even the guy
  • 10:55 - 10:57
    who wrote the thing in the first place.
  • 10:57 - 10:59
    I was the principal author
  • 10:59 - 11:00
    of the Patriot Act.
  • 11:00 - 11:03
    I can say, that without qualification
  • 11:03 - 11:06
    Congress never did intend to allow
  • 11:06 - 11:07
    bulk-collections
  • 11:07 - 11:09
    when it passed Section 215.
  • 11:09 - 11:12
    And no fair reading of the text
  • 11:12 - 11:14
    would allow for this program.
  • 11:14 - 11:15
    Think about that.
  • 11:15 - 11:17
    He was the author.
  • 11:17 - 11:18
    That's the legislative equivalent
  • 11:18 - 11:19
    of Lewis Carroll
  • 11:19 - 11:21
    seeing the tea-cups ride at Disney Land
  • 11:21 - 11:22
    and saying:
  • 11:22 - 11:24
    "This has got to be reined in."
  • 11:24 - 11:25
    "No fair reading of my text
  • 11:25 - 11:27
    would allow for this ride."
  • 11:27 - 11:29
    "You've turned my perfectly nice tale
  • 11:29 - 11:30
    of psychedelic paedophilia
  • 11:30 - 11:32
    into a garish vomitorium."
  • 11:34 - 11:36
    "This is not what I wanted!"
  • 11:37 - 11:39
    And even the NSA has said
  • 11:39 - 11:41
    that the number of terror-plots in the US
  • 11:41 - 11:42
    that the Section 215
  • 11:42 - 11:45
    telephone-records program has disrupted...
  • 11:45 - 11:46
    ...is 1.
  • 11:46 - 11:47
    And it's worth noting:
  • 11:47 - 11:49
    that one particular plot
  • 11:49 - 11:51
    involved a cabdriver in San Diego
  • 11:51 - 11:54
    who gave $8500 to a terror-group.
  • 11:54 - 11:57
    And that is the shittiest terrorist-plot
  • 11:57 - 11:58
    I've ever seen.
  • 11:58 - 11:59
    Other than the plot of
  • 11:59 - 12:01
    A Good Day To Die Hard.
  • 12:02 - 12:04
    But here's the big problem here:
  • 12:04 - 12:07
    If we let Section 215 get renewed
  • 12:07 - 12:08
    in its current form
  • 12:08 - 12:09
    without serious public debate
  • 12:09 - 12:11
    we're in trouble.
  • 12:11 - 12:12
    Because Section 215
  • 12:12 - 12:13
    is the canary in the coal-mine.
  • 12:13 - 12:14
    If we cannot fix that
  • 12:14 - 12:16
    we're not gonna fix any of them.
  • 12:16 - 12:19
    And the public debate so far
  • 12:19 - 12:21
    has been absolutely pathetic.
  • 12:21 - 12:22
    A year ago
  • 12:22 - 12:23
    a former congresswoman was
  • 12:23 - 12:26
    discussing the 215 program on the news.
  • 12:26 - 12:27
    Watch what happened.
  • 12:27 - 12:29
    This vast collection of data
  • 12:29 - 12:32
    is not that useful
  • 12:32 - 12:34
    and infringes substantially
  • 12:34 - 12:35
    on personal privacy.
  • 12:35 - 12:38
    I think at this point we should
  • 12:38 - 12:42
    seriously consider not continuing...
  • 12:42 - 12:45
    Congresswoman Harman, let me interrupt...
  • 12:45 - 12:47
    Let me interrupt you just for a moment.
  • 12:47 - 12:49
    We've got some breaking news out of Miami.
  • 12:49 - 12:50
    Stand by if you will.
  • 12:50 - 12:51
    Right now in Miami
  • 12:51 - 12:52
    Justin Bieber
  • 12:52 - 12:55
    has been arrested on a number of charges.
  • 12:55 - 12:57
    The judge is reading the charges
  • 12:57 - 12:59
    including resisting arrest
  • 12:59 - 13:00
    and driving under the influence.
  • 13:00 - 13:02
    He's appearing now before the judge for
  • 13:02 - 13:04
    his bond-hearing. Let's watch.
  • 13:08 - 13:09
    Actually, you know what?
  • 13:09 - 13:11
    Bad news, we're gonna have to interrupt
  • 13:11 - 13:13
    your interruption of the Bieber news
  • 13:13 - 13:14
    for a new interruption.
  • 13:14 - 13:16
    This time featuring a YouTube video of
  • 13:16 - 13:18
    a tortoise having sex with a plastic clog.
  • 13:18 - 13:19
    Let's watch.
  • 13:19 - 13:20
    HEEH
  • 13:22 - 13:23
    HEEH
  • 13:27 - 13:28
    HEH
  • 13:30 - 13:33
    That. Is essentially the current tone
  • 13:33 - 13:35
    of this vitally important debate.
  • 13:35 - 13:37
    HEEEH
  • 13:38 - 13:39
    And again:
  • 13:39 - 13:41
    I'm not saying
  • 13:41 - 13:42
    this is an easy conversation.
  • 13:42 - 13:44
    But we have to have it.
  • 13:44 - 13:45
    I know this is confusing.
  • 13:45 - 13:46
    And unfortunately the most
  • 13:46 - 13:48
    obvious person to talk to
  • 13:48 - 13:50
    about this is Edward Snowden.
  • 13:50 - 13:52
    But he currently lives in Russia. Meaning:
  • 13:52 - 13:54
    If you wanted to ask him about any
  • 13:54 - 13:55
    of these issues, you'd have to fly
  • 13:55 - 13:57
    all the way there to do it.
  • 13:57 - 13:59
    And it is not a pleasant flight.
  • 14:00 - 14:01
    And the reason I know that...
  • 14:02 - 14:04
    ...is that last week, I went to Russia
  • 14:04 - 14:06
    to speak to Edward Snowden.
  • 14:06 - 14:08
    And this is what happened.
  • 14:10 - 14:13
    Yes, last week I spent 48 paranoid hours
  • 14:13 - 14:14
    in Moscow.
  • 14:14 - 14:16
    Arguably the last place on earth
  • 14:16 - 14:17
    where you can find
  • 14:17 - 14:19
    an overweight Josef Stalin impersonator
  • 14:19 - 14:22
    arguing with an unconvincing fake Lenin.
  • 14:22 - 14:23
    And after experiencing
  • 14:23 - 14:26
    Russia's famously warm hospitality
  • 14:26 - 14:28
    I went to meet Edward Snowden.
  • 14:28 - 14:30
    Who is supposed to show up in this room
  • 14:30 - 14:31
    at noon.
  • 14:31 - 14:33
    However, 5 minutes after
  • 14:33 - 14:35
    the interview was scheduled to begin
  • 14:35 - 14:37
    I had a troubling thought.
  • 14:37 - 14:38
    I don't know.
  • 14:39 - 14:40
    Do you think he's coming?
  • 14:40 - 14:42
    Yeah, he's coming.
  • 14:42 - 14:44
    'Cause my argument is:
  • 14:44 - 14:46
    Why would he?
  • 14:46 - 14:48
    When you think about it.
  • 14:49 - 14:51
    I got 2,000 rubles
  • 14:51 - 14:53
    that says he doesn't make it.
  • 14:53 - 14:55
    Without understanding how much that is.
  • 15:00 - 15:02
    All I'm saying is...
  • 15:02 - 15:05
    ...a 10-hour flight for an empty chair?
  • 15:05 - 15:07
    I'm gonna lose my shit.
  • 15:10 - 15:12
    It turns out it may be a bit of a problem
  • 15:12 - 15:15
    because our Russian producer
  • 15:15 - 15:19
    booked us in a room directly overlooking
  • 15:19 - 15:21
    the old KGB-building.
  • 15:21 - 15:22
    And the home
  • 15:22 - 15:25
    of the current Federal Security Bureau.
  • 15:25 - 15:27
    And we've just been told...
  • 15:27 - 15:28
    ...they know we're here.
  • 15:28 - 15:30
    So uhm...
  • 15:30 - 15:32
    So that happened.
  • 15:38 - 15:41
    Uhm, just if the Russian...
  • 15:41 - 15:42
    ...Russian KGB is listening:
  • 15:42 - 15:43
    We'll ring the fire-alarm
  • 15:43 - 15:45
    if he's not coming.
  • 15:48 - 15:49
    Oh shit.
  • 15:50 - 15:51
    Oh God.
  • 15:58 - 15:59
    So sorry for the delay.
  • 15:59 - 16:00
    It's fine, don't worry about it.
  • 16:00 - 16:02
    HOLY SHIT.
  • 16:02 - 16:04
    He actually came.
  • 16:04 - 16:07
    Edward f*cking Snowden.
  • 16:07 - 16:10
    The most famous hero and/or traitor
  • 16:10 - 16:12
    in recent American history!
  • 16:12 - 16:13
    And I've started with a question
  • 16:13 - 16:15
    designed to test his loyalties.
  • 16:16 - 16:18
    How much do you miss America?
  • 16:18 - 16:22
    You know, my country is something
  • 16:22 - 16:25
    that travels with me, you know.
  • 16:25 - 16:26
    It's not just a geogra...
  • 16:26 - 16:28
    That's a way too complicated answer.
  • 16:28 - 16:30
    The answer is: I miss it a lot.
  • 16:30 - 16:32
    It's the greatest country in the world.
  • 16:32 - 16:33
    I do miss my country.
  • 16:33 - 16:34
    I do miss my home.
  • 16:34 - 16:35
    I do miss my family.
  • 16:35 - 16:37
    Do you miss hot pockets?
  • 16:37 - 16:39
    Yes.
  • 16:39 - 16:41
    I miss hot pockets. Very much.
  • 16:41 - 16:44
    Okay. The entire state of Florida?
  • 16:52 - 16:53
    Let's just let that silence
  • 16:53 - 16:55
    hang in the air.
  • 16:55 - 16:57
    Truck Nuts?
  • 16:57 - 16:58
    Do you miss Truck Nuts?
  • 16:58 - 17:00
    I don't know what they are.
  • 17:00 - 17:01
    Lucky for you, Edward...
  • 17:05 - 17:07
    Not just Truck Nuts.
  • 17:07 - 17:09
    Stars and stripes Truck Nuts.
  • 17:09 - 17:12
    That is 2 balls of liberty
  • 17:12 - 17:13
    in a freedom sack.
  • 17:13 - 17:15
    You really thought ahead.
  • 17:15 - 17:17
    Well, at least one of us did.
  • 17:17 - 17:20
    You know, 'cause of the... uhm...
  • 17:20 - 17:22
    the quandary... the...
  • 17:22 - 17:25
    ...Kafka-esque nightmare that you're in.
  • 17:25 - 17:28
    Okay. Let's dive in.
  • 17:28 - 17:30
    Why did you do this?
  • 17:30 - 17:32
    The NSA has
  • 17:32 - 17:34
    the greatest surveillance capabilities
  • 17:34 - 17:36
    that we've ever seen in history.
  • 17:36 - 17:39
    Now, what they will argue
  • 17:39 - 17:41
    is that they don't use this
  • 17:41 - 17:43
    for nefarious purposes
  • 17:43 - 17:45
    against American citizens.
  • 17:45 - 17:48
    In some ways that's true.
  • 17:48 - 17:50
    But the real problem is that
  • 17:50 - 17:53
    they're using these capabilities
  • 17:53 - 17:55
    to make us vulnerable to them
  • 17:55 - 17:57
    and then saying:
  • 17:57 - 17:59
    "While I have a gun pointed at your head
  • 17:59 - 18:00
    I'm not gonna pull the trigger."
  • 18:00 - 18:02
    "Trust me."
  • 18:02 - 18:04
    So, what does the NSA that you want look like?
  • 18:04 - 18:08
    Because you applied for a job at the NSA.
  • 18:08 - 18:10
    So you clearly see an inherent value
  • 18:10 - 18:12
    in that shadowy organization.
  • 18:12 - 18:15
    I worked with mass-surveillance systems
  • 18:15 - 18:18
    against Chinese hackers.
  • 18:18 - 18:19
    I saw that, you know
  • 18:19 - 18:20
    these things do have some purpose.
  • 18:20 - 18:22
    And you want your spies
  • 18:22 - 18:23
    to be good at spying.
  • 18:23 - 18:24
    To be fair.
  • 18:24 - 18:25
    Right.
  • 18:25 - 18:26
    What you don't want is
  • 18:26 - 18:28
    you don't want them spying inside
  • 18:28 - 18:29
    their own country.
  • 18:29 - 18:33
    Spies are great when they're on our side.
  • 18:33 - 18:34
    But we can never forget
  • 18:34 - 18:35
    that they're incredibly powerful
  • 18:35 - 18:37
    and incredibly dangerous.
  • 18:37 - 18:38
    And if they're off the leash...
  • 18:38 - 18:40
    ...they can end up coming after us.
  • 18:40 - 18:41
    Well just to be clear we're talking about
  • 18:41 - 18:42
    2 different things here
  • 18:42 - 18:43
    Domestic surveillance
  • 18:43 - 18:44
    and foreign surveillance.
  • 18:44 - 18:46
    'Cause domestic surveillance
  • 18:46 - 18:48
    Americans give some of a shit about.
  • 18:48 - 18:50
    Foreign surveillance...
  • 18:50 - 18:52
    ...they don't give any remote shit about.
  • 18:52 - 18:54
    Well the second question is:
  • 18:54 - 18:56
    When we talk about foreign surveillance
  • 18:56 - 18:58
    are we applying it in ways that are
  • 18:58 - 18:59
    beneficial...
  • 18:59 - 19:00
    No-one cares.
  • 19:00 - 19:01
    In terms...
  • 19:01 - 19:02
    They don't give a shit.
  • 19:02 - 19:05
    We spied on UNICEF, the children's fund.
  • 19:05 - 19:06
    Sure.
  • 19:06 - 19:10
    We spied on lawyers negotiating...
  • 19:10 - 19:13
    What was UNICEF doing?
  • 19:13 - 19:13
    I mean:
  • 19:13 - 19:16
    That's the question there, isn't it?
  • 19:16 - 19:16
    The question is:
  • 19:16 - 19:18
    Are these projects valuable?
  • 19:18 - 19:21
    Are we going to be safer when we're spying
  • 19:21 - 19:23
    on UNICEF and lawyers who are talking about
  • 19:23 - 19:26
    the price of shrimp and clove cigarettes.
  • 19:26 - 19:27
    I don't think people say that's good.
  • 19:27 - 19:28
    I think they'll say:
  • 19:28 - 19:31
    I definitely don't care.
  • 19:31 - 19:32
    Americans do not give a shit.
  • 19:32 - 19:33
    I think you're right.
  • 19:33 - 19:35
    About foreign surveillance.
  • 19:35 - 19:37
    What some people do care about
  • 19:37 - 19:38
    is whether Snowden considered
  • 19:38 - 19:40
    the adverse consequences of leaking
  • 19:40 - 19:43
    so much information at once.
  • 19:43 - 19:45
    How many of those documents
  • 19:45 - 19:47
    have you actually read?
  • 19:47 - 19:49
    I've evaluated all the documents
  • 19:49 - 19:50
    that are in the archive.
  • 19:50 - 19:51
    You've read every single one?
  • 19:51 - 19:54
    I do understand what I've turned over.
  • 19:54 - 19:56
    But there's a difference between
  • 19:56 - 19:59
    understanding what's in the documents
  • 19:59 - 20:00
    and reading what's in the documents.
  • 20:00 - 20:02
    I recognized the concern.
  • 20:02 - 20:04
    'Cause when you're handing over
  • 20:04 - 20:06
    thousands of NSA documents the last
  • 20:06 - 20:10
    thing you want to do is read them.
  • 20:10 - 20:12
    I think it's fair to be concerned about
  • 20:12 - 20:13
    'did this person do enough?'
  • 20:13 - 20:14
    'were they careful enough?'
  • 20:14 - 20:17
    Especially when you're handling material
  • 20:17 - 20:19
    like we know you are handling.
  • 20:19 - 20:20
    Well, in my defense:
  • 20:20 - 20:22
    I'm not handling anything anymore.
  • 20:22 - 20:23
    That's been passed to the journalists
  • 20:23 - 20:25
    and they're using extraordinary
  • 20:25 - 20:27
    security measures to make sure that this
  • 20:27 - 20:29
    is reported in the most responsible way.
  • 20:29 - 20:31
    But, those are journalists
  • 20:31 - 20:33
    with a lower technical skill-set than you.
  • 20:33 - 20:35
    That's true. But they DO understand
  • 20:35 - 20:36
    -just like you and I do-
  • 20:36 - 20:38
    just HOW important it is
  • 20:38 - 20:39
    to get this right.
  • 20:39 - 20:41
    So the New York Times took a slide
  • 20:41 - 20:43
    didn't redact it properly...
  • 20:43 - 20:44
    ...and
  • 20:44 - 20:46
    In the end it was possible
  • 20:46 - 20:48
    for people to see that something
  • 20:48 - 20:50
    was being used in Mosul.
  • 20:50 - 20:51
    On Al Qaida.
  • 20:51 - 20:53
    That is a problem.
  • 20:53 - 20:54
    Well, that's a f*ck-up.
  • 20:54 - 20:55
    It is a f*ck-up.
  • 20:55 - 20:57
    And these things do happen in reporting.
  • 20:57 - 20:59
    In journalism we have to accept
  • 20:59 - 21:01
    that some mistakes will be made.
  • 21:01 - 21:04
    This is a fundamental concept of liberty.
  • 21:04 - 21:04
    Right.
  • 21:04 - 21:07
    But you have to own that then.
  • 21:08 - 21:10
    You're giving documents with information
  • 21:10 - 21:12
    you know could be harmful
  • 21:12 - 21:14
    which could get out there.
  • 21:14 - 21:16
    Yes.
  • 21:16 - 21:19
    If people act in bad faith.
  • 21:19 - 21:22
    Not even bad faith, but incompetence.
  • 21:22 - 21:23
    We are.
  • 21:23 - 21:25
    But you will never be
  • 21:25 - 21:28
    completely free from risk, if you're free.
  • 21:28 - 21:29
    The only time you can be
  • 21:29 - 21:32
    free from risk is when you're in prison.
  • 21:32 - 21:34
    While the risks were significant
  • 21:34 - 21:36
    Snowden himself has made it clear
  • 21:36 - 21:38
    he feels the rewards have been worth it.
  • 21:38 - 21:40
    You've said in your letters to Brazil:
  • 21:40 - 21:42
    "I was motivated by a believe that
  • 21:42 - 21:43
    citizens deserve to understand
  • 21:43 - 21:45
    the system in which they live."
  • 21:45 - 21:46
    "My greatest fear was that
  • 21:46 - 21:48
    no-one would listen to my warning."
  • 21:48 - 21:50
    "Never have I been so glad
  • 21:50 - 21:52
    to have been so wrong."
  • 21:52 - 21:53
    How did that feel?
  • 21:53 - 21:56
    I was initially terrified that this
  • 21:56 - 21:57
    was going to be a 3-day story.
  • 21:57 - 21:59
    Everybody was going to forget about it.
  • 21:59 - 22:01
    But when I saw that
  • 22:01 - 22:03
    everybody around the world said:
  • 22:03 - 22:06
    "Wow, this is a problem."
  • 22:06 - 22:09
    "We have to do something about this."
  • 22:09 - 22:11
    It felt like vindication.
  • 22:11 - 22:12
    Even in America?
  • 22:12 - 22:13
    Even in America.
  • 22:13 - 22:15
    And I think we're seeing something amazing
  • 22:15 - 22:19
    which is if you ask... the American people
  • 22:19 - 22:21
    to make tough decisions
  • 22:21 - 22:23
    to confront tough issues
  • 22:23 - 22:25
    to think about hard problems...
  • 22:25 - 22:29
    ...they'll actually surprise you.
  • 22:29 - 22:30
    Okay.
  • 22:30 - 22:31
    Here's the problem:
  • 22:31 - 22:32
    I did ask some Americans.
  • 22:32 - 22:35
    And, boy did it surprise me.
  • 22:38 - 22:40
    I have no idea who Edward Snowden is.
  • 22:40 - 22:42
    You've never heard of Edward Snowden?
  • 22:42 - 22:43
    No.
  • 22:43 - 22:45
    I have no idea who Edward Snowden is.
  • 22:45 - 22:46
    I've heard the name
  • 22:46 - 22:49
    I just can't picture... think right now
  • 22:49 - 22:50
    exactly what it is.
  • 22:50 - 22:52
    Well, he's... uhm...
  • 22:52 - 22:54
    He sold some information to people.
  • 22:54 - 22:56
    He revealed some information
  • 22:56 - 22:59
    that shouldn't have been revealed.
  • 22:59 - 23:01
    Edward Snowden revealed a lot of documents
  • 23:01 - 23:04
    through WikiLeaks.
  • 23:04 - 23:05
    Edward Snowden revealed a bunch of
  • 23:05 - 23:07
    secrets, I guess...
  • 23:07 - 23:09
    or information into WikiLeaks.
  • 23:09 - 23:11
    Edward Snowden leaked... uhm...
  • 23:11 - 23:13
    he's in charge of WikiLeaks.
  • 23:14 - 23:16
    I'm in charge of WikiLeaks.
  • 23:16 - 23:18
    Not ideal.
  • 23:18 - 23:19
    I guess, on the plus side:
  • 23:19 - 23:20
    You might be able to go home.
  • 23:20 - 23:22
    'Cause it seems like no-one knows
  • 23:22 - 23:24
    who the f*ck you are or what you did.
  • 23:24 - 23:27
    We can't expect everybody to be uniformly
  • 23:27 - 23:29
    informed.
  • 23:29 - 23:31
    So, did you do this to solve a problem?
  • 23:31 - 23:35
    I did this to give the American people
  • 23:35 - 23:37
    a chance to decide for themselves
  • 23:37 - 23:39
    the kind of government they want to have.
  • 23:39 - 23:42
    That is a conversation that I think that
  • 23:42 - 23:44
    the American people deserve to decide.
  • 23:44 - 23:46
    There is no doubt that it is
  • 23:46 - 23:47
    a critical conversation.
  • 23:47 - 23:50
    But is it a conversation that we have
  • 23:50 - 23:52
    the capacity to have?
  • 23:52 - 23:55
    Because it's so complicated.
  • 23:55 - 23:57
    We don't fundamentally understand it.
  • 23:57 - 23:59
    It is a challenging conversation.
  • 23:59 - 24:01
    It's difficult for most people
  • 24:01 - 24:03
    to even conceptualize.
  • 24:03 - 24:04
    The problem is
  • 24:04 - 24:06
    the internet is massively complex
  • 24:06 - 24:08
    and so much of it is invisible.
  • 24:08 - 24:11
    Service providers, technicians, engineers,
  • 24:11 - 24:12
    the phone number...
  • 24:12 - 24:14
    Let me stop you right there, Edward.
  • 24:14 - 24:16
    'Cause this is the whole problem.
  • 24:16 - 24:16
    Right.
  • 24:16 - 24:18
    This is the whole problem.
  • 24:18 - 24:18
    I glaze over.
  • 24:18 - 24:20
    It's like the IT guy comes to your office
  • 24:20 - 24:23
    and you go: "Oooh shit".
  • 24:23 - 24:24
    In fairness...
  • 24:24 - 24:27
    "Ooh shit, don't teach me anything."
  • 24:27 - 24:29
    "I don't want to learn."
  • 24:29 - 24:31
    "You smell like canned soup."
  • 24:31 - 24:33
    It's a real challenge to figure out
  • 24:33 - 24:35
    how do we communicate
  • 24:35 - 24:38
    things that require sort of years and years
  • 24:38 - 24:39
    of technical understanding.
  • 24:39 - 24:43
    And compress that into seconds of speech.
  • 24:43 - 24:46
    So, I'm sympathetic to the problem there.
  • 24:46 - 24:47
    But the thing is
  • 24:47 - 24:49
    everything you did only matters
  • 24:49 - 24:53
    if we have this conversation properly.
  • 24:53 - 24:54
    So let me help you out there.
  • 24:54 - 24:56
    You mentioned in an interview
  • 24:56 - 24:58
    that the NSA was passing around
  • 24:58 - 24:59
    naked photos of people.
  • 24:59 - 25:02
    Yeah. This is something where it's...
  • 25:02 - 25:06
    it's not actually seen as a big deal.
  • 25:06 - 25:07
    In the culture of NSA.
  • 25:07 - 25:09
    Because you see naked pictures
  • 25:09 - 25:11
    all of the time.
  • 25:11 - 25:12
    That.
  • 25:12 - 25:13
    Terrifies people.
  • 25:13 - 25:16
    'Cause when we asked people about THAT...
  • 25:16 - 25:17
    ...this is the response you get.
  • 25:17 - 25:19
    The government should not be able
  • 25:19 - 25:20
    to look at dick-pictures.
  • 25:20 - 25:22
    If the government was looking at
  • 25:22 - 25:24
    a picture of Gordon's penis
  • 25:24 - 25:26
    I definitely feel it would be an invasion
  • 25:26 - 25:27
    of my privacy.
  • 25:27 - 25:29
    Ah, yeah, if the government was looking at
  • 25:29 - 25:31
    pictures of my penis, that would upset me.
  • 25:31 - 25:33
    They should never, ever
  • 25:33 - 25:34
    the US government have a picture
  • 25:34 - 25:36
    of my dick.
  • 25:36 - 25:37
    If my husband sent me
  • 25:37 - 25:38
    a picture of his penis
  • 25:38 - 25:40
    and the government could access it
  • 25:40 - 25:44
    I would want that program to be shut down.
  • 25:44 - 25:47
    I would want the Dick-pic Program changed.
  • 25:48 - 25:50
    I would also want the Dick-pic program
  • 25:50 - 25:51
    changed.
  • 25:51 - 25:52
    It would be terrific if the program
  • 25:52 - 25:53
    could change.
  • 25:53 - 25:55
    I would want it to be tweaked.
  • 25:55 - 25:57
    I would want it to have clear and
  • 25:57 - 25:59
    transparent laws that we knew about.
  • 25:59 - 26:01
    And that were communicated to us.
  • 26:01 - 26:04
    To understand what they're being used for.
  • 26:04 - 26:05
    Or why they were being kept.
  • 26:05 - 26:07
    Do you think that program exists?
  • 26:07 - 26:09
    I don't think that program exists at all.
  • 26:09 - 26:13
    No.
  • 26:13 - 26:17
    If I had knowledge that the US government
  • 26:17 - 26:19
    had a picture of my dick...
  • 26:19 - 26:22
    ...I would be very pissed off.
  • 26:24 - 26:26
    Well...
  • 26:26 - 26:27
    The good news is that there's no
  • 26:27 - 26:30
    program named 'the Dick-pic Program'.
  • 26:30 - 26:32
    The bad news is that they are still
  • 26:32 - 26:34
    collecting everybody's information.
  • 26:34 - 26:36
    Including your dick-pics.
  • 26:36 - 26:37
    What's the over/under on that last guy
  • 26:37 - 26:40
    having sent a dick-pic recently?
  • 26:40 - 26:42
    You don't need to guess, I'll show you.
  • 26:42 - 26:44
    I did.
  • 26:44 - 26:48
    I did take a picture of my... dick.
  • 26:48 - 26:50
    And I sent it to a girl. Recently.
  • 26:52 - 26:53
    This is the most visible
  • 26:53 - 26:55
    line in the sand for people.
  • 26:55 - 26:59
    "Can they see my dick?"
  • 26:59 - 27:01
    So, with that in mind...
  • 27:01 - 27:03
    look inside that folder.
  • 27:07 - 27:09
    That.
  • 27:09 - 27:11
    Is a picture of my dick.
  • 27:11 - 27:13
    So let's go through each NSA program
  • 27:13 - 27:15
    and explain to me its capabilities
  • 27:15 - 27:18
    in regards to that photograph...
  • 27:18 - 27:21
    ...of my penis.
  • 27:21 - 27:24
    702 Surveillance: can they see my dick?
  • 27:24 - 27:26
    Yes.
  • 27:26 - 27:28
    The FISA Amendments Act of 2008
  • 27:28 - 27:30
    which Section 702 falls under
  • 27:30 - 27:33
    allows the bulk-collection
  • 27:33 - 27:35
    of internet communications that are
  • 27:35 - 27:36
    one-end foreign.
  • 27:36 - 27:37
    Bulk-collection:
  • 27:37 - 27:38
    Now we're talking about my dick.
  • 27:38 - 27:39
    You get it.
  • 27:39 - 27:40
    It's not what...
  • 27:40 - 27:41
    You get it though, right?
  • 27:41 - 27:42
    I do.
  • 27:42 - 27:44
    Because it's... it's... yeah, anyway.
  • 27:44 - 27:46
    So, if you have your email somewhere
  • 27:46 - 27:49
    like Gmail, hosted on a server overseas
  • 27:49 - 27:50
    or transferred overseas
  • 27:50 - 27:53
    or at any time crosses outside the borders
  • 27:53 - 27:55
    of the United States...
  • 27:55 - 27:57
    ...your junk ends up in the database.
  • 27:57 - 27:59
    So it doesn't have to be
  • 27:59 - 28:00
    sending your dick to a German?
  • 28:00 - 28:01
    No.
  • 28:01 - 28:03
    Even if you sent it to somebody
  • 28:03 - 28:04
    within the United States
  • 28:04 - 28:08
    your wholly domestic communication
  • 28:08 - 28:09
    between you and your wife
  • 28:09 - 28:11
    can go from New York...
  • 28:11 - 28:12
    to London and back.
  • 28:12 - 28:13
    And get caught up in the database.
  • 28:13 - 28:17
    Executive Order 12333: Dick or no dick?
  • 28:17 - 28:18
    Yes.
  • 28:18 - 28:21
    EO 12333 is what the NSA uses when the
  • 28:21 - 28:24
    other authorities aren't aggressive enough
  • 28:24 - 28:26
    or not catching as much as they'd like.
  • 28:26 - 28:27
    For example:
  • 28:27 - 28:29
    How are they gonna see my dick?
  • 28:29 - 28:32
    I'm only concerned about my penis.
  • 28:33 - 28:36
    When you send your junk
  • 28:36 - 28:38
    through Gmail, for example.
  • 28:38 - 28:41
    That's stored on Google's servers.
  • 28:41 - 28:43
    Google moves data
  • 28:43 - 28:45
    from data center to data center.
  • 28:45 - 28:46
    Invisibly to you.
  • 28:46 - 28:47
    Without your knowledge...
  • 28:47 - 28:49
    your data could be moved outside
  • 28:49 - 28:51
    the borders of the United States.
  • 28:51 - 28:52
    Oh no.
  • 28:52 - 28:53
    Temporarily.
  • 28:53 - 28:57
    When your junk was passed by Gmail
  • 28:57 - 28:59
    the NSA caught a copy of that.
  • 28:59 - 29:00
    Prism.
  • 29:00 - 29:03
    Prism is how they pull your junk
  • 29:03 - 29:05
    out of Google, with Google's involvement.
  • 29:05 - 29:08
    All of the different Prism partners
  • 29:08 - 29:11
    people like Yahoo, Facebook, Google.
  • 29:11 - 29:14
    The government deputizes them, to be...
  • 29:14 - 29:17
    sort of their little surveillance sheriff.
  • 29:17 - 29:18
    They're a dick-sheriff.
  • 29:18 - 29:19
    Correct.
  • 29:19 - 29:21
    Uhm, Upstream?
  • 29:21 - 29:23
    Upstream is how they snatch your junk
  • 29:23 - 29:25
    as it transits the internet.
  • 29:25 - 29:26
    Okay. Mystic.
  • 29:26 - 29:28
    If you're describing your junk
  • 29:28 - 29:29
    on the phone?
  • 29:29 - 29:30
    Yes.
  • 29:30 - 29:31
    But do they have the content
  • 29:31 - 29:32
    of that junk-call
  • 29:32 - 29:34
    or just the duration of it?
  • 29:34 - 29:36
    They have the content as well
  • 29:36 - 29:37
    but only for a few countries.
  • 29:37 - 29:39
    If you are on vacation in the Bahamas?
  • 29:39 - 29:40
    Yes.
  • 29:40 - 29:43
    Finally. And you need to remind yourself...
  • 29:43 - 29:45
    No, I'm just not sure...
  • 29:45 - 29:46
    what to do with this.
  • 29:46 - 29:48
    Just hold on to it.
  • 29:48 - 29:49
    It's a lot of responsibility.
  • 29:49 - 29:51
    Yeah. It is a lot of responsibility.
  • 29:51 - 29:53
    That's the whole point.
  • 29:53 - 29:53
    Should I...?
  • 29:53 - 29:55
    No, you should absolutely not.
  • 29:55 - 29:56
    And it's unbelievable
  • 29:56 - 29:58
    that you would do that.
  • 29:58 - 29:59
    Actually, it's entirely believable.
  • 29:59 - 30:01
    215 Meta-data?
  • 30:01 - 30:02
    No.
  • 30:02 - 30:03
    Good.
  • 30:03 - 30:04
    But...
  • 30:04 - 30:06
    Come on, Ed.
  • 30:06 - 30:07
    They can probably tell who you're sharing
  • 30:07 - 30:09
    your junk pictures with.
  • 30:09 - 30:10
    Because they're seeing
  • 30:10 - 30:11
    who you're texting with
  • 30:11 - 30:12
    who you're calling.
  • 30:12 - 30:13
    If you call
  • 30:13 - 30:15
    the penis enlargement center at 3 AM
  • 30:15 - 30:16
    and that call lasted 90 minutes?
  • 30:16 - 30:17
    They would have a record
  • 30:17 - 30:19
    of YOUR phone-number
  • 30:19 - 30:21
    calling THAT phone-number.
  • 30:21 - 30:24
    (Which is a penis enlargement center).
  • 30:24 - 30:25
    They would say they don't know
  • 30:25 - 30:26
    it's a penis enlargement center
  • 30:26 - 30:27
    but of course they can look it up.
  • 30:27 - 30:29
    Edward, if the American people
  • 30:29 - 30:33
    understood this...
  • 30:33 - 30:35
    ...they would be absolutely horrified.
  • 30:35 - 30:36
    I guess I never thought about putting it
  • 30:36 - 30:39
    in the...
  • 30:39 - 30:41
    ...in the context of your junk.
  • 30:41 - 30:44
    Would a good take-away from this be:
  • 30:44 - 30:46
    'Until such time as we've sorted
  • 30:46 - 30:47
    all of this out...
  • 30:47 - 30:50
    ...don't take pictures of your dick'.
  • 30:50 - 30:52
    Just don't do it anymore.
  • 30:52 - 30:54
    No. If we do that...
  • 30:54 - 30:57
    Wait, hold on, you're saying 'no'?
  • 30:57 - 30:57
    Yeah.
  • 30:57 - 30:58
    You should keep
  • 30:58 - 30:59
    taking pictures of your dick?
  • 30:59 - 31:01
    Yes. You shouldn't change your behavior
  • 31:01 - 31:03
    because a government agency somewhere
  • 31:03 - 31:04
    is doing the wrong thing.
  • 31:04 - 31:07
    If we sacrifice our values
  • 31:07 - 31:08
    because we're afraid, we don't care
  • 31:08 - 31:10
    about those values very much.
  • 31:10 - 31:12
    That is a pretty inspiring answer
  • 31:12 - 31:13
    to the question:
  • 31:13 - 31:15
    "Hey, why did you just send me
  • 31:15 - 31:16
    a picture of your dick?"
  • 31:18 - 31:20
    "Because I love America, that's why."
  • 31:21 - 31:23
    So there you have it, America.
  • 31:23 - 31:25
    All of us should now be equipped
  • 31:25 - 31:27
    to have this vital debate.
  • 31:27 - 31:28
    Because by June 1
  • 31:28 - 31:30
    it is imperative we have a rational
  • 31:30 - 31:32
    adult conversation
  • 31:32 - 31:33
    about whether our safety is worth
  • 31:33 - 31:36
    living in a country of barely regulated
  • 31:36 - 31:39
    government-sanctioned dick-sheriffs.
  • 31:41 - 31:42
    And with my work here done
  • 31:42 - 31:44
    there was just time to take care of
  • 31:44 - 31:46
    one more thing.
  • 31:46 - 31:47
    Finally, congratulations on Citizenfour
  • 31:47 - 31:49
    winning the Oscar.
  • 31:49 - 31:50
    I know you couldn't be at the ceremony
  • 31:50 - 31:53
    for obvious reasons, so...
  • 31:54 - 31:56
    I thought we'd celebrate ourselves.
  • 31:56 - 31:57
    Cheers.
  • 31:57 - 31:59
    Wow, that's...
  • 31:59 - 32:01
    ...that's really, really something.
  • 32:01 - 32:02
    Thank you.
  • 32:02 - 32:02
    You're welcome.
  • 32:02 - 32:04
    What's the over/under on me
  • 32:04 - 32:06
    getting home safely?
  • 32:06 - 32:07
    Well, if you weren't on the list before
  • 32:07 - 32:09
    you are now.
  • 32:12 - 32:13
    Is that like, uhm...
  • 32:14 - 32:16
    Is that like a f... is that like a...
  • 32:16 - 32:18
    ...joke? Or is that actually possible?
  • 32:18 - 32:20
    No, it's... it's a real thing.
  • 32:20 - 32:22
    You're associated now.
  • 32:23 - 32:24
    Okay.
  • 32:24 - 32:25
    Just to be clear, NSA:
  • 32:25 - 32:26
    I never met this guy
  • 32:26 - 32:29
    so take me off your f*cking list.
  • 32:29 - 32:32
    I DO NOT want to get stuck in Russia.
  • 32:32 - 32:34
    I want to go home. I want to go home.
  • 32:38 - 32:40
    Now, just for the record.
  • 32:40 - 32:42
    Just so you know.
  • 32:42 - 32:44
    We got in touch with the NSA,
  • 32:44 - 32:46
    the National Security Council,
  • 32:46 - 32:47
    and the White House.
  • 32:47 - 32:48
    And we asked them to comment
  • 32:48 - 32:50
    on the dick-pic capabilities
  • 32:50 - 32:52
    of each of the programs Edward Snowden
  • 32:52 - 32:53
    just discussed.
  • 32:53 - 32:55
    Which -incidentally- were some very fun
  • 32:55 - 32:57
    emails to write to government agencies.
  • 32:57 - 33:00
    They didn't wish to comment on the record.
  • 33:00 - 33:01
    And I can see why
  • 33:01 - 33:03
    for every possible reason.
Title:
Last Week Tonight with John Oliver: Government Surveillance (HBO)
Video Language:
English
Duration:
33:15