
34C3 - Tracking Transience

  • 0:00 - 0:15
    34C3 preroll music
  • 0:15 - 0:27
    Herald: So. The last are the best or
    something like that. Our next speaker is
  • 0:27 - 0:33
    here waiting for this massive crowd that
    will support him to go through his
  • 0:33 - 0:43
    experience with the FBI. He's gonna share
    with us the way he informs the FBI about
  • 0:43 - 0:49
    his behaviors. So it's not just like in
    cycling or other sports but ... I don't
  • 0:49 - 0:56
    know what sport you do, Hasan. But these
    easy sportsmen are in communication with the FBI.
  • 0:56 - 1:08
    Please give a welcome applause ... Yes!
    Last are the best. Oh sorry ...
  • 1:08 - 1:12
    Hasan: That's pointing to me. The arrow
    will start making sense. Really I mean
  • 1:12 - 1:16
    ... How amazing has it been the last few
    days here, right? It's been this ...
  • 1:16 - 1:19
    amazingly intense last few days and it's
    kind of ... You know it's, it's an honor
  • 1:19 - 1:24
    to kind of wrap this up and I really
    appreciate you guys hanging out here till
  • 1:24 - 1:28
    the very very very last time slot. Thank
    you, thank you for coming. This is ... it
  • 1:28 - 1:32
    means a lot. I've been hearing a lot
    about this Congress for many years and
  • 1:32 - 1:36
    this is my first time here and it's just
    ... you know, I was just telling some
  • 1:36 - 1:42
    friends that I consider myself relatively
    tech literate and then I come here and I
  • 1:42 - 1:45
    feel like everyone's speaking a completely
    different foreign language that I have no
  • 1:45 - 1:49
    idea what's happening. And it's not
    because some of the talks are in German.
  • 1:49 - 1:54
    It's like literally like some of the
    subject matter is so specific that it's
  • 1:54 - 1:59
    so easy to be completely like: What is
    being discussed? So I'm gonna tell you,
  • 1:59 - 2:04
    it's something a little bit more ... you
    know, a little more of a story and kind of
  • 2:04 - 2:09
    tell you about what I've been doing. So
    I'm an artist and I get called all sorts
  • 2:09 - 2:12
    of different things I get ... You know,
    sometimes I get lumped in with the media
  • 2:12 - 2:16
    artists, sometimes with sculptor,
    sometimes with photographers, sometimes I
  • 2:16 - 2:24
    get called a con artist. But so ... so
    it's kind of odd being in this, being in
  • 2:24 - 2:28
    these gatherings. Particularly after some
    of the other talks that I've been sitting
  • 2:28 - 2:32
    in on. But let me tell you a little bit
    about how this thing came about, this
  • 2:32 - 2:38
    thing with the arrow. By the way I'm at
    elahi.org or elahi.umd.edu. I'll show you
  • 2:38 - 2:42
    some of my works but mainly I'll be
    focusing on this one project which is
  • 2:42 - 2:48
    called Tracking Transience which I
    started shortly after 9/11. So let me just
  • 2:48 - 2:51
    give you a live feed of what it looks
    like. So this is, this is it. So you can
  • 2:51 - 2:56
    go on the web and you could follow me at
    any given moment. And there's this pixel,
  • 2:56 - 2:59
    this arrow that comes here, which is actually...
    These days we're so used to seeing
  • 2:59 - 3:05
    ourselves as pixels on a map. But this
    project was started in 2002 and back then to
  • 3:05 - 3:09
    see yourself as a pixel was just such an
    unusual thing. I mean, these days we ...
  • 3:09 - 3:13
    You know, these days you're driving down
    the road and there's that little icon on
  • 3:13 - 3:18
    the GPS, that tells you exactly where you
    are. Or you know the ... I mean like, you
  • 3:18 - 3:21
    know, like when was the last time you
    bought one of those maps at the gas
  • 3:21 - 3:25
    station? You opened it up ... Do you
    remember that? I mean, we don't ... it's
  • 3:25 - 3:28
    just like we just completely lost that. We
    don't even know how to fold a map back up
  • 3:28 - 3:31
    anymore. You know, cuz back in the old
    days you'd have to take out that map and
  • 3:31 - 3:37
    you'd go: "We're here!" So you'd have to
    locate yourself to the geography within
  • 3:37 - 3:42
    that map. These days, you take out that
    magic phone, you press that little button,
  • 3:42 - 3:46
    and you become the center of your own
    map. And your ... the map resizes to you.
  • 3:46 - 3:49
    So the space of the ... So looking at
    yourself as a digital space versus a
  • 3:49 - 3:53
    physical space. You know it's, it's a
    totally different concept that we're
  • 3:53 - 3:58
    running into now and for some reason this
    looks like this got caught up over here.
  • 3:58 - 4:02
    So let's give it another kick. And
    hopefully it'll kick in again.
  • 4:02 - 4:09
    Okay. So, that's, that's, so this is, this is where
    I started. So the project starts here.
  • 4:09 - 4:13
    And you know, then this goes through the
    cycles of everything that I eat and ...
  • 4:13 - 4:16
    You know, let's see what comes up after
    this. You know, you can probably see where
  • 4:16 - 4:19
    this place is. You probably recognize
    that. So these are all the meals that
  • 4:19 - 4:22
    I've been cooking at home. And I photograph
    all of them. It's just kind of interesting.
  • 4:22 - 4:25
    'Cause now it's like it's so common
    on Instagram, it's like, you know, if
  • 4:25 - 4:28
    people think that you know, well, why else
    would you be using Instagram other than
  • 4:28 - 4:34
    posting photos of your food? But I help
    out the FBI with this. So I sent this ...
  • 4:34 - 4:38
    So this was on Friday December 4th. This
    was at the corner of Santa Clara and 11th
  • 4:38 - 4:43
    in San Jose. I got, I bought gas there.
    This is at the Gimhae Airport in Busan.
  • 4:43 - 4:47
    This is some tripe, that I've cooked at
    home. Because I really like tripe.
  • 4:47 - 4:53
    This is, this is actually the Barack Hussein
    Obama School in Jakarta, Indonesia,
  • 4:53 - 5:00
    1st March. This is, this is, ... Sorry, this is
    going so fast. So every few moments, what
  • 5:00 - 5:03
    I do is: I take a photograph and I
    timestamp my life and I send it to the
  • 5:03 - 5:08
    FBI. And, well you know, it's ... And
    it's also kind of interesting. Because
  • 5:08 - 5:12
    these days, the project means something
    so different. When I first started this
  • 5:12 - 5:16
    project, people would be like: "Oh no no,
    don't show up at my house!" And now it's
  • 5:16 - 5:19
    like - You know, it's not like if you're
    don't, if you're not on social media, if
  • 5:19 - 5:24
    you're not connected, like people think of
    you as some, some weirdo or something.
  • 5:24 - 5:30
    So what happens here is that this thing of
    like continuously, continuously
  • 5:30 - 5:34
    monitoring myself and continuously
    checking in ... You know, now a lot of
  • 5:34 - 5:37
    people are like: "I don't get this. This
    looks like my Instagram feed. What's the
  • 5:37 - 5:43
    big deal?" But then this concept of self-
    surveillance or this, this watching
  • 5:43 - 5:49
    ourselves or monitoring ourselves for the
    sake of someone else, whether it be your
  • 5:49 - 5:53
    Twitter followers or in my case the FBI
    ... I mean, this is something that's
  • 5:53 - 5:58
    become so commonplace. And each of us are
    creating our own archives. So a lot of
  • 5:58 - 6:01
    the shifts that's taking place culturally
    - I think it's really important to think
  • 6:01 - 6:05
    about this, because, you know, it's like,
    like this bed. I mean, you're not really
  • 6:05 - 6:08
    sure what happened in that bed. You're
    not really sure who I was there with. Or
  • 6:08 - 6:12
    you see these like random train stations.
    There are always these empty spaces.
  • 6:12 - 6:15
    There's like ... no one's ever there, and
    every now and then you might see some
  • 6:15 - 6:18
    people. But they just, they just tend to
    be incidental to the image.
  • 6:18 - 6:21
    Which is kind of interesting when you look at
    like the "no photography" or the
  • 6:21 - 6:22
    "ask permission
  • 6:22 - 6:26
    of people that are in there". Which is
    really amazing to think about. That like
  • 6:26 - 6:31
    how many cameras are there in every given
    moment? Actually in this room, right now,
  • 6:31 - 6:37
    there are at least twice as many cameras as
    there are people. Because there's one
  • 6:37 - 6:41
    camera in the front of your phone and
    another one in the back of your phone. And
  • 6:41 - 6:45
    who knows how many other cameras in, in
    all the other devices and everything else
  • 6:45 - 6:49
    that we are putting in. So we can't
    really escape these cameras, regardless of
  • 6:49 - 6:55
    what we do. And so all of these things,
    all these bits and pieces that I'm
  • 6:55 - 7:00
    archiving of my life. So every few moments
    another photo gets put up and it gets
  • 7:00 - 7:03
    tagged with the geo tag of where it was.
    Again this is something that's so common
  • 7:03 - 7:08
    today but 12, 13, 14 years ago this was
    completely just like: "What are you doing
  • 7:08 - 7:12
    and why are you doing this?"
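The geotag he mentions is stored in a photo's EXIF metadata as degrees, minutes, and seconds plus a hemisphere letter. A minimal sketch of converting that to the decimal coordinates a map expects (the function name and sample coordinates are illustrative, not from the talk):

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style GPS (degrees, minutes, seconds, hemisphere)
    to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative by convention
    return -value if ref in ("S", "W") else value

# Illustrative point near downtown San Jose: 37°20'06" N, 121°53'38" W
lat = dms_to_decimal(37, 20, 6, "N")
lon = dms_to_decimal(121, 53, 38, "W")
```

Real EXIF readers surface these values under the GPSInfo tag; the snippet above is just the standard degrees-to-decimal arithmetic applied afterwards.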
  • 7:12 - 7:16
    And ...
    sorry ... So yeah, and then of course
  • 7:16 - 7:21
    there's all these toilets because, uh, my
    FBI agent really needs to know. I figure
  • 7:21 - 7:25
    the FBI, that ... You know, they want to
    know my business and I'm very open and I'm
  • 7:25 - 7:27
    very sharing. So I figure, I'm gonna tell
    them every little bit of detail,
  • 7:27 - 7:33
    including the toilets that I use and how
    frequently and where I go. You know, they
  • 7:33 - 7:37
    need to know, so! Really, what's
    happening, you know and, and what I'm
  • 7:37 - 7:41
    really, what I'm really doing is: I'm,
    it's, it's a matter of telling you
  • 7:41 - 7:45
    everything and telling you absolutely
    nothing simultaneously. It's a barrage of
  • 7:45 - 7:50
    noise. Look, I know that eventually, you
    know, machine and AI machine readers and
  • 7:50 - 7:56
    AI is going to get so, so sophisticated,
    that at a certain point this idea of
  • 7:56 - 8:01
    overwhelming the system, it's not going
    to be possible. This is a temporary fix.
  • 8:01 - 8:08
    And the reason I chose this fix is, you
    know, obviously the FBI has a file on me.
  • 8:08 - 8:13
    By the way, the background story of this
    is: I got reported as a terrorist.
  • 8:13 - 8:19
    I guess, I should probably explain that
    a little bit. So shortly after 9/11, my
  • 8:19 - 8:24
    landlords, that I rented this storage
    space from, called the police and said,
  • 8:24 - 8:28
    that an Arab man had fled on September
    12th, who was hoarding explosives.
  • 8:28 - 8:31
    Never mind. There were no explosives.
    Never mind,
  • 8:31 - 8:33
    it wasn't an Arab man.
    But you know, that Arab man
  • 8:33 - 8:36
    would be me. Even though
    I'm Bengali and we're actually not Arabs.
  • 8:36 - 8:40
    But it doesn't matter because as far as
    many people ... You know, they're all the
  • 8:40 - 8:44
    same over there. They're all kind of one.
    You know, that's, it's, it's that mentality.
  • 8:44 - 8:47
    So I spent six months of my
    life with the FBI, convincing them that
  • 8:47 - 8:52
    I'm not a terrorist. And after all that
    they said: "Okay, everything's great,
  • 8:52 - 8:54
    you're fine", and I said: "Okay,
    wonderful, can I get a letter saying :
  • 8:54 - 8:59
    I'm fine?" There's a little problem with
    that, because have you ever tried to get
  • 8:59 - 9:04
    something that says: "You're not guilty
    of something you never did"? So you know,
  • 9:04 - 9:08
    so ever since then I've been with my FBI
    agent. I said: "Hey, what can I do? I
  • 9:08 - 9:11
    mean, I travel a lot. All we need is the
    next guy not to get the next memo and
  • 9:11 - 9:14
    here we go all over again. How do I avoid
    this?" So at that moment he gave me a
  • 9:14 - 9:17
    phone. He says: "Here's some phone
    numbers. If you get into trouble, give us
  • 9:17 - 9:21
    a call. We'll take care of it." So ever
    since then I would call my FBI agent,
  • 9:21 - 9:24
    tell him where I was going, send him
    photos of all of these things, and that's
  • 9:24 - 9:28
    how this whole project started. In
    a ... I mean back then when I started this
  • 9:28 - 9:31
    project, shortly after 9/11, I mean, my
    phone was like those old ... Remember
  • 9:31 - 9:35
    those old Nokia 6600? Those like super
    chunky ones and you had to hit the buttons
  • 9:35 - 9:38
    like a whole bunch of times to send like
    one text message across? And we used to
  • 9:38 - 9:42
    call them smart phones back then. I mean,
    they're really not that smart, but
  • 9:42 - 9:46
    they're kind of smart phones. Anyway. So
    this process of like sending all this
  • 9:46 - 9:50
    information to the FBI. I started telling
    them everything. And also at the same
  • 9:50 - 9:54
    time by telling them everything I'm also
    telling them absolutely nothing because
  • 9:54 - 9:58
    it's just a wall of noise. And in that
    noise there is some signal and the ratio,
  • 9:58 - 10:04
    the signal-to-noise ratio, is actually
    pretty skewed. It's pretty, it's pretty
  • 10:04 - 10:08
    out of sync in that sense. Because it's,
    it's difficult for the, for the viewer to
  • 10:08 - 10:12
    know what they're looking at. So if you
    don't s–. That website, it's a, it's, it
  • 10:12 - 10:16
    is an archaic website. I mean, it is
    completely obsolete technologically, but
  • 10:16 - 10:21
    more importantly, it's also one of these
    things, it's, it's intentionally user-
  • 10:21 - 10:25
    unfriendly. It's not, you know,
    information is not categorized in a way
  • 10:25 - 10:28
    where, you know, "oh, I want to see this
    data this time". It's like, it's just, it
  • 10:28 - 10:32
    just comes in whichever way the
    algorithm these days feels like
  • 10:32 - 10:35
    generating the, the info. So sometimes
    you might see a taco, then you might see
  • 10:35 - 10:39
    the next one from Mexico City, then
    Mexico City might go to like this place
  • 10:39 - 10:42
    because I took a flight from there to
    Houston. So you might get some images
  • 10:42 - 10:45
    from Houston next. And then from Houston
    you might get some things from another
  • 10:45 - 10:50
    city in the back and forth. So it actually
    takes various pathways through
  • 10:50 - 10:54
    it. So in this process I'm really, you
    know, it's, it's, I'm telling you
  • 10:54 - 10:58
    everything and nothing in this
    camouflage. In this, you know, thing of,
  • 10:58 - 11:03
    like, you know, historic camouflage. Now
    the concept of camouflage particularly in
  • 11:03 - 11:08
    digital identity is really important.
    Because historically, when you hear
  • 11:08 - 11:13
    camouflage, it's usually meant for, you
    know, warfare, battle, soldiers. And the
  • 11:13 - 11:18
    reason we have, the reason camouflage
    looks a certain way, is because it was
  • 11:18 - 11:23
    designed to break the silhouette of the
    body of the soldier in the battlefield,
  • 11:23 - 11:30
    in the landscape of warfare. So this is
    why, you know, certain trees look certain
  • 11:30 - 11:33
    colors and when you look, you know, it's
    every place you go, there's a different
  • 11:33 - 11:40
    war. Each, with each war, there's a
    different pattern, that, that's used. Now
  • 11:40 - 11:43
    if you look at the new camouflage, that
    most troops are wearing, particularly in
  • 11:43 - 11:49
    the US, where I live, it's all pixeled,
    it's all pixelized, you know, it's all,
  • 11:49 - 11:54
    and it's all this, like, weirdly grayish,
    greenish color. I don't know if you've
  • 11:54 - 12:00
    noticed that color. But there's no trees
    that color anywhere. It's because it's
  • 12:00 - 12:06
    not meant for blending into the landscape
    anymore. It's no longer meant so the
  • 12:06 - 12:12
    soldier has to blend into the
    battlefield. The reason it's pixelized and
  • 12:12 - 12:18
    it's that grayish, greenish color is
    because we no longer have a need for the
  • 12:18 - 12:23
    soldier to blend in to the battlefield,
    the physical battlefield. But what we want
  • 12:23 - 12:30
    is the soldier to be, so the enemy cannot
    distinguish the physical body of the
  • 12:30 - 12:35
    soldier in the noise and the night-vision
    goggles. That's why it's that color.
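The pixelated look he's describing can be approximated with simple block averaging: downsample the image into coarse cells, then blow each cell back up. A rough NumPy sketch of that effect (an illustration of pixelation in general, not the Army's actual pattern-generation process):

```python
import numpy as np

def pixelate(img, block=4):
    """Average each block×block cell of a grayscale image, then
    repeat the averaged cells back to the original size."""
    h, w = img.shape
    h2, w2 = h - h % block, w - w % block  # trim to a whole number of blocks
    img = img[:h2, :w2]
    cells = img.reshape(h2 // block, block, w2 // block, block).mean(axis=(1, 3))
    # Upscale: repeat each averaged cell so the blocky structure is visible
    return np.repeat(np.repeat(cells, block, axis=0), block, axis=1)
```

The same block structure that makes the pattern look "digital" to the eye is what breaks the body's silhouette in noisy night-vision imagery.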
  • 12:35 - 12:41
    That's why it's that grayish green. So this
    is a huge step. So we no longer, it's no
  • 12:41 - 12:46
    longer about the physical presence of the
    body in the, in the war field, in the, in
  • 12:46 - 12:52
    warfare. But it's now about the, it's now
    where the, the digital body, that the
  • 12:52 - 12:58
    physical body is intentionally
    camouflaged and mistaken for a digital
  • 12:58 - 13:04
    artifact. It's that pixelization, that we've forced basically in
  • 13:04 - 13:07
    warfare. And a lot of work, that I do,
    with a lot of, a lot of the art, that I
  • 13:07 - 13:11
    come up with, comes out of warfare and
    such. And I'll show you a little. But this
  • 13:11 - 13:14
    is a really important piece, that I
    wanted to talk to you about. Because this
  • 13:14 - 13:20
    was taken from one of the earliest samples
    of what the US Army uses. And it was done
  • 13:20 - 13:25
    at the, at the base, at Natick, which is
    the, outside of Boston. This is where a
  • 13:25 - 13:28
    lot of the camouflage in the US is
    designed. It's the US Army's fashion
  • 13:28 - 13:33
    design department. Though they would never
    call it the fashion design department. But
  • 13:33 - 13:36
    this is where they design all the
    camouflage and the, and the outfits. And
  • 13:36 - 13:41
    so this sample was taken ... At first, it
    just looks like a modernist grid. Various
  • 13:41 - 13:43
    photographs and such. But what they
    really are, is ... I took the old sample
  • 13:43 - 13:50
    and I blew it up and took out each pixel
    and replaced each pixel with a photograph
  • 13:50 - 13:53
    of a historic or current point of
    conflict. So there's these two images
  • 13:53 - 13:58
    right in the middle. If you look. The two
    blue images over there, kind of in the
  • 13:58 - 14:06
    middle to the bottom, those are in North
    Korea. Those are north of the 38th parallel.
  • 14:06 - 14:09
    Well, technically, north of the
    parallel. So you're actually in North
  • 14:09 - 14:13
    Korean territory, but at Panmunjom.
    There's actually six buildings and three
  • 14:13 - 14:18
    ... And all six are built directly across
    the border. And, you know, it's kind of
  • 14:18 - 14:22
    impractical to share half the building
    with another country. So three of the
  • 14:22 - 14:26
    buildings belong to South Korea, three of
    them belong to North Korea. So these were
  • 14:26 - 14:30
    the South Korean buildings. But physically
    in North Korean territory. Similarly, like
  • 14:30 - 14:33
    that plate of ham that you see right
    below it, you know, it's a Spanish ham.
  • 14:33 - 14:38
    And being a good Muslim, I really do
    enjoy my Spanish ham. But this was, this
  • 14:38 - 14:43
    was photographed in Guernica, the site of
    a brutal massacre. Shortly ... you know,
  • 14:43 - 14:46
    so similarly, so each of the, every
    single one of these images, they might
  • 14:46 - 14:51
    seem like these generic or bland images,
    but they're all historic conflicts or
  • 14:51 - 14:54
    current conflicts. There's, there's an
    image in the, in the upper right hand, in
  • 14:54 - 14:59
    the upper left hand corner, right there,
    and it's, it's of an office building with
  • 14:59 - 15:05
    just a flag with three stripes going
    across. And this is like a quick world
  • 15:05 - 15:10
    affairs quiz. Which country has, has a
    flag that goes three horizontal stripes
  • 15:10 - 15:17
    across, the top and bottom are red, and
    the middle one is green? Any guesses on
  • 15:17 - 15:31
    this country? Three stripes, red, green,
    red. Lots of confused looks. Huh? ...
  • 15:31 - 15:35
    It's, it's a trick question. The country
    doesn't exist. It's a, it's Transnistria
  • 15:35 - 15:40
    or at least the Russians call it
    Pridnestrovia. And it's a breakaway
  • 15:40 - 15:43
    republic of Moldova, it's a tiny little
    sliver of land between Moldova and
  • 15:43 - 15:50
    Ukraine. Maybe about 20, 30 kilometers
    wide and maybe about 200 kilometers long.
  • 15:50 - 15:54
    This tiny, tiny, tiny sliver ... no one
    ... They have it, well, by the way, they
  • 15:54 - 15:58
    have their own military, they have their
    own passports, they have their own
  • 15:58 - 16:02
    currency, they have their own judicial
    system, they have their own everything. Like
  • 16:02 - 16:09
    a country, except no one recognizes them
    as a country. But if you need to go buy 20
  • 16:09 - 16:14
    cases of AK-47s, this is the place to go.
    So this is, this tiny piece of land, this
  • 16:14 - 16:20
    tiny piece of land, that no one
    recognizes, is really where a lot of the
  • 16:20 - 16:24
    destabilization of the world comes from.
    So I went over there and then of course I
  • 16:24 - 16:27
    would tell my FBI agent: "This is where I
    was going, just loitering." He was like:
  • 16:27 - 16:30
    "What are you doing over there?" So I
    said: "I'm just hanging out, just seeing
  • 16:30 - 16:34
    what's going on." Anyway. So again, all
    of the, every single one of these images,
  • 16:34 - 16:38
    there's bits and pieces of these types of
    source ... But I want to go back to the
  • 16:38 - 16:44
    state of warfare and talk to you a little
    bit about that. So, I live in the US. And
  • 16:44 - 16:48
    you know, we're, you know, currently
    in the War on Terror. Actually the, the
  • 16:48 - 16:54
    real name is the Global War on Terror. And
    this was a drawing that I did for a
  • 16:54 - 17:00
    project a few years ago. And there's a,
    there's a congressional document. So
  • 17:00 - 17:05
    there's this document, that the US
    Congress keeps, and it's called the
  • 17:05 - 17:11
    "Instances of Use of United States Armed
    Forces Abroad"; this one was 1798 to
  • 17:11 - 17:16
    2006. This gets updated every year. And
    interestingly enough, when you look at
  • 17:16 - 17:20
    this document, it's ... everything is just
    the facts, everything is, you know, in
  • 17:20 - 17:27
    1838, you know, Marines chased pirates
    onto a Caribbean island. Well, when the
  • 17:27 - 17:30
    Marines landed on that Caribbean island,
    we invaded that country. Any time you send
  • 17:30 - 17:35
    your troops onto another country's soil
    without that country's permission, you
  • 17:35 - 17:39
    have invaded that country. So everything
    from like, you know, Marines chasing
  • 17:39 - 17:44
    pirates to a Caribbean island all the way
    to World War One, it's all just one,
  • 17:44 - 17:47
    everything's just listed as one instance,
    just straight down like ... And this, and
  • 17:47 - 17:52
    it's just, goes on for pages and pages and
    pages and pages and pages and, you know,
  • 17:52 - 17:56
    so and then you look at it like: "Wait.
    When, when did, when did the US invade
  • 17:56 - 18:02
    Florida?" Well, that was when it was Spanish
    territory, this is a, you know, before
  • 18:02 - 18:05
    that became part of United States
    similarly, you know ... So when you look
  • 18:05 - 18:10
    at it, all the white X's are all the
    invasions, that the US has undertaken
  • 18:10 - 18:16
    between 1776, which was the year of
    independence, to World War One. All the
  • 18:16 - 18:23
    gray X's are World War One to Vietnam. And
    all the black X's are Vietnam to today.
  • 18:23 - 18:30
    The number of invasions, that the US
    troops have undertaken between 1776 and
  • 18:30 - 18:36
    World War One is roughly the same as
    the number from World War One to Vietnam. And
  • 18:36 - 18:43
    the number of invasions from 1776
    to Vietnam roughly equals the number from Vietnam to
  • 18:43 - 18:51
    today. In the last 100 years, in the last
    117 years, since 1900, there's only
  • 18:51 - 18:56
    been five years, where the US has not
    been at war. Interestingly enough, we've
  • 18:56 - 19:01
    only declared war 11 times in history.
    Since independence of the country, the US
  • 19:01 - 19:08
    has only formally declared war 11 times.
    Yet of the all the years, that we've ever
  • 19:08 - 19:13
    been an independent country, like there's
    only like 20 or 30 years, that we haven't
  • 19:13 - 19:18
    been at war. And most of those are
    pre-1800. And then ... So since
  • 19:18 - 19:24
    1900, there's only been five years. And
    you can see all of these dots. So one of
  • 19:24 - 19:28
    the reasons that I wanted to create this
    document, is that I had this proposal,
  • 19:28 - 19:32
    that I wanted to do for this, uh, this
    museum in Spain. And I was talking to
  • 19:32 - 19:36
    the curator and I said: "Hey. You know, it
    would be really great, what if we set up
  • 19:36 - 19:41
    a wall of like bulletproof glass in the
    gallery. So we'll just, it's like, as
  • 19:41 - 19:44
    people are going around the museum, we
    just set up a huge bulletproof, huge
  • 19:44 - 19:49
    sheet of bulletproof glass with a really
    faint drawing of the world. And we'll
  • 19:49 - 19:53
    have got, we'll have a guy with a rifle
    in the back and he'll shoot bullets at the
  • 19:53 - 19:59
    audience. But the, but because there's
    bulletproof glass, it'll stop. And I mean,
  • 19:59 - 20:02
    I knew that there was just no way they
    would go for it. But to my surprise, they
  • 20:02 - 20:09
    said: "Let us talk to some security people
    and we'll get back to you about this."
  • 20:09 - 20:13
    And I was really surprised. So in the
    meantime, we did some ballistics tests.
  • 20:13 - 20:16
    So check this out. Look at, look at, I
    mean, like, that's what it looks
  • 20:16 - 20:20
    like. I mean, it's just beautiful when ...
    I mean, just the way, like, when
  • 20:20 - 20:25
    basically, when bullets hit bulletproof
    glass. It's not really glass, it's just
  • 20:25 - 20:32
    very dense plastic. And the heat of the
    bullet comes in at such velocity,
  • 20:32 - 20:37
    that the plastic melts just ever so
    slightly and then just swallows the bullet
  • 20:37 - 20:43
    back up and it freezes back up, so
    slightly liquefies and then solidifies.
  • 20:43 - 20:48
    This is what happens inside your body,
    when you get shot. Except your body
  • 20:48 - 20:51
    doesn't just, you know, absorb the bullet
    and just refreeze together. It's just
  • 20:51 - 20:56
    like this really scary looking thing of
    just looking at that, that blunt trauma,
  • 20:56 - 21:02
    that immediate point of when that
    bullet hits. This is what happens. So we
  • 21:02 - 21:06
    did some ballistics tests and we went
    through some ideas and then, and then of
  • 21:06 - 21:09
    course, as I expected, they said: "No,
    you're crazy, there's no way we're doing
  • 21:09 - 21:16
    this." But they were willing to send this
    out to, to off-site. And we built this in
  • 21:16 - 21:21
    these fabricators built this off-site.
    And of course then there was these other
  • 21:21 - 21:27
    problems of, you know, if we build this in
    the US and send it to the museum, I mean,
  • 21:27 - 21:31
    you know, when it comes into Europe, does
    it get imported as artwork or does it get
  • 21:31 - 21:35
    imported as, well, possibly firearms
    because there's actual real gunpowder in
  • 21:35 - 21:42
    it? There's a lot of problems. So in
    order to deal with the customs issues
  • 21:42 - 21:46
    of it, we figured, you
    know, what, let's just ship it,
  • 21:46 - 21:51
    let's just have it built in Spain. And
    then, as we were about to build this, we
  • 21:51 - 21:57
    realized: There was a little pesky Spanish
    law, that requires you to be at least 25
  • 21:57 - 22:02
    meters away from the intended target. You
    can't discharge a weapon in less than 25
  • 22:02 - 22:07
    meters of its intended target. Now of
    course I wanted this piece to be gigantic.
  • 22:07 - 22:09
    And then you could only imagine, that
    this, the price of the bulletproof glass,
  • 22:09 - 22:14
    as it gets larger and larger and larger.
    So we eventually agreed that it'll be 4
  • 22:14 - 22:20
    meters by 2 meters. Except at that scale,
    each of these dots, each of these bull–,
  • 22:20 - 22:26
    each of these targets then become like 3
    millimeters or 4 millimeters in diameter.
  • 22:26 - 22:30
    Now I don't know how many of you do
    shooting or target practice or any of
  • 22:30 - 22:36
    that. But try to hit 330 dots that are
    about 3 to 4 millimeters in
  • 22:36 - 22:41
    diameter from 25 meters away, one after
    another, without missing them. That's
  • 22:41 - 22:46
    actually practically impossible. So I was
    like: How do we do this? And then the
  • 22:46 - 22:50
    museum fortunately hired three former
    Olympians to do this. It was just
  • 22:50 - 22:57
    unbelievable, watching these guys. And of
    the 330 bullets, that are there, 326 of
  • 22:57 - 23:02
    them were exactly on center. The four that
    were off, were no more than five, five
  • 23:02 - 23:07
    millimeters, and I'm convinced I just
    drew them wrong. Those four, I mean, it's
  • 23:07 - 23:10
    just, it's just unbelievable. But you,
    but you can see what happens. So in the
  • 23:10 - 23:14
    middle over here, this is Central
    America, we have hit, we have invaded
  • 23:14 - 23:19
    Central America so many times. That as
    you keep hitting these bullets, these
  • 23:19 - 23:23
    physical, these bullets, just eventually
    the bulletproof glass just shatters and
  • 23:23 - 23:28
    gives way, I mean, it, it causes a
    literal wreck. So it's kind of
  • 23:28 - 23:33
    interesting, looking at this, this very
    abstract concept of, of these invasions,
  • 23:33 - 23:38
    these political, these types of political
    policies, that we have, and what the
  • 23:38 - 23:43
    physical manifestation of that, what that
    does in a map. But what does that do to
  • 23:43 - 23:47
    the physical geography? So I'm really
    interested in this idea of the geographic body,
  • 23:47 - 23:51
    data body, this physicality and the
    virtuality and how these two come
  • 23:51 - 23:55
    together. And this. So similarly, uh,
    I've been, I've been doing a lot of these
  • 23:55 - 23:59
    things about warfare and such. And I want
    to show you this project. This is a
  • 23:59 - 24:03
    project that I've been working on in, in
    Hawaii recently. And you know, Hawaii, I
  • 24:03 - 24:08
    mean, who's gonna turn down a job in
    Hawaii? It was pretty unbelievable,
  • 24:08 - 24:16
    getting a chance to do this
    project. And so, when you go to, when you
  • 24:16 - 24:20
    in... Hawaii is a really
    interesting, at least in American policy
  • 24:20 - 24:26
    and American political and cultural
    perspectives, Hawaii is a very unique
  • 24:26 - 24:32
    place. Because Hawaii is the ultimate
    manifestation or the ultimate, like,...
  • 24:32 - 24:37
    embodiment of what we in, in the, in the
    US we have this term called "manifest
  • 24:37 - 24:43
    destiny", which was a term from the early
    1700, actually the late 1700s, in the
  • 24:43 - 24:48
    early history of America, where we
    decided to start in the East and the
  • 24:48 - 24:52
    Atlantic and expand all the way across
    the ... That we will manifest the
  • 24:52 - 24:58
    country and the whole land will become
    ours. This is the birth of the US. This
  • 24:58 - 25:03
    is the idea of how the country was, was
    given birth to. Now, Hawaii is a really
  • 25:03 - 25:09
    interesting situation in this perspective
    because now, you know, we've started on
  • 25:09 - 25:12
    the Atlantic coast, we've
    conquered, conquered, conquered,
  • 25:12 - 25:17
conquered, conquered, and went all the way
    over to the end. We've gone to the Pacific
  • 25:17 - 25:21
    Ocean. And now we've crossed the Pacific
    Ocean. And now we've gone to this middle
  • 25:21 - 25:26
    of this, this rock in the middle of the,
    of the ocean. And now we're at the top of
  • 25:26 - 25:29
    the mountain. So we've actually, like, you
    know, then all that, then ... The
  • 25:29 - 25:32
    interesting thing is: Hawaiian culture,
    when you go to the peaks of each of the
  • 25:32 - 25:37
mountains, every one of the mountain
peaks is sacred. Interestingly enough,
  • 25:37 - 25:42
    when you go to each of these mountaintops,
    they're all, well, there are almost always
  • 25:42 - 25:46
    US military or some sort of government or
    some sort of science. But, of course, for
  • 25:46 - 25:50
    many ... And now, granted, I know, there
    is some amazing stuff being done. But
  • 25:50 - 25:55
    there's a lot of also very shady, you
    know, like it's military stuff being
  • 25:55 - 25:58
    done. And it's also this thing of like
    going up and then now with these
  • 25:58 - 26:04
    surveillance devices, these telescopes at
    the top of each of these monitoring and
  • 26:04 - 26:11
    surveying and measuring everything out in
    the West and out in the sky. And you
  • 26:11 - 26:13
    know, so we've gone through the West,
    which is also a really interesting thing
  • 26:13 - 26:17
    about the history of landscape in
    America. Let me show you, let me show you
  • 26:17 - 26:22
    a couple of these images of the top of
    this place, which is just, it's just
  • 26:22 - 26:25
    beautiful going up there. I mean, it
    looks like Mars. And a lot of the Mars
  • 26:25 - 26:30
    tests were actually done up there because
    ... I mean, it's just amazing.
  • 26:30 - 26:34
    Like you're looking down at the sky and
    there's these like satellites everywhere.
  • 26:34 - 26:39
    I mean, it really does look like a Martian
    landscape. I'm really interested in this.
  • 26:39 - 26:41
    Okay, so let me show you these projects.
    I've been really interested in watching
  • 26:41 - 26:47
    the watchers of this thing of ... like
    ... And what happens when you aestheticize
  • 26:47 - 26:51
    surveillance. I mean, obviously, this is
    a major issue for a lot of us. And a lot
  • 26:51 - 26:55
    of the conference topics here, talks
    about privacy and what to do and how to
  • 26:55 - 27:01
    stop mass surveillance and things of
    that type. But when you hear the word
  • 27:01 - 27:07
    "surveillance imagery", what do you think
    of? What are you thinking? Just imagine:
  • 27:07 - 27:11
    a picture, "surveillance imagery". What
    does it look like to you? And for most of
  • 27:11 - 27:14
    us, we generally probably think of like,
    you know, like, what CCTV or surveillance
  • 27:14 - 27:21
    camera images look like. Usually grainy,
pixelated, gritty, sometimes black and
  • 27:21 - 27:26
    white. You know, with this audience, we already know data
    storage is cheap. I mean, this data
  • 27:26 - 27:31
    storage, I mean, you could buy a terabyte
    drive for like 30 bucks. So data, you
  • 27:31 - 27:34
    know, it's, and it gets cheaper and
    cheaper every day. And the capacity gets
  • 27:34 - 27:38
    larger and larger. So it's not necessarily
    an economic issue. But it's actually more
  • 27:38 - 27:42
    of a cultural issue. And, and I'm
    proposing the fact that when you hyper-
  • 27:42 - 27:51
    aestheticize a surveillance image, your
    brain no longer reads it as surveillance.
  • 27:51 - 27:57
    Your brain rejects the image when it's
    too aesthetically pleasing. You read it as
  • 27:57 - 28:02
    landscape or you read it as, read it as
    landscape photography. And, and many of
  • 28:02 - 28:05
    the, in the case of landscape photography
    and many of it is, the history of it is
  • 28:05 - 28:11
    based on landscape painting. So it's
    really that, that gritty graininess that
  • 28:11 - 28:16
    actually gives us the, the emotional
    attachment that this is a surveillance
  • 28:16 - 28:21
    image. Because when it becomes, when it
    doesn't have those characteristics, when
  • 28:21 - 28:27
    it doesn't have those qualities, it
    becomes a photo or a video. So in this
  • 28:27 - 28:30
    case, I just, I was really interested in
    what happens when, when you hyper-
  • 28:30 - 28:34
    aestheticize that. And imagine if you had
    x-ray vision. Imagine you had x-ray
  • 28:34 - 28:37
    vision. You could see right through this
wall. What would you see on the other side of
  • 28:37 - 28:42
this wall? Well, this image actually
looks like a window pane. There are
  • 28:42 - 28:47
    actually six very large monitors and this
    is, this functions as a see-
  • 28:47 - 28:51
    through wall. So you could actually see
    through this wall. And just so, it just
  • 28:51 - 28:56
    so happened that on the other side of the,
    of the wall from this exhibition space
  • 28:56 - 29:00
    was the Baltimore Police Department. So I
    loved the idea of being able to watch the
  • 29:00 - 29:05
    police watching me watching them. And
if you notice, next to
  • 29:05 - 29:07
    the blue light there's a little
    surveillance camera on top of their
  • 29:07 - 29:11
    dumpster which really doesn't make any
    sense to me why they would need a
  • 29:11 - 29:14
    surveillance camera over their dumpster.
    But I just love this image of just this
  • 29:14 - 29:17
    constant reflecting back and forth and
    watching and watching and watching back.
  • 29:17 - 29:20
    And in the image you couldn't tell, you
    know, whether it was, whether you're
  • 29:20 - 29:24
    looking at a photograph or whether you're
    looking at a video or whether you're, you
  • 29:24 - 29:28
know, it's just every now and then you see
    some blurry lights move by or some of
  • 29:28 - 29:32
    these figures. But really, you know, if
    you look at a lot of what machine vision,
  • 29:32 - 29:34
    a lot of the software, I mean, you know,
    like, you know, you see those things on
  • 29:34 - 29:39
    the TV with all those people looking at
    those monitors. And that's, no, that's,
  • 29:39 - 29:43
that's like the movies or that's what they
use for, like, fundraising. The real stuff
  • 29:43 - 29:47
    is all machines looking at this stuff. And
    it doesn't actually look that sexy at all.
  • 29:47 - 29:52
    So I love this idea of actually taking
    this image and, you know, seeing
  • 29:52 - 29:56
    what can happen when it's
    aestheticized. Now similarly, you know,
  • 29:56 - 30:00
    what happens when the form is
aestheticized? This is a very
  • 30:00 - 30:03
    similar project at another location. But
    this is what you would see through the
  • 30:03 - 30:11
    wall. And when you see this, I mean, your
    brain completely rejects this as a
  • 30:11 - 30:17
    surveillance image. But this is exactly
    what happens right on the other side over
  • 30:17 - 30:22
    there. So thinking about the idea of
    aesthetics and what surveillance looks
  • 30:22 - 30:29
    like and at least, you know, growing up
    in an American context, you know, so
  • 30:29 - 30:34
    a lot of the history of American
    photography comes from American painting
  • 30:34 - 30:39
    or landscape painting. And the history of
    a lot of our landscape painting comes from
  • 30:39 - 30:42
    the Hudson River School, which is the
    Northeast and, you know, up north of New
  • 30:42 - 30:48
    York City and up the Hudson River. These
    gorgeous grand vistas, often of these
  • 30:48 - 30:54
    really large, you know, wide viewpoints,
of these very like sublime, very like
  • 30:54 - 30:59
    spiritual type of imagery. This very
    godlike imagery. Which is kind of
  • 30:59 - 31:04
    interesting because in the way it is like
    trying to replicate this eye of God. Or if
  • 31:04 - 31:07
    you want to look at it from the
    perspective of surveillance, in which we
  • 31:07 - 31:13
    generally tend to think of surveillance
    as a very post 9/11 or a very 21st century
  • 31:13 - 31:18
    concept. I mean, most of us did not use
    the word surveillance, say, you know, 20
  • 31:18 - 31:22
    years ago, 30 years ago. That word was
    not as common. It certainly has become
  • 31:22 - 31:27
    incredibly common in our vocabulary very
    recently. But even though we tend to
  • 31:27 - 31:32
    think of surveillance as a very post–, a
    very 21st century concept, this is
  • 31:32 - 31:37
    something that's been embedded in our
    history, in our brains for thousands of
  • 31:37 - 31:42
    years. Particularly when you look at those
    early paintings. And then God as the
  • 31:42 - 31:49
    original surveillance camera. You behaved
    because God was watching. So it was very
  • 31:49 - 31:55
    easy to replace this being above with a
    camera above. And that becomes so
  • 31:55 - 32:00
    normalized in our daily lives. I mean,
    think about it. How does Santa Claus know
  • 32:00 - 32:05
    everything about you? I mean, we were
    taught this since we were at such an
  • 32:05 - 32:10
    early point in life. We've ... So this
    normalization of surveillance, this
  • 32:10 - 32:14
    normalization. And also this, you know,
    and ... You know, we don't think, we
  • 32:14 - 32:19
    never use the word surveillance when we're
    talking about God. Again you know, this
  • 32:19 - 32:22
    aesthetics, the hyper-
    aestheticization or maybe, if you want to
  • 32:22 - 32:26
    call, I mean, this is not exactly a
    statusization. But, you know, in
  • 32:26 - 32:32
    the idea of thinking about it. So, again,
    this relationship between the watched and
  • 32:32 - 32:36
    watcher. And, you know, another similar
project, and one that looks at what happens when
  • 32:36 - 32:40
    you break that frame of looking at ... of
    the format of surveillance and when you
  • 32:40 - 32:44
    aestheticize the format, not only the
    imagery of surveillance, but also the
  • 32:44 - 32:48
    format and how the surveillance is
    presented. I mean, we rarely, we would
  • 32:48 - 32:52
    never, we wouldn't even think of this as a
    surveillance camera image. But that is
  • 32:52 - 32:56
    exactly what this is. That's another
    similar piece. I just want to show you
  • 32:56 - 32:59
    the context of this one. Because I want
    to show you, but talk a little bit about
  • 32:59 - 33:04
    the project that's going on on the right.
    And this image is ... So they look like
  • 33:04 - 33:08
    these landmines. But they also look like
    satellites. I'm really interested in this
  • 33:08 - 33:12
    idea of like circumnavigation. I'm really
    interested in these things of, you know,
  • 33:12 - 33:16
conceptually this project is what I was telling you
earlier about. This is about telling you
  • 33:16 - 33:21
    everything and nothing simultaneously. But
    in a similar way, I'm also really
  • 33:21 - 33:25
    interested in this concept of Magellan and
    circumnavigation, where you go far enough
  • 33:25 - 33:30
    to this side, you end up in the other.
    You circle around the whole thing. Does
  • 33:30 - 33:32
    that make sense, a little bit? So in the,
    you know, sometimes some people are so far
  • 33:32 - 33:37
    to the left, that they're actually very
    much to the right and vice versa. So
  • 33:37 - 33:40
    there's this, the circular reasoning
    that keeps coming back and forth. And I'm
  • 33:40 - 33:43
    really interested in this thing and ...
    This could be a landmine, but
  • 33:43 - 33:47
    this could also be a satellite. It's,
    there's these types. And then also at the
  • 33:47 - 33:51
    same time, like, this was, this is an
    earlier project. But they also look like
  • 33:51 - 33:55
    selfie sticks. But then there's these,
    like, all these things coming out with
  • 33:55 - 33:58
    these, like, these monitors sticking out
of each of ... This is pre-iPad. Well,
  • 33:58 - 34:02
    actually this installation was done
    later. But I mean, now it's like these
  • 34:02 - 34:06
    tablets are like so common. And they're so
    everywhere. But what ... These are 72
  • 34:06 - 34:09
    monitors and all the images that I've
shared with the FBI and sent to my FBI
  • 34:09 - 34:13
    agent, these are all being shown on all
    these monitors simultaneously. And you
  • 34:13 - 34:19
    could get an idea of what you're looking
    at. So in the same sense this project was
  • 34:19 - 34:24
    ... This is only a part of ... This
    one's... This one's about,
  • 34:24 - 34:32
    it's about 28 feet. So that's just about
    8 meters, 8 and a half meters tall, this
  • 34:32 - 34:37
    piece. And it's basically every photo
    that I photographed and shared.
  • 34:37 - 34:42
    But they're all printed in these huge
    pieces, in these huge, vinyl banners. And
  • 34:42 - 34:46
    the actual photos, each photo is about
    this little. But, you know, there's like
  • 34:46 - 34:49
    thousands and thousands and thousands and
    thousands of them. So when you look from
  • 34:49 - 34:54
    the back, all you see is these vertical
color stripes. Interestingly enough, well,
  • 34:54 - 34:57
    what they, the other thing that ... I
    don't necessarily disclose this in public
  • 34:57 - 35:00
    and I put the [unclear] but you're really looking
    at all the images shot on Sunday, all the
  • 35:00 - 35:06
    ones on Monday, Tuesday, Wednesday,
    Thursday, and so on with the seven days.
  • 35:06 - 35:10
    But what's also happening is, you know,
    the color bars, the test bars for
  • 35:10 - 35:14
    color calibration of monitors, you know,
    it's like ... So, one of the things that
  • 35:14 - 35:18
happened, I'm sure you must have had,
    I'm sure, you must have had this in
  • 35:18 - 35:21
    Germany. But in the US we used to
    have a thing called the Emergency
  • 35:21 - 35:27
    Broadcast System. And the television show
would stop and would go "peeep". This tone
  • 35:27 - 35:30
    would come out and these color bars would
    come on the screen and it would say "This
  • 35:30 - 35:34
    is a test of the Emergency Broadcast
    System. Had this been a real emergency,
  • 35:34 - 35:37
    you would have been instructed on what to
    do." So this thing of impending doom,
  • 35:37 - 35:42
    this thing, if something is, something bad
    is about to happen or something bad could
  • 35:42 - 35:45
    happen. And because something like that
    could happen, we need to be ready and we
  • 35:45 - 35:50
    need to be prepared. This thing of like,
    you know, being ready for these things. So
  • 35:50 - 35:53
    I'm really interested in that, in this
    idea of, like, what happens when
  • 35:53 - 35:57
    you use that tone or when you use that
    color and you borrow that thing? But at
  • 35:57 - 36:02
    the same time, it's the very same system
    that broadcasters use to calibrate
  • 36:02 - 36:07
    signal. So this thing of signal, you
    know, and if you think about it. On a
  • 36:07 - 36:11
    daily basis, we generate so much digital
    information. How much of it can be, how do
  • 36:11 - 36:15
    we calibrate that? So in a way of
    giving a nod to that, that's where this
  • 36:15 - 36:21
piece comes from. So then this one
    led to the next one, which ... This has
  • 36:21 - 36:25
    been making the rounds throughout the
    place. So this was at ZKM a couple of
  • 36:25 - 36:29
    years ago here. And then, now this has
    been floating around Eastern Europe. And
  • 36:29 - 36:32
    these are, you know, they look really
    small on the, in the photograph. But this
  • 36:32 - 36:36
    is the same piece, but lit up from the
    back. I'll show you another photo of it
  • 36:36 - 36:40
    so you can get a better idea of this
piece. So you're seeing all the little bits
  • 36:40 - 36:44
    and pieces and all of these things coming
    together. So they're about three meters
  • 36:44 - 36:49
    tall, each of these slabs. And
    and they're lit up from the back. So
  • 36:49 - 36:52
    and then this is a body of work that
    I'm dealing with. So this thing of, like,
  • 36:52 - 36:56
    you know, libraries or archives. And we
    all have them. We all create personal
  • 36:56 - 36:59
    archives. I mean, if you look at every
    single photo you've ever shot, I mean,
  • 36:59 - 37:03
    which you probably have a record of. You
    know, so when you have all of these, so
  • 37:03 - 37:07
    what, you know, so we're all keeping our
    own private archives, you know, I have my
  • 37:07 - 37:11
    own private archive that I'm doing as a
    parallel archive to the FBI. But, you
  • 37:11 - 37:15
    know, it doesn't have to be for the FBI.
    I mean, we all ... maybe in this room, I'm
  • 37:15 - 37:18
    sure everybody here has been watched
    by the government. And there's
  • 37:18 - 37:22
    probably a few people from the shady
    organizations in this room right here
  • 37:22 - 37:26
    anyway or certainly at the conference. But
    what you're looking at here is every,
  • 37:26 - 37:30
    maybe one of those images. And then
    there's this pretty grainy black-and-white
  • 37:30 - 37:35
    image on top. And it looks like every
    other industrial rooftop. It looks like
  • 37:35 - 37:39
    it's a roof top of a building. And you
    can see the, the fans and the exhaust and
  • 37:39 - 37:42
    the vents. And then at the lower right
    there's these like golf ball looking
  • 37:42 - 37:49
    things. And most buildings don't have
    those. This happens to be the rooftop of
  • 37:49 - 37:54
    the NSA. This is... So really
    you know, and I kind of look at
  • 37:54 - 37:58
    it like the NSA and you know - us,
    we're kind of in the same business. We're
  • 37:58 - 38:01
kind of in the information business.
    Except, you know, they like to keep
  • 38:01 - 38:05
    theirs to themselves and I like to
    give it out to everybody. So, you know,
  • 38:05 - 38:08
    we operate a little differently in our
    business. But we're kind of in the same
  • 38:08 - 38:13
    business. So I decided to put this piece
    out. And really, this thing of like, you
  • 38:13 - 38:16
    know, so, you know, in a lot of ways
    they're kind of like selfies. But they're
  • 38:16 - 38:19
    not really selfies, because when I
    photographed these things ... So let's go
  • 38:19 - 38:23
    back to this for now. So, you know, when
    I'm photographing this, I mean, I could
  • 38:23 - 38:27
    have taken the camera and I could
    have photographed you right now. But
  • 38:27 - 38:30
    instead there's this very anonymous
    looking thing. I mean, like, you know, if
  • 38:30 - 38:33
    you've been in this room,
    you know, that this could, we
  • 38:33 - 38:37
    know where that is. But in the same way,
    there's these like really like, you know,
  • 38:37 - 38:41
    so the camera is not pointing towards me.
    But the camera is pointing outward. And
  • 38:41 - 38:43
    that's what's really important here. And
    there's thousands and thousands and
  • 38:43 - 38:48
    thousands of these. I know. So the
    selfies, it's become a common thing. And
  • 38:48 - 38:53
there's no shame in, you know, saying
    you like the selfies. I mean, here's
  • 38:53 - 38:57
President Obama trying to make the
selfie stick and selfies look pretty
  • 38:57 - 39:02
    respectable. I can't use the new guy.
I just can't bear to see him. So I
  • 39:02 - 39:06
    have to go with this guy. I have to go
    with Obama. Because, you know, it's
  • 39:06 - 39:10
    important. But, you know, he's making
    that, so when he's taken that image, what
  • 39:10 - 39:14
    happens? Now when you look at the
    image, this is what, this is the image
  • 39:14 - 39:17
    that we're seeing. That's, that's up on my
    website right now. And as soon as I leave
  • 39:17 - 39:20
    this room, there'll be a new image that'll
    pop up and then when I get to my hotel,
  • 39:20 - 39:24
    you'll see another image. And then when I
    get to, you know, wherever I go eat
  • 39:24 - 39:28
    tonight, you'll see another image. But
    anyway, so, you know, every few moments I
  • 39:28 - 39:32
    timestamp to change it. So when we're
    seeing this image, but really this is what
  • 39:32 - 39:35
    the, this is what the computer is seeing.
    And this ... Some of you, and I'm sure
  • 39:35 - 39:38
    you know about this, but for those that
    are not aware of it, I mean, you know,
  • 39:38 - 39:43
    when you look at the Exif data – I use [unclear]
    which is an Exif reader to extract what
  • 39:43 - 39:47
    my images are or what the data that's
    behind it – so you can see what the image
  • 39:47 - 39:51
    is. That was shot right there, in that
    location. I mean, we know that because
  • 39:51 - 39:56
that's how our apps know: "Would you like
to check in from here?" That's how it knows
  • 39:56 - 39:59
    the location. So location services,
    that's pretty simple. And that's all the,
  • 39:59 - 40:02
    the metadata that's going ... But the
    here it's where it gets really
  • 40:02 - 40:07
    interesting. The fact that when you
    look at the altitude of this image,
  • 40:07 - 40:14
    the altitude is 186.061643, sounds about
    right for Leipzig? For this area, 186
  • 40:14 - 40:20
    meters of altitude? Yes? No? I mean, I'm
    assuming it's correct because that's what
  • 40:20 - 40:22
    the phone is telling me. But similarly,
    what's really interesting is, when you
  • 40:22 - 40:28
    look at the image direction. So, I mean,
    we know the latitude, longitude, that's a
  • 40:28 - 40:31
    given. The altitude, that's a new one.
    Because an image taken on the ground, sea
  • 40:31 - 40:36
    level, on the ground level will look very
    different than an image shot on the 20th
  • 40:36 - 40:41
    or 30th floor of a building. Because the
    barometric sensor inside the phone will
  • 40:41 - 40:45
    put a different altitude number in there.
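The Exif arithmetic being described here — turning the GPS rationals for latitude, altitude, and image direction into usable numbers — can be sketched in a few lines. This is a minimal illustration, not the speaker's tool (his Exif reader is unclear in the recording); the tag names GPSLatitude, GPSAltitude, and GPSImgDirection come from the Exif specification, and the sample values are the ones mentioned on stage, not pulled from a real file:

```python
# Sketch of the Exif GPS math: location, altitude, and image direction.
# Exif stores these as unsigned rationals under tags named GPSLatitude,
# GPSAltitude, and GPSImgDirection; any Exif reader exposes the same fields.
from fractions import Fraction

def dms_to_decimal(dms, ref):
    """(degrees, minutes, seconds) rationals -> signed decimal degrees."""
    deg = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
    # South latitudes and West longitudes are negative.
    return -deg if ref in ("S", "W") else deg

def compass_label(direction_deg):
    """GPSImgDirection (degrees clockwise from North) -> rough cardinal."""
    names = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]
    return names[round(direction_deg / 45) % 8]

# Illustrative values only: Leipzig sits near 51 deg 20' N, and the talk's
# example direction was 163 degrees.
lat = dms_to_decimal((Fraction(51), Fraction(20), Fraction(24)), "N")
facing = compass_label(163)  # 163 degrees reads as "slightly south"
```

Feeding the talk's 163-degree example into `compass_label` lands in the southern sector, matching the "slightly south" reading given on stage.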
So we know that this image can also,
  • 40:45 - 40:50
    if I were to shoot this image from the
    roof looking down, that image probably
  • 40:50 - 40:58
will be about 188, 189, 190 meters.
    Because of the distance of this. The other
  • 40:58 - 41:03
    part, that's really interesting, is the
    image direction. The camera was tilted at
  • 41:03 - 41:12
163 degrees from North: 0 being North,
    90 East, 180 South, 270 West. So slightly
  • 41:12 - 41:17
    south. The camera was pointed slightly
    south-ish. Which is exactly what that
  • 41:17 - 41:23
    camera's pointed. Now, why does this
    matter? It was only a few years ago that
  • 41:23 - 41:27
    we were thinking: "You mean, Google is
    going to drive up and down every single
  • 41:27 - 41:35
    road and record every single street?"
    Yes, basically the only thing, that's
  • 41:35 - 41:39
    holding them back in certain places, is
    the country's laws or the local things.
  • 41:39 - 41:43
Aside from that, not only do they drive up
    and down every single road in every single
  • 41:43 - 41:48
    city in every single country that they
    can. They do this repeatedly, year after
  • 41:48 - 41:52
    year, after year, after year, after year.
    Some places they'll put, you know, like
  • 41:52 - 41:55
    in Venice, they'll have like someone on a
    boat that goes through. On certain
  • 41:55 - 41:58
    pedestrian-only islands, there'll be
    someone carrying a backpack of those
  • 41:58 - 42:03
    Google cameras around. So it wasn't that
    long ago that this concept, that Google
  • 42:03 - 42:11
    would catalog every possible outdoor
space, seemed so far-fetched. It's only
  • 42:11 - 42:18
    a matter of time before we collectively
    photograph every indoor space and we
  • 42:18 - 42:23
    catalog every single indoor space and we
    photograph all of it. And sooner or
  • 42:23 - 42:28
    later, this database, we, through this
    database of every indoor image with all
  • 42:28 - 42:33
    the metadata, we can restitch every
    interior space. We could recreate every
  • 42:33 - 42:36
    interior space. Similarly, you know, then,
    you know, when you look at your frequent
  • 42:36 - 42:38
locations. I mean, it shows you, like, you
know, exactly what
  • 42:38 - 42:42
    moment you've come on, what minute you
    walked into a place, what minute you've
  • 42:42 - 42:48
    walked out of a place. I mean, you know,
    how does your phone know that you've
  • 42:48 - 42:54
    taken, you know, 8934 steps today? I
    mean, it's counting every single one of
  • 42:54 - 42:57
them. It knows, you
    know it, you've done five flights
  • 42:57 - 43:01
    today, because it knows that your altitude
    changed from here to here to here. And it
  • 43:01 - 43:06
    knows every second you've done all
    this. I mean, this is kind of crazy. So,
  • 43:06 - 43:09
    you cross-reference all of this
    data. I mean, and short of basically
  • 43:09 - 43:14
    giving up our phones. I mean, this data
    is being collected and it's gonna happen.
  • 43:14 - 43:20
    It's going to get archived. I mean, you
    know, there's plenty of, you
  • 43:20 - 43:24
    know, but at these... that these phones
    are just like, it's inevitable. And we're
  • 43:24 - 43:28
    not going to give up our lives without
    them. So you cross-reference all of
  • 43:28 - 43:33
    these. And very quickly it creates
    a complete picture of the owner of that
  • 43:33 - 43:38
    device. Now back in the old days, you
    know, you'd have to, you know, the
  • 43:38 - 43:42
    intelligence agencies would waste all of
    this energy or follow people around. I
  • 43:42 - 43:46
    mean, you know, it wasn't that long ago.
    It happened right here. I mean, you'd have
  • 43:46 - 43:49
    people following other people around.
    You'd have databases. And you have
  • 43:49 - 43:54
    archives on other archives. This record-
    keeping, this massive amount of analog
  • 43:54 - 43:59
record-keeping, this is
    not far-fetched. This happened very
  • 43:59 - 44:04
    recently. Now, why should they
    go around, do that today, when they can
  • 44:04 - 44:07
    just go to one of the companies and say:
"Hey. You don't want to give it to us?"
  • 44:07 - 44:11
    Now we know, you know, of course here's,
    you know, Tim Cook saying: "No, no, we
  • 44:11 - 44:15
    don't, we don't give our information to
    the NSA, we don't do that." We know a
  • 44:15 - 44:19
    little differently, you know, I mean,
    remember these? These are from Snowden. I
  • 44:19 - 44:24
    mean, this is obvious. Look, it doesn't
    say, when we ... well, it does not say,
  • 44:24 - 44:28
    when the NSA intercepted each of these
    dates, when prism collection began for
  • 44:28 - 44:33
    each provider. They're being called
    providers here. The reason they're being
  • 44:33 - 44:36
    called providers is, because these are the
    companies, that are providing the
  • 44:36 - 44:43
    information to the NSA. So while Tim Cook
    can say that he doesn't, you know, well,
  • 44:43 - 44:48
    while he can say, well, he can say all he
    wants about he doesn't, you know, that
  • 44:48 - 44:52
    they don't involve themselves, we
    know otherwise. This is proof. So, you
  • 44:52 - 44:57
    know, so my whole thing is like, hey, you
    know, if the tech companies, if these
  • 44:57 - 45:01
    organisa..., if these large multinational
    tech companies can provide this
  • 45:01 - 45:04
    information, why can't we do this for
    ourselves? Why can't we just put this out
  • 45:04 - 45:08
    there? And what and why not just put
    everything out there, whether it's
  • 45:08 - 45:13
relevant or not? And why not just muddy up
    the whole system? So, you know, I'm
  • 45:13 - 45:17
    really interested in this idea of like
    creating our own archives. Do you know
  • 45:17 - 45:20
    this building? This is the AT&T data
    center in San Francisco at the corner of
  • 45:20 - 45:23
    2nd and Folsom. This is a project that
    they call Hawkeye, which is actually
  • 45:23 - 45:30
    AT&T's internal code name for this data
    center. And shortly before 2007, about a
  • 45:30 - 45:35
    little over 10 years ago, there were,
    there was a whistleblower though, that
  • 45:35 - 45:42
came forward. It was in Room 641A of this
    building, which... Basically, what
  • 45:42 - 45:46
    happened is: The NSA approached 16 telecom
    companies in the US and said: "We'd like
  • 45:46 - 45:52
    to copy your data stream." And 15 of the
    16 said: "Sure, help yourself, you know,
  • 45:52 - 45:55
    make yourself at home." There's only one
    company, that said: "You know. We're not
  • 45:55 - 45:59
    really sure the legalities of this. Can
    you come back with a warrant and tell us
  • 45:59 - 46:03
    what you're looking for? And we
    will provide it to you, if you can tell
  • 46:03 - 46:09
    us what you want." But you know, you
    had, you had 15 of the 16 just say:
  • 46:09 - 46:14
    "Yeah, go ahead." So what basically, what
    happened in, in 2007, not like, who
  • 46:14 - 46:22
    knows, what it is now. But in 2007 it was
estimated that there were 213 terabytes
  • 46:22 - 46:30
    of phone records in this building. Every
    phone record of every domestic call in the
  • 46:30 - 46:35
    US was archived. And then of course
    there's multiple data centers. This is
  • 46:35 - 46:39
    just one of many. Now who does that phone
    call belong to? You know, when you're
  • 46:39 - 46:43
    calling grandma. Does that phone call,
    does that information and that phone
  • 46:43 - 46:47
    call, does that belong to you? Does that
    belong to Grandma? Or does that belong to
  • 46:47 - 46:52
    the phone company and they're licensing
    you a one-time use of it? We don't, I
  • 46:52 - 46:56
    mean again with this... Who
    actually owns this data? So I was really
  • 46:56 - 46:59
interested in this idea of keeping a record.
    So, I know, you know, while AT&T's
  • 46:59 - 47:03
    keeping records, which is my
    mobile phone provider in the US.
  • 47:03 - 47:07
    Similarly, I'm keeping my own records.
    And this is all my, this is my log files
  • 47:07 - 47:11
    of all the people that come by on my
    website. And you can see all of the folks.
  • 47:11 - 47:15
    Let me show you a little bit cleaner
    version. And you can see all the
  • 47:15 - 47:19
    lovely happy people, that come by, like,
    the terrorist screening center of the
  • 47:19 - 47:25
    Central Intelligence Agency or the, you
    know, the National Geospatial-
  • 47:25 - 47:29
    Intelligence Agency. I don't know why they
    come by. I mean, I, I'm glad that they
  • 47:29 - 47:36
    like art. I'm glad we have patrons of the
    arts in, in these organizations. Because,
  • 47:36 - 47:39
    you know, I mean, like, they come by
    frequently enough. I'm not really sure
  • 47:39 - 47:42
    what they're doing. But I'm glad, I'm
    glad they like what I do. But, so, I
  • 47:42 - 47:47
    really have to thank one particular
    person for making my art career possible.
  • 47:47 - 47:51
    And that would be Dick Cheney. Because if
    it was not for Dick Cheney, I would not
  • 47:51 - 47:55
    be here talking to you. None of this would
    be possible without Dick Cheney. I mean,
  • 47:55 - 47:58
    really, I mean, you know, he's really
    responsible for a lot of things that have
  • 47:58 - 48:02
    happened recently. So thank you, Dick
    Cheney. If you're, if you're listening,
  • 48:02 - 48:05
    which you probably are, because you know
    everything and you are, you know, you have
  • 48:05 - 48:12
    your people. Anyway. So a few years ago,
    I said, well, about 2010 I moved to the, I
  • 48:12 - 48:15
    start–, I took a job at the University of
    Maryland, which is in College Park. When I
  • 48:15 - 48:18
    say Maryland, most people assume we're in
    Baltimore. Because that's the main city
  • 48:18 - 48:23
    in Maryland. But we're actually closer to
    Washington DC. So you can see, so there's
  • 48:23 - 48:28
    College Park, that's, that's where I work.
    This is the FBI headquarters. This is the
  • 48:28 - 48:35
    CIA. This is the NSA. You triangulate the
    three, it ends up on my campus. So it's
  • 48:35 - 48:37
    only appropriate, you know, it's, and
    plus, you know, we have this budget
  • 48:37 - 48:41
    problem in the US right now. So I'm,
    like, helping them out. It's like, you
  • 48:41 - 48:43
    know: "Guys, can I help you, guys? Can I
    help you out by moving in like right next
  • 48:43 - 48:47
    to you, so you don't have to waste your
    resources?" By the way, vice.com ranked
  • 48:47 - 48:51
    us "The Most Militarized University In
    America". We're number one. And then
  • 48:51 - 48:54
    we're not number one at football or
    anything. It takes a while for that sort
  • 48:54 - 48:57
    of stuff. But most militarized, that's
    pretty easy. We can do that. Let me show
  • 48:57 - 49:00
    you a little bit about ... So it's an
    amazing place. DC is, Washington DC is a
  • 49:00 - 49:06
    great place to be. Problem is, real estate
    is really, really, really expensive. Like
  • 49:06 - 49:09
    housing is really, you know, it's a
    company town. I mean, like, why else would
  • 49:09 - 49:13
    you need to live there unless, you know,
    you're working in government? So while,
  • 49:13 - 49:16
    you know, real estate prices in other
    parts of the country go up and down,
  • 49:16 - 49:19
    Washington just keeps going up and up and
    up and up and up. So, you know, it's been
  • 49:19 - 49:23
    good. But then I found this piece of land
    in St. Michael's. Now if you look over
  • 49:23 - 49:28
    here, it says, the average listing price
    in the zip code, which is the, the postal
  • 49:28 - 49:34
    code in that area, is 797,303 dollars. This
    was back in 2010. You know, 800000
  • 49:34 - 49:39
    dollars at that time average. And then I
    found this piece of land for only 74000
  • 49:39 - 49:43
    bucks. I was like: "I want this. I want
    this. I want this. I'm gonna ..." I know,
  • 49:43 - 49:46
    it's like I need to do something. And
    it's about an hour and a half away from
  • 49:46 - 49:49
    work, which is a bit ... But, you know,
    it's possible. Anyway, so this is where
  • 49:49 - 49:53
    the house is over there. It goes
    further up. So it's a piece of land, it's a
  • 49:53 - 49:56
    beautiful place. I mean, you see the
    nature and all that. It's that weird
  • 49:56 - 50:00
    oblong shape over there. You see that
    little driveway with the nice trees and
  • 50:00 - 50:03
    that house in the water? That belongs to
    this guy.
  • 50:03 - 50:07
    slight laughter in the audience
    So, and let me take you around the
  • 50:07 - 50:12
    neighborhood for a little bit. So
    this is, this is his gate. This is his
  • 50:12 - 50:17
    driveway. He drives up this way. He
    has a five car garage. Two of those
  • 50:17 - 50:22
    garages go back pretty deep. This is his
    swimming pool. We're gonna hang out here
  • 50:22 - 50:26
    over the summers, hang out, you know,
    we'd gonna throw some parties. This is his
  • 50:26 - 50:34
    front room. This is his kitchen. This is
    where he sleeps. And this is where we're
  • 50:34 - 50:40
    gonna plot world domination. We're gonna
    hang out and eat yellowcake up there. So
  • 50:40 - 50:43
    this is the place. I'm hoping to
    build this. This is what I was trying to
  • 50:43 - 50:47
    build. But the listing just kind of
    disappeared. It literally just
  • 50:47 - 50:50
    disappeared. And I'm convinced it's, so,
    you know, it's like he kind of does that
  • 50:50 - 50:54
    to people, too. I'm glad he just did it to
    the listing and not to me. But, you
  • 50:54 - 50:58
    know, it wasn't withdrawn, it wasn't
    sold, it just like disappeared. So these
  • 50:58 - 51:00
    days, it has become like this really
    conceptual project, where I'm actually
  • 51:00 - 51:04
    painting them. And then I'm photographing
    the paintings. So this is what I'm
  • 51:04 - 51:07
    actually showing. So, you know,
    it's... So, okay. Here's my thing. So,
  • 51:07 - 51:11
    look, here's a man, Dick Cheney, with not
    only are you dealing with the former vice
  • 51:11 - 51:16
    president. You're dealing with the former,
    you deal with the former CEO of
  • 51:16 - 51:22
    Halliburton. Like every ... This man has
    every motivation to be secret. This man,
  • 51:22 - 51:30
    money is no object. And yet, if I can show
    you, where he sleeps and where he eats
  • 51:30 - 51:34
    and where he swims, I mean, what can he
    know about us? I mean, I'm completely
  • 51:34 - 51:38
    self-trained in this. I'm not even, I'm
    not even like, you know, I'm not even like
  • 51:38 - 51:42
    that, that tech-savvy in this. I mean, I
    just kind of, just like duct-tape things
  • 51:42 - 51:48
    together to find things, whatever I
    needed. But what do entire companies and
  • 51:48 - 51:53
    corporations and countries, that really
    have the motivation to dig through our
  • 51:53 - 51:58
    information, what can they get about us?
    So I'd like to leave you with this, that,
  • 51:58 - 52:01
    you know, we really need to come up with
    a different idea and a different
  • 52:01 - 52:06
    interpretation and a different meaning of
    what privacy in the 21st century means.
  • 52:06 - 52:12
    This is no longer, we can no longer
    afford to use the same definitions
  • 52:12 - 52:17
    of privacy that we used 50 years ago. Or
    a hundred years ago. Or 200 years ago. Or
  • 52:17 - 52:21
    even just five years ago or even just last
    week. We need to have a much more
  • 52:21 - 52:25
    adaptive and open idea of what this means to
    us. And for each of us, this will mean,
  • 52:25 - 52:28
    mean something differently. So I want to
    leave you with that. And thank you very
  • 52:28 - 52:32
    much and I'm happy to take some
    questions.
  • 52:32 - 52:38
    applause
    Herald: Thank you, Hasan.
  • 52:38 - 52:43
    Hasan: Thank you! Welcome!
    Herald: It's a fantastic adventure to be
  • 52:43 - 52:51
    always on the run and at the same time,
    actually, yeah, they're running for you
  • 52:51 - 52:54
    now.
    Hasan: A little bit. They come by. You
  • 52:54 - 52:56
    know, I try to be friends with them.
    Herald: Yeah, that's good.
  • 52:56 - 52:59
    Hasan: You know, they're not, they're
    actually not that, they're not very, they
  • 52:59 - 53:02
    don't like, they don't want to be friends.
    They just like, they just like to watch
  • 53:02 - 53:06
    from a distance.
    Herald: You think that Dick Cheney will
  • 53:06 - 53:10
    allow parties at your house?
    Hasan: I'm trying. Yeah. We're gonna have
  • 53:10 - 53:12
    a shooting range in the back
    Herald: Like it's good.
  • 53:12 - 53:14
    Hasan: Yeah, it'll be a lot of fun.
  • 53:14 - 53:16
    Herald: I'm looking forward.
    Hasan: Do you have a question?
  • 53:16 - 53:23
    Q: Yeah, thanks for the talk. You
    mentioned ways that you are ... kind of
  • 53:23 - 53:28
    anticipated things that now we kind of
    take very common. Is there any way in
  • 53:28 - 53:34
    which your art practice has evolved or
    responded to having a newer, younger
  • 53:34 - 53:39
    generation of students year over year in
    your classes, for whom Instagram or selfie
  • 53:39 - 53:42
    sticks might be just second nature to
    them?
  • 53:42 - 53:46
    Hasan: Yeah. You know, I think it's
    really interesting watching how people of
  • 53:46 - 53:50
    different generations react to this.
    Because, you know, I think a lot of it
  • 53:50 - 53:53
    also has to do with the way we
    communicate. I mean, there was a very
  • 53:53 - 53:57
    specific generation that would write a
    letter. And then you had a generation
  • 53:57 - 54:01
    that would write, you know, that would
    call. And then you have text. And then
  • 54:01 - 54:04
    you have people that write you over email.
    And you have people that will only text
  • 54:04 - 54:07
    you. I mean, we were seeing that and
    we're, and, and we're actually living in
  • 54:07 - 54:10
    that right now. Because we actually deal
    with certain different types of people
  • 54:10 - 54:14
    that only deal with different modes of
    communication. So I really think in, that
  • 54:14 - 54:17
    it's ... The interesting thing is though
    that time is compressed. Because we've
  • 54:17 - 54:21
    only had people that communicate over
    writing for hundreds of years. Then we
  • 54:21 - 54:26
    would have people that only communicated
    and phoned for maybe 30 to 50, maybe a
  • 54:26 - 54:30
    little longer. And then, now we only have
    that like in the last 5 years we've had
  • 54:30 - 54:34
    this. And then now, down to like one year.
    So who knows what the next method will
  • 54:34 - 54:39
    be? But I really think a lot of it has to
    do with this adaptation. I think a lot of
  • 54:39 - 54:43
    the things really interesting is this
    idea of like, like migration and, you
  • 54:43 - 54:48
    know, the whole immigration topics are
    some real hot topic politically right now.
  • 54:48 - 54:51
    But I think we also need to think
    about this from a digital perspective.
  • 54:51 - 54:55
    Because, you know, we turn, we tend to use
    the word digital native a lot. But we
  • 54:55 - 54:59
    never actually use the word digital
    immigrant. But yet that's all of us in
  • 54:59 - 55:03
    some way. I mean, very few of us...
    I mean there, yes, there are
  • 55:03 - 55:07
    certain, there is a certain generation
    that is 100% digital from day one. But
  • 55:07 - 55:11
    for most of us, at least of a certain
    age, we are all digital immigrants. And I
  • 55:11 - 55:16
    think that, that type of shift, I think
    that type of understanding of it, I think
  • 55:16 - 55:21
    it has an impact and particularly
    in terms of dealing with
  • 55:21 - 55:26
    learning and dealing with education, you
    know. At what point is it a natively
  • 55:26 - 55:38
    learned skill versus a naturalised, a
    migratory type of an action. So. Hopefully
  • 55:38 - 55:41
    that answers that you're talking about,
    yeah. Do you have another question? ...
  • 55:41 - 55:49
    Yeah?
    Q: Just ... have they ever asked
  • 55:49 - 55:51
    you to stop?
    Hasan: No, no there's no law against
  • 55:51 - 55:54
    talking too much. Because if there was a
    law against talking too much, we'd all be
  • 55:54 - 55:58
    in trouble. But you know, but this is
    interesting. Because they're very much
  • 55:58 - 56:05
    about, there ... It's very much a
    one-way direction. They're not about,
  • 56:05 - 56:09
    they don't editorialize, they don't
    provide opinion. They only take
  • 56:09 - 56:15
    information. They ask you questions and
    they take information. So you could answer
  • 56:15 - 56:20
    them with one word or you could answer
    them with millions. And I've just decided
  • 56:20 - 56:25
    that I'm gonna keep talking, keep
    talking to them. I'm a relatively, you
  • 56:25 - 56:30
    know, I think cooperative person. I'd like
    to be helpful, you know. But really, what
  • 56:30 - 56:35
    it is, it's really, it's, defiance
    through compliance. It's
  • 56:35 - 56:39
    this level of aggressive compliance. And
    at a certain point, I try to be so
  • 56:39 - 56:41
    helpful, that I'm completely not
    helpful at all.
  • 56:41 - 56:45
    slight laughter in the audience
    So thank you, thank you for the question.
  • 56:45 - 56:51
    applause
    Herald: Thank you very much. Is there, is
  • 56:51 - 56:53
    there ... Oh, there's a question from the
    internet!
  • 56:53 - 56:56
    Hasan: No, oh, okay. Nobody else? Okay,
    great.
  • 56:56 - 56:58
    Herald: Is there an FBI agent in there to
    have a drink, maybe?
  • 56:58 - 57:04
    Hasan: Yeah that'd be great. Yeah yeah.
    Herald: Why not? Yeah.
  • 57:04 - 57:08
    Hasan: I'm sure, I'm sure there's people
    from some government organization at this
  • 57:08 - 57:10
    place. So, you know, let's get a drink
    later together.
  • 57:10 - 57:12
    Herald: Okay, let's have one. Thank you
    very much!
  • 57:12 - 57:15
    Hasan: See you guys. Thank you so much,
    thanks.
  • 57:15 - 57:18
    Herald: Don't forget to help us all to
    clean this all up.
  • 57:18 - 57:30
    34C3 Music
  • 57:30 - 57:40
    subtitles created by c3subtitles.de
    in the year 2020. Join, and help us!