
"Information. What are they looking at?" A documentary on privacy.

  • 0:09 - 0:10
    RC3 preroll music
  • 0:10 - 0:16
    Herald: Welcome back to the channel of
    Chaos Zone TV. Now we have an English
  • 0:16 - 0:24
    speaking contribution. And for that, I
    welcome pandemonium, who is a historian
  • 0:24 - 0:33
    and documentary filmmaker, and therefore
    we will also not have a normal talk, but
  • 0:33 - 0:41
    we will see the documentary "Information.
    What are they looking at?" A documentary on
  • 0:41 - 0:50
    privacy. And afterwards, we can all talk
    about the film, so feel free to post your
  • 0:50 - 0:57
    questions and we can discuss them. Let's
    enjoy the film.
  • 0:57 - 1:01
    film started
  • 1:01 - 1:03
    ---Because the great promise of the
    internet is freedom. Freedom of
  • 1:03 - 1:08
    expression, freedom of organization,
    freedom of assembly. These are really seen
  • 1:08 - 1:10
    as underpinning rights of what we see as
    democratic values.
  • 1:10 - 1:12
    ---Just because you have safety does not
    mean that you cannot have freedom. Just
  • 1:12 - 1:15
    because you have freedom does not mean you
    cannot have safety.
  • 1:15 - 1:24
    ---Why is the reaction to doubt it rather
    than to assume that it's true and act
  • 1:24 - 1:28
    accordingly?
    ---We need to be able to break those laws
  • 1:28 - 1:30
    that are unjust.
    ---Privacy is, in essence, becoming a de
  • 1:30 - 1:31
    facto crime. That is, somehow you are
    hiding something.
  • 1:31 - 1:32
    ---So just to be sure let's have no
    privacy.
  • 1:32 - 1:33
    "Information. What are they looking at?" A
    film by Theresia Reinhold.
  • 1:33 - 1:35
    "The Internet is [...] everywhere, but we
    only see it in glimpses. The internet
  • 1:35 - 2:17
    is like the Holy Ghost: it makes itself
    knowable to us by taking possession of the
  • 2:17 - 2:18
    pixels on our screens to manifest sites
    and apps and email, but its essence is
  • 2:18 - 2:19
    always elsewhere.
    ---Before the Internet came about,
  • 2:19 - 2:25
    communication was generally one editor to
    many, many readers. But now it's peer to
  • 2:25 - 2:31
    peer. So, you know, at the touch of a button
    people have an opportunity to reach
  • 2:31 - 2:34
    millions of people. That's revolutionizing
    the way we communicate.
  • 2:34 - 2:38
    ---One of the things that Facebook and to
    a lesser degree Twitter allowed people to
  • 2:38 - 2:44
    do is be able to see that they weren't
    alone. And it was able to create a
  • 2:44 - 2:50
    critical mass. And I think that's a very
    important role that social media took on.
  • 2:50 - 2:53
    It was able to show people, in a very easy way,
    in people's Facebook feeds: "Oh, wow. Look
  • 2:53 - 2:57
    at Tahrir Square, there's people out there
    in Bahrain, in Pearl Square." What people
  • 2:57 - 3:02
    could feel before walking out their door
    into real-life action, was that they could see
  • 3:02 - 3:07
    that they are not isolated in their desire
    for some sort of change.
  • 3:07 - 3:12
    ---The great promise of the internet is
    freedom, where the mind is without fear and
  • 3:12 - 3:18
    the head is held high, and the knowledge
    is free. Because the promise was: This
  • 3:18 - 3:22
    will be the great equalizer.
    ---Before the social web, before the Web
  • 3:22 - 3:29
    2.0, anything you were doing was kind of
    anonymous. By the very concept of
  • 3:29 - 3:37
    anonymity you were able to discuss things
    that would probably not be in line with
  • 3:37 - 3:42
    the dominant themes or the dominant trends of
    values of your own society.
  • 3:42 - 3:51
    ---I don't find this discussion about how
    to deal with the assertion "I have nothing
  • 3:51 - 4:00
    to hide" boring, even after many years.
    Because this sentence is very short, but
  • 4:00 - 4:07
    very perfidious. The speaker, who hurls
    the sentence "I have nothing to hide" at
  • 4:07 - 4:13
    me, not only says something about
    themselves, but also something about me.
  • 4:13 - 4:18
    Because this sentence "I have nothing to
    hide" also has the unspoken component of
  • 4:18 - 4:27
    "You don't either, do you?" In this
    respect, I think this sentence lacks
  • 4:27 - 4:30
    solidarity, because at the same time one
    does not want to work to ensure that the
  • 4:30 - 4:33
    other, who perhaps has something to hide,
    is able to do so.
  • 4:33 - 4:38
    ---One of the things about privacy is that
    it's not always about you. It's about the
  • 4:38 - 4:42
    people in our networks. And so, for
    example, I have a lot of friends who are
  • 4:42 - 4:46
    from Syria. People that I have met in
    other places in the world, not necessarily
  • 4:46 - 4:51
    refugees. People who lived abroad for a
    while, but those people are at risk all
  • 4:51 - 4:56
    the time. Both in their home country and
    often in their host countries as well. And
  • 4:56 - 5:01
    so, I might say that I have nothing to
    hide. I might say that there's no reason
  • 5:01 - 5:05
    that I need to keep myself safe. But if
    you've got anyone like that in your
  • 5:05 - 5:11
    network, any activists, any people from
    countries like that, then thinking about
  • 5:11 - 5:15
    privacy and thinking about security means
    thinking about keeping those people safe,
  • 5:15 - 5:18
    too.
    Privacy is important, because if you think of
  • 5:18 - 5:24
    the alternative, if everything is public,
    if the norm is public, then anything that
  • 5:24 - 5:29
    you want to keep to yourself has an
    association of guilt attached. And that
  • 5:29 - 5:34
    should not be the world that we create.
    That's a chilling effect. It's a chilling
  • 5:34 - 5:38
    effect on our freedoms. It's a chilling
    effect on democracy.
  • 5:38 - 5:41
    "No one shall be subjected to arbitrary
    interference with his privacy, family,
  • 5:41 - 5:45
    home or correspondence, nor to attacks
    upon his honor and reputation. Everyone
  • 5:45 - 5:49
    has the right to the protection of the law
    against such interference or attacks."
  • 5:49 - 5:56
    ---To me, human rights are something which
    has been put in place to guarantee the
  • 5:56 - 6:00
    freedoms of every single person in the
    world. They're supposed to be universal,
  • 6:00 - 6:06
    indivisible, equal in the eyes of these
    systems.
  • 6:06 - 6:10
    ---They're collecting data and metadata
    about hundreds of thousands, millions of
  • 6:10 - 6:15
    people. And some of that data will never
    be looked at. That's a fact. We know that.
  • 6:15 - 6:20
    But at the same time, assuming that just
    because you're not involved in activism or
  • 6:20 - 6:24
    you're not well known that you're not
    going to be a target at some point, I
  • 6:24 - 6:29
    think, that is what can be really harmful
    to us. Right now you may not be under any
  • 6:29 - 6:34
    threat at all, but your friends might be,
    your family might be or you might be in
  • 6:34 - 6:37
    the future. And so that's why we need to
    think about it this way, not because we're
  • 6:37 - 6:41
    going to be snatched out of our homes in
    the middle of the night now. But because
  • 6:41 - 6:46
    this data and this metadata lasts for a
    long time.
  • 6:46 - 6:54
    ---My observation is that what we are
    experiencing right now is that the private
  • 6:54 - 7:06
    space that should be and remain private in
    the digital world is slowly beginning to
  • 7:06 - 7:13
    erode. It is becoming permeable, and not
    just factually. Factually, of course, but
  • 7:13 - 7:22
    not only factually, but also in
    perception. I imagine the digital world as
  • 7:22 - 7:32
    a panopticon. This is the ring-shaped
    building designed by Jeremy Bentham. In
  • 7:32 - 7:38
    the ring the prisoners are accommodated in
    individual cells, and in the middle
  • 7:38 - 7:48
    there is a watchtower. And there is a
    guard sitting there. And this guard
  • 7:48 - 7:58
    can observe and supervise the prisoners in
    the cells around him all the time. The
  • 7:58 - 8:05
    trick is that the prisoners cannot know if
    they are being watched. They only see the
  • 8:05 - 8:11
    tower, but they don't see the warden. But
    they know very well that they could be
  • 8:11 - 8:16
    watched permanently at any time. And this
    fact exerts a decisive, behavior-changing effect.
  • 8:16 - 8:24
    ---I think surveillance is a technology of
    governmentality. It's a biopolitical
  • 8:24 - 8:31
    technology. It's there to control and
    manage populations. It's really propelled
  • 8:31 - 8:38
    by state power and the power of entities
    that are glued in, that cohere around the
  • 8:38 - 8:46
    state, right? So it is there as a form of
    population management and control. So you
  • 8:46 - 8:51
    have to convince people that it's in their
    interest, and it's like: Every man for
  • 8:51 - 8:56
    himself and everyone is out to get
    everyone.
  • 8:56 - 9:13
    music plays
  • 9:13 - 9:21
    ---I take my cue from a former general
    counsel of the NSA, Stewart Baker, who said
  • 9:21 - 9:27
    on this question: Metadata absolutely
    tells you everything about somebody's
  • 9:27 - 9:32
    life. If you have enough metadata you
    don't really need content. It is sort of
  • 9:32 - 9:35
    embarrassing, how predictable we are as
    human beings.
  • 9:35 - 9:41
    ---So let's say that you make a phone call one
    night, you call up a suicide hotline, for
  • 9:41 - 9:46
    example, you're feeling down, you call
    that hotline and then a few hours later
  • 9:46 - 9:52
    maybe you call a friend. A few hours later
    you call a doctor, you send an email and
  • 9:52 - 9:58
    so on and so forth. Now, the contents
    of those calls and those e-mails are not
  • 9:58 - 10:02
    necessarily collected. What's being
    collected is the time of the call and the
  • 10:02 - 10:07
    place that you called. And so sometimes in
    events like that, those different pieces
  • 10:07 - 10:10
    of metadata can be linked together to
    profile someone.
  • 10:10 - 10:15
    ---David's description of what you can do
    with metadata, and quoting a mutual friend
  • 10:15 - 10:21
    Stewart Baker, is absolutely correct. We
    kill people based on metadata. But that
  • 10:21 - 10:31
    is not what we do with this metadata.
    Moderator: Thankfully. ---Wow, I was
  • 10:31 - 10:33
    working up a sweat there. moderator laughs
    for a second.
  • 10:33 - 10:38
    ---You know, the impetus for governments
    for conducting this kind of surveillance
  • 10:38 - 10:43
    is often, at least in rhetoric, to go after
    terrorists. And obviously, we don't want
  • 10:43 - 10:48
    terrorism. And so that justification
    resonates with most of the public. But I
  • 10:48 - 10:52
    think that there's a couple problems with
    it. The first is that they haven't
  • 10:52 - 10:56
    demonstrated to us that surveillance
    actually works in stopping terrorist
  • 10:56 - 11:01
    attacks. We haven't seen it work yet. It
    didn't work in Paris. It didn't work in
  • 11:01 - 11:05
    Boston. It didn't work elsewhere. So that's
    one part of it. But then I think the other
  • 11:05 - 11:10
    part of it is that we spend billions of
    dollars on surveillance and on war, but
  • 11:10 - 11:13
    spend very little money on addressing the
    root causes of terrorism.
  • 11:13 - 11:18
    ---I consider this debate of security versus
    freedom to be a bugaboo. Because these
  • 11:18 - 11:22
    values are not mutually exclusive. I'm not
    buying into this propaganda anymore. Many
  • 11:22 - 11:29
    of the measures we have endured in the
    last ten years have not led to more
  • 11:29 - 11:34
    security, in terms of state-imposed
    surveillance. And this is one reason why I
  • 11:34 - 11:42
    don't want to continue this debate about
    whether we should sacrifice freedom for
  • 11:42 - 11:47
    more security.
    ---I think power is concealed in the whole
  • 11:47 - 11:53
    discourse around surveillance, and the way
    it's concealed is through this
  • 11:53 - 12:01
    legitimization that it's in your interest
    that it keeps you safe. But there have
  • 12:01 - 12:07
    been many instances where citizens groups
    have actually fought against that kind of
  • 12:07 - 12:11
    surveillance. And I think there is also
    sort of a mystique around music starts
  • 12:11 - 12:16
    the technology of surveillance. There is
    this whole sort of notion that,
  • 12:16 - 12:20
    ah, because it's a technology and it's
    designed to do this. It's actually
  • 12:20 - 12:26
    working. But all of this is a concealment
    of power relations, because who can surveil
  • 12:26 - 12:32
    whom is the issue, right?
    ---But it isn't the majority of the
  • 12:32 - 12:38
    English population here who get stopped and
    searched. It's non-white people. It is not
  • 12:38 - 12:43
    the majority of non-white people who get
    approached to inform on the community.
  • 12:43 - 12:48
    It's Muslim communities.
    ---The surveillance that one does on the
  • 12:48 - 12:56
    other. So at an airport, it's the other
    passengers that say, oh, so-and-so is
  • 12:56 - 13:03
    speaking in Arabic. And therefore, that
    person becomes the subject, the target
  • 13:03 - 13:08
    of that hyper-surveillance. So it's the kind
    of surveillance that is being exercised
  • 13:08 - 13:16
    by each of us on the other. Because of
    this culture of fear that has been
  • 13:16 - 13:23
    nourished in a way and that's mushrooming
    all around us. And these fears, I
  • 13:23 - 13:28
    think, go anywhere from the most concrete
    to the most vague.
  • 13:28 - 13:33
    ---In this way, I think this is another
    way of creating a semblance of control
  • 13:33 - 13:38
    where this identity is very easily
    visible. It's very easily targeted and
  • 13:38 - 13:42
    it's very easily defined.
    ---For me, this political discussion is
  • 13:42 - 13:52
    purely based on fear, in which people's
    fears, which are justified, are exploited.
  • 13:52 - 13:59
    And where racist stereotypes are being
    repeated. I think it is extremely dangerous
  • 13:59 - 14:08
    to give in to this more and more, also
    because I believe that it reinforces
  • 14:08 - 14:13
    negative instincts in people. Exclusion,
    but also racial profiling.
  • 14:13 - 14:18
    Kurz: It's inherently disenfranchising,
    it's disempowering and it's isolating.
  • 14:18 - 14:23
    When you feel you're being treated as a
    different person to the rest of the
  • 14:23 - 14:27
    population, that's when measures like
    surveillance, things that are enabled by
  • 14:27 - 14:38
    technology really hit home. And cause you
    to sort of change the way you feel as a
  • 14:38 - 14:40
    subject. Because at the end of the day,
    you are a subject of a government.
  • 14:40 - 14:47
    ---How is it that these mass surveillance
    programs have been kept secret for years
  • 14:47 - 14:52
    when they are supposed to be so meaningful
    and effective? Why didn't anyone publicly
  • 14:52 - 14:58
    justify it? Then why was it all secretly
    justified by secret courts with secret
  • 14:58 - 15:02
    court rulings? Why, after the Snowden
    publications began, did the Commission of
  • 15:02 - 15:07
    intelligence experts, which Obama specifically
    appointed, come to the conclusion that
  • 15:07 - 15:15
    not a single one, zero, of these cases of
    terror or attempted terrorist attacks has
  • 15:15 - 15:20
    been even partially solved by this giant
    telecommunications metadata? In trying to
  • 15:20 - 15:26
    stop something from happening before it
    happens, they can put in a measure and
  • 15:26 - 15:31
    that thing might not happen. But they
    don't know if that measure stopped that
  • 15:31 - 15:35
    thing from happening, because that thing
    never happened. It's hard to measure. You
  • 15:35 - 15:39
    can't measure it. And you can't say with
    certainty that because of this measure
  • 15:39 - 15:47
    that that didn't happen. But after 9/11,
    after the catastrophic level of attack, it
  • 15:47 - 15:55
    put decision makers into this impossible
    position where citizens were scared. They
  • 15:55 - 16:01
    needed to do something. One part of that
    is trying to screen everybody objectively
  • 16:01 - 16:07
    and have that sort of panoptical
    surveillance. Saying that: "No, no. We can
  • 16:07 - 16:14
    see everything. Don't worry. We have the
    haystack. We just need to find the needle."
  • 16:14 - 16:18
    But then obviously, they need ways to
    target that. You can see it most clearly
  • 16:18 - 16:23
    over here. You got leaflets through your
    door a few years ago, basically saying
  • 16:23 - 16:29
    that if you've seen anything suspicious,
    call this hotline. It listed things like
  • 16:29 - 16:38
    the neighbor who goes away on holiday many
    times a year, or another neighbor whose
  • 16:38 - 16:44
    curtains are always drawn. It just changes
    the way you look at society and you look
  • 16:44 - 16:51
    at yourself. And it shifts the presumption
    of innocence to a presumption of guilt
  • 16:51 - 16:56
    already.
    ---When is someone a potential suicide
  • 16:56 - 17:08
    bomber? This is where the problem begins.
    When they wear an explosive belt and hold
  • 17:08 - 17:17
    the trigger in their hands? Or when they
    order the building blocks for an explosive
  • 17:17 - 17:29
    belt online? Or when they inform
    themselves about how to build an explosive
  • 17:29 - 17:38
    vest? When can the state legally
    intervene? For me it is about the central
  • 17:38 - 17:46
    very problematic question whether someone
  • 17:46 - 17:55
    who has been identified as a
    potential danger or a potential terrorist,
  • 17:55 - 18:04
    without being a terrorist, if someone like
    that can then be legally surveilled or
  • 18:04 - 18:15
    even arrested? That means: can certain
    people, by potentially posing a concrete
  • 18:15 - 18:20
    danger to society, be stripped of
    their fundamental human rights?
  • 18:20 - 18:23
    ---We face an unprecedented threat which
    will last
  • 18:23 - 18:26
    Two days after the attacks in Brussels on
    22.03.2016, Jean-Claude Juncker and the
  • 18:26 - 18:33
    French prime minister held a joint press
    conference. But we also believe that we
  • 18:33 - 18:37
    need to be a union of security. In it,
    Juncker called on the ministers to accept
  • 18:37 - 18:42
    a proposal by the Commission for the
    protection of the EU.
  • 18:42 - 18:48
    ---For over 15 years now we have observed
    a big populist push to adopt even more
  • 18:48 - 18:52
    surveillance measures. With the attacks of
    the past years, there was the opportunity
  • 18:52 - 18:58
    to pass even more. We have this proposal
    for a new directive whose contents are
  • 18:58 - 19:03
    purely based on ideology.
    The text of the law passed in summary
  • 19:03 - 19:07
    proceedings was adopted as an anti-
    terrorism directive. Green Member of
  • 19:07 - 19:15
    Parliament Jan Philipp Albrecht wrote in
    a statement to Netzpolitik.org: "What the
  • 19:15 - 19:18
    directive defines as terrorism could be
    used by governments to criminalize
  • 19:18 - 19:24
    political action or political protest".
    ---These types of laws are actually neutral
  • 19:24 - 19:26
    in principle. In practice, they are very
    discriminatory. If you talk to any
  • 19:26 - 19:31
    politician right now at the EU level or at
    the national or local level they will tell
  • 19:31 - 19:34
    you that most likely these people are
    Muslims.
  • 19:34 - 19:38
    ---Historically and philosophically this
    problem is well known to us. We always
  • 19:38 - 19:46
    tend to put everything which is unpleasant
    or eerie to us, to the horizon. "This is
  • 19:46 - 19:59
    strange to us. It is done by others, not
    by us." And when one of us does it, then
  • 19:59 - 20:05
    they have to be disturbed.
    ---And this is Edward Said's point of view
  • 20:05 - 20:13
    that the western self comes to define
    itself in relation to this eastern other.
  • 20:13 - 20:18
    So everything that the West was, the East
    wasn't and everything that the East was,
  • 20:18 - 20:22
    the West wasn't. And so the East became
    this province of emotionality,
  • 20:22 - 20:28
    irrationality. And the West became the
    source of reason, everything controlled
  • 20:28 - 20:33
    and contained and so forth. And it is this
    dichotomy that continues to play itself
  • 20:33 - 20:37
    out.
    ---Terrorism emerged as a term for the
  • 20:37 - 20:44
    first time in the context of the French
    Revolution. The Jacobins who under
  • 20:44 - 20:49
    Robespierre ran the reign of terror,
    those were the first terrorists, that's
  • 20:49 - 20:55
    what they were called. The first terrorism was
    the terrorism of the state and of course
  • 20:55 - 20:59
    this also included the systematic
    monitoring of counter-revolutionaries.
  • 20:59 - 21:07
    ---While the proposal of the directive
    says that it complies with human rights,
  • 21:07 - 21:10
    it actually does not, because they want to
    increase surveillance measures in order
  • 21:10 - 21:15
    for the population to feel safer. However,
    we've seen that more repressive measures do
  • 21:15 - 21:18
    not necessarily mean that you would have
    more security.
  • 21:18 - 21:25
    ---The way you sell it to people is to
    appease their anxieties around
  • 21:25 - 21:32
    "Oh, this is an insecure world. Anything
    could happen at any time. And so if
  • 21:32 - 21:36
    anything could happen at any time, what
    can we do about it?"
  • 21:36 - 21:41
    ---You get the feeling that the text is
    trying to make sure that law enforcement
  • 21:41 - 21:44
    will be able to get access to
    communications by any means that they
  • 21:44 - 21:47
    wish.
    ---To be able to stop something from
  • 21:47 - 21:50
    happening before it happens, you have to
    know everything. You have to look at the
  • 21:50 - 21:55
    past, look at what happened, but also
    predict the future by looking at the past
  • 21:55 - 22:01
    and then getting as much information as
    you can on everything all the time. So
  • 22:01 - 22:04
    it's about zero risk.
    Kurz: All developed democracies have a
  • 22:04 - 22:09
    concept of proportionality, that's
    what they call it in Germany: that
  • 22:09 - 22:12
    surveillance measures are weighed up
    against the threat to fundamental rights.
  • 22:12 - 22:16
    This undoubtedly includes privacy. Privacy
    is very highly valued in Germany and
  • 22:16 - 22:23
    directly derived from human dignity. And
    human dignity is only negotiable to a very
  • 22:23 - 22:29
    tiny degree.
    ---When we are afraid to speak either
  • 22:29 - 22:33
    because of our government coming after us
    or because of a partner or a boss or
  • 22:33 - 22:37
    whomever: all sorts of surveillance cause
    self-censorship. But I think that mass
  • 22:37 - 22:41
    surveillance, the idea that everything we
    are doing is being collected can cause a
  • 22:41 - 22:44
    lot of people to think twice before they
    open their mouths.
  • 22:44 - 22:52
    ---When all your likes can be traced back
    to you of course it affects your
  • 22:52 - 22:56
    behaviour. Of course it's usually the case
    that sometimes you think: If you like this
  • 22:56 - 23:02
    thing or if you don't, it would have some
    social repercussions.
  • 23:02 - 23:08
    ---But if you look throughout history, the
    Reformation, the gay rights movements, all
  • 23:08 - 23:13
    these movements were illegal in some way.
    If not strictly by law, then by culture.
  • 23:13 - 23:19
    And if we'd had this kind of mass surveillance,
    would we have had these movements?
  • 23:19 - 23:23
    ---If all laws were absolutes, then we
    would never have progressed to the point
  • 23:23 - 23:28
    where women had equal rights because women
    had to break the laws that said: "You
  • 23:28 - 23:33
    can't have equal rights". Black people in
    America had to break the laws that said
  • 23:33 - 23:41
    they could not have equal rights. And
    there's a common thread here. You know, a lot
  • 23:41 - 23:48
    of our laws historically have had the
    harshest effect on the most vulnerable in
  • 23:48 - 23:51
    society.
    Kurz: The notion that whoever has
  • 23:51 - 23:57
    something to hide has only themselves
    to blame only emerged in recent years.
  • 23:57 - 24:04
    In particular, the former CEO of Google,
    Eric Schmidt, is known for that. Of course, he
  • 24:04 - 24:05
    certainly said that.
    "If you have something that you don't want
  • 24:05 - 24:10
    anyone to know, maybe you shouldn't be
    doing it in the first place." But this
  • 24:10 - 24:16
    is so hostile to humans that it is almost
    funny. You could think it is satire. A lot
  • 24:16 - 24:19
    of people can't help that they have
    something to hide in a society that is
  • 24:19 - 24:24
    unjust. Eric Schmidt again: "But if you
    really need that kind of privacy, the
  • 24:24 - 24:36
    reality is that search engines, including
    Google, do retain this information for some
  • 24:36 - 24:46
    time."
    ---Big corporations that have this
  • 24:46 - 24:51
    business model of people farming are
    interested in you because you are the raw
  • 24:51 - 24:55
    material. Right. Your information is raw
    material. What they do is they process
  • 24:55 - 25:00
    that to build a profile of you. And that's
    where the real value is. Because if I know
  • 25:00 - 25:06
    enough about you, if I have as much information
    about you that I can build a very
  • 25:06 - 25:14
    lifelike, constantly evolving picture of
    you, a simulation of you. That makes you very
  • 25:14 - 25:17
    vulnerable.
    ---The economy of the net is predicting
  • 25:17 - 25:23
    human behaviour, so that eyeballs can be
    delivered to advertising and that's
  • 25:23 - 25:27
    targeted advertising.
    ---The system in a way is set up for them
  • 25:27 - 25:32
    to make money and sell our little bits of
    data, our interests, our demographics to
  • 25:32 - 25:37
    other people and to advertisers to be
    able to sell things. These companies know
  • 25:37 - 25:40
    more about us than we know about
    ourselves. Right now we're feeding the
  • 25:40 - 25:43
    beast. And right now, there's very little
    oversight.
  • 25:43 - 25:49
    ---It has to reach one person, the same ad,
    at a particular time. If at 3:00 p.m. you
  • 25:49 - 25:55
    buy the soda you get with your lunch, how about
    at 2:55 p.m. you'll get an ad about a discount
  • 25:55 - 26:02
    at a pizza place next door or a salad
    place, where they have exactly that soda.
  • 26:02 - 26:04
    So that's what targeted advertising is.
    ---It is true, it is convenient. You know,
  • 26:04 - 26:09
    I always laugh every time I'm on a site.
    I'm looking at, let's say, a sweater I
  • 26:09 - 26:12
    want to buy. And then I move over to
    another site and there's advertising for that
  • 26:12 - 26:16
    same sweater. It pops up and reminds me
    how much I want it. It's both convenient
  • 26:16 - 26:20
    and annoying.
    ---It's a pity that some of the greatest
  • 26:20 - 26:25
    minds in our century are only wondering
    how to make you look at advertising. And
  • 26:25 - 26:31
    that's where the surveillance economy
    begins, I would say, and not just ends.
  • 26:31 - 26:36
    ---To a lot of people that may seem much
    less harmful. But the fact that they're
  • 26:36 - 26:41
    capturing this data means that data exists
    and we don't know who they might share it with.
  • 26:41 - 26:45
    ---There is a whole new business now, you
    know, data brokers who draw upon, you
  • 26:45 - 26:50
    know, thousands of data points and create
    client profiles to sell to companies. Now,
  • 26:50 - 26:54
    you don't really know what happens with
    these kinds of things. So it is hard to
  • 26:54 - 27:00
    tell what the implications are until it
    is too late. Until it happens.
  • 27:00 - 27:04
    ---The Stasi, compared to Google or
    Facebook, were amateurs. The Stasi
  • 27:04 - 27:11
    actually had to use people to surveil you,
    to spy on you. That was expensive. It was
  • 27:11 - 27:16
    time consuming. They had to pick targets.
    It was very expensive for them to have all
  • 27:16 - 27:21
    of these people spying on you. Facebook
    and Google don't have to do that. They use
  • 27:21 - 27:26
    algorithms, that's the mass in mass
    surveillance. The fact that it is so
  • 27:26 - 27:32
    cheap, so convenient to spy on so many
    people. And it's not a conspiracy theory.
  • 27:32 - 27:36
    You don't need conspiracies when you have
    the simplicity of business models.
  • 27:36 - 27:43
    ---When we talk about algorithms, we
    actually talk about logic. When you want,
  • 27:43 - 27:49
    for example, to buy a book on Amazon, you
    are always shown a few other suggestions.
  • 27:49 - 27:55
    These suggestions are produced for you
    based on the history of your preferences,
  • 27:55 - 28:00
    the history of your searches.
    ---They learn by making mistakes. And the
  • 28:00 - 28:07
    thing is, that's fine if it's like selling
    dog food. But if it's about predictive
  • 28:07 - 28:12
    policing and about creating a matrix
    where you see which individuals are
  • 28:12 - 28:17
    threatening, that's not OK for me. You
    know, there have to be limits. There have to
  • 28:17 - 28:22
    be lines. And these are all the dynamics
    that are coming from the bottom up. These
  • 28:22 - 28:26
    are the discussions that need to be had,
    but they need to be had with all actors.
  • 28:26 - 28:31
    It can't just be an echo chamber, where you
    only talk to the same people who agree
  • 28:31 - 28:36
    with you.
    ---So one consequence of this would be
  • 28:36 - 28:43
    that many minorities or many people who have
    minority views would be silenced. And we
  • 28:43 - 28:47
    always know that when a minority view is
    silenced, it would empower them in a way
  • 28:47 - 28:57
    and it would radicalize them in the long
    run. This is one aspect. The other is that
  • 28:57 - 29:01
    you would never be challenged by anyone
    who disagrees with you.
  • 29:01 - 29:07
    ---We have to understand that our data is
    not exhaust. Our data is not oil. Data is
  • 29:07 - 29:13
    people. You may not be doing anything wrong
    today, but maybe three governments from
  • 29:13 - 29:19
    now when they pass a certain law, what you
    have done today might be illegal, for
  • 29:19 - 29:25
    example, and governments that keep that
    data can look back over 10, 20 years and
  • 29:25 - 29:33
    maybe start prosecuting.
    ---When everything we buy, everything we
  • 29:33 - 29:44
    read, even the people we meet and date are
    determined by these algorithms, I think the
  • 29:44 - 29:50
    amount of power that they exert on the
    society and individuals in this society is
  • 29:50 - 29:57
    more than the state's, to some degree.
    And so there, I think, representative
  • 29:57 - 30:04
    democracies have the duty to push the
    government to open up these private
  • 30:04 - 30:13
    entities, to at least expose to some
    degree how much control they exert.
  • 30:13 - 30:15
    ---If you adopt the technological
    perspective and realize that technology
  • 30:15 - 30:19
    will slip into our lives much more than is
    already the case: Technically into our
  • 30:19 - 30:23
    bodies, our clothes, into devices that we
    sit in and we're wearing, in all sorts of
  • 30:23 - 30:32
    areas of our coexistence and working life,
    then that's definitely the wrong way to
  • 30:32 - 30:35
    go. Because it leads to total
    surveillance. And if you think about it
  • 30:35 - 31:31
    for a few minutes, you will realize that
    the dichotomy is between control and
  • 31:31 - 31:42
    freedom. And a fully controlled society
    cannot be free.
  • 31:42 - 31:43
    post film music
  • 31:43 - 31:53
    Herald: Hello and welcome back from the
    movie, and now I also welcome our
  • 31:53 - 32:09
    producer. And, oh yeah, it was showing
    very well what information can do
  • 32:09 - 32:16
    and what could be done with information. I'll
    give people a bit more time to ask
  • 32:16 - 32:28
    more questions. And in the meantime, I
    could ask: this movie
  • 32:28 - 32:38
    was not shown
    today for the first time. So what would
  • 32:38 - 32:46
    you do differently, or what do you think has
    maybe changed in the meanwhile since you
  • 32:46 - 32:51
    made it?
    pandemonium: What I would change is I
  • 32:51 - 33:00
    would definitely try much harder to secure
    funding to just simply make a better movie
  • 33:00 - 33:05
    and have more time and edit it faster,
    because the editing process, since I had
  • 33:05 - 33:12
    to work on the side, was quite long. And
    this film, the way it stands now, was
  • 33:12 - 33:20
    essentially only funded by a few very
    great people who supported me on Patreon
  • 33:20 - 33:27
    and helped me with some of their private
    money, so that it was
  • 33:27 - 33:30
    essentially an almost no-budget
    production. So I would definitely change
  • 33:30 - 33:37
    that. But the documentary scene in Germany
    being what it is, it's very hard to secure
  • 33:37 - 33:42
    money if you're not attached to a TV
    station or if you don't have a name yet.
  • 33:42 - 33:47
    And since I didn't have a name, but I
    still wanted to make the movie, I made the
  • 33:47 - 33:53
    movie. I am still very happy with the
    general direction of it. But of course,
  • 33:53 - 34:01
    since it was mainly shot in 2015 and
    2016, some of the newer developments in
  • 34:01 - 34:11
    terms of especially biometric mass
    surveillance and policing, especially in the
  • 34:11 - 34:18
    U.S., the way police use body cams, etc.,
    are not really reflected. But I still think
  • 34:18 - 34:25
    that I would still go with the whole
    angle on colonialism and racism that is
  • 34:25 - 34:31
    deeply entrenched in the discussions
    around surveillance and privacy. And we
  • 34:31 - 34:36
    can see that in discussions about shutting
    down Telegram in Germany at the moment
  • 34:36 - 34:42
    because of right-wing groups there; we see
    it in discussions about how to deal with
  • 34:42 - 34:49
    hate speech online on Facebook or the
    Metaverse, or whatever it's going to be called
  • 34:49 - 34:54
    soon. And all of these things are already
    kind of in the movie, but I would have
  • 34:54 - 35:01
    probably focused a bit more on them if I'd
    known six years ago what would happen, and
  • 35:01 - 35:07
    if I were to make it now. But generally, I
    would choose the same direction.
  • 35:07 - 35:19
    Herald: Yeah, it's quite fascinating. So
    it was nearly no budget, the budget was so low. And it
  • 35:19 - 35:29
    was also an interesting point to entertain
    now, because in principle, I understood
  • 35:29 - 35:36
    the idea was that body cams should actually
    create a means of protecting people
  • 35:36 - 35:42
    against police and not the other way
    around as it happens sometimes.
  • 35:42 - 35:48
    pandemonium: So there definitely are. And
    the problem with especially body cams or
  • 35:48 - 35:55
    also with other means of surveillance
    is that a video is always thought to be an
  • 35:55 - 36:02
    objective recording of reality. But of
    course, it always depends on the angle, in
  • 36:02 - 36:08
    the case of body cams quite literally the
    angle, but also on data interpretation. And
  • 36:08 - 36:15
    since humans are always full of biases and
    always full of presumptions about who
  • 36:15 - 36:21
    might be in the right and who might be in
    the wrong, these images or videos,
  • 36:21 - 36:25
    even if they were
    showing the objective truth, are
  • 36:25 - 36:32
    barely ever interpreted that way. And
    it's exactly the same with any sort of
  • 36:32 - 36:39
    film or photography that we're doing. I
    mean, for this movie, I assembled a ton of
  • 36:39 - 36:43
    interviews and they were very long, they
    were several hours long in many cases, and
  • 36:43 - 36:49
    I could have edited probably 100 different
    versions of this movie going essentially
  • 36:49 - 36:59
    almost opposite directions with exactly the
    same material. It shows very strongly how
  • 36:59 - 37:06
    important it is, that at the end of the
    day we overcome our biases, as judges, as
  • 37:06 - 37:12
    police people, as people walking on the
    street. And trying to overcome any sort of
  • 37:12 - 37:22
    surveillance is impossible on the
    technical level; it is always connected with
  • 37:22 - 37:32
    the way we understand the world.
    Herald: Yeah, this is, I also remember a
  • 37:32 - 37:45
    talk several years ago, it was about one
    of the "Freiheit statt Angst" demonstrations
  • 37:45 - 37:54
    in Berlin. And there was also a case,
    which became well known, where a guy
  • 37:54 - 38:03
    with a t-shirt got beaten by police and it
    was very hard to assemble different videos
  • 38:03 - 38:11
    and to tell the whole story of what had
    happened. You should be able to find that,
  • 38:11 - 38:18
    there was a talk, where this was
    reconstructed somehow.
  • 38:18 - 38:23
    Producer: I will definitely look that
    up.
  • 38:23 - 38:31
    Herald: But I'm not sure about the year
    anymore, but you will find it. Now we have
  • 38:31 - 38:38
    a real question from the audience. The
    first is, can I find the movie anywhere
  • 38:38 - 38:43
    and show it to somebody else?
    Producer: Yes. It is on YouTube. laughs
  • 38:43 - 38:46
    Herald: Ok.
    Producer: You can literally find it by
  • 38:46 - 38:55
    typing my name, which is Theresia
    Reinhold. And you can find it.
  • 38:55 - 39:08
    Herald: OK. So, it's very good. So, I hope
    they are happy. "Is the 21st century
  • 39:08 - 39:13
    attention span for non-technical friends
    with the biggest claim..." I don't get the
  • 39:13 - 39:36
    question. The idea is: is there a way of
    explaining to non-technical people what
  • 39:36 - 39:41
    the problem is with "I have nothing
    to hide"?
  • 39:41 - 39:48
    Producer: If there is anything in your
    life, where you are happy, no one is
  • 39:48 - 39:53
    watching you doing, whether it is a
    certain video that a lot of people don't
  • 39:53 - 40:01
    know that you are watching, or singing in
    the shower or anything, then that is your
  • 40:01 - 40:13
    absolute right that no one is watching, no one is
    judging you for it. And that's the same
  • 40:13 - 40:19
    with mass surveillance and surfing online
    or walking down the street. We have a very
  • 40:19 - 40:27
    basic comfort zone that should be protected.
    You know, we have a human right to privacy and
  • 40:27 - 40:31
    whether it's in a technical room or in an
    analog room, like being in a shopping mall
  • 40:31 - 40:35
    and picking up, I don't know, whatever you
    don't want other people to know that
  • 40:35 - 40:40
    you're buying. You should have the right
    to do that in private and not have it be
  • 40:40 - 40:46
    known to other people. And when we are
    surfing the internet, everything we do is
  • 40:46 - 40:54
    constantly analyzed and watched in real
    time and, you know, our movements online
  • 40:54 - 41:00
    are sold to the highest bidder, and there's a
    whole massive advertising industry behind
  • 41:00 - 41:09
    it. And that's just immoral because humans
    should always have the ability to share
  • 41:09 - 41:16
    only what they want to share. That's how I
    try to explain it to non-tech people. And
  • 41:16 - 41:18
    if they're non-tech people from the
    former East, you just have to mention the Stasi
  • 41:18 - 41:21
    and they know exactly what you're talking
    about.
  • 41:21 - 41:27
    Herald: Yes, thanks for this explanation
    again. And I think also what is important,
  • 41:27 - 41:34
    what was also mentioned in the movie is
    the thing that, since everything
  • 41:34 - 41:41
    can be stored now, your history
    can haunt you in the future. Kind of.
  • 41:41 - 41:48
    pandemonium: Yeah.
    Herald: And actually, the question was
  • 41:48 - 41:58
    now made more precise. And actually, it was
    not what I asked you. laughs Actually, it is
  • 41:58 - 42:07
    the question of whether there is or there
    could be a short teaser that people could
  • 42:07 - 42:12
    send to their friends to
    watch the whole movie?
  • 42:12 - 42:16
    laughs
    Reinhold: Oh yes, there is. And that is
  • 42:16 - 42:20
    also on YouTube and on Vimeo. OK, sorry.
    Yes. laughs
  • 42:20 - 42:27
    Herald: Well, I also didn't get that from
    the question. So OK, so people will find
  • 42:27 - 42:34
    it, very good. So then I guess we are
    through with the questions, and I thank
  • 42:34 - 42:47
    you again for your nice movie and for
    being here. Yeah. And then this talk is
  • 42:47 - 42:57
    over. Chaos Zone TV comes back to
    you at 4 p.m. with the talk "Tales from
  • 42:57 - 43:05
    the Quantum Industry". Until then, oh yeah,
    watch some other streams, go to the world,
  • 43:05 - 43:08
    or have some lunch. See you.
  • 43:08 - 43:09
    post roll music
  • 43:09 - 43:11
    Subtitles created by many, many volunteers and
    the c3subtitles.de team. Join us, and help us!