1
00:00:08,670 --> 00:00:09,670
RC3 preroll music
2
00:00:09,670 --> 00:00:15,599
Herald: Welcome back to the channel of
Chaoszone TV. Now we have an English
3
00:00:15,599 --> 00:00:23,930
speaking contribution. And for that, I
welcome pandemonium, who is a historian
4
00:00:23,930 --> 00:00:33,290
and documentary filmmaker, and therefore
we will also not have a normal talk, but
5
00:00:33,290 --> 00:00:40,899
we will see the documentary "Information.
What are they looking at?", a documentary on
6
00:00:40,899 --> 00:00:49,500
privacy. And afterwards, we can all talk
about the film, so feel free to post your
7
00:00:49,500 --> 00:00:56,739
questions and we can discuss them. Let's
enjoy the film.
8
00:00:56,739 --> 00:01:01,150
film started
9
00:01:01,150 --> 00:01:03,060
---Because the great promise of the
internet is freedom. Freedom of
10
00:01:03,060 --> 00:01:07,780
expression, freedom of organization,
freedom of assembly. These are really seen
11
00:01:07,780 --> 00:01:10,350
as underpinning rights of what we see as
democratic values.
12
00:01:10,350 --> 00:01:12,330
---Just because you have safety does not
mean that you cannot have freedom. Just
13
00:01:12,330 --> 00:01:15,330
because you have freedom does not mean you
cannot have safety.
14
00:01:15,330 --> 00:01:24,100
---Why is the reaction to doubt it rather
than to assume that it's true and act
15
00:01:24,100 --> 00:01:27,890
accordingly?
---We need to be able to break those laws
16
00:01:27,890 --> 00:01:29,610
that are unjust.
---Privacy is, in essence, becoming a de
17
00:01:29,610 --> 00:01:30,610
facto crime. That is, that somehow you are
hiding something.
18
00:01:30,610 --> 00:01:31,610
---So just to be sure let's have no
privacy.
19
00:01:31,610 --> 00:01:32,610
"Information. What are they looking at?" A
film by Theresia Reinhold.
20
00:01:32,610 --> 00:01:34,619
"The Internet is [...] everywhere, but we
only see it in glimpses. The internet
21
00:01:34,619 --> 00:02:16,510
is like the Holy Ghost: it makes itself
knowable to us by taking possession of the
22
00:02:16,510 --> 00:02:17,510
pixels on our screens to manifest sites
and apps and email, but its essence is
23
00:02:17,510 --> 00:02:18,510
always elsewhere.
---Before the Internet came about,
24
00:02:18,510 --> 00:02:25,190
communication was generally one editor to
many, many readers. But now it's peer to
25
00:02:25,190 --> 00:02:30,900
peer. So, you know, at a touch of a button
people have an opportunity to reach
26
00:02:30,900 --> 00:02:34,000
millions of people. That's revolutionizing
the way we communicate.
27
00:02:34,000 --> 00:02:38,280
---One of the things that Facebook and to
a lesser degree Twitter allowed people to
28
00:02:38,280 --> 00:02:43,800
do is be able to see that they weren't
alone. And it was able to create a
29
00:02:43,800 --> 00:02:49,530
critical mass. And I think that's a very
important role that social media took on.
30
00:02:49,530 --> 00:02:53,239
It was able to show people, in a very easy
way, in their Facebook feeds: "Oh, wow. Look
31
00:02:53,239 --> 00:02:57,210
at Tahrir Square, there's people out there
in Bahrain, in Pearl Square." What people
32
00:02:57,210 --> 00:03:02,099
could feel before walking out their door
into real-life action was that they could see
33
00:03:02,099 --> 00:03:07,220
that they are not isolated in their desire
for some sort of change.
34
00:03:07,220 --> 00:03:12,440
---The great promise of the internet is
freedom, where the mind is without fear and
35
00:03:12,440 --> 00:03:18,340
the head is held high. And the knowledge
is free. Because the promise was: This
36
00:03:18,340 --> 00:03:22,400
will be the great equalizer.
---Before the social web, before the Web
37
00:03:22,400 --> 00:03:29,280
2.0, anything you were doing was kind of
anonymous. By the very concept of
38
00:03:29,280 --> 00:03:36,720
anonymity you were able to discuss things
that would probably be not according to
39
00:03:36,720 --> 00:03:42,030
dominant themes or the dominant trends of
values of your own society.
40
00:03:42,030 --> 00:03:50,550
---I don't find this discussion about how
to deal with the assertion "I have nothing
41
00:03:50,550 --> 00:03:59,849
to hide" boring, even after many years.
Because this sentence is very short, but
42
00:03:59,849 --> 00:04:07,099
very perfidious. The speaker, who hurls
the sentence "I have nothing to hide" at
43
00:04:07,099 --> 00:04:13,140
me, not only says something about
themselves, but also something about me.
44
00:04:13,140 --> 00:04:17,640
Because this sentence "I have nothing to
hide" also has the unspoken component of
45
00:04:17,640 --> 00:04:26,930
"You don't either, do you?" In this
respect, I think this sentence lacks
46
00:04:26,930 --> 00:04:30,160
solidarity, because at the same time one
does not want to work to ensure that the
47
00:04:30,160 --> 00:04:32,699
other, who perhaps has something to hide,
is able to do so.
48
00:04:32,699 --> 00:04:37,741
---One of the things about privacy is that
it's not always about you. It's about the
49
00:04:37,741 --> 00:04:41,620
people in our networks. And so, for
example, I have a lot of friends who are
50
00:04:41,620 --> 00:04:46,180
from Syria. People that I have met in
other places in the world, not necessarily
51
00:04:46,180 --> 00:04:51,360
refugees. People who lived abroad for a
while, but those people are at risk all
52
00:04:51,360 --> 00:04:56,020
the time. Both in their home country and
often in their host countries as well. And
53
00:04:56,020 --> 00:05:00,680
so, I might say that I have nothing to
hide. I might say that there's no reason
54
00:05:00,680 --> 00:05:04,979
that I need to keep myself safe. But if
you've got anyone like that in your
55
00:05:04,979 --> 00:05:10,780
network, any activists, any people from
countries like that, then thinking about
56
00:05:10,780 --> 00:05:14,970
privacy and thinking about security means
thinking about keeping those people safe,
57
00:05:14,970 --> 00:05:17,760
too.
Privacy is important, because if you think of
58
00:05:17,760 --> 00:05:23,690
the alternative, if everything is public,
if the norm is public, then anything that
59
00:05:23,690 --> 00:05:29,471
you want to keep to yourself has an
association of guilt attached. And that
60
00:05:29,471 --> 00:05:33,804
should not be the world that we create.
That's a chilling effect. It's a chilling
61
00:05:33,804 --> 00:05:37,980
effect on our freedoms. It's a chilling
effect on democracy.
62
00:05:37,980 --> 00:05:41,450
"No one shall be subjected to arbitrary
interference with his privacy, family,
63
00:05:41,450 --> 00:05:44,920
home or correspondence, nor to attacks
upon his honor and reputation. Everyone
64
00:05:44,920 --> 00:05:48,970
has the right to the protection of the law
against such interference or attacks.
65
00:05:48,970 --> 00:05:55,610
---To me, human rights are something which
has been put to place to guarantee the
66
00:05:55,610 --> 00:06:00,370
freedoms of every single person in the
world. They're supposed to be universal,
67
00:06:00,370 --> 00:06:05,710
indivisible, equal in the eyes of these
systems.
68
00:06:05,710 --> 00:06:09,940
---They're collecting data and metadata
about hundreds of thousands, millions of
69
00:06:09,940 --> 00:06:14,669
people. And some of that data will never
be looked at. That's a fact. We know that.
70
00:06:14,669 --> 00:06:19,710
But at the same time, assuming that just
because you're not involved in activism or
71
00:06:19,710 --> 00:06:24,139
you're not well known that you're not
going to be a target at some point, I
72
00:06:24,139 --> 00:06:28,680
think, that is what can be really harmful
to us. Right now you may not be under any
73
00:06:28,680 --> 00:06:33,580
threat at all, but your friends might be,
your family might be or you might be in
74
00:06:33,580 --> 00:06:37,080
the future. And so that's why we need to
think about it this way, not because we're
75
00:06:37,080 --> 00:06:41,400
going to be snatched out of our homes in
the middle of the night now. But because
76
00:06:41,400 --> 00:06:45,509
this data and this metadata lasts for a
long time.
77
00:06:45,509 --> 00:06:53,930
---My observation is that what we are
experiencing right now is that the private
78
00:06:53,930 --> 00:07:05,781
space that should be and remain private in
the digital world is slowly beginning to
79
00:07:05,781 --> 00:07:13,250
erode. It is becoming permeable, and not
just factual. Factually, of course, but
80
00:07:13,250 --> 00:07:21,700
not only factually, but also in
perception. I imagine the digital world as
81
00:07:21,700 --> 00:07:31,500
a panopticon. This is the ring-shaped
building designed by Jeremy Bentham. In
82
00:07:31,500 --> 00:07:38,120
the ring the prisoners are accommodated in
individual cells, and in the middle
83
00:07:38,120 --> 00:07:47,960
there is a watchtower. And there is a
guard sitting there. And this guard
84
00:07:47,960 --> 00:07:57,860
can observe and supervise the prisoners in
the cells around him all the time. The
85
00:07:57,860 --> 00:08:04,790
trick is that the prisoners cannot know if
they are being watched. They only see the
86
00:08:04,790 --> 00:08:10,620
tower, but they don't see the warden. But
they know very well that they could be
87
00:08:10,620 --> 00:08:16,360
watched permanently at any time. And this
fact exerts a decisive, behavior-changing effect.
88
00:08:16,360 --> 00:08:24,229
---I think surveillance is a technology of
governmentality. It's a biopolitical
89
00:08:24,229 --> 00:08:30,789
technology. It's there to control and
manage populations. It's really propelled
90
00:08:30,789 --> 00:08:38,280
by state power and the power of entities
that cohere around the
91
00:08:38,280 --> 00:08:45,780
state, right? So it is there as a form of
population management and control. So you
92
00:08:45,780 --> 00:08:51,279
have to convince people that it's in their
interest, and it's like: Every man for
93
00:08:51,279 --> 00:08:55,950
himself and everyone is out to get
everyone.
94
00:08:55,950 --> 00:09:12,870
music plays
95
00:09:12,870 --> 00:09:21,430
---I take my cue from a former general
counsel of the NSA, Stewart Baker, who said
96
00:09:21,430 --> 00:09:27,240
on this question: Metadata absolutely
tells you everything about somebody's
97
00:09:27,240 --> 00:09:31,990
life. If you have enough metadata you
don't really need content. It is sort of
98
00:09:31,990 --> 00:09:35,370
embarrassing, how predictable we are as
human beings.
99
00:09:35,370 --> 00:09:41,301
---So let's say that you make a phone call one
night, you call up a suicide hotline, for
100
00:09:41,301 --> 00:09:46,490
example, you're feeling down, you call
that hotline and then a few hours later
101
00:09:46,490 --> 00:09:52,380
maybe you call a friend. A few hours later
you call a doctor, you send an email and
102
00:09:52,380 --> 00:09:57,690
and so on and so forth. Now, the contents
of those calls and those e-mails are not
103
00:09:57,690 --> 00:10:02,150
necessarily collected. What's being
collected is the time of the call and the
104
00:10:02,150 --> 00:10:06,800
place that you called. And so sometimes in
events like that, those different pieces
105
00:10:06,800 --> 00:10:10,160
of metadata can be linked together to
profile someone.
106
00:10:10,160 --> 00:10:14,920
---David's description of what you can do
with metadata, and quoting a mutual friend
107
00:10:14,920 --> 00:10:21,430
Stewart Baker, is absolutely correct. We
kill people based on metadata. But that
108
00:10:21,430 --> 00:10:30,560
is not what we do with this metadata.
Mayor Denett: Thankfully. Wow, I was
109
00:10:30,560 --> 00:10:32,760
working up a sweat there for a second.
*Mayor laughs*
110
00:10:32,760 --> 00:10:37,839
---You know, the impetus for governments
for conducting this kind of surveillance
111
00:10:37,839 --> 00:10:42,950
is often, at least in rhetoric, to go after
terrorists. And obviously, we don't want
112
00:10:42,950 --> 00:10:48,430
terrorism. And so that justification
resonates with most of the public. But I
113
00:10:48,430 --> 00:10:52,019
think that there's a couple of problems with
it. The first is that they haven't
114
00:10:52,019 --> 00:10:55,930
demonstrated to us that surveillance
actually works in stopping terrorist
115
00:10:55,930 --> 00:11:00,830
attacks. We haven't seen it work yet. It
didn't work in Paris. It didn't work in
116
00:11:00,830 --> 00:11:05,170
Boston. It didn't work elsewhere. So that's
one part of it. But then I think the other
117
00:11:05,170 --> 00:11:10,389
part of it is that we spend billions of
dollars on surveillance and on war, but
118
00:11:10,389 --> 00:11:13,280
spend very little money on addressing the
root causes of terrorism.
119
00:11:13,280 --> 00:11:17,740
---I consider this debate security versus
freedom to be a bugaboo. Because these
120
00:11:17,740 --> 00:11:22,209
values are not mutually exclusive. I'm not
buying into this propaganda anymore. Many
121
00:11:22,209 --> 00:11:28,740
of the measures we have endured in the
last ten years have not led to more
122
00:11:28,740 --> 00:11:34,180
security, in terms of state-imposed
surveillance. And this is one reason why I
123
00:11:34,180 --> 00:11:42,480
don't want to continue this debate about
whether we should sacrifice freedom for
124
00:11:42,480 --> 00:11:47,381
more security.
---I think power is concealed in the whole
125
00:11:47,381 --> 00:11:53,130
discourse around surveillance, and the way
it's concealed is through this
126
00:11:53,130 --> 00:12:00,710
legitimization that it's in your interest
that it keeps you safe. But there have
127
00:12:00,710 --> 00:12:07,470
been many instances where citizens groups
have actually fought against that kind of
128
00:12:07,470 --> 00:12:11,450
surveillance. And I think there is also
sort of a mystique around *music starts*
129
00:12:11,450 --> 00:12:15,690
the technology of surveillance. There is
this whole sort of notion that,
130
00:12:15,690 --> 00:12:20,280
ah, because it's a technology and it's
designed to do this, it's actually
131
00:12:20,280 --> 00:12:25,589
working. But all of this is a concealment
of power relations because who can surveil
132
00:12:25,589 --> 00:12:31,720
whom is the issue, right?
---But it isn't the majority of the
133
00:12:31,720 --> 00:12:38,190
English population here who get stopped and
searched. It's non-white people. It is not
134
00:12:38,190 --> 00:12:42,519
the majority of non-white people who get
approached to inform on the community.
135
00:12:42,519 --> 00:12:47,649
It's Muslim communities.
---The surveillance that one does on the
136
00:12:47,649 --> 00:12:55,540
other. So at airports, it's the other
passengers that say, oh, so-and-so is
137
00:12:55,540 --> 00:13:03,240
speaking in Arabic. And therefore, that
person becomes the subject, the target
138
00:13:03,240 --> 00:13:08,380
of that hyper-surveillance. So it's the kinds
of surveillance that are being exercised
139
00:13:08,380 --> 00:13:15,810
by each of us on the other. Because of
this culture of fear that has been
140
00:13:15,810 --> 00:13:23,430
nourished in a way and that's mushrooming
all around us. And these fears, I
141
00:13:23,430 --> 00:13:27,600
think, go anywhere from the most concrete
to the most vague.
142
00:13:27,600 --> 00:13:32,680
---In this way, I think this is another
way of creating a semblance of control
143
00:13:32,680 --> 00:13:38,200
where this identity is very easily
visible. It's very easily targeted and
144
00:13:38,200 --> 00:13:41,620
it's very easily defined.
---For me, this political discussion is
145
00:13:41,620 --> 00:13:51,639
purely based on fear, in which the fears of
people, which are justified, are exploited.
146
00:13:51,639 --> 00:13:58,750
And where racist stereotypes are being
repeated. I think it is extremely dangerous
147
00:13:58,750 --> 00:14:08,089
to give in to this more and more, also
because I believe that it reinforces
148
00:14:08,089 --> 00:14:13,269
negative instincts in people. Exclusion,
but also racial profiling.
149
00:14:13,269 --> 00:14:17,529
Kurz: It's inherently disenfranchising,
it's disempowering and it's isolating.
150
00:14:17,529 --> 00:14:22,990
When you feel you're being treated as a
different person to the rest of the
151
00:14:22,990 --> 00:14:27,220
population, that's when measures like
surveillance, things that are enabled by
152
00:14:27,220 --> 00:14:37,710
technology really hit home. And cause you
to sort of change the way you feel as a
153
00:14:37,710 --> 00:14:40,160
subject. Because at the end of the day,
you are a subject of a government.
154
00:14:40,160 --> 00:14:46,730
---How is it that these mass surveillance
programs have been kept secret for years
155
00:14:46,730 --> 00:14:52,220
when they are supposed to be so meaningful
and effective? Why didn't anyone publicly
156
00:14:52,220 --> 00:14:57,710
justify it? Then why was it all secretly
justified by secret courts with secret
157
00:14:57,710 --> 00:15:02,410
court rulings? Why, after the Snowden
publications began, did the Commission of
158
00:15:02,410 --> 00:15:06,530
intelligence experts, which Obama specifically
appointed, come to the conclusion that
159
00:15:06,530 --> 00:15:14,550
not a single one -- zero -- of these cases of
terror or attempted terrorist attacks has
160
00:15:14,550 --> 00:15:20,010
been even partially resolved by this giant
telecommunications metadata collection? In trying to
161
00:15:20,010 --> 00:15:26,029
stop something from happening before it
happens, they can put in a measure and
162
00:15:26,029 --> 00:15:30,630
that thing might not happen. But they
don't know if that measure stopped that
163
00:15:30,630 --> 00:15:34,959
thing from happening, because that thing
never happened. It's hard to measure. You
164
00:15:34,959 --> 00:15:38,690
can't measure it. And you can't say with
certainty that because of this measure
165
00:15:38,690 --> 00:15:47,240
that that didn't happen. But after 9/11,
after the catastrophic level of attack, it
166
00:15:47,240 --> 00:15:54,620
put decision makers into this impossible
position where citizens were scared. They
167
00:15:54,620 --> 00:16:01,209
needed to do something. One part of that
is trying to screen everybody objectively
168
00:16:01,209 --> 00:16:07,220
and have that sort of panoptic
surveillance. Saying that: "No, no. We can
169
00:16:07,220 --> 00:16:13,709
see everything. Don't worry. We have the
haystack. We just need to find the needle."
170
00:16:13,709 --> 00:16:17,670
But then obviously, they need ways to
target that. You can see it most clearly
171
00:16:17,670 --> 00:16:22,930
over here. You got leaflets through your
door a few years ago, basically saying
172
00:16:22,930 --> 00:16:29,069
that if you've seen anything suspicious,
call this hotline. It listed things like
173
00:16:29,069 --> 00:16:38,220
the neighbor who goes away on holiday many
times a year, or another neighbor whose
174
00:16:38,220 --> 00:16:44,459
curtains are always drawn. It just changes
the way you look at society and you look
175
00:16:44,459 --> 00:16:51,430
at yourself. And it shifts the presumption
of innocence to a presumption of guilt
176
00:16:51,430 --> 00:16:55,769
already.
---When is someone a potential suicide
177
00:16:55,769 --> 00:17:07,910
bomber? This is where the problem begins.
When they wear an explosive belt and hold
178
00:17:07,910 --> 00:17:17,309
the trigger in their hands? Or when they
order the building blocks for an explosive
179
00:17:17,309 --> 00:17:29,360
belt online? Or when they inform
themselves about how to build an explosive
180
00:17:29,360 --> 00:17:38,250
vest? When can the state legally
intervene? For me it is about the central
181
00:17:38,250 --> 00:17:45,600
very problematic question whether
182
00:17:45,600 --> 00:17:55,150
someone who has been identified as a
potential danger or a potential terrorist,
183
00:17:55,150 --> 00:18:04,250
without being a terrorist, if someone like
that can then be legally surveilled or
184
00:18:04,250 --> 00:18:14,520
even arrested? That means whether certain
people, by potentially posing a concrete
185
00:18:14,520 --> 00:18:20,429
danger to society, can be stripped of
their fundamental human rights?
186
00:18:20,429 --> 00:18:22,990
---We face an unprecedented threat which
will last
187
00:18:22,990 --> 00:18:25,740
Two days after the attacks in Brussels on
22 March 2016, Jean-Claude Juncker and the
188
00:18:25,740 --> 00:18:33,299
French prime minister held a joint press
conference. But we also believe that we
189
00:18:33,299 --> 00:18:37,070
need to be a union of security. In it
Juncker called on the ministers to accept
190
00:18:37,070 --> 00:18:41,690
a proposal by the Commission for the
protection of the EU.
191
00:18:41,690 --> 00:18:48,010
---For over 15 years now we have observed
a big populist push to adopt even more
192
00:18:48,010 --> 00:18:52,200
surveillance measures. With the attacks of
the past years, there was the opportunity
193
00:18:52,200 --> 00:18:57,940
to pass even more. We have this proposal
for a new directive whose contents are
194
00:18:57,940 --> 00:19:03,030
purely based on ideology.
The text of the law, passed in summary
195
00:19:03,030 --> 00:19:07,140
proceedings, was adopted as an anti-
terrorism directive. Green Member of
196
00:19:07,140 --> 00:19:15,210
Parliament Jan Philipp Albrecht wrote in
a statement to Netzpolitik.org: "What the
197
00:19:15,210 --> 00:19:17,980
directive defines as terrorism could be
used by governments to criminalize
198
00:19:17,980 --> 00:19:23,570
political action or political protest".
---These types of laws are actually neutral
199
00:19:23,570 --> 00:19:26,400
in principle. In practice, they are very
discriminatory. If you talk to any
200
00:19:26,400 --> 00:19:30,950
politician right now at the EU level or at
the national or local level they will tell
201
00:19:30,950 --> 00:19:33,940
you that most likely these people are
Muslims.
202
00:19:33,940 --> 00:19:37,860
---Historically and philosophically this
problem is well known to us. We always
203
00:19:37,860 --> 00:19:46,110
tend to put everything which is unpleasant
or eerie to us, to the horizon. "This is
204
00:19:46,110 --> 00:19:59,169
strange to us. It is done by others, not
by us." And when one of us does it, then
205
00:19:59,169 --> 00:20:05,102
they have to be othered.
---And this is Edward Said's point of view
206
00:20:05,102 --> 00:20:12,690
that the Western self comes to define
itself in relation to this Eastern other.
207
00:20:12,690 --> 00:20:18,230
So everything that the West was, the East
wasn't, and everything that the East was,
208
00:20:18,230 --> 00:20:22,030
the West wasn't. And so the East became
this province of emotionality,
209
00:20:22,030 --> 00:20:28,080
irrationality. And the West became the
source of reason, everything controlled
210
00:20:28,080 --> 00:20:33,210
and contained and so forth. And it is this
dichotomy that continues to play itself
211
00:20:33,210 --> 00:20:36,640
out.
---Terrorism emerged as a term for the
212
00:20:36,640 --> 00:20:43,620
first time in the context of the French
Revolution. The Jacobins who under
213
00:20:43,620 --> 00:20:49,080
Robespierre exercised the reign of terror,
were the first terrorists; that's
214
00:20:49,080 --> 00:20:54,620
what they were called. The first terrorism was
the terrorism of the state and of course
215
00:20:54,620 --> 00:20:58,970
this also included the systematic
monitoring of counter-revolutionaries.
216
00:20:58,970 --> 00:21:07,370
---While the proposal of the directive
says that it complies with human rights,
217
00:21:07,370 --> 00:21:10,320
it actually does not, because they want to
increase surveillance measures in order
218
00:21:10,320 --> 00:21:14,830
for the population to feel safer. However,
we've seen that more repressive measures do
219
00:21:14,830 --> 00:21:18,440
not necessarily mean that you would have
more security.
220
00:21:18,440 --> 00:21:25,179
---The way you sell it to people is to
appease their sense of anxiety around
221
00:21:25,179 --> 00:21:32,020
"Oh, this is an insecure world. Anything
could happen at any time. And so if
222
00:21:32,020 --> 00:21:36,030
anything could happen at any time, what
can we do about it?"
223
00:21:36,030 --> 00:21:40,610
---You get the feeling that the text is
trying to make sure that law enforcement
224
00:21:40,610 --> 00:21:44,270
will be able to get access to
communications by any means that they
225
00:21:44,270 --> 00:21:46,710
wish.
---To be able to stop something from
226
00:21:46,710 --> 00:21:50,260
happening before it happens, you have to
know everything. You have to look at the
227
00:21:50,260 --> 00:21:55,159
past, look at what happened, but also
predict the future by looking at the past
228
00:21:55,159 --> 00:22:00,670
and then getting as much information as
you can on everything all the time. So
229
00:22:00,670 --> 00:22:03,760
it's about zero risk.
Kurz: All developed democracies have a
230
00:22:03,760 --> 00:22:08,870
concept like proportionality, that's
what they call it in Germany: that
231
00:22:08,870 --> 00:22:12,020
surveillance measures are weighed up
against the respect for fundamental rights.
232
00:22:12,020 --> 00:22:16,071
This undoubtedly includes privacy. Privacy
is valued very highly in Germany and
233
00:22:16,071 --> 00:22:23,370
directly derived from human dignity. And
human dignity is only negotiable to a very
234
00:22:23,370 --> 00:22:28,919
tiny degree.
---When we are afraid to speak either
235
00:22:28,919 --> 00:22:32,780
because of our government coming after us,
or because of a partner or a boss or
236
00:22:32,780 --> 00:22:37,400
whomever. All sorts of surveillance cause
self-censorship. But I think that mass
237
00:22:37,400 --> 00:22:41,059
surveillance, the idea that everything we
are doing is being collected can cause a
238
00:22:41,059 --> 00:22:43,730
lot of people to think twice before they
open their mouths.
239
00:22:43,730 --> 00:22:51,630
---When all your likes can be traced back
to you, of course it affects your
240
00:22:51,630 --> 00:22:56,320
behaviour. Of course it's usually the case
that sometimes you think: If you like this
241
00:22:56,320 --> 00:23:01,600
thing or if you don't, it would have some
social repercussions.
242
00:23:01,600 --> 00:23:07,610
---But if you look throughout history, the
Reformation, the gay rights movements all
243
00:23:07,610 --> 00:23:13,480
these movements were illegal in some way,
if not strictly by law, then by culture.
244
00:23:13,480 --> 00:23:18,710
And if we'd had this kind of mass surveillance,
would we have had these movements?
245
00:23:18,710 --> 00:23:23,279
---If all laws were absolutes, then we
would never have progressed to the point
246
00:23:23,279 --> 00:23:27,820
where women had equal rights because women
had to break the laws that said: "You
247
00:23:27,820 --> 00:23:32,750
can't have equal rights." Black people in
America had to break the laws that said
248
00:23:32,750 --> 00:23:40,690
they could not have equal rights. And
there's a common thread here. You know, a lot
249
00:23:40,690 --> 00:23:47,950
of our laws historically have had the
harshest effect on the most vulnerable in
250
00:23:47,950 --> 00:23:51,280
society.
Kurz: The notion that whoever has
251
00:23:51,280 --> 00:23:56,510
something to hide has only themselves to
blame only emerged in recent years.
252
00:23:56,510 --> 00:24:03,600
In particular, the former CEO of Google
Eric Schmidt is known for that; of course, he
253
00:24:03,600 --> 00:24:05,480
certainly said that.
"If you have something that you don't want
254
00:24:05,480 --> 00:24:09,930
anyone to know, maybe you shouldn't be
doing it in the first place." But this
255
00:24:09,930 --> 00:24:16,420
is so hostile to humans that it is almost
funny. You could think it is satire. A lot
256
00:24:16,420 --> 00:24:19,320
of people can't help that they have
something to hide in a society, that is
257
00:24:19,320 --> 00:24:24,190
unjust. *Eric Schmidt again* "But if you
really need that kind of privacy, the
258
00:24:24,190 --> 00:24:36,470
reality is that search engines, including
Google, do retain this information for some
259
00:24:36,470 --> 00:24:46,460
time."
---Big corporations that have this
260
00:24:46,460 --> 00:24:51,220
business model of people farming are
interested in you because you are the raw
261
00:24:51,220 --> 00:24:55,000
material. Right? Your information is raw
material. What they do is they process
262
00:24:55,000 --> 00:25:00,450
that to build a profile of you. And that's
where the real value is. Because if I know
263
00:25:00,450 --> 00:25:06,030
enough about you, if I have so much information
about you that I can build a very
264
00:25:06,030 --> 00:25:14,020
lifelike, constantly evolving picture of
you, a simulation of you. That's very
265
00:25:14,020 --> 00:25:16,980
vulnerable.
---The economy of the net is predicting
266
00:25:16,980 --> 00:25:23,159
human behaviour, so that eyeballs can be
delivered to advertising and that's
267
00:25:23,159 --> 00:25:26,610
targeted advertising.
---The system in ways is set up for them
268
00:25:26,610 --> 00:25:32,169
to make money and sell our little bits of
data, our interests, our demographics to
269
00:25:32,169 --> 00:25:36,559
other people and for advertisers to be
able to sell things. These companies know
270
00:25:36,559 --> 00:25:40,220
more about us than we know about
ourselves. Right now we're feeding the
271
00:25:40,220 --> 00:25:43,490
beast. And right now, there's very little
oversight.
272
00:25:43,490 --> 00:25:49,190
---It has to reach one person with the same ad
at a particular time. If at 3:00 p.m. you
273
00:25:49,190 --> 00:25:55,169
buy the soda you get with your lunch, how about
at 2:55 p.m. you'll get an ad about a discount
274
00:25:55,169 --> 00:26:01,640
at a pizza place next door or a salad
place, where they have exactly that soda.
275
00:26:01,640 --> 00:26:04,370
So that's what targeted advertising is.
---It is true, it is convenient. You know,
276
00:26:04,370 --> 00:26:08,610
I always laugh every time I'm on a site.
I'm looking at, let's say, a sweater I
277
00:26:08,610 --> 00:26:12,080
want to buy. And then I move over to
another site and an advertisement for that
278
00:26:12,080 --> 00:26:16,350
same sweater. It pops up and reminds me
how much I want it. It's both convenient
279
00:26:16,350 --> 00:26:19,950
and annoying.
---It's a pity that some of the greatest
280
00:26:19,950 --> 00:26:25,440
minds in our century are only wondering
how to make you look at advertising. And
281
00:26:25,440 --> 00:26:30,590
that's where the surveillance economy
begins, I would say, and not just ends.
282
00:26:30,590 --> 00:26:36,370
---To a lot of people that may seem much
less harmful. But the fact that they're
283
00:26:36,370 --> 00:26:41,130
capturing this data means that data exists
and we don't know who they might share it with.
284
00:26:41,130 --> 00:26:45,409
---There is a whole new business now, you
know, data brokers who draw upon, you
285
00:26:45,409 --> 00:26:50,180
know, thousands of data points and create
client profiles to sell to companies. Now,
286
00:26:50,180 --> 00:26:53,929
you don't really know what happens with
these kinds of things. So it is hard to
287
00:26:53,929 --> 00:26:59,870
tell what the implications are until it
is too late. Until it happens.
288
00:26:59,870 --> 00:27:03,850
---The Stasi, compared to Google or
Facebook, were amateurs. The Stasi
289
00:27:03,850 --> 00:27:10,570
actually had to use people to surveil you,
to spy on you. That was expensive. It was
290
00:27:10,570 --> 00:27:16,250
time consuming. They had to pick targets.
It was very expensive for them to have all
291
00:27:16,250 --> 00:27:21,320
of these people spying on you. Facebook
and Google don't have to do that. They use
292
00:27:21,320 --> 00:27:26,169
algorithms; that's the mass in mass
surveillance. The fact that it is so
293
00:27:26,169 --> 00:27:32,200
cheap, so convenient to spy on so many
people. And it's not a conspiracy theory.
294
00:27:32,200 --> 00:27:36,179
You don't need conspiracies when you have
the simplicity of business models.
295
00:27:36,179 --> 00:27:42,570
---When we talk about algorithms, we
actually talk about logic. When you want,
296
00:27:42,570 --> 00:27:48,669
for example, to buy a book on Amazon, you
always see a few other suggestions.
297
00:27:48,669 --> 00:27:54,910
These suggestions are produced for you
based on the history of your preferences,
298
00:27:54,910 --> 00:28:00,280
the history of your searches.
---They learn by making mistakes. And the
299
00:28:00,280 --> 00:28:07,080
thing is, that's fine if it's like selling
dog food. But when it's about predictive
300
00:28:07,080 --> 00:28:12,480
policing and about creating a matrix
where you see which individuals are
301
00:28:12,480 --> 00:28:17,130
threatening, that's not OK for me. You
know, there have to be limits. There have to
302
00:28:17,130 --> 00:28:22,360
be lines. And these are all the dynamics
that are coming from the bottom up. These
303
00:28:22,360 --> 00:28:26,100
are the discussions that need to be had,
but they need to be had with all actors.
304
00:28:26,100 --> 00:28:31,190
It can't just be an echo chamber. You
can't just talk to the same people who agree
305
00:28:31,190 --> 00:28:36,240
with you.
---So one consequence of this would be
306
00:28:36,240 --> 00:28:42,690
many minorities or many people who have
minority views would be silenced. And we
307
00:28:42,690 --> 00:28:47,400
always know that when a minority view is
silenced, it would empower them in a way
308
00:28:47,400 --> 00:28:57,250
and it would radicalize them in the long
run. This is one aspect. The other is that
309
00:28:57,250 --> 00:29:00,640
you would never be challenged by anyone
who disagrees with you.
310
00:29:00,640 --> 00:29:07,409
---We have to understand that our data is
not exhaust. Our data is not oil. Data is
311
00:29:07,409 --> 00:29:13,460
people. You may not be doing anything wrong
today, but maybe three governments from
312
00:29:13,460 --> 00:29:19,151
now when they pass a certain law, what you
have done today might be illegal, for
313
00:29:19,151 --> 00:29:24,530
example, and governments that keep that
data can look back over 10, 20 years and
314
00:29:24,530 --> 00:29:33,289
maybe start prosecuting.
---When everything we buy, everything we
315
00:29:33,289 --> 00:29:43,679
read, even the people we meet and date is
determined by these algorithms, I think the
316
00:29:43,679 --> 00:29:49,581
amount of power that they exert on the
society and individuals in this society is
317
00:29:49,581 --> 00:29:56,510
more than the state's, to some degree.
And so I think representative
318
00:29:56,510 --> 00:30:03,679
democracies have the duty to push the
government to open up these private
319
00:30:03,679 --> 00:30:12,850
entities, to at least expose to some
degree how much control they exert.
320
00:30:12,850 --> 00:30:14,510
---If you adopt the technological
perspective and realize that technology
321
00:30:14,510 --> 00:30:19,059
will slip into our lives much more than is
already the case: Technically into our
322
00:30:19,059 --> 00:30:22,620
bodies, our clothes, into devices that we
sit in and are wearing, in all sorts of
323
00:30:22,620 --> 00:30:32,220
areas of our coexistence and working life,
then that's definitely the wrong way to
324
00:30:32,220 --> 00:30:35,360
go. Because it leads to total
surveillance. And if you think about it
325
00:30:35,360 --> 00:31:30,570
for a few minutes, you will realize that
the dichotomy is between control &
326
00:31:30,570 --> 00:31:42,130
freedom. And a fully controlled society
cannot be free.
327
00:31:42,130 --> 00:31:43,130
post film music
328
00:31:43,130 --> 00:31:52,630
Herald: Hello and welcome back from the
movie, and now I also welcome our
329
00:31:52,630 --> 00:32:09,110
producer. And, oh yeah, it showed very
well what information can do
330
00:32:09,110 --> 00:32:16,190
and what could be done with information. I'll
give people a bit more time to ask
331
00:32:16,190 --> 00:32:27,779
more questions. And in the meantime, I
could ask: this movie
332
00:32:27,779 --> 00:32:37,539
was not shown
today for the first time. So what would
333
00:32:37,539 --> 00:32:46,120
you do differently, or what do you think has
maybe changed in the meantime since you
334
00:32:46,120 --> 00:32:50,840
made it?
pandemonium: What I would change is I
335
00:32:50,840 --> 00:32:59,890
would definitely try much harder to secure
funding to just simply make a better movie
336
00:32:59,890 --> 00:33:05,440
and have more time and edit faster,
because the editing process, since I had
337
00:33:05,440 --> 00:33:12,429
to work on the side, was quite long. And
this film, the way it stands now, was
338
00:33:12,429 --> 00:33:20,320
essentially only funded by a few very
great people who supported me on Patreon
339
00:33:20,320 --> 00:33:26,539
and helped me with some of their private
money, so that it was
340
00:33:26,539 --> 00:33:30,280
essentially an almost no-budget
production. So I would definitely change
341
00:33:30,280 --> 00:33:36,730
that. But the documentary scene in Germany
being what it is, it's very hard to secure
342
00:33:36,730 --> 00:33:41,960
money if you're not attached to a TV
station or if you don't have a name yet.
343
00:33:41,960 --> 00:33:47,100
And since I didn't have a name, but I
still wanted to make the movie, I made the
344
00:33:47,100 --> 00:33:52,730
movie. I am still very happy with the
general direction of it. But of course,
345
00:33:52,730 --> 00:34:00,860
since it was mainly shot in 2015 and
2016, some of the newer developments in
346
00:34:00,860 --> 00:34:10,800
terms of especially biometric mass
surveillance and policing, especially in the
347
00:34:10,800 --> 00:34:18,020
U.S., the way police use body cams, etc.,
are not really reflected. But I still think
348
00:34:18,020 --> 00:34:25,370
that I would still go with the whole
angle on colonialism and racism that is
349
00:34:25,370 --> 00:34:30,889
deeply entrenched in the discussions
around surveillance and privacy. And we
350
00:34:30,889 --> 00:34:36,220
can see that in discussions about shutting
down Telegram in Germany at the moment
351
00:34:36,220 --> 00:34:42,050
because of right-wing groups there. We see
it in discussions about how to deal with
352
00:34:42,050 --> 00:34:48,560
hate speech online on Facebook or the
Metaverse, or whatever it's going to be
353
00:34:48,560 --> 00:34:53,900
called soon. And all of these things are
already kind of in the movie, but I would have
354
00:34:53,900 --> 00:35:00,880
probably focused a bit more on them if I'd
known six years ago what would happen and
355
00:35:00,880 --> 00:35:06,920
if I were making it now. But generally, I
would choose the same direction.
356
00:35:06,920 --> 00:35:18,990
Herald: Yeah, it's quite fascinating that
it was made with nearly no budget, and it
357
00:35:18,990 --> 00:35:28,890
was also an interesting point to entertain
now, because in principle, I understood
358
00:35:28,890 --> 00:35:35,840
the idea that body cams should actually
be a means of protecting people
359
00:35:35,840 --> 00:35:42,190
against police and not the other way
around, as happens sometimes.
360
00:35:42,190 --> 00:35:47,930
pandemonium: So, the problem with
especially body cams, or
361
00:35:47,930 --> 00:35:55,200
also with other means of surveillance,
is that a video is always thought to be an
362
00:35:55,200 --> 00:36:01,930
objective recording of reality. But of
course, it always depends on the angle, in
363
00:36:01,930 --> 00:36:08,010
the case of body cams quite literally the
angle, but also on the interpretation. And
364
00:36:08,010 --> 00:36:14,630
since humans are always full of biases and
always full of presumptions about who
365
00:36:14,630 --> 00:36:21,380
might be in the right and who might be in
the wrong, these images or videos,
366
00:36:21,380 --> 00:36:25,180
even if they were showing the objective
truth, are
367
00:36:25,180 --> 00:36:31,990
barely ever interpreted that way. And
it's exactly the same with any sort of
368
00:36:31,990 --> 00:36:38,890
film or photography that we're doing. I
mean, for this movie, I assembled a ton of
369
00:36:38,890 --> 00:36:42,550
interviews, and they were very long,
several hours long in many cases, and
370
00:36:42,550 --> 00:36:49,160
I could have edited probably 100 different
versions of this movie going in essentially
371
00:36:49,160 --> 00:36:58,569
almost opposite directions from exactly
the same material. It shows very strongly how
372
00:36:58,569 --> 00:37:05,869
important it is that at the end of the
day we overcome our biases as judges, as
373
00:37:05,869 --> 00:37:12,220
police people, as people walking on the
street. And overcoming any sort of
374
00:37:12,220 --> 00:37:22,480
surveillance is impossible on the technical
level alone; it is always connected with
375
00:37:22,480 --> 00:37:32,160
the way we understand the world.
Herald: Yeah. I also remember a
376
00:37:32,160 --> 00:37:45,110
talk several years ago, it was about one of
the "Freiheit statt Angst" demonstrations
377
00:37:45,110 --> 00:37:54,160
in Berlin. And there was also a case
where the term was established: the guy
378
00:37:54,160 --> 00:38:02,849
with a t-shirt got beaten by police and it
was very hard to assemble different videos
379
00:38:02,849 --> 00:38:11,410
and to tell the whole story of what had
happened. You should be able to find it;
380
00:38:11,410 --> 00:38:18,290
there was a talk where this was
reconstructed somehow.
381
00:38:18,290 --> 00:38:23,079
pandemonium: I will definitely look that
up.
382
00:38:23,079 --> 00:38:31,250
Herald: But I'm not sure about the year
anymore, but you will find it. Now we have
383
00:38:31,250 --> 00:38:38,040
a real question from the audience. The
first is, can I find the movie anywhere
384
00:38:38,040 --> 00:38:42,790
and show it to somebody else?
pandemonium: Yes. It is on YouTube. laughs
385
00:38:42,790 --> 00:38:46,099
Herald: OK.
pandemonium: You can literally find it by
386
00:38:46,099 --> 00:38:54,540
typing my name, which is Theresia
Reinhold. And you can find it.
387
00:38:54,540 --> 00:39:07,590
Herald: OK. So, it's very good. So, I hope
they are happy. Is the 21st century
388
00:39:07,590 --> 00:39:12,980
attention span for a non-technical friend
with the biggest claim? I don't get the
389
00:39:12,980 --> 00:39:35,650
question. The idea is: Is there a way of
explaining to non-technical people what
390
00:39:35,650 --> 00:39:40,690
is the problem with "I have nothing to
hide"?
391
00:39:40,690 --> 00:39:48,070
pandemonium: If there is anything in your
life where you are happy no one is
392
00:39:48,070 --> 00:39:52,760
watching you doing, whether it is a
certain video that a lot of people don't
393
00:39:52,760 --> 00:40:01,050
know that you are watching, or singing in
the shower, or anything, then it is your
394
00:40:01,050 --> 00:40:13,109
absolute right that no one is watching and no
one is judging you for it. And that's the same
395
00:40:13,109 --> 00:40:19,290
with mass surveillance and surfing online
or walking down the street. We have a very
396
00:40:19,290 --> 00:40:26,630
basic comfort zone that should be protected.
You know, we have a human right to privacy, and
397
00:40:26,630 --> 00:40:31,130
whether it's in a digital space or in an
analog space, like being in a shopping mall
398
00:40:31,130 --> 00:40:34,660
and picking up, I don't know, whatever you
don't want other people to know
399
00:40:34,660 --> 00:40:40,390
you're buying. You should have the right
to do that in private and not have it be
400
00:40:40,390 --> 00:40:45,500
known to other people. And when we are
surfing the internet, everything we do is
401
00:40:45,500 --> 00:40:54,280
constantly analyzed and watched in real
time, and, you know, our movements online
402
00:40:54,280 --> 00:41:00,119
are sold to the highest bidder, and there's a
whole, massive advertising industry behind
403
00:41:00,119 --> 00:41:08,609
it. And that's just immoral, because humans
should always have the ability to share
404
00:41:08,609 --> 00:41:15,532
only what they want to share. That's how I
try to explain it to non-tech people. And
405
00:41:15,532 --> 00:41:17,550
if they're non-tech people from the
former East, I just refer to the Stasi
406
00:41:17,550 --> 00:41:20,620
and they know exactly what you're talking
about.
407
00:41:20,620 --> 00:41:27,109
Herald: Yes, thanks for this explanation
again. And I think also what is important,
408
00:41:27,109 --> 00:41:34,220
what was also mentioned in the movie is
the thing with data, so to speak: since it
409
00:41:34,220 --> 00:41:41,420
can be stored now, your history can
haunt your future. Kind of.
410
00:41:41,420 --> 00:41:47,510
pandemonium: Yeah.
Herald: And actually, the question was
411
00:41:47,510 --> 00:41:58,400
now made more precise. And actually, it was
not what I asked you. laughs Actually, it is
412
00:41:58,400 --> 00:42:06,880
the question of whether there is or there
could be a short teaser that people could
413
00:42:06,880 --> 00:42:12,260
send to their friends, to get them
to watch the whole movie?
414
00:42:12,260 --> 00:42:15,630
laughs
pandemonium: Oh yes, there is. And that is
415
00:42:15,630 --> 00:42:20,119
also on YouTube and on Vimeo. OK, sorry.
Yes. laughs
416
00:42:20,119 --> 00:42:27,320
Herald: Well, I also didn't get it from
the question. So OK, so people will find
417
00:42:27,320 --> 00:42:34,410
it. Very good. So then I guess we are
through with the questions, and I thank
418
00:42:34,410 --> 00:42:46,760
you again for your nice movie and for
being here. Yeah. And then this talk is
419
00:42:46,760 --> 00:42:57,010
over here. Chaos zone TV comes back to
you at 4 p.m. with the talk "Tales from
420
00:42:57,010 --> 00:43:05,170
the quantum industry". Until then, oh yeah,
watch some other streams, go to the world,
421
00:43:05,170 --> 00:43:08,329
or have some lunch. See you.
422
00:43:08,329 --> 00:43:09,329
post roll music
423
00:43:09,329 --> 00:43:10,829
Subtitles created by many, many volunteers and
the c3subtitles.de team. Join us, and help us!