1
00:00:00,440 --> 00:00:10,299
music
2
00:00:10,299 --> 00:00:12,679
Let's start. Welcome!
3
00:00:12,679 --> 00:00:16,880
More than two years ago, Edward Snowden's
files became public.
4
00:00:16,880 --> 00:00:18,410
They went public
5
00:00:18,410 --> 00:00:20,910
and the media went crazy.
6
00:00:20,910 --> 00:00:23,160
And the public maybe not so much,
7
00:00:23,160 --> 00:00:26,250
as you may have noticed amongst your friends
and family,
8
00:00:26,250 --> 00:00:28,000
as I did as well.
9
00:00:28,000 --> 00:00:34,530
A lot remains the same after Snowden's revelations,
10
00:00:34,530 --> 00:00:38,039
even if people are concerned about surveillance.
11
00:00:38,039 --> 00:00:44,780
The following talk by Arne Hintz and Lina
Dencik from University of Cardiff explores
12
00:00:44,780 --> 00:00:46,019
just that.
13
00:00:46,019 --> 00:00:54,739
They analyzed how the media actually reacted
to the revelations made by Edward Snowden
14
00:00:54,739 --> 00:00:56,550
and they also looked at how the public,
15
00:00:56,550 --> 00:01:03,100
such as journalists, activists and other people,
reacted to Edward Snowden's disclosures.
16
00:01:03,100 --> 00:01:09,520
So please give a warm round of applause to
Arne Hintz and Lina Dencik. Thank you!
17
00:01:09,520 --> 00:01:17,790
applause
18
00:01:17,790 --> 00:01:21,910
Arne: Thank you very much, there are still
a few free seats over there.
19
00:01:21,910 --> 00:01:25,290
Hello everybody, my name is Arne Hintz, this
is Lina Dencik.
20
00:01:25,290 --> 00:01:31,620
We are both from Cardiff University, from
the School of Journalism, Media and Cultural Studies,
21
00:01:31,620 --> 00:01:34,380
so not from the tech department.
22
00:01:34,380 --> 00:01:38,960
We want to talk about some of the results
of a research project
23
00:01:38,960 --> 00:01:42,400
that we've been working on this year and for the past...
24
00:01:42,400 --> 00:01:45,470
for a bit more than a year
25
00:01:45,470 --> 00:01:50,300
and it's called "Digital Citizenship and Surveillance
Society: UK State-Media-Citizen Relations
26
00:01:50,300 --> 00:01:51,800
after the Snowden Leaks",
27
00:01:51,800 --> 00:01:56,980
and it's about the implications of the Snowden
leaks in four areas:
28
00:01:56,980 --> 00:02:01,440
News media, civil society, policy and technology
29
00:02:01,440 --> 00:02:05,430
and here what we want to do is present just
a few findings from that project
30
00:02:05,430 --> 00:02:11,230
and focus on two areas, the news media part
and the civil society part.
31
00:02:11,230 --> 00:02:16,780
It's all focused on the UK, the country where
Cardiff University is located
32
00:02:16,780 --> 00:02:22,510
so there won't be a lot of international comparisons,
not a lot about Germany and so on,
33
00:02:22,510 --> 00:02:29,470
but I think at the end we can maybe
draw some comparisons ourselves here in this room.
34
00:02:32,730 --> 00:02:38,890
So this has been the project basically, the
title as you see it over there.
35
00:02:38,890 --> 00:02:43,190
The news media part has basically asked how
the British media represented the Snowden
36
00:02:43,190 --> 00:02:45,190
leaks and digital surveillance.
37
00:02:45,190 --> 00:02:51,400
The civil society part is about questions such as:
What is the nature of public knowledge with
38
00:02:51,400 --> 00:02:52,740
regards to digital surveillance?
39
00:02:52,740 --> 00:02:56,130
Are everyday communication practices changing?
40
00:02:56,130 --> 00:03:01,040
And how are activists affected by the revelations
of mass surveillance?
41
00:03:01,040 --> 00:03:04,560
The policy part is still ongoing, it's still
being developed
42
00:03:04,560 --> 00:03:08,700
and it's about the current policy and regulatory
framework of digital surveillance
43
00:03:08,700 --> 00:03:12,890
and reform proposals and current reforms that
are taking place.
44
00:03:12,890 --> 00:03:17,599
And the technology part is about the technological
infrastructure of surveillance
45
00:03:17,599 --> 00:03:22,150
and technological possibilities of counter-surveillance
and resistance.
46
00:03:22,150 --> 00:03:27,970
And then we want to bring all this together
and ask: How does that redefine what we may
47
00:03:27,970 --> 00:03:30,610
understand as digital citizenship?
48
00:03:30,610 --> 00:03:34,080
The research team includes a number of people
from Cardiff University
49
00:03:34,080 --> 00:03:40,260
including us, other lecturers, professors
and staff members,
50
00:03:40,260 --> 00:03:44,750
and a few research assistants and research
associates that we employed for this,
51
00:03:44,750 --> 00:03:53,760
plus a couple of guys from Oxford and one
from Briar, a tech development project.
52
00:03:53,760 --> 00:03:59,099
We also have an advisory board with some colleagues
from academia
53
00:03:59,099 --> 00:04:03,970
but also representatives of digital rights
organisations, such as Open Rights Group,
54
00:04:03,970 --> 00:04:05,690
Privacy International and others.
55
00:04:05,690 --> 00:04:11,770
We have a project website, where you can learn
more about the project, about the background
56
00:04:11,770 --> 00:04:13,920
and also some preliminary findings.
57
00:04:13,920 --> 00:04:20,220
We also had a conference earlier this year,
in June, maybe some of you were there.
58
00:04:20,220 --> 00:04:25,150
It was in Cardiff, with some interesting speakers
at the conference,
59
00:04:25,150 --> 00:04:29,810
and also combined the academic and the practical
part a little bit.
60
00:04:29,810 --> 00:04:34,960
So. A few glimpses of the results in these
two areas that I mentioned.
61
00:04:34,960 --> 00:04:42,080
So for the media research part we were interested
in studying how the British news media have
62
00:04:42,080 --> 00:04:46,639
represented the Snowden leaks and also digital
surveillance more broadly.
63
00:04:46,639 --> 00:04:54,630
And so we asked: How are debates over surveillance
constructed? What are the angles and opinions?
64
00:04:54,630 --> 00:04:57,040
What are the usual sources? And so on.
65
00:04:57,040 --> 00:05:02,460
We need to start on an anecdotal basis.
66
00:05:02,460 --> 00:05:07,840
Some examples of media coverage that emerged
very quickly after the Snowden revelations,
67
00:05:07,840 --> 00:05:12,630
again in the UK press, which show different
types of coverage.
68
00:05:12,630 --> 00:05:17,820
So we probably all know that the Guardian
was very instrumental in the revelations
69
00:05:17,820 --> 00:05:25,030
and provided a lot of information, really
took this role of the fourth estate and of
70
00:05:25,030 --> 00:05:27,169
investigative journalism quite seriously.
71
00:05:27,169 --> 00:05:34,000
On the other hand, other newspapers like this
one were very critical of the Snowden revelations
72
00:05:34,000 --> 00:05:38,729
and also of the Guardian for informing
people about them and running with these revelations.
73
00:05:40,169 --> 00:05:44,639
And then there were others like this one,
that was a famous example.
74
00:05:44,639 --> 00:05:52,300
The former editor of the Independent, actually
another liberal, middle-ground, not really
75
00:05:52,300 --> 00:05:56,350
left but at least not ultra-conservative newspaper.
76
00:05:56,350 --> 00:06:00,430
He says: "Edward Snowden's secrets may be
dangerous, I would not have published them".
77
00:06:00,430 --> 00:06:06,180
Okay, one can debate that, but then he says "if
MI5 warns that this is not in the public interest,
78
00:06:06,180 --> 00:06:08,650
who am I to disbelieve them?".
79
00:06:08,650 --> 00:06:10,600
laughing
80
00:06:10,600 --> 00:06:12,550
That's an interesting understanding of journalism
81
00:06:12,550 --> 00:06:16,810
and it was later retracted, it was debated
quite a lot.
82
00:06:16,810 --> 00:06:28,150
But we also see that this caution towards
publishing something like this has been quite
83
00:06:28,150 --> 00:06:28,949
widespread.
84
00:06:28,949 --> 00:06:31,270
So what did we do?
85
00:06:31,270 --> 00:06:38,310
Here's a timeline of Snowden- and surveillance-related
coverage in the press, in this case
86
00:06:38,310 --> 00:06:39,540
in the UK.
87
00:06:39,540 --> 00:06:44,240
And we looked at five case studies, five moments
of coverage.
88
00:06:44,240 --> 00:06:47,540
The first was the initial revelations of
Snowden.
89
00:06:47,540 --> 00:06:53,139
The second was the interception of communications
in foreign embassies and European Union offices
90
00:06:53,139 --> 00:06:58,430
and spying on world leaders' phone communications,
such as Angela Merkel's for example.
91
00:06:58,430 --> 00:07:02,620
The third was the detention of Glenn Greenwald's
partner David Miranda at Heathrow Airport
92
00:07:02,620 --> 00:07:04,600
under anti-terror legislation.
93
00:07:04,600 --> 00:07:11,030
Which raised debates around freedom of the
press and national security.
94
00:07:11,030 --> 00:07:15,310
Then we looked at the parliamentary report
into the death of Lee Rigby.
95
00:07:15,310 --> 00:07:20,810
Which was a case that was described as a terrorist
attack on a British soldier on the streets
96
00:07:20,810 --> 00:07:22,500
of London.
97
00:07:22,500 --> 00:07:28,150
And it led to debates around social media
companies' role in tackling terrorism.
98
00:07:28,150 --> 00:07:30,370
And then finally the Charlie Hebdo attacks
in Paris,
99
00:07:30,370 --> 00:07:35,270
which prompted debates around digital encryption,
freedom of speech and the resurrection of
100
00:07:35,270 --> 00:07:40,180
the so-called Snooper's Charter in the UK,
101
00:07:40,180 --> 00:07:45,080
the legislation around surveillance.
102
00:07:45,080 --> 00:07:49,620
So a few results:
103
00:07:49,620 --> 00:07:54,539
Snowden was clearly prominent in the media
coverage, and generally was covered using
104
00:07:54,539 --> 00:07:56,930
mostly neutral or even positive language,
105
00:07:56,930 --> 00:08:00,729
described as a whistleblower as we see
here at the bottom.
106
00:08:00,729 --> 00:08:04,919
But if we look at the focus on issues around
surveillance taken in the stories
107
00:08:04,919 --> 00:08:13,360
and so at the context of the coverage of surveillance,
the most important one here, as we can see there
108
00:08:13,360 --> 00:08:18,020
(it's probably a little bit small to read),
109
00:08:18,020 --> 00:08:22,479
has to do with themes of terrorism,
110
00:08:22,479 --> 00:08:27,259
with themes of the role of security agencies
and government response.
111
00:08:27,259 --> 00:08:30,548
So that's been very much the context, in
112
00:08:30,548 --> 00:08:33,708
most media coverage, of discussing the Snowden revelations
113
00:08:33,708 --> 00:08:35,208
and surveillance more broadly.
114
00:08:35,208 --> 00:08:40,580
And that is in stark contrast to discussing
surveillance in terms of human rights, personal
115
00:08:40,580 --> 00:08:43,049
privacy and freedom of the press.
116
00:08:43,049 --> 00:08:49,920
In other words: rights-based and citizen-based perspectives on surveillance.
117
00:08:49,920 --> 00:08:55,040
If we look at who was used as the sources
in these stories, we see a pattern that is
118
00:08:55,040 --> 00:08:58,800
actually quite typical in media sourcing generally.
119
00:08:58,800 --> 00:09:02,520
Politicians are by far the most prominent
source.
120
00:09:02,520 --> 00:09:05,810
And that is not unusual at all.
121
00:09:05,810 --> 00:09:12,000
But in this case it means that elite concerns
around surveillance are most prominent, not
122
00:09:12,000 --> 00:09:13,540
citizen concerns.
123
00:09:13,540 --> 00:09:19,290
Political sources are framing the debate and
how it is interpreted.
124
00:09:19,290 --> 00:09:25,649
And so unsurprisingly the opinions raised
by these sources are, for example, as we see
125
00:09:25,649 --> 00:09:28,990
there, that surveillance should be increased
126
00:09:28,990 --> 00:09:33,950
or at least is necessary, at least has to
be maintained.
127
00:09:33,950 --> 00:09:38,290
That the Snowden leaks have compromised the
work of intelligence services
128
00:09:38,290 --> 00:09:42,870
and that social media companies should do
more to fight terror and to increase their
129
00:09:42,870 --> 00:09:44,470
own surveillance.
130
00:09:44,470 --> 00:09:48,839
And so this dominant framework understands
surveillance as a valuable activity,
131
00:09:48,839 --> 00:09:55,380
and one for which both intelligence services
and business actors have a responsibility.
132
00:09:55,380 --> 00:09:59,830
Rather than it being primarily problematic
for citizens.
133
00:09:59,830 --> 00:10:05,290
And where it is presented as problematic,
in the snooping on world leaders case study,
134
00:10:05,290 --> 00:10:10,209
surveillance was seen as damaging to international
relations and therefore problematic.
135
00:10:10,209 --> 00:10:15,399
And that's something that is primarily of
relevance to big players rather than ordinary
136
00:10:15,399 --> 00:10:16,170
citizens.
137
00:10:16,170 --> 00:10:20,709
So from these short glimpses, what we can
see, just a few preliminary conclusions,
138
00:10:20,709 --> 00:10:27,089
is that yes, there was extensive and often
positive reporting on Snowden himself, in
139
00:10:27,089 --> 00:10:28,360
some media at least.
140
00:10:28,360 --> 00:10:32,970
But debates around surveillance are framed
by elites, rather than citizens
141
00:10:32,970 --> 00:10:38,610
and this elite-centered structure of news
coverage means that the consequences and the
142
00:10:38,610 --> 00:10:42,600
extent particularly of mass surveillance of
citizens
143
00:10:42,600 --> 00:10:44,610
are largely invisible in media coverage.
144
00:10:44,610 --> 00:10:48,450
There's a strong framing on national security
and so on,
145
00:10:48,450 --> 00:10:53,640
but there is quite insufficient information
on the practices and implications of surveillance
146
00:10:53,640 --> 00:10:55,980
for normal citizens.
147
00:10:55,980 --> 00:11:01,399
And so the issues of mass surveillance that
were actually so central in Snowden's revelations,
148
00:11:01,399 --> 00:11:04,149
remain relatively invisible in these debates,
149
00:11:04,149 --> 00:11:09,050
apart from perhaps the Guardian coverage.
150
00:11:09,050 --> 00:11:16,260
And so we could say that media justify and
normalize current surveillance practices,
151
00:11:16,260 --> 00:11:23,220
and that discussions about individual rights
and human security are structurally discouraged.
152
00:11:23,220 --> 00:11:24,170
That is the media part.
153
00:11:25,670 --> 00:11:29,620
Lina: So I'll just go briefly through some
of our key findings for what we call the civil
154
00:11:29,620 --> 00:11:31,450
society work stream on this.
155
00:11:31,450 --> 00:11:36,910
Which looks at two aspects, so there is the
public knowledge and attitudes on the Snowden
156
00:11:36,910 --> 00:11:38,450
leaks and digital surveillance.
157
00:11:38,450 --> 00:11:42,350
And then there's the second part which is
particularly to do with responses amongst
158
00:11:42,350 --> 00:11:43,899
political activists.
159
00:11:43,899 --> 00:11:48,720
And for the first part, the public opinion
research, we did a number of focus groups across
160
00:11:48,720 --> 00:11:49,899
different demographics in the UK,
161
00:11:49,899 --> 00:11:53,339
in order to get us a diverse range of
opinions and views.
162
00:11:53,339 --> 00:11:59,180
So that ranges from sort of high-income people
working in the financial centre to local young
163
00:11:59,180 --> 00:12:03,120
Muslim groups within Cardiff itself.
164
00:12:03,120 --> 00:12:05,959
So a different range and different groups
of people.
165
00:12:05,959 --> 00:12:11,589
And then for the research on the activist
responses we did a number of interviews with
166
00:12:11,589 --> 00:12:13,550
different groups and organisations,
167
00:12:13,550 --> 00:12:16,420
from large NGOs to smaller community groups.
168
00:12:16,420 --> 00:12:21,100
Ranging from environmental groups, labour
activists, anti-war activists like "Stop the
169
00:12:21,100 --> 00:12:21,450
War",
170
00:12:21,450 --> 00:12:24,990
economic justice groups like "Global Justice
Now", and community
171
00:12:24,990 --> 00:12:30,420
and civil liberty groups such as also "CAGE",
who spoke earlier today.
172
00:12:30,420 --> 00:12:31,720
And talked with them.
173
00:12:31,720 --> 00:12:36,390
So these were particularly groups that weren't
digital rights activists or tech activists
174
00:12:36,390 --> 00:12:36,870
specifically,
175
00:12:36,870 --> 00:12:41,649
to try and get an understanding of how other
political activists view this issue in particular
176
00:12:41,649 --> 00:12:42,860
in response to the Snowden leaks.
177
00:12:42,860 --> 00:12:48,930
So with the first bit on public opinion in
our focus groups we had a range of themes.
178
00:12:48,930 --> 00:12:51,800
Understanding and experiences of surveillance,
179
00:12:51,800 --> 00:12:54,510
knowledge and opinions on Snowden leaks,
180
00:12:54,510 --> 00:12:56,540
concerns with privacy and personal data,
181
00:12:56,540 --> 00:12:58,920
questions around online behaviour and practices
182
00:12:58,920 --> 00:13:02,470
and attitudes towards intelligence services.
183
00:13:02,470 --> 00:13:06,700
So just a couple of key points from these
focus groups:
184
00:13:06,700 --> 00:13:11,350
First of all there was particularly low knowledge
of who Edward Snowden was,
185
00:13:11,350 --> 00:13:15,940
and even less knowledge of what the content
of the leaks was.
186
00:13:15,940 --> 00:13:21,450
And there was a lot of confusion in discussions
with Julian Assange, Chelsea Manning and Wikileaks
187
00:13:21,450 --> 00:13:21,750
really,
188
00:13:21,750 --> 00:13:24,160
in terms of how people had come about this
story.
189
00:13:24,160 --> 00:13:30,350
And there was a lot of mix-up between those
different stories.
190
00:13:30,350 --> 00:13:36,320
In terms of actual understandings of surveillance,
state surveillance isn't really
191
00:13:36,320 --> 00:13:38,570
isolated in how people speak about it.
192
00:13:38,570 --> 00:13:43,149
It overlaps also with questions of corporate
surveillance and also peer surveillance or
193
00:13:43,149 --> 00:13:44,670
employer surveillance and so forth.
194
00:13:44,670 --> 00:13:49,029
So a lot of concerns are not necessarily about
state surveillance per se and it's difficult
195
00:13:49,029 --> 00:13:52,350
to isolate this as a particular issue.
196
00:13:52,350 --> 00:13:57,139
And also when it comes to what constitutes
surveillance,
197
00:13:57,139 --> 00:14:01,540
the initial responses would be things like
CCTV, and these sorts of things were
198
00:14:01,540 --> 00:14:04,690
seen as more kind of real forms of surveillance.
199
00:14:04,690 --> 00:14:08,860
But on the other hand it was very clear that
people felt that the collection of data
200
00:14:08,860 --> 00:14:12,410
and also including the collection of metadata,
even when it is
201
00:14:12,410 --> 00:14:15,089
not about content, constitutes surveillance.
202
00:14:15,089 --> 00:14:21,070
So that was generally how people felt about
what surveillance actually means.
203
00:14:21,070 --> 00:14:27,480
In terms then of concerns around this, people's
worries about state surveillance in particular
204
00:14:27,480 --> 00:14:30,399
predominantly concern the lack of transparency
around it.
205
00:14:30,399 --> 00:14:36,339
So a lack of transparency around what is being
collected, but also how it's being used and
206
00:14:36,339 --> 00:14:37,190
what it's being used for,
207
00:14:37,190 --> 00:14:42,720
and also what the regulatory framework is
that's in place surrounding it.
208
00:14:42,720 --> 00:14:46,660
And also concerns over the lack of knowledge
or understanding of how to actually opt out,
209
00:14:46,660 --> 00:14:50,970
or resist or circumvent the collection of data.
210
00:14:50,970 --> 00:14:55,209
And in terms of sort of changes in online
behaviour then,
211
00:14:55,209 --> 00:14:58,430
these concerns do manifest themselves in some
changes, but it's mainly in terms of sort
212
00:14:58,430 --> 00:15:00,279
of self-regulating behaviour,
213
00:15:00,279 --> 00:15:03,760
not saying things that are too controversial
online and so forth,
214
00:15:03,760 --> 00:15:09,880
rather than actually changes in using different
tools or different communication platforms,
215
00:15:09,880 --> 00:15:13,860
which wasn't prominent at all in our focus
groups.
216
00:15:13,860 --> 00:15:17,589
And what we also saw as sort of implications
of this is that there was sort of an internalising
217
00:15:17,589 --> 00:15:18,970
of some of these justifications
218
00:15:18,970 --> 00:15:22,540
that have been very prominent also in the
media, particularly this phrase: "nothing
219
00:15:22,540 --> 00:15:24,880
to hide, nothing to fear".
220
00:15:24,880 --> 00:15:31,339
Although in this case there were clear
differences between the different demographic
221
00:15:31,339 --> 00:15:32,699
groups that we spoke with.
222
00:15:32,699 --> 00:15:35,670
Meaning that some people were more comfortable
saying this phrase "nothing to hide, nothing
223
00:15:35,670 --> 00:15:36,769
to fear",
224
00:15:36,769 --> 00:15:40,910
whereas for example when we spoke to local
Muslim groups they problematised this position
225
00:15:40,910 --> 00:15:41,480
much more.
226
00:15:41,480 --> 00:15:44,589
So there is definitely variation here in terms
of that,
227
00:15:44,589 --> 00:15:48,970
but there is a sense in which some of
these justifications have been internalized.
228
00:15:48,970 --> 00:15:52,519
And actually what we've seen, what we phrase
as a kind of surveillance realism,
229
00:15:52,519 --> 00:15:56,760
is that surveillance has become normalized
to such an extent that
230
00:15:56,760 --> 00:16:01,000
it is difficult for people to really understand
or imagine a society in which surveillance
231
00:16:01,000 --> 00:16:03,240
doesn't take place.
232
00:16:03,240 --> 00:16:08,170
Which might also relate to some of these questions
around a lack of understanding of how to actually
233
00:16:08,170 --> 00:16:10,949
resist this or opt out from this.
234
00:16:10,949 --> 00:16:16,279
So i think a key point that we wanted to make
with our research with these focus groups,
235
00:16:16,279 --> 00:16:16,540
is
236
00:16:16,540 --> 00:16:20,910
that we need to distinguish here between
public consent and public resignation, when
237
00:16:20,910 --> 00:16:23,260
we talk about attitudes towards surveillance,
238
00:16:23,260 --> 00:16:26,459
meaning that it isn't necessarily that people
consent to this going on,
239
00:16:26,459 --> 00:16:31,800
but actually have resigned themselves to the fact
that this is how society is being organised.
240
00:16:31,800 --> 00:16:35,870
To then move on to interviews with activists.
241
00:16:35,870 --> 00:16:38,110
We also had similar questions here,
242
00:16:38,110 --> 00:16:40,170
so understanding and experiences of surveillance,
243
00:16:40,170 --> 00:16:44,180
and knowledge and opinions of Snowden leaks
and attitudes towards state surveillance.
244
00:16:44,180 --> 00:16:48,930
And then we also wanted to explore this question
around current online behaviour and practices
245
00:16:48,930 --> 00:16:53,600
and whether there had been any changes and
responses to the Snowden leaks.
246
00:16:53,600 --> 00:16:57,820
And again just some key findings here on these
questions:
247
00:16:57,820 --> 00:17:03,500
So basically the activists that we spoke with
were generally very aware of surveillance,
248
00:17:03,500 --> 00:17:07,589
but again it was visible and physical forms
of surveillance that were more prominent in
249
00:17:07,589 --> 00:17:09,419
how activists spoke about it.
250
00:17:09,419 --> 00:17:14,209
And this is perhaps particular
to the UK context,
251
00:17:14,209 --> 00:17:18,618
because there is a very troublesome history
in the UK with police infiltration into activist
252
00:17:18,618 --> 00:17:18,868
groups,
253
00:17:18,868 --> 00:17:22,849
which has really impacted the activist scene
quite a lot within the UK.
254
00:17:22,849 --> 00:17:26,989
And often this was how the activists we spoke
with would talk about surveillance first and
255
00:17:26,989 --> 00:17:27,799
foremost,
256
00:17:27,799 --> 00:17:33,850
rather than about these more virtual
and less visible forms of surveillance.
257
00:17:33,850 --> 00:17:39,659
And also, perhaps linked to that, despite
this general awareness and widespread experiences
258
00:17:39,659 --> 00:17:40,600
of surveillance,
259
00:17:40,600 --> 00:17:44,619
the activists we spoke with didn't know a
great deal of detail about the Snowden leaks
260
00:17:44,619 --> 00:17:45,519
particularly.
261
00:17:45,519 --> 00:17:50,649
And again there was this confusion with Chelsea
Manning and Wikileaks.
262
00:17:50,649 --> 00:17:56,249
And importantly also there was a sort of general
expectation (some of these quotes sort of highlight
263
00:17:56,249 --> 00:17:57,049
that)
264
00:17:57,049 --> 00:18:02,369
that state surveillance goes on, this is sort
of expected.
265
00:18:02,369 --> 00:18:05,210
And it's confirmed for activists when police
are often there,
266
00:18:05,210 --> 00:18:07,960
when they've organized events or protests
and demonstrations,
267
00:18:07,960 --> 00:18:10,899
or when activities have been intercepted.
268
00:18:10,899 --> 00:18:14,759
And so the Snowden leaks in themselves and
the realities of mass surveillance
269
00:18:14,759 --> 00:18:19,169
came as little surprise to the political activists
in the UK.
270
00:18:19,169 --> 00:18:24,059
And perhaps also therefore, or for one other reason,
there hasn't been much response from the groups
271
00:18:24,059 --> 00:18:24,899
we spoke with anyway,
272
00:18:24,899 --> 00:18:27,149
in terms of changing online behaviour.
273
00:18:27,149 --> 00:18:30,549
Particularly not directly because of Snowden.
274
00:18:30,549 --> 00:18:31,499
And there are some exceptions here,
275
00:18:31,499 --> 00:18:34,899
so for example Greenpeace did really change
their communication behaviour
276
00:18:34,899 --> 00:18:37,029
as a direct response to the Snowden leaks.
277
00:18:37,029 --> 00:18:41,019
And CAGE, I think, as we heard earlier, had recently
also changed communication practices,
278
00:18:41,019 --> 00:18:43,019
although at the time of our interview with
them
279
00:18:43,019 --> 00:18:47,440
they hadn't done as much as they're doing
now.
280
00:18:47,440 --> 00:18:50,679
Predominantly however there has been very
little change in online behaviour,
281
00:18:50,679 --> 00:18:55,679
and where it has taken place it's been part
of a sort of longer-term consciousness of
282
00:18:55,679 --> 00:18:57,220
surveillance.
283
00:18:57,220 --> 00:19:02,350
And the kind of changes we have seen more
are things like face-to-face interaction.
284
00:19:02,350 --> 00:19:08,929
So more face-to-face interaction, perhaps
slightly more careful online communication.
285
00:19:08,929 --> 00:19:12,299
But in terms of encryption:
286
00:19:12,299 --> 00:19:18,919
We found little use of encryption again, although
with exceptions among some of the groups,
287
00:19:18,919 --> 00:19:22,139
but partly this was due to questions of convenience,
288
00:19:22,139 --> 00:19:24,460
and a perceived lack of technical ability.
289
00:19:24,460 --> 00:19:28,399
Which I think are arguments that we're quite
familiar with, when it comes to questions around
290
00:19:28,399 --> 00:19:28,830
this.
291
00:19:28,830 --> 00:19:33,049
But it was also related to a particular kind
of rationale that was expressed in some of
292
00:19:33,049 --> 00:19:34,499
the interviews that we did,
293
00:19:34,499 --> 00:19:40,859
that somehow using encrypted software is about
being hidden or closed in some ways,
294
00:19:40,859 --> 00:19:45,629
whereas activists strive for open and transparent
organisations.
295
00:19:45,629 --> 00:19:51,129
So that somehow contradicts this aim to be
transparent and open and inclusive.
296
00:19:51,129 --> 00:19:57,159
That starting to use encrypted communication
somehow also excludes people.
297
00:19:57,159 --> 00:20:00,330
And linked to that also many of the activists
we spoke with expressed the notion
298
00:20:00,330 --> 00:20:05,869
that their activities and their role in society
didn't constitute a need to really worry about
299
00:20:05,869 --> 00:20:07,049
surveillance.
300
00:20:07,049 --> 00:20:10,759
So despite being aware of surveillance and
expecting it to go on,
301
00:20:10,759 --> 00:20:13,450
there was a sense in which some of the organisations
here
302
00:20:13,450 --> 00:20:15,570
perceived themselves as fairly mainstream,
303
00:20:15,570 --> 00:20:17,119
and therefore kind of safe.
304
00:20:17,119 --> 00:20:19,989
And didn't really need to worry about surveillance.
305
00:20:19,989 --> 00:20:23,299
And really that surveillance would only
need to be something to worry about,
306
00:20:23,299 --> 00:20:29,299
if they moved into more radical forms of politics
and action,
307
00:20:29,299 --> 00:20:31,599
whatever that might be.
308
00:20:31,599 --> 00:20:35,539
So in some ways we might think of this as
kind of acting to somewhat keep the mainstream
309
00:20:35,539 --> 00:20:35,950
in check,
310
00:20:35,950 --> 00:20:40,070
in that surveillance becomes
a variable only if you do certain kinds of
311
00:20:40,070 --> 00:20:42,369
actions.
312
00:20:42,369 --> 00:20:46,509
So therefore also, in terms of sort of
questions around digital
313
00:20:46,509 --> 00:20:49,179
rights and advocacy work around policies,
314
00:20:49,179 --> 00:20:52,649
and policy around privacy and so forth,
315
00:20:52,649 --> 00:20:56,950
that wasn't something that the activists we spoke
with, most of them anyway,
316
00:20:56,950 --> 00:21:01,470
saw as something that directly
featured on their agenda.
317
00:21:01,470 --> 00:21:04,690
So it wasn't really something that they were
so concerned with themselves,
318
00:21:04,690 --> 00:21:09,710
but rather that type of activism is kind of
outsourced to other groups like digital rights
319
00:21:09,710 --> 00:21:11,479
activists or tech activists.
320
00:21:11,479 --> 00:21:15,659
That that's what they do, we are doing something
else.
321
00:21:15,659 --> 00:21:19,970
So I think what we sort of want to suggest
with that is that our research seems anyway
322
00:21:19,970 --> 00:21:20,580
to suggest,
323
00:21:20,580 --> 00:21:24,639
that there are some limitations around resistance
to surveillance,
324
00:21:24,639 --> 00:21:29,989
in that this resistance seems to remain within
the silos of only certain types of actors.
325
00:21:29,989 --> 00:21:35,559
So we're sort of asking: How can we then move
beyond that?
326
00:21:35,559 --> 00:21:39,820
And start thinking of surveillance in terms
of perhaps data justice,
327
00:21:39,820 --> 00:21:45,059
or somehow thinking of how surveillance connects
or resistance to surveillance connects
328
00:21:45,059 --> 00:21:48,460
to broader social and economic justice agendas.
329
00:21:48,460 --> 00:21:50,849
And of course some of this is already happening,
330
00:21:50,849 --> 00:21:53,460
and some of it has been discussed here at
this congress.
331
00:21:53,460 --> 00:21:57,179
So for example how does data collection lead
to discrimination?
332
00:21:57,179 --> 00:21:59,859
Or how does it come to suppress dissent?
333
00:21:59,859 --> 00:22:04,789
But also how does surveillance relate to working
conditions and workers' rights for example,
334
00:22:04,789 --> 00:22:08,889
or how does it link to inequality and poverty?
335
00:22:08,889 --> 00:22:11,409
So I suppose our research suggests that we
need to ask:
336
00:22:11,409 --> 00:22:15,720
if encryption and technical solutions
and discussions around digital rights such
337
00:22:15,720 --> 00:22:16,749
as privacy
338
00:22:16,749 --> 00:22:21,710
remain really within certain circles and perhaps
events like this and so forth,
339
00:22:21,710 --> 00:22:27,349
how can we get it to resonate with a broader
public in some ways?
340
00:22:27,349 --> 00:22:29,460
So — wow, we finished much faster than we
thought we would.
341
00:22:29,460 --> 00:22:35,299
But anyway. So basically we've had a snapshot
now of sort of recent public debate,
342
00:22:35,299 --> 00:22:40,249
and sort of one that suggests that we might
need to think about how to connect concerns
343
00:22:40,249 --> 00:22:41,789
with surveillance,
344
00:22:41,789 --> 00:22:47,379
that are discussed in places like this to
other issues in order to resonate with a broader
345
00:22:47,379 --> 00:22:48,629
public.
346
00:22:48,629 --> 00:22:50,169
And that's it, we have time for questions.
347
00:22:50,169 --> 00:23:00,339
applause
348
00:23:00,339 --> 00:23:05,590
A: Ask questions, make comments, or add
information about some other projects.
349
00:23:05,590 --> 00:23:10,320
Angel: Please line up at the microphones, so you
can speak your questions clearly into the
350
00:23:10,320 --> 00:23:13,190
microphone, please.
351
00:23:13,190 --> 00:23:16,759
The microphone in the back, please.
352
00:23:20,589 --> 00:23:21,449
Go ahead.
353
00:23:21,449 --> 00:23:28,129
Question: Hey. So do you think this lack of
technical understanding of the Snowden leaks
354
00:23:28,129 --> 00:23:34,539
might be due to Snowden fatigue, that is people
getting really tired of reading a Snowden
355
00:23:34,539 --> 00:23:35,320
article?
356
00:23:35,320 --> 00:23:38,859
And another one and another one. Did you know you might have contributed to it?
357
00:23:38,859 --> 00:23:41,869
Angel: Can you maybe repeat the question?
358
00:23:41,869 --> 00:23:45,639
And if you leave the room, please do so quietly,
359
00:23:45,639 --> 00:23:47,519
because we can't understand his question.
360
00:23:47,519 --> 00:23:56,109
Q: Sorry. So the question is: This lack of understanding of the content of the Snowden leaks, maybe
361
00:23:56,109 --> 00:23:58,320
on a basic technical level,
362
00:23:58,320 --> 00:24:03,649
could something that contributed to that
be Snowden fatigue?
363
00:24:03,649 --> 00:24:09,450
L: And you're referring to this sort of drip-feed
way of releasing those documents...
364
00:24:09,450 --> 00:24:12,869
Q: Not necessarily criticizing the way it
was released, but there was a hell of a lot
365
00:24:12,869 --> 00:24:15,060
of content and a lot of people got bored of
it.
366
00:24:15,060 --> 00:24:19,899
L: Right. Okay. mumbling
367
00:24:19,899 --> 00:24:24,219
A: There's a bit of that I think probably
that we see
368
00:24:24,219 --> 00:24:29,710
and The Guardian at some point stopped their
coverage or releasing more information
369
00:24:29,710 --> 00:24:34,669
and then we saw more information coming
out through other sources, The Intercept and
370
00:24:34,669 --> 00:24:36,669
so on.
371
00:24:36,669 --> 00:24:44,320
But I think what we are focusing on or what
we saw in media coverage particularly,
372
00:24:44,320 --> 00:24:48,690
were some deficiencies I think in the media
coverage,
373
00:24:48,690 --> 00:24:54,429
and we would create this link mainly between
the lack of knowledge
374
00:24:54,429 --> 00:24:57,580
and the deficiencies in the media coverage
per se.
375
00:24:57,580 --> 00:25:06,340
Not necessarily in The Guardian, but probably
most other media organizations and other newspapers.
376
00:25:08,220 --> 00:25:12,289
L: I think there's different views on that
because a lot of people feel like it's stayed
377
00:25:12,289 --> 00:25:13,219
in the public debate
378
00:25:13,219 --> 00:25:18,129
or in the public realm, because there was a
continuation of revelations that came after
379
00:25:18,129 --> 00:25:18,389
each other,
380
00:25:18,389 --> 00:25:22,529
rather than just doing this data dump thing
and you know just doing everything in one
381
00:25:22,529 --> 00:25:23,200
go.
382
00:25:23,200 --> 00:25:27,629
So I think we would probably have been able
to say the same thing if it was done differently
383
00:25:27,629 --> 00:25:28,330
as well.
384
00:25:29,900 --> 00:25:31,950
Angel: There is a question from the internet.
385
00:25:31,950 --> 00:25:38,710
Q: Yes. Ifup is asking: as far as he or she
understood, the people were not informed very
386
00:25:38,710 --> 00:25:41,469
well on what really was revealed.
387
00:25:41,469 --> 00:25:45,729
Wouldn't it have been the task of the media
to inform them?
388
00:25:45,729 --> 00:25:48,509
And how could it have been done better?
389
00:25:48,509 --> 00:25:55,769
L: This seems to be a rhetorical question
in that they didn't... yes
390
00:25:55,769 --> 00:25:59,280
A: Well yes, they should have.
391
00:25:59,280 --> 00:26:04,849
Ideally we would think that it is the task
of the media to inform,
392
00:26:04,849 --> 00:26:11,179
we saw that some media did inform, others
did pretty much the opposite.
393
00:26:11,179 --> 00:26:13,320
Then there's the question how to improve that.
394
00:26:13,320 --> 00:26:17,049
And what is the role of different types of
media and alternative media
395
00:26:17,049 --> 00:26:21,899
and what needs to change structurally
in mainstream media?
396
00:26:21,899 --> 00:26:23,109
But that is a big debate.
397
00:26:23,939 --> 00:26:28,719
L: And we should also say that we've done
interviews with journalists, asking questions
398
00:26:28,719 --> 00:26:32,489
as to why they covered this the way that they
did.
399
00:26:32,489 --> 00:26:36,139
And hopefully those interviews will reveal
something more,
400
00:26:36,139 --> 00:26:38,210
but those are still ongoing.
401
00:26:38,210 --> 00:26:43,200
But we've had for example James Ball from
The Guardian who came to our conference in
402
00:26:43,200 --> 00:26:43,739
June,
403
00:26:43,739 --> 00:26:47,229
and talked about some of the structural problems
with a couple of journalists who cover security
404
00:26:47,229 --> 00:26:48,409
issues.
405
00:26:48,409 --> 00:26:54,369
And there are quite a lot of obstacles
for them to do that in a critical and investigatory
406
00:26:54,369 --> 00:26:54,700
way.
407
00:26:54,700 --> 00:26:58,719
So I think those are the issues that we want
to explore when we find out more through
408
00:26:58,719 --> 00:26:59,859
those interviews.
409
00:27:00,599 --> 00:27:04,019
Angel: We have time for one last question,
please make it short
410
00:27:06,989 --> 00:27:10,219
Q: Hello. That's better.
411
00:27:10,219 --> 00:27:12,909
I'm not surprised to be honest,
412
00:27:12,909 --> 00:27:18,009
we have seen a similar thing by John Oliver
on Last Week Tonight, I can only recommend
413
00:27:18,009 --> 00:27:19,889
that scene.
414
00:27:19,889 --> 00:27:23,309
So the question is only about what we talk
about,
415
00:27:23,309 --> 00:27:25,219
so can everybody relate to that?
416
00:27:25,219 --> 00:27:28,049
I have just one question about the first slides
you have shown
417
00:27:28,049 --> 00:27:31,049
the numbers: What do they reveal?
418
00:27:33,689 --> 00:27:34,919
A: Numbers?
419
00:27:34,919 --> 00:27:39,499
Q: In your first slides there were all of
those bar charts with kind of numbers and
420
00:27:39,499 --> 00:27:40,879
I was interested in those numbers.
421
00:27:40,879 --> 00:27:42,700
A: Okay.
422
00:27:42,700 --> 00:27:45,789
Q: I guess occurrences.
423
00:27:45,789 --> 00:27:49,700
A: Yes, so at the beginning we showed the
timeline of...
424
00:27:49,700 --> 00:27:51,619
L: Numbers of mumbling
425
00:27:51,619 --> 00:28:02,639
A: Ah yes. These were the dates of the publication
and that is the volume of publication
426
00:28:02,639 --> 00:28:05,440
again: Looking at the press in this case,
427
00:28:05,440 --> 00:28:08,389
looking at not just The Guardian, but all
kinds of other newspapers.
428
00:28:08,389 --> 00:28:12,379
That's one part of the research, and there
will be another part of the research (you
429
00:28:12,379 --> 00:28:15,239
will find information about this on the
website),
430
00:28:15,239 --> 00:28:20,229
which is about broadcasting, which is about
TV and radio coverage.
431
00:28:20,229 --> 00:28:24,210
But so far what we saw is that there is a
fairly similar picture
432
00:28:24,210 --> 00:28:26,330
in terms of how these curves developed,
433
00:28:26,330 --> 00:28:30,039
and also in terms of the content of the coverage.
434
00:28:31,419 --> 00:28:33,049
Angel: I'd say time is up.
435
00:28:33,049 --> 00:28:36,309
Thank you very much Lina Dencik and Arne Hintz
for your talk!
436
00:28:36,309 --> 00:28:37,569
applause
437
00:28:37,569 --> 00:28:41,579
music
438
00:28:41,579 --> 00:28:48,000
subtitles created by c3subtitles.de
Join, and help us!