0:00:00.440,0:00:10.299
music
0:00:10.299,0:00:12.679
Let's start. Welcome!
0:00:12.679,0:00:16.880
More than two years ago, Edward Snowden's[br]files became public.
0:00:16.880,0:00:18.410
They went public
0:00:18.410,0:00:20.910
and the media went crazy.
0:00:20.910,0:00:23.160
And the public maybe not so much,
0:00:23.160,0:00:26.250
as you may have noticed amongst your friends[br]and family,
0:00:26.250,0:00:28.000
as did I.
0:00:28.000,0:00:34.530
A lot remains the same after Snowden's revelations,
0:00:34.530,0:00:38.039
even if people are concerned about surveillance.
0:00:38.039,0:00:44.780
The following talk by Arne Hintz and Lina[br]Dencik from University of Cardiff explores
0:00:44.780,0:00:46.019
just that.
0:00:46.019,0:00:54.739
They analyzed how the media actually reacted[br]to the revelations made by Edward Snowden
0:00:54.739,0:00:56.550
and they also looked at how the public,
0:00:56.550,0:01:03.100
such as journalists, activists and other people,[br]reacted to Edward Snowden's disclosures.
0:01:03.100,0:01:09.520
So please give a warm round of applause to[br]Arne Hintz and Lina Dencik. Thank you!
0:01:09.520,0:01:17.790
applause
0:01:17.790,0:01:21.910
Arne: Thank you very much, there are still[br]a few free seats over there.
0:01:21.910,0:01:25.290
Hello everybody, my name is Arne Hintz, this[br]is Lina Dencik.
0:01:25.290,0:01:31.620
We are both from Cardiff University, from[br]the school of journalism, media and cultural studies,
0:01:31.620,0:01:34.380
so not from the tech department.
0:01:34.380,0:01:38.960
We want to talk about some of the results[br]of a research project
0:01:38.960,0:01:42.400
that we've been working on this year and for the past...
0:01:42.400,0:01:45.470
for a bit more than a year
0:01:45.470,0:01:50.300
and it's called "Digital Citizenship and Surveillance[br]Society: UK State-Media-Citizen Relations
0:01:50.300,0:01:51.800
after the Snowden Leaks",
0:01:51.800,0:01:56.980
and it's about the implications of the Snowden[br]leaks in four areas:
0:01:56.980,0:02:01.440
News media, civil society, policy and technology
0:02:01.440,0:02:05.430
and here what we want to do is present just[br]a few findings from that project
0:02:05.430,0:02:11.230
and focus on two areas, the news media part[br]and the civil society part.
0:02:11.230,0:02:16.780
It's all focused on the UK, the country where[br]Cardiff University is located
0:02:16.780,0:02:22.510
so there won't be a lot of international comparisons,[br]not a lot about Germany and so on,
0:02:22.510,0:02:29.470
but I think maybe at the end we can draw[br]some comparisons ourselves here in this room.
0:02:32.730,0:02:38.890
So this has been the project basically, the[br]title as you see it over there.
0:02:38.890,0:02:43.190
The news media part has basically asked how[br]the British media represented the Snowden
0:02:43.190,0:02:45.190
leaks and digital surveillance.
0:02:45.190,0:02:51.400
The society part is about questions such as:[br]What is the nature of public knowledge with
0:02:51.400,0:02:52.740
regards to digital surveillance?
0:02:52.740,0:02:56.130
Are everyday communication practices changing?
0:02:56.130,0:03:01.040
And how are activists affected by the revelations[br]of mass surveillance?
0:03:01.040,0:03:04.560
The policies part is still ongoing, it's still[br]being developed
0:03:04.560,0:03:08.700
and it's about the current policy and regulatory[br]framework of digital surveillance
0:03:08.700,0:03:12.890
and reform proposals and current reforms that[br]are taking place.
0:03:12.890,0:03:17.599
And the technology part is about the technological[br]infrastructure of surveillance
0:03:17.599,0:03:22.150
and technological possibilities of counter-surveillance[br]and resistance.
0:03:22.150,0:03:27.970
And then we want to bring all this together[br]and ask: How does that re-define what we may
0:03:27.970,0:03:30.610
understand as digital citizenship?
0:03:30.610,0:03:34.080
The research team includes a number of people[br]from Cardiff University
0:03:34.080,0:03:40.260
including us, including other lecturers, professors,[br]staff members of Cardiff University
0:03:40.260,0:03:44.750
and a few research assistants and research[br]associates that we employed for this,
0:03:44.750,0:03:53.760
plus a couple of people from Oxford and one[br]from Briar, a tech development project.
0:03:53.760,0:03:59.099
We also have an advisory board with some colleagues[br]from academia
0:03:59.099,0:04:03.970
but also representatives of digital rights[br]organisations, such as Open Rights Group,
0:04:03.970,0:04:05.690
Privacy International and others.
0:04:05.690,0:04:11.770
We have a project website, where you can learn[br]more about the project, about the background
0:04:11.770,0:04:13.920
and also some preliminary findings.
0:04:13.920,0:04:20.220
We also had a conference earlier this year,[br]in June, maybe some of you were there.
0:04:20.220,0:04:25.150
It was in Cardiff with some interesting speakers[br]at the conference
0:04:25.150,0:04:29.810
and also combined the academic and the practical[br]part a little bit.
0:04:29.810,0:04:34.960
So. A few glimpses of the results in these[br]two areas that I mentioned.
0:04:34.960,0:04:42.080
So for the media research part we were interested[br]in studying how the British news media have
0:04:42.080,0:04:46.639
represented the Snowden leaks and also digital[br]surveillance more broadly.
0:04:46.639,0:04:54.630
And so we asked: How are debates over surveillance[br]constructed? What are the angles and opinions?
0:04:54.630,0:04:57.040
What are the usual sources? And so on.
0:04:57.040,0:05:02.460
We need to start on an anecdotal basis.
0:05:02.460,0:05:07.840
Some examples of media coverage that emerged[br]very quickly after the Snowden revelations,
0:05:07.840,0:05:12.630
again in the UK press, which show different[br]types of the coverage.
0:05:12.630,0:05:17.820
So we probably all know that the Guardian[br]was very instrumental in the revelations
0:05:17.820,0:05:25.030
and provided a lot of information, really[br]took this role of the fourth estate and of
0:05:25.030,0:05:27.169
investigative journalism quite seriously.
0:05:27.169,0:05:34.000
On the other hand, other newspapers like this[br]one were very critical about the Snowden revelations
0:05:34.000,0:05:38.729
and also about the Guardian for informing[br]people about these and running with these revelations.
0:05:40.169,0:05:44.639
And then there were others like this one,[br]that was a famous example.
0:05:44.639,0:05:52.300
The former editor of the Independent, actually[br]another liberal, middle ground, not really
0:05:52.300,0:05:56.350
left but at least not ultra conservative newspaper.
0:05:56.350,0:06:00.430
He says: "Edward Snowden's secrets may be[br]dangerous, I would not have published them".
0:06:00.430,0:06:06.180
Okay, can debate that, but then he says "if[br]MI5 warns that this is not in the public interest,
0:06:06.180,0:06:08.650
who am I to disbelieve them?".
0:06:08.650,0:06:10.600
laughing
0:06:10.600,0:06:12.550
That's an interesting understanding of journalism
0:06:12.550,0:06:16.810
and it was later retracted, it was debated[br]quite a lot.
0:06:16.810,0:06:28.150
But we see that also this caution towards[br]publishing something like this has been quite
0:06:28.150,0:06:28.949
widespread.
0:06:28.949,0:06:31.270
So what did we do?
0:06:31.270,0:06:38.310
Here's a timeline of Snowden and surveillance[br]related coverage in the press in this case
0:06:38.310,0:06:39.540
in the UK.
0:06:39.540,0:06:44.240
And we looked at five case studies, five moments[br]of coverage.
0:06:44.240,0:06:47.540
The first were the initial revelations of[br]Snowden.
0:06:47.540,0:06:53.139
The second the interception of communications[br]in foreign embassies and European Union offices
0:06:53.139,0:06:58.430
and spying on world leaders' phone communications,[br]such as Angela Merkel's for example.
0:06:58.430,0:07:02.620
The third was the detention of Glenn Greenwald's[br]partner David Miranda at Heathrow Airport
0:07:02.620,0:07:04.600
under anti-terror legislation.
0:07:04.600,0:07:11.030
Which raised debates around freedom of the[br]press and national security.
0:07:11.030,0:07:15.310
Then we looked at the parliamentary report[br]into the death of Lee Rigby.
0:07:15.310,0:07:20.810
Which was a case that was described as a terrorist[br]attack on a British soldier on the streets
0:07:20.810,0:07:22.500
of London.
0:07:22.500,0:07:28.150
And it led to debates around social media[br]companies' role in tackling terrorism.
0:07:28.150,0:07:30.370
And then finally the Charlie Hebdo attacks[br]in Paris,
0:07:30.370,0:07:35.270
which prompted debates around digital encryption,[br]freedom of speech and the resurrection of
0:07:35.270,0:07:40.180
the so-called Snooper's Charter in the UK,
0:07:40.180,0:07:45.080
the legislation around surveillance.
0:07:45.080,0:07:49.620
So a few results:
0:07:49.620,0:07:54.539
Snowden was clearly prominent in the media[br]coverage, and generally was covered using
0:07:54.539,0:07:56.930
mostly neutral or even positive language,
0:07:56.930,0:08:00.729
described as a whistleblower as we see[br]here at the bottom.
0:08:00.729,0:08:04.919
But if we look at the focus on issues around[br]surveillance taken in the stories
0:08:04.919,0:08:13.360
and so at the context of coverage of surveillance,[br]the most important one here has to do
0:08:13.360,0:08:18.020
as we can see there, probably it's a little[br]bit small to read.
0:08:18.020,0:08:22.479
But the most important has to do [br]with themes of terrorism,
0:08:22.479,0:08:27.259
with themes of the role of security agencies[br]and government response.
0:08:27.259,0:08:30.548
So that's been very much the context in
0:08:30.548,0:08:33.708
most media coverage for discussing the[br]Snowden revelations
0:08:33.708,0:08:35.208
and surveillance more broadly.
0:08:35.208,0:08:40.580
And that is in stark contrast to discussing[br]surveillance in terms of human rights, personal
0:08:40.580,0:08:43.049
privacy and freedom of the press.
0:08:43.049,0:08:49.920
In other words: rights-based and citizen-based perspectives on surveillance.
0:08:49.920,0:08:55.040
If we look at who was used as the sources[br]in these stories, we see a pattern that is
0:08:55.040,0:08:58.800
actually quite typical in media sourcing generally.
0:08:58.800,0:09:02.520
Politicians are by far the most prominent[br]source.
0:09:02.520,0:09:05.810
And that is not unusual at all.
0:09:05.810,0:09:12.000
But in this case it means that elite concerns[br]around surveillance are most prominent, not
0:09:12.000,0:09:13.540
citizen concerns.
0:09:13.540,0:09:19.290
Political sources are framing the debate and[br]how it is interpreted.
0:09:19.290,0:09:25.649
And so unsurprisingly then the opinions raised[br]by these sources are for example, as we see
0:09:25.649,0:09:28.990
there, that surveillance should be increased
0:09:28.990,0:09:33.950
or at least is necessary, at least has to[br]be maintained.
0:09:33.950,0:09:38.290
That the Snowden leaks have compromised the[br]work of intelligence services
0:09:38.290,0:09:42.870
and that social media companies should do[br]more to fight terror and to increase their
0:09:42.870,0:09:44.470
own surveillance.
0:09:44.470,0:09:48.839
And so this dominant framework understands[br]surveillance as a valuable activity,
0:09:48.839,0:09:55.380
and one for which both intelligence services[br]and business actors have a responsibility.
0:09:55.380,0:09:59.830
Rather than it being primarily problematic[br]for citizens.
0:09:59.830,0:10:05.290
And where it is presented as problematic,[br]in the snooping on world leaders case study,
0:10:05.290,0:10:10.209
surveillance was seen as damaging to international[br]relations and therefore problematic.
0:10:10.209,0:10:15.399
And that's something that is primarily of[br]relevance to big players rather than ordinary
0:10:15.399,0:10:16.170
citizens.
0:10:16.170,0:10:20.709
So from these short glimpses, what we can[br]see, just a few preliminary conclusions,
0:10:20.709,0:10:27.089
is that yes, there was extensive and often[br]positive reporting on Snowden himself, in
0:10:27.089,0:10:28.360
some media at least.
0:10:28.360,0:10:32.970
But debates around surveillance are framed[br]by elites, rather than citizens
0:10:32.970,0:10:38.610
and this elite-centered structure of news[br]coverage means that the consequences and the
0:10:38.610,0:10:42.600
extent particularly of mass surveillance of[br]citizens
0:10:42.600,0:10:44.610
are largely invisible in media coverage.
0:10:44.610,0:10:48.450
There's a strong framing on national security[br]and so on,
0:10:48.450,0:10:53.640
but there is quite insufficient information[br]on the practices and implications of surveillance
0:10:53.640,0:10:55.980
for normal citizens.
0:10:55.980,0:11:01.399
And so the issues of mass surveillance that[br]were actually so central in Snowden's revelations,
0:11:01.399,0:11:04.149
remain relatively invisible in these debates,
0:11:04.149,0:11:09.050
apart from perhaps the Guardian coverage.
0:11:09.050,0:11:16.260
And so we could say that media justify and[br]normalize current surveillance practices,
0:11:16.260,0:11:23.220
and that discussions about individual rights[br]and human security are structurally discouraged.
0:11:23.220,0:11:24.170
That is the media part
0:11:25.670,0:11:29.620
Lina: So I'll just go briefly through some[br]of our key findings for what we call the civil
0:11:29.620,0:11:31.450
society work stream on this.
0:11:31.450,0:11:36.910
Which looks at two aspects, so there is the[br]public knowledge and attitudes on the Snowden
0:11:36.910,0:11:38.450
leaks and digital surveillance.
0:11:38.450,0:11:42.350
And then there's the second part which is[br]particularly to do with responses amongst
0:11:42.350,0:11:43.899
political activists.
0:11:43.899,0:11:48.720
And for the first part, the public opinion[br]research, we did a number of focus groups across
0:11:48.720,0:11:49.899
different demographics in the UK,
0:11:49.899,0:11:53.339
in order to get a diverse range of [br]opinions and views.
0:11:53.339,0:11:59.180
So that ranges from sort of high income people[br]working in the financial centre to local young
0:11:59.180,0:12:03.120
Muslim groups within Cardiff itself.
0:12:03.120,0:12:05.959
So a different range and different groups[br]of people.
0:12:05.959,0:12:11.589
And then for the research on the activist[br]responses we did a number of interviews with
0:12:11.589,0:12:13.550
different groups and organisations,
0:12:13.550,0:12:16.420
from large NGOs to smaller community groups.
0:12:16.420,0:12:21.100
Ranging from environmental groups, labour[br]activists, anti-war activists like "Stop the
0:12:21.100,0:12:21.450
War",
0:12:21.450,0:12:24.990
economic justice groups like "Global Justice[br]Now", and community
0:12:24.990,0:12:30.420
and civil liberty groups such as also "CAGE",[br]who spoke earlier today.
0:12:30.420,0:12:31.720
And talked with them.
0:12:31.720,0:12:36.390
So these were particularly groups that weren't[br]digital rights activists or tech activists
0:12:36.390,0:12:36.870
specifically,
0:12:36.870,0:12:41.649
to try and get an understanding of how other[br]political activists view this issue in particular
0:12:41.649,0:12:42.860
in response to the Snowden leaks.
0:12:42.860,0:12:48.930
So with the first bit on public opinion in[br]our focus groups we had a range of themes.
0:12:48.930,0:12:51.800
Understanding and experiences of surveillance,
0:12:51.800,0:12:54.510
knowledge and opinions on Snowden leaks,
0:12:54.510,0:12:56.540
concerns with privacy and personal data,
0:12:56.540,0:12:58.920
questions around online behaviour and practices
0:12:58.920,0:13:02.470
and attitudes towards intelligence services.
0:13:02.470,0:13:06.700
So just a couple of key points from these[br]focus groups:
0:13:06.700,0:13:11.350
First of all there was particularly low knowledge[br]of who Edward Snowden was,
0:13:11.350,0:13:15.940
and even less knowledge of what the content[br]of the leaks was.
0:13:15.940,0:13:21.450
And there was a lot of confusion in discussions[br]with Julian Assange, Chelsea Manning and Wikileaks
0:13:21.450,0:13:21.750
really,
0:13:21.750,0:13:24.160
in terms of how people had come about this[br]story.
0:13:24.160,0:13:30.350
And there was a lot of mix-up between those[br]different stories.
0:13:30.350,0:13:36.320
In terms of actual understandings of surveillance,[br]state surveillance isn't really
0:13:36.320,0:13:38.570
isolated in how people speak about it.
0:13:38.570,0:13:43.149
It overlaps also with questions of corporate[br]surveillance and also peer surveillance or
0:13:43.149,0:13:44.670
employer surveillance and so forth.
0:13:44.670,0:13:49.029
So a lot of concerns are not necessarily about[br]state surveillance per se and it's difficult
0:13:49.029,0:13:52.350
to isolate this as a particular issue.
0:13:52.350,0:13:57.139
And also when it comes to what constitutes[br]surveillance,
0:13:57.139,0:14:01.540
the initial responses would be things like[br]CCTV, and these sorts of things were
0:14:01.540,0:14:04.690
seen as more kind of real forms of surveillance.
0:14:04.690,0:14:08.860
But on the other hand it was very clear that[br]people felt that the collection of data
0:14:08.860,0:14:12.410
and also including the collection of metadata,[br]so even where it is
0:14:12.410,0:14:15.089
not about content, constitutes surveillance.
0:14:15.089,0:14:21.070
So that was generally how people felt about[br]what surveillance actually means.
0:14:21.070,0:14:27.480
In terms then of concerns around this, people's[br]worries about state surveillance in particular
0:14:27.480,0:14:30.399
predominantly concern the lack of transparency[br]around it.
0:14:30.399,0:14:36.339
So a lack of transparency around what is being[br]collected, but also how it's being used and
0:14:36.339,0:14:37.190
what it's being used for,
0:14:37.190,0:14:42.720
and also what the regulatory framework is[br]that's in place surrounding it.
0:14:42.720,0:14:46.660
And also concerns over the lack of knowledge[br]or understanding of how to actually opt out,
0:14:46.660,0:14:50.970
or resist or circumvent collection of data.
0:14:50.970,0:14:55.209
And in terms of sort of changes in online[br]behaviour then,
0:14:55.209,0:14:58.430
these concerns do manifest themselves in some[br]changes, but it's mainly in terms of sort
0:14:58.430,0:15:00.279
of self-regulating behaviour,
0:15:00.279,0:15:03.760
not saying things that are too controversial[br]online and so forth,
0:15:03.760,0:15:09.880
rather than actually changes in using different[br]tools or different communication platforms,
0:15:09.880,0:15:13.860
which wasn't prominent at all in our focus[br]groups.
0:15:13.860,0:15:17.589
And what we also saw as sort of implications[br]of this is that there was sort of an internalising
0:15:17.589,0:15:18.970
of some of these justifications
0:15:18.970,0:15:22.540
that have been very prominent also in the[br]media, particularly this phrase: "nothing
0:15:22.540,0:15:24.880
to hide, nothing to fear".
0:15:24.880,0:15:31.339
Although in this case there were clear [br]differences between the different demographic
0:15:31.339,0:15:32.699
groups that we spoke with.
0:15:32.699,0:15:35.670
Meaning that some people were more comfortable[br]saying this phrase "nothing to hide, nothing
0:15:35.670,0:15:36.769
to fear",
0:15:36.769,0:15:40.910
whereas for example when we spoke to local[br]Muslim groups they problematised this position
0:15:40.910,0:15:41.480
much more.
0:15:41.480,0:15:44.589
So there is definitely variation here in terms[br]of that,
0:15:44.589,0:15:48.970
but there is a sense in which some of[br]these justifications have been internalized.
0:15:48.970,0:15:52.519
And actually what we've seen, what we phrase[br]as a kind of surveillance realism,
0:15:52.519,0:15:56.760
is that surveillance has become normalized[br]to such an extent,
0:15:56.760,0:16:01.000
that it is difficult for people to really understand[br]or imagine a society in which surveillance
0:16:01.000,0:16:03.240
doesn't take place.
0:16:03.240,0:16:08.170
Which might also relate to some of these questions[br]around a lack of understanding of how to actually
0:16:08.170,0:16:10.949
resist this or opt out from this.
0:16:10.949,0:16:16.279
So I think a key point that we wanted to make[br]with our research with these focus groups,
0:16:16.279,0:16:16.540
is
0:16:16.540,0:16:20.910
that we need to distinguish here between[br]public consent and public resignation, when
0:16:20.910,0:16:23.260
we talk about attitudes towards surveillance,
0:16:23.260,0:16:26.459
meaning that it isn't necessarily that people[br]consent to this going on
0:16:26.459,0:16:31.800
but rather have resigned themselves to the fact that[br]this is how society is being organised.
0:16:31.800,0:16:35.870
To then move on to interviews with activists.
0:16:35.870,0:16:38.110
We also had similar questions here,
0:16:38.110,0:16:40.170
so understanding and experiences of surveillance,
0:16:40.170,0:16:44.180
and knowledge and opinions of Snowden leaks[br]and attitudes towards state surveillance.
0:16:44.180,0:16:48.930
And then we also wanted to explore this question[br]around current online behaviour and practices
0:16:48.930,0:16:53.600
and whether there had been any changes in[br]response to the Snowden leaks.
0:16:53.600,0:16:57.820
And again just some key findings here on these[br]questions:
0:16:57.820,0:17:03.500
So basically the activists that we spoke with[br]were generally very aware of surveillance,
0:17:03.500,0:17:07.589
but again it was visible and physical forms[br]of surveillance that were more prominent in
0:17:07.589,0:17:09.419
how activists spoke about it.
0:17:09.419,0:17:14.209
And this is perhaps particularly a UK[br]context,
0:17:14.209,0:17:18.618
because there is a very troublesome history[br]in the UK with police infiltration into activist
0:17:18.618,0:17:18.868
groups,
0:17:18.868,0:17:22.849
which has really impacted the activist scene[br]quite a lot within the UK.
0:17:22.849,0:17:26.989
And often this was how the activists we spoke[br]with would talk about surveillance first and
0:17:26.989,0:17:27.799
foremost,
0:17:27.799,0:17:33.850
rather than about these more virtual and[br]invisible forms of surveillance.
0:17:33.850,0:17:39.659
And also perhaps linked to that then despite[br]this general awareness and wide-spread experiences
0:17:39.659,0:17:40.600
of surveillance,
0:17:40.600,0:17:44.619
the activists we spoke with didn't know a[br]great deal of detail about the Snowden leaks
0:17:44.619,0:17:45.519
particularly.
0:17:45.519,0:17:50.649
And again there was this confusion with Chelsea[br]Manning and Wikileaks.
0:17:50.649,0:17:56.249
And importantly also there was a sort of general[br]expectation, as some of these quotes
0:17:56.249,0:17:57.049
highlight,
0:17:57.049,0:18:02.369
that state surveillance goes on; this is[br]expected.
0:18:02.369,0:18:05.210
And it's confirmed for activists when police[br]are often there,
0:18:05.210,0:18:07.960
when they've organized events or protests[br]and demonstrations,
0:18:07.960,0:18:10.899
or when activities have been intercepted.
0:18:10.899,0:18:14.759
And so the Snowden leaks in themselves and[br]the realities of mass surveillance
0:18:14.759,0:18:19.169
came as little surprise to the political activists[br]in the UK.
0:18:19.169,0:18:24.059
And perhaps partly for that reason there hasn't[br]been much response from the groups
0:18:24.059,0:18:24.899
we spoke with anyway,
0:18:24.899,0:18:27.149
in terms of changing online behaviour.
0:18:27.149,0:18:30.549
Particularly not directly because of Snowden.
0:18:30.549,0:18:31.499
And there are some exceptions here,
0:18:31.499,0:18:34.899
so for example Greenpeace did really change[br]their communication behaviour
0:18:34.899,0:18:37.029
as a direct response to the Snowden leaks.
0:18:37.029,0:18:41.019
And CAGE, I think, as we heard earlier, had recently[br]also changed communication practices,
0:18:41.019,0:18:43.019
although at the time of our interview with[br]them
0:18:43.019,0:18:47.440
they hadn't done as much as they're doing[br]now.
0:18:47.440,0:18:50.679
Predominantly however there has been very[br]little change in online behaviour,
0:18:50.679,0:18:55.679
and where it has taken place it's been part[br]of a sort of longer term consciousness of
0:18:55.679,0:18:57.220
surveillance.
0:18:57.220,0:19:02.350
And the kind of changes we have seen more[br]are things like face to face interaction.
0:19:02.350,0:19:08.929
So more face to face interaction, perhaps[br]slightly more careful online communication.
0:19:08.929,0:19:12.299
But in terms of encryption:
0:19:12.299,0:19:18.919
We found little use of encryption, again with[br]exceptions among some of the groups,
0:19:18.919,0:19:22.139
but partly this was due to questions of convenience,
0:19:22.139,0:19:24.460
and a perceived lack of technical ability.
0:19:24.460,0:19:28.399
Which I think are arguments that we're quite[br]familiar with, when it comes to questions around
0:19:28.399,0:19:28.830
this.
0:19:28.830,0:19:33.049
But it was also related to a particular kind[br]of rationale that was expressed in some of
0:19:33.049,0:19:34.499
the interviews that we did,
0:19:34.499,0:19:40.859
that somehow using encrypted software is about[br]being hidden or closed in some ways,
0:19:40.859,0:19:45.629
whereas activists strive for open and transparent[br]organisations.
0:19:45.629,0:19:51.129
So that somehow contradicts this aim to be[br]transparent and open and inclusive.
0:19:51.129,0:19:57.159
That starting to use encrypted communication[br]somehow also excludes people.
0:19:57.159,0:20:00.330
And linked to that also many of the activists[br]we spoke with expressed the notion
0:20:00.330,0:20:05.869
that their activities and their role in society[br]didn't constitute a need to really worry about
0:20:05.869,0:20:07.049
surveillance.
0:20:07.049,0:20:10.759
So despite being aware of surveillance and[br]expecting it to go on,
0:20:10.759,0:20:13.450
there was a sense in which some of the organisations[br]here
0:20:13.450,0:20:15.570
perceived themselves as fairly mainstream,
0:20:15.570,0:20:17.119
and therefore kind of safe.
0:20:17.119,0:20:19.989
And didn't really need to worry about surveillance.
0:20:19.989,0:20:23.299
And really that surveillance would only really[br]need to be something to worry about,
0:20:23.299,0:20:29.299
if they moved into more radical forms of politics[br]and action,
0:20:29.299,0:20:31.599
whatever that might be.
0:20:31.599,0:20:35.539
So in some ways we might think of this as[br]acting to somewhat keep the mainstream
0:20:35.539,0:20:35.950
in check,
0:20:35.950,0:20:40.070
in that surveillance becomes a factor only[br]if you do certain kinds of
0:20:40.070,0:20:42.369
actions.
0:20:42.369,0:20:46.509
And therefore, in terms of questions around digital
0:20:46.509,0:20:49.179
rights and advocacy work around policies,
0:20:49.179,0:20:52.649
and policy around privacy and so forth,
0:20:52.649,0:20:56.950
these weren't something that the activists we spoke[br]with, most of them anyway,
0:20:56.950,0:21:01.470
saw as something that directly[br]featured on their agenda.
0:21:01.470,0:21:04.690
So it wasn't really something that they were[br]so concerned with themselves,
0:21:04.690,0:21:09.710
but rather that type of activism is kind of[br]outsourced to other groups like digital rights
0:21:09.710,0:21:11.479
activists or tech activists.
0:21:11.479,0:21:15.659
That that's what they do, we are doing something[br]else.
0:21:15.659,0:21:19.970
So what we want to suggest with that is what[br]our research anyway seems
0:21:19.970,0:21:20.580
to suggest:
0:21:20.580,0:21:24.639
that there are some limitations around resistance[br]to surveillance,
0:21:24.639,0:21:29.989
in that this resistance seems to remain within[br]the silos of only certain types of actors.
0:21:29.989,0:21:35.559
So we're sort of asking: How can we then move[br]beyond that?
0:21:35.559,0:21:39.820
And start thinking of surveillance in terms[br]of perhaps data justice,
0:21:39.820,0:21:45.059
or somehow thinking of how surveillance connects[br]or resistance to surveillance connects
0:21:45.059,0:21:48.460
to broader social and economic justice agendas.
0:21:48.460,0:21:50.849
And of course some of this is already happening,
0:21:50.849,0:21:53.460
and some of it has been discussed here at[br]this congress.
0:21:53.460,0:21:57.179
So for example how does data collection lead[br]to discrimination?
0:21:57.179,0:21:59.859
Or how does it come to suppress dissent?
0:21:59.859,0:22:04.789
But also how does surveillance relate to working[br]conditions and workers' rights for example,
0:22:04.789,0:22:08.889
or how does it link to inequality and poverty?
0:22:08.889,0:22:11.409
So I suppose our research suggests that we[br]need to think about this:
0:22:11.409,0:22:15.720
if encryption and technical solutions[br]and discussions around digital rights such
0:22:15.720,0:22:16.749
as privacy
0:22:16.749,0:22:21.710
remain really within certain circles and perhaps[br]events like this and so forth,
0:22:21.710,0:22:27.349
how can we get it to resonate with a broader[br]public in some ways?
0:22:27.349,0:22:29.460
So — wow, we finished much faster than we[br]thought we would.
0:22:29.460,0:22:35.299
But anyway. So basically we've had a snapshot[br]now of sort of recent public debate,
0:22:35.299,0:22:40.249
and sort of one that suggests that we might[br]need to think about how to connect concerns
0:22:40.249,0:22:41.789
with surveillance,
0:22:41.789,0:22:47.379
that are discussed in places like this to[br]other issues in order to resonate with a broader
0:22:47.379,0:22:48.629
public.
0:22:48.629,0:22:50.169
And that's it, we have time for questions
0:22:50.169,0:23:00.339
applause
0:23:00.339,0:23:05.590
A: Please ask questions, make comments, or add[br]information about some other projects.
0:23:05.590,0:23:10.320
Angel: Please line up at the microphones, so you[br]can speak your questions clearly into the
0:23:10.320,0:23:13.190
microphone.
0:23:13.190,0:23:16.759
The microphone in the back, please.
0:23:20.589,0:23:21.449
Go ahead.
0:23:21.449,0:23:28.129
Question: Hey. So do you think this lack of[br]technical understanding of the Snowden leaks
0:23:28.129,0:23:34.539
might be due to Snowden fatigue, that is people[br]getting really tired of reading a Snowden
0:23:34.539,0:23:35.320
article?
0:23:35.320,0:23:38.859
And another one and another one: Did you know you might have contributed to it?
0:23:38.859,0:23:41.869
Angel: Can you maybe repeat the question?
0:23:41.869,0:23:45.639
And if you leave the room, please do so quietly,
0:23:45.639,0:23:47.519
because we can't understand his question.
0:23:47.519,0:23:56.109
Q: Sorry. So the question is: This lack of understanding of the content of the Snowden leaks, maybe
0:23:56.109,0:23:58.320
on a basic technical level,
0:23:58.320,0:24:03.649
could something that contributed to that[br]be Snowden fatigue?
0:24:03.649,0:24:09.450
L: And you're referring to this sort of drip-feed[br]way of releasing those documents...
0:24:09.450,0:24:12.869
Q: Not necessarily criticizing the way it[br]was released, but there was a hell of a lot
0:24:12.869,0:24:15.060
of content and a lot of people got bored of[br]it.
0:24:15.060,0:24:19.899
L: Right, okay. mumbling
0:24:19.899,0:24:24.219
A: There's a bit of that I think probably[br]that we see
0:24:24.219,0:24:29.710
and The Guardian at some point stopped their[br]coverage or releasing more information
0:24:29.710,0:24:34.669
and then we saw more information coming[br]out through other sources, The Intercept and
0:24:34.669,0:24:36.669
so on.
0:24:36.669,0:24:44.320
But I think what we are focusing on or what[br]we saw in media coverage particularly,
0:24:44.320,0:24:48.690
were some deficiencies I think in the media[br]coverage,
0:24:48.690,0:24:54.429
and we would create this link mainly between[br]the lack of knowledge
0:24:54.429,0:24:57.580
and the deficiencies in the media coverage[br]per se.
0:24:57.580,0:25:06.340
Not necessarily in The Guardian, but probably[br]most other media organizations and other newspapers.
0:25:08.220,0:25:12.289
L: I think there's different views on that[br]because a lot of people feel like it's stayed
0:25:12.289,0:25:13.219
in the public debate
0:25:13.219,0:25:18.129
or in the public realm, because there was a[br]continuation of revelations that came after
0:25:18.129,0:25:18.389
each other,
0:25:18.389,0:25:22.529
rather than just doing this data dump thing[br]and you know just doing everything in one
0:25:22.529,0:25:23.200
go.
0:25:23.200,0:25:27.629
So I think we would probably have been able[br]to say the same thing if it was done differently
0:25:27.629,0:25:28.330
as well.
0:25:29.900,0:25:31.950
Angel: There is a question from the internet.
0:25:31.950,0:25:38.710
Q: Yes. Ifup is asking: as far as he or she[br]understood, the people were not informed very
0:25:38.710,0:25:41.469
well on what really was revealed.
0:25:41.469,0:25:45.729
Wouldn't it have been the task of the media[br]to inform them?
0:25:45.729,0:25:48.509
And how could they have done better?
0:25:48.509,0:25:55.769
L: This seems to be a rhetorical question[br]in that they didn't... yes
0:25:55.769,0:25:59.280
A: Well yes, they should have.
0:25:59.280,0:26:04.849
Ideally we would think that it is the task[br]of the media to inform,
0:26:04.849,0:26:11.179
we saw that some media did inform, others[br]did do pretty much the opposite.
0:26:11.179,0:26:13.320
Then there's the question how to improve that.
0:26:13.320,0:26:17.049
And what is the role of different types of[br]media and alternative media
0:26:17.049,0:26:21.899
and what does need to change structurally[br]in forms of mainstream media?
0:26:21.899,0:26:23.109
But that is a big debate.
0:26:23.939,0:26:28.719
L: And we should also say that we've done[br]interviews with journalists, asking questions
0:26:28.719,0:26:32.489
as to why they covered this the way that they[br]did.
0:26:32.489,0:26:36.139
And hopefully those interviews will reveal[br]something more,
0:26:36.139,0:26:38.210
but those are still ongoing.
0:26:38.210,0:26:43.200
But we've had for example James Ball from[br]The Guardian who came to our conference in
0:26:43.200,0:26:43.739
June,
0:26:43.739,0:26:47.229
and talked about some of the structural problems[br]with a couple of journalists who cover security
0:26:47.229,0:26:48.409
issues.
0:26:48.409,0:26:54.369
And there are quite a lot of obstacles for[br]them to do that in a critical and investigatory
0:26:54.369,0:26:54.700
way.
0:26:54.700,0:26:58.719
So I think those are the issues that we want[br]to explore when we find out more through these
0:26:58.719,0:26:59.859
interviews.
0:27:00.599,0:27:04.019
Angel: We have time for one last question,[br]please make it short.
0:27:06.989,0:27:10.219
Q: Hello. That's better.
0:27:10.219,0:27:12.909
I'm not surprised to be honest,
0:27:12.909,0:27:18.009
we have seen a similar thing by John Oliver,[br]so Last Week Tonight, I can only recommend
0:27:18.009,0:27:19.889
that scene.
0:27:19.889,0:27:23.309
So the question is only about what we talk[br]about,
0:27:23.309,0:27:25.219
so can everybody relate to that?
0:27:25.219,0:27:28.049
I have just one question about the first slides[br]you have shown,
0:27:28.049,0:27:31.049
the numbers: What do they reveal?
0:27:33.689,0:27:34.919
A: Numbers?
0:27:34.919,0:27:39.499
Q: In your first slides there were all of[br]those bar charts with kind of numbers and
0:27:39.499,0:27:40.879
I was interested in those numbers.
0:27:40.879,0:27:42.700
A: Okay.
0:27:42.700,0:27:45.789
Q: I guess occurrences.
0:27:45.789,0:27:49.700
A: Yes, so at the beginning we showed the[br]time line of...
0:27:49.700,0:27:51.619
L: Numbers of mumbling
0:27:51.619,0:28:02.639
A: Ah yes. These were the dates of the publication[br]and that is the volume of publication
0:28:02.639,0:28:05.440
again: Looking at the press in this case,
0:28:05.440,0:28:08.389
looking at not just The Guardian, but all[br]kinds of other newspapers.
0:28:08.389,0:28:12.379
That's one part of the research and there[br]will be another part of the research that
0:28:12.379,0:28:15.239
you will find information about this on the[br]website,
0:28:15.239,0:28:20.229
which is about broadcasting, which is about[br]TV and radio coverage.
0:28:20.229,0:28:24.210
But so far what we saw is that there is a[br]fairly similar picture
0:28:24.210,0:28:26.330
in terms of how these curves developed,
0:28:26.330,0:28:30.039
and also in terms of the content of the coverage.
0:28:31.419,0:28:33.049
Angel: I'd say time is up.
0:28:33.049,0:28:36.309
Thank you very much Lina Dencik and Arne Hintz[br]for your talk!
0:28:36.309,0:28:37.569
applause
0:28:37.569,0:28:41.579
music
0:28:41.579,0:28:48.000
subtitles created by c3subtitles.de[br]Join, and help us!