WEBVTT
00:00:00.440 --> 00:00:10.299
music
00:00:10.299 --> 00:00:12.679
Let's start. Welcome!
00:00:12.679 --> 00:00:16.880
More than two years ago, Edward Snowden's
files became public.
00:00:16.880 --> 00:00:18.410
They went public
00:00:18.410 --> 00:00:20.910
and the media went crazy.
00:00:20.910 --> 00:00:23.160
And the public maybe not so much,
00:00:23.160 --> 00:00:26.250
as you may have noticed amongst your friends
and family,
00:00:26.250 --> 00:00:28.000
as I did as well.
00:00:28.000 --> 00:00:34.530
A lot remains the same after Snowden's revelations,
00:00:34.530 --> 00:00:38.039
even if people are concerned about surveillance.
00:00:38.039 --> 00:00:44.780
The following talk by Arne Hintz and Lina
Dencik from University of Cardiff explores
00:00:44.780 --> 00:00:46.019
just that.
00:00:46.019 --> 00:00:54.739
They analyzed how the media actually reacted
to the revelations made by Edward Snowden
00:00:54.739 --> 00:00:56.550
and they also looked at how the public,
00:00:56.550 --> 00:01:03.100
such as journalists, activists and other people,
reacted to Edward Snowden's disclosures.
00:01:03.100 --> 00:01:09.520
So please give a warm round of applause to
Arne Hintz and Lina Dencik. Thank you!
00:01:09.520 --> 00:01:17.790
applause
00:01:17.790 --> 00:01:21.910
Arne: Thank you very much, there are still
a few free seats over there.
00:01:21.910 --> 00:01:25.290
Hello everybody, my name is Arne Hintz, this
is Lina Dencik.
00:01:25.290 --> 00:01:31.620
We are both from Cardiff University, from
the School of Journalism, Media and Cultural Studies,
00:01:31.620 --> 00:01:34.380
so not from the tech department.
00:01:34.380 --> 00:01:38.960
We want to talk about some of the results
of a research project
00:01:38.960 --> 00:01:42.400
that we've been working on
00:01:42.400 --> 00:01:45.470
for a bit more than a year
00:01:45.470 --> 00:01:50.300
and it's called "Digital Citizenship and Surveillance
Society: UK State-Media-Citizen Relations
00:01:50.300 --> 00:01:51.800
after the Snowden Leaks",
00:01:51.800 --> 00:01:56.980
and it's about the implications of the Snowden
leaks in four areas:
00:01:56.980 --> 00:02:01.440
News media, civil society, policy and technology
00:02:01.440 --> 00:02:05.430
and here what we want to do is present just
a few findings from that project
00:02:05.430 --> 00:02:11.230
and focus on two areas, the news media part
and the civil society part.
00:02:11.230 --> 00:02:16.780
It's all focused on the UK, the country where
Cardiff University is located
00:02:16.780 --> 00:02:22.510
so there won't be a lot of international comparisons,
not a lot about Germany and so on,
00:02:22.510 --> 00:02:29.470
but I think at the end we can maybe
draw some comparisons ourselves here in this room.
00:02:32.730 --> 00:02:38.890
So this has been the project basically, the
title as you see it over there.
00:02:38.890 --> 00:02:43.190
The news media part has basically asked how
the British media represented the Snowden
00:02:43.190 --> 00:02:45.190
leaks and digital surveillance.
00:02:45.190 --> 00:02:51.400
The civil society part is about questions such as:
What is the nature of public knowledge with
00:02:51.400 --> 00:02:52.740
regards to digital surveillance?
00:02:52.740 --> 00:02:56.130
Are everyday communication practices changing?
00:02:56.130 --> 00:03:01.040
And how are activists affected by the revelations
of mass surveillance?
00:03:01.040 --> 00:03:04.560
The policy part is still ongoing, it's still
being developed
00:03:04.560 --> 00:03:08.700
and it's about the current policy and regulatory
framework of digital surveillance
00:03:08.700 --> 00:03:12.890
and reform proposals and current reforms that
are taking place.
00:03:12.890 --> 00:03:17.599
And the technology part is about the technological
infrastructure of surveillance
00:03:17.599 --> 00:03:22.150
and technological possibilities of counter-surveillance
and resistance.
00:03:22.150 --> 00:03:27.970
And then we want to bring all this together
and ask: How does that re-define what we may
00:03:27.970 --> 00:03:30.610
understand as digital citizenship?
00:03:30.610 --> 00:03:34.080
The research team includes a number of people
from Cardiff University
00:03:34.080 --> 00:03:40.260
including us, including other lecturers, professors,
staff members of Cardiff University
00:03:40.260 --> 00:03:44.750
and a few research assistants and research
associates that we employed for this,
00:03:44.750 --> 00:03:53.760
plus a couple of guys from Oxford and one
from Briar, a tech development project.
00:03:53.760 --> 00:03:59.099
We also have an advisory board with some colleagues
from academia
00:03:59.099 --> 00:04:03.970
but also representatives of digital rights
organisations, such as Open Rights Group,
00:04:03.970 --> 00:04:05.690
Privacy International and others.
00:04:05.690 --> 00:04:11.770
We have a project website, where you can learn
more about the project, about the background
00:04:11.770 --> 00:04:13.920
and also some preliminary findings.
00:04:13.920 --> 00:04:20.220
We also had a conference earlier this year,
in June, maybe some of you were there.
00:04:20.220 --> 00:04:25.150
It was in Cardiff with some interesting speakers
at the conference
00:04:25.150 --> 00:04:29.810
and also combined the academic and the practical
part a little bit.
00:04:29.810 --> 00:04:34.960
So. A few glimpses of the results in these
two areas that I mentioned.
00:04:34.960 --> 00:04:42.080
So for the media research part we were interested
in studying how the British news media have
00:04:42.080 --> 00:04:46.639
represented the Snowden leaks and also digital
surveillance more broadly.
00:04:46.639 --> 00:04:54.630
And so we asked: How are debates over surveillance
constructed? What are the angles and opinions?
00:04:54.630 --> 00:04:57.040
What are the usual sources? And so on.
00:04:57.040 --> 00:05:02.460
We need to start on an anecdotal basis.
00:05:02.460 --> 00:05:07.840
Some examples of media coverage that emerged
very quickly after the Snowden revelations,
00:05:07.840 --> 00:05:12.630
again in the UK press, which show different
types of coverage.
00:05:12.630 --> 00:05:17.820
So we probably all know that the Guardian
was very instrumental in the revelations
00:05:17.820 --> 00:05:25.030
and provided a lot of information, really
took this role of the fourth estate and of
00:05:25.030 --> 00:05:27.169
investigative journalism quite seriously.
00:05:27.169 --> 00:05:34.000
On the other hand, other newspapers like this
one were very critical about the Snowden revelations
00:05:34.000 --> 00:05:38.729
and also about the Guardian for informing
people about these and running with these revelations.
00:05:40.169 --> 00:05:44.639
And then there were others like this one,
that was a famous example.
00:05:44.639 --> 00:05:52.300
The former editor of the Independent, actually
another liberal, middle-ground, not really
00:05:52.300 --> 00:05:56.350
left but at least not ultra-conservative newspaper.
00:05:56.350 --> 00:06:00.430
He said: "Edward Snowden's secrets may be
dangerous, I would not have published them".
00:06:00.430 --> 00:06:06.180
Okay, one can debate that, but then he says "if
MI5 warns that this is not in the public interest,
00:06:06.180 --> 00:06:08.650
who am I to disbelieve them?".
00:06:08.650 --> 00:06:10.600
laughing
00:06:10.600 --> 00:06:12.550
That's an interesting understanding of journalism
00:06:12.550 --> 00:06:16.810
and it was later retracted, it was debated
quite a lot.
00:06:16.810 --> 00:06:28.150
But we see that this caution towards
publishing something like this has been quite
00:06:28.150 --> 00:06:28.949
widespread.
00:06:28.949 --> 00:06:31.270
So what did we do?
00:06:31.270 --> 00:06:38.310
Here's a timeline of Snowden and surveillance
related coverage in the press in this case
00:06:38.310 --> 00:06:39.540
in the UK.
00:06:39.540 --> 00:06:44.240
And we looked at five case studies, five moments
of coverage.
00:06:44.240 --> 00:06:47.540
The first were the initial revelations of
Snowden.
00:06:47.540 --> 00:06:53.139
The second was the interception of communications
in foreign embassies and European Union offices
00:06:53.139 --> 00:06:58.430
and spying on world leaders' phone communications,
such as Angela Merkel's for example.
00:06:58.430 --> 00:07:02.620
The third was the detention of Glenn Greenwald's
partner David Miranda at Heathrow Airport
00:07:02.620 --> 00:07:04.600
under anti-terror legislation.
00:07:04.600 --> 00:07:11.030
Which raised debates around freedom of the
press and national security.
00:07:11.030 --> 00:07:15.310
Then we looked at the parliamentary report
into the death of Lee Rigby.
00:07:15.310 --> 00:07:20.810
Which was a case that was described as a terrorist
attack on a British soldier on the streets
00:07:20.810 --> 00:07:22.500
of London.
00:07:22.500 --> 00:07:28.150
And it led to debates around social media
companies' role in tackling terrorism.
00:07:28.150 --> 00:07:30.370
And then finally the Charlie Hebdo attacks
in Paris,
00:07:30.370 --> 00:07:35.270
which prompted debates around digital encryption,
freedom of speech and the resurrection of
00:07:35.270 --> 00:07:40.180
the so-called Snooper's Charter in the UK,
00:07:40.180 --> 00:07:45.080
the legislation around surveillance.
00:07:45.080 --> 00:07:49.620
So a few results:
00:07:49.620 --> 00:07:54.539
Snowden was clearly prominent in the media
coverage, and generally was covered using
00:07:54.539 --> 00:07:56.930
mostly neutral or even positive language,
00:07:56.930 --> 00:08:00.729
described as a whistleblower as we see
here at the bottom.
00:08:00.729 --> 00:08:04.919
But if we look at the focus on issues around
surveillance taken in the stories
00:08:04.919 --> 00:08:13.360
and so at the context of the coverage
of surveillance,
00:08:13.360 --> 00:08:18.020
as we can see there, it's probably a little
bit small to read,
00:08:18.020 --> 00:08:22.479
But the most important has to do
with themes of terrorism,
00:08:22.479 --> 00:08:27.259
with themes of the role of security agencies
and government response.
00:08:27.259 --> 00:08:30.548
So that's been very much the context
00:08:30.548 --> 00:08:33.708
in most media coverage of discussing
the Snowden revelations
00:08:33.708 --> 00:08:35.208
and surveillance more broadly.
00:08:35.208 --> 00:08:40.580
And that is in stark contrast to discussing
surveillance in terms of human rights, personal
00:08:40.580 --> 00:08:43.049
privacy and freedom of the press.
00:08:43.049 --> 00:08:49.920
In other words: rights-based and citizen-based perspectives on surveillance.
00:08:49.920 --> 00:08:55.040
If we look at who was used as the sources
in these stories, we see a pattern that is
00:08:55.040 --> 00:08:58.800
actually quite typical in media sourcing generally.
00:08:58.800 --> 00:09:02.520
Politicians are by far the most prominent
source.
00:09:02.520 --> 00:09:05.810
And that is not unusual at all.
00:09:05.810 --> 00:09:12.000
But in this case it means that elite concerns
around surveillance are most prominent, not
00:09:12.000 --> 00:09:13.540
citizen concerns.
00:09:13.540 --> 00:09:19.290
Political sources are framing the debate and
how it is interpreted.
00:09:19.290 --> 00:09:25.649
And so unsurprisingly then the opinions raised
by these sources are for example, as we see
00:09:25.649 --> 00:09:28.990
there, that surveillance should be increased
00:09:28.990 --> 00:09:33.950
or at least is necessary, at least has to
be maintained.
00:09:33.950 --> 00:09:38.290
That the Snowden leaks have compromised the
work of intelligence services
00:09:38.290 --> 00:09:42.870
and that social media companies should do
more to fight terror and to increase their
00:09:42.870 --> 00:09:44.470
own surveillance.
00:09:44.470 --> 00:09:48.839
And so this dominant framework understands
surveillance as a valuable activity,
00:09:48.839 --> 00:09:55.380
and one for which both intelligence services
and business actors have a responsibility.
00:09:55.380 --> 00:09:59.830
Rather than it being primarily problematic
for citizens.
00:09:59.830 --> 00:10:05.290
And where it is presented as problematic,
in the snooping on world leaders case study,
00:10:05.290 --> 00:10:10.209
surveillance was seen as damaging to international
relations and therefore problematic.
00:10:10.209 --> 00:10:15.399
And that's something that is primarily of
relevance to big players rather than ordinary
00:10:15.399 --> 00:10:16.170
citizens.
00:10:16.170 --> 00:10:20.709
So from these short glimpses, what we can
see, just a few preliminary conclusions,
00:10:20.709 --> 00:10:27.089
is that yes, there was extensive and often
positive reporting on Snowden himself, in
00:10:27.089 --> 00:10:28.360
some media at least.
00:10:28.360 --> 00:10:32.970
But debates around surveillance are framed
by elites, rather than citizens
00:10:32.970 --> 00:10:38.610
and this elite-centered structure of news
coverage means that the consequences and the
00:10:38.610 --> 00:10:42.600
extent particularly of mass surveillance of
citizens
00:10:42.600 --> 00:10:44.610
are largely invisible in media coverage.
00:10:44.610 --> 00:10:48.450
There's a strong framing on national security
and so on,
00:10:48.450 --> 00:10:53.640
but there is quite insufficient information
on the practices and implications of surveillance
00:10:53.640 --> 00:10:55.980
for normal citizens.
00:10:55.980 --> 00:11:01.399
And so the issues of mass surveillance that
were actually so central in Snowden's revelations,
00:11:01.399 --> 00:11:04.149
remain relatively invisible in these debates,
00:11:04.149 --> 00:11:09.050
apart from perhaps the Guardian coverage.
00:11:09.050 --> 00:11:16.260
And so we could say that media justify and
normalize current surveillance practices,
00:11:16.260 --> 00:11:23.220
and that discussions about individual rights
and human security are structurally discouraged.
00:11:23.220 --> 00:11:24.170
That is the media part.
00:11:25.670 --> 00:11:29.620
Lina: So I'll just go briefly through some
of our key findings for what we call the civil
00:11:29.620 --> 00:11:31.450
society work stream on this.
00:11:31.450 --> 00:11:36.910
Which looks at two aspects, so there is the
public knowledge and attitudes on the Snowden
00:11:36.910 --> 00:11:38.450
leaks and digital surveillance.
00:11:38.450 --> 00:11:42.350
And then there's the second part which is
particularly to do with responses amongst
00:11:42.350 --> 00:11:43.899
political activists.
00:11:43.899 --> 00:11:48.720
And for the first part, the public opinion
research, we did a number of focus groups across
00:11:48.720 --> 00:11:49.899
different demographics in the UK,
00:11:49.899 --> 00:11:53.339
in order to get a diverse range of
opinions and views.
00:11:53.339 --> 00:11:59.180
So that ranges from sort of high income people
working in the financial centre to local young
00:11:59.180 --> 00:12:03.120
Muslim groups within Cardiff itself.
00:12:03.120 --> 00:12:05.959
So a different range and different groups
of people.
00:12:05.959 --> 00:12:11.589
And then for the research on the activist
responses we did a number of interviews with
00:12:11.589 --> 00:12:13.550
different groups and organisations,
00:12:13.550 --> 00:12:16.420
from large NGOs to smaller community groups.
00:12:16.420 --> 00:12:21.100
Ranging from environmental groups, labour
activists, anti-war activists like "Stop the
00:12:21.100 --> 00:12:21.450
War",
00:12:21.450 --> 00:12:24.990
economic justice groups like "Global Justice
Now", and community
00:12:24.990 --> 00:12:30.420
and civil liberties groups such as "CAGE",
who spoke earlier today.
00:12:30.420 --> 00:12:31.720
And we talked with them.
00:12:31.720 --> 00:12:36.390
So these were particularly groups that weren't
digital rights activists or tech activists
00:12:36.390 --> 00:12:36.870
specifically,
00:12:36.870 --> 00:12:41.649
to try and get an understanding of how other
political activists view this issue in particular
00:12:41.649 --> 00:12:42.860
in response to the Snowden leaks.
00:12:42.860 --> 00:12:48.930
So with the first bit on public opinion in
our focus groups we had a range of themes.
00:12:48.930 --> 00:12:51.800
Understanding and experiences of surveillance,
00:12:51.800 --> 00:12:54.510
knowledge and opinions on Snowden leaks,
00:12:54.510 --> 00:12:56.540
concerns with privacy and personal data,
00:12:56.540 --> 00:12:58.920
questions around online behaviour and practices
00:12:58.920 --> 00:13:02.470
and attitudes towards intelligence services.
00:13:02.470 --> 00:13:06.700
So just a couple of key points from these
focus groups:
00:13:06.700 --> 00:13:11.350
First of all there was particularly low knowledge
of who Edward Snowden was,
00:13:11.350 --> 00:13:15.940
and even less knowledge of what the content
of the leaks was.
00:13:15.940 --> 00:13:21.450
And there was a lot of confusion in discussions
with Julian Assange, Chelsea Manning and Wikileaks
00:13:21.450 --> 00:13:21.750
really,
00:13:21.750 --> 00:13:24.160
in terms of how people had come across this
story.
00:13:24.160 --> 00:13:30.350
And there was a lot of mix-up between those
different stories.
00:13:30.350 --> 00:13:36.320
In terms of actual understandings of surveillance,
state surveillance isn't really
00:13:36.320 --> 00:13:38.570
isolated in how people speak about it.
00:13:38.570 --> 00:13:43.149
It overlaps also with questions of corporate
surveillance and also peer surveillance or
00:13:43.149 --> 00:13:44.670
employer surveillance and so forth.
00:13:44.670 --> 00:13:49.029
So a lot of concerns are not necessarily about
state surveillance per se and it's difficult
00:13:49.029 --> 00:13:52.350
to isolate this as a particular issue.
00:13:52.350 --> 00:13:57.139
And also when it comes to what constitutes
surveillance,
00:13:57.139 --> 00:14:01.540
the initial responses would be things like
CCTV and sort of these types of things were
00:14:01.540 --> 00:14:04.690
seen as more kind of real forms of surveillance.
00:14:04.690 --> 00:14:08.860
But on the other hand it was very clear that
people felt that the collection of data
00:14:08.860 --> 00:14:12.410
and also the collection of metadata, so even
when it's
00:14:12.410 --> 00:14:15.089
not about content, constitutes surveillance.
00:14:15.089 --> 00:14:21.070
So that was generally how people felt about
what surveillance actually means.
00:14:21.070 --> 00:14:27.480
In terms then of concerns around this, people's
worries about state surveillance in particular
00:14:27.480 --> 00:14:30.399
predominantly concern the lack of transparency
around it.
00:14:30.399 --> 00:14:36.339
So a lack of transparency around what is being
collected, but also how it's being used and
00:14:36.339 --> 00:14:37.190
what it's being used for,
00:14:37.190 --> 00:14:42.720
and also what the regulatory framework is
that's in place surrounding it.
00:14:42.720 --> 00:14:46.660
And also concerns over the lack of knowledge
or understanding of how to actually opt out,
00:14:46.660 --> 00:14:50.970
or resist or circumvent collection of data.
00:14:50.970 --> 00:14:55.209
And in terms of sort of changes in online
behaviour then,
00:14:55.209 --> 00:14:58.430
these concerns do manifest themselves in some
changes, but it's mainly in terms of sort
00:14:58.430 --> 00:15:00.279
of self-regulating behaviour,
00:15:00.279 --> 00:15:03.760
not saying things that are too controversial
online and so forth,
00:15:03.760 --> 00:15:09.880
rather than actually changes in using different
tools or different communication platforms,
00:15:09.880 --> 00:15:13.860
which wasn't prominent at all in our focus
groups.
00:15:13.860 --> 00:15:17.589
And what we also saw as sort of implications
of this is that there was sort of an internalising
00:15:17.589 --> 00:15:18.970
of some of these justifications
00:15:18.970 --> 00:15:22.540
that have been very prominent also in the
media, particularly this phrase: "nothing
00:15:22.540 --> 00:15:24.880
to hide, nothing to fear".
00:15:24.880 --> 00:15:31.339
Although in this case there were clear
differences between the different demographic
00:15:31.339 --> 00:15:32.699
groups that we spoke with.
00:15:32.699 --> 00:15:35.670
Meaning that some people were more comfortable
saying this phrase "nothing to hide, nothing
00:15:35.670 --> 00:15:36.769
to fear",
00:15:36.769 --> 00:15:40.910
whereas for example when we spoke to local
Muslim groups they problematised this position
00:15:40.910 --> 00:15:41.480
much more.
00:15:41.480 --> 00:15:44.589
So there is definitely variation here in terms
of that,
00:15:44.589 --> 00:15:48.970
but there is a sense in which some of
these justifications have been internalized.
00:15:48.970 --> 00:15:52.519
And actually what we've seen, what we phrase
as a kind of surveillance realism,
00:15:52.519 --> 00:15:56.760
is that surveillance has become normalized
to such an extent that
00:15:56.760 --> 00:16:01.000
it is difficult for people to really understand
or imagine a society in which surveillance
00:16:01.000 --> 00:16:03.240
doesn't take place.
00:16:03.240 --> 00:16:08.170
Which might also relate to some of these questions
around a lack of understanding of how to actually
00:16:08.170 --> 00:16:10.949
resist this or opt out from this.
00:16:10.949 --> 00:16:16.279
So i think a key point that we wanted to make
with our research with these focus groups,
00:16:16.279 --> 00:16:16.540
is
00:16:16.540 --> 00:16:20.910
that we need to distinguish here between
public consent and public resignation, when
00:16:20.910 --> 00:16:23.260
we talk about attitudes towards surveillance,
00:16:23.260 --> 00:16:26.459
meaning that people haven't necessarily
consented to this going on
00:16:26.459 --> 00:16:31.800
but have actually resigned themselves to the fact that
this is how society is being organised.
00:16:31.800 --> 00:16:35.870
To then move on to interviews with activists.
00:16:35.870 --> 00:16:38.110
We also had similar questions here,
00:16:38.110 --> 00:16:40.170
so understanding and experiences of surveillance,
00:16:40.170 --> 00:16:44.180
and knowledge and opinions of Snowden leaks
and attitudes towards state surveillance.
00:16:44.180 --> 00:16:48.930
And then we also wanted to explore this question
around current online behaviour and practices
00:16:48.930 --> 00:16:53.600
and whether there had been any changes and
responses to the Snowden leaks.
00:16:53.600 --> 00:16:57.820
And again just some key findings here on these
questions:
00:16:57.820 --> 00:17:03.500
So basically the activists that we spoke with
were generally very aware of surveillance,
00:17:03.500 --> 00:17:07.589
but again it was visible and physical forms
of surveillance that were more prominent in
00:17:07.589 --> 00:17:09.419
how activists spoke about it.
00:17:09.419 --> 00:17:14.209
And this is perhaps particularly
a UK context,
00:17:14.209 --> 00:17:18.618
because there is a very troublesome history
in the UK with police infiltration into activist
00:17:18.618 --> 00:17:18.868
groups,
00:17:18.868 --> 00:17:22.849
which has really impacted the activist scene
quite a lot within the UK.
00:17:22.849 --> 00:17:26.989
And often this was how the activists we spoke
with would talk about surveillance first and
00:17:26.989 --> 00:17:27.799
foremost,
00:17:27.799 --> 00:17:33.850
rather than about these more virtual
and invisible forms of surveillance.
00:17:33.850 --> 00:17:39.659
And also perhaps linked to that, despite
this general awareness and widespread experiences
00:17:39.659 --> 00:17:40.600
of surveillance,
00:17:40.600 --> 00:17:44.619
the activists we spoke with didn't know a
great deal of detail about the Snowden leaks
00:17:44.619 --> 00:17:45.519
particularly.
00:17:45.519 --> 00:17:50.649
And again there was this confusion with Chelsea
Manning and Wikileaks.
00:17:50.649 --> 00:17:56.249
And importantly also there was a sort of general
expectation, which some of these quotes highlight:
00:17:56.249 --> 00:17:57.049
00:17:57.049 --> 00:18:02.369
that state surveillance goes on, this is sort
of expected.
00:18:02.369 --> 00:18:05.210
And it's confirmed for activists when police
are often there,
00:18:05.210 --> 00:18:07.960
when they've organized events or protests
and demonstrations,
00:18:07.960 --> 00:18:10.899
or when activities have been intercepted.
00:18:10.899 --> 00:18:14.759
And so the Snowden leaks in themselves and
the realities of mass surveillance
00:18:14.759 --> 00:18:19.169
came as little surprise to the political activists
in the UK.
00:18:19.169 --> 00:18:24.059
And perhaps that is also one reason why
there hasn't been much response from the groups
00:18:24.059 --> 00:18:24.899
we spoke with anyway,
00:18:24.899 --> 00:18:27.149
in terms of changing online behaviour.
00:18:27.149 --> 00:18:30.549
Particularly not directly because of Snowden.
00:18:30.549 --> 00:18:31.499
And there are some exceptions here,
00:18:31.499 --> 00:18:34.899
so for example Greenpeace did really change
their communication behaviour
00:18:34.899 --> 00:18:37.029
as a direct response to the Snowden leaks.
00:18:37.029 --> 00:18:41.019
And CAGE, I think, as we heard earlier had recently
also changed communication practices,
00:18:41.019 --> 00:18:43.019
although at the time of our interview with
them
00:18:43.019 --> 00:18:47.440
they hadn't done as much as they're doing
now.
00:18:47.440 --> 00:18:50.679
Predominantly however there has been very
little change in online behaviour,
00:18:50.679 --> 00:18:55.679
and where it has taken place it's been part
of a sort of longer term consciousness of
00:18:55.679 --> 00:18:57.220
surveillance.
00:18:57.220 --> 00:19:02.350
And the kind of changes we have seen more
are things like face to face interaction.
00:19:02.350 --> 00:19:08.929
So more face to face interaction, perhaps
slightly more careful online communication.
00:19:08.929 --> 00:19:12.299
But in terms of encryption:
00:19:12.299 --> 00:19:18.919
We found little use of encryption, again with
exceptions among some of the groups,
00:19:18.919 --> 00:19:22.139
but partly this was due to questions of convenience,
00:19:22.139 --> 00:19:24.460
and a perceived lack of technical ability.
00:19:24.460 --> 00:19:28.399
Which I think are arguments that we're quite
familiar with, when it comes to questions around
00:19:28.399 --> 00:19:28.830
this.
00:19:28.830 --> 00:19:33.049
But it was also related to a particular kind
of rationale that was expressed in some of
00:19:33.049 --> 00:19:34.499
the interviews that we did,
00:19:34.499 --> 00:19:40.859
that somehow using encrypted software is about
being hidden or closed in some ways,
00:19:40.859 --> 00:19:45.629
whereas activists strive for open and transparent
organisations.
00:19:45.629 --> 00:19:51.129
So that somehow contradicts this aim to be
transparent and open and inclusive.
00:19:51.129 --> 00:19:57.159
That starting to use encrypted communication
somehow also excludes people.
00:19:57.159 --> 00:20:00.330
And linked to that also many of the activists
we spoke with expressed the notion
00:20:00.330 --> 00:20:05.869
that their activities and their role in society
didn't constitute a need to really worry about
00:20:05.869 --> 00:20:07.049
surveillance.
00:20:07.049 --> 00:20:10.759
So despite being aware of surveillance and
expecting it to go on,
00:20:10.759 --> 00:20:13.450
there was a sense in which some of the organisations
here
00:20:13.450 --> 00:20:15.570
perceived themselves as fairly mainstream,
00:20:15.570 --> 00:20:17.119
and therefore kind of safe.
00:20:17.119 --> 00:20:19.989
And didn't really need to worry about surveillance.
00:20:19.989 --> 00:20:23.299
And that surveillance would only really
need to be something to worry about,
00:20:23.299 --> 00:20:29.299
if they moved into more radical forms of politics
and action,
00:20:29.299 --> 00:20:31.599
whatever that might be.
00:20:31.599 --> 00:20:35.539
So in some ways we might think of this as
acting to somewhat keep the mainstream
00:20:35.539 --> 00:20:35.950
in check,
00:20:35.950 --> 00:20:40.070
in that surveillance becomes
a variable only if you do certain kinds of
00:20:40.070 --> 00:20:42.369
actions.
00:20:42.369 --> 00:20:46.509
So therefore also, questions around digital
rights, and advocacy work around policies
rights and advocacy work around policies,
00:20:49.179 --> 00:20:52.649
and policy around privacy and so forth,
00:20:52.649 --> 00:20:56.950
wasn't something that the activists we spoke
with, most of them anyway,
00:20:56.950 --> 00:21:01.470
didn't see that as something that directly
featured on their agenda.
00:21:01.470 --> 00:21:04.690
So it wasn't really something that they were
so concerned with themselves,
00:21:04.690 --> 00:21:09.710
but rather that type of activism is kind of
outsourced to other groups like digital rights
00:21:09.710 --> 00:21:11.479
activists or tech activists.
00:21:11.479 --> 00:21:15.659
That that's what they do, we are doing something
else.
00:21:15.659 --> 00:21:19.970
So I think what we sort of want to suggest
with that is that our research seems anyway
00:21:19.970 --> 00:21:20.580
to suggest,
00:21:20.580 --> 00:21:24.639
that there are some limitations around resistance
to surveillance,
00:21:24.639 --> 00:21:29.989
in that this resistance seems to remain within
the silos of only certain types of actors.
00:21:29.989 --> 00:21:35.559
So we're sort of asking: How can we then move
beyond that?
00:21:35.559 --> 00:21:39.820
And start thinking of surveillance in terms
of perhaps data justice,
00:21:39.820 --> 00:21:45.059
or somehow thinking of how surveillance connects
or resistance to surveillance connects
00:21:45.059 --> 00:21:48.460
to broader social and economic justice agendas.
00:21:48.460 --> 00:21:50.849
And of course some of this is already happening,
00:21:50.849 --> 00:21:53.460
and some of it has been discussed here at
this congress.
00:21:53.460 --> 00:21:57.179
So for example how does data collection lead
to discrimination?
00:21:57.179 --> 00:21:59.859
Or how does it come to suppress dissent?
00:21:59.859 --> 00:22:04.789
But also how does surveillance relate to working
conditions and workers' rights for example,
00:22:04.789 --> 00:22:08.889
or how does it link to inequality and poverty?
00:22:08.889 --> 00:22:11.409
So I suppose our research suggests that we
need to think about this:
00:22:11.409 --> 00:22:15.720
if encryption and technical solutions
and discussions around digital rights such
00:22:15.720 --> 00:22:16.749
as privacy
00:22:16.749 --> 00:22:21.710
remain really within certain circles and perhaps
events like this and so forth,
00:22:21.710 --> 00:22:27.349
how can we get it to resonate with a broader
public in some ways?
00:22:27.349 --> 00:22:29.460
So — wow, we finished much faster than we
thought we would.
00:22:29.460 --> 00:22:35.299
But anyway. So basically we've had a snapshot
now of sort of recent public debate,
00:22:35.299 --> 00:22:40.249
and sort of one that suggests that we might
need to think about how to connect concerns
00:22:40.249 --> 00:22:41.789
with surveillance,
00:22:41.789 --> 00:22:47.379
that are discussed in places like this to
other issues in order to resonate with a broader
00:22:47.379 --> 00:22:48.629
public.
00:22:48.629 --> 00:22:50.169
And that's it, we have time for questions
00:22:50.169 --> 00:23:00.339
applause
00:23:00.339 --> 00:23:05.590
A: Ask questions, make comments, or share additional
information about some other projects.
00:23:05.590 --> 00:23:10.320
Angel: Please line up at the microphones, so you
can speak your questions clearly into the
00:23:10.320 --> 00:23:13.190
microphone, please.
00:23:13.190 --> 00:23:16.759
The microphone in the back, please.
00:23:20.589 --> 00:23:21.449
Go ahead.
00:23:21.449 --> 00:23:28.129
Question: Hey. So do you think this lack of
technical understanding of the Snowden leaks
00:23:28.129 --> 00:23:34.539
might be due to Snowden fatigue, that is people
getting really tired of reading a Snowden
00:23:34.539 --> 00:23:35.320
article?
00:23:35.320 --> 00:23:38.859
And another one and another one. Do you think you might have contributed to it?
00:23:38.859 --> 00:23:41.869
Angel: Can you maybe repeat the question?
00:23:41.869 --> 00:23:45.639
And if you leave the room, please do so quietly,
00:23:45.639 --> 00:23:47.519
because we can't understand his question.
00:23:47.519 --> 00:23:56.109
Q: Sorry. So the question is: This lack of understanding of the content of the Snowden leaks, maybe
00:23:56.109 --> 00:23:58.320
on a basic technical level,
00:23:58.320 --> 00:24:03.649
could something that contributed to that
be Snowden fatigue?
00:24:03.649 --> 00:24:09.450
L: And you're referring to this sort of drip-feed
way of releasing those documents...
00:24:09.450 --> 00:24:12.869
Q: Not necessarily criticizing the way it
was released, but there was a hell of a lot
00:24:12.869 --> 00:24:15.060
of content and a lot of people got bored of
it.
00:24:15.060 --> 00:24:19.899
L: Right. okay. mumbling
00:24:19.899 --> 00:24:24.219
A: There's probably a bit of that, I think,
that we see
00:24:24.219 --> 00:24:29.710
and The Guardian at some point stopped their
coverage or releasing more information
00:24:29.710 --> 00:24:34.669
and then we saw more information coming
out through other sources, The Intercept and
00:24:34.669 --> 00:24:36.669
so on.
00:24:36.669 --> 00:24:44.320
But I think what we are focusing on or what
we saw in media coverage particularly,
00:24:44.320 --> 00:24:48.690
were some deficiencies I think in the media
coverage,
00:24:48.690 --> 00:24:54.429
and we would create this link mainly between
the lack of knowledge
00:24:54.429 --> 00:24:57.580
and the deficiencies in the media coverage
per se.
00:24:57.580 --> 00:25:06.340
Not necessarily in The Guardian, but probably
most other media organizations and other newspapers.
00:25:08.220 --> 00:25:12.289
L: I think there's different views on that
because a lot of people feel like it's stayed
00:25:12.289 --> 00:25:13.219
in the public debate
00:25:13.219 --> 00:25:18.129
or in the public realm, because there was a
continuation of revelations that came after
00:25:18.129 --> 00:25:18.389
each other,
00:25:18.389 --> 00:25:22.529
rather than just doing this data dump thing
and you know just doing everything in one
00:25:22.529 --> 00:25:23.200
go.
00:25:23.200 --> 00:25:27.629
So I think we would probably have been able
to say the same thing if it was done differently
00:25:27.629 --> 00:25:28.330
as well.
00:25:29.900 --> 00:25:31.950
Angel: There is a question from the internet.
00:25:31.950 --> 00:25:38.710
Q: Yes. Ifup is asking: as far as he or she
understood, the people were not informed very
00:25:38.710 --> 00:25:41.469
well on what really was revealed.
00:25:41.469 --> 00:25:45.729
Wouldn't it have been the task of the media
to inform them?
00:25:45.729 --> 00:25:48.509
And how could they have done better?
00:25:48.509 --> 00:25:55.769
L: This seems to be a rhetorical question
in that they didn't... yes
00:25:55.769 --> 00:25:59.280
A: Well yes, they should have.
00:25:59.280 --> 00:26:04.849
Ideally we would think that it is the task
of the media to inform,
00:26:04.849 --> 00:26:11.179
we saw that some media did inform, others
did pretty much the opposite.
00:26:11.179 --> 00:26:13.320
Then there's the question of how to improve that.
00:26:13.320 --> 00:26:17.049
And what is the role of different types of
media and alternative media
00:26:17.049 --> 00:26:21.899
and what needs to change structurally
in mainstream media?
00:26:21.899 --> 00:26:23.109
But that is a big debate.
00:26:23.939 --> 00:26:28.719
L: And we should also say that we've done
interviews with journalists, asking questions
00:26:28.719 --> 00:26:32.489
as to why they covered this the way that they
did.
00:26:32.489 --> 00:26:36.139
And hopefully those interviews will reveal
something more,
00:26:36.139 --> 00:26:38.210
but those are still ongoing.
00:26:38.210 --> 00:26:43.200
But we've had for example James Ball from
The Guardian who came to our conference in
00:26:43.200 --> 00:26:43.739
June,
00:26:43.739 --> 00:26:47.229
and talked about some of the structural problems,
together with a couple of journalists who cover security
00:26:47.229 --> 00:26:48.409
issues.
00:26:48.409 --> 00:26:54.369
And there are quite a lot of obstacles
for them to do that in a critical and investigative
00:26:54.369 --> 00:26:54.700
way.
00:26:54.700 --> 00:26:58.719
So I think those are the issues that we want
to explore when we find out more through
00:26:58.719 --> 00:26:59.859
those interviews.
00:27:00.599 --> 00:27:04.019
Angel: We have time for one last question,
please make it short
00:27:06.989 --> 00:27:10.219
Q: Hello. That's better
00:27:10.219 --> 00:27:12.909
I'm not surprised to be honest,
00:27:12.909 --> 00:27:18.009
we have seen a similar thing by John Oliver,
on Last Week Tonight, I can only recommend
00:27:18.009 --> 00:27:19.889
that scene.
00:27:19.889 --> 00:27:23.309
So the question is only about what we talk
about,
00:27:23.309 --> 00:27:25.219
so can everybody relate to that?
00:27:25.219 --> 00:27:28.049
I have just one question about the first slides
you have shown:
00:27:28.049 --> 00:27:31.049
the numbers: What do they reveal?
00:27:33.689 --> 00:27:34.919
A: Numbers?
00:27:34.919 --> 00:27:39.499
Q: In your first slides there were all of
those bar charts with kind of numbers and
00:27:39.499 --> 00:27:40.879
I was interested in those numbers.
00:27:40.879 --> 00:27:42.700
A: Okay.
00:27:42.700 --> 00:27:45.789
Q: I guess occurrences.
00:27:45.789 --> 00:27:49.700
A: Yes, so at the beginning we showed the
timeline of...
00:27:49.700 --> 00:27:51.619
L: Numbers of mumbling
00:27:51.619 --> 00:28:02.639
A: Ah yes. These were the dates of the publication
and that is the volume of publication
00:28:02.639 --> 00:28:05.440
again: Looking at the press in this case,
00:28:05.440 --> 00:28:08.389
looking at not just The Guardian, but all
kinds of other newspapers.
00:28:08.389 --> 00:28:12.379
That's one part of the research, and there
will be another part of the research
00:28:12.379 --> 00:28:15.239
(you will find information about it on the
website),
00:28:15.239 --> 00:28:20.229
which is about broadcasting, which is about
TV and radio coverage.
00:28:20.229 --> 00:28:24.210
But so far what we saw is that there is a
fairly similar picture
00:28:24.210 --> 00:28:26.330
in terms of how these curves developed,
00:28:26.330 --> 00:28:30.039
and also in terms of the content of the coverage.
00:28:31.419 --> 00:28:33.049
Angel: I'd say time is up.
00:28:33.049 --> 00:28:36.309
Thank you very much Lina Dencik and Arne Hintz
for your talk!
00:28:36.309 --> 00:28:37.569
applause
00:28:37.569 --> 00:28:41.579
music
00:28:41.579 --> 00:28:48.000
subtitles created by c3subtitles.de
Join, and help us!