1
00:00:00,000 --> 00:00:14,488
34C3 preroll music
2
00:00:14,488 --> 00:00:19,939
Herald angel: Today two people from Privacy
International, one is Eva Blum-Dumontet
3
00:00:19,939 --> 00:00:25,349
she's a research officer working on data
exploitation especially in the global
4
00:00:25,349 --> 00:00:34,750
south and Millie Wood who's a lawyer and
is fighting against spy agencies and
5
00:00:34,750 --> 00:00:41,070
before that she spent seven years fighting
police cases and they're gonna be talking
6
00:00:41,070 --> 00:00:46,340
about policing in the age of data
exploitation. Give them a warm welcome.
7
00:00:46,340 --> 00:00:55,242
Applause
8
00:00:55,242 --> 00:00:58,440
Millie Wood: Hi I'm Millie as was just said I've been
9
00:00:58,440 --> 00:01:02,440
at Privacy International for two years
working as a lawyer before that I spent
10
00:01:02,440 --> 00:01:08,320
seven years bringing cases against the
police and what increasingly concerns me
11
00:01:08,320 --> 00:01:14,130
based on these experiences is a lack of
understanding of what tactics are being
12
00:01:14,130 --> 00:01:21,000
used by the police today and what legal
basis they are doing this on. The lack of
13
00:01:21,000 --> 00:01:26,780
transparency undermines the ability of
activists lawyers and technologists to
14
00:01:26,780 --> 00:01:31,479
challenge the police tactics and whilst
I'm sure a lot of you have a broad
15
00:01:31,479 --> 00:01:36,990
awareness of the technology that the
police can use I don't think this is
16
00:01:36,990 --> 00:01:43,390
enough and we need to know what specific
police forces are using against
17
00:01:43,390 --> 00:01:50,479
individuals. The reason why is that when
you're arrested you need to know what
18
00:01:50,479 --> 00:01:56,810
disclosure to ask for in order to prove
your innocence. Your lawyers need to know
19
00:01:56,810 --> 00:02:03,010
what expert evidence to ask for in order
to defend their client. And increasingly
20
00:02:03,010 --> 00:02:08,949
as there are invisible ways, or seemingly
invisible, for the police to monitor at scale
21
00:02:08,949 --> 00:02:14,010
we need to know that there are effective
legal safeguards. Now those who are
22
00:02:14,010 --> 00:02:20,720
affected are not just the guilty or those
who understand technology they include
23
00:02:20,720 --> 00:02:29,730
pensioners such as John Catt, a 90-year-old
man who's a peace protester and he's a
24
00:02:29,730 --> 00:02:36,260
law-abiding citizen no criminal record and
yet he is on the UK domestic extremism
25
00:02:36,260 --> 00:02:42,980
database and listed here are some of the
entries: He took his sketchpad and made
26
00:02:42,980 --> 00:02:50,220
drawings, he's clean shaven, and he was
holding a board with orange people on it.
27
00:02:50,220 --> 00:02:56,020
So this is the kind of people that they
are surveilling. John's case exposes
28
00:02:56,020 --> 00:03:03,800
unlawful actions by the police, but these
actions date back to 2005 to 2009. As far
29
00:03:03,800 --> 00:03:10,170
as I'm aware there are no cases
challenging modern police tactics and
30
00:03:10,170 --> 00:03:14,879
Privacy International in the UK and with
our partners throughout the world are
31
00:03:14,879 --> 00:03:20,520
increasingly concerned at the pace this is
developing unobstructed because people
32
00:03:20,520 --> 00:03:28,480
don't know what's going on, and so we've
started in the UK to try and uncover some
33
00:03:28,480 --> 00:03:34,180
of the police tactics using Freedom of
Information requests. These laws should be
34
00:03:34,180 --> 00:03:39,480
available throughout Europe and we want to
make similar requests in other countries
35
00:03:39,480 --> 00:03:44,450
hopefully with some of you. So now I'm
going to hand over to my colleague Eva who
36
00:03:44,450 --> 00:03:47,860
will talk a bit about Privacy
International, some of the tactics we know
37
00:03:47,860 --> 00:03:52,030
the police are using, and then we'll speak
about some of the things that we found out
38
00:03:52,030 --> 00:03:54,570
through our initial research.
39
00:03:54,570 --> 00:03:59,530
Applause
40
00:03:59,530 --> 00:04:02,919
Eva Blum-Dumontet: Thank you. So, I'm just going
to tell you a little bit more about Privacy
41
00:04:02,919 --> 00:04:07,150
International for those of you who don't
know this organization. We are based in
42
00:04:07,150 --> 00:04:11,470
London and we fight against surveillance
and defend the right to privacy across the
43
00:04:11,470 --> 00:04:15,519
world. Essentially, what we're
doing is litigation, we conduct
44
00:04:15,519 --> 00:04:21,350
research, and we carry out advocacy
including at the United Nations, we
45
00:04:21,350 --> 00:04:26,830
develop policies on issues that are
defining modern rights. Now, our work
46
00:04:26,830 --> 00:04:30,900
ranges from litigation against
intelligence services to a wide range of
47
00:04:30,900 --> 00:04:36,880
reports on issues such as connected cars,
smart cities, and FinTech. We've recently
48
00:04:36,880 --> 00:04:41,610
published an investigation on the role of
companies like Cambridge Analytica and
49
00:04:41,610 --> 00:04:47,990
Harris Media in the latest
Kenyan elections. With our network of
50
00:04:47,990 --> 00:04:52,471
partner organisations across the world we
advocate for stronger privacy protection
51
00:04:52,471 --> 00:04:59,161
in the law and technology and stronger
safeguards against surveillance. Now we
52
00:04:59,161 --> 00:05:04,080
talk about data exploitation and it's
actually the title of the talk so what do
53
00:05:04,080 --> 00:05:10,380
we mean by that? The concept of data
exploitation emerges from our concerns
54
00:05:10,380 --> 00:05:15,720
that the industry and governments are
building a world that prioritizes the
55
00:05:15,720 --> 00:05:22,650
exploitation of all data. We observe three
prevailing trends in data exploitation.
56
00:05:22,650 --> 00:05:28,000
One is the excessive data that's generated
beyond our control. The second one is the
57
00:05:28,000 --> 00:05:34,139
fact that this data is processed in a way
we cannot understand or influence and the
58
00:05:34,139 --> 00:05:39,530
lack of transparency around it. The last
one is, that at the moment this data is
59
00:05:39,530 --> 00:05:44,690
used to disadvantage us the ones who are
producing this data and it's further
60
00:05:44,690 --> 00:05:51,270
empowering the already powerful. We hardly
control the data anymore that's generated
61
00:05:51,270 --> 00:05:55,290
from our phones or computers, but in the
world we live in now, data doesn't
62
00:05:55,290 --> 00:06:00,130
come just from our phones or computers. It
comes from the cars we're driving, it
63
00:06:00,130 --> 00:06:05,970
comes from our payment systems, from the
cities we live in. This is all generating
64
00:06:05,970 --> 00:06:12,770
data and this data is used by other
entities to make assumptions about us and
65
00:06:12,770 --> 00:06:18,450
take decisions that eventually influence
our lives. Are we entitled to a loan? Do
66
00:06:18,450 --> 00:06:25,060
we qualify for affordable insurance?
Should we be sent to jail or set free? Who
67
00:06:25,060 --> 00:06:31,130
should be arrested? This is at the core of
the world that we're building around data
68
00:06:31,130 --> 00:06:37,630
exploitation. The question of power
imbalance between those who have the data
69
00:06:37,630 --> 00:06:42,490
and who gets to make decisions based on
this data and those who are producing the
70
00:06:42,490 --> 00:06:50,180
data and losing control over it. Now what
does policing have to do with data, what
71
00:06:50,180 --> 00:06:57,020
does data exploitation have to do with
policing? The police have always been
72
00:06:57,020 --> 00:07:04,620
using data. To give
you one example in 1980 a transit police
73
00:07:04,620 --> 00:07:10,530
officer named Jack Maple developed a
project called Charts of the Future, this
74
00:07:10,530 --> 00:07:16,479
is how he described it: "I call them the
Charts of the Future. On 55 feet of wall
75
00:07:16,479 --> 00:07:20,740
space, I mapped every train station in New
York City and every train. Then I used
76
00:07:20,740 --> 00:07:25,340
crayons to mark every violent crime,
robbery, and grand larceny that occurred.
77
00:07:25,340 --> 00:07:33,250
I mapped the solved versus the unsolved".
Now the system was used by the Transit
78
00:07:33,250 --> 00:07:41,110
Police and it was credited with reducing
felonies by 27% and robberies by 1/3
79
00:07:41,110 --> 00:07:50,280
between 1990 and 1992. So this generated a
lot of interest in his project and former
80
00:07:50,280 --> 00:07:56,039
New York Mayor Rudolph Giuliani asked the
New York police department to essentially
81
00:07:56,039 --> 00:08:02,479
take up Charts of the Future and develop
their own project. It became CompStat.
82
00:08:02,479 --> 00:08:10,360
CompStat was again essentially about
mapping crime to try and make assumptions
83
00:08:10,360 --> 00:08:19,360
about where crime was happening. So
this kind of shows the building of this
84
00:08:19,360 --> 00:08:25,570
narrative around this idea that the more
data you have, the more data you generate,
85
00:08:25,570 --> 00:08:31,780
the better you will be at reducing crime.
Now it becomes interesting in the world we
86
00:08:31,780 --> 00:08:36,379
live in that we describe, where we are
constantly generating data, often without
87
00:08:36,379 --> 00:08:42,059
the consent or even the knowledge of those
who are producing this data. So there are
88
00:08:42,059 --> 00:08:48,339
new questions to be asked: What data is
the police entitled to access? What can
89
00:08:48,339 --> 00:08:54,490
they do with it? Are we all becoming
suspects by default? One of the key
90
00:08:54,490 --> 00:09:00,449
elements of the intersection between data
exploitation and policing is the question
91
00:09:00,449 --> 00:09:06,119
of smart cities. It's worth bearing in
mind that data-driven policing is often
92
00:09:06,119 --> 00:09:12,029
referred to as smart policing, so obviously
the word smart has been used in
93
00:09:12,029 --> 00:09:17,699
a generic manner by various industries to
kind of describe this trend of using mass
94
00:09:17,699 --> 00:09:26,689
data collection in order to provide new
services. But there is actually a real and
95
00:09:26,689 --> 00:09:34,670
genuine connection between smart cities
and data-driven policing. The first reason
96
00:09:34,670 --> 00:09:43,709
for that is that one of the main
reasons for cities to invest in smart city
97
00:09:43,709 --> 00:09:48,910
infrastructure is the question of
security. This is something we've explored
98
00:09:48,910 --> 00:09:54,320
in our latest report on smart cities and
this is emerging also from the work we're
99
00:09:54,320 --> 00:10:00,890
doing with other organizations including
Coding Rights in Brazil and DRF in Pakistan. So
100
00:10:00,890 --> 00:10:06,009
actually Brazil is an interesting example,
because before the mega events they
101
00:10:06,009 --> 00:10:10,350
organized, like the football World Cup
and the Olympics, they invested
102
00:10:10,350 --> 00:10:16,850
massively in smart city infrastructure,
including projects with IBM. Precisely
103
00:10:16,850 --> 00:10:20,250
the purpose of what they were trying to
achieve with their smart city
104
00:10:20,250 --> 00:10:25,850
infrastructure was making the city safer,
so it was very strongly connected
105
00:10:25,850 --> 00:10:32,420
with the police. So this is a picture for
example of the control room that
106
00:10:32,420 --> 00:10:39,109
was built to control CCTV cameras and to
create graphs in order to showcase where
107
00:10:39,109 --> 00:10:45,860
crime was happening and also in a way the
likelihood of natural disasters in some
108
00:10:45,860 --> 00:10:51,649
areas. In Pakistan there is a whole new
program of investment in smart cities,
109
00:10:51,649 --> 00:10:58,799
which is actually referred to as the Safe
City project. Now companies understand
110
00:10:58,799 --> 00:11:05,249
that very well and this is actually an
image from an IBM presentation describing
111
00:11:05,249 --> 00:11:11,189
their vision of smart cities. And as you
see, policing is very much
112
00:11:11,189 --> 00:11:16,790
integrated into their vision, their
heavily centralized vision of what smart
113
00:11:16,790 --> 00:11:22,829
cities are. So it's no wonder that
companies that offer smart city
114
00:11:22,829 --> 00:11:28,379
infrastructure are actually now also
offering a platform for policing. So those
115
00:11:28,379 --> 00:11:34,820
companies include IBM as I mentioned but
also Oracle and Microsoft. We see in many
116
00:11:34,820 --> 00:11:39,600
countries, including the UK where we're
based, pressure on budgets and budget
117
00:11:39,600 --> 00:11:44,379
reductions for the police and so there is
a very strong appeal with this narrative,
118
00:11:44,379 --> 00:11:51,120
that you can purchase a platform, you can
gather more data that will help you do
119
00:11:51,120 --> 00:11:58,109
policing in less time and do it more
efficiently. But little thought is given
120
00:11:58,109 --> 00:12:03,230
to the impact on society, our right to
privacy and what happens if someone
121
00:12:03,230 --> 00:12:13,439
unexpected takes the reins of power. Now
we're gonna briefly explain what data-
122
00:12:13,439 --> 00:12:20,499
driven policing looks like, and eventually
Millie will look at our findings. So
123
00:12:20,499 --> 00:12:26,339
the first thing I wanted to discuss is
actually predictive policing, because
124
00:12:26,339 --> 00:12:30,740
that's often something we think of and
talk about when we think about data-
125
00:12:30,740 --> 00:12:37,759
driven policing. I mentioned CompStat
before and essentially predictive policing
126
00:12:37,759 --> 00:12:43,319
works on a similar premise. The idea is
that if you map where crime happens you
127
00:12:43,319 --> 00:12:50,859
can eventually guess where the next crime
will happen. So the key player in
128
00:12:50,859 --> 00:12:54,989
predictive policing is a company called
PredPol, I mean I think they describe
129
00:12:54,989 --> 00:12:58,230
pretty much what they do, they use
artificial intelligence to help you
130
00:12:58,230 --> 00:13:06,249
prevent crime, right, predicting when and
where crime will most likely occur. Now
131
00:13:06,249 --> 00:13:10,929
PredPol and other companies are using
something called a Hawkes process that's
132
00:13:10,929 --> 00:13:17,019
used normally for the prediction of
earthquake tremors, so what Hawkes
133
00:13:17,019 --> 00:13:23,269
originally did is that he was analyzing
how after an earthquake you have
134
00:23:23,269 --> 00:23:28,660
aftershocks, and usually the aftershocks tend
to happen where the original earthquake
135
00:13:28,660 --> 00:13:35,940
happened and in a short period of time
after that. So the Hawkes process basically
136
00:13:35,940 --> 00:13:40,910
is described as when a certain event
happens, other events of the same kind will
137
00:13:40,910 --> 00:13:45,470
happen shortly after in the same
location. Now obviously it actually
138
00:13:45,470 --> 00:13:50,790
works quite well for earthquakes, whether
it works for crime is a lot more
139
00:13:50,790 --> 00:13:56,290
questionable. But that's actually the
premise on which companies that
140
00:13:56,290 --> 00:14:02,119
are offering predictive policing services
are relying. So basically applied to
141
00:14:02,119 --> 00:14:08,730
predictive policing, the mantra is: by
monitoring data on places where crime is
142
00:14:08,730 --> 00:14:13,309
happening you can identify geographic
hotspots where crime will most likely
143
00:14:13,309 --> 00:14:20,819
happen again. Now other companies than
PredPol are joining in and they are adding
144
00:14:20,819 --> 00:14:26,259
more data than simply the location of
past crimes. So this data has included
145
00:14:26,259 --> 00:14:30,629
open source intelligence and we'll talk a
little more about this later on.
146
00:14:30,629 --> 00:14:35,699
Weather reports, census data, the location
of key landmarks like bars, churches,
147
00:14:35,699 --> 00:14:40,089
schools, dates of sporting events, and moon
phases. I'm not quite sure what they're
148
00:14:40,089 --> 00:14:50,209
doing with moon phases but somehow that's
something they're using. When predictive
149
00:14:50,209 --> 00:14:56,179
policing first sort of emerged, one of
the key concerns was whether our world was
150
00:14:56,179 --> 00:15:00,999
going to be turning into a Minority Report
kind of scenario where people are arrested
151
00:15:00,999 --> 00:15:05,490
before a crime is even committed and
companies like PredPol were quick to
152
00:15:05,490 --> 00:15:10,199
reassure people, saying that they're not
concerned about who will commit crime but
153
00:15:10,199 --> 00:15:15,800
where crimes are happening. Now that's not
actually true because in fact at the
154
00:15:15,800 --> 00:15:21,100
moment we see several programs emerging
especially in the US, where police
155
00:15:21,100 --> 00:15:25,509
departments are concerned not so much with
where crimes are happening, but who's
156
00:15:25,509 --> 00:15:30,920
committing it. So I'm gonna talk about two
examples of this: One is the Kansas City No
157
00:15:30,920 --> 00:15:37,850
Violence Alliance, which is a program led
by the local police to identify who will
158
00:15:37,850 --> 00:15:42,579
become the next criminal - basically - and
they're using an algorithm that combines
159
00:15:42,579 --> 00:15:48,189
data from traditional policing as well as
social media intelligence and information
160
00:15:48,189 --> 00:15:53,569
that they have on drug use. Based on this
they create graphs generated using
161
00:15:53,569 --> 00:16:01,609
predictive policing to show how certain
people are connected to already convicted
162
00:16:01,609 --> 00:16:06,169
criminals and gang members. Once they've
identified these people they request
163
00:16:06,169 --> 00:16:11,479
meetings with them, whether they've
committed crimes or not in the past. And
164
00:16:11,479 --> 00:16:16,420
they would have a discussion about their
connection to those convicted criminals
165
00:16:16,420 --> 00:16:21,910
and gang members, and they are warned
166
00:16:21,910 --> 00:16:27,109
that if a crime happens next within their
network of people, every person connected
to this
167
00:16:27,109 --> 00:16:33,319
network will be arrested whether or not
they were actually involved in the crime
168
00:16:33,319 --> 00:16:38,379
being committed. Now there are actually
dozens of police departments that are
169
00:16:38,379 --> 00:16:46,100
using similar programs. The Chicago Police
Department has an index of the 400 people
170
00:16:46,100 --> 00:16:50,359
most likely to be involved in violent
crimes. That sounds like a BuzzFeed
171
00:16:50,359 --> 00:16:56,389
article, but the reality behind it is
extremely concerning, because
172
00:16:56,389 --> 00:17:02,069
those people who are in this list are for
the most part not actual criminals, they
173
00:17:02,069 --> 00:17:08,019
are purely seen to be connected to people
who've committed crimes. So if your next-
174
00:17:08,019 --> 00:17:16,679
door neighbor is a criminal then you may
well find your name on that list. Now
175
00:17:16,679 --> 00:17:21,480
predictive policing is deceptive and
problematic for several reasons: First of
176
00:17:21,480 --> 00:17:26,519
all there's the question of the
presumption of innocence. In a world where
177
00:17:26,519 --> 00:17:32,519
even before you commit a crime you can
find your name on that list or be called
178
00:17:32,519 --> 00:17:37,899
by the police - you know - what happens to
this very basis of democracy which is the
179
00:17:37,899 --> 00:17:42,529
presumption of innocence. But also
there's the other question of like can we
180
00:17:42,529 --> 00:17:47,720
really use the math that was originally
designed for earthquakes and apply it to
181
00:17:47,720 --> 00:17:53,049
human beings because human beings don't
work like earthquakes. They have their own
182
00:17:53,049 --> 00:17:59,870
set of biases and the biases
start with how we collect the data. For
183
00:17:59,870 --> 00:18:07,640
example, if the police are more likely to
police areas where there are minorities,
184
00:18:07,640 --> 00:18:11,769
people of color, then obviously the data
they will have will be disproportionately
185
00:18:11,769 --> 00:18:18,490
higher on persons of color. Likewise if
they are unlikely to investigate white-
186
00:18:18,490 --> 00:18:24,200
collar crime they will be unlikely to have
data that are reflecting a reality where
187
00:18:24,200 --> 00:18:29,040
crime also happens in wealthier areas. So
basically we are inputting biased datasets
188
00:18:29,040 --> 00:18:35,030
that obviously will lead to biased
results. And what these biased results
189
00:18:35,030 --> 00:18:41,600
mean is that it will continue the already
existing trend of over-policing
190
00:18:41,600 --> 00:18:48,440
communities of color and low-income
communities. I'll leave it to Millie for
191
00:18:48,440 --> 00:18:55,667
the next part. Millie Wood: So, one of the increasingly
popular technologies we're seeing in the
192
00:18:55,667 --> 00:19:00,586
UK, and is no doubt used around the world
and probably at border points, although we
193
00:19:00,586 --> 00:19:06,450
need more help with the research to prove
this, is mobile phone extraction. The
194
00:19:06,450 --> 00:19:10,680
police can extract data from your phone,
your laptop, and other devices which
195
00:19:10,680 --> 00:19:16,431
results in a memory dump of the extracted
data taken from your device and now held
196
00:19:16,431 --> 00:19:23,331
in an agency database. So for example all
your photos, all your messages, and all
197
00:19:23,331 --> 00:19:28,330
those of people who had no idea they would
end up in a police database because
198
00:19:28,330 --> 00:19:34,549
they're associated with you, retained for
as long as the police wish. Now these
199
00:19:34,549 --> 00:19:38,600
devices are pretty user friendly for the
police and if you're interested you can
200
00:19:38,600 --> 00:19:42,559
look on YouTube, where Cellebrite, one of
the big players, has lots of videos about
201
00:19:42,559 --> 00:19:48,929
how you can use them, and so depending on
the device and the operating system some
202
00:19:48,929 --> 00:19:54,419
of the data - this is from a police document
listing what they can extract using a
203
00:19:54,419 --> 00:20:01,820
Cellebrite UFED - is what you might expect:
device information, calls, messages,
204
00:20:01,820 --> 00:20:08,970
emails, social media, and Wi-Fi networks.
But if you look at their website and here
205
00:20:08,970 --> 00:20:14,750
are a few examples they can also collect:
system and deleted data, they can access
206
00:20:14,750 --> 00:20:20,580
cloud storage, and inaccessible partitions
of the device. Now this is data that is
207
00:20:20,580 --> 00:20:26,490
clearly beyond the average user's control,
and as the volume of data we hold on our
208
00:20:26,490 --> 00:20:31,749
phones increases so will this list. And
the companies we know the UK police are
209
00:20:31,749 --> 00:20:39,059
using, which includes: Cellebrite, Acceso,
Radio Tactics, MSAB, are all aware of how
210
00:20:39,059 --> 00:20:44,750
valuable this is and as one of them has
stated: "if you've got access to a person's
211
00:20:44,750 --> 00:20:50,500
SIM card, you've got access to the whole
of a person's life". They also go on to
212
00:20:50,500 --> 00:20:56,070
note: "the sheer amount of data stored on
mobile phones is significantly greater
213
00:20:56,070 --> 00:21:04,149
today than ever before." There are also no
temporal limits to the extraction of data,
214
00:21:04,149 --> 00:21:09,149
this is from another police document we
obtained and it shows that if you choose
215
00:21:09,149 --> 00:21:16,159
to extract a certain data type you will
obtain all data of a particular type, not
216
00:21:16,159 --> 00:21:21,280
just the data relevant to an
investigation. So all that data sits on a
217
00:21:21,280 --> 00:21:28,429
police database indefinitely. And even if
you were asked whether you were happy for
218
00:21:28,429 --> 00:21:32,789
your data to be extracted during an
investigation I think it's highly unlikely
219
00:21:32,789 --> 00:21:37,630
you would realize the volume that the
police were going to take. Other targets
220
00:21:37,630 --> 00:21:44,179
for the police that we know about are:
infotainment systems in cars, Smart TVs,
221
00:21:44,179 --> 00:21:51,230
and connected devices in the home. This is
an extract from a tech UK report, where
222
00:21:51,230 --> 00:21:56,700
Mark Stokes, head of digital forensics at
the Met Police, that is the police in London,
223
00:21:56,700 --> 00:22:03,200
stated in January, that the crime scene of
tomorrow will be the Internet of Things
224
00:22:03,200 --> 00:22:08,450
and detectives of the future will carry a
digital forensics toolkit that will help
225
00:22:08,450 --> 00:22:15,020
them analyze microchips and download data
at the scene rather than removing devices
226
00:22:15,020 --> 00:22:20,081
for testing. Now I can imagine that the
evidence storage room is going to get a
227
00:22:20,081 --> 00:22:24,840
bit full if they start dragging in
connected fridges, hair dryers, hair
228
00:22:24,840 --> 00:22:32,570
brushes, your Google Home, Amazon Echo and
whatever else you have. However, their
229
00:22:32,570 --> 00:22:38,240
plans to walk into your home and download
everything, make no mention of needing a
230
00:22:38,240 --> 00:22:43,509
specific warrant and so the only
limitations at the moment are the
231
00:22:43,509 --> 00:22:50,220
protections that may exist on the devices.
The law does not protect us and this needs
232
00:22:50,220 --> 00:22:59,409
to change. So I'm going to be talking a
little bit about open source intelligence
233
00:22:59,409 --> 00:23:05,470
and in particular social media
intelligence, because when I talked about
234
00:23:05,470 --> 00:23:10,830
predictive policing I identified those two
sources as some of the data that's being
235
00:23:10,830 --> 00:23:17,470
used for predictive policing. Now, open
source intelligence is often thought of as,
236
00:23:17,470 --> 00:23:23,409
or assumed to be, innocuous, and
there is the understanding that if
237
00:23:23,409 --> 00:23:29,440
information is publicly available then it
should be fair for the police to use. Now
238
00:23:29,440 --> 00:23:34,270
the problem is that among open source
intelligence there's often social media
239
00:23:34,270 --> 00:23:40,509
intelligence that we refer to as
SOCMINT. Now there are many ways to
240
00:23:40,509 --> 00:23:45,900
conduct SOCMINT and it can range from
like the single police officer, who is
241
00:23:45,900 --> 00:23:54,009
just you know using Facebook or Twitter to
look up the accounts of victims or
242
00:23:54,009 --> 00:23:58,620
suspected criminals, but there are also
companies that are scraping the likes of
243
00:23:58,620 --> 00:24:04,580
Facebook and Twitter to allow the police
to monitor social media. Now social media
244
00:24:04,580 --> 00:24:10,580
has blurred the lines between public
and private, because obviously we are
245
00:24:10,580 --> 00:24:17,909
broadcasting our views on this platform
and at the moment the police has been
246
00:24:17,909 --> 00:24:25,059
exploiting this kind of unique space, this
blurred line; they are accessing this
247
00:24:25,059 --> 00:24:30,809
content in a completely unregulated
manner, as long as the content is publicly
248
00:24:30,809 --> 00:24:37,620
available like for example you don't need
to be friends or to have any already
249
00:24:37,620 --> 00:24:43,470
established connection with the suspected
criminal or the police or the victim
250
00:24:43,470 --> 00:24:48,610
anything that's available to you; it's
completely unregulated, there are no rules
251
00:24:48,610 --> 00:24:56,700
and I mentioned earlier the question of
budget restrictions, and so the police is
252
00:24:56,700 --> 00:25:01,749
benefiting hugely from this because it
doesn't really cost anything to use social
253
00:25:01,749 --> 00:25:07,019
media so at the moment SOCMINT is kind of
like the first and easy step in a police
254
00:25:07,019 --> 00:25:14,470
investigation because there is no cost and
because there is no oversight. Now,
255
00:25:14,470 --> 00:25:19,420
SOCMINT actually isn't so innocent in the
sense that it allows the police to
256
00:25:19,420 --> 00:25:25,519
identify the locations of people based on
their posts, it allows them to establish
257
00:25:25,519 --> 00:25:30,669
people's connections, their relationships,
their associations, it allows the
258
00:25:30,669 --> 00:25:37,380
monitoring of protests and also to identify
the leaders of various movements, and to
259
00:25:37,380 --> 00:25:45,880
measure a person's influence. Now, in the
UK what we know is that the police is
260
00:25:45,880 --> 00:25:52,019
largely using marketing products, so this
is an anonymous quote from a report by
261
00:25:52,019 --> 00:25:58,029
academics that have been doing research on
SOCMINT and what someone said was that: "A
262
00:25:58,029 --> 00:26:01,620
lot of stuff came out of marketing because
marketing were using social media to
263
00:26:01,620 --> 00:26:05,190
understand what people were saying about
their product... We wanted to understand
264
00:26:05,190 --> 00:26:11,549
what people were saying so it's almost
using it in reverse". Now again, this is
265
00:26:11,549 --> 00:26:16,350
not considered a surveillance device,
this is purely a marketing product that
266
00:26:16,350 --> 00:26:23,309
they're using and for that reason law
enforcement agencies and security agencies
267
00:26:23,309 --> 00:26:30,140
are often arguing that SOCMINT has
basically no impact on privacy. But
268
00:26:30,140 --> 00:26:36,640
actually when your post reveals your
location or when the content of your post
269
00:26:36,640 --> 00:26:40,080
reveal what used to be considered and is
still considered actually as sensitive
270
00:26:40,080 --> 00:26:45,090
private information like details about
your sexual life, about your health, about
271
00:26:45,090 --> 00:26:50,120
your politics, can we really minimize the
impact of the police accessing this
272
00:26:50,120 --> 00:26:56,190
information? Now obviously we may not have
a problem with the average Twitter user or
273
00:26:56,190 --> 00:27:00,880
with a friend reading this information but
when the ones who are reading the
274
00:27:00,880 --> 00:27:06,460
information and taking actions on this
information have power over us like the
275
00:27:06,460 --> 00:27:17,717
police does, you know, what does it
actually mean for our right to privacy?
276
00:27:17,717 --> 00:27:26,610
That's not to say that people should stop
using social media but rather what kind of
277
00:27:26,610 --> 00:27:32,960
regulation can we put in place so that
it's not so easy for the police to access.
278
00:27:32,960 --> 00:27:41,720
The absence of regulations on SOCMINT has
actually already led to abuse in two cases
279
00:27:41,720 --> 00:27:48,159
both in the US that we've identified: One
is Raza v. the City of New York which is a
280
00:27:48,159 --> 00:27:55,840
case from the ACLU where it was found
that the city of New York,
281
00:27:55,840 --> 00:28:00,179
sorry, the New York Police Department was
systematically gathering intelligence on
282
00:28:00,179 --> 00:28:04,799
Muslim communities, and one of the ways
they were gathering this intelligence was
283
00:28:04,799 --> 00:28:11,509
essentially by surveilling social media
accounts of Muslims in New York. The
284
00:28:11,509 --> 00:28:17,320
second case involves a company called
ZeroFOX. So what ZeroFOX does is social media
285
00:28:17,320 --> 00:28:23,150
monitoring. Now, during the riots that
followed the funeral of Freddie Gray,
286
00:28:23,150 --> 00:28:30,500
Freddie Gray was a 25-year-old black man
who had died in police custody, so after
287
00:28:30,500 --> 00:28:36,549
his funeral there had been a series of
riots in Baltimore, and ZeroFOX produced a
288
00:28:36,549 --> 00:28:41,360
report that they shared with the Baltimore
Police to essentially advertise for their
289
00:28:41,360 --> 00:28:47,929
social media monitoring tool and
what the company was doing was again like
290
00:28:47,929 --> 00:28:52,970
browsing social media and trying to
establish who were the threat actors in
291
00:28:52,970 --> 00:28:58,659
these riots and among the 19 threat
actors that they identified two of them
292
00:28:58,659 --> 00:29:04,499
were actually leaders of the black lives
matter movement. Actually at least one of
293
00:29:04,499 --> 00:29:09,550
them was a woman definitely not a physical
threat but this is how they were
294
00:29:09,550 --> 00:29:17,570
essentially labeled. So these two examples
actually show that again it's still sort
295
00:29:17,570 --> 00:29:24,240
of the same targets, it's people of
color, it's activists, it's people from
296
00:29:24,240 --> 00:29:30,179
low-income backgrounds, who are singled
out as likely criminals. And it's very
297
00:29:30,179 --> 00:29:34,029
telling when we realize that SOCMINT is
actually one of the sources of data that's
298
00:29:34,029 --> 00:29:38,740
eventually used for predictive policing
and then again predictive policing leading
299
00:29:38,740 --> 00:29:45,409
to people being more surveilled and
potentially exposed to more police
300
00:29:45,409 --> 00:29:51,169
surveillance based on the fact that they
are all singled out as likely criminals. Now
301
00:29:51,169 --> 00:29:56,890
social media is a fascinating place
because it's a mix between a private and a
302
00:29:56,890 --> 00:30:02,210
public space as I said we are broadcasting
our views publicly but then again it's a
303
00:30:02,210 --> 00:30:07,679
privately owned space where we follow the
rules that are set up by private companies.
304
00:30:07,679 --> 00:30:13,779
Now, if we want to protect this space and
ensure that free expression and
305
00:30:13,779 --> 00:30:18,619
political organization can still happen in
these spaces, we need to fully understand how
306
00:30:18,619 --> 00:30:23,460
much the police have been exploiting these
spaces and how we can limit and regulate
307
00:30:23,460 --> 00:30:29,879
the use of it. Now I'll hand over to Millie
to talk about what we can do next. So I'm going to
308
00:30:29,879 --> 00:30:33,460
briefly look at some of our initial
findings we've made using Freedom of
309
00:30:33,460 --> 00:30:39,539
Information requests, broadly: the lack of
awareness by the public, weak legal basis,
310
00:30:39,539 --> 00:30:45,429
and a lack of oversight. Now, sometimes
the lack of awareness appears intentional
311
00:30:45,429 --> 00:30:54,740
- we asked the police about their plans to
extract data from connected devices in the
312
00:30:54,740 --> 00:31:01,679
home and they replied 'neither confirm nor
deny'. Now this is kind of a bizarre
313
00:31:01,679 --> 00:31:06,659
response, given that Mark Stokes, who's a
member of the police, had already said that
314
00:31:06,659 --> 00:31:13,509
they plan to do this. In addition, the UK
government's Home Office replied to us
315
00:31:13,509 --> 00:31:18,269
saying the Home Office plans to develop
skills and capacity to exploit the
316
00:31:18,269 --> 00:31:23,929
Internet of Things as part of criminal
investigations. They also said that police
317
00:31:23,929 --> 00:31:29,920
officers will receive training in relation
to extracting, obtaining, retrieving, data
318
00:31:29,920 --> 00:31:35,399
from or generated by connected devices. So
we wrote back to every police force in the
319
00:31:35,399 --> 00:31:40,970
UK that had refused to reply to us and
presented the evidence but they maintained
320
00:31:40,970 --> 00:31:45,679
their stance so we will be bringing a
challenge against them under the Freedom
321
00:31:45,679 --> 00:31:51,929
of Information Act. Now, Eva has also
identified the huge risks associated with
322
00:31:51,929 --> 00:31:57,769
predictive policing yet in the UK we've
found out this is set to increase with
323
00:31:57,769 --> 00:32:02,070
forces either using commercial tools or
in-house ones they've developed or
324
00:32:02,070 --> 00:32:09,049
planning trials for 2018. There has been
no public consultation, there are no
325
00:32:09,049 --> 00:32:14,279
safeguards, and there is no oversight. So
when we asked them more questions about the
326
00:32:14,279 --> 00:32:21,370
plans, we were told we were 'vexatious' and
they wouldn't respond to more requests, so it
327
00:32:21,370 --> 00:32:27,299
seems like we have yet another challenge.
And what about mobile phone extraction
328
00:32:27,299 --> 00:32:32,570
tools? Here are some of the stats that have
been found out, and I would say these
329
00:32:32,570 --> 00:32:36,821
aren't completely accurate because it
depends on how reliable the police forces
330
00:32:36,821 --> 00:32:42,940
are in responding but roughly I'd say it's
probably more than 93 percent now of UK
331
00:32:42,940 --> 00:32:48,379
police forces throughout the country are
extracting data from digital devices. We
332
00:32:48,379 --> 00:32:53,080
know they plan to increase, we've seen in
their documents they plan to train more
333
00:32:53,080 --> 00:32:58,690
officers, to buy more equipment, and to
see extraction as a standard part of
334
00:32:58,690 --> 00:33:04,009
arrest, even if the devices had absolutely
nothing to do with the offense and so
335
00:33:04,009 --> 00:33:09,769
these figures are likely to increase
exponentially. But in the UK, not only do
336
00:33:09,769 --> 00:33:15,610
the police not need a warrant, in documents
we've read they do not even need to notify
337
00:33:15,610 --> 00:33:21,139
the individual that they have extracted
data, for example, from their mobile phone
338
00:33:21,139 --> 00:33:27,590
or that they're storing it. If this is
being done without people's knowledge how
339
00:33:27,590 --> 00:33:32,220
on earth can people challenge it, how can
they ask for their data to be removed if
340
00:33:32,220 --> 00:33:39,590
they're found innocent? Turning to social
media monitoring which the police refer to
341
00:33:39,590 --> 00:33:44,330
as open source research. This is Jenny
Jones, she's a member of the House of Lords
342
00:33:44,330 --> 00:33:50,730
in the Green Party and next to her photo
is a quote from her entry on the domestic
343
00:33:50,730 --> 00:33:57,249
extremism database, and so, if a member of
the House of Lords is being subject to
344
00:33:57,249 --> 00:34:04,659
social media monitoring for attending a
bike ride then I think it's highly likely
345
00:34:04,659 --> 00:34:08,830
that a large number of people who
legitimately exercise their right to
346
00:34:08,830 --> 00:34:14,429
protest are being subject to social media
monitoring. Now, this hasn't gone
347
00:34:14,429 --> 00:34:20,399
unnoticed completely. Although they're
slightly old, these are quotes from two
348
00:34:20,399 --> 00:34:24,899
officials: the first is the UK independent
reviewer of terrorism, who notes that the
349
00:34:24,899 --> 00:34:29,690
extent of the use of social media
monitoring is not publicly known, and the
350
00:34:29,690 --> 00:34:33,679
second is the chief surveillance
commissioner who, and this is a very
351
00:34:33,679 --> 00:34:38,949
strong statement for a commissioner, is
saying that basically social media should
352
00:34:38,949 --> 00:34:47,649
not be treated as fair game by the police.
So now I'll move on to a weak or outdated
353
00:34:47,649 --> 00:34:52,649
legal basis. For most of the technologies
we've looked at it's very unclear what
354
00:34:52,649 --> 00:34:58,359
legal basis the police are using even when
we've asked them. This relates to mobile
355
00:34:58,359 --> 00:35:03,940
phone extraction - so the legislation
they're relying on is over 30 years old
356
00:35:03,940 --> 00:35:11,310
and is wholly inappropriate for mobile
phone extraction. This law was developed to
357
00:35:11,310 --> 00:35:16,680
deal with standard traditional searches,
the search of a phone can in no way be
358
00:35:16,680 --> 00:35:22,300
equated to the search of a person, or the
search of a house, and despite the fact
359
00:35:22,300 --> 00:35:26,901
that we have repeatedly asked for a
warrant, this is not the case, and we
360
00:35:26,901 --> 00:35:31,270
believe that there should be a warrant in
place not only in the UK but in the rest
361
00:35:31,270 --> 00:35:35,550
of the world. So if you think that either
you or your friends have had their data
362
00:35:35,550 --> 00:35:39,369
extracted when they're arrested or your
phone has been in the possession of the
363
00:35:39,369 --> 00:35:45,650
authorities you should be asking
questions, and very briefly something on
364
00:35:45,650 --> 00:35:52,420
lack of oversight, so we reported in
January this year about documents that
365
00:35:52,420 --> 00:35:58,000
were obtained by The Bristol Cable's
investigation into Cellebrite and one
366
00:35:58,000 --> 00:36:04,020
report said that in half of the cases
sampled the police had
367
00:36:04,020 --> 00:36:10,320
failed to receive authorization internally
for the use of extraction tools. Poor
368
00:36:10,320 --> 00:36:15,809
training undermined investigations into
serious offences such as murder, and
369
00:36:15,809 --> 00:36:20,940
inadequate security practices meant that
encryption was not taking place even when
370
00:36:20,940 --> 00:36:26,849
it was easy to do and they were losing
files containing intimate personal data.
371
00:36:26,849 --> 00:36:33,490
So why does this matter? Here are some key
points: In relation to information
372
00:36:33,490 --> 00:36:37,760
asymmetry - it's clear as Eva has
explained that the police can now access
373
00:36:37,760 --> 00:36:43,670
far more data on our devices than the
average user. In relation to imbalance of
374
00:36:43,670 --> 00:36:47,420
power - it's clear they can collect and
analyze sources that are beyond our
375
00:36:47,420 --> 00:36:54,320
control whether it's publicly placed
sensors, cameras, or other devices. There
376
00:36:54,320 --> 00:36:58,890
is also unequal access and if lawyers
don't know what's being gathered they
377
00:36:58,890 --> 00:37:03,660
don't know what to ask for from the
police. All in all this puts the
378
00:37:03,660 --> 00:37:10,410
individual at a huge disadvantage. Another
impact is the chilling effect on political
379
00:37:10,410 --> 00:37:16,850
expression. Now, I'm sure many of you may
think that the police monitor your social
380
00:37:16,850 --> 00:37:21,859
media but the average person is unlikely
to, and so if they start to know about
381
00:37:21,859 --> 00:37:27,110
this are they going to think twice about
joining in protesting either physically or
382
00:37:27,110 --> 00:37:32,380
using a hashtag, and what about who your
friends are? If they know you attend
383
00:37:32,380 --> 00:37:38,540
protests, are they really going to want to have
their data on your phone if they know that
384
00:37:38,540 --> 00:37:44,460
potentially that could be extracted and
end up on a police database? It's far
385
00:37:44,460 --> 00:37:49,380
easier to be an anonymous face among many
people than a single isolated person
386
00:37:49,380 --> 00:37:55,119
standing up to power but these new forms
of policing we have been discussing
387
00:37:55,119 --> 00:38:00,339
redefine the very act of protesting by
singling out each and every one of us from
388
00:38:00,339 --> 00:38:08,309
the crowd. So, what can we do? Many of you
will be familiar with these technologies,
389
00:38:08,309 --> 00:38:12,720
but do you know how to find out what the
police are doing? In the UK we've been
390
00:38:12,720 --> 00:38:16,610
using Freedom of Information requests, we
want to do this with people throughout
391
00:38:16,610 --> 00:38:21,910
Europe and you don't need to be a lawyer
so please get in touch. We also want to
392
00:38:21,910 --> 00:38:26,660
dig into the technology a bit more, I want
someone to use a Cellebrite UFED on my
393
00:38:26,660 --> 00:38:31,809
phone and show me exactly what can come
out of it, and we want to tell lawyers and
394
00:38:31,809 --> 00:38:37,329
activists about these new techniques. Many
lawyers I speak to who are experts in
395
00:38:37,329 --> 00:38:42,210
actions against the police do not know the
police are using these tools. This means
396
00:38:42,210 --> 00:38:46,700
they don't know the right questions to ask
and so it's fundamental you speak to
397
00:38:46,700 --> 00:38:50,920
people who are bringing these cases and
tell them about what they can do or what
398
00:38:50,920 --> 00:38:56,640
questions they should be asking, and
finally we want you to also raise the
399
00:38:56,640 --> 00:39:18,034
debate, to share our research, and to
critique it, thank you.
400
00:39:18,034 --> 00:39:24,220
Herald: So we've got ample time for
Q&A. Are there any questions in the hall,
401
00:39:24,220 --> 00:39:28,670
yes, there's one over there.
Question: You mentioned the problem of
402
00:39:28,670 --> 00:39:33,110
when they do physical extraction from the
Cellebrite device it's going to get all of
403
00:39:33,110 --> 00:39:37,710
the photos, all of the emails, or whatever
it may be, rather than just what the
404
00:39:37,710 --> 00:39:42,059
investigator needs. What is the solution
to that in your eyes? Is there a
405
00:39:42,059 --> 00:39:45,740
technical one that these companies are
gonna have to implement - which they're
406
00:39:45,740 --> 00:39:51,140
not going to - or a legal one, because on
the other side a mobile phone is a crucial
407
00:39:51,140 --> 00:39:56,890
part in any criminal investigation in
2017. So what's the workaround or the
408
00:39:56,890 --> 00:40:00,020
solution to that?
Answer: I think it's both, I think the
409
00:40:00,020 --> 00:40:04,000
fact that there isn't any law looking at
this and no one's discussing whether there can be
410
00:40:04,000 --> 00:40:08,520
a technical solution or does it need to be
one where there's better regulation and
411
00:40:08,520 --> 00:40:12,660
oversight so you extract everything, can
you keep it for a certain period to see
412
00:40:12,660 --> 00:40:16,859
what's relevant then do you have to delete
it? The trouble is we don't see any
413
00:40:16,859 --> 00:40:22,290
deletion practices and the police have
publicly stated in the media that they can
414
00:40:22,290 --> 00:40:27,280
just keep everything as long as they like.
They like data, you can kind of see why, but
415
00:40:27,280 --> 00:40:31,240
that doesn't mean they should keep
everyone's data indefinitely just in case
416
00:40:31,240 --> 00:40:35,062
it's useful, so I think there may be tech
solutions, there may be legal ones, and I
417
00:40:35,062 --> 00:40:40,510
think perhaps both together is one of
the answers. Herald: The next question
418
00:40:40,510 --> 00:40:45,349
from microphone one please.
Q: I'm just wondering how those laws on
419
00:40:45,349 --> 00:40:50,280
action and power given to the cops are
being sold to the UK people? Is it
420
00:40:50,280 --> 00:40:56,510
to fight terrorism, as I said, or to fight
drugs or this kind of stuff? What's the
421
00:40:56,510 --> 00:41:00,490
argument used by the government to sell
this to the people?
422
00:41:00,490 --> 00:41:05,170
A: I think actually one thing that's
important to bear in mind is that I'm
423
00:41:05,170 --> 00:41:10,630
not sure most of the public in the
UK is even aware of it, so I think unlike
424
00:41:10,630 --> 00:41:15,330
the work of intelligence services and
agencies, where terrorism is used as the
425
00:41:15,330 --> 00:41:22,450
excuse for ever more power and especially
laws that have become increasingly
426
00:41:22,450 --> 00:41:26,130
invasive, actually with policing it doesn't
even fall into that kind of discourse
427
00:41:26,130 --> 00:41:30,980
because it's actually hardly talked about
in the UK. Yeah, and the mobile phone
428
00:41:30,980 --> 00:41:34,880
extraction stuff we've been looking at is
low-level crimes, so that's like you
429
00:41:34,880 --> 00:41:40,750
have, it could be you know a pub fight,
it could be a robbery, which is more
430
00:41:40,750 --> 00:41:45,550
serious, it could be an assault, so they
want to use it in every case. For all the
431
00:41:45,550 --> 00:41:48,170
other techniques, we have no idea what
they're using them for; that's one of the
432
00:41:48,170 --> 00:41:53,599
problems.
Herald: The next question from the
433
00:41:53,599 --> 00:41:57,400
internet please.
Q: When you say that there's a lack of
434
00:41:57,400 --> 00:42:04,460
laws and regulations for police concerning
the extraction of data from devices, are
435
00:42:04,460 --> 00:42:09,790
you talking just about UK and/or USA or do
you have any examples of other countries
436
00:42:09,790 --> 00:42:13,500
who do better or worse?
A: I don't know of any country that has a
437
00:42:13,500 --> 00:42:18,520
regulation on publicly available
information on social media.
438
00:42:18,520 --> 00:42:25,849
Herald: Microphone number four.
Q: Thank you again for a great talk. In
439
00:42:25,849 --> 00:42:31,920
terms of data exploitation an element that
I didn't hear you talk about that I'd like
440
00:42:31,920 --> 00:42:35,940
to hear a little bit more is when there
are questions around who is doing the
441
00:42:35,940 --> 00:42:40,410
exploitation, I know in the U.S. some FOIA
researchers get around how difficult it is
442
00:42:40,410 --> 00:42:44,640
to get data from the feds by going after
local and state police departments, is
443
00:42:44,640 --> 00:42:48,450
that something that you're doing or do you
have a way of addressing confusion when
444
00:42:48,450 --> 00:42:50,880
people don't know what agency has the
data?
445
00:42:50,880 --> 00:42:56,580
A: Yeah, I think actually one of the
things the data exploitation program at
446
00:42:56,580 --> 00:43:00,330
Privacy International is doing is actually
looking into the connection between the
447
00:43:00,330 --> 00:43:06,050
private sector and governments because
obviously at the moment there's the whole
448
00:43:06,050 --> 00:43:09,950
question of data brokers which is an
industry that's hardly regulated at all,
449
00:43:09,950 --> 00:43:14,130
that people don't necessarily know about,
and the companies that are doing it
450
00:43:14,130 --> 00:43:19,900
are not familiar household names. I'll let
Millie talk a lot more about the
451
00:43:19,900 --> 00:43:24,920
government aspects of it. I guess the
question is again a country-by-country
452
00:43:24,920 --> 00:43:29,470
basis, we work in many countries that
don't have any data protection regulations
453
00:43:29,470 --> 00:43:36,609
at all so there is this first difficulty
of how do we regulate, how do we limit the
454
00:43:36,609 --> 00:43:40,920
power of the state when you don't even
have the basic legislation around
455
00:43:40,920 --> 00:43:45,710
data protection? One thing to bear in mind
is the problem with companies:
456
00:43:45,710 --> 00:43:53,220
how do you also hold companies accountable
whereas with the state there is the whole
457
00:43:53,220 --> 00:43:58,119
challenge of finding the right legal
framework to limit their power, but maybe
458
00:43:58,119 --> 00:44:02,069
I'll let Millie talk a little bit more
about this. Yeah, with our FOIA
459
00:44:02,069 --> 00:44:06,270
requests we tend to go after everyone, so
with the example of the Home Office saying
460
00:44:06,270 --> 00:44:08,990
something that the other police didn't
that was because we went to all the
461
00:44:08,990 --> 00:44:14,680
different state bodies and I think that
there's a good example in the States
462
00:44:14,680 --> 00:44:17,690
where there's far more research done on
what the police are doing, but they're
463
00:44:17,690 --> 00:44:22,600
using the same product in the UK I think
it's Axiom, and it's a storage device
464
00:44:22,600 --> 00:44:29,119
for body-worn camera videos, and a lawyer
in the states said that in order to access
465
00:44:29,119 --> 00:44:32,799
the video containing his client he had to
agree to the terms and conditions on Axiom's
466
00:44:32,799 --> 00:44:38,140
website which basically gave them full use
of his client's video of a crime scene.
467
00:44:38,140 --> 00:44:42,750
So that's a private company having use of
this video, so given that we found they're
468
00:44:42,750 --> 00:44:47,120
using it in the UK we don't know if those
kind of terms and conditions exist but
469
00:44:47,120 --> 00:44:54,673
it's a very real problem as they rely
increasingly on private companies.
470
00:44:54,673 --> 00:44:58,370
Herald: Number two please.
Q: Thank you for your work perhaps you've
471
00:44:58,370 --> 00:45:03,450
already answered this partially from other
people's questions but it looks like we
472
00:45:03,450 --> 00:45:08,539
have a great way to start the process of
kind of taking the power back but you know
473
00:45:08,539 --> 00:45:13,250
the state and the system certainly doesn't
want to give up this much power, how do we
474
00:45:13,250 --> 00:45:18,190
actually directly, what's kind of the
endgame, what are the strategies for making
475
00:45:18,190 --> 00:45:24,770
the police or the governments give up and
restore balance? Is it a suit, is it
476
00:45:24,770 --> 00:45:27,859
challenging through Parliament and in the
slow process of democracy, or what do you
477
00:45:27,859 --> 00:45:32,170
think is the right way of doing it?
A: I never think one works on its own,
478
00:45:32,170 --> 00:45:36,670
even though I'm a litigator I often think
litigation is quite a weak tactic,
479
00:45:36,670 --> 00:45:40,920
particularly if you don't have the public
on side, and then again if you don't have
480
00:45:40,920 --> 00:45:44,220
Parliament. So we need all of them and
they can all come through different means
481
00:45:44,220 --> 00:45:49,090
so we wouldn't just focus on one. In
different countries it might be that you
482
00:45:49,090 --> 00:45:53,540
go down the legal route or down the
parliamentary route but in the UK we're
483
00:45:53,540 --> 00:45:57,460
trying all different routes so for example
on mobile phone extraction in the
484
00:45:57,460 --> 00:46:00,900
beginning of next year we're going to be
doing a video, we're going to be
485
00:46:00,900 --> 00:46:04,120
interviewing the public and speaking to
them about it, we're going to be going to
486
00:46:04,120 --> 00:46:08,960
Parliament, and I've also been speaking to
a lot of lawyers so I'm hoping some cases
487
00:46:08,960 --> 00:46:15,280
will start because those individual cases
brought by local lawyers are also where
488
00:46:15,280 --> 00:46:19,859
you see a lot of change, like the John Catt
case, that's one lawyer, so I think we
489
00:46:19,859 --> 00:46:25,901
need all different things to see what
works and what sticks.
490
00:46:25,901 --> 00:46:31,150
Herald: We haven't had number three yet.
Q: Hi, thanks for the talk, so I have a
491
00:46:31,150 --> 00:46:39,020
question concerning the solution
side of things because one aspect I was
492
00:46:39,020 --> 00:46:45,569
missing in your talk was the economics of
the game actually because like you are
493
00:46:45,569 --> 00:46:51,510
from the UK and the private sector has
stepped in also in another public
494
00:46:51,510 --> 00:46:58,799
domain, the NHS, to help out because funds
are missing, and I would like to ask you
495
00:46:58,799 --> 00:47:03,299
whether or not you think first of all the
logic is the same within the police
496
00:47:03,299 --> 00:47:12,720
departments, because there might also be a
cost-driven aspect to limit the salaries
497
00:47:12,720 --> 00:47:18,589
or because you have the problem with
police force coming in because you have to
498
00:47:18,589 --> 00:47:24,099
pay their rents and automated things
especially when it's given to the private
499
00:47:24,099 --> 00:47:30,779
sector, which has a whole other logic of
thinking about this stuff is cost saving
500
00:47:30,779 --> 00:47:43,930
and so maybe it would be a nice thing
if you could talk a bit about the,
501
00:47:43,930 --> 00:47:49,359
I'm sorry, the attempt to maybe like get
economics a bit more into the picture when
502
00:47:49,359 --> 00:47:56,130
it comes to solutions of the whole thing.
A: So I think yeah, you're very right in
503
00:47:56,130 --> 00:48:02,309
pointing out the relation, well, that
you compare what's happening with the NHS
504
00:48:02,309 --> 00:48:07,799
and what's happening with the police
because in both the economics of
505
00:48:07,799 --> 00:48:14,940
companies offering policing services arise
from the same situation: there's a need for
506
00:48:14,940 --> 00:48:23,380
doing more efficient policing because of
budget cuts, so the same way the NHS is
507
00:48:23,380 --> 00:48:30,079
being essentially privatized due to the
budget cuts and due to the needs
508
00:48:30,079 --> 00:48:34,799
that arise from being limited in your
finances, again there's a similar thing
509
00:48:34,799 --> 00:48:38,880
with the police: when you're
understaffed then you're more likely to
510
00:48:38,880 --> 00:48:44,329
rely on technologies to help you do
your work more efficiently because
511
00:48:44,329 --> 00:48:51,210
essentially with predictive policing the
idea behind this is that if you know where
512
00:48:51,210 --> 00:48:56,380
and when crime will happen then you can
focus the limited resources you have there
513
00:48:56,380 --> 00:49:02,640
and not sort of look at a more global
larger picture. So I mean I'm not gonna be
514
00:49:02,640 --> 00:49:06,599
here on stage advocating for more funds
for the police, I'm not gonna do that, but
515
00:49:06,599 --> 00:49:11,660
I think that there is a desperate
need to reframe actually the narrative
516
00:49:11,660 --> 00:49:19,170
around how we do policing actually and
then definitely also look at a different
517
00:49:19,170 --> 00:49:22,680
perspective and a different approach to
policing because as I've tried to show
518
00:49:22,680 --> 00:49:28,010
it's been a really long time since this
narrative has developed that more data leads
519
00:49:28,010 --> 00:49:32,789
to crime resolution but actually what I
didn't have the time to get into in this
520
00:49:32,789 --> 00:49:37,490
talk is actually all the research that is
showing that those products actually don't
521
00:49:37,490 --> 00:49:42,770
work. Like, PredPol is actually basically
gaslighting a lot of police officers with
522
00:49:42,770 --> 00:49:47,650
their figures, the kind of figures that
they are pushing and suggesting are just
523
00:49:47,650 --> 00:49:53,671
plain inaccurate. It's not accurate to
compare a city in one year to what a
524
00:49:53,671 --> 00:49:59,230
city is becoming in another year so it's
not even clear that a lot of these
525
00:49:59,230 --> 00:50:05,460
projects are even properly functioning
and in a sense I don't want them to
526
00:50:05,460 --> 00:50:09,250
function. I'm not gonna say if we had
better predictive policing then the
527
00:50:09,250 --> 00:50:14,869
problem will be solved, no, that is not the
question, the question is how do we have
528
00:50:14,869 --> 00:50:20,820
regulation that forces the police to look
differently into the way they are
529
00:50:20,820 --> 00:50:25,597
conducting policing.
Herald: Number four please.
530
00:50:25,597 --> 00:50:31,980
Q: So, thank you for your presentation I
have a question about SOCMINT. In my opinion
531
00:50:31,980 --> 00:50:37,359
SOCMINT might violate the terms of
service of, for example, Twitter and
532
00:50:37,359 --> 00:50:41,000
Facebook. Have you tried to cooperate with
these companies to make them actually
533
00:50:41,000 --> 00:50:46,360
enforce their TOS?
A: So actually there are two things. As I
534
00:50:46,360 --> 00:50:51,270
said, all companies that are doing
scraping of data, and you're right, in this
535
00:50:51,270 --> 00:50:58,700
case they violate the terms of service of
Facebook and Twitter. Now, the other
536
00:50:58,700 --> 00:51:03,049
problem is that there is already a loophole to
this and actually the marketing company I
537
00:51:03,049 --> 00:51:08,289
was talking about that's being used by the
UK police, what they essentially do is that
538
00:51:08,289 --> 00:51:13,559
they purchase the data from Facebook and
Twitter, so this is why it's interesting
539
00:51:13,559 --> 00:51:19,900
because when Facebook says 'we don't sell
your data', well essentially actually with
540
00:51:19,900 --> 00:51:25,970
marketing tools that are there to monitor
what people say about products essentially
541
00:51:25,970 --> 00:51:29,599
what they're doing is selling your data,
they're not selling necessarily like your
542
00:51:29,599 --> 00:51:34,400
name or your location or things like that
but whatever you're going to be posting
543
00:51:34,400 --> 00:51:41,109
publicly for example in like groups or
public pages is something that they are
544
00:51:41,109 --> 00:51:45,329
going to be trying to sell to those
companies. So I think you're right and
545
00:51:45,329 --> 00:51:50,839
maybe Millie will have more to say about
this. I think those companies have a role
546
00:51:50,839 --> 00:51:56,260
to play but at the moment I think the
challenge we face is actually this loophole
547
00:51:56,260 --> 00:52:00,960
that we're facing where by purchasing the
data directly from the company they don't
548
00:52:00,960 --> 00:52:07,420
violate the terms of
service. Yeah, we've spoken a bit to
549
00:52:07,420 --> 00:52:12,840
some of the social media companies, we've
been told that one of their big focuses is
550
00:52:12,840 --> 00:52:17,710
the problems of the social media
monitoring at the U.S. border and so
551
00:52:17,710 --> 00:52:22,609
because there's a lot known about that
they're looking at those issues so I think
552
00:52:22,609 --> 00:52:27,000
once we show more and more the problems
say in the UK or in other countries I
553
00:52:27,000 --> 00:52:31,869
think it would be very interesting to look
at what's happened over the Catalan
554
00:52:31,869 --> 00:52:37,410
independence vote period to see how social
media was used then. I think the companies
555
00:52:37,410 --> 00:52:42,380
aren't going to react until we make them
although they probably will meet with us.
556
00:52:42,380 --> 00:52:49,990
A slightly different aspect we revealed in
a different part of our work that the
557
00:52:49,990 --> 00:52:53,190
intelligence agencies were gathering
social media that's probably not
558
00:52:53,190 --> 00:52:57,779
groundbreaking news, but it was
there in plain fact and so they all got a
559
00:52:57,779 --> 00:53:01,480
bit concerned about how that was
happening, whether some of them knew or
560
00:53:01,480 --> 00:53:05,950
some of them didn't, so the better our
research the more people speaking about it
561
00:53:05,950 --> 00:53:11,030
I think they will engage, or we'll find
out whether the police are getting it
562
00:53:11,030 --> 00:53:17,350
lawfully or unlawfully.
Herald: Number one please.
563
00:53:17,350 --> 00:53:21,200
Q: Thanks for your talk, I have a question
on predictive policing because German
564
00:53:21,200 --> 00:53:28,700
authorities in the last two years piloted Precobs
predictive policing projects in three states I think
565
00:53:28,700 --> 00:53:33,630
and they claimed that they would never use
these techniques with data on individuals
566
00:53:33,630 --> 00:53:38,870
but only aggregate data like the
near-repeat stuff you presented, and they
567
00:53:38,870 --> 00:53:42,940
presented it as just an additional tool in
their toolbox that, if used responsibly,
568
00:53:42,940 --> 00:53:48,240
can lead to more cost-effective policing.
Do you buy this argument or would you say
569
00:53:48,240 --> 00:53:55,020
that there's inevitably a slippery slope or
kind of like a path dependency to more
570
00:53:55,020 --> 00:54:01,010
granular data assessment or evaluation
that would inevitably infringe on privacy
571
00:54:01,010 --> 00:54:05,319
rights?
A: I think this goes back to the question
572
00:54:05,319 --> 00:54:08,740
of, you know, are we using predictive
policing to identify where crime is
573
00:54:08,740 --> 00:54:14,369
happening or who it is that's committing a
crime. But actually I think even if
574
00:54:14,369 --> 00:54:18,910
we stick to this even if we stick to
identifying where crime is happening we
575
00:54:18,910 --> 00:54:23,650
still run into problems we still run into
the fundamental problem of predictive
576
00:54:23,650 --> 00:54:28,599
policing, which is that we only have data on
crimes that have already been reported
577
00:54:28,599 --> 00:54:35,809
or already been addressed by the police,
and that's by essence already biased data.
578
00:54:35,809 --> 00:54:41,430
If we have police in some areas then we're
more likely to, you know, further police them,
579
00:54:41,430 --> 00:54:51,579
because the solution of those companies, of
those algorithms, will be leading to more
580
00:54:51,579 --> 00:54:57,880
suggestions that crime is happening
more predominantly in those areas. So, as
581
00:54:57,880 --> 00:55:04,459
what we've seen so far is that we fall into
these fundamental problems of just
582
00:55:04,459 --> 00:55:11,329
overpolicing communities that are already
overpoliced. So in a sense in terms of
583
00:55:11,329 --> 00:55:18,069
well, the right to privacy but also the
question of the presumption of innocence I
584
00:55:18,069 --> 00:55:23,040
think purely just trying to
collect data on where crime is
585
00:55:23,040 --> 00:55:29,660
happening is not efficient policing
first of all, but it's also causing
586
00:55:29,660 --> 00:55:35,020
challenges for fundamental rights as well.
Yeah, I guess it's not a great comparison
587
00:55:35,020 --> 00:55:39,481
but a lot of what they're bringing in
now is a program to assist you with the
588
00:55:39,481 --> 00:55:43,910
charging decision, so you've got someone
you've arrested do you charge them or not?
589
00:55:43,910 --> 00:55:48,319
The police say, oh well, of course it's only
advisory. You only have to look at how busy
590
00:55:48,319 --> 00:55:52,660
a police station is to know how advisory
that is going to be and how much it is
591
00:55:52,660 --> 00:55:56,740
going to sway your opinion. So the more
you use these tools the more it makes your
592
00:55:56,740 --> 00:56:01,260
job easier because rather than thinking,
where are we going to go, what areas are
593
00:56:01,260 --> 00:56:04,250
things going to happen in, who are we going
to arrest, well the computer told us to do
594
00:56:04,250 --> 00:56:08,700
this so let's just do that.
Herald: Thank you and microphone number
595
00:56:08,700 --> 00:56:13,111
three please.
Q: Thank you, do you think that there are
596
00:56:13,111 --> 00:56:19,940
any credible arguments to be made for
limiting the police's abilities under acts
597
00:56:19,940 --> 00:56:25,130
in the UK that incorporate EU level
restrictions on privacy data protection
598
00:56:25,130 --> 00:56:29,650
human rights or fundamental rights and if
so do you anticipate that those arguments
599
00:56:29,650 --> 00:56:35,140
might change after Brexit?
A: Well, they're bringing in the GDPR and
600
00:56:35,140 --> 00:56:39,670
the Law Enforcement Directive now and
they're not going to scrap those once
601
00:56:39,670 --> 00:56:44,299
Brexit comes in. We'll still be part,
hopefully, of the European Court of Human
602
00:56:44,299 --> 00:56:49,130
Rights, but not the European Court of
Justice. I think there are going to be
603
00:56:49,130 --> 00:56:51,960
implications. It's going to be very
interesting how they play it out. They're
604
00:56:51,960 --> 00:56:57,420
still going to want the data from Europol,
they want to be part of Interpol, policing
605
00:56:57,420 --> 00:57:02,309
operates at a different level and I think
if they have to comply with certain laws
606
00:57:02,309 --> 00:57:06,029
so that they can play with the big boys
then they probably will, but they may do
607
00:57:06,029 --> 00:57:12,160
things behind the scenes, so it depends
where it works for them, but certainly the
608
00:57:12,160 --> 00:57:16,019
politicians and definitely the police
wanna be part of those groups. So we'll
609
00:57:16,019 --> 00:57:20,809
have to see, but we will still use them
and we'll still rely on European judgments
610
00:57:20,809 --> 00:57:26,865
the force they have in a court of law may
be more difficult.
611
00:57:26,865 --> 00:57:32,319
Herald: Does the internet have any
questions? Nope, well then number two
612
00:57:32,319 --> 00:57:35,839
please.
Q: So you've mentioned that they don't
613
00:57:35,839 --> 00:57:41,609
have really good operational security and
sometimes some stuff that should not leak
614
00:57:41,609 --> 00:57:47,869
leaked. Now, within the last year we had
major data leaks all across the world like
615
00:57:47,869 --> 00:57:54,710
Philippines, South Africa, just to mention
a few. Now, if the security, the OPSEC, is so
616
00:57:54,710 --> 00:58:00,160
bad in the police in Great Britain it's
not unlikely that something will happen
617
00:58:00,160 --> 00:58:05,299
in Europe of a similar kind. What kind of
impact do you think such a huge data leak
618
00:58:05,299 --> 00:58:11,750
of private information which the police
legally stored has even if it was not
619
00:58:11,750 --> 00:58:16,539
leaked by the police but instead was leaked
by a private company that in some way had
620
00:58:16,539 --> 00:58:19,329
access to it?
A: I guess it depends what it
621
00:58:19,329 --> 00:58:25,340
is. If it's a database with serious
criminals and only the bad people, then
622
00:58:25,340 --> 00:58:29,480
people will think it's
good they have that information, but they
623
00:58:29,480 --> 00:58:35,920
need to make it more secure. If it's
somehow databases which held all sorts of
624
00:58:35,920 --> 00:58:39,589
information, say from people's mobile
phones, innocent people's pictures, all
625
00:58:39,589 --> 00:58:44,820
that kind of thing, then we might see a
much wider public reaction to the tools
626
00:58:44,820 --> 00:58:51,039
that are used and the safeguards, the
legal safeguards, will come a lot quicker
627
00:58:51,039 --> 00:58:55,599
than we will probably achieve in the way
we're trying to go now, because there'll be
628
00:58:55,599 --> 00:59:02,030
a bigger public outrage.
Herald: Okay one last and hopefully short
629
00:59:02,030 --> 00:59:06,619
question from microphone one.
Q: Hi, thanks for the talk, it was really
630
00:59:06,619 --> 00:59:10,320
interesting. It's actually quite a short
question: how much is a Cellebrite, and can
631
00:59:10,320 --> 00:59:14,760
we buy one?
A: I did look to buy one, I think there
632
00:59:14,760 --> 00:59:21,319
were some on eBay but I'm not sure if they
were the right thing, but a couple of
633
00:59:21,319 --> 00:59:24,319
thousand pounds, but I think you have to
actually be a police force to get those
634
00:59:24,319 --> 00:59:30,529
ones, maybe there are other types, but
it's expensive though not unobtainable, and
635
00:59:30,529 --> 00:59:34,779
I'm trying to find universities that might
have them because I think that a lot of
636
00:59:34,779 --> 00:59:38,369
forensic schools, I'm hoping that they
will. I know they do extractions of
637
00:59:38,369 --> 00:59:41,725
laptops, but I haven't found one yet that
does phones; I probably haven't asked
638
00:59:41,725 --> 00:59:45,808
enough people.
Herald: So thank you very much.
639
00:59:45,808 --> 00:59:50,990
34C3 Music
640
00:59:50,990 --> 01:00:07,000
subtitles created by c3subtitles.de
in the year 2020. Join, and help us!