0:00:00.000,0:00:09.329
preroll music
0:00:09.329,0:00:14.179
Herald: Please welcome Jeff on stage[br]with a warm applause. He works for Tactical Tech
0:00:14.179,0:00:19.130
applause
0:00:19.130,0:00:22.860
and will talk about bias in[br]data and racial profiling
0:00:22.860,0:00:25.870
in Germany compared with[br]the UK. It’s your stage!
0:00:25.870,0:00:30.090
Jeff: Right. Thank you! Yeah, okay!
0:00:30.090,0:00:33.320
My presentation is called[br]“Profiling (In)justice –
0:00:33.320,0:00:36.430
– Disaggregating Data by Race[br]and Ethnicity to Monitor
0:00:36.430,0:00:41.630
and Evaluate Discriminatory Policing”.[br]In terms of my background:
0:00:41.630,0:00:46.780
I’ve been doing mostly[br]quantitative research
0:00:46.780,0:00:50.730
around the issues of racial[br]discrimination for a long time.
0:00:50.730,0:00:55.960
In New York, at the Center for[br]Constitutional Rights, I was
0:00:55.960,0:00:59.800
looking at trends and levels of
0:00:59.800,0:01:04.210
use-of-force by police against civilians,[br]and also at stop-and-search
0:01:04.210,0:01:08.601
against civilians. And then more[br]recently for the last 18 months or so
0:01:08.601,0:01:12.330
I’ve been working as a research[br]consultant at Tactical Tech,
0:01:12.330,0:01:16.360
looking at issues of data politics and[br]privacy. So this is kind of like a merger
0:01:16.360,0:01:21.960
of these 2 areas. In terms of what this[br]presentation is gonna be about:
0:01:21.960,0:01:26.900
there’s gonna be 3 takeaways. First, that
0:01:26.900,0:01:29.590
we’re dealing with the issues of privacy[br]and also [freedom from] discrimination.
0:01:29.590,0:01:34.869
And both are fundamental human rights.[br]But there’s tension between the two.
0:01:34.869,0:01:40.879
And important questions to think about[br]are: “When do privacy concerns exceed
0:01:40.879,0:01:46.490
or take precedence over those of[br]discrimination, or vice versa?”
0:01:46.490,0:01:53.400
Two: That data is political, both in the[br]collection and aggregation of data;
0:01:53.400,0:01:56.930
but also in terms of how the[br]categories are being created.
0:01:56.930,0:02:00.549
And then, three: That data ethics are[br]a complex thing, that things aren’t
0:02:00.549,0:02:05.090
so black-and-white all of the time.[br]So what is racial profiling?
0:02:05.090,0:02:08.910
The term originates from the US.
0:02:08.910,0:02:14.509
And it refers to when a police officer[br]suspects, stops, questions or arrests a person,
0:02:14.509,0:02:17.079
or, you know, targets them at other stages[br]of the criminal justice system
0:02:17.079,0:02:21.039
because of their perceived[br]race or ethnicity. After 9/11
0:02:21.039,0:02:26.609
it also refers to the profiling of Muslims[br]or people perceived to be Middle Eastern.
0:02:26.609,0:02:31.519
And in German there is no direct translation,[br]so the English term ‘racial profiling’
0:02:31.519,0:02:36.859
is used a lot in parliamentary hearings[br]and also in court documents.
0:02:36.859,0:02:41.790
So the problem that we’re gonna talk[br]about is that because of the lack of data
0:02:41.790,0:02:46.309
in Germany there’s no empirical[br]evidence to monitor and evaluate
0:02:46.309,0:02:50.729
trends in discrimination.[br]This is creating problems
0:02:50.729,0:02:55.290
for both civil society in terms of looking[br]at these levels and trends over time,
0:02:55.290,0:02:58.199
but also from an individual perspective[br]it becomes difficult for people
0:02:58.199,0:03:02.259
to file complaints. In Germany the only[br]way to file a complaint officially
0:03:02.259,0:03:07.999
is to go to the police department,[br]which introduces power dynamics,
0:03:07.999,0:03:11.349
you know, challenges and additional[br]barriers. But also if you’re an individual
0:03:11.349,0:03:16.329
you have to show that there’s a trend,[br]right? That you are part of another,
0:03:16.329,0:03:19.759
a long standing story. And without this[br]data it becomes difficult to prove
0:03:19.759,0:03:24.049
that that’s happening.[br]So what we need,
0:03:24.049,0:03:27.159
or what some people are calling[br]for, is having this data
0:03:27.159,0:03:32.850
at a state and a sort of national level.[br]And this ratio that I’m putting here,
0:03:32.850,0:03:36.019
referring to policing, is looking at the[br]rate at which people are stopped
0:03:36.019,0:03:41.629
over the census figure of the[br]demographic share of the population.
0:03:41.629,0:03:44.900
And you really need both; the first[br]being on the police side and
0:03:44.900,0:03:49.589
the second being on the census. So[br]that, you know, if you only have one,
0:03:49.589,0:03:52.170
if you only have the rate at which police[br]were stopping people then you actually
0:03:52.170,0:03:55.169
can’t see if this is discriminatory or[br]not. And if you only have the census
0:03:55.169,0:03:59.720
then you can’t see that, either.[br]So you really need both.
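To make the ratio concrete, here is a minimal sketch in Python, assuming hypothetical counts; none of the numbers below come from the talk.

```python
# The ratio described above: a group's share of police stops divided
# by its share of the census population. All counts are hypothetical.

def disproportionality(group_stops, total_stops, group_pop, total_pop):
    """Returns 1.0 when stops mirror demographics; values above 1.0
    mean the group is stopped more often than its population share."""
    stop_share = group_stops / total_stops        # needs police data
    population_share = group_pop / total_pop      # needs census data
    return stop_share / population_share

# Hypothetical: a group that is 10% of the population
# but receives 30% of all stops -> ratio of about 3.0.
print(disproportionality(3_000, 10_000, 100_000, 1_000_000))
```

Note that the numerator requires stop records from the police and the denominator requires census figures, which is exactly why you need both data sets.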
0:03:59.720,0:04:03.790
The European Commission, the International[br]Labour Organisation and academics are all
0:04:03.790,0:04:10.549
calling for these… the creation of[br]standardized and comparable data sets.
0:04:10.549,0:04:13.939
And I’m not gonna read these out,[br]but I can go back to them later
0:04:13.939,0:04:18.760
if you’re interested. But what I’m gonna[br]talk about is comparing the UK
0:04:18.760,0:04:23.290
to that of Germany. So in Germany,
0:04:23.290,0:04:28.130
in 1983 there was a census; or there[br]was an attempt at conducting a census.
0:04:28.130,0:04:31.970
But due to wide-spread resentment[br]and disenfranchisement,
0:04:31.970,0:04:37.190
fears of surveillance and lack of[br]trust in state data collection
0:04:37.190,0:04:42.490
there was a big boycott. Or people[br]deliberately filled in forms wrong.
0:04:42.490,0:04:45.280
In some cases there were even[br]bombings of statistical offices.
0:04:45.280,0:04:51.220
Or people spilled coffee over census[br]forms to try to deliberately ruin them.
0:04:51.220,0:04:55.530
As a couple of other presentations at the[br]conference have already said
0:04:55.530,0:04:59.250
this was found to be an[br]unconstitutional census,
0:04:59.250,0:05:01.990
because of the way that[br]they were framing it:
0:05:01.990,0:05:08.520
comparing the census to[br]household registrations.
0:05:08.520,0:05:14.900
And so the census was delayed until 1987,
0:05:14.900,0:05:19.930
which was the most recent census until[br]the most recent European one in 2011.
0:05:19.930,0:05:23.260
This Supreme Court decision[br]was really important
0:05:23.260,0:05:28.810
because it established this right[br]for informational self-determination.
0:05:28.810,0:05:33.040
Very important for privacy[br]in terms of Germany.
0:05:33.040,0:05:37.710
You know, until today. So what kinds[br]of information are being collected?
0:05:37.710,0:05:40.690
In Germany we have pretty standard[br]demographic information, things
0:05:40.690,0:05:45.200
like gender, age, income, religion. But[br]what I want to talk about in particular
0:05:45.200,0:05:49.200
is country of origin and country of citizenship.
0:05:49.200,0:05:53.660
Which are used to determine whether a person[br]is of migration background. And
0:05:53.660,0:05:56.860
this term ‘person of migration background’[br]generally refers to whether you,
0:05:56.860,0:06:00.220
your parents or your grandparents[br]– the first, second or third generation –
0:06:00.220,0:06:03.960
come from a migrant background. Right, and
0:06:03.960,0:06:10.000
this term is used oftentimes as a proxy[br]for ethnic or for racial diversity in Germany.
0:06:10.000,0:06:15.050
And this is problematic because[br]you’re using citizenship as a proxy
0:06:15.050,0:06:20.080
for looking at racial and ethnic identity.[br]And it also ignores the experiences
0:06:20.080,0:06:23.450
and identities, the self identities[br]of people who don’t fall into
0:06:23.450,0:06:26.870
this ‘first, second or third generation’,[br]right? People who may identify
0:06:26.870,0:06:30.690
as Black German, let’s say, but[br]are of fourth, fifth or sixth generation.
0:06:30.690,0:06:34.710
They’re just ignored in this[br]data set. So they fall out.
0:06:34.710,0:06:38.160
Also, it’s difficult to measure these at[br]a national level because each state
0:06:38.160,0:06:41.950
has different definitions of what[br]constitutes a migrant background.
0:06:41.950,0:06:44.790
So we don’t have this at a national level[br]but also within states there’s no way
0:06:44.790,0:06:49.370
to compare them. Of course, not[br]having that data doesn’t mean
0:06:49.370,0:06:53.840
that there’s no racism, right?[br]And so in 2005 e.g. we see
0:06:53.840,0:06:57.180
that neo-Nazi incidents have increased 25%
0:06:57.180,0:07:03.320
– the NSU case, which came out but is still[br]going on in court proceedings:
0:07:03.320,0:07:08.020
the xenophobic attacks, but also the way[br]in which these crimes were investigated
0:07:08.020,0:07:13.670
– at a state and at a federal level –[br]and the way the investigation was botched,
0:07:13.670,0:07:17.900
in addition to showing that[br]racism in general
0:07:17.900,0:07:22.230
is now at a higher rate than it has been for[br]the last 30 years. And much more recently
0:07:22.230,0:07:26.710
seeing the rise in arson attacks on[br]refugee centers. There’s been
0:07:26.710,0:07:30.360
over 200 attacks this year so far.[br]You know, all of these showed
0:07:30.360,0:07:34.220
that not collecting this data doesn’t[br]mean that we don’t have a problem.
0:07:34.220,0:07:40.830
So, the UK by comparison: In 1981,[br]there were the Brixton riots,
0:07:40.830,0:07:45.670
in an area of London.[br]And these arose largely
0:07:45.670,0:07:50.320
because of resentment towards[br]the way that police were
0:07:50.320,0:07:53.550
carrying out what they called ‘Sus Laws’,[br]under which people were able to be stopped
0:07:53.550,0:07:58.080
on suspicion of committing[br]a crime, carrying drugs,
0:07:58.080,0:08:03.650
having a weapon, and so forth.[br]And so in the aftermath of the riot
0:08:03.650,0:08:07.550
they came up with this report called the[br]‘Scarman report’. And this found
0:08:07.550,0:08:11.150
that there was much disproportionality in[br]the way that police were carrying out
0:08:11.150,0:08:16.280
their stop-and-search procedures.[br]So for the first time this required…
0:08:16.280,0:08:20.130
or one of the reforms that was[br]instituted was that UK Police started
0:08:20.130,0:08:26.750
to have to collect data on race[br]or ethnicity during the stops.
0:08:26.750,0:08:29.600
When they stop a person they have to start[br]collecting this data. And then you have
0:08:29.600,0:08:34.629
a baseline that’s being established.[br]Around the same time in the UK
0:08:34.629,0:08:38.729
we have the 1981 census.
0:08:38.729,0:08:41.809
And in society they were having[br]a lot of debates around
0:08:41.809,0:08:45.899
whether or not they wanted to start
0:08:45.899,0:08:49.971
collecting this baseline national-level[br]figure, because we need these 2 things
0:08:49.971,0:08:56.260
for this ratio in order to monitor and[br]evaluate levels of discrimination.
0:08:56.260,0:09:00.240
But, you know, there was[br]a lot of opposition to this.
0:09:00.240,0:09:04.829
And many found it to be (quote)[br]“morally and politically objectionable”.
0:09:04.829,0:09:08.570
But not for the reason you’d think.[br]People found objections to it
0:09:08.570,0:09:13.230
not because of asking these questions,[br]but because of the way that the question
0:09:13.230,0:09:17.190
was phrased, with the categories that[br]were being used. And they did surveys
0:09:17.190,0:09:21.399
between ’75 and about ’95, and found that
0:09:21.399,0:09:26.529
among marginalized communities[br]and in minority ethnicity groups
0:09:26.529,0:09:31.329
there was actually a lot of support[br]for collecting this kind of data.
0:09:31.329,0:09:35.250
They just wanted to have it phrased[br]differently. And so in ’91 they started
0:09:35.250,0:09:40.359
to collect the data. They put this[br]‘race question’ in. And here I have,
0:09:40.359,0:09:45.600
in 2011 – the most recent census –[br]some of the kinds of categories
0:09:45.600,0:09:50.049
that they wanted to also include.[br]And they’ve changed over time.
0:09:50.049,0:09:54.329
So e.g. like ‘White Irish people’ felt[br]that they also were discriminated against.
0:09:54.329,0:09:58.930
And they experienced things differently[br]than white British people, e.g.
0:09:58.930,0:10:03.231
So having things broken down[br]further would be helpful for them
0:10:03.231,0:10:09.720
in terms of highlighting discrimination[br]that each specific demographic faces.
0:10:09.720,0:10:14.379
So around that time ’91, ’93 we[br]have the murder of Stephen Lawrence
0:10:14.379,0:10:19.130
in an unprovoked racist attack. Nobody[br]was convicted of that at the time. But
0:10:19.130,0:10:22.529
what’s important is that we have this[br]‘Macpherson report’ that came out.
0:10:22.529,0:10:27.290
And it developed 70 recommendations,[br]and most of them were adopted.
0:10:27.290,0:10:31.529
One: to collect this data at a national[br]level, and to compare it.
0:10:31.529,0:10:35.199
In 2011 they stopped mandating[br]that you had to collect this data,
0:10:35.199,0:10:38.709
at a national level. So none of the[br]data from then going forward
0:10:38.709,0:10:42.659
can actually be trusted. Some[br]forces continued to do it,
0:10:42.659,0:10:46.270
but not all of them. So you can’t actually[br]compare them between forces.
0:10:46.270,0:10:50.249
In the same year we have these London[br]riots. The Guardian and LSE put out
0:10:50.249,0:10:54.190
a report called “Reading the Riots”. Where[br]they did a lot of interviews with people
0:10:54.190,0:10:58.429
who participated. And they found that[br]most of the people who participated
0:10:58.429,0:11:03.569
felt that they[br]were mistreated by Police.
0:11:03.569,0:11:07.820
Or that there is racial discrimination[br]in terms of the policing practices.
0:11:07.820,0:11:11.760
That they weren’t being[br]treated with respect.
0:11:11.760,0:11:16.710
So to put some data to that:[br]Before this was removed
0:11:16.710,0:11:22.219
there were 2 different types of[br]stops in the UK. PACE stops,
0:11:22.219,0:11:25.769
where stops require reasonable suspicion.
0:11:25.769,0:11:30.379
And among that you have e.g. black people[br]stopped at 7 times the rate of white people.
0:11:30.379,0:11:34.690
Asian people – Asian referring to[br]Southeast Asian in the UK –
0:11:34.690,0:11:39.430
at twice the rate. And ‘Section 60 stops’:[br]where you don’t have to actually have
0:11:39.430,0:11:43.399
reasonable suspicion. And when you don’t[br]need to have that you have much, much
0:11:43.399,0:11:51.840
higher rates: black people are being stopped[br]at 26.6 times the rate of white people.
0:11:51.840,0:11:54.069
But the State Department has even come[br]out saying: “There’s
0:11:54.069,0:11:59.730
no relationship between criminality[br]and race… criminality and ethnicity”.
0:11:59.730,0:12:02.450
In fact it’s like: If people are being[br]stopped at these rates it’s…
0:12:02.450,0:12:06.670
it’s in the wrong direction. You have[br]white males in particular who are
0:12:06.670,0:12:10.020
offending at higher rates. Who are using[br]drugs at a higher rate. Who are
0:12:10.020,0:12:15.060
possessing weapons at a higher rate.[br]But that’s not who’s being stopped.
0:12:15.060,0:12:19.579
There is a connection though between[br]race and ethnicity and poverty.
0:12:19.579,0:12:23.040
So you can see here, they call it like[br]BAME groups, or ‘Black, Asian and
0:12:23.040,0:12:27.220
Minority Ethnic’. And you can see[br]that in terms of wealth and assets:
0:12:27.220,0:12:30.710
it’s much, much lower for non-white[br]households. Unemployment rates
0:12:30.710,0:12:36.149
are much higher as well.[br]Income is much lower.
0:12:36.149,0:12:39.809
So I like making maps. And I think[br]maps are really cool. ’Cause you can
0:12:39.809,0:12:44.269
tell stories when you overlay a lot[br]of data with it. So on the left
0:12:44.269,0:12:50.699
I put, for each borough in London,[br]where people are actually being stopped.
0:12:50.699,0:12:54.529
Per 1,000 people in 2012.[br]And on the right I put
0:12:54.529,0:12:58.789
where the crime is actually occurring.[br]And this is coming from UK Police.
0:12:58.789,0:13:02.009
And so you can see that where people[br]are being stopped isn’t exactly
0:13:02.009,0:13:05.799
where the crime is actually happening.[br]And if you’re seeing this stop-and-search
0:13:05.799,0:13:11.069
as a crime-prevention tactic then we[br]have to question why this isn’t lining up.
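The comparison behind these two maps can be sketched as a simple join of stop counts, census population and recorded crime per borough. This is only a sketch of the idea; the file and column names below are hypothetical, not from the talk’s data.

```python
# Sketch of the data behind the two maps: stops per 1,000 residents
# on one side, recorded crimes per 1,000 residents on the other.
# File and column names are hypothetical placeholders.
import pandas as pd

stops = pd.read_csv("stops_by_borough_2012.csv")    # borough, n_stops
census = pd.read_csv("census_by_borough.csv")       # borough, population
crime = pd.read_csv("crimes_by_borough_2012.csv")   # borough, n_crimes

df = stops.merge(census, on="borough").merge(crime, on="borough")
df["stops_per_1000"] = 1000 * df["n_stops"] / df["population"]
df["crimes_per_1000"] = 1000 * df["n_crimes"] / df["population"]

# If stop-and-search were purely a crime-prevention tactic, these two
# rates should line up borough by borough; the maps suggest they don't.
print(df[["stops_per_1000", "crimes_per_1000"]].corr())
```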
0:13:11.069,0:13:15.449
Going back to this ratio:
0:13:15.449,0:13:19.459
earlier I mentioned having the rate[br]at which one group is being stopped
0:13:19.459,0:13:22.990
over that share of the total population.
0:13:22.990,0:13:26.000
And we can take it a step further[br]and compare that
0:13:26.000,0:13:29.029
between different demographic groups.
0:13:29.029,0:13:33.610
And when using census figures[br]combined with police figures,
0:13:33.610,0:13:38.500
we can do things like the map on the left,[br]this disproportionality ratio:
0:13:38.500,0:13:41.260
the rate at which black groups[br]are stopped as a share of
0:13:41.260,0:13:45.839
the total population, compared to the rate[br]at which white groups are stopped.
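As a sketch, the between-group version of the ratio can be computed like this. All counts are hypothetical, chosen only so the output reproduces the “7 times” figure quoted earlier.

```python
# Between-group disproportionality: each group's stop rate relative
# to its own population, with one group divided by the other.
# All counts are hypothetical placeholders.

def stop_rate(stops, population):
    return stops / population

def disproportionality_ratio(stops_a, pop_a, stops_b, pop_b):
    """How many times more often group A is stopped than group B,
    relative to each group's population size."""
    return stop_rate(stops_a, pop_a) / stop_rate(stops_b, pop_b)

# Hypothetical borough where black residents are stopped at
# about 7 times the rate of white residents.
print(disproportionality_ratio(stops_a=1_400, pop_a=20_000,
                               stops_b=1_000, pop_b=100_000))  # ≈ 7.0
```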
0:13:45.839,0:13:49.920
And you can see the darker areas[br]are where you have a higher rate.
0:13:49.920,0:13:56.230
So when we’re talking about those[br]‘7 times, or 26 times more likely’
0:13:56.230,0:13:59.959
these are those areas that we’re[br]talking about. And so the darker areas:
0:13:59.959,0:14:05.909
you can see that when compared to poverty,
0:14:05.909,0:14:09.309
people are stopped… there are[br]greater disproportionality ratios
0:14:09.309,0:14:13.030
in wealthier areas than there are in[br]poorer areas. And this is kind of
0:14:13.030,0:14:16.989
a way, you could say, almost[br]of perceiving people of colour
0:14:16.989,0:14:24.510
as others who shouldn’t belong in[br]these areas. It’s also… you can…
0:14:24.510,0:14:27.819
when combined with other census[br]information you can see that you have
0:14:27.819,0:14:32.069
more discrimination, you have more[br]disparities in areas that are more white
0:14:32.069,0:14:36.240
and also less racially diverse.
0:14:36.240,0:14:40.069
So this is kind of all on the[br]same kind of a message.
0:14:40.069,0:14:44.229
But does it work? – It doesn’t.[br]UK Police are saying that
0:14:44.229,0:14:49.499
at most they have a 6%[br]arrest rate of all stops.
0:14:49.499,0:14:52.970
And arrests are not conviction rates.
0:14:52.970,0:14:59.319
Looking for weapons we have less[br]than a 1% positive search rate.
0:14:59.319,0:15:03.350
And the European Human Rights[br]Commission e.g. has called for reform
0:15:03.350,0:15:06.999
of these practices. The UN has called[br]for reform of these practices.
0:15:06.999,0:15:12.559
And they instituted like[br]a reform that called for
0:15:12.559,0:15:19.039
having a 20% arrest quota. And so that[br]could either go positively or negatively.
0:15:19.039,0:15:21.649
Making a higher quota means that you[br]could be just arresting more of the people
0:15:21.649,0:15:26.439
that you’re stopping. More likely, or[br]hopefully it means that you have
0:15:26.439,0:15:31.550
a higher justification or grounds[br]for stopping a person.
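For illustration, here is the arithmetic of that quota against the rates quoted above. Only the 6% arrest rate and the 20% quota come from the talk; the stop counts are hypothetical.

```python
# Sketch of the quota check: arrests as a share of all stops,
# compared against the proposed 20% quota. Counts are hypothetical.

stops = 10_000           # hypothetical number of stops
arrests = 600            # hypothetical arrests -> the 6% rate quoted
quota = 0.20             # the 20% arrest quota mentioned in the talk

rate = arrests / stops
print(f"arrest rate: {rate:.0%}, quota: {quota:.0%}")

# Two ways to meet the quota: arrest more of the people stopped
# (20% of 10,000 stops = 2,000 arrests), or make fewer, better-grounded
# stops (the same 600 arrests out of at most 3,000 stops).
print("stops allowed at quota:", round(arrests / quota))
```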
0:15:31.550,0:15:35.430
So these are the kinds of things you can[br]do in the UK, with these kinds of data.
0:15:35.430,0:15:40.079
In Germany, you can’t. But I wanna[br]highlight there’s this one case
0:15:40.079,0:15:45.150
in Koblenz in 2010.[br]There was a student of…
0:15:45.150,0:15:50.050
unnamed black student who[br]was stopped travelling on a train,
0:15:50.050,0:15:53.310
and who was asked to show his ID.[br]And he refused. And he said: “No,
0:15:53.310,0:16:01.190
I’m not gonna do that. This is[br]reminiscent of Nazi era tactics”.
0:16:01.190,0:16:07.509
And so he was charged with slander.[br]And he was brought into court.
0:16:07.509,0:16:11.439
And the police officer, when it[br]was in court, said, (quote):
0:16:11.439,0:16:16.149
“I approach people that look like[br]foreigners, this is based on skin colour.”
0:16:16.149,0:16:20.209
And so this is for the first time[br]the police have admitted that
0:16:20.209,0:16:23.470
their grounds for doing immigration[br]related stops are based on
0:16:23.470,0:16:28.520
perceived race or ethnicity.[br]The judge sided with the police.
0:16:28.520,0:16:32.029
That this was good justification,[br]like it was good grounds.
0:16:32.029,0:16:36.779
But a higher court ruled[br]that that wasn’t the case.
0:16:36.779,0:16:38.540
They said: “Yeah,[br]this is unconstitutional,
0:16:38.540,0:16:42.339
you can’t actually do it,[br]it violates the constitution.”
0:16:42.339,0:16:46.249
No person shall be favoured or disfavoured[br]because of sex, parentage, race,
0:16:46.249,0:16:50.739
language, homeland and origin,[br]faith, or religious or political opinions.
0:16:50.739,0:16:54.360
Just as a side note there’s been a large[br]movement to remove this term ‘race’
0:16:54.360,0:16:58.410
from that part of the constitution[br]since it was put in.
0:16:58.410,0:17:02.189
And also the court dismissed the slander[br]charge. They said: “No, this student…”
0:17:02.189,0:17:07.160
like he’s actually able to critique[br]the police, you know, in this way.
0:17:07.160,0:17:10.660
But afterwards we have the response[br]from the police union,
0:17:10.660,0:17:14.440
the head of the police union[br]at the time, who said (quote):
0:17:14.440,0:17:18.010
“The courts deal with the law in[br]an aesthetically pleasing way, but
0:17:18.010,0:17:21.760
they don’t make sure their judgments[br]match practical requirements”.
0:17:21.760,0:17:25.400
And so what this means is: we see[br]that according to the police union
0:17:25.400,0:17:28.870
– this isn’t an official response, but this[br]is from the Police Union itself –
0:17:28.870,0:17:32.920
they say that we need to[br]profile. We need to do this.
0:17:32.920,0:17:38.750
Or else we aren’t able to do[br]immigration related stops.
0:17:38.750,0:17:43.470
That’s crazy. They also…[br]I mean, at the same time
0:17:43.470,0:17:46.840
when they were doing these parliamentary[br]hearings they instituted these mandatory
0:17:46.840,0:17:50.660
intercultural trainings for police[br]officers. And these are kind of
0:17:50.660,0:17:55.210
like a one-day training where[br]you go and learn all about
0:17:55.210,0:17:58.650
how to deal with people from different[br]cultures. But in some of the interviews
0:17:58.650,0:18:01.910
that I was doing they said: “Okay, well,[br]this isn’t an intercultural issue.
0:18:01.910,0:18:05.730
This is a racism issue”.
0:18:05.730,0:18:08.250
People aren’t just coming from other[br]places. These are Germans,
0:18:08.250,0:18:11.000
these are people who grew up here. These[br]are people who live here. Who know
0:18:11.000,0:18:15.970
how to speak the language.[br]Who were born and raised…
0:18:15.970,0:18:19.260
And we need to be dealing[br]with this in a different way.
0:18:19.260,0:18:23.250
However, in this time, we see that[br]racial profiling has become part of
0:18:23.250,0:18:29.560
the national conversation. And so this[br]is the sticker that somebody put up
0:18:29.560,0:18:33.040
in Berlin, in a U-Bahn.[br]It says: “Attention…,
0:18:33.040,0:18:36.140
we practice RACIAL PROFILING while[br]checking the validity of your ticket”.
0:18:36.140,0:18:42.200
It’s not real, but it looks…[br]I think it’s kind of cool.
0:18:42.200,0:18:45.790
When they were doing this in[br]these Bundestag hearings…
0:18:45.790,0:18:50.260
they released data for the Federal Police[br]for 2013. This is the first time
0:18:50.260,0:18:54.270
that we have any data that’s released.[br]No data has ever been released
0:18:54.270,0:18:57.430
based on State Police stops.[br]They say that they’re not actually
0:18:57.430,0:19:01.010
collecting the information, so they[br]don’t have anything to show. Of course
0:19:01.010,0:19:03.960
the figures that are released from the[br]Federal Police are not disaggregated
0:19:03.960,0:19:08.000
by race and ethnicity.[br]But what does this actually show?
0:19:08.000,0:19:17.270
So, most of the stops,[br]over 85% are border stops.
0:19:17.270,0:19:20.910
Border being within ca. 30 km[br]of the German border.
0:19:20.910,0:19:25.540
So this is actually taking into account[br]most of the German population.
0:19:25.540,0:19:29.470
But if we’re doing these immigration[br]related stops then… if we break it down
0:19:29.470,0:19:34.430
by offense – in the last two, these are[br]the immigration related offenses
0:19:34.430,0:19:38.910
that people are committing – and[br]we have less than, at most,
0:19:38.910,0:19:44.080
maybe 1% of people who[br]are found to be positive,
0:19:44.080,0:19:48.100
meaning that they’re found to be violating[br]some kind of offense. It’s – again,
0:19:48.100,0:19:53.930
it’s not a conviction, right?[br]And people can challenge this.
0:19:53.930,0:19:56.550
E.g. you don’t have to have your[br]ID on you at all times. You can
0:19:56.550,0:20:00.470
present it later, and the[br]charge can go away.
0:20:00.470,0:20:05.080
But if we have such low[br]rates of positive searches
0:20:05.080,0:20:10.980
then why is this happening? Why,[br]when we have such good data,
0:20:10.980,0:20:18.950
and when, as good researchers, we know this,[br]are we continuing this as a practice?
0:20:18.950,0:20:22.000
In one of the other interviews that I was[br]doing, they said, okay, well:
0:20:22.000,0:20:26.470
You know, we know this is ineffective.[br]But this has the effect of criminalizing
0:20:26.470,0:20:31.550
our communities. And[br]whether or not this is true
0:20:31.550,0:20:35.130
is an argument for why we should maybe[br]have this kind of data to show that
0:20:35.130,0:20:41.220
this is or is not actually occurring.[br]Of course, European Commission
0:20:41.220,0:20:46.490
against Racism and Intolerance and the UN[br]have said: “Well, even among these at most
0:20:46.490,0:20:50.021
1% positive rates, these are[br]not distributed evenly, and
0:20:50.021,0:20:53.700
you have people of certain groups that are[br]being stopped at rates higher than others,
0:20:53.700,0:20:58.870
particularly black and other[br]minority ethnicity groups.”
0:20:58.870,0:21:05.670
Okay, so, going back[br]to the initial question:
0:21:05.670,0:21:10.670
If we have both freedom from[br]discrimination and the right to privacy
0:21:10.670,0:21:15.930
as these human rights how[br]do we address this tension?
0:21:15.930,0:21:18.270
And how do we make sure that we’re[br]making the right decision in terms of
0:21:18.270,0:21:23.440
which takes precedence? And so I came…[br]or I’ve thought of 3 different reasons
0:21:23.440,0:21:27.690
why this isn’t happening. The first[br]being a series of legal challenges.
0:21:27.690,0:21:31.740
Things that are preventing[br]us from implementing this
0:21:31.740,0:21:36.400
from a legal basis. And the first…[br]you know there’s 3 exceptions
0:21:36.400,0:21:39.240
that would allow for this[br]data to be collected.
0:21:39.240,0:21:43.350
(1) The first being if there’s a provision[br]in an EU directive that calls for collecting
0:21:43.350,0:21:49.700
this kind of data. And within that[br](2) if you have the consent of the person
0:21:49.700,0:21:53.770
who is the data subject, let’s say.[br]Consent is kind of a difficult thing
0:21:53.770,0:21:57.970
and we could have a whole conversation[br]just about that on its own.
0:21:57.970,0:22:00.950
If you’re being stopped by a police officer,[br]to what extent can you actually consent
0:22:00.950,0:22:06.660
to the data that’s being collected?[br]But this is put in place
0:22:06.660,0:22:10.510
as one of the mandatory[br]legal requirements.
0:22:10.510,0:22:16.050
Or (3) if there’s an exception in[br]the hopefully soon to be finalized
0:22:16.050,0:22:19.460
EU Data Protection law that[br]allows for collecting data
0:22:19.460,0:22:23.020
if it’s in the public interest. So you[br]could argue that we need to be collecting
0:22:23.020,0:22:28.920
this data because monitoring[br]and evaluating discrimination
0:22:28.920,0:22:34.480
is a problem that we need[br]to solve as a society, right?
0:22:34.480,0:22:38.810
Two: As a lot of people here at[br]the conference are talking about:
0:22:38.810,0:22:42.950
there’s a lot of distrust in terms[br]of data collection by the state.
0:22:42.950,0:22:47.960
Particularly sensitive data. But I mean[br]as many of us are already aware
0:22:47.960,0:22:53.520
this data is already being collected. And[br]this doesn’t mean that we should maybe
0:22:53.520,0:22:57.680
collect more just for the[br]sake of collecting data.
0:22:57.680,0:23:01.460
But in terms of sensitive data –
0:23:01.460,0:23:04.990
we’re collecting things also like medical[br]data. And medical data sometimes
0:23:04.990,0:23:08.720
is interesting for looking at trends[br]in terms of the illnesses,
0:23:08.720,0:23:14.850
and where illnesses spread. And you can[br]look at this as also possibly a way of
0:23:14.850,0:23:21.130
using sensitive data for highlighting[br]and monitoring public problems.
0:23:21.130,0:23:25.150
And, (3), we have these[br]challenges in determining
0:23:25.150,0:23:29.060
which kind of categories[br]we should put in place.
0:23:29.060,0:23:32.890
But, like the UK, if something[br]were implemented in Germany
0:23:32.890,0:23:37.090
I feel as though this would change over[br]time as other groups also want their data
0:23:37.090,0:23:43.490
to be collected… or not!
0:23:43.490,0:23:48.400
So that’s kind of where[br]I’m at. I think that
0:23:48.400,0:23:51.480
there are no easy answers in terms of[br]whether we should or should not do this.
0:23:51.480,0:23:53.670
But I think that at the very least[br]we should be starting to have
0:23:53.670,0:23:56.500
these conversations. And I think that[br]it’s important to start having these
0:23:56.500,0:23:59.440
conversations with communities[br]themselves who are being targeted,
0:23:59.440,0:24:05.060
or feel they’re being profiled.[br]So, thank you!
0:24:05.060,0:24:16.320
applause
0:24:16.320,0:24:20.420
Herald: It was an awesome talk. I think[br]there might be 5 minutes for questions.
0:24:20.420,0:24:24.620
There are mics over there and over[br]there. And whoever has a question,
0:24:24.620,0:24:28.140
like in the front rows,[br]I can come walk to you.
0:24:28.140,0:24:30.980
Question: Thank you very much.[br]I’m just wondering in terms of…
0:24:30.980,0:24:33.370
are you sort of creating this…
0:24:33.370,0:24:34.690
Jeff: I’m sorry, I can’t hear you…
0:24:34.690,0:24:37.260
Question: Sorry, of course… I’m sort[br]of curious in terms of how you’re
0:24:37.260,0:24:40.990
creating the disproportionality figures,[br]whether it’d be worth including
0:24:40.990,0:24:44.520
other kinds of information, such as sex,[br]age, time of day they’re stopped.
0:24:44.520,0:24:46.300
Because there’s possibly[br]unemployment bias as well…
0:24:46.300,0:24:47.830
Jeff: I’m sorry, I still can’t[br]actually hear you.
0:24:47.830,0:24:52.510
Question: Sorry… whether it’d be[br]worth including, say, other details
0:24:52.510,0:24:56.350
about people, such as their sex, their[br]age, maybe the time of day that
0:24:56.350,0:25:01.880
these stops are happening. As there may[br]be a bias towards the unemployed.
0:25:01.880,0:25:06.760
If that’s possible, do you think,[br]with the UK census data?
0:25:06.760,0:25:10.350
Jeff: So you’re asking: Do I feel as[br]though we should also be including
0:25:10.350,0:25:15.090
other kinds of demographic data?[br]Yeah. I mean I do, but I think that
0:25:15.090,0:25:18.600
I shouldn’t be the one who’s deciding how[br]to implement these programs. And I think
0:25:18.600,0:25:23.190
that we should be speaking with[br]the communities themselves
0:25:23.190,0:25:26.530
and having them give their opinion. So if[br]this is something that those communities
0:25:26.530,0:25:30.260
who feel that they’re being targeted[br]or being discriminated against
0:25:30.260,0:25:33.800
want to include then I think that they[br]should be taken into account. But
0:25:33.800,0:25:37.470
I don’t know that I should be[br]the one who’s deciding that.
0:25:37.470,0:25:40.980
Herald: Okay, next question[br]over there, please.
0:25:40.980,0:25:45.230
Question: To this ratio you’ve been[br]talking about: So you compare
0:25:45.230,0:25:49.530
census data to – as you[br]said in the definition
0:25:49.530,0:25:53.510
in the first slide –[br]perceived ethnicity or race.
0:25:53.510,0:25:57.810
So it is an attribution of the[br]persons themselves in a census
0:25:57.810,0:26:01.730
compared to attribution per[br]police officers. And those
0:26:01.730,0:26:05.490
won’t necessarily match, I’m not[br]sure. So I was just wondering
0:26:05.490,0:26:08.980
whether you could comment on[br]that a bit. And this is related
0:26:08.980,0:26:13.130
to the second question when it comes[br]about: We don’t get this data
0:26:13.130,0:26:17.600
maybe from the police, because it’s[br]difficult for the state to collect it.
0:26:17.600,0:26:21.560
But maybe we could get the data from[br]those which suffer from discrimination
0:26:21.560,0:26:25.830
in the first place. So do you see any[br]possibility for public platforms…
0:26:25.830,0:26:29.930
So I was reminded of this[br]idea from Egypt, HarassMap,
0:26:29.930,0:26:34.140
which is about sexual harassment[br]of women. That just made visible,
0:26:34.140,0:26:37.710
with maps, similar to what you do,[br]actually where this happened,
0:26:37.710,0:26:42.860
when this happened, and how this happened.[br]But it’s been the people themselves
0:26:42.860,0:26:46.700
speaking out and making this[br]heard. And I was wondering
0:26:46.700,0:26:51.600
whether that may be another source of the[br]data you would be needing for your work.
0:26:51.600,0:26:55.750
Jeff: So the first question was talking[br]about whether we should be using
0:26:55.750,0:26:58.640
‘self-identified’ vs. ‘perceived’,[br]right?
0:26:58.640,0:27:02.280
Yeah, I mean they may not line up, right?
0:27:02.280,0:27:06.470
People can be perceived in a way[br]different than they identify.
0:27:06.470,0:27:10.450
Some groups in Germany[br]are calling for both.
0:27:10.450,0:27:14.500
They’re calling for kind of like[br]a two-ticket mechanism
0:27:14.500,0:27:19.750
where you have people who[br]put how they self-identify
0:27:19.750,0:27:24.040
and also how the Police are identifying[br]them. If we’re looking for patterns
0:27:24.040,0:27:27.580
of discrimination then it may actually[br]be more interesting if we’re looking at
0:27:27.580,0:27:31.580
how people are perceived,[br]than how people self-identify.
0:27:31.580,0:27:35.520
But I think it’s important to take both[br]into account. And for the second question,
0:27:35.520,0:27:38.170
I’m sorry, I kind of forgot what that was.
0:27:38.170,0:27:42.010
Question: Like asking the[br]people themselves for data
0:27:42.010,0:27:45.770
when they suffer from discrimination[br]or [are] being stopped more.
0:27:45.770,0:27:49.790
Jeff: Yeah, no, I mean I think that’s a[br]great idea. And there was a survey
0:27:49.790,0:27:53.890
that was actually just done,[br]that was doing just that!
0:27:53.890,0:27:57.200
The findings haven’t been released,[br]but it just finished up. And it’s looking
0:27:57.200,0:28:01.370
at different types of experiences of[br]discrimination that people are having.
0:28:01.370,0:28:05.600
There’s also organisations like[br]social worker organisations
0:28:05.600,0:28:08.730
that have been collecting[br]this data for a long time.
0:28:08.730,0:28:14.420
Having hundreds and hundreds[br]of cases. Yeah, thanks!
0:28:14.420,0:28:19.640
postroll music
0:28:19.640,0:28:25.421
Subtitles created by c3subtitles.de[br]in the year 2016. Join, and help us!