WEBVTT
00:00:00.000 --> 00:00:09.329
preroll music
00:00:09.329 --> 00:00:14.179
Herald: Please welcome Jeff on stage with
a warm round of applause. He works for Tactical Tech
00:00:14.179 --> 00:00:19.130
applause
00:00:19.130 --> 00:00:22.860
and will talk about bias in
data and racial profiling
00:00:22.860 --> 00:00:25.870
in Germany compared with
the UK. It’s your stage!
00:00:25.870 --> 00:00:30.090
Jeff: Right. Thank you! Yeah, okay!
00:00:30.090 --> 00:00:33.320
My presentation is called
“Profiling (In)justice –
00:00:33.320 --> 00:00:36.430
– Disaggregating Data by Race
and Ethnicity to Monitor
00:00:36.430 --> 00:00:41.630
and Evaluate Discriminatory Policing”.
In terms of my background:
00:00:41.630 --> 00:00:46.780
I’ve been doing research, mostly
quantitative research,
00:00:46.780 --> 00:00:50.730
around the issues of racial
discrimination for a long time.
00:00:50.730 --> 00:00:55.960
In New York, at the Center for
Constitutional Rights I was
00:00:55.960 --> 00:00:59.800
looking at trends and levels of
00:00:59.800 --> 00:01:04.210
use-of-force by police against civilians,
and also on stop-and-search
00:01:04.210 --> 00:01:08.601
against civilians. And then more
recently for the last 18 months or so
00:01:08.601 --> 00:01:12.330
I’ve been working as a research
consultant at Tactical Tech,
00:01:12.330 --> 00:01:16.360
looking at issues of data politics and
privacy. So this is kind of like a merger
00:01:16.360 --> 00:01:21.960
of these 2 areas. In terms of what this
presentation is gonna be about:
00:01:21.960 --> 00:01:26.900
there’s gonna be 3 takeaways. First, that
00:01:26.900 --> 00:01:29.590
we’re dealing with the issues of privacy
and also [freedom from] discrimination.
00:01:29.590 --> 00:01:34.869
And both are fundamental human rights.
But there’s tension between the two.
00:01:34.869 --> 00:01:40.879
And important questions to think about
are: “When do privacy concerns exceed
00:01:40.879 --> 00:01:46.490
or take precedence over those of
discrimination, or vice versa?”
00:01:46.490 --> 00:01:53.400
Two: That data is political, both in the
collection and aggregation of data;
00:01:53.400 --> 00:01:56.930
but also in terms of how the
categories themselves are created.
00:01:56.930 --> 00:02:00.549
And then, three: That data ethics are
a complex thing, that things aren’t
00:02:00.549 --> 00:02:05.090
so black-and-white all of the time.
So what is racial profiling?
00:02:05.090 --> 00:02:08.910
The term originates from the US.
00:02:08.910 --> 00:02:14.509
And it refers to when a police officer
suspects, stops, questions, arrests or…
00:02:14.509 --> 00:02:17.079
you know, or… at other stages
of the criminal justice system
00:02:17.079 --> 00:02:21.039
because of their perceived
race or ethnicity. After 9/11
00:02:21.039 --> 00:02:26.609
it also refers to the profiling of Muslims
or people perceived to be Middle Eastern.
00:02:26.609 --> 00:02:31.519
And in German there is no direct translation,
so the term ‘Racial Profiling’ (quotes)
00:02:31.519 --> 00:02:36.859
is used a lot in parliamentary hearings
and also in court documents.
00:02:36.859 --> 00:02:41.790
So the problem that we’re gonna talk
about is that because of the lack of data
00:02:41.790 --> 00:02:46.309
in Germany there’s no empirical
evidence to monitor and evaluate
00:02:46.309 --> 00:02:50.729
trends in discrimination.
This is creating problems
00:02:50.729 --> 00:02:55.290
for both civil society in terms of looking
at these levels and trends over time,
00:02:55.290 --> 00:02:58.199
but also from an individual perspective
it becomes difficult for people
00:02:58.199 --> 00:03:02.259
to file complaints. In Germany the only
way to file a complaint officially
00:03:02.259 --> 00:03:07.999
is to go to the police department,
which introduces power dynamics,
00:03:07.999 --> 00:03:11.349
you know, challenges and additional
barriers. But also if you’re an individual
00:03:11.349 --> 00:03:16.329
you have to show that there’s a trend,
right? That you are part of a larger,
00:03:16.329 --> 00:03:19.759
long-standing story. And without this
data it becomes difficult to prove
00:03:19.759 --> 00:03:24.049
that that’s happening.
So what we’re needing,
00:03:24.049 --> 00:03:27.159
or what some people are calling
for, is having this data
00:03:27.159 --> 00:03:32.850
at a state and a sort of national level.
And this ratio that I’m putting here,
00:03:32.850 --> 00:03:36.019
referring to policing, is looking at the
rate at which people are stopped
00:03:36.019 --> 00:03:41.629
over the census figure for that
demographic’s share of the population.
00:03:41.629 --> 00:03:44.900
And you really need both; the first
being on the police side and
00:03:44.900 --> 00:03:49.589
the second being on the census. So
that, you know, if you only have one,
00:03:49.589 --> 00:03:52.170
if you only have the rate at which police
were stopping people then you actually
00:03:52.170 --> 00:03:55.169
can’t see if this is discriminatory or
not. And if you only have the census
00:03:55.169 --> 00:03:59.720
then you can’t see that, either.
So you really need both.
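NOTE
As a minimal sketch of this ratio – the share of stops a group accounts for, over its census share of the population – here in Python, with made-up figures for a hypothetical group (none of these numbers are from the talk):

    # Disproportionality of stops for one demographic group.
    stops_of_group = 12_000    # police data: stops of this group in a year
    total_stops = 100_000      # police data: all stops in the same year
    population_share = 0.03    # census data: group's share of the population

    stop_share = stops_of_group / total_stops  # 0.12, i.e. 12% of all stops
    ratio = stop_share / population_share      # 4.0: stopped at 4x their share

    print(f"stop share {stop_share:.0%}, disproportionality {ratio:.1f}")

The police records supply the numerator and the census supplies the denominator, which is why neither data set alone can show whether stops are discriminatory.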
00:03:59.720 --> 00:04:03.790
The European Commission, the International
Labour Organisation and academics are all
00:04:03.790 --> 00:04:10.549
calling for these… the creation of
standardized and comparable data sets.
00:04:10.549 --> 00:04:13.939
And I’m not gonna read these out,
but I can go back to them later
00:04:13.939 --> 00:04:18.760
if you’re interested. But what I’m gonna
talk about is comparing the UK
00:04:18.760 --> 00:04:23.290
to Germany. So in Germany,
00:04:23.290 --> 00:04:28.130
in 1983 there was a census; or rather,
there was an attempt at conducting a census.
00:04:28.130 --> 00:04:31.970
But due to widespread resentment
and disenfranchisement,
00:04:31.970 --> 00:04:37.190
fears of surveillance and lack of
trust in state data collection
00:04:37.190 --> 00:04:42.490
there was a big boycott. Or people
deliberately filled in forms wrong.
00:04:42.490 --> 00:04:45.280
In some cases there were even
bombings of statistical offices.
00:04:45.280 --> 00:04:51.220
Or people spilled coffee over census
forms to try to deliberately ruin them.
00:04:51.220 --> 00:04:55.530
As a couple of other presentations at the
conference have already said
00:04:55.530 --> 00:04:59.250
this census was found
to be unconstitutional,
00:04:59.250 --> 00:05:01.990
because of the way that
they were framing it:
00:05:01.990 --> 00:05:08.520
comparing the census data against
household registrations.
00:05:08.520 --> 00:05:14.900
And so the census was delayed until 1987,
00:05:14.900 --> 00:05:19.930
which remained the most recent census
until the European one in 2011.
00:05:19.930 --> 00:05:23.260
This Constitutional Court decision
was really important
00:05:23.260 --> 00:05:28.810
because it established the right
to informational self-determination.
00:05:28.810 --> 00:05:33.040
Very important for privacy
in terms of Germany.
00:05:33.040 --> 00:05:37.710
You know, until today. So what kinds
of information are being collected?
00:05:37.710 --> 00:05:40.690
In Germany we have pretty standard kinds
of demographic information, things
00:05:40.690 --> 00:05:45.200
like gender, age, income, religion. But
what I want to talk about in particular
00:05:45.200 --> 00:05:49.200
is country of origin and country of citizenship.
00:05:49.200 --> 00:05:53.660
Which are used to determine whether a person
is of migration background. And
00:05:53.660 --> 00:05:56.860
this term ‘person of migration background’
generally refers to whether you,
00:05:56.860 --> 00:06:00.220
your parents or your grandparents
– the first, second or third generation –
00:06:00.220 --> 00:06:03.960
come from a migrant background. Right, and
00:06:03.960 --> 00:06:10.000
this term is used oftentimes as a proxy
for ethnic or for racial diversity in Germany.
00:06:10.000 --> 00:06:15.050
And this is problematic because
you’re using citizenship as a proxy
00:06:15.050 --> 00:06:20.080
for looking at racial and ethnic identity.
And it also ignores the experiences
00:06:20.080 --> 00:06:23.450
and identities, the self identities
of people who don’t fall into
00:06:23.450 --> 00:06:26.870
this ‘first, second or third generation’,
right? People who may identify
00:06:26.870 --> 00:06:30.690
as Black German, let’s say. But
of fourth, fifth or sixth generation.
00:06:30.690 --> 00:06:34.710
They’re just ignored in this
data set. So they fall out.
00:06:34.710 --> 00:06:38.160
Also, it’s difficult to measure these at
a national level because each state
00:06:38.160 --> 00:06:41.950
has different definitions of what
constitutes a migrant background.
00:06:41.950 --> 00:06:44.790
So we don’t have this at a national level
but also within states there’s no way
00:06:44.790 --> 00:06:49.370
to compare them. Of course, not
having that data doesn’t mean
00:06:49.370 --> 00:06:53.840
that there’s no racism, right?
And so in 2005 e.g. we see
00:06:53.840 --> 00:06:57.180
that neo-Nazi incidents increased by 25%
00:06:57.180 --> 00:07:03.320
– or the NSU case coming to light, with
court proceedings still ongoing.
00:07:03.320 --> 00:07:08.020
The xenophobic attacks but also the way
in which these crimes were investigated
00:07:08.020 --> 00:07:13.670
– at a state and at a federal level –
and the way that it was botched,
00:07:13.670 --> 00:07:17.900
in addition to showing that
racism in general
00:07:17.900 --> 00:07:22.230
is at a higher level than it has been for
the last 30 years. And much more recently
00:07:22.230 --> 00:07:26.710
seeing the rise in arson attacks on
refugee centers. There’s been
00:07:26.710 --> 00:07:30.360
over 200 attacks this year so far.
You know, all of these showed
00:07:30.360 --> 00:07:34.220
that not collecting this data doesn’t
mean that we don’t have a problem.
00:07:34.220 --> 00:07:40.830
So, the UK by comparison: In 1981,
there were the Brixton riots,
00:07:40.830 --> 00:07:45.670
in an area of London.
And these arose largely
00:07:45.670 --> 00:07:50.320
because of resentment towards
the way that police were
00:07:50.320 --> 00:07:53.550
carrying out what they called ‘Sus Laws’,
under which people could be stopped
00:07:53.550 --> 00:07:58.080
on suspicion of committing
a crime, carrying drugs,
00:07:58.080 --> 00:08:03.650
having a weapon, and so forth.
And so in the aftermath of the riot
00:08:03.650 --> 00:08:07.550
they came up with this report called the
‘Scarman report’. And this found
00:08:07.550 --> 00:08:11.150
that there was much disproportionality in
the way that Police were carrying out
00:08:11.150 --> 00:08:16.280
their stop-and-search procedures.
So for the first time this required…
00:08:16.280 --> 00:08:20.130
or one of the reforms that was
instituted was that UK Police started
00:08:20.130 --> 00:08:26.750
to have to collect data on race
or ethnicity during the stops.
00:08:26.750 --> 00:08:29.600
When they stop a person they have to start
collecting this data. And then you have
00:08:29.600 --> 00:08:34.629
a baseline that’s being established.
Around the same time in the UK
00:08:34.629 --> 00:08:38.729
we have the 1981 census.
00:08:38.729 --> 00:08:41.809
And in society they were having
a lot of debates around
00:08:41.809 --> 00:08:45.899
whether or not they wanted to have this…
00:08:45.899 --> 00:08:49.971
collecting this baseline national-level
figure, because we need these 2 things
00:08:49.971 --> 00:08:56.260
for this ratio in order to monitor and
evaluate levels of discrimination.
00:08:56.260 --> 00:09:00.240
But, you know, there was
a lot of opposition to this.
00:09:00.240 --> 00:09:04.829
And many found it to be (quote)
“morally and politically objectionable”.
00:09:04.829 --> 00:09:08.570
But not for the reason you’d think.
People found objections to it
00:09:08.570 --> 00:09:13.230
not because of asking these questions,
but because of the way that the question
00:09:13.230 --> 00:09:17.190
was phrased, with the categories that
were being used. And they did surveys
00:09:17.190 --> 00:09:21.399
between ’75 and about ’95, and found that
00:09:21.399 --> 00:09:26.529
among marginalized communities
and in minority ethnicity groups
00:09:26.529 --> 00:09:31.329
there was actually a lot of support
for collecting this kind of data.
00:09:31.329 --> 00:09:35.250
They just wanted to have it phrased
differently. And so in ’91 they started
00:09:35.250 --> 00:09:40.359
to collect the data. They put this
‘race question’ in. And here I have,
00:09:40.359 --> 00:09:45.600
in 2011 – the most recent census –
some of the kinds of categories
00:09:45.600 --> 00:09:50.049
that they wanted to also include.
And they’ve changed over time.
00:09:50.049 --> 00:09:54.329
So e.g. like ‘White Irish people’ felt
that they also were discriminated against.
00:09:54.329 --> 00:09:58.930
And they experienced things differently
than white British people, e.g.
00:09:58.930 --> 00:10:03.231
So having things broken down
further would be helpful for them
00:10:03.231 --> 00:10:09.720
in terms of highlighting discrimination
that each specific demographic faces.
00:10:09.720 --> 00:10:14.379
So around that time ’91, ’93 we
have the murder of Stephen Lawrence
00:10:14.379 --> 00:10:19.130
in an unprovoked racist attack. Nobody
was ever convicted of that. But
00:10:19.130 --> 00:10:22.529
what’s important is that we have this
‘Macpherson report’ that came out.
00:10:22.529 --> 00:10:27.290
And it developed a lot of recommendations
– 70 in total – and most of them were adopted.
00:10:27.290 --> 00:10:31.529
One: to collect this data at a national
level, and to compare it.
00:10:31.529 --> 00:10:35.199
In 2011 they stopped mandating
that you had to collect this data,
00:10:35.199 --> 00:10:38.709
at a national level. So none of the
data from then going forward
00:10:38.709 --> 00:10:42.659
can actually be trusted. Some
forces continued to do it,
00:10:42.659 --> 00:10:46.270
but not all of them. So you can’t actually
compare them between forces.
00:10:46.270 --> 00:10:50.249
In the same year we have these London
riots. The Guardian and LSE put out
00:10:50.249 --> 00:10:54.190
a report called “Reading the Riots”. Where
they did a lot of interviews with people
00:10:54.190 --> 00:10:58.429
who participated. And they found that
most of the people who participated
00:10:58.429 --> 00:11:03.569
had feelings of… that they
were mistreated by Police.
00:11:03.569 --> 00:11:07.820
Or that there is racial discrimination
in terms of the policing practices.
00:11:07.820 --> 00:11:11.760
That they weren’t being
treated with respect.
00:11:11.760 --> 00:11:16.710
So to put some data to that:
Before this was removed
00:11:16.710 --> 00:11:22.219
there were 2 different types of
stops in the UK. PACE stops,
00:11:22.219 --> 00:11:25.769
where you’re stopped with reasonable suspicion.
00:11:25.769 --> 00:11:30.379
And among that you have e.g. black people
stopped at 7 times the rate of white people.
00:11:30.379 --> 00:11:34.690
Asian people – Asian referring to
Southeast Asian in the UK –
00:11:34.690 --> 00:11:39.430
at twice the rate. And ‘Section 60 stops’:
where you don’t have to actually have
00:11:39.430 --> 00:11:43.399
reasonable suspicion. And when you don’t
need to have that you have much, much
00:11:43.399 --> 00:11:51.840
higher rates: black people are being stopped
at 26.6 times the rate of white people.
00:11:51.840 --> 00:11:54.069
But even the State Department came
out and said: “There’s
00:11:54.069 --> 00:11:59.730
no relationship between criminality
and race… criminality and ethnicity”.
00:11:59.730 --> 00:12:02.450
In fact it’s like: If people are being
stopped at these rates it’s…
00:12:02.450 --> 00:12:06.670
it’s in the wrong direction. You have
white males in particular who are
00:12:06.670 --> 00:12:10.020
offending at higher rates. Who are using
drugs at a higher rate. Who are
00:12:10.020 --> 00:12:15.060
possessing weapons at a higher rate.
But that’s not who’s being stopped.
00:12:15.060 --> 00:12:19.579
There is a connection though between
race and ethnicity and poverty.
00:12:19.579 --> 00:12:23.040
So you can see here, they call it like
BAME groups, or ‘Black, Asian and
00:12:23.040 --> 00:12:27.220
Minority Ethnic’. And you can see
that among like wealth and assets:
00:12:27.220 --> 00:12:30.710
it’s much, much lower for non-white
households. Unemployment rates
00:12:30.710 --> 00:12:36.149
are much higher as well.
Income is much lower.
00:12:36.149 --> 00:12:39.809
So I like making maps. And I think
maps are really cool. ’Cause you can
00:12:39.809 --> 00:12:44.269
tell stories when you overlay a lot
of data with it. So on the left
00:12:44.269 --> 00:12:50.699
I put, by borough in London,
where people are actually being stopped.
00:12:50.699 --> 00:12:54.529
Per 1,000 people in 2012.
And on the right I put
00:12:54.529 --> 00:12:58.789
where the crime is actually occurring.
And this is coming from UK Police.
00:12:58.789 --> 00:13:02.009
And so you can see that where people
are being stopped isn’t exactly
00:13:02.009 --> 00:13:05.799
where the crime is actually happening.
And if you’re seeing this stop-and-search
00:13:05.799 --> 00:13:11.069
as a crime preventing tactic then we
have to question why this isn’t lining up.
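NOTE
A minimal sketch of the normalization behind the left-hand map – stops per 1,000 residents, per borough – in Python, with invented borough names and figures standing in for the real police and census data:

    # Stops per 1,000 residents by borough (all figures hypothetical).
    stops = {"Borough A": 25_000, "Borough B": 18_000, "Borough C": 2_000}
    residents = {"Borough A": 303_000, "Borough B": 246_000, "Borough C": 187_000}

    for borough, n in stops.items():
        per_1000 = 1_000 * n / residents[borough]
        print(f"{borough}: {per_1000:.1f} stops per 1,000 residents")

Setting this column next to recorded crimes per borough is what exposes the mismatch described here.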
00:13:11.069 --> 00:13:15.449
Going back to this ratio:
00:13:15.449 --> 00:13:19.459
earlier I mentioned like – having the rate
at which one group is being stopped
00:13:19.459 --> 00:13:22.990
over that share of the total population.
00:13:22.990 --> 00:13:26.000
And we can take it a step further
and we can compare it
00:13:26.000 --> 00:13:29.029
between different demographic groups.
00:13:29.029 --> 00:13:33.610
And when using census figures
combined with police figures,
00:13:33.610 --> 00:13:38.500
we can do things like looking on the left.
I mean this disproportionality ratio,
00:13:38.500 --> 00:13:41.260
so the rate at which black people are
stopped relative to their share
00:13:41.260 --> 00:13:45.839
of the total population, compared
to the same rate for white people.
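NOTE
The ‘7 times’ and ‘26.6 times’ figures quoted earlier are ratios of two such rates. A minimal sketch in Python, again with placeholder numbers rather than real UK data:

    # Compare stop rates between two demographic groups.
    def stop_rate(stops: int, population: int) -> float:
        return stops / population

    # Hypothetical figures for one area:
    rate_black = stop_rate(stops=9_000, population=60_000)      # 0.15
    rate_white = stop_rate(stops=30_000, population=1_400_000)  # ~0.021

    disproportionality = rate_black / rate_white  # ~7.0
    print(f"black people stopped at {disproportionality:.1f}x the white rate")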
00:13:45.839 --> 00:13:49.920
And you can see the darker areas
are where you have a higher rate.
00:13:49.920 --> 00:13:56.230
So when we’re talking about those
‘7 times, or 26 times more likely’
00:13:56.230 --> 00:13:59.959
these are those areas that we’re
talking about. And so the darker areas:
00:13:59.959 --> 00:14:05.909
you can see that when compared to poverty,
00:14:05.909 --> 00:14:09.309
people are stopped… there are
greater disproportionality ratios
00:14:09.309 --> 00:14:13.030
in wealthier areas than there are in
poorer areas. And this is kind of
00:14:13.030 --> 00:14:16.989
a way, you could say, almost
of perceiving people of colour
00:14:16.989 --> 00:14:24.510
as others who shouldn’t belong in
these areas. It’s also… you can…
00:14:24.510 --> 00:14:27.819
when combined with other census
information you can see that you have
00:14:27.819 --> 00:14:32.069
more discrimination, you have more
disparities in areas that are more white
00:14:32.069 --> 00:14:36.240
and also less racially diverse.
00:14:36.240 --> 00:14:40.069
So these are all kind of
pointing to the same message.
00:14:40.069 --> 00:14:44.229
But does it work? – It doesn’t.
UK Police are saying that
00:14:44.229 --> 00:14:49.499
at most they have a 6%
arrest rate of all stops.
00:14:49.499 --> 00:14:52.970
And arrests are not convictions.
00:14:52.970 --> 00:14:59.319
Looking for weapons we have like less
than a 1% positive search rate.
00:14:59.319 --> 00:15:03.350
And the European Human Rights
Commission e.g. has called for reform
00:15:03.350 --> 00:15:06.999
of these practices. The UN has called
for reform of these practices.
00:15:06.999 --> 00:15:12.559
And they instituted like
a reform that called for
00:15:12.559 --> 00:15:19.039
having a 20% arrest quota. And so that
could either go positively or negatively.
00:15:19.039 --> 00:15:21.649
Setting a higher quota means that you
could just be arresting more of the people
00:15:21.649 --> 00:15:26.439
that you’re stopping. More likely, or
hopefully it means that you have
00:15:26.439 --> 00:15:31.550
a higher justification or grounds
for stopping a person.
00:15:31.550 --> 00:15:35.430
So these are the kinds of things you can
do in the UK, with these kinds of data.
00:15:35.430 --> 00:15:40.079
In Germany, you can’t. But I wanna
highlight there’s this one case
00:15:40.079 --> 00:15:45.150
in Koblenz in 2010.
There was a student, an
00:15:45.150 --> 00:15:50.050
unnamed black student, who was
stopped travelling on a train,
00:15:50.050 --> 00:15:53.310
and who was asked to show his ID.
And he refused. And he said: “No,
00:15:53.310 --> 00:16:01.190
I’m not gonna do that. This is
reminiscent of Nazi era tactics”.
00:16:01.190 --> 00:16:07.509
And so he was charged with slander.
And he was brought into court.
00:16:07.509 --> 00:16:11.439
And the police officer, when it
was in court, said, (quote):
00:16:11.439 --> 00:16:16.149
“I approach people that look like
foreigners, this is based on skin colour.”
00:16:16.149 --> 00:16:20.209
And so this was the first time
the police admitted that
00:16:20.209 --> 00:16:23.470
their grounds for doing immigration
related stops are based on
00:16:23.470 --> 00:16:28.520
perceived race or ethnicity.
The judge sided with the police.
00:16:28.520 --> 00:16:32.029
saying this was good justification,
like it was good grounds.
00:16:32.029 --> 00:16:36.779
But a higher court ruled
that that wasn’t the case.
00:16:36.779 --> 00:16:38.540
They said: “Yeah,
this is unconstitutional,
00:16:38.540 --> 00:16:42.339
you can’t actually do it,
it violates the constitution.”
00:16:42.339 --> 00:16:46.249
No person shall be favoured or disfavoured
because of sex, parentage, race,
00:16:46.249 --> 00:16:50.739
language, homeland and origin, faith,
or religious or political opinions.
00:16:50.739 --> 00:16:54.360
Just as a side note there’s been a large
movement to remove this term ‘race’
00:16:54.360 --> 00:16:58.410
from that part of the constitution
since it was put in.
00:16:58.410 --> 00:17:02.189
And also the court dismissed the slander
charge. They said: “No, this student…”
00:17:02.189 --> 00:17:07.160
like he’s actually able to critique
the police, you know, in this way.
00:17:07.160 --> 00:17:10.660
But afterwards we have the response
by the police union,
00:17:10.660 --> 00:17:14.440
the head of the police union
at the time, who said (quote):
00:17:14.440 --> 00:17:18.010
“The courts deal with the law in
an aesthetically pleasing way, but
00:17:18.010 --> 00:17:21.760
they don’t make sure their judgments
match practical requirements”.
00:17:21.760 --> 00:17:25.400
And so what this means is: we see
that according to the police union
00:17:25.400 --> 00:17:28.870
– this isn’t an official response, but this
is from the Police Union itself –
00:17:28.870 --> 00:17:32.920
they say that we need to
profile. We need to do this.
00:17:32.920 --> 00:17:38.750
Or else we aren’t able to do
immigration related stops.
00:17:38.750 --> 00:17:43.470
That’s crazy. They also…
I mean, at the same time
00:17:43.470 --> 00:17:46.840
when they were doing these parliamentary
hearings they instituted these mandatory
00:17:46.840 --> 00:17:50.660
intercultural trainings for police
officers. And these are kind of
00:17:50.660 --> 00:17:55.210
like a one-day training where
you go and learn all about
00:17:55.210 --> 00:17:58.650
how to deal with people from different
cultures. But in some of the interviews
00:17:58.650 --> 00:18:01.910
that I was doing they said: “Okay, well,
this isn’t an intercultural issue.
00:18:01.910 --> 00:18:05.730
This is a racism issue”.
00:18:05.730 --> 00:18:08.250
People aren’t just coming from other
places. These are Germans,
00:18:08.250 --> 00:18:11.000
these are people who grew up here. These
are people who live here. Who know
00:18:11.000 --> 00:18:15.970
how to speak the language.
Who were born and raised…
00:18:15.970 --> 00:18:19.260
And we need to be dealing
with this in a different way.
00:18:19.260 --> 00:18:23.250
However, during this time, we see that
racial profiling has become part of
00:18:23.250 --> 00:18:29.560
the national conversation. And so this
is the sticker that somebody put up
00:18:29.560 --> 00:18:33.040
in Berlin, in a U-Bahn.
It says: “Attention…,
00:18:33.040 --> 00:18:36.140
we practice RACIAL PROFILING while
checking the validity of your ticket”.
00:18:36.140 --> 00:18:42.200
It’s not real, but it looks…
I think it’s kind of cool.
00:18:42.200 --> 00:18:45.790
When they were doing this in
these Bundestag hearings…
00:18:45.790 --> 00:18:50.260
they released data for Federal Police
for 2013. This is the first time
00:18:50.260 --> 00:18:54.270
that we have any data that’s released.
No data has ever been released
00:18:54.270 --> 00:18:57.430
based on State Police stops.
They say that they’re not actually
00:18:57.430 --> 00:19:01.010
collecting the information, so they
don’t have anything to show. Of course
00:19:01.010 --> 00:19:03.960
the figures that are released from the
Federal Police are not disaggregated
00:19:03.960 --> 00:19:08.000
by race and ethnicity.
But what does this actually show?
00:19:08.000 --> 00:19:17.270
So, most of the stops –
over 85% – are border stops.
00:19:17.270 --> 00:19:20.910
Border being within ca. 30 km
of the German border.
00:19:20.910 --> 00:19:25.540
So this is actually taking into account
most of the German population.
00:19:25.540 --> 00:19:29.470
But if we’re doing these immigration
related stops then… if we break it down
00:19:29.470 --> 00:19:34.430
by offense – in the last two, these are
the immigration related offenses
00:19:34.430 --> 00:19:38.910
that people are committing – and
we have less than, at most,
00:19:38.910 --> 00:19:44.080
maybe 1% of people who
are found to be a positive,
00:19:44.080 --> 00:19:48.100
meaning that they’re found to be committing
some kind of offense. It’s – again,
00:19:48.100 --> 00:19:53.930
it’s not a conviction, right?
And people can challenge this.
00:19:53.930 --> 00:19:56.550
E.g. like you don’t have to have your
ID on you at all times. You can
00:19:56.550 --> 00:20:00.470
present it later, and the
charge can go away.
00:20:00.470 --> 00:20:05.080
But if we have such low
rates of positive searches
00:20:05.080 --> 00:20:10.980
then like why is this happening? Why,
with such good data,
00:20:10.980 --> 00:20:18.950
and knowing what we know as good researchers,
why are we continuing this as a practice?
00:20:18.950 --> 00:20:22.000
In one of the other interviews that I was
doing they said: okay, well:
00:20:22.000 --> 00:20:26.470
You know, we know this is ineffective.
But this has the effect of criminalizing
00:20:26.470 --> 00:20:31.550
our communities. And
whether or not this is true
00:20:31.550 --> 00:20:35.130
is an argument for why we should maybe
have this kind of data to show that
00:20:35.130 --> 00:20:41.220
this is or is not actually occurring.
Of course, the European Commission
00:20:41.220 --> 00:20:46.490
against Racism and Intolerance and the UN
have said: “Well, even among these at most
00:20:46.490 --> 00:20:50.021
1% positive rates, these are
not being distributed evenly, and
00:20:50.021 --> 00:20:53.700
you have people of certain groups that are
being stopped at rates higher than others,
00:20:53.700 --> 00:20:58.870
particularly black and other
minority ethnicity groups.”
00:20:58.870 --> 00:21:05.670
Okay, so, going back
to the initial question:
00:21:05.670 --> 00:21:10.670
If we have both freedom from
discrimination and the right to privacy
00:21:10.670 --> 00:21:15.930
as these human rights how
do we address this tension?
00:21:15.930 --> 00:21:18.270
And how do we make sure that we’re
making the right decision in terms of
00:21:18.270 --> 00:21:23.440
which takes precedence? And so I came…
or I’ve thought of 3 different reasons
00:21:23.440 --> 00:21:27.690
why this isn’t happening. The first
being a series of legal challenges.
00:21:27.690 --> 00:21:31.740
Things that are preventing
us from implementing this
00:21:31.740 --> 00:21:36.400
from a legal basis. And the first…
you know, there are 3 exceptions
00:21:36.400 --> 00:21:39.240
that would allow for this
data to be collected.
00:21:39.240 --> 00:21:43.350
(1) The first being if there’s a provision
in an EU directive that calls for collecting
00:21:43.350 --> 00:21:49.700
this kind of data. And within that
(2) if you have the consent of the person
00:21:49.700 --> 00:21:53.770
the data is about – the data subject, let’s say.
Consent is kind of a difficult thing
00:21:53.770 --> 00:21:57.970
and we could have a whole conversation
just about that on its own.
00:21:57.970 --> 00:22:00.950
If you’re being stopped by a police officer
to what extent can you actually consent
00:22:00.950 --> 00:22:06.660
to the data that’s being collected?
But this is put in place
00:22:06.660 --> 00:22:10.510
as one of the mandatory
legal requirements.
00:22:10.510 --> 00:22:16.050
Or (3) if there’s an exception in
the hopefully soon-to-be-finalized
00:22:16.050 --> 00:22:19.460
EU Data Protection law that
allows for collecting data
00:22:19.460 --> 00:22:23.020
if it’s in the public interest. So you
could argue that we need to be collecting
00:22:23.020 --> 00:22:28.920
this data because monitoring
and evaluating discrimination
00:22:28.920 --> 00:22:34.480
is a problem that we need
to solve as a society, right?
00:22:34.480 --> 00:22:38.810
Two: As a lot of people here at
the conference are talking about:
00:22:38.810 --> 00:22:42.950
there’s a lot of distrust in terms
of collecting data by the state.
00:22:42.950 --> 00:22:47.960
Particularly sensitive data. But I mean
as many of us are already aware
00:22:47.960 --> 00:22:53.520
this data is already being collected. And
this doesn’t mean that we should maybe
00:22:53.520 --> 00:22:57.680
collect more just for the
sake of collecting data.
00:22:57.680 --> 00:23:01.460
But in terms of sensitive data –
00:23:01.460 --> 00:23:04.990
we’re also collecting things like medical
data. And medical data sometimes
00:23:04.990 --> 00:23:08.720
is interesting for looking at trends
in terms of the illnesses,
00:23:08.720 --> 00:23:14.850
and where illnesses spread. And you can
look at this as also possibly a way of
00:23:14.850 --> 00:23:21.130
using sensitive data for highlighting
and monitoring public problems.
00:23:21.130 --> 00:23:25.150
And, (3), we have these
challenges in determining
00:23:25.150 --> 00:23:29.060
which kind of categories
we should put in place.
00:23:29.060 --> 00:23:32.890
But, like the UK, if something
were implemented in Germany
00:23:32.890 --> 00:23:37.090
I feel as though this would change over
time as other groups also want their data
00:23:37.090 --> 00:23:43.490
to be collected… or not!
00:23:43.490 --> 00:23:48.400
So that’s kind of where
I’m at. I think that
00:23:48.400 --> 00:23:51.480
there are no easy answers in terms of
whether we should or should not do this.
00:23:51.480 --> 00:23:53.670
But I think that at the very least
we should be starting to have
00:23:53.670 --> 00:23:56.500
these conversations. And I think that
it’s important to start having these
00:23:56.500 --> 00:23:59.440
conversations with communities
themselves who are being targeted,
00:23:59.440 --> 00:24:05.060
or feel they’re being profiled.
So, thank you!
00:24:05.060 --> 00:24:16.320
applause
00:24:16.320 --> 00:24:20.420
Herald: It was an awesome talk. I think
there might be 5 minutes for questions.
00:24:20.420 --> 00:24:24.620
There are mics over there and over
there. And whoever has a question,
00:24:24.620 --> 00:24:28.140
like in the front rows,
I can come walk to you.
00:24:28.140 --> 00:24:30.980
Question: Thank you very much.
I’m just wondering in terms of…
00:24:30.980 --> 00:24:33.370
are you sort of creating this…
00:24:33.370 --> 00:24:34.690
Jeff: I’m sorry, I can’t hear you…
00:24:34.690 --> 00:24:37.260
Question: Sorry, of course… I’m sort
of curious in terms of how you’re
00:24:37.260 --> 00:24:40.990
creating the disproportionality demographics –
whether it’d be worth including
00:24:40.990 --> 00:24:44.520
other kinds of information, such as sex,
age, time of day they’re stopped.
00:24:44.520 --> 00:24:46.300
Because there’s possibly
unemployment bias as well…
00:24:46.300 --> 00:24:47.830
Jeff: I’m sorry, I still can’t
actually hear you.
00:24:47.830 --> 00:24:52.510
Question: Sorry… whether it’d be
worth including, say, other details
00:24:52.510 --> 00:24:56.350
about people, such as their sex, their
age, maybe the time of day that
00:24:56.350 --> 00:25:01.880
these stops are happening. As there may
be a bias towards the unemployed.
00:25:01.880 --> 00:25:06.760
If that’s possible, do you think,
with the UK census data?
00:25:06.760 --> 00:25:10.350
Jeff: So you’re asking: Do I feel as
though we should also be including
00:25:10.350 --> 00:25:15.090
other kinds of demographic data?
Yeah. I mean I do, but I think that
00:25:15.090 --> 00:25:18.600
I shouldn’t be the one who’s deciding how
to implement these programs. And I think
00:25:18.600 --> 00:25:23.190
that we should be speaking with
the communities themselves
00:25:23.190 --> 00:25:26.530
and having them give their opinion. So if
this is something that those communities
00:25:26.530 --> 00:25:30.260
who feel that they’re being targeted
or being discriminated against
00:25:30.260 --> 00:25:33.800
want to include then I think that they
should be taken into account. But
00:25:33.800 --> 00:25:37.470
I don’t know that I should be
the one who’s deciding that.
00:25:37.470 --> 00:25:40.980
Herald: Okay, next question
over there, please.
00:25:40.980 --> 00:25:45.230
Question: To this ratio you’ve been
talking about: So you compare
00:25:45.230 --> 00:25:49.530
census data to – as you
said in the definition
00:25:49.530 --> 00:25:53.510
in the first slide –
perceived ethnicity or race.
00:25:53.510 --> 00:25:57.810
So it is an attribution by the
persons themselves in a census
00:25:57.810 --> 00:26:01.730
compared to attribution by
police officers. And those
00:26:01.730 --> 00:26:05.490
won’t necessarily match, I’m not
sure. So I was just wondering
00:26:05.490 --> 00:26:08.980
whether you could comment on
that a bit. And this is related
00:26:08.980 --> 00:26:13.130
to the second question, which is
about: We don’t get this data
00:26:13.130 --> 00:26:17.600
maybe from the police, because it’s
difficult for the state to collect it.
00:26:17.600 --> 00:26:21.560
But maybe we could get the data from
those which suffer from discrimination
00:26:21.560 --> 00:26:25.830
in the first place. So do you see any
possibility for public platforms…
00:26:25.830 --> 00:26:29.930
So I was reminded of this
idea from Egypt, HarassMap,
00:26:29.930 --> 00:26:34.140
which is about sexual harassment
of women. It just made visible,
00:26:34.140 --> 00:26:37.710
with maps, similar to what you do,
actually where this happened,
00:26:37.710 --> 00:26:42.860
when this happened, and how this happened.
But it’s been the people themselves
00:26:42.860 --> 00:26:46.700
speaking out and making this
heard. And I was wondering
00:26:46.700 --> 00:26:51.600
whether that may be another source of the
data you would be needing for your work.
00:26:51.600 --> 00:26:55.750
Jeff: So the first question was talking
about whether we should be using
00:26:55.750 --> 00:26:58.640
‘self-identified’ vs. ‘perceived’,
right?
00:26:58.640 --> 00:27:02.280
Yeah, I mean they may not line up, right?
00:27:02.280 --> 00:27:06.470
People can be perceived in a way
different than they identify.
00:27:06.470 --> 00:27:10.450
Some groups in Germany
are calling for both.
00:27:10.450 --> 00:27:14.500
They’re calling for kind of like
a two-ticket mechanism
00:27:14.500 --> 00:27:19.750
where you have people who
put how they self-identify
00:27:19.750 --> 00:27:24.040
and also how the Police are identifying
them. If we’re looking for patterns
00:27:24.040 --> 00:27:27.580
of discrimination then it may actually
be more interesting if we’re looking at
00:27:27.580 --> 00:27:31.580
how people are perceived
than how people self-identify.
00:27:31.580 --> 00:27:35.520
But I think it’s important to take both
into account. And for the second question,
00:27:35.520 --> 00:27:38.170
I’m sorry, I kind of forgot what that was.
00:27:38.170 --> 00:27:42.010
Question: Like asking the
people themselves for data
00:27:42.010 --> 00:27:45.770
when they suffer from discrimination
or [are] being stopped more.
00:27:45.770 --> 00:27:49.790
Jeff: Yeah, no, I mean I think that’s a
great idea. And there was a survey
00:27:49.790 --> 00:27:53.890
that was actually just done,
that was doing just that!
00:27:53.890 --> 00:27:57.200
The findings haven’t been released,
but it just finished up. And it’s looking
00:27:57.200 --> 00:28:01.370
at different types of experiences of
discrimination that people are having.
00:28:01.370 --> 00:28:05.600
There are also organisations, like
social-work organisations,
00:28:05.600 --> 00:28:08.730
that have been collecting
this data for a long time.
00:28:08.730 --> 00:28:14.420
Having hundreds and hundreds
of cases. Yeah, thanks!
00:28:14.420 --> 00:28:19.640
postroll music
00:28:19.640 --> 00:28:25.421
Subtitles created by c3subtitles.de
in the year 2016. Join, and help us!