
34C3 - Policing in the age of data exploitation

  • 0:00 - 0:14
    34C3 preroll music
  • 0:14 - 0:20
    Herald angel: Today two people from Privacy
    International, one is Eva Blum-Dumontet,
  • 0:20 - 0:25
    she's a research officer working on data
    exploitation especially in the global
  • 0:25 - 0:35
    south and Millie Wood who's a lawyer and
    is fighting against spy agencies and
  • 0:35 - 0:41
    before that she spent seven years bringing
    cases against the police, and they're gonna be talking
  • 0:41 - 0:46
    about policing in the age of data
    exploitation. Give them a warm welcome.
  • 0:46 - 0:55
    Applause
  • 0:55 - 0:58
    Millie Wood: Hi, I'm Millie. As was just said, I've been
  • 0:58 - 1:02
    at Privacy International for two years
    working as a lawyer; before that I spent
  • 1:02 - 1:08
    seven years bringing cases against the
    police and what increasingly concerns me
  • 1:08 - 1:14
    based on these experiences is a lack of
    understanding of what tactics are being
  • 1:14 - 1:21
    used by the police today and on what legal
    basis they are doing this. The lack of
  • 1:21 - 1:27
    transparency undermines the ability of
    activists, lawyers, and technologists to
  • 1:27 - 1:31
    challenge the police tactics and whilst
    I'm sure a lot of you have a broad
  • 1:31 - 1:37
    awareness of the technology that the
    police can use I don't think this is
  • 1:37 - 1:43
    enough and we need to know what specific
    police forces are using against
  • 1:43 - 1:50
    individuals. The reason why is that when
    you're arrested you need to know what
  • 1:50 - 1:57
    disclosure to ask for in order to prove
    your innocence. Your lawyers need to know
  • 1:57 - 2:03
    what expert evidence to ask for in order
    to defend their client. And increasingly
  • 2:03 - 2:09
    as there are invisible, or seemingly
    invisible, ways for the police to monitor at scale,
  • 2:09 - 2:14
    we need to know that there are effective
    legal safeguards. Now those who are
  • 2:14 - 2:21
    affected are not just the guilty or those
    who understand technology they include
  • 2:21 - 2:30
    pensioners such as John Catt, a 90-year-old
    man who's a peace protester and he's a
  • 2:30 - 2:36
    law-abiding citizen no criminal record and
    yet he is on the UK domestic extremism
  • 2:36 - 2:43
    database and listed here are some of the
    entries: He took his sketchpad and made
  • 2:43 - 2:50
    drawings, he's clean shaven, and he was
    holding a board with orange people on it.
  • 2:50 - 2:56
    So these are the kinds of people that they
    are surveilling. John's case exposes
  • 2:56 - 3:04
    unlawful actions by the police, but these
    actions date back to 2005 to 2009. As far
  • 3:04 - 3:10
    as I'm aware there are no cases
    challenging modern police tactics and
  • 3:10 - 3:15
    Privacy International in the UK and with
    our partners throughout the world are
  • 3:15 - 3:21
    increasingly concerned at the pace this is
    developing unobstructed because people
  • 3:21 - 3:28
    don't know what's going on, and so we've
    started in the UK to try and uncover some
  • 3:28 - 3:34
    of the police tactics using Freedom of
    Information requests. These laws should be
  • 3:34 - 3:39
    available throughout Europe and we want to
    make similar requests in other countries
  • 3:39 - 3:44
    hopefully with some of you. So now I'm
    going to hand over to my colleague Eva who
  • 3:44 - 3:48
    will talk a bit about privacy
    international, some of the tactics we know
  • 3:48 - 3:52
    the police are using, and then we'll speak
    about some of the things that we found out
  • 3:52 - 3:55
    through our initial research.
  • 3:55 - 4:00
    Applause
  • 4:00 - 4:03
    Eva Blum-Dumontet: Thank you. So, I'm just going to tell you a
    little bit more about Privacy
  • 4:03 - 4:07
    International for those of you who don't
    know this organization. We are based in
  • 4:07 - 4:11
    London and we fight against surveillance
    and defend the right to privacy across the
  • 4:11 - 4:16
    world. Essentially, what we're
    doing is litigation, we conduct
  • 4:16 - 4:21
    research, and we carry out advocacy,
    including at the United Nations, and we
  • 4:21 - 4:27
    develop policies on issues that are
    defining modern rights. Now, our work
  • 4:27 - 4:31
    ranges from litigation against
    intelligence services to a wide range of
  • 4:31 - 4:37
    reports on issues such as connected cars,
    smart cities, and FinTech. We've recently
  • 4:37 - 4:42
    published an investigation on the role of
    companies like Cambridge Analytica and
  • 4:42 - 4:48
    Harris Media in the latest
    Kenyan elections. With our network of
  • 4:48 - 4:52
    partner organisations across the world we
    advocate for stronger privacy protection
  • 4:52 - 4:59
    in the law and technology and stronger
    safeguards against surveillance. Now we
  • 4:59 - 5:04
    talk about data exploitation and it's
    actually the title of the talk so what do
  • 5:04 - 5:10
    we mean by that? The concept of data
    exploitation emerges from our concerns
  • 5:10 - 5:16
    that the industry and governments are
    building a world that prioritizes the
  • 5:16 - 5:23
    exploitation of all data. We observe three
    prevailing trends in data exploitation.
  • 5:23 - 5:28
    One is the excessive data that's generated
    beyond our control. The second one is the
  • 5:28 - 5:34
    fact that this data is processed in a way
    we cannot understand or influence and the
  • 5:34 - 5:40
    lack of transparency around it. The last
    one is, that at the moment this data is
  • 5:40 - 5:45
    used to disadvantage us, the ones who are
    producing this data, and it's further
  • 5:45 - 5:51
    empowering the already powerful. We hardly
    control the data that's generated
  • 5:51 - 5:55
    from our phones or our computers anymore, but now
    in the world we live in data doesn't
  • 5:55 - 6:00
    come just from our phones or computers. It
    comes from the cars we're driving, it
  • 6:00 - 6:06
    comes from our payment systems, from the
    cities we live in. This is all generating
  • 6:06 - 6:13
    data and this data is used by other
    entities to make assumptions about us and
  • 6:13 - 6:18
    take decisions that eventually influence
    our lives. Are we entitled to a loan? Do
  • 6:18 - 6:25
    we qualify for affordable insurance?
    Should we be sent to jail or set free? Who
  • 6:25 - 6:31
    should be arrested? This is at the core of
    the world that we're building around data
  • 6:31 - 6:38
    exploitation. The question of power
    imbalance between those who have the data
  • 6:38 - 6:42
    and who gets to make decisions based on
    this data and those who are producing the
  • 6:42 - 6:50
    data and losing control over it. Now what
    does policing have to do with data, what
  • 6:50 - 6:57
    does data exploitation have to do with
    policing? The police have actually
  • 6:57 - 7:05
    always used data. To give
    you one example, in the 1980s a transit police
  • 7:05 - 7:11
    officer named Jack Maple developed a
    project called the Charts of the Future; this
  • 7:11 - 7:16
    is how he described it: "I called them the
    Charts of the Future. On 55 feet of wall
  • 7:16 - 7:21
    space, I mapped every train station in New
    York City and every train. Then I used
  • 7:21 - 7:25
    crayons to mark every violent crime,
    robbery, and grand larceny that occurred.
  • 7:25 - 7:33
    I mapped the solved versus the unsolved".
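    The counting that Maple describes (tally incidents per location, split
    solved from unsolved, and look for concentrations) can be sketched in a
    few lines of Python; the station names and incident counts below are
    made up purely for illustration:

    ```python
    from collections import Counter

    # Hypothetical incident log: (station, solved?) pairs standing in for
    # Maple's wall chart of crimes at New York subway stations.
    incidents = [
        ("125th St", False), ("125th St", False), ("125th St", True),
        ("Times Sq", False), ("Times Sq", True),
        ("Union Sq", True),
    ]

    # Count all incidents and unsolved incidents per station.
    totals = Counter(station for station, _ in incidents)
    unsolved = Counter(station for station, solved in incidents if not solved)

    # Rank stations by unsolved crime: the "hotspots" on the chart.
    hotspots = sorted(unsolved, key=unsolved.get, reverse=True)
    print(hotspots[0])  # prints: 125th St
    ```

    The same grid-and-count idea, scaled up and automated, is what CompStat
    and later data-driven policing tools build on.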
    Now the system was used by the Transit
  • 7:33 - 7:41
    Police and it was credited with reducing
    felonies by 27% and robberies by 1/3
  • 7:41 - 7:50
    between 1990 and 1992. So this generated a
    lot of interest in his project, and former
  • 7:50 - 7:56
    New York Mayor Rudolph Giuliani asked the
    New York Police Department to essentially
  • 7:56 - 8:02
    take up Charts of the Future and develop
    their own project. It became CompStat.
  • 8:02 - 8:10
    CompStat was again essentially about
    mapping crime to try and make assumptions
  • 8:10 - 8:19
    about where crimes are happening. So
    this kind of shows the building of this
  • 8:19 - 8:26
    narrative around this idea that the more
    data you have, the more data you generate,
  • 8:26 - 8:32
    the better you will be at reducing crime.
    Now it becomes interesting in the world we
  • 8:32 - 8:36
    live in that we've described, where we are
    constantly generating data, often without
  • 8:36 - 8:42
    the consent or even the knowledge of those
    who are producing this data. So there are
  • 8:42 - 8:48
    new questions to be asked: What data is
    the police entitled to access? What can
  • 8:48 - 8:54
    they do with it? Are we all becoming
    suspects by default? One of the key
  • 8:54 - 9:00
    elements of the intersection between data
    exploitation and policing is the question
  • 9:00 - 9:06
    of smart cities. It's worth bearing in
    mind that data-driven policing is often
  • 9:06 - 9:12
    referred to as smart policing, so obviously
    the word smart has been used in
  • 9:12 - 9:18
    a generic manner by various industries to
    kind of describe this trend of using mass
  • 9:18 - 9:27
    data collection in order to provide new
    services. But there is actually a real and
  • 9:27 - 9:35
    genuine connection between smart cities
    and data-driven policing. The first reason
  • 9:35 - 9:44
    for that is that one of the main
    reasons for cities to invest in smart city
  • 9:44 - 9:49
    infrastructure is the question of
    security. This is something we've explored
  • 9:49 - 9:54
    in our latest report on smart cities and
    this is emerging also from the work we're
  • 9:54 - 10:01
    doing with other organizations including Coding
    Rights in Brazil and DRF in Pakistan. So
  • 10:01 - 10:06
    actually Brazil is an interesting example,
    because before the mega events they
  • 10:06 - 10:10
    hosted, like the football
    World Cup and the Olympics, they invested
  • 10:10 - 10:17
    massively in smart city infrastructure,
    including projects with IBM. And precisely
  • 10:17 - 10:20
    the purpose of what they were trying to
    achieve with their smart city
  • 10:20 - 10:26
    infrastructure was making the city safer,
    so it was very strongly connected
  • 10:26 - 10:32
    with the police. So this is a picture for
    example of the control room that
  • 10:32 - 10:39
    was built to control CCTV cameras and to
    create graphs in order to showcase where
  • 10:39 - 10:46
    crime was happening and also in a way the
    likelihood of natural disasters in some
  • 10:46 - 10:52
    areas. In Pakistan there is a whole new
    program of investment in smart cities,
  • 10:52 - 10:59
    which is actually referred to as the safe
    city project. Now companies understand
  • 10:59 - 11:05
    that very well and this is actually an
    image from an IBM presentation describing
  • 11:05 - 11:11
    their vision of smart cities. And as you
    see, policing is very much
  • 11:11 - 11:17
    integrated into their vision, their
    heavily centralized vision of what smart
  • 11:17 - 11:23
    cities are. So it's no wonder that
    companies that offer smart city
  • 11:23 - 11:28
    infrastructure are actually now also
    offering a platform for policing. So those
  • 11:28 - 11:35
    companies include IBM as I mentioned but
    also Oracle and Microsoft. We see in many
  • 11:35 - 11:40
    countries, including the UK where we are based,
    pressure on budgets and budget
  • 11:40 - 11:44
    reductions for the police, and so there is
    a very strong appeal to this narrative:
  • 11:44 - 11:51
    that you can purchase a platform, you can
    gather more data, and that will help you do
  • 11:51 - 11:58
    policing in less time and do it more
    efficiently. But little thought is given
  • 11:58 - 12:03
    to the impact on society, or the right to
    privacy, and what happens if someone
  • 12:03 - 12:13
    unexpected takes the reins of power. Now
    we're gonna briefly explain what data-
  • 12:13 - 12:20
    driven policing looks like, and eventually
    Millie will look at our findings. So
  • 12:20 - 12:26
    the first thing I wanted to discuss is
    actually predictive policing, because
  • 12:26 - 12:31
    that's often what we think of and
    talk about when we think about data-
  • 12:31 - 12:38
    driven policing. I mentioned CompStat
    before and essentially predictive policing
  • 12:38 - 12:43
    works on a similar premise. The idea is
    that if you map where crime happens you
  • 12:43 - 12:51
    can eventually guess where the next crime
    will happen. So the key player in
  • 12:51 - 12:55
    predictive policing is this company called
    PREDPOL, I mean I think they describe
  • 12:55 - 12:58
    pretty much what they do, they use
    artificial intelligence to help you
  • 12:58 - 13:06
    prevent crime, right, predicting when and
    where crime will most likely occur. Now
  • 13:06 - 13:11
    PREDPOL and other companies are using
    something called a Hawkes process, which is
  • 13:11 - 13:17
    normally used for the prediction of
    earthquake tremors. What Hawkes
  • 13:17 - 13:23
    originally did was analyze
    how after an earthquake you have
  • 13:23 - 13:29
    aftershocks, and usually the aftershocks tend
    to happen where the original earthquake
  • 13:29 - 13:36
    happened and in a short period of time
    after it. So the Hawkes process basically
  • 13:36 - 13:41
    is described as: when a certain event
    happens, other events of the same kind will
  • 13:41 - 13:45
    happen shortly after in the
    same location. Now obviously it actually
  • 13:45 - 13:51
    works quite well for earthquakes, whether
    it works for crime is a lot more
  • 13:51 - 13:56
    questionable. But that's actually the
    premise on which the companies
  • 13:56 - 14:02
    offering predictive policing services
    are relying. So basically, applied to
  • 14:02 - 14:09
    predictive policing the mantra is: by
    monitoring data on places where crime is
  • 14:09 - 14:13
    happening, you can identify geographic
    hotspots where crime will most likely
  • 14:13 - 14:21
    happen again. Now other companies than
    PREDPOL are joining in and they are adding
  • 14:21 - 14:26
    more data than simply the location of
    past crimes. This data has included
  • 14:26 - 14:31
    open source intelligence, which we'll talk a
    little bit more about later on,
  • 14:31 - 14:36
    weather reports, census data, the location
    of key landmarks like bars, churches, and
  • 14:36 - 14:40
    schools, dates of sporting events, and moon
    phases. I'm not quite sure what they're
  • 14:40 - 14:50
    doing with moon phases but somehow that's
    something they're using. When predictive
  • 14:50 - 14:56
    policing first sort of emerged, one of
    the key concerns was whether our world was
  • 14:56 - 15:01
    going to be turning into a Minority Report
    kind of scenario where people are arrested
  • 15:01 - 15:05
    before a crime is even committed and
    companies like PREDPOL were quick to
  • 15:05 - 15:10
    reassure people and say that they are not
    concerned with who will commit crime but with
  • 15:10 - 15:16
    where crimes are happening. Now that's not
    actually true because in fact at the
  • 15:16 - 15:21
    moment we see several programs emerging
    especially in the US, where police
  • 15:21 - 15:26
    departments are concerned not so much with
    where crimes are happening, but with who's
  • 15:26 - 15:31
    committing them. So I'm gonna talk about two
    examples of this: One is the Kansas City No
  • 15:31 - 15:38
    Violence Alliance, which is a program led
    by the local police to identify who will
  • 15:38 - 15:43
    become the next criminal - basically - and
    they're using an algorithm that combines
  • 15:43 - 15:48
    data from traditional policing as well as
    social media intelligence and information
  • 15:48 - 15:54
    that they have on drug use. Based on this
    they create graphs, generated using
  • 15:54 - 16:02
    predictive policing to show how certain
    people are connected to already convicted
  • 16:02 - 16:06
    criminals and gang members. Once they've
    identified these people they request
  • 16:06 - 16:11
    meetings with them, whether or not they've
    committed crimes in the past. And
  • 16:11 - 16:16
    they would have a discussion about their
    connection to those convicted criminals
  • 16:16 - 16:22
    and gang members, and they are
    warned that if a crime
  • 16:22 - 16:27
    next happens within their network of
    people, every person connected to this
  • 16:27 - 16:33
    network will be arrested whether or not
    they were actually involved in the crime
  • 16:33 - 16:38
    being committed. Now there are actually
    dozens of police departments that are
  • 16:38 - 16:46
    using similar programs. The Chicago Police
    Department has an index of the 400 people
  • 16:46 - 16:50
    most likely to be involved in violent
    crimes. That sounds like a BuzzFeed
  • 16:50 - 16:56
    article but it is actually a reality
    which is extremely concerning, because
  • 16:56 - 17:02
    those people who are on this list are for
    the most part not actual criminals, they
  • 17:02 - 17:08
    are purely seen to be connected to people
    who've committed crimes. So if your next-
  • 17:08 - 17:17
    door neighbor is a criminal then you may
    well find your name on that list. Now
  • 17:17 - 17:21
    predictive policing is deceptive and
    problematic for several reasons: First of
  • 17:21 - 17:27
    all there's the question of the
    presumption of innocence. In a world where
  • 17:27 - 17:33
    even before you commit a crime you can
    find your name on that list or be called
  • 17:33 - 17:38
    by the police - you know - what happens to
    this very basis of democracy which is the
  • 17:38 - 17:43
    presumption of the of innocence. But also
    there's the other question of whether we can
  • 17:43 - 17:48
    really use the math that was originally
    designed for earthquakes and apply it to
  • 17:48 - 17:53
    human beings, because human beings don't
    work like earthquakes. They have their own
  • 17:53 - 18:00
    set of biases and the biases
    start with how we collect the data. For
  • 18:00 - 18:08
    example, if the police are more likely to
    police areas where there are minorities,
  • 18:08 - 18:12
    people of color, then obviously the data
    they will have will be disproportionately
  • 18:12 - 18:18
    weighted towards people of color. Likewise if
    they are unlikely to investigate white-
  • 18:18 - 18:24
    collar crime, they will be unlikely to have
    data reflecting the reality that
    crime also happens in wealthier areas. So
    crime also happens in wealthier areas. So
    basically we are inputting biased datasets
  • 18:29 - 18:35
    that obviously will lead to biased
    results. And what these biased results
  • 18:35 - 18:42
    mean is that they will continue the already
    existing trend of over-policing
  • 18:42 - 18:48
    communities of color and low-income
    communities. I'll leave it to Millie for
  • 18:48 - 18:56
    the next box. So, one of the increasingly
    popular technologies we're seeing in the
  • 18:56 - 19:01
    UK, and is no doubt used around the world
    and probably at border points, although we
  • 19:01 - 19:06
    need more help with the research to prove
    this, is mobile phone extraction. The
  • 19:06 - 19:11
    police can extract data from your phone,
    your laptop, and other devices which
  • 19:11 - 19:16
    results in a memory dump of the extracted
    data taken from your device and now held
  • 19:16 - 19:23
    in an agency database. So for example all
    your photos, all your messages, and all
  • 19:23 - 19:28
    those of people who had no idea they would
    end up in a police database because
  • 19:28 - 19:35
    they're associated with you, retained for
    as long as the police wish. Now these
  • 19:35 - 19:39
    devices are pretty user friendly for the
    police and if you're interested you can
  • 19:39 - 19:43
    look on YouTube, where Cellebrite, one of
    the big players, has lots of videos about
  • 19:43 - 19:49
    how to use them. Depending on
    the device and the operating system, some
  • 19:49 - 19:54
    of the data - this is from a police document
    listing what they can extract using a
  • 19:54 - 20:02
    Cellebrite UFED - is what you might expect:
    device information, calls, messages,
  • 20:02 - 20:09
    emails, social media, and Wi-Fi networks.
    But if you look at their website and here
  • 20:09 - 20:15
    are a few examples they can also collect:
    system and deleted data, they can access
  • 20:15 - 20:21
    cloud storage, and inaccessible partitions
    of the device. Now this is data that is
  • 20:21 - 20:26
    clearly beyond the average user's control,
    and as the volume of data we hold on our
  • 20:26 - 20:32
    phones increases so will this list. And
    the companies we know the UK police are
  • 20:32 - 20:39
    using, which include Cellebrite, Acceso,
    Radio Tactics, and MSAB, are all aware of how
  • 20:39 - 20:45
    valuable this is, and as one of them has
    stated: "if you've got access to a person's
  • 20:45 - 20:50
    SIM card, you've got access to the whole
    of a person's life". They also go on to
  • 20:50 - 20:56
    note: "the sheer amount of data stored on
    mobile phones is significantly greater
  • 20:56 - 21:04
    today than ever before." There are also no
    temporal limits to the extraction of data,
  • 21:04 - 21:09
    this is from another police document we
    obtained and it shows that if you choose
  • 21:09 - 21:16
    to extract a certain data type you will
    obtain all data of a particular type, not
  • 21:16 - 21:21
    just the data relevant to an
    investigation. So all that data sits on a
  • 21:21 - 21:28
    police database indefinitely, and even if
    you were asked whether you were happy for
  • 21:28 - 21:33
    your data to be extracted during an
    investigation I think it's highly unlikely
  • 21:33 - 21:38
    you would realize the volume that the
    police were going to take. Other targets
  • 21:38 - 21:44
    for the police that we know about are:
    infotainment systems in cars, Smart TVs,
  • 21:44 - 21:51
    and connected devices in the home. This is
    an extract from a techUK report, where
  • 21:51 - 21:57
    Mark Stokes, head of digital forensics at
    the Met Police, which is the police force in London,
  • 21:57 - 22:03
    stated in January that the crime scene of
    tomorrow will be the Internet of Things
  • 22:03 - 22:08
    and detectives of the future will carry a
    digital forensics toolkit that will help
  • 22:08 - 22:15
    them analyze microchips and download data
    at the scene rather than removing devices
  • 22:15 - 22:20
    for testing. Now I can imagine that the
    evidence storage room is going to get a
  • 22:20 - 22:25
    bit full if they start dragging in
    connected fridges, hair dryers, hair
  • 22:25 - 22:33
    brushes, your Google Home, Amazon Echo, and
    whatever else you have. However, their
  • 22:33 - 22:38
    plans to walk into your home and download
    everything make no mention of needing a
  • 22:38 - 22:44
    specific warrant and so the only
    limitations at the moment are the
  • 22:44 - 22:50
    protections that may exist on the devices.
    The law does not protect us and this needs
  • 22:50 - 22:59
    to change. Eva Blum-Dumontet: So I'm going to be talking a
    little bit about open source intelligence
  • 22:59 - 23:05
    and in particular social media
    intelligence, because when I talked about
  • 23:05 - 23:11
    predictive policing I identified those two
    sources as some of the data that's being
  • 23:11 - 23:17
    used for predictive policing. Now, open
    source intelligence is often thought of as,
  • 23:17 - 23:23
    or assumed to be, innocuous, and
    there is the understanding that if
  • 23:23 - 23:29
    information is publicly available then it
    should be fair for the police to use. Now
  • 23:29 - 23:34
    the problem is that among open source
    intelligence there's often social media
  • 23:34 - 23:41
    intelligence, which we refer to as
    SOCMINT. Now there are many ways to
  • 23:41 - 23:46
    conduct SOCMINT and it can range from
    a single police officer, who is
  • 23:46 - 23:54
    just you know using Facebook or Twitter to
    look up the accounts of victims or
  • 23:54 - 23:59
    suspected criminals, but there are also
    companies that are scraping the likes of
  • 23:59 - 24:05
    Facebook and Twitter to allow the police
    to monitor social media. Now social media
  • 24:05 - 24:11
    has blurred the lines between public
    and private, because obviously we are
  • 24:11 - 24:18
    broadcasting our views on these platforms,
    and at the moment the police have been
  • 24:18 - 24:25
    exploiting this kind of unique space, this
    blurred line; they are accessing this
  • 24:25 - 24:31
    content in a completely unregulated
    manner, as long as the content is publicly
  • 24:31 - 24:38
    available. For example, you don't need
    to be friends or to have any already
  • 24:38 - 24:43
    established connection with the suspected
    criminal or the victim;
  • 24:43 - 24:49
    anything that's available to you is fair game.
    It's completely unregulated, there are no rules.
  • 24:49 - 24:57
    And I mentioned earlier the question of
    budget restrictions, and so the police are
  • 24:57 - 25:02
    benefiting hugely from this because it
    doesn't really cost anything to use social
  • 25:02 - 25:07
    media so at the moment SOCMINT is kind of
    like the first and easy step in a police
  • 25:07 - 25:14
    investigation because there is no cost and
    because there is no oversight. Now,
  • 25:14 - 25:19
    SOCMINT actually isn't so innocent in the
    sense that it allows the police to
  • 25:19 - 25:26
    identify the locations of people based on
    their posts, it allows them to establish
  • 25:26 - 25:31
    people's connections, their relationships,
    their associations, it allows the
  • 25:31 - 25:37
    monitoring of protests and also to identify
    the leaders of various movements, and to
  • 25:37 - 25:46
    measure a person's influence. Now, in the
    UK what we know is that the police are
  • 25:46 - 25:52
    largely using marketing products, so this
    is an anonymous quote from a report by
  • 25:52 - 25:58
    academics that have been doing research on
    SOCMINT and what someone said was that: "A
  • 25:58 - 26:02
    lot of stuff came out of marketing because
    marketing were using social media to
  • 26:02 - 26:05
    understand what people were saying about
    their product... We wanted to understand
  • 26:05 - 26:12
    what people were saying so it's almost
    using it in reverse". Now again, this is
  • 26:12 - 26:16
    not considered a surveillance device,
    this is purely a marketing product that
  • 26:16 - 26:23
    they're using and for that reason law
    enforcement agencies and security agencies
  • 26:23 - 26:30
    are often arguing that SOCMINT has
    basically no impact on privacy. But
  • 26:30 - 26:37
    actually when your posts reveal your
    location, or when the content of your posts
  • 26:37 - 26:40
    reveals what used to be considered, and is
    still considered, sensitive
  • 26:40 - 26:45
    private information like details about
    your sexual life, about your health, about
  • 26:45 - 26:50
    your politics, can we really minimize the
    impact of the police accessing this
  • 26:50 - 26:56
    information? Now obviously we may not have
    a problem with the average Twitter user or
  • 26:56 - 27:01
    with a friend reading this information but
    when the ones who are reading the
  • 27:01 - 27:06
    information and taking actions on this
    information have power over us like the
  • 27:06 - 27:18
    police does, you know, what does it
    actually mean for our right to privacy?
  • 27:18 - 27:27
    That's not to say that people should stop
    using social media, but rather to ask what kind of
  • 27:27 - 27:33
    regulation we can put in place so that
    it's not so easy for the police to access it.
  • 27:33 - 27:42
    The absence of regulations on SOCMINT has
    actually already led to abuse in two cases
  • 27:42 - 27:48
    both in the US that we've identified: One
    is Raza v. City of New York, which is a
  • 27:48 - 27:56
    case from the ACLU where it was
    found out that
  • 27:56 - 28:00
    the New York Police Department was
    systematically gathering intelligence on
  • 28:00 - 28:05
    Muslim communities, and one of the ways
    they were gathering this intelligence was
  • 28:05 - 28:12
    essentially by surveilling social media
    accounts of Muslims in New York. The
  • 28:12 - 28:17
    second case involves a company called ZeroFox.
    What ZeroFox does is social media
  • 28:17 - 28:23
    monitoring. Now, during the riots that
    followed the funeral of Freddie Gray,
  • 28:23 - 28:30
    Freddie Gray was a 25-year-old black man
    who had died in police custody, so after
  • 28:30 - 28:37
    his funeral there had been a series of
    riots in Baltimore, and ZeroFox produced a
  • 28:37 - 28:41
    report that they shared with the Baltimore
    Police to essentially advertise for their
  • 28:41 - 28:48
    social media monitoring tool, and
    what the company was doing was again like
  • 28:48 - 28:53
    browsing social media and trying to
    establish who were the threat actors in
  • 28:53 - 28:59
    these riots and among the 19 threat
    actors that they identified two of them
  • 28:59 - 29:04
    were actually leaders of the Black Lives
    Matter movement. Actually, at least one of
  • 29:04 - 29:10
    them was a woman, definitely not a physical
    threat, but this is how they were
  • 29:10 - 29:18
    essentially labeled. So these two examples
    actually show that again it's still sort
  • 29:18 - 29:24
    of the same targets, it's people of
    color, it's activists, it's people from
  • 29:24 - 29:30
    low-income backgrounds, who are singled
    out as likely criminals. And it's very
  • 29:30 - 29:34
    telling when we realize that SOCMINT is
    actually one of the sources of data that's
  • 29:34 - 29:39
    eventually used for predictive policing
    and then again predictive policing leading
  • 29:39 - 29:45
    to people being more surveilled and
    potentially exposed to more police
  • 29:45 - 29:51
    surveillance based on the fact that they
    are all singled out as likely criminals. Now
  • 29:51 - 29:57
    social media is a fascinating place
    because it's a mix between a private and a
  • 29:57 - 30:02
    public space as I said we are broadcasting
    our views publicly but then again it's a
  • 30:02 - 30:08
    privately owned space where we follow the
    rules that are set by private companies.
  • 30:08 - 30:14
    Now, if we want to protect this space and
    ensure that free expression and
  • 30:14 - 30:19
    political organization can still happen in
    these spaces, we need to fully understand how
  • 30:19 - 30:23
    much the police have been exploiting these
    spaces and how we can limit and regulate
  • 30:23 - 30:30
    the use of it. Now, I'll hand over to Millie
    to talk about what we can do next. So I'm going to
  • 30:30 - 30:33
    briefly look at some of our initial
    findings we've made using Freedom of
  • 30:33 - 30:40
    Information requests, broadly: the lack of
    awareness by the public, weak legal basis,
  • 30:40 - 30:45
    and a lack of oversight. Now, sometimes
    the lack of awareness appears intentional
  • 30:45 - 30:55
    - we asked the police about their plans to
    extract data from connected devices in the
  • 30:55 - 31:02
    home and they replied neither confirm nor
    deny. Now this is kind of a bizarre
  • 31:02 - 31:07
    response given that Mark Stokes who's a
    member of the police had already said that
  • 31:07 - 31:14
    they plan to do this, in addition the UK
    government Home Office replied to us
  • 31:14 - 31:18
    saying the Home Office plans to develop
    skills and capacity to exploit the
  • 31:18 - 31:24
    Internet of Things as part of criminal
    investigations. They also said that police
  • 31:24 - 31:30
    officers will receive training in relation
    to extracting, obtaining, retrieving, data
  • 31:30 - 31:35
    from or generated by connected devices. So
    we wrote back to every police force in the
  • 31:35 - 31:41
    UK that had refused to reply to us and
    presented the evidence but they maintained
  • 31:41 - 31:46
    their stance so we will be bringing a
    challenge against them under the Freedom
  • 31:46 - 31:52
    of Information Act. Now, Eva has also
    identified the huge risks associated with
  • 31:52 - 31:58
    predictive policing yet in the UK we've
    found out this is set to increase with
  • 31:58 - 32:02
    forces either using commercial tools or
    in-house ones they've developed or
  • 32:02 - 32:09
    planning trials for 2018. There has been
    no public consultation, there are no
  • 32:09 - 32:14
    safeguards, and there is no oversight. So
    when we ask them more questions about the
  • 32:14 - 32:21
    plans we were told we were 'vexatious' and
    they won't respond to more requests so it
  • 32:21 - 32:27
    seems like we have yet another challenge,
    and what about mobile phone extraction
  • 32:27 - 32:33
    tools here are some of the stats that have
    been found out and I would say these
  • 32:33 - 32:37
    aren't completely accurate because it
    depends on how reliable the police force
  • 32:37 - 32:43
    are in responding but roughly I'd say it's
    probably now more than 93 percent of UK
  • 32:43 - 32:48
    police forces throughout the country are
    extracting data from digital devices. We
  • 32:48 - 32:53
    know they plan to increase, we've seen in
    their documents they plan to train more
  • 32:53 - 32:59
    officers, to buy more equipment, and to
    see extraction as a standard part of
  • 32:59 - 33:04
    arrest, even if the devices had absolutely
    nothing to do with the offense and so
  • 33:04 - 33:10
    these figures are likely to increase
    exponentially, but in the UK not only do
  • 33:10 - 33:16
    the police not need a warrant, in documents
    we've read they do not even need to notify
  • 33:16 - 33:21
    the individual that they have extracted
    data, for example, from their mobile phone
  • 33:21 - 33:28
    or that they're storing it. If this is
    being done without people's knowledge how
  • 33:28 - 33:32
    on earth can people challenge it, how can
    they ask for their data to be removed if
  • 33:32 - 33:40
    they're found innocent? Turning to social
    media monitoring which the police refer to
  • 33:40 - 33:44
    as open source research. This is Jenny
    Jones she's a member of the House of Lords
  • 33:44 - 33:51
    in the Green Party and next to her photo
    is a quote from her entry on the domestic
  • 33:51 - 33:57
    extremism database, and so, if a member of
    the House of Lords is being subject to
  • 33:57 - 34:05
    social media monitoring for attending a
    bike ride then I think it's highly likely
  • 34:05 - 34:09
    that a large number of people who
    legitimately exercise their right to
  • 34:09 - 34:14
    protest are being subject to social media
    monitoring. Now, this hasn't gone
  • 34:14 - 34:20
    unnoticed completely although they're
    slightly old these are quotes from two
  • 34:20 - 34:25
    officials: the first the UK independent
    reviewer of terrorism who notes that the
  • 34:25 - 34:30
    extent of the use of social media
    monitoring is not publicly known, and the
  • 34:30 - 34:34
    second is the chief surveillance
    commissioner who, and this is a very
  • 34:34 - 34:39
    strong statement for a commissioner, is
    saying that basically social media should
  • 34:39 - 34:48
    not be treated as fair game by the police.
    So now I'll move on to a weak or outdated
  • 34:48 - 34:53
    legal basis. For most of the technologies
    we've looked at it's very unclear what
  • 34:53 - 34:58
    legal basis the police are using even when
    we've asked them. This relates to mobile
  • 34:58 - 35:04
    phone extraction - so the legislation
    they're relying on is over 30 years old
  • 35:04 - 35:11
    and is wholly inappropriate for mobile
    phone extraction this law was developed to
  • 35:11 - 35:17
    deal with standard traditional searches,
    the search of a phone can in no way be
  • 35:17 - 35:22
    equated to the search of a person, or the
    search of a house, and despite the fact
  • 35:22 - 35:27
    that we have repeatedly asked whether a
    warrant is required, this is not the case, and we
  • 35:27 - 35:31
    believe that there should be a warrant in
    place not only in the UK but in the rest
  • 35:31 - 35:36
    of the world. So if you think that either
    you or your friends have had their data
  • 35:36 - 35:39
    extracted when arrested, or your
    phone has been in the possession of the
  • 35:39 - 35:46
    authorities you should be asking
    questions, and very briefly something on
  • 35:46 - 35:52
    lack of oversight, so we reported in
    January this year about documents that
  • 35:52 - 35:58
    were obtained by The Bristol Cable's
    investigation into Cellebrite and one
  • 35:58 - 36:04
    report said that in half of the cases
    sampled the police had
  • 36:04 - 36:10
    failed to receive authorization internally
    for the use of extraction tools. Poor
  • 36:10 - 36:16
    training undermined investigations into
    serious offences such as murder, and
  • 36:16 - 36:21
    inadequate security practices meant that
    encryption was not taking place even when
  • 36:21 - 36:27
    it was easy to do and they were losing
    files containing intimate personal data.
  • 36:27 - 36:33
    So why does this matter? Here are some key
    points: In relation to information
  • 36:33 - 36:38
    asymmetry - it's clear as Eva has
    explained that the police can now access
  • 36:38 - 36:44
    far more data on our devices than the
    average user. In relation to imbalance of
  • 36:44 - 36:47
    power - it's clear they can collect and
    analyze sources that are beyond our
  • 36:47 - 36:54
    control whether it's publicly placed
    sensors, cameras, and other devices. There
  • 36:54 - 36:59
    is also unequal access and if lawyers
    don't know what's being gathered they
  • 36:59 - 37:04
    don't know what to ask for from the
    police. All in all this puts the
  • 37:04 - 37:10
    individual at a huge disadvantage. Another
    impact is the chilling effect on political
  • 37:10 - 37:17
    expression now I'm sure many of you maybe
    think that the police monitor your social
  • 37:17 - 37:22
    media but the average person is unlikely
    to, and so if they start to know about
  • 37:22 - 37:27
    this are they going to think twice about
    joining in protesting either physically or
  • 37:27 - 37:32
    using a hashtag, and what about who your
    friends are? If they know you attend
  • 37:32 - 37:39
    protests are they really going to want to have
    their data on your phone if they know that
  • 37:39 - 37:44
    potentially that could be extracted and
    end up on a police database? It's far
  • 37:44 - 37:49
    easier to be an anonymous face among many
    people than a single isolated person
  • 37:49 - 37:55
    standing up to power but these new forms
    of policing we have been discussing
  • 37:55 - 38:00
    redefine the very act of protesting by
    singling out each and every one of us from
  • 38:00 - 38:08
    the crowd. So, what can we do? Many of you
    will be familiar with these technologies,
  • 38:08 - 38:13
    but do you know how to find out what the
    police are doing? In the UK we've been
  • 38:13 - 38:17
    using Freedom of Information requests, we
    want to do this with people throughout
  • 38:17 - 38:22
    Europe and you don't need to be a lawyer
    so please get in touch. We also want to
  • 38:22 - 38:27
    dig into the technology a bit more, I want
    someone to use a Cellebrite UFED on my
  • 38:27 - 38:32
    phone and show me exactly what can come
    out of it, and we want to tell lawyers and
  • 38:32 - 38:37
    activists about these new techniques. Many
    lawyers I speak to who are experts in
  • 38:37 - 38:42
    actions against the police do not know the
    police are using these tools. This means
  • 38:42 - 38:47
    they don't know the right questions to ask
    and so it's fundamental you speak to
  • 38:47 - 38:51
    people who are bringing these cases and
    tell them about what they can do or what
  • 38:51 - 38:57
    questions they should be asking, and
    finally we want you to also raise the
  • 38:57 - 39:18
    debate, to share our research, and to
    critique it, thank you.
  • 39:18 - 39:24
    Herald: So we've got ample time for
    Q&A are there any questions in the hall,
  • 39:24 - 39:29
    yes, there's one over there.
    Question: You mentioned the problem of
  • 39:29 - 39:33
    when they do physical extraction from the
    Cellebrite device it's going to get all of
  • 39:33 - 39:38
    the photos, all of the emails, or whatever
    it may be, rather than just what the
  • 39:38 - 39:42
    investigator needs. What is the solution
    to that from your eyes is there a
  • 39:42 - 39:46
    technical one that these companies are
    gonna have to implement - which they're
  • 39:46 - 39:51
    not going to - or a legal one, because on
    the other side a mobile phone is a crucial
  • 39:51 - 39:57
    part in any criminal investigation in
    2017. So what's the workaround or the
  • 39:57 - 40:00
    solution to that?
    Answer: I think it's both, I think the
  • 40:00 - 40:04
    fact that there isn't any law looking at
    this and no one's discussing whether there can be
  • 40:04 - 40:09
    a technical solution or does it need to be
    one where there's better regulation and
  • 40:09 - 40:13
    oversight so you extract everything, can
    you keep it for a certain period to see
  • 40:13 - 40:17
    what's relevant then do you have to delete
    it? The trouble is we don't see any
  • 40:17 - 40:22
    deletion practices and the police have
    publicly stated in the media that they can
  • 40:22 - 40:27
    just keep everything as long as they like.
    They like data you can kind of see why but
  • 40:27 - 40:31
    that doesn't mean they should keep
    everyone's data indefinitely just in case
  • 40:31 - 40:35
    it's useful so I think there may be tech
    solutions there may be legal ones and I
  • 40:35 - 40:41
    think perhaps both together as is one of
    the answers. Herald: The next question
  • 40:41 - 40:45
    from microphone one please.
    Q: I'm just wondering how those laws on
  • 40:45 - 40:50
    action and power given to the cops are
    being sold to the UK people, is it
  • 40:50 - 40:57
    to fight terrorism as I said or to fight
    drugs or this kind of stuff, what's the
  • 40:57 - 41:00
    argument used by the government to sell
    that to the people.
  • 41:00 - 41:05
    A: I think actually one thing that's
    important to bear in mind is that I'm
  • 41:05 - 41:11
    not sure most of the public in the
    UK is even aware of it, so I think unlike
  • 41:11 - 41:15
    the work of intelligence services and
    agencies where terrorism is used as the
  • 41:15 - 41:22
    excuse for ever more power and especially
    laws that have become increasingly
  • 41:22 - 41:26
    invasive, actually with policing we don't
    even fall in that kind of discourse
  • 41:26 - 41:31
    because it's actually hardly talked about
    in the UK. Yeah, and the mobile phone
  • 41:31 - 41:35
    extraction stuff we've been looking at is
    low-level crimes, so that's like you
  • 41:35 - 41:41
    have, it could be you know a pub fight,
    it could be a robbery, which that's more
  • 41:41 - 41:46
    serious, it could be an assault, so they
    want to use it in every case. For all the
  • 41:46 - 41:48
    other techniques we have no idea what
    they're using for that's one of the
  • 41:48 - 41:54
    problems.
    Herald: The next question from the
  • 41:54 - 41:57
    internet please.
    Q: When you say that there's a lack of
  • 41:57 - 42:04
    laws and regulations for police concerning
    the extraction of data from devices, are
  • 42:04 - 42:10
    you talking just about UK and/or USA or do
    you have any examples of other countries
  • 42:10 - 42:14
    who do better or worse?
    A: I don't know of any country that has a
  • 42:14 - 42:19
    regulation on publicly available
    information on social media.
  • 42:19 - 42:26
    Herald: Microphone number four.
    Q: Thank you again for a great talk. In
  • 42:26 - 42:32
    terms of data exploitation an element that
    I didn't hear you talk about that I'd like
  • 42:32 - 42:36
    to hear a little bit more is when there
    are questions around who is doing the
  • 42:36 - 42:40
    exploitation, I know in the U.S. some FOIA
    researchers get around how difficult it is
  • 42:40 - 42:45
    to get data from the feds by going after
    local and state police departments, is
  • 42:45 - 42:48
    that something that you're doing or do you
    have a way of addressing confusion when
  • 42:48 - 42:51
    people don't know what agency has the
    data?
  • 42:51 - 42:57
    A: Yeah, I think actually one of the
    things the data exploitation program at
  • 42:57 - 43:00
    Privacy International is doing is actually
    looking into the connection between the
  • 43:00 - 43:06
    private sector and governments because
    obviously at the moment there's the whole
  • 43:06 - 43:10
    question of data brokers which is an
    industry that's hardly regulated at all,
  • 43:10 - 43:14
    that people don't necessarily know about,
    we don't, the companies that are doing it
  • 43:14 - 43:20
    aren't familiar household names. I'll let
    Millie talk a lot more about the
  • 43:20 - 43:25
    government aspects of it. I guess the
    question is again on a country-by-country
  • 43:25 - 43:29
    basis, we work in many countries that
    don't have any data protection regulations
  • 43:29 - 43:37
    at all so there is this first difficulty
    as how do we regulate, how do we limit the
  • 43:37 - 43:41
    power of the state when you don't even
    have the basic legislation around
  • 43:41 - 43:46
    data protection? One thing to bear in mind
    is the problem with companies:
  • 43:46 - 43:53
    how do you also hold companies accountable
    whereas with the state there is the whole
  • 43:53 - 43:58
    challenge of finding the right legal
    framework to limit their power, but maybe
  • 43:58 - 44:02
    I'll let Millie talk a little bit more
    about this. Yeah, with our FOIA
  • 44:02 - 44:06
    requests we tend to go after everyone so
    with the example of the Home Office saying
  • 44:06 - 44:09
    something that the other police didn't
    that was because we went to all the
  • 44:09 - 44:15
    different state bodies and I think that
    there's a good example in the States
  • 44:15 - 44:18
    where there's far more research done on
    what the police are doing, but they're
  • 44:18 - 44:23
    using the same product in the UK I think
    it's Axiom and they're a storage device
  • 44:23 - 44:29
    for body-worn camera videos, and a lawyer
    in the states said that in order to access
  • 44:29 - 44:33
    the video containing his client he had to
    agree to the terms and conditions on Axiom's
  • 44:33 - 44:38
    website which basically gave them full use
    of his client's video about a crime scene.
  • 44:38 - 44:43
    So that's a private company having use of
    this video so given that we found they're
  • 44:43 - 44:47
    using it in the UK we don't know if those
    kind of terms and conditions exist but
  • 44:47 - 44:55
    it's a very real problem as they rely
    increasingly on private companies.
  • 44:55 - 44:58
    Herald: Number two please.
    Q: Thank you for your work perhaps you've
  • 44:58 - 45:03
    already answered this partially from other
    people's questions but it looks like we
  • 45:03 - 45:09
    have a great way to start the process and
    kind of taking the power back but you know
  • 45:09 - 45:13
    the state and the system certainly doesn't
    want to give up this much power, how do we
  • 45:13 - 45:18
    actually directly, what's kind of the
    endgame, what are the strategies for making
  • 45:18 - 45:25
    the police or the governments give up and
    restore balance, is it a suit, is it
  • 45:25 - 45:28
    challenging through Parliament and in the
    slow process of democracy, or what do you
  • 45:28 - 45:32
    think is the right way of doing it?
    A: I never think one works on its own,
  • 45:32 - 45:37
    even though I'm a litigator I often think
    litigation is quite a weak tactic,
  • 45:37 - 45:41
    particularly if you don't have the public
    on side, and then again if you don't have
  • 45:41 - 45:44
    Parliament. So we need all of them and
    they can all come through different means
  • 45:44 - 45:49
    so we wouldn't just focus on one. In
    different countries it might be that you
  • 45:49 - 45:54
    go down the legal route or down the
    parliamentary route but in the UK we're
  • 45:54 - 45:57
    trying all different routes so for example
    on mobile phone extraction in the
  • 45:57 - 46:01
    beginning of next year we're going to be
    doing a video, we're going to be
  • 46:01 - 46:04
    interviewing the public and speaking to
    them about it, we're going to be going to
  • 46:04 - 46:09
    Parliament, and I've also been speaking to
    a lot of lawyers so I'm hoping some cases
  • 46:09 - 46:15
    will start because those individual cases
    brought by local lawyers are where also
  • 46:15 - 46:20
    you see a lot of change like the John Catt
    case, that's one lawyer, so I think we
  • 46:20 - 46:26
    need all different things to see what
    works and what sticks.
  • 46:26 - 46:31
    Herald: We haven't had number three yet.
    Q: Hi, thanks for the talk, so I have a
  • 46:31 - 46:39
    question concerning the solution
    side of things because one aspect I was
  • 46:39 - 46:46
    missing in your talk was the economics of
    the game actually because like you are
  • 46:46 - 46:52
    from the UK and the private sector has
    stepped in also in another public
  • 46:52 - 46:59
    domain, the NHS, to help out because funds
    are missing and I would like to ask you
  • 46:59 - 47:03
    whether or not you think first of all the
    logic is the same within the police
  • 47:03 - 47:13
    departments because there might also be
    a cost-driven aspect to limit the salaries
  • 47:13 - 47:19
    or because you have the problem with
    police force coming in because you have to
  • 47:19 - 47:24
    pay their rents, and automating things
    especially when it's given to the private
  • 47:24 - 47:31
    sector which has another whole logic of
    thinking about this stuff, is cost saving
  • 47:31 - 47:44
    and so maybe it would be nice
    if you could talk a bit about the,
  • 47:44 - 47:49
    I'm sorry, the attempt to maybe like get
    economics a bit more into the picture when
  • 47:49 - 47:56
    it comes to solutions of the whole thing.
    A: So I think yeah, you're very right in
  • 47:56 - 48:02
    pointing out the relation, that
    you compare what's happening with the NHS
  • 48:02 - 48:08
    and what's happening with the police
    because in both the economics of
  • 48:08 - 48:15
    companies offering policing services arise
    from the same situation: there's a need for
  • 48:15 - 48:23
    doing more efficient policing because of
    budget cuts, so the same way the NHS is
  • 48:23 - 48:30
    being essentially privatized due to the
    budget cuts and due to the needs
  • 48:30 - 48:35
    that arise from being limited in your
    finance, again there's a similar thing
  • 48:35 - 48:39
    with the police: when you're
    understaffed then you're more likely to
  • 48:39 - 48:44
    rely on technologies to help you do
    your work more efficiently because
  • 48:44 - 48:51
    essentially with predictive policing the
    idea behind this is that if you know where
  • 48:51 - 48:56
    and when crime will happen then you can
    focus the limited resources you have there
  • 48:56 - 49:03
    and not sort of look at a more global
    larger picture. So I mean I'm not gonna be
  • 49:03 - 49:07
    here on stage advocating for more funds
    for the police, I'm not gonna do that, but
  • 49:07 - 49:12
    I think that there is a desperate
    need to reframe actually the narrative
  • 49:12 - 49:19
    around how we do policing actually and
    then definitely also look at a different
  • 49:19 - 49:23
    perspective and a different approach to
    policing because as I've tried to show
  • 49:23 - 49:28
    it's been a really long time since this
    narrative developed that more data leads
  • 49:28 - 49:33
    to crime resolution but actually what I
    didn't have the time to get into in this
  • 49:33 - 49:37
    talk is actually all the research
    showing that those products actually don't
  • 49:37 - 49:43
    work. PredPol, for example, is basically
    gaslighting a lot of police officers with
  • 49:43 - 49:48
    their figures, the kind of figures they
    are pushing and suggesting are just
  • 49:48 - 49:54
    plainly inaccurate, it's not accurate to
    compare a city in one year to what a
  • 49:54 - 49:59
    city is becoming in another year so it's
    not even clear that a lot of these
  • 49:59 - 50:05
    projects are even properly functioning
    and in a sense I don't want them to
  • 50:05 - 50:09
    function I'm not gonna say if we had
    better predictive policing then the
  • 50:09 - 50:15
    problem would be solved, no, that is not the
    question, the question is how do we have
  • 50:15 - 50:21
    regulation that forces the police to look
    differently into the way they are
  • 50:21 - 50:26
    conducting policing.
    Herald: Number four please.
  • 50:26 - 50:32
    Q: So, thank you for your presentation I
    have a question about SOCMINT, in my opinion
  • 50:32 - 50:37
    SOCMINT might violate the terms of
    service of for example Twitter and
  • 50:37 - 50:41
    Facebook have you tried to cooperate with
    these companies to make them actually
  • 50:41 - 50:46
    enforce their TOS?
    A: So actually there are two things, as I
  • 50:46 - 50:51
    said like all companies that are doing
    scraping of data and you're right in this
  • 50:51 - 50:59
    case they violate the terms of service of
    Facebook and Twitter. Now, the other
  • 50:59 - 51:03
    problem is that there is already a loophole to
    this and actually the marketing company I
  • 51:03 - 51:08
    was talking about that's being used by the
    UK police what they essentially do is that
  • 51:08 - 51:14
    they purchase the data from Facebook and
    Twitter, so this is why it's interesting
  • 51:14 - 51:20
    because when Facebook says we don't sell
    your data, well essentially actually with
  • 51:20 - 51:26
    marketing tools that are there to monitor
    what people say about products essentially
  • 51:26 - 51:30
    what they're doing is selling your data,
    they're not selling necessarily like your
  • 51:30 - 51:34
    name or your location or things like that
    but whatever you're going to be posting
  • 51:34 - 51:41
    publicly for example in like groups or
    public pages is something that they are
  • 51:41 - 51:45
    going to be trying to sell to those
    companies. So I think you're right and
  • 51:45 - 51:51
    maybe Millie will have more to say about
    this. I think those companies have a role
  • 51:51 - 51:56
    to play but at the moment I think the
    challenge we face is actually this loophole
  • 51:56 - 52:01
    whereby, by purchasing the
    data directly from the company they don't
  • 52:01 - 52:07
    violate the terms of
    service. Yeah, we've spoken a bit to
  • 52:07 - 52:13
    some of the social media companies, we've
    been told that one of their big focuses is
  • 52:13 - 52:18
    the problems of the social media
    monitoring at the U.S. border and so
  • 52:18 - 52:23
    because there's a lot known about that
    they're looking at those issues so I think
  • 52:23 - 52:27
    once we show more and more the problems
    say in the UK or in other countries I
  • 52:27 - 52:32
    think it would be very interesting to look
    at what's happened over the Catalan
  • 52:32 - 52:37
    independence vote period to see how social
    media was used then. I think the companies
  • 52:37 - 52:42
    aren't going to react until we make them
    although they probably will meet with us.
  • 52:42 - 52:50
    A slightly different aspect we revealed in
    a different part of our work that the
  • 52:50 - 52:53
    intelligence agencies were gathering
    social media that's probably not
  • 52:53 - 52:58
    groundbreaking news but it was
    there in plain fact and so they all got a
  • 52:58 - 53:01
    bit concerned about how that was
    happening, whether some of them knew or
  • 53:01 - 53:06
    some of them didn't, so the better our
    research the more people speaking about it
  • 53:06 - 53:11
    I think they will engage, or we'll find
    out whether the police are getting it
  • 53:11 - 53:17
    lawfully or unlawfully.
    Herald: Number one please.
  • 53:17 - 53:21
    Q: Thanks for your talk, I have a question
    on predictive policing because German
  • 53:21 - 53:29
    authorities in the last two years piloted
    Precobs predictive policing projects in three states I think
  • 53:29 - 53:34
    and they claimed that they would never use
    these techniques with data on individuals
  • 53:34 - 53:39
    but only aggregate data like the near-repeat
    stuff you presented and they
  • 53:39 - 53:43
    presented it as just an additional tool in
    their toolbox and that if used responsibly
  • 53:43 - 53:48
    can lead to more cost effective policing,
    do you buy this argument or would you say
  • 53:48 - 53:55
    that there's inevitably a slippery slope or
    kind of like a path dependency to more
  • 53:55 - 54:01
    granular data assessment or evaluation
    that would inevitably infringe on privacy
  • 54:01 - 54:05
    rights?
    A: I think this goes back to the question
  • 54:05 - 54:09
    of like you know are we using predictive
    policing to identify where crime is
  • 54:09 - 54:14
    happening or who it is who's committing a
    crime but actually I think even if
  • 54:14 - 54:19
    we stick to this even if we stick to
    identifying where crime is happening we
  • 54:19 - 54:24
    still run into problems we still run into
    the fundamental problem of predictive
  • 54:24 - 54:29
    policing which is we only have data on
    crimes that have already been reported
  • 54:29 - 54:36
    or already been addressed by the police,
    and that's by essence already biased data.
  • 54:36 - 54:41
    If we have police in some areas then we're
    more likely to, you know, further police them,
  • 54:41 - 54:52
    because the solutions of those companies, of
    those algorithms, will be leading to more
  • 54:52 - 54:58
    suggestions that crime is happening
    more predominantly in those areas. So, as
  • 54:58 - 55:04
    we've seen so far, we fall into
    these fundamental problems of just
  • 55:04 - 55:11
    overpolicing communities that are already
    overpoliced. So in a sense in terms of
  • 55:11 - 55:18
    well the right to privacy but also the
    question of the presumption of innocence I
  • 55:18 - 55:23
    think purely just trying to
    cultivate data on where crime is
  • 55:23 - 55:30
    happening is not efficient policing
    first of all but it's also causing
  • 55:30 - 55:35
    challenges for fundamental rights as well.
    Yeah, I guess it's not a great comparison
  • 55:35 - 55:39
    but a lot of what they're bringing in
    now is a program to assist you with the
  • 55:39 - 55:44
    charging decision, so you've got someone
    you've arrested do you charge them or not?
  • 55:44 - 55:48
    The police say oh well of course it's only
    advisory you only have to look at how busy
  • 55:48 - 55:53
    a police station is to know how advisory
    that is going to be and how much is it
  • 55:53 - 55:57
    going to sway your opinion. So the more
    you use these tools the more it makes your
  • 55:57 - 56:01
    job easier because rather than thinking,
    where are we going to go, what areas
  • 56:01 - 56:04
    are things going to happen in, who are we going
    to arrest, well the computer told us to do
  • 56:04 - 56:09
    this so let's just do that.
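The feedback loop described in this answer can be sketched as a toy simulation. All numbers below are hypothetical illustrations, not real figures: two districts with identical true crime rates, patrols allocated to wherever recorded crime is highest, and crime only being recorded where officers are present to observe it.

```python
# Toy model of the predictive-policing feedback loop discussed above.
# All rates and counts are hypothetical; this is an illustration, not real data.

TRUE_RATE = 0.3   # identical underlying crime rate in both districts
PATROLS = 100     # patrols available per week

# A small historical imbalance: district A starts with more recorded crime.
recorded = {"A": 12.0, "B": 10.0}

for week in range(20):
    # "Hot spot" logic: the district with more recorded crime gets most patrols.
    hot = max(recorded, key=recorded.get)
    alloc = {d: 70 if d == hot else 30 for d in recorded}
    # Recorded crime tracks police presence, not actual crime levels,
    # because crime is only recorded where officers are there to see it.
    for d in recorded:
        recorded[d] += alloc[d] * TRUE_RATE

share_a = recorded["A"] / sum(recorded.values())
print(f"district A's share of recorded crime after 20 weeks: {share_a:.2f}")
```

Although both districts have the same true rate, the initial imbalance is amplified week after week: the data "confirms" that district A is the problem area, which is exactly the overpolicing-the-overpoliced loop described in the answer.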
    Herald: Thank you and microphone number
  • 56:09 - 56:13
    three please.
    Q: Thank you, do you think that there are
  • 56:13 - 56:20
    any credible arguments to be made for
    limiting the police's abilities under acts
  • 56:20 - 56:25
    in the UK that incorporate EU level
    restrictions on privacy, data protection,
  • 56:25 - 56:30
    human rights or fundamental rights and if
    so do you anticipate that those arguments
  • 56:30 - 56:35
    might change after Brexit?
    A: Well they're bringing in the GDPR and
  • 56:35 - 56:40
    the Law Enforcement Directive now and
    they're not going to scrap those once
  • 56:40 - 56:44
    Brexit comes in. We'll still be part,
    hopefully, of the European Court of Human
  • 56:44 - 56:49
    Rights, but not the European Court of
    Justice. I think there are going to be
  • 56:49 - 56:52
    implications it's going to be very
    interesting how they play it out they're
  • 56:52 - 56:57
    still going to want the data from Europol,
    they want to be part of Interpol, policing
  • 56:57 - 57:02
    operates at a different level and I think
    if they have to comply with certain laws
  • 57:02 - 57:06
    so that they can play with the big boys
    then they probably will, but they may do
  • 57:06 - 57:12
    things behind the scenes, so it depends
    where it works for them, but certainly the
  • 57:12 - 57:16
    politicians and definitely the police
    wanna be part of those groups. So we'll
  • 57:16 - 57:21
    have to see, but we will still use them
    and we'll still rely on European judgments
  • 57:21 - 57:27
    though the force they have in a court of law may
    be more difficult.
  • 57:27 - 57:32
    Herald: Does the internet have any
    questions? Nope. Well then, number two
  • 57:32 - 57:36
    please.
    Q: So you've mentioned that they don't
  • 57:36 - 57:42
    have really good operational security and
    sometimes some stuff leaked that should
  • 57:42 - 57:48
    not have. Now, within the last year we had
    major data leaks all across the world like
  • 57:48 - 57:55
    Philippines, South Africa, just to mention
    a few. Now, if the security, the OPSEC, is so
  • 57:55 - 58:00
    bad in the police in Great Britain it's
    not unlikely that something will happen
  • 58:00 - 58:05
    in Europe of a similar kind what kind of
    impact do you think such a huge data leak
  • 58:05 - 58:12
    of private information which the police
    legally stored would have, even if it was not
  • 58:12 - 58:17
    leaked by the police and it would be leaked
    by a private company that had some way
  • 58:17 - 58:19
    access to it?
    A: I guess it depends what it
  • 58:19 - 58:25
    is. If it's a database with serious
    criminals, and only the bad people, then
  • 58:25 - 58:29
    people will think it's good they have
    that information, but they
  • 58:29 - 58:36
    need to make it more secure. If
    somehow databases leaked which held all sorts of
  • 58:36 - 58:40
    information, say from people's mobile
    phones, innocent people's pictures, all
  • 58:40 - 58:45
    that kind of thing then we might see a
    much wider public reaction to the tools
  • 58:45 - 58:51
    that are used and the safeguards, the
    legal safeguards, will come a lot quicker
  • 58:51 - 58:56
    than we will probably achieve the way
    we're going now, because there'll be
  • 58:56 - 59:02
    a bigger public outrage.
    Herald: Okay, one last and hopefully short
  • 59:02 - 59:07
    question from microphone one.
    Q: Hi, thanks for the talk, it was really
  • 59:07 - 59:10
    interesting. It's actually quite a short
    question: how much is a Cellebrite, and can
  • 59:10 - 59:15
    we buy one?
    A: I did look to buy one, I think there
  • 59:15 - 59:21
    were some on eBay, but I'm not sure if they
    were the right thing. A couple of
  • 59:21 - 59:24
    thousand pounds, but I think you have to
    actually be a police force to get those
  • 59:24 - 59:31
    ones; maybe there are other types. It's
    expensive but not unobtainable, but
  • 59:31 - 59:35
    I'm trying to find universities that might
    have them because I think that a lot of
  • 59:35 - 59:38
    forensic schools might. I'm hoping that
    they will; I know they do extractions of
  • 59:38 - 59:42
    laptops, but I haven't found one yet that
    does phones; then again, I probably haven't asked
  • 59:42 - 59:46
    enough people.
    Herald: So thank you very much.
  • 59:46 - 59:51
    34C3 Music
  • 59:51 - 60:07
    subtitles created by c3subtitles.de
    in the year 2020. Join, and help us!