
34C3 - Uncertain Concern

  • 0:00 - 0:15
    34c3 preroll music
  • 0:15 - 0:19
    Herald Angel (H): OK, probably a couple
    years ago you realized that a lot of the
  • 0:19 - 0:26
    refugees coming up from Syria and North
    Africa were communicating, were
  • 0:26 - 0:30
    using technology in an interesting way to
    find their way around a lot of the border
  • 0:30 - 0:37
    patrols, a lot of the hurdles that were
    put up in their way. In the US we have a
  • 0:37 - 0:42
    similar issue but it's different in many
    ways with illegal immigrants trying to
  • 0:42 - 0:48
    stay underneath the radar. Allison
    McDonald from the University of Michigan
  • 0:48 - 0:54
    has been studying how immigrants in
    the States deal with technology and it's
  • 0:54 - 1:00
    very different from here. Her interests
    are in technology, privacy, society and
  • 1:00 - 1:04
    human rights and I think we're gonna have
    an awesome talk from her. So, well, please
  • 1:04 - 1:16
    welcome her and we'll get moving.
    Applause
  • 1:16 - 1:20
    Allison McDonald: OK, thanks for coming.
    I'm Allison from the University of
  • 1:20 - 1:26
    Michigan. I'm talking today primarily
    about technology in immigration
  • 1:26 - 1:30
    enforcement and specifically about how the
    immigrant community in the United States
  • 1:30 - 1:36
    is responding to those changes and
    especially the undocumented community.
  • 1:36 - 1:40
    Before we get too far into the details I
    just wanted to tell a little bit of a
  • 1:40 - 1:45
    story. This is Anna Maria. She is not a
    real person; she is sort of a composite
  • 1:45 - 1:50
    of many people that we spoke to but her
    story is really representative of a lot of
  • 1:50 - 1:56
    people that we know are living in
    the United States today. She and her
  • 1:56 - 2:03
    husband emigrated from Mexico about 12
    years ago to the United States. She
  • 2:03 - 2:06
    really wanted to have children, but
    couldn't get the fertility support that
  • 2:06 - 2:11
    she needed in Mexico so she came to the
    United States. And now she and her husband
  • 2:11 - 2:18
    have two children who are attending US
    public schools. She and her husband are
  • 2:18 - 2:23
    both working and saving up to buy a
    house. They pay taxes; they attend church
  • 2:23 - 2:27
    every Sunday. They're involved in a lot of
    community events and are really integrated
  • 2:27 - 2:34
    into the local community. One
    difference between Anna Maria and a lot of
  • 2:34 - 2:40
    other people is that she's in the United
    States as an undocumented immigrant. What
  • 2:40 - 2:45
    this means is that she either entered the
    United States without legal authorization
  • 2:45 - 2:56
    or she came on a visa and overstayed the
    allotted time. That means that day to day
  • 2:56 - 3:03
    she has to worry about being found and
    deported back to Mexico, removed from her
  • 3:03 - 3:11
    home and this puts her in quite a
    precarious situation trying to live a
  • 3:11 - 3:17
    normal life, a life similar to a lot of
    other people in our communities. But with
  • 3:17 - 3:21
    this constant concern that this life
    could be taken away from her if she's
  • 3:21 - 3:28
    detected. Other than this one point
    she really lives this immigration story
  • 3:28 - 3:34
    that the United States loves to tell. We
    love to have this narrative of people
  • 3:34 - 3:38
    being able to come to the United States
    and build lives for themselves that they
  • 3:38 - 3:42
    might not be able to build in their
    origin countries. And that's exactly
  • 3:42 - 3:51
    what she's done. But just as natural to
    this immigration story is a history of a
  • 3:51 - 3:59
    lot of discrimination, racism and
    xenophobia. All the way back in the 1700s
  • 3:59 - 4:04
    we've had legislation that prevents people
    from becoming citizens based on their
  • 4:04 - 4:11
    origin country. We've had, for example,
    the Chinese Exclusion Act, preventing
  • 4:11 - 4:17
    laborers from China from coming to the
    United States entirely. The Asiatic Barred
  • 4:17 - 4:21
    Zone a few decades later just drew a box
    on a map and said the people in this
  • 4:21 - 4:26
    region can't immigrate to the United
    States. We've also seen things like the
  • 4:26 - 4:33
    Johnson-Reed Immigration Act in the early
    1900s, where the US took census data from
  • 4:33 - 4:39
    before a big wave of immigration putting a
    quota system in place that essentially
  • 4:39 - 4:45
    prevented people from eastern and southern
    Europe from coming to the United States.
  • 4:45 - 4:52
    This history of discrimination and racism
    continues to today. Many of you, I'm sure
  • 4:52 - 4:57
    have heard of what's happening now with
    the so-called Muslim ban where a list of
  • 4:57 - 5:02
    seven countries is now blacklisted;
    immigrants from them are unable to enter the
  • 5:02 - 5:12
    country. And this is just another data
    point to show the trend that our discourse
  • 5:12 - 5:21
    and immigration policy in the United
    States are often racialized. I want to talk
  • 5:21 - 5:23
    a little bit about what immigration
    enforcement actually looks like in the
  • 5:23 - 5:30
    United States. The agency that manages
    enforcement is called the US Immigration
  • 5:30 - 5:40
    and Customs Enforcement or ICE. They're in
    charge of enforcement within the borders
  • 5:40 - 5:44
    once people have already entered the
    country, finding people without
  • 5:44 - 5:50
    documentation or managing immigration
    cases. Over the last couple of decades
  • 5:50 - 5:56
    they've really been gaining in size and
    power. This is anything from the removal
  • 5:56 - 6:03
    of privacy restrictions on sharing data
    between federal agencies to an increase in
  • 6:03 - 6:11
    financial resources after 9/11. And this
    is happening even today. President Trump
  • 6:11 - 6:16
    back in January had an executive order
    that is looking to add another 5,000
  • 6:16 - 6:21
    agents to their current 20,000 over the
    next couple of years. So this is an agency
  • 6:21 - 6:27
    that's continuing to be bolstered. And
    another way that they're changing,
  • 6:27 - 6:33
    recently, is the way that they're
    integrating technology into their jobs.
  • 6:33 - 6:38
    This photo in particular shows a
    fingerprint scanner. The collection of
  • 6:38 - 6:43
    biometric data is becoming really common
    in immigration enforcement. And it's not
  • 6:43 - 6:48
    just when someone's taken into an
    immigration office but mobile fingerprint
  • 6:48 - 6:52
    scanners are being taken into communities.
    There are stories of people having their
  • 6:52 - 6:58
    biometric data taken even without arrest:
    being stopped in the street, or being near
  • 6:58 - 7:03
    someone who's being detained for a
    particular reason, and everyone in the area or
  • 7:03 - 7:09
    everyone in the household having their
    biometric data taken. We've also seen the
  • 7:09 - 7:13
    removal of some restrictions on how this
    data can be shared between federal
  • 7:13 - 7:22
    agencies. In particular President Trump
    has reinstated the Secure Communities
  • 7:22 - 7:27
    Program which allows local police officers
    when they're booking people for local
  • 7:27 - 7:33
    crimes or in local jails to take biometric
    data and cross-check it against federal
  • 7:33 - 7:47
    immigration databases and crime databases.
    We're also seeing evidence that... so
  • 7:47 - 7:51
    DHS, the Department of Homeland
    Security, is the umbrella organization over
    ICE. We have recently seen through a
    ICE. We have recently seen through a
    Freedom of Information request that this
  • 7:59 - 8:05
    organization has used cell-site simulators
    or stingrays over 1,800 times in the last
  • 8:05 - 8:12
    five years. We don't know all of the cases
    where these have been used. And we really
  • 8:12 - 8:17
    can't speculate; these cases are shrouded
    in secrecy and we don't know when and how
  • 8:17 - 8:21
    they're being used. But we do have one
    case, it's actually close to my home in
  • 8:21 - 8:28
    Detroit, Michigan, where, for an undocumented
    man, ICE was able to send a warrant to
  • 8:28 - 8:33
    Facebook to get his phone number and then
    use that phone number with a cell-site
  • 8:33 - 8:42
    simulator to track him to his home and
    ended up deporting him to El Salvador.
  • 8:42 - 8:47
    We're also seeing this move to start
    collecting social media data at the
  • 8:47 - 8:52
    borders. This isn't just for people on
    temporary visas but also naturalized
  • 8:52 - 9:00
    citizens and people with permanent
    residency cards. This might not be so
  • 9:00 - 9:04
    relevant to people who are already in the
    country because they're not crossing the
  • 9:04 - 9:09
    border regularly, but this might be
    impactful if they have friends and family
  • 9:09 - 9:15
    crossing borders to visit them. And new
    immigrants as well. This is a database
  • 9:15 - 9:21
    that we don't really know what it's being
    used for yet. But there are some hints in
  • 9:21 - 9:28
    the way that, for example, ICE has been
    soliciting contracts from big data
  • 9:28 - 9:33
    companies to create algorithms to do this
    extreme vetting to be able to find
  • 9:33 - 9:40
    suspicious activity or suspicious people
    from troves of social media data. In fact
  • 9:40 - 9:45
    we have already seen some of these
    contracts being awarded. There was a $3
  • 9:45 - 9:51
    million contract recently given to a
    company called Giant Oak who claims to
  • 9:51 - 9:58
    take big data and find bad guys. Their
    creepy slogan: "We see the people behind
  • 9:58 - 10:07
    the data", trademark. And this is just
    another example of the way that technology
  • 10:07 - 10:14
    is being used in ways that are sort
    of unpredictable at this point but
  • 10:14 - 10:22
    we have many examples where this
    style of research can often be
  • 10:22 - 10:30
    discriminatory. And it might be expected
    that at this point in time technology
  • 10:30 - 10:34
    ends up integrated into law enforcement
    in the way that it's being integrated into
  • 10:34 - 10:38
    a lot of different parts of our lives. But
    there's a reason that this moment in
  • 10:38 - 10:44
    particular is so frightening. This
    administration's making it abundantly
  • 10:44 - 10:50
    clear what they think of immigration. Just
    in less than a year so far we've seen the
  • 10:50 - 10:54
    repeal of the Deferred Action for
    Childhood Arrivals program, which you might
  • 10:54 - 11:01
    also hear as the DREAM Act or people here
    talking about Dreamers. This is a program
  • 11:01 - 11:05
    that allowed people who entered the
    country under the age of 16 to get work
  • 11:05 - 11:12
    permits and driver's licenses and attend
    university and have their immigration
  • 11:12 - 11:19
    cases delayed so long as they're meeting
    educational goals. We've seen the
  • 11:19 - 11:28
    elimination of privacy protections from
    sharing data between federal agencies. And
  • 11:28 - 11:32
    in addition to the actual concrete policy
    changes, we're hearing a lot of really
  • 11:32 - 11:37
    nasty rhetoric around immigrants and
    immigration. That's causing a lot of
  • 11:37 - 11:42
    concern among people who are in the
    immigrant community or who are allies to
  • 11:42 - 11:47
    the immigrant community about what this
    means in terms of harassment and hatred
  • 11:47 - 11:55
    even beyond the legal changes. We're
    also seeing a change in deportation
  • 11:55 - 12:05
    practices. While Obama was prolific in
    deportations, he had a very explicit
  • 12:05 - 12:10
    policy in place that the priority
    deportations would be people who were
  • 12:10 - 12:14
    national security threats whatever that
    might mean, or people with serious
  • 12:14 - 12:20
    criminal records, or people who had just
    recently entered the United States. That
  • 12:20 - 12:23
    policy is being removed and we're seeing
    more and more people who are deported
  • 12:23 - 12:31
    after living in the United States for a
    long time with family and friends and
  • 12:31 - 12:36
    lives built in the communities; who might
    have family or children who are US
  • 12:36 - 12:47
    citizens, and who don't have criminal records.
    So what does this mean for Anna Maria? For
  • 12:47 - 12:51
    one, without a criminal record, she
    previously might have been able to have
  • 12:51 - 12:55
    some high amount of confidence that she
    wouldn't be a priority target and that
  • 12:55 - 13:02
    confidence is being eroded. We're
    seeing lots of people who previously
  • 13:02 - 13:11
    wouldn't have been targeted be deported
    regardless of their clean records and lack
  • 13:11 - 13:17
    of any actions that really make them more
    visible than they have been in the past.
  • 13:17 - 13:21
    She and her husband are starting to think
    about, what happens to their children if
  • 13:21 - 13:25
    they're deported. They have to make a
    decision, because the children were born in
  • 13:25 - 13:30
    the United States and are US citizens.
    They have to decide whether they should
  • 13:30 - 13:34
    give custody to friends and family who can
    stay in the United States, or if they
  • 13:34 - 13:38
    should take them back to Mexico, rather
    than letting them stay and get the US
  • 13:38 - 13:44
    education that they want to have. She has
    to be concerned about ICE being in her
  • 13:44 - 13:49
    community and outside of her home.
    Possibly having her fingerprints taken if
  • 13:49 - 13:53
    she's in the wrong place at the wrong
    time. She might have to worry about
  • 13:53 - 13:58
    friends and family from Mexico visiting,
    and crossing the border, and having social
  • 13:58 - 14:04
    media data taken from them. That, I mean,
    as we all know, might reveal a lot more
  • 14:04 - 14:11
    than just facts about the person who's crossing
    the border. Our social media lives give a
  • 14:11 - 14:17
    lot of information about her networks that
    might expose information about her. It's
  • 14:17 - 14:20
    also worth noting that Anna Maria is far
    from alone. There are as many as 11
  • 14:20 - 14:25
    million undocumented immigrants in the
    United States today. Over 2/3 of them have
  • 14:25 - 14:29
    been in the United States for more than 10
    years which means they're integrated into
  • 14:29 - 14:35
    our communities, they own houses, they
    have jobs, they pay taxes, they live
  • 14:35 - 14:40
    really normal lives to the extent that
    they can in the United States. They've
  • 14:40 - 14:49
    built their lives here. So with this
    context in mind, I and some of my
  • 14:49 - 14:54
    collaborators were wondering how this is
    really changing the way that people use
  • 14:54 - 15:01
    technology, or if it is at all, given the sort of
    objectively heightened risk that they're
  • 15:01 - 15:05
    facing day to day. We wanted to know
    whether or not there's any sort of
  • 15:05 - 15:15
    reaction to those changes happening in
    their daily lives. We reached out to some
  • 15:15 - 15:19
    immigration support organizations, so
    immigrant rights and activist
  • 15:19 - 15:26
    organizations and worked with them to be
    able to communicate with this community.
  • 15:26 - 15:31
    In the end, we were able to talk to 17
    undocumented immigrants in the Midwest. We
  • 15:31 - 15:37
    were primarily asking them about how they
    manage risk in their daily lives offline,
  • 15:37 - 15:43
    as well as online. And whether or not
    that's changing over the last year or two
  • 15:43 - 15:46
    years, when this discourse around
    immigration is really changing, and then
  • 15:46 - 15:54
    whether these changes that we're seeing,
    are causing them to maybe react in the way
  • 15:54 - 16:01
    that they're using technology. I can tell
    you a little bit about who we spoke to.
  • 16:01 - 16:07
    The majority were women, 14 of our 17
    participants were women. Most of them were
  • 16:07 - 16:13
    in their mid 30s, average age 35. And lots
    of them had children. So it was a lot of
  • 16:13 - 16:17
    parents. Everyone that we spoke to, had
    been in the United States for more than 10
  • 16:17 - 16:23
    years. So they really had their lives and
    their communities here. And most of them
  • 16:23 - 16:27
    were also from Mexico. That's about
    consistent with the immigrant community in
  • 16:27 - 16:34
    the United States, especially from Latin
    America. The majority are from Mexico. And
  • 16:34 - 16:37
    then there was a mix of immigration
    stories. Some of the people we spoke to
  • 16:37 - 16:43
    had crossed the southern border by foot or
    otherwise. And some people had overstayed
  • 16:43 - 16:53
    visas, had flown to the United States and
    stayed. So we wanted to first get an idea
  • 16:53 - 16:57
    of how they're managing and sort of
    thinking about risk in their daily lives
  • 16:57 - 17:05
    offline to get a sense of how deeply it
    impacts the way that they're living. What
  • 17:05 - 17:10
    we found across the board is that
    immigration is a really sort of looming
  • 17:10 - 17:16
    presence in their lives. They think a lot
    about how they're exposing themselves, and
  • 17:16 - 17:22
    thereby possibly exposing their status to
    authority figures. And they put a lot
  • 17:22 - 17:30
    of careful consideration into how to keep
    a low profile. Driving is one really good
  • 17:30 - 17:38
    example of this cost-benefit
    analysis that they're doing. Most people
  • 17:38 - 17:41
    we spoke to talked about driving one
    way or another, and about half chose to
  • 17:41 - 17:47
    drive and half chose not to. Most of the
    people don't have driver's licenses for
  • 17:47 - 17:52
    the United States because it's difficult
    to get them without legal immigration
  • 17:52 - 17:58
    papers. So the risk with driving is that
    if you're stopped, if you're pulled over,
  • 17:58 - 18:02
    even if you didn't have a traffic
    violation, if you're stopped for a taillight or
  • 18:02 - 18:06
    something, the routine is to ask for
    documentation of your license. And if you
  • 18:06 - 18:09
    don't have that there might be more
    questions, and in the end, you could
  • 18:09 - 18:17
    expose yourself to immigration or other
    law enforcement. Some people really
  • 18:17 - 18:23
    thought that the risk was worth it. To
    live their lives how they want to. They're
  • 18:23 - 18:27
    going to try to just not think about the
    risk and do what they need to do day to
  • 18:27 - 18:33
    day. Other people felt that the risk was
    too great and chose not to drive at all.
  • 18:33 - 18:37
    And that's a significant sacrifice,
    especially in the United States where our
  • 18:37 - 18:41
    public transportation systems aren't
    fantastic. This might mean that they can't
  • 18:41 - 18:44
    set their own work schedules, or they
    can't take their kids to school if they
  • 18:44 - 18:50
    miss the bus. So it's a significant risk.
    But it's also a big sacrifice if they
  • 18:50 - 18:57
    choose not to drive. People also think a
    lot about how they're exposing themselves
  • 18:57 - 19:04
    to authority figures. As one example, the
    decision to file taxes or not is a big
  • 19:04 - 19:10
    risk. So in the United States, you don't
    need to have any sort of government ID to
  • 19:10 - 19:17
    file taxes, you just need a tax ID. So a
    lot of these people are filing taxes. But
  • 19:17 - 19:22
    in order to do that, they are giving up to
    the federal government their names, their
  • 19:22 - 19:27
    addresses, their employment history,
    contact information. And some people think
  • 19:27 - 19:33
    that that risk is worth it, right? Because
    this person, for example, feels like, by
  • 19:33 - 19:41
    paying taxes every year they're able to
    establish a good history of upstanding
  • 19:41 - 19:45
    behavior. They can maybe have a better
    case for getting a legal status if the
  • 19:45 - 19:57
    time comes, when that's an option. And
    another example of, you know, exposing
  • 19:57 - 20:02
    information to authorities, might be
    filing for benefits for US born children,
  • 20:02 - 20:09
    or even library cards, or local ID cards.
    And the risk is going to be different in
  • 20:09 - 20:14
    each case depending on what they're
    exposing. Some people chose to forego
  • 20:14 - 20:21
    significant benefits to avoid giving that
    information to authorities. This person is
  • 20:21 - 20:26
    talking about DACA, the Deferred Action
    for Childhood Arrivals program. This would
  • 20:26 - 20:31
    make it much easier for their son to go to
    college and, hopefully, if they
  • 20:31 - 20:36
    trust the program, give their son a much more reliable
    immigration status. They wouldn't
  • 20:36 - 20:41
    technically have a legal immigration
    status but they would be sort of assured
  • 20:41 - 20:46
    that their status, or rather their
    immigration case is a low priority. They
  • 20:46 - 20:50
    wouldn't be targeted. And as long as
    they're attending university, they could
  • 20:50 - 20:56
    have confidence, so the program says, that
    they wouldn't be targeted. These people
  • 20:56 - 21:01
    were concerned because in order to file
    that paperwork for their son, they had to
  • 21:01 - 21:04
    give up a lot of information about
    themselves: their phone numbers, their
  • 21:04 - 21:10
    names, their addresses. And in the end,
    they decided not to do it. And
  • 21:10 - 21:14
    unfortunately, only weeks after we spoke
    to this person, the DACA program was
  • 21:14 - 21:19
    repealed. This has led a lot of people to
    be concerned because the people who did
  • 21:19 - 21:23
    apply for the program, have given that
    information to the government, to the
  • 21:23 - 21:28
    Immigration services in particular. And at
    this point in time, we have no assurances
  • 21:28 - 21:33
    that that information won't be used in
    immigration cases. At the moment, there's
  • 21:33 - 21:39
    just a sort of FAQ page that says, we
    don't use this information now but we
  • 21:39 - 21:47
    reserve the right to change that at any
    time without telling anyone. People are
  • 21:47 - 21:52
    also really feeling the changes that are
    happening in the last couple of months.
  • 21:52 - 21:58
    Well, it's been more than a couple of months, the last
    year and a half. They're feeling the
  • 21:58 - 22:03
    pressure in their communities of
    immigration
  • 22:03 - 22:10
    enforcement being more present and less
    predictable. One person described
  • 22:10 - 22:13
    feeling like, instead of coming to take a
    particular person, they're just coming and
  • 22:13 - 22:19
    looking for anyone who might be
    undocumented. Many people that we spoke
  • 22:19 - 22:24
    to, had negative experiences with ICE.
    And even if they
  • 22:24 - 22:28
    hadn't experienced it themselves, lots
    of people had friends and family who had
  • 22:28 - 22:33
    negative experiences. And they're feeling
    this increase in presence of enforcement
  • 22:33 - 22:38
    in their communities. And this is leading
    them to make significant changes to the
  • 22:38 - 22:43
    way that they're living their lives. For
    example, one person we spoke to talked
  • 22:43 - 22:47
    about how they won't leave their child at
    home alone anymore because they're worried
  • 22:47 - 22:52
    that, if
    they're picked up while they're out and
  • 22:52 - 22:56
    the child's at home alone, the child might be
    left there. Or ICE might even show up at
  • 22:56 - 22:59
    the house while the child's there alone.
    They don't want either of those things to
  • 22:59 - 23:07
    happen. So people are changing a lot of
    the ways that they live day to day. And
  • 23:07 - 23:13
    this is a very present concern, in the way
    that they talk about their daily lives. So
  • 23:13 - 23:16
    we were wondering if this is true when
    they think about the way that they use
  • 23:16 - 23:22
    technology and what they're doing online.
    First, let me just give you an overview of
  • 23:22 - 23:27
    what sort of technologies they primarily
    use. This community is really
  • 23:27 - 23:32
    mobile-heavy. Some people had computers in the
    home. A lot of people had access to
  • 23:32 - 23:35
    computers through local libraries and
    things. But everyone had a smartphone and
  • 23:35 - 23:40
    they were very dependent on it. Some
    people used email but when they spoke
  • 23:40 - 23:48
    about email, it was mostly to do with
    communicating with their kids' schools or
  • 23:48 - 23:51
    doctor's appointments. It wasn't really a
    social thing. So the majority of what we
  • 23:51 - 23:57
    spoke to people about, were social media
    tools. In particular, all but one of our
  • 23:57 - 24:03
    participants were active users of
    Facebook. Most people were using WhatsApp
  • 24:03 - 24:10
    and Facebook Messenger, as well. These are
    the three primary tools that people had
  • 24:10 - 24:16
    the most to say about. There were some
    other tools that they were on: Instagram,
  • 24:16 - 24:23
    Twitter, and Snapchat. But really, the
    overarching sense that people
  • 24:23 - 24:26
    had about these tools is that it's
    bringing significant benefits to their
  • 24:26 - 24:31
    daily lives. Especially, when you think
    about this community being separated
  • 24:31 - 24:37
    permanently from a lot of their friends
    and family back home, or their former
  • 24:37 - 24:42
    home, their origin country. Compared to what they had
    to do before, maybe sending photos in the
  • 24:42 - 24:47
    mail or through postcards, buying
    international calling cards, being able to
  • 24:47 - 24:51
    call people with video chat now is a
    significant improvement to their ability
  • 24:51 - 24:56
    to keep in touch with people back in
    Mexico or in wherever their... the origin
  • 24:56 - 25:03
    country is. People also talked about, how
    it's improving their lives in other ways.
  • 25:03 - 25:07
    For example, being able to organize their
    own work schedules, and have more control
  • 25:07 - 25:12
    over the way that they're employed. The
    benefits go on and on, and it's a lot of
  • 25:12 - 25:15
    the same things that we've experienced
    over the last decade, and the way that our
  • 25:15 - 25:21
    lives have changed for the better. Because
    we're able to use these technologies. When
  • 25:21 - 25:26
    we asked people about risk, the things that
    really pop into their heads first, are
  • 25:26 - 25:33
    hackers. They're really concerned about
    fraud and identity theft. And they think a
  • 25:33 - 25:38
    lot about their children contacting
    strangers on the internet, or accessing
  • 25:38 - 25:48
    inappropriate content. But that's not to
    say that concerns related to their status,
  • 25:48 - 25:58
    their illegal status were absent. They're
    just much less certain. You know, it's
  • 25:58 - 26:04
    easy to think about the consequences of
    identity theft. That's sort of concrete.
  • 26:04 - 26:12
    But a lot of these status related concerns
    were less concrete. People talked about
  • 26:12 - 26:17
    harassment as well, being something that's
    increasing in the real world, as well as
  • 26:17 - 26:28
    online. In particular participating in
    communities, or in conversations online
  • 26:28 - 26:34
    that might expose their immigration
    status. This harassment has moved online.
  • 26:34 - 26:39
    They're experiencing it in the real world,
    as well, but they're hearing stories or
  • 26:39 - 26:42
    having stories themselves about people
    threatening them with immigration
  • 26:42 - 26:54
    enforcement. That's increasing over the
    last year or so. There are a couple of
  • 26:54 - 27:01
    ways that people manage these risks.
    Primarily, what we found is that people's
  • 27:01 - 27:07
    concrete steps for
    managing their privacy online were fairly
  • 27:07 - 27:12
    basic things like, making sure that they
    only accept friends and family on
  • 27:12 - 27:20
    Facebook. They might have set their
    profile to private. But they're really not
  • 27:20 - 27:24
    fiddling with these more fine-grained
    privacy settings. They're not, you know,
  • 27:24 - 27:28
    sharing particular posts only with
    particular people. They
  • 27:28 - 27:31
    didn't tell us
    about using private groups or
  • 27:31 - 27:40
    anything like that to sort of create
    separate spheres of friends and family.
  • 27:40 - 27:45
    There's also channel management, in the sense
    that, even though they think about
  • 27:45 - 27:49
    curating this, like close network of
    friends and family, they're still really
  • 27:49 - 27:55
    thoughtful about what they post in which
    channel. Whether it's safe to put a
  • 27:55 - 27:59
    photo, for example on their wall, or you
    know, in their timeline versus sending it
  • 27:59 - 28:07
    directly to family. This person, for
    example, even after they post something
  • 28:07 - 28:14
    publicly, publicly being, you know, within
    their Facebook wall, they'll still go back
  • 28:14 - 28:17
    a couple of days later and just delete
    everything because they're not totally
  • 28:17 - 28:28
    confident that that's private. Another
    really interesting thing is that in all of
  • 28:28 - 28:32
    these conversations we had, no one
    really expressed the sense that they
  • 28:32 - 28:39
    understood that they're really living on
    Facebook. The tools that they're using
  • 28:39 - 28:47
    almost exclusively, are all owned by
    the same company. No one expressed any
  • 28:47 - 28:51
    sort of sense that these companies are
    entities in themselves that might have
  • 28:51 - 28:57
    interest in access to their data. Much
    less one that cooperates with law
  • 28:57 - 29:05
    enforcement. That concern didn't appear in
    any of our conversations. They tend to
  • 29:05 - 29:10
    think about these platforms as being sort
    of a medium to communicate with other
  • 29:10 - 29:17
    people. You know, the way that they use
    it, is to talk to other individuals, or
  • 29:17 - 29:21
    groups of individuals. But the platform
    doesn't seem to be like a repository for
  • 29:21 - 29:28
    data. In fact, they are expressing
    significant trust in these platforms, Facebook in
    particular. A lot of people were grateful
    particular. A lot of people were grateful
    for the changes that Facebook's made over
  • 29:33 - 29:40
    the last year or two, in terms of account
    management. So they're grateful that if
  • 29:40 - 29:44
    there's a suspicious login attempt,
    they'll be able to stop it. That's helped
  • 29:44 - 29:49
    a lot of people. And that sort of
    generates trust in these platforms. And
  • 29:49 - 30:01
    the sense that Facebook really has their
    back. In addition to sort of managing the
  • 30:01 - 30:06
    way that they're sharing information, we
    did see some people choosing to abstain
  • 30:06 - 30:12
    from sharing. Especially, when it came to
    topics around immigration. Some people
  • 30:12 - 30:18
    chose to not join, you know, public
    Facebook groups, or get information from
  • 30:18 - 30:22
    certain places because they were afraid
    that by associating with these groups,
  • 30:22 - 30:33
    they might indicate something publicly
    about their status. And that's frustrating
  • 30:33 - 30:35
    for a lot of people who want to
    participate in these conversations, and
  • 30:35 - 30:39
    especially, because the discourse around
    immigration is so toxic in the United
  • 30:39 - 30:46
    States. Some people expressed this feeling
    that they have to just sit there and take
  • 30:46 - 30:51
    this discourse happening around them
    without participating, because they're
  • 30:51 - 30:57
    worried about being targeted, or harassed,
    or maybe even like having physical
  • 30:57 - 31:01
    consequences: being followed, or having
    immigration sent to their house if someone
  • 31:01 - 31:09
    were to find them. Some people expressed
    the opposite, though, which is
  • 31:09 - 31:16
    encouraging, right? Some people felt that,
    even though the risk is there, it's more
  • 31:16 - 31:20
    important for them to share their thoughts
    than it is for them to be tiptoeing around
  • 31:20 - 31:28
    immigration enforcement. This is also
    really interesting because this sort of
  • 31:28 - 31:34
    exposes sometimes family tensions about
    these topics. This is a
  • 31:34 - 31:38
    mixed-status community, meaning that
    sometimes parents will be undocumented and
  • 31:38 - 31:43
    children will be US citizens. Or lots of
    people have friends and family who have a
  • 31:43 - 31:48
    different legal status than they do. So
    risk is really distributed. You know, it's
  • 31:48 - 31:52
    not just individual, it's within families
    and within communities. And there can be a
  • 31:52 - 31:57
    lot of tension between, you know, children
    and parents, or friends, you know,
  • 31:57 - 32:01
    siblings, about how they share information
    on these platforms. Some people are much
  • 32:01 - 32:10
    more conservative with what they share.
    And this quote also reveals something else
  • 32:10 - 32:17
    kind of interesting. When we talk to
    people about concerns about immigration,
  • 32:17 - 32:22
    it's very rarely that they talk about
    whether immigration will be able to
  • 32:22 - 32:28
    investigate them, as much as it is about
    when, which is this final point that
  • 32:28 - 32:34
    there's really this sense of resignation
    in the community about what information
  • 32:34 - 32:44
    immigration enforcement has about them.
    Lots of people feel like, it doesn't
  • 32:44 - 32:51
    really matter what they do. Immigration
    can know where they are and what they're
  • 32:51 - 32:55
    doing. They can find them if they just
    decide to. It's just a matter of whether
  • 32:55 - 32:59
    immigration enforcement is going to choose
    to come after them, rather than whether
  • 32:59 - 33:09
    they can. This is also true with the way
    that they think about technology. They
  • 33:09 - 33:16
    have a sense that there's really no
    privacy. If immigration decided to, they
  • 33:16 - 33:20
    would be able to see the messages on
    Facebook, they could see what was
  • 33:20 - 33:26
    physically on their phones, that they have
    this sort of all-powerful, you know,
  • 33:26 - 33:32
    toolkit to access their digital
    information. And honestly, this story in
  • 33:32 - 33:39
    particular, this sense of surveillance
    comes from experience often. This person
  • 33:39 - 33:44
    had a really negative experience with ICE,
    you know, coming and talking to her
  • 33:44 - 33:49
    family. And ICE knowing things that they
    hadn't told anyone. Somehow ICE had known
  • 33:49 - 33:53
    things that they were keeping very
    private. And so there's this assumption
  • 33:53 - 33:57
    that, well, it's happened to me before,
    I've seen it happen to my friends, they
  • 33:57 - 34:08
    probably could know anything they want to.
    But it's not all negative, it's not all
  • 34:08 - 34:14
    resignation. Another thing that we saw,
    many people, not everyone, but maybe half
  • 34:14 - 34:16
    of the people we spoke to, had this really
    strong sense that there was this
  • 34:16 - 34:21
    responsibility to share things in the
    community to help each other. There's this
  • 34:21 - 34:29
    growing sense of community identity. And
    this might mean sharing information about
  • 34:29 - 34:35
    resources for the immigrant community or
    sharing information about workshops, or
  • 34:35 - 34:41
    events, vigils, but also information about
    immigration enforcement. If ICE is in a
  • 34:41 - 34:46
    particular community, they might tell
    their friends and family, avoid this area
  • 34:46 - 34:50
    until further notice. They're helping each
    other, they're sending information. So, it
  • 34:50 - 34:54
    can't be total resignation. There's still
    this sort of beam of hope that they're
  • 34:54 - 34:58
    helping each other. And they must have
    hope that they can do something because
  • 34:58 - 35:03
    they are. And this has been something that
    has become faster and easier with
  • 35:03 - 35:09
    technology, too, right? It's much easier
    to send a message than it is to call, or
  • 35:09 - 35:17
    to spread information before we had, you
    know, smartphones. But all of this really
  • 35:17 - 35:21
    leads to the question: Considering how
    much they inconvenience themselves in
  • 35:21 - 35:25
    their daily lives offline, why are they
    doing comparatively little online to
  • 35:25 - 35:33
    change their practices, or to reduce their
    visibility? Although lots of people
  • 35:33 - 35:39
    expressed this sense that they're
    relatively low in tech literacy, that in
  • 35:39 - 35:46
    and of itself isn't really enough of an
    explanation,
  • 35:46 - 35:50
    right? There are so many different factors
    into the way that they're making these
  • 35:50 - 35:56
    decisions, and they're thinking carefully
    about the decisions they do make. So we
  • 35:56 - 36:01
    have some thoughts on this. It really
    can't be overstated how much of a benefit
  • 36:01 - 36:06
    technology is to this community. It's
    making a significant difference in the way
  • 36:06 - 36:15
    that they live their lives. So the choice
    to abstain is not trivial. The risk that
  • 36:15 - 36:19
    they're facing by using like Facebook, by
    putting phone numbers on Facebook, or
  • 36:19 - 36:24
    sharing photos of their family and
    friends, and like, building these online
  • 36:24 - 36:30
    networks, the risk involved in
    that is uncertain, right? At this point we
  • 36:30 - 36:35
    have really sparse data about direct
    connections between the use of technology,
  • 36:35 - 36:40
    or the use of social media and immigration
    enforcement, and consequences. Maybe that
  • 36:40 - 36:44
    will change, but at this point it's
    unclear which changes might be actually
  • 36:44 - 36:49
    beneficial, right? Because there's not a
    direct connection between using this tool,
  • 36:49 - 36:56
    putting this information online, and
    immigration enforcement showing up.
  • 36:56 - 37:01
    There's also the significant trust in the
    platforms that they're using and their
  • 37:01 - 37:08
    peers are using as well, and there just
    tends to be less critical thought about
  • 37:08 - 37:14
    the safety of using platforms when there's
    already this component of trust. Facebook
  • 37:14 - 37:19
    has done a lot for account security for
    example over the last couple of years and
  • 37:19 - 37:25
    has built trust in this community. And as
    well as having, you know, all of your
  • 37:25 - 37:30
    community on a tool when they're all there
    together, there's like less
  • 37:30 - 37:37
    critical thought about whether
    it's safe to be there. And there is this
  • 37:37 - 37:42
    component of resignation when we've sort
    of pushed people to think really
  • 37:42 - 37:47
    explicitly about the risk with immigration
    enforcement involved in sharing information
  • 37:47 - 37:54
    on social media and using technology, there was
    the sense that if they wanted to - they
  • 37:54 - 37:57
    could have the information, I mean, they
    already have it in a lot of ways when
  • 37:57 - 38:04
    they're filing taxes, or, you know, it's
    just accessible to authorities regardless
  • 38:04 - 38:09
    of what they do online; that's the general
    sense. So this, in combination
  • 38:09 - 38:14
    with the uncertain risk, it makes it really
    hard to take concrete steps towards
  • 38:14 - 38:25
    changes that might be helpful. So finally,
    I just wanted to share a couple of things
  • 38:25 - 38:34
    that I learned especially as a digital
    security trainer while doing this study.
  • 38:34 - 38:42
    Most importantly everyone that we spoke to
    was really excited to learn. That's just
  • 38:42 - 38:47
    general like tech literacy but also
    security and privacy. People really care
  • 38:47 - 38:52
    and they're excited. And everyone
    expressed gratitude that we were talking
  • 38:52 - 39:00
    to them about this topic. They care a lot.
    But what was difficult for me, having a
  • 39:00 - 39:07
    background in trainings, was still being
    surprised by things in these
  • 39:07 - 39:13
    conversations: thinking I knew what
    they wanted or what they needed, and that
  • 39:13 - 39:17
    not being the case. So one thing I would
    say is, you know, don't assume that you know
  • 39:17 - 39:23
    what's best for them or even what they
    want or need. Go and talk to people;
  • 39:23 - 39:27
    you'll really learn a lot from
    talking to people about what they think
  • 39:27 - 39:33
    their risk is versus what they're doing.
    For example, something that I was surprised
  • 39:33 - 39:37
    to learn is that they're really not using
    online resources when they have concerns
  • 39:37 - 39:42
    about online privacy. They're talking to
    their kids and they're talking to their
  • 39:42 - 39:48
    neighbors and their friends. So for this
    community in particular, it would be really
  • 39:48 - 39:53
    much more effective to do an in-person
    training, a training in Spanish in
  • 39:53 - 39:59
    this case, in the language that they're
    naturally speaking, and to have in-person
  • 39:59 - 40:05
    resources. That will get you much further
    than, you know, compiling lists of ideas or
  • 40:05 - 40:15
    tools or strategies, that'll probably
    never be accessed. And as a vehicle to do
  • 40:15 - 40:18
    this, we had a really positive
    experience working with support
  • 40:18 - 40:24
    organizations. On the front end, that
    allowed us to build trust with the
  • 40:24 - 40:28
    community, so by working with people who
    they already trusted and who already knew
  • 40:28 - 40:33
    them well I really think we were able to
    talk to people much more openly and much...
  • 40:33 - 40:37
    with much more trust than they would have
    otherwise. Whether they would have spoken
  • 40:37 - 40:43
    to us at all is a question. They also were
    a great resource for us as we were
  • 40:43 - 40:50
    developing interview materials and also
    like training materials afterwards when we
  • 40:50 - 41:00
    went back to communities and conducted
    digital trainings. They helped us develop,
  • 41:00 - 41:06
    you know, culturally sensitive language
    and we were able to just ask, you know, is
  • 41:06 - 41:10
    this location, is this style of
    presentation, is this length, is this time,
  • 41:10 - 41:14
    what should we do? You know, they were a
    resource to us to make sure that the
  • 41:14 - 41:17
    things that we were developing were most
    accessible to the people that we're
  • 41:17 - 41:25
    talking to. And they themselves, from
    what I've seen, have a lot of questions
  • 41:25 - 41:30
    about the way that they're using
    technology. That's a great place to go and
  • 41:30 - 41:36
    talk to people about, you know,
    organizational practices. And you might
  • 41:36 - 41:39
    find that it's a lot easier to get people
    to change their practices if they're in
  • 41:39 - 41:43
    sort of an organizational setting where
    there's peer pressure or maybe some
  • 41:43 - 41:49
    hierarchy of people who are really
    encouraging them to use more secure tools
  • 41:49 - 41:54
    or to think carefully about data
    they're collecting about people that they
  • 41:54 - 41:59
    contact. So working with these
    organizations also might be an opportunity
  • 41:59 - 42:05
    to do trainings with activists and with
    lawyers and with other people who are
  • 42:05 - 42:18
    working alongside this community. Finally,
    which is always a difficult thing to hear
  • 42:18 - 42:24
    as a trainer, the people we spoke to
    probably aren't going to be adopting new
  • 42:24 - 42:32
    tools. For one, it might not be safe; it's
    hard to make that calculus, right. But a
  • 42:32 - 42:38
    tool that's specifically designed for a
    community at risk or in order to do a
  • 42:38 - 42:43
    particular function that would be of
    interest to, for example, the undocumented
  • 42:43 - 42:46
    community or some other vulnerable
    community might increase visibility
  • 42:46 - 42:50
    depending on the threat model. If they're
    found with a particular app or if the app
  • 42:50 - 42:58
    is like exposing the number of users or
    location of users, for example. And it's
  • 42:58 - 43:01
    not to say that we should stop developing
    new tools; we should always think about
  • 43:01 - 43:07
    ways to make better and safer and more
    private resources. But it's worth thinking
  • 43:07 - 43:11
    especially if you're going to be working
    with communities or building resources for
  • 43:11 - 43:16
    communities, that we should also think
    about how to make sure that they're using
  • 43:16 - 43:21
    the tools they already use more
    effectively and more safely. That might
  • 43:21 - 43:25
    mean sitting down with someone for a while
    and going to their privacy settings on
  • 43:25 - 43:30
    Facebook or, you know, making sure that
    their settings on Whatsapp don't
  • 43:30 - 43:38
    back up data to the cloud or expose phone
    numbers to people they don't know. But
  • 43:38 - 43:49
    there's a lot to do in both of these
    directions. And especially if you're going
  • 43:49 - 43:54
    to be moving into working with these
    communities, this is something to keep in
  • 43:54 - 44:05
    mind, that I thought was especially
    poignant. For that I can take questions.
  • 44:05 - 44:17
    applause
    Herald angel (H): So we have four
  • 44:17 - 44:22
    microphones in this room. I see one is
    already occupied with somebody. May I
  • 44:22 - 44:25
    remind you that a question is typically
    one to two sentences and ends with a
  • 44:25 - 44:31
    question mark. And with that I
    will take microphone 4.
  • 44:31 - 44:37
    Mic4: Hi, thanks! You mentioned that these
    communities are reluctant to adopt new
  • 44:37 - 44:42
    tools. Were there any exceptions to that
    or were there any like attributes of new
  • 44:42 - 44:46
    tools that you think they would be more
    likely to adopt?
  • 44:46 - 44:53
    Allison: Yeah, that's a good question!
    I've been thinking about this. I would say
  • 44:53 - 44:57
    that what I said about reluctance to
    adopt new tools is absolutely true when
  • 44:57 - 45:01
    we're talking about social
    media. So it's difficult to like move
  • 45:01 - 45:05
    people to Signal for example from Whatsapp
    or Facebook Messenger because the people
  • 45:05 - 45:09
    they talk to are already on these tools
    and it's not just moving one person but
  • 45:09 - 45:17
    like a community. If we start to think
    about tools that might be special-purpose
  • 45:17 - 45:21
    we didn't talk to anyone who mentioned
    this app but I know in the past there have
  • 45:21 - 45:26
    been discussions about Waze being used,
    it's like a crowd-sourced map system, being
  • 45:26 - 45:34
    used to like track law enforcement. Like I
    said we didn't talk to anyone who used
  • 45:34 - 45:40
    that app but possibly if there's like a
    specific utility in it, there could be some
  • 45:40 - 45:47
    critical mass of people who spread the
    word in a smaller community. Yeah it's
  • 45:47 - 45:50
    something to think about. I don't think
    it's impossible but I would say it would
  • 45:50 - 45:56
    be challenging.
    H: I assume that the baby doesn't want to
  • 45:56 - 46:01
    speak on microphone 1 so I'm gonna go to a
    microphone 3.
  • 46:01 - 46:03
    Mic3: I have two questions, is that okay?
    Allison: Yeah.
  • 46:03 - 46:08
    Mic3: Thank you. The first one is kind of
    a nitty-gritty academic question and that
  • 46:08 - 46:12
    is: can you tell us anything about your
    IRB approval process, what you're doing to
  • 46:12 - 46:16
    protect subjects data? Because this is
    very sensitive and I'm curious how you've
  • 46:16 - 46:19
    approached that.
    Allison: Yeah absolutely. So we didn't
  • 46:19 - 46:28
    have IRB approval before we spoke to
    anyone. We actually got an exemption for
  • 46:28 - 46:33
    collecting data about participants. So we
    compensated for each interview that we
  • 46:33 - 46:42
    did, we gave participants $20. We were not
    required to collect any proof of payment.
  • 46:42 - 46:49
    We recorded the interviews and encrypted
    them locally. They were translated by
  • 46:49 - 46:55
    people in our research group and then
    transcribed with all identifying location
  • 46:55 - 47:02
    and name data redacted. And those
    were all stored encrypted on our personal
  • 47:02 - 47:08
    drives and then in a University Drive. All
    the data has been deleted now, all of the
  • 47:08 - 47:13
    original data as well.
    Mic3: Awesome! Thanks. The other one is a
  • 47:13 - 47:17
    big picture scatterbrain question: which
    is about how this is a technological
  • 47:17 - 47:24
    solution to a political problem. Do you
    feel that directing or helping immigrants
  • 47:24 - 47:29
    understand how to protect themselves
    technologically, is the answer or
  • 47:29 - 47:34
    necessarily part of the answer or do you
    feel like maybe eventually our community
  • 47:34 - 47:38
    needs to be helping people exit places
    like the U.S. that are increasingly
  • 47:38 - 47:43
    hostile to immigrants?
    Allison: That's a good question. I don't
  • 47:43 - 47:50
    think that helping people be more safe
    online is really a solution. I mean the
  • 47:50 - 47:55
    solution's gonna be in policy and in law. I
    think the utility, really, in the
  • 47:55 - 47:59
    short term is making sure people feel
    safe and like have more control over
  • 47:59 - 48:03
    disclosure to the extent that they can.
    But I don't think that's going to,... I
  • 48:03 - 48:09
    don't think that's a winning, you know,
    single-pronged battle. As for leaving the
  • 48:09 - 48:14
    United States that's kind of a funny
    question considering how much people have
  • 48:14 - 48:18
    sacrificed to come to the U.S. and
    especially having integrated into
  • 48:18 - 48:23
    communities already. A lot of the people I
    spoke about today were long-term residents
  • 48:23 - 48:26
    I mean everyone was a long-term resident.
    So they've sort of built their lives in
  • 48:26 - 48:30
    the U.S. But there has been a significant
    decrease in the number of people
  • 48:30 - 48:35
    immigrating to the U.S. without
    authorization, thanks to Obama-era
  • 48:35 - 48:41
    policies of like, you know, returning
    people immediately at the border. So whether
  • 48:41 - 48:45
    people are now moving to other countries
    is a good question and whether we should
  • 48:45 - 48:48
    encourage that is... I don't know,
    interesting.
  • 48:48 - 48:55
    Mic3: Thank you
    H: Microphone 2.
  • 48:55 - 49:02
    Mic2: Hi, so I have a question: Are there
    any initiatives to help the people in a
  • 49:02 - 49:12
    way that, so... The fact that they don't...
    they feel that they are at less risk online
  • 49:12 - 49:17
    and they don't perceive the risk as much.
    And do you feel that helping them
  • 49:17 - 49:21
    understand those risks and maybe trying
    to be more secure online will actually
  • 49:21 - 49:27
    help them, or is the resignation
    towards the government accurate?
  • 49:27 - 49:42
    Allison: If you're thinking about specific
    people, I think... Maybe when an individual's
  • 49:42 - 49:47
    information is going to be accessible in
    the long run if immigration enforcement
  • 49:47 - 49:52
    really chooses to, maybe that sense of
    resignation to some extent is accurate. But
  • 49:52 - 49:58
    lots of people aren't necessarily on the
    radar. And I think what's most beneficial
  • 49:58 - 50:04
    about helping people understand how to use
    technology more effectively is that
  • 50:04 - 50:09
    it's really just increasing confidence.
    It's this uncertainty and like choosing to
  • 50:09 - 50:12
    abstain from participating in
    conversations because they just don't
  • 50:12 - 50:15
    trust that they can be secure, like
    private enough. You know or that their
  • 50:15 - 50:19
    personal information, their home addresses
    that they're still at risk of this
  • 50:19 - 50:24
    harassment, like that's... That lack of
    confidence in privacy is really what I
  • 50:24 - 50:40
    think can be helped and... Sorry I had
    another point. Yeah, but it's worthwhile
  • 50:40 - 50:44
    you know, thinking about how you can
    contribute to helping. I mean even
  • 50:44 - 50:51
    outside of like privacy work, a lot of
    people really just are eager to learn more
  • 50:51 - 50:59
    about how to use technology like to help
    their lives. Right, so the other thing I
  • 50:59 - 51:04
    was going to say was, we also put
    significant thought into
  • 51:04 - 51:07
    you know, how to have these conversations
    with people and like how to ask questions
  • 51:07 - 51:13
    about, you know, the risks online without
    really freaking them out. Because we
  • 51:13 - 51:16
    didn't really have solutions. It's not
    like at the end of an interview we could
  • 51:16 - 51:21
    say like well we have a solution for you
    just install this app and you'll be safe.
  • 51:21 - 51:25
    So, it's sort of this balance between
    making sure that people still, you know,
  • 51:25 - 51:31
    use tools in the way that's so helpful for
    their lives. Right like we don't want them
  • 51:31 - 51:34
    to stop using Facebook if it means that
    they stop talking to their parents back in
  • 51:34 - 51:38
    Mexico. We don't want them to stop using
    email if it means that they can't talk to
  • 51:38 - 51:43
    their kid's teachers anymore. So it's this
    balance between like being aware of the
  • 51:43 - 51:46
    risk and being confident that you're doing
    as much as you can while not choosing to
  • 51:46 - 51:50
    abstain.
    H: So I'm hiding here in the corner
  • 51:50 - 51:54
    because I'm trying to see whether
    somebody's at number four. There's
  • 51:54 - 51:59
    somebody there, yes. So Mic4 please.
    Mic4: Thanks. Hi, so I was wondering since
  • 51:59 - 52:05
    Facebook is the most popular tool that
    they use and they probably won't change
  • 52:05 - 52:11
    it, did you find anything that the people
    at Facebook could do to help undocumented
  • 52:11 - 52:15
    immigrants more?
    Allison: Yeah, I think the things that
  • 52:15 - 52:19
    Facebook can think about are really
    generalizable to a lot of vulnerable
  • 52:19 - 52:25
    communities. There were a few
    things in particular that some people are
  • 52:25 - 52:31
    really uncomfortable with, for example,
    Whatsapp if you're added to like a group
  • 52:31 - 52:36
    of people your phone number is exposed to
    everyone else in the group, without your
  • 52:36 - 52:40
    consent and that might be the case with
    like group SMS and things. But like, the
  • 52:40 - 52:44
    fact that Whatsapp even uses a phone number
    is kind of something that we should
  • 52:44 - 52:51
    migrate out of, right. Facebook collecting
    phone numbers and collecting, you know,
  • 52:51 - 52:59
    location data regardless of how easy it is
    to opt in and out. And so, this is
  • 52:59 - 53:06
    primarily an academic work that's going to
    appear at CHI, a human-computer
  • 53:06 - 53:11
    interaction conference, and we talk a lot
    in the paper about what these bigger
  • 53:11 - 53:19
    services can do. And really like we as a
    community can advocate for Facebook
  • 53:19 - 53:23
    resisting cooperating with law enforcement
    right. I mean, it shouldn't really matter
  • 53:23 - 53:28
    to Facebook where you live or how you
    got there. They're a social media platform;
  • 53:28 - 53:34
    they shouldn't be, you know, helping
    immigration move people around physical
  • 53:34 - 53:42
    borders. They should be totally you know
    border agnostic. So advocating for that
  • 53:42 - 53:50
    kind of attitude shift would be helpful.
    H: Microphone 2
  • 53:50 - 53:54
    Mic2: So thank you for the very
    interesting talk. And I have a question
  • 53:54 - 54:00
    that sort of picks up on the previous one.
    And because, as you talked about,
  • 54:00 - 54:06
    Facebook has become such an important sort
    of a political actor in this arena. I'm
  • 54:06 - 54:10
    wondering if you've been following up on
    that as a survey research problem like
  • 54:10 - 54:15
    what is it that
    they are doing and is this something
  • 54:15 - 54:23
    that's happening unwittingly or is there
    something about the general strategy of
  • 54:23 - 54:29
    Facebook that sort of helps create this kind
    of trust. And I'm also wondering, going,
  • 54:29 - 54:36
    taking that question further, sorry it's
    more than a sentence,
  • 54:36 - 54:40
    if you've been thinking about whether you
    see anything sort of suddenly eroding that
  • 54:40 - 54:45
    trust in the future, and I'm specifically
    thinking about this now, this question
  • 54:45 - 54:53
    about how it was possible for all this
    Russian money to go into Facebook
  • 54:53 - 55:00
    advertisements, and that's
    kind of pointing in the direction of pressure
  • 55:00 - 55:08
    for Facebook to be less sort of general in
    their trust and picking up on certain, on
  • 55:08 - 55:16
    specific political issues which could also
    be immigration and disclosing some
  • 55:16 - 55:21
    information that they already have?
    Allison: Your question about whether there could
  • 55:21 - 55:27
    be a shift in trust in the future if
    something could trigger that. The example
  • 55:27 - 55:31
    in Detroit, right, where law enforcement was
    able to get a phone number from Facebook
  • 55:31 - 55:37
    with a warrant and then track the person
    with this phone number. If there are more
  • 55:37 - 55:42
    and more cases of social media data being
    used in immigration cases and there's
  • 55:42 - 55:48
    evidence to think that that might happen,
    it's possible that narrative might
  • 55:48 - 55:52
    overtake this sense that people have right
    now that Facebook's looking out for them
  • 55:52 - 56:01
    by keeping their account secure, you know,
    letting them control it. In
  • 56:01 - 56:08
    terms of Facebook picking up immigration
    as a sort of an activist or a political
  • 56:08 - 56:15
    topic that they're interested in, I would
    not hold my breath on that one, but we'll
  • 56:15 - 56:19
    see. Yeah.
    H: So we have time for exactly one more
  • 56:19 - 56:26
    question and that is on Mic 1.
    Mic1: Hi, did you collect any information
  • 56:26 - 56:32
    or study anything about how these people
    were using financial services and such
  • 56:32 - 56:37
    things like online payments? Did they have
    bank accounts, were they concerned about
  • 56:37 - 56:45
    their financial privacy?
    Allison: Yeah, actually, the concerns
  • 56:45 - 56:49
    they have with privacy in terms of the
    way that they were using, like, online
  • 56:49 - 56:54
    banking, because people were, I mean, using
    credit cards and online banking and paying
  • 56:54 - 56:59
    rent, you know, or utilities online. They
    didn't talk about privacy much in that
  • 56:59 - 57:03
    context except that they have this concern
    about their financial information being
  • 57:03 - 57:08
    stolen by hackers. Right, like the concern
    is about other people rather than the
  • 57:08 - 57:16
    entities that are providing these
    services. And I think a lot of the concern
  • 57:16 - 57:19
    there is coming from the fact that they
    have a lot to lose and very few legal
  • 57:19 - 57:27
    protections should something bad happen
    to them. But, yeah, so just generally like
  • 57:27 - 57:33
    people were using online banking and had
    bank accounts and were using these online
  • 57:33 - 57:36
    financial services. Some people were
    opting out, but it wasn't due to privacy
  • 57:36 - 57:40
    concerns; it was because they were worried
    about using their credit card on the
  • 57:40 - 57:45
    Internet.
    H: So with that I'd like you to help me
  • 57:45 - 57:48
    thank our speaker Allison for this
    wonderful talk.
  • 57:48 - 57:54
    Applause
  • 57:54 - 58:03
    34C3 postroll music
  • 58:03 - 58:15
    subtitles created by c3subtitles.de
    in the year 2020. Join, and help us!
Title:
34C3 - Uncertain Concern
Video Language:
English
Duration:
58:15